unscientific said: What do they mean by 'Contract ##\mu## with ##\alpha##'? I thought only a matching upper and lower index can be contracted? For example ##A_\mu g^{\mu\nu} = A^\nu##.
strangerep said: It means "replace ##\mu## by ##\alpha## -- or vice versa".

unscientific said: How can we simply do that?

strangerep said: Suppose you have an ordinary matrix. If I told you to "take the trace of that matrix", what would that mean to you?

unscientific said: Summing up the diagonal, i.e. ##\sum_i M_{ii}##.

strangerep said: Correct. So if I have a matrix equation like ##M_{ij} = 0##, then it is also true that ##\sum_i M_{ii} = 0##, or in summation-convention notation, ##M_{ii} = 0##. One says that we have contracted ##i## with ##j##, though a slightly more helpful phrase might be to say that we have contracted over ##i,j##. Similarly in your OP, except that it deals with contraction over 2 indices of a 4th-rank tensor instead of a 2nd-rank matrix.
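(A quick numerical illustration of this trace-as-contraction picture -- a minimal numpy sketch, not part of the thread; the matrix entries are arbitrary:)

[code]
import numpy as np

# An arbitrary 3x3 matrix M_ij
M = np.arange(9.0).reshape(3, 3)

# "Contract i with j": set j = i and sum over the repeated index
contracted = np.einsum('ii->', M)

# This is exactly the trace of the matrix
print(contracted)                           # 12.0  (= 0 + 4 + 8)
print(np.isclose(contracted, np.trace(M)))  # True
[/code]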
unscientific said: Do you mind showing the working of how they 'contracted the indices'? I think it's not as simple as cancelling them.

Samalkhaiat said:
[tex]R^{\alpha}{}_{\beta \alpha \nu} = \delta^{\mu}_{\alpha} R^{\alpha}{}_{\beta \mu \nu} = g_{\alpha \gamma} g^{\mu \gamma} R^{\alpha}{}_{\beta \mu \nu}[/tex]

strangerep said: Indeed, it is not index "cancellation". Multiply both sides of the original equation by the 2 factors of ##g## indicated on the rhs of Samalkhaiat's post #8. Then the only remaining "trick" is realizing that you can pass them through the covariant derivative operators -- since the metric is assumed to be covariantly constant in GR, e.g., ##\nabla_\lambda g_{\alpha\gamma} = 0##, etc.
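Carrying out that recipe explicitly -- a sketch, assuming the "original equation" in the OP is the second Bianchi identity, with the convention ##R_{\beta\nu} = R^\alpha{}_{\beta\alpha\nu}##:

[tex] \nabla_\lambda R^\alpha{}_{\beta\mu\nu} + \nabla_\mu R^\alpha{}_{\beta\nu\lambda} + \nabla_\nu R^\alpha{}_{\beta\lambda\mu} = 0 [/tex]

Contract ##\mu## with ##\alpha## (i.e. multiply by ##\delta^\mu_\alpha = g_{\alpha\gamma} g^{\mu\gamma}## and sum), using the antisymmetry ##R^\alpha{}_{\beta\lambda\alpha} = -R_{\beta\lambda}##:

[tex] \nabla_\lambda R_{\beta\nu} + \nabla_\alpha R^\alpha{}_{\beta\nu\lambda} - \nabla_\nu R_{\beta\lambda} = 0 [/tex]

Contracting once more with ##g^{\beta\nu}## (which slides through ##\nabla## because ##\nabla g = 0##) turns this into ##\nabla_\lambda R - 2\nabla_\alpha R^\alpha{}_\lambda = 0##, the form discussed below.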
unscientific said: So, the working is
[tex] 2 \nabla_\nu R^\nu{}_\beta + \nabla_\beta \, g^{\nu \gamma} R^\alpha{}_{\nu \gamma \alpha} = 0 [/tex]
Shouldn't it be a ##+## on the second term: ##2 \nabla_\nu R^\nu{}_\beta + \nabla_\beta R = 0##?
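The sign can be checked directly -- assuming, again, the convention ##R_{\nu\gamma} = R^\alpha{}_{\nu\alpha\gamma}##. By the antisymmetry of the Riemann tensor in its last two indices, the second term above already hides a minus sign:

[tex] g^{\nu\gamma} R^\alpha{}_{\nu\gamma\alpha} = -g^{\nu\gamma} R^\alpha{}_{\nu\alpha\gamma} = -g^{\nu\gamma} R_{\nu\gamma} = -R [/tex]

so the working as written is equivalent to ##2 \nabla_\nu R^\nu{}_\beta - \nabla_\beta R = 0##, i.e. ##\nabla_\nu R^\nu{}_\beta = \tfrac{1}{2} \nabla_\beta R##, the usual contracted Bianchi identity.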
Tensor contraction is a mathematical operation that reduces the rank of a tensor (its number of free indices) by summing over one or more pairs of indices. It is commonly used in physics, engineering, and mathematics to simplify complex equations and express them in a more compact form.
In tensor contraction, a pair of indices -- one upper and one lower -- is set equal and summed over, producing a new tensor of lower rank. When two tensors are involved, the corresponding components are multiplied and then summed over the repeated index. The indices of the result are the free indices left over from the original tensors.
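As a concrete illustration of this "multiply and sum over the repeated index" rule, a minimal numpy sketch (the shapes and values here are arbitrary placeholders):

[code]
import numpy as np

A = np.random.rand(3, 4)   # components A_ij
B = np.random.rand(4, 5)   # components B_jk

# Contract over the repeated index j: C_ik = sum_j A_ij B_jk
C = np.einsum('ij,jk->ik', A, B)

# The contracted index disappears; the free indices i, k remain
print(C.shape)                # (3, 5)
print(np.allclose(C, A @ B))  # True: this contraction is matrix multiplication
[/code]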
Contracting ##\mu## with ##\alpha##, as in the thread above, means setting those two indices equal and summing over them. This kind of contraction appears constantly in relativity and quantum field theory, where it simplifies equations involving higher-rank tensors and expresses them in a more concise form.
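For example, the Ricci tensor arises from contracting the first and third indices of the Riemann tensor, ##R_{\beta\nu} = R^\alpha{}_{\beta\alpha\nu}##. Numerically that looks like the following sketch, where the array holds random placeholder values rather than a real curvature tensor:

[code]
import numpy as np

R = np.random.rand(4, 4, 4, 4)   # placeholder for components R^a_{bcd}

# Contract the 1st index with the 3rd: Ric_bd = R^a_{bad}
ric = np.einsum('abad->bd', R)

print(ric.shape)   # (4, 4): a rank-4 tensor reduced to rank 2
[/code]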
Tensor contraction has many applications in physics and engineering. It is used to simplify equations in general relativity, quantum field theory, and electromagnetism. It is also utilized in computer science and machine learning for data manipulation and dimensionality reduction.
While tensor contraction is a useful mathematical tool, it follows strict rules. A contraction pairs exactly one upper (contravariant) index with one lower (covariant) index, and each contracted index appears exactly twice. Two indices of the same type can only be contracted with the help of the metric, which first raises or lowers one of them -- this is exactly what the ##g_{\alpha \gamma} g^{\mu \gamma}## factors above accomplish. These restrictions are what guarantee that the result has well-defined tensorial transformation properties.