Tensor Contraction: Contracting ##\mu## with ##\alpha##?

In summary, to "contract" two indices of a tensor means to set an upper index equal to a lower index and sum over the repeated index, generalizing the trace of a matrix; it is not a simple "cancellation" of indices. To contract an identity involving covariant derivatives, multiply through by factors of the metric and pass them through the covariant derivative operators, which is allowed because the metric is covariantly constant.
  • #1
unscientific
[Image: tensor4.png — textbook excerpt stating the second Bianchi identity ##\nabla_\gamma R^\mu{}_{\nu \alpha \beta} + \nabla_\beta R^\mu{}_{\nu \gamma \alpha} + \nabla_\alpha R^\mu{}_{\nu \beta \gamma} = 0## with the instruction to contract ##\mu## with ##\alpha##.]


What do they mean by 'Contract ##\mu## with ##\alpha##'? I thought only a repeated upper-lower pair of indices can be contracted? For example, ##A_\mu g^{\mu \nu} = A^\nu##.
 
  • #2
unscientific said:
What do they mean by 'Contract ##\mu## with ##\alpha##'?
It means "replace ##\mu## by ##\alpha## -- or vice versa".
 
  • #3
strangerep said:
It means "replace ##\mu## by ##\alpha## -- or vice versa".
How can we simply do that?
 
  • #4
unscientific said:
How can we simply do that?
Suppose you have an ordinary matrix. If I told you to "take the trace of that matrix", what would that mean to you?
 
  • #5
strangerep said:
Suppose you have an ordinary matrix. If I told you to "take the trace of that matrix", what would that mean to you?
Summing up the diagonal, i.e. ##\sum M_{ii}##.
 
  • #6
unscientific said:
Summing up the diagonal, i.e. ##\sum M_{ii}##.
Correct.

So if I have a matrix equation like ##M_{ij} = 0##, then it is also true that ##\sum_i M_{ii} = 0##, or in summation convention notation, ##M_{ii} = 0##. One says that we have contracted ##i## with ##j##, though a slightly more helpful phrase might be to say that we have contracted over ##i,j##.

Similarly in your OP, except that it deals with contraction over 2 indices of a 4th rank tensor instead of a 2nd rank matrix.
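
If it helps to see this numerically, here is a minimal numpy sketch (the matrix values are just illustrative) showing that contracting the two indices of a matrix is exactly taking its trace:

[code]
import numpy as np

# An arbitrary 4x4 matrix standing in for M_{ij} (illustrative values).
M = np.arange(16, dtype=float).reshape(4, 4)

# Contracting over i,j: set the two indices equal and sum over them.
contraction = np.einsum('ii->', M)  # computes sum_i M_{ii}

assert np.isclose(contraction, np.trace(M))  # same as the ordinary matrix trace
[/code]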
 
  • #7
strangerep said:
Correct.

So if I have a matrix equation like ##M_{ij} = 0##, then it is also true that ##\sum_i M_{ii} = 0##, or in summation convention notation, ##M_{ii} = 0##. One says that we have contracted ##i## with ##j##, though a slightly more helpful phrase might be to say that we have contracted over ##i,j##.

Similarly in your OP, except that it deals with contraction over 2 indices of a 4th rank tensor instead of a 2nd rank matrix.
Do you mind showing the working of how they 'contracted the indices'? I think it's not as simple as cancelling them.
 
  • #8
unscientific said:
Do you mind showing the working of how they 'contracted the indices'? I think it's not as simple as cancelling them.
[tex]R^{\alpha}{}_{\beta \alpha \nu} = \delta^{\mu}_{\alpha} R^{\alpha}{}_{\beta \mu \nu} = g_{\alpha \gamma} g^{\mu \gamma} R^{\alpha}{}_{\beta \mu \nu}[/tex]
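
As a quick numerical check of the identity ##g_{\alpha \gamma} g^{\mu \gamma} = \delta^{\mu}_{\alpha}## used here, a minimal numpy sketch (the Minkowski metric is just an illustrative choice of ##g##):

[code]
import numpy as np

# Minkowski metric diag(-1, 1, 1, 1) as an illustrative g_{alpha gamma}.
g_lower = np.diag([-1.0, 1.0, 1.0, 1.0])
g_upper = np.linalg.inv(g_lower)  # the inverse metric g^{mu gamma}

# Contract over gamma: g_{alpha gamma} g^{mu gamma} should give delta^mu_alpha.
delta = np.einsum('ag,mg->ma', g_lower, g_upper)
assert np.allclose(delta, np.eye(4))
[/code]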
 
  • #9
unscientific said:
Do you mind showing the working of how they 'contracted the indices'? I think it's not as simple as cancelling them.
Indeed, it is not index "cancellation".

Multiply both sides of the original equation by the 2 factors of ##g## indicated on the rhs of Samalkhaiat's post #8. Then the only remaining "trick" is realizing that you can pass them through the covariant derivative operators -- since the metric is assumed to be covariantly constant in GR, e.g., ##\nabla_\lambda g_{\alpha\gamma} = 0##, etc.
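
Applied to a single term, the step looks like this (a sketch of the manipulation just described, using the indices of post #8):

[tex] g_{\alpha \gamma} g^{\mu \gamma} \nabla_\lambda R^{\alpha}{}_{\beta \mu \nu} = \nabla_\lambda \left( g_{\alpha \gamma} g^{\mu \gamma} R^{\alpha}{}_{\beta \mu \nu} \right) = \nabla_\lambda R^{\mu}{}_{\beta \mu \nu} , [/tex]

since ##\nabla_\lambda g_{\alpha \gamma} = 0## lets the metric factors move freely past ##\nabla_\lambda##.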
 
  • #10
strangerep said:
Indeed, it is not index "cancellation".

Multiply both sides of the original equation by the 2 factors of ##g## indicated on the rhs of Samalkhaiat's post #8. Then the only remaining "trick" is realizing that you can pass them through the covariant derivative operators -- since the metric is assumed to be covariantly constant in GR, e.g., ##\nabla_\lambda g_{\alpha\gamma} = 0##, etc.

So, the working is

[tex] \nabla_\gamma R^\mu _{\nu \alpha \beta} + \nabla_\beta R^\mu _{\nu \gamma \alpha} + \nabla_\alpha R^\mu _{\nu \beta \gamma} = 0 [/tex]

[tex] \delta_\mu ^\alpha \nabla_\gamma R^\mu _{\nu \alpha \beta} + \delta_\mu ^\alpha \nabla_\beta R^\mu _{\nu \gamma \alpha} + \delta_\mu ^\alpha \nabla_\alpha R^\mu _{\nu \beta \gamma} = 0 [/tex]

[tex] g_{\mu \lambda} g^{\alpha \lambda} \nabla_\gamma R^\mu _{\nu \alpha \beta} + g_{\mu \lambda} g^{\alpha \lambda} \nabla_\beta R^\mu _{\nu \gamma \alpha} + g_{\mu \lambda} g^{\alpha \lambda} \nabla_\alpha R^\mu _{\nu \beta \gamma} = 0 [/tex]

[tex] \nabla_\gamma R^\alpha _{\nu \alpha \beta} + \nabla_\beta R^\alpha _{\nu \gamma \alpha} + \nabla_\alpha R^\alpha _{\nu \beta \gamma} = 0 [/tex]

[tex] g^{\nu \gamma} \nabla_\gamma R^\alpha _{\nu \alpha \beta} + g^{\nu \gamma} \nabla_\beta R^\alpha _{\nu \gamma \alpha} + g^{\nu \gamma} \nabla_\alpha R^\alpha _{\nu \beta \gamma} = 0 [/tex]

[tex] 2 \nabla_\nu R^\nu _\beta + \nabla_\beta g^{\nu \gamma} R^\alpha _{\nu \gamma \alpha} = 0 [/tex]

Shouldn't it be a ##+## on the second term: ## 2 \nabla_\nu R^\nu _\beta + \nabla_\beta R = 0##?
 
  • #11
unscientific said:
So, the working is

[tex] 2 \nabla_\nu R^\nu _\beta + \nabla_\beta g^{\nu \gamma} R^\alpha _{\nu \gamma \alpha} = 0 [/tex]

Shouldn't it be a ##+## on the second term: ## 2 \nabla_\nu R^\nu _\beta + \nabla_\beta R = 0##?

No, because the Ricci scalar is defined as
[tex] R = R^{\nu}{}_{\nu} = g^{\nu \gamma} R_{\nu \gamma} = g^{\nu \gamma} R^{\alpha}{}_{\nu \alpha \gamma} , [/tex]
and you obtain that form from the second term by exchanging the last two indices of the Riemann tensor, hence the minus sign.
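
Spelled out, filling in the step just described, the exchange gives

[tex] g^{\nu \gamma} R^{\alpha}{}_{\nu \gamma \alpha} = - g^{\nu \gamma} R^{\alpha}{}_{\nu \alpha \gamma} = - g^{\nu \gamma} R_{\nu \gamma} = - R , [/tex]

so the contracted identity in post #10 becomes

[tex] 2 \nabla_\nu R^{\nu}{}_{\beta} - \nabla_\beta R = 0 , [/tex]

which is the contracted Bianchi identity.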
 

Related to Tensor Contraction: Contracting ##\mu## with ##\alpha##?

1. What is tensor contraction?

Tensor contraction is a mathematical operation that reduces the rank (the number of free indices) of a tensor by two, by setting an upper index equal to a lower index and summing over it. It is commonly used in physics, engineering, and mathematics to simplify complex equations and express them in a more compact form.

2. How does tensor contraction work?

In tensor contraction, one upper index and one lower index are set equal and summed over, producing a new tensor of lower rank. When the contraction involves two tensors, the corresponding components are multiplied and then summed over the repeated index; the free indices that remain determine the index structure of the result.
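
For instance, here is a minimal numpy sketch of the contraction ##A_\mu B^{\mu \nu} = C^{\nu}## (the tensors and their components are arbitrary illustrative values):

[code]
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=4)        # components of A_mu
B = rng.normal(size=(4, 4))   # components of B^{mu nu}

# Multiply matching components and sum over the repeated index mu.
C = np.einsum('m,mn->n', A, B)

assert np.allclose(C, A @ B)  # the same as a vector-matrix product
[/code]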

3. What is the significance of contracting ##\mu## with ##\alpha##?

Contracting ##\mu## with ##\alpha## is the specific contraction carried out in this thread: applied to the Riemann tensor it produces the Ricci tensor, ##R_{\nu \beta} = R^{\alpha}{}_{\nu \alpha \beta}##. Contractions of this kind appear throughout relativity and quantum field theory, where they compress tensor equations into a more concise form.

4. What are some applications of tensor contraction?

Tensor contraction has many applications in physics and engineering. It is used to simplify equations in general relativity, quantum field theory, and electromagnetism. It also appears in computer science and machine learning, where einsum-style contractions are a basic building block of numerical tensor libraries.

5. Are there any limitations to tensor contraction?

While tensor contraction is a useful tool, it does have restrictions. A contraction pairs exactly one upper (contravariant) index with one lower (covariant) index, and each contracted index must appear exactly twice. Two indices at the same level cannot be contracted directly; one of them must first be raised or lowered with the metric, as in the example below.
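
For instance (a minimal sketch), contracting the two lower indices of ##T_{\mu \nu}## requires a factor of the inverse metric:

[tex] g^{\mu \nu} T_{\mu \nu} = T^{\nu}{}_{\nu} . [/tex]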
