Tensor Decomposition: Exploring Linear Decomposition Methods

In summary, the conversation discusses the concept of tensors and their decomposition in terms of linear maps. It also mentions the role of coordinates in dealing with tensors and the difference between dealing with linear transformations and their matrices. The conversation also touches on the geometric interpretation of tensors and the determinant of the tensor product of two vectors.
  • #1
Jhenrique
Given a vector [tex]\vec{r}=\begin{bmatrix}
x\\
y
\end{bmatrix}[/tex]
It's possible to decompose it linearly, so:

[tex]\vec{r}=x\hat{i}+y\hat{j}[/tex]
So, what would the linear decomposition of a tensor look like?

Thx!
 
  • #2
Let [itex]T \in \mathcal{T}^k(V)[/itex] be a [itex]k[/itex]-tensor on the [itex]n[/itex]-dimensional vector space [itex]V[/itex] over [itex]\mathbb{R}[/itex]. In other words, [itex]T[/itex] is a multilinear map [itex]T: V^k \to \mathbb{R}[/itex]. Let [itex]\{\varphi_1, \dotsc, \varphi_n\}[/itex] be the dual basis of some basis [itex]\{v_1, \dotsc, v_n\}[/itex] of [itex]V[/itex]. Then [itex]\{\varphi_{i_1} \otimes \dotsb \otimes \varphi_{i_k}: 1 \leq i_1, \dotsc, i_k \leq n\}[/itex] is a basis of the vector space of all such tensors, [itex]\mathcal{T}^k(V)[/itex]. Moreover, we have
[tex]
T = \sum_{i_1, \dotsc, i_k = 1}^n T(v_{i_1}, \dotsc, v_{i_k}) \varphi_{i_1} \otimes \dotsb \otimes \varphi_{i_k}
[/tex]
since [itex]\{\varphi_1, \dotsc, \varphi_n\}[/itex] is the dual basis. If you want, you can choose the standard basis for [itex]V[/itex] and the corresponding standard dual basis.
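If it helps, this expansion can be checked numerically in the case [itex]k = 2[/itex], [itex]n = 2[/itex]. The following NumPy sketch is my own illustration, not part of the original argument; it represents a 2-tensor by a matrix [itex]A[/itex], so that [itex]T(u, v) = u^T A v[/itex], and uses the standard basis with its dual:

```python
import numpy as np

# A 2-tensor T on R^2, represented by a matrix A: T(u, v) = u^T A v.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

def T(u, v):
    return u @ A @ v

# Standard basis e_i of V and its dual basis phi_i (phi_i reads off
# the i-th coordinate, so phi_i(e_j) = delta_ij).
e = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
phi = [lambda x, i=i: x[i] for i in range(2)]

# The expansion: T = sum_{i,j} T(e_i, e_j) * (phi_i tensor phi_j),
# where (phi_i tensor phi_j)(u, v) = phi_i(u) * phi_j(v).
def T_expanded(u, v):
    return sum(T(e[i], e[j]) * phi[i](u) * phi[j](v)
               for i in range(2) for j in range(2))

u, v = np.array([1.0, 2.0]), np.array([3.0, -1.0])
assert np.isclose(T(u, v), T_expanded(u, v))  # both give 11.0
```

The same check works for any choice of basis, as long as the dual basis is built to satisfy [itex]\varphi_i(v_j) = \delta_{ij}[/itex].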
 
  • #3
I really don't understand the recursive notation and the summations!

Can you give me a concrete example? I mean a 2x2 or 3x3 matrix, i.e., a tensor of rank 2.
 
  • #4
Let [itex]T[/itex] be a [itex]2[/itex]-tensor, i.e. a bilinear form, on [itex]\mathbb{R}^2[/itex] defined by
[tex]
T(x) = x^T
\begin{pmatrix}
1 & 2 \\
3 & 4
\end{pmatrix}
x
[/tex]
regarding [itex]x[/itex] as a column matrix (so [itex]T(x)[/itex] is shorthand for [itex]T(x, x)[/itex]). Let [itex]\{e_1, e_2\}[/itex] be the standard basis and [itex]\{\varphi_1, \varphi_2\}[/itex] be the standard dual basis. We have [itex]\varphi_i(e_j) = \delta_{ij}[/itex]. We can also express these in matrix form as
[tex]
[\varphi_1] =
\begin{pmatrix}
1 & 0
\end{pmatrix}
\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;
[\varphi_2] =
\begin{pmatrix}
0 & 1
\end{pmatrix}
[/tex]
where [itex][\varphi_i][/itex] denotes the matrix of [itex]\varphi_i[/itex] with respect to the standard basis. Now, consider the set [itex]\{\varphi_1 \otimes \varphi_1, \varphi_1 \otimes \varphi_2, \varphi_2 \otimes \varphi_1, \varphi_2 \otimes \varphi_2\}[/itex]. In general, we have
[tex]
(\varphi_i \otimes \varphi_j)(e_k, e_l) = \varphi_i(e_k) \varphi_j(e_l) = \delta_{ik}\delta_{jl}.
[/tex]
So we can write this in matrix form as
[tex]
(\varphi_i \otimes \varphi_j)(x) =
x^T
M_{ij}
x
[/tex]
where [itex]M_{ij}[/itex] denotes the matrix with zeros everywhere except for a [itex]1[/itex] in the [itex]i^\text{th}[/itex] row and [itex]j^\text{th}[/itex] column. For example,
[tex]
(\varphi_1 \otimes \varphi_2)(x) =
x^T
M_{12}
x
=
x^T
\begin{pmatrix}
0 & 1 \\
0 & 0
\end{pmatrix}
x.
[/tex]
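As a quick numerical sanity check (my own sketch, not part of the reply; it assumes the dual-basis functionals simply read off coordinates):

```python
import numpy as np

# phi_i reads off the i-th coordinate of a vector in R^2 (0-based here).
phi = [lambda x, i=i: x[i] for i in range(2)]

# M_12 in the notation above: a single 1 in row 1, column 2 (1-based).
M12 = np.array([[0, 1],
                [0, 0]])

x = np.array([5, 7])
# (phi_1 tensor phi_2)(x, x) = phi_1(x) * phi_2(x) should equal x^T M_12 x.
assert phi[0](x) * phi[1](x) == x @ M12 @ x  # both are 5 * 7 = 35
```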
So we have
[tex]
T(x) =
x^T
\begin{pmatrix}
1 & 2 \\
3 & 4
\end{pmatrix}
x
=
x^T
\begin{pmatrix}
1 & 0 \\
0 & 0
\end{pmatrix}
x
+
x^T
\begin{pmatrix}
0 & 2 \\
0 & 0
\end{pmatrix}
x
+
x^T
\begin{pmatrix}
0 & 0 \\
3 & 0
\end{pmatrix}
x
+
x^T
\begin{pmatrix}
0 & 0 \\
0 & 4
\end{pmatrix}
x \\
=
x^T
\begin{pmatrix}
1 & 0 \\
0 & 0
\end{pmatrix}
x
+
2
x^T
\begin{pmatrix}
0 & 1 \\
0 & 0
\end{pmatrix}
x
+
3
x^T
\begin{pmatrix}
0 & 0 \\
1 & 0
\end{pmatrix}
x
+
4
x^T
\begin{pmatrix}
0 & 0 \\
0 & 1
\end{pmatrix}
x \\
= (\varphi_1 \otimes \varphi_1)(x) + 2(\varphi_1 \otimes \varphi_2)(x) + 3(\varphi_2 \otimes \varphi_1)(x) + 4(\varphi_2 \otimes \varphi_2)(x)
[/tex]
So [itex]T = \varphi_1 \otimes \varphi_1 + 2\varphi_1 \otimes \varphi_2 + 3\varphi_2 \otimes \varphi_1 + 4\varphi_2 \otimes \varphi_2[/itex] (this is the decomposition in terms of the chosen basis). As you can confirm, [itex]T(e_1, e_1) = 1, T(e_1, e_2) = 2, T(e_2, e_1) = 3, T(e_2, e_2) = 4[/itex]. So we can rewrite it as [itex]T = T(e_1, e_1)\varphi_1 \otimes \varphi_1 + T(e_1, e_2)\varphi_1 \otimes \varphi_2 + T(e_2, e_1)\varphi_2 \otimes \varphi_1 + T(e_2, e_2)\varphi_2 \otimes \varphi_2[/itex]. This is exactly the sum in the previous post. It is not recursive; it merely states that the two expressions are equal.

If you want to see the above decomposition purely in terms of matrices, it is just the following statement
[tex]
\begin{pmatrix}
1 & 2 \\
3 & 4
\end{pmatrix}
=
\begin{pmatrix}
1 & 0 \\
0 & 0
\end{pmatrix}
+
2
\begin{pmatrix}
0 & 1 \\
0 & 0
\end{pmatrix}
+
3
\begin{pmatrix}
0 & 0 \\
1 & 0
\end{pmatrix}
+
4
\begin{pmatrix}
0 & 0 \\
0 & 1
\end{pmatrix}.
[/tex]
But tensors are actually the multilinear maps [itex]V^k \to \mathbb{R}[/itex]. To deal with their matrices when [itex]k = 2[/itex], we have to choose some basis. In the above example I chose the standard basis, but we would get a different equality in terms of matrices for a different basis.
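For completeness, the matrix identity above is easy to verify by machine. Here is a small NumPy sketch (my addition, not from the thread) that rebuilds the matrix from the elementary matrices [itex]M_{ij}[/itex]:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])

def M(i, j, n=2):
    """Elementary matrix: zeros except a 1 in row i, column j (0-based)."""
    out = np.zeros((n, n), dtype=int)
    out[i, j] = 1
    return out

# A = sum_{i,j} A[i, j] * M_ij -- the matrix form of the tensor decomposition.
recon = sum(A[i, j] * M(i, j) for i in range(2) for j in range(2))
assert np.array_equal(A, recon)
```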
 
  • #5
I understand. But this multiplies my doubts! Although I know the Cauchy stress tensor, so far I haven't found any material that treats the tensor as a geometric object. Would it be all right if I continued to ask my questions? The geometric interpretation of a tensor, the geometric interpretation of the tensor product of two vectors, the determinant, the modulus... these things are obscure to me...
 
  • #6
I am not an expert on tensors either. I learned them from chapter 4 of Michael Spivak's Calculus on Manifolds. Since it is not a book on algebra, I'm sure there is much more to tensors and multilinear algebra that I did not learn.

Geometrically, tensors are very hard to visualize because they are multilinear maps, i.e., functions, and you would typically need many dimensions to graph one in any way (since you need to consider both the domain space and the range space). I think it's better to understand what they are algebraically instead.

Anyway, you should note that the approach to tensors I learned and used above is a coordinate-free approach. If you are doing physics, then tensors are dealt with differently, since coordinates are chosen. It's similar to the difference between dealing with linear transformations and dealing with their matrices in linear algebra.
 
  • #7
I'm not studying physics. I'm self-taught. I learned calculus alone, and honestly, I don't like it when others tell me what I need to study, as if I had not been born with this innate desire to learn. Anyway...

Can you tell me whether the determinant of the tensor product of two vectors equals the area of the parallelogram formed by those two vectors? If yes, how do I reconcile this with the fact that the determinant of a 3x3 matrix corresponds to the volume of the parallelepiped formed by three vectors?
 
  • #8
Jhenrique said:
I'm not studying physics. I'm self-taught. I learned calculus alone, and honestly, I don't like it when others tell me what I need to study, as if I had not been born with this innate desire to learn. Anyway...

I have no idea what I said to get this response. I have looked at my previous post a few times and I'm still confused by your response. Anyway, sorry if I offended you (whatever it was that offended you).

As for the determinant stuff... I'm not familiar with the determinant of the tensor product of two vectors, so unfortunately I don't have the answer.
 
  • #9
PSarkar said:
I have no idea what I said to get this response. I have looked at my previous post a few times and I'm still confused by your response. Anyway, sorry if I offended you (whatever it was that offended you).

AISEUhaIUHEiaHEiah

You didn't offend me! You supposed that I might be studying physics. I said no and philosophized a bit, making a critique of the education system.
 
  • #10
Hahaha. Ok. Sorry for the misunderstanding!
 

