Regarding the linear dependence of eigenvectors

In summary, eigenvectors of an n-square matrix are linearly independent whenever each vector belongs to a distinct eigenvalue. When the set contains several eigenvectors per eigenvalue, independence of the whole set follows if Gram-Schmidt (or row operations) applied within each same-eigenvalue subset never produces the zero vector. In some cases, special structure in the matrix, such as symmetry, guarantees that the eigenvectors can be chosen mutually orthonormal, and hence linearly independent.
  • #1
Adgorn
Let's say we have a set of eigenvectors of a certain n-square matrix. I understand why the vectors are linearly independent if each vector belongs to a distinct eigenvalue.
However, suppose the set is composed of subsets of vectors, where the vectors of each subset belong to the same eigenvalue. For example, in a 7-square matrix, ##v_1, v_2, v_3## belong to ##λ_1##, ##v_4, v_5## belong to ##λ_2##, and ##v_6, v_7## belong to ##λ_3##. How do we prove the vectors are still linearly independent?
 
  • #2
Adgorn said:
Let's say we have a set of eigenvectors of a certain n-square matrix. I understand why the vectors are linearly independent if each vector belongs to a distinct eigenvalue.
However, suppose the set is composed of subsets of vectors, where the vectors of each subset belong to the same eigenvalue. For example, in a 7-square matrix, ##v_1, v_2, v_3## belong to ##λ_1##, ##v_4, v_5## belong to ##λ_2##, and ##v_6, v_7## belong to ##λ_3##. How do we prove the vectors are still linearly independent?

The most literal answer is to do Gram-Schmidt (or perhaps row operations) on ##v_1, v_2, v_3##, then repeat on ##v_4, v_5##, then finally on ##v_6, v_7##. If in each case you don't get the zero vector as part of your set, then you have linear independence: any linear combination of all seven vectors that sums to zero splits into one piece per eigenvalue, and since eigenvectors belonging to distinct eigenvalues are independent, each per-eigenvalue piece must itself be zero.
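A minimal NumPy sketch of that per-subset check (the check itself is general; the two small demo vector sets at the end are hypothetical, and in practice you would pass in the ##v_1, \dots, v_7## grouped by eigenvalue):

```python
import numpy as np

def gram_schmidt_is_independent(vectors, tol=1e-10):
    """Run Gram-Schmidt on a list of 1-D arrays; return False as soon as
    a step yields (numerically) the zero vector, i.e. a dependence."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for b in basis:
            w = w - (b @ w) * b      # remove the component along b
        norm = np.linalg.norm(w)
        if norm < tol:
            return False             # v lies in the span of the earlier vectors
        basis.append(w / norm)
    return True

# small concrete demos (hypothetical vectors, not from any particular matrix)
print(gram_schmidt_is_independent([np.array([1., 0.]), np.array([0., 1.])]))  # True
print(gram_schmidt_is_independent([np.array([1., 1.]), np.array([2., 2.])]))  # False

# hypothetical usage for the 7-square example, grouped by eigenvalue:
# subsets = [[v1, v2, v3], [v4, v5], [v6, v7]]
# independent = all(gram_schmidt_is_independent(s) for s in subsets)
```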

Now more generally, note that there could be special structure in your matrix that guarantees linear independence between (well-chosen) eigenvectors. E.g., the eigenvectors associated with absorbing states in absorbing Markov chains are mutually orthonormal. (There's a similar, more general argument about recurrent classes and so forth, but that's outside the scope.)

The biggest case to be aware of is when your matrix is Hermitian (or, over the reals, symmetric): then, via the Schur decomposition, it can always be diagonalized, because the eigenvectors can be chosen to be mutually orthonormal. (There's a bit more to it with normal matrices, etc., but the big one is symmetry.)
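As a quick sanity check of that claim, here is a NumPy sketch with a hypothetical symmetric matrix; `numpy.linalg.eigh` is the solver for Hermitian/symmetric matrices and returns the eigenvectors as orthonormal columns:

```python
import numpy as np

# hypothetical real symmetric matrix
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# columns of V are orthonormal eigenvectors, w the eigenvalues
w, V = np.linalg.eigh(A)

print(np.allclose(V.T @ V, np.eye(3)))        # True: orthonormal columns
print(np.allclose(V @ np.diag(w) @ V.T, A))   # True: A is diagonalized by V
```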
 
  • #3
StoneTemplePython said:
The most literal answer is to do Gram-Schmidt (or perhaps row operations) on ##v_1, v_2, v_3##, then repeat on ##v_4, v_5##, then finally on ##v_6, v_7##. If in each case you don't get the zero vector as part of your set, then you have linear independence: any linear combination of all seven vectors that sums to zero splits into one piece per eigenvalue, and since eigenvectors belonging to distinct eigenvalues are independent, each per-eigenvalue piece must itself be zero.

Now more generally, note that there could be special structure in your matrix that guarantees linear independence between (well-chosen) eigenvectors. E.g., the eigenvectors associated with absorbing states in absorbing Markov chains are mutually orthonormal. (There's a similar, more general argument about recurrent classes and so forth, but that's outside the scope.)

The biggest case to be aware of is when your matrix is Hermitian (or, over the reals, symmetric): then, via the Schur decomposition, it can always be diagonalized, because the eigenvectors can be chosen to be mutually orthonormal. (There's a bit more to it with normal matrices, etc., but the big one is symmetry.)
I see, thanks for the assist :).
 

FAQ: Regarding the linear dependence of eigenvectors

1. How do you determine if a set of eigenvectors is linearly dependent?

To determine whether a set of eigenvectors is linearly dependent, you can use the determinant method or the rank method. The determinant method applies when there are n vectors in an n-dimensional space: form the matrix whose columns are the eigenvectors and compute its determinant; if the determinant equals 0, the eigenvectors are linearly dependent. The rank method works in general: form a matrix with the eigenvectors as its columns and compute its rank; if the rank is less than the number of eigenvectors, they are linearly dependent.
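A minimal NumPy sketch of both methods, using hypothetical vectors with ##v_3 = v_1 + v_2## built in so that the dependence is visible:

```python
import numpy as np

# hypothetical eigenvectors, stacked as the columns of a matrix
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2                       # deliberately dependent on v1 and v2
V = np.column_stack([v1, v2, v3])

# rank method: rank below the number of vectors means dependence
print(np.linalg.matrix_rank(V))            # 2, which is < 3

# determinant method (only when there are n vectors in n dimensions)
print(np.isclose(np.linalg.det(V), 0.0))   # True: determinant is zero
```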

2. What does it mean for eigenvectors to be linearly dependent?

If eigenvectors are linearly dependent, it means that at least one of the eigenvectors can be expressed as a linear combination of the others. In other words, one of the eigenvectors is redundant and can be removed without changing the span of the set.

3. Can a set of linearly dependent eigenvectors still form a basis for a vector space?

No, a set of linearly dependent eigenvectors cannot form a basis for a vector space. A basis must consist of linearly independent vectors, meaning that none of the vectors can be expressed as a linear combination of the others. If a set of eigenvectors is linearly dependent, then at least one vector can be expressed as a linear combination of the others, making it redundant.

4. What implications does the linear dependence of eigenvectors have for diagonalization?

An n-square matrix is diagonalizable precisely when it has a full set of n linearly independent eigenvectors. A particular linearly dependent set does not by itself rule this out, but if no set of n independent eigenvectors exists, the matrix (called defective) cannot be diagonalized and must be put in a different form, such as the Jordan form.
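As a concrete NumPy sketch, the standard 2×2 Jordan block has eigenvalue 1 with algebraic multiplicity 2 but only a one-dimensional eigenspace, so no full independent set of eigenvectors exists:

```python
import numpy as np

# the 2x2 Jordan block: not diagonalizable (defective)
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

w, V = np.linalg.eig(A)
# both computed eigenvectors belong to eigenvalue 1 and are numerically
# parallel, so the eigenvector matrix is rank-deficient
print(np.linalg.matrix_rank(V))   # 1, which is < 2
```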

5. Can a matrix have linearly dependent eigenvectors with distinct eigenvalues?

No. Eigenvectors belonging to distinct eigenvalues are always linearly independent; this follows from the eigenvalue relation itself rather than from any particular choice of the vectors. Linear dependence can only occur among eigenvectors that share an eigenvalue.
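For completeness, here is the standard two-vector argument in the thread's notation: suppose ##Av_1 = λ_1v_1##, ##Av_2 = λ_2v_2## with ##λ_1 ≠ λ_2##, and ##av_1 + bv_2 = 0##. Applying ##A## gives ##aλ_1v_1 + bλ_2v_2 = 0##; subtracting ##λ_2## times the first relation leaves ##a(λ_1 - λ_2)v_1 = 0##, so ##a = 0##, and then ##b = 0##. Induction on the number of distinct eigenvalues extends this to any such set.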
