How Do Eigenvalues and Eigenvectors Connect to Fourier Transforms?

In summary, the conversation reviews what eigenvalues and eigenvectors are in linear algebra and their role in diagonalizing matrices and finding the "modes" of a system. It covers why hermitian matrices are significant, when a matrix can fail to have a complete set of eigenvectors, the connection between eigenvalues and the Fourier transform, and the original poster's wish to learn the derivations of the Laplace and Z transforms.
  • #1
MrAlbot
Hello guys, can someone give me a summary of what eigenvalues and eigenvectors are? I don't really recall this topic from linear algebra, and I'm not getting any intuition for where the Fourier transform comes from.

my teacher wrote:

$$A\bar{v} = \lambda\bar{v}$$

Then he said that a vector ##\bar{x}## can be written as

$$\bar{x} = \sum_{i=1}^{n} x_i \bar{e}_i$$

where he calls the ##\bar{e}_i## the initial orthonormal basis.

Then he says that this is equal to

$$\bar{x} = \sum_{i=1}^{n} \hat{x}_i \bar{v}_i$$

where the ##\bar{v}_i## form the basis of eigenvectors of ##A##.

Then he says that, with ##\bar{y} = A\bar{x}##,

$$\bar{y} = \sum_{i=1}^{n} y_i \bar{e}_i = \sum_{i=1}^{n} \hat{y}_i \bar{v}_i = A \sum_{i=1}^{n} \hat{x}_i \bar{v}_i = \sum_{i=1}^{n} \hat{x}_i A \bar{v}_i = \sum_{i=1}^{n} \hat{x}_i \lambda_i \bar{v}_i,$$

where the last step uses ##A\bar{v}_i = \lambda_i \bar{v}_i##.

So we get that ##\hat{y}_i = \lambda_i \hat{x}_i##.

I would like to know the intuition behind this and how it relates to the Fourier series / Fourier transform.
I'd really appreciate it if we didn't go into deep mathematics, since my linear algebra foundations are very weak and relearning it would take time that, unfortunately, I don't have right now.


Hope someone can help!

Thanks in advance!

Pedro
 
  • #2
In linear algebra you will have "diagonalized the matrix" towards the end of the term; this process finds the eigenvalues (the terms on the diagonal) and the eigenvectors (the new set of basis vectors for the system).

Thus if you can diagonalize the matrix, a complete set of eigenvectors will exist; they have very nice analytical properties. They correspond to the physical "modes of the system" - if you bang something in the same direction as one of its eigenvectors, then it will only respond in that direction; if you hit it elsewhere, you get multiple responses. That is the significance of the eigenvector equation ... used heavily in acoustics and quantum mechanics, among others.
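To make this concrete, here is a minimal numpy sketch (an illustration added here, not from the original posts) that diagonalizes a symmetric matrix and checks the coefficient relation ##\hat{y}_i = \lambda_i \hat{x}_i## derived in post #1:

```python
import numpy as np

# A real symmetric (hermitian) matrix, so a complete orthonormal
# set of eigenvectors is guaranteed to exist.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Columns of V are the eigenvectors v_i; lam holds the eigenvalues.
lam, V = np.linalg.eigh(A)

x = np.array([1.0, -2.0])
y = A @ x

# Coefficients of x and y in the eigenvector basis (V is orthogonal,
# so the change of basis is just V^T).
x_hat = V.T @ x
y_hat = V.T @ y

# In the eigenbasis, applying A reduces to elementwise scaling:
# y_hat_i = lambda_i * x_hat_i.
print(np.allclose(y_hat, lam * x_hat))  # True
```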
 
  • #3
Not all matrices have a set of eigenvectors that spans the whole vector space. As an example, consider the rotation matrix in ##\mathbb{R}^2##:

$$\begin{pmatrix}
\cos\theta & -\sin\theta \\
\sin\theta & \cos\theta
\end{pmatrix}$$

Unless ##\theta## is a multiple of ##\pi##, this matrix has no real eigenvectors at all!

Usually in applications in physics and engineering, the matrices are hermitian, which guarantees a complete set of eigenvectors.
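As a numerical illustration of this (a sketch added here, not part of the original post): numpy's eigensolver works over the complex numbers, so for a generic angle it returns the complex eigenvalue pair ##e^{\pm i\theta}## rather than any real eigenvectors:

```python
import numpy as np

theta = np.pi / 4  # 45 degrees -- not a multiple of pi
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# eig works over the complex numbers, so it still returns
# eigenpairs -- but they are complex, not real.
lam, V = np.linalg.eig(R)
print(np.sort_complex(lam))  # approximately [e^{-i*theta}, e^{+i*theta}]
print(np.allclose(np.sort_complex(lam),
                  [np.exp(-1j * theta), np.exp(1j * theta)]))  # True
```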
 
  • #4
Mr. Albot:
Maybe you can use hilbert2's example to illustrate the concept: if 1 were an eigenvalue, then a vector would be sent to itself (or, more precisely, back to the same place ##(x, y)## where it started) after the rotation. Clearly, as hilbert2 says, 1 can only be an eigenvalue if you rotate by an integer multiple of ##\pi##, and the eigenvectors would be all the points that are fixed by the rotation. Notice that if ##\lambda = 1## is an eigenvalue, that means ##Tv = v##, so that ##v## is fixed by the transformation.
 
  • #5
Thanks a lot, guys! I just started studying linear algebra from the beginning because I wasn't understanding anything you were saying, and only now can I see how useful your comments were! Algebra is beautiful... Thanks a lot again!
 
  • #6
A correction to my post #4: that should be an integer multiple of ##2\pi## , not an integer multiple of ##\pi##.
 
  • #7
Exactly! That makes a lot more sense now, though I got the point the first time. Do you know the best place to learn the derivation of the Fourier transform? Right now I'm learning from Khan Academy since I'm a little short on time, but it's been a pleasant trip through linear algebra. How exactly do I map from ##\mathbb{R}^n## to the complex plane?
Best regards

edit: what I really want to know is the derivation of the Laplace transform and the Z transform, since the Fourier transform follows from them.
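For the connection the thread title asks about, one concrete discrete analogue (an illustrative numpy sketch added here, not from the original thread) is that complex exponentials are eigenvectors of every circulant matrix, i.e. of every discrete shift-invariant system, and the eigenvalues are given by the DFT of the system's first column:

```python
import numpy as np

n = 8
# First column of a circulant matrix; circulant matrices represent
# discrete shift-invariant (LTI) systems such as circular convolution.
c = np.array([4.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0])
C = np.array([np.roll(c, k) for k in range(n)]).T  # column k = c shifted by k

# The eigenvalues are the DFT of the first column ...
lam = np.fft.fft(c)

# ... and the eigenvectors are the DFT basis vectors w_k[m] = e^{2*pi*i*k*m/n}.
m = np.arange(n)
for k in range(n):
    w = np.exp(2j * np.pi * k * m / n)
    print(np.allclose(C @ w, lam[k] * w))  # True for every k
```

This is exactly the ##\hat{y}_i = \lambda_i \hat{x}_i## picture from post #1: the Fourier transform is the change of basis into the eigenvectors of shift-invariant systems, and in that basis applying the system becomes multiplication by the eigenvalues (the frequency response).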
 
Last edited:

Related to How Do Eigenvalues and Eigenvectors Connect to Fourier Transforms?

1. What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are concepts in linear algebra that describe the behavior of a linear transformation on a vector space. An eigenvector is a nonzero vector whose direction the transformation leaves unchanged; the eigenvalue is the scalar by which the transformation stretches or compresses that eigenvector.

2. How are eigenvalues and eigenvectors calculated?

The eigenvalues of a matrix ##A## are the values of ##\lambda## that satisfy the characteristic equation ##\det(A - \lambda I) = 0##, where ##I## is the identity matrix. For each such ##\lambda##, the corresponding eigenvectors are the nonzero solutions ##x## of ##(A - \lambda I)x = 0##.
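For example (a small worked example added for illustration), take ##A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}##. Then

$$\det(A - \lambda I) = (2 - \lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 = 0,$$

so the eigenvalues are ##\lambda = 1## and ##\lambda = 3##, with eigenvectors ##(1, -1)^T## and ##(1, 1)^T## respectively.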

3. What is the significance of eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are important because they provide valuable information about linear transformations. They can be used to decompose a transformation into simpler forms, determine the stability of a system, and identify special directions or planes in a transformation.
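As a sketch of the stability point (illustrative, with made-up numbers): a discrete system ##x_{k+1} = Ax_k## decays to zero exactly when every eigenvalue of ##A## has magnitude less than one.

```python
import numpy as np

A = np.array([[0.5, 0.4],
              [0.0, 0.9]])

# All eigenvalues lie strictly inside the unit circle ...
lam = np.linalg.eigvals(A)
print(np.max(np.abs(lam)) < 1)  # True

# ... so repeated application of A drives any state to zero.
x = np.array([1.0, 1.0])
for _ in range(100):
    x = A @ x
print(np.allclose(x, 0.0, atol=1e-4))  # True
```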

4. Can a matrix have more than one eigenvalue and eigenvector?

Yes, a matrix can have multiple eigenvalues and corresponding eigenvectors. An ##n \times n## matrix has ##n## eigenvalues when they are counted with multiplicity over the complex numbers, though some of them may be repeated. A repeated eigenvalue may have several linearly independent eigenvectors, but it can also have fewer; in that case the matrix is called defective and cannot be diagonalized.

5. How are eigenvalues and eigenvectors used in real-world applications?

Eigenvalues and eigenvectors have many applications in fields such as physics, engineering, and computer science. They are used in quantum mechanics, image processing, data compression, and more. For example, in image processing, eigenvectors can be used to reduce the dimensionality of an image while preserving its important features.
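As a hedged sketch of the dimensionality-reduction idea (illustrative synthetic data, not a production PCA implementation): the eigenvectors of the data's covariance matrix give the directions of greatest variance, and projecting onto the dominant one compresses the data.

```python
import numpy as np

rng = np.random.default_rng(0)
# 200 two-dimensional points, strongly correlated along one direction.
X = rng.normal(size=(200, 1)) @ np.array([[3.0, 1.0]]) \
    + 0.1 * rng.normal(size=(200, 2))

# Eigenvectors of the covariance matrix are the principal directions.
cov = np.cov(X, rowvar=False)
lam, V = np.linalg.eigh(cov)  # eigenvalues in ascending order
print(lam)                    # one eigenvalue dominates

# Project onto the top eigenvector: 2D reduced to 1D while
# keeping most of the variance (the idea behind PCA / eigenfaces).
X_reduced = X @ V[:, -1]
print(X_reduced.shape)        # (200,)
```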
