Understanding proof about A(adj A) elements

In summary: for any square matrix A, the elements of A(adj A) are |A| on the diagonal and 0 off the diagonal, so A(adj A) = |A|I. Each diagonal element is the scalar product of a row of A with its own cofactors, which gives the determinant |A|. Each off-diagonal element is the scalar product of a row of A with the cofactors of a different row, which gives 0.
  • #1
CoolFool
Hey everyone,
I've been going through a linear algebra textbook, and there's one theorem I can't quite follow. It proves that the elements of A(adj A) are |A| on the diagonal and 0 off the diagonal. Therefore, A(adj A) = |A|I.

The first part shows that the diagonal elements of the product A(adj A) are |A|, because each diagonal element is the scalar product of a row vector of A with the corresponding column of adj A, and that column consists of the cofactors of that same row. This first part I understand, even if I haven't put it very clearly.

The main theorem is: for any matrix A, [itex]\sum^{n}_{k=1}a_{tk}c_{ik} = \delta_{ti}|A|[/itex], where [itex]\delta_{ti}[/itex] is 1 if t = i, and 0 otherwise.
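The theorem can be checked numerically. The sketch below (the 3x3 matrix is an arbitrary choice for illustration) computes each cofactor as the signed determinant of a minor and verifies [itex]\sum^{n}_{k=1}a_{tk}c_{ik} = \delta_{ti}|A|[/itex] for every pair (t, i):

```python
import numpy as np

def cofactor(a, i, j):
    """Cofactor c_ij: signed determinant of the minor obtained by
    deleting row i and column j of a."""
    minor = np.delete(np.delete(a, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

# Arbitrary example matrix (det A = -11, nonzero)
A = np.array([[2.0, 1.0, 3.0],
              [0.0, 4.0, 1.0],
              [5.0, 2.0, 6.0]])
n = A.shape[0]
detA = np.linalg.det(A)

# Verify sum_k a_{tk} c_{ik} = delta_{ti} |A| for all t, i
for t in range(n):
    for i in range(n):
        s = sum(A[t, k] * cofactor(A, i, k) for k in range(n))
        expected = detA if t == i else 0.0
        assert np.isclose(s, expected)
```

The t = i cases are the ordinary cofactor expansions of |A| along each row; the t ≠ i cases are the off-diagonal zeros the theorem asserts.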

As an example, a 3x3 matrix B is constructed by replacing the second row of A with any of A's rows, including the second row itself. If the second row is replaced with itself, then B = A, so multiplying that row by the corresponding column of adj A gives |A|, which we already know.

So far I follow, though constructing B as an example seems awkward. But the next step confuses me. Now we have to prove that the elements not on the diagonal are equal to zero. The example is that if B's second row is replaced with another row, B has two identical rows, and you can subtract one from the other to get a zero row vector, which means that |B| = 0. The rule that a matrix with two identical rows has a determinant of zero makes sense to me, but I don't see why the construction of B has anything to do with A(adj A). That's what throws me off: I don't understand the connection between this product and B.

What am I missing? I feel like it's right in front of me but I can't see it!
Thanks for your help!
 
  • #2
Suppose we want to calculate the element at place ij in A(adj A), with i =/= j. Then, we take the scalar product of row i in A and column j in adj A. The latter column consists of the cofactors of the elements of row j in A. But if we take the scalar product of a row with the cofactors of another row in the same matrix, the result is 0, as you said you understand. Thus, the desired element in A(adj A) is 0.
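The connection to B can be made concrete. In the sketch below (same arbitrary example matrix, with t and i chosen as 0 and 1), B is A with row i replaced by a copy of row t. Because deleting row i removes exactly the replaced row, the cofactors of B along row i are the same as the cofactors of A along row i, so the cofactor expansion of |B| along row i is precisely the scalar product of A's row t with the cofactors of A's row i:

```python
import numpy as np

def cofactor(a, i, j):
    minor = np.delete(np.delete(a, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

A = np.array([[2.0, 1.0, 3.0],
              [0.0, 4.0, 1.0],
              [5.0, 2.0, 6.0]])
t, i = 0, 1  # use entries from row t, cofactors of row i

# B is A with row i replaced by a copy of row t (two identical rows)
B = A.copy()
B[i] = A[t]

# Cofactor expansion of |B| along row i uses the cofactors of A's row i,
# since deleting row i leaves B and A identical.
expansion = sum(B[i, k] * cofactor(A, i, k) for k in range(3))
assert np.isclose(expansion, np.linalg.det(B))
assert np.isclose(expansion, 0.0)  # |B| = 0: repeated rows
```

So the off-diagonal element (t, i) of A(adj A) is literally |B|, which is 0 because B has two identical rows; that is the role of the B construction.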
 

Related to Understanding proof about A(adj A) elements

1. What is A(adj A) in proof?

A(adj A) denotes the product of a square matrix A with its adjugate, adj A. The adjugate is the transpose of the cofactor matrix of A.

2. Why is understanding proof about A(adj A) elements important?

Understanding the proof about the elements of A(adj A) is important because it underlies key results in linear algebra, such as the formula for the inverse of a matrix and methods for solving systems of linear equations.

3. How is A(adj A) calculated?

A(adj A) is calculated by first forming adj A: find the matrix of cofactors of A and take its transpose. Then multiply A by adj A; by the theorem above, the result is det(A) times the identity matrix.

4. Are there any special properties of A(adj A)?

Yes, A(adj A) itself equals the determinant of A times the identity matrix: A(adj A) = det(A)I. The same holds with the factors reversed: (adj A)A = det(A)I.

5. Can A(adj A) be used to find the inverse of a matrix?

Yes, the adjugate can be used to find the inverse of a matrix. If det(A) ≠ 0, then A⁻¹ = (1/det(A)) adj A, which follows directly from A(adj A) = det(A)I.
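Both identities can be demonstrated in a few lines. The sketch below (a 2x2 matrix chosen so that det A = 1) builds the adjugate as the transpose of the cofactor matrix and checks A(adj A) = det(A)I and A⁻¹ = adj(A)/det(A):

```python
import numpy as np

def adjugate(a):
    """Adjugate: transpose of the cofactor matrix of a."""
    n = a.shape[0]
    C = np.empty_like(a, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(a, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])  # det A = 1
adjA = adjugate(A)

# A(adj A) = det(A) I
assert np.allclose(A @ adjA, np.linalg.det(A) * np.eye(2))
# A^{-1} = adj(A) / det(A)
assert np.allclose(adjA / np.linalg.det(A), np.linalg.inv(A))
```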
