Elementary Linear Algebra - Lecture 25 - The Invertible Matrix Theorem
Learning Objectives
Understand the statement of the Invertible Matrix Theorem
Apply the Invertible Matrix Theorem to determine whether a given matrix is invertible
Important Theorems So Far
The Invertible Matrix Theorem will tie together ideas from several different theorems we have established so far in this course:
The Spanning Columns Theorem, first stated in Lecture 9 and then expanded in Lecture 20
The Linearly Independent Columns Theorem, first stated in Lecture 16 and then expanded in Lecture 20
The Left-Side Inverse Theorem and The Right-Side Inverse Theorem, from Lecture 22
The Invertible Matrix Theorem
The Invertible Matrix Theorem. Let \( A \) be a square \( n\times n\) matrix. The following statements are equivalent:
(a) \( A \) is invertible
(b) \( A \) is row-equivalent to \( I_n \)
(c) \( A \) has \( n \) pivots
(d) The only solution to the equation \( A\bbm x = \bbm 0 \) is \( \bbm x = \bbm 0 \)
(e) The columns of \( A \) are linearly independent
(f) The linear transformation \( T(\bbm x) = A\bbm x \) is one-to-one
(g) The equation \( A\bbm x = \bbm b \) is consistent for all \( \bbm b \in \mathbb R^n \)
(h) The columns of \( A \) span \( \mathbb R^n \)
(i) The linear transformation \( T(\bbm x) = A\bbm x \) is onto
(j) There is an \( n\times n\) matrix \( C \) for which \( CA = I_n \)
(k) There is an \( n\times n\) matrix \( D \) for which \( AD = I_n \)
(l) \( A^T \) is invertible
Proving the Invertible Matrix Theorem
First, keep in mind that since \( A \) is a square matrix, the statement "\( A\) has \( n \) pivots" is equivalent to both "\( A \) has a pivot in every column" and "\( A \) has a pivot in every row." This is also equivalent to saying "\( A \) is row-equivalent to \( I_n \)," since the reduced echelon form of \( A \) has a pivot in every row and every column. Thus, statements (a), (b), and (c) are equivalent.
From the Spanning Columns Theorem, we see that statements (b) and (c) are equivalent to statements (g), (h), and (i).
From the Linearly Independent Columns Theorem, we see that statements (b) and (c) are equivalent to statements (d), (e), and (f).
We have already seen that, if \( A\) is invertible, then \( A^T \) is invertible and \( (A^T)^{-1} = (A^{-1})^T \). Conversely, if \( A^T \) is invertible, then applying the same fact to \( A^T \) shows that \( (A^T)^T = A \) is also invertible. Thus, statement (a) is equivalent to statement (l).
If \( A \) is invertible, then we can take \( C = A^{-1} \) for statement (j). If \( CA = I_n \), then the Left-Side Inverse Theorem tells us that \( A\) has a pivot in every column, which we have already proved is equivalent to \( A \) being invertible. Thus, (a) is equivalent to (j).
Finally, if \( A \) is invertible, then we can take \( D = A^{-1} \) for statement (k). If \( AD = I_n \), then the Right-Side Inverse Theorem tells us that \( A \) has a pivot in every row, which we have already proved is equivalent to \( A \) being invertible. Thus, (a) is equivalent to (k). \( \Box \)
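As a computational aside (not part of the proof), the transpose fact \( (A^T)^{-1} = (A^{-1})^T \) used above can be illustrated numerically with NumPy; the matrix here is an arbitrary invertible example of our own choosing:

```python
import numpy as np

# An invertible 2x2 matrix chosen for illustration (determinant 10).
A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

# Inverting the transpose agrees with transposing the inverse.
lhs = np.linalg.inv(A.T)
rhs = np.linalg.inv(A).T
print(np.allclose(lhs, rhs))  # True
```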
Using the Invertible Matrix Theorem
If we are interested in determining whether a square matrix \( A \) is invertible, we can use any of the statements in the Invertible Matrix Theorem. While row-reducing \( A \) is always an option, for larger matrices the Invertible Matrix Theorem can be easier or quicker to use.
Note that the Invertible Matrix Theorem only applies to square matrices! To analyze properties of non-square matrices, use more general facts like the Spanning Columns Theorem or the Linearly Independent Columns Theorem.
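As a computational aside (again assuming NumPy, which the lecture does not otherwise use), several statements of the theorem can be checked numerically for a concrete matrix; the \( 3\times 3 \) matrix below is an invertible example of our own choosing:

```python
import numpy as np

# An invertible 3x3 matrix chosen for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]

# Statement (c): A has n pivots exactly when its rank is n.
has_n_pivots = np.linalg.matrix_rank(A) == n

# Statement (d): for a full-rank square matrix, Ax = 0 has only
# the trivial solution, which np.linalg.solve returns.
x = np.linalg.solve(A, np.zeros(n))
only_trivial_solution = np.allclose(x, 0)

# Statement (a): A is invertible, so A times its inverse is I_n.
A_inv = np.linalg.inv(A)
print(has_n_pivots, only_trivial_solution, np.allclose(A @ A_inv, np.eye(n)))
```

All three checks agree, as the theorem guarantees they must for any square matrix.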
Example 1. Let \( A = \begin{bmatrix} -2 & 1 & 0 \\ 7 & -4 & 0 \\ 6 & 8 & 0 \end{bmatrix} \). Is \( A \) invertible? Use the Invertible Matrix Theorem to explain your answer.
There are several different ways that we could use the Invertible Matrix Theorem here. Any one of these reasons would suffice to answer this question:
The reduced echelon form of \( A \) is \( \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{bmatrix} \), which is not \( I_3 \), so \( A \) is not invertible by statement (b).
The third column of \( A\) is all zeroes, so \( A\) cannot have a pivot in that column. Since \( A\) has fewer than 3 pivots, \( A \) is not invertible by statement (c).
The equation \( A\bbm x = \bbm 0 \) has a nonzero solution \( \bbm x = \vecthree 001 \). So, \( A \) is not invertible by statement (d).
The columns of \( A \) are not linearly independent since \( 0 \vecthree {-2} 7 6 + 0 \vecthree 1 {-4} 8 + 1 \vecthree 000 = \vecthree 000 \) is a dependence relation. This tells us that \( A \) is not invertible by statement (e).
The transformation \( T(\bbm x) = A\bbm x\) is not one-to-one since \( T\left( \vecthree 001 \right) \) and \( T\left( \vecthree 002 \right) \) both equal \( \bbm 0 \). Therefore, \( A \) is not invertible by statement (f). \( \Box \)
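As a quick numerical check of Example 1 (an aside, assuming NumPy), the rank computation and the nonzero solution of \( A\bbm x = \bbm 0 \) can both be verified directly:

```python
import numpy as np

# The matrix from Example 1.
A = np.array([[-2.0,  1.0, 0.0],
              [ 7.0, -4.0, 0.0],
              [ 6.0,  8.0, 0.0]])

# Statement (c): the rank (number of pivots) is 2, fewer than 3.
print(np.linalg.matrix_rank(A))  # 2

# Statement (d): the nonzero vector (0, 0, 1) solves Ax = 0.
x = np.array([0.0, 0.0, 1.0])
print(np.allclose(A @ x, 0))  # True
```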
Example 2. Let \( A = \begin{bmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 9 & 10 & 11 & 12 \\ 9 & 10 & 11 & 12 \end{bmatrix} \). Is \( A \) invertible? Use the Invertible Matrix Theorem to explain your answer.
Again, there are several ways to use the Invertible Matrix Theorem to answer this question:
The reduced echelon form of \( A \) is \( \begin{bmatrix} 1 & 0 & -1 & -2 \\ 0 & 1 & 2 & 3 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix} \). Since this is not \( I_4 \), by statement (b) the matrix \( A \) is not invertible.
Since the third and fourth rows of \( A \) are equal, any linear combination of the columns of \( A \) will have its third and fourth entries equal. So, if \( \bbm b = \vecfour 1212 \), this tells us that \( A \bbm x = \bbm b\) has no solution. By statement (g), this means that \( A \) is not invertible.
By similar reasoning, the columns of \( A \) do not span \( \mathbb R^4 \) since the span of the columns of \( A\) will not contain the vector \( \vecfour 1212 \). Thus, by statement (h), \( A \) is not invertible.
The transformation \( T(\bbm x) = A\bbm x\) is not onto, since the equation \( T(\bbm x) = \vecfour 1212 \) has no solutions. Therefore, \( A \) is not invertible by statement (i).
The matrix \( A^T \) has two equal columns, so its columns are not linearly independent. By statement (e), this means that \( A^T \) is not invertible, and so by statement (l) we conclude that \( A \) is not invertible.
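Example 2 can likewise be checked numerically (an aside, assuming NumPy): the rank confirms the reduced echelon form computed above, and a least-squares solve shows that \( \bbm b = \vecfour 1212 \) is not in the span of the columns:

```python
import numpy as np

# The matrix from Example 2.
A = np.array([[1.0,  2.0,  3.0,  4.0],
              [5.0,  6.0,  7.0,  8.0],
              [9.0, 10.0, 11.0, 12.0],
              [9.0, 10.0, 11.0, 12.0]])

# Statement (c): only 2 pivots, matching the reduced echelon form above.
print(np.linalg.matrix_rank(A))  # 2

# Statement (g): b has unequal third and fourth entries, so Ax = b is
# inconsistent; even the least-squares solution leaves A @ x != b.
b = np.array([1.0, 2.0, 1.0, 2.0])
x, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(A @ x, b))  # False: b is not in the span of the columns
```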