Lecture 25 - The Invertible Matrix Theorem

Learning Objectives

Important Theorems So Far

The Invertible Matrix Theorem will tie together ideas from several different theorems we have established so far in this course, including the Spanning Columns Theorem, the Linearly Independent Columns Theorem, and the Left-Side and Right-Side Inverse Theorems.

The Invertible Matrix Theorem

The Invertible Matrix Theorem. Let \( A \) be a square \( n\times n\) matrix. The following statements are equivalent:

  (a) \( A \) is invertible
  (b) \( A \) is row-equivalent to \( I_n \)
  (c) \( A \) has \( n \) pivots
  (d) The only solution to the equation \( A\bbm x = \bbm 0 \) is \( \bbm x = \bbm 0 \)
  (e) The columns of \( A \) are linearly independent
  (f) The linear transformation \( T(\bbm x) = A\bbm x \) is one-to-one
  (g) The equation \( A\bbm x = \bbm b \) is consistent for all \( \bbm b \in \mathbb R^n \)
  (h) The columns of \( A \) span \( \mathbb R^n \)
  (i) The linear transformation \( T(\bbm x) = A\bbm x \) is onto
  (j) There is an \( n\times n\) matrix \( C \) for which \( CA = I_n \)
  (k) There is an \( n\times n\) matrix \( D \) for which \( AD = I_n \)
  (l) \( A^T \) is invertible
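Statements (b) and (c) suggest a concrete computational test for invertibility: row-reduce the matrix and count its pivots. The following pure-Python sketch (our own illustration, not part of the lecture; it uses exact `Fraction` arithmetic to avoid floating-point error, and the helper names `count_pivots` and `is_invertible` are assumptions) implements this check:

```python
from fractions import Fraction

def count_pivots(rows):
    """Forward-eliminate a matrix (list of lists) and count its pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    n_rows, n_cols = len(m), len(m[0])
    pivots = 0
    row = 0
    for col in range(n_cols):
        # Find a nonzero entry in this column at or below the current row.
        pivot_row = next((r for r in range(row, n_rows) if m[r][col] != 0), None)
        if pivot_row is None:
            continue  # no pivot in this column
        m[row], m[pivot_row] = m[pivot_row], m[row]
        # Eliminate the entries below the pivot.
        for r in range(row + 1, n_rows):
            factor = m[r][col] / m[row][col]
            m[r] = [a - factor * b for a, b in zip(m[r], m[row])]
        pivots += 1
        row += 1
    return pivots

def is_invertible(rows):
    """By the Invertible Matrix Theorem, a square n-by-n matrix is
    invertible exactly when it has n pivots (statement (c))."""
    return count_pivots(rows) == len(rows)
```

For example, `is_invertible([[1, 2], [2, 4]])` returns `False`, since the second row is a multiple of the first and elimination leaves only one pivot.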

Proving the Invertible Matrix Theorem

First, keep in mind that since \( A \) is a square matrix, the statement "\( A\) has \( n \) pivots" is equivalent to both "\( A \) has a pivot in every column" and "\( A \) has a pivot in every row." This is also equivalent to saying "\( A \) is row-equivalent to \( I_n \)," since the reduced echelon form of \( A \) has a pivot in every row and every column. Thus, statements (a), (b), and (c) are equivalent.

From the Spanning Columns Theorem, we see that statements (b) and (c) are equivalent to statements (g), (h), and (i).

From the Linearly Independent Columns Theorem, we see that statements (b) and (c) are equivalent to statements (d), (e), and (f).

We have already seen that, if \( A\) is invertible, then \( A^T \) is invertible and \( (A^T)^{-1} = (A^{-1})^T \). Conversely, if \( A^T \) is invertible, then \( (A^T)^T = A \) is also invertible. Thus, statement (a) is equivalent to statement (l).

If \( A \) is invertible, then we can take \( C = A^{-1} \) for statement (j). If \( CA = I_n \), then the Left-Side Inverse Theorem tells us that \( A\) has a pivot in every column, which we have already proved is equivalent to \( A \) being invertible. Thus, (a) is equivalent to (j).

Finally, if \( A \) is invertible, then we can take \( D = A^{-1} \) for statement (k). If \( AD = I_n \), then the Right-Side Inverse Theorem tells us that \( A \) has a pivot in every row, which we have already proved is equivalent to \( A \) being invertible. Thus, (a) is equivalent to (k). \( \Box \)

Using the Invertible Matrix Theorem

If we are interested in determining whether a square matrix \( A \) is invertible, we can use any of the statements in the Invertible Matrix Theorem. While row-reducing \( A \) is always an option, for larger matrices the Invertible Matrix Theorem can be easier or quicker to use.

Note that the Invertible Matrix Theorem applies only to square matrices! To analyze properties of non-square matrices, use more general facts like the Spanning Columns Theorem or the Linearly Independent Columns Theorem.

Example 1. Let \( A = \begin{bmatrix} -2 & 1 & 0 \\ 7 & -4 & 0 \\ 6 & 8 & 0 \end{bmatrix} \). Is \( A \) invertible? Use the Invertible Matrix Theorem to explain your answer.

There are several different ways that we could use the Invertible Matrix Theorem here. Any one of these reasons would suffice to answer this question:

  * The third column of \( A \) is the zero vector, so the columns of \( A \) are linearly dependent and statement (e) fails.
  * Equivalently, \( \bbm x = (0, 0, 1) \) is a nontrivial solution to \( A\bbm x = \bbm 0 \), so statement (d) fails.
  * No echelon form of \( A \) can have a pivot in the third column, so \( A \) has at most two pivots and statement (c) fails.

Since the statements of the Invertible Matrix Theorem are equivalent, failing any one of them means \( A \) is not invertible.
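Because the third column of \( A \) is the zero vector, multiplying \( A \) by \( \bbm x = (0, 0, 1) \) gives \( \bbm 0 \). A quick pure-Python sanity check (our own sketch, not part of the lecture):

```python
A = [[-2, 1, 0],
     [7, -4, 0],
     [6, 8, 0]]
x = [0, 0, 1]
# Compute Ax one row at a time as a dot product.
Ax = [sum(a * b for a, b in zip(row, x)) for row in A]
print(Ax)  # [0, 0, 0], even though x is not the zero vector
```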

Example 2. Let \( A = \begin{bmatrix} 1 & 2 & 3 & 4 \\ 5 & 6 & 7 & 8 \\ 9 & 10 & 11 & 12 \\ 9 & 10 & 11 & 12 \end{bmatrix} \). Is \( A \) invertible? Use the Invertible Matrix Theorem to explain your answer.

Again, there are several ways to use the Invertible Matrix Theorem to answer this question:

  * The third and fourth rows of \( A \) are identical, so row reduction produces a row of zeros; \( A \) has at most three pivots, and statement (c) fails.
  * The columns satisfy the dependence relation \( \bbm a_1 - 2\bbm a_2 + \bbm a_3 = \bbm 0 \), so the columns of \( A \) are linearly dependent and statement (e) fails.

Either way, \( A \) is not invertible.
