Definition. Let \( H \) be a subspace of \( \mathbb R^n \). A basis for \( H \) is a set of vectors \( \{ \bbm b_1, \ldots, \bbm b_p \} \) such that (1) the set is linearly independent, and (2) the set spans \( H \).
We have already seen one example of a basis: the standard basis vectors \( \{ \bbm e_i \} \) that we learned about in Lecture 19 form a basis for \( \mathbb R^n \). Recall that these vectors are the columns of the identity matrix \( I_n \); they are linearly independent and they span \( \mathbb R^n \).
Definition. The standard basis for \( \mathbb R^n \) is the set \( \{ \bbm e_1, \bbm e_2, \ldots, \bbm e_n \} \) containing the standard basis vectors.
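As a quick computational aside (not part of the lecture itself), here is a minimal NumPy sketch of the fact just recalled: the identity matrix has a pivot in every column, so its columns are linearly independent and span \( \mathbb R^3 \).

```python
import numpy as np

# The columns of the 3x3 identity matrix are e_1, e_2, e_3.
I3 = np.eye(3)

# Rank 3 means a pivot in every row and every column, so the columns
# are linearly independent and span R^3.
print(np.linalg.matrix_rank(I3))  # 3
```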
Example 1. Let \( H = \left\{ \vecthree {a+b} {a} {-3b} : a,b\in \mathbb R \right\} \). Prove that \( \left\{ \vecthree 220, \vecthree {-1}03 \right\} \) is a basis for \( H \).
First, we prove that \( \left\{ \vecthree 220, \vecthree {-1}03 \right\} \) is a linearly independent set. We form the matrix that has these vectors as its columns and row-reduce: \[ \begin{bmatrix} 2 & -1 \\ 2 & 0 \\ 0 & 3 \end{bmatrix} \longrightarrow \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{bmatrix} \] Since there is a pivot in every column, the vectors are linearly independent by the Linearly Independent Columns Theorem.
Now, we must prove that \( \left\{ \vecthree 220, \vecthree {-1}03 \right\} \) spans \( H \). To do this, we must show that any vector \( \bbm v \in H \) can be written as a linear combination of these vectors.
Let \( \bbm v\in H \). By the definition of \( H\) we have \( \bbm v = \vecthree {a+b} {a} {-3b} \) for some real numbers \( a,b\in \mathbb R\). Now, \[ \bbm v = \vecthree {a+b} {a} {-3b} = \vecthree aa0 + \vecthree b0{-3b} = \frac{a}{2} \vecthree 220 + (-b) \vecthree {-1}03. \] Since \( \left\{ \vecthree 220, \vecthree {-1}03 \right\} \) is a linearly independent set that spans \( H \), this is a basis for \( H \). \( \Box \)
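If you want to double-check this example by machine, here is a minimal SymPy sketch (the names `M`, `a`, `b`, `c1`, `c2` are ours, not the lecture's). It row-reduces the matrix of candidate basis vectors, then solves for the coefficients symbolically, recovering the same \( a/2 \) and \( -b \) as in the proof.

```python
from sympy import Matrix, symbols, linsolve

# Candidate basis vectors as columns.
M = Matrix([[2, -1],
            [2,  0],
            [0,  3]])
print(M.rref())  # pivot in every column => linearly independent

# Spanning: solve c1*(2,2,0) + c2*(-1,0,3) = (a+b, a, -3b) symbolically.
a, b, c1, c2 = symbols('a b c1 c2')
v = Matrix([a + b, a, -3*b])
print(linsolve((M, v), [c1, c2]))  # {(a/2, -b)}
```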
Given an \( m\times n\) matrix \( A \), we want to find a basis for \( \Nul A \). We learned in Lecture 28 how to find a spanning set for \( \Nul A \), but now we want a spanning set that is also linearly independent.
Example 2. Find a basis for \( \Nul A \), where \( A = \begin{bmatrix} -3 & 6 & -1 & 1 & -7 \\ 1 & -2 & 2 & 3 & -1 \\ 2 & -4 & 5 & 8 & -4 \end{bmatrix} \).
We'll start by finding a spanning set for \( \Nul A \). Recall that, to do this, we need to find the parametric vector form of the solution to \( A \bbm x = \bbm 0\). First row-reduce the corresponding augmented matrix: \[ \begin{bmatrix} -3 & 6 & -1 & 1 & -7 & 0 \\ 1 & -2 & 2 & 3 & -1 & 0 \\ 2 & -4 & 5 & 8 & -4 & 0 \end{bmatrix} \longrightarrow \begin{bmatrix} 1 & -2 & 0 & -1 & 3 & 0 \\ 0 & 0 & 1 & 2 & -2 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 \end{bmatrix} \]
Now, write the solution in parametric vector form: \[ \bbm x = \vecfive {x_1} {x_2} {x_3} {x_4} {x_5} = \vecfive {2x_2+x_4-3x_5} {x_2} {-2x_4+2x_5} {x_4} {x_5} = x_2 \vecfive 21000 + x_4 \vecfive 10{-2}10 + x_5 \vecfive {-3}0201. \]
From this, we can see that every solution of \( A \bbm x = \bbm 0 \) can be written as a linear combination of \( \bbm v_1 = \vecfive 21000, \bbm v_2 = \vecfive 10{-2}10 \), and \( \bbm v_3 = \vecfive {-3}0201 \). So, \( \{ \bbm v_1, \bbm v_2, \bbm v_3 \}\) is a spanning set for \( \Nul A \). Is this set a basis for \( \Nul A \)?
Note that the vector \( \bbm v_1 \) has a 1 in its second entry and the other vectors have a 0 in this entry. This is because \( \bbm v_1 \) corresponds to the free variable \( x_2 \). Similarly, \( \bbm v_2 \) is the only vector in this set that has a nonzero fourth entry, and \( \bbm v_3 \) is the only vector in this set that has a nonzero fifth entry. If \( c_1 \bbm v_1 + c_2 \bbm v_2 + c_3 \bbm v_3 = \bbm 0 \), then we have: \[ \vecfive {2c_1+c_2-3c_3} {c_1} {-2c_2+2c_3} {c_2} {c_3} = \vecfive 00000, \] which gives \( c_1 = c_2 = c_3 = 0 \). Therefore, \( \{ \bbm v_1, \bbm v_2, \bbm v_3 \}\) is a linearly independent set. Since we already knew that this set spans \( \Nul A \), this shows that \( \{ \bbm v_1, \bbm v_2, \bbm v_3 \}\) is a basis for \( \Nul A \). \( \Box \)
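As a sanity check, SymPy's built-in `nullspace` method carries out exactly this computation; a minimal sketch (variable names ours):

```python
from sympy import Matrix

A = Matrix([[-3,  6, -1, 1, -7],
            [ 1, -2,  2, 3, -1],
            [ 2, -4,  5, 8, -4]])

# Returns one basis vector per free variable, matching v_1, v_2, v_3 above.
for v in A.nullspace():
    print(v.T)  # [2,1,0,0,0], [1,0,-2,1,0], [-3,0,2,0,1]
```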
In fact, the nonzero vectors in any parametric solution of a homogeneous vector equation (if there are any) are always linearly independent. This means that the process we followed in Lecture 28 to find a spanning set for \( \Nul A \) actually finds a basis for \( \Nul A \).
Example 3. Let \( A = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \end{bmatrix} \). Find a basis for \( \Nul A \).
We find the solution to \( A \bbm x = \bbm 0 \) and write it in parametric vector form: \[ \begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \end{bmatrix} \longrightarrow \begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 0 \end{bmatrix} \qquad \qquad \bbm x = \vecthree {x_1} {x_2} {x_3} = \vecthree {-2x_2-3x_3} {x_2} {x_3} = x_2 \vecthree {-2} 1 0 + x_3 \vecthree {-3} 0 1. \]
The basis we have found for \( \Nul A \) is \( \left\{ \vecthree {-2} 1 0, \vecthree {-3} 0 1 \right\} \). \( \Box \)
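A quick numerical check of this answer (a NumPy sketch, with names of our choosing): each basis vector should satisfy \( A \bbm x = \bbm 0 \).

```python
import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6]])

# Each vector from the parametric form should be sent to the zero vector.
for v in ([-2, 1, 0], [-3, 0, 1]):
    print(A @ np.array(v))  # [0 0] both times
```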
Given an \( m\times n\) matrix \( A \), we want to find a basis for \( \Col A \). We learned in Lecture 29 that it is easy to find a spanning set for \( \Col A \): the columns of \( A \). In general, this set will not be linearly independent, so we will need to modify it to turn it into a basis for \( \Col A \).
Example 4. Find a basis for \( \Col A \), where \( A = \begin{bmatrix} 1 & 4 & 0 & 2 & 0 \\ 0 & 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix} \).
Write \( \bbm a_1, \ldots, \bbm a_5 \) for the columns of \( A \). We know that \( \{ \bbm a_1, \ldots, \bbm a_5 \} \) is a spanning set for \( \Col A \), but this is not a basis because the set is not linearly independent. In fact, we can see some of the dependence relations in this set: \( \bbm a_2 = 4\bbm a_1 \) and \( \bbm a_4 = 2\bbm a_1 - \bbm a_3 \).
These relations allow us to write any element of \( \Col A \) without \( \bbm a_2 \) or \( \bbm a_4 \). Let \( \bbm w \in \Col A \). Since \( \bbm w \) is a linear combination of \( \{ \bbm a_1, \ldots, \bbm a_5 \} \), we can write \( \bbm w = c_1 \bbm a_1 + \cdots + c_5 \bbm a_5 \) for some scalars \( c_1, \ldots, c_5 \).
Now, we can rewrite \( \bbm w\) using \( \bbm a_2 = 4\bbm a_1 \) and \( \bbm a_4 = 2\bbm a_1 - \bbm a_3 \): \[ \begin{eqnarray*} \bbm w & = & c_1 \bbm a_1 + c_2 \bbm a_2 + c_3 \bbm a_3 + c_4 \bbm a_4 + c_5 \bbm a_5 \\ & = & c_1 \bbm a_1 + c_2 (4\bbm a_1) + c_3 \bbm a_3 + c_4 (2\bbm a_1 - \bbm a_3) + c_5 \bbm a_5 \\ & = & (c_1 + 4c_2 + 2c_4) \bbm a_1 + (c_3 - c_4) \bbm a_3 + c_5 \bbm a_5 \end{eqnarray*} \]
Since every element of \( \Col A \) can be written as a linear combination of \( \{ \bbm a_1, \bbm a_3, \bbm a_5 \} \), this smaller set is still a spanning set for \( \Col A \). Since these vectors are linearly independent (each has its single nonzero entry in a different position), this set is a basis for \( \Col A \). \( \Box \)
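SymPy can confirm this choice of columns; in the sketch below (names ours), `rref` reports the pivot positions and `columnspace` returns the corresponding columns of \( A \) itself.

```python
from sympy import Matrix

A = Matrix([[1, 4, 0,  2, 0],
            [0, 0, 1, -1, 0],
            [0, 0, 0,  0, 1],
            [0, 0, 0,  0, 0]])

print(A.rref()[1])  # (0, 2, 4): pivots in columns 1, 3, 5 (0-indexed)
for v in A.columnspace():
    print(v.T)      # a_1, a_3, a_5
```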
Example 5. Find a basis for \( \Col B \), where \( B = \begin{bmatrix} -2 & -8 & 0 & -4 & 1 \\ 2 & 8 & 2 & 2 & 2 \\ 2 & 8 & -1 & 5 & -2 \\ 2 & 8 & 1 & 3 & 2 \end{bmatrix} \).
We would like to apply a similar strategy as in Example 4, but the linear dependence relationships among the columns of \( B \) are harder to see since \( B \) is not in reduced echelon form. Let's row-reduce \( B \): \[ \begin{bmatrix} -2 & -8 & 0 & -4 & 1 \\ 2 & 8 & 2 & 2 & 2 \\ 2 & 8 & -1 & 5 & -2 \\ 2 & 8 & 1 & 3 & 2 \end{bmatrix} \longrightarrow \begin{bmatrix} 1 & 4 & 0 & 2 & 0 \\ 0 & 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix} \]
We see that the reduced echelon form of \( B \) is the same matrix \( A \) from Example 4. We know that this matrix has dependence relations \( \bbm a_2 = 4\bbm a_1 \) and \( \bbm a_4 = 2\bbm a_1 - \bbm a_3 \). You can check for yourself that \( B \) has these same relations. If we write \( \bbm b_1, \ldots, \bbm b_5 \) for the columns of \( B \), we have \( \bbm b_2 = 4\bbm b_1 \) and \( \bbm b_4 = 2\bbm b_1 - \bbm b_3 \).
Using the same process as in Example 4, these relations allow us to eliminate \( \bbm b_2 \) and \( \bbm b_4 \) from any linear combination of the columns of \( B \). Thus, \( \{ \bbm b_1, \bbm b_3, \bbm b_5 \} \) is a basis for \( \Col B \). \( \Box \)
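The claimed dependence relations among the columns of \( B \) are easy to verify by machine; a minimal SymPy sketch (names ours):

```python
from sympy import Matrix

B = Matrix([[-2, -8,  0, -4,  1],
            [ 2,  8,  2,  2,  2],
            [ 2,  8, -1,  5, -2],
            [ 2,  8,  1,  3,  2]])

b = [B.col(j) for j in range(5)]  # b[0] is b_1, ..., b[4] is b_5
print(b[1] == 4*b[0])             # True: b_2 = 4 b_1
print(b[3] == 2*b[0] - b[2])      # True: b_4 = 2 b_1 - b_3
```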
In fact, row operations on a matrix always preserve the linear dependence relations among its columns. Once the matrix is fully row-reduced, we can see which columns of the original matrix are pivot columns. Each non-pivot column is a linear combination of the pivot columns that precede it, so it can be eliminated from a spanning set; the pivot columns of the original matrix therefore form a basis for its column space.
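This procedure is short enough to package as a helper function. The sketch below (the name `col_basis` is ours, not standard) row-reduces, reads off the pivot positions, and returns the corresponding columns of the original matrix.

```python
from sympy import Matrix

def col_basis(M):
    """Return the pivot columns of M, which form a basis for Col M."""
    _, pivots = M.rref()               # row ops preserve column relations
    return [M.col(j) for j in pivots]  # columns of M itself, not its rref
```

Note that the returned columns come from \( M \), not from its echelon form: row operations generally change the column space, so the pivot columns of the echelon form usually span a different subspace.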
Example 6. Find a basis for \( \Col C \), where \( C = \begin{bmatrix} -3 & 6 & -1 & 1 & -7 \\ 1 & -2 & 2 & 3 & -1 \\ 2 & -4 & 5 & 8 & -4 \end{bmatrix} \).
We row-reduce \( C \) to identify its pivot columns: \[ \begin{bmatrix} -3 & 6 & -1 & 1 & -7 \\ 1 & -2 & 2 & 3 & -1 \\ 2 & -4 & 5 & 8 & -4 \end{bmatrix} \longrightarrow \begin{bmatrix} 1 & -2 & 0 & -1 & 3 \\ 0 & 0 & 1 & 2 & -2 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix} \]
We see that the first and third columns of \( C \) are pivot columns. So, a basis for \( \Col C \) is \( \left\{ \vecthree {-3} 1 2, \vecthree {-1} 2 5 \right\} \). \( \Box \)
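Checking Example 6 with SymPy's `columnspace` (a sketch, names ours) reproduces this basis:

```python
from sympy import Matrix

C = Matrix([[-3,  6, -1, 1, -7],
            [ 1, -2,  2, 3, -1],
            [ 2, -4,  5, 8, -4]])

for v in C.columnspace():
    print(v.T)  # [-3, 1, 2] and [-1, 2, 5]: the pivot columns of C
```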
Given an \( m\times n\) matrix \( A\)...