In general, when we want to determine whether a set of vectors \( \{ \bbm v_1, \bbm v_2, \ldots, \bbm v_p \} \) is linearly independent, we must set up and solve the homogeneous vector equation \( x_1 \bbm v_1 + x_2 \bbm v_2 + \cdots + x_p \bbm v_p = \bbm 0 \). However, there are certain special cases where we can make this determination without needing to solve the equation directly.
For example, when the number \( p \) of vectors is small, we can often directly tell whether the set is linearly independent. These special cases will give us insight into the more general case.
If \( p = 1\), then our set is \( \{ \bbm v_1 \} \), with just a single vector. The vector equation in this case looks like \( x_1 \bbm v_1 = \bbm 0\). By the definitions in Lecture 15, if this equation has a non-trivial solution, then the set \( \{ \bbm v_1 \} \) is linearly dependent. If this equation has only the trivial solution, then the set \( \{ \bbm v_1 \} \) is linearly independent.
If \( \bbm v_1 = \bbm 0 \), then the equation \( x_1 \bbm v_1 = \bbm 0\) has infinitely many solutions, as \( x_1 \) can equal any real number. In this case, the set \( \{ \bbm v_1 \} \) is linearly dependent.
If \( \bbm v_1 \ne \bbm 0 \), then the equation \( x_1 \bbm v_1 = \bbm 0\) has only the solution \( x_1 = 0 \). In this case, the set \( \{ \bbm v_1 \} \) is linearly independent.
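This one-vector test is easy to carry out computationally. Below is a minimal sketch in Python using NumPy; the example vectors are illustrative choices, not taken from the lecture.

```python
import numpy as np

def is_independent_single(v1):
    """The set {v1} is linearly independent exactly when v1 is not the zero vector."""
    return not np.allclose(v1, 0)

print(is_independent_single(np.array([0.0, 0.0, 0.0])))   # False: {0} is linearly dependent
print(is_independent_single(np.array([1.0, -2.0, 5.0])))  # True: a nonzero vector
```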
If \( p = 2\), then the set of vectors is \( \{ \bbm v_1, \bbm v_2 \} \), and the vector equation we consider is \( x_1 \bbm v_1 + x_2 \bbm v_2 = \bbm 0\). By the definitions in Lecture 15, if this equation has a non-trivial solution, then the set \( \{ \bbm v_1, \bbm v_2 \} \) is linearly dependent. If this equation has only the trivial solution, then the set \( \{ \bbm v_1, \bbm v_2 \} \) is linearly independent.
Let's think about how this equation could have a non-trivial solution by supposing we have \( c_1 \bbm v_1 + c_2 \bbm v_2 = \bbm 0 \) for some scalars \( c_1 \) and \( c_2 \) that are not both zero. In this case we can solve \[ \bbm v_1 = -\frac {c_2} {c_1} \bbm v_2 \qquad \mbox{or} \qquad \bbm v_2 = -\frac {c_1} {c_2} \bbm v_1, \] depending on which of \( c_1 \) and \( c_2 \) is nonzero.
A set of two vectors \( \{ \bbm v_1, \bbm v_2 \} \) is linearly dependent if and only if one of the vectors is a multiple of the other. Equivalently, we can say that \( \{ \bbm v_1, \bbm v_2 \} \) is linearly dependent if \( \bbm v_1 \) and \( \bbm v_2 \) point in the same or opposite directions, or if one of the vectors is \( \bbm 0 \).
A set of two vectors \( \{ \bbm v_1, \bbm v_2 \} \) is linearly independent if and only if neither vector is a multiple of the other. Equivalently, we can say that \( \{ \bbm v_1, \bbm v_2 \} \) is linearly independent if \( \bbm v_1 \) and \( \bbm v_2 \) do not point in the same or opposite directions and neither vector is the zero vector.
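The two-vector test can also be phrased in terms of rank: \( \{ \bbm v_1, \bbm v_2 \} \) is linearly independent exactly when the matrix with columns \( \bbm v_1 \) and \( \bbm v_2 \) has rank 2. Here is a minimal sketch in Python using NumPy; the example vectors are illustrative choices.

```python
import numpy as np

def is_independent_pair(v1, v2):
    """{v1, v2} is linearly independent exactly when neither vector is a scalar
    multiple of the other, i.e. when the matrix [v1 v2] has rank 2."""
    return np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2

print(is_independent_pair(np.array([1.0, 2.0]), np.array([3.0, 6.0])))  # False: v2 = 3*v1
print(is_independent_pair(np.array([1.0, 2.0]), np.array([3.0, 5.0])))  # True
```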
The idea behind the two-vector case leads to a theorem that can be applied to sets with any number of vectors.
Theorem (When One Vector is a Multiple of Another). In the set of vectors \( \{ \bbm v_1, \bbm v_2, \ldots, \bbm v_p \} \), if one of the vectors in the set is a multiple of one of the other vectors, then the set is linearly dependent.
To illustrate the idea behind why this theorem is true, consider the set \( \{ \bbm v_1, \bbm v_2, \bbm v_3, \bbm v_4 \} \), and suppose that we know that \( \bbm v_3 = 6 \bbm v_1 \). Then, \( {\color{red} (-6)}\bbm v_1 + {\color{red} 0}\bbm v_2 + {\color{red} 1}\bbm v_3 + {\color{red} 0}\bbm v_4 = \bbm 0 \) is a dependence relation for these vectors. This might not be the only dependence relation that exists for these vectors, but one is enough for us to know for sure that the set is linearly dependent.
Proof of the "When One Vector is a Multiple of Another" Theorem: Let the set \( \{ \bbm v_1, \bbm v_2, \ldots, \bbm v_p \} \) be given, and suppose that \( \bbm v_i = r \bbm v_j \) for some scalar \( r \) and some indices \( i \ne j \). Consider the linear combination \( c_1 \bbm v_1 + c_2 \bbm v_2 + \cdots + c_p \bbm v_p \), where \( c_i = 1 \), \( c_j = -r \), and all other \( c \)-values are zero. Then \( c_1 \bbm v_1 + c_2 \bbm v_2 + \cdots + c_p \bbm v_p = \bbm v_i - r \bbm v_j = \bbm 0 \), and this is a dependence relation since \( c_i \ne 0 \). \( \Box \)
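To check a relation like the one in the example numerically, we can plug concrete vectors into the dependence relation. Here is a small sketch in Python with NumPy; the vectors are made-up examples satisfying \( \bbm v_3 = 6 \bbm v_1 \).

```python
import numpy as np

# Made-up example vectors with v3 = 6*v1, matching the illustration above.
v1 = np.array([1.0, 0.0, 2.0, -1.0])
v2 = np.array([3.0, 1.0, 0.0, 4.0])
v3 = 6 * v1
v4 = np.array([0.0, 5.0, 1.0, 1.0])

# The dependence relation (-6)v1 + 0*v2 + 1*v3 + 0*v4 = 0:
combo = (-6) * v1 + 0 * v2 + 1 * v3 + 0 * v4
print(np.allclose(combo, 0))  # True: one dependence relation suffices for dependence
```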
The situation for sets of three or more vectors is more complicated. It's important to note that it is possible for a set of three or more vectors to be linearly dependent even when none of the vectors in the set is a multiple of another; for example, if \( \bbm v_3 = \bbm v_1 + \bbm v_2 \), then \( \{ \bbm v_1, \bbm v_2, \bbm v_3 \} \) is linearly dependent even though no vector in the set need be a multiple of another. However, it is true that, in a linearly dependent set, one of the vectors must be a linear combination of the other vectors in the set.
Theorem (Characterization of Linearly Dependent Sets). A set \( \{ \bbm v_1, \bbm v_2, \ldots, \bbm v_p \} \) of two or more vectors is linearly dependent if and only if at least one of the vectors \( \bbm v_i \) in the set is a linear combination of the preceding vectors \( \bbm v_1, \ldots, \bbm v_{i-1} \).
Proof ( \( \Rightarrow \) ). Suppose that \( \{ \bbm v_1, \bbm v_2, \ldots, \bbm v_p \} \) is linearly dependent. Then there exists a dependence relation \( c_1 \bbm v_1 + c_2 \bbm v_2 + \cdots + c_p \bbm v_p = \bbm 0 \) where the \( c_i \) are not all zero. Let \( i \) be the largest subscript for which \( c_i \ne 0 \). (If \( i = 1 \), then \( c_1 \bbm v_1 = \bbm 0 \) with \( c_1 \ne 0 \) forces \( \bbm v_1 = \bbm 0 \), which we regard as the empty linear combination of no preceding vectors; so suppose \( i \ge 2 \).) We have \[ c_1 \bbm v_1 + \cdots + c_{i-1} \bbm v_{i-1} + c_i \bbm v_i + 0 \bbm v_{i+1} + \cdots + 0 \bbm v_p = \bbm 0. \]
Since we know that \( c_i \ne 0 \), we can solve for \( \bbm v_i \): \[ \bbm v_i = -\frac{c_1}{c_i} \bbm v_1 - \cdots - \frac{c_{i-1}}{c_i} \bbm v_{i-1}. \] Therefore, \( \bbm v_i \) is a linear combination of \( \bbm v_1, \ldots, \bbm v_{i-1} \).
Proof ( \( \Leftarrow \) ). Suppose that \( \bbm v_i \) is a linear combination of \( \bbm v_1, \ldots, \bbm v_{i-1} \). Specifically, write \( \bbm v_i = c_1 \bbm v_1 + \cdots + c_{i-1} \bbm v_{i-1} \) for some scalars \( c_1, \ldots, c_{i-1} \). Now, \[ c_1 \bbm v_1 + \cdots + c_{i-1} \bbm v_{i-1} + (-1) \bbm v_i + 0 \bbm v_{i+1} + \cdots + 0 \bbm v_p = \bbm 0. \]
Since the coefficient of \( \bbm v_i \) is \( -1 \), which is not zero, this is a dependence relation for the set \( \{ \bbm v_1, \bbm v_2, \ldots, \bbm v_p \} \). This shows that the set is linearly dependent. \( \Box \)
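The forward direction of this proof is constructive, and row reduction makes it concrete: in the reduced row echelon form of the matrix whose columns are \( \bbm v_1, \ldots, \bbm v_p \), each non-pivot column records the weights that express that vector as a linear combination of the preceding pivot columns. Here is a sketch in Python using SymPy; the matrix is an illustrative choice in which \( \bbm v_3 = \bbm v_1 + 2 \bbm v_2 \).

```python
import sympy as sp

# Columns v1, v2, v3 with v3 = v1 + 2*v2: a dependent set in which no vector
# is a multiple of another.
A = sp.Matrix([[1, 0, 1],
               [0, 1, 2],
               [1, 1, 3]])

R, pivot_cols = A.rref()
# Each non-pivot column j is a linear combination of the preceding pivot
# columns; the weights appear in column j of the reduced row echelon form.
for j in range(A.cols):
    if j not in pivot_cols:
        weights = [(i, R[k, j]) for k, i in enumerate(pivot_cols) if i < j]
        print(f"v{j+1} =", " + ".join(f"({w})*v{i+1}" for i, w in weights))
```

For the matrix above this prints `v3 = (1)*v1 + (2)*v2`, recovering the dependence used to build the example.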
There are a few more special cases where we can tell that a set is linearly dependent without needing to solve a homogeneous vector equation.
Theorem (More Vectors Than Entries). If a set contains more vectors than there are entries in each vector, then the set is linearly dependent. That is, any set \( \{ \bbm v_1, \bbm v_2, \ldots, \bbm v_p \} \) in \( \mathbb R^n \) is linearly dependent if \( p > n \).
Proof. Let \( \{ \bbm v_1, \bbm v_2, \ldots, \bbm v_p \} \) be a set of vectors in \( \mathbb R^n \) with \( p > n \). The vector equation \( x_1 \bbm v_1 + \cdots + x_p \bbm v_p = \bbm 0 \) has a coefficient matrix whose columns are \( \bbm v_1, \ldots, \bbm v_p \).
This matrix has \( n \) rows and \( p \) columns. Each pivot occupies its own row, so the matrix has at most \( n \) pivots; since \( p > n \), at least one column has no pivot. That means that the vector equation has at least one free variable. So, the vector equation has non-trivial solutions, telling us that the set \( \{ \bbm v_1, \bbm v_2, \ldots, \bbm v_p \} \) is linearly dependent. \( \Box \)
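We can see this theorem in action numerically: any five vectors in \( \mathbb R^3 \), even randomly chosen ones, form a linearly dependent set. A quick sketch in Python using NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 3, 5                      # p > n, so dependence is guaranteed
V = rng.standard_normal((n, p))  # columns are five random vectors in R^3

# The rank is at most n (one pivot per row), so at least p - n columns are free.
rank = np.linalg.matrix_rank(V)
print(rank < p)  # True: fewer than p pivot columns, so the set is dependent
```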
Theorem (Sets Containing the Zero Vector). If a set of vectors \( \{ \bbm v_1, \bbm v_2, \ldots, \bbm v_p \} \) contains the zero vector, then this set is linearly dependent.
Proof. Suppose that \( \bbm v_i = \bbm 0 \). We can construct this dependence relation: \[ 0\bbm v_1 + \cdots + 0 \bbm v_{i-1} + 1 \bbm v_i + 0 \bbm v_{i+1} + \cdots + 0\bbm v_p = \bbm 0. \Box \]
Similar to the Spanning Columns Theorem, we have one final theorem that helps us understand the relationship between the linear independence of a set \( \{ \bbm v_1, \bbm v_2, \ldots, \bbm v_p \} \) and the matrix that has these vectors as its columns.
Linearly Independent Columns Theorem. Let \( A \) be an \( m \times n \) matrix. The following statements are logically equivalent:
1. The columns of \( A \) form a linearly independent set.
2. The equation \( A \bbm x = \bbm 0 \) has only the trivial solution.
3. \( A \) has a pivot position in every column.
This theorem follows from the definitions of linear dependence and independence, as well as the Characterization of Linearly Dependent Sets.
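For a concrete matrix, the equivalent conditions can all be checked with row reduction. A minimal sketch in Python using SymPy; the matrix is an illustrative choice.

```python
import sympy as sp

# A 3x2 matrix whose two columns are not multiples of each other.
A = sp.Matrix([[1, 2],
               [0, 1],
               [3, 0]])

R, pivot_cols = A.rref()
pivot_in_every_column = len(pivot_cols) == A.cols  # pivot position in every column
only_trivial_solution = A.nullspace() == []        # Ax = 0 has only x = 0
print(pivot_in_every_column, only_trivial_solution)  # the two answers always agree
```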
We have established a large number of tools that can be used to investigate the linear independence of a set of vectors:
- Solving the homogeneous vector equation \( x_1 \bbm v_1 + \cdots + x_p \bbm v_p = \bbm 0 \) directly.
- The one-vector test: \( \{ \bbm v_1 \} \) is linearly independent if and only if \( \bbm v_1 \ne \bbm 0 \).
- The two-vector test: \( \{ \bbm v_1, \bbm v_2 \} \) is linearly independent if and only if neither vector is a multiple of the other.
- The "When One Vector is a Multiple of Another" Theorem.
- The Characterization of Linearly Dependent Sets.
- The "More Vectors Than Entries" Theorem.
- The "Sets Containing the Zero Vector" Theorem.
- The Linearly Independent Columns Theorem.
It will take practice to determine which is the best tool to use in any given situation. In general, try to avoid solving a homogeneous equation when a faster or easier tool will work.
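As one way to organize that advice, the sketch below (in Python with NumPy; the function name and structure are our own, not from the lecture) tries the fast special-case tests first and falls back to a rank computation, which amounts to solving the homogeneous equation, only when necessary:

```python
import numpy as np

def is_linearly_independent(vectors):
    """Decide linear independence, trying the cheap special-case tests first."""
    V = np.column_stack(vectors)
    n, p = V.shape
    if p > n:                                    # More Vectors Than Entries
        return False
    if any(np.allclose(v, 0) for v in vectors):  # Sets Containing the Zero Vector
        return False
    # General case: independent exactly when every column is a pivot column,
    # i.e. the rank equals the number of vectors.
    return np.linalg.matrix_rank(V) == p

vecs = [np.array([1.0, 0.0, 1.0]),
        np.array([0.0, 1.0, 1.0]),
        np.array([1.0, 2.0, 3.0])]
print(is_linearly_independent(vecs))  # False: the third vector is v1 + 2*v2
```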