Coordinated Linear Algebra

Section 3.2 Linear Independence

If a friend told you that they have a line spanned by \([1,1]\text{,}\) \([2,2]\) and \([3,3]\text{,}\) you would probably think that your friend’s description is a little excessive. After all, should one of the above vectors not be sufficient to describe the line? A line can be described as the span of one vector, but it can also be described as the span of two or more vectors. There are many advantages, however, to using the most efficient description possible. In this section we will begin to explore what makes a description ``more efficient."

Subsection 3.2.1 Redundant Vectors

Exploration 3.2.1.

Consider the following collection of vectors:
\begin{equation*} \left\{\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right\}. \end{equation*}
Problem 3.2.1.
What is the span of these vectors?
  1. A line.
  2. All of \(\mathbb{R}^2\text{.}\)
  3. A parallelogram.
  4. A parallelepiped.
Answer.
Option b: All of \(\mathbb{R}^2\)
In this Exploration we will examine what can happen to the span of a collection of vectors when a vector is removed from the collection.
First, let’s remove \([2,1]\) from
\begin{equation*} \left\{\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right\}. \end{equation*}
Problem 3.2.2.
Which of the following is true?
  1. \(\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}\right)=\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right)\text{.}\)
  2. \(\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}\right)\) is a line.
  3. \(\displaystyle \mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}\right)=\R^2\)
  4. \(\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}\right)\) is a parallelogram.
Answer.
Option b: \(\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}\right)\) is a line.
Problem 3.2.3.
Removing \([2,1]\) from
\begin{equation*} \left\{\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right\} \end{equation*}
does what?
  1. It changes the span.
  2. It does not change the span.
Answer.
Option a: It does change the span.
Now let’s remove \([-4,2]\) from the original collection of vectors.
Problem 3.2.4.
Which of the following is true?
  1. \(\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix} 2\\1\end{bmatrix}\right)=\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right)\text{.}\)
  2. \(\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right)\) is a line.
  3. \(\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right)\) is the right side of the coordinate plane.
  4. \(\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right)\) is a parallelogram.
Answer.
Option a: \(\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix} 2\\1\end{bmatrix}\right)=\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right)\text{.}\)
Problem 3.2.5.
Removing \([-4,2]\) from
\begin{equation*} \left\{\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right\} \end{equation*}
does what?
  1. It changes the span.
  2. It does not change the span.
Answer.
Option b: It does not change the span.
As you just discovered, removing a vector from a collection of vectors may or may not affect the span of the collection. We will refer to vectors that can be removed from a collection without changing the span as redundant. In Exploration 3.2.1, \([-4,2]\) is redundant, while \([2,1]\) is not.

Definition 3.2.6.

Let \(\{\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\}\) be a set of vectors in \(\R^n\text{.}\) If we can remove one vector without changing the span of this set, then that vector is redundant. In other words, if
\begin{equation*} \mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\right)=\mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_{j-1},\mathbf{v}_{j+1},\dots,\mathbf{v}_k\right), \end{equation*}
we say that \(\mathbf{v}_j\) is a redundant element of \(\{\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\}\text{,}\) or simply redundant.
Our next goal is to see what causes \([-4,2]\) of Exploration 3.2.1 to be redundant. The answer lies not in the vector itself, but in its relationship to the other vectors.
Observe that \([-4,2]=-2[2,-1]\text{.}\) In other words, \([-4,2]\) is a scalar multiple of another vector in the set. To see why this matters, let’s pick an arbitrary vector \(\mathbf{w}=[0,2]\) in
\begin{equation*} \mbox{span}\left([2,-1], [-4,2], [2,1]\right). \end{equation*}
The vector \(\mathbf{w}\) is in the span because it can be written as a linear combination of the three vectors as follows
\begin{equation*} \mathbf{w}=\begin{bmatrix}0\\2\end{bmatrix}=\begin{bmatrix}2\\-1\end{bmatrix}+ \begin{bmatrix}-4\\2\end{bmatrix}+ \begin{bmatrix}2\\1\end{bmatrix}. \end{equation*}
But \([-4,2]\) is not essential to this linear combination because it can be replaced with \(-2[2,-1]\text{,}\) as shown below.
\begin{align*} \begin{bmatrix}0\\2\end{bmatrix} \amp = \begin{bmatrix}2\\-1\end{bmatrix}+ \begin{bmatrix}-4\\2\end{bmatrix}+ \begin{bmatrix}2\\1\end{bmatrix} \\ \amp =\begin{bmatrix}2\\-1\end{bmatrix}+ (-2)\begin{bmatrix}2\\-1\end{bmatrix}+ \begin{bmatrix}2\\1\end{bmatrix} \\ \amp =-\begin{bmatrix}2\\-1\end{bmatrix}+ \begin{bmatrix}2\\1\end{bmatrix}. \end{align*}
Regardless of what vector \(\mathbf{w}\) we write as a linear combination of \([2,-1]\text{,}\) \([-4,2]\) and \([2,1]\text{,}\) we will always be able to replace \([-4,2]\) with \(-2[2,-1]\text{,}\) placing \(\mathbf{w}\) into the span of \([2,-1]\) and \([2,1]\text{,}\) and making \([-4,2]\) redundant. Similarly, writing \([2,-1]=-\frac{1}{2}[-4,2]\) makes \([2,-1]\) redundant.
We conclude that only one of \([-4,2]\) and \([2,-1]\) is needed to maintain the span of the original three vectors. We have
\begin{equation*} \mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}-4\\2\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right)=\mbox{span}\left(\begin{bmatrix}2\\-1\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right)=\mbox{span}\left(\begin{bmatrix}-4\\2\end{bmatrix}, \begin{bmatrix}2\\1\end{bmatrix}\right). \end{equation*}
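This conclusion can be double-checked with software. The following sketch uses SymPy (our choice of tool; the text itself relies only on hand computation) and the fact that the rank of a matrix equals the dimension of the span of its columns, so comparing ranks reveals which removals change the span.
```python
# Minimal SymPy sketch (our tooling, not the text's): the rank of a matrix
# equals the dimension of the span of its columns, so comparing ranks shows
# whether dropping a vector shrinks the span.
from sympy import Matrix

v1, v2, v3 = Matrix([2, -1]), Matrix([-4, 2]), Matrix([2, 1])

all_three  = Matrix.hstack(v1, v2, v3)  # columns [2,-1], [-4,2], [2,1]
without_v2 = Matrix.hstack(v1, v3)      # remove [-4,2]
without_v3 = Matrix.hstack(v1, v2)      # remove [2,1]

print(all_three.rank())   # 2 -> the span is all of R^2
print(without_v2.rank())  # 2 -> span unchanged, so [-4,2] is redundant
print(without_v3.rank())  # 1 -> span collapses to a line, so [2,1] is not
```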
The left-most collection in this expression contains redundant vectors; the other two collections do not. In Exploration 3.2.1 we found one vector to be redundant because we could replace it with a scalar multiple of another vector in the set. The following Exploration delves into what happens when a vector in a given set is a linear combination of the other vectors.

Exploration 3.2.2.

Consider the set of vectors
\begin{equation*} \left\{\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right\}. \end{equation*}
The three vectors are shown below.
Figure 3.2.7.
Problem 3.2.8.
\(\mbox{span}\left(\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right)\) is
  1. A line.
  2. A plane.
  3. All of \(\R^3\)
Answer.
Option b: A plane.
Can we remove one of the vectors from the set without changing the span? Observe that we can write \([4,4,-1]\) as a linear combination of the other two vectors:
\begin{equation} \begin{bmatrix}4\\4\\-1\end{bmatrix}=2\begin{bmatrix}1\\2\\-1\end{bmatrix}+1\begin{bmatrix}2\\0\\1\end{bmatrix}.\tag{3.2.1} \end{equation}
This means that we can write any vector in
\begin{equation*} \mbox{span}\left(\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right) \end{equation*}
as a linear combination of only \([1,2,-1]\) and \([2,0,1]\) by replacing \([4,4,-1]\) with the expression in (3.2.1). For example,
\begin{equation*} \begin{bmatrix}7\\6\\-1\end{bmatrix}=\begin{bmatrix}1\\2\\-1\end{bmatrix}+\begin{bmatrix}2\\0\\1\end{bmatrix}+\begin{bmatrix}4\\4\\-1\end{bmatrix}=3\begin{bmatrix}1\\2\\-1\end{bmatrix}+2\begin{bmatrix}2\\0\\1\end{bmatrix}. \end{equation*}
We have
\begin{equation*} \mbox{span}\left(\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right)=\mbox{span}\left(\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix}\right). \end{equation*}
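For readers who want a quick check, the coefficients in (3.2.1) can be recovered by solving a small linear system. The sketch below uses SymPy (again a choice of ours, not part of the text).
```python
# Solve  x*[1,2,-1] + y*[2,0,1] = [4,4,-1]  to recover the coefficients
# appearing in (3.2.1).  SymPy's linsolve returns the exact solution set.
from sympy import Matrix, linsolve, symbols

x, y = symbols('x y')
A = Matrix([[1, 2],
            [2, 0],
            [-1, 1]])        # columns are [1,2,-1] and [2,0,1]
b = Matrix([4, 4, -1])

print(linsolve((A, b), x, y))  # {(2, 1)}, i.e. [4,4,-1] = 2[1,2,-1] + 1[2,0,1]
```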
We conclude that the vector \([4,4,-1]\) is redundant. Can each of the other two vectors in the set
\begin{equation*} \left\{\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right\} \end{equation*}
be considered redundant? You will address this later in the practice problems.
Collections of vectors that do not contain redundant vectors are very important in linear algebra. We will refer to such collections as linearly independent. Collections of vectors that contain redundant vectors will be called linearly dependent. The following section offers a definition that will allow us to easily determine linear dependence and independence of vectors.

Subsection 3.2.2 Linear Independence

Definition 3.2.9. Linear Independence.

Let \(\mathbf{v}_1, \mathbf{v}_2,\ldots ,\mathbf{v}_k\) be vectors of \(\R^n\text{.}\) We say that the set \(\{\mathbf{v}_1, \mathbf{v}_2,\ldots ,\mathbf{v}_k\}\) is linearly independent if the only solution to
\begin{equation} c_1\mathbf{v}_1+c_2\mathbf{v}_2+\ldots +c_k\mathbf{v}_k=\mathbf{0}\tag{3.2.2} \end{equation}
is the trivial solution \(c_1=c_2=\ldots =c_k=0\text{.}\)
If, in addition to the trivial solution, a non-trivial solution (not all \(c_1, c_2,\ldots ,c_k\) are zero) exists, then the set \(\{\mathbf{v}_1, \mathbf{v}_2,\ldots ,\mathbf{v}_k\}\) is called linearly dependent.
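Definition 3.2.9 translates directly into a computation: equation (3.2.2) is the homogeneous system \(A\mathbf{c}=\mathbf{0}\text{,}\) where the columns of \(A\) are the given vectors, and the set is linearly independent exactly when this system has only the trivial solution. The helper below is a minimal SymPy sketch of this test (the function name is ours, purely illustrative).
```python
# Test Definition 3.2.9 computationally: build the matrix whose columns are
# the vectors and check whether A*c = 0 has only the trivial solution,
# i.e. whether the null space of A is trivial.
from sympy import Matrix

def is_linearly_independent(*vectors):
    A = Matrix.hstack(*[Matrix(v) for v in vectors])
    return len(A.nullspace()) == 0

print(is_linearly_independent([2, 1, 4], [-3, 1, 1]))     # True
print(is_linearly_independent([2, -1], [-4, 2], [2, 1]))  # False
```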

Remark 3.2.10.

Given a set of vectors \(X=\{\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\}\) we can now ask the following questions:
  1. Are the vectors in \(X\) linearly dependent according to Definition 3.2.9?
  2. Can we write one element of \(X\) as a linear combination of the others?
  3. Does \(X\) contain redundant vectors?
It turns out that these questions are equivalent. In other words, if the answer to one of them is ``YES", the answer to the other two is also ``YES". Conversely, if the answer to one of them is ``NO", then the answer to the other two is also ``NO". We will start by illustrating this idea with an example, then conclude this section by formally proving the equivalency.

Example 3.2.11.

What can we say about the following sets of vectors in light of Remark 3.2.10?
  1. \begin{equation*} \begin{bmatrix}2\\-3\end{bmatrix}, \begin{bmatrix}0\\3\end{bmatrix},\begin{bmatrix}1\\-1\end{bmatrix},\begin{bmatrix}1\\-2\end{bmatrix}. \end{equation*}
  2. \begin{equation*} \begin{bmatrix}2\\1\\4\end{bmatrix},\begin{bmatrix}-3\\1\\1\end{bmatrix}. \end{equation*}
Answer.
We will start by addressing linear independence for Item 1. To do so, we will solve the vector equation
\begin{equation} c_1\begin{bmatrix}2\\-3\end{bmatrix}+c_2 \begin{bmatrix}0\\3\end{bmatrix}+c_3\begin{bmatrix}1\\-1\end{bmatrix}+c_4\begin{bmatrix}1\\-2\end{bmatrix}=\mathbf{0}.\tag{3.2.3} \end{equation}
Clearly \(c_1=c_2=c_3=c_4=0\) is a solution to the equation. The question is whether another solution exists. The vector equation translates into the following system:
\begin{equation*} \begin{array}{ccccccccc} 2c_1 \amp \amp \amp +\amp c_3\amp +\amp c_4\amp = \amp 0 \\ -3c_1\amp +\amp 3c_2\amp -\amp c_3\amp -\amp 2c_4\amp = \amp 0 \\ \end{array}. \end{equation*}
Writing the system in augmented matrix form and applying elementary row operations gives us the following reduced row-echelon form:
\begin{equation*} \left[\begin{array}{cccc|c} 2\amp 0\amp 1\amp 1\amp 0\\-3\amp 3\amp -1\amp -2\amp 0 \end{array}\right]\rightsquigarrow\left[\begin{array}{cccc|c} 1\amp 0\amp 1/2\amp 1/2\amp 0\\0\amp 1\amp 1/6\amp -1/6\amp 0 \end{array}\right]. \end{equation*}
This shows that (3.2.3) has infinitely many solutions:
\begin{equation*} c_1=-\frac{1}{2}s-\frac{1}{2}t,\quad c_2=-\frac{1}{6}s+\frac{1}{6}t,\quad c_3=s,\quad c_4=t. \end{equation*}
Letting \(t=s=6\text{,}\) we obtain the following:
\begin{equation} -6\begin{bmatrix}2\\-3\end{bmatrix}+0 \begin{bmatrix}0\\3\end{bmatrix}+6\begin{bmatrix}1\\-1\end{bmatrix}+6\begin{bmatrix}1\\-2\end{bmatrix}=\mathbf{0}.\tag{3.2.4} \end{equation}
We conclude that the vectors are linearly dependent.
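The row reduction and the nontrivial solution can be verified with the same kind of computation. The sketch below (again using SymPy, our choice of tool) reproduces the reduced row-echelon form and confirms that the coefficients in (3.2.4) really do give the zero vector.
```python
# Verify the computation for Item 1: the matrix has the four vectors as
# columns, its null space is two-dimensional (infinitely many solutions to
# (3.2.3)), and the specific choice s = t = 6 from (3.2.4) maps to zero.
from sympy import Matrix

A = Matrix([[2, 0, 1, 1],
            [-3, 3, -1, -2]])
print(A.rref()[0])          # matches the reduced row-echelon form above
print(len(A.nullspace()))   # 2 free parameters -> nontrivial solutions exist

c = Matrix([-6, 0, 6, 6])   # the choice s = t = 6
print(A * c)                # the zero vector, confirming (3.2.4)
```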
Observe that (3.2.4) allows us to solve for one of the vectors and express it as a linear combination of the others. For example,
\begin{equation} \begin{bmatrix}2\\-3\end{bmatrix}=0 \begin{bmatrix}0\\3\end{bmatrix}+\begin{bmatrix}1\\-1\end{bmatrix}+\begin{bmatrix}1\\-2\end{bmatrix}.\tag{3.2.5} \end{equation}
This would not be possible if a nontrivial solution to the equation
\begin{equation*} c_1\begin{bmatrix}2\\-3\end{bmatrix}+c_2 \begin{bmatrix}0\\3\end{bmatrix}+c_3\begin{bmatrix}1\\-1\end{bmatrix}+c_4\begin{bmatrix}1\\-2\end{bmatrix}=\mathbf{0} \end{equation*}
did not exist. Using the linear combination in (3.2.5) and the argument of Exploration 3.2.2, we conclude that \([2,-3]\) is redundant in
\begin{equation*} \left\{\begin{bmatrix}2\\-3\end{bmatrix}, \begin{bmatrix}0\\3\end{bmatrix},\begin{bmatrix}1\\-1\end{bmatrix},\begin{bmatrix}1\\-2\end{bmatrix}\right\}. \end{equation*}
We find that the answer to all questions in Remark 3.2.10 is ``YES".
For Item 2, to address linear independence we need to solve the equation
\begin{equation*} c_1\begin{bmatrix}2\\1\\4\end{bmatrix}+c_2\begin{bmatrix}-3\\1\\1\end{bmatrix}=\mathbf{0}. \end{equation*}
Converting the equation to augmented matrix form and performing row reduction gives us
\begin{equation*} \left[\begin{array}{cc|c} 2\amp -3\amp 0\\1\amp 1\amp 0\\4\amp 1\amp 0 \end{array}\right]\rightsquigarrow\left[\begin{array}{cc|c} 1\amp 0\amp 0\\0\amp 1\amp 0\\0\amp 0\amp 0 \end{array}\right]. \end{equation*}
This shows that \(c_1=c_2=0\) is the only solution. Therefore the two vectors are linearly independent. Furthermore, we cannot write one of the vectors as a linear combination of the other. Do you see that the only way this would be possible with a set of two vectors is if they were scalar multiples of each other?
Finally, we observe that removing either vector would change the span from a plane in \(\R^3\) to a line in \(\R^3\text{,}\) so the answer to all three questions in Remark 3.2.10 is ``NO".
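A similar check (a SymPy sketch, under the same assumptions as before) confirms the conclusion for Item 2: the rank of the matrix equals the number of vectors, so only the trivial solution exists.
```python
# Verify Item 2: rank 2 with two columns means the homogeneous system has
# only the trivial solution, so the two vectors are linearly independent.
from sympy import Matrix

B = Matrix([[2, -3],
            [1, 1],
            [4, 1]])
print(B.rref()[0])   # matches the reduced row-echelon form above
print(B.rank())      # 2 = number of vectors -> linearly independent
```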

Theorem 3.2.12.

Let \(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\) be vectors of \(\R^n\text{.}\) The following statements are equivalent:
  1. The set \(\{\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\}\) is linearly dependent.
  2. One of the vectors can be written as a linear combination of the other vectors in the set.
  3. The set contains a redundant vector.

Proof.

For Item 1 \(\implies\) Item 2, if \(\mathbf{v}_1,\mathbf{v}_2,\dots ,\mathbf{v}_k\) are linearly dependent, then
\begin{equation*} c_1\mathbf{v}_1+c_2\mathbf{v}_2+\ldots +c_j\mathbf{v}_j+\ldots +c_k\mathbf{v}_k=\mathbf{0} \end{equation*}
has a non-trivial solution. In other words at least one of the constants, say \(c_j\text{,}\) does not equal zero. This allows us to solve for \(\mathbf{v}_j\text{:}\)
\begin{align*} -c_j\mathbf{v}_j\amp =c_1\mathbf{v}_1+\dots +c_{j-1}\mathbf{v}_{j-1}+c_{j+1}\mathbf{v}_{j+1}+\dots +c_k\mathbf{v}_k \\ \mathbf{v}_j\amp =-\frac{c_1}{c_j}\mathbf{v}_1-\dots -\frac{c_{j-1}}{c_j}\mathbf{v}_{j-1}-\frac{c_{j+1}}{c_j}\mathbf{v}_{j+1}-\dots -\frac{c_k}{c_j}\mathbf{v}_k. \end{align*}
Do you see why it was important to have one of the constants nonzero? This shows that \(\mathbf{v}_j\) may be expressed as a linear combination of the other vectors.
For the implication Item 2 \(\implies\) Item 3, suppose \(\mathbf{v}_j\) is a linear combination of the remaining vectors \(\mathbf{v}_1,\dots,\mathbf{v}_{j-1},\mathbf{v}_{j+1},\dots,\mathbf{v}_k\text{.}\) We will show that \(\mathbf{v}_j\) is redundant by showing that
\begin{equation*} \mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\right)=\mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_{j-1},\mathbf{v}_{j+1},\dots,\mathbf{v}_k\right). \end{equation*}
To show equality of the two spans we will pick a vector in the left span and show that it is also an element of the span on the right. Then, we will pick a vector in the right span and show that it is also an element of the span on the left, and we will conclude that the sets are equal.
Observe that if \(\mathbf{w}\) is in \(\mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_{j-1},\mathbf{v}_{j+1},\dots,\mathbf{v}_k\right)\text{,}\) then it has to be in \(\mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\right)\text{.}\) (Why?) Now suppose \(\mathbf{w}\) is in \(\mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\right)\text{.}\) We need to show that \(\mathbf{w}\) is also in \(\mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_{j-1},\mathbf{v}_{j+1},\dots,\mathbf{v}_k\right)\text{.}\) By assumption, we can write \(\mathbf{v}_j\) as
\begin{equation} \mathbf{v}_j=a_1\mathbf{v}_1+a_2\mathbf{v}_2+\dots +a_{j-1}\mathbf{v}_{j-1}+a_{j+1}\mathbf{v}_{j+1}+\dots +a_k\mathbf{v}_k.\tag{3.2.6} \end{equation}
Since \(\mathbf{w}\) is in \(\mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\right)\text{,}\) we have
\begin{equation*} \mathbf{w}=b_1\mathbf{v}_1+b_2\mathbf{v}_2+\dots +b_j\mathbf{v}_j+\dots +b_k\mathbf{v}_k. \end{equation*}
Substituting the expression in (3.2.6) for \(\mathbf{v}_j\) and simplifying, we obtain the following
\begin{align*} \mathbf{w}=(b_1+b_ja_1)\mathbf{v}_1+(b_2+b_ja_2)\mathbf{v}_2 \amp + \dots \\ \amp + (b_{j-1}+b_ja_{j-1})\mathbf{v}_{j-1} \\ \amp + (b_{j+1}+b_ja_{j+1})\mathbf{v}_{j+1} \\ \amp + \dots \\ \amp + (b_k+b_ja_k)\mathbf{v}_k. \end{align*}
This shows that \(\mathbf{w}\) is in \(\mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_{j-1},\mathbf{v}_{j+1},\dots,\mathbf{v}_k\right)\text{.}\) We now have
\begin{equation*} \mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\right)=\mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_{j-1},\mathbf{v}_{j+1},\dots,\mathbf{v}_k\right), \end{equation*}
which shows that \(\mathbf{v}_j\) is redundant.
We now show the implication Item 3 \(\implies\) Item 1 holds. Suppose that \(\mathbf{v}_j\) is redundant, so that
\begin{equation*} \mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\right)=\mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_{j-1},\mathbf{v}_{j+1},\dots,\mathbf{v}_k\right). \end{equation*}
Consider a vector \(\mathbf{w}\) in \(\mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\right)\)
\begin{equation} \mathbf{w}=a_1\mathbf{v}_1+a_2\mathbf{v}_2+\dots +a_j\mathbf{v}_j+\dots +a_k\mathbf{v}_k\tag{3.2.7} \end{equation}
Since the span contains ALL possible linear combinations of \(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\text{,}\) we may choose \(\mathbf{w}\) such that \(a_j\neq 0\text{.}\)
By assumption, \(\mathbf{w}\) is also in \(\mbox{span}\left(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_{j-1},\mathbf{v}_{j+1},\dots,\mathbf{v}_k\right)\text{.}\) Therefore, we can express \(\mathbf{w}\) as a linear combination
\begin{equation} \mathbf{w}=b_1\mathbf{v}_1+b_2\mathbf{v}_2+\dots +b_{j-1}\mathbf{v}_{j-1}+b_{j+1}\mathbf{v}_{j+1}+\dots +b_k\mathbf{v}_k.\tag{3.2.8} \end{equation}
We complete the proof by showing there exists a non-trivial solution to
\begin{equation} c_1\mathbf{v}_1+c_2\mathbf{v}_2+\ldots +c_j\mathbf{v}_j+\ldots +c_k\mathbf{v}_k=\mathbf{0}.\tag{3.2.9} \end{equation}
Subtracting expression (3.2.8) from (3.2.7) we obtain
\begin{align*} \mathbf{0}=\mathbf{w}-\mathbf{w}=(a_1-b_1)\mathbf{v}_1\amp + \dots \\ \amp + (a_{j-1}-b_{j-1})\mathbf{v}_{j-1}+a_j\mathbf{v}_j+(a_{j+1}-b_{j+1})\mathbf{v}_{j+1} \\ \amp + \dots \\ \amp + (a_k-b_k)\mathbf{v}_k \end{align*}
Recall that we ensured that \(a_j\neq 0\text{.}\) This implies that we have a non-trivial solution to (3.2.9). These three parts of the proof show that if one of the conditions is true, all three must be true.
It is a logical consequence that if one of the three conditions is false, all three must be false.

Subsection 3.2.3 Geometry of Linearly Dependent and Linearly Independent Vectors

Theorem 3.2.12 gives us a convenient way of looking at linear dependence/independence geometrically. When looking at two or more vectors, we ask, ``can one of the vectors be written as a linear combination of the others?" We can also ask, ``is one of the vectors redundant?" If the answer to either of these questions is ``YES", then the vectors are linearly dependent.

Subsubsection 3.2.3.1 A Set of Two Vectors

Two vectors are linearly dependent if and only if one is a scalar multiple of the other. Two nonzero linearly dependent vectors may look like this:
Two vectors drawn
or like this:
One vector flipped
Two linearly independent vectors will look like this:
One vector rotated
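The two-vector criterion can also be phrased computationally: two vectors are linearly dependent exactly when the matrix having them as columns has rank at most one. The small helper below is an illustrative SymPy sketch (the function name is ours).
```python
# Two vectors are linearly dependent iff one is a scalar multiple of the
# other, i.e. iff the matrix with those columns has rank at most 1.
from sympy import Matrix

def dependent_pair(u, v):
    return Matrix.hstack(Matrix(u), Matrix(v)).rank() <= 1

print(dependent_pair([2, -1], [-4, 2]))  # True:  [-4,2] = -2*[2,-1]
print(dependent_pair([2, -1], [2, 1]))   # False: not scalar multiples
```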

Subsubsection 3.2.3.2 A Set of Three Vectors

Given a set of three nonzero vectors, we have the following possibilities:
  • (Linearly Dependent Vectors) The three vectors are scalar multiples of each other.
  • (Linearly Dependent Vectors) Two of the vectors are scalar multiples of each other.
  • (Linearly Dependent Vectors) One vector can be viewed as the diagonal of a parallelogram determined by scalar multiples of the other two vectors. All three vectors lie in the same plane.
  • (Linearly Independent Vectors) A set of three vectors is linearly independent if the vectors do not lie in the same plane. For example, vectors \(\mathbf{i}\text{,}\) \(\mathbf{j}\) and \(\mathbf{k}\) are linearly independent.
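For three vectors in \(\R^3\text{,}\) the ``same plane" test in the last bullet has a convenient algebraic form: the vectors are linearly independent exactly when the \(3\times 3\) matrix having them as columns has nonzero determinant. The sketch below (SymPy again, with an illustrative helper name of our own) checks this for \(\mathbf{i}\text{,}\) \(\mathbf{j}\text{,}\) \(\mathbf{k}\) and for the coplanar vectors of Exploration 3.2.2.
```python
# Three vectors in R^3 are linearly independent exactly when the 3x3 matrix
# with those columns has nonzero determinant (equivalently, they do not all
# lie in one plane through the origin).
from sympy import Matrix

def independent_triple(u, v, w):
    return Matrix.hstack(Matrix(u), Matrix(v), Matrix(w)).det() != 0

print(independent_triple([1, 0, 0], [0, 1, 0], [0, 0, 1]))    # True: i, j, k
print(independent_triple([1, 2, -1], [2, 0, 1], [4, 4, -1]))  # False: coplanar
```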

Exercises 3.2.4 Exercises

1.

    In Exploration 3.2.2 we considered the following set of vectors
    \begin{equation*} \left\{\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right\} \end{equation*}
    and demonstrated that \([4,4,-1]\) is redundant by using the fact that it is a linear combination of the other two vectors.
    1. Express each of \([1,2,-1]\) and \([2,0,1]\) as a linear combination of the remaining vectors.
    2. Which of the following is NOT true?
  • If \(\mathbf{w} \) is in
    \begin{equation*} \mbox{span}\left(\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right), \end{equation*}
    then \(\mathbf{w}\) is in
    \begin{equation*} \mbox{span}\left(\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right). \end{equation*}
  • Both \([1,2,-1]\) and \([2,0,1]\) are redundant in
    \begin{equation*} \left\{\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right\}. \end{equation*}
  • We can remove \([1,2,-1]\) and \([2,0,1]\) from
    \begin{equation*} \left\{\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right\} \end{equation*}
    at the same time without affecting the span.
Answer.
\begin{gather*} \begin{bmatrix}1\\2\\-1\end{bmatrix} =-\frac{1}{2}\begin{bmatrix}2\\0\\1\end{bmatrix}+\frac{1}{2}\begin{bmatrix}4\\4\\-1\end{bmatrix} \\ \begin{bmatrix}2\\0\\1\end{bmatrix}=-2\begin{bmatrix}1\\2\\-1\end{bmatrix}+1\begin{bmatrix}4\\4\\-1\end{bmatrix} \end{gather*}

Exercise Group.

Are the given vectors linearly independent?
2.
    \begin{equation*} \begin{bmatrix}-1\\0\end{bmatrix}, \begin{bmatrix}2\\3\end{bmatrix},\begin{bmatrix}4\\-1\end{bmatrix} \end{equation*}
  • Yes
  • No
Hint.
If we rewrite
\begin{equation*} c_1\begin{bmatrix}-1\\0\end{bmatrix}+c_2 \begin{bmatrix}2\\3\end{bmatrix}+c_3\begin{bmatrix}4\\-1\end{bmatrix}=\begin{bmatrix}0\\0\end{bmatrix} \end{equation*}
as a system of linear equations, there will be more unknowns than equations.
3.
    \begin{equation*} \begin{bmatrix}1\\0\\5\end{bmatrix}, \begin{bmatrix}2\\2\\3\end{bmatrix},\begin{bmatrix}-1\\0\\1\end{bmatrix} \end{equation*}
  • Yes
  • No
Hint.
If we let \(A\) be the matrix whose columns are these vectors, then \(\mbox{rref}(A)\) should tell us what we want to know.
4.
    \begin{equation*} \begin{bmatrix}3\\0\\5\end{bmatrix}, \begin{bmatrix}2\\0\\2\end{bmatrix},\begin{bmatrix}-1\\0\\-5\end{bmatrix} \end{equation*}
  • Yes
  • No
Hint.
If we let \(A\) be the matrix whose columns are these vectors, then \(\mbox{rref}(A)\) should tell us what we want to know.
5.
    \begin{equation*} \begin{bmatrix}3\\1\\4\\1\end{bmatrix}, \begin{bmatrix}-2\\1\\1\\1\end{bmatrix} \end{equation*}
  • Yes
  • No
Hint.
In a set of two vectors, the only way one could be redundant is if they are scalar multiples of each other.

Exercise Group.

True or false?
6.
    Any set containing the zero vector is linearly dependent.
  • TRUE
  • FALSE
Hint.
Can the zero vector be removed from the set without changing the span?
7.
    A set containing five vectors in \(\R^2\) is linearly dependent.
  • TRUE
  • FALSE
Hint.
If we rewrite (3.2.2) for five vectors in \(\R^2\) as a system of equations, how many equations and unknowns will it have? What does this imply about the number of solutions?

Exercise Group.

Each problem below provides information about vectors \(\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\text{.}\) If possible, determine whether the vectors are linearly dependent or independent.
8.
    \begin{equation*} 0\mathbf{v}_1+ 0\mathbf{v}_2+ 0\mathbf{v}_3=\mathbf{0} \end{equation*}
  • The vectors are linearly independent
  • The vectors are linearly dependent
  • There is not enough information given to make a determination
9.
    \begin{equation*} 3\mathbf{v}_1+ 4\mathbf{v}_2- \mathbf{v}_3=\mathbf{0} \end{equation*}
  • The vectors are linearly independent
  • The vectors are linearly dependent
  • There is not enough information given to make a determination
10.
    \begin{equation*} 2\mathbf{v}_1+ 0\mathbf{v}_2+ 0\mathbf{v}_3=\mathbf{0} \end{equation*}
  • The vectors are linearly independent
  • The vectors are linearly dependent
  • There is not enough information given to make a determination

Exercise Group.

Each diagram below shows a collection of vectors. Are the vectors linearly dependent or independent?
11.
    Example with three vectors
  • The vectors are linearly independent
  • The vectors are linearly dependent
  • There is not enough information given to make a determination
12.
    Example with two vectors
  • The vectors are linearly independent
  • The vectors are linearly dependent
  • There is not enough information given to make a determination

13.

Suppose \(\{\mathbf{v}_{1}, \dots , \mathbf{v}_{m}\}\) is a linearly independent set in \(\R^n\text{,}\) and that \(\mathbf{v}_{m+1}\) is not in \(\mbox{span}\left(\mathbf{v}_{1}, \dots , \mathbf{v}_{m}\right)\text{.}\) Prove that \(\{\mathbf{v}_{1}, \dots , \mathbf{v}_{m}, \mathbf{v}_{m+1}\}\) is also linearly independent.

14.

Suppose \(\{{\mathbf{u}},{\mathbf{v}}\}\) is a linearly independent set of vectors. Prove that the set \(\{\mathbf{u} -\mathbf{v}, \mathbf{u}+2\mathbf{v}\}\) is also linearly independent.

15.

Suppose \(\{{\mathbf{u}},{\mathbf{v}}\}\) is a linearly independent set of vectors in \(\R^3\text{.}\) Is the following set dependent or independent \(\{\mathbf{u} -\mathbf{v}, \mathbf{u}+2\mathbf{v}, \mathbf{u}+\mathbf{v}\}\text{?}\) Prove your claim.