Coordinated Linear Algebra

Section 2.4 Homogeneous Linear Systems

Definition 2.4.1.

A system of linear equations is called homogeneous if the system can be written in the form
\begin{align*} a_{11}x_1 \amp + \amp a_{12}x_2\amp +\amp \ldots\amp +\amp a_{1n}x_n\amp = \amp 0 \\ a_{21}x_1 \amp + \amp a_{22}x_2\amp +\amp \ldots\amp +\amp a_{2n}x_n\amp = \amp 0 \\ \amp \amp \amp \amp \vdots\amp \amp \amp \amp \\ a_{m1}x_1 \amp + \amp a_{m2}x_2\amp +\amp \ldots\amp +\amp a_{mn}x_n\amp = \amp 0 \end{align*}
A homogeneous linear system is always consistent because \(x_1=0, x_2=0, \ldots ,x_n=0\) is a solution. This solution is called the trivial solution. Geometrically, a homogeneous system can be interpreted as a collection of lines or planes (or hyperplanes) passing through the origin. Thus, they will always have the origin in common, but may have other points in common as well.
If \(A\) is the coefficient matrix for a homogeneous system, then the system can be written as a matrix equation \(A\mathbf{x}=\mathbf{0}\text{.}\) The augmented matrix that represents the system looks like this
\begin{equation*} \left[\begin{array}{c|c} A\amp 0 \end{array}\right] \end{equation*}
As we perform elementary row operations, the entries to the right of the vertical bar remain \(0\text{.}\) It is customary to omit writing them down and apply elementary row operations to the coefficient matrix only.

Example 2.4.2.

Solve the given homogeneous system and interpret your solution geometrically.
\begin{align*} 4x \amp + \amp 5y\amp -\amp z\amp = \amp 0 \\ x\amp - \amp 4y\amp -\amp 2z\amp = \amp 0 \\ 3x \amp - \amp 6y\amp -\amp 4z\amp = \amp 0 \end{align*}
Answer.
We start by rewriting this system as a matrix equation
\begin{equation*} \begin{bmatrix}4\amp 5\amp -1\\1\amp -4\amp -2\\3\amp -6\amp -4\end{bmatrix}\begin{bmatrix}x\\y\\z\end{bmatrix}=\mathbf{0}. \end{equation*}
We will proceed to find the reduced row echelon form of the matrix as usual, but will omit writing the zeros to the right of the vertical bar.
\begin{equation*} \begin{bmatrix}4\amp 5\amp -1\\1\amp -4\amp -2\\3\amp -6\amp -4\end{bmatrix}\rightsquigarrow \begin{bmatrix}1\amp 0\amp -2/3\\0\amp 1\amp 1/3\\0\amp 0\amp 0\end{bmatrix}. \end{equation*}
\(x\) and \(y\) are leading variables, and \(z\) is a free variable. We let \(z=t\) and solve for \(x\) and \(y\text{.}\)
\begin{equation*} x =\frac{2}{3}t, \quad \ y =-\frac{1}{3}t, \quad \ z =t. \end{equation*}
Each of the equations in the original system represents a plane through the origin in \(\R^3\text{.}\) The system has infinitely many solutions. Geometrically, we can interpret these solutions as points lying on the line shared by the three planes. The above solution is a parametric representation of this line. Use the GeoGebra demo below to take a better look at the system. (RIGHT-CLICK and DRAG to rotate the image.)
Figure 2.4.3.
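The row reduction and the parametric solution in this example are easy to verify computationally. The following is a minimal sketch in Python; the `rref` helper is our own illustration (exact arithmetic with the standard `fractions` module), not something defined in the text.

```python
from fractions import Fraction

def rref(M):
    """Row-reduce a matrix (a list of rows of Fractions) in place and return it."""
    rows, cols = len(M), len(M[0])
    pivot_row = 0
    for col in range(cols):
        # find a row at or below pivot_row with a nonzero entry in this column
        pr = next((r for r in range(pivot_row, rows) if M[r][col] != 0), None)
        if pr is None:
            continue
        M[pivot_row], M[pr] = M[pr], M[pivot_row]
        piv = M[pivot_row][col]
        M[pivot_row] = [x / piv for x in M[pivot_row]]       # scale pivot row
        for r in range(rows):
            if r != pivot_row and M[r][col] != 0:            # clear the column
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
    return M

# coefficient matrix of the homogeneous system in Example 2.4.2
A = [[Fraction(v) for v in row] for row in [[4, 5, -1], [1, -4, -2], [3, -6, -4]]]
R = rref(A)
assert R == [[1, 0, Fraction(-2, 3)],
             [0, 1, Fraction(1, 3)],
             [0, 0, 0]]

# check the parametric solution x = (2/3)t, y = -(1/3)t, z = t on a sample t
t = Fraction(7)
x, y, z = 2 * t / 3, -t / 3, t
assert 4*x + 5*y - z == 0 and x - 4*y - 2*z == 0 and 3*x - 6*y - 4*z == 0
```

Since every value of \(t\) passes the same check, this matches the line of solutions found above.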

Subsection 2.4.1 General and Particular Solutions

Definition 2.4.4.

Given any linear system \(A\mathbf{x}=\mathbf{b}\text{,}\) the system \(A\mathbf{x}=\mathbf{0}\) is called the associated homogeneous system.
It turns out that there is a relationship between solutions of \(A\mathbf{x}=\mathbf{b}\) and solutions of the associated homogeneous system.

Exploration 2.4.1.

Let
\begin{equation*} A=\begin{bmatrix}1\amp 2\amp 4\\3\amp -7\amp -1\\-1\amp 4\amp 2\end{bmatrix}\quad\text{and}\quad\mathbf{b}=\begin{bmatrix}-2\\7\\-4\end{bmatrix} \end{equation*}
Consider the matrix equation \(A\mathbf{x}=\mathbf{b}\text{.}\) Row reduction produces the following.
\begin{equation*} \left[\begin{array}{ccc|c} 1\amp 2\amp 4\amp -2\\3\amp -7\amp -1\amp 7\\-1\amp 4\amp 2\amp -4 \end{array}\right]\begin{array}{c} \\ \rightsquigarrow\\ \\ \end{array}\left[\begin{array}{ccc|c} 1\amp 0\amp 2\amp 0\\0\amp 1\amp 1\amp -1\\0\amp 0\amp 0\amp 0 \end{array}\right] \end{equation*}
The third column has no pivot, so \(x_3\) is a free variable. Letting \(x_3=t\text{,}\) we conclude that \(\mathbf{x}=\begin{bmatrix}-2t\\-1-t\\t\end{bmatrix}\text{.}\)
Problem 2.4.5.
Let’s take a more careful look at \(\mathbf{x}\text{.}\) Rewrite
\begin{equation*} \mathbf{x}=\begin{bmatrix}-2t\\-1-t\\t\end{bmatrix} \end{equation*}
into its parametric form.
Answer.
\begin{equation*} \mathbf{x}=\begin{bmatrix}-2t\\-1-t\\t\end{bmatrix}=\begin{bmatrix}0\\-1\\0\end{bmatrix}+\begin{bmatrix}-2\\-1\\1\end{bmatrix}t. \end{equation*}
We now see that the solution vector \(\mathbf{x}\) is made up of two distinct parts:
  • one specific vector \(\begin{bmatrix}0\\-1\\0\end{bmatrix}\)
  • infinitely many scalar multiples of \(\begin{bmatrix}-2\\-1\\1\end{bmatrix}\text{.}\)
The vector \(\begin{bmatrix}0\\-1\\0\end{bmatrix}\) is an example of a particular solution. This particular ``particular solution" corresponds to \(t=0\text{.}\) We can find any number of particular solutions by letting \(t\) assume different values. For example, the particular solution that corresponds to \(t=1\) is \(\begin{bmatrix}-2\\-2\\1\end{bmatrix}\text{.}\) Let \(\mathbf{x}_p\) be any particular solution of \(A\mathbf{x}=\mathbf{b}\text{.}\) It turns out that all vectors of the form
\begin{equation*} \mathbf{x}=\mathbf{x}_p+\begin{bmatrix}-2\\-1\\1\end{bmatrix}t \end{equation*}
are solutions of \(A\mathbf{x}=\mathbf{b}\text{.}\) We can verify this as follows
\begin{align*} A\mathbf{x} \amp =A\left(\mathbf{x}_p+\begin{bmatrix}-2\\-1\\1\end{bmatrix}t\right) \\ \amp =A\mathbf{x}_p+\begin{bmatrix}1\amp 2\amp 4\\3\amp -7\amp -1\\-1\amp 4\amp 2\end{bmatrix}\begin{bmatrix}-2\\-1\\1\end{bmatrix}t \\ \amp =A\mathbf{x}_p+\mathbf{0}=\mathbf{b}. \end{align*}
This shows that the specific vector \(\begin{bmatrix}0\\-1\\0\end{bmatrix}\) is not very special, as any solution of \(A\mathbf{x}=\mathbf{b}\) can be used in its place. The vector \(\begin{bmatrix}-2\\-1\\1\end{bmatrix}\text{,}\) however, is special. Note that
\begin{equation*} A\begin{bmatrix}-2\\-1\\1\end{bmatrix}=\begin{bmatrix}1\amp 2\amp 4\\3\amp -7\amp -1\\-1\amp 4\amp 2\end{bmatrix}\begin{bmatrix}-2\\-1\\1\end{bmatrix}=\mathbf{0}. \end{equation*}
So \(\begin{bmatrix}-2\\-1\\1\end{bmatrix}\) and all of its scalar multiples are solutions to the associated homogeneous system.
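The two facts this exploration rests on, \(A\mathbf{x}_p=\mathbf{b}\) and \(A\begin{bmatrix}-2\\-1\\1\end{bmatrix}=\mathbf{0}\text{,}\) can be confirmed with a short computation. This is a plain-Python sketch; the `matvec` helper is ours, introduced only for illustration.

```python
def matvec(A, v):
    """Multiply a matrix (list of rows) by a vector (list)."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 2, 4], [3, -7, -1], [-1, 4, 2]]
b = [-2, 7, -4]
x_p = [0, -1, 0]     # the particular solution found above (t = 0)
v_h = [-2, -1, 1]    # a solution of the associated homogeneous system

assert matvec(A, x_p) == b           # A x_p = b
assert matvec(A, v_h) == [0, 0, 0]   # A v_h = 0

# hence x_p + t*v_h solves Ax = b for every scalar t
for t in range(-3, 4):
    x = [p + t * h for p, h in zip(x_p, v_h)]
    assert matvec(A, x) == b
```

By linearity, \(A(\mathbf{x}_p+t\mathbf{v}_h)=A\mathbf{x}_p+tA\mathbf{v}_h=\mathbf{b}+\mathbf{0}\text{,}\) which is exactly what the loop confirms numerically.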

Observation 2.4.6.

In Exploration 2.4.1 we found that the general solution of the equation \(A\mathbf{x}=\mathbf{b}\) has the form:
\begin{equation*} \mathbf{x}=(\text{Any Particular Solution of}\,A\mathbf{x}=\mathbf{b}) + (\text{General Solution of}\,A\mathbf{x}=\mathbf{0}). \end{equation*}
It turns out that the general solution of any linear system can be written in this format. Theorem 2.4.7 formalizes this result.

Theorem 2.4.7.

Suppose \(\mathbf{x}_p\) is a particular solution of a consistent linear system \(A\mathbf{x}=\mathbf{b}\text{.}\)
  1. If \(\mathbf{x}_h\) is any solution of the associated homogeneous system \(A\mathbf{x}=\mathbf{0}\text{,}\) then \(\mathbf{x}=\mathbf{x}_p+\mathbf{x}_h\) is a solution of \(A\mathbf{x}=\mathbf{b}\text{.}\)
  2. If \(\mathbf{x}_1\) is any solution of \(A\mathbf{x}=\mathbf{b}\text{,}\) then \(\mathbf{x}_1=\mathbf{x}_p+\mathbf{x}_h\) for some solution \(\mathbf{x}_h\) of the associated homogeneous system.

Proof.

We will prove Item 2. The proof of Item 1 is left to the reader.
Let \(\mathbf{x}_h=\mathbf{x}_1-\mathbf{x}_p\text{.}\) Then
\begin{equation*} A\mathbf{x}_h=A(\mathbf{x}_1-\mathbf{x}_p)=A\mathbf{x}_1-A\mathbf{x}_p=\mathbf{b}-\mathbf{b}=\mathbf{0}\text{,} \end{equation*}
so \(\mathbf{x}_h\) is a solution of the associated homogeneous system, and
\begin{equation*} \mathbf{x}_1=\mathbf{x}_p+\mathbf{x}_h\text{,} \end{equation*}
as desired.

Example 2.4.8.

Let
\begin{equation*} A=\begin{bmatrix}2\amp -4\amp -2\\1\amp -2\amp -1\end{bmatrix}\quad\text{and}\quad \mathbf{b}=\begin{bmatrix}8\\4\end{bmatrix}. \end{equation*}
If possible, find a solution of \(A\mathbf{x}=\mathbf{b}\) and express it as a sum of a particular solution and the general solution of the associated homogeneous system. (\(\mathbf{x}=\mathbf{x}_p+\mathbf{x}_h\))
Answer.
First, we obtain the reduced row echelon form
\begin{equation*} \left[\begin{array}{ccc|c} 2\amp -4\amp -2\amp 8\\1\amp -2\amp -1\amp 4 \end{array}\right]\begin{array}{c} \\ \rightsquigarrow\\ \\ \end{array}\left[\begin{array}{ccc|c} 1\amp -2\amp -1\amp 4\\0\amp 0\amp 0\amp 0 \end{array}\right]. \end{equation*}
So
\begin{equation*} \mathbf{x}=\begin{bmatrix}4+2s+t\\s\\t\end{bmatrix}=\begin{bmatrix}4\\0\\0\end{bmatrix}+\begin{bmatrix}2\\1\\0\end{bmatrix}s+\begin{bmatrix}1\\0\\1\end{bmatrix}t \end{equation*}
In this case
\begin{equation*} \mathbf{x}_p=\begin{bmatrix}4\\0\\0\end{bmatrix} \quad \ \text{and} \ \quad \mathbf{x}_h=\begin{bmatrix}2\\1\\0\end{bmatrix}s+\begin{bmatrix}1\\0\\1\end{bmatrix}t \end{equation*}
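As in the exploration, this decomposition can be checked numerically; here the homogeneous part has two parameters. A plain-Python sketch (the `matvec` helper is our own, for illustration):

```python
def matvec(A, v):
    """Multiply a matrix (list of rows) by a vector (list)."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, -4, -2], [1, -2, -1]]
b = [8, 4]
x_p = [4, 0, 0]                  # particular solution (s = t = 0)
h1, h2 = [2, 1, 0], [1, 0, 1]    # the two homogeneous directions found above

assert matvec(A, x_p) == b       # A x_p = b
assert matvec(A, h1) == [0, 0]   # A h1 = 0
assert matvec(A, h2) == [0, 0]   # A h2 = 0

# every x_p + s*h1 + t*h2 solves Ax = b
for s in range(-2, 3):
    for t in range(-2, 3):
        x = [p + s * u + t * w for p, u, w in zip(x_p, h1, h2)]
        assert matvec(A, x) == b
```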

Subsection 2.4.2 Linear Independence

Homogeneous linear systems allow us to define a very important concept: linear independence.

Definition 2.4.9. Linear Independence.

Let \(\mathbf{v}_1, \mathbf{v}_2,\ldots ,\mathbf{v}_k\) be vectors of \(\R^n\text{.}\) We say that the set \(\{\mathbf{v}_1, \mathbf{v}_2,\ldots ,\mathbf{v}_k\}\) is linearly independent if the only solution to the homogeneous linear combination equation
\begin{equation} c_1\mathbf{v}_1+c_2\mathbf{v}_2+\ldots +c_k\mathbf{v}_k=\mathbf{0}\tag{2.4.1} \end{equation}
is the trivial solution \(c_1=c_2=\ldots =c_k=0\text{.}\)
If, in addition to the trivial solution, a non-trivial solution (not all \(c_1, c_2,\ldots ,c_k\) are zero) exists, then the set \(\{\mathbf{v}_1, \mathbf{v}_2,\ldots ,\mathbf{v}_k\}\) is called linearly dependent.

Remark 2.4.10.

Given a set of vectors \(X=\{\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\}\) we can now ask the following questions:
  1. Are the vectors in \(X\) linearly dependent according to Definition 2.4.9?
  2. Can we write one element of \(X\) as a linear combination of the others?
It turns out that these questions are equivalent. In other words, if the answer to one of them is ``YES", the answer to the other is also ``YES". Conversely, if the answer to one of them is ``NO", then the answer to the other is also ``NO". We will start by illustrating this idea with an example, then conclude this section by formally proving the equivalency.

Example 2.4.11.

What can we say about the following sets of vectors in light of Remark 2.4.10?
  1. \begin{equation*} \begin{bmatrix}2\\-3\end{bmatrix}, \begin{bmatrix}0\\3\end{bmatrix},\begin{bmatrix}1\\-1\end{bmatrix},\begin{bmatrix}1\\-2\end{bmatrix}. \end{equation*}
  2. \begin{equation*} \begin{bmatrix}2\\1\\4\end{bmatrix},\begin{bmatrix}-3\\1\\1\end{bmatrix}. \end{equation*}
Answer.
We will start by addressing linear independence for Item 1. To do so, we will solve the linear combination equation
\begin{equation} c_1 \begin{bmatrix} 2\\-3 \end{bmatrix} + c_2 \begin{bmatrix}0\\3\end{bmatrix} + c_3\begin{bmatrix}1\\-1\end{bmatrix} + c_4\begin{bmatrix}1\\-2\end{bmatrix} = \mathbf{0}.\tag{2.4.2} \end{equation}
Clearly \(c_1=c_2=c_3=c_4=0\) is a solution to the equation. The question is whether another solution exists. The linear combination equation translates into the following system:
\begin{align*} 2c_1 \amp \amp \amp +\amp c_3\amp +\amp c_4\amp = \amp 0 \\ -3c_1\amp +\amp 3c_2\amp -\amp c_3\amp -\amp 2c_4\amp = \amp 0 \end{align*}
Writing the system in augmented matrix form and applying elementary row operations gives us the following reduced row echelon form:
\begin{equation*} \left[\begin{array}{cccc|c} 2\amp 0\amp 1\amp 1\amp 0\\-3\amp 3\amp -1\amp -2\amp 0 \end{array}\right] \xrightarrow{\text{RREF}} \left[\begin{array}{cccc|c} 1\amp 0\amp 1/2\amp 1/2\amp 0\\0\amp 1\amp 1/6\amp -1/6\amp 0 \end{array}\right]. \end{equation*}
This shows that (2.4.2) has infinitely many solutions:
\begin{equation*} c_1=-\frac{1}{2}s-\frac{1}{2}t,\quad c_2=-\frac{1}{6}s+\frac{1}{6}t,\quad c_3=s,\quad c_4=t. \end{equation*}
Since there are infinitely many solutions, we conclude that the vectors are linearly dependent.
We now address Item 2 of the remark. We can use the solution to the homogeneous linear combination equation (2.4.2) to write one of the vectors as a linear combination of the others. Letting \(t=s=6\text{,}\) we obtain the following:
\begin{equation} -6\begin{bmatrix}2\\-3\end{bmatrix} + 0 \begin{bmatrix}0\\3\end{bmatrix} + 6\begin{bmatrix}1\\-1\end{bmatrix} + 6\begin{bmatrix}1\\-2\end{bmatrix} = \mathbf{0}.\tag{2.4.3} \end{equation}
Now we solve (2.4.3) for one of the vectors:
\begin{equation} \begin{bmatrix}2\\-3\end{bmatrix} = 0\begin{bmatrix}0\\3\end{bmatrix} + \begin{bmatrix}1\\-1\end{bmatrix} + \begin{bmatrix}1\\-2\end{bmatrix}.\tag{2.4.4} \end{equation}
This would not be possible if a nontrivial solution to the equation
\begin{equation*} c_1\begin{bmatrix}2\\-3\end{bmatrix} + c_2 \begin{bmatrix}0\\3\end{bmatrix} + c_3\begin{bmatrix}1\\-1\end{bmatrix} + c_4\begin{bmatrix}1\\-2\end{bmatrix} = \mathbf{0} \end{equation*}
did not exist. Therefore, we conclude the answer to both questions in Remark 2.4.10 is ``YES".
For Item 2, to address linear independence, we need to solve the equation
\begin{equation*} c_1\begin{bmatrix}2\\1\\4\end{bmatrix}+c_2\begin{bmatrix}-3\\1\\1\end{bmatrix}=\mathbf{0}. \end{equation*}
Converting the equation to augmented matrix form and performing row reduction gives us
\begin{equation*} \left[\begin{array}{cc|c} 2\amp -3\amp 0\\1\amp 1\amp 0\\4\amp 1\amp 0 \end{array}\right] \xrightarrow{\text{RREF}} \left[\begin{array}{cc|c} 1\amp 0\amp 0\\0\amp 1\amp 0\\0\amp 0\amp 0 \end{array}\right]. \end{equation*}
This shows that \(c_1=c_2=0\) is the only solution. Therefore the two vectors are linearly independent. Furthermore, we cannot write one of the vectors as a linear combination of the other. Do you see that the only way this would be possible with a set of two vectors is if they were scalar multiples of each other?
So the answer to both questions in Remark 2.4.10 is ``NO".
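Both parts of this example come down to a rank computation: the vectors, placed as the columns of a matrix, are linearly independent exactly when the rank equals the number of vectors. The sketch below checks this in Python; the `rank` helper is our own illustration (exact arithmetic via the standard `fractions` module).

```python
from fractions import Fraction

def rank(M):
    """Rank of a matrix (list of rows) via exact Gaussian elimination."""
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        pr = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pr is None:
            continue                      # no pivot in this column
        M[r], M[pr] = M[pr], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# Item 1: four vectors in R^2, written as the columns of a 2x4 matrix
part1 = [[2, 0, 1, 1],
         [-3, 3, -1, -2]]
assert rank(part1) == 2      # rank < 4 vectors: linearly dependent

# Item 2: two vectors in R^3, written as the columns of a 3x2 matrix
part2 = [[2, -3],
         [1, 1],
         [4, 1]]
assert rank(part2) == 2      # rank == 2 vectors: linearly independent

# the explicit dependence relation (2.4.3): -6*v1 + 0*v2 + 6*v3 + 6*v4 = 0
v1, v2, v3, v4 = [2, -3], [0, 3], [1, -1], [1, -2]
assert [-6*a + 0*b + 6*c + 6*d for a, b, c, d in zip(v1, v2, v3, v4)] == [0, 0]
```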

Theorem 2.4.12.

Let \(\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\) be vectors of \(\R^n\text{,}\) with \(k\geq 2\text{.}\) The following statements are equivalent:
  1. The set \(\{\mathbf{v}_1,\mathbf{v}_2,\dots,\mathbf{v}_k\}\) is linearly dependent.
  2. One of the vectors can be expressed as a linear combination of the others.

Proof.

For Item 1 \(\implies\) Item 2, if \(\mathbf{v}_1,\mathbf{v}_2,\dots ,\mathbf{v}_k\) are linearly dependent, then
\begin{equation*} c_1\mathbf{v}_1+c_2\mathbf{v}_2+\ldots +c_j\mathbf{v}_j+\ldots +c_k\mathbf{v}_k=\mathbf{0} \end{equation*}
has a non-trivial solution. In other words at least one of the constants, say \(c_j\text{,}\) does not equal zero. This allows us to solve for \(\mathbf{v}_j\text{:}\)
\begin{align*} -c_j\mathbf{v}_j \amp = c_1\mathbf{v}_1 + \ldots + c_{j-1}\mathbf{v}_{j-1} + c_{j+1}\mathbf{v}_{j+1} + \ldots + c_k\mathbf{v}_k \\ \mathbf{v}_j \amp = -\frac{c_1}{c_j}\mathbf{v}_1 - \ldots - \frac{c_{j-1}}{c_j}\mathbf{v}_{j-1} - \frac{c_{j+1}}{c_j}\mathbf{v}_{j+1} - \ldots - \frac{c_k}{c_j}\mathbf{v}_k. \end{align*}
Do you see why it was important to have one of the constants nonzero? This shows that \(\mathbf{v}_j\) may be expressed as a linear combination of the other vectors.
For the implication Item 2 \(\implies\) Item 1, suppose \(\mathbf{v}_j\) is a linear combination of the remaining vectors \(\mathbf{v}_1,\dots,\mathbf{v}_{j-1},\mathbf{v}_{j+1},\dots,\mathbf{v}_k\text{.}\) We will show that the vectors \(\mathbf{v}_1,\mathbf{v}_2,\dots ,\mathbf{v}_k\) are linearly dependent. That is, there is a non-trivial solution to the homogeneous linear combination equation
\begin{equation} x_1\mathbf{v}_1+x_2\mathbf{v}_2+\cdots +x_j\mathbf{v}_j+\cdots +x_k\mathbf{v}_k=\mathbf{0}.\tag{2.4.5} \end{equation}
Since \(\mathbf{v}_j\) is a linear combination of the remaining vectors, we can write
\begin{equation*} \mathbf{v}_j = a_1\mathbf{v}_1+a_2\mathbf{v}_2+\ldots +a_{j-1}\mathbf{v}_{j-1}+a_{j+1}\mathbf{v}_{j+1}+\ldots +a_k\mathbf{v}_k \end{equation*}
for some constants \(a_1,a_2,\ldots ,a_{j-1},a_{j+1},\ldots ,a_k\text{.}\) We can now subtract \(\mathbf{v}_{j}\) from both sides of the equation to get:
\begin{equation*} a_1\mathbf{v}_1+a_2\mathbf{v}_2+\cdots +a_{j-1}\mathbf{v}_{j-1}-\mathbf{v}_{j}+a_{j+1}\mathbf{v}_{j+1}+\cdots +a_k\mathbf{v}_k = \mathbf{0}\text{.} \end{equation*}
Hence, \(x_1=a_1, x_2=a_2, \ldots, x_{j-1}=a_{j-1}, x_{j}=-1, x_{j+1}=a_{j+1},\ldots, x_k=a_k\) is a non-trivial solution to the homogeneous linear combination equation (2.4.5). This shows that the vectors \(\mathbf{v}_1,\mathbf{v}_2,\dots ,\mathbf{v}_k\) are linearly dependent.
These two parts of the proof show that if one of the conditions is true, both must be true. It is a logical consequence that if one of the conditions is false, both must be false.

Subsection 2.4.3 Geometry of Linearly Dependent and Linearly Independent Vectors

Theorem 2.4.12 gives us a convenient way of looking at linear dependence/independence geometrically. When looking at two or more vectors, we ask, “Can one of the vectors be written as a linear combination of the others?” If the answer is “YES”, then the vectors are linearly dependent.

Subsubsection 2.4.3.1 A Set of Two Vectors

Two vectors are linearly dependent if and only if one is a scalar multiple of the other. Two nonzero linearly dependent vectors may look like this:
Two vectors that lie on the same line and point in the same direction.
or like this:
Two vectors that lie on the same line, but point in opposite directions.
Either way, the span of the two vectors is a line.
Two linearly independent vectors will look like this:
Two vectors that do not lie on the same line.
and in this case, the span of the two vectors is a plane.

Subsubsection 2.4.3.2 A Set of Three Vectors

Given a set of three nonzero vectors, we have the following possibilities:
  • (Linearly Dependent Vectors) The three vectors are scalar multiples of each other.
    Three vectors that lie on the same line.
    or perhaps one or two of the vectors point in the opposite direction. In all cases, the span of the three vectors is a line.
  • (Linearly Dependent Vectors) Two of the vectors are scalar multiples of each other.
    Two vectors that lie on the same line and one that lies on a different line.
    and in this case the span of the three vectors is a plane.
  • (Linearly Dependent Vectors) One vector can be viewed as the diagonal of a parallelogram determined by scalar multiples of the other two vectors. All three vectors lie in the same plane.
    Three vectors that lie in the same plane, with one vector being the diagonal of a parallelogram formed by the other two.
  • (Linearly Independent Vectors) A set of three vectors is linearly independent if the vectors do not lie in the same plane. For example, the standard basis vectors \(\mathbf{i}\text{,}\) \(\mathbf{j}\) and \(\mathbf{k}\) are linearly independent.
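These geometric cases can also be checked numerically: three vectors lie in a common plane exactly when the matrix having them as columns has rank at most \(2\text{.}\) A plain-Python sketch (the `rank` helper and the sample vectors are our own illustration):

```python
from fractions import Fraction

def rank(M):
    """Rank of a matrix (list of rows) via exact Gaussian elimination."""
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        pr = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pr is None:
            continue
        M[r], M[pr] = M[pr], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# columns: u, v, and the "diagonal" u + v -- three vectors in one plane
coplanar = [[1, 0, 1],
            [0, 1, 1],
            [2, 3, 5]]
assert rank(coplanar) == 2   # dependent: the three span only a plane

# columns: the standard basis vectors i, j, k
basis = [[1, 0, 0],
         [0, 1, 0],
         [0, 0, 1]]
assert rank(basis) == 3      # independent: they span all of R^3
```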

Exercises 2.4.4 Exercises

Exercise Group.

For each matrix \(A\) and vector \(\mathbf{b}\) below, find a solution to \(A\mathbf{x}=\mathbf{b}\) and express your solution as a sum of a particular solution and a general solution to the associated homogeneous system.
1.
\begin{equation*} A=\begin{bmatrix}1\amp 1\amp 3\amp 1\\3\amp 4\amp 2\amp 1\end{bmatrix}\quad\text{and}\quad\mathbf{b}=\begin{bmatrix}6\\1\end{bmatrix} \end{equation*}
2.
\begin{equation*} A=\begin{bmatrix}3\amp 2\amp 1\\1\amp -1\amp 1\\4\amp 1\amp 1\end{bmatrix}\quad\text{and}\quad\mathbf{b}=\begin{bmatrix}10\\2\\12\end{bmatrix} \end{equation*}
3.
Prove that a consistent system has infinitely many solutions if and only if the associated homogeneous system has infinitely many solutions.

Exercise Group.

If possible, construct an example of each of the following. If not possible, explain why.
4.
An inconsistent system with an associated homogeneous system that has infinitely many solutions.
5.
An inconsistent system with an associated homogeneous system that has a unique (trivial) solution.

6.

Prove that a linear combination of any number of solutions of a homogeneous equation is a solution of the same equation.

8.

Consider the following set of vectors:
\begin{equation*} \left\{\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right\}. \end{equation*}
  1. Express each of the vectors in the set as a linear combination of the remaining vectors.
  2. Which of the following is NOT true?
  • If \(\mathbf{w} \) is in
    \begin{equation*} \mbox{span}\left(\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right), \end{equation*}
    then \(\mathbf{w}\) is in
    \begin{equation*} \mbox{span}\left(\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right). \end{equation*}
  • We can remove \([1,2,-1]\) and \([2,0,1]\) from
    \begin{equation*} \left\{\begin{bmatrix}1\\2\\-1\end{bmatrix},\begin{bmatrix}2\\0\\1\end{bmatrix},\begin{bmatrix}4\\4\\-1\end{bmatrix}\right\} \end{equation*}
    at the same time without affecting the span.
Answer.
\begin{align*} \begin{bmatrix}1\\2\\-1\end{bmatrix} \amp = -\frac{1}{2}\begin{bmatrix}2\\0\\1\end{bmatrix} + \frac{1}{2}\begin{bmatrix}4\\4\\-1\end{bmatrix}\\ \begin{bmatrix}2\\0\\1\end{bmatrix} \amp = -2\begin{bmatrix}1\\2\\-1\end{bmatrix} + 1\begin{bmatrix}4\\4\\-1\end{bmatrix}\\ \begin{bmatrix}4\\4\\-1\end{bmatrix} \amp = 2\begin{bmatrix}1\\2\\-1\end{bmatrix} + 1\begin{bmatrix}2\\0\\1\end{bmatrix} \end{align*}
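Each identity in this answer can be verified by computing the linear combination directly. A small Python sketch (the `combo` helper is ours, for illustration):

```python
from fractions import Fraction

def combo(coeffs, vectors):
    """Return the linear combination sum of c * v over paired coefficients/vectors."""
    return [sum(c * x for c, x in zip(coeffs, comps)) for comps in zip(*vectors)]

u = [1, 2, -1]
v = [2, 0, 1]
w = [4, 4, -1]
half = Fraction(1, 2)

assert combo([-half, half], [v, w]) == u   # u = -1/2 v + 1/2 w
assert combo([-2, 1], [u, w]) == v         # v = -2 u + w
assert combo([2, 1], [u, v]) == w          # w =  2 u + v
```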

Exercise Group.

Are the given vectors linearly independent?
9.
\begin{equation*} \begin{bmatrix}-1\\0\end{bmatrix}, \begin{bmatrix}2\\3\end{bmatrix},\begin{bmatrix}4\\-1\end{bmatrix} \end{equation*}
  • Yes
  • No
Hint.
If we rewrite
\begin{equation*} c_1\begin{bmatrix}-1\\0\end{bmatrix}+c_2 \begin{bmatrix}2\\3\end{bmatrix}+c_3\begin{bmatrix}4\\-1\end{bmatrix}=\begin{bmatrix}0\\0\end{bmatrix} \end{equation*}
as a system of linear equations, there will be more unknowns than equations.
10.
\begin{equation*} \begin{bmatrix}1\\0\\5\end{bmatrix}, \begin{bmatrix}2\\2\\3\end{bmatrix},\begin{bmatrix}-1\\0\\1\end{bmatrix} \end{equation*}
  • Yes
  • No
Hint.
If we let \(A\) be the matrix whose columns are these vectors, then \(\mbox{rref}(A)\) should tell us what we want to know.
11.
\begin{equation*} \begin{bmatrix}3\\0\\5\end{bmatrix}, \begin{bmatrix}2\\0\\2\end{bmatrix},\begin{bmatrix}-1\\0\\-5\end{bmatrix} \end{equation*}
  • Yes
  • No
Hint.
If we let \(A\) be the matrix whose columns are these vectors, then \(\mbox{rref}(A)\) should tell us what we want to know.
12.
\begin{equation*} \begin{bmatrix}3\\1\\4\\1\end{bmatrix}, \begin{bmatrix}-2\\1\\1\\1\end{bmatrix} \end{equation*}
  • Yes
  • No
Hint.
In a set of two vectors, the only way it can be linearly dependent is if one of the vectors is a scalar multiple of the other.

Exercise Group.

True or false?
13.
Any set containing the zero vector is linearly dependent.
  • TRUE
  • FALSE
Hint.
Can the zero vector be removed from the set without changing the span?
14.
A set containing five vectors in \(\R^2\) is linearly dependent.
  • TRUE
  • FALSE
Hint.
If we rewrite (2.4.1) for five vectors in \(\R^2\) as a system of equations, how many equations and unknowns will it have? What does this imply about the number of solutions?

Exercise Group.

Each problem below provides information about vectors \(\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\text{.}\) If possible, determine whether the vectors are linearly dependent or independent.
15.
\begin{equation*} 0\mathbf{v}_1+ 0\mathbf{v}_2+ 0\mathbf{v}_3=\mathbf{0} \end{equation*}
  • The vectors are linearly independent
  • The vectors are linearly dependent
  • There is not enough information given to make a determination
16.
\begin{equation*} 3\mathbf{v}_1+ 4\mathbf{v}_2- \mathbf{v}_3=\mathbf{0} \end{equation*}
  • The vectors are linearly independent
  • The vectors are linearly dependent
  • There is not enough information given to make a determination
17.
\begin{equation*} 2\mathbf{v}_1+ 0\mathbf{v}_2+ 0\mathbf{v}_3=\mathbf{0} \end{equation*}
  • The vectors are linearly independent
  • The vectors are linearly dependent
  • There is not enough information given to make a determination

Exercise Group.

Each diagram below shows a collection of vectors. Are the vectors linearly dependent or independent?
18.
Example with three vectors
  • The vectors are linearly independent
  • The vectors are linearly dependent
  • There is not enough information given to make a determination
19.
Example with two vectors
  • The vectors are linearly independent
  • The vectors are linearly dependent
  • There is not enough information given to make a determination

20.

Suppose \(\{\mathbf{v}_{1}, \dots , \mathbf{v}_{m}\}\) is a linearly independent set in \(\R^n\text{,}\) and that \(\mathbf{v}_{m+1}\) is not in \(\mbox{span}\left(\mathbf{v}_{1}, \dots , \mathbf{v}_{m}\right)\text{.}\) Prove that \(\{\mathbf{v}_{1}, \dots , \mathbf{v}_{m}, \mathbf{v}_{m+1}\}\) is also linearly independent.

21.

Suppose \(\{{\mathbf{u}},{\mathbf{v}}\}\) is a linearly independent set of vectors. Prove that the set \(\{\mathbf{u} -\mathbf{v}, \mathbf{u}+2\mathbf{v}\}\) is also linearly independent.

22.

Suppose \(\{{\mathbf{u}},{\mathbf{v}}\}\) is a linearly independent set of vectors in \(\R^3\text{.}\) Is the following set dependent or independent \(\{\mathbf{u} -\mathbf{v}, \mathbf{u}+2\mathbf{v}, \mathbf{u}+\mathbf{v}\}\text{?}\) Prove your claim.