August 9, 2025
Complete the following warm-up problems to re-familiarize yourself with concepts we’ll be leveraging today.
Determine whether the vector \(\vec{b} = \begin{bmatrix} -5\\ -2\\ -1\\ 4\end{bmatrix}\) is in \(\text{span}\left(\left\{\begin{bmatrix} 1\\ 1\\ 0\\ 2\end{bmatrix}, \begin{bmatrix} -3\\ 0\\ 1\\ 1\end{bmatrix}, \begin{bmatrix} 0\\ 1\\ 1\\ 0\end{bmatrix}\right\}\right)\)
Do the columns of the matrix \(\begin{bmatrix} 1 & 8 & -2 & 3\\ 0 & 0 & -1 & 1\\ 0 & 0 & 0 & 4\\ 0 & 0 & 0 & 0\end{bmatrix}\) span \(\mathbb{R}^4\)? Why or why not?
Describe the space spanned by the vectors \(\vec{v_1} = \begin{bmatrix} 1\\ 0\\ -1\end{bmatrix}\), \(\vec{v_2} =\begin{bmatrix} 2\\ 3\\ 0\end{bmatrix}\), and \(\vec{v_3} = \begin{bmatrix} 6\\ 15\\ 4\end{bmatrix}\).
Recently we’ve discussed linear combinations of vectors and spans of collections of vectors.
In our discussion on spans, we considered whether a collection of \(m\)-component vectors spanned all of \(\mathbb{R}^m\).
It is natural now to start considering whether we can remove any vectors from a collection without changing its span.
Goals for Today: After today’s discussion, you should be able to…
Define the notions of linear independence and linear dependence
Test whether a collection \(\left\{\vec{v_1},~\vec{v_2},~\cdots,~\vec{v_p}\right\}\) is linearly independent
Explain what happens to the span of a collection if you remove a vector that is a linear combination of the other vectors in the collection
Consider a collection of non-zero vectors \(V = \left\{\vec{v_1},~\vec{v_2},~\cdots,~\vec{v_p}\right\}\)
If we assume that no vector in \(V\) can be written as a linear combination of the other vectors in \(V\), then the only way to write \(\vec{0}\) as a linear combination of vectors from \(V\) is to take every weight \(c_i = 0\).
Definition (Linear Independence): Given a set of vectors \(V = \left\{\vec{v_1},~\vec{v_2},~\cdots,~\vec{v_p}\right\}\), if the only solution to the vector equation \(c_1\vec{v_1} + c_2\vec{v_2} + \cdots + c_p\vec{v_p} = \vec{0}\) is that \(c_1 = c_2 = \cdots = c_p = 0\), then the vectors in \(V\) are linearly independent.
Definition (Linear Dependence): Given a set of vectors \(V = \left\{\vec{v_1},~\vec{v_2},~\cdots,~\vec{v_p}\right\}\), if there exists a solution to the vector equation \(c_1\vec{v_1} + c_2\vec{v_2} + \cdots + c_p\vec{v_p} = \vec{0}\) with at least one \(c_i \neq 0\), then the vectors in \(V\) are linearly dependent.
Notice that reducing a linearly dependent collection of vectors to a linearly independent collection does not change its span. In some sense, we might think of linearly independent collections as being “efficient”.
Definition (Homogeneous System/Equation): When the right-hand side / constant vector in a system or equation is the zero vector (i.e. \(\vec{b} = \vec{0}\)), then that system or equation is called homogeneous.
Homogeneous systems and equations are always consistent, because the zero-vector is always a solution.
\[\left\{\begin{array}{lcl} a_{11}x_1 + a_{12}x_2 + \cdots + a_{1p}x_p & = & 0\\ a_{21}x_1 + a_{22}x_2 + \cdots + a_{2p}x_p & = & 0\\ & \vdots & \\ a_{m1}x_1 + a_{m2}x_2 + \cdots + a_{mp}x_p & = & 0 \end{array}\right.\]
has the solution \(x_1 = 0,~x_2 = 0,~\cdots,~x_p = 0\) (the trivial solution).
\[A\vec{x} = \vec{0} \text{ or } \left[\begin{array}{cccc|c} \vec{a_1} & \vec{a_2} & \cdots & \vec{a_p} & \vec{0}\end{array}\right]\]
has the solution \(\vec{x} = \vec{0}\).
\[x_1\vec{v_1} + x_2\vec{v_2} + \cdots + x_p\vec{v_p} = \vec{0}\]
has the solution \(x_1 = 0,~x_2 = 0,~\cdots,~x_p = 0\).
Investigating homogeneous equations provides us with a simple strategy for determining whether a collection of vectors is linearly independent or linearly dependent.
Strategy: We identify whether \(\left\{\vec{v_1},~\vec{v_2},~\cdots,~\vec{v_p}\right\}\) is linearly independent or dependent by determining whether or not the system corresponding to the augmented matrix \(\left[\begin{array}{cccc|c} v_{11} & v_{12} & \cdots & v_{1p} & 0\\ v_{21} & v_{22} & \cdots & v_{2p} & 0\\ \vdots & \vdots & \ddots & \vdots & \vdots\\ v_{m1} & v_{m2} & \cdots & v_{mp} & 0\\ \end{array}\right]\) has a free variable.
Recall that, since \(\vec{x} = \vec{0}\) is a solution to the homogeneous system, the only way for a non-zero solution to exist is to have infinitely many solutions.
We’ve discovered previously that a system has infinitely many solutions if it has at least one free variable.
Any column of the coefficient matrix which is not a pivot column corresponds to a free variable.
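The free-variable test above can be sketched computationally. The following pure-Python sketch (the function names `pivot_columns` and `independent` are our own, not standard terminology) row-reduces the coefficient matrix whose columns are the given vectors and checks whether every column is a pivot column; exact `Fraction` arithmetic avoids floating-point round-off.

```python
from fractions import Fraction

def pivot_columns(rows):
    """Row-reduce a matrix (given as a list of rows) and return the
    indices of its pivot columns.  Exact arithmetic via Fraction."""
    M = [[Fraction(x) for x in row] for row in rows]
    m, n = len(M), len(M[0])
    pivots = []
    r = 0                          # next pivot row
    for c in range(n):
        # find a row at or below r with a nonzero entry in column c
        pr = next((i for i in range(r, m) if M[i][c] != 0), None)
        if pr is None:
            continue               # no pivot in column c -> free variable
        M[r], M[pr] = M[pr], M[r]  # swap the pivot row into place
        # eliminate column c from every other row
        for i in range(m):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
        if r == m:
            break
    return pivots

def independent(vectors):
    """The vectors are linearly independent exactly when every column of
    the matrix [v1 v2 ... vp] is a pivot column."""
    rows = [list(row) for row in zip(*vectors)]  # vectors become columns
    return len(pivot_columns(rows)) == len(vectors)
```

For example, `independent([(1, 0, 2), (0, 1, 1), (1, 1, 3)])` reports dependence, since the third vector is the sum of the first two, so its column is not a pivot column.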
Although they align with our observations and strategy above, a few special cases are worth pointing out.
A collection of vectors \(\left\{\vec{v_1},~\vec{v_2},~\cdots,~\vec{v_p}\right\}\) is linearly dependent if at least one of the vectors in the set can be written as a linear combination of the other vectors in the set. Otherwise, the collection is linearly independent.
We can use several of the tools/investigations we’ve already encountered to determine whether collections \(\left\{\vec{v_1},~\vec{v_2},~\cdots,~\vec{v_p}\right\}\) are linearly independent.
Below are a few observations that we can make. Some of them were mentioned earlier in these slides, but others are mentioned here for the first time.
A collection containing a single non-zero vector is linearly independent (the equation \(c_1\vec{v_1} = \vec{0}\) with \(\vec{v_1} \neq \vec{0}\) forces \(c_1 = 0\)).
Any collection containing the zero vector \(\vec{0}\) is linearly dependent (take the weight on \(\vec{0}\) to be \(1\) and all other weights to be \(0\)).
Any collection of vectors from \(\mathbb{R}^m\) that contains more than \(m\) vectors is linearly dependent (the coefficient matrix has more columns than rows, so some column is not a pivot column and yields a free variable).
Strategy: The most reliable (broadly applicable) method for determining whether a collection of vectors \(V = \left\{\vec{v_1},~\vec{v_2},~\cdots,~\vec{v_p}\right\}\) is linearly independent is to determine whether the matrix \(\begin{bmatrix} \vec{v_1} & \vec{v_2} & \cdots & \vec{v_p}\end{bmatrix}\) has a pivot in every column.
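With library tooling, the same strategy can be phrased in terms of rank: a collection of \(p\) vectors has a pivot in every column exactly when the matrix built from them has rank \(p\). A brief illustration using NumPy's `matrix_rank` (which computes rank numerically, up to a floating-point tolerance); the example vectors here are our own, not from the exercises below.

```python
import numpy as np

# Place the vectors as the COLUMNS of a matrix.  Then
#   rank == number of columns  <=>  a pivot in every column
#                              <=>  the collection is linearly independent.
A = np.column_stack([(1, 2, 3), (4, 5, 6), (5, 7, 9)])  # note: v3 = v1 + v2
rank_A = np.linalg.matrix_rank(A)   # 2 < 3 columns -> linearly dependent

B = np.column_stack([(1, 2, 3), (4, 5, 6)])
rank_B = np.linalg.matrix_rank(B)   # 2 == 2 columns -> linearly independent
```

For hand-sized exact problems, row reduction by hand (or with exact rational arithmetic) remains the safer route, since floating-point rank can misjudge nearly-dependent columns.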
Example: Determine whether the collections of vectors below are linearly independent.
\(\left\{\left[\begin{array}{r} 1\\ -2\\ 0\\ 0\\ 3\end{array}\right]\right\}\)
\(\left\{\left[\begin{array}{r} 1\\ 2\end{array}\right], \left[\begin{array}{r} -1\\ 1\end{array}\right], \left[\begin{array}{r} 3\\ 3\end{array}\right]\right\}\)
\(\left\{\left[\begin{array}{r} 1\\ 0\\ 3\end{array}\right], \left[\begin{array}{r} 0\\ 0\\ 0\end{array}\right]\right\}\)
\(\left\{\left[\begin{array}{r} 1\\ 1\\ 0\end{array}\right], \left[\begin{array}{r} 0\\ -2\\ 1\end{array}\right], \left[\begin{array}{r} -1\\ 0\\ 1\end{array}\right]\right\}\)
Example: Determine the value(s) for \(h\) which make \(\vec{v_3} \in \text{span}\left(\left\{\vec{v_1},~\vec{v_2}\right\}\right)\) where \(\vec{v_1} = \left[\begin{array}{r} 1\\ -3\\ 5\end{array}\right]\), \(\vec{v_2} = \left[\begin{array}{r} -3\\ 9\\ 15\end{array}\right]\), and \(\vec{v_3} = \left[\begin{array}{r}2\\ -5\\ h\end{array}\right]\). For which values of \(h\) is \(\left\{\vec{v_1},~\vec{v_2},~\vec{v_3}\right\}\) linearly independent?
Example: Determine the value(s) of \(h\) which make \(\left[\begin{array}{r} 1\\ -2\\ 4\end{array}\right]\), \(\left[\begin{array}{r} -3\\ 7\\ 6\end{array}\right]\), and \(\left[\begin{array}{r} 2\\ 1\\ h\end{array}\right]\) linearly dependent.
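As a computational cross-check for parameter problems like the one above (a sketch of our own; it leans on the \(3\times 3\) determinant, a tool that appears later in the course): three vectors in \(\mathbb{R}^3\) are linearly dependent exactly when the matrix having them as columns has determinant zero, and when \(h\) appears in a single entry the determinant is an affine function \(sh + t\), so its root can be found exactly from two sample values. The helper names `det3` and `dependent_h` are invented for this sketch.

```python
from fractions import Fraction

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def columns(v1, v2, v3):
    """Build the matrix (as rows) whose columns are v1, v2, v3."""
    return [list(row) for row in zip(v1, v2, v3)]

def dependent_h(v1, v2, v3_of_h):
    """Find the h making the three vectors dependent, assuming the
    determinant is affine in h: det(h) = s*h + t."""
    d0 = Fraction(det3(columns(v1, v2, v3_of_h(0))))  # t
    d1 = Fraction(det3(columns(v1, v2, v3_of_h(1))))  # s + t
    s, t = d1 - d0, d0
    if s == 0:
        return None   # det constant in h: dependent for every h or for none
    return -t / s     # the single h with det = 0

v1, v2 = (1, -2, 4), (-3, 7, 6)
h_star = dependent_h(v1, v2, lambda h: (2, 1, h))
```

Evaluating at the returned `h_star` (and only there) the determinant vanishes, so the collection is linearly dependent exactly at that value of \(h\).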
Example: What are the possible row echelon forms of the matrices in the following scenarios:
Example: How many pivot columns must a \(6\times 4\) matrix have if its columns span \(\mathbb{R}^4\)? Why?
Example: Construct \(3\times 2\) matrices \(A\) and \(B\) such that \(A\vec{x} = \vec{0}\) has a non-trivial solution but \(B\vec{x} = \vec{0}\) has only the trivial solution.
\[\Huge{\text{Finish Homework 5}}\] \[\Huge{\text{on MyOpenMath}}\]
\(\Huge{\text{Matrix Transformations}}\)