
---
$\textbf{Exercise 14.}$ Suppose $v_1, \ldots, v_m$ is a list of vectors in $V$. For $k \in \{1, \ldots, m\}$, let $w_k = v_1 + \cdots + v_k.$ Show that the list $v_1, \ldots, v_m$ is linearly independent if and only if the list $w_1, \ldots, w_m$ is linearly independent.
$\textbf{Solution 14.}$ First suppose $v_1, \ldots, v_m$ is linearly independent. Suppose $c_1, \ldots, c_m \in \mathbb{F}$ and $c_1w_1 + \cdots + c_mw_m = 0.$ In this equation, replace each $w_k$ with $v_1 + \cdots + v_k$ and rewrite the left side as a linear combination of $v_1, \ldots, v_m$. Because $v_1, \ldots, v_m$ is linearly independent, the coefficient of each $v_k$ equals 0. The coefficient of $v_m$ after this replacement is $c_m$. Thus $c_m = 0$.
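Written out, the substitution described above turns the equation into
$$(c_1 + c_2 + \cdots + c_m)v_1 + (c_2 + \cdots + c_m)v_2 + \cdots + (c_{m-1} + c_m)v_{m-1} + c_m v_m = 0,$$
so the coefficient of $v_k$ is $c_k + c_{k+1} + \cdots + c_m$.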
Now that $c_m = 0$, the coefficient of $v_{m-1}$ is $c_{m-1}$. Thus $c_{m-1} = 0$. Continuing in this fashion, we see that $c_m = c_{m-1} = \cdots = c_1 = 0$. Thus $w_1, \ldots, w_m$ is linearly independent.
To prove the implication in the other direction, now suppose $w_1, \ldots, w_m$ is linearly independent. Suppose $a_1, \ldots, a_m \in \mathbb{F}$ and $a_1v_1 + \cdots + a_mv_m = 0.$ In this equation, replace $v_1$ with $w_1$, replace each $v_k$ for $k > 1$ with $w_k - w_{k-1}$, and rewrite the left side as a linear combination of $w_1, \ldots, w_m$. Because $w_1, \ldots, w_m$ is linearly independent, the coefficient of each $w_k$ equals 0. The coefficient of $w_m$ after these replacements is $a_m$. Thus $a_m = 0$.
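Written out, these replacements turn the equation into
$$(a_1 - a_2)w_1 + (a_2 - a_3)w_2 + \cdots + (a_{m-1} - a_m)w_{m-1} + a_m w_m = 0,$$
so the coefficient of $w_k$ is $a_k - a_{k+1}$ for $k < m$ and is $a_m$ for $k = m$.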
Now that $a_m = 0$, the coefficient of $w_{m-1}$ is $a_{m-1}$. Thus $a_{m-1} = 0$. Continuing in this fashion, we see that $a_m = a_{m-1} = \cdots = a_1 = 0$. Thus $v_1, \ldots, v_m$ is linearly independent.
$\textit{Commentary:}$ This exercise is the linear independence analogue of Exercise 3. It shows that the linear independence of a list of vectors is preserved when each vector $v_k$ is replaced by $w_k = v_1 + \cdots + v_k$, the sum of the first $k$ vectors in the list.
The proof of the forward direction assumes that a linear combination of the $w_k$ equals zero, then rewrites it as a linear combination of the $v_k$ by replacing each $w_k$ with the sum $v_1 + \cdots + v_k$. Since the $v_k$ are linearly independent, the coefficient of each $v_k$ in the rewritten combination must be zero. Starting with $c_m$ and working backwards, this forces each $c_k$ to be zero.
The proof of the reverse direction is similar, but starts with a linear combination of the $v_k$ equaling zero. This is rewritten as a linear combination of the $w_k$ by replacing $v_1$ with $w_1$ and each subsequent $v_k$ with $w_k - w_{k-1}$. Since the $w_k$ are linearly independent, the coefficient of each $w_k$ must be zero, and working backwards from $a_m$ again forces each $a_k$ to be zero.
This result is useful in simplifying or transforming linearly independent lists. It shows that there is considerable flexibility in choosing a linearly independent list spanning a given subspace: the lists $v_1, \ldots, v_m$ and $w_1, \ldots, w_m$ have the same span. The cumulative sums $w_k$ can be easier to work with in some situations than the original vectors $v_k$.
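To make the same-span claim explicit, each vector in either list is a linear combination of vectors in the other:
$$w_k = v_1 + \cdots + v_k, \qquad v_1 = w_1, \qquad v_k = w_k - w_{k-1} \text{ for } k > 1,$$
so $\operatorname{span}(v_1, \ldots, v_m) = \operatorname{span}(w_1, \ldots, w_m)$.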
The exercise also provides practice in working with the definition of linear independence and in manipulating linear combinations. The inductive argument used in the proof is a common technique in linear algebra.
$\textit{Additional Examples:}$
In $\mathbb{Q}_p^3$, let $v_1 = (1, 0, 0)$, $v_2 = (0, p, 0)$, $v_3 = (0, 0, p^2)$. The list $v_1, v_2, v_3$ is linearly independent. Let $w_1 = v_1 = (1, 0, 0)$, $w_2 = v_1 + v_2 = (1, p, 0)$, $w_3 = v_1 + v_2 + v_3 = (1, p, p^2)$. The list $w_1, w_2, w_3$ is also linearly independent.
In the space of polynomials $\mathbb{F}_3[x]$, let $v_1 = 1$, $v_2 = x$, $v_3 = x^2$. The list $v_1, v_2, v_3$ is linearly independent. Let $w_1 = v_1 = 1$, $w_2 = v_1 + v_2 = 1 + x$, $w_3 = v_1 + v_2 + v_3 = 1 + x + x^2$. The list $w_1, w_2, w_3$ is also linearly independent.
In the space of continuous functions $C([0,1], \mathbb{C})$, let $v_1 = 1$, $v_2 = x$, $v_3 = e^x$. The list $v_1, v_2, v_3$ is linearly independent. Let $w_1 = v_1 = 1$, $w_2 = v_1 + v_2 = 1 + x$, $w_3 = v_1 + v_2 + v_3 = 1 + x + e^x$. The list $w_1, w_2, w_3$ is also linearly independent.
In the space of sequences $\ell^2(\mathbb{N})$, let $v_1 = (1, 0, 0, \ldots)$, $v_2 = (0, \frac{1}{2}, 0, 0, \ldots)$, $v_3 = (0, 0, \frac{1}{3}, 0, 0, \ldots)$. The list $v_1, v_2, v_3$ is linearly independent. Let $w_1 = v_1 = (1, 0, 0, \ldots)$, $w_2 = v_1 + v_2 = (1, \frac{1}{2}, 0, 0, \ldots)$, $w_3 = v_1 + v_2 + v_3 = (1, \frac{1}{2}, \frac{1}{3}, 0, 0, \ldots)$. The list $w_1, w_2, w_3$ is also linearly independent.
These examples demonstrate the principle in various vector spaces, including over the $p$-adic numbers, over a finite field, in a function space, and in a sequence space. In each case, the original list $v_1, \ldots, v_m$ is linearly independent, and so is the transformed list $w_1, \ldots, w_m$, where each $w_k$ is the sum of the first $k$ vectors in the original list. This illustrates the robustness and generality of this property of linear independence.
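As a quick numerical illustration, here is a minimal sketch using numpy and a toy example in $\mathbb{R}^3$ (an assumed example, not one of the spaces above). It checks that passing from the $v_k$ to the cumulative sums $w_k$ preserves linear independence; the change of list amounts to left-multiplication by an all-ones lower-triangular (hence invertible) matrix, so the rank cannot change.

```python
import numpy as np

# Toy example in R^3 (assumed for illustration): rows are v_1, v_2, v_3.
v = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# Row k of w is the cumulative sum v_1 + ... + v_k.
w = np.cumsum(v, axis=0)

# The change of list is left-multiplication by a unitriangular matrix L,
# which is invertible, so v and w have the same rank.
L = np.tril(np.ones((3, 3)))
assert np.allclose(w, L @ v)

print(np.linalg.matrix_rank(v))  # 3: v_1, v_2, v_3 are linearly independent
print(np.linalg.matrix_rank(w))  # 3: w_1, w_2, w_3 are linearly independent
```

The same check works for any choice of linearly independent rows, since the matrix $L$ is invertible regardless of the entries of $v$.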