$\textbf{Exercise 9.}$ Suppose $v_1, \ldots, v_m$ is a list of vectors in $V$. For $k \in \{1, \ldots, m\}$, let $w_k = v_1 + \cdots + v_k$. Show that $v_1, \ldots, v_m$ is a basis of $V$ if and only if $w_1, \ldots, w_m$ is a basis of $V$.

$\textbf{Solution 9.}$ First suppose $v_1, \ldots, v_m$ is a basis of $V$. Thus $v_1, \ldots, v_m$ is linearly independent and spans $V$. By Exercises 14 and 3 in Section 2A, $w_1, \ldots, w_m$ is linearly independent and spans $V$. Thus $w_1, \ldots, w_m$ is a basis of $V$.

Now suppose $w_1, \ldots, w_m$ is a basis of $V$. Thus $w_1, \ldots, w_m$ is linearly independent and spans $V$. By Exercises 14 and 3 in Section 2A, $v_1, \ldots, v_m$ is linearly independent and spans $V$. Thus $v_1, \ldots, v_m$ is a basis of $V$.

$\textit{Commentary:}$ This exercise relates the concept of a basis to the transformation introduced in Exercise 3 of Section 2A, which replaces each vector $v_k$ in a list with the sum of the first $k$ vectors of the list. The proof relies on the results of Exercises 14 and 3 from Section 2A: Exercise 14 states that this transformation preserves linear independence, while Exercise 3 states that it preserves span.

To prove the forward direction, we assume that $v_1, \ldots, v_m$ is a basis of $V$, so it is linearly independent and spans $V$. By Exercise 14, $w_1, \ldots, w_m$ is also linearly independent, and by Exercise 3, it also spans $V$. Thus $w_1, \ldots, w_m$ is a basis of $V$. The proof of the reverse direction is analogous, using the same two exercises; note that the original list can be recovered from the partial sums, since $v_1 = w_1$ and $v_k = w_k - w_{k-1}$ for $k \geq 2$.

This exercise illustrates a deeper principle: being a basis is a structural property of a list of vectors that is preserved under certain transformations. Here the transformation is taking partial sums, but any transformation of a list from which the original vectors can be recovered as linear combinations of the new ones preserves the basis property in the same way.

$\textit{Examples:}$

1. In $\mathbb{R}^3$, let $v_1 = (1, 0, 0)$, $v_2 = (0, 1, 0)$, $v_3 = (0, 0, 1)$. Then $w_1 = (1, 0, 0)$, $w_2 = (1, 1, 0)$, $w_3 = (1, 1, 1)$. Both $v_1, v_2, v_3$ and $w_1, w_2, w_3$ are bases of $\mathbb{R}^3$.

2. In $\mathcal{P}_2(\mathbb{R})$, let $v_1 = 1$, $v_2 = x$, $v_3 = x^2$. Then $w_1 = 1$, $w_2 = 1 + x$, $w_3 = 1 + x + x^2$. Both $v_1, v_2, v_3$ and $w_1, w_2, w_3$ are bases of $\mathcal{P}_2(\mathbb{R})$.

3. In the space of $2 \times 2$ matrices $M_{2 \times 2}(\mathbb{C})$, let $v_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$, $v_2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$, $v_3 = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}$, $v_4 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$. Then $w_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$, $w_2 = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}$, $w_3 = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}$, $w_4 = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$. Both $v_1, v_2, v_3, v_4$ and $w_1, w_2, w_3, w_4$ are bases of $M_{2 \times 2}(\mathbb{C})$.

These examples demonstrate the principle in spaces including $\mathbb{R}^n$, polynomial spaces, and matrix spaces. In each case, transforming a basis by taking partial sums yields another basis, giving a way to generate new bases from old ones while preserving the essential structure.
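As a concrete check of Example 1, here is a minimal NumPy sketch (the names `V`, `W`, and `V_recovered` are illustrative, not from the text): it builds the partial sums of the standard basis of $\mathbb{R}^3$, verifies that both lists are bases by a rank check, and recovers the original list by taking successive differences of the partial sums.

```python
import numpy as np

# Columns of V are the standard basis vectors e1, e2, e3 of R^3 (Example 1);
# columns of W are their partial sums w_k = v_1 + ... + v_k.
V = np.eye(3)
W = np.cumsum(V, axis=1)  # column k of W is the sum of the first k columns of V

# A list of 3 vectors in R^3 is a basis exactly when the matrix having those
# vectors as columns is invertible, i.e. has rank 3.
assert np.linalg.matrix_rank(V) == 3
assert np.linalg.matrix_rank(W) == 3

# Recovering the original list: v_1 = w_1 and v_k = w_k - w_{k-1} for k >= 2,
# i.e. successive differences of the columns of W give back the columns of V.
V_recovered = np.diff(np.hstack([np.zeros((3, 1)), W]), axis=1)
assert np.allclose(V_recovered, V)
```

The cumulative sum along columns is exactly the partial-sum transformation of the exercise, and the rank check is the matrix counterpart of "linearly independent and spans $V$" for a list whose length equals $\dim V$.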