$\textbf{Exercise 10.}$ Suppose $U$ and $W$ are subspaces of $V$ such that $V = U \oplus W$. Suppose also that $u_1, \ldots, u_m$ is a basis of $U$ and $w_1, \ldots, w_n$ is a basis of $W$. Prove that $u_1, \ldots, u_m, w_1, \ldots, w_n$ is a basis of $V$.

$\textbf{Solution 10.}$ First suppose $a_1, \ldots, a_m, b_1, \ldots, b_n \in \mathbb{F}$ are such that
$$a_1u_1 + \cdots + a_mu_m + b_1w_1 + \cdots + b_nw_n = 0.$$
Because $a_1u_1 + \cdots + a_mu_m \in U$ and $b_1w_1 + \cdots + b_nw_n \in W$, the equation above and 1.45 imply that
$$a_1u_1 + \cdots + a_mu_m = 0 \text{ and } b_1w_1 + \cdots + b_nw_n = 0.$$
Because $u_1, \ldots, u_m$ is linearly independent and $w_1, \ldots, w_n$ is linearly independent, this implies that
$$a_1 = \cdots = a_m = b_1 = \cdots = b_n = 0.$$
Thus $u_1, \ldots, u_m, w_1, \ldots, w_n$ is linearly independent.

Now suppose $v \in V$. Then there exist $u \in U$ and $w \in W$ such that $v = u + w$. Because $u_1, \ldots, u_m$ spans $U$ and $w_1, \ldots, w_n$ spans $W$, there exist $a_1, \ldots, a_m, b_1, \ldots, b_n \in \mathbb{F}$ such that
$$u = a_1u_1 + \cdots + a_mu_m \text{ and } w = b_1w_1 + \cdots + b_nw_n.$$
Thus
$$v = a_1u_1 + \cdots + a_mu_m + b_1w_1 + \cdots + b_nw_n,$$
which shows that $u_1, \ldots, u_m, w_1, \ldots, w_n$ spans $V$.

Because $u_1, \ldots, u_m, w_1, \ldots, w_n$ is linearly independent and spans $V$, this list is a basis of $V$.

$\textit{Commentary:}$ This exercise deals with the situation where a vector space $V$ is the direct sum of two subspaces $U$ and $W$. It asks us to prove that if we have bases for $U$ and $W$, then putting these bases together yields a basis for $V$.

The proof has two parts. First, we prove that the combined list of basis vectors is linearly independent. We assume a linear combination of these vectors equals zero, then use the fact that $V$ is the direct sum of $U$ and $W$ to conclude that the parts of this linear combination in $U$ and $W$ must separately equal zero.
Since the basis vectors of $U$ and $W$ are respectively linearly independent, this implies that all coefficients must be zero, proving the linear independence of the combined list.

Second, we prove that the combined list spans $V$. We take an arbitrary vector $v$ in $V$ and use the direct sum property to write it as $u + w$ with $u \in U$ and $w \in W$. Since the basis vectors of $U$ span $U$ and the basis vectors of $W$ span $W$, we can write $u$ and $w$ as linear combinations of these basis vectors. Combining these linear combinations gives a linear combination of the combined list of basis vectors that equals $v$, proving that the combined list spans $V$.

Together, these two parts show that the combined list is a basis of $V$. This exercise illustrates a fundamental principle: bases for direct sums can be constructed by combining bases of the summands. This is a powerful tool for understanding the structure of vector spaces that can be decomposed into direct sums.

$\textit{Examples:}$

1. In $\mathbb{R}^4$, let $U = \{(a, b, 0, 0) : a, b \in \mathbb{R}\}$ and $W = \{(0, 0, c, d) : c, d \in \mathbb{R}\}$. Then $\mathbb{R}^4 = U \oplus W$. A basis for $U$ is $(1, 0, 0, 0)$, $(0, 1, 0, 0)$, and a basis for $W$ is $(0, 0, 1, 0)$, $(0, 0, 0, 1)$. Together, these form a basis for $\mathbb{R}^4$.

2. In the space of polynomials $\mathbb{R}[x]$, let $U$ be the subspace of polynomials with even powers of $x$ and $W$ be the subspace of polynomials with odd powers of $x$. Then $\mathbb{R}[x] = U \oplus W$. A basis for $U$ is $1, x^2, x^4, \ldots$, and a basis for $W$ is $x, x^3, x^5, \ldots$. Together, these form a basis for $\mathbb{R}[x]$.

3. In the space of $2 \times 2$ matrices $M_{2 \times 2}(\mathbb{C})$, let $U = \left\{\begin{pmatrix} a & 0 \\ 0 & d \end{pmatrix} : a, d \in \mathbb{C}\right\}$ and $W = \left\{\begin{pmatrix} 0 & b \\ c & 0 \end{pmatrix} : b, c \in \mathbb{C}\right\}$. Then $M_{2 \times 2}(\mathbb{C}) = U \oplus W$.
A basis for $U$ is $\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$, $\begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$, and a basis for $W$ is $\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$, $\begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}$. Together, these form a basis for $M_{2 \times 2}(\mathbb{C})$.

These examples demonstrate the principle in various spaces, including $\mathbb{R}^n$, polynomial spaces, and matrix spaces. In each case, we have a space that is a direct sum of two subspaces, and bases for the whole space can be formed by combining bases of the subspaces. This illustrates the utility of the direct sum decomposition in simplifying the task of finding bases for vector spaces.
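As a purely illustrative numeric check of Example 1 (not part of the proof), the following sketch stacks the bases of $U$ and $W$ from that example into a matrix and tests that it has full rank, which for four vectors in $\mathbb{R}^4$ is equivalent to being a basis. The variable names and the rank-based test are our own choices, not anything from the text:

```python
import numpy as np

# Bases of U = {(a, b, 0, 0)} and W = {(0, 0, c, d)} from Example 1.
basis_U = [np.array([1.0, 0.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0, 0.0])]
basis_W = [np.array([0.0, 0.0, 1.0, 0.0]), np.array([0.0, 0.0, 0.0, 1.0])]

# Stack the combined list as the rows of a 4x4 matrix.
combined = np.stack(basis_U + basis_W)

# A list of n vectors in R^n is a basis iff this matrix has rank n:
# full rank gives linear independence, and n independent vectors span R^n.
rank = np.linalg.matrix_rank(combined)
print(rank == combined.shape[0])  # prints True: the combined list is a basis
```

The same rank test applies to any claimed direct-sum decomposition of $\mathbb{R}^n$: if the stacked matrix of the two bases has rank $n$, the combined list is a basis, matching the conclusion of the exercise.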