$\textbf{Exercise 4.}$ Suppose $V$ is finite-dimensional and $\dim V > 1$. Prove that the set of noninvertible linear maps from $V$ to itself is not a subspace of $\mathcal{L}(V)$.
$\textbf{Solution 4.}$ Let $n = \dim V$ and let $v_1, \ldots, v_n$ be a basis of $V$. Define $S, T \in \mathcal{L}(V)$ by
$\begin{align*}
S(a_1v_1 + \cdots + a_nv_n) &= a_1v_1 \\
T(a_1v_1 + \cdots + a_nv_n) &= a_2v_2 + \cdots + a_nv_n.
\end{align*}$
Then $S$ is not injective because $Sv_2 = 0$ even though $v_2 \neq 0$ (this is where we use the hypothesis that $\dim V > 1$), and $T$ is not injective because $Tv_1 = 0$ even though $v_1 \neq 0$.
Thus neither $S$ nor $T$ is invertible. However, $(S + T)(a_1v_1 + \cdots + a_nv_n) = a_1v_1 + a_2v_2 + \cdots + a_nv_n$ for all $a_1, \ldots, a_n$, so $S + T = I$, which is invertible. Thus the set of noninvertible linear maps from $V$ to itself is not closed under addition, and hence it is not a subspace of $\mathcal{L}(V)$.
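For concreteness, consider the case $n = 2$ (a worked instance not in the original argument). With respect to the basis $v_1, v_2$, the maps above have the matrices
$\begin{align*}
\mathcal{M}(S) = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \qquad
\mathcal{M}(T) = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}, \qquad
\mathcal{M}(S) + \mathcal{M}(T) = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = \mathcal{M}(I),
\end{align*}$
so two singular matrices (each of rank $1$) sum to the identity matrix.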
$\textit{Comment:}$ If $\dim V = 1$, then the set of noninvertible linear maps from $V$ to itself equals $\{0\}$, which is a subspace of $\mathcal{L}(V)$. (If $\dim V = 0$, then the only linear map on $V$ is the zero map, which is the identity on $\{0\}$ and hence invertible, so the set of noninvertible maps is empty.)