![](sol-1.pdf#page=13)

---

Instructor's Solutions Manual, Section 1B, Exercise 8

**8** Suppose $V$ is a real vector space.

- The *complexification* of $V$, denoted by $V_C$, equals $V \times V$. An element of $V_C$ is an ordered pair $(u, v)$, where $u, v \in V$, but we write this as $u + iv$.
- Addition on $V_C$ is defined by $(u_1 + iv_1) + (u_2 + iv_2) = (u_1 + u_2) + i(v_1 + v_2)$ for all $u_1, v_1, u_2, v_2 \in V$.
- Complex scalar multiplication on $V_C$ is defined by $(a + bi)(u + iv) = \color{green}{(au - bv)} + i\color{red}{(av + bu)}$ for all $a, b \in \mathbb{R}$ and all $u, v \in V$.

**Prove that with the definitions of addition and scalar multiplication as above, $V_C$ is a complex vector space.**

Checklist of properties to verify:

- commutativity of addition; associativity of addition and of scalar multiplication;
- additive identity; additive inverse; multiplicative identity;
- distributive property 1: a scalar times a sum of vectors; distributive property 2: a sum of scalars times a vector.

*Think of $V$ as a subset of $V_C$ by identifying $u \in V$ with $u + i0$. The construction of $V_C$ from $V$ can then be thought of as generalizing the construction of $\mathbb{C}^n$ from $\mathbb{R}^n$.*

## SOLUTION

Suppose $u, v, u_1, u_2, v_1, v_2 \in V$ and $a, b, c, d \in \mathbb{R}$. Then
$$
\begin{align}
(u_1 + iv_1) + (u_2 + iv_2) &= (u_1 + u_2) + i(v_1 + v_2)\\
&= (u_2 + u_1) + i(v_2 + v_1)\\
&= (u_2 + iv_2) + (u_1 + iv_1),
\end{align}
$$
where

- the first equality comes from the definition of addition in $V_C$,
- the second equality holds because addition in $V$ is commutative,
- and the third equality comes from the definition of addition in $V_C$.

The equation above shows that **addition in $V_C$ is commutative.**

Also, show **associativity of addition**:
$$
\begin{align}
((u_1 + iv_1) + (u_2 + iv_2)) + (u + iv) &= ((u_1 + u_2) + i(v_1 + v_2)) + (u + iv)\\
&= ((u_1 + u_2) + u) + i((v_1 + v_2) + v)\\
&= (u_1 + (u_2 + u)) + i(v_1 + (v_2 + v))\\
&= (u_1 + iv_1) + ((u_2 + u) + i(v_2 + v))\\
&= (u_1 + iv_1) + ((u_2 + iv_2) + (u + iv)),
\end{align}
$$
where the third equality holds because addition in $V$ is associative and the other equalities come from the definition of addition in $V_C$. (The printed solution has two typos here: a missing parenthesis after $i$ in the second line, and $u_1 + iv_1$ misprinted as $u_2 + iv_2$ in the last line; both are corrected above.)

The equation above shows that **addition in $V_C$ is associative.** ^1b8associativity

Contrast: [[1B.8#^associativityproof|Link to Chat proof]], [[1B.8#Associativity of addition|Link to Chat proof]]
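As a quick numerical sanity check on the two properties just verified (my own sketch, not part of the manual's proof): model an element $u + iv$ of $V_C$ as the ordered pair $(u, v)$, take $V = \mathbb{R}^3$ represented by NumPy arrays, and spot-check commutativity and associativity of addition on random samples. The helper names `add` and `eq` are illustrative choices.

```python
# Minimal sketch: an element u + iv of V_C is stored as the pair (u, v),
# with V = R^3 modeled by NumPy arrays. Illustrative only.
import numpy as np

def add(x, y):
    """(u1 + i v1) + (u2 + i v2) = (u1 + u2) + i(v1 + v2)."""
    (u1, v1), (u2, v2) = x, y
    return (u1 + u2, v1 + v2)

def eq(x, y):
    return np.allclose(x[0], y[0]) and np.allclose(x[1], y[1])

rng = np.random.default_rng(0)
x, y, z = [(rng.standard_normal(3), rng.standard_normal(3)) for _ in range(3)]

assert eq(add(x, y), add(y, x))                    # commutativity of addition
assert eq(add(add(x, y), z), add(x, add(y, z)))    # associativity of addition
print("addition in V_C is commutative and associative on these samples")
```

Of course, a finite numerical check is no substitute for the proof above; it is only a way to catch a mistranscribed formula.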
Also, **associativity of scalar multiplication** (compatibility of scalar multiplication with multiplication in $\mathbb{C}$):
$$
\colorbox{yellow}{((a + bi)(c + di))} \colorbox{magenta}{(u + iv)}
$$
$$
\begin{array}{l}
((a + bi)(c + di))(u + iv)\\
= ((ac - bd) + (ad + bc)i)(u + iv)\\
= \colorbox{orange}{((ac - bd)u - (ad + bc)v) + i((ac - bd)v + (ad + bc)u)}
\end{array}
$$
Think $A = ac - bd$, $B = ad + bc$, $C = u$, $D = v$; then $(A + Bi)(C + iD) = (AC - BD) + i(AD + BC)$. Compare with the block below for the proof.

Furthermore,
$$
\require{color}\colorbox{yellow}{(a + bi)} \colorbox{magenta}{((c + di)(u + iv))}
$$
$$
\begin{array}{l}
(a + bi)((c + di)(u + iv))\\
= (a + bi)((cu - dv) + i(cv + du))\\
= (a(cu - dv) - b(cv + du)) + i(a(cv + du) + b(cu - dv))\\
= \colorbox{orange}{((ac - bd)u - (ad + bc)v) + i((ac - bd)v + (ad + bc)u)}.
\end{array}
$$
(The printed solution misprints the final term as $(ad - bc)u$; it is corrected above.)

Comparing the last two sets of equations, we conclude that
$$
((a + bi)(c + di))(u + iv) = (a + bi)((c + di)(u + iv)),
$$
verifying the associative property required of scalar multiplication in a vector space. **Scalar multiplication in $V_C$ is associative.**

- Note: this is exactly the axiom $(\alpha\beta)x = \alpha(\beta x)$ with complex scalars $\alpha = a + bi$, $\beta = c + di$ and vector $x = u + iv \in V_C$; no separate "scalar associativity" check is needed.

Also,
$$
(u + iv) + (0 + i0) = (u + 0) + i(v + 0) = u + iv.
$$
**Thus $0 + i0$ is an additive identity for $V_C$.** This additive identity is usually denoted as just $0$.

Also,
$$
(u + iv) + (-u + i(-v)) = (u + (-u)) + i(v + (-v)) = 0 + i0,
$$
which shows that every element of $V_C$ has an **additive inverse**.

Also,
$$
1(u + iv) = (1u - 0v) + i(1v + 0u) = u + iv,
$$
which shows that the **multiplicative identity** works as required.

Also, there are two distributive properties: a scalar times a sum of vectors, and a sum of scalars times a vector. First, a scalar times a sum of vectors:
$$
\colorbox{yellow}{(a + bi)} \colorbox{magenta}{((u_1 + iv_1) + (u_2 + iv_2))}
$$
$$
\begin{array}{l}
= (a + bi)((u_1 + u_2) + i(v_1 + v_2))\\
= (a(u_1 + u_2) - b(v_1 + v_2)) + i(a(v_1 + v_2) + b(u_1 + u_2))\\
\colorbox{orange}{=} (au_1 + au_2 - bv_1 - bv_2) + i(av_1 + av_2 + bu_1 + bu_2).
\end{array}
$$
Furthermore,
$$
\begin{array}{l}
(a + bi)(u_1 + iv_1) + (a + bi)(u_2 + iv_2)\\
= ((au_1 - bv_1) + i(av_1 + bu_1)) + ((au_2 - bv_2) + i(av_2 + bu_2))\\
\colorbox{orange}{=} (au_1 + au_2 - bv_1 - bv_2) + i(av_1 + av_2 + bu_1 + bu_2).
\end{array}
$$
Comparing the last two sets of equations, we conclude that
$$
\colorbox{lightgreen}{verified}\quad (a + bi)((u_1 + iv_1) + (u_2 + iv_2)) = (a + bi)(u_1 + iv_1) + (a + bi)(u_2 + iv_2),
$$
**verifying the first distributive property** required in a vector space.

Also, the **second distributive property**: a sum of scalars times a vector:
$$
\begin{array}{l}
\colorbox{yellow}{((a + bi) + (c + di))} \colorbox{magenta}{(u + iv)}\\
= ((a + c) + (b + d)i)(u + iv)\\
= ((a + c)u - (b + d)v) + i((a + c)v + (b + d)u)\\
\colorbox{orange}{= (au + cu - bv - dv) + i(av + cv + bu + du)}.
\end{array}
$$
$$
\begin{array}{l}
(a + bi)(u + iv) + (c + di)(u + iv)\\
= ((au - bv) + i(av + bu)) + ((cu - dv) + i(cv + du))\\
\colorbox{orange}{= (au + cu - bv - dv) + i(av + cv + bu + du)}.
\end{array}
$$
Comparing the last two sets of equations, we conclude that
$$
\colorbox{lightgreen}{((a + bi) + (c + di))(u + iv) = (a + bi)(u + iv) + (c + di)(u + iv)},
$$
**verifying the second distributive property** required in a vector space.
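The same kind of spot-check covers the remaining axioms. The sketch below (again my own illustration, assuming $V = \mathbb{R}^3$ as NumPy arrays and Python's built-in `complex` type for the scalars $a + bi$) checks the multiplicative identity, the compatibility $(\alpha\beta)x = \alpha(\beta x)$, and both distributive properties verified above.

```python
# Minimal sketch, illustrative only: V = R^3 as NumPy arrays, complex scalars.
import numpy as np

def add(x, y):
    return (x[0] + y[0], x[1] + y[1])

def smul(alpha, x):
    """(a + bi)(u + iv) = (au - bv) + i(av + bu)."""
    a, b = alpha.real, alpha.imag
    u, v = x
    return (a * u - b * v, a * v + b * u)

def eq(x, y):
    return np.allclose(x[0], y[0]) and np.allclose(x[1], y[1])

rng = np.random.default_rng(1)
x, y = [(rng.standard_normal(3), rng.standard_normal(3)) for _ in range(2)]
alpha, beta = 2 - 1j, -0.5 + 3j

assert eq(smul(1 + 0j, x), x)                                            # multiplicative identity
assert eq(smul(alpha * beta, x), smul(alpha, smul(beta, x)))             # (alpha beta) x = alpha (beta x)
assert eq(smul(alpha, add(x, y)), add(smul(alpha, x), smul(alpha, y)))   # first distributive property
assert eq(smul(alpha + beta, x), add(smul(alpha, x), smul(beta, x)))     # second distributive property
print("scalar multiplication axioms hold on these samples")
```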
All properties required for a complex vector space have now been verified for $V_C$.

![](sol-1.pdf#page=13)

---

Commentary:

$\textbf{Exercise 8.}$ Suppose $V$ is a real vector space.

- The complexification of $V$, denoted by $V_\mathbb{C}$, equals $V\times V$. An element of $V_\mathbb{C}$ is an ordered pair $(u, v)$, where $u, v \in V$, but we write this as $u + iv$.
- Addition on $V_\mathbb{C}$ is defined by $(u_1 + iv_1) + (u_2 + iv_2) = (u_1 + u_2) + i(v_1 + v_2)$ for all $u_1, v_1, u_2, v_2 \in V$.
- Complex scalar multiplication on $V_\mathbb{C}$ is defined by $(a + bi)(u + iv) = (au - bv) + i(av + bu)$ for all $a, b \in \mathbb{R}$ and all $u, v \in V$.

Prove that with the definitions of addition and scalar multiplication as above, $V_\mathbb{C}$ is a complex vector space.

$\textit{Commentary:}$ Think of $V$ as a subset of $V_\mathbb{C}$ by identifying $u \in V$ with $u + i0$. The construction of $V_\mathbb{C}$ from $V$ can then be thought of as generalizing the construction of $\mathbb{C}^n$ from $\mathbb{R}^n$.

$\textbf{Solution 8.}$ The proof is quite lengthy and involves verifying each of the vector space axioms for $V_\mathbb{C}$ with the given definitions of addition and scalar multiplication. I suggest breaking it down into smaller steps: verify the commutativity and associativity of addition, the existence of an additive identity and of additive inverses, the compatibility of scalar multiplication with complex multiplication, and the distributive properties. The key is to use the corresponding properties of $V$ and $\mathbb{C}$ together with the definitions of addition and scalar multiplication in $V_\mathbb{C}$. If you need further assistance with any of the steps, please let me know.

---

$\textbf{Solution 8.}$ We need to verify that $V_\mathbb{C}$ satisfies the axioms of a complex vector space:

1. Addition is commutative: $(u_1 + iv_1) + (u_2 + iv_2) = (u_1 + u_2) + i(v_1 + v_2) = (u_2 + u_1) + i(v_2 + v_1) = (u_2 + iv_2) + (u_1 + iv_1)$.
2. Addition is associative: $((u_1 + iv_1) + (u_2 + iv_2)) + (u_3 + iv_3) = ((u_1 + u_2) + i(v_1 + v_2)) + (u_3 + iv_3) = ((u_1 + u_2) + u_3) + i((v_1 + v_2) + v_3) = (u_1 + (u_2 + u_3)) + i(v_1 + (v_2 + v_3)) = (u_1 + iv_1) + ((u_2 + iv_2) + (u_3 + iv_3))$.
3. The zero vector is $0 + i0$, where $0$ is the zero vector in $V$: $(u + iv) + (0 + i0) = (u + 0) + i(v + 0) = u + iv$.
4. The additive inverse of $u + iv$ is $(-u) + i(-v)$: $(u + iv) + ((-u) + i(-v)) = (u + (-u)) + i(v + (-v)) = 0 + i0$.
5. Scalar multiplication is compatible with complex multiplication: $((a + bi)(c + di))(u + iv) = ((ac - bd) + i(ad + bc))(u + iv) = ((ac - bd)u - (ad + bc)v) + i((ac - bd)v + (ad + bc)u) = (a + bi)((cu - dv) + i(cv + du)) = (a + bi)((c + di)(u + iv))$.
6. $1(u + iv) = u + iv$: $1(u + iv) = (1u - 0v) + i(1v + 0u) = u + iv$.
7. Scalar multiplication is distributive over vector addition: $(a + bi)((u_1 + iv_1) + (u_2 + iv_2)) = (a + bi)((u_1 + u_2) + i(v_1 + v_2)) = (a(u_1 + u_2) - b(v_1 + v_2)) + i(a(v_1 + v_2) + b(u_1 + u_2)) = ((au_1 - bv_1) + (au_2 - bv_2)) + i((av_1 + bu_1) + (av_2 + bu_2)) = ((au_1 - bv_1) + i(av_1 + bu_1)) + ((au_2 - bv_2) + i(av_2 + bu_2)) = (a + bi)(u_1 + iv_1) + (a + bi)(u_2 + iv_2)$.
8. Scalar multiplication is distributive over addition of complex scalars: $((a + bi) + (c + di))(u + iv) = ((a + c) + i(b + d))(u + iv) = ((a + c)u - (b + d)v) + i((a + c)v + (b + d)u) = ((au - bv) + (cu - dv)) + i((av + bu) + (cv + du)) = ((au - bv) + i(av + bu)) + ((cu - dv) + i(cv + du)) = (a + bi)(u + iv) + (c + di)(u + iv)$.

Therefore, $V_\mathbb{C}$ is a complex vector space.

$\textit{Commentary:}$ This exercise demonstrates how to construct a complex vector space from a real vector space by a process called complexification. The idea is to pair each vector in the real space with another vector from the same space, and to treat this pair as a single "complex" vector, where the first vector plays the role of the "real part" and the second vector plays the role of the "imaginary part". The addition and scalar multiplication operations are then defined in a way that mimics the addition and multiplication of complex numbers, but with vectors taking the place of real numbers. This construction allows techniques from complex linear algebra to be applied to the study of real vector spaces. It also provides a way to embed a real vector space into a complex vector space, which can be useful in situations where complex techniques are more powerful or more convenient.

$\textit{Example:}$ Let $V = \mathbb{R}^2$. Then $V_\mathbb{C} = \mathbb{R}^2 \times \mathbb{R}^2$, which can be identified with $\mathbb{C}^2$. An element of $V_\mathbb{C}$ is of the form $((a, b), (c, d))$, which we write as $(a, b) + i(c, d)$. If $u = (1, 2) + i(3, 4)$ and $v = (-1, 1) + i(0, 1)$ are elements of $V_\mathbb{C}$, then:

- $u + v = ((1, 2) + (-1, 1)) + i((3, 4) + (0, 1)) = (0, 3) + i(3, 5)$

If $\alpha = 2 - i$ (so $a = 2$ and $b = -1$), then:

- $\alpha u = (2 - i)((1, 2) + i(3, 4)) = (2(1, 2) + (3, 4)) + i(2(3, 4) - (1, 2)) = (5, 8) + i(5, 6)$

Another example is the complexification of the vector space of real polynomials, $\mathbb{R}[x]$. An element of $(\mathbb{R}[x])_\mathbb{C}$ is of the form $p(x) + iq(x)$, where $p(x)$ and $q(x)$ are real polynomials. This complexification process can also be applied to function spaces and spaces of matrices. It's a very general and powerful construction in linear algebra and functional analysis.

I hope these detailed solutions, commentaries, and examples help to clarify the concepts presented in the exercises. Please let me know if you have any further questions!

---

Certainly! Here's an expanded version of the last answer that includes examples from other fields:

$\textbf{Exercise 8.}$ (Expanded)

$\textit{Commentary:}$ The complexification process can be applied to vector spaces over any field, not just the real numbers.
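Before the general statement, a small coded illustration of this point (my own sketch, not from the chat answer): the same pair-based construction can be written down verbatim over another base field. Here $K = \mathbb{Q}$, chosen only because Python's `fractions.Fraction` gives exact arithmetic; a scalar in $K(\sqrt{-1})$ is stored as a pair $(a, b)$ meaning $a + b\sqrt{-1}$, and an element of $K^2 \times K^2$ as a pair $(u, v)$ meaning $u + iv$.

```python
# Illustrative sketch: the pair construction over the base field K = Q.
from fractions import Fraction as F

def vadd(u, v):                        # addition in K^2
    return tuple(a + b for a, b in zip(u, v))

def vscale(c, u):                      # scalar from K times a vector in K^2
    return tuple(c * a for a in u)

def add(x, y):                         # (u1 + i v1) + (u2 + i v2)
    return (vadd(x[0], y[0]), vadd(x[1], y[1]))

def smul(alpha, x):                    # (a + bi)(u + iv) = (au - bv) + i(av + bu)
    (a, b), (u, v) = alpha, x
    return (vadd(vscale(a, u), vscale(-b, v)), vadd(vscale(a, v), vscale(b, u)))

u = (F(1, 2), F(3))                    # u = (1/2, 3) in Q^2
v = (F(2), F(-1, 4))                   # v = (2, -1/4) in Q^2
alpha = (F(2), F(-1))                  # alpha = 2 - i, stored as the pair (2, -1)

print(smul(alpha, (u, v)))             # the pair (2u + v, 2v - u), computed exactly
```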
In general, if $K$ is a field and $V$ is a $K$-vector space, then the complexification of $V$, denoted $V_\mathbb{C}$, is a $K(\sqrt{-1})$-vector space, where $K(\sqrt{-1})$ is the field extension of $K$ obtained by adjoining a square root of $-1$ (provided $-1$ is not already a square in $K$, so that $x^2 + 1$ is irreducible over $K$ and $K(\sqrt{-1})$ is a genuine quadratic extension).

$\textit{Example (p-adic numbers):}$ Let $K = \mathbb{Q}_p$ be the field of $p$-adic numbers for some prime $p$ such that $-1$ is not a square in $\mathbb{Q}_p$ (for example $p \equiv 3 \pmod 4$), and let $V = \mathbb{Q}_p^n$ be the $n$-dimensional vector space over $\mathbb{Q}_p$. Then the complexification of $V$ is $V_\mathbb{C} = \mathbb{Q}_p^n \times \mathbb{Q}_p^n$, which can be identified with $(\mathbb{Q}_p(\sqrt{-1}))^n$. An element of $V_\mathbb{C}$ is of the form $(x_1, \ldots, x_n) + i(y_1, \ldots, y_n)$, where $x_1, \ldots, x_n, y_1, \ldots, y_n \in \mathbb{Q}_p$ and $i$ is a square root of $-1$. The addition and scalar multiplication operations are defined as in the real case:

- $((x_1, \ldots, x_n) + i(y_1, \ldots, y_n)) + ((x'_1, \ldots, x'_n) + i(y'_1, \ldots, y'_n)) = (x_1 + x'_1, \ldots, x_n + x'_n) + i(y_1 + y'_1, \ldots, y_n + y'_n)$
- $(a + bi)((x_1, \ldots, x_n) + i(y_1, \ldots, y_n)) = (ax_1 - by_1, \ldots, ax_n - by_n) + i(ay_1 + bx_1, \ldots, ay_n + bx_n)$ for $a, b \in \mathbb{Q}_p$

The complexification of $\mathbb{Q}_p^n$ can be thought of as the space $(\mathbb{Q}_p(\sqrt{-1}))^n$, which is a vector space over the field $\mathbb{Q}_p(\sqrt{-1})$, the quadratic extension of $\mathbb{Q}_p$ obtained by adjoining a square root of $-1$.

$\textit{Example (function fields):}$ Let $K = \mathbb{F}_q(t)$ be the field of rational functions over a finite field $\mathbb{F}_q$ (again with $-1$ not a square in $\mathbb{F}_q$), and let $V = K^n$ be the $n$-dimensional vector space over $K$. Then the complexification of $V$ is $V_\mathbb{C} = K^n \times K^n$, which can be identified with $(K(\sqrt{-1}))^n$. An element of $V_\mathbb{C}$ is of the form $(f_1(t), \ldots, f_n(t)) + i(g_1(t), \ldots, g_n(t))$, where $f_1(t), \ldots, f_n(t), g_1(t), \ldots, g_n(t) \in K$ and $i$ is a square root of $-1$. The addition and scalar multiplication operations are defined as in the previous cases:

- $((f_1(t), \ldots, f_n(t)) + i(g_1(t), \ldots, g_n(t))) + ((f'_1(t), \ldots, f'_n(t)) + i(g'_1(t), \ldots, g'_n(t))) = (f_1(t) + f'_1(t), \ldots, f_n(t) + f'_n(t)) + i(g_1(t) + g'_1(t), \ldots, g_n(t) + g'_n(t))$
- $(a(t) + b(t)i)((f_1(t), \ldots, f_n(t)) + i(g_1(t), \ldots, g_n(t))) = (a(t)f_1(t) - b(t)g_1(t), \ldots, a(t)f_n(t) - b(t)g_n(t)) + i(a(t)g_1(t) + b(t)f_1(t), \ldots, a(t)g_n(t) + b(t)f_n(t))$ for $a(t), b(t) \in K$

The complexification of $K^n$ can be thought of as the space $(K(\sqrt{-1}))^n$, which is a vector space over the field $K(\sqrt{-1})$, the quadratic extension of $K$ obtained by adjoining a square root of $-1$.

$\textit{Example (matrix spaces):}$ Let $K$ be a field and let $V = M_{m \times n}(K)$ be the space of $m \times n$ matrices over $K$. Then the complexification of $V$ is $V_\mathbb{C} = M_{m \times n}(K) \times M_{m \times n}(K)$, which can be identified with $M_{m \times n}(K(\sqrt{-1}))$. An element of $V_\mathbb{C}$ is of the form $A + iB$, where $A, B \in M_{m \times n}(K)$ and $i$ is a square root of $-1$. The addition and scalar multiplication operations are defined as follows:

- $(A + iB) + (C + iD) = (A + C) + i(B + D)$
- $(a + bi)(A + iB) = (aA - bB) + i(aB + bA)$ for $a, b \in K$

The complexification of $M_{m \times n}(K)$ can be identified with the space $M_{m \times n}(K(\sqrt{-1}))$, which is a vector space over the field $K(\sqrt{-1})$, the quadratic extension of $K$ obtained by adjoining a square root of $-1$.
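A quick check on the matrix-space example, in the special case $K = \mathbb{R}$ so that NumPy's complex matrices can serve as the reference (my own illustration, not from the chat answer): under the identification $A + iB \mapsto A + \sqrt{-1}\,B$, the pair operations defined above agree with ordinary complex-matrix arithmetic.

```python
# Illustrative sketch: complexification of M_{2x2}(R), stored as pairs (A, B).
import numpy as np

def add(x, y):
    return (x[0] + y[0], x[1] + y[1])

def smul(alpha, x):
    """(a + bi)(A + iB) = (aA - bB) + i(aB + bA)."""
    a, b = alpha.real, alpha.imag
    A, B = x
    return (a * A - b * B, a * B + b * A)

def to_complex(x):
    """Identify the pair (A, B) with the complex matrix A + iB."""
    return x[0] + 1j * x[1]

rng = np.random.default_rng(2)
A, B, C, D = (rng.standard_normal((2, 2)) for _ in range(4))
alpha = 1.5 - 2j

assert np.allclose(to_complex(add((A, B), (C, D))), (A + 1j * B) + (C + 1j * D))
assert np.allclose(to_complex(smul(alpha, (A, B))), alpha * (A + 1j * B))
print("pair operations match complex-matrix arithmetic on these samples")
```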
These examples illustrate that the complexification process is a very general construction that can be applied to vector spaces over many fields. It is particularly useful when one wants to extend the scalars from a base field to a larger field, often to gain access to additional algebraic or analytic tools.

In the case of $p$-adic numbers and function fields, complexification enlarges the scalar field to $K(\sqrt{-1})$, over which more polynomials split, so questions that depend on factoring characteristic polynomials can become easier to settle.

In the case of matrix spaces, complexification is often used in the study of linear transformations and their invariants, such as eigenvalues and eigenvectors (a short numerical illustration follows this section). Many important results in linear algebra, such as the Jordan canonical form and the spectral theorem, rely on the ability to work over an algebraically closed field like the complex numbers.

I hope this expanded answer provides a broader perspective on the concept of complexification and its applications in various fields of mathematics. Let me know if you have any further questions!
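A concrete instance of the eigenvalue remark above (my own illustration): a rotation of $\mathbb{R}^2$ has no real eigenvalues, but the same matrix regarded as an operator on the complexification $\mathbb{C}^2$ has eigenvalues $\pm i$.

```python
# Illustrative example: the 90-degree rotation of R^2 has no real eigenvalues,
# but over the complexification C^2 its eigenvalues are i and -i.
import numpy as np

R = np.array([[0.0, -1.0],
              [1.0,  0.0]])           # rotation by 90 degrees

eigenvalues = np.linalg.eigvals(R)    # NumPy computes these over C
print(eigenvalues)                    # approximately [ 1j, -1j ]
print(np.isreal(eigenvalues))         # [False, False]: no real eigenvalues
```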