Section 4.7 Isomorphisms
Subsection 4.7.1 Bases and linear transformations
Theorem 4.7.1. Bases and linear transformations.
Let \(B\) be a basis for the vector space \(V\text{.}\)
(i) Let \(T\) and \(T'\) be linear transformations from \(V\) to \(W\text{.}\) If \(T(\boldu)=T'(\boldu)\) for all \(\boldu\in B\text{,}\) then \(T=T'\text{.}\)
(ii) Let \(W\) be a vector space. Any mapping
\begin{equation*} \boldu\mapsto \boldw_\boldu \end{equation*}assigning each element of \(\boldu\in B\) to a chosen element \(\boldw_\boldu\in W\) extends uniquely to a linear transformation \(T\colon V\rightarrow W\) satisfying
\begin{equation*} T(\boldu)=\boldw_\boldu \end{equation*}for all \(\boldu\in B\text{.}\) In more detail, given any \(\boldv\in V\text{,}\) if \(\boldv=c_1\boldu_1+c_2\boldu_2+\cdots +c_r\boldu_r\text{,}\) where \(\boldu_i\in B\) and \(c_i\ne 0\) for all \(1\leq i\leq r\text{,}\) then
\begin{equation} T(\boldv)=c_1\boldw_{\boldu_{1}}+c_2\boldw_{\boldu_{2}}+\cdots +c_r\boldw_{\boldu_{r}}\text{.}\label{eq_bases_transformations}\tag{4.7.1} \end{equation}
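The extension procedure in (4.7.1) can be sketched numerically. This is a minimal illustration, not part of the text: the basis vectors and their chosen images below are hypothetical, and NumPy is assumed available. To evaluate \(T(\boldv)\) we first solve for the coordinates of \(\boldv\) relative to the basis, then combine the chosen images with those coordinates.

```python
import numpy as np

# A hypothetical basis of R^2 (as columns) and freely chosen images in R^3.
U = np.array([[1.0,  1.0],
              [1.0, -1.0]])   # columns are u_1, u_2
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 1.0]])    # columns are w_{u_1}, w_{u_2}

def T(v):
    """Extend u_i -> w_{u_i} linearly: write v in the basis, then apply (4.7.1)."""
    c = np.linalg.solve(U, v)  # coordinates c_1, c_2 of v relative to the basis
    return W @ c               # c_1 w_{u_1} + c_2 w_{u_2}

# The resulting T is linear, as the theorem asserts:
v1, v2 = np.array([2.0, 0.0]), np.array([0.0, 3.0])
print(np.allclose(T(3*v1 + 2*v2), 3*T(v1) + 2*T(v2)))  # True
```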
Proof.
Proof of (i).
Assume \(T\) and \(T'\) are linear transformations from \(V\) to \(W\) satisfying \(T(\boldu)=T'(\boldu)\) for all \(\boldu\in B\text{.}\) Given any \(\boldv\in V\) we can write \(\boldv=c_1\boldu_1+c_2\boldu_2+\cdots +c_r\boldu_r\) with \(\boldu_i\in B\text{.}\) It follows that
\begin{equation*} T(\boldv)=c_1T(\boldu_1)+c_2T(\boldu_2)+\cdots +c_rT(\boldu_r)=c_1T'(\boldu_1)+c_2T'(\boldu_2)+\cdots +c_rT'(\boldu_r)=T'(\boldv)\text{.} \end{equation*}
Since \(T(\boldv)=T'(\boldv)\) for all \(\boldv\in V\text{,}\) we have \(T=T'\text{.}\)
Proof of (ii).
That there can be at most one such \(T\colon V\rightarrow W\) follows from (i). Thus we need only show that such a \(T\) exists.
Since any \(\boldv\in V\) has a unique expression of the form
\begin{equation*} \boldv=c_1\boldu_1+c_2\boldu_2+\cdots +c_r\boldu_r\text{,} \end{equation*}
where \(\boldu_i\in B\) and \(c_i\ne 0\) for all \(1\leq i\leq r\text{,}\) the formula in (4.7.1) defines a function \(T\colon V\rightarrow W\) in a well-defined manner. Note also that the formula still applies even if some of the coefficients are equal to 0: if \(c_i=0\text{,}\) then \(c_i\boldw_{\boldu_{i}}=\boldzero\text{,}\) and the right-hand side of (4.7.1) is unchanged. We will use this fact below.
We now show that \(T\) is linear. Given \(\boldv, \boldv'\in V\) we can find a common collection of elements \(\boldu_1,\boldu_2,\dots, \boldu_r\in B\) for which
\begin{equation*} \boldv=c_1\boldu_1+c_2\boldu_2+\cdots +c_r\boldu_r, \quad \boldv'=d_1\boldu_1+d_2\boldu_2+\cdots +d_r\boldu_r \end{equation*}
for some \(c_i, d_i\in \R\text{.}\) We can no longer assume that \(c_i\ne 0\) and \(d_i\ne 0\) for all \(1\leq i\leq r\text{,}\) but as observed above we still have
\begin{equation*} T(\boldv)=c_1\boldw_{\boldu_{1}}+c_2\boldw_{\boldu_{2}}+\cdots +c_r\boldw_{\boldu_{r}}, \quad T(\boldv')=d_1\boldw_{\boldu_{1}}+d_2\boldw_{\boldu_{2}}+\cdots +d_r\boldw_{\boldu_{r}}\text{.} \end{equation*}
Given any \(c, d\in \R\text{,}\) we have
\begin{align*} T(c\boldv+d\boldv') &= T((cc_1+dd_1)\boldu_1+(cc_2+dd_2)\boldu_2+\cdots +(cc_r+dd_r)\boldu_r)\\ &= (cc_1+dd_1)\boldw_{\boldu_{1}}+(cc_2+dd_2)\boldw_{\boldu_{2}}+\cdots +(cc_r+dd_r)\boldw_{\boldu_{r}}\\ &= c(c_1\boldw_{\boldu_{1}}+\cdots +c_r\boldw_{\boldu_{r}})+d(d_1\boldw_{\boldu_{1}}+\cdots +d_r\boldw_{\boldu_{r}})\\ &= cT(\boldv)+dT(\boldv')\text{.} \end{align*}
Thus \(T\) is a linear transformation.
Remark 4.7.2. Transformations determined by behavior on basis.
Let's paraphrase the two results of Theorem 4.7.1.
A linear transformation \(T\colon V\rightarrow W\) is completely determined by its behavior on a basis \(B\subseteq V\text{.}\) Once we know the images \(T(\boldu)\) for all \(\boldu\in B\text{,}\) the image \(T(\boldv)\) for any other \(\boldv\in V\) is then completely determined. Put another way, if two linear transformations out of \(V\) agree on the elements of a basis \(B\subseteq V\text{,}\) then they agree for all elements of \(V\text{.}\)
Once we have a basis \(B\subseteq V\) on hand, it is easy to construct linear transformations \(T\colon V\rightarrow W\text{:}\) simply choose images \(T(\boldu)\in W\) for all \(\boldu\in B\) in any manner you like, and then define \(T(\boldv)\) for any element \(\boldv\in V\) using (4.7.1).
Example 4.7.3. Composition of reflections.
Let \(r_0\colon \R^2\rightarrow\R^2\) be reflection across the \(x\)-axis, and let \(r_{\pi/2}\colon \R^2\rightarrow \R^2\) be reflection across the \(y\)-axis. (See Exercise 4.2.6.13.) Show that their composition \(r_{\pi/2}\circ r_{0}\) is \(\rho_\pi\text{,}\) rotation about the origin by 180 degrees.
Since \(r_0\) and \(r_{\pi/2}\) are both linear transformations (Exercise 4.2.6.13), so is the composition \(T=r_{\pi/2}\circ r_{0}\text{.}\) We wish to show \(T=\rho_\pi\text{.}\) Since \(\rho_{\pi}\) is also a linear transformation, it suffices by Theorem 4.7.1 to show that \(T\) and \(\rho_\pi\) agree on a basis of \(\R^2\text{.}\) Take the standard basis \(B=\{(1,0), (0,1)\}\text{.}\) Compute:
\begin{align*} T(1,0) &= r_{\pi/2}(r_0(1,0)) = r_{\pi/2}(1,0) = (-1,0) = \rho_\pi(1,0)\\ T(0,1) &= r_{\pi/2}(r_0(0,1)) = r_{\pi/2}(0,-1) = (0,-1) = \rho_\pi(0,1)\text{.} \end{align*}
Since \(T\) and \(\rho_\pi\) agree on the basis \(B\text{,}\) we have \(T=\rho_\pi\text{.}\)
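The same conclusion can be checked with the standard matrices of these maps. A quick sketch, assuming the usual matrix forms of the two reflections and of rotation by \(\pi\):

```python
import numpy as np

# Standard matrices (assumed forms): reflection across the x-axis,
# reflection across the y-axis, and rotation by pi about the origin.
r0     = np.array([[1.0,  0.0], [0.0, -1.0]])   # (x, y) -> (x, -y)
rpi2   = np.array([[-1.0, 0.0], [0.0,  1.0]])   # (x, y) -> (-x, y)
rho_pi = np.array([[-1.0, 0.0], [0.0, -1.0]])   # (x, y) -> (-x, -y)

# The composition r_{pi/2} o r_0 corresponds to the matrix product rpi2 @ r0.
print(np.array_equal(rpi2 @ r0, rho_pi))  # True
```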
As a corollary to Theorem 4.7.1 we can at last complete the partial description of linear transformations of the form \(T\colon \R^n\rightarrow \R^m\) given in Theorem 4.2.9.
Corollary 4.7.4. Matrix transformations II.
Given any linear transformation \(T\colon \R^n\rightarrow \R^m\) there is a unique \(m\times n\) matrix \(A\) such that \(T=T_A\text{.}\) In fact we have
\begin{equation*} A=\begin{bmatrix}\vert & \vert & & \vert \\ T(\bolde_1) & T(\bolde_2) & \cdots & T(\bolde_n)\\ \vert & \vert & & \vert \end{bmatrix}\text{,} \end{equation*}
where \(B=\{\bolde_1, \bolde_2, \dots, \bolde_n\}\) is the standard basis of \(\R^n\text{.}\) As a result, in the special case where the domain and codomain are both spaces of tuples, all linear transformations are matrix transformations.
Proof.
Let \(B=\{\bolde_1, \bolde_2, \dots, \bolde_n\}\) be the standard basis of \(\R^n\text{,}\) and let \(A\) be the \(m\times n\) matrix defined as
\begin{equation*} A=\begin{bmatrix}\vert & \vert & & \vert \\ T(\bolde_1) & T(\bolde_2) & \cdots & T(\bolde_n)\\ \vert & \vert & & \vert \end{bmatrix}\text{.} \end{equation*}
In other words, the \(j\)-th column of \(A\) is \(T(\bolde_j)\text{,}\) considered as an \(m\times 1\) column vector. The corresponding matrix transformation \(T_A\colon \R^n\rightarrow \R^m\) is linear by Theorem 4.2.9. Since \(T\) is linear by assumption, Theorem 4.7.1 applies: to show \(T=T_A\) we need only show that \(T(\bolde_j)=T_A(\bolde_j)\) for all \(1\leq j\leq n\text{.}\) We have
\begin{equation*} T_A(\bolde_j)=A\bolde_j=(j\text{-th column of } A)=T(\bolde_j)\text{.} \end{equation*}
Thus \(T=T_A\text{,}\) as claimed.
Besides rounding out our theoretical discussion of linear transformations from \(\R^n\) to \(\R^m\text{,}\) Corollary 4.7.4 provides a computational recipe for producing a matrix formula for a linear transformation \(T\colon \R^n\rightarrow \R^m\text{:}\) it tells us how to build, column by column, the matrix \(A\) such that \(T=T_A\text{.}\) Of course, to use Corollary 4.7.4 we first need to know that the function \(T\) in question is linear.
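This column-by-column recipe translates directly into code. The sketch below is an illustration only: the helper name `matrix_of` and the sample linear map \(T(x,y,z)=(x+2y,\, 3y-z)\) are hypothetical choices, and NumPy is assumed available.

```python
import numpy as np

def matrix_of(T, n):
    """Build the matrix of a linear map T: R^n -> R^m following
    Corollary 4.7.4: the j-th column is T(e_j)."""
    I = np.eye(n)
    return np.column_stack([T(I[:, j]) for j in range(n)])

# A hypothetical linear map T: R^3 -> R^2 used only for illustration.
def T(v):
    x, y, z = v
    return np.array([x + 2*y, 3*y - z])

A = matrix_of(T, 3)              # array([[1., 2., 0.], [0., 3., -1.]])
v = np.array([1.0, 1.0, 1.0])
print(np.allclose(A @ v, T(v)))  # True
```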
Example 4.7.5. Rotation matrices revisited.
Fix an angle \(\alpha\text{.}\) Taking for granted that the rotation operation \(\rho_\alpha\colon\R^2\rightarrow\R^2\) is a linear transformation, re-derive the matrix formula for \(\rho_\alpha\colon \R^2\rightarrow\R^2\text{:}\) i.e., find the matrix \(A\) such that \(\rho_\alpha=T_A\text{.}\)
Let \(B=\{\bolde_1, \bolde_2\}=\{(1,0), (0,1)\}\text{.}\) According to Corollary 4.7.4
\begin{equation*} A=\begin{bmatrix}\vert & \vert \\ \rho_\alpha(\bolde_1) & \rho_\alpha(\bolde_2)\\ \vert & \vert \end{bmatrix}=\begin{bmatrix}\cos\alpha & -\sin\alpha\\ \sin\alpha & \cos\alpha \end{bmatrix}\text{,} \end{equation*}
since \((1,0)\) gets rotated by \(\rho_\alpha\) to \((\cos\alpha, \sin\alpha)\text{,}\) and \((0,1)\) gets rotated to \((-\sin\alpha, \cos\alpha)\text{.}\)
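The derivation can be replayed numerically: build the columns from the images of the standard basis vectors and compare against the closed-form rotation matrix. A minimal sketch, assuming NumPy and an arbitrary sample angle:

```python
import numpy as np

alpha = 0.3  # an arbitrary sample angle

def rho(v):
    """Rotation of R^2 by alpha about the origin."""
    x, y = v
    return np.array([x*np.cos(alpha) - y*np.sin(alpha),
                     x*np.sin(alpha) + y*np.cos(alpha)])

# Columns of A are the images of the standard basis vectors (Corollary 4.7.4).
A = np.column_stack([rho(np.array([1.0, 0.0])), rho(np.array([0.0, 1.0]))])
expected = np.array([[np.cos(alpha), -np.sin(alpha)],
                     [np.sin(alpha),  np.cos(alpha)]])
print(np.allclose(A, expected))  # True
```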
Subsection 4.7.2 Isomorphisms
Definition 4.7.6. Isomorphism.
A linear transformation \(T\colon V\rightarrow W\) is an isomorphism if \(T\) is an invertible function (Definition 1.1.22): equivalently (Theorem 1.1.23), if \(T\) is bijective (Definition 1.1.19).
Vector spaces \(V\) and \(W\) are isomorphic if there is an isomorphism \(T\colon V\rightarrow W\text{.}\)
Remark 4.7.7. Proving \(T\) is an isomorphism.
According to the definition, to prove a function \(T\colon V\rightarrow W\) is an isomorphism, we must show that
(i) \(T\) is linear, and
(ii) \(T\) is invertible.
Since being invertible is equivalent to being bijective, there are two main approaches to proving that (ii) holds for a linear transformation \(T\colon V\rightarrow W\text{:}\)
(a) we can show directly that \(T\) is invertible by providing an inverse \(T^{-1}\colon W\rightarrow V\text{;}\)
(b) we can show that \(T\) is bijective (i.e., injective and surjective).
Which approach, (a) or (b), is more convenient depends on the linear transformation \(T\) in question.
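Approach (a) is easy to illustrate for matrix transformations. In this sketch (the particular matrix is a hypothetical example, and we take for granted that an invertible matrix gives an isomorphism \(T_A\)), the inverse matrix supplies the explicit inverse function:

```python
import numpy as np

# Approach (a): exhibit an explicit inverse.  Take T = T_A for a
# hypothetical invertible matrix A; then T^{-1} = T_{A^{-1}}.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

v = np.array([3.0, -5.0])
# T^{-1} o T and T o T^{-1} are both the identity:
print(np.allclose(A_inv @ (A @ v), v))  # True
print(np.allclose(A @ (A_inv @ v), v))  # True
```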
Remark 4.7.8. Inverse of isomorphism is an isomorphism.
Let \(T\colon V\rightarrow W\) be an isomorphism. Since \(T\) is invertible, there is an inverse function \(T^{-1}\colon W\rightarrow V\text{.}\) Not surprisingly, \(T^{-1}\) is itself a linear transformation, though of course this requires proof. (See [provisional cross-reference: ex_isomorphism_inverse].) Since \(T^{-1}\) is also invertible (\(T\) is its inverse), it follows that \(T^{-1}\) is itself an isomorphism.
We mentioned in the introduction that two isomorphic vector spaces are, for all linear algebraic intents and purposes, essentially the same thing. The next theorem provides some evidence for this claim. It also illustrates how a given isomorphism \(T\colon V\rightarrow W\) can translate back and forth between two isomorphic vector spaces. Recall (Definition 1.1.18) that for a subset \(S\subseteq V\text{,}\) the image \(T(S)\) of \(S\) under \(T\) is the set
\begin{equation*} T(S)=\{T(\boldv)\colon \boldv\in S\}\text{.} \end{equation*}
Theorem 4.7.9. Properties preserved by isomorphisms.
Let \(T\colon V\rightarrow W\) be an isomorphism. The following properties hold:
(i) \(S\subseteq V\) is linearly independent if and only if \(T(S)\subseteq W\) is linearly independent;
(ii) \(S\subseteq V\) spans \(V\) if and only if \(T(S)\subseteq W\) spans \(W\text{;}\)
(iii) \(S\subseteq V\) is a basis of \(V\) if and only if \(T(S)\subseteq W\) is a basis of \(W\text{;}\)
(iv) \(\dim V=\dim W\text{.}\)
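The basis-to-basis property can be observed numerically. A minimal sketch, assuming NumPy and taking for granted that an invertible matrix gives an isomorphism \(T_A\colon \R^3\rightarrow\R^3\) (the particular matrix below is a hypothetical example): the images of the standard basis vectors again form a basis, which for three vectors in \(\R^3\) amounts to their matrix having rank 3.

```python
import numpy as np

# A hypothetical invertible matrix; T_A is then an isomorphism of R^3.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
assert np.linalg.det(A) != 0           # invertible, so T_A is an isomorphism

images = A @ np.eye(3)                 # T_A(e_1), T_A(e_2), T_A(e_3) as columns
# Three vectors in R^3 form a basis exactly when their matrix has rank 3.
print(np.linalg.matrix_rank(images))   # 3
```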