
Section 4.5 Bases and dimension

Now that we have the notions of span and linear independence in place, we simply combine them to define a basis of a vector space. In the spirit of Section 4.4, a basis of a vector space \(V\) should be understood as a minimal spanning set.

This section includes many theoretical results. There are two in particular that are worth highlighting, especially in regard to computational techniques for abstract vector spaces:

  1. If \(B\) is a basis of \(V\) containing exactly \(n\) elements, then any other basis \(B'\) also contains exactly \(n\) elements. (Theorem 4.5.11)

  2. If \(B\) is a basis for \(V\text{,}\) then every element of \(V\) can be written as a linear combination of elements of \(B\) in a unique way. (Theorem 4.5.7)

The first result allows us to define the dimension of a vector space as the number of elements in any given basis. The second result allows us to take any \(n\)-dimensional vector space \(V\) with chosen basis \(B=\{\boldv_1, \boldv_2, \dots, \boldv_n\}\) and effectively identify vectors \(\boldv\in V\) with the sequence \((c_1,c_2,\dots, c_n)\in \R^n\text{,}\) where

\begin{equation*} \boldv=c_1\boldv_1+c_2\boldv_2+\cdots +c_n\boldv_n\text{.} \end{equation*}

This observation has the following consequence: given any \(n\)-dimensional vector space \(V\text{,}\) no matter how exotic, once we choose a basis \(B\) of \(V\text{,}\) we can reduce any and all linear algebraic questions or computations about \(V\) to a corresponding question in \(\R^n\text{.}\) We will elaborate this idea further in Section 6.1.
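For instance (a minimal sketch assuming SymPy is available; the basis \(B\) and polynomial \(p\) below are hypothetical choices for illustration), we can compute the coordinate vector of an element of \(P_2\) relative to a basis by equating coefficients:

import sympy as sp

x = sp.symbols('x')
c = sp.symbols('c1:4')  # the coordinates (c1, c2, c3) to solve for

# A hypothetical basis B of P_2 and a hypothetical target polynomial p
B = [1 + x + x**2, -x + x**2, -1 + x**2]
p = 3 + 2*x - x**2

# Solve c1*B[0] + c2*B[1] + c3*B[2] = p by comparing coefficients of 1, x, x^2
diff = sp.expand(sum(ci*bi for ci, bi in zip(c, B)) - p)
coords = sp.solve([diff.coeff(x, k) for k in range(3)], c)
print(coords)  # a unique solution: the coordinate vector of p relative to B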

Subsection 4.5.1 Bases of vector spaces

Definition 4.5.1. Basis.

A subset \(B\) of a vector space \(V\) is called a basis if

  1. \(B\) spans \(V\text{,}\) and

  2. \(B\) is linearly independent.

If the basis \(B\) comes equipped with an ordering (i.e., \(B\) is an ordered set), then we call \(B\) an ordered basis.

Remark 4.5.2. Some standard bases.

The examples of standard spanning sets in Remark 4.4.6 are easily seen to be linearly independent, and hence are in fact bases. We list them again here, using the same notation, and refer to these as standard bases for the given spaces.

  • Zero space.

    Let \(V=\{\boldzero\}\text{.}\) The empty set \(B=\emptyset=\{ \}\) is a basis for \(V\text{.}\) Note that \(B=\emptyset\) spans \(V\) by definition (Definition 4.4.1), and it satisfies the defining implication of linear independence (Definition 4.4.8) trivially.

  • Tuples.

    Let \(V=\R^n\text{.}\) The set \(B=\{\bolde_1, \bolde_2,\dots, \bolde_n\}\) is the standard basis of \(\R^n\text{.}\)

  • Matrices.

    Let \(V=M_{mn}\text{.}\) The set \(B=\{E_{ij}\colon 1\leq i\leq m, 1\leq j\leq n\}\) is the standard basis of \(M_{mn}\text{.}\)

  • Polynomials of bounded degree.

    Let \(V=P_n\text{.}\) The set \(B=\{x^n, x^{n-1}, \dots, x, 1\}\) is the standard basis of \(P_n\text{.}\)

  • Polynomials.

    Let \(V=P\text{,}\) the space of all polynomials. The set

    \begin{equation*} B=\{1, x, x^2, \dots\}=\{x^i\colon i\geq 0\} \end{equation*}

    is the standard basis of \(P\text{.}\)

Just as with spanning sets, bases are not in general unique: in fact, for any nonzero vector space there are infinitely many different bases.

Example 4.5.3.

For each \(V\) and \(B\) below, verify that \(B\) is a basis of \(V\text{.}\)

  1. \(V=\R^4\text{,}\) \(B=\{(1,1,1,1), (1,0,0,-1), (1,0,-1,0), (0,1,-1,0)\}\text{.}\)

  2. \(V=P_2\text{,}\) \(B=\{1+x+x^2, -x+x^2, -1+x^2\}\text{.}\)

  3. \(V=M_{22}\text{,}\)

    \begin{equation*} B=\left\{ \begin{bmatrix}3\amp 6\\ 3\amp -6 \end{bmatrix} , \begin{bmatrix}0\amp -1\\ -1\amp 0 \end{bmatrix} , \begin{bmatrix}0\amp -8\\ -12\amp -4 \end{bmatrix} , \begin{bmatrix}1\amp 0\\ -1\amp 2 \end{bmatrix} \right\}\text{.} \end{equation*}
Solution.

Each verification amounts to showing that the given \(B\) spans the given \(V\) and is linearly independent. We leave the details to you, using the techniques from Section 4.4.
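For part 1, for example, the check can be carried out computationally as follows (a minimal sketch assuming NumPy; a numerical shortcut rather than the row reduction techniques of Section 4.4). Place the vectors of \(B\) in the columns of a matrix: since \(B\) contains four vectors, \(B\) is a basis of \(\R^4\) exactly when that matrix is invertible, i.e., has nonzero determinant.

import numpy as np

# Columns are the four vectors of B from part 1
A = np.array([[1,  1,  1,  0],
              [1,  0,  0,  1],
              [1,  0, -1, -1],
              [1, -1,  0,  0]], dtype=float)

# Nonzero determinant <=> columns linearly independent <=> B is a basis of R^4
print(np.linalg.det(A))  # -4.0 (up to floating-point error)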

Not every vector space has a finite basis, as we show in the next example.

Example 4.5.4.

Prove that \(P\text{,}\) the space of all real polynomials, does not have a finite basis.

Solution.

We show that no finite set of polynomials can span all of \(P\text{;}\) it follows that \(P\) does not have a finite basis.

Indeed let \(S=\{p_1, p_2, \dots, p_r\}\) be a finite set of polynomials, and let \(n\) be the maximal degree of all the polynomials \(p_i\in S\text{.}\) Then \(p_i\in P_n\) for all \(i\text{,}\) in which case \(\Span S\subseteq P_n\text{:}\) i.e., \(\Span S\) is a subspace of the space of polynomials of degree at most \(n\text{.}\) Since \(P_n\subsetneq P\text{,}\) we conclude that \(\Span S\ne P\text{,}\) as claimed.

Remark 4.5.5.

By Theorem 4.5.18 and its corollaries, we know that if \(V\) has a finite basis, then any subspace of \(V\) also has a finite basis. Let \(X\subseteq \R\) be an interval. Since

\begin{equation*} P\subseteq C^\infty(X)\subseteq C^n(X)\subseteq C(X)\subseteq F(X,\R) \end{equation*}

is a chain of subspaces, and since \(P\) does not have a finite basis, we conclude that none of these other function spaces has a finite basis.

Example 4.5.6.

Let \(V=\R_{>0}\text{,}\) and let \(S=\{\boldv\}\text{,}\) where \(\boldv=a\) is any positive real number. Prove: \(S\) is a basis if and only if \(a\ne 1\text{.}\)

Solution.
\((a=1\implies S \text{ not a basis})\).

Suppose \(a=1\text{.}\) Since \(1=\boldzero\in V\text{,}\) we have \(S=\{\boldzero\}\text{.}\) Any set containing the zero vector is linearly dependent (Remark 4.4.9). Thus \(S\) is not a basis.

\((a\ne 1 \implies S \text{ is a basis})\).

Since \(S=\{\boldv\}\) consists of one nonzero element, it is linearly independent (Remark 4.4.9). It remains only to show that \(S\) spans \(\R_{>0}\text{,}\) which amounts to showing that every \(b\in \R_{>0}\) is a scalar multiple of \(\boldv=a\text{.}\) Since by definition scalar multiplication in \(\R_{>0}\) is defined as \(c\boldv=a^c\text{,}\) this is equivalent to showing that every \(b\in \R_{>0}\) can be written in the form \(b=a^c\text{.}\) This fact is a familiar result from calculus, where you learn that the range (or image) of any exponential function \(f(x)=a^x\) is the set of all positive real numbers.
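The last step is easy to check numerically (a small sketch; the values of \(a\) and \(b\) are hypothetical). In \(\R_{>0}\) the scalar multiple \(c\boldv\) is \(a^c\text{,}\) and \(c=\ln b/\ln a\) exhibits \(b\) as a scalar multiple of \(a\text{:}\)

import math

# In V = R_{>0}, "scalar multiplication" is c.v = v**c. With v = a != 1,
# the scalar c = ln(b)/ln(a) solves a**c = b. (Sample values are hypothetical.)
a, b = 2.0, 5.0
c = math.log(b) / math.log(a)
print(a**c)  # ~5.0, so b lies in Span{a}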

Proceeding directly from the definition, to show a set \(B\) is a basis of \(V\) we have to do two steps: (i) show \(\Span B= V\text{;}\) (ii) show that \(B\) is linearly independent. The following theorem gives rise to a one-step technique for proving \(B\) is a basis: show that every element of \(V\) can be written as a linear combination of elements of \(B\) in a unique way.

Theorem 4.5.7.

Let \(B\) be a subset of the vector space \(V\text{.}\) Then \(B\) is a basis of \(V\) if and only if for each \(\boldv\in V\)

  1. we can write \(\boldv=c_1\boldv_1+c_2\boldv_2+\cdots +c_r\boldv_r\) for some distinct \(\boldv_1, \boldv_2, \dots, \boldv_r\in B\text{,}\) and

  2. this expression is unique: i.e., if \(\boldv=d_1\boldv_1+d_2\boldv_2+\cdots +d_r\boldv_r\text{,}\) then \(c_i=d_i\) for all \(1\leq i\leq r\text{.}\)

Proof.

Since clearly (1) and (2) imply that \(B\) spans \(V\text{,}\) it is enough to show that \(B\) is linearly independent if and only if whenever

\begin{equation*} c_1\boldv_1+c_2\boldv_2+\cdots +c_r\boldv_r=d_1\boldv_1+d_2\boldv_2+\cdots +d_r\boldv_r\text{,} \end{equation*}

where the \(\boldv_i\in B\) are distinct (i.e., \(\boldv_i\ne \boldv_j\) for \(i\ne j\)), we have \(c_i=d_i\) for all \(1\leq i\leq r\text{.}\) We prove each implication separately.

Assume \(B\) is linearly independent. Then

\begin{align*} c_1\boldv_1+c_2\boldv_2+\cdots +c_r\boldv_r=d_1\boldv_1+d_2\boldv_2+\cdots +d_r\boldv_r \amp \implies c_1\boldv_1+c_2\boldv_2+\cdots +c_r\boldv_r-(d_1\boldv_1+d_2\boldv_2+\cdots +d_r\boldv_r)=\boldzero \\ \amp\implies (c_1-d_1)\boldv_1+(c_2-d_2)\boldv_2+\cdots +(c_r-d_r)\boldv_r=\boldzero \\ \amp \implies c_i-d_i=0 \text{ for all } i \amp (B \text{ lin. independent}, \boldv_i\ne \boldv_j) \\ \amp \implies c_i=d_i \text{ for all } i \text{.} \end{align*}

Going the other way, assume linear combinations of \(B\) are unique in the sense described. Then

\begin{align*} c_1\boldv_1+c_2\boldv_2+\cdots +c_r\boldv_r=\boldzero \amp \implies c_1\boldv_1+c_2\boldv_2+\cdots +c_r\boldv_r=0\boldv_1+0\boldv_2+\cdots +0\boldv_r\\ \amp \implies c_1=0, c_2=0, \dots, c_r=0 \amp \text{(by uniqueness condition)}\text{.} \end{align*}

We conclude \(B\) is linearly independent in this case, as desired.

Example 4.5.8.

Let \(V=P_1\text{.}\) Use the one-step technique to decide whether the given \(S\) is a basis of \(V\text{.}\)

  1. \(\displaystyle S=\{p(x)=2x+1,\ q(x)=x+1\}\)

  2. \(\displaystyle S=\{p(x)=x+1, q(x)=x-1, r(x)=7x+4\}\)

Solution.
  1. Take an arbitrary element \(ax+b\in P_1\) and consider the polynomial equation

    \begin{equation*} c_1p(x)+c_2q(x)=ax+b \end{equation*}

    The usual remark about polynomial equality implies that this is equivalent to the matrix equation

    \begin{equation*} \begin{bmatrix} 2\amp 1\\ 1\amp 1 \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} = \begin{bmatrix} a\\ b \end{bmatrix} =\boldy\text{.} \end{equation*}

    The matrix on the left is invertible, allowing us to solve

    \begin{equation*} \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} =\begin{bmatrix} 1\amp -1\\ -1\amp 2 \end{bmatrix} \begin{bmatrix} a\\ b \end{bmatrix} =\begin{bmatrix} a-b\\ -a+2b \end{bmatrix}\text{.} \end{equation*}

    We conclude that any \(ax+b\in P_1\) can be written as \(c_1p+c_2q\) in a unique way: namely, with \(c_1=a-b\) and \(c_2=-a+2b\text{.}\) Thus \(S\) is a basis.

  2. Here uniqueness fails. We have \(r(x)=7x+4=\frac{11}{2}p(x)+\frac{3}{2}q(x)\text{,}\) so the element \(r\in V\) can be written as a linear combination of elements of \(S\) in two different ways: \(r=1\, r\) and \(r=\frac{11}{2}p+\frac{3}{2}q\text{.}\) Thus \(S\) is not a basis.
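As a quick computational cross-check of the inverse used in part 1 (a minimal sketch assuming NumPy is available; the matrix is the coefficient matrix displayed above):

import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 1.0]])  # coefficient matrix from part 1

print(np.linalg.inv(M))  # [[ 1. -1.], [-1.  2.]], matching the solution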

Subsection 4.5.2 Dimension of a vector space

The next few results are all in the service of establishing a notion of dimension for a vector space \(V\text{,}\) which we define to be the number of elements in a basis for \(V\text{.}\) For this to be a well-defined quantity, we need to show that any two bases for a vector space have the same number of elements.

Definition 4.5.9. Cardinality of a set.

Let \(X\) be a set. The cardinality of \(X\), denoted \(\val{X}\text{,}\) is defined as follows:

  • If \(X\) is finite, then its cardinality is the number \(n\) of distinct elements it contains, written \(\val{X}=n\text{.}\)

  • If \(X\) is infinite, then we say it has infinite cardinality, and write \(\val{X}=\infty\text{.}\)

The following theorem is the crucial result needed to show that any two finite bases of a vector space have the same cardinality.

Theorem 4.5.10.

Suppose the vector space \(V\) is spanned by a set of \(n\) elements. Then any subset of \(V\) containing more than \(n\) elements is linearly dependent.

Proof.

Let \(S=\{\boldv_1, \boldv_2, \dots, \boldv_n\}\) be a spanning set of \(V\text{,}\) and let \(S'=\{\boldw_1, \boldw_2, \dots, \boldw_r\}\) be a subset of \(V\) with \(r>n\text{.}\) Since \(S\) spans \(V\text{,}\) each element of \(S'\) is a linear combination of elements of \(S\text{:}\) i.e., we have

\begin{equation} \boldw_j=a_{1j}\boldv_1+a_{2j}\boldv_2+\cdots +a_{nj}\boldv_n=\sum_{i=1}^na_{ij}\boldv_i\label{eq_basis_bound}\tag{4.5.1} \end{equation}

for all \(1\leq j\leq r\text{.}\) Now consider the following chain of equivalences:

\begin{align*} c_1\boldw_1+c_2\boldw_2+\cdots +c_r\boldw_r=\boldzero \amp \iff c_1\left(\sum_{i=1}^na_{i1}\boldv_i\right)+c_2\left(\sum_{i=1}^na_{i2}\boldv_i\right)+\cdots +c_r\left(\sum_{i=1}^na_{ir}\boldv_i\right)=\boldzero\amp \text{(by (4.5.1))} \\ \amp \iff \sum_{j=1}^rc_j\sum_{i=1}^na_{ij}\boldv_i=\boldzero\\ \amp\iff \sum_{i=1}^n\left(\sum_{j=1}^ra_{ij}c_j\right)\boldv_i=\boldzero \\ \amp \iff \left(\sum_{j=1}^ra_{1j}c_j\right)\boldv_1+\left(\sum_{j=1}^ra_{2j}c_j\right)\boldv_2+\cdots +\left(\sum_{j=1}^ra_{nj}c_j\right)\boldv_n=\boldzero \text{.} \end{align*}

From the last vector equation, we see that if we can find a nonzero sequence \((c_1,c_2,\dots, c_r)\) satisfying

\begin{equation*} \sum_{j=1}^ra_{ij}c_j=0 \end{equation*}

for all \(1\leq i\leq n\text{,}\) then there is a nontrivial combination of the \(\boldw_i\) equal to the zero vector, and hence \(S'\) is linearly dependent. Such a sequence \((c_1,c_2,\dots, c_r)\) corresponds to a solution of the homogeneous linear system with augmented matrix

\begin{equation*} \begin{amatrix}[r|r] A\amp \boldzero \end{amatrix}\text{,} \end{equation*}

where \(A=[a_{ij}]_{n\times r}\text{.}\) Since this is a homogeneous system of \(n\) equations in \(r\) unknowns, and since \(r>n\text{,}\) there are in fact infinitely many solutions. (The system has at most \(n\) leading ones, and so there must be a free variable, since one of the \(r\) columns of the equivalent row echelon matrix must fail to contain a leading one.) In particular there is a nonzero solution \((c_1,c_2,\dots, c_r)\ne \boldzero\text{,}\) and we conclude that \(S'\) is linearly dependent.
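The pigeonhole step at the end of this proof is easy to see in action (a minimal sketch assuming SymPy; the matrix entries are hypothetical): any homogeneous system with more unknowns than equations has a nonzero solution.

import sympy as sp

# A hypothetical homogeneous system A c = 0 with n = 2 equations, r = 3 unknowns
A = sp.Matrix([[1, 2, 3],
               [4, 5, 6]])

null_basis = A.nullspace()  # a basis of the solution space of A c = 0
print(null_basis[0])        # a nonzero solution (c1, c2, c3); nonempty since r > n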

The next theorem not only ensures that our definition of dimension (Definition 4.5.12) is well-defined, it also characterizes dimension as the minimal cardinality of a spanning set, and the maximal cardinality of a linearly independent set.

Theorem 4.5.11.

Suppose the vector space \(V\) has a basis \(B\) with \(\val{B}=n\text{.}\)

  1. If \(S\subseteq V\) spans \(V\text{,}\) then \(\val{S}\geq n\text{.}\)

  2. If \(S\subseteq V\) is linearly independent, then \(\val{S}\leq n\text{.}\)

  3. If \(B'\subseteq V\) is a basis, then \(\val{B'}=n\text{.}\)

Proof.

  1. Suppose by contradiction that \(\Span S=V\) and \(\val{S} < n\text{.}\) Then, since \(B\) contains more than \(\val{S}\) elements, Theorem 4.5.10 would imply \(B\) is linearly dependent. Since this is a contradiction, we conclude that \(\val{S}\geq n\text{.}\)

  2. This also follows from Theorem 4.5.10: since \(B\) is a spanning set of \(V\text{,}\) any set containing more than \(n=\val{B}\) elements must be linearly dependent.

  3. This follows from (1) and (2): if \(B'\) is a basis, then since \(B'\) spans, we have \(\val{B'}\geq n\) by (1); and since \(B'\) is linearly independent, we have \(\val{B'}\leq n\) by (2). We conclude that \(\val{B'}=n\text{.}\)

Definition 4.5.12. Dimension of a vector space.

Let \(V\) be a vector space. The dimension of \(V\), denoted \(\dim V\text{,}\) is defined as follows:

  • If \(V\) has a finite basis \(B\text{,}\) then the dimension of \(V\) is the number of elements of \(B\text{:}\) i.e., \(\dim V=\val{B}\text{.}\)

  • If \(V\) does not have a finite basis, then we say \(V\) is infinite dimensional, and write \(\dim V=\infty\text{.}\)

Remark 4.5.13.

Wouldn't it have been more natural to simply say \(V\) is infinite dimensional if it has an infinite basis? As it turns out this is indeed equivalent to not having a finite basis, but to prove this we need to establish the general fact that we can always find a basis for a given vector space \(V\text{.}\) As intuitive as that claim may sound (i.e., that bases always exist), its proof requires some set theory methods that are not covered in this text. As such we will not include it in our treatment of dimension, and so will have to make do with our slightly awkward definition of infinite-dimensional vector spaces.

Remark 4.5.14. Computing dimension.

By definition, to show a vector space \(V\) has dimension \(n\text{,}\) we must exhibit a basis \(B\) with \(\val{B}=n\text{.}\) Similarly, to show \(V\) is infinite dimensional, we must show that it does not have a finite basis: equivalently, if you believe Remark 4.5.13, we must exhibit an infinite basis of \(V\text{.}\)

Consider our list of familiar vector spaces, for example. Since each vector space on this list comes equipped with a standard basis, we compute its dimension simply by counting the elements in the given standard basis. For each \(V\) below we provide its standard basis \(B\) and compute \(\dim V=\val{B}\text{.}\)

  • Zero space.

    \(V=\{\boldzero\}\text{,}\) \(B=\emptyset=\{ \}\text{,}\) \(\dim V=0\)

  • Tuples.

    \(V=\R^n\text{,}\) \(B=\{\bolde_1, \bolde_2,\dots, \bolde_n\}\text{,}\) \(\dim V=\val{B}=n\)

  • Matrices.

    \(V=M_{mn}\text{,}\) \(B=\{E_{ij}\colon 1\leq i\leq m, 1\leq j\leq n\}\text{,}\) \(\dim V=\val{B}=m\, n\)

  • Polynomials of bounded degree.

    \(V=P_n\text{,}\) \(B=\{x^n, x^{n-1}, \dots, x, 1\}\text{,}\) \(\dim V=\val{B}=n+1\)

  • Polynomials.

    \(V=P\text{,}\) \(B=\{1,x, x^2, \dots\}\text{,}\) \(\dim V=\val{B}=\infty\)

The “contracting and expanding” theorem below is a very useful theoretical consequence of Theorem 4.5.11. It allows us to construct a customized basis from a given set \(S\text{.}\) This method is used prominently in the proof of the rank-nullity theorem.

Theorem 4.5.15. Contracting and expanding.

Let \(S\) be a finite subset of a vector space \(V\) with \(\dim V=n\text{.}\)

  1. If \(S\) spans \(V\text{,}\) then \(S\) can be contracted to a basis of \(V\text{:}\) i.e., there is a subset \(S'\subseteq S\) that is a basis of \(V\text{.}\)

  2. If \(S\) is linearly independent, then \(S\) can be extended to a basis of \(V\text{:}\) i.e., there is a basis \(B\) of \(V\) with \(S\subseteq B\text{.}\)

Proof.

Let \(S=\{\boldv_1, \boldv_2, \dots , \boldv_r\}\text{.}\)

  1. Assume \(\Span S=V\text{.}\) Let \(S'\) be a subset of \(S\) of minimal cardinality such that \(\Span S'\) is still equal to \(V\text{.}\) Such a set is guaranteed to exist: since \(S\) is finite, it has a finite number of subsets; of these subsets some will span, others will not; of the subsets that do span, there will be one (or more) that has the least number of elements.

    We claim that such a spanning subset \(S'\) of minimal cardinality is linearly independent, and hence is a basis for \(V\text{,}\) as desired. We give a proof by contradiction. Suppose \(S'\) is linearly dependent. Then some element of \(S'\text{,}\) call it \(\boldv_0\text{,}\) can be expressed as a linear combination of the other elements (Remark 4.4.9). In terms of span, then, the element \(\boldv_0\) is “redundant”. In other words, if we let \(S''=S'-\{\boldv_0\}\text{,}\) the set obtained by “throwing out” \(\boldv_0\text{,}\) then we have \(\Span S''=\Span S'=V\text{.}\) Since \(\val{S''} < \val{S'}\text{,}\) this contradicts our choice of \(S'\) as a spanning set of minimal cardinality. We conclude that \(S'\) is linearly independent, completing the proof.

  2. Assume \(S\) is linearly independent. The procedure below constructs a finite chain of sets

\begin{equation*} S=S_0\subseteq S_1\subseteq \cdots \subseteq S_k \end{equation*}

    that ends with a basis \(B=S_k\text{.}\)

    • Initialization.

      Set \(S_0=S\text{.}\)

    • Expansion loop.

      If \(\Span S_i=V\text{,}\) return \(B=S_i\text{.}\) Otherwise set \(S_{i+1}=S_i\cup \{\boldv\}\text{,}\) where \(\boldv\) is any element of \(V\) not contained in \(\Span S_i\text{,}\) and repeat.

    Some observations:

    1. Each \(S_i\) is linearly independent. This can be shown by induction, the crucial point being that if \(S_i\) is linearly independent, and if \(\boldv\notin \Span S_i\text{,}\) then \(S_{i+1}=S_i\cup \{\boldv\}\) is linearly independent. The proof is left as an exercise.

    2. The procedure must halt. Why? Since \(\dim V=n\text{,}\) and since each \(S_i\) is linearly independent, we must have \(\val{S_i}\leq n\) for all \(i\) by Theorem 4.5.11. Since \(\val{S_i}< \val{S_{i+1}}\) and \(\val{S_{i}}\leq n\text{,}\) the procedure must halt in at most \(n\) steps.

    From (ii) we may assume the procedure halts at \(S_k\) for some \(k\geq 0\text{.}\) From (i) we know that \(S_k\) is linearly independent. Furthermore, since the procedure halts at \(S_k\text{,}\) we know that \(\Span S_k=V\text{.}\) It follows that \(B=S_k\) is a basis containing \(S\text{,}\) as desired.
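In \(\R^n\) the expansion loop can be carried out concretely with rank computations (a minimal sketch assuming NumPy, drawing candidate vectors from the standard basis, which is one convenient choice among many):

import numpy as np

def extend_to_basis(S, n):
    """Extend a linearly independent list S of vectors in R^n to a basis,
    mirroring the expansion loop: while Span S_i != R^n, append a vector
    lying outside Span S_i (standard basis vectors suffice as candidates)."""
    basis = [np.asarray(v, dtype=float) for v in S]
    for j in range(n):
        e = np.zeros(n)
        e[j] = 1.0
        # e lies outside Span(basis) exactly when appending it raises the rank
        if np.linalg.matrix_rank(np.column_stack(basis + [e])) > len(basis):
            basis.append(e)
        if len(basis) == n:  # Span(basis) = R^n, so the loop halts
            break
    return basis

print(extend_to_basis([(1, 1, 0)], 3))  # [(1,1,0), (1,0,0), (0,0,1)]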

The following corollary follows from Theorem 4.5.11 and Theorem 4.5.15. We call it a “street smarts” result as the first two statements give us a quick and dirty way of dashing a set's hopes of being a basis. The third statement asserts that when a finite set's cardinality matches the dimension of a space, then to prove it is a basis it suffices to prove either one of the two defining properties of Definition 4.5.1.

Corollary 4.5.16.

Suppose \(\dim V=n\text{,}\) and let \(S\) be a finite subset of \(V\text{.}\)

  1. If \(\val{S} < n\text{,}\) then \(S\) does not span \(V\text{;}\) in particular, \(S\) is not a basis.

  2. If \(\val{S} > n\text{,}\) then \(S\) is linearly dependent; in particular, \(S\) is not a basis.

  3. If \(\val{S}=n\text{,}\) then \(S\) is a basis of \(V\) if and only if \(S\) spans \(V\text{,}\) if and only if \(S\) is linearly independent.

Proof.

Statements (1) and (2) follow directly from Theorem 4.5.11. Statement (3) is an easy consequence of Theorem 4.5.15. For example, if \(S\) spans \(V\text{,}\) then there is a subset \(S'\) of \(S\) that is a basis of \(V\text{;}\) since all bases of \(V\) have \(n\) elements, and since \(\val{S}=n\text{,}\) we must have \(S'=S\text{;}\) thus \(S\) is a basis. The proof for a linearly independent set \(S\) is similar, and is left to the reader.

We are often in a situation where we wish to show a given subspace of a vector space is in fact the entire space. For example, when deciding whether a set \(S\) spans a vector space \(V\text{,}\) we are asking whether \(W=\Span S\) is all of \(V\text{.}\) As another example, given a linear transformation \(T\colon V\rightarrow W\text{,}\) one of the first things we wish to determine is whether the subspace \(\im T\) is in fact all of \(W\text{.}\) As the next result illustrates, dimension is a very useful tool in this regard.

Corollary 4.5.17.

Let \(W\) be a subspace of the vector space \(V\text{.}\)

  1. We have \(\dim W\leq \dim V\text{.}\)

  2. If \(V\) is finite dimensional, then \(W=V\) if and only if \(\dim W=\dim V\text{.}\)

Proof.

  1. Firstly, if \(\dim V=\infty\text{,}\) then clearly \(\dim W\leq \dim V\) for any subspace \(W\subseteq V\text{.}\)

    Now assume \(\dim V=n\text{.}\) Apply the “extending to a basis” procedure described in the proof of Theorem 4.5.15 to the empty set \(S=\{\}\text{,}\) which is linearly independent, considered as a subset of \(W\text{:}\) i.e., at each stage, if the current set \(S_i\) is not a basis of \(W\text{,}\) add any element \(\boldb\notin \Span S_i\text{.}\) Since \(W\subseteq V\text{,}\) and since \(\dim V=n\text{,}\) the linearly independent sets \(S_i\) cannot have more than \(n\) elements; thus the procedure must halt with a basis \(B\) of \(W\) satisfying \(\val{B}\leq n\text{.}\) We conclude that \(\dim W\leq \dim V\text{.}\)

  2. Clearly, if \(W=V\text{,}\) then \(\dim W=\dim V\text{.}\) For the other direction, assume \(\dim W=\dim V=n\text{.}\) Let \(B\) be a basis for \(W\text{.}\) Since \(B\) is linearly independent, it can be extended to a basis \(B'\) of \(V\) (Theorem 4.5.15). Since \(\val{B}=\dim W=\dim V\text{,}\) and since all bases of \(V\) have cardinality \(n\text{,}\) we must have \(B=B'\text{.}\) It follows that \(B\) is also a basis of \(V\text{,}\) and hence that

    \begin{equation*} V=\Span B=W\text{.} \end{equation*}
That was quite a dose of theory! For your convenience, we collect the various results on dimension, together with their nicknames, in one omnibus theorem.

Exercises 4.5.3 Exercises

Exercise Group.

For each vector space \(V\) and subset \(S\text{,}\) use the one-step technique (Example 4.5.8) to decide whether \(S\) is a basis for \(V\text{.}\)

1.

\(V=\R^3\)

  1. \(\displaystyle S=\{(1,1,2),(1,-1,-4), (3,-1,0) \}\)

  2. \(\displaystyle S=\{(1,1,2),(1,-1,-4), (1,-3,1) \}\)

2.

\(V=P_2\)

  1. \(\displaystyle S=\{2x^2+x+1, x^2-2x-1, x^2+1 \}\)

  2. \(\displaystyle S=\{2x^2-3x+1, 4x^2+x+1, 7x-1\}\)

3.

\(V=M_{22}\)

  1. \(S=\{A_1, A_2, A_3, A_4 \}\text{,}\) where

    \begin{equation*} A_1=\begin{bmatrix}1\amp 1\\ 1\amp 1 \end{bmatrix}, \ A_2=\begin{bmatrix}1\amp -1\\ 0\amp 0 \end{bmatrix}, \ A_3=\begin{bmatrix}0\amp -1\\ 1\amp 0 \end{bmatrix}, \ A_4=\begin{bmatrix}1\amp 0\\ 0\amp 0 \end{bmatrix} \end{equation*}
  2. \(S=\{A_1, A_2, A_3, A_4 \}\text{,}\) where

    \begin{equation*} A_1=\begin{bmatrix}1\amp 1\\ 1\amp 1 \end{bmatrix}, \ A_2=\begin{bmatrix}1\amp -1\\ -1\amp 0 \end{bmatrix}, \ A_3=\begin{bmatrix}0\amp 1\\ 1\amp 0 \end{bmatrix}, \ A_4=\begin{bmatrix}1\amp 0\\ 0\amp 0 \end{bmatrix} \end{equation*}

Exercise Group.

For each vector space \(V\) and subset \(S\text{,}\) use an appropriate statement from Corollary 4.5.16 to help decide whether \(S\) is a basis for \(V\text{.}\) Justify your answer.

4.

\(V=P_3\text{,}\) \(S=\{x^3-x^2+x-1, x^3+\pi\, x-\sqrt{2}, x^3+1\}\)

5.

\(V=\R^4\text{,}\) \(S=\{(1,1,1,1), (0,1,1,1), (0,0,1,1), (0,0,0,1) \}\)

6.

\(V=\R^3\text{,}\) \(S=\{(-217, 3e^2, 3), (1,1,1), (-\log 81, 15, 17), (2,-\sqrt{17}, \sqrt{19})\}\)

7.

Let \(W\subseteq \R^3\) be the set of solutions to the following homogeneous system:

\begin{align*} x+y+z\amp =\amp 0\\ 3x+2y-2z\amp =\amp 0\\ 4x+3y-z \amp =\amp 0\\ 6x+5y+z \amp =\amp 0\text{.} \end{align*}
  1. Compute a basis of \(W\text{.}\) Justify your answer.

  2. Compute \(\dim W\text{.}\)

8.

Compute bases and dimensions for the following subspaces of \(\R^4\text{.}\)

  1. \(\displaystyle W=\{(a,b,c,d)\in\R^4\colon d=a+b, c=a-b\}\)

  2. \(\displaystyle W=\{(a,b,c,d)\in \R^4\colon a+b=c+d\}\)

9.

Suppose \(B=\{\boldv_1,\boldv_2,\boldv_3\}\) is a basis for the vector space \(V\text{.}\) Let \(B'=\{\boldu_1,\boldu_2,\boldu_3\}\text{,}\) where

\begin{equation*} \boldu_1 = \boldv_1, \boldu_2 = \boldv_1 + \boldv_2, \boldu_3 = \boldv_1 +\boldv_2 + \boldv_3\text{.} \end{equation*}

Prove that \(B'\) is a basis.

Hint.

First explain why it is enough to show that \(B'\) is linearly independent.

10.

In multivariable calculus, a plane in \(\R^3\) is defined as the set of solutions to an equation of the form \(ax+by+cz=d\text{,}\) where at least one of \(a, b, c\) is nonzero. In particular, a plane passing through the origin \((0,0,0)\) is the set of solutions to an equation of the form \(ax+by+cz=0\text{,}\) where at least one of \(a, b, c\) is nonzero.

Prove that the 2-dimensional subspaces of \(\R^3\) are precisely the planes of \(\R^3\) that pass through the origin. In other words, show (a) that any plane \(\mathcal{P}\subseteq\R^3\) passing through the origin is a 2-dimensional subspace, and conversely, (b) that any 2-dimensional subspace \(W\subseteq \R^3\) is a plane passing through the origin.

Hint.

For (b), begin with a basis \(B=\{(a,b,c), (d,e,f)\}\) of \(W\text{,}\) and use the cross product to find a normal vector \(\boldn\) that defines \(W\) as a plane.

11.

Let \(V=M_{nn}\text{.}\) For each of the following subspaces \(W\subset M_{nn}\text{,}\) give a basis \(B\) of \(W\) and compute \(\dim(W)\text{.}\) You must explicitly describe the elements of your basis as linear combinations of the elements \(E_{ij}\) of the standard basis for \(M_{nn}\text{.}\) No justification needed, as long as your proposed basis is simple enough.

  1. \(\displaystyle W=\{A\in M_{nn}\colon A \text{ is upper triangular}\}\)

  2. \(W=\{A\in M_{nn}\colon A^T=A\}\) (symmetric matrices)

  3. \(W=\{A\in M_{nn}\colon A^T=-A\}\) (skew-symmetric matrices)

Hint.

It might help to look at the \(n=2\) and \(n=3\) cases to get an idea of what these bases should be.

12.

Let \(V=M_{33}\text{,}\) \(W=\{A\in M_{33}\colon A^T=-A\}\text{,}\) and \(W'=\Span\{ A_1, A_2, A_3\}\text{,}\) where

\begin{equation*} A_1=\begin{bmatrix}0\amp 1\amp 2\\ -1\amp 0\amp 1\\ -2\amp -1\amp 0 \end{bmatrix} , \ A_2=\begin{bmatrix}0\amp 1\amp -1\\ -1\amp 0\amp 1\\ 1\amp -1\amp 0 \end{bmatrix} ,\ A_3=\begin{bmatrix}0\amp 1\amp 0\\ -1\amp 0\amp -1\\ 0\amp 1\amp 0 \end{bmatrix} \end{equation*}

Show that \(W'=W\) as follows:

  1. Show that \(W'\subseteq W\text{.}\)

  2. Compute the dimensions of \(W'\) and \(W\) and use Corollary 4.5.17.

13.

Let \(V=M_{23}\) and define

\begin{equation*} W=\{A\in M_{23}\colon \text{ all rows and columns of \(A\) sum to 0 } \}\text{.} \end{equation*}

Find a basis for \(W\) by inspection and compute its dimension.

14.

Let \(V=C(\R)\text{,}\) and let \(W=\Span S\text{,}\) where

\begin{equation*} S=\{f_1(x)=\cos(2x), f_2(x)=\sin(2x), f_3(x)=\sin(x)\cos(x), f_4(x)=\cos^2(x), f_5(x)=\sin^2(x) \}\text{.} \end{equation*}

Provide a basis for \(W\) and compute \(\dim W\text{.}\)

Hint.

You can contract \(S\) to a basis by throwing out some redundant elements.

15.

Let \(A\in M_{nn}\text{.}\) Show that there is a nonzero polynomial \(p(x)=a_{n^2}x^{n^2}+a_{n^2-1}x^{n^2-1}+\cdots +a_1x+a_0\) such that

\begin{equation*} p(A)=a_{n^2}A^{n^2}+a_{n^2-1}A^{n^2-1}+\cdots +a_1A+a_0I_n=\underset{n\times n}{\boldzero}\text{.} \end{equation*}
Hint.

Consider the set \(S=\{A^{n^2}, A^{n^2-1}, \dots, A, I\}\subseteq M_{nn}\) and use a relevant statement from Theorem 4.5.18. Treat two cases separately: (a) the powers of \(A\) are all distinct; (b) \(A^i=A^j\) for some \(0\leq i < j\leq n^2\text{.}\)