
Math 315 Exam 2 Solutions

Apr 1, 2015

1. (10 pts) Suppose (v1 , . . . , vn ) is linearly independent in V and w ∈ V . Prove that if (v1 +
w, . . . , vn + w) is linearly dependent, then w ∈ span(v1 , . . . , vn ).

Suppose (v1 + w, . . . , vn + w) is linearly dependent. Then there exist scalars α1 , . . . , αn ,
not all 0, such that
α1 (v1 + w) + · · · + αn (vn + w) = 0.
Hence
−(α1 + · · · + αn )w = α1 v1 + · · · + αn vn .
If α1 + · · · + αn = 0, then
0 = α1 v1 + · · · + αn vn .
But this contradicts the linear independence of (v1 , . . . , vn ) since we know at least one of
the αi ≠ 0. Therefore α1 + · · · + αn ≠ 0 and
w = (−α1 /(α1 + · · · + αn ))v1 + · · · + (−αn /(α1 + · · · + αn ))vn ∈ span(v1 , . . . , vn ).
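As a concrete sanity check (not part of the proof), the claim can be verified numerically with a hypothetical example in R^3: take v1 = e1 , v2 = e2 and choose w = (0, −1, 0) so that v2 + w = 0, which makes the shifted list dependent; then confirm that w indeed lies in span(v1 , v2 ).

```python
import numpy as np

# Hypothetical example in R^3: v1, v2 are linearly independent.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
# Choose w so that (v1 + w, v2 + w) is linearly dependent: v2 + w = 0.
w = np.array([0.0, -1.0, 0.0])

# (v1, v2) is independent: the matrix with these rows has rank 2.
assert np.linalg.matrix_rank(np.vstack([v1, v2])) == 2
# (v1 + w, v2 + w) is dependent: its rank drops below 2.
assert np.linalg.matrix_rank(np.vstack([v1 + w, v2 + w])) < 2
# w is in span(v1, v2): appending w to the list does not raise the rank.
assert np.linalg.matrix_rank(np.vstack([v1, v2, w])) == 2
print("w lies in span(v1, v2), as the proposition predicts")
```

Rank comparisons stand in for the span-membership arguments of the proof: a vector is in the span of a list exactly when adjoining it leaves the rank unchanged.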
2. (10 pts) Let V be a finite dimensional vector space. Prove that every linear map on a
subspace of V can be extended to a linear map on V . In other words, if U is a subspace
of V , W is any other vector space over the same field, and S ∈ L(U, W ), then there exists
T ∈ L(V, W ) such that T (u) = S(u) for all u ∈ U .

Since V is finite dimensional and U is a subspace of V , U is also finite dimensional. Hence
we can choose a basis (u1 , . . . , um ) for U . By Prop 2.33, we can extend this to a basis
(u1 , . . . , um , v1 , . . . , vn ) of V . By Prop 3.5, there exists a unique linear map T ∈ L(V, W )
such that T (ui ) = S(ui ) for all i and T (vi ) = 0 for all i.
We will now show that for all u ∈ U , T (u) = S(u). Since u ∈ U , there exist α1 , . . . , αm ∈ F
such that u = α1 u1 + · · · + αm um . By the linearity of T and S,
T (u) = T (α1 u1 + · · · + αm um )
= α1 T (u1 ) + · · · + αm T (um )
= α1 S(u1 ) + · · · + αm S(um )
= S(α1 u1 + · · · + αm um )
= S(u).
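The construction can also be illustrated numerically (a sketch with made-up data, not the textbook's notation): take U = span(u1 , u2 ) inside V = R^3 and W = R^2 , prescribe S on the basis of U , extend to a basis of R^3 , and build the matrix of T by sending the extra basis vector to 0.

```python
import numpy as np

# Hypothetical data: U = span(u1, u2) inside V = R^3, W = R^2.
u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([0.0, 1.0, 1.0])
S_u1 = np.array([2.0, 0.0])   # prescribed values S(u1), S(u2)
S_u2 = np.array([0.0, 3.0])

# Extend (u1, u2) to a basis of R^3 with one extra vector v1.
v1 = np.array([0.0, 0.0, 1.0])
B = np.column_stack([u1, u2, v1])        # basis vectors as columns
assert np.linalg.matrix_rank(B) == 3     # really a basis of R^3

# T is determined by its values on the basis:
# T(u1) = S(u1), T(u2) = S(u2), T(v1) = 0.
values = np.column_stack([S_u1, S_u2, np.zeros(2)])
T = values @ np.linalg.inv(B)            # matrix of T in standard coordinates

# Check the extension property T(u) = S(u) on an arbitrary u in U.
a, b = 5.0, -2.0
u = a * u1 + b * u2
assert np.allclose(T @ u, a * S_u1 + b * S_u2)
print("T extends S from U to all of R^3")
```

Solving T·B = values for T is exactly the uniqueness statement of Prop 3.5 in matrix form: a linear map is pinned down by its values on a basis.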

3. (10 pts) Let V be a finite dimensional vector space and (v1 , v2 , . . . , vn ) a linearly independent
list of vectors in V . Prove that such a list can be extended to a basis of V .

This is Prop 2.33 in your textbook. We also proved it in class in a different way than the
proof in the book. Here is that proof.
Since V is finite dimensional, it has some finite spanning list of vectors. Let m be the
number of vectors in one such list. If V = span(v1 , . . . , vn ), then (v1 , . . . , vn ) is already
a basis of V and the needed extension is trivial. Otherwise choose vn+1 ∈ V such that
vn+1 ∉ span(v1 , . . . , vn ). Since vn+1 is not a linear combination of (v1 , . . . , vn ), the extended
list (v1 , . . . , vn+1 ) must be linearly independent by the Linear Dependence Lemma. Continue
adjoining vectors that are not already in the span for as long as possible. As the list
grows, it remains linearly independent. But no linearly independent list can be longer than
any spanning list, so this process must end after at most m − n steps. The only way it can
end is that every vector of V is a linear combination of the vectors in the list. At that point
the extended list is a basis of V .

4. (5 pts each) Let V = P4 (R) and
U = {p ∈ V | p′ (1) = 0}.
(a) Find a basis of U .

Let p(x) = a4 x4 + a3 x3 + a2 x2 + a1 x + a0 ∈ U . Then p′ (x) = 4a4 x3 + 3a3 x2 + 2a2 x + a1
and
p′ (1) = 4a4 + 3a3 + 2a2 + a1 = 0 =⇒ a1 = −4a4 − 3a3 − 2a2 .
So
p(x) = a4 x4 + a3 x3 + a2 x2 + (−4a4 − 3a3 − 2a2 )x + a0
= a4 (x4 − 4x) + a3 (x3 − 3x) + a2 (x2 − 2x) + a0 .
Therefore every polynomial in U is a linear combination of (1, x2 − 2x, x3 − 3x, x4 − 4x).
This list is also linearly independent: no polynomial in it can be a linear combination of
the preceding ones, since its degree is higher than the degrees of all of the preceding
polynomials. Therefore (1, x2 − 2x, x3 − 3x, x4 − 4x) is a basis of U .
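The claimed basis can be checked numerically by working with coefficient vectors (a sanity check under the representation p = a0 + a1 x + · · · + a4 x4 , not part of the solution):

```python
import numpy as np

# Coefficient vectors (a0, a1, a2, a3, a4) for the claimed basis of U.
basis_U = {
    "1":      np.array([1.0,  0.0, 0.0, 0.0, 0.0]),
    "x^2-2x": np.array([0.0, -2.0, 1.0, 0.0, 0.0]),
    "x^3-3x": np.array([0.0, -3.0, 0.0, 1.0, 0.0]),
    "x^4-4x": np.array([0.0, -4.0, 0.0, 0.0, 1.0]),
}

def dp_at_1(coeffs):
    # p'(1) = a1 + 2*a2 + 3*a3 + 4*a4
    return sum(k * coeffs[k] for k in range(1, 5))

# Every basis polynomial satisfies the defining condition p'(1) = 0.
for name, c in basis_U.items():
    assert dp_at_1(c) == 0, name

# The four coefficient vectors are linearly independent.
M = np.vstack(list(basis_U.values()))
assert np.linalg.matrix_rank(M) == 4
print("basis of U verified: p'(1) = 0 and linearly independent")
```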
(b) Extend the basis you found in part (a) to a basis of V .

To extend the list from part (a) to a basis of V , follow the process in the proof of
problem 3. Since V = span(1, x, x2 , x3 , x4 ), we have dim V = 5, so we need exactly one
more vector to extend the list to a basis of V . An obvious choice of a polynomial not
already in the span of the list is p(x) = x. It cannot already be in the span because the
list from part (a) spans U , and p ∉ U since p′ (1) = 1 ≠ 0. Hence the extended list
(1, x2 − 2x, x3 − 3x, x4 − 4x, x) is linearly independent, and since it consists of 5 = dim V
vectors, it must be a basis of V .
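As a numerical check (again using coefficient vectors (a0 , . . . , a4 ) for polynomials in P4 (R), a sketch rather than a proof), the five polynomials of the extended list have linearly independent coefficient vectors:

```python
import numpy as np

# Coefficient vectors (a0, a1, a2, a3, a4) of the extended list
# (1, x^2 - 2x, x^3 - 3x, x^4 - 4x, x).
extended = np.array([
    [1.0,  0.0, 0.0, 0.0, 0.0],   # 1
    [0.0, -2.0, 1.0, 0.0, 0.0],   # x^2 - 2x
    [0.0, -3.0, 0.0, 1.0, 0.0],   # x^3 - 3x
    [0.0, -4.0, 0.0, 0.0, 1.0],   # x^4 - 4x
    [0.0,  1.0, 0.0, 0.0, 0.0],   # x
])

# Five linearly independent vectors in the 5-dimensional space P4(R)
# form a basis: the coefficient matrix must have rank 5.
assert np.linalg.matrix_rank(extended) == 5
print("the extended list is a basis of P4(R)")
```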

5. (10 pts) Extra credit problem. Let V be a vector space over the field R. Prove that the
union of three subspaces of V is a subspace of V if and only if one of the three contains
the other two. (Hint: One of your homework exercises was to show that the union of two
subspaces is a subspace if and only if one of them contains the other. Start off by using this
result to show that if the union of three subspaces is a subspace then either one of them
contains the other two or none of them contains any of the other two.)

First, if one of the three subspaces contains the other two, say U2 , U3 ⊆ U1 , then
U1 ∪ U2 ∪ U3 = U1 , which is a subspace. This proves one direction.
For the converse, let U1 , U2 , U3 be subspaces of V such that U1 ∪ U2 ∪ U3 is also a
subspace. Suppose that one of them contains one of the others, say U1 ⊆ U2 . Then
U1 ∪ U2 ∪ U3 = U2 ∪ U3 . Since this is a subspace, either U2 ⊆ U3 or U3 ⊆ U2 by exercise
1.C.12. In the first case, U3 contains both U1 and U2 ; in the second, U2 contains both U1
and U3 . Hence either one of the subspaces contains both of the other two or none of the
subspaces contains either of the other two. We will show by contradiction that the latter
is not possible.
Suppose that none of the subspaces contains either of the other two. If U1 ⊆ U2 ∪ U3 ,
then U1 ∪ U2 ∪ U3 = U2 ∪ U3 . Since this is a subspace, either U2 ⊆ U3 or U3 ⊆ U2 (again
by exercise 1.C.12), contradicting our assumption. So U1 ⊈ U2 ∪ U3 , and we can choose
u1 ∈ U1 such that u1 ∉ U2 and u1 ∉ U3 . Similarly, there exists u2 ∈ U2 such that u2 ∉ U1
and u2 ∉ U3 . Since u1 , u2 ∈ U1 ∪ U2 ∪ U3 , which is a subspace, u1 + u2 ∈ U1 ∪ U2 ∪ U3 .
Notice that u1 + u2 ∉ U1 , for otherwise u2 = (u1 + u2 ) − u1 ∈ U1 . Similarly, u1 + u2 ∉ U2 .
Hence u1 + u2 must be in U3 .
Now, u1 − u2 ∈ U1 ∪ U2 ∪ U3 , so it must be in U1 or U2 or U3 . It cannot be in U1 , for
otherwise u2 = u1 − (u1 − u2 ) ∈ U1 ; similarly, u1 − u2 ∉ U2 . So u1 − u2 ∈ U3 . But then
2u1 = (u1 + u2 ) + (u1 − u2 ) ∈ U3 , and hence u1 = (1/2)(2u1 ) ∈ U3 , contradicting u1 ∉ U3 .
We have reached a contradiction, so the assumption that none of the subspaces contains
either of the other two must be false. The only remaining case is that one of them contains
the other two.
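The mechanics of this argument show up in a concrete failure case (a numerical illustration, not part of the proof): take three distinct lines through the origin in R2 , none containing another, and watch the union fail to be closed under addition.

```python
import numpy as np

# Three distinct lines through the origin in R^2, none containing another.
def in_U1(p): return p[1] == 0        # the x-axis
def in_U2(p): return p[0] == 0        # the y-axis
def in_U3(p): return p[0] == p[1]     # the line y = x

u1 = np.array([1.0, 0.0])   # in U1 but not in U2 or U3
u2 = np.array([0.0, 1.0])   # in U2 but not in U1 or U3

s = u1 + u2   # (1, 1): not in U1 or U2, but lands in U3,
assert not in_U1(s) and not in_U2(s) and in_U3(s)   # just as the proof forces

d = u1 - u2   # (1, -1): in none of the three subspaces,
assert not (in_U1(d) or in_U2(d) or in_U3(d))
# so U1 ∪ U2 ∪ U3 is not closed under addition and is not a subspace.
print("union of the three lines is not a subspace")
```

Here u1 + u2 is forced into U3 exactly as in the proof, while u1 − u2 escapes all three lines, which is the contradiction the proof derives in general.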
