
Math 21b Midterm 1 Study Guide

1. How to solve systems of linear equations using elimination


Organize the equations into an augmented matrix, then use elementary row operations (adding a multiple of one row to another, swapping rows, or multiplying a row by a nonzero scalar) to reduce the matrix to reduced row echelon form (RREF).
Note that a system of linear equations can have either 0, 1, or infinitely many solutions.
Leading variables correspond to the leading 1 in each row, and are the only variables with nonzero entries in their columns in RREF. Free variables are the variables whose columns contain no leading 1.
An example of a matrix in RREF:

$\left[\begin{array}{ccc|c} 1 & 0 & 3 & 1 \\ 0 & 1 & -2 & 2 \end{array}\right]$

This gives the solution $x_1 = 1 - 3x_3$ and $x_2 = 2x_3 + 2$. Writing the free variable as $x_3 = s$, every solution has the form

$\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 1 - 3s \\ 2s + 2 \\ s \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix} + s \begin{bmatrix} -3 \\ 2 \\ 1 \end{bmatrix}$
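If you want to check a reduction by machine, here is a minimal sketch using Python's sympy library (an aid added here, not part of the course material); Matrix.rref() returns the reduced matrix together with the pivot-column indices.

    from sympy import Matrix

    # Augmented matrix of the example system above
    A = Matrix([
        [1, 0,  3, 1],
        [0, 1, -2, 2],
    ])

    # rref() returns the reduced matrix and the indices of the pivot columns
    R, pivots = A.rref()
    print(R)       # already in RREF, so R equals A
    print(pivots)  # (0, 1): the columns of the leading variables x1 and x2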
2. Matrix-Vector Multiplication
Matrix-vector multiplication involves taking the dot product of each row of the matrix with the vector by which you are multiplying.
For example:

$\begin{bmatrix} 1 & 2 & 4 \\ 3 & 9 & 6 \\ 3 & 0 & 4 \end{bmatrix} \begin{bmatrix} 1 \\ 2 \\ 7 \end{bmatrix} = \begin{bmatrix} 33 \\ 63 \\ 31 \end{bmatrix}$
In order to multiply a vector by a matrix, the number of columns of the matrix must equal the number of rows of the vector. That is, a vector $\vec{x} \in \mathbb{R}^n$ can only be multiplied by an $m \times n$ matrix, giving a result in the form of a vector $\vec{b} \in \mathbb{R}^m$.
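As a sketch (assuming numpy is available), the 3 x 3 product at the start of this section can be checked in Python, where the @ operator performs matrix-vector multiplication:

    import numpy as np

    A = np.array([[1, 2, 4],
                  [3, 9, 6],
                  [3, 0, 4]])
    x = np.array([1, 2, 7])

    # Each entry of A @ x is the dot product of a row of A with x
    print(A @ x)  # [33 63 31]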
For example:

$\begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 2 & 5 \end{bmatrix} \begin{bmatrix} 2 \\ 3 \end{bmatrix} = \begin{bmatrix} 8 \\ 18 \\ 19 \end{bmatrix}$
Order matters when multiplying matrices and vectors: matrix multiplication is not commutative. That is, in general, $AB\vec{x} \neq BA\vec{x}$ for two distinct matrices A and B.
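A minimal sketch of this non-commutativity, with two hypothetical matrices chosen only for illustration:

    import numpy as np

    A = np.array([[1, 1],
                  [0, 1]])
    B = np.array([[1, 0],
                  [1, 1]])
    x = np.array([1, 0])

    # Applying B then A differs from applying A then B
    print(A @ (B @ x))  # [2 1]
    print(B @ (A @ x))  # [1 1]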
3. Linear Transformations
Linear transformations are functions that take a vector $\vec{x}$ in $\mathbb{R}^m$ and produce another vector $\vec{b}$ in $\mathbb{R}^n$. Note that m and n can be the same number, but they need not be.
A linear transformation can be used to scale, rotate, reflect, and project vectors onto one another.
A transformation from $\mathbb{R}^m$ to $\mathbb{R}^n$ is linear only if it respects addition and scalar multiplication, i.e.
$T(\vec{x} + \vec{y}) = T(\vec{x}) + T(\vec{y})$ and $T(k\vec{x}) = kT(\vec{x})$ for all $\vec{x}, \vec{y} \in \mathbb{R}^m$ and all scalars $k \in \mathbb{R}$.
A linear transformation sends $\vec{x}$ to a linear combination of fixed vectors (the columns of its matrix), and any such transformation $T(\vec{x}) = \vec{b}$ can be represented by a matrix A such that $A\vec{x} = \vec{b}$.
Examples:

Projection onto the x-axis: $\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}$

Reflection over the line y = x: $\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$

Scaling by a factor of 2: $\begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}$
Transformations can occur in many dimensions. To find the matrix of a transformation, the easiest way is to find $T(\vec{e}_1), T(\vec{e}_2), \dots, T(\vec{e}_n)$, which give the columns, in order, of the matrix A of the transformation, as the sketch below shows.
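Here is a sketch that rebuilds the reflection matrix over y = x from the examples above by evaluating the transformation on the standard basis vectors (the helper T is hypothetical):

    import numpy as np

    def T(v):
        # Reflection over the line y = x swaps the two coordinates
        return np.array([v[1], v[0]])

    e1 = np.array([1, 0])
    e2 = np.array([0, 1])

    # The columns of the matrix are T(e1), T(e2), in order
    A = np.column_stack([T(e1), T(e2)])
    print(A)  # [[0 1]
              #  [1 0]]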
4. Vector Spaces and Subspaces

A vector space is a collection of vectors that is closed under addition and scalar multiplication, such as $\mathbb{R}^3$, which is three-dimensional space.
A basis of $\mathbb{R}^n$ is a set of n vectors that spans all of $\mathbb{R}^n$. For instance, $\vec{e}_1, \vec{e}_2, \dots, \vec{e}_n$ span $\mathbb{R}^n$.
A subspace V of $\mathbb{R}^n$ is a set of vectors that is closed under scalar multiplication and addition.
The dimension of a subspace, written $\dim(V) = k$, is the number of vectors in a basis of the subspace, e.g. $\dim(\mathbb{R}^2) = 2$.
Another way of saying it: for every $\vec{b} \in V$, there is a combination of $\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n$ such that $\vec{b} = c_1\vec{v}_1 + c_2\vec{v}_2 + \dots + c_n\vec{v}_n$, with $c_1, \dots, c_n \in \mathbb{R}$.
If the vectors $\vec{v}_1, \dots, \vec{v}_n$ are linearly independent, then that combination of scalars $c_1, \dots, c_n$ is unique for each $\vec{b}$. A set V consisting of all such combinations (the span of $\vec{v}_1, \dots, \vec{v}_n$) is a subspace of $\mathbb{R}^n$.
The kernel and the image of a matrix are also subspaces. The kernel contains all vectors $\vec{x}$ such that $A\vec{x} = \vec{0}$, while the image is the span of the columns of the matrix, a subspace of the codomain.
For example, given the following matrix:

$A = \begin{bmatrix} 1 & 3 & 4 \\ 1 & 3 & 2 \\ 1 & 3 & 3 \end{bmatrix}$

The vector spanning the kernel is $\begin{bmatrix} -3 \\ 1 \\ 0 \end{bmatrix}$, and the vectors spanning the image are $\begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}, \begin{bmatrix} 4 \\ 2 \\ 3 \end{bmatrix}$.
5. Linear Independence and Bases


A basis for a vector space is a set of vectors whose linear combinations span the entire space. A basis is a linearly independent set of n vectors that spans a given n-dimensional space.
There are many statements equivalent to saying that a set of vectors is linearly independent.
For instance: $\ker(A) = \{\vec{0}\}$, where A is the matrix whose columns are $\vec{v}_1, \vec{v}_2, \dots, \vec{v}_n$, the vectors that span the subspace V.
If n vectors span an n-dimensional space, such as $\mathbb{R}^n$, then they are linearly independent and form a basis of that space. The sketch below shows how this test can be run by machine.
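As a sketch, the kernel test is easy to run in sympy on some hypothetical vectors: stack them as columns and check whether the kernel is trivial.

    from sympy import Matrix

    # Hypothetical vectors to test for linear independence
    v1 = [1, 0, 2]
    v2 = [0, 1, 1]
    v3 = [1, 1, 3]  # v3 = v1 + v2, so the set is dependent

    A = Matrix([v1, v2, v3]).T  # columns are v1, v2, v3
    # A non-empty nullspace means ker(A) != {0}: not linearly independent
    print(A.nullspace())  # [Matrix([[-1], [-1], [1]])]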
Here is a list of all equivalent statements (taken from Bretscher, page 133):
(a) A is invertible.
(b) The linear system $A\vec{x} = \vec{b}$ has a unique solution $\vec{x}$ for all $\vec{b} \in \mathbb{R}^n$.
(c) rref(A) = $I_n$.
(d) rank(A) = n.
(e) im(A) = $\mathbb{R}^n$.
(f) ker(A) = $\{\vec{0}\}$.
(g) The column vectors of A form a basis of $\mathbb{R}^n$.
(h) The column vectors of A span $\mathbb{R}^n$.
(i) The column vectors of A are linearly independent.

For any matrix A:

dim(im(A)) = # of leading variables (or leading 1s in RREF)
dim(ker(A)) = # of free variables

The Rank-Nullity Theorem: for an $n \times m$ matrix A, $\dim(\operatorname{im}(A)) + \dim(\ker(A)) = m$.
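A sketch verifying these dimension counts (and the kernel/image example from section 4) in sympy, where nullspace() and columnspace() return bases and rank() gives dim(im(A)):

    from sympy import Matrix

    # The example matrix from section 4
    A = Matrix([
        [1, 3, 4],
        [1, 3, 2],
        [1, 3, 3],
    ])

    print(A.nullspace())    # basis of ker(A): (-3, 1, 0)
    print(A.columnspace())  # basis of im(A): (1, 1, 1) and (4, 2, 3)

    rank = A.rank()               # dim(im(A)) = 2
    nullity = len(A.nullspace())  # dim(ker(A)) = 1
    print(rank + nullity == A.cols)  # True: 2 + 1 = 3 columns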
6. Coordinates and Coordinate Bases
Typically we describe transformations with respect to the basis of $\mathbb{R}^n$ given by $\vec{e}_1, \vec{e}_2, \dots, \vec{e}_n$. However, some transformations are easier to describe with respect to a different basis.

Example: The projection onto the line described by the vector $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$ is $\frac{1}{2}\begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}$ with respect to the standard basis. But if we choose a new basis for $\mathbb{R}^2$ consisting of $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$ and $\begin{bmatrix} 1 \\ -1 \end{bmatrix}$, then the matrix becomes

$B = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}$

since we are projecting onto the first vector in the new basis.

Notice that the matrix above is the same as projecting onto the x-axis in the standard basis.
If we want to find the matrix of the transformation with respect to the standard basis, we can use the formula $A = SBS^{-1}$ (equivalently, $B = S^{-1}AS$), where S is the matrix whose columns are the vectors of the new basis we are using.
We can build the matrix B column-by-column using the fact that the columns of B are the new-basis coordinates of $T(\vec{v}_1), T(\vec{v}_2), \dots, T(\vec{v}_n)$, where $\vec{v}_1, \dots, \vec{v}_n$ are the new basis vectors.
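A sketch verifying the change-of-basis formula on the projection example above (assuming the convention that S has the new basis vectors as its columns):

    import numpy as np

    # Standard-basis matrix of projection onto the line spanned by (1, 1)
    A = 0.5 * np.array([[1, 1],
                        [1, 1]])

    # New basis (1, 1), (1, -1) as the columns of S
    S = np.array([[1,  1],
                  [1, -1]])

    # The matrix of the same transformation in the new basis: B = S^(-1) A S
    B = np.linalg.inv(S) @ A @ S
    print(np.round(B))  # [[1. 0.]
                        #  [0. 0.]]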
