Semester Two, 2010
STANDARD PAPER
SURNAME/FAMILY NAME:
Duration: Reading time 5 minutes; Working time 3 hours; Total time 3 hours 5 minutes
Attempt:
Marks: 80
Type of Exam:
Special Instructions:
Students are not permitted to write on the examination or any other paper during
reading time.
Do not commence the examination until you are told to do so.
1. [8 marks]
3
1 b
1
1 3 0 11
(a) What can you say about row 3 of A? Give an example of a possible third row for A.
(b)
(c)
(d) (i) null space  (ii) column space
(e)
2. [6 marks]
(a)
(b)
(c) B = {b1, b2, b3}, where b1 = (1, 1, 1), b2 = (1, 1, 0), and b3 = (1, 0, 1), is a basis of R3. Find the coordinate vector of v = (3, 2, 4) relative to the basis B.
… = (1, 1, 1).
3.
[4 marks]
Quicklime (CaO) and carbon dioxide (CO2) combine to form calcium carbonate (CaCO3).
(a)
(b) How many parts each of quicklime and carbon dioxide are needed to produce 8 parts of calcium carbonate?
4.
[8 marks]
Let T : R2 → R2 be the linear transformation defined by T(x1, x2) = (x1 + 3x2, 5x1 + x2).
(a) Find the image of (2, 1) under T.
(b)
(c)
(d)
5.
[6 marks]
Let u = (1, 2, 2) and v = (4, 0, 2).
(a)
(b)
(c)
(d)
6.
[9 marks]
        8  3  1
Let A = 0  5  1 .
        0  0  7
(a)
(b) Use matrix multiplication to show that (1, -1, 0) is an eigenvector of A and state the corresponding eigenvalue.
(c)
(d)
7.
[7 marks]
Suppose that the eigenvalues of a 2×2 matrix A are 1 and 7, and that the corresponding eigenvectors are (1, 2) and (1, 3).
(a)
(b)
(c)
(d)
8.
[8 marks]
Let W = Span{ (1, 1, 1), (1, 1, 0), (1, 0, 1) }.
(a)
(b)
(c) Determine the projection of the vector v = (0, 1, 0) on W.
9.
[7 marks]
A certain experiment produces data (1, 2), (3, 2), (5, 3), (6, 4) and (9, 7). Write down the
system of linear equations for the model that produces a least squares fit of these points by a
function of the form
y = β0 + β1 x.
Find the least squares solution of the system of equations you derived and hence determine the values for β0 and β1 that produce the least squares fit for the data.
10.
[5 marks]
Write down the matrix of the quadratic form x1^2 + 5 x1 x2 + 4 x2^2 and classify the given quadratic form.
11.
[12 marks]
For each of the statements about matrices and vectors below, state if it is true or false.

Statement: If the linear transformation corresponding to a square matrix is one-to-one, then it is invertible.    True / False
A system of linear equations is called homogeneous if it can be written as Ax = 0. The system is called inhomogeneous if it is of the form Ax = b, b ≠ 0.
Suppose the equation Ax = b is consistent for some given b, and let p be a solution. Then the solution set of Ax = b is the set of all vectors of the form w = p + vh, where vh is any solution of the homogeneous equation Ax = 0.
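As a quick numerical illustration of this structure (the system below is made up for the example, not taken from the questions above): a particular solution p plus any multiple of a homogeneous solution vh is again a solution of Ax = b.

```python
# Solution-set structure w = p + vh for a consistent system Ax = b,
# using a hypothetical 2x3 example (not from the paper).

def matvec(A, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 2, 1],
     [0, 1, 1]]
b = [4, 1]

p  = [2, 1, 0]    # one particular solution: A p = b
vh = [1, -1, 1]   # a homogeneous solution:  A vh = 0

assert matvec(A, p) == b
assert matvec(A, vh) == [0, 0]

# Every w = p + t*vh is then also a solution of Ax = b.
for t in (-2, 0, 3):
    w = [pi + t * vi for pi, vi in zip(p, vh)]
    assert matvec(A, w) == b
```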
Linear combinations and subspaces
The linear span of {v1, v2, …, vp} in Rn, denoted by Span{v1, v2, …, vp}, is the set of all linear combinations of v1, v2, …, vp.
The columns of A = [a1 a2 … an] span Rm if every vector in Rm is a linear combination of the columns of A.
An eigenvector of an n×n matrix A is a nonzero vector x such that Ax = λx for some scalar λ.
A scalar λ is called an eigenvalue of A if there is a nontrivial solution x of Ax = λx; such an x is called an eigenvector corresponding to λ.
The set of all solutions to (A - λI)x = 0 is called the eigenspace of A corresponding to λ.
If v1, …, vr are eigenvectors that correspond to distinct eigenvalues λ1, …, λr of an n×n matrix A, then {v1, …, vr} is a linearly independent set.
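These definitions can be checked numerically. The example matrix below is made up for illustration (it does not appear in the paper); being triangular, its eigenvalues are its diagonal entries.

```python
# Checking Ax = lam*x for candidate eigenvectors of a small example matrix.

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[2, 1],
     [0, 3]]

# Candidate (eigenvector, eigenvalue) pairs: A is triangular, so its
# eigenvalues are the diagonal entries 2 and 3.
pairs = [([1, 0], 2), ([1, 1], 3)]

for x, lam in pairs:
    assert matvec(A, x) == [lam * xi for xi in x]

# Eigenvectors for the distinct eigenvalues 2 and 3 are linearly
# independent: the determinant of the matrix [x1 x2] is nonzero.
(x1, _), (x2, _) = pairs
det = x1[0] * x2[1] - x1[1] * x2[0]
assert det != 0
```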
A scalar λ is an eigenvalue of A if and only if λ satisfies the characteristic equation det(A - λI) = 0.
Two n×n matrices A and B are similar if there exists an invertible n×n matrix P such that A = PBP^-1.
If A and B are similar n×n matrices, then A and B have the same characteristic polynomial and hence the same eigenvalues.
A square matrix A is diagonalisable if A is similar to a diagonal matrix D.
An n×n matrix A is diagonalisable if and only if A has n linearly independent eigenvectors.
In fact, A = PDP^-1, with D a diagonal matrix, if and only if the columns of P are n linearly independent eigenvectors of A. In this case, the diagonal entries of D are eigenvalues of A that correspond, respectively, to the eigenvectors in P.
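The factorisation A = PDP^-1 can be verified directly for a small example (the matrix here is chosen for illustration, not taken from the paper):

```python
# Verifying A = P D P^-1 for a 2x2 example: the columns of P are
# eigenvectors of A, and the diagonal of D holds the eigenvalues.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A    = [[2, 1], [0, 3]]
P    = [[1, 1], [0, 1]]    # columns (1,0) and (1,1): eigenvectors of A
D    = [[2, 0], [0, 3]]    # eigenvalues 2 and 3 on the diagonal
Pinv = [[1, -1], [0, 1]]   # inverse of P (det P = 1)

assert matmul(P, Pinv) == [[1, 0], [0, 1]]
assert matmul(matmul(P, D), Pinv) == A
```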
Orthogonality
Given a vector u = (u1, u2) in R2 we define its length (or norm) ||u|| = √(u1^2 + u2^2).
Properties: For a scalar a and vectors u, v
1. ||v|| ≥ 0, and ||v|| = 0 if and only if v = 0
2. ||av|| = |a| ||v||
3. ||v + u|| ≤ ||v|| + ||u||
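These norm properties are easy to sanity-check numerically (vectors and scalar below are made up for the example):

```python
import math

def norm(v):
    # ||v|| = sqrt(v1^2 + ... + vn^2)
    return math.sqrt(sum(x * x for x in v))

u = [3, 4]     # ||u|| = 5
v = [1, -2]
a = -3

assert norm(u) == 5.0
# ||av|| = |a| ||v||
assert math.isclose(norm([a * x for x in u]), abs(a) * norm(u))
# Triangle inequality: ||u + v|| <= ||u|| + ||v||
w = [ui + vi for ui, vi in zip(u, v)]
assert norm(w) <= norm(u) + norm(v)
```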
For vectors u, v, w and a scalar c:
a. u·v = v·u
b. (u + v)·w = u·w + v·w
c. (cu)·v = c(u·v) = u·(cv)
d. u·u ≥ 0, and u·u = 0 if and only if u = 0
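The dot-product rules a.–d. above can be confirmed on concrete vectors (chosen arbitrarily for this sketch, not from the paper):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

u, v, w, c = [1, 2, 0], [3, -1, 2], [1, -1, 3], 5

assert dot(u, v) == dot(v, u)                                          # a.
assert dot([a + b for a, b in zip(u, v)], w) == dot(u, w) + dot(v, w)  # b.
assert dot([c * a for a in u], v) == c * dot(u, v) == dot(u, [c * b for b in v])  # c.
assert dot(u, u) >= 0 and dot([0, 0, 0], [0, 0, 0]) == 0               # d.
```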
Two vectors u and v are said to be orthogonal (to each other) if u·v = 0.
If a vector z is orthogonal to every vector in a subspace W of Rn, then z is said to be orthogonal to W.
The set W⊥ of vectors z that are orthogonal to W is called the orthogonal complement of W.
Let A be an m×n matrix. Then (Row A)⊥ = Nul(A) and (Col A)⊥ = Nul(A^T).
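Concretely, (Row A)⊥ = Nul(A) means any vector in the null space of A is orthogonal to every row of A. A hypothetical 2×3 example:

```python
# A null-space vector of A is orthogonal to each row of A.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A = [[1, 2, 1],
     [0, 1, 1]]
n = [1, -1, 1]   # A n = 0, so n is in Nul A

assert all(dot(row, n) == 0 for row in A)
```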
A set of vectors {u1, u2, …, up} in Rn is called an orthogonal set if ui·uj = 0 whenever i ≠ j.
Suppose S = {u1, u2, …, up} is an orthogonal set of nonzero vectors in Rn and W = Span{u1, u2, …, up}. Then S is a basis for W.
Let {u1, u2, …, up} be an orthogonal basis for a subspace W of Rn. Then each y in W has a unique representation as a linear combination of u1, u2, …, and up. In fact, if y = c1u1 + c2u2 + … + cpup, then cj = (y·uj)/(uj·uj), (j = 1, …, p).
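The weight formula cj = (y·uj)/(uj·uj) can be tried on a small orthogonal basis (the basis and y below are illustrative, not taken from the questions):

```python
# Weights relative to an orthogonal basis, and reconstruction of y.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

basis = [[1, 1, 0], [1, -1, 0], [0, 0, 1]]   # pairwise orthogonal
y = [3, 1, 2]

c = [dot(y, u) / dot(u, u) for u in basis]
assert c == [2.0, 1.0, 2.0]

# The weights reproduce y: y = c1*u1 + c2*u2 + c3*u3.
recon = [sum(cj * u[i] for cj, u in zip(c, basis)) for i in range(3)]
assert recon == y
```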
The set of least squares solutions of Ax = b is the set of all solutions of the normal equations A^T A x = A^T b.
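A worked sketch of the normal equations for a straight-line fit y = β0 + β1 x, using made-up data (deliberately not the data from Question 9):

```python
# Build and solve the normal equations A^T A x = A^T b for a line fit.

data = [(0, 1), (1, 3), (2, 5)]
A = [[1, x] for x, _ in data]     # design matrix: columns [1, x]
b = [y for _, y in data]

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

At  = [list(col) for col in zip(*A)]
AtA = matmul(At, A)                                           # [[3, 3], [3, 5]]
Atb = [sum(r[k] * b[k] for k in range(len(b))) for r in At]   # [9, 13]

# Solve the 2x2 system AtA [b0, b1] = Atb by Cramer's rule.
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
b0 = (Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / det
b1 = (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / det
assert (b0, b1) == (1.0, 2.0)   # the data lie exactly on y = 1 + 2x
```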
Let A be a square n×n matrix. Then the following statements are equivalent (i.e., for a given A, they are either all true or all false).
a. A is an invertible matrix.
b. A is row equivalent to In.
c. A has n pivot positions.
d. The equation Ax = 0 has only the trivial solution.
e. The columns of A form a linearly independent set.
f. The linear transformation x ↦ Ax is one-to-one.
g. The equation Ax = b has at least one solution for each b in Rn.
h. The columns of A span Rn.
i. The linear transformation x ↦ Ax maps Rn onto Rn.
j. There is an n×n matrix C such that CA = In.
k. There is an n×n matrix D such that AD = In.
l. A^T is an invertible matrix.
m. The columns of A form a basis for Rn.
n. Col A = Rn.
o. dim Col A = n.
p. rank A = n.
q. Nul A = {0}.
r. dim Nul A = 0.
s. The number 0 is not an eigenvalue of A.
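A few of these equivalent conditions can be spot-checked on an invertible 2×2 example (chosen for illustration; since it is triangular, its eigenvalues are the diagonal entries):

```python
# Spot-check: det A != 0, a matrix C with CA = AD = I exists, and 0 is
# not an eigenvalue.

def matmul(M, N):
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

A = [[2, 1],
     [0, 1]]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
assert det != 0

# Explicit inverse of a 2x2 matrix; it serves as both C and D above.
C = [[A[1][1] / det, -A[0][1] / det],
     [-A[1][0] / det, A[0][0] / det]]
assert matmul(C, A) == [[1.0, 0.0], [0.0, 1.0]]
assert matmul(A, C) == [[1.0, 0.0], [0.0, 1.0]]

# A is triangular, so its eigenvalues are the diagonal entries 2 and 1.
assert 0 not in (A[0][0], A[1][1])
```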