
INTERNAL

Semester Two, 2010
STANDARD PAPER

Unit Code and Title: MAT1163 Linear Algebra

Student Number:
SURNAME/FAMILY NAME:
OTHER OR GIVEN NAME/S:
(Please print clearly)

Duration:
    Reading time: 5 minutes
    Working time: 3 hours
    Total time: 3 hours 5 minutes

Attempt: All eleven questions

Marks: 80

Type of Exam: Closed Book exam -- text books/reference books/notes are not permitted.

Special Instructions:
Mathematical Tables are supplied. These should be handed back at the end of the examination period.
Hand-held calculators (not palm-tops) are permitted.
Write your answers in the spaces provided.
Give full details of your working for each question.
There are 12 pages in total.

Students are not permitted to write on the examination or any other paper during
reading time.
Do not commence the examination until you are told to do so.


1.

[8 marks]

The reduced row echelon form of

    A = [ 3   1   b   1 ]
        [ 3   a   5   8 ]
        [     row 3     ]

is equal to

    R = [ 1   3   0   11 ]
        [ 0   0   1    5 ]
        [ 0   0   0    0 ].

(a) What can you say about row 3 of A? Give an example of a possible third row for A.

(b) Determine the values of a and b.

(c) Determine the solution of the homogeneous system of equations Rx = 0 in parametric vector form.

(d) Circle the spaces that are the same for A as for R:
    (i) null space        (ii) column space

(e) What is the dimension of the column space of A? Do the columns of A span R3?


2.

[6 marks]

(a) Show that the set B = {b1, b2, b3}, where b1 = (1, 1, 1), b2 = (1, 1, 0) and b3 = (1, 0, 1), is a basis of R3.

(b) Find the coordinate vector [v]B of v = (3, 2, 4) relative to the basis B.

(c) Find w if the coordinate vector [w]B of w relative to B is [w]B = (1, 1, 1).


3.

[4 marks]
Quicklime (CaO) and carbon dioxide (CO2) combine to form calcium carbonate (CaCO3).

(a) Write a vector equation to balance this chemical reaction.

(b) How many parts each of quicklime and carbon dioxide are needed to produce 8 parts of calcium carbonate?


4.

[8 marks]
Let T : R2 → R2 be the linear transformation defined by T(x1, x2) = (x1 + 3x2, 5x1 + x2).
(a) Find the image of (2, 1) under T.

(b) Find x such that T(x) = (4, 4).

(c) Determine the matrix of T.

(d) Is the transformation T invertible? If yes, determine the inverse transformation for T.


5.

[6 marks]
Let u = (1, 2, 2) and v = (4, 0, 2).

(a) Determine the angle between u and v.

(b) Calculate the area of the triangle spanned by u and v.

(c) Calculate the projection of v on u.

(d) Calculate the distance of v from Span{u}.


6.

[9 marks]
Let

    A = [ 8  -3   1 ]
        [ 0   5   1 ]
        [ 0   0   7 ].

(a) State the eigenvalues of A.

(b) Use matrix multiplication to show that (1, 1, 0) is an eigenvector of A and state the corresponding eigenvalue.

(c) Find the eigenvectors corresponding to the remaining two eigenvalues of A by solving the appropriate systems of equations. Show your working.

(d) Write down an invertible matrix S and a diagonal matrix D so that AS = SD.


7.

[7 marks]
Suppose that the eigenvalues of a 2×2 matrix A are 1 and 7, and that the corresponding eigenvectors are (1, 2) and (1, 3).

(a) Write down the characteristic polynomial of A.

(b) Is the matrix A invertible? Justify your answer.

(c) Is the linear transformation corresponding to A onto? Justify your answer.

(d) Determine a matrix that has these eigenvalues and eigenvectors.


8.

[8 marks]

Let W = Span{(1, 1, 1), (1, 1, 0), (1, 0, 1)}.

(a) Determine the dimension of W and write down a basis for W.

(b) Use the Gram-Schmidt process to find an orthonormal basis for W.

(c) Determine the projection of the vector v = (0, 1, 0) on W.


9.

[7 marks]
A certain experiment produces data (1, 2), (3, 2), (5, 3), (6, 4) and (9, 7). Write down the system of linear equations for the model that produces a least squares fit of these points by a function of the form y = β0 + β1x.

Find the least squares solution of the system of equations you derived and hence determine the values for β0 and β1 that produce the least squares fit for the data.


10.

[5 marks]
Write down the matrix of the quadratic form x1² + 5x1x2 + 4x2² and classify the given quadratic form.


11.

[12 marks]

For each of the statements about matrices and vectors below, state if it is true or false.

If the linear transformation corresponding to a square matrix is one-to-one, then it is invertible.
If the rank of the 5×5 matrix A is 4, then A is invertible.
If A is a 4×5 matrix, then the dimension of the null space is no more than 4.
Two matrices which have the same reduced row echelon form have the same column space.
Two matrices which have the same reduced row echelon form have the same null space.
If the matrix A is diagonalisable, then A is invertible.
If the null space of A contains only the zero vector, then the linear transformation corresponding to A is onto.
Every orthonormal set in Rn is linearly independent.
If A and B are similar matrices, then they have the same eigenvalues.
If A is symmetric, then A is diagonalisable.
If A has size 5×7 and rank 3, then the dimension of the null space of A is 3.
If A has size 5×7 and there is a solution of Ax = b for every b in R5, then the rank of A is 5.
END OF PAPER



Systems of linear equations

A system of linear equations is called homogeneous if it can be written as Ax = 0. The system is
called inhomogeneous if it is of the form Ax = b with b ≠ 0.
Suppose the equation Ax=b is consistent for some given b, and let p be a solution. Then the solution
set of Ax=b is the set of all vectors of the form w=p+vh, where vh is any solution of the
homogeneous equation Ax=0.
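A small numerical sketch of this structure (Python with NumPy; the system, the particular solution p and the homogeneous solution vh below are illustrative choices):

    import numpy as np

    # An illustrative consistent system Ax = b.
    A = np.array([[1., 3., 0., 11.],
                  [0., 0., 1., 5.],
                  [0., 0., 0., 0.]])
    b = np.array([2., 1., 0.])

    p = np.array([2., 0., 1., 0.])            # one particular solution: Ap = b
    vh = np.array([-3., 1., 0., 0.])          # one homogeneous solution: A vh = 0

    print(np.allclose(A @ p, b))              # True
    print(np.allclose(A @ vh, 0))             # True
    print(np.allclose(A @ (p + 2 * vh), b))   # True: p plus any homogeneous solution solves Ax = b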
Linear combinations and subspaces

Given vectors v1, v2, …, vp in Rn and given scalars c1, c2, …, cp, the vector y defined by y = c1v1 + c2v2 + … + cpvp is called a linear combination of v1, v2, …, vp using weights c1, c2, …, cp.
The linear span of {v1, v2, …, vp} ⊆ Rn, denoted by Span{v1, v2, …, vp}, is the set of all linear combinations of v1, v2, …, vp.
The columns of A = [a1 a2 … ap] span Rm if every vector b in Rm is a linear combination of a1, a2, …, ap, i.e. Span{a1, a2, …, ap} = Rm.


A set of vectors {v1, v2, …, vp} in Rn is said to be linearly independent if the vector equation x1v1 + x2v2 + … + xpvp = 0 has only the trivial solution. The set {v1, v2, …, vp} is said to be linearly dependent if there exist weights c1, …, cp, not all 0, such that c1v1 + c2v2 + … + cpvp = 0.
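In practice this can be checked by forming the matrix whose columns are the given vectors: the set is linearly independent exactly when that matrix has rank equal to the number of vectors, so that Vx = 0 has only the trivial solution. A short sketch with illustrative vectors:

    import numpy as np

    v1 = np.array([1., 1., 1.])               # illustrative vectors
    v2 = np.array([1., 1., 0.])
    v3 = np.array([1., 0., 1.])
    V = np.column_stack([v1, v2, v3])

    # Independent iff rank(V) equals the number of vectors.
    print(np.linalg.matrix_rank(V) == V.shape[1])   # True: the set is independent

    # A dependent example: the third vector is v1 + v2.
    W = np.column_stack([v1, v2, v1 + v2])
    print(np.linalg.matrix_rank(W) == W.shape[1])   # False: the set is dependent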
A subspace of Rn is any set H in Rn that has the following three properties:
1. The zero vector is in H.
2. For each u and v in H, the sum u + v is in H.
3. For each u in H and each scalar c, the vector cu is in H.
The column space of a matrix A is the set Col A of all linear combinations of the columns of A.
The null space of a matrix A is the set Nul A of all solutions to the homogeneous equation Ax = 0.
A basis for a subspace H of Rn is a linearly independent set in H that spans H.
Suppose that the set B = {b1, b2, …, bp} is a basis for a subspace H. For each x in H, the coordinates of x relative to the basis {b1, b2, …, bp} are the weights c1, c2, …, cp such that x = c1b1 + c2b2 + … + cpbp, and the vector [x]B = (c1, c2, …, cp) in Rp is called the coordinate vector of x relative to B.
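Computing a coordinate vector amounts to solving a linear system: if PB is the matrix whose columns are the basis vectors, then [x]B is the solution c of PB c = x. A short sketch with illustrative values:

    import numpy as np

    PB = np.array([[1., 1., 1.],      # basis vectors b1, b2, b3 as columns (illustrative)
                   [1., 1., 0.],
                   [1., 0., 1.]])
    x = np.array([3., 2., 4.])

    c = np.linalg.solve(PB, x)        # coordinate vector [x]_B
    print(c)
    print(np.allclose(PB @ c, x))     # True: x = c1*b1 + c2*b2 + c3*b3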
The dimension of a nonzero subspace H, denoted by dim H, is the number of vectors in any basis
for H. The dimension of the zero subspace is defined to be 0.
The rank of a matrix A, denoted by rank(A), is the dimension of the column space of A.
If a matrix A has n columns, then rank(A)+dim Nul(A)=n.
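A quick numerical check of this rank theorem (illustrative matrix; a null-space basis is read off from the right singular vectors of the SVD):

    import numpy as np

    A = np.array([[1., 3., 0., 11.],
                  [0., 0., 1., 5.],
                  [0., 0., 0., 0.]])

    rank = np.linalg.matrix_rank(A)

    # Right singular vectors belonging to (numerically) zero singular values span Nul(A).
    U, s, Vh = np.linalg.svd(A)
    null_basis = Vh[np.sum(s > 1e-10):]              # each row is a null-space vector
    print(np.allclose(A @ null_basis.T, 0))          # True
    print(rank + null_basis.shape[0] == A.shape[1])  # True: rank(A) + dim Nul(A) = n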

Eigenvalues and Eigenvectors

An eigenvector of an n×n matrix A is a nonzero vector x such that Ax = λx for some scalar λ.
A scalar λ is called an eigenvalue of A if there is a nontrivial solution x of Ax = λx; such an x is called an eigenvector corresponding to λ.
The set of all solutions to (A − λI)x = 0 is called the eigenspace of A corresponding to λ.
If v1, …, vr are eigenvectors that correspond to distinct eigenvalues λ1, …, λr of an n×n matrix A, then {v1, …, vr} is a linearly independent set.
A scalar λ is an eigenvalue of A if and only if λ satisfies the characteristic equation det(A − λI) = 0.
Two n×n matrices A and B are similar if there exists an invertible n×n matrix P such that A = PBP⁻¹.
If A and B are similar n×n matrices, then A and B have the same characteristic polynomial and hence the same eigenvalues.
A square matrix A is diagonalisable if A is similar to a diagonal matrix D.
An n×n matrix A is diagonalisable if and only if A has n linearly independent eigenvectors.
In fact, A = PDP⁻¹, with D a diagonal matrix, if and only if the columns of P are n linearly independent eigenvectors of A. In this case, the diagonal entries of D are eigenvalues of A that correspond, respectively, to the eigenvectors in P.
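A short sketch of this factorisation on an illustrative 2×2 matrix, building P from the eigenvectors and checking AP = PD and A = PDP⁻¹:

    import numpy as np

    A = np.array([[4., 1.],
                  [2., 3.]])                        # illustrative diagonalisable matrix

    eigvals, P = np.linalg.eig(A)                   # columns of P are eigenvectors
    D = np.diag(eigvals)

    print(np.allclose(A @ P, P @ D))                # True: AP = PD
    print(np.allclose(A, P @ D @ np.linalg.inv(P))) # True: A = P D P^-1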
Orthogonality

Given a vector u = (u1, u2) in R2 we define its length (or norm) by ‖u‖ = √(u1² + u2²).
Properties: for a scalar a and vectors v, u
1. ‖v‖ ≥ 0, and ‖v‖ = 0 if and only if v = 0
2. ‖av‖ = |a| ‖v‖
3. ‖v + u‖ ≤ ‖v‖ + ‖u‖

Given u and v in Rn, the dot product between u and v is defined by
u·v = uᵀv = u1v1 + u2v2 + … + unvn.
Properties: let u, v and w be vectors in Rn, and let c be any scalar. Then
a. u·v = v·u
b. (u + v)·w = u·w + v·w
c. (cu)·v = c(u·v) = u·(cv)
d. u·u ≥ 0, and u·u = 0 if and only if u = 0.
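Together with the norm, the dot product gives the usual angle formula cos θ = (u·v)/(‖u‖ ‖v‖). A short numerical sketch with illustrative vectors:

    import numpy as np

    u = np.array([1., 2., 2.])                 # illustrative vectors
    v = np.array([4., 0., 2.])

    dot = u @ v                                # u.v = u1*v1 + u2*v2 + u3*v3
    norm_u = np.linalg.norm(u)                 # ||u|| = sqrt(u.u)
    norm_v = np.linalg.norm(v)

    cos_theta = dot / (norm_u * norm_v)
    angle_deg = np.degrees(np.arccos(cos_theta))
    print(dot, norm_u, norm_v, angle_deg)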


Two vectors u and v are said to be orthogonal (to each other) if u·v = 0.
If a vector z is orthogonal to every vector in a subspace W of Rn, then z is said to be orthogonal to W.
The set W⊥ of vectors z that are orthogonal to W is called the orthogonal complement of W.
Let A be an m×n matrix. Then (Row A)⊥ = Nul(A) and (Col A)⊥ = Nul(Aᵀ).
A set of vectors {u1, u2, …, up} in Rn is called an orthogonal set if ui·uj = 0 whenever i ≠ j.
Suppose S = {u1, u2, …, up} is an orthogonal set of nonzero vectors in Rn and W = Span{u1, u2, …, up}. Then S is a basis for W.
Let {u1, u2, …, up} be an orthogonal basis for a subspace W of Rn. Then each y in W has a unique representation as a linear combination of u1, u2, …, up. In fact, if y = c1u1 + c2u2 + … + cpup, then cj = (y·uj)/(uj·uj), j = 1, …, p.

A set of vectors {u1, u2, …, up} in Rn is called an orthonormal set if it is an orthogonal set of unit vectors. If W = Span{u1, u2, …, up}, then {u1, u2, …, up} is an orthonormal basis for W.
Let W be a subspace of Rn. Then each y in Rn can be uniquely represented in the form y = ŷ + z, where ŷ is in W and z is in W⊥. In fact, if {u1, u2, …, up} is any orthogonal basis of W, then
ŷ = (y·u1)/(u1·u1) u1 + (y·u2)/(u2·u2) u2 + … + (y·up)/(up·up) up   and   z = y − ŷ.
Let W be a subspace of Rn, y any vector in Rn, and ŷ the orthogonal projection of y onto W. Then ŷ is the point in W closest to y, in the sense that ‖y − ŷ‖ < ‖y − v‖ for all v in W distinct from ŷ.
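A sketch of this decomposition y = ŷ + z for an illustrative orthogonal basis {u1, u2} of a plane W in R3:

    import numpy as np

    u1 = np.array([1., 1., 0.])     # orthogonal basis of W (illustrative: u1.u2 = 0)
    u2 = np.array([1., -1., 2.])
    y = np.array([0., 1., 0.])

    # y_hat = (y.u1)/(u1.u1) u1 + (y.u2)/(u2.u2) u2 is the projection of y onto W.
    y_hat = (y @ u1) / (u1 @ u1) * u1 + (y @ u2) / (u2 @ u2) * u2
    z = y - y_hat

    print(y_hat)
    print(np.isclose(z @ u1, 0), np.isclose(z @ u2, 0))   # True True: z lies in W-perp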
(Gram–Schmidt orthogonalisation) Suppose that W is a subspace of Rn and let {x1, x2, …, xp} be a basis of W. Then {v1, v2, …, vp} is an orthogonal basis for W, where
v1 = x1
v2 = x2 − proj_{v1} x2
vi = xi − proj_{Span{v1, …, vi−1}} xi,   i = 3, …, p − 1
vp = xp − proj_{Span{v1, …, vp−1}} xp.
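A direct implementation of the process (a sketch; the input basis below is an illustrative choice):

    import numpy as np

    def gram_schmidt(X):
        """Orthogonal basis for the span of the (independent) columns of X."""
        V = []
        for x in X.T:
            # Subtract the projection of x onto the span of the vectors found so far.
            v = x - sum((x @ vi) / (vi @ vi) * vi for vi in V)
            V.append(v)
        return np.column_stack(V)

    X = np.column_stack([[1., 1., 1.], [1., 1., 0.], [1., 0., 1.]])
    V = gram_schmidt(X)
    print(np.round(V.T @ V, 10))    # off-diagonal entries are 0: the columns are orthogonal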

The set of least squares solutions of Ax = b is the set of all solutions of the normal equations
AᵀAx = Aᵀb.
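A sketch of fitting a line y = b0 + b1*x by solving the normal equations (the data points here are illustrative, not those of Question 9):

    import numpy as np

    x = np.array([0., 1., 2., 3.])              # illustrative data (x_i, y_i)
    y = np.array([1., 2., 2., 4.])

    A = np.column_stack([np.ones_like(x), x])   # design matrix with columns [1, x]
    b0, b1 = np.linalg.solve(A.T @ A, A.T @ y)  # normal equations: A^T A beta = A^T y
    print(b0, b1)                               # intercept and slope of the fitted line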

A matrix A is called symmetric if A = Aᵀ.
If A is a symmetric matrix, then any two eigenvectors from different eigenspaces are orthogonal.
A matrix A is said to be orthogonally diagonalisable if there is a diagonal matrix D and an
orthogonal matrix P such that A = PDPᵀ.

An n×n symmetric matrix A has the following properties:
(a) A has n real eigenvalues, counting multiplicities.
(b) The dimension of the eigenspace for each eigenvalue equals the multiplicity of the eigenvalue as a root of the characteristic equation.
(c) The eigenspaces are mutually orthogonal.
(d) A is orthogonally diagonalisable.
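A numerical sketch on an illustrative symmetric matrix; NumPy's eigh routine for symmetric matrices returns real eigenvalues and orthonormal eigenvectors, so the factorisation A = PDPᵀ can be checked directly:

    import numpy as np

    A = np.array([[2., 1., 0.],
                  [1., 2., 0.],
                  [0., 0., 3.]])               # illustrative symmetric matrix

    eigvals, P = np.linalg.eigh(A)             # columns of P are orthonormal eigenvectors
    D = np.diag(eigvals)

    print(np.allclose(P.T @ P, np.eye(3)))     # True: P is an orthogonal matrix
    print(np.allclose(A, P @ D @ P.T))         # True: A = P D P^T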

The Invertible Matrix Theorem

Let A be a square n×n matrix. Then the following statements are equivalent (i.e., for a given A,
they are either all true or all false).
a. A is an invertible matrix.
b. A is row equivalent to In.
c. A has n pivot positions.
d. The equation Ax = 0 has only the trivial solution.
e. The columns of A form a linearly independent set.
f. The linear transformation x ↦ Ax is one-to-one.
g. The equation Ax = b has at least one solution for each b in Rn.
h. The columns of A span Rn.
i. The linear transformation x ↦ Ax maps Rn onto Rn.
j. There is an n×n matrix C such that CA = In.
k. There is an n×n matrix D such that AD = In.
l. Aᵀ is an invertible matrix.
m. The columns of A form a basis for Rn.
n. Col A = Rn.
o. dim Col A = n.
p. rank A = n.
q. Nul A = {0}.
r. dim Nul A = 0.
s. The number 0 is not an eigenvalue of A.
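A small sketch checking a few of these equivalent conditions on one illustrative matrix:

    import numpy as np

    A = np.array([[2., 1.],
                  [1., 1.]])                             # illustrative square matrix
    n = A.shape[0]

    print(np.linalg.matrix_rank(A) == n)                 # (p) rank A = n
    print(np.all(np.abs(np.linalg.eigvals(A)) > 1e-12))  # (s) 0 is not an eigenvalue of A
    D = np.linalg.solve(A, np.eye(n))                    # solve AD = I_n
    print(np.allclose(A @ D, np.eye(n)))                 # (k) there is an n-by-n matrix D with AD = I_n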
