
Linear Algebra

Systems of linear equations


1. Matrix-vector product: $Av = v_1 a_1 + v_2 a_2 + \cdots + v_n a_n$, where $a_1, \ldots, a_n$ are the columns of A
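A quick numerical check of this column view of the product; NumPy is an assumed tool here (the notes name none), and the matrix and vector are made up for illustration.

```python
import numpy as np

# A 3x2 example; its columns are a_1 and a_2.
A = np.array([[1., 2.],
              [3., 4.],
              [5., 6.]])
v = np.array([7., 8.])

# Av equals the linear combination v_1*a_1 + v_2*a_2.
combo = v[0] * A[:, 0] + v[1] * A[:, 1]
assert np.allclose(A @ v, combo)
```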
2. Gaussian elimination:
   matrix $\xrightarrow{\text{forward pass}}$ row echelon form $\xrightarrow{\text{backward pass}}$ reduced row echelon form
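A minimal sketch of computing the reduced row echelon form; SymPy's rref is one convenient choice (an assumption, not something the notes prescribe), and the matrix is a made-up example.

```python
from sympy import Matrix

A = Matrix([[1, 2, -1,  3],
            [2, 4,  0,  2],
            [1, 2,  1, -1]])

# rref() performs the full forward and backward passes and also
# reports which columns contain pivots.
R, pivot_cols = A.rref()
print(R)           # reduced row echelon form
print(pivot_cols)  # (0, 2)
```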
Vector and matrix equations
1. Linear combination of vectors: $c_1 u_1 + c_2 u_2 + \cdots + c_k u_k$
2. Span of $S = \{u_1, u_2, \ldots, u_k\}$: the set of all linear combinations of $u_1, u_2, \ldots, u_k$
3. A set of vectors $\{u_1, u_2, \ldots, u_k\}$ is linearly independent if $c_1 u_1 + c_2 u_2 + \cdots + c_k u_k = \vec{0}$ implies $c_1 = c_2 = \cdots = c_k = 0$
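One practical test, sketched in NumPy (an assumed tool): k vectors are linearly independent exactly when the matrix having them as columns has rank k. The vectors below are made up, with u3 = u1 + u2 to force dependence.

```python
import numpy as np

u1 = np.array([1., 0., 1.])
u2 = np.array([0., 1., 1.])
u3 = u1 + u2                      # deliberately dependent

U = np.column_stack([u1, u2, u3])
independent = np.linalg.matrix_rank(U) == U.shape[1]
print(independent)                # False: rank 2 < 3 columns
```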
Matrix algebra
1. Matrix multiplication: if the product of an $m \times n$ matrix A and an $n \times p$ matrix B is the $m \times p$ matrix C, then the entries of C are
   $c_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}$
   In general, $AB \neq BA$.
   Property of the transpose: $(AC)^T = C^T A^T$
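A sketch spelling out the entry formula and the transpose property against NumPy's built-in product; the random matrices and their sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))   # m x n
B = rng.standard_normal((3, 4))   # n x p
m, n = A.shape
p = B.shape[1]

C = np.zeros((m, p))
for i in range(m):
    for j in range(p):
        # c_ij = sum over k of a_ik * b_kj
        C[i, j] = sum(A[i, k] * B[k, j] for k in range(n))

assert np.allclose(C, A @ B)
assert np.allclose((A @ B).T, B.T @ A.T)   # transpose property
```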
2. Linear correspondence property: if
R = rref(A), then the columns of R
and A have the same linear relations.
3. Matrix inversion: suppose there is a sequence of row operations P such that $P[A \; I_n] = [R \; B]$ and $R = \operatorname{rref}(A) = I_n$; then $B = A^{-1}$
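The same recipe carried out in SymPy (an assumed tool) on a small invertible example: row-reduce [A I_n] and read off the inverse on the right.

```python
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [5, 3]])

aug = A.row_join(eye(2))    # the augmented matrix [A  I_n]
R, _ = aug.rref()           # row operations give [rref(A)  B]

assert R[:, :2] == eye(2)   # rref(A) = I_n, so A is invertible
B = R[:, 2:]
assert B == A.inv()         # B = A^(-1)
```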
4. LU decomposition: if the elimination multipliers $l_{ik} = a_{ik}/a_{kk}$ are collected into L, and the same row operations reduce A to the echelon form U, then $A = LU$
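A minimal sketch of that multiplier bookkeeping, assuming no row interchanges are needed (nonzero pivots throughout); library routines such as scipy.linalg.lu add pivoting on top of this idea.

```python
import numpy as np

def lu_nopivot(A):
    # Doolittle elimination without pivoting; assumes nonzero pivots.
    U = A.astype(float).copy()
    n = U.shape[0]
    L = np.eye(n)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]    # l_ik = a_ik / a_kk
            U[i, :] -= L[i, k] * U[k, :]   # zero out below the pivot
    return L, U

A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
L, U = lu_nopivot(A)
assert np.allclose(A, L @ U)
```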
5. Linear transformation
   (a) if $T(u + v) = T(u) + T(v)$ and $T(cu) = cT(u)$, then T is linear.
   (b) if T is linear:
       $T(v) = Av$
       $T(\vec{0}) = \vec{0}$
       $T(-u) = -T(u)$
       $T(au + bv) = aT(u) + bT(v)$
   (c) Null $A^T$: solutions to $A^T v = \vec{0}$
   (d) $\dim(\operatorname{Null} A^T) = m - \operatorname{rank} A$
   (e) for an $m \times n$ matrix, $A = [T(e_1) \; T(e_2) \; \cdots \; T(e_n)]$ (see the rank table below)
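A sketch of (e): build A column by column by applying T to the standard basis vectors. The map T here (rotation by 90 degrees) is a made-up example, not one from the notes.

```python
import numpy as np

def T(v):
    # A hypothetical linear map: rotate (x, y) by 90 degrees.
    x, y = v
    return np.array([-y, x])

e1, e2 = np.eye(2)                    # standard basis of R^2
A = np.column_stack([T(e1), T(e2)])   # A = [T(e1) T(e2)]

v = np.array([3., 4.])
assert np.allclose(T(v), A @ v)       # T(v) = Av
```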
Determinants
1. Cofactor expansion: $\det A = a_{i1} c_{i1} + a_{i2} c_{i2} + \cdots + a_{in} c_{in}$, where $c_{ij} = (-1)^{i+j} \det A_{ij}$
2. $\det A = (-1)^r u_{11} u_{22} \cdots u_{nn}$, where $u_{11}, \ldots, u_{nn}$ are the diagonal entries of an echelon form U of A and r = # of row interchanges (no scaling operations)
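A direct (and exponentially slow, so purely illustrative) implementation of the cofactor expansion along the first row, checked against NumPy's determinant.

```python
import numpy as np

def det_cofactor(A):
    # Cofactor expansion along row 0; fine for tiny matrices only.
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        # A_0j: delete row 0 and column j; c_0j = (-1)^j * det(A_0j)
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
        total += (-1) ** j * A[0, j] * det_cofactor(minor)
    return total

A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
assert np.isclose(det_cofactor(A), np.linalg.det(A))
```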
Rank of A | # of solutions to Ax = b | columns of A                      | property of T
m         | at least one             | spanning set for R^m              | onto
n         | at most one              | linearly independent              | one-to-one
m = n     | unique solution          | linearly independent spanning set | invertible
Vector spaces
1. Subspaces: a set W is a subspace if
   $\vec{0} \in W$
   W is closed under addition
   W is closed under scalar multiplication
2. Null A: solution set of $Ax = \vec{0}$
3. Col A: span of the columns of A
4. Row A: subspace spanned by the rows of A
5. Basis and dimension
   Basis: a linearly independent spanning set.
   Dimension: number of vectors in a basis.
   To show B is a basis for a subspace V:
   (a) show B is contained in V
   (b) show B is linearly independent
   (c) compute the dimension of V and confirm that the number of vectors in B equals dim V.
   The nonzero rows of rref(A) form a basis for Row A.
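A sketch of the last fact in SymPy (an assumed tool): the nonzero rows of rref(A) read off a basis for Row A. The example matrix is made up; its second row is twice the first.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],    # = 2 * (row 1), so the rank is 2
            [1, 1, 1]])

R, _ = A.rref()
row_basis = [R.row(i) for i in range(R.rows) if any(R.row(i))]
print(row_basis)   # two nonzero rows: a basis for Row A
```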
subspace | containing space | dimension
Null A   | R^n              | nullity A = n - rank A
Row A    | R^n              | rank A
Col A    | R^m              | rank A
Eigenvalues and eigenvectors
$Av = \lambda v \iff (A - \lambda I_n)v = \vec{0}$
where the scalars $\lambda$ are the eigenvalues and the vectors v are the eigenvectors of A.
1. Characteristic polynomial: $\det(A - \lambda I_n) = 0$
2. Diagonalization of matrices: $A = PDP^{-1}$, where $P = [v_1 \; v_2 \; \cdots \; v_n]$ and $D = \operatorname{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n)$
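A numerical sketch with NumPy's eigendecomposition (the 2x2 matrix is made up): the eigenvector matrix P and the diagonal D reassemble A.

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors
D = np.diag(eigvals)            # D = diag(lambda_1, lambda_2)

assert np.allclose(A, P @ D @ np.linalg.inv(P))   # A = P D P^(-1)
```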
Orthogonality and least squares
1. $\vec{u} \perp \vec{v}$ if $\vec{u} \cdot \vec{v} = 0$
2. Cauchy-Schwarz inequality: $|\vec{u} \cdot \vec{v}| \le |\vec{u}|\,|\vec{v}|$
3. Triangle inequality: $|\vec{u} + \vec{v}| \le |\vec{u}| + |\vec{v}|$
4. Gram-Schmidt process:
   $v_1 = u_1$
   $v_i = u_i - \frac{u_i \cdot v_1}{|v_1|^2} v_1 - \frac{u_i \cdot v_2}{|v_2|^2} v_2 - \cdots - \frac{u_i \cdot v_{i-1}}{|v_{i-1}|^2} v_{i-1}$
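A direct transcription of the formula into NumPy. This is classical Gram-Schmidt exactly as written above; in floating point, modified Gram-Schmidt or a QR routine is usually preferred.

```python
import numpy as np

def gram_schmidt(U):
    # Turn the columns of U into pairwise-orthogonal columns v_i.
    V = []
    for u in U.T:                       # iterate over columns
        v = u.astype(float).copy()
        for w in V:
            v -= (u @ w) / (w @ w) * w  # subtract projection onto w
        V.append(v)
    return np.column_stack(V)

U = np.array([[1., 1., 0.],
              [1., 0., 1.],
              [0., 1., 1.]])
V = gram_schmidt(U)
G = V.T @ V
assert np.allclose(G, np.diag(np.diag(G)))   # off-diagonals vanish
```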
5. QR factorization: given $A = [u_1 \; u_2 \; \cdots \; u_n]$
   Gram-Schmidt gives the $v_i$; normalizing gives the $e_i$
   $Q = [e_1 \; e_2 \; \cdots \; e_n]$
   $R = Q^{-1} A = Q^T A$
   To minimize $|\vec{b} - Ax|$: from $Ax = \vec{b}$ and $QRx = \vec{b}$, solve $Rx = Q^{-1} \vec{b} = Q^T \vec{b}$ for x.
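The same pipeline via library routines, with np.linalg.qr plus a triangular solve from SciPy (both tool choices are assumptions); the overdetermined system is made up.

```python
import numpy as np
from scipy.linalg import solve_triangular

A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])
b = np.array([1., 2., 2.])

Q, R = np.linalg.qr(A)             # reduced QR: A = QR
x = solve_triangular(R, Q.T @ b)   # back-substitute R x = Q^T b

# agrees with the general least-squares solver
expected, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x, expected)
```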
6. Orthogonal projection of $\vec{b}$ onto a subspace W:
   $w = A(A^T A)^{-1} A^T \vec{b}$, where the columns of A form a basis for W
   $w = V V^T \vec{b}$, where the columns of V form an orthonormal basis for W
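Both projection formulas checked against each other in NumPy, projecting onto the column space of a made-up A; the orthonormal V comes from a reduced QR of A.

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])
b = np.array([6., 0., 0.])

# w = A (A^T A)^(-1) A^T b, solving rather than inverting
w1 = A @ np.linalg.solve(A.T @ A, A.T @ b)

# w = V V^T b, with orthonormal columns spanning Col A
V, _ = np.linalg.qr(A)
w2 = V @ (V.T @ b)

assert np.allclose(w1, w2)
assert np.allclose(A.T @ (b - w1), 0)   # residual is orthogonal to W
```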
7. Least squares
   $Ax = \vec{b}$
   $A^T A x = A^T \vec{b}$
   $x = (A^T A)^{-1} A^T \vec{b}$
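A sketch of the normal equations on a made-up overdetermined system; solving $A^T A x = A^T \vec{b}$ directly is preferred over forming the inverse.

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])
b = np.array([1., 2., 2.])

# x = (A^T A)^(-1) A^T b, computed by solving the normal equations
x = np.linalg.solve(A.T @ A, A.T @ b)
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```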
Symmetric matrices
1. Singular value decomposition: $A = U \Sigma V^T$, where $\Sigma$ is the $m \times n$ matrix with the singular values $s_1, s_2, \ldots$ on its diagonal and zeros elsewhere
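A quick check with NumPy's SVD on a made-up 2x3 matrix; full_matrices=False trims U and V^T so that Sigma can be kept square for the reconstruction.

```python
import numpy as np

A = np.array([[ 3., 1., 1.],
              [-1., 3., 1.]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
Sigma = np.diag(s)                      # singular values on the diagonal
assert np.allclose(A, U @ Sigma @ Vt)   # A = U Sigma V^T
```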
2. A is symmetric positive definite if $A = B^T B$ for a nonsingular matrix B.
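A sketch of the criterion in NumPy: build $A = B^T B$ from a made-up nonsingular B and confirm positive definiteness; a successful Cholesky factorization is the standard practical test.

```python
import numpy as np

B = np.array([[2., 1.],
              [0., 3.]])    # nonsingular: det = 6
A = B.T @ B                 # symmetric positive definite

assert np.allclose(A, A.T)
assert np.all(np.linalg.eigvalsh(A) > 0)   # all eigenvalues positive
np.linalg.cholesky(A)       # succeeds only for positive definite A
```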
