
Linear Algebra cheat sheet

Vectors
dot product: u · v = ||u|| ||v|| cos(θ) = u_x v_x + u_y v_y
cross product: u × v = (u_y v_z − u_z v_y, u_z v_x − u_x v_z, u_x v_y − u_y v_x)^T
norms:
||x||_p := (Σ_{i=1}^n |x_i|^p)^{1/p}
||x||_1 := Σ_{i=1}^n |x_i|
||x||_∞ := max_i |x_i|
enclosed angle:
cos(θ) = (u · v) / (||u|| ||v||)
||u|| ||v|| = sqrt((u_x^2 + u_y^2)(v_x^2 + v_y^2))
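The vector formulas above can be checked numerically; a minimal numpy sketch (the vectors u and v are illustrative examples):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 4.0])

dot = u @ v                      # u·v = 1*3 + 2*0 + 2*4 = 11
cross = np.cross(u, v)           # (u_y v_z - u_z v_y, u_z v_x - u_x v_z, u_x v_y - u_y v_x)

# norms of u: ||u||_2 = 3, ||u||_1 = 5, ||u||_inf = 2
norm2 = np.linalg.norm(u)
norm1 = np.linalg.norm(u, 1)
norminf = np.linalg.norm(u, np.inf)

# enclosed angle: cos(theta) = u·v / (||u|| ||v||)
cos_theta = dot / (np.linalg.norm(u) * np.linalg.norm(v))
```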
Matrices
basic operations
transpose: [A^T]_ij = [A]_ji: mirror over main diagonal
conjugate transpose / adjoint: A^* = (Ā)^T = conj(A^T):
transpose and complex-conjugate all entries
(same as transpose for real matrices)
multiply: A_{n×m} B_{m×k} = C_{n×k} (inner dimensions must match)
invert:
[a b; c d]^{-1} = 1/det(A) · [d −b; −c a] = 1/(ad − bc) · [d −b; −c a]
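The closed-form 2×2 inverse can be verified against numpy's general inverse (the entries of A are an arbitrary illustrative choice):

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
a, b, c, d = A.ravel()
det = a * d - b * c              # ad - bc = 24 - 14 = 10

# closed-form 2x2 inverse: 1/det * [d -b; -c a]
A_inv = (1.0 / det) * np.array([[d, -b],
                                [-c, a]])
```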
norm:
||A||_p = max_{x≠0} ||Ax||_p / ||x||_p, induced by vector p-norm
||A||_2 = sqrt(λ_max(A^T A))
||A||_1 = max_j Σ_{i=1}^m |a_ij| (max column sum)
||A||_∞ = max_i Σ_{j=1}^n |a_ij| (max row sum)
condition: cond(A) = ||A|| ||A^{-1}||
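A quick sketch of the matrix-norm formulas in numpy (the example matrix is illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# induced norms: ||A||_1 = max column sum, ||A||_inf = max row sum
norm1 = np.abs(A).sum(axis=0).max()       # columns sum to 4 and 6 -> 6
norminf = np.abs(A).sum(axis=1).max()     # rows sum to 3 and 7 -> 7

# ||A||_2 = sqrt(lambda_max(A^T A)), the largest singular value
norm2 = np.sqrt(np.linalg.eigvalsh(A.T @ A).max())

# condition number in the 2-norm: ||A||_2 * ||A^{-1}||_2
cond2 = np.linalg.cond(A)
```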
determinants
det(A) = Σ_{σ ∈ S_n} sgn(σ) Π_{i=1}^n A_{i,σ(i)}
For 3×3 matrices: Sarrus' rule (sum the three down-right diagonal
products, subtract the three up-right diagonal products)
arithmetic rules:
det(A · B) = det(A) · det(B)
det(A^{-1}) = det(A)^{-1}
det(rA) = r^n det(A), for all A ∈ R^{n×n} and scalars r
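The three determinant rules can be sanity-checked on random matrices (sizes and the scalar r are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
r = 2.5

# det(A B) = det(A) det(B)
prod_rule = np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
# det(A^{-1}) = det(A)^{-1}
inv_rule = np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0 / np.linalg.det(A))
# det(rA) = r^n det(A) with n = 3
scal_rule = np.isclose(np.linalg.det(r * A), r**3 * np.linalg.det(A))
```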
eigenvalues, eigenvectors, eigenspace
1. Calculate eigenvalues λ by solving det(A − λI) = 0
2. Any vector x ≠ 0 that satisfies (A − λ_i I) x = 0 is an eigenvector for λ_i.
3. Eig_A(λ_i) = {x ∈ C^n : (A − λ_i I)x = 0} is the eigenspace for λ_i.
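The steps above are what numpy's eigensolver performs internally; a minimal sketch (the matrix is a trivial illustrative example):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])     # illustrative: eigenvalues 2 and 3

# eigenvalues and (column) eigenvectors in one call
eigvals, eigvecs = np.linalg.eig(A)

# each pair satisfies (A - lambda_i I) x_i = 0, i.e. A x_i = lambda_i x_i
residuals = [A @ eigvecs[:, i] - eigvals[i] * eigvecs[:, i]
             for i in range(len(eigvals))]
```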
definiteness
defined on n×n square matrices via the eigenvalues λ(A):
all λ > 0: positive-definite
all λ ≥ 0: positive-semidefinite
all λ < 0: negative-definite
all λ ≤ 0: negative-semidefinite
if none is true (positive and negative eigenvalues exist): indefinite
equivalent: e.g. x^T A x > 0 for all x ≠ 0 ⟺ positive-definite
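The eigenvalue-sign classification above translates directly into code; a sketch for symmetric matrices (function name and tolerance are my own choices):

```python
import numpy as np

def definiteness(A, tol=1e-12):
    """Classify a symmetric matrix by the signs of its eigenvalues."""
    lam = np.linalg.eigvalsh(A)          # real eigenvalues, ascending
    if np.all(lam > tol):
        return "positive-definite"
    if np.all(lam >= -tol):
        return "positive-semidefinite"
    if np.all(lam < -tol):
        return "negative-definite"
    if np.all(lam <= tol):
        return "negative-semidefinite"
    return "indefinite"                  # positive and negative eigenvalues exist
```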
rank
Let A be a matrix and f(x) = Ax. Then:
rank(A) = rank(f) = dim(im(f))
= number of linearly independent column vectors of A
= number of non-zero rows in A after Gaussian elimination
kernel
kern(A) = {x ∈ R^n : Ax = 0} (the set of vectors mapping to 0)
For nonsingular A the kernel contains only the zero vector, so dim(kern(A)) = 0.
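Rank and kernel can both be computed numerically; a sketch using the SVD for the null space (the rank-1 example matrix is illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # row 2 = 2 * row 1  ->  rank 1
              [0.0, 0.0, 0.0]])

rank = np.linalg.matrix_rank(A)

# kernel via SVD: right-singular vectors belonging to zero singular values
_, s, Vt = np.linalg.svd(A)
null_space = Vt[rank:].T          # columns span kern(A); dim = n - rank
```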
trace
defined on n×n square matrices: tr(A) = a_11 + a_22 + ··· + a_nn
(sum of the elements on the main diagonal)
span
Let v_1, ..., v_r be the column vectors of A. Then:
span(A) = {α_1 v_1 + ··· + α_r v_r | α_1, ..., α_r ∈ R}
spectrum
σ(A) = {λ ∈ C : λ is an eigenvalue of A}
properties
square: n×n
symmetric: A = A^T
diagonal: all entries 0 except a_kk
implies triangular (eigenvalues on main diagonal)
orthogonal
A^T = A^{-1}
normal and diagonalizable
unitary
Complex analogue to orthogonal: a complex square matrix is unitary
if all column vectors are orthonormal
diagonalizable
cond_2(A) = 1
|det(A)| = 1
nonsingular
A ∈ R^{n×n} is nonsingular = invertible = regular iff:
There is a matrix B := A^{-1} such that AB = I = BA
det(A) ≠ 0
Ax = b has exactly one solution for each b
The column vectors of A are linearly independent
rank(A) = n
f(x) = Ax is bijective
det(A)^{-1} = det(A^{-1})
(A^{-1})^{-1} = A
(A^T)^{-1} = (A^{-1})^T
diagonalizable
A ∈ R^{n×n} can be diagonalized iff:
it has n linearly independent eigenvectors
(sufficient, but not necessary: all eigenvalues are real and distinct)
there is an invertible T such that:
D := T^{-1} A T = diag(λ_1, ..., λ_n)
A = T D T^{-1} and AT = TD
λ_1, ..., λ_n are the eigenvalues of A!
T can be created with eigenvectors of A as columns and is nonsingular!
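The decomposition D = T^{-1} A T can be sketched in a few lines (the example matrix, with distinct eigenvalues 5 and 2, is illustrative):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])       # distinct eigenvalues 5 and 2 -> diagonalizable

eigvals, T = np.linalg.eig(A)    # columns of T are eigenvectors of A
D = np.linalg.inv(T) @ A @ T     # D = T^{-1} A T, diagonal with eigenvalues
```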
diagonally dominant matrix
∀i: |a_ii| ≥ Σ_{j≠i} |a_ij|
strictly diagonally dominant (with >) ⇒ nonsingular
Hermitian
A square matrix A where A^* = A (equal to its conjugate transpose)
A real matrix is Hermitian iff symmetric
Im(det(A)) = 0 (the determinant is real)
triangular
A square matrix is right (upper) triangular (wlog n = 3):
[ a_11 a_12 a_13 ]
[ 0    a_22 a_23 ]
[ 0    0    a_33 ]
Eigenvalues on main diagonal
idempotent
A square matrix A for which AA = A.
block matrices
Let B, C be submatrices, and A, D square submatrices. Then:
det [A 0; C D] = det [A B; 0 D] = det(A) det(D)
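The block-triangular determinant identity is easy to check on random blocks (block sizes are an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 2))
D = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 2))

# block lower-triangular matrix [A 0; C D]
M = np.block([[A, np.zeros((2, 3))],
              [C, D]])

det_block = np.linalg.det(M)
det_prod = np.linalg.det(A) * np.linalg.det(D)
```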
minors
The minor M_{i,j} of A is the determinant of A with row i and column j removed
leading principal minors: {det(upper left i×i submatrix of A) : i = 1..n}
Sylvester's criterion for Hermitian A:
A is positive-definite iff all leading principal minors are positive
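Sylvester's criterion translates into a short check (function name is my own; determinants of growing upper-left blocks):

```python
import numpy as np

def sylvester_positive_definite(A):
    """Check positive definiteness of a Hermitian A via leading principal minors."""
    n = A.shape[0]
    # all leading principal minors det(A[:k, :k]) must be positive
    return all(np.linalg.det(A[:k, :k]) > 0 for k in range(1, n + 1))
```

For example, [[2, 1], [1, 2]] has leading principal minors 2 and 3, so it is positive-definite; [[1, 2], [2, 1]] has minors 1 and −3, so it is not.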
