
Matrix Representation

Matrix representation: same basics as introduced already, a convenient method of working with vectors.

Superposition: a complete set of vectors can be used to express any other vector.

A complete set of N vectors can form other complete sets of N vectors.

For a Hermitian operator, a set of vectors can be found satisfying

    A|u⟩ = λ|u⟩    eigenvectors and eigenvalues.

Matrix method: find the superposition of basis states that are eigenstates of a particular operator, and obtain the eigenvalues.
Copyright Michael D. Fayer, 2009


Orthonormal basis set in an N-dimensional vector space

    |e_j⟩    basis vectors

Any N-dimensional vector can be written as

    |x⟩ = Σ_{j=1}^{N} x_j |e_j⟩    with    x_j = ⟨e_j|x⟩.

To get this, project out |e_j⟩⟨e_j| from |x⟩:

    x_j |e_j⟩ = |e_j⟩⟨e_j|x⟩    is the piece of |x⟩ that is |e_j⟩;

then sum over all |e_j⟩.
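The projection rule x_j = ⟨e_j|x⟩ can be checked numerically. A minimal sketch (the basis and vector here are arbitrary illustrations, not from the text):

```python
import numpy as np

# Orthonormal basis of a 3-dimensional real vector space
basis = [np.array([1.0, 0.0, 0.0]),
         np.array([0.0, 1.0, 0.0]),
         np.array([0.0, 0.0, 1.0])]

x = np.array([2.0, -1.0, 3.0])

# Components x_j = <e_j|x>, obtained by projection
coeffs = [e @ x for e in basis]

# Summing x_j |e_j> over all basis vectors reconstructs |x>
x_rebuilt = sum(c * e for c, e in zip(coeffs, basis))
```

The same projection works in any orthonormal basis, not just the Cartesian one used here.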


Operator equation

    |y⟩ = A|x⟩

Substituting the series in terms of basis vectors,

    Σ_{j=1}^{N} y_j |e_j⟩ = A Σ_{j=1}^{N} x_j |e_j⟩ = Σ_{j=1}^{N} x_j A|e_j⟩.

Left-multiplying by ⟨e_i|,

    y_i = Σ_{j=1}^{N} ⟨e_i|A|e_j⟩ x_j.

The N² scalar products

    ⟨e_i|A|e_j⟩    (N values of j for each of the N values of i)

are completely determined by A and the basis set {|e_j⟩}.


Writing

    a_ij = ⟨e_i|A|e_j⟩    matrix elements of A in the basis {|e_j⟩}

gives for the linear transformation

    y_i = Σ_{j=1}^{N} a_ij x_j,    i = 1, 2, … N.

We know the a_ij because we know A and the |e_j⟩.

In terms of the vector representatives:

    |x⟩ → (x_1, x_2, … x_N)        |y⟩ → (y_1, y_2, … y_N)

Example:  Q = 7x + 5y + 4z is a vector; its vector representative is

    ( 7 )
    ( 5 )
    ( 4 )

A vector representative is a set of numbers: it gives you the vector when the basis is known, so to use it you must know the basis.

The set of N linear algebraic equations can be written as

    y = A x

(a double underline in the original notation means matrix).
A is the array of coefficients, the matrix:

    A = (a_ij) = ( a_11  a_12  …  a_1N )
                 ( a_21  a_22  …  a_2N )
                 (  ⋮               ⋮  )
                 ( a_N1  a_N2  …  a_NN )

The a_ij are the elements of the matrix A.

    y = A x    vector representatives in a particular basis

    ( y_1 )   ( a_11  a_12  …  a_1N ) ( x_1 )
    ( y_2 ) = ( a_21  a_22  …  a_2N ) ( x_2 )
    (  ⋮  )   (  ⋮               ⋮  ) (  ⋮  )
    ( y_N )   ( a_N1  a_N2  …  a_NN ) ( x_N )

The product of the matrix A and the vector representative x is a new vector representative y with components

    y_i = Σ_{j=1}^{N} a_ij x_j.
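The matrix-vector product above can be sketched numerically; the 3×3 matrix and vector are made-up illustrations:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0]])
x = np.array([1.0, 1.0, 2.0])

# Matrix times vector representative: y = A x
y = A @ x

# The same thing written out component by component, y_i = sum_j a_ij x_j
y_manual = np.array([sum(A[i, j] * x[j] for j in range(3)) for i in range(3)])
```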
Matrix Properties, Definitions, and Rules

Two matrices A and B are equal, A = B, if a_ij = b_ij.

The unit matrix (ones down the principal diagonal):

    1 = (δ_ij) = ( 1  0  …  0 )
                 ( 0  1  …  0 )
                 ( ⋮         ⋮ )
                 ( 0  0  …  1 )

The zero matrix:

    0 = ( 0  0  …  0 )
        ( 0  0  …  0 )
        ( ⋮         ⋮ )
        ( 0  0  …  0 )

The unit matrix gives the identity transformation

    y_i = Σ_{j=1}^{N} δ_ij x_j = x_i,    corresponding to    y = 1 x = x,

and the zero matrix gives    0 x = 0.
Matrix multiplication

Consider the operator equations

    |y⟩ = A|x⟩        |z⟩ = B|y⟩        so        |z⟩ = BA|x⟩.

Using the same basis for both transformations,

    z_k = Σ_{i=1}^{N} b_ki y_i        z = B y,  B the matrix (b_ki)

    z = B y = B A x = C x.

C = BA has elements

    c_kj = Σ_{i=1}^{N} b_ki a_ij        law of matrix multiplication.

Example:

    ( 2  3 ) ( 7  5 )   ( 29  28 )
    ( 3  4 ) ( 5  6 ) = ( 41  39 )
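The 2×2 product worked in the example can be verified directly:

```python
import numpy as np

B = np.array([[2, 3],
              [3, 4]])
A = np.array([[7, 5],
              [5, 6]])

# c_kj = sum_i b_ki a_ij, the law of matrix multiplication
C = B @ A
```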
Multiplication is associative:

    (A B) C = A (B C).

Multiplication is NOT commutative (in general):

    A B ≠ B A.

Matrix addition and multiplication by a complex number:

    C = A + B        c_ij = a_ij + b_ij.

Inverse of a matrix A:

    A⁻¹ = inverse of A        A A⁻¹ = A⁻¹ A = 1    (identity matrix)

    A⁻¹ = C_T / |A|

where C_T is the transpose of the cofactor matrix (the matrix of signed minors) and |A| is the determinant.

If |A| = 0, A is singular (it has no inverse).
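A small numerical sketch of the inverse and the singular case (the matrices are illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])          # det = 1, nonsingular

A_inv = np.linalg.inv(A)

# A A^-1 = A^-1 A = 1
check1 = A @ A_inv
check2 = A_inv @ A

S = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # rows proportional: det = 0, singular
det_S = np.linalg.det(S)
```

`np.linalg.inv(S)` would raise `LinAlgError`, consistent with |A| = 0 meaning no inverse exists.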
Reciprocal of a product:

    (A B)⁻¹ = B⁻¹ A⁻¹.

For a matrix defined as A = (a_ij):

Transpose

    Ã = (a_ji)    interchange rows and columns

Complex conjugate

    A* = (a_ij*)    complex conjugate of each element

Hermitian conjugate

    A† = (a_ji*)    complex conjugate transpose


Rules

    (A B)~ = B̃ Ã        transpose of a product is the product of the transposes in reverse order

    |Ã| = |A|        determinant of the transpose is the determinant

    (A B)* = A* B*        complex conjugate of a product is the product of the complex conjugates

    |A*| = |A|*        determinant of the complex conjugate is the complex conjugate of the determinant

    (A B)† = B† A†        Hermitian conjugate of a product is the product of the Hermitian conjugates in reverse order

    |A†| = |A|*        determinant of the Hermitian conjugate is the complex conjugate of the determinant


Definitions

    Ã = A        symmetric
    A† = A       Hermitian
    A* = A       real
    A* = −A      imaginary
    A† = A⁻¹     unitary
    a_ij = a_ij δ_ij    diagonal

Powers of a matrix:

    A⁰ = 1,    A¹ = A,    A² = A A,  …

    e^A = 1 + A + A²/2! + A³/3! + …
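The matrix exponential series above can be summed numerically. A sketch, using the made-up matrix A = θJ with J = ((0, 1), (−1, 0)), whose exponential is known in closed form (a rotation):

```python
import numpy as np
from math import factorial

theta = 0.5
A = theta * np.array([[0.0, 1.0],
                      [-1.0, 0.0]])

# Partial sums of e^A = 1 + A + A^2/2! + ...
expA = np.zeros_like(A)
power = np.eye(2)                 # holds A^n
for n in range(20):
    expA = expA + power / factorial(n)
    power = power @ A
```

For this A, the series converges to ((cos θ, sin θ), (−sin θ, cos θ)).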
Column vector representative: a one-column matrix

    x = ( x_1 )
        ( x_2 )
        (  ⋮  )
        ( x_N )

the vector representative in a particular basis. Then |y⟩ = A|x⟩ becomes

    ( y_1 )   ( a_11  a_12  …  a_1N ) ( x_1 )
    ( y_2 ) = ( a_21  a_22  …  a_2N ) ( x_2 )
    (  ⋮  )   (  ⋮               ⋮  ) (  ⋮  )
    ( y_N )   ( a_N1  a_N2  …  a_NN ) ( x_N )

Row vector: the transpose of the column vector,

    x̃ = (x_1, x_2, … x_N).

    y = A x    ⟹    ỹ = x̃ Ã        transpose
    y = A x    ⟹    y† = x† A†      Hermitian conjugate
Change of Basis

Orthonormal basis {|e_i⟩}:

    ⟨e_i|e_j⟩ = δ_ij,    i, j = 1, 2, … N.

Superpositions of the |e_i⟩ can form N new linearly independent vectors, a new basis {|e_i′⟩}:

    |e_i′⟩ = Σ_{k=1}^{N} u_ik |e_k⟩,    i = 1, 2, … N,

where the u_ik are complex numbers.


New Basis is Orthonormal

The new basis satisfies

    ⟨e_j′|e_i′⟩ = δ_ij

if the matrix U = (u_ik) of coefficients in the superposition

    |e_i′⟩ = Σ_{k=1}^{N} u_ik |e_k⟩,    i = 1, 2, … N,

meets the condition

    U† = U⁻¹        U U† = U† U = 1        (U is unitary).

Important result: the new basis {|e_i′⟩} will be orthonormal if U, the transformation matrix, is unitary (see book and Errata and Addenda).


A unitary transformation substitutes the orthonormal basis {|e′⟩} for the orthonormal basis {|e⟩}.

Vector |x⟩:

    |x⟩ = Σ_i x_i |e_i⟩        a vector, a line in space (may be a high-dimensional abstract space)

    |x⟩ = Σ_i x_i′ |e_i′⟩      the same vector written in terms of the two basis sets.

Same vector, different basis.

The unitary transformation U can be used to change a vector representative of |x⟩ in one orthonormal basis set to its vector representative in another orthonormal basis set:

    x     vector representative in the unprimed basis
    x′    vector representative in the primed basis

    x′ = U x        change from unprimed to primed basis
    x  = U† x′      change from primed to unprimed basis
Example

Consider the basis (|x⟩, |y⟩, |z⟩).

Vector |s⟩: a line in real space. In terms of the basis,

    |s⟩ = 7|x⟩ + 7|y⟩ + 1|z⟩.

Its vector representative in the basis (|x⟩, |y⟩, |z⟩) is

    s = ( 7 )
        ( 7 )
        ( 1 )
Change basis by rotating the axis system 45° around z.

The new representative of |s⟩, s′, is found from

    s′ = U s

where U is the rotation matrix

    U = (  cos θ   sin θ   0 )
        ( −sin θ   cos θ   0 )
        (    0       0     1 )

For a 45° rotation around z:

    U = (  √2/2   √2/2   0 )
        ( −√2/2   √2/2   0 )
        (    0      0    1 )
Then

    s′ = (  √2/2   √2/2   0 ) ( 7 )   ( 7√2 )
         ( −√2/2   √2/2   0 ) ( 7 ) = (  0  )
         (    0      0    1 ) ( 1 )   (  1  )

    s′ = ( 7√2 )
         (  0  )        vector representative of |s⟩ in the basis {|e′⟩}
         (  1  )

Same vector, but in the new basis. Its properties are unchanged.

Example: the length of the vector |s⟩,

    ⟨s|s⟩^(1/2) = (s† s)^(1/2) = (49 + 49 + 1)^(1/2) = (99)^(1/2)

    ⟨s′|s′⟩^(1/2) = (s′† s′)^(1/2) = (2·49 + 0 + 1)^(1/2) = (99)^(1/2)
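The rotation example can be reproduced numerically, including the check that the length is invariant:

```python
import numpy as np

theta = np.pi / 4                    # 45 degree rotation around z
U = np.array([[ np.cos(theta), np.sin(theta), 0.0],
              [-np.sin(theta), np.cos(theta), 0.0],
              [ 0.0,           0.0,           1.0]])

s = np.array([7.0, 7.0, 1.0])        # representative of |s> in (x, y, z)
s_prime = U @ s                      # representative in the rotated basis

# U is unitary (real orthogonal here), so <s|s> is unchanged
len_old = np.sqrt(s @ s)
len_new = np.sqrt(s_prime @ s_prime)
```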
Can go back and forth between representatives of a vector |x⟩ by

    x′ = U x        change from unprimed to primed basis
    x  = U† x′      change from primed to unprimed basis

the components of |x⟩ in the different bases.


Consider the linear transformation

    |y⟩ = A|x⟩        operator equation.

In the basis {|e⟩} we can write

    y = A x        or        y_i = Σ_j a_ij x_j.

Change to the new orthonormal basis {|e′⟩} using U:

    y′ = U y = U A x = U A U† x′

or

    y′ = A′ x′

with the matrix A′ given by

    A′ = U A U†.

Because U is unitary,

    A′ = U A U⁻¹.


Extremely Important

    A′ = U A U⁻¹

This changes the matrix representing an operator in one orthonormal basis into the equivalent matrix in a different orthonormal basis.

It is called a Similarity Transformation.
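A numerical sketch of a similarity transformation, reusing the 45° rotation as the unitary U; the matrix A is an arbitrary illustration:

```python
import numpy as np

theta = np.pi / 4
U = np.array([[ np.cos(theta), np.sin(theta), 0.0],
              [-np.sin(theta), np.cos(theta), 0.0],
              [ 0.0,           0.0,           1.0]])

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0],
              [1.0, 0.0, 1.0]])

# A' = U A U^dagger = U A U^-1 (U unitary)
A_prime = U @ A @ U.conj().T

# Relations are unchanged by the change of basis: (A A)' = A' A'
lhs = U @ (A @ A) @ U.conj().T
rhs = A_prime @ A_prime
```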


    y = A x        A B = C        A + B = C        in the basis {|e⟩}.

Going into the basis {|e′⟩},

    y′ = A′ x′        A′ B′ = C′        A′ + B′ = C′.

Relations are unchanged by a change of basis.

Example:  A B = C.

    U A B U⁻¹ = U C U⁻¹

Can insert U⁻¹ U between A and B because U⁻¹ U = U U⁻¹ = 1:

    U A U⁻¹ U B U⁻¹ = U C U⁻¹.

Therefore

    A′ B′ = C′.
There is an isomorphism between operators in the abstract vector space and their matrix representatives.

Because of the isomorphism, it is not necessary to distinguish abstract vectors and operators from their matrix representatives.

The matrices (for operators) and the representatives (for vectors) can be used in place of the real things.


Hermitian Operators and Matrices

Hermitian operator:

    ⟨x|A|y⟩ = ⟨y|A|x⟩*

Hermitian operator ⟹ Hermitian matrix:

    A† = A

(† means complex conjugate transpose, the Hermitian conjugate).


Theorem (proof: Powell and Craseman, p. 303-307, or a linear algebra book)

For a Hermitian operator A in a linear vector space of N dimensions, there exists an orthonormal basis

    |U_1⟩, |U_2⟩, … |U_N⟩

relative to which A is represented by a diagonal matrix

    A = ( λ_1   0   …   0  )
        (  0   λ_2  …   0  )
        (  ⋮            ⋮  )
        (  0    0   …  λ_N )

The vectors |U_i⟩ and the corresponding real numbers λ_i are the solutions of the eigenvalue equation

    A|U⟩ = λ|U⟩

and there are no others.


Application of the Theorem

The operator A is represented by the matrix A in some basis {|e_i⟩}. The basis is any convenient basis; in general, the matrix will not be diagonal.

There exists some new basis, the eigenvectors {|U_i⟩}, in which A represents the operator and is diagonal, with the eigenvalues on the diagonal.

To get from {|e_i⟩} to {|U_i⟩}: a unitary transformation,

    |U_i⟩ = U|e_i⟩.

    A′ = U A U⁻¹

The similarity transformation takes the matrix in an arbitrary basis into the diagonal matrix with the eigenvalues on the diagonal.
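The theorem can be illustrated numerically: diagonalize a Hermitian matrix and check that the similarity transformation into the eigenvector basis is diagonal with real eigenvalues. The 2×2 matrix is an arbitrary illustration:

```python
import numpy as np

# A Hermitian matrix in an arbitrary (non-eigen) basis: A^dagger = A
A = np.array([[2.0 + 0.0j, 1.0 + 1.0j],
              [1.0 - 1.0j, 3.0 + 0.0j]])

# eigh is for Hermitian matrices: real eigenvalues, orthonormal
# eigenvector representatives as the columns of V
eigvals, V = np.linalg.eigh(A)

# Similarity transformation into the eigenvector basis
A_diag = V.conj().T @ A @ V
```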
Matrices and Q.M.

Previously the state of a system was represented by a vector in an abstract vector space.

Dynamical variables are represented by linear operators. Operators produce linear transformations:

    |y⟩ = A|x⟩.

Real dynamical variables (observables) are represented by Hermitian operators.

Observables are eigenvalues of Hermitian operators:

    A|S⟩ = λ|S⟩.

Solution of the eigenvalue problem gives the eigenvalues and eigenvectors.


Matrix Representation

Hermitian operators are replaced by their Hermitian matrix representations:

    A† = A.

In the proper basis, A is the diagonalized Hermitian matrix, and the diagonal matrix elements are the eigenvalues (observables).

A suitable transformation takes A (arbitrary basis) into A′ (diagonal, eigenvector basis):

    A′ = U A U⁻¹.

U takes the arbitrary basis into the eigenvectors.

Diagonalization of the matrix gives the eigenvalues and eigenvectors.

The matrix formulation is another way of dealing with operators and solving eigenvalue problems.


All rules about kets, operators, etc. still apply.

Example: two Hermitian matrices A and B can be simultaneously diagonalized by the same unitary transformation if and only if they commute.

All ideas about matrices are also true for infinite-dimensional matrices.


Example: Harmonic Oscillator

Already solved using the occupation number representation kets and bras (already diagonal).

    H = P²/2m + (1/2) m ω² x² = (ħω/2)(a a† + a† a)

    a|n⟩ = √n |n−1⟩        a†|n⟩ = √(n+1) |n+1⟩

Matrix elements of a (rows and columns labeled 0, 1, 2, 3, …):

    ⟨0|a|0⟩ = 0        ⟨1|a|0⟩ = 0
    ⟨0|a|1⟩ = 1        ⟨1|a|1⟩ = 0
    ⟨0|a|2⟩ = 0        ⟨1|a|2⟩ = √2
    ⟨0|a|3⟩ = 0        ⟨1|a|3⟩ = 0

    a = ( 0  1   0   0  )
        ( 0  0  √2   0  )
        ( 0  0   0  √3  )    …
        ( 0  0   0   0  )
The Hermitian conjugate a† has matrix elements ⟨n+1|a†|n⟩ = √(n+1):

    a† = ( 0   0   0   0 )
         ( 1   0   0   0 )
         ( 0  √2   0   0 )    …
         ( 0   0  √3   0 )
         ( 0   0   0  √4 )

    H = (ħω/2)(a a† + a† a)

Multiplying the matrices (showing the upper-left 4×4 block of the infinite matrices),

    a a† = ( 1  0  0  0 )
           ( 0  2  0  0 )
           ( 0  0  3  0 )    …
           ( 0  0  0  4 )


Similarly,

    a† a = ( 0  0  0  0 )
           ( 0  1  0  0 )
           ( 0  0  2  0 )    …
           ( 0  0  0  3 )


Adding the matrices a a† and a† a and multiplying by 1/2 gives H (in units of ħω):

    H = (1/2) ( 1  0  0  0 )   ( 1/2   0    0    0  )
              ( 0  3  0  0 )   (  0   3/2   0    0  )
              ( 0  0  5  0 ) = (  0    0   5/2   0  )    …
              ( 0  0  0  7 )   (  0    0    0   7/2 )

The matrix is diagonal with the eigenvalues on the diagonal. In normal units the matrix would be multiplied by ħω.

This example shows the idea, but not how to diagonalize a matrix when you don't already know the eigenvectors.
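The ladder-operator matrices above can be built and multiplied numerically. A sketch working in units of ħω with a finite truncation (the last diagonal element of the product a a† is an artifact of truncating the infinite matrices, so only the low-n block is checked):

```python
import numpy as np

N = 6                                # truncation size
n = np.arange(N)

# a|n> = sqrt(n)|n-1>, so <n-1|a|n> = sqrt(n): entries on the superdiagonal
a = np.diag(np.sqrt(n[1:]), k=1)
a_dag = a.T                          # real matrix, so dagger = transpose

# H in units of hbar*omega: (1/2)(a a_dag + a_dag a)
H = 0.5 * (a @ a_dag + a_dag @ a)
```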


Diagonalization

Eigenvalue equation, with A the matrix representing the operator, u the representative of an eigenvector, and λ an eigenvalue:

    A u = λ u        (A − λ1) u = 0.

In terms of the components,

    Σ_{j=1}^{N} (a_ij − λ δ_ij) u_j = 0,    i = 1, 2, … N.

This represents a system of equations (shown here for N = 3):

    (a_11 − λ) u_1 + a_12 u_2 + a_13 u_3 = 0
    a_21 u_1 + (a_22 − λ) u_2 + a_23 u_3 = 0
    a_31 u_1 + a_32 u_2 + (a_33 − λ) u_3 = 0

We know the a_ij. We don't know λ (the eigenvalues) or the u_i (the vector representatives, one for each eigenvalue).
Besides the trivial solution

    u_1 = u_2 = … = u_N = 0,

a solution only exists if the determinant of the coefficients of the u_i vanishes:

    | a_11 − λ    a_12       a_13     |
    | a_21        a_22 − λ   a_23     | = 0        (know the a_ij, don't know the λ's)
    | a_31        a_32       a_33 − λ |

Expanding the determinant gives an Nth-degree equation for the unknown λ's (the eigenvalues).

Then, substituting one eigenvalue at a time into the system of equations, the u_i (eigenvector representatives) are found.

The N equations for the u's give only N − 1 conditions. Use normalization:

    u_1* u_1 + u_2* u_2 + … + u_N* u_N = 1.
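The secular-determinant condition can be illustrated numerically: for a made-up 2×2 symmetric matrix, det(A − λ1) vanishes exactly at the eigenvalues:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # eigenvalues are 1 and 3

# Evaluate the secular determinant det(A - lam*1) at the two roots
det_at_1 = np.linalg.det(A - 1.0 * np.eye(2))
det_at_3 = np.linalg.det(A - 3.0 * np.eye(2))

# Compare with the eigenvalues found directly
eigvals = np.sort(np.linalg.eigvalsh(A))
```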
Example - Degenerate Two State Problem

Basis: time-independent, orthonormal kets (written here as |1⟩ and |2⟩), which are not eigenkets:

    H|1⟩ = E_0|1⟩ + γ|2⟩
    H|2⟩ = E_0|2⟩ + γ|1⟩        γ is the coupling.

These equations define H. The matrix elements are

    ⟨1|H|1⟩ = E_0        ⟨1|H|2⟩ = γ
    ⟨2|H|1⟩ = γ          ⟨2|H|2⟩ = E_0

and the Hamiltonian matrix is

    H = ( E_0   γ  )
        (  γ   E_0 )


The corresponding system of equations is

    (E_0 − λ) c_1 + γ c_2 = 0
    γ c_1 + (E_0 − λ) c_2 = 0.

These only have a solution if the determinant of the coefficients vanishes. Take the matrix, subtract λ from the diagonal elements, and make it into a determinant:

    | E_0 − λ    γ      |
    | γ          E_0 − λ | = 0

Expanding:

    (E_0 − λ)² − γ² = 0
    λ² − 2E_0 λ + E_0² − γ² = 0
    λ = E_0 ± γ.

Energy eigenvalues:

    E_+ = E_0 + γ        excited state
    E_− = E_0 − γ        ground state

The splitting between the two states (the dimer splitting) is 2γ.
To obtain the eigenvectors, use the system of equations for each eigenvalue:

    |+⟩ = a_+|1⟩ + b_+|2⟩
    |−⟩ = a_−|1⟩ + b_−|2⟩        eigenvectors associated with λ_+ and λ_−.

(a_+, b_+) and (a_−, b_−) are the vector representatives of |+⟩ and |−⟩ in the (|1⟩, |2⟩) basis set. We want to find these.


First, for the eigenvalue λ_+ = E_0 + γ, write out the system of equations

    (H_11 − λ_+) a_+ + H_12 b_+ = 0
    H_21 a_+ + (H_22 − λ_+) b_+ = 0

with the matrix elements of H

    H_11 = ⟨1|H|1⟩ = E_0        H_12 = ⟨1|H|2⟩ = γ
    H_21 = ⟨2|H|1⟩ = γ          H_22 = ⟨2|H|2⟩ = E_0.

Substituting,

    (E_0 − E_0 − γ) a_+ + γ b_+ = 0
    γ a_+ + (E_0 − E_0 − γ) b_+ = 0.

The result is

    −γ a_+ + γ b_+ = 0
    γ a_+ − γ b_+ = 0.


An equivalent way to get the equations is to use the matrix form:

    ( E_0 − λ    γ      ) ( a_+ )   ( 0 )
    ( γ          E_0 − λ ) ( b_+ ) = ( 0 )

Substitute λ_+ = E_0 + γ:

    ( E_0 − E_0 − γ    γ             ) ( a_+ )   ( 0 )
    ( γ                E_0 − E_0 − γ ) ( b_+ ) = ( 0 )

    ( −γ    γ ) ( a_+ )   ( 0 )
    (  γ   −γ ) ( b_+ ) = ( 0 )

Multiplying the matrix by the column vector representative gives the equations

    −γ a_+ + γ b_+ = 0
    γ a_+ − γ b_+ = 0.


    −γ a_+ + γ b_+ = 0
    γ a_+ − γ b_+ = 0        The two equations are identical:    a_+ = b_+.

We always get N − 1 conditions for the N unknown components. The normalization condition gives the necessary additional equation:

    a_+² + b_+² = 1.

Then

    a_+ = b_+ = 1/√2

and

    |+⟩ = (1/√2)|1⟩ + (1/√2)|2⟩        eigenvector in terms of the (|1⟩, |2⟩) basis set.


For the eigenvalue λ_− = E_0 − γ, using the matrix form to write out the equations:

    ( E_0 − λ    γ      ) ( a_− )   ( 0 )
    ( γ          E_0 − λ ) ( b_− ) = ( 0 )

Substituting λ_− = E_0 − γ:

    ( γ   γ ) ( a_− )   ( 0 )
    ( γ   γ ) ( b_− ) = ( 0 )

    γ a_− + γ b_− = 0
    γ a_− + γ b_− = 0.

These equations give a_− = −b_−. Using normalization,

    a_− = 1/√2,    b_− = −1/√2.

Therefore

    |−⟩ = (1/√2)|1⟩ − (1/√2)|2⟩.
H can be diagonalized by the transformation

    H′ = U† H U        (H′ diagonal, H not diagonal).

The transformation matrix consists of the representatives of the eigenvectors in the original basis:

    U = ( a_+   a_− )   ( 1/√2    1/√2 )
        ( b_+   b_− ) = ( 1/√2   −1/√2 )

    U⁻¹ = U† = ( 1/√2    1/√2 )        complex conjugate transpose
               ( 1/√2   −1/√2 )


Then

    H′ = ( 1/√2    1/√2 ) ( E_0   γ  ) ( 1/√2    1/√2 )
         ( 1/√2   −1/√2 ) (  γ   E_0 ) ( 1/√2   −1/√2 )

Factoring out the 1/√2 from each transformation matrix:

    H′ = (1/2) ( 1    1 ) ( E_0   γ  ) ( 1    1 )
               ( 1   −1 ) (  γ   E_0 ) ( 1   −1 )

After matrix multiplication:

    H′ = (1/2) ( 1    1 ) ( E_0 + γ   E_0 − γ )
               ( 1   −1 ) ( γ + E_0   γ − E_0 )

More matrix multiplication:

    H′ = ( E_0 + γ    0       )
         ( 0          E_0 − γ )

diagonal, with the eigenvalues on the diagonal.
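The whole two-state calculation can be sketched numerically; E_0 and γ are given arbitrary illustrative values:

```python
import numpy as np

E0, gamma = 1.0, 0.25                # illustrative values of E_0 and the coupling
H = np.array([[E0, gamma],
              [gamma, E0]])

# Columns are the eigenvector representatives (1/sqrt2)(1, 1) and (1/sqrt2)(1, -1)
U = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)

# Similarity transformation into the eigenvector basis
H_prime = U.conj().T @ H @ U
```

The result is diagonal with the eigenvalues E_0 + γ and E_0 − γ on the diagonal, a splitting of 2γ, matching the determinant expansion above.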