The Stirling approximation:

\ln x! \approx \int_1^x \ln y \, dy = x \ln x - x

For numbers on the order of Avogadro's number, the Stirling "approximation" is very
accurate.
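A quick numerical comparison against the exact \ln x! (via the log-gamma function) makes this concrete; the sample values of x below are illustrative choices:

```python
import math

def stirling(x):
    # Crude Stirling approximation: ln x! ~ x ln x - x
    return x * math.log(x) - x

# Compare with the exact ln x! = lgamma(x + 1) for increasing x
for x in (10, 100, 10_000):
    exact = math.lgamma(x + 1)
    rel_err = abs(exact - stirling(x)) / exact
    print(f"x = {x:>6}: relative error = {rel_err:.2e}")
```

The relative error shrinks roughly as 1/x; for x comparable to Avogadro's number it is utterly negligible.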
1.1.2 Sharpness of Multiplicity Function
Results of coin tossing suggest that the outcome becomes more definite as the number
of trials increases. For a system of N magnets, we denote the numbers of "up" and
"down" states as

N_\uparrow = \tfrac{1}{2}N + s \quad\text{and}\quad N_\downarrow = \tfrac{1}{2}N - s,

where s is an integer. Hence, the spin excess is calculated as

N_\uparrow - N_\downarrow = 2s,

and the multiplicity function is

W(N, s) = \frac{N!}{N_\uparrow!\, N_\downarrow!}
For a very large N, let us consider the logarithm of the multiplicity function:

\ln W = \ln N! - \ln N_\uparrow! - \ln N_\downarrow!

Using the Stirling approximation,

\ln W \approx N\ln N - N_\uparrow \ln N_\uparrow - N_\downarrow \ln N_\downarrow
 = -N_\uparrow \ln\frac{N_\uparrow}{N} - N_\downarrow \ln\frac{N_\downarrow}{N}

Knowing that

\ln\frac{N_\uparrow}{N} = \ln\left[\tfrac{1}{2}(1 + 2s/N)\right] = -\ln 2 + \ln(1 + 2s/N),

we have, on expanding the logarithm to second order in 2s/N,

\ln\frac{N_\uparrow}{N} \approx -\ln 2 + \frac{2s}{N} - \frac{1}{2}\left(\frac{2s}{N}\right)^2,
\qquad
\ln\frac{N_\downarrow}{N} \approx -\ln 2 - \frac{2s}{N} - \frac{1}{2}\left(\frac{2s}{N}\right)^2.

Consequently,

\ln W \approx N \ln 2 - \frac{2s^2}{N},

so that

\frac{W(N, s)}{W(N, 0)} = e^{-2s^2/N}
That is, when N is very large, the distribution is exceedingly sharply defined at s = 0.
We know from common experience that systems held at constant temperature usually have
well-defined properties; this stability of physical properties follows as a consequence of the
exceedingly sharp peak in the multiplicity function and of the steep variation of that function
away from the peak.
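The sharpness can be checked numerically: the exact binomial multiplicity W(N, s) = N!/[(N/2 + s)!(N/2 − s)!] falls off from its peak essentially as the Gaussian e^{−2s²/N}. A sketch using log-gamma to avoid overflow:

```python
import math

def log_multiplicity(N, s):
    # ln W(N, s) = ln N! - ln(N/2 + s)! - ln(N/2 - s)!
    return (math.lgamma(N + 1)
            - math.lgamma(N // 2 + s + 1)
            - math.lgamma(N // 2 - s + 1))

N = 10_000
for s in (0, 10, 50, 100):
    exact = math.exp(log_multiplicity(N, s) - log_multiplicity(N, 0))
    gauss = math.exp(-2 * s**2 / N)
    print(f"s = {s:>4}: exact W/W0 = {exact:.4f}, Gaussian = {gauss:.4f}")
```

Already at N = 10,000 the two agree to a few parts in a thousand, and the agreement improves rapidly with N.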
1.1.3 Method of Lagrange Undetermined Multipliers
Assume that there are n possible outcomes for an experiment:

Events: 1, 2, 3, ..., n.

Perform the experiment N times and denote the frequency of outcome i as N_i:

N_1, N_2, ..., N_n

Obviously, we must have

\sum_i N_i = N

For a particular set of {N_1, N_2, ..., N_n}, the number of possible ways is

W = \frac{N!}{N_1!\, N_2! \cdots N_n!}

The total number of possible outcomes is n^N. The probability of getting the
distribution {N_1, N_2, ..., N_n} is:

p(\{N_i\}) = \frac{W}{n^N}

Take the logarithm of W:

\ln W = \ln N! - \sum_i \ln N_i! \approx N \ln N - \sum_i N_i \ln N_i = -N \sum_i p_i \ln p_i,

where p_i = N_i/N. To find the most probable distribution, we maximize

R = -\sum_i p_i \ln p_i

subject to the condition that

\sum_{i=1}^n p_i = 1

is a constant.
Given that

dR = -\sum_i (1 + \ln p_i)\, dp_i,

we introduce an undetermined multiplier \alpha and define L = R + \alpha \sum_i p_i, so that

dL = \sum_i (\alpha - 1 - \ln p_i)\, dp_i

In the presence of \alpha, which is not yet defined, we can assert that all p_i are
independent variables. Thus,

dL = 0 \;\Rightarrow\; \ln p_i = \alpha - 1 \;\Rightarrow\; p_i = e^{\alpha - 1}

The value of \alpha is fixed by the normalization

\sum_{i=1}^n p_i = 1.

That is,

n e^{\alpha - 1} = 1 \;\Rightarrow\; p_i = \frac{1}{n}.

Consequently,

\ln W = -N \sum_i p_i \ln p_i = N \ln n
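As a numerical sanity check (a sketch; n = 5 and the 10,000 random trials are arbitrary choices), no randomly drawn distribution beats the uniform one:

```python
import math, random

def entropy(p):
    # R = -sum p_i ln p_i
    return -sum(x * math.log(x) for x in p if x > 0)

n = 5
best = entropy([1.0 / n] * n)   # uniform distribution: R = ln n

random.seed(0)
for _ in range(10_000):
    w = [random.random() for _ in range(n)]
    total = sum(w)
    p = [x / total for x in w]  # random normalized distribution
    assert entropy(p) <= best + 1e-12

print(f"max entropy = {best:.6f}, ln n = {math.log(n):.6f}")
```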
Mechanical variables (M): Properties that can be defined in purely mechanical terms such as
pressure, energy, volume, number of molecules, etc.
Non-mechanical variables: temperature, entropy, chemical potential, etc.
Ensemble: a collection of a very large number of systems, each constructed to be a replica of
a well-defined thermodynamic state.
Postulate 1: In the limit as the number of ensemble members approaches infinity, the
ensemble average of M corresponds to a parallel thermodynamic property, provided that
the systems of the ensemble replicate the thermodynamic state and environment of the
actual system of interest. (Valid for all kinds of ensembles)
Postulate 2: In an ensemble of constant N, V, and E, the systems are distributed uniformly, i.e.,
with equal probability or frequency, over the possible quantum states consistent with the
specified values of N, V, and E. (Principle of equal a priori probabilities)
Postulates 1 and 2 together imply the quantum ergodic hypothesis:
The single isolated system of interest spends equal amounts of time, over a long period of
time, in each of the available quantum states. That is, the time average and the ensemble
average of M would be identical.
1.3 Canonical Ensemble
Each system (a square box with definite N, V, T) in the canonical ensemble has the same
set of energy states, viz., E_1, E_2, ..., which are determined by N and V only.
Assume there are nt systems in the ensemble.
The entire ensemble, which is insulated, is effectively an isolated system with volume
n_t V, number of molecules n_t N, and total energy E_t. Thus, it can be considered a
thermodynamic "supersystem", to which we can apply Postulate 2.
Quantum state    Energy        Systems
1                E_1 = 0.1     n_1
2                E_2 = 0.2     n_2
3                E_3 = 0.3     n_3

Total number of systems: n_t = n_1 + n_2 + \cdots = \sum_j n_j

Total energy: E_t = n_1 E_1 + n_2 E_2 + \cdots = \sum_j n_j E_j

The number of ways of realizing a given distribution n = (n_1, n_2, \ldots) of the n_t
systems over the quantum states is

\Omega(n) = \frac{n_t!}{n_1!\, n_2! \cdots}    (1)
The probability of finding a system in state j is the ensemble average

p_j = \frac{1}{n_t} \langle n_j \rangle
    = \frac{1}{n_t} \frac{\sum_n \Omega(n)\, n_j(n)}{\sum_n \Omega(n)},

where n_j(n) is the occupation of state j in distribution n. For n_t \to \infty, we can
regard all other weights \Omega(n) as negligible compared with \Omega(n^*), where n^*
denotes the most probable distribution. Hence,

p_j = \frac{1}{n_t} \frac{\Omega(n^*)\, n_j^*}{\Omega(n^*)} = \frac{n_j^*}{n_t}

Since \sum_j n_j^* E_j = E_t, that is,

\sum_j p_j E_j = \frac{E_t}{n_t} = \bar{E}    (2)

where

\sum_j p_j = 1    (3)
Question: Which of all possible sets of n_j's satisfying eqns (2) and (3) gives us the
largest \Omega_t? (That is, we want to determine n_j^*.)

From the definition of \Omega_t, maximizing \ln \Omega_t is equivalent to maximizing
R = -\sum_j p_j \ln p_j, subject to eqn (3), with multiplier \alpha, and eqn (2), with
multiplier \beta. Using

dR = -\sum_j (1 + \ln p_j)\, dp_j,

we find

\frac{\partial}{\partial p_j}\left[R + \alpha \sum_i p_i - \beta \sum_i p_i E_i\right]
 = -1 - \ln p_j + \alpha - \beta E_j = 0

After rearrangement,

p_j = e^{\alpha - 1}\, e^{-\beta E_j},

and normalization (eqn 3) fixes e^{\alpha - 1} = 1/Q, with Q = \sum_j e^{-\beta E_j}.
E j dp j
p j dE j
dU
U.
Work done on or by the system may change V, on which the energy level Ej depends:
w
p j dE j
j
Ej
pj
dV
Ej
pj
p j Pj
N
Ej
Pj
PdV
Hence,
q
E j dp j
j
Thus, for a very small change of internal energy, work done means a weighted average of
change in energy levels and heat is a change of distribution of energy.
One may be tempted to express the factor \beta in terms of energies. However, the
independent variables of real interest here are N, V, and T. Therefore, we will invoke
a thermodynamic argument to trace the connection between \beta and T (a non-mechanical
thermodynamic variable).
From the ensemble average of energy

\bar{E} = \frac{\sum_j E_j e^{-\beta E_j}}{Q},

we obtain:

\left(\frac{\partial \bar{E}}{\partial V}\right)_{\beta,N}
 = \frac{1}{Q}\sum_j \frac{\partial E_j}{\partial V} e^{-\beta E_j}
 - \frac{\beta}{Q}\sum_j E_j \frac{\partial E_j}{\partial V} e^{-\beta E_j}
 + \frac{\beta}{Q^2}\left(\sum_j E_j e^{-\beta E_j}\right)\left(\sum_j \frac{\partial E_j}{\partial V} e^{-\beta E_j}\right)
 = -\langle P \rangle + \beta\left(\langle PE \rangle - \langle E \rangle \langle P \rangle\right)

Similarly,

\left(\frac{\partial \langle P \rangle}{\partial \beta}\right)_{N,V}
 = -\left(\langle PE \rangle - \langle E \rangle \langle P \rangle\right),

so that

\left(\frac{\partial \bar{E}}{\partial V}\right)_{\beta,N}
 = -\langle P \rangle - \beta \left(\frac{\partial \langle P \rangle}{\partial \beta}\right)_{N,V}.

Compare this with the thermodynamic relation

\left(\frac{\partial U}{\partial V}\right)_{T,N}
 = -P + T\left(\frac{\partial P}{\partial T}\right)_{N,V}
 = -P - \frac{1}{T}\left(\frac{\partial P}{\partial (1/T)}\right)_{N,V}.

By virtue of Postulate 1, we can associate the thermodynamic pressure and internal
energy with the ensemble averages of P and E. Comparing the two expressions term by
term, \beta plays the role of 1/T, and hence it can be deduced that

\beta = \frac{1}{kT},

where k is a constant to be determined.
From the definition of \bar{E}, we obtain

d\bar{E} = \sum_j E_j\, dp_j + \sum_j p_j\, dE_j

Given that p_j = e^{-\beta E_j}/Q, i.e., E_j = -\frac{1}{\beta}(\ln p_j + \ln Q), we have

\sum_j E_j\, dp_j = -\frac{1}{\beta} \sum_j (\ln p_j + \ln Q)\, dp_j
 = -\frac{1}{\beta} \sum_j \ln p_j\, dp_j
 = -\frac{1}{\beta}\, d\!\left(\sum_j p_j \ln p_j\right),

where we have used \sum_j dp_j = 0. Hence,

d\bar{E} = -\frac{1}{\beta}\, d\!\left(\sum_j p_j \ln p_j\right) - \langle P \rangle\, dV    (4)

Comparison with dU = T\,dS - P\,dV identifies the entropy as S = -k \sum_j p_j \ln p_j.
It can be shown that if any two systems are in thermal contact, they will have the same
\beta and T at equilibrium. Thus, k is a universal constant, known as the Boltzmann
constant. Therefore, we can evaluate k for a judiciously chosen system:

k = 1.381 \times 10^{-23} J K^{-1}.

It was Max Planck who first gave it an experimental value, based on the law of
black-body radiation.
We finally obtain the well-known expression of the Boltzmann distribution:

p_j(N, V, T) = \frac{e^{-E_j/kT}}{Q}
\qquad \text{and} \qquad
Q(N, V, T) = \sum_j e^{-E_j/kT}

The corresponding entropy and Helmholtz free energy are

S = \frac{\bar{E}}{T} + k \ln Q,
\qquad
A = \bar{E} - TS = -kT \ln Q(N, V, T).
Hence, if Q is available, we can obtain a rather complete set of thermodynamic
functions from the derivatives of the Helmholtz free energy.
Given that dA = -S\,dT - P\,dV + \mu\,dN, we have

P = -\left(\frac{\partial A}{\partial V}\right)_{T,N}
  = kT \left(\frac{\partial \ln Q}{\partial V}\right)_{T,N}

S = -\left(\frac{\partial A}{\partial T}\right)_{N,V}
  = k \ln Q + kT \left(\frac{\partial \ln Q}{\partial T}\right)_{N,V}

U = A + TS = kT^2 \left(\frac{\partial \ln Q}{\partial T}\right)_{N,V}
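These relations can be verified numerically for a toy system with a few energy levels (a sketch; the level energies below are hypothetical): U and S obtained from derivatives of ln Q agree with the direct averages over p_j.

```python
import math

k = 1.380649e-23          # Boltzmann constant, J/K
E = [0.0, 1e-21, 3e-21]   # hypothetical energy levels, J

def lnQ(T):
    return math.log(sum(math.exp(-Ej / (k * T)) for Ej in E))

T, h = 300.0, 1e-4
# U = kT^2 (d ln Q / dT), via a central finite difference
dlnQ_dT = (lnQ(T + h) - lnQ(T - h)) / (2 * h)
U = k * T**2 * dlnQ_dT
# Direct average: U = sum_j p_j E_j
Q = math.exp(lnQ(T))
p = [math.exp(-Ej / (k * T)) / Q for Ej in E]
U_direct = sum(pj * Ej for pj, Ej in zip(p, E))
# S = k ln Q + kT (d ln Q / dT), versus S = -k sum_j p_j ln p_j
S = k * lnQ(T) + k * T * dlnQ_dT
S_direct = -k * sum(pj * math.log(pj) for pj in p)
print(U, U_direct)
print(S, S_direct)
```

Both pairs agree to the accuracy of the finite difference, illustrating that A = −kT ln Q encodes the full thermodynamics of the toy system.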
1.3.1 Fluctuation
In a canonical ensemble, N, V, and T are held fixed, and we can investigate fluctuations in the
energy, pressure, and related mechanical properties because these are the ones that can vary
from system to system.
The variance of the system energy is calculated as:

\sigma_E^2 = \langle E^2 \rangle - \langle E \rangle^2 = \sum_j p_j E_j^2 - \bar{E}^2

The above equation can be rewritten in a more convenient form by noting that

\sum_j p_j E_j^2 = \frac{1}{Q} \sum_j E_j^2 e^{-\beta E_j}
 = -\frac{\partial}{\partial \beta}\left(\frac{1}{Q}\sum_j E_j e^{-\beta E_j}\right)
 - \frac{1}{Q^2}\frac{\partial Q}{\partial \beta}\sum_j E_j e^{-\beta E_j}
 = -\frac{\partial \bar{E}}{\partial \beta} + \bar{E}^2

Consequently,

\sigma_E^2 = -\left(\frac{\partial \bar{E}}{\partial \beta}\right)_{N,V}
 = kT^2 \left(\frac{\partial \bar{E}}{\partial T}\right)_{N,V}
 = kT^2 C_V
Heat capacity depends on the energy fluctuation. The relative fluctuation is

\frac{\sigma_E}{\bar{E}} = \frac{\sqrt{kT^2 C_V}}{\bar{E}}
 \sim \frac{1}{\sqrt{N}} \ll 1 \quad \text{for } N = 10^{23}.
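For a monatomic ideal gas, \bar{E} = (3/2)NkT and C_V = (3/2)Nk give \sigma_E/\bar{E} = \sqrt{2/(3N)}, independent of T; a one-line function makes the magnitude concrete:

```python
import math

def relative_fluctuation(N):
    # For a monatomic ideal gas: E = (3/2) N k T and C_V = (3/2) N k,
    # so sqrt(k T^2 C_V) / E = sqrt(2 / (3N)) -- independent of T.
    return math.sqrt(2.0 / (3.0 * N))

for N in (100, 1e10, 1e23):
    print(f"N = {N:.0e}: sigma_E/E = {relative_fluctuation(N):.2e}")
```

At N = 10^23 the relative fluctuation is of order 10^-12, which is why macroscopic energies appear perfectly sharp.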
Since p_j \propto e^{-E_j/kT}, the probability of observing the system with energy E is

P(E) \propto \Omega(N, V, E)\, e^{-E/kT},

where \Omega(N, V, E) is the number of quantum states of energy E, and the entropy of an
isolated system is

S = k \ln \Omega.

For any isolated system whatever, the more quantum states available to the system, the
higher the entropy.
1.4.1 Statistical-Mechanical Basis of the Third Law
On the basis of experimental observations, T. W. Richards and W. Nernst independently
found that, for any isothermal process involving only pure phases in internal
equilibrium,

\lim_{T \to 0} \Delta S = 0.

That is,

S_0 = k \ln \Omega_0,    (*)

where \Omega_0 is the degeneracy of the ground state. Even for a large ground-state
degeneracy, S_0 remains negligible compared with kN, the scale of the entropy at
ordinary temperatures. Practically, we have S_0 = 0, which Max Planck stated as the
third law of thermodynamics:

"The entropy of each pure element or substance in a perfect crystalline form is zero
at absolute zero."
1.5 Molecular Partition Function
When the molecular energy separates into translational, rotational, vibrational, and
electronic contributions, the molecular partition function factorizes:

q = \sum_i \exp\!\left(-\frac{E_i^{\text{trans}}}{kT}\right)
    \sum_j \exp\!\left(-\frac{E_j^{\text{rot}}}{kT}\right)
    \sum_k \exp\!\left(-\frac{E_k^{\text{vib}}}{kT}\right)
    \sum_l \exp\!\left(-\frac{E_l^{\text{ele}}}{kT}\right)
  = q_{\text{trans}}\, q_{\text{rot}}\, q_{\text{vib}}\, q_{\text{ele}}
1.5.1 Translational Partition Function
The energy states of a particle in a three-dimensional infinite well can be used to
obtain the translational partition function q_trans:

q_{\text{trans}} = \sum_{n_x, n_y, n_z} \exp\!\left[-\frac{h^2 (n_x^2 + n_y^2 + n_z^2)}{8mV^{2/3} kT}\right]
 = \left[\sum_{n=1}^{\infty} \exp\!\left(-\frac{h^2 n^2}{8mV^{2/3} kT}\right)\right]^3

Because the spacing between translational levels is minute compared with kT, the sum
can be replaced by an integral:

q_{\text{trans}} \approx \left[\int_0^\infty \exp\!\left(-\frac{h^2 n^2}{8mV^{2/3} kT}\right) dn\right]^3

Hence,

q_{\text{trans}} = \left(\frac{2\pi m k T}{h^2}\right)^{3/2} V = \frac{V}{\Lambda^3},

where

\Lambda = \frac{h}{\sqrt{2\pi m k T}}.
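As a concrete illustration (argon at 300 K in a one-liter box is my example, not the text's), the thermal de Broglie wavelength and q_trans are easy to evaluate:

```python
import math

h = 6.62607015e-34    # Planck constant, J s
k = 1.380649e-23      # Boltzmann constant, J/K
u = 1.66053907e-27    # atomic mass unit, kg

def thermal_wavelength(m, T):
    # Lambda = h / sqrt(2 pi m k T)
    return h / math.sqrt(2 * math.pi * m * k * T)

def q_trans(m, T, V):
    # q_trans = V / Lambda^3
    return V / thermal_wavelength(m, T) ** 3

m_Ar = 39.948 * u
L = thermal_wavelength(m_Ar, 300.0)
q = q_trans(m_Ar, 300.0, 1e-3)   # one liter
print(f"Lambda = {L:.3e} m, q_trans = {q:.3e}")
```

The wavelength comes out around 10^-11 m, far smaller than typical intermolecular distances, and q_trans is of order 10^29, consistent with the estimate of ~10^30 states quoted later.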
The energy of the molecular ground state would become -D_0, which can be determined
spectroscopically (the difference between D_0 and D_e is the zero-point energy). Hence,
the electronic partition function should read:

q_{\text{ele}} = \exp\!\left(\frac{D_0}{kT}\right)
 \left[g_0 + g_1 \exp\!\left(-\frac{\Delta E_1}{kT}\right) + \cdots\right],

where the degeneracy is denoted by g_i and \Delta E_1 is the energy difference between
the ground state and the first excited state.

If we define a characteristic temperature for an electronic transition as

\theta_e = \frac{\Delta E_1}{k},

we would have \theta_e on the order of 10^4 K for \Delta E_1 = 1 eV.
For example, the first two levels of halogen atoms are ^2P_{3/2} (g_0 = 4) and
^2P_{1/2} (g_1 = 2), so the probability of finding the atom in the first excited state
is:

p_1 = \frac{2 \exp(-\Delta E_1/kT)}{4 + 2 \exp(-\Delta E_1/kT)}

We have \Delta E_1 = 0.050 eV for fluorine and 0.94 eV for iodine. At 1000 K, we find
p_1 = 0.22 and 9 \times 10^{-6} for F and I, respectively.
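The quoted populations follow directly from the two-level formula; a quick numerical check:

```python
import math

k_eV = 8.617333e-5    # Boltzmann constant in eV/K

def p1(dE_eV, T):
    # p1 = 2 exp(-dE1/kT) / (4 + 2 exp(-dE1/kT)), with g0 = 4, g1 = 2
    x = math.exp(-dE_eV / (k_eV * T))
    return 2 * x / (4 + 2 * x)

print(f"F at 1000 K: p1 = {p1(0.050, 1000):.2f}")
print(f"I at 1000 K: p1 = {p1(0.94, 1000):.0e}")
```

The near-twenty-fold smaller gap of fluorine turns an utterly negligible excited-state population (iodine) into a substantial one.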
1.6 Partition Functions of Many-Body Systems
1.6.1 Distinguishable Particles

For a system of independent, distinguishable particles a, b, c, ..., the total energy
of a state is \varepsilon_i^a + \varepsilon_j^b + \varepsilon_k^c + \cdots, so

Q = \sum_{i,j,k,\ldots} \exp\!\left[-\frac{\varepsilon_i^a + \varepsilon_j^b + \varepsilon_k^c + \cdots}{kT}\right]
  = q_a\, q_b\, q_c \cdots

For N identical but distinguishable particles (e.g., localized on lattice sites),

Q = q^N,

where q(V, T) is the molecular partition function.
1.6.2 Indistinguishable Particles
In QM, all known particles fall into two classes:
Fermions - wave function antisymmetric under the operation of interchanging two identical
particles
Bosons - wave function symmetric under the operation of interchanging two identical
particles
For indistinguishable particles,

Q = \sum_{i,j,k,\ldots} \exp\!\left[-\frac{\varepsilon_i + \varepsilon_j + \varepsilon_k + \cdots}{kT}\right]

For bosons, the terms (\varepsilon_1 + \varepsilon_1 + \varepsilon_3 + \cdots) and
(\varepsilon_1 + \varepsilon_3 + \varepsilon_1 + \cdots) are identical and should not
be counted twice. The corresponding distribution is known as the Bose-Einstein
statistics.
For fermions, terms in which two or more indices are the same cannot be included in the
summation. The corresponding distribution is known as the Fermi-Dirac statistics.
That is, the canonical partition functions of bosons and fermions cannot be written as
q^N.
1.6.3 Boltzmann Statistics
Consider the energy states of a single particle in a three-dimensional infinite well:

\varepsilon_{n_x, n_y, n_z} = \frac{h^2}{8ma^2}(n_x^2 + n_y^2 + n_z^2),

where n_x, n_y, n_z = 1, 2, 3, ...

Let us define

R^2 = n_x^2 + n_y^2 + n_z^2 = \frac{8ma^2 \varepsilon}{h^2}.

The number of states with energy less than \varepsilon is one octant of a sphere of
radius R:

\Phi(\varepsilon) = \frac{1}{8} \cdot \frac{4}{3}\pi R^3
 = \frac{\pi}{6}\left(\frac{8ma^2 \varepsilon}{h^2}\right)^{3/2}
 = \frac{\pi}{6}\left(\frac{8m\varepsilon}{h^2}\right)^{3/2} V,

where a^3 is the volume V. A calculation done for one particle in a cube of one liter
at room temperature would give an order of magnitude of 10^{30}.
Thus, the number of molecular quantum states available to a molecule at room
temperature is much greater than the number of molecules in the system. Alternatively,
we can state that the orbital occupancy is small in comparison with unity. This regime,
which is favored by large mass, high temperature, and low density, is known as the
classical limit.
In the classical limit, the number of terms in each summation is much larger than N:

q^N = \left[\sum_i e^{-\varepsilon_i/kT}\right]
      \left[\sum_j e^{-\varepsilon_j/kT}\right] \cdots
      \left[\sum_k e^{-\varepsilon_k/kT}\right] \quad (N \text{ factors})

Almost every term in the expanded product then has all N indices different, and each
distinct set of indices occurs N! times, once for each permutation of the indices,
e.g., (1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 1, 2), (3, 2, 1).

As such, q^N will be dominated by terms which have different indices. Hence, the
canonical partition function of fermions or bosons will be given identically as

Q(N, V, T) = \frac{q^N}{N!},

where we have corrected the sum by N! for the permutations of N different indices.
The corresponding distribution is known as the Boltzmann statistics.
To assure the validity of the Boltzmann statistics, we have stated that

\frac{\pi}{6}\left(\frac{8m\bar{\varepsilon}}{h^2}\right)^{3/2} V \gg N.

Hence, with \bar{\varepsilon} \sim kT, we can rewrite the condition as

\frac{\pi}{6}\left(\frac{8mkT}{h^2}\right)^{3/2} \frac{V}{N} \gg 1    (*)
Because (V/N)^{1/3} is a distance of the order of the average nearest-neighbor distance
between molecules, eqn (*) asserts that quantum effects (requiring the Bose-Einstein or
Fermi-Dirac statistics) will be absent if neighboring molecules are far apart relative
to the "thermal" de Broglie wavelength.
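The criterion (*) is easy to evaluate numerically; the argon-gas and metal-electron cases below are illustrative choices of mine, picked as a regime where the criterion holds comfortably and one where it fails:

```python
import math

h = 6.62607015e-34    # Planck constant, J s
k = 1.380649e-23      # Boltzmann constant, J/K
u = 1.66053907e-27    # atomic mass unit, kg

def classicality(m, T, n_density):
    # (pi/6) (8 m k T / h^2)^(3/2) (V/N): >> 1 means Boltzmann statistics is safe
    return (math.pi / 6) * (8 * m * k * T / h**2) ** 1.5 / n_density

# Ar gas at 300 K and 1 atm: number density n = P / kT
n_gas = 101325.0 / (k * 300.0)
print(f"Ar gas, 300 K, 1 atm : {classicality(39.948 * u, 300.0, n_gas):.1e}")

# Conduction electrons in a metal (~1e28 per m^3 at 300 K): light and dense
m_e = 9.109e-31
print(f"electrons in a metal : {classicality(m_e, 300.0, 1e28):.1e}")
```

For the gas the ratio is many orders of magnitude above unity, while for conduction electrons it drops below one: electrons in a metal require Fermi-Dirac statistics even at room temperature.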
1.7 Ideal Gas
We can safely apply the Boltzmann statistics for gaseous systems with ideal gas behavior.
1.7.1 Ideal Monatomic Gas
We can obtain the Helmholtz free energy as

A = -kT \ln Q = -kT \ln \frac{(q_{\text{trans}}\, q_{\text{ele}})^N}{N!}
  = -kT \left[N \ln (q_{\text{trans}}\, q_{\text{ele}}) - N \ln N + N\right]

For brevity, we assume that q_{\text{ele}} = 1 (i.e., no electronic excitation or
degeneracy):

A = -NkT \left(\ln q_{\text{trans}} - \ln N + 1\right)
  = -NkT \ln\!\left[\frac{e\,(2\pi mk)^{3/2}\, T^{3/2}\, V}{N h^3}\right]

Because \ln Q = N\left(\tfrac{3}{2}\ln T + \ln V + \text{const}\right), we obtain

U = kT^2 \left(\frac{\partial \ln Q}{\partial T}\right)_{N,V}
  = kT^2 \cdot \frac{3N}{2T} = \frac{3}{2} NkT,

or \bar{U} = \tfrac{3}{2} RT per mole. Hence,

C_V = \left(\frac{\partial U}{\partial T}\right)_{N,V} = \frac{3}{2} Nk
 \quad \left(= \frac{3}{2} R \text{ per mole}\right).
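The result C_V = (3/2)Nk per particle can be recovered numerically from ln q_trans by finite differences (a sketch; argon and a one-liter volume are arbitrary choices, and the additive constants in ln Q drop out of the derivatives):

```python
import math

h = 6.62607015e-34    # Planck constant, J s
k = 1.380649e-23      # Boltzmann constant, J/K
u = 1.66053907e-27    # atomic mass unit, kg

def lnq_trans(T, m=39.948 * u, V=1e-3):
    # ln q_trans = (3/2) ln(2 pi m k T / h^2) + ln V
    return 1.5 * math.log(2 * math.pi * m * k * T / h**2) + math.log(V)

def U(T, dT=1e-3):
    # Energy per particle: U = k T^2 d(ln q)/dT, by central differences
    return k * T**2 * (lnq_trans(T + dT) - lnq_trans(T - dT)) / (2 * dT)

T = 300.0
Cv = (U(T + 0.5) - U(T - 0.5)) / 1.0
print(f"C_V per particle = {Cv / k:.4f} k (expected 1.5)")
```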
1.7.2 Chemical Equilibrium in Ideal Gas Mixture
Consider an ideal gas mixture made up of N_X molecules of type X and N_Y of type Y in a
closed container with fixed V and T, in which the dissociation reaction X ⇌ 2Y can
take place. Because the system is closed and has constant V and T, the second law
states that the system Helmholtz free energy will decrease as the reaction proceeds. At
equilibrium, A will be at a minimum, i.e., Q is at a maximum.

Since each X contains two Y, we have the conservation condition

2N_X + N_Y = N,

where N is a constant.
Consequently, using Q = q_X^{N_X} q_Y^{N_Y}/(N_X!\, N_Y!) with N_Y = N - 2N_X and the
Stirling approximation, we obtain

\ln Q = N_X \ln q_X + (N - 2N_X) \ln q_Y
 - N_X \ln N_X + N_X - (N - 2N_X) \ln (N - 2N_X) + (N - 2N_X)

Hence, at equilibrium,

\left(\frac{\partial \ln Q}{\partial N_X}\right)_{N,V,T}
 = \ln q_X - 2 \ln q_Y - \ln N_X + 2 \ln (N - 2N_X) = 0,

which gives

\frac{N_Y^2}{N_X} = \frac{q_Y^2}{q_X}.
Knowing that

q_{\text{trans}} = \left(\frac{2\pi m k T}{h^2}\right)^{3/2} V,

one can remove the dependence on V by considering the concentrations:

K_{\text{eq}} = \frac{[Y]^2}{[X]} = \frac{(N_Y/V)^2}{N_X/V}
 = \frac{(q_Y/V)^2}{q_X/V}
 = \left(\frac{2\pi k T}{h^2}\right)^{3/2} \left(\frac{m_Y^2}{m_X}\right)^{3/2}

That is, we have K_eq as a function of temperature only. It is also an important point
to note that

K_{\text{eq}} = \frac{(q_Y/V)^2}{q_X/V},

i.e., the equilibrium constant is set by the partition functions per unit volume.
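A minimal sketch of the resulting T^{3/2} dependence, using a hypothetical species X of mass 2u dissociating into fragments Y of mass u (this keeps only the translational factor, ignoring internal structure and binding energy):

```python
import math

h = 6.62607015e-34    # Planck constant, J s
k = 1.380649e-23      # Boltzmann constant, J/K
u = 1.66053907e-27    # atomic mass unit, kg

def K_eq(T, m_X, m_Y):
    # Translational part only: K_eq = (2 pi k T m_Y^2 / (h^2 m_X))^(3/2);
    # a real dissociation also carries internal and binding-energy factors.
    return (2 * math.pi * k * T / h**2 * m_Y**2 / m_X) ** 1.5

m_X, m_Y = 2.0 * u, 1.0 * u    # hypothetical X and its fragment Y
ratio = K_eq(600.0, m_X, m_Y) / K_eq(300.0, m_X, m_Y)
print(f"K_eq(600 K)/K_eq(300 K) = {ratio:.4f}")   # 2^(3/2) ~ 2.8284
```

Doubling the temperature multiplies the translational K_eq by exactly 2^{3/2}, as the closed form requires.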
1.8.1 State Functions in Thermodynamics
Consider the following cyclic reaction scheme:

A ⇌ B ⇌ C ⇌ A,

with equilibrium constants K_1, K_2, and K_3 for the three steps and a forward rate
k_1[A] for the first. One could have d[A]/dt = 0 by having a steady cyclic flux around
the loop. But then, by adding a catalyst, we could affect "[A_eq]". Thus, the steady
state d[A]/dt = 0 alone cannot define equilibrium; each step must balance individually,
which requires

-RT \ln K_1 - RT \ln K_2 - RT \ln K_3 = 0.

As discussed in thermodynamics,

\Delta G_1^0 = -RT \ln K_1, \quad
\Delta G_2^0 = -RT \ln K_2, \quad
\Delta G_3^0 = -RT \ln K_3.

Thus,

\Delta G_1^0 + \Delta G_2^0 + \Delta G_3^0 = 0.
1.8.2 Boltzmann Distribution
A chemical species is a kinetic collection of the microscopic states of a molecular
system.
Let k_i and k_j denote the rate constants for leaving microscopic states i and j,
respectively. At equilibrium, detailed balance between the two states requires
p_i k_i = p_j k_j. That is,

\frac{p_j}{p_i} = \frac{k_i}{k_j}

Because the quotient of reaction rates represents some kind of equilibrium, in analogy
to the relation of \Delta G and the equilibrium constant, we can therefore write

\frac{p_j}{p_i} = e^{-(\varepsilon_j - \varepsilon_i)/kT}

Thus, we obtain

p_i = \frac{1}{Q} e^{-\varepsilon_i/kT}

From the normalization \sum_i p_i = 1, we have

Q = \sum_i e^{-\varepsilon_i/kT}
We have obtained the Boltzmann distribution from the principle of detailed balance.
(Courtesy of CY Mou)
1.8.3 Statistical Effects on Chemical Equilibrium
Consider the following reaction of isomerization:
A ⇌ B
Suppose that some of the energy levels of A are lower than those of B, but that the levels of B
are closer together:
In statistical mechanics, we can take the point of view that a given molecule has accessible to
it the full set of energy states indicated by A + B, with the partition function of
q = q_A + q_B
 = \sum_i \exp\!\left(-\frac{\varepsilon_i^A}{kT}\right)
 + \sum_j \exp\!\left(-\frac{\varepsilon_j^B}{kT}\right)

There is a single Boltzmann distribution of molecules among all the levels of A + B.
Therefore, the fraction of molecules in all levels belonging to the subgroup A is

\frac{N_A}{N_A + N_B} = \frac{q_A}{q_A + q_B}
 = \frac{\sum_i \exp(-\varepsilon_i^A/kT)}{q}

At equilibrium, we have

K_c = \frac{N_B}{N_A} = \frac{q_B}{q_A}
 = \frac{\sum_j \exp(-\varepsilon_j^B/kT)}{\sum_i \exp(-\varepsilon_i^A/kT)}

The equilibrium constant can be understood as a ratio of the accessible states of the
products and reactants.
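The competition between lower energy (A) and more closely spaced levels (B) can be illustrated with two hypothetical ladders of evenly spaced levels; the spacings and offset below are invented for illustration:

```python
import math

def q_ladder(e0, spacing, kT, nmax=2000):
    # Partition function of evenly spaced levels: eps_n = e0 + n * spacing
    return sum(math.exp(-(e0 + n * spacing) / kT) for n in range(nmax))

# A lies lower (e0 = 0) but has widely spaced levels (spacing 1.0);
# B starts higher (e0 = 1) but its levels are closer together (spacing 0.2).
# Energies and kT are in the same arbitrary units.
for kT in (0.2, 1.0, 5.0):
    Kc = q_ladder(1.0, 0.2, kT) / q_ladder(0.0, 1.0, kT)
    print(f"kT = {kT}: Kc = qB/qA = {Kc:.3f}")
```

At low kT the lower-lying A dominates (K_c ≪ 1), while at high kT the denser ladder of B wins (K_c > 1) even though B's levels start above A's ground level: exactly the entropy-versus-energy competition the text describes.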