
Lecture 5

Entropy

James Chou
BCMP201 Spring 2008

Some common definitions of entropy

A measure of the disorder of a system.

A measure of the amount of energy in a system that is no longer available
for doing work; entropy increases as matter and energy in the universe
degrade to an ultimate state of inert uniformity.

The differential of heat, dQ, is not an exact differential and therefore
cannot be integrated to give a state function. We therefore introduce an
integrating factor (1/T) such that dQ/T can be integrated; this dQ/T is
called the entropy.

Strategies for understanding entropy

The concept of maximum entropy in statistics.

Establish the link between statistical entropy and physical entropy.

Use the principle of maximum entropy to explain the energetic properties of
molecules in the nanoworld.

An interesting observation

[Figure: Gaussian distribution of v_x, centered at 0, produced by the random
distribution of kinetic energy through random collisions]

Consider the velocity in the x direction:

f(v_x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{v_x^2}{2\sigma^2}\right)

where \sigma^2 = \left\langle (v_x - \langle v_x \rangle)^2 \right\rangle = \langle v_x^2 \rangle = \frac{k_B T}{m}.

Goal of this lecture: Use the fundamental principle of maximum entropy to
explain the physical properties of a complex system in equilibrium with the
universe.

Maximum Entropy = Minimum Bias

Principle of Maximum Entropy in Statistics

Given some information or constraints about a random variable, we should
choose the probability distribution that is consistent with the given
information but otherwise has maximum uncertainty associated with it.

Example: Rolling a die with 6 possible outcomes.


The only constraint we have is

P(X=1) + P(X=2) + \dots + P(X=6) = 1

Without additional information about the die, the most unbiased distribution
is such that all outcomes are equally probable:

P(X=1) = P(X=2) = \dots = P(X=6) = 1/6

Shannon's Measure of Uncertainty

Shannon [1948] suggested the following measure of uncertainty, which is
commonly known as the statistical entropy:

H = -\sum_{i=1}^{N} p_i \ln p_i

1. H is a non-negative function of p_1, p_2, ..., p_N.

2. H = 0 if one outcome has probability 1.

3. H is maximum when the outcomes are equally likely.

In the case of the die, you will find the maximum entropy to be

H = -\sum_{i=1}^{6} p_i \ln p_i = \ln 6.
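
A minimal Python sketch of this measure (the loaded-die probabilities are an
arbitrary illustration, not from the lecture):

    import math

    def shannon_entropy(p):
        """H = -sum p_i ln p_i, in nats; terms with p_i = 0 contribute nothing."""
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    fair = [1/6] * 6                          # unbiased die
    loaded = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]   # an arbitrary biased die

    print(shannon_entropy(fair))    # ln 6 = 1.792..., the maximum
    print(shannon_entropy(loaded))  # ~1.50, less uncertainty than the fair die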

A quick review of logarithms

\log_e x = \ln x, \qquad e \approx 2.718

\ln(AB) = \ln A + \ln B, \qquad \ln(A/B) = \ln A - \ln B, \qquad \frac{d}{dx}\ln x = \frac{1}{x}

Stirling's approximation: \ln(N!) \approx N \ln N - N, for very large N.
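
Stirling's approximation is easy to check numerically; a small sketch using
math.lgamma to compute ln(N!) exactly:

    import math

    for N in (10, 100, 1000):
        exact = math.lgamma(N + 1)        # ln(N!) computed exactly
        approx = N * math.log(N) - N      # Stirling: N ln N - N
        print(N, exact, approx)           # relative error shrinks as N grows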

Shannon's entropy in terms of the number of possible outcomes

Example: the number of outcomes \Omega from rolling the die N times is

\Omega = \frac{N!}{(Np_1)!\,(Np_2)!\cdots(Np_6)!}

The numerator counts the permutations of the N rolls; the factorials in the
denominator factor out the redundant outcomes. Hence

\ln\Omega = \ln N! - \sum_{i=1}^{6} \ln\,(Np_i)!

Using Stirling's approximation for very large N, \ln N! \approx N \ln N - N,
\ln\Omega becomes

\ln\Omega = N \ln N - \sum_{i=1}^{6} Np_i \ln(Np_i)
          = N \ln N - \ln N \sum_{i=1}^{6} Np_i - N \sum_{i=1}^{6} p_i \ln p_i
          = -N \sum_{i=1}^{6} p_i \ln p_i = NH

Conclusion: \ln\Omega is linearly proportional to H. Therefore, maximizing the
total number of possible outcomes is equivalent to maximizing Shannon's
statistical entropy:

H = \mathrm{const} \times \ln\Omega
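
We can also verify \ln\Omega \approx NH directly for a fair die rolled N
times (a sketch; exact factorials via math.lgamma):

    import math

    N = 600
    p = [1/6] * 6
    counts = [N // 6] * 6    # Np_i = 100 outcomes of each face

    ln_omega = math.lgamma(N + 1) - sum(math.lgamma(c + 1) for c in counts)
    NH = -N * sum(pi * math.log(pi) for pi in p)

    print(ln_omega)   # ~1060
    print(NH)         # N ln 6 ~ 1075; the two converge as N grows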

Entropy in Statistical Physics


Definition of physical entropy:

S = \mathrm{const} \times \ln\Omega,

where \Omega = # of possible microstates of a closed system.

A microstate is the detailed state of a physical system.


Example: In an ideal gas, a microstate consists of the position and velocity of every
molecule in the system. So the number of microstates is just what Feynman said: the
number of different ways the inside of the system can be changed without changing the
outside.

Principle of maximum entropy (The second law of thermodynamics)


If a closed system is not in a state of statistical equilibrium, its macroscopic state will
vary in time, until ultimately the system reaches a state of maximum entropy.
Moreover, at equilibrium, all microstates are equally probable.

An example of maximizing entropy: N gas molecules expanding freely from
volume V into volume 2V.

S = \mathrm{const} \times \ln(\text{# of velocity states} \times \text{# of position states})

The # of velocity states does not change; the # of position states does.

\Delta S = S_2 - S_1 = \mathrm{const} \times [\ln\Omega_v^{(2)} + \ln\Omega_r^{(2)} - \ln\Omega_v^{(1)} - \ln\Omega_r^{(1)}]

\Delta S = \mathrm{const} \times [\ln(2V)^N - \ln V^N] = \mathrm{const} \times N \ln 2
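
As a quick numeric illustration: for one mole of gas doubling its volume,
\Delta S = N_0 k_B \ln 2 = R \ln 2 \approx 5.76 J K^{-1}.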

What is temperature?

[Figure: two bodies with energies E_1 and E_2 exchanging heat, shown not in
equilibrium and in equilibrium; picture from hyperphysics.phy-astr.gsu.edu]

E = E_1 + E_2 = \mathrm{const}, \qquad dE_1 = -dE_2

S = \mathrm{const} \times \ln(\Omega_1 \Omega_2) = S_1(E_1) + S_2(E_2)

Maximize S:

\frac{dS}{dE_1} = \frac{dS_1}{dE_1} + \frac{dS_2}{dE_1} = \frac{dS_1}{dE_1} - \frac{dS_2}{dE_2} = 0

At equilibrium,

\frac{dS_1}{dE_1} = \frac{dS_2}{dE_2}

Temperature T is defined by 1/T = dS/dE. The temperatures of bodies in
equilibrium with one another are equal.

What is the physical unit of T?

Since T is measured at a fixed number of particles N and volume V, a more
stringent definition is

T = (dE/dS)_{N,V}.

Thus far, S is defined to be \mathrm{const} \times \ln\Omega. If S is a
dimensionless quantity, T has the dimensions of energy (e.g., in units of
joules, J). But the joule is an inconveniently large unit for temperature.

Example: room temperature = 404.34 x 10^-23 J!

It is more convenient to measure T in kelvins (K). The conversion factor
between energy and degrees is the Boltzmann constant, k_B = 1.38 x 10^-23 J/K.
Hence we redefine S and T by incorporating the conversion factor:

S = k_B \ln\Omega \qquad \text{and} \qquad T \rightarrow T / k_B.

What does T = (dE/dS)_{N,V} mean?

[Figure: two systems of five particles distributed over energy levels e, 2e,
and 3e, one at lower T and one at higher T]

Lower T:

S_1 = k_B \ln\frac{5!}{2!\,2!}, \qquad S_2 = k_B \ln\frac{5!}{3!\,2!}, \qquad \Delta E / \Delta S = e / (k_B \ln 3)

Higher T:

S_1 = k_B \ln\frac{5!}{2!\,2!}, \qquad S_2 = k_B \ln\frac{5!}{3!\,2!}, \qquad \Delta E / \Delta S = 3e / (k_B \ln 3)

Same change in entropy, but more energy is given away by the system initially with higher
T. Hence temperature is a measure of the tendency of an object to spontaneously give up
energy to its surroundings.
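
A quick check of the multiplicities used in this example (a sketch; the
five-particle counting is taken directly from the slide):

    import math

    kB = 1.38e-23   # J/K

    omega1 = math.factorial(5) // (math.factorial(2) * math.factorial(2))  # 30 microstates
    omega2 = math.factorial(5) // (math.factorial(3) * math.factorial(2))  # 10 microstates

    dS = kB * math.log(omega2 / omega1)   # -kB ln 3, the same for both systems
    print(omega1, omega2, dS)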

Can we derive the equation of state of a gas (PV = nRT) from the concept of
entropy?

Step 1: Evaluate S = k_B \ln\Omega.

\Omega = \Omega_v \times \Omega_r

where \Omega_v = # of velocity states and \Omega_r = # of position states.

One particle: \Omega_v \propto 4\pi v^2, where v is the speed in 3D.

N particles: \Omega_v \propto 4\pi v^{3N-1} \approx 4\pi v^{3N} for large N.

Since v \propto E^{1/2},

S = k_B \ln\Omega = \frac{3 k_B N}{2} \ln E + \mathrm{const}.

Step 2: Relate the kinetic energy E to temperature.

S = k_B \ln\Omega = \frac{3 k_B N}{2} \ln E + \mathrm{const}

\frac{1}{T} = \frac{dS}{dE} = \frac{3 k_B N}{2E} \quad\Rightarrow\quad E = \frac{3}{2} k_B N T = \frac{3}{2} k_B (n N_0) T

where n = # of moles and N_0 = 6.02 x 10^23 mol^-1 is Avogadro's number.

R = k_B N_0 = 8.314 J mol^-1 K^-1 (the gas constant)

Energy of n moles of ideal gas: E = \frac{3}{2} nRT.

Energy of one ideal-gas molecule: E = \frac{3}{2} k_B T.
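
Plugging in numbers (a quick sketch of the thermal energy scale at room
temperature):

    kB = 1.38e-23   # J/K
    R = 8.314       # J / (mol K)
    T = 298.0       # K, room temperature

    print(1.5 * kB * T)   # ~6.2e-21 J per molecule
    print(1.5 * R * T)    # ~3.7 kJ per mole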

Step 3: Relate temperature to pressure.

For a particle in a box of side L, each collision with a given wall occurs
once per time interval 2L/v_x, and the change in momentum (e.g., in the x
direction), \Delta p_x, is 2mv_x.

F = \frac{\Delta p_x}{\Delta t} = \frac{2 m v_x}{2L/v_x} = m v_x^2 / L.

For N particles in a box,

F = \sum_{i=1}^{N} m v_{x,i}^2 / L = N m \langle v_x^2 \rangle / L.

Since P = F/L^2 = N m \langle v_x^2 \rangle / V and
E = \frac{1}{2} N m \langle v^2 \rangle = \frac{3}{2} N m \langle v_x^2 \rangle
(because \langle v^2 \rangle = \langle v_x^2 \rangle + \langle v_y^2 \rangle + \langle v_z^2 \rangle = 3 \langle v_x^2 \rangle),

we obtain PV = \frac{2}{3} E.

Finally, since E = \frac{3}{2} nRT, we obtain PV = nRT.
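
A numeric sanity check of the kinetic-theory pressure (a sketch; the N2-like
molecular mass is an illustrative choice, and it in fact cancels out):

    kB = 1.38e-23    # J/K
    N0 = 6.02e23     # molecules in one mole
    T = 298.0        # K
    V = 0.0245       # m^3, roughly one mole of gas at 1 atm and 298 K
    m = 28e-3 / N0   # kg, an N2-like molecule (cancels below)

    vx2 = kB * T / m         # <v_x^2> = kB T / m
    P = N0 * m * vx2 / V     # P = N m <v_x^2> / V
    print(P)                 # ~1.0e5 Pa, i.e. about 1 atm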

How do we deal with the enormous complexity of a biological system?

Boltzmann and Gibbs Distribution


Goal: Describe the probability distribution of a molecule of interest, a,
with energy E_a, in equilibrium with a macroscopic thermal reservoir, its
surrounding B, with energy E_B.

[Figure: a small molecule of interest, a, embedded in a large surrounding, B]

The second law says that at equilibrium, or maximum entropy, all microstates
are equally probable, with a probability P_0.

In the joint system, the probability of the molecule of interest being in a
particular state of energy E_a is proportional to the number of reservoir
microstates compatible with it, each occurring with probability P_0:

p(E_a) = \Omega_B(E_B) \cdot P_0 = \exp\left(\frac{S_B(E_B)}{k_B}\right) \cdot P_0, \qquad S_B(E_B) = k_B \ln\Omega_B(E_B)

[Figure: S(E_B) versus E_B, with E_{tot} = E_B + E_a \approx E_B marked near E_{tot}]

We can use a first-order Taylor expansion to approximate S_B(E_B), because
E_B is very near E_{tot}:

S_B(E_B) \approx S_B(E_{tot}) - \frac{dS_B(E_{tot})}{dE_B} E_a = S_B(E_{tot}) - \frac{E_a}{k_B T}

Hence we obtain

p(E_a) = P_0 \exp\left(\frac{S_B(E_{tot})}{k_B}\right) \exp\left(\frac{-E_a}{k_B T}\right) = A \exp\left(\frac{-E_a}{k_B T}\right)

This is the Boltzmann distribution, also known as the Gibbs distribution.

IMPORTANT: The probability distribution of the molecule of interest in
equilibrium with its surrounding depends only on the temperature of the
surrounding.
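
A minimal sketch of what the distribution implies: the relative occupancy of
two states separated by an energy gap dE (the gap chosen here is arbitrary,
for illustration only):

    import math

    kB = 1.38e-23    # J/K
    T = 298.0        # K
    dE = 5 * kB * T  # an arbitrary illustrative energy gap

    ratio = math.exp(-dE / (kB * T))   # p(high) / p(low) from the Boltzmann factor
    print(ratio)                       # e^-5 ~ 0.0067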

Now we can explain the velocity distribution at equilibrium using the
Boltzmann distribution.

[Figure: random distribution of kinetic energy through random collisions,
producing a Gaussian f(v_x)]

P = A \exp\left(\frac{-E}{k_B T}\right) = A \exp\left(\frac{-\frac{1}{2} m v^2}{k_B T}\right) = A \exp\left(\frac{-v^2}{2(k_B T / m)}\right)

which is exactly the Gaussian observed earlier:

f(v_x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{v_x^2}{2\sigma^2}\right), \qquad \sigma^2 = \langle v_x^2 \rangle = \frac{k_B T}{m}
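
A quick numeric illustration of the predicted width (a sketch; the N2-like
mass is an illustrative choice, not from the lecture):

    import math, random

    kB, T = 1.38e-23, 298.0
    m = 4.65e-26                    # kg, an N2-like molecule (illustrative)
    sigma = math.sqrt(kB * T / m)   # predicted width, ~300 m/s

    # sample v_x and check <v_x> ~ 0 and <v_x^2> ~ kB T / m
    vs = [random.gauss(0.0, sigma) for _ in range(100000)]
    print(sum(vs) / len(vs))                   # ~0 m/s
    print(sum(v * v for v in vs) / len(vs))    # ~8.8e4 m^2 s^-2
    print(kB * T / m)                          # the target value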

Example: rate constants and the transition state

A \xrightarrow{k} B

[Figure: reaction energy diagram with reactant A, product B, a transition
state at height E_a above A, and overall energy change \Delta E]

The reaction rate constant, k [s^-1], is proportional to \exp(-E_a / RT),
where E_a is the activation energy, or energy barrier, in units of J mol^-1:

k = A \exp\left(\frac{-E_a}{RT}\right) \qquad \text{(the Arrhenius equation)}

Suppose E_a of a reaction is 100 kJ mol^-1 and a catalyst lowers this to
80 kJ mol^-1. Approximately how much faster will the reaction proceed with
the catalyst?

\frac{k(\text{catalyzed})}{k(\text{uncatalyzed})} = \frac{\exp(-80/RT)}{\exp(-100/RT)} = e^{8} \approx 3000,

since RT \approx 2.5 kJ mol^-1 at room temperature.
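
The same arithmetic in a short sketch:

    import math

    R = 8.314e-3   # kJ / (mol K)
    T = 298.0      # K, so RT ~ 2.5 kJ/mol

    speedup = math.exp((100.0 - 80.0) / (R * T))   # exp(-80/RT) / exp(-100/RT)
    print(speedup)                                  # e^8 ~ 3000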

High energy barriers result in high specificity in a cellular signaling
pathway.

Example: the role of the prion conformational switch in neurodegenerative
diseases.

PrPC \xrightarrow{k} PrPSc, catalyzed by either mutation or binding of PrPSc

\Delta E^{\ddagger} \approx 40 kcal mol^-1 = 167.4 kJ mol^-1
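
As a rough illustration of why such a barrier confers specificity: at body
temperature RT \approx 2.6 kJ mol^-1, so the Boltzmann factor
\exp(-167.4/2.6) \approx 10^{-28}; the uncatalyzed conversion essentially
never occurs spontaneously.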

What about low energy barriers?

The temperature-gated vanilloid receptor VR1, a pain receptor, is activated
by heat (T > 45 °C).

[Figure: energy diagram for the closed and open states, with barrier E_a and
energy difference \Delta E]

It was found experimentally that the probability of VR1 being in the open
state is 0.04 at 40 °C and 0.98 at 50 °C. What is the energy barrier?
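
A sketch of one way to answer this, treating the channel as a two-state
(open/closed) Boltzmann system; note that this estimates the open-closed
energy difference \Delta E from the two measured probabilities:

    import math

    R = 8.314                 # J / (mol K)
    T1, T2 = 313.15, 323.15   # 40 C and 50 C in kelvins
    p1, p2 = 0.04, 0.98       # open-state probabilities from the slide

    # two-state model: p_open / p_closed = exp(-dE / RT)
    K1 = p1 / (1 - p1)
    K2 = p2 / (1 - p2)
    dE = R * math.log(K2 / K1) / (1 / T1 - 1 / T2)
    print(dE / 1000)          # ~600 kJ/mol: a huge effective energy for a 10-degree switch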

Take-home messages

Equilibrium = a system reaching a state of maximum entropy.

Equilibrium = all microstates are equally probable.

S = k_B \ln\Omega

T = a measure of the tendency of an object to spontaneously give up energy
to its surroundings.

T = (dE/dS)_{N,V}

The Boltzmann & Gibbs distribution:

p(E_a) = A \exp\left(\frac{-E_a}{k_B T}\right)
