Part II

Thermal & Statistical Physics


Eugene Terentjev
rm.245 Bragg

Relevant websites:
http://www-teach.phy.cam.ac.uk/teaching/webpages.php
http://www.bss.phy.cam.ac.uk/~emt1000/statphys.html

Michaelmas 2012 Part II Thermal & Statistical

Energy
Over-arching concept … Often-misused concept …
Physical “definition”: there exists a certain scalar
quantity that does not change in all the possible
transformations that Nature undergoes.
This is essentially an abstract mathematical constraint,
but we really can’t tell what energy “is”…

Potential Energy Kinetic Energy

Mass Energy Thermal Energy


Dissipation of energy
In most areas of physics we consider only situations where the total
energy of a given system is conserved. The forces may do some work,
but it is possible to convert that work back into potential energy.

In many situations, however, some of the work will be “lost”.


Where would the energy go, if there is a universal Law of
conservation of energy?
We say it converts into “heat”. One often thinks that “heat” is just a
form of kinetic energy of many particles – but since there are so many
of them (10²³ in a spoonful of water) we have no hope of harvesting
this energy back… so it is considered “lost”.


Dissipation of energy
This loss of energy into heat is called “energy dissipation”.
It is a result of wet friction (or “dissipative friction”), which
in most cases is a force proportional, and opposite in
direction to the velocity:
F_friction = − γ v

Compare this with dry friction (resistance), for which
F_friction = − µ N
where N is the normal force (mg for a block resting on a horizontal surface).

The force of friction is directly proportional to the applied load. (Amontons 1st Law)
The force of friction is independent of the area of contact. (Amontons 2nd Law)
The force of friction is independent of the sliding velocity. (Coulomb's Law)


Rate of dissipation
Let us calculate the power loss in an oscillator (an example):
m ẍ = − k x − γ ẋ + F(t),    with F(t) the external force

The power input is
P = F(t) dx/dt = ( m d²x/dt² + k x + γ dx/dt ) dx/dt

P = F_ext · v = d/dt [ m v²/2 + k x²/2 ] + γ (dx/dt)²

Which means: you have to spend some external power to keep a particle
moving against friction! If you don't (P = 0), then the rate of energy
loss is:
d/dt [Total Energy] = − F_friction · v
and this “loss of energy” goes into heat.
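The energy bookkeeping above can be checked numerically. A minimal sketch (not from the lecture; all parameter values are arbitrary illustrative choices): integrate m ẍ = −kx − γẋ and verify that the lost mechanical energy equals the accumulated ∫ γv² dt.

```python
# Damped oscillator m x'' = -k x - gamma x': the mechanical energy
# E = m v^2/2 + k x^2/2 decreases exactly at the rate gamma * v^2,
# the power dissipated as heat. All numbers are illustrative.
m, k, gamma = 1.0, 4.0, 0.3
x, v = 1.0, 0.0            # start stretched, at rest
dt, t_end = 1e-4, 5.0

E0 = 0.5 * m * v**2 + 0.5 * k * x**2
heat = 0.0                 # running integral of gamma * v^2 dt
t = 0.0
while t < t_end:
    a = (-k * x - gamma * v) / m
    v += a * dt            # semi-implicit Euler step
    x += v * dt
    heat += gamma * v**2 * dt
    t += dt

E = 0.5 * m * v**2 + 0.5 * k * x**2
print(abs(E0 - (E + heat)))  # ~0: initial energy = final energy + heat
```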


Work and Heat


The First Law of thermodynamics states:
“Energy is conserved when heat is taken into account”
In more quantitative terms: any change of the internal energy of a
system (which we now call U) is the sum of the work done on the system
and the heat supplied to it:
∆U = ∆W + ∆Q
If you remember, the work (interpreted as the change of P.E., to be
specific) was determined as ∆W = − F · ∆x
It is common to start the subject of thermodynamics with the analysis
of gases, so the force is what acts on a piston of area A:
∆W = − F · ∆x = − (F/A) · (A ∆l) = − P ∆V
So:  ∆U = ∆Q − P dV


Ideal gas: equation of state


Boyle (1662): at fixed temperature PV = constant
Charles (1787): at fixed mass of gas V = V₀ + constant · (T − T₀)

Extrapolating this linear relation, scientists determined very early a
“zero temperature” at which V = 0, which in today's Celsius scale is
at T = −273 °C.

A similar linear P–T relationship was also known (Amontons 1702), and
so this required PV = constant · T

Finally, Avogadro (1811→1860) determined the constant, which had to be
proportional to the amount of gas. He introduced the mol:  PV = nR·T
A mol is the mass in grams of a substance equal to its molecular
weight. Each mol contains exactly the same number of particles,
N_A = 6·10²³ !
The total number of particles is: N = n N_A
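As a quick sanity check of PV = nRT (a sketch; the numbers are the familiar molar values at standard conditions):

```python
R = 8.314                        # J/(mol K), gas constant
n, T, V = 1.0, 273.15, 22.4e-3   # one mole at 0 degrees C in 22.4 litres
p = n * R * T / V
print(p)                         # close to atmospheric pressure, ~1.0e5 Pa
```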


Heat capacity
One of the basic properties of all forms of matter
in response to heat is that of heat capacity.
∆Q = C · ∆T
The heat capacity describes the change in temperature of a substance
when it is supplied with a given amount of heat:  C = dQ/dT
But most materials (especially gas) can also expand on heating.
So we must specify in what conditions the heat capacity is
measured, e.g. at constant volume or at constant pressure
C_V = (dQ/dT)_V        C_P = (dQ/dT)_P
The difference between them is the work (−PdV) done in expanding the body:
V = constant:  dQ = dU = C_V dT
P = constant:  dQ = dU + PdV = C_V dT + nR dT = (C_V + nR) dT
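The constant-volume vs constant-pressure bookkeeping above can be made concrete in a few lines (a sketch for a monatomic ideal gas; the parameter values are arbitrary):

```python
R = 8.314                  # J/(mol K)
n, T, p = 1.0, 300.0, 1.0e5
Cv = 1.5 * n * R           # monatomic ideal gas, U = (3/2) nRT
dT = 1.0
Q_V = Cv * dT              # constant V: all heat raises U
dV = n * R * dT / p        # expansion at fixed p, from pV = nRT
Q_p = Cv * dT + p * dV     # constant p: extra heat pays for the work p*dV
print(Q_p - Q_V, n * R * dT)   # the difference is nR per kelvin
```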


Temperature and Entropy


Entropy (Clausius 1850, Gibbs, Boltzmann) is a measure of
the unavailability of a system’s energy to do work (see
1st law ∆U = ∆W + ∆Q ). In simple terms, the entropy S
is the “heat per unit temperature”:
∆S = ∆Q / T    or    dQ = T · dS

However, the modern (primary) definition of entropy is from
statistical principles. This is often called “Boltzmann entropy”
and is defined as  S = k_B ln Ω

where Ω is the number of states the system can explore


The Second Law of thermodynamics states that the entropy of
an isolated system can only increase with time (∆S>0) and
approaches its maximal value in equilibrium.
→ Heat cannot spontaneously flow from a material at lower
temperature to a material at higher temperature.


Variable particle number


Consider a gas in a box – change the particle number (N → N+dN) while
keeping the volume constant (dV = 0). If this addition is done
reversibly, that is, not creating heat (dQ = TdS = 0), then the
increase in the energy of this gas is:
∆U = µ ∆N
This defines the chemical potential  µ = (∂U/∂N)_{V,S}
In summary, for an arbitrary process capable of changing the volume of
the gas (i.e. doing mechanical work), converting some of the energy
into heat, and also exchanging particles – we have the energy increment:
dU = TdS − pdV + µ dN


Other sources of energy


“Paramagnetic Salt” is a model system where each atom is fixed on a
lattice and carries a magnetic moment (spin = ½), independently of its
neighbours.
Without an external field the up and down spin energy levels are the
same, ε↑ = ε↓ = 0, but a magnetic field splits the degeneracy:
ε↑ = mB ,   ε↓ = − mB
so that   U = − M · B = ( N↑ − N↓ ) mB    and    dU = TdS − M · dB

Please watch the dimensionality… People (and textbooks)


are often flippant about the difference between “energy”
and “energy density” - and in this case: the “magnetic
moment” and “magnetisation”…


Other sources of energy


“Simple Harmonic Oscillator” is a workhorse of physics (this
is because any potential energy is quadratic near equilibrium).
For the potential V(x) = ½ a x², the nth energy level of the excited
oscillation has the value
ε_n = ħω ( n + ½ )
where the natural frequency ω = √(a/m).
The total energy of an assembly of such oscillators is then
U = Σ_i ħω ( n_i + ½ )


Other sources of energy


The Van der Waals gas is an empirical model system that takes into
account the pair interactions between the particles of a gas: a
short-range repulsion and a long-range attraction in the pair
potential V(r).

( p + N²a/V² ) ( V − Nb ) = N k_B T

or    p = N k_B T / (V − Nb)  −  N²a / V²
We shall use this model in several contexts in this course.
Here let’s just note that with a potential like this, the energy
can be stored as kinetic or potential form – and one can
convert kinetic ↔ potential energy, for example, by Joule
expansion.


Equation of state
The equation of state is a relationship between ( p,V,T ) – for
instance:
pV = N k_B T    or    p = N k_B T/(V − Nb) − N²a/V²    but not    pV^γ = const
Such a relationship describes a surface in a 3D (p,V,T) space!


Thermodynamic variables
Thermodynamic variables are observable properties of any
system. They fall into two categories, intensive and extensive:
System Intensive Extensive
Linear spring F force x displacement
Gas p pressure V volume
Particle exchange µ chemical potential N number
Surface film γ surface tension A area
Electrostatic V potential q charge
Magnetic B “magnetic field” M “magnetic moment”
…any… T temperature S entropy
Intensive and extensive variables form conjugate pairs,
whose product has the dimensionality of (and therefore
represents) energy,
e.g. ∆U = ∆Q + ∆W = T ⋅ ∆S − P ⋅ ∆V
Other forms of work:
∆W = − F ⋅ ∆x ∆W = γ ⋅ ∆A etc.


Thermodynamic potentials
We have just seen, for the mean internal energy of the
system: ∆U = T ⋅ ∆S − p ⋅ ∆V or dU = T ⋅ dS − p ⋅ dV
This means that (S,V) are the natural variables of U(S,V)
which means that to determine the value of U we must
measure S and V (or “keep them under control”).
But in some situations (e.g. in an open container) it may be hard to
control or measure the volume; instead, the pressure p is the more
natural variable.
Introduce the enthalpy:  H = U + pV
then  dH = dU + d(pV) = TdS − pdV + pdV + Vdp = TdS + Vdp
We conclude that H = H(S,p), with the conjugate pair (p,V) “switched” –
that is, now V plays the role of a “force” while p is the “displacement”.


Thermodynamic potentials
What we have just seen: U(S,V) → H(S,p) is one
example of a general “Legendre transformation” switching
between the “force” and “displacement” within pairs of
thermodynamic variables.
In many situations it may be hard to control or measure the entropy;
instead, the temperature T is the more natural variable.
Introduce the free energy:  F = U − TS
then  dF = dU − d(TS) = TdS − pdV − TdS − SdT = − SdT − pdV
We conclude that F = F(T,V). In the same way we may introduce the
Gibbs free energy:  G = H − TS = F + pV
dG = dF + d(pV) = − SdT − pdV + pdV + Vdp = − SdT + Vdp


Thermodynamic derivatives
Thermodynamic potentials are different forms of energy,
expressed in appropriate “natural variables”. We have just
seen four: U(S,V) → H(S,p) → G(T,p) → F(T,V).
dU = TdS − pdV dH = TdS + Vdp
dG = − SdT + Vdp dF = − SdT − pdV
This means that we can have partial derivatives, in each
case determining the corresponding thermodynamic force:
e.g.  T = (∂U/∂S)_V = (∂H/∂S)_p        p = − (∂U/∂V)_S = − (∂F/∂V)_T

What happens if we take a derivative with respect to a “wrong” variable?
e.g.  dH/dV = ( TdS + Vdp )/dV ,  so  (∂H/∂V)_S = V (∂p/∂V)_S

(∂U/∂T)_V = T (∂S/∂T)_V = C_V ,  so  dS = (C_V/T) dT


Maxwell relations
Thermodynamic potentials are different forms of energy,
expressed in appropriate “natural variables”. We have
seen four: U(S,V) → H(S,p) → G(T,P) → F(T,V).
dU = TdS − pdV dH = TdS + Vdp
dG = − SdT + Vdp dF = − SdT − pdV
One can evaluate a second derivative in two ways:
One can evaluate a second derivative in two ways:
e.g.  ∂²U/∂S∂V :   (∂T/∂V)_S = − (∂p/∂S)_V    – one of the Maxwell relations

(∂S/∂p)_T = − (∂V/∂T)_p      (∂S/∂V)_T = (∂p/∂T)_V      (∂T/∂p)_S = (∂V/∂S)_p

Do you notice the pattern?
1) Which two variables → 2) Which potential → 3) What sign?
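A Maxwell relation is easy to verify numerically. A sketch using the ideal-gas entropy derived later in these notes, S = Nk_B ln(const·V T^{3/2}/N), to check (∂S/∂V)_T = (∂p/∂T)_V by central differences (units and parameter values are arbitrary):

```python
import math

N, kB = 1e3, 1.38e-23
def S(T, V):               # additive constant drops out of derivatives
    return N * kB * math.log(V * T**1.5)
def p(T, V):               # ideal-gas equation of state
    return N * kB * T / V

T0, V0, h = 300.0, 1e-3, 1e-6   # relative step for central differences
dS_dV = (S(T0, V0 * (1 + h)) - S(T0, V0 * (1 - h))) / (2 * h * V0)
dp_dT = (p(T0 * (1 + h), V0) - p(T0 * (1 - h), V0)) / (2 * h * T0)
print(dS_dV, dp_dT)        # both equal N*kB/V
```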


Analytic Methods
Chain rule:           (∂x/∂y)_z = (∂x/∂u)_z (∂u/∂y)_z
Reciprocity theorem:  (∂x/∂y)_z = − (∂x/∂z)_y (∂z/∂y)_x
Maxwell relation:     for a potential Π = Π(x,y):  (∂X/∂y)_x = ± (∂Y/∂x)_y

Entropy of an ideal gas
Suppose S = S(p,T):   dS = (∂S/∂T)_p dT + (∂S/∂p)_T dp

dS = C_p dT/T − (∂V/∂T)_p dp = C_p dT/T − N k_B dp/p

Now integrate:
S = C_p ln T − N k_B ln p + const

S = N s₀ + N k_B [ (5/2) ln T − ln( N k_B T/V ) ] = N k_B ln( const · V T^{3/2} / N )


Cp and Cv once again...


Let us use this as another example of calculation:
For any function, such as S(T,V):
dS = (∂S/∂T)_V dT + (∂S/∂V)_T dV
Now evaluate
C_p = T (∂S/∂T)_p = T (∂S/∂T)_V + T (∂S/∂V)_T (∂V/∂T)_p

C_p = C_V + T (∂p/∂T)_V (∂V/∂T)_p

We stop as soon as the combination (p,V,T) is reached.
For an ideal gas:  (∂p/∂T)_V = N k_B/V ;   (∂V/∂T)_p = N k_B/p
so   C_p = C_V + N k_B


Joule expansion
Isolated system, so ∆U = 0. During the expansion V₁ → V₂:

T₂ − T₁ = ∫_{V₁}^{V₂} (∂T/∂V)_U dV

(∂T/∂V)_U = − (∂T/∂U)_V (∂U/∂V)_T = − (1/C_V) ( TdS − pdV )/dV
          = − (1/C_V) [ T (∂p/∂T)_V − p ]

We stop as soon as the combination (p,V,T) is reached.
For an ideal gas:  T (∂p/∂T)_V = N k_B T/V = p ,  so ∆T = 0.

Try to evaluate this for a non-ideal gas, e.g.  p = N k_B T/(V − Nb) − N²a/V²
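Carrying out that suggested evaluation: for the Van der Waals gas, T(∂p/∂T)_V − p = N²a/V², so (∂T/∂V)_U = −N²a/(C_V V²) and the gas cools on Joule expansion. A sketch with roughly nitrogen-like numbers (the value of a and the use of C_V = (3/2)Nk_B are assumptions for illustration):

```python
kB = 1.38e-23
N = 6.0e23                 # about one mole of particles
a = 3.8e-49                # J m^3 per particle pair (assumed, ~N2-like)
Cv = 1.5 * N * kB          # assumed monatomic-style heat capacity
V1, V2 = 1e-3, 2e-3        # m^3: double the volume

# closed form: integrate (dT/dV)_U = -a N^2 / (Cv V^2) from V1 to V2
dT_exact = (a * N**2 / Cv) * (1.0 / V2 - 1.0 / V1)

# midpoint-rule integration of the same expression, as a check
steps = 10000
dV = (V2 - V1) / steps
dT_num = sum(-a * N**2 / (Cv * (V1 + (i + 0.5) * dV)**2) * dV
             for i in range(steps))
print(dT_exact, dT_num)    # both negative: the gas cools by a few kelvin
```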

Summary so far……..

The 2nd Law of thermodynamics:
“Entropy of a closed system increases to a maximum in equilibrium”
dQ = TdS        S = k_B ln Ω

Thermodynamic variables come in conjugate pairs of “force”–“variable”.
A given set defines the corresponding thermodynamic potential:
“Enthalpies”:     dU, dH, ... = TdS + { Y dx }
“Free energies”:  dF, dG, ... = − SdT + { Y dx }
The mean energy U(S,V,N) is the only potential that depends on all the
extensive variables! Hence U(λS, λV, λN) = λ U(S,V,N)

Tools of analytical thermodynamics:


Maxwell relations – Reciprocity – Chain rule → stop at (P,V,T)


Internal equilibrium
Let us consider a closed system, fully isolated from outside.
The 2nd Law demands that the change of entropy of this
system can only be positive, ∆S>0, and it should be maximum
in equilibrium.
Let's divide the system (with fixed total U, V, N) into two parts, 1
and 2, which can exchange U, V and N:

dS_tot = dS₁ + dS₂ = ( dU₁ + p₁dV₁ − µ₁dN₁ )/T₁ + ( dU₂ + p₂dV₂ − µ₂dN₂ )/T₂

       = ( dU₁ + p₁dV₁ − µ₁dN₁ )/T₁ + ( − dU₁ − p₂dV₁ + µ₂dN₁ )/T₂

       = ( 1/T₁ − 1/T₂ ) dU₁ + ( p₁/T₁ − p₂/T₂ ) dV₁ − ( µ₁/T₁ − µ₂/T₂ ) dN₁

In equilibrium dS_tot = 0 for any small exchange, so between any two
parts inside a closed system:
T₁ = T₂ ;   p₁ = p₂ ;   µ₁ = µ₂


Equilibrium in open systems


A much more relevant problem is about a system interacting
with a reservoir (the rest of the Universe). The 2nd Law is still
in action, only it applies to the whole:
As before, the two parts of the (closed) Universe – the system (1) and
the reservoir (R) – can exchange U, V and N:

dS_tot = ( dU₁ + p₁dV₁ − µ₁dN₁ )/T₁ + ( dU_R + p_R dV_R − µ_R dN_R )/T_R

Since ∆X_R = − ∆X₁ for each exchanged quantity (dropping the system
subscript from here on):

dS_tot = dS + ( − dU − p_R dV + µ_R dN )/T_R
       = (1/T_R) ( T_R dS − dU − p_R dV + µ_R dN )

So far no assumptions have been made – but surely the “reservoir” is very big...


Availability
Now assume the reservoir is so big that it “doesn't notice” any changes
that our system inflicts on it:
dT_R = dp_R = dµ_R = 0
So, e.g.:  p_R dV = d(p_R V) ;  etc.

We can now define a new object for our system, called the availability A:
dA = − T_R dS_tot
Since  dS_tot = (1/T_R)( T_R dS − dU − p_R dV + µ_R dN ),  this means:
dA = dU − T_R dS + p_R dV − µ_R dN
   = (T − T_R) dS − (p − p_R) dV + (µ − µ_R) dN

• Availability is a function of the system variables (U, S, V, N)
• Since the R-variables are constant:  A = U − T_R S + p_R V − µ_R N
• The 2nd Law means dA ≤ 0: availability reaches its minimum in equilibrium.


Availability
dA ≡ − T_R dS_total = dU − T_R dS + p_R dV − µ_R dN

A particularly important aspect of the 2nd Law, in the form of the
availability reaching its minimum when the system is in equilibrium
with the (big) reservoir, is that for any variable X characterising
our system the probability is:
P(X) ∝ exp( − A(X) / k_B T )

Now consider our system at constant T, V and N. Then:
dA|_{T,V,N} = dU − T_R dS + p_R dV − µ_R dN = dU − T dS = d(U − TS)
            = dF(T,V,N)
... the Helmholtz free energy, which has T, V and N as its proper variables.
Check yourself:  dA|_{T,p,N} ;   dA|_{S,p,N} ;   dA|_{S,V,N}


Phase equilibrium
If we are looking at an open system, it is under (p,T) control and so
is described by G(T,p,N) – in contrast to a fixed-volume vessel, which
requires (V,T), and hence F.
When the two phases (liquid and vapour) coexist, the full potential is
G = G_l + G_v
In equilibrium:  dA|_{T,p} = dG = 0
dG_l = − S_l dT + V_l dp + µ_l dN_l = − dG_v = S_v dT − V_v dp − µ_v dN_v
Now dT = dp = 0 and dN_l = − dN_v, so   µ_l = µ_v

This is part of our earlier equilibrium condition set for the two parts
of a system: whenever there is particle exchange, we find µ matching
between the subsystems.


Phase equilibrium
Look at the familiar example of VdW isotherms and the
resulting gas-liquid transition. On the (p,V) plane we have:

And at a temperature
below the critical point
F there is a region of
coexistence. How can
we find the pressure
at which this occurs,
for a given T ?
µl ( A) = µv ( E )
E  ∂µ 
µ v ( E ) = µl ( A) + ∫   dp The Gibbs-Duhem equation:
A
 ∂p T dµ = − sdT + vdp
E
= µl ( A) + ∫ vdp
A The equal area rule gives the
= 0 vapour pressure p (T)
v

Summary so far……..

Availability – the energy function that reflects the balance between
the system and the reservoir:
dA = dU − T_R dS + p_R dV − µ_R dN
   = (T − T_R) dS − (p − p_R) dV + (µ − µ_R) dN

A system interacting with a reservoir has the probability of its
variable X taking a given value:
P(X) ∝ exp( − A(X) / k_B T )

When you control a certain set of variables of your system, e.g. (X,Y),
a small increment in availability equals the increment of the
corresponding thermodynamic potential Π(X,Y),
e.g.   dA|_{T,V,N} = dF(T,V,N)


Microstates and Macrostate


Microstate is a particular configuration (realisation) of the
system with certain values of microscopic parameters, e.g.
• set of solutions of a Schrödinger equation with an energy Ei
• positions and velocities of particles in a classical gas
• an arrangement of spins in a paramagnetic lattice

Macrostate is a set of all microstates which has a certain


mean energy U and is subject to any other constraint: V, N, etc.
For an isolated system, all microstates compatible
with the given constraints are equally likely to occur
Statistical mechanics is all about counting the number of microstates
Ω(U,V,N). This gives the statistical definition of temperature,
1/(k_B T) = ( ∂ ln Ω / ∂U ),
and the Boltzmann entropy:  S = k_B ln Ω


Statistical entropy
The entropy (heat per unit temperature) is the measure of
irreversibility, of chaos, of disorder – quantitatively
measured as “number of configurations”: S = k B ln Ω
Examples: 1) Identical particles in a box: Ω = N!
S = k_B ln(N!) ≈ k_B ( N ln N − N ),  using the Stirling approximation
N! ≈ N^N e^(−N)

2) Several populations (N₁ + N₂ + N₃ + … = N):  Ω = N! / ( N₁! N₂! N₃! … )

S = k_B ln Ω ≈ k_B [ N ln N − N − Σ_i ( N_i ln N_i − N_i ) ]

  = k_B Σ_i N_i ( ln N − ln N_i ) = − N k_B Σ_i (N_i/N) ln(N_i/N)

With P(i) = N_i/N this is the “Gibbs entropy”:  S = − N k_B Σ_i P(i) ln P(i)
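The Stirling step can be checked against exact log-factorials (a sketch; the population numbers are arbitrary and entropy is in units of k_B):

```python
import math

N1, N2 = 400, 600
N = N1 + N2
# exact Boltzmann entropy S = ln(Omega), via lgamma(n+1) = ln(n!)
S_exact = math.lgamma(N + 1) - math.lgamma(N1 + 1) - math.lgamma(N2 + 1)
# Gibbs form with probabilities p_i = N_i / N
S_gibbs = -N * sum(q * math.log(q) for q in (N1 / N, N2 / N))
print(S_exact / S_gibbs)   # close to 1; Stirling's error is only O(ln N)
```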

16

Canonical ensemble
Microcanonical ensemble is just a collection of thermally
isolated systems, all with the same energy – and hence the
same probability of occurrence.
Canonical (or standard) ensemble is a collection of systems
all connected to a reservoir (and possibly to each other) so
that they can exchange energy. So the mean energy of this
ensemble U fluctuates. We have a definite number: N.
There will also be a grand canonical ensemble (for the lack of
a better name), in which the systems are also allowed to
exchange particles, so that both U and N fluctuate (while the
intensive variables T and µ are fixed).
Gibbs derived the probability for a chosen subsystem to be found in a
microstate (i) with energy E_i:
P(i) = (1/Z) e^( − E_i / k_B T )


Partition function
The Boltzmann factor determines the probability for a system to be
found in a state with energy E_i:
P(i) = (1/Z) · e^( − E_i / k_B T )
where the normalization factor is
Z = Σ_{all states i} exp( − E_i / k_B T )

This is not a mere normalization, but a very important object in
statistical physics, called the partition function. It encodes the
statistical properties of the whole system. Its significance is mainly
in its role in defining the free energy, the most important form of
thermodynamic potential energy:
F = − k_B T ln Z    or    Z = e^( − F/k_B T )

We can find the average energy of the system:
U = ⟨E⟩ = Σ_i E_i P(i) = Σ_i E_i e^( − E_i/k_B T ) / Σ_i e^( − E_i/k_B T )

U = − (1/Z) ∂Z/∂β = − ∂(ln Z)/∂β    with β = 1/k_B T
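The identity U = −∂(ln Z)/∂β is easy to check on a toy spectrum (a sketch; the energy levels are arbitrary made-up numbers):

```python
import math

E = [0.0, 1.0, 1.0, 2.5]       # assumed toy energy levels
kT = 1.3
beta = 1.0 / kT

Z = sum(math.exp(-e * beta) for e in E)
P = [math.exp(-e * beta) / Z for e in E]
U_direct = sum(e * q for e, q in zip(E, P))    # U = sum E_i P(i)

lnZ = lambda b: math.log(sum(math.exp(-e * b) for e in E))
h = 1e-6
U_deriv = -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)   # -d(lnZ)/d(beta)
print(abs(U_direct - U_deriv))  # ~0
```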


Maximum probability
It is now quite obvious that the maximum probability P(i) belongs to
the microstate with: [1] the lowest energy E_i, and [2] the largest
“degeneracy” Ω_i, i.e. the number of states with the same energy:

Z = Σ_{all microstates} e^( − E_i/k_B T ) = Σ_{levels E_i} e^( − E_i/k_B T ) · e^( ln Ω_i )

  = Σ_{levels E_i} e^( − ( E_i − k_B T ln Ω_i )/k_B T ) = Σ_{E_i} e^( − ( E_i − T S_i )/k_B T ) = Σ Z_i

Maximization of the partition function, or (equivalently!) minimization
of the free energy F = U − TS, is the main driving force in all Nature:
average (potential) energy U → minimum, average entropy S → maximum,
in balance…

Last lecture……..
Summary so far……..

The maximum probability P(i) is for the microstate with: [1] the lowest
energy E_i, and [2] the largest “degeneracy” Ω_i (the number of states
with the same energy):

Z = Σ_{all states} e^( − E_i/k_B T ) = Σ_{levels E_i} e^( − ( E_i − k_B T ln Ω_i )/k_B T ) = Σ Z_i

P(i) = (1/Z) e^( − ( E_i − T S_i )/k_B T )

Maximization of the partition function, or (equivalently!) minimization
of the free energy F_i = E_i − T S_i, is the main driving force in all
natural processes: average energy U → minimum, average entropy
S → maximum, in balance…


Two-level system
As one of the simplest examples of a real physical case that we can
analyze within proper statistical mechanics, let's consider a 2-level
system: the object (e.g. an electron in an atom, or a spin) can exist
on either of two levels, E = 0 or E = ε.

Z = Σ_{all states} e^( − E_i/k_B T ) = 1 + e^( − ε/k_B T )

Once the partition function is found, you know everything about the system!

Route 1: mean energy  U = − ∂(ln Z)/∂β ,  where β = 1/k_B T.
Here we have

U = − 1/( 1 + e^(−βε) ) · ∂/∂β [ e^(−βε) ] = ε e^(−βε)/( 1 + e^(−βε) )
  = ε / ( e^( ε/k_B T ) + 1 )

Low-T limit:  U ≈ ε e^( − ε/k_B T ) ;   high-T limit:  U → ε/2
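Both limits of U(T) = ε/(e^(ε/k_BT) + 1) can be verified in a couple of lines (a sketch with ε and k_B set to 1):

```python
import math

eps = 1.0
U = lambda kT: eps / (math.exp(eps / kT) + 1.0)
print(U(0.05))    # low T: activated behaviour ~ eps * exp(-eps/kT), tiny
print(U(100.0))   # high T: approaches eps/2, both levels equally occupied
```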


Two-level system
Partition function:  Z = 1 + e^( − ε/k_B T )
Once the partition function is found, you know everything about the system!

Route 2: free energy  F = − k_B T ln Z = − k_B T ln( 1 + e^( − ε/k_B T ) )

Recall the thermodynamic potential properties:  dF = − SdT − PdV

S = − (∂F/∂T)_V = k_B ln( 1 + e^( − ε/k_B T ) ) + (ε/T) · 1/( e^( ε/k_B T ) + 1 )

At low temperature U → min “wins” and the entropy “loses out”:
S ≈ (ε/T) e^( − ε/k_B T ) → 0
At high temperature S → max “wins” and the energy follows:
S → k_B ln 2
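The two limits of the entropy can be confirmed numerically (a sketch with ε = k_B = 1):

```python
import math

eps = 1.0
def S(kT):   # entropy per two-level system in units of kB, from the formula above
    return math.log(1 + math.exp(-eps / kT)) + (eps / kT) / (math.exp(eps / kT) + 1)

print(S(0.01))   # -> 0: the ground state wins at low T
print(S(1e3))    # -> ln 2: both states equally likely at high T
```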


Paramagnetism
Paramagnetism, in its simplest form, is analogous to the 2-level system.
Consider a system (e.g. a crystalline solid) in which each atom carries
a magnetic moment m [a spin]. Assume the spins do not interact (each is
on its own).
Quantum mechanics (the selection rules for a spin s = ½) allows only 2
states for each spin, “up” and “down”, with the same energy. But if an
external magnetic field B is imposed, the “up” and “down” states have
different energies, E = ± m·B :

Z₁ = Σ_{all states} e^( − E_i/k_B T ) = e^( mB/k_B T ) + e^( − mB/k_B T )
   = 2 cosh( mB/k_B T )

But there are N such atoms in the system, all independent:  Z = Z₁^N

Once the partition function is found, you know everything about the system!


Paramagnetism
Partition function:  Z = ( e^( mB/k_B T ) + e^( − mB/k_B T ) )^N

Route 2: free energy
F = − N k_B T ln( e^( mB/k_B T ) + e^( − mB/k_B T ) )

Magnetization M (exactly like the dielectric polarization P) should be
defined as the sum of all m-dipoles per unit volume. It is an extensive
thermodynamic variable, forming a conjugate pair with the (intensive)
B-field (same with P dE):  dF = − SdT − PdV ± M dB   (which sign?)

The average magnetization induced by an external B is determined by the
appropriate thermodynamic derivative:  M = (?) (∂F/∂B)_{T,V}

M = N k_B T · d/dB ln( e^( mB/k_B T ) + e^( − mB/k_B T ) ) = N m · tanh( mB/k_B T )

M is linear in B at high T and saturates at ± N m in a strong field.
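The tanh magnetization interpolates between the linear (Curie) regime and saturation, which a short sketch confirms (N, m and the field values are arbitrary, with k_B = 1):

```python
import math

N, m = 1.0, 1.0
M = lambda B, kT: N * m * math.tanh(m * B / kT)
print(M(10.0, 0.1))                            # strong field / low T: saturates at N*m
print(M(0.01, 10.0), N * m**2 * 0.01 / 10.0)   # weak field: Curie law N m^2 B / kT
```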


Quantum oscillator
An oscillator (e.g. a vibrating molecular bond) has an infinite number
of states (labeled by n), separated by equal energy gaps:
E_n = ħω ( n + ½ )
Whether we can “see” this discrete nature of the oscillator motion
depends on our equipment: whether it is sensitive to the accuracy ∆E = ħω.
From the statistical point of view, we look for the partition function:

Z = Σ_{n=0}^{∞} e^( − ħω( n + 1/2 )/k_B T )

This is a geometric progression ( Σ aⁿ = 1/(1 − a) ):

Z = e^( − ħω/2k_B T ) / ( 1 − e^( − ħω/k_B T ) ) = 1 / ( e^( ħω/2k_B T ) − e^( − ħω/2k_B T ) )

Low-temperature limit:   Z ≈ e^( − ħω/2k_B T )
High-temperature limit:  Z ≈ k_B T/ħω

Mean energy:  U = − ∂(ln Z)/∂β

High-T:  U ≈ ∂/∂β ln( ħωβ ) = 1/β = k_B T    – the “classical limit”, C_V = k_B
Low-T:   U ≈ ∂/∂β ( ħωβ/2 ) = ½ ħω    – “zero-temperature oscillations”
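From the Z above, U = −∂(ln Z)/∂β = (ħω/2)·coth(ħω/2k_BT), and both limits follow (a sketch with ħω = 1):

```python
import math

hw = 1.0                     # hbar*omega, arbitrary units
U = lambda kT: 0.5 * hw / math.tanh(0.5 * hw / kT)   # (hw/2) coth(hw/2kT)
print(U(1e-3))   # low T:  -> hw/2, the zero-point energy
print(U(1e3))    # high T: -> kB*T, classical equipartition
```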


Quantum oscillator
An oscillator has E_n = ħω( n + ½ ). Once we know the partition
function, we know all the properties of our system…

Low-temperature limit:   Z ≈ e^( − ħω/2k_B T )
High-temperature limit:  Z ≈ k_B T/ħω

Free energy:  F = − k_B T ln Z ≈ ½ ħω   (low T: F = U, and the entropy
is zero in the “ground state”)

F ≈ − k_B T ln( k_B T/ħω )   (high T)

Entropy in this classical limit:

S = − ∂F/∂T = k_B ln( k_B T/ħω ) + k_B = k_B ln( e · k_B T/ħω )

with the crossover in S(T) at k_B T ≈ ħω.

Take a deep breath.

… Stretch.

… We move forward


Continuous systems
So far we have seen how to handle physical systems in which
we can enumerate different microstates (Ei and Ni) – and find
the partition function, which in turn makes accurate predictions
about average (most probable) state of such systems.
But the simplest(?) and most common object to study in thermodynamics
is the ideal gas, PV = N k_B T. Let's look at it.

Actually, N particles is far too many: let's start with just one
particle of mass m and velocity v in a box of volume V = L³!

Z = Σ_{all states} e^( − E_i/k_B T )

We have two (related) difficulties:
(a) what is E, when the particle doesn't interact with anything,
and (b) how to count the “states”…


Phase space
Normally, by E we would want to mean the potential energy W(x), which
would have a “minimum” and hence the higher Boltzmann factor there.
But here there is no P.E. – only kinetic energy.

Normally, you would describe your system (here – just one particle) by
assigning it a position, say x(t), if we look at only one dimension.
But this is clearly not a full description: we also need the velocity
(or momentum).

Introduce the “phase space”: the coordinate and the momentum along each
axis, x(t) and p = mv. An element of phase space, at a time t, gives a
full predictive description of where the particle will be at t + dt.


Continuous states
So we account for all the possible states of a particle by summing over
all possible points in its phase space. Consider how to do this in a
one-dimensional example (along the x-axis):

Σ_{all states} = Σ_{p_i} Σ_{x_i}  →  ∫ dx dp / ( ∆x ∆p )

But note that the sum was non-dimensional, while the integral ∫dx dp
has the dimensionality of [kg·m²/s]! The formal conversion of a sum
into an integral requires dividing by the elementary “step of
discretisation”; here it is ∆x ∆p.
What is the smallest possible value ∆x ∆p can take? 2πħ.
This result is worth remembering:
“the world is discrete in phase space”

Σ_{all states} = ∫ dx dp / ( 2πħ ) = ∫ dx dk / 2π


One particle in a box
We can now make progress: the single particle of mass m and momentum p
has its statistical partition function

Z₁ = Σ_{all states} e^( − E/k_B T ) = ∫ d³x d³p/(2πħ)³ · e^( − p²/2m k_B T )

First of all, notice that nothing under the integral depends on x, i.e.
there is no potential energy V(x): this is the ideal gas!

Secondly, instead of doing a complicated 3-dimensional integral, note
that p² = p_x² + p_y² + p_z² and d³p = dp_x dp_y dp_z, so the integral
factorizes:

Z₁ = V · ∫ d³p/(2πħ)³ e^( − p²/2m k_B T )
   = V · ( ∫ dp_x/(2πħ) e^( − p_x²/2m k_B T ) )³  =  V/λ³

This is a very important expression: the one-dimensional integral
defines 1/λ.


One particle in a box
The particle of mass m and momentum p has the partition function

Z₁ = ∫ d³x d³p/(2πħ)³ e^( − p²/2m k_B T ) = V/λ³

This is how many ways you can “pack” the particle into this box!
So what is this length scale λ?

1/λ = ∫_{−∞}^{+∞} dp/(2πħ) e^( − p²/2m k_B T ) = √( 2π m k_B T )/( 2πħ )

so   λ = 2πħ / √( 2π m k_B T ) = √( 2πħ² / m k_B T )

Recall the de Broglie wavelength of the wave representation of a
particle, with the mean thermal velocity from the Maxwell distribution:

λ = 2πħ/p = 2πħ/(mv) ≈ 2πħ/( m √(k_B T/m) ) = 2πħ/√( m k_B T )

So we have managed to “count” the possible states a free classical
particle can have in the box – and the result is just Z₁ = V/λ³.
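Plugging in numbers (a sketch for helium at room temperature; the atomic mass is the only physical input):

```python
import math

h, kB = 6.626e-34, 1.381e-23    # Planck and Boltzmann constants, SI
m_He = 6.646e-27                # kg, mass of a helium-4 atom
T = 300.0
# lambda = 2*pi*hbar / sqrt(2*pi*m*kB*T) = h / sqrt(2*pi*m*kB*T)
lam = h / math.sqrt(2 * math.pi * m_He * kB * T)
print(lam)   # ~5e-11 m, far below the interparticle spacing: classical regime
```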


Classical ideal gas
One particle of mass m has the partition function

Z₁ = V/λ³ = V ( m k_B T / 2πħ² )^{3/2}

What if we have N such particles, all independent of each other?

Z_N = (1/N!) Z₁^N

If all particles are exactly the same (indistinguishable), we could not
tell the difference between many configurations – hence the N!.
We now know everything about the ideal gas.

Route 1: mean energy
U = − ∂/∂β ln Z_N = − ∂/∂β [ N ln Z₁ − ln(N!) ]

Using the factorizing property of logarithms (only the T-dependent
factor of Z₁ survives the derivative):

U = − N ∂/∂β ln( 1/β^{3/2} ) = (3/2) N · 1/β

U = (3/2) N k_B T    – this is actually correct!


Classical ideal gas
The ideal gas of particles of mass m has

Z_N = Z₁^N / N!    where    Z₁ = V/λ³ = V ( m k_B T / 2πħ² )^{3/2}

Route 2: free energy
F = − k_B T ln Z_N = − k_B T [ N ln Z₁ − N ln N + N ]

Using the factorizing property of logarithms:

F = N k_B T [ ln( Nλ³/V ) − 1 ] = N k_B T ln( Nλ³/(V e) )

Let's find the pressure:
p = − (∂F/∂V)_T = N k_B T / V

Note: we could not find the ideal gas law from the mean energy U, which
was in the “wrong” variables (T,V)!

Entropy is a bit of work:
S = − (∂F/∂T)_V = − N k_B ln( Nλ³/(V e) ) + (3/2) N k_B


Grand partition function
In the same way, for a grand canonical ensemble, the probability for a
given system to have energy E_i and particle number N_i (in contact
with a reservoir that maintains T, µ) is:

P(i) = (1/Ξ) exp( − ( E_i − µN_i )/k_B T )

with the grand partition function

Ξ = Σ_{N_i=0}^{TOTAL} Σ_{all states i} exp( − ( E_i − µN_i )/k_B T )

There is a corresponding thermodynamic potential that is minimised in
equilibrium. It is called the grand potential:

Φ = − k_B T ln Ξ    or    Ξ = e^( − Φ/k_B T )

Note that its natural variables are (T,V,µ) and it is obtained from
F(T,V,N) by the Legendre transformation:  Φ = F − µN


Grand potential
In the same way as we analysed the canonical partition function Z, by
identifying the free energy of a microstate, E_i − TS_i, let us
re-order the grand-canonical summation and rearrange the exponent:

Ξ = Σ_{N=0}^{TOTAL} e^( µN/k_B T ) Σ_{all states i} e^( − E_i/k_B T )
  = Σ_{N_i} Σ_{E_i} e^( − ( E_i − k_B T ln Ω_i − µN_i )/k_B T )
  = Σ_{E_i} e^( − Φ_i/k_B T )

The first observation is that one can have a “grand partition function”
and a grand potential for a given microstate, if it can exchange
particles with other microstates.
Secondly, as in the canonical ensemble, the minimum of Φ_i is the most
probable microstate, and at sufficiently low T it can be treated as the
average, i.e. the thermodynamic Φ(T,V,µ) = F − µN.

Probability:  P(E_i, N_i) = (1/Ξ) exp( − ( E_i − k_B T ln Ω_i − µN_i )/k_B T )


Grand potential

You will have noticed:   Ξ = Σ_{N=0}^{TOTAL} Z(N) e^{µN/k_B T}

For instance, for a classical ideal gas we know what Z(N) is, so

Ξ = Σ_{N=0}^{TOTAL} [ Z1(T,V)^N / N! ] e^{µN/k_B T} = Σ_{N=0}^{TOTAL} ( Z1 e^{µ/k_B T} )^N / N! = exp( Z1 e^{µ/k_B T} )

So if this ideal gas is in contact, and can exchange particles,
with a (big) reservoir which maintains a chemical potential µ:

Φ = −k_B T ln Ξ = −k_B T (V/λ³) e^{µ/k_B T}

We have the pressure:   p = −(∂Φ/∂V)_{T,µ} = (k_B T/λ³) e^{βµ}

Mean number of particles:
⟨N⟩ = −(∂Φ/∂µ)_{T,V} = (V/λ³) e^{µ/k_B T} ,   so   Φ = −k_B T ⟨N⟩ ;   Ξ = e^{⟨N⟩}

Probability   P(N) = (1/Ξ) · ( e^{βµ} Z1 )^N / N! = ⟨N⟩^N e^{−⟨N⟩} / N!
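The last line says P(N) is a Poisson distribution with mean ⟨N⟩. A quick numerical check of its normalisation and mean (the value ⟨N⟩ = 4 is an arbitrary assumption):

```python
import math

# Sketch: the grand-canonical P(N) of the classical ideal gas is Poisson
# with mean <N> = Z1 * exp(mu / k_B T); check sum P(N) = 1 and <N>.
Nbar = 4.0   # an assumed value of Z1 * e^(beta mu)

P = [Nbar**n * math.exp(-Nbar) / math.factorial(n) for n in range(100)]
print(sum(P), sum(n * q for n, q in enumerate(P)))   # ≈ 1.0 and ≈ 4.0
```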

Summary so far……..

Partition function of the classical ideal gas:

Z1 = V/λ³ = V (m k_B T / 2πħ²)^(3/2) ;   Z_N = Z1^N / N! ;   V = L³

Systems open to particle exchange: grand partition function

Ξ = Σ_{Ni=0}^{TOTAL} Σ_{all states {i}} exp[ −(Ei − µNi)/k_B T ]

Classical ideal gas:   Ξ = exp( Z1 e^{µ/k_B T} )

Φ = −k_B T ln Ξ = (F − µN) = −k_B T (V/λ³) e^{µ/k_B T} = −k_B T ⟨N⟩


p-T and p-T-µ ensembles

By analogy with constructing the grand partition function in an
ensemble where we had particle exchange under controlled µ, we
should build a corresponding statistical sum for a system held at
temperature T and pressure p0 by a reservoir (of volume V0 − V):

Ψ = Σ_{V=0}^{∞} e^{−p0V/k_B T} Z(T,V,N)
  = Σ_{microstates Ei} ∫ dVi e^{−(Ei − k_B T ln Ωi + p0Vi)/k_B T}
  = Σ_{Ei} e^{−Gi(T,p0,N)/k_B T}

The corresponding thermodynamic potential is G(T,p,N) = F + pV.

If now in addition we also open the system to exchange particles
(at µ imposed by the reservoir, which also holds N0 − N particles),
then the corresponding thermodynamic potential would be
Y(T,p,µ) = G − µN = F + pV − µN = Φ + pV

In fact, in such an ensemble there is no proper (extensive)
thermodynamic potential of the system: all the natural variables
(T,p,µ) are intensive, so by extensivity Y ≡ 0, that is, dY = 0.


Chemical potential of ideal gas

There are many different ways to obtain this, but the properly
systematic one is to obtain the canonical Z(N), then its F(T,N), and then µ:

F = N k_B T ln( Nλ³/Ve ) ;   µ = (∂F/∂N)_{T,V} = k_B T [ ln(Nλ³/Ve) + 1 ] = k_B T ln( Nλ³/V )

Some additional factors may contribute to the single-particle Z1. For
instance, a constant (adsorption) potential φ, or an internal (e.g. vibrational)
degree of freedom: Z_vib = k_B T/ħω

F = N k_B T ln[ Nλ³ / (Ve · e^{−βφ} Z_vib) ] ;   µ = k_B T ln[ Nλ³ e^{βφ} / (V Z_vib) ]

Alternatively, from   ⟨N⟩ = −(∂Φ/∂µ)_{T,V} = (V/λ³) e^{µ/k_B T}

Extras… Let us examine what this factor implies.


Mixtures

Let us consider the p-T controlled ensemble (natural on the
laboratory benchtop) where the gas has several species, that
is, N = Σi Ni. The partial (osmotic) pressure law gives

p = Σi pi = Σi Ni k_B T / V

and similarly for the entropy:

S = Σi Si = Σi Ni k_B ln( const · T^(5/2)/pi ) = (5/2) N k_B ln(T/T0) − Σi Ni k_B ln(pi/p0)

So the change in entropy on mixing is:   ∆S = −k_B Σi Ni ln(ci)

In the same way, using the general form for the chemical
potential of the ideal gas, we have for each species:

µi = k_B T ln( Ni λi³/V ) = k_B T ln( Nλi³/V ) + k_B T ln(ci) = µi(p,T) + k_B T ln(ci)

(the pure gas of i-species at the current p,T, plus the addition due
to the current concentration ci = Ni/N)
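The mixing-entropy formula can be sanity-checked on an equimolar binary mixture, where ∆S = −k_B Σi Ni ln(ci) reduces to N k_B ln 2 (units k_B = 1; the particle numbers below are arbitrary assumptions):

```python
import math

# Sketch: Delta S = -k_B * sum_i N_i ln(c_i), with c_i = N_i / N and k_B = 1.
# For an equimolar binary mixture this is N ln 2.
N_i = [500, 500]                       # assumed particle numbers of two species
N = sum(N_i)
dS = -sum(n * math.log(n / N) for n in N_i)
print(dS, N * math.log(2))             # both ≈ 693.1
```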


Chemical reactions

Consider a generic chemical reaction, say, A + 2B = 2C.
Remaining in the p-T ensemble, we must work with the Gibbs
potential G(p,T,N) = Σi Gi. In equilibrium dG = 0, so we obtain

dG = −S dT + V dp + Σi µi dNi = µA dNA + µB dNB + µC dNC = 0

but   dNA = (1/2) dNB = −(1/2) dNC   ⇒   µA + 2µB − 2µC = 0

For an arbitrary reaction in equilibrium:

Σi νi µi = 0 = Σi νi µi(p,T) + k_B T ln ∏i (ci)^{νi}

Define the chemical equilibrium constant:   Kc(p,T) = ∏i (ci)^{νi}
⇒ for our example   Kc = cA cB² / cC²

ln Kc = −(1/k_B T) Σi νi µi(p,T)


Chemical reactions

Consider a fully general chemical reaction, Σi νi Ai = 0.
Remaining in the p-T ensemble, with dG = Σi µi dNi = 0.
Let us recall what we know about µi:

µi = k_B T ln( Ni / Z1(i) ) ,   with   Z1(i) = (V/λi³) e^{−βφi}
including any constant potential φi

hence   Σi νi µi = k_B T Σi νi ( ln Ni − ln Z1(i) ) = 0

Define an alternative chemical equilibrium constant:   K_N(p,T) = ∏i (Ni)^{νi}

Then its value is given by:   K_N = ∏i ( Z1(i) )^{νi}

For instance, for A + 2B = 2C we might get

K_N = (V/λA³)(V/λB³)² / (V/λC³)² >> 1 ,   but with the potential gain −φ of
the bound product:   K_N = [ (V/λA³)(V/λB³)² / (V/λC³)² ] e^{−βφ} << 1
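The law of mass action K_N = ∏i Z1(i)^{νi} can be sketched numerically for A + 2B = 2C. All numbers below (volume, temperature, masses, bonding energy φ) are arbitrary assumptions in natural units ħ = k_B = 1:

```python
import math

# Sketch of K_N = prod_i Z1(i)^nu_i for A + 2B = 2C (nu_A=1, nu_B=2, nu_C=-2).
# A bound product with energy -phi carries a factor e^(phi/k_B T) in its Z1,
# which suppresses K_N by e^(-2 phi / k_B T).
V, T, phi = 1000.0, 1.0, 5.0     # assumed volume, temperature, bonding energy

def Z1(mass, bound=False):
    lam3 = (2 * math.pi / (mass * T)) ** 1.5       # thermal wavelength cubed
    z = V / lam3
    return z * math.exp(phi / T) if bound else z   # bound species gains energy -phi

K_free = (Z1(1.0) * Z1(2.0) ** 2) / Z1(3.0) ** 2
K_N = (Z1(1.0) * Z1(2.0) ** 2) / Z1(3.0, bound=True) ** 2
print(K_free, K_N)   # K_N = K_free * exp(-2 phi / k_B T)
```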

Summary so far……..

A generic chemical reaction, Σi νi Ai = 0 (e.g. 2H + O = H2O)

A "chemical" (empirical) version of the chemical equilibrium constant:
Kc(p,T) = ∏i (ci)^{νi}

A "statistical" version of the chemical equilibrium constant:
K_N(p,T) = ∏i (Ni)^{νi} = ∏i ( Z1(i) )^{νi}

This is how you calculate it: for instance, for 2H + O = H2O, we get

K_N = (V/λO³)(V/λH³)² / (V/λH2O³) >> 1 ,   but with the bonding potential
energy −φ:   K_N = [ (V/λO³)(V/λH³)² / (V/λH2O³) ] e^{−φ/k_B T} << 1


Classical vs. Quantum

The ideal gas of particles of mass m has

Z_N = Z1^N / N!   or   F = N k_B T ln( Nλ³/Ve ) ,   V = L³

Note how this factor turns up under the logarithm!
What is the meaning of   Nλ³/V = (N/V) (2πħ²/m k_B T)^(3/2) ?

Nλ³/V << 1 : classical physics, particles are localized and interact via forces
Nλ³/V >> 1 : quantum physics, particles interact and behave as waves
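To get a feel for the two regimes, one can evaluate Nλ³/V for a dilute molecular gas and for conduction electrons in a metal; a sketch in SI units (the copper-like electron density is an assumed round number):

```python
import math

# Sketch: the degeneracy parameter n * lambda^3 for (a) N2 gas at 300 K and
# 1 atm, and (b) conduction electrons at a copper-like density.
hbar, kB = 1.0546e-34, 1.381e-23

def n_lambda3(n, m, T):
    lam = math.sqrt(2 * math.pi * hbar**2 / (m * kB * T))   # thermal wavelength
    return n * lam**3

gas = n_lambda3(101325 / (kB * 300), 28 * 1.66e-27, 300)    # ideal-gas n = p / k_B T
electrons = n_lambda3(8.5e28, 9.11e-31, 300)                # assumed n ~ 8.5e28 m^-3
print(gas, electrons)   # gas << 1 (classical); electrons >> 1 (quantum)
```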


Classical vs. Quantum

We established an important operation, of changing the order
of summation (in different ensembles), to end up summing over
the microstates Ei. In the grand canonical ensemble:

Ξ = Σ_{N=0}^{TOTAL} e^{µN/k_B T} Z(N)
  = Σ_{microstates Ei} Σ_{Ni} e^{−(Ei − k_B T ln Ωi − µNi)/k_B T}
  = Σ_{Ei} e^{−Φi(T,V,µ)/k_B T}

In this way we identified the grand partition function of a
given microstate, and the corresponding potential:

Ξk = Σ_{n=0}^{TOTAL} e^{−[εk(n) − µn]/k_B T} ;   Φk = −k_B T ln Ξk

In the quantum regime, when the separate particles
cannot be properly distinguished and the statistical sum
over microstates could be very difficult, we have an
interesting way forward, if it happens that: εk(n) = n·εk


Classical vs. Quantum

When the energy of a microstate Ek factorises with the number
of particles in this state, Ek(n) = nk·εk (and remember, the
entropy is always extensive too), then

Ξ = Σ_{n1,n2,n3,…} e^{−(ε1−µ)n1/k_B T} e^{−(ε2−µ)n2/k_B T} e^{−(ε3−µ)n3/k_B T} …
  = ∏k [ Σ_{nk=0}^{TOTAL} e^{−(εk−µ)nk/k_B T} ] = ∏k Ξk

The corresponding full grand potential is just the sum over
each energy state:

Φ(T,µ) = −k_B T Σk ln Ξk = Σk Φk

However, to find any average (e.g. the mean energy U), we
need to use the probability:

U(N) = Σ_{microstates {k}} εk P(εk, N)


Fermi statistics

The Pauli exclusion principle prohibits more than one Fermi
particle from occupying a given energy level εk:

Ξk = Σ_{n=0,1} ( e^{−[εk−µ]/k_B T} )^n = 1 + e^{−(εk−µ)/k_B T} ;   Φk = −k_B T ln[ 1 + e^{−(εk−µ)/k_B T} ]

n(εk) = −dΦk/dµ = e^{−(εk−µ)/k_B T} / [ 1 + e^{−(εk−µ)/k_B T} ] = 1 / ( e^{β[εk−µ]} + 1 )

This is the famous expression for the Fermi occupation number:
it tells how many particles you can find having the energy E, at
temperature T.

[Figure: at T = 0, n(E) is a step function, equal to 1 below εF and 0 above.]
This picture makes it clear that µ at low temperature is the
"Fermi energy" – the highest level particles have to pack to.


Fermi energy

The Fermi energy is the chemical potential of Fermi particles at
very low temperatures, when the distribution is sharp:

n(E) = 1 / ( e^{(E−εF)/k_B T} + 1 )   [as T → 0: a step, 1 below εF and 0 above]

If we have a total of N particles, then the sum of all n(εk) has to equal N:

N = Σ_{microstates {εk}} n(εk) = ∫ [ d³x d³p/(2πħ)³ ] · n(E)

This is actually very easy, if you recall that E = p²/2m:

N = V ∫_0^∞ [ 4πp² dp/(2πħ)³ ] · n(E) = V ∫_0^∞ [ m^(3/2)/√2 π²ħ³ ] √E dE · n(E)

N/V = [ m^(3/2)/√2 π²ħ³ ] ∫_0^{εF} √E dE = √2 (m εF)^(3/2) / 3π²ħ³

⇒   εF = (ħ²/2m) (6π² N/V)^(2/3) ≈ 7.6 (ħ²/m) (N/V)^(2/3)

Note how the Fermi energy depends on the particle density (or pressure),
i.e. it is increasingly hard to add more particles to the system.
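A numerical sketch for the conduction electrons of a metal (the copper-like density is an assumed round number). Note the derivation above counts one state per momentum; for electrons the two spin states turn (6π²n)^(2/3) into the standard (3π²n)^(2/3):

```python
import math

# Sketch: Fermi energy from particle density, with and without the spin
# degeneracy factor of 2 for electrons.
hbar, m_e, eV = 1.0546e-34, 9.109e-31, 1.602e-19
n = 8.5e28       # assumed conduction-electron density of copper, m^-3

eF_spinless = hbar**2 / (2 * m_e) * (6 * math.pi**2 * n) ** (2 / 3)
eF_electrons = hbar**2 / (2 * m_e) * (3 * math.pi**2 * n) ** (2 / 3)
print(eF_electrons / eV)   # ≈ 7 eV, the textbook value for copper
```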


Bose statistics

If particles do not have half-integer spin, they are not subject
to the Pauli exclusion principle, and can all occupy the same
energy level. These are called Bose particles.

Ξk = Σ_{n=0}^{∞} ( e^{−[εk−µ]/k_B T} )^n = 1 / [ 1 − e^{−(εk−µ)/k_B T} ] ;   Φk = k_B T ln[ 1 − e^{−(εk−µ)/k_B T} ]

As before, we evaluate the mean number of particles with energy E:

n(εk) = −dΦk/dµ = e^{−(εk−µ)/k_B T} / [ 1 − e^{−(εk−µ)/k_B T} ] = 1 / ( e^{β[εk−µ]} − 1 )

This is the famous expression for the Bose occupation number:
note that n(E) can easily be > 1. But we really need to know
what the value of the chemical potential µ is in this case!


Bose condensate

Fermi particles, due to the exclusion principle, must occupy
increasingly high levels of energy. Bose particles do not! So,
at very low temperatures they can all sit at the level E = 0,
which is called the Bose condensate.

[Figure: occupation n(E) for fermions, filled up to µ = εF, next to that for
bosons with NC particles at E = 0 and µ = ?] The picture for Bose particles
already suggests the answer for µ = ?

A more sophisticated argument says: we have some number of particles
(NC) in the E=0 condensate, and some number (N−NC) excited. If the two
subsystems are in equilibrium, then their chemical potentials are equal
(particles can exchange). How to find the optimal number NC? It is
achieved when the corresponding free energy is minimized, that is, when
dFC/dNC = 0 … which conveniently happens to be the definition of µC = 0.

Conclusion: as soon as the Bose condensate appears, µ ≈ 0.


Bose condensate

The total (fixed) number of particles is determined
by the familiar constraint:

N = Σ_{microstates {εk}} n(εk) = ∫ [ d³x d³p/(2πħ)³ ] · n(E)

In 3D, we have for bosons:

N = (V/4π²) (2m/ħ²)^(3/2) ∫_0^∞ √E dE / ( e^{β[E−µ]} − 1 )

The area under each curve should be equal to N, but at T → 0 the
area can't be preserved by the excited states alone...

All particles in the ground state:
n(E) = N for E = 0 ,   hence   1 / ( e^{−βµ} − 1 ) = N

e^{−βµ} = 1 + 1/N ,   so :   −βµ ≈ 1/N ,   µ = −k_B T/N → 0

Summary so far……..

Quantum vs. classical:   Nλ³/V = (N/V) (2πħ²/m k_B T)^(3/2)   << 1 classical, >> 1 quantum

Fermi particles:
n(E) = 1 / ( e^{(E−εF)/k_B T} + 1 )   [a step at εF as T → 0]
εF ≈ ( 3^(2/3) π^(4/3) / 2^(1/3) ) · (ħ²/m) (N/V)^(2/3)

Bose particles:
n(E) = 1 / ( e^{(E−µ)/k_B T} − 1 )
µ ≈ 0 , in fact µ ≈ −k_B T/N ; the condensate appears when
(N/V) (2πħ²/m k_B Tc)^(3/2) ≈ 1


Chemical potential, again

In the classical regime, the ideal gas of N particles has
µ = k_B T ln( Nλ³/V )   at small Nλ³/V

At large Nλ³/V we are in the quantum regime,
so Fermi and Bose systems are different:

[Figure: µ against N/V; the classical branch splits at Nλ³/V ≈ 1 into
µF rising above and µB pinned at 0.]

µF = εF ≈ 7.6 (ħ²/m) (N/V)^(2/3)

µB ≈ 0 , due to the Bose condensation in the state with E = 0. This
defines the condensation temperature:

k_B TC = (2πħ²/m) (N/V)^(2/3)

Can you estimate the number of particles in
the condensate, NC, at any given T < TC?
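The condensation-temperature criterion can be evaluated for liquid helium-4; a sketch with an assumed density of 145 kg/m³. Note the crude estimate here omits the ζ(3/2) ≈ 2.612 factor of the full Bose-integral calculation, which lowers the result to ≈ 3.1 K, close to the observed 2.17 K lambda point:

```python
import math

# Sketch: k_B T_C = (2 pi hbar^2 / m) (N/V)^(2/3) for liquid 4He.
hbar, kB = 1.0546e-34, 1.381e-23
m = 6.646e-27                 # mass of a 4He atom, kg
n = 145.0 / m                 # number density from the assumed 145 kg/m^3

Tc = 2 * math.pi * hbar**2 / (m * kB) * n ** (2 / 3)
print(Tc)   # ≈ 6 K by this crude criterion
```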


Ideal Fermi gas at low-T

By doing the "grand partition function" trick (for additive ε(n) = nεk)
we got the probability to occupy a level εk:   n(εk) = 1 / ( e^{β[εk−µ]} + 1 )

This is essentially the probability P(ε,N), to be used in place of our
earlier forms of statistical probability, which was normalised by the
corresponding partition function. Now we can find the averages, e.g. in 3D:

U = Σ_{microstates {k}} εk n(εk) = ∫ [ d³x d³p/(2πħ)³ ] · E / ( e^{β[E−εF]} + 1 )

T = 0:
U = (V/4π²) (2m/ħ²)^(3/2) ∫_0^∞ E^(3/2) dE / ( e^{β[E−µ]} + 1 )
  = (V/4π²) (2m/ħ²)^(3/2) ∫_0^{εF} E^(3/2) dE = (V/4π²) (2m/ħ²)^(3/2) (2/5) εF^(5/2)

εF = (ħ²/2m) (6π² N/V)^(2/3) ,   so   U/V ≈ 4.6 (ħ²/m) (N/V)^(5/3)


Ideal Fermi gas at low-T

At low, but nonzero, temperature the problem is much more complicated:

U = (V/4π²) (2m/ħ²)^(3/2) ∫_0^∞ E^(3/2) dE / ( e^{β[E−µ]} + 1 )

Low-T series expansion of "Fermi integrals":

∫_0^∞ f(E) dE / ( e^{β[E−µ]} + 1 ) ≈ ∫_0^{εF} f(E) dE + (π²/6) (k_B T)² f'(εF) + … { O( (k_B T)⁴ f'''(εF) ) }

Note that this works the same way for any g(ε), i.e. in any dimension.

For the mean internal energy, in 3 dimensions,

f(E) = (V/4π²) (2mE/ħ²)^(3/2) ,   so
U = U0 + (V/16) (2m/ħ²)^(3/2) εF^(1/2) (k_B T)² ≡ U0 + (π²/4) g(εF) · (k_B T)²

This is U(T,V,N), so:   CV = (∂U/∂T)_{V,N} = (π²/2) g(εF) k_B² T
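The low-T expansion can be checked against direct numerical integration; a sketch for f(E) = E^(3/2) in natural units, with assumed values µ = εF = 1 and k_B T = 0.05:

```python
import math

# Sketch: compare a Fermi integral, done by the midpoint rule, with its
# Sommerfeld expansion up to the (k_B T)^2 term.
def fermi_integral(mu, kT, Emax=30.0, steps=100000):
    h = Emax / steps
    total = 0.0
    for i in range(steps):                 # midpoint rule
        E = (i + 0.5) * h
        total += E**1.5 / (math.exp((E - mu) / kT) + 1)
    return total * h

eF, kT = 1.0, 0.05
exact = fermi_integral(eF, kT)
# integral_0^eF E^(3/2) dE + (pi^2/6)(k_B T)^2 * f'(eF), with f'(E) = 1.5 sqrt(E)
sommerfeld = (2 / 5) * eF**2.5 + (math.pi**2 / 6) * kT**2 * 1.5 * math.sqrt(eF)
print(exact, sommerfeld)    # agree to order (k_B T)^4
```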


Ideal Fermi gas at low-T

But what if we need any other thermodynamic quantity, not just
CV, which is all we can get from U(T,V,N)?

The full grand partition function is   Ξ = ∏_{microstates {k}} Ξk

Therefore the grand potential (the factor multiplying the logarithm
below is the density of states g(E)):

Φ = Σk Φk = ∫ [ d³x d³p/(2πħ)³ ] Φ(εk)
  = −k_B T ∫_0^∞ ln( 1 + e^{−β[E−µ]} ) (V/4π²) (2m/ħ²)^(3/2) E^(1/2) dE

Integration by parts →
Φ = −(2/3) · (V/4π²) (2m/ħ²)^(3/2) ∫_0^∞ E^(3/2) dE / ( e^{β[E−µ]} + 1 ) ≡ −(2/3) U

Now we can legitimately differentiate the proper thermodynamic
potential Φ(T,V,µ):

Pressure:   p = −∂Φ/∂V = pF(T=0) + const · (2m/ħ²) (V/N)^(1/3) (k_B T)²

Entropy:   S = −∂Φ/∂T = const · (2m/ħ²)^(3/2) k_B² T


Bose gas at low-T

The principle is exactly the same: we either find the average
energy U(T,N), from which the heat capacity follows – or go for
the full grand potential Φ(T,µ) and the rest of thermodynamics.
The good (or bad) news is that µ = 0. Again, the factor in the
integrals below is the density of states g(E):

U = Σ_{microstates {k}} εk n(εk) = ∫_0^∞ [ E / ( e^{βE} − 1 ) ] (V/4π²) (2m/ħ²)^(3/2) √E dE   in 3D

Non-dimensional substitution (x = βE) →
U = (V/4π²) (2m/ħ²)^(3/2) (k_B T)^(5/2) ∫_0^∞ x^(3/2) dx / ( e^x − 1 ) = const · T^(5/2)

Φ = Σk Φk = k_B T ∫_0^∞ ln( 1 − e^{−β[E−µ]} ) (V/4π²) (2m/ħ²)^(3/2) √E dE

Integrate by parts →
Φ = −(2/3) (V/4π²) (2m/ħ²)^(3/2) ∫_0^∞ E^(3/2) dE / ( e^{βE} − 1 ) ≡ −(2/3) U

Be fluent at……..

Quantum regime:   Nλ³/V = (N/V) (2πħ²/m k_B T)^(3/2) >> 1

Fermi particles at low (non-zero) temperature:
n(E) = 1 / ( e^{(E−εF)/k_B T} + 1 )   How to "do" Fermi integrals…
(mean energy, Φ, pressure)

Bose particles at low temperature:
n(E) = 1 / ( e^{(E−µ)/k_B T} − 1 )   How to "do" Bose integrals (µ = 0)
(mean energy, etc.)


Photons

Photons (E = ħω) are Bose particles of a special type. Since
their mass = 0, their "number of particles" is not fixed, but varies
with temperature. E.g. the E = 0 condensate has no particles at all.

n(E) = 1 / ( e^{ħω/k_B T} − 1 )   Can we find the mean number of photons
of a selected "colour", given by the fixed ω?

N = Σ_{microstates {εk}} n(εk) = ∫ [ d³x d³p/(2πħ)³ ] · n(ħω)

Instead of E = p²/2m there is now a different relation between
energy and momentum:   p = ħk = ħω/c

N = V ∫ [ 4πω² dω/(2π)³c³ ] · 1 / ( e^{ħω/k_B T} − 1 )

The spectral density n(ω) is the number of particles at a given ω:
N = ∫ n(ω) dω


Stefan-Boltzmann law

Photons in thermal equilibrium (emitted by a hot body) have a
very characteristic power spectrum, which originally led
Planck to suggest E = ħω. Now we can understand why!

Let us find the mean energy of photons, using n(E) = 1/( e^{ħω/k_B T} − 1 )
and p = ħk = ħω/c (note ω = 2πc/λ):

U = V ∫ [ 4πω² dω/(2π)³c³ ] · ħω / ( e^{ħω/k_B T} − 1 ) = V ∫ u(ω) dω

The spectral density of energy is:   u(ω) = (1/2π²c³) · ħω³ / ( e^{ħω/k_B T} − 1 )

The full energy emitted (the dimensionless integral is ∫_0^∞ x³ dx/(e^x − 1) = π⁴/15):

U/V = (1/2π²c³ħ³) · (k_B T)⁴ · (π⁴/15)

Stefan-Boltzmann law:   U = σ T⁴
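The dimensionless integral behind the T⁴ law is easy to verify numerically; a sketch using the midpoint rule (the integrand behaves as x² near 0, so there is no singularity):

```python
import math

# Sketch: check that integral_0^inf x^3 / (e^x - 1) dx = pi^4 / 15.
steps, xmax = 100000, 50.0
h = xmax / steps
integral = 0.0
for i in range(steps):
    x = (i + 0.5) * h
    integral += x**3 / math.expm1(x)   # expm1 is accurate for small x
integral *= h
print(integral, math.pi**4 / 15)   # both ≈ 6.4939
```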


Other excitations

All elementary excitations are Bose particles with µ = 0, zero
rest mass (so E = ħω) and a dispersion relation between the
energy and the momentum, ω = ω(k).

n(E) = 1 / ( e^{βħω} − 1 )

N = Σ_{microstates {k}} n(εk) = ∫ [ d³x d³p/(2πħ)³ ] · n(ħω)

[Figure: a dispersion curve ω(k) flattening towards kmax, with a band ∆ω.]

In each case there is a need to find the right form of the
density of states g(E) dE, which is now g(ω) dω:

U = Σk εk n(εk) = V ∫ [ 4πk² dk/(2π)³ ] · ħω / ( e^{ħω/k_B T} − 1 )


Phonons and Debye model

Normal modes of vibrations in a lattice follow the dispersion
relation (lowest energy mode)

ω = 2ω0 sin( ka/2 )

where a is the lattice spacing and ω0 the frequency of each bond.

The Debye model preserves the cutoff, but ignores the slowing of the wave:

3 modes per atom:   3N = Σ_{all states} 1 = (3V/2π²c³) ∫_0^{ωD} ω² dω   ⇒   3N = (V/2π²c³) ωD³

low-T:   U = (3V/2π²c³) ∫_0^∞ ħω³ dω / ( e^{βħω} − 1 ) = (3Vħ/2π²c³) (k_B T/ħ)⁴ ∫_0^∞ y³ dy / ( e^y − 1 )   ⇒   CV ∝ T³

high-T:   U = (3V/2π²c³) ∫_0^{ωD} ħω³ dω / ( e^{βħω} − 1 ) = (3V/2π²c³) ∫_0^{ωD} ħω³ dω / ( βħω + … ) = (V/2π²c³) ωD³ · k_B T


Phonons and Debye model

At high T one can re-express the integral via the 3N constraint,
giving the result:

high-T:   U ≈ 3N k_B T

This is the energy of 3N simple harmonic oscillators (see lecture 5).
There we had:

Z = 1 / [ 2 sinh( βħω0/2 ) ]

U = (ħω0/2) coth( βħω0/2 )

CV = (ħω0)² / (4 k_B T²) · 1 / sinh²( βħω0/2 )

The low-T limit (quantum oscillator) is very different…

[Figure: CV against T.]

So at high T phonons behave exactly the same as classical
oscillators, but at low T the major difference is due to the
continuous phonon spectrum (no gaps!):

"photon gas"  ⇒ CV ∝ T³
"energy gap"  ⇒ CV ∝ e^{−ħω0/k_B T}
