
Ch. 4 The Statistical Interpretation of Entropy (Statistical Thermodynamics)

1. Macrostate vs microstate
2. The most probable distribution
3. Connection to entropy
4. Configurational entropy and thermal entropy
5. Examples of statistical approaches

Gibbs described entropy as a “degree of mixed-up-ness” at the atomic or molecular level. The full atomistic description was completed after statistical mechanics and quantum theory were discovered.

1
1. Macrostate vs microstate

Suppose that there are three identical particles (A, B, C) in the system and each can have one of the energy values 0, u, 2u, 3u. For simplicity, the energy spacing u is identical. If the total internal energy is fixed to U = 3u, the system can have three different distributions, or macrostates:

(a) all three particles at u;  (b) one particle at 3u and two at 0;  (c) one particle each at 0, u, and 2u.

Counting the ways the labeled particles A, B, C can be assigned, the corresponding numbers of detailed microstates are 1, 3, and 6, so the probabilities of the macrostates are

Probability: 1/10, 3/10, 6/10 (most probable)

If the system were observed over a finite interval of time, during which the system rapidly changed from one microstate to another, the fraction of time during which the system is in distribution (c) would be 6/10.
The postulate of equal a priori probability: if microstates have the same total energy, volume, and number of particles, then they occur with equal frequency.
2
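As a quick check of these counts, here is a minimal Python sketch that enumerates every assignment of the levels 0, u, 2u, 3u to the three labeled particles with total energy 3u and groups the results by macrostate.

from itertools import product
from collections import Counter

# Energy levels 0, u, 2u, 3u in units of u; three labeled particles A, B, C.
levels = range(4)
total = 3  # total internal energy U = 3u

macrostates = Counter()
for assignment in product(levels, repeat=3):         # one level per particle
    if sum(assignment) == total:
        macrostates[tuple(sorted(assignment))] += 1  # macrostate ignores labels

n_micro = sum(macrostates.values())
for occ, count in sorted(macrostates.items()):
    print(f"macrostate {occ}: {count} microstates, probability {count}/{n_micro}")
# Output: (0, 0, 3): 3/10, (0, 1, 2): 6/10, (1, 1, 1): 1/10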
2. The most probable distribution

Let’s generalize the four-level system of the previous section and assume that there are n particles, each of which can occupy an energy state ε_i (i = 0, 1, …, r). The levels ε_i are not necessarily equally spaced. The total energy is U.

A macrostate is specified by the occupation numbers {n_0, n_1, …, n_r}. The number of microstates corresponding to a macrostate is

Ω = n! / (n_0! n_1! ⋯ n_r!)

where the occupation numbers satisfy Σ_i n_i = n and Σ_i n_i ε_i = U   (1)

For instance, in the previous example, Ω = 1 for (a), Ω = 3 for (b), and Ω = 6 for (c).

Let’s consider a thermodynamic system with a very large number of particles, n ≫ 1. Using Stirling’s formula ln x! ≈ x ln x − x for large x,

ln Ω = n ln n − n − Σ_{i=0}^{r} (n_i ln n_i − n_i) = n ln n − Σ_{i=0}^{r} n_i ln n_i

Which set {n_i} maximizes Ω (or ln Ω) under the constraints (1)?
3
Lagrange multiplier method

Finding the minimum or maximum of a function under a constraint.

ex) Find the extrema of f(x, y) = x + y subject to g(x, y) = x² + y² − 1 = 0.

Lagrangian function: ℒ(x, y, λ) = f(x, y) + λ g(x, y) = x + y + λ(x² + y² − 1)   (λ: Lagrange multiplier)

∂ℒ/∂x = 1 + 2λx = 0,   ∂ℒ/∂y = 1 + 2λy = 0   →   x = y = −1/(2λ)

x² + y² = 1 → λ = ∓1/√2 → (x, y) = ±(1/√2, 1/√2), the maximum and minimum of f on the circle.
4
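A minimal numerical cross-check of this example: scan the unit circle, parametrized as (cos θ, sin θ), for the extrema of f = x + y.

import math

# Scan the constraint set x^2 + y^2 = 1 parametrized by the angle theta
# and locate the extrema of f(x, y) = x + y.
thetas = [2 * math.pi * k / 100000 for k in range(100000)]
points = [(math.cos(t), math.sin(t)) for t in thetas]
fmax = max(points, key=lambda p: p[0] + p[1])
fmin = min(points, key=lambda p: p[0] + p[1])

print("max near", fmax, "expected", ( 1/math.sqrt(2),  1/math.sqrt(2)))
print("min near", fmin, "expected", (-1/math.sqrt(2), -1/math.sqrt(2)))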
To maximize ln Ω = n ln n − Σ_i n_i ln n_i subject to the constraints (1), introduce Lagrange multipliers α and β (n ln n is treated as a constant since n is fixed):

ℒ = ln Ω − α(Σ_i n_i − n) − β(Σ_i n_i ε_i − U)

∂ℒ/∂n_i = −ln n_i − 1 − α − βε_i = 0   (for all i)

→ n_i = e^(−α′) e^(−βε_i)   (α′ = α + 1)

Σ_i n_i = e^(−α′) Σ_i e^(−βε_i) = e^(−α′) Z = n  →  e^(−α′) = n/Z

Z = Σ_i e^(−βε_i) : (atomic) partition function

n_i = n e^(−βε_i)/Z   →   n_i/n = e^(−βε_i)/Z

β is determined by the energy constraint. At higher temperatures, higher energy levels are more populated.

Statistical mechanics and thermodynamics are connected by setting β = 1/(kT). The probability that a particle occupies an energy state of ε_i is

n_i/n = (1/Z) exp(−ε_i/kT) ∝ exp(−ε_i/kT) : Boltzmann distribution

For macroscopic systems, the Boltzmann distribution dominates over other types of energy distribution in terms of the number of microstates.
5
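A minimal Python sketch of the Boltzmann distribution, assuming four equally spaced example levels (the spacing u = 10⁻²⁰ J and the temperatures are arbitrary choices): it evaluates n_i/n = e^(−ε_i/kT)/Z and shows that higher levels become more populated as T increases.

import math

k_B = 1.380649e-23  # J/K

def boltzmann_populations(levels_J, T):
    """Return the fractional occupations n_i/n = exp(-eps_i/kT) / Z."""
    weights = [math.exp(-eps / (k_B * T)) for eps in levels_J]
    Z = sum(weights)            # (atomic) partition function
    return [w / Z for w in weights]

u = 1.0e-20                     # level spacing in J (arbitrary example value)
levels = [0.0, u, 2 * u, 3 * u]
for T in (300.0, 1000.0, 3000.0):
    pops = boltzmann_populations(levels, T)
    print(f"T = {T:6.0f} K:", ["%.3f" % p for p in pops])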
U = Σ_i ε_i n_i = (n/Z) Σ_i ε_i e^(−βε_i) = −(n/Z) ∂Z/∂β = −n ∂(ln Z)/∂β

U/n = ⟨U⟩ = −∂(ln Z)/∂β : mean energy per particle

ln Ω_max = n ln n − Σ_i n_i ln n_i   (with n_i = n e^(−βε_i)/Z → ln n_i = ln n − βε_i − ln Z)

= n ln n − Σ_i n_i (ln n − βε_i − ln Z) = β Σ_i n_i ε_i + n ln Z = βU + n ln Z
6
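A short numerical check of U/n = −∂(ln Z)/∂β, assuming an arbitrary four-level spectrum: the central finite difference of ln Z in β reproduces the directly summed mean energy.

import math

def ln_Z(levels, beta):
    return math.log(sum(math.exp(-beta * e) for e in levels))

def mean_energy_direct(levels, beta):
    Z = sum(math.exp(-beta * e) for e in levels)
    return sum(e * math.exp(-beta * e) for e in levels) / Z

levels = [0.0, 1.0, 2.5, 4.0]   # arbitrary energies (arbitrary units)
beta = 0.7                      # 1/kT in matching units
h = 1e-6                        # step for the central difference

derivative = -(ln_Z(levels, beta + h) - ln_Z(levels, beta - h)) / (2 * h)
print("-(d lnZ / d beta)   =", derivative)
print("sum_i eps_i n_i / n =", mean_energy_direct(levels, beta))
# The two numbers agree: U/n = <U> = -d(ln Z)/d(beta).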
3. Connection to entropy

As n becomes a large number, Ω_tot ≈ Ω_max (written simply as Ω below): the total number of possible microstates is dominated by the most probable distribution (see Appendix).

→ ln Ω = ln Ω_max = βU + n ln Z = U/(kT) + n ln Z   (1)

Suppose that the system is in equilibrium with a heat bath at temperature T. If the system receives heat δq from the bath, δU = δq. (The present model system does not expand or shrink.)
The change in the number of states:

δ ln Ω = δU/(kT) = δq/(kT) = δS/k   →   δS = k δ(ln Ω)

By integration, S = k ln Ω : Boltzmann’s equation.   From (1), S = nk ln Z + U/T
Ex) For the freezing of the supercooled liquid in the previous chapter,

ΔS = 0.137 J/K = k ln(Ω_solid+bath / Ω_liquid+bath),   k = 1.38×10⁻²³ J/K

→ ln(Ω_solid+bath / Ω_liquid+bath) ≈ 10²²   →   Ω_solid+bath ≈ e^(10²²) Ω_liquid+bath
7
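The size of this ratio follows directly from ΔS/k; a one-line check with the values quoted above:

delta_S = 0.137        # J/K, entropy change for freezing of the supercooled liquid
k_B = 1.38e-23         # J/K

ln_ratio = delta_S / k_B   # ln(Omega_solid+bath / Omega_liquid+bath)
print(f"ln(Omega ratio) = {ln_ratio:.2e}")   # ~9.9e21, i.e., about 10^22
# The ratio itself, exp(1e22), is far too large to represent as a float;
# only its logarithm is meaningful numerically.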
4. Configurational entropy and thermal entropy

Energy distribution → Thermal entropy


Spatial distribution → Configurational entropy
Ex) Interface between two solids A and B:
Here atoms of the same kind are not distinguishable from one another (a QM principle). (In the previous example, the particles were spatially separated and therefore distinguishable.)

Consider 4 A atoms and 4 B atoms on 8 sites across the interface. The arrangements range from completely unmixed (all A on one side, all B on the other) to maximally mixed; the total number of distinct configurations is 8!/(4!·4!) = 70, and by the postulate of equal a priori probability each configuration occurs with equal probability.

Suppose instead that there are N_0 (~N_A) sites on each side.

8
If the energy does not depend on the position,

Ω_tot = Ω_th × Ω_conf   →   S = k ln Ω_th + k ln Ω_conf = S_th + S_conf

In a gaseous system where atoms can move freely within a volume, the configurational entropy depends explicitly on the volume. In other words, Ω is in general a function of U and V, Ω(U, V).

Ex) Ideal gas

S/n = R ln(V/n) + (3/2) R ln(U/n) + C

(first term: configurational; second term: thermal)

Note:
- Configurational and thermal entropies are not always separable, e.g., an ideal gas in a gravitational field.
- Depending on the type of energy involved, the thermal entropy is referred to more specifically as translational, rotational, or vibrational entropy, etc. There is also spin entropy.

9
Ex) Pb (Z = 82) crystal (fcc)

10
For an equiatomic mixture with n_Fe = n_X = 0.5 N_A on N_A sites, Ω_conf = N_A! / [((1/2)N_A)! ((1/2)N_A)!], so with Stirling’s approximation

S_conf = k_B [ N_A ln N_A − (1/2)N_A ln((1/2)N_A) − (1/2)N_A ln((1/2)N_A) ] = k_B N_A ln 2 = R ln 2

11
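A quick numerical check of the R ln 2 result, using the Stirling form of ln Ω_conf:

import math

N_A = 6.022e23       # 1/mol
k_B = 1.380649e-23   # J/K
R = k_B * N_A        # ~8.31 J/(mol K)

# Stirling form of S_conf = k ln[N_A!/((N_A/2)!)^2] for the 50:50 mixture
S_conf = k_B * (N_A * math.log(N_A) - 2 * (N_A / 2) * math.log(N_A / 2))
print(f"S_conf = {S_conf:.3f} J/K per mole of sites")
print(f"R ln 2 = {R * math.log(2):.3f} J/K")   # the two agree: ~5.76 J/K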
QM principle (quantized vibrational energy): a diatomic molecule vibrates like an effective spring (harmonic oscillator) with energy levels

ε_i = (i + 1/2) hν   (i = 0, 1, 2, …)

h: Planck constant = 6.6252×10⁻³⁴ J·s
ν: vibration frequency (7×10¹³ s⁻¹ for N₂)

i = 0 is the ground state; i = 1, 2, … are the excited states.

(Figure: potential energy vs N–N distance for the harmonic-oscillator model; equilibrium N–N distance 1.1 Å, bond energy 9.8 eV ≈ 945 kJ/mol.)
12
Because the levels are equally spaced, the ratio of the populations of successive levels is the same for every pair:

n_(i+1)/n_i = n_1/n_0 = exp(−hν/(k_B T))

In general,

n_i/n_0 = exp(−i hν/(k_B T)) = exp(−i × 3360 K / T)   (hν/k_B ≈ 3360 K for N₂)

At room temperature (T = 300 K), n_1/n_0 = exp(−3360/300) ≈ 1.4×10⁻⁵, i.e., essentially no vibrational excitation.
13
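A short check of the 3360 K figure and the population ratios, using the constants quoted above:

import math

h = 6.6252e-34      # J s (Planck constant, value as quoted on the slide)
nu = 7.0e13         # 1/s, N2 vibration frequency
k_B = 1.380649e-23  # J/K

theta_vib = h * nu / k_B                 # characteristic vibrational temperature
print(f"h*nu/k_B = {theta_vib:.0f} K")   # ~3360 K

for T in (300.0, 1000.0, 3000.0):
    ratio = math.exp(-theta_vib / T)     # n_1/n_0
    print(f"T = {T:5.0f} K: n1/n0 = {ratio:.2e}")
# At 300 K the first excited level is essentially unpopulated (~1e-5).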
5. Examples of the statistical approaches

Two approaches in statistical mechanics


i) S = k ln Ω: counting all microstates (microcanonical ensemble). It is easier to count Ω under constant U and V (rather than under constant T and P).

(∂S/∂U)_V = 1/T → U(T, V) → C_V, …

C_P = C_V + (∂V/∂T)_P [ P + (∂U/∂V)_T ]

(∂S/∂V)_U = P/T → V(T, P) → equation of state
ii) Partition function (canonical ensemble): system under constant temperature (T)

Z = Σ_i e^(−βε_i) → Z(T, V)   (β = 1/kT)

U = −n ∂(ln Z)/∂β → U(T, V) → C_V, …

S = nk ln Z + U/T → S(T, V) → S(U, V)

14
Example of approach using entropy: Tension of 1-d chain

There are n (n ≫ 1) elements, each of length a, and the joints can turn freely. Find the entropy of the chain as a function of the end-to-end length x, and obtain the tension as a function of T.

Examples: synthetic polymers such as polypropylene (a plastic) and butadiene rubber; biopolymers such as DNA.

n_+ (n_−): number of chain elements pointing to the right (left)

n_+ + n_− = n,   x = (n_+ − n_−) a

Ω_conf(x) = C(n, n_+) = n! / [n_+! (n − n_+)!] = n! / (n_+! n_−!)

S = k ln Ω_conf ≈ k (n ln n − n_+ ln n_+ − n_− ln n_−)   (We neglect the thermal entropy; see the next page.)

n_+ = (na + x)/(2a),   n_− = (na − x)/(2a)

S = nk [ ln 2 − (1/2)(1 + x/na) ln(1 + x/na) − (1/2)(1 − x/na) ln(1 − x/na) ]
15
(Figure: entropy of the chain as a function of x; S is maximal at x = 0 and decreases as the chain is stretched.)

The tension follows from f = −T(∂S/∂x)_T, since the internal energy of this rigid-rod model does not depend on x:

f = (kT/2a) ln[(1 + x/na)/(1 − x/na)] ≈ (kT/na²) x   for x ≪ na

This is Hooke’s law. One interesting result of this model is that the spring constant in Hooke’s law increases with temperature, which is rather counterintuitive. A stretched chain has lower entropy (a short chain has higher entropy), and high entropy is favored at elevated temperatures. (This will be explained in the next chapter.) This is the simplest model embodying the essential property of rubber elasticity.
See https://www.youtube.com/watch?v=ovVO8NDdon4

Here we ignored the thermal entropy, which is related to the kinetic motion of the beads and defines the temperature of the 1-d chain. Since we do not include springs between the beads and treat the inter-bead connections as rigid rods, the thermal entropy does not depend on the length. Therefore, it can be ignored when we focus on the elastic property.

16
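A minimal numerical sketch of the chain model, assuming example values n = 1000, a = 1 nm, T = 300 K: it evaluates S(x) from the formula above, obtains the tension as −T ∂S/∂x by finite difference, and compares it with the linear (Hooke’s-law) form kTx/(na²).

import math

k_B = 1.380649e-23   # J/K

def chain_entropy(x, n, a):
    """Configurational entropy of the 1-d freely jointed chain."""
    p = x / (n * a)
    return n * k_B * (math.log(2)
                      - 0.5 * (1 + p) * math.log(1 + p)
                      - 0.5 * (1 - p) * math.log(1 - p))

n, a, T = 1000, 1e-9, 300.0          # example values (arbitrary choice)
for frac in (0.01, 0.1, 0.3):        # extension x as a fraction of the full length na
    x = frac * n * a
    dx = 1e-4 * a
    f = -T * (chain_entropy(x + dx, n, a) - chain_entropy(x - dx, n, a)) / (2 * dx)
    f_hooke = k_B * T * x / (n * a**2)   # linear approximation
    print(f"x/(na) = {frac:4.2f}: f = {f:.3e} N, Hooke approx = {f_hooke:.3e} N")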
Example of approach using partition function: Schottky anomaly in heat capacity

Suppose that the system consists of n atoms, each of which can have energy 0 or ε. Calculate the heat capacity C_V.

Z = e^(−β·0) + e^(−β·ε) = 1 + e^(−βε)

U = −n ∂(ln Z)/∂β = n ε e^(−βε)/(1 + e^(−βε)) = nε/(e^(βε) + 1)

C_V = (∂U/∂T)_V = dU/dT = (dU/dβ)(dβ/dT) = −(1/kT²)(dU/dβ)

    = n ε² e^(ε/kT) / [ kT² (e^(ε/kT) + 1)² ]

(Figure: fractional populations of the two levels, at energy 0 and at energy ε, and C_V as functions of T; C_V shows a broad peak, the Schottky anomaly.)
17
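A minimal sketch that evaluates C_V/(nk) from the formula above as a function of ε/kT and locates the Schottky peak:

import math

def schottky_cv_over_nk(x):
    """C_V / (n k) for a two-level system, where x = epsilon / (k T)."""
    e = math.exp(x)
    return x * x * e / (e + 1.0) ** 2

# Scan x = epsilon/kT and locate the maximum of C_V (the Schottky peak).
xs = [0.01 * i for i in range(1, 1001)]          # x from 0.01 to 10
cvs = [schottky_cv_over_nk(x) for x in xs]
i_max = cvs.index(max(cvs))
print(f"C_V/(nk) peaks at {cvs[i_max]:.3f} for kT = {1/xs[i_max]:.3f} * epsilon")
# ~0.44 at kT ~ 0.42 epsilon; C_V -> 0 both for kT << epsilon and kT >> epsilon.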
Origin: hyperfine splitting – energy splitting of the nuclear spin levels by the magnetic fields produced by the f electrons of Pr.
18
Example of approach using partition function with continuous variable: equipartition theorem

cf. Z = Σ_i e^(−βε_i) : (atomic) partition function over discrete states

For an ideal gas of N monatomic atoms, the classical partition function is an integral over the momenta p_i and positions q_i of the Hamiltonian (energy) H:

Z = [1/(h^(3N) N!)] ∫ exp(−βH({p_i, q_i})) dp_1 … dp_N dq_1 … dq_N

(the factor N! accounts for the indistinguishability of particles)

  = [1/(h^(3N) N!)] ( ∫ exp(−β Σ_{i=1}^{N} p_i²/2m) dp_1 … dp_N ) ( ∫ dq_1 … dq_N )

  = [V^N/(h^(3N) N!)] (2πm/β)^(3N/2)

Mean energy: U = −∂(ln Z)/∂β = 3N/(2β) = (3/2)NkT, i.e., (3/2)kT per atom, or (1/2)kT for each of the 3N quadratic momentum degrees of freedom (equipartition theorem).

19
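A numerical sketch, assuming example values (argon-like mass, V = 1 L, N = 1000): it evaluates ln Z from the closed form above and confirms U = −∂(ln Z)/∂β = (3/2)NkT by central finite difference.

import math

k_B = 1.380649e-23   # J/K
h = 6.626e-34        # J s
m = 6.6e-26          # kg (roughly an argon atom; arbitrary choice)
V = 1e-3             # m^3
N = 1000             # number of atoms (only ln Z is needed, so N can stay modest)

def ln_Z(beta):
    # ln of Z = V^N / (h^{3N} N!) * (2*pi*m/beta)^{3N/2}
    return (N * math.log(V) - 3 * N * math.log(h) - math.lgamma(N + 1)
            + 1.5 * N * math.log(2 * math.pi * m / beta))

T = 300.0
beta = 1.0 / (k_B * T)
d = 1e-6 * beta                       # small step in beta
U = -(ln_Z(beta + d) - ln_Z(beta - d)) / (2 * d)
print(f"U from -d(lnZ)/d(beta): {U:.4e} J")
print(f"(3/2) N k T           : {1.5 * N * k_B * T:.4e} J")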
Appendix
* Proof that ln Ω_tot ≈ ln Ω_max for large n

Suppose that the energy levels are equally spaced, i.e., ε_i = iu (i = 0, 1, 2, …). There are n particles occupying the energy levels, and the total internal energy is U.

Let's first calculate Ω_tot.

k_i: the index of the energy state that the i-th particle occupies (i = 1, …, n)
ex) if the 5th particle occupies ε_10 = 10u, then k_5 = 10

U = k_1 u + k_2 u + ⋯ + k_n u   →   k_1 + ⋯ + k_n = U/u ≡ m

Ω_tot = number of ways of choosing n non-negative integers (k_1, k_2, …, k_n) adding up to m = (n + m − 1)! / [m! (n − 1)!]   (the standard stars-and-bars count)

Applying Stirling's approximation (and neglecting the −1 relative to n and m),

ln Ω_tot ≈ (n + m) ln(n + m) − n ln n − m ln m

Next, let's calculate Ω_max.

First, the partition function of the equally spaced levels is the geometric series Z = Σ_{i=0}^{∞} e^(−βiu) = 1/(1 − e^(−βu)).
20
The energy constraint U/n = mu/n = −∂(ln Z)/∂β = u e^(−βu)/(1 − e^(−βu)) fixes β through e^(−βu) = m/(n + m). Then

ln Ω_max = βU + n ln Z = m ln[(n + m)/m] + n ln[(n + m)/n] = (n + m) ln(n + m) − m ln m − n ln n

which coincides with ln Ω_tot above to the accuracy of Stirling's approximation. Hence ln Ω_tot ≈ ln Ω_max for large n.
21
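A small check of the appendix counting, assuming small test values for the brute-force part: the stars-and-bars formula reproduces the direct enumeration, and for large n, m the exact ln Ω_tot agrees with the Stirling form (which equals ln Ω_max) up to corrections of order ln n.

import math
from itertools import product

def omega_tot_exact(n, m):
    """(n+m-1)! / (m! (n-1)!): ways for n non-negative integers to sum to m."""
    return math.comb(n + m - 1, m)

# 1) Brute-force check of the counting formula for small n, m.
n, m = 4, 6
brute = sum(1 for ks in product(range(m + 1), repeat=n) if sum(ks) == m)
print(brute, omega_tot_exact(n, m))        # both 84

# 2) For large n and m, ln(Omega_tot) approaches (n+m)ln(n+m) - n ln n - m ln m,
#    which equals ln(Omega_max) = beta*U + n*ln Z for the equally spaced levels.
n, m = 10_000, 25_000
ln_omega_tot = math.lgamma(n + m) - math.lgamma(m + 1) - math.lgamma(n)
ln_stirling = (n + m) * math.log(n + m) - n * math.log(n) - m * math.log(m)
print(f"ln Omega_tot = {ln_omega_tot:.1f}, Stirling form = {ln_stirling:.1f}")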
