
Lectures on Statistical Mechanics
Lecturer: Prof. Nguyen Quang Bau
Faculty of Physics, College of Natural Science, Hanoi

Chapter 1: Introduction to Statistical Mechanics

1. INTRODUCTION.

1) The objects of Statistical Mechanics are macroscopic systems consisting of
a very large number of particles.

2) Position of Statistical Mechanics:
Statistical Physics is one of the four fundamental subjects of theoretical
physics: theoretical mechanics, electrodynamics, quantum mechanics, and
statistical physics.

3) The Statistical Mechanics program includes two main parts:

 Equilibrium Statistical Physics
 Non-equilibrium Statistical Physics (Classical Statistical Physics and
   Quantum Statistical Physics)

The course studies in depth the basic concepts of Statistical Physics and the
physical effects of Equilibrium Statistical Physics, and touches on the
effects of Non-equilibrium Statistical Physics and Quantum Statistical
Physics.

4) Syllabus of Statistical Mechanics

Chapter 1. Introduction to Statistical Mechanics.
1. Introduction.
2. Microscopic states and macroscopic states.
3. Description of a macroscopic system by statistical physics methods.
4. Liouville's theorem. The role of energy.
5. Quantum statistics. Density matrix.
6. Thermodynamic equilibrium states.
7. The interaction between macroscopic systems.
8. Statistical weights of macroscopic states.
9. Entropy.
10. Absolute temperature.

Chapter 2. Gibbs distribution.
1. Iso-probability theorem. Microcanonical distribution.
2. The Gibbs distribution.
3. Free energy. Gibbs-Helmholtz equation.
4. Application of the Gibbs distribution to the ideal monatomic gas.
5. Quantum theory of the heat capacity of solids.

Chapter 3. Gibbs generalized distribution. Fermi-Dirac distribution and
Bose-Einstein distribution.
1. Gibbs generalized distribution.
2. Ideal system of identical particles. Fermi-Dirac distribution and
Bose-Einstein distribution.
3. Application of the Fermi-Dirac distribution to the free electron gas in
metals.
4. Application of the Bose-Einstein distribution to Bose-Einstein
condensation.
Chapter 4. Some methods of quantum field theory for many-particle systems in
statistical physics.
1. Creation and annihilation operators for fermions.
2. Hamiltonian of a system of harmonic oscillators.
3. Hamiltonian of a system of electrons. The second quantization.
4. Hamiltonian of the electron-phonon system.
5. Some important dispersion relations of the operator algebra.

Chapter 5. Feynman diagrams.
1. The Schrodinger picture, the Heisenberg picture, and the interaction
picture.
2. Creation and annihilation operators for electrons and holes in the
interaction picture.
3. Feynman diagrams.

Reference Material
1. Nguyen Quang Bau, Bui Bang Doan, Nguyen Van Hung. "Statistical Physics". Hanoi, 1998.
2. Feynman R. "Statistical Mechanics". California, 1972.
3. Kubo R. "Statistical Mechanics". Amsterdam, 1965.
4. Meijer P.H.E. "Quantum Statistical Mechanics". New York - London - Paris, 1966.
5. Landau L.D., Lifshitz E.M. "Statistical Physics". Moscow, 1964.
6. Abrikosov A.A., Gorkov L.P., Dzyaloshinskii I.E. "Methods of Quantum Field Theory in Statistical Physics". Moscow, 1978.
2. Microscopic states and macroscopic states.

1) Microscopic states of a system.
A microscopic state of a system is the collection of data on the specific
states of all the constituent particles at a given time.

a) From the point of view of classical mechanics: a microscopic state of a
system is a set of coordinates and momenta of all the constituent particles:

(q; p) = (q_1, q_2, q_3, ..., q_f ; p_1, p_2, p_3, ..., p_f)    (1)

in which f = 3N is the number of degrees of freedom of the system and N is
the number of particles. The 2f values of the coordinates and momenta are
found from the set of Hamiltonian equations of motion.
The set of Hamiltonian equations of motion (i = 1, 2, 3, ..., f = 3N):

q̇_i = dq_i/dt = ∂H/∂p_i ,    ṗ_i = dp_i/dt = − ∂H/∂q_i    (2)

(a set of 2f equations), where H is the Hamiltonian of the system.
The system (2) requires 2f initial conditions:

q_i(t=0) = q_i(0) ;    p_i(t=0) = p_i(0)    (3)
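
As a small illustration of equations (2)-(3), the following Python sketch
(not part of the original lecture; the harmonic-oscillator Hamiltonian, the
mass m, the frequency w and the initial microstate are illustrative
assumptions) integrates Hamilton's equations for a single degree of freedom.
For a real macroscopic system one would need about 2f ~ 10^24 such equations,
which is why the purely mechanical approach is abandoned below.

# A minimal sketch: integrating Hamilton's equations (2) with initial
# conditions (3) for one 1D harmonic oscillator, H = p^2/(2m) + m*w^2*q^2/2.
m, w = 1.0, 2.0
dt, steps = 1e-3, 5000
q, p = 1.0, 0.0                      # initial condition (3)

def dHdp(q, p): return p / m         # dq/dt =  dH/dp
def dHdq(q, p): return m * w**2 * q  # dp/dt = -dH/dq

for _ in range(steps):               # symplectic Euler step
    p -= dHdq(q, p) * dt
    q += dHdp(q, p) * dt

E = p**2 / (2*m) + m * w**2 * q**2 / 2
print(q, p, E)                       # the energy stays (almost) constant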


b) From the point of view of quantum mechanics: the microstate of the system
is described by the wave function, satisfying the stationary Schrödinger
equation

Ĥ ψ_n = E_n ψ_n    (4)

(Ĥ is the Hamiltonian operator).

The phase space is a 2f-dimensional space with coordinate axes q_1, q_2, ..., q_f,
p_1, p_2, ..., p_f; each point of it is called a phase point. Each phase point
(p, q) corresponds to a definite mechanical state of the system. As time
passes, the state of the system changes, and therefore the phase point moves,
tracing out the phase trajectory.

[Figure: a phase trajectory through the point (p, q) in the (q, p) plane.]
2) Macroscopic state of a system.

a) Conclusion 1: it is impossible to use purely mechanical methods to study
many-particle systems.
+ Equations (2) with conditions (3) cannot be solved, because the number of
equations is enormous (there are about 10^24 particles in one gram-molecule
of gas) and the initial coordinates and momenta cannot be determined.
+ For a macroscopic system of many particles, finding the stationary-state
energies from (4) is also meaningless. Indeed, the energy spectrum is very
dense, while a macroscopic system always interacts with its environment
(however weakly), and this interaction energy is larger than the gap between
two successive levels of the energy spectrum.

b) Conclusion 2: in fact, a macroscopic system is characterized by quantities
such as pressure, volume, temperature and magnetization, which are called
macroscopic parameters.
+ Definition: a macroscopic state of a system is a state specified by the
macroscopic parameters.
+ A thermodynamic equilibrium state is a state in which the macroscopic
parameters do not change with time t.
+ Internal parameters: temperature T, energy E.
+ External parameters: volume V, the external fields.
3. Description of a macroscopic system by statistical physics methods.

1) Content: if the probabilities of the microscopic states are known, the
observed values of the macroscopic parameters are calculated as averages over
the microstates.

+ Statistical average value of a quantity A in quantum statistical physics:

Ā = Σ_n ω_n A_n    (1)

(ω_n is the probability of the state n, and A_n is the value of A in that
state).

+ Statistical average value of a dynamical quantity A(p, q) in classical
statistical physics:

Ā = ∫ A(p, q) ω(p, q) dp dq    (2)

(ω(p, q) is the probability distribution function).

2) The purpose of the statistical physics methods: determine the
probabilities of the microscopic states, ω_n or ω(p, q), and then the
statistical average value of any quantity A.
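
A small numerical sketch of formulas (1) and (2) (not from the lecture; the
probabilities, the values A_n, and the Gaussian distribution ω(p, q) used
below are illustrative assumptions):

import numpy as np

# Quantum form (1): average as a probability-weighted sum over states.
w_n = np.array([0.5, 0.3, 0.2])        # probabilities of the microstates (sum to 1)
A_n = np.array([1.0, 2.0, 3.0])        # value of A in each state
A_quantum = np.sum(w_n * A_n)

# Classical form (2): average of A(p, q) over a distribution w(p, q).
# Here w(p, q) is a normalized Gaussian and A = H = (p^2 + q^2)/2.
p, q = np.meshgrid(np.linspace(-6, 6, 400), np.linspace(-6, 6, 400))
w = np.exp(-(p**2 + q**2) / 2)
w /= np.trapz(np.trapz(w, q[:, 0], axis=0), p[0])   # normalize: integral of w = 1
A = (p**2 + q**2) / 2
A_classical = np.trapz(np.trapz(A * w, q[:, 0], axis=0), p[0])

print(A_quantum, A_classical)          # A_classical is close to 1 for this w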

4. LIOUVILLE'S THEOREM. THE ROLE OF ENERGY.

1) Liouville's theorem:

a) Theorem: if the number of particles of the system remains constant, the
probability distribution function ω does not change along the phase
trajectory, i.e.

dω/dt = ∂ω/∂t + {ω, H} = 0 .
b) Proof:
- In the general case, the probability distribution function ω depends on p,
q and t.
- Consider the variation of ω with t, i.e. along the phase trajectory.

[Figure: the phase point moves from A(p, q) at time t to B(p+dp, q+dq) at
time t+dt.]

Suppose that at time t the state of the system is described by the phase
point A(p, q), and at time t + dt by the phase point B(p + dp, q + dq). We
have:

dω = (∂ω/∂t) dt + Σ_{i=1..f} [ (∂ω/∂p_i) dp_i + (∂ω/∂q_i) dq_i ] .

From this and the set of Hamiltonian equations we get:

dω/dt = ∂ω/∂t + Σ_i [ (∂ω/∂p_i)(dp_i/dt) + (∂ω/∂q_i)(dq_i/dt) ]
      = ∂ω/∂t + Σ_i [ (∂ω/∂q_i)(∂H/∂p_i) − (∂ω/∂p_i)(∂H/∂q_i) ]
      = ∂ω/∂t + {ω, H} .    (1)

Suppose the total number of particles of the system is N and the number of
particles whose phase points lie in a volume V of phase space is N_V. Then

N_V = N ∫_V ω(p, q, t) dp dq ,    ∂N_V/∂t = N ∫_V (∂ω/∂t) dp dq .    (2)

(2) gives the change of the number of particles in V per unit time. It equals
(minus) the number of particles passing through the boundary S of V, i.e.

∂N_V/∂t = − N ∮_S (ω v)_n dσ    (3)

(v is the velocity vector in phase space; the index n denotes the projection
onto the outward normal of S).
Applying the Gauss (divergence) theorem:

∂N_V/∂t = − N ∫_V div(ω v) dp dq
        = − N ∫_V Σ_i [ ∂/∂q_i (ω dq_i/dt) + ∂/∂p_i (ω dp_i/dt) ] dp dq
        = − N ∫_V Σ_i [ (∂ω/∂q_i)(dq_i/dt) + (∂ω/∂p_i)(dp_i/dt) ] dp dq
        = − N ∫_V Σ_i [ (∂ω/∂q_i)(∂H/∂p_i) − (∂ω/∂p_i)(∂H/∂q_i) ] dp dq
        = − N ∫_V {ω, H} dp dq .    (4)

(The second equality uses the Hamiltonian equations (2), from which
∂/∂q_i (dq_i/dt) + ∂/∂p_i (dp_i/dt) = ∂²H/∂q_i∂p_i − ∂²H/∂p_i∂q_i = 0.)

Comparing (2) and (4), it is deduced that

∂ω/∂t = − {ω, H} ,   i.e.   ∂ω/∂t + {ω, H} = 0 .    (5)

From (1) and (5) it is deduced that

dω/dt = ∂ω/∂t + {ω, H} = 0 .    (6)

Equation (6) expresses Liouville's theorem.
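
A small numerical check of the incompressibility used in step (4) (not from
the lecture; the anharmonic test Hamiltonian and the sample points are
illustrative assumptions): the phase-space velocity field
(dq/dt, dp/dt) = (∂H/∂p, −∂H/∂q) has zero divergence.

import numpy as np

def H(q, p):
    # An arbitrary anharmonic one-particle Hamiltonian used only as a test case.
    return p**2 / 2 + q**4 / 4

def phase_velocity(q, p, h=1e-5):
    # (dq/dt, dp/dt) = (dH/dp, -dH/dq), derivatives by central differences.
    dHdp = (H(q, p + h) - H(q, p - h)) / (2 * h)
    dHdq = (H(q + h, p) - H(q - h, p)) / (2 * h)
    return dHdp, -dHdq

def divergence(q, p, h=1e-4):
    # d(dq/dt)/dq + d(dp/dt)/dp, again by central differences.
    vq_plus, _ = phase_velocity(q + h, p); vq_minus, _ = phase_velocity(q - h, p)
    _, vp_plus = phase_velocity(q, p + h); _, vp_minus = phase_velocity(q, p - h)
    return (vq_plus - vq_minus) / (2 * h) + (vp_plus - vp_minus) / (2 * h)

rng = np.random.default_rng(0)
for q, p in rng.normal(size=(5, 2)):
    print(f"div at (q={q:+.2f}, p={p:+.2f}) = {divergence(q, p):.2e}")  # ~ 0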

2) The role of energy:

a) Result 1: ω is a function of the energy, i.e. a function of H:
ω(p, q) = ω(H(p, q)).

In thermodynamic equilibrium states the statistical average values do not
depend on time t; it follows that ω does not depend explicitly on time
(∂ω/∂t = 0). From (5) and ∂ω/∂t = 0 we obtain {ω, H} = 0.

Moreover, classical mechanics shows that a quantity is an integral of motion
if its Poisson bracket with H is zero; therefore ω is an integral of motion.
According to mechanics, ω must be a function of the seven additive integrals
of motion: the 3 components of the momentum, the 3 components of the angular
momentum, and the energy. For a system at rest and not rotating, ω depends on
the energy alone. Thus, in a thermodynamic equilibrium state, ω(p, q, t)
becomes ω(p, q) and depends only on the energy:

ω(p, q) = ω( H(p, q) ) .    (7)

b) Result 2: ω(p, q) = A e^(−βH(p, q)), in which A is the normalization
coefficient and β is a constant with the meaning of an inverse energy.

Suppose that a macroscopic system consists of many macroscopic subsystems and
that the interactions between the subsystems are negligible (very weak). The
energy of the system equals the sum of the energies of the component
subsystems:

H = H_1 + H_2 + ... = Σ_i H_i ,    ω(H) = ω( Σ_i H_i ) .    (8)

Because the interaction between the components is negligible, they are
statistically independent; hence the probability of a state of the whole
macroscopic system equals the product of the probabilities of the states of
the component subsystems:

ω(H) = ω(H_1) ω(H_2) ... ,    (9)

ln ω(H) = ln ω(H_1) + ln ω(H_2) + ... = Σ_i ln ω(H_i) .    (10)

Relations (8) and (10) mean that ω(H) depends on H as an exponential
function:

ω(p, q) = ω(H(p, q)) = A e^(−βH(p, q)) .    (11)

The coefficients A and β do not depend on p, q. The minus sign in the
exponent ensures that the probability cannot become greater than 1 at large
energies; β has the meaning of an inverse energy, so that the exponent is
dimensionless. A is the normalization coefficient (the probability of a
certain event equals 1):

∫ A e^(−βH(p, q)) dp dq = 1 ,   hence   A^(−1) = ∫ e^(−βH(p, q)) dp dq .    (12)

Note: Liouville's theorem and the distribution ω(p, q) above are established
for classical statistics.
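
A minimal sketch of the normalization (12) (not from the lecture; the 1D
harmonic oscillator H = p^2/(2m) + m w0^2 q^2 / 2 and the values of m, w0 and
β are illustrative assumptions); the grid integral is compared with the
analytic Gaussian result 2π/(β w0).

import numpy as np

m, w0, beta = 1.0, 2.0, 0.7

p = np.linspace(-20, 20, 2001)
q = np.linspace(-20, 20, 2001)
P, Q = np.meshgrid(p, q)
H = P**2 / (2*m) + m * w0**2 * Q**2 / 2

# A^{-1} = integral of exp(-beta*H) over phase space, done on a grid.
A_inv_numeric = np.trapz(np.trapz(np.exp(-beta * H), q, axis=0), p)
A_inv_exact = 2 * np.pi / (beta * w0)        # the two Gaussian integrals done analytically

print(A_inv_numeric, A_inv_exact)            # the two values agree closely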

5. QUANTUM STATISTICAL PHYSICS. DENSITY MATRIX.

As in section 4, the question is how the probabilities of the quantum states
ψ_n are related to their energies E_n.

1) Density matrix.

Assume that the macroscopic system can be in the quantum states ψ^(α) with
probabilities P_α (α = 1, 2, 3, ...).
Expand ψ^(α) in the series of orthonormal eigenfunctions φ_m of a Hermitian
operator f̂, i.e.

f̂ φ_m = f_m φ_m ,    (1)

ψ^(α) = Σ_m C_m^(α) φ_m ,    (2)

where C_m^(α) are the coefficients of the expansion.

Consider a physical quantity A corresponding to the operator Â. According to
quantum mechanics, the observed value of A in the state ψ^(α) is

A^(α) = ∫ ψ^(α)* Â ψ^(α) dq
      = Σ_{m,n} C_m^(α)* C_n^(α) ∫ φ_m* Â φ_n dq
      = Σ_{m,n} C_m^(α)* C_n^(α) A_mn ,    (3)

where A_mn are the matrix elements of the operator Â in the f-representation.

The statistical average value of A is

Ā = Σ_α A^(α) P_α
  = Σ_{m,n} [ Σ_α P_α C_m^(α)* C_n^(α) ] A_mn
  = Σ_{m,n} ω_nm A_mn
  = Sp( ω̂ Â ) .    (4)

Definition: the operator ω̂ with the matrix elements

ω_nm = Σ_α P_α C_m^(α)* C_n^(α)    (5)

is the density matrix operator, and the matrix in (5) is the density matrix.

Note: to calculate the average value Ā we only need to know ω_nm. Hence ω_nm
plays the same role as the probability distribution function ω(p, q) in
classical statistics.
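
A small numerical sketch of definitions (3)-(5) (not from the lecture; the
two pure states, their probabilities and the observable A below are
illustrative assumptions): the trace formula (4) reproduces the direct
ensemble average.

import numpy as np

# Two normalized pure states psi^(alpha) in a 2-dimensional basis {phi_1, phi_2},
# given by their expansion coefficients C[alpha, m], with probabilities P[alpha].
C = np.array([[1.0, 0.0],
              [1.0/np.sqrt(2), 1.0/np.sqrt(2)]], dtype=complex)
P = np.array([0.6, 0.4])

# Density matrix (5):  omega_nm = sum_alpha P_alpha * C_m^* * C_n
omega = np.einsum('a,am,an->nm', P, C.conj(), C)

# An observable A in the same basis (a Hermitian matrix A_mn).
A = np.array([[0.0, 1.0],
              [1.0, 0.0]], dtype=complex)

# Average via the trace formula (4) and via the direct ensemble average.
A_trace = np.trace(omega @ A).real
A_direct = sum(P[a] * (C[a].conj() @ A @ C[a]).real for a in range(2))
print(A_trace, A_direct)   # the two results coincide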

2) Equation of motion of the density matrix.

Differentiate (5) with respect to time:

∂ω_nm/∂t = Σ_α P_α [ (∂C_n^(α)/∂t) C_m^(α)* + C_n^(α) (∂C_m^(α)*/∂t) ] .    (6)

The derivatives ∂C_n/∂t and ∂C_m*/∂t are found from the Schrödinger equation

iħ ∂ψ/∂t = Ĥ ψ   ⇒   iħ Σ_k (∂C_k/∂t) φ_k = Σ_k C_k Ĥ φ_k .

Multiplying on the left by φ_n* and integrating, and using
∫ φ_n* φ_k dq = δ_nk and ∫ φ_n* Ĥ φ_k dq = H_nk, we obtain

iħ ∂C_n/∂t = Σ_k C_k H_nk .    (7)

Similarly,

− iħ ∂C_m*/∂t = Σ_k C_k* H_mk* = Σ_k C_k* H_km    (8)

(since H_mk* = H_km for the Hermitian operator Ĥ).

Substituting (7) and (8) into (6) we obtain

iħ ∂ω_nm/∂t = Σ_α P_α Σ_k [ C_k^(α) C_m^(α)* H_nk − C_n^(α) C_k^(α)* H_km ]
            = Σ_k [ H_nk ω_km − ω_nk H_km ]
            = [ Ĥ, ω̂ ]_nm .

As a result, the equation for the density matrix is

iħ ∂ω_nm/∂t = [ Ĥ, ω̂ ]_nm ,    (9)

and the equation for the density matrix operator is

iħ ∂ω̂/∂t = [ Ĥ, ω̂ ] ,   or   ∂ω̂/∂t = (i/ħ) [ ω̂, Ĥ ] .    (10)
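
A minimal sketch of the equation of motion (10) for a two-level system (not
from the lecture; the Hamiltonian, the initial density matrix, the choice
ħ = 1 and the simple Euler time step are illustrative assumptions). The trace
and the Hermiticity of ω̂ are preserved; if ω̂ commuted with Ĥ it would not
change at all, which anticipates the consequences below.

import numpy as np

hbar = 1.0
H = np.array([[1.0, 0.3],
              [0.3, -1.0]], dtype=complex)
omega = np.array([[0.7, 0.2],
                  [0.2, 0.3]], dtype=complex)   # Hermitian, trace 1

dt, steps = 1e-4, 20000
for _ in range(steps):
    # Euler step of d(omega)/dt = -(i/hbar) * [H, omega]  (equation (10))
    omega = omega + dt * (-1j / hbar) * (H @ omega - omega @ H)

print(np.trace(omega).real)                            # the trace stays equal to 1
print(np.allclose(omega, omega.conj().T, atol=1e-6))   # omega stays Hermitian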

3) Consequences.

a) In thermodynamic equilibrium states the density matrix is diagonal and is
a function of the energy: ω_n ≡ ω_nn = ω_n(E_n).

In a thermodynamic equilibrium state, Ā = Sp(ω̂ Â) does not depend on time,
so ω̂ does not depend explicitly on time:

∂ω̂/∂t = 0   ⇒   [ ω̂, Ĥ ] = [ Ĥ, ω̂ ] = 0 .

According to quantum mechanics, two commuting operators can be brought to
diagonal form simultaneously. In the energy representation E_n the operator
Ĥ is diagonal,

H_nm = E_n δ_nm ,

so in the energy representation the operator ω̂ is also diagonal:

ω_nm = ω_n δ_nm ,

where ω_n = ω_nn are the diagonal elements of the matrix ω_nm.
Since ω̂ commutes with Ĥ, ω_n is an integral of motion; for a non-rotating
system at rest, ω_n is a function of the energy only:

ω_n = ω_n(E_n) .    (11)

b) The formula for the statistical average value Ā:

Ā = Sp[ ω̂ Â ] = Σ_{m,n} ω_nm A_mn = Σ_{m,n} ω_n δ_nm A_mn = Σ_n ω_n A_n ,    (12)

where A_n = A_nn = ∫ ψ_{E_n}* Â ψ_{E_n} dq are the diagonal elements of the
matrix A_nm in the energy representation.

c) The energy dependence of ω_n (it can be obtained in the same way as in
classical statistics):

ω_n = A e^(−βE_n) ,

where A and β do not depend on E_n. A is the normalization constant, found
from the normalization condition Σ_n ω_n = 1:

A^(−1) = Σ_n e^(−βE_n) .    (13)
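
A minimal sketch of formulas (12)-(13) (not from the lecture; the discrete
levels E_n = n + 1/2 of a truncated harmonic-oscillator spectrum and the
value of β are illustrative assumptions):

import numpy as np

beta = 0.5
E = np.arange(200) + 0.5               # E_n = n + 1/2 (units with hbar*w = 1)

A = 1.0 / np.sum(np.exp(-beta * E))    # normalization constant, formula (13)
w = A * np.exp(-beta * E)              # probabilities w_n

E_avg = np.sum(w * E)                  # statistical average of the energy, formula (12)
print(np.sum(w), E_avg)                # the probabilities sum to 1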

6. Thermodynamic equilibrium states.

1) An example of a thermodynamic equilibrium state.

Consider N gas molecules which at t = 0 are all in the left half of a
container. At a later time t = t_1 > 0 the particle system is being
redistributed over the whole container; as t → ∞ a thermodynamic equilibrium
state is reached, with about N/2 molecules in each half.

The macroscopic parameter n(t) is the number of gas molecules in the left
half of the container:

lim_{t→∞} n(t) = N/2 = const .

[Figure: sketches of the container at t = 0, at t = t_1 > 0, and as t → ∞,
together with a plot of n(t) approaching N/2.]
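
A small simulation sketch of this example (not from the lecture; the simple
stochastic rule, that at each step one randomly chosen molecule ends up in a
randomly chosen half, is an illustrative modelling assumption):

import numpy as np

rng = np.random.default_rng(1)
N, steps = 1000, 20000
side = np.zeros(N, dtype=int)          # 0 = left half, 1 = right half; all start left

n_left = [N]
for _ in range(steps):
    i = rng.integers(N)                # pick a random molecule
    side[i] = rng.integers(2)          # it ends up in a randomly chosen half
    n_left.append(int(np.sum(side == 0)))

print(n_left[0], n_left[-1])           # n(t) relaxes from N toward about N/2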

2) Characteristics of thermodynamic equilibrium states.

a) Characteristic 1: in thermodynamic equilibrium states the macroscopic
parameters do not depend on t.
Clearly, in the thermodynamic equilibrium state of the example,
lim_{t→∞} n(t) = N/2 = const.

b) Characteristic 2: thermodynamic equilibrium states do not depend on the
pre-history of the system.
The system reaches the thermodynamic equilibrium state after a long and
complicated chaotic evolution; consequently, the present state does not
depend on the initial state. Although the N initial molecules in the example
can be in any positions in the left half of the container, they will be
redistributed equally between the two halves, and the number of molecules in
each half will be approximately N/2.

c) Characteristic 3: thermodynamic equilibrium states are the states with the
highest degree of randomness.
The number of possible microstates is a measure of the randomness of the
macroscopic system: the higher the randomness of the macroscopic system, the
larger the number of microstates. Each macroscopic state may correspond to
many different microscopic states: a given distribution of the gas molecules
in the container can be realized in many different ways, and if the state is
not an equilibrium one, the number of such ways (which equals the number of
microscopic states) is smaller.
A process in which a system moves toward the thermodynamic equilibrium state
from non-equilibrium states is an irreversible process.

7. THE INTERACTION BETWEEN MACROSCOPIC SYSTEMS.

1) The interaction between macroscopic systems.

The macroscopic state of a system can change through interaction with other
macroscopic systems. There are three types of interaction:

a) The heat interaction is an energy exchange in which the external
parameters do not change, i.e. no mechanical work is done; it requires a
temperature difference and contact between the macroscopic systems.
Example: a body absorbs or releases heat while its volume does not change.

dE = dQ

(dE is the variation of the internal energy of the system, dQ is the heat
received by the system; dQ > 0 for an endothermic system, dQ < 0 for an
exothermic system).

b) The mechanical interaction (the work done) is an energy exchange
associated with a change of the external parameters.
Example: a gas expands when heated and pushes a piston (does mechanical
work).
If dA is an element of work, we have

dA = X dx ,

where X is a generalized force and dx is the variation of an external
parameter. If X is the pressure (X = p) and dx is the variation of the volume
(dx = dV), we obtain dA = p dV. In this case

dE = − dA = − X dx

(dA > 0: work is done by the system and the internal energy decreases;
dA < 0: work is performed on the system and the internal energy increases).

c) The matter interaction is an exchange of energy via an exchange of matter.
Example: the internal energy of a system changes by dE because the number of
particles of the system changes (increases or decreases) by dN:

dE = − μ dN ,

where μ is the chemical potential.
2) The first law of thermodynamics: the mechanical-heat interaction.
Suppose that the system simultaneously receives (or loses) heat and does a
positive (or negative) amount of work. This is called the mechanical-heat
interaction (the two types of interaction occur at the same time):

dE = dQ − dA .

This is the first law of thermodynamics.
Meaning: the heat received by the system has two effects: it changes the
internal energy and it is spent on the work done by the system.

3) Notes:
a) An isolated system is a system which does not interact with any external
systems:

dE = dQ = dA = 0 .

An open system is a system that interacts with outside systems.
b) An adiabatic process is a process without heat exchange (dQ = 0).
An isochoric process is a process at constant volume.
An isobaric process is a process at constant pressure.
An isothermal process is a process at constant temperature.

8. STATISTICAL WEIGHT OF A MACROSCOPIC STATE.

1) Definition:
Consider an isolated system:

dE = dQ = dA = 0 .    (1)

Because the system does not interact with outside systems, the energy of the
system is fixed. "Fixed", however, does not mean that the energy has one
exactly defined value (E = const). According to quantum mechanics, the energy
always has an uncertainty dE related to the duration of observation dt by the
Heisenberg uncertainty relation

dE · dt ~ ħ .    (2)

Therefore an isolated system is considered as a system whose energy lies in
the range

[E, E + dE] .    (3)

It means that the isolated system can be in any of the possible energy levels
in the range (3):

E_n ∈ [E, E + dE] .    (4)

It follows that each macroscopic state of the system can correspond to a
large number of different microstates, namely all those whose E_n satisfy (4).

Definition: the total number of possible microscopic states corresponding to
a given macroscopic state is called the statistical weight of that
macroscopic state, denoted ΔΓ.

2) Characteristics of the statistical weight:

a) The statistical weight ΔΓ depends on the energy:

ΔΓ = ΔΓ(E) .    (5)

Let Γ(E) be the total number of microscopic states with energies less than or
equal to E. Then

ΔΓ(E) = Γ(E + dE) − Γ(E) ≈ (∂Γ/∂E) dE ,

where ∂Γ/∂E is called the density of states of the system.
b) In thermodynamic equilibrium states the number of microstates is maximal,
i.e. the statistical weight is maximal too:

ΔΓ_eq = ΔΓ_max .

A process by which the system reaches the thermodynamic equilibrium state is
a process of increasing statistical weight:

lim_{t→∞} ΔΓ = ΔΓ_max .    (6)

c) The thermodynamic probability P is related to the statistical weight ΔΓ by

P = ΔΓ / ΔΓ_max    (7)

(so that P_eq = 1).

d) The statistical weight ΔΓ and Γ(E) increase very rapidly with energy:

ΔΓ(E) ~ (E − E_o)^f dE ,    Γ(E) ~ (E − E_o)^f .

Indeed, suppose that each quantum state of the system is determined by one
quantum number, called a degree of freedom of the system, and that each
degree of freedom contributes an energy ε to the energy E of the system
(f ≈ 3N). Put ε_o = ε_min, E_o = E_min; then

E − E_o = f (ε − ε_o) ,    (8)

Γ(E) = [γ(ε)]^f ,    (9)

where γ(ε) is a function of the form γ(ε) ~ (ε − ε_o)/Δε ,    (10)
with Δε of the order of the unit of energy.

On the other hand,

ΔΓ = (∂Γ/∂E) dE = (∂Γ/∂ε)(∂ε/∂E) dE .    (11)

Since ∂Γ/∂ε = f γ^(f−1) ∂γ/∂ε ~ f (ε − ε_o)^(f−1) and ∂ε/∂E = 1/f,
substituting into (11) we obtain

ΔΓ ~ (ε − ε_o)^(f−1) dE ~ (ε − ε_o)^f dE ~ (E − E_o)^f dE

(since f >> 1, f − 1 ≈ f, and Δε is of order unity).

Conclusion:

ΔΓ ~ (E − E_o)^f dE ,    Γ(E) ~ (E − E_o)^f .
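
A minimal sketch of how steeply the statistical weight grows with energy (not
from the lecture; the numbers N, E_1, E_2, with E_o = 0, are illustrative
assumptions): a 1% increase of the energy multiplies Γ by an astronomically
large factor.

import math

N = 1e23                      # number of particles
f = 3 * N                     # degrees of freedom
E1, E2 = 1.0, 1.01            # two nearby energies in arbitrary units (Eo = 0)

# ln Gamma(E2) - ln Gamma(E1) = f * ln(E2/E1)  for Gamma(E) ~ (E - Eo)^f
delta_ln_Gamma = f * math.log(E2 / E1)
print(delta_ln_Gamma)         # ~ 3e21, i.e. Gamma grows by the factor e^(3e21)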

9. ENTROPY

1) Definition: the quantity S = k ln ΔΓ is called the entropy S of the system
(k = 1.38 × 10^(−23) J/K is Boltzmann's constant).

Since ΔΓ ~ (E − E_o)^f dE and f depends on the number of particles N, and
since ΔΓ also depends on whether or not the system is in a thermodynamic
equilibrium state and on the external parameters X that affect the
macroscopic state, ΔΓ depends on E as well as on N and X. In general,

S = S(E, X, N) .

2) Properties:

a) Property 1: entropy S is an additive quantity: S = Σ_i S_i.

Suppose that the macroscopic system is divided into n macroscopic components
that are independent of each other. Then

ΔΓ = ΔΓ_1 · ΔΓ_2 · ΔΓ_3 · ... · ΔΓ_n

(since ΔΓ is proportional to the thermodynamic probability, which equals the
product of the probabilities of the components), and therefore

S = k ln ΔΓ = k ln ΔΓ_1 + k ln ΔΓ_2 + k ln ΔΓ_3 + ...
  = S_1 + S_2 + S_3 + ... = Σ_i S_i .

b) Property 2: entropy S is a monotonically increasing function of the energy
(S ≈ k f ln(E − E_o)).

Since ΔΓ = (∂Γ/∂E) dE ~ (E − E_o)^f dE, with dE of the order of the unit of
energy,

S = k ln ΔΓ ≈ k f ln(E − E_o) + k ln dE ≈ k f ln(E − E_o) ,

which increases monotonically with E.
c) Property 3: in thermodynamic equilibrium states the entropy S of an
isolated system is maximal.

ΔΓ_eq = ΔΓ_max   ⇒   S_eq = k ln ΔΓ_eq = k ln ΔΓ_max = S_max .

Consequence: the thermodynamic probability of a state of the system is

P = ΔΓ / ΔΓ_max ;    S = k ln ΔΓ  ⇒  ΔΓ = e^(S/k) ,   ΔΓ_max = e^(S_max/k) ,

so that

P = e^((S − S_max)/k) = exp[ (S − S_max)/k ] .

d) Property 4: the principle of the increase of entropy.

"The entropy S of a system approaching the thermodynamic equilibrium state
increases."

By the general property of ΔΓ, the statistical weight of a system approaching
the thermodynamic equilibrium state increases, hence S increases too:

S_{t2} ≥ S_{t1} ,   ΔS ≥ 0   (t2 > t1) .

Summary: ΔS ≥ 0, where the greater-than sign applies to irreversible
processes and the equality sign to reversible processes (in which the
system's entropy remains constant).

Note: the entropy of the whole closed system increases, but the entropy of a
part of the closed system may decrease:

ΔS_1 + ΔS_2 ≥ 0 is compatible with ΔS_1 < 0, ΔS_2 > 0, ΔS_2 > |ΔS_1| .
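
A small sketch connecting entropy and statistical weight for the two-halves
gas example of section 6 (not from the lecture; counting the arrangements
with the binomial coefficient and the value N = 100 are illustrative
assumptions): S = k ln ΔΓ is largest for the macrostate with n = N/2.

from math import comb, log

k = 1.38e-23          # Boltzmann's constant, J/K
N = 100               # number of molecules (illustrative)

# Statistical weight of the macrostate "n molecules in the left half":
# the number of ways to choose which n of the N molecules are on the left.
def S(n):
    return k * log(comb(N, n))

best = max(range(N + 1), key=S)
print(best, S(best), S(0))   # entropy is maximal at n = N/2 and zero at n = 0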

10. TEMPERATURE

1) Energy distribution between the subsystems of an isolated system in
thermodynamic equilibrium states.

a) We show that the energy of an isolated system in a thermodynamic
equilibrium state is distributed between its subsystems in such a way that
the entropy S of the whole system is maximal.

Consider an isolated system consisting of two subsystems 1 and 2, with
energies E_1, E_2 and statistical weights ΔΓ_1(E_1), ΔΓ_2(E_2):

E = E_1 + E_2 = E_o = const    (1)

(fixed to within the small uncertainty dE). The statistical weight of the
whole system is

ΔΓ = ΔΓ_1(E_1) · ΔΓ_2(E_2)    (2)
   = ΔΓ_1(E_1) · ΔΓ_2(E_o − E_1) .    (3)

As E_1 increases, ΔΓ_1(E_1) increases very rapidly, while (E_o − E_1)
decreases and ΔΓ_2(E_o − E_1) decreases very rapidly. Since ΔΓ is the product
of an increasing and a decreasing function of E_1, there must exist a value
Ē_1 at which ΔΓ reaches a (very sharp) maximum.

[Figure: ΔΓ_1, ΔΓ_2 and their product ΔΓ as functions of E_1, with the
maximum of ΔΓ at E_1 = Ē_1.]

When E_1 = Ē_1,

ΔΓ = ΔΓ_1(Ē_1) · ΔΓ_2(E_o − Ē_1) = ΔΓ_max .    (4)

Because S = k ln ΔΓ, when ΔΓ = ΔΓ_max we have S = S_max:

S = S_1(Ē_1) + S_2(E_o − Ē_1) = S_max .    (5)

b) Conclusion: with the energy distribution E_1 = Ē_1, E_2 = E_o − Ē_1, the
entropy S of the system is maximal, i.e. the isolated system is in the
thermodynamic equilibrium state.
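
A small numerical sketch of this maximization (not from the lecture; the
model S_i = k f_i ln E_i, the chosen f_1, f_2 and E_o, and the units k = 1
are illustrative assumptions). At the maximum the derivatives dS_1/dE_1 and
dS_2/dE_2 are equal, which is relation (6) of the next subsection.

import numpy as np

k = 1.0                     # work in units where Boltzmann's constant k = 1
f1, f2 = 3e3, 6e3           # degrees of freedom of subsystems 1 and 2 (illustrative)
Eo = 10.0                   # total energy (illustrative, with the minimum at 0)

E1 = np.linspace(0.01, Eo - 0.01, 100000)
S = k * f1 * np.log(E1) + k * f2 * np.log(Eo - E1)   # S = S1(E1) + S2(Eo - E1)

i = np.argmax(S)
E1_bar = E1[i]
# At the maximum, dS1/dE1 = dS2/dE2 (here f1/E1 = f2/(Eo - E1)):
print(E1_bar, f1 / E1_bar, f2 / (Eo - E1_bar))       # equal slopes at E1 = E1_bar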

2) Absolute temperature:

From (5),

dS/dE_1 |_{E_1 = Ē_1} = 0 ,

dS/dE_1 = ∂S_1/∂E_1 + (∂S_2/∂E_2)(∂E_2/∂E_1) = ∂S_1/∂E_1 − ∂S_2/∂E_2 = 0

(∂E_2/∂E_1 = −1 because E_2 = E_o − E_1). We obtain

∂S_1/∂E_1 |_{E_1 = Ē_1} = ∂S_2/∂E_2 |_{E_2 = E_o − Ē_1} .    (6)

Definition: the absolute temperature of a macroscopic system in a
thermodynamic equilibrium state is defined by

T = ∂E/∂S |_{E = Ē} ,   i.e.   1/T = ∂S/∂E |_{E = Ē} .    (7)

From (6) and (7), 1/T_1 = 1/T_2, or T_1 = T_2.

Conclusion: in thermodynamic equilibrium states the temperature of subsystem
1 equals that of subsystem 2.

Properties of the absolute temperature:

a) The absolute temperature is not negative: T ≥ 0.
From the definition T = ∂E/∂S |_{E = Ē} we obtain

1/T = ∂S/∂E |_{E = Ē} = ∂( k ln ΔΓ )/∂E |_{E = Ē} ,
1/(kT) = ∂(ln ΔΓ)/∂E |_{E = Ē} ≥ 0    (8)

(ΔΓ is an increasing function of the energy, so ln ΔΓ is too, and its
derivative with respect to E is non-negative; hence 1/(kT) ≥ 0, i.e. T ≥ 0).

b) The absolute temperature T is proportional to the average energy per
degree of freedom, measured from the minimum E_o.

Since ΔΓ ~ (E − E_o)^f, we have ln ΔΓ ≈ f ln(E − E_o), and according to (8)

1/(kT) = ∂(ln ΔΓ)/∂E ≈ f / (E − E_o) ,
kT ≈ (E − E_o)/f ,
T ≈ (E − E_o)/(k f) .    (9)

Conclusion: the absolute temperature T is proportional to the average energy
of each degree of freedom, measured from the minimum E_o.

3) The heat transfer between two objects in contact:

Consider two objects with T_1 ≠ T_2 and bring them into contact. At first,
before contact, each object is in a thermodynamic equilibrium state. Just
after contact, the larger system consisting of the two objects is not in
equilibrium, because T_1 ≠ T_2. The two objects then exchange energy so that
the larger system reaches a thermodynamic equilibrium state. This process is
an entropy-increasing process, i.e.

dS ≥ 0 ,   S = S_1(E_1) + S_2(E_2) ,   E_o = E_1 + E_2 ,

so that S = S_1(E_1) + S_2(E_o − E_1) and

dS = (∂S_1/∂E_1) dE_1 + (∂S_2/∂E_2) dE_2
   = ( ∂S_1/∂E_1 − ∂S_2/∂E_2 ) dE_1 ≥ 0 .    (10)

Putting ∂E_1/∂S_1 = T_1, ∂E_2/∂S_2 = T_2 and dE_1 = dQ_1 into (10), we obtain

dS = ( 1/T_1 − 1/T_2 ) dQ_1 ≥ 0 .    (11)

Conclusion: expression (11) gives two possibilities. If T_1 > T_2, then
dQ_1 ≤ 0, i.e. object 1 loses heat; if T_1 < T_2, then dQ_1 ≥ 0, i.e.
object 1 gains heat. As a result, when two objects are in contact, heat
transfers from the hot object to the cold object.
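
A minimal sketch of relation (11) (not from the lecture; the constant heat
capacities C_1, C_2, the initial temperatures and the simple relaxation rule
are illustrative assumptions): the total entropy never decreases, and heat
flows from the hotter body to the colder one until the temperatures equalize.

import numpy as np

C1, C2 = 2.0, 3.0
T1, T2 = 400.0, 300.0
S_total = 0.0

for _ in range(20000):
    dQ1 = 0.01 * (T2 - T1)          # object 1 gains heat only if T2 > T1
    dS = (1.0 / T1 - 1.0 / T2) * dQ1    # entropy production, relation (11)
    S_total += dS
    T1 += dQ1 / C1                  # dQ1 received by object 1
    T2 -= dQ1 / C2                  # the same heat is given up by object 2
    assert dS >= 0                  # the entropy production is never negative

print(T1, T2, S_total)              # temperatures equalize; total entropy change > 0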
