Statistical Mechanics Part 1
Overview of course:
Part 1 – Topic 1: Energy states and populations; Topic 2: Molecular partition functions; Topic 3: Towards thermodynamic functions; Topic 4: Translational partition function.
Part 2 – The rotational and electronic partition functions; entropy and the statistical view of the partitioning of molecules among states.
Topic 1: Energy states and populations
The Boltzmann distribution tells us that the number of molecules n_i found in a state of energy ε_i is proportional to the Boltzmann factor:

n_i \propto \mathrm{e}^{-\varepsilon_i / kT}        Eq. 1.1
At T = 0 K all molecules occupy the lowest energy state but as temperature is
increased more molecules have enough energy to populate higher states. As
T tends to infinity all energy levels become equally populated. [Note: a
common error is to imagine that as T → ∞ all the molecules will be found in
the upper energy state; this does not happen – all states become equally
populated.]
To remove the need for a proportionality constant in Eq. 1.1 we often consider
the relative populations of the states, or the ratio of the number of molecules in
state 𝑖 (𝑛𝑖 ) with energy 𝜀𝑖 to the number in state 𝑗 (𝑛𝑗 ) with energy 𝜀𝑗 (Eq. 1.2):
\frac{n_i}{n_j} = \mathrm{e}^{-(\varepsilon_i - \varepsilon_j)/kT}        Eq. 1.2
An understanding of how energy is distributed according to the Boltzmann
distribution is key to understanding most aspects of thermodynamics and
kinetics in chemistry. For a given set of energy levels, temperature is the only
parameter that governs how the available states are populated.
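A minimal Python sketch of Eq. 1.2 may help make these limits concrete. The energy gap used here is an assumed illustrative value (roughly kT at room temperature), not taken from the notes.

```python
import numpy as np

# Relative population of two states separated by delta_eps (Eq. 1.2),
# illustrating the limits discussed above: essentially all molecules in the
# lower state as T -> 0, equal populations as T -> infinity.
k_B = 1.380649e-23       # Boltzmann constant / J K^-1
delta_eps = 4.0e-21      # assumed gap eps_i - eps_j in joules (~kT at 298 K)

for T in [10, 100, 298, 1000, 1e6]:
    ratio = np.exp(-delta_eps / (k_B * T))   # n_i / n_j from Eq. 1.2
    print(f"T = {T:>9.0f} K   n_i/n_j = {ratio:.4f}")

# The ratio climbs from ~0 towards 1 but never exceeds 1: the upper state
# never becomes more populated than the lower one.
```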
For a system of N molecules distributed over the available states, the total number of molecules is:

N = n_0 + n_1 + n_2 + \cdots = \sum_i n_i        Eq. 1.3

Using Eq. 1.2 with j = 0 (and setting ε_0 = 0), the population of each state can be written as n_i = n_0 e^{-βε_i}, where β = 1/kT. Hence:

N = \sum_i n_0 \mathrm{e}^{-\beta\varepsilon_i} = n_0 \sum_i \mathrm{e}^{-\beta\varepsilon_i}
And rearranging for n_0 gives:

n_0 = \frac{N}{\sum_i \mathrm{e}^{-\beta\varepsilon_i}}        Eq. 1.5

The sum in the denominator is the molecular partition function, q = \sum_i \mathrm{e}^{-\beta\varepsilon_i}, so that n_0 = N/q.
Now look at worked problems 1 (WP 1.1-1.3) where some examples of how to
obtain and interpret the partition function are discussed.
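Separately from the worked problems, the short sketch below evaluates the sum in Eq. 1.5 for a hypothetical ladder of equally spaced, non-degenerate states (the spacing is an assumed value) and shows how the ground-state fraction falls as more states become thermally accessible.

```python
import numpy as np

# q = sum_i exp(-beta*eps_i) for an assumed ladder of equally spaced states,
# and the resulting ground-state fraction n0/N = 1/q (Eq. 1.5).
k_B = 1.380649e-23                      # J K^-1
spacing = 4.0e-21                       # assumed level spacing / J
energies = spacing * np.arange(50)      # eps_0 = 0, eps_1, eps_2, ... (50 states is ample here)

for T in [100, 298, 1000]:
    beta = 1.0 / (k_B * T)
    q = np.sum(np.exp(-beta * energies))            # partition function
    print(f"T = {T:>5} K   q = {q:6.3f}   n0/N = {1/q:.3f}")

# As T rises, q grows (more states are accessible) and n0/N = 1/q falls.
```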
Topic 2: Molecular partition functions
In Topic 1 the energy states were treated in a very generic way and the form
or type of energy was not defined. This is because one of the principles of
Statistical Mechanics is the very powerful ‘principle of equal a priori
probabilities’. This is a complex way of stating the simple assumption that all
possibilities for the distribution of energy are equally probable. Thus, the
treatment of energy states in Topic 1 could be used for any type of molecular
energy – e.g. two conformations with different energy, different spin states,
different electronic energies etc.
In Topic 2 we are going to focus on the population of energy states concerned
with molecular motion: these are translational, rotational and vibrational
energy levels. The ‘principle of equal a priori probabilities’ allows us to
assume that if (for example) a rotational state and a vibrational state have the
same energy they have an equal probability of being occupied.
So first let’s revise what we know about the energy levels for molecular motion
from quantum mechanics and spectroscopy.
Vibrational energy levels can be derived by solving the Schrödinger
equation for the Simple Harmonic Oscillator (SHO). This results in a series of
evenly spaced, non-degenerate energy levels at E = (v + 1/2)ℏω (in J). Hence
the energy difference between levels is Δ𝐸 = ℏ𝜔.
In spectroscopy the vibrational energy levels are usually reported as a term
G(v) = ω_e(v + 1/2) determined in wavenumbers (cm⁻¹). The spacing between
levels is therefore ΔG(v) = ω_e. As shown in spectroscopy lectures, typical
values of ω_e are 1000 – 3000 cm⁻¹ (e.g. ω_e for HCl is 2987 cm⁻¹).
Of course the SHO model does not take into account anharmonicity and the
fact that real bonds eventually break when vibrationally excited above the
dissociation energy. However, in the vibrational ground state the SHO model
is acceptable and hence we will use the resulting energy level expressions in
further discussions of the vibrational partition function 𝑞 𝑉 .
Rotational energy levels can be derived by solving the Schrödinger equation
for a particle on a sphere or rigid rotor. This results in a series of energy levels
whose separation increases with increasing energy, at E = (ℏ²/2I)J(J + 1).
In spectroscopy the rotational levels are described in cm⁻¹ as F(J) = BJ(J + 1),
and the difference between consecutive levels increases from 2B to 4B to 6B etc.
Typical values of B range from 0.1 to 60 cm⁻¹, leading to differences between the
lower rotational levels of the order of 10⁻¹ to 10² cm⁻¹.
Unlike vibrational energy levels, rotational energy levels are degenerate, with
(2J + 1) states available at each energy level.
Translational energy levels can be derived and described using the particle
in a box model (see Topic 4) but for most practical purposes the energy levels
for a typical molecule in typical volumes at typical temperatures are so close
together as to be considered a continuum. As we experience in our everyday
lives, the translational motion of gases can be treated classically in most
situations. Exceptions would be very light molecules at low temperatures in
confined volumes.
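A rough numerical comparison of these three kinds of level spacing with the thermal energy kT at 298 K summarises the picture. The vibrational and rotational numbers below are illustrative values within the typical ranges quoted above, and the translational spacing assumes the particle-in-a-box model of Topic 4 (E_n = n²h²/8mL²) for a molecule of mass ~36 u in a 1 m box; none of these specific numbers come from the notes.

```python
# Compare typical level spacings with kT at 298 K, all in wavenumbers.
h = 6.62607015e-34        # J s
c_cm = 2.99792458e10      # speed of light / cm s^-1
k_B = 1.380649e-23        # J K^-1
m = 36 * 1.66054e-27      # kg, roughly the mass of HCl
L = 1.0                   # box length / m

kT_cm = k_B * 298 / (h * c_cm)                        # thermal energy (~207 cm^-1)
vib_cm = 3000.0                                       # illustrative vibrational spacing
rot_cm = 20.0                                         # illustrative low-J rotational spacing
trans_cm = 3 * h**2 / (8 * m * L**2) / (h * c_cm)     # particle-in-a-box n=1 -> n=2 gap

print(f"kT at 298 K       ~ {kT_cm:10.1f} cm^-1")
print(f"vibrational gap   ~ {vib_cm:10.1f} cm^-1  (>> kT)")
print(f"rotational gap    ~ {rot_cm:10.1f} cm^-1  (< kT)")
print(f"translational gap ~ {trans_cm:10.1e} cm^-1  (vastly smaller than kT)")
```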
You may or may not have noticed that I used the term ‘energy states’
throughout Topic 1, yet in section 2.1 the term ‘energy level’ was used to
describe specific molecular energies. It is important in statistical mechanics to
distinguish between the two. This is because a number of states (𝑔𝑖 ) may be
available at an energy (𝜀𝑖 ) and hence that energy level is 𝑔𝑖 -fold degenerate.
The Figure below shows an example where there are two energy levels, each
of which is 2-fold degenerate (i.e. 𝑔0 = 2 and 𝑔1 = 2).
If, instead of summing the partition function over states, we now sum over
levels, we need to take the degeneracy into account and so we can write:

q = \sum_{\mathrm{levels}\ i} g_i \, \mathrm{e}^{-\beta\varepsilon_i}
2.3. Contributions to the partition function
Because the total energy of a molecule is (to a good approximation) the sum of independent translational, rotational and vibrational contributions, the molecular partition function factorises:

q = \sum_{i(\mathrm{states})} \mathrm{e}^{-\beta\varepsilon_i} = \sum_{i(\mathrm{states})} \mathrm{e}^{-\beta(\varepsilon_i^T + \varepsilon_i^R + \varepsilon_i^V)} = \sum_{i(T)} \mathrm{e}^{-\beta\varepsilon_i^T} \sum_{i(R)} \mathrm{e}^{-\beta\varepsilon_i^R} \sum_{i(V)} \mathrm{e}^{-\beta\varepsilon_i^V} = q^T q^R q^V
For now let’s consider typical values of 𝑞 𝑇 , 𝑞 𝑅 and 𝑞 𝑉 for 1H35Cl at room
temperature (298 K) so that we can appreciate how energy is distributed /
partitioned across different modes in a typical diatomic molecule. The
calculations are explained in more detail in Worked Problems 2.
Measuring the vibrational energies from the v = 0 level (so that ε_v = vhcω_e):

q^V = \sum_i \mathrm{e}^{-\beta\varepsilon_i^V} = \sum_v \mathrm{e}^{-\beta v h c \omega_e}
Similarly, including the (2J + 1) degeneracy of each rotational level:

q^R = \sum_i \mathrm{e}^{-\beta\varepsilon_i^R} = \sum_J (2J + 1)\, \mathrm{e}^{-\beta h c B J(J+1)}
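As a rough sketch of how these two sums behave for ¹H³⁵Cl at 298 K (the proper calculations are in Worked Problems 2), they can be evaluated by brute force. The value ω_e = 2987 cm⁻¹ is the one quoted in section 2.1; B ≈ 10.6 cm⁻¹ is an assumed literature-style value that is not given in these notes.

```python
import numpy as np

# Direct summation of q_V and q_R for HCl at 298 K.
h = 6.62607015e-34        # J s
c_cm = 2.99792458e10      # cm s^-1
k_B = 1.380649e-23        # J K^-1
T = 298.0
beta_hc = h * c_cm / (k_B * T)       # converts a wavenumber into beta*epsilon

omega_e = 2987.0                      # cm^-1 (from section 2.1)
B = 10.6                              # cm^-1 (assumed value for HCl)

v = np.arange(0, 100)
q_V = np.sum(np.exp(-beta_hc * v * omega_e))                 # vibrational sum

J = np.arange(0, 200)
q_R = np.sum((2 * J + 1) * np.exp(-beta_hc * B * J * (J + 1)))   # rotational sum

print(f"q_V ~ {q_V:.6f}   (very close to 1: only v = 0 is significantly populated)")
print(f"q_R ~ {q_R:.1f}    (roughly kT/hcB, i.e. around 20 rotational states)")
```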
So, what do these values tell us about how energy partitions in a typical
molecule? First, it is clear that 𝑞 𝑉 is very small (close to 1) and 𝑞 𝑇 is huge –
many, many orders of magnitude greater. Thus, many more translational
energy levels are occupied at room temperature compared to vibrational (and
rotational) states.
Why is this? As discussed above, translational energy levels are very close
together in energy and the separation between them is far smaller than the
thermal energy available to the molecules at room temperature.
\Delta\varepsilon^T \ll kT
Hence the molecules have sufficient energy to occupy many translational
states and q^T is consequently large.
Topic 3: Towards thermodynamic functions
The total energy of a system of N molecules distributed over the energy states is:

E = \sum_i n_i \varepsilon_i        Eq. 3.1

The mean energy per molecule is ⟨ε⟩ = E/N, and substituting the Boltzmann populations n_i/N = e^{-βε_i}/q gives:

\langle\varepsilon\rangle = \frac{1}{q}\sum_i \varepsilon_i \mathrm{e}^{-\beta\varepsilon_i}        Eq. 3.3
To manipulate this equation into a form that involves only 𝑞 (i.e. that eliminates
𝜀𝑖 ) we can note the following relationship:
\varepsilon_i \mathrm{e}^{-\beta\varepsilon_i} = -\frac{\mathrm{d}}{\mathrm{d}\beta}\, \mathrm{e}^{-\beta\varepsilon_i}
Hence it follows that:
\langle\varepsilon\rangle = -\frac{1}{q}\sum_i \frac{\mathrm{d}}{\mathrm{d}\beta}\, \mathrm{e}^{-\beta\varepsilon_i} = -\frac{1}{q}\frac{\mathrm{d}}{\mathrm{d}\beta}\sum_i \mathrm{e}^{-\beta\varepsilon_i} = -\frac{1}{q}\frac{\mathrm{d}q}{\mathrm{d}\beta}        Eq. 3.4
So now you can see that we have derived an expression for mean molecular
energy only in terms of the partition function (and its temperature dependence
– remember 𝛽 = 1⁄𝑘𝑇).
However, we need to make two modifications to Eq 3.4. First, by convention
we set ε_0 = 0 when determining q, but the true ground-state energy ε_gs may
not be zero (for vibration, for example, ε_gs is the zero-point energy of the SHO).
Hence the true mean energy is ⟨ε⟩ + ε_gs. Secondly, in some cases the
partition function might depend on the volume as well as temperature (this is
true for 𝑞 𝑇 ) so we need to re-write the derivative in Eq 3.4 as a partial
derivative. Thus, the complete expression relating the partition function to the
mean energy of a molecule is:
\langle\varepsilon\rangle = \varepsilon_{gs} - \frac{1}{q}\left(\frac{\partial q}{\partial\beta}\right)_V        Eq. 3.5
A demonstration of how to use this expression is given in Worked Problems 3.
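Separately from the worked problems, a minimal numerical sketch of Eq. 3.5 for a hypothetical two-level system (ε_0 = 0, ε_1 = δ, so ε_gs = 0 here; δ is an assumed value) shows the derivative form in action.

```python
import numpy as np

# Check Eq. 3.5 for a two-level system: <eps> = -(1/q) dq/dbeta,
# evaluating dq/dbeta by a small finite difference.
delta = 4.0e-21                       # assumed level spacing / J
k_B = 1.380649e-23
T = 298.0
beta = 1.0 / (k_B * T)

def q(b):
    return 1.0 + np.exp(-b * delta)   # q = sum_i exp(-beta * eps_i)

db = beta * 1e-6                      # small step in beta
mean_eps_numeric = -(q(beta + db) - q(beta - db)) / (2 * db) / q(beta)   # Eq. 3.5
mean_eps_exact = delta * np.exp(-beta * delta) / q(beta)                 # direct average

print(f"numerical  <eps> = {mean_eps_numeric:.4e} J")
print(f"analytical <eps> = {mean_eps_exact:.4e} J")
```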
3.3. The canonical ensemble and canonical partition function
We made the assumption above that the molecules were non-interacting but in
practice molecules do interact (real gases, liquids, solids etc) and we need to
construct a model to account for this. In Eq. 3.6 we assumed we can simply
scale the total energy of the system by multiplying the average energy of each
molecule by the number of molecules there are in the system. However, if
molecules interact with each other (for example electrostatic attractions and
repulsions, hydrogen-bonding etc) additional energy terms arise that do not
scale linearly with the number of molecules present.
The way around this is to imagine a closed collection of 𝑁 interacting
molecules at a specific volume and temperature. We imagine additionally that
the system as a whole can be found in a set of total energy states E_i. These
system energy states can adjust to any intermolecular interactions that exist.
We then replicate this system multiple
times and consider the behaviour of this collection of replications. This
imaginary collection is called the canonical ensemble and allows us to define
this partitioning of the molecules among the available system energy states as
the canonical partition function 𝑄.
In the same way that 𝑞 allows the calculation of mean molecular energy, so 𝑄
can be used to calculate the mean energy of an entire system composed of
molecules that may or may not be interacting with each other. Thus:

Q = \sum_i \mathrm{e}^{-\beta E_i}

where the sum here is over all members of the ensemble. Q is more versatile
than 𝑞 because molecular interactions are already ‘baked in’ to the definition
and so it can be applied to condensed phases and real gases. Thus, the
energy of an ensemble of interacting molecules is given by:
U = U(0) - \left(\frac{\partial \ln Q}{\partial\beta}\right)_V        Eq. 3.9
Note that by comparison with Eq 3.7 A the scaling factor of 𝑁 is not required in
Eq 3.9, as 𝑄 already considers the partitioning of the 𝑁 molecules as a whole
and not as the product of 𝑁 separate molecular contributions.
3.4. Relationship between canonical and molecular partition functions
For N independent, distinguishable molecules the canonical partition function is simply the product of the molecular partition functions:

Q = q^N        Eq. 3.10

For N indistinguishable molecules (for example identical molecules in a gas), the overcounting of identical configurations is corrected by dividing by N!:

Q = \frac{q^N}{N!}        Eq. 3.11

Substituting Eq. 3.11 into Eq. 3.9 (and noting that ln N! does not depend on β) gives:

U - U(0) = -\left(\frac{\partial \ln(q^N/N!)}{\partial\beta}\right)_V = -\left(\frac{\partial(\ln q^N - \ln N!)}{\partial\beta}\right)_V = -\left(\frac{\partial(N \ln q)}{\partial\beta}\right)_V = -N\left(\frac{\partial \ln q}{\partial\beta}\right)_V
This is clearly the same result as Eq 3.7 A: the mean energy of a gas
composed of N indistinguishable molecules is N times the mean energy of
a single molecule.
However, in a later Topic when we discuss entropy in detail you will see that it
is important in that case to identify whether the molecules are distinguishable
or not and hence whether Eq 3.10 or 3.11 should be used.
3.5. Some important thermodynamic functions
As mentioned above, important thermodynamic quantities can be expressed
as a function of 𝑞 or 𝑄. I’ve listed these below (in terms of 𝑄) and we’ll return
to their application later:
S = \frac{U - U(0)}{T} + k\ln Q        Eq. 3.12

p = kT\left(\frac{\partial \ln Q}{\partial V}\right)_T        Eq. 3.14

H(T) = H(0) - \left(\frac{\partial \ln Q}{\partial\beta}\right)_V + kTV\left(\frac{\partial \ln Q}{\partial V}\right)_T        Eq. 3.15

G(T) = G(0) - kT\ln Q + kTV\left(\frac{\partial \ln Q}{\partial V}\right)_T        Eq. 3.16
Topic 4: Translational partition function
Replacing the sum over the closely spaced particle-in-a-box levels by an integral gives, for translation along one dimension of length X:

q_X^T = \left(\frac{2\pi m}{h^2\beta}\right)^{1/2} X        Eq. 4.1

The three dimensions of translation are independent, so their contributions multiply, and with XYZ = V:

q^T = \left(\frac{2\pi m}{h^2\beta}\right)^{3/2} V        Eq. 4.2
4.2. Interpretation of q^T
The expression for q^T in Eq. 4.2 clearly shows that the partition function
increases with both the volume of the container and the mass of the molecule.
This is because an increase in either factor makes the separation between
translational energy levels smaller and hence makes more of them
thermally accessible.
Clearly also, as β = 1/kT, an increase in temperature results in an increased
value of the partition function because more energy states become accessible.
Looking back to Topic 2 (section 2.3) where we considered contributions to the
molecular partition function, it was stated that for H35Cl at 298 K, q^T ~ 10^32
when considering a 1 m³ volume container. You can now see where this
number came from: it was calculated using Eq. 4.2 above (as shown in
Worked Problems 4). Such large values of 𝑞 𝑇 are very typical for even light
molecules in small volumes at room temperature.
Eq 4.2 can be rewritten as:
q^T = \frac{V}{\Lambda^3} \quad\text{where}\quad \Lambda = \frac{h}{(2\pi m k T)^{1/2}}        Eq. 4.3

Λ has dimensions of length and is known as the thermal wavelength of the molecule.
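As a quick numerical sketch of Eq. 4.3 (the full calculation is in Worked Problems 4), the snippet below evaluates Λ and q^T for ¹H³⁵Cl at 298 K in a 1 m³ container; the molecular mass (~36 u) is an assumed round value.

```python
import numpy as np

# Thermal wavelength and translational partition function for HCl at 298 K.
h = 6.62607015e-34        # J s
k_B = 1.380649e-23        # J K^-1
m = 35.98 * 1.66054e-27   # kg (assumed mass of 1H35Cl)
T = 298.0
V = 1.0                   # m^3

Lambda = h / np.sqrt(2 * np.pi * m * k_B * T)    # thermal wavelength / m
q_T = V / Lambda**3                              # Eq. 4.3

print(f"Lambda ~ {Lambda:.3e} m")                # of order 1e-11 m
print(f"q_T    ~ {q_T:.3e}")                     # of order 1e32, as quoted above
```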
4.3. Mean molecular and internal energy of monatomic perfect gas
Starting from Eq. 3.5, assuming the ε_gs term can be neglected, substitution of
Eq. 4.2 gives:

\langle\varepsilon\rangle = \frac{3}{2}kT
The proof of this result is covered in the tutorial questions for this topic.
Further to this Eq. 3.9 and Eq. 3.11 can be used to determine that:
U - U(0) = \frac{3}{2}RT \quad \text{(per mole of gas)}
4.4. Pressure and Gibbs energy of a monatomic perfect gas
Applying Eq. 3.14 with Q = q^N/N! and q = V/Λ³ (Eq. 4.3), we have ln Q = N ln V − 3N ln Λ − ln N!, so that (∂ln Q/∂V)_T = N/V and hence:

p = \frac{NkT}{V}, \quad \text{i.e.} \quad pV = NkT = nRT

This shows very beautifully how statistical mechanics can be related to, and
be consistent with, laws of state such as the perfect gas law.
Eq. 3.16 gave an expression for 𝐺:
G(T) = G(0) - kT\ln Q + kTV\left(\frac{\partial \ln Q}{\partial V}\right)_T
The final term is clearly pV (compare with Eq. 3.14), so we can substitute in nRT
from the result above:

G(T) = G(0) - kT\ln Q + nRT
Here, we are considering a gas, where molecules (atoms in this case) are
indistinguishable. We therefore must substitute in Eq 3.11 and if we use
Stirling’s approximation that ln𝑁! = 𝑁ln𝑁 − 𝑁 then we obtain:
G(T) = G(0) - nRT\ln\frac{q}{N}
This proof is covered in more detail in Worked Problems 4.
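As an illustration only (argon at 298 K and 1 bar is an assumed example; none of these numbers appear in the notes), the result above can be evaluated for the translational states of a monatomic perfect gas.

```python
import numpy as np

# G(T) - G(0) per mole from G = G(0) - nRT ln(q/N), using the translational
# partition function of Eq. 4.3 for one mole of argon at 298 K and 1 bar.
h = 6.62607015e-34
k_B = 1.380649e-23
N_A = 6.02214076e23
R = 8.314462618           # J K^-1 mol^-1
T = 298.0
p = 1.0e5                 # Pa
m = 39.95 * 1.66054e-27   # kg (argon, assumed example)

V_m = R * T / p                                   # molar volume of a perfect gas
Lambda = h / np.sqrt(2 * np.pi * m * k_B * T)     # thermal wavelength (Eq. 4.3)
q_T = V_m / Lambda**3                             # translational partition function
G_minus_G0 = -R * T * np.log(q_T / N_A)           # per-mole form of the result above

print(f"q_T/N ~ {q_T / N_A:.3e} thermally accessible states per atom")
print(f"G(T) - G(0) ~ {G_minus_G0 / 1000:.1f} kJ mol^-1")
```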
This result allows us to understand the Gibbs energy on a statistical level.
Because q represents the number of thermally accessible states and N is the
number of molecules, q/N is the number of thermally accessible states available
per molecule, and the difference G(T) − G(0) is proportional to its logarithm (ln).
As q/N increases, the ln term becomes larger and hence G(T) − G(0) becomes
more negative. Thus, we can now see that the thermodynamic tendency to lower
the Gibbs energy is driven by the tendency to maximise the number of thermally
accessible states.
Mid-point summary
Hopefully now the relationship between population of molecular energy states
and bulk thermodynamic properties is clear to you as well as the importance of
the partition function in linking the microscopic and macroscopic worlds.
In the lecture material so far, I have explicitly discussed the translational
partition function. This is because at reasonable temperatures translational
motion contributes significantly more to the mean molecular energy of a gas
than vibration and rotation.
The vibrational partition function has also been covered indirectly in the
worked problems and the tutorial questions encourage you to link the relevant
worked problems to vibrational energy levels more explicitly.
In part 2 of this course we will cover the rotational partition function in more
detail, along with other examples, including the electronic partition function.
We will also spend some time considering entropy and the partitioning of
molecules among states from a statistical viewpoint.