PHAS0024 v1

Revision

Thursday, March 28, 2024 18:35

Laws of Thermodynamics

Zeroth Law: If two systems are in thermal equilibrium with a third
system, then they are in equilibrium with each other.
An intensive state function "temperature" may be defined to decide
whether two systems are in equilibrium.

First Law: The total energy of an isolated system remains constant.
There is a system property "energy" which can take different forms and
is conserved if the system is isolated: dU = δQ + δW, for dU the change
in energy of the system, δQ the incoming heat, and δW the work done on
the system.

Second Law: Heat cannot flow spontaneously from a colder to a warmer
body (Clausius statement).
There is an extensive state function "entropy" which, for an isolated
system, either grows or is conserved: dS ≥ 0. When the inequality
reduces to an equality, the transformation is said to be reversible.

Third Law: It is impossible to reduce the absolute temperature to zero
in a finite number of steps (Nernst's unattainability principle).
The entropy of a system at zero temperature is an absolute constant
that does not depend on any other thermodynamic variable.

State variables (extensive, intensive): State variables summarise some
property of a system with many particles, producing tractable
calculations, especially when systems become large.
- Extensive variables scale with the amount of material in the system
  (e.g., energy U, volume V, or number of constituents N)
- Intensive variables remain the same for different amounts of material
  in the system (e.g., temperature T, pressure p)

Process variables: variables that describe changes to the system, but
do not specify its state in equilibrium (e.g., heat Q, work W).

Lecture 1: The Four Laws


Entropy derived as a state function: dS = δQ_q / T,
using:
- the first law, dU = δQ + δW,
- the ideal gas equation of state, pV = Nk_BT, and
- the energy of an ideal gas: U = (3/2) Nk_BT

(q signifying quasi-static change, a process such that each intermediate
state approaches an equilibrium state)

The Clausius Statement of the 2nd Law - Spontaneous heat flow between two
systems
For two systems in equilibrium at temperatures T_1 and T_2, consider
spontaneous heat flow δQ from system 1 to system 2. As both are in
equilibrium, the quasistatic expression holds:

dS = dS_1 + dS_2 = δQ (1/T_2 - 1/T_1)

By the Clausius statement, spontaneous heat flow is only possible if
T_1 > T_2, and therefore dS ≥ 0.

The 2nd Law, the Arrow of Time and Equilibrium

From the 2nd law above, we see that time-reversal symmetry is broken. This is not trivial,
considering all underlying mechanical laws are time-symmetric.

Due to this, we can also see that equilibrium will be reached by increasing entropy until it
is maximised.

Quasistatic approximation
The quasistatic approximation assumes a constant temperature T, even
with nonzero heat transfer Q. This holds better with a small Q or a
large system (with a large heat capacity).
As such, for large reservoirs/environments, we take the approximation
ΔS = Q/T. Note that the same isn't always applied to small systems.

The Kelvin Statement of the 2nd Law - Heat engine between two systems
Consider two systems 1 and 2 (with temperatures T_1 > T_2), connected by
a heat engine that takes in heat Q_1 from system 1, rejects heat Q_2 to
system 2, and outputs work W = Q_1 - Q_2.
Proceeding as per the Clausius statement, the total entropy change must
satisfy ΔS ≥ 0, which bounds how little heat may be rejected.

Defining the engine efficiency:

η = W/Q_1 = 1 - Q_2/Q_1 ≤ 1 - T_2/T_1

(Note that 0 ≤ η < 1)

η → 1, if T_2 → 0

By the Kelvin statement, "No device operating on a cycle may produce
work from a single thermal reservoir".

Note that the efficiency approaches 1 if the second reservoir has a
temperature approaching zero. This leads well into the third law - also
known as Nernst's unattainability principle.
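The efficiency bound above follows from the same entropy bookkeeping as the Clausius argument; a worked sketch in standard notation (Q_1 drawn from the hot reservoir, Q_2 rejected to the cold one):

```latex
% First law over one cycle: W = Q_1 - Q_2.
% Second law for the two reservoirs (the engine itself returns to its
% initial state each cycle, so its own entropy change is zero):
\Delta S = -\frac{Q_1}{T_1} + \frac{Q_2}{T_2} \ge 0
\;\Rightarrow\; \frac{Q_2}{Q_1} \ge \frac{T_2}{T_1},
\qquad
\eta = \frac{W}{Q_1} = 1 - \frac{Q_2}{Q_1} \le 1 - \frac{T_2}{T_1}.
```

Equality holds only for a reversible (Carnot) cycle, which is why the Carnot efficiency is an upper bound.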

Nernst's Unattainability Principle: the 3rd law

Considering a series of isothermal compressions and adiabatic expansions
on an S-T diagram, alternating between two volumes, we see that Nernst's
unattainability principle only holds true when both isovolumetric curves
converge to a single value as T → 0. We will by convention set this
value to S = 0.

Note that for an ideal gas, S → -∞ when T → 0 (the ideal gas
approximation breaks down at low temperature).

Lecture 2: Free Energy and Entropy

Fundamental relation of thermodynamics:
The fundamental relation is motivated by searching for a "potential"
that can be minimised to find the equilibrium state.

Though dU = δQ + δW is a good starting point, the RHS is currently
expressed in terms of process variables instead of state variables.

From δW = -p dV and the quasistatic relation δQ = T dS:

dU = T dS - p dV

For energy as a function of entropy, volume, and number of particles,
U(S, V, N), by the definition of the total derivative:

dU = (∂U/∂S) dS + (∂U/∂V) dV + (∂U/∂N) dN = T dS - p dV + μ dN

Where μ is the chemical potential, the change in energy of the system
upon adding another particle.

Note that in this case, T and S, p and V, and μ and N are conjugate
thermodynamic variables.

Equilibrium via minimisation of energy (SVN)

Examining the behaviour of total entropy when the system's energy is
minimised, subject to fixed S, V, N: any energy released by the system
flows into the reservoir as heat, so

dS_total = dS_res = -dU_sys / T ≥ 0 as dU_sys ≤ 0

Minimising the system energy therefore maximises the total entropy.

Equilibrium via minimisation of Helmholtz free energy (TVN)

Considering the quantity:

F = U - TS

Where we removed the S-dependence by subtracting it multiplied by its
conjugate T.

This new quantity is called the Helmholtz free energy.

Looking at the behaviour of total entropy when minimising F, subject to
fixed T, V, N:

dS_total = -dF_sys / T ≥ 0 as dF_sys ≤ 0

Again, total entropy is maximised.

Legendre Transformations
The dependence of a function on a variable can be modified:
The Legendre transform of a function f(x, y) with respect to the
variable x is given by:

g = f - x (∂f/∂x)

Where (∂f/∂x) is the conjugate of x.
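As a concrete instance of the transform, the Helmholtz free energy introduced above is the Legendre transform of U(S, V, N) with respect to S:

```latex
% The conjugate of S is T = (\partial U/\partial S)_{V,N}, so
F = U - S\left(\frac{\partial U}{\partial S}\right)_{V,N} = U - TS,
\qquad
dF = dU - T\,dS - S\,dT = -S\,dT - p\,dV + \mu\,dN.
```

So F is naturally a function of (T, V, N), as required for the TVN ensemble.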

Thermodynamic potentials:
Potential Name         Definition       System
Energy                 U                Is isolated
Helmholtz free energy  F = U - TS       Can exchange heat to keep T constant
Enthalpy               H = U + pV       Can exchange volume to keep p constant
Gibbs free energy      G = U - TS + pV  Can exchange both heat and volume
Grand potential        Φ = U - TS - μN  Can exchange heat and particles

The remaining 3 potentials are not defined as they have no physically
meaningful application. For instance, the potential U - μN would signify
a system that can exchange particles N, but not heat Q.

Ideal gas subject to gravity

We modify the ideal gas situation by introducing a gravitational
potential energy term mgh per molecule at height h:

U = (3/2) N k_B T + ∫ mgh n(h) dV

(where n is the number density)

Assuming constant T:

To find the variation of number density with height, we consider the
pressure exerted by a layer of air between h and h + dh over an area A:

dp = -mg n(h) dh, and with p = n k_B T:

n(h) = n(0) e^(-mgh / k_B T)

(i.e. the number density decays exponentially with height)

For two boxes at different heights h_1 and h_2, assume T_1 = T_2 and
V_1 = V_2.

The probability that a molecule is in box i is proportional to
e^(-mgh_i / k_B T).

Dividing the room up into a large number of equally spaced lattice
points, each with a volume of ΔV, we can now generally write the
probability as:

p_i = e^(-mgh_i / k_B T) / Σ_j e^(-mgh_j / k_B T)

Significance:
- the molecules do not all gather in a small volume, but instead spread
  over the whole room, maximising entropy.
- The constant temperature restriction means that a maximisation of
  entropy corresponds to a minimisation of the Helmholtz free energy.
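The exponential profile above can be checked numerically; a minimal sketch of the barometric law, where the molecular mass and temperature are illustrative values, not figures from the notes:

```python
import math

def number_density(h, n0=1.0, m=4.8e-26, g=9.81, kB=1.38e-23, T=300.0):
    """Barometric law: n(h) = n0 * exp(-m*g*h / (kB*T))."""
    return n0 * math.exp(-m * g * h / (kB * T))

# The density falls to 1/e of its ground-level value at the scale
# height kB*T/(m*g) (roughly 8.8 km for these illustrative numbers).
scale_height = 1.38e-23 * 300.0 / (4.8e-26 * 9.81)
ratio = number_density(scale_height) / number_density(0.0)
```

With m ≈ 4.8e-26 kg (roughly a nitrogen molecule) the scale height comes out near 8.8 km, consistent with the exponential decay stated above.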

Lecture 3: Core Concepts of Statistical Physics

Due to the extent of systems in statistical physics, we must discard the
details of microscopic motion, and instead focus on the average
behaviour.
However, this requires us to understand the statistical properties on
the microscale to then draw conclusions at the macroscale.

Basics of Probability

We can demonstrate many relevant concepts using the simple example of a die. (Taking a
six-sided die, D6 for the examples):

- Trial: each cast of the die is a trial
- Outcome: each possible result of a trial (1, 2, 3, 4, 5, 6) is an
  outcome
- Weight: the probability of each outcome is its associated weight.
  Note that:
  ○ All probabilities must be positive
  ○ All outcomes must have probabilities which sum to 1
  ○ Outcomes related by symmetry must have the same probability
  ○ For mutually exclusive outcomes:
    ▪ the probability for the union of outcomes is equal to the sum of
      their probabilities
  ○ For independent trials:
    ▪ the probability for joint outcomes in separate trials is the
      product of the probabilities for each individual trial.
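The sum and product rules can be made concrete with the D6 example; a small sketch using exact rational weights:

```python
from fractions import Fraction

# A fair six-sided die: each outcome carries weight 1/6.
weights = {face: Fraction(1, 6) for face in range(1, 7)}

# All outcomes' probabilities sum to 1.
total = sum(weights.values())

# Union of mutually exclusive outcomes: P(even) = P(2) + P(4) + P(6).
p_even = weights[2] + weights[4] + weights[6]

# Joint outcome of two independent trials: P(two sixes) = P(6) * P(6).
p_double_six = weights[6] * weights[6]
```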

Microstates and Macrostates

For a pair of dice, we can introduce a few more ideas:


- A microstate is a description of a system where the values of dynamic
  variables are exactly specified (an outcome)
  ○ E.g., all positions and momenta of each particle in a gas,
  ○ all possible combinations of 1 to 6 on a pair of dice
- A macrostate is a set of microstates that share the same property (an
  event)
  ○ E.g., microstates that have the same energy or pressure
  ○ dice throws that add up to even numbers

Note that microstates all have equal probability, while this is not
necessarily the case for macrostates. This can be quantified with the
concept of multiplicity Ω:
- A macrostate has a multiplicity of Ω if Ω microstates belong under
  that macrostate.

Macrostates are generally described by a region or surface, defined by
an equation of state, within the phase space.

Other examples of micro/macrostates:


- 1D SHO
○ Microstates: any configuration on the displacement-momentum phase space
○ Macrostates: microstates that share the same kinetic energy
- A group of particles with spin in a uniform magnetic field
○ Microstates: any configuration of all of the spins specified
○ Macrostates: microstates that share the same potential energy

Boltzmann definition of Entropy

As an extensive variable, entropy should scale proportionally with the
extent of the system. However, the number of microstates increases
exponentially with the extent of the system.
As such, one must take a logarithm of the number of microstates to
obtain an extensive variable.
Boltzmann defines entropy with pre-factor k_B and base e (natural log):

S = k_B ln Ω
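The extensivity argument can be illustrated directly: multiplicities of independent subsystems multiply, so their Boltzmann entropies add. A sketch with k_B set to 1:

```python
import math

def boltzmann_entropy(multiplicity, kB=1.0):
    """S = kB * ln(Omega), with kB = 1 for illustration."""
    return kB * math.log(multiplicity)

omega_a, omega_b = 10**6, 10**9
# Independent subsystems combine multiplicatively...
s_combined = boltzmann_entropy(omega_a * omega_b)
# ...so the logarithm turns the product into a sum of entropies.
s_additive = boltzmann_entropy(omega_a) + boltzmann_entropy(omega_b)
```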

Statistical Ensembles
We concern ourselves with three main statistical ensembles:
- Microcanonical ensemble
  ○ The system is completely isolated. That is, the energy and number
    of particles are conserved
  ○ Each microstate has an equal probability
- Canonical ensemble
  ○ Energy can be exchanged between the system and the reservoir
  ○ Each microstate (system + reservoir) is equally likely
  ○ The probability of a microstate of the system depends on the
    microstates available to the reservoir
- Grand canonical ensemble
  ○ Energy and particles can be exchanged between the system and the
    reservoir
  ○ Each microstate (system + reservoir) is equally likely
  ○ The probability of a microstate of the system depends on the
    microstates available to the reservoir

We see the three statistical ensembles share the second property. This is the principle of
equal a priori probability: that the probability of a microstate in a closed system is
equal to that of any other microstate.

However, there is a dependence between the microstates of the system and the reservoir
for the canonical and grand canonical ensembles. We will see how these can potentially be
quantified.

Probability of macrostate in canonical ensemble

Define the system's energy to be E and the reservoir's energy to be
E_tot - E; the probability that the system has energy E will then be
proportional to the product of multiplicities:

P(E) ∝ Ω_S(E) Ω_R(E_tot - E)

Assuming that the reservoir is much larger than the system, we can
consider ln Ω_R(E_tot - E) as a Taylor expansion about E_tot with
deviation -E:

ln Ω_R(E_tot - E) ≈ ln Ω_R(E_tot) - βE

Where we've defined β = ∂ ln Ω_R / ∂E

As the prefactor Ω_R(E_tot) is a constant, we can also write:

P(E) ∝ Ω_S(E) e^(-βE)

Per the properties of probability, we know that they must sum up to 1.
We can thus divide by a factor Z to normalise this probability:

P(E) = Ω_S(E) e^(-βE) / Z

Solving for Z, we get Z = Σ_E Ω_S(E) e^(-βE), which we term the
canonical partition function.

To determine β's significance, we can recall S = k_B ln Ω, giving
β = 1 / (k_B T).

Due to the prevalence of this quantity in thermodynamics/statistical
mechanics, it is also called the inverse temperature or thermodynamic
beta.


Probability of macrostate in grand canonical ensemble

For a grand canonical ensemble, we employ a multivariate Taylor
expansion of ln Ω_R about (E_tot, N_tot) with deviations (-E, -N).

We note:

β = ∂ ln Ω_R / ∂E,   -βμ = ∂ ln Ω_R / ∂N

By a similar path of reasoning as above, we therefore see the
probability to be proportional to:

P(E, N) ∝ Ω_S(E, N) e^(-β(E - μN))

Resulting in a new grand canonical partition function:

Ξ = Σ_{E,N} Ω_S(E, N) e^(-β(E - μN))
Table of Comparison: Statistical Ensembles

            Microcanonical  Canonical        Grand canonical
Constants   U, V, N         T, V, N          T, V, μ
Variables   -               E                E, N
Potential   S = k_B ln Ω    F = -k_B T ln Z  Φ = -k_B T ln Ξ

Lecture 4: Statistical Physics in Action

Microcanonical ensemble and microstate enumeration

For m bins and N distinguishable objects, the multiplicity is:

Ω = m^N

If the objects are indistinguishable:

Ω = (N + m - 1)! / (N! (m - 1)!)

To distribute identical objects across boxes, we can also think of the
situation as distributing the m - 1 box boundaries across the gaps
between the objects ("stars and bars").

For a small number of boxes, we can illustrate a phase space to
calculate the multiplicity of each macrostate.

The expected value of a variable is the weighted average of that
variable. We can use this to express the mean and standard deviation:

⟨x⟩ = Σ_i x_i p_i,   σ² = ⟨x²⟩ - ⟨x⟩²

Common summation facts to use include the geometric series and its
derivatives.

Stirling's Approximation:

ln N! ≈ N ln N - N

Using this, we can approximate our partition function.
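Stirling's approximation can be sanity-checked against the exact log-factorial (via `math.lgamma`); the relative error shrinks as N grows, which is why it is safe for thermodynamic particle numbers:

```python
import math

def ln_factorial(n):
    """Exact ln(n!) via the log-gamma function: lgamma(n+1) = ln(n!)."""
    return math.lgamma(n + 1)

def stirling(n):
    """Stirling's approximation: ln(n!) ~ n ln n - n."""
    return n * math.log(n) - n

rel_err_small = abs(stirling(10) - ln_factorial(10)) / ln_factorial(10)
rel_err_large = abs(stirling(10_000) - ln_factorial(10_000)) / ln_factorial(10_000)
```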

Energy - partition function relation

Observe that:

∂Z/∂β = Σ_E (-E) Ω(E) e^(-βE) = -Z ⟨E⟩

We therefore conclude:

⟨E⟩ = -∂ ln Z / ∂β

Equipartition of Energy
The equipartition theorem states that each quadratic term cx² in the
energy, or each such degree of freedom, contributes a term (1/2) k_B T
to the total average canonical energy at temperature T.
Here c is a constant and x is an independent phase space variable.

Canonical Ensemble and the Quantum Oscillator

Examining the canonical ensemble, we turn to a non-isolated system: a
quantum harmonic oscillator exchanging energy with the reservoir.

We can find system properties by the summation:

Z = Σ_n Ω(E_n) e^(-βE_n)

Given that each energy has a multiplicity of 1, we can write:

Z = Σ_n e^(-βE_n)

Given the energies of a QHO, E_n = ℏω(n + 1/2):

Z = e^(-βℏω/2) Σ_n (e^(-βℏω))^n = e^(-βℏω/2) / (1 - e^(-βℏω))

With e^(-βℏω) < 1, the geometric series converges and can be summed.

From the partition function, we then compute the expected energy:

⟨E⟩ = -∂ ln Z / ∂β = ℏω/2 + ℏω / (e^(βℏω) - 1)

Observing the asymptotic behaviour:
- ⟨E⟩ → ℏω/2 as T → 0
- ⟨E⟩ → k_B T as T → ∞

We may also find the heat capacity of a quantum harmonic oscillator:

C = ∂⟨E⟩/∂T = k_B (βℏω)² e^(βℏω) / (e^(βℏω) - 1)²

Considering atomic magnetic dipoles as paramagnets in a magnetic field
B, with energies ±μB, we can consider the mean magnetisation:

⟨m⟩ = μ tanh(βμB)

Asymptotic behaviour:
- ⟨m⟩ → ±μ as T → 0, depending on the sign of B
- ⟨m⟩ → 0 as T → ∞
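The two limits of the oscillator's mean energy can be verified numerically; a sketch in units where ℏω = k_B = 1:

```python
import math

def qho_mean_energy(T):
    """<E> = 1/2 + 1/(exp(1/T) - 1), in units hbar*omega = kB = 1."""
    return 0.5 + 1.0 / math.expm1(1.0 / T)

# Low temperature: the oscillator freezes into its zero-point energy 1/2.
low_T = qho_mean_energy(0.01)
# High temperature: equipartition-like behaviour, <E> -> kB*T.
high_T = qho_mean_energy(100.0)
```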

Lecture 5: Statistical Mechanics in Action (II) & Entropy

Maxwell Boltzmann Distribution

As relevant in deriving the expected kinetic energy, we will need to
calculate the expected squared speed.

As such, we will need the probability distribution function for speed,
which is termed the Maxwell Boltzmann distribution.

Consider the speed squared as a sum of the three separate components
squared:

v² = v_x² + v_y² + v_z²

Noting that all molecules with the same speed sit at equal radius from
the origin of velocity space.

From the probability density function of velocity:

f(v_x, v_y, v_z) ∝ e^(-mv² / 2k_B T)

Transferring into spherical polar coordinates and, from spherical
symmetry, integrating over the two angular variables yields:

f(v) = 4πv² (m / 2πk_B T)^(3/2) e^(-mv² / 2k_B T)

Changing variables and integrating by parts gives the expected squared
speed:

⟨v²⟩ = 3 k_B T / m

Vacancies in Crystals

In crystals, atoms can have energy -ε or 0, depending on whether they
are bound to a lattice site.

Assuming that:
- each vacancy is independent of other vacancies,
- atoms can hop between lattice sites, and
- multiplicities of each state are 1,
this results in a grand partition function for each site.

From the grand partition function, we obtain the expected occupancy.

Thermodynamic Limit

With greater energy, more microstates should be available, resulting in
an increase of multiplicity with energy.

For large E: Ω grows rapidly with E.

For large N: Ω typically scales as a high power of E, roughly E^(cN).

Given that Ω_R(E_tot - E) decreases with E while Ω_S(E) increases,
P(E) ∝ Ω_S(E) Ω_R(E_tot - E) has a maximum at some energy Ẽ.

Given the change of variable E = Ẽ + ε, we find an expression for ln P
about fixed Ẽ by taking the 2nd order Taylor expansion:

ln P(Ẽ + ε) ≈ ln P(Ẽ) + (1/2) (∂² ln P / ∂E²) ε²

(the first-order term vanishes at the maximum). From previously:
β = ∂ ln Ω / ∂E.

The first factor is constant with respect to ε, and the result is a
Gaussian function around Ẽ, characterised by the standard deviation

σ = √(k_B T² C_V)

Given ⟨E⟩ ∝ N and C_V ∝ N, the relative width is

σ / ⟨E⟩ ∝ 1/√N

For N ~ 10²³, the distribution is so sharply peaked that the energy is
effectively fixed: this is the thermodynamic limit.

Gibbs Entropy

We define a new entropy called the Gibbs entropy, defined in terms of
the probability p_i of microstate i instead of the multiplicity:

S = -k_B Σ_i p_i ln p_i

We see that in a closed system, the probability of any microstate
(given the principle of equal a priori probability) is p_i = 1/Ω:

S = -k_B Σ_i (1/Ω) ln(1/Ω) = k_B ln Ω

And this reduces back to Boltzmann entropy.

If the system is fully defined (i.e. p_j = 1 for some j), then S = 0.

We see that S is therefore a measure of how much we do not know. As
such, it is also sometimes termed uncertainty.

Gibbs entropy of a canonical distribution:

S = -k_B Σ_i p_i (-βE_i - ln Z) = ⟨E⟩/T + k_B ln Z

Helmholtz free energy in terms of the canonical partition function:

F = ⟨E⟩ - TS = -k_B T ln Z

Gibbs entropy and grand potential in terms of the grand canonical
distribution:

Φ = -k_B T ln Ξ

Information entropy - the average amount of information needed to
represent an event drawn from a probability distribution for a random
variable.

Consider a probability distribution {p_i}, where p_i refers to the
probability that an unspecified symbol corresponds to symbol i.

If N is the length of the message, each symbol would appear N p_i
times. The number of distinct arrangements of the symbols is:

Ω = N! / Π_i (N p_i)!

For a message of N symbols constructed of bits (0s and 1s), the total
number of possible messages is 2^(Nb), where b is the number of bits
per symbol.

To find b, we set 2^(Nb) = Ω, take the natural logarithm and employ
Stirling's approximation to yield:

b = -Σ_i p_i log₂ p_i

In general for bit encoding, this quantity is the minimum number of
bits per symbol needed to convey a message of symbols with
probabilities of occurrence p_i without loss of information, also
called the Shannon entropy.

The quantity -log₂ p_i is termed the surprisal of symbol i; the
Shannon entropy is its average over the distribution.
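Shannon entropy is easy to compute directly; a sketch in bits (base-2 logarithm), showing the fair-coin, biased, and fully-determined cases:

```python
import math

def shannon_entropy(probs):
    """H = -sum_i p_i log2(p_i): minimum average bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_coin = shannon_entropy([0.5, 0.5])   # maximal surprise: 1 bit
biased = shannon_entropy([0.9, 0.1])      # less surprise: under 1 bit
certain = shannon_entropy([1.0])          # fully determined: 0 bits
```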

Lecture 6: Quantum Gases

Energy of a quantum gas
Given the TISE for a particle in a box with side lengths L:

E = (ℏ²/2m)(k_x² + k_y² + k_z²)

As the equation is separable, we yield solutions:

ψ ∝ sin(k_x x) sin(k_y y) sin(k_z z),   k_i = n_i π / L

We can now consider the multiplicities by moving from the quantum
number space (n_x, n_y, n_z) into the wavevector space (k_x, k_y, k_z).

To consider how the two scale, the interval between two consecutive
values of k is π/L. Taking spin into account, each lattice point in
k-space has 2s + 1 distinct quantum states.

We can see the canonical partition function as the sum of Boltzmann
factors over all available microstates. For a single particle:

Z₁ = Σ_k e^(-βE(k))

The microstates are grouped depending on the magnitude of the
wavevector. We approximate the sum over all states via an integral:

Z₁ = ∫ g(k) e^(-βE(k)) dk = ∫ g(E) e^(-βE) dE

Where g(k) and g(E) are densities of states, allowing us to express
g(k) dk and g(E) dE as the multiplicity of microstates in the ranges
[k, k + dk] and [E, E + dE].

As we're only counting positive wavevectors, we can determine the
density of states by calculating the microstate multiplicity in a thin
shell of the first octant:

g(k) dk = (2s + 1) (1/8) 4πk² dk / (π/L)³ = (2s + 1) V k² / (2π²) dk

To obtain the density of states in terms of energy, we use the relation
E = ℏ²k²/2m:

g(E) = (2s + 1) (V/4π²) (2m/ℏ²)^(3/2) √E

With this, we can now compute the partition function, making the
substitution x = βE:

Z₁ = V / λ³

Where λ = h / √(2πm k_B T), the thermal de Broglie wavelength, is the
characteristic wavelength at which gases will display quantum effects
when the density becomes comparable to the quantum density 1/λ³.
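The thermal de Broglie wavelength can be evaluated numerically; a sketch for helium-4, where the mass and temperatures are illustrative values chosen here, not figures from the notes:

```python
import math

h = 6.626e-34    # Planck constant, J s
kB = 1.381e-23   # Boltzmann constant, J/K

def thermal_wavelength(m, T):
    """lambda = h / sqrt(2 * pi * m * kB * T)."""
    return h / math.sqrt(2.0 * math.pi * m * kB * T)

m_he = 6.65e-27                              # helium-4 mass, kg
lam_room = thermal_wavelength(m_he, 300.0)   # sub-angstrom: classical
lam_cold = thermal_wavelength(m_he, 1.0)     # grows as T**(-1/2)
```

At room temperature λ is far below the typical interparticle spacing, so helium behaves classically; cooling stretches λ toward the quantum regime.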

Bosons and Fermions

Upon exchange of two identical particles, a quantum state changes at
most by a global phase. Upon exchanging and swapping back, we must
recover the original wavefunction, so the phase factor squares to 1.

If ψ(2,1) = +ψ(1,2), the wavefunction is symmetric on exchange of two
particles.
If ψ(2,1) = -ψ(1,2), the wavefunction is antisymmetric on exchange of
two particles.

The spin-statistics theorem is related to this, stating that:

The wavefunction of particles:
- with integer spin ("bosons") is symmetric with respect to particle
  exchange,
- with half-integer spin ("fermions") is anti-symmetric with respect
  to particle exchange.

From this, we can also make the conclusion that:
- a single quantum state can be occupied by an arbitrary number of
  bosons (n = 0, 1, 2, ...), but
- there can be no more than one fermion per quantum state, also known
  as the Pauli exclusion principle (n = 0 or 1).

The Grand Canonical Partition Function of a Single Energy Level

For energy ε allocated to a single state occupied by n particles, this
gives total energy nε:

Ξ_ε = Σ_n e^(-βn(ε - μ))

For bosons (n = 0, 1, 2, ...), the geometric series sums to:

Ξ_ε = 1 / (1 - e^(-β(ε - μ)))

Finding the mean number of bosons at one energy level:

⟨n⟩ = 1 / (e^(β(ε - μ)) - 1)

Asymptotic behaviour:
- Note that the energy is strictly ε > μ (else the series diverges)
- ⟨n⟩ → ∞ as ε → μ from above
- ⟨n⟩ → e^(-β(ε - μ)) as ε → ∞

For fermions (n = 0, 1):

Ξ_ε = 1 + e^(-β(ε - μ)),   ⟨n⟩ = 1 / (e^(β(ε - μ)) + 1)

Asymptotic behaviour:
- ⟨n⟩ → 1 as ε - μ → -∞
- ⟨n⟩ → 0 as ε - μ → +∞

In conclusion:

⟨n⟩ = 1 / (e^(β(ε - μ)) ∓ 1)

(- for bosons, + for fermions)

These phenomena result in condensation processes for both Fermi-Dirac
and Bose-Einstein gases at lower temperatures.
- Bose-Einstein condensate
  ○ Note that Bose-Einstein condensation will happen much more
    noticeably, as no Pauli exclusion principle restricts the maximum
    occupation number.
- Fermi degenerate gas
  ○ The highest occupied level is called the Fermi level, determined by
    calculating half the number of free electrons per unit
    volume/area/length. This is an intensive property.
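The two occupation functions are simple to evaluate; a sketch with β = 1 and μ = 0, showing the divergence for bosons as ε → μ⁺ and the Pauli bound for fermions:

```python
import math

def bose_einstein(eps, mu=0.0, beta=1.0):
    """<n> = 1 / (exp(beta*(eps - mu)) - 1); only valid for eps > mu."""
    return 1.0 / math.expm1(beta * (eps - mu))

def fermi_dirac(eps, mu=0.0, beta=1.0):
    """<n> = 1 / (exp(beta*(eps - mu)) + 1); never exceeds 1."""
    return 1.0 / (math.exp(beta * (eps - mu)) + 1.0)

n_bose_near_mu = bose_einstein(0.01)   # piles up without bound near mu
n_fermi_deep = fermi_dirac(-10.0)      # far below mu: essentially full
n_fermi_high = fermi_dirac(10.0)       # far above mu: essentially empty
```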

The Grand Canonical Partition Function of a Quantum Gas

Given the single-particle and multiple-particle single-level partition
functions, we now want to find the partition function for multiple
states occupied by multiple particles:

Ξ = Π_k Ξ_k

Note that multiple wavevectors may result in the same energy, which
means that they can be collected through exponentiation; the product
then iterates through the energies.

Taking the natural logarithm of this expression, in the limit that we
consider a decreasing spacing between energy levels:

ln Ξ = Σ_ε ln Ξ_ε → ∫ g(ε) ln Ξ_ε dε

Where g(ε) is once again the density of states.

We substitute in the single-level grand canonical functions of bosons
and fermions to conclude:

Bosons:   ln Ξ = -∫ g(ε) ln(1 - e^(-β(ε - μ))) dε
Fermions: ln Ξ = +∫ g(ε) ln(1 + e^(-β(ε - μ))) dε

From the grand canonical partition function above, as well as the
expression for the density of states:

⟨N⟩ = ∫ g(ε) ⟨n(ε)⟩ dε

Entropy of a quantum gas

From the Gibbs entropy for the grand partition function:

S = k_B ln Ξ + (⟨E⟩ - μ⟨N⟩) / T

Substituting known definitions yields the entropy in terms of the
occupation numbers.

Pressure of a quantum gas

From Φ = -k_B T ln Ξ = -pV, integrating ln Ξ by parts yields

pV = (2/3) ⟨E⟩

Lecture 7: Quantum Behaviour of Boson Gases

Defining a classical limit when the expected number per energy level
⟨n⟩ ≪ 1:

Conducting a first-order Taylor expansion, we see that the
single-energy grand canonical partition functions of bosons and
fermions both reduce to:

Ξ_ε ≈ 1 + e^(-β(ε - μ))

By another Taylor expansion (ln(1 + x) ≈ x), we find the multi-energy
grand canonical partition function:

ln Ξ ≈ ∫ g(ε) e^(-β(ε - μ)) dε = e^(βμ) Z₁

If this last integral looks familiar (you've been doing too much stat
mech), it's because it's the one-particle canonical partition function:

Z₁ = ∫ g(ε) e^(-βε) dε

Taking the Maclaurin expansion,

Ξ = exp(e^(βμ) Z₁) = Σ_N e^(βμN) Z₁^N / N!

We take another approach from the definition of the grand canonical
partition function, dividing the summation over two indices into two
separate summations, noting part of the expression as the canonical
partition function with respect to number of particles N:

Ξ = Σ_N e^(βμN) Z_N

Comparing the two, the N-particle partition function of
indistinguishable classical particles is Z_N = Z₁^N / N!.

Given our previous definition of entropy using expected values, we can
derive the Sackur-Tetrode equation for the entropy of a classical
monoatomic ideal gas:

S = N k_B [ln(V / (N λ³)) + 5/2]

Implications of this equation include:

- Entropy (as expected) scales proportionally with the size of the
  system, as per an extensive variable.
- The entropy of indistinguishable particles is reduced by the ln N!
  term relative to distinguishable ones.

From this, we are able to find the constant in our original derivation
of entropy as a state function.

And the influence of h in λ shows that quantum mechanics imposes limits
on the granularity of phase space, preventing the entropy from
approaching infinity.

Blackbody radiation

An ideal blackbody is an idealised object which absorbs all incident
light, and is in thermal equilibrium with its surroundings.

Recalling the relations between frequency, wavelength, wavenumber,
momentum and energy for photons:

ω = ck,   E = ℏω = pc

Applying these to calculate the density of states in terms of angular
frequency (and noting that photons have a degeneracy of 2 despite a
spin of 1):

g(ω) = V ω² / (π² c³)

Ultraviolet Catastrophe and necessity of the statistical mechanics
formulation

Calculating the energy density according to classical interpretations
(assigning k_B T to every mode), we encounter an issue - this tends
towards infinity as angular frequency increases. This contradiction
with experimental evidence is termed the ultraviolet catastrophe.

As such, we must consider the expected energy density using the
statistical mechanics formulation. Ignoring the zero point energy:

U = ∫ g(ω) ℏω / (e^(βℏω) - 1) dω

Applying the change of variables x = βℏω and using the standard
integral ∫₀^∞ x³/(eˣ - 1) dx = π⁴/15:

U/V = (π² k_B⁴ / 15 ℏ³ c³) T⁴, giving a radiated flux σT⁴

Where σ is the Stefan-Boltzmann constant.

Peak Frequency (Wien's Law)

To find the frequency of light with the highest "contribution" to the
energy (the peak frequency), we differentiate the integrand with
respect to frequency. Note that the closed-form solution requires use
of the Lambert W function.
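The peak condition can also be solved without the Lambert W function by simple fixed-point iteration; a sketch, using the condition x = 3(1 − e⁻ˣ) for x = ℏω/k_BT that follows from differentiating ω³/(e^(βℏω) − 1):

```python
import math

# Maximising u(w) ~ w**3 / (exp(x) - 1), with x = hbar*w/(kB*T),
# gives the stationarity condition x = 3 * (1 - exp(-x)).
x = 3.0                                  # initial guess
for _ in range(100):
    x = 3.0 * (1.0 - math.exp(-x))       # contraction: converges quickly

# x ~ 2.821, so hbar*w_peak = 2.821... * kB * T: the peak frequency
# scales linearly with temperature (Wien's displacement law).
```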

Setting the derivative of the integrand to zero yields the peak
frequency ℏω_peak ≈ 2.821 k_B T. We can also take the classical limit,
where βℏω ≪ 1, recovering the Rayleigh-Jeans form.

Summary tables:

Expected Values
Use Case                 Expression
Mean energy (canonical)  ⟨E⟩ = -∂ ln Z / ∂β

Partition Functions
Use Case                          Symbol  Expression
Single particle (canonical)       Z₁      Σ_k e^(-βE(k))
Grand canonical, single energy    Ξ_ε     Σ_n e^(-βn(ε - μ))
Grand canonical, multiple energy  Ξ       Π_ε Ξ_ε


Greenhouse effect

The effect of the atmosphere becomes apparent when we consider the Sun
and Earth as two ideal black bodies in thermal equilibrium.

From the Stefan-Boltzmann radiation law and the solid angle subtended
by the Earth / the inverse square law of radiation, we can calculate
the incident solar flux (incident power per unit area):

F = L_sun / (4π d²)

(Solar luminosity L_sun - total power output of the Sun; d - the
Earth-Sun distance)

Note that only the cross-section of the Earth will receive the incident
radiation, but the energy will be spread out over the whole surface,
giving a factor of πR²/4πR² = 1/4:

(Incident solar flux - the input power per unit area received by the
Earth)

Assuming no atmosphere, a nonzero albedo (ratio of reflected to
incident power) of 0.15, and thermal equilibrium, we can equate the
Earth's incident and outgoing flux:

(1 - A) F/4 = σT⁴

Resulting in a sub-zero surface temperature.

Adding in an atmosphere with its own albedo and setting up a system of
equations:

Energy conservation - Earth-atmosphere system

Incoming energy = outgoing energy, with the total albedo accounting
for both surface and atmosphere.

Energy conservation - atmosphere

Incoming energy = outgoing energy
(assuming no energy is absorbed by the atmosphere on initial incidence)

Substituting equation 1 into 2 yields a surface temperature at a
habitable value.

Bose-Einstein condensation

As we require μ < ε for convergence of the grand partition function,
the ground state energy ε₀ will then necessitate μ < ε₀.

The number of excited particles reaches a maximum as μ → ε₀.

Using the standard integral for the excited-state occupation allows us
to calculate a maximum number density.

However, this contradicts our expectation that N can be made
arbitrarily large. This is because we've approximated the sum over
discrete states as an integral over energy, which weights the ground
state with g(0) = 0. This can be fixed approximately by adding the
ground-state occupation separately in cases where μ ≈ ε₀.

To find the fraction of particles in the ground state:

N₀/N = 1 - (T/T_c)^(3/2)

Where T_c is the critical temperature, the threshold below which a
finite fraction of the total number of particles occupies the
single-particle ground state. Particles in such a state are said to
form a Bose-Einstein condensate. This can be achieved through
evaporative cooling, reducing the potential well depth and allowing
particles to re-thermalise (reattain thermal equilibrium).

Phenomena of BECs:
- Particles all occupy the same delocalised quantum wavefunction,
- Collisions are suppressed, and viscosity approaches zero.

Other notable bosons include phonons, from which we can determine the
heat capacity of solids.

By visualising the excitations of a lattice, we can assume a large set
of coupled differential equations where each atom experiences a spring
force from its nearest neighbours:

m ẍ_n = κ (x_{n+1} - 2x_n + x_{n-1})

Establishing a travelling wave as an Ansatz:

x_n = A e^(i(kna - ωt))  ⇒  ω = 2√(κ/m) |sin(ka/2)|

There is a solution of this dispersion relation for every pair of
values of k and ω. If ka ≪ 1, we can also make the small angle
approximation.

Setting periodic boundary conditions (x_{n+N} = x_n), we see something
which resembles Bloch's theorem: the allowed wavevectors are discrete,
k = 2πj/(Na).

Considering 3D motion, there are 3N modes:
- N modes along the 1D chain with spring constant κ₁, and
- 2N transverse modes with spring constant κ₂

For variations over distances much larger than a (the atomic
separation), where ka ≪ 1, the lattice behaves as a continuum.

We can now also rewrite the dispersion relation under the small angle
approximation as:

ω = v_s k

Where v_s = a√(κ/m) is the speed of sound.

From which the density of states can be computed:

g(ω) = 3V ω² / (2π² v_s³)

Heat Capacity

Peter Debye (1884-1966) capped the maximum frequency at ω_D, allowing
us to count the number of modes as the integral of the density of
states over frequency:

3N = ∫₀^(ω_D) g(ω) dω

With this, we find the heat capacity at fixed volume C_V, noting the
zero point energy:

U = ∫₀^(ω_D) g(ω) [ℏω/2 + ℏω/(e^(βℏω) - 1)] dω

Taking the high temperature limit (βℏω ≪ 1) recovers the classical
(Dulong-Petit) value C_V = 3N k_B.

For the low temperature limit, βℏω_D ≫ 1. Making the substitution
x = βℏω and using a standard integral:

C_V ∝ (T/T_D)³

Where we've defined the Debye temperature as T_D = ℏω_D / k_B.

Comparing this to the Einstein model, the Debye model predicts a higher
specific heat capacity at lower temperatures.

Einstein model of heat capacity:
- 3N independent oscillators
- Higher characteristic frequency
- Predicts lower C_V at low temperatures
Debye model of heat capacity:
- Coupled oscillators with 3N modes (N longitudinal, 2N transverse)
- Lower characteristic frequencies
- Predicts higher C_V at low temperatures

Lecture 8: Quantum Behaviour of Fermi Gases

Degenerate Fermi Gas

Unlike bosons, which have no limitation on the number of particles in
one energy level, fermions are restricted by the Pauli exclusion
principle. As such, in the low temperature limit, a degenerate Fermi
gas occurs.

Such a gas occurs when the single-particle energy eigenstates are
occupied with probability 1 up to a maximum single-particle energy (the
Fermi energy ε_F), with all states above this energy empty.
This occurs approximately when k_B T ≪ ε_F, and the Fermi energy is
related to the number density as below:

ε_F = (ℏ²/2m)(3π² n)^(2/3)

The probability of occupation is the Fermi-Dirac distribution, which
approaches a step function at ε_F as T → 0.

Which is to say, a degenerate Fermi gas can occur at any temperature
provided a sufficiently high number density, since ε_F grows with n,
and the criterion for Fermi degeneracy is T ≪ T_F = ε_F / k_B.

Energy and Pressure of a Degenerate Fermi Gas

Assuming all states are filled up to the Fermi energy and empty above
it, the mean energy per particle is (3/5) ε_F.

The average pressure is then:

p = (2/5) n ε_F

Note that this is temperature independent.

A case of degeneracy: white dwarf stars

Stars are in hydrostatic equilibrium - there is a balance between the
gas and radiation pressures outwards, and the gravitational attraction
inwards.
As nuclear fuel depletes, gravity is able to push the star further
inwards, leading to our condition for degeneracy being fulfilled
(k_B T ≪ ε_F).

The lower the particle mass, the higher the Fermi energy. As such, the
lighter electrons will enter degeneracy first.

Taking the Sun's figures and approximating the number density using
the nucleon mass, we yield a Fermi energy. With k_B T at about 1/65
times the Fermi energy, we may safely assume degeneracy.

We can also investigate the radius threshold at which the Fermi
pressure equals the gravitational compression.

To find the force from each mass layer of thickness dr, we assume a
star with homogeneous mass density ρ and calculate the attraction felt
by the shell:

dF = G m(r) dm / r²

We then integrate the force divided by the surface area over each shell
to obtain the central pressure.

Equating the two pressures: for the Sun's mass, this yields a radius
far smaller than the present solar radius.

However, a relativistic modification changes this result.

In the extreme ultra-relativistic case where the kinetic energy greatly
exceeds the rest energy (pc ≫ mc²): E ≈ pc.

Recalculating, we yield the degeneracy pressure in the ultra-relativistic
case, and the Fermi/gravitational pressure balance now yields a critical
mass independent of the radius. The graph on the following slide shows
this Chandrasekhar limit to reside at approximately 1.4 solar masses.

Beyond White Dwarves

With further mass, further collapse is possible, now putting the
nucleons into degeneracy and yielding a far smaller radius.

This gives a radius of order 4 km, with an enormous density. At this
point, it becomes energetically favourable for protons to undergo
electron capture to become neutrons.

Another DFG: electron gases in metals

Following the free electron model of the electron gas, we use an
effective electron mass m* as a result of:
- repulsive electrostatic interactions, and
- screening from attractive interactions with the ions

Given the number density of free electrons in a metal, Fermi
temperatures (T_F = ε_F / k_B) are thousands of kelvins, allowing us
to safely assume Fermi degeneracy.

This allows us to consider properties of metals, including:

- Electrical conductivity

In the Drude model for conductivity:

The electric current density can be defined both in terms of the
applied electric field E, or the density and velocity of conduction
electrons:

j = σE = -nev

Setting up Newton's second law, with a drag term for collisions:

m dv/dt = -eE - mv/τ

Where τ is the time between collisions.

Rearranging, we are able to solve the differential equation. Taking
the asymptotic behaviour (limit as t → ∞):

v = -eEτ / m

Substituting into j = -nev:

j = (ne²τ/m) E

Allowing us to recover σ = ne²τ/m.

- Heat capacity

Electrons within ~k_B T of the Fermi energy will contribute to the
heat capacity, and the proportion which contributes to storing this
energy can be estimated from the width of the Fermi-Dirac curve at
ε_F.

This proportion is approximately k_B T / ε_F = T / T_F.

As such, we anticipate a heat capacity of approximately:

C ≈ N k_B (T / T_F)

- Heat conductivity

A better approximation of the heat capacity is:

C = (π²/2) N k_B (T / T_F)

The heat conductivity κ is defined via the heat flux, and is also
given by the expression below (where τ is equal to the mean time
between electron-phonon collisions):

κ = (1/3) (C/V) v_F² τ

Observing the ratio between thermal and electrical conductivity, using
σ and C from above:

κ / (σT) = (π²/3)(k_B/e)²

This proportionality is described by the Wiedemann-Franz law:

- The ratio between the thermal and electrical conductivities is
  proportional to the temperature through a constant (the Lorenz
  number) dependent on no variable of the metal, either microscopic or
  thermodynamic.
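The constant in the Wiedemann-Franz law is the Lorenz number; a quick numerical sketch (the physical constants are standard rounded values, not figures from the notes):

```python
import math

kB = 1.381e-23   # Boltzmann constant, J/K
e = 1.602e-19    # elementary charge, C

# kappa / (sigma * T) = (pi**2 / 3) * (kB / e)**2 -- no material
# parameter survives: n, tau and m cancel between kappa and sigma.
lorenz = (math.pi**2 / 3.0) * (kB / e)**2
```

The value comes out near 2.44e-8 W Ω K⁻², the same for every metal, which is what makes the law a strong test of the free-electron picture.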

Microscopic phenomena affecting thermal and electrical conductivity:

- Thermal - all collisions hinder thermal conduction, while
- Electrical - collisions between the electrons and the lattice
  strongly affect electrical conductivity, as electron energy is
  converted to lattice vibrations.

The Debye frequency allows us to observe a regime in which the mean
collision time is not the same for thermal and electrical conductivity.
At low enough temperatures, some collisions only affect the thermal
conductivity.
