
Some Thoughts on Statistical Thermodynamics

Davide Marini
Department of Mechanical Engineering and Division of Biological Engineering, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
(Dated: April 2002)

This review is simply an organized collection of ideas that I gathered from a few textbooks on the subject. It is aimed at providing the absolute basics of statistical thermodynamics. I hope to convey a physical feeling for the microscopic basis of the macroscopic behavior of matter in a few simple cases.

INTRODUCTION

The goal of statistical mechanics is the understanding and prediction of the physical properties of macroscopic systems from the properties of their constituent molecules. It may be classified into two branches: statistical thermodynamics, dealing with systems in equilibrium, and non-equilibrium statistical mechanics. The latter is a very difficult and currently active area of research.

THE FIRST LAW

For the purpose of thermodynamics, the universe is divided into two parts: a system and its surroundings. The system is that part of the world in which we have a special interest (e.g. a glass of water or a cell); the surroundings are where we make our observations. Work, heat and energy are the basic concepts of thermodynamics. The most fundamental of these concepts is work: all measurements of heat and changes in energy can be expressed in terms of measurements of work. The energy of a system is defined as its capacity to do work.

At the molecular level, heat is the transfer of energy that makes use of chaotic molecular motion. The chaotic motion of molecules is called thermal motion. For example, when two bodies at different temperatures are put in contact, the thermal motion of the molecules of the hotter body stimulates the molecules in the colder one to move more vigorously, albeit in a disorganized way. As a result, the energy of the colder body increases. In molecular terms, work is the transfer of energy that makes use of organized motion. For example, when a weight is raised or lowered, its atoms move in an organized way.

Notice that the distinction between work and heat is made in the surroundings. When work is done on a system, the energy leaves the surroundings in an organized way, but it does not necessarily arrive in the system that way. Think of heating a glass of water by stirring: because collisions between molecules quickly randomize their directions, the orderly motion of the stirrer is transformed into thermal motion of the water.

The first law of thermodynamics states that the change in energy of a system is equal to the energy that passes through its boundary as heat or work:

ΔU = q + w

Here ΔU is the change in energy of the system, q is the energy transferred to the system as heat and w is the work done on the system. Experimental evidence has demonstrated that the energy of a system is a state function: its value depends only on the current state of the system and is independent of how that state was reached. On the other hand, neither work nor heat is separately a state function: each depends on the path followed by the system in changing from one state to another.
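The bookkeeping of the first law can be made concrete with a toy calculation. The numbers below are invented for illustration, not taken from the article: energy is a state function, so ΔU is the same along any path between two given states, even though q and w individually are not.

```python
def delta_U(q: float, w: float) -> float:
    """First law: the change in internal energy of a system equals the
    heat q transferred to it plus the work w done on it."""
    return q + w

# Two hypothetical paths between the same pair of states (illustrative
# numbers only): q and w differ from path to path, but their sum does not.
path_A = delta_U(q=150.0, w=-50.0)  # heat flows in, the system does work
path_B = delta_U(q=40.0, w=60.0)    # less heat in, work is done on the system

print(path_A, path_B)  # → 100.0 100.0: both paths give the same ΔU
```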

THE SECOND LAW

Some processes in nature happen spontaneously, some don't: this simple observation is the heart of the second law. If we throw a rock into a lake, the coherent motion of its atoms is converted to chaotic thermal motion of the water molecules. The reverse process, a rock being ejected from still water by a sudden coherent motion of the surrounding molecules, has never been observed, even though it violates neither the first law of thermodynamics nor Newton's laws of motion. Spontaneous processes are characterized by the conversion of order to chaos: they occur in directions that increase the overall disorder of the universe (system and surroundings).

In order to get a physical feeling for why this is so, let us consider an extremely simple case. Think of an isolated system consisting of two bulbs of equal volume, connected by a valve. Initially, one of the bulbs contains N identical molecules of an ideal gas and the other bulb is empty. If we open the valve between the bulbs, we expect the system to eventually reach an equilibrium state characterized by the N molecules being evenly distributed between the two bulbs. Why does this happen?

Let's focus our attention on a specific molecule (imagine putting a red label on it). When the valve is open and the system is in equilibrium, this molecule will collide with other molecules and wander freely through the entire available space. In more formal words, for a given macrostate, characterized by a total volume V, a number of molecules N and a total energy E, there can exist many different microstates characterized by different distributions of molecules in space, different velocities, etc.

Now we introduce the most important postulate of statistical mechanics: for an isolated system, all microstates compatible with the given constraints of the macrostate (in our case E, V and N) are equally likely to occur. Since we have no a priori reason to prefer one microstate to another, we must assign equal a priori probabilities to them. In our example, each molecule can be located anywhere in the available space, so there is an equal probability that it will occupy either bulb. The total number of ways in which we can distribute N molecules between the two bulbs is therefore 2^N.

Let's now consider the possibility that L molecules are in the left bulb and N − L are in the right one. The number of (indistinguishable) ways of placing L of the N molecules in the left bulb is

W_L = N! / [L! (N − L)!]
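These counts are easy to verify for small N, and the same expression shows how sharply peaked the distribution becomes as N grows. The following sketch (standard-library Python; the particular values of N are chosen only for illustration) uses exact binomial coefficients for a small system and log-factorials for a large one:

```python
from math import comb, lgamma, log

def ways_left(N: int, L: int) -> int:
    """W_L = N! / (L! (N - L)!): ways to put L of N molecules in the left bulb."""
    return comb(N, L)

def ln_prob(N: int, L: int) -> float:
    """Natural log of the probability W_L / 2^N, using lgamma(n + 1) = ln n!
    so the huge factorials never have to be formed explicitly."""
    ln_WL = lgamma(N + 1) - lgamma(L + 1) - lgamma(N - L + 1)
    return ln_WL - N * log(2)

# Small system: N = 10 molecules, 2^10 = 1024 equally likely microstates.
assert sum(ways_left(10, L) for L in range(11)) == 2**10
assert ways_left(10, 5) == 252  # the even split is the most probable macrostate

# Larger system: even at N = 10**6 (still minuscule next to 10**23), a 1%
# imbalance between the bulbs is already astronomically improbable.
lp = ln_prob(10**6, 510_000)    # ln P ≈ -207, i.e. P ~ 10**-90
```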

FIG. 1: Formation of a Schottky defect: an atom from the interior of the crystal lattice is displaced and migrates to the surface.

The probability of such an event is therefore W_L / 2^N. So what is the most probable state in which the system will be found? For any value of N, the state with the highest value of W_L is the one with half the molecules in each bulb. Take for example a physically significant number of molecules, say N = 10^23; in this case the probability that the number of molecules in the left bulb differs from the number in the right by one part in ten billion is about 10^−434. Therefore, the reason the number of molecules in each bulb of our system is always observed to be equal is not because of any law of motion; it is because the probability of all other states is so insignificant.

Is it possible that all N molecules will suddenly collect in one of the bulbs? Yes, it is possible, but extremely unlikely. Recalling our first example, is it possible that a rock be ejected from still water by a sudden orderly motion of the surrounding molecules? Again, it is possible. But the probability of this event is so small that, for all practical purposes, it can be considered zero.

In macroscopic systems, W, the number of ways (microstates) of arranging a system in a particular (macro)state, is usually inconveniently immense. In order to deal with W more easily, Ludwig Boltzmann defined in 1877 a quantity known as entropy,

S = k ln W,

which increases with W but in a more manageable way. Here k is Boltzmann's constant. Any macroscopic system will spontaneously adopt the arrangement where entropy is maximum, simply because this state is so overwhelmingly probable. During real processes, the entropy of an isolated system always increases. In the state of equilibrium the entropy attains its maximum value. This is the basic form of the second law.

The postulate of equal a priori probabilities of microstates is extremely powerful. From it we can derive a relation between the probability of a microstate, its energy and the temperature of the system.

Let's consider one of the simplest examples: the Schottky defect. At the absolute zero of temperature, the atoms of a solid are ordered completely regularly on a crystal lattice. As the temperature is increased, thermal agitation introduces several kinds of irregularities. One of these irregularities consists of an atom being displaced from its lattice site and migrating to the surface of the crystal (Fig. 1). At T = 0 K there will be no defects in the crystal. As the temperature increases, the number of defects will gradually increase. Our goal is to find the relation between the number of Schottky defects and the temperature of the crystal.

Atoms at a lattice site inside the crystal have a lower energy than atoms at the surface, since interior atoms are bonded to a larger number of surrounding atoms. Let ε be the energy of formation of a Schottky defect, measured relative to the energy of an interior atom. Let us consider a crystal consisting of N atoms and possessing n Schottky defects. The energy associated with these defects is therefore E = nε. How many ways are there to remove n atoms from the interior of the crystal to its surface? Recalling our earlier example,

W_n = N! / [n! (N − n)!]

and the entropy associated with the disorder resulting from these defects is

S(n) = k ln { N! / [n! (N − n)!] }.

For a crystal in thermal equilibrium we can then write

1/T = ∂S/∂E = (dS(n)/dn)(dn/dE) = (1/ε) dS(n)/dn,

where the first equality is a fundamental relation of thermodynamics and the last step follows from E = nε. To calculate dS(n)/dn we use an approximation of the factorial function known as Stirling's formula,

ln n! ≈ n ln n − n,

which, for very large n, reduces to ln n! ≈ n ln n. Hence we obtain:

S(n) = k [N ln N − n ln n − (N − n) ln(N − n)]
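Stirling's formula is easy to sanity-check numerically. In the sketch below (standard-library Python, with sample values of n chosen only for illustration), `math.lgamma(n + 1)` supplies an accurate value of ln n! for comparison:

```python
from math import lgamma, log

def ln_factorial(n: int) -> float:
    """ln n! computed via the log-gamma function: lgamma(n + 1) = ln n!."""
    return lgamma(n + 1)

def stirling(n: int) -> float:
    """Stirling's approximation: ln n! ≈ n ln n - n."""
    return n * log(n) - n

# The relative error falls rapidly with n, which is what justifies using
# the approximation for particle numbers of order 10**23.
for n in (10**2, 10**4, 10**6):
    rel_err = abs(ln_factorial(n) - stirling(n)) / ln_factorial(n)
    print(n, rel_err)
```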

and

dS(n)/dn = k [−ln n + ln(N − n)],

from which we can write

1/T = (k/ε) ln[(N − n)/n].

Taking exponentials and solving for n leads to

n/N = 1 / (e^(ε/kT) + 1),

which, for n ≪ N (or ε ≫ kT), can be written as

n/N = e^(−ε/kT).

This equation is our final result: it relates the concentration of Schottky defects to their activation energy and the temperature of the crystal. This exponential dependence on energy, referred to as the Boltzmann factor, is a recurring theme in statistical mechanics.

[1] McQuarrie, Donald A. (2000) Statistical Mechanics (University Science Books, Sausalito, CA).
[2] Atkins, Peter W. (1994) Physical Chemistry (W. H. Freeman and Company, New York).
[3] Mandl, Franz (1988) Statistical Physics (John Wiley and Sons, New York).
[4] Voet, Donald and Voet, Judith (1995) Biochemistry (John Wiley and Sons, New York).
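As a closing numerical sanity check of the defect-concentration result, here is a minimal sketch; the defect energy ε = 1 eV and the temperature T = 1000 K are illustrative values chosen for this example, not taken from the article:

```python
from math import exp

K_B = 8.617e-5  # Boltzmann's constant in eV/K

def defect_fraction(eps: float, T: float) -> float:
    """Exact result derived above: n/N = 1 / (exp(eps/kT) + 1)."""
    return 1.0 / (exp(eps / (K_B * T)) + 1.0)

def defect_fraction_boltzmann(eps: float, T: float) -> float:
    """Boltzmann-factor limit, valid when eps >> kT: n/N ≈ exp(-eps/kT)."""
    return exp(-eps / (K_B * T))

# With eps = 1 eV and T = 1000 K, eps/kT ≈ 11.6, so roughly one lattice
# site in 10**5 hosts a defect, and the two expressions nearly coincide.
frac = defect_fraction(1.0, 1000.0)
approx = defect_fraction_boltzmann(1.0, 1000.0)
```

The exponential sensitivity to temperature is easy to see with these functions: halving T to 500 K suppresses the defect fraction by several more orders of magnitude.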
