Lecture 1 (1)
The subject of biophysics is the physical principles underlying all processes of living systems. This also includes environmental biophysics, which deals with the physical influences on physiological functions.
Does biophysics belong to biology, or is it a part of physics? Biology claims to be
a comprehensive science relating to all functions of the living systems. Hence,
biophysics, like genetics, biochemistry, physiology etc., should be considered as a
special part of biology. This view has not remained undisputed by physicists, since
physics is not confined to subjects of inanimate matter. Biophysics can be considered,
with equal justification, as a part of physics. Especially today, when the boundaries
between classical disciplines are no longer fixed, it would be futile to try to balance
those aspects against each other. Biophysics appears to be one of the best examples
of an interdisciplinary science.
What are stochastic events? These are random events that occur with a certain probability, but whose individual occurrences cannot be predicted deterministically.
The existence of a living organism and the vital processes within it are closely related to changes in its energy balance and therefore to energy conversions. The branch of physics called thermodynamics is the science of energy conversion.
But first we have to discuss some major concepts of thermodynamics.
Energy is a property of objects defining their ability to perform some type of work.
There are various types of energy in general, but in the living body energy conversions
happen between the following 4 types of energy:
Mechanical energy is the form of energy describing the movement of a body and its ability to perform work during movement. Mechanical energy is divided into kinetic energy (defined by the velocity of the body's movement) and potential energy (defined by the position of the body).
Thermal energy is the sum of the energies of the chaotic thermal motion of the atoms and molecules of a substance. The indicator of the thermal motion of particles is temperature. The kinetic energy of molecules Ek is defined by the formula
Ek = 3/2 kT
where k = 1.380 × 10⁻²³ J/K = 1.380 × 10⁻¹⁶ erg/K is the Boltzmann constant and T is the absolute temperature.
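As a quick numerical illustration, the following minimal Python sketch (not part of the lecture; the body temperature of 310 K is an assumed value) evaluates the formula above:

# Average thermal kinetic energy of a molecule, Ek = 3/2 kT
k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 310.0               # assumed body temperature, K
E_k = 1.5 * k_B * T
print(f"Ek = {E_k:.2e} J")   # about 6.4e-21 J, i.e. roughly 0.04 eV per molecule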
Chemical energy is the energy of interaction of atoms and molecules; in fact, it is the energy of the electrons moving in the outer orbitals of atoms and molecules.
Electrical energy is the energy of interaction of electrically charged particles, which causes their motion in an electric field.
We also have to introduce a notion of a system and that of a structure of a system.
In thermodynamics, a system is an aggregate of elements with certain interrelations
between them. The systems are classified as follows according to the nature of their
boundary against their environment:
- The isolated system: this is an idealized system that does not exchange any kind
of energy or matter with its environment.
- The closed system: this system can exchange all kinds of energy with its
environment, but not matter.
- The open system: it can exchange both energy and matter with its environment.
The closed system can be influenced by its environment, and can cause changes in
its environment. However, it cannot be involved in an exchange of matter.
The state of a system can be described by a number of state variables. These are
either extensive or intensive parameters. Intensive parameters are non-additive. They
are independent of the size of the system (e.g. temperature, concentration, pressure,
density). Extensive parameters, on the other hand, are additive when two systems are combined (e.g. mass, volume).
What is a structure of a thermodynamic system? The totality of interrelations
between the elements of a system is called the structure of the system. This definition
does not prescribe at all what kind of elements, and what kind of interrelations these
are. It is applicable to all kinds of systems including biological systems and structures.
In physics applied to biological systems, we are especially interested in dynamic
systems, i.e. where the interrelations between their elements are interactions. In
contrast to this, in static systems the elements have no interaction at all, but are just
interrelated by formal relations. Examples of static systems are the system of natural numbers in mathematics, or the system of animal or plant species in biology.
Thermodynamics is the study of the relationship between heat, work, and the
associated flow of energy. After many decades of experience with heat phenomena,
scientists formulated two fundamental laws as the foundation of thermodynamics.
The First Law of Thermodynamics states that energy, which includes heat, is
conserved; that is, one form of energy can be converted into another, but energy can
neither be created nor destroyed. This implies that the total amount of energy in the
universe is constant. The second law, more complex than the first, can be stated in a
number of ways which, although they appear different, can be shown to be
equivalent. Perhaps the simplest statement of the Second Law of Thermodynamics is
that spontaneous change in nature occurs from a state of order to a state of disorder.
Thermodynamic Probability and Entropy
In 1854, the German physicist Rudolf Clausius introduced entropy (S) as a parameter of phenomenological thermodynamics and defined it as the heat added to a system in a reversible way, divided by the absolute temperature (dS = dQrev/T). Later, in 1894, L. Boltzmann used this parameter in the framework of statistical thermodynamics. In this context, entropy and the second principle of thermodynamics become easier to visualize. Entropy appears as a kind of measure of disorder, or as a degree of random distribution of the elements within the system. The correlation between order, probability and, as will be explained later, information is of great importance for understanding the principles of biological organization.
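As a simple illustration of the Clausius definition, for a reversible process at constant temperature it integrates to ΔS = Qrev/T. The minimal sketch below is not from the lecture; the numbers for melting 1 g of ice are approximate illustrative values:

# Entropy change for a reversible isothermal process, ΔS = Q_rev / T
Q_rev = 334.0      # heat needed to melt 1 g of ice, J (approximate)
T = 273.15         # melting temperature of ice, K
delta_S = Q_rev / T
print(f"ΔS = {delta_S:.2f} J/K")   # about 1.22 J/K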
Let us start with the assumption that entropy is a measure of randomization of a
given distribution. We will consider a system of maximal entropy as a system in
maximal disorder. Furthermore, let us demand that the entropy be an extensive
parameter. Hence, like volume or mass, but in contrast, for example, to temperature or density, the entropies S1 and S2 of two systems can be added when these systems are combined: S1 + S2 = S.
Consider a simple example: four spheres are distributed randomly between the two compartments of a box. Let us now ignore the fact that the spheres are distinguishable, and simply ask: how large is the probability that, just by stochastic distribution, one of the relations 4:0, 3:1, 2:2, 1:3, or 0:4 occurs? Apparently, the probability of any kind of distribution will be larger if it can be realized in a larger number of ways. The distribution mode 2:2, for example, is six times more likely than the distribution 4:0 or 0:4. The number of ways which lead to the realization of a definite situation, in fact, seems to be a measure of the probability of its occurrence. We will designate this number of ways by the parameter W, which we will call the thermodynamic probability. The value of W is at least 1 and can be infinitely large, in contrast to the mathematical probability (P), which will be introduced later and which ranges between 0 and 1.
Now we come to the following sequence of conclusions: if W really is a measure of the probability of obtaining a definite distribution, and if an increase in the degree of order is the least probable result of a stochastic distribution, and, finally, if the entropy (S) is a parameter indicating the degree of disorder, then S should be a function of W. This function is expressed by the Boltzmann equation of entropy: S = k ln W.
Boltzmann's constant k was defined as a universal constant later by Max Planck. It
must have the same unit of measurement as entropy, and is as follows:
k = 1.380658 × 10⁻²³ J K⁻¹ = 8.6174 × 10⁻⁵ eV K⁻¹.
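The four-sphere example can also be checked directly by counting. The following minimal Python sketch (an illustration, not part of the lecture) enumerates all 2⁴ = 16 arrangements of four distinguishable spheres in two compartments, counts W for each distribution mode, and converts it to entropy with S = k ln W:

from itertools import product
from collections import Counter
from math import log

k_B = 1.380658e-23   # Boltzmann constant, J/K

# each sphere goes independently into compartment 0 or 1 -> 2**4 = 16 arrangements
counts = Counter(sum(arrangement) for arrangement in product((0, 1), repeat=4))

for n_right, W in sorted(counts.items()):
    S = k_B * log(W)
    print(f"distribution {4 - n_right}:{n_right}  W = {W}  S = {S:.2e} J/K")

# W = 1, 4, 6, 4, 1 for 4:0, 3:1, 2:2, 1:3, 0:4 -- the 2:2 mode is indeed
# six times more probable than 4:0, as stated above.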
This explanation has been based on the simplest experiment where four spheres
are distributed randomly over two compartments. One step towards thermodynamics
can be taken if, for example, the compartments of this box are thought of as molecules
of a system, and the spheres as quanta of energy, distributed among them. This
complication, of course, means a transition to the handling of larger numbers. If the
number of elements and classes are increased, W cannot be evaluated just by simple
trial. It is possible to calculate this value using the following equation:
W = n! / (n1! · n2! · … · nm!)
where n is the total number of all elements in the system (in our example of the four spheres: n = 4); ni (for i = 1, ..., m) is the number of elements in each class of state (this means the number n1 in compartment 1 and n2 in compartment 2); and m is the number of classes of state (namely, the number of compartments in the box).
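For larger numbers this combinatorial formula replaces direct counting. A minimal Python sketch (illustrative only) that evaluates it and reproduces the values from the four-sphere example:

from math import factorial, prod

def thermodynamic_probability(occupations):
    # W = n! / (n1! * n2! * ... * nm!) for the given occupation numbers n_i
    n = sum(occupations)
    return factorial(n) // prod(factorial(n_i) for n_i in occupations)

print(thermodynamic_probability([2, 2]))   # 6  (distribution 2:2)
print(thermodynamic_probability([4, 0]))   # 1  (distribution 4:0)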
The body does not consume energy; it changes it from one form to another. In fact,
the first law could lead us to the erroneous conclusion that animals should be able to
function without a source of external energy. The body takes in energy that is in the
chemical bonds of the food molecules and converts it to heat. If the weight and the
temperature of the body remain constant and if the body performs no external work,
the energy input to the body equals exactly the heat energy leaving the body. We may
suppose that if the heat outflow could be stopped — by good insulation, for example
— the body could survive without food. As we know, this supposition is wrong. The
need for energy is made apparent by examining the functioning of the body in the
light of the Second Law of Thermodynamics.
The body is a highly ordered system. A single protein molecule in the body may
consist of a million atoms bound together in an ordered sequence. Cells are more
complex still. Their specialized functions within the body depend on a specific
structure and location. We know from the Second Law of Thermodynamics that such
a highly ordered system, left to itself, tends to become disordered, and once it is
disordered, it ceases to function. Work must be done on the system continuously to
prevent it from falling apart. For example, the blood circulating in veins and arteries is
subject to friction, which changes kinetic energy to heat and slows the flow of blood.
If a force were not applied to the blood, it would stop flowing in a few seconds. The
concentration of minerals inside a cell differs from that in the surrounding
environment. This represents an ordered arrangement. The natural tendency is
toward an equalization with the environment. Work must be done to prevent the
contents of the cell from leaking out. Finally, cells that die must be replaced, and if the
animal is growing, new tissue must be manufactured. For such replacement and
growth, new proteins and other cell constituents must be put together from smaller,
relatively more random subcomponents. Thus, the process of life consists of building
and maintaining ordered structures. In the face of the natural tendency toward
disorder, this activity requires work. The situation is somewhat analogous to a pillar
made of small, slippery, uneven blocks that tend to slide out of the structure. The pillar
remains standing only if blocks are continuously pushed back.
The work necessary to maintain the ordered structures in the body is obtained
from the chemical energy in food. Except for the energy utilized in external work done
by the muscles, all the energy provided by food is ultimately converted into heat by
friction and other dissipative processes in the body. Once the temperature of the body
is at the desired level, all the heat generated by the body must leave through the
various cooling mechanisms of the body. The heat must be dissipated because, unlike
heat engines (such as the turbine or the steam engine), the body does not have the
ability to obtain work from heat energy. The body can obtain work only from chemical
energy. Even if the body did have mechanisms for using heat to perform work, the
amount of work it could obtain in this way would be small. Once again, the second law
sets the limit.
Of all the various forms of energy, the body can utilize only the chemical binding
energy of the molecules which constitute food. The body does not have a mechanism
to convert the other forms of energy into work. A person could bask in the sun
indefinitely, receiving large quantities of radiant energy, and yet die of starvation.
Plants, on the other hand, are able to utilize radiant energy. As animals use chemical
energy, so plants utilize solar radiation to provide the energy for the ordering
processes necessary for life.
The organic materials produced in the life cycle of plants provide food energy for
herbivorous animals, which in turn are food for the carnivorous animals that eat them.
The sun is, thus, the ultimate source of energy for life on Earth.
Since living systems create order out of relative disorder (for example, by
synthesizing large complex molecules out of randomly arranged subunits), it may
appear at first glance that they violate the Second Law of Thermodynamics, but this is
not the case. To ascertain that the second law is valid, we must examine the whole
process of life, which includes not only the living unit but also the energy that it
consumes and the by-products that it rejects. To begin with, the food that is consumed
by an animal contains a considerable degree of order. The atoms in the food molecules
are not randomly arranged but are ordered in specific patterns. When the chemical
energy in the molecular bindings of the food is released, the ordered structures are
broken down. The eliminated waste products are considerably more disordered than
the food taken in. The ordered chemical energy is converted by the body into
disordered heat energy.
The amount of disorder in a system can be expressed quantitatively by means of
a concept called entropy. Calculations show that, in all cases, the increase in the
entropy (disorder) in the surroundings produced by the living system is always greater
than the decrease in entropy (i.e., ordering) obtained in the living system itself. The
total process of life, therefore, obeys the second law. Thus, living systems are
perturbations in the flow toward disorder. They keep themselves ordered for a while
at the expense of the environment. This is a difficult task requiring the use of the most
complex mechanisms found in nature. When these mechanisms fail, as they
eventually must, the order falls apart, and the organism dies.
This mathematical insight formally shows that if energy and information are
available, the entropy in a given locality can be decreased by the amount of
information available to engage in the process of ordering. In other words, as in our
example of the messy living room, order can be created in a disordered system by
work that is directed by appropriate information. The second law, of course, remains
valid: the overall entropy of the universe increases. The work required to perform the
ordering, one way or another, causes a greater disorder in the surroundings than the
order that was created in the system itself. It is the availability of information and
energy that allows living systems to replicate, grow, and maintain their structures.
The chain of life begins with plants that possess information in their genetic
material on how to utilize the energy from the sun to make highly ordered complex
structures from the simple molecules available to them: principally water, carbon
dioxide, and an assortment of minerals. The process is, in essence, similar in human
beings and other animals. All the information required for the function of the
organism is contained in the intricate structure of DNA. The human DNA consists of
about a billion molecular units in a well-determined sequence. Utilizing the energy
obtained from the food that is consumed by the organism, the information in the DNA
guides the assembly of the various proteins and enzymes required for the functioning
of the organism.
What is the real importance of Shannon's information equation in biophysics? In
principle, it is possible to calculate the information content of a protein. The
requirements are, firstly, a statistical record of the frequency of the occurrence of the
individual amino acids in proteins. This will provide the probability (P) for the presence
of a given amino acid at a certain locus in the protein. Subsequently, the information
content of each monomer can be calculated and the information content of a whole
protein can be obtained by addition of the values of its monomers.
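A minimal Python sketch of this procedure (the amino-acid frequencies below are purely illustrative assumptions, not measured statistics): each residue contributes I = -log2(P) bits, and the contributions are summed over the sequence.

from math import log2

# hypothetical occurrence probabilities of a few amino acids in proteins
P = {"Ala": 0.08, "Gly": 0.07, "Leu": 0.10, "Trp": 0.01}

def information_content(sequence):
    # total syntactic information of the sequence, in bits
    return sum(-log2(P[aa]) for aa in sequence)

protein = ["Ala", "Gly", "Trp", "Leu", "Ala"]
print(f"{information_content(protein):.1f} bits")   # about 21 bits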
The information content of a nucleic acid can be obtained in the same way. One
mammalian DNA molecule consists on average of 15 000 pairs of nucleotides.
Assuming that the four possible types of nucleoside bases have an equal probability
of occurrence, then the information content of each single nucleotide will,
consequently, have a value of 2 bits. The information capacity of one DNA molecule,
in this case, amounts to 30 000 bits.
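The same arithmetic can be written out explicitly (a minimal sketch restating the estimate above):

from math import log2

bits_per_position = log2(4)          # four equally probable bases -> 2 bits
n_pairs = 15_000                     # informative positions (one per base pair)
print(bits_per_position * n_pairs)   # 30000.0 bits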
The problems of this kind of calculation may be illustrated in the following
example: the information content of an English text can be calculated from the
frequency of the use of individual letters. In this way one can derive the information
content of a word, a sentence, and, consequently, even of this textbook. It is, however,
obvious that this parameter does not reveal anything about the "information value"
of the book as generally understood. The same information content (I) would be obtained for any other book of the same length, even one consisting of meaningless strings of English words.
This situation, however, does not invalidate Shannon's information concept.
Everybody knows how important the calculation of information is today in the field of
computer science. So, for example, for the author of this book it is quite important to
know which sort of discs he needs to store the text and the figures of this book. But,
obviously, this is just a question of its volume and of the codes used, and in no way of its "information value" in the common sense.
Does this mean that it is impossible to quantify biologically important information?
Does it mean that the information concept is not applicable to biological systems at
all? In fact, there is no reason for scepticism; rather, a distinction has to be made between a syntactic measure of information and a semantic measure.
The syntactic information content of a DNA molecule, as calculated above,
provides some information on the maximum storage capacity of the genome. The
amount of information actually stored is even lower if the redundancy in the storage of genetic information, which is required for the protection of the information, is taken into account. Estimates of the information capacity of the genome vary between 3 × 10² bits and 10¹² bits.
The semantic information, in contrast to the syntactic information, should really involve some kind of validation of the content. Despite many attempts, quantification of semantic information has not yet been achieved. In spite of the problems of giving an exact and meaningful quantification of information, it is doubtless a quality of matter and has its place in information theory and thermodynamics.
QUESTIONS TO BE PREPARED