Lecture 1

NATURE AND SUBJECT OF BIOPHYSICS

The subject of biophysics is the physical principles underlying all processes of
living systems. This also includes environmental biophysics, which deals with the
physical influences of the environment on physiological functions.
Does biophysics belong to biology, or is it a part of physics? Biology claims to be
a comprehensive science relating to all functions of living systems. Hence,
biophysics, like genetics, biochemistry, physiology, etc., should be considered a
special branch of biology. This view has not remained undisputed by physicists, since
physics is not confined to inanimate matter. Biophysics can be considered,
with equal justification, as a part of physics. Especially today, when the boundaries
between the classical disciplines are no longer fixed, it would be futile to try to weigh
those claims against each other. Biophysics appears to be one of the best examples
of an interdisciplinary science.

Two kinds of physical behavior meet on the molecular level of biological
structures. On the one hand, there are the characteristic properties of microphysical
processes, based on the individual behavior of single small particles such as atoms,
molecules, or supramolecular structures. These processes are mostly stochastic. On
the other hand, there are reactions which resemble macrophysical properties, the
kind of behavior of "large" bodies. Macrophysics is ruled by the laws of classical
physics, for example classical mechanics. Our daily experience with
macrophysical systems teaches us that their behavior is generally deterministic.
To explain this difference, let us consider a simple mechanical wheelwork. The
knowledge of its design and construction allows a precise prediction of the behavior
of the system. This prediction is based on the laws of classical mechanics. In contrast
to this, a chemical reaction with a small number of molecules in a homogeneous phase
depends on stochastic collisions of the individual molecules with each other. Since this
process is stochastic, it is predictable only in a statistical way.
This stochastic behavior of molecular systems can be transformed into a
deterministic one if the number of participating stochastic events is large, or if the
degrees of freedom of the single reactions are extremely limited. The number of
stochastic events can be increased either by increasing the number of participating
molecules (for example, by enlarging the volume in which the reaction takes place)
or by lengthening the time interval of observation. This consideration indicates an
interesting interplay between the volume, the time constants, and the reliability of a
biochemical reaction.
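
How quickly stochastic behavior turns deterministic as the number of events grows can be made concrete with a small simulation. The following Python sketch (our own illustration, not part of the lecture) assigns N molecules at random to one of two compartments and records the fraction found in the left one; the spread of that fraction shrinks roughly as 1/sqrt(N):

    import random

    def fraction_in_left(n_molecules):
        """Place n molecules at random into two compartments and
        return the fraction that ends up in the left one."""
        left = sum(random.random() < 0.5 for _ in range(n_molecules))
        return left / n_molecules

    random.seed(1)
    for n in (10, 1_000, 100_000):
        samples = [fraction_in_left(n) for _ in range(200)]
        mean = sum(samples) / len(samples)
        spread = max(samples) - min(samples)
        print(f"N = {n:>7}: mean fraction = {mean:.3f}, spread = {spread:.4f}")

For N = 10 the outcome of a single trial is essentially unpredictable; for N = 100,000 the fraction is, for all practical purposes, the deterministic value 1/2.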
The limitation of the degrees of freedom of a biochemical reaction is realized by a
property of the system called anisotropy. In contrast to isotropic systems, such as
simple solutions, in anisotropic systems the mobility of molecules in various directions
is not identical: it is restricted in some directions and promoted in others. This, for
example, is the case for enzymatic reactions in which the participating enzymes are
oriented in membranes, or in which the reactions of charged or polar reactants occur
in the strong electric fields of electrical double layers.
In many fields the biological organism works as an amplifier of microphysical
stochastics. A molecular mutation, for example, leads to a reaction chain, which finally
ends with a phenomenological alteration of the organism. Or, as another example: a
few molecular events in the pigments of an optical receptor can lead to perception
and to reaction in behavior.
When first considering the molecular mechanisms of biological systems,
a further aspect must be taken into account. Unfortunately, biologists often ignore that
a qualitative jump has to be considered in the transition from the "visible"
macrophysical structures to microphysical systems such as atoms or molecules.
This includes not only the above-mentioned transition from the deterministic
behavior of macroscopic systems to the stochastic behavior of single molecules, but
many further aspects as well.

Some comments on the terms used above:

What are supramolecular structures? These are assemblies of several molecules in
one complex that carries out a certain process. There are no strong chemical bonds between
the molecules within the complex; rather, they are held together by
weak forces (van der Waals forces, hydrophobic forces, electrostatic effects).

What are stochastic events? These are random events, each having a certain probability of
taking place; no single event, however, can be predicted in a deterministic way.

FUNDAMENTAL CONCEPTS OF THERMODYNAMICS

The existence of a living organism and the vital processes within it are closely
related to changes in its energy balance and therefore to energy conversions.
The part of physics called thermodynamics is the science of energy conversion.
But first we have to discuss some major concepts of thermodynamics.
Energy is a property of objects that defines their ability to perform some type of work.
There are various types of energy in general, but in the living body energy conversions
happen between the following four types of energy:
Mechanical energy is the form of energy describing the movement of a body
and its ability to perform work during movement. Mechanical energy is divided into
kinetic energy (defined by the velocity of the body's movement) and potential energy
(defined by the position of the body).
Thermal energy is the sum of the energies of the chaotic thermal motions of the
atoms and molecules of a substance. The indicator of the thermal motion of particles is
temperature. The mean kinetic energy of a molecule, Ek, is defined by the formula

Ek = (3/2) kT,

where k = 1.380 × 10⁻²³ J/K = 1.380 × 10⁻¹⁶ erg/K is the Boltzmann constant and T is the
absolute temperature.
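
As a quick numerical illustration (our own example, using the current exact SI value of k), the mean kinetic energy of a molecule at human body temperature, about 310 K, follows directly from this formula:

    k_B = 1.380649e-23   # Boltzmann constant (exact SI value), J/K
    T = 310.0            # approximate human body temperature, K

    E_k = 1.5 * k_B * T  # mean kinetic energy per molecule, J
    print(f"E_k = {E_k:.2e} J = {E_k / 1.602176634e-19:.3f} eV")

This gives about 6.4 × 10⁻²¹ J, or roughly 0.04 eV: far below the several eV of a covalent bond, but comparable to the weak interactions that hold supramolecular structures together.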
Chemical energy is the energy of interaction of atoms and molecules. In fact, it
is the energy of the electrons moving in the outer orbitals of atoms and molecules.
Electric energy is the energy of interaction of electrically charged particles,
which causes their motion in an electric field.
We also have to introduce the notion of a system and that of the structure of a system.
In thermodynamics, a system is an aggregate of elements with certain interrelations
between them. The systems are classified as follows according to the nature of their
boundary against their environment:
- The isolated system: this is an idealized system that does not exchange any kind
of energy or matter with its environment.
- The closed system: this system can exchange all kinds of energy with its
environment, but not matter.
- The open system: it can exchange both energy and matter with its environment.
The closed system can be influenced by its environment, and can cause changes in
its environment. However, it cannot be involved in an exchange of matter.
The state of a system can be described by a number of state variables. These are
either extensive or intensive parameters. Intensive parameters are non-additive. They
are independent of the size of the system (e.g. temperature, concentration, pressure,
density). Extensive parameters on the other hand, are additive when two systems are
combined (e.g. mass, volume).
What is a structure of a thermodynamic system? The totality of interrelations
between the elements of a system is called the structure of the system. This definition
does not prescribe at all what kind of elements, and what kind of interrelations these
are. It is applicable to all kinds of systems including biological systems and structures.
In physics applied to biological systems, we are especially interested in dynamic
systems, i.e. where the interrelations between their elements are interactions. In
contrast to this, in static systems the elements have no interaction at all, but are just
interrelated by formal relations. Examples of static systems are the system of natural
numbers in mathematics, or the taxonomic system of animal or plant species in biology.

Thermodynamics is the study of the relationship between heat, work, and the
associated flow of energy. After many decades of experience with heat phenomena,
scientists formulated two fundamental laws as the foundation of thermodynamics.
The First Law of Thermodynamics states that energy, which includes heat, is
conserved; that is, one form of energy can be converted into another, but energy can
neither be created nor destroyed. This implies that the total amount of energy in the
universe is constant. The second law, more complex than the first, can be stated in a
number of ways which, although they appear different, can be shown to be
equivalent. Perhaps the simplest statement of the Second Law of Thermodynamics is
that spontaneous change in nature occurs from a state of order to a state of disorder.
Thermodynamic Probability and Entropy
In 1854, the German physicist Rudolf Clausius introduced entropy (S) as a parameter
of phenomenological thermodynamics, defining it as the heat added to a system
in a reversible way, divided by the absolute temperature (dS = dQrev/T). Later, in 1894, L.
Boltzmann used this parameter in the framework of statistical thermodynamics. In
this context, entropy and the second principle of thermodynamics become more
imaginable. Entropy appears as a kind of measure of disorder, or as a degree of random
distribution of the elements within the system. The correlation between order and
probability and, as will be explained later - information - is of great importance for the
understanding of the principles of biological organization.
Let us start with the assumption that entropy is a measure of randomization of a
given distribution. We will consider a system of maximal entropy as a system in
maximal disorder. Furthermore, let us demand that the entropy be an extensive
parameter. Therefore, like volume or mass, but in contrast, for example, to
temperature or density, the entropies S1 and S2 of two systems can be added when
these systems are combined: S1 + S2 = S.

How can we now define a parameter which indicates the degree of
randomization or, on the contrary, the degree of order? What does order or
organization mean? Of course, our daily experience shows that an ordered system
spontaneously transforms into a disordered one, but not vice versa. This, actually, is
a consequence of the second principle of thermodynamics.
Let us consider a very simple structure: just the distribution of four
distinguishable spheres over the two compartments of a box (Fig. 2.8). Let each of these
spheres, independently of the three others, fall just by chance into one or the other
compartment of the box. All 16 possible arrangements, as indicated in
Fig. 2.8, have the same probability, because each sphere individually is equally
likely to fall into compartment 1 or compartment 2.
Summarizing the patterns of distribution, there is only one way to realize each of the
distributions 0:4 and 4:0. In contrast, there are four ways to realize each of the distributions
3:1 and 1:3, and, finally, six ways for the equal distribution 2:2.

Let us now ignore the fact that the spheres are distinguishable and simply ask:
how large is the probability that a stochastic distribution produces one of the ratios
4:0, 3:1, 2:2, 1:3, or 0:4? Apparently, the probability of any kind of distribution
will be larger the more ways there are to realize it. The distribution mode
2:2, for example, is six times more likely than the distribution 4:0 or 0:4. The number
of ways which lead to the realization of a definite situation, in fact, seems to be a
measure of the probability of its occurrence. We designate this number of
ways by the parameter W, which we call the thermodynamic probability. The value
of W is at least 1, with no upper bound, in contrast to the mathematical probability
(P), which will be introduced later, and which ranges between 0 and 1.
Now we come to the following sequence of conclusions: if W really is a measure
of the probability of obtaining a definite distribution, if an increase in the degree of
order is the most improbable result of a stochastic distribution, and, finally, if the entropy
(S) is a parameter indicating the degree of disorder, then S should be a function of W.
This function is expressed by the Boltzmann equation of entropy: S = k ln W.
Boltzmann's constant k was later defined as a universal constant by Max Planck. It
must have the same unit of measurement as entropy, and its value is:
k = 1.380658 × 10⁻²³ J K⁻¹ = 8.6174 × 10⁻⁵ eV K⁻¹.
This explanation has been based on the simplest experiment, in which four spheres
are distributed randomly over two compartments. One step towards thermodynamics
can be taken if, for example, the compartments of this box are thought of as molecules
of a system, and the spheres as quanta of energy distributed among them. This
complication, of course, means a transition to the handling of larger numbers. If the
number of elements and classes is increased, W cannot be evaluated just by simple
trial. It is possible to calculate this value using the following equation:

W = n! / (n1! · n2! · ... · nm!)

where n is the total number of all elements in the system (in the case of Fig. 2.8, the total
number of spheres: n = 4); ni (for i = 1, ..., m) is the number of elements in each class
of state (this means the number n1 in compartment 1 and n2 in compartment 2); and
m is the number of classes of state (namely, the number of compartments in the box).
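
A short Python script (our own illustration) can verify the sphere example by computing W from this formula and the corresponding Boltzmann entropy:

    from math import factorial, log

    def thermodynamic_probability(*occupations):
        """W = n! / (n1! * n2! * ... * nm!) for the given occupation numbers."""
        n = sum(occupations)
        w = factorial(n)
        for n_i in occupations:
            w //= factorial(n_i)  # each sequential division is exact
        return w

    # The five distribution modes of four spheres over two compartments
    for n1 in range(5):
        print(f"{n1}:{4 - n1}  W = {thermodynamic_probability(n1, 4 - n1)}")

    k = 1.380649e-23  # Boltzmann constant, J/K
    print(f"S(2:2) = k ln 6 = {k * log(6):.2e} J/K")

The loop prints W = 1, 4, 6, 4, 1 for the five distribution modes, exactly the counts found above by inspection.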

First Law of Thermodynamics


One of the first to state the law of energy conservation was the German physician
Julius Robert Mayer (1814–1878). In 1840 Mayer was the physician on the schooner Java,
which sailed for the East Indies. While aboard ship, he was reading a treatise by the
French scientist Antoine Laurent Lavoisier in which Lavoisier suggested that the heat produced
by animals is due to the slow combustion of food in their bodies. Lavoisier further
noted that less food is burned by the body in a hot environment than in a cold one.
When the ship reached the tropics, many of its crew became sick with fever.
Applying the usual remedy for fever, Mayer bled his patients. He noticed that the
venous blood, which is normally dark red, was nearly as red as arterial blood. He
considered this a verification of Lavoisier's suggestion: because in the tropics less fuel
is burned in the body, the oxygen content of the venous blood remains high, giving it the
brighter color. Mayer then went beyond Lavoisier's theory and suggested that in the
body there is an exact balance of energy (which he called force). The energy released
by the food is balanced by the lost body heat and the work done by the body. Mayer
wrote in an article published in 1842, “Once in existence, force [energy] cannot be
annihilated—it can only change its form.”
Considerably more evidence had to be presented before conservation of energy
was accepted as a law, but it is interesting that such a fundamental physical law was
first suggested from the observation of human physiology.

Fig. 1 The energetics of the body.


Conservation of energy is implicit in all our calculations of energy balance in living
systems. Consider, for example, the energetics of a functioning animal (see
Fig. 1). The body of an animal contains internal thermal energy Et, which is the product
of the mass, specific heat, and temperature of the body, and chemical energy Ec stored in
the tissue of the body. In terms of energy, the activities of an animal consist simply of
eating, working, and rejecting excess heat by means of various cooling mechanisms
(radiation, convection, perspiration, etc.). Without going into detailed calculations, the
first law allows us to draw some conclusions about the energetics of the animal. For
example, if the internal temperature and the weight of the animal are to remain constant
(i.e., Ec and Et constant), then over a given period of time the energy intake must be
exactly equal to the sum of the work done and the heat lost by the body. An imbalance
between intake and output energy implies a change in the sum Ec + Et.
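
This first-law bookkeeping can be written out explicitly; the numbers below are invented for illustration and do not come from the text:

    def stored_energy_change(intake_J, work_J, heat_lost_J):
        """Change in the stored energy Ec + Et over some period:
        energy intake minus the work done and the heat lost."""
        return intake_J - (work_J + heat_lost_J)

    # A hypothetical day: 10 MJ of food, 2 MJ of external work, 8 MJ of heat lost
    delta = stored_energy_change(10e6, 2e6, 8e6)
    print(f"change in Ec + Et: {delta:.0f} J")  # 0 J: weight and temperature stay constant

Any nonzero result means the animal is either storing energy (gaining weight or warming up) or drawing on its reserves.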
Second Law of Thermodynamics
There are many imaginable phenomena that are not forbidden by the First Law
of Thermodynamics but still do not occur. For example, when an object falls from a
table to the ground, its potential energy is first converted into kinetic energy; then, as
the object comes to rest on the ground, the kinetic energy is converted into heat. The
First Law of Thermodynamics does not forbid the reverse process, whereby the heat
from the floor would enter the object and be converted into kinetic energy, causing
the object to jump back on the table. Yet this event does not occur. Experience has
shown that certain types of events are irreversible. Broken objects do not mend by
themselves. Spilled water does not collect itself back into a container. The
irreversibility of these types of events is intimately connected with the probabilistic
behavior of systems composed of a large ensemble of subunits.
One statement of the second law is: The direction of spontaneous change in a
system is from an arrangement of lesser probability to an arrangement of greater
probability; that is, from order to disorder. This statement may seem to be so obvious
as to be trivial, but, once the universal applicability of the second law is recognized, its
implications are seen to be enormous. We can deduce from the second law the
limitations on information transmission, the meaning of time sequence, and even the
fate of the universe. These subjects, however, are beyond the scope of our discussion.

Thermodynamics of Living Systems


It is obvious that animals need food to live, but the reason for this is less obvious.
The idea that animals need energy because they consume energy is, strictly speaking,
incorrect. We know from the First Law of Thermodynamics that energy is conserved.
The body does not consume energy; it changes it from one form to another. In fact,
the first law could lead us to the erroneous conclusion that animals should be able to
function without a source of external energy. The body takes in energy that is in the
chemical bonds of the food molecules and converts it to heat. If the weight and the
temperature of the body remain constant and if the body performs no external work,
the energy input to the body equals exactly the heat energy leaving the body. We may
suppose that if the heat outflow could be stopped — by good insulation, for example
— the body could survive without food. As we know, this supposition is wrong. The
need for energy is made apparent by examining the functioning of the body in the
light of the Second Law of Thermodynamics.
The body is a highly ordered system. A single protein molecule in the body may
consist of a million atoms bound together in an ordered sequence. Cells are more
complex still. Their specialized functions within the body depend on a specific
structure and location. We know from the Second Law of Thermodynamics that such
a highly ordered system, left to itself, tends to become disordered, and once it is
disordered, it ceases to function. Work must be done on the system continuously to
prevent it from falling apart. For example, the blood circulating in veins and arteries is
subject to friction, which changes kinetic energy to heat and slows the flow of blood.
If a force were not applied to the blood, it would stop flowing in a few seconds. The
concentration of minerals inside a cell differs from that in the surrounding
environment. This represents an ordered arrangement. The natural tendency is
toward an equalization with the environment. Work must be done to prevent the
contents of the cell from leaking out. Finally, cells that die must be replaced, and if the
animal is growing, new tissue must be manufactured. For such replacement and
growth, new proteins and other cell constituents must be put together from smaller,
relatively more random subcomponents. Thus, the process of life consists of building
and maintaining ordered structures. In the face of the natural tendency toward
disorder, this activity requires work. The situation is somewhat analogous to a pillar
made of small, slippery, uneven blocks that tend to slide out of the structure. The pillar
remains standing only if blocks are continuously pushed back.
The work necessary to maintain the ordered structures in the body is obtained
from the chemical energy in food. Except for the energy utilized in external work done
by the muscles, all the energy provided by food is ultimately converted into heat by
friction and other dissipative processes in the body. Once the temperature of the body
is at the desired level, all the heat generated by the body must leave through the
various cooling mechanisms of the body. The heat must be dissipated because, unlike
heat engines (such as the turbine or the steam engine), the body does not have the
ability to obtain work from heat energy. The body can obtain work only from chemical
energy. Even if the body did have mechanisms for using heat to perform work, the
amount of work it could obtain in this way would be small. Once again, the second law
sets the limit.
Of all the various forms of energy, the body can utilize only the chemical binding
energy of the molecules which constitute food. The body does not have a mechanism
to convert the other forms of energy into work. A person could bask in the sun
indefinitely, receiving large quantities of radiant energy, and yet die of starvation.
Plants, on the other hand, are able to utilize radiant energy. As animals use chemical
energy, so plants utilize solar radiation to provide the energy for the ordering
processes necessary for life.
The organic materials produced in the life cycle of plants provide food energy for
herbivorous animals, which in turn are food for the carnivorous animals that eat them.
The sun is, thus, the ultimate source of energy for life on Earth.
Since living systems create order out of relative disorder (for example, by
synthesizing large complex molecules out of randomly arranged subunits), it may
appear at first glance that they violate the Second Law of Thermodynamics, but this is
not the case. To ascertain that the second law is valid, we must examine the whole
process of life, which includes not only the living unit but also the energy that it
consumes and the by-products that it rejects. To begin with, the food that is consumed
by an animal contains a considerable degree of order. The atoms in the food molecules
are not randomly arranged but are ordered in specific patterns. When the chemical
energy in the molecular bindings of the food is released, the ordered structures are
broken down. The eliminated waste products are considerably more disordered than
the food taken in. The ordered chemical energy is converted by the body into
disordered heat energy.
The amount of disorder in a system can be expressed quantitatively by means of
a concept called entropy. Calculations show that, in all cases, the increase in the
entropy (disorder) in the surroundings produced by the living system is always greater
than the decrease in entropy (i.e., ordering) obtained in the living system itself. The
total process of life, therefore, obeys the second law. Thus, living systems are
perturbations in the flow toward disorder. They keep themselves ordered for a while
at the expense of the environment. This is a difficult task requiring the use of the most
complex mechanisms found in nature. When these mechanisms fail, as they
eventually must, the order falls apart, and the organism dies.

Information and the Second Law of Thermodynamics


We have stressed earlier that work must be done to create and maintain the
highly ordered local state of life. We now turn to the question: what else is needed for
such local ordering to occur? Perhaps we can get an insight into this issue from a
simple everyday experience. In the course of time, our apartment becomes
disordered. Books, which had been placed neatly, in alphabetical order, on a shelf in
the living room, are now strewn on the table and some are even under the bed. Dishes
that were clean and neatly stacked in the cupboard, are now dirty with half-eaten food
and are on the living room table. We decide to clean up, and in 15 minutes or so the
apartment is back in order. The books are neatly shelved, and the dishes are clean and
stacked in the kitchen. The apartment is clean.
Two factors were necessary for this process to occur. First, as was already stated,
energy was required to do the work of gathering and stacking the books and cleaning
and ordering the dishes. Second, and just as important, information was required to
direct the work in the appropriate direction. We had to know where to place the books
and how to clean the dishes and stack them just so. The concept of information is of
central importance here.
In the 1940s, Claude Shannon developed a quantitative formulation for the
amount of information available in a given system. The information content of a message
depends on the effort required to guess it by a systematic series of questions; hence,
information is a measure of the novelty, or unexpectedness, of a message.
It is not difficult to guess the result of the toss of a coin, since there are only two
possibilities of equal probability. To guess a certain card in a full deck of playing cards
is much more difficult. In this case, a much greater uncertainty factor has to be taken
into account. Using a more systematic approach, a large number of yes-no questions
have to be answered. Hence, the information content of a playing card is higher than
that of a tossed coin. Should a deck consist of cards which are all the same, and this is
known by the challenged person, guessing will not make sense at all. The information
content of each of these cards is zero. The probability with which possibilities are turned
into reality, consequently, seems to be a measure of information.
Shannon's formula for information content is equivalent to the formula for entropy,
the measure of disorder, but with a negative sign: I = -k ln P,
where P is the mathematical probability of the situation in question, defined as the
number of outcomes realizing it divided by the total number of equally possible
outcomes; P therefore ranges between 0 and 1.
This mathematical insight formally shows that if energy and information are
available, the entropy in a given locality can be decreased by the amount of
information available to engage in the process of ordering. In other words, as in our
example of the messy living room, order can be created in a disordered system by
work that is directed by appropriate information. The second law, of course, remains
valid: the overall entropy of the universe increases. The work required to perform the
ordering, one way or another, causes a greater disorder in the surroundings than the
order that was created in the system itself. It is the availability of information and
energy that allows living systems to replicate, grow, and maintain their structures.
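
In practice the logarithm is often taken to base 2, so that information is measured in bits (I = -log2 P); the following sketch (our own, not from the text) reproduces the coin and playing-card comparison made above:

    from math import log2

    def information_bits(p):
        """Information gained when an outcome of probability p is revealed."""
        return -log2(p)

    print(f"coin toss (P = 1/2):           {information_bits(1 / 2):.2f} bits")
    print(f"card from a deck (P = 1/52):   {information_bits(1 / 52):.2f} bits")
    print(f"card known in advance (P = 1): {information_bits(1.0):.2f} bits")

A playing card (about 5.7 bits) carries more information than a coin toss (1 bit), and a card that is known in advance carries none.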
The chain of life begins with plants that possess information in their genetic
material on how to utilize the energy from the sun to make highly ordered complex
structures from the simple molecules available to them: principally water, carbon
dioxide, and an assortment of minerals. The process is, in essence, similar in human
beings and other animals. All the information required for the function of the
organism is contained in the intricate structure of DNA. The human DNA consists of
about a billion molecular units in a well-determined sequence. Utilizing the energy
obtained from the food that is consumed by the organism, the information in the DNA
guides the assembly of the various proteins and enzymes required for the functioning
of the organism.
What is the real importance of Shannon's information equation in biophysics? In
principle, it is possible to calculate the information content of a protein. The
requirements are, firstly, a statistical record of the frequency of the occurrence of the
individual amino acids in proteins. This will provide the probability (P) for the presence
of a given amino acid at a certain locus in the protein. Subsequently, the information
content of each monomer can be calculated and the information content of a whole
protein can be obtained by addition of the values of its monomers.
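
As a sketch of this procedure (with invented frequencies; a real calculation would use tabulated amino acid statistics), the information content of a short peptide is the sum of the per-residue values I = -log2 P:

    from math import log2

    # Hypothetical occurrence frequencies of a few amino acids (illustrative only)
    frequency = {"Leu": 0.10, "Ala": 0.08, "Gly": 0.07, "Trp": 0.01}

    def residue_information(aa):
        """Information content, in bits, of one residue of type aa."""
        return -log2(frequency[aa])

    peptide = ["Ala", "Gly", "Leu", "Trp"]
    total_bits = sum(residue_information(aa) for aa in peptide)
    print(f"information content of the peptide: {total_bits:.2f} bits")

Note that a rare residue such as tryptophan contributes far more information than a common one such as leucine.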
The information content of a nucleic acid can be obtained in the same way. One
mammalian DNA molecule consists on average of 15 000 pairs of nucleotides.
Assuming that the four possible nucleotide bases have an equal probability
of occurrence, the information content of each single nucleotide is 2 bits, and the
information capacity of one such DNA molecule, in this case, amounts to 30 000 bits.
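
The arithmetic behind the 30 000-bit figure is a one-liner (again our own illustration):

    from math import log2

    bases = 4                        # A, T, G, C, assumed equally probable
    pairs = 15_000                   # figure used in the text
    bits_per_position = log2(bases)  # = 2.0 bits; the complementary strand adds nothing new
    print(f"capacity: {bits_per_position * pairs:.0f} bits")  # 30000 bits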
The problems of this kind of calculation may be illustrated in the following
example: the information content of an English text can be calculated from the
frequency of the use of individual letters. In this way one can derive the information
content of a word, a sentence, and, consequently, even of this textbook. It is, however,
obvious that this parameter does not reveal anything about the "information value"
of the book as generally understood. The same information (I) would be given by any
other book of the same length, even one consisting of meaningless strings of English words.
This situation, however, does not invalidate Shannon's information concept.
Everybody knows how important the calculation of information is today in the field of
computer science. So, for example, for the author of this book it is quite important to
know which sort of discs he needs to store the text and the figures of this book.
Obviously, however, this is just a question of the volume of the data and of the codes
used, and in no case of its "information value" in the common sense.

Does this mean that it is impossible to quantify biologically important information?
Does it mean that the information concept is not applicable to biological systems at
all? In fact, there is no reason for scepticism. Rather, a distinction has to be
made between a syntactic measure of information and a semantic measure.
The syntactic information content of a DNA molecule, as calculated above,
provides some information on the maximum storage capacity of the genome. The
amount of information actually stored is even lower if the redundancy in the storage
of genetic information, which is required for the protection of that information, is taken
into account. Estimates of the information capacity of the genome vary between 3 × 10²
bits and 10¹² bits.
The semantic information, in contrast to the syntactic information, really should
contain some kind of validation of the content. Despite many attempts, quantification
of semantic information has not yet been achieved. In spite of the problems of giving
it an exact and meaningful quantification, information is doubtless a property of
matter and has its place in information theory and thermodynamics.

QUESTIONS TO BE PREPARED

• What is the difference between stochastic and deterministic events?
• What is the difference between isotropic and anisotropic systems?
• What is a thermodynamic system?
• What types of thermodynamic systems do we distinguish?
• What is meant by the term “dynamic system” and what is important in the
functioning of such a system?
• What is the difference between intensive and extensive parameters?
• What is the essence of entropy in thermodynamics?
• What is the essence of the first law of thermodynamics?
• In which thermodynamic systems can entropy be decreased, and under what
conditions can this happen?
• What is the essence of the second law of thermodynamics?
• What is the thermodynamic probability? What is its range?
• What is the mathematical probability? What is its range?
• Why can heat energy not be completely converted into other forms of energy?
• In what ways do living bodies use the chemical energy received from food?
• What is the difference between syntactic information and semantic information?
