The conservation of energy is the bedrock of our understanding of
nature. Conservation of energy, however, only tells us that in any
process the total energy remains constant, that is $E_{\rm total} = {\rm constant}$.
Energy conservation, however, does not specify what can enter and
leave the energy conservation equation, only that the sum of energy
entering and leaving a system must always be the same. Furthermore,
energy conservation does not tell us what are the various forms of
energy, and how energy transforms from one form into another.
From our study of waves, light, electric and magnetic fields and
so on, we found that energy plays a central role in all these
phenomena. However, only the specific study of the various phenomena
could unravel the specific forms that energy takes. The concept of energy
also does not tell us what fraction of the total energy is kinetic
and what fraction is electrical, chemical and so on.
A new concept, in addition to energy, is needed for a more complete
description of physical processes, in particular those involving the transformation of energy.
In all the examples we studied, energy could be completely
transformed from one form into another. For example, in our study
of magnetic fields, we found that we could fully convert mechanical energy
into electrical currents by Faraday's Law of Induction.
We may well ask: can all forms of energy be fully transformed from one form to another?
The answer is unexpectedly complicated. In some cases the answer is ``yes'', and in other
cases the answer is ``no''.
The concept of entropy addresses this question regarding the
transformation of the forms of energy, and tells us
how much energy can be converted from one form into another, and in particular,
how much energy is available for doing useful work.
Heat had defied all attempts to explain it using the concept of force,
and, in general using Newtonian mechanics. The fact that heat is a form of
energy presented a great
breakthrough in the understanding of heat,
and was fundamental to the emergence of energy as
one of the most fundamental ideas in physics.
It was, however, soon realized that heat was not an ordinary form of energy
such as gravitational or electrical energy. Rather, heat is a unique form of
energy in that only a certain fraction of heat energy can be transformed into
other forms of energy. It is for understanding the unique problems presented
by heat that new concepts such as entropy, which go beyond energy, were developed.
The idea of entropy
was introduced into physics in the nineteenth
century and is essential for understanding the phenomenon of
heat. Entropy is derived from the Greek word meaning ``transformation
content''. Famous names such as Sadi Carnot, R.J. Clausius, Count
Rumford, James Joule, Ludwig Boltzmann and so on contributed greatly to the ideas of
heat and entropy.
Entropy is a concept which does not answer all the
questions left unanswered by energy, but it does address a crucial
aspect of energy, namely how much energy can be transformed
from one form into another. In
particular, if a certain amount of heat energy is given, then
entropy tells us how much ``useful'' work can be extracted from
heat energy, in other words, what is the maximum possible efficiency
of a heat engine.
In general, since energy is conserved, one would think there is no
need to be concerned about recklessly wasting energy. But
we know from daily life that ``useful'' energy, or equivalently,
energy ``available'' for use, is a scarce resource. This intuitive
understanding that useful energy is a precious resource is
explained by the concept of entropy. Entropy is also
related to the concept of order and disorder, and a phase
transition from say water to gas is permissible only
if entropy increases in such a transformation.
For any system, entropy is a physically measurable quantity.
Furthermore, it is an experimentally observed fact that for all
processes in nature, the entropy of the total system can never decrease.
Note this empirical law concerning entropy is weaker than that of energy,
since it does not specify, for any physical process,
by how much entropy increases.
The concept of entropy
has many other applications and vast ramifications in other disciplines.
Only those chemical
processes and reactions for which the total entropy of the system does not
decrease are allowed by nature. All living entities must maintain their
low entropic state to stay alive. More abstract applications of entropy occur
in the physics of
black holes, and it was only in 1997 that the entropy of black holes
could be calculated from first principles using results from string theory.
The concept of entropy also has a key application in information
theory. It was shown by Claude Shannon in 1948 that the
information content of a message is determined by a suitable
application of the idea of entropy to the discipline of information science.
The current understanding of entropy is that it is a large scale
(macroscopic) manifestation of the atomic nature of matter.
For example, one cubic
centimeter of air contains about $2.7\times 10^{19}$ atoms. This
is an unimaginably large number of particles. Even if under
some circumstances we
can view these atoms as classical particles, it is futile to apply
Newton's laws to such a large collection of particles. Instead, the best we
can do is to acknowledge our
total ignorance and assume that all the atoms are moving
randomly, that is, the position and velocity of every atom is a random
variable. Furthermore, for a system of atoms that reaches a stable state,
called equilibrium (more on this concept later), every possible
velocity and position of the atoms in one cubic centimeter of air is
equally likely. We will show later that the empirical law that entropy
never decreases is simply a statement that for a system
consisting of a collection of atoms which is not in equilibrium, the collection
of atoms will always move towards a state that is
more probable, and will only reach equilibrium
on arriving at the most probable state available to the system.
It should be noted that the field of thermodynamics developed in
the eighteenth and nineteenth centuries without the concept of the
atomic composition of matter. It is remarkable that very general
results on how bulk matter behaves were obtained without any idea
of the microscopic composition of matter. From the point of view
of scientific methodology, the progress of thermodynamics shows
that there are concepts describing the large scale properties of
matter that have a logic of their own, and which later on are seen
to match on smoothly to a deeper and more complete understanding
of the same phenomenon.
We discuss thermodynamics, which is the branch of physics
dealing with heat, because this will give us a specific physical
system in which we can precisely define and discuss entropy and
related ideas. In particular, we will study the behaviour
of an ideal gas to illustrate the concepts of heat, energy and entropy.
Most of us have an intuitive understanding of heat, based on
having experienced hot and cold objects. The first thing
one must realize is that heat is different from temperature.
What do we mean by saying that an object is at temperature $T$?
Temperature is a physical property of all physical systems. From
the microscopic point of view, the temperature of a physical
object is a measure of the average kinetic energy of the atoms that make up that
object. If the atoms, on the average, are moving very fast, they have high kinetic
energy and hence the object has a high temperature, and vice versa
for slowly moving atoms. We can immediately see that since there is no
limit on how much momentum a particle can have, there is consequently no
upper limit on how high the temperature can be. The concept of a maximum
temperature does, however, appear in Big Bang cosmology, but it is so large
as to be practically infinite. On the other hand,
there must be a lower limit to temperature, which is reached when all of
the atoms that constitute an object are brought to the state of lowest possible
energy, called the ground state. It should be noted that,
unlike classical systems in which there is no longer any motion at
absolute zero, a quantum system at absolute zero temperature is in its
ground state, and continues to have quantum fluctuations.
Zeroth Law of Thermodynamics
Two objects are in thermal equilibrium if they are at the same temperature.
The zeroth law is the basis of the measurement of temperature, and of the
existence of thermometers. By bringing an object of known temperature into
contact with the object whose temperature is being measured, once thermal equilibrium
is reached one can conclude that both the temperatures are equal.
Any physical material whose behaviour under changes of temperature
is known can be used to measure temperature. For
example, if we know how a metal expands on being heated, or how the resistance of
a conductor changes with temperature, the metal or conductor can be used for measuring temperature.
There is an absolute scale of temperature for which the lowest
temperature is at a temperature called absolute zero. This scale is
called the Kelvin scale and denoted by K. Let $T$ be the temperature of the object in
the Kelvin scale, and $T_C$ be its temperature on the ordinary Celsius scale.
We then have the following:
$$T = T_C + 273.15$$
Hence, for example, $0^{\circ}$C corresponds to $273.15$ K.
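The Kelvin-Celsius relation can be checked numerically; the following Python sketch (function names are illustrative) assumes the standard offset of 273.15:

```python
ABSOLUTE_ZERO_C = -273.15  # absolute zero on the Celsius scale

def celsius_to_kelvin(t_c: float) -> float:
    """Convert a Celsius temperature to the Kelvin scale."""
    if t_c < ABSOLUTE_ZERO_C:
        raise ValueError("no temperature below absolute zero")
    return t_c + 273.15

def kelvin_to_celsius(t_k: float) -> float:
    """Convert a Kelvin temperature to the Celsius scale."""
    if t_k < 0:
        raise ValueError("no temperature below absolute zero")
    return t_k - 273.15
```

For instance, normal human body temperature, 37 degrees Celsius, corresponds to about 310 K.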
Temperature is a new dimensional quantity in addition to mass, length and time.
The unit of temperature is the kelvin, K, and this is also used to signify
the dimension of temperature.
The fact that it is much more difficult to cool than to heat an
object is evidenced by the fact that refrigerators were only made
in the twentieth century whereas heat furnaces and engines have
been in existence for many centuries.
The lowest temperature reached to date was achieved in 1992.
For every extra decimal point of cooling closer to absolute zero, new
experimental techniques are required, usually based on qualitatively new
physical properties of various phenomena. A table of temperatures found
in nature is given below.
Some typical temperatures of physical phenomena:
- Universe at the Big Bang
- Highest laboratory temperature
- Surface of the Sun
- Normal human temperature
- Freezing temperature of water
- Boiling of helium-3
- Lowest laboratory temperature
Every object, at temperature $T$, has an internal, thermal, energy
denoted by $U$, which can be thought of as being composed of the sum of the kinetic and
potential energy of all the atoms which constitute the object.
Heat, denoted by $Q$, is a form of energy, called thermal energy since it
is a form arising from temperature. Heat always refers to a process in which
thermal energy is exchanged between objects at different
temperatures. Heat is analogous to work in that both heat and work are
defined by a process, be it the
process of the flow of heat, or the process of a force acting over
some distance. Furthermore, neither heat nor
work is an intrinsic property of the system, and neither can be stored
in the system; what can be stored instead is energy, which is reflected
in the total energy of the system.
The dimension of heat is that of energy, and the unit of heat is the calorie, defined
to be the energy required to
heat 1 gm of water by $1^{\circ}$C. Since the joule (J) is also a unit of energy, we
have the following conversion:
$$1 \mbox{ cal} = 4.186 \mbox{ J}$$
If two objects having different temperatures
are brought into contact such that heat flows between them - called
thermal contact - then heat will flow from the
object at the higher temperature to that which is at lower
temperature, until both the objects reach a common temperature. The
final state is said to be an equilibrium state in which there are no
more macroscopic changes taking place.
A very important concept in thermodynamics is that of the
heat bath, which is taken to be a large reservoir maintained at some
constant temperature $T_B$. An example of a heat bath is a swimming pool
maintained at some constant temperature. When an object is brought
into thermal contact with the heat bath, it either gains heat if
it is cooler, or loses heat if it is hotter, than $T_B$; the
important point to note is that in these heat flows, the heat bath
is taken to be so big that its temperature does not change,
regardless of its exchange of heat with the object in question.
Heat flow is taken to be positive if energy flows into the object
and adds to the internal energy $U$, and is taken to be
negative if a part of the internal energy flows out of the system.
Let the system's temperature be $T$. Hence, if $T < T_B$, the heat that
flows is positive, and if $T > T_B$, then the heat flow is negative.
Figure 8.1 shows this flow of heat.
The specific heat of an object, say $c$, determines how much the
temperature of an object, of mass $m$, changes if it absorbs $\Delta Q$ amount of heat.
If the initial temperature of an object is $T_1$ and after absorbing $\Delta Q$
amount of heat its final temperature is $T_2$, then the definition of $c$ is given by
$$\Delta Q = m c (T_2 - T_1)$$
Heat flows of object and environment
Note heat has the dimension of energy, that is
$[\Delta Q] = ML^{2}T^{-2}$, and hence $c$ has dimension
$L^{2}T^{-2}K^{-1}$. Since $\Delta Q$ has units of J, the unit of $c$ is J/(kg$\,$K).
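The defining relation between heat absorbed, mass, specific heat and temperature change can be sketched as follows; the value for water, $c = 4186$ J/(kg$\,$K), is a standard figure assumed here for the example:

```python
C_WATER = 4186.0  # specific heat of water, J/(kg K) (standard value)

def heat_required(mass_kg: float, c: float, delta_t: float) -> float:
    """Heat (J) needed to change the temperature of mass_kg kilograms
    of a material with specific heat c by delta_t kelvin."""
    return mass_kg * c * delta_t
```

Heating 1 kg of water by 1 K thus takes about 4186 J, which is one kilocalorie.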
The specific heat of an object is a measure of how tightly bound together
are its atoms (or molecules). For an object made out of tightly
bound constituents, we expect $c$ to be large. A lot of heat has
to be supplied to it for increasing its temperature, since a larger
fraction of the energy would be stored in the form of potential
energy instead of the kinetic energy of moving constituents, and
vice versa for materials with loosely bound constituents.
Specific Heat of Some Materials at Room Temperature
So, for example, we can say that metals must have low specific heat since
they easily heat up, and this in turn implies that their electrons and
atoms are not very tightly bound.
Any gas is specified by its temperature $T$, the
volume $V$ it occupies and its pressure $P$. Pressure is defined to be the
force per unit area that the gas exerts on the walls which
enclose its volume. $P$, $V$, and $T$ are called state variables,
which specify the state of the gas. Pressure has units of N/m$^2$
and volume of m$^3$.
Note that the state variables
are intrinsic properties of the gas, and, unlike heat, do not depend on the
history of how the gas arrived at these values. For example, in
the phase diagram given above, the state of the gas at the volume $V$
and temperature $T$ shown does not depend on the curve
which resulted in the gas being at that point of the phase
diagram; another process, at a different pressure, leads to the
same values of $V$ and $T$, and it does not matter how the gas
arrived at that point of phase space.
An ideal gas is a gas that is so
dilute that its interactions can be completely neglected, and
each atom of the gas can be said to behave as a free particle. A number of
gases, including helium, oxygen, nitrogen and so on, do behave as ideal
gases at moderate pressures and at moderate temperatures, that is temperatures
that are far greater than their condensation temperatures.
In Section (4.2.2) a derivation is given of the equation of state for the ideal gas
based on the kinetic theory. The result is the following.
Let $N$ be the number of atoms of a gas in a volume $V$, and let
the constant $k_B$ be the famous Boltzmann's constant, given by
$$k_B = 1.38\times 10^{-23} \mbox{ J/K}$$
We then have the ideal gas law, given by
$$PV = N k_B T \qquad (8.7)$$
For a gas made out of a collection of molecules (composed of many atoms)
the formula above is changed slightly. The state of an ideal gas is given by (8.7).
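The ideal gas law lends itself to a quick numerical check. The sketch below (an illustration, not the text's own code) solves $PV = N k_B T$ for the pressure:

```python
K_B = 1.38e-23  # Boltzmann's constant, J/K

def pressure(n_atoms: float, temperature: float, volume: float) -> float:
    """Pressure (Pa) of n_atoms ideal-gas atoms at the given
    temperature (K) confined to the given volume (m^3)."""
    return n_atoms * K_B * temperature / volume
```

One mole of gas (about $6.02\times 10^{23}$ atoms) at 273 K in a volume of 22.4 litres ($0.0224$ m$^3$) gives roughly atmospheric pressure, about $10^5$ Pa.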
A phase diagram is one in which any two of its state variables,
such as volume and pressure, are plotted as independent variables.
A point in the phase diagram represents the state variables of the
gas. An implicit assumption in drawing a phase diagram is that
thermodynamic variables such as
temperature, volume and pressure are continuous variables. This
assumption is clearly reasonable, since, in an experiment, we can
continuously vary these parameters. The phase diagram for
an ideal gas at constant pressure yields a straight line of $V$ versus $T$ and is shown
in the phase diagram in Figure 8.2. Similarly, the phase diagram
shows how the pressure and volume are related for an ideal gas at different temperatures.
(8.7) is an amazing result. It states that equal volumes
of a gas, at a given pressure and temperature, have the same number
of atoms, regardless of the kind of atom it is made out of! And
this conclusion, as shown in Section (4.2.2), results
directly from the application of Newton's laws to a large collection of
atoms considered as classical particles.
Suppose we hold the temperature $T$ of the gas at a constant value, and vary
the pressure $P$. Let the initial values be $P_1$, $V_1$ and the
final values be $P_2$, $V_2$. We then have from (8.7) that
$$P_1 V_1 = P_2 V_2$$
A VT and a PV Phase Diagram of a gas
Similarly, let us hold the pressure as fixed, and vary the
temperature. Let the initial values be $V_1$ and $T_1$, and the
final values be $V_2$ and $T_2$. Then we have
$$\frac{V_1}{T_1} = \frac{V_2}{T_2}$$
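The two special cases of the ideal gas law above (constant temperature and constant pressure) can be sketched numerically; the function names are illustrative:

```python
def final_pressure_isothermal(p1: float, v1: float, v2: float) -> float:
    """Boyle's law at fixed temperature: P1 V1 = P2 V2, so P2 = P1 V1 / V2."""
    return p1 * v1 / v2

def final_volume_isobaric(v1: float, t1: float, t2: float) -> float:
    """Charles's law at fixed pressure: V1/T1 = V2/T2, so V2 = V1 T2 / T1."""
    return v1 * t2 / t1
```

Halving the volume at fixed temperature doubles the pressure; doubling the absolute temperature at fixed pressure doubles the volume.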
The internal energy $U$ of an ideal
gas is given by the following (the derivation is given in Section (4.2.2)):
$$U = \frac{3}{2} N k_B T$$
Let us try and understand the physics of the above formula. For
a typical gas, occupying a volume of say 1 litre, the number of particles is close
to Avogadro's number, which is given by
$$N_A \simeq 6.02\times 10^{23}$$
Hence, the number of particles in any
volume which we can directly perceive is truly unimaginably large.
The energy per particle is given by $U/N = \frac{3}{2} k_B T$; since the energy of the particles is purely
kinetic, we see that the result is in accordance with our earlier discussion
that temperature is a measure of the kinetic energy of an atom in a gas.
The kinetic theory of gases is based on assuming that the position and velocity
of the atoms are random. If the random velocity of an atom is $\vec{v}$,
then, as noted in Section (4.2.2), the kinetic energy $U$ of the
gas is simply the average kinetic energy of the atoms. Denoting the average value
of, say, $v^2$ by $\langle v^2\rangle$, we have, in three dimensional space,
$$\frac{1}{2} m \langle v^2 \rangle = \frac{3}{2} k_B T$$
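The relation between temperature and average kinetic energy gives a typical (root-mean-square) atomic speed, $v_{\rm rms} = \sqrt{3 k_B T/m}$. A sketch, taking the mass of a nitrogen molecule ($\approx 4.65\times 10^{-26}$ kg) as an assumed example value:

```python
import math

K_B = 1.38e-23  # Boltzmann's constant, J/K

def v_rms(temperature: float, mass: float) -> float:
    """Root-mean-square speed (m/s) of a particle of the given mass (kg)
    at the given temperature (K): (1/2) m <v^2> = (3/2) k_B T."""
    return math.sqrt(3.0 * K_B * temperature / mass)
```

At room temperature a nitrogen molecule in air moves, on average, at roughly 500 m/s.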
The main reason that, in the nineteenth century, physics shifted from the
idea of force to the idea of
energy was due to the need for explaining the phenomenon of heat. If we include
heat as a (rather unique) form of energy, the law of
energy conservation then goes under the name of the First Law of
Thermodynamics, which states the following:
$$\Delta U = \Delta Q + \Delta W$$
where $\Delta U$ is the change in the internal energy of the system, $\Delta Q$ is the
heat flowing into the system, and $\Delta W$ is the work done on the system.
Let us examine what form the First Law of Thermodynamics takes for
the case of an ideal gas. We would like to apply force on the gas
from outside, and determine how much energy and other properties of
the gas are changed. We would also like to determine how much work
we can extract from a gas. The term work is used for indicating mechanical
processes that involve moving around macroscopic
bodies, such as pistons and pulleys. The energy which is expended in doing such
work is called mechanical energy, to differentiate it from the energy
of the gas, which is called internal energy. Recall work results from a
process of a force acting over some distance.
The sign convention for work is that work done by
the gas is taken to be negative, whereas work done on
the gas is taken to be positive. In other words, a gas loses
energy if it does work, whereas it will gain energy if work is
done on it.
Suppose the gas is contained
in a frictionless piston of area $A$, and we hold the gas at a fixed
temperature by keeping it in contact with a heat bath at
temperature $T$; we do work on the gas by compressing its
volume from $V_1$ to $V_2 < V_1$, so that
$\Delta V = V_2 - V_1 < 0$. What is
the work that the piston does on the gas? Let the piston exert a
force $F$, and move through a distance $\Delta x$; the work done on
the gas is given, as expected, by
$$\Delta W = F \Delta x$$
However, recall that the pressure $P$ exerted by the gas is
defined to be force per unit area. Hence, we have $F = PA$,
and consequently, since the volume is compressed by
$\Delta V = -A\Delta x$, we have
$$\Delta W = PA\Delta x = -P\Delta V > 0$$
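The sign convention for work can be encoded directly, as in this small sketch (illustrative, not from the text): work done on the gas is $\Delta W = -P\,\Delta V$, positive for compression and negative for expansion.

```python
def work_done_on_gas(pressure: float, delta_v: float) -> float:
    """Work (J) done on the gas at the given pressure (Pa) for a small
    volume change delta_v (m^3); compression (delta_v < 0) gives
    positive work, expansion gives negative work."""
    return -pressure * delta_v
```

Compressing a gas by $10^{-4}$ m$^3$ at atmospheric pressure ($10^5$ Pa) does about 10 J of work on it.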
Suppose that heat of amount $\Delta Q$ flows into the
gas during compression. We have, from the
First Law of Thermodynamics (conservation of energy), the following:
$$\Delta U = \Delta Q + \Delta W = \Delta Q - P\Delta V$$
We will have to discuss a number of concepts before we can precisely define
what is the entropy, denoted by $S$, of a physical system. Before going
into the details of this discussion, we state the fundamental
result that we will arrive at, namely:
For an isolated system, its entropy S can never decrease.
The statement above tells us that all physical processes in which heat is
transformed into work and vice versa must satisfy the condition
that entropy must not decrease. In other words, if this condition
is not satisfied, the process cannot take place. For example, a
process in which there is a net decrease in entropy, that is, $\Delta S < 0$,
is forbidden, as it would imply, as shown in Sec 2.6.3, that a
perpetual motion machine could be constructed.
To define the concept of entropy
quantitatively, we need to first discuss what
is meant by a reversible and an irreversible process.
Consider a gas inside a frictionless piston in contact with a heat bath.
If we pull out the piston very slowly, in a number of very small steps,
the gas will expand at
constant temperature , and if we then push the piston back,
again slowly and in small steps, the gas will compress and return to its former
state. This process is reversible as shown in Figure 8.3.
Note $U$ is a function only of the state of the gas, and does not
depend on how work was done on, or by, the gas, or on how heat flowed
in or out of the gas. However, the heat $\Delta Q$ is process
dependent, and we need further analysis to show that the entropy $S$ is
a function of the state alone.
Energy conservation does not tell us, for example, how much work
can be transformed into heat, or whether all the heat in a body
can be made to do ``useful'' mechanical work. The information about
the direction in which energy can flow, and how much energy can be
transformed from say heat into work, and vice versa, is given by the concept of entropy.
The point to note is that for a reversible process, for each step taken, the system
is in equilibrium,
and that a small step in one direction, say causing compression, can be exactly
reversed by a small step in the opposite direction, causing expansion.
If we had suddenly pulled the
piston out, the temperature of the gas would have dropped, there would have been turbulent
gas motion, heat would have flowed into the gas from the reservoir and so on. The process
would then have become an irreversible one. No process in nature is fully
reversible, and for a process to be considered as reversible, we need to ignore all
effects such as friction, imperfect insulation and so on which would cause irreversibility.
Idealization of a thermodynamic process as being reversible is analogous to
Newton's idealization in stating his
first law, since complications which come from friction and other irreversible effects are
inessential in understanding the underlying principles of thermodynamics.
Consider a system in contact with a heat bath and held at constant
temperature $T$; for a very small change in the independent parameters of the system,
such as volume, in which $\Delta Q$ amount of heat flows into it
via a reversible process, the change in entropy of the system is given by
$$\Delta S = \frac{\Delta Q}{T} \qquad (8.21)$$
A Reversible Process
The dimension of entropy is that of energy divided by temperature, and the unit of entropy is J/K.
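The entropy change for a small reversible heat flow at fixed temperature can be sketched numerically (an illustration; the function name is made up):

```python
def entropy_change(delta_q: float, temperature: float) -> float:
    """Entropy change (J/K) when heat delta_q (J) flows reversibly
    into a system held at the given absolute temperature (K)."""
    if temperature <= 0:
        raise ValueError("absolute temperature must be positive")
    return delta_q / temperature
```

For instance, 300 J of heat flowing reversibly into a system at 300 K raises its entropy by 1 J/K; heat flowing out ($\Delta Q < 0$) lowers it.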
For energy conservation we consequently have
$$\Delta U = \Delta Q + \Delta W = T\Delta S - P\Delta V$$
The entropy of an ideal gas is given by
$$S = N k_B \left( \ln V + \frac{3}{2}\ln T \right) + \mbox{constant} \qquad (8.24)$$
The entropy per particle (ignoring a constant) is hence
$$\frac{S}{N} = k_B \left( \ln V + \frac{3}{2}\ln T \right)$$
We had assumed that for an ideal gas the interactions amongst the
particles could be ignored. We see that the formula above confirms this,
since the energy and entropy are both proportional to , which
comes from a simple sum over the individual gas atoms.
Although entropy is an intrinsic property of the system, it is
not an intrinsic property of an individual particle, but rather depends, for example, on the
total volume occupied by the gas. To know what a particle is doing, we need
to know both its position and velocity. We see that entropy
is, in a precise sense, our degree of ignorance about the particle's behaviour. For
example, entropy increases as we increase $V$, which reflects the
fact that the position of the particle becomes more uncertain, since it
can be anywhere in a larger volume $V$. Similarly, if we increase
temperature, entropy goes up since the larger the kinetic energy,
the more random the velocity of the particle, and hence the greater
the degree of our ignorance about the particle.
The most significant and remarkable feature of entropy is that,
although $\Delta Q$ is a quantity which is process dependent,
entropy - similar to energy -
is an intrinsic property of the system, and,
unlike heat, entropy can be stored. We hence have that, for an ideal gas,
both $U$ and $S$ are functions of $V$ and $T$ alone.
We examine what it means, for example, for energy and entropy to
depend on only $V$ and $T$.
Consider the $VT$-phase diagram. That is, consider $V$ and $T$ to be the
independent variables, and let pressure
be fixed by the ideal gas law.
For every point on this $VT$-diagram, both $U$ and $S$ have some definite value.
Suppose we take the ideal gas through any process, be it reversible or irreversible, along
path 1, as indicated in Figure 8.4, from values $(V_1, T_1)$ to some new values $(V_2, T_2)$.
(Note this path is not
a path in physical space, but rather a path in the phase space of $(V,T)$.) The system
will go through a series of states with varying $V$ and $T$, each having its
own energy and entropy. The final change in energy and entropy is given by
$$\Delta U = U(V_2,T_2) - U(V_1,T_1) \; ; \quad \Delta S = S(V_2,T_2) - S(V_1,T_1) \qquad (8.25)$$
Phase diagram of VT with Paths 1 and 2
Now let us take the ideal gas through a different and arbitrary process, following
say path 2, also given in Figure 8.4. Then $\Delta U$ and $\Delta S$ will
have the same values as given in (8.25)! In other words, as mentioned
before, the values of $U$ and $S$ depend only on $(V,T)$, and not on how the system
arrived at these values. The change in the energy and entropy for a reversible process does
not depend on the path taken; rather, it only depends on the
initial and final states.
If one goes through a closed path following a reversible process,
it follows from the above that the total change in the entropy of the
system is zero, that is, it is restored to its original state. However, note
that in this reversible process, a net amount of heat may have been expended.
Recall entropy depends only on the state of the gas. Hence, if heat is supplied to the
gas through a reversible process, the total heat expended
can be computed from the change in entropy caused by the heat flows.
Consider going around, at room temperature $T$, in a circle in physical space. Since we always
have to overcome friction for movement, a certain amount of
work $\Delta W$ would be necessary to overcome friction. From energy conservation,
this work would generate heat and lead to an increase of entropy
$\Delta S = \Delta W/T$. Hence, in going around a closed
path, we have not come back to the original state, since we have had heating
due to friction, something that we can no longer reverse.
In general, if we follow an irreversible process in completing a complete
cycle, the entropy always increases. That is,
$$\Delta S > 0$$
Reversible process from initial state to final state
How would one determine the change in entropy for an
irreversible process? Since the formula we have for entropy
given in eq.(8.21) is valid only for reversible
processes, we cannot use it. We instead adopt the following
approach. The irreversible process starts from some initial state of the
system and ends up in a final state. We construct an
equivalent reversible process which
also goes from the same initial to the same final state. Since
entropy does not depend on how its values are attained, we can now
compute the change of entropy for the equivalent reversible
process, which will then be equal to the change in entropy for the
irreversible process in question.
To illustrate how to calculate the change in entropy in an irreversible
process, consider the irreversible free expansion of
an ideal gas from an initial volume $V_1$ into a vacuum such that its final
volume is $V_2$. Suppose that the gas is thermally insulated from the
environment, and at an initial temperature of $T$.
Since the gas expands into a vacuum it does no work, that is $\Delta W = 0$;
furthermore, since it is thermally insulated there is no
flow of heat, and hence $\Delta Q = 0$. The first law implies that
$$\Delta U = \Delta Q + \Delta W = 0$$
An Irreversible Process
Since total energy does not change for an ideal gas, this
implies that the initial and final temperature are equal.
So the initial state of the gas is $(V_1, T)$, and the final state of
the gas is $(V_2, T)$. We would like to replace the irreversible free
expansion of the ideal gas with an equivalent reversible process.
Since the initial and final temperatures are the same, we can
replace the free expansion with an (isothermal) expansion of the gas at
constant temperature, from volume $V_1$ to $V_2$. To see that
expansion at constant temperature is a reversible process,
consider a gas in a piston and in contact with a heat bath at temperature
$T$. If one pulls the piston very slowly, heat will flow in to
keep the temperature at $T$; on the other hand, if one slowly
compresses the piston, the gas will be compressed, and heat will
flow back into the reservoir. So we can see that an isothermal
expansion is reversible.
All we need to do now is to compute the difference between the entropy of an ideal
gas in the initial state $(V_1, T)$ and the final state $(V_2, T)$.
We hence have the total change in the entropy as given, from (8.24), by
$$\Delta S = N k_B \ln\left(\frac{V_2}{V_1}\right) > 0$$
The result obtained shows that entropy increases in the free expansion of a
gas, the precise amount being given by the formula above.
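The entropy increase in free expansion can be evaluated numerically; the sketch below (illustrative names) uses the ideal-gas result $\Delta S = N k_B \ln(V_2/V_1)$:

```python
import math

K_B = 1.38e-23  # Boltzmann's constant, J/K

def free_expansion_entropy(n_atoms: float, v1: float, v2: float) -> float:
    """Entropy change (J/K) when n_atoms ideal-gas atoms expand
    freely from volume v1 to volume v2 at fixed temperature."""
    return n_atoms * K_B * math.log(v2 / v1)
```

Doubling the volume of one mole of gas ($N \approx 6.02\times 10^{23}$) increases the entropy by $N k_B \ln 2 \approx 5.8$ J/K, independent of the temperature.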