There are many formulations of the Second Law of Thermodynamics,
and they are all, in essence, statements regarding the allowed changes for the
entropy of any physical system.
The second law is a statement that all processes go only in one
direction, which is the direction of greater and greater
degradation of energy, in other words, to a state of higher and higher entropy.
For example, when we stir a cup of tea,
the smooth and swirling motion that we make with a spoon soon
disappears. The swirling motion is converted, by conservation of energy, into
a very tiny increase in the temperature of
the tea. However, no matter how long we wait, the still tea in the cup will
never suddenly start to swirl accompanied by a tiny drop in its temperature.
Similarly, if a glass shatters, no amount of waiting will ever see
the glass suddenly re-assemble itself, although the intact glass
and the shattered glass, up to some minute differences, have the same energy.
There are endless similar examples from
daily life that one can quote.
Since in all the examples, energy and momentum are
conserved, clearly it is not these considerations that are
responsible for events in time not being able to reverse
themselves. The question is the following: why are certain
physical phenomena, allowed by conservation laws such as that of energy, nevertheless
forbidden from occurring?
The second law of thermodynamics is the underlying reason that
unlikely events do not occur. Entropy is a measure of the likelihood of an event
occurring, and only those events can occur for which entropy
increases, since they are more likely. In other words, an isolated system
always goes from a less probable to a more probable configuration.
We hence have the following statement of the second law.
I. In any physical process, the entropy S of an isolated system
never decreases; that is, we have ΔS ≥ 0.
Unlikely as it may sound, the second law is one of the few fundamental laws of physics
that arose from very practical questions, in particular the need to understand the theory
of heat engines. Carnot analyzed
how much mechanical work could be extracted from heat, and what, in
principle, is the most efficient heat engine that one could construct. His
analysis was the beginning of the concept of entropy. It was only much later, in the
work of Boltzmann, that there emerged a microscopic and more
fundamental understanding of the principle of entropy.
Based on the considerations of heat and work, we have a few other
formulations of the second law.
II. No mechanical work can be extracted from an isolated system at a single temperature.
III. Heat cannot spontaneously flow from a cold body to a hot body.
Although these formulations may seem to be a far cry from Statement I
of the second law, they will be shown to be equivalent to it.
Since only changes in entropy are defined, it was thought that there
was an additive constant which would always be arbitrary. However, it was
realized that this was not so.
The Third Law of Thermodynamics
As temperature tends to absolute zero, so does entropy; in other words, S → 0 as T → 0.
Heat engines are ubiquitous, and are essential for the functioning
of society. All automobiles are powered by heat engines, and so
are power generators, ships, industrial prime movers and so on.
Energy from
advanced sources such as nuclear reactors is ultimately used for
driving heat engines:
the reactor heats water to create steam, which in turn drives great turbines to
generate electricity. The reverse of the heat engine is the refrigerator and
the air conditioner, something equally essential for daily life.
So no matter how high tech present day society becomes, the humble heat engine
which powered the industrial revolution in England in the
eighteenth century continues to be of importance, and definitely
deserves to be studied.
One may think that heat engines and the like
should be studied by engineers, and not in a course on
science. In general, this observation is correct if one is interested
in actually making a heat engine for a particular purpose. However, the
reason we are interested in heat engines is not for making a
better heat engine, but rather to examine only that part of the
theory of heat engines that says something fundamental and
universal about matter in general. And surprisingly enough, our
first and most concrete understanding of entropy comes from the
study of heat engines.
What is a heat engine? In essence, it is a device which converts heat
energy into mechanical work, like moving pistons, levers and so on.
The boiler contains some substance which is kept at a high
temperature by supplying it with energy obtained by converting other
(chemical) forms of energy into heat energy - either by burning wood, or
fossil fuels and so on. Once we have the substance
at a high temperature, the heat engine extracts mechanical work from it.
To make the discussion more precise, suppose the substance we heat
is a given amount of an ideal gas. The central question for a heat
engine is its efficiency, namely, how much work can we extract for
a given amount of heat. To determine efficiency we have to make
the heat engine go one complete cycle, namely, starting from some initial
heat, doing mechanical work, and then going through a process that
restores the gas back to its initial state.
Once the heat engine has completed one cycle and
is back to its original state, we can ask the following question:
In one cycle, for a given amount of heat, say , supplied to the engine, how much
work did we obtain from the heat engine? In other words, the efficiency
of the heat engine is defined by
Phase Diagram of a Reversible Cycle
We study an idealized heat engine which is reversible, that is, all
the steps in the heat engine's workings are achieved through
reversible thermodynamical processes, and hence the engine can as easily be
run forward as backward.
We will later show that a reversible heat engine
is the most efficient heat engine that can exist.
The idea of a reversible engine can be compared to a water
mill in which no potential energy of the water is wasted. The
reversible heat engine is similar to an idealized case
of frictionless motion, as the essential features of say Newton's
Laws are seen more clearly by neglecting inessential complications
introduced by the presence of friction.
Consider an ideal gas contained in a frictionless
piston. We put the gas on a large heating pad
held at some constant temperature (this is our boiler).
A heat engine, according to Statement II of the second law, has to
operate between heat reservoirs at two temperatures, say the
boiler at temperature T_1, and the environment at temperature
T_2. The heat engine takes in heat energy of amount Q_1 from the boiler,
does work of amount W, and discharges heat energy Q_2 to the
environment, and is schematically shown in Figure 9.2.
From the first law (conservation of energy), we have W = Q_1 - Q_2.
Reversible Heat Engine
A reversible engine is one for which the total change of entropy in one complete cycle
is zero. Namely, in one complete cycle, the total change in the entropy of the boiler and of
the environment is zero. Hence, for every cycle, ΔS_boiler + ΔS_environment = 0.
In one cycle, what is the change of entropy of the boiler? Since it loses heat
of amount Q_1 at fixed temperature T_1 in a reversible process, ΔS_boiler = -Q_1/T_1.
Note the change is negative, since the boiler loses
entropy. On the other hand, the heat engine discharges, again in a
reversible process, heat energy Q_2 at constant temperature T_2
to the environment, which consequently gains entropy of amount ΔS_environment = Q_2/T_2.
Hence, from (9.6), we have Q_1/T_1 = Q_2/T_2.
From (9.3) and (9.4), the efficiency of a reversible heat engine
is given by η = W/Q_1 = 1 - Q_2/Q_1 = 1 - T_2/T_1.
Eqn.(9.12) above is the great result of Carnot, and the
limit that he set for the most efficient possible heat engine has
not yet been exceeded.
Note the second law is automatically
fulfilled by eqn.(9.12), since if there is only one temperature, namely
T_1 = T_2, then, as expected, the efficiency of the heat engine is zero. That
is, no work can be extracted from a system at one temperature.
In summary, the maximum
efficiency of any heat engine operating between temperatures T_1 and T_2
is given by η_max = 1 - T_2/T_1.
This result of Carnot is a result of immense simplicity as well as of great
generality, and a number of things should be noted. Nowhere has any specific property
of an ideal gas been used. In fact, the result is independent of
the substance used to drive the heat engine, be it a gas, or
water, or alcohol and so on. Even though more than a century and a half has
elapsed since Carnot's discovery, to date no one
has succeeded in making a device that goes beyond Carnot's predicted upper
limit for the efficiency of a heat engine.
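As a rough numerical sketch of Carnot's result and the accompanying entropy bookkeeping (the temperatures and heat values below are illustrative assumptions, not taken from the text):

```python
import math  # not strictly needed here, but handy for extensions


def carnot_efficiency(t_hot, t_cold):
    """Maximum efficiency eta = 1 - T2/T1 of an engine between two temperatures (kelvin)."""
    return 1.0 - t_cold / t_hot


# A boiler at 500 K discharging to an environment at 300 K:
eta = carnot_efficiency(500.0, 300.0)  # at most 40% of Q1 can become work

# Energy and entropy bookkeeping for one reversible cycle with Q1 = 1000 J:
q1 = 1000.0
w = eta * q1                 # work extracted
q2 = q1 - w                  # heat discharged (first law: W = Q1 - Q2)
# Boiler loses entropy Q1/T1; environment gains Q2/T2; for a reversible
# cycle the two exactly cancel:
ds_total = -q1 / 500.0 + q2 / 300.0
```

The vanishing of `ds_total` is precisely the reversibility condition Q_1/T_1 = Q_2/T_2 used in the derivation above.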
[Table: Efficiencies of Some Heat Engines, comparing the ideal Carnot limit with actual engines such as the gasoline engine.]
A refrigerator takes heat from a cold body and delivers the heat to a
body at a higher temperature. This process can clearly never happen spontaneously, since
this would imply that heat can spontaneously flow from a cold body to a hot body, something
that has never been observed, and which would violate the Second Law. An object which is cooler
than its environment is in a state of low entropy compared to the environment. Hence, to keep a
body cooler than its
ambient environment we have to constantly do work. We all know this is the case
from daily experience: if one switches off the air conditioner, the room soon warms up.
Note a refrigerator is in essence similar to a living cell. The
reason is that the cell maintains itself in a low entropy state
compared to its environment by constantly doing work; hence
the need for a living organism to regularly consume food.
The most efficient refrigerator is, as one can guess, a reversible one, in which all
the processes taking place are reversible, and which leads to no increase in the entropy
of the whole system. Suppose the cold reservoir C is at temperature T_2, from which the
refrigerator extracts heat of amount Q_2, and by doing work W
discharges heat Q_1 to the environment E at temperature T_1. From reversibility, as in (9.9), we have Q_1/T_1 = Q_2/T_2.
Heat flows for a refrigerator
Since the heat Q_1 delivered to the environment is greater than Q_2, we necessarily have
to do work on the system to generate the extra heat required by the Second Law.
Let the work done on the system be W; hence we
have from energy conservation that Q_1 = Q_2 + W.
In other words, to keep cooling the refrigerator, we take heat Q_2
from the refrigerator, add to it the heat equivalent of the work W done on
it, and then discharge heat
of amount Q_1 = Q_2 + W to the environment - which is the minimum heat that
is required by the Second Law. Since the refrigerator is reversible,
the work W that we do is the minimum amount of work required for achieving this cooling.
Similar to a heat engine, the efficiency of a
refrigerator, called κ, the coefficient of performance, is given by
the amount of heat extracted per unit amount of work. That is, κ = Q_2/W.
We consequently have, from the reversibility condition above, W = Q_1 - Q_2 = Q_2 (T_1 - T_2)/T_2.
Hence, for a reversible refrigerator, its efficiency is given by κ = T_2/(T_1 - T_2).
Unlike the efficiency of a heat engine, where η < 1, we have that
the coefficient of
performance of a refrigerator can exceed 1. In a household refrigerator
and in an air conditioner, κ is typically of the order of a few.
The reason that κ cannot be made infinitely large is a
consequence of the second law, since this case would imply that
heat would then flow from a cold body to a hot body without any
work being done.
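A minimal sketch of the reversible-refrigerator relations, κ = Q_2/W = T_2/(T_1 - T_2); the temperatures and heat amounts are illustrative assumptions, not values from the text:

```python
def coefficient_of_performance(t_hot, t_cold):
    """COP of a reversible refrigerator working between an environment at
    t_hot and a cold reservoir at t_cold (both in kelvin)."""
    return t_cold / (t_hot - t_cold)


# A cold box at 250 K in a room at 300 K:
kappa = coefficient_of_performance(300.0, 250.0)  # each joule of work moves 5 J of heat

# First-law check: the heat delivered to the environment is Q1 = Q2 + W.
q2 = 100.0          # heat extracted from the cold reservoir
w = q2 / kappa      # minimum work required, since the refrigerator is reversible
q1 = q2 + w
```

Note how κ grows without bound as T_2 approaches T_1: moving heat across a vanishing temperature difference costs vanishing work, which is consistent with, not in violation of, the second law.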
All irreversible heat engines are less efficient than reversible ones.
In turn, the most efficient heat engine is a reversible one.
The way to prove this is to assume the contrary. In other words,
let a heat engine X have a higher efficiency than a reversible one; that is, η_X > η.
We couple this heat engine to a reversible refrigerator. Both the
machines work between a reservoir at temperature T_1 and another
reservoir at temperature T_2.
Engine X extracts work W by an inflow of heat Q_1' from the
boiler at temperature T_1 and discharges heat Q_2' into a reservoir
at temperature T_2. The reversible
refrigerator extracts heat Q_2 from the reservoir at temperature
T_2, does W amount of work (that it obtained from the heat engine X), and
delivers heat Q_1 to the boiler at temperature T_1.
From (9.20), we have W/Q_1' > W/Q_1, that is, Q_1' < Q_1.
Engine X coupled to a Reversible Refrigerator
The last equation simply says that the less efficient engine
requires more heat to do the same amount of work as the more
efficient one.
Energy conservation is required both for the heat engine as
well as for the reversible refrigerator. Hence, using the fact
that the work W extracted from the heat engine is then used
to drive the refrigerator, we have, from (9.4), Q_1' - Q_2' = W = Q_1 - Q_2,
and hence Q_1 - Q_1' = Q_2 - Q_2' > 0,
where we have used (9.22) for obtaining the inequality.
Hence, if we consider the system as a whole, we now have a refrigerator
which extracts heat Q_2 - Q_2' from the reservoir at temperature T_2 and delivers heat
Q_1 - Q_1' to the
boiler at the higher temperature T_1 with no work being done!
Since this contradicts the second law, we can see that the
assumption made in (9.20), that heat engine X is more efficient than a reversible
heat engine, is incorrect.
We have seen that the second law is fundamental in deciding
whether a physical process is allowed or not. In other words,
only those transformations of energy are allowed, for which
there is no change or a net increase in entropy. Entropy for an
isolated system can never spontaneously decrease. Or equivalently,
all spontaneous processes lead to a net increase in entropy.
So far we have considered only changes in the entropy of the
system, and have neglected to take into account changes in the entropy
of the heat bath. Define the total system as the system together with the heat bath.
A physical, chemical or biological process can take place only if
the entropy of the total system increases. To account for
the entropy of the total system, we define a new concept called
free energy, and denote it by F.
Consider a system with energy E, entropy S and volume V.
Using the gas law, we can express both pressure and
temperature in terms of E and V. In effect, we then have only two independent variables, say S and V.
Recall energy conservation for this system was written in (8.23) as dE = T dS - p dV.
Note that the equation above is written in a manner which is
suitable if S and V are the independent variables, since in
such a case the small variations in S and V are independent. However, entropy
is usually difficult to control and vary; instead, it is better to
consider the independent variables to be the volume V and temperature T.
In such a case, we would like to consider a different function of
the independent variables, the analog of energy E, which would
result from the independent variation of V and T.
By definition, the change in free energy results from the independent variations
of V and T. Hence, the change in the free energy of the system is defined by
dF = -p dV - S dT.
The finite form of the above equation is given by F = E - TS,
since dF = dE - T dS - S dT, and,
using the energy equation given in (8.23), we have
dF = (T dS - p dV) - T dS - S dT = -p dV - S dT.
Note F = E - TS; that is, free energy is a function only of the state variables.
That is, similar to energy and entropy, free energy depends only on the
state of the system, and not how that state was arrived at.
Due to the negative signs in dF above, free energy decreases for all allowed processes.
For all physical processes, we must always have ΔF ≤ 0.
If we consider the system as having energy E and entropy S, then equation
(9.33) above can be understood as the trade-off between a
system trying to minimize its change of energy ΔE, while at the same time
trying to maximize its change in entropy ΔS. If we consider the system
together with the heat bath to be the universe, then the principle that ΔF ≤ 0
can also be viewed as simply expressing the Second Law that the
total entropy of the entire universe must always increase. We show
this to be the case with the derivation below.
Consider a process, at constant temperature T, which causes a small change in the free energy
of the system, namely ΔF = ΔE - T ΔS.
The second term contains ΔS, the change in the entropy of the
system. Recall that the very fact the system has a temperature means that
it is not isolated; rather, it is in contact with a heat bath at temperature T.
The energy exchange with the heat bath is given by ΔE, and -ΔE/T is the
consequent change in the entropy of
the heat bath! Hence ΔF < 0 is equivalent to ΔS - ΔE/T > 0, which is
just the statement that the total entropy increases. As long as the
free energy decreases, the entropy of the total system
increases. Hence, even if there is a process in which the entropy
of the system decreases, the process will still be allowed
as long as the entropy of the total system increases.
In other words, for all physically allowed processes, the free
energy of the system must decrease. This really is just another
formulation of the second law.
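The equivalence between a decreasing free energy and an increasing total entropy can be checked with a small numerical sketch; the values of ΔE, T and ΔS below are illustrative assumptions, not taken from the text:

```python
def delta_free_energy(delta_e, temperature, delta_s):
    """Change in free energy at constant temperature: dF = dE - T dS."""
    return delta_e - temperature * delta_s


def total_entropy_change(delta_e, temperature, delta_s):
    """System entropy change plus heat-bath entropy change -dE/T."""
    return delta_s - delta_e / temperature


# A process in which the system's entropy *decreases* (dS = -1 J/K) can still
# occur, provided the system dumps enough energy into the bath
# (dE = -400 J at T = 300 K):
df = delta_free_energy(-400.0, 300.0, -1.0)          # negative: allowed
ds_univ = total_entropy_change(-400.0, 300.0, -1.0)  # positive: total entropy grows
```

Note that `ds_univ` is just `-df / T`, so the sign of one fixes the sign of the other: a falling free energy and a rising total entropy are the same statement.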
Eq. (9.28) is one of the most fundamental equations in physics, chemistry and biology.
Only those chemical reactions which lower the net free energy are
possible. In biology, (9.28) forms the basis of how life organizes and maintains itself.
A living cell is a highly structured system, having low entropy.
The cell has to carry out chemical reactions which constantly lower
its entropy, since random thermal motion is always at work creating
entropy inside the cell. The only way that a cell can lower its entropy is by increasing
the entropy of the environment, so that net entropy of the universe always increases.
As shown in (9.35), the cell can lower its entropy if at the same time
it manages to lower its free energy, that is, to engineer
ΔF < 0. Hence, if
ΔS < 0 for the cell, the resulting increase in free
energy of the cell must be compensated by the cell losing energy to
the environment (recall our sign convention is that energy lost by the
system has a negative sign, so ΔE < 0), resulting in the environment gaining entropy
-ΔE/T > 0. The net effect of this whole process
is the lowering of free energy for the cell. That is, ΔF = ΔE - T ΔS < 0.
The energy that is spent in mechanical work by a biological
entity is replaced by the food taken. But even more importantly, a
biological entity is usually at a higher temperature than the
environment, and constantly loses heat energy to it, as this is the
way the cell imparts high entropy to the environment.
The heat energy that the cell loses to the environment is
supplied to it by the food it eats. Hence we eat food, which has low entropy,
not only to regain lost energy, but rather to develop negative entropy by
ingesting low entropy food and then disposing of high entropy to the environment.
By losing heat energy we in effect lower our entropy. If
the cell fails to obtain food, its entropy increases indefinitely,
leading to the destruction of its highly ordered low entropy
structures, and hence to cell (or organism) death. Hence we have the
paradox that we ingest energy to lose energy!
Ordered states in physics such as superconductors and superfluids
are usually possible at very low temperatures compared to the
environment (although the appearance of high temperature
superconductors has led to hopes to the contrary). To maintain
these highly ordered states, their low temperatures have to be
maintained, which in turn needs constant expenditure of energy.
Hence living systems are not very different from other ordered
states in nature that need constant expenditure of energy to
maintain their ordered low entropic state.
Entropy has an interesting application in evolution as well. Some
people have argued that the emergence of life contradicts the
second law since a highly organized state, with very low entropy,
emerges from a high entropy environment. This argument is
incorrect, since a living entity is not an isolated system; if one
takes into account the total entropy of the earth and Sun (from
which all radiation comes), then it can easily be shown that the total
entropy of the living entity, together with the earth and Sun, always increases.
We have alluded to the microscopic understanding of entropy a
number of times. Basing ourselves on the kinetic theory of gases,
we have assumed that matter is composed of atoms which are
moving randomly. How do we derive the properties of entropy from
the kinetic theory of gases? What does the picture of matter -
that it is made out of atoms that perpetually occupy random
positions and move with random velocities - tell us about entropy?
Instead of tackling this problem head-on, let us discuss a simpler
problem. Suppose we have two coins, each with equal likelihood of coming
up heads (H) or tails (T). Let Ω be equal to the number of possible
ways an outcome can occur. For the example of our two coins, the
possible outcomes are HH, HT, TH and TT, so Ω = 4.
One can easily generalize this example to N fair coins.
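The counting in the two-coin example can be done by brute-force enumeration, which also shows how Ω grows for N coins:

```python
from itertools import product

# All outcomes of tossing two fair coins: HH, HT, TH, TT.
outcomes = list(product("HT", repeat=2))
omega_total = len(outcomes)  # Omega = 4 equally likely outcomes

# Number of ways to get exactly one tail (both HT and TH count):
omega_one_tail = sum(1 for o in outcomes if o.count("T") == 1)

# For N coins the total number of outcomes is 2**N:
omega_n_coins = len(list(product("HT", repeat=5)))  # 2**5 = 32 for N = 5
```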
What does coin tossing have to do with atoms? The analogy
is that the atoms are like the coins that we toss; just like we
can throw coins and end up with a specific configuration - say an
equal number of heads and tails - we can similarly think of
``throwing'' atoms into a volume V, and they end up with some
specific positions and velocities. We can run through all the
possible configurations allowed for the atoms by repeatedly
throwing the atoms. The random nature of the coin is that every
time we throw it, we do not know what the outcome will be. Similar
to the coins, the random positions and velocities of the atoms
imply that every time we ``throw'' the atoms, the outcome is unknown.
Entropy is simply defined to be proportional to the natural
logarithm of the number of ways Ω a certain configuration can occur.
In other words, S = k ln Ω, where k is Boltzmann's constant.
The above equation is fundamental to physics, and is valid in
classical physics, general relativity, quantum theory and string
theory. The calculation of the entropy of a black hole using
string theory starts from the above equation. In information
theory, this equation is one of the most fundamental
equations. Boltzmann, with a profound premonition of the
future, considered the above equation so important that it is
written as the epitaph on his tombstone.
For the case of atoms in a gas, we assume - as is the case for
the kinetic theory of gases - that we have absolutely no knowledge of
what the atoms are doing, and in effect we assume that, for the
atoms, all possible configurations (outcomes) are equally likely.
In other words, it is assumed that for the atoms in the gas, all possible
positions (inside volume V) and velocities are equally likely.
Hence, the probability for a given configuration - with, say, a given
energy, volume, pressure - to occur, is given, as
in (4.11), by P = 1/Ω.
We see from above that entropy is proportional to
the logarithm of the probability for the given configuration
occurring for the gas.
Now, in practice, we are not interested in knowing the state of
every single atom. Rather, we would like to know, for example,
what is the total energy of the system, what is its pressure, and
so on. The analogy for this in tossing coins is that we want
only to know, for example, the total number of tails which
come up in a toss, not the sequence in which they come up. Hence, to
obtain one head and one tail in two throws, both the outcomes HT and TH will
contribute. Suppose we toss N coins, and let the total number of
tails be r (with the number of heads obviously being N - r). We can now
ask, regardless of what sequence the heads and tails appear in, what
is the number of ways that r tails will occur, denoted by Ω(r)? The
answer is given - for a fair coin - by the binomial theorem, as in (4.4):
Ω(r) = N!/(r!(N - r)!).
For the system of a collection of N fair coins, the entropy for, say,
obtaining r tails, as in Figure 9.5, is given by S(r) = k ln Ω(r).
Entropy S vs state denoted by r
Note that the additive constant in the entropy does not depend on the
configuration being considered, namely on r, and can be ignored. The constant is
fixed physically by the Third
Law of Thermodynamics, which states that S → 0 as T → 0.
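The entropy S(r) of the coin system can be computed directly; the sketch below (with an illustrative N = 100) shows that it peaks at an equal number of heads and tails and vanishes for the perfectly ordered states r = 0 and r = N:

```python
import math


def entropy(n, r, k=1.0):
    """S = k ln Omega(r), with Omega(r) = C(n, r), dropping the
    configuration-independent additive constant."""
    return k * math.log(math.comb(n, r))


n = 100
s_values = [entropy(n, r) for r in range(n + 1)]

# The most likely (maximum-entropy) state has r = n/2 tails:
most_likely_r = max(range(n + 1), key=lambda r: s_values[r])

# The fully ordered states (all heads or all tails) have Omega = 1, S = 0:
s_ordered = entropy(n, 0)
```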
The generalization to atoms is straightforward. Suppose we want to
know the entropy of a gas for a given temperature T and volume V.
The N atoms have a large number of ways Ω of arranging themselves to have
the specified temperature and volume. We hence have the entropy given by
S = k ln Ω = Nk ln V + (3/2) Nk ln T + constant,
where we have used (8.24) to obtain the last line.
We hence have for an ideal gas, ignoring a multiplicative constant in Ω,
S = Nk (ln V + (3/2) ln T).
One can easily see that for even a small collection of
atoms in a volume of, say, a litre, the value of Ω is astronomically large.
The result for Ω can be understood as follows. Since an atom can
be anywhere in the volume, the uncertainty in the position of each
atom is given by the volume V. Since there are N atoms, the total
uncertainty is V^N. The randomness in the velocity
of each particle
is given by the thermal spread, which grows with √T, and this is what
determines the temperature dependence of Ω.
We are not done yet. The second law states that the system
will always move towards a state such that its entropy increases
(or at least stays constant). How do we interpret this statement
from the microscopic point of view? Increasing S simply means
moving towards the state which has a greater Ω. As we have
already discussed, since every configuration
for the atoms has been assumed to be equally likely,
an increase in Ω means a more likely state. In other
words, the statement that for all spontaneous changes the entropy always
increases, or equivalently that the value of Ω for the new configuration is
always greater, is simply a
statement that the system moves to a configuration that is more likely.
Equilibrium is reached when entropy reaches its maximum value, and
this means that the system is in its most likely state.
Using our example of coins, suppose the N coins are flipped
randomly one at a time, and the subsequent change in the coins is
accepted only if it obeys the Second Law, that is, if it is more likely. Any change in the system
of coins will mean that it will move towards that state which is
more and more likely. The evolution of arbitrary states towards
equilibrium is shown in Figure 9.6.
Once it arrives at the most likely state, it will no longer
change, and hence will be said to have arrived at equilibrium.
From Figure 9.6 it can be seen that for the case of coins, the most
likely state is that of an equal number of heads and tails,
which has the maximum value of Ω.
This state of an equal number of heads and tails will be
its equilibrium state.
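The relaxation towards equilibrium described above can be mimicked with a toy simulation (an illustrative sketch, with N = 100 coins, not a figure from the text): start from the fully ordered state of all heads, flip one randomly chosen coin at a time, and accept the change only if it does not make the configuration less likely.

```python
import math
import random

random.seed(1)
n = 100
r = 0  # number of tails; r = 0 is the fully ordered initial state (all heads)

for _ in range(5000):
    # Flipping a uniformly chosen coin turns a head into a tail with
    # probability (n - r)/n, and a tail into a head otherwise.
    r_new = r + 1 if random.random() < (n - r) / n else r - 1
    # Accept only moves that do not decrease Omega(r) = C(n, r):
    if 0 <= r_new <= n and math.comb(n, r_new) >= math.comb(n, r):
        r = r_new

# r has relaxed to the most likely state r = n/2 and then stays there,
# since any move away from r = n/2 lowers Omega and is rejected.
```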
For a more physical example, consider the air in any closed room
with volume V. Suppose the gas initially occupies only half the volume, that
is, V/2. Then, in moving from half the room to the whole room, we
know from (8.36) that for the free expansion of an ideal gas ΔS = Nk ln 2.
Trajectory of initial states tending to equilibrium
In other words, since
Ω for the whole room is 2^N times larger, there is an incredible increase in
entropy, forcing the gas to uniformly distribute itself in the
whole volume of the room. It is easy to see that when the gas
occupies the whole room, Ω has its maximum value
for the given temperature. In this case, S is a maximum, and the gas is in equilibrium.
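The size of this entropy jump is easy to estimate numerically; the particle number below is an illustrative assumption for a small sample of gas:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
n_atoms = 1e20       # an illustrative (small) amount of gas

# Free expansion from V/2 to V: Omega scales like V**N, so
# Delta S = N k ln 2.
delta_s = n_atoms * k_B * math.log(2.0)  # entropy increase, in J/K

# The ratio Omega_final/Omega_initial = 2**N is far too large to evaluate
# directly; even its logarithm, N ln 2, is already of order 10**19:
log_omega_ratio = n_atoms * math.log(2.0)
```

The macroscopic entropy change `delta_s` is tiny in everyday units, yet the underlying ratio of configuration counts, e^(N ln 2), is so enormous that the reverse fluctuation never occurs in practice.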
A more graphic way of explaining this phenomenon is the following.
Consider a configuration in which all the atoms are in one corner
of the room, occupying, say, a small fraction of its volume, leaving everyone
gasping for breath. This configuration is not impossible
(not physically disallowed), but is so unlikely that one could wait for
longer than the life of the universe, and it would still not
occur. So don't hold your breath, this configuration is not going
to happen in your lifetime for sure! On the other hand, the most
likely configuration, and one for which the air will be in
equilibrium, is when it is uniformly distributed in the room - and
that is in fact what occurs!
The fact that the total system always moves towards greater
entropy is another way of saying that it is moving towards greater disorder.
Consider again the
example of coins. A highly ordered state of all the coins is if
they are all heads, since if we know the value of one coin, we
then know the state of all the other coins as well; the most
disordered state is when there are equal number of heads and
tails, since every time we examine a coin, we do not have any idea
of whether the coin is a head or tail. These ideas can be made more
quantitative by defining precisely how much information a sequence
of coins contains. The gist of our argument is that entropy makes
the system move from an ordered state with low entropy, to a
disordered state having high entropy. For example, the melting of
ice - a highly ordered state of water - into a liquid, is entirely
dictated by the fact that, at the same temperature,
the entropy of water is much greater than the entropy of ice.
In summary, for any physical system, entropy will keep changing
until the system has reached equilibrium, which is a state for
which S is a maximum; that is, it has reached the most likely
state. The isolated system will never move from a more likely
state to a less likely state; in other words, its entropy will never spontaneously decrease.
We have seen from our study of entropy that the universe
inexorably moves towards a state of higher and higher entropy. In
fact, the inexorable increase of entropy is a unique phenomenon, in that
it states that each and every instance of
change, that is, all flow of time, must be such that entropy
always increases. The reason this conclusion is unique is that, so
far, all the microscopic equations of physics, be they classical
or quantum, do not differentiate between whether time flows
forwards or backwards!
What does it mean to say that time can flow backwards? Newton's
second law tells us that force causes acceleration. Now if we
replace time t by -t, the acceleration caused is the same, but now -t
becomes smaller and smaller (more negative) as physical time t
increases. In other words, time flows backwards. What this means
is that if Newton's law predicts, for example, that a ball will
bounce in a certain way, then in a world where -t is the time, the
ball will bounce in a reverse motion.
The way to imagine this is to make a movie of the ball bouncing in
physical time; now, if the movie is run backwards, we will obtain
the motion that one would see in a world with time -t.
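The time-reversal symmetry of Newton's law can be illustrated concretely for a ball in free fall (a minimal sketch with illustrative initial conditions): if x(t) solves the equation of motion, so does the reversed trajectory x(-t), which is the same parabola traversed with the velocity flipped in sign.

```python
def position(t, x0=0.0, v0=10.0, g=9.8):
    """Height of a ball thrown upwards: x(t) = x0 + v0*t - g*t**2/2,
    the solution of x'' = -g (Newton's second law for free fall)."""
    return x0 + v0 * t - 0.5 * g * t * t


# The movie run backwards, x(-t), is again a solution of the same equation,
# namely the trajectory with the initial velocity reversed:
t = 0.7
forward = position(t)                 # the ball filmed normally
backward = position(-t, v0=-10.0)     # the film run in reverse
```

The two heights agree at every instant, which is exactly the statement that the microscopic equation cannot tell the forward movie from the backward one.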
We all know that time flows in a certain fixed and irreversible
direction. If an egg breaks, time reversal would imply that a
broken egg reconstitutes itself following the laws of motion.
Clearly this is impossible. So here is the conundrum: the
equations of physics allow the universe to proceed forward and
backward in time, and in fact one can transform forward time into
backward time leaving the equations unchanged. But the world
clearly selects only one direction for the flow of time. So how do
we resolve this contradiction?
At present, there is in fact no resolution of this problem. The
fundamental equations do not have a preferred time direction.
Well, what about entropy? Can we simply define physical time to be
the flow of time in a direction for which entropy always
increases? Yes, one can do this, and in fact many physicists think
this is the solution, and consider entropy to be the quantity which
defines what is called the ``arrow of time''. The fact that the flow of
the time is irreversible is said to originate in the irreversible
processes which increase entropy. (Experience shows that time
can never flow backwards;
events which have happened can never be undone. ``The Moving Finger writes;
and, having writ, moves on: nor all thy Piety nor Wit shall lure it back to
cancel half a Line, nor all thy Tears wash out a Word of it'' - Omar Khayyam.)
This situation is very unsatisfactory, and many leading
physicists have expressed their dissatisfaction with this way of
defining the flow of time. Their objections are fundamental. Recall
that the idea of entropy was introduced because we had to deal
with an enormous collection of atoms; hence, being
unable to follow the movement of every single atom, we consequently
assumed that we are totally ignorant of what the atoms are doing,
and in fact the ultimate form of this ignorance is to assume every
possible configuration for the atoms is equally likely.
So entropy does not arise from any fundamental microscopic property of
nature. Rather, it is only when we have a huge macroscopic collection of
particles that the ideas of entropy and irreversibility enter.
There is a view that we should be able to deduce the
irreversibility of the flow of time from microscopic equations.
This hope has not yet been realized.
Quantum physics introduces a different kind of irreversibility
into physics. Once a measurement is performed on a quantum system,
the effects of the measurement are irreversible, in that the
changes resulting from measurement cannot be undone. So is the
irreversibility of time linked to quantum measurements? Again,
there is no answer to this question, since if we take the Universe
as a whole, we cannot determine if there is any such thing as a
quantum measurement. The reason is that the current theory
of quantum measurement requires that the experimental apparatus be an entity
external to what is being observed. Since there is no physical entity external
to the Universe, the concept of quantum measurement in this case is difficult to define.
In quantum physics, time reversal is also a
symmetry which is broken by some processes, such as the decay of
particles called Kaons. Does this allow us to define a direction
of time? Again, the answer is no, since if we combine time
reversal with some other operations such as reversing the sign of
all the charges of the particles, and so on, we can recover a
world in which time would be reversed, and this world would be
equivalent to ours and
related to our world by a well-defined transformation.
There remains another question, namely, can the flow of time be
reversed if the time-reversing process does not bring about any increase in
entropy? There is no clear answer to this question, which remains
open to further analysis and investigation.
So at present, entropy is the only physical quantity that requires that time flow
only in one direction, namely, in the direction of increasing entropy. The next
generation of equations may yield new insights into why time flows
only in one direction.
The idea of heat death was widespread in the nineteenth century
when the idea of entropy was first understood. Since the universe
must move in a direction of greater and greater entropy, it was
assumed that ultimately the universe would reach a state of
equilibrium for which entropy would be a maximum, and in effect,
time would effectively cease to exist since no change would take place.
This ultimate equilibrium state was
said to indicate that the universe was heading towards a final
``heat death''. Well, is this true?
The answer is quite complicated. Given the fairly ordered state
of the Universe at present, what with stars burning, new stars being
formed, life existing on earth, and so on, one is led to the conclusion that the
Universe must have started at the Big Bang in a very low entropy
state, that is, in a highly ordered state. Ever since then, it has
been racing towards a state with greater and greater disorder.
Why was the initial state of the universe so highly ordered? To
understand this initial state, one has to construct a model for
cosmology, understand why it should be a low entropy state,
and then deduce from it what is the ultimate fate of the
universe - heat death or otherwise.