- Heat Engines
- Maximum Efficiency of Heat Engines
- Free Energy
- *Microscopic Definition of Entropy
- Irreversibility, Time and Heat Death

The second law is a statement that all processes go **only** in one
direction, which is the direction of greater and greater
degradation of energy, in other words, to a state of higher and higher entropy.
For example, when we stir a cup of tea,
the smooth and swirling motion that we make with a spoon soon
disappears. The swirling motion is converted, by conservation of energy, into
a very tiny increase in the temperature of
the tea. However, no matter how long we wait, the still tea in the cup will
never suddenly start to swirl accompanied by a tiny drop in its temperature.
Similarly, if a glass shatters, no amount of waiting will ever see
the glass suddenly re-assemble itself, although the intact glass
and the shattered glass, up to some minute differences, have the same energy.
There are endless similar examples from
daily life that one can quote.
Since in all the examples, energy and momentum are
conserved, clearly it is not these considerations that are
responsible for events in time not being able to reverse
themselves. The question is the following: why are certain
physical phenomena, allowed by conservation laws such as that of energy, nevertheless
forbidden from occurring?
The second law of thermodynamics is the underlying reason that
unlikely events do not occur. Entropy is a measure of the likelihood for some event
to occur, and only those events can occur for which entropy
increases, since they are more likely. In other words, an isolated system
always goes from a less probable to a more probable configuration.
We hence have the following statement for the second
law.
**I**. In any physical process, the entropy S for an isolated system
never decreases; that is, we have

(9.1)    $\Delta S \geq 0$

**The Third Law of Thermodynamics**
As temperature tends to absolute zero, so does entropy. In other
words

(9.2)    $\lim_{T \to 0} S = 0$

A reversible engine is one for which the total change of entropy in one complete cycle is zero. Namely, in one complete cycle, the sum of the changes in the entropy of the boiler and of the environment is zero. Hence, for every cycle

$\Delta S_{\text{boiler}} + \Delta S_{\text{environment}} = 0$

In one cycle, what is the change of entropy of the boiler? Since it loses heat of amount $Q_1$ at the fixed temperature $T_1$ in a reversible manner, its change of entropy is

(9.7)    $\Delta S_{\text{boiler}} = -\dfrac{Q_1}{T_1}$

Similarly, the environment at temperature $T_2$ gains heat $Q_2$, and hence its change of entropy is

(9.8)    $\Delta S_{\text{environment}} = +\dfrac{Q_2}{T_2}$

From (9.3) and (9.4), together with the reversibility condition $Q_1/T_1 = Q_2/T_2$, the efficiency of a reversible heat engine is given by

(9.12)    $\eta = 1 - \dfrac{Q_2}{Q_1} = 1 - \dfrac{T_2}{T_1}$

Eqn.(9.12) above is the great result of Carnot, and the limit that he set for the most efficient possible heat engine has not yet been exceeded. Note that the second law is automatically fulfilled by eqn.(9.12), since if there is only one temperature, namely $T_1 = T_2$, then, as expected, the efficiency of the heat engine is zero. That is, no work can be extracted from a system at one temperature.
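Carnot's bound $\eta = 1 - T_2/T_1$ can be checked in a few lines of Python; a minimal sketch, with purely illustrative temperatures:

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum (Carnot) efficiency of a heat engine running between a
    boiler at t_hot and an environment at t_cold (temperatures in kelvin)."""
    if not 0 < t_cold <= t_hot:
        raise ValueError("require 0 < t_cold <= t_hot")
    return 1.0 - t_cold / t_hot

# Illustrative temperatures: a boiler at 500 K, surroundings at 300 K.
eta = carnot_efficiency(500.0, 300.0)        # 1 - 300/500 = 0.4
# With a single temperature, no work can be extracted:
eta_single = carnot_efficiency(300.0, 300.0)  # 0.0
```

The single-temperature case makes the second-law statement concrete: the bound collapses to zero efficiency.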

(9.13)    $\Delta S_{\text{total}} = \dfrac{Q_1}{T_1} - \dfrac{Q_2}{T_2} = 0$

(9.14)    $Q_1 = \dfrac{T_1}{T_2}\, Q_2$

(9.15)    $Q_1 > Q_2$

Since the heat $Q_1$ delivered to the environment is greater than $Q_2$, we necessarily have to do work $W = Q_1 - Q_2 > 0$ on the refrigerator.

In other words, to keep cooling the refrigerator, we take heat $Q_2$ out of it at temperature $T_2$; the work required to do so, and to deliver heat $Q_1$ to the environment at temperature $T_1$, is given by

(9.17)    $W = Q_1 - Q_2$

(9.18)    $\quad\; = \dfrac{T_1}{T_2} Q_2 - Q_2$

(9.19)    $\quad\; = \dfrac{T_1 - T_2}{T_2}\, Q_2$
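The work requirement $W = Q_2 (T_1 - T_2)/T_2$ for a reversible refrigerator can be checked numerically; a minimal sketch with illustrative values of the heat and the two temperatures:

```python
def refrigerator_work(q_cold, t_hot, t_cold):
    """Minimum work a reversible refrigerator must do to extract heat
    q_cold from a reservoir at t_cold and discharge heat at t_hot:
    W = Q2 * (T1 - T2) / T2."""
    return q_cold * (t_hot - t_cold) / t_cold

# Illustrative values: extract 100 J from a cold box at 250 K; room at 300 K.
q_cold, t_hot, t_cold = 100.0, 300.0, 250.0
w = refrigerator_work(q_cold, t_hot, t_cold)  # 100 * 50/250 = 20 J
q_hot = q_cold + w                            # heat delivered to the room

# Reversibility check: the entropy lost by the cold reservoir equals
# the entropy gained by the hot one, i.e. Q2/T2 == Q1/T1.
```

Note that the heat dumped into the room, $Q_1 = Q_2 + W$, is always larger than the heat extracted, as the text states.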

We couple this heat engine to a reversible refrigerator. Both machines work between a reservoir at temperature $T_1$ and another reservoir at temperature $T_2$. The engine extracts work $W$ by an inflow of heat $Q_1$ from the boiler at temperature $T_1$ and discharges heat $Q_2$ into a reservoir at temperature $T_2$. The reversible refrigerator extracts heat $Q_2'$ from the reservoir at temperature $T_2$, does $W$ amount of work (that it obtained from the heat engine), and delivers heat $Q_1'$ to the boiler at temperature $T_1$. From (9.20), we have

(9.22)    $Q_1' > Q_1$

The last equation simply says that the less efficient engine requires more heat to do the same amount of work as the more efficient one. Energy conservation is required both for the heat engine as well as for the reversible refrigerator. Hence, using the fact that the work extracted from the heat engine is then used to drive the refrigerator, we have, from (9.4)

(9.23)    $W = Q_1 - Q_2$

(9.24)    $W = Q_1' - Q_2'$

(9.25)    $Q_2' - Q_2 = Q_1' - Q_1 > 0$

where we have used (9.22) for obtaining $Q_2' - Q_2 > 0$. Hence, if we consider the system as a whole, we now have a refrigerator which extracts heat $Q_2' - Q_2$ from the reservoir at temperature $T_2$ and delivers heat $Q_1' - Q_1$ to the boiler at the higher temperature $T_1$ with no work being done! Since this contradicts the second law, we see that the assumption made in (9.20), that the heat engine is more efficient than a reversible heat engine, is incorrect.
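The bookkeeping of this proof can be made concrete with numbers; a sketch assuming, purely for illustration, a reversible efficiency of 0.4 and a hypothetical engine claimed to have efficiency 0.5:

```python
# Illustrative numbers only: the reversible efficiency between the two
# reservoirs is taken to be 0.4; the engine claims the impossible 0.5.
W = 1.0          # work per cycle, handed from the engine to the refrigerator
eta = 0.5        # claimed efficiency of the (impossible) engine
eta_rev = 0.4    # efficiency of the reversible machine

# Heat engine: inflow q1 from the boiler, discharge q2 to the cold reservoir.
q1 = W / eta          # 2.0
q2 = q1 - W           # 1.0
# Reversible refrigerator driven by the same work W, same two reservoirs:
q1_prime = W / eta_rev    # 2.5, delivered to the boiler
q2_prime = q1_prime - W   # 1.5, extracted from the cold reservoir

net_into_boiler = q1_prime - q1   # heat arriving at the hot boiler
net_from_cold = q2_prime - q2     # heat leaving the cold reservoir
net_work = W - W                  # 0.0: no net external work is done
# Heat flows from cold to hot with no work -- the second law is violated,
# so the claimed efficiency eta > eta_rev must be impossible.
```

The two net flows are equal, as energy conservation demands, and both are positive whenever the claimed efficiency exceeds the reversible one.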

(9.26)    $dE = T\, dS - P\, dV$

Note that the equation above is written in a manner which is suitable if $S$ and $V$ are the independent variables, since in such a case the small variations in $S$ and $V$ are independent. However, entropy is usually difficult to control and vary, and instead, it is better to consider the independent variables to be the volume $V$ and the temperature $T$. In such a case, we would like to consider a different function of the independent variables, the analog of the energy $E$, which would result from the independent variation of $V$ and $T$.

By definition, the change in free energy results from the independent variations
of $T$ and $V$. Hence, the change in the free energy $F$ of the system is defined by

$dF = -S\, dT - P\, dV$

The finite form of the above equation is obtained by using the energy equation given in (8.23), and we have

$F = E - TS$

Note $F = F(T, V)$; that is, free energy is a function only of the state variables. That is, similar to energy and entropy, free energy depends only on the state of the system, and not on how that state was arrived at. Due to the negative sign of the $-TS$ term above, the free energy decreases when the entropy of the system increases, at fixed energy and temperature.

For **all** physical processes, we must always have

$\Delta S_{\text{total}} = \Delta S_{\text{environment}} + \Delta S \geq 0$

The second term is the change in the entropy of the system. Recall that the very fact that the system has a temperature means that it is in thermal contact with an environment (a heat bath) at that temperature. The heat $-\Delta E$ given up by the system flows into the environment and raises its entropy by $-\Delta E/T$, so that

(9.33)    $\Delta S_{\text{total}} = -\dfrac{\Delta E}{T} + \Delta S \geq 0$

(9.34)    $\Delta E - T\, \Delta S \leq 0$

(9.35)    $\Delta F \leq 0$
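The equivalence between increasing total entropy and decreasing free energy can be verified numerically; a minimal sketch, with purely illustrative values for the heat exchanged and the entropy change:

```python
# Numerical check that dF <= 0 is the same statement as dS_total >= 0
# for a system exchanging heat with an environment at temperature T.
# All values below are illustrative, not taken from the text.
T = 300.0      # common temperature of system and environment (kelvin)
dE = -50.0     # change in the system's energy: it gives up 50 J as heat
dS = -0.1      # change in the system's own entropy (J/K)

dS_env = -dE / T            # the heat -dE raises the environment's entropy
dS_total = dS + dS_env      # total entropy change, must be >= 0
dF = dE - T * dS            # change in the free energy F = E - TS at fixed T

# The two criteria are equivalent: dF = -T * dS_total
```

Here the system's own entropy drops, yet the process is allowed, because the entropy dumped into the environment more than compensates; correspondingly, the free energy decreases.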

The energy that is spent in mechanical work by a biological entity is replaced by the food it takes in. But even more importantly, a biological entity is usually at a higher temperature than the environment, and constantly loses heat energy to it, as this is the way the cell imparts high entropy to the environment. The heat energy that the cell loses to the environment is supplied to it by the food it eats. Hence we eat food, which has low entropy, not only to regain lost energy, but rather to develop negative entropy by ingesting low entropy food and then discharging high entropy to the environment. By losing heat energy we in effect lower our entropy. If the cell fails to obtain food, its entropy increases indefinitely, destroying its highly ordered, low entropy structures and leading to cell (or organism) death. Hence we have the paradox that we ingest energy in order to lose energy!

Ordered states in physics, such as superconductors and superfluids, are usually possible only at very low temperatures compared to the environment (although the appearance of high temperature superconductors has led to hopes to the contrary). To maintain these highly ordered states, their low temperatures have to be maintained, which in turn needs constant expenditure of energy. Hence living systems are not very different from other ordered states in nature that need constant expenditure of energy to maintain their ordered, low entropy state.

Entropy has an interesting application in evolution as well. Some people have argued that the emergence of life contradicts the second law, since a highly organized state, with very low entropy, emerges from a high entropy environment. This argument is incorrect, since a living entity is not an isolated system; if one takes into account the total entropy of the earth and the Sun (from which all radiation comes), then it can easily be shown that the total entropy of the living entity, together with the earth and Sun, always increases.

(9.36)    $\Omega = \text{number of microscopic configurations of the system}$

(9.37)    $S = k_B \ln \Omega$

The above equation is fundamental to physics, and is valid in classical physics, general relativity, quantum theory and string theory. The calculation of the entropy of a black hole using string theory starts from the above equation. In information theory, this equation is one of the most fundamental equations. Boltzmann, with a profound premonition of the future, considered the above equation so important that it is written as the epitaph on his tombstone. For the case of atoms in a gas, we assume, as is the case for the kinetic theory of gases, that we have absolutely no knowledge of what the atoms are doing, and in effect we assume that, for the atoms,

(9.40)    each of the $\Omega$ configurations is equally likely

(9.41)    $P = \dfrac{1}{\Omega}$

(9.42)    $S = -k_B \ln P = k_B \ln \Omega$

We see from above that entropy is proportional to the logarithm of the number of ways $\Omega$ in which a given configuration of the system can be realized. Since each microscopic arrangement is equally likely, the probability $P$ of a configuration is proportional to the number of ways in which it can occur, and the entropy is fixed, up to an additive constant, by this count; that is

(9.43)    $P \propto \Omega$

(9.44)    $S = k_B \ln \Omega + \text{constant}$

For the system of a collection of $N$ fair coins, the entropy for, say, obtaining $n$ tails, as in Figure 9.5, is given by

(9.45)    $\Omega(n) = \dfrac{N!}{n!\,(N-n)!}$

(9.46)    $S(n) = k_B \ln \Omega(n) + \text{constant}$

(9.47)    $\quad\;\; = k_B \ln \dfrac{N!}{n!\,(N-n)!} + \text{constant}$
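The coin-counting entropy above can be evaluated directly; a minimal sketch in units where $k_B = 1$, with $N = 100$ as an illustrative choice:

```python
import math

def coin_entropy(N, n):
    """Entropy (in units of k_B, dropping the additive constant) of N fair
    coins showing exactly n tails: S = ln[N!/(n!(N-n)!)]."""
    return math.log(math.comb(N, n))

N = 100
entropies = [coin_entropy(N, n) for n in range(N + 1)]

# The entropy is largest for the most probable, most disordered state,
# n = N/2 (equal numbers of heads and tails):
most_likely = max(range(N + 1), key=lambda n: entropies[n])
# Highly ordered states (all heads, or all tails) can be realized in
# exactly one way, so their entropy is ln 1 = 0.
```

This makes concrete the later discussion of order and disorder: the all-heads state has zero entropy, while the half-and-half state maximizes it.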

Note that the constant does not depend on the configuration being considered, namely $n$, and can be ignored. The constant is fixed physically by the Third Law of Thermodynamics, which states that $S \to 0$ as $T \to 0$. The generalization to atoms is straightforward. Suppose we want to know the entropy of a gas for a given temperature $T$ and volume $V$. The atoms have a large number of ways of arranging themselves to have the specified temperature and volume. We hence have the entropy given by

(9.48)    $S = k_B \ln \Omega(T, V) + \text{constant}$

(9.49)    $\quad = N k_B \ln V + \dfrac{3}{2} N k_B \ln T + \text{constant}$

where we have used (8.24) to obtain the last line. We hence have for an ideal gas, ignoring a multiplicative constant, the following

(9.50)    $S = N k_B \ln \left( V\, T^{3/2} \right)$

(9.51)    $\Delta S = N k_B \ln \dfrac{V_2}{V_1}$

(9.52)    $\quad\;\; = N k_B \ln 2 \qquad (V_2 = 2 V_1)$

In other words, since $N$ is of the order of Avogadro's number, there is an incredible increase in entropy, forcing the gas to uniformly distribute itself in the whole volume of the room. It is easy to see that when the gas occupies the whole room, $S$ has its maximum value. In this case, $\Delta S = 0$, and the gas is in equilibrium.

A more graphic way of explaining this phenomenon is the following. Consider a configuration in which all the atoms are in one corner of the room, occupying, say, a tiny volume, leaving everyone gasping for breath. This configuration is not impossible (not physically disallowed), but is so unlikely that one could wait for longer than the life of the universe, and it would still not occur. So don't hold your breath, this configuration is not going to happen in your lifetime for sure! On the other hand, the most likely configuration, and one for which the air will be in equilibrium, is when it is uniformly distributed in the room - and that is in fact what occurs!

The fact that the total system always moves towards greater entropy is another way of saying that it is moving towards greater disorder. Consider again the example of coins. A highly ordered state of the coins is one in which they are all heads, since if we know the value of one coin, we then know the state of all the other coins as well; the most disordered state is when there are equal numbers of heads and tails, since every time we examine a coin, we do not have any idea of whether the coin is a head or a tail. These ideas can be made more quantitative by defining precisely how much information a sequence of coins contains. The gist of our argument is that entropy makes the system move from an ordered state with low entropy to a disordered state having high entropy. For example, the melting of ice - a highly ordered state of water - into a liquid is entirely dictated by the fact that, at the same temperature, the entropy of water is much greater than the entropy of ice.
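The size of the entropy jump for a doubling of volume can be sketched numerically; assuming, for illustration, one mole of gas, and using the standard values of Boltzmann's constant and Avogadro's number:

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
N = 6.02214076e23     # number of atoms in one mole

# Entropy change when the gas expands from V/2 to V at fixed temperature:
dS = N * k_B * math.log(2.0)        # Delta S = N k_B ln 2, a few J/K

# The number of microstates grows by the factor 2^N; even the *exponent*
# of that factor in base 10 is astronomically large:
log10_factor = N * math.log10(2.0)
```

Although a few joules per kelvin sounds modest, the corresponding multiplicity factor $2^N$ has roughly $10^{23}$ digits, which is why the reverse process is never observed.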
In summary, for any physical system, entropy will keep changing until the system has reached equilibrium, which is the state for which $S$ is a maximum; that is, it has reached the most likely state. The isolated system will never move from a more likely state to a less likely state; in other words, $S$ will never decrease.