
Entropy

Why do we need the concept of Entropy?

The conservation of energy is the bedrock of our understanding of nature. Energy conservation, however, only tells us that in any process the total energy remains constant, that is, $\Delta E=0$. It does not specify what can enter and leave the energy conservation equation, only that the sum of the energy entering and leaving a system must always be the same. Furthermore, energy conservation does not tell us what the various forms of energy are, nor how energy transforms from one form into another. From our study of waves, light, and electric and magnetic fields, we found that energy plays a central role in all these phenomena; however, only the specific study of each phenomenon could unravel the specific forms that energy takes. The concept of energy also does not tell us what fraction of the total energy is kinetic and what fraction is electrical, chemical, and so on.

A new concept, in addition to energy, is needed for a more complete understanding of physical processes, in particular those involving the transformation of energy. In all the examples we studied, energy could be completely transformed from one form into another. For example, in our study of magnetic fields, we found that we could fully convert mechanical energy into electrical currents by Faraday's Law of Induction. We may well ask: can all forms of energy be fully transformed from one form to another? The answer is unexpectedly complicated: in some cases it is ``yes'', and in other cases it is ``no''. The concept of entropy addresses this question regarding the transformation of the forms of energy, and tells us how much energy can be converted from one form into another, and, in particular, how much energy is available for doing useful work.

Heat had defied all attempts to explain it using the concept of force and, more generally, Newtonian mechanics. The realization that heat is a form of energy was a great breakthrough in the understanding of heat, and was fundamental to the emergence of energy as one of the most fundamental ideas in physics. It was, however, soon realized that heat is not an ordinary form of energy such as gravitational or electrical energy. Rather, heat is a unique form of energy in that only a certain fraction of heat energy can be transformed into other forms of energy. It is for understanding the unique problems presented by heat that new concepts such as entropy, which go beyond energy, were developed. The idea of entropy was introduced into physics in the nineteenth century and is essential for understanding the phenomenon of heat. The word entropy is derived from the Greek, meaning ``transformation content''. Famous names such as Sadi Carnot, R.J. Clausius, Count Rumford, James Joule and Ludwig Boltzmann contributed greatly to the ideas of heat and entropy.

Entropy does not answer all the questions left unanswered by energy, but it does address a crucial aspect of energy, namely how much energy can be transformed from one form into another. In particular, given a certain amount of heat energy, entropy tells us how much ``useful'' work can be extracted from it; in other words, what the maximum possible efficiency of a heat engine is. Since energy is conserved, one might think there is no need to be concerned about recklessly wasting energy. But we know from daily life that ``useful'' energy, or equivalently, energy ``available'' for use, is a scarce resource. This intuitive understanding that useful energy is a precious resource is explained by the concept of entropy.
Entropy is also related to the concept of order and disorder: a phase transition from, say, water to gas is permissible only if entropy increases in such a transformation. For any system, entropy is a physically measurable quantity. Furthermore, it is an experimentally observed fact that for all processes in nature, the entropy of the total system can never decrease. Note that this empirical law concerning entropy is weaker than that of energy, since it does not specify, for a given physical process, by how much the entropy increases.

The concept of entropy has many other applications and vast ramifications in other disciplines. Only those chemical processes and reactions for which the total entropy of the system does not decrease are allowed by nature. All living entities must maintain their low entropic state to stay alive. More abstract applications of entropy occur in the physics of black holes, and it was only in 1996 that the entropy of a class of black holes could be calculated from first principles using results from string theory. The concept of entropy also has a key application in information theory: it was shown by Claude Shannon in 1948 that the information content of a message is determined by a suitable application of the idea of entropy to the discipline of information science.

The current understanding of entropy is that it is a large scale (macroscopic) manifestation of the atomic nature of matter. For example, one cubic centimeter of air contains about $10^{20}$ atoms. This is an unimaginably large number of particles. Even if, under some circumstances, we can view these atoms as classical particles, it is futile to apply Newton's laws to such a large collection of particles. Instead, the best we can do is to acknowledge our total ignorance and assume that all the atoms are moving randomly, that is, that the position and velocity of every atom is a random variable. Furthermore, for a system of atoms that reaches a stable state, called equilibrium (more on this concept later), every possible velocity and position of the atoms in one cubic centimeter of air is equally likely. We will show later that the empirical law that entropy never decreases is simply the statement that a collection of atoms which is not in equilibrium will always move towards more probable states, and will reach equilibrium only on arriving at the most probable state available to the system.

It should be noted that the field of thermodynamics developed in the eighteenth and nineteenth centuries without the concept of the atomic composition of matter. It is remarkable that very general results on how bulk matter behaves were obtained without any idea of the microscopic composition of matter. From the point of view of scientific methodology, the progress of thermodynamics shows that there are concepts describing the large scale properties of matter that have a logic of their own, and which are later seen to match smoothly onto a deeper and more complete understanding of the same phenomenon.

Temperature and Heat

We discuss thermodynamics, the branch of physics dealing with heat, because it gives us a specific physical system in which we can precisely define and discuss entropy and related ideas. In particular, we will study the behaviour of an ideal gas to illustrate the concepts of heat, energy and entropy. Most of us have an intuitive understanding of heat, based on having experienced hot and cold objects. The first thing one must realize is that heat is different from temperature.

Temperature

What do we mean by saying that an object is at temperature $T$? Temperature is a physical property of all physical systems. From the microscopic point of view, the temperature of a physical object is a measure of the average kinetic energy of the atoms that make up that object. If the atoms, on average, are moving very fast, they have high kinetic energy and hence the object has a high temperature, and vice versa for slowly moving atoms. We can immediately see that, since there is no limit on how much momentum a particle can have, there is consequently no upper limit on how high the temperature can be. (The concept of a maximum temperature does appear in Big Bang cosmology, but it is so large as to be practically infinite.) On the other hand, there must be a lower limit to temperature, which is reached when all of the atoms that constitute an object are brought to the state of lowest possible energy, called the ground state. It should be noted that, unlike a classical system, in which there is no longer any motion at absolute zero, a quantum system at absolute zero temperature is in its ground state and continues to have quantum fluctuations.


Zeroth Law of Thermodynamics: Two objects are in thermal equilibrium if they are at the same temperature.

The zeroth law is the basis of the measurement of temperature, and of the existence of thermometers. By bringing an object of known temperature into contact with the object whose temperature is being measured, once thermal equilibrium is reached one can conclude that the two temperatures are equal. Any physical material whose behaviour under changes of temperature is known can be used to measure temperature: for example, the known expansion of a metal on being heated, or the known change in the resistance of a conductor, can be used for measuring temperature. There is an absolute scale of temperature for which the lowest temperature is called absolute zero. This scale is called the Kelvin scale and is denoted by $K$. Let $T$ be the temperature of the object on the Kelvin scale, and $T_C$ its temperature on the ordinary Celsius scale. We then have the following.

\begin{displaymath}
T_C=T-273.15K
\end{displaymath} (8.1)

Hence, for example, $20^0C=293.15K$. Temperature is a new dimensional quantity in addition to $M,L,$ and $T$. The unit of temperature is $K$, and this is also used to signify its dimension. The fact that it is much more difficult to cool an object than to heat it is evidenced by the fact that refrigerators were only built in the twentieth century, whereas furnaces and heat engines have been in existence for many centuries. The lowest temperature reached to date was achieved in 1992 and is
\begin{displaymath}
T=2\times 10^{-9}K
\end{displaymath} (8.2)

For every extra decimal place of cooling closer to absolute zero, new experimental techniques are required, usually based on qualitatively new physical properties of various phenomena. A table of temperatures found in nature is given in Table 8.1.

Table 8.1: Some typical temperatures of physical phenomena
Phenomenon   Temperature (K)
Universe at the Big Bang $10^{30}$
Highest Laboratory Temperature $10^{9}$
Surface of Sun $5\times 10^{3}$
Normal Human Temperature $310$
Freezing Temperature of Water $273$
Universe today $3$
Boiling of Helium -3 $10^{-1}$
Lowest Laboratory Temperature $10^{-9}$


Heat

Every object at temperature $T$ has an internal, thermal energy, denoted by $E$, which can be thought of as the sum of the kinetic and potential energies of all the atoms which constitute the object. Heat, denoted by $Q$, is a form of energy, called thermal energy, since it is the form of energy associated with temperature. Heat always refers to a process in which thermal energy is exchanged between objects at different temperatures. Heat is analogous to work in that both heat and work are defined by a process, be it the process of the flow of heat, or the process of a force acting over some distance. Furthermore, neither heat nor work is an intrinsic property of the system, and neither can be stored in the system; what can be stored is energy, which is reflected in the total energy $E$ of the system. The dimension of heat is that of energy, and the unit of heat is the calorie, defined to be the energy required to heat 1 g of water by $1^0C$. Since the joule (J) is also a unit of energy, we have the following conversion
\begin{displaymath}
1 \mbox{\rm {calorie}}=4.186J
\end{displaymath} (8.3)

If two objects having different temperatures are brought into contact such that heat can flow between them - called thermal contact - then heat will flow from the object at the higher temperature to the one at the lower temperature, until both objects reach a common temperature. The final state is said to be an equilibrium state, in which no more macroscopic changes take place. A very important concept in thermodynamics is that of the heat bath, which is taken to be a large reservoir maintained at some constant temperature $T_E$. An example of a heat bath is a swimming pool maintained at some constant temperature. When an object is brought into thermal contact with the heat bath, it either gains heat if it is cooler than $T_E$, or loses heat if it is hotter; the important point to note is that in these heat flows, the heat bath is taken to be so big that its temperature does not change, regardless of its exchange of heat with the object in question. Heat flow is taken to be positive if energy flows into the object and adds to the internal energy $E$, and is taken to be negative if a part of the internal energy flows out of the system. Let the system's temperature be $T_S$. Hence, if $T_S<T_E$, the heat $Q$ that flows is positive, and if $T_S>T_E$, the heat flow is negative. Figure 8.1 shows this flow of heat.

Figure 8.1: Heat flows of object and environment

The specific heat of an object, $c$, determines how much the temperature of an object of mass $m$ changes if it absorbs an amount of heat $Q$. If the initial temperature of the object is $T_i$ and, after absorbing heat $Q$, its final temperature is $T_f$, then the definition of $c$ is given by
$\displaystyle T_f-T_i = \frac{Q}{mc}$ (8.4)
$\displaystyle \Rightarrow c = \frac{Q}{m(T_f-T_i)}$ (8.5)

Note that heat $Q$ has the dimension of energy, that is, $[Q]=ML^2T^{-2}$, and hence $c$ has dimension $[c]=L^2T^{-2}K^{-1}$. Since $Q$ has units of $J$, the unit of $c$ is $\mathrm{J\,kg^{-1}K^{-1}}$. The specific heat $c$ of an object is a measure of how tightly bound together its atoms (or molecules) are. For an object made of tightly bound constituents, we expect $c$ to be large: a lot of heat has to be supplied to increase its temperature, since a larger fraction of the energy is stored in the form of potential energy instead of the kinetic energy of the moving constituents; and vice versa for materials with loosely bound constituents.

Table 8.2: Specific Heat of Some Materials at Room Temperature
Material   Specific Heat ($\mathrm{J\,kg^{-1}K^{-1}}$)
Lead $128$
Silver $236$
Copper $386$
Aluminum $900$
Granite $790$
Glass $840$
Ice (278K) $2100$
Mercury $140$
Water $4190$
Human Body (average) $3470$


So, for example, we can say that metals must have low specific heats since they heat up easily, and this in turn implies that their electrons and atoms are not very tightly bound.
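For example, using the value of $c$ for water from Table 8.2, the heat required to raise the temperature of $0.5$ kg of water by $20K$ is
\begin{displaymath}
Q=mc(T_f-T_i)=0.5\times 4190\times 20\,\mathrm{J}\approx 4.2\times 10^{4}\mathrm{J}
\end{displaymath}
that is, about $10^{4}$ calories.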

Ideal Gas

Any gas is specified by its temperature $T$, the volume $V$ it occupies, and its pressure $P$. Pressure is defined to be the force per unit area that the gas exerts on the walls which enclose its volume. $T$, $P$, and $V$ are called state variables, and they specify the state of the gas. Pressure has dimension $[P]=ML^{-1}T^{-2}$ and volume $[V]=L^3$. Note that the state variables are intrinsic properties of the gas and, unlike heat, do not depend on the history of how the gas arrived at these values. For example, in the phase diagram given below, the state of the gas at, say, volume $V_0$ and temperature $T_0$ does not depend on the curve which resulted in the gas being at that point of the phase diagram; another process, at a different pressure, can lead to the same values of $V_0$ and $T_0$, and it does not matter how the gas arrived at that point of phase space.

An ideal gas is a gas that is so dilute that its interactions can be completely neglected, and each atom of the gas can be said to behave as a free particle. A number of gases, including helium, oxygen and nitrogen, do behave as ideal gases at moderate pressures and at moderate temperatures, that is, temperatures far greater than their condensation temperatures. In Section (4.2.2) a derivation is given of the equation of state for the ideal gas based on the kinetic theory. The result is the following. Let $N$ be the number of atoms of a gas in a volume $V$, and let the constant $k$ be the famous Boltzmann's constant, given by
\begin{displaymath}
k=1.38\times 10^{-23}JK^{-1}
\end{displaymath} (8.6)

We then have the ideal gas law given by
\begin{displaymath}
PV=NkT
\end{displaymath} (8.7)
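To get a feel for the numbers in (8.7), consider, say, one litre ($10^{-3}\mathrm{m^3}$) of gas at atmospheric pressure, $P\approx 1.0\times 10^{5}\mathrm{Nm^{-2}}$, and room temperature, $T\approx 293K$. The number of atoms is then
\begin{displaymath}
N=\frac{PV}{kT}=\frac{1.0\times 10^{5}\times 10^{-3}}{1.38\times 10^{-23}\times 293}\approx 2.5\times 10^{22}
\end{displaymath}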

For a gas made out of a collection of molecules (each composed of many atoms), the formula above is changed slightly. The state of an ideal gas is given by (8.7). A phase diagram is one in which any two of the state variables, such as volume and pressure, are plotted as independent variables. A point in the phase diagram represents the state variables of the gas. An implicit assumption in drawing a phase diagram is that thermodynamic variables such as temperature, volume and pressure are continuous variables. This assumption is clearly reasonable since, in an experiment, we can continuously vary these parameters. The phase diagram for an ideal gas at constant pressure yields a straight line of $V$ versus $T$, and is shown in the $VT$ phase diagram in Figure 8.2. Similarly, the $PV$ phase diagram shows how the pressure $P$ and volume $V$ are related for an ideal gas at different constant temperatures.

Figure 8.2: A VT and a PV Phase Diagram of a gas

(8.7) is an amazing result. It states that equal volumes of gas, at a given pressure and temperature, have the same number of atoms, regardless of the kind of atom the gas is made out of! And this conclusion, as shown in Section (4.2.2), results directly from the application of Newton's laws to a large collection of atoms considered as classical particles. Suppose we hold the temperature of the gas at a constant value, and vary the pressure $P$. Let the initial values be $P_i$, $V_i$ and the final values be $P_f$, $V_f$. We then have from (8.7) that
\begin{displaymath}
P_iV_i=P_fV_f : \mbox{\rm {Boyle's Law}}
\end{displaymath} (8.8)
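For example, if the volume of the gas is halved at constant temperature, so that $V_f=V_i/2$, then Boyle's Law gives $P_f=P_iV_i/V_f=2P_i$: the pressure doubles.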

Similarly, let us hold the pressure as fixed, and vary the temperature. Let initial values be $V_i$ and $T_i$, and the final values be $V_f$ and $T_f$. Then we have
\begin{displaymath}
\frac{V_i}{T_i}=\frac{V_f}{T_f} : \mbox{\rm {Charles' Law}}
\end{displaymath} (8.9)
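Similarly, doubling the absolute temperature at constant pressure, so that $T_f=2T_i$, gives $V_f=V_iT_f/T_i=2V_i$: the volume doubles.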

The internal energy $U$ of an ideal gas is given by the following (the derivation is given in Section 4.4).
\begin{displaymath}
U=\frac{3}{2}NkT
\end{displaymath} (8.10)

Let us try to understand the physics of the above formula. For a typical gas occupying a volume of, say, 1 litre at ordinary pressure and temperature, the number of particles is of order $10^{22}$, to be compared with Avogadro's number, $N_a=6.02\times 10^{23}$. Hence, the number of particles in any volume which we can directly perceive is truly unimaginably large. The energy per particle is given by $\displaystyle \frac{U}{N}=\frac{3}{2}kT$; since the energy of the particles is purely kinetic, we see that this result is in accordance with our earlier discussion that temperature is a measure of the kinetic energy of an atom in a gas. The kinetic theory of gases is based on assuming that the positions and velocities of the atoms are random. If the random velocity of an atom is ${\bf v}$, then, as noted in Section (4.2.2), the energy per particle of the gas is simply the average kinetic energy of an atom. Denoting the average value of, say, $v^2$ by $<v^2>$, we have, in three dimensional space
\begin{displaymath}
\frac{U}{N}=\frac{1}{2}m<{\bf v}^2> =\frac{3}{2}kT
\end{displaymath} (8.11)
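Relation (8.11) lets us estimate how fast the molecules of the air around us are moving. Treating air as an ideal gas of nitrogen molecules, each of mass $m\approx 4.7\times 10^{-26}$ kg (an illustrative value), at room temperature $T\approx 293K$, the typical (root mean square) speed is
\begin{displaymath}
\sqrt{<{\bf v}^2>}=\sqrt{\frac{3kT}{m}}\approx 510\,\mathrm{ms^{-1}}
\end{displaymath}
which is comparable to the speed of sound in air.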

First Law of Thermodynamics

The main reason that, in the nineteenth century, physics shifted from the idea of force to the idea of energy was the need to explain the phenomenon of heat. If we include heat as a (rather unique) form of energy, the law of energy conservation then goes under the name of the First Law of Thermodynamics, which states the following.
$\displaystyle \mbox{\rm Total energy of system} = \mbox{\rm Kinetic Energy} + \mbox{\rm Potential Energy} + \mbox{\rm Heat}$ (8.12)
$\displaystyle \phantom{\mbox{\rm Total energy of system}} = \mbox{\rm Constant}$ (8.13)

Let us examine what form the First Law of Thermodynamics takes for the case of an ideal gas. We would like to apply a force on the gas from outside, and determine how much the energy and other properties of the gas change. We would also like to determine how much work we can extract from a gas. The term work is used to indicate mechanical processes that involve moving around macroscopic bodies, such as pistons and pulleys. The energy which is expended in doing such work is called mechanical energy, to differentiate it from the energy of the gas, which is called internal energy. Recall that work results from a process of a force acting over some distance. The sign convention is that $\Delta W$ denotes the work done by the gas: a gas loses energy if it does work, whereas it gains energy if work is done on it, in which case $\Delta W$ is negative. Suppose the gas is contained in a frictionless piston of area $A$, and we hold the gas at a fixed temperature by keeping it in contact with a heat bath at temperature $T$; if we do work on the gas by compressing it, then $V_f<V_i$, so that $\Delta V=V_f-V_i<0$. What is the work done in moving the piston? The gas exerts a force $F$ on the piston; if the piston moves through a distance $\Delta x$ (taken positive for expansion), the work done by the gas, $\Delta W$, is given, as expected, by
\begin{displaymath}
\Delta W=F\Delta x
\end{displaymath} (8.14)

However, recall that the pressure $P$ exerted by the gas is defined to be force per unit area. Hence, we have
\begin{displaymath}
F=PA
\end{displaymath} (8.15)

and consequently, since the change in volume is $\Delta V=A\Delta x$, we have
$\displaystyle \Delta W = PA\Delta x$ (8.16)
$\displaystyle \phantom{\Delta W} = P\Delta V$ (8.17)
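For instance, if a gas at atmospheric pressure, $P\approx 1.0\times 10^{5}\mathrm{Nm^{-2}}$, is compressed by one litre, so that $\Delta V=-10^{-3}\mathrm{m^3}$, then $\Delta W=P\Delta V=-100J$; the work done by the gas is negative, that is, $100J$ of work is done on the gas.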

Suppose that heat of amount $\Delta Q$ flows into the gas during compression. We have, from the First Law of Thermodynamics (conservation of energy), the following
$\displaystyle \mbox{\rm {Change of energy of gas}} = \mbox{\rm {Heat flowing into gas}} - \mbox{\rm {Work done by gas}}$
$\displaystyle \Delta E = -\Delta W+\Delta Q$ (8.18)
$\displaystyle \phantom{\Delta E} = -P\Delta V +\Delta Q$ (8.19)


The limit for infinitesimal change is given by
\begin{displaymath}
dE=-PdV+dQ
\end{displaymath}


Note that $\Delta E$ is a function only of the state of the gas, and does not depend on how work was done on, or by, the gas, nor on how heat flowed in or out of the gas. However, $\Delta Q$ is process dependent, and further analysis is needed to show that $E$ is process independent. Energy conservation does not tell us, for example, how much work can be transformed into heat, or whether all the heat in a body can be made to do ``useful'' mechanical work. The information about the direction in which energy can flow, and how much energy can be transformed from, say, heat into work and vice versa, is given by the concept of entropy.

Entropy

We will have to discuss a number of concepts before we can precisely define the entropy - denoted by $S$ - of a physical system. Before going into the details of this discussion, we state the fundamental result that we will arrive at, namely: for an isolated system, the entropy $S$ can never decrease. This statement tells us that all physical processes in which heat is transformed into work, and vice versa, must satisfy the condition that entropy must not decrease. In other words, if this condition is not satisfied, the process cannot take place. For example, a process in which there is a net decrease in entropy, that is, $\Delta S_{\mbox{\rm {closed path}}} < 0$, is forbidden, as it would imply, as shown in Sec 2.6.3, that a perpetual motion machine could be constructed.

To define the concept of entropy quantitatively, we need to first discuss what is meant by a reversible and an irreversible process. Consider a gas inside a frictionless piston in contact with a heat bath. If we pull out the piston very slowly, in a number of very small steps, the gas will expand at constant temperature $T$, and if we then push the piston back, again slowly and in small steps, the gas will compress and return to its former state. This process is reversible, as shown in Figure 8.3.

Figure 8.3: A Reversible Process

The point to note is that for a reversible process, at each step taken the system is in equilibrium, and a small step in one direction, say causing compression, can be exactly reversed by a small step in the opposite direction, causing expansion. If we had suddenly pulled the piston out, the temperature of the gas would have dropped, there would have been turbulent gas motion, heat would have flowed into the gas from the reservoir, and so on; the process would then have been an irreversible one. No process in nature is fully reversible, and for a process to be considered reversible, we need to ignore all effects, such as friction and imperfect insulation, which would cause irreversibility. Idealizing a thermodynamic process as reversible is analogous to Newton's idealization in stating his first law, since the complications which come from friction and other irreversible effects are inessential for understanding the underlying principles of thermodynamics. Consider a system in contact with a heat bath and held at constant temperature $T$; for a very small change in the independent parameters of the system, such as volume, in which an amount of heat $\Delta Q$ flows into it via a reversible process, the change in entropy of the system is given by
\begin{displaymath}
\Delta S=\frac{\Delta Q}{T}
\end{displaymath} (8.20)
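For example, melting 1 kg of ice at $T=273K$ requires roughly $Q=3.3\times 10^{5}J$, the latent heat of fusion of ice; since the temperature stays fixed while the ice melts, the entropy increases by
\begin{displaymath}
\Delta S=\frac{Q}{T}\approx\frac{3.3\times 10^{5}}{273}\,JK^{-1}\approx 1.2\times 10^{3}JK^{-1}
\end{displaymath}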

Note $[S]=ML^2T^{-2}K^{-1}$, and the unit of entropy is $JK^{-1}$. For energy conservation we consequently have
$\displaystyle \Delta E = -\Delta W+\Delta Q$ (8.21)
$\displaystyle \Rightarrow \Delta E = -P\Delta V+T\Delta S$ (8.22)

The entropy of an ideal gas is given by
\begin{displaymath}
S_G= Nk \ln(VT^{\frac{3}{2}})+ \mathrm{constant}
\end{displaymath} (8.23)

The entropy per particle (ignoring a constant) is hence given by $S_G/N=k \ln(VT^{\frac{3}{2}})$. We had assumed that for an ideal gas the interactions amongst the particles can be ignored; the formula above reflects this, since the energy and entropy are both proportional to $N$, as results from a simple sum over the individual gas atoms. Although entropy is an intrinsic property of the system, it is not an intrinsic property of an individual particle; rather, it depends, for example, on the total volume occupied by the gas. To know what a particle is doing, we need to know both its position and velocity. We see that entropy is, in a precise sense, a measure of our degree of ignorance about the particle's behaviour. For example, entropy increases as we increase $V$, which reflects the fact that the position of the particle becomes more uncertain, since it can be anywhere in a larger volume $V$. Similarly, if we increase the temperature, entropy goes up, since the larger the kinetic energy, the more random the velocity of the particle, and hence the greater our ignorance about the particle.
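As a consistency check between (8.20) and (8.23), consider a reversible isothermal expansion of the gas from $V_i$ to $V_f$. Since the internal energy (8.10) depends only on $T$, we have $\Delta E=0$, so the heat flowing into the gas equals the work done by the gas:
\begin{displaymath}
\Delta Q=\Delta W=\int_{V_i}^{V_f}P\,dV=NkT\int_{V_i}^{V_f}\frac{dV}{V}=NkT\ln\left(\frac{V_f}{V_i}\right)
\end{displaymath}
Hence $\Delta S=\Delta Q/T=Nk\ln(V_f/V_i)$, which is exactly the change given by (8.23) at constant temperature.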

Reversible Processes

The most significant and remarkable feature of entropy is that, although $\Delta Q$ is a process dependent quantity, entropy - similar to energy - is an intrinsic property of the system, and, unlike heat, entropy can be stored. We hence have $S=S(T,V)$. Let us examine what it means, for example, for energy and entropy to depend only on $T$ and $V$. Consider the $VT$-phase diagram; that is, consider $T$ and $V$ to be the independent variables, and let the pressure $P=P(T,V)$ be fixed by the ideal gas law. For every point on this $VT$-diagram, both $E$ and $S$ have some definite value. Suppose we take the ideal gas through any process, be it reversible or irreversible, along path 1, as indicated in Figure 8.4, from values $T,V$ to some new values $T',V'$. (Note that this path is not a path in physical space, but rather a path in the phase space of $T,V$.) The system will go through a series of states with varying $T,V$, each having its own $E, S$.

Figure 8.4: Phase diagram of VT with Paths 1 and 2

The final change in energy and entropy is given by
$\displaystyle \Delta E = E(T',V')- E(T,V)$ (8.24)
$\displaystyle \Delta S = S(T',V')- S(T,V)$ (8.25)

Now let us take the ideal gas through a different and arbitrary process, following say path 2, also shown in Figure 8.4. Then $\Delta E$ and $\Delta S$ will have the same values as given in (8.24) and (8.25)! In other words, as mentioned before, the values of $E, S$ depend only on $T,V$, and not on how the system arrived at these values. The change in the energy and entropy does not depend on the path taken; it depends only on the initial and final states. If one goes around a closed path following a reversible process, it follows from the above that the total change in the entropy of the system is zero, that is, the system is restored to its original state. Note, however, that in this reversible process a net amount of heat has been expended. In symbols
$\displaystyle \Delta S_{\mbox{\rm {closed path}}} = 0$ : Reversible (8.26)
$\displaystyle \Delta Q_{\mbox{\rm {closed path}}} > 0$ (8.27)

Recall that entropy depends only on the state of the gas. Hence, if heat $\Delta Q$ is supplied to the gas through a reversible process, the total heat expended can be computed from the change in entropy caused by the heat flows.


Let the reversible process go from an initial state to a final state. The total heat expended is then given by
\begin{displaymath}
Q_{\mathrm{Total}}=\int_{\mathrm{Reversible}}TdS
\end{displaymath}


Figure 8.5: Reversible process from initial state to final state

Irreversible Processes

Consider going around in a circle in physical space, at room temperature $T$. Since we always have to overcome friction in order to move, a certain amount of work $W$ is necessary to overcome friction. From energy conservation, this work generates heat $Q=W$ and leads to an increase of entropy $\displaystyle \Delta S=\frac{W}{T}>0$. Hence, in going around a closed path, we have not come back to the original state, since there has been heating due to friction, something that we can no longer reverse. In general, if we follow an irreversible process in completing a full cycle, the entropy always increases. That is
$\displaystyle \Delta S_{\mbox{\rm {closed path}}} > 0$ : Irreversible (8.28)
$\displaystyle \Delta Q_{\mbox{\rm {closed path}}} > 0$ (8.29)

How would one determine the change in entropy for an irreversible process? Since the formula for entropy given in eq. (8.20) is valid only for reversible processes, we cannot use it directly. We instead adopt the following approach. The irreversible process starts from some initial state of the system and ends up in a final state. We construct an equivalent reversible process which goes from the same initial state to the same final state. Since entropy does not depend on how its values are attained, we can compute the change of entropy for the equivalent reversible process, and this will be equal to the change in entropy for the irreversible process in question.

An Irreversible Process: Free Expansion

To illustrate how to calculate the change in entropy in an irreversible process, consider the irreversible free expansion of an ideal gas from an initial volume $V_i$ into a vacuum, such that its final volume is $V_f$. Suppose that the gas is thermally insulated from the environment and is at an initial temperature $T_i$.

Figure 8.6: An Irreversible Process

Since the gas expands into a vacuum, it does no work, that is, $\Delta W=0$; furthermore, since it is thermally insulated, there is no flow of heat, and hence $\Delta Q=0$. The first law implies that
$\displaystyle \Delta E = -\Delta W+\Delta Q$ (8.30)
$\displaystyle \phantom{\Delta E} = 0$ (8.31)

Since the total energy of an ideal gas depends only on its temperature, $\Delta E=0$ implies that the initial and final temperatures are equal. Hence
\begin{displaymath}
T_i=T_f=T
\end{displaymath} (8.32)

So the initial state of the gas is $T_i=T,V_i$, and the final state of the gas is $T_f=T,V_f$. We would like to replace the irreversible free expansion of the ideal gas with an equivalent reversible process. Since the initial and final temperatures are the same, we can replace the free expansion with an (isothermal) expansion of the gas at constant temperature, from volume $V_i$ to $V_f$. To see that expansion at constant temperature is a reversible process, consider a gas in a piston in contact with a heat bath at temperature $T$. If one pulls the piston out very slowly, heat will flow in to keep the temperature at $T$; on the other hand, if one slowly pushes the piston in, the gas will be compressed, and heat will flow back into the reservoir. So we see that an isothermal expansion is reversible. All we need to do now is to compute the difference between the entropy of the ideal gas in the initial state $T_i,V_i$ and in the final state $T_f,V_f$. We hence have the total change in entropy, given from (8.23) and (8.25), as
$\displaystyle \Delta S = S_G(T,V_f)-S_G(T,V_i)$ (8.33)
$\displaystyle \phantom{\Delta S} = Nk \ln\left(\frac{V_f}{V_i}\right)>0$ : free expansion (8.34)

The result obtained shows that entropy increases in the free expansion of a gas, the precise amount being given by the formula above.
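As a minimal numerical check of (8.34) - a sketch in Python, where taking one mole of gas and a doubling of the volume are illustrative choices - we can evaluate the entropy increase directly:

import math

k = 1.38e-23          # Boltzmann's constant in J/K, eq. (8.6)
N = 6.02e23           # number of atoms: one mole (Avogadro's number)
V_i, V_f = 1.0, 2.0   # free expansion doubles the volume; only the ratio matters

# Entropy change for free expansion, eq. (8.34): Delta S = N k ln(V_f/V_i) > 0
delta_S = N * k * math.log(V_f / V_i)
print(delta_S)        # approximately 5.76 J/K

The increase is positive, as (8.34) requires, and is of the order of a few joules per kelvin for a mole of gas.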
Marakani Srikant 2000-09-11