Entropy

From Thermal-FluidsPedia

Thermodynamics and statistical mechanics

There are two related definitions of entropy: the thermodynamic definition and the statistical mechanics definition. The thermodynamic definition was developed in the early 1850s by Rudolf Clausius and essentially describes how to measure the entropy of an isolated system in thermodynamic equilibrium. Importantly, it makes no reference to the microscopic nature of matter. The statistical definition was developed by Ludwig Boltzmann in the 1870s by analyzing the statistical behavior of the microscopic components of the system. Boltzmann went on to show that this definition of entropy was equivalent to the thermodynamic entropy to within a constant of proportionality, since known as the Boltzmann constant. In summary, the thermodynamic definition of entropy provides the experimental definition of entropy, while the statistical definition extends the concept, providing an explanation and a deeper understanding of its nature.

Thermodynamic entropy is a non-conserved state function that is of great importance in the sciences of physics and chemistry.[1] Historically, the concept of entropy evolved in order to explain why some processes are spontaneous and others are not; systems tend to progress in the direction of increasing entropy.[2] Entropy is thus a measure of a system's tendency towards spontaneous change.[2] For isolated systems, entropy never decreases.[1] This fact has several important consequences in science: first, it prohibits "perpetual motion" machines; and second, it suggests an arrow of time. Increases in entropy correspond to irreversible changes in a system, because some energy must be expended as waste heat, limiting the amount of work a system can do.[3][4][5]

In statistical mechanics, entropy is essentially a measure of the number of ways in which a system may be arranged, often taken to be a measure of "disorder" (the higher the entropy, the higher the disorder).[3][4][5][6] This definition describes the entropy as a measure of the number of possible microscopic configurations of the individual atoms and molecules of the system (microstates) which would give rise to the observed macroscopic state (macrostate) of the system.

An everyday analogy to entropy can be demonstrated by mixing salt and pepper in a bag. Separate clusters of salt and pepper will tend to progress to a mixture if the bag is shaken. Furthermore, this example demonstrates how a process can be thermodynamically irreversible: separating the mixture back into distinct salt and pepper clusters via the random process of shaking is statistically improbable and practically impossible, because the mixture has a high degree of disorder. This is rendered in popular language by the saying "you can turn an aquarium into fish soup, but you can never turn the fish soup back into an aquarium": once a certain threshold has been passed, the effect of entropy becomes, for practical purposes, irrevocable.

Entropy and the Second Law

The second law of thermodynamics states that in general the total entropy of any system will not decrease other than by increasing the entropy of some other system. Hence, in a system isolated from its environment, the entropy of that system will tend not to decrease. It follows that heat will not flow from a colder body to a hotter body without the application of work (the imposition of order) to the colder body. Secondly, it is impossible for any device operating on a cycle to produce net work from a single temperature reservoir; the production of net work requires flow of heat from a hotter reservoir to a colder reservoir. As a result, there is no possibility of a "perpetual motion" system. Finally, a specified process, such as a chemical reaction, that produces a smaller increase in entropy is energetically more efficient.

It follows from the second law of thermodynamics that the entropy of a system that is not isolated may decrease. An air conditioner, for example, may cool the air in a room, thus reducing the entropy of the air of that system. The heat expelled from the room (the system) in the operation of the air conditioner will always make a larger contribution to the entropy of the environment than the decrease of the entropy of the air of that system. Thus, the total entropy of the room plus the entropy of the environment increases, in agreement with the second law of thermodynamics.

In mechanics, the second law in conjunction with the fundamental thermodynamic relation places limits on a system's ability to do useful work. The entropy change of a system at temperature T absorbing an infinitesimal amount of heat δq in a reversible way is given by δq/T. More explicitly, an amount of energy TR S is not available to do useful work, where TR is the temperature of the coldest accessible reservoir or heat sink external to the system. For further discussion, see Exergy.
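
As a brief illustration (not part of the original text; the numbers are made up), the Python lines below evaluate the energy rendered unavailable for work, TR S, for an assumed entropy and reservoir temperature.

    # Energy unavailable for useful work is T_R * S (values are hypothetical)
    T_R = 290.0     # temperature of the coldest accessible reservoir, K
    S = 2.5         # entropy in question, J/K
    unavailable = T_R * S
    print(f"Energy unavailable for useful work: {unavailable:.1f} J")   # 725.0 J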

Statistical mechanics demonstrates that entropy is governed by probability, thus allowing for a decrease in disorder even in a closed system. Although this is possible, such an event has a small probability of occurring, making it unlikely. Even if such an event were to occur, it would result in a transient decrease that would affect only a limited number of particles in the system.[7]

Statistical thermodynamics

Statistical mechanics views entropy as the amount of uncertainty (or "mixedupness" in the phrase of Gibbs) which remains about a system after its observable macroscopic properties (such as temperature, pressure and volume) have been taken into account. For a given set of macroscopic variables, the entropy measures the degree to which the probability of the system is spread out over different possible microstates. In contrast to the macrostate, which characterizes plainly observable average quantities, a microstate specifies all molecular details about the system, including the position and velocity of every molecule. The more such states are available to the system with appreciable probability, the greater the entropy.

More specifically, entropy is a logarithmic measure of the density of states:

S = -k_B \sum_i P_i \ln P_i

where kB = 1.38065 × 10−23 J K−1 is the Boltzmann constant, the summation is over all the microstates the system can be in, and the Pi are the probabilities for the system to be in the ith microstate. For almost all practical purposes, this can be taken as the fundamental definition of entropy since all other formulas for S can be mathematically derived from it, but not vice versa. (In some rare and recondite situations, a generalization of this formula may be needed to account for quantum coherence effects, but in any situation where a classical notion of probability makes sense, the above is the entropy.)
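
As a hedged sketch (not from the original article), the Python snippet below evaluates this sum for a hypothetical four-microstate system and confirms that the uniform distribution gives the larger entropy.

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def entropy(probabilities):
        """Return S = -k_B * sum_i P_i ln P_i, skipping zero-probability states."""
        return -k_B * sum(p * math.log(p) for p in probabilities if p > 0)

    # Hypothetical four-microstate probability distributions
    peaked = [0.7, 0.2, 0.05, 0.05]
    uniform = [0.25, 0.25, 0.25, 0.25]

    print(entropy(peaked))    # smaller entropy
    print(entropy(uniform))   # larger entropy, equal to k_B ln 4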

In what has been called "the most famous equation of statistical thermodynamics", the entropy of a system in which all states, of number Ω, are equally likely, is given by

S = k_B \ln \Omega.

In thermodynamics, such a system is one in which the volume, number of molecules, and internal energy are fixed (the microcanonical ensemble). The entropy is expressed in units of J·K−1.
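
A quick way to see the connection with the general formula above: when all Ω microstates are equally likely, each Pi = 1/Ω, and the sum reduces to the Boltzmann form:

S = -k_B \sum_{i=1}^{\Omega} \frac{1}{\Omega} \ln \frac{1}{\Omega} = k_B \ln \Omega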

In essence, the most general interpretation of entropy is as a measure of our uncertainty about a system. The equilibrium state of a system maximizes the entropy because we have lost all information about the initial conditions except for the conserved variables; maximizing the entropy maximizes our ignorance about the details of the system.[8] This uncertainty is not of the everyday subjective kind, but rather the uncertainty inherent to the experimental method and interpretative model.

The interpretative model has a central role in determining entropy. The qualifier "for a given set of macroscopic variables" above has very deep implications: if two observers use different sets of macroscopic variables, then they will observe different entropies. For example, if observer A uses the variables U, V and W, and observer B uses U, V, W, X, then, by changing X, observer B can cause an effect that looks like a violation of the second law of thermodynamics to observer A. In other words: the set of macroscopic variables one chooses must include everything that may change in the experiment, otherwise one might see decreasing entropy![9]

Classical thermodynamics

From a macroscopic perspective, in classical thermodynamics the entropy is interpreted as a state function of a thermodynamic system: that is, a property depending only on the current state of the system, independent of how that state came to be achieved. The state function has the important property that, when multiplied by a reference temperature, it can be understood as a measure of the amount of energy in a physical system that cannot be used to do thermodynamic work; i.e., work mediated by thermal energy. More precisely, in any process where the system gives up energy ΔE, and its entropy falls by ΔS, a quantity at least TR ΔS of that energy must be given up to the system's surroundings as unusable heat (TR is the temperature of the system's external surroundings). Otherwise the process will not go forward. In classical thermodynamics, the entropy of a system is defined only if it is in thermodynamic equilibrium.

Clausius stated the mathematical expression of this theorem as follows. Let δQ be an element of the heat given up by the body to any reservoir of heat during its own changes (heat which it may absorb from a reservoir being here reckoned as negative), and T the absolute temperature of the body at the moment of giving up this heat; then the equation:

\oint \frac{\delta Q}{T} \ge 0

must hold good for every cyclical process which is in any way possible, and the condition of equality for this equation will hold true for any reversible cyclical process.

This is the essential formulation of the second law and one of the original forms of the concept of entropy. It can be seen that the dimensions of entropy are energy divided by temperature, which are the same as the dimensions of the Boltzmann constant (kB) and of heat capacity. The SI unit of entropy is the joule per kelvin (J K−1). In this manner, the quantity TΔS acts as a kind of energy term, accounting for the effects of irreversibility, in the energy balance equation for any given system. In the Gibbs free energy equation, ΔG = ΔH − TΔS, for example, a formula commonly used to determine whether chemical reactions will occur spontaneously, the free energy related to entropy changes, TΔS, is subtracted from the "total" system enthalpy ΔH to give the "free" energy ΔG of the system.
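
As an illustrative sketch (not from the original article; the reaction data below are assumed, not sourced), the following Python lines apply ΔG = ΔH − TΔS to decide whether a process is spontaneous at a given temperature.

    # Assumed thermodynamic data for a hypothetical exothermic reaction
    delta_H = -92.0e3   # enthalpy change, J/mol
    delta_S = -199.0    # entropy change, J/(mol K)
    T = 298.15          # temperature, K

    delta_G = delta_H - T * delta_S
    verdict = "spontaneous" if delta_G < 0 else "non-spontaneous"
    print(f"delta_G = {delta_G/1000:.1f} kJ/mol -> {verdict}")   # about -32.7 kJ/mol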

In a thermodynamic system, pressure, density, and temperature tend to become uniform over time because this equilibrium state has higher probability (more possible combinations of microstates) than any other. Consider, for example, a cold glass of ice and water (the system) in a warm room (the surroundings): the difference in temperature begins to be equalized as portions of the heat energy from the warm surroundings spread out to the cooler system of ice and water.

Figure: A thermodynamic system.

Over time the temperature of the glass and its contents and the temperature of the room become equal. The entropy of the room has decreased as some of its energy has been dispersed to the ice and water. However, as the calculation sketched below illustrates, the entropy of the system of ice and water has increased more than the entropy of the surrounding room has decreased. In an isolated system such as the room and ice water taken together, the dispersal of energy from warmer to cooler always results in a net increase in entropy. Thus, when the "universe" of the room and ice water system has reached a temperature equilibrium, the entropy change from the initial state is at a maximum. The entropy of the thermodynamic system is a measure of how far the equalization has progressed.
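
A minimal numerical sketch of this bookkeeping (not in the original article; the heat transferred and the temperatures are assumed values): if heat Q flows from the warm room into the ice water, the room loses entropy Q/Troom while the ice water gains Q/Tice, and because Tice < Troom the net change is positive.

    # Assumed values: 1000 J of heat flows from a 298 K room into ice water at 273 K
    Q = 1000.0      # heat transferred, J
    T_room = 298.0  # K
    T_ice = 273.0   # K (an ice-water mixture stays at its melting point)

    dS_room = -Q / T_room   # entropy change of the surroundings
    dS_system = Q / T_ice   # entropy change of the ice water
    print(dS_room + dS_system)   # net change, about +0.31 J/K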

A special case of entropy increase, the entropy of mixing, occurs when two or more different substances are mixed. If the substances are at the same temperature and pressure, there will be no net exchange of heat or work; the entropy change will be entirely due to the mixing of the different substances. At a statistical mechanical level, this arises from the change in available volume per particle upon mixing.[10]
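
A minimal sketch (not from the original article), assuming ideal mixing, evaluates the molar entropy of mixing, ΔSmix = −R Σ xi ln xi, for an equimolar two-component mixture.

    import math

    R = 8.314  # gas constant, J/(mol K)

    def entropy_of_mixing(mole_fractions):
        """Ideal molar entropy of mixing: -R * sum_i x_i ln x_i."""
        return -R * sum(x * math.log(x) for x in mole_fractions if x > 0)

    print(entropy_of_mixing([0.5, 0.5]))   # about 5.76 J/(mol K)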

Entropy versus heat and temperature

Loosely speaking, when a system's energy is divided into its "useful" energy (energy that can be used, for example, to push a piston), and its "useless energy" (that energy which cannot be used to do external work), then entropy can be used to estimate the "useless", "stray", or "lost" energy, which depends on the entropy of the system and the absolute temperature of the surroundings. As the "useful" and "useless" energy both depend on the surroundings, neither one is a function of the state of the system, and both can be quite tricky to quantify. This stands in contrast to the system's Gibbs free energy (for isobaric processes), Helmholtz free energy, entropy, and temperature, all of which are well-defined functions of state. The Gibbs and Helmholtz free energies depend on the temperature of the system (not the surroundings), and do not purport to measure the "useful" energy.

When heat is added to a system at high temperature, the increase in entropy is small. When heat is added to a system at low temperature, the increase in entropy is great. This can be quantified as follows: in thermal systems, changes in the entropy can be ascertained by observing the temperature while observing changes in energy. This is restricted to situations where thermal conduction is the only form of energy transfer (in contrast to frictional heating and other dissipative processes). It is further restricted to systems at or near thermal equilibrium. In systems held at constant temperature, the change in entropy, ΔS, is given by the equation[11]

\Delta S  = \frac{Q}{T},

where Q is the amount of heat absorbed by the system in an isothermal and reversible process in which the system goes from one state to another, and T is the absolute temperature at which the process is occurring.
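
For instance (a worked example added here, using the commonly quoted latent heat of fusion of ice, about 334 kJ/kg), melting 1 kg of ice reversibly at its melting point gives:

    # Melting 1 kg of ice at 273.15 K (latent heat of fusion ~ 334 kJ/kg)
    Q = 334.0e3     # heat absorbed by the ice, J
    T = 273.15      # melting temperature, K

    delta_S = Q / T
    print(f"Entropy change of the ice: {delta_S:.0f} J/K")   # about 1223 J/K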

If the temperature of the system is not constant, then the relationship becomes a differential equation:

dS  = \frac{\delta q}{T}.

Then the total change in entropy for a transformation is:

 \Delta S = \int \frac{ \delta q }{T}.
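
As a hedged example (not in the original article; the property values are typical figures for liquid water), if a substance of roughly constant heat capacity is heated reversibly from T1 to T2, then δq = m c dT and the integral evaluates to ΔS = m c ln(T2/T1):

    import math

    # Assumed: 1 kg of liquid water heated reversibly from 293 K to 353 K
    m = 1.0        # mass, kg
    c = 4186.0     # specific heat of liquid water, J/(kg K), roughly constant
    T1, T2 = 293.0, 353.0

    delta_S = m * c * math.log(T2 / T1)   # integral of m*c*dT/T from T1 to T2
    print(f"Entropy change: {delta_S:.0f} J/K")   # about 780 J/K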

This thermodynamic approach to calculating the entropy is subject to several narrow restrictions which must be respected. In contrast, the fundamental statistical definition of entropy applies to any system, including systems far from equilibrium, and including experiments where "heat" and "temperature" are undefinable. In situations where the thermodynamic approach is valid, it can be shown to be consistent with the fundamental statistical definition.

In any case, the statistical definition of entropy remains the fundamental definition, from which all other definitions and all properties of entropy can be derived.

The fundamental thermodynamic relation

The entropy of a system depends on its internal energy and the external parameters, such as the volume. In the thermodynamic limit this fact leads to an equation relating the change in the internal energy to changes in the entropy and the external parameters. This relation is known as the fundamental thermodynamic relation. If the volume is the only external parameter, this relation is:

dE = T dS - P dV

Since the internal energy is fixed when one specifies the entropy and the volume, this relation is valid even if the change from one state of thermal equilibrium to another with infinitesimally larger entropy and volume happens in a non-quasistatic way (so during this change the system may be very far out of thermal equilibrium and then the entropy, pressure and temperature may not exist).

The fundamental thermodynamic relation implies many thermodynamic identities that are valid in general, independent of the microscopic details of the system. Important examples are the Maxwell relations and the relations between heat capacities.
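
For example (a standard consequence, stated here for concreteness), since T = (∂E/∂S)_V and P = −(∂E/∂V)_S, equality of the mixed second derivatives of E(S, V) gives one of the Maxwell relations:

\left(\frac{\partial T}{\partial V}\right)_S = -\left(\frac{\partial P}{\partial S}\right)_V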

Entropy balance equation for open systems

In chemical engineering, the principles of thermodynamics are commonly applied to "open systems", i.e. those in which heat, work, and mass flow across the system boundary. In a system in which there are flows of both heat (\dot{Q}) and work, i.e. \dot{W}_S (shaft work) and P(dV/dt) (pressure-volume work), across the system boundaries, the heat flow, but not the work flow, causes a change in the entropy of the system. This rate of entropy change is \dot{Q}/T, where T is the absolute thermodynamic temperature of the system at the point of the heat flow. If, in addition, there are mass flows across the system boundaries, the total entropy of the system will also change due to this convected flow. During steady-state continuous operation, an entropy balance applied to an open system accounts for system entropy changes related to heat flow and mass flow across the system boundary.

To derive a generalized entropy balance equation, we start with the general balance equation for the change in any extensive quantity Θ in a thermodynamic system, a quantity that may be either conserved, such as energy, or non-conserved, such as entropy. The basic generic balance expression states that dΘ/dt, i.e. the rate of change of Θ in the system, equals the rate at which Θ enters the system at the boundaries, minus the rate at which Θ leaves the system across the system boundaries, plus the rate at which Θ is generated within the system. Using this generic balance equation, with respect to the rate of change with time of the extensive quantity entropy S, the entropy balance equation for an open thermodynamic system is:

\frac{dS}{dt} = \sum_{k=1}^K  \dot{M}_k \hat{S}_k  + \frac{\dot{Q}}{T} + \dot{S}_{gen}

where

\sum_{k=1}^K  \dot{M}_k  \hat{S}_k = the net rate of entropy flow due to the flows of mass into and out of the system (where \hat{S} = entropy per unit mass).
\frac{\dot{Q}}{T} = the rate of entropy flow due to the flow of heat across the system boundary.
\dot{S}_{gen} = the rate of internal generation of entropy within the system.

Note, also, that if there are multiple heat flows, the term \dot{Q}/T is to be replaced by \sum \dot{Q}_j/T_j, where \dot{Q}_j is the heat flow and Tj is the temperature at the jth heat flow port into the system.
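
As a minimal sketch (not from the original article; all stream data are hypothetical), the Python lines below rearrange the balance at steady state (dS/dt = 0) to estimate the entropy generation rate of a single-inlet, single-outlet device.

    # Hypothetical steady-state device: one inlet, one outlet, one heat-flow port
    M_dot = 2.0        # mass flow rate in and out, kg/s
    s_in = 1.25        # specific entropy of the inlet stream, kJ/(kg K) (assumed)
    s_out = 1.40       # specific entropy of the outlet stream, kJ/(kg K) (assumed)
    Q_dot = -50.0      # heat flow into the system, kW (negative: heat rejected)
    T_b = 320.0        # boundary temperature where the heat crosses, K

    # Steady state: 0 = M_dot*(s_in - s_out) + Q_dot/T_b + S_dot_gen
    S_dot_gen = M_dot * (s_out - s_in) - Q_dot / T_b
    print(f"Entropy generation rate: {S_dot_gen:.3f} kW/K")   # must be >= 0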

Reversible process

In thermodynamics, a reversible process, or reversible cycle if the process is cyclic, is a process that can be "reversed" by means of infinitesimal changes in some property of the system without loss or dissipation of energy.[12] Due to these infinitesimal changes, the system is in thermodynamic equilibrium throughout the entire process. Since it would take an infinite amount of time for the reversible process to finish, perfectly reversible processes are impossible. However, if the system undergoing the changes responds much faster than the applied change, the deviation from reversibility may be negligible. In a reversible cycle, the system and its surroundings will be exactly the same after each cycle.[13]

An alternative definition of a reversible process is a process that, after it has taken place, can be reversed and causes no change in either the system or its surroundings. In thermodynamic terms, a process "taking place" would refer to its transition from its initial state to its final state.

Irreversibility

A process that is not reversible is termed irreversible. In an irreversible process, finite changes are made; therefore the system is not at equilibrium throughout the process. At the same point in an irreversible cycle, the system will be in the same state, but the surroundings are permanently changed after each cycle.[13]

References

  1. Sandler, S. I., Chemical and Engineering Thermodynamics, 3rd ed., Wiley, New York, 1999, p. 91.
  2. McQuarrie, D. A., and Simon, J. D., Physical Chemistry: A Molecular Approach, University Science Books, Sausalito, 1997, p. 817.
  3. McGraw-Hill Concise Encyclopedia of Chemistry, 2004.
  4. Sethna, J., Statistical Mechanics, Oxford University Press, 2006, p. 78.
  5. Oxford Dictionary of Science, 2005.
  6. Barnes & Noble's Essential Dictionary of Science, 2004.
  7. Saha, A., Lahiri, S., and Jayannavar, A. M., "Entropy production theorems and some consequences," Physical Review E, American Physical Society, 14 July 2009, pp. 1-10.
  8. EntropyOrderParametersComplexity.pdf
  9. Jaynes, E. T., "The Gibbs Paradox," in Maximum Entropy and Bayesian Methods; Smith, C. R., Erickson, G. J., and Neudorfer, P. O., Eds.; Kluwer Academic: Dordrecht, 1992, pp. 1-22.
  10. Ben-Naim, A., "On the So-Called Gibbs Paradox, and on the Real Paradox," Entropy, 9, 132-136, 2007.
  11. http://khanexercises.appspot.com/video?v=xJf6pHqLzs0
  12. Sears, F. W., and Salinger, G. L., Thermodynamics, Kinetic Theory, and Statistical Thermodynamics, 3rd ed., Addison-Wesley, 1986.
  13. Zumdahl, S. S., "10.2 The Isothermal Expansion and Compression of an Ideal Gas," Chemical Principles, 5th ed., Houghton Mifflin Company, 2005.
