Entropy in Chemistry

In thermodynamics, entropy (usual symbol S) is a measure of the number of specific ways in which a thermodynamic system may be arranged, commonly understood as a measure of disorder. According to the second law of thermodynamics, the entropy of an isolated system never decreases; such a system will spontaneously proceed towards thermodynamic equilibrium, the configuration with maximum entropy. Systems that are not isolated may decrease in entropy, provided they increase the entropy of their environment by at least that same amount. Since entropy is a state function, the change in the entropy of a system is the same for any process that goes from a given initial state to a given final state, whether the process is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment.

The change in entropy (ΔS) of a system was originally defined for a thermodynamically reversible process as

ΔS = ∫ dQ_rev / T,

where dQ_rev is an incremental reversible transfer of heat into the system and T is the absolute temperature of the system at which that transfer occurs. (If heat is transferred out of the system, the sign is reversed, giving a decrease in entropy.) The above definition is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic description of the contents of a system. The concept of entropy has been found to be generally useful and has several other formulations. Entropy was discovered when it was noticed to be a quantity that behaves as a function of state, as a consequence of the second law of thermodynamics.
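As a rough numerical check of this definition (the substance, temperature and volumes below are hypothetical, and the isothermal ideal-gas expression for dQ_rev is an assumption of the example, not part of the text above), one can integrate dQ_rev/T along a reversible path and compare with the closed-form result:

```python
import numpy as np

# Reversible isothermal expansion of an ideal gas (hypothetical numbers).
# Along an isotherm, dQ_rev = n*R*T * dV / V, so dS = dQ_rev / T = n*R * dV / V.
R = 8.314                # gas constant, J K^-1 mol^-1
n = 1.0                  # amount of substance, mol
T = 298.15               # absolute temperature, K
V1, V2 = 0.010, 0.020    # initial and final volumes, m^3

V = np.linspace(V1, V2, 10_000)
dQrev_dV = n * R * T / V              # reversible heat absorbed per unit volume change, J m^-3
delta_S = np.trapz(dQrev_dV / T, V)   # ΔS = ∫ dQ_rev / T, J K^-1

print(f"numerical  ΔS = {delta_S:.4f} J/K")
print(f"analytical ΔS = {n * R * np.log(V2 / V1):.4f} J/K")   # n R ln(V2/V1)
```

The analytical value nR ln(V2/V1) ≈ 5.76 J/K agrees with the numerical integral, as expected for a change computed along a reversible path.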

Entropy is an extensive property. It has the dimension of energy divided by temperature, with the SI unit of joules per kelvin (J K⁻¹), or kg m² s⁻² K⁻¹ in terms of base units. However, the entropy of a pure substance is usually given as an intensive property: either entropy per unit mass (SI unit: J K⁻¹ kg⁻¹) or entropy per unit amount of substance (SI unit: J K⁻¹ mol⁻¹).
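As a short illustration of the extensive/intensive distinction (the molar entropy of liquid water and the sample size below are assumed example values, not taken from the text above):

```python
# Converting between molar entropy (intensive), specific entropy (intensive)
# and total entropy (extensive); the numbers are assumed example values.
S_molar = 69.95      # molar entropy of liquid water at 25 °C, J K^-1 mol^-1 (approximate)
M = 0.018015         # molar mass of water, kg mol^-1
n = 2.5              # amount of substance in the sample, mol (hypothetical)

S_specific = S_molar / M     # entropy per unit mass, J K^-1 kg^-1
S_total = n * S_molar        # extensive entropy of the whole sample, J K^-1

print(f"specific entropy: {S_specific:.0f} J/(K·kg)")
print(f"total entropy of the {n} mol sample: {S_total:.1f} J/K")
```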

The absolute entropy (S rather than ΔS) was defined later, using either statistical mechanics or the third law of thermodynamics.

In the modern microscopic interpretation of entropy in statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. Understanding the role of thermodynamic entropy in various processes requires an understanding of how and why that information changes as the system evolves from its initial to its final condition. It is often said that entropy is an expression of the disorder, or randomness of a system, or of our lack of information about it. The second law is now often seen as an expression of the fundamental postulate of statistical mechanics through the modern definition of entropy.
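One standard way to make this information-theoretic reading concrete, not spelled out in the text above, is the Gibbs form S = −k_B Σ p_i ln p_i over microstate probabilities; the sketch below uses two hypothetical distributions to show that a broader distribution (more missing information) gives a larger entropy:

```python
import numpy as np

# Gibbs/Shannon entropy S = -k_B * sum(p_i * ln p_i) over microstate probabilities.
# The two probability distributions below are hypothetical illustrations.
k_B = 1.380649e-23   # Boltzmann constant, J K^-1

def gibbs_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # terms with p_i = 0 contribute nothing
    return -k_B * np.sum(p * np.log(p))

uniform = [0.25, 0.25, 0.25, 0.25]      # microstate completely unknown within the macrostate
peaked  = [0.97, 0.01, 0.01, 0.01]      # microstate almost fully known

print(gibbs_entropy(uniform))   # larger: more information needed to pin down the microstate
print(gibbs_entropy(peaked))    # smaller: little additional information needed
```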

Function of state

There are many thermodynamic properties that are functions of state. This means that at a particular thermodynamic state (which should not be confused with the microscopic state of a system), these properties have a definite value. Often, if two properties of the system are determined, then the state is determined and the other properties' values can also be determined. For instance, a gas at a particular temperature and pressure has its state fixed by those values and has a particular volume determined by those values. As another instance, a system composed of a pure substance of a single phase at a particular uniform temperature and pressure is determined (and is thus a particular state) and has not only a particular volume but also a particular entropy. The fact that entropy is a function of state is one reason it is useful. In the Carnot cycle, the working fluid returns to the same state it had at the start of the cycle, hence the line integral of any state function, such as entropy, over the cycle is zero.
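Because entropy is a state function, the entropy changes summed around any closed path of equilibrium states must total zero. The sketch below illustrates this with a hypothetical monatomic ideal-gas sample, using the standard ideal-gas expression ΔS = n Cv ln(T2/T1) + n R ln(V2/V1) between states (an assumption of the example, not part of the text above):

```python
import numpy as np

# Entropy as a function of state: for a monatomic ideal gas,
#   ΔS = n*Cv*ln(T2/T1) + n*R*ln(V2/V1)
# depends only on the endpoint states, so the sum around a closed cycle is zero.
R = 8.314
n = 1.0
Cv = 1.5 * R        # molar heat capacity at constant volume, monatomic ideal gas

def delta_S(state1, state2):
    T1, V1 = state1
    T2, V2 = state2
    return n * Cv * np.log(T2 / T1) + n * R * np.log(V2 / V1)

# A closed path through four states (T in K, V in m^3), returning to the start:
cycle = [(300.0, 0.010), (300.0, 0.020), (500.0, 0.020), (500.0, 0.010), (300.0, 0.010)]

total = sum(delta_S(a, b) for a, b in zip(cycle, cycle[1:]))
print(f"entropy change around the cycle: {total:.2e} J/K")   # zero up to rounding error
```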

Reversible process

Entropy is defined for a reversible process and for a system that, at all times, can be treated as being at a uniform state and thus at a uniform temperature. Reversibility is an ideal that some real processes approximate and that is often presented in study exercises. For a reversible process, entropy behaves as a conserved quantity: total entropy is conserved in a reversible process and is not conserved in an irreversible process. One has to be careful about system boundaries. For example, in the Carnot cycle, the heat flow from the hot reservoir to the cold reservoir represents an increase in entropy. The work output, if reversibly and perfectly stored in some energy storage mechanism, represents a corresponding decrease in entropy that could be used to operate the heat engine in reverse and return the system to its previous state; thus the total entropy change is still zero at all times if the entire process is reversible. Any process that does not meet the requirements of a reversible process must be treated as an irreversible process, which is usually a complex task. An irreversible process increases entropy.
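The Carnot bookkeeping described above can be sketched numerically; the reservoir temperatures and the heat drawn per cycle below are hypothetical, and the relation Q_C/T_C = Q_H/T_H used in the sketch is the standard condition for reversible operation:

```python
# Entropy bookkeeping for a reversible Carnot cycle (hypothetical values).
T_H, T_C = 500.0, 300.0    # hot and cold reservoir temperatures, K
Q_H = 1000.0               # heat drawn from the hot reservoir per cycle, J

Q_C = Q_H * T_C / T_H      # heat rejected to the cold reservoir under reversible operation, J
W = Q_H - Q_C              # work output per cycle, J (carries no entropy)

dS_hot = -Q_H / T_H        # entropy lost by the hot reservoir
dS_cold = +Q_C / T_C       # entropy gained by the cold reservoir

print(f"work output per cycle: {W:.0f} J")
print(f"total entropy change:  {dS_hot + dS_cold:+.2e} J/K")   # zero for the reversible cycle
```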

Heat transfer situations require two or more non-isolated systems in thermal contact. In irreversible heat transfer, heat energy is irreversibly transferred from the higher-temperature system to the lower-temperature system, and the combined entropy of the systems increases. Each system, by definition, must have its own absolute temperature applicable within all areas of that system in order to calculate the entropy transfer. Thus, when a system at higher temperature T_H transfers heat dQ to a system of lower temperature T_C, the former loses entropy dQ/T_H and the latter gains entropy dQ/T_C. Since T_H > T_C, it follows that dQ/T_H < dQ/T_C, so there is a net gain in the combined entropy. The same requirement of a well-defined absolute temperature for each system also applies when calculating the entropy change of an isolated system that has no thermal contact with its surroundings.
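A short numerical check of this net gain (the temperatures and heat quantity are hypothetical):

```python
# Irreversible heat transfer between a hotter and a colder system (hypothetical values).
T_H, T_C = 400.0, 300.0    # absolute temperatures, K
dQ = 100.0                 # heat transferred from the hot system to the cold system, J

dS_hot = -dQ / T_H         # entropy lost by the hotter system
dS_cold = +dQ / T_C        # entropy gained by the colder system
dS_total = dS_hot + dS_cold

print(f"hot system:  {dS_hot:+.4f} J/K")
print(f"cold system: {dS_cold:+.4f} J/K")
print(f"combined:    {dS_total:+.4f} J/K")   # positive: net entropy is produced
```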