
= Chapter 18: Entropy, Free Energy, and Equilibrium =

~ **The Laws of Thermodynamics**:

 * **0** - Two thermodynamic systems in equilibrium with a third system are in equilibrium with each other.

The name of this "Zeroth Law" reflects the fact that it was added after the other three laws but logically precedes them: it establishes the notion of thermal equilibrium, and hence of temperature, on which the statements of the other laws depend.


 * **1** - Energy can change form, but can never be created nor destroyed.

Alternative statement: The increase in internal energy of a system is equal to the amount of energy added to the system minus the amount of work which the system does on its environment.

Note that the First Law is an expression of the conservation of energy.

 * **2** - The entropy of the universe increases in a spontaneous process and remains unchanged in an equilibrium process.

Alternative statement: Heat cannot spontaneously flow from a region of lower temperature to a region of higher temperature.

Alternative statement (2): It is impossible to convert heat completely into work in a cyclic process.

It is a consequence of the Second Law that a heat engine with 100% efficiency is impossible.
 * **3** - The entropy of a perfect crystalline substance is zero at the absolute zero of temperature.

A //Spontaneous Process// is a process or reaction that does occur under a given set of conditions. //E.g.,// water flows downhill but never uphill; a lump of sugar dissolves in water but never spontaneously reassembles from its dissolved particles.

The //Entropy// of a system is a measure of its disorder or randomness. Ludwig Boltzmann (1844-1906) showed that the entropy //S// of a system is related to the number //W// of its microstates as follows:

//S = k// ln //W//

where //k// is Boltzmann's constant.
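As a quick numerical illustration, the Boltzmann relation can be evaluated directly. This is a minimal sketch; the function name and the microstate counts are illustrative, while the value of //k// is the exact one fixed by the 2019 SI definition.

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (exact by the 2019 SI definition)

def boltzmann_entropy(w):
    """Entropy S = k ln W for a system with W microstates."""
    return K_B * math.log(w)

# A single microstate (perfect order) gives zero entropy,
# and doubling the number of microstates adds k ln 2.
print(boltzmann_entropy(1))                           # 0.0
print(boltzmann_entropy(2) - boltzmann_entropy(1))
```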

If we define ΔSuniv, ΔSsys, and ΔSsurr as the changes in entropy of the universe, the system, and the surroundings, respectively, we have, for a spontaneous process, ΔSuniv = ΔSsys + ΔSsurr > 0,

while for an equilibrium process, ΔSuniv = ΔSsys + ΔSsurr = 0.

Suppose a system consists of a reaction of the form aA + bB --> cC + dD.

The **Standard Entropy of Reaction** ΔSrxn is defined as the difference in standard entropies between the products and the reactants:

ΔSrxn = (cS(C) + dS(D)) - (aS(A) + bS(B)).

Example: Calculate the standard entropy of reaction for the dissociation of CaCO3 into CaO and CO2.

ΔSrxn = S(CaO) + S(CO2) - S(CaCO3) = 39.8 + 213.6 - 92.9 = 160.5 J/K·mol.
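The same bookkeeping can be scripted. A minimal sketch, using only the three standard entropies quoted in the example (the function name and dictionary layout are illustrative):

```python
# Standard entropies at 25 °C in J/(K·mol), as quoted in the example above.
S_STD = {"CaCO3": 92.9, "CaO": 39.8, "CO2": 213.6}

def entropy_of_reaction(products, reactants):
    """ΔSrxn = Σ n·S(products) − Σ m·S(reactants).

    `products` and `reactants` map a species name to its stoichiometric coefficient.
    """
    total = lambda side: sum(n * S_STD[sp] for sp, n in side.items())
    return total(products) - total(reactants)

# CaCO3 -> CaO + CO2
dS = entropy_of_reaction({"CaO": 1, "CO2": 1}, {"CaCO3": 1})
print(dS)  # ≈ 160.5 J/K·mol
```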

~ **Gibbs Free Energy** (subject matter of section 18.5 in the Chang text)

The **Gibbs free energy** (G) or **free energy** is defined as

G = H - TS

where T is the temperature of the system. The change in free energy is then given by

ΔG = ΔH - T ΔS

If ΔG < 0, then the reaction is spontaneous in the forward direction. If ΔG > 0, then the reaction is nonspontaneous (it is spontaneous in the opposite direction). If ΔG = 0, then the system is at equilibrium.
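These sign rules follow directly from ΔG = ΔH - TΔS. A minimal sketch (the function names and the sample ΔH, ΔS values are illustrative, not tabulated data):

```python
def delta_g(dH, T, dS):
    """ΔG = ΔH − T·ΔS, with consistent units (J, K, J/K)."""
    return dH - T * dS

def spontaneity(dG):
    """Classify a reaction by the sign of ΔG."""
    if dG < 0:
        return "spontaneous (forward)"
    if dG > 0:
        return "nonspontaneous (spontaneous in reverse)"
    return "at equilibrium"

# Illustrative endothermic, entropy-driven process: dH = +40 kJ, dS = +120 J/K.
print(spontaneity(delta_g(40_000, 298, 120)))  # ΔG > 0 at room temperature
print(spontaneity(delta_g(40_000, 500, 120)))  # ΔG < 0 at high temperature
```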

 * The **standard free energy of reaction** (ΔGrxn) is the change in free energy for a reaction carried out under standard-state conditions. It is given by:

ΔGrxn = Σn ΔGf(products) - Σm   ΔGf(reactants)

with stoichiometric coefficients n and m. ΔGf, the **standard free energy of formation**, is the change in free energy when 1 mole of the compound is synthesized from its elements in their standard states.

Example: Consider the reaction C + O2 --> CO2. The standard free energy change is

ΔGrxn = ΔGf(CO2) - ΔGf(C) - ΔGf(O2) = ΔGf(CO2) - 0 - 0 = ΔGf(CO2).

Thus the standard free energy change of the synthesis of CO2 is equal to ΔGf(CO2), which has a numerical value of -394.4 kJ/mol.
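The free-energy bookkeeping mirrors the entropy-of-reaction calculation. A sketch using only the single tabulated value quoted above; elements in their standard states contribute ΔGf = 0 by definition (function name and dictionary layout are illustrative):

```python
# Standard free energies of formation at 25 °C, kJ/mol.
# Elements in their standard states (C as graphite, O2 gas) are zero by definition;
# the CO2 value is the one quoted in the text.
DG_F = {"C": 0.0, "O2": 0.0, "CO2": -394.4}

def free_energy_of_reaction(products, reactants):
    """ΔGrxn = Σ n·ΔGf(products) − Σ m·ΔGf(reactants)."""
    total = lambda side: sum(n * DG_F[sp] for sp, n in side.items())
    return total(products) - total(reactants)

# C + O2 -> CO2
print(free_energy_of_reaction({"CO2": 1}, {"C": 1, "O2": 1}))  # -394.4 kJ/mol
```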

//Analysis//: the sign of ΔG

|| ΔH || ΔS || ΔG ||
|| + || + || - if TΔS > ΔH. Reaction is spontaneous at high temperatures (and spontaneous in reverse at low temperatures). ||
|| + || - || + at all temperatures. Reaction is spontaneous in the reverse direction. ||
|| - || + || - at all temperatures. Reaction is spontaneous. ||
|| - || - || - if TΔS < ΔH. Reaction is spontaneous at low temperatures (and spontaneous in reverse at high temperatures). ||
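The four cases in this table can be captured in a small helper. In the two cases where ΔH and ΔS share a sign, ΔG changes sign at the crossover temperature T = ΔH/ΔS (the function name and the sample numbers are illustrative):

```python
def spontaneity_regime(dH, dS):
    """Classify the sign of ΔG = ΔH − T·ΔS as a function of temperature.

    dH in J, dS in J/K. Returns a human-readable summary of the table's four cases.
    """
    if dH > 0 and dS > 0:
        return f"spontaneous above T = {dH / dS:.0f} K"
    if dH > 0 and dS < 0:
        return "spontaneous in the reverse direction at all temperatures"
    if dH < 0 and dS > 0:
        return "spontaneous at all temperatures"
    if dH < 0 and dS < 0:
        return f"spontaneous below T = {dH / dS:.0f} K"
    return "boundary case (ΔH or ΔS is zero)"

print(spontaneity_regime(40_000, 120))    # crossover near 333 K
print(spontaneity_regime(-40_000, -120))
```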

**Phase Transitions** When a phase change occurs, we have equilibrium (ΔG=0) so ΔS = ΔH/T. This allows us to compute the entropy change of a substance changing states.
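For example, the molar entropy of vaporization of water can be estimated this way, taking ΔHvap ≈ 40.79 kJ/mol at the normal boiling point (373 K); these are commonly tabulated values, and the function name is illustrative:

```python
def entropy_of_transition(dH_joules, T_kelvin):
    """ΔS = ΔH/T at a phase transition, where the two phases are in equilibrium (ΔG = 0)."""
    return dH_joules / T_kelvin

# Vaporization of water at its normal boiling point.
dS_vap = entropy_of_transition(40_790, 373)
print(round(dS_vap, 1))  # ≈ 109.4 J/(K·mol)
```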


 * **Equilibrium**: It can be shown that the standard free energy change is related to the equilibrium constant K of the reaction by ΔG° = -RT ln K. The following table summarizes the consequences of this equation.

|| ln K || ΔG° || Result ||
|| + || - || Right shift ||
|| 0 || 0 || No shift ||
|| - || + || Left shift ||

Example: Kp of the reaction N2O4 <-> 2NO2 is 0.3 at 300 K, corresponding to a standard free energy change ΔG° of about +3 kJ/mol. The initial pressures are PNO2 = 0.33 atm and PN2O4 = 0.3 atm. What is ΔG for the reaction?

Solution. We have ΔG = ΔG° + RT ln (PNO2^2 / PN2O4) = 3000 + (8.314)(300) ln (0.33^2/0.3) ≈ 472.5 J/mol. Because ΔG > 0, the reaction shifts to the left to reach equilibrium.
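This solution can be reproduced numerically. A sketch, taking ΔG° as the 3 kJ/mol quoted in the problem (the function name is illustrative):

```python
import math

R = 8.314  # gas constant, J/(K·mol)

def delta_g_from_q(dG_std, T, Q):
    """ΔG = ΔG° + RT ln Q, where Q is the reaction quotient."""
    return dG_std + R * T * math.log(Q)

# N2O4 <-> 2 NO2: Q = P_NO2^2 / P_N2O4 with the given initial pressures.
Q = 0.33**2 / 0.3
dG = delta_g_from_q(3000, 300, Q)
print(round(dG, 1))  # ≈ 472.5 J/mol, so the reaction shifts left
```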

**Modern Applications of Entropy**

Entropy makes an appearance in Information Theory, a discipline in electrical engineering which deals with the reliable transfer and storage of information; in this context, it describes the amount of uncertainty associated with a message. If p(x) is the probability of a message x, then the information theoretic entropy is defined as

H = -Σ p(x) log p(x)

where the sum is taken over all possible messages x.

When we work with bits as the unit of information, the logarithm is taken base 2.
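A minimal sketch of the definition in Python, using log base 2 so the result is in bits (the function name is illustrative):

```python
import math

def shannon_entropy(probs):
    """H = -Σ p(x) log2 p(x), in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit of uncertainty
print(shannon_entropy([0.9, 0.1]))   # biased coin: less uncertainty
print(shannon_entropy([0.25] * 4))   # four equally likely messages: 2.0 bits
```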

[Image: Claude Shannon, the "father" of Information Theory. Source: []]


 * **Applications of Information Theory**

Shannon's generalization of entropy to quantities of information, especially communicated messages, underpins the following areas:

 * The "coding" (efficient encoding) of information
 * Quantitative measurement of the "redundancy" of a message
 * Cryptography

//Illustration of the transition from an ordered state to a disordered state of a physical system via colored balls//

 * **LAB**

Materials:
 * A transparent cylindrical container
 * Enough ping-pong-sized balls of two distinct colors to fill half of the container
 * A camera

Procedure: Place all balls of color 1 in the bottom of the container and all balls of color 2 on top of them. This is a highly ordered system state because of the clear organization of the balls: each layer consists of only one color. Next, lift the container and give it one gentle shake. Observe the arrangement of the balls and take one photograph. Give a second shake, observe, and photograph. Continue shaking, observing, and photographing until the colors are entirely mixed.

Observe what happens to the organization of the balls over the course of this physical process. The layers gradually mix, and the clear initial organization of the balls is entirely undermined. This transition from order to disorder is precisely the essence of entropy: by the Second Law of Thermodynamics, the entropy of the universe increases in every spontaneous process, and in our container system we observe this transition very clearly.
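The lab can also be mimicked with a toy simulation, not a physical model: represent the column of balls as a list, model each shake as a few random swaps, and track order as the fraction of the bottom layer still holding its original color. All names and parameters here are illustrative.

```python
import random

def simulate_shakes(n_per_color=20, n_shakes=50, swaps_per_shake=10, seed=0):
    """Toy model of the ball-mixing lab: each 'shake' swaps a few random pairs.

    Returns the order measure after each shake: the fraction of the bottom
    half of the column that is still color 1.
    """
    rng = random.Random(seed)
    # Ordered start: all of color 1 on the bottom, all of color 2 on top.
    column = [1] * n_per_color + [2] * n_per_color
    order_history = []
    for _ in range(n_shakes):
        for _ in range(swaps_per_shake):
            i, j = rng.randrange(len(column)), rng.randrange(len(column))
            column[i], column[j] = column[j], column[i]
        in_place = sum(1 for ball in column[:n_per_color] if ball == 1)
        order_history.append(in_place / n_per_color)
    return order_history

history = simulate_shakes()
print(history[0], history[-1])  # order decays toward the mixed value ~0.5
```

Once mixed, the order measure fluctuates around 0.5 rather than returning to 1.0, echoing the observation below that the system does not spontaneously re-sort itself.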

Question: Is the shaking process deterministic? Answer: Real-life processes that are "deterministic" have the property that future "states", or configurations of constituent elements, can be determined or predicted from the current or initial state. Our process is non-deterministic because of the complexity of its structure: the transitions of the ball-container system are effectively random, since we do not control the details of the shaking, and the system has far too many microstates, or configurations of the balls within the container. The most we can say is that the system transitions to //less ordered states//; //i.e.,// the exact configuration of the balls at a given time (say, the //n//th shake) cannot be determined. However, we can be sure that the system will continue to mix until the balls reach a sufficiently homogeneous composition. Moreover, once the system reaches this final //mixed// state it remains there, more or less; it does not transition back to the ordered initial state.