10. Free Time, Free Money, Free Energy?
Learning Objectives
- Define the entropy
- Explain the second law of thermodynamics as the sole requirement for the spontaneity of a reaction
- Define the internal energy
- Explain how the absolute value of internal energy is of little significance
- Identify the thermodynamic function or quantity represented by each of the following variables: P, q, R, S, T, U, V, w
- Classify each of the preceding thermodynamic quantities as either a state function or a path function
- Demonstrate the significance of a thermodynamic quantity being a state variable
- Calculate the standard enthalpy change for a reaction, given the standard formation enthalpies of reactants and products
- Classify a process as spontaneous or not in terms of the total entropy change of the universe
- Show how the Gibbs energy is a restatement of the second law of thermodynamics
- Explain phase stability in terms of the Gibbs energy
- Calculate the standard Gibbs energy change for a reaction, given the standard formation Gibbs energies of the reactants and products
- Define the conditions at standard state
- Justify the need for a standard state
- Calculate the enthalpy of a phase transformation under standard conditions, given the formation enthalpies of both phases involved in the phase transformation
- Calculate the heat consumed or released during a phase transformation, given the enthalpy for the transformation and the quantity of material
- Apply the specific heat and the molar heat capacity to relate temperature changes to heat supplied to a quantity of a material
Guess what? There are some useful thermodynamic data at the end of this chapter.
The Finger of Time
In the previous chapter we discussed the formation of solid, crystalline sodium chloride, which we know is stable at room temperature and pressure. That is, we're not surprised to see solid pieces of salt on our fries. We would, however, be surprised if we saw the solid salt on our fries start to melt (note that I don't mean dissolve, I mean melt). Of course, we can have molten salts, but that's not what I'm suggesting here. We use molten salts in the lab for quickly heating or cooling metallic samples to temperatures above the boiling point of water. Not something you'd want on your fries.
So, why is it important that we are not surprised to see solid salt on our fries? Why, in fact, is it important that we are not surprised to see someone fumble their phone and have it fall to the ground, but would be surprised to see the reverse process of a phone flying up from the ground into someone's hand, on its own? The answer is the business of thermodynamics, specifically, the second law of thermodynamics.
You see, there are two main laws that govern thermodynamics, the first and second laws. The first law is concerned only with energy changes, and in our previous examples, the first law would say nothing against some random vibrations within a phone and the surrounding environment coming together to shoot the phone up from the ground into someone's hand. Say the phone dropped and most of the energy of the impact was dissipated by the screen smashing. Well, the first law would not prevent the screen from coming back together and putting all that surface energy from the fracture into kinetic energy to propel the phone into the air. Of course, you know that would not happen, and my goal in this section is to show you that the second law of thermodynamics is the law that prevents these things (things that look like a video run in reverse) from happening. You are already instinctively aware of the direction of time, the finger of time, as I like to call it.
The Second Law of Thermodynamics
Alright, enough talking, let's just get the second law of thermodynamics out there. Here is one statement of the second law:
The entropy of the Universe increases during any spontaneous process.
This statement introduces at least one term that I need to clarify: spontaneous. As we've done before with words like plastic or degenerate, we are once again hijacking a word from common usage and giving it a very special meaning in the context of chemistry. Spontaneous, here, means that the process proceeds on its own, without the need for an input of energy. For example, if I took a flame to a piece of paper in air, it would burn and nobody would be surprised. Why is this usage of spontaneous different from the general public usage? Well, because there is often an impression that something happened quickly, and perhaps unexpectedly, when we say, in common usage, that it was spontaneous. For example, "hey, I know we're supposed to be walking to lecture now, but let's be spontaneous and go to Tim Horton's instead." There is the implication that the decision was made quickly and it was unexpected. However, a spontaneous process (in our careful usage, which we'll use from now on) need not occur quickly and may occur for a very good reason, without surprise. For example, if you were to put sweet water, say some orange juice, into the freezer at -20ºC, it would spontaneously proceed to form solid water ice and solid sugar; however, this would take years. Similarly, when I lit a piece of paper on fire it was not surprising or unexpected that it burned.
You likely still have a burning question: "so, what?" Why is the Second Law of Thermodynamics important? Well, this is a good question, and for its immense importance, the Second Law is surprisingly difficult to internalize. It likely isn't immediately obvious that the Second Law is the reason that brick walls don't build themselves, that coffee doesn't heat itself, that fridges and air conditioners only work when we plug them in, and that if you were to heat the outside of an internal combustion engine to the same temperature as the combustion chamber, the engine would stop.
The Entropy
So, I've been dancing around the Second Law for a while now, telling you how important it is, but never actually telling you what it means or how to use it. Guilty as charged. I have done these things. To make it up to you, let me start by explaining the entropy. As with anything, it is necessary to start with a basic understanding before moving on to a more detailed understanding. The basic understanding of entropy is of disorder. A system at high entropy is highly disordered. You can't predict where anything is. There is very little in the way of structure. A gas has higher entropy than a liquid, which in turn has higher entropy than a solid. This is a decent intuitive understanding of entropy, but it will only take us so far. A much less intuitive, but much more useful definition of entropy is this:
$$\Delta S = \frac{q_{\mathrm{rev}}}{T} \tag{1}$$

where $S$ is the entropy, $q_{\mathrm{rev}}$ is the heat transferred reversibly, and $T$ is the thermodynamic temperature, that is, the temperature in Kelvin. Each time I introduce one new concept it opens up two new ones that need to be addressed. Such is the case here, and I need to discuss why the letter S is used for entropy and also what the little "rev" subscript on the heat means.
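To see Equation 1 in action, here's a quick numerical sketch (Python is my illustration choice, not part of the chapter): melting ice at its melting point is about as close to reversible heat transfer as everyday processes get, and the enthalpy of fusion from the data tables at the end of this chapter supplies $q_{\mathrm{rev}}$.

```python
# Entropy change for melting one mole of ice reversibly at its
# melting point, using Equation 1: dS = q_rev / T.
# Values taken from the data tables at the end of the chapter.

q_rev = 6.01e3   # J/mol, enthalpy of fusion of water (heat absorbed reversibly)
T = 273.15       # K, melting point of ice (0 degrees C)

delta_S = q_rev / T   # J/(mol K)
print(f"dS = {delta_S:.1f} J/(mol K)")  # prints "dS = 22.0 J/(mol K)"
```

The positive sign matches the intuitive picture: going from ordered solid to disordered liquid raises the entropy of the system.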
The Thermodynamic Alphabet
So, as far as I can tell, the letter S is used for entropy because it was available in the alphabet in the vicinity of other letters commonly used in thermodynamics. Figure 1 shows what I like to call the thermodynamic alphabet. You can see how nicely the letter S fits in.
Reversibility
The other thing that needs to be addressed is the subscript "rev." This means that the heat is transferred reversibly. A reversible process is only a concept. It is not achievable in any real process; however, it sets a theoretical limit that could only be achieved in a process run in such tiny (infinitesimally small) steps that the system was at all times in equilibrium. This is actually a hugely important concept in thermodynamics, but I think, for our purposes, it is enough to acknowledge it and not dwell on it.
The System, the Surroundings, and the Universe
Much like in classical mechanics we love to draw free-body diagrams because they help us to distill the relevant forces without being distracted by everything else, in thermodynamics it is extremely useful (read: necessary) to define a system of interest that we will study and understand.
Everything else we conveniently call the surroundings. Together, the system and the surroundings give us the universe. Pretty straightforward, eh? Figure 2 shows this.
The Second Law, One More Time.
Why one more time? Because we now have a somewhat more developed understanding of entropy and we can write the second law in a somewhat more succinct and mathematical way. Here it is:

$$\Delta S_{\mathrm{univ}} = \Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{surr}} > 0 \tag{2}$$
That's it. The only requirement for a process to be spontaneous is that the total entropy change for the universe must be positive.
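Here's a sketch of how Equation 2 settles spontaneity, using the enthalpy of fusion of water from the chapter's data tables. The shortcut of evaluating the system's entropy of fusion at the melting point is my simplification, to keep the arithmetic easy:

```python
# Equation 2 in action: is melting ice spontaneous above and below 0 C?
# Simplification (mine, not the chapter's): take dS_sys as the entropy
# of fusion at the melting point, and dS_surr = -dH_fus / T_surr.

dH_fus = 6.01e3            # J/mol, from the phase-change table
T_melt = 273.15            # K
dS_sys = dH_fus / T_melt   # ~22.0 J/(mol K), entropy gained by the system

for T_surr in (283.15, 263.15):        # surroundings at +10 C and -10 C
    dS_surr = -dH_fus / T_surr         # heat to melt the ice leaves the surroundings
    dS_univ = dS_sys + dS_surr
    verdict = "spontaneous" if dS_univ > 0 else "not spontaneous"
    print(f"T = {T_surr} K: dS_univ = {dS_univ:+.2f} J/(mol K) -> {verdict}")
```

Above 0ºC the universe's entropy rises and the ice melts on its own; below 0ºC it would fall, so melting is not spontaneous there, exactly as your freezer confirms.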
The First Law of Thermodynamics
Grrr. You're so angry because I introduced the second law first and the first law second. Sorry about that, but I do hope that it helped you to appreciate the significance and the excitement of the second law. Anyway, now, let's dive into the first law. Much like the second law gave us a fundamental property, the entropy, the first law gives us the internal energy. The internal energy can be a surprisingly challenging concept to grasp initially. You are a sharp-minded and curious individual, which is undoubtedly why you entered engineering. You hear internal energy and you immediately start to visualize a container of fuel, or a battery, or a quantity of nuclear fuel, and you think of how the available energy embodied within that material could be calculated. However, this can be a futile pursuit because we never actually calculate the absolute value of internal energy. You see, there are countless potential forms of energy contributing to the internal energy, a few illustrated in Figure 3, and we only concern ourselves with changes in the internal energy, rather than an absolute value.
An analogy that I personally find useful is that of a bottle sitting on the edge of a table, as shown in Figure 4.
If I were to ask you to calculate the potential energy of this bottle, you would almost instinctively calculate the mechanical potential energy over the obvious reference surface. However, if I zoomed out a bit and showed you that the table was in an airplane, you would immediately realize that the mechanical potential energy could be significantly different. You see, by this point in your life you are extremely familiar with the idea that the relevant reference surface is the closest that an object may fall down to. You hardly need to think about it. What's more, you know that if you are concerned with mechanical potential energy, you need not concern yourself with kinetic energy, or the energy you could obtain by burning the bottle. Similarly, in thermodynamics we'll need to define a logical reference point to measure changes in energy against, and we'll also limit ourselves, at least for now, to only work done by a change in volume, specifically by a gas pushing against a pressure while the volume changes.
A Convenient and Logical Reference Point - The Standard State
So, what on earth are we to use as a "surface" to measure our energy changes against? The floor under the table was obvious, but what can we use that will help us to calculate the energy required to burn a litre of gasoline? Well, it turns out there is a very convenient and useful reference point. We call it the standard state. To determine the standard state we say to ourselves, "selves, what is a logical starting point that we could begin from in order to build all of the compounds that we might want to study?" Let's think about gasoline again. Gasoline is mostly octane, so we could think about the energy change that would take place if we wanted to build a molecule of octane from two molecules of butane (and we'd have to account for the hydrogen lost from the end of each butane molecule). We could also consider starting with ethane molecules, or methane, but none of these really seem like obvious choices. And we need an obvious choice for something that we want to call standard. Well, the chemists, thermodynamicists, and engineers who went before us, and on whose shoulders we stand, have given us a very convenient standard state. They said, "look, if you want to build octane, you'll need carbon and hydrogen, so let's just start with carbon and hydrogen." And to make things even more clear (which, of course, is essential for something we want to call a standard) they said, "if I were to have some carbon and some hydrogen with me here, in this room, what state would I find them in? Let's call that the standard state." So, in this example, carbon would be present as solid graphite and hydrogen would be present as diatomic hydrogen gas. And so, we have a nice tidy definition of the standard state:
The most stable form of a pure element at 25ºC and 10⁵ Pa.
State Functions and Path Functions
Let's stick with the combustion of gasoline:
$$2\,\mathrm{C_8H_{18}(l)} + 25\,\mathrm{O_2(g)} \longrightarrow 16\,\mathrm{CO_2(g)} + 18\,\mathrm{H_2O(g)} \tag{3}$$
To work out the change in internal energy for this reaction we'd like to be able to calculate the energy change required to make the products and subtract the energy required to make the reactants. In fact, this is what we do, but it is important to understand why we can do this. Another analogy will be useful. Let's consider the potential energy of a water bottle again:
$$E_p = mgh \tag{4}$$
You can likely see that if we took the potential energy change of the bottle from the floor, up to the table, back down to the floor, and then all the way up to the high shelf, it would give us the same potential energy change as from the table directly to the high shelf. We can do this because it doesn't matter if we bring the bottle down and then back up again. All that matters is the final height. This is because the potential energy is a state function. This means that it does not depend on how we got to the final state; all that matters is what that state is. Another state function is the temperature. If I ask you the temperature, you don't need to look up charts of the historical temperature over the past one hundred years and calculate it. All you do is look at the thermometer and read off the temperature. It doesn't matter if it was really hot yesterday or really cold if all I care about is the current temperature. But not everything is a state function. Some quantities are path functions. Work is a path function. What if I asked you how difficult it was to move the water bottle from the table up to the high shelf? Well, that would depend on how you got it there. If you had a ladder it might have been easy, but if you didn't have a ladder and you needed to do an epic free solo climb up the shelves of the bookcase to place the bottle on the high shelf, you might have really broken a sweat. You see, the work done does depend on how you got there - it depends on the path. We can only add up the values for the products and subtract the values for the reactants for quantities that are state functions. Thankfully, the internal energy is a state function.
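Because a state function depends only on the endpoints, a reaction's change is simply products minus reactants. Here's a small sketch of that bookkeeping (the choice of methane combustion is mine), using standard formation enthalpies from the data table at the end of the chapter:

```python
# Products-minus-reactants bookkeeping for a state function, applied to
#   CH4(g) + 2 O2(g) -> CO2(g) + 2 H2O(g)
# Standard formation enthalpies from the chapter's data table (kJ/mol).

dfH = {
    "CH4(g)": -74.81,
    "O2(g)":   0.0,     # element in its standard state
    "CO2(g)": -393.5,
    "H2O(g)": -241.8,
}

products  = {"CO2(g)": 1, "H2O(g)": 2}   # species: stoichiometric coefficient
reactants = {"CH4(g)": 1, "O2(g)": 2}

dH_rxn = (sum(n * dfH[s] for s, n in products.items())
          - sum(n * dfH[s] for s, n in reactants.items()))
print(f"dH_rxn = {dH_rxn:.2f} kJ/mol")  # prints "dH_rxn = -802.29 kJ/mol"
```

The large negative value is why we burn methane for heat; no knowledge of the reaction's path was needed, only the start and end states.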
Now that we have a good handle on the internal energy we can get the first law of thermodynamics out in the open. The only thing we need to establish first is the nature of the boundaries around our system. You see, this makes an important difference in how we state the first law.
Closed Versus Isolated Systems
Getting straight to it, we can study systems in which the boundaries are open and matter passes through them, but these systems are not what we need to consider now. We need to consider systems where matter is not allowed to pass the boundaries. There are two types of these systems: closed and isolated. In an isolated system neither heat nor work is exchanged with the surroundings, whereas in a closed system energy may pass the boundaries as heat or work. Closed systems are probably the most interesting at this point.
The first law of thermodynamics is sometimes simplified to something along the lines of, "energy is never created or destroyed." This is an intuitive description, but not especially practical. A more useful definition comes from a calculation of changes to the internal energy of a system. In this way, we can state the first law of thermodynamics two ways:
$$\Delta U = q + w \qquad \text{(closed system)} \tag{5}$$

$$\Delta U = 0 \qquad \text{(isolated system)} \tag{6}$$

where $q$ is the heat transferred into the system and $w$ is the work done on the system. The sign convention is important here. Remember, heat in and work on are positive.
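A tiny bookkeeping sketch of Equation 5 with this sign convention; the numbers are made up purely to illustrate the signs:

```python
# First-law bookkeeping with the sign convention from the text:
# heat INTO the system and work done ON the system are both positive.
# The numbers here are hypothetical, chosen only to show the signs.

q = +500.0   # J, heat flows into the system
w = -200.0   # J, the system does 200 J of work ON the surroundings

dU = q + w   # Equation 5
print(f"dU = {dU:+.0f} J")  # prints "dU = +300 J": internal energy rises
```

Had the surroundings done 200 J of work on the system instead, w would be +200 J and the internal energy would rise by 700 J.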
Let's Do Some Bean Counting - The Enthalpy
Hmmm, perhaps I should have given this section a different title. Bean counting is so often used as a pejorative term. But, as you'll come to appreciate, when it comes to thermodynamics there is no higher compliment than to be thought to place incredible emphasis on tiny amounts. Each bean must be counted. Enter the enthalpy. The enthalpy is a handy quantity that lets us account for a particular amount of work that our processes so often need to do. You see, we often perform our processes exposed to the atmosphere and so reactions will need to do work to push back the atmosphere when a gas expands, or new gas is formed. The enthalpy is defined as
$$H = U + PV \tag{7}$$
and for processes occurring at constant pressure (as so many of our processes do, being open to the atmosphere) we can write the change in enthalpy as
$$\Delta H = \Delta U + P\Delta V = q_p \tag{8}$$
You see, the enthalpy change is just the heat released (or absorbed) by a reaction, once we account for the work done in pushing back the atmosphere.
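To put a number on that "pushing back the atmosphere" work: for an ideal gas at constant pressure, $P\Delta V = \Delta n_{\mathrm{gas}}RT$. A sketch (the one-mole-of-gas-produced scenario is my assumption, just to get a feel for the size of the tax):

```python
# The "atmosphere tax": at constant pressure the system does P dV of
# work pushing back the atmosphere, so dH differs from dU by P dV.
# Ideal-gas estimate for a hypothetical reaction producing 1 mol of gas.

R = 8.314          # J/(mol K), gas constant
T = 298.15         # K, room temperature
dn_gas = 1.0       # mol of gas created (assumed for illustration)

P_dV = dn_gas * R * T   # P dV = dn R T for an ideal gas at constant P
print(f"P dV = {P_dV/1000:.2f} kJ")  # prints "P dV = 2.48 kJ"
```

A couple of kilojoules per mole of gas formed: small next to a typical combustion enthalpy of hundreds of kilojoules, but every bean must be counted.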
How Solid State People Talk
If you are speaking with a solid state person you may notice that she uses the term enthalpy or enthalpy change interchangeably with the term energy or energy change. You, having been schooled now in the importance of accounting for the PV work done by the system may be irritated by this. But, for solid state people, this is pretty safe because the volume change in solid state processes is minuscule and so the difference between the enthalpy and the internal energy is almost nothing. I actually think this loose usage leads to some confusion amongst people learning these topics, so I hope by mentioning it you avoid this potential confusion.
The Gibbs Energy
Alright, this is getting really good now. We know that the sole requirement for spontaneity is an increase in the entropy of the universe. We saw how we could define the enthalpy, a useful expression for heat, that accounts for the "tax" paid to the atmosphere. Now we are ready to look at another extremely useful property that allows us to quickly determine spontaneity for a system without the need to calculate entropy changes for both the system and the surroundings. What witchcraft is this, you ask? Well, if we limit ourselves to specific processes we can do this. (Remember that we defined the enthalpy change for processes at constant pressure.) First, we'll define a new quantity called the Gibbs energy and then we'll figure out why it is so useful. The Gibbs energy is defined as
$$G = H - TS \tag{9}$$
and if we limit ourselves to processes occurring at constant temperature we can write the change in the Gibbs energy as
$$\Delta G = \Delta H - T\Delta S \tag{10}$$
Notice that the Gibbs energy is defined completely in terms of the system, without the need to calculate anything for the surroundings. To better understand why this is so useful, let's return to the second law, written for a spontaneous process:
$$\Delta S_{\mathrm{univ}} = \Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{surr}} > 0 \tag{11}$$
At constant temperature, the entropy change of the surroundings is the heat entering the surroundings divided by the temperature
$$\Delta S_{\mathrm{surr}} = \frac{q_{\mathrm{surr}}}{T} \tag{12}$$
Any heat that leaves the system is absorbed by the surroundings, so the two heats are equal in magnitude and opposite in sign

$$q_{\mathrm{surr}} = -q_{\mathrm{sys}} \tag{13}$$
So, substituting Equation 13 into Equation 12 we have
$$\Delta S_{\mathrm{surr}} = -\frac{q_{\mathrm{sys}}}{T} \tag{14}$$
which we can now substitute back into Equation 11 to give
$$\Delta S_{\mathrm{sys}} - \frac{q_{\mathrm{sys}}}{T} > 0 \tag{15}$$
We can multiply this by T to get rid of that pesky fraction
$$T\Delta S_{\mathrm{sys}} - q_{\mathrm{sys}} > 0 \tag{16}$$
And remember that we defined the enthalpy as the heat transferred at constant pressure, so we can replace the system heat term with the enthalpy change of the system
$$T\Delta S_{\mathrm{sys}} - \Delta H_{\mathrm{sys}} > 0 \tag{17}$$
Looking at the units of this expression, you'll see that it must have energy units since the enthalpy has energy units. For this reason, we could refer to the expression at this point as an "energy." But since the left hand side of Equation 17 must increase for a spontaneous process, it feels strange to call it an energy. We're not comfortable with energy increasing in spontaneous processes. Coffee gets cold, and bottles fall to the ground, not the opposite. So, a simple fix is to multiply both sides of Equation 17 by negative one (remembering that this flips the inequality) to give
$$\Delta H_{\mathrm{sys}} - T\Delta S_{\mathrm{sys}} < 0 \tag{18}$$

which you'll recognize as Equation 10: for a spontaneous process at constant temperature and pressure, $\Delta G = \Delta H_{\mathrm{sys}} - T\Delta S_{\mathrm{sys}} < 0$. This means that when the Gibbs energy decreases, it is just another way of saying that the entropy of the universe increased, specifically $\Delta G = -T\Delta S_{\mathrm{univ}}$.
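We can now let the Gibbs energy do its job. A sketch using the chapter's data table to ask whether graphite spontaneously becomes diamond at room temperature and pressure:

```python
# Equation 10 applied to the chapter's data table: is turning graphite
# into diamond spontaneous at 298.15 K?

T = 298.15                 # K
dH = (1.9 - 0.0) * 1e3     # J/mol, dfH(diamond) - dfH(graphite)
dS = 2.4 - 5.74            # J/(mol K), S(diamond) - S(graphite)

dG = dH - T * dS           # Equation 10, everything in terms of the system
print(f"dG = {dG/1000:+.2f} kJ/mol")  # prints "dG = +2.90 kJ/mol"
```

Since ΔG is positive, graphite is the stable phase: your pencils will not spontaneously become jewellery, however patient you are.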
Feel the Heat (of Phase Transformations)
For this section, I'd like to consider what happens when we carefully heat water. I like to use water, because you already have a pretty intuitive sense for its behaviour. Figure 5 shows a plot of temperature versus heat supplied to a quantity of water.
Starting at the lower left point, which is at a temperature below the freezing point for water, we see that as we carefully supply heat to the ice, the temperature of the ice increases. That is, until the temperature of the ice suddenly stops increasing, even though we continue to supply energy as heat. Why does the temperature not change even though we are supplying energy as heat? Because all of the energy is going into melting the ice. The energy supplied as heat is being consumed by the phase transformation. In fact, the length of this lower plateau is the amount of energy that must be supplied to melt all of the ice. Since this is a measure of heat transferred with our system, it is an enthalpy, and since it is melting we can call it the enthalpy of melting, or more commonly, the enthalpy of fusion. Once all of the ice is melted we continue to heat the liquid water. You'll notice that the slope of the line for heating water is different from the slope for heating ice. Notice that more heat is needed to bring about a temperature change in liquid water than in solid water. Then, at 100ºC we hit another plateau. Of course this is for boiling water, and we would call its full length the enthalpy of vaporization. Finally, we heat the steam, or gas phase of water.
I'd like to take a look at what the slope tells us. It gives us the change in temperature for a particular amount of heat supplied, or
$$\text{slope} = \frac{\Delta T}{q} \tag{19}$$
Which we can rearrange for the heat term
$$q = \frac{1}{\text{slope}}\,\Delta T \tag{20}$$
Now this is a handy expression because it allows us to calculate the amount of heat needed to cause a particular temperature change. In fact, it is so useful that we give the quantity $1/\text{slope}$ a special name. We call it the molar heat capacity on a per-mole basis, or the specific heat on a per-mass basis.
So, we have these expressions for the heat required to cause a temperature change
$$q = nC\Delta T \tag{21}$$

$$q = mc\Delta T \tag{22}$$

where $n$ is the number of moles, $m$ is the mass, $C$ is the molar heat capacity, and $c$ is the specific heat.
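Equations 21 and 22, together with the phase-change enthalpies, let us walk the entire heating curve of Figure 5. A sketch for one mole of water taken from ice at -20ºC to steam at 120ºC, with values from the data tables (the 120ºC endpoint is my choice for illustration):

```python
# Walking the full heating curve of Figure 5: one mole of ice at -20 C
# all the way to steam at 120 C. Specific heats and transition
# enthalpies from the chapter's data tables.

m = 18.02                        # g, mass of one mole of water

q_warm_ice   = m * 2.09 * 20     # J, Equation 22: ice from -20 C to 0 C
q_melt       = 6.01e3            # J, lower plateau: enthalpy of fusion
q_warm_water = m * 4.184 * 100   # J, liquid from 0 C to 100 C
q_boil       = 40.7e3            # J, upper plateau: enthalpy of vaporization
q_warm_steam = m * 2.03 * 20     # J, steam from 100 C to 120 C

q_total = q_warm_ice + q_melt + q_warm_water + q_boil + q_warm_steam
print(f"q_total = {q_total/1000:.1f} kJ")  # prints "q_total = 55.7 kJ"
```

Notice that boiling alone costs roughly 40.7 kJ, dwarfing every other step; this is why the vaporization plateau is by far the longest stretch in Figure 5.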
Standard Formation Enthalpy, Standard Entropy, and Standard Formation Gibbs Energy at 298.15 K
Species | ΔfH° [kJ/mol] | S° [J/mol·K] | ΔfG° [kJ/mol] |
---|---|---|---|
C (s, graphite) | 0 | 5.74 | |
C (s, diamond) | 1.9 | 2.4 | |
CH4 (g) | -74.81 | 186.2 | -50.75 |
C2H2 (g) | -83.9 | 200.93 | |
C3H8 (g) | -103.8 | 269.9 | -23.49 |
CaC2 (s) | -59.8 | 70.3 | |
CaF2 (s) | -1225 | 68.87 | -1162 |
CaF2 (l) | -1186 | 92.6 | |
Ca(OH)2 (s) | -987.0 | 83.0 | |
CO2 (g) | -393.5 | 213.6 | -394.4 |
Cu2O (s) | -168.6 | 93.1 | |
Cu2O (l) | -154.79 | 129.96 | |
Cu (s) | 0 | 33.2 | |
Fe (s) | 0 | 27.3 | |
Fe2O3 (s) | -824.2 | 87.4 | |
H2 (g) | 0 | 130.68 | |
H2O (g) | -241.8 | 188.7 | -228.6 |
H2O (l) | -285.8 | 69 | |
O2 (g) | 0 | 205.0 | |
Miscellaneous Enthalpies
Substance | Reaction | ΔH [kJ/mol] |
---|---|---|
F-F | Bond Dissociation | 157 |
F | Electron Affinity | -328 |
Ca | Second Ionization Energy | 1734 |
Specific Heats and Heat Capacities
Substance | Specific Heat c [J/g·K] | Molar Heat Capacity Cp [J/mol·K] |
---|---|---|
CO2 | 0.843 | 37.1 |
Air (g) | 1.0 | |
H2O (g) | 2.03 | 36.4 |
H2O (l) | 4.184 | 75.3 |
H2O (s) | 2.09 | 37.7 |
Temperatures and Enthalpies of Phase Changes
Substance | Melting Point [ºC] | ΔfusH° [kJ/mol] | Boiling Point [ºC] | ΔvapH° [kJ/mol] |
---|---|---|---|---|
Al | 658 | 10.6 | 2467 | 284 |
Ca | 851 | 9.33 | 1487 | 162 |
CH4 | -182 | 0.92 | -164 | 8.18 |
H2O | 0 | 6.01 | 100 | 40.7 |
Fe | 1530 | 14.9 | 2735 | 354 |
All images and videos are created by author.