Energy and Entropy in Thermodynamics

The bonds described in the previous articles can occur between atoms in gases, liquids and solids, and are to a large extent responsible for their many and varied properties.

Although we hope construction materials do not change state whilst in service, we are very much concerned with such changes during their manufacture, e.g. in the cooling of metals from the molten to the solid state.

Some knowledge of these processes and the rules governing them is therefore useful in understanding the structure and properties of materials in their ‘ready-to-use’ state. As engineers, we conventionally express our findings in terms of force, deflection, stress, strain and so on, but these are simply conventions. Fundamentally, we are dealing with energy.

Any change, no matter how simple, involves an exchange of energy. The mere act of lifting a beam involves a change in the potential energy of the beam, a change in the strain energy held in the lifting cables and an input of mechanical energy from the lifting device, which is itself transforming electrical or other energy into kinetic energy.

The harnessing and control of energy are at the heart of all engineering. Thermodynamics teaches us about energy, and draws attention to the fact that every material possesses an internal energy associated with its structure. Here we discuss some of the thermodynamic principles that are important to understanding the behaviour of materials.

Stable and Metastable Equilibrium 

We should recognise that all systems are always seeking to minimise their energy, i.e. to become more stable. However, although thermodynamically correct, some changes toward a more stable condition proceed so slowly that the system appears to be stable even though it is not. For example, a small ball sitting in a hollow at the top of a hill will remain there until it is lifted out and rolled down the hill. The ball is in a metastable state and requires a small input of energy to start it on its way down the main slope.

Fig. 1 Illustration of activation and free energy

Figure 1 shows a ball sitting in a depression with a potential energy of P1. It will roll to a lower energy state P2, but only if it is first lifted to the top of the hump between the two hollows. Some energy has to be lent to the ball to do this, which the ball returns when it rolls down the hump to its new position. This borrowed energy is known as the activation energy for the process. Thereafter it possesses free energy as it rolls down to P2. However, it is losing potential energy all the time and eventually (say, at sea level) it will achieve a stable equilibrium.

However, note two things. At P1, P2, etc. the ball is apparently stable, but actually it is metastable, since more stable states are available to it, given the necessary activation energy. And where does that activation energy come from?

In materials science it comes mostly (but not exclusively) from heat. As materials are heated to higher temperatures, their atoms vibrate more energetically and can break out of a metastable state into one from which they can lose energy.
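The idea that heat supplies the activation energy can be put in rough numbers with the standard Boltzmann factor, exp(−q/kT), which gives the fraction of particles with enough thermal energy to clear a barrier. This is a minimal sketch; the barrier height is an assumed illustrative value, not taken from the text.

```python
import math

K_B = 1.38e-23   # Boltzmann's constant, J/K

def escape_fraction(q_activation, temperature):
    """Boltzmann factor exp(-q/kT): the fraction of particles with enough
    thermal energy to clear an activation barrier q (J) at temperature T (K)."""
    return math.exp(-q_activation / (K_B * temperature))

# An assumed barrier of about 1 eV becomes far easier to cross as
# temperature rises, which is why heating drives these changes.
q = 1.6e-19   # J (roughly 1 eV, an illustrative barrier height)
print(escape_fraction(q, 300.0))   # tiny fraction at room temperature
print(escape_fraction(q, 900.0))   # many orders of magnitude larger
```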


If whisky and water are placed in the same container, they mix spontaneously. The internal energy of the resulting solution is less than the sum of the two internal energies before they were mixed. There is no way that we can separate them except by distillation, i.e. by heating them up and collecting the vapours and separating these into alcohol and water. We must, in fact, put in energy to separate them.

But, since energy can be neither created nor destroyed, the fact that we must use energy, and quite a lot of it, to restore the status quo must surely pose the question ‘Where does the energy come from initially?’

The answer is by no means simple but, as we shall see, every particle, whether of water or whisky, possesses kinetic energy of motion and potential energy of interaction. When a system such as a liquid is left to itself, its internal energy remains constant, but when it interacts with another system it will either lose or gain energy.

The transfer may involve work or heat or both, and the first law of thermodynamics, the conservation of energy, requires that:

dE = dQ – dW …….(1)

where E = internal energy, Q = heat and W = work done by the system on the surroundings.

What this tells us is that if we raise a cupful of water from 20°C to 30°C it does not matter how we do it. We can heat it, stir it with paddles or even put in a whole army of gnomes each equipped with a hot water bottle, but the internal energy at 30°C will always be above that at 20°C by exactly the same amount. Note that the first law says nothing about the sequences of changes that are necessary to bring about a change in internal energy.
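The path-independence of the first law can be sketched numerically: whether the cupful of water is heated or stirred, dE = dQ − dW gives the same change in internal energy. The mass and specific heat below are assumed illustrative values.

```python
# Sketch of the first law, dE = dQ - dW, for heating a cup of water
# from 20 C to 30 C by two different routes.

C_WATER = 4186.0   # specific heat of water, J/(kg*K) (assumed value)
MASS = 0.25        # mass of a cupful of water, kg (assumed value)

def internal_energy_change(heat_in, work_by_system):
    """First law: dE = dQ - dW, with W the work done BY the system."""
    return heat_in - work_by_system

# Path 1: pure heating, no work done by the system.
dQ = C_WATER * MASS * (30.0 - 20.0)
path1 = internal_energy_change(dQ, 0.0)

# Path 2: no heat, the same energy delivered as stirring work
# (work done ON the system, so dW is negative).
path2 = internal_energy_change(0.0, -dQ)

print(path1, path2)   # both routes give exactly the same dE
```

However the energy is delivered, the internal energy at 30°C sits above that at 20°C by the same amount, which is the point of the gnome example above.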

Entropy in Thermodynamics

Classical thermodynamics, as normally taught to engineers, regards entropy, S, as a capacity property of a system which increases in proportion to the heat absorbed (dQ) at a given temperature (T). Hence the well-known relationship:

dS ≥ dQ/T …….(2)

(the equality holding for a reversible change), which is a perfectly good definition but gives no physical picture of what entropy means. To a materials scientist entropy has a real physical meaning: it is a measure of the state of disorder or chaos in the system.
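For a reversible change, dS = dQ/T can be integrated directly. Returning to the cupful of water (same assumed mass and specific heat as before), dQ = mc dT gives ΔS = mc ln(T2/T1); this is a minimal sketch of that integration.

```python
import math

# Entropy gained by reversibly heating water, integrating dS = dQ/T.
# With dQ = m*c*dT, dS = m*c*dT/T, so S2 - S1 = m*c*ln(T2/T1).

C_WATER = 4186.0   # J/(kg*K) (assumed value)
MASS = 0.25        # kg (assumed value)

def entropy_change(t1, t2):
    """Entropy change (J/K) for heating from t1 to t2, in kelvin,
    at constant specific heat."""
    return MASS * C_WATER * math.log(t2 / t1)

dS = entropy_change(293.15, 303.15)   # 20 C -> 30 C
print(dS)   # positive: heating increases the disorder of the water
```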

Whisky and water mix spontaneously simply because, statistically, there are very many ways in which the molecules can be mixed up and only one in which the whisky can sit on top of (or, depending on how you pour it, beneath) the water.

Boltzmann showed that the entropy of a system could be represented by:

S = k ln N  ……(3)

where N is the number of ways in which the particles can be distributed and k is a constant (Boltzmann’s constant k = 1.38 × 10−23 J/K).

The logarithmic relationship is important; if the molecules of water can adopt N1 configurations and those of whisky N2 the number of possible configurations open to the mixture is not N1 + N2 but N1 × N2.
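This is why the logarithm matters: the configurations of two subsystems multiply, so Boltzmann's relation makes their entropies add. A minimal numerical check, with arbitrary illustrative values of N1 and N2:

```python
import math

K_B = 1.38e-23   # Boltzmann's constant, J/K

def entropy(n_configs):
    """Boltzmann's relation, S = k ln N."""
    return K_B * math.log(n_configs)

# Two subsystems with N1 and N2 configurations: the combined system
# has N1 * N2 configurations, and ln(N1 * N2) = ln N1 + ln N2,
# so the entropies add.
n1, n2 = 1e6, 1e9   # illustrative configuration counts
print(entropy(n1 * n2))
print(entropy(n1) + entropy(n2))   # the same value
```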

It follows from this that the entropy of any closed system not in equilibrium will tend to a maximum since this represents the most probable array of configurations.

This is the second law of thermodynamics, for which you should be very grateful. As you read these words, you are keeping alive by breathing a randomly distributed mixture of oxygen and nitrogen.

Now it is statistically possible that at some instant all the oxygen molecules will collect in one corner of the room while you try to exist on pure nitrogen, but only statistically possible. There are so many other possible distributions involving a more random arrangement of the two gases that it is most likely that you will continue to breathe the normal random mixture.
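A crude estimate shows just how remote that possibility is. If each molecule independently has a 1/2 chance of being in one half of the room, the probability that all of them are there at once is (1/2)^N, which is already negligible for a hundred molecules; a real room holds on the order of 10^25.

```python
# Probability that all N molecules happen to be in the same half of the
# room at once, treating each molecule as an independent coin toss.
# This ignores interactions, which is fine for a rough estimate.

def prob_all_one_side(n_molecules):
    return 0.5 ** n_molecules

print(prob_all_one_side(10))    # rare but imaginable
print(prob_all_one_side(100))   # already vanishingly small
```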

Free Energy

It must be clear that the fundamental tendency for entropy to increase, that is, for systems to become more randomised, must stop somewhere and somehow, i.e. the system must reach equilibrium. If not, the entire universe would break down into chaos.

As we have seen in the previous article, the reason for the existence of liquids and solids is that their atoms and molecules are not totally indifferent to each other and, under certain conditions and with certain limitations, will associate or bond with each other in a non-random way. As we stated above, from the first law of thermodynamics the change in internal energy is given by:

dE = dQ – dW

From the second law of thermodynamics the entropy change in a reversible process is:

TdS = dQ ……(4)

Hence: dE = TdS – dW  …….(5)

In discussing a system subject to change, it is convenient to use the concept of free energy. For irreversible changes, the change in free energy is always negative and is a measure of the driving force leading to equilibrium.

Since a spontaneous change must lead to a more probable state (or else it would not happen) it follows that, at equilibrium, energy is minimised while entropy is maximised.

The Helmholtz free energy is defined as:

F = E – TS  ……(6)

and the Gibbs free energy as:

G = E + pV – TS  ……(7)

and, at equilibrium, both must be a minimum.
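Equations (6) and (7) can be evaluated directly once the state variables are known. This sketch uses arbitrary illustrative values for E, T, S, p and V; the point is only that the Gibbs free energy adds the pV term to the Helmholtz form.

```python
# The two free energies from equations (6) and (7), with illustrative
# (assumed) values of the state variables.

def helmholtz(E, T, S):
    """Helmholtz free energy: E - TS."""
    return E - T * S

def gibbs(E, p, V, T, S):
    """Gibbs free energy: E + pV - TS."""
    return E + p * V - T * S

E, T, S = 5000.0, 300.0, 10.0   # J, K, J/K (illustrative)
p, V = 101325.0, 0.001          # Pa, m^3 (about 1 atm, 1 litre)

print(helmholtz(E, T, S))       # 2000.0 J
print(gibbs(E, p, V, T, S))     # slightly higher, by the pV term
```

At equilibrium a spontaneous change can lower these no further, which is the sense in which both must be a minimum.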
