Entropy and the Relentless Drift from Order to Chaos

In a famous lecture in 1959, the scientist and author C P Snow spoke of a gulf of comprehension between science and the humanities, which had split into “two cultures”. Many people in each group lacked appreciation of the concerns of the other, causing grave misunderstandings and making the world’s problems more difficult to solve. Snow compared ignorance of the Second Law of Thermodynamics to ignorance of Shakespeare [TM209 or search for “thatsmaths” at irishtimes.com].

So, what is this mysterious second law? Put simply, it says that a physical system moves towards, or remains in, the most probable state. In fact, this is not really mysterious at all. Toss ten coins in the air. Typically, about half will come down heads and half tails. Were all ten to land heads up, suspicion would be aroused: the chance of this is about one in a thousand.

Multiplicity is the Key

The key factor is the “multiplicity”, the number of ways an event can happen. There is only one way to get ten heads: every single coin must be a head. But five heads can be any five of the ten coins, and there are 252 ways this can happen. In the jargon of statistical mechanics, we call each of these ways a microstate, and there are 252 microstates in the macrostate of five heads out of ten. Thus, five heads coming up is 252 times more likely than ten heads.
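The coin-toss counting above is just the binomial coefficient, and a few lines of Python make it concrete (a minimal sketch; the choice of ten coins follows the example in the text):

```python
from math import comb

n = 10  # ten coins tossed

# Multiplicity W of a macrostate = number of microstates realising it,
# i.e. the number of ways to choose which coins land heads.
for heads in (10, 5):
    w = comb(n, heads)
    print(f"{heads} heads: multiplicity {w}, probability {w / 2**n}")

# Ten heads has multiplicity 1; five heads has multiplicity 252,
# so five heads is 252 times more likely than ten.
```

Running it also confirms the “one in a thousand” figure for ten heads: 1/1024.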

The situation becomes simpler as the numbers become larger. With one hundred coins, the chance of all being heads is so minute that we can take it as zero. For a volume of gas, the numbers are breathtakingly large. A cubic metre box of air holds about 24 trillion trillion molecules, distributed uniformly throughout the box. The chance that they would all move to the left side of the box, leaving a vacuum on the right, is beyond remote: it never happens!
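A quick computation shows just how fast the all-heads probability collapses as the number of coins grows (a small illustrative sketch in Python):

```python
# Probability that every one of n tossed coins lands heads: 1 in 2**n
for n in (10, 100):
    print(n, 0.5 ** n)

# For n = 100 the probability is below 1e-30 -- effectively zero,
# which is why we never see it happen.
```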

We learn at school that the logarithm of the product of two numbers is the sum of the logarithms of the numbers: logarithms turn multiplication into addition. Logs are fundamental in mathematics and physics. In maths, the distribution of prime numbers follows a logarithmic law. Many physical phenomena are also governed by log laws, most notably, the second law of thermodynamics.

Ludwig Boltzmann and Entropy

The Austrian physicist Ludwig Boltzmann struggled to explain the behaviour of gases using the ideas of statistical mechanics. For any observed macrostate – a specified pressure and temperature – there is a truly enormous number of microstates; the multiplicity W is vast. For two systems, one with multiplicity W1 and the other with multiplicity W2, the multiplicity of the combined system is the product W1W2. Boltzmann knew that the total energy of two combined systems is the sum of the two component energies, so he wanted a quantity for macrostates with this additive property. In 1877 he took the logarithm of the multiplicity, defining the entropy as S = k log W (k is called Boltzmann’s constant).
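Boltzmann’s definition can be checked directly: because multiplicities multiply while logarithms add, the entropy of a combined system is the sum of the parts. A minimal sketch in Python (the multiplicities W1 and W2 are illustrative numbers, not physical values):

```python
import math

k = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def entropy(w):
    """Boltzmann entropy S = k log W for a macrostate of multiplicity W."""
    return k * math.log(w)

# Illustrative multiplicities for two subsystems
w1, w2 = 1e30, 1e40

# Combined multiplicity is the product; entropy is the sum
print(math.isclose(entropy(w1 * w2), entropy(w1) + entropy(w2)))  # True
```

This additivity is exactly why Boltzmann reached for the logarithm: it makes entropy behave like energy when systems are combined.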

Entropy is a measure of molecular disorder. It governs the direction of irreversible physical processes that, left to their own devices, tend from order to chaos. The awful truth is that, in an isolated system, entropy never gets less. This is what the second law of thermodynamics states. It is all too evident in everyday life: a broken egg cannot be put together again. We understand this without mathematics, but Boltzmann’s equation makes it sharp and inescapable.

Boltzmann knew that matter is made of myriad atoms. This was disputed by many authorities and he encountered severe criticism. It all became too much and in 1906, aged just 62, he hanged himself. His equation S = k log W is carved on his tombstone.


Eric Johnson, 2018: Anxiety and the Equation: Understanding Boltzmann’s Entropy. MIT Press, 192 pages. ISBN: 978-0-2620-3861-4.
