What is entropy, and why is it always increasing?

Life as we know it hinges on maintaining order. Our bodies die if not kept fueled and under the right conditions. Appliances break down when you scramble their wires. Our parents get disappointed when we don't make the bed.


But regardless of how hard we work at keeping our rooms clean and tidy, the Universe seems to be against us. One quantity, entropy, describes disorder. And, according to physics, we can't win against it. No matter what we do, the second law of thermodynamics says that the entropy of the universe will either stay constant or increase.

“Technically, physicists define a number called the entropy to measure how scrambled-up the universe is at a given moment of time,” wrote George Musser for Scientific American.

Thus, entropy is perhaps the only truly unstoppable force in the universe, even though it isn't, strictly speaking, a force. It has been at work since the Big Bang, and it won't stop until the heat death of the Universe.

But far from being a malign influence, cynically plotting our demise from the shadows, entropy is simply a product of statistics. It could very well be the thing that gives meaning and direction to the concept of time.

If nothing else, it is a great reminder that in the large picture, what we call order could in fact be chaos, that our planet, our bodies, and our works are the exception, a statistical fluke against a law-abiding, empty Universe.

Sounds cool? Well it does to me, and I’m the guy with the keyboard, so today we’re going to talk about entropy.

The mess of the messy room

The most common way entropy is explained is as disorder or randomness. A clean room, for example, has less entropy than that same room after it hasn’t been tidied for two weeks. It will grow more cluttered over time, but sadly never clean itself by chance.


Both this example and the equation of entropy with disorder have some flaws, as we'll see later on, but they're descriptive enough to make a good starting point. To get more specific about this concept, we'll have to look at physics and probability.

Probability

Each system has a macrostate (its shape, size, temperature, etc.) and many possible microstates. A microstate is one particular arrangement of all the molecules within that system. Each arrangement (each microstate) has a chance of 'happening'. Entropy quantifies how many microstates are compatible with the system's current macrostate: the more arrangements that produce the same big-picture state, the higher its entropy. (Ludwig Boltzmann captured this in his famous formula, S = k log W, where W is that number of microstates.)

A coin is a very good analogy. Its macrostate is its shape, size, color, temperature. Flip it two times, however, and you get four possible microstates: heads-heads, heads-tails, tails-heads, and tails-tails. All four are equally likely, but if you only count how many heads came up, the mixed outcome (one head, one tail) covers two microstates, so it has a 1 in 2 chance of happening, while two heads or two tails each have only a 1 in 4. Because of that, the one-of-each outcome is the macrostate with the highest entropy.
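The counting above is easy to check in a few lines of Python (a toy sketch, not physics code): enumerate every sequence of two flips, then group those microstates into macrostates by how many heads they contain.

```python
from itertools import product
from collections import Counter

# Enumerate every microstate of two coin flips: HH, HT, TH, TT.
microstates = ["".join(flips) for flips in product("HT", repeat=2)]

# Group microstates into macrostates by how many heads they contain.
macrostates = Counter(state.count("H") for state in microstates)

for heads in sorted(macrostates):
    count = macrostates[heads]
    print(f"{heads} head(s): {count} of {len(microstates)} microstates, "
          f"probability {count}/{len(microstates)}")
```

Running it prints one microstate each for zero and two heads, but two microstates for the mixed outcome, matching the 1-in-4 and 1-in-2 odds above.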

This statistical understanding of the term is rooted in the physical definition of entropy, and I’m simplifying things a lot, but I feel it’s the best rough idea of how it works.

Castles grow moss and crumble; heels snap off of shoes. Ordered systems break down over time because there are only a handful of microstates in which they stay the same, and countless ones in which they change. Change is simply immensely more likely to happen.
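To see how lopsided those odds get, swap the single coin for many. Treating "all heads" as our stand-in for an ordered state, a quick count (illustrative only) shows disorder swamping order as the system grows:

```python
import math

# One perfectly "ordered" microstate (all heads) versus the number of
# half-heads, half-tails microstates, for a growing number of coins n.
for n in (10, 50, 100):
    mixed = math.comb(n, n // 2)  # ways to pick which n/2 coins land heads
    print(f"n = {n:3d}: 1 all-heads microstate vs {mixed:,} half-and-half ones")
```

At just 100 coins there are already over 10^28 half-and-half arrangements for every single all-heads one, and a real glass of water holds around 10^24 molecules, not 100 coins.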

Spontaneous reductions in entropy are possible, such as the formation of life or crystals. Josiah Willard Gibbs, an American scientist working in the late 1800s, even found a way to calculate when they can happen (more on that later). Overall, however, entropy in a system increases over time, because changes towards disorder are overwhelmingly more likely than those towards order.

From a physical point of view

We all instinctively understand that disorder is more likely than order, but why?

The meat of it is that randomness is simple and low on energy. It’s homogeneous. Nature loves that.

A glass of ice is more orderly than a glass of water. Molecules in ice are kept in a very specific arrangement, forming a lattice that we perceive as ice cubes. If you were to simulate a glass of it, you'd have to specify the molecules' composition, shape, size, and position relative to one another. For the glass of water, all you need to do is define the shape of the glass and how high you're filling it, because its molecules move about at random. The ice takes more data to make it what it is; it's more constrained, so it's less probable.

Entropy also moves things along towards low-energy states (including low potential energy), because spontaneous processes tend to work towards fixing imbalances, expending free energy along the way. A glass filled half with ice and half with boiling water has a greater imbalance, and a lower entropy, than a glass where the two have mixed, so mix they do.

Some outcomes are more likely than others (that's our statistical entropy) because they lead to simpler, more homogeneous systems by dissipating energy (our physical entropy). And in nature, quite like in finance, nothing happens unless you pay for it (with free energy).

Bringing us neatly to:

Gibbs’ Free Energy

In short, Gibbs' free energy formula tells us whether a process will happen spontaneously or not.

The free energy of a system is the energy available to perform physical work (to move things). The change in free energy during a process is the change in enthalpy (heat content) minus the product of temperature and the change in entropy: ΔG = ΔH − TΔS. As long as ΔG is negative, the process (such as a chemical reaction) can start spontaneously. This means that either a release of heat or an increase in entropy can drive it forward. The entropy term is what powers endothermic (heat-absorbing) reactions, where it often shows up as changes in volume or in the number of particles.
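As a worked example, consider melting ice, an endothermic process driven purely by entropy. Using textbook values for water (roughly 6.01 kJ/mol absorbed and 22 J/mol·K of entropy gained, both assumed constant near 0 °C), ΔG = ΔH − TΔS flips sign right around the melting point:

```python
# Gibbs free energy change for melting ice: dG = dH - T*dS.
# Textbook values for the fusion of water, assumed constant near 0 C:
DH_FUSION = 6010.0   # J/mol, heat absorbed on melting
DS_FUSION = 22.0     # J/(mol*K), entropy gained on melting

def delta_g(temp_k: float) -> float:
    """Gibbs free energy change of melting one mole of ice at temp_k."""
    return DH_FUSION - temp_k * DS_FUSION

for t in (263.15, 273.15, 283.15):   # -10 C, 0 C, +10 C
    verdict = "spontaneous" if delta_g(t) < 0 else "not spontaneous"
    print(f"T = {t - 273.15:+3.0f} C: dG = {delta_g(t):+7.1f} J/mol -> {verdict}")
```

Below 0 °C the heat cost wins and ice stays frozen; above it, the TΔS term wins and melting proceeds on its own; right at the melting point ΔG is essentially zero and ice and water coexist.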

Gibbs' formula also shows that free energy can be gained by breaking large molecules into smaller ones: even though snapping chemical bonds costs heat, the extra entropy of having more, smaller particles can more than pay for it. Fluids, like liquids and gases, are generally made of smaller, lighter molecules. They're also in a higher entropy state than solids, for example, since their molecules can move freely among themselves.

The arrow of time

Since things naturally tend to gain entropy, complex systems tend to break down into disorganized ones. Entropy is one of the few physical notions that requires a definite direction in time.

There are technically no natural laws, apart from entropy, saying that a piece of burnt wood can't un-burn or a puddle of water freeze itself back into a cube. All the energy and matter in the world was at some point concentrated in a single point during the Big Bang, and it's all still here. The only difference since then is that there's way more entropy around, and it's always growing.

Because entropy flows a single way, it has been argued that entropy makes time-travel impossible — but only time will tell.

From what we know so far, one of two possible outcomes is for entropy to win out in the end. We call this hypothetical scenario the Big Freeze, the Big Chill, or "the heat death of the Universe". I personally like the last one because it seems appropriately dramatic. In such a scenario, there is no more free energy anywhere in the universe. As such, there can be no further increase in entropy. But it also means that nothing would happen, nothing would ever move.

So are we doomed? Not necessarily. We could subvert this if we learn how to create hydrogen from pure energy. Hydrogen powers stars, and those could (maybe?) be used to stave off this heat death. There’s also the other alternative, the Big Rip, but that one doesn’t sound pleasant either.

All in all, entropy is a very complex topic. It can only be defined through the system it's applied to, so different academic fields focus on somewhat different facets of the concept.

But it definitely is a fascinating subject. It’s a bit humbling to know that the same thing making your bedroom dirty is also probably going to end the universe one day.
