The Arrow of Time: What Thermodynamics Doesn’t Tell You
Many people think the arrow of time can be explained by the increase of entropy, but that explanation is inadequate. We’ll explore why.
“An inch of time is an inch of gold”: the transience of time has been recognized since time immemorial. Unlike trekking across an open plain or diving into the ocean’s depths, one can neither wander around in time nor stand still in it. With every tick of the clock, we face the relentless advance of time.
Time’s irreversibility seems contradictory: Einstein’s theory of relativity unified space and time, and the laws of physics are (approximately) the same in all directions, including forward and backward in time. Yet our Universe seems to have picked out a unique direction: the arrow of time. Where does this special direction come from?
One of the most popular explanations comes from thermodynamics, where the concept of entropy is introduced. The direction of time is aligned with entropy’s increase. This supposedly explains why time is special — it is the unique direction that is picked out by the increase of entropy.
Yet, there is something unsettling about this explanation: It seems like we can just explain away the arrow of time by this magical quantity called entropy, without any reference to fundamental physics. The reasoning seems incomplete as well; why does entropy have to increase in the first place?
Indeed, I’d argue that the entropic explanation is not much of an explanation at all. Instead, thermodynamics merely rephrases the original question.
Let’s examine what thermodynamics tells us.
Thermodynamics: An Effective Approximation
Thermodynamics has one essential assumption:
Thermodynamics approximates a system using random processes and statistics.
To understand how this works, let’s go through an example. Take a standard 6-sided die. Rolling the die yields six outcomes each with 1/6 probability. Now, if I were to divide these six numbers into two categories, let’s say:
- Lucky numbers: 5, 6
- Unlucky numbers: 1, 2, 3, 4
then a die-roll will be more likely to yield an unlucky number than a lucky number. We can quantify how “lucky” or “unlucky” a category is by counting the number of possibilities (2 vs. 4). The number of possibilities (or its logarithm) is captured by the concept of entropy — there is some level of subjectivity in its definition (I have written another article that offers more insights on this idea).
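The counting above can be made concrete with a short Python sketch. The `lucky`/`unlucky` names are taken from this example, and using a (natural) logarithm of the count as the entropy is the convention mentioned above:

```python
import math

# Microstates of a fair 6-sided die, split into two categories.
lucky = {5, 6}
unlucky = {1, 2, 3, 4}

# Probability of landing in each category: count / total microstates.
total = len(lucky) + len(unlucky)
p_lucky = len(lucky) / total      # 2/6
p_unlucky = len(unlucky) / total  # 4/6

# Entropy of a category: the logarithm of its number of possibilities.
entropy_lucky = math.log(len(lucky))      # log 2
entropy_unlucky = math.log(len(unlucky))  # log 4

# The unlucky category is both more probable and higher in entropy.
print(p_lucky, p_unlucky)
print(entropy_lucky < entropy_unlucky)
```

A single roll is thus twice as likely to land in the high-entropy (unlucky) category as in the low-entropy (lucky) one.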
Going back to the physics analogy, the die represents a physical system. The system is assumed to change so rapidly and randomly that each measurement of the system is well approximated by a roll of the die. Each side of the die represents a state, and each category corresponds to a set of macroscopic physical quantities. The lucky category can be referred to as a low-entropy state, and the unlucky category as a high-entropy state.
Thermodynamics tells us how to count the number of states for a particular category. Each category is characterized by physical properties such as temperature, magnetization, different phases of matter, different amounts of pressure in a gas, and so on.
The Arrow of Time: A Thermodynamic Explanation
So why does entropy have to increase? Well, since a measurement is akin to a single roll of the die, the result is more likely to be a high-entropy state (e.g., an unlucky number). In particular, if the number of sides of the die grows as time goes on, then the entropy of the system (defined as the entropy of the category it is found in) will tend to increase over time. In other words:
An increase in entropy is equivalent to the shift from a lucky roll to an unlucky roll.
So in this view, the increase of entropy depends critically on the fact that the system was previously in a low-entropy state, so that there is room to move to a higher-entropy state! Otherwise, the system’s entropy wouldn’t change in the first place.
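Here is a toy simulation of this idea, a minimal sketch under my own assumptions (the particular stages at which new sides become explorable are invented for illustration). As restrictions are lifted and more sides open up, rolls drift from the lucky category toward the unlucky one:

```python
import random

random.seed(0)
lucky = {5, 6}

# Toy model (my construction): the set of sides the system can explore
# grows over time as restrictions on the system are lifted.
stages = [
    [5, 6],              # early: only the lucky (low-entropy) sides are reachable
    [3, 4, 5, 6],        # a restriction is lifted
    [1, 2, 3, 4, 5, 6],  # late: all sides explorable
]

fractions = []
for sides in stages:
    rolls = [random.choice(sides) for _ in range(10_000)]
    fractions.append(sum(r in lucky for r in rolls) / len(rolls))

# Fraction of lucky rolls falls as more sides open up:
# exactly 1 at first, then roughly 1/2, then roughly 1/3.
print(fractions)
```

The declining fraction of lucky rolls is the die-rolling picture of entropy’s increase: the system gets “less lucky” only because it started out restricted.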
At this point, it is clear that the entropic explanation is rather contrived. Why did the system have lower entropy in the first place (why do we get progressively fewer lucky rolls over time)? Generally, this happens because the system initially had some extra restrictions, preventing it from exploring the full range of possible states. In other words:
The number of explorable states for the system was smaller in the beginning than in the end.
Entropy’s increase is then ultimately tied to the (apparent) gradual increase of the number of explorable states. Of course, all of this discussion will be vacuous if we cannot explain how non-random physical processes can be approximated by random ones. In the end, a complete explanation for the arrow of time needs to go beyond thermodynamics.
Epilogue: Looking Beyond
To complete the thermodynamic explanation, we must answer two more questions:
- How is it that the dynamics of a physical system can be approximated by random processes?
- Why does the number of states available for exploration get smaller as we travel further back in time?
The answers to these questions lie beyond thermodynamics. It turns out that there are two key issues at play:
- Complex dynamics explains how processes are seemingly random
- The initial conditions of the universe (near the Big Bang) are ultimately responsible for the seemingly smaller number of explorable states at the beginning of time
I’ll dedicate separate articles to each of these topics in the future. Stay tuned!
In the meantime, check out my article on entropy. 👋