
A GUIDED JOURNEY


Entropy and the Initial Conditions of the Universe

Even with the right laws and constants, it is still very unlikely that chance alone could be responsible for our complex universe. In order to get a universe like our own, the spacetime and matter that comprise it must be arranged in a very improbable state. We discuss entropy and the second law of thermodynamics, and illustrate how they shed light on the ordered initial conditions of our universe. This leads to a third, independent argument for how we know that our universe has an intelligent cause.









Elie

Aaron. In the previous episodes we showed how our universe depends upon specially designed laws and fine-tuned constants, without which there wouldn’t exist a complex, structured, and ordered universe. This may give the impression that as long as we have the right laws and constants, we would automatically get a universe with atoms, molecules, planets, stars, and life. However, this is certainly not the case.


Aaron

Right. Even with the right laws and constants, it is still very unlikely that chance alone could be responsible for our complex universe. In order to get a universe like our own, the spacetime and matter that comprise it must be arranged in a very improbable state. And this is going to lead to a third, independent, argument for how we know that our universe has an intelligent cause.


Elie

While this episode will be a bit more complicated than the previous ones and may need a little more focused listening, you should have heard what Aaron originally wanted it to sound like. But don’t worry everyone. I’ll help keep things simple with examples and analogies.

I’m Rabbi Dr. Elie Feder.


Aaron

And I’m Rabbi Aaron Zimmer. Welcome to Physics to God!


Introduction


Aaron

In this episode, we want to highlight the exceptional order of the initial conditions of the universe - something that is conceptually different from the qualitative laws of nature and the quantitative constants. Since these initial conditions exhibit a high degree of order, this line of reasoning will provide independent evidence that our universe has an intelligent cause.


What is Entropy?


Elie

As the concepts involved in this episode require some background knowledge, let’s first introduce four basic terms: system, state, emergent property, and entropy. 

Aaron, please explain what a system is.


Aaron

OK. Almost any analysis in physics focuses its attention on a particular collection of things - and this collection is called a system. For example, we can consider a metal coin as a system which consists of many different molecules and their interactions. Alternatively, we can consider the entire universe as one big system that consists of all the different particles in the universe.


Elie

OK. Good. Now let’s explain the meaning of a state of a system and its emergent properties. 


Aaron

Systems can exist in different states. For example, a metal coin can exist in a solid state, or it can be melted down to a liquid state. Each state of the system can have different properties which only emerge in that state, such as the property of hardness, which only emerges in the solid state of metal.

The emergent property doesn't relate to the individual parts of the system but to the system as a whole. While the individual particles of metal are neither hard nor soft, the metal coin as a whole in its solid state has the emergent property of hardness, due to the specific arrangement of its atoms.


Elie

Let me clarify a bit: when analyzing any system, we can consider two distinct things:

1) The specific arrangement of the individual parts of the system, like the atoms of a metal coin.

2) The emergent properties of the system as a whole, like whether the metal coin is hard or soft.


Now, it’s time to explain the least familiar concept: entropy.


Aaron

If we were to randomly jumble an object’s individual parts, then entropy measures the probability that the whole object ends up in a state with a particular emergent property. A state of “high entropy” is one that can arise through many different arrangements, while a state of “low entropy” is one that can only come about through a few different arrangements. Assuming that all arrangements are equally likely, after a random jumbling it is less probable for the system to be in a state of low entropy, and more probable for it to be in a state of high entropy.


Elie

We'll illustrate this with an example. If we toss two coins, then we can consider all the possible ways that they could land: (1) two heads  (2) two tails (3) heads followed by tails (4) tails followed by heads.

The probability of any one of these four outcomes is 1/4. Upon consideration, we notice that in terms of the state of the system as a whole, the last two outcomes appear the same; both have one heads and one tails. Therefore, a better way to describe the probabilities is as follows: the probability of obtaining 0 heads is ¼, the probability of obtaining 2 heads is also ¼, but the probability of obtaining 1 heads is actually 2/4 (which is ½). Obtaining 1 heads is more likely than 0 or 2 heads because it can happen in two different ways, while 0 or 2 heads can each happen in only one way.
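
For listeners who want to check this counting for themselves, here is a minimal Python sketch that enumerates the four outcomes and groups them by the number of heads:

```python
from itertools import product
from collections import Counter

# All possible outcomes of tossing two fair coins.
outcomes = list(product("HT", repeat=2))   # ('H','H'), ('H','T'), ('T','H'), ('T','T')

# Group the outcomes by the emergent property "number of heads".
ways = Counter(outcome.count("H") for outcome in outcomes)
for heads in sorted(ways):
    print(f"{heads} heads: {ways[heads]} of {len(outcomes)} outcomes")
# 0 heads: 1 of 4 outcomes
# 1 heads: 2 of 4 outcomes
# 2 heads: 1 of 4 outcomes
```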


Aaron

We can generalize this idea to flipping a hundred coins. In total, there are 2^100, or more than a trillion trillion, possible outcomes. Thus, the probability of obtaining any particular outcome is less than 1 out of a trillion trillion. Now, there is only one way to get the state consisting of all 100 heads (HHHHHH…), while there are literally trillions of trillions of ways of getting the state consisting of 50 heads. (The first 50 could be heads, the last 50, the middle 50, alternating heads/tails, and so on.) Thus the probability of obtaining exactly 100 heads is less than 1 out of a trillion trillion, whereas the probability of obtaining exactly 50 heads is the much more likely value of about 1 out of 13.
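
These numbers can be verified exactly with a few lines of Python, where comb(100, k) counts the arrangements with exactly k heads:

```python
from math import comb

total = 2 ** 100   # all possible outcomes for 100 coins: ~1.27 x 10^30

# comb(100, k) counts the arrangements with exactly k heads.
print(comb(100, 100) / total)   # 100 heads: ~7.9e-31 (one arrangement; low entropy)
print(comb(100, 75) / total)    # 75 heads:  ~1.9e-7  (the "medium entropy" case below)
print(comb(100, 50) / total)    # 50 heads:  ~0.0796, about 1 in 13 (high entropy)
```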


Elie

Because it can occur in only one way, the outcome of 100 heads is a low probability occurrence, and is therefore called a low entropy state. Conversely, since the outcome of 50 heads can occur in many ways, it is much more probable, and is therefore called a high entropy state. The outcome of 75 heads - which is less likely than 50 heads but more likely than no heads -  can be considered a state of medium entropy.


Relationship Between Order and Entropy


Aaron

In addition to being used to describe probability and the number of ways to obtain a specific state, entropy is also used to describe the order or disorder of a system. One can think of a high entropy state as being disordered, and a low entropy state as being ordered. This is because there are many ways to randomly bring about a state of disorder, but only a few ways to bring about a state with order or with a meaningful emergent property.


Elie

Let me give an analogy to help explain the relationship between entropy and order. Consider a book as a whole system, with its letters as the parts. The story of the book depends upon all the letters and spaces being arranged in the right order. If we randomly jumbled all the letters, the book as a whole would lose the emergent property of intelligibility and coherence. We can therefore discuss the particular state of a book in which its letters convey a specific story. Such a state is called ordered, because its parts are specifically arranged in a manner which brings about a meaningful emergent property. In such a state, the whole is more than the sum of its randomly arranged parts; that is, a book is more than just a random bunch of letters. This means that a book that tells a coherent story is in a state of low entropy.

On the other hand, other states of the book are largely independent of how the letters are arranged. For instance, consider the state of the book being a meaningless jumble of letters. Almost every random ordering of its letters will put the book in this same state of meaninglessness. This is because there are many ways to arrange the letters such that the book as a whole makes no sense. Such a state is called disordered, because the particular random arrangement of its parts does not yield any significant emergent property. In such a state, the whole is no more meaningful than the sum of its randomly arranged parts. This means that a book that contains an incoherent jumble of letters is in a state of high entropy. 


Aaron

There are two different ways to think about these ideas. One way is in terms of probability and entropy, and the other is in terms of order and disorder. They generally amount to the same thing and we’ll be using both frameworks in our explanations.


Elie

Before moving on, we want to point out that the relationship between entropy and order involves some counterintuitive terminology, so let’s make it clear: a system which is more ordered, like a coherent book, has low entropy and a low probability of coming about by pure chance. A system which is less ordered, like a random jumble of letters, has high entropy and a high probability of coming about by pure chance. To avoid getting confused, whenever entropy comes up, we find it helpful to remind ourselves, “low entropy = order; high entropy = disorder.” With that in mind, let’s move on to applying these concepts to the physical world.


The Second Law of Thermodynamics


Aaron

Statistical thermodynamics is a branch of physics that deals with heat and related phenomena. The conceptual foundation for this field is the connection between entropy and probability. An important result of this study is the second law of thermodynamics. This law states that all physical processes move a system from a state of lower entropy to a state of higher entropy. Eventually, all closed systems will end up in a state of maximum entropy.


Elie

This means that as time goes on, all closed systems move towards progressively more disordered states. Eventually, all objects will end up in the most disordered possible state. The intuitive reason for this law is based upon the fact that disorder can arise through many possible arrangements, and will therefore be favored by purely random processes.


Aaron

The second law is one of the most important principles in all of physics. In fact, it’s responsible for the arrow of time that explains all of your everyday experiences, like an egg rolling off the table and splattering all over the floor.

Physicist Arthur Eddington, one of the great pioneers in understanding the significant implications of the second law, commented on its importance. He said:


The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations – then so much the worse for Maxwell's equations. If it is found to be contradicted by observation – well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it, but to collapse in deepest humiliation.


Elie

That the second law is true is unquestionable. The question is why it’s true. Why is entropy always lower when we look back into the past, but always increasing into the future? Granted, we consistently observe this to be true, but physicists were bothered by the reasoning behind it. We’ll soon see that this wasn’t just a simple ‘why’ question, but that good statistical reasoning seems to imply that this law shouldn’t be true.


Aaron

Before we get there, let’s start by looking at rising entropy into the future. The reason why all physical processes increase entropy into the future is based upon statistical considerations. To see this point, consider the following example. Start with 100 coins in a container, all of which are heads, a low entropy state. This starting state is called the initial conditions, which refers to the particular state that a system starts off in and serves as the basis for future states.

If you begin shaking the container and periodically check the number of heads and tails, then with time you will begin to get more and more tails. Eventually, you are very likely to end up with a roughly stable final state of about 50 heads and 50 tails, a high entropy state. Once the system is in its final high entropy state, any further shaking won’t significantly affect it. You might flip some heads to tails, but in all likelihood an approximately equal number of tails will flip to heads, thereby leaving the system in the same overall state.

It is important to understand the nature of this law. It is not that there is a force which drives the coins to an equal number of heads and tails. Rather, the law is purely a result of statistics and probability. Since there are so many more ways to obtain 50 heads and 50 tails than 100 heads and 0 tails, over time the box of coins will tend towards its maximum entropy state of 50 heads and 50 tails. 
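
Here is a rough simulation sketch of this shaking experiment in Python. The particular model - randomly re-landing ten coins per shake - is just one simple assumption; any similar random jostling shows the same drift toward about 50 heads:

```python
import random

coins = [1] * 100                 # initial conditions: all 100 coins are heads (1 = heads)

for shake in range(1, 1001):
    # One "shake" randomly re-lands ten of the coins.
    for _ in range(10):
        coins[random.randrange(100)] = random.randint(0, 1)
    if shake % 250 == 0:
        print(shake, sum(coins))  # the heads count drifts to ~50 and then stays there
```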


Elie

Let’s consider our analogy of the book which starts off with the initial condition of all the letters arranged so that the book tells a coherent story, a low entropy state. If we imagine that over time pairs of letters randomly get swapped, the book will slowly start to lose its meaning. Eventually, it will end up in the high entropy state of a random jumble of letters making no sense at all. This is because there are a lot of ways to have a meaningless sequence of letters, but only a few special arrangements that have meaning. Therefore, the random process of letter swapping will lead to the book's most likely state - disordered meaninglessness. This is an expression of the second law of thermodynamics.

Once again, even if we continually swap random letters, this is not a force which intentionally drives the book towards disorder. Rather, since there are so many more ways to obtain a disordered jumble of letters than an ordered book, over time a book subject to random letter swapping will naturally tend towards disorder. 
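
The same point can be made with a short Python sketch of the letter-swapping process; the starting sentence is, of course, just an arbitrary choice:

```python
import random

letters = list("IN THE BEGINNING GOD CREATED THE HEAVEN AND THE EARTH")

for swap in range(301):
    if swap % 100 == 0:
        print("".join(letters))   # coherent at first, a meaningless jumble by the end
    # Randomly swap one pair of characters, as in the analogy.
    i, j = random.randrange(len(letters)), random.randrange(len(letters))
    letters[i], letters[j] = letters[j], letters[i]
```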


Aaron

One more point before moving on. Since the second law is entirely based upon statistics, it is theoretically possible for it to be violated in a particular instance. For example, it’s possible, though highly unlikely, that after a few hours of shaking, you will once again find 100 heads, or even 100 tails. However, because probability makes extremely accurate predictions when dealing with very many objects, a violation of this law in a situation involving a significant number of particles would be considered a miracle.


Contradiction Between the Second Law of Thermodynamics and Statistical Reasoning


Elie

Since the second law applies to any system, we can think of our entire universe as one big system and see what the second law of thermodynamics tells us about it.


Aaron

The second law says that entropy always increases into the future. This means that over time our universe will tend to a state of maximum entropy - a state that cosmologists call the heat death of the universe. In this state, there will be no stars or planets, and everything in the universe will ultimately be in thermal equilibrium, which means that nothing interesting will happen anymore. Not a bright future.


Elie

Don’t worry about it guys. You’ve got many years before that happens - based on current calculations, we’re talking about trillions and trillions of years.  But that fun fact is beside the point.


Aaron

The second law also says that not only will entropy be higher in the future, but it was also lower in the past - that the further back in time we go, the lower the universe's entropy was. While this is a simple, accurate description of all our observations, it's not easy to understand why it is true regarding the past.


Elie

We can understand why entropy and disorder will be higher in the future because of the statistics of random processes. This also matches our everyday observations. We find, for example, that eggs roll off countertops and splatter all over the floor. However, we never find the reverse happening: a cracked egg rising off the floor and reassembling on the countertop.


Aaron

But it becomes much trickier when we try to use the same type of statistical reasoning regarding the past. In fact, if we apply the statistical reasoning of the second law to the past, we get something completely at odds with all our observations. Namely, the statistical logic that underlies the second law seems to tell us that even if we find a system in a low entropy state, the most likely explanation of where it came from is a random fluctuation from a higher entropy state in the past, just as it will move to a higher entropy state in the future.


Elie

For example, if you randomly shake a box of 100 coins and find it has 75 heads, it was most likely preceded by a past state that was closer to 50 heads, just like shaking it more will most likely lead to a future state of 50 heads.


Aaron

As Brian Greene said in The Fabric of the Cosmos:


This leads us to a simple but astounding point…all of the reasoning we have used to argue that systems will evolve from lower to higher entropy toward the future works equally well when applied toward the past…not only is there an overwhelming probability that the entropy of a physical system will be higher in what we call the future, but there is the same overwhelming probability that it was higher in what we call the past…Entropic reasoning yields accurate and sensible conclusions when applied in one time direction, toward what we call the future, but gives apparently inaccurate and seemingly ridiculous conclusions when applied toward what we call the past.


Elie

While you can appreciate this well-accepted point from the example of 75 heads, we’ll explain it in more depth with an example that is a bit complicated. Part of its complication comes from just how counterintuitive it is. It isn’t necessary to follow the details exactly - just to get the main point. We’ll say this example slowly so that you can focus on the details. 


Aaron

Imagine you had a box of a hundred coins that had been shaken up very well. Without first looking, you blindly turn over one coin at random, and only afterwards check inside and are shocked to find 99 heads and 1 tails - a very low entropy arrangement.


Elie

We can then ask: what was the situation just before you blindly flipped over the one coin? If we think about it, there are two possibilities. Scenario 1: There were at first 100 heads - an even lower entropy state - and you flipped one of them from heads to tails. Scenario 2: There were only 98 heads - a relatively higher entropy state - and you changed one of the two tails to a heads.


Aaron

Now, at first, scenario 2 might seem unlikely. If there were only 2 tails out of 100 coins, the probability that you randomly selected one of the two tails to flip over is 2 out of 100. On the other hand, in scenario 1, when all 100 coins were heads, this precise selection is unnecessary - changing any of the hundred coins would yield the exact same result. 


Elie

Nevertheless, besides comparing the relative likelihoods of selecting the right coin to flip, we also have to consider the probability of a randomly shaken box starting with all heads as opposed to 98 heads and two tails. While it’s certainly unlikely to randomly get 98 heads and only two tails, it’s much more unlikely to get all 100 heads. In fact, if you do the calculation, it turns out that it’s around 5000 times more likely to get 98 heads and 2 tails than all 100 heads.


Aaron

Therefore, if we compute which scenario more likely occurred - Scenario 1, starting with the lower entropy state of 100 heads and then flipping any coin; or Scenario 2, starting with the higher entropy state of 98 heads and flipping one of the two tails - it turns out that it’s around 100 times more likely that Scenario 2 occurred - that the initial state was 98 heads and 2 tails.
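
Both of these numbers are easy to verify with a short Python computation; the 2/100 selection factor and the binomial counts are exactly the quantities just described:

```python
from math import comb

total = 2 ** 100
p_100_heads = comb(100, 100) / total   # probability a shaken box shows 100 heads
p_98_heads = comb(100, 98) / total     # probability it shows 98 heads and 2 tails

print(p_98_heads / p_100_heads)        # 4950: 98 heads is ~5000x more likely

# Scenario 1: started at 100 heads; flipping any of the 100 coins gives 99 heads.
scenario_1 = p_100_heads * 1.0
# Scenario 2: started at 98 heads; the blind flip must hit one of the 2 tails.
scenario_2 = p_98_heads * (2 / 100)

print(scenario_2 / scenario_1)         # 99: Scenario 2 is ~100x more likely
```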


Elie

In other words, statistical considerations lead us to conclude that it's more likely that the past was in a higher entropy state than the present - a direct contradiction to the all-important second law of thermodynamics which says that the past was always in a lower state of entropy than the present.


Aaron

Eminent physicist Roger Penrose discusses this problem of why entropy is always lower in the past using the example of a splattered egg:


Our reasoning seems to have led us to expect, for example, that an exceedingly probable way that our egg originally found itself to be perched at the edge of the table was that it started as a mess of broken eggshell at the bottom of the table, mixed with yolk and albumen all churned up together and partly absorbed between the floorboards. This mess then spontaneously collected itself together, removing itself cleanly from the floor, with yolk and albumen thoroughly separating out and becoming completely enclosed by the miraculously self-assembling eggshell to provide a perfectly constructed egg, which propels itself from the ground at exactly the right speed so that it can become delicately perched on the edge of the table. Such behavior would be of the kind that our above reasoning led to … But this would be grossly in conflict with what presumably actually happened, namely that some careless person placed the egg on the table, not realizing that it was in danger of rolling too close to the edge. That evolution would have been consistent with the Second Law… When applied in the past time-direction, our argument has indeed given us an answer that is about as completely wrong as one could imagine.


Elie

OK. While this may have all been very complicated, let me reiterate the main idea. That is, just like statistics determine that a random process takes a system to a more probable, higher entropy state in the future, the same statistical reasoning says that the past was also a more probable, higher entropy state. However, this contradicts the observed second law which says that the past is lower entropy, not higher. 


As unintuitive as it sounds, just as whole eggs become splattered eggs in the future, statistical reasoning says that they should also come from splattered eggs of the past coming back together. But this conclusion is clearly wrong. All our experience tells us that whole eggs come from chickens, not from the miraculous unsplattering of splattered eggs. Scientists realized that we must be missing some critical information that can help resolve this apparent contradiction.


The Initial Conditions of the Universe


Aaron

To answer this problem, let’s consider the distant past of the universe. Just as we can analyze a system consisting of a box of coins and identify its initial conditions, we can do the same thing for the entire universe. We can consider the universe's initial conditions at its origin - what scientists call the big bang.


Elie

With an eye towards the universe’s initial conditions, let’s recall the apparent contradiction. While statistical considerations imply that the most likely past state of the universe was of higher entropy, the second law says that entropy was lower in the past. 


Aaron

Physicists’ astounding solution to this problem came from realizing that the contradiction only exists because of a mistaken premise - that the universe started in its most likely state - that of high entropy. This being the case, the natural solution to the contradiction is that the universe did not start off in a high entropy state as we might have expected if its initial state was chosen at random. 

Rather, we must say that the universe started off in a very ordered state with extraordinarily low entropy.  As a result of starting in such a low entropy initial state, all subsequent states end up having higher entropy, based purely upon statistical considerations.

 

Elie

Brian Greene expressed this point as follows:


The egg splatters rather than unsplatters because it is carrying forward the drive toward higher entropy that was initiated by the extraordinarily low entropy state with which the universe began. Incredible order at the beginning is what started it all off, and we have been living through the gradual unfolding toward higher disorder ever since.


Aaron

With this solution, the second law is saved and everything works out perfectly. That is, as long as the second law is supplemented with the additional stipulation of low entropy initial conditions at the big bang.


Elie

Of course, all this came as a big surprise to physicists. Since they assumed that the initial state of the universe at the big bang was randomly determined, the natural conclusion was that the universe started in its most likely state - that of high entropy.  

To instead suggest that our universe had low entropy initial conditions seems to go against the assumption that the initial state was set randomly. However, the conclusion was inescapable. The only way to explain the second law is by stipulating that our universe had highly improbable, low entropy initial conditions. After that first moment, the second law dominated everything, with entropy continually on the rise.


Aaron

At the end of the day, we see that the apparent contradiction was implicitly based on the assumption that the universe had randomly determined, high entropy initial conditions. Under that assumption, statistical reasoning contradicts the second law because it leads us to believe that our relatively low entropy universe in the present was just a lucky fluctuation from a higher entropy past state, akin to a whole egg emerging from the miraculous fluctuation of a splattered egg.

But now we see that the second law is, of course, absolutely correct. The fact that past states are always of lower entropy is in perfect accord with both our observations and with statistical reasoning, as long as the second law is supplemented with the stipulation that the universe began with very special low entropy initial conditions. 


Elie

It’s important to note that there is no obvious scientific explanation for these highly ordered initial conditions. In the Feynman Lectures, physicist Richard Feynman put it as follows:

We therefore conclude that the universe is not a fluctuation, and that the order is a memory of conditions when things started. That is not to say that we understand the logic of it. For some reason, the universe at one time had a very low entropy for its energy content, and since then the entropy has increased.


OUTRO


Aaron

Of course, this solution raises a whole slew of new questions. If the initial conditions were randomly determined, how improbable would it have been to attain our actual initial conditions by chance alone? In other words, given the set of all theoretically possible initial states for our universe, how ordered and special were our actual initial conditions? And finally, if chance isn’t a good explanation, what is the reason for the low entropy initial conditions of our universe? You can probably tell where we’re going with this. We’ll discuss this and more in our next episode. So stay tuned.


Elie

Wait wait wait. Aaron. Before we go, I have one last question. Couldn't we have just presented the main argument of this episode in a much simpler way, without all the complications of the contradiction between statistical reasoning and the second law?


Aaron

Ok, Elie. What do you mean?


Elie

The simple line of reasoning would go as follows: the second law of thermodynamics, which Eddington describes as holding the supreme position among the laws of nature, tells us that as time moves forward, the universe gets more and more disordered. If we run the second law back in time, we can deduce that the beginning of our universe must have been in a very ordered state indeed. Couldn't we have just said that? Wouldn't that have been much simpler?


Aaron

We could have said it that way, Elie, but then you wouldn’t have appreciated how this argument follows from one of the deepest and most profound problems in physics. It’s one thing to hear a short elevator pitch for why God exists; it’s something much more valuable to fully appreciate firsthand the context and depth of the argument. On this podcast, we’re following the trail of the most fundamental problems in physics and showing how they point directly from Physics to God.


Elie

Ok. I hear you. That’s a very good point. However, I just want to help some of our listeners by pointing out that if someone got lost in the complicated discussion, it’s sufficient to understand my simpler version and use that to appreciate the next episode.


Aaron

That’s fair. The bottom line is that if we run the second law of thermodynamics back in time, we discover that the initial state of the universe was extremely ordered - and nothing in physics itself requires this to have been the case. With that information in hand, join us next time as we complete this argument and show how the incredibly low entropy initial conditions point directly to an intelligent cause. I’m Rabbi Aaron Zimmer.


Elie

And I’m Rabbi Dr. Elie Feder. And this is Physics to God!


