
Things Fall Apart

An Introduction to Entropy

Copyright (c) 2001 by Gary Felder

Entropy is, for many people, one of the most confusing topics they hit in introductory physics. Most introductory physics terms, like energy, velocity, and temperature, refer to things that we are familiar with in our day to day life and have built some intuition for. Few people start taking physics with any intuition for entropy, however. Many popular science books and articles do discuss entropy, usually equating it with "disorder" and noting that according to the laws of physics disorder, and therefore entropy, must always increase. In my experience such accounts rarely make any attempt to define what is meant by disorder or how you would measure whether it is increasing or decreasing.

Entropy is a well-defined quantity in physics, however, and the definition is fairly simple. The statement that entropy always increases can be derived from simple arguments, but it has dramatic consequences. In particular, this statement explains why so many of the processes we see in the world occur irreversibly. I'm going to start with a general discussion of the notion of reversibility, leading up to the definition of entropy and the meaning of the law that says it must always increase. I'll end by talking about some of the implications of this law.

The discussion throughout the paper presents ideas in a very general way that assumes no math or physics background. There is a (slightly) more mathematical appendix that discusses a technical issue in the definition of entropy and explains how the concept of entropy is used in defining temperature.

Introduction: Time-Reversal

Your kitchen floor is covered with bits of egg shell, yolk, and egg white in a large puddle by the counter. A moment later the yolk and white run together into one place, the bits of shell fall in around them and form a smooth surface surrounding all the liquid, and finally the whole mess soars up onto your counter, ending up as an intact egg. You leave a cool cup of water on the table and a few minutes later the water has gotten warmer but several cold ice cubes have coalesced out of it. A mass of dandelion spores flying about in the wind all land in nearly perfect unison on the same plant, attaching themselves to it in a spherical ball.

If someone showed you a videotape of any of these events you would recognize instantly that the tape was being played backwards. These are just a few examples of the many processes in the world that happen in one direction only. Eggs break, but they never spontaneously re-form. Ice and warm water combine into cool water, but water never just happens to split into cold ice and warm liquid. Why is that? Why do things happen in the direction they do?

At first glance the question might seem silly. All of these events are governed by physical laws that tell them what to do. For example, objects fall towards the ground because of gravity, which seems to be a clear example of the laws of physics making things happen in one way only. Looked at more closely, however, this example isn't quite so simple. If I play you a videotape of an object falling you will notice that as it falls it picks up speed, going faster and faster towards the ground. Now suppose I play you the same tape in reverse. You will see an object moving up, slowing down the higher it gets. Either version of the tape looks like a plausible scene. In fact gravity is one of the clearest examples of a time-reversible law. As another example consider a billiards game being played on a table with no friction, so the balls never slow down. If I play a tape of the balls normally you will see balls bouncing off each other and off the walls. If I play the tape backwards you will see the same thing. There's no way to tell which version is correct.

Notice, however, that in the billiards example I had to specify that the table had no friction. A real billiard ball moving along a table will gradually slow to a stop. If I played you a tape of a billiard ball that started out at rest and then spontaneously began moving, with nothing else touching it, you would know that the tape was being played backwards. Friction, then, seems to be a good candidate for a physical effect that is not time-reversible.

To examine that claim, let's consider more carefully what friction is. What happens to the billiard ball as it is rolling? In fact the same thing happens as in the frictionless example; it experiences constant collisions. In this case the collisions are not just with other balls but also with the air molecules surrounding it. More importantly, because neither the table nor the ball has a perfectly smooth surface there are constant collisions between their molecules. The net result of all of these collisions is that the ball loses energy, and eventually comes to a stop. Where did the energy go? It went into all the air and table molecules that the ball collided with. They in turn scattered it out ever further into the atmosphere and down into the floor.

So now let's look once again at the backwards tape of the billiard ball. We see it starting out at rest and gradually picking up speed. What causes that to happen? If we zoom in close enough we see that air molecules from all over the room happen to converge on the exact right spot to knock the ball in one direction. Moreover the molecules of the table, which are continually vibrating and moving, all happen to push against the ball in the same direction as the air molecules. Looked at closely enough, there is nothing in this scene that violates the laws of physics. Rather what we see is the playing out of what appears to be a massive coincidence.

In fact it turns out that in classical physics all of the fundamental laws of nature are perfectly time-reversible. (The question of time-reversibility in quantum mechanics is somewhat subtler, and I'm not going to discuss it in this paper.) All of the processes that we see occurring in one direction only do so because it would require strange coincidences for them to occur the other way. This statistical tendency for processes to occur in a particular direction is described by a rule called the second law of thermodynamics. Note that this "law" isn't a fundamental law of physics, but rather a statement of statistical probabilities. In the next section I'll describe how this law is formulated, i.e. how to know which processes will occur in one direction only. I'll do so by defining a quantity called entropy that can never decrease in any physical process.

Entropy and the Second Law of Thermodynamics

Entropy is a relationship between macroscopic and microscopic quantities. To illustrate what I mean by that statement I'm going to consider a simple example, namely a sealed room full of air. A full, microscopic description of the state of the room at any given time would include the position and velocity of every air molecule in it. In practical terms it would be essentially impossible to measure those quantities for every molecule in a room. Instead, what we typically observe are macroscopic quantities that arise as averages over large numbers of molecules. These macroscopic quantities include, for example, the temperature and density of the air. We might observe these quantities varying from one place to another in the room, but in general the smallest region in which we will be able to measure variations will still be large enough to contain an enormous number of molecules. (Just to give a feeling for the kind of numbers involved, a typical room at normal Earth temperatures and pressures will contain about 10^27, or a billion billion billion, air molecules.)

If I were able to measure the complete, microscopic state of the air molecules then I would know all the information there is to know about the macroscopic state. For example, if I knew the position of every molecule in the room I could calculate the average density in any macroscopic region. The reverse is not true, however. If I know the average density of the air in each cubic centimeter that tells me only how many molecules are in each of these regions, but it tells me nothing about where exactly the individual molecules within each such region are. Thus for any particular macrostate there are many possible corresponding microstates. Roughly speaking, entropy is defined for any particular macrostate as the number of corresponding microstates.

To recap: The microstate of a system consists of a complete description of the state of every constituent of the system. In the case of the air this means the position and velocity of all the molecules. (Going further to the level of atoms or particles wouldn't change the arguments here in any important way.) The macrostate of a system consists of a description of a few, macroscopically measurable quantities such as average density and temperature. For any macrostate of the system there are in general many different possible microstates. Roughly speaking, the entropy of a system in a particular macrostate is defined to be the number of possible microstates that the system might be in. (In the appendix I'll discuss how to make this definition more explicit.)

This definition should be clearer with the aid of a few examples. The first is a simplified example where the microstates consist of balls sitting in one of two bins. For the second example I will return to the room full of air and talk about the entropy of different patterns of density.

Example 1: Balls in Bins

Imagine I have ten thousand tiny balls, each labeled with a different number. I put each of the balls into one of two bins, Bin A or Bin B. In this case the microstate of the system consists of the position of each of the numbered balls. The macrostate, i.e. what I actually observe, consists simply of how many balls are in each bin. For each macrostate I can easily figure out how many possible microstates there are. For example, if I know that all of the balls are in Bin A then the microstate is determined uniquely. If I know that all but one of the balls are in Bin A then there are ten thousand possible microstates for the system, corresponding to the ten thousand different balls that might be in Bin B. In general, the entropy (the number of possible microstates) goes up very rapidly as you get closer to having an even split between the two bins. For that particular case, i.e. five thousand balls in each bin, there are roughly 10^3008 possible microstates.
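
To make these counts concrete, here is a minimal Python sketch of my own (not something from the original paper) that counts the microstates for a few macrostates of the ten thousand balls. The count for a given macrostate is just the binomial coefficient "ten thousand choose the number of balls in Bin A."

    import math

    N_BALLS = 10_000

    def microstates(n_in_bin_a):
        # Each microstate is a choice of which labeled balls sit in Bin A,
        # so the count is the binomial coefficient C(N_BALLS, n_in_bin_a).
        return math.comb(N_BALLS, n_in_bin_a)

    print(microstates(10_000))       # all balls in Bin A: exactly 1 microstate
    print(microstates(9_999))        # one ball in Bin B: 10,000 microstates
    even_split = microstates(5_000)  # far too large a number to print meaningfully
    print(len(str(even_split)) - 1)  # about 3008, i.e. roughly 10^3008 microstates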

Now let's allow the two bins to mix. I open up a large window connecting the two bins and start blowing giant fans that send all the balls constantly flying in different directions. Every minute I shut the window and turn off the fans, weigh each bin to figure out how many balls are in it, and then open the window, turn on the fans, and start again. If I set the experiment up carefully so the fans are blowing equally in both directions then eventually each ball will have a 50/50 chance of being in either bin, more or less independently of where all the other balls are. In other words, every microstate will be equally likely. Take a moment to convince yourself that those two statements are equivalent. If you've followed me so far, I hope it will be clear to you that not all macrostates will be equally likely. The state with one ball in Bin B and all the rest in Bin A will be ten thousand times more likely than the state with all the balls in Bin A. In fact the numbers involved are so huge (see the previous paragraph) that it is almost certain that once the balls have had time to mix I will find almost exactly half of them in each bin every time I look. In other words, entropy will tend to increase. If I start in a state with most of the balls in one bin (low entropy) the system will tend to move to a state where they are evenly distributed (high entropy). On the other hand if I start with high entropy, it's very unlikely that the entropy will spontaneously decrease.
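
If you would rather watch this tendency than take my word for it, the following sketch (again my own, with an arbitrarily chosen mixing rate) simulates the fans: it starts with every ball in Bin A, gives a few hundred randomly chosen balls a fresh 50/50 chance of either bin each minute, and prints the resulting macrostate.

    import random

    N_BALLS = 10_000
    in_bin_a = [True] * N_BALLS      # microstate: which bin each labeled ball is in

    random.seed(0)
    for minute in range(1, 101):
        # Each minute the fans reshuffle 500 randomly chosen balls.
        for _ in range(500):
            ball = random.randrange(N_BALLS)
            in_bin_a[ball] = random.random() < 0.5
        if minute % 20 == 0:
            print(f"minute {minute:3d}: {sum(in_bin_a)} balls in Bin A")

The count drifts toward roughly five thousand and then simply fluctuates around that value; it essentially never wanders back to the all-in-Bin-A state it started from.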

Example 2: Back to Air Density

The previous example illustrates the definition of entropy well, but the setting is a bit artificial. Let's return to the example of a room filled with air and see how the same logic applies. Imagine I am going around the room with a handheld device that measures the local air density. In fact, this situation is almost exactly the same as the previous example. The balls are now air molecules and the bins are regions of the smallest size that my device can measure. In this case there are many bins instead of just two, but the same result will apply. The macrostate in which there are roughly equal numbers of air molecules in each part of the room has a much higher entropy than any macrostate in which there are large density differences from one part of the room to another. In other words if you start with all the air bunched up in one corner of the room it will tend to flow out until it more or less evenly fills the space. If you see a videotape in which the air starts out uniform and then all moves into one corner of the room you can bet that it's being played backwards.
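
The same counting works with many bins. The sketch below uses toy numbers of my own choosing (a million molecules and a thousand regions) and counts microstates with the multinomial coefficient, taking logarithms along the way because the raw counts are astronomically large.

    import math

    def log_microstates(counts):
        # Natural log of the multinomial coefficient: the number of ways to
        # assign labeled molecules to regions with the given occupancy counts.
        total = sum(counts)
        return math.lgamma(total + 1) - sum(math.lgamma(c + 1) for c in counts)

    n_regions, n_molecules = 1000, 10**6

    uniform = [n_molecules // n_regions] * n_regions    # 1000 molecules per region
    bunched = [n_molecules] + [0] * (n_regions - 1)     # everything crammed into one corner

    print(log_microstates(uniform))    # about 6.9 million
    print(log_microstates(bunched))    # 0: only one way to put every molecule in one region

The uniform macrostate corresponds to something like e raised to the 6.9 millionth power more microstates than the bunched one, which is why you never see the air spontaneously pile up in a corner.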

The second law of thermodynamics states that the total entropy of the universe can never decrease. Note that the entropy of a particular system can decrease, provided there is a corresponding increase in one or more other systems. In the example of air density the entropy in the corner of the room that initially had all the air decreases as the air expands, but the total entropy in the room increases. Likewise the growth of an individual human involves building up many complex, low-entropy structures, but this building is done by breaking down complex molecules, plants, and other systems in the environment. Once again total entropy increases. In fact life, with all its digestive, heat-generating processes, is a very efficient entropy-producing engine. A reversible process is one in which the total entropy of all systems involved remains constant, whereas any process in which total entropy increases is irreversible.

Conclusion

The second law of thermodynamics is a statistical law. There is nothing in the basic laws of physics that forbids a bunch of dandelion spores from converging on a plant stem and sticking to it. There are, however, many different ways that randomly moving molecules can knock spores off a dandelion, whereas it requires a very special and unlikely combination of air movements to push the spores back on. This statistical tendency appears to be a fixed law of physics because the number of microscopic components (particles, atoms, molecules) making up any macroscopic system is so enormous that it would be inconceivable for the laws of thermodynamics to be violated by coincidence. All of this means that a videotape of the universe being played backwards would appear to be following all the usual laws of physics, but in a very strange and unlikely way.

So far as we know entropy will continue increasing until someday the universe is filled with nothing but weak, uniform radiation. Fortunately this scenario, known as "the heat death of the universe," will not take place until after a length of time that makes the current age of the universe seem minuscule by comparison. For now we get to enjoy our complicated, low-entropy state. For that, it seems like having a few eggs break is a small price to pay.

Appendix

This appendix discusses a couple of issues that are somewhat more technical than the ideas discussed in the main part of the paper. In the text I said that the entropy of a system in a given macrostate is roughly given by the number of corresponding microstates. In the first part of the appendix I give the actual definition of entropy, which is proportional to the logarithm of the number of microstates, and explain why that definition is more useful. The only math required for this section is a familiarity with the definition and properties of logarithms. The second part of the appendix relates entropy to a more familiar quantity, temperature. In that section I explain how temperature is defined using the idea of entropy, and why energy flows from certain systems (i.e. "hot" ones) to others ("cold" ones). The basic ideas of this section should be accessible with no math, but the precise definition of temperature involves a derivative and thus requires some knowledge of introductory calculus.

A. Entropy is Actually the Logarithm of the Number of Microstates

If a system is in a macrostate for which there are N possible microstates the entropy is not simply defined as N, but rather as k_B log(N). The number k_B, called "Boltzmann's constant," is simply a proportionality factor that sets the units of entropy. I'll ignore it for the rest of this appendix. Why is there a logarithm in the definition, though? In one sense it doesn't matter one way or another. Whether you take the logarithm or not it will still be true that states with higher entropy are more likely to occur, so entropy will tend to increase. The logarithm is convenient, however, because it makes entropy an "extensive" quantity. All that means is that if I have two systems with entropy S1 and S2 then the combined system consisting of both of them has entropy S1 + S2. (Entropy is usually denoted by S; I don't know why.) Many quantities in physics are extensive, such as mass. If I have a 2 kg weight and a 3 kg weight then the two of them together have a mass of 5 kg.

To see why the logarithm accomplishes this let's consider two systems whose macrostates correspond to N1 and N2 possible microstates, respectively. How many microstates are possible for the combined system? I would encourage you to stop and try to answer this question for yourself before reading on.

The answer is that the combined system has N1 x N2 possible microstates. Say for example that the first system has three possible microstates and the second one has two (N1=3, N2=2). We can list all the possible states for the combined system:

1) System 1 in state 1, System 2 in state 1
2) System 1 in state 1, System 2 in state 2
3) System 1 in state 2, System 2 in state 1
4) System 1 in state 2, System 2 in state 2
5) System 1 in state 3, System 2 in state 1
6) System 1 in state 3, System 2 in state 2

The total number of possibilities is N1 x N2, or 6. This same process works for any numbers N1 and N2. Say we label the state of the combined system such that (2,4) means the first system is in its second possible microstate and the second one is in its fourth. The possible microstates for the combined system are:

(1,1) (1,2) (1,3) ... (1,N2)
(2,1) (2,2) (2,3) ... (2,N2)
...
(N1,1) (N1,2) ... (N1,N2)

You should be able to convince yourself that there are N1 times N2 entries in the table above.

Thus if entropy were defined as S = N the combined system would have entropy S = S1 x S2. With the logarithm, though, S = log(N1 x N2) = log(N1) + log(N2) = S1 + S2.
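
As a quick numerical check of this argument, here is a tiny sketch of my own (not part of the original paper):

    import math

    N1, N2 = 3, 2                    # microstate counts for the two example systems
    N_combined = N1 * N2             # 6 microstates for the combined system

    S1, S2 = math.log(N1), math.log(N2)
    S_combined = math.log(N_combined)

    print(math.isclose(S_combined, S1 + S2))   # True: the log turns multiplication into addition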

There is another subtlety in the definition of entropy that I glossed over in the main body of the paper. Consider the example of the room full of air, and recall that the microstate of that system is given by the exact position and velocity of every air molecule in the room. There are actually an infinite number of possible microstates for any given macrostate of the room. This problem arises because position is a continuous quantity, meaning any particular molecule can be at any one of an infinite number of possible positions. This issue can be dealt with rigorously by using integrals over appropriately defined quantities instead of simply counting states. I'm not going to go into the details of that formalism, which is mathematically a bit complicated but conceptually equivalent to counting states.

B. Temperature

As I noted at the beginning of the paper, temperature is a quantity we have experience with in our daily lives. We know what hot and cold things feel like. Somewhat more rigorously, we can say that when a hot thing and a cold thing are put in contact, energy tends to flow from the hot one to the cold one until the two are at the same temperature. To give a fully rigorous definition of temperature, however, requires using the concept of entropy. In particular, the reason that energy tends to flow from hot to cold things is that such a flow increases the entropy of the system as a whole.

To see how this works I will first note that for most systems the entropy increases as the energy increases. Roughly speaking this occurs because a system with a lot of energy can have a lot of different microstates, depending on how that energy is divided among all its particles. Now suppose I put two systems in contact with each other in such a way that they can exchange energy. For example, if they are touching each other then collisions between molecules at the boundary can transfer energy between the two systems. As the molecules jiggle around more or less randomly some energy will go each way between the systems. As each system gains or loses energy its entropy will tend to go up or down.

For a system in a particular state you can generally quantify how much its entropy will increase (or decrease) as you add (or subtract) energy. Let's suppose that of our two systems the entropy of system A depends much more strongly on energy than the entropy of system B. That means that if system B loses energy to system A the total entropy of the combined system will go up. Since the combined system will tend over time to evolve into its highest entropy state, on average system B will tend to lose more energy to system A than the other way around. Note that it doesn't matter whether system A or B has more energy; it simply matters which one will gain or lose more entropy by gaining or losing energy.

I should emphasize here that there is no physical law requiring energy to flow from system B to system A. All the entropy is telling us is that of all the possible interactions that can go on between the systems there are more of them that will involve energy transfer from B to A than the other way around, so statistically that is what will tend to happen.
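
One way to see this statistical tendency explicitly is with a standard toy model, two "Einstein solids" sharing energy quanta (the choice of model is mine, not the paper's). The sketch below tallies how many microstates correspond to each possible way of splitting the energy between the two systems; the overwhelmingly most likely splits are the ones that maximize the total entropy.

    import math

    N_A, N_B = 300, 200      # number of oscillators in each system
    Q_TOTAL = 1000           # total energy quanta shared between them

    def multiplicity(n_oscillators, quanta):
        # Number of microstates of an Einstein solid with the given energy.
        return math.comb(quanta + n_oscillators - 1, quanta)

    weights = [multiplicity(N_A, q) * multiplicity(N_B, Q_TOTAL - q)
               for q in range(Q_TOTAL + 1)]

    most_likely_q_a = max(range(Q_TOTAL + 1), key=lambda q: weights[q])
    print(most_likely_q_a)   # about 600: energy settles where dS/dE is equal for the two systems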

Finally, I can formulate the preceding ideas mathematically and thus come to a definition of temperature. For each system I can define a quantity (dS/dE), the rate of change of entropy with respect to energy. This quantity, often denoted by the Greek letter β (beta), determines how energy will tend to flow between systems; it will tend to flow from systems with small β to systems with large β. For historical reasons temperature is defined in the opposite way; energy tends to flow from systems with large temperature to systems with small temperature. So the temperature T of a system is defined as T = 1/β = 1/(dS/dE).
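
As a small illustration of this definition, the sketch below takes a toy entropy function for a monatomic ideal gas, S(E) = (3N/2) k_B ln(E) plus a constant (that formula is my assumed input, not something derived in this paper), estimates dS/dE numerically, and inverts it to get a temperature.

    import math

    k_B = 1.380649e-23       # Boltzmann's constant, in joules per kelvin

    def entropy(energy, n_molecules):
        # Toy ideal-gas entropy; the additive constant doesn't affect dS/dE.
        return 1.5 * n_molecules * k_B * math.log(energy)

    def temperature(energy, n_molecules, dE=1e-6):
        # Numerical estimate of dS/dE, then T = 1/(dS/dE).
        dS = entropy(energy + dE, n_molecules) - entropy(energy, n_molecules)
        return dE / dS

    # About 10^22 molecules sharing 62 joules of thermal energy come out near room temperature:
    print(temperature(62.0, 1e22))   # roughly 300 K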

To recap: Temperature is defined as a measure of how much the entropy of a system changes in response to an increase or decrease in energy. When two systems are put in contact energy will tend to flow between them in the direction that increases entropy, which means it will flow from the hotter system to the colder one.

