Delight in Disorder

I remember reading once, in a dreary reference called something like The Harvard Guide to American Literature, that Thomas Pynchon’s work was sometimes inspired by Newton’s Second Law of Thermodynamics. This is about as plausible as Benjamin Franklin’s key role in the Los Alamos A-bomb project, but once you’ve recovered from the disorientation, there’s a certain richness to the idea of Newton having formulated a cornerstone of 19th-century science that continues to disturb and fascinate.
For the Second Law of Thermodynamics is, in a way, the Repressed of classical physics. When you learn about Newton and Galileo in first-semester physics, you learn to work in a world where everything is perfectly smooth and efficient. Blocks slide along infinitely long planes at constant speed because there is no friction, objects fly in parabolas because there is no air resistance, collisions are perfectly elastic because none of the kinetic energy is dissipated into heat.
The Second Law of Thermodynamics (SLT from here on) is all about those banished things that mess up your equations: it asserts that things run down irreversibly, and that nice, tidy, usable forms of energy eventually dissipate into useless random forms. The measure of the uselessness or disorder of the energy in a system is called its entropy, and the SLT says that the entropy in a closed system never decreases.
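
In symbols, that last sentence (and nothing more) is just: for a closed system, $\Delta S \geq 0$.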

Actually, what we usually think of as sources of energy are more like sources of low entropy.  When you burn gasoline, there’s as much energy in the system at the end as there was before, but you have turned the energy from a low-entropy form (stored, I guess, in the orderly structure of the molecules) into higher-entropy forms like the heat of the exhaust.  We probably should have a Secretary of Entropy instead of a Secretary of Energy, except that might cause confusion, since the Defense Department also specializes in bringing high-entropy states to various parts of the world.

Here is my favorite way to think about entropy, formulated by a guy named Boltzmann (his equation for entropy is inscribed on his tombstone in Vienna).  Imagine a box with a bunch of balls in it, say 100 green and 100 red balls, and they are bouncing around (you can imagine the box in space, or in the back of a truck driving on a really crappy road, or whatever).  If we start with all the red balls on the left and all the green balls on the right, then this is a very low-entropy state in the informal sense of being very neat and orderly.  We can formalize this by noting that any change we make will be noticeable, even exchanging one ball from each side: there is only one way to arrange the balls to fit the overall description I gave.  Now if there’s one red ball among the green, there’s a bit more leeway; the stray could be any one of the 100 red balls and the overall picture would look the same.  If there are two red balls among the green, there are a lot more ways to do it (100 choices for the first ball, 99 for the second, so 100*99/2 = 4950, dividing by 2 because choosing A then B is the same as choosing B then A).  Pretty soon the numbers get really big, with dozens of zeros after them, as you get the balls more and more jumbled together.
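
If you’d rather not multiply by hand, here is a quick sketch in Python (my illustration, not anything of Boltzmann’s) that counts these arrangements using the built-in binomial coefficient:

```python
# Number of ways to pick which k of the 100 red balls have strayed
# into the green side; math.comb(n, k) is the binomial coefficient.
import math

for k in [0, 1, 2, 10, 50]:
    print(f"{k:>2} red balls among the green: {math.comb(100, k):,} arrangements")
```

Already at 50 strays the count is about 10^29, which is where the dozens of zeros come from.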

So the more mixed-up or jumbled the box gets, the more ways there are to rearrange the details of which ball goes where without changing the global picture of how many balls are on each side.  And this is essentially Boltzmann’s version of the SLT: as time goes on, the box will get more and more mixed up, and the number of low-level rearrangements that can be made without affecting the high-level picture will increase: this Rearrangement Number is called entropy (actually, you have to take the logarithm, but this doesn’t affect the direction of things).
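
For the record, the equation on the tombstone is, with W the Rearrangement Number and k a constant now named after Boltzmann:

$$S = k \log W$$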

And this explains why the SLT is true: there are only a few ways to be low-entropy, and a really, really large number of ways to be high-entropy.  A system that starts out in one of these super-rare low-entropy states might just stay the way it is, but if it moves to a new state it is extremely unlikely that it will find one of the even-rarer states of even lower entropy.  It is more likely to find one of the super-numerous higher-entropy states.  This means that the SLT technically is just a statement of probability: in any closed system, the entropy will go up except in unlikely cases…which sounds a bit disappointing, except that “unlikely” here, when we’re talking about a few gazillion molecules in a cup of coffee instead of a hundred balls, means “might happen once in a trillion years.”  So physicists just simplify and ignore the exception.
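
To watch this probabilistic version of the SLT in action, here is a toy simulation in Python (my own sketch of the balls-in-a-box story above, with 100 balls per side and one random swap per step):

```python
# Toy model of Boltzmann's argument: 100 red and 100 green balls,
# 100 per side, starting perfectly sorted (all red on the left).
# Each step swaps a random ball from the left with a random ball
# from the right. The entropy of the macrostate "r red balls on
# the left" is the log of its multiplicity: C(100, r) ways to choose
# which red balls are on the left, times C(100, r) ways to choose
# which green balls are on the right.
import math
import random

def entropy(r):
    return 2 * math.log(math.comb(100, r))

random.seed(1)
r = 100  # all red balls start on the left: multiplicity 1, entropy 0
for step in range(2001):
    if step % 400 == 0:
        print(f"step {step:>4}: {r:>3} red on the left, entropy = {entropy(r):6.1f}")
    left_red = random.random() < r / 100             # is the left ball we grab red?
    right_red = random.random() < (100 - r) / 100    # is the right ball we grab red?
    r += int(right_red) - int(left_red)              # swap them
```

Run it and r drifts quickly toward 50 and hovers there, with the entropy climbing from 0 toward its maximum and then just rattling around near the top.  Nothing forbids the box from drifting back to perfectly sorted; you’d just wait a very long time.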

The fact that our universe’s tendency to run down can be derived from basic probability is pretty cool in itself, but I did have somewhere I was going with this.  Cosmologists now believe that our low-entropy past is the reason we think of time as having a direction from past to future, and that without the SLT we might have memories of the future instead of the past.

But that will have to wait for the next post….
