Sewell, Part II
In Tuesday's post I began my analysis of Granville Sewell's recent attempt to revive the thermodynamics argument against evolution. I continue that analysis now.
Let us ponder for a moment the second law of thermodynamics.
In its simplest formulation the second law asserts that the entropy of a closed system can never decrease. A closed system (strictly speaking, what physicists call an isolated system) is one that receives no energy from outside the system itself. Save for the universe as a whole there are no truly closed systems in nature. Even a well-insulated system in a laboratory setting will always be receiving some minimal input of energy from outside. Nonetheless, we can find systems that are sufficiently close to being closed to treat them as such.
But what is entropy? It is often described as randomness or disorder. This captures an important aspect of what entropy represents, but it is not precise enough to be useful in actual applications. In reality entropy, like many terms in physics, has only a mathematical definition.
The concept of entropy was born out of the realization that there are certain natural processes that only proceed in one direction. Coffee and milk combine to form a murky, light brown liquid, but you will never see mixed coffee separate itself into black coffee and white milk. Air placed in one side of a box will quickly spread to fill the entire thing, but you will never see the air rush to one side of the box, leaving a vacuum on the other.
Since the first law of thermodynamics, conservation of energy, did not prohibit things like mixed coffee separating itself, it was concluded that there was some new principle of thermodynamics waiting to be discovered. And just as the first law dealt with the internal energy of a particular system, it was concluded that there ought to be some other property of physical systems that captured our notions about the inherent directionality of certain natural processes.
This ultimately led to the idea of the entropy of a system. At this stage of the game, though, the entropy of a system itself was not defined (later, when statistical mechanics was brought to bear on the problem, that changed). What was actually defined was the change in entropy of a system in going from state one to state two. This definition was purely mathematical (in other words, a formula expressed in terms of other known quantities was provided for it), but it had the virtue of being composed of quantities that could actually be measured in many important contexts.
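In modern notation, this definition (due to Clausius) reads

$$\Delta S = S_2 - S_1 = \int_1^2 \frac{dQ_{\mathrm{rev}}}{T},$$

where $dQ_{\mathrm{rev}}$ is the heat absorbed along a reversible path connecting the two states and $T$ is the absolute temperature at which it is absorbed, both of which can be measured in practice.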
It was a consequence of this definition that if all of the energy added to a system were converted to useful work, then the change in entropy that resulted would be zero. A process in which this maximum possible amount of work was attained was said to be reversible. The idea is that a process is reversible if some minimal amount of effort is sufficient to make the process run the other way. As a simple illustration, imagine that you have a seesaw that is currently in balance. Now imagine placing a small weight (in theory, even a grain of sand should be sufficient) on one side of the seesaw. The result will be that one side of the seesaw will be lowered, while the other side will be raised. If we assume there is no friction between the movable parts of the seesaw, then the loss of potential energy on the lowered side is matched exactly by the gain in potential energy of the raised side. If we now removed the weight from the lowered side, the seesaw would return to its original state. This process is therefore reversible.
In reality, of course, there are no perfectly reversible processes. Some of the kinetic energy imparted to the seesaw by the addition of the weight on one side would translate not into increased potential energy on the other side, but rather into heat generated by the friction between the seesaw's movable parts.
Consequently, it was realized that in real-life processes the change in entropy would always be positive. In other words, entropy always increases in a real-life process. In certain idealized processes the entropy change could be zero. But it could never happen that the entropy of the system decreased.
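In symbols, this is the familiar statement that for any process in a closed system

$$\Delta S \geq 0,$$

with equality holding only for the idealized reversible processes.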
At this point we encounter a complication. The formula by which the change in entropy was defined was only valid for reversible processes. It could be extended to irreversible processes by imagining some hypothetical series of reversible processes whose start and end states were the same as those you were interested in. This is possible because the entropy of the system depends only on the state that system is in. Therefore, the entropy change depends only on the start and end states, and not on the path that led you from state one to state two. For this reason, the entropy change is said to be “independent of path.”
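As a concrete illustration, here is a minimal sketch in Python (the numbers are chosen for convenience, not taken from Sewell's essay). Consider the free expansion of an ideal gas, the “air rushing to fill the box” example above. The expansion itself is irreversible, but because entropy depends only on the state of the system, we can compute the change along a substitute reversible isothermal path between the same two states:

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def entropy_change_isothermal(n_moles, v_initial, v_final):
    """Entropy change of an ideal gas along a reversible isothermal
    expansion from v_initial to v_final: delta-S = n R ln(V2/V1)."""
    return n_moles * R * math.log(v_final / v_initial)

# One mole of gas doubling its volume. The real process (free expansion)
# is irreversible, but its start and end states match those of the
# reversible isothermal path, so the entropy change is the same.
delta_s = entropy_change_isothermal(1.0, 1.0, 2.0)
print(round(delta_s, 2))  # prints 5.76 (J/K), positive as the second law demands
```

The point of the substitution is exactly the path independence described above: any reversible path connecting the same two states would give the same answer.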
In real life, all processes are irreversible. But some are sufficiently close to being reversible to make little practical difference. Generally speaking, the processes that are “close enough” are those that remain close to thermodynamic equilibrium at every stage. Roughly, this means that the large-scale properties of the system change only very slightly at each stage of the process.
So what does this have to do with evolution? Well, let us take as our initial state the lifeless Earth of four billion years ago. As our final state we will use the Earth of today. Certainly the highly complex organisms of today are more highly ordered than their disassembled component parts four billion years ago. So we might say that entropy has decreased during that time.
But by itself this is far too imprecise to argue that evolution on Earth has run afoul of the second law. What is needed is a precise measurement of the decrease in entropy that has occurred on Earth in the last four billion years. We also need to consider that the second law applies only to closed systems, while the Earth is constantly being showered with energy from the Sun (among other sources).
As an approximation we might say that the Earth and Sun together are pretty close to a closed system. The would-be critic of evolution must now establish that the entropy of this system has decreased in the last four billion years.
Which brings us, finally, to two points that are fatal to any attempt to disprove evolution via the second law.
First, measurements of entropy change are practical only for systems that are very close to equilibrium at every step. But living organisms are the consummate examples of nonequilibrium systems. As a result, it is effectively impossible to say by how much the entropy of the Earth has decreased in the course of evolution. How do you compute the entropy of a biosphere? No one knows.
Second, even granting that entropy has decreased on the Earth, we would have to show that this decrease was not offset by increases elsewhere. But the entropy of the Sun is increasing by enormous quantities every day. Even a crude overestimate of the entropy decrease of the Earth over four billion years would show that it is no match for the corresponding increase in entropy in the Sun.
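The comparison can be made vivid with a back-of-the-envelope sketch. The solar luminosity and surface temperature below are standard textbook values; the figure used for the Earth's entropy decrease is a deliberately generous placeholder assumption, not a measured quantity:

```python
L_SUN = 3.8e26            # solar luminosity in watts (standard value)
T_SUN = 5800.0            # effective surface temperature in kelvin (standard value)
SECONDS = 4e9 * 3.15e7    # roughly four billion years, in seconds

# Sunlight carries entropy away from the Sun at a rate of at least L/T.
sun_entropy_increase = (L_SUN / T_SUN) * SECONDS  # J/K

# Placeholder assumption: an absurdly generous overestimate of the entropy
# decrease involved in assembling every organism that has ever lived.
earth_entropy_decrease = 1e30  # J/K (assumed for illustration)

# The Sun's contribution dwarfs the Earth's by many orders of magnitude.
print(sun_entropy_increase / earth_entropy_decrease)
```

Even with the biosphere figure inflated on purpose, the ratio comes out in the billions, which is the sense in which the Earth's entropy decrease is “no match” for the Sun's increase.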
So the second law of thermodynamics does not contradict, or provide reason for challenging, modern evolutionary theory. Period. Anyone who says otherwise is wrong as a simple matter of fact.
But creationists have never let something as simple as the truth prevent them from using an argument. So they will argue that even in an open system we don't naturally expect order to increase. This is the tack Sewell takes in his essay, where he goes on to discuss some of the standard cliches for why known evolutionary mechanisms cannot produce complex systems. We will discuss that part of his essay in the next installment of this series.
What is important for now, however, is to note that the second law of thermodynamics is a pure red herring in this argument. In other words, the second law is of absolutely no help at all to Sewell in making his case. Everyone agrees that the growth in complexity that has happened over the last four billion years requires an explanation. It is everyday experience, and not any considerations arising from the second law, that leads to this agreement. The argument, such as it is, is about whether known mechanisms are up to the task of explaining that complexity increase.
Even if it were someday shown that known natural mechanisms were not up to the task (ID folks claim that day has come, but they are wrong to make that claim), we still would not have a contradiction between evolution and the second law. Sewell raises the issue either because he hasn't the faintest idea what he's talking about, or because he's trying to snow people by invoking technical terminology to cover a bad argument.