A Math Post!
The evolution news is a bit thin today, though you might want to have a look at this statement from the American Astronomical Society supporting the teaching of evolution.
So, what the heck, how about a random math post? Math is just filled with things that can bring a smile to your face and make you say, “Gosh, that's really clever!” Why not share a few of them? If it goes over well, perhaps I will make this a regular feature.
Let's think about addition. If I have two numbers, let's call them x and y, then I can add them together to produce a new number, which I will call z. This fact can be expressed by writing x+y=z.
Addition is something you do to two numbers. You take one number, add it to the second, and that is all. But sometimes we like to talk about adding three or more numbers. What does this mean? Well if I want to evaluate something like a+b+c, I would proceed by first evaluating a+b and then taking the result and adding it to c. In other words, the problem of adding three numbers can be broken down to two separate steps, where in each step I am only adding two numbers.
By the same token, I can add n numbers by carrying out n-1 separate additions, where at each step I am only adding two numbers.
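If you like, you can watch this chain of pairwise additions happen explicitly. Here is a small Python sketch (the list of numbers is just an example) where reduce carries out exactly the n-1 two-number additions described above:

```python
from functools import reduce

# Adding n numbers is really n - 1 two-number additions in disguise.
# reduce() performs them one pair at a time: ((1 + 2) + 3) + 4.
numbers = [1, 2, 3, 4]
total = reduce(lambda x, y: x + y, numbers)
print(total)  # 10
```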
Perhaps there's another concern. In evaluating a+b+c, do I need to worry about the order in which I carry out my additions? The answer is no, which you probably already knew. Grouping the sum as (a+b)+c or as a+(b+c) makes no difference (that freedom is called the associative property), and neither does the order of the numbers themselves. We can write
a+b+c = c+a+b = b+c+a,
and all of these expressions are equal to the sum obtained from adding the three numbers in any other order. Since the image of our numbers shifting their positions back and forth is vaguely reminiscent of what commuters do in shifting their positions from home to work, we refer to this phenomenon as the commutative property of addition.
What happens if I try to add infinitely many numbers?
Sounds like gibberish. Addition, as I have said, is something you do to two numbers. It is meaningful to talk about adding together arbitrarily large finite collections of numbers, but only by breaking the process down into many smaller steps. But all the small steps in the world will never allow me to add infinitely many numbers. And even if I do manage to add them all up, won't I just keep getting larger and larger numbers without bound?
Perhaps not. Suppose I want to evaluate the following sum:
(1/2) + (1/4) + (1/8) + (1/16) + (1/32) + (1/64) + ...
I am trying to add up all the fractions with one on top and a power of two on the bottom. I have included the parentheses only for ease of reading.
I could reason as follows: If I add up the first two fractions I get
(1/2) + (1/4) = (3/4).
If I added the first three I would get
(1/2) + (1/4) + (1/8) = (7/8).
If I continued in this way, I would find that the sum of the first four fractions is 15/16, the sum of the first five is 31/32, the sum of the first six is 63/64, and, in general, the sum of the first n fractions is (2^n - 1)/2^n.
If I line up these fractions I produce the following sequence
3/4, 7/8, 15/16, 31/32, 63/64, ...
Since every term in this sequence is obtained by adding up a small part of the original series, we refer to this as the sequence of partial sums.
If you stare at that sequence for a while you might notice that each fraction is a little closer to one than the fraction before it. Indeed, by adding up more and more of the fractions in my sequence, I can produce sums that are as close to one as I wish. And from there it's easy to take the plunge and write:
(1/2) + (1/4) + (1/8) + (1/16) + ... = 1.
So if I want to add up infinitely many numbers I carry out the following steps: I add the first two numbers, and jot down the sum. I add the third number to that sum, and jot that down as well. Then I add the fourth number to that new sum, and jot it down. I keep doing this. Then I stare at the sequence of sums that I have produced, and try to determine if it's getting closer and closer to something. I refer to that something as the sum of the infinite series.
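The procedure just described is easy to hand to a machine. Here is a quick Python sketch that grinds out the partial sums of our series of fractions; you can watch them creep up on one:

```python
# Partial sums of 1/2 + 1/4 + 1/8 + ...: the n-th sum is (2^n - 1)/2^n,
# so the sequence 3/4, 7/8, 15/16, ... closes in on 1.
def partial_sums(terms):
    total = 0.0
    for t in terms:
        total += t
        yield total

terms = [1 / 2**n for n in range(1, 11)]
sums = list(partial_sums(terms))
print(sums[-1])  # 0.9990234375, i.e. 1023/1024
```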
Thus, it is now meaningful to talk about the sum of an infinite series.
That's the good news. The bad news is that most of the time your infinite series will not add up to anything. Just try evaluating 1+2+3+4+5+... and you'll see what I mean. Even if it does add up to something, it is usually next to impossible to figure out what, exactly, the sum is. If you manage to evaluate this sum:
1 + (1/8) + (1/27) + (1/64) + (1/125) + …
(note that the denominators are all perfect cubes) you will instantly become very famous. (Hint: The sum is known to be irrational).
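Famous for a closed form, that is. Approximating the sum numerically is no trouble at all; the partial sums settle down to about 1.2020569, a number known as Apery's constant:

```python
# Nobody knows a closed form for 1 + 1/8 + 1/27 + 1/64 + ..., but its
# numerical value (Apery's constant) is easy to approximate: the tail
# beyond n = 10000 contributes less than a billionth.
s = sum(1 / n**3 for n in range(1, 10001))
print(round(s, 6))  # 1.202057
```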
An infinite series that actually adds up to something is said to converge. Otherwise it is said to diverge. The convergent series are the especially interesting ones.
When confronted with an infinite series, the first question you ask is whether it converges or diverges. Towards that end, there are a variety of tests you can perform that, if they work, will resolve the question. Here's an especially simple one: For there to be any hope that your infinite series converges, the terms of your series must be shrinking toward zero.
Otherwise there would be some (possibly tiny) number x with the property that infinitely many terms of your series are larger than x. And if that is the case, then the sum of your series will have to be larger than the sum of infinitely many x's. But no matter how small x is, if you add it to itself enough times you will get very large numbers indeed.
Alas, it is possible that the terms of your series get smaller and smaller, but your series diverges anyway. The most famous example is:
(1/2) + (1/3) + (1/4) + (1/5) + ...
This is known as the harmonic series, and it is a standard exercise in freshman calculus to devise at least three different proofs of its divergence. That means that if you add up enough terms in this series, you can produce a sum that is larger than any number you care to name. (Curiously, if you throw out all of the fractions whose denominators are not prime numbers, the series still diverges. But that's another post).
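You can see the divergence in action, albeit at a glacial pace. This little sketch counts how many terms of the harmonic series you must add before the running sum passes a target (pushing the target much higher quickly becomes impractical, which is the divergence being slow, not absent):

```python
# The harmonic series 1/2 + 1/3 + 1/4 + ... diverges: the partial sums
# eventually pass any target, but the wait grows exponentially with it.
def terms_needed(target):
    total, n = 0.0, 2
    while total <= target:
        total += 1 / n
        n += 1
    return n - 2  # how many terms were added

print(terms_needed(5))  # a couple hundred terms just to reach 5
```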
Adding up infinitely many numbers is not quite the same thing as adding up finitely many numbers. Which might make you wonder how much of your intuition about finite addition remains true when you consider infinite addition.
For example, suppose I have a convergent series in which all of the terms are positive, and I decide to add up the terms in a different order. Will I necessarily get the same sum? Or is it possible that when I discuss sums of infinite series, I must give up the commutative property?
The answer is that I will, indeed, get the same sum. Very comforting.
But what if I throw negative numbers into the mix? For example, consider the following series:
1 - (1/2) + (1/3) - (1/4) + (1/5) - (1/6) + ...
It turns out this adds up to log 2. The expression “log” denotes the natural logarithm; i.e. the logarithm taken to the base e.
(It is possible that in your school days you learned that natural logarithms are denoted “ln.” The reason we do not use such an unpronounceable abomination here is that, as far as mathematicians are concerned, base e is the only game in town. It is also possible that you learned that logarithms to the base ten are called “common logarithms,” a locution so foul I'm sure it exists solely to raise the blood pressure of professional mathematicians. But that's yet another post).
The series above is called the alternating harmonic series, for reasons I trust are obvious. It converges. But if we dropped all the minus signs we would have the harmonic series, which we know diverges.
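If you doubt that the alternating signs tame the beast, here is a quick numerical check. The partial sums straddle log 2, and the error is never worse than the first omitted term:

```python
import math

# Partial sums of 1 - 1/2 + 1/3 - 1/4 + ... close in on log 2 from
# both sides; after 100000 terms the error is below 1/100001.
s = sum((-1) ** (n + 1) / n for n in range(1, 100001))
print(abs(s - math.log(2)) < 1e-5)  # True
```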
Let us call such a series conditionally convergent. That means that the series converges, but if you dropped the minus signs from all the negative terms (took their absolute values, in math parlance) you would have a divergent series.
I ask again: If you take a conditionally convergent series and rearrange the terms, can you be certain that they will add up to the same number? To be specific, if I rearrange the terms in the alternating harmonic series, will they still add up to log 2? Or is there a danger that they will add up to something different?
This time the answer is that they might add up to something different.
Actually, it's far worse than that. Hand me any real number and call it x. Positive, negative, rational, irrational, it doesn't matter. I can rearrange the terms of any conditionally convergent series so that the sum is precisely x.
I'm not kidding.
I heard this fact for the first time in a real analysis class in my sophomore year of college. Real analysis (sometimes called “advanced calculus”) is basically calculus, but where you pay attention to proving everything rigorously. The professor was lecturing like gangbusters, and I was dutifully jotting everything down, word for word, in my notes. I was struggling to keep up, when suddenly he tossed off the fact that you can rearrange a conditionally convergent series to make it add up to anything you want. He did not elaborate, and by the time I had lifted my head out of my notes he was on to a different topic. I figured I must have heard him wrong. But I hadn't, it really is true.
Let's see how the trick is done.
Suppose you have a divergent series. Then by adding together sufficiently many terms, you can produce a sum bigger than any number you care to mention. But what would happen if you removed the first one hundred terms from this series? The answer is that the series would still diverge. What if you removed the first billion terms of the series? No effect. In fact, there is no way you can remove finitely many numbers from an infinite, divergent series and produce something that converges.
After all, the finitely many numbers you removed must add up to something finite. If the infinitely many remaining numbers likewise added up to something finite, then the sum of the whole series would simply be the sum of those two finite quantities. And that contradicts the assumption that the series diverges.
Now consider a conditionally convergent series. It is really two series in one: There is a series of positive numbers and a series of negative numbers. By applying logic similar to the argument from the last paragraph, we see there must be infinitely many of each. Furthermore, both of those series must diverge individually. I will ask you simply to take my word for it.
One final observation: Since we are assuming the series converges, we know that the terms in the series must get smaller and smaller and smaller. It follows that the positive numbers and the negative numbers, each taken by themselves, must also be getting smaller and smaller and smaller.
Now take all your positive numbers, arranged from largest to smallest, and put them over here and take your negative numbers, similarly arranged, and put them over there. We are now ready to rearrange our series.
Pick a number. Five sounds good. I will now rearrange my series to produce a sum of five.
Go to your stash of positive numbers, and keep adding them to your series until their sum is just greater than five. We know this is possible, since the series of positive numbers diverges.
Now go to your stash of negative numbers, and take just enough of them to bring your sum down to something smaller than five.
Now return to the positive numbers, and take enough of them to make the sum larger than five once more. Since we have only removed finitely many numbers from our divergent series of positive terms, we know this is possible. Furthermore, since the positive numbers are getting smaller and smaller, the amount by which we overshoot five will be smaller this time than last time.
Return to the negatives, and once again bring the sum down below five.
Since the series of positive and negative terms each diverge individually, we know we can do this forever. Add enough positive numbers to lift your sum over five, then enough negative numbers to bring you below five, and on and on. The amount by which we overshoot or undershoot five will keep getting smaller and smaller.
In this way, our sequence of partial sums will shift, pendulum-like, from being above five to being beneath five, and so on. In the limit it will have to converge to five. The result is a series that adds up to five.
Obviously, we could modify this argument to achieve any sum we desired.
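In fact, the whole trick fits in a few lines of Python. This sketch rearranges the alternating harmonic series by exactly the greedy rule above: pull from the positive stash (1, 1/3, 1/5, ...) while the running sum sits below the target, and from the negative stash (-1/2, -1/4, ...) once it climbs above. The target of 1.5 is arbitrary; any real number works:

```python
# Rearranging the alternating harmonic series to sum to a chosen target.
# p indexes the positive stash 1/(2p+1); q indexes the negatives 1/(2q+2).
def rearranged_sum(target, steps):
    total, p, q = 0.0, 0, 0
    for _ in range(steps):
        if total <= target:
            total += 1 / (2 * p + 1)  # below target: add a positive term
            p += 1
        else:
            total -= 1 / (2 * q + 2)  # above target: add a negative term
            q += 1
    return total

print(abs(rearranged_sum(1.5, 100000) - 1.5) < 1e-3)  # True
```

The overshoots and undershoots shrink because the terms in each stash do, which is exactly why the pendulum settles on the target.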
Gosh, that’s really clever!
The process by which we sum up infinitely many numbers seems like a simple extension of basic arithmetic. It isn’t. Strange, counter-intuitive things start to happen when we consider infinite sets. These are the sorts of oddball things that attracted me to math in the first place. Hopefully I managed to communicate some of that here.