Friday, September 23, 2005

A Math Post!

The evolution news is a bit thin today, though you might want to have a look at this statement from the American Astronomical Society supporting the teaching of evolution.

So, what the heck, how about a random math post? Math is just filled with things that can bring a smile to your face and make you say, “Gosh, that's really clever!” Why not share a few of them? If it goes over well, perhaps I will make this a regular feature.

Let's think about addition. If I have two numbers, let's call them x and y, then I can add them together to produce a new number, which I will call z. This fact can be expressed by writing x+y=z.

Addition is something you do to two numbers. You take one number, add it to the second, and that is all. But sometimes we like to talk about adding three or more numbers. What does this mean? Well, if I want to evaluate something like a+b+c, I would proceed by first evaluating a+b and then taking the result and adding it to c. In other words, the problem of adding three numbers can be broken down into two separate steps, where in each step I am only adding two numbers.

By the same token, I can add n numbers by carrying out n-1 separate additions, where at each step I am only adding two numbers.
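If you like to see such things in code, here is a minimal Python sketch of that idea (the function name add_many is my own invention):

def add_many(numbers):
    # Add n numbers by performing n - 1 separate two-number additions.
    total = numbers[0]
    for x in numbers[1:]:
        total = total + x  # each pass is one pairwise addition
    return total

print(add_many([1, 2, 3, 4, 5]))  # prints 15, reached via four additions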

Perhaps there's another concern. In evaluating a+b+c, do I need to worry about the order in which I carry out my additions? The answer is no, which you probably already knew. We can write


a+b+c = c+a+b = b+c+a,


and all of these expressions are equal to the sum obtained from adding the three numbers in any other order. Since the image of our numbers shifting their positions back and forth is vaguely reminiscent of what commuters do in shifting their positions from home to work, we refer to this phenomenon as the commutative property of addition.

What happens if I try to add infinitely many numbers?

Sounds like gibberish. Addition, as I have said, is something you do to two numbers. It is meaningful to talk about adding together arbitrarily large finite collections of numbers, but only by breaking the process down into many smaller steps. But all the small steps in the world will never allow me to add infinitely many numbers. And even if I do manage to add them all up, won't I just keep getting larger and larger numbers without bound?

Perhaps not. Suppose I want to evaluate the following sum:


(1/2) + (1/4) + (1/8) + (1/16) + (1/32) + (1/64) + ...


I am trying to add up all the fractions with one on top and a power of two on the bottom. I have included the parentheses only for ease of reading.

I could reason as follows: If I add up the first two fractions I get

(1/2) + (1/4) = (3/4).

If I added the first three I would get

(1/2) + (1/4) + (1/8) = (7/8).

If I continued in this way, I would find that the sum of the first four fractions is 15/16, the sum of the first five is 31/32, the sum of the first six is 63/64, and, in general, the sum of the first n fractions is (2^n - 1)/2^n.

If I line up these fractions I produce the following sequence

3/4, 7/8, 15/16, 31/32, 63/64, ...

Since every term in this sequence is obtained by adding up the first few terms of the original series, we refer to this as the sequence of partial sums.

If you stare at that sequence for a while you might notice that each fraction is a little closer to one than the fraction before it. Indeed, by adding up more and more of the fractions in my sequence, I can produce sums that are as close to one as I wish. And from there it's easy to take the plunge and write:

(1/2) + (1/4) + (1/8) + (1/16) + ... = 1.


So if I want to add up infinitely many numbers I carry out the following steps: I add the first two numbers, and jot down the sum. I add the third number to that sum, and jot that down as well. Then I add the fourth number to that new sum, and jot it down. I keep doing this. Then I stare at the sequence of sums that I have produced, and try to determine if it's getting closer and closer to something. I refer to that something as the sum of the infinite series.
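Here is a small Python sketch of that procedure, applied to our series of fractions (the helper name partial_sums is my own):

def partial_sums(terms):
    # "Jot down" the running sum after each new term is added.
    total = 0.0
    for t in terms:
        total += t
        yield total

for s in partial_sums(1 / 2**n for n in range(1, 11)):
    print(s)
# Prints 0.5, 0.75, 0.875, ...; the n-th sum is (2^n - 1)/2^n,
# creeping ever closer to 1.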

Thus, it is now meaningful to talk about the sum of an infinite series.

That's the good news. The bad news is that most of the time your infinite series will not add up to anything. Just try evaluating 1+2+3+4+5+... and you'll see what I mean. Even if it does add up to something, it is usually next to impossible to figure out what, exactly, the sum is. If you manage to evaluate this sum:

1 + (1/8) + (1/27) + (1/64) + (1/125) + ...

(note that the denominators are all perfect cubes) you will instantly become very famous. (Hint: The sum is known to be irrational).
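Though nobody knows a closed form, the numerical value is easy to approximate; here is a quick Python sketch (summing the first million terms, a cutoff of my own choosing):

# Approximate the sum of the reciprocals of the perfect cubes.
total = sum(1 / n**3 for n in range(1, 1000001))
print(total)  # roughly 1.2020569..., known to be irrational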

An infinite series that actually adds up to something is said to converge. Otherwise it is said to diverge. The convergent series are the especially interesting ones.

When confronted with an infinite series, the first question you ask is whether it converges or diverges. Towards that end, there are a variety of tests you can perform that, if they work, will resolve the question. Here's an especially simple one: For there to be any hope that your infinite series converges, the terms of your series must be getting smaller and smaller and smaller.

Otherwise there would be some (possibly tiny) positive number x with the property that infinitely many terms of your series are larger than x. If those terms are positive, the sum of your series will have to be larger than the sum of infinitely many x's. But no matter how small x is, if you add it to itself enough times you will get very large numbers indeed. (Large negative terms cause the same trouble: each one jolts the running sum by more than x, so the partial sums can never settle down.)

Alas, it is possible that the terms of your series get smaller and smaller, but your series diverges anyway. The most famous example is:

1 + (1/2) + (1/3) + (1/4) + (1/5) + ...


This is known as the harmonic series, and it is a standard exercise in freshman calculus to devise at least three different proofs of its divergence. That means that if you add up enough terms in this series, you can produce a sum that is larger than any number you care to name. (Curiously, if you throw out all of the fractions whose denominators are not prime numbers, the series still diverges. But that's another post).
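To get a feel for just how slowly those sums grow, here is a small Python sketch (counting terms, with cutoff targets of my own choosing):

# Count how many terms of 1 + 1/2 + 1/3 + ... are needed to pass
# each target; the counts grow roughly like e raised to the target.
total, n = 0.0, 0
for target in (2, 4, 6, 8, 10):
    while total <= target:
        n += 1
        total += 1 / n
    print(target, n)
# Passing 10 already takes more than twelve thousand terms.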

Adding up infinitely many numbers is not quite the same thing as adding up finitely many numbers. Which might make you wonder how much of your intuition about finite addition remains true when you consider infinite addition.

For example, suppose I have a convergent series in which all of the terms are positive, and I decide to add up the terms in a different order. Will I necessarily get the same sum? Or is it possible that when I discuss sums of infinite series, I must give up the commutative property?

The answer is that I will, indeed, get the same sum. Very comforting.

But what if I throw negative numbers into the mix? For example, consider the following series:

1 - (1/2) + (1/3) - (1/4) + (1/5) - (1/6) + ...

It turns out this adds up to log 2. The expression “log” denotes the natural logarithm; i.e. the logarithm taken to the base e.

(It is possible that in your school days you learned that natural logarithms are denoted “ln.” The reason we do not use such an unpronounceable abomination here is that, as far as mathematicians are concerned, base e is the only game in town. It is also possible that you learned that logarithms to the base ten are called “common logarithms,” a locution so foul I'm sure it exists solely to raise the blood pressure of professional mathematicians. But that's yet another post).

The series above is called the alternating harmonic series, for reasons I trust are obvious. It converges. But if we dropped all the minus signs we would have the harmonic series, which we know diverges.
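It is easy to check the sum numerically; here is a short Python sketch (using one hundred thousand terms, my own cutoff, since the convergence is quite slow):

import math

# Partial sum of 1 - 1/2 + 1/3 - 1/4 + ... versus the natural log of 2.
total = sum((-1) ** (n + 1) / n for n in range(1, 100001))
print(total)        # roughly 0.693142
print(math.log(2))  # 0.6931471805599453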

Let us call such a series conditionally convergent. That means that the series converges, but if you dropped the minus signs from all the negative terms (took their absolute values, in math parlance) you would have a divergent series.

I ask again: If you take a conditionally convergent series and rearrange the terms, can you be certain that they will add up to the same number? To be specific, if I rearrange the terms in the alternating harmonic series, will they still add up to log 2? Or is there a danger that they will add up to something different?

This time the answer is that they might add up to something different.

Actually, it's far worse than that. Hand me any real number and call it x. Positive, negative, rational, irrational, it doesn't matter. I can rearrange the terms of any conditionally convergent series so that the sum is precisely x.

I'm not kidding.

I heard this fact for the first time in a real analysis class in my sophomore year of college. Real analysis (sometimes called “advanced calculus”) is basically calculus, but where you pay attention to proving everything rigorously. The professor was lecturing like gangbusters, and I was dutifully jotting everything down, word for word, in my notes. I was struggling to keep up, when suddenly he tossed off the fact that you can rearrange a conditionally convergent series to make it add up to anything you want. He did not elaborate, and by the time I had lifted my head out of my notes he was on to a different topic. I figured I must have heard him wrong. But I hadn't, it really is true.

Let's see how the trick is done.

Suppose you have a divergent series. Then by adding together sufficiently many terms, you can produce a sum bigger than any number you care to mention. But what would happen if you removed the first one hundred terms from this series? The answer is that the series would still diverge. What if you removed the first billion terms of the series? No effect. In fact, there is no way you can remove finitely many numbers from an infinite, divergent series and produce something that converges.

After all, the finitely many numbers you removed must add up to something finite. If the infinitely many remaining numbers likewise added up to something finite, then the sum of the whole series would simply be the sum of those two finite quantities. And that contradicts the assumption that the series diverges.

Now consider a conditionally convergent series. It is really two series in one: There is a series of positive numbers and a series of negative numbers. By applying logic similar to the argument from the last paragraph, we see there must be infinitely many of each. Furthermore, both of those series must diverge individually. (Briefly: if both converged, our series would still converge after dropping the minus signs, and if only one of them diverged, it would drag the whole series off to infinity with it.)

One final observation: Since we are assuming the series converges, we know that the terms in the series must get smaller and smaller and smaller. It follows that the positive numbers and the negative numbers, each taken by themselves, must also be getting smaller and smaller and smaller.

Now take all your positive numbers, arranged from largest to smallest, and put them over here and take your negative numbers, similarly arranged, and put them over there. We are now ready to rearrange our series.

Pick a number. Five sounds good. I will now rearrange my series to produce a sum of five.

Go to your stash of positive numbers, and keep adding them to your series until their sum is just greater than five. We know this is possible, since the series of positive numbers diverges.

Now go to your stash of negative numbers, and take just enough of them to bring your sum down to something smaller than five.

Now return to the positive numbers, and take enough of them to make the sum larger than five once more. Since we have only removed finitely many numbers from our divergent series of positive terms, we know this is possible. Furthermore, since the positive numbers are getting smaller and smaller, the amount by which we overshoot five will be smaller this time than last time.

Return to the negatives, and once again bring the sum down below five.

Since the series of positive and negative terms each diverge individually, we know we can do this forever. Add enough positive numbers to lift your sum over five, then enough negative numbers to bring you below five, and on and on. The amount by which we overshoot or undershoot five will keep getting smaller and smaller.

In this way, our sequence of partial sums will shift, pendulum-like, from being above five to being beneath five, and so on. In the limit it will have to converge to five. The result is a series that adds up to five.

Obviously, we could modify this argument to achieve any sum we desired.
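For the curious, here is the whole trick as a short Python sketch, steering the alternating harmonic series toward five; the function name and the number of steps are my own choices, and any other target works just as well:

def rearranged_sum(target, steps=100000):
    # Draw positive terms (1, 1/3, 1/5, ...) until the running sum
    # exceeds the target, then negative terms (-1/2, -1/4, ...) until
    # it drops back below, and repeat. The overshoots shrink to zero,
    # so the partial sums converge to the target.
    pos, neg, total = 1, 2, 0.0
    for _ in range(steps):
        if total <= target:
            total += 1 / pos  # next unused positive term
            pos += 2
        else:
            total -= 1 / neg  # next unused negative term
            neg += 2
    return total

print(rearranged_sum(5))  # prints a value close to 5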

Gosh, that’s really clever!

The process by which we sum up infinitely many numbers seems like a simple extension of basic arithmetic. It isn’t. Strange, counter-intuitive things start to happen when we consider infinite sets. These are the sorts of oddball things that attracted me to math in the first place. Hopefully I managed to communicate some of that here.

16 Comments:

At 6:03 PM, Anonymous Matt Daws said...

Great little post! I'm still surprised by the fact that a convergent, but not absolutely convergent, series can be rearranged to sum to anything. And I supposedly work with Banach spaces... Keep up the good blogging work.

 
At 1:32 AM, Blogger Phil N. Darrer said...

Maybe I'm missing the point, not having dealt with maths for some years. But to me it sounds like you just trade your two diverging series for two converging ones. The first is the one that converges to, say, five. The other then converges to log 2 - 5. So if you add them together you'll still get log 2.
Of course, the complicated part is that the second series changes at different points of your finite calculation. It mainly consists of the numbers ignored at any point from the calculation in 'natural' order (bigger to smaller).
So it sounds like a side-effect of dealing with infinities in a finite model, not a real effect (if there's such reasoning in maths).
However wrong I might be here, thanks for the interesting post to start the day.

 
At 4:29 AM, Anonymous Matt Daws said...

Phil, in my view, that's *sort* of right. The point is that the series

1 - (1/2) + (1/3) - (1/4) + ...

is two series which diverge, namely

1 + (1/3) + (1/5) + ...
-(1/2) - (1/4) - ...

So you can view re-arranging the order as trying to add infinity to -infinity. I think something more interesting is happening, as the rearrangement really does sum to a finite limit: in the sense that a finite approximation can be arbitrarily good. It's not just some dodgy trick.

I suppose you could say that the moral is simply that adding up series which aren't absolutely convergent is something which is a bit dodgy. Of course, if you go on to study more math, then you'll find it becomes really useful and interesting: that's the beauty of the subject to me.

 
At 7:17 AM, Blogger Phil N. Darrer said...

Matt, I might be really wrong, but I just find it a wrong conclusion to say the rearranged series converges to 5. It is rearranged to look as if it converges to 5 as long as you stay finite - and you can look ad infinitum at finite numbers; they're still finite and don't tell you the real convergence point of the original series.

To reach 5 you have to use many more numbers from the + series than from the - series at any given point. So if you use 1 million + numbers and only 1000 - numbers, you have 0.999 million - numbers labeled 'I'll deal with you later'. The sum of that changing series of 'deal-later' numbers should converge to log 2 - 5.

Maybe there's a difference between summing something ad infinitum and reaching the real end of infinity :-)

Last note, if God taught that 'rearrangement of sums' thing to Noah, maybe that is how he could fit all the species on his ark?

 
At 8:11 AM, Anonymous Matt Daws said...

Phil, yes, you are misunderstanding the definition of "convergence" and the proof that the sequence converges to 5. It's a bit hard to continually re-state the issue: if you're really interested, I suggest you read a basic book on real analysis.

On the subject of Noah, you might be amused to read this: Hilbert's Hotel. This is a different sort of infinity, though...

 
At 2:52 PM, Anonymous Dom said...

Here is something from A Mathematician's Miscellany (I forget the author). Only slightly related.

Assume you have an urn and an infinite supply of balls numbered 1, 2, ...

At 1 minute before noon, place balls 1-10 in the urn, throw out ball 1. There are 9 balls in the urn.

At 1/2 minute before noon, place balls 11-20 in the urn, throw out ball 2. There are 18 balls in the urn.

And so on.

How many balls are in the urn at noon?

You might think it is 9 times infinity, but no, there are no balls in the urn.

Try to name a ball, and I can give you the exact time at which it was tossed out. Is ball #18543 there? No, it was tossed out at "1 over 2 to the (18543-1) power" minutes before noon.

Now change it so that we place balls 1-10 in the urn, throw out ball 1. Place balls 11-20 in the urn, throw out ball 11. And so on. Same operation, but BY THE SAME LOGIC, there must be an infinite number of balls in the urn.

 
At 3:30 PM, Blogger Phil N. Darrer said...

Matt, I remember the epsilon test from back then, and it surely is satisfied by the rearrangement, so you're probably right.

Maybe it's something in the physicist's perspective on infinities that is not totally pleased with the formal result, so I'd better leave that to mathematicians and keep watching that no infinities enter real life :-)

 
At 4:58 AM, Anonymous Anonymous said...

To add some evolutionary flavour to the maths, perhaps you could examine fractals.

The key claim of ID is that life is too complex to be the result of natural selection processes.

In fractals, however, we see how simple formulae easily create staggeringly complex-looking images.

What lies behind the Mandelbrot set? A simple iterated formula, z -> z^2 + c.

If something so simple can create an image that appears to be incredibly complex, then why can't life be explained by simple natural processes?

 
At 1:39 PM, Blogger Jason said...

Matt-

Thanks for the kind words about the post. Glad you liked it.

Phil-

Ultimately, the fact that it is possible to rearrange a conditionally convergent series to add up to anything you like is simply a consequence of the precise, technical meaning of the term “converge.” There is no skulduggery going on in saying that the rearranged series converges to five. It genuinely converges to five, and nothing else.

The reason this result seems so surprising is that when we add up a finite collection of numbers, we think of the sum as telling us how much stuff we have. Furthermore, the way we define the sum of an infinite series doesn't sound all that different from what we do for finite sums. There's just that added step where you have to take the limit of a particular sequence. But that added step can result in counter-intuitive things going on.

In my post I expressed this by saying that the commutative property does not hold for conditionally convergent series. Another way of putting it would be to say that we must give up any notion that the sum of a conditionally convergent series is telling us how much stuff we have.

dom-

I'm familiar with the problem you mention. It illustrates a principle similar to what I was describing in my post. Simply by altering which balls you remove at any given moment, you can justify any answer to the question of how many balls will be in the urn at noon. Thanks for bringing it up.
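For anyone who wants to experiment, here is a small Python sketch of the urn after finitely many rounds, under the two removal rules you describe (the function name is my own):

def urn_after(rounds, remove_smallest=True):
    # Round k: add balls 10k-9 through 10k, then remove one ball.
    urn = set()
    for k in range(1, rounds + 1):
        urn.update(range(10 * k - 9, 10 * k + 1))
        urn.remove(k if remove_smallest else 10 * k - 9)
    return urn

# Both rules leave 9 * rounds balls after any finite number of rounds,
# yet "at noon" the first rule empties the urn (ball n leaves at round n)
# while the second leaves infinitely many (balls 2-10 never leave).
print(len(urn_after(1000)), len(urn_after(1000, remove_smallest=False)))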

anonymous-

I think you make a good point. Fractals, chaos and the like certainly provide a good lesson in humility. Observing complexity and jumping to the conclusion that design is the only explanation is intellectual laziness pure and simple.

 
At 2:51 PM, Blogger M. Saunders said...

Hi!
Great blog! It's really swell and right up my alley. I have a blog on the Evolution vs. Creationism debate too. Check it out sometime!
It's called "A Student for Science and Agaist Bigotry."

P.S. Watch out for the Dover, PA case starting 26 September!

 
At 5:33 PM, Blogger another orphan said...

A question: how does this elegant little puzzle relate (if at all) to the process of renormalization, by which physicists get rid of the nasty infinities in elementary particle theories?

 
At 12:07 PM, Anonymous disasterman said...

What about Picard's theorem: every analytic function assumes every complex value, with possibly one exception, infinitely often in any neighborhood of an essential singularity.

This really blew me away when I first encountered it, and I think of it as some do of the Decalogue: "with feelings of reverence, mingled with awe!"

 
At 12:23 PM, Blogger Jason said...

disasterman-

Picard's Theorem is a good one - somehow I love the “with possibly one exception” part. Might be hard to explain in a blog entry though.

 
At 12:34 PM, Blogger disasterman said...

I sort of heuristically think of an essential singularity as a "black hole", once you cross the event horizon (the "possibly one exception"?) you could be "anywhere and everywhere" ("every complex value").

Admittedly, not terribly rigorous, but possibly of some help?

 
At 1:39 PM, Anonymous ed hessler said...

Thanks for taking time to do this post. I hope you will continue the practice from time to time. These are a delight to read (and the comments, too) and help me appreciate this lens on the world. What things they make one think about!

Cheers.

 
At 2:58 AM, Anonymous Anonymous said...

The harmonic prime series diverges, but what of the alternating harmonic prime series:
1/2 - 1/3 + 1/5 - 1/7 + 1/11 - 1/13 + 1/17 - ...

R.A.S.

 
