I am particularly concerned about these two sums: $$\sum_{x=0}^\infty \frac{1}{2^x} = 1 + \frac{1}{2} + \frac{1}{4} + \cdots = 2$$

and

$$\sum_{x=1}^\infty \frac{1}{x} = 1 + \frac{1}{2} + \frac{1}{3} + \cdots = \infty $$

Now, I know and understand the proofs that the first sum converges to $2$ and the second one diverges (and you probably shouldn't write a divergent series with an equals sign like that).

But on an intuitive level, what separates these two sums in such a way that their limits are so drastically different?

Both sums eventually add very small numbers (e.g. $\frac{1}{10000000000}$), yet one always stays smaller than two while the other goes far beyond it. You would think (again, on an intuitive level; the math is clear) that the terms become so small that the sum eventually settles near a number and stops changing significantly.

For example:

$$\sum_{x=1}^{100} \frac{1}{x} \approx 5.19 $$

$$\sum_{x=1}^{1000000} \frac{1}{x} \approx 14.39 $$

$$\sum_{x=1}^{10000000000000000000000000000000} \frac{1}{x} \approx 72 $$

As one can see, the growth slows down drastically; at some point the increments should become vanishingly small, just as with the first sum (the powers of two).
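A quick numerical check makes this slow growth concrete (a Python sketch; the approximation $H_n \approx \ln n + \gamma$, with the Euler-Mascheroni constant $\gamma \approx 0.5772$, is a standard fact):

```python
import math

def harmonic(n):
    # Partial sum H_n = 1 + 1/2 + ... + 1/n
    return sum(1.0 / k for k in range(1, n + 1))

# The partial sums keep growing, but only logarithmically:
# H_n is close to ln(n) + 0.5772... for large n.
for n in (100, 10_000, 1_000_000):
    print(n, harmonic(n), math.log(n) + 0.5772156649)
```

This is why the sums look like they stall: to push the total past any target $T$ you need on the order of $e^{T}$ terms.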

Have you seen the proof of divergence of the second sum that compares the sum to $\frac 12+(\frac 14+\frac 14)+(\frac 18+\frac 18+\frac 18+\frac 18)+\dots$? This is probably the best intuition I can think of, that no matter how far you go, you can always find a set of terms that together increase the value of the sum by at least $\frac 12$. – abiessu yesterday
This is a classic example showing that intuition can't help in all situations. Sometimes intuition is useful, other times it is not. The failure of intuition is a main reason for the existence of mathematical analysis. – Masacroso yesterday
Just a note: The first sum should start at $x=0$, else it converges to 1. – ekkilop yesterday
+1 for a very good question. A point about intuition: Don't try to force the math to conform to your intuition. Instead, digest these answers and let your intuition conform to the mathematics. – Neal yesterday
"Young man, in mathematics you don't understand things. You just get used to them" - Von Neumann – Dair yesterday

This is related to how fast the terms decrease.

A geometric series (your $1/2^x$) is such that every term is a constant fraction of the previous, so that dividing by this constant is the same as dropping the first term.

$$\frac12\left(1+\frac12+\frac14+\frac18\cdots\right)=\frac12+\frac14+\frac18\cdots$$ So you can write

$$\frac12S=S-1$$ and deduce $S=2$.
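This self-similarity is visible numerically (a Python sketch): after adding the term $2^{-k}$, the remaining gap to $2$ is exactly $2^{-k}$ again.

```python
# Partial sums of 1 + 1/2 + 1/4 + ...: the gap to the limit 2 always
# equals the term just added (the identity S/2 = S - 1 in disguise).
partial = 0.0
for k in range(20):
    term = 0.5 ** k
    partial += term
    assert 2 - partial == term  # exact, since all values are dyadic floats
print(partial)  # 2 - 2**-19
```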

The same reasoning applies to all geometric series

$$\sum_{k=0}^\infty r^k$$

provided that $0\le r<1$. Indeed, if $r=1$ or $r>1$, the sum clearly grows forever. (This simplified discussion ignores negative $r$; the general condition for convergence is $|r|<1$.)

This leads to a simple convergence criterion: if the ratio of successive terms is a constant less than $1$, the series converges. More generally, if this ratio is variable but tends to a limit smaller than $1$, the series converges.

Conversely, if the ratio tends to a limit larger than $1$, the series diverges. But if the ratio tends to $1$, we don't know, the criterion is insufficient.

The case of the harmonic series ($1/n$), or the generalized harmonic series ($1/n^p$), falls precisely into this category, as

$$\lim_{n\to\infty}\left(\frac{n}{n+1}\right)^p=1.$$

To deal with it, a trick is to sum the terms in groups of increasing size (doubling each time), so that every group's sum exceeds a fixed constant. More precisely,

$$\begin{gather} 1,\\ \frac12,\\ \frac13+\frac14 > \frac14+\frac14 = \frac12,\\ \frac15+\frac16+\frac17+\frac18 > \frac18+\frac18+\frac18+\frac18 = \frac12,\\ \cdots \end{gather}$$

Though the groups get longer and longer, you can continue forever and the sum grows to infinity.
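The grouping is easy to check numerically (a Python sketch): every doubling block contributes at least $\tfrac12$, and in fact the block sums creep up toward $\ln 2\approx0.693$.

```python
# Block m collects the harmonic terms 1/(2^m + 1) + ... + 1/2^(m+1).
# Each block sums to at least 1/2, so the partial sums grow without bound.
for m in range(12):
    block = sum(1.0 / k for k in range(2 ** m + 1, 2 ** (m + 1) + 1))
    assert block >= 0.5
    print(m, block)
```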

If you repeat the reasoning with exponent $p$,

$$\begin{gather} 1,\\ \frac1{2^p},\\ \frac1{3^p}+\frac1{4^p} > \frac1{4^p}+\frac1{4^p} = \frac2{4^p}=\frac1{2^{2p-1}},\\ \frac1{5^p}+\frac1{6^p}+\frac1{7^p}+\frac1{8^p} > \frac1{8^p}+\frac1{8^p}+\frac1{8^p}+\frac1{8^p} = \frac4{8^p} = \frac1{2^{3p-2}},\\ \cdots \end{gather}$$

In this new series, the ratio of successive terms tends to $2^{1-p}$, and by the first criterion you can conclude convergence for $p>1$ and divergence for $p<1$. (A complete discussion must involve a similar upper bound, omitted here.)


To summarize, in decreasing order of decay rate:

$$\sum r^n, r<1\text{ converges}$$ $$\sum \frac1{n^p}, p>1\text{ converges}$$ $$\sum \frac1{n^p}, p=1\text{ diverges}$$ $$\sum \frac1{n^p}, p<1\text{ diverges}$$ $$\sum r^n, r=1\text{ diverges}$$ $$\sum r^n, r>1\text{ diverges}$$

For other series, you can compare to these decay rates. For example, with the general term $1/n!$, the limit of the ratio is $\lim_{n\to\infty}n!/(n+1)!=0$ and the series converges, faster than any geometric series. Or $1/\sqrt[3]{n^2+1}$ gives a divergent series, because the general term behaves like $1/n^{2/3}$.
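The three behaviors of the ratio criterion can be illustrated numerically (a Python sketch; the lambdas are just stand-ins for the general terms discussed above):

```python
import math

def ratio(term, n):
    # Ratio of successive terms a_(n+1) / a_n at index n.
    return term(n + 1) / term(n)

n = 50
print(ratio(lambda k: 0.5 ** k, n))                 # 0.5: constant below 1, converges
print(ratio(lambda k: 1.0 / k, n))                  # 50/51: tends to 1, inconclusive
print(ratio(lambda k: 1.0 / math.factorial(k), n))  # 1/51: tends to 0, converges fast
```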

The curves below show the trend of the terms of each sequence on a logarithmic scale. The green one corresponds to the harmonic series, which sits on the border between convergent and divergent series.

[figure: terms of each series plotted on a logarithmic scale; the green curve is the harmonic series]

1/2 S = S - 1 has one more solution: S = ∞. – klimenkov 22 hours ago
@klimenkov S is by convention assumed to be a real number, so that's not a solution. – Joren 21 hours ago
@klimenkov: you are right, but I used this simple approach for pedagogical purposes. A more rigorous way is $S=\lim_{n\to\infty}S_n=\lim_{n\to\infty}(1-r^n)/(1-r)=1/(1-r)$ for $|r|<1$. – Yves Daoust 21 hours ago

To help your intuition about why it doesn't suffice that the individual terms get arbitrarily small, consider the following series: \begin{aligned} a &= \sum_{k=1}^{\infty} 2^{-\lfloor \log_2 k\rfloor}\\ &= 1 + \underbrace{\frac12 + \frac12}_{=1} + \underbrace{\frac14+\frac14+\frac14+\frac14}_{=1} + \underbrace{\frac18+\frac18+\frac18+\frac18+\frac18+\frac18+\frac18+\frac18}_{=1} + \ldots \end{aligned} Clearly the individual terms of this series get arbitrarily small, but there are sufficiently many of them that the sum still adds up to an arbitrarily large number; indeed, for any positive integer $n$, the first $2^n-1$ terms add up to $n$.

So for the series to converge, it does not suffice that the terms get arbitrarily small; they have to get small fast enough that their growing count doesn't outweigh their diminishing value.
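The claim about the first $2^n-1$ terms can be checked directly (a Python sketch; `k.bit_length() - 1` equals $\lfloor\log_2 k\rfloor$ for positive integers $k$):

```python
# Term k of the series above is 2^(-floor(log2 k)).  The terms shrink
# to 0, yet the first 2^n - 1 of them sum to exactly n.
def partial(m):
    return sum(2.0 ** -(k.bit_length() - 1) for k in range(1, m + 1))

for n in range(1, 10):
    print(n, partial(2 ** n - 1))  # the second number is exactly n
```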


Intuition is not so intuitive.

Just take a look at this: $$\sum_{k=N}^{2N}\frac{1}{k}\geq\sum_{k=N}^{2N}\frac{1}{2N}=\frac{N+1}{2N}>\frac{1}{2}.$$

Then, applying this bound repeatedly (doubling $2n$ times from $1$ up to $4^{n}=2^{2n}$), $$\sum_{k=1}^{4^{n}}\frac{1}{k}\geq n.$$

So the sequence $(\sum_{k=1}^{m}\frac{1}{k})_{m=1}^{\infty}$ has no upper bound. It may seem counter-intuitive only because the growth of $4^{n}$ is too fast for our "intuition".
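A quick numerical check of the bound (a Python sketch):

```python
# The first 4^n terms of the harmonic series already sum to at least n.
def H(m):
    return sum(1.0 / k for k in range(1, m + 1))

for n in range(1, 7):
    total = H(4 ** n)
    assert total >= n
    print(n, 4 ** n, total)
```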


The difference is not in how small the terms are getting... as OP says, in both cases they become arbitrarily small... but in how many terms there are of a particular size. For instance, say two terms have the same size if their first nonzero binary digits are in the same place. In the first sum, you have one term of size $1$, one of size $1/2$, one of size $1/4$, one of size $1/8$, etc. In the second sum, you have one term of size $1$, one of size $1/2$, but then two of size $1/4$ ($1/3$ and $1/4$), four of size $1/8$ ($1/5$ through $1/8$), and so on. Each time the size is halved, the number of terms of that size is doubled; so the contribution from terms of each size never decreases, and the magnitude of the sum marches steadily to infinity.
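The doubling of counts can be tallied directly (a Python sketch; the "size" of $1/k$ is taken as $2^{-\lceil\log_2 k\rceil}$, the position of its first nonzero binary digit):

```python
from collections import Counter

# Count how many harmonic terms 1/k share each size 2^-j,
# i.e. how many k have ceil(log2 k) == j.
counts = Counter()
for k in range(1, 2 ** 10 + 1):
    j = (k - 1).bit_length()  # ceil(log2 k) for k >= 1
    counts[j] += 1

for j in sorted(counts):
    print(j, counts[j])  # 1, 1, 2, 4, 8, ...: halve the size, double the count
```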


Using integrals is the best intuition I've got so far about this. Plot the function $f(x)=\frac{1}{x}$ and, for each $i$, draw a bar of height $\frac{1}{i}$. By shifting the bars to the left, you can easily see that: $$\int_1^{n+1} \frac{1}{x}\,dx<\sum_{i=1}^{n} \frac{1}{i}<1+\int_1^{n} \frac{1}{x}\,dx$$ [figure: the harmonic terms drawn as bars against the graph of $1/x$]

Therefore $$\ln(n+1)<\sum_{i=1}^{n} \frac{1}{i}<1+\ln(n)$$ And I think the sandwich theorem is intuitive enough.
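The squeeze can be verified numerically (a Python sketch):

```python
import math

# ln(n+1) < H_n < 1 + ln(n): the harmonic partial sums are trapped
# between two logarithms, so they diverge, but only logarithmically.
def H(n):
    return sum(1.0 / k for k in range(1, n + 1))

for n in (10, 1000, 100_000):
    h = H(n)
    assert math.log(n + 1) < h < 1 + math.log(n)
    print(n, math.log(n + 1), h, 1 + math.log(n))
```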

This still makes it hard to intuit how $\int_1^M 1/x dx$ can be so large when $M$ is large, since the heights out there are so small. – Ian yesterday
@Ian If we don't know that $1/x$ has an anti-derivative, then you are right. – polfosol yesterday
What I mean is, we know that $\int_1^x 1/y dy=\ln(x)$, and we know that $\ln(x)$ blows up, but then the same intuitive question as in the OP arises: how can you add up these small things to get something that blows up? One nice way to answer it would be to note that the widths of the intervals $(x_k,x_{k+1})$ with $\int_{x_k}^{x_{k+1}} 1/x dx = 1$ grow exponentially fast with $k$--specifically you can take $x_k=e^k$. – Ian yesterday
@Ian $\ln(x)$ does not "blow up", it creeps upward very slowly. – Simple Art yesterday
"Diverges to infinity" is colloquially expressed as "blows up" even if the relevant limit is to infinity and the divergence is slow. – Ian yesterday

Very interesting question indeed. I'll not only try to pick out which parts of your intuition are right, but also do my best to build on that intuition, since I noticed many of the answers concern the proof that the second sum diverges, even though you already state that you've seen and understood the proofs.

"Both sums add very small numbers in the end..."

True, but as you can easily see, the second sum diverges nonetheless. What is correct is the statement: "if the sum converges, then the terms being added must approach zero."

Clearly this must be the case, or you'd end up adding something bounded away from $0$ forever,

$$\underbrace{a_1+a_2+a_3+\dots}_{\text{first partial sums}}\underbrace{+c+c+c+\dots}_{\text{last partial sums}}=(a_1+a_2+a_3+\dots)+\infty$$

which would be an intuitive way of putting it.


Now, how do we discern the divergent from the convergent, in the case that $\lim_{n\to\infty}a_n=0$? We use convergence tests. A few interesting tests to point out:

$$\begin{array} {|l|c|c|c|} \hline \text{name} & \text{converges} & \text{diverges} & \text{inconclusive} \\ \hline \text{Ratio Test} & \lim_{n\to\infty}\left|\frac{a_{n+1}}{a_n}\right|<1 & \dots>1 & \dots=1 \\ \hline \text{Root Test} & \lim_{n\to\infty}\sqrt[n]{|a_n|}<1 & \dots>1 & \dots=1 \\ \hline \text{Integral Test} & \int_1^\infty f(t)\,dt<\infty & \dots=\infty & \text{never} \\ \hline \text{Direct Comparison} & \sum a_n\le\sum b_n<\infty & \sum a_n\ge\sum b_n=\infty & \text{fail to find a suitable $b_n$} \\ \hline \text{Cauchy Condensation Test} & \sum2^nf(2^n)<\infty & \sum2^nf(2^n)=\infty & \text{never} \\ \hline \end{array}$$

Firstly, note a few things. The ratio/root test is really a comparison of the given sum with a geometric series, whose convergence is already known.

The integral test rests on the geometric meaning of the integral, and applies under the conditions that $f$ is positive and monotonically decreasing. It can also be seen as another comparison test.

Note that the comparison test should make sense: if a sum is less than a sum that converges, then it converges; if it is greater than a sum that diverges, it also diverges.

Lastly, the Cauchy condensation test is used in celtschk's answer, where he expanded it to make it more visual. It can be derived from the same grouping/comparison idea as the integral test.


Considering the convergence of $\sum_{n=1}^\infty\frac1n$:

Taking the ratio test, the result is inconclusive. Taking the root test, the result is inconclusive. Taking the integral test, the result is divergence. Taking the Cauchy condensation test, the result is divergence.

Now, the comparison test can be done with anything that fits the inequalities, but one usually tries the other tests first, since they are special cases of the comparison test. Only if all else fails (or if you get an interesting idea) should you resort to a direct comparison.

Also, if you use the Cauchy condensation test with $f(n)=\frac1n$, the condensed series diverges:

$$\sum_{n=1}^\infty2^n\cdot\frac1{2^n}=\sum_{n=1}^\infty1=1+1+1+\dots=\infty$$

and a divergent condensed series implies the original series diverges.

If you use the integral test:

$$\sum_{n=1}^\infty\frac1n\text{ converges iff }\int_1^\infty\frac1t\,dt\text{ converges}$$

$$\int_1^\infty\frac1t\,dt=\lim_{b\to\infty}\ln b=\infty$$


In essence, what separates these two series is comparison: between a convergent geometric series and the divergent harmonic series sit families like $\sum 1/n^p$ that converge for some exponents and diverge for others, and the tests above tell you on which side of that boundary a given series falls.


Others have been focusing on intuitions for why the second series diverges independently of the first series... but there is a way to see what is happening by examining how the first and second series are related.

Before we start, note that $$ \sum_{k=0}^\infty \frac{1}{2^k} = 2. $$ Now, consider the set of all positive integers. We can categorise these according to the largest odd factor of each number. So, for instance, $2$, $4$, $8$, and $16$ all share $1$ as their largest odd factor, while $15$, $30$, $60$, and $120$ all share $15$.

This allows us to factorise our second sum, as $$ \sum_{n=1}^\infty \frac1n = \left(\sum_{k=0}^\infty \frac1{2^k}\right)\sum_{n\in2\mathbb{N}-1}\frac1n = 2\sum_{n\in2\mathbb{N}-1}\frac1n $$

But this new sum, for only the odd terms, gives us a peculiar behaviour, as $$ \sum_{n\in2\mathbb{N}-1}\frac1n > \sum_{n\in2\mathbb{N}}\frac1n = \sum_{n=1}^\infty\frac1{2n} $$ That is, $\frac1{2n-1}>\frac1{2n}$ for all $n\geq1$, so the sum must also be larger. Note that this relation is a strict one - they cannot be equal.
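This dominance is easy to confirm numerically (a Python sketch comparing partial sums up to a cutoff $N$):

```python
# Partial sums over odd vs even denominators up to N: the odd sum
# always dominates, and their difference tends to ln(2) ≈ 0.693.
N = 10 ** 5
odd = sum(1.0 / n for n in range(1, N + 1, 2))
even = sum(1.0 / n for n in range(2, N + 1, 2))
print(odd, even, odd - even)
```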

So what we have is $$ S=\sum_{n=1}^\infty \frac1n > 2\sum_{n=1}^\infty\frac1{2n} = \sum_{n=1}^\infty \frac1n = S $$ It is here that you can see the problem - the sum must be strictly larger than itself. As this is a logical impossibility, we conclude that the sum is, itself, not well-defined.

This reasoning essentially operates on the same logic used to show that the first sum converges to 2... but gives a very different result.


The intuition for the geometric series is easy: you cut a pie in half, then cut one of the halves in half again, and so on... in total, it's still the whole pie.

The second one is more tricky. Suppose you have money in a bank at a $100\%$ interest rate compounded continuously, starting with $1$ dollar, so that you will have $e^{t}$ dollars after $t$ years (if you prefer, consider $2^t$); the point is that the rate of growth is proportional (here equal) to your amount of money. Now ask the questions

  1. "How many years do I have to wait until I have $x$ dollars?" (where $x$ is an integer)
  2. "How many years do I have to wait until I have $x+1$ dollars?"

No matter what the answers are, you should agree that if $x$ is large, then the difference between answer $1$ and answer $2$ is tiny. In fact, the difference is approximately $1/x$ years: the rate of increase equals the amount you have, so if you have $x$ dollars and the rate of increase is $x$ dollars per year, it takes about $1/x$ years to gain one more dollar.

So if you continue and ask

  3. "How many years do I have to wait until I have $x+2$ dollars?"

you need to wait roughly another $1/(x+1)$ years, and so on.

But then, of course, after ANY amount of time you will have SOME amount of money in the bank, even after a million years... so $1/x+1/(x+1)+1/(x+2)+ \ldots$ can be arbitrarily large.
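The waiting times can be tabulated (a Python sketch): with $e^t$ growth, the wait from $x$ to $x+1$ dollars is $\ln(x+1)-\ln x$, which is about $1/x$ years.

```python
import math

# Time for e^t to grow from x dollars to x+1 dollars, vs the estimate 1/x.
for x in (10, 100, 1000, 10_000):
    wait = math.log(x + 1) - math.log(x)
    print(x, wait, 1.0 / x)  # ever closer agreement as x grows
```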

:) I think it was a nice answer. – Simple Art 11 hours ago
