Sleek proof that the harmonic series diverges
Suppose that the harmonic series does converge. Then the sequence of integrable functions $f_n = \frac{1}{n}\,\chi_{[0,n]}$ is bounded above by the integrable function $g(x) = \frac{1}{\lceil x \rceil}$, since $\int_0^\infty g = \sum_{k=1}^\infty \frac{1}{k}$, which is finite by assumption. Note that $\int_0^\infty f_n = \frac{1}{n} \cdot n = 1$ for every $n$.
Therefore, since $f_n \to 0$ almost everywhere, the Dominated Convergence Theorem applies and we have
$$1 = \lim_{n \to \infty} \int_0^\infty f_n = \int_0^\infty \lim_{n \to \infty} f_n = \int_0^\infty 0 = 0,$$
and we have reached a contradiction. Therefore the harmonic series diverges.
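A small numerical illustration of the tension the proof exploits (not a substitute for the argument; the function names below are ad hoc): every $f_n$ has integral exactly $1$, while the partial integrals of the dominating function $g(x) = 1/\lceil x \rceil$ are the partial sums of the harmonic series and keep growing like $\ln N$.

```python
import math

# Illustration only: f_n = (1/n) * chi_[0,n] has integral 1 for every n,
# yet f_n -> 0 pointwise; the smallest dominating function g(x) = 1/ceil(x)
# integrates to the harmonic series, which grows without bound.

def integral_f(n):
    # integral of f_n over [0, infinity) = (1/n) * n = 1
    return (1.0 / n) * n

def integral_g_up_to(N):
    # integral of g over [0, N] = sum_{k=1}^{N} 1/k  (partial harmonic sum)
    return sum(1.0 / k for k in range(1, N + 1))

print([integral_f(n) for n in (1, 10, 100)])  # each equals 1
print(integral_g_up_to(10**5))                # grows roughly like ln(N) + 0.577
```

So if the harmonic series were finite, $g$ would be integrable and dominated convergence would force the two limits above to agree, which they do not.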
Source: http://mathoverflow.net/questions/42512/awfully-sophisticated-proof-for-simple-facts/44742#44742
The following blog post contains two links leading to over 40 proof ideas, all different from the one presented here.
Categories: Analysis, Measure Theory
Tags: dominated convergence, harmonic, trick
Is there a similar argument for $\sum_{n=1}^\infty \frac{1}{n^p}$ with $p > 1$?
The proof works for proving the divergence of $\sum_{n=1}^\infty \frac{1}{n^p}$ for $p \le 1$. I don't think that the proof can be adapted to prove the convergence of the series for $p > 1$, because it assumes the convergence in order to reach a contradiction.
But why wouldn’t the same proof work for $p > 1$? We assume the series converges, then we get a contradiction.
I believe you want to say: we assume it diverges, and then we get a contradiction.
If we define $f_n = \frac{1}{n^p}\,\chi_{[0,n]}$, then the integral of $f_n$ over $[0,n]$ is $n^{1-p}$, which converges to zero for $p > 1$, so we don’t have a contradiction.
What I meant was defining $f_n = \frac{1}{n^p}\,\chi_{[0,n^p]}$, then the integral of $f_n$ is 1.
Then you have $f_n(x) \le g(x)$, where $g(x) = \frac{1}{(k+1)^p}$ for $x \in [k^p, (k+1)^p)$ is the pointwise supremum of the $f_n$, and the integral of $g$ is $\sum_{k=0}^\infty \frac{(k+1)^p - k^p}{(k+1)^p}$, which is divergent (its terms behave like $p/k$). This means that you cannot bound $f_n$ by an integrable function, so the Lebesgue dominated convergence theorem does not apply.
Note that the difference between $\chi_{[0,n]}$ and $\chi_{[0,n^p]}$ is that the difference between consecutive endpoints is $1$ for the first one, and $(n+1)^p - n^p$ for the second one, and that difference appears in the terms $\frac{(n+1)^p - n^p}{(n+1)^p}$ of the integral of $g$.
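As a quick numerical sanity check (illustration only, with $p = 2$ chosen as an assumed example), one can watch the partial sums of $\sum_{n} \frac{(n+1)^p - n^p}{(n+1)^p}$ — the integral of the pointwise supremum of $f_n = \frac{1}{n^p}\,\chi_{[0,n^p]}$ — grow without bound, confirming that no integrable dominating function exists here.

```python
# Illustration only: partial sums of sum_n ((n+1)^p - n^p) / (n+1)^p,
# the integral of the pointwise supremum g of f_n = (1/n^p) * chi_[0, n^p].
# The terms behave like p/n, so the partial sums grow roughly like p * ln(N).

def dominating_partial_sum(N, p=2):
    return sum(((n + 1)**p - n**p) / (n + 1)**p for n in range(1, N + 1))

for N in (10, 100, 1000, 10000):
    print(N, dominating_partial_sum(N))  # keeps increasing with N
```

This matches the comment above: even assuming $\sum 1/n^p$ converges for $p > 1$ gives no control over this sum, so no dominated-convergence contradiction arises.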
Thanks, I appreciate it.