Synapse

An interconnected graph of micro-tutorials

Infinite Series: When Does Adding Forever Make Sense?

This is an early draft. Content may change as it gets reviewed.

What does it mean to add infinitely many numbers?

$$1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \cdots$$

This sum has infinitely many terms, but it equals exactly $2$. Each new term brings the running total closer to 2, and the total never overshoots. The sum converges.
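A quick numeric sanity check (a Python sketch; the helper name is mine) shows the partial sums closing in on 2:

```python
# Partial sums of the geometric series 1 + 1/2 + 1/4 + 1/8 + ...
def geometric_partial_sum(n_terms):
    """Sum of the first n_terms terms of sum_{k>=0} (1/2)**k."""
    return sum(0.5 ** k for k in range(n_terms))

for n in (1, 2, 5, 10, 20):
    # Each value equals 2 - 2**(1 - n), so the gap to 2 halves with every term.
    print(n, geometric_partial_sum(n))
```

Because every power of $1/2$ is exactly representable in binary floating point, these partial sums are computed exactly, with no rounding error.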

But not all infinite sums behave so well:

$$1 + 1 + 1 + 1 + \cdots$$

This obviously diverges: it grows without bound. Less obviously, so does:

$$1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \frac{1}{5} + \cdots$$

Even though the terms shrink toward zero, this sum, the harmonic series, diverges. It grows past any bound, just very slowly.
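"Very slowly" is easy to quantify by brute force. This sketch (the function name is mine) counts how many terms the harmonic series needs to pass a given bound; the counts grow roughly like $e^{\text{bound}}$:

```python
# How many harmonic-series terms does it take to exceed a given bound?
def terms_to_exceed(bound):
    total, n = 0.0, 0
    while total <= bound:
        n += 1
        total += 1.0 / n
    return n

for bound in (2, 5, 10):
    # Passing 10 already takes over twelve thousand terms.
    print(bound, terms_to_exceed(bound))
```

The partial sums behave like $\ln N$, so each extra unit of growth costs roughly $e \approx 2.7$ times as many terms.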

Partial sums

An infinite series is defined as the limit of its partial sums. For a series $\sum_{n=1}^{\infty} a_n$, the $N$-th partial sum is:

$$S_N = a_1 + a_2 + \cdots + a_N$$

If $S_N$ approaches a finite number $L$ as $N \to \infty$, the series converges to $L$:

$$\sum_{n=1}^{\infty} a_n = L$$

If $S_N$ grows without bound (or oscillates), the series diverges.
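The oscillating case is worth seeing concretely. Here is a sketch (names are illustrative) that computes partial sums for any term function, applied to Grandi's series $1 - 1 + 1 - 1 + \cdots$, whose partial sums never settle down:

```python
# Partial sums S_1, S_2, ..., S_N for a general term function a(n).
def partial_sums(a, N):
    sums, total = [], 0.0
    for n in range(1, N + 1):
        total += a(n)
        sums.append(total)
    return sums

# Grandi's series 1 - 1 + 1 - 1 + ... : the partial sums oscillate 1, 0, 1, 0, ...
# so the limit does not exist and the series diverges.
print(partial_sums(lambda n: (-1) ** (n + 1), 6))
```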

A necessary condition (that isn't sufficient)

If a series converges, the terms must approach zero: $a_n \to 0$. But the converse is false: the harmonic series has $a_n = 1/n \to 0$ and still diverges. Terms shrinking to zero is necessary but not sufficient.

Convergence tests

Several tests help determine convergence:

Comparison test: If $0 \leq a_n \leq b_n$ for all $n$ and $\sum b_n$ converges, then $\sum a_n$ converges. "Bounded above by something finite? Then finite."
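For instance, $0 \leq 1/(n^2 + 1) \leq 1/n^2$ for every $n$, and $\sum 1/n^2$ converges, so $\sum 1/(n^2+1)$ must converge too. A numeric sketch of the termwise bound:

```python
# Comparison test: 0 <= 1/(n**2 + 1) <= 1/n**2 termwise, and sum 1/n**2
# converges, so sum 1/(n**2 + 1) is squeezed into converging as well.
N = 100_000
a = [1 / (n * n + 1) for n in range(1, N + 1)]
b = [1 / (n * n) for n in range(1, N + 1)]

assert all(x <= y for x, y in zip(a, b))  # the termwise bound holds
print(sum(a), sum(b))  # both partial sums level off at finite values
```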

Ratio test: If $\lim_{n \to \infty} |a_{n+1}/a_n| = r$, then the series converges if $r < 1$, diverges if $r > 1$, and the test is inconclusive if $r = 1$.
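To see the ratio test in action, take $a_n = n/2^n$ (my example, not from the text above). The ratio $a_{n+1}/a_n = (n+1)/(2n)$ tends to $1/2 < 1$, so the series converges:

```python
# Ratio test for a_n = n / 2**n: |a_{n+1} / a_n| = (n + 1) / (2n) -> 1/2 < 1,
# so the series sum n / 2**n converges.
def a(n):
    return n / 2 ** n

for n in (1, 10, 100, 1000):
    print(n, a(n + 1) / a(n))  # the ratios approach 0.5 from above
```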

$p$-series test: $\sum 1/n^p$ converges if $p > 1$ and diverges if $p \leq 1$. This single fact connects the harmonic series ($p = 1$, diverges) to the Basel problem ($p = 2$, converges to $\pi^2/6$) and to the Riemann zeta function ($\zeta(s) = \sum 1/n^s$).
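The Basel case is easy to check numerically. This sketch (function name mine) shows the $p = 2$ partial sums approaching $\pi^2/6 \approx 1.6449$; the tail after $N$ terms is roughly $1/N$, so convergence is visible but not fast:

```python
import math

# Basel problem: sum 1/n**2 converges to pi**2 / 6 = 1.6449...
def basel_partial(N):
    return sum(1.0 / (n * n) for n in range(1, N + 1))

# The partial sum undershoots the limit by roughly 1/N.
print(basel_partial(1000), math.pi ** 2 / 6)
```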

Absolute vs conditional convergence

A series converges absolutely if $\sum |a_n|$ converges. A series converges conditionally if $\sum a_n$ converges but $\sum |a_n|$ does not.

The alternating harmonic series $1 - 1/2 + 1/3 - 1/4 + \cdots = \ln 2$ converges conditionally. The Riemann rearrangement theorem says you can rearrange its terms to make it converge to any number you want, a striking demonstration that conditional convergence is fragile.
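The rearrangement idea can be sketched as a greedy algorithm (my construction of the standard proof idea, not a library routine): while the running total is below the target, spend positive terms $1/1, 1/3, 1/5, \ldots$; while above, spend negative terms $-1/2, -1/4, \ldots$. Since both sub-series diverge and the terms shrink to zero, the total crosses the target infinitely often with ever-smaller overshoot:

```python
# Greedy rearrangement of the alternating harmonic series toward any target.
def rearranged_partial_sum(target, n_terms):
    total = 0.0
    pos = 1  # next odd denominator (positive terms +1/pos)
    neg = 2  # next even denominator (negative terms -1/neg)
    for _ in range(n_terms):
        if total <= target:
            total += 1.0 / pos
            pos += 2
        else:
            total -= 1.0 / neg
            neg += 2
    return total

print(rearranged_partial_sum(1.0, 10_000))    # hovers near 1.0
print(rearranged_partial_sum(3.14, 100_000))  # hovers near 3.14
```

The overshoot at any point is bounded by the size of the last term used at the most recent crossing, which shrinks to zero, so the rearranged series converges to the target.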

Why series matter

Infinite series are the language of:

- Analysis: Taylor series represent functions as infinite polynomials
- Number theory: The zeta function $\zeta(s) = \sum 1/n^s$ encodes the distribution of primes
- Probability: Generating functions are power series that encode distributions
- Physics: Perturbation theory expresses solutions as infinite corrections to simpler ones

The question "does this series converge?" is the gateway to asking what the sum means, and in number theory those meanings run extraordinarily deep.