# Convergent series

In mathematics, a series is the sum of the terms of an infinite sequence of numbers.

Given a sequence $\left\{a_{1},\ a_{2},\ a_{3},\dots \right\}$ , the nth partial sum $S_{n}$ is the sum of the first n terms of the sequence, that is,

$S_{n}=\sum _{k=1}^{n}a_{k}.$

A series is convergent if the sequence of its partial sums $\left\{S_{1},\ S_{2},\ S_{3},\dots \right\}$ converges. In more formal language, a series converges if there exists a limit $\ell$ such that for any arbitrarily small positive number $\varepsilon >0$, there is a large integer $N$ such that for all $n\geq N$,

$\left|S_{n}-\ell \right\vert \leq \varepsilon .$

A series that is not convergent is said to be divergent.
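The definition can be illustrated numerically. The following is an illustrative Python sketch (the helper names are chosen here for illustration, not standard): the partial sums of the geometric series $\sum _{k=1}^{\infty }(1/2)^{k}$ approach the limit $\ell =1$, with $\left|S_{n}-1\right\vert =(1/2)^{n}$.

```python
# Illustrative sketch: partial sums S_n of the geometric series sum (1/2)^k,
# which converges to 1. (Helper names are for illustration only.)
def partial_sum(a, n):
    """S_n = a_1 + ... + a_n for a term function a(k)."""
    return sum(a(k) for k in range(1, n + 1))

geometric = lambda k: (1 / 2) ** k

# |S_n - 1| = (1/2)^n, so for any eps > 0 any N with (1/2)^N <= eps works.
for n in (5, 10, 20):
    print(n, partial_sum(geometric, n))
```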

File:LogConvergence95.gif
Illustration of the convergence of the power series of Log[z+1] around 0 evaluated at z = 0.95 Exp[(π − 1/3)i].

## Examples of convergent and divergent series

• The reciprocals of powers of 2 produce a convergent series (so the set of powers of 2 is "small"):
${1 \over 1}+{1 \over 2}+{1 \over 4}+{1 \over 8}+{1 \over 16}+{1 \over 32}+\cdots =2.$
• The reciprocals of positive integers produce a divergent series (the harmonic series):
${1 \over 1}+{1 \over 2}+{1 \over 3}+{1 \over 4}+{1 \over 5}+{1 \over 6}+\cdots .$
• Alternating the signs of the reciprocals of positive integers produces a convergent series (the alternating harmonic series):
${1 \over 1}-{1 \over 2}+{1 \over 3}-{1 \over 4}+{1 \over 5}-{1 \over 6}+\cdots =\ln 2.$
• The reciprocals of prime numbers produce a divergent series (so the set of primes is "large"):
${1 \over 2}+{1 \over 3}+{1 \over 5}+{1 \over 7}+{1 \over 11}+{1 \over 13}+\cdots .$
• The reciprocals of square numbers produce a convergent series (the Basel problem):
${1 \over 1}+{1 \over 4}+{1 \over 9}+{1 \over 16}+{1 \over 25}+{1 \over 36}+\cdots ={\pi ^{2} \over 6}.$
• Alternating the signs of the reciprocals of positive odd numbers produces a convergent series:
${1 \over 1}-{1 \over 3}+{1 \over 5}-{1 \over 7}+{1 \over 9}-{1 \over 11}+\cdots ={\pi \over 4}.$

## Convergence tests

There are a number of methods of determining whether a series converges or diverges.
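Before applying a formal test, numerical inspection of partial sums can suggest (though never prove) convergence or divergence. An illustrative Python sketch, comparing the harmonic series with the convergent series of reciprocal squares:

```python
# Illustrative sketch: partial sums of the harmonic series keep growing
# (roughly like ln n), while partial sums of sum 1/n^2 level off near pi^2/6.
import math

def S(a, n):
    """n-th partial sum of the series with term function a(k)."""
    return sum(a(k) for k in range(1, n + 1))

harmonic = lambda k: 1 / k
squares = lambda k: 1 / k**2

print(S(harmonic, 10**4))                 # keeps growing without bound
print(S(squares, 10**4), math.pi**2 / 6)  # settles near pi^2 / 6
```

No finite amount of computation distinguishes slow divergence (the harmonic series) from convergence, which is why the tests below are needed.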

File:Comparison test series.svg
If the blue series, $\Sigma b_{n}$, can be proven to converge, then the smaller series, $\Sigma a_{n}$, must converge. By contraposition, if the red series, $\Sigma a_{n}$, is proven to diverge, then $\Sigma b_{n}$ must also diverge.

Comparison test. The terms of the sequence $\left\{a_{n}\right\}$ are compared to those of another sequence $\left\{b_{n}\right\}$ . If,

for all n, $0\leq \ a_{n}\leq \ b_{n}$ , and $\sum _{n=1}^{\infty }b_{n}$ converges, then so does $\sum _{n=1}^{\infty }a_{n}$ .

However, if,

for all n, $0\leq \ b_{n}\leq \ a_{n}$ , and $\sum _{n=1}^{\infty }b_{n}$ diverges, then so does $\sum _{n=1}^{\infty }a_{n}$ .
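As an illustrative sketch (Python; the specific series is chosen here for illustration): the terms $a_{n}=1/(n^{2}+n)$ satisfy $0\leq a_{n}\leq 1/n^{2}$, and $\sum 1/n^{2}$ converges, so the comparison test forces $\sum a_{n}$ to converge. In this case the sum can also be checked directly, since $1/(n^{2}+n)$ telescopes to 1.

```python
# Illustrative sketch of the comparison test: a_n = 1/(n^2+n) is termwise
# bounded by b_n = 1/n^2, whose series converges, so sum a_n converges too.
N = 10**5
a = [1 / (n * n + n) for n in range(1, N + 1)]
b = [1 / (n * n) for n in range(1, N + 1)]

assert all(0 <= x <= y for x, y in zip(a, b))  # the termwise bound 0 <= a_n <= b_n
print(sum(a))  # approaches 1, since the series telescopes: sum = 1 - 1/(N+1)
```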

Ratio test. Assume that for all n, $a_{n}>0$ . Suppose that there exists $r$ such that

$\lim _{n\to \infty }{\frac {a_{n+1}}{a_{n}}}=r$ .

If r < 1, then the series converges. If r > 1, then the series diverges. If r = 1, the ratio test is inconclusive, and the series may converge or diverge.
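An illustrative Python sketch of the ratio test, applied to $a_{n}=1/n!$: the ratio $a_{n+1}/a_{n}=1/(n+1)$ tends to $r=0<1$, so the series converges (its value is $e-1$ with this indexing).

```python
# Illustrative sketch of the ratio test for a_n = 1/n!: the consecutive
# ratio a_{n+1}/a_n = 1/(n+1) tends to r = 0 < 1, so sum a_n converges.
import math

def ratio(a, n):
    return a(n + 1) / a(n)

a = lambda n: 1 / math.factorial(n)

print([ratio(a, n) for n in (1, 10, 100)])  # 1/2, 1/11, 1/101 -> 0
```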

Root test or nth root test. Suppose that the terms of the sequence in question are non-negative, and that there exists r such that

$\lim _{n\to \infty }{\sqrt[{n}]{a_{n}}}=r.$

If r < 1, then the series converges. If r > 1, then the series diverges. If r = 1, the root test is inconclusive, and the series may converge or diverge.

The ratio test and the root test are both based on comparison with a geometric series, and as such they work in similar situations. In fact, if the ratio test works (meaning that the limit exists and is not equal to 1) then so does the root test; the converse, however, is not true. The root test is therefore more generally applicable, but as a practical matter the limit is often difficult to compute for commonly seen types of series.
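An illustrative Python sketch of the root test, for the series with $a_{n}=\left(n/(2n+1)\right)^{n}$ (chosen here because the nth root simplifies): $\sqrt[n]{a_{n}}=n/(2n+1)\to r=1/2<1$, so the series converges.

```python
# Illustrative sketch of the root test for a_n = (n / (2n+1))^n:
# the nth root of a_n is n / (2n+1), which tends to r = 1/2 < 1.
a = lambda n: (n / (2 * n + 1)) ** n
nth_root = lambda n: a(n) ** (1 / n)

print([nth_root(n) for n in (1, 10, 1000)])  # tends to 1/2
```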

Integral test. The series can be compared to an integral to establish convergence or divergence. Let $f$ be a positive, monotone decreasing function with $f(n)=a_{n}$ for all n. If

$\int _{1}^{\infty }f(x)\,dx=\lim _{t\to \infty }\int _{1}^{t}f(x)\,dx<\infty ,$

then the series converges. But if the integral diverges, then the series does so as well.

Limit comparison test. If $a_{n},b_{n}>0$ for all n, and the limit $\lim _{n\to \infty }{\frac {a_{n}}{b_{n}}}$ exists and is not zero, then $\sum _{n=1}^{\infty }a_{n}$ converges if and only if $\sum _{n=1}^{\infty }b_{n}$ converges.

Alternating series test. Also known as the Leibniz criterion, the alternating series test states that for an alternating series of the form $\sum _{n=1}^{\infty }a_{n}(-1)^{n}$ , if $\left\{a_{n}\right\}$ is monotone decreasing, and has a limit of 0, then the series converges.
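An illustrative Python sketch of the alternating series test, for the alternating harmonic series (written with the factor $(-1)^{n+1}$ so that the first term is positive and the sum is $\ln 2$): $a_{n}=1/n$ is monotone decreasing with limit 0, so the series converges, and the error after n terms is bounded by the first omitted term.

```python
# Illustrative sketch of the alternating series test: a_n = 1/n decreases
# to 0, so sum (-1)^(n+1)/n converges; its value is ln 2, and
# |S_n - ln 2| <= a_{n+1} = 1/(n+1) (first omitted term bound).
import math

def alt_partial(n):
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

print(alt_partial(1000), math.log(2))
```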

Cauchy condensation test. If $\left\{a_{n}\right\}$ is a non-negative, monotone decreasing sequence, then $\sum _{n=1}^{\infty }a_{n}$ converges if and only if $\sum _{k=1}^{\infty }2^{k}a_{2^{k}}$ converges.
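An illustrative Python sketch of the condensation test applied to $a_{n}=1/n^{2}$: the condensed terms are $2^{k}a_{2^{k}}=2^{k}/4^{k}=(1/2)^{k}$, a convergent geometric series, so $\sum 1/n^{2}$ converges as well.

```python
# Illustrative sketch of the Cauchy condensation test for a_n = 1/n^2:
# the condensed terms 2^k * a_{2^k} = (1/2)^k form a geometric series.
condensed = [2**k * (1 / (2**k) ** 2) for k in range(1, 30)]

print(condensed[:3])   # 1/2, 1/4, 1/8, ...
print(sum(condensed))  # close to 1, the geometric sum from k = 1
```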

## Conditional and absolute convergence

File:ExpConvergence.gif
Illustration of the absolute convergence of the power series of Exp[z] around 0 evaluated at z = Exp[i/3]. The length of the line is finite.
File:LogConvergence95.gif
Illustration of the conditional convergence of the power series of Log[z+1] around 0 evaluated at z = 0.95 Exp[(π − 1/3)i]. The length of the line is infinite.

For any sequence $\left\{a_{1},\ a_{2},\ a_{3},\dots \right\}$ , $a_{n}\leq \ \left|a_{n}\right\vert$ for all n. Therefore,

$\sum _{n=1}^{\infty }a_{n}\leq \sum _{n=1}^{\infty }\left|a_{n}\right\vert .$

If $\sum _{n=1}^{\infty }\left|a_{n}\right\vert$ converges, then $\sum _{n=1}^{\infty }a_{n}$ also converges (but not vice versa), since by the triangle inequality the partial sums of $\sum _{n=1}^{\infty }a_{n}$ form a Cauchy sequence whenever those of $\sum _{n=1}^{\infty }\left|a_{n}\right\vert$ do.

If the series $\sum _{n=1}^{\infty }\left|a_{n}\right\vert$ converges, then the series $\sum _{n=1}^{\infty }a_{n}$ is absolutely convergent. An absolutely convergent series is one in which the length of the path created by joining together all of the increments to the partial sum is finite. The power series of the exponential function is absolutely convergent everywhere.

If the series $\sum _{n=1}^{\infty }a_{n}$ converges but the series $\sum _{n=1}^{\infty }\left|a_{n}\right\vert$ diverges, then the series $\sum _{n=1}^{\infty }a_{n}$ is conditionally convergent. The path formed by connecting the partial sums of a conditionally convergent series is infinitely long. The power series of the logarithm is conditionally convergent.

The Riemann series theorem states that if a series converges conditionally, it is possible to rearrange the terms of the series in such a way that the series converges to any value, or even diverges.
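The rearrangement idea can be sketched numerically (Python; the greedy strategy below is the standard construction used in proofs of the theorem): take positive terms of the alternating harmonic series while the running sum is below the target, and negative terms while it is above. The partial sums then home in on any chosen target value, here 0.5 rather than $\ln 2$.

```python
# Illustrative sketch of the Riemann series theorem: greedily rearranging
# the conditionally convergent alternating harmonic series so that its
# partial sums approach an arbitrary target value.
def rearranged_partial(target, steps):
    pos = (1 / n for n in range(1, 10**7, 2))    # +1, +1/3, +1/5, ...
    neg = (-1 / n for n in range(2, 10**7, 2))   # -1/2, -1/4, -1/6, ...
    s = 0.0
    for _ in range(steps):
        # take a positive term if we are at or below target, else a negative one
        s += next(pos) if s <= target else next(neg)
    return s

print(rearranged_partial(0.5, 10000))  # close to 0.5, not ln 2
```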

## Uniform convergence

Main article: uniform convergence.

Let $\left\{f_{1},\ f_{2},\ f_{3},\dots \right\}$ be a sequence of functions. The series $\sum _{n=1}^{\infty }f_{n}$ is said to converge uniformly to f if the sequence $\{s_{n}\}$ of partial sums defined by

$s_{n}(x)=\sum _{k=1}^{n}f_{k}(x)$ converges uniformly to f.

There is an analogue of the comparison test for infinite series of functions called the Weierstrass M-test.
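An illustrative Python sketch of the M-test, for $f_{n}(x)=x^{n}$ on $[-1/2,1/2]$: there $|f_{n}(x)|\leq M_{n}=(1/2)^{n}$ and $\sum M_{n}$ converges, so $\sum _{n\geq 1}x^{n}$ converges uniformly on the interval (to $x/(1-x)$), with uniform error after n terms bounded by the tail $\sum _{k>n}M_{k}=(1/2)^{n}$.

```python
# Illustrative sketch of the Weierstrass M-test for f_n(x) = x^n on [-1/2, 1/2]:
# |f_n(x)| <= M_n = (1/2)^n there, and sum M_n converges, so the convergence
# to x / (1 - x) is uniform on the interval.
def s_n(x, n):
    return sum(x**k for k in range(1, n + 1))

limit = lambda x: x / (1 - x)

xs = [i / 100 for i in range(-50, 51)]  # grid on [-1/2, 1/2]
n = 20
# worst-case error over the grid; the M-test bounds it by (1/2)^n
worst = max(abs(s_n(x, n) - limit(x)) for x in xs)
print(worst)
```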

## Cauchy convergence criterion

The Cauchy convergence criterion states that a series

$\sum _{n=1}^{\infty }a_{n}$

converges if and only if the sequence of partial sums is a Cauchy sequence. This means that for every $\varepsilon >0,$ there is a positive integer $N$ such that for all $n\geq m\geq N$ we have

$\left|\sum _{k=m}^{n}a_{k}\right|<\varepsilon ,$

which is equivalent to

$\lim _{n\to \infty \atop m\to \infty }\sum _{k=n}^{n+m}a_{k}=0.$
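An illustrative Python sketch of the Cauchy criterion: for $\sum 1/n^{2}$ the block sums $\left|a_{m}+\cdots +a_{n}\right|$ become uniformly small, while for the harmonic series the block from $N+1$ to $2N$ is always at least $N\cdot {\frac {1}{2N}}={\frac {1}{2}}$, so the harmonic series fails the criterion and diverges.

```python
# Illustrative sketch of the Cauchy criterion: block sums |a_m + ... + a_n|
# shrink for a convergent series but stay bounded below for the harmonic one.
def block(a, m, n):
    return abs(sum(a(k) for k in range(m, n + 1)))

sq = lambda k: 1 / k**2
harm = lambda k: 1 / k

print(block(sq, 1001, 2000))    # tiny: the Cauchy condition holds
print(block(harm, 1001, 2000))  # at least 1/2: the harmonic series is not Cauchy
```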