5. Convergence of Series
Today’s focus will be on convergence tests for series, many of which have (hopefully) been seen in calculus before. Let’s first recall
Definition 1.
Given a sequence \(\left( a_n \right) _{n\in \mathbb{N}}\), define the \(n\)-th partial sum as \[S_n = a_1 + \cdots + a_n = \sum _{i=1}^{n} a_i.\] The series \(\sum _{n=1}^{\infty} a_n\) is defined as the limit \(\lim _{n\to\infty} S_n\).
We say the series \(\sum _{n=1}^{\infty}a_n\) converges if the sequence of partial sums converges; diverges to \(\infty\) if the sequence of partial sums diverges to \(\infty\); and so on.
In some rather rare circumstances, one can show that a series converges to a finite limit directly by coming up with a formula for \(S_n\). This is the case for geometric series and for series that telescope. We are not so fortunate for the vast majority of series, however, and we need to appeal to indirect arguments to prove convergence.
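To make the fortunate cases concrete: for a geometric series with ratio \(r \neq 1\), one can write down the partial sums explicitly, \[S_n = \sum _{i=1}^{n} r^i = \frac{r\left( 1 - r^n \right)}{1-r},\] which converges to \(\frac{r}{1-r}\) as \(n \to \infty\) exactly when \(\left\lvert r \right\rvert < 1\). Similarly, the telescoping series \(\sum _{n=1}^{\infty} \frac{1}{n(n+1)}\) has partial sums \[S_n = \sum _{i=1}^{n} \left( \frac{1}{i} - \frac{1}{i+1} \right) = 1 - \frac{1}{n+1} \longrightarrow 1.\]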
Recall the following famous theorem:
Theorem 2.
A sequence \(\left( S_n \right) _{n\in \mathbb{N}}\) converges to a finite limit if and only if it is Cauchy.
In the context of series, this says that \(\sum _{n=1}^{\infty} a_n\) converges if and only if for every \(\epsilon > 0\), there exists \(N\) such that for all \(n > m \geq N\), \[\left\lvert S_n - S_m \right\rvert = \left\lvert a _{m+1} + \cdots + a_n \right\rvert = \left\lvert \sum _{i=m+1}^{n} a_i \right\rvert < \epsilon .\] This is often much more practical than deriving some complicated algebraic formula for the partial sums and computing the limit of the series directly.
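For example, the Cauchy criterion gives a quick proof that the harmonic series \(\sum _{n=1}^{\infty} \frac{1}{n}\) diverges: taking \(n = 2m\), each of the \(m\) terms in the tail is at least \(\frac{1}{2m}\), so \[\left\lvert S _{2m} - S_m \right\rvert = \sum _{i=m+1}^{2m} \frac{1}{i} \geq m \cdot \frac{1}{2m} = \frac{1}{2}\] no matter how large \(m\) is, and hence the partial sums are not Cauchy.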
In some cases, it’s possible to spam the triangle inequality and argue that \[\left\lvert \sum _{i=m+1}^{n} a_i \right\rvert \leq \sum _{i=m+1}^{n} \left\lvert a_i \right\rvert,\] then show that the right hand side of this inequality is bounded by \(\epsilon \) when \(n, m\) are very large. This is at the heart of what the ratio and root tests argue.
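For example, to see that \(\sum _{n=1}^{\infty} \frac{\sin n}{2^n}\) converges, note that \[\left\lvert \sum _{i=m+1}^{n} \frac{\sin i}{2^i} \right\rvert \leq \sum _{i=m+1}^{n} \frac{\left\lvert \sin i \right\rvert}{2^i} \leq \sum _{i=m+1}^{n} \frac{1}{2^i} < \frac{1}{2^m},\] and the right hand side is smaller than any given \(\epsilon > 0\) once \(m\) is large enough, so the partial sums are Cauchy.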
Theorem 3.
Let \(\left( a_n \right) _{n\in \mathbb{N}}\) be a sequence of real numbers. If \(\limsup _{n\to\infty} \left\lvert a_n \right\rvert ^{\frac{1}{n}} < 1\), then \(\sum _{n=1}^{\infty} a_n\) converges. If \(\limsup _{n\to\infty} \left\lvert a_n \right\rvert ^{\frac{1}{n}} > 1\), then \(\sum _{n=1}^{\infty} a_n\) diverges.
You have (hopefully) seen a proof of this in lecture, but let us describe the main idea. When the \(\limsup\) is \(< 1\), then for some \(\epsilon > 0\) very small, \(1 - \epsilon \) is eventually an upper bound for \(\left\lvert a_n \right\rvert ^{\frac{1}{n}}\). But this means \(\left\lvert a_n \right\rvert < \left( 1 - \epsilon \right)^n\) for all \(n\) sufficiently large, say \(n \geq N\)! In particular, \(\sum _{n=N}^{\infty} \left\lvert a_n \right\rvert < \sum _{n=N}^{\infty} (1-\epsilon )^n < \infty,\) so the series converges absolutely, and the triangle inequality trick above then shows \(\sum _{n=1}^{\infty} a_n\) itself converges. On the other hand, if the \(\limsup\) is \(> 1\), then for some \(\epsilon > 0\) very small, \(1 + \epsilon \) is not an eventual upper bound of \(\left\lvert a_n \right\rvert ^{\frac{1}{n}}\). But this means that \(\left\lvert a_n \right\rvert > (1 + \epsilon )^n\) for infinitely many values of \(n\), so \(\left\lvert a_n \right\rvert\) isn’t even bounded; in particular \(a_n \not\to 0\), and the series cannot converge.
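For a quick worked example, take \(a_n = \frac{n^2}{2^n}\). Since \(n ^{\frac{1}{n}} \to 1\), we get \[\limsup _{n\to\infty} \left\lvert a_n \right\rvert ^{\frac{1}{n}} = \lim _{n\to\infty} \frac{n ^{\frac{2}{n}}}{2} = \frac{1}{2} < 1,\] so the root test tells us \(\sum _{n=1}^{\infty} \frac{n^2}{2^n}\) converges.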
Theorem 4.
Let \(\left( a_n \right) _{n\in \mathbb{N}}\) be a sequence of real numbers. If \[\limsup _{n\to\infty} \left\lvert \frac{a _{n+1}}{a_n} \right\rvert < 1,\] then \(\sum _{n=1}^{\infty} \left\lvert a_n \right\rvert\) converges. If \[\liminf _{n\to\infty} \left\lvert \frac{a _{n+1}}{a_n} \right\rvert > 1,\] then \(\sum _{n=1}^{\infty} \left\lvert a_n \right\rvert\) diverges.
Note that it is possible that \(a_n = 0\) for some values of \(n\). For the purposes of this convergence test, we take \(\frac{0}{0} = 0\) and \(\frac{x}{0} = \pm\infty\) for any nonzero \(x\) (the sign is chosen to match that of \(x\)).
Consider first the case that \[\liminf _{n\to\infty} \left\lvert \frac{a _{n+1}}{a_n} \right\rvert > 1.\] This says that for some \(\epsilon > 0\) very small, \(1 + \epsilon \) is an eventual lower bound for the ratio \(\left\lvert \frac{a _{n+1}}{a_n} \right\rvert\). But then \(\left\lvert a _{n+1} \right\rvert > (1 + \epsilon ) \left\lvert a_n \right\rvert\) for all \(n\) sufficiently large, so \(\left\lvert a_n \right\rvert\) grows at least geometrically. In particular \(a_n \not\to 0\), so there’s no way for \(\sum _{n=1}^{\infty} a_n\) to converge!
On the other hand, when \[\limsup _{n\to\infty} \left\lvert \frac{a _{n+1}}{a_n} \right\rvert < 1,\] we have for some \(\epsilon > 0\) very small that \(1 - \epsilon \) is eventually an upper bound for the ratio \(\left\lvert \frac{a _{n+1}}{a_n} \right\rvert\). This means that for some \(N\) sufficiently large, we have \(\left\lvert a _{N+1} \right\rvert \leq (1-\epsilon ) \left\lvert a_N \right\rvert\), that \(\left\lvert a _{N+2} \right\rvert \leq (1 - \epsilon )^2 \left\lvert a_N \right\rvert\), that \(\left\lvert a _{N+3} \right\rvert \leq (1 - \epsilon)^3 \left\lvert a_N \right\rvert\), and so on. In particular, we have \[\sum _{n=N}^{\infty} \left\lvert a_n \right\rvert \leq \sum _{n=N}^{\infty} \left\lvert a_N \right\rvert \cdot (1- \epsilon ) ^{n-N} = \frac{\left\lvert a_N \right\rvert}{\epsilon} < \infty.\] Adding in the finitely many terms before \(n = N\), we get that the series converges.
Making this precise is somewhat messy, but the main idea is that knowing the \(\limsup\) and \(\liminf\) of the ratio of successive terms allows us to bound the series above or below by some converging or diverging geometric series.
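For a worked example, fix any \(x \in \mathbb{R}\) and take \(a_n = \frac{x^n}{n!}\). Then \[\limsup _{n\to\infty} \left\lvert \frac{a _{n+1}}{a_n} \right\rvert = \lim _{n\to\infty} \frac{\left\lvert x \right\rvert}{n+1} = 0 < 1\] (using the convention above if \(x = 0\)), so \(\sum _{n=1}^{\infty} \frac{x^n}{n!}\) converges no matter how large \(x\) is.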
Question 5.
Though both proofs outlined above make comparisons to geometric series, in what ways are the arguments different? What are some concrete examples of \(a_n\) wherein one argument works but the other doesn’t?
Question 6.
Suppose \(a_n = 2 ^{-n}\) when \(n\) is even and \(a_n = 0\) when \(n\) is odd. What does the ratio test say about the convergence of \(\sum _{n=1}^{\infty} a_n\)? What does the root test say? Does this illustrate any differences you pointed out in the preceding question?
Question 7.
Give an example of two sequences \(a_n\) and \(b_n\) so that \(\sum _{n=1}^{\infty} a_n\) converges, \(\sum _{n=1}^{\infty} b_n\) diverges, and so that the root test is inconclusive when applied to either series.