4. Sequences, Limits, \(\limsup\), and \(\liminf\)
Let’s quickly recall the definition of a limit:
Definition 1.
A sequence of real numbers \(\left( a_n \right) _{n=1}^{\infty}\) converges to a limit \(L\in \mathbb{R}\) if for all \(\epsilon > 0\), there exists some \(N\) such that for all \(n > N\), \[\left\lvert a_n - L \right\rvert < \epsilon .\] We write \(L = \lim _{n\to\infty} a_n\).
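To make the quantifiers concrete, here is a small numerical sketch (my own illustration, not part of the formal development) using the sequence \(a_n = \frac{1}{n}\), which converges to \(L = 0\): given an \(\epsilon\), it produces an explicit threshold \(N\).

```python
import math

# The sequence a_n = 1/n converges to L = 0.  For a given epsilon,
# return an N witnessing the definition: |a_n - 0| < epsilon for all n > N.
def threshold(epsilon):
    # If n > 1/epsilon, then 1/n < epsilon, so N = ceil(1/epsilon) works.
    return math.ceil(1 / epsilon)

for eps in [0.5, 0.1, 0.01]:
    N = threshold(eps)
    # Spot-check the terms just beyond N.
    assert all(abs(1 / n - 0) < eps for n in range(N + 1, N + 1000))
```

For this particular sequence the threshold \(N = \lceil 1/\epsilon \rceil\) can be written in closed form; for a general sequence, the definition only asserts that *some* \(N\) exists.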
I like to think about the \(n\) as describing “time”: \(a_0\) is the value of something right this very moment, \(a_2\) is its value in two hours, \(a_{48}\) is its value in two days, etc. Thus saying that \(a_n\) converges to \(L\) is really the same as saying “\(a_n\) is eventually close to \(L\)”. The “close to \(L\)” is quantified by the \(\epsilon\), and the “eventually” is quantified by the \(N\). I think talking through arguments with this perspective helps me identify logical heuristics that can later be formalised with the above definition. Here are two examples of this:
Exercise 2.
Suppose \(a_n\) and \(b_n\) are two sequences of real numbers. Suppose \(\lim _{n\to\infty }a_n = A\) and \(\lim _{n\to\infty} b_n = B\) for some real numbers \(A\) and \(B\). Show that \[\lim _{n\to\infty} \left( a_n + b_n \right) = A + B. \]
The heuristic is, if I wait long enough, \(a_n\) will be close to \(A\) and \(b_n\) will be close to \(B\). If I wait so long that both of these happen, then \(a_n + b_n\) will be close to \(A + B\).
Proof
Let \(\epsilon > 0\). We wish to show that there exists some \(N\) such that for all \(n > N\), \[\left\lvert \left( a_n + b_n \right) - (A+B) \right\rvert < \epsilon. \]
For the same \(\epsilon > 0\), we know by definition that there exists \(N_A\) such that for all \(n > N_A\), \[\left\lvert a_n - A \right\rvert < \epsilon. \] Likewise, there exists some \(N_B\) such that for all \(n > N_B\), \[\left\lvert b_n - B \right\rvert < \epsilon .\] Therefore, if \(N = \max \left( N_A, N_B \right)\), for all \(n > N\), we have \[\begin{align*} \left\lvert \left( a_n + b_n \right) - (A+B) \right\rvert &= \left\lvert \left( a_n - A \right) + \left( b_n - B \right) \right\rvert \\ & \leq \left\lvert a_n - A \right\rvert + \left\lvert b_n - B \right\rvert \\ & < \epsilon + \epsilon \\ & = 2 \epsilon . \end{align*}\]
Oh shucks, we’re off by a factor of \(2\). We now replace \(\epsilon \) with \(\frac{\epsilon }{2}\) and repeat the argument! \(\square\)
(Actually, the factor of \(2\) doesn’t matter…)
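The \(\epsilon/2\) bookkeeping can also be mirrored numerically. In this sketch (the sequences and values are my own arbitrary choices), \(a_n = \frac{1}{n} \to 0\) and \(b_n = 3 + \frac{2}{n} \to 3\), and the threshold for the sum is built exactly as in the proof: take each sequence’s threshold at \(\epsilon/2\), then the max.

```python
import math

# a_n = 1/n -> A = 0 and b_n = 3 + 2/n -> B = 3 (arbitrary example sequences).
def threshold_a(eps):
    # |1/n - 0| < eps once n > 1/eps.
    return math.ceil(1 / eps)

def threshold_b(eps):
    # |(3 + 2/n) - 3| = 2/n < eps once n > 2/eps.
    return math.ceil(2 / eps)

def threshold_sum(eps):
    # As in the proof: make each term within eps/2, then take the max.
    return max(threshold_a(eps / 2), threshold_b(eps / 2))

eps = 0.01
N = threshold_sum(eps)  # = max(200, 400) = 400
assert all(abs((1 / n + 3 + 2 / n) - (0 + 3)) < eps
           for n in range(N + 1, N + 1000))
```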
Exercise 3.
Show that the sequence \(a_n = \left( -1 \right)^n\) does not converge to a limit.
Heuristically, if \(a_n\) converges to a limit \(L\), eventually \(a_n\) must be really close to \(L\). But no matter how long I wait, I’ll keep seeing both \(1\) and \(-1\), and these can’t both be close to the same \(L\).
To make this precise, we need to apply the definition of convergence and pick a specific value of \(\epsilon \) to make it work.
Proof
Suppose towards a contradiction that \(a_n \to L\) for some real number \(L\). Applying the definition of a limit with \(\epsilon = 1\), there exists some \(N\) such that for all \(n > N\), \[\left\lvert a_n - L \right\rvert < 1.\] For even numbers \(n > N\), we get \(\left\lvert 1 - L \right\rvert < 1\), so \(0 < L < 2\). But for odd numbers \(n > N\), we get \(\left\lvert -1 - L \right\rvert < 1\), so \(-2 < L < 0\). No real number \(L\) satisfies both simultaneously, so no limit can exist. \(\square\)
Alternatively, one may argue that if \(1\) is close to \(L\) and \(L\) is close to \(-1\), then \(-1\) is close to \(1\)…
Unfortunately, for lots of sequences you encounter on the street, you don’t immediately know what the limit \(L\) should be, and this can make it quite difficult to prove that it converges to something. For this reason, there is a wonderful theorem that states that a sequence of real numbers converges to a limit if and only if it is a Cauchy sequence.
With the previous example, the sequence \(a_n = \left( -1 \right)^n\) is moving around in an oscillatory way forever, and it never stops moving. Of course a sequence that converges must eventually slow down, settle down, buy a house, etc… and in particular it must settle down near a limiting value.
To actually prove this great theorem, we introduce two new limiting concepts: the \(\liminf\) and \(\limsup\).
Definition 4.
Let \(a_n\) be a sequence of real numbers. The “limit superior” of \(a_n\) is defined as \[\limsup _{n\to\infty} a_n := \lim _{N\to\infty} \sup \left\lbrace a_n : n \geq N \right\rbrace.\] The “limit inferior” of \(a_n\) is defined as \[\liminf _{n\to\infty}a_n := \lim _{N\to\infty} \inf \left\lbrace a_n : n \geq N \right\rbrace. \]
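Since each tail \(\left\lbrace a_n : n \geq N \right\rbrace\) is an infinite set, a computer can only approximate its supremum, but truncating at a large horizon (my choice below) already shows the behaviour. For \(a_n = (-1)^n \left( 1 + \frac{1}{n} \right)\), the tail suprema decrease towards \(\limsup_{n\to\infty} a_n = 1\) and the tail infima increase towards \(\liminf_{n\to\infty} a_n = -1\):

```python
# Approximate limsup/liminf of a_n = (-1)^n * (1 + 1/n) by computing
# sup/inf over truncated tails {a_n : N <= n < HORIZON}.  The true
# values are limsup = 1 and liminf = -1.
HORIZON = 10_000  # arbitrary truncation; larger = better approximation

def a(n):
    return (-1) ** n * (1 + 1 / n)

def tail_sup(N):
    return max(a(n) for n in range(N, HORIZON))

def tail_inf(N):
    return min(a(n) for n in range(N, HORIZON))

# The tail sups shrink towards 1; the tail infs grow towards -1.
for N in [1, 10, 100, 1000]:
    print(N, tail_sup(N), tail_inf(N))
```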
Now this is quite an unwieldy definition, and it’s quite annoying to work with in practice.
Let’s return to the thought that \(n\) ought to represent time and introduce two new ideas: the ideas of an “eventual upper bound” and an “eventual lower bound”.
Definition 5.
Let \(a_n\) be a sequence of real numbers. \(M\in \mathbb{R}\) is an eventual upper bound of \(a_n\) if there exists some \(N\) such that for all \(n > N\), \(a_n \leq M\). Likewise, \(L\in \mathbb{R}\) is an eventual lower bound if there exists some \(N\) such that for all \(n > N\), \(a_n \geq L\).
In words, I think this idea explains itself: if you wait long enough, you’ll get an upper or lower bound.
Exercise 6.
What are the eventual upper bounds of \(a_n = (-1)^n\cdot \frac{1}{n}\)? What are its eventual lower bounds? How are they related to its limit?
What about \(a_n = (-1)^n\)?
Now you should remember that \(\sup\) means least upper bound and \(\inf\) means greatest lower bound, and one can generalise this idea exactly to \(\liminf\) and \(\limsup\)!
Proposition 7.
\(\limsup _{n\to\infty} a_n\) is the infimum of all eventual upper bounds of \(a_n\). Likewise, \(\liminf _{n\to\infty} a_n\) is the supremum of all eventual lower bounds of \(a_n\).
Let’s prove this identity for the \(\limsup\); I’ll skip proving it for the \(\liminf\).
We will proceed in two steps: first, we’ll show that \(\limsup _{n\to\infty} a_n \leq M\) whenever \(M\) is an eventual upper bound of \(a_n\). Then, we’ll show that if \(X < \limsup _{n\to\infty}a_n\), then \(X\) cannot be an eventual upper bound.
- Suppose \(M\) is an eventual upper bound of \(a_n\). Then, there exists some \(N_0\) such that for all \(n > N_0\), \(a_n \leq M\). In other words, \(M\) is an upper bound of each of the sets \[\left\lbrace a_n : n \geq N \right\rbrace \subseteq \left\lbrace a_n : n > N_0 \right\rbrace\] for all \(N > N_0\)! In particular, for all \(N > N_0\), \(\sup \left\lbrace a_n : n \geq N \right\rbrace \leq M\). We conclude that \(\limsup _{n\to\infty} a_n \leq M\). (How?)
- Suppose \(X < \limsup _{n\to\infty}a_n\).
Let \(\epsilon \) be the positive difference between the two.
Since the tail suprema \(\sup \left\lbrace a_n : n \geq N \right\rbrace\) converge to the \(\limsup\) by definition, there exists some \(N_0\) such that for all \(N > N_0\),
\[\left\lvert \sup \left\lbrace a_n : n \geq N \right\rbrace - \limsup _{n\to\infty} a_n \right\rvert < \frac{\epsilon }{2}.\] In particular, this implies that \(\sup \left\lbrace a_n : n \geq N \right\rbrace > X + \frac{\epsilon }{2}\) for all \(N > N_0\)! (Think about this step.) This in particular means that \(X\) is not an upper bound of the set \(\left\lbrace a_n : n \geq N \right\rbrace\) for any \(N > N_0\). Therefore \(X\) is not an eventual upper bound of \(a_n\). \(\square\)
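Proposition 7 can also be sanity-checked at a finite horizon (again my own sketch, for \(a_n = (-1)^n\left( 1 + \frac{1}{n} \right)\) with \(\limsup_{n\to\infty} a_n = 1\)): every \(M > 1\) eventually bounds the sequence from above, while every \(M < 1\) keeps being violated by the even-indexed terms.

```python
# Finite-horizon check of the proposition for a_n = (-1)^n * (1 + 1/n),
# whose limsup is 1: M > 1 is an eventual upper bound, M < 1 is not.
HORIZON = 10_000  # truncation point, chosen arbitrarily

def a(n):
    return (-1) ** n * (1 + 1 / n)

def eventually_bounded_by(M):
    # Find the last n < HORIZON where a_n > M; if violations stop well
    # before the horizon, M looks like an eventual upper bound.
    last_violation = 0
    for n in range(1, HORIZON):
        if a(n) > M:
            last_violation = n
    return last_violation < HORIZON - 2

assert eventually_bounded_by(1.05)      # 1 + 1/n <= 1.05 once n >= 20
assert not eventually_bounded_by(0.95)  # even-indexed terms exceed 0.95 forever
```

Of course, a finite scan can never *prove* an eventual bound; it only fails to find a late violation, which is why the infimum in the proposition has to be taken over genuinely eventual bounds.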
Question 8.
Is \(\limsup _{n\to\infty} a_n\) itself an eventual upper bound of \(a_n\) in general? Prove this or give a counterexample.