3. Sequences and Limits
In lecture, we briefly introduced the rather intuitive notion of what it means for a sequence of numbers to approach a limit. Recall:
Definition 1.
A sequence of real numbers \(\left( a_n \right) _{n=1}^{\infty}\) approaches a limit \(L\) if for all \(\epsilon > 0\), there exists some \(N_ \epsilon \) such that for all \(n > N_ \epsilon \), \(\left\lvert a_n-L \right\rvert < \epsilon \).
We denote \(L = \lim _{n\to\infty} a_n\).
In words, the sequence \(\left( a_n \right)\) approaches \(L\) if it gets arbitrarily close to \(L\) as \(n\) gets very large. Today’s focus is going to be developing and verifying properties of limits. Most of these should hopefully be intuitive properties that one would come to expect, but it’s important to verify that a bulky abstract definition is compatible with our intuition.
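To make the definition concrete, here is a small numerical sketch (my own illustration, not part of the notes) using the assumed example sequence \(a_n = (n+1)/n\), which approaches \(L = 1\). A finite computer check is of course not a proof — it only builds intuition for how \(N_\epsilon\) depends on \(\epsilon\).

```python
# Numerical sketch of the epsilon-N definition for the illustrative
# sequence a_n = (n + 1) / n, whose limit is L = 1.

def a(n):
    return (n + 1) / n

L = 1.0

for eps in [0.1, 0.01, 0.001]:
    # For this sequence |a_n - L| = 1/n, so N = ceil(1/eps) works.
    N = int(1 / eps) + 1
    # Spot-check a stretch of indices past N.
    assert all(abs(a(n) - L) < eps for n in range(N + 1, N + 5000))
    print(f"eps = {eps}: every checked n > {N} satisfies |a_n - L| < eps")
```

Note how smaller \(\epsilon\) forces a larger cutoff \(N_\epsilon\) — that dependence is the heart of the definition.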
First of all, a sequence should only be able to approach a single limit at a time, i.e. it can’t get arbitrarily close to two distinct values.
Proposition 2. Uniqueness of Limits
Suppose \(\lim _{n\to\infty} a_n\) exists. Then it is unique.
As with almost every uniqueness proof, we’ll begin by assuming that \(L_1\) and \(L_2\) are two (possibly different) numbers that satisfy the definition of the limit, then show that \(L_1=L_2\).
Intuitively, if \(a_n\) gets arbitrarily close to both \(L_1\) and \(L_2\), the two can’t be far apart.
Proof
Suppose \(L_1\) and \(L_2\) both satisfy the definition of the limit, and let \(\epsilon > 0\). Then there exist \(N_1\) and \(N_2\) such that \(\left\lvert a_n - L_1 \right\rvert < \epsilon /2\) for all \(n > N_1\), and \(\left\lvert a_n - L_2 \right\rvert < \epsilon /2\) for all \(n > N_2\). For any \(n > \max \left\lbrace N_1, N_2 \right\rbrace\), the triangle inequality gives \[\left\lvert L_1 - L_2 \right\rvert = \left\lvert L_1 - a_n + a_n - L_2 \right\rvert \leq \left\lvert L_1 - a_n \right\rvert + \left\lvert a_n - L_2 \right\rvert < \frac{\epsilon }{2} + \frac{\epsilon }{2} = \epsilon .\] Since \(\left\lvert L_1 - L_2 \right\rvert < \epsilon \) for every \(\epsilon > 0\), we must have \(\left\lvert L_1 - L_2 \right\rvert = 0\), i.e. \(L_1 = L_2\). \(\square \)
I should remark that this technique of “inserting zero” before using the triangle inequality is not an uncommon one, and it’s a way to formulate the heuristic we outlined before the proof.
Be warned that not every sequence has a limit. This may be obvious from a picture, but proving it sometimes takes a bit of effort. One nice fact is that every sequence that does have a limit must be bounded. Therefore, no unbounded sequence can have a limit.
Exercise 3.
Let \(\left( a_n \right)\) be a sequence, and suppose \(L=\lim _{n\to\infty}a_n\) exists. Then \(\left( a_n \right)\) is a bounded sequence.
Solution
We should use the assumption that \(L\) exists. By taking \(\epsilon = 1\) (any number would work, really) in the definition of a limit, there exists an integer \(N\) such that \(n > N\) implies \[\left\lvert a_n - L \right\rvert < 1 \implies \left\lvert a_n \right\rvert < \left\lvert L \right\rvert + 1,\] where the second inequality follows from the triangle inequality. Thus, everything past the first \(N\) terms of the sequence is bounded by \(\left\lvert L \right\rvert + 1\).
However, the first \(N\) terms of the sequence are bounded, too — there’s only finitely many dudes, so there’s no way for them to get arbitrarily large. Specifically, take \[M = \max \left\lbrace \left\lvert a_1 \right\rvert, \ldots, \left\lvert a_N \right\rvert, \left\lvert L \right\rvert + 1 \right\rbrace.\] Then, \(\left\lvert a_n \right\rvert \leq M\) for all \(n \leq N\), and \(\left\lvert a_n \right\rvert < \left\lvert L \right\rvert + 1 \leq M\) for all \(n > N\). Thus \(\left\lvert a_n \right\rvert \leq M\) for all \(n\), and the sequence is bounded. \(\square \)
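The two-part bound in this proof can be traced numerically. The sketch below (an illustration of mine, with an assumed example sequence \(a_n = 100/n\), limit \(L = 0\)) mirrors the proof: take \(\epsilon = 1\) to bound the tail, then take a max over the finitely many early terms.

```python
# Tracing the bound from Exercise 3 on the assumed example a_n = 100 / n,
# which converges to L = 0.

def a(n):
    return 100 / n

L = 0.0

# Taking eps = 1 in the definition: |a_n - L| < 1 whenever n > N = 100.
N = 100

# The bound from the proof: max of the early terms and |L| + 1.
M = max([abs(a(n)) for n in range(1, N + 1)] + [abs(L) + 1])

# M bounds every term we care to check (here M = 100, from the term a_1).
assert all(abs(a(n)) <= M for n in range(1, 20000))
print(f"M = {M}")
```

Notice that the final bound \(M\) comes from an early term of the sequence, not the tail — exactly why the finite max over \(\left\lvert a_1 \right\rvert, \ldots, \left\lvert a_N \right\rvert\) is needed.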
But there are also bounded sequences that don’t have limits, and a hopefully familiar example is:
Exercise 4.
Show that the sequence \(a_n = \left( -1 \right)^n\) does not have a limit.
Boundedness is thus a “necessary but not sufficient” condition for a sequence to have a limit — every convergent sequence must be bounded, but the converse is not true (by the above). Of course, a sequence that’s merely bounded may still oscillate and flail about wildly the same way \(\left( -1 \right)^n\) does! Thus, it makes sense to believe that sequences with limits should not flail about too wildly in the long run — like well-adapted members of society, they should eventually settle down (even if it takes a long time).
Exercise 5.
Let \(\left( a_n \right)\) be a convergent sequence. Show that for all \(\epsilon > 0\), there exists some \(N_ \epsilon \) such that for all indices \(n, m > N _ \epsilon \), \[\left\lvert a_n - a_m \right\rvert < \epsilon .\]
This criterion on how much sequences can jiggle around is called the Cauchy criterion, named after Augustin-Louis Cauchy (like too many other things in math). It turns out that this is a necessary and sufficient condition: sequences of real numbers converge if and only if they don’t flap around wildly in the long run. Proving this is somewhat difficult, however, and I’ll leave that up to the textbook and/or the lectures.
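The contrast between settled and flailing sequences can be seen numerically. Below is a quick sketch of mine (not a proof — it only samples a finite window of a tail) comparing the tail oscillation of the convergent sequence \(1/n\) with that of \(\left( -1 \right)^n\).

```python
# Numerical sketch of the Cauchy criterion: convergent sequences have
# arbitrarily small tail oscillation, while (-1)^n keeps jumping by 2
# no matter how far out we look.

def tail_oscillation(a, N, window=1000):
    """Largest |a_n - a_m| over a finite sample of indices n, m > N."""
    vals = [a(n) for n in range(N + 1, N + window)]
    return max(vals) - min(vals)

print(tail_oscillation(lambda n: 1 / n, N=10**6))      # tiny
print(tail_oscillation(lambda n: (-1) ** n, N=10**6))  # exactly 2
```

Of course, smallness over one finite window proves nothing; the Cauchy criterion demands this for *every* \(\epsilon\), over *all* pairs of indices past \(N_\epsilon\).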
Exercise 6. Iterated Limits
Your job is to construct a function \(a : \mathbb{N}\times \mathbb{N}\to \mathbb{R}\) so that the following properties hold. For ease of communication, we can list out the values of \(a\) in a table, and \(a(n, m)\) can be the entry in the \(n\)-th column and \(m\)-th row.
- The horizontal limits \(H_m = \lim _{n\to\infty} a(n, m)\) exist for every row \(m\).
- The vertical limits \(V_n = \lim _{m\to\infty} a(n, m)\) exist for every column \(n\).
- The limits \(\lim _{n\to\infty }V_n\) and \(\lim _{m\to\infty} H_m\) both exist but are not equal to each other.
The point of this last exercise is to demonstrate that the order in which you take limits matters, if there is more than one limit. That is, in general \[\lim _{n\to\infty} \left( \lim _{m\to\infty} a \left( n,m \right) \right) \neq \lim _{m\to\infty}\left( \lim _{n\to\infty} a \left( n,m \right) \right).\]
This should hopefully remind you of when you learned about taking limits in multivariable calculus: multi-dimensional limits need to be considered over all paths to the limit point. The above exercise can be thought of as a sequential analogue of this principle.
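As a concrete illustration of the principle, here is a numerical sketch of one standard construction — my choice of \(a(n, m) = n / (n + m)\) is just one of many possible answers to the exercise:

```python
# One possible construction for Exercise 6 (a standard example; the
# exercise admits many answers): a(n, m) = n / (n + m).

def a(n, m):
    return n / (n + m)

# For fixed m, a(n, m) -> 1 as n grows, so every row limit is H_m = 1.
# For fixed n, a(n, m) -> 0 as m grows, so every column limit is V_n = 0.
# Hence lim_m H_m = 1 while lim_n V_n = 0: the iterated limits differ.

big = 10**9
print(a(big, 5))  # numerically close to H_5 = 1
print(a(5, big))  # numerically close to V_5 = 0
```

The table of values makes the asymmetry vivid: moving right along any row, the entries climb toward \(1\); moving down any column, they sink toward \(0\).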