Hunter Liu's Website

8. Week 8: Orders of Zeroes

≪ 7. Week 7: Power Series | Table of Contents | 9. Week 9: Laurent Series and Singularities ≫

Last week, we talked about how nice the power series of holomorphic functions are: you can always take a function to be equal to its own Taylor series, right up until there’s a bad singularity (i.e. a “vertical asymptote”).

At the end, we mentioned that one application of power series was to more precisely capture the idea of the “order” of a zero. When working with a (complex) polynomial \(p(z)\), there is a notion of the multiplicity of a zero. One can always factor \(p\) into linear terms: \[p(z) = c\left( z-z_1 \right) ^{n_1} \left( z - z_2 \right) ^{n_2}\cdots\] with a nonzero constant \(c\) and with \(z_1,z_2,\ldots\) distinct. These powers \(n_1, n_2,\ldots\) are the multiplicities of these zeroes. \(p\left( z_1 \right) = 0\), but more than that, it’s as if \(p \left( z_1 \right) = 0 ^{n_1}\): you can “factor out” \(n_1\) copies of this zero! (In linear algebra, one can describe the algebraic and geometric multiplicity of an eigenvalue in much the same way: how many times can you “factor out” the eigenvalue?)
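
For a concrete example, \(p(z) = z^3 - z^2 = \left( z - 0 \right) ^{2} \left( z - 1 \right)\) has a zero of multiplicity \(2\) at \(z = 0\) and a zero of multiplicity \(1\) at \(z = 1\): you can factor out two copies of \(z\), but not a third.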

While the underlying algebra is slightly more complicated, one can just as well develop a theory of the “multiplicity” of a zero of a complex analytic function through its power series.

Orders of Zeroes

Suppose \(D\subseteq \mathbb{C}\) is a domain and \(f:D\to \mathbb{C}\) is an analytic function that is not identically zero. Let \(z_0\in D\) be such that \(f\left( z_0 \right)=0\); then \(z_0\) is a zero of \(f\). Writing out the power series of \(f\) centred at \(z_0\), we have \[f\left( z \right) = \sum _{n=0}^{\infty} a_n \left( z-z_0 \right)^n.\] Plugging in \(z_0\) gives \(a_0 = f\left( z_0 \right) = 0\), so the first term of the power series vanishes. It’s entirely possible that the \(z-z_0\) coefficient is zero as well, but at some point there has to be something nonzero: if every coefficient vanished, then \(f\) would be identically zero near \(z_0\) (and, in fact, on all of \(D\)).

More precisely, let \(n_0\) be the smallest positive integer such that \(a _{n_0}\neq 0\). Then, \[f\left( z \right) = a _{n_0} \left( z-z_0 \right) ^{n_0} + a _{n_0+1} \left( z-z_0 \right) ^{n_0 + 1 }+\cdots = \sum _{n=n_0}^{\infty} a_n \left( z-z_0 \right) ^n.\]

We call \(n_0\) the order of the zero at \(z_0\).

This mimics the properties of the order of a zero of a polynomial: we can “factor out” \(n_0\) copies of \(\left( z-z_0 \right)\) from the power series and write \[f(z) = \left( z-z_0 \right) ^{n_0} \left( a _{n_0} + a _{n_0 + 1} \left( z-z_0 \right) + \cdots \right) = \left( z-z_0 \right) ^{n_0} \sum _{n=0}^{\infty} a _{n_0+n} \left( z-z_0 \right)^n.\] The remaining power series is analytic wherever it converges, so \(\frac{f(z)}{\left( z-z_0 \right) ^{n_0}}\) defines an analytic function! In other words, we can “divide out” \(n_0\) zeroes (and no more) and still be analytic.
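
As a quick illustration, \(\sin z\) has a zero of order \(1\) at \(z_0 = 0\): its power series is \[\sin z = z - \frac{z^3}{6} + \frac{z^5}{120} - \cdots = \left( z - 0 \right) ^{1} \left( 1 - \frac{z^2}{6} + \frac{z^4}{120} - \cdots \right),\] and the series in parentheses is analytic and equals \(1\) at the origin, so \(\frac{\sin z}{z}\) extends to an analytic function near \(0\).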

There are some really nice algebraic properties of how the orders of common zeroes of different functions interact with each other.

Exercise 1.

Let \(f\) and \(g\) be two analytic functions. Let \(n_f\) and \(n_g\) be the orders of \(f\) and \(g\) at a point \(z_0\), respectively. Then,

  1. The order of \(f(z)\cdot g(z)\) at \(z_0\) is exactly \(n_f+n_g\).
  2. The order of \(f(z) \pm g(z)\) at \(z_0\) is at least \(\min \left( n_f, n_g \right)\).
  3. If \(n_g < n_f\), then \(\frac{f(z)}{g(z)}\) is analytic in a neighbourhood of \(z_0\), and its order is exactly \(n_f - n_g\).

Do not try to prove these by taking many, many derivatives. There is not enough time in the universe to do such things. Instead, for the first two claims, you should write \(f\) and \(g\) as power series to find the leading term of the power series of their product and sum. For the third claim, one should first factor out the zeroes of \(f\) and \(g\) and work it out from there; the same factoring trick is sketched for the product below.
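
As a sketch of the first claim (with the same factorisation useful for the others): factoring out the zeroes as above, write \[f(z) = \left( z-z_0 \right) ^{n_f} F(z), \qquad g(z) = \left( z-z_0 \right) ^{n_g} G(z),\] where \(F\) and \(G\) are analytic near \(z_0\) with \(F\left( z_0 \right)\neq 0\) and \(G\left( z_0 \right)\neq 0\). Then \[f(z)g(z) = \left( z-z_0 \right) ^{n_f+n_g} F(z)G(z),\] and \(F\left( z_0 \right)G\left( z_0 \right)\neq 0\), so the product has a zero of order exactly \(n_f+n_g\) at \(z_0\).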

How to Compute???

One way to compute the order of a zero is to use Taylor’s theorem: we know that these coefficients are given by \(a_n = \frac{1}{n!}\cdot f ^{(n)} \left( z_0 \right)\), so \(a_n=0\) if and only if \(f ^{(n)}\left( z_0 \right) = 0\). Thus, the order \(n_0\) of a zero is the smallest \(n\) for which \(f ^{(n)}\left( z_0 \right) \neq 0\): it is the number of derivatives we have to take before we get something nonzero.
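
For instance, \(f(z) = z e^z\) vanishes at \(z_0 = 0\), and \(f'(z) = \left( 1+z \right) e^z\) gives \(f'(0) = 1 \neq 0\), so \(f\) has a zero of order \(1\) at the origin.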

However, this is profoundly cumbersome: some second derivatives can already be very annoying to compute and evaluate. In addition, this gives us no help in computing the order of a pole. Instead, we should often think about using power series, together with the algebraic properties from Exercise 1, to help us determine the order of a zero.

The following was a problem on the homework, graded for completion.

Example 2.

Find all the zeroes of \(f(z) = \frac{\cos z - 1}{z}\) and determine their orders.

To solve this, we use the fact that \(f(z) = 0\) implies \(\cos z = 1\). This happens exactly at the points \(z=2\pi n\) for \(n\in \mathbb{Z}\): writing \(z = x+iy\), we have \(\cos z = \cos x \cosh y - i \sin x \sinh y\), and if \(y\neq 0\), the imaginary part can only vanish when \(\sin x = 0\), in which case \(\cos x \cosh y = \pm \cosh y \neq 1\). So all the solutions are real, and \(\cos x = 1\) exactly when \(x = 2\pi n\).

You can certainly compute two derivatives of \(\cos\) if you really wanted to. However, you could also use power series to solve this! At \(z=0\) (corresponding to \(n=0\)), the numerator has the power series \[\sum _{k=0}^{\infty} \frac{(-1)^k z ^{2k}}{(2k)!} - 1 = - \frac{z^2}{2} + \frac{z^4}{24} - \frac{z^6}{720} + \cdots\] Thus, the numerator has a zero of order \(2\). On the other hand, the denominator has a zero of order \(1\) at \(z=0\), so \(f\) has a zero of order \(2-1=1\) at \(z=0\). (Strictly speaking, \(f\) is not defined at \(z=0\) as written, but it extends analytically across \(0\), exactly as in the factoring discussion above; the extension has this zero.)
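
Dividing through by \(z\) makes this explicit: \[f(z) = \frac{\cos z - 1}{z} = -\frac{z}{2} + \frac{z^3}{24} - \frac{z^5}{720} + \cdots,\] whose leading term is \(-\frac{z}{2}\), so the zero at the origin indeed has order \(1\).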

To find the power series of \(\cos\) centred at \(2\pi n\), we have \[\cos\left( z \right) = \cos \left( z - 2\pi n \right) = \sum _{k=0}^{\infty} \frac{(-1)^k \left( z-2\pi n \right) ^{2k}}{(2k)!}.\] Thus, the numerator of \(f\) has the power series expansion \[\cos z - 1 = - \frac{\left( z-2\pi n \right) ^{2}}{2} + \frac{\left( z-2\pi n \right)^4}{24} - \frac{\left( z-2\pi n \right)^6}{720}+\cdots\] So, \(\cos z - 1\) has a zero of order \(2\) at every \(2\pi n\). However, when \(n\neq 0\), \(z\) has a zero of order \(0\) at \(2\pi n\) (i.e., it’s nonzero), so \(f\) has a zero of order \(2-0 = 2\). \(\square\)

The point is, analysing the power series of a function is often a relatively quick and reliable way of finding the order of a zero, especially when you don’t know in advance what the order of said zero should be. This is particularly helpful in theoretical applications, where one does not always have knowledge of a function’s derivatives a priori. That said, there are times when computing power series is just as painful as computing derivatives, if not more so, so tread carefully.

Practise!

Exercise 3. Textbook V.7.1c

Determine the order of each zero of the function \(f(z) = z^2\sin z\).

Exercise 4.

Determine the order of each zero of the function \(f(z) = e ^{z^2} - 1\).

Hint
In this case, you probably shouldn’t try using a power series.

Exercise 5.

Suppose \(f(z)\) is an analytic function with a zero of order \(n\) at the point \(z_0\). Show that there exists a function \(g(z)\), defined in an open neighbourhood of \(z_0\), such that \(f(z) = g(z) ^n\) whenever \(g\) is defined. That is, “\(f(z) ^{\frac{1}{n}}\)” is holomorphic near \(z_0\).

Hint
First factor out the \(n\) zeroes of \(f\), then take an \(n\)-th root of the remaining quotient. Can you carefully select a branch cut?