Infinite Summation

Infinite summation over arbitrary index sets can be described using nets, but there is an elementary way of talking about these sums which is equivalent to using nets. Its simplicity also shows that there is nothing mysterious or terribly subtle about the subject of infinite summation, as talk of nets might imply. The material in today's post is taken nearly wholesale from Paul Halmos's excellent book Introduction to Hilbert Space and the Theory of Spectral Multiplicity, which as far as I can tell is out of print.

Definition. Let X be a Banach space. We say that a family of vectors \{x_i\}_{i\in I} is summable if there exists a vector x\in X such that for all \varepsilon>0 there exists a finite set F\subseteq I such that for every finite set G with F\subseteq G\subseteq I, we have \lVert \sum_{i\in G} x_i-x\rVert <\varepsilon. We write \sum_{i\in I}x_i=x.
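To make the definition concrete, here is a small numerical sketch (my own illustration, not from Halmos) using the summable family x_i = 2^{-i} over I = \mathbb{N}, whose sum is 1. Once F = \{1,\dotsc,N\} is chosen with 2^{-N} < \varepsilon, every finite superset G of F sums to within \varepsilon of 1, since the missing and extra terms are all buried in the tail:

```python
import random

def family(i):
    # the summable family x_i = 2^{-i} over I = {1, 2, 3, ...}; its sum is 1
    return 2.0 ** -i

eps = 1e-6
N = 21                      # chosen so that 2^{-N} < eps
F = set(range(1, N + 1))    # the finite set F from the definition

random.seed(0)
for _ in range(200):
    # G: an arbitrary finite superset of F inside I
    G = F | {random.randint(N + 1, 10**6) for _ in range(50)}
    assert abs(sum(family(i) for i in G) - 1.0) < eps
print("every sampled finite G containing F sums to within eps of 1")
```

Of course a random sample of supersets proves nothing; it merely shows the quantifier structure of the definition in action.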

Of course \mathbb{R} is a (real) Banach space and \mathbb{N} is a perfectly good indexing set. We recover our usual definition of summation if we restrict our attention to G of the form \{1,2,\dotsc,n\}. This means that our definition of summation is more restrictive than the usual one for sequences; in fact, it is equivalent to \sum_{i=1}^{\infty}x_{\sigma(i)} being a convergent series for every permutation \sigma. Equivalently, by the Riemann rearrangement theorem, \{x_i\}_{i\in \mathbb{N}} is summable (in our sense) if and only if \sum_{i=1}^{\infty} x_i is absolutely convergent. Considering the advantages of absolute convergence over mere convergence, and the fact that conditional convergence makes little intuitive sense, it is tempting to prefer this definition over the traditional one. Alas, there are too many practical difficulties with this definition to introduce it in a calculus course.
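The failure of conditionally convergent series to be summable in this sense is easy to see numerically. The sketch below (my own illustration; the greedy scheme is the standard one from the proof of the Riemann rearrangement theorem) steers the terms of the alternating harmonic series toward a target of 1 rather than its usual sum \ln 2:

```python
from itertools import count

def alternating_harmonic(n):
    # partial sum of the alternating harmonic series 1 - 1/2 + 1/3 - ...
    return sum((-1) ** (i + 1) / i for i in range(1, n + 1))

def rearranged_partial(target, steps):
    # greedily rearrange the same terms: add positive terms while the
    # running sum is at or below `target`, negative terms otherwise
    pos = (1.0 / k for k in count(1, 2))    # 1, 1/3, 1/5, ...
    neg = (-1.0 / k for k in count(2, 2))   # -1/2, -1/4, -1/6, ...
    s = 0.0
    for _ in range(steps):
        s += next(pos) if s <= target else next(neg)
    return s

print(alternating_harmonic(100_000))       # close to ln 2 ≈ 0.6931
print(rearranged_partial(1.0, 100_000))    # close to 1.0, same terms
```

Since rearranging the terms changes the limit, no single x can serve as the sum over all finite supersets of a given F, so the family is not summable.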

We immediately have that linearity is preserved: \alpha\sum_{i\in I}x_i=\sum_{i\in I}\alpha\cdot x_i and \sum_{i\in I}x_i+\sum_{i\in I}y_i=\sum_{i\in I}(x_i+y_i), which follows from the fact that these identities hold when I is finite. (Though technically, one must first show that the sum, when it exists, is unique; this follows by a standard \varepsilon/2 argument with the triangle inequality.)

One cannot go without some analogue of a Cauchy criterion, and this one is quite slick:

Theorem (Cauchy Criterion). A family of vectors \{x_i\} is summable if and only if for all \varepsilon>0, there exists a finite set F\subseteq I such that for any finite set G\subseteq I such that F\cap G=\varnothing, we have \lVert \sum_{i\in G} x_i \rVert < \varepsilon.

The proof of this is not so bad, and it is enlightening. One direction is straightforward. Suppose \{x_i\} is summable with sum x, and let \varepsilon>0. Take a finite set F\subseteq I such that for any finite F'\supseteq F one has \lVert\sum_{i\in F'}x_i-x\rVert <\varepsilon/2. Given a finite set G disjoint from F, consider F'=F\cup G; we then have \lVert\sum_{i\in G}x_i\rVert=\lVert\sum_{i\in F'}x_i-\sum_{i\in F}x_i\rVert\leq \lVert\sum_{i\in F'}x_i-x\rVert+\lVert x-\sum_{i\in F}x_i\rVert<\varepsilon/2+\varepsilon/2=\varepsilon. So that proves one direction.

Now suppose the Cauchy condition holds. For each n\geq 1, let F_n\subseteq I be a finite set such that whenever G is finite and G\cap F_n=\varnothing, we have \lVert\sum_{i\in G} x_i\rVert<1/n. By replacing F_n with F_1\cup F_2\cup \dotsb \cup F_n, we may assume that F_{n-1}\subseteq F_{n} for all n. (Halmos makes the observation here that \bigcup_{n\geq 1} F_n is countable, and that if i\not\in \bigcup_{n\geq 1} F_n, then \lVert x_i\rVert<1/n for every n, so x_i=0; hence at most countably many of the x_i are nonzero.) If m>n, then F_m\setminus F_n is disjoint from F_n, so \lVert \sum_{i\in F_m} x_i - \sum_{i\in F_n} x_i\rVert = \lVert \sum_{i\in F_m\setminus F_n} x_i \rVert<1/n. Thus the sequence \left(\sum_{i\in F_n} x_i\right)_{n\geq 1} is Cauchy and, by the completeness of X, converges to some x\in X. Finally, if we take any finite G\supseteq F_n, then \lVert \sum_{i\in G} x_i -x\rVert\leq \lVert \sum_{i\in F_n} x_i-x\rVert + \lVert \sum_{i\in G\setminus F_n} x_i\rVert; the former tends to zero as n\to\infty and the latter is less than 1/n, so \sum_{i\in I}x_i=x.
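As a quick numerical sanity check of the criterion (again my own illustration), take the summable family x_i = 2^{-i} over I = \mathbb{N}. For a given \varepsilon, the set F = \{1,\dotsc,N\} with 2^{-N} < \varepsilon works: any finite G disjoint from F consists of indices larger than N, so \lVert\sum_{i\in G}x_i\rVert is bounded by the tail sum 2^{-N}:

```python
import random

def x(i):
    # the summable family x_i = 2^{-i} over I = {1, 2, 3, ...}
    return 2.0 ** -i

eps = 1e-5
N = 17                      # chosen so that 2^{-N} < eps
F = set(range(1, N + 1))    # the finite set F from the Cauchy criterion

random.seed(0)
for _ in range(200):
    # a finite G disjoint from F: all of its indices exceed N
    G = {random.randint(N + 1, 10**6) for _ in range(40)}
    assert abs(sum(x(i) for i in G)) < eps
print("every sampled finite G disjoint from F has sum below eps")
```

The worst case is G consisting of consecutive indices just past N, whose sum approaches but never reaches 2^{-N}.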

So I readily admit that I have added nothing new to what Halmos wrote, but I feel that this material is not as widely known as it should be. Certainly one can generalize this using nets, but the simple approach here works well and requires no exposition of directed sets, nor of the difficulties that arise when dealing with more general nets.