An infinite series is an infinite sum of terms. Here are some examples:
The harmonic series: 1 + 1/2 + 1/3 + 1/4 + ...
The geometric series: 1 + x + x^2 + x^3 + ...
The alternating harmonic series: 1 - 1/2 + 1/3 - 1/4 + ...
The exponential series: 1 + x + x^2/2! + x^3/3! + ...
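As a quick numerical illustration (the function name below is our own, not from the text), the partial sums of the exponential series approximate e^x, and the approximation improves rapidly as more terms are taken:

```python
import math

# Partial sums of the exponential series 1 + x + x^2/2! + x^3/3! + ...
# approximate e^x. (Helper name is illustrative.)

def exp_series(x, n_terms):
    total, term = 0.0, 1.0        # term starts at x^0 / 0! = 1
    for k in range(n_terms):
        total += term
        term *= x / (k + 1)       # x^(k+1) / (k+1)! from x^k / k!
    return total

print(exp_series(1.0, 20), math.e)   # nearly equal
```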
We can associate a sequence with any series, called the sequence
of partial sums of the series. The j-th partial sum of a series is defined
to be the sum of its first j terms.
Thus, given the series
a1 + a2 + ... + an + ...
we define sj by
sj = a1 + a2 + ... + aj
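A minimal sketch of computing the partial sums s1, s2, ... of a (truncated) series; the helper name is illustrative, not from the text:

```python
# Compute the partial sums s_1, s_2, ..., s_n of a finite list of terms.

def partial_sums(terms):
    sums, total = [], 0.0
    for a in terms:
        total += a
        sums.append(total)
    return sums

# Partial sums of the geometric series with x = 1/2: 1 + 1/2 + 1/4 + ...
geom = [0.5 ** n for n in range(10)]
print(partial_sums(geom)[-1])   # close to the limit 2
```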
A sequence is said to converge to the value A if, for any positive criterion c, there is an n such that all the terms in the sequence after the n-th lie within distance c of A.
A series is said to converge to A if the sequence of its partial sums converges to A; A is then called the limit of the series.
We distinguish two kinds of convergence of series. A series is said to be absolutely convergent if the series whose terms are the absolute values |aj| of its terms converges. A series with both positive and negative terms can converge even though it is not absolutely convergent.
For example, the harmonic series does not converge at all, so the alternating harmonic series is not absolutely convergent. The alternating harmonic series does, however, converge. The geometric series converges absolutely when |x| < 1 holds. When x = 1 it diverges, and when x = -1 it fails to converge but is, in fact, "summable" in some sense, as we shall see.
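A numerical check of these claims (illustrative, not a proof; the function names are our own): the harmonic partial sums keep growing, while the alternating harmonic partial sums settle down, toward ln 2 in fact:

```python
import math

def harmonic(n):
    """n-th partial sum of 1 + 1/2 + 1/3 + ..."""
    return sum(1.0 / k for k in range(1, n + 1))

def alt_harmonic(n):
    """n-th partial sum of 1 - 1/2 + 1/3 - 1/4 + ..."""
    return sum((-1.0) ** (k + 1) / k for k in range(1, n + 1))

print(harmonic(10), harmonic(10000))       # keeps growing without bound
print(alt_harmonic(10000), math.log(2))    # nearly equal
```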
An absolutely convergent series can be manipulated in many ways without changing its value. Thus, you can rearrange its terms, integrate or differentiate it term by term, and treat it essentially as if it were a finite sum.
A convergent but not absolutely convergent series can be assigned a value, but you must be very careful with it. You must not rearrange its terms; that can change its value to almost anything. In fact, any manipulation of it at all is suspect.
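To see how fragile such a series is, here is a sketch (with made-up helper names) of the standard greedy rearrangement: reorder the terms of the alternating harmonic series so that the partial sums chase a chosen target, and the rearranged series approaches that target instead of the original value:

```python
# Greedily rearrange the terms of 1 - 1/2 + 1/3 - 1/4 + ... so the
# partial sums chase `target`: add the next unused positive term when
# below the target, subtract the next unused negative term when above.

def rearranged_sum(target, n_terms):
    pos = 1       # next positive term is 1/pos (odd denominators)
    neg = 2       # next negative term is -1/neg (even denominators)
    total = 0.0
    for _ in range(n_terms):
        if total < target:
            total += 1.0 / pos
            pos += 2
        else:
            total -= 1.0 / neg
            neg += 2
    return total

print(rearranged_sum(1.5, 100000))   # near 1.5, not the original value
```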
Series that diverge are often quite interesting and even useful, despite that property. This can happen in (at least) four ways as follows:
First, you may be able to assign a value to such a series even though it fails to converge, by using the notion of summability. A series is convergent if its sequence of partial sums converges. It is said to be C1 summable if the sequence whose k-th term is the average of its first k partial sums converges. Thus, the series
1 - 1 + 1 - 1 + ...
has partial sums
1, 0, 1, 0, 1, 0, ...
which do not converge. However, the averages of the first k of them form the sequence
1, 1/2, 2/3, 1/2, 3/5, 1/2, 4/7, 1/2, ...
which converges to 1/2, so this series is C1 summable.
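The averaging step can be sketched in a few lines (the function names are illustrative, not from the text):

```python
# C1 (Cesaro) summation of 1 - 1 + 1 - 1 + ...: average the first k
# partial sums for k = 1, 2, ... and watch the averages converge.

def partial_sums(terms):
    sums, total = [], 0.0
    for a in terms:
        total += a
        sums.append(total)
    return sums

def cesaro_means(terms):
    """Averages of the first k partial sums, for k = 1, 2, ..."""
    sums = partial_sums(terms)
    means, running = [], 0.0
    for k, s in enumerate(sums, start=1):
        running += s
        means.append(running / k)
    return means

grandi = [(-1) ** n for n in range(1000)]   # 1, -1, 1, -1, ...
print(cesaro_means(grandi)[-1])             # converges to 1/2
```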
You can go further by defining C2 and, more generally, Ck summability: the sequence of averages of partial sums of a series is itself the sequence of partial sums of some new series; if that new series is C1 summable, the original series is said to be C2 summable.
30.1 Find a series which is C2 summable but not C1 summable. (Hint: look at the series whose averages of the first k terms are 1, 0, 1, 0, ...)
30.2 Give a plausible definition for Cj summability.
Second, you may notice that the partial sums, sn, of your series differ from some function of n (which goes to infinity as n increases) by an amount that converges. Thus, the harmonic series has n-th partial sum that differs from the function ln(n) by an amount that approaches a constant γ, as we shall discuss below. γ is called Euler's constant.
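A numerical look at this (the helper name is our own): the difference between the n-th harmonic partial sum and ln(n) settles down near γ ≈ 0.5772 as n grows:

```python
import math

# Estimate Euler's constant as H_n - ln(n), where H_n is the n-th
# partial sum of the harmonic series.

def euler_gamma_estimate(n):
    h_n = sum(1.0 / k for k in range(1, n + 1))
    return h_n - math.log(n)

print(euler_gamma_estimate(100))      # roughly 0.58
print(euler_gamma_estimate(100000))   # much closer to 0.5772
```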
Third, when you are dealing with a power series, you may be able to assign meaning to a divergent series by means of any of a number of processes called "analytic continuation", which we shall also discuss.
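The simplest instance of this idea, sketched below with our own helper name, is the geometric series: inside |x| < 1 it sums to 1/(1 - x), and that function, defined for every x other than 1, extends the value of the series beyond its interval of convergence. At x = -1 it gives 1/2, matching the C1 sum of 1 - 1 + 1 - 1 + ... found earlier.

```python
# Inside |x| < 1 the partial sums of 1 + x + x^2 + ... approach 1/(1 - x);
# the function 1/(1 - x) then supplies a value even where the series
# itself diverges (a toy picture of analytic continuation).

def geometric_partial(x, n):
    return sum(x ** k for k in range(n))

x = 0.5
print(geometric_partial(x, 50), 1 / (1 - x))   # agree inside |x| < 1
print(1 / (1 - (-1)))                          # 0.5 at x = -1, where the series diverges
```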
Finally, we sometimes encounter power series whose terms are defined implicitly, and we are interested in the coefficient values independent of any actual value of the argument. Such an entity is called a formal power series, and it can be useful even if it diverges for every possible non-zero argument.
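A sketch of how one can compute with formal power series (the representation and function names here are our own illustration, not from the text): store only the list of coefficients and manipulate the lists directly; no question of convergence ever arises.

```python
# A formal power series represented as its coefficient list
# [a0, a1, a2, ...] for a0 + a1*x + a2*x^2 + ...

def add_series(a, b):
    """Add two formal power series coefficientwise."""
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))
    b = b + [0] * (n - len(b))
    return [x + y for x, y in zip(a, b)]

def multiply_series(a, b):
    """Cauchy product: the coefficient of x^n is sum of a_i * b_(n-i)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

# (1 - x) * (1 + x + x^2 + ... + x^5): all middle coefficients cancel.
ones = [1] * 6
print(multiply_series([1, -1], ones))   # [1, 0, 0, 0, 0, 0, -1]
```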