Conditional convergence
In mathematics, a series or integral is said to be conditionally convergent if it converges, but it does not converge absolutely.
Definition
More precisely, a series of real numbers [math]\displaystyle{ \sum_{n=0}^\infty a_n }[/math] is said to converge conditionally if [math]\displaystyle{ \lim_{m\rightarrow\infty}\,\sum_{n=0}^m a_n }[/math] exists (as a finite real number, i.e. not [math]\displaystyle{ \infty }[/math] or [math]\displaystyle{ -\infty }[/math]), but [math]\displaystyle{ \sum_{n=0}^\infty \left|a_n\right| = \infty. }[/math]
A classic example is the alternating harmonic series given by [math]\displaystyle{ 1 - {1 \over 2} + {1 \over 3} - {1 \over 4} + {1 \over 5} - \cdots =\sum\limits_{n=1}^\infty {(-1)^{n+1} \over n}, }[/math] which converges to [math]\displaystyle{ \ln (2) }[/math], but is not absolutely convergent (see Harmonic series).
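This behavior can be checked numerically. The following Python sketch (standard library only; an illustration, not part of the original text) compares the partial sums of the alternating harmonic series with [math]\displaystyle{ \ln(2) }[/math], and shows that the partial sums of the absolute values grow without bound:

```python
import math

def alternating_harmonic(n):
    """Partial sum of sum_{k=1}^{n} (-1)^(k+1) / k."""
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

def harmonic(n):
    """Partial sum of sum_{k=1}^{n} 1/k -- the absolute values, which diverge."""
    return sum(1.0 / k for k in range(1, n + 1))

print(alternating_harmonic(1_000_000), math.log(2))  # partial sums approach ln 2
print(harmonic(1_000_000))  # grows without bound, roughly ln n + gamma
```

The error of the alternating partial sum after n terms is roughly 1/(2n), so a million terms already agree with ln 2 to about six decimal places, while the harmonic partial sum keeps growing like ln n.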
Bernhard Riemann proved that a conditionally convergent series may be rearranged to converge to any value at all, including ∞ or −∞; see Riemann series theorem. The Lévy–Steinitz theorem identifies the set of values to which a series of terms in [math]\displaystyle{ \mathbb{R}^n }[/math] can converge.
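The idea behind Riemann's proof can be sketched as a greedy procedure: add positive terms until the running sum exceeds the target, then negative terms until it drops below, and repeat; because the positive and negative parts each diverge while the terms shrink to zero, the partial sums oscillate ever more tightly around the target. A minimal Python illustration on the alternating harmonic series (the targets 1.5 and −2 are arbitrary choices):

```python
def rearrange_to(target, n_terms=100_000):
    """Greedily rearrange the alternating harmonic series 1 - 1/2 + 1/3 - ...
    so its partial sums approach `target` (the idea behind Riemann's proof)."""
    pos = 1  # next odd denominator: positive terms 1/1, 1/3, 1/5, ...
    neg = 2  # next even denominator: negative terms -1/2, -1/4, ...
    s = 0.0
    for _ in range(n_terms):
        if s <= target:
            s += 1.0 / pos  # below target: take the next positive term
            pos += 2
        else:
            s -= 1.0 / neg  # above target: take the next negative term
            neg += 2
    return s

print(rearrange_to(1.5))   # partial sum close to 1.5
print(rearrange_to(-2.0))  # same terms in a different order, close to -2
```

After each crossing of the target, the partial sum differs from it by at most the last term used, and those terms tend to zero, which is why the rearranged series converges to the chosen value.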
A typical example of a conditionally convergent integral is the integral of [math]\displaystyle{ \sin (x^2) }[/math] over the non-negative real axis (see Fresnel integral): the integral converges, but the integral of [math]\displaystyle{ \left|\sin (x^2)\right| }[/math] diverges.
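One way to see this numerically is to integrate each "lobe" of [math]\displaystyle{ \sin(x^2) }[/math] between consecutive zeros [math]\displaystyle{ x = \sqrt{k\pi} }[/math]: the signed lobe areas form a shrinking alternating series whose sum approaches [math]\displaystyle{ \sqrt{\pi/8} }[/math], while their absolute values decay only like [math]\displaystyle{ 1/\sqrt{k} }[/math], so their sum diverges. A sketch using a simple composite Simpson's rule (the quadrature routine is an illustrative choice, not part of the original text):

```python
import math

def simpson(f, a, b, n=100):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

f = lambda x: math.sin(x * x)
# Lobe k lies between the consecutive zeros x = sqrt(k*pi) and sqrt((k+1)*pi).
lobes = [simpson(f, math.sqrt(k * math.pi), math.sqrt((k + 1) * math.pi))
         for k in range(2000)]
signed = sum(lobes)                    # approaches sqrt(pi/8) ~ 0.6267
absolute = sum(abs(a) for a in lobes)  # grows like sqrt(k): diverges
print(signed, math.sqrt(math.pi / 8), absolute)
```

Since the lobe areas alternate in sign and decrease, truncating after k lobes leaves an error no larger than the next lobe, about [math]\displaystyle{ 1/\sqrt{k\pi} }[/math], so the signed sum converges only slowly, while the absolute sum grows without bound.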
Original source: https://en.wikipedia.org/wiki/Conditional convergence.