In mathematics, specifically functional analysis, a series is unconditionally convergent if all reorderings of the series converge to the same value. In contrast, a series is conditionally convergent if it converges but different orderings do not all converge to that same value. Unconditional convergence is equivalent to absolute convergence in finite-dimensional vector spaces, but is a weaker property in infinite dimensions.
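For illustration (a classical fact in $\mathbb{R}$, not part of the original text): the alternating harmonic series converges, but a rearrangement taking two positive terms for each negative term converges to a different value, so its convergence is conditional rather than unconditional:
\[
\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} = \ln 2,
\qquad
1 + \tfrac{1}{3} - \tfrac{1}{2} + \tfrac{1}{5} + \tfrac{1}{7} - \tfrac{1}{4} + \cdots = \tfrac{3}{2} \ln 2 .
\]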
Let $X$ be a topological vector space. Let $I$ be an index set and $x_i \in X$ for all $i \in I$.
The series $\textstyle\sum_{i \in I} x_i$ is called unconditionally convergent to $x \in X$ if the indexing set $I_0 := \{ i \in I : x_i \neq 0 \}$ is countable and, for every permutation (bijection) $\sigma : I_0 \to I_0$ of $I_0 = \{ i_k \}_{k=1}^{\infty}$, the following relation holds:
\[
\sum_{k=1}^{\infty} x_{\sigma(i_k)} = x .
\]
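As a quick sanity check of this definition (an illustrative example, not from the original text), take $X = \mathbb{R}$, $I = \mathbb{N}$ and $x_i = 2^{-i}$. The set $I_0 = \mathbb{N}$ is countable, and since the terms are nonnegative, every rearranged series has the same supremum of finite partial sums:
\[
\sum_{k=1}^{\infty} 2^{-\sigma(k)} = \sum_{i=1}^{\infty} 2^{-i} = 1 \qquad \text{for every bijection } \sigma : \mathbb{N} \to \mathbb{N},
\]
so the series is unconditionally convergent to $1$.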
Unconditional convergence is often defined in an equivalent way: a series is unconditionally convergent if for every sequence $(\varepsilon_n)_{n=1}^{\infty}$, with $\varepsilon_n \in \{-1, +1\}$, the series $\textstyle\sum_{n=1}^{\infty} \varepsilon_n x_n$ converges.
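In a Banach space this sign-sequence characterization fits into a standard list of classical equivalences (summarized here under that assumption; they are not spelled out in the text above):
\[
\sum_{n=1}^{\infty} x_{\sigma(n)} \text{ converges for every permutation } \sigma \text{ of } \mathbb{N}
\;\Longleftrightarrow\;
\sum_{n=1}^{\infty} \varepsilon_n x_n \text{ converges for all } \varepsilon_n \in \{-1, +1\}
\;\Longleftrightarrow\;
\sum_{k=1}^{\infty} x_{n_k} \text{ converges for every subsequence } n_1 < n_2 < \cdots .
\]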
If $X$ is a Banach space, every absolutely convergent series is unconditionally convergent, but the converse implication does not hold in general. Indeed, if $X$ is an infinite-dimensional Banach space, then by the Dvoretzky–Rogers theorem there always exists an unconditionally convergent series in this space that is not absolutely convergent. However, when $X = \mathbb{R}^n$, then, by the Riemann series theorem, the series $\textstyle\sum_n x_n$ is unconditionally convergent if and only if it is absolutely convergent.
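A standard textbook example of this gap (added here for illustration, not taken from the text above): in the Hilbert space $\ell^2$ with standard unit vectors $e_n$, the series with terms $e_n / n$ is unconditionally but not absolutely convergent. For any signs $\varepsilon_n \in \{-1, +1\}$ the tails are Cauchy, while the norms of the terms sum to the harmonic series:
\[
\Bigl\| \sum_{n=m}^{N} \varepsilon_n \frac{e_n}{n} \Bigr\|_{\ell^2}^{2} = \sum_{n=m}^{N} \frac{1}{n^{2}} \longrightarrow 0 \quad (m, N \to \infty),
\qquad \text{while} \qquad
\sum_{n=1}^{\infty} \Bigl\| \frac{e_n}{n} \Bigr\|_{\ell^2} = \sum_{n=1}^{\infty} \frac{1}{n} = \infty .
\]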
This article incorporates material from Unconditional convergence on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.