Sequential criterion for limits
A sequence (a_n) of real numbers is said to converge to a real number L if its terms get arbitrarily close to L as the index n increases: for every ε > 0 there is an index N such that |a_n − L| < ε for all n ≥ N. Equivalently, the sequence converges to L if the difference |a_n − L| approaches 0 as n approaches infinity.
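As an illustrative numerical check (not a proof), the sketch below uses a hypothetical example sequence a_n = 1 − 1/n, which converges to L = 1, and prints the shrinking distance |a_n − L|:

```python
# Hypothetical example sequence a_n = 1 - 1/n, which converges to L = 1.
def a(n):
    return 1 - 1 / n

L = 1
# The distance |a_n - L| shrinks toward 0 as the index n grows.
distances = [abs(a(n) - L) for n in (10, 100, 1000, 10000)]
for n, d in zip((10, 100, 1000, 10000), distances):
    print(n, d)
```

A numerical table like this only suggests convergence; the ε–N definition above is what actually proves it.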
Key points:
A sequence (a_n) converges to a number L exactly when the limit of a_n as n approaches infinity equals L; this is the formal definition of convergence for sequences.
Equivalently, (a_n) converges to L if and only if the limit of the difference |a_n − L| as n approaches infinity equals 0.
The sequential criterion for limits connects function limits to sequences: lim_{x→c} f(x) = L if and only if f(x_n) → L for every sequence (x_n) with x_n → c and x_n ≠ c.
The criterion works in both directions: it can be used to establish that a limit exists, or to show that a limit does not exist by exhibiting sequences approaching c along which f behaves differently.
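To illustrate the forward direction of the criterion, the sketch below (an assumed example, not from the original text) evaluates f(x) = sin(x)/x along the particular sequence x_n = 1/n, which tends to 0 with x_n ≠ 0; the values f(x_n) approach the limit 1:

```python
import math

def f(x):
    # f(x) = sin(x)/x is undefined at x = 0, but lim_{x->0} f(x) = 1.
    return math.sin(x) / x

# One sequence x_n = 1/n -> 0 with x_n != 0; by the sequential
# criterion, if the limit exists then f(x_n) must tend to it.
values = [f(1 / n) for n in (1, 10, 100, 1000)]
print(values)
```

Note that checking one sequence only supports the limit; the criterion requires f(x_n) → L for every admissible sequence.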
Examples:
The sequence {1, 1/2, 1/3, 1/4, …} (a_n = 1/n) converges to 0, because the difference |a_n − 0| = 1/n approaches 0 as the index increases.
The sequence {1, 2, 3, 4, …} (a_n = n) diverges, because its terms grow without bound and so cannot approach any real number.
The sequence {1, 1/2, 1/4, 1/8, …} (a_n = 1/2^(n−1)) converges to 0, because the limit of the difference |a_n − 0| as the index approaches infinity is equal to 0.
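The criterion's other direction can show a limit fails to exist. As a hypothetical example not in the original text, take f(x) = sin(1/x) near 0: the sketch below follows two sequences tending to 0 along which f tends to different values, so lim_{x→0} f(x) does not exist:

```python
import math

def f(x):
    return math.sin(1 / x)

# Two sequences approaching 0 (with x_n != 0):
xs1 = [1 / (n * math.pi) for n in range(1, 6)]            # f(x_n) = sin(n*pi) = 0
xs2 = [1 / ((2 * n + 0.5) * math.pi) for n in range(1, 6)]  # f(x_n) = sin((2n + 1/2)*pi) = 1

vals1 = [f(x) for x in xs1]  # all (numerically) 0
vals2 = [f(x) for x in xs2]  # all (numerically) 1
```

Since f tends to 0 along one sequence and to 1 along the other, the sequential criterion rules out any single limit L at 0.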
The sequential criterion for limits is a powerful tool for deciding whether a function has a limit at a point. By translating a statement about function limits into statements about convergent sequences, it brings the whole theory of sequences to bear: a single admissible sequence along which f(x_n) fails to converge to L, or two sequences with different limiting values, is enough to show that the limit is not L.