
The idea of limits

Limits are a key concept in calculus. Watch this video to learn what limits are and what notation is used to express them.

To get started, we'll introduce the concept of a limit, or limiting process. We build intuition for limits through a recent example: the first image ever taken of a black hole.

We can think of the limit of a function at a number \(a\) as the one real number \(L\) that the function's values approach as the \(x\)-values approach \(a\), provided such a real number \(L\) exists.

Let \(f(x)\) be a function defined at all values in an open interval containing \(a\), with the possible exception of \(a\) itself, and let \(L\) be a real number. If all values of the function \(f(x)\) approach the real number \(L\) as the values of \(x\) (with \(x \neq a\)) approach the number \(a\), then we say that the limit of \(f(x)\) as \(x\) approaches \(a\) is \(L\). (More succinctly, as \(x\) gets closer to \(a\), \(f(x)\) gets closer and stays close to \(L\).) Symbolically, we express this idea as

\[ \lim_{x \to a} f(x) = L. \]
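As a concrete illustration (an example of our own, not taken from the video), consider the function \(f(x) = \dfrac{x^2 - 4}{x - 2}\), which is defined everywhere except at \(x = 2\). For \(x \neq 2\) we can cancel the common factor, so

\[ \lim_{x \to 2} \frac{x^2 - 4}{x - 2} = \lim_{x \to 2} (x + 2) = 4, \]

even though \(f(2)\) itself is undefined: the values of the function approach and stay close to 4 as \(x\) approaches 2 from either side.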

A table of values or a graph may be used to estimate a limit.
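To make the table-of-values approach concrete, here is a minimal sketch in Python (our own illustration, not from the article; the choice of function and of step sizes is an assumption). It tabulates the example function \(f(x) = \frac{x^2 - 4}{x - 2}\) at \(x\)-values approaching \(a = 2\) from both sides.

    # Estimate a limit numerically by tabulating f(x) as x approaches a.
    # Illustrative sketch only: f and the step sizes are our own choices.

    def f(x):
        # Undefined at x = 2 itself (removable discontinuity).
        return (x**2 - 4) / (x - 2)

    a = 2.0
    print(f"{'x':>12} {'f(x)':>12}")
    for k in range(1, 6):
        h = 10 ** -k                 # step sizes 0.1, 0.01, ..., 0.00001
        for x in (a - h, a + h):     # approach a from the left and the right
            print(f"{x:>12.5f} {f(x):>12.5f}")

Both columns of \(f(x)\) values settle near 4, consistent with the limit computed above. Keep in mind that a numerical table only suggests a value for the limit; it does not prove that the limit exists.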
