What is known to us simply as "calculus" is the "infinitesimal calculus", known
to mathematicians as "analysis", a branch of mathematics hinted at by ancient
and medieval figures like Eudoxus, Archimedes, and Alhazen, synthesized by
Newton and Leibniz, and put onto rigorous and recognizable footing by the likes
of Euler, Cauchy, and Weierstrass. It concerns itself with four related
concepts:
(1) The infinite sequence, including the infinite sum.
(2) The limit.
(3) The derivative, a linear approximation to a mathematical function.
(4) The integral, or antiderivative.
A very rich theory has been built up around both fundamental and applied
questions in calculus, and concepts applicable to simple functions of a single
variable, mapping the real or natural numbers to the real or natural numbers,
have been
generalized to higher dimensions and abstract spaces whose utility is not
readily apparent to even a mathematically literate lay reader. The basics,
however, are understandable to anyone with a grasp of basic arithmetic/algebra.
(0) What is a function?
A discussion of the basics of the calculus would be dead in the water without an
understanding of what is meant by a "function" in mathematics.
A function is a relationship between two sets X and Y associating each element
of X with exactly one element of Y. The sets need not be different: the
function y=x^3 is a mapping from the whole real line to the whole real line.
An older usage allows functions to be multivalued; this faded nearly a century
ago but can still be found in some old books.
(1) The infinite sequence
Sequences are functions from the set of natural numbers (sometimes including
zero) to some other set, which we'll take to be the real numbers for
concreteness.
For example, F(n)=(g^n-(1-g)^n)/sqrt(5), where g is the golden ratio, is a
sequence giving the Fibonacci numbers familiar from recreational mathematics.
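As a quick sanity check, here is a minimal Python sketch of this closed-form
(Binet) formula; the helper name fib_binet is our choice, and the rounding
absorbs floating-point error:

```python
import math

def fib_binet(n):
    """F(n) = (g**n - (1 - g)**n) / sqrt(5), the closed-form Fibonacci formula."""
    g = (1 + math.sqrt(5)) / 2  # the golden ratio
    return round((g**n - (1 - g)**n) / math.sqrt(5))

print([fib_binet(n) for n in range(1, 9)])  # [1, 1, 2, 3, 5, 8, 13, 21]
```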
We are sometimes concerned with the behavior of a sequence as n tends to
infinity, the limiting behavior or limit of the sequence. A sequence tends to
infinity if, for any M, no matter how large, there exists an N such that n>N
implies F(n)>M. A sequence is said to have a limit L if for every positive
real number epsilon, no matter how small, there exists an N such that n>N
implies |F(n)-L|< epsilon.
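To make the epsilon-N definition concrete, here is a small Python sketch for
the sequence F(n) = 1/n, whose limit is 0; the function name witness_N is our
choice:

```python
import math

def witness_N(eps):
    """For F(n) = 1/n with limit L = 0: an N such that n > N implies |F(n) - L| < eps."""
    return math.ceil(1 / eps)

for eps in (0.1, 0.01, 0.001):
    N = witness_N(eps)
    # spot-check the implication for a range of n beyond N
    assert all(abs(1 / n - 0) < eps for n in range(N + 1, N + 1000))
    print(eps, "->", N)
```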
Some sequences of interest are sums, and we are often faced with sums of
infinitely many terms. Let's call the sum of the first n elements of a
sequence F(n) "sum(F(n))"=F(1)+F(2)+...+F(n). The sum of an infinite number of
terms is then just the limit of sum(F(n)), determined as above, as n tends to
infinity.
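For instance, a short Python sketch of partial sums for the geometric series
F(k) = 1/2**k, whose infinite sum is 1 (the helper name partial_sum is ours):

```python
def partial_sum(F, n):
    """sum(F(n)) = F(1) + F(2) + ... + F(n)."""
    return sum(F(k) for k in range(1, n + 1))

# Geometric series F(k) = 1 / 2**k: the partial sums approach 1.
F = lambda k: 1 / 2**k
for n in (5, 10, 20):
    print(n, partial_sum(F, n))  # approaches 1 as n grows
```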
(2) The limit
Functions from continuous spaces to continuous spaces, such as from the real
numbers to the real numbers, also have limits. If we call the independent
variable "x" and the function "F(x)", the function has a limit L as x
approaches x' if for every positive real epsilon, no matter how small, there
exists a delta such that 0<|x-x'|< delta implies |F(x)-L|< epsilon. In other
words, we can get as close to L as we want by squeezing the function's domain
down around x'.
The limit of a function can be defined (equivalently) in terms of sequences,
in particular Cauchy sequences. A sequence S(n) is Cauchy if for every
positive real epsilon, there exists an N such that m,n>N implies
|S(n)-S(m)|< epsilon. This is a sequence in which the elements get closer and
closer together as n grows in a very regular way. We can then say that the
limit of F(x) as x approaches x' is L if, for every such sequence S(n)
converging to x' (with S(n) never equal to x'), the limit of the sequence
F(S(n)) is L.
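A small Python sketch of the Cauchy property, using the partial sums of
1/k**2 (which converge to pi**2/6); the helper name S is our choice:

```python
def S(n):
    """Partial sums of 1/k**2; this sequence is Cauchy (it converges to pi**2/6)."""
    return sum(1 / k**2 for k in range(1, n + 1))

# Elements far out in the sequence get close together: |S(2N) - S(N)| shrinks.
for N in (10, 100, 1000):
    print(N, abs(S(2 * N) - S(N)))
```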
The limit is not a useless concept at all. For example, functions like sin(x)/x
are rather common in physics, and we know that, although sin(x)/x is undefined
at x=0, its limit as x approaches 0 is 1; we can thus fill in the "hole" by
defining the value there to be 1.
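We can watch this limit emerge numerically in Python, evaluating sin(x)/x
along the sequence S(n) = 1/n, which converges to 0:

```python
import math

# sin(x)/x is undefined at x = 0, but along the sequence S(n) = 1/n -> 0
# the values approach the limit 1.
for n in (1, 10, 100, 1000):
    x = 1 / n
    print(x, math.sin(x) / x)  # second column approaches 1
```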
(3) The derivative
A derivative is a special kind of limit. For functions from the real numbers to
the real numbers, it is equal to the slope of the line tangent to the graph,
which is the instantaneous rate of change of the function at that point. (This
gets more complicated in higher dimensions!)
The slope of F(x) near the point x' can be approximated by (F(x'+h)-F(x'))/h;
the slope of the tangent line, the derivative, can be found as the limit of
this quantity as h approaches zero.
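A minimal Python sketch of this difference quotient, using a small but nonzero
h rather than a true limit (the helper name derivative is our choice):

```python
def derivative(F, x, h=1e-6):
    """Difference quotient (F(x + h) - F(x)) / h, approximating F'(x) for small h."""
    return (F(x + h) - F(x)) / h

# F(x) = x**3 has derivative 3 * x**2, so at x = 2 the exact slope is 12.
print(derivative(lambda x: x**3, 2.0))  # close to 12
```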
Derivatives are fundamental tools to scientists, engineers, and economists, as
they allow the behavior of systems to be specified in terms of their rates of
change. Newton's Second Law F=ma, for example, is an equation of motion for a
system that involves a derivative, namely acceleration, which is the second
derivative (derivative of the derivative) of position.
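The "derivative of the derivative" can be sketched numerically in Python with
a central difference; the helper name second_derivative and the free-fall
example are our choices:

```python
def second_derivative(F, x, h=1e-4):
    """Central-difference approximation to F''(x)."""
    return (F(x + h) - 2 * F(x) + F(x - h)) / h**2

# Position under constant acceleration a = -9.8: s(t) = 0.5 * (-9.8) * t**2.
# The second derivative of position recovers the acceleration.
print(second_derivative(lambda t: 0.5 * -9.8 * t**2, 1.0))  # close to -9.8
```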
(4) The integral
Integration is the continuous equivalent of the infinite sum. For example, we
can integrate velocity to obtain position, multiplying at small timesteps the
velocity by the length of the time step to get the distance traveled in the
time step, then taking the limit as the size of the timestep approaches 0. In
one dimension, the integral is the signed area underneath the function's graph.
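The velocity-to-position procedure described above can be sketched in Python
as a Riemann sum with a small fixed step rather than a true limit (the helper
name riemann_sum is our choice):

```python
def riemann_sum(f, a, b, n=100000):
    """Sum of f at small steps times the step width (a left Riemann sum)."""
    dx = (b - a) / n
    return sum(f(a + k * dx) for k in range(n)) * dx

# Integrating velocity v(t) = 2 * t from t = 0 to t = 3: distance traveled is 9.
print(riemann_sum(lambda t: 2 * t, 0.0, 3.0))  # close to 9
```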
The Fundamental Theorem of Calculus relates integration and differentiation,
stating that they are opposites. Its first part states that D(Int(f,a,x))=f(x)
when we differentiate with respect to x. Its second part states that, when
f=D(F), Int(f,a,b)=F(b)-F(a). Elementary proofs are available but too
intricate to present here. This theorem allows the student to calculate
integrals by anti-differentiating and to solve simple differential equations by
integration. Learning how to do so takes memorization and a good deal of
practice; the exercises in any modern elementary calculus book are a good
place to start.
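Both parts of the theorem can be checked numerically in Python; the names
integral and G, and the use of a midpoint rule, are our choices, not the
text's:

```python
def integral(f, a, b, n=100000):
    """Midpoint-rule approximation to Int(f, a, b)."""
    dx = (b - a) / n
    return sum(f(a + (k + 0.5) * dx) for k in range(n)) * dx

f = lambda x: 3 * x**2   # f = D(F)
F = lambda x: x**3       # an antiderivative of f

# Second part: Int(f, a, b) = F(b) - F(a).
print(integral(f, 1.0, 2.0), F(2.0) - F(1.0))  # both close to 7

# First part: the derivative of G(x) = Int(f, a, x) is f(x).
G = lambda x: integral(f, 0.0, x)
h = 1e-4
print((G(1.0 + h) - G(1.0)) / h, f(1.0))  # both close to 3
```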
These concepts all have more general (and not useless) definitions on abstract
and possibly higher-dimensional metric spaces; if calculus were a rabbit-hole,
it could be said to go "all the way down."