# When is an integral not an integral?

No surprise to anyone, really, that students get confused by the difference between definite and indefinite integrals. The so-called indefinite integral is not really an integral at all, not in the sense of area: it is the solution set to a differential equation. It is not even, in general, a single function, but a whole family of functions.

To imagine that $\int x^2 dx$ defines a single thing – a function $g(x)$ for which $\frac{dg(x)}{dx}=x^2$ – is to miss the important point that this differential equation has a whole family of solutions: $\frac{1}{3}x^3+c$, one solution for each constant $c \in \mathbb{R}$.

So, $\int x^2 dx = \{\frac{1}{3}x^3+c \vert c \in \mathbb{R}\}$, NOT $\int x^2 dx =\frac{1}{3}x^3+c$ as is commonly written.
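Computer algebra systems make the same simplification: they hand back a single antiderivative with the constant silently dropped. A quick sketch using SymPy (assuming it is installed):

```python
import sympy as sp

x = sp.symbols('x')

# SymPy returns one representative of the solution family,
# with no "+ c" attached at all.
antiderivative = sp.integrate(x**2, x)
print(antiderivative)  # x**3/3
```

In other words, the CAS gives one member of the set $\{\frac{1}{3}x^3+c \vert c \in \mathbb{R}\}$ and leaves the rest implicit.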

The villain who engendered this confusion was none other than the great Gottfried Wilhelm Leibniz, a co-discoverer/inventor of calculus along with Isaac Newton.

Wait! I hear you say, isn’t this just pedantry? Does it really matter if we write $\int x^2 dx$ as $\frac{1}{3}x^3+c$ or as the fashionable set-theoretic $\{\frac{1}{3}x^3+c \vert c \in \mathbb{R}\}$ ?

Well yes, it does. Think about how you might resolve the following apparent conundrum:

$\int\frac{1}{x}dx=\int 1\times\frac{1}{x}dx=x\times\frac{1}{x}+\int x\times\frac{1}{x^2}dx=1+\int \frac{1}{x} dx$, and subtracting $\int\frac{1}{x}dx$ from both sides gives $0=1$!

This is nonsense, of course, because the algebraic calculation proceeds as if $\int\frac{1}{x}dx$ were a well-defined function, which it is not: it is a family of functions (even worse, one defined over a disconnected domain, so the "constant" may differ on each component).
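Under the set interpretation the paradox dissolves. A sketch, working on the single component $(0,\infty)$:

```latex
\begin{align*}
S &= \int \frac{1}{x}\,dx = \{\ln x + c \mid c \in \mathbb{R}\}
     \quad \text{on } (0,\infty),\\
1 + S &= \{1 + \ln x + c \mid c \in \mathbb{R}\}
       = \{\ln x + c' \mid c' \in \mathbb{R}\} = S.
\end{align*}
```

So $S = 1 + S$ is a perfectly true statement about sets, and there is no subtraction step that cancels $S$ from both sides to yield $0=1$.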

Students of mathematics need to learn how to think, not carry out mindless calculations as if they were performing monkeys.

This is an important psychological point: students are used to writing $\int x^2dx=\frac{1}{3}x^3+c$ as a conditioned response, functioning at a symbolic level almost equivalent to that of a dog or a cat (“Sit, Fido”; “Here’s food, puss.”), whereas to gain mathematical power and flexibility they need to function at a much higher symbolic level, drawing on a far richer collection of symbolic reference.

If we agree that $C$ denotes the additive group of constant functions, then $\frac{1}{3}x^3+C$ is a coset. This is similar to how we formalize big-O notation. In the example above, $0+C=1+C$ would not be a contradiction.
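The coset view is easy to test mechanically: two expressions lie in the same coset of the constants exactly when their difference has zero derivative. A minimal sketch with SymPy (the helper name `same_coset` is ours, not a library function):

```python
import sympy as sp

x = sp.symbols('x')

def same_coset(f, g):
    # f + C == g + C iff f - g is a constant function,
    # i.e. its derivative vanishes identically.
    return sp.simplify(sp.diff(f - g, x)) == 0

print(same_coset(sp.Integer(0), sp.Integer(1)))  # True: 0 + C = 1 + C
print(same_coset(sp.log(x), sp.log(x) + 5))      # True: same solution family
print(same_coset(sp.log(x), x))                  # False: differ by a non-constant
```

On this reading, the "conundrum" above merely asserts $0+C=1+C$, which is true.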