$F(\alpha)=\displaystyle\int_0^1 x^\alpha\,dx = \frac{1}{\alpha+1}$. Can we find $F'(\alpha)$ without re-integrating?

1. Setup: An Integral That Depends on a Parameter

Most integrals we've seen so far are "done" once evaluated - plug in limits, get a number. But some integrals carry an extra variable that isn't the integration variable. Think of $e^{-\alpha x}$: here $x$ is what we integrate over, but $\alpha$ is a parameter - a dial we can turn.

When $\alpha$ is small, $e^{-\alpha x}$ decays slowly - the curve stays high and the area under it is large. When $\alpha$ is large, the exponential drops steeply and the area shrinks. So the integral itself becomes a function of $\alpha$:

$$F(\alpha) = \int_0^1 e^{-\alpha x}\,dx$$

We can evaluate this one explicitly: $F(\alpha) = \dfrac{1-e^{-\alpha}}{\alpha}$. But that's precisely why it makes a good test case - we can check our work. The real payoff of Leibnitz's rule comes when the integral has no closed form at all (like $\int e^{-\alpha x^2}\,dx$). We're starting with $e^{-\alpha x}$ so we can verify the technique on solid ground before applying it in the dark.
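The closed form is easy to confirm numerically. A minimal sketch in plain Python (the midpoint-rule helper and the names `F`, `F_closed` are mine, not part of the lesson):

```python
import math

def F(alpha, n=2000):
    """Approximate F(alpha) = integral of e^(-alpha*x) over [0, 1] by the midpoint rule."""
    h = 1.0 / n
    return sum(math.exp(-alpha * (i + 0.5) * h) for i in range(n)) * h

def F_closed(alpha):
    """Closed form (1 - e^(-alpha)) / alpha, valid for alpha != 0."""
    return (1.0 - math.exp(-alpha)) / alpha

# Turning the dial: the area under e^(-alpha*x) shrinks as alpha grows
for a in (0.5, 1.0, 4.0):
    assert abs(F(a) - F_closed(a)) < 1e-6
assert F(0.5) > F(1.0) > F(4.0)
```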

Drag the slider. The shaded region is $F(\alpha)$ - watch how it shrinks as $\alpha$ grows. The dashed curve traces $F$ as a function of $\alpha$.

Here's the key question: how fast does $F$ change as we nudge $\alpha$? That is, what is $F'(\alpha)$? We could differentiate the closed form, sure. But Leibnitz's rule gives us a way to find $F'(\alpha)$ directly from the integral - without ever computing $F$ first. That's the technique we'll build in the next card.

2. Fixed Limits: Differentiate Under the Integral

When the limits are constants, Leibnitz's rule simplifies to:

$$F'(\alpha) = \int_a^b \frac{\partial}{\partial \alpha} f(x,\alpha)\,dx$$

For our example: $F'(\alpha) = \displaystyle\int_0^1 (-x)\,e^{-\alpha x}\,dx$.


Toggle between the exact Leibnitz formula and a finite-difference numerical derivative - they agree everywhere.
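That comparison takes only a few lines to reproduce. A sketch, with helper names (`midpoint`, `dF_leibnitz`, `dF_numeric`) that are mine:

```python
import math

def midpoint(f, lo, hi, n=4000):
    """Midpoint-rule approximation of the integral of f over [lo, hi]."""
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

def F(alpha):
    return midpoint(lambda x: math.exp(-alpha * x), 0.0, 1.0)

def dF_leibnitz(alpha):
    # Differentiate under the integral sign: d/dalpha e^(-alpha*x) = -x e^(-alpha*x)
    return midpoint(lambda x: -x * math.exp(-alpha * x), 0.0, 1.0)

def dF_numeric(alpha, h=1e-5):
    # Central finite difference of F
    return (F(alpha + h) - F(alpha - h)) / (2 * h)

for a in (0.3, 1.0, 2.5):
    assert abs(dF_leibnitz(a) - dF_numeric(a)) < 1e-6
```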

3. Variable Limits: The Full Rule

Now let $F(\alpha) = \displaystyle\int_0^{\alpha} \sin(\alpha x)\,dx$. Both the integrand and the upper limit depend on $\alpha$.

The derivative $F'(\alpha)$ has three terms:

Integrand changing: $\displaystyle\int_0^{\alpha}\frac{\partial}{\partial\alpha}\sin(\alpha x)\,dx = \int_0^{\alpha} x\cos(\alpha x)\,dx$
Upper limit moving: $f(\alpha,\alpha)\cdot b'(\alpha) = \sin(\alpha^2)\cdot 1$
Lower limit (fixed at 0): $-f(0,\alpha)\cdot a'(\alpha) = -\sin(0)\cdot 0 = 0$

Summing the contributions: $F'(\alpha) = \displaystyle\int_0^{\alpha} x\cos(\alpha x)\,dx + \sin(\alpha^2)$.
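A numerical spot-check of these three terms against a finite difference of $F$ (helper names are illustrative):

```python
import math

def midpoint(f, lo, hi, n=4000):
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

def F(alpha):
    # F(alpha) = integral of sin(alpha*x) over [0, alpha]
    return midpoint(lambda x: math.sin(alpha * x), 0.0, alpha)

def dF_leibnitz(alpha):
    integrand_term = midpoint(lambda x: x * math.cos(alpha * x), 0.0, alpha)
    upper_term = math.sin(alpha ** 2) * 1.0  # f(b(alpha), alpha) * b'(alpha)
    lower_term = 0.0                         # sin(0) * 0
    return integrand_term + upper_term - lower_term

def dF_numeric(alpha, h=1e-5):
    return (F(alpha + h) - F(alpha - h)) / (2 * h)

for a in (0.7, 1.2, 2.0):
    assert abs(dF_leibnitz(a) - dF_numeric(a)) < 1e-5
```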

4. The Full Leibnitz Rule

If $F(\alpha) = \displaystyle\int_{a(\alpha)}^{b(\alpha)} f(x,\alpha)\,dx$, then

$$F'(\alpha) = \underbrace{\color{#ef4444}{\int_{a(\alpha)}^{b(\alpha)} \frac{\partial f}{\partial \alpha}\,dx}}_{\text{integrand changes}} +\; \underbrace{\color{#22c55e}{f\!\big(b(\alpha),\alpha\big)\,b'(\alpha)}}_{\text{upper limit moves}} -\; \underbrace{\color{#3b82f6}{f\!\big(a(\alpha),\alpha\big)\,a'(\alpha)}}_{\text{lower limit moves}}$$

When does each term matter?

All three terms contribute when both limits and the integrand depend on $\alpha$.

5. When Antiderivatives Don't Exist

Try this one: $\dfrac{d}{dx}\displaystyle\int_0^x \frac{\sin t}{t}\,dt$.

Predict first. We're differentiating an accumulation function where the upper limit is $x$. The FTC says the answer should just be the integrand evaluated at $x$...

But wait - $\frac{\sin t}{t}$ has no elementary antiderivative. This integral defines the "sine integral" function $\mathrm{Si}(x)$, and there's no closed-form expression for it in terms of standard functions.

Does that stop us? Not at all. This is $F(x) = \int_0^x f(t)\,dt$ with $f(t) = \frac{\sin t}{t}$. The integrand doesn't contain the parameter $x$, and only the upper limit depends on $x$. Leibnitz's rule gives:

$$\frac{d}{dx}\int_0^x \frac{\sin t}{t}\,dt = \frac{\sin x}{x}$$

Key insight: Leibnitz doesn't need an antiderivative. It differentiates around it. The integral may be impossible to evaluate in closed form, but the derivative of the integral is perfectly tractable.
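A finite-difference check makes this concrete: approximate $\mathrm{Si}$ numerically (no antiderivative needed), then compare its slope to $\sin x / x$. The `Si` helper below is an illustrative midpoint-rule approximation:

```python
import math

def Si(x, n=4000):
    """Approximate Si(x) = integral of sin(t)/t over [0, x]; the midpoint
    rule never samples t = 0, where the integrand's limit is 1."""
    h = x / n
    return sum(math.sin((i + 0.5) * h) / ((i + 0.5) * h) for i in range(n)) * h

# Leibnitz/FTC: d/dx Si(x) = sin(x)/x, even though Si has no closed form
x, h = 2.0, 1e-4
slope = (Si(x + h) - Si(x - h)) / (2 * h)
assert abs(slope - math.sin(x) / x) < 1e-5
```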

6. All Three Terms Fire

Here's where the full rule earns its keep. Find $\dfrac{d}{da}\displaystyle\int_0^{a^2}\sqrt{a+t}\,dt$.

The parameter $a$ appears in the integrand ($\sqrt{a+t}$) and in the upper limit ($a^2$). All three terms of Leibnitz's rule contribute:

Upper boundary term: $f(a^2, a)\cdot b'(a) = \sqrt{a + a^2}\cdot 2a$
Lower boundary term: the lower limit is the constant $0$, so $-f(0, a)\cdot \dfrac{d}{da}(0) = -\sqrt{a}\cdot 0 = 0$
Integral term: $\displaystyle\int_0^{a^2} \frac{\partial}{\partial a}\sqrt{a+t}\,dt = \int_0^{a^2} \frac{1}{2\sqrt{a+t}}\,dt$

Evaluate that last integral directly:

$$\int_0^{a^2}\frac{1}{2\sqrt{a+t}}\,dt = \Big[\sqrt{a+t}\Big]_0^{a^2} = \sqrt{a+a^2} - \sqrt{a}$$

Combining all three terms:

$$\frac{d}{da}\int_0^{a^2}\sqrt{a+t}\,dt = 2a\sqrt{a+a^2} + \sqrt{a+a^2} - \sqrt{a}$$

Common trap: When the parameter appears in both the integrand and the limits, all three terms fire. The most common mistake is forgetting the integral term - the one that comes from $\partial f/\partial a$ inside the integral.
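The combined answer is easy to spot-check against a finite difference (the helper names below are mine):

```python
import math

def midpoint(f, lo, hi, n=4000):
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

def F(a):
    return midpoint(lambda t: math.sqrt(a + t), 0.0, a * a)

def dF(a):
    # 2a*sqrt(a+a^2) from the upper limit, sqrt(a+a^2) - sqrt(a) from the integral term
    return 2 * a * math.sqrt(a + a * a) + math.sqrt(a + a * a) - math.sqrt(a)

a, h = 1.3, 1e-5
assert abs(dF(a) - (F(a + h) - F(a - h)) / (2 * h)) < 1e-5
```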

7. Variable Limits, Parameter-Free Integrand

Find $\dfrac{d}{dx}\displaystyle\int_x^{x^3} e^{-t^4}\,dt$.

The integrand $e^{-t^4}$ doesn't contain $x$ at all - the dummy variable $t$ is not the parameter. So $\partial f/\partial x = 0$ and the integral term vanishes. Only the boundary terms survive:

Upper limit: $f(x^3)\cdot b'(x) = e^{-(x^3)^4}\cdot 3x^2 = 3x^2\,e^{-x^{12}}$
Lower limit: $-f(x)\cdot a'(x) = -e^{-x^4}\cdot 1 = -e^{-x^4}$
$$\frac{d}{dx}\int_x^{x^3} e^{-t^4}\,dt = 3x^2\,e^{-x^{12}} - e^{-x^4}$$

Numerical check at $x = 1$: $3(1)^2 e^{-1} - e^{-1} = 2e^{-1} \approx 0.736$.
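That check extends to any $x$ with a finite difference; the midpoint rule below handles the signed interval correctly even when $x^3 < x$ (helper names are mine):

```python
import math

def midpoint(f, lo, hi, n=4000):
    # Works as a signed integral: h is negative when hi < lo
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

def F(x):
    return midpoint(lambda t: math.exp(-t ** 4), x, x ** 3)

def dF(x):
    return 3 * x ** 2 * math.exp(-x ** 12) - math.exp(-x ** 4)

# At x = 1 the limits coincide, so F(1) = 0, yet F'(1) = 2/e is nonzero
h = 1e-5
assert abs(F(1.0)) < 1e-12
assert abs(dF(1.0) - (F(1.0 + h) - F(1.0 - h)) / (2 * h)) < 1e-6
```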

Common trap: Don't confuse the dummy variable $t$ with the parameter $x$. The integrand $e^{-t^4}$ doesn't contain $x$, so $\partial f/\partial x = 0$. If you mistakenly differentiated with respect to $t$, you'd get a completely wrong answer.

Playground: Test Leibnitz's Rule

Enter $f(x,\alpha)$, $a(\alpha)$, $b(\alpha)$ using JavaScript syntax. Variable names: x, a (for $\alpha$).

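For readers who prefer code to widgets, the playground's computation can be sketched in Python (the widget itself takes JavaScript; `midpoint` and `leibnitz` are illustrative names, and the user supplies $\partial f/\partial\alpha$ by hand):

```python
import math

def midpoint(f, lo, hi, n=4000):
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

def leibnitz(f, df_dalpha, a, da, b, db, alpha):
    """Full rule: integral term + upper-limit term - lower-limit term.
    f(x, alpha): integrand; df_dalpha: its partial derivative in alpha;
    a, b: limit functions of alpha; da, db: their derivatives."""
    integral_term = midpoint(lambda x: df_dalpha(x, alpha), a(alpha), b(alpha))
    upper_term = f(b(alpha), alpha) * db(alpha)
    lower_term = f(a(alpha), alpha) * da(alpha)
    return integral_term + upper_term - lower_term

# Check against the variable-limits example: F(alpha) = (1 - cos(alpha^2)) / alpha
alpha, h = 1.5, 1e-5
d = leibnitz(
    f=lambda x, al: math.sin(al * x),
    df_dalpha=lambda x, al: x * math.cos(al * x),
    a=lambda al: 0.0, da=lambda al: 0.0,
    b=lambda al: al,  db=lambda al: 1.0,
    alpha=alpha,
)
F = lambda al: (1 - math.cos(al ** 2)) / al
assert abs(d - (F(alpha + h) - F(alpha - h)) / (2 * h)) < 1e-6
```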

Practice Problems - §4.9

From Kaplan, problems after §4.9

2(a) Evaluate $\displaystyle\frac{d}{dx}\int_1^{x^2} t^2\,dt$
Find $\dfrac{d}{dx}\displaystyle\int_1^{x^2} t^2\,dt$.
Step 1: Identify the setup.
Here $f(t,x) = t^2$ (no explicit $x$-dependence in the integrand), $a = 1$ (constant), $b(x) = x^2$, so $b'(x) = 2x$.
Step 2: Apply the generalized Leibnitz rule.
Since the integrand doesn't depend on $x$, the "integrand changing" term is zero. Only the upper-limit term survives: $$\frac{d}{dx}\int_1^{x^2} t^2\,dt = f(b(x),x)\cdot b'(x) = (x^2)^2\cdot 2x = 2x^5$$
Step 3: Verify directly.
$$\int_1^{x^2} t^2\,dt = \frac{t^3}{3}\bigg|_1^{x^2} = \frac{x^6 - 1}{3}$$ $$\frac{d}{dx}\frac{x^6-1}{3} = 2x^5 \quad\checkmark$$
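The same verification, done numerically rather than symbolically (the `midpoint` helper is an illustrative choice):

```python
def midpoint(f, lo, hi, n=4000):
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

def F(x):
    return midpoint(lambda t: t ** 2, 1.0, x ** 2)

x, h = 1.5, 1e-5
assert abs(F(x) - (x ** 6 - 1) / 3) < 1e-6                   # matches the antiderivative
assert abs((F(x + h) - F(x - h)) / (2 * h) - 2 * x ** 5) < 1e-4
```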
2(b) Evaluate $\displaystyle\frac{d}{dt}\int_t^{t^2} \sin(x^2)\,dx$
Find $\dfrac{d}{dt}\displaystyle\int_t^{t^2}\sin(x^2)\,dx$.
Step 1: Identify the three terms.
Here $f(x,t) = \sin(x^2)$ (no explicit $t$-dependence), $a(t) = t$ so $a'(t)=1$, $b(t) = t^2$ so $b'(t) = 2t$.

The "integrand changing" term vanishes since $\partial f/\partial t = 0$.
Step 2: Apply the limit terms.
Upper limit: $f(t^2, t)\cdot b'(t) = \sin(t^4)\cdot 2t = 2t\sin(t^4)$
Lower limit: $-f(t,t)\cdot a'(t) = -\sin(t^2)\cdot 1 = -\sin(t^2)$
Step 3: Combine.
$$\frac{d}{dt}\int_t^{t^2}\sin(x^2)\,dx = 2t\sin(t^4) - \sin(t^2)$$
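Since $\sin(x^2)$ has no elementary antiderivative, a numerical check is the natural verification here (helper names are mine):

```python
import math

def midpoint(f, lo, hi, n=4000):
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

def F(t):
    return midpoint(lambda x: math.sin(x ** 2), t, t ** 2)

def dF(t):
    return 2 * t * math.sin(t ** 4) - math.sin(t ** 2)

t, h = 1.4, 1e-5
assert abs(dF(t) - (F(t + h) - F(t - h)) / (2 * h)) < 1e-5
```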
1(a) Express $\displaystyle\frac{d}{dt}\int_0^1 \frac{\cos(xt)}{x}\,dx$ as an integral
Obtain the derivative $\dfrac{d}{dt}\displaystyle\int_0^1 \frac{\cos(xt)}{x}\,dx$ in the form of an integral.
Step 1: Identify the setup.
Fixed limits $a=0$, $b=1$. Integrand $f(x,t) = \cos(xt)/x$. Only the "integrand changing" term applies.
Step 2: Differentiate under the integral sign.
$$\frac{\partial}{\partial t}\frac{\cos(xt)}{x} = \frac{-x\sin(xt)}{x} = -\sin(xt)$$
Step 3: Write the result.
$$\frac{d}{dt}\int_0^1 \frac{\cos(xt)}{x}\,dx = -\int_0^1 \sin(xt)\,dx$$

Note: the awkward $1/x$ factor cancels after differentiation - a common simplification when differentiating under the integral sign.
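As a quick check that the resulting integral is tame: direct integration gives $-\int_0^1 \sin(xt)\,dx = \big[\cos(xt)/t\big]_0^1 = (\cos t - 1)/t$, which a few lines confirm numerically (`midpoint` is an illustrative helper):

```python
import math

def midpoint(f, lo, hi, n=4000):
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

t = 2.0
val = -midpoint(lambda x: math.sin(x * t), 0.0, 1.0)
assert abs(val - (math.cos(t) - 1) / t) < 1e-6
```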

4(a) Evaluate $\displaystyle\int_0^1 x^n \log x\,dx$ by differentiating under the integral
Evaluate $\displaystyle\int_0^1 x^n \log x\,dx$ ($n > -1$) by differentiating both sides of $\displaystyle\int_0^1 x^n\,dx = \frac{1}{n+1}$ with respect to $n$.
Step 1: Start with the known identity.
For $n > -1$: $$F(n) = \int_0^1 x^n\,dx = \frac{1}{n+1}$$
Step 2: Differentiate both sides w.r.t. $n$.
The left side, by Leibnitz's rule (fixed limits, parameter $n$): $$F'(n) = \int_0^1 \frac{\partial}{\partial n}x^n\,dx = \int_0^1 x^n \log x\,dx$$ The right side: $$\frac{d}{dn}\frac{1}{n+1} = -\frac{1}{(n+1)^2}$$
Step 3: Equate.
$$\int_0^1 x^n \log x\,dx = -\frac{1}{(n+1)^2} \quad (n > -1)$$

For example, $n=0$: $\int_0^1 \log x\,dx = -1$, which matches integration by parts.
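The identity also checks out numerically for several values of $n$; the midpoint rule never samples $x = 0$, so the $\log x$ singularity causes no trouble for $n \ge 1$ (the helper is an illustrative choice):

```python
import math

def midpoint(f, lo, hi, n=4000):
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

for p in (1, 2, 3):
    val = midpoint(lambda x: x ** p * math.log(x), 0.0, 1.0)
    assert abs(val - (-1.0 / (p + 1) ** 2)) < 1e-6
```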