Derivative - Integration Connections and Multivariable Extensions
Understand antiderivatives and the Fundamental Theorem of Calculus, how to evaluate definite integrals using antiderivatives, and multivariable derivative concepts such as the gradient, Jacobian, and total derivative.
Summary
Antiderivatives and the Fundamental Theorem of Calculus
What Is an Antiderivative?
An antiderivative of a function $f$ is a function $F$ such that:
$$F'(x) = f(x)$$
In other words, $F$ is a function whose derivative gives you back $f$. This is the inverse operation of differentiation.
Example: If $f(x) = 3x^2$, then $F(x) = x^3$ is an antiderivative because $\frac{d}{dx}(x^3) = 3x^2$.
The notation for antiderivatives uses the integral symbol: we write $F(x) = \int f(x) \, dx$ to mean "$F$ is an antiderivative of $f$."
The Family of Antiderivatives
Here's a crucial observation: if $F(x)$ is an antiderivative of $f(x)$, then so is $F(x) + C$ for any constant $C$.
Why? Because the derivative of any constant is zero: $$\frac{d}{dx}[F(x) + C] = F'(x) + 0 = f(x)$$
This means every function has infinitely many antiderivatives—they form a family of functions that differ only by a constant. When we write the indefinite integral:
$$\int f(x) \, dx = F(x) + C$$
the "$+C$" explicitly accounts for this entire family. The constant $C$ is called the constant of integration, and it's essential to include it when finding antiderivatives.
Example: Both $x^3$ and $x^3 + 5$ and $x^3 - 7$ are all antiderivatives of $3x^2$, since they all have the same derivative.
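The family of antiderivatives can be checked numerically. The sketch below (using a central finite difference, a standard but approximate way to estimate a derivative) confirms that $x^3$, $x^3 + 5$, and $x^3 - 7$ all differentiate back to $3x^2$:

```python
# Numerical sanity check: antiderivatives that differ only by a constant
# all have the same derivative. We approximate F'(x) with a central
# difference and compare it to f(x) = 3x^2.

def f(x):
    return 3 * x**2

def central_diff(F, x, h=1e-6):
    """Approximate F'(x) with a central difference."""
    return (F(x + h) - F(x - h)) / (2 * h)

# Three members of the antiderivative family of f
candidates = [lambda x: x**3, lambda x: x**3 + 5, lambda x: x**3 - 7]

x = 2.0
for F in candidates:
    # the added constant vanishes under differentiation
    assert abs(central_diff(F, x) - f(x)) < 1e-5
```

The added constant cancels inside the difference quotient, which is exactly the "derivative of a constant is zero" argument in numerical form.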
The Fundamental Theorem of Calculus – Part I
The Fundamental Theorem of Calculus (Part I) is one of the most important theorems in mathematics. It connects differentiation and integration:
$$\int_a^b f(x) \, dx = F(b) - F(a)$$
where $F$ is any antiderivative of $f$ on the closed interval $[a,b]$.
This theorem tells us that to compute a definite integral (which represents the signed area under the curve), we need only:
Find any antiderivative $F$ of $f$
Evaluate it at the upper and lower limits
Subtract: $F(b) - F(a)$
Notice that the constant $C$ in the family of antiderivatives cancels out when we compute $F(b) - F(a)$, so it doesn't matter which antiderivative we choose.
Example: To find $\int_0^2 3x^2 \, dx$:
An antiderivative of $3x^2$ is $F(x) = x^3$
Evaluate: $F(2) - F(0) = 8 - 0 = 8$
Computing Definite Integrals via Antiderivatives
The practical impact of the Fundamental Theorem is that it gives us a computational method for evaluating definite integrals. Rather than computing limits of Riemann sums (which is often difficult), we find an antiderivative and evaluate.
The process is straightforward:
Identify the integrand $f(x)$ and limits $a$ and $b$
Find an antiderivative $F(x)$ using integration techniques
Apply the theorem: compute $F(b) - F(a)$
This is useful because many definite integrals that arise in applications (computing areas, volumes, work done by a force, etc.) can now be solved exactly rather than approximated.
Multivariable Derivatives
When we move beyond single-variable functions to functions of multiple variables, derivatives become more sophisticated. Instead of a single slope, we need to understand how the function changes in many different directions simultaneously.
The Gradient Vector
For a real-valued function $f(x_1, x_2, \ldots, x_n)$ (a function that takes multiple inputs and produces one output), the gradient is a vector containing all of the first-order partial derivatives:
$$\nabla f = \left( \frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \ldots, \frac{\partial f}{\partial x_n} \right)$$
The symbol $\nabla$ is called "del" or "nabla," and $\nabla f$ is read as "grad $f$" or "the gradient of $f$."
Intuition: Each partial derivative $\frac{\partial f}{\partial x_i}$ tells you how fast $f$ changes when you move in the $x_i$ direction (while holding all other variables fixed). The gradient bundles all these rates of change into a single vector.
Example: For $f(x,y) = 3x^2 + 2xy + y^2$: $$\frac{\partial f}{\partial x} = 6x + 2y$$ $$\frac{\partial f}{\partial y} = 2x + 2y$$ $$\nabla f = (6x + 2y, 2x + 2y)$$
At the point $(1, 1)$: $\nabla f(1,1) = (8, 4)$
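The gradient above can be checked numerically: a central difference in each variable separately (holding the other fixed) recovers the partial derivatives. This is a sketch, not a symbolic computation:

```python
# Approximate the gradient of f(x, y) = 3x^2 + 2xy + y^2 at (1, 1)
# by taking a central difference in each variable while holding the
# other fixed, then check against the hand-computed value (8, 4).

def f(x, y):
    return 3 * x**2 + 2 * x * y + y**2

def gradient(f, x, y, h=1e-6):
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)   # vary x only
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)   # vary y only
    return (dfdx, dfdy)

gx, gy = gradient(f, 1.0, 1.0)
assert abs(gx - 8.0) < 1e-4   # matches 6x + 2y at (1, 1)
assert abs(gy - 4.0) < 1e-4   # matches 2x + 2y at (1, 1)
```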
Directional Derivative
While a partial derivative tells you the rate of change in the direction of a coordinate axis, the directional derivative tells you the rate of change in any direction you choose.
The directional derivative of $f$ in the direction of a unit vector $\mathbf{u}$ is:
$$D_{\mathbf{u}} f = \nabla f \cdot \mathbf{u}$$
This is simply the dot product of the gradient vector with your chosen direction vector. The result is a scalar that tells you the instantaneous rate of change of $f$ as you move in the direction $\mathbf{u}$.
Key insight: The directional derivative is maximized when $\mathbf{u}$ points in the same direction as $\nabla f$. This means the gradient points in the direction of steepest increase.
Example: For $f(x,y) = 3x^2 + 2xy + y^2$ at point $(1,1)$ with $\nabla f = (8, 4)$:
In the direction $\mathbf{u} = \left(\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}\right)$:
$$D_{\mathbf{u}} f = (8, 4) \cdot \left(\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}\right) = \frac{8 + 4}{\sqrt{2}} = \frac{12}{\sqrt{2}} = 6\sqrt{2}$$
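The dot-product formula is a one-liner in code. The sketch below computes the directional derivative from the example and also confirms the key insight: no unit direction beats the gradient's own direction.

```python
import math

# Directional derivative D_u f = ∇f · u, using the gradient (8, 4)
# at (1, 1) and the unit vector u = (1/√2, 1/√2).

grad = (8.0, 4.0)
u = (1 / math.sqrt(2), 1 / math.sqrt(2))

D_u = grad[0] * u[0] + grad[1] * u[1]           # dot product
assert abs(D_u - 6 * math.sqrt(2)) < 1e-12      # equals 6√2

# Key insight check: moving in the gradient's own direction maximizes
# the directional derivative, giving |∇f|.
norm = math.hypot(*grad)
u_steepest = (grad[0] / norm, grad[1] / norm)
D_max = grad[0] * u_steepest[0] + grad[1] * u_steepest[1]
assert abs(D_max - norm) < 1e-12
assert D_u <= D_max
```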
The Jacobian Matrix
So far we've considered functions that output a single number. What if we have a vector-valued function that outputs multiple numbers?
For a vector-valued function $\mathbf{F}(x_1, x_2, \ldots, x_n) = (f_1(x_1, \ldots, x_n), f_2(x_1, \ldots, x_n), \ldots, f_m(x_1, \ldots, x_n))$, the Jacobian matrix is the matrix of all first-order partial derivatives:
$$J = \begin{pmatrix} \frac{\partial f_1}{\partial x_1} & \frac{\partial f_1}{\partial x_2} & \cdots & \frac{\partial f_1}{\partial x_n} \\ \frac{\partial f_2}{\partial x_1} & \frac{\partial f_2}{\partial x_2} & \cdots & \frac{\partial f_2}{\partial x_n} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial f_m}{\partial x_1} & \frac{\partial f_m}{\partial x_2} & \cdots & \frac{\partial f_m}{\partial x_n} \end{pmatrix}$$
The Jacobian has one row for each output component and one column for each input variable. Each entry tells you how one output responds to changes in one input.
Structure:
Row $i$ contains all partial derivatives of output $f_i$ with respect to each input
Column $j$ contains all partial derivatives of every output with respect to input $x_j$
Example: For $\mathbf{F}(x, y) = (x^2 + y, xy)$: $$J = \begin{pmatrix} 2x & 1 \\ y & x \end{pmatrix}$$
At point $(1, 2)$: $J(1,2) = \begin{pmatrix} 2 & 1 \\ 2 & 1 \end{pmatrix}$
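The row/column structure can be built numerically, one column per input variable (each column is the central-difference derivative of all outputs with respect to one input). A sketch for the $2 \times 2$ example:

```python
# Numerically build the Jacobian of F(x, y) = (x^2 + y, xy) by
# finite differences: column j holds the derivatives of both outputs
# with respect to input j. Check it at (1, 2) against the
# hand-computed matrix [[2, 1], [2, 1]].

def F(x, y):
    return (x**2 + y, x * y)

def jacobian(F, x, y, h=1e-6):
    """2x2 Jacobian: rows are outputs, columns are inputs."""
    fxp, fxm = F(x + h, y), F(x - h, y)   # perturb x
    fyp, fym = F(x, y + h), F(x, y - h)   # perturb y
    col_x = [(fxp[i] - fxm[i]) / (2 * h) for i in range(2)]
    col_y = [(fyp[i] - fym[i]) / (2 * h) for i in range(2)]
    return [[col_x[0], col_y[0]],
            [col_x[1], col_y[1]]]

J = jacobian(F, 1.0, 2.0)
expected = [[2.0, 1.0], [2.0, 1.0]]
for i in range(2):
    for j in range(2):
        assert abs(J[i][j] - expected[i][j]) < 1e-4
```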
Total Derivative (The Differential)
The total derivative (or differential) is a generalization of the derivative from single-variable calculus. It's the best linear approximation to your function at a point.
For a multivariable function, the total derivative at a point is a linear transformation that, when applied to a small change in inputs, gives you the approximate change in outputs. This linear transformation can be represented by the Jacobian matrix.
Intuition: Just as a single-variable function is approximated by its tangent line, a multivariable function is approximated by its tangent hyperplane. The total derivative encodes all the information needed to define this tangent hyperplane.
If $\mathbf{F}$ is differentiable at a point $\mathbf{a}$, then near that point: $$\mathbf{F}(\mathbf{a} + \mathbf{h}) \approx \mathbf{F}(\mathbf{a}) + J(\mathbf{a}) \cdot \mathbf{h}$$
where $J(\mathbf{a})$ is the Jacobian matrix at $\mathbf{a}$, and $\mathbf{h}$ is a small change in the input.
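The approximation $\mathbf{F}(\mathbf{a} + \mathbf{h}) \approx \mathbf{F}(\mathbf{a}) + J(\mathbf{a}) \cdot \mathbf{h}$ can be tested directly. The sketch below uses the earlier example $\mathbf{F}(x, y) = (x^2 + y, xy)$ with its hand-computed Jacobian at $(1, 2)$ and a small step $\mathbf{h}$:

```python
# Total derivative as best linear approximation: near a = (1, 2),
# F(a + h) should be close to F(a) + J(a)·h for a small step h,
# with error shrinking like |h|^2.

def F(x, y):
    return (x**2 + y, x * y)

a = (1.0, 2.0)
J = [[2.0, 1.0],        # Jacobian of F at (1, 2), computed by hand
     [2.0, 1.0]]
h = (0.01, -0.02)       # a small change in the input

exact = F(a[0] + h[0], a[1] + h[1])
Fa = F(*a)
approx = (Fa[0] + J[0][0] * h[0] + J[0][1] * h[1],
          Fa[1] + J[1][0] * h[0] + J[1][1] * h[1])

# the linear prediction is accurate to second order in h
assert all(abs(exact[i] - approx[i]) < 1e-3 for i in range(2))
```

Halving `h` roughly quarters the error, which is the signature of a first-order (tangent-plane) approximation.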
Relationship Between Total and Partial Derivatives
This is an important conceptual point: partial derivatives are special cases of the total derivative.
The relationship: If the total derivative exists at a point, then every partial derivative exists at that point, and the partial derivatives are exactly the entries of the Jacobian matrix representing the total derivative.
More precisely:
The partial derivative $\frac{\partial f_i}{\partial x_j}$ (how output $f_i$ changes with respect to input $x_j$) is the $(i,j)$ entry of the Jacobian
If all partial derivatives exist and are continuous near a point, the total derivative is guaranteed to exist at that point
Why this matters: Partial derivatives alone don't tell the complete story about how a function changes. The total derivative is the complete description of how a function changes in all directions simultaneously. But practically, we can find the total derivative by computing partial derivatives and arranging them in a Jacobian matrix.
Example: For $f(x,y) = x^2y$:
$\frac{\partial f}{\partial x} = 2xy$ (first entry of gradient)
$\frac{\partial f}{\partial y} = x^2$ (second entry of gradient)
These two partial derivatives form the gradient vector, which represents the total derivative of this scalar-valued function
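For this scalar example, the gradient is the total derivative in action: its dot product with a small step $\mathbf{h}$ predicts the change in $f$. A short sketch:

```python
# For f(x, y) = x^2 y, the partials (2xy, x^2) assemble into the
# gradient, and that gradient drives the linear approximation
# f(a + h) ≈ f(a) + ∇f(a)·h.

def f(x, y):
    return x**2 * y

def grad_f(x, y):
    return (2 * x * y, x**2)    # partials computed by hand

a = (2.0, 3.0)
h = (0.001, -0.002)             # a small input step

g = grad_f(*a)
approx = f(*a) + g[0] * h[0] + g[1] * h[1]
exact = f(a[0] + h[0], a[1] + h[1])

assert abs(exact - approx) < 1e-4   # linear prediction is very close
```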
Flashcards
What is the relationship between a function $f$ and its antiderivative $F$?
$F' = f$ (The derivative of $F$ equals $f$)
If $F$ is an antiderivative of $f$, why is $F + C$ (where $C$ is any constant) also an antiderivative?
Because the derivative of a constant is zero
According to Part I of the Fundamental Theorem of Calculus, how is the definite integral $\int_{a}^{b} f(x) \, dx$ evaluated using an antiderivative $F$?
$F(b) - F(a)$
How does the Fundamental Theorem of Calculus provide a method for evaluating the area under a curve?
By finding an antiderivative and subtracting its values at the interval endpoints
For a real-valued function $f(x_{1}, \dots, x_{n})$, what components make up the gradient vector $\nabla f$?
All first-order partial derivatives
What is the formula for the directional derivative of a function $f$ in the direction of a unit vector $\mathbf{u}$?
$D_{\mathbf{u}}f = \nabla f \cdot \mathbf{u}$ (The dot product of the gradient and the unit vector)
For a vector-valued function $\mathbf{F}(x_{1}, \dots, x_{n})$, what specific values comprise the entries of the Jacobian matrix?
All first-order partial derivatives
What kind of mathematical object is the total derivative at a point, and what does it represent?
A linear map that best approximates the function in all directions
How can the total derivative of a multivariable function be represented in matrix form?
By the Jacobian matrix
If the total derivative of a function exists, what can be concluded about its partial derivatives?
Every partial derivative exists and equals the corresponding entry of the Jacobian matrix
Quiz
Derivative - Integration Connections and Multivariable Extensions Quiz Question 1: What condition must a function $F$ satisfy to be an antiderivative of a given function $f$?
- $F' = f$ (correct)
- $\displaystyle\int F\,dx = f$
- $F'' = f$
- $F$ is constant
Question 2: Why does adding a constant $C$ to an antiderivative $F$ produce another antiderivative of the same function?
- The derivative of a constant is zero. (correct)
- The constant shifts the graph vertically.
- The constant changes the domain of $F$.
- The constant alters the slope of $F$.
Question 3: According to the Fundamental Theorem of Calculus Part I, $\displaystyle\int_{a}^{b} f(x)\,dx$ equals what?
- $F(b)-F(a)$, where $F'=f$ (correct)
- $F(a)-F(b)$, where $F'=f$
- $\displaystyle\int_{a}^{b} F(x)\,dx$
- $F'(b)-F'(a)$
Question 4: What is the primary method for evaluating a definite integral using antiderivatives?
- Find an antiderivative and compute $F(b)-F(a)$. (correct)
- Approximate the area with Riemann sums.
- Use the derivative at the midpoint of the interval.
- Compute the limit of a series expansion.
Question 5: What entries are contained in the Jacobian matrix $J_{\mathbf F}$ of a vector‑valued function $\mathbf F$?
- Partial derivatives $\displaystyle\frac{\partial F_i}{\partial x_j}$. (correct)
- Second‑order partial derivatives $\displaystyle\frac{\partial^2 F_i}{\partial x_j^2}$.
- Values of $\mathbf F$ at each coordinate.
- Gradients of each component $F_i$.
Key Concepts
Calculus Concepts
Antiderivative
Fundamental Theorem of Calculus
Family of Antiderivatives
Multivariable Calculus
Gradient Vector
Directional Derivative
Jacobian Matrix
Total Derivative (Differential)
Definitions
Antiderivative
A function F whose derivative equals a given function f on an interval.
Fundamental Theorem of Calculus
Relates the definite integral of a function over an interval to the values of its antiderivative at the interval’s endpoints.
Family of Antiderivatives
The set of all antiderivatives of a function, differing only by an additive constant.
Gradient Vector
The vector of all first‑order partial derivatives of a real‑valued multivariable function, indicating the direction of steepest ascent.
Directional Derivative
The rate of change of a multivariable function in the direction of a specified unit vector.
Jacobian Matrix
A matrix of first‑order partial derivatives representing the linear approximation of a vector‑valued function.
Total Derivative (Differential)
The linear map that best approximates a multivariable function near a point, often expressed by its Jacobian matrix.