Introduction to Mathematical Analysis
Understand limits, continuity, differentiation, integration, and the foundational properties of real numbers and metric spaces.
Summary
Real Analysis: Limits, Continuity, and Derivatives
Introduction
Real analysis is the rigorous study of functions, limits, and the properties of the real number system. Rather than relying on intuition and informal reasoning, analysis uses precise definitions and logical proofs to establish the foundations of calculus. The concepts in this guide form the backbone of advanced mathematics, and understanding them deeply is essential for your study of analysis.
Part 1: The Foundation—Structure of the Real Numbers
Before we can understand limits and continuity, we need to understand what kind of number system we're working with.
The Completeness Property
The real numbers have a fundamental property that distinguishes them from the rational numbers: completeness. This property states that every non-empty set of real numbers that is bounded above has a least upper bound, called its supremum.
Why does this matter? The completeness property guarantees that the real number line has no "gaps." It's what allows us to define limits rigorously. Without completeness, many theorems in calculus would fail.
Supremum and Infimum
The supremum of a set $S$ (written $\sup S$) is the smallest real number that is greater than or equal to every element in $S$.
The infimum of a set $S$ (written $\inf S$) is the greatest real number that is less than or equal to every element in $S$.
Example: For the set $S = \{x : 0 < x < 1\}$:
The supremum is $1$ (even though $1$ is not in the set)
The infimum is $0$ (even though $0$ is not in the set)
Notice that the supremum and infimum don't have to be in the set itself—they're the boundaries.
The Archimedean Property
The Archimedean property states that for any real number $x$, there exists an integer $n$ such that $n > x$.
Why is this important? This property ensures that the integers are unbounded—no real number is "larger than all integers." Equivalently, for every $\epsilon > 0$ there is an integer $n$ with $\frac{1}{n} < \epsilon$, a fact used constantly in limit proofs. This might seem obvious, but it's essential for proving many results about limits.
Density of the Rationals
Between any two distinct real numbers, there exists a rational number. This means the rationals are "dense" in the reals—they're scattered throughout the real line with no gaps.
This is useful for proofs: sometimes it's easier to work with rational numbers and then extend results to all real numbers.
Part 2: Limits—The Fundamental Tool
Limits are the foundation of calculus and analysis. They provide a rigorous way to talk about what happens to a function or sequence as we approach some value.
Limits of Sequences
A sequence is an ordered list of numbers: $(a_1, a_2, a_3, \ldots)$, which we write as $(a_n)$ or $\{a_n\}$.
Definition: Convergence of a Sequence
A sequence $(a_n)$ converges to a limit $L$ if the terms get arbitrarily close to $L$ and stay close. More precisely:
For every $\epsilon > 0$, there exists an integer $N$ such that $|a_n - L| < \epsilon$ whenever $n \geq N$.
What does this mean? No matter how tight a tolerance $\epsilon$ you demand, eventually all terms of the sequence (after some point $N$) will be within that tolerance of $L$.
Example: The sequence $a_n = \frac{1}{n}$ converges to $0$.
If you want all terms strictly within $0.1$ of zero, that happens once $n > 10$ (note that $\frac{1}{10} = 0.1$ exactly, so $n = 10$ just misses the strict inequality)
If you want all terms strictly within $0.01$ of zero, that happens once $n > 100$
And so on
The key insight: the required $N$ depends on $\epsilon$. Smaller tolerances require us to go further out in the sequence.
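This dependence of $N$ on $\epsilon$ can be checked numerically. A small Python sketch (illustrative only; exact arithmetic via `fractions` avoids floating-point edge cases at the boundary $\frac{1}{N-1} = \epsilon$):

```python
from fractions import Fraction
import math

def N_for_epsilon(eps):
    """Smallest N with |1/n - 0| < eps for all n >= N.
    Since 1/n < eps iff n > 1/eps, take N = floor(1/eps) + 1."""
    return math.floor(1 / eps) + 1

for eps in [Fraction(1, 10), Fraction(1, 100), Fraction(1, 1000)]:
    N = N_for_epsilon(eps)
    # every term from index N onward lies strictly within eps of the limit 0
    assert all(Fraction(1, n) < eps for n in range(N, N + 1000))
    # the term at index N-1 is not, so this N is the smallest that works
    assert Fraction(1, N - 1) >= eps
    print(eps, "-> N =", N)
```

For $\epsilon = \frac{1}{10}$ this prints $N = 11$, matching the observation above that $n = 10$ itself does not satisfy the strict inequality.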
Limits of Functions
Functions have limits too. The definition is similar but uses the distance from the input $x$ to some point $c$.
Definition: Limit of a Function
We say $\displaystyle\lim_{x\to c}f(x) = L$ if for every $\epsilon > 0$, there exists a $\delta > 0$ such that
$$|f(x) - L| < \epsilon \text{ whenever } 0 < |x - c| < \delta$$
What does this mean? As $x$ gets close to $c$ (within distance $\delta$), the function value $f(x)$ gets close to $L$ (within distance $\epsilon$). The condition $0 < |x - c|$ means we don't require $f$ to be defined at $c$ itself—we only care about the behavior near $c$.
A crucial point: This definition makes no reference to what $f(c)$ actually is (if it even exists). The limit depends only on the behavior near $c$, not at $c$.
Example: Consider $f(x) = \frac{x^2 - 1}{x - 1}$ and the limit as $x \to 1$. This function is undefined at $x = 1$ (we'd get $\frac{0}{0}$), but: $$\lim_{x \to 1} \frac{x^2 - 1}{x - 1} = \lim_{x \to 1} \frac{(x-1)(x+1)}{x - 1} = \lim_{x \to 1} (x+1) = 2$$
The limit exists and equals $2$, even though the function is undefined at $x = 1$.
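A quick numerical check (an illustration, not a proof) shows the values of $f$ approaching $2$ from both sides of $x = 1$, even though $f(1)$ itself is undefined:

```python
def f(x):
    return (x**2 - 1) / (x - 1)   # undefined at x = 1, but the limit there is 2

# approach x = 1 from both sides; since f(1 +/- h) = 2 +/- h algebraically,
# the values stay within about h of the limit 2
for h in [0.1, 0.01, 0.001, 1e-6]:
    left, right = f(1 - h), f(1 + h)
    assert abs(left - 2) < 2 * h and abs(right - 2) < 2 * h
```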
Algebraic Properties of Limits
Limits behave nicely with respect to arithmetic operations. If $\displaystyle\lim_{x\to c}f(x) = L$ and $\displaystyle\lim_{x\to c}g(x) = M$, then:
Sum: $\displaystyle\lim_{x\to c}[f(x) + g(x)] = L + M$
Difference: $\displaystyle\lim_{x\to c}[f(x) - g(x)] = L - M$
Product: $\displaystyle\lim_{x\to c}[f(x) \cdot g(x)] = L \cdot M$
Quotient: $\displaystyle\lim_{x\to c}\frac{f(x)}{g(x)} = \frac{L}{M}$ (provided $M \neq 0$)
Why is this important? These properties let us compute limits of complicated functions by breaking them into simpler pieces. Instead of using the $\epsilon$-$\delta$ definition for every limit, we can use these rules.
Part 3: Continuity
Continuity is one of the most important concepts in analysis. Intuitively, a continuous function is one you can draw without lifting your pen. Formally, continuity is defined using limits.
Definition of Continuity at a Point
A function $f$ is continuous at $c$ if
$$\lim_{x\to c}f(x) = f(c)$$
What does this require? Three things:
The function must be defined at $c$ (so $f(c)$ exists)
The limit as $x$ approaches $c$ must exist
The limit must equal the function value at $c$
Why is the third condition important? Without it, the limit of a function might not match the function's actual value. Continuity demands that there's no "jump" or "hole" at $c$.
Example of discontinuity: Consider $$f(x) = \begin{cases} x & \text{if } x \neq 1 \\ 5 & \text{if } x = 1 \end{cases}$$
As $x \to 1$, the function approaches $1$, but $f(1) = 5$. So $f$ is not continuous at $x = 1$, even though the limit exists.
The Intermediate Value Theorem
One of the most useful theorems about continuous functions is the Intermediate Value Theorem (IVT).
Theorem: If $f$ is continuous on a closed interval $[a, b]$, then for every value $y$ between $f(a)$ and $f(b)$, there exists at least one point $c \in [a, b]$ where $f(c) = y$.
Why is this useful? The IVT guarantees that a continuous function "hits" every value between its endpoints. It can't "skip over" a value.
Example: If $f$ is continuous on $[0, 1]$ with $f(0) = -2$ and $f(1) = 3$, then by the IVT, there must be some point where $f(x) = 0$, some point where $f(x) = 1$, some point where $f(x) = 2.5$, etc.
This theorem is often used to prove that equations have solutions.
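The bisection method turns the IVT into an algorithm for locating such solutions: if $f$ is continuous and changes sign on $[a, b]$, repeatedly halving the bracketing interval homes in on a root. A minimal Python sketch (the helper name `bisect_root` is ours):

```python
def bisect_root(f, a, b, tol=1e-10):
    """Locate a root of a continuous f on [a, b] with f(a), f(b) of opposite sign.

    The IVT guarantees some c in [a, b] with f(c) = 0; bisection halves
    the bracketing interval until it is shorter than tol.
    """
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "f(a) and f(b) must have opposite signs"
    while b - a > tol:
        m = (a + b) / 2
        fm = f(m)
        if fm == 0:
            return m
        if fa * fm < 0:       # sign change on [a, m]: keep the left half
            b, fb = m, fm
        else:                 # otherwise the sign change is on [m, b]
            a, fa = m, fm
    return (a + b) / 2

# x^3 + x - 1 is continuous, negative at 0, positive at 1, so it has a root in (0, 1)
root = bisect_root(lambda x: x**3 + x - 1, 0.0, 1.0)
assert abs(root**3 + root - 1) < 1e-9
```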
Uniform Continuity
Continuity at a point is a local property—it tells us about behavior near one specific point. Uniform continuity is a global property—it applies to an entire set.
Definition: Uniform Continuity
A function $f$ is uniformly continuous on a set $S$ if for every $\epsilon > 0$, there exists a $\delta > 0$ such that
$$|f(x) - f(y)| < \epsilon \text{ for all } x, y \in S \text{ with } |x - y| < \delta$$
The key difference from continuity: For ordinary continuity, the $\delta$ can depend on which point you're looking at. For uniform continuity, the same $\delta$ works for all points in $S$.
Why does this matter? Uniform continuity is stronger and more useful. It guarantees that the function doesn't wiggle too rapidly anywhere in the domain.
Important theorem: If $f$ is continuous on a compact set (like a closed interval $[a,b]$), then $f$ is uniformly continuous on that set.
Consequences of Continuity
Continuous functions preserve important properties:
Boundedness: If $f$ is continuous on a compact set $[a,b]$, then $f$ is bounded (its values don't escape to infinity).
Extreme values: If $f$ is continuous on $[a,b]$, then $f$ attains its maximum and minimum values on that interval.
Preservation of compactness: Continuous functions map compact sets to compact sets.
These properties make continuous functions particularly nice to work with.
Part 4: Differentiation
The derivative measures how rapidly a function is changing. Just as limits are central to defining continuity, limits are also central to defining derivatives.
Definition of the Derivative
The derivative of $f$ at a point $c$ is defined as
$$f'(c) = \lim_{h \to 0} \frac{f(c+h) - f(c)}{h}$$
provided this limit exists.
What does this mean? The numerator $f(c+h) - f(c)$ is the change in the function value, and the denominator $h$ is the change in the input. Their ratio is the average rate of change. As $h \to 0$, this becomes the instantaneous rate of change—the slope of the tangent line.
Example: For $f(x) = x^2$ at $c = 2$: $$f'(2) = \lim_{h \to 0} \frac{(2+h)^2 - 4}{h} = \lim_{h \to 0} \frac{4 + 4h + h^2 - 4}{h} = \lim_{h \to 0} \frac{4h + h^2}{h} = \lim_{h \to 0} (4 + h) = 4$$
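The same computation can be watched numerically: for $f(x) = x^2$ at $c = 2$ the difference quotient equals $4 + h$ exactly, so it approaches $4$ as $h \to 0$. A small Python illustration:

```python
def diff_quotient(f, c, h):
    """Average rate of change of f over [c, c + h]."""
    return (f(c + h) - f(c)) / h

f = lambda x: x**2
# for this f, the quotient is algebraically 4 + h, tending to f'(2) = 4
for h in [0.1, 0.01, 0.001]:
    q = diff_quotient(f, 2.0, h)
    assert abs(q - (4 + h)) < 1e-9
    print(h, q)
```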
Differentiability Implies Continuity
Here's a key result: If a function has a derivative at a point, it must be continuous at that point.
Why? If $f'(c)$ exists, then the function can't have a jump or hole at $c$—it must be smooth enough to have a well-defined tangent line. More formally, the existence of the limit $f'(c)$ guarantees that $\lim_{x \to c} f(x) = f(c)$.
Important: The converse is not true. A function can be continuous but not differentiable. For example, $f(x) = |x|$ is continuous at $x = 0$ but has no derivative there (the left and right tangent slopes are different).
The Mean Value Theorem
The Mean Value Theorem (MVT) is one of the most important results in analysis.
Theorem: If $f$ is continuous on $[a, b]$ and differentiable on $(a, b)$, then there exists at least one point $c \in (a, b)$ such that
$$f'(c) = \frac{f(b) - f(a)}{b - a}$$
What does this say? The right side is the average rate of change between the endpoints. The MVT guarantees that somewhere in between, the instantaneous rate of change (the derivative) equals this average rate.
Intuition: If you drive from point $a$ to point $b$ and your average speed is 60 mph, then at some moment during the drive, you were going exactly 60 mph.
The MVT is powerful because it relates the behavior of a function at a point to its behavior over an interval.
The Chain Rule
When you compose functions, their derivatives follow the chain rule:
$$\frac{d}{dx}[f(g(x))] = f'(g(x)) \cdot g'(x)$$
Or in Leibniz notation: $\frac{dy}{dx} = \frac{dy}{du} \cdot \frac{du}{dx}$.
Example: For $h(x) = (x^2 + 1)^3$, we have $f(u) = u^3$ and $g(x) = x^2 + 1$, so: $$h'(x) = 3(x^2 + 1)^2 \cdot 2x = 6x(x^2 + 1)^2$$
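The chain-rule formula can be sanity-checked against a numerical derivative. The sketch below (illustrative only; it uses a symmetric difference quotient, whose truncation error is $O(h^2)$ for smooth functions) compares $h'(x) = 6x(x^2+1)^2$ with a numerical estimate:

```python
def numeric_derivative(f, x, h=1e-6):
    # symmetric difference quotient: (f(x+h) - f(x-h)) / (2h)
    return (f(x + h) - f(x - h)) / (2 * h)

h_fn = lambda x: (x**2 + 1) ** 3
chain_rule = lambda x: 6 * x * (x**2 + 1) ** 2   # f'(g(x)) * g'(x)

# the two should agree to within the numerical error of the quotient
for x in [-1.5, 0.0, 2.0]:
    assert abs(numeric_derivative(h_fn, x) - chain_rule(x)) < 1e-4
```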
<extrainfo>
Higher-Order Derivatives
The second derivative $f''(x)$ is the derivative of $f'(x)$. Likewise, the third derivative $f'''(x)$ is the derivative of $f''(x)$, and so on.
These higher-order derivatives measure the rate of change of the rate of change. For example, $f''(x)$ describes the concavity of $f$: if $f''(x) > 0$, the function is concave up; if $f''(x) < 0$, it's concave down.
</extrainfo>
Part 5: Integration
Integration is the process of finding areas under curves. Just as differentiation uses limits to define instantaneous rates of change, integration uses limits to define areas.
Definition of the Riemann Integral
To define the area under a curve, we use Riemann sums. We partition the interval $[a, b]$ into small subintervals, form rectangles with these subintervals as bases and heights $f(x_i^*)$ at sample points $x_i^*$, and add up their areas.
The Riemann integral is defined as the limit of these sums:
$$\int_a^b f(x)\,dx = \lim_{\text{mesh} \to 0} \sum_{i=1}^n f(x_i^*)\,\Delta x_i$$
where $\Delta x_i$ are the widths of the subintervals and $x_i^*$ are sample points within each subinterval.
Why "limit"? As the subintervals get thinner (the mesh approaches zero), the Riemann sum gives a better and better approximation to the area. The integral is what this sum approaches.
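This limiting process is easy to watch numerically. The Python sketch below (an illustration, using left endpoints on equal subintervals) approximates $\int_0^1 x^2\,dx = \frac{1}{3}$ and shows the error shrinking as the mesh does:

```python
def riemann_sum(f, a, b, n):
    """Riemann sum over n equal subintervals, sampling at left endpoints."""
    dx = (b - a) / n
    return sum(f(a + i * dx) * dx for i in range(n))

# as the mesh (b - a)/n shrinks, the sums approach the integral of x^2 on [0, 1],
# which is 1/3; for left endpoints the error shrinks roughly like 1/(2n)
errors = []
for n in [10, 100, 1000]:
    errors.append(abs(riemann_sum(lambda x: x * x, 0.0, 1.0, n) - 1 / 3))
assert errors[0] > errors[1] > errors[2]
assert errors[2] < 1e-3
```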
Upper and Lower Sums
To determine if a function is Riemann integrable, we use upper and lower sums.
For a partition $P$ of $[a, b]$:
The upper sum $U(P, f)$ uses the maximum value of $f$ on each subinterval
The lower sum $L(P, f)$ uses the minimum value of $f$ on each subinterval
Clearly, $L(P, f) \leq$ (area under $f$) $\leq U(P, f)$.
Theorem: A bounded function $f$ is Riemann integrable on $[a, b]$ if and only if the supremum of all lower sums equals the infimum of all upper sums.
Why does this matter? This criterion tells us which functions can be meaningfully integrated. Well-behaved functions like continuous functions are integrable.
The Fundamental Theorem of Calculus
The Fundamental Theorem of Calculus connects differentiation and integration—they're inverse operations.
Part I: Differentiation of an Integral
If $f$ is continuous on an interval and we define
$$F(x) = \int_a^x f(t)\,dt$$
then $F'(x) = f(x)$.
What does this mean? The derivative of an integral gets us back the original function. Integration and differentiation undo each other.
Part II: Computing Integrals with Antiderivatives
If $f$ is continuous on $[a, b]$ and $F$ is any antiderivative of $f$ (meaning $F'(x) = f(x)$), then
$$\int_a^b f(x)\,dx = F(b) - F(a)$$
Why is this useful? This is the formula you learned in calculus for computing integrals. Instead of taking limits of Riemann sums, we find an antiderivative and evaluate it at the endpoints.
Example: $\int_0^2 x\,dx = \left[\frac{x^2}{2}\right]_0^2 = 2 - 0 = 2$
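Part II can be checked directly against the Riemann-sum definition. A Python illustration (for $f(x) = \cos x$ on $[0, \pi/2]$, with antiderivative $F(x) = \sin x$, so the integral is $1$):

```python
import math

# FTC Part II: evaluate the integral from an antiderivative instead of
# taking a limit of Riemann sums
a, b = 0.0, math.pi / 2
ftc_value = math.sin(b) - math.sin(a)   # F(b) - F(a) = 1

# crude left-endpoint Riemann sum for comparison
n = 1000
dx = (b - a) / n
riemann = sum(math.cos(a + i * dx) * dx for i in range(n))

assert abs(ftc_value - 1.0) < 1e-12
assert abs(riemann - ftc_value) < 1e-2   # the two approaches agree
```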
Linearity and Additivity of the Integral
Integrals respect arithmetic operations:
Linearity: $\displaystyle\int_a^b (cf(x) + dg(x))\,dx = c\int_a^b f(x)\,dx + d\int_a^b g(x)\,dx$
Additivity: $\displaystyle\int_a^c f(x)\,dx + \int_c^b f(x)\,dx = \int_a^b f(x)\,dx$
These allow us to split complicated integrals into simpler pieces.
Part 6: Sequences and Series of Functions
We've studied sequences of numbers and series of numbers. Now we extend these ideas to functions.
Pointwise Convergence
A sequence of functions $(f_n)$ converges pointwise to a function $f$ if, for each fixed $x$ in the domain, the sequence of numbers $(f_n(x))$ converges to $f(x)$:
$$\lim_{n \to \infty} f_n(x) = f(x) \text{ for each } x$$
Example: Consider $f_n(x) = x^n$ on $[0, 1]$. For each fixed $x \in [0, 1)$, we have $\lim_{n \to \infty} x^n = 0$. At $x = 1$, we have $f_n(1) = 1$ for all $n$. So $(f_n)$ converges pointwise to
$$f(x) = \begin{cases} 0 & \text{if } 0 \leq x < 1 \\ 1 & \text{if } x = 1 \end{cases}$$
Notice that each $f_n$ is continuous, but the pointwise limit $f$ is discontinuous at $x = 1$. Pointwise convergence doesn't preserve nice properties like continuity.
Uniform Convergence
Uniform convergence is a stronger notion. A sequence of functions $(f_n)$ converges uniformly to $f$ on a set $S$ if for every $\epsilon > 0$, there exists an $N$ such that
$$|f_n(x) - f(x)| < \epsilon \text{ for all } x \in S \text{ whenever } n \geq N$$
The key difference from pointwise convergence: The $N$ doesn't depend on which $x$ you pick—the same $N$ works for all $x \in S$ simultaneously.
Why does this matter? Uniform convergence preserves important properties. If each $f_n$ is continuous and $(f_n)$ converges uniformly to $f$, then $f$ is continuous. This doesn't happen with merely pointwise convergence.
Return to the example: The sequence $f_n(x) = x^n$ does not converge uniformly on $[0, 1]$. For $n$ large, $f_n$ is very close to zero on most of $[0, 1)$, but it's still equal to $1$ at $x = 1$. The convergence is "nonuniform" at the boundary.
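The failure of uniformity shows up in the quantity $\sup_{x \in S} |f_n(x) - f(x)|$: uniform convergence means exactly that this supremum tends to $0$. A Python sketch (approximating the sup on a finite grid, so an illustration rather than a proof):

```python
def sup_distance(n, grid_size=10_000):
    """Approximate sup over [0, 1) of |x^n - 0| using a finite grid."""
    return max((i / grid_size) ** n for i in range(grid_size))

# pointwise: for each fixed x in [0, 1), x^n -> 0 ...
assert 0.9 ** 100 < 1e-4
# ... but the sup over the whole interval stays near 1 for every n,
# so sup |f_n - f| does not tend to 0: the convergence is not uniform
for n in [10, 100, 1000]:
    assert sup_distance(n) > 0.9
```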
Absolute Convergence of Series
A series $\sum_{n=1}^\infty a_n$ converges absolutely if the series of absolute values $\sum_{n=1}^\infty |a_n|$ converges.
Why is absolute convergence useful? If a series converges absolutely, we're guaranteed that the series converges (you can ignore the signs). Moreover, we can rearrange the terms without changing the sum—absolute convergence is "robust."
By contrast, a series that converges but doesn't converge absolutely can have very sensitive properties: rearranging the terms might change the sum, or even make it diverge.
Comparison Tests
To determine if a series converges, we can compare it to a series we already understand.
Comparison Test: If $0 \leq a_n \leq b_n$ for all $n$ and $\sum b_n$ converges, then $\sum a_n$ also converges. Similarly, if $\sum a_n$ diverges, then $\sum b_n$ also diverges.
Why is this useful? We don't have to analyze each series from scratch. If we can find a comparable series with known behavior, we can deduce the behavior of our series.
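As an illustration, take $a_n = \frac{1}{n^2 + n}$ and $b_n = \frac{1}{n^2}$: clearly $0 \le a_n \le b_n$, and $\sum b_n$ converges, so the comparison test gives convergence of $\sum a_n$. In this case the sum even telescopes to $1$, which a Python sketch can check numerically:

```python
# compare a_n = 1/(n^2 + n) with b_n = 1/n^2: 0 <= a_n <= b_n and sum(b_n)
# converges, so by the comparison test sum(a_n) converges too; since
# 1/(n^2 + n) = 1/n - 1/(n + 1), the partial sums telescope to 1 - 1/(N + 1)
partial = 0.0
for n in range(1, 100_001):
    a_n = 1 / (n * n + n)
    assert 0 <= a_n <= 1 / (n * n)   # the comparison hypothesis
    partial += a_n
assert abs(partial - 1.0) < 1e-4     # partial sums approach the limit 1
```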
Part 7: Extensions to Metric Spaces
So far, we've worked specifically with real numbers. But the concepts of limits and continuity are much broader. They apply to any space where we can measure distance.
Metric Spaces
A metric space is a set $M$ together with a distance function (or metric) $d$ that satisfies:
Positivity: $d(x, y) \geq 0$, and $d(x, y) = 0$ only if $x = y$
Symmetry: $d(x, y) = d(y, x)$
Triangle inequality: $d(x, z) \leq d(x, y) + d(y, z)$
Example: The real numbers with $d(x, y) = |x - y|$ form a metric space. So do points in the plane with the Euclidean distance $d((x_1, y_1), (x_2, y_2)) = \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2}$.
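These axioms can be spot-checked on a finite sample of points. The Python sketch below (helper name ours; a finite check can refute the axioms but never prove them) confirms $d(x, y) = |x - y|$ on a sample and shows that $d(x, y) = (x - y)^2$ fails the triangle inequality:

```python
import itertools

def is_metric_on_sample(d, points):
    """Spot-check the three metric axioms on a finite sample of points."""
    for x, y, z in itertools.product(points, repeat=3):
        if d(x, y) < 0 or (d(x, y) == 0) != (x == y):
            return False                               # positivity fails
        if d(x, y) != d(y, x):
            return False                               # symmetry fails
        if d(x, z) > d(x, y) + d(y, z) + 1e-12:        # tolerance for rounding
            return False                               # triangle inequality fails
    return True

pts = [-2.0, -0.5, 0.0, 1.0, 3.5]
assert is_metric_on_sample(lambda x, y: abs(x - y), pts)        # a metric
assert not is_metric_on_sample(lambda x, y: (x - y) ** 2, pts)  # squared distance
# fails: e.g. d(-2, 3.5) = 30.25 > d(-2, 0) + d(0, 3.5) = 4 + 12.25
```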
Normed Vector Spaces
A norm on a vector space $V$ is a function $\|\cdot\|$ that assigns each vector a non-negative length, satisfying:
$\|v\| \geq 0$, and $\|v\| = 0$ only if $v = 0$
$\|cv\| = |c| \|v\|$ for scalars $c$
$\|v + w\| \leq \|v\| + \|w\|$ (triangle inequality)
A norm naturally induces a metric: $d(v, w) = \|v - w\|$.
Example: In $\mathbb{R}^n$, the Euclidean norm is $\|v\| = \sqrt{v_1^2 + v_2^2 + \cdots + v_n^2}$.
Limits and Continuity in Metric Spaces
All the definitions of limits and continuity generalize to metric spaces by replacing absolute value with the metric.
Limit in a metric space: We say $\lim_{x \to c} f(x) = L$ if for every $\epsilon > 0$, there exists $\delta > 0$ such that $d(f(x), L) < \epsilon$ whenever $0 < d(x, c) < \delta$.
Continuity in a metric space: A function $f$ is continuous at $c$ if $\lim_{x \to c} f(x) = f(c)$ in the sense above.
Why does this matter? The proofs and properties we developed for real numbers—the MVT, the IVT, properties of continuous functions—many of them generalize to metric spaces. This shows that our analysis isn't just about real numbers; it's about the fundamental structure of limits and continuity.
Summary
Real analysis provides the rigorous foundations for calculus. The key concepts are:
Limits are the fundamental tool for defining continuity and derivatives with precision
Continuity ensures functions don't have unexpected jumps
Differentiation measures rates of change rigorously using limits
Integration computes areas using limits of sums
Sequences and series of functions generalize these ideas
The real number system's completeness makes all of this possible
Metric spaces show that these ideas apply far beyond just the real line
Understanding these concepts deeply—beyond just memorizing formulas—will serve you well in advanced mathematics.
Flashcards
What is the formal definition of a sequence $(a_n)$ converging to a limit $L$?
For every $\epsilon > 0$ there exists $N$ such that $|a_n - L| < \epsilon$ whenever $n \ge N$.
What is the formal $\epsilon$-$\delta$ definition of $\lim_{x \to c} f(x) = L$?
For every $\epsilon > 0$ there exists $\delta > 0$ such that $|f(x) - L| < \epsilon$ whenever $0 < |x - c| < \delta$.
Which algebraic operations are preserved by the properties of limits?
Addition
Subtraction
Multiplication
Division (provided the divisor limit is non-zero)
What is the definition of a function $f$ being continuous at a point $c$?
$\lim_{x \to c} f(x) = f(c)$
According to the Intermediate Value Theorem, if $f$ is continuous on $[a, b]$, what exists for every $y$ between $f(a)$ and $f(b)$?
There exists a $c \in [a, b]$ such that $f(c) = y$.
What distinguishes uniform continuity on a set $S$ from ordinary continuity?
The same $\delta$ works for every point in the set $S$ for a given $\epsilon$.
What are two major consequences of continuity regarding sets and boundedness?
Continuous functions map compact sets to compact sets.
Continuous functions preserve boundedness.
What is the formal limit definition of the derivative $f'(c)$?
$f'(c) = \lim_{h \to 0} \frac{f(c+h) - f(c)}{h}$ (provided the limit exists).
What is the relationship between differentiability and continuity at a point $c$?
If $f$ is differentiable at $c$, then $f$ is continuous at $c$.
What does the Mean Value Theorem guarantee exists for a function $f$ continuous on $[a, b]$ and differentiable on $(a, b)$?
A point $c \in (a, b)$ such that $f'(c) = \frac{f(b) - f(a)}{b - a}$.
What is the formula for the Chain Rule regarding the derivative of the composition $(f \circ g)(x)$?
$(f \circ g)'(x) = f'(g(x)) \cdot g'(x)$
How are higher-order derivatives, such as the second derivative $f''$, defined?
Iteratively, where each higher derivative is the derivative of the previous one.
How is the definite integral $\int_a^b f(x) \, dx$ defined using Riemann sums?
It is the limit of the sums $\sum_{i=1}^n f(x_i^*) \Delta x_i$ as the mesh of the partition approaches zero.
Under what condition involving upper and lower sums is a function Riemann integrable on $[a, b]$?
When the supremum of its lower sums equals the infimum of its upper sums.
According to the Fundamental Theorem of Calculus (Part I), what is the derivative of $F(x) = \int_a^x f(t) \, dt$ if $f$ is continuous?
$F'(x) = f(x)$
According to the Fundamental Theorem of Calculus (Part II), how is $\int_a^b f(x) \, dx$ evaluated using an antiderivative $F$?
$\int_a^b f(x) \, dx = F(b) - F(a)$
What are the two primary properties of the integral regarding linearity and interval additivity?
Linearity: $\int_a^b (cf + dg) = c \int_a^b f + d \int_a^b g$ for constants $c, d$
Additivity: $\int_a^c f + \int_c^b f = \int_a^b f$
What is the definition of a sequence of functions $(f_n)$ converging pointwise to $f$?
For each $x$, $\lim_{n \to \infty} f_n(x) = f(x)$.
What is the formal definition of uniform convergence for a sequence of functions $(f_n)$ on a set $S$?
For every $\epsilon > 0$ there exists $N$ such that $|f_n(x) - f(x)| < \epsilon$ for all $x \in S$ whenever $n \ge N$.
When is a series $\sum a_n$ said to converge absolutely?
When the series of absolute values $\sum |a_n|$ converges.
According to the Comparison Test, if $0 \le a_n \le b_n$ and $\sum b_n$ converges, what can be concluded about $\sum a_n$?
The series $\sum a_n$ also converges.
What does the Completeness Property of real numbers state?
Every non-empty set of real numbers bounded above has a least upper bound (supremum).
What is the definition of the supremum of a set?
The smallest real number greater than or equal to every element of the set.
What is the definition of the infimum of a set?
The greatest real number less than or equal to every element of the set.
What does the Archimedean Property state about any real number $x$?
There exists an integer $n$ such that $n > x$.
What does the density of rational numbers imply about any two distinct real numbers?
There exists a rational number between them.
What three properties must the distance function $d$ satisfy in a metric space $(M, d)$?
Positivity
Symmetry
Triangle inequality
How does a norm $\|\cdot\|$ on a vector space induce a metric $d(x, y)$?
$d(x, y) = \|x - y\|$
How are the definitions of limit and continuity extended from the real line to general metric spaces?
By using the metric $d$ in place of the absolute value.
Quiz
Introduction to Mathematical Analysis Quiz Question 1: When is a function $f$ said to be continuous at a point $c$?
- When $\displaystyle\lim_{x\to c}f(x)=f(c)$. (correct)
- When the limit $\displaystyle\lim_{x\to c}f(x)$ exists (finite or infinite).
- When $f$ has a derivative at $c$.
- When $f(c)$ is defined, regardless of limits.
Question 2: What does pointwise convergence of a sequence of functions $(f_n)$ to $f$ mean?
- For each $x$, $\displaystyle\lim_{n\to\infty}f_n(x)=f(x)$. (correct)
- The convergence $\displaystyle\sup_{x}|f_n(x)-f(x)|\to0$ as $n\to\infty$.
- The series $\sum_{n=1}^\infty f_n(x)$ converges to $f(x)$ for each $x$.
- The functions $f_n$ are all equal to $f$ after some index $N$.
Question 3: According to the Fundamental Theorem of Calculus Part I, if $F(x)=\displaystyle\int_a^x f(t)\,dt$ and $f$ is continuous, what is $F'(x)$?
- $F'(x)=f(x)$ (correct)
- $F'(x)=\displaystyle\int_a^x f(t)\,dt$
- $F'(x)=f(a)$
- $F'(x)=0$
Question 4: For differentiable functions $f$ and $g$, what is the derivative of the composition $f\circ g$ at a point $x$?
- $(f\circ g)'(x)=f'(g(x))\cdot g'(x)$ (correct)
- $(f\circ g)'(x)=f'(x)\cdot g'(x)$
- $(f\circ g)'(x)=f'(g(x))+g'(x)$
- $(f\circ g)'(x)=f(g'(x))\cdot g(x)$
Question 5: If $f$ is continuous on $[a,b]$ and $F$ is any antiderivative of $f$, what does the Fundamental Theorem of Calculus Part II state?
- $\int_a^b f(x)\,dx = F(b) - F(a)$ (correct)
- $\int_a^b f(x)\,dx = F(a) - F(b)$
- $\int_a^b f(x)\,dx = F(b) + F(a)$
- $\int_a^b f(x)\,dx = F'(b) - F'(a)$
Question 6: When does a series $\displaystyle\sum_{n=1}^\infty a_n$ converge absolutely?
- When the series of absolute values $\displaystyle\sum_{n=1}^\infty |a_n|$ converges. (correct)
- Whenever the original series $\displaystyle\sum a_n$ converges.
- Only if all terms $a_n$ are positive.
- If $\displaystyle\sum_{n=1}^\infty |a_n|$ diverges, then $\displaystyle\sum a_n$ must also diverge.
Question 7: According to the Mean Value Theorem, what must exist for a function $f$ that is continuous on $[a,b]$ and differentiable on $(a,b)$?
- A point $c\in(a,b)$ such that $f'(c)=\dfrac{f(b)-f(a)}{b-a}$. (correct)
- The function $f$ must be constant on $[a,b]$.
- The derivative $f'$ attains its maximum value at some point in $(a,b)$.
- The function $f$ is uniformly continuous on $[a,b]$.
Question 8: If $f$ is continuous on a compact set $K\subset\mathbb{R}$, which of the following must hold?
- The image $f(K)$ is also compact. (correct)
- The image $f(K)$ is necessarily an open set.
- The image $f(K)$ must be a dense subset of $\mathbb{R}$.
- The image $f(K)$ is always connected, even if $K$ is disconnected.
Question 9: Suppose a function $f$ has a derivative at a point $c$. Which conclusion is guaranteed?
- $f$ is continuous at $c$. (correct)
- $f$ is uniformly continuous on its entire domain.
- $f$ is monotonic in some neighborhood of $c$.
- $f$ is bounded on every interval containing $c$.
Question 10: Which of the following is a correct property of a norm $\|\cdot\|$ on a vector space?
- It induces a metric via $d(x,y)=\|x-y\|$. (correct)
- It must satisfy the parallelogram law for all vectors.
- It is always derivable from an inner product.
- It can assign negative values to some non‑zero vectors.
Question 11: If $0\le a_n\le b_n$ for every $n$ and the series $\sum b_n$ converges, what can be concluded about $\sum a_n$?
- The series $\sum a_n$ also converges (correct)
- The series $\sum a_n$ diverges
- The series $\sum a_n$ converges conditionally
- No conclusion can be drawn without further information
Question 12: Which of the following defines a valid metric on $\mathbb{R}$?
- $d(x,y)=|x-y|$ (correct)
- $d(x,y)=|x-y|+1$
- $d(x,y)=(x-y)^{2}$
- $d(x,y)=0$ for all $x,y$
Question 13: Which of the following statements about limits of sequences is always true?
- The limit of a sum equals the sum of the limits (provided the limits exist). (correct)
- The limit of a product equals the sum of the limits.
- The limit of a difference equals the product of the limits.
- The limit of a quotient equals the limit of the numerator minus the denominator.
Question 14: What does it mean for a sequence of functions (f_n) to converge uniformly to f on a set S?
- For every ε>0 there exists N such that |f_n(x)−f(x)|<ε for all x∈S whenever n≥N. (correct)
- For each x∈S there exists N (depending on x) such that |f_n(x)−f(x)|<ε whenever n≥N.
- The sequence converges in the L² norm on S.
- Only pointwise convergence is required, i.e., the inequality holds for each fixed x.
Question 15: In the ε‑δ definition of limit for functions between metric spaces, what replaces the absolute value |·|?
- The metric d(·,·) of the space. (correct)
- The Euclidean norm.
- The inner product ⟨·,·⟩.
- No replacement is needed; |·| is still used.
Question 16: How is the second derivative $f''$ defined?
- It is the derivative of the first derivative $f'$. (correct)
- It is the limit of the difference quotient applied directly to $f$ twice simultaneously.
- It is the integral of $f'$ over the domain.
- It is the product of $f'$ with itself.
Question 17: Which of the following expresses the linearity of the definite integral?
- $\displaystyle\int_a^b (af+bg)\,dx = a\int_a^b f\,dx + b\int_a^b g\,dx$ (correct)
- $\displaystyle\int_a^b (f+g)\,dx = \left(\int_a^b f\,dx\right)\!\cdot\!\left(\int_a^b g\,dx\right)$
- $\displaystyle\int_a^b af\,dx = a\int_a^b f\,dx + C$ for some constant $C$
- $\displaystyle\int_a^b f\,dx = a\int_a^b f\,dx$ for any real $a$
Question 18: What does “uniformly” mean in the definition of uniform continuity of a function $f$ on a set $S$?
- The same $\delta$ works for every point of $S$ in the ε‑δ condition. (correct)
- The $\delta$ may depend on the chosen point of $S$.
- Only the endpoints of $S$ need to satisfy the ε‑δ condition.
- The function must have a bounded derivative on $S$.
Key Concepts
Calculus Concepts
Limit (mathematics)
Continuity
Derivative
Riemann integral
Fundamental theorem of calculus
Intermediate value theorem
Mean value theorem
Mathematical Structures
Uniform convergence
Completeness (order theory)
Metric space
Normed vector space
Definitions
Limit (mathematics)
The value that a sequence or function approaches as its index or argument tends toward infinity or a specified point.
Continuity
A property of a function where small changes in the input produce arbitrarily small changes in the output.
Derivative
The instantaneous rate of change of a function, defined as the limit of the difference quotient.
Riemann integral
The limit of Riemann sums that gives the area under a bounded function on a closed interval.
Fundamental theorem of calculus
The theorem linking differentiation and integration, stating that integration and differentiation are inverse processes.
Intermediate value theorem
The principle that a continuous function on a closed interval takes every value between its endpoints.
Mean value theorem
The result that a differentiable function on an interval has a point where its instantaneous slope equals the average slope over the interval.
Uniform convergence
A mode of convergence of functions where the speed of convergence is independent of the point in the domain.
Completeness (order theory)
The property that every non‑empty set bounded above has a least upper bound (supremum) in the real numbers.
Metric space
A set equipped with a distance function satisfying positivity, symmetry, and the triangle inequality.
Normed vector space
A vector space with a norm that assigns lengths to vectors and induces a metric.