Law of Large Numbers - Illustrative Examples
Understand how the law of large numbers allows absolute differences to grow while relative proportions vanish, underpins Monte Carlo accuracy, and drives empirical probabilities toward their theoretical values.
Summary
Understanding the Law of Large Numbers: Key Examples
Introduction
The Law of Large Numbers is one of the most important principles in probability and statistics. It tells us that as we repeat an experiment more and more times, our observed results tend to match the theoretical predictions better and better. The three examples below illustrate different aspects of this fundamental principle and show why it matters in both theory and practice.
The Absolute Difference Grows, but the Relative Proportion Vanishes
This example addresses a common source of confusion about the Law of Large Numbers.
Imagine flipping a fair coin many times. You might expect that the number of heads and tails should always be exactly equal, but that's not how randomness works. In reality, as you flip more times, the absolute difference between the number of heads and tails can grow—sometimes you might end up with 510 heads and 490 tails (a difference of 20), and if you keep flipping, you might eventually have 5,100 heads and 4,900 tails (a difference of 200).
This seems to contradict the Law of Large Numbers, but here's the crucial insight: what matters is not the absolute difference but the ratio, or relative proportion. In the first case, a difference of 20 out of 1,000 flips is a 2% deviation. In the second case, a difference of 200 out of 10,000 flips is also a 2% deviation: the absolute gap grew tenfold, yet the relative gap did not grow at all.
As you continue flipping, this relative proportion keeps shrinking. You might eventually have 50,000 heads and 50,000 tails (equal!), or 50,100 heads and 49,900 tails (a difference of 200 in 100,000 flips, or just 0.2% deviation). The larger the sample, the smaller the relative deviation becomes.
The key takeaway: Don't confuse absolute differences with relative proportions. The Law of Large Numbers guarantees that relative differences shrink, even if absolute differences might occasionally grow.
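A short simulation makes the distinction concrete. The sketch below (in Python, with a fixed seed purely for reproducibility; the helper name `flip_stats` is our own) tracks both quantities as the number of flips grows: the absolute gap typically keeps pace with the square root of the number of flips, while the relative gap shrinks toward zero.

```python
import random

random.seed(42)  # arbitrary seed so repeated runs give the same numbers

def flip_stats(n_flips):
    """Flip a fair coin n_flips times; return (absolute gap, relative gap)."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    tails = n_flips - heads
    gap = abs(heads - tails)
    return gap, gap / n_flips

for n in [100, 10_000, 1_000_000]:
    gap, ratio = flip_stats(n)
    print(f"{n:>9} flips: |heads - tails| = {gap:>5}, relative = {ratio:.4%}")
```

Running this, you would typically see the first column grow across the three rows while the second column drops by roughly an order of magnitude each time.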
Empirical Probability Converges to Theoretical Probability
This is the most direct application of the Law of Large Numbers.
When we perform an experiment many times, the empirical probability (the fraction of times an outcome actually occurred) converges to the theoretical probability (what we expect based on the underlying probability distribution).
For example, consider rolling a standard six-sided die. The theoretical probability of rolling a 4 is $\frac{1}{6} \approx 0.167$. If you roll the die just 10 times, you might get a 4 only once, giving an empirical probability of $\frac{1}{10} = 0.1$. But if you roll it 1,000 times, you'll get much closer to $\frac{1}{6}$—perhaps $\frac{168}{1000} = 0.168$. At 100,000 rolls, you'll get even closer.
[Figure: running average of repeated die rolls, observed average (green line) vs. theoretical mean (blue line)]
Notice how the observed average starts off erratic and noisy, but gradually settles down and converges to the theoretical mean of 3.5 for a standard die. The more trials we conduct, the more stable our observed average becomes.
Why does this matter? This is why we can trust statistics. When we survey 1,000 people instead of 10 people, we get much more reliable estimates of population characteristics because of this convergence principle.
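The die-rolling numbers above are easy to reproduce. Here is a minimal sketch (the function name `empirical_prob_of_four` and the seed are our own choices) that estimates the probability of rolling a 4 at increasing sample sizes:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def empirical_prob_of_four(n_rolls):
    """Roll a fair six-sided die n_rolls times; return the fraction of 4s."""
    fours = sum(random.randint(1, 6) == 4 for _ in range(n_rolls))
    return fours / n_rolls

theoretical = 1 / 6  # about 0.1667
for n in [10, 1_000, 100_000]:
    p = empirical_prob_of_four(n)
    print(f"{n:>7} rolls: empirical = {p:.4f}, |error| = {abs(p - theoretical):.4f}")
```

With 10 rolls the error can easily be several percentage points; by 100,000 rolls it is typically well under half a percentage point.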
The Monte Carlo Method Relies on the Law of Large Numbers
Understanding the Law of Large Numbers explains why a powerful computational technique works: the Monte Carlo method.
The Monte Carlo method solves difficult problems by using repeated random sampling. Instead of trying to solve a complex mathematical problem analytically, we:
Generate many random samples
Calculate a result for each sample
Average all the results
This works because of the Law of Large Numbers. As the number of random samples increases, our average result converges to the true answer.
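The three steps above can be sketched in a few lines. As an illustrative example (our own choice, not from the text), we estimate the integral of $x^2$ over $[0, 1]$, whose true value is $\frac{1}{3}$, by averaging $x^2$ at uniformly random points:

```python
import random

random.seed(1)  # fixed seed for reproducibility

def mc_estimate(f, n_samples):
    """Monte Carlo estimate of the integral of f over [0, 1]:
    generate random samples, evaluate f at each, and average."""
    return sum(f(random.random()) for _ in range(n_samples)) / n_samples

# True value of the integral of x^2 over [0, 1] is 1/3 ~ 0.33333.
for n in [100, 10_000, 1_000_000]:
    print(f"{n:>9} samples: estimate = {mc_estimate(lambda x: x * x, n):.5f}")
```

By the Law of Large Numbers, the average converges to the expected value of $f$ under the uniform distribution, which is exactly the integral.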
A practical example: Suppose you wanted to estimate the area of an irregular shape. You could:
Draw the shape inside a square of known area
Generate thousands of random points uniformly distributed within the square
Count how many points fall inside the shape versus outside
Use the ratio to estimate the shape's area
With just a few random points, your estimate will be rough. But as you generate more and more random points, the ratio of points inside the shape to total points converges to the true ratio of areas, giving you a better and better estimate.
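Following the recipe above, here is a sketch for a shape whose area we already know, so the convergence is easy to check: a quarter disc inside the unit square, with true area $\frac{\pi}{4} \approx 0.785$. (The function name and seed are our own; any shape with a cheap inside/outside test works the same way.)

```python
import math
import random

random.seed(7)  # fixed seed for reproducibility

def estimate_area(n_points):
    """Scatter n_points uniformly in the unit square and count how many
    land inside the quarter disc x^2 + y^2 <= 1."""
    inside = sum(
        random.random() ** 2 + random.random() ** 2 <= 1 for _ in range(n_points)
    )
    # The square has area 1, so the inside fraction is itself the area estimate.
    return inside / n_points

for n in [100, 10_000, 1_000_000]:
    print(f"{n:>9} points: estimate = {estimate_area(n):.4f} (true: {math.pi / 4:.4f})")
```

With 100 points the estimate can be off by several percent; with a million points it is typically within a fraction of a percent, exactly the convergence the Law of Large Numbers promises.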
This same principle underlies many modern applications: financial risk modeling (by simulating thousands of possible market scenarios), weather prediction (by running multiple simulation models), and machine learning algorithms that rely on averaging over many samples.
The remarkable thing is that this method doesn't require you to solve a difficult equation—it just requires generating random samples and averaging. The Law of Large Numbers guarantees that you'll get closer to the truth as you increase the number of samples.
Flashcards
What happens to the ratio of the difference between heads and tails to the total number of flips as the number of flips increases?
It tends to zero
To what value does the empirical probability of success in Bernoulli trials converge?
The theoretical probability of success
On what principle does the Monte Carlo method rely to produce accurate estimates through repeated random sampling?
The Law of Large Numbers
Quiz
Law of large numbers - Illustrative Examples Quiz Question 1: As the number of fair‑coin flips grows large, what happens to the ratio of the absolute difference between heads and tails to the total number of flips?
- It approaches zero. (correct)
- It approaches one half.
- It approaches one.
- It remains constant.
Question 2: Which statistical principle explains why Monte Carlo estimates become more accurate as the number of random samples increases?
- Law of large numbers (correct)
- Central limit theorem
- Bayes' theorem
- Markov inequality
Question 3: Which theorem guarantees that the empirical probability of success in repeated Bernoulli trials converges to the true probability?
- Law of large numbers (correct)
- Central limit theorem
- Chebyshev's inequality
- Bayes' theorem
Key Concepts
Probability Concepts
Law of Large Numbers
Bernoulli trial
Empirical probability
Relative frequency
Convergence in probability
Statistical Methods
Monte Carlo method
Random sampling
Asymptotic behavior
Definitions
Law of Large Numbers
A theorem stating that the average of a large number of independent, identically distributed random variables converges to their expected value.
Monte Carlo method
A computational technique that estimates results by performing repeated random sampling.
Bernoulli trial
An experiment with exactly two possible outcomes, typically labeled “success” and “failure.”
Empirical probability
The probability of an event estimated from the relative frequency of its occurrence in observed data.
Relative frequency
The proportion of times an event occurs compared to the total number of trials.
Convergence in probability
A mode of convergence where the probability that a sequence deviates from its limit by more than any ε approaches zero as the sample size grows.
Random sampling
The process of selecting a subset of individuals from a population such that each member has an equal chance of being chosen.
Asymptotic behavior
The tendency of a function or sequence to approach a particular value or pattern as its argument or index goes to infinity.