
Introduction to Expected Values

Understand what expected value is, how to calculate it for discrete and continuous variables, and why it matters through linearity and the law of large numbers.


Summary

Expected Value: A Comprehensive Guide

What is Expected Value?

The expected value of a random variable is the long-run average outcome you would observe if you repeated a random experiment many times. It is denoted by $E[X]$ or sometimes by the Greek letter $\mu$. Think of expected value as the "center of mass" of a probability distribution: the point around which outcomes cluster when weighted by their probabilities. If you roll a fair die repeatedly and compute the average of your rolls, that average will eventually settle around the expected value of the die.

An important insight: the expected value is not required to be a possible outcome of the random variable itself. For example, the expected value of rolling a fair die is 3.5, yet you can never actually roll a 3.5.

Expected Value for Discrete Random Variables

For a discrete random variable, one that can take on a finite or countably infinite number of specific values, computing the expected value is straightforward. If your random variable $X$ can take values $x_1, x_2, \dots, x_k$ with probabilities $p_1, p_2, \dots, p_k$ respectively, then:

$$E[X] = \sum_{i=1}^{k} x_i \, p_i$$

In words: multiply each possible outcome by its probability, then add all these products together. Remember: the probabilities must satisfy $\sum_{i=1}^{k} p_i = 1$ (they sum to 1).

Example: Rolling a Fair Die

A fair six-sided die has outcomes 1, 2, 3, 4, 5, 6, each with probability $\frac{1}{6}$. The expected value is:

$$E[X] = 1 \cdot \frac{1}{6} + 2 \cdot \frac{1}{6} + 3 \cdot \frac{1}{6} + 4 \cdot \frac{1}{6} + 5 \cdot \frac{1}{6} + 6 \cdot \frac{1}{6} = \frac{21}{6} = 3.5$$

Each outcome contributes equally to the average because each has equal probability.

Expected Value for Continuous Random Variables

For continuous random variables, which can take any value within an interval, expected value is computed using integration instead of summation.
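The discrete formula above can be sketched in a few lines of Python; the fair-die example is used to check the result. The helper name `expected_value` is just an illustration, not a standard library function.

```python
# Expected value of a discrete random variable: E[X] = sum_i x_i * p_i.
# A minimal sketch using the fair-die example from the text.

def expected_value(outcomes, probs):
    """Weighted average of outcomes; probs must sum to 1."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(x * p for x, p in zip(outcomes, probs))

die_outcomes = [1, 2, 3, 4, 5, 6]
die_probs = [1 / 6] * 6

print(expected_value(die_outcomes, die_probs))  # ≈ 3.5
```

The same function works for any discrete distribution, e.g. a biased coin with outcomes `[0, 1]` and probabilities `[0.7, 0.3]` gives 0.3.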
If $X$ is a continuous random variable with probability density function $f(x)$, then:

$$E[X] = \int_{-\infty}^{\infty} x \, f(x) \, dx$$

The probability density function must satisfy $\int_{-\infty}^{\infty} f(x) \, dx = 1$, analogous to probabilities summing to 1 in the discrete case. The integral averages the outcomes weighted by their probability density. Just as in the discrete case, you are multiplying values by their probabilities (or probability densities) and summing the contributions.

Key Properties of Expected Value

Linearity of Expectation

One of the most powerful properties is linearity of expectation. For any constants $a$ and $b$ and any random variables $X$ and $Y$:

$$E[aX + bY] = a \, E[X] + b \, E[Y]$$

This property is remarkable because it holds regardless of whether $X$ and $Y$ are independent. That makes expected value incredibly useful: you can break complex expressions into simpler pieces, compute their expectations separately, and combine the results.

Example: If $X$ has expected value 10 and $Y$ has expected value 5, then $E[2X + 3Y] = 2(10) + 3(5) = 35$.

Non-negativity of Expectation

If a random variable $X$ is always non-negative (i.e., $X \geq 0$ always), then its expected value must also be non-negative: $E[X] \geq 0$. This is intuitive: if you only have non-negative outcomes and you average them (weighted by probability), you cannot get a negative result.

The Law of Large Numbers: Why Expected Value Matters

The Law of Large Numbers explains why expected value is such a powerful concept. This theorem states that if you repeatedly draw independent samples from the same distribution, the average of those samples converges toward the expected value as the number of samples increases.
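The continuous integral can be approximated numerically. The sketch below uses a midpoint Riemann sum and, as an illustrative assumption, an exponential density with rate 2, whose true expected value is $1/2$; the integration range $[0, 20]$ truncates a negligibly small tail.

```python
import math

# Numerically approximate E[X] = ∫ x f(x) dx for a continuous variable.
# Assumed example density: Exponential(rate=2), so the true E[X] is 0.5.

def density(x, rate=2.0):
    """Exponential probability density function."""
    return rate * math.exp(-rate * x) if x >= 0 else 0.0

def expected_value_continuous(f, lo, hi, steps=200_000):
    """Midpoint Riemann sum of x * f(x) over [lo, hi]."""
    dx = (hi - lo) / steps
    return sum((lo + (i + 0.5) * dx) * f(lo + (i + 0.5) * dx) * dx
               for i in range(steps))

approx = expected_value_continuous(density, 0.0, 20.0)
print(round(approx, 4))  # ≈ 0.5
```

In practice a numerical library (e.g. `scipy.integrate.quad`) would replace the hand-rolled sum, but the structure mirrors the formula directly: values times density, summed.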
In other words, if you have independent, identically distributed observations $X_1, X_2, X_3, \ldots, X_n$ from a random variable with expected value $\mu$, then the sample average

$$\bar{X}_n = \frac{1}{n}(X_1 + X_2 + \cdots + X_n)$$

gets closer and closer to $\mu$ as $n$ grows large. This explains the intuitive interpretation of expected value as a "long-run average." The convergence is not just a theoretical nicety: it is the foundation for why we can use expected value as a reliable predictor of behavior over time, which makes it essential for practical applications in finance, insurance, and engineering.

Why Expected Value Matters in Statistics and Probability

Expected value is a cornerstone concept with wide-ranging applications:

Foundation for variance: Variance, which measures the spread of a distribution, is itself defined using expected value: $\text{Var}(X) = E[(X - E[X])^2]$. You cannot understand variability without understanding expectation.

Higher-order moments: Expected value extends to computing second moments $E[X^2]$, skewness, and other descriptors of a distribution's shape.

Risk assessment: In finance and insurance, expected value helps quantify risk. Insurance premiums are priced based on the expected value of claims, and investment decisions depend on expected returns.

Statistical estimation: When designing estimators (methods to guess unknown population parameters), expected value guides us toward unbiased estimators: those whose expected value equals the true parameter.

Decision-making: Expected value provides a rational framework for comparing uncertain outcomes, which is essential in business, medicine, and policy decisions.
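The Law of Large Numbers is easy to watch in a simulation: the sample mean of many independent fair-die rolls settles near 3.5. The seed below is an arbitrary choice made only so the run is reproducible.

```python
import random

# Law of Large Numbers sketch: average many independent, identically
# distributed die rolls and compare against the expected value 3.5.
random.seed(42)  # arbitrary seed for reproducibility

n = 100_000
rolls = [random.randint(1, 6) for _ in range(n)]
sample_mean = sum(rolls) / n

print(sample_mean)  # close to 3.5 for large n
```

Rerunning with larger `n` (or averaging over fresh seeds) pulls the sample mean still closer to 3.5, which is exactly the convergence the theorem describes.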
Flashcards
What is the conceptual definition of expected value in terms of a repeated random experiment?
The long-run average outcome.
What physical concept does the expected value represent in a probability distribution?
The center of mass.
Is the expected value of a random variable required to be one of its possible outcomes?
No.
For a discrete random variable $X$, what is the formula for expected value $E[X]$?
$E[X]=\sum_{i=1}^{k} x_i \, p_i$ (where $x_i$ are outcomes and $p_i$ are their probabilities).
What condition must the probabilities $pi$ of a discrete random variable satisfy?
$\sum_{i=1}^{k} p_i = 1$.
For a continuous random variable $X$ with density $f(x)$, what is the formula for expected value $E[X]$?
$E[X]=\int_{-\infty}^{\infty} x \, f(x) \, dx$.
What is the required integral of a probability density function $f(x)$ over all space?
$\int_{-\infty}^{\infty} f(x) \, dx = 1$.
What is the formula for the linearity of expectation for $E[aX + bY]$?
$a E[X] + b E[Y]$ (where $a, b$ are constants and $X, Y$ are random variables).
Does the linearity of expectation require the random variables $X$ and $Y$ to be independent?
No, it holds regardless of independence.
If a random variable $X$ is always non-negative, what must be true of its expected value $E[X]$?
It is also non-negative.
How does the sample average of many independent draws behave according to the Law of Large Numbers?
It converges toward the expected value.
What two conditions must the draws of a random variable satisfy for the Law of Large Numbers to apply?
Independent and identically distributed.
What statistical measure of spread is defined using the expected value?
Variance.
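The variance definition from the card above, $\text{Var}(X) = E[(X - E[X])^2]$, can be checked exactly for the fair die using rational arithmetic (a sketch; the variable names are illustrative).

```python
# Variance defined via expected value: Var(X) = E[(X - E[X])^2],
# computed exactly for a fair six-sided die with fractions.
from fractions import Fraction

outcomes = range(1, 7)
p = Fraction(1, 6)  # each outcome is equally likely

mean = sum(x * p for x in outcomes)               # E[X] = 7/2
var = sum((x - mean) ** 2 * p for x in outcomes)  # E[(X - E[X])^2]

print(mean, var)  # 7/2 35/12
```

Using `Fraction` avoids floating-point rounding, so the result is the exact value $35/12 \approx 2.92$.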

Key Concepts
Random Variables
Discrete random variable
Continuous random variable
Probability mass function
Probability density function
Statistical Concepts
Expected value
Variance
Linearity of expectation
Law of large numbers