The expected value (mean) of random variables adds even if the variables are dependent.
This is the magic part:
Expectation is always linear — no independence required.
Formally, for any random variables $X$ and $Y$:
$$E[X + Y] = E[X] + E[Y]$$
And for constants $a$ and $b$:
$$E[aX + bY] = a\,E[X] + b\,E[Y]$$
This works for sums of any number of variables:
$$E[X_1 + X_2 + \cdots + X_n] = E[X_1] + E[X_2] + \cdots + E[X_n]$$
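As a quick numerical sanity check (a sketch, not part of the original example), we can sample a pair of deliberately *dependent* variables — here $Y = X^2$, a made-up choice where $Y$ is fully determined by $X$ — and confirm that the sample mean of the sum still matches the sum of the sample means:

```python
import random

# X is a fair die roll; Y = X**2 is a function of X, so they are maximally
# dependent. Linearity says E[X + Y] = E[X] + E[Y] regardless.
random.seed(0)

n = 100_000
samples = [(x, x * x) for x in (random.randint(1, 6) for _ in range(n))]

mean_x = sum(x for x, _ in samples) / n
mean_y = sum(y for _, y in samples) / n
mean_sum = sum(x + y for x, y in samples) / n

# The theoretical values are E[X] = 3.5 and E[Y] = (1+4+9+16+25+36)/6 = 91/6,
# so E[X + Y] should come out near 3.5 + 91/6 ≈ 18.67.
print(round(mean_sum, 2), round(mean_x + mean_y, 2))
```

The two printed numbers agree (up to rounding), even though $X$ and $Y$ are as dependent as variables can be.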
🎯 Why this matters
- You can compute expectations without knowing the full distribution.
- You don’t need independence.
- It simplifies many probability problems dramatically.
⭐ Examples
Example 1: Rolling Two Dice
Let
$X_1$ = result of die 1
$X_2$ = result of die 2
We know:
$$E[X_1] = E[X_2] = 3.5$$
Then:
$$E[X_1 + X_2] = E[X_1] + E[X_2] = 3.5 + 3.5 = 7$$
No need to compute the full distribution of the sum.
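To see that the shortcut agrees with the long way, this sketch (not in the original) brute-forces the full 36-outcome distribution with exact fractions and compares it to $E[X_1] + E[X_2]$:

```python
from fractions import Fraction
from itertools import product

# Long way: average the sum over all 36 equally likely (die 1, die 2) pairs.
faces = range(1, 7)
e_sum = Fraction(sum(a + b for a, b in product(faces, repeat=2)), 36)

# Shortcut: E[X1] = E[X2] = 7/2, so linearity gives 7/2 + 7/2 = 7.
e_single = Fraction(sum(faces), 6)

print(e_sum, e_single + e_single)  # both are 7
```

Both routes give exactly 7; linearity just skips the 36-term enumeration.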
Example 2: Number of Heads in 10 Coin Flips
Let $X_i = 1$ if flip $i$ is heads, and $X_i = 0$ otherwise.
Then:
$$E[X_i] = 0.5$$
Total heads:
$$X = X_1 + X_2 + \cdots + X_{10}$$
By linearity:
$$E[X] = E[X_1] + \cdots + E[X_{10}] = 10 \times 0.5 = 5$$
We didn’t need the binomial formula — linearity did all the work.
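For comparison, here is a sketch (not from the original) that does the heavy lifting linearity lets us skip — enumerating all $2^{10}$ equally likely flip sequences and averaging the head counts exactly:

```python
from fractions import Fraction
from itertools import product

# Enumerate every flip sequence (0 = tails, 1 = heads); each of the
# 2**10 = 1024 sequences is equally likely, so the expectation is a
# plain average of the head counts.
n_flips = 10
sequences = product((0, 1), repeat=n_flips)
e_heads = Fraction(sum(sum(seq) for seq in sequences), 2 ** n_flips)

print(e_heads)  # 5
```

The exhaustive average is exactly 5 — the same answer linearity produced in one line.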
Example 3: Expected Score on a Test
A test has 5 true/false questions.
A student guesses randomly.
Let $X_i = 1$ if question $i$ is correct, and $X_i = 0$ otherwise.
A random guess is right with probability $0.5$, so $E[X_i] = 0.5$.
Total score:
$$X = X_1 + \cdots + X_5$$
By linearity:
$$E[X] = 5 \times 0.5 = 2.5$$
Even though the student might get 0 or 5, the expected score is 2.5.
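A quick simulation (a sketch, not part of the original; the trial count is arbitrary) shows the average score of many random guessers settling near 2.5:

```python
import random

# Each trial: a student answers 5 true/false questions by coin flip,
# scoring one point per correct answer. Average the scores over many trials.
random.seed(42)
trials = 200_000
total = sum(sum(random.randint(0, 1) for _ in range(5)) for _ in range(trials))
avg_score = total / trials

print(round(avg_score, 2))  # close to 2.5
```

Individual students score anywhere from 0 to 5, but the average converges to the expected 2.5.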
Example 4: Dependent Variables (to show independence is NOT needed)
A box has 3 red and 1 blue ball.
You draw two balls without replacement.
Let
$X$ = number of red balls drawn
$X_1$ = indicator for first draw being red
$X_2$ = indicator for second draw being red
These draws are dependent — the first affects the second.
But:
$$X = X_1 + X_2$$
Compute:
$$E[X_1] = \frac{3}{4}, \qquad E[X_2] = \frac{3}{4}$$
(Yes — symmetry! Before you look at any draws, the second ball is equally likely to be any of the four, so it is red with probability $3/4$ too.)
So:
$$E[X] = E[X_1] + E[X_2] = \frac{3}{4} + \frac{3}{4} = 1.5$$
Even though the draws are dependent, linearity still works perfectly.
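A simulation sketch (not in the original) of the draws without replacement confirms both marginal probabilities and the total:

```python
import random

# Box: 3 red + 1 blue. Draw 2 without replacement; tally how often each
# draw is red and the total red count, then average over many trials.
random.seed(1)
box = ["red", "red", "red", "blue"]
trials = 100_000
first_red = second_red = total_red = 0

for _ in range(trials):
    draw = random.sample(box, 2)      # two draws, no replacement
    x1 = draw[0] == "red"
    x2 = draw[1] == "red"
    first_red += x1
    second_red += x2
    total_red += x1 + x2

print(round(first_red / trials, 2),   # near 0.75
      round(second_red / trials, 2),  # near 0.75 (symmetry)
      round(total_red / trials, 2))   # near 1.5
```

Both draws come up red about 75% of the time, and the average total lands at 1.5 — exactly what linearity predicted despite the dependence.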
🎨 Intuition
Expectation is like adding averages, not outcomes.
- You don’t care about interactions.
- You don’t care about dependence.
- You don’t care about the shape of the distribution.
You just add the expected contributions of each piece.