
expectation is linear

The expected value (mean) of a sum of random variables is the sum of their individual expected values, even when the variables are dependent.

This is the magic part:

Expectation is always linear — no independence required.

Formally, for any random variables X and Y:

E[X + Y] = E[X] + E[Y]

And for constants a and b:

E[aX + b] = aE[X] + b

This works for sums of any number of variables:

E\left[\sum_{i=1}^n X_i\right] = \sum_{i=1}^n E[X_i]
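To see that independence really is unnecessary, here is a minimal Python sketch of my own (the setup is an assumption for illustration: Y is defined as 7 − X, so X and Y are as dependent as possible) that checks E[X + Y] = E[X] + E[Y] numerically:

```python
import random

random.seed(0)
N = 100_000

# X is uniform on {1..6}; Y = 7 - X is completely determined by X,
# so the pair is maximally dependent.
samples = [(x, 7 - x) for x in (random.randint(1, 6) for _ in range(N))]

mean_x = sum(x for x, _ in samples) / N
mean_y = sum(y for _, y in samples) / N
mean_sum = sum(x + y for x, y in samples) / N

# Linearity holds anyway: the sample means agree (up to float rounding).
print(mean_x, mean_y, mean_sum)
```

Here X + Y is always exactly 7, yet mean_x + mean_y still matches mean_sum, which is all linearity promises.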

🎯 Why this matters

  • You can compute expectations without knowing the full distribution.
  • You don’t need independence.
  • It simplifies many probability problems dramatically.

⭐ Examples

Example 1: Rolling Two Dice

Let

  • X = result of die 1
  • Y = result of die 2

We know:

E[X] = E[Y] = 3.5

Then:

E[X + Y] = E[X] + E[Y] = 3.5 + 3.5 = 7

No need to compute the full distribution of the sum.
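If you want to double-check the shortcut anyway, a few lines of Python (my own illustration, using exact fractions) enumerate all 36 outcomes and confirm that the brute-force answer matches the linearity answer:

```python
from fractions import Fraction

faces = range(1, 7)

# Linearity shortcut: E[X] for one die, then double it.
E_X = sum(Fraction(f) for f in faces) / 6  # 7/2

# Brute force: average X + Y over all 36 equally likely outcomes.
E_sum = sum(Fraction(x + y) for x in faces for y in faces) / 36

print(E_X + E_X, E_sum)  # both are 7
```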

Example 2: Number of Heads in 10 Coin Flips

Let X_i = 1 if flip i is heads, and 0 otherwise.

Then:

E[X_i] = 0.5

Total heads:

X = X_1 + X_2 + \cdots + X_{10}

By linearity:

E[X] = 10(0.5) = 5

We didn’t need the binomial formula — linearity did all the work.
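As a sanity check, this Python sketch (illustrative only) compares the one-line linearity answer against a full enumeration of all 2^10 flip sequences:

```python
from fractions import Fraction
from itertools import product

n = 10
p = Fraction(1, 2)  # probability of heads on each flip

# Linearity shortcut: n indicators, each with expectation p.
E_linearity = n * p  # 5

# Brute force: each of the 2^n sequences has probability 1/2^n;
# average the number of heads over all of them.
E_brute = sum(Fraction(sum(seq), 2**n) for seq in product((0, 1), repeat=n))

print(E_linearity, E_brute)  # both are 5
```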

Example 3: Expected Score on a Test

A test has 5 true/false questions.
A student guesses randomly.

Let X_i = 1 if question i is correct.

E[X_i] = 0.5

Total score:

E[X_1 + \cdots + X_5] = 5(0.5) = 2.5

Even though the student might get 0 or 5, the expected score is 2.5.
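A quick simulation (a sketch of my own; the trial count is arbitrary) shows the same thing empirically, including the fact that individual scores are whole numbers while the average converges to 2.5:

```python
import random

random.seed(1)
trials = 200_000

# Each trial: a student guesses on 5 true/false questions.
scores = [sum(random.random() < 0.5 for _ in range(5)) for _ in range(trials)]

avg = sum(scores) / trials
print(avg)  # close to 2.5, though no single score is 2.5
```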

Example 4: Dependent Variables (to show independence is NOT needed)

A box has 3 red and 1 blue ball.
You draw two balls without replacement.

Let

  • X = number of red balls drawn
  • X_1 = indicator for first draw being red
  • X_2 = indicator for second draw being red

These draws are dependent — the first affects the second.

But:

E[X] = E[X_1] + E[X_2]

Compute:

E[X_1] = \frac{3}{4}

E[X_2] = \frac{3}{4}

(Yes, by symmetry: before you know anything about the first draw, the second ball is equally likely to be any of the four balls, so it is red with probability 3/4.)

So:

E[X] = \frac{3}{4} + \frac{3}{4} = \frac{3}{2}

Even though the draws are dependent, linearity still works perfectly.
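This example can also be verified by enumerating every ordered pair of draws. The Python sketch below (my own illustration, using exact fractions) lists all 12 equally likely ordered draws without replacement and confirms both indicator expectations and their sum:

```python
from fractions import Fraction
from itertools import permutations

balls = ["R", "R", "R", "B"]  # 3 red, 1 blue

# All ordered draws of two balls without replacement: 4 * 3 = 12,
# each equally likely (the three red balls are treated as distinct).
outcomes = list(permutations(balls, 2))

E_X1 = Fraction(sum(a == "R" for a, _ in outcomes), len(outcomes))  # 3/4
E_X2 = Fraction(sum(b == "R" for _, b in outcomes), len(outcomes))  # 3/4
E_X = Fraction(sum((a == "R") + (b == "R") for a, b in outcomes), len(outcomes))

print(E_X1, E_X2, E_X)  # 3/4, 3/4, 3/2
```

The draws are dependent (knowing the first draw changes the odds on the second), but E[X] still splits cleanly into E[X_1] + E[X_2].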

🎨 Intuition

Expectation is like adding averages, not outcomes.

  • You don’t care about interactions.
  • You don’t care about dependence.
  • You don’t care about the shape of the distribution.

You just add the expected contributions of each piece.
