# What is mean independence?

Mean independence is a relationship between two random variables that lies between the usual definition of independence and uncorrelatedness. A random variable $Y$ is said to be mean independent of $X$ if (and only if) $\mathbb{E}[Y \mid X = x] = \mathbb{E}[Y]$

for all $x$ such that the probability mass/density of $X$ at $x$ is not zero.

Mean independence comes up most often in econometrics as an assumption that is weaker than independence but stronger than uncorrelatedness. Independence implies mean independence, and mean independence implies uncorrelatedness.

## Independence implies mean independence

This is the most technical argument of the lot (adapted from Reference 2). By definition of conditional expectation, $\mathbb{E}[Y \mid X]$ is the unique random variable such that for any $A \in \sigma(X)$ (the $\sigma$-algebra generated by $X$), we have $\mathbb{E}[\mathbb{E}[Y \mid X] 1_A] = \mathbb{E}[Y 1_A].$

However, by independence of $X$ and $Y$, $\mathbb{E}[Y 1_A] = \mathbb{E}[Y] \mathbb{E}[1_A] = \mathbb{E}[\mathbb{E}[Y]1_A].$

Hence, $\mathbb{E}[Y \mid X]$ must be equal to the constant random variable $\mathbb{E}[Y]$, i.e. $Y$ is mean independent of $X$.

(Update: Ok here’s an easier way to see this: independence implies that the conditional distribution of $Y$ given $X = x$ is the same for all $x$. Since the conditional mean $\mathbb{E}[Y \mid X = x]$ is a function of the conditional distribution $Y \mid X = x$, it follows that $\mathbb{E}[Y \mid X = x]$ is the same for all $x$, and so must be equal to $\mathbb{E}[Y]$.)
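We can also see this empirically. The following sketch (my own illustration, not from the references) draws a large sample of an independent pair and checks that the conditional means of $Y$ all agree with the overall mean; the specific distributions for $X$ and $Y$ are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.integers(0, 3, size=n)     # X uniform on {0, 1, 2}
y = rng.normal(loc=5.0, size=n)    # Y ~ N(5, 1), drawn independently of X

# Under independence, E[Y | X = v] should match E[Y] for every value v
overall_mean = y.mean()
cond_means = [y[x == v].mean() for v in range(3)]
print(overall_mean, cond_means)
```

Up to sampling noise, each conditional mean agrees with the overall mean of about 5.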

## Mean independence implies uncorrelatedness

(Proof adapted from Reference 3.) Assume $Y$ is mean independent of $X$. By the law of iterated expectations, \begin{aligned} \mathbb{E}[XY] &= \mathbb{E}[\mathbb{E}[XY \mid X]] \\ &= \mathbb{E}[X \mathbb{E}[Y \mid X]] \\ &= \mathbb{E}[X \mathbb{E}[Y]] \\ &= \mathbb{E}[Y] \mathbb{E}[X], \end{aligned}

so $\text{Cov}(X, Y) = 0$.
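To make the derivation concrete, here is a toy distribution of my own construction (not from the references) in which $Y$ is mean independent of $X$ even though the magnitude of $Y$ depends on $X$; computing exactly with Python's `fractions` module confirms the covariance is zero:

```python
from fractions import Fraction

# Given X, Y is a symmetric +/- coin whose magnitude depends on X,
# so E[Y | X] = 0 = E[Y] (mean independent) but Y is not independent of X
points = {(0, -1): Fraction(1, 4), (0, 1): Fraction(1, 4),
          (1, -2): Fraction(1, 4), (1, 2): Fraction(1, 4)}

e_x  = sum(p * x for (x, _), p in points.items())
e_y  = sum(p * y for (_, y), p in points.items())
e_xy = sum(p * x * y for (x, y), p in points.items())
print(e_xy - e_x * e_y)   # covariance comes out to 0
```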

Neither converse holds, as the counterexamples below show. (Counterexamples adapted from Reference 4.)

## Mean independence does not imply independence

Let $X = \cos \theta$ and $Y = \sin \theta$, with $\theta \sim \text{Unif}[0, 2\pi]$. For any $x \in (-1, 1)$, $\mathbb{P}(Y = \sqrt{1-x^2} \mid X = x) = \mathbb{P}(Y = -\sqrt{1-x^2} \mid X = x) = 1/2$, so $\mathbb{E}[Y \mid X = x] = 0$, i.e. $Y$ is mean independent of $X$. We can similarly establish that $X$ is mean independent of $Y$.

On the other hand, $\mathbb{P}(X > 3/4) > 0$ and $\mathbb{P}(Y > 3/4) > 0$. However, if both $X > 3/4$ and $Y > 3/4$ held, we would have $X^2 + Y^2 > 2 (3/4)^2 = 9/8 > 1$, contradicting $X^2 + Y^2 = 1$. Hence $\mathbb{P}(X > 3/4 \text{ and } Y > 3/4) = 0 \neq \mathbb{P}(X > 3/4) \, \mathbb{P}(Y > 3/4).$

Therefore, $X$ and $Y$ are not independent.
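A quick simulation (my own sketch, not from the references) illustrates both halves of this counterexample: conditional means of $Y$ near any value of $X$ hover around zero, yet the two events above never occur together:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)
x, y = np.cos(theta), np.sin(theta)

# Mean independence: E[Y | X near x0] is approximately 0 for any x0,
# even though Y is determined (up to sign) by X
cond_means = {x0: y[np.abs(x - x0) < 0.01].mean() for x0 in (-0.9, 0.0, 0.5)}
print(cond_means)

# Not independent: {X > 3/4} and {Y > 3/4} each have positive probability,
# but they can never happen together since X^2 + Y^2 = 1
p_x = (x > 0.75).mean()
p_y = (y > 0.75).mean()
p_both = ((x > 0.75) & (y > 0.75)).mean()
print(p_x, p_y, p_both)
```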

## Uncorrelatedness does not imply mean independence

Let $X$ and $Y$ be such that $(X, Y) \in \{ (1, 3), (-3, 1), (-1, -3), (3, -1) \}$ with equal probability. With this setup, $\mathbb{E}[XY] = \mathbb{E}[X] = \mathbb{E}[Y] = 0$,

so $\text{Cov}(X, Y) = 0$, i.e. $X$ and $Y$ are uncorrelated. However, $\mathbb{E}[Y \mid X = 1] = 3, \quad \mathbb{E}[Y \mid X = -3] = 1,$

so $Y$ is not mean independent of $X$.
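The arithmetic in this counterexample can be checked exactly (a sanity-check sketch of my own, not from the references):

```python
from fractions import Fraction

# The four equally likely outcomes of (X, Y)
points = [(1, 3), (-3, 1), (-1, -3), (3, -1)]
p = Fraction(1, 4)

e_x  = sum(p * x for x, _ in points)
e_y  = sum(p * y for _, y in points)
e_xy = sum(p * x * y for x, y in points)
print(e_xy - e_x * e_y)   # covariance is 0, so X and Y are uncorrelated

# Conditional means of Y vary with X, so Y is not mean independent of X
cond = {x0: Fraction(sum(y for x, y in points if x == x0),
                     sum(1 for x, _ in points if x == x0))
        for x0 in (1, -3)}
print(cond)   # conditional means 3 and 1
```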

## Mean independence is not symmetric

Both independence and uncorrelatedness are symmetric; however (perhaps surprisingly), mean independence is not. Here is a simple example from StackExchange demonstrating this. Let $X$ and $Y$ be such that $(X, Y) \in \{ (2, 1), (-2, 1), (5, -1), (-5, -1) \}$ with equal probability. With this setup, $\mathbb{E}[X \mid Y = 1] = \mathbb{E}[X \mid Y = -1] = 0,$

so $X$ is mean independent of $Y$, but $\mathbb{E}[Y \mid X = 2] = 1, \quad \mathbb{E}[Y \mid X = 5] = -1,$

so $Y$ is not mean independent of $X$.
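Again, the asymmetry can be verified exactly (my own sanity-check sketch, not from the original example):

```python
from fractions import Fraction

# The four equally likely outcomes of (X, Y)
points = [(2, 1), (-2, 1), (5, -1), (-5, -1)]

def mean(values):
    return Fraction(sum(values), len(values))

# E[X | Y = y0] is 0 for both values of Y: X is mean independent of Y
e_x_given_y = {y0: mean([x for x, y in points if y == y0]) for y0 in (1, -1)}
print(e_x_given_y)

# E[Y | X = x0] varies with x0: Y is not mean independent of X
e_y_given_x = {x0: mean([y for x, y in points if x == x0]) for x0 in (2, 5)}
print(e_y_given_x)
```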

References:

1. Wikipedia. Mean dependence.
2. Dembo, A. Stat/219 Math 136 – Stochastic Processes: Notes on Section 4.1.2.
3. Zhang, H. Mean-independence implies zero covariance.
4. Merx, J.-P. Mean independent and correlated variables.