Is it true that for two random variables $A$ and $B$,
$$E(A\mid B)=E(B\mid A)\frac{E(A)}{E(B)}?$$
It is trivially true for independent random variables $A$ and $B$ with nonzero means. If $B$ has mean $0$, then the right side is of the form $\frac{0}{0}E[A]$, which cannot be evaluated. The result does not hold for dependent random variables. Bear in mind that $E[A\mid B]$ is a random variable that is a function of the random variable $B$, say $g(B)$, while $E[B\mid A]$ is a random variable that is a function of the random variable $A$, say $h(A)$, and so you are asking whether $g(B)$ can equal $h(A)$ times a constant. Obviously this cannot be true in general. For independent random variables, $g(B)$ and $h(A)$ are degenerate random variables (called constants by statistically-illiterate folks), and so the desired equality (or its more general version $E[A\mid B]\,E[B]=E[B\mid A]\,E[A]$ suggested in whuber's comment) holds.
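To see the obstruction concretely, here is a worked jointly Gaussian example (my own choice of distributions, offered as an illustration rather than part of the answer above). Let $B\sim N(1,1)$ and $A=B+\varepsilon$ with $\varepsilon\sim N(0,1)$ independent of $B$. Then
$$E[A\mid B]=B,\qquad E[B\mid A]=\frac{A}{2},\qquad E[A]=2,\qquad E[B]=1,$$
so the proposed identity would assert $B=\frac{A}{2}\cdot\frac{2}{1}=A$ almost surely, which is false because $A-B=\varepsilon$ is not degenerate. Here $g(B)=B$ and $h(A)=A/2$, exactly the situation described above.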
The result is untrue in general; let us see that in a simple example. Let $X \mid P=p$ have a binomial distribution with parameters $n,p$ and let $P$ have the beta distribution with parameters $(\alpha, \beta)$, that is, a Bayesian model with conjugate prior. Now just calculate the two sides of your formula. The left-hand side is $\DeclareMathOperator{\E}{\mathbb{E}} \E(X \mid P) = nP$, while the right-hand side is $$ \E( P\mid X) \frac{\E X}{\E P} = \frac{\alpha+X}{n+\alpha+\beta} \cdot \frac{n\alpha/(\alpha+\beta)}{\alpha/(\alpha+\beta)} = n\,\frac{\alpha+X}{n+\alpha+\beta}, $$ and those are certainly not equal: the first is a function of $P$, the second a function of $X$.
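A quick numerical check of this counterexample (a minimal numpy sketch; the specific values $n=10$, $\alpha=2$, $\beta=3$ are my own, and the simplification $\E X/\E P = n$ is used for the right-hand side):

```python
import numpy as np

rng = np.random.default_rng(0)
n, alpha, beta = 10, 2.0, 3.0

# Draw from the hierarchical model: P ~ Beta(alpha, beta), X | P ~ Binomial(n, P).
P = rng.beta(alpha, beta, size=5)
X = rng.binomial(n, P)

lhs = n * P                                  # E[X | P] = nP, a function of P
rhs = n * (alpha + X) / (n + alpha + beta)   # E[P | X] * E[X]/E[P], a function of X

print(np.column_stack([lhs, rhs]))           # the columns disagree realization by realization
```

The left column varies continuously with $P$, while the right column can only take the $n+1$ values determined by $X$, so the two sides cannot coincide.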
The conditional expected value of a random variable $A$ given the event that $B=b$ is a number that depends on what number $b$ is. So call it $h(b).$ Then the conditional expected value $\operatorname{E}(A\mid B)$ is $h(B),$ a random variable whose value is completely determined by the value of the random variable $B$. Thus $\operatorname{E}(A\mid B)$ is a function of $B$ and $\operatorname{E}(B\mid A)$ is a function of $A$. The quotient $\operatorname{E}(A)/\operatorname{E}(B)$ is just a number. So one side of your proposed equality is determined by $A$ and the other by $B$, and hence they cannot be equal. (Perhaps I should add that they may be equal when the values of $A$ and $B$ determine each other, as when, for example, $\Pr(A=2B)=1.$ But only in that trivial case can they be equal; functions that agree only at a few points are not equal.)
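The function-of-$B$ versus function-of-$A$ point is easy to check numerically. Below is a minimal Python sketch using a hypothetical $2\times 2$ joint pmf of my own choosing; it tabulates $h(b)=\operatorname{E}(A\mid B=b)$ and $k(a)=\operatorname{E}(B\mid A=a)$ and compares the two sides of the proposed equality over all pairs $(a,b)$:

```python
import numpy as np

# Hypothetical joint pmf p[a, b] for A, B each taking values 0 and 1.
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
vals = np.arange(2)

pA, pB = p.sum(axis=1), p.sum(axis=0)   # marginals of A and B
EA, EB = vals @ pA, vals @ pB           # here E(A) = E(B) = 0.5

h = (vals @ p) / pB   # h(b) = E(A | B = b), indexed by b
k = (p @ vals) / pA   # k(a) = E(B | A = a), indexed by a

for a in vals:
    for b in vals:
        print(f"a={a} b={b}: E(A|B=b)={h[b]:.2f}  "
              f"E(B|A=a)*E(A)/E(B)={k[a] * EA / EB:.2f}")
```

With this pmf the identity already fails at $(a,b)=(0,1)$: the left side is $0.8$ while the right side is $0.2$.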
Dilip's answer is excellent. I'll simply add some slightly more explicit notation to highlight the absurdity of the equation in question.

More explicit notation: Let $A$ and $B$ be real-valued random variables and let $a$ and $b$ be real numbers. The expectation of $A$ conditional on knowing the value of $B$ is a function of that value; write $g(b) := \mathrm{E}[A \mid B = b]$, and likewise $f(a) := \mathrm{E}[B \mid A = a]$.

Your question essentially is whether for all $a$ and $b$ (in the support of $A$ and $B$ respectively): $$ f(a)\, \mathrm{E}[A] = g(b)\, \mathrm{E}[B].$$ The left-hand side is a function of $a$. The right-hand side is a function of $b$. They cannot possibly be equal if either $f$ or $g$ varies at all! (As Dilip points out, if $f$ and $g$ are constant then $f = \mathrm{E}[B]$ and $g = \mathrm{E}[A]$ and the equation holds trivially.)

Contrast with Bayes' rule: $$ P(B = b \mid A = a) = \frac{P(A = a \mid B = b)\, P(B = b)}{P(A = a)}.$$ Both the left-hand side and the right-hand side of this equation are functions of both $a$ and $b$.
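Bayes' rule, by contrast, can be verified cell by cell on any joint pmf. A minimal sketch reusing the hypothetical $2\times 2$ pmf from the previous answer:

```python
import numpy as np

p = np.array([[0.4, 0.1],
              [0.1, 0.4]])            # hypothetical joint pmf p[a, b]
pA, pB = p.sum(axis=1), p.sum(axis=0)

P_B_given_A = p / pA[:, None]         # P(B=b | A=a), rows indexed by a
P_A_given_B = p / pB[None, :]         # P(A=a | B=b), columns indexed by b

# Bayes' rule holds for every pair (a, b): both sides depend on a and b jointly.
assert np.allclose(P_B_given_A, P_A_given_B * pB[None, :] / pA[:, None])
```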