Cross Validated is a question and answer site for people interested in statistics, machine learning, data analysis, data mining, and data visualization.

Is it true that for two random variables $A$ and $B$,

$$E(A\mid B)=E(B\mid A)\frac{E(A)}{E(B)}?$$

Hmm... I do not think those two sides are equivalent. – Jon

As pointed out in the answers, the question is probabilistically meaningless because of the integration of random variables on one side that are the conditioning variables on the other side. – Xi'an

It is trivially true for independent random variables $A$ and $B$ with nonzero means. If $B$ has mean $0$, then the right side is of the form $\frac 00 E[A]$ which cannot be evaluated. The result does not hold for dependent random variables.

Bear in mind that $E[A\mid B]$ is a random variable that is a function of the random variable $B$, say $g(B)$ while $E[B\mid A]$ is a random variable that is a function of the random variable $A$, say $h(A)$ and so you are asking whether $g(B)$ can equal $h(A)$ times a constant. Obviously this cannot be true in general. For independent random variables, $g(B)$ and $h(A)$ are degenerate random variables (called constants by statistically-illiterate folks) and so the desired equality (or its more general version $E[A\mid B]E[B]=E[B\mid A]E[A]$ that is suggested in whuber's comment) holds.
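To make the function-of-$B$ versus function-of-$A$ distinction concrete, here is a small numeric sketch. The $2\times 2$ joint pmf below is made up for illustration; it computes $g(b)=E[A\mid B=b]$ and $h(a)=E[B\mid A=a]$ for a dependent pair and shows that both vary, so neither side of the proposed identity can equal the other:

```python
import numpy as np

# A made-up 2x2 joint pmf for dependent A, B, each taking values 0 and 1.
# joint[a, b] = P(A = a, B = b); note it is not the product of its marginals.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
a_vals = np.array([0.0, 1.0])
b_vals = np.array([0.0, 1.0])

pA = joint.sum(axis=1)        # marginal pmf of A
pB = joint.sum(axis=0)        # marginal pmf of B
EA = a_vals @ pA              # E[A] = 0.5
EB = b_vals @ pB              # E[B] = 0.5

g = (a_vals @ joint) / pB     # g(b) = E[A | B = b] for b = 0, 1
h = (joint @ b_vals) / pA     # h(a) = E[B | A = a] for a = 0, 1

print(g)   # [0.2 0.8] -- varies with b
print(h)   # [0.2 0.8] -- varies with a
# g(B) is a non-constant function of B, so it cannot equal
# h(A) * EA / EB, which is a non-constant function of A.
```

For an independent pair (a product joint pmf), both `g` and `h` come out constant and the identity does hold, matching the answer above.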

+1. To be generous, the question could be interpreted as asking whether $E(A|B)E(B)=E(B|A)E(A)$, where the question of division by zero disappears. – whuber

@whuber Thanks. My edit addresses the more general question as to whether it is possible to have $E[A\mid B]E[B]=E[B\mid A]E[A]$. – Dilip Sarwate

The result is untrue in general; let us see that in a simple example. Let $X \mid P=p$ have a binomial distribution with parameters $n, p$, and let $P$ have the beta distribution with parameters $(\alpha, \beta)$; that is, a Bayesian model with conjugate prior. Now just calculate the two sides of your formula. The left-hand side is $\DeclareMathOperator{\E}{\mathbb{E}} \E(X \mid P) = nP$, while the right-hand side is $$ \E( P\mid X) \frac{\E X}{\E P} = \frac{\alpha+X}{n+\alpha+\beta} \cdot \frac{n\alpha/(\alpha+\beta)}{\alpha/(\alpha+\beta)} = \frac{n(\alpha+X)}{n+\alpha+\beta}, $$ and those are certainly not equal: one is a function of $P$, the other a function of $X$.
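A quick numerical sketch of this counterexample, with illustrative parameter values $n=10$, $\alpha=2$, $\beta=3$ (any values would do); the prior mean and posterior mean used below are the standard conjugate-beta formulas:

```python
import numpy as np

# Illustrative parameter choices for the beta-binomial counterexample.
n, alpha, beta = 10, 2.0, 3.0

EP = alpha / (alpha + beta)   # E[P] for a Beta(alpha, beta) prior: 0.4
EX = n * EP                   # E[X] = E[ E[X|P] ] = n E[P]: 4.0

# Right-hand side: E[P | X] * E[X]/E[P] = n (alpha + X) / (n + alpha + beta),
# a function of X alone ...
x = np.arange(n + 1)
rhs = (alpha + x) / (n + alpha + beta) * EX / EP

# ... while the left-hand side E[X | P] = n P is a function of P alone.
p = np.linspace(0.0, 1.0, 5)
lhs = n * p

print(rhs)  # takes n + 1 distinct values as x ranges over 0..n
```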


The conditional expected value of a random variable $A$ given the event that $B=b$ is a number that depends on what number $b$ is. So call it $h(b).$ Then the conditional expected value $\operatorname{E}(A\mid B)$ is $h(B),$ a random variable whose value is completely determined by the value of the random variable $B$. Thus $\operatorname{E}(A\mid B)$ is a function of $B$ and $\operatorname{E}(B\mid A)$ is a function of $A$.

The quotient $\operatorname{E}(A)/\operatorname{E}(B)$ is just a number.

So one side of your proposed equality is determined by $A$ and the other by $B$; hence they cannot, in general, be equal.

(Perhaps I should add that they may be equal when the values of $A$ and $B$ determine each other, as when for example, $\Pr(A=2B)=1.$ But only in that trivial case can they be equal, and functions equal to each other only at a few points are not equal.)

You mean they are not necessarily equal? I mean they CAN be equal? – BCLC

@BCLC: They are equal only in trivial cases. And two functions equal to each other at some points and not at others are not equal. – Michael Hardy

"But only in that trivial case can they be equal" (emphasis added) is not quite correct. Consider independent $A$ and $B$ with $E[B]\neq 0$. Then $E[A\mid B] = E[A]$ while $E[B\mid A] = E[B]$, and so $$E[B\mid A] \frac{E[A]}{E[B]} = E[B]\frac{E[A]}{E[B]} = E[A] = E[A\mid B].$$ – Dilip Sarwate

@DilipSarwate I was about to say that haha! – BCLC

Dilip's answer is excellent. I'll simply add some slightly more explicit notation to highlight the absurdity of the equation in question.

More explicit notation:

Let $A$ and $B$ be real-valued random variables and let $a$ and $b$ be real numbers. The expectation of $A$ conditional on knowing the value of $B$ is a function of that value.

  • Let $f(a) = \mathrm{E}[B \mid A = a]$. This is a function of $a$ but not of $b$.

  • Let $g(b) = \mathrm{E}[A \mid B = b]$. This is a function of $b$ but not of $a$.

Your question is essentially whether, for all $a$ and $b$ (in the support of $A$ and $B$ respectively):

$$ f(a) \mathrm{E}[A] = g(b) \mathrm{E}[B]$$

The left hand side is a function of $a$. The right hand side is a function of $b$. They cannot possibly be equal if either $f$ or $g$ varies at all! (As Dilip points out, if $f$ and $g$ are constant then $f = \mathrm{E}[B]$ and $g = \mathrm{E}[A]$ and the equation holds trivially.)

Contrast with Bayes' rule:

$$ P(B = b \mid A = a) = \frac{P(A = a \mid B = b) P(B = b)}{P(A = a)}$$

Both the left hand side and the right hand side of the equation are functions of both $a$ and $b$.
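As a sanity check of that contrast, Bayes' rule can be verified on any small joint pmf; the numbers below are made up for illustration:

```python
import numpy as np

# A made-up joint pmf of (A, B); joint[a, b] = P(A = a, B = b).
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
pA = joint.sum(axis=1)   # P(A = a)
pB = joint.sum(axis=0)   # P(B = b)

for a in range(2):
    for b in range(2):
        lhs = joint[a, b] / pA[a]                    # P(B = b | A = a)
        rhs = joint[a, b] / pB[b] * pB[b] / pA[a]    # P(A = a | B = b) P(B = b) / P(A = a)
        assert np.isclose(lhs, rhs)  # both sides are functions of the pair (a, b)
```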

