## Monday, September 10, 2012

### LIE: Law of Iterative Expectations, Word Problem

The law of iterative expectations is extremely useful and I have been trying to think of ways of explaining it.

It basically states that E(Y) = E(E(Y|X)) = E(E(Y|X,Z)), where X and Z are sets of covariates.

See my previous post for more information: Law of Iterative Expectations/Law of Total Expectations

I have been trying to explain it in a manner that is intuitively appealing.  Oftentimes I find the notation to be a little difficult to dissect.

Imagine Bob and Susie are brother and sister; they go to the same school and always buy lunch.  Their parents randomly decide how much to give them for lunch each day.  But their parents like Susie more than Bob, so however much money they give Bob, Susie expects to get a larger amount.  There are a few things that you might want to find out.

a. What is the expected amount of money that Bob will get, E(X)?
b. What is the expected amount of money that Susie will get, E(Y)?
c. What is the distribution of Bob's lunch money, pdf(X)?
d. How much money does Susie expect to get, given that we know how much Bob got, E(Y|X)?

We already know from the setup that E(Y) > E(X).  {This is not important for the example}

If we know a and b, can we infer c or d? No, because in general E(Y) = f(E(X)) for some unknown function f, and unless we make some assumption about f, we cannot identify the relationship between X and Y.

What if we knew a, b, and c? Still no.  Imagine a simple distribution: x = 1 with p = 1/2 and x = 5 with p = 1/2, so E(X) = 1/2*1 + 1/2*5 = 3.  Imagine also that we know E(Y) = 6.  This still does not identify relationship d.  Why? Because E(Y|X) = 2*X, E(Y|X) = X + 3, or any of an infinite set of other functional forms would all be consistent with E(Y) = 6.
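To see the non-identifiability concretely, here is a minimal sketch in Python checking that both candidate functional forms produce the same E(Y) for this distribution:

```python
# X takes the value 1 or 5, each with probability 1/2, so E(X) = 3.
xs = [1, 5]
probs = [0.5, 0.5]

# Two different conditional expectation functions...
ey_double = sum(p * (2 * x) for p, x in zip(probs, xs))  # E(Y|X) = 2*X
ey_shift = sum(p * (x + 3) for p, x in zip(probs, xs))   # E(Y|X) = X + 3

# ...both yield the same unconditional expectation E(Y) = 6,
# so knowing a, b, and c cannot tell them apart.
print(ey_double)  # 6.0
print(ey_shift)   # 6.0
```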

* So what can we infer?

* 1. If we know the distribution of X (c), then we know the expected value of X (a).  This is because the expected value is defined as the integral of x weighted by the pdf of X.

* 2. If we know the distribution of X (how frequently Bob gets each amount of money) and we know the expected relationship between how much money Bob gets and how much Susie gets (d), then we can figure out how much money Susie expects to get.  How does this work?  Well, imagine that Bob gets \$2 with probability 1/3, \$4 with probability 1/3, and \$6 with probability 1/3.  Imagine also that Susie expects to get the squared amount of whatever Bob got, E(Y|X)=X^2.  Then we can figure out how much Susie expects to get on any day without knowing how much Bob has gotten yet: E(Y) = E(E(Y|X)) = 1/3*2^2 + 1/3*4^2 + 1/3*6^2 = 56/3 ~ 18.67.
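The calculation in point 2 can also be sketched in Python, computing E(E(Y|X)) analytically and then checking it with a quick simulation (the 100,000-draw sample size is just an arbitrary choice for the check):

```python
import random

# Bob's lunch money: $2, $4, or $6, each with probability 1/3.
xs = [2, 4, 6]
probs = [1/3, 1/3, 1/3]

# LIE: E(Y) = E(E(Y|X)) = sum over x of P(X=x) * x^2,
# since Susie's conditional expectation is E(Y|X) = X^2.
ey = sum(p * x**2 for p, x in zip(probs, xs))
print(ey)  # 56/3, about 18.67

# Simulation check: draw Bob's amount, apply the conditional
# expectation, and average -- the sample mean converges to E(Y).
random.seed(0)
draws = [random.choice(xs) ** 2 for _ in range(100_000)]
print(sum(draws) / len(draws))  # close to 56/3
```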