Monday, July 2, 2012

Jensen's inequality


* Jensen's inequality

* If f is convex, then the expected value of f(x) is greater than or equal to f of the expected value of x.
* That is, if f''(x) >= 0   then   E(f(x)) >= f(E(x))

* The reverse holds for concave functions.
* Likewise, if f''(x) <= 0  then   E(f(x)) <= f(E(x))
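
* As a quick worked check, consider a hypothetical two-point distribution (chosen just for illustration): x is 0 or 2 with probability 1/2 each, and f(x) = x^2 is convex.

di "E(f(x)) = (0^2 + 2^2)/2 = " (0^2 + 2^2)/2
di "f(E(x)) = ((0+2)/2)^2 = " ((0+2)/2)^2

* Sure enough, E(f(x)) = 2 >= 1 = f(E(x)).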

* More generally, this is easy to demonstrate by simulation:

* Let's start with 1000 observations

clear
set obs 1000

gen x = runiform()*10
  * x is drawn uniformly from (0,10), so x is always positive, which makes the derivatives of the functions below easy to interpret.
 
* f(x)=x^2 -> f''(x)=2 > 0 thus convex

gen fx = x^2

sum x
* The expected value E(x) is approximated by the sample mean of x.

local fEx =  (r(mean))^2
di "So: f(E(x)) ~ " string(`fEx',"%9.2f")

sum fx

di "E(f(x)) ~ " string(r(mean),"%9.2f") " is greater than f(E(x)) ~ " string(`fEx',"%9.2f")

* Thus Jensen's inequality works!
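
* To see the concave case as well, here is a minimal sketch using f(x) = ln(x),
* which is concave since f''(x) = -1/x^2 < 0 (this works because x is strictly positive here).

gen lnx = ln(x)

qui sum x
local lnEx = ln(r(mean))

qui sum lnx
di "E(ln(x)) ~ " string(r(mean),"%9.2f") " is less than ln(E(x)) ~ " string(`lnEx',"%9.2f")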

* It is a very helpful property in probability theory.


* However, the median passes through monotonic functions (a separate property, unrelated to Jensen's inequality):
* med(f(x)) = f(med(x)) whenever f is monotone on the support of x.
* Since x > 0 here, f(x) = x^2 is monotone increasing on that support.

qui sum fx, detail
local medfx =  r(p50)

qui sum x, detail
local medx =  r(p50)

di "med(f(x)) ~ " string(`medfx',"%9.2f") " is approximately the same as f(med(x)) ~ " string(`medx'^2,"%9.2f")
