Logmoment generating function

In mathematics, the logarithmic moment generating function (equivalent to the cumulant generating function), abbreviated logmoment generating function, is defined as follows:

\mu_{Y}(s)=\ln E(e^{s\cdot Y})

where Y is a random variable.
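
The derivatives of \mu_Y at s = 0 give the cumulants of Y; in particular, the first two derivatives recover the mean and the variance,

\mu_Y'(0)=E(Y), \qquad \mu_Y''(0)=\operatorname{Var}(Y),

which is why \mu_Y is also called the cumulant generating function.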

Thus, if Y is a discrete random variable, then

\mu_{Y}(s):=\ln \sum_y P(y)\cdot e^{s\cdot y},

where P is the probability mass function of Y.

In particular, for the binary case (Bernoulli distribution),

\mu_Y(s)=\ln\left\{p\cdot e^s + (1-p)\right\}
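
For example, for a fair coin (p = 1/2) this reduces to

\mu_Y(s)=\ln\frac{1+e^{s}}{2},

and differentiating at s = 0 recovers the mean of the Bernoulli variable, \mu_Y'(0)=p.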

If Y is a random variable with a continuous distribution, then

\mu_{Y}(s):=\ln \int_y \Phi(y)\cdot e^{s\cdot y}\,dy.

Here Φ is the probability density function of Y.
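
For example, if Y follows a standard normal distribution, then

\mu_{Y}(s)=\ln \int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}\,e^{-y^{2}/2}\cdot e^{s\cdot y}\,dy=\frac{s^{2}}{2}.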

For a sum of independent random variables

Y=\sum_{j=1}^J X_j

it also holds that

\mu_Y(s)=\sum_{j=1}^J \mu_{X_j}(s).

Proof:

\mu_Y(s)=\ln E\left(e^{s\cdot Y}\right) = \ln E\left(e^{s\cdot \sum_{j=1}^J X_j}\right) = \ln E\left(\prod_{j=1}^{J} e^{s\cdot X_j}\right) \stackrel{*}{=} \ln\prod_{j=1}^{J} E\left(e^{s\cdot X_j}\right) = \sum_{j=1}^J \ln E\left(e^{s\cdot X_j}\right) = \sum_{j=1}^J \mu_{X_j}(s).

("*" is where we used the independence of the Xj random variables)

