# Moment generating function of the exponential distribution

Here are a couple of reasons why the MGF $$M(t)$$ is so special: if two random variables have the same MGF, then they must have the same distribution, and MGFs make dealing with sums of random variables easier to handle.

Let’s start by finding the MGF of $$X \sim Pois(\lambda)$$. The sum in the LoTUS calculation looks like the expansion of $$e^{e^t \lambda}$$, so we write:

\[M_x(t) = e^{-\lambda}e^{e^t \lambda} = e^{e^t \lambda - \lambda} = e^{\lambda(e^t - 1)}.\]

Now let $$Z$$ be the sum of two i.i.d. $$Pois(\lambda)$$ random variables. The MGF of a sum of independent random variables is the product of the individual MGFs, so, letting $$M_z(t)$$ mean ‘the MGF of $$Z$$’:

\[M_z(t) = e^{\lambda(e^t - 1)}e^{\lambda(e^t - 1)} = e^{2\lambda(e^t - 1)}.\]

The implication here is that $$Z \sim Pois(2\lambda)$$, and this must be true by Property 5.1 because $$Z$$ has the MGF of a $$Pois(2 \lambda)$$ (imagine substituting $$2\lambda$$ in for $$\lambda$$ in the MGF calculation for $$X$$).

When we instead use the MGF to find moments (Property 6.3), we have $$t^n$$ in the sum, but we know from our list of the ‘properties’ of the MGF that we need the coefficient not of $$t^n$$ but of $$\frac{t^n}{n!}$$. Some Taylor series are useful for this kind of manipulation:

\[\frac{1}{1-x} = 1 + x + x^2 + x^3 + ... = \sum_{n=0}^{\infty} x^n,\]

\[\frac{1}{(1 - x)^2} = 1 + 2x + 3x^2 + 4x^3 + ... = \sum_{n=0}^{\infty} n x^{n-1},\]

\[\sin(x) = x - \frac{x^3}{3!} + \frac{x^5}{5!} - ...\]

We could always convert back to any Exponential distribution $$X \sim Expo(\lambda)$$ once we know the moments of an $$Expo(1)$$.
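As a sanity check on the claim that the sum of two independent $$Pois(\lambda)$$ variables is $$Pois(2\lambda)$$, here is a small simulation sketch (in Python for illustration; the helper `pois`, the value of $$\lambda$$, and the sample size are all arbitrary choices, not part of the text):

```python
import random
import math

random.seed(0)
lam = 3.0
n = 100_000

def pois(l):
    # Illustrative helper: Knuth's inversion algorithm for one Poisson draw
    # (fine for small l).
    L, k, p = math.exp(-l), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Z = X + X' with X, X' i.i.d. Pois(lam); Pois(2*lam) has mean = var = 2*lam.
z = [pois(lam) + pois(lam) for _ in range(n)]
mean_z = sum(z) / n
var_z = sum((zi - mean_z) ** 2 for zi in z) / n
print(mean_z, var_z)
```

Both the sample mean and the sample variance should land close to $$2\lambda$$, as the $$Pois(2\lambda)$$ distribution predicts.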

Now recall the Exponential series that we mentioned earlier in this chapter. Taking repeated derivatives, by contrast, seems a little tedious: we need to calculate an increasingly complex derivative just to get one new moment each time.

Specifically, what we can do is find the MGF of $$Z$$ and see if it matches the MGF of a known distribution; if they match, then by Property 5.1 they have the same distribution, and we thus know the distribution of $$Z$$. The MGF may look a little strange, because we have the $$e$$ in the exponent of the $$e$$.

Let’s confirm that this is in fact the MGF with a simulation in R. We will plot the analytical result above, as well as empirical estimates of $$E(e^{tX})$$.
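The text's plot is in R; as a rough equivalent, this Python sketch compares the analytical $$Expo(\lambda)$$ MGF, $$M(t) = \frac{\lambda}{\lambda - t}$$ for $$t < \lambda$$, against empirical averages of $$e^{tX}$$ (the value of $$\lambda$$, the sample size, and the grid of $$t$$ values are illustrative choices):

```python
import random
import math

random.seed(0)
lam = 2.0
n = 200_000
x = [random.expovariate(lam) for _ in range(n)]  # draws from Expo(lam)

results = {}
for t in (-1.0, 0.0, 0.5):
    analytic = lam / (lam - t)                         # MGF, valid for t < lam
    empirical = sum(math.exp(t * xi) for xi in x) / n  # sample mean of e^{tX}
    results[t] = (analytic, empirical)
    print(f"t={t:+.1f}  analytic={analytic:.4f}  empirical={empirical:.4f}")
```

Keeping $$t$$ well below $$\lambda$$ matters: as $$t \to \lambda$$ the expectation $$E(e^{tX})$$ blows up and the empirical average becomes very noisy.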

Say that we want the first moment, or the mean. We could compute it directly with an integral or sum, but this could take a lot of work (and we have to do a separate integral/sum for each distinct moment), and it’s here that MGFs are really valuable. We’ll just take the derivative with respect to the dummy variable $$t$$ and plug in 0 for $$t$$ after taking the derivative (remember how we said that $$t$$ was a ‘dummy’ variable that would keep track of the moments).

For example, for $$X \sim Expo(\lambda)$$, with MGF $$\frac{\lambda}{\lambda - t}$$ for $$t < \lambda$$, taking the second derivative and plugging in $$t = 0$$ gives the second moment:

\[\frac{2 \lambda}{(\lambda - t)^3} \bigg\vert_{t=0} = \frac{2}{\lambda^2}.\]

Recall that $$Var(X) = E(X^2) - E(X)^2$$: the variance equals the second moment minus the square of the first moment (the definition is $$E[(X - E[X])^2]$$, from which this formula follows; recall also how LoTUS can be used to find both expectation and variance). Since $$E(X) = \frac{1}{\lambda}$$, we get $$Var(X) = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}$$. This exactly matches what we already know is the variance for the Exponential.

MGFs also make dealing with sums of random variables easier to handle. Additionally, it’s clear that reading moments off the series expansion of the MGF is, in the long run, an even cleaner and easier way to find moments than taking derivatives over and over again.
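To double-check the derivative calculation numerically, a short simulation sketch (Python; the value of $$\lambda$$ and the sample size are illustrative) estimates the first two moments of an $$Expo(\lambda)$$:

```python
import random

random.seed(0)
lam = 2.0
n = 500_000
x = [random.expovariate(lam) for _ in range(n)]  # draws from Expo(lam)

m1 = sum(x) / n                     # first moment; theory says 1/lam
m2 = sum(xi * xi for xi in x) / n   # second moment; theory says 2/lam**2
var = m2 - m1 ** 2                  # variance; theory says 1/lam**2
print(m1, m2, var)
```

The estimates should agree with $$\frac{1}{\lambda}$$, $$\frac{2}{\lambda^2}$$, and $$\frac{1}{\lambda^2}$$ up to simulation noise.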

We could turn this into finding the moments of $$X \sim Expo(\lambda)$$ very easily. If $$Y = \lambda X \sim Expo(1)$$, then $$(\lambda X)^n = Y^n \rightarrow \lambda^n X^n = Y^n \rightarrow X^n = \frac{Y^n}{\lambda^n}$$, so each moment of $$X$$ is the corresponding moment of $$Y$$ divided by $$\lambda^n$$. We know that the mean and the variance of an $$Expo(1)$$ are both 1, so this should match up with what we’ve just found.

As another example of setting up an MGF, let $$X \sim Pois(\lambda)$$ and write out the LoTUS calculation, where $$M_x(t)$$ denotes the MGF of $$X$$:

\[M_x(t) = E(e^{tX}) = \sum_{k = 0}^{\infty} e^{tk} \frac{\lambda^k e^{-\lambda}}{k!}.\]
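The scaling argument gives $$E(X^n) = \frac{E(Y^n)}{\lambda^n}$$, and the moments of an $$Expo(1)$$ are $$E(Y^n) = n!$$ (the coefficients of $$\frac{t^n}{n!}$$ in $$\frac{1}{1-t} = \sum_n t^n$$), so $$E(X^n) = \frac{n!}{\lambda^n}$$. A quick numerical check of this (Python sketch; $$\lambda$$ and the sample size are illustrative):

```python
import random
import math

random.seed(0)
n_samples = 500_000
y = [random.expovariate(1.0) for _ in range(n_samples)]  # Y ~ Expo(1)

lam = 2.0
checks = {}
for n in (1, 2, 3):
    moment_y = sum(yi ** n for yi in y) / n_samples  # estimate of E(Y^n), ~ n!
    moment_x = moment_y / lam ** n                   # scaled: estimate of E(X^n)
    checks[n] = (moment_x, math.factorial(n) / lam ** n)
    print(n, checks[n])
```

Each estimated moment of $$X$$ should be close to $$\frac{n!}{\lambda^n}$$, and notably we never had to integrate against the $$Expo(\lambda)$$ density directly.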

Now that we have established the relevance of the MGF (and perhaps been a little confusing along the way), we can formalize what the function actually looks like: the MGF of a random variable $$X$$ is $$M_x(t) = E(e^{tX})$$, provided this expectation is finite on an interval around $$t = 0$$. As an example to keep in mind, let $$U_1, U_2, ..., U_{60}$$ be i.i.d. $$Unif(0,1)$$ and $$X = U_1 + U_2 + ... + U_{60}$$.
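For that sum of Uniforms, linearity gives $$E(X) = 60 \cdot \frac{1}{2} = 30$$ and, by independence, $$Var(X) = 60 \cdot \frac{1}{12} = 5$$. A simulation sketch (Python; the number of trials is an arbitrary choice) can confirm this:

```python
import random

random.seed(0)
trials = 100_000
# Each trial draws 60 i.i.d. Unif(0,1) values and sums them.
x = [sum(random.random() for _ in range(60)) for _ in range(trials)]

mean_x = sum(x) / trials                              # theory: 60 * 1/2 = 30
var_x = sum((xi - mean_x) ** 2 for xi in x) / trials  # theory: 60 * 1/12 = 5
print(mean_x, var_x)
```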
