Lesson 1.7: The Master Tool: Moment Generating Functions (MGFs)
This lesson introduces one of the most powerful tools in probability theory: the Moment Generating Function (MGF). We will treat the MGF as the unique 'fingerprint' of a distribution. More importantly, we will rigorously derive the MGFs for our key discrete distributions and use them as an elegant 'moment factory' to derive their mean, variance, and higher moments with calculus instead of cumbersome summations.
Part 1: The 'Why' and 'How' of MGFs
1.1 The Problem: Moments are Hard to Calculate
We've defined moments like mean and variance using summations, e.g., $E[X] = \sum_x x\,p(x)$ and $E[X^2] = \sum_x x^2\,p(x)$. This works for the first two moments, but what if we need the third moment (for skewness) or the fourth (for kurtosis)?
Calculating $E[X^3]$ or $E[X^4]$ directly is an algebraic nightmare. We need a more elegant and powerful method. That method is the MGF.
Definition: Moment Generating Function (MGF)
The MGF of a random variable $X$ is defined as the expected value of $e^{tX}$:
$$M_X(t) = E\left[e^{tX}\right]$$
For a discrete random variable, this is calculated as:
$$M_X(t) = \sum_x e^{tx}\, p(x)$$
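As a concrete illustration of this summation (the fair six-sided die here is an assumed example, not part of the lesson's derivations), a minimal sketch:

```python
import math

# Illustrative example (assumed): MGF of a fair six-sided die,
# M_X(t) = sum over x of e^(t*x) * p(x), with p(x) = 1/6 for x = 1..6.
def mgf_die(t):
    return sum(math.exp(t * x) * (1 / 6) for x in range(1, 7))

# At t = 0 every MGF equals 1, because e^0 = 1 and the probabilities sum to 1.
print(mgf_die(0.0))  # ~ 1.0
```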
The Core Derivation: Why is it a 'Moment Generator'?
This is the most important proof of this lesson. We use the Taylor series expansion of $e^u$ around $u = 0$, where $u = tX$.
Step 1: Expand $e^{tX}$ as a Taylor Series
$$e^{tX} = 1 + tX + \frac{(tX)^2}{2!} + \frac{(tX)^3}{3!} + \cdots = \sum_{k=0}^{\infty} \frac{(tX)^k}{k!}$$
Step 2: Apply the Expectation Operator
By definition, $M_X(t) = E[e^{tX}]$. Let's substitute the series expansion:
$$M_X(t) = E\left[\sum_{k=0}^{\infty} \frac{(tX)^k}{k!}\right]$$
Step 3: Use the Linearity of Expectation
We can bring the expectation inside the sum. The terms $t^k$ and $\frac{1}{k!}$ are constants with respect to the random variable $X$, so they can be pulled out of the expectation:
$$M_X(t) = \sum_{k=0}^{\infty} \frac{t^k}{k!}\, E[X^k]$$
Let's write out the first few terms to see the pattern:
$$M_X(t) = 1 + t\,E[X] + \frac{t^2}{2!}E[X^2] + \frac{t^3}{3!}E[X^3] + \cdots$$
Step 4: Differentiate and Evaluate at t=0
Now, watch what happens when we differentiate $M_X(t)$ with respect to $t$ and then set $t = 0$.
First Derivative:
$$M_X'(t) = E[X] + t\,E[X^2] + \frac{t^2}{2!}E[X^3] + \cdots \implies M_X'(0) = E[X]$$
Second Derivative:
$$M_X''(t) = E[X^2] + t\,E[X^3] + \cdots \implies M_X''(0) = E[X^2]$$
The pattern holds! The $k$-th derivative evaluated at $t = 0$ isolates the $k$-th moment:
$$M_X^{(k)}(0) = E[X^k]$$
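This moment-extraction pattern can be sanity-checked numerically. The sketch below uses an arbitrary three-point PMF (chosen purely for illustration) and compares finite-difference derivatives of $M_X(t)$ at $t = 0$ against moments computed by direct summation.

```python
import math

# Illustrative three-point PMF (values and probabilities are arbitrary choices).
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

def M(t):
    """MGF by direct summation: sum of e^(t*x) * p(x)."""
    return sum(math.exp(t * x) * p for x, p in pmf.items())

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)          # central difference ~ M'(0)
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2  # second difference ~ M''(0)

EX = sum(x * p for x, p in pmf.items())      # E[X] computed directly
EX2 = sum(x**2 * p for x, p in pmf.items())  # E[X^2] computed directly

print(m1, EX)   # both ~ 1.1
print(m2, EX2)  # both ~ 1.7
```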
Part 2: Detailed Derivations for Key Distributions
Bernoulli MGF Derivation
Let $X \sim \text{Bernoulli}(p)$, so $P(X=1) = p$ and $P(X=0) = 1-p$.
$$M_X(t) = E\left[e^{tX}\right] = e^{t\cdot 0}(1-p) + e^{t\cdot 1}p = (1-p) + pe^t$$
Bernoulli Moment Derivations
Mean: $M_X'(t) = pe^t$, so $E[X] = M_X'(0) = pe^0 = p$.
Variance: $M_X''(t) = pe^t$ as well, so $E[X^2] = M_X''(0) = p$, and $\operatorname{Var}(X) = E[X^2] - (E[X])^2 = p - p^2 = p(1-p)$.
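As a quick numeric check (with $p = 0.3$ as an arbitrary illustrative value), differentiating the Bernoulli MGF $M_X(t) = (1-p) + pe^t$ numerically at $t = 0$ should reproduce these moments:

```python
import math

p = 0.3  # arbitrary illustrative value

def M(t):
    # Bernoulli MGF derived above: M(t) = (1 - p) + p*e^t
    return (1 - p) + p * math.exp(t)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)          # ~ M'(0) = p
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2  # ~ M''(0) = p
print(m1, m2, m2 - m1**2)              # ~ 0.3, 0.3, and 0.21 = p(1-p)
```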
Binomial MGF Derivation
Let $X \sim \text{Binomial}(n, p)$. PMF is $P(X=k) = \binom{n}{k}p^k(1-p)^{n-k}$ for $k = 0, 1, \dots, n$.
$$M_X(t) = \sum_{k=0}^{n} e^{tk}\binom{n}{k}p^k(1-p)^{n-k} = \sum_{k=0}^{n}\binom{n}{k}\left(pe^t\right)^k(1-p)^{n-k}$$
Using the Binomial Theorem $(a+b)^n = \sum_{k=0}^{n}\binom{n}{k}a^k b^{n-k}$ with $a = pe^t$ and $b = 1-p$:
$$M_X(t) = \left(pe^t + 1 - p\right)^n$$
Binomial Moment Derivations
Mean: $M_X'(t) = n\left(pe^t + 1 - p\right)^{n-1}pe^t$, so $E[X] = M_X'(0) = n(p + 1 - p)^{n-1}p = np$.
Variance: (Using Product Rule) $M_X''(t) = n(n-1)\left(pe^t+1-p\right)^{n-2}\left(pe^t\right)^2 + n\left(pe^t+1-p\right)^{n-1}pe^t$, so $E[X^2] = M_X''(0) = n(n-1)p^2 + np$, and $\operatorname{Var}(X) = n(n-1)p^2 + np - (np)^2 = np(1-p)$.
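These closed forms can be verified against direct summation over the PMF; $n = 10$ and $p = 0.4$ below are arbitrary illustrative values.

```python
import math

n, p = 10, 0.4  # arbitrary illustrative parameters
pmf = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

EX = sum(k * pk for k, pk in enumerate(pmf))      # should equal np
EX2 = sum(k**2 * pk for k, pk in enumerate(pmf))  # should equal n(n-1)p^2 + np
print(EX, EX2 - EX**2)  # ~ 4.0 and ~ 2.4 = np(1-p)
```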
Poisson MGF Derivation
Let $X \sim \text{Poisson}(\lambda)$. PMF is $P(X=k) = \dfrac{e^{-\lambda}\lambda^k}{k!}$ for $k = 0, 1, 2, \dots$
$$M_X(t) = \sum_{k=0}^{\infty} e^{tk}\,\frac{e^{-\lambda}\lambda^k}{k!} = e^{-\lambda}\sum_{k=0}^{\infty}\frac{\left(\lambda e^t\right)^k}{k!}$$
Using the series for $e^u = \sum_{k=0}^{\infty} u^k/k!$ with $u = \lambda e^t$:
$$M_X(t) = e^{-\lambda}\,e^{\lambda e^t} = e^{\lambda\left(e^t - 1\right)}$$
Masterclass: All Four Poisson Moments
Let's derive all four central moments for the Poisson as a demonstration of the MGF's power.
1. Mean: $M_X'(t) = \lambda e^t\, e^{\lambda(e^t - 1)}$, so $E[X] = M_X'(0) = \lambda$.
2. Variance: $M_X''(t) = \left(\lambda e^t + \lambda^2 e^{2t}\right)e^{\lambda(e^t - 1)}$, so $E[X^2] = M_X''(0) = \lambda + \lambda^2$, and $\operatorname{Var}(X) = \lambda + \lambda^2 - \lambda^2 = \lambda$.
3. Skewness: (Requires 3rd derivative) $M_X'''(0) = \lambda^3 + 3\lambda^2 + \lambda$, which gives the third central moment $E[(X-\lambda)^3] = \lambda$.
The standardized skewness is $\gamma_1 = \dfrac{E[(X-\mu)^3]}{\sigma^3}$.
For the Poisson, this simplifies to $\gamma_1 = \dfrac{\lambda}{\lambda^{3/2}} = \dfrac{1}{\sqrt{\lambda}}$.
4. Kurtosis: (Requires 4th derivative) $M_X^{(4)}(0) = \lambda^4 + 6\lambda^3 + 7\lambda^2 + \lambda$, which gives the fourth central moment $E[(X-\lambda)^4] = 3\lambda^2 + \lambda$.
The excess kurtosis $\gamma_2 = \dfrac{E[(X-\mu)^4]}{\sigma^4} - 3$ simplifies to $\dfrac{3\lambda^2 + \lambda}{\lambda^2} - 3 = \dfrac{1}{\lambda}$.
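All four Poisson results can be checked numerically with a truncated sum over the PMF; $\lambda = 2.5$ is an arbitrary illustrative value, and terms beyond $k = 100$ are negligible at this rate.

```python
import math

lam = 2.5  # arbitrary illustrative rate
pmf = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(101)]

mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum((k - mean)**2 * pk for k, pk in enumerate(pmf))
mu3 = sum((k - mean)**3 * pk for k, pk in enumerate(pmf))  # 3rd central moment
mu4 = sum((k - mean)**4 * pk for k, pk in enumerate(pmf))  # 4th central moment

print(mean, var)                           # both ~ 2.5: mean = variance = lam
print(mu3 / var**1.5, 1 / math.sqrt(lam))  # skewness, both ~ 0.632
print(mu4 / var**2 - 3, 1 / lam)           # excess kurtosis, both ~ 0.4
```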
Geometric MGF Derivation
Let $X \sim \text{Geometric}(p)$. PMF is $P(X=k) = p(1-p)^{k-1}$ for $k = 1, 2, 3, \dots$
$$M_X(t) = \sum_{k=1}^{\infty} e^{tk}\,p(1-p)^{k-1} = pe^t\sum_{k=1}^{\infty}\left[(1-p)e^t\right]^{k-1}$$
Using the geometric series sum $\sum_{j=0}^{\infty} r^j = \dfrac{1}{1-r}$ with $r = (1-p)e^t$ (which converges for $t < -\ln(1-p)$):
$$M_X(t) = \frac{pe^t}{1 - (1-p)e^t}$$
Geometric Moment Derivations
Mean: (Using Quotient Rule) $M_X'(t) = \dfrac{pe^t\left[1-(1-p)e^t\right] + pe^t(1-p)e^t}{\left[1-(1-p)e^t\right]^2} = \dfrac{pe^t}{\left[1-(1-p)e^t\right]^2}$, so $E[X] = M_X'(0) = \dfrac{p}{p^2} = \dfrac{1}{p}$.
Variance: The second derivative is complex, but evaluating at $t = 0$ gives $E[X^2] = M_X''(0) = \dfrac{2-p}{p^2}$, so $\operatorname{Var}(X) = \dfrac{2-p}{p^2} - \dfrac{1}{p^2} = \dfrac{1-p}{p^2}$.
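A numeric check of $E[X] = 1/p$ and $\operatorname{Var}(X) = (1-p)/p^2$, with $p = 0.25$ as an arbitrary illustrative value; the tail of the sum beyond $k = 500$ is vanishingly small.

```python
# Geometric(p) moments by truncated direct summation over the PMF
# P(X = k) = p * (1-p)^(k-1) for k = 1, 2, 3, ...
p = 0.25  # arbitrary illustrative value
pmf = [(k, p * (1 - p)**(k - 1)) for k in range(1, 501)]

EX = sum(k * pk for k, pk in pmf)
EX2 = sum(k**2 * pk for k, pk in pmf)
print(EX, 1 / p)                    # both ~ 4.0
print(EX2 - EX**2, (1 - p) / p**2)  # both ~ 12.0
```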
While we derived only the first two moments in detail for each distribution, the standardized higher moments are also important characteristics:
| Distribution | Skewness | Excess Kurtosis |
|---|---|---|
| Binomial($n, p$) | $\frac{1-2p}{\sqrt{np(1-p)}}$ | $\frac{1-6p(1-p)}{np(1-p)}$ |
| Poisson($\lambda$) | $\frac{1}{\sqrt{\lambda}}$ | $\frac{1}{\lambda}$ |
| Geometric($p$) | $\frac{2-p}{\sqrt{1-p}}$ | $6 + \frac{p^2}{1-p}$ |
What's Next? The Continuous World
We have now rigorously mastered the mathematical machinery for discrete random variables. We have a toolbox of distributions and a master tool (the MGF) for analyzing them.
It is time to cross the bridge into the continuous world. We must replace our summation tool ($\sum$) with the tool of integration ($\int$) and learn about Probability Density Functions (PDFs).