def. Moment Generating Functions:
$$M_X(t) = E[e^{tX}]$$
where $M_X$ is differentiable $n$ times around zero to be able to generate the $n$-th moment. You can build an MGF from a pmf or pdf:
- for pmfs: $M_X(t) = \sum_x e^{tx} \, p_X(x)$
- for pdfs: $M_X(t) = \int_{-\infty}^{\infty} e^{tx} f_X(x) \, dx$

It can generate the $n$-th moment like such:
$$E[X^n] = M_X^{(n)}(0) = \frac{d^n}{dt^n} M_X(t) \Big|_{t=0}$$
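As a sanity check on the derivative property, here is a minimal Python sketch (the Bernoulli parameter and finite-difference step are illustrative choices, not from the notes) that approximates $M_X'(0)$ and $M_X''(0)$ numerically and compares them to the moments computed directly from the pmf:

```python
import math

p = 0.3  # illustrative Bernoulli(p) parameter

def mgf(t):
    # M_X(t) = E[e^{tX}] = (1 - p) * e^{t*0} + p * e^{t*1}
    return (1 - p) + p * math.exp(t)

h = 1e-5
# central finite differences approximate M'(0) and M''(0)
first_deriv = (mgf(h) - mgf(-h)) / (2 * h)
second_deriv = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h ** 2

# moments computed directly from the pmf: E[X] = p, E[X^2] = p
print(first_deriv)   # ≈ 0.3 = E[X]
print(second_deriv)  # ≈ 0.3 = E[X^2]
```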
thm. Linear Combination of MGFs. Given r.v. $X$ and $Y$ which are independent:
$$M_{X+Y}(t) = M_X(t) \cdot M_Y(t)$$
and for constants $a, b$:
$$M_{aX+b}(t) = e^{bt} M_X(at)$$
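To see the product rule concretely, here is a short sketch (using two iid Bernoulli variables as an illustrative choice) that computes the MGF of $X + Y$ by enumerating the joint pmf and checks that it equals $M_X(t) \cdot M_Y(t)$ at a few sample points:

```python
import math
from itertools import product

p = 0.4  # illustrative parameter for two iid Bernoulli r.v.s X and Y
pmf = {0: 1 - p, 1: p}

def mgf_single(t):
    return sum(prob * math.exp(t * x) for x, prob in pmf.items())

def mgf_sum(t):
    # enumerate the joint pmf of (X, Y) (independence => probabilities multiply)
    # and take E[e^{t(X+Y)}] directly
    return sum(px * py * math.exp(t * (x + y))
               for (x, px), (y, py) in product(pmf.items(), pmf.items()))

for t in [-1.0, 0.0, 0.5, 2.0]:
    assert abs(mgf_sum(t) - mgf_single(t) ** 2) < 1e-12
print("M_{X+Y}(t) = M_X(t) * M_Y(t) on all sample points")
```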
thm. Uniqueness Theorem of MGF. Let $X, Y$ have CDFs $F_X, F_Y$. Then:
$$M_X(t) = M_Y(t) \ \ \forall t \in (-\delta, \delta) \implies F_X = F_Y$$
i.e. the moment generating function fully characterizes the distribution of the random variable.

thm. Taylor Expansion of MGF. For RV $X$:
$$M_X(t) = \sum_{n=0}^{\infty} \frac{E[X^n]}{n!} t^n$$
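A quick numeric illustration of the Taylor expansion (the choice of $X \sim \text{Exponential}(1)$, whose moments are $E[X^n] = n!$, is mine): truncating the series should recover the closed-form MGF $M_X(t) = 1/(1-t)$ for $|t| < 1$.

```python
import math

# For X ~ Exponential(1): E[X^n] = n!, so the Taylor coefficients
# E[X^n]/n! all equal 1 and the series is geometric in t.
def taylor_mgf(t, terms=50):
    return sum(math.factorial(n) / math.factorial(n) * t ** n
               for n in range(terms))

t = 0.3
approx = taylor_mgf(t)
exact = 1 / (1 - t)   # closed-form MGF of Exponential(1), valid for t < 1
print(approx, exact)  # both ≈ 1.42857...
```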
Intuition for Moments
Motivation. Moments are a convenient way to characterize a distribution. The foundation comes from two results: (1) the Uniqueness Theorem and (2) the Taylor expansion of the MGF. Together, these mean that the collection of moments $E[X^n]$ uniquely describes the distribution of $X$. It’s also nice that there’s a nice visual explanation for each moment.
Types of Moments
For RV $X$ with mean $\mu$ and standard deviation $\sigma$:
- Raw Moments: $E[X^n]$
- Central Moments: $E[(X - \mu)^n]$
- Standardized Moments: $E\left[\left(\frac{X - \mu}{\sigma}\right)^n\right]$
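The three definitions above translate directly into code. This is a minimal sketch (the sample data is an illustrative choice) using the population convention of dividing by $n$:

```python
def raw_moment(xs, n):
    # E[X^n] estimated from a sample
    return sum(x ** n for x in xs) / len(xs)

def central_moment(xs, n):
    # E[(X - mu)^n]: center by the mean first
    mu = raw_moment(xs, 1)
    return sum((x - mu) ** n for x in xs) / len(xs)

def standardized_moment(xs, n):
    # E[((X - mu)/sigma)^n]: also scale by the standard deviation
    sigma = central_moment(xs, 2) ** 0.5
    return central_moment(xs, n) / sigma ** n

xs = [1, 2, 2, 3, 3, 3, 4, 4, 5]   # illustrative sample
print(raw_moment(xs, 1))           # sample mean (3.0)
print(central_moment(xs, 2))       # sample variance, population convention
print(standardized_moment(xs, 2))  # always 1 by construction
```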
Special Moments
- 0th moment: $E[X^0] = 1$, i.e. the second probability axiom (total probability is 1)
- 1st moment: $E[X] = \mu$, the mean, the location of the pdf
- 2nd (central) moment: $E[(X - \mu)^2] = \sigma^2$, the variance, the spread of the pdf
    - squaring disregards signs, so the farther mass is from $\mu$, the higher the variance
- 3rd (standardized) moment: the skew, the relative tailed-ness of the pdf
    - cubing preserves signs, so left tails contribute negatively and right tails contribute positively
- 4th (standardized) moment: the kurtosis, the absolute tailed-ness of the pdf
    - the 4th power disregards signs, so the farther mass is from $\mu$, the higher the kurtosis. Similar to variance, but with higher punishment for far-out mass.
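The sign behavior of the odd moments is easy to demonstrate. A small sketch (the two samples are illustrative choices): a symmetric sample has zero skew because the negative and positive cubes cancel, while a sample with a long right tail has positive skew.

```python
def standardized_moment(xs, n):
    # sample version of E[((X - mu)/sigma)^n], population convention
    mu = sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / len(xs)
    return sum(((x - mu) / var ** 0.5) ** n for x in xs) / len(xs)

symmetric = [-2, -1, 0, 1, 2]
right_skewed = [0, 0, 0, 1, 5]   # long right tail

print(standardized_moment(symmetric, 3))     # 0.0: cubes cancel pairwise
print(standardized_moment(right_skewed, 3))  # > 0: right tail dominates
print(standardized_moment(symmetric, 4))     # kurtosis of the symmetric sample
```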
Method of Moments
Motivation. An alternative way to get an estimator quick and dirty, as opposed to MLEs.

alg. Method of Moments (MOM): let $X \sim f(x; \theta_1, \ldots, \theta_k)$. Then to get estimators $\hat{\theta}_1, \ldots, \hat{\theta}_k$:
- let $\mu_j = E[X^j]$, the $j$-th raw moment, which is a function of the parameters
- gather data $x_1, \ldots, x_n$ on $X$ to get an empirical estimate $\hat{\mu}_j = \frac{1}{n} \sum_{i=1}^{n} x_i^j$. Observe that because of the Law of Large Numbers, this means that $\hat{\mu}_j \to \mu_j$ as $n \to \infty$
- let the function which maps $(\theta_1, \ldots, \theta_k) \mapsto (\mu_1, \ldots, \mu_k)$ be invertible
- get a system of equations $\hat{\mu}_j = \mu_j(\theta_1, \ldots, \theta_k)$ for as many $\mu_j$’s as necessary (one per parameter)
- solve the system of equations for $\hat{\theta}_1, \ldots, \hat{\theta}_k$
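The steps above can be sketched end to end. As an illustrative choice (not from the notes), take $X \sim \mathcal{N}(\mu, \sigma^2)$ with both parameters unknown: the moment equations are $E[X] = \mu$ and $E[X^2] = \mu^2 + \sigma^2$, which invert by hand.

```python
import random

random.seed(0)
true_mu, true_sigma = 5.0, 2.0   # illustrative "unknown" parameters
xs = [random.gauss(true_mu, true_sigma) for _ in range(100_000)]

# steps 1-2: empirical raw moments
m1 = sum(xs) / len(xs)                  # estimates E[X] = mu
m2 = sum(x ** 2 for x in xs) / len(xs)  # estimates E[X^2] = mu^2 + sigma^2

# steps 3-5: invert the moment equations
mu_hat = m1
sigma_hat = (m2 - m1 ** 2) ** 0.5

print(mu_hat, sigma_hat)  # close to 5.0 and 2.0 by the LLN
```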