Stable Distributions: Generalised CLT and Series Representation Theorem

This is a continuation of the previous post, which can be found here. In this post we discuss the Generalised CLT and give a (somewhat sloppy) proof of the theorem. We then end with a series representation theorem for the Stable distributions.

The Generalised Central Limit Theorem

Theorem 9: Let {X_1,X_2,\ldots} be iid symmetric random variables with {P(X_1>\lambda) \sim k\lambda^{-\alpha}} as {\lambda \rightarrow\infty} for some {k>0} and {\alpha \in (0,2)}. Then

\displaystyle \frac{X_1+X_2+\cdots+X_n}{n^{1/\alpha}} \stackrel{d}{\rightarrow} Y

where {Y\sim S\alpha S} with an appropriate scale constant.
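
In other words, the limit {Y} is a symmetric {\alpha}-stable random variable with a suitable scale. Concretely, recalling that symmetric {\alpha}-stable laws are characterised by their characteristic functions, {Y} satisfies

\displaystyle E\left[e^{itY}\right] = \exp\left(-c|t|^{\alpha}\right)

for some constant {c>0} determined by {k} and {\alpha}; we will not track the exact value of {c} here.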

Motivation for the proof: Recall the proof of the usual CLT. We assume the {X_i} are iid random variables with mean {0}, variance {\sigma^2} and characteristic function {\phi}. We use the fact that, under the finite variance assumption, we have

\displaystyle \frac{1-\phi(t)}{t^2} \rightarrow \frac{\sigma^2}{2} \qquad \text{as } t \rightarrow 0

and hence, since {n\left(1-\phi\left(\frac{t}{\sqrt{n}}\right)\right) \rightarrow \frac{t^2\sigma^2}{2}} as {n \rightarrow \infty}, we get

\displaystyle \begin{aligned} E\left[e^{i\frac{t}{\sqrt{n}}(X_1+X_2+\cdots+X_n)}\right] = \left[E\left(e^{i\frac{t}{\sqrt{n}}X_1}\right)\right]^n & = \left[\phi\left(\frac{t}{\sqrt{n}}\right)\right]^n \\ & = \left[1-\left(1-\phi\left(\frac{t}{\sqrt{n}}\right)\right)\right]^n \\ & \rightarrow \exp(-t^2\sigma^2/2) \end{aligned}

We will use this idea in our proof. We will leave out some of the technical details; the proof presented here is not rigorous. We primarily focus on the methodology and the tricks used in the proof.
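
Heuristically, the same trick applies here: the tail condition {P(X_1>\lambda) \sim k\lambda^{-\alpha}} forces the small-{t} behaviour {1-\phi(t) \sim c|t|^{\alpha}} as {t \rightarrow 0}, with {c} the constant above (justifying this asymptotic from the tail condition is one of the technical steps we gloss over), and hence

\displaystyle \left[\phi\left(\frac{t}{n^{1/\alpha}}\right)\right]^n = \left[1-\left(1-\phi\left(\frac{t}{n^{1/\alpha}}\right)\right)\right]^n \rightarrow \exp\left(-c|t|^{\alpha}\right),

which is precisely the characteristic function of the {S\alpha S} limit {Y}.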
