# Stable Distributions: Generalised CLT and Series Representation Theorem

This is a continuation of the previous post, which can be found here. In this post we discuss the Generalised CLT and give a sketch of a proof of the theorem. We end with a series representation theorem for the Stable distributions.

## The Generalised Central Limit Theorem

Theorem 9: Let ${X_1,X_2,\ldots}$ be iid symmetric random variables with ${P(X_1>\lambda) \sim k\lambda^{-\alpha}}$ as ${\lambda \rightarrow\infty}$ for some ${k>0}$ and ${\alpha \in (0,2)}$. Then

$\displaystyle \frac{X_1+X_2+\cdots+X_n}{n^{1/\alpha}} \stackrel{d}{\rightarrow} Y$

where ${Y\sim S\alpha S}$ with appropriate constants.
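As a quick sanity check of the theorem's claim, the following sketch (with arbitrary choices ${\alpha = 1.5}$, ${n = 500}$, and a symmetric Pareto-tailed ${X}$ satisfying ${P(X>\lambda) = \frac{1}{2}\lambda^{-\alpha}}$ for ${\lambda \ge 1}$) simulates the normalised sums and checks that they are far heavier-tailed than any Gaussian limit would allow:

```python
import numpy as np

# Monte Carlo sketch of Theorem 9 (parameter choices are mine, not canonical).
# X has symmetric Pareto tails: P(X > lam) = 0.5 * lam^(-alpha) for lam >= 1,
# so the theorem predicts S_n / n^(1/alpha) converges to a symmetric stable law.
rng = np.random.default_rng(0)
alpha, n, m = 1.5, 500, 5000

u = rng.random((m, n))
signs = rng.choice([-1.0, 1.0], size=(m, n))
x = signs * u ** (-1.0 / alpha)          # symmetric Pareto(alpha) variables
z = x.sum(axis=1) / n ** (1.0 / alpha)   # normalised sums

# The limit is heavy-tailed: P(|Z| > lam) ~ lam^(-alpha), unlike a Gaussian.
tail_5 = np.mean(np.abs(z) > 5)
print(tail_5)            # roughly lam^(-alpha) = 5^(-1.5), far above Gaussian tails
print(np.abs(z).max())   # far larger than any Gaussian sample maximum
```

The empirical tail probability stays polynomially large, which is exactly the signature of the ${S\alpha S}$ limit rather than a normal one.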

Motivation for proof: Recall the proof of the usual CLT. We assume ${X_i}$ are iid random variables with mean ${0}$, variance ${\sigma^2}$ and characteristic function ${\phi}$. We use the fact that under the finite variance assumption we have, as ${t \to 0}$,

$\displaystyle \frac{1-\phi(t)}{t^2} \rightarrow \frac{\sigma^2}{2}$

and hence using this we get

$\displaystyle \begin{aligned} E\left[e^{i\frac{t}{\sqrt{n}}(X_1+X_2+\cdots+X_n)}\right] = \left[E\left(e^{i\frac{t}{\sqrt{n}}X_1}\right)\right]^n & = \left[\phi\left(\frac{t}{\sqrt{n}}\right)\right]^n \\ & = \left[1-\left(1-\phi\left(\frac{t}{\sqrt{n}}\right)\right)\right]^n \\ & \rightarrow \exp(-t^2\sigma^2/2) \end{aligned}$

We will use this idea in our proof, leaving out some of the technical details. The proof presented here is not rigorous; we primarily focus on the methodology and the tricks used in the proof.
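The key limit in the classical argument can be checked numerically. As an illustration (my choice of example, not part of the original argument), take ${X \sim \mbox{Uniform}(-1,1)}$, for which ${\phi(t)=\sin(t)/t}$ and ${\sigma^2 = 1/3}$, so ${[\phi(t/\sqrt{n})]^n}$ should approach ${e^{-t^2/6}}$:

```python
import math

# Numeric check of the classical CLT step above for X ~ Uniform(-1, 1):
# phi(t) = sin(t)/t and sigma^2 = 1/3, so [phi(t/sqrt(n))]^n -> exp(-t^2/6).
def phi(t):
    return math.sin(t) / t if t != 0 else 1.0

t = 1.0
for n in (10, 1000, 100000):
    approx = phi(t / math.sqrt(n)) ** n
    print(n, approx, math.exp(-t * t / 6))  # the two columns converge
```

The same computation with a heavy-tailed ${\phi}$ is exactly what the proof of Theorem 9 has to handle, since there ${1-\phi(t)}$ is no longer of order ${t^2}$.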

# Stable Distributions: Properties

This is a continuation of the previous post that can be found here. We will discuss some of the properties of the Stable distributions in this post.

Theorem 2. $Y$ is the limit of $\displaystyle\frac{X_1+X_2+\cdots+X_n-b_n}{a_n}$ for some iid sequence $X_i$ and sequence $a_n >0$ and $b_n$ if and only if $Y$ has a stable law.

Remark: This kind of explains why Stable laws are called Stable!

Proof: The if part follows by taking $X_i$ to be stable itself. We focus on the only if part.

Let $\displaystyle Z_n=\frac{X_1+X_2+\cdots+X_n-b_n}{a_n}$. Fix a $k\in \mathbb{N}$. Let

$\displaystyle S_n^{j}= \sum_{i=1}^n X_{(j-1)n+i}= X_{(j-1)n+1}+X_{(j-1)n+2}+\cdots+X_{jn} \ \ \mbox{for} \ (1\le j \le k)$

Now consider $Z_{nk}$. Observe that

$\displaystyle Z_{nk} = \frac{S_n^1+S_n^2+\cdots+S_n^k-b_{nk}}{a_{nk}}$

$\displaystyle \implies \frac{a_{nk}Z_{nk}}{a_n} = \frac{S_n^1-b_n}{a_n}+\frac{S_n^2-b_n}{a_n}+\cdots+\frac{S_n^k-b_n}{a_n}+\frac{kb_n-b_{nk}}{a_n}$
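The identity above is pure algebra and can be verified mechanically; the sketch below checks it for arbitrary (hypothetical) values of ${n}$, ${k}$, the norming constants ${a_n, b_n}$, and the sample:

```python
import numpy as np

# Sanity check of the telescoping identity above with arbitrary choices
# of n, k, the norming constants a_n, b_n, and the sample X_1, ..., X_{nk}.
rng = np.random.default_rng(1)
n, k = 5, 3
x = rng.normal(size=n * k)
a = {n: 2.0, n * k: 7.0}       # a_n > 0, chosen arbitrarily
b = {n: 1.5, n * k: -4.0}      # b_n, chosen arbitrarily

s = x.reshape(k, n).sum(axis=1)            # block sums S_n^1, ..., S_n^k
z_nk = (s.sum() - b[n * k]) / a[n * k]     # Z_{nk}

lhs = a[n * k] * z_nk / a[n]
rhs = ((s - b[n]) / a[n]).sum() + (k * b[n] - b[n * k]) / a[n]
print(lhs, rhs)                            # the two sides agree
```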

As $n\to \infty$,

$\displaystyle \frac{S_n^1-b_n}{a_n}+\frac{S_n^2-b_n}{a_n}+\cdots+\frac{S_n^k-b_n}{a_n} \stackrel{d}{\to} Y_1+Y_2+\cdots+Y_k$

where the $Y_i$'s are iid copies of $Y$. Let $W_n:=Z_{nk}$ and let

$\displaystyle \begin{aligned} W_n' & :=\frac{a_{nk}Z_{nk}}{a_n}-\frac{kb_n-b_{nk}}{a_n} \\ & = \alpha_nW_n+\beta_n \end{aligned}$

where $\displaystyle\alpha_n:=\frac{a_{nk}}{a_n}$ and $\displaystyle\beta_n:=-\frac{kb_n-b_{nk}}{a_n}$.

Now $W_n \stackrel{d}{\to} Y$ and $W_n'\stackrel{d}{\to} Y_1+Y_2+\cdots+Y_k$. Since $Y$ is non-degenerate, the convergence of types theorem gives $\alpha_n \to \alpha$ and $\beta_n \to \beta$ for some constants $\alpha>0$ and $\beta$, and this ensures

$\alpha Y+\beta \stackrel{d}{=} Y_1+Y_2+\cdots+Y_k$

Note that $\alpha$ and $\beta$ depend only on $k$. Hence the law of $Y$ is stable.

$\square$
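For a concrete illustration of the conclusion, the standard Cauchy law (the stable law with index $1$) satisfies the relation exactly with multiplier $k$ and shift $0$, i.e. $Y_1+\cdots+Y_k \stackrel{d}{=} kY$. The sketch below compares central quantiles of the two sides by simulation (comparing moments is useless here, since the Cauchy law has none):

```python
import numpy as np

# Illustration of the stability relation for the standard Cauchy law:
# Y_1 + ... + Y_k  =d  k * Y, checked via empirical quartiles.
rng = np.random.default_rng(2)
k, m = 3, 200_000

sums = rng.standard_cauchy((m, k)).sum(axis=1)   # Y_1 + ... + Y_k
scaled = k * rng.standard_cauchy(m)              # k * Y

# Central quantiles are robust to the heavy tails; both should be near (-3, 0, 3).
q = [0.25, 0.5, 0.75]
print(np.quantile(sums, q))
print(np.quantile(scaled, q))
```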

# Stable Distributions: An introduction

In this series of posts I will introduce the stable distributions and discuss some of their properties. The theorems and proofs presented here are written in a lucid manner, so that they are accessible to most readers. I have tried my best to keep this discussion as self-contained as possible, so that readers don't have to look up different references to follow the proofs. It is based on a presentation I gave in the Measure Theory course in MStat 1st year.

## Introduction

In statistics, while modelling continuous data, we assume that the data are iid observations coming from some ‘nice’ distribution, and then we do statistical inference. The strongest tool we usually use is the Central Limit Theorem, which states that the sum of a large number of iid variables from a finite-variance distribution will tend to be normally distributed. But this finite variance assumption is not always true: many real-life data, such as financial asset returns, exhibit fat tails, and the finite variance assumption fails for such heavy-tailed data. So there is a need for an alternative model. One such alternative is the Stable distribution. Of course there are other alternatives, but one good reason to use the Stable distribution is that it is supported by the Generalised Central Limit Theorem.

Notation: $U\stackrel{d}{=} V$ means $U$ and $V$ have the same distribution. Throughout this section we assume

$\displaystyle X,X_1,X_2,\ldots \stackrel{iid}{\sim} F \ \ \ \mbox{and} \ S_n=\sum_{i=1}^n X_i$

Motivation: Suppose $F= \mbox{N}(0,\sigma^2)$. In that case we know

$S_n\stackrel{d}{=} \sqrt{n}X$

Motivated by the above relation, we ask ourselves: can we find a distribution for which the above relation holds with some other constant, such as $n$, $\log n$ or $n^{1/3}$, in place of $\sqrt{n}$? Hence we generalise the above relation in the following manner.
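The Gaussian identity above is easy to check by simulation; in the sketch below (parameter choices mine), both $S_n$ and $\sqrt{n}X$ should have standard deviation $\sigma\sqrt{n}$:

```python
import numpy as np

# Quick check of S_n =d sqrt(n) * X for F = N(0, sigma^2): both sides
# should have standard deviation sigma * sqrt(n). Parameters are arbitrary.
rng = np.random.default_rng(3)
sigma, n, m = 2.0, 10, 100_000

s_n = rng.normal(0, sigma, (m, n)).sum(axis=1)  # m replicates of S_n
print(s_n.std())                                # close to sigma * sqrt(n)
print(sigma * np.sqrt(n))
```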

Definition: The distribution $F$ is stable (in the broad sense) if for each $n$ there exist constants $c_n >0$ and $\gamma_n$ such that

$S_n \stackrel{d}{=} c_nX+\gamma_n$

and $F$ is non-degenerate. $F$ is stable in the strict sense if the above relation holds with $\gamma_n=0$.