In this series of posts I will introduce the stable distributions and discuss some of their properties. The theorems and proofs presented here are written in a lucid manner, so that they are accessible to most readers. I have tried my best to make this discussion as self-contained as possible, so that readers don’t have to consult different references to follow the proofs. It is based on a presentation I gave in the Measure Theory course in the first year of MStat.

## Introduction

In statistics, while modelling continuous data, we assume that the data are iid observations coming from some ‘nice’ distribution, and then we do statistical inference. The strongest statistical tool we usually use is the Central Limit Theorem, which states that the (suitably normalized) sum of a large number of iid variables from a finite-variance distribution will tend to be normally distributed. But this finite-variance assumption is not always true: many real-life data, such as returns on financial assets, exhibit fat tails, and the finite-variance assumption does not hold for such heavy-tailed distributions. So there is a need for an alternative model. One such alternative is the stable distribution. Of course there are other alternatives, but one good reason to use stable distributions is that they are supported by the Generalized Central Limit Theorem: stable laws are exactly the possible limits of normalized sums of iid random variables.

**Notations:** $X \overset{d}{=} Y$ means $X$ and $Y$ have the same distribution. Throughout this section we assume $X, X_1, X_2, \ldots$ are iid random variables with common distribution $F$, and we write $S_n = X_1 + X_2 + \cdots + X_n$.

**Motivation:** Suppose $X \sim N(0,1)$. In that case we know
$$X_1 + X_2 + \cdots + X_n \overset{d}{=} n^{1/2}\, X.$$

Motivated by the above relation, we ask ourselves: can we get a distribution such that the above relation holds with some other constant like $n^{1/3}$ or $n$ or $n^2$ instead of $n^{1/2}$? Hence we generalise the above relation in the following manner.

**Definition:** The distribution $F$ is *stable* (in the broad sense) if for each $n$ there exist constants $c_n > 0$ and $\gamma_n$ such that
$$S_n \overset{d}{=} c_n X + \gamma_n,$$
and $F$ is non-degenerate. $F$ is *stable in the strict sense* if the above relation holds with $\gamma_n = 0$.

**Examples:** The normal distribution (with $c_n = n^{1/2}$) and the Cauchy distribution (with $c_n = n$, $\gamma_n = 0$).
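Stability of the Cauchy law is easy to check numerically. The sketch below is my own addition (not part of the original presentation) and assumes NumPy/SciPy; it compares the empirical law of $S_n/n$ for iid standard Cauchy variables against the standard Cauchy cdf with a Kolmogorov–Smirnov test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, N = 50, 100_000  # terms per sum, number of Monte Carlo replications

# S_n / n for iid standard Cauchy variables: strict stability says this
# is again standard Cauchy, since c_n = n and gamma_n = 0.
s_n_over_n = rng.standard_cauchy(size=(N, n)).sum(axis=1) / n

# One-sample KS test against the standard Cauchy cdf; a small statistic
# means the empirical law is indistinguishable from Cauchy at this scale.
stat, pval = stats.kstest(s_n_over_n, stats.cauchy.cdf)
print(stat)
```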

Observe that $\gamma_n$ is like a location constant. We are interested in the scale constant $c_n$: we want to know what the possibilities for $c_n$ are. The following theorem shows that $c_n$ has to be an appropriate power of $n$, so choices like $c_n = \log n$ or $c_n = e^n$ are ruled out.

**Theorem 1:** (Feller, Volume 2) $c_n = n^{1/\alpha}$, where $0 < \alpha \le 2$.
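The theorem can be illustrated numerically for a non-classical index, say $\alpha = 1.5$. The sketch below is my own addition and assumes SciPy's `levy_stable` family (with $\beta = 0$ for the symmetric, hence strictly stable, case): $S_n / n^{1/\alpha}$ should have the same law as a single draw.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
alpha, n, N = 1.5, 20, 20_000

# iid symmetric alpha-stable draws (beta = 0 gives the symmetric case)
x = stats.levy_stable.rvs(alpha, 0.0, size=(N, n), random_state=rng)

# Theorem 1: c_n = n^{1/alpha}, so S_n / n^{1/alpha} has the law of X
s_scaled = x.sum(axis=1) / n ** (1 / alpha)
fresh = stats.levy_stable.rvs(alpha, 0.0, size=N, random_state=rng)

# Two-sample KS test: a small statistic means the two samples are
# consistent with a common distribution.
stat, pval = stats.ks_2samp(s_scaled, fresh)
print(stat)
```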

**Proof:** Note that if $X$ is stable, the symmetrized version of $X$, namely $X^s$ (the distribution of $X_1 - X_2$), is strictly stable with the same constants $c_n$. So without loss of generality assume $X$ is symmetric strictly stable. Hence we have $S_n \overset{d}{=} c_n X$.

**Step-1:** We claim that the ratios $c_m/c_{m+n}$ remain bounded: there is a constant $M$ such that $c_m \le M\, c_{m+n}$ for all $m, n \ge 1$.

Note that
$$S_{m+n} \overset{d}{=} S_m + S'_n, \qquad\text{so}\qquad c_{m+n}\, X \overset{d}{=} c_m X_1 + c_n X_2,$$
as $S_m$ and $S_{m+n} - S_m \overset{d}{=} S_n$ are independent; here $X_1, X_2$ are iid copies of $X$. Hence, since $P(c_n X_2 \ge 0) \ge \tfrac12$ by symmetry,
$$P(c_{m+n} X > t) \ge P(c_m X_1 > t,\; c_n X_2 \ge 0) \ge \tfrac12\, P(c_m X > t).$$

Observe that, substituting $t = s\, c_m$, this reads
$$P\!\left(X > s\,\frac{c_m}{c_{m+n}}\right) \ge \tfrac12\, P(X > s) \qquad \text{for all } s.$$

Now if our claim is not true, we may get sequences $(m_k)$ and $(n_k)$ such that $c_{m_k}/c_{m_k+n_k} \to \infty$ as $k \to \infty$. Using these sequences in the above inequality, with $s > 0$ fixed so that $P(X > s) > 0$, we have
$$P\!\left(X > s\,\frac{c_{m_k}}{c_{m_k+n_k}}\right) \ge \tfrac12\, P(X > s) > 0 \qquad \text{for all } k.$$

The last inequality is strict as $X$ is non-degenerate (and symmetric, so some right tail is positive). But then the probabilities $P(X > T)$ stay bounded away from $0$ for arbitrarily large $T$, which contradicts the tightness of the distribution of $X$. This proves the claim.

**Step-2:** We will show $c_n = n^{1/\alpha}$ for some $\alpha > 0$. Note that
$$S_{mn} = \sum_{i=1}^{m} \left(S_{in} - S_{(i-1)n}\right),$$
a sum of $m$ iid blocks, each distributed as $S_n \overset{d}{=} c_n X$.

Thus
$$c_{mn}\, X \overset{d}{=} c_n\, S_m \overset{d}{=} c_n c_m\, X.$$

Thus $c_{mn} = c_m c_n$ (the scale constant of a non-degenerate symmetric law is unique), and in particular $c_{n^k} = c_n^k$. Now if for some $n \ge 2$ we have $c_n = 1$, then $S_n \overset{d}{=} X$. Taking characteristic functions we have
$$\varphi(t)^n = \varphi(t) \qquad \text{for all } t.$$

As $X$ is symmetric, $\varphi$ is real. Hence for each $t$, either $\varphi(t) = 0$ or $\varphi(t)^{n-1} = 1$, so $\varphi(t) \in \{-1, 0, 1\}$. But since $\varphi$ is continuous and $\varphi(0) = 1$, we have $\varphi \equiv 1$, which forces $X$ to be degenerate.

Also if for some $n \ge 2$ we have $c_n < 1$, then $c_{n^k} = c_n^k \to 0$ as $k \to \infty$, so $c_1/c_{n^k} \to \infty$. But then it contradicts Step 1. Thus we must have $c_n > 1$ for all $n \ge 2$. Finally, for each $n \ge 2$, we choose an $\alpha$ (depending on $n$) such that $c_n = n^{1/\alpha}$; this is possible since $c_n > 1$. We will show the choice of $\alpha$ is independent of $n$. This will prove Step 2.

Therefore we wish to show
$$\frac{\log c_m}{\log m} = \frac{\log c_n}{\log n} \qquad \text{for all } m, n \ge 2.$$

Note that $c_{m^j} = c_m^j$ and $c_{n^k} = c_n^k$. For each $j$, there exists a $k$ such that $n^k \le m^j < n^{k+1}$. Then, by Step 1,
$$c_n^k = c_{n^k} \le M\, c_{m^j} = M\, c_m^j.$$

Thus, taking logarithms (all of them positive since $c_m, c_n > 1$) and dividing by $j$,
$$\frac{k}{j}\,\log c_n \le \frac{\log M}{j} + \log c_m.$$

But as $j \to \infty$, $k \to \infty$ as well, and $k/j \to \log m/\log n$ since $n^k \le m^j < n^{k+1}$. By Step 1 the left-hand side of the inequality $c_{n^k}/c_{m^j} \le M$ always remains bounded, and letting $j \to \infty$ above gives
$$\frac{\log m}{\log n}\,\log c_n \le \log c_m, \qquad\text{i.e.,}\qquad \frac{\log c_n}{\log n} \le \frac{\log c_m}{\log m}.$$
Similarly, by changing the roles of $m$ and $n$, $\dfrac{\log c_m}{\log m} \le \dfrac{\log c_n}{\log n}$. Hence the two ratios are equal, and $c_n = n^{1/\alpha}$ with the same $\alpha$ for every $n$.

**Step-3:** We will show $0 < \alpha \le 2$. Note that if $X$ has a finite second moment, taking variances in $S_n \overset{d}{=} c_n X$ we have
$$n \operatorname{Var}(X) = c_n^2 \operatorname{Var}(X), \qquad\text{so } c_n = n^{1/2}, \text{ i.e., } \alpha = 2.$$
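This variance computation is easy to verify by simulation (my own sketch, not part of the original proof, assuming NumPy; normal variables stand in for a finite-variance $X$): the empirical variance of $S_n$ divided by $n$ should return $\operatorname{Var}(X)$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, N = 30, 200_000  # terms per sum, Monte Carlo replications

# X ~ N(0, 1), so Var(X) = 1 and Var(S_n) = n; matching c_n^2 Var(X)
# forces c_n = sqrt(n), i.e. alpha = 2 for finite-variance stable laws.
x = rng.normal(0.0, 1.0, size=(N, n))
var_ratio = x.sum(axis=1).var() / n
print(var_ratio)  # close to Var(X) = 1
```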

We will show that for $\alpha > 2$, $X$ will have a finite second moment, which will force $\alpha = 2$, a contradiction. For this, we need the following lemma.

**Lemma 1:** Let $X_1, X_2, \ldots, X_n$ be iid with common distribution $F$, where $F$ is symmetric, and let $S_n = X_1 + \cdots + X_n$. Then
$$P(|S_n| > t) \ge \frac{1}{2}\left(1 - e^{-n\, P(|X_1| > t)}\right).$$
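The lemma is easy to sanity-check by simulation (my own addition, assuming NumPy; the standard normal plays the role of the symmetric summand):

```python
import numpy as np

rng = np.random.default_rng(3)
n, t, N = 10, 2.0, 100_000

# Estimate both sides of Lemma 1 for symmetric X ~ N(0, 1):
#   P(|S_n| > t)  >=  (1/2) * (1 - exp(-n * P(|X_1| > t)))
x = rng.normal(size=(N, n))
lhs = np.mean(np.abs(x.sum(axis=1)) > t)   # Monte Carlo P(|S_n| > t)
p_tail = np.mean(np.abs(x[:, 0]) > t)      # Monte Carlo P(|X_1| > t)
rhs = 0.5 * (1.0 - np.exp(-n * p_tail))
print(lhs, rhs)  # the left-hand side dominates
```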

We skip the proof of this lemma. One can find a proof of the lemma in Feller, Volume 2. Since $S_n/c_n \overset{d}{=} X$, and a single distribution is tight, get $t > 0$ such that $P(|S_n| > c_n t) < \tfrac14$ for all $n$. Clearly $n\, P(|X| > c_n t)$ must be bounded; otherwise it will contradict the lemma. Thus $x^\alpha\, P(|X| > x)$ is bounded for all $x > t$ (interpolating between the successive points $x = c_n t = n^{1/\alpha} t$), so this implies $x^\alpha\, P(|X| > x)$ is also bounded for all $x > 0$. So suppose
$$P(|X| > x) \le C\, x^{-\alpha} \qquad \text{for all } x > 0.$$

Then, by the tail formula for the second moment,
$$E[X^2] = \int_0^\infty 2x\, P(|X| > x)\, dx \le \int_0^1 2x\, dx + \int_1^\infty 2C\, x^{1-\alpha}\, dx < \infty,$$
since $\alpha > 2$ makes the last integral converge. So $X$ has a finite second moment, which forces $c_n = n^{1/2}$, i.e., $\alpha = 2$, a contradiction. Hence $0 < \alpha \le 2$.

This completes the proof.

We will look into some standard properties of stable distributions in the next post.