Stable Distributions: Properties

This is a continuation of the previous post, which can be found here. In this post we will discuss some properties of stable distributions.

Theorem 2. Y is the limit in distribution of \displaystyle\frac{X_1+X_2+\cdots+X_n-b_n}{a_n} for some iid sequence X_i and constants a_n >0 and b_n if and only if Y has a stable law.

Remark: This kind of explains why Stable laws are called Stable!
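Before the proof, here is a quick numerical illustration of the "if" direction in the Cauchy case (\alpha=1). A minimal Python sketch, assuming numpy and scipy are available; the choice a_n = n, b_n = 0 is the standard one for the Cauchy law:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n, reps = 100, 100_000           # summands per sum, independent replications
X = rng.standard_cauchy(size=(reps, n))

# Z_n = (X_1 + ... + X_n - b_n) / a_n with a_n = n, b_n = 0
Z = X.sum(axis=1) / n

# The empirical quantiles of Z_n match the standard Cauchy quantiles:
# in this case the law of Z_n is exactly Cauchy for every n, not just in the limit.
probs = [0.1, 0.25, 0.5, 0.75, 0.9]
print(np.round(np.quantile(Z, probs), 3))   # empirical
print(np.round(stats.cauchy.ppf(probs), 3))  # theoretical
```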

Proof: The "if" part follows by taking the X_i to be iid copies of Y itself. We focus on the "only if" part.

Let \displaystyle Z_n=\frac{X_1+X_2+\cdots+X_n-b_n}{a_n}. Fix a k\in \mathbb{N}. Let

\displaystyle S_n^{j}= \sum_{i=1}^n X_{(j-1)n+i}= X_{(j-1)n+1}+X_{(j-1)n+2}+\cdots+X_{jn} \ \ \mbox{for} \ (1\le j \le k)

Now consider Z_{nk}. Observe that

\displaystyle Z_{nk} = \frac{S_n^1+S_n^2+\cdots+S_n^k-b_{nk}}{a_{nk}}

\displaystyle \implies \frac{a_{nk}Z_{nk}}{a_n} = \frac{S_n^1-b_n}{a_n}+\frac{S_n^2-b_n}{a_n}+\cdots+\frac{S_n^k-b_n}{a_n}+\frac{kb_n-b_{nk}}{a_n}

As n\to \infty,

\displaystyle \frac{S_n^1-b_n}{a_n}+\frac{S_n^2-b_n}{a_n}+\cdots+\frac{S_n^k-b_n}{a_n} \stackrel{d}{\to} Y_1+Y_2+\cdots+Y_k

where the Y_i's are iid copies of Y. Set W_n:=Z_{nk} and let

\displaystyle \begin{aligned} W_n' & :=\frac{a_{nk}Z_{nk}}{a_n}-\frac{kb_n-b_{nk}}{a_n} \\ & = \alpha_nW_n+\beta_n \end{aligned}

where \displaystyle\alpha_n:=\frac{a_{nk}}{a_n} and \displaystyle\beta_n:=-\frac{kb_n-b_{nk}}{a_n}.

Now W_n \stackrel{d}{\to} Y and W_n'\stackrel{d}{\to} Y_1+Y_2+\cdots+Y_k. If we can show that \alpha_n \to \alpha and \beta_n \to \beta, then this will ensure

\alpha Y+\beta \stackrel{d}{=} Y_1+Y_2+\cdots+Y_k

Note that \alpha and \beta depend only on k. Hence the law of Y is stable.

\square

The fact that \alpha_n \to \alpha and \beta_n \to \beta follows from the well-known convergence of types theorem, which we state below.

Theorem 3. (Convergence of types) Suppose W_n \stackrel{d}{\to} W and there are constants \alpha_n>0 and \beta_n such that W_n'=\alpha_nW_n+\beta_n \stackrel{d}{\to} W'. If W and W' are non-degenerate, then there are constants \alpha>0 and \beta such that \alpha_n \to \alpha and \beta_n \to \beta, and consequently W'\stackrel{d}{=}\alpha W+\beta.

Proof: The proof involves analytic arguments using characteristic functions, and we will skip it. Interested readers may look into Durrett [1] for a proof.

\square
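A heuristic way to see why \alpha_n and \beta_n must converge: location and scale can be read off the quantiles of W_n and W_n', and quantiles of weakly convergent sequences converge at continuity points. Here is a rough numerical sketch of that idea; all the distributions and the true values \alpha_n \to 2, \beta_n \to 3 are my own illustrative choices, not from the theorem:

```python
import numpy as np

rng = np.random.default_rng(1)

for n in [10, 100, 1000, 10_000]:
    # W_n -> N(0,1) in distribution (slightly perturbed normals)
    W = rng.normal(loc=1.0 / n, scale=1.0 + 1.0 / n, size=200_000)
    # W_n' = alpha_n W_n + beta_n with alpha_n -> 2, beta_n -> 3
    Wp = (2 + 1.0 / n) * W + (3 - 2.0 / n)

    # Recover alpha_n from interquartile ranges and beta_n from medians
    q1, q3 = np.quantile(W, [0.25, 0.75])
    p1, p3 = np.quantile(Wp, [0.25, 0.75])
    a_hat = (p3 - p1) / (q3 - q1)
    b_hat = np.median(Wp) - a_hat * np.median(W)
    print(n, round(a_hat, 3), round(b_hat, 3))   # approaches 2 and 3
```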

So far we have introduced stable distributions in the abstract sense. Except for \alpha=1 (Cauchy) and \alpha=2 (normal), we don't know whether stable distributions exist at all for other \alpha's. We will now show existence through characteristic functions, focusing only on symmetric stable distributions. Let us assume

\displaystyle X_1+X_2+\cdots+X_k\stackrel{d}{=}k^{1/\alpha}X \ \ \forall \ k\in\mathbb{N} \ \ \ \ \ \ (*)

where X, X_1,X_2, \ldots are iid with common distribution F, and F is symmetric.

Theorem 4. The characteristic function of F is given by \psi(t)=e^{-c|t|^{\alpha}} for some constant c \ge 0.

Comment: I didn't find a simple proof of this fact in books. I asked a few professors at our institute, and they were not aware of any simple proof either. The following proof is my own, largely inspired by Avi Levy's idea. Of course, I do not claim that we are the first to discover it!

Proof: (Sayan and Avi Levy) Let \psi(t) be the characteristic function of F. Since F is symmetric, \psi(t) is a real and even function. Observe that from the relation (*) we have

\displaystyle E(e^{it(X_1+X_2+\cdots+X_k)})=E(e^{itk^{1/\alpha}X}) \ \ \forall \ k\in \mathbb{N} \ \ \mbox{and} \ \ \forall \ t
\displaystyle \implies (\psi(t))^k= \psi(tk^{1/\alpha}) \ \ \forall \ k\in \mathbb{N} \ \ \mbox{and} \ \ \forall \ t \ \ \ \ \ \ \ (1)

Note that if we put \displaystyle t=\frac{x}{2^{1/\alpha}} and k=2 in (1), then

\displaystyle \psi(x)=\psi\left(\frac{x}{2^{1/\alpha}}\cdot 2^{1/\alpha}\right)=\psi\left(\frac{x}{2^{1/\alpha}}\right)^2 \ge 0

Thus \psi is non-negative. Hence \psi(t) has a unique non-negative k-th root for every positive integer k. We will now use this.

Observe that for all m,n \in \mathbb{N} we have

\displaystyle \begin{aligned} \psi\left(\left(\frac{m}{n}\right)^{1/\alpha}\right) & = \psi\left(\frac{1}{n^{1/\alpha}}\cdot m^{1/\alpha}\right) \\ & = \left(\psi\left(\frac1{n^{1/\alpha}}\right)\right)^m \ \ \mbox{(putting} \ t=\dfrac{1}{n^{1/\alpha}} \  \mbox{and} \ k=m \ \mbox{in} \ (1)) \\ & = (\psi(1))^{m/n} \ \ \mbox{(putting} \ t=\dfrac{1}{n^{1/\alpha}} \ \mbox{and} \ k=n \ \mbox{in} \ (1) \ \mbox{and taking the unique non-negative} \ n\mbox{-th root)} \end{aligned}

Write \psi(1)= e^{-c} for some c\ge 0 (to be justified later). Since \psi(t)=\psi(-t), we have

\displaystyle\psi(t)=e^{-c|t|^\alpha} \ \ \ \ \ \ \ (2)

whenever |t|^\alpha is rational. Since \psi is continuous and (2) holds on a dense subset of \mathbb{R}, we have

\displaystyle \psi(t)=e^{-c|t|^\alpha} \ \ \forall t \in \mathbb{R}

Now we only have to justify the existence of the constant c \ge 0. Note that \psi(1)=E(\cos X) \le 1 and \psi(1) \ge 0. If \psi(1)=0, then \displaystyle \psi\left(\frac{1}{n^{1/\alpha}}\right)=\psi(1)^{1/n}=0 for all n, which forces \psi(0)=0 by continuity of \psi, contradicting \psi(0)=1. Thus 0 < \psi(1) \le 1 and c can be taken to be - \log \psi(1).

\square
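The functional equation (1), which drives the proof above, can be sanity-checked numerically through the empirical characteristic function. A minimal sketch for the normal case \alpha=2, where \psi(t)=e^{-t^2/2}; the sample size and test values are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(1)

# Empirical characteristic function of a symmetric sample:
# psi_hat(t) = mean(cos(t X)); the imaginary (sine) part vanishes by symmetry.
X = rng.standard_normal(1_000_000)
psi_hat = lambda t: np.cos(t * X).mean()

alpha, k, t = 2.0, 4, 0.7
print(psi_hat(t) ** k)                  # (psi(t))^k
print(psi_hat(t * k ** (1 / alpha)))    # psi(t k^{1/alpha})
print(np.exp(-k * t ** 2 / 2))          # exact value e^{-k t^2/2}
```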

Finally, we show that symmetric stable distributions actually exist, by showing that e^{-c|t|^\alpha} is the characteristic function of some random variable for 0<\alpha\le 2.

Theorem 6: e^{-c|t|^{\alpha}}, where 0<\alpha\le 2 and c>0, is a characteristic function.

Proof: The case \alpha=2 is settled by the normal distribution, so let us assume 0<\alpha <2. Note that for any \beta and |x|\le 1 we have

\displaystyle (1-x)^\beta = \sum_{n=0}^\infty \binom{\beta}{n}(-x)^n

where \displaystyle \binom{\beta}{n}=\frac{\beta(\beta-1)\cdots (\beta-n+1)}{n!}.

Let \displaystyle\psi(t)=1-(1-\cos t)^{\alpha/2}=\sum_{n=1}^{\infty} c_n(\cos t)^n where

\displaystyle \begin{aligned} c_n & = \binom{\alpha/2}{n}(-1)^{n+1} \\ & = \frac{\frac{\alpha}2\left(1-\frac{\alpha}{2}\right)\left(2-\frac{\alpha}{2}\right)\cdots\left(n-1-\frac{\alpha}{2}\right)}{n!} \end{aligned}

Since \alpha<2 we have c_n \ge 0, and \psi(0)=1 implies \sum_{n=1}^{\infty} c_n =1. Note that \cos t is a characteristic function (of a random variable taking the values \pm 1 with probability \frac12 each), and hence (\cos t)^n is a characteristic function for every n. Therefore, \psi(t) is a characteristic function, as it is a convex combination of characteristic functions; these claims about the c_n can also be checked numerically, as in the sketch below.
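Here is a small sketch of that check, using the recurrence c_{n+1} = c_n\,(n-\alpha/2)/(n+1), which follows directly from the product formula for c_n; the value of \alpha is arbitrary:

```python
alpha = 1.5                        # any 0 < alpha < 2
c, total = alpha / 2, alpha / 2    # c_1 = alpha/2
for n in range(1, 500_000):
    c *= (n - alpha / 2) / (n + 1)   # c_{n+1} = c_n (n - alpha/2)/(n+1)
    assert c >= 0                    # positivity needs alpha < 2
    total += c
print(total)   # partial sums creep up to 1 (the tail decays like n^{-alpha/2})
```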

It follows that \displaystyle \psi_n(t)=[\psi(t\cdot 2^{1/2}\cdot n^{-1/\alpha})]^n \ \mbox{is a characteristic function for every} \ n.

For t>0 we have

\displaystyle \begin{aligned} \lim_{n\to\infty} \psi_n(t) & = \lim_{n\to\infty} \left[1-\left(1-\cos \frac{t\sqrt{2}}{n^{1/\alpha}}\right)^{\alpha/2}\right]^n = e^{-t^\alpha} \end{aligned}

The last equality holds because 1-\cos x \sim x^2/2 as x\to 0. Hence

\displaystyle \lim_{n\to\infty} n\left(1-\cos \frac{t\sqrt{2}}{n^{1/\alpha}}\right)^{\alpha/2}=\lim_{n\to\infty} n\left(\frac{2t^2}{n^{2/\alpha}}\cdot\frac12\right)^{\alpha/2} =t^\alpha

Since \psi_n(t)=\psi_n(-t), we have \psi_n(t) \to e^{-|t|^\alpha} for every t. Since e^{-|t|^\alpha} is continuous at t=0, Lévy's continuity theorem gives that e^{-|t|^\alpha} is a characteristic function. Hence e^{-c|t|^\alpha} is a characteristic function as well (replace the underlying random variable X by c^{1/\alpha}X).

\square
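The convergence at the end of the proof is easy to check numerically; a minimal sketch with arbitrary test values of \alpha and t:

```python
import numpy as np

alpha, t = 1.2, 1.5   # any 0 < alpha < 2 and any t > 0

for n in [10, 1_000, 100_000, 10_000_000]:
    # psi_n(t) = [1 - (1 - cos(t sqrt(2) / n^{1/alpha}))^{alpha/2}]^n
    psi_n = (1 - (1 - np.cos(t * np.sqrt(2) * n ** (-1.0 / alpha))) ** (alpha / 2)) ** n
    print(n, psi_n)

print("limit:", np.exp(-t ** alpha))   # e^{-t^alpha}
```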

We will now use the notation S\alpha S(c) to denote a symmetric stable random variable with characteristic function \psi(t)=e^{-c|t|^{\alpha}}. We will drop the parameter c from the notation when it is of no interest. From now on we assume 0<\alpha <2, so we do not consider the normal case any more.

Theorem 7: If X\sim S\alpha S, then P(|X|>\lambda) \sim k_{\alpha}\lambda^{-\alpha} as \lambda \to \infty, for some constant k_\alpha > 0.

Proof: The proof is beyond the scope of our discussion; I will probably add it (if required) later. Readers may look into Samorodnitsky and Taqqu [5] for a proof.
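Even without the proof, the tail behavior is visible in simulation. The sketch below draws standard S\alpha S samples using the Chambers–Mallows–Stuck method for the symmetric case — a standard sampler, not something from this post — and checks that \lambda^\alpha P(|X|>\lambda) levels off at a positive constant:

```python
import numpy as np

rng = np.random.default_rng(2)

def sas(alpha, size):
    """Standard SaS samples via the Chambers-Mallows-Stuck construction (beta = 0)."""
    U = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * U) / np.cos(U) ** (1 / alpha)
            * (np.cos((1 - alpha) * U) / W) ** ((1 - alpha) / alpha))

alpha = 1.3
X = np.abs(sas(alpha, 10_000_000))
for lam in [5, 10, 20, 40, 80]:
    # lambda^alpha P(|X| > lambda) should stabilize near k_alpha > 0
    print(lam, round(lam ** alpha * np.mean(X > lam), 4))
```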

Theorem 8: If X\sim S\alpha S, then E|X|^p <\infty for all 0<p<\alpha, and E|X|^\alpha =\infty.

Proof: By Theorem 7 there exists a constant M such that x^\alpha P(|X|>x) \le M for all x>0. Let 0<p<\alpha. Then

\displaystyle \begin{aligned} E(|X|^p) & \le 1+\sum_{k=0}^\infty E|X|^p\mathbb{I}_{2^{k-1}<|X|\le 2^k} \\ & \le 1+\sum_{k=0}^\infty 2^{kp}P(2^{k-1}<|X|\le 2^k) \\ & \le 1+\sum_{k=0}^\infty 2^{kp}P(|X|>2^{k-1}) \\ & \le 1+\sum_{k=0}^\infty 2^{kp}\cdot\frac{M}{2^{(k-1)\alpha}} = 1+2^{\alpha}M\sum_{k=0}^{\infty} 2^{k(p-\alpha)} < \infty \end{aligned}

\displaystyle \begin{aligned} E(|X|^\alpha) & \ge \sum_{k=0}^\infty E|X|^\alpha\mathbb{I}_{2^{k-1}<|X|\le 2^k} \\ & \ge \sum_{k=0}^\infty 2^{(k-1)\alpha}P(2^{k-1}<|X|\le 2^k) \\ & = \sum_{k=0}^\infty 2^{(k-1)\alpha}[P(|X|>2^{k-1})-P(|X|>2^{k})] \end{aligned}

Using the tail asymptotics of S\alpha S random variables (Theorem 7), we have

\lim_{k\to \infty} 2^{(k-1)\alpha}[P(|X|>2^{k-1})-P(|X|>2^{k})] = k_{\alpha}(1-2^{-\alpha}) > 0

Hence the terms of the last sum do not tend to 0, so the sum diverges, implying E|X|^\alpha = \infty.

\square
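Theorem 8 can also be seen in simulation: running averages of |X|^p settle down for p<\alpha but keep drifting for p=\alpha. A rough sketch, reusing the same Chambers–Mallows–Stuck sampler as in the previous sketch, with arbitrary choices of \alpha and p:

```python
import numpy as np

rng = np.random.default_rng(3)

def sas(alpha, size):
    # same Chambers-Mallows-Stuck sampler as in the previous sketch
    U = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * U) / np.cos(U) ** (1 / alpha)
            * (np.cos((1 - alpha) * U) / W) ** ((1 - alpha) / alpha))

alpha = 1.5
A = np.abs(sas(alpha, 5_000_000))
for p in [0.5, 1.0, 1.4, alpha]:
    prefix_means = [np.mean(A[:m] ** p) for m in (10_000, 100_000, 1_000_000, 5_000_000)]
    print(p, [round(v, 3) for v in prefix_means])
# For p < alpha the prefix means stabilize (E|X|^p < infinity);
# for p = alpha they keep growing, in line with E|X|^alpha = infinity.
```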

We end this section with an application of Theorem 8.

Application: Let Y_1,Y_2,\ldots be iid random variables with E\sqrt{|Y_1|} <\infty. It is well known, via the Marcinkiewicz–Zygmund law, that

\displaystyle \frac{Y_1+Y_2+\cdots+Y_n}{n^2} \stackrel{a.s.}{\to} 0

We wish to know whether there exists a constant \delta >0 small enough such that

\displaystyle \frac{Y_1+Y_2+\cdots+Y_n}{n^{2-\delta}} \stackrel{a.s.}{\to} 0 \ ?

Assume such a \delta >0 exists; without loss of generality, \delta<1. Set \alpha=\dfrac{1}{2-\delta} and observe that \frac12 <\alpha <1. Take the law of the Y_i's to be S\alpha S. Then E\sqrt{|Y_1|}<\infty by Theorem 8, since \frac12<\alpha. But by the defining relation (*) of the S\alpha S distribution,

\displaystyle \frac{Y_1+Y_2+\cdots+Y_n}{n^{2-\delta}} =\frac{Y_1+Y_2+\cdots+Y_n}{n^{1/\alpha}} \stackrel{d}{=} Y_1 \not\equiv 0

Hence we get a contradiction. Thus there exists no such \delta.

\square
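A concrete numerical version of this argument, taking \delta=\frac12 so that \alpha=\frac{1}{2-\delta}=\frac23: the spread of S_n/n^{2-\delta} does not shrink as n grows, exactly as the distributional identity above predicts. (Same Chambers–Mallows–Stuck sampler as in the earlier sketches.)

```python
import numpy as np

rng = np.random.default_rng(4)

def sas(alpha, size):
    # same Chambers-Mallows-Stuck sampler as in the earlier sketches
    U = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * U) / np.cos(U) ** (1 / alpha)
            * (np.cos((1 - alpha) * U) / W) ** ((1 - alpha) / alpha))

delta = 0.5
alpha = 1 / (2 - delta)            # = 2/3, which lies in (1/2, 1)
for n in [10, 100, 1000]:
    S = sas(alpha, (5_000, n)).sum(axis=1)
    Z = S / n ** (2 - delta)       # = S / n^{1/alpha}, same law as Y_1
    q1, q3 = np.quantile(Z, [0.25, 0.75])
    print(n, round(q3 - q1, 3))    # interquartile range stays put; no decay to 0
```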

In the next post we will look at the generalised CLT, which is what makes stable distributions so famous.
