Tracy-Widom Law: Vague convergence and Ledoux Bound

This is a continuation of the previous post, available here. In the previous post we developed the ingredients required for the proof of vague convergence. Let us now return to the random matrix setting.

Proof of Theorem 4: Let X_n be a sequence of n\times n GUE matrices and let \lambda_1,\ldots, \lambda_n be the eigenvalues of X_n. Fix -\infty < t< t' < \infty. Let us first evaluate the limit

\displaystyle \lim_{n\to \infty} P\left[n^{2/3}\left(\frac{\lambda_i}{\sqrt{n}}-2\right) \not\in (t,t'), \ \ i=1,2, \ldots, n\right]

Observe that by using Theorem 3, we have

\displaystyle \begin{aligned} & P\left[n^{2/3}\left(\frac{\lambda_i}{\sqrt{n}}-2\right) \not\in (t,t'), \ \ i=1,2, \ldots, n\right] \\ & = P\left[\lambda_i \not\in \left(2\sqrt{n}+\frac{t}{n^{1/6}},2\sqrt{n}+\frac{t'}{n^{1/6}}\right), \ \ i=1,2, \ldots, n \right] \\ & = 1+\sum_{k=1}^{\infty} \frac{(-1)^k}{k!}\int\limits_{2\sqrt{n}+\frac{t}{n^{1/6}}}^{2\sqrt{n}+\frac{t'}{n^{1/6}}}\cdots\int\limits_{2\sqrt{n}+\frac{t}{n^{1/6}}}^{2\sqrt{n}+\frac{t'}{n^{1/6}}} \det_{i,j=1}^k K^{(n)}(x_i,x_j)\prod_{i=1}^{k}dx_i \\ & = 1+\sum_{k=1}^{\infty} \frac{(-1)^k}{k!} \int_{t}^{t'}\cdots\int_{t}^{t'} \det_{i,j=1}^k \left[\frac1{n^{1/6}}K^{(n)}\left(2\sqrt{n}+\frac{x_i}{n^{1/6}},2\sqrt{n}+\frac{x_j}{n^{1/6}}\right)\right]\prod_{i=1}^k dx_i \end{aligned}

where in the last line we use the change-of-variables formula. Let us define

\displaystyle A^{(n)}(x,y):=\frac1{n^{1/6}}K^{(n)}\left(2\sqrt{n}+\frac{x}{n^{1/6}},2\sqrt{n}+\frac{y}{n^{1/6}}\right)

Note that the A^{(n)} are kernels since K^{(n)} is a kernel. Let us also define \displaystyle \phi_n(x) = n^{1/12}\psi_n\left(2\sqrt{n}+\frac{x}{n^{1/6}}\right). Then, using Proposition 1(c), we have

\displaystyle A^{(n)}(x,y)=\frac{\phi_n(x)\phi_n'(y)-\phi_n'(x)\phi_n(y)}{x-y}-\frac1{2n^{1/3}}\phi_n(x)\phi_n(y)

Note that Proposition 4 implies that for every C>1, \phi_n \to Ai uniformly over the ball of radius C in the complex plane. Since the \phi_n are entire functions, Cauchy's integral formula then gives \phi_n' \to Ai' and \phi_n'' \to Ai'' uniformly on compact sets. Hence for t\le x,y \le t' we have

\displaystyle \sup_{t\le x,y\le t'}\left|A^{(n)}(x,y)-A(x,y)\right| \to 0

where A denotes the Airy kernel, \displaystyle A(x,y)=\frac{Ai(x)Ai'(y)-Ai'(x)Ai(y)}{x-y}.

Hence by the continuity lemma for Fredholm determinants we have

\displaystyle \begin{aligned} & \lim_{n\to \infty} P\left[n^{2/3}\left(\frac{\lambda_i}{\sqrt{n}}-2\right) \not\in (t,t'), \ \ i=1,2, \ldots, n\right] \\ & = \lim_{n\to\infty} \Delta(A^{(n)}) \\ & = \Delta(A) \\ & = 1+\sum_{k=1}^{\infty} \frac{(-1)^k}{k!} \int_{t}^{t'}\cdots\int_{t}^{t'} \det_{i,j=1}^k A(x_i,x_j) \prod_{i=1}^k dx_i \end{aligned}

where the measure is the Lebesgue measure on the bounded interval \displaystyle (t,t'). This completes the proof of Theorem 4.

\square

If we put t'=\infty on both sides, then this gives us the vague convergence (why? this is discussed later). But life is not that simple! Putting t'=\infty makes the underlying measure infinite, so we need a little more rigor to show that letting t'\to \infty is indeed possible and that the result remains the same.
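
Before doing this rigorously, here is a quick numerical sanity check (a sketch, not part of the proof). Discretizing the series above on Gauss-Legendre nodes (a Nystrom-type approximation of the Fredholm determinant) and increasing t' shows the values stabilizing very quickly; the stable value is the GUE Tracy-Widom distribution function at t, i.e. the right-hand side of Theorem 5. The function name and the choice of 60 quadrature nodes below are purely illustrative.

```python
import numpy as np
from scipy.special import airy

def fredholm_det_airy(t, t_upper, m=60):
    """Approximate 1 + sum_k (-1)^k/k! int...int det[A(x_i,x_j)] dx over (t, t_upper)
    by a determinant on m Gauss-Legendre nodes (Nystrom discretization)."""
    nodes, weights = np.polynomial.legendre.leggauss(m)
    x = 0.5 * (t_upper - t) * nodes + 0.5 * (t_upper + t)
    w = 0.5 * (t_upper - t) * weights
    X, Y = np.meshgrid(x, x, indexing="ij")
    AiX, AipX, _, _ = airy(X)
    AiY, AipY, _, _ = airy(Y)
    with np.errstate(divide="ignore", invalid="ignore"):
        K = (AiX * AipY - AipX * AiY) / (X - Y)        # Airy kernel off the diagonal
    ai, aip, _, _ = airy(x)
    np.fill_diagonal(K, aip**2 - x * ai**2)            # diagonal limit Ai'(x)^2 - x Ai(x)^2
    sw = np.sqrt(w)
    return np.linalg.det(np.eye(m) - sw[:, None] * K * sw[None, :])

t = -2.0
for t_upper in [0.0, 2.0, 4.0, 8.0]:
    print(t_upper, fredholm_det_airy(t, t_upper))      # values settle to several digits by t_upper of about 4
```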

Note that

\displaystyle \begin{aligned} & 1+\sum_{k=1}^{\infty} \frac{(-1)^k}{k!} \int_{t}^{\infty}\cdots\int_{t}^{\infty} \det_{i,j=1}^k A(x_i,x_j) \prod_{i=1}^k dx_i \\ & = 1+\sum_{k=1}^{\infty} \frac{(-1)^k}{k!} \int_{t}^{\infty}\cdots\int_{t}^{\infty} \det_{i,j=1}^k e^{x_i+x_j}A(x_i,x_j) \prod_{i=1}^k e^{-2x_i}dx_i \end{aligned}

Note that B(x,y):=e^{x+y}A(x,y)\mathbf{1}_{x,y>t} is a kernel by Proposition 3, and the measure e^{-2x}dx has finite mass; hence the above series converges absolutely and can be expressed as a Fredholm determinant.

We further note that

\displaystyle \begin{aligned} |f_{t'}(k)| & := \left|\frac{(-1)^k}{k!} \int_{t}^{t'}\cdots\int_{t}^{t'} \det_{i,j=1}^k A(x_i,x_j) \prod_{i=1}^k dx_i\right| \\ & \le \frac{1}{k!}  \int_{t}^{t'}\cdots\int_{t}^{t'} \left|\det_{i,j=1}^k A(x_i,x_j)\right| \prod_{i=1}^k dx_i  \\ & \le  \frac{1}{k!}  \int_{t}^{\infty}\cdots\int_{t}^{\infty} \left|\det_{i,j=1}^k B(x_i,x_j)\right| \prod_{i=1}^k e^{-2x_i}dx_i := g(k) \end{aligned}

Note that g(k) is finite. In fact \displaystyle \sum_{k=1}^{\infty} g(k) is also finite, due to the absolute convergence of the Fredholm determinant. Hence, by the dominated convergence theorem (applied to the sum over k)

\displaystyle \begin{aligned} &  \lim_{t' \to \infty} \lim_{n\to \infty} P\left[n^{2/3}\left(\frac{\lambda_i}{\sqrt{n}}-2\right) \not\in (t,t'), \ \ i=1,2, \ldots, n\right] \\ & = \lim_{t' \to \infty} \left[1+\sum_{k=1}^{\infty} \frac{(-1)^k}{k!} \int_{t}^{t'}\cdots\int_{t}^{t'} \det_{i,j=1}^k A(x_i,x_j) \prod_{i=1}^k dx_i \right]\\ & = 1+\lim_{t'\to\infty} \sum_{k=1}^{\infty} f_{t'}(k) \\ & = 1+\sum_{k=1}^{\infty}\lim_{t'\to\infty} f_{t'}(k) \\ & = 1+ \sum_{k=1}^{\infty} \frac{(-1)^k}{k!} \lim_{t'\to \infty}\left[\int_{t}^{t'}\cdots\int_{t}^{t'} \det_{i,j=1}^k A(x_i,x_j) \prod_{i=1}^k dx_i\right] \\ & = 1+\sum_{k=1}^{\infty} \frac{(-1)^k}{k!} \int_{t}^{\infty}\cdots\int_{t}^{\infty} \det_{i,j=1}^k A(x_i,x_j) \prod_{i=1}^k dx_i\\ & = \mbox{RHS of Theorem 5} \end{aligned}

Note that the LHS of Theorem 5 can be written as

\displaystyle \begin{aligned} & \lim_{n\to\infty}  P\left[n^{2/3}\left(\frac{\max \lambda_i}{\sqrt{n}}-2\right)\le t\right]  \\ & = \lim_{n\to \infty} P\left[n^{2/3}\left(\frac{\lambda_i}{\sqrt{n}}-2\right)\le t, \ \ i=1,2, \ldots, n\right]\\ & =  \lim_{n\to \infty}  P\left[n^{2/3}\left(\frac{\lambda_i}{\sqrt{n}}-2\right) \not\in (t,\infty), \ \ i=1,2, \ldots, n\right]\\ & =  \lim_{n\to \infty}  \lim_{t' \to \infty} P\left[n^{2/3}\left(\frac{\lambda_i}{\sqrt{n}}-2\right) \not\in (t,t'), \ \ i=1,2, \ldots, n\right] \end{aligned}

Hence, for the proof of vague convergence, the only thing that remains is to show that the limits in n and t' can be interchanged. The interchange of limits is justified by the Ledoux bound, which is introduced and proved in the following section.

Ledoux Bound

Let us recall the definition of L_n, the empirical distribution of the rescaled eigenvalues, L_n:=\frac1n\sum_{i=1}^n \delta_{\lambda_i^n/\sqrt{n}}. We denote the averaged empirical distribution simply as \bar{L_n}:=EL_n, which is defined by the relation

\displaystyle \langle \bar{L_n},f\rangle=E\langle L_n,f\rangle

Let us recall Theorem 2 once again.

Theorem 2:  Consider the density \rho_n^{(2)}(x_1,x_2,\ldots,x_n) of the unordered eigenvalues of the GUE. We have

\displaystyle \rho_{n}^{(2)}(x_1,x_2,\ldots,x_n)= \frac1{n!}\det_{k,l=1}^n K^{(n)}(x_k,x_l)

We have the following more general version of Theorem 2.

Theorem 2A: Let \rho_{p,n}^{(2)}(x_1,x_2,\ldots,x_p) be the joint density of p unordered eigenvalues. Then

\displaystyle \rho_{p,n}^{(2)}(x_1,x_2,\ldots,x_p)=\frac{(n-p)!}{n!}\det_{i,j=1}^p K^{(n)}(x_i,x_j)

The proof is skipped. The interesting case for us is p=1: the density of a single unordered eigenvalue is given by \rho_{1,n}^{(2)}(x)=\frac{1}{n}K^{(n)}(x,x). Equivalently, if U is a uniform random variable on \{1,2,\ldots,n\}, independent of the matrix, then this is the density of \lambda_{U}^n, a uniformly chosen eigenvalue. Now it is immediate from the definition of \bar{L_n} that

\displaystyle \langle \bar{L_n},f \rangle  = \frac1{n}\int_{\mathbb{R}} f\left(\frac{x}{\sqrt{n}}\right)K^{(n)}(x,x)\,dx

The \sqrt{n} term inside the f is due to the fact that \bar{L_n} is the average empirical distribution of the rescaled eigenvalues: \lambda_i^n/\sqrt{n}.
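As a quick numerical illustration (a sketch, not needed for the argument), assume the convention K^{(n)}(x,y)=\sum_{k=0}^{n-1}\psi_k(x)\psi_k(y), where \psi_k(x)=He_k(x)e^{-x^2/4}/\big((2\pi)^{1/4}\sqrt{k!}\big) are the orthonormal oscillator wave functions; this is the convention consistent with the factors appearing below. Then the density of \lambda_U^n/\sqrt{n}, namely \frac{\sqrt{n}}{n}K^{(n)}(\sqrt{n}y,\sqrt{n}y), is already close to the semicircle density for moderate n:

```python
import numpy as np

def psi_stack(n, x):
    """psi_0, ..., psi_{n-1} at the points x via the stable three-term recurrence
    psi_{k+1}(x) = (x psi_k(x) - sqrt(k) psi_{k-1}(x)) / sqrt(k+1)."""
    out = np.zeros((n, x.size))
    out[0] = np.exp(-x**2 / 4) / (2 * np.pi) ** 0.25
    if n > 1:
        out[1] = x * out[0]
    for k in range(1, n - 1):
        out[k + 1] = (x * out[k] - np.sqrt(k) * out[k - 1]) / np.sqrt(k + 1)
    return out

n = 40
y = np.linspace(-1.9, 1.9, 9)                              # points on the lambda/sqrt(n) axis
K_diag = (psi_stack(n, np.sqrt(n) * y) ** 2).sum(axis=0)   # K^{(n)}(x, x) at x = sqrt(n) y
density = K_diag / np.sqrt(n)                              # density of lambda_U^n / sqrt(n)
semicircle = np.sqrt(4 - y**2) / (2 * np.pi)
print(np.column_stack([y, density, semicircle]))           # last two columns agree up to finite-n oscillations
```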

Moment generating function of \bar{L_n}

Fix s \in \mathbb{R}. We derive the mgf of \bar{L_n} explicitly.

\displaystyle \begin{aligned} & \langle \bar{L_n},e^{sx} \rangle \\ & = \frac1n\int_{-\infty}^{\infty} e^{sx/\sqrt{n}}K^{(n)}(x,x)\,dx \\ & \stackrel{\mbox{By parts}}{=} \frac1n\left[\frac{\sqrt{n}}{s}K^{(n)}(x,x)e^{sx/\sqrt{n}}\right]_{-\infty}^{\infty}-\frac{1}{s}\int_{-\infty}^{\infty} e^{sx/\sqrt{n}}\frac1{\sqrt{n}}\left(\frac{d}{dx}K^{(n)}(x,x)\right)\,dx \\ & \stackrel{\mbox{Proposition 1 (d)}}{=} \frac{1}{s}\int_{-\infty}^{\infty} e^{sx/\sqrt{n}}\psi_n(x)\psi_{n-1}(x)\,dx \end{aligned}

The boundary term from the integration by parts is zero, since K^{(n)}(x,x) always contains an e^{-x^2/4} factor, which kills the exponential e^{sx/\sqrt{n}} at the limits \pm\infty.

Observe that

\displaystyle \begin{aligned} S_t^n & := \int_{-\infty}^{\infty} e^{tx}\psi_n(x)\psi_{n-1}(x)\,dx \\ & = \frac{\sqrt{n}}{n!\sqrt{2\pi}}\int_{-\infty}^{\infty} H_n(x)H_{n-1}(x)e^{-x^2/2+tx}\,dx \\ & = \frac{\sqrt{n}e^{t^2/2}}{n!\sqrt{2\pi}}\int_{-\infty}^{\infty} H_n(x)H_{n-1}(x)e^{-(x-t)^2/2}\,dx \\ & = \frac{\sqrt{n}e^{t^2/2}}{n!\sqrt{2\pi}}\int_{-\infty}^{\infty} H_n(x+t)H_{n-1}(x+t)e^{-x^2/2}\,dx \\ & = \frac{\sqrt{n}e^{t^2/2}}{n!\sqrt{2\pi}}\sum_{k=0}^n\sum_{\ell=0}^{n-1}\int_{-\infty}^{\infty}\binom{n}{k}\binom{n-1}{\ell} H_k(x)H_{\ell}(x)t^{2n-1-k-\ell}e^{-x^2/2}\,dx \end{aligned}

where the last equality follows from Proposition 1(a). Note that the orthogonality relations for the Hermite polynomials (equivalently, the orthonormality of the \psi_n) imply that

\displaystyle \begin{aligned} S_t^n &  = \frac{\sqrt{n}e^{t^2/2}}{n!\sqrt{2\pi}}\sum_{k=0}^n\sum_{\ell=0}^{n-1}\int_{-\infty}^{\infty}\binom{n}{k}\binom{n-1}{\ell} H_k(x)H_{\ell}(x)t^{2n-1-k-\ell}e^{-x^2/2}\,dx \\ & = \frac{\sqrt{n}e^{t^2/2}}{n!\sqrt{2\pi}}\sum_{k=0}^{n-1}\binom{n}{k}\binom{n-1}{k}\sqrt{2\pi}\,k!\,t^{2n-1-2k}  \\ & = \sqrt{n}e^{t^2/2}\sum_{k=0}^{n-1}\frac{1}{(n-k)!}\binom{n-1}{k}t^{2n-1-2k} \\ & = \sqrt{n}e^{t^2/2}\sum_{k=0}^{n-1}\frac{1}{(k+1)!}\binom{n-1}{k}t^{2k+1} \end{aligned}
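
As a small sanity check of this closed form (a sketch; it assumes the normalization \psi_k(x)=He_k(x)e^{-x^2/4}/\big((2\pi)^{1/4}\sqrt{k!}\big), with He_k the probabilists' Hermite polynomials used above), one can compare it against direct numerical integration for a small n:

```python
import numpy as np
from math import comb, factorial, sqrt, exp, pi
from scipy.integrate import quad
from scipy.special import eval_hermitenorm

def psi(k, x):
    # assumed normalization: psi_k(x) = He_k(x) e^{-x^2/4} / ((2 pi)^{1/4} sqrt(k!))
    return eval_hermitenorm(k, x) * np.exp(-x**2 / 4) / ((2 * pi) ** 0.25 * sqrt(factorial(k)))

def S_closed(n, t):
    # the closed form derived above
    return sqrt(n) * exp(t**2 / 2) * sum(
        comb(n - 1, k) * t ** (2 * k + 1) / factorial(k + 1) for k in range(n))

n, t = 6, 0.4
S_numeric, _ = quad(lambda x: exp(t * x) * psi(n, x) * psi(n - 1, x), -np.inf, np.inf)
print(S_numeric, S_closed(n, t))   # the two values agree to quadrature accuracy
```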

Hence, taking t=s/\sqrt{n}, we get

\displaystyle \begin{aligned} \langle \bar{L_n},e^{sx} \rangle & = \frac1s S_{s/\sqrt{n}}^n\\ & = e^{s^2/2n}\sum_{k=0}^{n-1}\frac1{(k+1)!}\binom{n-1}{k}\frac{s^{2k}}{n^k} \\ & = \sum_{k=0}^{\infty} \frac{b_k^{(n)}}{k+1}\binom{2k}{k}\frac{s^{2k}}{(2k)!} \end{aligned}

How did we arrive at the last expression? Expanding e^{s^2/2n} and arranging terms in increasing powers of s, we can certainly write \displaystyle \langle \bar{L_n},e^{sx} \rangle as \displaystyle \sum p_k^{(n)}s^{2k}, which may be rewritten as \displaystyle \sum q_k^{(n)}\frac{s^{2k}}{(2k)!}, so that q_k^{(n)} gives us the even-order moments of \bar{L_n} (the odd-order moments vanish by symmetry). Now, why do we write q_k^{(n)} as \displaystyle \frac{b_k^{(n)}}{k+1}\binom{2k}{k}? Recall that L_n converges weakly, in probability, to the semicircle distribution, so we expect the moments of \bar{L_n} to converge to the moments of the semicircle distribution. A simple calculation shows that the 2k-th moment of the semicircle distribution is the Catalan number \displaystyle \frac1{k+1}\binom{2k}{k}. Hence we have written q_k^{(n)} in that fashion, and we can now expect some nice behavior from the constants b_k^{(n)}.
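
The Catalan-number formula for the even moments of the semicircle law is easy to confirm numerically (a quick sketch):

```python
from math import comb, pi, sqrt
from scipy.integrate import quad

# 2k-th moment of the semicircle density on [-2, 2] versus the k-th Catalan number
for k in range(6):
    moment, _ = quad(lambda x, k=k: x ** (2 * k) * sqrt(4 - x * x) / (2 * pi), -2, 2)
    print(k, round(moment, 6), comb(2 * k, k) // (k + 1))
```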

Theorem 7 (Ledoux Bound): There exist positive constants b and d such that

\displaystyle P\left[\max\frac{\lambda_i}{2\sqrt{n}}\ge e^{n^{-2/3}\epsilon}\right]\le de^{-b\epsilon}

for all n \ge 1 and \epsilon >0.

Remark: Note that

\displaystyle \max\frac{\lambda_i}{2\sqrt{n}}\ge e^{n^{-2/3}\epsilon} \implies   \max\frac{\lambda_i}{2\sqrt{n}}-1 \ge e^{n^{-2/3}\epsilon}-1=O(n^{-2/3})

Hence the fluctuations of \displaystyle \max\frac{\lambda_i}{2\sqrt{n}}-1 are of order of magnitude n^{-2/3}, so we can expect \displaystyle n^{2/3}\left(\max\frac{\lambda_i}{2\sqrt{n}}-1\right) to converge in distribution.
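
A small simulation illustrates this scaling (a sketch, assuming the GUE normalization E X_{ii}^2=1, E|X_{ij}|^2=1 used in this series):

```python
import numpy as np

rng = np.random.default_rng(0)

def scaled_max_eig(n):
    # sample a GUE matrix with E X_ii^2 = 1 and E|X_ij|^2 = 1, and rescale its top eigenvalue
    G = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    H = (G + G.conj().T) / 2
    lam_max = np.linalg.eigvalsh(H)[-1]
    return n ** (2 / 3) * (lam_max / np.sqrt(n) - 2)

for n in [100, 400, 800]:
    samples = [scaled_max_eig(n) for _ in range(100)]
    print(n, np.mean(samples), np.std(samples))
# the centred, n^{2/3}-scaled maximum stays O(1) as n grows; its law approaches
# the Tracy-Widom GUE distribution (whose mean is roughly -1.77)
```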

Proof: The initial step of the proof is to get a bound on b_k^{(n)}. Let

\displaystyle \Phi_n(t)=e^{-t/2}\sum_{k=0}^{n-1} \frac{(-1)^k}{(k+1)!}\binom{n-1}{k}t^k

Note that \langle \bar{L_n}, e^{sx} \rangle = \Phi_n(-s^2/n). \Phi_n(t) satisfies the following differential equation

\displaystyle 4t\Phi_n''(t)+8\Phi_n'(t)+(4n-t)\Phi_n(t)=0

The validity of the above differential equation can be checked by comparing the coefficients of powers of t on both sides (do it yourself or believe me! 😀 ). The differential equation comes from the properties of hypergeometric functions.
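
For the skeptical reader, a few lines of symbolic computation verify the differential equation for small n (a quick sketch using sympy):

```python
import sympy as sp

t = sp.symbols('t')
for n in range(1, 7):
    Phi = sp.exp(-t / 2) * sum(
        (-1) ** k * sp.binomial(n - 1, k) * t ** k / sp.factorial(k + 1)
        for k in range(n))
    lhs = 4 * t * sp.diff(Phi, t, 2) + 8 * sp.diff(Phi, t) + (4 * n - t) * Phi
    print(n, sp.simplify(lhs))   # prints 0 for every n, confirming the ODE
```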

If we write \Phi_n(t)=\sum a_k^{(n)}t^k we have

\displaystyle 4(k+2)(k+1)a_{k+1}^{(n)}+4na_k^{(n)}-a_{k-1}^{(n)}=0

Note that \displaystyle \frac{(-1)^ka_k^{(n)}}{n^k}=\frac{b_k^{(n)}}{k+1}\binom{2k}{k}\frac1{(2k)!}, by comparing \Phi_n(-s^2/n) with the series for the mgf above. Plugging the a_k^{(n)} values in terms of b_k^{(n)} into the above relation, we obtain the following simplified recursion formula

\displaystyle b_{k+1}^{(n)}=b_k^{(n)}+\frac{k(k+1)}{4n^2}b_{k-1}^{(n)}, \ \ \ \ \ \ \   (*)  

Note that \displaystyle  \frac{b_k^{(n)}}{k+1}\binom{2k}{k}=\langle \bar{L_n},x^{2k} \rangle \ge 0, which implies b_k^{(n)} \ge 0. Hence by (*), b_k^{(n)}\le b_{k+1}^{(n)} for all k, and therefore

\displaystyle \begin{aligned} 0 \le b_k^{(n)} & \stackrel{(*)}{=} b_{k-1}^{(n)}+\frac{k(k-1)}{4n^2}b_{k-2}^{(n)} \\ &  \le b_{k-1}^{(n)}\left(1+\frac{k(k-1)}{4n^2}\right) \\ & \le e^{\frac{k(k-1)}{4n^2}}b_{k-1}^{(n)}\end{aligned}

Hence, applying the inequality recursively (and using b_1^{(n)}=1, which follows from the fact that the second moment of \bar{L_n} equals 1), we have

\displaystyle b_k^{(n)} \le e^{ck^3/n^2}

for some constant c>0 (the product above shows that, for instance, c=1/12 works).
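
The recursion (*) and this bound are also easy to explore numerically (a sketch; the starting values b_0^{(n)}=b_1^{(n)}=1 come from the zeroth and second moments of \bar{L_n}, and c=1/12 is one admissible choice of constant from the product bound above):

```python
import math

def b_sequence(n, kmax):
    # the recursion (*): b_{k+1} = b_k + k(k+1)/(4 n^2) b_{k-1}, with b_0 = b_1 = 1 (assumed)
    b = [1.0, 1.0]
    for k in range(1, kmax):
        b.append(b[k] + k * (k + 1) / (4 * n**2) * b[k - 1])
    return b

n = 50
b = b_sequence(n, 150)
for k in [10, 50, 100, 150]:
    print(k, b[k], math.exp(k**3 / (12 * n**2)))   # b_k stays below e^{k^3/(12 n^2)}
```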

Fix t>0; an appropriate value of t will be chosen later. Take k=\lceil t \rceil.

\displaystyle \begin{aligned} P\left(\max \frac{\lambda_i}{2\sqrt{n}}\ge e^{\epsilon}\right) & \stackrel{\mbox{Markov}}{\le} E\left(\frac{\max \lambda_i}{2\sqrt{n}e^{\epsilon}}\right)^{2k} \\ & = \frac{e^{-2\epsilon k}}{2^{2k}}E\left(\frac{\max \lambda_i}{\sqrt{n}}\right)^{2k} \\ & \le \frac{e^{-2\epsilon k}}{2^{2k}}\cdot n\cdot \langle \bar{L_n}, x^{2k} \rangle \\ & = \frac{ne^{-2\epsilon k}}{2^{2k}}\cdot\frac{b_k^{(n)}}{k+1}\frac{(2k)!}{(k!)^2} \\ & \le \frac{ne^{-2\epsilon k}}{2^{2k}}\cdot\frac{e^{ck^3/n^2}}{k+1}\frac{(2k)!}{(k!)^2} \end{aligned}

Here the third step uses \displaystyle E\left(\frac{\max \lambda_i}{\sqrt{n}}\right)^{2k}\le E\sum_{i=1}^n\left(\frac{\lambda_i}{\sqrt{n}}\right)^{2k}=n\,\langle \bar{L_n},x^{2k}\rangle.

Note that by Stirling’s approximation

\displaystyle \frac{1}{2^{2k}(k+1)}\frac{(2k)!}{(k!)^2} \sim \frac{(2k)^{2k+1/2}}{\sqrt{2\pi}\,2^{2k}k^{2k+1}(k+1)} = \frac{1}{\sqrt{\pi k}\,(k+1)}=O(k^{-3/2})
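
A two-line check of this decay (a sketch):

```python
from math import comb

# the prefactor (2k choose k) / (4^k (k+1)) decays like k^{-3/2}, with constant 1/sqrt(pi)
for k in [10, 100, 1000]:
    ratio = comb(2 * k, k) / (4**k * (k + 1))
    print(k, ratio * k**1.5)   # approaches 1/sqrt(pi), about 0.5642
```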

Hence

\displaystyle \begin{aligned} P\left(\max \frac{\lambda_i}{2\sqrt{n}}\ge e^{\epsilon}\right) & \le \frac{ne^{-2\epsilon k}}{2^{2k}}\cdot\frac{e^{ck^3/n^2}}{k+1}\frac{(2k)!}{(k!)^2} \\ & \le d'ne^{-2\epsilon k}e^{ck^3/n^2}k^{-3/2} \\ & \le d'ne^{-2\epsilon t}t^{-3/2}e^{ct^3/n^2}e^{c(k^3-t^3)/n^2} \\ & \stackrel{t=n^{2/3}}{=} d'e^{-2n^{2/3}\epsilon}e^{c}e^{c(k^3-n^2)/n^2} \\ & \le de^{-bn^{2/3}\epsilon} \end{aligned}

where in the last step we used that k=\lceil n^{2/3}\rceil, so the factor e^{c(k^3-n^2)/n^2} is bounded uniformly in n.

Finally, replacing \epsilon by n^{-2/3}\epsilon, we get that

\displaystyle  P\left(\max \frac{\lambda_i}{2\sqrt{n}} \ge e^{n^{-2/3}\epsilon}\right) \le de^{-b\epsilon}

which proves the bound. Note that the inequality is valid for all n \ge 1 and for all \epsilon > 0.

\square

Remark: Using the Ledoux bound we have

\displaystyle \begin{aligned} P\left[n^{2/3}\left(\frac{\max \lambda_i}{\sqrt{n}}-2\right)\ge t\right] & = P\left[\frac{\max \lambda_i}{2\sqrt{n}} \ge 1+tn^{-2/3}/2\right] \\ & \le d\exp\left(-bn^{2/3}\log(1+tn^{-2/3}/2)\right) \\ & = d\exp\left(-b\frac{t}{2}\cdot\frac{\log(1+tn^{-2/3}/2)}{tn^{-2/3}/2}\right)\end{aligned}

We will use this bound in our proof of Theorem 5.

Proof of Theorem 5: As noted earlier, it is enough to show that the limits can be interchanged. Let \displaystyle \tilde\lambda_i:=n^{2/3}\left(\frac{\lambda_i}{\sqrt{n}}-2\right). Note that for all t'<\infty we have

\displaystyle P(\tilde\lambda_i \not\in (t,t'), \ \forall  \ i) \ge P(\tilde\lambda_i\not\in (t,\infty) \ \forall  \ i)

Hence, taking \limsup over n\to \infty and then letting t' \to \infty (note that the right-hand side is free of t'), we have

\displaystyle  \begin{aligned} & \lim_{t'\to \infty} \lim_{n\to \infty} P(\tilde \lambda_i \not\in (t,t'), \ \forall  \ i)  \\ & \ge \limsup_{n\to \infty} P(\tilde\lambda_i\not\in (t,\infty) \ \forall \  i) \\ & \ge  \limsup_{n\to \infty}\lim_{t'\to\infty} P(\tilde\lambda_i\not\in (t,t')  \ \forall \ i) \end{aligned}

The existence of the limit of the left side was proven earlier.

However

\displaystyle \begin{aligned}  & \lim_{n\to \infty} P(\tilde \lambda_i \not\in (t,t'), \ \forall  \ i) \\ & \le \limsup_{n\to \infty} \left[P\left(\tilde\lambda_i\not\in (t,\infty) \ \forall \  i\right)+P\left(\max \tilde\lambda_i \ge t'\right)\right] \\ & \le \limsup_{n\to \infty} \left[P\left(\tilde\lambda_i\not\in (t,\infty) \ \forall \  i\right)+d\exp\left(-b\frac{t'}{2}\cdot\frac{\log(1+t'n^{-2/3}/2)}{t'n^{-2/3}/2}\right)\right] \\ & = \limsup_{n\to \infty} P\left(\tilde\lambda_i\not\in (t,\infty) \ \forall \  i\right)+ de^{-bt'/2}\end{aligned}

where in the last line we used the fact that if a sequence b_n converges, then

\displaystyle \limsup_{n\to\infty} (a_n+b_n)=\limsup_{n\to\infty} a_n +\lim_{n\to\infty} b_n

Now, letting t' \to \infty on both sides, we have

\displaystyle  \begin{aligned} & \lim_{t'\to \infty} \lim_{n\to \infty} P(\tilde \lambda_i \not\in (t,t'), \ \forall  \ i)  \\ & \le \limsup_{n\to \infty} P(\tilde\lambda_i\not\in (t,\infty) \ \forall \  i) \\ & \le  \limsup_{n\to \infty}\lim_{t'\to\infty} P(\tilde\lambda_i\not\in (t,t')  \ \forall \ i) \end{aligned}

Hence we have

\displaystyle \lim_{t'\to \infty} \lim_{n\to \infty} P(\tilde \lambda_i \not\in (t,t'), \ \forall  \ i)  =  \limsup_{n\to \infty}\lim_{t'\to\infty} P(\tilde\lambda_i\not\in (t,t')  \ \forall \ i)

The same calculation holds if we replace \limsup by \liminf. Hence the limit on the right-hand side of the above equation exists, and we have

\displaystyle \lim_{t'\to \infty} \lim_{n\to \infty} P(\tilde \lambda_i \not\in (t,t'), \ \forall  \ i)  =  \lim_{n\to \infty}\lim_{t'\to\infty} P(\tilde\lambda_i\not\in (t,t')  \ \forall \ i)

Hence the limits are interchangeable. This completes the proof of vague convergence.

\square

Remarks: The crucial ingredients of the proofs are the Ledoux bound and the asymptotic relation between the Hermite polynomials and the Airy kernel. The asymptotic result is very old; a general version can be found in [3] and [4]. The asymptotics of Hermite polynomials have been studied extensively in the literature, and [5] gives a nice exposition of orthogonal polynomials. The Ledoux bound is relatively new; it was published in 2003. The original proofs of the Tracy-Widom law, which do not use this bound, can be found in [2] and [6]. Most parts of the proof presented here are taken from [1].

I plan to write a similar series on weak convergence (maybe next year!).

References:

  1. Anderson, Greg W., Alice Guionnet, and Ofer Zeitouni. An introduction to random matrices. Vol. 118. Cambridge university press, 2010.
  2. Forrester, Peter J. “The spectrum edge of random matrix ensembles.” Nuclear Physics B 402.3 (1993): 709-728.
  3. Krishnapur, M., Random Matrix Theory notes.
  4. Plancherel, M., and W. Rotach. “Sur les valeurs asymptotiques des polynomes d’Hermite.” Commentarii Mathematici Helvetici 1.1 (1929): 227-254.
  5. Szego, Gabor. Orthogonal polynomials. Vol. 23. American Mathematical Soc., 1939.
  6. Tracy, Craig A., and Harold Widom. “Level-spacing distributions and the Airy kernel.” Communications in Mathematical Physics 159.1 (1994): 151-174.