This is a continuation of the previous post, available here. In the previous post, we developed the ingredients required for the vague convergence proof. Let us now return to the random matrix setting.

**Proof of Theorem 4:** Let be a sequence of GUE matrices. Let be the eigenvalues of . Fix . Let us quickly evaluate the limit

Observe that by using Theorem 3, we have

where in the last line we use the change-of-variables formula. Let us define

Note that are kernels since is a kernel. Let us also define . Then using Proposition 1 (c) we have

Note that Proposition 4 implies that for every , uniformly over a ball of radius C in the complex plane ( are entire functions). Hence and . Therefore for we have

Hence by the continuity lemma for Fredholm determinants we have

where the measure is the Lebesgue measure on the bounded interval . This completes the proof of Theorem 4.

If we put on both sides then this gives us the vague convergence (why? discussed later). But life is not that simple! Putting makes the measure unbounded. We need a little more rigor to show that letting is indeed possible and that the result remains the same.

Note that

Note that is a kernel by Proposition 3 and the measure has finite norm; hence the above series is finite and can be expressed as a Fredholm determinant.
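As an aside, Fredholm determinants of this type are easy to approximate numerically. The sketch below is my own illustration, not part of the proof: it uses the standard Nyström quadrature idea (discretize the kernel at Gauss–Legendre nodes and take an ordinary matrix determinant), applied to the Airy kernel on a truncated interval. The function names are mine; at `s = 0` the result approximates the Tracy–Widom GUE distribution function at 0, which should come out just below 1.

```python
import numpy as np
from scipy.special import airy

def airy_kernel(x, y, eps=1e-12):
    # Airy kernel K(x, y) = (Ai(x)Ai'(y) - Ai'(x)Ai(y)) / (x - y),
    # with the limiting value Ai'(x)^2 - x Ai(x)^2 on the diagonal.
    ai_x, aip_x, _, _ = airy(x)
    ai_y, aip_y, _, _ = airy(y)
    d = x - y
    safe = np.where(np.abs(d) < eps, 1.0, d)       # avoid 0/0 on the diagonal
    off_diag = (ai_x * aip_y - aip_x * ai_y) / safe
    diag = aip_x ** 2 - x * ai_x ** 2
    return np.where(np.abs(d) < eps, diag, off_diag)

def fredholm_det(kernel, s, length=10.0, m=60):
    # Nystrom approximation of det(I - K) on [s, s + length]:
    # Gauss-Legendre nodes/weights, symmetrized kernel matrix.
    t, w = np.polynomial.legendre.leggauss(m)
    x = s + (t + 1.0) * length / 2.0
    w = w * length / 2.0
    X, Y = np.meshgrid(x, x, indexing="ij")
    M = np.sqrt(w)[:, None] * kernel(X, Y) * np.sqrt(w)[None, :]
    return np.linalg.det(np.eye(m) - M)

# det(I - K_Airy) on (0, infinity): the Tracy-Widom GUE CDF at 0
F2_0 = fredholm_det(airy_kernel, 0.0)
```

The truncation to a finite interval is harmless here because the Airy kernel decays superexponentially to the right.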

We further note that

Note that is finite. In fact is also finite due to the absolute convergence of the Fredholm determinant. Hence by the dominated convergence theorem

Note that the LHS of Theorem 5 can be written as

Hence, for the proof of vague convergence, the only thing that remains to show is that the limits in and are interchangeable. The interchangeability of the limits is assured by the Ledoux Bound, which will be introduced and proved in the following section.

#### Ledoux Bound

Let us recall the definition of , which we defined as the empirical distribution of the eigenvalues, . We denote the *average empirical distribution* simply as , which we defined by the relation

Let us recall Theorem 2 once again.

**Theorem 2:** Consider the density of the unordered eigenvalues of the GUE. We have

We have the following general version of Theorem 2.

**Theorem 2A:** Let be the density of the unordered eigenvalues. Then

The proof is skipped. What is interesting is the case when . The density of an unordered eigenvalue is given by . Let be a discrete uniform random variable on . Then this is the density of , the -th ordered eigenvalue. Now it is immediate from the definition of that

The term inside the is due to the fact that is the average empirical distribution of the rescaled eigenvalues: .
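To make the rescaled empirical distribution concrete, here is a quick simulation sketch (my own illustration; it assumes the common normalization in which the eigenvalues divided by √n have limiting semicircle support [−2, 2]) checking the low moments of a sampled GUE matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400

# Sample a GUE matrix: H = (G + G*)/2 with iid standard complex Gaussian G
G = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (G + G.conj().T) / 2

# Rescale so the empirical distribution approaches the semicircle on [-2, 2]
evals = np.linalg.eigvalsh(H) / np.sqrt(n)

m2 = np.mean(evals ** 2)  # semicircle second moment: 1
m4 = np.mean(evals ** 4)  # semicircle fourth moment: 2
```

The empirical moments land near 1 and 2, the first two even moments of the semicircle law.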

###### Moment generating function of

Fix . We derive the mgf of explicitly.
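The computation below leans repeatedly on the orthonormality of the Hermite functions. As a numerical sanity check (a sketch, assuming the functions in question are the standard Hermite functions H_k(x)e^{−x²/2}/√(2^k k!√π), built from physicists' Hermite polynomials):

```python
import numpy as np
from numpy.polynomial.hermite import hermval
from math import factorial, pi, sqrt

def psi(k, x):
    # k-th Hermite function: H_k(x) exp(-x^2/2) / sqrt(2^k k! sqrt(pi))
    coeffs = np.zeros(k + 1)
    coeffs[k] = 1.0
    norm = sqrt(2 ** k * factorial(k) * sqrt(pi))
    return hermval(x, coeffs) * np.exp(-x ** 2 / 2) / norm

x = np.linspace(-20, 20, 40001)
dx = x[1] - x[0]

# Gram matrix of the first few Hermite functions; should be the identity
gram = np.array([[np.sum(psi(j, x) * psi(k, x)) * dx for k in range(4)]
                 for j in range(4)])
```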

The first term from the integration by parts is zero, as always contains a term as a factor which kills everything at the limits.

Observe that

where the last equality follows from Proposition 1 (a). Note that the orthonormality of implies that

Hence

How did we arrive at the last term? Expanding and arranging terms in increasing order of powers of , we can certainly write as , which may be rewritten as , so that gives us the even-order moments of (the odd-order moments are zero). Now the question is: why do we write as ? Recall that converges weakly in probability to the semicircle distribution, so we can expect the moments of to converge to the moments of the semicircle distribution. A simple calculation shows that the -th moment of the semicircle distribution is given by . Hence we have written in that fashion. We can now expect some nice behavior from the constants .
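The simple calculation mentioned above can also be checked numerically: the 2k-th moments of the semicircle law on [−2, 2] are the Catalan numbers C(2k, k)/(k+1). A small sketch, integrating by a plain Riemann sum:

```python
import numpy as np
from math import comb

x = np.linspace(-2, 2, 400001)
dx = x[1] - x[0]
rho = np.sqrt(4 - x ** 2) / (2 * np.pi)  # semicircle density on [-2, 2]

# 2k-th moments of the semicircle vs. the Catalan numbers C_k = C(2k, k)/(k+1)
moments = [np.sum(x ** (2 * k) * rho) * dx for k in range(5)]
catalan = [comb(2 * k, k) // (k + 1) for k in range(5)]
```

The first few pairs agree: 1, 1, 2, 5, 14.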

**Theorem 7 (Ledoux Bound):** There exist positive constants and such that

for all and .

**Remark:** Note that

Hence the fluctuation of is of the order of magnitude . Hence we can expect that converges in distribution.
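This order of magnitude is visible in simulation. The sketch below (my own illustration, assuming the normalization in which the eigenvalues divided by √n have limiting support [−2, 2]) rescales the largest-eigenvalue fluctuation by n^{2/3} and checks that it stays of order one:

```python
import numpy as np

rng = np.random.default_rng(1)

def scaled_edge_fluctuations(n, reps=30):
    # n^(2/3) * (lambda_max / sqrt(n) - 2) over independent GUE samples
    out = []
    for _ in range(reps):
        G = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        H = (G + G.conj().T) / 2
        lam_max = np.linalg.eigvalsh(H)[-1] / np.sqrt(n)
        out.append(n ** (2 / 3) * (lam_max - 2.0))
    return np.array(out)

edge = scaled_edge_fluctuations(200)
```

The rescaled fluctuations hover around a negative O(1) value with O(1) spread, consistent with convergence in distribution (to Tracy–Widom, as it turns out).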

**Proof:** The initial step of the proof is to get a bound on . Let

Note that . satisfies the following differential equation

The validity of the above differential equation can be checked by comparing coefficients of powers of on both sides (do it yourself, or believe me! 😀). This differential equation comes from the properties of hypergeometric functions.

If we write we have

Note that . Plugging these values into the above relation, we obtain the following simplified recursion formula

Note that , which implies . Hence by , for all

Hence, applying the inequality recursively times, we have

for some constant

Fix . An appropriate will be chosen later. Take .

Note that by Stirling’s approximation

Hence

Finally by replacing with , we get that

which proves the bound. Note that the inequality holds for all and for all .
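The Stirling step used above can be sanity-checked numerically; a quick sketch comparing n! with √(2πn)(n/e)^n:

```python
import math

def stirling(n):
    # Stirling's approximation: n! ~ sqrt(2 pi n) * (n / e)^n
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

# The ratio n! / stirling(n) decreases monotonically toward 1
# (the first correction term is roughly 1 + 1/(12n))
ratios = [math.factorial(n) / stirling(n) for n in (5, 20, 50)]
```

Already at n = 50 the approximation is accurate to better than half a percent.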

**Remark:** Using the Ledoux Bound we have

We will use this bound in our proof of Theorem 5.

**Proof of Theorem 5:** As noted earlier, it is enough to show that the limits are interchangeable. Let . Note that for all we have

Hence by taking the limsup and then taking the limit (note that the right side is free of ) we have

The existence of the limit on the left side was proven earlier.

However

where in the last line we used the fact that if a sequence converges, then

Now taking limits on both sides, we have

Hence we have

The same calculation holds if we replace by . Hence the limit exists for the right side of the above equation, and we have

Hence the limits are interchangeable. This completes the proof of vague convergence.

**Remarks:** The crucial parts of the proof are the Ledoux Bound and the asymptotic relation between the Hermite polynomials and the Airy kernel. The asymptotic result is very old; a general result can be found in [3] and [4]. The asymptotics of Hermite polynomials have been studied extensively in the literature. [4] gives a nice exposition of orthogonal polynomials. The Ledoux Bound is relatively new; it was published in 2003. The original proof can be found in [2] and [5]; it does not use this bound. Most parts of the proof are taken from [1].

I plan to write a similar series on weak convergence (maybe next year!).

**References:**

- Anderson, Greg W., Alice Guionnet, and Ofer Zeitouni. *An Introduction to Random Matrices*. Vol. 118. Cambridge University Press, 2010.
- Forrester, Peter J. “The spectrum edge of random matrix ensembles.” *Nuclear Physics B* 402.3 (1993): 709-728.
- Krishnapur, M. *Random Matrix Theory notes*.
- Plancherel, M., and W. Rotach. “Sur les valeurs asymptotiques des polynomes d’Hermite.” *Commentarii Mathematici Helvetici* 1.1 (1929): 227-254.
- Szegő, Gábor. *Orthogonal Polynomials*. Vol. 23. American Mathematical Society, 1939.
- Tracy, Craig A., and Harold Widom. “Level-spacing distributions and the Airy kernel.” *Communications in Mathematical Physics* 159.1 (1994): 151-174.