Appendix 2 Multivariate Normal Distribution

In this appendix, the following topics will be discussed:

Definition

Moment generating function and independence of normal variables

Quadratic forms in normal variables

2.1 Definition

Intuition:

Let $Y \sim N(\mu, \sigma^2)$. Then, the density function is

$$f(y) = \frac{1}{(2\pi)^{1/2}(\sigma^2)^{1/2}}\exp\!\left[-\frac{1}{2}(y-\mu)(\sigma^2)^{-1}(y-\mu)\right],$$

which suggests the multivariate form below: replace $(y-\mu)(\sigma^2)^{-1}(y-\mu)$ by the quadratic form $(y-\mu)'\Sigma^{-1}(y-\mu)$ and $(\sigma^2)^{1/2}$ by $|\Sigma|^{1/2}$.

Definition (Multivariate Normal Random Variable):

A random vector $Y = (Y_1, Y_2, \ldots, Y_p)'$ with mean vector $\mu = E(Y)$ and positive definite covariance matrix $\Sigma = \mathrm{Cov}(Y)$ has the density function

$$f(y) = \frac{1}{(2\pi)^{p/2}|\Sigma|^{1/2}}\exp\!\left[-\frac{1}{2}(y-\mu)'\Sigma^{-1}(y-\mu)\right], \qquad y \in R^p,$$

and we write $Y \sim N(\mu, \Sigma)$.
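As a quick numerical check of the density formula, the following sketch (not part of the original notes) evaluates $f(y)$ directly and compares it with scipy.stats.multivariate_normal; the particular $\mu$, $\Sigma$, and $y$ are arbitrary illustrative choices.

# Minimal check of the multivariate normal density formula (illustrative values).
import numpy as np
from scipy.stats import multivariate_normal

p = 3
mu = np.array([1.0, -2.0, 0.5])
A = np.array([[2.0, 0.3, 0.1],
              [0.5, 1.5, 0.2],
              [0.1, 0.4, 1.0]])
Sigma = A @ A.T          # a positive definite covariance matrix
y = np.array([0.8, -1.5, 1.2])

# f(y) = (2*pi)^(-p/2) |Sigma|^(-1/2) exp(-(y-mu)' Sigma^{-1} (y-mu) / 2)
quad = (y - mu) @ np.linalg.solve(Sigma, y - mu)
f_manual = np.exp(-0.5 * quad) / ((2 * np.pi) ** (p / 2) * np.sqrt(np.linalg.det(Sigma)))

f_scipy = multivariate_normal(mean=mu, cov=Sigma).pdf(y)
print(f_manual, f_scipy)   # the two values agree up to rounding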

Theorem:

If $Y$ has the density function given above, then $E(Y) = \mu$ and $\mathrm{Cov}(Y) = \Sigma$.

[proof:]

Since $\Sigma$ is positive definite, $\Sigma = T\Lambda T'$, where $T$ is a real orthogonal matrix ($TT' = T'T = I$) and $\Lambda = \mathrm{diag}(\lambda_1, \lambda_2, \ldots, \lambda_p)$ with every $\lambda_i > 0$. Then,

$$\Sigma^{-1} = T\Lambda^{-1}T', \qquad |\Sigma| = |\Lambda| = \prod_{i=1}^{p}\lambda_i.$$

Thus,

$$(y-\mu)'\Sigma^{-1}(y-\mu) = (y-\mu)'T\Lambda^{-1}T'(y-\mu) = z'\Lambda^{-1}z = \sum_{i=1}^{p}\frac{z_i^2}{\lambda_i},$$

where $z = T'(y-\mu)$, i.e., $Y = \mu + TZ$ with $Z = T'(Y-\mu)$. Further,

$$E(Y) = \mu + T\,E(Z), \qquad \mathrm{Cov}(Y) = T\,\mathrm{Cov}(Z)\,T'.$$

Therefore, if we can prove $Z_1, Z_2, \ldots, Z_p$ are mutually independent with $Z_i \sim N(0, \lambda_i)$, then

$$E(Y) = \mu + T \cdot 0 = \mu, \qquad \mathrm{Cov}(Y) = T\Lambda T' = \Sigma.$$

The joint density function of $Z_1, Z_2, \ldots, Z_p$ is

$$f_Z(z) = f_Y(\mu + Tz)\left|\det\frac{\partial y}{\partial z}\right| = \frac{1}{(2\pi)^{p/2}|\Sigma|^{1/2}}\exp\!\left[-\frac{1}{2}\sum_{i=1}^{p}\frac{z_i^2}{\lambda_i}\right],$$

where

$$\left|\det\frac{\partial y}{\partial z}\right| = |\det T| = 1.$$

Therefore, the density function of $Z$ factors as

$$f_Z(z) = \prod_{i=1}^{p}\frac{1}{(2\pi\lambda_i)^{1/2}}\exp\!\left[-\frac{z_i^2}{2\lambda_i}\right],$$

so each $Z_i \sim N(0, \lambda_i)$. Therefore, $Z_1, Z_2, \ldots, Z_p$ are mutually independent.
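The change of variables used in the proof can be checked numerically: after rotating by $T'$, the components of $Z = T'(Y - \mu)$ are uncorrelated with variances $\lambda_i$. A minimal simulation sketch, with an arbitrary covariance matrix:

# After the rotation Z = T'(Y - mu), the sample covariance of Z is close to diag(lambda).
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -1.0, 2.0])
A = rng.normal(size=(3, 3))
Sigma = A @ A.T + 3 * np.eye(3)         # positive definite

lam, T = np.linalg.eigh(Sigma)          # Sigma = T diag(lam) T'
Y = rng.multivariate_normal(mu, Sigma, size=200_000)
Z = (Y - mu) @ T                        # each row is z' = (y - mu)' T, i.e. z = T'(y - mu)

print(np.round(np.cov(Z.T), 2))         # approximately diag(lam)
print(np.round(lam, 2))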

2.2 Moment Generating Function and Independence of Normal Random Variables

Moment Generating Function of Multivariate Normal Random Variable:

Let $Y \sim N(\mu, \Sigma)$.

Then, the moment generating function for $Y$ is

$$m_Y(t) = E\!\left[\exp(t'Y)\right] = \exp\!\left(t'\mu + \tfrac{1}{2}t'\Sigma t\right), \qquad t \in R^p.$$
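The formula can be checked by simulation. The following is a minimal sketch (not from the original notes); the particular $\mu$, $\Sigma$, and $t$ are arbitrary illustrative choices.

# Monte Carlo check that E[exp(t'Y)] matches exp(t'mu + t'Sigma t / 2) for one fixed t.
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([0.5, -0.2])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])
t = np.array([0.2, -0.1])               # a small t keeps the simulation stable

Y = rng.multivariate_normal(mu, Sigma, size=1_000_000)
mgf_mc = np.exp(Y @ t).mean()
mgf_formula = np.exp(t @ mu + 0.5 * t @ Sigma @ t)
print(mgf_mc, mgf_formula)              # close for a large sample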

Theorem:

If $Y \sim N(\mu, \Sigma)$ and $C$ is a $q \times p$ matrix of rank $q$ ($q \le p$), then

$$X = CY \sim N(C\mu,\, C\Sigma C').$$

[proof:]

Let $X = CY$. Then,

$$m_X(t) = E\!\left[\exp(t'X)\right] = E\!\left[\exp\!\left((C't)'Y\right)\right] = m_Y(C't) = \exp\!\left(t'C\mu + \tfrac{1}{2}t'C\Sigma C't\right).$$

Since $\exp\!\left(t'C\mu + \tfrac{1}{2}t'C\Sigma C't\right)$ is the moment generating function of $N(C\mu, C\Sigma C')$,

$$X = CY \sim N(C\mu,\, C\Sigma C').$$ ◆
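A Monte Carlo sketch of this theorem (the matrices below are illustrative choices, not from the notes): the sample mean and covariance of $X = CY$ should be close to $C\mu$ and $C\Sigma C'$.

# Empirical mean and covariance of X = CY versus the theoretical C mu and C Sigma C'.
import numpy as np

rng = np.random.default_rng(2)
mu = np.array([1.0, 0.0, -1.0])
A = rng.normal(size=(3, 3))
Sigma = A @ A.T + np.eye(3)

C = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, -1.0]])        # 2 x 3, rank 2

Y = rng.multivariate_normal(mu, Sigma, size=500_000)
X = Y @ C.T                             # each row is (C y)'

print(np.round(X.mean(axis=0), 2), np.round(C @ mu, 2))
print(np.round(np.cov(X.T), 2))
print(np.round(C @ Sigma @ C.T, 2))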

Corollary:

If $Y \sim N(\mu, \sigma^2 I)$, then

$$TY \sim N(T\mu,\, \sigma^2 I),$$

where $T$ is an orthogonal matrix.

Theorem:

If $Y \sim N(\mu, \Sigma)$, then the marginal distribution of any subset of the elements of $Y$ is also multivariate normal.

In particular, if $Y$, $\mu$, and $\Sigma$ are partitioned as

$$Y = \begin{bmatrix} Y_1 \\ Y_2 \end{bmatrix}, \qquad \mu = \begin{bmatrix} \mu_1 \\ \mu_2 \end{bmatrix}, \qquad \Sigma = \begin{bmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{bmatrix},$$

then $Y_1 \sim N(\mu_1, \Sigma_{11})$, where $Y_1$ and $\mu_1$ are $q \times 1$ and $\Sigma_{11}$ is $q \times q$.
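Since choosing $C = [\,I \;\; 0\,]$ in the previous theorem picks off a sub-vector of $Y$, the marginal result can be checked the same way; a short sketch with arbitrary parameters:

# The first two coordinates of Y should have mean mu_1 and covariance Sigma_11.
import numpy as np

rng = np.random.default_rng(3)
mu = np.array([1.0, 2.0, 3.0, 4.0])
A = rng.normal(size=(4, 4))
Sigma = A @ A.T + np.eye(4)

Y = rng.multivariate_normal(mu, Sigma, size=500_000)
Y1 = Y[:, :2]                            # keep the first two elements only

print(np.round(Y1.mean(axis=0), 2), mu[:2])
print(np.round(np.cov(Y1.T), 2))
print(np.round(Sigma[:2, :2], 2))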

Theorem:

$Y$ has a multivariate normal distribution if and only if $a'Y$ is univariate normal for all real vectors $a$.

[proof:]

Suppose $a'Y$ is univariate normal for every real vector $a$, and write $\mu = E(Y)$ and $\Sigma = \mathrm{Cov}(Y)$. Also,

$$E(a'Y) = a'\mu, \qquad \mathrm{Var}(a'Y) = a'\Sigma a.$$

Then, $a'Y \sim N(a'\mu, a'\Sigma a)$. Since the moment generating function of $a'Y$ is

$$m_{a'Y}(t) = E\!\left[\exp(t\,a'Y)\right] = \exp\!\left(t\,a'\mu + \tfrac{1}{2}t^2 a'\Sigma a\right),$$

setting $t = 1$ gives

$$E\!\left[\exp(a'Y)\right] = \exp\!\left(a'\mu + \tfrac{1}{2}a'\Sigma a\right),$$

which, as a function of $a$, is the moment generating function of $N(\mu, \Sigma)$; thus $Y$ has a multivariate normal distribution $N(\mu, \Sigma)$.

Conversely, if $Y$ is multivariate normal, then $a'Y$ is univariate normal by the previous theorem. ◆
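The "for all $a$" condition cannot be weakened to normal marginals alone. The following simulation sketch (an assumed counterexample, not from the notes) builds $Y_1, Y_2$ with standard normal marginals for which $a'Y$ with $a = (1, 1)'$ is not normal, so $(Y_1, Y_2)$ is not bivariate normal.

# Y1 ~ N(0,1) and Y2 = W*Y1 with W = +/-1: both marginals are standard normal,
# but Y1 + Y2 equals 0 half the time, so it cannot be univariate normal.
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
Y1 = rng.normal(size=n)
W = rng.choice([-1.0, 1.0], size=n)
Y2 = W * Y1

print(Y2.std())                  # ~1: Y2 is marginally standard normal
S = Y1 + Y2                      # a'Y with a = (1, 1)'
print(np.mean(S == 0.0))         # ~0.5: a point mass at 0, hence not normal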

2.3 Quadratic Forms in Normal Variables

Theorem:

If $Y \sim N(0, \sigma^2 I_n)$ and $P$ is an $n \times n$ symmetric matrix of rank $r$, then

$$Q = \frac{Y'PY}{\sigma^2}$$

is distributed as $\chi^2_r$ if and only if $P^2 = P$ (i.e., $P$ is idempotent).

[proof]

Suppose $P^2 = P$ and $\mathrm{rank}(P) = r$. Then, $P$ has $r$ eigenvalues equal to 1 and $n - r$ eigenvalues equal to 0. Thus, without loss of generality,

$$P = T\begin{bmatrix} I_r & 0 \\ 0 & 0 \end{bmatrix}T',$$

where $T$ is an orthogonal matrix. Then,

$$Q = \frac{Y'PY}{\sigma^2} = \frac{Y'T\begin{bmatrix} I_r & 0 \\ 0 & 0 \end{bmatrix}T'Y}{\sigma^2} = \frac{Z'\begin{bmatrix} I_r & 0 \\ 0 & 0 \end{bmatrix}Z}{\sigma^2} = \sum_{i=1}^{r}\frac{Z_i^2}{\sigma^2}, \qquad Z = T'Y.$$

Since $Y \sim N(0, \sigma^2 I_n)$ and $T$ is orthogonal, thus

$$Z = T'Y \sim N(0, \sigma^2 T'T) = N(0, \sigma^2 I_n).$$

That is, $Z_1, Z_2, \ldots, Z_n$ are i.i.d. normal random variables with common variance $\sigma^2$. Therefore,

$$Q = \sum_{i=1}^{r}\frac{Z_i^2}{\sigma^2} \sim \chi^2_r.$$

Conversely, suppose $Q = Y'PY/\sigma^2 \sim \chi^2_r$. Since $P$ is symmetric, $P = T\Lambda T'$, where $T$ is an orthogonal matrix and $\Lambda$ is a diagonal matrix with elements $\lambda_1, \lambda_2, \ldots, \lambda_n$. Thus, let $Z = T'Y$. Since $Y \sim N(0, \sigma^2 I_n)$,

$$Z = T'Y \sim N(0, \sigma^2 I_n).$$

That is, $Z_1, Z_2, \ldots, Z_n$ are independent normal random variables with variance $\sigma^2$. Then,

$$Q = \frac{Y'PY}{\sigma^2} = \frac{Z'\Lambda Z}{\sigma^2} = \sum_{i=1}^{n}\lambda_i\frac{Z_i^2}{\sigma^2}.$$

The moment generating function of $Q$ is

$$E\!\left[\exp(tQ)\right] = \prod_{i=1}^{n}(1 - 2\lambda_i t)^{-1/2}.$$

Also, since $Q$ is distributed as $\chi^2_r$, the moment generating function is also equal to $(1 - 2t)^{-r/2}$. Thus, for every $t$ in a neighborhood of 0,

$$\prod_{i=1}^{n}(1 - 2\lambda_i t)^{-1/2} = (1 - 2t)^{-r/2}.$$

Further,

$$\prod_{i=1}^{n}(1 - 2\lambda_i t) = (1 - 2t)^{r}.$$

By the uniqueness of polynomial roots, we must have exactly $r$ of the $\lambda_i$ equal to 1 and the remaining $n - r$ equal to 0. Then, $P$ is idempotent of rank $r$ by the following result:

if a matrix $P$ is symmetric, then $P$ is idempotent of rank $r$ if and only if it has $r$ eigenvalues equal to 1 and $n - r$ eigenvalues equal to 0. ◆
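A simulation sketch of the "if" direction, using the idempotent centering matrix $P = I - (1/n)J$ of rank $n - 1$ (an illustrative choice, not from the notes): $Y'PY/\sigma^2$ should behave like a $\chi^2_{n-1}$ variable.

# Empirical distribution of Y'PY/sigma^2 for an idempotent P of rank n-1.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n, sigma2, reps = 10, 4.0, 100_000
P = np.eye(n) - np.ones((n, n)) / n      # symmetric and idempotent, rank n-1

Y = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
Q = np.einsum('ij,jk,ik->i', Y, P, Y) / sigma2

print(Q.mean(), n - 1)                   # a chi-square(n-1) has mean n-1
print(stats.kstest(Q, 'chi2', args=(n - 1,)).pvalue)  # should not be consistently tiny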

Important Result:

Let $Y \sim N(0, \sigma^2 I_n)$ and let $Q_1 = Y'P_1Y/\sigma^2$ and $Q_2 = Y'P_2Y/\sigma^2$ both be distributed as chi-square. Then, $Q_1$ and $Q_2$ are independent if and only if $P_1P_2 = 0$.
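A simulation sketch of this result with $P_1 = (1/n)J$ and $P_2 = I - (1/n)J$, which satisfy $P_1P_2 = 0$ (an illustrative choice): the sample correlation between $Q_1$ and $Q_2$ should be near zero. Zero correlation is only a quick sanity check, not a proof of independence.

# Two quadratic forms with P1 P2 = 0 behave independently in simulation.
import numpy as np

rng = np.random.default_rng(6)
n, sigma2, reps = 8, 1.0, 100_000
P1 = np.ones((n, n)) / n
P2 = np.eye(n) - P1
print(np.allclose(P1 @ P2, 0.0))          # True: P1 P2 = 0

Y = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
Q1 = np.einsum('ij,jk,ik->i', Y, P1, Y) / sigma2   # chi-square(1)
Q2 = np.einsum('ij,jk,ik->i', Y, P2, Y) / sigma2   # chi-square(n-1)
print(np.corrcoef(Q1, Q2)[0, 1])          # close to 0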

Useful Lemma:

If $P_1^2 = P_1$, $P_2^2 = P_2$, and $P_1 - P_2$ is positive semidefinite, then

$P_1 - P_2$ is idempotent and $P_1P_2 = P_2P_1 = P_2$.

Theorem:

Let $Y \sim N(0, \sigma^2 I_n)$ and let

$$Q_1 = \frac{Y'P_1Y}{\sigma^2} \sim \chi^2_{r_1}, \qquad Q_2 = \frac{Y'P_2Y}{\sigma^2} \sim \chi^2_{r_2}, \qquad r_1 > r_2.$$

If $Q_1 - Q_2 \ge 0$, then $Q_1 - Q_2$ and $Q_2$ are independent and $Q_1 - Q_2 \sim \chi^2_{r_1 - r_2}$.

[proof:]

We first prove $Q_1 - Q_2 \sim \chi^2_{r_1 - r_2}$. Note that

$$Q_1 - Q_2 = \frac{Y'(P_1 - P_2)Y}{\sigma^2} \ge 0,$$

thus $y'(P_1 - P_2)y \ge 0$ for every $y$, since $Y$ is any vector in $R^n$. Therefore, $P_1 - P_2$ is positive semidefinite. By the above useful lemma, $P_1 - P_2$ is idempotent and $P_1P_2 = P_2P_1 = P_2$. Further, by the previous theorem,

$$Q_1 - Q_2 = \frac{Y'(P_1 - P_2)Y}{\sigma^2} \sim \chi^2_{r_1 - r_2},$$

since

$$\mathrm{rank}(P_1 - P_2) = \mathrm{tr}(P_1 - P_2) = \mathrm{tr}(P_1) - \mathrm{tr}(P_2) = r_1 - r_2.$$

We now prove $Q_1 - Q_2$ and $Q_2$ are independent. Since

$$(P_1 - P_2)P_2 = P_1P_2 - P_2^2 = P_2 - P_2 = 0,$$

by the previous important result, the proof is complete. ◆
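A simulation sketch of this theorem using nested projection matrices from a simple regression design (an illustrative setup, not from the notes): $P_1$ projects onto the column space of $X = [\mathbf{1}, x]$ (rank 2) and $P_2 = (1/n)J$ projects onto the constant vector (rank 1), so $Q_1 - Q_2$ should be $\chi^2_1$ and independent of $Q_2$.

# Nested projections: P1 - P2 is idempotent of rank 1, and Q1 - Q2 behaves like chi-square(1).
import numpy as np

rng = np.random.default_rng(7)
n, sigma2, reps = 12, 2.0, 100_000
x = np.linspace(-1.0, 1.0, n)
X = np.column_stack([np.ones(n), x])

P1 = X @ np.linalg.inv(X.T @ X) @ X.T     # rank 2, idempotent
P2 = np.ones((n, n)) / n                  # rank 1, idempotent
print(np.allclose((P1 - P2) @ (P1 - P2), P1 - P2))   # P1 - P2 is idempotent

Y = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
Q1 = np.einsum('ij,jk,ik->i', Y, P1, Y) / sigma2
Q2 = np.einsum('ij,jk,ik->i', Y, P2, Y) / sigma2
diff = Q1 - Q2
print(diff.mean())                        # ~1, the mean of chi-square(1)
print(np.corrcoef(diff, Q2)[0, 1])        # ~0, consistent with independence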
