Image Restoration with Gaussian Mixture Models

R.FARNOOSH, B.ZARPAK

Department of Mathematics

University of Science and Technology-University of Shahed

IRAN

Abstract: In recent years probabilistic models are interested and have an important

role in image analysis. In this paper we have assumed image classification is done

and each pixel of image belongs to a class with a Gaussian density so we have

Gaussian Mixture Model prior for true image. In the other hand our observed

image also is Gaussian, thus with finding the posterior density and maximum a

posterior we can estimate the parameters, which construct our image restoration.

Keywords: Restoration, Gaussian Mixture Model, Bayesian Image Analysis, Maximum a Posteriori


1 Introduction


Recently, probabilistic models have become widespread in many domains, including image analysis, signal processing and pattern recognition. Probabilistic models such as the Bayesian model, the hidden Markov model and the Gaussian mixture model have changed data analysis and many scientific problems. McLachlan and Peel (2000) have written a good book about finite mixture models, Rasmussen and Ghahramani (2002) have worked on infinite mixture models, and Bishop (2003) has considered Bayesian mixtures.

In this paper we consider Gaussian mixture models for noise reduction and image restoration. We first assume that image classification has already been carried out and that each pixel of the image belongs to a class with a Gaussian density, so the true image follows a Gaussian mixture model. The observed image is also Gaussian, so if we find the posterior density and use the EM algorithm, we can estimate the parameters that yield our image restoration.

We therefore use the Gaussian mixture model (GMM) for image processing: we fit a Gaussian mixture distribution to an image and estimate the parameters that allow us to reduce the image noise.

In the next sections we first introduce the GMM formula for a true image and then consider the Gaussian white noise that is added to the true image to produce the observed image. We use a Bayesian approach and the EM algorithm to estimate the parameters, which give us the GMM of the true image and help us with image restoration.

2 The GMM Formulation

The value of a pixel in an image (i.e., its intensity or color) can be taken as a random variable. Since every random variable has a probability distribution, pixel values also have a probability distribution. The Gaussian mixture distribution is a good probability distribution for the pixel values of an image.

The Gaussian mixture distribution has the form

p(x) = \sum_{i=1}^{k} \pi_i \, N(x \mid \mu_i, \sigma_i^2),    (1)

where

N(x \mid \mu_i, \sigma_i^2) = \frac{1}{\sqrt{2\pi\sigma_i^2}} \exp\left( -\frac{(x - \mu_i)^2}{2\sigma_i^2} \right).

We assume the image is classified into classes and that the number of classes k is known. The parameters \mu_i, \sigma_i^2 and \pi_i are, respectively, the mean, the variance and the probability that a pixel belongs to class i; this means

\pi_i \ge 0, \qquad \sum_{i=1}^{k} \pi_i = 1.
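As an illustrative sketch (not part of the original paper), the mixture density (1) can be evaluated numerically as follows; the function name gmm_pdf and the example class parameters are our own assumptions.

import numpy as np

def gmm_pdf(x, weights, means, variances):
    """Evaluate the Gaussian mixture density (1) at pixel values x."""
    x = np.asarray(x, dtype=float)[..., None]                 # shape (..., 1)
    weights = np.asarray(weights, dtype=float)                # shape (k,)
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    # component densities N(x | mu_i, sigma_i^2)
    comp = np.exp(-(x - means) ** 2 / (2 * variances)) / np.sqrt(2 * np.pi * variances)
    return comp @ weights                                     # sum_i pi_i N(x | mu_i, sigma_i^2)

# Example with three illustrative grey-level classes
p = gmm_pdf([0.2, 0.5, 0.8], weights=[0.3, 0.5, 0.2],
            means=[0.1, 0.5, 0.9], variances=[0.01, 0.02, 0.01])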

2.1 The GWN Formulation

White noise (WN) is a random process consisting of uncorrelated random variables with zero mean and constant variance. Gaussian white noise (GWN) in addition has a Gaussian distribution, written as

\varepsilon \sim N(0, \sigma_\varepsilon^2).

Let us consider that the GWN is added to the true image and produces the observed image as follows:

y = x + \varepsilon.

If the true image is supposed to be fixed, then the probability distribution of y for given x has the form

p(y \mid x) = \frac{1}{\sqrt{2\pi\sigma_\varepsilon^2}} \exp\left( -\frac{(y - x)^2}{2\sigma_\varepsilon^2} \right).
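This observation model is easy to simulate; the following sketch adds Gaussian white noise to a synthetic true image, where the image size, noise level and function name are illustrative assumptions rather than choices made in the paper.

import numpy as np

def add_gaussian_white_noise(true_image, noise_std, seed=0):
    """Observed image y = x + eps, with eps ~ N(0, noise_std^2) at every pixel."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(loc=0.0, scale=noise_std, size=np.shape(true_image))
    return np.asarray(true_image, dtype=float) + eps

# Example: a flat synthetic 64x64 true image corrupted by noise of std 0.1
x_true = np.full((64, 64), 0.5)
y_obs = add_gaussian_white_noise(x_true, noise_std=0.1)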

3 Bayesian Approach

The Bayesian model is a popular method among probabilistic models, and we use it here to find the posterior probability, that is, the probability distribution of the true pixel value x for a given observed value y:

p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)} \propto p(y \mid x)\, p(x).

For the power of the exponential of the i-th component we have

-\frac{(y - x)^2}{2\sigma_\varepsilon^2} - \frac{(x - \mu_i)^2}{2\sigma_i^2} = -\frac{(x - \tilde{\mu}_i)^2}{2\tilde{\sigma}_i^2} + \text{const}.

If we take

\tilde{\sigma}_i^2 = \frac{\sigma_i^2 \sigma_\varepsilon^2}{\sigma_i^2 + \sigma_\varepsilon^2},    (6)

\tilde{\mu}_i = \tilde{\sigma}_i^2 \left( \frac{\mu_i}{\sigma_i^2} + \frac{y}{\sigma_\varepsilon^2} \right),    (7)

then we can write

N(y \mid x, \sigma_\varepsilon^2)\, N(x \mid \mu_i, \sigma_i^2) = N(y \mid \mu_i, \sigma_i^2 + \sigma_\varepsilon^2)\, N(x \mid \tilde{\mu}_i, \tilde{\sigma}_i^2),

where the posterior weights are

\tilde{\pi}_i = \frac{\pi_i N(y \mid \mu_i, \sigma_i^2 + \sigma_\varepsilon^2)}{\sum_{l=1}^{k} \pi_l N(y \mid \mu_l, \sigma_l^2 + \sigma_\varepsilon^2)}.

Therefore the posterior probability is also a Gaussian mixture distribution, which is an important result and the basis of this paper, so we have

p(x \mid y) = \sum_{i=1}^{k} \tilde{\pi}_i \, N(x \mid \tilde{\mu}_i, \tilde{\sigma}_i^2).
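As a sketch of this result, assuming the prior parameters and the noise variance \sigma_\varepsilon^2 are known (the function name and argument layout are our own), the posterior mixture parameters (6), (7) and the posterior weights can be computed for a single observed pixel as follows.

import numpy as np

def posterior_mixture(y, weights, means, variances, noise_var):
    """Posterior GMM parameters (6), (7) and weights for one observed pixel y."""
    weights = np.asarray(weights, dtype=float)
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    post_var = variances * noise_var / (variances + noise_var)      # (6)
    post_mean = post_var * (means / variances + y / noise_var)      # (7)
    # posterior weights: proportional to pi_i * N(y | mu_i, sigma_i^2 + noise_var)
    marg = np.exp(-(y - means) ** 2 / (2 * (variances + noise_var))) \
           / np.sqrt(2 * np.pi * (variances + noise_var))
    post_w = weights * marg
    return post_w / post_w.sum(), post_mean, post_var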

4 Estimation of Parameters

The observed image is our data, and with these data we can estimate the parameters of the probability model. Let the image data be

y_1, y_2, \ldots, y_n,

where n is the number of pixels, and let the parameters we should estimate be

\theta = (\pi_i, \mu_i, \sigma_i^2), \quad i = 1, \ldots, k.

Since we want to use all the data, we form the likelihood function

L(\theta) = \prod_{j=1}^{n} p(y_j \mid \theta) = \prod_{j=1}^{n} \sum_{i=1}^{k} \pi_i N(y_j \mid \mu_i, \sigma_i^2),

or, equivalently, the log-likelihood function

\ell(\theta) = \sum_{j=1}^{n} \log \sum_{i=1}^{k} \pi_i N(y_j \mid \mu_i, \sigma_i^2).    (9)

The values of \theta that maximize (9) are called the maximum likelihood estimators (MLE); maximum likelihood is a popular point estimation method.
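For completeness, a small sketch of evaluating the log-likelihood (9) is given below; it assumes the parameters are 1-D arrays of length k, and the stabilised log-sum-exp is our implementation choice rather than part of the paper.

import numpy as np

def gmm_log_likelihood(y, weights, means, variances):
    """Log-likelihood (9) of the pixel data y under the mixture parameters."""
    y = np.asarray(y, dtype=float).ravel()[:, None]              # shape (n, 1)
    weights = np.asarray(weights, dtype=float)
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    # log of the weighted component densities, shape (n, k)
    log_weighted = np.log(weights) - 0.5 * np.log(2 * np.pi * variances) \
                   - (y - means) ** 2 / (2 * variances)
    # stabilised log-sum-exp over the components, then sum over pixels
    m = log_weighted.max(axis=1, keepdims=True)
    return float((m[:, 0] + np.log(np.exp(log_weighted - m).sum(axis=1))).sum())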

Now we take the partial derivative of (9) with respect to the parameters, first with respect to \mu_i:

\frac{\partial \ell}{\partial \mu_i} = \sum_{j=1}^{n} \frac{\pi_i}{\sum_{l=1}^{k} \pi_l N(y_j \mid \mu_l, \sigma_l^2)} \, \frac{\partial N(y_j \mid \mu_i, \sigma_i^2)}{\partial \mu_i}.

But since

\frac{\partial N(y_j \mid \mu_i, \sigma_i^2)}{\partial \mu_i} = \frac{y_j - \mu_i}{\sigma_i^2} \, N(y_j \mid \mu_i, \sigma_i^2),

therefore

\frac{\partial \ell}{\partial \mu_i} = \sum_{j=1}^{n} \gamma_{ij} \, \frac{y_j - \mu_i}{\sigma_i^2},    (10)

where

\gamma_{ij} = \frac{\pi_i N(y_j \mid \mu_i, \sigma_i^2)}{\sum_{l=1}^{k} \pi_l N(y_j \mid \mu_l, \sigma_l^2)}.    (11)

This formula is called the membership probability. Similarly, we can write

\frac{\partial \ell}{\partial \sigma_i^2} = \sum_{j=1}^{n} \gamma_{ij} \, \frac{\partial \log N(y_j \mid \mu_i, \sigma_i^2)}{\partial \sigma_i^2}.

But we know

N(y_j \mid \mu_i, \sigma_i^2) = \frac{1}{\sqrt{2\pi\sigma_i^2}} \exp\left( -\frac{(y_j - \mu_i)^2}{2\sigma_i^2} \right).

After taking the logarithm,

\log N(y_j \mid \mu_i, \sigma_i^2) = -\frac{1}{2} \log(2\pi\sigma_i^2) - \frac{(y_j - \mu_i)^2}{2\sigma_i^2}.

Therefore

\frac{\partial \log N(y_j \mid \mu_i, \sigma_i^2)}{\partial \sigma_i^2} = -\frac{1}{2\sigma_i^2} + \frac{(y_j - \mu_i)^2}{2\sigma_i^4},

and

\frac{\partial \ell}{\partial \sigma_i^2} = \sum_{j=1}^{n} \gamma_{ij} \left( \frac{(y_j - \mu_i)^2}{2\sigma_i^4} - \frac{1}{2\sigma_i^2} \right).    (12)

Now, setting (10) and (12) to zero, and maximizing over the \pi_i subject to \sum_{i=1}^{k} \pi_i = 1, gives the equations whose solutions yield the maximization. Finally, the maximum likelihood estimates (MLE) of the parameters have the form

\hat{\mu}_i = \frac{\sum_{j=1}^{n} \gamma_{ij} y_j}{\sum_{j=1}^{n} \gamma_{ij}}, \qquad \hat{\sigma}_i^2 = \frac{\sum_{j=1}^{n} \gamma_{ij} (y_j - \hat{\mu}_i)^2}{\sum_{j=1}^{n} \gamma_{ij}}, \qquad \hat{\pi}_i = \frac{1}{n} \sum_{j=1}^{n} \gamma_{ij},

where the last formula is the expected value, or mean, of the membership probabilities shown in formula (11).
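A minimal sketch of these formulas, assuming the data are flattened into a 1-D array of pixel values, is given below; the function names are our own.

import numpy as np

def membership_probabilities(y, weights, means, variances):
    """Membership probabilities gamma_ij of formula (11), shape (n, k)."""
    y = np.asarray(y, dtype=float).ravel()[:, None]
    weights = np.asarray(weights, dtype=float)
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    comp = np.exp(-(y - means) ** 2 / (2 * variances)) / np.sqrt(2 * np.pi * variances)
    weighted = weights * comp
    return weighted / weighted.sum(axis=1, keepdims=True)

def mle_updates(y, gamma):
    """Closed-form maximum likelihood updates for pi_i, mu_i and sigma_i^2."""
    y = np.asarray(y, dtype=float).ravel()
    n_i = gamma.sum(axis=0)                                   # effective class counts
    mu = (gamma * y[:, None]).sum(axis=0) / n_i
    var = (gamma * (y[:, None] - mu) ** 2).sum(axis=0) / n_i
    pi = n_i / y.size
    return pi, mu, var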

Since the membership probabilities \gamma_{ij} themselves depend on the parameters, these equations must be solved iteratively; we discuss the method in the next section.

4.1 The EM-Algorithm

The expectation-maximization (EM) algorithm was introduced by Dempster et al. (1977) for maximizing likelihood functions with missing data. This algorithm is a popular tool for simplifying difficult maximum likelihood problems. It has two steps: in the E-step we compute the expectation (here, the membership probabilities), and in the M-step the maximization based on the previous step is carried out; the EM steps are iterated until convergence occurs.

______

The EM-algorithm

1. Initialize the parameters \pi_i, \mu_i, \sigma_i^2, i = 1, \ldots, k.

2. (E-step) Compute the membership probabilities \gamma_{ij} from (11) with the current parameters.

3. (M-step) Compute the updated parameters \hat{\pi}_i, \hat{\mu}_i, \hat{\sigma}_i^2 from the MLE formulas of Section 4.

4. Iterate steps 2 and 3 until convergence.
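The following sketch puts the steps together for 1-D pixel data; the initialization strategy, the stopping rule based on the log-likelihood and the small variance floor are our own implementation assumptions, not prescribed by the paper.

import numpy as np

def fit_gmm_em(y, k, n_iter=200, tol=1e-6, seed=0):
    """Fit a k-class Gaussian mixture to 1-D pixel data y with the EM algorithm."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float).ravel()
    # 1. Initialize: distinct data points as means, common variance, uniform weights
    mu = rng.choice(y, size=k, replace=False)
    var = np.full(k, y.var() + 1e-12)
    pi = np.full(k, 1.0 / k)
    prev_ll = -np.inf
    for _ in range(n_iter):
        # 2. E-step: membership probabilities gamma_ij, formula (11)
        comp = np.exp(-(y[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        weighted = pi * comp
        gamma = weighted / weighted.sum(axis=1, keepdims=True)
        # log-likelihood (9) at the current parameters, used for the stopping rule
        ll = np.log(weighted.sum(axis=1)).sum()
        # 3. M-step: closed-form updates for pi_i, mu_i and sigma_i^2
        n_i = gamma.sum(axis=0)
        mu = (gamma * y[:, None]).sum(axis=0) / n_i
        var = (gamma * (y[:, None] - mu) ** 2).sum(axis=0) / n_i + 1e-12
        pi = n_i / y.size
        # 4. Iterate until the log-likelihood stops improving
        if abs(ll - prev_ll) < tol:
            break
        prev_ll = ll
    return pi, mu, var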

4.2 The True Image Parameters

At the beginning of the paper we wanted to fit a GMM of the form (1) to the true image. From (6), (7) we can write, for each observed pixel value y_j, the posterior distribution of the true pixel value x_j as

p(x_j \mid y_j) = \sum_{i=1}^{k} \tilde{\pi}_{ij} \, N(x_j \mid \tilde{\mu}_{ij}, \tilde{\sigma}_i^2),

where \tilde{\sigma}_i^2, \tilde{\mu}_{ij} and the posterior weights \tilde{\pi}_{ij} are obtained by substituting the estimated parameters \hat{\pi}_i, \hat{\mu}_i, \hat{\sigma}_i^2 into (6), (7) and the posterior weight formula of Section 3. With the observed image, the EM algorithm and the above formula, the GMM of the true image is specified.

The expected value, or mean, of the fitted GMM is taken as the image restoration,

\hat{x}_j = E[x_j \mid y_j].

But we have

E[x_j \mid y_j] = \sum_{i=1}^{k} \tilde{\pi}_{ij} \, \tilde{\mu}_{ij}.

Thus, in our method we use the whole observed image, that is, our training data, twice: first in determining the posterior probability and running the EM algorithm, and second in calculating the expected value of the learned GMM.
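A sketch of this restoration step is given below; it assumes that the prior GMM parameters of the true image and the noise variance are available (for example, from the EM fit above and a known noise level), which the paper does not spell out, so the argument names and that assumption are ours.

import numpy as np

def restore_image(y_img, weights, means, variances, noise_var):
    """Restore every pixel by the posterior mean E[x | y] of the learned GMM."""
    y = np.asarray(y_img, dtype=float)[..., None]                   # shape (..., 1)
    weights = np.asarray(weights, dtype=float)
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    post_var = variances * noise_var / (variances + noise_var)      # (6)
    post_mean = post_var * (means / variances + y / noise_var)      # (7), per pixel
    marg = np.exp(-(y - means) ** 2 / (2 * (variances + noise_var))) \
           / np.sqrt(2 * np.pi * (variances + noise_var))
    post_w = weights * marg
    post_w = post_w / post_w.sum(axis=-1, keepdims=True)            # posterior weights
    return (post_w * post_mean).sum(axis=-1)                        # pixel-wise E[x | y]

# Illustrative use with the earlier sketches (names are assumptions, not the paper's):
# pi, mu, var = fit_gmm_em(y_obs.ravel(), k=3)
# x_hat = restore_image(y_obs, pi, mu, var, noise_var=0.1 ** 2)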

References:

[1] G.J. McLachlan, D. Peel, Finite Mixture Models, John Wiley & Sons, 2000.

[2] C.E. Rasmussen, Z. Ghahramani, Infinite Mixtures of Gaussian Process Experts, Advances in Neural Information Processing Systems 14, MIT Press, 2002.

[3] C.M. Bishop, M. Svensen, Bayesian Hierarchical Mixtures of Experts, Proceedings of the 19th Conference on Uncertainty in Artificial Intelligence, 2003.
