6. RANDOM PROCESSES

Definition

A random process (or stochastic process) X(t) is an indexed collection of random variables {X(t), t ∈ T}, all defined on the same probability space (S, F, P).

In many applications the index set T (also called the parameter set) is a set of times, which may be discrete or continuous, giving:

  • discrete time random processes
  • continuous time random processes

To every S , there corresponds a function of time –

a sample function

The totality of all sample functions is called an ensemble

The values assumed by X(t) are called states, and they form the state space E of the random process.

Sample function of a random process

Example 6-1

In the coin-tossing experiment, where S = {H, T}, define a random process X(t) by assigning one function of time to each outcome.
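A minimal simulation sketch of this construction. The specific assignment used below, sin(πt) for heads and 2t for tails, is an illustrative assumption rather than the example's stated definition.

```python
import numpy as np

def sample_function(outcome, t):
    """Sample function X(t, outcome) for one coin-toss result.

    The assignment (sin(pi*t) for heads, 2*t for tails) is assumed
    purely for illustration.
    """
    t = np.asarray(t, dtype=float)
    if outcome == "H":
        return np.sin(np.pi * t)
    return 2.0 * t

t = np.linspace(0.0, 2.0, 5)
for outcome in ("H", "T"):            # S = {H, T}
    print(outcome, sample_function(outcome, t))
```

Each outcome ζ ∈ S selects one deterministic sample function; before the coin is tossed, X(t) at any fixed t is a random variable with two possible values.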

Both the parameter set and the state space can be discrete or continuous. Depending on this, the process is classified as follows:

PARAMETER SET T
  • discrete: discrete-parameter process, or discrete-time process, or random sequence {Xn, n = 1, 2, ...}
  • continuous: continuous-parameter process, or continuous-time process

STATE SPACE E
  • discrete: discrete-state process, or chain
  • continuous: continuous-state process

There are three ways to look at a random process X(t, ζ):

  1. X(t, ζ) as a function of both ζ ∈ S and t ∈ T,
  2. for each fixed ζ ∈ S, X(t, ζ) is a function of t ∈ T (a sample function),
  3. for each fixed t ∈ T, X(t, ζ) is a function of ζ, i.e. a random variable on S.
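These three views can be made concrete by storing a (discretized) ensemble as a matrix: rows are sample functions (fixed ζ) and columns are random variables (fixed t). The process used here, X(t) = A·cos(t) with a random amplitude A, is only an assumed example.

```python
import numpy as np

rng = np.random.default_rng(0)

t = np.linspace(0.0, 10.0, 101)        # discretized index set T
A = rng.normal(size=500)               # one random amplitude per outcome ζ

# View 1: X(t, ζ) as a function of both ζ and t -> a 2-D ensemble array.
ensemble = A[:, None] * np.cos(t)[None, :]   # shape (outcomes, times)

# View 2: fix ζ (one row) -> a sample function of t.
x_zeta0 = ensemble[0, :]

# View 3: fix t (one column) -> a random variable on S.
X_at_t = ensemble[:, 50]

print(x_zeta0.shape, X_at_t.shape)
```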

Distribution and density functions

The first-order distribution function is defined as:

F_X(x; t) = P{X(t) ≤ x}

The first-order density function is defined as:

f_X(x; t) = ∂F_X(x; t) / ∂x

In general, we can define the nth-order distribution function as:

F_X(x1, ..., xn; t1, ..., tn) = P{X(t1) ≤ x1, ..., X(tn) ≤ xn}

and the nth-order density function as:

f_X(x1, ..., xn; t1, ..., tn) = ∂^n F_X(x1, ..., xn; t1, ..., tn) / (∂x1 ∂x2 ··· ∂xn)
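The first-order distribution can be estimated from an ensemble of sample functions: F_X(x; t) is approximately the fraction of sample functions whose value at time t does not exceed x. A sketch, reusing the assumed random-amplitude example:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 101)
ensemble = rng.normal(size=500)[:, None] * np.cos(t)[None, :]  # assumed example process

def first_order_cdf(ensemble, t_index, x):
    """Empirical F_X(x; t) = P{X(t) <= x}, estimated over the ensemble."""
    values_at_t = ensemble[:, t_index]
    return np.mean(values_at_t <= x)

# Roughly 0.5 for this zero-symmetric example.
print(first_order_cdf(ensemble, t_index=50, x=0.0))
```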

First- and second-order statistical averages

The mean, or expected value, of a random process X(t) is defined as:

μ_X(t) = E[X(t)]

Here X(t) is treated as a random variable for each fixed value of t. In general, μ_X(t) is a function of time, and it is often called the ensemble average of X(t).

A measure of dependence among the random variables of X(t) is given by its autocorrelation function, defined by:

R_X(t1, t2) = E[X(t1) X(t2)]

and its autocovariance function, defined by:

C_X(t1, t2) = R_X(t1, t2) − μ_X(t1) μ_X(t2)
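These ensemble averages can be approximated by averaging across sample functions at fixed time instants. A sketch, again using the assumed random-amplitude example:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 101)
ensemble = rng.normal(size=2000)[:, None] * np.cos(t)[None, :]  # assumed example process

# Mean mu_X(t): average over the ensemble at each fixed t.
mu = ensemble.mean(axis=0)

# Autocorrelation R_X(t1, t2) = E[X(t1) X(t2)], estimated for two fixed times.
i1, i2 = 20, 60
R = np.mean(ensemble[:, i1] * ensemble[:, i2])

# Autocovariance C_X(t1, t2) = R_X(t1, t2) - mu_X(t1) mu_X(t2).
C = R - mu[i1] * mu[i2]

print(mu[i1], R, C)
```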

Classification of random processes

Stationary processes

A random process X(t) is stationary, or strict-sense stationary, if its statistical properties do not change with time; more precisely, if

F_X(x1, ..., xn; t1, ..., tn) = F_X(x1, ..., xn; t1 + τ, ..., tn + τ)

for all orders n and all time shifts τ.

Stationarity constrains the form of the first- and second-order distribution and density functions:

F_X(x; t) = F_X(x)    and    F_X(x1, x2; t1, t2) = F_X(x1, x2; t2 − t1)

The mean of a stationary process,

μ_X(t) = μ_X = constant,

does not depend on time, and the autocorrelation function,

R_X(t1, t2) = R_X(t2 − t1),

depends only on the time difference t2 − t1.

If the stationarity condition of a random process X(t) does not hold for all n but only for n ≤ k, then the process X(t) is said to be stationary to order k.

If X(t) is stationary to order 2, then it is wide-sense stationary (WSS), or weakly stationary.
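A hedged numerical check of wide-sense stationarity, using the standard random-phase cosine X(t) = cos(ωt + Θ) with Θ uniform on [0, 2π) as an assumed example: the estimated mean is roughly constant in t, and the estimated autocorrelation depends essentially only on the lag t2 − t1.

```python
import numpy as np

rng = np.random.default_rng(3)
omega = 2.0
theta = rng.uniform(0.0, 2.0 * np.pi, size=20000)   # one random phase per outcome
t = np.linspace(0.0, 10.0, 101)
ensemble = np.cos(omega * t[None, :] + theta[:, None])

mu = ensemble.mean(axis=0)                           # approximately 0 for every t

# Same lag (10 samples = 1.0 time unit), two different time origins.
R_a = np.mean(ensemble[:, 10] * ensemble[:, 20])
R_b = np.mean(ensemble[:, 50] * ensemble[:, 60])

print(np.abs(mu).max())     # near 0
print(R_a, R_b)             # both near 0.5 * cos(omega * 1.0)
```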

Independent processes

In a random process X(t), if the X(ti) are independent random variables for i = 1, 2, ..., n, then for n ≥ 2 we have:

F_X(x1, ..., xn; t1, ..., tn) = F_X(x1; t1) F_X(x2; t2) ··· F_X(xn; tn)

The first-order distribution alone is therefore sufficient to characterize an independent random process.

Markov Processes

A random process X(t) is said to be a Markov process if

P{X(tn+1) ≤ xn+1 | X(t1) = x1, ..., X(tn) = xn} = P{X(tn+1) ≤ xn+1 | X(tn) = xn}

whenever t1 < t2 < ... < tn < tn+1.

The future states of the process depend only on the present state and not on the past history (the memoryless property).

For a Markov process we can write:

f(x1, ..., xn; t1, ..., tn) = f(x1; t1) f(x2; t2 | x1; t1) ··· f(xn; tn | xn−1; tn−1)
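A minimal sketch of the memoryless property for a discrete-time, discrete-state Markov process (a chain): each new state is drawn using only the current state and a transition matrix. The two-state matrix below is an assumed example.

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed transition matrix: P[i, j] = P{X(n+1) = j | X(n) = i}.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

def simulate_chain(P, x0, n_steps, rng):
    """Simulate a Markov chain; each step depends only on the current state."""
    states = [x0]
    for _ in range(n_steps):
        current = states[-1]
        states.append(rng.choice(len(P), p=P[current]))
    return np.array(states)

path = simulate_chain(P, x0=0, n_steps=10000, rng=rng)
print(np.bincount(path) / len(path))   # empirical state occupancy
```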

Ergodic processes

A random process X(t) is ergodic if the time averages of its sample functions are equal to the corresponding ensemble averages.

The time average of x(t) is defined as:

⟨x(t)⟩ = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} x(t) dt

Similarly, the time autocorrelation function of x(t) is defined as:

R̄_x(τ) = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} x(t) x(t + τ) dt
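A sketch comparing a time average computed from a single long sample function with the ensemble average, again using the assumed random-phase cosine, which is a standard example of an ergodic process:

```python
import numpy as np

rng = np.random.default_rng(5)
omega = 2.0
t = np.linspace(0.0, 200.0, 20001)
dt = t[1] - t[0]

# One sample function x(t): a single fixed random phase, long observation window.
theta0 = rng.uniform(0.0, 2.0 * np.pi)
x = np.cos(omega * t + theta0)

# Time average <x(t)>: with uniform sampling, the sample mean approximates
# (1/(2T)) * integral of x(t) dt over the window.
time_mean = x.mean()

# Time autocorrelation at lag tau = 1.0 (100 samples), from the same sample function.
lag = 100
time_autocorr = np.mean(x[:-lag] * x[lag:])

print(time_mean)        # close to the ensemble mean, 0
print(time_autocorr)    # close to 0.5 * cos(omega * lag * dt)
```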

Counting process

A random process {X(t), t ≥ 0} is called a counting process if X(t) represents the total number of “events” that have occurred in the interval (0, t). It has the following properties:

  1. X(t) ≥ 0 and X(0) = 0
  2. X(t) is integer valued
  3. X(t1) ≤ X(t2) if t1 ≤ t2
  4. X(t2) − X(t1) equals the number of events in the interval (t1, t2)

A sample function of a counting process

Poisson processes

If the number of events n in any interval of length τ is Poisson distributed with mean λτ, that is:

P{X(t + τ) − X(t) = n} = e^(−λτ) (λτ)^n / n!,   n = 0, 1, 2, ...

then the counting process X(t) is said to be a Poisson process with rate (or intensity) λ.
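A minimal simulation sketch of a Poisson counting process, built from its exponentially distributed inter-arrival times (a standard construction; the rate value below is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
lam = 2.0          # rate (intensity) lambda, chosen for illustration
horizon = 1000.0   # observe the process on (0, horizon)

# Inter-arrival times of a Poisson process are exponential with mean 1/lambda.
inter_arrivals = rng.exponential(1.0 / lam, size=int(2 * lam * horizon))
arrival_times = np.cumsum(inter_arrivals)
arrival_times = arrival_times[arrival_times <= horizon]

def X(t, arrival_times):
    """Counting process: number of events that have occurred by time t."""
    return np.searchsorted(arrival_times, t, side="right")

# Counts in disjoint unit-length intervals should be approximately Poisson(lam).
grid = np.arange(0.0, horizon + 1.0, 1.0)
counts = np.diff([X(t, arrival_times) for t in grid])
print(counts.mean(), counts.var())   # both approximately lam
```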
