
Time of occurrence statistics

To derive the multivariate distribution of $\mathbf{W}$, we use the count statistics and the independence properties of the Poisson process. The density we seek satisfies
\[
\int_{w_1}^{w_1+\delta_1} \cdots \int_{w_n}^{w_n+\delta_n} p_{\mathbf{W}_n}(\mathbf{v})\,d\mathbf{v}
  \;=\; \Pr\!\left[\,w_1 < W_1 \le w_1+\delta_1, \dots, w_n < W_n \le w_n+\delta_n\,\right].
\]
The expression on the right equals the probability that no events occur in $(t_1, w_1]$, one event in $(w_1, w_1+\delta_1]$, no event in $(w_1+\delta_1, w_2]$, etc. Because of the independence of event occurrence in these disjoint intervals, we can multiply together the probabilities of these event occurrences, each of which is given by the count statistics.
\[
\begin{aligned}
\Pr\!\left[\,w_1 < W_1 \le w_1+\delta_1, \dots, w_n < W_n \le w_n+\delta_n\,\right]
  &= \Pr[N(t_1,w_1)=0]\;\Pr[N(w_1,w_1+\delta_1)=1]\;\Pr[N(w_1+\delta_1,w_2)=0]\;\cdots\;\Pr[N(w_n,w_n+\delta_n)=1] \\
  &\approx \prod_{k=1}^{n} \lambda(w_k)\,\delta_k\, \exp\!\left(-\int_{t_1}^{w_n} \lambda(\alpha)\,d\alpha\right)
\end{aligned}
\]
for small $\delta_k$. From this approximation, we find that the joint distribution of the first $n$ event times equals

\[
p_{\mathbf{W}_n}(\mathbf{w}) =
\begin{cases}
\displaystyle \prod_{k=1}^{n} \lambda(w_k)\, \exp\!\left(-\int_{t_1}^{w_n} \lambda(\alpha)\,d\alpha\right), & t_1 \le w_1 \le w_2 \le \cdots \le w_n \\[1ex]
0, & \text{otherwise.}
\end{cases}
\]
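To make this formula concrete, the following sketch evaluates the joint density (in log form) for an ordered set of event times under an arbitrary intensity function. The ramp intensity $\lambda(t) = 2 + t$ and the particular event times are hypothetical choices made only for illustration; they are not part of the derivation above.

```python
import numpy as np
from scipy.integrate import quad

def log_event_time_density(event_times, intensity, t1):
    """Log of the joint density of ordered event times w_1 <= ... <= w_n:
    sum_k log(lambda(w_k)) - integral_{t1}^{w_n} lambda(alpha) d(alpha)."""
    w = np.asarray(event_times, dtype=float)
    if w.size == 0 or w[0] < t1 or np.any(np.diff(w) < 0):
        return -np.inf                      # density is zero unless t1 <= w_1 <= ... <= w_n
    integral, _ = quad(intensity, t1, w[-1])
    return sum(np.log(intensity(t)) for t in w) - integral

# Hypothetical ramp intensity and event times, used only to exercise the formula.
intensity = lambda t: 2.0 + t
print(log_event_time_density([0.3, 0.7, 1.4], intensity, t1=0.0))
```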

Sample function density

For Poisson processes, the sample function density describes the joint distribution of counts and event times within a specified time interval. Thus, it can be written as
\[
p_{N_t; W_1, \dots, W_n}(n; w_1, \dots, w_n) \;=\; \Pr\!\left[N(t_1,t_2) = n \mid W_1 = w_1, \dots, W_n = w_n\right] p_{\mathbf{W}_n}(\mathbf{w}), \qquad t_1 \le t \le t_2 .
\]
The second term in the product equals the distribution derived previously for the time of occurrence statistics. The conditional probability equals the probability that no events occur between $w_n$ and $t_2$; from the Poisson process's count statistics, this probability equals $\exp\!\left(-\int_{w_n}^{t_2} \lambda(\alpha)\,d\alpha\right)$. Consequently, the sample function density for the Poisson process, be it stationary or not, equals

\[
p_{N_t; W_1, \dots, W_n}(n; w_1, \dots, w_n) \;=\; \prod_{k=1}^{n} \lambda(w_k)\, \exp\!\left(-\int_{t_1}^{t_2} \lambda(\alpha)\,d\alpha\right), \qquad t_1 \le t \le t_2 .
\]
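A sample function of a nonstationary Poisson process can also be generated numerically. The sketch below uses thinning (rejection of candidate events from a faster stationary process), a standard simulation technique that is not derived in this section; the intensity $\lambda(t) = 2 + t$ and its bound `lam_max` are illustrative assumptions.

```python
import numpy as np

def simulate_sample_function(intensity, t1, t2, lam_max, seed=None):
    """Draw one sample function (the count N_t and its event times) on (t1, t2]
    by thinning a stationary rate-lam_max process; lam_max must bound intensity(t)."""
    rng = np.random.default_rng(seed)
    times, t = [], t1
    while True:
        t += rng.exponential(1.0 / lam_max)          # next candidate event
        if t > t2:
            break
        if rng.uniform() < intensity(t) / lam_max:   # keep with probability lambda(t)/lam_max
            times.append(t)
    return len(times), np.array(times)

n, w = simulate_sample_function(lambda t: 2.0 + t, 0.0, 5.0, lam_max=7.0, seed=0)
print(n, w[:5])
```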

Properties

From the probability distributions derived on the previous pages, we can discern many structural properties of the Poisson process. These properties set the stage for delineating other point processes from the Poisson. They, as described subsequently, have much more structure and are much more difficult to handle analytically.

The counting process

The counting process $N_t$ is an independent increment process. For a Poisson process, the numbers of events in disjoint intervals are statistically independent of each other, meaning that we have an independent increment process. When the Poisson process is stationary, increments taken over equi-duration intervals are identically distributed as well as being statistically independent. Two important results obtain from this property. First, the counting process's covariance function $K_N(t,u)$ equals $\lambda_0 \min(t,u)$. This close relation to the Wiener waveform process, whose covariance has the same $\min(t,u)$ form, indicates the fundamental nature of the Poisson process in the world of point processes. Note, however, that the Poisson counting process is not continuous almost surely. Second, the sequence of counts forms an ergodic process, meaning we can estimate the intensity parameter from observations.
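As a rough illustration of the ergodicity claim, the sketch below estimates the rate of a stationary Poisson process from a single long observation by dividing the observed count by the observation length. The rate $\lambda_0 = 4$ and the interval length are hypothetical values chosen only for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
lam0, T = 4.0, 1000.0                         # hypothetical rate and observation length

# One long stationary sample function: event times are cumulative sums of
# independent exponential interevent intervals with mean 1/lam0.
arrivals = np.cumsum(rng.exponential(1.0 / lam0, size=int(3 * lam0 * T)))
arrivals = arrivals[arrivals <= T]

lam_hat = arrivals.size / T                   # ergodicity: the time average estimates lam0
print(lam_hat)                                # close to 4.0 for large T
```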

The mean and variance of the number of events in an interval can be easily calculated from the Poisson distribution. Alternatively, we can calculate the characteristic function and evaluate its derivatives. The characteristic function of an increment equals
\[
\Phi_{N(t_1,t_2)}(\nu) \;=\; \exp\!\left[\left(e^{j\nu} - 1\right) \int_{t_1}^{t_2} \lambda(\alpha)\,d\alpha\right].
\]
The first two moments and variance of an increment of the Poisson process, be it stationary or not, equal

\[
\mathbb{E}\!\left[N(t_1,t_2)\right] = \int_{t_1}^{t_2} \lambda(\alpha)\,d\alpha
\]
\[
\mathbb{E}\!\left[N(t_1,t_2)^2\right] = \int_{t_1}^{t_2} \lambda(\alpha)\,d\alpha + \left(\int_{t_1}^{t_2} \lambda(\alpha)\,d\alpha\right)^{2}
\]
\[
\operatorname{Var}\!\left[N(t_1,t_2)\right] = \int_{t_1}^{t_2} \lambda(\alpha)\,d\alpha
\]
Note that the mean equals the variance here, a trademark of the Poisson process.
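A quick Monte Carlo check of these moment formulas, under an assumed ramp intensity $\lambda(t) = 2 + t$ on the interval $(1, 3)$, is sketched below; the integrated intensity there equals 8, so the empirical mean and variance of the counts should both approach 8.

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(2)
intensity = lambda t: 2.0 + t                 # hypothetical nonstationary intensity
t1, t2, lam_max, trials = 1.0, 3.0, 5.0, 20000

counts = np.empty(trials)
for i in range(trials):                       # thinning, as in the earlier sketch
    t, n = t1, 0
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > t2:
            break
        n += rng.uniform() < intensity(t) / lam_max
    counts[i] = n

print(quad(intensity, t1, t2)[0])             # integrated intensity: 8
print(counts.mean(), counts.var())            # both approach 8
```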

Poisson process event times form a Markov process

Consider the conditional density $p_{W_n \mid W_{n-1}, \dots, W_1}(w_n \mid w_{n-1}, \dots, w_1)$. This density equals the ratio of the event time densities for the $n$- and $(n-1)$-dimensional event time vectors. Simple substitution yields

\[
p_{W_n \mid W_{n-1}, \dots, W_1}(w_n \mid w_{n-1}, \dots, w_1) \;=\; \lambda(w_n)\, \exp\!\left(-\int_{w_{n-1}}^{w_n} \lambda(\alpha)\,d\alpha\right), \qquad w_n \ge w_{n-1} .
\]
Thus the $n$th event time depends only on when the $(n-1)$th event occurs, meaning that we have a Markov process. Note that event times are ordered: the $n$th event must occur after the $(n-1)$th, etc. Thus, the values of this Markov process keep increasing, meaning that from this viewpoint, the event times form a nonstationary Markovian sequence. When the process is stationary, the evolutionary density is exponential. It is this special form of event occurrence time density that defines a Poisson process.
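Because of this Markov structure, event times can be generated sequentially, each drawn conditionally on the previous one. The sketch below does so for the stationary case, where the evolutionary density is $\lambda_0 e^{-\lambda_0 (w_n - w_{n-1})}$; the rate $\lambda_0 = 2$ is an illustrative assumption.

```python
import numpy as np

def next_event_time(w_prev, lam0, rng):
    """Draw W_n given W_{n-1} = w_prev for a stationary Poisson process:
    the evolutionary density is lam0 * exp(-lam0 * (w_n - w_prev)), w_n >= w_prev."""
    return w_prev + rng.exponential(1.0 / lam0)

rng = np.random.default_rng(3)
w, event_times = 0.0, []
for _ in range(10):                           # generate the first 10 event times sequentially
    w = next_event_time(w, lam0=2.0, rng=rng)
    event_times.append(w)
print(event_times)                            # strictly increasing, as required
```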

Interevent intervals in a Poisson process form a white sequence

Exploiting the previous property, the duration of the $n$th interval, $\tau_n = W_n - W_{n-1}$, does not depend on the lengths of previous (or future) intervals. Consequently, the sequence of interevent intervals forms a "white" sequence. The sequence may not be identically distributed unless the process is stationary. In the stationary case, interevent intervals are truly white (they form an IID sequence) and have an exponential distribution:

\[
p_{\tau_n}(\tau) \;=\; \lambda_0\, e^{-\lambda_0 \tau}, \qquad \tau \ge 0 .
\]
To show that the exponential density of a white sequence corresponds to the most "random" distribution, Parzen proved that the ordered times of $n$ events sprinkled independently and uniformly over a given interval form a stationary Poisson process. If the density of event sprinkling is not uniform, the resulting ordered times constitute a nonstationary Poisson process with an intensity proportional to the sprinkling density.
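Parzen's sprinkling result is easy to check numerically: sort independent uniform draws over an interval and examine the resulting interevent intervals, which should behave like an exponential sequence with mean equal to the interval length divided by the number of events. The interval length and event count below are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(4)
T, n = 100.0, 400                                 # sprinkle n events uniformly on (0, T)

event_times = np.sort(rng.uniform(0.0, T, size=n))
intervals = np.diff(event_times)

# For a stationary Poisson process of rate n/T the intervals are exponential with
# mean T/n; compare the empirical mean and the empirical survival at one mean.
print(intervals.mean(), T / n)                    # both near 0.25
print(np.mean(intervals > T / n), np.exp(-1.0))   # both near 0.368
```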

Doubly stochastic Poisson processes

Here, the intensity $\lambda(t)$ equals a sample function drawn from some waveform process. In waveform processes, the analogous concept does not have nearly the impact it does here. Because intensity waveforms must be non-negative, the intensity process must be nonzero mean and non-Gaussian. We shall assume throughout that the intensity process is stationary for simplicity. This model arises in those situations in which the event occurrence rate clearly varies unpredictably with time. Such processes have the property that the variance-to-mean ratio of the number of events in any interval exceeds one. In the process of deriving this last property, we illustrate the typical way of analyzing doubly stochastic processes: condition on the intensity equaling a particular sample function, use the statistical characteristics of nonstationary Poisson processes, then "average" with respect to the intensity process. To calculate the expected number $\mathbb{E}[N(t_1,t_2)]$ of events in an interval, we use conditional expected values:

\[
\mathbb{E}\!\left[N(t_1,t_2)\right] \;=\; \mathbb{E}\!\Bigl[\,\mathbb{E}\bigl[N(t_1,t_2) \mid \lambda(t),\ t_1 \le t \le t_2\bigr]\Bigr]
\;=\; \mathbb{E}\!\left[\int_{t_1}^{t_2} \lambda(\alpha)\,d\alpha\right]
\;=\; (t_2 - t_1)\,\mathbb{E}\!\left[\lambda(t)\right].
\]
This result can also be written as the expected value of the integrated intensity $\Lambda(t_1,t_2) \equiv \int_{t_1}^{t_2} \lambda(\alpha)\,d\alpha$: $\mathbb{E}[N(t_1,t_2)] = \mathbb{E}[\Lambda(t_1,t_2)]$. Similar calculations yield the increment's second moment and variance.
\[
\mathbb{E}\!\left[N(t_1,t_2)^2\right] = \mathbb{E}\!\left[\Lambda(t_1,t_2)\right] + \mathbb{E}\!\left[\Lambda(t_1,t_2)^{2}\right]
\]
\[
\operatorname{Var}\!\left[N(t_1,t_2)\right] = \mathbb{E}\!\left[\Lambda(t_1,t_2)\right] + \operatorname{Var}\!\left[\Lambda(t_1,t_2)\right]
\]
Using the last result, we find that the variance-to-mean ratio in a doubly stochastic process always exceeds unity, equaling one plus the variance-to-mean ratio of the integrated intensity.
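The variance-to-mean property can be checked by simulation. The sketch below uses the simplest doubly stochastic model available: each sample function of the intensity is a constant rate drawn from a gamma distribution, so the integrated intensity is just that rate times the interval length. The gamma parameters and interval are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(5)
t1, t2, trials = 0.0, 2.0, 200000

# Hypothetical intensity process: each sample function is a constant rate drawn
# from a gamma distribution, so Lambda(t1, t2) = rate * (t2 - t1).
rates = rng.gamma(shape=3.0, scale=1.0, size=trials)
Lambda = rates * (t2 - t1)
counts = rng.poisson(Lambda)                      # condition on the intensity, then count

print(counts.var() / counts.mean())               # exceeds one
print(1.0 + Lambda.var() / Lambda.mean())         # matches: 1 + var-to-mean ratio of Lambda
```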

The approach of sample-function conditioning can also be used to derive the density of the number of events occurring in an interval for a doubly stochastic Poisson process. Conditioned on the occurrence of a sample function, the probability of $n$ events occurring in the interval $(t_1, t_2)$ equals
\[
\Pr\!\left[N(t_1,t_2) = n \mid \lambda(t),\ t_1 \le t \le t_2\right] \;=\; \frac{\bigl(\Lambda(t_1,t_2)\bigr)^{n}}{n!}\, e^{-\Lambda(t_1,t_2)} .
\]
Because $\Lambda(t_1,t_2)$ is a random variable, the unconditional distribution equals this conditional probability averaged with respect to this random variable's density. This average is known as the Poisson Transform of the random variable's density.

\[
\Pr\!\left[N(t_1,t_2) = n\right] \;=\; \int_{0}^{\infty} \frac{\alpha^{n}}{n!}\, e^{-\alpha}\, p_{\Lambda(t_1,t_2)}(\alpha)\,d\alpha .
\]
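As one worked instance of the Poisson Transform, suppose (purely for illustration) that the integrated intensity $\Lambda(t_1,t_2)$ is gamma-distributed; the transform integral then produces a negative binomial distribution for the counts, which the sketch below verifies numerically.

```python
import numpy as np
from scipy import integrate, stats

# Hypothetical case: Lambda(t1, t2) has a gamma density with shape k and scale s.
k, s, n = 3.0, 2.0, 4

def integrand(a):
    # Poisson Transform integrand: (a^n / n!) e^{-a} * p_Lambda(a)
    return stats.poisson.pmf(n, a) * stats.gamma.pdf(a, k, scale=s)

pn, _ = integrate.quad(integrand, 0.0, np.inf)

# Mixing a Poisson count with a gamma-distributed mean gives a negative binomial.
print(pn, stats.nbinom.pmf(n, k, 1.0 / (1.0 + s)))    # the two agree
```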





Source:  OpenStax, Statistical signal processing. OpenStax CNX. Dec 05, 2011 Download for free at http://cnx.org/content/col11382/1.1