[Figure 1. Execution time distribution function $F_D$. The horizontal axis (Time) runs from 0 to 100 in increments of 10; the vertical axis (probability) runs from 0 to 1 in increments of 0.1. The distribution function rises steeply from (0, 0) with decreasing slope, passes a probability above 0.9 near the middle of the time range, and levels off at 1 by about $t = 80$.]

The same results may be achieved with mgd, although at the cost of more computing time. In that case, use gN as in [link], but use the actual distribution for Y.

Arrival times and counting processes

Suppose we have phenomena which take place at discrete instants of time, separated by random waiting or interarrival times. These may be arrivals of customers in a store, noise pulses on a communications line, vehicles passing a position on a road, failures of a system, etc. We refer to these occurrences as arrivals and designate the times of occurrence as arrival times . A stream of arrivals may be described in three equivalent ways.

  • Arrival times : $\{S_n : 0 \le n\}$, with $0 = S_0 < S_1 < \cdots$ a.s. (basic sequence)
  • Interarrival times : $\{W_i : 1 \le i\}$, with each $W_i > 0$ a.s. (incremental sequence)

The strict inequalities imply that with probability one there are no simultaneous arrivals. The relations between the two sequences are simply

$S_0 = 0$, $\quad S_n = \sum_{i=1}^{n} W_i$ $\quad$ and $\quad W_n = S_n - S_{n-1}$ for all $n \ge 1$
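The relations above translate directly into code. A minimal sketch (the function names and sample values are illustrative, not from the text): arrival times are running sums of interarrival times, and interarrival times are successive differences of arrival times.

```python
# Sketch: converting between arrival times S_n and interarrival times W_n.

def arrival_times(W):
    """S_0 = 0 and S_n = W_1 + ... + W_n for n >= 1."""
    S = [0.0]
    for w in W:
        S.append(S[-1] + w)
    return S

def interarrival_times(S):
    """W_n = S_n - S_{n-1} for all n >= 1."""
    return [S[n] - S[n - 1] for n in range(1, len(S))]

W = [2.0, 1.5, 3.0, 0.5]          # illustrative interarrival times
S = arrival_times(W)              # [0.0, 2.0, 3.5, 6.5, 7.0]
assert interarrival_times(S) == W # the two maps invert each other
```

Either sequence determines the other, which is why the two descriptions of the arrival stream are equivalent.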

The formulation indicates the essential equivalence of the problem with that of the compound demand . The notation and terminology are changed to correspond to that customarily used in the treatment of arrival and counting processes.

The stream of arrivals may be described in a third way.

  • Counting processes : $N_t = N(t)$ is the number of arrivals in the time interval $(0, t]$. It should be clear that this is a random quantity for each nonnegative $t$. For a given $t, \omega$ the value is $N(t, \omega)$. Such a family of random variables constitutes a random process . In this case the random process is a counting process .

We thus have three equivalent descriptions for the stream of arrivals.

$\{S_n : 0 \le n\} \longleftrightarrow \{W_n : 1 \le n\} \longleftrightarrow \{N_t : 0 \le t\}$

Several properties of the counting process N should be noted:

  1. $N(t + h) - N(t)$ counts the arrivals in the interval $(t, t + h]$, $h > 0$, so that $N(t + h) \ge N(t)$ for $h > 0$.
  2. $N_0 = 0$ and for $t > 0$ we have
    $N_t = \sum_{i=1}^{\infty} I_{(0,t]}(S_i) = \max\{n : S_n \le t\} = \min\{n : S_{n+1} > t\}$
  3. For any given $\omega$, $N(\cdot, \omega)$ is a nondecreasing, right-continuous, integer-valued function defined on $[0, \infty)$, with $N(0, \omega) = 0$.
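Property 2 is simple to sketch in code. The following illustration (sample arrival times are mine, not from the text) counts the arrival times that fall in $(0, t]$, which equals $\max\{n : S_n \le t\}$ when the $S_n$ are increasing.

```python
# Sketch: N_t as the number of arrival times in (0, t].

def count_arrivals(S, t):
    """N_t = number of arrival times S_1 < S_2 < ... with 0 < S_n <= t."""
    return sum(1 for s in S if 0 < s <= t)

S = [2.0, 3.5, 6.5, 7.0]            # illustrative arrival times (S_0 = 0 omitted)
assert count_arrivals(S, 5.0) == 2  # S_1, S_2 <= 5 < S_3, so N_5 = 2
assert count_arrivals(S, 1.0) == 0  # no arrivals by t = 1
assert count_arrivals(S, 7.0) == 4  # the interval (0, t] is closed on the right
```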

The essential relationships between the three ways of describing the stream of arrivals are:

$W_n = S_n - S_{n-1}$, $\qquad \{N_t \ge n\} = \{S_n \le t\}$, $\qquad \{N_t = n\} = \{S_n \le t < S_{n+1}\}$

This implies

$P(N_t = n) = P(S_n \le t) - P(S_{n+1} \le t) = P(S_{n+1} > t) - P(S_n > t)$
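Note why the subtraction works: since $S_{n+1} \ge S_n$, the event $\{S_{n+1} \le t\}$ is contained in $\{S_n \le t\}$, and the difference of their indicators is exactly the indicator of $\{S_n \le t < S_{n+1}\} = \{N_t = n\}$. A short simulation sketch illustrates this (the exponential interarrival distribution and parameter values are my illustrative choices, not from the text); estimated from the same sample paths, the two quantities agree exactly.

```python
import random

def estimates(rate, t, n, trials=50_000, seed=42):
    """Estimate P(N_t = n) and P(S_n <= t) - P(S_{n+1} <= t) from the
    same simulated streams of iid exponential(rate) interarrival times."""
    rng = random.Random(seed)
    count_event = count_Sn = count_Sn1 = 0
    for _ in range(trials):
        s, S = 0.0, []
        for _ in range(n + 1):          # generate S_1, ..., S_{n+1}
            s += rng.expovariate(rate)
            S.append(s)
        if S[n - 1] <= t < S[n]:        # {N_t = n} = {S_n <= t < S_{n+1}}
            count_event += 1
        if S[n - 1] <= t:               # {S_n <= t}
            count_Sn += 1
        if S[n] <= t:                   # {S_{n+1} <= t}
            count_Sn1 += 1
    return count_event / trials, (count_Sn - count_Sn1) / trials

p_event, p_diff = estimates(rate=1.0, t=2.0, n=2)
assert p_event == p_diff                # the identity holds path by path
```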

Although there are many possibilities for the interarrival time distributions, we assume

$\{W_i : 1 \le i\}$ is iid, with $W_i > 0$ a.s.

Under such assumptions, the counting process is often referred to as a renewal process and the interarrival times are called renewal times . In the literature on renewal processes, it is common for the random variable to count an arrival at $t = 0$. This requires an adjustment of the expressions relating $N_t$ and the $S_i$. We use the convention above.
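The renewal assumption does not require any particular waiting-time distribution, only that the $W_i$ be iid and positive. As a sketch (the uniform interarrival distribution and all numerical values are my illustrative choices), the same counting logic applies unchanged:

```python
import random

def renewal_count(interarrival, t):
    """N_t for a renewal process: draw iid positive waiting times from
    interarrival() and accumulate S_n until the sum exceeds t."""
    s, n = 0.0, 0
    while True:
        s += interarrival()
        if s > t:
            return n        # number of arrivals with S_n <= t
        n += 1

rng = random.Random(3)
draw = lambda: rng.uniform(0.5, 1.5)   # iid uniform waiting times, W_i > 0 a.s.
counts = [renewal_count(draw, 10.0) for _ in range(20_000)]

# Mean waiting time is 1, so roughly 10 arrivals by t = 10 on average.
avg = sum(counts) / len(counts)
assert 9.0 < avg < 11.0
```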

Exponential iid interarrival times

The case of exponential interarrival times is natural in many applications and leads to important mathematical results. We utilize the following propositions about the arrival times $S_n$, the interarrival times $W_i$, and the counting process $N$.

  1. If $\{W_i : 1 \le i\}$ is iid exponential $(\lambda)$, then $S_n \sim$ gamma $(n, \lambda)$ for all $n \ge 1$. This is worked out in the unit on TRANSFORM METHODS, in the discussion of the connection between the gamma distribution and the exponential distribution.
  2. $S_n \sim$ gamma $(n, \lambda)$ for all $n \ge 1$, and $S_0 = 0$, iff $N_t \sim$ Poisson $(\lambda t)$ for all $t > 0$. This follows from the result in the unit DISTRIBUTION APPROXIMATIONS on the relationship between the Poisson and gamma distributions, along with the fact that $\{N_t \ge n\} = \{S_n \le t\}$.
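Proposition 2 can be checked numerically. In the sketch below (the values $\lambda = 1.5$, $t = 2.0$ are illustrative choices, not from the text), counts generated from iid exponential interarrival times are compared with the Poisson $(\lambda t)$ mean and pmf.

```python
import math
import random

def sample_N_t(rate, t, rng):
    """Draw one value of N_t with iid exponential(rate) interarrival times."""
    s, count = 0.0, 0
    while True:
        s += rng.expovariate(rate)
        if s > t:
            return count
        count += 1

rng = random.Random(7)
rate, t, trials = 1.5, 2.0, 100_000
counts = [sample_N_t(rate, t, rng) for _ in range(trials)]

mu = rate * t                       # Poisson mean: lambda * t = 3.0
est_mean = sum(counts) / trials
assert abs(est_mean - mu) < 0.05    # sample mean close to lambda * t

# Compare P(N_t = 2) with the Poisson pmf e^{-mu} mu^2 / 2!
p_hat = counts.count(2) / trials
p_theory = math.exp(-mu) * mu**2 / 2
assert abs(p_hat - p_theory) < 0.01
```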

Source:  OpenStax, Applied probability. OpenStax CNX. Aug 31, 2009 Download for free at http://cnx.org/content/col10708/1.6
