Integrated Renewal Process

The marginal distribution of the integrated renewal process is derived in this paper. Our approach is based on the theory of point processes, especially Poisson point processes. The results are presented in the form of Laplace transforms. DOI: http://dx.doi.org/10.22342/jims.13.2.64.149-159


INTRODUCTION
Consider the arrival of passengers at a train station and model this situation as a renewal process; that is, the inter-arrival times between consecutive passengers are assumed to be independent and identically distributed (iid) non-negative random variables. Suppose that at a certain time (time 0) a train has just departed from the station, leaving no passengers behind. Passengers who arrive after time 0 have to wait until the departure of the next train at some time point t ≥ 0. We are interested in the waiting time of these passengers. The total waiting time of all passengers in the time interval [0, t] is an example of a stochastic process which we call an integrated renewal process. The nomenclature becomes clear from the mathematical definition of the process in Section 2.
An integrated renewal process can be considered as a generalization of a stochastic process which we call an integrated Poisson process in this paper. The expected value of the integrated Poisson process has been discussed in Ross [6]. The integrated renewal process also has a close connection with the shot noise processes discussed by Gubner [4]. For asymptotic properties of this process, see Suyono and Van der Weide (2007). In this paper we discuss the marginal distribution of this process, which is important for analyzing its probabilistic characteristics at any time point. This paper is organized as follows. In Section 2 we give a mathematical definition of an integrated renewal process. In Section 3 we consider an integrated Poisson process, including the variance and the marginal probability density function of the process. In Section 4 we derive the marginal distribution of an integrated renewal process, and in the last section we give an example.

DEFINITIONS
Let (X_n, n ≥ 1) be an iid sequence of non-negative random variables having a common distribution function F. Let S_n = X_1 + X_2 + · · · + X_n, n ≥ 1, and set S_0 = 0. The process (N(t), t ≥ 0) where N(t) = sup{n ≥ 0 : S_n ≤ t} is known as a renewal process. In the sequel we will interpret the variables X_n as the inter-arrival times of the renewal process.
Define for t ≥ 0

    Y(t) = ∫_0^t N(s) ds.    (1)

We call the stochastic process (Y(t), t ≥ 0) an integrated renewal process. As a special case, if (N(t), t ≥ 0) is a Poisson process, then we call the process (Y(t), t ≥ 0) an integrated Poisson process. Note that we can express Y(t) as

    Y(t) = t N(t) − Z(t),

where

    Z(t) = Σ_{n=1}^{N(t)} S_n.    (2)

So if we interpret S_n, n = 1, 2, 3, ... as the arrival times of passengers at a train station, then Y(t) = Σ_{n=1}^{N(t)} (t − S_n) represents the total waiting time of all passengers until the departure of a train at time t. In the next sections we will discuss the marginal distributions of Y(t) and Z(t).
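The definitions above are easy to explore in simulation. The following minimal sketch (function names are illustrative, not from the paper) samples one path of a renewal process and checks the pathwise identity Y(t) = t N(t) − Z(t), where Y(t) = Σ_{n≤N(t)} (t − S_n) is the total waiting time and Z(t) = Σ_{n≤N(t)} S_n is the sum of the arrival times:

```python
import random

def arrival_times(t, draw):
    """Arrival times S_1 < S_2 < ... that fall in [0, t] for a renewal
    process whose inter-arrival times are sampled by draw()."""
    times, s = [], 0.0
    while True:
        s += draw()
        if s > t:
            return times
        times.append(s)

random.seed(1)
t = 10.0
S = arrival_times(t, lambda: random.expovariate(0.5))  # mean inter-arrival time 2

Y = sum(t - s for s in S)   # Y(t): total waiting time, the integral of N(s) over [0, t]
Z = sum(S)                  # Z(t): sum of the arrival times up to t
N = len(S)                  # N(t): number of arrivals in [0, t]

# pathwise identity: Y(t) = t N(t) - Z(t)
assert abs(Y - (t * N - Z)) < 1e-9
```

Any inter-arrival distribution can be plugged in through `draw`; exponential inter-arrival times are used here only because they make (N(t)) a Poisson process, the special case treated next.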

INTEGRATED POISSON PROCESS
First, suppose that the process (N(t), t ≥ 0) is a homogeneous Poisson process with rate λ > 0. It is well known that given N(t) = n, the n arrival times S_1, ..., S_n have the same distribution as the order statistics corresponding to n independent random variables uniformly distributed on the time interval [0, t]; see e.g. Ross [6].
Conditioning on the number of arrivals in the time interval [0, t] we obtain

    E[e^{−αY(t)}] = Σ_{n=0}^∞ e^{−λt} ((λt)^n / n!) (E[e^{−α(t−U_1)}])^n,

where U_i, i = 1, 2, ..., n are independent random variables uniformly distributed on [0, t]. Since

    E[e^{−α(t−U_1)}] = (1 − e^{−αt}) / (αt),

it follows that

    E[e^{−αY(t)}] = exp(−λt + (λ/α)(1 − e^{−αt})).    (3)

Using a similar argument we can prove that Z(t) has the same Laplace transform as Y(t). So by the uniqueness theorem for Laplace transforms we conclude that for each t, Y(t) and Z(t) have the same distribution.
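The Laplace transform above can be checked by Monte Carlo. The sketch below (assuming (3) reads E[e^{−αY(t)}] = exp(−λt + (λ/α)(1 − e^{−αt})), as reconstructed from the conditioning argument) simulates Y(t) for a rate-λ Poisson process and compares the empirical transform with the closed form:

```python
import math
import random

def sample_Y(lam, t, rng):
    """One draw of Y(t) = sum of (t - S_i) over the arrivals S_i <= t
    of a rate-lam homogeneous Poisson process."""
    y, s = 0.0, 0.0
    while True:
        s += rng.expovariate(lam)
        if s > t:
            return y
        y += t - s

rng = random.Random(7)
lam, t, alpha, n = 1.0, 2.0, 0.5, 40000

# Monte Carlo estimate of E[exp(-alpha Y(t))]
mc = sum(math.exp(-alpha * sample_Y(lam, t, rng)) for _ in range(n)) / n

# closed form (3)
exact = math.exp(-lam * t + (lam / alpha) * (1.0 - math.exp(-alpha * t)))

assert abs(mc - exact) < 0.01
```

The same simulation with Z(t) = Σ S_i in place of Y(t) gives the same value, illustrating the equality in distribution noted above.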
The distribution of Y(t) has a mass at zero with P(Y(t) = 0) = e^{−λt}. The probability density function f_{Y(t)} of the continuous part of Y(t) can be obtained by inverting the Laplace transform in (3). Note that we can express (3) as

    e^{−λt} e^{λ/α} e^{−(λ/α) e^{−αt}}.

Inverting this transform we obtain, for x > 0, an expression for f_{Y(t)}(x) in terms of I_k(x), the modified Bessel function of the first kind, i.e.,

    I_k(x) = Σ_{m=0}^∞ (x/2)^{2m+k} / (m! (m + k)!);

see Gradshteyn and Ryzhik [2].
For large t, the distribution of Y(t) can be approximated by the normal distribution having mean (1/2)λt² and variance (1/3)λt³. To prove this we consider the characteristic function of the normalized variable (Y(t) − (1/2)λt²) / ((1/3)λt³)^{1/2}. Using (3) with α replaced by iα and expanding the exponent in powers of α, the characteristic function of the normalized variable converges, as t → ∞, to e^{−α²/2}, which is the characteristic function of the standard normal distribution.

Now consider the case where (N(t)) is a non-homogeneous Poisson process with intensity measure ν. Given N(t) = n, the arrival times S_i, i = 1, 2, ..., n have the same distribution as the order statistics of n iid random variables having the common cumulative distribution function

    F_t(x) = ν([0, x]) / ν([0, t]),  0 ≤ x ≤ t.

By conditioning on the number of arrivals in the time interval [0, t] we get

    E[e^{−αY(t)}] = exp(−∫_0^t (1 − e^{−α(t−s)}) ν(ds)).

From this Laplace transform we deduce that

    E[Y(t)] = ∫_0^t (t − s) ν(ds),  Var(Y(t)) = ∫_0^t (t − s)² ν(ds).

Similarly, we can prove that

    E[e^{−αZ(t)}] = exp(−∫_0^t (1 − e^{−αs}) ν(ds)).

Note that in general the process Y(t) has a different distribution from Z(t) when (N(t)) is a non-homogeneous Poisson process.
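The normal approximation can also be checked empirically. This sketch estimates the mean and variance of Y(t) by simulation and compares them with λt²/2 and λt³/3 (the stated moments of the integrated Poisson process):

```python
import random

def sample_Y(lam, t, rng):
    """One draw of Y(t) for a rate-lam homogeneous Poisson process."""
    y, s = 0.0, 0.0
    while True:
        s += rng.expovariate(lam)
        if s > t:
            return y
        y += t - s

rng = random.Random(3)
lam, t, n = 1.0, 5.0, 40000
ys = [sample_Y(lam, t, rng) for _ in range(n)]

mean = sum(ys) / n
var = sum((y - mean) ** 2 for y in ys) / (n - 1)

assert abs(mean - lam * t**2 / 2) < 0.5   # theoretical mean  lam t^2 / 2 = 12.5
assert abs(var - lam * t**3 / 3) < 4.0    # theoretical variance lam t^3 / 3 ~ 41.67
```

For this choice of t the histogram of `ys` is already close to a normal density with these two moments.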

INTEGRATED RENEWAL PROCESS
In this section we consider the marginal distributions of the processes (Y(t)) and (Z(t)) defined in (1) and (2) for the case that (N(t)) is a renewal process. We will assume that the inter-arrival times X_n of the renewal process (N(t)) are strictly positive. First we consider the process (Z(t)). Obviously we can express Z(t) as

    Z(t) = Σ_{n=1}^∞ S_n 1(S_n ≤ t).

We will use point processes to derive the marginal distribution of Z(t).
Let (Ω, F, P) be the probability space on which the iid sequence (X_n) is defined, together with an iid sequence (V_n, n ≥ 1) of exponentially distributed random variables with parameter 1 such that the sequences (X_n) and (V_n) are independent. Let (T_n, n ≥ 1) be the sequence of partial sums of the variables V_n. Then the map

    Φ = Σ_{n=1}^∞ δ_{(T_n, X_n)},

where δ_{(x,y)} is the Dirac measure in (x, y), defines a Poisson point process on E = [0, ∞) × [0, ∞) with intensity measure ν(ds, dx) = ds F(dx), see [5]. Let M_p(E) be the set of all point measures on E. We will denote the distribution of Φ by P_ν, i.e.,

    P_ν = P ∘ Φ^{−1}.    (5)

Define for t ≥ 0 the functional A(t) on M_p(E) by

    A(t)(µ) = ∫_{[0,t]×[0,∞)} x µ(ds, dx).    (6)

In the sequel we write A(t, µ) = A(t)(µ). Suppose that the point measure µ has the support supp(µ) = ((t_n, x_n))_{n=1}^∞ with t_1 < t_2 < . . .. It follows that µ = Σ_{n=1}^∞ δ_{(t_n, x_n)} and A(t, µ) can be expressed as

    A(t, µ) = Σ_{n : t_n ≤ t} x_n.

Note that for every t ≥ 0, A(t, µ) is almost surely finite. Define also for t ≥ 0 the functional Z(t) on M_p(E) by

    Z(t)(µ) = Σ_{n=1}^∞ A(t_n, µ) 1(A(t_n, µ) ≤ t).

The next lemma motivates the definition of Z(t).
Lemma 4.1. With probability 1, Z(t) = Z(t)(Φ) for all t ≥ 0.

Proof. Let ω ∈ Ω. Then A(T_n(ω), Φ(ω)) = X_1(ω) + · · · + X_n(ω) = S_n(ω), so that

    Z(t)(Φ(ω)) = Σ_{n=1}^∞ S_n(ω) 1(S_n(ω) ≤ t) = Z(t)(ω).

Theorem 4.1. Let (X_n, n ≥ 1) be an iid sequence of strictly positive random variables with common distribution function F. Let (S_n, n ≥ 0) be the sequence of partial sums of the variables X_n and let (N(t), t ≥ 0) be the corresponding renewal process: N(t) = sup{n ≥ 0 : S_n ≤ t}. Then for α, β > 0,

    ∫_0^∞ e^{−βt} E[e^{−αZ(t)}] dt = ((1 − F*(β))/β) Σ_{n=0}^∞ Π_{k=1}^n F*(β + kα)    (7)

(with the usual convention that the empty product equals 1), where F* denotes the Laplace-Stieltjes transform of F.

Proof. By Lemma 4.1,

    E[e^{−αZ(t)}] = ∫_{M_p(E)} e^{−αZ(t)(µ)} P_ν(dµ).
Applying the Palm formula for Poisson point processes, see Grandell [3], and then using Fubini's theorem and a substitution, we can write the integral with respect to P_ν as a sum of integrals over the sets B_n := {µ ∈ M_p(E) : µ([0, s) × [0, ∞)) = n}, n = 0, 1, 2, .... Fix a value of n and let µ ∈ M_p(E) be such that µ([0, s) × [0, ∞)) = n and supp(µ) = ((t_i, x_i))_{i=1}^∞, so that t_n < s ≤ t_{n+1}. Now the measure P_ν is the image measure of P under the map Φ, see (5). Expressing the integral with respect to P_ν over B_n as an integral with respect to P over the subset A_n := {ω ∈ Ω : T_n(ω) < s ≤ T_{n+1}(ω)} of Ω, and using the independence of (T_n) and (X_n), we obtain the product form of the theorem. Since for each n, ∫_0^∞ (s^n / n!) e^{−s} ds = 1, the theorem follows.

We can take derivatives with respect to α in (7) to find the Laplace transforms of the moments of Z(t). For example, the Laplace transforms of the first and second moments of Z(t) are given in the following proposition.
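The double transform can be cross-checked numerically in the exponential case. Under the reading that Theorem 4.1 gives ∫_0^∞ e^{−βt} E[e^{−αZ(t)}] dt = ((1 − F*(β))/β) Σ_{n≥0} Π_{k=1}^n F*(β + kα), and using the fact from Section 3 that for a homogeneous Poisson process Z(t) has the transform (3), both routes should give the same number (a sketch, with the series form as the stated assumption):

```python
import math

def F_star(s, lam=1.0):
    """Laplace-Stieltjes transform of the exp(lam) inter-arrival distribution."""
    return lam / (lam + s)

def double_transform_Z(alpha, beta, lam=1.0, terms=200):
    """Assumed Theorem 4.1 form: (1-F*(beta))/beta * sum_n prod_{k=1..n} F*(beta+k alpha)."""
    total, prod = 0.0, 1.0
    for n in range(terms):
        total += prod
        prod *= F_star(beta + (n + 1) * alpha, lam)
    return (1.0 - F_star(beta, lam)) / beta * total

def poisson_reference(alpha, beta, lam=1.0, T=40.0, h=1e-3):
    """Trapezoidal integration of e^{-beta t} E[e^{-alpha Z(t)}] dt, where for a
    rate-lam Poisson process E[e^{-alpha Z(t)}] = exp(-lam t + (lam/alpha)(1-e^{-alpha t}))."""
    n = int(T / h)
    total = 0.0
    for i in range(n + 1):
        t = i * h
        f = math.exp(-beta * t - lam * t + (lam / alpha) * (1.0 - math.exp(-alpha * t)))
        total += (0.5 if i in (0, n) else 1.0) * f
    return total * h

a = double_transform_Z(0.7, 0.9)
b = poisson_reference(0.7, 0.9)
assert abs(a - b) < 1e-3
```

The series converges quickly because the product terms decay faster than geometrically, so 200 terms is far more than needed.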
Now we consider the distribution of Y(t) when (N(t)) is a renewal process. It is easy to see that Y(t) = Σ_{n=1}^∞ (t − S_n) 1(S_n ≤ t). Define for t ≥ 0 the functional Y(t) on M_p(E) by

    Y(t)(µ) = Σ_{n=1}^∞ (t − A(t_n, µ)) 1(A(t_n, µ) ≤ t),

where A(t, µ) is defined as in (6). Then, as in Lemma 4.1, with probability 1, Y(t) = Y(t)(Φ). The following theorem can be proved using the same arguments as for Z(t), and therefore we omit the proof.
Theorem 4.2. Let (X_n, n ≥ 1) be an iid sequence of strictly positive random variables with common distribution function F. Let (S_n, n ≥ 0) be the sequence of partial sums of the variables X_n and let (N(t), t ≥ 0) be the corresponding renewal process: N(t) = sup{n ≥ 0 : S_n ≤ t}. Then for α, β > 0,

    ∫_0^∞ e^{−βt} E[e^{−αY(t)}] dt = Σ_{n=0}^∞ ((1 − F*(β + nα))/(β + nα)) Π_{k=0}^{n−1} F*(β + kα),

with the usual convention that the empty product equals 1, where F* denotes the Laplace-Stieltjes transform of F.
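This transform can be tested by Monte Carlo for a non-exponential inter-arrival distribution. On each path, Y(t) = nt − (S_1 + · · · + S_n) for S_n ≤ t < S_{n+1}, so the integral ∫_0^∞ e^{−βt} e^{−αY(t)} dt can be computed segment by segment in closed form. The sketch below (assuming the series form Σ_n ((1 − F*(β + nα))/(β + nα)) Π_{k<n} F*(β + kα)) uses Gamma inter-arrival times with shape 2 and rate 1:

```python
import math
import random

def F_star(s):
    """Laplace-Stieltjes transform of the Gamma(shape 2, rate 1) distribution."""
    return 1.0 / (1.0 + s) ** 2

def series(alpha, beta, terms=100):
    """Assumed Theorem 4.2 form: sum_n (1-F*(b+na))/(b+na) prod_{k<n} F*(b+ka)."""
    total, prod = 0.0, 1.0
    for n in range(terms):
        s = beta + n * alpha
        total += (1.0 - F_star(s)) / s * prod
        prod *= F_star(s)
    return total

def path_transform(alpha, beta, rng):
    """int_0^inf e^{-beta t} e^{-alpha Y(t)} dt for one simulated path,
    using Y(t) = n t - (S_1 + ... + S_n) on the segment [S_n, S_{n+1})."""
    total, cum, s, n = 0.0, 0.0, 0.0, 0
    while True:
        s_next = s + rng.gammavariate(2.0, 1.0)  # next arrival time
        rate = beta + n * alpha
        # exact integral of e^{-rate t + alpha cum} over [s, s_next]
        total += (math.exp(alpha * cum - rate * s)
                  - math.exp(alpha * cum - rate * s_next)) / rate
        if beta * s_next > 30:                   # remaining mass is negligible
            return total
        s = s_next
        cum += s
        n += 1

rng = random.Random(11)
alpha, beta, paths = 0.5, 1.0, 30000
mc = sum(path_transform(alpha, beta, rng) for _ in range(paths)) / paths
assert abs(mc - series(alpha, beta)) < 0.02
```

The per-segment exponent α·cum − rate·s is always at most −βS_n, so no overflow can occur even on long paths.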

AN EXAMPLE
Suppose that the inter-arrival times X_n of the renewal process have a common Gamma(m, 2) distribution with probability density function

    f(x) = m² x e^{−mx},  x > 0.

Note that if m = 2λ then X_1 has the same mean as an exponential random variable with parameter λ (exp(λ)). For m = 1, using Theorem 4.2 we obtain

    ∫_0^∞ E[Y(t)] e^{−βt} dt = 1/(β³(β + 2)),

    ∫_0^∞ E[Y(t)²] e^{−βt} dt = 2(β³ + 4β² + 8β + 6)/(β⁵(β + 2)³).
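The first-moment transform can be verified by simulation. The sketch below assumes the transform 1/(β³(β + 2)), which by partial fractions inverts to E[Y(t)] = t²/4 − t/4 + (1 − e^{−2t})/8 (this inversion is a derived assumption, not quoted from the paper), and compares it with a Monte Carlo estimate of E[Y(3)] for Gamma(1, 2) inter-arrival times, i.e. shape 2 and rate 1:

```python
import math
import random

def sample_Y(t, rng):
    """Y(t) = total waiting time for Gamma(shape 2, rate 1) inter-arrival times."""
    y, s = 0.0, 0.0
    while True:
        s += rng.gammavariate(2.0, 1.0)
        if s > t:
            return y
        y += t - s

rng = random.Random(5)
t, n = 3.0, 40000
mc = sum(sample_Y(t, rng) for _ in range(n)) / n

# assumed inversion of 1/(beta^3 (beta+2)):
exact = t**2 / 4 - t / 4 + (1.0 - math.exp(-2.0 * t)) / 8

assert abs(mc - exact) < 0.1
```

The quadratic leading term t²/4 matches the elementary renewal behaviour E[Y(t)] ≈ t²/(2µ) with mean inter-arrival time µ = 2.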
Inverting these transforms we obtain the first and second moments of Y(t), and hence the variance of Y(t). The double Laplace transform of Y(t) follows from Theorem 4.2 with F*(s) = 1/(1 + s)². The pdf of Y(t) can be approximated by first truncating the infinite sum in this transform and then inverting the truncated transform. The graph of the pdf of Y(3) for m = 1 can be seen in Figure 1 (dashed line). In this figure we also display the graph of the pdf of Y(3) when (X_n) are iid exponential random variables with parameter 0.5 (solid line). Both the Gamma(1, 2) and exp(0.5) distributions have the same mean, equal to 2.