Markov renewal process

Markov renewal processes are a class of random processes in probability and statistics that generalize the class of Markov jump processes. Other classes of random processes, such as Markov chains and Poisson processes, can be derived as special cases among the class of Markov renewal processes, while Markov renewal processes are special cases among the more general class of renewal processes.

Definition

[Figure: an illustration of a Markov renewal process]

In the context of a jump process that takes states in a state space $\mathrm{S}$, consider the set of random variables $(X_n, T_n)$, where $T_n$ represents the jump times and $X_n$ represents the associated states in the sequence of states (see Figure). Let the inter-arrival times be $\tau_n = T_n - T_{n-1}$. In order for the sequence $(X_n, T_n)$ to be considered a Markov renewal process, the following condition should hold:

\[
\Pr(\tau_{n+1} \le t,\ X_{n+1} = j \mid (X_0, T_0), (X_1, T_1), \ldots, (X_n = i, T_n))
  = \Pr(\tau_{n+1} \le t,\ X_{n+1} = j \mid X_n = i)
\]
for all $n \ge 1$, $t \ge 0$ and $i, j \in \mathrm{S}$.
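The condition says that, given the current state $X_n = i$, the next state and the next inter-arrival time are conditionally independent of the earlier history. Below is a minimal simulation sketch in Python; the two-state transition matrix, the holding-time distributions and the helper simulate_mrp are hypothetical illustration choices rather than anything prescribed by the definition, and for simplicity the next state and the holding time are drawn independently given the current state, which is a special case the definition allows but does not require.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Embedded Markov chain on the state space S = {0, 1}:
# P[i, j] = probability that the next state is j given the current state i.
# (Hypothetical example values.)
P = np.array([[0.3, 0.7],
              [0.6, 0.4]])

# State-dependent holding-time samplers for the inter-arrival times tau_{n+1}.
# (Hypothetical example distributions: gamma in state 0, uniform in state 1.)
holding_time = {
    0: lambda: rng.gamma(shape=2.0, scale=1.0),
    1: lambda: rng.uniform(0.5, 1.5),
}

def simulate_mrp(x0=0, n_jumps=10):
    """Return the sequence (X_n, T_n) of states and jump times."""
    states, jump_times = [x0], [0.0]
    for _ in range(n_jumps):
        i = states[-1]
        tau = holding_time[i]()              # tau_{n+1}, depends only on X_n = i
        j = int(rng.choice(len(P), p=P[i]))  # X_{n+1}, depends only on X_n = i
        states.append(j)
        jump_times.append(jump_times[-1] + tau)
    return list(zip(states, jump_times))

print(simulate_mrp())
```

Swapping the holding-time samplers for exponential distributions yields the continuous-time Markov chain special case discussed in the next section.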

Relation to other stochastic processes

  1. Let $(X_n, T_n)$ and $\tau_n$ be as defined in the previous statement. Defining a new stochastic process $Y_t := X_n$ for $t \in [T_n, T_{n+1})$, the process $Y_t$ is called a semi-Markov process, in analogy with the construction used for a continuous-time Markov chain. The process is Markovian only at the specified jump instants, justifying the name semi-Markov.[1][2][3] (See also: hidden semi-Markov model.)
  2. A semi-Markov process (as defined in the previous item) in which all the holding times are exponentially distributed is called a continuous-time Markov chain. In other words, if the inter-arrival times are exponentially distributed and the waiting time in a state is independent of the next state reached, we have a continuous-time Markov chain (see the sketch after this list).
  3. The sequence $X_n$ in the Markov renewal process is a discrete-time Markov chain. In other words, if the time variables are ignored in the Markov renewal process condition, we end up with a discrete-time Markov chain.
  4. If the sequence of $\tau_n$'s is independent and identically distributed, and if their distribution does not depend on the state $X_n$, then the process is a renewal process. So, if the states are ignored and we are left with a chain of i.i.d. inter-arrival times, we have a renewal process.
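To make these special cases concrete, the identities below restate them in terms of the kernel $Q_{ij}(t) = \Pr(\tau_{n+1} \le t,\ X_{n+1} = j \mid X_n = i)$, i.e. the right-hand side of the defining condition. This kernel notation, the transition probabilities $p_{ij}$, the rates $\lambda_i$ and the distribution $F$ are standard in the semi-Markov literature but are introduced here only for the sketch and do not appear in the article itself.

```latex
% Kernel appearing on the right-hand side of the defining condition
% (standard notation, assumed here rather than taken from the article):
%   Q_{ij}(t) = \Pr(\tau_{n+1} \le t,\ X_{n+1} = j \mid X_n = i)
\begin{align*}
  % Point 2: exponential holding time with rate \lambda_i and next state
  % drawn independently with probabilities p_{ij}
  % => continuous-time Markov chain
  Q_{ij}(t) &= p_{ij}\bigl(1 - e^{-\lambda_i t}\bigr) \\
  % Point 3: letting t \to \infty discards the times and leaves the
  % embedded discrete-time Markov chain
  p_{ij} &= \lim_{t \to \infty} Q_{ij}(t) = \Pr(X_{n+1} = j \mid X_n = i) \\
  % Point 4: holding-time distribution F independent of the current state
  % => the jump times T_n alone form a renewal process
  \sum_{j \in \mathrm{S}} Q_{ij}(t) &= F(t) \quad \text{for every } i \in \mathrm{S}
\end{align*}
```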


References

  1. Medhi, J. (1982). Stochastic Processes. New York: Wiley & Sons. ISBN 978-0-470-27000-4.
  2. Ross, Sheldon M. (1999). Stochastic Processes (2nd ed.). New York: Wiley. ISBN 978-0-471-12062-9.
  3. Barbu, Vlad Stefan; Limnios, Nikolaos (2008). Semi-Markov Chains and Hidden Semi-Markov Models toward Applications: Their Use in Reliability and DNA Analysis. New York: Springer. ISBN 978-0-387-73171-1.