6 results for Denumerable-markov-processes

at University of Queensland eSpace - Australia


Relevance:

100.00%

Publisher:

Abstract:

Let Q be a stable and conservative Q-matrix over a countable state space S consisting of an irreducible class C and a single absorbing state 0 that is accessible from C. Suppose that Q admits a finite μ-subinvariant measure m on C. We derive necessary and sufficient conditions for there to exist a Q-process for which m is μ-invariant on C, as well as a necessary condition for the uniqueness of such a process.
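The μ-subinvariance condition can be made concrete with a toy example of our own (not from the paper): for a linear birth-death chain on C = {1, 2, ...} absorbed at 0, with birth rate a below death rate b, the finite measure m_j = (a/b)^j satisfies Σ_{i∈C} m_i q_ij ≤ −μ m_j with μ = 0. A minimal numerical check of that inequality on a truncated state space:

```python
a, b, mu = 1.0, 2.0, 0.0                  # birth rate, death rate, decay parameter
N = 200                                   # truncation level for the check
m = [(a / b) ** j for j in range(N + 2)]  # candidate measure m_j = (a/b)^j

def flux(j):
    """Sum over i in C of m_i * q_ij, for j >= 1 (state 0 is absorbing)."""
    total = -(a + b) * m[j]               # diagonal entry q_jj = -(a + b)
    if j >= 2:
        total += a * m[j - 1]             # birth into j from j - 1
    total += b * m[j + 1]                 # death into j from j + 1
    return total

# mu-subinvariance on C: sum_i m_i q_ij <= -mu * m_j for every j in C
ok = all(flux(j) <= -mu * m[j] + 1e-12 for j in range(1, N))
print(ok)
```

The inequality is strict at j = 1 (mass is lost to the absorbing state) and holds with equality at j ≥ 2, which is exactly the pattern subinvariance allows.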

Relevance:

90.00%

Publisher:

Abstract:

A new structure in which instantaneous resurrection and mass disaster are imposed on an ordinary birth-death process is considered. Under the condition that the underlying birth-death process is exit or bilateral, we are able to give easily checked existence criteria for such Markov processes. A very simple uniqueness criterion is also established. All honest processes are explicitly constructed, and their ergodicity properties are investigated. Surprisingly, it can be proved that all honest processes are not only recurrent but also ergodic, without imposing any extra conditions. Equilibrium distributions are then established. Symmetry and reversibility of such processes are also investigated, and several examples are provided to illustrate our results.
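As a rough illustration only (not the paper's analytic construction), one can simulate a finite-rate stand-in for this structure: a birth-death chain hit by disasters that send it to 0, with resurrection from 0 at a large finite rate in place of the paper's instantaneous resurrection, which has no direct simulation. All rates below are arbitrary choices of ours; the ergodic occupation-time average sketches the equilibrium distribution:

```python
import random

random.seed(1)
a, b, d, r = 1.0, 1.5, 0.2, 50.0    # birth, death, disaster, resurrection rates
T, t, x = 5000.0, 0.0, 1            # time horizon, clock, initial state
time_at = {}                        # occupation time per state
while t < T:
    if x == 0:
        rate, moves = r, [(1, r)]                 # resurrect into state 1
    else:
        moves = [(x + 1, a), (x - 1, b), (0, d)]  # birth, death, disaster
        rate = a + b + d
    dt = random.expovariate(rate)   # exponential holding time
    time_at[x] = time_at.get(x, 0.0) + dt
    t += dt
    u = random.uniform(0.0, rate)   # pick the next move proportionally to rate
    for nxt, w in moves:
        if u < w:
            x = nxt
            break
        u -= w
pi = {k: v / T for k, v in sorted(time_at.items())}
print({k: round(v, 3) for k, v in pi.items()})
```

With instantaneous resurrection the time spent at 0 would be exactly zero; here it is merely small, shrinking as the stand-in rate r grows.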

Relevance:

90.00%

Publisher:

Abstract:

Let (Φ(t))_{t ∈ ℝ+} be a Harris ergodic continuous-time Markov process on a general state space, with invariant probability measure π. We investigate the rates of convergence of the transition function P^t(x, ·) to π; specifically, we find conditions under which r(t)‖P^t(x, ·) − π‖ → 0 as t → ∞, for suitable subgeometric rate functions r(t), where ‖·‖ denotes the usual total variation norm for a signed measure. We derive sufficient conditions for the convergence to hold, in terms of the existence of suitable points on which the first hitting time moments are bounded. In particular, for stochastically ordered Markov processes, explicit bounds on subgeometric rates of convergence are obtained. These results are illustrated in several examples.
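The quantity ‖P^t(x, ·) − π‖ can be made explicit on a two-state chain, where the transition function is known in closed form (the convergence there is geometric rather than subgeometric, so this only illustrates the total variation distance itself; the rates a, b are our own):

```python
import math

a, b = 2.0, 3.0                     # jump rates 1 -> 2 and 2 -> 1
pi = (b / (a + b), a / (a + b))     # invariant probability measure

def tv_from_state1(t):
    # exact two-state transition row: p_11(t) = pi_1 + pi_2 * e^{-(a+b)t}
    e = math.exp(-(a + b) * t)
    p = (pi[0] + pi[1] * e, pi[1] - pi[1] * e)
    return abs(p[0] - pi[0]) + abs(p[1] - pi[1])   # total variation norm

for t in (0.0, 0.5, 1.0, 2.0):
    print(t, tv_from_state1(t))     # decays like 2 * pi_2 * e^{-(a+b)t}
```

Subgeometric rate functions r(t), such as polynomials, matter precisely for chains where no such exponential bound holds.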

Relevance:

30.00%

Publisher:

Abstract:

We derive necessary and sufficient conditions for the existence of bounded or summable solutions to systems of linear equations associated with Markov chains. This substantially extends a famous result of G. E. H. Reuter, which provides a convenient means of checking various uniqueness criteria for birth-death processes. Our result allows chains with much more general transition structures to be accommodated. One application is to give a new proof of an important result of M. F. Chen concerning upwardly skip-free processes. We then use our generalization of Reuter's lemma to prove new results for downwardly skip-free chains, such as the Markov branching process and several of its many generalizations. This permits us to establish uniqueness criteria for several models, including the general birth, death, and catastrophe process, extended branching processes, and asymptotic birth-death processes, the latter being neither upwardly skip-free nor downwardly skip-free.
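One classical uniqueness criterion of the kind Reuter's result makes checkable is the birth-death series test: with π_0 = 1 and π_n = (λ_0···λ_{n−1})/(μ_1···μ_n), the Q-process is unique if and only if R = Σ_n (λ_n π_n)^{−1} Σ_{k≤n} π_k diverges. A numerical sketch comparing partial sums for two rate choices (both choices are ours, and finite partial sums can only suggest divergence or convergence):

```python
def R_partial(birth, death, N):
    """Partial sums of R = sum_n (1 / (l_n pi_n)) * sum_{k<=n} pi_k."""
    pi_n, S, terms = 1.0, 1.0, []         # pi_0 = 1; S tracks sum_{k<=n} pi_k
    for n in range(N):
        terms.append(S / (birth(n) * pi_n))
        pi_n *= birth(n) / death(n + 1)   # pi_{n+1} = pi_n * l_n / m_{n+1}
        S += pi_n
    return sum(terms), terms

# Linear rates l_n = n + 1, m_n = n: every term is exactly 1, so the
# partial sums grow without bound -> R diverges -> the process is unique.
R1, t1 = R_partial(lambda n: n + 1.0, lambda n: float(n), 50)
# Cubic birth, constant death: terms ~ (n + 1)^(-3), so R converges ->
# the minimal process explodes and the Q-process is not unique.
R2, t2 = R_partial(lambda n: (n + 1.0) ** 3, lambda n: 1.0, 50)
print(t1[-1], t2[-1])
```

The abstract's point is that Reuter's lemma, suitably generalized, yields checks of this kind for transition structures well beyond the skip-free birth-death setting.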

Relevance:

30.00%

Publisher:

Abstract:

Let S be a countable set and let Q = (q_ij, i, j ∈ S) be a conservative q-matrix over S with a single instantaneous state b. Suppose that we are given a real number μ ≥ 0 and a strictly positive probability measure m = (m_j, j ∈ S) such that Σ_{i∈S} m_i q_ij = −μ m_j, j ≠ b. We prove that there exists a Q-process P(t) = (p_ij(t), i, j ∈ S) for which m is a μ-invariant measure, that is, Σ_{i∈S} m_i p_ij(t) = e^{−μt} m_j, j ∈ S. We illustrate our results with reference to the Kolmogorov 'K1' chain and a birth-death process with catastrophes and instantaneous resurrection.
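The identity Σ_{i} m_i p_ij(t) = e^{−μt} m_j can be checked numerically on a finite toy chain (the paper's q-matrix has an instantaneous state, which no finite matrix captures, so this only illustrates the identity, not the paper's construction): on C = {1, 2} with killing into an absorbing state 0, the relation m Q_C = −μ m forces m·exp(tQ_C) = e^{−μt} m. The matrix, m, and μ below are hand-picked so that the eigenvalue relation holds exactly:

```python
import math

Q = [[-3.0, 1.0],          # state 1: to 2 at rate 1, absorbed into 0 at rate 2
     [2.0, -2.0]]          # state 2: to 1 at rate 2
m, mu = [1.0, 1.0], 1.0    # m Q = -mu m (verified below); eigenvalue is -1

# check the q-matrix relation sum_i m_i q_ij = -mu * m_j on C
for j in range(2):
    assert abs(sum(m[i] * Q[i][j] for i in range(2)) + mu * m[j]) < 1e-12

def expm2(A, terms=60):
    """Truncated Taylor series for exp(A); adequate for a small 2x2 matrix."""
    out = [[1.0, 0.0], [0.0, 1.0]]
    term = [[1.0, 0.0], [0.0, 1.0]]
    for n in range(1, terms):
        term = [[sum(term[i][k] * A[k][j] for k in range(2)) / n
                 for j in range(2)] for i in range(2)]
        out = [[out[i][j] + term[i][j] for j in range(2)] for i in range(2)]
    return out

t = 0.7
P = expm2([[t * q for q in row] for row in Q])   # p_ij(t) on C
lhs = [sum(m[i] * P[i][j] for i in range(2)) for j in range(2)]
rhs = [math.exp(-mu * t) * mj for mj in m]
print(all(abs(x - y) < 1e-9 for x, y in zip(lhs, rhs)))
```

Here μ > 0 is possible only because mass leaks into the absorbing state; for a finite conservative chain with no leakage, summing the relation over j forces μ = 0.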