11 results for Bellman-Harris Branching Processes
in Greenwich Academic Literature Archive - UK
Abstract:
We extend the Harris regularity condition for the ordinary Markov branching process to the more general case of non-linear Markov branching processes. A regularity criterion which is very easy to check is obtained. In particular, we prove that a super-linear Markov branching process is regular if and only if the per capita offspring mean is less than or equal to 1, while a sub-linear Markov branching process is regular if the per capita offspring mean is finite. The Harris regularity condition then becomes a special case of our criterion.
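As an illustration of the quantity driving this criterion, here is a minimal sketch (with a hypothetical offspring law, not taken from the paper) that computes the per capita offspring mean m = sum_k k p_k and applies the stated super-linear test m <= 1:

    # Hedged sketch: compute the per capita offspring mean for a hypothetical
    # offspring distribution and apply the super-linear regularity test (m <= 1)
    # described in the abstract.
    offspring_pmf = {0: 0.30, 1: 0.45, 2: 0.20, 3: 0.05}   # hypothetical p_k

    m = sum(k * p for k, p in offspring_pmf.items())        # per capita offspring mean
    print(f"per capita offspring mean m = {m:.2f}")
    print("criterion m <= 1 holds: a super-linear process with this law is regular"
          if m <= 1 else
          "m > 1: the super-linear regularity criterion fails")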
Abstract:
Attention has recently focussed on stochastic population processes that can undergo total annihilation followed by immigration into state j at rate α_j. The investigation of such models, called Markov branching processes with instantaneous immigration (MBPII), involves the study of existence and recurrence properties. However, results developed to date are generally opaque, and so the primary motivation of this paper is to construct conditions that are far easier to apply in practice. These turn out to be identical to the conditions for positive recurrence, which are very easy to check. We obtain, as a consequence, the surprising result that any MBPII that exists is ergodic, and so must possess an equilibrium distribution. These results are then extended to more general MBPII, and we show how to construct the associated equilibrium distributions.
Abstract:
A generalized Markov branching process (GMBP) is a Markov branching model in which the infinitesimal branching rates are modified by an interaction index. It is proved that a GMBP always exists and is unique. An associated differential-integral equation is derived. The extinction probability and the mean and conditional mean extinction times are obtained. Ergodicity and stability of GMBPs with resurrection are also considered. Easy-to-check criteria are established for ordinary and strong ergodicity. The equilibrium distribution is given in an elegant closed form. The probabilistic meaning of our results is clear and is explained.
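For orientation, a minimal sketch of the rate structure such a model typically carries, assuming (as in related literature, and not verified against this particular paper) that the ordinary branching rates i b_j are modified by a power-law interaction index θ:

    q_{i,\,i+j-1} \;=\; i^{\theta}\, b_j \qquad (i \ge 1,\ j \ge 0,\ j \ne 1), \qquad b_j \ge 0 \ (j \ne 1), \qquad 0 < -b_1 = \sum_{j \ne 1} b_j < \infty,

so that θ = 1 recovers the ordinary Markov branching process.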
Abstract:
This paper focuses on the basic problems regarding uniqueness and extinction properties for generalised Markov branching processes. The uniqueness criterion is first established and a differential–integral equation satisfied by the transition functions of such processes is derived. The extinction probability is then obtained. A closed form is presented for both the mean extinction time and the conditional mean extinction time. It turns out that these important quantities are closely related to the elementary gamma function.
Abstract:
This paper concentrates on investigating ergodicity and stability for generalised Markov branching processes with resurrection. Easy-to-check criteria, including several clear-cut corollaries, are established for ordinary and strong ergodicity of such processes. The equilibrium distribution is given in an elegant closed form for the ergodic case. The probabilistic interpretation of the results is clear and is explained.
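As a concrete illustration of the ergodic case, the following Gillespie-style sketch simulates a simple subcritical linear branching process with resurrection (all rates hypothetical, not taken from the paper) and estimates the equilibrium distribution from occupation times:

    import random

    # Hedged sketch: simulate a subcritical linear branching process with
    # resurrection and estimate its equilibrium distribution from occupation
    # times. All rates are hypothetical and only illustrate the kind of model
    # discussed in the abstract.
    BIRTH, DEATH, RESURRECT = 0.4, 1.0, 0.5   # per-capita birth/death rates, resurrection rate

    def occupation_frequencies(horizon=50_000.0, rng=random.Random(1)):
        t, state, time_in = 0.0, 0, {}
        while t < horizon:
            rate = RESURRECT if state == 0 else state * (BIRTH + DEATH)
            dt = rng.expovariate(rate)
            time_in[state] = time_in.get(state, 0.0) + dt
            t += dt
            if state == 0:
                state = 1                                   # resurrect into state 1
            elif rng.random() < BIRTH / (BIRTH + DEATH):
                state += 1                                  # one individual gives birth
            else:
                state -= 1                                  # one individual dies
        return {k: v / t for k, v in sorted(time_in.items())}

    print({k: round(v, 3) for k, v in list(occupation_frequencies().items())[:6]})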
Abstract:
This note provides a new probabilistic approach to the study of the weighted Markov branching process (WMBP), which is a natural generalisation of the ordinary Markov branching process. Using this approach, some important characteristics regarding the hitting times of such processes can be easily obtained. In particular, closed forms for the mean extinction time and the conditional mean extinction time are presented. The explosion behaviour of the process is investigated and the mean explosion time is derived. The mean global holding time and the mean total survival time are also obtained. The close link between these newly developed processes and the well-known compound Poisson processes is investigated. It is revealed that any weighted Markov branching process is a random time change of a compound Poisson process.
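The time-change picture mentioned at the end can be sketched as follows: the chain holds in state i for an exponential time whose rate is given by the weight, and then jumps by an increment whose law does not depend on the current state, exactly as the jump chain of a compound Poisson process would. The parametrisation below (weight function and jump law) is hypothetical and only illustrates the structure:

    import random

    # Hedged sketch: a compound Poisson jump chain run on a state-dependent
    # clock. The holding rate w(i) plays the role of the weight in a WMBP;
    # the jump law and the weight function below are hypothetical.
    jump_law = [(-1, 0.5), (1, 0.3), (2, 0.2)]        # increments and probabilities

    def weight(i):
        return 1.0 + 0.5 * i                           # hypothetical weight w(i) > 0

    def simulate(i0=1, horizon=10.0, rng=random.Random(0)):
        t, i, path = 0.0, i0, [(0.0, i0)]
        while t < horizon and i > 0:                   # stop at extinction for simplicity
            t += rng.expovariate(weight(i))            # state-dependent holding time
            i += rng.choices([j for j, _ in jump_law],
                             weights=[p for _, p in jump_law])[0]
            path.append((round(t, 3), i))
        return path

    print(simulate()[:6])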
Abstract:
This paper surveys recent progress made in the field of unstable denumerable Markov processes. Emphasis is placed on methodology and applications. The important tools of Feller transition functions and Resolvent Decomposition Theorems are highlighted. Their applications, particularly in unstable denumerable Markov processes with a single instantaneous state and in Markov branching processes, are illustrated.
Abstract:
A Feller–Reuter–Riley function is a Markov transition function whose corresponding semigroup maps the set of real-valued continuous functions vanishing at infinity into itself. The aim of this paper is to investigate applications of such functions in the dual problem, Markov branching processes, and the Williams-matrix. The remarkable property of a Feller–Reuter–Riley function is that it is a Feller minimal transition function with a stable q-matrix. By using this property we are able to prove that, in the theory of branching processes, the branching property is equivalent to the requirement that the corresponding transition function satisfies the Kolmogorov forward equations associated with a stable q-matrix. It follows that the probabilistic definition and the analytic definition of Markov branching processes are actually equivalent. Also, by using this property together with the Resolvent Decomposition Theorem, a simple analytical proof of Williams' existence theorem with respect to the Williams-matrix is obtained. The close link between the dual problem and Feller–Reuter–Riley transition functions is revealed. This enables us to prove that a dual transition function must satisfy the Kolmogorov forward equations. A necessary and sufficient condition for a dual transition function to satisfy the Kolmogorov backward equations is also provided.
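For reference, the branching property and the forward equations referred to here are usually written as follows (a standard formulation, not quoted from the paper), with P(t) = (p_{ij}(t)) the transition function and Q = (q_{ij}) the stable q-matrix:

    \sum_{j \ge 0} p_{ij}(t)\, s^{j} \;=\; \Bigl( \sum_{j \ge 0} p_{1j}(t)\, s^{j} \Bigr)^{i} \quad (0 \le s \le 1), \qquad p_{ij}'(t) \;=\; \sum_{k} p_{ik}(t)\, q_{kj}.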
Abstract:
By revealing close links among strong ergodicity, monotonicity, and the Feller–Reuter–Riley (FRR) property of transition functions, we prove that a monotone ergodic transition function is strongly ergodic if and only if it is not FRR. An easy-to-check criterion for a Feller minimal monotone chain to be strongly ergodic is then obtained. We further prove that a non-minimal ergodic monotone chain is always strongly ergodic. The applications of our results are illustrated using birth-and-death processes and branching processes.
Abstract:
We derive necessary and sufficient conditions for the existence of bounded or summable solutions to systems of linear equations associated with Markov chains. This substantially extends a famous result of G. E. H. Reuter, which provides a convenient means of checking various uniqueness criteria for birth-death processes. Our result allows chains with much more general transition structures to be accommodated. One application is to give a new proof of an important result of M. F. Chen concerning upwardly skip-free processes. We then use our generalization of Reuter's lemma to prove new results for downwardly skip-free chains, such as the Markov branching process and several of its many generalizations. This permits us to establish uniqueness criteria for several models, including the general birth, death, and catastrophe process, extended branching processes, and asymptotic birth-death processes, the latter being neither upwardly skip-free nor downwardly skip-free.
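As background to the birth-death case that motivates Reuter's original lemma, the sketch below evaluates truncations of the classical series R whose divergence is equivalent to regularity (no explosion), and hence uniqueness, of the minimal birth-death process; the rate choices are hypothetical examples, not drawn from the paper:

    # Hedged sketch: truncated evaluation of the classical series R for a
    # birth-death q-matrix with birth rates lam(n) (n >= 0) and death rates
    # mu(n) (n >= 1). With pi_0 = 1 and pi_n = lam(0)...lam(n-1)/(mu(1)...mu(n)),
    # the n-th term is (pi_0 + ... + pi_n)/(lam(n)*pi_n); R = infinity is the
    # classical condition for regularity and hence uniqueness. The ratio r_n
    # is computed recursively to avoid overflow.
    def R_partial(lam, mu, N):
        r, total = 1.0, 1.0 / lam(0)              # n = 0 term, with r_0 = 1
        for n in range(1, N):
            r = 1.0 + r * mu(n) / lam(n - 1)      # r_n = 1 + r_{n-1} * pi_{n-1}/pi_n
            total += r / lam(n)
        return total

    linear = lambda n: float(n + 1)               # roughly linear birth rates: R diverges
    quadratic = lambda n: float((n + 1) ** 2)     # quadratic birth rates: R stays bounded
    death = lambda n: float(n)

    for N in (100, 1_000, 10_000):
        print(N, round(R_partial(linear, death, N), 2), round(R_partial(quadratic, death, N), 4))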