6 results for Markov Processes

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

70.00%

Publisher:

Abstract:

A non-Markovian process is one that retains 'memory' of its past. A systematic understanding of these processes is necessary to fully describe and harness a vast range of complex phenomena; however, no such general characterisation currently exists. This long-standing problem has hindered advances in understanding physical, chemical and biological processes, where dubious theoretical assumptions are often made to render a dynamical description tractable. Moreover, the methods currently available to treat non-Markovian quantum dynamics are plagued by unphysical results, such as non-positive dynamics. Here we develop an operational framework to characterise arbitrary non-Markovian quantum processes. We demonstrate the universality of our framework and how the characterisation can be rendered efficient, before formulating a necessary and sufficient condition for quantum Markov processes. Finally, we stress how our framework enables the systematic analysis of non-Markovian processes, the understanding of their typicality, and the development of new master equations for the effective description of memory-bearing open-system evolution.
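
For orientation only (this is the classical, discrete-time analogue, not the operational quantum condition developed in the paper), the Markov property can be written as

$$P(X_{t+1} = x_{t+1} \mid X_t = x_t, X_{t-1} = x_{t-1}, \ldots, X_0 = x_0) = P(X_{t+1} = x_{t+1} \mid X_t = x_t),$$

and a process is non-Markovian precisely when this conditional independence from the past fails, i.e. when the future statistics retain memory of earlier states.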

Relevance:

60.00%

Publisher:

Abstract:

We provide a sufficient condition for analyticity of infinitely differentiable eigenfunctions of operators of the form $Uf(x) = \int a(x,y)\, f(b(x,y))\, \mu(dy)$ acting on functions $f\colon [u,v] \to \mathbb{C}$ (evolution operators of one-dimensional dynamical systems and Markov processes have this form). We estimate from below the region of analyticity of the eigenfunctions and apply these results to study the spectral properties of the Frobenius-Perron operator of the continued-fraction Gauss map. We prove that any infinitely differentiable eigenfunction $f$ of this Frobenius-Perron operator corresponding to a non-zero eigenvalue admits a (unique) analytic extension to the set $\mathbb{C} \setminus (-\infty, 1]$. By analysing the spectrum of the Frobenius-Perron operator in spaces of smooth functions, we significantly extend the domain of validity of the Mayer and Röpstorff asymptotic formula for the decay of correlations of the Gauss map.
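
As a rough numerical illustration of operators of this form, the sketch below (the function name, truncation level and test points are assumptions for illustration, not taken from the paper) applies the Frobenius-Perron operator of the Gauss map and checks that the Gauss density is an eigenfunction with eigenvalue 1:

```python
import numpy as np

def frobenius_perron_gauss(f, x, n_terms=2000):
    """Apply the Frobenius-Perron (transfer) operator of the Gauss map,
    (Lf)(x) = sum_{n>=1} f(1/(n+x)) / (n+x)**2, truncated at n_terms.
    This is an instance of Uf(x) = int a(x,y) f(b(x,y)) mu(dy), with the
    sum running over the inverse branches of the Gauss map."""
    n = np.arange(1, n_terms + 1)[:, None]          # inverse-branch index
    return np.sum(f(1.0 / (n + x)) / (n + x) ** 2, axis=0)

# The Gauss density h(x) = 1 / (log 2 * (1 + x)) is invariant for the
# Gauss map, so L h should reproduce h up to truncation error.
gauss_density = lambda x: 1.0 / (np.log(2.0) * (1.0 + x))

x = np.linspace(0.05, 0.95, 5)
print(frobenius_perron_gauss(gauss_density, x))
print(gauss_density(x))
```

The decay of correlations mentioned in the abstract is governed by the subleading part of this operator's spectrum.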

Relevance:

60.00%

Publisher:

Abstract:

Discrete Conditional Phase-type (DC-Ph) models consist of a process component (a survival distribution) preceded by a set of related conditional discrete variables. This paper introduces a DC-Ph model in which the conditional component is a classification tree. The approach is used to model health service capacities by better predicting service times, captured by Coxian phase-type distributions interfaced with the results of a classification tree algorithm. To illustrate the approach, a case study from the healthcare delivery domain is given, namely maternity services. The classification analysis is shown to yield good predictors of complications during childbirth. Based on the classification tree predictions, the duration of childbirth on the labour ward is then modelled as either a two- or three-phase Coxian distribution. The resulting DC-Ph model is used to calculate patient numbers, the associated bed occupancy and patient turnover, and to model the consequences of changes in risk status.
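
For concreteness, the sketch below samples lengths of stay from a two-phase Coxian phase-type distribution; the rates and progression probability are invented for illustration, whereas the paper fits its parameters from maternity data.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_coxian(rates, progress_probs, size=10000):
    """Sample durations from a Coxian phase-type distribution.

    rates[i]          : exponential sojourn rate in phase i
    progress_probs[i] : probability of moving from phase i to phase i+1
                        (otherwise the stay ends, i.e. the process absorbs)
    """
    durations = np.zeros(size)
    for k in range(size):
        t = 0.0
        for i, rate in enumerate(rates):
            t += rng.exponential(1.0 / rate)         # sojourn in phase i
            last = i == len(rates) - 1
            if last or rng.random() > progress_probs[i]:
                break                                # absorbed: stay ends
        durations[k] = t
    return durations

# Hypothetical two-phase parameters (not the fitted values from the paper).
stays = sample_coxian(rates=[2.0, 0.5], progress_probs=[0.3])
print("mean length of stay:", stays.mean())
```

Summaries of such samples (mean occupancy, turnover) are the kind of quantity the DC-Ph model is used to compute for the ward.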

Relevance:

40.00%

Publisher:

Abstract:

Markov Decision Processes (MDPs) are extensively used to encode sequences of decisions with probabilistic effects. Markov Decision Processes with Imprecise Probabilities (MDPIPs) encode sequences of decisions whose effects are modeled using sets of probability distributions. In this paper we examine the computation of Γ-maximin policies for MDPIPs using multilinear and integer programming. We discuss the application of our algorithms to “factored” models and to a recent proposal, Markov Decision Processes with Set-valued Transitions (MDPSTs), that unifies the fields of probabilistic and “nondeterministic” planning in artificial intelligence research.
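
As a toy illustration of the Γ-maximin criterion (a single decision with interval-valued outcome probabilities; the names and numbers below are invented, and the paper's multilinear and integer programming formulations handle the full sequential case), one can compare actions by their worst-case expected reward:

```python
def gamma_maximin(actions):
    """actions: {name: (reward_success, reward_failure, (p_lo, p_hi))},
    where [p_lo, p_hi] is the imprecise probability of 'success'.
    Returns the action whose worst-case expected reward is largest."""
    best_action, best_value = None, float("-inf")
    for name, (r_s, r_f, (p_lo, p_hi)) in actions.items():
        # The expected reward is linear in p, so the minimum over the
        # probability interval is attained at an endpoint.
        worst = min(p * r_s + (1 - p) * r_f for p in (p_lo, p_hi))
        if worst > best_value:
            best_action, best_value = name, worst
    return best_action, best_value

actions = {
    "conservative": (4.0, 3.0, (0.5, 0.9)),   # narrow outcome spread
    "aggressive":   (10.0, 0.0, (0.3, 0.8)),  # wide spread, uncertain p
}
print(gamma_maximin(actions))  # -> ('conservative', 3.5)
```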