969 results for STOCHASTIC PROCESSES


Relevance: 20.00%

Publisher:

Abstract:

This paper proposes the Clinical Pathway Analysis Method (CPAM) approach that enables the extraction of valuable organisational and medical information on past clinical pathway executions from the event logs of healthcare information systems. The method deals with the complexity of real-world clinical pathways by introducing a perspective-based segmentation of the date-stamped event log. CPAM enables the clinical pathway analyst to effectively and efficiently acquire a profound insight into the clinical pathways. By comparing the specific medical conditions of patients with the factors used for characterising the different clinical pathway variants, the medical expert can identify the best therapeutic option. Process mining-based analytics enables the acquisition of valuable insights into clinical pathways, based on the complete audit traces of previous clinical pathway instances. Additionally, the methodology is suited to assess guideline compliance and analyse adverse events. Finally, the methodology provides support for eliciting tacit knowledge and providing treatment selection assistance.

Relevance: 20.00%

Publisher:

Abstract:

Background: Biochemical systems with relatively low numbers of components must be simulated stochastically in order to capture their inherent noise. Although there has recently been considerable work on discrete stochastic solvers, there is still a need for numerical methods that are both fast and accurate. The Bulirsch-Stoer method is an established method for solving ordinary differential equations that possesses both of these qualities. Results: In this paper, we present the Stochastic Bulirsch-Stoer method, a new numerical method for simulating discrete chemical reaction systems, inspired by its deterministic counterpart. It achieves excellent efficiency because it builds on an approach of high deterministic order, allowing larger stepsizes and hence faster simulations. We compare it to the Euler τ-leap, as well as two more recent τ-leap methods, on a number of example problems, and find that, as well as being very accurate, our method is the most robust, in terms of efficiency, of all the methods considered here. It is best suited to problems with large populations that would be too slow to simulate using Gillespie’s stochastic simulation algorithm; for such problems, it is likely to achieve higher weak order in the moments. Conclusions: The Stochastic Bulirsch-Stoer method is a novel stochastic solver that can be used for fast and accurate simulations. Crucially, compared with other similar methods, it better retains its high accuracy when the timestep is increased. The method is thus both computationally efficient and robust, key properties for any stochastic numerical method, since many thousands of simulations must typically be run.
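For orientation, the Euler τ-leap baseline mentioned above advances every reaction channel over a fixed step τ by drawing Poisson-distributed firing counts from the current propensities. The sketch below illustrates only that baseline on a hypothetical birth-death system; the species, rate constants and step size are assumptions made for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical birth-death system:  0 -> X (rate k1),  X -> 0 (rate k2 * X)
k1, k2 = 10.0, 0.1
stoich = np.array([+1, -1])          # state change of X for each reaction channel


def propensities(x):
    return np.array([k1, k2 * x])


def euler_tau_leap(x0, t_end, tau):
    """Advance the state with fixed-step Euler tau-leaping."""
    x, t = x0, 0.0
    while t < t_end:
        a = propensities(x)
        firings = rng.poisson(a * tau)           # firings of each channel in (t, t + tau]
        x = max(x + int(stoich @ firings), 0)    # forbid negative populations
        t += tau
    return x


print(euler_tau_leap(x0=0, t_end=100.0, tau=0.01))
```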

Relevance: 20.00%

Publisher:

Abstract:

Since we still know very little about stem cells in their natural environment, it is useful to explore their dynamics through modelling and simulation, as well as experimentally. Most models of stem cell systems are based on deterministic differential equations that ignore the natural heterogeneity of stem cell populations. This is not appropriate at the level of individual cells and niches, where randomness is more likely to affect dynamics. In this paper, we introduce a fast stochastic method for simulating, over time, a metapopulation of stem cell niche lineages, that is, many sub-populations that together form a heterogeneous metapopulation. By selecting a common limiting timestep, our method ensures that the entire metapopulation is simulated synchronously. This is important, as it allows us to introduce interactions between separate niche lineages, which would otherwise be impossible. We expand our method to enable the coupling of many lineages into niche groups, where differentiated cells are pooled within each niche group. Using this method, we explore the dynamics of the haematopoietic system from a demand control system perspective. We find that coupling niche lineages together allows the organism to regulate blood cell numbers as closely as possible to the homeostatic optimum. Furthermore, coupled lineages respond better than uncoupled ones to random perturbations, here the loss of some myeloid cells. This could imply that it is advantageous for an organism to connect its niche lineages into groups. Our results suggest that a potentially fruitful empirical direction will be to understand how stem cell descendants communicate with the niche, and how cancer may arise as a result of a failure of such communication.
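A minimal sketch of the synchronisation idea described above: at each iteration every lineage proposes a candidate step, the smallest one is adopted as the common limiting timestep, and all lineages advance together so that cross-lineage coupling remains well defined. The lineage structure, rates and update rules here are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical metapopulation: each lineage holds (stem cells, differentiated cells).
lineages = [{"stem": 100, "diff": 500} for _ in range(8)]


def candidate_tau(lin):
    """Largest step this lineage could take alone (illustrative leap condition)."""
    total_rate = 0.1 * lin["stem"] + 0.05 * lin["diff"] + 1e-9
    return 1.0 / total_rate


def step(lin, tau):
    """Advance one lineage by the shared timestep tau (tau-leap style update)."""
    births = rng.poisson(0.1 * lin["stem"] * tau)     # stem cell divisions
    deaths = rng.poisson(0.05 * lin["diff"] * tau)    # differentiated cell loss
    lin["stem"] = max(lin["stem"] + births - births // 2, 0)
    lin["diff"] = max(lin["diff"] + births // 2 - deaths, 0)


t, t_end = 0.0, 10.0
while t < t_end:
    tau = min(candidate_tau(lin) for lin in lineages)   # common limiting timestep
    for lin in lineages:
        step(lin, tau)                                  # all lineages advance together
    # Every lineage now sits at the same time t + tau, so coupling is possible,
    # e.g. pooling differentiated cells across a niche group:
    pool = sum(lin["diff"] for lin in lineages)
    t += tau

print(pool)
```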

Relevance: 20.00%

Publisher:

Abstract:

The care processes of healthcare providers are typically considered to be human-centric, flexible, evolving, complex and multi-disciplinary. Consequently, acquiring insight into the dynamics of these care processes can be an arduous task. This study presents a novel event-log-based approach for extracting valuable medical and organizational information on past executions of care processes. Care processes are analyzed with the help of a preferential set of process mining techniques in order to discover recurring patterns, analyze and characterize process variants, and identify adverse medical events.
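As a concrete illustration of variant analysis on an event log (the toy records and activity names below are invented for illustration, not drawn from the study), the snippet groups date-stamped events into per-patient traces and counts how often each distinct activity sequence, i.e. each process variant, occurs.

```python
from collections import Counter, defaultdict

# Toy event log: (case_id, activity, timestamp). Real logs would come from a
# healthcare information system; these records are purely illustrative.
event_log = [
    ("p1", "admission", 1), ("p1", "imaging", 2), ("p1", "surgery", 3), ("p1", "discharge", 4),
    ("p2", "admission", 1), ("p2", "surgery", 2), ("p2", "discharge", 3),
    ("p3", "admission", 1), ("p3", "imaging", 2), ("p3", "surgery", 3), ("p3", "discharge", 4),
]

# Group events into traces (one time-ordered activity sequence per patient case).
traces = defaultdict(list)
for case_id, activity, ts in sorted(event_log, key=lambda e: (e[0], e[2])):
    traces[case_id].append(activity)

# A "variant" is a distinct activity sequence; counting them reveals recurring patterns.
variants = Counter(tuple(seq) for seq in traces.values())
for variant, count in variants.most_common():
    print(count, " -> ".join(variant))
```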

Relevance: 20.00%

Publisher:

Abstract:

In order to simulate stiff biochemical reaction systems, an explicit exponential Euler scheme is derived for multidimensional, non-commutative stochastic differential equations with a semilinear drift term. The scheme is of strong order one half and A-stable in mean square. The combination of this scheme with the projection method shows good performance in numerical experiments dealing with an alternative formulation of the chemical Langevin equation for a human ether-à-go-go-related gene ion channel model.
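For orientation, one common form of an explicit exponential Euler step for a semilinear SDE is shown below (illustrative notation; the scheme derived in the paper may differ in detail).

```latex
% Semilinear SDE and an illustrative exponential Euler step with stepsize h.
\[
  dX_t = \bigl(A X_t + f(X_t)\bigr)\,dt + \sum_{k=1}^{m} g_k(X_t)\,dW_t^{(k)},
\]
\[
  X_{n+1} = e^{A h}\Bigl(X_n + h\, f(X_n) + \sum_{k=1}^{m} g_k(X_n)\,\Delta W_n^{(k)}\Bigr),
  \qquad \Delta W_n^{(k)} \sim \mathcal{N}(0, h).
\]
```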

Relevance: 20.00%

Publisher:

Abstract:

We present an algorithm for multiarmed bandits that achieves almost optimal performance in both stochastic and adversarial regimes without prior knowledge about the nature of the environment. Our algorithm is based on augmentation of the EXP3 algorithm with a new control lever in the form of exploration parameters that are tailored individually for each arm. The algorithm simultaneously applies the “old” control lever, the learning rate, to control the regret in the adversarial regime and the new control lever to detect and exploit gaps between the arm losses. This secures problem-dependent “logarithmic” regret when gaps are present without compromising on the worst-case performance guarantee in the adversarial regime. We show that the algorithm can exploit both the usual expected gaps between the arm losses in the stochastic regime and deterministic gaps between the arm losses in the adversarial regime. The algorithm retains a “logarithmic” regret guarantee in the stochastic regime even when some observations are contaminated by an adversary, as long as, on average, the contamination does not reduce the gap by more than half. Our results for the stochastic regime are supported by experimental validation.
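A rough sketch of the structure described above: EXP3-style exponential weights controlled by a learning rate, plus per-arm exploration terms driven by estimated gaps. The tuning constants, the gap estimator and the toy environment below are placeholder assumptions; the paper's actual parameter choices and analysis are considerably more careful.

```python
import numpy as np

rng = np.random.default_rng(2)
K, T = 5, 10_000
loss_hat = np.zeros(K)                          # importance-weighted cumulative loss estimates

for t in range(1, T + 1):
    eta = np.sqrt(np.log(K) / (t * K))          # "old" lever: the learning rate
    gaps = loss_hat / t - (loss_hat / t).min()  # crude per-arm gap estimates
    xi = np.log(t + 1) / (t * np.maximum(gaps, 1e-3) ** 2)
    eps = np.minimum(0.5 / K, xi)               # "new" lever: per-arm exploration rates

    w = np.exp(-eta * (loss_hat - loss_hat.min()))   # exponential weights (shifted for stability)
    p = (1.0 - eps.sum()) * w / w.sum() + eps        # mix exploitation with per-arm exploration

    arm = rng.choice(K, p=p)
    loss = rng.random() * (0.2 if arm == 0 else 1.0)  # toy losses: arm 0 has the smallest mean
    loss_hat[arm] += loss / p[arm]                    # unbiased importance-weighted update

print(p.round(3))   # the played distribution should concentrate on the best arm
```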

Relevance: 20.00%

Publisher:

Abstract:

Crashes at any particular transport network location consist of a chain of events arising from a multitude of potential causes and/or contributing factors whose nature is likely to reflect geometric characteristics of the road, spatial effects of the surrounding environment, and human behavioural factors. It is postulated that these potential contributing factors do not arise from the same underlying risk process, and thus should be explicitly modelled and understood. The state of the practice in road safety network management applies a safety performance function (SPF) that represents a single risk process to explain crash variability across network sites. This study aims to elucidate the importance of differentiating among the various underlying risk processes contributing to the observed crash count at any particular network location. To demonstrate the principle of this theoretical and corresponding methodological approach, the study explores engineering factors (e.g. segment length, speed limit) and unobserved spatial factors (e.g. climatic factors, presence of schools) as two explicit sources of crash contributing factors. A Bayesian Latent Class (BLC) analysis is used to explore these two sources and to incorporate prior information about their contribution to crash occurrence. The methodology is applied to the state-controlled roads in Queensland, Australia, and the results are compared with the traditional Negative Binomial (NB) model. A comparison of goodness-of-fit measures indicates that the model with a double risk process outperforms the single-risk-process NB model, indicating the need for further research to capture all three crash generation processes in the SPFs.
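In generic notation (illustrative only, not the exact specification estimated in the study), the contrast is between a single-risk-process SPF and a latent-class mixture in which each class carries its own risk process:

```latex
% Single-process SPF for crash counts y_i at site i, versus a two-class latent mixture.
\[
  y_i \sim \mathrm{NB}(\mu_i, \phi), \qquad
  \mu_i = \exp\!\bigl(\beta_0 + \beta_1 \ln L_i + \beta_2\,\mathrm{speed}_i + \cdots\bigr),
\]
\[
  p(y_i) = \sum_{k=1}^{2} \pi_k\, \mathrm{NB}\!\bigl(y_i \mid \mu_{ik}, \phi_k\bigr),
  \qquad \mu_{ik} = \exp\!\bigl(\mathbf{x}_{ik}^{\top}\boldsymbol{\beta}_k\bigr),
  \qquad \sum_{k} \pi_k = 1,
\]
% where one class collects engineering factors (segment length, speed limit) and the
% other collects unobserved spatial factors, with priors placed on the class weights.
```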

Relevance: 20.00%

Publisher:

Abstract:

This chapter interrogates what recognition of prior learning (RPL) can and does mean in the higher education sector—a sector in the grip of the widening participation agenda and an open access age. The chapter discusses how open learning is making inroads into recognition processes and examines two studies in open learning recognition. A case study relating to e-portfolio-style RPL for entry into a Graduate Certificate in Policy and Governance at a metropolitan university in Queensland is described. In the first instance, candidates who do not possess a relevant Bachelor degree need to demonstrate skills in governmental policy work in order to be eligible to gain entry to a Graduate Certificate (at Australian Qualifications Framework Level 8) (Australian Qualifications Framework Council, 2013, p. 53). The chapter acknowledges the benefits and limitations of recognition in open learning and those of more traditional RPL, anticipating future developments in both (or their convergence).

Relevance: 20.00%

Publisher:

Abstract:

The power to influence others in ever-expanding social networks in the new knowledge economy is tied to capabilities with digital media production. This chapter draws on research in elementary classrooms to examine the repertoires of cross-disciplinary knowledge that literacy learners need to produce innovative digital media via the “social web”. It focuses on the knowledge processes that occurred when elementary students engaged in multimodal text production with new digital media. It draws on Kalantzis and Cope’s (2008) heuristic for theorizing “Knowledge Processes” in the Learning by Design approach to pedagogy. Learners demonstrate eight “Knowledge Processes” across different subject domains, skills areas, and sensibilities. Drawing on data from media-based lessons across several classrooms and schools, this chapter examines what kinds of knowledge students utilize when they produce digital, multimodal texts in the classroom. The Learning by Design framework is used as an analytic tool to theorize how students learn when they engage in a specific domain of learning: digital media production.

Relevance: 20.00%

Publisher:

Abstract:

This article describes a maximum likelihood method for estimating the parameters of the standard square-root stochastic volatility model and a variant of the model that includes jumps in equity prices. The model is fitted to data on the S&P 500 Index and the prices of vanilla options written on the index, for the period 1990 to 2011. The method is able to estimate both the parameters of the physical measure (associated with the index) and the parameters of the risk-neutral measure (associated with the options), including the volatility and jump risk premia. The estimation is implemented using a particle filter whose efficacy is demonstrated under simulation. The computational load of this estimation method, which previously has been prohibitive, is managed by the effective use of parallel computing using graphics processing units (GPUs). The empirical results indicate that the parameters of the models are reliably estimated and consistent with values reported in previous work. In particular, both the volatility risk premium and the jump risk premium are found to be significant.
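For reference, the square-root stochastic volatility model, with the jump-in-prices variant, is conventionally written along the following lines (notation and parameterisation vary across papers; this is shown for orientation rather than as the paper's exact specification):

```latex
\[
  \frac{dS_t}{S_t} = \mu\,dt + \sqrt{V_t}\,dW_t^{S} + \bigl(e^{J} - 1\bigr)\,dN_t,
  \qquad
  dV_t = \kappa\bigl(\theta - V_t\bigr)\,dt + \sigma_v \sqrt{V_t}\,dW_t^{V},
\]
\[
  d\langle W^{S}, W^{V}\rangle_t = \rho\,dt,
  \qquad N_t \sim \mathrm{Poisson}(\lambda t),
  \qquad J \sim \mathcal{N}(\mu_J, \sigma_J^2),
\]
% with separate parameter sets under the physical and risk-neutral measures, their
% difference defining the volatility and jump risk premia.
```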

Relevance: 20.00%

Publisher:

Abstract:

We have studied two-person stochastic differential games with multiple modes. For the zero-sum game, we have established the existence of optimal strategies for both players. For the nonzero-sum case, we have proved the existence of a Nash equilibrium.

Relevance: 20.00%

Publisher:

Abstract:

The sputter deposition of YBa2Cu3O7-x in a dc diode was performed in a pure oxygen medium, and an optical spectroscopic study of the resultant discharge revealed strong emissions from both metal atoms and oxygen ions. Emission intensities were studied over the pressure range 0.5 to 3 mbar, with substrate temperatures from 150 to 850 degrees C. Raising the substrate temperature to 850 degrees C increased the number of positive ions and excited neutral atoms, while raising the pressure decreased the emission intensities of excited neutral and ionic species. The results have been compared with those obtained from Langmuir probe measurements. The rise in the emission intensities of excited neutrals and ions with temperature suggested the possibility of chemically enhanced physical sputtering of YBa2Cu3O7-x. The effect of process conditions on film composition and quality is also discussed.

Relevance: 20.00%

Publisher:

Abstract:

Proteins are polymerized by cyclic machines called ribosomes, which use their messenger RNA (mRNA) track also as the corresponding template; the process is called translation. We explore, in depth and detail, the stochastic nature of translation. We compute various distributions associated with the translation process; one of them, namely the dwell-time distribution, has been measured in recent single-ribosome experiments. The form of the distribution that best fits our simulation data is consistent with that extracted from the experimental data. For our computations, we use a model that captures both the mechanochemistry of each individual ribosome and the steric interactions between ribosomes. We also demonstrate the effects of the sequence inhomogeneities of real genes on the fluctuations and noise in translation. Finally, inspired by recent advances in the experimental techniques for manipulating single ribosomes, we make theoretical predictions on the force-velocity relation for individual ribosomes. In principle, all our predictions can be tested by carrying out in vitro experiments.
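As a minimal illustration (not the authors' model, which also includes mechanochemical detail and steric interactions between ribosomes) of how a dwell-time distribution arises when a ribosome must complete several sequential stochastic sub-steps at each codon, the sketch below samples dwell times as sums of exponential waiting times; the rates are placeholder values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative only: the dwell time at one codon modelled as the sum of exponentially
# distributed waiting times for sequential kinetic sub-steps (placeholder rates, per second).
rates = [25.0, 5.0, 10.0]   # e.g. tRNA selection, peptidyl transfer, translocation

n_samples = 100_000
dwell = sum(rng.exponential(1.0 / k, size=n_samples) for k in rates)

# The resulting samples follow a hypoexponential dwell-time distribution.
print(f"mean dwell time: {dwell.mean():.3f} s (expected {sum(1.0 / k for k in rates):.3f} s)")
```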

Relevance: 20.00%

Publisher:

Abstract:

Here we find, through computer simulations and theoretical analysis, that the low-temperature thermodynamic anomalies of liquid water arise from the intermittent fluctuation between its high-density and low-density forms, consisting largely of 5-coordinated and 4-coordinated water molecules, respectively. The fluctuations exhibit strong dynamic heterogeneity (defined by the four-point time correlation function), accompanied by a divergence-like growth of the dynamic correlation length of the type encountered in fragile supercooled liquids. The intermittency is explained by invoking a two-state model often employed to understand stochastic resonance, with the relevant periodic perturbation provided here by the fluctuation of the total volume of the system.
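For orientation, the generic periodically modulated two-state model used in discussions of stochastic resonance can be written as follows (illustrative form only; in the paper the periodic perturbation is supplied by the fluctuation of the total system volume rather than an external drive):

```latex
\[
  \frac{dn_{+}}{dt} = k_{-\to+}(t)\,n_{-} - k_{+\to-}(t)\,n_{+},
  \qquad n_{+} + n_{-} = 1,
\]
\[
  k_{\pm\to\mp}(t) = r_{0}\,\exp\!\left[\mp\,\frac{A_{0}\cos\Omega t}{k_{B}T}\right],
\]
% where n_+ and n_- would play the role of the populations of the high-density and
% low-density forms of water.
```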

Relevance: 20.00%

Publisher:

Abstract:

Friction has an important influence on metal forming operations, as it contributes to the success or otherwise of the process. In the present investigation, the effect of friction on metal forming was studied by simulating compression tests on cylindrical Al-Mg alloy specimens using the finite element method (FEM). Three kinds of compression tests were considered, in which a constant coefficient of friction was employed at the upper die/work-piece interface while the coefficient of friction at the lower die/work-piece interface was varied between the tests. The simulation results showed that a difference in metal flow occurs near the interfaces owing to the differences in the coefficient of friction. It was concluded that the variations in the coefficient of friction between the dies and the work-piece directly affect the stress distribution and the shape of the work-piece, with implications for the microstructure of the material being processed.