638 results for Stochastic Models
Abstract:
Radio Frequency Identification (RFID) is a wireless identification method that relies on the reception of electromagnetic radio waves. This research proposed a novel model that allows for an in-depth security analysis of current protocols, and developed new, flexible protocols that can be adapted to offer either stronger security or better efficiency.
Abstract:
Travelling wave phenomena are observed in many biological applications. Mathematical theory of standard reaction-diffusion problems shows that simple partial differential equations exhibit travelling wave solutions with constant wavespeed and such models are used to describe, for example, waves of chemical concentrations, electrical signals, cell migration, waves of epidemics and population dynamics. However, as in the study of cell motion in complex spatial geometries, experimental data are often not consistent with constant wavespeed. Non-local spatial models have successfully been used to model anomalous diffusion and spatial heterogeneity in different physical contexts. In this paper, we develop a fractional model based on the Fisher-Kolmogoroff equation and analyse it for its wavespeed properties, attempting to relate the numerical results obtained from our simulations to experimental data describing enteric neural crest-derived cells migrating along the intact gut of mouse embryos. The model proposed essentially combines fractional and standard diffusion in different regions of the spatial domain and qualitatively reproduces the behaviour of neural crest-derived cells observed in the caecum and the hindgut of mouse embryos during in vivo experiments.
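For orientation, a minimal sketch of the model class referred to above, using standard textbook notation rather than the paper's own (u for cell density, D a diffusivity, r a proliferation rate, K a carrying capacity), is the Fisher-Kolmogoroff (Fisher-KPP) equation

\[
\frac{\partial u}{\partial t} = D\,\frac{\partial^2 u}{\partial x^2} + r\,u\left(1 - \frac{u}{K}\right),
\]

whose travelling wave solutions move with the constant minimum speed \(c = 2\sqrt{rD}\). A space-fractional variant of the kind combined with standard diffusion in the paper can be written, schematically, as

\[
\frac{\partial u}{\partial t} = -D_\alpha\,(-\Delta)^{\alpha/2}\,u + r\,u\left(1 - \frac{u}{K}\right), \qquad 1 < \alpha \le 2,
\]

for which fronts need not propagate at a constant speed. The exact operator, parameter values and spatial domain decomposition used in the study may differ from this sketch.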
Abstract:
This thesis makes several contributions towards improved methods for encoding structure in computational models of word meaning. New methods are proposed which make it straightforward to encode linguistic structural features within a computational representation while retaining the ability to scale to large volumes of textual data. These methods are implemented and evaluated on a range of tasks to demonstrate their effectiveness.
Abstract:
This book presents readers with the opportunity to fundamentally re-evaluate the processes of innovation and entrepreneurship, and to rethink how they might best be stimulated and fostered within our organizations and communities. The fundamental thesis of the book is that the entrepreneurial process is not a linear progression from novel idea to successful innovation, but is an iterative series of experiments, where progress depends on the persistence and resilience of the individuals involved, and their ability to learn from failure as well as success. From this premise, the authors argue that the ideal environment for new venture creation is a form of “experimental laboratory,” a community of innovators where ideas are generated, shared, and refined; experiments are encouraged; and which itself serves as a test environment for those ideas and experiments. This environment is quite different from the traditional “incubator,” which may impose the disciplines of the established firm too early in the development of the new venture.
Abstract:
A range of authors from the risk management, crisis management, and crisis communications literature have proposed different models as a means of understanding the components of crisis. A common element across these sources is a focus on preparedness practices before disturbance events and response practices during events. This paper provides a critical analysis of three key explanatory models of how crises escalate, highlighting the strengths and limitations of each approach. The paper introduces an optimised conceptual model utilising components from the previous work under the four phases of pre-event, response, recovery, and post-event. Within these four phases, a ten-step process is introduced that can enhance understanding of the progression of distinct stages of disturbance for different types of events. This crisis evolution framework is examined as a means to provide clarity and applicability to a range of infrastructure failure contexts and to provide a path for further empirical investigation in this area.
Abstract:
The traditional hospital-based model of cardiac rehabilitation faces substantial challenges, such as cost and accessibility. These challenges have led to the development of alternative models of cardiac rehabilitation in recent years. The aim of this study was to identify and critique evidence for the effectiveness of these alternative models. A total of 22 databases were searched to identify quantitative studies or systematic reviews of quantitative studies regarding the effectiveness of alternative models of cardiac rehabilitation. Included studies were appraised using a Critical Appraisal Skills Programme tool and the National Health and Medical Research Council's designations for Level of Evidence. The 83 included articles described interventions in the following broad categories of alternative models of care: multifactorial individualized telehealth, internet-based, telehealth focused on exercise, telehealth focused on recovery, community- or home-based, and complementary therapies. Multifactorial individualized telehealth and community- or home-based cardiac rehabilitation are effective alternative models of cardiac rehabilitation, as they have produced similar reductions in cardiovascular disease risk factors compared with hospital-based programmes. While further research is required to address the paucity of data available regarding the effectiveness of alternative models of cardiac rehabilitation in rural, remote, and culturally and linguistically diverse populations, our review indicates there is no need to rely on hospital-based strategies alone to deliver effective cardiac rehabilitation. Local healthcare systems should strive to integrate alternative models of cardiac rehabilitation, such as brief telehealth interventions tailored to individuals' risk factor profiles as well as community- or home-based programmes, in order to ensure there are choices available for patients that best fit their needs, risk factor profile, and preferences.
Abstract:
The emergence of pseudo-marginal algorithms has led to improved computational efficiency for dealing with complex Bayesian models with latent variables. Here an unbiased estimator of the likelihood replaces the true likelihood in order to produce a Bayesian algorithm that remains on the marginal space of the model parameter (with latent variables integrated out), with a target distribution that is still the correct posterior distribution. Very efficient proposal distributions can be developed on the marginal space relative to the joint space of model parameter and latent variables. Thus pseudo-marginal algorithms tend to have substantially better mixing properties. However, for pseudo-marginal approaches to perform well, the likelihood has to be estimated rather precisely. This can be difficult to achieve in complex applications. In this paper we propose to take advantage of multiple central processing units (CPUs), which are readily available on most standard desktop computers. Here the likelihood is estimated independently on the multiple CPUs, with the ultimate estimate of the likelihood being the average of the estimates obtained from the multiple CPUs. The estimate remains unbiased, but the variability is reduced. We compare and contrast two different technologies that allow the implementation of this idea, both of which require a negligible amount of extra programming effort. The superior performance of this idea over the standard approach is demonstrated on simulated data from a stochastic volatility model.
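The averaging idea is simple enough to sketch in code. The following is a minimal illustration, not the paper's implementation: the toy model, the importance-sampling estimator and all names (estimate_likelihood, averaged_likelihood) are assumptions introduced purely to show how independent unbiased likelihood estimates computed on several CPUs can be averaged.

```python
# Minimal sketch of the multi-CPU likelihood-averaging idea described above.
# `estimate_likelihood` stands in for any unbiased likelihood estimator
# (e.g. importance sampling or a particle filter); the toy model is assumed.
import numpy as np
from multiprocessing import Pool

def estimate_likelihood(args):
    """Unbiased importance-sampling estimate of p(y | theta) for a toy model
    with latent z ~ N(0, 1) and y | z ~ N(theta + z, 1)."""
    theta, y, n_particles, seed = args
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_particles)           # draws from the prior on z
    weights = np.exp(-0.5 * (y - theta - z) ** 2) / np.sqrt(2.0 * np.pi)
    return weights.mean()                           # unbiased for p(y | theta)

def averaged_likelihood(theta, y, n_particles=1000, n_cpus=4, seed=0):
    """Average independent unbiased estimates computed on several CPUs.
    The average remains unbiased, with variance reduced by roughly 1/n_cpus."""
    tasks = [(theta, y, n_particles, seed + i) for i in range(n_cpus)]
    with Pool(processes=n_cpus) as pool:
        estimates = pool.map(estimate_likelihood, tasks)
    return float(np.mean(estimates))

if __name__ == "__main__":
    print(averaged_likelihood(theta=0.5, y=1.2))
```

Plugging this averaged estimate into a standard Metropolis-Hastings acceptance ratio gives a pseudo-marginal sampler whose target is still the correct posterior, since unbiasedness is preserved under averaging.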
Abstract:
Invasion waves of cells play an important role in development, disease and repair. Standard discrete models of such processes typically involve simulating cell motility, cell proliferation and cell-to-cell crowding effects in a lattice-based framework. The continuum-limit description is often given by a reaction–diffusion equation that is related to the Fisher–Kolmogorov equation. One of the limitations of a standard lattice-based approach is that real cells move and proliferate in continuous space and are not restricted to a predefined lattice structure. We present a lattice-free model of cell motility and proliferation, with cell-to-cell crowding effects, and we use the model to replicate invasion wave-type behaviour. The continuum-limit description of the discrete model is a reaction–diffusion equation with a proliferation term that is different from lattice-based models. Comparing lattice-based and lattice-free simulations indicates that both models lead to invasion fronts that are similar at the leading edge, where the cell density is low. Conversely, the two models make different predictions in the high-density region of the domain, well behind the leading edge. We analyse the continuum-limit description of the lattice-based and lattice-free models to show that both give rise to invasion wave-type solutions that move with the same speed but have very different shapes. We explore the significance of these differences by calibrating the parameters in the standard Fisher–Kolmogorov equation using data from the lattice-free model. We conclude that estimating parameters using this kind of standard procedure can produce misleading results.
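For readers who want a concrete sense of the continuum calculation mentioned above, the sketch below solves the standard Fisher–Kolmogorov equation with finite differences and estimates the invasion front speed. The discretisation, parameter values and front-tracking threshold are illustrative assumptions, not the calibration procedure used in the paper.

```python
# Minimal sketch (not the paper's code): explicit finite-difference solution
# of the standard Fisher-Kolmogorov equation u_t = D u_xx + r u (1 - u),
# with an empirical estimate of the invasion front speed. Parameter values,
# grid resolution and the 0.5 front threshold are illustrative assumptions.
import numpy as np

def fisher_kolmogorov_front_speed(D=1.0, r=1.0, L=200.0, nx=1001,
                                  dt=0.01, t_end=60.0, threshold=0.5):
    dx = L / (nx - 1)
    x = np.linspace(0.0, L, nx)
    u = np.where(x < 10.0, 1.0, 0.0)              # cells initially occupy the left edge
    times, fronts = [], []
    for step in range(int(t_end / dt)):
        lap = np.empty_like(u)
        lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
        lap[0] = 2.0 * (u[1] - u[0]) / dx**2      # zero-flux boundaries
        lap[-1] = 2.0 * (u[-2] - u[-1]) / dx**2
        u = u + dt * (D * lap + r * u * (1.0 - u))
        if step % 100 == 0:                       # record the front once per time unit
            times.append(step * dt)
            fronts.append(x[np.argmax(u < threshold)])
    # least-squares slope of front position against time, using late times only
    half = len(times) // 2
    speed = np.polyfit(times[half:], fronts[half:], 1)[0]
    return speed                                   # theory: roughly 2 * sqrt(r * D)

if __name__ == "__main__":
    print(fisher_kolmogorov_front_speed())
```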
Abstract:
This thesis describes the use of 2- and 3-dimensional cell-based models for studying how skin cells respond to ultraviolet radiation. These methods were used to investigate skin damage and repair after exposure to radiation in the context of skin cancer development. Interactions between different skin cell types were demonstrated as being significant in protecting against ultraviolet radiation-induced skin damage. This has important implications in understanding how skin cancers occur, as well as in the development of new strategies to prevent and treat them.
Abstract:
Process-aware information systems (PAISs) can be configured using a reference process model, which is typically obtained via expert interviews. Over time, however, contextual factors and system requirements may cause the operational process to start deviating from this reference model. While a reference model should ideally be updated to remain aligned with such changes, this is a costly and often neglected activity. We present a new process mining technique that automatically improves the reference model on the basis of the observed behavior as recorded in the event logs of a PAIS. We discuss how to balance the four basic quality dimensions for process mining (fitness, precision, simplicity and generalization) and a new dimension, namely the structural similarity between the reference model and the discovered model. We demonstrate the applicability of this technique using a real-life scenario from a Dutch municipality.
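One way to make the idea of balancing these dimensions concrete is as a weighted objective over candidate models. The sketch below is purely illustrative: the weights, the class name and the assumption that each dimension is scored in [0, 1] are mine, not the technique described in the paper.

```python
# Illustrative sketch only: scoring candidate process models by a weighted
# combination of the four classic quality dimensions plus structural
# similarity to the reference model. Weights and names are hypothetical.
from dataclasses import dataclass

@dataclass
class ModelScores:
    fitness: float         # how well the model replays the event log
    precision: float       # how little extra behaviour the model allows
    simplicity: float      # structural parsimony of the model
    generalization: float  # ability to accommodate unseen but likely behaviour
    similarity: float      # structural similarity to the reference model

def weighted_quality(s: ModelScores,
                     weights=(0.3, 0.2, 0.15, 0.15, 0.2)) -> float:
    """Single score in [0, 1], assuming each dimension is scored in [0, 1]."""
    w_f, w_p, w_s, w_g, w_sim = weights
    return (w_f * s.fitness + w_p * s.precision + w_s * s.simplicity
            + w_g * s.generalization + w_sim * s.similarity)

if __name__ == "__main__":
    # A model that fits the log well but drifts far from the reference model
    # scores lower than one that balances fit against structural similarity.
    print(weighted_quality(ModelScores(0.95, 0.80, 0.70, 0.75, 0.40)))
    print(weighted_quality(ModelScores(0.88, 0.78, 0.72, 0.74, 0.85)))
```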
Abstract:
This thesis examines how the initial institutional and technological aspects of the economy, and the reforms that alter these aspects, influence long-run growth and development. These issues are addressed using stochastic endogenous growth models and an empirical framework. The thesis is able to explain why developing nations exhibit diverse growth and inequality patterns. Consequently, the thesis raises a number of policy implications regarding how these nations can improve their economic outcomes.
Abstract:
This presentation discusses topics and issues that connect closely with the Conference Themes and themes in the ARACY Report Card. For example, developing models of public space that are safe, welcoming and relevant to children and young people will impact on their overall wellbeing and may help to prevent many of the tensions occurring in Australia and elsewhere around the world. This area is the subject of ongoing international debate, research and policy formation, and is relevant to concerns in the ARACY Report Card about children and young people’s health and safety, participation, behaviours and risks, and peer and family relationships.
Abstract:
Background: Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult owing to species biology and behavioural characteristics. The design of robust sampling programmes should be based on an underlying statistical distribution that is sufficiently flexible to capture variations in the spatial distribution of the target species. Results: Comparisons are made of the accuracy of four probability-of-detection sampling models (the negative binomial model [1], the Poisson model [1], the double logarithmic model [2] and the compound model [3]) for detection of insects over a broad range of insect densities. Although the double log and negative binomial models performed well under specific conditions, it is shown that, of the four models examined, the compound model performed the best over a broad range of insect spatial distributions and densities. In particular, this model predicted well the number of samples required when insect density was high and clumped within experimental storages. Conclusions: This paper reinforces the need for effective sampling programmes designed to detect insects over a broad range of spatial distributions. The compound model is robust over a broad range of insect densities and leads to substantial improvement in detection probabilities within highly variable systems such as grain storage.
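To illustrate the kind of calculation such detection models involve, the sketch below computes detection probabilities and required sample sizes under Poisson and negative binomial count assumptions only; the compound and double logarithmic models from the paper are not reproduced, and the parameter values are invented for illustration.

```python
# Sketch of probability-of-detection calculations for two of the simpler
# count distributions mentioned above (Poisson and negative binomial).
# Parameter values below are illustrative assumptions only.
import math

def p_detect_poisson(mean_per_sample: float, n_samples: int) -> float:
    """P(at least one insect found) when counts per sample are Poisson."""
    p_zero = math.exp(-mean_per_sample)
    return 1.0 - p_zero ** n_samples

def p_detect_negbin(mean_per_sample: float, k: float, n_samples: int) -> float:
    """Same, for negative binomial counts with aggregation parameter k
    (small k corresponds to strongly clumped spatial distributions)."""
    p_zero = (1.0 + mean_per_sample / k) ** (-k)
    return 1.0 - p_zero ** n_samples

def samples_required(p_zero: float, target: float = 0.95) -> int:
    """Number of samples needed to reach the target detection probability,
    given the probability p_zero that a single sample contains no insects."""
    return math.ceil(math.log(1.0 - target) / math.log(p_zero))

if __name__ == "__main__":
    mu, k = 0.05, 0.3   # low density, clumped infestation (assumed values)
    print(p_detect_poisson(mu, 20), p_detect_negbin(mu, k, 20))
    print(samples_required(math.exp(-mu)), samples_required((1 + mu / k) ** -k))
```

Clumping raises the per-sample probability of an empty sample relative to a Poisson distribution with the same mean, which is why more samples are needed to achieve the same detection probability in aggregated infestations.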
Abstract:
Mathematical models of mosquito-borne pathogen transmission originated in the early twentieth century to provide insights into how to most effectively combat malaria. The foundations of the Ross–Macdonald theory were established by 1970. Since then, there has been a growing interest in reducing the public health burden of mosquito-borne pathogens and an expanding use of models to guide their control. To assess how theory has changed to confront evolving public health challenges, we compiled a bibliography of 325 publications from 1970 through 2010 that included at least one mathematical model of mosquito-borne pathogen transmission and then used a 79-part questionnaire to classify each of 388 associated models according to its biological assumptions. As a composite measure to interpret the multidimensional results of our survey, we assigned a numerical value to each model that measured its similarity to 15 core assumptions of the Ross–Macdonald model. Although the analysis illustrated a growing acknowledgement of geographical, ecological and epidemiological complexities in modelling transmission, most models during the past 40 years closely resemble the Ross–Macdonald model. Modern theory would benefit from an expansion around the concepts of heterogeneous mosquito biting, poorly mixed mosquito-host encounters, spatial heterogeneity and temporal variation in the transmission process.
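As a point of reference for the core theory discussed above, the basic reproduction number of the classical Ross–Macdonald model is commonly written (in standard textbook notation, assumed here rather than quoted from the review) as

\[
R_0 = \frac{m\,a^{2}\,b\,c\,e^{-g n}}{g\,r},
\]

where m is the ratio of mosquitoes to humans, a the human-biting rate per mosquito, b and c the mosquito-to-human and human-to-mosquito transmission probabilities per bite, n the extrinsic incubation period, g the mosquito mortality rate and r the human recovery rate. Assumptions such as homogeneous biting, well-mixed mosquito-host encounters and the absence of spatial or temporal variation are implicit in this expression, which is precisely what the survey's comparison against the 15 core assumptions is probing.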
Abstract:
Mathematical descriptions of birth–death–movement processes are often calibrated to measurements from cell biology experiments to quantify tissue growth rates. Here we describe and analyze a discrete model of a birth–death–movement process applied to a typical two-dimensional cell biology experiment. We present three different descriptions of the system: (i) a standard mean-field description which neglects correlation effects and clustering; (ii) a moment dynamics description which approximately incorporates correlation and clustering effects; and (iii) averaged data from repeated discrete simulations which directly incorporate correlation and clustering effects. Comparing these three descriptions indicates that the mean-field and moment dynamics approaches are valid only for certain parameter regimes, and that both descriptions fail to make accurate predictions of the system for sufficiently fast birth and death rates, where the effects of spatial correlations and clustering are sufficiently strong. Without any method to distinguish between the parameter regimes where these descriptions are valid, it is possible that either the mean-field or moment dynamics model could be calibrated to experimental data under inappropriate conditions, leading to errors in parameter estimation. In this work we demonstrate that a simple measurement based on coordination number data provides an indirect measure of agent correlation and clustering effects, and can therefore be used to distinguish between the parameter regimes in which the different descriptions of the birth–death–movement process are valid.
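As an indication of how a coordination-number summary might be computed in practice, the sketch below measures the average number of occupied nearest neighbours per agent on a square-lattice occupancy matrix; the lattice representation, the random test data and the function name are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch: average coordination number (occupied nearest
# neighbours per agent) on a two-dimensional square lattice, as a simple
# summary of agent clustering. The random occupancy below is a stand-in
# for the output of a birth-death-movement simulation.
import numpy as np

def mean_coordination_number(occupancy: np.ndarray) -> float:
    """occupancy is a 2D array of 0/1 site states; returns the average number
    of occupied von Neumann (up/down/left/right) neighbours per agent."""
    occ = occupancy.astype(int)
    neighbours = np.zeros_like(occ)
    neighbours[1:, :] += occ[:-1, :]   # neighbour above
    neighbours[:-1, :] += occ[1:, :]   # neighbour below
    neighbours[:, 1:] += occ[:, :-1]   # neighbour to the left
    neighbours[:, :-1] += occ[:, 1:]   # neighbour to the right
    agents = occ.sum()
    return float((neighbours * occ).sum() / agents) if agents else 0.0

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    random_occupancy = (rng.random((100, 100)) < 0.3).astype(int)
    # For spatially random occupancy at density 0.3 the interior expectation
    # is about 4 * 0.3 = 1.2; clustered populations give noticeably larger values.
    print(mean_coordination_number(random_occupancy))
```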