Abstract:
Phylogenetic inference from sequences can be misled by both sampling (stochastic) error and systematic error (nonhistorical signals where reality differs from our simplified models). A recent study of eight yeast species using 106 concatenated genes from complete genomes showed that even small internal edges of a tree received 100% bootstrap support. This effective negation of stochastic error from large data sets is important, but longer sequences exacerbate the potential for biases (systematic error) to be positively misleading. Indeed, when we analyzed the same data set using minimum evolution optimality criteria, an alternative tree received 100% bootstrap support. We identified a compositional bias as responsible for this inconsistency and showed that it is reduced effectively by coding the nucleotides as purines and pyrimidines (RY-coding), reinforcing the original tree. Thus, a comprehensive exploration of potential systematic biases is still required, even though genome-scale data sets greatly reduce sampling error.
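For readers unfamiliar with it, the RY-coding mentioned above is a simple deterministic recoding of the alignment: purines (A, G) map to R and pyrimidines (C, T) to Y, which removes most compositional (GC-content) signal. A minimal sketch follows; the function name and interface are illustrative, not taken from the study.

```python
PURINES = {"A", "G"}
PYRIMIDINES = {"C", "T"}

def ry_code(sequence: str) -> str:
    """Recode a nucleotide sequence into purine/pyrimidine symbols."""
    out = []
    for base in sequence.upper():
        if base in PURINES:
            out.append("R")
        elif base in PYRIMIDINES:
            out.append("Y")
        else:
            out.append(base)  # keep gaps and ambiguity codes unchanged
    return "".join(out)

print(ry_code("ATGC-CGTA"))  # -> "RYRY-YRYR"
```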
Abstract:
In this paper we present a new simulation methodology in order to obtain exact or approximate Bayesian inference for models for low-valued count time series data that have computationally demanding likelihood functions. The algorithm fits within the framework of particle Markov chain Monte Carlo (PMCMC) methods. The particle filter requires only model simulations and, in this regard, our approach has connections with approximate Bayesian computation (ABC). However, an advantage of using the PMCMC approach in this setting is that simulated data can be matched with data observed one-at-a-time, rather than attempting to match on the full dataset simultaneously or on a low-dimensional non-sufficient summary statistic, which is common practice in ABC. For low-valued count time series data we find that it is often computationally feasible to match simulated data with observed data exactly. Our particle filter maintains $N$ particles by repeating the simulation until $N+1$ exact matches are obtained. Our algorithm creates an unbiased estimate of the likelihood, resulting in exact posterior inferences when included in an MCMC algorithm. In cases where exact matching is computationally prohibitive, a tolerance is introduced as per ABC. A novel aspect of our approach is that we introduce auxiliary variables into our particle filter so that partially observed and/or non-Markovian models can be accommodated. We demonstrate that Bayesian model choice problems can be easily handled in this framework.
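A minimal sketch of the exact-matching particle filter described above is given below. The interfaces and names are ours, since the abstract does not fix an implementation: `simulate_step(state, rng)` is an assumed model simulator returning a new latent state and a simulated count.

```python
import numpy as np

def alive_particle_filter(y, simulate_step, init_particles, N=100, rng=None):
    """Match each observed count one at a time via model simulation.

    y: observed low-valued count time series.
    init_particles: N latent states drawn from the prior.
    Returns an estimate of the log-likelihood of y.
    """
    rng = np.random.default_rng(rng)
    particles = list(init_particles)
    loglik = 0.0
    for y_t in y:
        matched, n_sims = [], 0
        # keep simulating until N + 1 exact matches are obtained
        while len(matched) < N + 1:
            state = particles[rng.integers(len(particles))]
            new_state, sim_count = simulate_step(state, rng)
            n_sims += 1
            if sim_count == y_t:  # exact match on the observed count
                matched.append(new_state)
        # the (N+1)-th match is discarded; N/(n_sims - 1) is the
        # unbiased per-step likelihood estimate in this scheme
        particles = matched[:N]
        loglik += np.log(N) - np.log(n_sims - 1)
    return loglik
```

Because each observation is matched one at a time, the inner loop can become slow when the probability of an exact match is small; that is precisely the regime in which the abstract's ABC-style tolerance is introduced.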
Abstract:
A quasi-maximum likelihood procedure for estimating the parameters of multi-dimensional diffusions is developed in which the transitional density is a multivariate Gaussian density with first and second moments approximating the true moments of the unknown density. For affine drift and diffusion functions, the moments are exactly those of the true transitional density and for nonlinear drift and diffusion functions the approximation is extremely good and is as effective as alternative methods based on likelihood approximations. The estimation procedure generalises to models with latent factors. A conditioning procedure is developed that allows parameter estimation in the absence of proxies.
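The flavour of the procedure can be seen in the scalar sketch below. For brevity it uses first-order (Euler) moment approximations, whereas the paper's scheme approximates the first two moments of the transition density more accurately (exactly, in the affine case). The Ornstein-Uhlenbeck example and all names are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def quasi_neg_loglik(theta, x, dt, drift, diffusion):
    """Gaussian quasi-likelihood for dX = drift dt + diffusion dW,
    with conditional mean and variance approximating the true
    moments of the (unknown) transition density."""
    x0, x1 = x[:-1], x[1:]
    mean = x0 + drift(x0, theta) * dt
    sd = diffusion(x0, theta) * np.sqrt(dt)
    return -np.sum(norm.logpdf(x1, loc=mean, scale=sd))

# Example: Ornstein-Uhlenbeck, dX = kappa*(mu - X) dt + sigma dW
drift = lambda x, th: th[0] * (th[1] - x)
diffusion = lambda x, th: np.full_like(x, abs(th[2]))

rng = np.random.default_rng(0)
dt, kappa, mu, sigma = 0.01, 2.0, 1.0, 0.3
x = [1.5]
for _ in range(5000):  # simulate a path to fit against
    x.append(x[-1] + kappa * (mu - x[-1]) * dt
             + sigma * np.sqrt(dt) * rng.standard_normal())
x = np.asarray(x)

fit = minimize(quasi_neg_loglik, x0=[1.0, 0.5, 0.5],
               args=(x, dt, drift, diffusion), method="Nelder-Mead")
print(fit.x)  # estimates of (kappa, mu, sigma)
```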
Abstract:
Topic modeling has been widely utilized in information retrieval, text mining, text classification, and related fields. Most existing statistical topic modeling methods, such as LDA and pLSA, represent a topic by selecting single words from the multinomial word distribution of that topic. This term-based representation has two main shortcomings: first, popular or common words occur across many different topics, which makes topics ambiguous to interpret; second, single words lack the coherent semantic meaning needed to represent topics accurately. To overcome these problems, in this paper we propose a two-stage model that combines text mining and pattern mining with statistical modeling to generate more discriminative and semantically rich topic representations. Experiments show that the optimized topic representations generated by the proposed methods outperform the typical statistical topic modeling method LDA in terms of accuracy and certainty.
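As a point of reference, the term-based topic representation the abstract criticises can be produced with any standard LDA implementation; a minimal sketch using scikit-learn (an assumption, since the paper does not name an implementation) shows each topic as its top single words:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "stochastic models for count time series",
    "topic models for text mining and retrieval",
    "pattern mining finds frequent itemsets in text",
]
vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vec.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    # the term-based representation: top-5 single words per topic
    top = [terms[i] for i in topic.argsort()[::-1][:5]]
    print(f"topic {k}: {top}")
```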
Abstract:
In a recent paper, Gordon, Muratov, and Shvartsman studied a partial differential equation (PDE) model describing radially symmetric diffusion and degradation in two and three dimensions. They paid particular attention to the local accumulation time (LAT), also known in the literature as the mean action time, which is a spatially dependent timescale that can be used to provide an estimate of the time required for the transient solution to effectively reach steady state. They presented exact results for three-dimensional applications and gave approximate results for the two-dimensional analogue. Here we make two generalizations of Gordon, Muratov, and Shvartsman’s work: (i) we present an exact expression for the LAT in any dimension and (ii) we present an exact expression for the variance of the distribution. The variance provides useful information regarding the spread about the mean that is not captured by the LAT. We conclude by describing further extensions of the model that were not considered by Gordon, Muratov, and Shvartsman. We have found that exact expressions for the LAT can also be derived for these important extensions.
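For orientation, a standard formulation of these two quantities from the mean action time literature is sketched below; the notation is ours and may differ from the paper's.

```latex
% Let C(x,t) be the transient solution and C_\infty(x) its steady
% state. The normalised transition function
F(x,t) = 1 - \frac{C(x,t) - C_\infty(x)}{C(x,0) - C_\infty(x)}
% increases from 0 to 1 in t, so it can be treated as a cumulative
% distribution function in time. The local accumulation time (mean
% action time) and the variance about it are then
T(x) = \int_0^\infty t\,\frac{\partial F}{\partial t}\,\mathrm{d}t
     = \int_0^\infty \bigl(1 - F(x,t)\bigr)\,\mathrm{d}t,
\qquad
V(x) = \int_0^\infty t^2\,\frac{\partial F}{\partial t}\,\mathrm{d}t - T(x)^2.
```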
Abstract:
Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which gives centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in positioning offsets of up to several metres without notice. Hence, ambiguity validation is essential to control the quality of ambiguity resolution. Currently, the most popular ambiguity validation method is the ratio test, whose criterion is often determined empirically. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the risk of ambiguity resolution. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and user requirements. Mis-detected incorrect integers lead to a hazardous result and should be strictly controlled; in ambiguity resolution the mis-detection rate is often known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied to the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. A criteria table for the ratio test is computed from extensive data simulations, so that real-time users can determine the ratio test criterion by table lookup. This method has previously been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not yet been discussed. In this paper, a general ambiguity validation model is derived based on hypothesis testing theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. Finally, the factors that influence the fixed failure rate ratio test threshold are discussed on the basis of extensive data simulation. The results show that the fixed failure rate approach, combined with a proper stochastic model, is a more reasonable ambiguity validation method.
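To make the decision rule concrete, a minimal sketch of the ratio test with a table-lookup threshold is given below; all table entries are illustrative placeholders, not values from the paper's criteria table.

```python
def ratio_test(q_best, q_second, threshold):
    """Accept the fixed integer ambiguity solution only if the ratio
    of the second-best to the best weighted squared residual norm is
    at least the threshold; otherwise keep the float solution."""
    return (q_second / q_best) >= threshold

# In the fixed failure rate approach the threshold is not an
# empirical constant (such as 2 or 3) but is looked up from a
# criteria table indexed by the tolerated failure rate and the
# underlying model (here crudely summarised by the number of
# ambiguities). These placeholder numbers are for illustration only.
CRITERIA_TABLE = {  # (failure_rate, n_ambiguities) -> threshold
    (0.01, 4): 2.5,
    (0.01, 8): 1.8,
    (0.001, 4): 3.2,
}

threshold = CRITERIA_TABLE[(0.01, 4)]
accept = ratio_test(q_best=0.8, q_second=2.4, threshold=threshold)
print(accept)  # True: 2.4 / 0.8 = 3.0 >= 2.5
```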
Abstract:
The determinants and key mechanisms of cancer cell osteotropism have not been identified, mainly due to the lack of reproducible animal models representing the biological, genetic and clinical features seen in humans. An ideal model should be capable of recapitulating as many steps of the metastatic cascade as possible, thus facilitating the development of prognostic markers and novel therapeutic strategies. Most animal models of bone metastasis still have to be derived experimentally, as most syngeneic and transgenic approaches do not provide a robust skeletal phenotype and do not recapitulate the biological processes seen in humans. The xenotransplantation of human cancer cells or tumour tissue into immunocompromised murine hosts provides the possibility to simulate early and late stages of the human disease. Human bone or tissue-engineered human bone constructs can be implanted into the animal to recapitulate more subtle, species-specific aspects of the mutual interaction between human cancer cells and the human bone microenvironment. Moreover, the replication of the entire bone "organ" makes it possible to analyse the interaction between cancer cells and the haematopoietic niche and to confer at least a partial human immunity to the murine host. This process of humanisation is facilitated by novel immunocompromised mouse strains that allow a high engraftment rate of human cells or tissue. These humanised xenograft models provide an important research tool for studying the human biological processes of bone metastasis.
Abstract:
Radio Frequency Identification (RFID) is a wireless identification method that relies on the reception of electromagnetic radio waves. This research proposes a novel model that enables an in-depth security analysis of current protocols, and develops new flexible protocols that can be adapted to offer either stronger security or better efficiency.
Abstract:
Travelling wave phenomena are observed in many biological applications. Mathematical theory of standard reaction-diffusion problems shows that simple partial differential equations exhibit travelling wave solutions with constant wavespeed and such models are used to describe, for example, waves of chemical concentrations, electrical signals, cell migration, waves of epidemics and population dynamics. However, as in the study of cell motion in complex spatial geometries, experimental data are often not consistent with constant wavespeed. Non-local spatial models have successfully been used to model anomalous diffusion and spatial heterogeneity in different physical contexts. In this paper, we develop a fractional model based on the Fisher-Kolmogoroff equation and analyse it for its wavespeed properties, attempting to relate the numerical results obtained from our simulations to experimental data describing enteric neural crest-derived cells migrating along the intact gut of mouse embryos. The model proposed essentially combines fractional and standard diffusion in different regions of the spatial domain and qualitatively reproduces the behaviour of neural crest-derived cells observed in the caecum and the hindgut of mouse embryos during in vivo experiments.
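For reference, the classical Fisher-Kolmogoroff (Fisher-KPP) equation and its well-known minimum wavespeed are given below, together with a schematic space-fractional generalisation; the fractional form is our notation, as the abstract does not specify the paper's exact operator.

```latex
% Classical Fisher-Kolmogoroff equation with diffusivity D and
% growth rate r; travelling wave solutions exist for speeds
% c >= c_min, with constant minimum wavespeed
\frac{\partial u}{\partial t} = D\,\frac{\partial^2 u}{\partial x^2}
    + r\,u(1-u), \qquad c_{\min} = 2\sqrt{rD}.

% Schematic space-fractional generalisation (our notation): the
% Laplacian is replaced by a fractional operator of order alpha,
\frac{\partial u}{\partial t} = -D_\alpha\,(-\Delta)^{\alpha/2} u
    + r\,u(1-u), \qquad 1 < \alpha \le 2.
```

For fractional operators of this kind the front need not propagate at a constant speed, which is consistent with the non-constant wavespeeds the model is designed to capture.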
Abstract:
This thesis makes several contributions towards improved methods for encoding structure in computational models of word meaning. New methods are proposed and evaluated which address the requirement of easily encoding linguistic structural features within a computational representation while retaining the ability to scale to large volumes of textual data. The methods are implemented and assessed on a range of tasks to demonstrate their effectiveness.
Abstract:
This book presents readers with the opportunity to fundamentally re-evaluate the processes of innovation and entrepreneurship, and to rethink how they might best be stimulated and fostered within our organizations and communities. The fundamental thesis of the book is that the entrepreneurial process is not a linear progression from novel idea to successful innovation, but an iterative series of experiments, where progress depends on the persistence and resilience of the individuals involved, and on their ability to learn from failure as well as success. From this premise, the authors argue that the ideal environment for new venture creation is a form of “experimental laboratory,” a community of innovators where ideas are generated, shared, and refined; where experiments are encouraged; and which in itself serves as a test environment for those ideas and experiments. This environment is quite different from the traditional “incubator,” which may impose the disciplines of the established firm too early in the development of the new venture.
Abstract:
A range of authors from the risk management, crisis management, and crisis communications literature have proposed different models as a means of understanding the components of crisis. A common element of these sources is a focus on preparedness practices before disturbance events and response practices during events. This paper provides a critical analysis of three key explanatory models of how crises escalate, highlighting the strengths and limitations of each approach. The paper introduces an optimised conceptual model utilising components from the previous work under the four phases of pre-event, response, recovery, and post-event. Within these four phases, a ten-step process is introduced that can enhance understanding of the progression of distinct stages of disturbance for different types of events. This crisis evolution framework is examined as a means to provide clarity and applicability across a range of infrastructure failure contexts, and to provide a path for further empirical investigation in this area.
Abstract:
The traditional hospital-based model of cardiac rehabilitation faces substantial challenges, such as cost and accessibility. These challenges have led to the development of alternative models of cardiac rehabilitation in recent years. The aim of this study was to identify and critique evidence for the effectiveness of these alternative models. A total of 22 databases were searched to identify quantitative studies, or systematic reviews of quantitative studies, regarding the effectiveness of alternative models of cardiac rehabilitation. Included studies were appraised using a Critical Appraisal Skills Programme tool and the National Health and Medical Research Council's designations for Level of Evidence. The 83 included articles described interventions in the following broad categories of alternative models of care: multifactorial individualized telehealth, internet-based, telehealth focused on exercise, telehealth focused on recovery, community- or home-based, and complementary therapies. Multifactorial individualized telehealth and community- or home-based cardiac rehabilitation are effective alternative models, as they have produced reductions in cardiovascular disease risk factors similar to those of hospital-based programmes. While further research is required to address the paucity of data on the effectiveness of alternative models of cardiac rehabilitation in rural, remote, and culturally and linguistically diverse populations, our review indicates there is no need to rely on hospital-based strategies alone to deliver effective cardiac rehabilitation. Local healthcare systems should strive to integrate alternative models of cardiac rehabilitation, such as brief telehealth interventions tailored to individuals' risk factor profiles as well as community- or home-based programmes, in order to ensure there are choices available that best fit patients' needs, risk factor profiles, and preferences.