968 results for stochastic process


Relevance:

60.00%

Publisher:

Abstract:

We present a novel approach for developing summary statistics for use in approximate Bayesian computation (ABC) algorithms by using indirect inference. ABC methods are useful for posterior inference in the presence of an intractable likelihood function. In the indirect inference approach to ABC the parameters of an auxiliary model fitted to the data become the summary statistics. Although applicable to any ABC technique, we embed this approach within a sequential Monte Carlo algorithm that is completely adaptive and requires very little tuning. This methodological development was motivated by an application involving data on macroparasite population evolution modelled by a trivariate stochastic process for which there is no tractable likelihood function. The auxiliary model here is based on a beta–binomial distribution. The main objective of the analysis is to determine which parameters of the stochastic model are estimable from the observed data on mature parasite worms.
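
As an illustration of the idea (not the paper's adaptive SMC ABC algorithm), the sketch below shows a minimal rejection-ABC scheme in which the summary statistic is the maximum-likelihood estimate of an auxiliary beta-binomial model; `simulate_model` and `prior_sampler` are hypothetical stand-ins for the intractable simulator and the prior.

```python
# Minimal rejection-ABC sketch where the summary statistic is the MLE of an
# auxiliary beta-binomial model, in the spirit of ABC via indirect inference.
# `simulate_model` and `prior_sampler` are hypothetical stand-ins.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import betabinom

def auxiliary_summary(counts, n_trials):
    """Fit a beta-binomial(a, b) to count data by ML; return (a, b) as summaries."""
    def neg_loglik(log_params):
        a, b = np.exp(log_params)
        return -betabinom.logpmf(counts, n_trials, a, b).sum()
    res = minimize(neg_loglik, x0=np.zeros(2), method="Nelder-Mead")
    return np.exp(res.x)

def rejection_abc(observed, simulate_model, prior_sampler, n_trials,
                  n_draws=5000, keep_frac=0.01):
    s_obs = auxiliary_summary(observed, n_trials)
    draws, dists = [], []
    for _ in range(n_draws):
        theta = prior_sampler()
        s_sim = auxiliary_summary(simulate_model(theta), n_trials)
        draws.append(theta)
        dists.append(np.linalg.norm(s_sim - s_obs))
    cutoff = np.quantile(dists, keep_frac)
    return [t for t, d in zip(draws, dists) if d <= cutoff]
```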

Relevance:

60.00%

Publisher:

Abstract:

The ability to accurately predict the remaining useful life of machine components is critical for continuous machine operation and can also improve productivity and enhance system safety. In condition-based maintenance (CBM), maintenance is performed based on information collected through condition monitoring and assessment of the machine health. Effective diagnostics and prognostics are important aspects of CBM, enabling maintenance engineers to schedule a repair and to acquire replacement components before the components actually fail. Although a variety of prognostic methodologies have been reported recently, their application in industry is still relatively new and mostly focused on the prediction of specific component degradations. Furthermore, they require a significant and sufficient number of fault indicators to accurately predict component faults. Hence, sufficient usage of health indicators in prognostics for the effective interpretation of the machine degradation process is still required. Major challenges for accurate long-term prediction of remaining useful life (RUL) remain to be addressed. Therefore, continuous development and improvement of machine health management systems and accurate long-term prediction of machine remnant life are required in real industry applications. This thesis presents an integrated diagnostics and prognostics framework based on health state probability estimation for accurate and long-term prediction of machine remnant life. In the proposed model, prior empirical (historical) knowledge is embedded in the integrated diagnostics and prognostics system for classification of impending faults in a machine system and accurate probability estimation of discrete degradation stages (health states). The methodology assumes that machine degradation consists of a series of degraded states (health states) which effectively represent the dynamic and stochastic process of machine failure. The estimation of discrete health state probabilities for the prediction of machine remnant life is performed using classification algorithms. To select the appropriate classifier for health state probability estimation in the proposed model, comparative intelligent diagnostic tests were conducted using five different classifiers applied to the progressive fault data of three different faults in a high pressure liquefied natural gas (HP-LNG) pump. As a result of this comparison study, SVMs were employed for health state probability estimation and the prediction of machine failure in this research. The proposed prognostic methodology has been successfully tested and validated using a number of case studies, from simulation tests to real industry applications. The results from two actual failure case studies using simulations and experiments indicate that accurate estimation of health states is achievable and that the proposed method provides accurate long-term prediction of machine remnant life. In addition, the results of experimental tests show that the proposed model is capable of providing early warning of abnormal machine operating conditions by identifying the transitional states of machine fault conditions. Finally, the proposed prognostic model is validated through two industrial case studies. The optimal number of health states which can minimise the model training error without a significant decrease in prediction accuracy was also examined through several health states of bearing failure. The results were very encouraging and show that the proposed prognostic model based on health state probability estimation has the potential to be used as a generic and scalable asset health estimation tool in industrial machinery.
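
A minimal sketch of the central idea, using an SVM with probability outputs for health-state probability estimation; the feature matrix, the four health states and the per-state nominal remaining lives are illustrative assumptions rather than the thesis's calibrated models.

```python
# Sketch: classify discrete health states with an SVM and turn the state
# probabilities into a remaining-useful-life estimate. The feature matrix,
# state labels and per-state nominal remaining lives are illustrative only.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 6))       # hypothetical condition-monitoring features
y_train = rng.integers(0, 4, size=300)    # 4 hypothetical health states (0 = healthy)

clf = make_pipeline(StandardScaler(), SVC(probability=True, kernel="rbf"))
clf.fit(X_train, y_train)

state_rul_hours = np.array([900.0, 600.0, 300.0, 50.0])   # assumed per-state RUL

def expected_rul(x_new):
    """Probability-weighted remaining useful life for one new observation."""
    p = clf.predict_proba(x_new.reshape(1, -1))[0]
    return float(p @ state_rul_hours)

print(expected_rul(rng.normal(size=6)))
```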

Relevance:

60.00%

Publisher:

Abstract:

We introduce a genetic programming (GP) approach for evolving genetic networks that demonstrate desired dynamics when simulated as a discrete stochastic process. Our representation of genetic networks is based on a biochemical reaction model including key elements such as transcription, translation and post-translational modifications. The stochastic, reaction-based GP system is similar to, but not identical with, algorithmic chemistries. We evolved genetic networks with noisy oscillatory dynamics. The results show the practicality of evolving particular dynamics in gene regulatory networks when modelled with intrinsic noise.
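
The GP machinery itself is not reproduced here; the sketch below only shows how a toy transcription/translation network of the kind being evolved can be simulated as a discrete stochastic process with a Gillespie-style algorithm, under assumed rate constants.

```python
# Gillespie (SSA) sketch of a toy gene-expression network: transcription,
# translation and degradation. Rate constants are illustrative assumptions;
# an evolved network would supply its own reactions and rates.
import numpy as np

def gillespie_gene_expression(t_end=1000.0, k_tx=0.5, k_tl=2.0,
                              d_m=0.1, d_p=0.02, seed=1):
    rng = np.random.default_rng(seed)
    t, mRNA, protein = 0.0, 0, 0
    trajectory = [(t, mRNA, protein)]
    while t < t_end:
        rates = np.array([k_tx,            # transcription: mRNA += 1
                          k_tl * mRNA,     # translation:   protein += 1
                          d_m * mRNA,      # mRNA degradation
                          d_p * protein])  # protein degradation
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)          # time to next reaction
        event = rng.choice(4, p=rates / total)     # which reaction fires
        if event == 0:
            mRNA += 1
        elif event == 1:
            protein += 1
        elif event == 2:
            mRNA -= 1
        else:
            protein -= 1
        trajectory.append((t, mRNA, protein))
    return trajectory
```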

Relevance:

60.00%

Publisher:

Abstract:

Computer experiments, consisting of a number of runs of a computer model with different inputs, are now commonplace in scientific research. Using a simple fire model for illustration, some guidelines are given for the size of a computer experiment. A graph relating the error of prediction to the sample size is provided, which should be of use when designing computer experiments. Methods for augmenting computer experiments with extra runs are also described and illustrated. The simplest method involves adding one point at a time, choosing the point with the maximum prediction variance. Another method that appears to work well is to choose points from a candidate set with the maximum determinant of the variance-covariance matrix of the predictions.
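
A hedged sketch of the simplest augmentation rule described above, adding one run at a time at the candidate point with the largest emulator prediction variance; the Gaussian process emulator and the toy simulator stand in for the paper's fire model and are assumptions.

```python
# Sketch of augmenting a computer experiment one run at a time, picking the
# candidate point whose emulator prediction variance is largest. The GP
# emulator and the toy simulator stand in for the paper's fire model.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def toy_simulator(x):                       # placeholder computer model
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2

rng = np.random.default_rng(0)
X = rng.uniform(size=(8, 2))                # initial design: 8 runs in [0,1]^2
y = toy_simulator(X)
candidates = rng.uniform(size=(500, 2))     # candidate set for extra runs

gp = GaussianProcessRegressor(ConstantKernel() * RBF([0.3, 0.3]),
                              normalize_y=True)
for _ in range(5):                          # add 5 extra runs
    gp.fit(X, y)
    _, sd = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(sd)]       # maximum prediction variance
    X = np.vstack([X, x_new])
    y = np.append(y, toy_simulator(x_new[None, :]))
```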

Relevance:

60.00%

Publisher:

Abstract:

Indirect inference (II) is a methodology for estimating the parameters of an intractable (generative) model on the basis of an alternative parametric (auxiliary) model that is both analytically and computationally easier to deal with. Such an approach has been well explored in the classical literature but has received substantially less attention in the Bayesian paradigm. The purpose of this paper is to compare and contrast a collection of what we call parametric Bayesian indirect inference (pBII) methods. One class of pBII methods uses approximate Bayesian computation (referred to here as ABC II), where the summary statistic is formed on the basis of the auxiliary model using ideas from II. Another approach proposed in the literature, referred to here as parametric Bayesian indirect likelihood (pBIL), is shown to be a fundamentally different approach from ABC II. We devise new theoretical results for pBIL to give extra insight into its behaviour and into its differences from ABC II. Furthermore, we examine in more detail the assumptions required to use each pBII method. The results, insights and comparisons developed in this paper are illustrated on simple examples and two substantive applications. The first of the substantive examples involves performing inference for complex quantile distributions based on simulated data, while the second involves estimating the parameters of a trivariate stochastic process describing the evolution of macroparasites within a host based on real data. We create a novel framework called Bayesian indirect likelihood (BIL) which encompasses pBII as well as general ABC methods, so that the connections between the methods can be established.
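
A minimal sketch of the pBIL idea, as opposed to ABC II: the auxiliary likelihood of the observed data, evaluated at auxiliary parameters fitted to data simulated from the generative model, acts as a surrogate likelihood inside Metropolis-Hastings. `simulate_model`, `fit_auxiliary` and `aux_loglik` are hypothetical placeholders, not functions from the paper.

```python
# Sketch of the pBIL idea: at each proposed parameter value, simulate from the
# intractable model, fit the auxiliary model to the simulated data, and use the
# auxiliary likelihood of the *observed* data at those fitted values as a
# surrogate likelihood inside Metropolis-Hastings.
import numpy as np

def pbil_loglik(theta, observed, simulate_model, fit_auxiliary, aux_loglik):
    simulated = simulate_model(theta)
    phi_hat = fit_auxiliary(simulated)          # auxiliary parameter estimate
    return aux_loglik(observed, phi_hat)        # evaluated at the observed data

def metropolis_hastings(observed, log_prior, simulate_model, fit_auxiliary,
                        aux_loglik, theta0, n_iter=2000, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    logpost = log_prior(theta) + pbil_loglik(theta, observed, simulate_model,
                                             fit_auxiliary, aux_loglik)
    samples = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal(size=theta.shape)
        logpost_prop = log_prior(prop) + pbil_loglik(prop, observed,
                                                     simulate_model,
                                                     fit_auxiliary, aux_loglik)
        if np.log(rng.uniform()) < logpost_prop - logpost:   # accept/reject
            theta, logpost = prop, logpost_prop
        samples.append(theta.copy())
    return np.array(samples)
```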

Relevance:

60.00%

Publisher:

Abstract:

Messenger RNAs (mRNAs) can be repressed and degraded by small non-coding RNA molecules. In this paper, we formulate a coarse-grained Markov-chain description of the post-transcriptional regulation of mRNAs by either small interfering RNAs (siRNAs) or microRNAs (miRNAs). We calculate the probability of an mRNA escaping from its domain before it is repressed by siRNAs/miRNAs via calculation of the mean time to threshold: when the number of bound siRNAs/miRNAs exceeds a certain threshold value, the mRNA is irreversibly repressed. In some cases, the analysis can be reduced to counting certain paths in a reduced Markov model. We obtain explicit expressions when the small RNAs bind irreversibly to the mRNA, and we also discuss the reversible binding case. We apply our models to the study of RNA interference in the nucleus, examining the probability of mRNAs escaping via small nuclear pores before being degraded by siRNAs. Using the same modelling framework, we further investigate the effect of small decoy RNAs (decoys) on the process of post-transcriptional regulation by studying regulation of the tumor suppressor gene PTEN: decoys are able to block binding sites on PTEN mRNAs, thereby reducing the number of sites available to siRNAs/miRNAs and helping to protect them from repression. We calculate the probability of a cytoplasmic PTEN mRNA translocating to the endoplasmic reticulum before being repressed by miRNAs. We support our results with stochastic simulations.
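
A simple Monte Carlo sketch of the escape-versus-repression race for the irreversible-binding case: since only the order of events matters, each step is a Bernoulli comparison between the escape rate and the total binding rate. All rates, the number of binding sites and the threshold are illustrative assumptions, not values from the paper.

```python
# Monte Carlo sketch of the escape-vs-repression race: an mRNA escapes its
# domain at rate k_escape, while siRNAs/miRNAs bind irreversibly at rate
# k_bind per free site; the mRNA is repressed once `threshold` molecules are
# bound. Rates, site number and threshold are illustrative assumptions.
import numpy as np

def escape_probability(k_escape=0.05, k_bind=0.02, n_sites=10, threshold=4,
                       n_trials=20000, seed=0):
    rng = np.random.default_rng(seed)
    escapes = 0
    for _ in range(n_trials):
        bound = 0
        while bound < threshold:
            rate_bind = k_bind * (n_sites - bound)
            total = k_escape + rate_bind
            if rng.uniform() < k_escape / total:   # escape happens first
                escapes += 1
                break
            bound += 1                             # one more irreversible binding
    return escapes / n_trials

print(escape_probability())
```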

Relevance:

60.00%

Publisher:

Abstract:

Coalescence between two droplets in a turbulent liquid-liquid dispersion is generally viewed as a consequence of forces exerted on the drop-pair squeezing out the intervening continuous phase to a critical thickness. A new synthesis is proposed herein which models the film drainage as a stochastic process driven by a suitably idealized random process for the fluctuating force. While the true test of the model lies in detailed parameter estimations with measurement of drop-size distributions in coalescing dispersions, experimental measurements on average coalescence frequencies lend preliminary support to the model.
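
As a rough illustration (not the paper's model), the sketch below treats film thinning as a stochastic process driven by a fluctuating force and estimates, by Euler-Maruyama simulation, the probability of reaching a critical thickness within the contact time; the drift and noise terms and all coefficients are assumptions.

```python
# Euler-Maruyama sketch of film drainage driven by a fluctuating force: film
# thickness h decreases at a rate proportional to a mean force plus Gaussian
# fluctuations, and coalescence occurs if h reaches h_crit before the contact
# time ends. The drift and noise terms are illustrative assumptions.
import numpy as np

def coalescence_probability(h0=1.0, h_crit=0.05, mean_force=0.4, noise=0.6,
                            t_contact=5.0, dt=1e-3, n_trials=5000, seed=0):
    rng = np.random.default_rng(seed)
    n_steps = int(t_contact / dt)
    coalesced = 0
    for _ in range(n_trials):
        h = h0
        for _ in range(n_steps):
            dW = rng.normal(0.0, np.sqrt(dt))
            h -= mean_force * h * dt + noise * h * dW   # thinning with fluctuations
            if h <= h_crit:
                coalesced += 1
                break
        # if h never reaches h_crit, the drop pair separates without coalescing
    return coalesced / n_trials

print(coalescence_probability())
```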

Relevance:

60.00%

Publisher:

Abstract:

Over the last two decades, there has been an increasing awareness of, and interest in, the use of spatial moment techniques to provide insight into a range of biological and ecological processes. Models that incorporate spatial moments can be viewed as extensions of mean-field models. These mean-field models often consist of systems of classical ordinary differential equations and partial differential equations, whose derivation, at some point, hinges on the simplifying assumption that individuals in the underlying stochastic process encounter each other at a rate that is proportional to the average abundance of individuals. This assumption has several implications, the most striking of which is that mean-field models essentially neglect any impact of the spatial structure of individuals in the system. Moment dynamics models extend traditional mean-field descriptions by accounting for the dynamics of pairs, triples and higher n-tuples of individuals. This means that moment dynamics models can, to some extent, account for how the spatial structure affects the dynamics of the system in question.

Relevance:

60.00%

Publisher:

Abstract:

Mathematical models describing the movement of multiple interacting subpopulations are relevant to many biological and ecological processes. Standard mean-field partial differential equation descriptions of these processes suffer from the limitation that they implicitly neglect the impact of spatial correlations and clustering. To overcome this, we derive a moment dynamics description of a discrete stochastic process which describes the spreading of distinct interacting subpopulations. In particular, we motivate our model by mimicking the geometry of two typical cell biology experiments. Comparing the performance of the moment dynamics model with a traditional mean-field model confirms that the moment dynamics approach always outperforms the traditional mean-field approach. To provide more general insight, we summarise the performance of the moment dynamics model and the traditional mean-field model over a wide range of parameter regimes. These results help distinguish situations where spatial correlation effects are sufficiently strong that a moment dynamics model is required from situations where spatial correlation effects are sufficiently weak that a traditional mean-field model is adequate.
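
The moment dynamics equations themselves are not reproduced here; this single-subpopulation sketch simply contrasts a discrete lattice proliferation process, where crowding and clustering matter, with its logistic mean-field counterpart. Lattice size, rates and seeding are illustrative assumptions.

```python
# Sketch contrasting a discrete stochastic proliferation process on a lattice
# with its logistic mean-field counterpart dC/dt = lam*C*(1-C). Clustering in
# the stochastic model slows growth relative to the mean-field prediction;
# moment dynamics models (not shown) correct for exactly this effect.
import numpy as np

def stochastic_density(L=100, lam=1.0, c0=0.05, t_end=10.0, seed=0):
    rng = np.random.default_rng(seed)
    occ = rng.uniform(size=(L, L)) < c0           # randomly seeded agents
    t, times, density = 0.0, [0.0], [occ.mean()]
    moves = [(0, 1), (0, -1), (1, 0), (-1, 0)]
    while t < t_end:
        n = int(occ.sum())
        if n == 0:
            break
        t += rng.exponential(1.0 / (lam * n))     # next attempted proliferation
        sites = np.argwhere(occ)
        i, j = sites[rng.integers(n)]             # pick a random occupied site
        di, dj = moves[rng.integers(4)]
        ti, tj = (i + di) % L, (j + dj) % L       # periodic boundaries
        if not occ[ti, tj]:                       # crowding: only empty targets
            occ[ti, tj] = True
        times.append(t)
        density.append(occ.mean())
    return np.array(times), np.array(density)

def mean_field_density(t, lam=1.0, c0=0.05):
    # logistic solution C(t) = c0*exp(lam*t) / (1 - c0 + c0*exp(lam*t))
    e = np.exp(lam * t)
    return c0 * e / (1.0 - c0 + c0 * e)

times, dens = stochastic_density()
print(dens[-1], mean_field_density(times[-1]))    # stochastic vs mean-field at t_end
```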

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this thesis is to examine the role of trade durations in price discovery. The motivation to use trade durations in the study of price discovery is that durations are robust to many microstructure effects that introduce a bias in the measurement of returns volatility. Another motivation is that it is difficult to think of economic variables that are genuinely useful in determining the source of volatility at arbitrarily high frequencies. The dissertation contains three essays. In the first essay, the role of trade durations in price discovery is examined with respect to the volatility pattern of stock returns. The theory on volatility is associated with the theory on the information content of trade, which is central to market microstructure theory. The first essay documents that volatility per transaction is related to the intensity of trade, and that there is a strong relationship between the stochastic process of trade durations and trading variables. In the second essay, the role of trade durations in price discovery is examined with respect to the quantification of risk due to a trading volume of a certain size. The theory on volume is intrinsically associated with the stock volatility pattern. The essay documents that volatility increases, in general, when traders choose to trade with large transactions. In the third essay, the role of trade durations in price discovery is examined with respect to the information content of a trade. The theory on the information content of a trade is associated with the theory on the rate of price revisions in the market. The essay documents that short durations are associated with information; thus, traders are compensated for responding quickly to information.
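
As a rough illustration of the first essay's empirical relation, the sketch below bins hypothetical tick data by trade duration and averages squared per-transaction returns within each bin; the synthetic data and the binning scheme are assumptions, not the thesis's estimators.

```python
# Sketch: per-transaction volatility versus trade intensity. Squared
# per-trade log returns are averaged within duration deciles; short
# durations (high intensity) can then be compared with long ones.
import numpy as np

def volatility_by_duration(timestamps, prices, n_bins=10):
    durations = np.diff(timestamps)                 # time between consecutive trades
    sq_returns = np.diff(np.log(prices)) ** 2       # per-transaction volatility proxy
    edges = np.quantile(durations, np.linspace(0.0, 1.0, n_bins + 1))
    bins = np.digitize(durations, edges[1:-1])      # duration decile of each trade
    return np.array([sq_returns[bins == b].mean() if np.any(bins == b) else np.nan
                     for b in range(n_bins)])

rng = np.random.default_rng(0)
ts = np.cumsum(rng.exponential(5.0, size=5000))            # hypothetical trade times (s)
px = 100.0 * np.exp(np.cumsum(rng.normal(0, 2e-4, 5000)))  # hypothetical trade prices
print(volatility_by_duration(ts, px))
```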

Relevance:

60.00%

Publisher:

Abstract:

The Thesis presents a state-space model for a basketball league and a Kalman filter algorithm for the estimation of the state of the league. In the state-space model, each basketball team is associated with a rating that represents its strength compared to the other teams. The ratings are assumed to evolve in time following a stochastic process with independent Gaussian increments. The estimation of the team ratings is based on the observed game scores, which are assumed to depend linearly on the true strengths of the teams plus independent Gaussian noise. The team ratings are estimated using a recursive Kalman filter algorithm that produces least-squares optimal estimates for the team strengths and predictions for the scores of future games. Additionally, if the Gaussianity assumption holds, the predictions given by the Kalman filter maximize the likelihood of the observed scores. The team ratings allow probabilistic inference about the ranking of the teams and their relative strengths, as well as about the teams' winning probabilities in future games. The predictions about the winners of the games are correct 65-70% of the time. The team ratings explain 16% of the random variation observed in the game scores. Furthermore, the winning probabilities given by the model are consistent with the observed scores. The state-space model includes four independent parameters that involve the variances of the noise terms and the home court advantage observed in the scores. The Thesis presents the estimation of these parameters using the maximum likelihood method as well as other techniques. The Thesis also gives various example analyses related to the American professional basketball league, i.e., the National Basketball Association (NBA), and the regular seasons played from 2005 through 2010. Additionally, the 2009-2010 season is discussed in full detail, including the playoffs.
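
A minimal Kalman filter sketch in the spirit of the model described above: team ratings follow a Gaussian random walk, and the observed home-minus-away score margin is modelled as the rating difference plus a home-court advantage and Gaussian noise. The variance values and the home advantage are illustrative assumptions, not the Thesis's estimated parameters.

```python
# Kalman filter sketch for team ratings: ratings follow a Gaussian random walk,
# and the observed home-minus-away score margin is the rating difference plus
# a home-court advantage plus Gaussian noise. All numeric values are assumptions.
import numpy as np

def kalman_ratings(games, n_teams, q=0.5, r=120.0, home_adv=3.0):
    """games: iterable of (home_idx, away_idx, home_score - away_score)."""
    x = np.zeros(n_teams)                 # rating estimates
    P = np.eye(n_teams) * 100.0           # rating covariance
    Q = np.eye(n_teams) * q               # process noise (ratings drift)
    for home, away, margin in games:
        P = P + Q                         # predict step (random-walk dynamics)
        h = np.zeros(n_teams)             # observation: margin ~ x[home] - x[away]
        h[home], h[away] = 1.0, -1.0
        y = margin - home_adv - h @ x     # innovation
        s = h @ P @ h + r                 # innovation variance
        k = P @ h / s                     # Kalman gain
        x = x + k * y                     # update ratings
        P = P - np.outer(k, h @ P)        # update covariance
    # predicted margin of a future game: x[home] - x[away] + home_adv
    return x, P
```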

Relevance:

60.00%

Publisher:

Abstract:

The problem of on-line recognition and retrieval of relatively weak industrial signals, such as partial discharges (PD), buried in excessive noise is addressed in this paper. The major bottleneck is the recognition and suppression of stochastic pulsive interference (PI), owing to the overlapping broadband frequency spectra of PI and PD pulses; therefore, on-line, on-site PD measurement is hardly possible with conventional frequency-based DSP techniques. The observed PD signal is modeled as a linear combination of systematic and random components employing probabilistic principal component analysis (PPCA), and the pdf of the underlying stochastic process is obtained. The PD/PI pulses are taken as the mean of the process and modeled using non-parametric methods based on smooth FIR filters, with a maximum a posteriori (MAP) procedure employed to estimate the filter coefficients. The classification of the pulses is undertaken using a simple PCA classifier. The methods proposed by the authors were found to be effective in the automatic retrieval of PD pulses while completely rejecting PI.
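
A minimal probabilistic PCA sketch along the lines of the decomposition into systematic and random components, using the closed-form maximum-likelihood solution; the smooth-FIR/MAP modelling of the PD/PI pulse means and the PCA classifier are not reproduced, and the synthetic data are placeholders.

```python
# Minimal probabilistic PCA (PPCA) sketch: the maximum-likelihood solution
# splits the observed signal into a low-dimensional systematic component plus
# isotropic Gaussian noise. Signal records here are synthetic placeholders.
import numpy as np

def ppca_fit(X, q):
    """X: (n_samples, n_features); q: number of systematic components."""
    n, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    eigvals = s ** 2 / n                               # sample covariance eigenvalues
    sigma2 = eigvals[q:].mean()                        # ML noise variance
    W = Vt[:q].T * np.sqrt(eigvals[:q] - sigma2)       # ML loading matrix
    return mu, W, sigma2

def systematic_component(X, mu, W, sigma2):
    """Posterior-mean reconstruction of the systematic part of each signal."""
    q = W.shape[1]
    M = W.T @ W + sigma2 * np.eye(q)
    latent = np.linalg.solve(M, W.T @ (X - mu).T).T    # E[z | x]
    return mu + latent @ W.T

rng = np.random.default_rng(0)
signals = rng.normal(size=(200, 64))                   # hypothetical signal windows
mu, W, sigma2 = ppca_fit(signals, q=3)
clean = systematic_component(signals, mu, W, sigma2)   # systematic (pulse-like) part
```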

Relevance:

60.00%

Publisher:

Abstract:

From the analysis of experimentally observed variations in surface strains with loading in reinforced concrete beams, it is noted that there is a need to consider the evolution of strains (with loading) as a stochastic process. The use of Markov chains for modeling the stochastic evolution of strains with loading in reinforced concrete flexural beams is studied in this paper. A simple, yet practically useful, bi-level homogeneous Gaussian Markov Chain (BLHGMC) model is proposed for determining the state of strain in reinforced concrete beams. The BLHGMC model will be useful for predicting the behavior/response of reinforced concrete beams, leading to more rational design.
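
A minimal sketch of propagating a state-of-strain distribution through load increments with a homogeneous Markov chain; the four strain states and the transition matrix are illustrative assumptions, not the calibrated BLHGMC model.

```python
# Sketch: propagate a distribution over discrete strain states through load
# increments with a homogeneous Markov chain. States and transition
# probabilities are illustrative assumptions only.
import numpy as np

# Transition probabilities between strain states (rows sum to 1); at each load
# increment the strain either stays in its state or moves to a higher one.
P = np.array([[0.80, 0.15, 0.05, 0.00],
              [0.00, 0.70, 0.25, 0.05],
              [0.00, 0.00, 0.85, 0.15],
              [0.00, 0.00, 0.00, 1.00]])

state = np.array([1.0, 0.0, 0.0, 0.0])    # all beams start in the lowest strain state
for load_step in range(10):               # distribution after 10 load increments
    state = state @ P
print(state)
```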

Relevance:

60.00%

Publisher:

Abstract:

We develop an approximate analytical technique for evaluating the performance of multi-hop networks based on beacon-less CSMA/CA as standardised in IEEE 802.15.4, a popular standard for wireless sensor networks. The network comprises sensor nodes, which generate measurement packets, and relay nodes which only forward packets. We consider a detailed stochastic process at each node, and analyse this process taking into account the interaction with neighbouring nodes via certain unknown variables (e.g., channel sensing rates, collision probabilities, etc.). By coupling these analyses of the various nodes, we obtain fixed point equations that can be solved numerically to obtain the unknown variables, thereby yielding approximations of time average performance measures, such as packet discard probabilities and average queueing delays. Different analyses arise for networks with no hidden nodes and networks with hidden nodes. We apply this approach to the performance analysis of tree networks rooted at a data sink. Finally, we provide a validation of our analysis technique against simulations.
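
A schematic sketch of the fixed-point structure of such an analysis: each node's unknown variable (e.g., collision probability) is recomputed from its neighbours' current values, and the update is damped and iterated to convergence. The `node_update` function and the toy network are hypothetical placeholders, not the paper's equations.

```python
# Schematic fixed-point iteration: per-node unknowns are updated from the
# analysis of each node given its neighbours' values, then damped and repeated
# until convergence. `node_update` is a hypothetical placeholder.
import numpy as np

def solve_fixed_point(node_update, neighbours, n_nodes, damping=0.5,
                      tol=1e-8, max_iter=10000):
    """node_update(i, x, nbrs) -> new value of node i's unknown variable."""
    x = np.full(n_nodes, 0.1)                       # initial guess
    for _ in range(max_iter):
        x_new = np.array([node_update(i, x, neighbours[i]) for i in range(n_nodes)])
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = damping * x + (1 - damping) * x_new     # damped update aids convergence
    return x

# Toy example: a node's collision probability grows with its neighbours' activity.
neighbours = {0: [1], 1: [0, 2], 2: [1]}
toy_update = lambda i, x, nbrs: 1 - np.prod([1 - 0.3 * x[j] for j in nbrs]) + 0.05
print(solve_fixed_point(toy_update, neighbours, n_nodes=3))
```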

Relevance:

60.00%

Publisher:

Abstract:

We develop an approximate analytical technique for evaluating the performance of multi-hop networks based on beaconless IEEE 802.15.4 (the "ZigBee" PHY and MAC), a popular standard for wireless sensor networks. The network comprises sensor nodes, which generate measurement packets, relay nodes, which only forward packets, and a data sink (base station). We consider a detailed stochastic process at each node, and analyse this process taking into account the interaction with neighbouring nodes via certain time-averaged unknown variables (e.g., channel sensing rates, collision probabilities, etc.). By coupling the analyses at the various nodes, we obtain fixed point equations that can be solved numerically to obtain the unknown variables, thereby yielding approximations of time-average performance measures, such as packet discard probabilities and average queueing delays. The model incorporates packet generation at the sensor nodes and queues at the sensor nodes and relay nodes. We demonstrate the accuracy of our model by an extensive comparison with simulations. As an additional assessment of the accuracy of the model, we use it in an algorithm for sensor network design with quality-of-service (QoS) objectives, and show that designs obtained using our model actually satisfy the QoS constraints (as validated by simulating the networks), with predictions accurate to well within 10% of the simulation results in the regime where the packet discard probability is low.