954 results for maximized Monte Carlo test


Relevance:

100.00%

Publisher:

Abstract:

A procedure is proposed for calculating the critical level and power of a likelihood ratio test, based on a Monte Carlo simulation method. General principles for building software to implement it are given, and some examples of its application are shown.
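
As a minimal sketch of the idea (not the paper's software; the one-sample normal-mean LRT and all parameter values below are illustrative assumptions), the critical level can be estimated from the simulated null distribution of the statistic and the power from its distribution under an alternative:

```python
import numpy as np

rng = np.random.default_rng(0)

def lr_statistic(x):
    """-2 log(likelihood ratio) for H0: mu = 0 vs H1: mu free,
    normal data with unknown variance (illustrative model choice)."""
    n = x.size
    t = np.sqrt(n) * x.mean() / x.std(ddof=1)
    return n * np.log(1.0 + t**2 / (n - 1))

def mc_critical_value(n, alpha=0.05, n_sim=20000):
    """Critical level: the (1 - alpha) quantile of the statistic simulated under H0."""
    null_stats = [lr_statistic(rng.normal(0.0, 1.0, n)) for _ in range(n_sim)]
    return np.quantile(null_stats, 1.0 - alpha)

def mc_power(n, mu1, critical_value, n_sim=20000):
    """Power: rejection frequency when the data are generated under H1."""
    alt_stats = [lr_statistic(rng.normal(mu1, 1.0, n)) for _ in range(n_sim)]
    return np.mean(np.asarray(alt_stats) > critical_value)

crit = mc_critical_value(n=30)
print("critical value:", crit, "power at mu = 0.5:", mc_power(30, 0.5, crit))
```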

Relevance:

100.00%

Publisher:

Abstract:

The importance of checking the normality assumption in most statistical procedures, especially parametric tests, cannot be overemphasized, as the validity of the inferences drawn from such procedures usually depends on the validity of this assumption. Numerous methods have been proposed by different authors over the years, some popular and frequently used, others much less so. This study addresses the performance of eighteen of the available tests for different sample sizes, significance levels, and a number of symmetric and asymmetric distributions by conducting a Monte Carlo simulation. The results showed that considerable power is not achieved for symmetric distributions when the sample size is less than one hundred, and for such distributions the kurtosis test is most powerful, provided the distribution is leptokurtic or platykurtic. The Shapiro-Wilk test remains the most powerful test for asymmetric distributions. We conclude that different tests are suitable under different characteristics of the alternative distribution.
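
A minimal sketch of such a power comparison for two of the tests named above (the alternative distributions, sample size, and replication count here are illustrative, not the study's design) could look like this with SciPy:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def mc_power(sampler, test, n, alpha=0.05, n_sim=2000):
    """Fraction of simulated samples for which `test` rejects normality."""
    rejections = 0
    for _ in range(n_sim):
        x = sampler(n)
        rejections += test(x).pvalue < alpha
    return rejections / n_sim

# Asymmetric alternative: lognormal; symmetric heavy-tailed alternative: t(5)
lognormal = lambda n: rng.lognormal(0.0, 1.0, n)
student_t5 = lambda n: rng.standard_t(5, n)

for name, sampler in [("lognormal", lognormal), ("t(5)", student_t5)]:
    sw = mc_power(sampler, stats.shapiro, n=50)
    kt = mc_power(sampler, stats.kurtosistest, n=50)
    print(f"n=50 {name}: Shapiro-Wilk power={sw:.2f}, kurtosis test power={kt:.2f}")
```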

Relevance:

100.00%

Publisher:

Abstract:

This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing the PEWMA model, a fault tree is developed based on the Texas City Refinery incident of 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is carried out by evaluating the top event probability at each Bayesian updating step by Monte Carlo sampling from the posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans.

The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. To test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC sampling based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions. Increasing the likelihood variance mitigates random measurement errors, but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques as it allows for the incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimating parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
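
As a minimal sketch of the conventional conjugate Poisson-Gamma updating that the PEWMA approach is compared against (the toy fault tree, priors, and event counts below are illustrative assumptions, not the thesis's Texas City model):

```python
import numpy as np

rng = np.random.default_rng(2)

def gamma_posterior(alpha0, beta0, failures, exposure_time):
    """Conjugate Poisson-Gamma update: prior Gamma(alpha0, beta0) on the
    failure rate, with `failures` events observed over `exposure_time`."""
    return alpha0 + failures, beta0 + exposure_time

def top_event_probability(posteriors, mission_time=1.0, n_samples=50000):
    """Monte Carlo estimate of the top-event probability for a toy tree
    TOP = (A AND B) OR C, with independent basic events."""
    p = []
    for alpha, beta in posteriors:
        lam = rng.gamma(alpha, 1.0 / beta, n_samples)   # posterior rate samples
        p.append(1.0 - np.exp(-lam * mission_time))     # P(event during mission)
    p_a, p_b, p_c = p
    return np.mean(1.0 - (1.0 - p_a * p_b) * (1.0 - p_c))

# Hypothetical priors and observed (failures, exposure time) for three basic events
posteriors = [gamma_posterior(1.0, 10.0, f, t)
              for f, t in [(2, 100.0), (0, 50.0), (1, 200.0)]]
print("P(top event):", top_event_probability(posteriors))
```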

Relevance:

100.00%

Publisher:

Abstract:

This thesis is part of the WA104-NESSiE project at CERN, which required the development of a charged-particle tracker able to operate in the presence of magnetic fields with a resolution of 1-2 mm on the reconstructed position. The thesis work concerned the analysis of data collected with a prototype of the tracker made of scintillator bars with a triangular cross section, coupled to SiPMs whose signals are acquired in analogue mode. The prototype was exposed to charged particles at the T9 beam line of the CERN PS in May 2016. The analysis chain was validated with data from a Geant4-based Monte Carlo simulation that models the response of the tracker to charged particles (pions and muons) at different momenta (1-10 GeV/c). A preliminary analysis of the real data was then performed and compared with the Monte Carlo simulation. The resolution obtained for 5 GeV pions is ∼2 mm, compatible with the value of ∼1.5 mm obtained from the Monte Carlo simulation. These results were obtained by analysing a fraction of the events acquired during the beam test. A more accurate measurement of the tracker resolution can be obtained by introducing corrections such as plane alignment and recalibration of the individual channel signals, and by analysing the full sample.
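
As a minimal illustration of how such a resolution figure can be extracted (the residuals below are synthetic stand-ins, not the test-beam data, and the thesis's analysis chain is more involved), one can fit a Gaussian to the track-residual distribution and take its sigma:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

def gaussian(x, amplitude, mean, sigma):
    return amplitude * np.exp(-0.5 * ((x - mean) / sigma) ** 2)

def position_resolution(residuals_mm, n_bins=60):
    """Fit a Gaussian to the track residuals (measured hit position minus
    fitted track position); the fitted sigma is the resolution."""
    counts, edges = np.histogram(residuals_mm, bins=n_bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p0 = [counts.max(), residuals_mm.mean(), residuals_mm.std()]
    (amplitude, mean, sigma), _ = curve_fit(gaussian, centers, counts, p0=p0)
    return abs(sigma)

# Stand-in residuals with ~2 mm spread, in place of the real data
residuals = rng.normal(0.0, 2.0, 10000)
print(f"estimated resolution: {position_resolution(residuals):.2f} mm")
```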

Relevance:

100.00%

Publisher:

Abstract:

Free energy calculations are a computational method for determining thermodynamic quantities, such as free energies of binding, via simulation. Currently, due to computational and algorithmic limitations, free energy calculations are limited in scope. In this work, we propose two methods for improving the efficiency of free energy calculations. First, we expand the state space of alchemical intermediates, and show that this expansion enables us to calculate free energies along lower-variance paths. We use Q-learning, a reinforcement learning technique, to discover and optimize paths at low computational cost. Second, we reduce the cost of sampling along a given path by using sequential Monte Carlo samplers. We develop a new free energy estimator, pCrooks (pairwise Crooks), a variant of the Crooks fluctuation theorem (CFT), which enables decomposition of the variance of the free energy estimate for discrete paths while retaining the beneficial characteristics of CFT. Combining these two advancements, we show that for some test models, optimal expanded-space paths achieve a nearly 80% reduction in variance relative to the standard path. Additionally, our free energy estimator converges at a more consistent rate and, on average, 1.8 times faster when path searching is enabled, even when the cost of path discovery and refinement is taken into account.
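
The pCrooks estimator itself is not reproduced here; as a minimal, textbook illustration of estimating a free energy difference from nonequilibrium work samples (Jarzynski exponential averaging, with synthetic Gaussian work values as a stand-in for simulation output):

```python
import numpy as np
from scipy.special import logsumexp

def jarzynski_delta_f(work, beta=1.0):
    """Free energy difference from forward nonequilibrium work samples,
    Delta F = -(1/beta) * ln <exp(-beta * W)>, computed with log-sum-exp
    for numerical stability."""
    work = np.asarray(work)
    return -(logsumexp(-beta * work) - np.log(work.size)) / beta

# Synthetic Gaussian work distribution: mean Delta F + sigma^2 / 2 (beta = 1)
rng = np.random.default_rng(4)
true_delta_f, sigma = 3.0, 1.0
work = rng.normal(true_delta_f + 0.5 * sigma**2, sigma, 5000)
print("estimated Delta F:", jarzynski_delta_f(work), " true:", true_delta_f)
```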

Relevance:

100.00%

Publisher:

Abstract:

Recent research indicates that characteristics of the El Niño–Southern Oscillation (ENSO) have changed over the past several decades. Here, I examined different flavors of El Niño in the observational record and the recent changes in the character of El Niño events. The fundamental physical processes that drive ENSO were described, and the Eastern Pacific (EP) and Central Pacific (CP) types, or flavors, of El Niño were defined. Using metrics from the peer-reviewed literature, I examined several historical data sets to interpret El Niño behavior from 1950 to 2010. A Monte Carlo simulation was then applied to output from coupled model simulations to test the statistical significance of recent observations of EP and CP El Niño. The results suggested that EP and CP El Niño had been occurring in a similar fashion over the past 60 years, consistent with natural variability, with no significant increase in CP El Niño behavior.
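
A minimal sketch of one such Monte Carlo significance test (the control-run length, event frequency, and observed count below are illustrative assumptions, not the study's coupled-model output or metrics): draw 60-year windows from a long control simulation and ask how often the simulated CP event count reaches the observed one.

```python
import numpy as np

rng = np.random.default_rng(5)

def mc_pvalue(model_event_flags, observed_count, window=60, n_draws=10000):
    """Monte Carlo p-value: fraction of randomly drawn `window`-year segments
    of a control run whose CP event count is at least the observed count."""
    starts = rng.integers(0, model_event_flags.size - window, n_draws)
    counts = np.array([model_event_flags[s:s + window].sum() for s in starts])
    return np.mean(counts >= observed_count)

# Stand-in control run: CP El Niño in ~10% of years over a 500-year simulation
control = rng.random(500) < 0.10
print("p-value for observing 9 CP events in 60 years:", mc_pvalue(control, 9))
```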

Relevance:

100.00%

Publisher:

Abstract:

The problem of decentralized sequential detection is studied in this thesis, where local sensors are memoryless, receive independent observations, and get no feedback from the fusion center. In addition to the traditional criteria of detection delay and error probability, we introduce a new constraint: the number of communications between the local sensors and the fusion center. This metric reflects both the cost of establishing communication links and the overall energy consumption over time. A new formulation for communication-efficient decentralized sequential detection is proposed in which the overall detection delay is minimized under constraints on both the error probabilities and the communication cost. Two types of problems are investigated based on this communication-efficient formulation: decentralized hypothesis testing and decentralized change detection.

In the former case, an asymptotically person-by-person optimum detection framework is developed, where the fusion center performs a sequential probability ratio test based on dependent observations. The proposed algorithm uses not only the statistics reported by the local sensors but also the reporting times. The asymptotic relative efficiency of the proposed algorithm with respect to the centralized strategy is expressed in closed form. When the probabilities of false alarm and missed detection are close to one another, a reduced-complexity algorithm is proposed based on a Poisson arrival approximation. Decentralized change detection with a communication cost constraint is also investigated: a person-by-person optimum change detection algorithm is proposed, where transmissions of sensing reports are modeled as a Poisson process and the optimum threshold value is obtained through dynamic programming. An alternative method with a simpler fusion rule is also proposed, where the threshold values are determined by a combination of sequential detection analysis and constrained optimization. In both the decentralized hypothesis testing and change detection problems, tradeoffs in parameter choices are investigated through Monte Carlo simulations.
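
The fusion-center test in the thesis operates on dependent reported statistics and reporting times; as a minimal reference point, the sketch below is only the classical centralized sequential probability ratio test it builds on, under an assumed Gaussian mean-shift model with illustrative parameters:

```python
import numpy as np

def sprt(log_likelihood_ratios, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test: accumulate per-sample
    log-likelihood ratios until an upper or lower threshold is crossed."""
    upper = np.log((1.0 - beta) / alpha)   # accept H1
    lower = np.log(beta / (1.0 - alpha))   # accept H0
    llr = 0.0
    for n, increment in enumerate(log_likelihood_ratios, start=1):
        llr += increment
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(log_likelihood_ratios)

# Example: H0: mu = 0 vs H1: mu = 1, unit-variance Gaussian observations
rng = np.random.default_rng(6)
x = rng.normal(1.0, 1.0, 1000)      # data generated under H1
llr_increments = x - 0.5            # log f1(x)/f0(x) for this model
print(sprt(llr_increments))         # e.g. ('H1', number_of_samples_used)
```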

Relevance:

100.00%

Publisher:

Abstract:

In the highly competitive world of modern finance, new derivatives are continually required to take advantage of changes in financial markets, and to hedge businesses against new risks. The research described in this paper aims to accelerate the development and pricing of new derivatives in two different ways. Firstly, new derivatives can be specified mathematically within a general framework, enabling new mathematical formulae to be specified rather than just new parameter settings. This Generic Pricing Engine (GPE) is expressively powerful enough to specify a wide range of standard pricing engines. Secondly, the associated price simulation using the Monte Carlo method is accelerated using GPU or multicore hardware. The parallel implementation (in OpenCL) is automatically derived from the mathematical description of the derivative. As a test, for a Basket Option Pricing Engine (BOPE) generated using the GPE, on the largest problem size, an NVidia GPU runs the generated pricing engine at 45 times the speed of a sequential, specific hand-coded implementation of the same BOPE. Thus a user can more rapidly devise, simulate and experiment with new derivatives without actual programming.
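
For context on what such a basket-option pricing engine computes (a CPU NumPy sketch, not the GPE's generated OpenCL code; all market parameters below are illustrative), a Monte Carlo price of a European basket call under correlated geometric Brownian motion looks like this:

```python
import numpy as np

rng = np.random.default_rng(7)

def basket_call_price(spots, weights, vols, corr, rate, maturity, strike,
                      n_paths=200000):
    """Monte Carlo price of a European basket call: simulate correlated
    terminal prices in one step, average the discounted payoff."""
    spots, weights, vols = map(np.asarray, (spots, weights, vols))
    chol = np.linalg.cholesky(corr)
    z = rng.standard_normal((n_paths, spots.size)) @ chol.T   # correlated normals
    drift = (rate - 0.5 * vols**2) * maturity
    diffusion = vols * np.sqrt(maturity) * z
    terminal = spots * np.exp(drift + diffusion)
    payoff = np.maximum(terminal @ weights - strike, 0.0)
    return np.exp(-rate * maturity) * payoff.mean()

corr = np.array([[1.0, 0.3], [0.3, 1.0]])
print(basket_call_price(spots=[100, 100], weights=[0.5, 0.5],
                        vols=[0.2, 0.3], corr=corr,
                        rate=0.01, maturity=1.0, strike=100))
```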

Relevance:

100.00%

Publisher:

Abstract:

This paper provides an empirical test of the child quantity–quality (QQ) trade-off predicted by unified growth theory. Using individual census returns from the 1911 Irish census, we examine whether children who attended school were from smaller families, as predicted by a standard QQ model. To measure causal effects, we use a selection of models robust to endogeneity concerns, which we validate for this application using an Empirical Monte Carlo analysis. Our results show that a child remaining in school between the ages of 14 and 16 caused up to a 27% reduction in fertility. Our results are robust to alternative estimation techniques with different modeling assumptions, sample selection, and alternative definitions of fertility. These findings highlight the importance of the demographic transition as a mechanism underpinning the expansion in human capital witnessed in Western economies during the twentieth century.