983 results for Monte-Carlo method


Relevance:

90.00%

Publisher:

Abstract:

Potassium content in tea brew was determined by gamma-ray spectroscopy, using the 1461 keV gamma-ray from (40)K, the naturally occurring radioactive isotope of potassium. We measured radiation with a shielded HPGe detector from individual test samples of tea leaves, before and after infusion preparation, and from commercial instant tea powder. The correction factor for the gamma-ray self-absorption in the extended source was determined with the help of Monte-Carlo simulations. This gamma-ray spectroscopy technique enabled the absolute determination of potassium content with a relative uncertainty smaller than 4%, at the one-standard-deviation confidence level, showing the feasibility of this method. An experiment to evaluate a possible systematic uncertainty due to K distribution heterogeneity in the sample was performed, with the result that the corresponding relative standard deviation is smaller than 2% at the 95% confidence level. (C) 2009 Elsevier Ltd. All rights reserved.
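
As a rough illustration of the correction described in this abstract, the sketch below estimates a self-absorption factor by Monte Carlo for an idealized slab source. The geometry, slab thickness, and attenuation coefficient are assumptions chosen for illustration, not the paper's actual setup.

```python
# Minimal sketch (not the authors' simulation): Monte Carlo estimate of the
# gamma-ray self-absorption correction for an extended slab source.
# Assumptions: homogeneous slab of thickness t_cm, photons counted only
# along the vertical axis toward the detector, and an illustrative
# attenuation coefficient mu (cm^-1) at 1461 keV.
import numpy as np

rng = np.random.default_rng(42)

t_cm = 2.0      # sample thickness (illustrative)
mu = 0.12       # linear attenuation coefficient (illustrative)
n = 1_000_000   # number of simulated decays

# Emission points are uniform in depth; each photon traverses the material
# above it before reaching the detector, so transmission = exp(-mu * depth).
depth = rng.uniform(0.0, t_cm, n)
transmission = np.exp(-mu * depth)

# The self-absorption correction factor is the mean transmission:
# measured_rate = true_rate * C, hence true_rate = measured_rate / C.
C = transmission.mean()
print(f"self-absorption correction factor C = {C:.4f}")
# Analytic check for this toy geometry: (1 - exp(-mu*t)) / (mu*t)
print(f"analytic value = {(1 - np.exp(-mu * t_cm)) / (mu * t_cm):.4f}")
```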

Relevance:

90.00%

Publisher:

Abstract:

This letter presents pseudolikelihood equations for the estimation of the Potts Markov random field model parameter on higher order neighborhood systems. The derived equation for second-order systems is a significantly reduced version of a recent result found in the literature (from 67 to 22 terms). Also, with the proposed method, a completely original equation for Potts model parameter estimation in third-order systems was obtained. These equations allow the modeling of less restrictive contextual systems for a large number of applications in a computationally feasible way. Experiments with both simulated and real remote sensing images provided good results.
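
For context, the sketch below shows maximum pseudolikelihood estimation of the Potts parameter on a first-order (4-neighbour) lattice; the letter's contribution is the analogous, much harder equations for second- and third-order neighborhood systems. The toy data and first-order setup are assumptions for illustration.

```python
# Minimal sketch: pseudolikelihood estimation of the Potts parameter beta
# on a first-order 2D lattice with toroidal boundary conditions.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import logsumexp

def neighbour_counts(x, q):
    """n[k, i, j] = number of 4-neighbours of site (i, j) with label k."""
    n = np.zeros((q,) + x.shape)
    for k in range(q):
        m = (x == k).astype(float)
        n[k] = (np.roll(m, 1, 0) + np.roll(m, -1, 0) +
                np.roll(m, 1, 1) + np.roll(m, -1, 1))
    return n

def neg_log_pseudolikelihood(beta, x, q):
    n = neighbour_counts(x, q)                          # (q, H, W)
    n_obs = np.take_along_axis(n, x[None, :, :], axis=0)[0]
    # log PL = sum_s [ beta * n_obs(s) - log sum_k exp(beta * n_k(s)) ]
    return -(beta * n_obs - logsumexp(beta * n, axis=0)).sum()

# Toy data: a noisy two-label image stands in for a real field.
rng = np.random.default_rng(0)
x = (rng.random((64, 64)) < 0.5).astype(int)
q = 2
res = minimize_scalar(neg_log_pseudolikelihood, bounds=(0.0, 3.0),
                      args=(x, q), method="bounded")
print(f"estimated beta = {res.x:.3f}")  # ~0 for pure noise, as expected
```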

Relevance:

90.00%

Publisher:

Abstract:

In this work we study the spectrum of the lowest screening masses for Yang-Mills theories on the lattice. We used the SU(2) gauge group in (3 + 1) dimensions. We adopted the multiple-exponential method and the so-called "variational" method, in order to detect possible excited states. The calculations were done near the critical temperature of the confinement-deconfinement phase transition. We obtained values for the ratios of the screening masses consistent with predictions from universality arguments. A Monte Carlo evolution of the screening masses in the gauge theory confirms the validity of the predictions.
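
The first step behind multi-exponential and variational analyses of this kind is extracting a mass from a Euclidean correlator; a minimal sketch on synthetic data (an assumption, not lattice output) follows.

```python
# Minimal sketch: extracting a screening mass from a correlator
# C(t) ~ A * exp(-m * t) via the effective-mass plateau. Excited states
# appear as deviations from the plateau at small t.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(16)
m_true, A = 0.8, 1.0
C = A * np.exp(-m_true * t) * (1 + 0.01 * rng.standard_normal(t.size))

# m_eff(t) = log( C(t) / C(t+1) ) approaches the lowest mass at large t.
m_eff = np.log(C[:-1] / C[1:])
print(np.round(m_eff, 3))
```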

Relevance:

90.00%

Publisher:

Abstract:

The generalized Birnbaum-Saunders distribution pertains to a class of lifetime models including both lighter and heavier tailed distributions. This model adapts well to lifetime data, even when outliers exist, and has other good theoretical properties and application perspectives. However, statistical inference tools for this model are generally not available in closed form. Hence, simulation and numerical studies are needed, which require a random number generator. Three different ways to generate observations from this model are considered here. These generators are compared through a goodness-of-fit procedure and through their effectiveness in recovering the true parameter values in Monte Carlo simulations. This goodness-of-fit procedure may also be used as an estimation method, and its quality is studied here. Finally, using a real data set, the generalized and classical Birnbaum-Saunders models are compared by means of this estimation method.
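
One standard generator for the classical Birnbaum-Saunders distribution transforms a normal variate; a sketch follows. The generalized model replaces the normal kernel by another symmetric one, and the Student-t option below is only an illustrative stand-in for that idea, not necessarily the paper's choice.

```python
# Minimal sketch: Birnbaum-Saunders variate generation from a symmetric
# kernel Z via T = beta * ( (alpha*Z/2) + sqrt((alpha*Z/2)**2 + 1) )**2.
import numpy as np

def rbs(n, alpha, beta, rng, kernel="normal", df=5):
    if kernel == "normal":
        z = rng.standard_normal(n)
    else:  # illustrative heavier-tailed kernel for a generalized variant
        z = rng.standard_t(df, n)
    w = alpha * z / 2.0
    return beta * (w + np.sqrt(w**2 + 1.0))**2

rng = np.random.default_rng(7)
t = rbs(100_000, alpha=0.5, beta=2.0, rng=rng)
# Sanity check: the median of a BS(alpha, beta) distribution is beta.
print(f"sample median = {np.median(t):.3f} (theory: 2.0)")
```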

Relevance:

90.00%

Publisher:

Abstract:

Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is interest in studying latent variables. These latent variables are directly considered in the Item Response Models (IRM) and are usually called latent traits. A usual assumption for parameter estimation of the IRM, considering one group of examinees, is that the latent traits are random variables which follow a standard normal distribution. However, many works suggest that this assumption does not hold in many cases. Furthermore, when this assumption does not hold, the parameter estimates tend to be biased and misleading inferences can be obtained. Therefore, it is important to model the distribution of the latent traits properly. In this paper we present an alternative latent trait model based on the so-called skew-normal distribution; see Genton (2004). We used the centred parameterization, which was proposed by Azzalini (1985). This approach ensures the identifiability of the model, as pointed out by Azevedo et al. (2009b). Also, a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm was built for parameter estimation by using an augmented data approach. A simulation study was performed in order to assess the parameter recovery of the proposed model and estimation method, and the effect of the asymmetry level of the latent trait distribution on parameter estimation. A comparison of our approach with other estimation methods (which assume symmetric normality for the latent trait distribution) was also considered. The results indicated that our proposed algorithm recovers all parameters properly. Specifically, the greater the asymmetry level, the better the performance of our approach compared with the other approaches, mainly in the presence of small sample sizes (numbers of examinees). Furthermore, we analyzed a real data set which shows indications of asymmetry in the latent trait distribution. The results obtained with our approach confirmed the presence of strong negative asymmetry of the latent trait distribution. (C) 2010 Elsevier B.V. All rights reserved.
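
To make the skew-normal setup concrete, the sketch below generates skew-normal latent traits from the standard convolution representation and converts the shape parameter to Azzalini's centred parameters (mean, standard deviation, skewness). This is textbook material, not the paper's MHWGS sampler.

```python
# Minimal sketch: skew-normal generation and centred parameterization.
import numpy as np

def rsn(n, lam, rng):
    """SN(0, 1, lam) via X = delta*|U0| + sqrt(1-delta^2)*U1,
    with delta = lam / sqrt(1 + lam^2)."""
    delta = lam / np.sqrt(1 + lam**2)
    u0, u1 = rng.standard_normal(n), rng.standard_normal(n)
    return delta * np.abs(u0) + np.sqrt(1 - delta**2) * u1

def centred_params(lam):
    """(mean, sd, skewness) of SN(0, 1, lam)."""
    delta = lam / np.sqrt(1 + lam**2)
    b = np.sqrt(2 / np.pi)
    mu = b * delta
    sd = np.sqrt(1 - mu**2)
    gamma1 = ((4 - np.pi) / 2) * mu**3 / (1 - mu**2)**1.5
    return mu, sd, gamma1

rng = np.random.default_rng(3)
x = rsn(200_000, lam=5.0, rng=rng)
mu, sd, g1 = centred_params(5.0)
print(f"theory:  mean={mu:.3f} sd={sd:.3f} skew={g1:.3f}")
print(f"sample:  mean={x.mean():.3f} sd={x.std():.3f}")
```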

Relevance:

90.00%

Publisher:

Abstract:

This work aims at combining the postulates of Chaos theory with the classification and predictive capability of Artificial Neural Networks in the field of financial time series prediction. Chaos theory provides valuable qualitative and quantitative tools to decide on the predictability of a chaotic system. Quantitative measurements based on Chaos theory are used to decide a priori whether a time series, or a portion of a time series, is predictable, while Chaos theory based qualitative tools are used to provide further observations and analysis on the predictability in cases where the measurements provide negative answers. Phase space reconstruction is achieved by time-delay embedding, resulting in multiple embedded vectors. The suggested cognitive approach is inspired by the ability of some chartists to predict the direction of an index by looking at the price time series. Thus, in this work, the calculation of the embedding dimension and the separation in Takens' embedding theorem for phase space reconstruction is not limited to False Nearest Neighbor, Differential Entropy, or any other specific method; rather, this work considers all embedding dimensions and separations, regarded as different ways of looking at a time series by different chartists, based on their expectations. Prior to the prediction, the embedded vectors of the phase space are classified with Fuzzy-ART; then, for each class, a back-propagation Neural Network is trained to predict the last element of each vector, with all previous elements of a vector used as features.
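
The time-delay embedding at the core of this pipeline is short enough to sketch directly; the Fuzzy-ART clustering and network training stages are omitted here, and the sinusoid is a toy stand-in for a price series.

```python
# Minimal sketch: Takens time-delay embedding of a scalar series. The last
# element of each embedded vector is the prediction target; the preceding
# elements are the features, as described in the abstract.
import numpy as np

def embed(x, m, tau):
    """Return embedded vectors of dimension m with delay tau."""
    n = len(x) - (m - 1) * tau
    return np.array([x[i:i + (m - 1) * tau + 1:tau] for i in range(n)])

x = np.sin(0.3 * np.arange(200))           # toy "price" series
vectors = embed(x, m=4, tau=2)
features, targets = vectors[:, :-1], vectors[:, -1]
print(vectors.shape)                        # (194, 4)
```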

Relevance:

90.00%

Publisher:

Abstract:

Random effect models have been widely applied in many fields of research. However, models with uncertain design matrices for the random effects have received little attention. In some applications with such problems, an expectation method has been used for simplicity, but this method discards the extra information carried by the uncertainty in the design matrix. A closed-form solution for this problem is generally difficult to attain. We therefore propose a two-step algorithm for estimating the parameters, especially the variance components of the model. The implementation is based on Monte Carlo approximation and a Newton-Raphson-based EM algorithm. As an example, a simulated genetics dataset was analyzed. The results showed that the proportion of the total variance explained by the random effects was accurately estimated, whereas it was severely underestimated by the expectation method. By introducing heuristic search and optimization methods, the algorithm can be further developed to infer the 'model-based' best design matrix and the corresponding best estimates.
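
In the same spirit, though much simpler than the paper's Newton-Raphson-based EM scheme, one can Monte Carlo average the marginal likelihood over draws of the uncertain design matrix and maximize it directly; the assignment probabilities, model, and optimizer below are all assumptions for illustration.

```python
# Minimal sketch (not the paper's algorithm): variance-component estimation
# with an uncertain random-effect design matrix Z, by maximizing the
# Monte Carlo average of the marginal likelihood over sampled Z's.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal
from scipy.special import logsumexp

rng = np.random.default_rng(11)
n, q, M = 60, 4, 50                       # observations, groups, MC draws

# Truth used to simulate data: sigma_u^2 = 2, sigma_e^2 = 1.
true_groups = rng.integers(0, q, n)
Z_true = np.eye(q)[true_groups]
y = Z_true @ rng.normal(0, np.sqrt(2.0), q) + rng.normal(0, 1.0, n)

# Uncertain design: each row's group is known only up to probabilities.
probs = np.full((n, q), 0.1 / (q - 1)); probs[np.arange(n), true_groups] = 0.9
Z_draws = [np.eye(q)[[rng.choice(q, p=p) for p in probs]] for _ in range(M)]

def neg_mc_loglik(theta):
    s2u, s2e = np.exp(theta)              # optimize on the log scale
    lls = [multivariate_normal.logpdf(y, mean=np.zeros(n),
                                      cov=s2u * Z @ Z.T + s2e * np.eye(n))
           for Z in Z_draws]
    return -(logsumexp(lls) - np.log(M))  # log of the MC-averaged likelihood

res = minimize(neg_mc_loglik, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
print("estimated (sigma_u^2, sigma_e^2):", np.exp(res.x).round(2))
```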

Relevance:

90.00%

Publisher:

Abstract:

This study presents an approach to combining uncertainties of hydrological model outputs predicted by a number of machine learning models. The machine learning based uncertainty prediction approach is very useful for estimating a hydrological model's uncertainty in a particular hydro-meteorological situation in real-time applications [1]. In this approach, hydrological model realizations from Monte Carlo simulations are used to build different machine learning uncertainty models that predict the uncertainty (quantiles of the pdf) of the deterministic output from the hydrological model. The uncertainty models are trained using antecedent precipitation and streamflows as inputs, and the trained models are then employed to predict the model output uncertainty specific to the new input data. We used three machine learning models, namely artificial neural networks, model trees, and locally weighted regression, to predict output uncertainties. These three models produce similar verification results, which can be improved by merging their outputs dynamically. We propose an approach that forms a committee of the three models to combine their outputs. The approach is applied to estimate the uncertainty of streamflow simulations from a conceptual hydrological model in the Brue catchment in the UK and the Bagmati catchment in Nepal. The verification results show that the merged output is better than the individual model outputs. [1] D. L. Shrestha, N. Kayastha, D. P. Solomatine, and R. Price. Encapsulation of parametric uncertainty statistics by various predictive machine learning models: MLUE method, Journal of Hydroinformatics, in press, 2013.
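
One plausible way to form such a committee is to weight each model's quantile forecast by the inverse of its validation pinball loss; the sketch below assumes the three models' predictions are already computed (the synthetic arrays stand in for ANN, model-tree, and locally weighted regression outputs), and this weighting rule is an illustrative choice, not necessarily the paper's merging scheme.

```python
# Minimal sketch: a committee of quantile forecasters weighted by inverse
# validation pinball loss.
import numpy as np

def pinball_loss(y, q_pred, tau):
    """Standard check (pinball) loss for a tau-quantile forecast."""
    d = y - q_pred
    return np.mean(np.maximum(tau * d, (tau - 1) * d))

def committee(preds_val, y_val, preds_new, tau):
    """Weight each model's quantile forecast by 1 / validation loss."""
    losses = np.array([pinball_loss(y_val, p, tau) for p in preds_val])
    w = (1.0 / losses) / np.sum(1.0 / losses)
    return w, np.tensordot(w, preds_new, axes=1)

# Toy illustration with three synthetic "models" of the 0.9 quantile:
rng = np.random.default_rng(5)
y_val = rng.gamma(2.0, 1.0, 500)
q90 = 3.89                                  # approx. true 0.9 quantile
preds_val = [q90 + rng.normal(b, s, 500)    # biased/noisy model outputs
             for b, s in [(0.0, 0.2), (0.5, 0.1), (-0.3, 0.3)]]
preds_new = np.array([p[:10] for p in preds_val])
w, merged = committee(preds_val, y_val, preds_new, tau=0.9)
print("committee weights:", w.round(3))
```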

Relevance:

90.00%

Publisher:

Abstract:

In this work, emphasis is placed on the inclusion of uncertainties in the evaluation of structural behavior, aiming at a better representation of the system's characteristics and a quantification of the significance of these uncertainties in design. Comparisons are made between existing classical reliability analysis techniques, such as FORM, direct Monte Carlo simulation (MC), and Monte Carlo simulation with adaptive importance sampling (MCIS), and the approximate methods of Response Surface (RS) and Artificial Neural Networks (ANN). Where possible, the comparisons highlight the advantages and drawbacks of using one technique or another in problems of increasing complexity. Formulations ranging from explicit limit state functions to implicit formulations with spatial variability of loading and material properties, including stochastic fields, are analyzed. Particular attention is given to the reliability analysis of reinforced concrete structures, including the effect of the spatial variability of their properties. To this end, a finite element model for the representation of reinforced concrete is proposed, incorporating the main characteristics observed in this material. A model for the generation of multidimensional non-Gaussian stochastic fields for the material properties, independent of the finite element mesh, was also developed, and techniques were implemented to accelerate the structural evaluations required by any of the employed techniques. For the treatment of reliability through the Response Surface technique, the algorithm developed by Rajashekhar et al. (1993) was implemented. For the treatment through Artificial Neural Networks, codes were developed for the simulation of multilayer perceptron networks and radial basis function networks, which were then implemented in the reliability evaluation algorithm developed by Shao et al. (1997). In general, it was observed that the simulation techniques perform rather poorly in more complex problems, so that the first-order FORM technique and the approximate Response Surface and Artificial Neural Network techniques stand out, although their accuracy is impaired by the approximations involved.
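
The baseline against which all of these methods are judged is crude Monte Carlo; a minimal sketch for an explicit limit state function g(R, S) = R - S follows, with illustrative parameter values.

```python
# Minimal sketch: crude Monte Carlo estimate of a failure probability
# Pf = P(g < 0) for g(R, S) = R - S with normal resistance and load.
import numpy as np

rng = np.random.default_rng(2024)
n = 2_000_000

R = rng.normal(10.0, 1.0, n)   # resistance (illustrative)
S = rng.normal(6.0, 1.5, n)    # load effect (illustrative)
fail = (R - S) < 0.0

pf = fail.mean()
se = np.sqrt(pf * (1 - pf) / n)            # MC standard error
print(f"Pf ~= {pf:.2e} +/- {se:.1e}")
# Analytic check: beta = (10-6)/sqrt(1 + 1.5**2) ~= 2.22, Pf = Phi(-beta)
```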

Relevance:

90.00%

Publisher:

Abstract:

This Thesis is the result of my Master Degree studies at the Graduate School of Economics, Getúlio Vargas Foundation, from January 2004 to August 2006. I am indebted to my Thesis Advisor, Professor Luiz Renato Lima, who introduced me to the world of Econometrics. In this Thesis, we study time-varying quantile processes and we develop two applications, which are presented here as Part I and Part II. Each of these parts was turned into a paper, and both papers were submitted. Part I shows that asymmetric persistence induces ARCH effects, but the LM-ARCH test has power against it. On the other hand, the test for asymmetric dynamics proposed by Koenker and Xiao (2004) has correct size under the presence of ARCH errors. These results suggest that the LM-ARCH and the Koenker-Xiao tests may be used in applied research as complementary tools. In Part II, we compare four different Value-at-Risk (VaR) methodologies through Monte Carlo experiments. Our results indicate that the method based on quantile regression with ARCH effects dominates other methods that require distributional assumptions. In particular, we show that the non-robust methodologies have a higher probability of predicting VaRs with too many violations. We illustrate our findings with an empirical exercise in which we estimate the VaR for returns of the São Paulo stock exchange index, IBOVESPA, during periods of market turmoil. Our results indicate that the robust method based on quantile regression presents the smallest number of violations.
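
The LM-ARCH test used in Part I is, presumably, Engle's classical Lagrange multiplier test for ARCH effects; a minimal sketch follows, with simulated homoskedastic data standing in for real residuals.

```python
# Minimal sketch: Engle's LM test for ARCH. Regress squared residuals on
# their own lags; under the null of no ARCH, T * R^2 is asymptotically
# chi-squared with `lags` degrees of freedom.
import numpy as np
from scipy.stats import chi2

def lm_arch(e, lags=4):
    e2 = e**2
    Y = e2[lags:]
    X = np.column_stack([np.ones(Y.size)] +
                        [e2[lags - k:-k] for k in range(1, lags + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    r2 = 1.0 - resid.var() / Y.var()
    stat = Y.size * r2
    return stat, chi2.sf(stat, df=lags)

rng = np.random.default_rng(9)
stat, pval = lm_arch(rng.standard_normal(1000))  # homoskedastic null
print(f"LM stat = {stat:.2f}, p-value = {pval:.3f}")
```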

Relevance:

90.00%

Publisher:

Abstract:

This paper deals with the testing of autoregressive conditional duration (ACD) models by gauging the distance between the parametric density and hazard rate functions implied by the duration process and their non-parametric estimates. We derive the asymptotic justification using the functional delta method for fixed and gamma kernels, and then investigate the finite-sample properties through Monte Carlo simulations. Although our tests display some size distortion, bootstrapping suffices to correct the size without compromising their excellent power. We show the practical usefulness of such testing procedures for the estimation of intraday volatility patterns.
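
The gamma kernels mentioned above avoid boundary bias for non-negative durations; the sketch below uses Chen's gamma-kernel form and compares the resulting density estimate to a fitted exponential, which stands in (as an assumption) for the density implied by an estimated ACD model.

```python
# Minimal sketch: gamma-kernel density estimation for durations, with a
# naive integrated squared distance to a parametric fit. The paper turns
# statistics of this kind into formal tests via the functional delta method.
import numpy as np
from scipy.stats import gamma, expon

def gamma_kde(grid, data, b):
    """f_hat(x) = mean_i Gamma(data_i; shape = x/b + 1, scale = b)."""
    return np.array([gamma.pdf(data, a=x / b + 1.0, scale=b).mean()
                     for x in grid])

rng = np.random.default_rng(8)
durations = rng.exponential(1.0, 2000)
grid = np.linspace(0.01, 5.0, 100)

f_np = gamma_kde(grid, durations, b=0.1)          # nonparametric estimate
f_par = expon.pdf(grid, scale=durations.mean())   # parametric benchmark

dist = np.sum((f_np - f_par)**2) * (grid[1] - grid[0])
print(f"integrated squared distance = {dist:.4f}")
```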

Relevance:

90.00%

Publisher:

Abstract:

In this paper, we compare four different Value-at-Risk (VaR) methodologies through Monte Carlo experiments. Our results indicate that the method based on quantile regression with ARCH effects dominates other methods that require distributional assumptions. In particular, we show that the non-robust methodologies have a higher probability of predicting VaRs with too many violations. We illustrate our findings with an empirical exercise in which we estimate the VaR for returns of the São Paulo stock exchange index, IBOVESPA, during periods of market turmoil. Our results indicate that the robust method based on quantile regression presents the smallest number of violations.
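
Violation counting, the comparison criterion used here, can be formalized; the sketch below applies Kupiec's unconditional coverage test to toy data with a static normal 99% VaR, both of which are assumptions for illustration.

```python
# Minimal sketch: counting VaR violations and Kupiec's LR test of
# H0: violation probability equals alpha.
import numpy as np
from scipy.stats import chi2

def kupiec_test(returns, var_forecasts, alpha=0.01):
    viol = returns < var_forecasts        # VaR quoted as a (negative) return
    x, n = viol.sum(), viol.size
    p_hat = x / n
    lr = -2 * (x * np.log(alpha) + (n - x) * np.log(1 - alpha)
               - x * np.log(p_hat) - (n - x) * np.log(1 - p_hat))
    return x, lr, chi2.sf(lr, df=1)

rng = np.random.default_rng(99)
returns = rng.standard_normal(1000) * 0.02     # toy daily returns
var99 = np.full(1000, 0.02 * -2.326)           # static normal 99% VaR
x, lr, pval = kupiec_test(returns, var99)
print(f"violations = {x} (expected ~10), LR = {lr:.2f}, p = {pval:.3f}")
```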

Relevance:

90.00%

Publisher:

Abstract:

The objectives of this work were (i) to review numerical methods for derivative pricing; and (ii) to compare the methods under the assumption that market prices reflect those obtained by the Black-Scholes formula for pricing European-style options. We applied these methods to price call options on Telebrás stock. The criteria of accuracy and computational cost were used to compare the following models: binomial, Monte Carlo, and finite differences. The results indicate that the binomial model has good accuracy and low cost, followed by Monte Carlo and finite differences. However, the Monte Carlo method could be used when the derivative depends on more than two underlying assets. The finite difference method is recommended when one can obtain a partial differential equation whose solution is the value of the derivative.
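
For a European call, two of the three methods can be checked directly against the Black-Scholes benchmark; a minimal sketch follows, with illustrative parameters rather than the Telebrás data.

```python
# Minimal sketch: Black-Scholes vs. CRR binomial vs. Monte Carlo pricing
# of a European call option.
import numpy as np
from scipy.stats import norm

S0, K, r, sigma, T = 100.0, 100.0, 0.10, 0.30, 0.5

def black_scholes_call(S0, K, r, sigma, T):
    d1 = (np.log(S0 / K) + (r + sigma**2 / 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def binomial_call(S0, K, r, sigma, T, n=500):
    dt = T / n
    u, d = np.exp(sigma * np.sqrt(dt)), np.exp(-sigma * np.sqrt(dt))
    p = (np.exp(r * dt) - d) / (u - d)          # risk-neutral probability
    j = np.arange(n + 1)
    payoff = np.maximum(S0 * u**j * d**(n - j) - K, 0.0)
    for _ in range(n):                          # backward induction
        payoff = np.exp(-r * dt) * (p * payoff[1:] + (1 - p) * payoff[:-1])
    return payoff[0]

def mc_call(S0, K, r, sigma, T, n=1_000_000, seed=0):
    z = np.random.default_rng(seed).standard_normal(n)
    ST = S0 * np.exp((r - sigma**2 / 2) * T + sigma * np.sqrt(T) * z)
    return np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

print(f"Black-Scholes: {black_scholes_call(S0, K, r, sigma, T):.4f}")
print(f"Binomial:      {binomial_call(S0, K, r, sigma, T):.4f}")
print(f"Monte Carlo:   {mc_call(S0, K, r, sigma, T):.4f}")
```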

Relevance:

90.00%

Publisher:

Abstract:

This paper presents semiparametric estimators of changes in inequality measures of a dependent variable distribution, taking into account possible changes in the distributions of covariates. When we do not impose parametric assumptions on the conditional distribution of the dependent variable given covariates, this problem becomes equivalent to estimating the distributional impacts of interventions (treatment) when selection into the program is based on observable characteristics. The distributional impacts of a treatment are calculated as differences in inequality measures of the potential outcomes of receiving and not receiving the treatment. These differences are called here Inequality Treatment Effects (ITE). The estimation procedure involves a first non-parametric step in which the probability of receiving treatment given covariates, the propensity score, is estimated. In the second step, the inverse probability weighting method is used to estimate parameters of the marginal distributions of the potential outcomes, and weighted sample versions of the inequality measures are computed. Root-N consistency, asymptotic normality, and semiparametric efficiency are shown for the proposed semiparametric estimators. A Monte Carlo exercise is performed to investigate the finite-sample behavior of the estimators derived in the paper. We also apply our method to the evaluation of a job training program.
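
A minimal two-step sketch of this idea on simulated data (not the job-training application) follows: fit the propensity score by logistic regression, reweight each potential-outcome sample by inverse probability, and take the difference in a weighted Gini coefficient as the inequality treatment effect.

```python
# Minimal sketch: inequality treatment effect via inverse probability
# weighting, using the Gini coefficient as the inequality measure.
import numpy as np
from sklearn.linear_model import LogisticRegression

def weighted_gini(y, w):
    order = np.argsort(y)
    y, w = y[order], w[order]
    w = w / w.sum()
    lorenz = np.cumsum(w * y) / np.sum(w * y)
    prev = np.concatenate(([0.0], lorenz[:-1]))
    # Gini = 1 - 2 * (trapezoid area under the weighted Lorenz curve)
    return 1.0 - np.sum(w * (lorenz + prev))

rng = np.random.default_rng(6)
n = 5000
x = rng.standard_normal((n, 2))
p = 1 / (1 + np.exp(-(0.5 * x[:, 0] - 0.5 * x[:, 1])))  # true propensity
d = rng.random(n) < p                                    # treatment indicator
y = np.exp(1.0 + 0.5 * x[:, 0] + 0.3 * d + 0.5 * rng.standard_normal(n))

ps = LogisticRegression().fit(x, d).predict_proba(x)[:, 1]
w1, w0 = d / ps, (1 - d) / (1 - ps)     # IPW weights for treated / control
ite = weighted_gini(y[d], w1[d]) - weighted_gini(y[~d], w0[~d])
print(f"estimated Gini treatment effect = {ite:+.4f}")
```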

Relevance:

90.00%

Publisher:

Abstract:

This paper develops nonparametric tests of independence between two stationary stochastic processes. The testing strategy boils down to gauging the closeness between the joint and the product of the marginal stationary densities. For that purpose, I take advantage of a generalized entropic measure so as to build a class of nonparametric tests of independence. Asymptotic normality and local power are derived using the functional delta method for kernels, whereas finite sample properties are investigated through Monte Carlo simulations.
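
As a loose illustration of the idea (not the paper's statistic), the sketch below gauges the closeness of a joint kernel density estimate to the product of the marginal estimates, using a Shannon-type entropic functional in place of the paper's generalized entropic measure, and calibrates it by permutation; genuine time-series applications would need a block permutation scheme instead.

```python
# Minimal sketch: nonparametric independence check via kernel density
# estimates and a mutual-information-style statistic with a permutation
# p-value.
import numpy as np
from scipy.stats import gaussian_kde

def mi_statistic(x, y):
    f_xy = gaussian_kde(np.vstack([x, y]))
    f_x, f_y = gaussian_kde(x), gaussian_kde(y)
    pts = np.vstack([x, y])
    # sample-average estimate of E[ log f_xy / (f_x * f_y) ]
    return np.mean(np.log(f_xy(pts) / (f_x(x) * f_y(y))))

rng = np.random.default_rng(4)
x = rng.standard_normal(400)
y = 0.6 * x + 0.8 * rng.standard_normal(400)   # dependent pair

stat = mi_statistic(x, y)
perm = np.array([mi_statistic(x, rng.permutation(y)) for _ in range(199)])
pval = (1 + np.sum(perm >= stat)) / 200
print(f"statistic = {stat:.3f}, permutation p-value = {pval:.3f}")
```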