992 results for Monte Carlo simulation
Abstract:
Model uncertainties, parameter uncertainties, epistemic, aleatory, Monte Carlo method, heat exchanger, wastewater treatment plant, information, plant design, plant safety
Abstract:
Electromagnetic compatibility, lightning, crosstalk surge voltages, Monte Carlo simulation, accident initiator
Abstract:
Count data on juvenile blue crabs (Callinectes sapidus Rathbun, 1896) collected in two estuaries of Rio Grande do Sul are the object of the present study. Because these data are zero-inflated, they motivated the formulation of hierarchical models that quantify the effect of the categorical covariates month and site on the probability of occurrence and on the density of these populations, accounting for imperfect detection. Non-hierarchical models were also developed for comparison. A Bayesian approach was adopted to estimate the model parameters by Markov chain Monte Carlo (MCMC) simulation. Models were compared using the Deviance Information Criterion (DIC). The hierarchical models fit better than the conventional models, mitigated the excess-zeros problem, and allowed the occurrence probabilities and the density of juvenile blue crabs to be analyzed simultaneously. In the Lagoa dos Patos estuary, the probability of occurrence of Class 2 juveniles increases with distance from the mouth, whereas in Tramandaí the intermediate sites show the highest probabilities. In both estuaries, occurrence is most likely in the summer and winter months. The density of Class 2 juveniles varies markedly across the months of the year and is, in general, higher in the Tramandaí estuary.
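As a rough illustration of the estimation machinery mentioned above, and not the paper's hierarchical occupancy/detection model, the sketch below fits a simple zero-inflated Poisson model to simulated counts with a random-walk Metropolis sampler; the data, starting values, and tuning constants are all hypothetical.

```python
# Minimal sketch: random-walk Metropolis for a zero-inflated Poisson model.
# Illustrative only; hypothetical data and tuning, not the paper's hierarchical model.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)

# Simulated counts: with probability (1 - psi) a structural zero, else Poisson(lam)
true_psi, true_lam = 0.6, 3.0
n = 500
occupied = rng.random(n) < true_psi
y = np.where(occupied, rng.poisson(true_lam, n), 0)

def log_lik(psi, lam):
    if not (0 < psi < 1) or lam <= 0:
        return -np.inf
    p_zero = (1 - psi) + psi * np.exp(-lam)          # mixture mass at zero
    ll = np.where(y == 0, np.log(p_zero),
                  np.log(psi) + poisson.logpmf(y, lam))
    return ll.sum()

# Random-walk Metropolis over (psi, lam) with flat priors on the valid region
draws, cur = [], np.array([0.5, 1.0])
cur_ll = log_lik(*cur)
for _ in range(20000):
    prop = cur + rng.normal(scale=[0.05, 0.1])
    prop_ll = log_lik(*prop)
    if np.log(rng.random()) < prop_ll - cur_ll:
        cur, cur_ll = prop, prop_ll
    draws.append(cur.copy())

post = np.array(draws)[5000:]                         # discard burn-in
print("posterior means (psi, lam):", post.mean(axis=0))
```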
Abstract:
This note describes ParallelKnoppix, a bootable CD that allows econometricians with average knowledge of computers to create and begin using a high performance computing cluster for parallel computing in very little time. The computers used may be heterogeneous machines, and clusters of up to 200 nodes are supported. When the cluster is shut down, all machines are in their original state, so their temporary use in the cluster does not interfere with their normal uses. An example shows how a Monte Carlo study of a bootstrap test procedure may be done in parallel. Using a cluster of 20 nodes, the example runs approximately 20 times faster than it does on a single computer.
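The cluster setup itself cannot be reproduced here, but the embarrassingly parallel structure the note exploits can be sketched with Python's standard multiprocessing module standing in for the cluster: each worker runs an independent batch of Monte Carlo replications and the rejection counts are pooled. A plain t-test stands in for the bootstrap test procedure, and the replication counts and seeds are arbitrary.

```python
# Sketch of an embarrassingly parallel Monte Carlo study (illustrative; a t-test
# stands in for the bootstrap test procedure, and all constants are arbitrary).
import numpy as np
from multiprocessing import Pool
from scipy import stats

N_OBS, ALPHA = 50, 0.05

def batch_rejection_count(args):
    seed, n_reps = args
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_reps):
        x = rng.normal(0.0, 1.0, N_OBS)          # data generated under H0: mean = 0
        _, pval = stats.ttest_1samp(x, 0.0)
        rejections += pval < ALPHA
    return rejections

if __name__ == "__main__":
    n_workers, reps_per_worker = 4, 2500
    tasks = [(seed, reps_per_worker) for seed in range(n_workers)]
    with Pool(n_workers) as pool:
        counts = pool.map(batch_rejection_count, tasks)
    total = n_workers * reps_per_worker
    print(f"empirical size: {sum(counts) / total:.3f}  (nominal {ALPHA})")
```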
Abstract:
This paper shows how a high level matrix programming language may be used to perform Monte Carlo simulation, bootstrapping, estimation by maximum likelihood and GMM, and kernel regression in parallel on symmetric multiprocessor computers or clusters of workstations. The implementation of parallelization is done in a way such that an investigator may use the programs without any knowledge of parallel programming. A bootable CD that allows rapid creation of a cluster for parallel computing is introduced. Examples show that parallelization can lead to important reductions in computational time. Detailed discussion of how the Monte Carlo problem was parallelized is included as an example for learning to write parallel programs for Octave.
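The paper's Octave programs are not reproduced here; as a toy stand-in for one of the listed tasks, the fragment below computes nonparametric bootstrap standard errors for an OLS slope on a made-up data-generating process (serial Python, purely illustrative).

```python
# Toy nonparametric bootstrap for an OLS slope (serial Python sketch, not the
# paper's Octave code; the data-generating process below is made up).
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.normal(scale=0.8, size=n)
X = np.column_stack([np.ones(n), x])

def ols_slope(Xm, yv):
    beta, *_ = np.linalg.lstsq(Xm, yv, rcond=None)
    return beta[1]

B = 999
boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, size=n)              # resample rows with replacement
    boot[b] = ols_slope(X[idx], y[idx])

print("slope:", ols_slope(X, y), " bootstrap s.e.:", boot.std(ddof=1))
```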
Abstract:
Ever since the appearance of the ARCH model [Engle (1982a)], an impressive array of variance specifications belonging to the same class of models has emerged [e.g. Bollerslev's (1986) GARCH; Nelson's (1990) EGARCH]. This line of research has developed rapidly. Nevertheless, several empirical studies suggest that the performance of such models is not always adequate [Boulier (1992)]. In this paper we propose a new specification: the Quadratic Moving Average Conditional Heteroskedasticity (QMACH) model. Its statistical properties, such as kurtosis and symmetry, as well as two estimators (Method of Moments and Maximum Likelihood), are studied. Two statistical tests are presented: the first tests for homoskedasticity and the second discriminates between the ARCH and QMACH specifications. A Monte Carlo study is presented in order to illustrate some of the theoretical results. An empirical study is undertaken for the DM-US exchange rate.
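The QMACH recursion itself is not spelled out in this abstract, so the sketch below illustrates the general flavour of such a Monte Carlo study with a plain ARCH(1) process as a stand-in: simulate data, estimate the parameters by maximum likelihood, and repeat. All parameter values and sample sizes are hypothetical.

```python
# Monte Carlo sketch for a conditional-heteroskedasticity model, using ARCH(1)
# as a stand-in for the QMACH specification (which is not given in the abstract).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
OMEGA, ALPHA1, T = 0.2, 0.5, 1000        # hypothetical true values

def simulate_arch1(omega, alpha1, T):
    y, h = np.zeros(T), np.zeros(T)
    h[0] = omega / (1 - alpha1)          # start at the unconditional variance
    y[0] = np.sqrt(h[0]) * rng.standard_normal()
    for t in range(1, T):
        h[t] = omega + alpha1 * y[t - 1] ** 2
        y[t] = np.sqrt(h[t]) * rng.standard_normal()
    return y

def neg_loglik(params, y):
    omega, alpha1 = params
    if omega <= 0 or alpha1 < 0 or alpha1 >= 1:
        return np.inf
    h = omega + alpha1 * np.concatenate(([omega / (1 - alpha1)], y[:-1] ** 2))
    return 0.5 * np.sum(np.log(h) + y ** 2 / h)

estimates = []
for _ in range(200):                      # 200 Monte Carlo replications
    y = simulate_arch1(OMEGA, ALPHA1, T)
    res = minimize(neg_loglik, x0=[0.1, 0.3], args=(y,), method="Nelder-Mead")
    estimates.append(res.x)

print("mean ML estimates (omega, alpha1):", np.mean(estimates, axis=0))
```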
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
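A minimal sketch of the core idea under a toy model (the model, bandwidth, and sample sizes are all assumptions): simulate a long path at a trial parameter value, then apply a Nadaraya-Watson kernel regression to the simulated pairs to approximate a conditional moment at chosen conditioning points.

```python
# Sketch of approximating a conditional moment E[y | x] by kernel smoothing
# over a long simulation (toy AR(1) model; bandwidth and sizes are arbitrary).
import numpy as np

rng = np.random.default_rng(3)

def simulate(theta, T):
    """Toy simulable model: AR(1) with autoregressive parameter theta."""
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = theta * y[t - 1] + rng.standard_normal()
    return y

def kernel_conditional_mean(x_grid, x_sim, y_sim, bandwidth):
    """Nadaraya-Watson estimate of E[y | x = x_grid] from simulated pairs."""
    u = (x_grid[:, None] - x_sim[None, :]) / bandwidth
    w = np.exp(-0.5 * u ** 2)                     # Gaussian kernel weights
    return (w * y_sim).sum(axis=1) / w.sum(axis=1)

theta_trial = 0.7
sim = simulate(theta_trial, T=200_000)            # long simulation at trial value
x_sim, y_sim = sim[:-1], sim[1:]                  # condition y_t on y_{t-1}

x_points = np.array([-1.0, 0.0, 1.0])
m_hat = kernel_conditional_mean(x_points, x_sim, y_sim, bandwidth=0.1)
print("kernel estimate of E[y_t | y_{t-1}]:", m_hat)
print("exact conditional mean:            ", theta_trial * x_points)
```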
Abstract:
Assuming the role of debt management is to provide hedging against fiscal shocks, we consider three questions: i) what indicators can be used to assess the performance of debt management? ii) how well have historical debt management policies performed? and iii) how is that performance affected by variations in debt issuance? We consider these questions using OECD data on the market value of government debt between 1970 and 2000. Motivated by both the optimal taxation literature and broad considerations of debt stability, we propose a range of performance indicators for debt management. We evaluate these using Monte Carlo analysis and find that those based on the relative persistence of debt perform best. Calculating these measures for OECD data provides only limited evidence that debt management has helped insulate policy against unexpected fiscal shocks. We also find that the degree of fiscal insurance achieved is not well connected to cross-country variations in debt issuance patterns. Given the limited volatility observed in the yield curve, the relatively small dispersion of debt management practices across countries makes little difference to the realised degree of fiscal insurance.
Abstract:
BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk. OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare with a range of risk- and age-based alternative strategies. DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model. DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008). TARGET POPULATION: U.S. population age 35 to 85 years. TIME HORIZON: 2010 to 2040. PERSPECTIVE: Health care system. INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins). OUTCOME MEASURE: Incremental cost-effectiveness. RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings. RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day. LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups. CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes. FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.
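The CHD Policy Model itself is far more detailed, but the basic mechanics of a Markov-type cost-effectiveness comparison can be sketched as follows: a cohort moves between health states each cycle, accruing discounted costs and QALYs, and two strategies are compared by their incremental cost-effectiveness ratio. All states, transition probabilities, costs, and utilities here are invented for illustration and bear no relation to the model's actual inputs.

```python
# Minimal Markov cohort sketch of a cost-effectiveness comparison (illustrative
# only; all transition probabilities, costs, and utilities below are invented
# and bear no relation to the CHD Policy Model's inputs).
import numpy as np

STATES = ["well", "chd", "dead"]

def run_cohort(trans, cost_per_cycle, utility, cycles=30, discount=0.03):
    dist = np.array([1.0, 0.0, 0.0])              # whole cohort starts "well"
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        df = 1.0 / (1.0 + discount) ** t
        total_cost += df * dist @ cost_per_cycle
        total_qaly += df * dist @ utility
        dist = dist @ trans                        # advance one cycle
    return total_cost, total_qaly

# Hypothetical annual transition matrices (rows: from-state, cols: to-state)
no_statin = np.array([[0.94, 0.04, 0.02],
                      [0.00, 0.85, 0.15],
                      [0.00, 0.00, 1.00]])
statin    = np.array([[0.96, 0.02, 0.02],
                      [0.00, 0.87, 0.13],
                      [0.00, 0.00, 1.00]])

costs_no   = np.array([0.0,   5000.0, 0.0])       # $ per person-year by state
costs_stat = np.array([300.0, 5300.0, 0.0])       # adds a hypothetical drug cost
utilities  = np.array([1.0, 0.8, 0.0])            # QALY weights by state

c0, q0 = run_cohort(no_statin, costs_no, utilities)
c1, q1 = run_cohort(statin, costs_stat, utilities)
print(f"ICER: ${(c1 - c0) / (q1 - q0):,.0f} per QALY gained")
```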
Abstract:
Background: Over the last two decades, mortality from coronary heart disease (CHD) and cerebrovascular disease (CVD) declined by about 30% in the European Union (EU). Design: We analyzed trends in CHD (ICD-10 codes I20-I25) and CVD (ICD-10 codes I60-I69) mortality in young adults (age 35-44 years) in the EU as a whole and in 12 selected European countries, over the period 1980-2007. Methods: Data were derived from the World Health Organization mortality database. With joinpoint regression analysis, we identified significant changes in trends and estimated average annual percent changes (AAPC). Results: CHD mortality rates at ages 35-44 years have decreased in both sexes since the 1980s for most countries, except for Russia (130/100,000 men and 24/100,000 women, in 2005-7). The lowest rates (around 9/100,000 men, 2/100,000 women) were in France, Italy and Sweden. In men, the steepest declines in mortality were in the Czech Republic (AAPC = -6.1%), the Netherlands (-5.2%), Poland (-4.5%), and England and Wales (-4.5%). Patterns were similar in women, though with appreciably lower rates. The AAPC in the EU was -3.3% for men (rate = 16.6/100,000 in 2005-7) and -2.1% for women (rate = 3.5/100,000). For CVD, Russian rates in 2005-7 were 40/100,000 men and 16/100,000 women, 5 to 10-fold higher than in most western European countries. The steepest declines were in the Czech Republic and Italy for men, in Sweden and the Czech Republic for women. The AAPC in the EU was -2.5% in both sexes, with steeper declines after the mid-late 1990s (rates = 6.4/100,000 men and 4.3/100,000 women in 2005-7). Conclusions: CHD and CVD mortality steadily declined in Europe, except in Russia, whose rates were 10 to 15-fold higher than those of France, Italy or Sweden. Hungary, Poland and Scotland, where CHD trends were less favourable than in other western European countries, also emerge as priorities for preventive interventions.
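Joinpoint software is not reproduced here, but the annual-percent-change arithmetic behind such trend estimates is simple: fit a log-linear model to the rates within a segment and convert the slope, APC = 100·(e^b − 1); the AAPC is then a length-weighted average of segment APCs. The rates and segment values below are made up for illustration.

```python
# Sketch of the annual-percent-change (APC) calculation behind joinpoint-style
# trend analysis; the mortality rates below are made up for illustration.
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(1980, 2008)
rates = 30.0 * np.exp(-0.033 * (years - 1980) + rng.normal(0, 0.02, years.size))

# Log-linear fit within a single segment: log(rate) = a + b * year
b, a = np.polyfit(years, np.log(rates), 1)
apc = 100.0 * (np.exp(b) - 1.0)
print(f"estimated APC: {apc:.1f}% per year")

# With several joinpoint segments, the AAPC is a length-weighted average:
segment_apcs = np.array([-2.0, -4.0])              # hypothetical segment APCs
segment_lengths = np.array([15, 12])               # years in each segment
aapc = 100.0 * (np.exp((segment_lengths * np.log(1 + segment_apcs / 100)).sum()
                       / segment_lengths.sum()) - 1.0)
print(f"AAPC over the full period: {aapc:.1f}% per year")
```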
Abstract:
Least squares estimators are known to generate sub-optimal exercise decisions when determining the optimal stopping time. The consequence is that the price of the option is underestimated. We show how variance reduction methods can be implemented to obtain more accurate option prices. We also extend the Longstaff and Schwartz (2001) method to price American options under stochastic volatility. These are two important contributions that are particularly relevant for practitioners. Finally, we extend the Glasserman and Yu (2004b) methodology to price Asian options and basket options.
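The abstract does not say which variance reduction methods are used, so as a generic illustration the sketch below prices a European call by plain Monte Carlo and with antithetic variates, comparing the standard errors; the Black-Scholes-style inputs are arbitrary.

```python
# Generic variance-reduction sketch (antithetic variates) for Monte Carlo option
# pricing; the abstract does not specify which methods the paper uses, and the
# Black-Scholes-style inputs below are arbitrary.
import numpy as np

rng = np.random.default_rng(5)
S0, K, r, sigma, T, n = 100.0, 100.0, 0.05, 0.2, 1.0, 100_000

def terminal_price(z):
    return S0 * np.exp((r - 0.5 * sigma ** 2) * T + sigma * np.sqrt(T) * z)

disc = np.exp(-r * T)

# Plain Monte Carlo
z = rng.standard_normal(n)
payoff_plain = disc * np.maximum(terminal_price(z) - K, 0.0)

# Antithetic variates: pair each draw z with -z and average the two payoffs
z = rng.standard_normal(n // 2)
pair = disc * 0.5 * (np.maximum(terminal_price(z) - K, 0.0)
                     + np.maximum(terminal_price(-z) - K, 0.0))

for name, x in [("plain", payoff_plain), ("antithetic", pair)]:
    print(f"{name:>10}: price {x.mean():.3f}, s.e. {x.std(ddof=1) / np.sqrt(x.size):.4f}")
```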
Abstract:
There are both theoretical and empirical reasons for believing that the parameters of macroeconomic models may vary over time. However, work with time-varying parameter models has largely involved Vector autoregressions (VARs), ignoring cointegration. This is despite the fact that cointegration plays an important role in informing macroeconomists on a range of issues. In this paper we develop time varying parameter models which permit cointegration. Time-varying parameter VARs (TVP-VARs) typically use state space representations to model the evolution of parameters. In this paper, we show that it is not sensible to use straightforward extensions of TVP-VARs when allowing for cointegration. Instead we develop a specification which allows for the cointegrating space to evolve over time in a manner comparable to the random walk variation used with TVP-VARs. The properties of our approach are investigated before developing a method of posterior simulation. We use our methods in an empirical investigation involving a permanent/transitory variance decomposition for inflation.
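As a small illustration of the state-space machinery mentioned above (a univariate toy, not the paper's cointegrated TVP-VAR), the sketch below simulates a regression coefficient that follows a random walk and recovers it with a Kalman filter; all variances are assumed known and arbitrary.

```python
# Toy time-varying parameter regression in state-space form: beta_t follows a
# random walk, y_t = x_t * beta_t + noise, and a Kalman filter tracks beta_t.
# Univariate illustration only, not the paper's cointegrated TVP-VAR.
import numpy as np

rng = np.random.default_rng(6)
T, q, r = 300, 0.01, 0.5                     # horizon, state and obs. variances

x = rng.normal(size=T)
beta = np.cumsum(rng.normal(scale=np.sqrt(q), size=T))   # random-walk coefficient
y = x * beta + rng.normal(scale=np.sqrt(r), size=T)

b_filt = np.zeros(T)
b, P = 0.0, 1.0                              # prior mean and variance for beta_0
for t in range(T):
    # prediction step: random-walk transition leaves the mean unchanged
    P = P + q
    # update step with observation y_t = x_t * beta_t + eps_t
    S = x[t] ** 2 * P + r                    # innovation variance
    K = P * x[t] / S                         # Kalman gain
    b = b + K * (y[t] - x[t] * b)
    P = (1 - K * x[t]) * P
    b_filt[t] = b

print("mean absolute tracking error:", np.mean(np.abs(b_filt - beta)))
```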
Abstract:
This paper contributes to the ongoing empirical debate regarding the role of the RBC model, and in particular of technology shocks, in explaining aggregate fluctuations. To this end we estimate the model's posterior density using Markov chain Monte Carlo (MCMC) methods. Within this framework we extend Ireland's (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model's errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model's fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
Abstract:
Pricing American options is an interesting research topic since there is no analytical solution to value these derivatives. Different numerical methods have been proposed in the literature, with some, if not all, either limited to a specific payoff or not applicable to multidimensional cases. The application of Monte Carlo methods to price American options is a relatively new area that started with Longstaff and Schwartz (2001). Since then, a few variations of that methodology have been proposed. The general conclusion is that Monte Carlo estimators tend to underestimate the true option price. The present paper follows Glasserman and Yu (2004b) and proposes a novel Monte Carlo approach, based on designing "optimal martingales" to determine stopping times. We show that our martingale approach can also be used to compute the dual as described in Rogers (2002).
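The optimal-martingale construction is specific to the paper, but the least squares Monte Carlo baseline it builds on can be sketched as below: a Longstaff-Schwartz style American put under Black-Scholes dynamics, with arbitrary inputs and a simple polynomial basis.

```python
# Least squares Monte Carlo (Longstaff-Schwartz style) for an American put under
# Black-Scholes dynamics; baseline sketch only, with arbitrary inputs, not the
# paper's optimal-martingale construction.
import numpy as np

rng = np.random.default_rng(7)
S0, K, r, sigma, T, n_steps, n_paths = 100.0, 100.0, 0.05, 0.2, 1.0, 50, 50_000
dt = T / n_steps
disc = np.exp(-r * dt)

# Simulate geometric Brownian motion paths
z = rng.standard_normal((n_paths, n_steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * z, axis=1))
S = np.hstack([np.full((n_paths, 1), S0), S])

cashflow = np.maximum(K - S[:, -1], 0.0)          # exercise value at maturity

# Backward induction: regress continuation value on a polynomial in S (ITM paths)
for t in range(n_steps - 1, 0, -1):
    cashflow *= disc                               # value future cash flows at time t
    itm = K - S[:, t] > 0.0
    if not itm.any():
        continue
    x = S[itm, t]
    basis = np.column_stack([np.ones_like(x), x, x ** 2])
    coef, *_ = np.linalg.lstsq(basis, cashflow[itm], rcond=None)
    continuation = basis @ coef
    exercise = K - x
    do_exercise = exercise > continuation
    idx = np.where(itm)[0][do_exercise]
    cashflow[idx] = exercise[do_exercise]          # exercise now on these paths

price = disc * cashflow.mean()                     # discount from t=1 back to 0
print("LSM American put price:", round(price, 3))
```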
Abstract:
This paper develops methods for Stochastic Search Variable Selection (currently popular with regression and Vector Autoregressive models) for Vector Error Correction models where there are many possible restrictions on the cointegration space. We show how this allows the researcher to begin with a single unrestricted model and either do model selection or model averaging in an automatic and computationally efficient manner. We apply our methods to a large UK macroeconomic model.
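The abstract does not detail the prior, so the sketch below shows the generic Stochastic Search Variable Selection ingredient on which such methods build: each coefficient gets a two-component ("spike-and-slab") normal mixture prior, and the Gibbs sampler draws the inclusion indicator from its conditional probability given the coefficient. All names and values are illustrative, not the paper's VECM-specific construction.

```python
# Generic SSVS ingredient (illustrative): conditional draw of the inclusion
# indicator gamma_j given a coefficient beta_j under a two-component normal
# mixture prior, beta_j ~ (1 - gamma_j) N(0, tau0^2) + gamma_j N(0, tau1^2).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)

def draw_inclusion(beta, tau0=0.01, tau1=1.0, prior_prob=0.5):
    """Sample gamma_j | beta_j for each coefficient in `beta`."""
    dens_slab  = norm.pdf(beta, scale=tau1) * prior_prob
    dens_spike = norm.pdf(beta, scale=tau0) * (1.0 - prior_prob)
    p_include = dens_slab / (dens_slab + dens_spike)
    return rng.random(beta.shape) < p_include, p_include

beta_draws = np.array([0.02, -0.8, 0.001, 1.5])    # hypothetical coefficient draws
gamma, p = draw_inclusion(beta_draws)
print("inclusion probabilities:", np.round(p, 3))
print("sampled indicators:     ", gamma.astype(int))
```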