992 results for Monte Carlo Method - Computer Simulation
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
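A minimal sketch of the central device described above: estimating conditional moments by kernel smoothing over a long simulated path, here with a Nadaraya-Watson smoother and an AR(1) as a stand-in for the simulable model. The function names and the AR(1) choice are illustrative assumptions, not the paper's; in the estimator proper, the smoothed moments would feed a method-of-moments objective.

```python
# Sketch: kernel-smoothed conditional moments from a long simulation,
# using an AR(1) as a stand-in for the (possibly latent-variable) model.
import numpy as np

def simulate(theta, n, rng):
    # Simulate a long path at trial parameter theta.
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = theta * y[t - 1] + rng.standard_normal()
    return y

def kernel_conditional_mean(x_sim, y_sim, x_eval, h):
    # Nadaraya-Watson estimate of E[y | x = x_eval] from simulated pairs.
    w = np.exp(-0.5 * ((x_eval[:, None] - x_sim[None, :]) / h) ** 2)
    return (w * y_sim[None, :]).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(0)
path = simulate(0.7, 200_000, rng)
x_eval = np.linspace(-2.0, 2.0, 5)
m_hat = kernel_conditional_mean(path[:-1], path[1:], x_eval, h=0.1)
print(m_hat)  # approximately 0.7 * x_eval for this AR(1) example
```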
Abstract:
Assuming the role of debt management is to provide hedging against fiscal shocks, we consider three questions: i) what indicators can be used to assess the performance of debt management? ii) how well have historical debt management policies performed? and iii) how is that performance affected by variations in debt issuance? We consider these questions using OECD data on the market value of government debt between 1970 and 2000. Motivated by both the optimal taxation literature and broad considerations of debt stability, we propose a range of performance indicators for debt management. We evaluate these using Monte Carlo analysis and find that those based on the relative persistence of debt perform best. Calculating these measures for OECD data provides only limited evidence that debt management has helped insulate policy against unexpected fiscal shocks. We also find that the degree of fiscal insurance achieved is not well connected to cross-country variations in debt issuance patterns. Given the limited volatility observed in the yield curve, the relatively small dispersion of debt management practices across countries makes little difference to the realised degree of fiscal insurance.
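A heavily stylized sketch of why debt persistence can serve as a fiscal-insurance indicator. The toy law of motion below (state-contingent returns absorb a share `hedge` of inherited debt each period) is an invented cartoon, not the paper's model; it only illustrates the Monte Carlo logic of computing a persistence measure under more or less complete insurance.

```python
# Stylized Monte Carlo for a persistence-based fiscal-insurance indicator.
# Toy law of motion (illustrative assumption): b_t = (1 - hedge) * b_{t-1} + g_t,
# where g_t is an AR(1) spending shock and `hedge` is the insured share.
import numpy as np

def debt_autocorr(hedge, T=200, n_rep=1000, rho_g=0.5, seed=0):
    rng = np.random.default_rng(seed)
    acs = np.empty(n_rep)
    for r in range(n_rep):
        g, b = np.zeros(T), np.zeros(T)
        for t in range(1, T):
            g[t] = rho_g * g[t - 1] + rng.standard_normal()
            b[t] = (1 - hedge) * b[t - 1] + g[t]
        acs[r] = np.corrcoef(b[:-1], b[1:])[0, 1]
    return acs.mean()

# Debt persistence: near 1 with no insurance (shocks are absorbed into debt
# permanently), lower when more of each shock is hedged away.
print(f"no insurance:   {debt_autocorr(0.0):.2f}")
print(f"high insurance: {debt_autocorr(0.9):.2f}")
```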
Abstract:
BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk. OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare them with a range of risk- and age-based alternative strategies. DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model. DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008). TARGET POPULATION: U.S. population age 35 to 85 years. TIME HORIZON: 2010 to 2040. PERSPECTIVE: Health care system. INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins). OUTCOME MEASURE: Incremental cost-effectiveness. RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings. RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day. LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups. CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes. FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.
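A minimal sketch of the mechanics behind a Markov-type cost-effectiveness model of this kind: a cohort moves between health states each cycle, discounted costs and QALYs accumulate, and two strategies are compared by their incremental cost-effectiveness ratio (ICER). All states, transition probabilities, costs, and utilities below are invented toy numbers, not CHD Policy Model inputs.

```python
# Toy Markov cohort model: ICER of "statin" vs "no treatment".
# Every number here is an illustrative assumption, not a model parameter.
import numpy as np

def run_cohort(p_event, cycles=30, discount=0.03, rx_cost=0.0):
    # States: 0 = well, 1 = post-CHD, 2 = dead.
    P = np.array([
        [1 - p_event - 0.01, p_event, 0.01],  # from well
        [0.0, 0.93, 0.07],                    # from post-CHD
        [0.0, 0.0, 1.0],                      # dead is absorbing
    ])
    cost = np.array([rx_cost, 5000.0 + rx_cost, 0.0])  # per cycle, per state
    qaly = np.array([1.0, 0.8, 0.0])                   # utility weights
    dist = np.array([1.0, 0.0, 0.0])                   # cohort starts well
    tot_cost = tot_qaly = 0.0
    for t in range(cycles):
        d = 1.0 / (1 + discount) ** t
        tot_cost += d * dist @ cost
        tot_qaly += d * dist @ qaly
        dist = dist @ P
    return tot_cost, tot_qaly

c0, q0 = run_cohort(p_event=0.020)                  # no treatment
c1, q1 = run_cohort(p_event=0.014, rx_cost=770.0)   # ~= $2.11/pill * 365 days
print(f"ICER = ${(c1 - c0) / (q1 - q0):,.0f} per QALY gained")
```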
Abstract:
Background: Over the last two decades, mortality from coronary heart disease (CHD) and cerebrovascular disease (CVD) declined by about 30% in the European Union (EU). Design: We analyzed trends in CHD (ICD-10 codes: I20-I25) and CVD (ICD-10 codes: I60-I69) mortality in young adults (age 35-44 years) in the EU as a whole and in 12 selected European countries, over the period 1980-2007. Methods: Data were derived from the World Health Organization mortality database. With joinpoint regression analysis, we identified significant changes in trends and estimated average annual percent changes (AAPC). Results: CHD mortality rates at ages 35-44 years have decreased in both sexes since the 1980s for most countries, except for Russia (130/100,000 men and 24/100,000 women, in 2005-7). The lowest rates (around 9/100,000 men, 2/100,000 women) were in France, Italy and Sweden. In men, the steepest declines in mortality were in the Czech Republic (AAPC = -6.1%), the Netherlands (-5.2%), Poland (-4.5%), and England and Wales (-4.5%). Patterns were similar in women, though with appreciably lower rates. The AAPC in the EU was -3.3% for men (rate = 16.6/100,000 in 2005-7) and -2.1% for women (rate = 3.5/100,000). For CVD, Russian rates in 2005-7 were 40/100,000 men and 16/100,000 women, 5 to 10-fold higher than in most western European countries. The steepest declines were in the Czech Republic and Italy for men, in Sweden and the Czech Republic for women. The AAPC in the EU was -2.5% in both sexes, with steeper declines after the mid-late 1990s (rates = 6.4/100,000 men and 4.3/100,000 women in 2005-7). Conclusions: CHD and CVD mortality steadily declined in Europe, except in Russia, whose rates were 10 to 15-fold higher than those of France, Italy or Sweden. Hungary, Poland and Scotland, where CHD trends were less favourable than in other western European countries, also emerge as priorities for preventive interventions.
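The AAPC summary used in joinpoint analysis has a standard closed form: slopes are fitted on the log-rate scale, and the AAPC is the exponentiated length-weighted average of the segment slopes. A short sketch under that standard definition; the segment values are toy numbers, not the paper's estimates.

```python
# AAPC from joinpoint segments: each segment's annual percent change is
# 100 * (exp(b) - 1) for log-scale slope b, and the AAPC exponentiates the
# length-weighted average slope. Toy segment values below.
import math

def aapc(slopes, lengths):
    # slopes: log-scale regression slopes per segment; lengths: years covered.
    w = sum(b * n for b, n in zip(slopes, lengths)) / sum(lengths)
    return 100.0 * (math.exp(w) - 1.0)

# Example: a -2%/year segment for 15 years, then -5%/year for 13 years.
b1 = math.log(1 - 0.02)
b2 = math.log(1 - 0.05)
print(f"AAPC = {aapc([b1, b2], [15, 13]):.1f}% per year")
```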
Abstract:
Least squares estimators are known to generate sub-optimal exercise decisions when determining the optimal stopping time. The consequence is that the price of the option is underestimated. We show how variance reduction methods can be implemented to obtain more accurate option prices. We also extend the Longstaff and Schwartz (2001) method to price American options under stochastic volatility. These are two important contributions that are particularly relevant for practitioners. Finally, we extend the Glasserman and Yu (2004b) methodology to price Asian options and basket options.
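A compact least-squares Monte Carlo (Longstaff-Schwartz) pricer for a Bermudan put, with antithetic variates as the variance-reduction ingredient. This is the textbook constant-volatility version under geometric Brownian motion, not the paper's stochastic-volatility extension; the basis choice (quadratic polynomial) is an illustrative assumption.

```python
# Least-squares Monte Carlo (Longstaff-Schwartz) for a Bermudan put under
# GBM, with antithetic variates as a simple variance-reduction device.
import numpy as np

def lsm_put(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0, steps=50,
            n_pairs=50_000, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / steps
    z = rng.standard_normal((n_pairs, steps))
    z = np.vstack([z, -z])                       # antithetic pairing
    drift = (r - 0.5 * sigma**2) * dt
    S = S0 * np.exp(np.cumsum(drift + sigma * np.sqrt(dt) * z, axis=1))
    cash = np.maximum(K - S[:, -1], 0.0)         # exercise value at maturity
    for t in range(steps - 2, -1, -1):
        cash *= np.exp(-r * dt)                  # discount one step back
        itm = K - S[:, t] > 0.0                  # regress on in-the-money paths
        x = S[itm, t]
        coef = np.polyfit(x, cash[itm], 2)       # quadratic continuation value
        cont = np.polyval(coef, x)
        exercise = (K - x) > cont
        cash[itm] = np.where(exercise, K - x, cash[itm])
    return np.exp(-r * dt) * cash.mean()

print(f"LSM price ~= {lsm_put():.3f}")
```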
Abstract:
There are both theoretical and empirical reasons for believing that the parameters of macroeconomic models may vary over time. However, work with time-varying parameter models has largely involved vector autoregressions (VARs), ignoring cointegration. This is despite the fact that cointegration plays an important role in informing macroeconomists on a range of issues. In this paper we develop time-varying parameter models which permit cointegration. Time-varying parameter VARs (TVP-VARs) typically use state space representations to model the evolution of parameters. In this paper, we show that it is not sensible to use straightforward extensions of TVP-VARs when allowing for cointegration. Instead we develop a specification which allows the cointegrating space to evolve over time in a manner comparable to the random walk variation used with TVP-VARs. The properties of our approach are investigated before developing a method of posterior simulation. We use our methods in an empirical investigation involving a permanent/transitory variance decomposition for inflation.
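The state-space machinery behind random-walk parameter evolution can be shown in its simplest form: a Kalman filter tracking a single drifting regression coefficient. This is only the scalar building block that TVP-VARs generalize; the paper's evolving cointegrating space is substantially more involved, and the variances are treated as known here for brevity.

```python
# Kalman filter for the simplest TVP regression y_t = beta_t * x_t + e_t,
# with beta_t = beta_{t-1} + eta_t (random walk), illustrating the
# state-space form that TVP-VARs build on.
import numpy as np

rng = np.random.default_rng(1)
T, sig_e, sig_eta = 300, 0.5, 0.05
x = rng.standard_normal(T)
beta_true = np.cumsum(sig_eta * rng.standard_normal(T))  # drifting coefficient
y = beta_true * x + sig_e * rng.standard_normal(T)

b, P = 0.0, 1.0                                          # prior mean/variance
for t in range(T):
    P += sig_eta**2                                      # predict step
    K = P * x[t] / (x[t]**2 * P + sig_e**2)              # Kalman gain
    b += K * (y[t] - b * x[t])                           # update step
    P *= (1 - K * x[t])

print(f"terminal beta: true {beta_true[-1]:+.3f}, filtered {b:+.3f}")
```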
Abstract:
This paper contributes to the ongoing empirical debate regarding the role of the RBC model, and in particular of technology shocks, in explaining aggregate fluctuations. To this end, we estimate the model's posterior density using Markov chain Monte Carlo (MCMC) methods. Within this framework we extend Ireland's (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model's errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model's fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
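The MCMC engine behind posterior estimation of this kind, in its simplest form: a random-walk Metropolis sampler targeting a posterior density. The toy target below (normal likelihood, normal prior) stands in for an RBC model likelihood evaluated via the Kalman filter; only the accept/reject mechanics carry over.

```python
# Random-walk Metropolis: the MCMC workhorse behind posterior estimation,
# shown on a toy posterior instead of a DSGE/RBC likelihood.
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(1.5, 1.0, size=100)

def log_post(mu):
    # N(0, 10^2) prior on mu, N(mu, 1) likelihood.
    return -0.5 * (mu / 10.0) ** 2 - 0.5 * np.sum((data - mu) ** 2)

draws, mu, lp = [], 0.0, log_post(0.0)
for _ in range(20_000):
    prop = mu + 0.3 * rng.standard_normal()      # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:     # Metropolis accept/reject
        mu, lp = prop, lp_prop
    draws.append(mu)

print(f"posterior mean ~= {np.mean(draws[5000:]):.3f}")  # near the data mean
```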
Abstract:
Pricing American options is an interesting research topic since there is no analytical solution to value these derivatives. Different numerical methods have been proposed in the literature, with some, if not all, either limited to a specific payoff or not applicable to multidimensional cases. The application of Monte Carlo methods to price American options is a relatively new area that started with Longstaff and Schwartz (2001). Since then, a few variations of that methodology have been proposed. The general conclusion is that Monte Carlo estimators tend to underestimate the true option price. The present paper follows Glasserman and Yu (2004b) and proposes a novel Monte Carlo approach, based on designing "optimal martingales" to determine stopping times. We show that our martingale approach can also be used to compute the dual as described in Rogers (2002).
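The dual of Rogers (2002) in its simplest usable form: for any martingale M with M_0 = 0, the price is bounded above by E[max_t (Z_t - M_t)], where Z_t is the discounted exercise value. The sketch below builds M from the discounted Black-Scholes European put price, a valid but deliberately crude martingale, so the bound is loose; the paper's point is precisely to design better ("optimal") martingales.

```python
# Rogers (2002) dual bound: price <= E[ max_t (Z_t - M_t) ] for any
# martingale M with M_0 = 0. M here comes from the discounted European put.
import numpy as np
from scipy.stats import norm

S0, K, r, sigma, T, steps, n = 100.0, 100.0, 0.05, 0.2, 1.0, 50, 20_000

def bs_put(S, tau):
    # Black-Scholes European put; returns the payoff when tau -> 0.
    if tau <= 0:
        return np.maximum(K - S, 0.0)
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * tau) / (sigma * np.sqrt(tau))
    d2 = d1 - sigma * np.sqrt(tau)
    return K * np.exp(-r * tau) * norm.cdf(-d2) - S * norm.cdf(-d1)

rng = np.random.default_rng(3)
dt = T / steps
z = rng.standard_normal((n, steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z,
                          axis=1))
t_grid = dt * np.arange(1, steps + 1)
disc = np.exp(-r * t_grid)
Z = disc * np.maximum(K - S, 0.0)                 # discounted payoffs
# Discounted European put prices form a martingale; recentre so M_0 = 0.
M = np.empty_like(S)
for j, t in enumerate(t_grid):
    M[:, j] = disc[j] * bs_put(S[:, j], T - t)
M -= bs_put(np.array([S0]), T)                    # subtract M_0
print(f"dual upper bound ~= {np.max(Z - M, axis=1).mean():.3f}")
```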
Abstract:
This paper develops methods for Stochastic Search Variable Selection (currently popular with regression and vector autoregressive models) for Vector Error Correction models where there are many possible restrictions on the cointegration space. We show how this allows the researcher to begin with a single unrestricted model and either do model selection or model averaging in an automatic and computationally efficient manner. We apply our methods to a large UK macroeconomic model.
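SSVS in its textbook form (George and McCulloch): a spike-and-slab normal prior on each coefficient and a two-block Gibbs sampler over coefficients and inclusion indicators. The sketch uses a toy regression with known error variance and omits burn-in for brevity; the paper's VECM/cointegration-space version is far richer.

```python
# Stochastic Search Variable Selection: spike-and-slab prior on each
# coefficient, Gibbs sampling over (beta, gamma). Known sigma2 = 1 here.
import numpy as np

rng = np.random.default_rng(4)
n, k = 200, 6
X = rng.standard_normal((n, k))
beta_true = np.array([1.5, -1.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)

tau0, tau1, p_incl, sigma2 = 0.01, 10.0, 0.5, 1.0   # spike, slab, prior prob
gamma = np.ones(k, dtype=bool)
incl_freq = np.zeros(k)
n_draws = 5000
for it in range(n_draws):
    # Draw beta | gamma, y from its conjugate normal full conditional.
    D_inv = np.diag(np.where(gamma, 1.0 / tau1**2, 1.0 / tau0**2))
    V = np.linalg.inv(X.T @ X / sigma2 + D_inv)
    beta = rng.multivariate_normal(V @ X.T @ y / sigma2, V)
    # Draw each gamma_j | beta_j from its Bernoulli posterior odds.
    for j in range(k):
        slab = p_incl * np.exp(-0.5 * (beta[j] / tau1) ** 2) / tau1
        spike = (1 - p_incl) * np.exp(-0.5 * (beta[j] / tau0) ** 2) / tau0
        gamma[j] = rng.uniform() < slab / (slab + spike)
    incl_freq += gamma

print("posterior inclusion probabilities:", np.round(incl_freq / n_draws, 2))
```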
Abstract:
This paper proposes a bootstrap artificial neural network based panel unit root test in a dynamic heterogeneous panel context. An application to a panel of bilateral real exchange rate series with the US Dollar for the 20 major OECD countries is provided to investigate Purchasing Power Parity (PPP). The combination of neural networks and bootstrapping significantly changes the findings of the economic study in favour of PPP.
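A sketch of the bootstrap half of this recipe only: a difference-based residual bootstrap of an ADF-type t-statistic under the unit-root null, for a single series. The neural-network estimation of the alternative and the panel aggregation are the paper's contribution and are deliberately not reproduced here.

```python
# Bootstrap unit root test for one series: t-statistic on rho in
# dy_t = rho * y_{t-1} + e_t, null distribution built by resampling
# centred first differences and cumulating them into random walks.
import numpy as np

def t_stat(y):
    dy, ylag = np.diff(y), y[:-1]
    rho = (ylag @ dy) / (ylag @ ylag)
    resid = dy - rho * ylag
    se = np.sqrt(resid @ resid / (len(dy) - 1) / (ylag @ ylag))
    return rho / se

def bootstrap_pvalue(y, n_boot=999, seed=0):
    rng = np.random.default_rng(seed)
    t_obs = t_stat(y)
    d = np.diff(y) - np.diff(y).mean()      # centred differences = null shocks
    t_null = np.empty(n_boot)
    for b in range(n_boot):
        y_star = np.cumsum(rng.choice(d, size=len(d), replace=True))
        t_null[b] = t_stat(y_star)
    return np.mean(t_null <= t_obs)         # left-tail test

rng = np.random.default_rng(5)
y = np.zeros(200)
for t in range(1, 200):                     # stationary AR(1) example
    y[t] = 0.8 * y[t - 1] + rng.standard_normal()
print(f"bootstrap p-value ~= {bootstrap_pvalue(y):.3f}")  # should be small
```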
Abstract:
Spatio-temporal clusters in the 1997-2003 fire sequences of the Tuscany region (central Italy) have been identified and analysed using the scan statistic, a method originally devised to detect clusters in epidemiology. Results showed that the method is reliable in finding clusters of events and in evaluating their significance via Monte Carlo replication. The evaluation of the presence of spatial and temporal patterns in fire occurrence, and of their significance, could have a great impact on forthcoming studies of fire occurrence prediction.
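A one-dimensional (purely temporal) simplification of the scan statistic with Monte Carlo significance, using Kulldorff's Poisson log-likelihood ratio for each candidate window. The spatio-temporal version applies the same logic to cylinders in space-time; the event data and window grid below are invented for illustration.

```python
# Temporal scan statistic with Monte Carlo p-value (Kulldorff Poisson LLR).
import numpy as np

def scan_llr(events, t_min, t_max, max_frac=0.5):
    C, span = len(events), t_max - t_min
    best = (0.0, None)
    for a in np.linspace(t_min, t_max, 50):          # candidate window starts
        for w in np.linspace(span * 0.02, span * max_frac, 25):
            c = np.sum((events >= a) & (events < a + w))
            e = C * min(w, t_max - a) / span         # expected under uniformity
            if 0 < e < c:                            # only excess-risk windows
                llr = (c * np.log(c / e)
                       + (C - c) * np.log((C - c) / (C - e)))
                if llr > best[0]:
                    best = (llr, (a, a + w))
    return best

rng = np.random.default_rng(6)
events = np.concatenate([rng.uniform(0, 7, 300),      # background, 7 "years"
                         rng.uniform(2.0, 2.3, 60)])  # injected cluster
llr_obs, window = scan_llr(events, 0, 7)
null = [scan_llr(rng.uniform(0, 7, len(events)), 0, 7)[0] for _ in range(199)]
p = (1 + sum(l >= llr_obs for l in null)) / 200       # Monte Carlo p-value
print(f"cluster window {window}, LLR = {llr_obs:.1f}, p = {p:.3f}")
```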
Abstract:
The recent developments in high magnetic field 13C magnetic resonance spectroscopy with improved localization and shimming techniques have led to important gains in sensitivity and spectral resolution of 13C in vivo spectra in the rodent brain, enabling the separation of several 13C isotopomers of glutamate and glutamine. In this context, the assumptions used in spectral quantification might have a significant impact on the determination of the 13C concentrations and the related metabolic fluxes. In this study, the time domain spectral quantification algorithm AMARES (advanced method for accurate, robust and efficient spectral fitting) was applied to 13C magnetic resonance spectroscopy spectra acquired in the rat brain at 9.4 T, following infusion of [1,6-13C2]glucose. Using both Monte Carlo simulations and in vivo data, the goal of this work was: (1) to validate the quantification of in vivo 13C isotopomers using AMARES; (2) to assess the impact of the prior knowledge on the quantification of in vivo 13C isotopomers using AMARES; (3) to compare AMARES and LCModel (linear combination of model spectra) for the quantification of in vivo 13C spectra. AMARES led to accurate and reliable 13C spectral quantification, similar to that obtained using LCModel, when the frequency shifts, J-coupling constants and phase patterns of the different 13C isotopomers were included as prior knowledge in the analysis.
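A sketch of time-domain quantification in the AMARES spirit: fitting a sum of exponentially damped complex sinusoids to a synthetic free induction decay, with the peak frequencies held fixed as "prior knowledge". This is a toy two-peak example, not AMARES itself; the dwell time, frequencies, and starting values are illustrative assumptions.

```python
# Time-domain fit of damped sinusoids to a synthetic FID, with frequencies
# fixed as prior knowledge (an AMARES-style constraint, not AMARES itself).
import numpy as np
from scipy.optimize import least_squares

t = np.arange(2048) * 1e-3                    # 1 ms dwell time (assumed)
freqs = np.array([50.0, 120.0])               # known peak frequencies (Hz)

def fid(params, t):
    # params: amplitude and damping per peak; frequencies held fixed.
    a1, d1, a2, d2 = params
    amps, damps = np.array([a1, a2]), np.array([d1, d2])
    return (amps[:, None] * np.exp(-damps[:, None] * t[None, :])
            * np.exp(2j * np.pi * freqs[:, None] * t[None, :])).sum(axis=0)

rng = np.random.default_rng(7)
truth = [1.0, 8.0, 0.4, 15.0]
y = fid(truth, t) + 0.05 * (rng.standard_normal(2048)
                            + 1j * rng.standard_normal(2048))

def residuals(params):
    r = fid(params, t) - y
    return np.concatenate([r.real, r.imag])   # stack real/imag for least squares

fit = least_squares(residuals, x0=[0.5, 5.0, 0.5, 5.0],
                    bounds=([0, 0, 0, 0], np.inf))
print("estimated [a1, d1, a2, d2]:", np.round(fit.x, 2))
```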
Abstract:
This paper considers the instrumental variable regression model when there is uncertainty about the set of instruments, exogeneity restrictions, the validity of identifying restrictions and the set of exogenous regressors. This uncertainty can result in a huge number of models. To avoid the statistical problems associated with standard model selection procedures, we develop a reversible jump Markov chain Monte Carlo algorithm that allows us to do Bayesian model averaging. The algorithm is very flexible and can be easily adapted to analyze any of the different priors that have been proposed in the Bayesian instrumental variables literature. We show how to calculate the probability of any relevant restriction (e.g. the posterior probability that over-identifying restrictions hold) and discuss diagnostic checking using the posterior distribution of discrepancy vectors. We illustrate our methods in a returns-to-schooling application.
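A full reversible jump sampler is too long for a sketch; below is its simpler MC3-style relative, which conveys the same idea of a Markov chain over model space: add/drop proposals over regressor subsets, accepted according to (BIC-approximated) marginal likelihoods. This is plain regressor selection, not the paper's IV setup, and the BIC approximation is a stand-in for proper marginal likelihoods.

```python
# MC3-style model averaging: a Markov chain over regressor subsets with
# single-flip proposals, accepted by BIC-approximated marginal likelihoods.
import numpy as np

rng = np.random.default_rng(8)
n, k = 150, 8
X = rng.standard_normal((n, k))
y = X[:, 0] * 2.0 - X[:, 1] * 1.5 + rng.standard_normal(n)

def log_marglik_bic(incl):
    # BIC approximation to the log marginal likelihood of model `incl`.
    Xm = X[:, incl]
    if Xm.shape[1] == 0:
        rss = y @ y
    else:
        beta, *_ = np.linalg.lstsq(Xm, y, rcond=None)
        r = y - Xm @ beta
        rss = r @ r
    return -0.5 * n * np.log(rss / n) - 0.5 * int(incl.sum()) * np.log(n)

incl = np.zeros(k, dtype=bool)
lm = log_marglik_bic(incl)
visits = np.zeros(k)
for it in range(20_000):
    j = rng.integers(k)                        # propose flipping one regressor
    prop = incl.copy()
    prop[j] = ~prop[j]
    lm_prop = log_marglik_bic(prop)
    if np.log(rng.uniform()) < lm_prop - lm:   # symmetric proposal => MH ratio
        incl, lm = prop, lm_prop
    visits += incl

print("posterior inclusion probabilities:", np.round(visits / 20_000, 2))
```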
Abstract:
We propose a nonlinear heterogeneous panel unit root test for testing the null hypothesis of unit root processes against the alternative that allows a proportion of units to be generated by globally stationary ESTAR processes and a remaining non-zero proportion to be generated by unit root processes. The proposed test is simple to implement and accommodates cross-sectional dependence. We show that the distribution of the test statistic is free of nuisance parameters as (N, T) → ∞. Monte Carlo simulation shows that our test has correct size and, when the data are generated by globally stationary ESTAR processes, better power than the recent test proposed in Pesaran (2007). Various applications are provided.
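A sketch of the size/power Monte Carlo design described above, but using a standard no-constant Dickey-Fuller t-test (5% critical value near -1.95) rather than the proposed panel statistic: simulate under the unit-root null to check size, then under a globally stationary ESTAR alternative to check power. The ESTAR parameter values are illustrative.

```python
# Size/power Monte Carlo: Dickey-Fuller t-test (no constant) vs. the ESTAR
# alternative dy_t = gamma * y_{t-1} * (1 - exp(-theta * y_{t-1}^2)) + e_t.
import numpy as np

def df_t(y):
    # t-ratio on rho in dy_t = rho * y_{t-1} + e_t (no-constant DF test).
    dy, ylag = np.diff(y), y[:-1]
    rho = (ylag @ dy) / (ylag @ ylag)
    resid = dy - rho * ylag
    se = np.sqrt(resid @ resid / (len(dy) - 1) / (ylag @ ylag))
    return rho / se

def simulate(estar, T, rng, gamma=-0.5, theta=0.1):
    y = np.zeros(T)
    for t in range(1, T):
        drift = (gamma * y[t-1] * (1 - np.exp(-theta * y[t-1]**2))
                 if estar else 0.0)
        y[t] = y[t-1] + drift + rng.standard_normal()
    return y

rng = np.random.default_rng(9)
crit, n_rep, T = -1.95, 1000, 200                    # ~5% critical value
size = np.mean([df_t(simulate(False, T, rng)) < crit for _ in range(n_rep)])
power = np.mean([df_t(simulate(True, T, rng)) < crit for _ in range(n_rep)])
print(f"size ~= {size:.3f} (nominal 5%), power ~= {power:.3f}")
```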
Abstract:
The effects of structural breaks in dynamic panels are more complicated than in time series models, as the bias can be either negative or positive. This paper focuses on the effects of mean shifts in otherwise stationary processes within an instrumental variable panel estimation framework. We show the sources of the bias, and a Monte Carlo analysis calibrated on United States bank lending data demonstrates the size of the bias for a range of autoregressive parameters. We also propose additional moment conditions that can be used to reduce the biases caused by shifts in the mean of the data.
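A sketch of the kind of Monte Carlo experiment described above: an Anderson-Hsiao IV estimator of a dynamic panel in which half the units have an unmodelled mean shift mid-sample. The calibration is a toy one, not the paper's bank-lending design, and the estimator shown is the standard Anderson-Hsiao IV, not the paper's corrected moment conditions.

```python
# Monte Carlo: Anderson-Hsiao IV for a dynamic panel with mean shifts.
# Model: y_it = mu_i(1 - rho) + rho * y_i,t-1 + e_it, with a mean shift of
# size `shift` hitting half the units from mid-sample onwards.
import numpy as np

def ah_iv(rho, shift, N=200, T=20, seed=0):
    rng = np.random.default_rng(seed)       # same seed across designs
    num = den = 0.0
    for i in range(N):
        mu = rng.standard_normal()
        y = np.zeros(T)
        for t in range(1, T):
            m = mu + (shift if (i % 2 == 0 and t >= T // 2) else 0.0)
            y[t] = m * (1 - rho) + rho * y[t-1] + rng.standard_normal()
        dy = np.diff(y)
        # First-difference the model and instrument dy_{t-1} with y_{t-2}.
        num += y[:T-2] @ dy[1:]
        den += y[:T-2] @ dy[:-1]
    return num / den

for s in [0.0, 2.0, 4.0]:
    print(f"shift={s}: rho_hat = {ah_iv(rho=0.5, shift=s):.3f}  (true 0.50)")
```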