Abstract:
The aim of this paper is to analyze extremal events using Generalized Pareto Distributions (GPD), considering explicitly the uncertainty about the threshold. Current practice determines this quantity empirically and proceeds by estimating the GPD parameters based on data beyond it, discarding all the information available below the threshold. We introduce a mixture model that combines a parametric form for the center and a GPD for the tail of the distribution and uses all observations for inference about the unknown parameters from both distributions, the threshold included. Prior distributions for the parameters are indirectly obtained through elicitation of expert quantiles. Posterior inference is available through Markov Chain Monte Carlo (MCMC) methods. Simulations are carried out in order to analyze the performance of our proposed model under a wide range of scenarios. Those scenarios approximate realistic situations found in the literature. We also apply the proposed model to a real dataset, the Nasdaq 100, a financial market index that presents many extreme events. Important issues such as predictive analysis and model selection are considered, along with possible modeling extensions.
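As a rough sketch of the kind of likelihood such a mixture implies, the snippet below combines a gamma bulk with a GPD tail beyond an unknown threshold u, so that every observation contributes to the likelihood and u can be sampled along with the other parameters in an MCMC scheme. The gamma/GPD parameterization and the function names are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np
from scipy import stats

def mixture_density(x, shape, rate, u, xi, sigma):
    """Gamma-bulk / GPD-tail mixture density with threshold u (illustrative).

    Below u the density is the gamma density; above u the gamma tail mass
    P(X > u) is redistributed according to a GPD with shape xi and scale sigma.
    """
    x = np.asarray(x, dtype=float)
    bulk = stats.gamma(a=shape, scale=1.0 / rate)
    tail_mass = bulk.sf(u)                           # P(X > u) under the bulk model
    gpd = stats.genpareto(c=xi, loc=u, scale=sigma)
    return np.where(x <= u, bulk.pdf(x), tail_mass * gpd.pdf(x))

def log_likelihood(data, shape, rate, u, xi, sigma):
    """Log-likelihood using *all* observations, with the threshold u a parameter."""
    return np.sum(np.log(mixture_density(data, shape, rate, u, xi, sigma)))
```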
Abstract:
This paper presents calculations of semiparametric efficiency bounds for quantile treatment effect parameters when selection to treatment is based on observable characteristics. The paper also presents three estimation procedures for these parameters, all of which have two steps: a nonparametric estimation and a computation of the difference between the solutions of two distinct minimization problems. Root-N consistency, asymptotic normality, and the achievement of the semiparametric efficiency bound are shown for one of the three estimators. In the final part of the paper, an empirical application to a job training program reveals the importance of heterogeneous treatment effects, showing that for this program the effects are concentrated in the upper quantiles of the earnings distribution.
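A minimal sketch of one inverse-probability-weighted variant of such a two-step procedure is given below: a logistic regression stands in for the nonparametric first step, and the second step solves two weighted check-function (quantile) minimization problems and takes the difference of their solutions. The names and the parametric first step are assumptions for illustration, not the paper's estimators.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def weighted_quantile(y, w, tau):
    """Minimizer of sum_i w_i * rho_tau(y_i - q), i.e. the weighted tau-quantile."""
    order = np.argsort(y)
    y, w = y[order], w[order]
    cum = np.cumsum(w) / np.sum(w)
    return y[np.searchsorted(cum, tau)]

def qte(y, d, x, tau):
    """Two-step quantile treatment effect under selection on observables (sketch)."""
    p = LogisticRegression().fit(x, d).predict_proba(x)[:, 1]  # first step: p(x)
    w_treated = d / p                    # reweight treated outcomes
    w_control = (1 - d) / (1 - p)        # reweight control outcomes
    return weighted_quantile(y, w_treated, tau) - weighted_quantile(y, w_control, tau)
```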
Abstract:
We investigate whether there was a stable money demand function for Japan in the 1990s using both aggregate and disaggregate time series data. The aggregate data appear to support the contention that there was no stable money demand function, while the disaggregate data show that there was a stable money demand function. Nor was there any indication of the presence of a liquidity trap. Possible sources of the discrepancy are explored, and the diametrically opposite results between the aggregate and disaggregate analyses are attributed to the neglected heterogeneity among micro units. We also conduct a simulation analysis to show that, when heterogeneity among micro units is present, the prediction of aggregate outcomes using aggregate data is less accurate than the prediction based on micro equations. Moreover, policy evaluation based on aggregate data can be grossly misleading.
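The aggregation point can be illustrated with a toy simulation: two micro units with heterogeneous coefficients, one aggregate equation fitted to the aggregated series, and micro equations fitted unit by unit. All numbers below are illustrative assumptions, not the paper's data or estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200

# Two micro units with heterogeneous coefficients (illustrative values)
alpha = np.array([1.0, 2.0])
beta = np.array([0.3, 1.2])
income = rng.uniform(1.0, 10.0, size=(T, 2))            # unit-specific regressors
m = alpha + beta * income + rng.normal(0, 0.1, (T, 2))  # micro money demands
M, Y = m.sum(axis=1), income.sum(axis=1)                # aggregate series

def ols_predict(x, y):
    X = np.column_stack([np.ones(len(x)), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ coef

pred_aggregate = ols_predict(Y, M)                      # one aggregate equation
pred_micro = sum(ols_predict(income[:, j], m[:, j]) for j in range(2))  # summed micro fits

rmse = lambda e: float(np.sqrt(np.mean(e ** 2)))
print("aggregate-equation RMSE:", rmse(M - pred_aggregate))
print("summed micro-equation RMSE:", rmse(M - pred_micro))
```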
Abstract:
Consumers often pay different prices for the same product bought in the same store at the same time. However, the demand estimation literature has ignored this fact, using instead aggregate measures such as the “list” or average price. In this paper we show that this leads to biased price coefficients. Furthermore, we perform simple comparative statics simulation exercises for the logit and random coefficient models. In the “list” price case we find that the bias is larger when discounts are higher, when the proportion of consumers facing discount prices is higher, and when consumers are so unwilling to buy the product that they almost only do so when facing a discount. In the average price case we find that the bias is larger when discounts are higher, when the proportion of consumers with access to discounts is similar to the proportion without access, and when consumers' willingness to buy depends heavily on idiosyncratic shocks. The bias is also less problematic in the average price case in markets with many bargain deals, where average prices are close to the prices individuals actually pay. We conclude by proposing ways in which the econometrician can reduce this bias using different information that may be available.
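In the spirit of those comparative statics, the toy binary-logit simulation below generates purchases from the prices consumers actually pay and then estimates the price coefficient once with transaction prices and once with the list price; all numbers are illustrative assumptions, not the paper's exercises.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 100_000
beta_true = -2.0                        # true price coefficient (illustrative)

list_price = rng.uniform(1.0, 3.0, n)   # list price varies across markets
has_discount = rng.random(n) < 0.3      # 30% of consumers face a discounted price
p_paid = np.where(has_discount, list_price - 0.5, list_price)

# Purchases follow a binary logit in the price actually paid
prob_buy = 1.0 / (1.0 + np.exp(-(2.0 + beta_true * p_paid)))
buy = (rng.random(n) < prob_buy).astype(int)

# Estimating with transaction prices recovers beta_true (up to noise);
# estimating with the list price, as aggregate-price practice does, is biased.
fit_paid = LogisticRegression(C=1e6).fit(p_paid.reshape(-1, 1), buy)
fit_list = LogisticRegression(C=1e6).fit(list_price.reshape(-1, 1), buy)
print("beta using prices paid:", fit_paid.coef_[0, 0])
print("beta using list price: ", fit_list.coef_[0, 0])
```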
Abstract:
Market risk exposure plays a key role in financial institutions' risk management. A possible measure of this exposure is to evaluate the losses likely to be incurred when the price of the portfolio's assets declines, using Value-at-Risk (VaR) estimates, one of the most prominent measures of financial downside market risk. This paper suggests an evolving possibilistic fuzzy modeling (ePFM) approach for VaR estimation. The approach is based on an extension of possibilistic fuzzy c-means clustering and functional fuzzy rule-based modeling, which employs memberships and typicalities to update clusters and creates new clusters based on a statistical control distance-based criterion. ePFM also uses a utility measure to evaluate the quality of the current cluster structure. Computational experiments consider data from the main equity market indexes of the United States, London, Germany, Spain, and Brazil from January 2000 to December 2012 for VaR estimation using ePFM, traditional VaR benchmarks such as Historical Simulation, GARCH, EWMA, and Extreme Value Theory, and state-of-the-art evolving approaches. The results show that ePFM is a potential candidate for VaR modeling, with better performance than the alternative approaches.
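The ePFM recursion itself (evolving cluster updates driven by memberships and typicalities) is beyond a short sketch, but the two simplest benchmarks named above can be written down directly; the snippet below shows one-day Historical Simulation and EWMA VaR with illustrative parameters and simulated returns rather than the index data used in the paper.

```python
import numpy as np
from scipy.stats import norm

def historical_var(returns, alpha=0.99, window=250):
    """One-day Historical Simulation VaR: the empirical (1 - alpha) quantile of
    the most recent `window` returns, reported as a positive loss."""
    return -np.quantile(returns[-window:], 1.0 - alpha)

def ewma_var(returns, alpha=0.99, lam=0.94):
    """One-day EWMA (RiskMetrics-style) VaR assuming conditionally normal returns."""
    var_t = np.var(returns[:30])            # initialize the variance recursion
    for r in returns[30:]:
        var_t = lam * var_t + (1.0 - lam) * r ** 2
    return -norm.ppf(1.0 - alpha) * np.sqrt(var_t)

# Illustrative usage with simulated heavy-tailed daily returns
rng = np.random.default_rng(0)
rets = rng.standard_t(df=4, size=1500) * 0.01
print("Historical Simulation 99% VaR:", historical_var(rets))
print("EWMA 99% VaR:", ewma_var(rets))
```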