938 results for Currency forecast errors
Abstract:
In the event of a volcanic eruption, the decision to close airspace is based on forecast ash maps produced using volcanic ash transport and dispersion models. In this paper we quantitatively evaluate the spatial skill of volcanic ash simulations using satellite retrievals of ash from the Eyjafjallajökull eruption during the period from 7 to 16 May 2010. We find that at the start of this period, 7–10 May, the model (FLEXible PARTicle) has excellent skill and can predict the spatial distribution of the satellite-retrieved ash to within 0.5° × 0.5° latitude/longitude. However, on 10 May there is a decrease in the spatial accuracy of the model to 2.5° × 2.5° latitude/longitude, and between 11 and 12 May the simulated ash location errors grow rapidly. On 11 May ash is located close to a bifurcation point in the atmosphere, resulting in a rapid divergence of the modeled and satellite ash locations. In general, the model skill reduces as the residence time of ash increases. However, the error growth is not always steady: rapid increases in error growth are linked to key points in the ash trajectories. Ensemble modeling using perturbed meteorological data would help to represent this uncertainty, and assimilation of satellite ash data would help to reduce uncertainty in volcanic ash forecasts.
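One simple way to quantify the spatial agreement between a simulated and a satellite-retrieved ash cloud is the displacement between their centroids on a latitude/longitude grid. This is only a minimal sketch of the idea: the grid, the masks, and the flat-earth distance below are illustrative and do not reproduce the paper's actual skill score.

```python
import numpy as np

def ash_centroid(mask, lats, lons):
    """Mean latitude/longitude of the grid cells flagged as ash."""
    ii, jj = np.nonzero(mask)
    return lats[ii].mean(), lons[jj].mean()

# Toy 0.5-degree grid (illustrative, not the FLEXPART domain)
lats = np.arange(40.0, 70.0, 0.5)
lons = np.arange(-30.0, 10.0, 0.5)

model = np.zeros((lats.size, lons.size), dtype=bool)
obs = np.zeros_like(model)
model[20:24, 30:34] = True   # simulated ash cloud
obs[22:26, 32:36] = True     # satellite-retrieved ash, shifted by 1 degree

mlat, mlon = ash_centroid(model, lats, lons)
olat, olon = ash_centroid(obs, lats, lons)
displacement = np.hypot(mlat - olat, mlon - olon)  # degrees, flat-earth approx.
```

A shift of one degree in both latitude and longitude gives a centroid displacement of about 1.41 degrees; a skill threshold such as the paper's 0.5° would flag this as a location error.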
Abstract:
Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, owing partly to the difficulty of transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called “How much are you prepared to pay for a forecast?”. The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydrometeorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants’ willingness to pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performance as decision-makers.
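The difficulty of turning a probability into a binary decision is usually formalized with the classic cost-loss model: protect (at cost C) whenever the forecast probability exceeds C/L, where L is the loss if the event occurs unprotected. The numbers below are illustrative, not taken from the game.

```python
# Cost-loss decision rule: act whenever forecast probability > C/L.
C, L = 1.0, 10.0
threshold = C / L                # act if p > 0.1

def expected_expense(p_flood, protect):
    """Expected cost of one decision given the flood probability."""
    return C if protect else p_flood * L

p = 0.25                        # probabilistic flood forecast
decision = p > threshold        # True: pay for protection
cost_probabilistic = expected_expense(p, decision)  # 1.0 (protect)
cost_ignore = expected_expense(p, False)            # 2.5 (gamble)
```

Under these assumed numbers, acting on the probabilistic forecast has a lower expected expense than ignoring it, which is the sense in which the forecast has economic value.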
Abstract:
In this paper we discuss inference aspects of skew-normal nonlinear regression models, following both classical and Bayesian approaches and extending the usual normal nonlinear regression models. The univariate skew-normal distribution used in this work was introduced by Sahu et al. (Can J Stat 29:129-150, 2003); it is attractive because estimation of the skewness parameter does not present the same degree of difficulty as with the Azzalini (Scand J Stat 12:171-178, 1985) distribution and, moreover, it allows easy implementation of the EM algorithm. As an illustration of the proposed methodology, we consider a data set previously analyzed in the literature under normality.
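The Sahu et al. (2003) skew-normal admits a simple stochastic representation, Y = μ + δ|Z₀| + σZ₁ with Z₀, Z₁ independent standard normals, which is what makes the EM algorithm convenient (|Z₀| plays the role of latent data). A minimal sampling sketch, with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(0)

def rsn_sahu(n, mu=0.0, sigma=1.0, delta=2.0):
    """Draws from the Sahu et al. (2003) skew-normal via its
    stochastic representation Y = mu + delta*|Z0| + sigma*Z1."""
    z0 = np.abs(rng.standard_normal(n))   # latent half-normal component
    z1 = rng.standard_normal(n)
    return mu + delta * z0 + sigma * z1

y = rsn_sahu(100_000)
# A positive delta should produce a right-skewed sample:
sample_skew = ((y - y.mean()) ** 3).mean() / y.std() ** 3
```

The sign of δ controls the direction of skewness, and δ = 0 recovers the ordinary normal model.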
Abstract:
Considering the Wald, score, and likelihood ratio asymptotic test statistics, we analyze a multivariate null-intercept errors-in-variables regression model in which the explanatory and response variables are subject to measurement errors, and a possible structure of dependency between the measurements taken on the same individual is incorporated, representing a longitudinal structure. This model was proposed by Aoki et al. (2003b) and analyzed under the Bayesian approach. In this article, adopting the classical approach, we analyze the asymptotic test statistics and present a simulation study to compare the behavior of the three statistics for different sample sizes, parameter values and nominal levels of the test. Closed-form expressions for the score function and the Fisher information matrix are also presented. We consider two real numerical illustrations: the odontological data set from Hadgu and Koch (1999), and a quality control data set.
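The Wald, score, and likelihood ratio statistics are easiest to compare in a toy one-parameter setting. The sketch below uses a Bernoulli proportion with H0: p = p0, a deliberately minimal stand-in for the multivariate errors-in-variables model of the paper; all three statistics are asymptotically chi-square(1) but differ in finite samples, which is exactly what such a simulation study probes.

```python
import math

def bernoulli_tests(k, n, p0):
    """Wald, score, and LR statistics for H0: p = p0
    with k successes in n Bernoulli trials."""
    phat = k / n
    wald = (phat - p0) ** 2 / (phat * (1 - phat) / n)   # info at phat
    score = (phat - p0) ** 2 / (p0 * (1 - p0) / n)      # info at p0
    loglik = lambda p: k * math.log(p) + (n - k) * math.log(1 - p)
    lr = 2 * (loglik(phat) - loglik(p0))
    return wald, score, lr

wald, score, lr = bernoulli_tests(k=60, n=100, p0=0.5)
```

With 60 successes in 100 trials the three statistics come out close to, but not equal to, one another (roughly 4.17, 4.00, and 4.03), illustrating why their finite-sample behavior is worth comparing.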
Abstract:
This paper develops a bias correction scheme for a multivariate heteroskedastic errors-in-variables model. The applicability of this model is justified in areas such as astrophysics, epidemiology and analytical chemistry, where the variables are subject to measurement errors and the variances vary with the observations. We conduct Monte Carlo simulations to investigate the performance of the corrected estimators. The numerical results show that the bias correction scheme yields nearly unbiased estimates. We also give an application to a real data set.
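The core phenomenon that such corrections address is attenuation: regressing on an error-prone covariate biases the slope toward zero. A minimal moment-correction sketch under heteroskedastic but known error variances (the setup, sample size, and correction formula below are illustrative, not the paper's estimator):

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta = 5_000, 2.0
sigma_u = rng.uniform(0.3, 0.7, size=n)     # known error SDs, vary per obs.

x = rng.normal(0.0, 1.0, n)                 # true (unobserved) covariate
w = x + sigma_u * rng.standard_normal(n)    # error-prone measurement
y = beta * x + 0.2 * rng.standard_normal(n)

naive = np.cov(w, y)[0, 1] / np.var(w, ddof=1)
# Moment correction: subtract the average measurement-error variance
# from the denominator to undo the attenuation.
corrected = np.cov(w, y)[0, 1] / (np.var(w, ddof=1) - np.mean(sigma_u ** 2))
```

The naive slope underestimates β = 2 by roughly the reliability ratio Var(x)/(Var(x) + E[σ_u²]), while the corrected estimator is approximately unbiased, which is what the paper's Monte Carlo experiments verify for its own (more refined) correction scheme.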
Abstract:
This paper deals with asymptotic results on a multivariate ultrastructural errors-in-variables regression model with equation errors. Sufficient conditions for attaining consistent estimators for model parameters are presented. Asymptotic distributions for the line regression estimators are derived. Applications to the elliptical class of distributions with two error assumptions are presented. The model generalizes previous results aimed at univariate scenarios. (C) 2010 Elsevier Inc. All rights reserved.
Abstract:
We consider a Bayesian approach for the nonlinear regression model, replacing the normal distribution on the error term by skewed distributions that account for both skewness and heavy tails, or skewness alone. The type of data considered in this paper concerns repeated measurements taken in time on a set of individuals. Such multiple observations on the same individual generally produce serially correlated outcomes; thus, our model additionally allows for correlation between observations made on the same individual. We illustrate the procedure using a data set on the growth curves of a clinical measurement for a group of pregnant women from an obstetrics clinic in Santiago, Chile. Parameter estimation and prediction were carried out using appropriate posterior simulation schemes based on Markov chain Monte Carlo methods. Besides the deviance information criterion (DIC) and the conditional predictive ordinate (CPO), we suggest the use of proper scoring rules based on the posterior predictive distribution for comparing models. For our data set, all these criteria chose the skew-t model as the best model for the errors. The DIC and CPO criteria are also validated, for the model proposed here, through a simulation study. As a conclusion of this study, the DIC criterion is not trustworthy for this kind of complex model.
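The DIC is computed directly from posterior draws as the mean deviance plus an effective-number-of-parameters term, p_D = mean deviance minus the deviance at the posterior mean. A minimal sketch for a normal mean model with known unit variance (data and names are illustrative, not the paper's growth-curve model):

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.normal(1.0, 1.0, size=50)

# Posterior for the mean under a flat prior (known variance 1): N(ybar, 1/n)
draws = rng.normal(y.mean(), 1 / np.sqrt(len(y)), size=4_000)

def deviance(mu):
    """-2 log-likelihood of a N(mu, 1) model for y."""
    return np.sum((y - mu) ** 2) + len(y) * np.log(2 * np.pi)

dev_draws = np.array([deviance(m) for m in draws])
p_d = dev_draws.mean() - deviance(draws.mean())  # effective parameter count
dic = dev_draws.mean() + p_d
```

For this one-parameter model p_D comes out close to 1, as it should; the abstract's caution is that for complex hierarchical skew models this simple machinery can rank models misleadingly.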
Abstract:
We introduce in this paper the class of linear models with first-order autoregressive elliptical errors. The score functions and the Fisher information matrices are derived for the parameters of interest, and an iterative process is proposed for parameter estimation. Some robustness aspects of the maximum likelihood estimates are discussed. The normal curvatures of local influence are also derived for some usual perturbation schemes, and diagnostic graphics to assess the sensitivity of the maximum likelihood estimates are proposed. The methodology is applied to analyse the daily log excess returns on Microsoft stock, whose empirical distribution appears to exhibit AR(1) and heavy-tailed errors. (C) 2008 Elsevier B.V. All rights reserved.
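The flavor of such an iterative scheme can be seen in the classical Cochrane-Orcutt procedure for normal AR(1) errors: alternate between estimating the regression coefficients on quasi-differenced data and re-estimating the autocorrelation from the residuals. This is a normal-errors stand-in for the elliptical iteration of the paper; all values below are simulated and illustrative.

```python
import numpy as np

# Simulate y = a + b*t + e with AR(1) errors e_t = rho*e_{t-1} + u_t
rng = np.random.default_rng(3)
n, a, b, rho = 400, 1.0, 0.5, 0.6
t = np.arange(n, dtype=float)
e = np.zeros(n)
for i in range(1, n):
    e[i] = rho * e[i - 1] + rng.standard_normal()
y = a + b * t + e

X = np.column_stack([np.ones(n), t])
beta = np.linalg.lstsq(X, y, rcond=None)[0]     # OLS start
for _ in range(20):
    r = y - X @ beta
    rho_hat = (r[1:] @ r[:-1]) / (r[:-1] @ r[:-1])  # AR(1) from residuals
    ys = y[1:] - rho_hat * y[:-1]                   # quasi-differenced data
    Xs = X[1:] - rho_hat * X[:-1]
    beta = np.linalg.lstsq(Xs, ys, rcond=None)[0]
```

The iteration recovers both the regression slope and the autocorrelation; the elliptical version in the paper additionally reweights observations according to the assumed heavy-tailed density.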
Abstract:
In many epidemiological studies it is common to resort to regression models relating incidence of a disease and its risk factors. The main goal of this paper is to consider inference on such models with error-prone observations and variances of the measurement errors changing across observations. We suppose that the observations follow a bivariate normal distribution and the measurement errors are normally distributed. Aggregate data allow the estimation of the error variances. Maximum likelihood estimates are computed numerically via the EM algorithm. Consistent estimation of the asymptotic variance of the maximum likelihood estimators is also discussed. Test statistics are proposed for testing hypotheses of interest. Further, we implement a simple graphical device that enables an assessment of the model's goodness of fit. Results of simulations concerning the properties of the test statistics are reported. The approach is illustrated with data from the WHO MONICA Project on cardiovascular disease. Copyright (C) 2008 John Wiley & Sons, Ltd.
Abstract:
This paper generalizes the methodology of Cai and Brown [Cai, T., Brown, L.D., 1998. Wavelet shrinkage for nonequispaced samples. The Annals of Statistics 26, 1783-1799] for wavelet shrinkage for nonequispaced samples to the presence of correlated stationary Gaussian errors. If the true function is a member of a piecewise Hölder class, it is shown that, even for long-memory errors, the rate of convergence of the procedure is almost minimax relative to the independent and identically distributed errors case. (c) 2008 Elsevier B.V. All rights reserved.
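The basic wavelet shrinkage recipe is: transform, soft-threshold the detail coefficients at the universal threshold σ√(2 log n), and invert. The sketch below does one Haar level under i.i.d. noise; it is a minimal illustration only, since correlated or long-memory errors (the paper's setting) call for level-dependent thresholds.

```python
import numpy as np

rng = np.random.default_rng(4)
n, sigma = 256, 0.5
x = np.linspace(0, 1, n)
signal = np.where(x < 0.5, 0.0, 2.0)        # piecewise-constant truth
y = signal + sigma * rng.standard_normal(n)

approx = (y[0::2] + y[1::2]) / np.sqrt(2)   # Haar low-pass
detail = (y[0::2] - y[1::2]) / np.sqrt(2)   # Haar high-pass

lam = sigma * np.sqrt(2 * np.log(n))        # universal threshold
detail = np.sign(detail) * np.maximum(np.abs(detail) - lam, 0.0)  # soft

rec = np.empty(n)                            # inverse Haar transform
rec[0::2] = (approx + detail) / np.sqrt(2)
rec[1::2] = (approx - detail) / np.sqrt(2)

mse_noisy = np.mean((y - signal) ** 2)
mse_shrunk = np.mean((rec - signal) ** 2)
```

Because the true detail coefficients of a piecewise-constant function are (almost all) zero, thresholding kills the pure-noise coefficients and the reconstruction beats the raw data in mean squared error.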
Abstract:
Birnbaum-Saunders (BS) models have largely been applied in material fatigue studies and reliability analyses to relate the total time until failure to some type of cumulative damage. In many problems related to the medical field, such as chronic cardiac diseases and different types of cancer, cumulative damage caused by several risk factors might cause some degradation that leads to a fatigue process. In these cases, BS models can be suitable for describing the propagation lifetime. However, since the cumulative damage is assumed to be normally distributed in the BS distribution, the parameter estimates from this model can be sensitive to outlying observations. In order to attenuate this influence, we present BS models in which a Student-t distribution is assumed to explain the cumulative damage. In particular, we show that the maximum likelihood estimates of the Student-t log-BS models attribute smaller weights to outlying observations, producing robust parameter estimates. Some inferential results are also presented. In addition, a diagnostic analysis is derived based on local influence and on deviance-component and martingale-type residuals. Finally, a motivating example from the medical field is analyzed using log-BS regression models. Since the parameter estimates appear to be very sensitive to outlying and influential observations, the Student-t log-BS regression model should attenuate such influences. The model-checking methodologies developed in this paper are used to compare the fitted models.
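The downweighting mechanism is the standard Student-t one: in an iteratively reweighted estimation step, an observation with standardized residual r gets weight (ν + 1)/(ν + r²), so large residuals contribute little, whereas under normality every observation has weight 1. A minimal numeric illustration (the residual values are invented):

```python
import numpy as np

nu = 4.0                                       # Student-t degrees of freedom
r = np.array([0.1, -0.5, 1.0, 5.0])            # standardized residuals; last is an outlier
w = (nu + 1) / (nu + r ** 2)                   # Student-t IRLS weights
```

The outlier's weight drops below 0.2 while well-behaved observations keep weights near (or above) 1, which is exactly how the Student-t log-BS fit attenuates outlying observations.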
Abstract:
Grammar has always been an important part of language learning. Based on various theories, such as the universal grammar theory (Chomsky, 1959) and the input theory (Krashen, 1970), explicit and implicit teaching methods have been developed. Research shows that both methods may have benefits and disadvantages. The attitude towards English grammar teaching methods in schools has also changed, and nowadays grammar teaching methods and learning strategies, as part of language mastery, are among the discussion topics of linguists. This study focuses on teacher and learner experiences and beliefs about teaching English grammar and the difficulties learners may face. The aim of the study is to conduct a literature review and to find out what scientific knowledge exists concerning the previously named topics. Along with this, the relevant steering documents are investigated, focusing on grammar teaching at Swedish upper secondary schools. The universal grammar theory of Chomsky as well as Krashen’s input hypotheses provide the theoretical background for the current study. The study was conducted applying qualitative and quantitative methods. A systematic search in four databases (LIBRIS, ERIC, LLBA and Google Scholar) was used for collecting relevant publications. The results show that researchers identify different grammar areas that are perceived as problematic for learners all over the world. The most common explanation of these difficulties is the influence of the learner’s L1. Research presents teachers’ and learners’ beliefs about the benefits of grammar teaching methods. An effective combination of teaching methods needs to be found to fit learners’ expectations and individual needs. Together, these methods can contribute to achieving higher language proficiency levels and can therefore be successfully applied at Swedish upper secondary schools.
Abstract:
Traditionally, the issue of an optimum currency area is based on the theoretical underpinnings developed in the 1960s by McKinnon [13], Kenen [12] and mainly Mundell [14], who is concerned with the benefits of lowering transaction costs vis-à-vis adjustments to asymmetrical shocks. Recently, this theme has been reappraised with new aspects included in the analysis, such as incomplete markets, credibility of monetary policy and seigniorage, among others. For instance, Neumeyer [15] develops a general equilibrium model with incomplete asset markets and shows that a monetary union is desirable when the welfare gains of eliminating exchange rate volatility are greater than the cost of reducing the number of currencies available to hedge against risks. In this paper, we also resort to a general equilibrium model to evaluate financial aspects of an optimum currency area. Our focus is to appraise the welfare of a country heavily dependent on foreign capital that may suffer a speculative attack on its public debt. The welfare analysis uses as reference the self-fulfilling debt crisis model of Cole and Kehoe ([6], [7] and [8]), which is employed here to represent dollarization. Under this regime, the national government has no control over its monetary policy; the total public debt is denominated in dollars and is in the hands of international bankers. To describe a country that is a member of a currency union, we modify the original Cole-Kehoe model by including public debt denominated in the common currency, purchased only by national consumers. Under this arrangement, the member countries regain some influence over monetary policy decisions, which are, however, subject to majority voting.
We show that, for specific levels of dollar debt, creating an inflation tax on common-currency debt in order to avoid an external default is more desirable than suspending its payment, which is the only choice available to a dollarized economy when foreign creditors decide not to renew their loans.