979 results for Forecast error variance


Relevance: 80.00%

Abstract:

This paper uses a new method for describing dynamic comovement and persistence in economic time series, building on the contemporaneous forecast error method developed in den Haan (2000). This data description method is then used to address issues in New Keynesian model performance in two ways. First, well-known data patterns, such as output and inflation leads and lags and inflation persistence, are decomposed into forecast-horizon components to give a more complete description of the data. These results show that the well-known lead and lag patterns between output and inflation arise mostly at medium-term forecast horizons. Second, the data summary method is used to investigate a rich New Keynesian model with many modelling features to see which of these features can reproduce the lead, lag and persistence patterns seen in the data. Many studies have suggested that a backward-looking component in the Phillips curve is needed to match the data, but our simulations show this is not necessary. We show that a simple general equilibrium model with persistent IS-curve shocks and persistent supply shocks can reproduce the lead, lag and persistence patterns seen in the data.
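
The horizon-by-horizon comovement statistic described above can be illustrated with a small numerical sketch. Everything below (a bivariate VAR(1) with made-up coefficients) is an assumed toy setup, not the paper's model: for each horizon h, the correlation of the two series' h-step-ahead forecast errors is computed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a bivariate VAR(1): y_t = A y_{t-1} + e_t (illustrative coefficients)
A = np.array([[0.8, 0.1],
              [0.2, 0.7]])
T = 5000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(size=2)

def forecast_error_corr(y, A, h):
    # h-step forecast from the (known) VAR: E[y_{t+h} | y_t] = A^h y_t.
    # The den Haan-style statistic is the correlation of the two series'
    # h-step forecast errors, traced out across horizons h.
    Ah = np.linalg.matrix_power(A, h)
    pred = y[:-h] @ Ah.T          # forecasts made at time t for t+h
    err = y[h:] - pred            # realized h-step forecast errors
    return np.corrcoef(err[:, 0], err[:, 1])[0, 1]

corrs = [forecast_error_corr(y, A, h) for h in (1, 4, 8, 16)]
```

At h = 1 the forecast errors are just the contemporaneous shocks (here drawn independently), while at longer horizons the VAR dynamics induce comovement between the two errors.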

Relevance: 80.00%

Abstract:

This paper proposes an extended version of the basic New Keynesian monetary (NKM) model that incorporates revision processes for output and inflation data, in order to assess the importance of data revisions for the estimated monetary policy rule parameters and the transmission of policy shocks. Our empirical evidence, based on a structural econometric approach, suggests that although the initial announcements of output and inflation are not rational forecasts of the revised data, ignoring the presence of not-well-behaved revision processes may not be a serious drawback in the analysis of monetary policy in this framework. However, the transmission of inflation-push shocks is strongly affected by considering data revisions. This is especially true when the nominal stickiness parameter is estimated taking data revision processes into account.

Relevance: 80.00%

Abstract:

This thesis investigates the acute effects of air pollution on the peak expiratory flow (PEF) of schoolchildren aged 6 to 15 years living in municipalities of the Brazilian Amazon. The first article assessed the effects of fine particulate matter (PM2.5) on the PEF of 309 schoolchildren in the municipality of Alta Floresta, Mato Grosso (MT), during the 2006 dry season. Mixed-effects models were estimated for the whole sample and stratified by school shift and presence of asthma symptoms. The second article presents the strategies used to determine the variance function of the random error in the mixed-effects models. The third article analyses data from a panel study of 234 schoolchildren carried out during the 2008 dry season in Tangará da Serra, MT. Linear and polynomial distributed lag (PDLM) effects of inhalable particulate matter (PM10), PM2.5 and black carbon (BC) on PEF were assessed for all schoolchildren and stratified by age group. In all three articles, the mixed-effects models were adjusted for temporal trend, temperature, humidity and individual characteristics; the models also accounted for residual autocorrelation and the variance function of the random error. Regarding exposure, the effects of 5-h, 6-h, 12-h and 24-h exposures were assessed on the current day, at lags of 1 to 5 days, and as 2- and 3-day moving averages. For Alta Floresta, the models for all children indicated PEF reductions ranging from 0.26 l/min (95% CI: 0.49; 0.04) to 0.38 l/min (95% CI: 0.71; 0.04) for each 10 µg/m³ increase in PM2.5. No significant pollution effects were observed in the group of asthmatic children. The 24-h exposure had a significant effect in the afternoon-shift group and in the non-asthmatic group. Exposure from 0:00 to 5:30 was significant for both morning and afternoon students.
In Tangará da Serra, the results showed significant PEF reductions per 10-unit increase in pollutant, mainly at lags of 3, 4 and 5 days. For PM10, reductions ranged from 0.15 (95% CI: 0.29; 0.01) to 0.25 l/min (95% CI: 0.40; 0.10). For PM2.5, reductions were between 0.46 l/min (95% CI: 0.86; 0.06) and 0.54 l/min (95% CI: 0.95; 0.14). For BC, the reduction was approximately 0.014 l/min. For the PDLM, the most important effects were observed in models based on exposure from the current day up to 5 days back. The overall effect was significant only for PM10, with a PEF reduction of 0.31 l/min (95% CI: 0.56; 0.05). This approach also indicated significant lagged effects for all pollutants. Finally, the study identified children aged 6 to 8 years as the group most sensitive to pollution effects. The findings of this thesis suggest that air pollution from biomass burning is associated with reduced PEF in children and adolescents aged 6 to 15 years living in the Brazilian Amazon.

Relevance: 80.00%

Abstract:

Quantifying scientific uncertainty when setting total allowable catch limits for fish stocks is a major challenge, but it has been a requirement in the United States since changes to national fisheries legislation. Multiple sources of error are readily identifiable, including estimation error, model specification error, forecast error, and errors associated with the definition and estimation of reference points. Our focus here, however, is to quantify the influence of estimation error and model specification error on assessment outcomes. These are fundamental sources of uncertainty in developing scientific advice on appropriate catch levels, and although a study of these two factors may not be exhaustive, it is feasible with available information. For data-rich stock assessments conducted on the U.S. west coast, we report approximate coefficients of variation in terminal biomass estimates based on inversion of the assessment model's Hessian matrix (i.e., the asymptotic standard error). To summarize variation "among" stock assessments, as a proxy for model specification error, we characterize variation among multiple historical assessments of the same stock. Results indicate that for 17 groundfish and coastal pelagic species, the mean coefficient of variation of terminal biomass is 18%. In contrast, the coefficient of variation ascribable to model specification error (i.e., pooled among-assessment variation) is 37%. We show that if managers adopt a precautionary probability of overfishing equal to 0.40, and only model specification error is considered, a 9% reduction in the overfishing catch level is indicated.
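
The two coefficients of variation contrasted above can be sketched numerically. The biomass figures and Hessian value below are hypothetical, chosen only to show the two calculations side by side:

```python
import numpy as np

# Hypothetical terminal-biomass estimates (thousand t) from repeated
# historical assessments of the same stock.
assessments = np.array([120.0, 150.0, 95.0, 140.0, 110.0])

# "Among-assessment" CV: the paper's proxy for model specification
# error, i.e. spread across assessments relative to their mean.
cv_among = assessments.std(ddof=1) / assessments.mean()

# Within-assessment CV from the asymptotic standard error, i.e. the
# square root of the relevant diagonal element of the inverse Hessian
# of the assessment model (value below is made up).
var_terminal_biomass = 4.7e2
se = np.sqrt(var_terminal_biomass)
cv_within = se / assessments[-1]
```

In the paper's results the among-assessment CV (37%) is roughly double the within-assessment, Hessian-based CV (18%), which is the pattern this toy comparison is meant to mirror.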

Relevance: 80.00%

Abstract:

In this paper, a novel MPC strategy is proposed, referred to as lasso MPC. The new paradigm features an ℓ1-regularised least-squares loss function, in which the control error variance competes with the sum of the input channels' magnitudes (or slew rates) over the whole horizon length. This choice of cost is motivated by the successful development of LASSO theory in signal processing and machine learning, where sum-of-norms regularisation has shown a strong capability to provide robust and sparse solutions for system identification and feature selection. In this paper, a discrete-time dual-mode lasso MPC is formulated, and its stability is proven by application of standard MPC arguments. The controller is then tested on the problem of ship course keeping and roll reduction with rudder and fins in a directional stochastic sea. Simulations show that lasso MPC inherits positive features from its corresponding regressor: extreme reduction of the decision variables' magnitude, namely the actuators' magnitude (or variations), with finite-energy error, making it particularly promising for over-actuated systems. © 2012 AACC (American Automatic Control Council).
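
The sparsity-inducing effect of the ℓ1 term can be sketched on a toy horizon problem. The setup below (matrix G mapping stacked inputs u to a tracking target r, weight lam, and the ISTA solver) is illustrative and not the paper's formulation; it only demonstrates how an ℓ1-regularised least-squares cost drives many decision variables exactly to zero:

```python
import numpy as np

# Toy lasso-MPC-style cost over a horizon: minimise
#   ||G u - r||^2 + lam * ||u||_1
# where u stacks the inputs over the horizon. The l1 term zeroes out
# many inputs (sparse actuation). All names here are illustrative.
rng = np.random.default_rng(1)
G = rng.normal(size=(30, 10))
u_true = np.array([1.5, 0, 0, -2.0, 0, 0, 0, 0.5, 0, 0])
r = G @ u_true + 0.01 * rng.normal(size=30)
lam = 5.0

def ista(G, r, lam, iters=2000):
    # Iterative soft-thresholding: gradient step on the quadratic part,
    # then the soft-threshold operator for the l1 part.
    L = np.linalg.norm(G, 2) ** 2          # Lipschitz constant of the gradient
    u = np.zeros(G.shape[1])
    for _ in range(iters):
        grad = G.T @ (G @ u - r)
        z = u - grad / L
        u = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return u

u = ista(G, r, lam)
sparsity = int(np.sum(np.abs(u) < 1e-8))   # number of inputs driven to zero
```

The large "actuator moves" survive (shrunk slightly by the penalty) while most channels are switched off entirely, which is the behaviour the abstract highlights for over-actuated systems.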

Relevance: 80.00%

Abstract:

Accurate estimation of the instantaneous frequency of speech resonances is a hard problem, mainly due to phase discontinuities in the speech signal associated with excitation instants. We review a variety of approaches for enhanced frequency and bandwidth estimation in the time domain and propose a new, cognitively motivated approach using filterbank arrays. We show that by filtering speech resonances using filters of different center frequency, bandwidth and shape, the ambiguity in instantaneous frequency estimation associated with amplitude envelope minima and phase discontinuities can be significantly reduced. The novel estimators are shown to perform well on synthetic speech signals with frequency and bandwidth micro-modulations (i.e., modulations within a pitch period), as well as on real speech signals. Filterbank arrays, when applied to frequency and bandwidth modulation index estimation, are shown to reduce the estimation error variance by 85% and 70%, respectively. © 2013 IEEE.
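
The basic quantity at stake, instantaneous frequency as the derivative of the analytic-signal phase, can be sketched for a single clean resonance. This is a minimal single-band illustration (FFT-based Hilbert transform on a pure tone), not the paper's filterbank-array method, which refines exactly this estimate when envelope minima and phase jumps are present:

```python
import numpy as np

fs = 8000                              # sample rate (Hz), illustrative
t = np.arange(0, 0.05, 1 / fs)         # 50 ms of signal
f0 = 500.0                             # tone standing in for a resonance
x = np.cos(2 * np.pi * f0 * t)

def analytic(x):
    # Analytic signal via the FFT (numpy-only Hilbert transform):
    # zero the negative frequencies, double the positive ones.
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    h[1:(n + 1) // 2] = 2
    if n % 2 == 0:
        h[n // 2] = 1
    return np.fft.ifft(X * h)

z = analytic(x)
phase = np.unwrap(np.angle(z))
inst_f = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency (Hz)
est = float(np.median(inst_f))
```

For a clean tone the phase derivative recovers the frequency almost exactly; the hard cases motivating the paper are signals where the envelope nearly vanishes and the phase jumps.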

Relevance: 80.00%

Abstract:

A method for solving the stationary state probability is presented for the first-order bang-bang phase-locked loop (BBPLL) with nonzero loop delay. It is based on a delayed Markov chain model and a state flow diagram for tracing the state history due to the loop delay. As a result, an eigenequation is obtained, and closed-form solutions are derived for some cases. After obtaining the state probability, statistical characteristics such as the mean gain of the binary phase detector and the timing error variance are calculated and demonstrated.
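
The core computation, a stationary distribution from a Markov chain model, can be sketched generically. The 3-state transition matrix below is illustrative, not the BBPLL chain itself: the stationary probability vector is the left eigenvector of the transition matrix with eigenvalue 1.

```python
import numpy as np

# Illustrative 3-state Markov chain (rows sum to 1).
P = np.array([[0.7, 0.3, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

# Stationary distribution pi satisfies pi P = pi, i.e. pi is the left
# eigenvector of P (right eigenvector of P^T) with eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()                     # normalise to a probability vector
```

Once the state probabilities are known, moments such as a mean detector gain or a timing error variance are weighted sums over the states, which is the step the abstract describes.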

Relevance: 80.00%

Abstract:

Wind energy is the energy source that contributes most to the renewable energy mix of European countries. While there are good wind resources throughout Europe, the intermittency of the wind represents a major problem for the deployment of wind energy into electricity networks. To ensure grid security, a Transmission System Operator today needs, for each kilowatt of wind energy, either an equal amount of spinning reserve or a forecasting system that can predict the amount of energy that will be produced from wind over a period of 1 to 48 hours. In the range from 5 m/s to 15 m/s, a wind turbine's production increases with the cube of the wind speed. For this reason, a Transmission System Operator requires an accuracy of 1 m/s for wind speed forecasts in this range. Forecasting wind energy with a numerical weather prediction model in this context forms the background of this work. The author's goal was to present a pragmatic solution to this specific problem in the "real world". This work therefore has to be seen in a technical context and does not provide, nor intends to provide, a general overview of the benefits and drawbacks of wind energy as a renewable energy source. In the first part of this work, the accuracy requirements of the energy sector for wind speed predictions from numerical weather prediction models are described and analysed. A unique set of numerical experiments was carried out in collaboration with the Danish Meteorological Institute to investigate the forecast quality of an operational numerical weather prediction model for this purpose. The results of this investigation revealed that the accuracy requirements for wind speed and wind power forecasts from today's numerical weather prediction models can only be met at certain times. This means that the uncertainty of the forecast quality becomes a parameter as important as the wind speed and wind power themselves.
Quantifying the uncertainty of a forecast valid for tomorrow requires an ensemble of forecasts. In the second part of this work, such an ensemble of forecasts was designed and verified for its ability to quantify the forecast error. This was accomplished by correlating the measured error with the forecast uncertainty for area-integrated wind speed and wind power in Denmark and Ireland; a correlation of 93% was achieved in these areas. This method cannot by itself meet the accuracy requirements of the energy sector. By knowing the uncertainty of the forecasts, however, the focus can be put on the accuracy requirements at times when it is possible to predict the weather accurately. This result therefore presents a major step forward in making wind energy a compatible energy source in the future.
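
The spread-error verification described above can be sketched with synthetic numbers. Everything below is an assumed toy setup (daily error regimes, ensemble size, wind-speed values are all made up): the ensemble spread, standing in for the forecast uncertainty, is correlated against the measured absolute error of the ensemble mean.

```python
import numpy as np

rng = np.random.default_rng(2)

days, members = 365, 50
true_sigma = rng.uniform(0.5, 3.0, size=days)      # daily error regime (m/s)
truth = rng.normal(10.0, 2.0, size=days)           # "measured" wind speed
# Each day's ensemble scatters around the truth with that day's sigma.
ens = truth[:, None] + true_sigma[:, None] * rng.normal(size=(days, members))

spread = ens.std(axis=1)                  # forecast uncertainty proxy
abs_err = np.abs(ens.mean(axis=1) - truth)  # measured error of the mean
corr = float(np.corrcoef(spread, abs_err)[0, 1])
```

When the ensemble's spread genuinely tracks the day-to-day error regime, this correlation is clearly positive, which is the relationship the thesis verifies (reporting 93% for area-integrated quantities).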

Relevance: 80.00%

Abstract:

Gas-fired generation currently plays an integral supporting role in ensuring security of supply in power systems with high wind power penetration, owing to its technical and economic attributes. However, the increase in variable wind power has altered the gas generation output profile and is pushing the boundaries of the design and operating envelope of gas infrastructure. This paper investigates the mutual dependence and interaction between the electricity generation and gas systems through the first comprehensive, joined-up, multi-vector energy system analysis for Ireland. Key findings reveal the high vulnerability of the Irish power system to outages on the Irish gas system. It is shown that the economic operation of the power system can be severely impacted by gas infrastructure outages, raising the average system marginal price from €67/MWh in the base case to up to €167/MWh. Gas infrastructure outages are also shown to pose problems for the location of power system reserve provision, with a 150% increase in provision across a power system transmission bottleneck. Wind forecast error was shown to be a significant cause for concern, producing large swings in gas demand that require key gas infrastructure to operate at close to 100% capacity. These issues are expected to grow in prominence as installed wind capacity increases towards 2020, placing further stress on both the power and gas systems in maintaining security of supply.

Relevance: 80.00%

Abstract:

A wide range of tests for heteroskedasticity has been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are generally based on asymptotic approximations that may not provide good size control in finite samples. A number of recent studies have sought to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods, yet these remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics (e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria) as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values for both the standard and the newly suggested tests. We show that the Monte Carlo test procedure conveniently solves the intractable null distribution problem, in particular the problems raised by the sup-type and combined test statistics as well as (when relevant) unidentified nuisance parameters under the null hypothesis. The method proposed works in exactly the same way with both Gaussian and non-Gaussian disturbance distributions (such as heavy-tailed or stable distributions). The performance of the procedures is examined by simulation.
The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH, and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with (i) one exogenous variable or (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; and (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
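
The Monte Carlo test idea can be sketched in a few lines. The toy variance-ratio statistic below is illustrative (a crude Goldfeld-Quandt flavour), not one of the paper's procedures: when the statistic can be simulated under a fully specified null, p = (1 + #{S_i ≥ S_obs}) / (N + 1) is an exact finite-sample p-value, however intractable the null distribution is analytically.

```python
import numpy as np

rng = np.random.default_rng(3)

def mc_pvalue(s_obs, stat, simulate_null, n_rep=999):
    # Exact Monte Carlo p-value: rank the observed statistic among
    # n_rep simulated null replicates.
    sims = np.array([stat(simulate_null()) for _ in range(n_rep)])
    return (1 + np.sum(sims >= s_obs)) / (n_rep + 1)

n = 100

def stat(e):
    # Crude two-group variance-ratio statistic (illustrative only).
    v1, v2 = e[: n // 2].var(), e[n // 2:].var()
    return max(v1 / v2, v2 / v1)

def simulate_null():
    return rng.normal(size=n)          # homoskedastic Gaussian errors

# Heteroskedastic sample: the second half has three times the error std.
e_obs = rng.normal(size=n) * np.r_[np.ones(n // 2), 3.0 * np.ones(n // 2)]
p = mc_pvalue(stat(e_obs), stat, simulate_null)
```

Because the p-value is a rank statistic, its size is exact for any N, which is what lets the paper handle sup-type and combined statistics whose null distributions are otherwise intractable.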

Relevance: 80.00%

Abstract:

Using the method of Lorenz (1982), we have estimated the predictability of a recent version of the European Centre for Medium-Range Weather Forecasts (ECMWF) model using two different estimates of the initial error, corresponding to 6- and 24-hr forecast errors, respectively. For a 6-hr forecast error of the extratropical 500-hPa geopotential height field, a potential increase in forecast skill of more than 3 days is suggested, indicating a further increase in predictability of another 1.5 days compared to the use of a 24-hr forecast error. This is due to a smaller initial error and to an initial error reduction resulting in a smaller averaged growth rate over the whole 7-day forecast. A similar assessment for the tropics, using the wind vector fields at 850 and 250 hPa, suggests a huge potential improvement, with a 7-day forecast providing the same skill as a 1-day forecast does now. A contributing factor to the increased estimate of predictability is the apparently slow growth of error during the early part of the forecast.
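
The link between a smaller initial error and extra lead time can be illustrated with a simple saturating error-growth model. The logistic form and all numbers below are assumptions for illustration, not the paper's fitted values: if errors grow as dE/dt = a·E·(1 − E/E_sat), shrinking the initial error E0 delays the time at which the error reaches a skill threshold.

```python
import numpy as np

a, E_sat = 0.45, 100.0          # growth rate (1/day) and saturation (made up)

def time_to_threshold(E0, thresh):
    # Closed-form solution of the logistic ODE, solved for t:
    #   E(t) = E_sat / (1 + (E_sat/E0 - 1) * exp(-a t))
    return np.log((thresh * (E_sat - E0)) / (E0 * (E_sat - thresh))) / a

# Lead-time gained by starting from a 3x smaller initial error,
# measured at an illustrative skill threshold of 80% of saturation.
gain = time_to_threshold(5.0, 80.0) - time_to_threshold(15.0, 80.0)
```

With these illustrative numbers the smaller initial error buys a bit under three extra days before the threshold is reached, the same qualitative mechanism behind the predictability gains reported in the abstract.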

Relevance: 80.00%

Abstract:

A regional study of the prediction of extratropical cyclones by the European Centre for Medium-Range Weather Forecasts (ECMWF) Ensemble Prediction System (EPS) has been performed. An objective feature-tracking method has been used to identify and track the cyclones along the forecast trajectories. Forecast error statistics have then been produced for the position, intensity, and propagation speed of the storms. In previous work, data limitations meant it was only possible to present the diagnostics for the entire Northern Hemisphere (NH) or Southern Hemisphere. A larger data sample has allowed the diagnostics to be computed separately for smaller regions around the globe and has made it possible to explore the regional differences in the prediction of storms by the EPS. Results show that in the NH there is a larger ensemble mean error in the position of storms over the Atlantic Ocean. Further analysis revealed that this is mainly due to errors in the prediction of storm propagation speed rather than in direction. Forecast storms propagate too slowly in all regions, but the bias is about 2 times as large in the NH Atlantic region. The results show that storm intensity is generally overpredicted over the ocean and underpredicted over the land and that the absolute error in intensity is larger over the ocean than over the land. In the NH, large errors occur in the prediction of the intensity of storms that originate as tropical cyclones but then move into the extratropics. The ensemble is underdispersive for the intensity of cyclones (i.e., the spread is smaller than the mean error) in all regions. The spatial patterns of the ensemble mean error and ensemble spread are very different for the intensity of cyclones. Spatial distributions of the ensemble mean error suggest that large errors occur during the growth phase of storm development, but this is not indicated by the spatial distributions of the ensemble spread. In the NH there are further differences. 
First, the large errors in the prediction of the intensity of cyclones that originate in the tropics are not indicated by the spread. Second, the ensemble mean error is larger over the Pacific Ocean than over the Atlantic, whereas the opposite is true for the spread. The value of a storm-tracking approach, to both weather forecasters and developers of forecast systems, is also discussed.

Relevance: 80.00%

Abstract:

The impact of targeted sonde observations on the 1-3 day forecasts for northern Europe is evaluated using the Met Office four-dimensional variational data assimilation scheme and a 24 km gridlength limited-area version of the Unified Model (MetUM). The targeted observations were carried out during February and March 2007 as part of the Greenland Flow Distortion Experiment, using a research aircraft based in Iceland. Sensitive area predictions using either total energy singular vectors or an ensemble transform Kalman filter were used to predict where additional observations should be made to reduce errors in the initial conditions of forecasts for northern Europe. Targeted sonde data was assimilated operationally into the MetUM. Hindcasts show that the impact of the sondes was mixed. Only two out of the five cases showed clear forecast improvement; the maximum forecast improvement seen over the verifying region was approximately 5% of the forecast error 24 hours into the forecast. These two cases are presented in more detail: in the first the improvement propagates into the verification region with a developing polar low; and in the second the improvement is associated with an upper-level trough. The impact of cycling targeted data in the background of the forecast (including the memory of previous targeted observations) is investigated. This is shown to cause a greater forecast impact, but does not necessarily lead to a greater forecast improvement. Finally, the robustness of the results is assessed using a small ensemble of forecasts.

Relevance: 80.00%

Abstract:

A new spectral-based approach is presented for finding orthogonal patterns in gridded weather/climate data. The method is based on optimising the interpolation error variance: the optimally interpolated patterns (OIP) are given by the eigenvectors of the interpolation error covariance matrix, obtained using the cross-spectral matrix. The formulation of the approach is presented, and it is applied to low-dimensional stochastic toy models and to various reanalysis datasets. In particular, it is found that the lowest-frequency patterns correspond to the largest eigenvalues, that is, variances, of the interpolation error matrix. The approach has been applied to Northern Hemisphere (NH) and tropical sea level pressure (SLP) and to Indian Ocean sea surface temperature (SST). Two main OIP patterns are found for the NH SLP, representing the North Atlantic Oscillation and the North Pacific pattern, respectively. The leading tropical SLP OIP represents the Southern Oscillation. For the Indian Ocean SST, the leading OIP pattern shows a tripole-like structure, with one sign over the eastern, northwestern and southwestern parts and the opposite sign over the remaining parts of the basin. This pattern is also found to have a high lagged correlation with the Niño-3 index at a 6-month lag.
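
The generic step behind such pattern methods, taking eigenvectors of a covariance matrix built from gridded anomalies, can be sketched on synthetic data. This is the plain data-covariance case for illustration; the OIP method specifically eigendecomposes the interpolation-error covariance built from the cross-spectral matrix, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic gridded data: one dominant spatial pattern plus noise.
n_time, n_grid = 500, 20
base = rng.normal(size=(n_time, 1)) @ rng.normal(size=(1, n_grid))
data = base + 0.3 * rng.normal(size=(n_time, n_grid))
anom = data - data.mean(axis=0)        # anomalies about the time mean

# Eigendecomposition of the (grid x grid) covariance matrix; the
# eigenvectors are mutually orthogonal spatial patterns.
cov = anom.T @ anom / (n_time - 1)
evals, evecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
leading = evecs[:, -1]                 # dominant orthogonal pattern
explained = evals[-1] / evals.sum()    # variance fraction it explains
```

Here the planted rank-one structure dominates, so the leading eigenvector recovers it and accounts for most of the variance, mirroring how the leading OIPs capture the NAO or Southern Oscillation in the reanalysis applications.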

Relevance: 80.00%

Abstract:

The ECMWF full-physics and dry singular vector (SV) packages, using a dry energy norm and a 1-day optimization time, are applied to four high impact European cyclones of recent years that were almost universally badly forecast in the short range. It is shown that these full-physics SVs are much more relevant to severe cyclonic development than those based on dry dynamics plus boundary layer alone. The crucial extra ingredient is the representation of large-scale latent heat release. The severe winter storms all have a long, nearly straight region of high baroclinicity stretching across the Atlantic towards Europe, with a tongue of very high moisture content on its equatorward flank. In each case some of the final-time top SV structures pick out the region of the actual storm. The initial structures were generally located in the mid- to low troposphere. Forecasts based on initial conditions perturbed by moist SVs with opposite signs and various amplitudes show the range of possible 1-day outcomes for reasonable magnitudes of forecast error. In each case one of the perturbation structures gave a forecast very much closer to the actual storm than the control forecast. Deductions are made about the predictability of high-impact extratropical cyclone events. Implications are drawn for the short-range forecast problem and suggestions made for one practicable way to approach short-range ensemble forecasting. Copyright © 2005 Royal Meteorological Society.
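
The singular-vector idea underlying the abstract above can be sketched in linear algebra terms. The toy matrix below stands in for a tangent-linear forecast propagator over the optimisation time (the real SV computation uses the model's tangent-linear and adjoint operators under an energy norm, none of which is reproduced here): the fastest-growing initial perturbations are the right singular vectors of the propagator.

```python
import numpy as np

rng = np.random.default_rng(5)
M = rng.normal(size=(6, 6))            # toy tangent-linear propagator

# SVD: right singular vectors (rows of Vt) are initial perturbation
# patterns; the singular values are their amplification factors over
# the optimisation time under the L2 norm.
U, s, Vt = np.linalg.svd(M)
leading_sv = Vt[0]                     # fastest-growing initial pattern
growth = s[0]                          # its amplification factor

# Evolving the leading SV amplifies its norm by exactly s[0].
amplified = np.linalg.norm(M @ leading_sv)
```

Perturbing the initial conditions along such leading structures, with either sign and various amplitudes, is the mechanism the paper uses to span the range of possible 1-day outcomes for the storm cases.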