16 results for streamflow forecasts
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
We propose two simple evaluation methods for time-varying density forecasts of continuous higher-dimensional random variables. Both methods are based on the probability integral transformation for unidimensional forecasts. The first method tests multinormal densities and relies on a rotation of the coordinate system. The advantage of the second method is not only its applicability to any continuous distribution but also its ability to evaluate forecast accuracy in specific regions of the domain, as defined by the user's interest. We show that the latter property is particularly useful for evaluating a multidimensional generalization of the Value at Risk. In simulations and in an empirical study, we examine the performance of both tests.
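The probability integral transformation underlying both methods is easy to illustrate: if the forecast CDF F_t is correct, the transformed observations z_t = F_t(y_t) are i.i.d. uniform on (0, 1). A minimal sketch, not the authors' tests; the Gaussian forecast and the simple moment checks are illustrative assumptions:

```python
import math
import random

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of N(mu, sigma^2) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pit_values(observations, forecast_cdf):
    """Probability integral transform: z_t = F_t(y_t).
    If the forecast density is correct, the z_t are i.i.d. U(0,1)."""
    return [forecast_cdf(y) for y in observations]

random.seed(0)
ys = [random.gauss(0.0, 1.0) for _ in range(5000)]  # data drawn from the forecast model
zs = pit_values(ys, normal_cdf)

# Under a correct forecast, the PIT sample mean should be near 1/2
# and the variance near 1/12 (the moments of U(0,1)).
mean = sum(zs) / len(zs)
var = sum((z - mean) ** 2 for z in zs) / len(zs)
```

A full test would check the whole distribution of the z_t (e.g. against uniformity), but the moment comparison already conveys the idea.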
Abstract:
We propose a simple and flexible framework for forecasting the joint density of asset returns. The multinormal distribution is augmented with a polynomial in (time-varying) non-central co-moments of assets. We estimate the coefficients of the polynomial via the Method of Moments for a carefully selected set of co-moments. In an extensive empirical study, we compare the proposed model with a range of other models widely used in the literature. Employing recently proposed as well as standard techniques to evaluate multivariate forecasts, we conclude that the augmented joint density provides highly accurate forecasts of the “negative tail” of the joint distribution.
Abstract:
A significant part of the literature on input-output (IO) analysis is dedicated to developing and applying methodologies for forecasting and updating technology coefficients and multipliers. Prominent among these techniques is the RAS method, although more data-demanding econometric methods, as well as other less promising approaches, have also been proposed. However, little interest has been expressed in more modern and often more innovative methods, such as neural networks, in IO analysis generally. This study constructs, proposes and applies a Backpropagation Neural Network (BPN) to forecast IO technology coefficients and, subsequently, multipliers. The RAS method is also applied to the same set of UK IO tables, and the results of both methods are discussed and compared. The results show that the BPN offers a valid alternative approach to IO technology forecasting, and many forecasts were more accurate using this method. Overall, however, the RAS method outperformed the BPN, but the difference is too small to be systematic, and there are further ways to improve the performance of the BPN.
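The RAS method referred to above is a biproportional scaling algorithm: rows and columns of a base-year coefficient matrix are alternately rescaled until they match target margins. A minimal sketch, with an illustrative 2x2 matrix and targets rather than data from the study:

```python
def ras_update(A0, row_targets, col_targets, tol=1e-10, max_iter=1000):
    """Biproportional (RAS) scaling: iteratively rescale rows and columns
    of A0 until its margins match the given row and column sums."""
    n = len(A0)
    A = [row[:] for row in A0]
    for _ in range(max_iter):
        # Scale each row to hit its row target.
        for i in range(n):
            r = row_targets[i] / sum(A[i])
            A[i] = [a * r for a in A[i]]
        # Scale each column to hit its column target.
        col_sums = [sum(A[i][j] for i in range(n)) for j in range(n)]
        for j in range(n):
            c = col_targets[j] / col_sums[j]
            for i in range(n):
                A[i][j] *= c
        # Converged when row sums still match after the column step.
        if all(abs(sum(A[i]) - row_targets[i]) < tol for i in range(n)):
            break
    return A

# Illustrative base matrix and (consistent) target margins.
A0 = [[4.0, 2.0], [3.0, 1.0]]
A1 = ras_update(A0, row_targets=[7.0, 3.0], col_targets=[6.0, 4.0])
```

The row and column totals must be consistent (both sum to the same grand total) for the iteration to converge.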
Abstract:
The rationality of investors during asset price bubbles has been the subject of considerable debate. An analysis of the British Railway Mania, which occurred in the 1840s, suggests that investors may have been myopic, as their expectations were only accurate in the short-term, but they remained rational, as they acted in a utility maximising manner given their expectations. Investors successfully incorporated forecasts of short-term dividend changes into their valuations, but were unable to predict longer-term changes. When short-term growth is controlled for, it appears that the railways were priced consistently with the non-railways throughout the entire episode.
Abstract:
Wind power generation differs from conventional thermal generation due to the stochastic nature of wind. Wind power forecasting therefore plays a key role in dealing with the challenges of balancing supply and demand in any electricity system, given the uncertainty associated with wind farm power output. Accurate wind power forecasting reduces the need for additional balancing energy and reserve power to integrate wind power. Wind power forecasting tools enable better dispatch, scheduling and unit commitment of thermal generators, hydro plant and energy storage plant, and more competitive market trading as wind power ramps up and down on the grid. This paper presents an in-depth review of current methods and advances in wind power forecasting and prediction. Firstly, numerical wind prediction methods from global to local scales, ensemble forecasting, and upscaling and downscaling processes are discussed. Next, statistical and machine learning methods are detailed. Then the techniques used for benchmarking and uncertainty analysis of forecasts are reviewed, and the performance of various approaches over different forecast time horizons is examined. Finally, current research activities, challenges and potential future developments are appraised. (C) 2011 Elsevier Ltd. All rights reserved.
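A common starting point for the benchmarking discussed in such reviews is the persistence forecast (the next value equals the last observed value), compared against a climatology (long-run mean) baseline by mean absolute error. A minimal sketch on synthetic data; the clipped random walk standing in for normalised wind farm output is an illustrative assumption, not a model from the paper:

```python
import random

def mae(forecast, actual):
    """Mean absolute error of a forecast sequence."""
    return sum(abs(f - a) for f, a in zip(forecast, actual)) / len(actual)

random.seed(42)
# Synthetic normalised wind power series: a clipped random walk in [0, 1].
power, level = [], 0.5
for _ in range(500):
    level = min(1.0, max(0.0, level + random.gauss(0.0, 0.05)))
    power.append(level)

actual = power[1:]
persistence = power[:-1]            # forecast: next value = current value
clim = sum(power) / len(power)
climatology = [clim] * len(actual)  # forecast: next value = long-run mean

p_mae = mae(persistence, actual)
c_mae = mae(climatology, actual)
```

For very short horizons, persistence is typically the harder baseline to beat, which is why forecast skill is often reported relative to it.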
Abstract:
Heat Alert and Response Systems (HARS) are currently undergoing testing and implementation in Canada. These programs seek to reduce the adverse effects of heat waves on human health by issuing weather forecasts and warnings, informing individuals about possible protections from excessive heat, and providing such protections to vulnerable subpopulations and individuals at risk. For these programs to be designed effectively, it is important to know how individuals perceive heat, what their experience with heat-related illness is, how they protect themselves from excessive heat, and how they acquire information about such protections. In September 2010, we conducted a survey of households in 5 cities in Canada to study these issues. At the time of the survey, these cities had not implemented heat outreach and response systems. The study results indicate that individuals' recollections of recent heat wave events were generally accurate. About 21% of the sample reported feeling unwell during the most recent heat spell, but these illnesses were generally minor. In only 25 of 243 cases were these illnesses confirmed or diagnosed by a health care professional. The rate at which our respondents reported heat-related illnesses was higher among those with cardiovascular and respiratory illnesses, higher among younger respondents, and bore no relationship to the availability of air conditioning at home. Most of the respondents indicated that they would not dismiss themselves as
Abstract:
Longevity risk has become one of the major risks facing the insurance and pensions markets globally. The trade in longevity risk is underpinned by accurate forecasting of mortality rates. Using techniques from macroeconomic forecasting, we propose a dynamic factor model of mortality that fits and forecasts mortality rates parsimoniously. We compare the forecasting quality of this model with that of existing models and find that the dynamic factor model generally provides superior forecasts when applied to international mortality data. We also show that existing multifactorial models have superior fit, but their forecasting performance worsens as more factors are added. The dynamic factor approach used here can potentially be improved further by applying an appropriate stopping rule for the number of static and dynamic factors.
Abstract:
An appreciation of the quantity of streamflow derived from the main hydrological pathways involved in transporting diffuse contaminants is critical when addressing a wide range of water resource management issues. In order to assess hydrological pathway contributions to streams, it is necessary to provide feasible upper and lower bounds for flows in each pathway. An important first step in this process is to provide reliable estimates of the slower-responding groundwater pathways and subsequently the quicker overland and interflow pathways. This paper investigates the effectiveness of a multi-faceted approach that applies different hydrograph separation techniques, supplemented by lumped hydrological modelling, to calculate the Baseflow Index (BFI) and develop an integrated approach to hydrograph separation. A semi-distributed, lumped and deterministic rainfall-runoff model known as NAM has been applied to ten catchments (ranging from 5 to 699 km2). While this modelling approach is useful as a validation method, NAM itself is also an important investigative tool. The separation techniques produce a large variation in BFI: a difference of 0.741 in predicted BFI for one catchment when the less reliable fixed-interval, sliding-interval and local-minimum turning point methods are included. This variation is reduced to 0.167 when these methods are omitted. The Boughton and Eckhardt algorithms, while quite subjective in their use, provide quick and easily implemented approaches for obtaining physically realistic hydrograph separations. While the different separation techniques give varying BFI values for each of the catchments, a recharge coefficient approach developed in Ireland, when applied in conjunction with the Master Recession Curve Tabulation method, predicts estimates in agreement with those obtained using the NAM model, and these estimates are also consistent with the study catchments' geology.
These two separation methods, in conjunction with the NAM model, were selected to form an integrated approach to assessing BFI in catchments.
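Of the filters named above, the Eckhardt algorithm has a compact recursive form, so a sketch is easy to give. The parameter values below (recession constant and maximum BFI) and the synthetic hydrograph are illustrative, not the calibrated values from the study:

```python
def eckhardt_baseflow(q, alpha=0.98, bfi_max=0.80):
    """Eckhardt (2005) recursive digital filter for baseflow separation.
    alpha: recession constant; bfi_max: maximum attainable Baseflow Index.
    Both parameters are illustrative here, not calibrated values."""
    b = [q[0]]  # initialise baseflow at the first observed flow
    for t in range(1, len(q)):
        bt = ((1 - bfi_max) * alpha * b[-1] + (1 - alpha) * bfi_max * q[t]) \
             / (1 - alpha * bfi_max)
        b.append(min(bt, q[t]))  # baseflow cannot exceed total streamflow
    return b

# Synthetic hydrograph: slow recession interrupted by one storm peak.
q = [5.0, 4.8, 20.0, 12.0, 8.0, 6.0, 5.5, 5.2, 5.0, 4.9]
b = eckhardt_baseflow(q)
bfi = sum(b) / sum(q)  # Baseflow Index = baseflow volume / total volume
```

In practice alpha is fitted from recession analysis and bfi_max from catchment characteristics, which is where the subjectivity mentioned in the abstract enters.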
Abstract:
Flow responsive passive samplers offer considerable potential for nutrient monitoring in catchments, bridging the gap between the intermittency of grab sampling and the high cost of automated monitoring systems. A commercially available passive sampler was evaluated in a number of river systems encapsulating a gradient in storm response, combinations of diffuse and point source pressures, and levels of phosphorus and nitrogen concentrations. Phosphorus and nitrogen are sequestered to a resin matrix in a permeable cartridge positioned in line with streamflow. A salt tracer dissolves in proportion to advective flow through the cartridge. Multiple deployments of different cartridge types were undertaken and the recovery of P and N compared with the flow-weighted mean concentration (FWMC) from high-resolution bank-side analysers at each site. Results from the passive samplers were variable and largely underestimated the FWMC derived from the bank-side analysers. Laboratory tests using ambient river samples indicated good replication of advective throughflow using pumped water, although this appeared not to be a good analogue of river conditions where flow divergence was possible. Laboratory tests also showed good nutrient retention but poor elution, and these issues appeared to combine to limit their utility in ambient river systems at the small catchment scale.
Abstract:
Mortality models used for forecasting are predominantly based on the statistical properties of time series and do not generally incorporate an understanding of the forces driving secular trends. This paper addresses three research questions: Can the factors found in stochastic mortality-forecasting models be associated with real-world trends in health-related variables? Does inclusion of health-related factors in models improve forecasts? Do resulting models give better forecasts than existing stochastic mortality models? We consider whether the space spanned by the latent factor structure in mortality data can be adequately described by developments in gross domestic product, health expenditure and lifestyle-related risk factors using statistical techniques developed in macroeconomics and finance. These covariates are then shown to improve forecasts when incorporated into a Bayesian hierarchical model. Results are comparable or better than benchmark stochastic mortality models.
Abstract:
Mortality modelling for the purposes of demographic forecasting and actuarial pricing is generally done at an aggregate level using national data. Modelling at this level fails to capture the variation in mortality within country and potentially leads to a mis-specification of mortality forecasts for a subset of the population. This can have detrimental effects for pricing and reserving in the actuarial context. In this paper we consider mortality rates at a regional level and analyse the variation in those rates. We consider whether variation in mortality rates within a country can be explained using local economic and social variables. Using Northern Ireland data on mortality and measures of deprivation we identify the variables explaining mortality variation. We create a population polarisation variable and find that this variable is significant in explaining some of the variation in mortality rates. Further, we consider whether spatial and non-spatial models have a part to play in explaining mortality differentials.
Abstract:
In recent years, the issue of life expectancy has become of utmost importance to pension providers, insurance companies and government bodies in the developed world. Significant and consistent improvements in mortality rates and, hence, life expectancy have led to unprecedented increases in the cost of providing for older ages. This has resulted in an explosion of stochastic mortality models that forecast trends in mortality data in order to anticipate future life expectancy and, hence, quantify the costs of providing for future aging populations. Many stochastic models of mortality rates identify linear trends in mortality rates by time, age and cohort, and forecast these trends into the future using standard statistical methods. These modeling approaches fail to capture the effects of any structural change in the trend and thus potentially produce incorrect forecasts of future mortality rates. In this paper, we look at a range of leading stochastic models of mortality and test for structural breaks in the trend time series.
Abstract:
The paper addresses the issue of bandwidth choice in semiparametric estimation of the long memory parameter of a univariate time series process. The focus is on the properties of forecasts from the long memory model. A variety of cross-validation methods based on out-of-sample forecasting properties are proposed. These procedures are used for the choice of bandwidth and subsequent model selection. Simulation evidence is presented that demonstrates the advantage of the proposed new methodology.
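A standard semiparametric estimator of the long memory parameter to which such a bandwidth choice applies is the Geweke and Porter-Hudak (GPH) log-periodogram regression, where the bandwidth m is the number of Fourier frequencies used. A minimal sketch; the m = sqrt(n) rule and the white-noise test series are illustrative assumptions, not the paper's cross-validated choice:

```python
import cmath
import math
import random

def gph_estimate(x, m):
    """Log-periodogram (GPH) estimate of the long-memory parameter d,
    using the first m Fourier frequencies as the bandwidth.
    Plain DFT-based sketch; O(n*m), fine for short series."""
    n = len(x)
    mean = sum(x) / n
    regs, logs = [], []
    for j in range(1, m + 1):
        lam = 2 * math.pi * j / n
        dft = sum((x[t] - mean) * cmath.exp(-1j * lam * t) for t in range(n))
        periodogram = abs(dft) ** 2 / (2 * math.pi * n)
        logs.append(math.log(periodogram))
        regs.append(-2 * math.log(2 * math.sin(lam / 2)))  # GPH regressor
    # OLS slope of log I(lam_j) on -2*log(2*sin(lam_j/2)) estimates d.
    rb, lb = sum(regs) / m, sum(logs) / m
    num = sum((r - rb) * (l - lb) for r, l in zip(regs, logs))
    den = sum((r - rb) ** 2 for r in regs)
    return num / den

random.seed(1)
white = [random.gauss(0.0, 1.0) for _ in range(512)]
d_hat = gph_estimate(white, m=int(len(white) ** 0.5))  # common m = sqrt(n) rule
```

For white noise the true d is 0; the estimate fluctuates around it, and its variance shrinks as m grows, which is exactly the bias-variance trade-off that motivates data-driven bandwidth selection.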
Reducible Diffusions with Time-Varying Transformations with Application to Short-Term Interest Rates
Abstract:
Reducible diffusions (RDs) are nonlinear transformations of analytically solvable Basic Diffusions (BDs). Hence, by construction, RDs are analytically tractable and flexible diffusion processes. The existing literature on RDs has mostly focused on time-homogeneous transformations, which to a significant extent fail to exploit the full potential of RDs from both theoretical and practical points of view. In this paper, we propose flexible and economically justifiable time variations to the transformations of RDs. Concentrating on Constant Elasticity of Variance (CEV) RDs, we consider nonlinear dynamics for our time-varying transformations with both deterministic and stochastic designs. Such time variations can greatly enhance the flexibility of RDs while maintaining sufficient tractability of the resulting models. At the same time, our modeling approach enjoys the benefits of classical inferential techniques such as Maximum Likelihood (ML). Our application to UK and US short-term interest rates suggests that, from an empirical point of view, time-varying transformations are highly relevant and statistically significant. We expect that the proposed models can describe more faithfully the dynamic time-varying behavior of economic and financial variables and potentially improve out-of-sample forecasts significantly.
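For orientation, a CEV-type short-rate diffusion of the kind underlying such models can be simulated with a simple Euler–Maruyama scheme. This is only a sketch of the base process, not the authors' reducible-diffusion construction or their time-varying transformations; all parameter values are illustrative:

```python
import math
import random

def simulate_cev(r0, kappa, theta, sigma, gamma, dt, n, seed=0):
    """Euler-Maruyama path of a mean-reverting CEV-type short rate:
        dr = kappa*(theta - r) dt + sigma * r**gamma dW.
    All parameters are illustrative, not estimates from the paper."""
    rng = random.Random(seed)
    r = [r0]
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))
        drift = kappa * (theta - r[-1]) * dt
        diffusion = sigma * max(r[-1], 0.0) ** gamma * dw
        r.append(max(r[-1] + drift + diffusion, 0.0))  # keep the rate non-negative
    return r

# Ten years of daily steps with illustrative parameters.
path = simulate_cev(r0=0.05, kappa=0.5, theta=0.05,
                    sigma=0.1, gamma=0.5, dt=1 / 252, n=2520)
```

With gamma = 0.5 this reduces to a square-root (CIR-type) process; varying gamma is what gives the CEV family its flexibility in the volatility-level relationship.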
Abstract:
The proliferation of mobile devices in society accessing data via the ‘cloud’ is imposing a dramatic increase in the amount of information to be stored on the hard disk drives (HDDs) used in servers. Forecasts are that areal densities will need to increase by as much as 35% compound per annum, and that by 2020 cloud storage capacity will be around 7 zettabytes, corresponding to areal densities of 2 Tb/in2. This requires increased performance from the magnetic pole of the electromagnetic writer in the read/write head of the HDD. Current state-of-the-art writing is undertaken by a morphologically complex magnetic pole of sub-100 nm dimensions, in an environment of engineered magnetic shields, and it needs to deliver a strong directional magnetic field to areas on the recording media of around 50 nm x 13 nm. This points to the need for a method to perform direct quantitative measurements of the magnetic field generated by the write pole at the nanometre scale. Here we report the complete in situ quantitative mapping of the magnetic field generated by a functioning write pole in operation, using electron holography. Opportunistically, this also points the way towards a new nanoscale magnetic field source for further development of in situ Transmission Electron Microscopy.