916 results for flood forecasting model
Abstract:
This paper presents an efficient algorithm for optimizing the operation of battery storage in a low-voltage distribution network with a high penetration of PV generation. A predictive control solution is presented that uses wavelet neural networks to predict the load and PV generation at hourly intervals for twelve hours into the future. The load and generation forecasts, together with the previous twelve hours of load and generation history, are used to assemble a load profile. A diurnal charging profile can be compactly represented by a vector of Fourier coefficients, allowing a direct search optimization algorithm to be applied. The optimal profile is updated hourly, allowing the state-of-charge profile to respond to changing load forecasts.
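As a rough illustration of the optimization step described above, the sketch below parameterizes a diurnal state-of-charge profile with a few Fourier coefficients and tunes them by direct search (Nelder-Mead). The cost function, the synthetic forecasts, and the names `fourier_profile` and `operating_cost` are illustrative placeholders, not the paper's wavelet-network forecaster or its actual objective.

```python
import numpy as np
from scipy.optimize import minimize

HOURS = np.arange(24)

def fourier_profile(coeffs, t=HOURS, period=24.0):
    """Reconstruct a diurnal state-of-charge profile from Fourier coefficients.

    coeffs = [a0, a1, b1, a2, b2, ...] -- a compact parameterization, so the
    direct-search optimizer works in a low-dimensional space.
    """
    profile = np.full_like(t, coeffs[0], dtype=float)
    for k in range(1, (len(coeffs) - 1) // 2 + 1):
        a_k, b_k = coeffs[2 * k - 1], coeffs[2 * k]
        profile += a_k * np.cos(2 * np.pi * k * t / period) \
                 + b_k * np.sin(2 * np.pi * k * t / period)
    return np.clip(profile, 0.0, 1.0)          # state of charge in [0, 1]

def operating_cost(coeffs, load_forecast, pv_forecast):
    """Toy objective: penalize the net grid exchange implied by the SoC profile."""
    soc = fourier_profile(coeffs)
    battery_charge = np.diff(soc, prepend=soc[0])    # positive when charging
    net_grid = load_forecast - pv_forecast + battery_charge
    return np.sum(net_grid ** 2)

# Hourly forecasts (placeholders for the wavelet-NN predictions in the paper).
load_forecast = 0.6 + 0.2 * np.sin(2 * np.pi * (HOURS - 18) / 24)
pv_forecast = np.clip(np.sin(2 * np.pi * (HOURS - 6) / 24), 0, None)

# Direct search (Nelder-Mead) over a handful of Fourier coefficients.
result = minimize(operating_cost, x0=np.zeros(5),
                  args=(load_forecast, pv_forecast), method="Nelder-Mead")
print(result.x)
```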
Abstract:
In ecosystems driven by water availability, plant community dynamics depend on complex interactions between vegetation, hydrology, and human water resources use. Along ephemeral rivers—where water availability is erratic—vegetation and people are particularly vulnerable to changes in each other's water use. Sensible management requires that water supply be maintained for people, while preserving ecosystem health. Meeting such requirements is challenging because of the unpredictable water availability. We applied information gap decision theory to an ecohydrological system model of the Kuiseb River environment in Namibia. Our aim was to identify the robustness of ecosystem and water management strategies to uncertainties in future flood regimes along ephemeral rivers. We evaluated the trade-offs between alternative performance criteria and their robustness to uncertainty to account for both (i) human demands for water supply and (ii) reducing the risk of species extinction caused by water mining. Increasing uncertainty of flood regime parameters reduced the performance under both objectives. Remarkably, the ecological objective (species coexistence) was more sensitive to uncertainty than the water supply objective. However, within each objective, the relative performance of different management strategies was insensitive to uncertainty. The ‘best’ management strategy was one that is tuned to the competitive species interactions in the Kuiseb environment. It regulates the biomass of the strongest competitor and, thus, at the same time decreases transpiration, thereby increasing groundwater storage and reducing pressure on less dominant species. This robust mutually acceptable strategy enables species persistence without markedly reducing the water supply for humans. This study emphasises the utility of ecohydrological models for resource management of water-controlled ecosystems. Although trade-offs were identified between alternative performance criteria and their robustness to uncertain future flood regimes, management strategies were identified that help to secure an ecologically sustainable water supply.
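A minimal sketch of the info-gap robustness calculation that underlies this kind of analysis, assuming a toy performance model and a simple fractional uncertainty model for the flood regime; the Kuiseb ecohydrological model and its management strategies are not reproduced here.

```python
import numpy as np

def performance(strategy, flood_mean):
    """Toy stand-in for the ecohydrological model output (e.g. water supplied)
    under a given management strategy and mean annual flood volume."""
    return strategy["supply_base"] * min(flood_mean / 10.0, 1.0)

def robustness(strategy, nominal_flood=10.0, threshold=0.5,
               alphas=np.linspace(0.0, 1.0, 101)):
    """Info-gap robustness: the largest uncertainty horizon alpha such that the
    worst-case performance, over all flood regimes within alpha of the nominal
    estimate, still meets the required threshold."""
    best = 0.0
    for alpha in alphas:
        # Fractional uncertainty model: flood mean in [nominal*(1-alpha), nominal*(1+alpha)];
        # the worst case here is the low end, since performance rises with flood volume.
        worst = performance(strategy, nominal_flood * (1.0 - alpha))
        if worst >= threshold:
            best = alpha
        else:
            break
    return best

strategies = {"regulate_dominant_species": {"supply_base": 0.9},
              "no_intervention": {"supply_base": 0.7}}
for name, s in strategies.items():
    print(name, robustness(s))
```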
Abstract:
This paper presents a novel path planning method for minimizing the energy consumption of an autonomous underwater vehicle subjected to time-varying ocean disturbances and forecast model uncertainty. The algorithm determines 4-dimensional path candidates using Nonlinear Robust Model Predictive Control (NRMPC), with candidate solutions optimised using an A*-like algorithm. Vehicle performance limits are incorporated into the algorithm, with disturbances represented as spatially and temporally varying ocean currents with a bounded uncertainty in their predictions. The proposed algorithm is demonstrated through simulations using a 4-dimensional, spatially distributed time-series predictive ocean current model. Results show the combined NRMPC and A* approach is capable of generating energy-efficient paths that are resistant to both dynamic disturbances and ocean model uncertainty.
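The sketch below illustrates only the A*-like part of such a planner on a toy grid: edge costs are travel times against the forecast current, degraded by a bounded uncertainty term so the chosen path is conservative. The current field, vehicle speed, and cost model are placeholders and do not reproduce the paper's NRMPC formulation.

```python
import heapq
import numpy as np

# Toy current field (east/north components, m/s) on an N x N grid; the paper
# uses a 4-D forecast with bounded uncertainty, represented here by a static
# field plus a worst-case error bound.
N = 20
u = 0.3 * np.ones((N, N))
v = 0.1 * np.ones((N, N))
CURRENT_UNCERTAINTY = 0.1     # m/s, assumed bound on forecast error
VEHICLE_SPEED = 1.0           # m/s through the water
CELL = 100.0                  # m grid spacing

def edge_energy(cell, step):
    """Cost proxy for one grid move: time to cover the leg against the
    worst-case effective current (robust, conservative cost)."""
    i, j = cell
    di, dj = step
    norm = np.hypot(di, dj)
    along = (u[i, j] * dj + v[i, j] * di) / norm   # current along travel direction
    ground_speed = max(VEHICLE_SPEED + along - CURRENT_UNCERTAINTY, 0.1)
    return CELL * norm / ground_speed

def a_star(start, goal):
    def h(c):
        # Optimistic straight-line travel time, used as the A* heuristic.
        return CELL * np.hypot(goal[0] - c[0], goal[1] - c[1]) / (VEHICLE_SPEED + 0.5)
    frontier = [(h(start), 0.0, start, None)]
    came_from, best_cost = {}, {start: 0.0}
    while frontier:
        _, g, cell, parent = heapq.heappop(frontier)
        if cell in came_from:
            continue
        came_from[cell] = parent
        if cell == goal:
            break
        for step in [(-1, 0), (1, 0), (0, -1), (0, 1),
                     (-1, -1), (-1, 1), (1, -1), (1, 1)]:
            nxt = (cell[0] + step[0], cell[1] + step[1])
            if not (0 <= nxt[0] < N and 0 <= nxt[1] < N):
                continue
            ng = g + edge_energy(cell, step)
            if ng < best_cost.get(nxt, np.inf):
                best_cost[nxt] = ng
                heapq.heappush(frontier, (ng + h(nxt), ng, nxt, cell))
    path, c = [], goal
    while c is not None:              # walk parents back to the start
        path.append(c)
        c = came_from[c]
    return path[::-1]

print(a_star((0, 0), (N - 1, N - 1)))
```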
Abstract:
Klaassen and Magnus (2003) provide a model of the probability of a given player winning a tennis match, with the prediction updated on a point-by-point basis. This paper provides a point-by-point comparison of that model with the probability of a given player winning the match, as implied by betting odds. The predictions implied by the betting odds match the model predictions closely, with an extremely high correlation being found between the model and the betting market. The results for both men’s and women’s matches also suggest that there is a high level of efficiency in the betting market, demonstrating that betting markets are a good predictor of the outcomes of tennis matches. The significance of service breaks and service being held is anticipated up to four points prior to the end of the game. However, the tendency of players to lose more points than would be expected after conceding a break of service is not captured instantaneously in betting odds. In contrast, there is no evidence of a biased reaction to a player winning a game on service.
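The point-by-point updating can be illustrated with the standard recursion for the probability that the server wins the current game under the i.i.d. point assumption used in the Klaassen and Magnus framework; this is a hedged sketch (set- and match-level probabilities are built analogously on top of the game recursion).

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def p_win_game(p, a=0, b=0):
    """Probability that the server wins the current game from a points-won
    score of (a, b), assuming each point on serve is won independently with
    probability p (the i.i.d. assumption of the Klaassen-Magnus model)."""
    if a >= 4 and a - b >= 2:
        return 1.0
    if b >= 4 and b - a >= 2:
        return 0.0
    if a >= 3 and a == b:
        # From deuce the server must win two points in a row before losing two.
        return p ** 2 / (p ** 2 + (1 - p) ** 2)
    return p * p_win_game(p, a + 1, b) + (1 - p) * p_win_game(p, a, b + 1)

# Example: updating the in-game win probability point by point (p = 0.62 on serve).
for score in [(0, 0), (2, 0), (3, 3), (3, 4)]:
    print(score, round(p_win_game(0.62, *score), 3))
```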
Abstract:
A key component of robotic path planning is ensuring that one can reliably navigate a vehicle to a desired location. In addition, when the features of interest are dynamic and move with oceanic currents, vehicle speed plays an important role in the planning exercise to ensure that vehicles are in the right place at the right time. Aquatic robot design is moving towards utilizing the environment for propulsion rather than traditional motors and propellers. These new vehicles are able to realize significantly increased endurance; however, the mission planning problem, in turn, becomes more difficult because the vehicle velocity is not directly controllable. In this paper, we examine Gaussian process models applied to existing wave model data to predict the behavior, i.e., velocity, of a Wave Glider Autonomous Surface Vehicle. Using training data from an on-board sensor and forecasts from the WAVEWATCH III model, our probabilistic regression models provide an effective method for forecasting Wave Glider velocity.
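A minimal sketch of the kind of Gaussian process regression described above, using scikit-learn and synthetic stand-ins for the on-board speed logs and for WAVEWATCH III forecast features (wave height and period); the paper's actual feature set and kernel choice may differ.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Illustrative features: significant wave height (m) and peak wave period (s),
# as a WAVEWATCH III forecast might provide; target is vehicle speed over
# ground (m/s). Values below are synthetic stand-ins for on-board logs.
rng = np.random.default_rng(0)
X_train = np.column_stack([rng.uniform(0.5, 3.0, 200),      # wave height
                           rng.uniform(5.0, 14.0, 200)])    # wave period
speed = 0.2 + 0.35 * X_train[:, 0] - 0.01 * X_train[:, 1]
y_train = speed + rng.normal(0.0, 0.05, 200)

kernel = 1.0 * RBF(length_scale=[1.0, 3.0]) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

# Predict speed (with uncertainty) for forecast sea states along a planned leg.
X_forecast = np.array([[1.2, 8.0], [2.5, 11.0]])
mean, std = gp.predict(X_forecast, return_std=True)
for x, m, s in zip(X_forecast, mean, std):
    print(f"Hs={x[0]:.1f} m, Tp={x[1]:.1f} s -> speed {m:.2f} +/- {2 * s:.2f} m/s")
```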
Abstract:
To facilitate marketing and export, the Australian macadamia industry requires accurate crop forecasts. Each year, two levels of crop predictions are produced for this industry. The first is an overall longer-term forecast based on tree census data of growers in the Australian Macadamia Society (AMS). This data set currently accounts for around 70% of total production, and is supplemented by our best estimates of non-AMS orchards. Given these total tree numbers, average yields per tree are needed to complete the long-term forecasts. Yields from regional variety trials were initially used, but were found to be consistently higher than the average yields that growers were obtaining. Hence, a statistical model was developed using growers' historical yields, also taken from the AMS database. This model accounted for the effects of tree age, variety, year, region and tree spacing, and explained 65% of the total variation in the yield per tree data. The second level of crop prediction is an annual climate adjustment of these overall long-term estimates, taking into account the expected effects on production of the previous year's climate. This adjustment is based on relative historical yields, measured as the percentage deviance between expected and actual production. The dominant climatic variables are observed temperature, evaporation, solar radiation and modelled water stress. Initially, a number of alternate statistical models showed good agreement within the historical data, with jack-knife cross-validation R2 values of 96% or better. However, forecasts varied quite widely between these alternate models. Exploratory multivariate analyses and nearest-neighbour methods were used to investigate these differences. For 2001-2003, the overall forecasts were in the right direction (when compared with the long-term expected values), but were over-estimates. In 2004 the forecast was well under the observed production, and in 2005 the revised models produced a forecast within 5.1% of the actual production. Over the first five years of forecasting, the absolute deviance for the climate-adjustment models averaged 10.1%, just outside the targeted objective of 10%.
Abstract:
When exposed to hot (22-35 °C) and dry climatic conditions in the field during the final 4-6 weeks of pod filling, peanuts (Arachis hypogaea L.) can accumulate highly carcinogenic and immuno-suppressing aflatoxins. Forecasting the risk posed by these conditions can assist in minimizing pre-harvest contamination. A model was therefore developed as part of the Agricultural Production Systems Simulator (APSIM) peanut module, which calculated an aflatoxin risk index (ARI) using four temperature response functions when fractional available soil water was <0.20 and the crop was in the last 0.40 of the pod-filling phase. ARI explained 0.95 (P ≤ 0.05) of the variation in aflatoxin contamination, which varied from 0 to c. 800 µg/kg in 17 large-scale sowings in tropical and four sowings in sub-tropical environments carried out in Australia between 13 November and 16 December 2007. ARI also explained 0.96 (P ≤ 0.01) of the variation in the proportion of aflatoxin-contaminated loads (>15 µg/kg) of peanuts in the Kingaroy region of Australia during the period between the 1998/99 and 2007/08 seasons. Simulation of ARI using historical climatic data from 1890 to 2007 indicated a three-fold increase in its value since 1980 compared with the entire previous period. The increase was associated with increases in ambient temperature and decreases in rainfall. To facilitate routine monitoring of aflatoxin risk by growers in near real time, a web interface for the model was also developed. The ARI predicted using this interface for eight growers correlated significantly with the level of contamination in crops (r = 0.95, P ≤ 0.01). These results suggest that ARI simulated by the model is a reliable indicator of aflatoxin contamination that can be used in aflatoxin research as well as a decision-support tool to monitor pre-harvest aflatoxin risk in peanuts.
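A hedged sketch of how such a daily risk index can be accumulated. The soil-water (<0.20) and pod-fill (last 0.40 of the phase) gates follow the abstract; the trapezoidal temperature response below is an illustrative placeholder, not the APSIM peanut module's actual set of four response functions.

```python
def temperature_response(t_mean, t_min=22.0, t_opt_low=28.0, t_opt_high=35.0, t_max=40.0):
    """Illustrative trapezoidal temperature response in [0, 1]; the actual APSIM
    peanut module uses its own response functions."""
    if t_mean <= t_min or t_mean >= t_max:
        return 0.0
    if t_mean < t_opt_low:
        return (t_mean - t_min) / (t_opt_low - t_min)
    if t_mean <= t_opt_high:
        return 1.0
    return (t_max - t_mean) / (t_max - t_opt_high)

def aflatoxin_risk_index(daily):
    """Accumulate a daily aflatoxin risk index over the season.

    A day contributes only when fractional available soil water is below 0.20
    and the crop is in the last 0.40 of the pod-filling phase, as stated in the
    abstract. `daily` is an iterable of (mean_temp_C, fractional_asw, pod_fill_fraction).
    """
    ari = 0.0
    for t_mean, fasw, pod_fill in daily:
        if fasw < 0.20 and pod_fill >= 0.60:
            ari += temperature_response(t_mean)
    return ari

season = [(33.0, 0.15, 0.70), (30.0, 0.25, 0.75), (34.0, 0.10, 0.90)]
print(aflatoxin_risk_index(season))   # only the first and third days contribute
```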
Abstract:
The continually expanding macadamia industry needs an accurate crop forecasting system to allow it to develop effective crop handling and marketing strategies, particularly when the industry faces recurring cycles of unsustainably high and low commodity prices. This project aims to provide the Australian Macadamia Society (AMS) with a robust, reliable predictive model of national crop volume within 10% of the actual crop by 1 April each year, by factoring in known seasonal, environmental, cultural, climatic, management and biological constraints, together with the existing AMS database, which includes data on tree numbers, tree age, variety, location and the previous season's production.
Abstract:
The quality of short-term electricity load forecasting is crucial to the operation and trading activities of market participants in an electricity market. In this paper, it is shown that a multiple-equation time-series model, estimated by repeated application of ordinary least squares, has the potential to match or even outperform more complex nonlinear and nonparametric forecasting models. The key ingredient of this simple model's success is the effective use of lagged information, allowing for interaction between seasonal patterns and intra-day dependencies. Although the model is built using data for the Queensland region of Australia, the method is completely generic and applicable to any load forecasting problem. The model's forecasting ability is assessed by means of the mean absolute percentage error (MAPE). For day-ahead forecasts, the MAPE returned by the model over a period of 11 years is an impressive 1.36%. The forecast accuracy of the model is compared with a number of benchmarks, including three popular alternatives and one industrial standard reported by the Australian Energy Market Operator (AEMO). The performance of the model developed in this paper is superior to all benchmarks and outperforms the AEMO forecasts by about a third in terms of the MAPE criterion.
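A minimal sketch of the "one OLS equation per intra-day period, driven by lagged load" idea on synthetic data, together with the MAPE criterion; the regressors and interactions here are simplified stand-ins for the paper's actual specification.

```python
import numpy as np
import pandas as pd

def mape(actual, forecast):
    """Mean absolute percentage error, the accuracy criterion used in the paper."""
    actual, forecast = np.asarray(actual), np.asarray(forecast)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

def fit_per_hour_ols(df):
    """Fit one OLS equation per hour of day: load regressed on the same hour's
    load one day and one week earlier plus a weekend indicator. This mimics the
    'one equation per intra-day period' structure, not the paper's exact spec."""
    coefs = {}
    for hour, grp in df.groupby("hour"):
        X = np.column_stack([np.ones(len(grp)), grp["load_lag_24h"],
                             grp["load_lag_168h"], grp["is_weekend"]])
        coefs[hour], *_ = np.linalg.lstsq(X, grp["load"].to_numpy(), rcond=None)
    return coefs

# Synthetic hourly data standing in for the Queensland load series.
idx = pd.date_range("2020-01-01", periods=24 * 400, freq="h")
load = 5000 + 800 * np.sin(2 * np.pi * idx.hour / 24) \
       + np.random.default_rng(1).normal(0, 100, len(idx))
df = pd.DataFrame({"load": load, "hour": idx.hour,
                   "is_weekend": (idx.dayofweek >= 5).astype(float)}, index=idx)
df["load_lag_24h"] = df["load"].shift(24)
df["load_lag_168h"] = df["load"].shift(168)
df = df.dropna()

coefs = fit_per_hour_ols(df)
pred = np.array([np.array([1.0, r.load_lag_24h, r.load_lag_168h, r.is_weekend]) @ coefs[r.hour]
                 for r in df.itertuples()])
print(f"in-sample MAPE: {mape(df['load'], pred):.2f}%")
```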
Abstract:
This paper addresses the challenges of flood mapping using multispectral images. Quantitative flood mapping is critical for flood damage assessment and management. Remote sensing images obtained from various satellite or airborne sensors provide valuable data for this application, from which information on the extent of flooding can be extracted. However, the great challenge in interpreting these data is to achieve more reliable flood extent mapping, including both the fully inundated areas and the 'wet' areas where trees and houses are partly covered by water. This is a typical combined pure-pixel and mixed-pixel problem. In this paper, a recently developed extended Support Vector Machine method for spectral unmixing is applied to generate an integrated map showing both pure pixels (fully inundated areas) and mixed pixels (trees and houses partly covered by water). The outputs were compared with the conventional mean-based linear spectral mixture model, and better performance was demonstrated on a subset of Landsat ETM+ data recorded at the Daly River Basin, NT, Australia, on 3 March 2008, after a flood event.
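For contrast with the extended SVM approach, the conventional linear spectral mixture baseline mentioned above can be sketched as a per-pixel non-negative least-squares problem with a (softly enforced) sum-to-one constraint; the endmember spectra below are illustrative, not taken from the Landsat ETM+ scene.

```python
import numpy as np
from scipy.optimize import nnls

def unmix_pixel(pixel, endmembers, sum_to_one_weight=1e3):
    """Estimate fractional abundances of each endmember in one pixel with a
    linear mixture model: pixel ~= endmembers @ fractions, fractions >= 0 and
    summing to one (enforced softly by a weighted constraint row). This is the
    conventional baseline the paper compares against, not its extended SVM."""
    n_bands, n_end = endmembers.shape
    A = np.vstack([endmembers, sum_to_one_weight * np.ones((1, n_end))])
    b = np.append(pixel, sum_to_one_weight)
    fractions, _ = nnls(A, b)
    return fractions

# Illustrative endmember spectra (rows: bands; columns: water, vegetation, soil).
endmembers = np.array([[0.05, 0.04, 0.12],
                       [0.04, 0.45, 0.25],
                       [0.03, 0.30, 0.35],
                       [0.02, 0.50, 0.40]])
mixed_pixel = 0.6 * endmembers[:, 0] + 0.4 * endmembers[:, 1]   # 60% water, 40% trees
print(unmix_pixel(mixed_pixel, endmembers))   # approx [0.6, 0.4, 0.0]
```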
Abstract:
The hype cycle model traces the evolution of technological innovations as they pass through successive stages marked by the peak, disappointment, and recovery of expectations. Since its introduction by Gartner nearly two decades ago, the model has received growing interest from practitioners and, more recently, from scholars. Given the model's proclaimed capacity to forecast technological development, an important consideration for organizations formulating marketing strategies, this paper provides a critical review of the hype cycle model by seeking evidence for the manifestation of hype cycles in Gartner's own technology databases. The results of our empirical work show inconsistencies with Gartner's reports, which motivates us to consider possible future directions whereby the notion of hype or hyped dynamics (though not necessarily the hype cycle model itself) can be captured in existing life cycle models through the identification of peak, disappointment, and recovery patterns.
Abstract:
The most difficult operation in flood inundation mapping using optical flood images is to map the ‘wet’ areas where trees and houses are partly covered by water. This can be referred to as a typical problem of the presence of mixed pixels in the images. A number of automatic information extracting image classification algorithms have been developed over the years for flood mapping using optical remote sensing images, with most labelling a pixel as a particular class. However, they often fail to generate reliable flood inundation mapping because of the presence of mixed pixels in the images. To solve this problem, spectral unmixing methods have been developed. In this thesis, methods for selecting endmembers and the method to model the primary classes for unmixing, the two most important issues in spectral unmixing, are investigated. We conduct comparative studies of three typical spectral unmixing algorithms, Partial Constrained Linear Spectral unmixing, Multiple Endmember Selection Mixture Analysis and spectral unmixing using the Extended Support Vector Machine method. They are analysed and assessed by error analysis in flood mapping using MODIS, Landsat and World View-2 images. The Conventional Root Mean Square Error Assessment is applied to obtain errors for estimated fractions of each primary class. Moreover, a newly developed Fuzzy Error Matrix is used to obtain a clear picture of error distributions at the pixel level. This thesis shows that the Extended Support Vector Machine method is able to provide a more reliable estimation of fractional abundances and allows the use of a complete set of training samples to model a defined pure class. Furthermore, it can be applied to analysis of both pure and mixed pixels to provide integrated hard-soft classification results. Our research also identifies and explores a serious drawback in relation to endmember selections in current spectral unmixing methods which apply fixed sets of endmember classes or pure classes for mixture analysis of every pixel in an entire image. However, as it is not accurate to assume that every pixel in an image must contain all endmember classes, these methods usually cause an over-estimation of the fractional abundances in a particular pixel. In this thesis, a subset of adaptive endmembers in every pixel is derived using the proposed methods to form an endmember index matrix. The experimental results show that using the pixel-dependent endmembers in unmixing significantly improves performance.
Abstract:
Modeling and forecasting of implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk management activities, all of which require accurate volatility estimates. However, it has become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options present two patterns, the volatility smirk (skew) and the volatility term structure, which, examined together, form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS. It extends the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put and call options on the FTSE100 index, a nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variations in the rich IVS. It is then found that three factors can explain about 69-88% of the variance in the IVS; on average, 56% is explained by the level factor, 15% by the term-structure factor, and a further 7% by the jump-fear factor. The second essay proposes a quantile regression model of the contemporaneous asymmetric return-volatility relationship, generalizing the model of Hibbert et al. (2008). The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distributions; the relationship strengthens monotonically when moving from the median quantile to the uppermost quantile (i.e., 95%), so OLS underestimates it at upper quantiles. Additionally, the asymmetric relationship is more pronounced with the smirk-(skew-)adjusted volatility index measure than with the old volatility index measure. The volatility indices are ranked in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new-VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are backtested over 1992-2009 using unconditional, independence, conditional coverage, and quadratic-score tests. It is found that the VDAX subsumes almost all the information required for daily VaR forecasts for a portfolio of the DAX30 index, and implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, the string market model calibration results show that it can efficiently reproduce (or forecast) the volatility surface for each of the swaption markets.
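The factor-extraction step in the first essay can be illustrated with a principal component analysis of changes in the implied volatility surface; the synthetic panel below stands in for the FTSE100 IV data, and identifying the components with level, term-structure, and jump-fear factors is by analogy only.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic panel of daily implied volatility surfaces, flattened over a
# moneyness x maturity grid (columns) and observed over time (rows). A stand-in
# for the FTSE100 option IVs used in the thesis.
rng = np.random.default_rng(42)
n_days, n_points = 500, 40
level = 0.20 + 0.05 * rng.standard_normal((n_days, 1))     # level-like variation
slope = 0.02 * rng.standard_normal((n_days, 1))            # term-structure-like tilt
grid = np.linspace(-1.0, 1.0, n_points)[None, :]
iv_surface = level + slope * grid + 0.005 * rng.standard_normal((n_days, n_points))

# Principal components of day-to-day IV changes; the variance shares of the
# first three components play the role of the level, term-structure and
# jump-fear factors discussed in the first essay.
pca = PCA(n_components=3).fit(np.diff(iv_surface, axis=0))
print(pca.explained_variance_ratio_.round(3))
```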
Abstract:
Recently, the focus of real estate investment has expanded from the building-specific level to the aggregate portfolio level. The portfolio perspective requires investment analysis for real estate that is comparable with that of other asset classes, such as stocks and bonds. Thus, despite its distinctive features, such as heterogeneity, high unit value, illiquidity and the use of valuations to measure performance, real estate should not be considered in isolation. This means that techniques which are widely used for other asset classes can also be applied to real estate. An important part of investment strategies which support decisions on multi-asset portfolios is identifying the fundamentals of movements in property rents and returns, and predicting them on the basis of these fundamentals. The main objective of this thesis is to find the key drivers and the best methods for modelling and forecasting property rents and returns in markets which have experienced structural changes. The Finnish property market, which is a small European market with structural changes and limited property data, is used as a case study. The findings in the thesis show that it is possible to use modern econometric tools for modelling and forecasting property markets. The thesis consists of an introduction and four essays. Essays 1 and 3 model Helsinki office rents and returns, and assess the suitability of alternative techniques for forecasting these series. Simple time series techniques are able to account for structural changes in the way markets operate, and thus provide the best forecasting tool. Theory-based econometric models, in particular error correction models, which are constrained by long-run information, are better at explaining past movements in rents and returns than at predicting their future movements. Essay 2 proceeds by examining the key drivers of rent movements for several property types in a number of Finnish property markets. The essay shows that commercial rents in local markets can be modelled using national macroeconomic variables and a panel approach. Finally, Essay 4 investigates whether forecasting models can be improved by accounting for asymmetric responses of office returns to the business cycle. The essay finds that the forecast performance of time series models can be improved by introducing asymmetries, and the improvement is sufficient to justify the extra computational time and effort associated with the application of these techniques.
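A minimal sketch of the error correction modelling referred to in Essays 1 and 3, using a two-step Engle-Granger procedure on synthetic rent and fundamental series; the variable choice and specification are illustrative, not those of the thesis.

```python
import numpy as np
import statsmodels.api as sm

def fit_ecm(y, x):
    """Two-step Engle-Granger error correction model: regress the level of rents
    (y) on a fundamental (x), then regress rent changes on the lagged
    equilibrium error and changes in the fundamental. A sketch of the class of
    theory-based models discussed in the thesis, not their exact specification."""
    # Step 1: long-run (cointegrating) relationship.
    long_run = sm.OLS(y, sm.add_constant(x)).fit()
    ect = long_run.resid                      # equilibrium correction term
    # Step 2: short-run dynamics with the lagged equilibrium error.
    dy, dx = np.diff(y), np.diff(x)
    X = sm.add_constant(np.column_stack([ect[:-1], dx]))
    return sm.OLS(dy, X).fit()

# Synthetic rent series driven by a synthetic fundamental (e.g. service employment).
rng = np.random.default_rng(7)
x = np.cumsum(rng.normal(0.2, 1.0, 200))
y = 2.0 + 0.8 * x + rng.normal(0.0, 1.0, 200)
print(fit_ecm(y, x).params)   # [constant, adjustment speed, short-run response]
```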
Abstract:
A diffusion/replacement model for new consumer durables, designed to be used as a long-term forecasting tool, is developed. The model simulates new demand as well as replacement demand over time. The model, called DEMSIM, is built upon a counteractive adoption model specifying the basic forces affecting the adoption behaviour of individual consumers. These forces are the promoting forces and the resisting forces. The promoting forces are further divided into internal and external influences. These influences are operationalized within a multi-segmental diffusion model generating the adoption behaviour of the consumers in each segment as an expected value. This diffusion model is combined with a replacement model built upon the same segmental structure as the diffusion model, which generates, in turn, the expected replacement behaviour in each segment. To be able to use DEMSIM as a forecasting tool in the early stages of a diffusion process, estimates of the model parameters are needed as soon as possible after product launch. However, traditional statistical techniques are not very helpful in estimating such parameters at this early stage. To enable early parameter calibration, an optimization algorithm is developed by which the main parameters of the diffusion model can be estimated on the basis of very few sales observations. The optimization is carried out in iterative simulation runs. Empirical validations using the optimization algorithm reveal that the diffusion model performs well in early long-term sales forecasts, especially when it comes to the timing of future sales peaks.
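DEMSIM itself is multi-segmental and counteractive; as a hedged stand-in, the sketch below calibrates a simple Bass diffusion curve to a handful of early sales observations by direct search and then simulates forward, illustrating the idea of estimating the main diffusion parameters from very few observations via iterative simulation runs.

```python
import numpy as np
from scipy.optimize import minimize

def bass_adoptions(p, q, m, n_periods):
    """Per-period adoptions from a Bass diffusion model with innovation rate p,
    imitation rate q and market potential m (a simple stand-in for DEMSIM's
    multi-segmental diffusion component)."""
    cum, sales = 0.0, []
    for _ in range(n_periods):
        new = (p + q * cum / m) * (m - cum)
        sales.append(new)
        cum += new
    return np.array(sales)

def calibrate(early_sales, horizon=30):
    """Fit (p, q, m) to the few observed early periods by direct search, then
    return the implied long-term forecast -- mirroring the simulation-based
    early calibration idea described above."""
    def loss(theta):
        p, q, m = theta
        if p <= 0 or q <= 0 or m <= np.sum(early_sales):
            return 1e12                      # reject infeasible parameter sets
        sim = bass_adoptions(p, q, m, len(early_sales))
        return np.sum((sim - early_sales) ** 2)
    res = minimize(loss, x0=[0.02, 0.3, 5 * np.sum(early_sales)], method="Nelder-Mead")
    p, q, m = res.x
    return bass_adoptions(p, q, m, horizon)

observed = np.array([120.0, 180.0, 260.0])    # only three periods after launch
forecast = calibrate(observed)
print("forecast peak period:", int(np.argmax(forecast)) + 1)
```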