963 results for flood forecasting model


Relevance: 30.00%

Abstract:

The Madden-Julian Oscillation (MJO) is the dominant mode of intraseasonal variability in the Tropics. It can be characterised as a planetary-scale coupling between the atmospheric circulation and organised deep convection that propagates east through the equatorial Indo-Pacific region. The MJO interacts with weather and climate systems on a near-global scale and is a crucial source of predictability for weather forecasts on medium to seasonal timescales. Despite its global significance, accurately representing the MJO in numerical weather prediction (NWP) and climate models remains a challenge. This thesis focuses on the representation of the MJO in the Integrated Forecasting System (IFS) at the European Centre for Medium-Range Weather Forecasts (ECMWF), a state-of-the-art NWP model. Recent modifications to the model physics in Cycle 32r3 (Cy32r3) of the IFS led to advances in the simulation of the MJO; for the first time, the observed amplitude of the MJO was maintained throughout the integration period. A set of hindcast experiments, which differ only in their formulation of convection, was performed between May 2008 and April 2009 to assess the sensitivity of MJO simulation in the IFS to the Cy32r3 convective parameterization. Unique to this thesis is the attribution of the advances in MJO simulation in Cy32r3 to the modified convective parameterization, specifically the relative-humidity-dependent formulation for organised deep entrainment. Increasing the sensitivity of the deep convection scheme to environmental moisture is shown to modify the relationship between precipitation and moisture in the model. Through dry-air entrainment, convective plumes ascending in low-humidity environments terminate lower in the atmosphere. As a result, the occurrence of cumulus congestus increases, which acts to moisten the mid-troposphere. Owing to the modified precipitation-moisture relationship, more moisture is able to build up, which effectively preconditions the tropical atmosphere for the transition to deep convection. Results from this thesis suggest that a tropospheric moisture control on convection is key to simulating the interaction between the physics and large-scale circulation associated with the MJO.
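The moisture-sensitive entrainment described above can be illustrated with a toy calculation. The functional form and constants below are purely hypothetical and are not the IFS Cy32r3 scheme; they only show the qualitative idea that a larger entrainment rate in dry air implies stronger plume dilution.

```python
# Hypothetical illustration only -- NOT the IFS Cy32r3 formulation.
# Idea: the fractional entrainment rate grows as environmental relative
# humidity drops, so plumes in dry air dilute faster and terminate lower.

def entrainment_rate(rh, base_rate=1.8e-3, sensitivity=1.3):
    """Fractional entrainment rate (m^-1) vs relative humidity (0-1).

    base_rate and sensitivity are illustrative tuning constants.
    """
    if not 0.0 <= rh <= 1.0:
        raise ValueError("relative humidity must lie in [0, 1]")
    return base_rate * (1.0 + sensitivity * (1.0 - rh))

# Drier environment -> larger entrainment -> stronger dilution:
assert entrainment_rate(0.4) > entrainment_rate(0.9)
```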

Relevance: 30.00%

Abstract:

On 23 November 1981, a strong cold front swept across the U.K., producing tornadoes from the west to the east coast. An extensive campaign to collect tornado reports by the Tornado and Storm Research Organisation (TORRO) resulted in 104 reports, the largest outbreak on record in the U.K. The front was simulated with a convection-permitting numerical model down to 200-m horizontal grid spacing to better understand its evolution and meteorological environment. The event was typical of tornadoes in the U.K., with convective available potential energy (CAPE) less than 150 J kg⁻¹, 0–1-km wind shear of 10–20 m s⁻¹, and a narrow cold-frontal rainband forming precipitation cores and gaps. A line of cyclonic absolute vorticity existed along the front, with maxima as large as 0.04 s⁻¹. Some hook-shaped misovortices bore kinematic similarity to supercells. The narrow swath along which the line was tornadic was bounded on the equatorward side by weak vorticity along the line and on the poleward side by zero CAPE, enclosing a region where the environment was otherwise favorable for tornadogenesis. To determine whether the 104 tornado reports were plausible, possible duplicate reports were first eliminated, leaving an estimated 58 to 90 tornadoes. Second, the number of possible parent misovortices that may have spawned tornadoes was estimated from the model output. The number of plausible tornado reports within the 200-m grid-spacing domain ranged from 22 to 44, whereas the model simulation yielded an estimate of 30 possible parent misovortices within this domain. These results suggest that a total of 90 reports is plausible.

Relevance: 30.00%

Abstract:

Several biotic crises during the past 300 million years have been linked to episodes of continental flood basalt volcanism, and in particular to the release of massive quantities of magmatic sulphur gas species. Flood basalt provinces were typically formed by numerous individual eruptions, each lasting years to decades. However, the environmental impact of these eruptions may have been limited by the occurrence of quiescent periods that lasted hundreds to thousands of years. Here we use a global aerosol model to quantify the sulphur-induced environmental effects of individual, decade-long flood basalt eruptions representative of the Columbia River Basalt Group, 16.5–14.5 million years ago, and the Deccan Traps, 65 million years ago. For a decade-long eruption of Deccan scale, we calculate a decadal-mean reduction in global surface temperature of 4.5 K, which would recover within 50 years after an eruption ceased unless climate feedbacks were very different in deep-time climates. Acid mists and fogs could have caused immediate damage to vegetation in some regions, but acid-sensitive land and marine ecosystems were well-buffered against volcanic sulphur deposition effects even during century-long eruptions. We conclude that magmatic sulphur from flood basalt eruptions would have caused a biotic crisis only if eruption frequencies and lava discharge rates had been high and sustained for several centuries at a time.

Relevance: 30.00%

Abstract:

Sea-level rise (SLR) from global warming may have severe consequences for coastal cities, particularly when combined with predicted increases in the strength of tidal surges. Predicting the regional impact of SLR flooding is strongly dependent on the modelling approach and the accuracy of the topographic data. Here, the areas at risk of sea water flooding in the London boroughs were quantified based on the projected SLR scenarios reported in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the UK Climate Projections 2009 (UKCP09), using a tidally-adjusted bathtub modelling approach. Medium- to very high-resolution digital elevation models (DEMs) are used to evaluate inundation extents as well as uncertainties. Depending on the SLR scenario and the DEMs used, it is estimated that 3%–8% of the area of Greater London could be inundated by 2100. The boroughs with the largest areas at risk of flooding are Newham, Southwark, and Greenwich. The differences in inundation areas estimated from a digital terrain model and a digital surface model are much greater than the root mean square error differences observed between the two data types, which may be attributed to processing levels. Flood models built from Shuttle Radar Topography Mission (SRTM) data underestimate the inundation extent, so their results may not be reliable for constructing flood risk maps. This analysis provides a broad-scale estimate of the potential consequences of SLR and of the uncertainties in DEM-based bathtub-type flood inundation modelling for the London boroughs.
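The bathtub approach can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: a DEM cell is flagged as inundated only if it lies below the projected water level and is hydraulically connected to a boundary seed cell (e.g. the tidal river), which keeps isolated inland pits dry.

```python
from collections import deque

def bathtub_inundation(dem, water_level, seeds):
    """Flag DEM cells flooded under a 'bathtub' assumption.

    A cell floods if its elevation is below water_level AND it is
    hydraulically connected (4-neighbour) to a seed cell such as the
    tidal boundary; isolated low-lying pits stay dry.
    dem: 2D list of elevations (m); seeds: [(row, col), ...].
    """
    rows, cols = len(dem), len(dem[0])
    flooded = [[False] * cols for _ in range(rows)]
    queue = deque((r, c) for r, c in seeds if dem[r][c] < water_level)
    for r, c in queue:
        flooded[r][c] = True
    while queue:                      # breadth-first flood fill
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and not flooded[nr][nc] and dem[nr][nc] < water_level):
                flooded[nr][nc] = True
                queue.append((nr, nc))
    return flooded

# A low cell cut off from the boundary by high ground stays dry:
dem = [[0.0, 5.0, 0.5],
       [0.5, 5.0, 0.4],
       [0.8, 5.0, 0.3]]
wet = bathtub_inundation(dem, water_level=1.0, seeds=[(0, 0)])
assert wet[2][0]          # connected to the seed
assert not wet[1][2]      # below water level but disconnected
```

A tidally-adjusted version would simply vary `water_level` along the boundary seeds rather than using a single value.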

Relevance: 30.00%

Abstract:

Ghana has faced the macroeconomic problem of inflation over a long period of time, and the problem has somewhat slowed economic growth in the country. Inflation is one of the major economic challenges facing most countries in the world, especially those in Africa, including Ghana. Forecasting inflation rates in Ghana is therefore very important for its government when designing economic strategies or effective monetary policies to combat any unexpectedly high inflation. This paper applies a seasonal autoregressive integrated moving average (SARIMA) model to forecast inflation rates in Ghana. Using monthly inflation data from July 1991 to December 2009, we find that a SARIMA(1,1,1)(0,0,1)₁₂ model represents the behaviour of the inflation rate in Ghana well. Based on the selected model, we forecast seven months of inflation rates outside the sample period (January 2010 to July 2010). The observed inflation rates from January to April, as published by the Ghana Statistical Service, fall within the 95% confidence interval obtained from the designed model. The forecasts show a decreasing pattern and a turning point in Ghana's inflation in the month of July.
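As a stripped-down illustration of the ingredients of such a model (a seasonal lag-12 component plus a short-memory autoregressive term), the toy forecaster below seasonally differences a monthly series and fits an AR(1) by least squares. It is a sketch only; a real analysis would estimate the full SARIMA(1,1,1)(0,0,1)₁₂ with a statistical package such as statsmodels.

```python
def seasonal_difference(series, lag=12):
    """Lag-12 differencing removes a stable annual pattern."""
    return [series[t] - series[t - lag] for t in range(lag, len(series))]

def fit_ar1(series):
    """Least-squares AR(1) coefficient phi in x_t ~ phi * x_{t-1}."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(x * x for x in series[:-1])
    return num / den

def forecast_next(series, lag=12):
    """One-step forecast: last year's value for this month plus an
    AR(1) step on the seasonally differenced series."""
    diffed = seasonal_difference(series, lag)
    phi = fit_ar1(diffed)
    return series[-lag] + phi * diffed[-1]
```

On a series with a fixed annual pattern plus a linear trend, the seasonally differenced series is constant, phi is estimated as 1, and the one-step forecast is exact.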

Relevance: 30.00%

Abstract:

A great deal has been written about forecasting, but few articles address the development of time series models of call volumes for emergency services. In this study, we apply and compare different forecasting techniques for the call volume of the emergency service Rescue 1122 in Lahore, Pakistan. The data comprise emergency calls to Rescue 1122 from 1 January 2008 to 31 December 2009, giving 731 daily observations. Our goal is to develop a simple model that could be used for forecasting the daily call volume. Two different approaches are used to forecast the daily call volume: the Box–Jenkins (ARIMA) methodology and the smoothing methodology. We generate models for forecasting the call volume and present a comparison of the two techniques.
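A minimal sketch of the smoothing side of such a comparison (assuming simple exponential smoothing as the smoothing method, which the abstract does not specify): one-step-ahead forecasts are produced and scored against a naive benchmark by mean absolute error. The call counts below are invented.

```python
def ses_one_step(series, alpha=0.3):
    """One-step-ahead simple exponential smoothing forecasts.
    preds[i] is the forecast of series[i + 1]."""
    level = series[0]
    preds = []
    for x in series[1:]:
        preds.append(level)                     # forecast before seeing x
        level = alpha * x + (1 - alpha) * level  # update the smoothed level
    return preds

def mae(preds, actual):
    """Mean absolute error of a forecast sequence."""
    return sum(abs(p - a) for p, a in zip(preds, actual)) / len(preds)

calls = [210, 232, 225, 241, 238, 250, 247]      # hypothetical daily counts
ses_err = mae(ses_one_step(calls), calls[1:])
naive_err = mae(calls[:-1], calls[1:])           # naive: yesterday's value
```

Scoring an ARIMA model's one-step forecasts with the same `mae` would mirror the comparison made in the paper.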

Relevance: 30.00%

Abstract:

This work concerns forecasting with vector nonlinear time series models when errors are correlated. Point forecasts are numerically obtained using bootstrap methods and illustrated by two examples. Evaluation concentrates on studying forecast equality and encompassing. Nonlinear impulse responses are further considered and graphically summarized by highest density regions. Finally, two macroeconomic data sets are used to illustrate our work. The forecasts from the linear or nonlinear model could contribute useful information absent from the forecasts of the other model.
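The bootstrap point-forecasting idea can be sketched as follows, using a hypothetical univariate two-regime threshold AR(1) in place of the paper's vector models: future paths are simulated by resampling historical residuals with replacement, and the point forecast at each horizon is the average across paths.

```python
import random

def tar_step(x, resid):
    """One step of a hypothetical two-regime threshold AR(1);
    the coefficients are illustrative, not estimated from data."""
    phi = 0.6 if x > 0 else -0.3
    return phi * x + resid

def bootstrap_forecast(x_last, residuals, horizon=3, n_paths=500, seed=1):
    """Average simulated future paths, resampling residuals."""
    rng = random.Random(seed)
    totals = [0.0] * horizon
    for _ in range(n_paths):
        x = x_last
        for h in range(horizon):
            x = tar_step(x, rng.choice(residuals))
            totals[h] += x
    return [t / n_paths for t in totals]
```

With all-zero residuals the forecast collapses to the deterministic skeleton of the model: `bootstrap_forecast(1.0, [0.0], horizon=2)` gives `[0.6, 0.36]`.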

Relevance: 30.00%

Abstract:

A digital elevation model (DEM) plays a substantial role in hydrological studies, from understanding catchment characteristics and setting up a hydrological model to mapping flood risk in a region. Depending on the nature of the study and its objectives, a high-resolution, reliable DEM is often desired for setting up a sound hydrological model. However, such a DEM is not always available, and it is generally high-priced. Obtained through radar-based remote sensing, the Shuttle Radar Topography Mission (SRTM) provides a publicly available DEM with a resolution of 92 m outside the US. It is a valuable source of elevation data where no surveyed DEM is available. However, apart from its coarse resolution, SRTM suffers from inaccuracy, especially over areas with dense vegetation cover, because radar signals do not penetrate the canopy. This can lead to improper model setup as well as erroneous mapping of flood risk. This paper attempts to improve the SRTM dataset using the Normalised Difference Vegetation Index (NDVI), derived from the visible red and near-infrared bands of Landsat imagery at 30 m resolution, together with Artificial Neural Networks (ANNs). The assessment of the improvement and the applicability of this method in hydrology are highlighted and discussed.
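The NDVI input to such a correction is straightforward to compute, and the vegetation-bias removal can be caricatured with a linear stand-in for the trained ANN. The bias coefficient below is invented for illustration; the real mapping learned by the ANN would be nonlinear and data-driven.

```python
def ndvi(nir, red, eps=1e-9):
    """Normalised Difference Vegetation Index from near-infrared and
    visible red reflectances (e.g. Landsat bands)."""
    return (nir - red) / (nir + red + eps)

def correct_srtm(elevation_m, nir, red, bias_m_per_ndvi=12.0):
    """Hypothetical linear stand-in for the ANN correction: subtract a
    vegetation bias proportional to NDVI, since radar returns from
    dense canopy make SRTM overestimate the ground elevation.
    The coefficient is illustrative only."""
    v = max(ndvi(nir, red), 0.0)   # no correction over bare/negative NDVI
    return elevation_m - bias_m_per_ndvi * v
```

Applying `correct_srtm` cell-by-cell to an SRTM grid with co-registered Landsat reflectances lowers elevations where vegetation is dense and leaves bare ground untouched.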

Relevance: 30.00%

Abstract:

Climate model projections show that climate change will further increase the risk of flooding in many regions of the world. There is a need for climate adaptation, but building new infrastructure or additional retention basins has its limits, especially in densely populated areas where open space is limited. Another solution is the more efficient use of existing infrastructure. This research investigates a method for real-time flood control by means of existing gated weirs and retention basins. The method was tested for the specific study area of the Demer basin in Belgium but is generally applicable. Today, retention basins along the Demer River are controlled by adjustable gated weirs based on fixed logic rules. However, because of the high complexity of the system, these rules achieve only suboptimal results. By making use of precipitation forecasts and combined hydrological-hydraulic river models, the state of the river network can be predicted. To speed up the calculations, a conceptual river model was used and combined with a Model Predictive Control (MPC) algorithm and a Genetic Algorithm (GA). The MPC algorithm predicts the state of the river network depending on the positions of the adjustable weirs in the basin, while the GA generates these positions in a semi-random way. Cost functions based on water levels were introduced to evaluate the efficiency of each generation with respect to flood damage minimization. In the final phase of this research, the influence of the most important MPC and GA parameters was investigated by means of a sensitivity study. The results show that the MPC-GA algorithm reduces the total flood volume during the historical event of September 1998 by 46% in comparison with the current regulation. Based on the MPC-GA results, some recommendations could be formulated to improve the logic rules.
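The GA half of the scheme can be sketched with a toy setup. The "river model" below is a made-up storage cascade standing in for the conceptual Demer model, and the GA operators (truncation selection plus Gaussian mutation) are generic textbook choices, not necessarily those of the paper.

```python
import random

def simulate_levels(gate_openings, inflow=3.0):
    """Toy stand-in for the conceptual river model: each gate opening
    (0..1) releases that fraction of the stored water downstream."""
    levels, storage = [], inflow
    for g in gate_openings:
        storage -= g * storage          # release through the weir
        levels.append(storage)          # remaining water sets the level
        storage += inflow * 0.5         # new inflow before the next step
    return levels

def flood_cost(levels, threshold=2.0):
    """Cost function on water levels: total exceedance above threshold."""
    return sum(max(l - threshold, 0.0) for l in levels)

def ga_optimise(n_gates=3, pop_size=30, generations=40, seed=7):
    """Semi-random search for gate positions minimising the flood cost:
    keep the best half, refill by mutating random survivors."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_gates)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: flood_cost(simulate_levels(ind)))
        survivors = pop[:pop_size // 2]
        pop = survivors + [
            [min(max(g + rng.gauss(0.0, 0.1), 0.0), 1.0)
             for g in rng.choice(survivors)]
            for _ in range(pop_size - len(survivors))
        ]
    return min(pop, key=lambda ind: flood_cost(simulate_levels(ind)))
```

In an MPC loop, `ga_optimise` would be re-run at every control step over the prediction horizon; here a single run already drives the toy cost far below the all-gates-closed baseline.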

Relevance: 30.00%

Abstract:

While the simulation of flood risks originating from the overtopping of river banks is well covered by continuously evaluated programs to improve flood protection measures, flash flooding is not. Flash floods are triggered by short-lived, local thunderstorm cells with high precipitation intensities. Small catchments have short response times and flow paths, and convective thunderstorm cells may cause flooding of endangered settlements. Assessing local flooding and flood pathways requires a detailed hydraulic simulation of the surface runoff. Hydrological models usually do not represent surface runoff at this level of detail; instead, empirical equations are applied for runoff detention. Conversely, 2D hydrodynamic models usually neither accept distributed rainfall as input nor implement the types of soil/surface interaction found in hydrological models. Several cases of local flash flooding in recent years raised this issue for practical reasons, as well as the research topic of closing the model gap between distributed rainfall and distributed runoff formation. Therefore, a 2D hydrodynamic model, solving the depth-averaged flow equations with a finite volume discretization, was extended to accept direct rainfall, enabling simulation of the associated runoff formation. The model itself is used as the numerical engine; rainfall is introduced by modifying water levels at fixed time intervals. This paper not only deals with the general application of the software but also tests the numerical stability and reliability of the simulation results. The tests use different artificial as well as measured rainfall series as input. Key parameters of the simulation, such as losses, roughness, and the time interval for water level manipulation, are tested with regard to their impact on stability.
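The mechanism of introducing rainfall by adjusting water levels at fixed intervals can be illustrated in isolation. The function below is a schematic of that single step, with a constant loss rate standing in for the soil/surface interaction; units and parameter names are our own.

```python
def apply_rainfall(depths_m, intensity_mm_h, interval_s, loss_mm_h=0.0):
    """Raise the water depth of every cell by the effective rainfall
    accumulated over one interval, mimicking the water-level
    modification step of a rain-on-grid extension.

    depths_m: per-cell water depths (m); intensity/loss in mm/h."""
    effective_mm_h = max(intensity_mm_h - loss_mm_h, 0.0)
    added_m = effective_mm_h / 1000.0 * (interval_s / 3600.0)
    return [d + added_m for d in depths_m]

# 36 mm/h over a 600 s interval adds 6 mm of water to every cell:
depths = apply_rainfall([0.0, 0.10], intensity_mm_h=36.0, interval_s=600)
```

In the full model, the hydrodynamic engine would advance the depth-averaged flow equations between successive calls to such a step; the choice of `interval_s` is one of the stability parameters the paper examines.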

Relevance: 30.00%

Abstract:

In the field of operational water management, Model Predictive Control (MPC) has gained popularity owing to its versatility and flexibility. The MPC controller, which takes predictions, time delays and uncertainties into account, can be designed for multi-objective management problems and for large-scale systems. Nonetheless, a critical obstacle that needs to be overcome in MPC is the large computational burden when a large-scale system is considered or a long prediction horizon is involved. In order to solve this problem, we use an adaptive prediction accuracy (APA) approach that can reduce the computational burden by almost half. The MPC scheme with APA is tested on the northern Dutch water system, which comprises Lake IJssel, Lake Marker, the River IJssel and the North Sea Canal. The simulation results show that by using the MPC-APA scheme, the computational time can be reduced considerably and a flood protection problem over longer prediction horizons can be solved effectively.
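One way to read the APA idea is to spend fine model resolution only on the near-term part of the horizon, where control decisions matter most, and coarse resolution further out. The two-tier schedule and the numbers below are illustrative assumptions, not the scheme actually used in the paper.

```python
def apa_time_steps(horizon_s, fine_dt, coarse_dt, fine_window_s):
    """Sequence of prediction time steps: fine resolution inside the
    near-term window, coarse resolution for the rest of the horizon.
    Fewer steps means fewer model evaluations per MPC iteration."""
    steps, t = [], 0
    while t < horizon_s:
        dt = fine_dt if t < fine_window_s else coarse_dt
        steps.append(dt)
        t += dt
    return steps

# Uniform 15-min stepping over a 24 h horizon vs. fine for 6 h, coarse after:
uniform = apa_time_steps(86400, 900, 900, 86400)     # 96 steps
adaptive = apa_time_steps(86400, 900, 3600, 21600)   # 42 steps
```

Here the adaptive schedule needs fewer than half the steps of the uniform one, which is the order of saving the abstract reports.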

Relevance: 30.00%

Abstract:

Using vector autoregressive (VAR) models and Monte Carlo simulation methods, we investigate the potential gains in forecasting accuracy and estimation uncertainty from two commonly used restrictions arising from economic relationships. The first reduces the parameter space by imposing long-term restrictions on the behavior of economic variables, as discussed in the literature on cointegration, and the second reduces the parameter space by imposing short-term restrictions, as discussed in the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing the lag length for cointegrated VARs with SCCF restrictions. Second, we compare the forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm in which short- and long-term restrictions interact to estimate the cointegrating and cofeature spaces, respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length. Criteria that select lag and rank simultaneously perform better in this case. Second, this translates into superior forecasting performance of the restricted VECM over the unrestricted VECM, with important improvements in forecasting accuracy reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even for the long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.

Relevance: 30.00%

Abstract:

This paper studies the hourly electricity load demand in the area covered by a utility situated in the southeast of Brazil. We propose a stochastic model which employs generalized long memory (by means of Gegenbauer processes) to model the seasonal behavior of the load. The model is proposed for sectional data, that is, each hour's load is studied separately as a single series. This approach avoids modeling the intricate intra-day pattern (load profile) displayed by the load, which varies throughout the days of the week and the seasons. The forecasting performance of the model is compared with a SARIMA benchmark using the years 1999 and 2000 as the out-of-sample period. The model clearly outperforms the benchmark. We conclude that generalized long memory is present in the series.

Relevance: 30.00%

Abstract:

This paper studies the electricity load demand behavior during the 2001 rationing period, which was implemented because of the Brazilian energy crisis. The hourly data refer to a utility situated in the southeast of the country. We use the model proposed by Soares and Souza (2003), which makes use of generalized long memory to model the seasonal behavior of the load. The rationing period is shown to have imposed a structural break in the series, decreasing the load by about 20%. Even so, forecast accuracy decreases only marginally, and the forecasts rapidly readapt to the new situation. The forecast errors from this model also make it possible to verify the public response to information released regarding the crisis.

Relevance: 30.00%

Abstract:

The goal of this paper is to present a comprehensive empirical analysis of the returns and conditional variances of four Brazilian financial series using models of the ARCH class. Selected models are then compared with regard to forecasting accuracy and goodness-of-fit statistics. To aid understanding of the empirical results, a self-contained theoretical discussion of ARCH models is also presented in a way that is useful to the applied researcher. Empirical results show that although all series exhibit ARCH effects and are leptokurtic relative to the Normal distribution, the return on the US$ clearly displays regime switching and no asymmetry in the variance, the return on COCOA shows no asymmetry, while the returns on the CBOND and TELEBRAS show clear signs of asymmetry favoring the leverage effect. Regarding forecasting, the best model overall was the EGARCH(1,1) in its Gaussian version. Regarding goodness-of-fit statistics, the SWARCH model did well, followed closely by the Student-t GARCH(1,1).
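To make the ARCH-class mechanics concrete, here is a minimal Gaussian GARCH(1,1) simulator. The parameter values are illustrative, not estimates for the Brazilian series; with alpha + beta < 1 the process is covariance-stationary with unconditional variance omega / (1 - alpha - beta).

```python
import math
import random

def simulate_garch11(n, omega=0.05, alpha=0.1, beta=0.85, seed=3):
    """Simulate r_t = sigma_t * z_t with z_t ~ N(0, 1) and
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    rng = random.Random(seed)
    sigma2 = omega / (1.0 - alpha - beta)   # start at unconditional variance
    returns = []
    for _ in range(n):
        r = math.sqrt(sigma2) * rng.gauss(0.0, 1.0)
        returns.append(r)
        sigma2 = omega + alpha * r * r + beta * sigma2
    return returns
```

Simulated returns from this recursion show volatility clustering and are leptokurtic relative to the Normal, matching the stylised facts the paper reports; asymmetric variants such as EGARCH add a sign-dependent (leverage) term to the variance equation.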