916 results for flood forecasting model
Abstract:
At least two important transportation planning activities rely on planning-level crash prediction models. One is motivated by the Transportation Equity Act for the 21st Century, which requires departments of transportation and metropolitan planning organizations to consider safety explicitly in the transportation planning process. The second could arise from a need for state agencies to establish incentive programs to reduce injuries and save lives. Both applications require a forecast of safety for a future period. Planning-level crash prediction models for the Tucson, Arizona, metropolitan region are presented to demonstrate the feasibility of such models. Data were separated into fatal, injury, and property-damage crashes. To accommodate overdispersion in the data, negative binomial regression models were applied. To accommodate the simultaneity of fatality and injury crash outcomes, simultaneous estimation of the models was conducted. All models produce crash forecasts at the traffic analysis zone level. Statistically significant (p-values < 0.05) and theoretically meaningful variables for the fatal crash model included population density, persons 17 years old or younger as a percentage of the total population, and intersection density. Significant variables for the injury and property-damage crash models were population density, number of employees, intersection density, percentage of miles of principal arterials, percentage of miles of minor arterials, and percentage of miles of urban collectors. Among several conclusions, it is suggested that planning-level safety models are feasible and may play a role in future planning activities. However, caution must be exercised with such models.
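For illustration, a minimal sketch of a zone-level negative binomial crash model in Python with statsmodels follows. The file name and column names are hypothetical, and the paper's simultaneous estimation of the fatal and injury equations is not reproduced; each equation is fitted independently here.

```python
# Minimal sketch: negative binomial regression for overdispersed crash counts
# at the traffic analysis zone (TAZ) level. Column/file names are assumptions.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

taz = pd.read_csv("taz_crashes.csv")  # hypothetical TAZ-level table

fatal_model = smf.glm(
    "fatal_crashes ~ pop_density + pct_age_17_or_less + intersection_density",
    data=taz,
    family=sm.families.NegativeBinomial(),  # accommodates overdispersion
).fit()
print(fatal_model.summary())

# Forecast for a future period: apply the fitted model to projected TAZ attributes.
future = taz.assign(pop_density=taz["pop_density"] * 1.10)  # illustrative scenario
print(fatal_model.predict(future).head())
```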
Abstract:
Flood-related scientific and community-based data are rarely systematically collected and analysed in the Philippines. Over recent decades the Pagsangaan River Basin, Leyte, has experienced several flood events. However, documentation describing flood characteristics such as extent, duration, or height of these floods is close to non-existent. To address this issue, computerized flood modelling was used to reproduce past events for which data were available for at least partial calibration and validation. The model was also used to provide scenario-based predictions based on A1B climate change assumptions for the area. The most important input for flood modelling is a Digital Elevation Model (DEM) of the river basin. No accurate topographic maps or Light Detection And Ranging (LIDAR)-generated data are available for the Pagsangaan River. Therefore, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Map (GDEM), Version 1, was chosen as the DEM. Although its horizontal spatial resolution of 30 m is adequate, the dataset contains substantial vertical errors. These were identified, different correction methods were tested, and the resulting DEM was used for flood modelling. The above-mentioned data were combined with cross-sections at various strategic locations of the river network, meteorological records, river water levels, and current velocity to develop the 1D-2D flood model. SOBEK was used as the modelling software to create different rainfall scenarios, including historic flooding events. Owing to the lack of scientific data for verification of the model quality, interviews with local stakeholders served as the gauge for judging the quality of the generated flood maps. According to interviewees, the model reflects reality more accurately than previously available flood maps. The resulting flood maps are now used by the operations centre of a local flood early warning system for warnings and evacuation alerts. Furthermore, these maps can serve as a basis for identifying flood hazard areas for spatial land use planning purposes.
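As a purely illustrative sketch of one possible vertical-error correction for a global DEM such as the ASTER GDEM (not the specific procedure selected in the study), isolated spikes and pits can be flagged against a median-filtered surface and replaced. File names and the deviation threshold are assumptions.

```python
# Illustrative DEM spike/pit suppression before hydraulic modelling.
import numpy as np
import rasterio
from scipy.ndimage import median_filter

with rasterio.open("aster_gdem_pagsangaan.tif") as src:  # hypothetical input
    dem = src.read(1).astype("float64")
    profile = src.profile

smoothed = median_filter(dem, size=3)           # 3x3 median surface
spikes = np.abs(dem - smoothed) > 15.0          # flag cells deviating > 15 m (assumed threshold)
corrected = np.where(spikes, smoothed, dem)     # replace only the flagged cells

with rasterio.open("aster_gdem_corrected.tif", "w", **profile) as dst:
    dst.write(corrected.astype(profile["dtype"]), 1)
```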
Abstract:
Floods are among the most devastating events affecting primarily tropical, archipelagic countries such as the Philippines. With current predictions of climate change including rising sea levels, intensification of typhoon strength, and a general increase in the mean annual precipitation throughout the Philippines, it has become paramount to prepare for the future so that the increased flood risk in the country does not translate into greater economic and human losses. Fieldwork and data gathering were carried out within the framework of an internship at the former German Technical Cooperation (GTZ), in cooperation with the Local Government Unit (LGU) of Ormoc City, Leyte, The Philippines, in order to develop a dynamic computer-based flood model for the basin of the Pagsangaan River. To this end, different geo-spatial analysis tools such as PCRaster and ArcGIS, hydrological analysis packages, and basic engineering techniques were assessed and implemented. The aim was to develop a dynamic flood model and to use the development process to determine the required data, their availability, and their impact on the results, as a case study for flood early warning systems in the Philippines. The hope is that such projects can help reduce flood risk by including the results of worst-case scenario analyses and current climate change predictions in city planning for municipal development, monitoring strategies, and early warning systems. The project was developed using a 1D-2D coupled model in SOBEK (the Deltares hydrological modelling software package) and was also used as a case study to analyze and understand the influence of factors such as land use, schematization, time step size, and tidal variation on the flood characteristics. Several sources of relevant satellite data were compared, such as Digital Elevation Models (DEMs) from ASTER and SRTM data, as well as satellite rainfall data from the GIOVANNI server (NASA) and field gauge data. Different methods were used in the attempt to partially calibrate and validate the model, and finally to simulate and study two climate change scenarios based on scenario A1B predictions. It was observed that large areas currently considered not prone to floods will become low flood risk areas (0.1-1 m water depth). Furthermore, larger sections of the floodplains upstream of the Lilo-an’s Bridge will become moderate flood risk areas (1-2 m water depth). The flood hazard maps created during this project will be presented to the LGU, and the model will be used by GTZ’s Local Disaster Risk Management Department to create a larger set of possible flood-prone areas related to rainfall intensity and to study possible improvements to the current early warning system and to the monitoring of the basin section belonging to Ormoc City. Recommendations on further enhancement of the geo-hydro-meteorological data to improve the model’s accuracy, mainly in areas of interest, will also be presented to the LGU.
Abstract:
Price forecasting is a matter of concern for all participants in electricity markets, from suppliers and consumers to policy makers, who are interested in accurate forecasts of day-ahead electricity prices either for better decision making or for an improved evaluation of the effectiveness of market rules and structure. This paper describes a methodology to forecast market prices in an electricity market using an ARIMA model applied to the conjectural variations of the firms acting in the market. The methodology is applied to the Iberian electricity market to forecast market prices for the 24 hours of a working day. It was then compared with two other methodologies: a naive forecast and a direct forecast of market prices that also uses an ARIMA model. Results show that the conjectural-variations price forecast performs better than the naive forecast and slightly better than the direct price forecast.
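A minimal sketch of the "direct" ARIMA benchmark in Python (statsmodels) follows: fit an ARIMA model to hourly prices and forecast the next 24 hours. The model order and the file layout are assumptions; the paper's preferred approach instead applies ARIMA to the firms' conjectural variations and derives prices from them.

```python
# Direct day-ahead price forecast with ARIMA (illustrative benchmark only).
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

prices = pd.read_csv("mibel_hourly_prices.csv",      # hypothetical file
                     index_col="timestamp", parse_dates=True)["price"]

model = ARIMA(prices, order=(2, 1, 1)).fit()          # assumed ARIMA order
next_day = model.forecast(steps=24)                   # 24 hourly prices for the next day
print(next_day)
```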
Resumo:
Urban flood inundation models require considerable data for their parameterisation, calibration and validation. TerraSAR-X should be suitable for urban flood detection because of its high resolution in stripmap/spotlight modes. The paper describes ongoing work on a project to assess how well TerraSAR-X can detect flooded regions in urban areas, and how well these can constrain the parameters of an urban flood model. The study uses a TerraSAR-X image of a 1-in-150 year flood near Tewkesbury, UK , in 2007, for which contemporaneous aerial photography exists for validation. The DLR SETES SAR simulator was used in conjunction with LiDAR data to estimate regions of the image in which water would not be visible due to shadow or layover caused by buildings and vegetation. An algorithm for the delineation of flood water in urban areas is described, together with its validation using the aerial photographs.
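As a crude illustration of why the shadow/layover estimate matters, the sketch below thresholds low SAR backscatter as candidate water and excludes cells flagged as unobservable. The threshold and file names are assumptions, and this is far simpler than the delineation algorithm described in the paper.

```python
# Toy SAR flood detection with an unobservable-area mask.
import numpy as np

sigma0_db = np.load("terrasar_x_sigma0_db.npy")                  # hypothetical backscatter grid (dB)
unobservable = np.load("setes_shadow_layover.npy").astype(bool)  # hypothetical simulator mask

candidate_water = sigma0_db < -14.0            # assumed dark-pixel threshold
flood = candidate_water & ~unobservable        # keep only pixels where water could be seen

print("flooded fraction of observable area:",
      flood.sum() / (~unobservable).sum())
```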
Abstract:
This study addresses three issues: spatial downscaling, calibration, and combination of seasonal predictions produced by different coupled ocean-atmosphere climate models. It examines the feasibility of using a Bayesian procedure for producing combined, well-calibrated downscaled seasonal rainfall forecasts for two regions in South America and river flow forecasts for the Parana river in the south of Brazil and the Tocantins river in the north of Brazil. These forecasts are important for national electricity generation management and planning. A Bayesian procedure, referred to here as forecast assimilation, is used to combine and calibrate the rainfall predictions produced by three climate models. Forecast assimilation is able to improve the skill of 3-month lead November-December-January multi-model rainfall predictions over the two South American regions. Improvements are noted in forecast seasonal mean values and uncertainty estimates. River flow forecasts are less skilful than rainfall forecasts. This is partially because natural river flow is a derived quantity that is sensitive to hydrological as well as meteorological processes, and to human intervention in the form of reservoir management.
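The toy example below combines three models' seasonal forecasts as Gaussian estimates weighted by their hindcast error precisions. It is only a stand-in for the idea of combining and calibrating multi-model predictions, not the forecast-assimilation procedure used in the study; all numbers are invented.

```python
# Toy precision-weighted combination of multi-model seasonal forecasts.
import numpy as np

means = np.array([310.0, 285.0, 330.0])       # per-model seasonal rainfall forecasts (mm), assumed
variances = np.array([900.0, 1600.0, 2500.0]) # per-model hindcast error variances, assumed

precisions = 1.0 / variances
combined_mean = np.sum(precisions * means) / np.sum(precisions)
combined_var = 1.0 / np.sum(precisions)

print(f"combined forecast: {combined_mean:.1f} mm "
      f"+/- {np.sqrt(combined_var):.1f} mm (1 sigma)")
```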
Abstract:
Satellite-observed data for flood events have been used to calibrate and validate flood inundation models, providing valuable information on the spatial extent of the flood. Improvements in the resolution of this satellite imagery have enabled indirect remote sensing of water levels by using an underlying LiDAR DEM to extract the water surface elevation at the flood margin. In addition to comparison of the spatial extent, this now allows direct comparison between modelled and observed water surface elevations. Using a 12.5 m ERS-1 image of a flood event in 2006 on the River Dee, North Wales, UK, both of these data types are extracted and each is assessed for its value in the calibration of flood inundation models. A LiDAR-guided snake algorithm is used to extract an outline of the flood from the satellite image. From the extracted outline, a binary grid of wet/dry cells is created at the same resolution as the model; using this, the spatial extent of the modelled and observed flood can be compared with a measure of fit between the two binary patterns of flooding. Water heights are extracted at points spaced at intervals of approximately 100 m along the extracted outline, and Student's t-test is used to compare modelled and observed water surface elevations. A LISFLOOD-FP model of the catchment is set up using LiDAR topographic data resampled to the 12.5 m resolution of the satellite image, and calibration of the friction parameter in the model is undertaken using each of the two approaches. Comparison between the two approaches highlights the sensitivity of the spatial measure of fit to uncertainty in the observed data and the potential drawbacks of using the spatial extent when parts of the flood are contained by the topography.
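A sketch of one commonly used measure of fit between binary flood maps, F = A / (A + B + C), is given below, where A counts cells wet in both model and observation, B cells wet only in the model, and C cells wet only in the observation. Whether this exact statistic is the one used here is an assumption, and the file names are hypothetical.

```python
# Binary flood-extent fit between modelled and satellite-derived wet/dry grids.
import numpy as np

modelled = np.load("lisflood_wet_cells.npy").astype(bool)  # hypothetical model wet/dry grid
observed = np.load("ers1_wet_cells.npy").astype(bool)      # hypothetical observed wet/dry grid

A = np.sum(modelled & observed)    # wet in both
B = np.sum(modelled & ~observed)   # wet in model only
C = np.sum(~modelled & observed)   # wet in observation only
fit = A / (A + B + C)
print(f"binary fit F = {fit:.3f}")  # 1.0 = perfect overlap, 0.0 = no overlap
```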
Abstract:
This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy filter (EnKBF) is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. Therefore, we present a suitable integration scheme that handles the stiffening of the differential equations involved and does not incur further computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models: an M-member ensemble detaches into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reverted by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model. The RAW filter is an improvement to the widely used Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding distortion of the mean value of the function. Using statistical significance tests at both the local and field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of medium-term forecasts is increased by using the RAW filter.
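For the RAW filter specifically, a minimal sketch of how it modifies leapfrog time stepping is shown below: a fraction alpha of the filter displacement is applied to the current time level and alpha - 1 to the new level, with alpha = 1 recovering the classical Robert-Asselin filter. The toy oscillation equation and parameter values are illustrative, not taken from the SPEEDY configuration.

```python
# Leapfrog integration of dx/dt = i*omega*x with the Robert-Asselin-Williams filter.
import numpy as np

nu, alpha = 0.2, 0.53          # filter strength and RAW parameter (assumed values)
dt, omega = 0.1, 1.0           # toy oscillation problem

x_prev, x_curr = 1.0 + 0j, np.exp(1j * omega * dt)   # first two time levels
for _ in range(200):
    x_next = x_prev + 2 * dt * 1j * omega * x_curr   # leapfrog step
    d = 0.5 * nu * (x_prev - 2 * x_curr + x_next)    # filter displacement
    x_prev = x_curr + alpha * d                      # RAW: adjust the current level...
    x_curr = x_next + (alpha - 1) * d                # ...and partially the new level

print("amplitude after 200 steps:", abs(x_curr))     # stays close to 1 when amplitude error is small
```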
Abstract:
This volume is a serious attempt to open up the subject of European philosophy of science to real thought and to provide the structural basis for the interdisciplinary development of its specialist fields, but also to provoke reflection on the idea of ‘European philosophy of science’. These efforts should foster a contemporaneous reflection on what might be meant by philosophy of science in Europe and European philosophy of science, and on how awareness of it could help philosophers interpret and motivate their research through a stronger collective identity. The overarching aim is to set the background for a collaborative project organising, systematising, and ultimately forging an identity for, European philosophy of science by creating research structures and developing research networks across Europe to promote its development.
Abstract:
Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large-scale flood preparedness. Such global hazard maps can be generated using large-scale physically based models of rainfall-runoff and river routing, when used in conjunction with a number of post-processing methods. In this study, the European Centre for Medium-Range Weather Forecasts (ECMWF) land surface model is driven by ERA-Interim reanalysis meteorological forcing data, and the resultant runoff is passed to a river routing algorithm which simulates floodplains and flood flow across the global land area. The global hazard map is based on a 30 yr (1979–2010) simulation period. A Gumbel distribution is fitted to the annual maxima flows to derive a number of flood return periods. The return periods are calculated initially for a 25×25 km grid, which is then reprojected onto a 1×1 km grid to derive maps of higher resolution and to estimate the flooded fractional area for the individual 25×25 km cells. Several global and regional maps of flood return periods ranging from 2 to 500 yr are presented. The results compare reasonably well with a benchmark data set of global flood hazard. The developed methodology can be applied to other datasets on a global or regional scale.
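A minimal sketch of the return-period step follows: fit a Gumbel distribution to one grid cell's annual maximum flows and read off the flows corresponding to chosen return periods. The annual maxima below are synthetic, and the reprojection from the 25 km grid to 1 km is not shown.

```python
# Gumbel fit to annual maxima and derivation of return-period flows (one cell).
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)
annual_maxima = gumbel_r.rvs(loc=800.0, scale=250.0, size=30,
                             random_state=rng)       # synthetic 30 yr of annual maxima (m3/s)

loc, scale = gumbel_r.fit(annual_maxima)
for T in (2, 5, 10, 50, 100, 500):                    # return periods in years
    q_T = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"{T:>3} yr return flow: {q_T:8.1f} m3/s")
```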
Abstract:
With many operational centers moving toward order 1-km-gridlength models for routine weather forecasting, this paper presents a systematic investigation of the properties of high-resolution versions of the Met Office Unified Model for short-range forecasting of convective rainfall events. The authors describe a suite of configurations of the Met Office Unified Model running with grid lengths of 12, 4, and 1 km and analyze results from these models for a number of convective cases from the summers of 2003, 2004, and 2005. The analysis includes subjective evaluation of the rainfall fields and comparisons of rainfall amounts, initiation, cell statistics, and a scale-selective verification technique. It is shown that the 4- and 1-km-gridlength models often give more realistic-looking precipitation fields because convection is represented explicitly rather than parameterized. However, the 4-km model representation suffers from large convective cells and delayed initiation because the grid length is too long to reproduce the convection correctly. These problems are not as evident in the 1-km model, although it does produce too many small cells in some situations. Both the 4- and 1-km models suffer from poor representation at the start of the forecast, during the period when the high-resolution detail is spinning up from the lower-resolution (12 km) starting data used. A scale-selective precipitation verification technique implies that for later times in the forecasts (after the spinup period) the 1-km model performs better than the 12- and 4-km models for lower rainfall thresholds. For higher thresholds the 4-km model scores almost as well as the 1-km model, and both do better than the 12-km model.
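One widely used scale-selective verification measure is the Fractions Skill Score (FSS): rainfall fields are thresholded, fractional coverage is computed in neighbourhoods of increasing size, and the forecast and observed fractions are compared. Treating this as the exact technique used in the paper is an assumption; the sketch below uses synthetic fields and an assumed threshold.

```python
# Fractions Skill Score over increasing neighbourhood sizes (synthetic example).
import numpy as np
from scipy.ndimage import uniform_filter

def fss(forecast, observed, threshold, window):
    """Fractions Skill Score of a forecast rain field against observations."""
    f_frac = uniform_filter((forecast >= threshold).astype(float), size=window)
    o_frac = uniform_filter((observed >= threshold).astype(float), size=window)
    mse = np.mean((f_frac - o_frac) ** 2)
    mse_ref = np.mean(f_frac ** 2) + np.mean(o_frac ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 2.0, size=(200, 200))                       # synthetic rain field (mm/h)
fcst = np.roll(obs, 5, axis=1) + rng.normal(0, 0.5, obs.shape)   # displaced, noisy "forecast"
for window in (1, 5, 25):                                        # neighbourhood size in grid points
    print(f"window {window:>2}: FSS = {fss(fcst, obs, threshold=4.0, window=window):.3f}")
```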