916 results for flood forecasting model


Relevance: 30.00%

Abstract:

An agent-based model for spatial electric load forecasting is presented, using a local-movement approach for the spatiotemporal allocation of new loads in the service zone. The electric load density of each of the major consumer classes in each sub-zone is used as the current state of the agents. Spatial growth is simulated with a walking agent that starts its path in one of the activity centers of the city and moves toward the city limits along a radial path that depends on the different load levels. A series of update rules is established to simulate the S-curve growth behavior and the complementarity between classes. The results are presented as future load-density maps. Tests on a real system from a mid-size city show a high success rate when compared with other techniques. The most important features of this methodology are that it requires little data and that the algorithm is simple, allowing for future scalability. © 2009 IEEE.
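As a very rough illustration (not the authors' algorithm), the radial walking-agent idea can be sketched as an outward-biased random walk that deposits units of new load on a sub-zone grid; the grid size, bias probability, and restart rule are all invented for this example:

```python
import random

def simulate_load_growth(size=11, steps=200, seed=42):
    """Toy sketch of radial spatial load growth: a walking agent starts at
    an activity center and allocates units of new load along its path."""
    rng = random.Random(seed)
    grid = [[0.0] * size for _ in range(size)]   # load density per sub-zone
    center = size // 2
    x, y = center, center
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(steps):
        grid[y][x] += 1.0                        # allocate one unit of new load
        if rng.random() < 0.7:                   # outward bias: radial growth
            dx, dy = max(moves, key=lambda m: abs(x + m[0] - center)
                                            + abs(y + m[1] - center))
        else:                                    # occasional lateral wander
            dx, dy = rng.choice(moves)
        x, y = x + dx, y + dy
        if not (0 <= x < size and 0 <= y < size):  # reached the city limits:
            x, y = center, center                  # restart at the center
    return grid
```

The returned grid plays the role of a (very coarse) future load-density map; real implementations would apply class-specific update rules per sub-zone.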

Relevance: 30.00%

Abstract:

Present-day vegetation types, sedimentary sequences, pollen data, and radiocarbon dates obtained from three sediment cores from the Calçoene coastal plain were used to establish a late Holocene palaeoecological history of the coastal wetlands of Amapá in terms of changes in flooding regime, sea level, and climate. Based on these three records, four phases of vegetation development are proposed, which probably reflect the interplay between the energy flow in sediment accumulation and the influence of brackish and fresh water on the vegetation. This work suggests alternations between periods characterized by marine and fluvial influence. The longitudinal profile did not reveal the occurrence of mangroves in the sediments deposited around 2100 yr BP. During the second phase, mud progressively filled the depressions and tidal channels. Mangroves probably began to develop along the channel margins, and herbaceous fields in the elevated sectors. The third phase is characterized by an interruption in mangrove development and the expansion of várzea (floodplain forest) vegetation due to a decrease in the influence of marine waters. The last phase is represented by the expansion of mangroves and várzea. The correlation between the present-day distribution patterns of the geobotanical units and the palaeovegetation indicates that mangroves and várzea forests are migrating over the herbaceous fields in the topographically higher sectors of the studied coast, which may be related to a rise in relative sea level.

Relevance: 30.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 30.00%

Abstract:

Brazil is the largest sugarcane producer in the world and is well positioned to supply national and international markets. To maintain high sugarcane production, it is fundamental to improve crop-season forecasting models through the use of alternative technologies, such as remote sensing. Thus, the main purpose of this article is to assess the results of two different statistical forecasting methods applied to an agroclimatic index (the water requirement satisfaction index, WRSI) and to the sugarcane spectral response (the normalized difference vegetation index, NDVI) registered on National Oceanic and Atmospheric Administration Advanced Very High Resolution Radiometer (NOAA-AVHRR) satellite images. We also evaluated the cross-correlation between these two indexes. According to the results obtained, there are meaningful correlations between NDVI and WRSI at various time lags. Additionally, the adjusted model for NDVI produced more accurate results than the forecasting models for WRSI. Finally, the analyses indicate that NDVI is more predictable due to its seasonality, while the WRSI values are more variable, making them more difficult to forecast.
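The lag analysis described above amounts to computing Pearson correlations between time-shifted series. A minimal pure-Python sketch, with hypothetical series standing in for the paper's NDVI and WRSI data:

```python
def lagged_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag]; a small sketch of
    the NDVI/WRSI lagged cross-correlation (series are hypothetical)."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((v - mx) ** 2 for v in x) ** 0.5  # std (up to a common factor)
    sy = sum((v - my) ** 2 for v in y) ** 0.5
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

# hypothetical series where y simply lags x by one step
ndvi = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
wrsi = [9.0, 1.0, 2.0, 3.0, 4.0, 5.0]
```

Scanning `lag` over a range and picking the maximum of `lagged_corr` is the usual way to locate the dominant time lag between two such indexes.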

Relevance: 30.00%

Abstract:

A new methodology is being devised for ensemble ocean forecasting using distributions of the surface wind field derived from a Bayesian Hierarchical Model (BHM). The ocean members are forced with samples from the posterior distribution of the wind during the assimilation of satellite and in-situ ocean data. The initial condition perturbations are then consistent with the best available knowledge of the ocean state at the beginning of the forecast and amplify the ocean response to uncertainty in the forcing alone. The ECMWF Ensemble Prediction System (EPS) surface winds are also used to generate a reference ocean ensemble to evaluate the performance of the BHM method, which proves effective in concentrating the forecast uncertainty at the ocean mesoscale. An eight-month experiment of weekly BHM ensemble forecasts was performed in the framework of the operational Mediterranean Forecasting System. The statistical properties of the ensemble are compared with model errors throughout the seasonal cycle, demonstrating a strong relationship between the forecast uncertainties due to atmospheric forcing and the seasonal cycle.

Relevance: 30.00%

Abstract:

In this paper we introduce a new methodology for wind field forecasting over complex terrain. The idea is to use the predictions of the HARMONIE mesoscale model as the input data for an adaptive finite element mass-consistent wind model [1, 2]. A description of the HARMONIE Non-Hydrostatic Dynamics can be found in [3]. The HARMONIE results (obtained with a maximum resolution of about 1 km) are refined at a local scale (about a few meters)...

Relevance: 30.00%

Abstract:

Forecasting the time, location, nature, and scale of volcanic eruptions is one of the most urgent tasks of modern applied volcanology. The reliability of probabilistic forecasting procedures is strongly related to the reliability of the input information provided, which requires objective criteria for interpreting the historical and monitoring data. For this reason, both detailed analysis of past data and more basic research into the processes of volcanism are fundamental parts of a continuous information-gain process; in this way the precursors of eruptions can be better interpreted in terms of their physical meaning, with correlated uncertainties. This should lead to better predictions of the nature of eruptive events. In this work we have studied different problems associated with long- and short-term eruption forecasting. First, we discuss different approaches to the analysis of the eruptive history of a volcano, most of them generally applied for long-term eruption forecasting purposes; furthermore, we present a model based on the characteristics of a Brownian passage-time process to describe recurrent eruptive activity, and apply it to long-term, time-dependent eruption forecasting (Chapter 1). Conversely, in an effort to define further monitoring parameters as input data for short-term eruption forecasting in probabilistic models (for example, the Bayesian Event Tree for eruption forecasting, BET_EF), we analyze some characteristics of the typical seismic activity recorded at active volcanoes; in particular, we use methodologies that may be applied to analyze long-period (LP) events (Chapter 2) and volcano-tectonic (VT) seismic swarms (Chapter 3); our analyses are generally oriented toward tracking phenomena that can provide information about magmatic processes.
Finally, we discuss some possible ways to integrate the results presented in Chapter 1 (for long-term EF) and Chapters 2 and 3 (for short-term EF) into the BET_EF model (Chapter 4).
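The Brownian passage-time model mentioned for Chapter 1 has the inverse-Gaussian density, parameterized by a mean recurrence interval mu and an aperiodicity alpha. A small sketch that evaluates the density and a conditional eruption probability by numerical integration (all parameter values are illustrative, not taken from the thesis):

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian passage-time (inverse Gaussian) density for recurrence
    interval t, with mean mu and aperiodicity alpha."""
    return (math.sqrt(mu / (2 * math.pi * alpha**2 * t**3))
            * math.exp(-(t - mu)**2 / (2 * mu * alpha**2 * t)))

def eruption_prob(t0, dt, mu, alpha, n=10000):
    """P(eruption in (t0, t0 + dt) | quiet until t0 since the last event),
    with the density integrated by the trapezoidal rule."""
    def integral(a, b):
        h = (b - a) / n
        return h * (0.5 * (bpt_pdf(a, mu, alpha) + bpt_pdf(b, mu, alpha))
                    + sum(bpt_pdf(a + i * h, mu, alpha) for i in range(1, n)))
    survival = 1.0 - integral(1e-9, t0)   # probability of no event before t0
    return integral(t0, t0 + dt) / survival
```

Time-dependence is visible in `eruption_prob`: unlike a Poisson model, the conditional probability depends on the elapsed quiet time `t0`.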

Relevance: 30.00%

Abstract:

A new Coastal Rapid Environmental Assessment (CREA) strategy has been developed and successfully applied to the Northern Adriatic Sea. The CREA strategy exploits the recent advent of operational oceanography to establish a CREA system based on an operational regional forecasting system and coastal monitoring networks of opportunity. The methodology aims to initialize a coastal high-resolution model, nested within the regional forecasting system, by blending the large-scale parent model fields with the available coastal observations to generate the requisite field estimates. The CREA modeling system consists of a high-resolution, O(800 m), Adriatic SHELF model (ASHELF) implemented in the Northern Adriatic basin and nested within the Adriatic Forecasting System (AFS) (Oddo et al. 2006). The observational system is composed of the coastal networks established in the framework of the ADRICOSM (ADRiatic sea integrated COastal areaS and river basin Management system) Pilot Project. An assimilation technique corrects the initial field provided by AFS on the basis of the available observations. The blending of the two data sets is carried out through a multi-scale optimal interpolation technique developed by Mariano and Brown (1992). Two weekly CREA exercises were conducted: the first at the beginning of May (spring experiment), the second in mid-August (summer experiment). The weeks were chosen according to the availability of all coastal observations on the initialization day and one week later, so that the model results could be validated and our predictive skill verified. The ASHELF spin-up time was also investigated, through a dedicated experiment, in order to obtain the maximum forecast accuracy within a minimum time.
Energetic evaluations show that, for the Northern Adriatic Sea and for the forcing applied, a spin-up period of one week allows ASHELF to generate the new circulation features enabled by the increased resolution and allows its total kinetic energy to establish a new dynamical balance. The CREA results, evaluated by means of standard statistics between ASHELF and coastal CTDs, show the improvement deriving from the initialization technique and a good model performance in the coastal areas of the Northern Adriatic basin, characterized by a shallow and wide continental shelf subject to substantial freshwater influence from rivers. The results demonstrate the feasibility of our CREA strategy to support coastal zone management and call for the further establishment of operational coastal monitoring activities to advance it.
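The blending step can be illustrated, in heavily simplified scalar form, by the textbook optimal-interpolation update; the actual system applies the multi-scale OI of Mariano and Brown (1992) to full fields, so this one-point version is only a caricature:

```python
def oi_update(background, obs, var_b, var_o):
    """Scalar optimal-interpolation blend of a model background value with
    one coastal observation, weighted by their error variances."""
    gain = var_b / (var_b + var_o)        # weight given to the observation
    analysis = background + gain * (obs - background)
    var_a = (1.0 - gain) * var_b          # analysis error variance shrinks
    return analysis, var_a
```

With equal background and observation error variances, the analysis lands halfway between model and observation and its error variance halves.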

Relevance: 30.00%

Abstract:

This research activity studied how uncertainties arise and interact in the multi-model approach, since this appears to be the biggest challenge of ocean and weather forecasting. Moreover, we tried to reduce model error through the superensemble approach. To this end, we created different datasets and, by means of appropriate algorithms, obtained the superensemble estimate. We studied the sensitivity of this algorithm as a function of its characteristic parameters. Clearly, a reasonable estimate of the error cannot be obtained while neglecting the grid size of the ocean model, given the large number of sub-grid phenomena embedded in the spatial discretization that can only be roughly parametrized rather than explicitly resolved. For this reason we also developed a high-resolution model, in order to calculate for the first time the impact of grid resolution on model error.
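A superensemble combines several model forecasts with weights fitted by least squares over a training period. A minimal sketch of that fitting step, under simplifying assumptions (toy data, no anomaly removal, naive Gaussian elimination without pivoting):

```python
def superensemble_weights(models, truth):
    """Least-squares combination weights for several model forecast series
    against an observed training series (superensemble sketch)."""
    m = len(models)
    # normal equations G w = b for min_w || sum_i w_i * model_i - truth ||^2
    G = [[sum(a * b for a, b in zip(models[i], models[j])) for j in range(m)]
         for i in range(m)]
    b = [sum(a * t for a, t in zip(models[i], truth)) for i in range(m)]
    for col in range(m):                     # forward elimination
        for row in range(col + 1, m):
            f = G[row][col] / G[col][col]
            G[row] = [x - f * y for x, y in zip(G[row], G[col])]
            b[row] -= f * b[col]
    w = [0.0] * m                            # back substitution
    for row in range(m - 1, -1, -1):
        w[row] = (b[row] - sum(G[row][k] * w[k]
                               for k in range(row + 1, m))) / G[row][row]
    return w
```

Once trained, the superensemble forecast is simply the weighted sum of the individual model forecasts with these weights.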

Relevance: 30.00%

Abstract:

Environmental computer models are deterministic models devoted to predicting environmental phenomena such as air pollution or meteorological events. Numerical model output is given in terms of averages over grid cells, usually at high spatial and temporal resolution. However, these outputs are often biased, with unknown calibration, and are not equipped with any information about the associated uncertainty. Conversely, data collected at monitoring stations are more accurate, since they essentially provide the true levels. Given the leading role played by numerical models, it is now important to compare model output with observations. Statistical methods developed to combine numerical model output and station data are usually referred to as data fusion. In this work, we first combine ozone monitoring data with ozone predictions from the Eta-CMAQ air quality model in order to forecast in real time the current 8-hour average ozone level, defined as the average of the previous four hours, the current hour, and the predictions for the next three hours. We propose a Bayesian downscaler model based on first differences, with a flexible coefficient structure and an efficient computational strategy for fitting the model parameters. Model validation for the eastern United States shows a substantial improvement of our fully inferential approach over the current real-time forecasting system. Furthermore, we consider introducing temperature data from a weather forecast model into the downscaler, showing improved real-time ozone predictions. Finally, we introduce a hierarchical model to obtain the spatially varying uncertainty associated with numerical model output. We show how such uncertainty can be learned through suitable stochastic data fusion modeling using some external validation data. We illustrate our Bayesian model by providing the uncertainty map associated with a temperature output over the northeastern United States.
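The 8-hour average definition in the text is easy to make concrete: it mixes observed and forecast hours around the current one. A small sketch with hypothetical values:

```python
def realtime_8h_ozone(past4, current, next3):
    """Real-time 8-hour average ozone as defined in the text: the previous
    four observed hours, the current hour, and three forecast hours."""
    if len(past4) != 4 or len(next3) != 3:
        raise ValueError("need 4 past values and 3 forecast values")
    return (sum(past4) + current + sum(next3)) / 8.0
```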

Relevance: 30.00%

Abstract:

This thesis is divided into three chapters. In the first chapter we analyse the results of the world forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure for evaluating earthquake forecasting models. We first present the models and the target earthquakes to be forecast. Then we explain the consistency and comparison tests that are used in CSEP experiments to evaluate the performance of the models. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability theory (Le Cam's theorem) we show that declustering is not necessary to obtain the Poissonian behaviour of the exceedances that is usually considered fundamental for transforming exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, yielding a more realistic PSHA. In the last chapter we explore the methods that are commonly used to account for epistemic uncertainty in PSHA. The most widely used is the logic tree, which stands at the basis of the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then show that this structure is not adequate to describe epistemic uncertainty. We then propose a new probabilistic framework, based on ensemble modelling, that properly accounts for epistemic uncertainties in PSHA.
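The rate-to-probability step that the declustering discussion refers to is the standard Poisson conversion used in Cornell-McGuire PSHA: an annual exceedance rate becomes the probability of at least one exceedance during an exposure time.

```python
import math

def exceedance_probability(rate, years):
    """Poisson conversion: probability of at least one exceedance in
    `years` of exposure, given an annual exceedance rate."""
    return 1.0 - math.exp(-rate * years)
```

For example, an annual rate of 0.01 over a 50-year exposure gives roughly a 39% exceedance probability, not 50%: the Poisson model accounts for multiple possible events.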

Relevance: 30.00%

Abstract:

This work focuses on the study of saltwater intrusion in coastal aquifers, and in particular on the construction of conceptual schemes to evaluate the risk associated with it. Saltwater intrusion depends on different natural and anthropic factors, both exhibiting strongly random behaviour, which should be considered for an optimal management of the territory and water resources. Given the uncertainty in the problem parameters, the risk associated with salinization needs to be cast in a probabilistic framework. On the basis of a widely adopted sharp-interface formulation, the key hydrogeological parameters are modeled as random variables, and global sensitivity analysis is used to determine their influence on the position of the saltwater interface. The analyses presented in this work rely on an efficient model-reduction technique, based on Polynomial Chaos Expansion, able to provide a good description of the model without great computational burden. When the assumptions of classical analytical models are not respected, as happens in many real case studies, including the area analyzed in the present work, one can adopt data-driven techniques based on the analysis of the data characterizing the system under study. A model can then be defined on the basis of connections between the system state variables, with only a limited number of assumptions about the "physical" behaviour of the system.
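One classical sharp-interface relation, the Ghyben-Herzberg approximation, is used here purely to illustrate the formulation class the text refers to (the work's own model is not reproduced): it ties the interface depth to the freshwater head, and lends itself to a toy Monte Carlo risk estimate when the head is treated as a random variable.

```python
def interface_depth(head, rho_f=1000.0, rho_s=1025.0):
    """Ghyben-Herzberg sharp-interface estimate: depth (m below sea level)
    of the fresh/salt interface for a freshwater head `head` (m above sea
    level); with the default densities the classic factor is 40."""
    return rho_f / (rho_s - rho_f) * head

def salinization_risk(heads, critical_depth):
    """Toy probabilistic flavour: fraction of sampled heads for which the
    interface sits shallower than a critical depth."""
    return sum(1 for h in heads if interface_depth(h) < critical_depth) / len(heads)
```

In a real analysis the head samples would come from a calibrated distribution, and a Polynomial Chaos surrogate would replace the direct model evaluation.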

Relevance: 30.00%

Abstract:

The present work studies a km-scale data assimilation scheme based on a LETKF developed for the COSMO model. The aim is to evaluate the impact of assimilating two different types of data: temperature, humidity, pressure, and wind data from conventional networks (SYNOP, TEMP, and AIREP reports) and 3D reflectivity from radar volumes. A 3-hourly continuous assimilation cycle has been implemented over an Italian domain, based on a 20-member ensemble, with boundary conditions provided by the ECMWF ENS. Three different experiments were run to evaluate the performance of the assimilation over one week of October 2014 during which the Genoa and Parma floods took place: a control run of the data assimilation cycle assimilating data from conventional networks only; a second run in which the SPPT scheme is activated in the COSMO model; and a third run in which reflectivity volumes from meteorological radars are also assimilated. Objective evaluation of the experiments was carried out both on case studies and on the entire week: checks of the analysis increments; computation of the Desroziers statistics for SYNOP, TEMP, AIREP, and RADAR data over the Italian domain; verification of the analyses against data not assimilated (temperature at the lowest model level objectively verified against SYNOP data); and objective verification of the deterministic forecasts initialised with the KENDA analyses for each of the three experiments.
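A full LETKF is beyond a short example, but the core ensemble Kalman analysis step can be sketched for a scalar state. This deterministic nudge is a drastic simplification of the scheme used with COSMO/KENDA, which solves the update in ensemble space per grid point with localization:

```python
def ens_kalman_update(ensemble, obs, obs_var):
    """Scalar ensemble Kalman analysis step (deterministic sketch): each
    member is nudged toward the observation by a gain built from the
    ensemble's own spread."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    var_b = sum((v - mean) ** 2 for v in ensemble) / (n - 1)  # background var.
    gain = var_b / (var_b + obs_var)                          # Kalman gain
    return [v + gain * (obs - v) for v in ensemble]
```

Note how the analysis ensemble is pulled toward the observation while its spread contracts; the real LETKF achieves this with a transform matrix rather than a per-member nudge.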

Relevance: 30.00%

Abstract:

This study summarises all the accessible data on old German chemical weapons dumped in the Baltic Sea. Mr. Goncharov formulated a concept for evaluating the ecological impact of chemical warfare agents (CWA) on the marine environment and structured a simulation model adapted to the specific hydrological conditions and hydrobiological subjects of the Bornholm Deep. The mathematical model he created describes the spreading of contaminants by currents and turbulence in the near-bottom boundary layer. Parameters of CWA discharge through corrosion of canisters were given for various kinds of bottom sediments, with allowance for current velocity. He created a method for integral estimation and a computer simulation model, and completed a forecast for the CWA "Mustard", which showed that under normal hydrometeorological conditions there are local toxic plumes drifting along the bottom for distances of up to several kilometres. With storm winds, the toxic plumes from separate canisters merge and lengthen, and can reach fishery areas near Bornholm Island. When salt water from the North Sea flows in, the length of the toxic zones can increase to 100 kilometres or more, and toxic water masses can spread into the northern Baltic. On this basis, Mr. Goncharov drew up recommendations to reduce the dangers to human ecology and proposed the creation of a special system for forecasting and remote sensing of the environmental conditions at CWA burial sites.
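A contaminant-spreading model of the kind described is typically built on the advection-diffusion equation. A toy one-dimensional explicit step (upwind advection, centred diffusion), with grid, velocity, and diffusivity values invented for illustration:

```python
def advect_diffuse_step(c, u, k, dx, dt):
    """One explicit step of 1D advection-diffusion for a plume drifting
    along the bottom (upwind advection for u > 0, centred diffusion);
    boundary cells are held fixed."""
    new = c[:]
    for i in range(1, len(c) - 1):
        adv = -u * (c[i] - c[i - 1]) / dx                 # transport by current
        dif = k * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx**2  # turbulent mixing
        new[i] = c[i] + dt * (adv + dif)
    return new
```

Iterating this step moves and spreads an initial concentration spike downstream, which is the qualitative behaviour of the toxic plumes described above (a real model would be 3D and include the corrosion source term).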

Relevance: 30.00%

Abstract:

Climate change is expected to profoundly influence the hydrosphere of mountain ecosystems. The focus of current process-based research is centered on the reaction of glaciers and runoff to climate change; spatially explicit impacts on soil moisture remain widely neglected. We spatio-temporally analyzed the impact of climate on soil moisture in a mesoscale high-mountain catchment to facilitate the development of mitigation and adaptation strategies at the level of vegetation patterns. Two regional climate models were downscaled using three different approaches (statistical downscaling, delta change, and direct use) to drive a hydrological model (WaSiM-ETH) for a reference and a scenario period (1960–1990 and 2070–2100), resulting in an ensemble forecast of six members. For all ensemble members we found large changes in temperature, resulting in decreasing snow and ice storage and earlier runoff, but only small changes in evapotranspiration. The occurrence of downscaled dry spells was found to fluctuate greatly, causing soil moisture depletion and drought stress potential to show high variability in both space and time. In general, the choice of downscaling approach had a stronger influence on the results than the applied regional climate model. All of the results indicate that summer soil moisture decreases, which leads to more frequent declines below a critical soil moisture level and an increased evapotranspiration deficit. Forests up to an elevation of 1800 m a.s.l. are likely to be threatened the most, while alpine areas and most pastures remain nearly unaffected. Nevertheless, the ensemble variability was found to be extremely high and should be interpreted as a bandwidth of possible future drought stress situations.
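Of the three downscaling approaches mentioned, the delta-change method is the simplest: the observed series is shifted (or, for variables like precipitation, scaled) by the climate model's scenario-minus-reference signal. A minimal sketch with hypothetical values:

```python
def delta_change(obs_series, ref_mean, scen_mean, multiplicative=False):
    """Delta-change downscaling sketch: perturb an observed series by the
    model's scenario-minus-reference change (additive, e.g. temperature)
    or by their ratio (multiplicative, e.g. precipitation)."""
    if multiplicative:
        factor = scen_mean / ref_mean
        return [v * factor for v in obs_series]
    delta = scen_mean - ref_mean
    return [v + delta for v in obs_series]
```

The method preserves the observed variability by construction, which is one reason the text finds the choice of downscaling approach so influential: statistical downscaling and direct use can alter variability, while delta change cannot.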