905 results for Weather forecasting.
Abstract:
[EN]Ensemble forecasting [1] is a methodology to deal with uncertainties in the numerical wind prediction. In this work we propose to apply ensemble methods to the adaptive wind forecasting model presented in [2]. The wind field forecasting is based on a mass-consistent model and a log-linear wind profile using as input data the resulting forecast wind from Harmonie [3], a Non-Hydrostatic Dynamic model. The mass-consistent model parameters are estimated by using genetic algorithms [4]. The mesh is generated using the meccano method [5] and adapted to the geometry. The main source of uncertainties in this model is the parameter estimation and the intrinsic uncertainties of the Harmonie Model…
Abstract:
[EN]Ensemble forecasting is a methodology to deal with uncertainties in the numerical wind prediction. In this work we propose to apply ensemble methods to the adaptive wind forecasting model presented in. The wind field forecasting is based on a mass-consistent model and a log-linear wind profile using as input data the resulting forecast wind from Harmonie, a Non-Hydrostatic Dynamic model used experimentally at AEMET with promising results. The mass-consistent model parameters are estimated by using genetic algorithms. The mesh is generated using the meccano method and adapted to the geometry…
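The abstract above pairs a mass-consistent wind model with genetic-algorithm parameter estimation. As a toy illustration of that estimation step only, here is a minimal sketch that fits a single hypothetical parameter `alpha` of a synthetic power-law wind profile; the model, the data, and the GA settings are all illustrative assumptions, not the authors' implementation:

```python
import random

random.seed(0)

# Toy "model": predicted wind speed at several heights as a function of alpha.
# The real mass-consistent model is far more complex; this is purely illustrative.
def model(alpha, heights):
    return [alpha * (h ** 0.14) for h in heights]

HEIGHTS = [10, 40, 80, 120]
OBSERVED = model(2.0, HEIGHTS)  # synthetic "observations" with true alpha = 2.0

def fitness(alpha):
    # Negative sum of squared errors: higher is better.
    pred = model(alpha, HEIGHTS)
    return -sum((p - o) ** 2 for p, o in zip(pred, OBSERVED))

def genetic_estimate(pop_size=30, generations=60, mutation=0.1):
    pop = [random.uniform(0.1, 5.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b)                 # arithmetic crossover
            child += random.gauss(0.0, mutation)  # Gaussian mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = genetic_estimate()
print(round(best, 2))  # close to the true value 2.0
```

An ensemble in the spirit of the abstract could then be built by re-running the estimate under perturbed inputs and collecting the resulting wind fields.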
Abstract:
Forecasting the time, location, nature, and scale of volcanic eruptions is one of the most urgent aspects of modern applied volcanology. The reliability of probabilistic forecasting procedures is strongly related to the reliability of the input information provided, implying objective criteria for interpreting the historical and monitoring data. For this reason, both detailed analysis of past data and more basic research into the processes of volcanism are fundamental tasks of a continuous information-gain process; in this way the precursor events of eruptions can be better interpreted in terms of their physical meanings, with correlated uncertainties. This should lead to better predictions of the nature of eruptive events. In this work we have studied different problems associated with long- and short-term eruption forecasting. First, we discuss different approaches for the analysis of the eruptive history of a volcano, most of them generally applied for long-term eruption forecasting purposes; furthermore, we present a model based on the characteristics of a Brownian passage-time process to describe recurrent eruptive activity, and apply it to long-term, time-dependent eruption forecasting (Chapter 1). Conversely, in an effort to define further monitoring parameters as input data for short-term eruption forecasting in probabilistic models (for example, the Bayesian Event Tree for Eruption Forecasting, BET_EF), we analyze some characteristics of the typical seismic activity recorded in active volcanoes; in particular, we use some methodologies that may be applied to analyze long-period (LP) events (Chapter 2) and volcano-tectonic (VT) seismic swarms (Chapter 3); our analyses are in general oriented toward the tracking of phenomena that can provide information about magmatic processes.
Finally, we discuss some possible ways to integrate the results presented in Chapters 1 (for long-term EF), 2 and 3 (for short-term EF) in the BET_EF model (Chapter 4).
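The Brownian passage-time model of Chapter 1 treats inter-event times as inverse-Gaussian distributed. A minimal sketch of the BPT density and of the conditional eruption probability over a forecast window, using hypothetical parameter values (mean recurrence, aperiodicity) that are not those estimated in the thesis:

```python
import math

def bpt_pdf(t, mu, alpha):
    # Inverse-Gaussian (Brownian passage-time) density with
    # mean recurrence mu and aperiodicity alpha.
    return math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3)) * \
        math.exp(-((t - mu) ** 2) / (2.0 * mu * alpha**2 * t))

def bpt_cdf(t, mu, alpha, n=20000):
    # Simple trapezoidal integration of the density from 0+ to t.
    if t <= 0:
        return 0.0
    h = t / n
    total = 0.5 * bpt_pdf(t, mu, alpha) * h  # density vanishes at t -> 0+
    for i in range(1, n):
        total += bpt_pdf(i * h, mu, alpha) * h
    return total

def conditional_probability(elapsed, window, mu, alpha):
    # P(event in (elapsed, elapsed + window] | no event up to `elapsed`)
    f_t = bpt_cdf(elapsed, mu, alpha)
    f_tw = bpt_cdf(elapsed + window, mu, alpha)
    return (f_tw - f_t) / (1.0 - f_t)

# Hypothetical values: mean recurrence 100 yr, aperiodicity 0.5,
# 80 yr elapsed since the last eruption, 30-yr forecast window.
p = conditional_probability(80.0, 30.0, 100.0, 0.5)
print(round(p, 3))
```

The time-dependence is the point: unlike a Poisson model, the hazard here changes with the time elapsed since the last event.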
Abstract:
A new Coastal Rapid Environmental Assessment (CREA) strategy has been developed and successfully applied to the Northern Adriatic Sea. The CREA strategy exploits the recent advent of operational oceanography to establish a CREA system based on an operational regional forecasting system and coastal monitoring networks of opportunity. The methodology aims to initialize a coastal high-resolution model, nested within the regional forecasting system, by blending the large-scale parent model fields with the available coastal observations to generate the requisite field estimates. The CREA modeling system consists of a high-resolution, O(800 m), Adriatic SHELF model (ASHELF) implemented in the Northern Adriatic basin and nested within the Adriatic Forecasting System (AFS) (Oddo et al. 2006). The observational system is composed of the coastal networks established in the framework of the ADRICOSM (ADRiatic sea integrated COastal areaS and river basin Management system) Pilot Project. An assimilation technique applies a correction to the initial field provided by AFS on the basis of the available observations. The blending of the two data sets has been carried out through a multi-scale optimal interpolation technique developed by Mariano and Brown (1992). Two weekly CREA exercises have been conducted: the first at the beginning of May (spring experiment), the second in mid-August (summer experiment). The weeks were chosen based on the availability of all coastal observations on the initialization day and one week later, so as to validate model results and verify our predictive skill. The ASHELF spin-up time has also been investigated, through a dedicated experiment, in order to obtain the maximum forecast accuracy within a minimum time.
Energetic evaluations show that, for the Northern Adriatic Sea and for the forcing applied, a spin-up period of one week allows ASHELF to generate the new circulation features enabled by the increased resolution and its total kinetic energy to establish a new dynamical balance. The CREA results, evaluated by means of standard statistics between ASHELF and coastal CTDs, show the improvement derived from the initialization technique and a good model performance in the coastal areas of the Northern Adriatic basin, characterized by a shallow and wide continental shelf subject to substantial freshwater influence from rivers. The results demonstrate the feasibility of our CREA strategy to support coastal zone management and argue for the further establishment of operational coastal monitoring activities to advance it.
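The blending step corrects the parent-model first guess with coastal observations. The thesis uses the multi-scale optimal interpolation of Mariano and Brown (1992); the sketch below shows only the textbook single-observation OI update on a 1-D field, with made-up error statistics, to illustrate the idea:

```python
import math

def oi_update_single_obs(background, grid, obs_value, obs_loc,
                         sigma_b=1.0, sigma_o=0.5, length=10.0):
    # Optimal-interpolation analysis for ONE observation:
    #   x_a(i) = x_b(i) + B(i,o) / (B(o,o) + R) * (y - x_b(o))
    # B: Gaussian background-error covariance, R: observation-error variance.
    def b_cov(xi, xj):
        return sigma_b**2 * math.exp(-((xi - xj) ** 2) / (2.0 * length**2))

    # Background at the observation point (nearest grid node here).
    k = min(range(len(grid)), key=lambda i: abs(grid[i] - obs_loc))
    innovation = obs_value - background[k]
    denom = b_cov(obs_loc, obs_loc) + sigma_o**2
    return [xb + b_cov(xi, obs_loc) / denom * innovation
            for xb, xi in zip(background, grid)]

# Coarse "parent model" temperature field with a single coastal observation.
grid = [float(i) for i in range(0, 50, 5)]       # 10 nodes, hypothetical km axis
background = [15.0] * len(grid)                  # flat first guess
analysis = oi_update_single_obs(background, grid, obs_value=17.0, obs_loc=20.0)
print([round(v, 2) for v in analysis])
```

The correction is largest at the observation point and decays with the assumed correlation length, which is exactly why coastal observations can sharpen a large-scale first guess locally.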
Abstract:
The hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a highly relevant issue, due to the severe human and economic losses that can be caused by flooding or by waters in general. Floods are natural phenomena, often catastrophic, and cannot be avoided, but their damages can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic, and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with residual uncertainty about what will actually happen. This type of uncertainty is what this thesis discusses and analyzes. In operational problems, the ultimate aim of forecasting systems is not to reproduce the river's behavior: that is only a means of reducing the uncertainty about what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to clearly define what is meant by uncertainty, since the literature is often confused on this issue. Therefore, the first objective of this thesis is to clarify this concept, starting with a key question: should the choice of the intervention strategy be based on evaluating the model prediction by its ability to represent reality, or on evaluating what will actually happen on the basis of the information given by the model forecast?
Once the previous idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making objective and realistic risk evaluations possible. In particular, such a tool should provide an uncertainty assessment that is as accurate as possible. This means primarily three things: it must correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. For this reason, it is necessary to quantify the flooding probability within a time horizon related to that required to implement the intervention strategy, and it is also necessary to assess the probability of the flooding time.
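Given an ensemble of forecast hydrographs, the flooding probability within a time horizon, together with the distribution of flooding times, can be estimated by counting exceedances. A minimal sketch on synthetic data (the threshold, horizon, and hydrograph shape are all hypothetical, not the thesis's tool):

```python
import random

random.seed(1)

def flood_probability(members, threshold, horizon):
    """Fraction of ensemble hydrographs exceeding `threshold` within the
    first `horizon` time steps, plus the first-exceedance times."""
    times = []
    for series in members:
        t = next((i for i, level in enumerate(series[:horizon])
                  if level >= threshold), None)
        if t is not None:
            times.append(t)
    return len(times) / len(members), times

# Synthetic 48-h ensemble: a rising limb with a member-specific random peak.
def synthetic_member():
    peak = random.gauss(3.0, 0.6)                  # hypothetical peak level [m]
    return [peak * min(t / 24.0, 1.0) for t in range(48)]

ensemble = [synthetic_member() for _ in range(200)]
p, times = flood_probability(ensemble, threshold=2.5, horizon=36)
print(round(p, 2))        # estimated probability of flooding within 36 h
```

Linking the probability to the time of first exceedance is what lets the decision maker compare it against the lead time needed to implement an intervention.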
Abstract:
The winter storm Lothar crossed Europe on 26 December 1999 and caused unusually severe damage in France, Germany, Switzerland, and Austria. Lothar developed from a diabatic Rossby wave (DRW) and reached hurricane strength only a few hours before hitting the European continent. DRWs exhibit an interesting atmospheric flow pattern. They consist of a positive PV anomaly in the lower troposphere, located in a region with a strong meridional temperature gradient. The positive PV anomaly induces a cyclonic flow, which transports warm air from the south on the eastern side of the PV anomaly. As the warm air ascends, diabatic processes take place that lead to the formation of a new positive PV anomaly in the lower troposphere (PVA). DRWs form independently of PV anomalies at the tropopause. However, if they interact with such anomalies, explosive cyclogenesis can result, as in the case of Lothar. In the first part, the dynamics of a DRW are investigated using the winter storm Lothar as an example, with particular attention to the potential of a DRW for explosive cyclogenesis. In the second part, the occurrence of DRWs in ECMWF forecasts is investigated, highlighting differences between DRWs and other lower-tropospheric PV anomalies. The dynamics of DRWs are studied using an ECMWF Ensemble Prediction System (EPS) run for the winter storm Lothar. The 50 EPS members start on 24 December 1999 at 12 UTC and extend to 26 December 1999 at 12 UTC. Only 16 of the 50 members predict a storm of similar strength to Lothar; 10 members no longer predict a cyclone on 26 December. The strength of the baroclinic zone in which the DRW is embedded is decisive for the intensity of the DRW. Other important parameters are the moisture content of the lower troposphere and the latent heat flux over the ocean.
Those DRWs located within 400 km of the tropopause jet on 25 December at 12 UTC develop into strong cyclones; all others dissipate or remain weak cyclones. Diabatic processes are difficult to represent in weather forecast models, and accordingly difficulties arise in forecasting PVAs. In the operational ECMWF forecasts from June 2004 to May 2005, PVAs over the North Pacific and North Atlantic are identified with a tracking algorithm and classified into five categories. The five categories differ in their frequency, their tracks, and their shape. Twice as many PVAs form over the North Pacific as over the North Atlantic. On average, fewer PVAs are found in winter than in summer. The baroclinicity and the speed of the tropopause jet are particularly high in the vicinity of DRWs. Compared with other PVAs, DRWs show a similar distribution of mean sea-level pressure. DRWs can be forecast about as well as other PVAs.
Abstract:
In the present work, using late blight of potato (Phytophthora infestans) and the Colorado potato beetle (Leptinotarsa decemlineata) as examples, it was investigated whether Geographic Information Systems (GIS) can be used to generate agricultural pest and disease forecasts for any potato field in Germany. To achieve this goal, the input parameters (temperature and relative humidity) of the forecasting models for the two pests (SIMLEP1, SIMPHYT1, SIMPHYT3 and SIMBLIGHT1) were processed so that weather data were available for the whole of Germany. Before interpolation could be carried out, Germany was regionalized into interpolation zones, creating natural regions that allow the weather stations within them to be compared and evaluated. For this purpose, the soil-climate regions of SCHULZKE and KAULE (2000) were modified, adapted to the weather station network, and provided with 5 to 10 km wide buffer zones at the boundaries of the interpolation zones, in order to use the weather stations as often as possible. Multiple regression was chosen for the interpolation of the weather data because, compared with other methods, it showed the smallest deviations between interpolated and measured data and best met the technical requirements. For 99 % of all values, temperature deviations between -2.5 and 2.5 °C were achieved. For relative humidity, deviations between -12 and 10 % relative humidity were achieved. The mean deviations were 0.1 °C for temperature and -1.8 % for relative humidity.
To verify the hit rates of the models when run with interpolated weather data, field survey data from 2000 to 2007 on the first occurrence of late blight and of the Colorado potato beetle were used. With interpolated weather data, the same and even higher hit rates were achieved than with the previous calculation method. For example, the calculation of the first occurrence of P. infestans by the SIMBLIGHT1 model with interpolated weather data yielded deviations that were on average three days smaller than those of the calculations without GIS. To interpret the effects of deviations in temperature and relative humidity, a sensitivity analysis of the forecasting models with respect to temperature and relative humidity was additionally carried out. Temperature had only a small influence on the forecast result in all models. Changes in relative humidity, by contrast, had a considerably stronger effect: for SIMBLIGHT1, the deviation due to an hourly change in relative humidity (± 6 %) was at most 27 days, whereas hourly changes in temperature (± 2 °C) accounted for a deviation of at most 10 days. The results of this work show that using GIS achieves at least the same, and even higher, hit rates in pest forecasting as the previous use of data from a nearby weather station. The results represent a substantial advance for agricultural pest forecasting. For the first time, it is possible to provide nationwide forecasts for any potato field for pest control in agriculture.
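The interpolation step can be illustrated with a small multiple-regression example: temperature regressed on elevation and northing at a handful of stations, then evaluated at an unobserved field. The stations, coefficients, and predictors below are invented for illustration; the regionalized scheme in the thesis is far richer:

```python
def ols(X, y):
    # Ordinary least squares via the normal equations (X'X) b = X'y,
    # solved with Gaussian elimination -- enough for a few predictors.
    n, k = len(X), len(X[0])
    xtx = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)]
           for i in range(k)]
    xty = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]
    for col in range(k):                       # elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):             # back substitution
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, k))) / xtx[r][r]
    return beta

# Hypothetical stations: (elevation m, northing km) -> temperature degC,
# generated from a known lapse-rate relation so the fit is checkable.
stations = [(100, 0), (300, 10), (500, 20), (200, 30), (700, 5), (400, 15)]
temps = [15.0 - 0.0065 * z - 0.02 * yk for z, yk in stations]

X = [[1.0, z, yk] for z, yk in stations]       # intercept, elevation, northing
beta = ols(X, temps)
# Interpolate to an unobserved field at 350 m elevation, northing 12 km.
t_hat = beta[0] + beta[1] * 350 + beta[2] * 12
print(round(t_hat, 2))
```

Because the synthetic data are exactly linear in the predictors, the fit recovers the assumed lapse rate; with real station data the residuals quantify the deviation ranges reported above.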
Abstract:
The main objective of this thesis is the development of a short-term empirical forecasting model able to provide precise and reliable forecasts of hourly electricity consumption in the Italian market. This model summarizes the knowledge acquired and the experience gained during my current work at Romagna Energia S.C.p.A., one of the major Italian players in the energy market. Over the last two decades there have been drastic changes in the structure of the electricity market worldwide. In most industrialized countries, the electricity sector has moved from its original monopoly configuration to a liberalized competitive market, where consumers are free to choose their supplier. Modeling and forecasting the electricity consumption time series have therefore taken on a very important role in the market, both for policy makers and for operators. Building on the existing literature, and exploiting knowledge acquired 'in the field' together with some intuitions, a triangular modeling structure, entirely novel in this area of research, was analyzed and developed, suggested by the physical mechanism through which electricity is produced and consumed over the 24 hours. This triangular scheme can be seen as a particular VARMA model and has a twofold utility, interpretative on the one hand and predictive on the other. New leading indicators linked to meteorological factors are also introduced, with the aim of improving its forecasting performance.
Using the Italian electricity consumption time series from 1 March 2010 to 30 March 2012, the parameters of the proposed forecasting scheme were estimated, and the forecasting results for the period from 1 April 2012 to 30 April 2012 were evaluated by comparing them with those provided by official sources.
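The triangular (recursive) structure means each hour's equation may include earlier hours of the same day as regressors, so the system can be estimated and forecast equation by equation. A toy two-"hour" sketch on simulated data (all dynamics and coefficients are invented, not the thesis's model):

```python
import random

random.seed(2)

def fit2(x1, x2, y):
    # OLS with two regressors (no intercept): solve the 2x2 normal equations.
    s11 = sum(a * a for a in x1)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s22 = sum(b * b for b in x2)
    r1 = sum(a * c for a, c in zip(x1, y))
    r2 = sum(b * c for b, c in zip(x2, y))
    det = s11 * s22 - s12 * s12
    return (r1 * s22 - r2 * s12) / det, (r2 * s11 - r1 * s12) / det

# Simulate a 2-hour toy version of a recursive (triangular) daily system:
# hour 1 depends on hour 1 of the previous day;
# hour 2 depends on hour 2 of the previous day AND hour 1 of the same day.
days = 500
h1, h2 = [100.0], [110.0]
for d in range(1, days):
    h1.append(0.8 * h1[-1] + random.gauss(0, 2))
    h2.append(0.5 * h2[-1] + 0.4 * h1[-1] + random.gauss(0, 2))

# Equation-by-equation estimation, exploiting the triangular structure.
a1 = sum(h1[d - 1] * h1[d] for d in range(1, days)) / \
     sum(h1[d - 1] ** 2 for d in range(1, days))            # hour-1 equation
a2, b = fit2([h2[d - 1] for d in range(1, days)],
             [h1[d] for d in range(1, days)],
             [h2[d] for d in range(1, days)])               # hour-2 equation

# One-day-ahead forecast built recursively: hour 1 first, then hour 2.
f1 = a1 * h1[-1]
f2 = a2 * h2[-1] + b * f1
print(round(f1, 1), round(f2, 1))
```

Because the system is recursive, single-equation OLS is consistent here, which is what makes the triangular scheme cheap to estimate compared with a general VARMA model.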
Abstract:
This thesis is divided in three chapters. In the first chapter we analyse the results of the world forecasting experiment run by the Collaboratory for the Study of Earthquake Predictability (CSEP). We take the opportunity of this experiment to contribute to the definition of a more robust and reliable statistical procedure to evaluate earthquake forecasting models. We first present the models and the target earthquakes to be forecast. Then we explain the consistency and comparison tests that are used in CSEP experiments to evaluate the performance of the models. Introducing a methodology to create ensemble forecasting models, we show that models, when properly combined, almost always perform better than any single model. In the second chapter we discuss in depth one of the basic features of PSHA: the declustering of the seismicity rates. We first introduce the Cornell-McGuire method for PSHA and present the different motivations behind the need to decluster seismic catalogs. Using a theorem of modern probability (Le Cam's theorem), we show that declustering is not necessary to obtain the Poissonian behaviour of the exceedances that is usually considered fundamental to transform exceedance rates into exceedance probabilities in the PSHA framework. We present a method to correct PSHA for declustering, building a more realistic PSHA. In the last chapter we explore the methods that are commonly used to take into account the epistemic uncertainty in PSHA. The most widely used method is the logic tree, which stands at the basis of the most advanced seismic hazard maps. We illustrate the probabilistic structure of the logic tree, and then show that this structure is not adequate to describe the epistemic uncertainty. We then propose a new probabilistic framework based on ensemble modelling that properly accounts for epistemic uncertainties in PSHA.
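The ensemble-building idea of the first chapter (combining forecasting models so that the combination outperforms any single member) can be sketched as a likelihood-weighted average of gridded Poisson rate forecasts. The weighting rule and the toy numbers below are illustrative assumptions, not CSEP's actual scoring rules:

```python
import math

def poisson_log_likelihood(rates, observed):
    # Joint log-likelihood of observed counts under independent Poisson cells.
    return sum(-lam + n * math.log(lam) - math.lgamma(n + 1)
               for lam, n in zip(rates, observed))

def ensemble_forecast(models, observed_past):
    # Weight each model by its likelihood on past target earthquakes
    # (a simple Bayesian-model-averaging flavour of score weighting).
    lls = [poisson_log_likelihood(m, observed_past) for m in models]
    m0 = max(lls)
    weights = [math.exp(ll - m0) for ll in lls]   # subtract max for stability
    total = sum(weights)
    weights = [w / total for w in weights]
    ensemble = [sum(w * m[i] for w, m in zip(weights, models))
                for i in range(len(models[0]))]
    return ensemble, weights

# Two hypothetical gridded rate models (expected events per cell per period).
model_a = [0.5, 1.0, 2.0, 0.2]
model_b = [1.5, 0.3, 0.5, 1.0]
observed = [1, 1, 2, 0]                           # past catalog counts per cell
ens, w = ensemble_forecast([model_a, model_b], observed)
print([round(x, 3) for x in ens], [round(x, 3) for x in w])
```

The ensemble rate in each cell lies between the member rates, pulled toward the model that better explained the past counts.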
Abstract:
This work focuses on the study of saltwater intrusion in coastal aquifers, and in particular on the construction of conceptual schemes to evaluate the associated risk. Saltwater intrusion depends on different natural and anthropic factors, both exhibiting strongly aleatory behaviour, which should be considered for optimal management of the territory and water resources. Given the uncertainty of the problem parameters, the risk associated with salinization needs to be cast in a probabilistic framework. On the basis of a widely adopted sharp-interface formulation, key hydrogeological problem parameters are modeled as random variables, and global sensitivity analysis is used to determine their influence on the position of the saltwater interface. The analyses presented in this work rely on an efficient model reduction technique, based on Polynomial Chaos Expansion, able to provide an accurate description of the model without a large computational burden. When the assumptions of classical analytical models are not respected, as happens several times in applications to real case studies, including the area analyzed in the present work, one can adopt data-driven techniques based on the analysis of the data characterizing the system under study. A model can then be defined on the basis of connections between the system state variables, with only a limited number of assumptions about the "physical" behaviour of the system.
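The sharp-interface formulation mentioned above admits the classical Ghyben-Herzberg relation, and casting the parameters as random variables leads naturally to Monte Carlo risk estimates. The sketch below propagates an uncertain freshwater head through that relation; the thesis instead builds a Polynomial Chaos surrogate, and the distributions and well geometry here are hypothetical:

```python
import random

random.seed(3)

RHO_F, RHO_S = 1000.0, 1025.0           # fresh / salt water densities [kg/m3]
GH = RHO_F / (RHO_S - RHO_F)            # Ghyben-Herzberg factor (= 40 here)

def interface_depth(head):
    # Sharp-interface (Ghyben-Herzberg) depth of the fresh/salt interface
    # below sea level, given the freshwater head above sea level [m].
    return GH * head

def salinization_probability(head_mean, head_sd, well_depth, n=100000):
    # Probability that the interface rises above the well bottom,
    # treating the head as a random variable (hypothetical distribution).
    hits = 0
    for _ in range(n):
        h = max(random.gauss(head_mean, head_sd), 0.0)
        if interface_depth(h) < well_depth:
            hits += 1
    return hits / n

# Hypothetical coastal well: bottom 30 m below sea level;
# head 1.0 +/- 0.3 m (aleatory recharge/pumping variability).
p = salinization_probability(head_mean=1.0, head_sd=0.3, well_depth=30.0)
print(round(p, 3))
```

The 40-to-1 leverage of head on interface depth is what makes the head uncertainty dominate the risk, which is the kind of conclusion a global sensitivity analysis formalizes.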