963 results for "flood forecasting model"


Relevance:

30.00%

Publisher:

Abstract:

"Prepared in partial fulfilment of section 5 of the Flood Control Act of 1945, 615 ILCS 15/5."

Relevance:

30.00%

Publisher:

Abstract:

"December, 1977."

Relevance:

30.00%

Publisher:

Abstract:

As a result of their relative concentration towards the respective Atlantic margins, the silicic eruptives of the Parana (Brazil)-Etendeka large igneous province are disproportionately abundant in the Etendeka of Namibia. The NW Etendeka silicic units, dated at ~132 Ma, occupy the upper stratigraphic levels of the volcanic sequences, are restricted to the coastal zone, and comprise three latites and five quartz latites (QL). The large-volume Fria QL is the only low-Ti type; its trace element and isotopic signatures indicate massive crustal input. The remaining NW Etendeka silicic units are enigmatic high-Ti types, geochemically different from low-Ti types. They exhibit chemical affinities with the temporally overlapping Khumib high-Ti basalt (see Ewart et al., Part 1) and high crystallization temperatures (≥980 to 1120 °C) inferred from augite and pigeonite phenocrysts, both consistent with their evolution from a mafic source. Geochemically, the high-Ti units define three groups, thought to be genetically related. We test whether these represent independent liquid lines of descent from a common high-Ti mafic parent. Although the recognition of latites reduces the apparent silica gap, the large volumes of two QL units pose difficulties for fractional crystallization models. Numerical modelling does, however, support large-scale open-system fractional crystallization, assimilation of silicic to basaltic materials, and magma mixing, but cannot entirely exclude partial melting processes within the temporally active extensional environment. The fractional crystallization and mixing signatures add to the complexity of these enigmatic and controversial silicic magmas. The existence, however, of temporally and spatially overlapping high-Ti basalts is, in our view, not coincidental, and the high-Ti character of the silicic magmas ultimately reflects a mantle signature.

Relevance:

30.00%

Publisher:

Abstract:

The bimodal NW Etendeka province is located at the continental end of the Tristan plume trace in coastal Namibia. It comprises one high-Ti basalt suite (Khumib type) and three low-Ti suites (Tafelberg, Kuidas and Esmeralda types), with, at stratigraphically higher levels, interstratified high-Ti latites (three units) and quartz latites (five units), and one low-Ti quartz latite. Khumib basalts are enriched in high field strength elements and light rare earth elements relative to the low-Ti types and exhibit trace element affinities with Tristan da Cunha lavas. The unradiogenic 206Pb/204Pb ratios of Khumib basalts are distinctive, most plotting to the left of the 132 Ma Geochron, together with elevated 207Pb/204Pb ratios and Sr-Nd isotopic compositions plotting in the lower 143Nd/144Nd part of the mantle array (EM1-like). The low-Ti basalts have less coherent trace element patterns and variable, radiogenic initial Sr (~0.707-0.717) and Pb isotope compositions, implying crustal contamination. Four samples, however, have less radiogenic Pb and Sr that we suggest approximate their uncontaminated source. All basalt types, but particularly the low-Ti types, contain samples with trace element characteristics (e.g. Nb/Nb*) suggesting metasediment input, considered source-related. Radiogenic isotope compositions of these samples require long-term isolation of the source in the mantle and depletions (relative to unmodified sediment) in certain elements (e.g. Cs, Pb, U), which are possibly subduction-related. A geodynamic model is proposed in which the emerging Tristan plume entrained subducted material in the Transition Zone region, and further entrained asthenosphere during plume head expansion. Mixing calculations suggest that the main features of the Etendeka basalt types can be explained without sub-continental lithospheric mantle input. Crustal contamination is evident in most low-Ti basalts, but is distinct from the incorporation of a metasedimentary source component at mantle depths.

Relevance:

30.00%

Publisher:

Abstract:

A framework for developing marketing category management decision support systems (DSS) based upon the Bayesian Vector Autoregressive (BVAR) model is extended. Since the BVAR model is vulnerable to permanent and temporary shifts in purchasing patterns over time, a form that can correct for these shifts while retaining the other advantages of the BVAR is the Bayesian Vector Error-Correction Model (BVECM). We present the mechanics of extending the DSS from a BVAR model to a BVECM for the category management problem. Several additional iterative steps are required in the DSS to allow the decision maker to arrive at the best forecast possible. The revised marketing DSS framework and model-fitting procedures are described, and validation is conducted on a sample problem.
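The error-correction idea behind a VECM can be sketched in a few lines. The sketch below is not the paper's DSS; it is a toy one-step forecast with made-up coefficients, assuming the sales of two related brands are cointegrated so that this week's change in brand-1 sales is pulled back toward a long-run relation y1 = beta * y2.

```python
def vecm_one_step(y1, y2, alpha, beta, gamma1, gamma2):
    """Forecast the next y1 from the last two observations of each series.

    alpha  : speed of adjustment to the long-run relation (error correction)
    beta   : long-run cointegrating coefficient
    gamma1 : short-run effect of the last change in y1
    gamma2 : short-run effect of the last change in y2
    """
    disequilibrium = y1[-1] - beta * y2[-1]   # deviation from the long run
    dy1 = y1[-1] - y1[-2]                     # last change in y1
    dy2 = y2[-1] - y2[-2]                     # last change in y2
    delta = alpha * disequilibrium + gamma1 * dy1 + gamma2 * dy2
    return y1[-1] + delta

# Toy data: brand-1 sales sit above the level implied by brand-2, so the
# error-correction term pulls the forecast back down.
y1 = [100.0, 104.0]
y2 = [50.0, 50.0]
forecast = vecm_one_step(y1, y2, alpha=-0.5, beta=2.0, gamma1=0.1, gamma2=0.0)
print(forecast)  # 104 + (-0.5)*(104 - 2*50) + 0.1*(104-100) = 102.4
```

In the Bayesian version described in the paper, the coefficients would carry prior distributions and be re-estimated as new scanner data arrive, rather than being fixed as here.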

Relevance:

30.00%

Publisher:

Abstract:

In a deregulated electricity market, modeling and forecasting the spot price present a number of challenges. By applying wavelet and support vector machine techniques, a new time series model for short-term electricity price forecasting is developed in this paper. The model employs both historical prices and other important information, such as load capacity and weather (temperature), to forecast the price one or more time steps ahead. The model has been evaluated with actual data from the Australian National Electricity Market. The simulation results demonstrate that the model is capable of forecasting the electricity price with reasonable accuracy.
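The wavelet stage of such a model separates a spiky price series into smooth and detail components that can be forecast separately. A minimal sketch of one level of the (normalised) Haar transform, in pure Python on toy prices; a real implementation would use a wavelet library such as PyWavelets:

```python
import math

def haar_level(x):
    """One level of the Haar wavelet transform for an even-length series.

    Returns approximation (smooth) and detail (spiky) coefficients.
    """
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

prices = [30.0, 32.0, 60.0, 58.0]   # hypothetical half-hourly spot prices
approx, detail = haar_level(prices)
# approx tracks the underlying level; detail isolates short-lived spikes.
print(approx, detail)
```

Each component is then forecast with its own model (an SVM in the paper's case) and the parts are recombined by the inverse transform.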

Relevance:

30.00%

Publisher:

Abstract:

The first part of this thesis deals with the interaction between a flood-detention basin and the underlying aquifer: a flood-control reservoir on the Baganza stream, upstream of the city of Parma, is currently at the design stage. The aim of the intervention is to reduce flood risk by temporarily storing, in an artificial reservoir, the most dangerous part of the flood volume, which would then be released with discharges that can easily be conveyed through the urban reach of the stream. The aquifer was first investigated and monitored, allowing its lithostratigraphic characterization. The stratigraphy can be summarized as a sequence of gravelly-sandy layers with interbedded clay lenses of varying thickness and continuity, distinguishing two different aquifers (one phreatic and one confined). The present study considers only the shallow aquifer, which was modelled numerically with the finite-difference software MODFLOW_2005. The objective of this work is to represent the aquifer system both in its current state (with no structure in place) and in the design configuration. Calibration was carried out under steady-state conditions using the piezometric levels collected at the observation points during spring 2013. Hydraulic conductivity values were estimated with a Bayesian geostatistical approach. The code used for the estimation is bgaPEST, a free software package for the solution of highly parameterized inverse problems, developed following the protocols of the PEST software. The inverse methodology estimates the hydraulic conductivity field by combining observations of the system state (piezometric levels in this case) with prior information on the structure of the unknown parameters.
The inverse procedure requires computing the sensitivity of each observation to each estimated parameter; this was evaluated efficiently using an adjoint-state formulation of the forward code, MODFLOW_2005_Adjoint. The results of the methodology are consistent with the alluvial nature of the investigated aquifer and with the information collected at the observation points. The calibrated model can therefore be used to support the design and management of the detention structure. The second part of this thesis deals with the analysis of the stresses induced by preferential flow paths caused by piping phenomena inside river embankments. Such preferential paths can be due to burrows excavated by wild animals. This study was inspired by the collapse of an embankment of the Secchia River (Modena), which occurred in January 2014 during a flood event in which the water level never reached the embankment crest. The scientific commission, whose final report provides the data used in this study, attributed the collapse, with high probability, to the presence of animal burrows. In order to analyse the behaviour of the embankment both in intact conditions and in conditions modified by the existence of a tunnel crossing it, a 3D numerical model of the embankment was built with the well-known software packages Femwater and Feflow. The models describe seepage inside the embankment, treating the soil in both its saturated and unsaturated portions with the finite-element technique. The burrow was represented by elements with high permeability and porosity, whose values were varied in order to evaluate their influence on flows and water contents.
To assess whether the analysed scenarios lead to the onset of erosion, safety factor values were computed. The safety factor was evaluated in several ways, including the one recently proposed by Richards and Reddy (2014), which is based on a critical kinetic energy criterion. Finally, the model of Bonelli (2007) was used to compute the erosion time and the time remaining before collapse of the embankment.
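The Bayesian principle behind the conductivity inversion can be illustrated in miniature: a prior estimate of the parameter is combined with a head observation through a least-squares (Kalman-style) update, with the sensitivity playing the role computed by the adjoint-state runs. This is a drastically simplified sketch (one parameter, one observation, a linearised forward map, hypothetical numbers), not the bgaPEST algorithm:

```python
def bayes_update(prior_mean, prior_var, obs, obs_var, h, forward):
    """Update a scalar parameter given obs = forward(param) + noise.

    h : sensitivity d(obs)/d(param), e.g. from an adjoint-state run.
    """
    predicted = forward(prior_mean)
    gain = prior_var * h / (h * h * prior_var + obs_var)
    post_mean = prior_mean + gain * (obs - predicted)
    post_var = (1.0 - gain * h) * prior_var
    return post_mean, post_var

# Hypothetical linearised response: head falls as log-conductivity rises.
forward = lambda logk: 120.0 - 10.0 * logk      # piezometric head [m]
post_mean, post_var = bayes_update(
    prior_mean=3.0, prior_var=0.25,             # prior on log10 K
    obs=88.0, obs_var=1.0,                      # measured head and its variance
    h=-10.0, forward=forward)
print(post_mean, post_var)  # observation pulls log10 K up; variance shrinks
```

bgaPEST performs the analogous update for a whole conductivity field, with a geostatistical prior covariance in place of the single prior variance used here.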

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a technique for forecasting forward electricity/gas prices one day ahead. The technique combines a Kalman filter (KF) with a generalised autoregressive conditional heteroskedasticity (GARCH) model (often used in financial forecasting). The GARCH model is used to compute the next value of the time series, and the KF updates the parameters of the GARCH model whenever a new observation becomes available. The technique is applied to real data from the UK energy markets to evaluate its performance. The results show that forecasting accuracy is improved significantly by this hybrid model. The methodology can also be applied to forecasting market clearing prices and electricity/gas loads.
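At the core of the GARCH side of such a hybrid is the GARCH(1,1) variance recursion. A pure-Python sketch with made-up coefficients (in the paper's setting, omega, alpha and beta would be re-estimated by the Kalman filter as each new observation arrives):

```python
def garch11_variances(returns, omega, alpha, beta, var0):
    """Conditional variances: var_t = omega + alpha*r_{t-1}^2 + beta*var_{t-1}."""
    variances = [var0]
    for r in returns[:-1]:
        variances.append(omega + alpha * r * r + beta * variances[-1])
    return variances

price_changes = [0.0, 2.0, -1.0, 0.5]    # hypothetical daily forward-price moves
vars_ = garch11_variances(price_changes, omega=0.1, alpha=0.2, beta=0.7, var0=1.0)
print(vars_)  # the large shock (2.0) raises the next day's conditional variance
```

The recursion makes volatility clustering explicit: a big price move today widens tomorrow's predictive interval, which is exactly the behaviour energy prices exhibit.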

Relevance:

30.00%

Publisher:

Abstract:

This thesis is a study of three techniques for improving the performance of some standard forecasting models, applied to energy demand and prices. We focus on forecasting demand and price one day ahead. First, the wavelet transform was used as a pre-processing procedure with two approaches: multicomponent forecasts and direct forecasts. We compared these approaches empirically and found that the former consistently outperformed the latter. Second, adaptive models were introduced to continuously update model parameters in the testing period by combining filters with standard forecasting methods. Among these adaptive models, the adaptive LR-GARCH model was proposed for the first time in the thesis. Third, with regard to the noise distributions of the dependent variables in the forecasting models, we used either Gaussian or Student-t distributions. The thesis proposes a novel algorithm to infer the parameters of Student-t noise models. The method extends earlier work for models that are linear in their parameters to the non-linear multilayer perceptron, and therefore broadens the range of models that can use a Student-t noise distribution. Because these techniques cannot stand alone, they must be combined with prediction models to improve performance. We combined them with some standard forecasting models: multilayer perceptron, radial basis functions, linear regression, and linear regression with GARCH. These techniques and forecasting models were applied to two datasets from the UK energy markets: daily electricity demand (which is stationary) and gas forward prices (non-stationary). The results showed that these techniques provide good improvements in prediction performance.
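Why a Student-t noise model helps can be seen in a toy setting: under a t likelihood each observation is effectively down-weighted by w_i = (nu + 1) / (nu + r_i^2 / s2), so gross outliers lose influence. The sketch below is an iteratively re-weighted mean estimate with assumed nu and scale, not the thesis's multilayer-perceptron algorithm:

```python
def t_weighted_mean(xs, nu=3.0, s2=1.0, iters=20):
    """Location estimate under Student-t noise via iterative re-weighting."""
    mu = sum(xs) / len(xs)                  # start from the ordinary mean
    for _ in range(iters):
        w = [(nu + 1.0) / (nu + (x - mu) ** 2 / s2) for x in xs]
        mu = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return mu

data = [1.0, 1.2, 0.9, 1.1, 10.0]           # one gross outlier, e.g. a price spike
print(sum(data) / len(data))                # ordinary mean dragged up to 2.84
print(t_weighted_mean(data))                # robust mean stays near the inlier cluster
```

The same re-weighting idea, applied to network residuals, is what lets a forecasting model with t-distributed noise shrug off occasional price spikes that would distort a Gaussian fit.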

Relevance:

30.00%

Publisher:

Abstract:

The generation of very short range forecasts of precipitation in the 0-6 h time window is traditionally referred to as nowcasting. Most existing nowcasting systems essentially extrapolate radar observations in some manner; however, very few systems account for the uncertainties involved. Deterministic forecasts are therefore produced, which are of limited use when decisions must be made, since they carry no measure of confidence or spread. This paper develops a Bayesian state space modelling framework for quantitative precipitation nowcasting that is probabilistic from conception. The model treats the observations (radar) as noisy realisations of the underlying true precipitation process, recognising that this process can never be completely known and must therefore be represented probabilistically. In the model presented here the dynamics of the precipitation are dominated by advection, so this is a probabilistic extrapolation forecast. The model is designed to minimise the computational burden while maintaining a full, joint representation of the probability density function of the precipitation process. The update and evolution equations avoid the need to sample, so only one model needs to be run, as opposed to the more traditional ensemble route. It is shown that the model works well on both simulated and real data, but that further work is required before it can be used operationally.
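The advect-and-inflate idea can be shown in one dimension: the mean rain field is shifted downwind by a known velocity, and the forecast variance grows with lead time to express increasing uncertainty. This toy sketch captures only that idea, not the paper's full state-space model (made-up field, periodic boundary, additive variance inflation q):

```python
def advect_forecast(field, var, cells_per_step, q):
    """One forecast step: shift the mean field downwind, inflate variance by q."""
    n = len(field)
    mean = [field[(i - cells_per_step) % n] for i in range(n)]  # periodic shift
    return mean, [v + q for v in var]

rain = [0.0, 5.0, 2.0, 0.0, 0.0, 0.0]    # mm/h in six cells along the wind
var = [0.5] * 6
mean1, var1 = advect_forecast(rain, var, cells_per_step=1, q=0.3)
print(mean1)   # the rain band has moved one cell downwind
print(var1)    # every cell is now more uncertain
```

In the full model the update step would then assimilate the next radar scan against this evolved distribution, Kalman-style, without any ensemble sampling.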

Relevance:

30.00%

Publisher:

Abstract:

The aim of this research was to improve the quantitative support for project planning and control, principally through more accurate forecasting, for which new techniques were developed. The study arose from the observation that most construction project forecasts were based on a methodology (c. 1980) that relied on the DHSS cumulative cubic cost model and network-based risk analysis (PERT). The former, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified: growth curve forecasting, risk analysis, and the interface of these quantitative techniques with project management. These fields formed the basis of the research programme. To give the research a sound basis, industrial support was sought; this resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate the practical implementation. The outcome of the research project was deemed successful in both theory and practice. The new forecasting theory was shown to give major reductions in projection errors, and the integration of the new predictive and risk analysis technologies with management principles allowed the development of a viable software management aid that fills an acknowledged gap in current technology.
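Growth-curve forecasting of this kind fits an S-shaped cumulative-cost curve to early progress data and reads future spend off the curve. As a generic illustration only (a logistic curve with hypothetical parameters, not the DHSS cubic model the study criticises nor the thesis's new technique):

```python
import math

def logistic_cum_cost(t, total, midpoint, steepness):
    """Cumulative cost at time fraction t in [0, 1] on a logistic S-curve."""
    return total / (1.0 + math.exp(-steepness * (t - midpoint)))

total = 1_000_000.0                      # hypothetical contract value
for t in (0.25, 0.5, 0.75):
    print(round(logistic_cum_cost(t, total, midpoint=0.5, steepness=10.0)))
```

Spend is slow at the start, fastest at mid-contract, and tails off towards completion; in practice the curve parameters would be re-estimated as each month's cost data arrive.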

Relevance:

30.00%

Publisher:

Abstract:

Working within the framework of the branch of Linguistics known as discourse analysis, and more specifically within the current approach of genre analysis, this thesis presents an analysis of the English of economic forecasting. The language of economic forecasting is highly specialised and follows certain conventions of structure and style. This research project identifies these characteristics and explains them in terms of their communicative function. The work is based on a corpus of texts published in economic reports and surveys by major corporate bodies. These documents are targeted at an international expert readership familiar with this genre. The data is analysed at two broad levels: firstly, the macro-level of text structure, which is described in terms of schema-theory, a currently influential model of analysis, and, secondly, the micro-level of authors' strategies for modulating the predictions which form the key move in the forecasting schema. The thesis aims to contribute to the newly developing field of genre analysis in a number of ways: firstly, by a coverage of a hitherto neglected but intrinsically interesting and important genre (Economic Forecasting); secondly, by testing the applicability of existing models of analysis at the level of schematic structure and proposing a genre-specific model; thirdly, by offering insights into the nature of modulation of propositions, which is often broadly classified as 'hedging' or 'modality', and which has recently been described as 'an area for prolonged fieldwork'. This phenomenon is shown to be a key feature of this particular genre. It is suggested that this thesis, in addition to its contribution to the theory of genre analysis, provides a useful basis for work by teachers of English for Economics, an important area of English for Specific Purposes.

Relevance:

30.00%

Publisher:

Abstract:

This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early-to-mid-2000s. We explore a wide range of definitions of money, including different methods of aggregation and different collections of included monetary assets. In our forecasting experiment we use two non-linear techniques, namely recurrent neural networks and kernel recursive least squares regression, techniques that are new to macroeconomics. Recurrent neural networks operate with potentially unbounded input memory, while the kernel regression technique is a finite memory predictor. The two methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation.

Relevance:

30.00%

Publisher:

Abstract:

This paper compares the UK/US exchange rate forecasting performance of linear and nonlinear models based on monetary fundamentals with that of a random walk (RW) model. Structural breaks are identified and taken into account. The exchange rate forecasting framework is also used to assess the relative merits of the official Simple Sum and the weighted Divisia measures of money. Overall, there are four main findings. First, the majority of the models with fundamentals are able to beat the RW model in forecasting the UK/US exchange rate. Second, the most accurate forecasts of the UK/US exchange rate are obtained with a nonlinear model. Third, taking structural breaks into account reveals that the Divisia aggregate performs better than its Simple Sum counterpart. Finally, Divisia-based models provide more accurate forecasts than Simple Sum-based models, provided they are constructed within a nonlinear framework.
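The random walk benchmark such studies must beat is trivially simple: the forecast of next period's exchange rate is this period's rate. A sketch on hypothetical monthly rates, with RMSE as the yardstick:

```python
import math

def rw_forecasts(series):
    """One-step-ahead random walk forecasts for series[1:]."""
    return series[:-1]

def rmse(actual, predicted):
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

rates = [1.50, 1.52, 1.49, 1.55, 1.53]   # hypothetical UK/US exchange rates
print(round(rmse(rates[1:], rw_forecasts(rates)), 4))
# A fundamentals-based model is judged useful only if its RMSE beats this.
```

Because exchange rates are close to martingales, this naive benchmark is notoriously hard to beat, which is why outperforming it (as most of the paper's fundamentals-based models do) is a substantive finding.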

Relevance:

30.00%

Publisher:

Abstract:

In this paper we investigate whether consideration of store-level heterogeneity in marketing mix effects improves the accuracy of the marketing mix elasticities, fit, and forecasting accuracy of the widely applied SCAN*PRO model of store sales. Models with continuous and discrete representations of heterogeneity, estimated using hierarchical Bayes (HB) and finite mixture (FM) techniques, respectively, are empirically compared to the original model, which does not account for store-level heterogeneity in marketing mix effects and is estimated using ordinary least squares (OLS). The empirical comparisons are conducted in two contexts: Dutch store-level scanner data for the shampoo product category, and an extensive simulation experiment. The simulation investigates how between- and within-segment variance in marketing mix effects, error variance, the number of weeks of data, and the number of stores impact the accuracy of marketing mix elasticities, model fit, and forecasting accuracy. Contrary to expectations, accommodating store-level heterogeneity does not improve the accuracy of marketing mix elasticities relative to the homogeneous SCAN*PRO model, suggesting that little may be lost by employing the original homogeneous SCAN*PRO model estimated using ordinary least squares. Improvements in fit and forecasting accuracy are also fairly modest. We pursue an explanation for this result, since research in other contexts has shown clear advantages from assuming some type of heterogeneity in market response models. In an Afterthought section, we comment on the controversial nature of our result, distinguishing factors inherent to household-level data and associated models vs. general store-level data and associated models vs. the unique SCAN*PRO model specification.
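SCAN*PRO-style store-sales models are multiplicative, so a constant price elasticity appears as the slope of a log-log regression of sales on a price index. A minimal sketch with toy data constructed so the elasticity is exactly -2 (this illustrates the elasticity concept only, not the full SCAN*PRO specification with promotion dummies and store effects):

```python
import math

def loglog_slope(x, y):
    """OLS slope of log(y) on log(x): the constant elasticity."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    mx = sum(lx) / len(lx)
    my = sum(ly) / len(ly)
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

price_index = [0.8, 0.9, 1.0, 1.1]                # actual / regular price
sales = [1000.0 * p ** -2 for p in price_index]   # built-in elasticity of -2
print(round(loglog_slope(price_index, sales), 6))
```

The paper's question is whether letting this slope vary by store (via HB or FM estimation) recovers elasticities more accurately than a single pooled OLS slope; its answer, surprisingly, is that it largely does not.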