963 results for flood forecasting model
Abstract:
Mathematical relationships between Scoring Parameters can be used in Economic Scoring Formulas (ESF) in tendering to distribute the score among bidders in the economic part of a proposal. Each contracting authority must set an ESF when publishing tender specifications, and the strategy of each bidder will differ depending on the ESF selected and its weight in the overall proposal scoring. This paper introduces the various mathematical relationships and density distributions that describe and inter-relate not only the main Scoring Parameters but also the main Forecasting Parameters in any capped tender (one whose price is upper-limited). Forecasting Parameters, as variables that can be known before the deadline of a tender is reached, together with Scoring Parameters constitute the basis of a future Bid Tender Forecasting Model.
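As an illustration of how an ESF distributes the economic score in a capped tender, the sketch below implements a hypothetical linear formula; the formula, cap, weight and bid prices are invented for this example, not taken from the paper. Full weight goes to the lowest bid and zero to a bid at the price cap.

```python
# Hypothetical linear Economic Scoring Formula (ESF) for a capped tender.
# All numbers are illustrative.

def linear_esf(prices, cap, weight):
    """Score each bid as weight * (cap - price) / (cap - best price)."""
    best = min(prices)
    return [weight * (cap - p) / (cap - best) for p in prices]

# Three bids under a cap of 100, with 50 points assigned to the economic part.
print(linear_esf([80.0, 90.0, 100.0], cap=100.0, weight=50.0))  # → [50.0, 25.0, 0.0]
```

A bidder's strategy then reduces to guessing where the lowest bid will land, which is exactly what the Forecasting Parameters aim to anticipate.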
Abstract:
This thesis consists of a summary and five self-contained papers addressing the dynamics of firms in the Swedish wholesale trade sector. Paper [1] focuses on the determinants of new firm formation in the Swedish wholesale trade sector, using two definitions of firms' relevant markets: markets defined as administrative areas, and markets based on cost-minimizing behavior of retailers. The paper shows that newly entering firms tend to avoid regions with an already high concentration of other firms in the same branch of wholesaling, while right-of-center local government and quality of infrastructure have positive impacts on the entry of new firms. The signs of the estimated coefficients remain the same regardless of which definition of the relevant market is used, while the sizes of the coefficients are generally higher when relevant markets are delineated under the cost-minimizing assumption of retailers. Paper [2] analyses the determinants of firm relocation, distinguishing between the role of factors in in-migration municipalities and out-migration municipalities. The results of the analysis indicate that firm-specific factors, such as profits, age and size of the firm, are negatively related to the firm's decision to relocate. Furthermore, firms seem to avoid municipalities with an already high concentration of firms operating in the same industrial branch of wholesaling, and also to be more reluctant to leave municipalities governed by right-of-center parties. Lastly, firms seem to avoid moving to municipalities characterized by high population density. Paper [3] addresses the determinants of firm growth, adopting OLS and a quantile regression technique. The results of this paper indicate that very little of firm growth can be explained by the firm-, industry- and region-specific factors controlled for in the estimated models. Instead, firm growth seems to be driven by internal characteristics of firms, factors difficult to capture in conventional statistics.
This result supports Penrose's (1959) suggestion that internal resources such as firm culture, brand loyalty, entrepreneurial skills, and so on, are important determinants of firm growth rates. Paper [4] formulates a forecasting model for firm entry into local markets and tests this model using data from the Swedish wholesale industry. The empirical analysis is based on directly estimating the profit function of wholesale firms and identifying low- and high-return local markets. The results indicate that 19 of 30 estimated models show more net entry in high-return municipalities, but the estimated parameter is statistically significant at conventional levels in only one of our estimated models, and then with an unexpected negative sign. Paper [5] studies the effects of relocation on the profits of relocating firms, employing difference-in-difference propensity score matching. Using propensity score matching, the pre-relocation differences between relocating and non-relocating firms are balanced, while the difference-in-difference estimator controls for all time-invariant unobserved heterogeneity among firms. The results suggest that firms that relocate increase their profits significantly, in comparison to what their profits would have been had they not relocated. This effect is estimated to vary between 3 and 11 percentage points, depending on the length of the analyzed period.
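The estimator of Paper [5] can be sketched as follows: match each relocating firm to the non-relocating firm with the closest propensity score, then average the difference-in-differences of profits over the matched pairs. The firm records, scores and profit figures below are invented toy data, and the matching is a bare nearest-neighbor version of the technique.

```python
# Sketch of difference-in-difference propensity score matching.
# All records are invented; real applications use estimated scores
# and many matching refinements (calipers, replacement rules, etc.).

def nearest_control(score, controls):
    """Return the control firm whose propensity score is closest."""
    return min(controls, key=lambda c: abs(c["score"] - score))

def did_estimate(treated, controls):
    """Average, over matched pairs, of
    (treated post - pre) minus (matched control post - pre)."""
    effects = []
    for t in treated:
        c = nearest_control(t["score"], controls)
        effects.append((t["post"] - t["pre"]) - (c["post"] - c["pre"]))
    return sum(effects) / len(effects)

# Toy profit margins before and after the relocation period.
treated = [{"score": 0.62, "pre": 8.0, "post": 14.0},
           {"score": 0.52, "pre": 6.0, "post": 11.0}]
controls = [{"score": 0.60, "pre": 8.5, "post": 10.5},
            {"score": 0.50, "pre": 5.5, "post": 7.5}]

print(did_estimate(treated, controls))  # → 3.5
```

Matching on the score balances pre-relocation differences; differencing twice removes time-invariant unobserved heterogeneity, which is exactly the combination the abstract describes.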
Abstract:
Applying microeconomic theory, we develop a forecasting model for firm entry into local markets and test this model using data from the Swedish wholesale industry. The empirical analysis is based on directly estimating the profit function of wholesale firms. As in previous entry studies, profits are assumed to depend on firm- and location-specific factors, and the profit equation is estimated using panel data econometric techniques. Using the residuals from the profit equation estimations, we identify local markets in Sweden where firm profits are abnormally high given the level of all independent variables included in the profit function. From microeconomic theory, we then know that these local markets should have higher net entry than other markets, all else being equal, and we investigate this in a second step, also using a panel data econometric model. The results of estimating the net-entry equation indicate that four of five estimated models have more net entry in high-return municipalities, but the estimated parameter is only statistically significant at conventional levels in one of our estimated models.
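A minimal sketch of the two-step idea, assuming a single regressor and invented numbers: fit the profit equation by OLS, then flag markets whose residual profit is abnormally high (here, simply above a small threshold; the abstract's panel techniques are reduced to a cross-section for illustration).

```python
# Two-step sketch: profit equation residuals identify high-return markets.
# Data and threshold are invented.
import statistics

def ols_line(x, y):
    """Simple OLS fit of y = a + b*x, returning (a, b)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Step 1: toy profit equation, profit explained by market size.
size = [1.0, 2.0, 3.0, 4.0]
profit = [2.1, 3.9, 6.5, 7.9]
a, b = ols_line(size, profit)

# Step 2: markets with abnormally high residual profit.
residuals = [yi - (a + b * xi) for xi, yi in zip(size, profit)]
high_return = [i for i, r in enumerate(residuals) if r > 0.1]
print(high_return)  # → [2]
```

The flagged markets would then enter the net-entry equation as the "high-return" indicator tested in the second step.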
Abstract:
Flood forecasting systems can be properly used when the lead time is sufficient compared with the time needed for preventive or corrective actions. The reliability and accuracy of the forecasts are also fundamentally important. Flood level forecasts are always approximations, and confidence intervals are not always applicable, especially under high degrees of uncertainty, which produce very wide confidence intervals. Such intervals are problematic in the presence of very high or very low river levels. In this study, flood level forecasts are produced both in the traditional numerical form and in the form of categories, for which a rule-based expert system with fuzzy inference is used. Methodologies and computational procedures for learning, simulation and querying are devised and then implemented as an application (SELF – Sistema Especialista com uso de Lógica "Fuzzy"), for research and operational purposes. Comparisons between fuzzy expert systems and linear empirical models, in terms of their use for forecasting, reveal a strong analogy despite their fundamental theoretical differences. The methodologies are applied to forecasting in the Camaquã river basin (15,543 km²), for lead times between 10 and 48 hours. Practical obstacles to application are identified, and the resulting solutions constitute advances in knowledge and technique. Forecasts in both numerical and categorical form are produced successfully with the new tools. The forecasts are evaluated and compared using a new set of statistics, derived from the simultaneous frequencies of occurrence of observed and predicted values in the same category during simulation. The effects of varying the gauging network density are analyzed, showing that real-time rainfall-streamflow forecasting systems are feasible, even with a small number of rain gauging stations, for forecasts in the form of fuzzy categories.
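A minimal sketch of categorical flood-level forecasting with fuzzy sets, in the spirit of the rule-based system described above; the triangular membership functions, their breakpoints and the stage values are invented, and the actual SELF rule base is far richer than a single argmax over memberships.

```python
# Fuzzy categorization sketch: assign a forecast stage value to the
# category with the highest membership. Breakpoints are invented.

def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

CATEGORIES = {"low": (-1.0, 1.0, 3.0),
              "medium": (2.0, 4.0, 6.0),
              "high": (5.0, 7.0, 9.0)}

def classify(stage):
    """Return the category with the highest membership for a stage value."""
    return max(CATEGORIES, key=lambda name: tri(stage, *CATEGORIES[name]))

print(classify(4.5))  # → medium
```

Categorical output of this kind is what makes forecasts usable when confidence intervals around a numerical level would be too wide to act on.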
Abstract:
This study presents a model for forecasting the price and traded volume in the transoceanic iron ore market. To that end, a VAR model was developed using, in addition to the endogenous variables with one lag of difference, the Brent oil price and an industrial production index. After unit root tests showed that none of the variables was stationary, a cointegration test confirmed a stationary long-run relationship among them, ruling out the possibility of a spurious regression. As a result, the VAR approach yielded a consistent model with a close fit for forecasting the price and traded volume of iron ore in the transoceanic market, despite some short-term imprecision.
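A stripped-down illustration of the VAR mechanics described above: a two-variable VAR(1) fitted by least squares, without intercept, exogenous regressors (such as the Brent price or the industrial production index) or cointegration testing. The data are synthetic, generated from a known coefficient matrix so the fit can be checked.

```python
# Two-variable VAR(1) sketch: y_t = A @ y_{t-1}, fitted by least squares.
# Synthetic data only; a real application adds intercepts, exogenous
# variables and stationarity/cointegration diagnostics.
import numpy as np

def fit_var1(Y):
    """Fit y_t = A @ y_{t-1}; Y has one row per period."""
    B, *_ = np.linalg.lstsq(Y[:-1], Y[1:], rcond=None)
    return B.T

def forecast(A, last, steps):
    """Iterate the fitted system forward from the last observation."""
    path = []
    for _ in range(steps):
        last = A @ last
        path.append(last)
    return np.array(path)

# Generate a short series from a known coefficient matrix.
A_true = np.array([[0.5, 0.1], [0.2, 0.3]])
Y = [np.array([1.0, 2.0])]
for _ in range(9):
    Y.append(A_true @ Y[-1])
Y = np.array(Y)

A_hat = fit_var1(Y)                 # recovers A_true up to rounding
f = forecast(A_hat, Y[-1], steps=2)  # two-step-ahead path
```

Each forecast step feeds the previous forecast back in, which is why VAR forecasts degrade gracefully at short horizons but accumulate error further out, as the abstract's short-term imprecision suggests.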
Abstract:
This thesis elaborates the creation of value in private equity and in particular analyzes value creation in 3G Capital's acquisition of Burger King. To this end, a specific model is applied that decomposes value creation into several drivers, in order to answer the question of how value creation can be addressed in private equity investments. Although previous research by Achleitner et al. (2010) introduced a specific model that addresses value creation in private equity, that model was neither applied to an individual company nor linked to indirect drivers that explain the dynamics and rationales behind the creation of value. In turn, this paper applies the quantitative model to an ongoing private equity investment and provides several extensions that turn it into a better forecasting model for ongoing investments, instead of only analyzing, from an ex post perspective, a deal that has already been divested. The chosen research approach is a case study of the Burger King buyout that includes, first, an extensive review of the current status of the academic literature; second, a quantitative calculation and qualitative interpretation of different direct value drivers; third, a qualitative breakdown of indirect drivers; and lastly a recapitulating discussion of value creation and value drivers. Presenting a very successful private equity investment and demonstrating in detail the dynamics and mechanisms that drive value creation in this case provides important implications for other private equity firms, as well as public firms, seeking to develop their own approach to value creation.
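The kind of driver decomposition such a quantitative model performs can be illustrated with the classic equity value bridge, which splits value creation into EBITDA growth, multiple expansion and net-debt reduction. This is a generic sketch with invented deal figures, not the Achleitner et al. (2010) model itself and not Burger King's actual numbers.

```python
# Generic equity value bridge: equity = multiple * EBITDA - net debt.
# The three driver terms sum exactly to the total change in equity value.

def value_bridge(at_entry, at_exit):
    """Split equity value creation into the classic direct drivers."""
    eq0 = at_entry["multiple"] * at_entry["ebitda"] - at_entry["net_debt"]
    eq1 = at_exit["multiple"] * at_exit["ebitda"] - at_exit["net_debt"]
    return {
        "total": eq1 - eq0,
        "ebitda_growth": at_entry["multiple"] * (at_exit["ebitda"] - at_entry["ebitda"]),
        "multiple_expansion": (at_exit["multiple"] - at_entry["multiple"]) * at_exit["ebitda"],
        "debt_reduction": at_entry["net_debt"] - at_exit["net_debt"],
    }

# Invented deal figures (currency units in millions).
bridge = value_bridge({"multiple": 8.0, "ebitda": 100.0, "net_debt": 500.0},
                      {"multiple": 9.0, "ebitda": 150.0, "net_debt": 300.0})
print(bridge)
```

The indirect drivers discussed in the thesis (operational improvements, governance, and so on) would then explain *why* each of these direct terms moved.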
Abstract:
In this work, a study was carried out mapping incidence areas and producing forecasts for dengue cases in the urban area of Belém. For the forecasts, dengue incidence was related to rainfall through statistical models based on the Box-Jenkins time series methodology. The study period was 5 years (2007-2011). The research uses multivariate time series methods with a transfer function, and spatial models, analyzing the existence of spatial autocorrelation in the variable under study. The analysis of dengue incidence and rainfall data showed that increases in the number of dengue cases accompany increases in rainfall, demonstrating a direct relationship between the two over the years studied. The forecasting model built for dengue incidence showed a good fit with satisfactory results and can therefore be used for dengue forecasting. As for the spatial analysis, it was possible to visualize case incidence in the urban area of Belém, with the respective incidence areas and significance levels in percentages. Over the period studied, the behavior and variation of dengue cases were observed, with four neighborhoods standing out: Marco, Guamá, Pedreira and Tapanã, with possible influence on neighboring areas. The study thus contributes to the planning of dengue control actions by serving as a decision-support instrument in public health.
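One ingredient of identifying a transfer function model is choosing the lag at which the input series (rainfall) leads the output series (dengue cases). The sketch below does this with plain cross-correlation on invented series; it is not the model estimated in the study, where identification follows the full Box-Jenkins procedure.

```python
# Lag identification sketch: find the lag at which rainfall is most
# correlated with dengue cases. Both series are invented toy data.
import statistics

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def best_lag(x, y, max_lag):
    """Lag L maximizing corr(x_{t-L}, y_t)."""
    return max(range(max_lag + 1),
               key=lambda L: pearson(x[:len(x) - L] if L else x, y[L:]))

rain = [1, 3, 2, 5, 4, 6, 5, 7]
cases = [0, 0, 1, 3, 2, 5, 4, 6]   # here cases simply echo rain two steps later

print(best_lag(rain, cases, 3))  # → 2
```

The selected lag then fixes the delay term of the transfer function, after which the noise model is fitted to the remaining autocorrelation.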
Abstract:
Considering the high competitiveness of the industrial chemical sector, demand forecasting is a relevant factor in decision-making, and there is a need for tools capable of assisting in the analysis and definition of the forecast. The objective here is to generate the chemical industry forecast using an advanced forecasting model and thus verify the accuracy of the method. Because the data form a time series with seasonality, a seasonal autoregressive integrated moving average (SARIMA) model generated reliable forecasts well suited to the problem analyzed, enabling, after validation against real data, improvements in supply chain management and decision-making.
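Two building blocks of a SARIMA model with seasonal differencing (D = 1) can be sketched as follows; the toy demand series and its period are invented, and a full SARIMA fit would add AR and MA terms (seasonal and non-seasonal) on top of this backbone.

```python
# SARIMA building blocks on a toy demand series with period s = 3.

def seasonal_difference(y, s):
    """The D = 1 seasonal differencing step SARIMA applies internally."""
    return [y[i] - y[i - s] for i in range(s, len(y))]

def seasonal_naive_forecast(y, s, h):
    """Repeat the last observed seasonal cycle h steps ahead."""
    return [y[len(y) - s + (i % s)] for i in range(h)]

demand = [1, 2, 3, 2, 3, 4, 3, 4, 5]          # rising seasonal pattern
print(seasonal_difference(demand, 3))          # → [1, 1, 1, 1, 1, 1]
print(seasonal_naive_forecast(demand, 3, 3))   # → [3, 4, 5]
```

That the seasonally differenced series is flat here is the point: once seasonality is removed, the remaining structure is what the ARMA terms of a SARIMA model are fitted to.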
Abstract:
The objective of this work was to apply and provide a preliminary evaluation of the performance of the Weather Research and Forecasting model coupled with Chemistry (WRF-Chem) for the Londrina region. We compared the output with measurements obtained at meteorological stations. The model was configured to run with three domains at 27, 9 and 3 km grid resolution, using the ndown program; a simulation was also performed with a single domain using a land use file based on a classified MODIS image of the region. The emission files supplying the chemistry run were generated following Martins et al. (2012). The RADM2 chemical mechanism and the MADE/SORGAM modal aerosol models were used in the simulations. The results show that the model was able to represent coherently the formation and dispersion of pollution in the Metropolitan Region of Londrina, as well as the importance of using a land use file appropriate for the region.
Abstract:
In this work, potato late blight (Phytophthora infestans) and the Colorado potato beetle (Leptinotarsa decemlineata) were used as examples to investigate whether Geographic Information Systems (GIS) can be used to generate agricultural pest and disease forecasts for any potato field in Germany. To achieve this goal, the input parameters (temperature and relative humidity) of the forecasting models for the two pests (SIMLEP1, SIMPHYT1, SIMPHYT3 and SIMBLIGHT1) were processed so that weather data were available for the whole of Germany. Before interpolation, however, Germany was regionalized into interpolation zones, creating natural regions within which the weather stations can be compared and evaluated. For this purpose, the soil-climate regions of SCHULZKE and KAULE (2000) were modified, adapted to the weather station network and given 5 to 10 km wide buffer zones at the borders of the interpolation zones, so that the weather stations could be used as often as possible. Multiple regression was chosen for interpolating the weather data because, compared with other methods, it showed the smallest deviations between interpolated and measured data and best met the technical requirements. For 99% of all values, temperature deviations fell between -2.5 and 2.5 °C. For relative humidity, deviations between -12 and 10% relative humidity were achieved. The mean deviations were 0.1 °C for temperature and -1.8% for relative humidity.
To verify the hit rates of the models when run with interpolated weather data, field survey data from 2000 to 2007 on the first appearance of late blight and of the Colorado potato beetle were used. With interpolated weather data, the same or even higher hit rates were achieved than with the previous calculation method. For example, calculating the first appearance of P. infestans with the SIMBLIGHT1 model using interpolated weather data gave, on average, deviations three days smaller than calculations without GIS. To interpret the effects of deviations in temperature and relative humidity, a sensitivity analysis of the forecasting models with respect to these two inputs was also carried out. Temperature had only a small influence on the forecast results in all models; changes in relative humidity had a considerably stronger effect. For SIMBLIGHT1, an hourly change in relative humidity (± 6%) caused a deviation of up to 27 days, whereas hourly changes in temperature (± 2 °C) caused a deviation of at most 10 days. The results of this work show that using GIS yields at least the same, and in some cases higher, hit rates for pest forecasts than the previous practice of using data from a nearby weather station. The results represent substantial progress for agricultural pest forecasting. For the first time, it is possible to provide nationwide forecasts for any potato field for pest control in agriculture.
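The multiple-regression interpolation step can be sketched as fitting a linear model of temperature on station covariates and evaluating it at an ungauged location. The station table below is a toy generated from an exact linear law, so the regression recovers it; the thesis uses real stations and its own predictor set within each interpolation zone.

```python
# Multiple-regression interpolation sketch: temperature modelled as a
# linear function of elevation and easting. Station data are invented.
import numpy as np

elev = np.array([100.0, 300.0, 500.0, 250.0])   # m
east = np.array([500.0, 200.0, 800.0, 400.0])   # km (toy coordinate)
temp = 20.0 - 0.006 * elev + 0.002 * east       # °C, exact linear law

# Fit coefficients of temp ~ 1 + elev + east by least squares.
X = np.column_stack([np.ones_like(elev), elev, east])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)

def interpolate(elevation, easting):
    """Predict temperature at an ungauged location from the fitted plane."""
    return coef @ np.array([1.0, elevation, easting])

print(round(float(interpolate(400.0, 600.0)), 3))  # → 18.8
```

In the thesis the fitted surface is evaluated on a grid covering each interpolation zone, which is what makes field-specific model runs possible anywhere in Germany.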
Abstract:
The main objective of this thesis is the development of a short-term empirical forecasting model capable of delivering accurate and reliable hourly forecasts of electricity consumption in the Italian market. The model draws on the knowledge and experience gained during my current work at Romagna Energia S.C.p.A., one of the major Italian players in the energy market. Over the last two decades there have been drastic changes to the structure of the electricity market worldwide. In most industrialized countries the electricity sector has moved from its original monopoly structure to a liberalized competitive market, where consumers are free to choose their supplier. Modelling and forecasting the electricity consumption time series have therefore taken on a very important role in the market, both for policy makers and for operators. Building on the existing literature, on knowledge acquired in the field and on some intuitions, a triangular modelling structure, entirely new in this area of research, was analyzed and developed, suggested by the physical mechanism through which electricity is produced and consumed over the 24 hours of the day. This triangular scheme can be seen as a particular VARMA model and is doubly useful, for interpreting the phenomenon on the one hand and for forecasting on the other. New leading indicators linked to meteorological factors are also introduced, with the aim of improving the model's forecasting performance.
Using the Italian electricity consumption series from 1 March 2010 to 30 March 2012, the parameters of the proposed forecasting scheme were estimated, and the forecasts for the period from 1 April 2012 to 30 April 2012 were evaluated against those provided by official sources.
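The triangular idea can be sketched as a recursion in which each hour's forecast is a baseline plus a weighted sum of the already-forecast earlier hours of the same day, i.e. a lower-triangular system solved by forward substitution. The coefficients and baselines below are invented; the thesis estimates them from the consumption series.

```python
# Triangular forecasting sketch: hour h depends only on hours 0..h-1.
# coeffs[h] holds the (invented) weights on the earlier hours.

def triangular_forecast(coeffs, base):
    """Forward substitution through a lower-triangular hourly system."""
    y = []
    for h in range(len(base)):
        y.append(base[h] + sum(coeffs[h][k] * y[k] for k in range(h)))
    return y

# Three toy "hours": the first is a pure baseline, later hours inherit
# part of the earlier load.
hours = triangular_forecast([[], [0.5], [0.2, 1.0]], [10.0, 2.0, 1.0])
print(hours)  # → [10.0, 7.0, 10.0]
```

Because hour h never depends on later hours, the whole 24-hour profile can be forecast in a single forward pass, which is the structural property that lets the scheme be read as a particular VARMA model.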
Abstract:
The magnitudes of the largest known floods of the River Rhine in Basel since 1268 were assessed using a hydraulic model drawing on a set of pre-instrumental evidence and daily hydrological measurements from 1808. The pre-instrumental evidence, consisting of flood marks and documentary data describing extreme events with the customary reference to specific landmarks, was "calibrated" by comparing it with the instrumental series for the overlapping period between the two categories of evidence (1808–1900). Summer (JJA) floods were particularly frequent in the century from 1651 to 1750, when precipitation was also high. Severe winter (DJF) floods have not occurred since the late 19th century despite a significant increase in winter precipitation. Six catastrophic events involving a runoff greater than 6000 m³ s⁻¹ are documented prior to 1700. They were initiated by spells of torrential rainfall of up to 72 h (1480 event) and preceded by long periods of substantial precipitation that saturated the soils, and/or by abundant snowmelt. All except two (1999 and 2007) of the 43 identified severe events (SEs: defined as having runoff > 5000 and < 6000 m³ s⁻¹) occurred prior to 1877. Not a single SE is documented from 1877 to 1998. The intermediate 121-year-long "flood disaster gap" is unique over the period since 1268. The effect of river regulations (1714 for the River Kander; 1877 for the River Aare) and the building of reservoirs in the 20th century upon peak runoff were investigated using a one-dimensional hydraulic flood-routing model. Results show that anthropogenic effects only partially account for the "flood disaster gap", suggesting that variations in climate should also be taken into account in explaining these features.
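The event classes used above reduce to a small threshold classifier; the thresholds (runoff in m³ s⁻¹) are the ones quoted in the abstract, while the label for sub-severe events is ours.

```python
# Flood event classification by peak runoff (m³/s), per the abstract's
# definitions: catastrophic > 6000, severe between 5000 and 6000.

def classify_flood(runoff):
    if runoff > 6000:
        return "catastrophic"
    if runoff > 5000:
        return "severe"
    return "ordinary"   # label for anything below the severe threshold

print(classify_flood(6500))  # → catastrophic
```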
Abstract:
This thesis presents a methodological contribution to the problem of how to operate a hydropower reservoir during floods in order to achieve optimal management under a multiobjective and stochastic approach. A methodology is proposed to assess flood control strategies in a multiobjective and probabilistic framework. Additionally, a dynamic flood control environment was developed for real-time operation, including forecasts. This dynamic platform combines simulation and optimization models. These tools may assist dam managers in the decision-making process regarding the most appropriate reservoir operation to implement. After a detailed review of the bibliography, it was observed that most of the existing studies on flood control reservoir operation consider a reduced number of hydrographs to characterize the reservoir inflows. Consequently, the adequate functioning of a given strategy may be limited to similar hydrologic scenarios. On the other hand, most of the works in this context tackle the problem of multipurpose flood control operation over the entire flood season, lasting several months. These considerations differ from the real needs of the Spanish context. The implementation of real-time reservoir operation is gaining popularity due to computational advances and improvements in real-time data management. The methodology proposed in this thesis for assessing the strategies is based on determining their behavior over a wide range of floods representative of the hydrological forcing of the dam. An evaluation algorithm is combined with a stochastic flood generation system to obtain an implicitly stochastic analysis framework. The evaluation system consists of three stages: characterization, synthesis and comparison, in order to handle the complex structure of results and, finally, conduct the evaluation process. In the first stage, characterization variables are defined.
These variables should be related to the different aspects to be evaluated (such as dam safety, flood protection, hydropower, etc.). Each of these variables characterizes the behavior of a given operating strategy for a given aspect and event. In the second stage this information is synthesized into a reduced group of indicators or objective functions. Finally, the indicators are compared by means of an aggregated approach or a dominance-criterion approach. In the first case, a single optimal solution may be achieved; in the second, a set of good solutions is obtained. This methodology was applied to calibrate the parameters of a flood control model and to compare it with another operating policy, using the aggregated method. After that, the methodology was extended to assess and compare some existing hydropower reservoir flood control operating rules, using the Pareto approach. The versatility of the method allows many other applications, such as determining safety levels or choosing the spillway dimensions among several alternatives. The dynamic framework for flood control combines optimization and simulation models, exploiting the advantages of both techniques and facilitating the interaction between dam operators and the model. Improvements over a reactive operating policy are obtained when applying this system, even when the forecasts deviate significantly from the observed hydrograph. This approach contributes to reducing the gap between theoretical development in the field of reservoir management and its practical applications.
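The dominance-criterion comparison stage can be sketched as a Pareto filter over the indicator vectors of candidate operating strategies. Here all indicators are taken as minimized, and the strategy scores are invented.

```python
# Pareto dominance filter over strategy indicator vectors.
# All objectives minimized; scores are invented toy values.

def dominates(a, b):
    """a dominates b: no worse in every indicator, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep the points not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Toy (spill risk, energy loss) scores for four operating strategies.
strategies = [(1, 5), (2, 3), (3, 4), (4, 1)]
print(pareto_front(strategies))  # → [(1, 5), (2, 3), (4, 1)]
```

The surviving set is the "set of good solutions" the text describes: no strategy in it can be improved on one indicator without worsening another, so the final choice among them is left to the dam manager.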