932 results for Hydrological forecasting
Abstract:
Yield management helps hotels manage the capacity of their rooms more profitably. Hotels tend to have two types of business: transient and group. Yield management research and systems have been designed for transient business, in which the group forecast is taken as a given. In this research, forecast data from approximately 90 hotels of a large North American hotel chain were used to determine the accuracy of group forecasts and to identify factors associated with accurate forecasts. Forecasts showed a positive bias and had a mean absolute percentage error (MAPE) of 40% two months before arrival, 30% one month before arrival, and 10-15% on the day of arrival. Larger hotels, hotels with a higher dependence on group business, and hotels that updated their forecasts frequently during the month before arrival had more accurate forecasts.
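The two error measures quoted above can be made concrete with a small sketch; the functions and the sample forecast/actual figures below are illustrative, not taken from the study:

```python
# Minimal sketch of the two error measures used above: mean absolute
# percentage error (MAPE) and forecast bias. All figures are hypothetical.

def mape(forecasts, actuals):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(f - a) / a for f, a in zip(forecasts, actuals)) / len(forecasts)

def mean_bias(forecasts, actuals):
    """Positive value = forecasts exceed actuals on average (over-forecasting)."""
    return sum(f - a for f, a in zip(forecasts, actuals)) / len(forecasts)

# Hypothetical group-room forecasts vs. realized rooms for five events.
forecast = [120, 80, 150, 60, 100]
actual   = [100, 75, 110, 55, 105]

print(round(mape(forecast, actual), 1))       # MAPE = 15.4 (%)
print(round(mean_bias(forecast, actual), 1))  # bias = +13.0 rooms
```

A positive mean bias on this scale is what the study reports: hotels tend to over-forecast group rooms, and the MAPE shrinks as the arrival date approaches.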
Abstract:
Oil palm has increasingly been established on peatlands throughout Indonesia. One of the concerns is that the drainage required for cultivating oil palm in peatlands leads to soil subsidence, potentially increasing future flood risks. This study analyzes the hydrological and economic effects of oil palm production in a peat landscape in Central Kalimantan. We examine two land use scenarios, one involving conversion of the complete landscape, including a large peat area, to oil palm plantations, and another involving mixed land use comprising oil palm plantations, jelutung (jungle rubber, Dyera spp.) plantations, and natural forest. The hydrological effect was analyzed through flood risk modeling using a high-resolution digital elevation model. For the economic analysis, we analyzed four ecosystem services: oil palm production, jelutung production, carbon sequestration, and orangutan habitat. This study shows that after 100 years, in the oil palm scenario, about 67% of peat in the study area will be subject to regular flooding. The flood-prone area will be unsuitable for oil palm and other crops requiring drained soils. The oil palm scenario is the most profitable only in the short term and only when the externalities of oil palm production, i.e., the costs of CO2 emissions, are not considered. In the examined scenarios, the social costs of carbon emissions exceed the private benefits from oil palm plantations in peat. Depending upon the local hydrology, income from jelutung, which can sustainably be grown in undrained conditions and does not lead to soil subsidence, outweighs that from oil palm after several decades. These findings illustrate the trade-offs faced at present in Indonesian peatland management and point to economic advantages of an approach that involves expansion of oil palm on mineral lands while conserving natural peat forests and using degraded peat for crops that do not require drainage.
Abstract:
This dissertation contains four essays that share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of the volatility and correlations of financial assets. The first two chapters provide useful tools for univariate applications, while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on the basis of a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains related to the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is a joint work with David Veredas.
We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data generating processes, with and without microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient and easy alternative for measuring integrated covariances from noisy and asynchronous prices. Along these lines, a minimum variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
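The minimum variance portfolio applications mentioned above rest on the standard closed-form weights w = S^-1 1 / (1' S^-1 1). A minimal sketch, ignoring the short-selling constraints and transaction costs the authors include, with an illustrative covariance matrix:

```python
import numpy as np

# Sketch of the global minimum variance portfolio used to evaluate a
# covariance estimate: w = S^{-1} 1 / (1' S^{-1} 1). This is the
# unconstrained textbook solution; the 3x3 covariance matrix below is
# illustrative, not an estimate from the dissertation's data.

def min_variance_weights(cov):
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)   # S^{-1} 1 without explicit inversion
    return w / w.sum()               # normalize so weights sum to 1

cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
w = min_variance_weights(cov)
print(w, float(w @ cov @ w))         # weights and resulting portfolio variance
```

A better covariance estimator (here, the disentangled or Realized Beta GARCH one) shows up in this exercise as lower realized variance of the resulting portfolio.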
Abstract:
How rainfall infiltration rate and soil hydrological characteristics develop over time under forests of different ages in temperate regions is poorly understood. In this study, infiltration rate and soil hydrological characteristics were investigated under forests of different ages and under grassland. Soil hydraulic characteristics were measured at different scales under a 250-year-old grazed grassland (GL), 6-year-old (6yr) and 48-year-old (48yr) Scots pine (Pinus sylvestris) plantations, remnant 300-year-old individual Scots pines (OT) and a 4000-year-old Caledonian Forest (AF). In situ field-saturated hydraulic conductivity (Kfs) was measured, and visible root:soil area was estimated from soil pits. Macroporosity, pore structure and macropore connectivity were estimated from X-ray tomography of soil cores and from water-release characteristics. At all scales, the median values for Kfs, root fraction, macroporosity and connectivity tended to follow the order AF > OT > 48yr > GL > 6yr, indicating that infiltration rates and water storage increased with forest age. The remnant Caledonian Forest had a very wide range of Kfs (12 to >4922 mm h-1), with maximum Kfs values 7 to 15 times larger than those of the 48-year-old Scots pine plantation, suggesting that undisturbed old forests, with high rainfall and minimal evapotranspiration in winter, may act as important areas for water storage and as sinks for storm rainfall to infiltrate and be transported to deeper soil layers via preferential flow. The importance of the development of soil hydrological characteristics under forests of different ages is discussed.
Abstract:
The strategy of the mobile networks market (the focus of this work) is based on consolidating the installed infrastructure and optimizing existing resources. The increasing competitiveness and aggressiveness of this market require mobile operators to continuously maintain and update their networks in order to minimize failures and provide the best experience for their subscribers. In this context, this dissertation presents a study aiming to help mobile operators improve future network modifications. In overview, it compares several forecasting methods (mostly based on time series analysis) capable of supporting mobile operators in their network planning. Moreover, it presents several network indicators for the most common bottlenecks.
Abstract:
This empirical study compares the ability of unrestricted vector autoregressive (VAR) models to forecast the term structure of interest rates in Colombia. Simple VAR models are compared with VAR models augmented with Colombian and U.S. macroeconomic and financial factors. We find that including information on oil prices, Colombia's credit risk and an international indicator of risk aversion improves the out-of-sample forecasting ability of unrestricted VAR models for short-term maturities at a monthly frequency. For medium- and long-term maturities, models without macroeconomic variables produce better forecasts, suggesting that medium- and long-term yield curves already incorporate all the information relevant for forecasting them. This finding has important implications for portfolio managers, market participants and policy makers.
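The unrestricted VAR forecasting exercise can be sketched in a few lines; the simulated three-variable system below is only a stand-in for the Colombian yield and macro series used in the study:

```python
import numpy as np

# Minimal sketch of an unrestricted VAR(1) fitted by least squares:
#   y_t = c + A y_{t-1} + e_t
# The three simulated variables are illustrative stand-ins (e.g. a short
# yield, a long yield, and a macro factor); data and A_true are invented.

rng = np.random.default_rng(0)
T, k = 200, 3
A_true = np.array([[0.5, 0.1, 0.0],
                   [0.0, 0.4, 0.1],
                   [0.1, 0.0, 0.3]])
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + 0.1 * rng.standard_normal(k)

# Stack regressors: a constant plus the lagged vector.
X = np.hstack([np.ones((T - 1, 1)), y[:-1]])
Y = y[1:]
B, *_ = np.linalg.lstsq(X, Y, rcond=None)   # B is (k+1) x k: [c; A']
c_hat, A_hat = B[0], B[1:].T

# One-step-ahead forecast from the last observation.
forecast = c_hat + A_hat @ y[-1]
print(forecast)
```

In the study's setup, the "augmented" models simply add macro and financial factors as extra components of y_t, and out-of-sample accuracy is compared across maturities.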
Abstract:
Italy and its urban systems are exposed to high seismic and hydrogeological risks. The role of human activities in the genesis of disasters is well established in the scientific debate, as is the role of urban and regional planning in reducing risks. The paper reviews the state of major Italian cities with respect to hydrogeological and seismic risk by: 1) extrapolating data and maps on seismic hazard and landslide risk for cities with more than 50,000 inhabitants and metropolitan contexts, and 2) outlining how risk reduction is framed in the Italian planning system (at national and regional levels). The analysis of available data and the review of the normative framework highlight the existing gaps in addressing risk reduction: despite wide knowledge of the natural risks afflicting the Italian territory and an articulated regulatory framework, the available data on risks are not exhaustive, and risk-reduction policies and multidisciplinary proactive approaches are only partially fostered and applied.
Abstract:
The ability to estimate the impact of ongoing climate change on the hydrological behavior of hydrosystems is a necessity for anticipating the inevitable and necessary adaptations our societies must consider. In this context, this doctoral project presents a study evaluating the sensitivity of future hydrological projections to: (i) the non-robustness of the identification of hydrological model parameters, (ii) the use of several equifinal parameter sets, and (iii) the use of different hydrological model structures. To quantify the impact of the first source of uncertainty on model outputs, four climatically contrasted sub-periods are first identified within the observed records. The models are calibrated on each of these four periods, and the resulting outputs are analyzed in calibration and validation following the four configurations of the Differential Split-Sample Test (Klemeš, 1986; Wilby, 2005; Seiller et al., 2012; Refsgaard et al., 2014). To study the second source of uncertainty, the equifinality of parameter sets is then taken into account by considering, for each calibration type, the outputs associated with equifinal parameter sets. Finally, to assess the third source of uncertainty, five hydrological models of different levels of complexity (GR4J, MORDOR, HSAMI, SWAT and HYDROTEL) are applied to the Au Saumon River watershed in Quebec. The three sources of uncertainty are evaluated both under past observed climate conditions and under future climate conditions. The results show that, given the evaluation method followed in this doctorate, the use of hydrological models of different levels of complexity is the main source of variability in streamflow projections under future climate conditions.
This is followed by the lack of robustness of parameter identification. Hydrological projections generated by an ensemble of equifinal parameter sets are close to those associated with the optimal parameter set. Consequently, more effort should be invested in improving model robustness for climate change impact studies, in particular by developing more appropriate model structures and by proposing calibration procedures that increase their robustness. This work provides a detailed answer regarding our ability to diagnose the impacts of climate change on the water resources of the Au Saumon watershed, and proposes an original methodological approach that can be directly applied or adapted to other hydro-climatic contexts.
Abstract:
Modern hydrological challenges, whether in forecasting or related to climate change, force the exploration of new modeling approaches in order to fill current gaps and improve the assessment of uncertainties. The approach addressed in this thesis is the multimodel (MM). The innovation lies in the construction of the multimodel presented in this study: rather than calibrating models individually and using their combination, a collective calibration is performed on the average of the 12 selected lumped conceptual models. One of the challenges raised by this novel approach is the large number of parameters (82), which complicates calibration and use, in addition to causing potential equifinality problems. The solution proposed in this thesis is a sensitivity analysis that fixes the parameters with little influence and thus reduces the total number of parameters to calibrate. An optimization procedure with calibration and validation then evaluates the performance of the multimodel and of its reduced version, in addition to improving its understanding. The sensitivity analysis is carried out with the Morris method, which yields a 51-parameter version of the MM (MM51) that performs as well as the original 82-parameter MM while reducing potential equifinality problems. The calibration and validation results of the MM under the Split-Sample Test (SST) are compared with the 12 individually calibrated models. This analysis shows that the individual models composing the MM perform worse than when calibrated independently. This drop in individual performance, necessary to obtain good overall MM performance, is accompanied by an increase in the diversity of the MM model outputs. The latter is particularly required for hydrological applications that need an assessment of uncertainties.
All these results lead to a better understanding of the multimodel and to its optimization, which facilitates not only its calibration but also its potential use in an operational context.
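The Morris screening step described above can be sketched as follows; this is a simplified one-at-a-time elementary-effects design (not the full trajectory scheme), with a toy three-parameter function standing in for a hydrological model:

```python
import numpy as np

# Sketch of Morris-style elementary-effects screening: parameters with a
# small mu* (mean absolute elementary effect) are candidates for fixing,
# which shrinks the calibration problem (here 82 -> 51 parameters in the
# study). The toy "model" below is illustrative only.

def model(x):
    # Toy response: x[0] and x[1] matter, x[2] is nearly inert.
    return x[0] ** 2 + 2.0 * x[1] + 0.001 * x[2]

def morris_mu_star(f, dim, n_traj=50, delta=0.1, seed=0):
    rng = np.random.default_rng(seed)
    effects = np.zeros((n_traj, dim))
    for t in range(n_traj):
        x = rng.uniform(0.0, 1.0 - delta, size=dim)   # random base point
        for i in range(dim):                          # one-at-a-time steps
            x_step = x.copy()
            x_step[i] += delta
            effects[t, i] = (f(x_step) - f(x)) / delta
    return np.abs(effects).mean(axis=0)               # mu* per parameter

mu_star = morris_mu_star(model, dim=3)
print(mu_star)  # large for the first two parameters, tiny for the third
```

Ranking parameters by mu* and fixing the tail of the ranking at nominal values is exactly the reduction logic that takes the MM from 82 to 51 calibrated parameters.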
Abstract:
Anthropogenic activities and land-based inputs into the sea may influence the trophic structure and functioning of coastal and continental shelf ecosystems, despite the numerous opportunities and services the latter offer to humans and wildlife. In addition, hydrological structures and physical dynamics potentially influence the sources of organic matter (e.g., terrestrial versus marine, or fresh material versus detrital material) entering marine food webs. Understanding the significance of the processes that influence marine food webs and ecosystems (e.g., terrestrial inputs, physical dynamics) is crucially important because trophic dynamics are a vital part of ecosystem integrity. This can be achieved by identifying organic matter sources that enter food webs along inshore–offshore transects. We hypothesised that regional hydrological structures over wide continental shelves directly control the benthic trophic functioning across the shelf. We investigated this issue along two transects in the northern ecosystem of the Bay of Biscay (north-eastern Atlantic). Carbon and nitrogen stable isotope analysis (SIA) and fatty acid analysis (FAA) were conducted on different complementary ecosystem compartments that include suspended particulate organic matter (POM), sedimentary organic matter (SOM), and benthic consumers such as bivalves, large crustaceans and demersal fish. Samples were collected from inshore shallow waters (at ∼1 m in depth) to more than 200 m in depth on the offshore shelf break. Results indicated strong discrepancies in stable isotope (SI) and fatty acid (FA) compositions in the sampled compartments between inshore and offshore areas, although nitrogen SI (δ15N) and FA trends were similar along both transects. Offshore the influence of a permanently stratified area (described previously as a “cold pool”) was evident in both transects. 
The influence of this hydrological structure on benthic trophic functioning (i.e., on the food sources available to consumers) was especially apparent across the northern transect, due to unusual carbon isotope compositions (δ13C) in the compartments. At stations under the cold pool, the SI and FA compositions of organisms indicated benthic trophic functioning based on a microbial food web, including a significant contribution from heterotrophic planktonic organisms and/or from SOM. In contrast, inshore and shelf-break areas were characterised by a microalgae-based food web (at least in part for the shelf-break area, where the slope current and upwelling can favour fresh primary production sinking on site). SIA and FAA proved relevant and complementary tools, and consumers proved better medium- to long-term integrators of the system than POM samples, for depicting trophic functioning and dynamics along inshore-offshore transects over continental shelves.
Abstract:
Three types of forecasts of the total Australian production of macadamia nuts (t nut-in-shell) have been produced early each year since 2001. The first is a long-term forecast, based on the expected production from the tree census data held by the Australian Macadamia Society, suitably scaled up for missing data and assumed new plantings each year. These long-term forecasts range out to 10 years into the future and form a basis for industry and market planning. Secondly, a statistical adjustment (termed the climate-adjusted forecast) is made annually for the coming crop. As the name suggests, climatic influences are the dominant factors in this adjustment process; however, other terms such as bienniality of bearing, prices and orchard ageing are also incorporated. Thirdly, industry personnel are surveyed early each year, with their estimates integrated into a growers-and-pest-scouts forecast. Initially conducted on a whole-country basis, these models are now constructed separately for the six main production regions of Australia and combined for national totals. Ensembles or suites of step-forward regression models using biologically relevant variables have been the major statistical method adopted; however, developing methodologies such as nearest-neighbour techniques, general additive models and random forests are continually being evaluated in parallel. The overall error rates average 14% for the climate forecasts and 12% for the growers' forecasts. These compare with 7.8% for USDA almond forecasts (based on extensive early-crop sampling) and 6.8% for coconut forecasts in Sri Lanka. However, our somewhat disappointing results were mainly due to a series of poor crops attributed to human factors, which have now been factored into the models. Notably, the 2012 and 2013 forecasts averaged 7.8% and 4.9% errors, respectively. Future models should also show continuing improvement as more data-years become available.
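The step-forward regression models mentioned above can be sketched as greedy forward selection on residual sum of squares; the predictors and data below are synthetic, not the study's climate variables:

```python
import numpy as np

# Sketch of step-forward (greedy forward-selection) regression: at each
# step, add the candidate predictor that most reduces the residual sum
# of squares of an OLS fit. The data are synthetic; in the study the
# candidates would be biologically relevant climate terms, bienniality,
# prices, orchard age, etc.

def forward_select(X, y, n_keep):
    chosen = []
    for _ in range(n_keep):
        best_j, best_rss = None, np.inf
        for j in range(X.shape[1]):
            if j in chosen:
                continue
            cols = chosen + [j]
            Z = np.hstack([np.ones((len(y), 1)), X[:, cols]])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            rss = float(((y - Z @ beta) ** 2).sum())
            if rss < best_rss:
                best_j, best_rss = j, rss
        chosen.append(best_j)
    return chosen

rng = np.random.default_rng(1)
X = rng.standard_normal((120, 6))       # 6 candidate predictors
y = 3.0 * X[:, 2] - 2.0 * X[:, 4] + 0.2 * rng.standard_normal(120)
print(forward_select(X, y, n_keep=2))   # picks columns 2 and 4
```

Running such selections over many bootstrap or region-specific samples yields the "ensembles or suites" of models the abstract describes.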