859 results for Hydrological forecasting.
Abstract:
How rainfall infiltration rate and soil hydrological characteristics develop over time under forests of different ages in temperate regions is poorly understood. In this study, infiltration rate and soil hydrological characteristics were investigated under forests of different ages and under grassland. Soil hydraulic characteristics were measured at different scales under a 250-year-old grazed grassland (GL), 6-year-old (6yr) and 48-year-old (48yr) Scots pine (Pinus sylvestris) plantations, remnant 300-year-old individual Scots pine (OT) and a 4000-year-old Caledonian Forest (AF). In situ field-saturated hydraulic conductivity (Kfs) was measured, and visible root:soil area was estimated from soil pits. Macroporosity, pore structure and macropore connectivity were estimated from X-ray tomography of soil cores and from water-release characteristics. At all scales, the median values for Kfs, root fraction, macroporosity and connectivity tended to follow the order AF > OT > 48yr > GL > 6yr, indicating that infiltration rates and water storage increased with forest age. The remnant Caledonian Forest had a very wide range of Kfs (12 to >4922 mm h⁻¹), with maximum Kfs values 7 to 15 times larger than those of the 48-year-old Scots pine plantation, suggesting that undisturbed old forests, with high rainfall and minimal evapotranspiration in winter, may act as important areas for water storage and as sinks that allow storm rainfall to infiltrate and travel to deeper soil layers via preferential flow. The importance of the development of soil hydrological characteristics under forests of different ages is discussed.
Abstract:
The strategy of the mobile networks market (the focus of this work) is based on consolidating the installed infrastructure and optimising already existing resources. The increasing competitiveness and aggressiveness of this market require mobile operators to continuously maintain and update their networks in order to minimise failures and provide the best experience to their subscribers. In this context, this dissertation presents a study aimed at helping mobile operators improve future network modifications. In overview, this dissertation compares several forecasting methods (mostly based on time-series analysis) capable of supporting mobile operators in their network planning. Moreover, it presents several network indicators for the most common bottlenecks.
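The dissertation does not enumerate the individual methods here; as a minimal illustration of the time-series flavour of such forecasting, a simple exponential smoothing forecast of a hypothetical network-load indicator (function name, smoothing factor and traffic values are all illustrative) might look like:

```python
def ses_forecast(series, alpha=0.5):
    """Simple exponential smoothing: one-step-ahead forecast.

    alpha is the smoothing factor (0 < alpha <= 1); 0.5 is an
    illustrative choice, not a value from the dissertation.
    """
    level = series[0]
    for x in series[1:]:
        # New level blends the latest observation with the old level.
        level = alpha * x + (1 - alpha) * level
    return level

# Hypothetical weekly traffic counts for one cell (illustrative data)
traffic = [100, 110, 105, 120, 130]
print(ses_forecast(traffic))  # forecast for the next week
```

In practice an operator would fit `alpha` by minimising historical forecast error per indicator, then flag cells whose forecast approaches a capacity bottleneck.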
Abstract:
This empirical study compares the ability of unrestricted vector autoregressive (VAR) models to forecast the term structure of interest rates in Colombia. Simple VAR models are compared with VAR models augmented with Colombian and US macroeconomic and financial factors. We find that including information on oil prices, Colombia's credit risk and an international indicator of risk aversion improves the out-of-sample forecasting ability of unrestricted VAR models for short-term maturities at monthly frequency. For medium- and long-term maturities, the models without macroeconomic variables produce better forecasts, suggesting that the medium- and long-term yield curves already incorporate all the information relevant for forecasting them. This finding has important implications for portfolio managers, market participants and policy makers.
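The out-of-sample comparison underlying such a study is typically an error-ratio test against a benchmark; a minimal sketch, using RMSE and purely illustrative yield numbers (not data from the study), could be:

```python
import math

def rmse(forecasts, actuals):
    """Root mean squared forecast error."""
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecasts, actuals))
                     / len(actuals))

# Hypothetical 1-month-ahead short-rate forecasts (illustrative numbers)
actual = [5.0, 5.2, 5.1, 5.4]
var_fc = [5.1, 5.1, 5.2, 5.3]   # augmented-VAR forecasts
rw_fc  = [4.9, 5.0, 5.2, 5.1]   # random-walk benchmark forecasts

# A ratio below 1 means the augmented VAR beats the benchmark.
ratio = rmse(var_fc, actual) / rmse(rw_fc, actual)
print(ratio < 1)
```

A full evaluation would compute such ratios per maturity and test their significance, which is where the short-term versus long-term contrast in the abstract emerges.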
Abstract:
Italy and its urban systems are exposed to high seismic and hydrogeological risks. The role of human activities in the genesis of disasters is well established in the scientific debate, as is the role of urban and regional planning in reducing risks. The paper reviews the state of major Italian cities with respect to hydrogeological and seismic risk by: 1) extracting data and maps on seismic hazard and landslide risk for cities with more than 50,000 inhabitants and for metropolitan contexts, and 2) outlining how risk reduction is framed in the Italian planning system (at national and regional levels). The analysis of available data and the review of the normative framework highlight existing gaps in addressing risk reduction: despite wide knowledge of the natural risks affecting the Italian territory and an articulated regulatory framework, the available data on risks are not exhaustive, and risk-reduction policies and multidisciplinary proactive approaches are only partially fostered and applied.
Abstract:
unpublished
Abstract:
Being able to estimate the impact of ongoing climate change on the hydrological behaviour of hydrosystems is a necessity for anticipating the inevitable and necessary adaptations our societies must consider. In this context, this doctoral project presents a study of the sensitivity of future hydrological projections to: (i) the non-robustness of hydrological model parameter identification, (ii) the use of several equifinal parameter sets, and (iii) the use of different hydrological model structures. To quantify the impact of the first source of uncertainty on model outputs, four climatically contrasted sub-periods are first identified within the observed records. The models are calibrated on each of these four periods, and the resulting outputs are analysed in calibration and in validation following the four configurations of the differential split-sample test (Klemeš, 1986; Wilby, 2005; Seiller et al., 2012; Refsgaard et al., 2014). To study the second source of uncertainty, the equifinality of parameter sets is then taken into account by considering, for each type of calibration, the outputs associated with equifinal parameter sets. Finally, to assess the third source of uncertainty, five hydrological models of different levels of complexity (GR4J, MORDOR, HSAMI, SWAT and HYDROTEL) are applied to the Au Saumon catchment in Quebec. The three sources of uncertainty are evaluated both under past observed climatic conditions and under future climatic conditions. The results show that, given the evaluation method followed in this doctorate, the use of hydrological models of different levels of complexity is the main source of variability in streamflow projections under future climatic conditions.
This is followed by the lack of robustness of parameter identification. Hydrological projections generated by an ensemble of equifinal parameter sets are close to those associated with the optimal parameter set. Consequently, more effort should be invested in improving model robustness for climate change impact studies, in particular by developing more appropriate model structures and by proposing calibration procedures that increase their robustness. This work provides a detailed answer regarding our capacity to diagnose the impacts of climate change on the water resources of the Au Saumon catchment, and proposes an original methodological approach that can be directly applied or adapted to other hydro-climatic contexts.
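The differential split-sample scheme described above (calibrate on one climatically contrasted sub-period, validate on the others) can be sketched as a simple loop; the period labels, placeholder calibration routine and Nash-Sutcliffe score below are illustrative scaffolding, not the thesis code:

```python
def nse(sim, obs):
    """Nash-Sutcliffe efficiency, a standard hydrological skill score."""
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den

periods = ["dry1", "dry2", "wet1", "wet2"]  # hypothetical sub-periods

def calibrate(period):
    # Placeholder: a real implementation would optimise the model's
    # parameters against the observed flows of `period`.
    return {"calibrated_on": period}

# Each calibration is validated on every other sub-period, so the
# robustness of the parameters to a climate contrast is exposed.
tests = [(cal, val) for cal in periods for val in periods if val != cal]
print(len(tests))  # 12 calibration/validation pairs for 4 sub-periods
```

Scoring each pair with `nse(simulated, observed)` would then show how much performance degrades when the validation climate differs from the calibration climate.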
Abstract:
Modern hydrological challenges, in forecasting and in relation to climate change, force the exploration of new modelling approaches in order to fill current gaps and improve the assessment of uncertainties. The approach addressed in this thesis is the multimodel (MM). The innovation lies in how the multimodel presented in this study is built: rather than calibrating models individually and using their combination, a collective calibration is performed on the mean of the 12 selected lumped conceptual models. One of the challenges raised by this novel approach is the large number of parameters (82), which complicates calibration and use, and may also lead to equifinality problems. The solution proposed in this thesis is a sensitivity analysis that fixes the least influential parameters and thereby reduces the total number of parameters to calibrate. An optimisation procedure with calibration and validation then makes it possible to evaluate the performance of the multimodel and of its reduced version, and to improve our understanding of it. The sensitivity analysis is carried out with the Morris method, which yields a 51-parameter version of the MM (MM51) that performs just as well as the original 82-parameter MM while reducing the potential equifinality problems. The calibration and validation results of the MM under the split-sample test (SST) are compared with those of the 12 models calibrated individually. This analysis shows that the individual models composing the MM perform worse than when calibrated independently. This drop in individual performance, necessary to obtain good overall MM performance, is accompanied by an increase in the diversity of the outputs of the MM's models. Such diversity is particularly required for hydrological applications needing an assessment of uncertainties.
All these results lead to a better understanding of the multimodel and to its optimisation, which facilitates not only its calibration but also its potential use in an operational context.
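The collective-calibration idea can be sketched compactly: a single objective is evaluated on the mean of all models' simulated flows rather than on each model separately. The toy model outputs and the sum-of-squares objective below are illustrative, not the thesis implementation:

```python
def multimodel_mean(simulations):
    """Average the flow series of all models, time step by time step."""
    n = len(simulations)
    return [sum(step) / n for step in zip(*simulations)]

def objective(simulated, observed):
    """Sum of squared errors on the multimodel mean (to be minimised
    over the shared parameter vector of all models)."""
    return sum((s - o) ** 2 for s, o in zip(simulated, observed))

# Illustrative outputs of three toy models for four time steps
sims = [[1.0, 2.0, 3.0, 4.0],
        [2.0, 3.0, 4.0, 5.0],
        [3.0, 4.0, 5.0, 6.0]]
obs = [2.0, 3.0, 4.0, 5.0]

mm = multimodel_mean(sims)
print(objective(mm, obs))  # 0.0 here: the mean matches the observations
```

Note how the individual toy models all miss the observations while their mean fits them exactly, which mirrors the abstract's finding that individual performance drops while the ensemble performs well.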
Abstract:
Anthropogenic activities and land-based inputs into the sea may influence the trophic structure and functioning of coastal and continental shelf ecosystems, despite the numerous opportunities and services the latter offer to humans and wildlife. In addition, hydrological structures and physical dynamics potentially influence the sources of organic matter (e.g., terrestrial versus marine, or fresh material versus detrital material) entering marine food webs. Understanding the significance of the processes that influence marine food webs and ecosystems (e.g., terrestrial inputs, physical dynamics) is crucially important because trophic dynamics are a vital part of ecosystem integrity. This can be achieved by identifying organic matter sources that enter food webs along inshore–offshore transects. We hypothesised that regional hydrological structures over wide continental shelves directly control the benthic trophic functioning across the shelf. We investigated this issue along two transects in the northern ecosystem of the Bay of Biscay (north-eastern Atlantic). Carbon and nitrogen stable isotope analysis (SIA) and fatty acid analysis (FAA) were conducted on different complementary ecosystem compartments that include suspended particulate organic matter (POM), sedimentary organic matter (SOM), and benthic consumers such as bivalves, large crustaceans and demersal fish. Samples were collected from inshore shallow waters (at ∼1 m in depth) to more than 200 m in depth on the offshore shelf break. Results indicated strong discrepancies in stable isotope (SI) and fatty acid (FA) compositions in the sampled compartments between inshore and offshore areas, although nitrogen SI (δ15N) and FA trends were similar along both transects. Offshore the influence of a permanently stratified area (described previously as a “cold pool”) was evident in both transects. 
The influence of this hydrological structure on benthic trophic functioning (i.e., on the food sources available for consumers) was especially apparent across the northern transect, owing to unusual carbon isotope compositions (δ13C) in the compartments. At stations under the cold pool, the SI and FA compositions of organisms indicated benthic trophic functioning based on a microbial food web, including a significant contribution from heterotrophic planktonic organisms and/or from SOM. In contrast, inshore and shelf-break areas were characterised by a microalgae-based food web (at least in part for the shelf break, where the slope current and upwelling can favour fresh primary production sinking on site). SIA and FAA proved relevant and complementary tools, and consumers proved better medium- to long-term integrators of the system than POM samples, for depicting trophic functioning and dynamics along inshore–offshore transects over continental shelves.
Abstract:
Three types of forecasts of the total Australian production of macadamia nuts (t nut-in-shell) have been produced early each year since 2001. The first is a long-term forecast, based on the expected production from the tree census data held by the Australian Macadamia Society, suitably scaled up for missing data and assumed new plantings each year. These long-term forecasts range out to 10 years in the future, and form a basis for industry and market planning. Secondly, a statistical adjustment (termed the climate-adjusted forecast) is made annually for the coming crop. As the name suggests, climatic influences are the dominant factors in this adjustment process; however, other terms such as bienniality of bearing, prices and orchard ageing are also incorporated. Thirdly, industry personnel are surveyed early each year, with their estimates integrated into a growers and pest-scouts forecast. Initially conducted on a 'whole-country' basis, these models are now constructed separately for the six main production regions of Australia, with these being combined for national totals. Ensembles or suites of step-forward regression models using biologically relevant variables have been the major statistical method adopted; however, developing methodologies such as nearest-neighbour techniques, general additive models and random forests are continually being evaluated in parallel. The overall error rates average 14% for the climate forecasts and 12% for the growers' forecasts. These compare with 7.8% for USDA almond forecasts (based on extensive early-crop sampling) and 6.8% for coconut forecasts in Sri Lanka. However, our somewhat disappointing results were mainly due to a series of poor crops attributed to human factors, which have now been factored into the models. Notably, the 2012 and 2013 forecasts had average errors of 7.8% and 4.9%, respectively. Future models should also show continuing improvement as more data-years become available.
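A minimal sketch of the ensemble idea described above, combining several regression models' crop forecasts with equal weights and scoring the result as a percentage error (all tonnages hypothetical, not figures from the paper):

```python
def ensemble_forecast(model_forecasts):
    """Equal-weight combination of individual model forecasts."""
    return sum(model_forecasts) / len(model_forecasts)

def percentage_error(forecast, actual):
    """Absolute forecast error as a percentage of the actual crop."""
    return 100 * abs(forecast - actual) / actual

# Hypothetical forecasts (t nut-in-shell) from three regression models
forecasts = [41000.0, 44000.0, 45000.0]
combined = ensemble_forecast(forecasts)

# Hypothetical realised crop of 44000 t
print(round(percentage_error(combined, 44000.0), 1))
```

Averaging such per-year percentage errors over the forecast history gives error-rate summaries like the 14% and 12% figures quoted in the abstract.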
Abstract:
Many exchange rate papers articulate the view that instabilities constitute a major impediment to exchange rate predictability. In this thesis we implement Bayesian and other techniques to account for such instabilities, and examine some of the main obstacles to exchange rate models' predictive ability. We first consider in Chapter 2 a time-varying parameter model in which fluctuations in exchange rates are related to short-term nominal interest rates ensuing from monetary policy rules, such as Taylor rules. Unlike the existing exchange rate studies, the parameters of our Taylor rules are allowed to change over time, in light of the widespread evidence of shifts in fundamentals - for example in the aftermath of the Global Financial Crisis. Focusing on quarterly data frequency from the crisis, we detect forecast improvements upon a random walk (RW) benchmark for at least half, and for as many as seven out of 10, of the currencies considered. Results are stronger when we allow the time-varying parameters of the Taylor rules to differ between countries. In Chapter 3 we look closely at the role of time-variation in parameters and other sources of uncertainty in hindering exchange rate models' predictive power. We apply a Bayesian setup that incorporates the notion that the relevant set of exchange rate determinants, and their corresponding coefficients, change over time. Using statistical and economic measures of performance, we first find that predictive models which allow for sudden, rather than smooth, changes in the coefficients yield significant forecast improvements and economic gains at horizons beyond 1 month. At shorter horizons, however, our methods fail to forecast better than the RW. We identify uncertainty in the estimation of coefficients, and uncertainty about the precise degree of coefficient variability to incorporate in the models, as the main factors obstructing predictive ability.
Chapter 4 focuses on the problem of the time-varying predictive ability of economic fundamentals for exchange rates. It uses bootstrap-based methods to uncover the time-specific conditioning information for predicting fluctuations in exchange rates. Employing several metrics for statistical and economic evaluation of forecasting performance, we find that our approach based on pre-selecting and validating fundamentals across bootstrap replications generates more accurate forecasts than the RW. The approach, known as bumping, robustly reveals parsimonious models with out-of-sample predictive power at the 1-month horizon, and outperforms alternative methods, including Bayesian methods, bagging, and standard forecast combinations. Chapter 5 exploits the predictive content of daily commodity prices for monthly commodity-currency exchange rates. It builds on the idea that the effect of daily commodity price fluctuations on commodity currencies is short-lived, and therefore harder to pin down at low frequencies. Using MIxed DAta Sampling (MIDAS) models, and Bayesian estimation methods to account for time-variation in predictive ability, the chapter demonstrates the usefulness of suitably exploiting such short-lived effects in improving exchange rate forecasts. It further shows that the usual low-frequency predictors, such as money supply and interest rate differentials, typically receive little support from the data at monthly frequency, whereas MIDAS models featuring daily commodity prices receive high posterior support. The chapter also introduces the random walk Metropolis-Hastings technique as a new tool to estimate MIDAS regressions.
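The core of a MIDAS regression is the collapsing of many daily lags into one monthly regressor through a parsimonious weight function; a common choice is the normalised exponential Almon polynomial. A minimal sketch, with illustrative hyper-parameters rather than estimates from the thesis:

```python
import math

def almon_weights(n_lags, theta1, theta2):
    """Normalised exponential Almon lag weights (a standard MIDAS choice)."""
    raw = [math.exp(theta1 * k + theta2 * k ** 2) for k in range(1, n_lags + 1)]
    s = sum(raw)
    return [w / s for w in raw]

def midas_regressor(daily_lags, theta1=0.1, theta2=-0.05):
    """Weighted sum of daily lags entering the monthly regression.

    theta1/theta2 are illustrative values; in practice they are
    estimated jointly with the regression coefficients.
    """
    w = almon_weights(len(daily_lags), theta1, theta2)
    return sum(wi * xi for wi, xi in zip(w, daily_lags))

weights = almon_weights(22, 0.1, -0.05)   # ~22 trading days in a month
print(abs(sum(weights) - 1.0) < 1e-12)    # weights are normalised
```

Because only two theta parameters govern all 22 weights, the daily information enters the monthly model without a proliferation of free coefficients, which is what makes the short-lived daily effects estimable.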
Abstract:
Climate change and carbon (C) sequestration are a major focus of research in the twenty-first century. Globally, soils store about 300 times the amount of C that is released per annum through the burning of fossil fuels (Schulze and Freibauer 2005). Land clearing and the introduction of agricultural systems have led to rapid declines in soil C reserves. The recent introduction of conservation agricultural practices has not reversed the decline in soil C content, although it has minimized the rate of decline (Baker et al. 2007; Hulugalle and Scott 2008). Lal (2003) estimated the quantum of C pools in the atmosphere, terrestrial ecosystems, and oceans and reported a "missing C" component in the world C budget. Though not yet proven, this could be linked to C losses through runoff and soil erosion (Lal 2005) and a lack of C accounting in inland water bodies (Cole et al. 2007). Land management practices to minimize microbial respiration and soil organic C (SOC) decline, such as minimum tillage or no tillage, were extensively studied in the past, and the soil erosion and runoff studies monitoring those management systems focused on other nutrients such as nitrogen (N) and phosphorus (P).
Abstract:
This study aims to model and forecast tourism demand for Mozambique for the period from January 2004 to December 2013 using artificial neural network models. The number of overnight stays in hotels was used as a proxy for tourism demand. A set of independent variables was tested as model inputs, namely the Consumer Price Index, Gross Domestic Product and exchange rates of the outbound tourism markets: South Africa, the United States of America, Mozambique, Portugal and the United Kingdom. The best model achieved a Mean Absolute Percentage Error of 6.5% and a Pearson correlation coefficient of 0.696. A model with such forecast accuracy is important for economic agents to anticipate the future growth of this activity sector, for stakeholders to provide products, services and infrastructure, and for hotel establishments to match their capacity to tourism demand.
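The two accuracy measures quoted above are standard and easy to state precisely; a minimal sketch with hypothetical monthly overnight-stay figures (the numbers are illustrative, not the study's data):

```python
import math

def mape(forecasts, actuals):
    """Mean absolute percentage error (lower is better)."""
    return 100 * sum(abs((a - f) / a) for f, a in zip(forecasts, actuals)) \
        / len(actuals)

def pearson(x, y):
    """Pearson correlation coefficient between two series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical monthly overnight stays vs. model forecasts (illustrative)
actual   = [1000, 1200, 1100, 1300]
forecast = [950, 1250, 1080, 1320]

print(round(mape(forecast, actual), 2))
print(round(pearson(forecast, actual), 3))
```

A MAPE of 6.5%, as reported for the best model, means the forecasts deviate from the observed overnight stays by 6.5% on average.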