110 results for Downscaling
Abstract:
Introduction. Iceland's domestic politics and foreign affairs are undergoing drastic changes. After an economic crash, violent protests on the streets of Reykjavik, the first in Iceland's history, contributed to the defeat of the government. The party system has been altered. A turn has been taken towards Europe after the United States left the island, first by closing its military base in 2006 and then by its clear stance not to assist the country in its economic difficulties. The former close relations with the superpower are unlikely ever to be restored. The EU membership application is placing severe constraints on political parties, which are split on the issue, and has put in jeopardy the unity of the first left majority in the Icelandic parliament, the Althingi. Society is in a state of flux after an unprecedented economic downscaling and the collapse of almost the entire financial sector, which had boomed rapidly from the mid-1990s. The credibility of politicians, the parliament and the media is in ruins. Iceland's smallness and its location on the geographical map, one could also say the geopolitical map, have had a profound influence on its domestic and foreign affairs. Iceland is closely associated with the other Nordic states and has adopted many of their domestic characteristics, with important exceptions. On the other hand, the country has come under American influence (geographically, it straddles the Mid-Atlantic rift) and has limited its participation in the European project. Its geographical location in the middle of the North Atlantic has led to a notion that the country's culture is unique and should be protected by all available means. Politicians continue to play the 'nationalistic uniqueness' card with considerable success even though the country has been swept by globalization.
Rapid modernization (which only really began in the Second World War with the British and American occupations) and sudden engagement with the outside world (which only extended to the general public in the last quarter of the twentieth century) are still slowly but steadily making their mark on the country's foreign policy. The country's political discourse and foreign policy still bear the hallmark of the past, i.e. of a small and insular society. This paper will address the political developments in Iceland since the 2008 economic crash and place them in a historical context. The aim is to understand Iceland's present foreign policy and, in particular, the highly contested decision by its government in 2009 to apply for membership of the European Union. The paper is divided into five sections in addition to this introduction and the concluding remarks. First, it explains the importance in Iceland of a political discourse based on the concept of independence, which dates back to the historical narrative of the settlement period. This section also examines Iceland's close relations with the other Nordic states, despite important differences between it and the others. Second, the paper analyses the importance of the party system, i.e. the dominance of the centre-right in Icelandic politics, and the changed nature of the system. Third, it examines how Iceland further distinguishes itself from the other Nordic states in many important features. Fourth, the paper analyses the country's three main foreign policy priorities in the post-war period, i.e. extensions of the Exclusive Economic Zone, firm defence arrangements with the US and membership of NATO, and the drive for better market access for marine products, including a partial engagement in the European project. Fifth, the paper examines how the country's smallness, in terms of its central administrative capacity, has affected its domestic and foreign policy-making.
The concluding section summarizes the main findings concerning the political and historical obstacles that the Social Democratic Alliance faces in its hard-fought battle to change the country’s European Policy.
Abstract:
This dataset contains continuous time series of land surface temperature (LST) at a spatial resolution of 300 m around the 12 experimental sites of the PAGE21 project (grant agreement number 282700, funded by the EC Seventh Framework Programme theme FP7-ENV-2011). The dataset was produced from hourly LST time series at 25 km scale, retrieved from SSM/I data (André et al., 2015, doi:10.1016/j.rse.2015.01.028) and downscaled to 300 m using a dynamic model and a particle smoothing approach. The methodology rests on two main assumptions: first, that LST spatial variability is mostly explained by land cover and soil hydric state; second, that LST is uniform for a given land cover class within the low-resolution pixel. Under these hypotheses, LST can be estimated using a land cover map and a physically based land surface model constrained with observations through a data assimilation process. This methodology, described in Mechri et al. (2014, doi:10.1002/2013JD020354), was applied to the ORCHIDEE land surface model (Krinner et al., 2005, doi:10.1029/2003GB002199) to estimate prior values for each land cover class provided by the ESA CCI Land Cover product (Bontemps et al., 2013) at 300 m resolution. The assimilation process (particle smoother) consists of simulating ensembles of LST time series for each land cover class and for a large number of parameter sets. For each parameter set, the resulting temperatures are aggregated according to the grid fraction of each land cover class and compared to the coarse observations. Minimizing the distance between the aggregated model solutions and the observations allows us to select the simulated LST and the corresponding parameter sets that fit the observations most closely. The retained parameter sets are then duplicated and randomly perturbed before simulating the next time window. Finally, the most likely LST for each land cover class is estimated and used to reconstruct LST maps at 300 m resolution using the ESA CCI Land Cover map.
The resulting temperature maps, on which ice pixels were masked, are provided at a daily time step over the analysis period (2000-2009).
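The aggregate-and-select step of the particle smoother can be sketched in a few lines. This toy version (hypothetical fractions, temperatures and ensemble sizes, not the PAGE21 implementation) selects, from a random ensemble, the per-class LST candidates whose cover-weighted aggregate best matches a single coarse observation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: one coarse pixel covering two land-cover classes.
frac = np.array([0.7, 0.3])          # grid fraction of each class
true_lst = np.array([271.0, 268.0])  # "true" per-class LST (K), unknown to the filter
coarse_obs = frac @ true_lst + rng.normal(0.0, 0.1)  # noisy coarse-pixel observation

# Ensemble: many candidate parameter sets, each implying one LST per class.
n_particles = 5000
particles = rng.normal(270.0, 5.0, size=(n_particles, 2))

# Aggregate each particle to the coarse scale and keep the closest matches.
aggregated = particles @ frac
distance = np.abs(aggregated - coarse_obs)
keep = np.argsort(distance)[: n_particles // 100]   # retain the closest 1%
estimate = particles[keep].mean(axis=0)             # per-class LST estimate

# Note: a single observation only constrains the aggregate; the real method
# assimilates full time series with a physical-model prior to separate classes.
print(estimate)
```

In the actual scheme, the retained particles would then be duplicated, perturbed, and propagated to the next time window.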
Abstract:
Most magnetic resonance imaging (MRI) spatial encoding techniques employ low-frequency pulsed magnetic field gradients that undesirably induce multiexponentially decaying eddy currents in nearby conducting structures of the MRI system. The eddy currents degrade the switching performance of the gradient system, distort the MRI image, and introduce thermal loads in the cryostat vessel and superconducting MRI components. Heating of superconducting magnets due to induced eddy currents is particularly problematic as it offsets the superconducting operating point, which can cause a system quench. A numerical characterization of transient eddy current effects is vital for their compensation/control and further advancement of the MRI technology as a whole. However, transient eddy current calculations are particularly computationally intensive. In large-scale problems, such as gradient switching in MRI, conventional finite-element method (FEM)-based routines impose very large computational loads during generation/solving of the system equations. Therefore, other computational alternatives need to be explored. This paper outlines a three-dimensional finite-difference time-domain (FDTD) method in cylindrical coordinates for the modeling of low-frequency transient eddy currents in MRI, as an extension to the recently proposed time-harmonic scheme. The weakly coupled Maxwell's equations are adapted to the low-frequency regime by downscaling the speed-of-light constant, which permits the use of larger FDTD time steps while maintaining the validity of the Courant-Friedrichs-Lewy stability condition. The principal hypothesis of this work is that the modified FDTD routine can be employed to analyze pulsed-gradient-induced, transient eddy currents in superconducting MRI system models.
The hypothesis is supported through a verification of the numerical scheme on a canonical problem and by analyzing undesired temporal eddy current effects, such as the B0 shift, caused by actively shielded symmetric/asymmetric transverse x-gradient head coils and unshielded z-gradient whole-body coils operating in proximity to a superconducting MRI magnet.
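The benefit of downscaling the speed of light follows directly from the Courant-Friedrichs-Lewy stability condition; a minimal sketch (Cartesian grid for simplicity, whereas the paper works in cylindrical coordinates; grid spacing and scaling factor are illustrative):

```python
import math

def cfl_max_timestep(dx: float, dy: float, dz: float, c: float) -> float:
    """Largest stable FDTD time step under the 3-D Courant-Friedrichs-Lewy
    condition: c * dt * sqrt(1/dx^2 + 1/dy^2 + 1/dz^2) <= 1."""
    return 1.0 / (c * math.sqrt(dx**-2 + dy**-2 + dz**-2))

c0 = 299_792_458.0          # physical speed of light (m/s)
dx = dy = dz = 5e-3         # 5 mm grid, an illustrative resolution

dt_physical = cfl_max_timestep(dx, dy, dz, c0)
dt_scaled = cfl_max_timestep(dx, dy, dz, c0 / 1e4)  # speed of light downscaled 10^4x

# Downscaling c by a factor k enlarges the stable time step by the same factor,
# which is what makes low-frequency transient runs tractable.
assert abs(dt_scaled / dt_physical - 1e4) < 1e-6
print(dt_physical, dt_scaled)
```

The scaling is valid only while the problem stays quasi-static, i.e. wave-propagation effects remain negligible at the frequencies of interest.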
Abstract:
There is great demand for advanced engineering tools in biology, biochemistry and medicine. Many existing instruments and tools are expensive and require special facilities. With the advent of nanotechnology in the past decade, new approaches to developing devices and tools have been generated by academia and industry. One such technology, NMR spectroscopy, has been used by biochemists for more than two decades to study the molecular structure of chemical compounds. However, NMR spectrometers are very expensive and require special laboratory rooms for proper operation. High magnetic fields, with strengths on the order of several tesla, make these instruments unaffordable to most research groups. This doctoral research proposes a new technology to develop NMR spectrometers that can operate at field strengths of less than 0.5 tesla using an inexpensive permanent magnet and spin-dependent nanoscale magnetic devices. This portable NMR system is intended to analyze samples as small as a few nanoliters. The main problem to resolve when downscaling the variables is obtaining an NMR signal with a high signal-to-noise ratio (SNR). A special tunneling magnetoresistive (TMR) sensor design was developed to achieve this goal. The minimum specifications for each component of the proposed NMR system were established, and a complete NMR system was designed based on these minimum requirements. The goal was always to find cost-effective, realistic components. The novel design of the NMR system uses technologies such as direct digital synthesis (DDS), digital signal processing (DSP) and a special backpropagation neural network that finds the best match of the NMR spectrum. The system was designed, calculated and simulated with excellent results. In addition, a general method to design TMR sensors was developed; the technique was automated, and a computer program was written to help the designer perform this task interactively.
Abstract:
The lack of analytical models that can accurately describe large-scale networked systems makes empirical experimentation indispensable for understanding complex behaviors. Research on network testbeds for testing network protocols and distributed services, including physical, emulated, and federated testbeds, has made steady progress. Although the success of these testbeds is undeniable, they fail to provide: 1) scalability, for handling large-scale networks with hundreds or thousands of hosts and routers organized in different scenarios, 2) flexibility, for testing new protocols or applications in diverse settings, and 3) inter-operability, for combining simulated and real network entities in experiments. This dissertation tackles these issues in three different dimensions. First, we present SVEET, a system that enables inter-operability between real and simulated hosts. In order to increase the scalability of networks under study, SVEET enables time-dilated synchronization between real hosts and the discrete-event simulator. Realistic TCP congestion control algorithms are implemented in the simulator to allow seamless interactions between real and simulated hosts. SVEET is validated via extensive experiments and its capabilities are assessed through case studies involving real applications. Second, we present PrimoGENI, a system that allows a distributed discrete-event simulator, running in real-time, to interact with real network entities in a federated environment. PrimoGENI greatly enhances the flexibility of network experiments, through which a great variety of network conditions can be reproduced to examine what-if questions. Furthermore, PrimoGENI performs resource management functions, on behalf of the user, for instantiating network experiments on shared infrastructures. Finally, to further increase the scalability of network testbeds to handle large-scale high-capacity networks, we present a novel symbiotic simulation approach. 
We present SymbioSim, a testbed for large-scale network experimentation where a high-performance simulation system closely cooperates with an emulation system in a mutually beneficial way. On the one hand, the simulation system benefits from incorporating the traffic metadata from real applications in the emulation system to reproduce the realistic traffic conditions. On the other hand, the emulation system benefits from receiving the continuous updates from the simulation system to calibrate the traffic between real applications. Specific techniques that support the symbiotic approach include: 1) a model downscaling scheme that can significantly reduce the complexity of the large-scale simulation model, resulting in an efficient emulation system for modulating the high-capacity network traffic between real applications; 2) a queuing network model for the downscaled emulation system to accurately represent the network effects of the simulated traffic; and 3) techniques for reducing the synchronization overhead between the simulation and emulation systems.
Abstract:
An emerging approach to downscaling the projections from General Circulation Models (GCMs) to scales relevant for basin hydrology is to use output of GCMs to force higher-resolution Regional Climate Models (RCMs). With spatial resolution often in the tens of kilometers, however, even RCM output will likely fail to resolve local topography that may be climatically significant in high-relief basins. Here we develop and apply an approach for downscaling RCM output using local topographic lapse rates (empirically-estimated spatially and seasonally variable changes in climate variables with elevation). We calculate monthly local topographic lapse rates from the 800-m Parameter-elevation Regressions on Independent Slopes Model (PRISM) dataset, which is based on regressions of observed climate against topographic variables. We then use these lapse rates to elevationally correct two sources of regional climate-model output: (1) the North American Regional Reanalysis (NARR), a retrospective dataset produced from a regional forecasting model constrained by observations, and (2) a range of baseline climate scenarios from the North American Regional Climate Change Assessment Program (NARCCAP), which is produced by a series of RCMs driven by GCMs. By running a calibrated and validated hydrologic model, the Soil and Water Assessment Tool (SWAT), using observed station data and elevationally-adjusted NARR and NARCCAP output, we are able to estimate the sensitivity of hydrologic modeling to the source of the input climate data. Topographic correction of regional climate-model data is a promising method for modeling the hydrology of mountainous basins for which no weather station datasets are available or for simulating hydrology under past or future climates.
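The elevational correction described above reduces to adding a seasonally varying lapse-rate term to each coarse-cell value; a minimal sketch with illustrative numbers (the actual lapse rates come from monthly PRISM-based regressions, not the values used here):

```python
# Elevational (lapse-rate) correction of coarse climate-model temperature:
#   T_fine = T_coarse + gamma_month * (z_fine - z_coarse)
# where gamma_month is an empirically estimated local lapse rate (K per m).

def lapse_rate_correct(t_coarse_k: float, z_coarse_m: float,
                       z_fine_m: float, gamma_k_per_m: float) -> float:
    """Adjust a coarse-cell temperature to a target elevation."""
    return t_coarse_k + gamma_k_per_m * (z_fine_m - z_coarse_m)

# Example: RCM cell mean elevation 1500 m, target ridge cell at 2600 m,
# January lapse rate of -6.0 K/km (illustrative, not a PRISM-derived value).
t_fine = lapse_rate_correct(t_coarse_k=272.0, z_coarse_m=1500.0,
                            z_fine_m=2600.0, gamma_k_per_m=-6.0e-3)
print(round(t_fine, 2))  # 272.0 - 6.6 = 265.4
```

In practice the correction is applied per month and per variable, since the empirical lapse rates vary both seasonally and spatially.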
Abstract:
The successful performance of a hydrological model is usually challenged by the quality of the sensitivity analysis, calibration and uncertainty analysis carried out in the modeling exercise and the subsequent simulation results. This is especially important under changing climatic conditions, where additional uncertainties associated with climate models and downscaling processes increase the complexity of the hydrological modeling system. In response to these challenges, and to improve the performance of hydrological models under changing climatic conditions, this research proposed five new methods for supporting hydrological modeling. First, a design-of-experiment-aided sensitivity analysis and parameterization (DOE-SAP) method was proposed to identify the significant parameters and provide more reliable sensitivity analysis for improving parameterization during hydrological modeling. In the case study, it achieved better calibration results along with advanced sensitivity analysis of the significant parameters and their interactions. Second, a comprehensive uncertainty evaluation scheme was developed to evaluate three uncertainty analysis methods: sequential uncertainty fitting version 2 (SUFI-2), generalized likelihood uncertainty estimation (GLUE) and parameter solution (ParaSol). The results showed that SUFI-2 performed better than the other two methods based on calibration and uncertainty analysis results, demonstrating that the proposed evaluation scheme is capable of selecting the most suitable uncertainty method for a given case study. Third, a novel sequential multi-criteria based calibration and uncertainty analysis (SMC-CUA) method was proposed to improve the efficiency of calibration and uncertainty analysis and to control the phenomenon of equifinality.
The results showed that the SMC-CUA method was able to provide better uncertainty analysis results with high computational efficiency compared to the SUFI-2 and GLUE methods and control parameter uncertainty and the equifinality effect without sacrificing simulation performance. Fourth, an innovative response based statistical evaluation method (RESEM) was proposed for estimating the uncertainty propagated effects and providing long-term prediction for hydrological responses under changing climatic conditions. By using RESEM, the uncertainty propagated from statistical downscaling to hydrological modeling can be evaluated. Fifth, an integrated simulation-based evaluation system for uncertainty propagation analysis (ISES-UPA) was proposed for investigating the effects and contributions of different uncertainty components to the total propagated uncertainty from statistical downscaling. Using ISES-UPA, the uncertainty from statistical downscaling, uncertainty from hydrological modeling, and the total uncertainty from two uncertainty sources can be compared and quantified. The feasibility of all the methods has been tested using hypothetical and real-world case studies. The proposed methods can also be integrated as a hydrological modeling system to better support hydrological studies under changing climatic conditions. The results from the proposed integrated hydrological modeling system can be used as scientific references for decision makers to reduce the potential risk of damages caused by extreme events for long-term water resource management and planning.
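To illustrate the kind of uncertainty analysis being compared, here is a minimal GLUE-style sketch on a toy one-parameter model (synthetic data and an arbitrary behavioral threshold; not the SWAT-based set-up used in the research):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "hydrological model": one parameter k, simulated flow q = k * rain.
rain = np.array([2.0, 5.0, 3.0, 8.0, 1.0])
observed = 0.6 * rain + rng.normal(0.0, 0.2, rain.size)  # synthetic observations

# GLUE-style sampling: draw parameter sets, score each with a likelihood
# measure (here the Nash-Sutcliffe efficiency), keep "behavioral" sets.
k_samples = rng.uniform(0.0, 2.0, 10_000)
sims = k_samples[:, None] * rain[None, :]
sse = ((sims - observed) ** 2).sum(axis=1)
nse = 1.0 - sse / ((observed - observed.mean()) ** 2).sum()

behavioral = k_samples[nse > 0.8]                 # behavioral threshold (a modeling choice)
lo, hi = np.percentile(behavioral, [2.5, 97.5])   # parameter uncertainty band
print(behavioral.size, lo, hi)
```

The width of the behavioral band is GLUE's expression of equifinality: many parameter values reproduce the observations almost equally well.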
Abstract:
Ignoring small-scale heterogeneities in Arctic land cover may bias estimates of water, heat and carbon fluxes in large-scale climate and ecosystem models. We investigated subpixel-scale heterogeneity in CHRIS/PROBA and Landsat-7 ETM+ satellite imagery over ice-wedge polygonal tundra in the Lena Delta of Siberia, and the associated implications for evapotranspiration (ET) estimation. Field measurements were combined with aerial and satellite data to link fine-scale (0.3 m resolution) with coarse-scale (up to 30 m resolution) land cover data. A large portion of the total wet tundra (80%) and water body area (30%) appeared in the form of patches less than 0.1 ha in size, which could not be resolved with satellite data. Wet tundra and small water bodies represented about half of the total ET in summer. Their contribution was reduced to 20% in fall, during which ET rates from dry tundra were highest instead. Inclusion of subpixel-scale water bodies increased the total water surface area of the Lena Delta from 13% to 20%. The actual land/water proportions within each composite satellite pixel were best captured with Landsat data using a statistical downscaling approach, which is recommended for reliable large-scale modelling of water, heat and carbon exchange from permafrost landscapes.
Abstract:
The Fifth Assessment Report of the IPCC (Intergovernmental Panel on Climate Change, 2014) indicates that tourism will be one of the economic activities to suffer the greatest negative effects in the coming decades as a result of global warming. In Spain, tourism is a principal source of income and job creation in the economy. It is therefore necessary to implement measures of adaptation to the new climatic reality, which in Spain will entail changes in the climatic comfort of destinations and an increase in atmospheric extremes. In contrast to the climate change adaptation plans for tourism drawn up by the national and regional governments, which have scarcely been developed in Spain, the local scale offers interesting examples of adaptation actions, carried out both by municipalities (energy, transport, housing, urban planning) and by tourism businesses themselves (hotels, campsites, apartments). Water- and energy-saving measures, promotion of public transport and clean energy, creation of urban green areas and adaptation to atmospheric extremes stand out as climate change mitigation actions in the country's main tourist destinations.
Abstract:
Climate models project that the northern high latitudes will warm at a rate in excess of the global mean. This will pose severe problems for Arctic and sub-Arctic infrastructure dependent on maintaining low temperatures for structural integrity. This is the case for the economically important Tibbitt to Contwoyto Winter Road (TCWR)—the world’s busiest heavy haul ice road, spanning 400 km across mostly frozen lakes within the Northwest Territories of Canada. In this study, future climate scenarios are developed for the region using statistical downscaling methods. In addition, changes in lake ice thickness are projected based on historical relationships between measured ice thickness and air temperatures. These projections are used to infer the theoretical operational dates of the TCWR based on weight limits for trucks on the ice. Results across three climate models driven by four RCPs reveal a considerable warming trend over the coming decades. Projected changes in ice thickness reveal a trend towards thinner lake ice and a reduced time window when lake ice is at sufficient thickness to support trucks on the ice road, driven by increasing future temperatures. Given the uncertainties inherent in climate modelling and the resultant projections, caution should be exercised in interpreting the magnitude of these scenarios. More certain is the direction of change, with a clear trend towards winter warming that will reduce the operation time window of the TCWR. This illustrates the need for planners and policymakers to consider future changes in climate when planning annual haulage along the TCWR.
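Historical ice-thickness/air-temperature relationships of the kind used here are often approximated by a Stefan-type freezing-degree-day model; a sketch with an illustrative coefficient (not the TCWR regression from the study):

```python
import math

def stefan_ice_thickness_cm(daily_mean_temp_c: list, alpha: float = 2.0) -> float:
    """Stefan-type model: ice thickness grows with the square root of
    accumulated freezing degree-days (FDD). alpha (cm per sqrt(degC*day))
    is an empirical coefficient; values around 1.7-2.4 are commonly quoted
    for lake ice, but the choice here is purely illustrative."""
    fdd = sum(-t for t in daily_mean_temp_c if t < 0.0)  # accumulate frost days only
    return alpha * math.sqrt(fdd)

# 90 synthetic winter days at -20 degC -> FDD = 1800 degC*day.
thickness = stefan_ice_thickness_cm([-20.0] * 90)
print(round(thickness, 1))  # about 84.9 cm
```

Under such a model, a warming-driven reduction in FDD translates directly into thinner ice and a shorter window above any weight-bearing thickness threshold.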
Abstract:
Integrated water resource management requires distinguishing the water pathways that are accessible to societies from those that are not. Water follows many pathways, which vary greatly from one place to another. The question can be simplified by focusing instead on water's two destinations. Blue water forms the stores and flows of the hydrosystem: rivers, aquifers and subsurface flows. Green water is the invisible flux of water vapour returning to the atmosphere; it includes water consumed by plants and water held in soils. Yet many studies address only one type of blue water, generally considering only the fate of streamflow or, more rarely, groundwater recharge, so the overall picture is missing. At the same time, climate change is affecting these water pathways by altering the various components of the hydrological cycle in distinct ways. The present study uses the SWAT modelling tool to track all components of the hydrological cycle and to quantify the impact of climate change on the hydrosystem of the Garonne river basin. The first part of the work refined the model set-up to best address the question posed. Particular care was taken in using gridded meteorological data (SAFRAN) and in accounting for snow over the mountainous terrain. Calibration of the model parameters was tested in a differential split-sampling context, calibrating and then validating on climatically contrasting years in order to assess the robustness of the simulation under changing climate. This step substantially improved performance over the calibration period (2000-2010) and demonstrated the stability of the model under climate change.
Simulations were then produced over a century-long period (1960-2050) and analysed in two phases: i) The past period (1960-2000), based on climate observations, served as a long-term validation period for the simulated streamflow, with very good performance. Analysis of the hydrological components reveals a strong impact on green-water fluxes and stores, with decreasing soil water content and a marked increase in evapotranspiration. The blue-water components are mainly affected through the snowpack and streamflow, both of which decline substantially. ii) Hydrological projections were produced (2010-2050) using a range of scenarios and climate models derived from dynamical downscaling. The simulation analysis largely confirms the conclusions drawn from the past period: a strong impact on green water, again with decreasing soil water content and increasing potential evapotranspiration. The simulations show that summer soil water content becomes low enough to limit actual evapotranspiration fluxes, pointing to a possible future deficit in green-water stores. Moreover, while the blue-water components still show a significant decline in the snowpack, streamflow now appears to increase in autumn and winter. These results indicate an 'acceleration' of the surface blue-water components, probably related to the increase in extreme precipitation events.
This work has provided an analysis of the variations in most components of the hydrological cycle at the basin scale, confirming the importance of accounting for all of these components when assessing the impact of climate change, and of environmental change more broadly, on water resources.
Abstract:
The starting point for this study was the consideration of future climate change scenarios and their uncertainties. The paper presents the global projections from the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and compares them with regional scenarios (downscaling) developed by the Brazilian National Institute for Space Research (Instituto Nacional de Pesquisas Espaciais - INPE), with a focus on two main IPCC scenarios (RCP4.5 and RCP8.5) and two main global models (MIROC and Hadley Centre) for the periods 2011-2040 and 2041-2070. It aims to identify the main trends in terms of changes in temperature and precipitation for the North and Northeast regions of Brazil (more specifically, in the Amazon, semi-arid and cerrado biomes).
Abstract:
Summary: Climate change has the potential to alter rainfall, temperature and air humidity, which in turn affect plant evapotranspiration and crop water requirements. The purpose of this research is to assess climate change impacts on irrigation water demand, based on future scenarios derived from PRECIS (Providing Regional Climates for Impacts Studies), using boundary conditions of HadCM3 subjected to dynamic downscaling nested in the Hadley Centre regional circulation model HadRM3P. Monthly time series of average temperature and rainfall were generated for 1961-90 (baseline) and the future (2040). The reference evapotranspiration was estimated using monthly average temperature. The projected climate change impact on irrigation water demand proved to be a result of the evapotranspiration and rainfall trends. Impacts were mapped over the target region using geostatistical methods. The increase in average crop water needs was estimated at 18.7% and 22.2% for the 2040 A2 and B2 scenarios, respectively. Objective: To analyze the climate change impacts on irrigation water requirements, using downscaling techniques of a climate change model, at the river basin scale. Method: The study area was delimited between 4°39′30″ and 5°40′00″ South and 37°35′30″ and 38°27′00″ West. The crop pattern in the target area was characterized regarding the types of irrigated crops, their respective areas and cropping schedules, as well as the area and type of irrigation systems adopted. The PRECIS system (Jones et al., 2004) was used to generate climate predictions for the target area, using the boundary conditions of the Hadley Centre model HadCM3 (Johns et al., 2003). The time scale of interest for the climate change impact evaluation was the year 2040, representing the period 2025 to 2055.
The output data from the climate model were interpolated over latitude/longitude by applying ordinary kriging tools available in a Geographic Information System, in order to produce thematic maps.
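Since the reference evapotranspiration here is estimated from monthly average temperature alone, a Thornthwaite-type calculation is one plausible reading of the method; a simplified sketch (day-length correction omitted, illustrative temperatures rather than data from the study area):

```python
def thornthwaite_pet_mm(monthly_temp_c: list) -> list:
    """Monthly potential evapotranspiration (mm) by the Thornthwaite (1948)
    method, which needs only monthly mean temperature. The day-length/month-
    length correction factor is omitted (12 h days, 30-day months assumed)
    to keep the sketch short; at least one month must be above 0 degC."""
    heat_index = sum((t / 5.0) ** 1.514 for t in monthly_temp_c if t > 0.0)
    a = (6.75e-7 * heat_index**3 - 7.71e-5 * heat_index**2
         + 1.792e-2 * heat_index + 0.49239)
    return [16.0 * (10.0 * t / heat_index) ** a if t > 0.0 else 0.0
            for t in monthly_temp_c]

# Illustrative semi-arid monthly mean temperatures (degC).
temps = [26.5, 26.8, 26.4, 26.0, 25.2, 24.5, 24.3, 25.0, 26.1, 26.9, 27.0, 26.7]
pet = thornthwaite_pet_mm(temps)
print([round(p) for p in pet])
```

Irrigation demand then follows by comparing crop-adjusted PET against effective rainfall, month by month.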
Abstract:
The Mara River Basin (MRB) is endowed with pristine biodiversity, socio-cultural heritage and natural resources. The purpose of my study is to develop and apply an integrated water resource allocation framework for the MRB based on hydrological processes, water demand and economic factors. The basin was partitioned into twelve sub-basins, and the rainfall-runoff process was modeled using the Soil and Water Assessment Tool (SWAT), with satisfactory Nash-Sutcliffe efficiencies of 0.68 for calibration and 0.43 for validation at the Mara Mines station. The impact and uncertainty of climate change on the hydrology of the MRB were assessed using SWAT and three scenarios of statistically downscaled outputs from twenty Global Circulation Models. Results predicted the wet season getting wetter and the dry season getting drier, with a general increasing trend of annual rainfall through 2050. Three blocks of water demand (environmental, normal and flood) were estimated from consumptive water use by humans, wildlife, livestock, tourism, irrigation and industry. Water demand projections suggest that human consumption is expected to surpass irrigation as the highest water demand sector by 2030. Monthly volumes of water were estimated in three blocks of current minimum reliability: reserve (>95%), normal (80-95%) and flood (40%) for more than 5 months in a year. The assessment of water price and marginal productivity showed that current water use hardly responds to changes in the price or productivity of water. Finally, a water allocation model was developed and applied to investigate the optimum monthly allocation among sectors and sub-basins by maximizing the use value and hydrological reliability of water. Model results demonstrated that the status of the reserve and normal volumes can be improved to 'low' or 'moderate' by updating the existing reliability to meet prevailing demand. Flow volumes and rates for four scenarios of reliability were presented.
Results showed that the water allocation framework can be used as a comprehensive tool in the management of the MRB, and could possibly be extended to similar watersheds.
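A single monthly allocation step that withholds an environmental reserve first and then serves sectors in order of unit use value can be sketched as follows (sector names, demands and values are illustrative, not MRB data; the actual model also optimizes across sub-basins and reliability classes):

```python
# Greedy allocation of one month's available water: the environmental reserve
# is met first, then sectors are served by descending unit use value up to
# their demand caps. For this separable problem, greedy-by-value is optimal.

def allocate(available: float, reserve: float, sectors: dict) -> dict:
    """sectors maps name -> (demand, unit_value); returns allocation per sector."""
    remaining = max(available - reserve, 0.0)
    allocation = {"environment": min(available, reserve)}
    for name, (demand, _value) in sorted(sectors.items(), key=lambda kv: -kv[1][1]):
        take = min(demand, remaining)   # serve up to demand, never more than remains
        allocation[name] = take
        remaining -= take
    return allocation

alloc = allocate(
    available=100.0, reserve=30.0,
    sectors={"human": (25.0, 5.0), "irrigation": (60.0, 2.0), "industry": (15.0, 3.0)},
)
print(alloc)  # environment 30, human 25, industry 15, irrigation 30
```

Here irrigation, the lowest-value sector, absorbs the shortfall, mirroring the kind of trade-off such an allocation model makes explicit.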