936 results for Scheduler simulator
Abstract:
Runoff generation processes and pathways vary widely between catchments. Credible simulations of solute and pollutant transport in surface waters are dependent on models which facilitate appropriate, catchment-specific representations of perceptual models of the runoff generation process. Here, we present a flexible, semi-distributed landscape-scale rainfall-runoff modelling toolkit suitable for simulating a broad range of user-specified perceptual models of runoff generation and stream flow occurring in different climatic regions and landscape types. PERSiST (the Precipitation, Evapotranspiration and Runoff Simulator for Solute Transport) is designed for simulating present-day hydrology; projecting possible future effects of climate or land use change on runoff and catchment water storage; and generating hydrologic inputs for the Integrated Catchments (INCA) family of models. PERSiST has limited data requirements and is calibrated using observed time series of precipitation, air temperature and runoff at one or more points in a river network. Here, we apply PERSiST to the River Thames in the UK and describe a Monte Carlo tool for model calibration, sensitivity and uncertainty analysis.
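The Monte Carlo calibration approach mentioned in the last sentence can be illustrated with a short sketch: parameter sets are drawn at random from prior ranges, a rainfall-runoff model is run for each draw, and a goodness-of-fit score (here Nash-Sutcliffe efficiency) ranks the candidates. The toy single-bucket model and the parameter names and ranges below are illustrative assumptions, not PERSiST's actual structure.

```python
import numpy as np

rng = np.random.default_rng(42)

def bucket_model(precip, pet, k, fc):
    """Toy single-bucket rainfall-runoff model (illustrative only).
    k: linear-reservoir rate constant, fc: bucket capacity (mm)."""
    storage, runoff = 0.0, []
    for p, e in zip(precip, pet):
        storage = max(storage + p - e, 0.0)   # add rain, remove evaporation
        q = k * storage                        # linear-reservoir outflow
        storage = min(storage - q, fc)         # clamp to capacity
        runoff.append(q)
    return np.array(runoff)

def nash_sutcliffe(obs, sim):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Synthetic forcing, and "observed" runoff generated from known parameters
days = 365
precip = rng.gamma(0.5, 4.0, days)
pet = np.full(days, 1.5)
obs = bucket_model(precip, pet, k=0.3, fc=150.0)

# Monte Carlo sampling of the parameter space
best = (-np.inf, None)
for _ in range(5000):
    k, fc = rng.uniform(0.01, 0.9), rng.uniform(50, 400)
    nse = nash_sutcliffe(obs, bucket_model(precip, pet, k, fc))
    if nse > best[0]:
        best = (nse, (k, fc))

print(f"best NSE={best[0]:.3f} at k={best[1][0]:.2f}, fc={best[1][1]:.0f} mm")
```

The same loop also yields sensitivity and uncertainty information: retaining all sampled parameter sets above an NSE threshold, rather than only the best one, gives behavioural parameter distributions.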
Abstract:
We describe Global Atmosphere 4.0 (GA4.0) and Global Land 4.0 (GL4.0): configurations of the Met Office Unified Model and JULES (Joint UK Land Environment Simulator) community land surface model developed for use in global and regional climate research and weather prediction activities. GA4.0 and GL4.0 are based on the previous GA3.0 and GL3.0 configurations, with the inclusion of developments made by the Met Office and its collaborators during its annual development cycle. This paper provides a comprehensive technical and scientific description of GA4.0 and GL4.0 as well as details of how these differ from their predecessors. We also present the results of some initial evaluations of their performance. Overall, performance is comparable with that of GA3.0/GL3.0; the updated configurations include improvements to the science of several parametrisation schemes, however, and will form a baseline for further ongoing development.
Abstract:
In addition to CO2, the climate impact of aviation is strongly influenced by non-CO2 emissions, such as nitrogen oxides, influencing ozone and methane, and water vapour, which can lead to the formation of persistent contrails in ice-supersaturated regions. Because these non-CO2 emission effects are characterised by a short lifetime, their climate impact largely depends on emission location and time; that is to say, emissions in certain locations (or times) can lead to a greater climate impact (even on the global average) than the same emission in other locations (or times). Avoiding these climate-sensitive regions might thus be beneficial to climate. Here, we describe a modelling chain for investigating this climate impact mitigation option. This modelling chain forms a multi-step modelling approach, starting with the simulation of the fate of emissions released at a certain location and time (time-region grid points). This is performed with the chemistry–climate model EMAC, extended via the two submodels AIRTRAC (V1.0) and CONTRAIL (V1.0), which describe the contribution of emissions to the composition of the atmosphere and to contrail formation, respectively. The impact of emissions from the large number of time-region grid points is efficiently calculated by applying a Lagrangian scheme. EMAC also includes the calculation of radiative impacts, which are, in a second step, the input to climate metric formulas describing the global climate impact of the emission at each time-region grid point. The result of the modelling chain comprises a four-dimensional data set in space and time, which we call climate cost functions and which describes the global climate impact of an emission at each grid point and each point in time. In a third step, these climate cost functions are used in an air traffic simulator (SAAM) coupled to an emission tool (AEM) to optimise aircraft trajectories for the North Atlantic region. Here, we describe the details of this new modelling approach and show some example results. A number of sensitivity analyses are performed to motivate the settings of individual parameters. A stepwise sanity check of the results of the modelling chain is undertaken to demonstrate the plausibility of the climate cost functions.
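The final step of the chain, using the climate cost functions to compare aircraft trajectories, can be sketched as follows: the climate impact of a candidate route is the sum over its segments of the emitted mass times the cost function interpolated to the segment's location and time. The grid, cost values, and per-segment emission figures below are invented for illustration; the actual chain uses EMAC-derived cost functions within SAAM and AEM.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-D climate cost function: impact per kg NOx on a
# (lat, lon, altitude) grid over the North Atlantic (invented values).
lats = np.arange(40, 71, 2.0)          # deg N
lons = np.arange(-60, 1, 2.0)          # deg E
levels = np.arange(8, 13, 1.0)         # km
ccf = rng.gamma(2.0, 1e-5, (lats.size, lons.size, levels.size))

def nearest(axis, value):
    return int(np.abs(axis - value).argmin())

def trajectory_impact(waypoints, nox_per_segment_kg):
    """Sum of CCF * emission over segments (nearest-grid-point lookup)."""
    total = 0.0
    for (lat, lon, alt), nox in zip(waypoints, nox_per_segment_kg):
        i, j, k = nearest(lats, lat), nearest(lons, lon), nearest(levels, alt)
        total += ccf[i, j, k] * nox
    return total

# Two candidate routes: a reference route and one flying 1 km lower
route_a = [(50 + 0.2 * s, -60 + 3 * s, 11.0) for s in range(20)]
route_b = [(lat, lon, alt - 1.0) for lat, lon, alt in route_a]
nox = [25.0] * 20   # kg NOx per segment (assumed)

print("impact A:", trajectory_impact(route_a, nox))
print("impact B:", trajectory_impact(route_b, nox))
```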
Abstract:
Weather and climate model simulations of the West African Monsoon (WAM) have generally poor representation of the rainfall distribution and monsoon circulation because key processes, such as clouds and convection, are poorly characterized. The vertical distribution of cloud and precipitation during the WAM is evaluated in Met Office Unified Model simulations against CloudSat observations. Simulations were run at 40-km and 12-km horizontal grid length using a convection parameterization scheme and at 12-km, 4-km, and 1.5-km grid length with the convection scheme effectively switched off, to study the impact of model resolution and convection parameterization scheme on the organisation of tropical convection. Radar reflectivity is forward-modelled from the model cloud fields using the CloudSat simulator to present a like-with-like comparison with the CloudSat radar observations. The representation of cloud and precipitation at 12-km horizontal grid length improves dramatically when the convection parameterization is switched off, primarily because of a reduction in daytime (moist) convection. Further improvement is obtained when reducing model grid length to 4 km or 1.5 km, especially in the representation of thin anvil and mid-level cloud, but three issues remain in all model configurations. Firstly, all simulations underestimate the fraction of anvils with cloud top height above 12 km, which can be attributed to ice water contents in the model that are too low compared with satellite retrievals. Secondly, the model consistently detrains mid-level cloud too close to the freezing level, compared to higher altitudes in CloudSat observations. Finally, there is too much low-level cloud cover in all simulations, and this bias was not improved when adjusting the rainfall parameters in the microphysics scheme. To improve model simulations of the WAM, more detailed in situ observations of the dynamics and microphysics targeting these non-precipitating cloud types are required.
Abstract:
This placebo-controlled, randomised, double-blind, cross-over human feeding study aimed to determine the prebiotic effect of agave fructans. A total of thirty-eight volunteers completed this trial. The treatment consisted of 3 weeks' supplementation with 5 g/d of prebiotic agave fructan (Predilife) or equivalent placebo (maltodextrin), followed by a 2-week washout period following which subjects were crossed over to alternate the treatment arm for 3 weeks followed by a 2-week washout. Faecal samples were collected at baseline, on the last day of treatment (days 22 and 58) and washout (days 36 and 72), respectively. Changes in faecal bacterial populations, SCFA and secretory IgA were assessed using fluorescent in situ hybridisation, GC and ELISA, respectively. Bowel movements, stool consistencies, abdominal comfort and mood changes were evaluated by a recorded daily questionnaire. In parallel, the effect of agave fructans on different regions of the colon was studied using a three-stage continuous culture simulator. Predilife significantly increased faecal bifidobacteria (log10 9·6 (sd 0·4)) and lactobacilli (log10 7·7 (sd 0·8)) compared with placebo (log10 9·2 (sd 0·4); P = 0·00) (log10 7·4 (sd 0·7); P = 0·000), respectively. No change was observed for other bacterial groups tested, SCFA, secretory IgA, and PGE2 concentrations between the treatment and placebo. Denaturing gradient gel electrophoresis analysis indicated that bacterial communities were randomly dispersed and no significant differences were observed between Predilife and placebo treatments. The in vitro models showed similar increases in bifidobacterial and lactobacilli populations to that observed with the in vivo trial. To conclude, agave fructans are well tolerated in healthy human subjects and increased bifidobacteria and lactobacilli numbers in vitro and in vivo but did not influence other products of fermentation.
Abstract:
In this work, the Cloud Feedback Model Intercomparison Project (CFMIP) Observation Simulator Package (COSP) is expanded to include scattering and emission effects of clouds and precipitation at passive microwave frequencies. This represents an advancement over the official version of COSP (version 1.4.0), in which only clear-sky brightness temperatures are simulated. To highlight the potential utility of this new microwave simulator, COSP results generated using the version 3 atmosphere of the climate model EC-Earth as input are compared with observations from the 190.311 GHz channel of the Microwave Humidity Sounder (MHS). Specifically, simulated seasonal brightness temperatures (TB) are contrasted with MHS observations for the period December 2005 to November 2006 to identify possible biases in EC-Earth's cloud and atmosphere fields. EC-Earth's atmosphere closely reproduces the microwave signature of many of the major large-scale and regional-scale features of the atmosphere and surface. Moreover, greater than 60 % of the simulated TB are within 3 K of the NOAA-18 observations. However, COSP is unable to simulate sufficiently low TB in areas of frequent deep convection. Within the Tropics, the model's atmosphere can yield an underestimation of TB by nearly 30 K for cloudy areas in the ITCZ. Possible reasons for this discrepancy include both an incorrect amount of cloud ice water in the model simulations and incorrect ice particle scattering assumptions used in the COSP microwave forward model. These multiple sources of error highlight the non-unique nature of the simulated satellite measurements, a problem exacerbated by the fact that EC-Earth lacks the detailed microphysical parameters necessary for accurate forward model calculations. Such issues limit the robustness of our evaluation and suggest a general note of caution when making COSP-satellite observation evaluations.
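The headline comparison metric quoted above (the share of simulated brightness temperatures within 3 K of the observations) is straightforward to compute once the simulated and observed TB fields are collocated; a minimal sketch, with synthetic arrays standing in for the COSP output and the gridded MHS observations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic collocated fields (K); real inputs would be COSP-simulated TB
# and MHS level-1 brightness temperatures mapped onto the model grid.
tb_obs = 250.0 + 15.0 * rng.standard_normal(100_000)
tb_sim = tb_obs + rng.normal(0.5, 2.5, tb_obs.size)   # assumed model error

diff = tb_sim - tb_obs
within_3k = np.mean(np.abs(diff) <= 3.0)
print(f"bias = {diff.mean():+.2f} K, rmse = {np.sqrt((diff**2).mean()):.2f} K")
print(f"fraction within 3 K: {within_3k:.1%}")
```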
Abstract:
This paper shows that radiometer channel radiances for cloudy atmospheric conditions can be simulated with an optimised frequency grid derived under clear-sky conditions. A new clear-sky optimised grid is derived for AVHRR channel 5 (12 μm, 833 cm⁻¹). For HIRS channel 11 (7.33 μm, 1364 cm⁻¹) and AVHRR channel 5, radiative transfer simulations using an optimised frequency grid are compared with simulations using a reference grid, where the optimised grid has roughly 100–1000 times fewer frequencies than the full grid. The root mean square error between the optimised and the reference simulation is found to be less than 0.3 K for both comparisons, with the magnitude of the bias less than 0.03 K. The simulations have been carried out with the radiative transfer model Atmospheric Radiative Transfer Simulator (ARTS), version 2, using a backward Monte Carlo module for the treatment of clouds. With this module, the optimised simulations are more than 10 times faster than the reference simulations. Although the number of photons is the same, the smaller number of frequencies reduces the overhead of preparing the optical properties for each frequency. With deterministic scattering solvers, the relative decrease in runtime would be even greater. The results allow for new radiative transfer applications, such as the development of new retrievals, because it becomes much quicker to carry out a large number of simulations. The conclusions are applicable to any downlooking infrared radiometer.
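The idea of an optimised frequency grid can be sketched as a regression problem: select a small subset of monochromatic frequencies and fit weights, on clear-sky training cases, so that the weighted sum reproduces the full-grid channel-averaged radiance. The greedy selection and the synthetic spectra below are illustrative assumptions; ARTS uses its own optimisation scheme.

```python
import numpy as np

rng = np.random.default_rng(7)

n_freq, n_train, n_test = 500, 200, 50

def synthetic_spectra(n_cases):
    """Stand-in for full-grid monochromatic TB from a line-by-line model."""
    base = 250 + 20 * np.sin(np.linspace(0, 6, n_freq))
    pert = rng.standard_normal((n_cases, 5)) @ rng.standard_normal((5, n_freq))
    return base + 3 * pert

train, test = synthetic_spectra(n_train), synthetic_spectra(n_test)
y_train, y_test = train.mean(axis=1), test.mean(axis=1)  # channel average

# Greedy forward selection of representative frequencies + LSQ weights
selected, max_sel = [], 8
for _ in range(max_sel):
    best_err, best_f = np.inf, None
    for f in range(n_freq):
        if f in selected:
            continue
        cols = selected + [f]
        w, *_ = np.linalg.lstsq(train[:, cols], y_train, rcond=None)
        err = np.sqrt(np.mean((train[:, cols] @ w - y_train) ** 2))
        if err < best_err:
            best_err, best_f = err, f
    selected.append(best_f)

w, *_ = np.linalg.lstsq(train[:, selected], y_train, rcond=None)
rmse = np.sqrt(np.mean((test[:, selected] @ w - y_test) ** 2))
print(f"{len(selected)} of {n_freq} frequencies, test RMSE = {rmse:.3f} K")
```

The speed-up reported above follows directly: each retained frequency requires its own optical-property preparation, so cutting the grid by two to three orders of magnitude cuts that overhead proportionally.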
Abstract:
A study has been carried out to assess the importance of radiosonde corrections in improving the agreement between satellite and radiosonde measurements of upper-tropospheric humidity. Infrared [High Resolution Infrared Radiation Sounder (HIRS)-12] and microwave [Advanced Microwave Sounding Unit (AMSU)-18] measurements from the NOAA-17 satellite were used for this purpose. The agreement was assessed by comparing the satellite measurements against simulated measurements using collocated radiosonde profiles of the Atmospheric Radiation Measurement (ARM) Program undertaken at tropical and midlatitude sites. The Atmospheric Radiative Transfer Simulator (ARTS) was used to simulate the satellite radiances. The comparisons have been done under clear-sky conditions, separately for daytime and nighttime soundings. Only Vaisala RS92 radiosonde sensors were used and an empirical correction (EC) was applied to the radiosonde measurements. The EC includes correction for mean calibration bias and for solar radiation error, and it removes radiosonde bias relative to three instruments of known accuracy. For the nighttime dataset, the EC significantly reduces the bias from 0.63 to −0.10 K in AMSU-18 and from 1.26 to 0.35 K in HIRS-12. The EC has an even greater impact on the daytime dataset with a bias reduction from 2.38 to 0.28 K in AMSU-18 and from 2.51 to 0.59 K in HIRS-12. The present study promises a more accurate approach in future radiosonde-based studies in the upper troposphere.
Abstract:
Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a fault tolerant failure detection and consensus algorithm. This paper presents and compares two novel failure detection and consensus algorithms. The proposed algorithms are based on Gossip protocols and are inherently fault-tolerant and scalable. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in both algorithms the number of Gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and a perfect synchronization in achieving global consensus.
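A minimal sketch of the gossip-style dissemination underlying such algorithms: each process holds a set of processes it believes have failed, exchanges and merges that set with one randomly chosen peer per cycle, and global consensus is reached when all alive processes hold the same set. Varying the system size shows the roughly logarithmic growth in cycle count reported above. This is a simplified stand-in, not the paper's algorithms or the Extreme-scale Simulator itself.

```python
import random

def gossip_consensus(n_procs, failed, seed=0):
    """Simulate gossip cycles until all alive processes agree on `failed`."""
    rng = random.Random(seed)
    alive = [p for p in range(n_procs) if p not in failed]
    view = {p: set() for p in alive}      # each process's failure list
    view[alive[0]] = set(failed)          # one process detects the failures
    cycles = 0
    while any(view[p] != set(failed) for p in alive):
        cycles += 1
        for p in alive:
            q = rng.choice(alive)          # pick a random gossip partner
            merged = view[p] | view[q]     # exchange and merge failure lists
            view[p] = view[q] = merged
    return cycles

# Cycle count grows roughly logarithmically with system size
for n in (64, 256, 1024, 4096):
    print(n, gossip_consensus(n, failed={3, 5}))
```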
Abstract:
Background: Daily consumption of Concord grape juice (CGJ) over three to four months has been shown to improve memory function in adults with mild cognitive impairment, and reduce blood pressure in hypertensive adults. These benefits are likely due to the high concentration of polyphenols in CGJ. Increased stress can impair cognitive function and elevate blood pressure. Thus we examined the potential beneficial effect of CGJ in individuals experiencing somewhat stressful demanding lifestyles. Objective: To examine the effects of twelve weeks' daily consumption of CGJ on cognitive function, driving performance, and blood pressure in healthy, middle-aged working mothers. Design: Twenty-five healthy mothers of pre-teen children, aged 40–50 years, who were employed for > 30 hours/week consumed 12 oz (355 ml) CGJ (containing 777 mg total polyphenols) or an energy-, taste- and appearance-matched placebo daily for twelve weeks according to a randomised, crossover design with a four-week washout. Verbal and spatial memory, executive function, attention, blood pressure and mood were assessed at baseline, six weeks and twelve weeks. Immediately following the cognitive battery, a subsample of seventeen females completed a driving performance assessment in the University of Leeds Driving Simulator. The twenty-five-minute driving task required participants to match the speed and direction of a lead vehicle. Results: Significant improvements in immediate spatial memory and driving performance were observed following CGJ relative to placebo. There was evidence of an enduring effect of CGJ such that participants who received CGJ in arm 1 maintained better performance in the placebo arm. Conclusions: Cognitive benefits associated with chronic consumption of flavonoid-rich grape juice are not exclusive to adults with mild cognitive impairment. Moreover, these cognitive benefits are apparent in complex everyday tasks such as driving. Effects may persist beyond cessation of flavonoid consumption and future studies should carefully consider the length of washout within crossover designs.
Abstract:
The assessment of routing protocols for mobile wireless networks is a difficult task, because of the networks' dynamic behavior and the absence of benchmarks. However, some of these networks, such as intermittent wireless sensor networks, periodic or cyclic networks, and some delay tolerant networks (DTNs), have more predictable dynamics, as the temporal variations in the network topology can be considered deterministic, which may make them easier to study. Recently, a graph theoretic model, the evolving graph, was proposed to help capture the dynamic behavior of such networks, in view of the construction of least cost routing and other algorithms. The algorithms and insights obtained through this model are theoretically very efficient and intriguing. However, there is no study of the use of such theoretical results in practical situations. Therefore, the objective of our work is to analyze the applicability of evolving graph theory to the construction of efficient routing protocols in realistic scenarios. In this paper, we use the NS2 network simulator to first implement an evolving graph based routing protocol, and then to use it as a benchmark when comparing the four major ad hoc routing protocols (AODV, DSR, OLSR and DSDV). Interestingly, our experiments show that evolving graphs have the potential to be an effective and powerful tool in the development and analysis of algorithms for dynamic networks, with predictable dynamics at least. In order to make this model widely applicable, however, some practical issues still have to be addressed and incorporated into the model, such as adaptive algorithms. We also discuss such issues in this paper, as a result of our experience.
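The core routine an evolving-graph routing protocol relies on is the foremost-journey (earliest-arrival) computation: a Dijkstra-like search over edges that are only available during known time intervals. A sketch under the simplifying assumption of unit traversal time per hop; the schedule format and function name are illustrative:

```python
import heapq

def foremost_journey(schedule, src, dst, t0=0):
    """Earliest arrival time at dst when leaving src at t0.
    schedule: {(u, v): [(start, end), ...]} edge availability intervals,
    sorted by start; traversing an edge takes 1 time unit (assumption)."""
    adj = {}
    for (u, v), ivals in schedule.items():
        adj.setdefault(u, []).append((v, ivals))
    best = {src: t0}
    heap = [(t0, src)]
    while heap:
        t, u = heapq.heappop(heap)
        if u == dst:
            return t
        if t > best.get(u, float("inf")):
            continue                            # stale heap entry
        for v, ivals in adj.get(u, []):
            for start, end in ivals:
                depart = max(t, start)          # wait for the edge to appear
                if depart + 1 <= end:           # edge still up on arrival
                    arrive = depart + 1
                    if arrive < best.get(v, float("inf")):
                        best[v] = arrive
                        heapq.heappush(heap, (arrive, v))
                    break                       # earliest interval is best
    return None                                 # dst unreachable

# Link (1,2) only exists later; the foremost journey waits for it.
sched = {(0, 1): [(0, 10)], (1, 2): [(4, 6)], (0, 2): [(8, 9)]}
print(foremost_journey(sched, 0, 2))  # 5: 0->1 arrives t=1, wait, 1->2 at t=5
```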
Abstract:
The introduction of a new technology, High Speed Downlink Packet Access (HSDPA), in Release 5 of the 3GPP specifications raises the question of its performance capabilities. HSDPA is a promising technology which gives theoretical rates of up to 14.4 Mbit/s. The main objective of this thesis is to discuss the system-level performance of HSDPA. The thesis exploration focuses mainly on the Packet Scheduler because it is the central entity of the HSDPA design. Due to its function, the Packet Scheduler has a direct impact on the HSDPA system performance. Similarly, it also determines the end user performance, and more specifically the relative performance between the users in the cell. The thesis analyzes several Packet Scheduling algorithms that can optimize the trade-off between system capacity and end user performance for the traffic classes targeted in this thesis. The performance evaluation of the algorithms in the HSDPA system is carried out through computer-aided simulations run under realistic conditions, so that the results accurately reflect the algorithms' efficiency. The simulation of the HSDPA system and the algorithms is coded in C/C++.
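One widely studied HSDPA packet scheduling algorithm that captures the capacity/fairness trade-off discussed above is Proportional Fair, which in each transmission time interval (TTI) serves the user with the largest ratio of instantaneous achievable rate to exponentially averaged throughput. The sketch below re-expresses only that generic scheduling rule, with synthetic channel feedback; the thesis's C/C++ simulator and its exact algorithm set are not reproduced here.

```python
import random

def proportional_fair(n_users=8, ttis=2000, tc=100, seed=3):
    """Proportional Fair scheduling over HSDPA 2 ms TTIs (illustrative).
    tc: throughput-averaging time constant, in TTIs."""
    rng = random.Random(seed)
    avg = [1e-6] * n_users                    # avoid divide-by-zero at start
    served = [0.0] * n_users
    for _ in range(ttis):
        # Instantaneous achievable rates from (synthetic) channel feedback,
        # capped at the 14.4 Mbit/s theoretical HSDPA peak rate.
        rates = [rng.uniform(0.5, 14.4) for _ in range(n_users)]
        u = max(range(n_users), key=lambda i: rates[i] / avg[i])
        for i in range(n_users):
            r = rates[i] if i == u else 0.0
            avg[i] += (r - avg[i]) / tc       # exponential moving average
            served[i] += r
    return [s / ttis for s in served]         # mean throughput per user

print([round(t, 2) for t in proportional_fair()])
```

Serving the user at its channel peak relative to its own history is what lets Proportional Fair exploit multi-user diversity for cell capacity while still giving every user a comparable long-run share.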
Abstract:
This thesis work concerns the performance evaluation of peer-to-peer networks, where we used different peer distribution techniques: the Weibull, lognormal and Pareto distribution processes. We then used a network simulator to evaluate the performance of these three distribution techniques. During the last decade the Internet has expanded into a world-wide network connecting millions of hosts and users and providing services for everyone. Many emerging applications are bandwidth-intensive in their nature; the size of downloaded files including music and videos can be huge, from ten megabits to many gigabits. The efficient use of network resources is thus crucial for the survivability of the Internet. Traffic engineering (TE) covers a range of mechanisms for optimizing operational networks from the traffic perspective. The time scale in traffic engineering varies from short-term network control to network planning over a longer time period. In this thesis work we considered the peer distribution technique in order to model the peer arrival and service process with the three different techniques, calculating congestion parameters such as the blocking time for each peer before entering the service process, the waiting time for a peer while another peer is being served in the service block, and the delay time for each peer. We then calculated the average for each process and plotted graphs using Matlab to analyse the results.
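The kind of experiment described can be sketched directly: draw peer interarrival times from Weibull, lognormal or Pareto distributions, push the arrivals through a first-in-first-out single-server queue, and compare the resulting waiting times. The distribution parameters and service time below are illustrative assumptions, not the thesis's settings, and the sketch is in Python rather than the thesis's simulator/Matlab toolchain.

```python
import numpy as np

rng = np.random.default_rng(11)

def waiting_times(interarrivals, service_time=0.8):
    """FIFO single-server queue: waiting time of each arriving peer."""
    arrivals = np.cumsum(interarrivals)
    free_at, waits = 0.0, []
    for t in arrivals:
        start = max(t, free_at)        # wait if the server is busy
        waits.append(start - t)
        free_at = start + service_time
    return np.array(waits)

n = 50_000
models = {
    "weibull":   rng.weibull(1.5, n),                    # shape 1.5, scale 1
    "lognormal": rng.lognormal(mean=-0.5, sigma=1.0, size=n),
    "pareto":    rng.pareto(2.5, n) + 1.0,               # shifted, mean ~1.67
}
for name, ia in models.items():
    w = waiting_times(ia)
    print(f"{name:9s} mean interarrival={ia.mean():.2f} mean wait={w.mean():.2f}")
```

Even with similar mean interarrival times, the heavier tail of the Pareto process produces bursty arrivals and hence noticeably different waiting-time behaviour, which is exactly what such a comparison is meant to expose.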
Abstract:
The aim of this work was to design a set of rules for levodopa infusion dose adjustment in Parkinson's disease based on simulation experiments. Using this simulator, optimal infusion doses under different conditions were calculated. There are seven conditions (−3 to +3) appearing in a rating scale for Parkinson's disease patients. By finding the mean of the differences between conditions and the optimal dose, two sets of rules were designed. The sets of rules were optimized through repeated testing. Their usefulness for optimizing the titration procedure for new infusion patients based on rule-based reasoning was investigated. Results show that both the number of steps and the errors in finding the optimal dose were reduced by the new rules. Finally, the new rules predicted the dose well on each single occasion for the majority of patients in the simulation experiments.
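The rule-based titration idea can be sketched as a feedback loop: the patient's rating-scale condition (−3 undermedicated to +3 overmedicated, 0 on target) selects a stepwise dose adjustment, repeated until the rating settles at 0. The step sizes and the simulated patient response below are illustrative assumptions, not the rules derived in this work.

```python
# Hypothetical dose-adjustment rules: rating -> relative dose change.
RULES = {-3: +0.30, -2: +0.20, -1: +0.10, 0: 0.0,
         +1: -0.10, +2: -0.20, +3: -0.30}

def rate_condition(dose, optimal):
    """Simulated patient: map relative dose error onto the -3..+3 scale."""
    err = (dose - optimal) / optimal
    return max(-3, min(3, round(err / 0.15)))

def titrate(start_dose, optimal, max_steps=20):
    """Apply the rules until the rating reaches 0 (or steps run out)."""
    dose, steps = start_dose, []
    for _ in range(max_steps):
        rating = rate_condition(dose, optimal)
        steps.append((round(dose, 1), rating))
        if rating == 0:
            break
        dose *= 1.0 + RULES[rating]       # apply the matching rule
    return steps

# A patient starting well below the (unknown) optimal dose converges in
# a handful of steps, the behaviour the rule design aims to shorten.
for step in titrate(start_dose=60.0, optimal=100.0):
    print(step)
```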
Abstract:
This thesis project is part of the all-round automation of production of the concentrating solar PV/T system Absolicon X10. ABSOLICON Solar Concentrator AB has invented and started production of the promising concentrating solar system Absolicon X10. The aims of this thesis project are designing, assembling, calibrating and putting into operation the automatic measurement system intended to evaluate the shape of concentrating parabolic reflectors. On the basis of the requirements of the company administration and the needs of the real production process, the operating conditions for the laser-testing rig were formulated, and the basic concept of using laser radiation was defined. In the first step, the overall design of the whole system was made and its division into parts was defined. After preliminary simulations, the functions and operating conditions of all the parts were formulated. In the next steps, the detailed design of all the parts was carried out. Most components were ordered from the respective companies, and some of the mechanical components were made in the company's workshop. All parts of the laser-testing rig were assembled and tested. The software that controls the laser-testing rig was created in LabVIEW; to tune and test it, a special simulator was designed and assembled. When all parts were assembled into the complete system, the laser-testing rig was tested, calibrated and tuned. In the workshop of Absolicon AB, trial measurements were conducted and the laser-testing rig was installed in the production line at the plant in Sollefteå.