61 results for Climatic data simulation

at QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

90.00%

Publisher:

Abstract:

This research aims to use the multivariate geochemical dataset generated by the Tellus project to investigate the appropriate use of transformation methods that maintain the integrity of geochemical data and the inherently constrained behaviour of multivariate relationships. The widely used normal score transform is compared with a stepwise conditional transform technique. The Tellus Project, managed by GSNI and funded by the Department of Enterprise, Trade and Investment and the EU's Building Sustainable Prosperity Fund, is the most comprehensive geological mapping project ever undertaken in Northern Ireland. Previous work has demonstrated spatial variability in the Tellus data, but geostatistical analysis and interpretation of the datasets requires a methodology that reproduces the inherently complex multivariate relations. Earlier investigation of the Tellus geochemical data used Gaussian-based techniques; however, earth science variables are rarely Gaussian, so transformation of the data is integral to the approach. The Tellus dataset therefore provides an opportunity to investigate the appropriate use of transformation methods, as required for Gaussian-based geostatistical analysis. In particular, the stepwise conditional transform is investigated and developed for the geochemical datasets obtained as part of the Tellus project. The transform is applied to four variables in a bivariate nested fashion because of the limited availability of data. Simulation of the transformed variables is then carried out, along with the corresponding back transformation to original units. Results show that the stepwise transform successfully reproduces both the univariate statistics and the complex bivariate relations exhibited by the data. Greater fidelity to multivariate relationships will improve uncertainty models, which are required for subsequent geological, environmental and economic inferences.
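
For readers unfamiliar with the baseline technique the abstract compares against, the sketch below illustrates a normal score transform (mapping a sample to standard normal scores via its empirical quantiles) and the corresponding back transform. It assumes a hypothetical skewed variable; the data, variable name and plotting-position convention are illustrative and not taken from the Tellus dataset.

```python
# Minimal sketch of a normal score transform, the baseline against which the
# abstract compares the stepwise conditional transform. Data are illustrative.
import numpy as np
from scipy.stats import norm


def normal_score_transform(x):
    """Map a 1-D sample to standard normal scores via its empirical quantiles."""
    ranks = np.argsort(np.argsort(x))           # 0 .. n-1
    p = (ranks + 0.5) / len(x)                  # plotting positions in (0, 1)
    return norm.ppf(p)                          # Gaussian quantiles


def back_transform(y, x_ref):
    """Map normal scores back to original units using the reference sample."""
    return np.quantile(x_ref, norm.cdf(y))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    zn = rng.lognormal(mean=1.0, sigma=0.8, size=500)   # skewed, non-Gaussian "Zn"
    zn_ns = normal_score_transform(zn)
    # back_transform(zn_ns, zn) recovers the original distribution up to
    # quantile interpolation at the tails.
    print("transformed mean/std:",
          round(float(zn_ns.mean()), 3), round(float(zn_ns.std()), 3))
```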

Relevance:

80.00%

Publisher:

Abstract:

This paper describes the results of a project, funded by the UK's Technology Strategy Board, to develop climate adaptation design strategies. The aim was to examine the effects of climate change in the distant future (2080) on a vulnerable group such as older people with special needs, and to establish how architectural design strategies and technologies available today might mitigate the problems that climate change will cause.
Older people are the most vulnerable sector of society and are particularly at risk in extreme weather, whether excess cold in winter or sustained high temperatures in summer. In the UK, average summer temperatures are predicted to rise by as much as 8 °C by 2080, with a 20% greater chance of extreme weather events. This will place extreme stress on a building stock designed for today's mild maritime climate.
The project took a current proposal for an extra-care home for the elderly, designed to 2010 regulations, and developed a road map to 2080 using climate models developed by the UK Meteorological Office. This allowed the current design to be assessed against future climatic data, improvements to the scheme to be proposed within existing constraints, and a new scheme to be developed from first principles using the same data together with projections of the new technologies that will become available. Comparing these schemes prompted a reassessment of the initial proposal and led to a more flexible design that incorporates provision for future retrofit, so that new renewable technologies for heating, cooling and water storage can be added at a later date.
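
As an illustration of the kind of assessment that projected climatic data makes possible, the minimal sketch below counts overheating hours in an hourly temperature series produced for a future weather year. The 28 °C threshold, CSV layout and column name are assumptions, not values or formats used by the project.

```python
# Illustrative only: counting overheating hours in a projected hourly
# temperature series, the sort of check used when assessing a design against
# future climatic data. Threshold, file layout and column name are assumed.
import csv


def overheating_hours(csv_path, column="operative_temp_C", threshold=28.0):
    """Return the number of hourly records exceeding the comfort threshold."""
    hours = 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row[column]) > threshold:
                hours += 1
    return hours


# Example usage (hypothetical file produced by a building simulation run
# driven by 2080s weather data):
# print(overheating_hours("extra_care_2080_high.csv"))
```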

Relevance:

80.00%

Publisher:

Abstract:

Hulun Lake, China's fifth-largest inland lake, experienced a severe decline in water level over the period 2000-2010, prompting concern that the lake is gradually drying up. A multi-million US dollar engineering project, completed in August 2010, constructed a channel to transfer part of the flow of a nearby river into the lake to maintain its water level. This study aimed to advance understanding of the key processes controlling the lake water level variation over the last five decades, and to investigate the impact of the river-transfer engineering project on the water level. A water balance model was developed to investigate the lake water level variations over this period, using hydrological and climatic data as well as satellite-based measurements and results from land surface modelling. The investigation reveals that the severe reduction of river discharge into the lake (-364±64 mm/yr, ∼70% of the five-decade average) was the key factor behind the decline of the lake water level between 2000 and 2010. The decline in river discharge was due to the reduction of total runoff from the lake watershed, which in turn resulted from reduced soil moisture following the decrease in precipitation (-49±45 mm/yr) over this period. The water budget calculation suggests that groundwater from the surrounding lake area, together with surface runoff from the ungauged area surrounding the lake, contributed a net inflow of ∼210 Mm³/yr (equivalent to ∼100 mm/yr) to the lake. The results also show that the water diversion project prevented a further water level decline of over 0.5 m by the end of 2012. Overall, the monthly water balance model gave an excellent prediction of the lake water level fluctuations over the last five decades and can be a useful tool for managing the lake's water resources in the future.
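
A minimal sketch of a monthly lake water balance of the kind described above is given below: the change in storage each month is river inflow plus precipitation on the lake minus evaporation and other losses, expressed in millimetres over the lake surface. The input series, initial level and loss term are illustrative placeholders, not the Hulun Lake data.

```python
# Minimal monthly water balance sketch:
# storage change = inflow + precipitation - evaporation - other losses (mm).
def simulate_levels(level0_m, inflow_mm, precip_mm, evap_mm, loss_mm):
    """Step a simple water balance forward; returns lake level (m) per month."""
    levels = [level0_m]
    for q, p, e, o in zip(inflow_mm, precip_mm, evap_mm, loss_mm):
        d_storage_mm = q + p - e - o                        # net change this month
        levels.append(levels[-1] + d_storage_mm / 1000.0)   # mm -> m
    return levels


if __name__ == "__main__":
    # Twelve illustrative months: weak inflow and strong summer evaporation
    inflow = [10, 12, 30, 45, 40, 35, 30, 25, 20, 15, 12, 10]
    precip = [5, 5, 10, 20, 35, 50, 60, 55, 30, 15, 8, 5]
    evap = [5, 8, 20, 60, 110, 140, 150, 130, 80, 40, 15, 6]
    other = [2] * 12                                        # ungauged losses
    print([round(z, 3) for z in simulate_levels(545.0, inflow, precip, evap, other)])
```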

Relevance:

80.00%

Publisher:

Abstract:

1. We tested the species diversity-energy hypothesis using the British bird fauna. The hypothesis predicts that patterns of temperature should match patterns of diversity. We also tested the hypothesis that the mechanism operates directly through the effect of temperature on thermoregulatory loads; this further predicts that seasonal changes in temperature cause matching changes in patterns of diversity, and that species' body mass is influential.

2. We defined four assemblages using migration status (residents or visitors) and season (summer or winter distribution). Records of species' presence/absence in 2362 10 × 10 km quadrats covering most of Britain were used, together with a wide selection of habitat, topographic and seasonal climatic data.

3. We fitted a logistic regression model to each species' distribution using the environmental data. We then combined these individual species models mathematically to form a diversity model. Analysis of this composite model revealed that summer temperature was the factor most strongly associated with diversity.

4. Although the species-energy hypothesis was supported, the direct mechanism, predicting an important role for body mass and matching seasonal patterns of change between diversity and temperature, was not supported.

5. However, summer temperature is the best overall explanation for bird diversity patterns in Britain; it is a better predictor even of winter diversity than winter temperature is. Winter diversity is predicted more precisely from environmental factors than summer diversity is.

6. Climate change is likely to influence the diversity of different areas to different extents. For resident species, low-diversity areas may respond more strongly as climate change progresses; for winter visitors, higher-diversity areas may respond more strongly; while the response of summer visitors is approximately neutral.
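
The modelling step in points 3-4 can be sketched as follows: fit one logistic regression per species on environmental covariates, then sum the predicted occurrence probabilities across species to obtain an expected-diversity surface. The covariates, simulated presence/absence data and library choice below are assumptions for illustration only.

```python
# Hedged sketch of a stacked species-distribution model: per-species logistic
# regressions combined into an expected-diversity surface. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_quadrats, n_species = 500, 20
# Environmental covariates per quadrat (e.g. summer temperature, altitude, habitat index)
X = rng.normal(size=(n_quadrats, 3))
# Simulated presence/absence: each species responds mainly to the first covariate
betas = rng.normal(scale=[2.0, 0.5, 0.5], size=(n_species, 3))
Y = rng.random((n_quadrats, n_species)) < 1 / (1 + np.exp(-(X @ betas.T)))

species_models = [LogisticRegression(max_iter=1000).fit(X, Y[:, s])
                  for s in range(n_species)]

# Expected diversity per quadrat = sum over species of P(presence | environment)
expected_diversity = sum(m.predict_proba(X)[:, 1] for m in species_models)
print("correlation of expected diversity with 'summer temperature' covariate:",
      round(np.corrcoef(expected_diversity, X[:, 0])[0, 1], 2))
```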

Relevance:

40.00%

Publisher:

Abstract:

The increasing complexity and scale of cloud computing environments, due to widespread data centre heterogeneity, makes measurement-based evaluation highly difficult to achieve. The use of simulation tools to support decision making in cloud computing environments is therefore an increasing trend. However, the data required to model cloud computing environments with an appropriate degree of accuracy are typically voluminous, very difficult to collect without some form of automation, often not available in a suitable format, and time consuming to assemble manually. In this research, an automated method for cloud computing topology definition, data collection and model creation is presented, within the context of a suite of tools that have been developed and integrated to support these activities.
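
As a rough illustration of what an automated topology-definition step might emit for a simulator to consume, the sketch below serialises a heterogeneous data-centre description to JSON. The field names, classes and layout are assumptions, not the format used by the tool suite described above.

```python
# Illustrative machine-readable topology definition of the kind an automated
# collection step might feed into a cloud simulation model. Format is assumed.
import json
from dataclasses import dataclass, asdict


@dataclass
class Host:
    host_id: str
    cores: int
    mips_per_core: int
    ram_mb: int
    storage_gb: int


@dataclass
class DataCentre:
    name: str
    hosts: list


def to_model_input(datacentres):
    """Serialise a heterogeneous topology so a simulation model can consume it."""
    return json.dumps([asdict(dc) for dc in datacentres], indent=2)


if __name__ == "__main__":
    dc = DataCentre("dc-eu-1", [Host("h1", 16, 2500, 65536, 2000),
                                Host("h2", 8, 2000, 32768, 1000)])
    print(to_model_input([dc]))
```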

Relevance:

30.00%

Publisher:

Abstract:

This study expands the body of knowledge on the effect of in-cycle speed fluctuations on the performance of small engines. It uses the engine and drivetrain models developed previously by Callahan et al. (1) to examine a variety of engines; the predicted performance changes due to drivetrain effects are shown in each case, and conclusions are drawn from those results. The single-cylinder, high-performance four-stroke engine showed significant changes in predicted performance compared with the prediction assuming zero speed fluctuation. Measured speed fluctuations from a firing Yamaha YZ426 engine were applied to the simulation, in addition to data from a simple free mass model; both methods predicted similar changes in performance. The multiple-cylinder, high-performance two-stroke engine also showed significant changes in performance depending on the firing configuration. With both engines, the change in performance diminished with increasing mean engine speed. The low-output, single-cylinder two-stroke engine simulation showed only a negligible change in performance, even with high-amplitude speed fluctuations; this was expected because the engine's torque versus speed characteristic was so flat. The cross-charged, multi-cylinder two-stroke engine also showed only a negligible change in performance. In this case, the combination of a relatively high-inertia rotating assembly and multiple cylinder firing events within each revolution smoothed the torque pulsations and reduced the speed fluctuation amplitude itself.
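
The "simple free mass model" mentioned above can be sketched as a single rotating inertia driven by a crank-angle-resolved torque, so that dω/dθ = T(θ)/(Jω). The torque shape, inertia and engine speed below are illustrative values, not measured YZ426 data.

```python
# Hedged free-mass sketch of in-cycle speed fluctuation: one lumped inertia
# driven by crank-angle-resolved torque. Values are illustrative only.
import math


def speed_trace(torque_fn, inertia, omega0, steps=720):
    """Integrate omega over one four-stroke cycle (720 deg CA) in CA steps."""
    omega = omega0
    trace = []
    dtheta = math.radians(720.0 / steps)
    for i in range(steps):
        theta = i * dtheta
        # d(omega)/d(theta) = T(theta) / (J * omega)
        omega += torque_fn(theta) / (inertia * omega) * dtheta
        trace.append(omega)
    return trace


if __name__ == "__main__":
    def torque(theta):
        # one firing pulse in the first 180 deg CA, light drag elsewhere (N*m)
        return 400.0 * math.sin(theta) ** 4 - 30.0 if theta < math.pi else -30.0

    trace = speed_trace(torque, inertia=0.05, omega0=2 * math.pi * 9000 / 60)
    fluct = (max(trace) - min(trace)) / (sum(trace) / len(trace)) * 100
    print(f"in-cycle speed fluctuation: {fluct:.1f}% of mean speed")
```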

Relevance:

30.00%

Publisher:

Abstract:

This paper describes a model of a 1.8-litre four-cylinder four-stroke gasoline engine fitted with a close-coupled three-way catalyst (TWC). Designed to meet EURO 3 emissions standards, the engine includes some advanced emission control features in addition to the TWC, namely: variable valve timing (VVT), swirl control plates, and exhaust gas recirculation (EGR). Gas flow is treated as one-dimensional (1D) and unsteady in the engine ducting and in the catalyst. Reflection and transmission of pressure waves at the boundaries of the catalyst monolith are modelled. In-cylinder combustion is represented by a two-zone burn model with dissociation and reaction kinetics. A single Wiebe analysis of measured in-cylinder pressure data is used to determine the mass fraction burned as a function of crank angle (CA) at each engine speed. Measured data from steady-state dynamometer tests are presented for operation at wide open throttle (WOT) over a range of engine speeds. These results include CA-resolved traces of pressure at various locations throughout the engine together with cycle-averaged traces of gas composition entering the catalyst as indicated by a fast-response emissions analyser. Simulated engine performance and pressure wave action throughout the engine are well validated by the measured data.
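
The single Wiebe analysis mentioned above represents the mass fraction burned as x_b(θ) = 1 − exp(−a·((θ − θ0)/Δθ)^(m+1)). The sketch below evaluates this curve; the efficiency parameter a, form factor m, start of combustion and burn duration are typical textbook values, not the parameters fitted in the paper.

```python
# Hedged sketch of a single Wiebe mass-fraction-burned curve; parameter
# values are typical illustrative numbers, not the paper's fitted values.
import math


def wiebe_mfb(theta, theta0=-10.0, duration=55.0, a=5.0, m=2.0):
    """Mass fraction burned at crank angle theta (deg ATDC), single Wiebe curve."""
    if theta <= theta0:
        return 0.0
    x = (theta - theta0) / duration
    return 1.0 - math.exp(-a * x ** (m + 1))


if __name__ == "__main__":
    for ca in (-10, 0, 10, 20, 30, 45):
        print(f"{ca:>4} deg ATDC: x_b = {wiebe_mfb(ca):.3f}")
```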

Relevance:

30.00%

Publisher:

Abstract:

This paper provides algorithms that use an information-theoretic analysis to learn Bayesian network structures from data. Based on our three-phase learning framework, we develop efficient algorithms that can effectively learn Bayesian networks, requiring only polynomial numbers of conditional independence (CI) tests in typical cases. We provide precise conditions that specify when these algorithms are guaranteed to be correct as well as empirical evidence (from real world applications and simulation tests) that demonstrates that these systems work efficiently and reliably in practice.
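
The basic building block of such information-theoretic structure learning is a conditional independence (CI) test. A hedged sketch is given below, thresholding an estimate of the conditional mutual information I(X; Y | Z) for discrete variables; the threshold, data and test form are illustrative, and the paper's three-phase framework concerns when and in what order such tests are applied rather than the test itself.

```python
# Hedged sketch of a CI test via conditional mutual information for discrete
# variables. Threshold and data are illustrative assumptions.
import numpy as np
from collections import Counter


def conditional_mutual_information(x, y, z):
    """Estimate I(X; Y | Z) in nats from aligned 1-D discrete samples."""
    n = len(x)
    c_xyz, c_xz, c_yz, c_z = Counter(), Counter(), Counter(), Counter()
    for xi, yi, zi in zip(x, y, z):
        c_xyz[(xi, yi, zi)] += 1
        c_xz[(xi, zi)] += 1
        c_yz[(yi, zi)] += 1
        c_z[zi] += 1
    cmi = 0.0
    for (xi, yi, zi), n_xyz in c_xyz.items():
        # p(x,y,z) * log[ p(x,y,z) p(z) / (p(x,z) p(y,z)) ]
        cmi += (n_xyz / n) * np.log(n_xyz * c_z[zi] / (c_xz[(xi, zi)] * c_yz[(yi, zi)]))
    return cmi


def ci_test(x, y, z, threshold=0.01):
    """Declare X independent of Y given Z when the estimated CMI is small."""
    return conditional_mutual_information(x, y, z) < threshold


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    z = rng.integers(0, 2, 5000)
    x = z ^ (rng.random(5000) < 0.25)    # X = Z flipped with prob 0.25
    y = z ^ (rng.random(5000) < 0.25)    # Y = Z flipped independently
    print("X independent of Y given Z?", ci_test(x, y, z))   # expected: True
```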

Relevance:

30.00%

Publisher:

Abstract:

A force field model of phosphorus has been developed based on density functional (DF) computations and experimental results, covering low-energy forms of local tetrahedral symmetry and the more compact (simple cubic) structures that arise with increasing pressure. Rules tailored to DF data for the addition, deletion, and exchange of covalent bonds allow the system to adapt the bonding configuration to the thermodynamic state. Monte Carlo simulations in the N-P-T ensemble show that the molecular (P4) liquid phase, stable at low pressure P and relatively low temperature T, transforms to a polymeric (gel) state on increasing either P or T. These phase changes are observed in recent experiments at similar thermodynamic conditions, as shown by the close agreement of computed and measured structure factors in the molecular and polymer phases. The polymeric phase obtained by increasing pressure has a dominant simple cubic character, while the polymer obtained by raising T at moderate pressure is tetrahedral. Comparison with DF results suggests that the latter is a semiconductor, while the cubic form is metallic. The simulations show that the T-induced polymerization is driven by the entropy of the configuration of covalent bonds, as in the polymerization transition in sulfur. The transition observed with increasing P is the continuation at high T of the black-P to arsenic (A17) transition observed in the solid state, and also corresponds to a semiconductor-to-metal transition.
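
For context, the acceptance rule for a trial volume change in an N-P-T Monte Carlo simulation is sketched below. Only the Metropolis test is shown; the energy change, pressure and units are placeholders, and the sketch does not include the paper's phosphorus force field or bond-update rules.

```python
# Hedged sketch of the Metropolis acceptance rule for a volume move in an
# N-P-T Monte Carlo simulation. Numbers are illustrative reduced units.
import math
import random


def accept_volume_move(dU, P, V_old, V_new, N, kT):
    """N-P-T Metropolis criterion for a trial volume change V_old -> V_new."""
    # Enthalpy-like argument: dU + P*dV - N*kT*ln(V_new/V_old)
    arg = dU + P * (V_new - V_old) - N * kT * math.log(V_new / V_old)
    return arg <= 0.0 or random.random() < math.exp(-arg / kT)


if __name__ == "__main__":
    print(accept_volume_move(dU=0.5, P=1.0, V_old=100.0, V_new=99.0, N=64, kT=1.0))
```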

Relevance:

30.00%

Publisher:

Abstract:

Recent experimental neutron diffraction data and ab initio molecular dynamics simulation of the ionic liquid dimethylimidazolium chloride ([dmim]Cl) have provided a structural description of the system at the molecular level. However, partial radial distribution functions calculated from the latter, when compared with previous classical simulation results, highlight some limitations in the structural description offered by force field-based simulations. With the availability of ab initio data it is possible to improve the classical description of [dmim]Cl using the force matching approach, and the strategy for fitting complex force fields in their original functional form is discussed. A self-consistent optimization method for the generation of classical potentials of general functional form is presented and applied, and a force field that better reproduces the observed first-principles forces is obtained. When used in simulation, this force field predicts structural data that reproduce more faithfully those observed in the ab initio studies. Some possible refinements to the technique, its application, and the general suitability of common potential energy functions used within many ionic liquid force fields are discussed.
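
The force-matching idea can be sketched as choosing force-field parameters that minimise the squared difference between the forces the classical model predicts and the reference ab initio forces. In the toy example below the model force is linear in its parameters, so the fit reduces to linear least squares; the basis functions, separations and "ab initio" forces are synthetic placeholders, not [dmim]Cl data.

```python
# Hedged force-matching sketch: fit parameters of a pair force that is linear
# in its coefficients to noisy reference forces. Data are synthetic.
import numpy as np


def force_match(basis_forces, reference_forces):
    """
    basis_forces: (n_samples, n_params) force contribution of each basis term.
    reference_forces: (n_samples,) ab initio forces to reproduce.
    Returns the parameter vector minimising ||basis @ params - reference||^2.
    """
    params, *_ = np.linalg.lstsq(basis_forces, reference_forces, rcond=None)
    return params


if __name__ == "__main__":
    rng = np.random.default_rng(2)
    r = rng.uniform(3.0, 8.0, size=200)                         # pair separations
    # Force basis for U = A/r^12 - B/r^6:  F = 12A/r^13 - 6B/r^7
    basis = np.column_stack([12.0 / r ** 13, -6.0 / r ** 7])
    true_params = np.array([4.0e5, 2.0e3])                      # "unknown" A, B
    f_ref = basis @ true_params + rng.normal(scale=1e-3, size=r.size)
    print("fitted parameters:", force_match(basis, f_ref))
```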

Relevance:

30.00%

Publisher:

Abstract:

Background and purpose: Radiotherapy is widely used to palliate local symptoms in non-small-cell lung cancer. Using conventional X-ray simulation, it is often difficult to localize the extent of the tumour accurately. We report a randomized, double-blind trial comparing target localization with conventional and virtual simulation. Methods: Eighty-six patients underwent both conventional and virtual simulation. The conventional simulator films were compared with digitally reconstructed radiographs (DRRs) produced from the computed tomography (CT) data. The treatment fields defined by the clinicians using each modality were compared in terms of field area, position and the implications for target coverage. Results: Comparing fields defined by each study arm, there was a major mismatch in coverage between fields in 66.2% of cases, and a complete match in only 5.2% of cases. In 82.4% of cases, conventional simulator fields were larger (mean 24.5 ± 5.1%, 95% confidence interval) than CT-localized fields, potentially contributing to a mean target under-coverage of 16.4 ± 3.5% and normal tissue over-coverage of 25.4 ± 4.2%. Conclusions: CT localization and virtual simulation allow more accurate definition of the target volume. This could enable a reduction in geographical misses while also reducing treatment-related toxicity.
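
To indicate the kind of arithmetic behind the reported coverage statistics, the toy sketch below compares two rectangular treatment fields for relative size and mutual coverage. Real fields and targets are irregular shapes defined on patient anatomy; the rectangles and numbers here are purely illustrative assumptions.

```python
# Purely illustrative field-overlap arithmetic with rectangular stand-ins for
# CT-localised and conventionally simulated fields. Geometry is assumed.
def rect_area(r):
    x1, y1, x2, y2 = r
    return max(0.0, x2 - x1) * max(0.0, y2 - y1)


def overlap(a, b):
    return rect_area((max(a[0], b[0]), max(a[1], b[1]),
                      min(a[2], b[2]), min(a[3], b[3])))


if __name__ == "__main__":
    ct_field = (0.0, 0.0, 10.0, 12.0)        # CT-localised field (cm)
    conv_field = (-1.0, -0.5, 9.0, 13.0)     # conventional simulator field (cm)
    shared = overlap(ct_field, conv_field)
    print("conventional field larger by "
          f"{(rect_area(conv_field) / rect_area(ct_field) - 1) * 100:.1f}%")
    print("CT field not covered by conventional field: "
          f"{(1 - shared / rect_area(ct_field)) * 100:.1f}%")
```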

Relevance:

30.00%

Publisher:

Abstract:

Background and purpose: The optimal use of virtual simulation across treatment sites is currently not entirely clear. This study presents data to identify specific patient groups for whom conventional simulation may be completely eliminated and replaced by virtual simulation. Sampling and method: Two hundred and sixty patients were recruited from four treatment sites (head and neck, breast, pelvis, and thorax). Patients were randomly assigned to the usual treatment process involving conventional simulation, or to a treatment process differing only in the replacement of conventional plan verification with virtual verification. Data were collected on set-up accuracy at verification and on the number of unsatisfactory verifications requiring a return to the conventional simulator. A micro-economic costing analysis was also undertaken, for which data were collected for each treatment process episode: the number and grade of staff present, and the time taken for each treatment episode. Results: The study shows no statistically significant difference in the number of returns to the conventional simulator for each site and study arm. Image registration data show similar quality of verification for each study arm. The micro-costing data show no statistical difference between the virtual and conventional simulation processes. Conclusions: At our institution, virtual simulation, including virtual verification, presents no disadvantage compared with conventional simulation for the sites investigated.

Relevance:

30.00%

Publisher:

Abstract:

Microscopic simulation models are often evaluated based on visual inspection of the results. This paper presents formal econometric techniques to compare microscopic simulation (MS) models with real-life data. A related result is a methodology to compare different MS models with each other. For this purpose, possible parameters of interest, such as mean returns, or autocorrelation patterns, are classified and characterized. For each class of characteristics, the appropriate techniques are presented. We illustrate the methodology by comparing the MS model developed by He and Li [J. Econ. Dynam. Control, 2007, 31, 3396-3426, Quant. Finance, 2008, 8, 59-79] with actual data.
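
One of the comparisons such a methodology formalises is whether a microscopic simulation model reproduces the autocorrelation pattern of real returns. The sketch below compares the autocorrelation of absolute returns for two synthetic series standing in for "data" and "model"; the paper applies formal econometric tests rather than this informal comparison, and all series here are generated purely for illustration.

```python
# Hedged sketch: compare the autocorrelation of absolute returns for a
# volatility-clustered "data-like" series and an iid "MS model" series.
import numpy as np


def autocorr(x, max_lag=10):
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])


if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # "Data-like" returns with crude volatility clustering (regime-switching scale)
    vol = np.where((np.arange(2000) // 200) % 2 == 0, 0.005, 0.02)
    real_like = rng.normal(size=2000) * vol
    ms_model = rng.normal(scale=0.01, size=2000)        # iid simulated returns
    print("ACF of |returns|, lags 1-5")
    print("  data-like :", np.round(autocorr(np.abs(real_like))[:5], 3))
    print("  MS model  :", np.round(autocorr(np.abs(ms_model))[:5], 3))
```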

Relevance:

30.00%

Publisher:

Abstract:

A family of stochastic gradient algorithms and their behaviour on a data echo cancellation platform are presented. The cost-function adaptation algorithms use an error exponent update strategy based on an absolute error mapping, which is updated at every iteration; the quadratic and non-quadratic cost functions are special cases of the new family. Several possible realisations are introduced using these approaches. The noisy error problem is discussed and a digital recursive filter estimator is proposed. The simulation outcomes confirm the effectiveness of the proposed family of algorithms.
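
A hedged sketch of one member of such a family is given below: a stochastic gradient FIR adaptive filter whose cost is |e|^p, with the exponent p recomputed at every iteration from the instantaneous absolute error. The specific error-to-exponent mapping, step size and echo path used here are assumptions for illustration, not the algorithms or platform of the paper.

```python
# Hedged sketch of a variable-exponent stochastic gradient adaptive filter
# for echo cancellation. The error-to-exponent mapping and data are assumed.
import numpy as np


def variable_exponent_filter(x, d, n_taps=8, mu=0.01, p_min=1.0, p_max=2.0):
    """Adapt an FIR filter so that w^T x[n] tracks the desired signal d[n]."""
    w = np.zeros(n_taps)
    errors = []
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]                 # most recent samples first
        e = d[n] - w @ u
        # Large error -> exponent near 1 (robust); small error -> near 2 (LMS-like)
        p = p_min + (p_max - p_min) * np.exp(-abs(e))
        # Gradient of |e|^p with respect to w, with step size mu
        w += mu * p * abs(e) ** (p - 1) * np.sign(e) * u
        errors.append(e)
    return w, np.array(errors)


if __name__ == "__main__":
    rng = np.random.default_rng(4)
    x = rng.normal(size=5000)                              # far-end data signal
    echo_path = np.array([0.6, -0.3, 0.2, 0.1, 0.05, 0.0, 0.0, 0.0])
    d = np.convolve(x, echo_path)[:len(x)] + 0.01 * rng.normal(size=len(x))
    w, e = variable_exponent_filter(x, d)
    print("residual echo power (last 500 samples):",
          round(float(np.mean(e[-500:] ** 2)), 5))
    print("estimated echo path:", np.round(w, 2))
```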