76 results for potential models


Relevance: 30.00%

Publisher:

Abstract:

Composites of wind speeds, equivalent potential temperature, mean sea level pressure, vertical velocity, and relative humidity have been produced for the 100 most intense extratropical cyclones in the Northern Hemisphere winter for the 40-yr ECMWF Re-Analysis (ERA-40) and the high resolution global environment model (HiGEM). Features of conceptual models of cyclone structure—the warm conveyor belt, cold conveyor belt, and dry intrusion—have been identified in the composites from ERA-40 and compared to HiGEM. Such features can be identified in the composite fields despite the smoothing that occurs in the compositing process. The surface features and the three-dimensional structure of the cyclones in HiGEM compare very well with those from ERA-40. The warm conveyor belt is identified in the temperature and wind fields as a mass of warm air undergoing moist isentropic uplift and is very similar in ERA-40 and HiGEM. The rate of ascent is lower in HiGEM, associated with a shallower slope of the moist isentropes in the warm sector. There are also differences in the relative humidity fields in the warm conveyor belt. In ERA-40, the high values of relative humidity are strongly associated with the moist isentropic uplift, whereas in HiGEM these are not so strongly associated. The cold conveyor belt is identified as rearward flowing air that undercuts the warm conveyor belt and produces a low-level jet, and is very similar in HiGEM and ERA-40. The dry intrusion is identified in the 500-hPa vertical velocity and relative humidity. The structure of the dry intrusion compares well between HiGEM and ERA-40 but the descent is weaker in HiGEM because of weaker along-isentrope flow behind the composite cyclone. HiGEM’s ability to represent the key features of extratropical cyclone structure can give confidence in future predictions from this model.
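
For readers unfamiliar with the compositing step, the sketch below shows one minimal way to build a storm-centred composite of a gridded field; the window size, array layout and function names are illustrative assumptions, not details taken from the paper.

# Minimal storm-centred compositing sketch (assumptions: a regular lat/lon grid
# and a list of cyclone-centre indices; names are illustrative).
import numpy as np

def composite_field(field, centres, half_width=20):
    """Average a gridded field over storm-centred windows.

    field   : array (time, lat, lon)
    centres : list of (t, j, i) index triples marking cyclone centres
    half_width : half-size of the composite window in grid points
    """
    w = 2 * half_width + 1
    accum = np.zeros((w, w))
    count = 0
    for t, j, i in centres:
        window = field[t,
                       j - half_width:j + half_width + 1,
                       i - half_width:i + half_width + 1]
        if window.shape == (w, w):      # skip storms too close to the grid edge
            accum += window
            count += 1
    return accum / max(count, 1)

Applying something like this to each of the fields listed above (winds, equivalent potential temperature, mean sea level pressure, vertical velocity, relative humidity) over the 100 selected cyclones yields the kind of composite compared between ERA-40 and HiGEM.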

Relevance: 30.00%

Publisher:

Abstract:

The uptake and storage of anthropogenic carbon in the North Atlantic is investigated using different configurations of ocean general circulation/carbon cycle models. We investigate how different representations of the ocean physics in the models, which represent the range of models currently in use, affect the evolution of CO2 uptake in the North Atlantic. The buffer effect of the ocean carbon system would be expected to reduce ocean CO2 uptake as the ocean absorbs increasing amounts of CO2. We find that the strength of the buffer effect is very dependent on the model ocean state, as it affects both the magnitude and timing of the changes in uptake. The timescale over which uptake of CO2 in the North Atlantic drops to below preindustrial levels is particularly sensitive to the ocean state which sets the degree of buffering; it is less sensitive to the choice of atmospheric CO2 forcing scenario. Neglecting physical climate change effects, North Atlantic CO2 uptake drops below preindustrial levels between 50 and 300 years after stabilisation of atmospheric CO2 in different model configurations. Storage of anthropogenic carbon in the North Atlantic varies much less among the different model configurations, as differences in ocean transport of dissolved inorganic carbon and uptake of CO2 compensate each other. This supports the idea that measured inventories of anthropogenic carbon in the real ocean cannot be used to constrain the surface uptake. Including physical climate change effects reduces anthropogenic CO2 uptake and storage in the North Atlantic further, due to the combined effects of surface warming, increased freshwater input, and a slowdown of the meridional overturning circulation. The timescale over which North Atlantic CO2 uptake drops to below preindustrial levels is reduced by about one-third, leading to an estimate of this timescale for the real world of about 50 years after the stabilisation of atmospheric CO2. In the climate change experiment, a shallowing of the mixed layer depths in the North Atlantic results in a significant reduction in primary production, reducing the potential role for biology in drawing down anthropogenic CO2.
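
The "buffer effect" invoked here is conventionally quantified by the Revelle factor; the standard definition below is given for orientation only and is not taken from this study:

R = \frac{\partial \ln p\mathrm{CO_2}}{\partial \ln \mathrm{DIC}} \approx \frac{\Delta p\mathrm{CO_2} / p\mathrm{CO_2}}{\Delta \mathrm{DIC} / \mathrm{DIC}}

Because R rises as dissolved inorganic carbon accumulates, each further increment of DIC drives a proportionally larger increase in surface-water pCO2, which is one way of seeing why uptake can eventually fall below preindustrial levels once atmospheric CO2 is stabilised.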

Relevance: 30.00%

Publisher:

Abstract:

Satellite-observed data for flood events have been used to calibrate and validate flood inundation models, providing valuable information on the spatial extent of the flood. Improvements in the resolution of this satellite imagery have enabled indirect remote sensing of water levels by using an underlying LiDAR DEM to extract the water surface elevation at the flood margin. Further to comparison of the spatial extent, this now allows for direct comparison between modelled and observed water surface elevations. Using a 12.5 m ERS-1 image of a flood event in 2006 on the River Dee, North Wales, UK, both of these data types are extracted and each is assessed for its value in the calibration of flood inundation models. A LiDAR-guided snake algorithm is used to extract an outline of the flood from the satellite image. From the extracted outline a binary grid of wet/dry cells is created at the same resolution as the model; using this, the spatial extent of the modelled and observed flood can be compared using a measure of fit between the two binary patterns of flooding. Water heights are extracted at points at intervals of approximately 100 m along the extracted outline, and Student's t-test is used to compare modelled and observed water surface elevations. A LISFLOOD-FP model of the catchment is set up using LiDAR topographic data resampled to the 12.5 m resolution of the satellite image, and calibration of the friction parameter in the model is undertaken using each of the two approaches. Comparison between the two approaches highlights the sensitivity of the spatial measure of fit to uncertainty in the observed data and the potential drawbacks of using the spatial extent when parts of the flood are contained by the topography.
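
The binary-pattern comparison lends itself to a compact illustration. The sketch below uses the widely used F = A / (A + B + C) flood-extent fit statistic; the abstract does not state which measure of fit was used, so this particular formula is an assumption.

# Fit score between two binary inundation grids: A = cells wet in both,
# B = wet in the model only, C = wet in the observation only.
import numpy as np

def fit_statistic(model_wet, obs_wet):
    model_wet = np.asarray(model_wet, dtype=bool)
    obs_wet = np.asarray(obs_wet, dtype=bool)
    a = np.sum(model_wet & obs_wet)       # wet in both
    b = np.sum(model_wet & ~obs_wet)      # wet in model only
    c = np.sum(~model_wet & obs_wet)      # wet in observation only
    return a / float(a + b + c)

if __name__ == "__main__":
    m = np.array([[1, 1, 0], [0, 1, 0]], dtype=bool)   # toy model grid
    o = np.array([[1, 0, 0], [0, 1, 1]], dtype=bool)   # toy observed grid
    print(fit_statistic(m, o))                         # 2 / 4 = 0.5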

Relevance: 30.00%

Publisher:

Abstract:

The main biogeochemical nutrient distributions, along with ambient ocean temperature and the light field, control ocean biological productivity. Observations of nutrients are much sparser than physical observations of temperature and salinity, yet it is critical to validate biogeochemical models against these sparse observations if we are to successfully model biological variability and trends. Here we use data from the Bermuda Atlantic Time-series Study and the World Ocean Database 2005 to demonstrate quantitatively that over the entire globe a significant fraction of the temporal variability of phosphate, silicate and nitrate within the oceans is correlated with water density. The temporal variability of these nutrients as a function of depth is almost always greater than as a function of potential density, with the largest reductions in variability found within the main pycnocline. The greater nutrient variability as a function of depth occurs when dynamical processes vertically displace nutrient and density fields together on shorter timescales than biological adjustments. These results show that dynamical processes can have a significant impact on the instantaneous nutrient distributions. These processes must therefore be considered when modeling biogeochemical systems, when comparing such models with observations, or when assimilating data into such models.
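
A minimal way to reproduce the depth-versus-density comparison described above is sketched below; the array shapes, names and interpolation choice are illustrative assumptions.

# Temporal nutrient variability on fixed depth levels versus fixed
# potential-density surfaces (single-station sketch; names are assumptions).
import numpy as np

def variability_on_surfaces(nutrient, sigma, sigma_levels):
    """nutrient, sigma : arrays (time, depth); sigma_levels : target densities."""
    # Standard deviation in time at each fixed depth level
    std_z = np.nanstd(nutrient, axis=0)

    # Interpolate each profile onto fixed potential-density surfaces,
    # then take the standard deviation in time on those surfaces
    on_sigma = np.full((nutrient.shape[0], len(sigma_levels)), np.nan)
    for t in range(nutrient.shape[0]):
        order = np.argsort(sigma[t])          # np.interp needs increasing density
        on_sigma[t] = np.interp(sigma_levels, sigma[t][order], nutrient[t][order],
                                left=np.nan, right=np.nan)
    std_sigma = np.nanstd(on_sigma, axis=0)
    return std_z, std_sigma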

Relevance: 30.00%

Publisher:

Abstract:

Agricultural policy liberalisation, concern about unhealthy diets and growing recognition of the importance of sustainable land use have fostered interest in the development of competitive food chains based around products that are beneficial to the rural environment. We review the potential for foods with enhanced health attributes based on varieties/breeds and production systems that are alternatives to traditional agriculture, which has been motivated predominantly by yield. We concentrate on soft fruit, which is an important source of polyphenols, and on grazing livestock systems that have the potential to improve fatty acid profiles in meat products, and find clear scientific potential but limited research to date. Consumer research suggests considerable acceptance of such products and a willingness to pay sufficient to cover the additional production costs. Purchase of such foods could have major implications for agricultural land use and the rural environment. There is little research to date on specific healthier food products, but spatially explicit models are being developed to assess the land-use and environmental implications of changing demand and husbandry methods.

Relevance: 30.00%

Publisher:

Abstract:

The effectiveness of development assistance has come under renewed scrutiny in recent years. In an era of growing economic liberalisation, research organisations are increasingly being asked to account for the use of public funds by demonstrating achievements. However, in the natural resources (NR) research field, conventional economic assessment techniques have focused on quantifying the impact achieved rather than understanding the process that delivered it. As a result, they provide limited guidance for planners and researchers charged with selecting and implementing future research. In response, "pathways" or logic models have attracted increased interest in recent years as a remedy to this shortcoming. However, as commonly applied, these suffer from two key limitations in their ability to incorporate risk and to assess variance from plan. The paper reports the results of a case study that used a Bayesian belief network approach to address these limitations and outlines its potential value as a tool to assist the planning, monitoring and evaluation of development-orientated research.
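
As a toy illustration of the Bayesian belief network idea, the sketch below propagates probabilities through a two-stage research "pathway"; the nodes, states and numbers are invented for illustration and do not come from the case study.

# A toy two-stage belief network: output quality -> uptake -> impact.
# The marginal P(impact) is obtained by summing over the intermediate node.

p_output = {"good": 0.6, "poor": 0.4}                # prior on output quality (illustrative)
p_uptake_given_output = {"good": 0.7, "poor": 0.2}   # P(uptake | output quality)
p_impact_given_uptake = {True: 0.5, False: 0.05}     # P(impact | uptake)

p_impact = 0.0
for quality, pq in p_output.items():
    pu = p_uptake_given_output[quality]
    for uptake, p_u in ((True, pu), (False, 1 - pu)):
        p_impact += pq * p_u * p_impact_given_uptake[uptake]

print(round(p_impact, 3))   # 0.275 with these illustrative numbers

Risk enters through the conditional probability tables, and variance from plan can be assessed by updating the network with evidence about which intermediate outcomes actually occurred.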

Relevance: 30.00%

Publisher:

Abstract:

Mathematical models have been vitally important in the development of technologies in building engineering. A literature review identifies that linear models are the most widely used building simulation models. The advent of intelligent buildings has added new challenges to the application of the existing models, as an intelligent building requires learning and self-adjusting capabilities based on environmental and occupants' factors. It is therefore argued that linearity is an inappropriate basis for any model of complex building systems or occupant behaviours, whether for control or any other purpose. Chaos and complexity theory reflects the nonlinear dynamic properties of intelligent systems as exercised by occupants and the environment, and has been used widely in modelling various engineering, natural and social systems. It is proposed that chaos and complexity theory be applied to the study of intelligent buildings. This paper gives a brief description of chaos and complexity theory, presents its current positioning and recent developments in building engineering research, and outlines future potential applications to intelligent building studies, thereby providing a bridge between chaos and complexity theory and intelligent building research.
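
A standard minimal example of the nonlinear behaviour at issue (not taken from the paper) is the logistic map, whose sensitive dependence on initial conditions cannot be reproduced by any linear model:

# Logistic map: two almost identical initial states diverge to order-one
# separation, illustrating chaotic sensitivity to initial conditions.
def logistic_trajectory(x0, r=3.9, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)        # nearly identical starting point
print(abs(a[-1] - b[-1]))                # the trajectories have diverged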

Relevance: 30.00%

Publisher:

Abstract:

This article is the second part of a review of the historical evolution of mathematical models applied in the development of building technology. The first part described the current state of the art and contrasted various models with regard to their applications to conventional buildings and intelligent buildings. It concluded that the mathematical techniques adopted in neural networks, expert systems, fuzzy logic and genetic models, which can be used to address model uncertainty, are well suited for modelling intelligent buildings. Despite this progress, the likely future development of intelligent buildings along current trends exposes some potential limitations of these models. This paper attempts to uncover the fundamental limitations inherent in these models and provides some insights into future modelling directions, with special focus on the techniques of semiotics and chaos. Finally, by demonstrating an example of an intelligent building system together with the mathematical models that have been developed for such a system, this review addresses the influence of mathematical models as a potential aid in developing intelligent buildings, and perhaps even more advanced buildings, for the future.

Relevance: 30.00%

Publisher:

Abstract:

Reports that heat processing of foods induces the formation of acrylamide heightened interest in the chemistry, biochemistry, and safety of this compound. Acrylamide-induced neurotoxicity, reproductive toxicity, genotoxicity, and carcinogenicity are potential human health risks based on animal studies. Because exposure of humans to acrylamide can come from both external sources and the diet, there exists a need to develop a better understanding of its formation and distribution in food and its role in human health. To contribute to this effort, experts from eight countries have presented data on the chemistry, analysis, metabolism, pharmacology, and toxicology of acrylamide. Specifically covered are the following aspects: exposure from the environment and the diet; biomarkers of exposure; risk assessment; epidemiology; mechanism of formation in food; biological alkylation of amino acids, peptides, proteins, and DNA by acrylamide and its epoxide metabolite glycidamide; neurotoxicity, reproductive toxicity, and carcinogenicity; protection against adverse effects; and possible approaches to reducing levels in food. Cross-fertilization of ideas among several disciplines in which an interest in acrylamide has developed, including food science, pharmacology, toxicology, and medicine, will provide a better understanding of the chemistry and biology of acrylamide in food, and can lead to the development of food processes to decrease the acrylamide content of the diet.

Relevance: 30.00%

Publisher:

Abstract:

Distributed computing paradigms for sharing resources such as Clouds, Grids, Peer-to-Peer systems, or voluntary computing are becoming increasingly popular. While there are some success stories such as PlanetLab, OneLab, BOINC, BitTorrent, and SETI@home, a widespread use of these technologies for business applications has not yet been achieved. In a business environment, mechanisms are needed to provide incentives to potential users for participating in such networks. These mechanisms may range from simple non-monetary access rights and monetary payments to specific policies for sharing. Although a few models for a framework have been discussed (in the general area of a "Grid Economy"), none of these models has yet been realised in practice. This book attempts to fill this gap by discussing the reasons for such limited take-up and exploring incentive mechanisms for resource sharing in distributed systems. The purpose of this book is to identify research challenges in successfully using and deploying resource sharing strategies in open-source and commercial distributed systems.

Relevance: 30.00%

Publisher:

Abstract:

Physiological evidence using Infrared Video Microscopy during the uncaging of glutamate has proven the existence of excitable calcium ion channels in spine heads, highlighting the need for reliable models of spines. In this study we compare the three main methods of simulating excitable spines: Baer & Rinzel's Continuum (B&R) model, Coombes' Spike-Diffuse-Spike (SDS) model and paired cable and ion channel equations (Cable model). Tests are done to determine how well the models approximate each other in terms of speed and heights of travelling waves. Significant quantitative differences are found between the models: travelling waves in the SDS model in particular are found to travel at much lower speeds and sometimes much higher voltages than in the Cable or B&R models. Meanwhile qualitative differences are found between the B&R and SDS models over realistic parameter ranges. The cause of these differences is investigated and potential solutions proposed.
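
For orientation, the sketch below integrates the passive cable equation that underlies all three approaches with a simple explicit finite-difference scheme; the parameter values are illustrative, and the excitable spine-head channel dynamics that distinguish the B&R, SDS and Cable models are deliberately omitted.

# Explicit finite-difference integration of the passive cable equation
# tau * dV/dt = lambda^2 * d2V/dx2 - V + I  on a sealed-end cable.
import numpy as np

def passive_cable(n=200, dx=1e-5, dt=1e-5, steps=2000,
                  lam=2e-4, tau=2e-2, inj=1.0):
    v = np.zeros(n)
    alpha = (lam ** 2) * dt / (tau * dx ** 2)   # diffusion number (kept < 0.5 for stability)
    for _ in range(steps):
        d2 = np.empty(n)
        d2[1:-1] = v[2:] - 2 * v[1:-1] + v[:-2]
        d2[0] = v[1] - v[0]                     # sealed (no-flux) ends
        d2[-1] = v[-2] - v[-1]
        v = v + alpha * d2 - (dt / tau) * v
        v[n // 2] += (dt / tau) * inj           # steady current injected mid-cable
    return v                                    # voltage spreads and decays away from the injection site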

Relevance: 30.00%

Publisher:

Abstract:

This paper presents an approach for automatic classification of pulsed Terahertz (THz), or T-ray, signals, highlighting their potential in biomedical, pharmaceutical and security applications. T-ray classification systems supply a wealth of information about test samples and make possible the discrimination of heterogeneous layers within an object. In this paper, a novel technique involving the use of Auto Regressive (AR) and Auto Regressive Moving Average (ARMA) models on the wavelet transforms of measured T-ray pulse data is presented. Two example applications are examined: the classification of normal human bone (NHB) osteoblasts against human osteosarcoma (HOS) cells and the identification of six different powder samples. A variety of model types and orders are used to generate descriptive features for subsequent classification. Wavelet-based de-noising with soft threshold shrinkage is applied to the measured T-ray signals prior to modeling. For classification, a simple Mahalanobis distance classifier is used. After feature extraction, classification accuracy for cancerous and normal cell types is 93%, whereas for powders it is 98%.
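
The processing chain can be sketched compactly. The library choices, wavelet, thresholding rule, AR order and classifier bookkeeping below are my assumptions rather than the paper's exact settings; PyWavelets is assumed to be available.

# Wavelet soft-threshold de-noising, AR coefficients as features, and a
# Mahalanobis-distance decision (sketch only).
import numpy as np
import pywt

def denoise(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745             # noise-level estimate
    thr = sigma * np.sqrt(2 * np.log(len(signal)))              # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

def ar_features(signal, order=8):
    # Least-squares fit of an AR(order) model; the coefficients are the features.
    signal = np.asarray(signal, dtype=float)
    x = np.column_stack([signal[i: len(signal) - order + i] for i in range(order)])
    y = signal[order:]
    coef, *_ = np.linalg.lstsq(x, y, rcond=None)
    return coef

def mahalanobis_classify(feature, class_stats):
    # class_stats: {label: (mean_vector, inverse_covariance)} built from training data.
    dists = {label: float((feature - mu) @ vi @ (feature - mu))
             for label, (mu, vi) in class_stats.items()}
    return min(dists, key=dists.get)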

Relevance: 30.00%

Publisher:

Abstract:

Assimilation of physical variables into coupled physical/biogeochemical models poses considerable difficulties. One problem is that data assimilation can break relationships between physical and biological variables. As a consequence, biological tracers, especially nutrients, are incorrectly displaced in the vertical, resulting in unrealistic biogeochemical fields. To prevent this, we present the idea of applying an increment to the nutrient field within a data assimilating model to ensure that nutrient-potential density relationships are maintained within a water column during assimilation. After correcting the nutrients, it is assumed that other biological variables rapidly adjust to the corrected nutrient fields. We applied this method to a 17 year run of the 2° NEMO ocean-ice model coupled to the PlankTOM5 ecosystem model. Results were compared with a control with no assimilation, and with a model with physical assimilation but no nutrient increment. In the nutrient incrementing experiment, phosphate distributions were improved both at high latitudes and at the equator. At midlatitudes, assimilation generated unrealistic advective upwelling of nutrients within the boundary currents, which spread into the subtropical gyres resulting in more biased nutrient fields. This result was largely unaffected by the nutrient increment and is probably due to boundary currents being poorly resolved in a 2° model. Changes to nutrient distributions fed through into other biological parameters altering primary production, air-sea CO2 flux, and chlorophyll distributions. These secondary changes were most pronounced in the subtropical gyres and at the equator, which are more nutrient limited than high latitudes.
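
The nutrient increment itself can be written very compactly: the background nutrient-versus-potential-density relation of a water column is re-evaluated on the post-assimilation density profile, so nutrients stay tied to density. The single-column sketch below uses assumed variable names and linear interpolation.

# Nutrient increment preserving the nutrient-potential density relationship
# within one water column (all inputs are 1-D arrays over model levels).
import numpy as np

def nutrient_increment(nutrient_bg, sigma_bg, sigma_assim):
    order = np.argsort(sigma_bg)                        # np.interp needs increasing density
    nutrient_target = np.interp(sigma_assim, sigma_bg[order], nutrient_bg[order])
    return nutrient_target - nutrient_bg                # increment added to the nutrient tracer

Other biological variables are then left to adjust to the corrected nutrient field, as described above.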

Relevance: 30.00%

Publisher:

Abstract:

The ability of four operational weather forecast models [ECMWF, Action de Recherche Petite Echelle Grande Echelle model (ARPEGE), Regional Atmospheric Climate Model (RACMO), and Met Office] to generate a cloud at the right location and time (the cloud frequency of occurrence) is assessed in the present paper using a two-year time series of observations collected by profiling ground-based active remote sensors (cloud radar and lidar) located at three different sites in western Europe (Cabauw, Netherlands; Chilbolton, United Kingdom; and Palaiseau, France). Particular attention is given to potential biases that may arise from instrumentation differences (especially sensitivity) from one site to another and from intermittent sampling. In a second step the statistical properties of the cloud variables involved in most advanced cloud schemes of numerical weather forecast models (ice water content and cloud fraction) are characterized and compared with their counterparts in the models. The two years of observations are first considered as a whole in order to evaluate the accuracy of the statistical representation of the cloud variables in each model. It is shown that all models tend to produce too many high-level clouds, with too-high cloud fraction and ice water content. The midlevel and low-level cloud occurrence is also generally overestimated, with too-low cloud fraction but a correct ice water content. The dataset is then divided into seasons to evaluate the potential of the models to generate different cloud situations in response to different large-scale forcings. Strong variations in cloud occurrence are found in the observations from one season to the same season the following year, as well as in the seasonal cycle. Overall, the model biases observed using the whole dataset are still found at the seasonal scale, but the models generally manage to reproduce the observed seasonal variations in cloud occurrence well. Overall, the models do not generate the same cloud fraction distributions, and these distributions do not agree with the observations. Another general conclusion is that the use of continuous ground-based radar and lidar observations is definitely a powerful tool for evaluating model cloud schemes and for a responsive assessment of the benefit achieved by changing or tuning a model cloud scheme.
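
The basic diagnostic, the cloud frequency of occurrence, reduces to counting how often cloud is detected at each height. The sketch below shows one way to compute it identically for the observed radar/lidar mask and for the model; the cloud-fraction threshold and names are assumptions, not values from the paper.

# Cloud frequency of occurrence from a time-height cloud mask.
import numpy as np

def occurrence_profile(cloud_mask):
    """cloud_mask : boolean array (time, height); fraction of profiles with cloud at each height."""
    return np.asarray(cloud_mask, dtype=float).mean(axis=0)

def model_cloud_mask(cloud_fraction, threshold=0.05):
    # Treat a model grid box as cloudy when its cloud fraction exceeds a threshold,
    # so model and observed occurrence profiles are computed the same way.
    return np.asarray(cloud_fraction) > threshold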

Relevance: 30.00%

Publisher:

Abstract:

An aquaplanet model is used to study the nature of the highly persistent low-frequency waves that have been observed in models forced by zonally symmetric boundary conditions. Using the Hayashi spectral analysis of the extratropical waves, the authors find that a quasi-stationary wave 5 belongs to a wave packet obeying a well-defined dispersion relation with eastward group velocity. The components of the dispersion relation with k ≥ 5 baroclinically convert eddy available potential energy into eddy kinetic energy, whereas those with k < 5 are baroclinically neutral. In agreement with Green's model of baroclinic instability, wave 5 is weakly unstable, and the inverse energy cascade, which had been previously proposed as a main forcing for this type of wave, only acts as a positive feedback on its predominantly baroclinic energetics. The quasi-stationary wave is reinforced by a phase lock to an analogous pattern in the tropical convection, which provides further amplification to the wave. It is also found that the Pedlosky bounds on the phase speed of unstable waves provide guidance in explaining the latitudinal structure of the energy conversion, which is shown to be enhanced where the zonal westerly surface wind is weaker. The wave's energy is then trapped in the waveguide created by the upper-tropospheric jet stream. In agreement with Green's theory, as the equator-to-pole SST difference is reduced, the stationary marginally stable component shifts toward higher wavenumbers, while wave 5 becomes neutral and westward propagating. Some properties of the aquaplanet quasi-stationary waves are found to be in interesting agreement with a low-frequency wave observed by Salby during December–February in the Southern Hemisphere, so that this perspective on low-frequency variability, apart from its value in terms of basic geophysical fluid dynamics, might be of specific interest for studying the Earth's atmosphere.
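
The core of a Hayashi-type analysis is a space-time spectral decomposition that separates eastward- from westward-propagating variance at each zonal wavenumber. The sketch below shows a minimal version under my own sign conventions and names; the full Hayashi method additionally partitions standing variance, which is omitted here.

# Space-time power spectrum of a field sampled in (time, longitude).
import numpy as np

def space_time_power(field):
    """field : array (time, lon); returns wavenumbers, frequencies, power."""
    field = np.asarray(field, dtype=float)
    nt, nlon = field.shape
    spec = np.fft.fft2(field - field.mean()) / (nt * nlon)
    power = np.abs(np.fft.fftshift(spec)) ** 2                # indexed [frequency, wavenumber]
    k = np.fft.fftshift(np.fft.fftfreq(nlon)) * nlon          # integer zonal wavenumber
    freq = np.fft.fftshift(np.fft.fftfreq(nt))                # cycles per time step
    return k, freq, power

With numpy's forward-FFT sign convention, a wave of the form cos(2π(k x/nlon − f t/nt)) travelling eastward places its power where the frequency and wavenumber indices carry opposite signs, while westward-travelling waves place it where the signs agree; summing power over the appropriate quadrants for each wavenumber gives the eastward and westward spectra used in the analysis described above.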