911 results for "Evaluation of different sources of proteins"
Abstract:
Many studies comparing the effects of single- and multi-strain probiotics on pathogen inhibition compare treatments at different concentrations, and do not examine the possibility of inhibition between probiotic strains within a mixture. We tested the ability of 14 single-species probiotics to inhibit each other using a cross-streak assay and an agar spot test. We then tested the ability of 15 single-species probiotics and 5 probiotic mixtures to inhibit C. difficile, E. coli and S. Typhimurium using the agar spot test. Mixtures were created in two ways: in one group the component species were incubated together; in the other, the component species were incubated separately, adjusted to equal optical density, and then mixed in equal volumes. Inhibition was observed for all combinations of probiotics, suggesting that probiotics within a mixture may inhibit one another, potentially reducing the efficacy of the mixture. Significant inter-species variation was seen against each pathogen. When single species were tested against mixtures, the multi-species preparations displayed significantly (p<0.05 or less) greater inhibition of pathogens in 12 out of 24 cases. Despite evidence that probiotic species inhibit each other when incubated together in vitro, in many cases a probiotic mixture was more effective at inhibiting pathogens than its component species when tested at approximately equal concentrations of biomass. This suggests that using a probiotic mixture might be more effective at reducing gastrointestinal infections, and that a mixture created from species with different effects against different pathogens may have a broader spectrum of action than that provided by a single strain.
Abstract:
Diffuse pollution, and the contribution from agriculture in particular, has become increasingly important as pollution from point sources has been addressed by wastewater treatment. Land management approaches, such as the construction of field wetlands, provide one group of mitigation options available to farmers. Although field wetlands are widely used for diffuse pollution control in temperate environments worldwide, there is a shortage of evidence for the effectiveness and viability of these mitigation options in the UK. The Mitigation Options for Phosphorus and Sediment Project aims to make recommendations regarding the design and effectiveness of field wetlands for diffuse pollution control in UK landscapes. Ten wetlands have been built on four farms in Cumbria and Leicestershire. This paper focuses on sediment retention within the wetlands, estimated from annual sediment surveys in the first two years, and discusses establishment costs. It is clear that the wetlands are effective in trapping a substantial amount of sediment. Estimates of annual sediment retention suggest higher trapping rates at sandy sites (0.5–6 t ha⁻¹ yr⁻¹) than at silty sites (0.02–0.4 t ha⁻¹ yr⁻¹) and clay sites (0.01–0.07 t ha⁻¹ yr⁻¹). Establishment costs for the wetlands ranged from £280 to £3100 and depended more on site-specific factors, such as fencing and gateways on livestock farms, than on wetland size or design. Wetlands with lower trapping rates would also have lower maintenance costs, as dredging would be required less frequently. The results indicate that field wetlands show promise for inclusion in agri-environment schemes, particularly if capital payments can be provided for establishment, to encourage uptake of these multi-functional features.
Abstract:
The application of antibodies to living neurones has the potential to modulate the function of specific proteins by virtue of their high specificity. This specificity has proven effective in determining the involvement of many proteins in neuronal function where specific agonists and antagonists do not exist, e.g. ion channel subunits. We discuss studies where antibodies modulate the functions of voltage-gated sodium, voltage-gated potassium, voltage-gated calcium, hyperpolarisation-activated cyclic nucleotide-gated (HCN) and transient receptor potential (TRP) channels. Ligand-gated channels studied in this way include nicotinic acetylcholine receptors, purinoceptors and GABA receptors. Antibodies have also helped reveal the involvement of different intracellular proteins in neuronal functions, including G-proteins as well as other proteins involved in trafficking, phosphoinositide signalling and neurotransmitter release. Some suggestions for control experiments are made to help validate the method. We conclude that antibodies can be extremely valuable in determining the functions of specific proteins in living neurones in neuroscience research.
Abstract:
Earth system models are increasing in complexity and incorporating more processes than their predecessors, making them important tools for studying the global carbon cycle. However, their coupled behaviour has only recently been examined in any detail, and has yielded a very wide range of outcomes, with coupled climate-carbon cycle models that represent land-use change simulating total land carbon stores by 2100 that vary by as much as 600 Pg C given the same emissions scenario. This large uncertainty is associated with differences in how key processes are simulated in different models, and illustrates the necessity of determining which models are most realistic using rigorous model evaluation methodologies. Here we assess the state-of-the-art with respect to evaluation of Earth system models, with a particular emphasis on the simulation of the carbon cycle and associated biospheric processes. We examine some of the new advances and remaining uncertainties relating to (i) modern and palaeo data and (ii) metrics for evaluation, and discuss a range of strategies, such as the inclusion of pre-calibration, combined process- and system-level evaluation, and the use of emergent constraints, that can contribute towards the development of more robust evaluation schemes. An increasingly data-rich environment offers more opportunities for model evaluation, but it is also a challenge, as more knowledge about data uncertainties is required in order to determine robust evaluation methodologies that move the field of ESM evaluation from a "beauty contest" towards the development of useful constraints on model behaviour.
Abstract:
Earth system models (ESMs) are increasing in complexity by incorporating more processes than their predecessors, making them potentially important tools for studying the evolution of climate and associated biogeochemical cycles. However, their coupled behaviour has only recently been examined in any detail, and has yielded a very wide range of outcomes. For example, coupled climate–carbon cycle models that represent land-use change simulate total land carbon stores at 2100 that vary by as much as 600 Pg C, given the same emissions scenario. This large uncertainty is associated with differences in how key processes are simulated in different models, and illustrates the necessity of determining which models are most realistic using rigorous methods of model evaluation. Here we assess the state-of-the-art in evaluation of ESMs, with a particular emphasis on the simulation of the carbon cycle and associated biospheric processes. We examine some of the new advances and remaining uncertainties relating to (i) modern and palaeodata and (ii) metrics for evaluation. We note that the practice of averaging results from many models is unreliable and no substitute for proper evaluation of individual models. We discuss a range of strategies, such as the inclusion of pre-calibration, combined process- and system-level evaluation, and the use of emergent constraints, that can contribute to the development of more robust evaluation schemes. An increasingly data-rich environment offers more opportunities for model evaluation, but also presents a challenge. Improved knowledge of data uncertainties is still necessary to move the field of ESM evaluation away from a "beauty contest" towards the development of useful constraints on model outcomes.
Abstract:
Precipitation forecast data from the ERA-Interim reanalysis (33 years) are evaluated using the daily England and Wales Precipitation (EWP) observations obtained from a rain gauge network. Observed and reanalysis daily precipitation data are both described well by Weibull distributions with indistinguishable shapes but different scale parameters, such that the reanalysis underestimates the observations by an average of 22%. The correlation between the observed and ERA-Interim time series of regional, daily precipitation is 0.91. ERA-Interim also captures the statistics of extreme precipitation, including a slightly lower likelihood of the heaviest precipitation events (>15 mm day⁻¹ for the regional average) than indicated by the Weibull fit. ERA-Interim is also closer to EWP for the high precipitation events. Since these carry weight in longer accumulations, a smaller underestimation of 19% is found for monthly mean precipitation. The partition between convective and stratiform precipitation in the ERA-Interim forecast is also examined. In summer both components contribute equally to the total precipitation amount, while in winter the stratiform precipitation is approximately double the convective. These results are expected to be relevant to other regions with low orography on the coast of a continent at the downstream end of mid-latitude storm tracks.
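The reported scale-parameter difference has a simple consequence: for two Weibull distributions sharing the same shape parameter, the ratio of their means equals the ratio of their scale parameters. A minimal sketch (the shape and scale values below are hypothetical illustrations, not figures from the study):

```python
import math

def weibull_mean(shape_k, scale_lam):
    # Mean of a Weibull distribution: lambda * Gamma(1 + 1/k)
    return scale_lam * math.gamma(1.0 + 1.0 / shape_k)

# Hypothetical illustrative parameters (not values from the study):
k = 0.8                    # shape shared by observations and reanalysis
lam_obs = 3.0              # observed scale, mm/day
lam_era = 0.78 * lam_obs   # reanalysis scale 22% lower

# With a common shape, the Gamma factor cancels, so a 22% scale deficit
# implies the same 22% deficit in mean precipitation.
ratio = weibull_mean(k, lam_era) / weibull_mean(k, lam_obs)
```

This is why a single scale-parameter ratio summarises the mean underestimation across the whole distribution.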
Abstract:
Many of the next generation of global climate models will include aerosol schemes which explicitly simulate the microphysical processes that determine the particle size distribution. These models enable aerosol optical properties and cloud condensation nuclei (CCN) concentrations to be determined by fundamental aerosol processes, which should lead to a more physically based simulation of aerosol direct and indirect radiative forcings. This study examines the global variation in particle size distribution simulated by 12 global aerosol microphysics models to quantify model diversity and to identify any common biases against observations. Evaluation against size distribution measurements from a new European network of aerosol supersites shows that the mean model agrees quite well with the observations at many sites on the annual mean, but there are some seasonal biases common to many sites. In particular, at many of these European sites, the accumulation mode number concentration is biased low during winter and Aitken mode concentrations tend to be overestimated in winter and underestimated in summer. At high northern latitudes, the models strongly underpredict Aitken and accumulation particle concentrations compared to the measurements, consistent with previous studies that have highlighted the poor performance of global aerosol models in the Arctic. In the marine boundary layer, the models capture the observed meridional variation in the size distribution, which is dominated by the Aitken mode at high latitudes, with an increasing concentration of accumulation particles with decreasing latitude. Considering vertical profiles, the models reproduce the observed peak in total particle concentrations in the upper troposphere due to new particle formation, although modelled peak concentrations tend to be biased high over Europe. 
Overall, the multi-model-mean data set simulates the global variation of the particle size distribution with a good degree of skill, suggesting that most of the individual global aerosol microphysics models are performing well, although the large model diversity indicates that some models are in poor agreement with the observations. Further work is required to better constrain size-resolved primary and secondary particle number sources, and an improved understanding of nucleation and growth (e.g. the role of nitrate and secondary organics) will improve the fidelity of simulated particle size distributions.
Abstract:
This report assesses the implications and revenue-generating potential of options for reform of the International Treaty on Plant Genetic Resources for Food and Agriculture, in the context of the structure of the global seed industry and the emerging landscape of plant variety innovation for different crops. Implementing these options would require modifications of the Treaty and of the provisions of the Standard Material Transfer Agreement to alter the nature of the payment obligations related to different categories of products, the payment rates under the different options, and the coverage of crops in Annex I to the Treaty.
Abstract:
A new frontier in weather forecasting is emerging as operational forecast models are now run at convection-permitting resolutions at many national weather services. However, this is not a panacea; significant systematic errors remain in the character of convective storms and rainfall distributions. The DYMECS project (Dynamical and Microphysical Evolution of Convective Storms) is taking a fundamentally new approach to evaluate and improve such models: rather than relying on a limited number of cases, which may not be representative, we have gathered a large database of 3D storm structures on 40 convective days using the Chilbolton radar in southern England. We have related these structures to storm life-cycles derived by tracking features in the rainfall from the UK radar network, and compared them statistically to storm structures in the Met Office model, which we ran at horizontal grid lengths between 1.5 km and 100 m, including simulations with different subgrid mixing lengths. We also evaluated the scale and intensity of convective updrafts using a new radar technique. We find that the horizontal size of simulated convective storms, and of the updrafts within them, is much too large at 1.5-km resolution, such that the convective mass flux of individual updrafts can be too large by an order of magnitude. The scale of precipitation cores and updrafts decreases steadily with decreasing grid length, as does the typical storm lifetime. The 200-m grid-length simulation with the standard mixing length performs best over all diagnostics, although a greater mixing length improves the representation of deep convective storms.
Abstract:
Hydrogels are polymeric materials used in many pharmaceutical and biomedical applications due to their ability to form 3D hydrophilic polymeric networks, which can absorb large amounts of water. In the present work, polyethylene glycols (PEG) were introduced into the hydrogel liquid phase in order to improve the mechanical properties of hydrogels composed of 2-hydroxyethylacrylate and 2-hydroxyethylmethacrylate (HEA–HEMA), synthesized with different co-monomer compositions and equilibrated in water or in 20 % water–PEG 400 and 600 solutions. Thermoanalytical techniques [differential scanning calorimetry (DSC) and thermogravimetry (TG)] were used to evaluate the amount and properties of free and bound water in HEA–HEMA hydrogels. The internal structure and the mechanical properties of the hydrogels were studied using scanning electron microscopy and a friability assay. TG "loss-on-drying" experiments were applied to study the water-retention properties of the hydrogels, whereas the combination of TG and DSC allowed estimation of the total amounts of freezable and non-freezing water in the hydrogels. The results show that the addition of a viscous co-solvent (PEG) to the liquid medium significantly improves the mechanical properties of HEA–HEMA hydrogels and also slightly retards water loss from the hydrogels. A redistribution of free and bound water takes place in the hydrogels equilibrated in mixed solutions containing 20 vol% of PEGs.
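The TG/DSC combination described above amounts to a simple mass balance: TG loss-on-drying gives the total water content, the DSC melting endotherm gives the freezable water (via the specific enthalpy of fusion of ice, roughly 334 J/g), and the non-freezing bound water is the difference. A sketch with hypothetical sample numbers (not measurements from the study):

```python
def water_partition(total_water_mg, melt_enthalpy_mJ, h_fusion_J_per_g=334.0):
    # Freezable water mass from the DSC melting endotherm.
    # Unit check: mJ / (J/g) = 1e-3 g = mg, so the division gives mg directly.
    freezable_mg = melt_enthalpy_mJ / h_fusion_J_per_g
    # Non-freezing (bound) water: seen by TG as mass loss,
    # but contributing no melting enthalpy in DSC.
    non_freezing_mg = total_water_mg - freezable_mg
    return freezable_mg, non_freezing_mg

# Hypothetical sample: 50 mg total water (TG), 11690 mJ melting enthalpy (DSC)
freezable, bound = water_partition(50.0, 11690.0)   # 35 mg freezable, 15 mg bound
```

The same arithmetic underlies the redistribution result: PEG in the liquid phase shifts the freezable/bound split without changing the total water measured by TG.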
Abstract:
The objective of this article is to study the problem of pedestrian classification across different light spectrum domains (visible and far-infrared (FIR)) and modalities (intensity, depth and motion). In recent years, there have been a number of approaches for classifying and detecting pedestrians in both FIR and visible images, but the methods are difficult to compare because either the datasets are not publicly available or they do not offer a comparison between the two domains. Our two primary contributions are the following: (1) we propose a public dataset, named RIFIR, containing both FIR and visible images collected in an urban environment from a moving vehicle during daytime; and (2) we compare the state-of-the-art features in a multi-modality setup (intensity, depth and flow) in the far-infrared versus visible domains. The experiments show that the feature families intensity self-similarity (ISS), local binary patterns (LBP), local gradient patterns (LGP) and histogram of oriented gradients (HOG), computed from the FIR and visible domains, are highly complementary, but their relative performance varies across different modalities. In our experiments, the FIR domain has proven superior to the visible one for the task of pedestrian classification, but the overall best results are obtained by a multi-domain, multi-modality, multi-feature fusion.
Abstract:
The decision to close airspace in the event of a volcanic eruption is based on hazard maps of predicted ash extent. These are produced using output from volcanic ash transport and dispersion (VATD) models. In this paper an objective metric to evaluate the spatial accuracy of VATD simulations relative to satellite retrievals of volcanic ash is presented. The metric is based on the fractions skill score (FSS). This measure of skill provides more information than traditional point-by-point metrics, such as the success index and Pearson correlation coefficient, as it takes into account the spatial scale over which skill is being assessed. The FSS determines the scale over which a simulation has skill and can differentiate between a "near miss" and a forecast that is badly misplaced. The idealised scenarios presented show that even simulations with considerable displacement errors have useful skill when evaluated over neighbourhood scales of 200–700 km². This method could be used to compare forecasts produced by different VATDs or using different model parameters, assess the impact of assimilating satellite-retrieved ash data and evaluate VATD forecasts over a long time period.
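As a concrete illustration of how the FSS rewards a "near miss", here is a minimal pure-Python sketch: binary ash/no-ash fields are converted to neighbourhood fractions, and the score 1 − MSE/MSE_ref is computed. The toy grids and window sizes are invented for illustration; the paper's retrievals and scales differ.

```python
def window_fractions(grid, n):
    # Fraction of "ash present" cells in every n x n window of a binary grid.
    rows, cols = len(grid), len(grid[0])
    return [[sum(grid[i + di][j + dj] for di in range(n) for dj in range(n)) / (n * n)
             for j in range(cols - n + 1)]
            for i in range(rows - n + 1)]

def fss(obs, fcst, n):
    # Fractions skill score: 1 - MSE(fractions) / reference MSE.
    fo = [v for row in window_fractions(obs, n) for v in row]
    ff = [v for row in window_fractions(fcst, n) for v in row]
    mse = sum((a - b) ** 2 for a, b in zip(fo, ff)) / len(fo)
    ref = sum(a * a + b * b for a, b in zip(fo, ff)) / len(fo)
    return 1.0 - mse / ref if ref > 0 else 0.0

# Toy 6x6 case: the forecast ash cloud is displaced two cells to the right.
obs  = [[1 if j in (1, 2) and 1 <= i <= 4 else 0 for j in range(6)] for i in range(6)]
fcst = [[1 if j in (3, 4) and 1 <= i <= 4 else 0 for j in range(6)] for i in range(6)]

# At grid scale (n=1) the displaced forecast never overlaps the observations
# and scores zero; over a 3x3 neighbourhood it recovers useful skill.
skill_point = fss(obs, fcst, 1)
skill_neigh = fss(obs, fcst, 3)
```

This is exactly the "near miss" behaviour the abstract describes: a point-by-point metric sees total failure, while the neighbourhood-based FSS identifies the scale at which the forecast is useful.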
Abstract:
Georeferencing is one of the major tasks of satellite-borne remote sensing. Compared to traditional indirect methods, direct georeferencing through a Global Positioning System/inertial navigation system requires fewer and simpler steps to obtain the exterior orientation parameters of remotely sensed images. However, the pixel shift caused by geographic positioning error, which generally derives from boresight angle error as well as terrain topography variation, can have a great impact on the precision of georeferencing. The distribution of pixel shifts introduced by the positioning error on a satellite linear push-broom image is quantitatively analyzed. We use the variation of the object space coordinate to simulate different kinds of positioning errors and terrain topography. A total differential method is then applied to establish a rigorous sensor model in order to mathematically obtain the relationship between pixel shift and positioning error. Finally, two simulation experiments are conducted using the imaging parameters of the Chang'E-1 satellite to evaluate two different kinds of positioning errors. The experimental results show that, with the experimental parameters, the maximum pixel shift can reach 1.74 pixels. The proposed approach can be extended to a generic application for imaging error modeling in remote sensing with terrain variation.
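The paper's rigorous total-differential sensor model is not reproduced here, but the first-order geometry behind it is easy to sketch: over flat terrain, a small boresight pointing error displaces the imaged ground point by roughly altitude × tan(error), and dividing by the ground sample distance converts that displacement to pixels. All numbers below are hypothetical illustrations, not the Chang'E-1 parameters used in the paper.

```python
import math

def pixel_shift(altitude_m, boresight_err_deg, gsd_m):
    # First-order, flat-terrain approximation: ground displacement caused
    # by a small boresight pointing error, expressed in pixels.
    ground_offset_m = altitude_m * math.tan(math.radians(boresight_err_deg))
    return ground_offset_m / gsd_m

# Hypothetical lunar-orbiter geometry: 200 km altitude, 120 m ground pixel,
# 0.05 degree boresight error.
shift = pixel_shift(200_000.0, 0.05, 120.0)
```

The rigorous model additionally propagates terrain height variation and the full exterior orientation through the push-broom imaging equations, which is why the paper's analysis is not a single formula like this one.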
Abstract:
Designing surgical instruments for robot-assisted minimally invasive surgery (RAMIS) is challenging due to constraints on the number and type of sensors imposed by considerations such as space or the need for sterilization. A new method for evaluating the usability of virtual teleoperated surgical instruments based on virtual sensors is presented. This method uses virtual prototyping of the surgical instrument with a dual physical interaction, which allows testing of different sensor configurations in a real environment. Moreover, the proposed approach has been applied to the evaluation of prototypes of a two-finger grasper for lump detection by remote pinching. In this example, the usability of a set of five sensor configurations, each with a different number of force sensors, is evaluated in terms of quantitative and qualitative measures in clinical experiments with 23 volunteers. As a result, the smallest number of force sensors in the surgical instrument that ensures the usability of the device can be determined. The details of the experimental setup are also included.
Abstract:
Different treatments that could be implemented in the home environment are evaluated with the objective of achieving a more rational and efficient use of energy. We consider that a detailed knowledge of energy-consuming behaviour is paramount for the development and implementation of new technologies, services and even policies that could result in more rational energy use. The proposed evaluation methodology is based on economic experiments implemented in an experimental economics laboratory, where the behaviour of individuals making decisions related to energy use in the domestic environment can be tested.