945 results for EVALUATING CATTLE DIETS


Relevance:

20.00%

Publisher:

Abstract:

Global syntheses of palaeoenvironmental data are required to test climate models under conditions different from the present. Data sets for this purpose contain data from spatially extensive networks of sites. The data are either directly comparable to model output or readily interpretable in terms of modelled climate variables. Data sets must contain sufficient documentation to distinguish between raw (primary) and interpreted (secondary, tertiary) data, to evaluate the assumptions involved in interpretation of the data, to exercise quality control, and to select data appropriate for specific goals. Four data bases for the Late Quaternary, documenting changes in lake levels since 30 kyr BP (the Global Lake Status Data Base), vegetation distribution at 18 kyr and 6 kyr BP (BIOME 6000), aeolian accumulation rates during the last glacial-interglacial cycle (DIRTMAP), and tropical terrestrial climates at the Last Glacial Maximum (the LGM Tropical Terrestrial Data Synthesis) are summarised. Each has been used to evaluate simulations of Last Glacial Maximum (LGM: 21 calendar kyr BP) and/or mid-Holocene (6 cal. kyr BP) environments. Comparisons have demonstrated that changes in radiative forcing and orography due to orbital and ice-sheet variations explain the first-order, broad-scale (in space and time) features of global climate change since the LGM. However, atmospheric models forced by 6 cal. kyr BP orbital changes with unchanged surface conditions fail to capture quantitative aspects of the observed climate, including the greatly increased magnitude and northward shift of the African monsoon during the early to mid-Holocene. Similarly, comparisons with palaeoenvironmental datasets show that atmospheric models have underestimated the magnitude of cooling and drying of much of the land surface at the LGM. 
The inclusion of feedbacks due to changes in ocean- and land-surface conditions at both times, and atmospheric dust loading at the LGM, appears to be required in order to produce a better simulation of these past climates. The development of Earth system models incorporating the dynamic interactions among ocean, atmosphere, and vegetation is therefore mandated by Quaternary science results as well as climatological principles. For greatest scientific benefit, this development must be paralleled by continued advances in palaeodata analysis and synthesis, which in turn will help to define questions that call for new focused data collection efforts.

Relevance:

20.00%

Publisher:

Abstract:

Ancestral human populations had diets containing more indigestible plant material than present-day diets in industrialized countries. One hypothesis for the rise in prevalence of obesity is that physiological mechanisms for controlling appetite evolved to match a diet with plant fiber content higher than that of present-day diets. We investigated how diet affects gut microbiota and colon cells by comparing human microbial communities with those from a primate that has an extreme plant-based diet, namely, the gelada baboon, which is a grazer. The effects of potato (high starch) versus grass (high lignin and cellulose) diets on human-derived versus gelada-derived fecal communities were compared in vitro. We especially focused on the production of short-chain fatty acids, which are hypothesized to be key metabolites influencing appetite regulation pathways. The results confirmed that diet has a major effect on bacterial numbers, short-chain fatty acid production, and the release of hormones involved in appetite suppression. The potato diet yielded greater production of short-chain fatty acids and hormone release than the grass diet, even in the gelada cultures, which we had expected should be better adapted to the grass diet. The strong effects of diet on hormone release could not be explained, however, solely by short-chain fatty acid concentrations. Nuclear magnetic resonance spectroscopy found changes in additional metabolites, including betaine and isoleucine, that might play key roles in inhibiting and stimulating appetite suppression pathways. Our study results indicate that a broader array of metabolites might be involved in triggering gut hormone release in humans than previously thought. IMPORTANCE: One theory for rising levels of obesity in western populations is that the body's mechanisms for controlling appetite evolved to match ancestral diets with more low-energy plant foods. 
We investigated this idea by comparing the effects of diet on appetite suppression pathways via the use of gut bacterial communities from humans and gelada baboons, which are modern-day primates with an extreme diet of low-energy plant food, namely, grass. We found that diet does play a major role in affecting gut bacteria and the production of a hormone that suppresses appetite, but not in the direction predicted by the ancestral diet hypothesis. Also, the bacterial products correlated with hormone release differed from those normally thought to play this role. By comparing microbiota and diets outside the natural range for modern humans, we found a relationship between diet and appetite pathways that was more complex than previously hypothesized on the basis of more-controlled studies of the effects of single compounds.

Relevance:

20.00%

Publisher:

Abstract:

Information technology has become heavily embedded in business operations. As business needs change over time, IT applications are expected to continue providing the required support. Whether the existing IT applications are still fit for the business purpose for which they were intended, or whether new IT applications should be introduced, is a strategic decision for business, IT and business-aligned IT. In this paper, we present a method which aims to analyse business functions and IT roles, and to evaluate business-aligned IT from both social and technical perspectives. The method introduces a set of techniques that systematically support the evaluation of existing IT applications in relation to their technical capabilities for maximising business value. Furthermore, we discuss the evaluation process and results, which are illustrated and validated through a real-life case study of a UK borough council, followed by a discussion of the implications for researchers and practitioners.

Relevance:

20.00%

Publisher:

Abstract:

The paper analyses the impact of a priori determinants of biosecurity behaviour of farmers in Great Britain. We use a dataset collected through a stratified telephone survey of 900 cattle and sheep farmers in Great Britain (400 in England and a further 250 each in Wales and Scotland), which took place between 25 March 2010 and 18 June 2010. The survey was stratified by farm type, farm size and region. To test the influence of a priori determinants on biosecurity behaviour we used a behavioural economics method: structural equation modelling (SEM) with observed and latent variables. SEM is a statistical technique for testing and estimating causal relationships amongst variables, some of which may be latent, using a combination of statistical data and qualitative causal assumptions. Thirteen latent variables were identified and extracted, expressing the behaviour and the underlying determining factors. The variables were: experience; economic factors; organic certification of the farm; membership in a cattle/sheep health scheme; perceived usefulness of biosecurity information sources; knowledge about biosecurity measures; perceived importance of specific biosecurity strategies; perceived effect (on the farm business in the past five years) of welfare/health regulation; perceived effect of severe outbreaks of animal diseases; attitudes towards livestock biosecurity; attitudes towards animal welfare; influence on the decision to apply biosecurity measures; and biosecurity behaviour. The SEM model applied to the Great Britain sample has an adequate fit according to measures of absolute, incremental and parsimonious fit. 
The results suggest that farmers’ perceived importance of specific biosecurity strategies, organic certification of the farm, knowledge about biosecurity measures, attitudes towards animal welfare, perceived usefulness of biosecurity information sources, perceived effect on the business during the past five years of severe outbreaks of animal diseases, membership in a cattle/sheep health scheme, attitudes towards livestock biosecurity, influence on the decision to apply biosecurity measures, experience and economic factors significantly influence behaviour, overall explaining 64% of the variance in behaviour.

Relevance:

20.00%

Publisher:

Abstract:

Sustainable Intensification (SI) of agriculture has recently received widespread political attention, both in the UK and internationally. The concept recognises the need to simultaneously raise yields, increase input use efficiency and reduce the negative environmental impacts of farming systems, in order to secure future food production and to use the limited resources for agriculture sustainably. The objective of this paper is to outline a policy-making tool to assess SI at the farm level. Based on the method introduced by Kuosmanen and Kortelainen (2005), we use an adapted Data Envelopment Analysis (DEA) to consider the substitution possibilities between economic value and environmental pressures generated by farming systems in an aggregated index of Eco-Efficiency. Farm-level data, specifically General Cropping Farms (GCFs) from the East Anglian River Basin Catchment (EARBC), UK, were used as the basis for this analysis. The assignment of weights to environmental pressures through linear programming techniques, when optimising the relative Eco-Efficiency score, allows the identification of appropriate production technologies and practices (integrated pest management, conservation farming, precision agriculture, etc.) for each farm and therefore indicates specific improvements that can be undertaken towards SI. The results are used to suggest strategies for the integration of farming practices and environmental policies within the framework of SI of agriculture. Paths for improving the index of Eco-Efficiency, and therefore reducing environmental pressures, are also outlined.
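The weighting step described above can be illustrated with a small sketch. All figures below are hypothetical (they are not GCF/EARBC data), and the formulation is only a simple instance of the Kuosmanen–Kortelainen style eco-efficiency score: each farm's economic value is divided by the most favourable weighted sum of its environmental pressures, with the weights found by linear programming.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data for four farms: economic value added (v) and two
# environmental pressures (columns of Z), e.g. nutrient surplus and
# greenhouse gas emissions. None of these figures come from the study.
v = np.array([100.0, 80.0, 120.0, 60.0])
Z = np.array([[50.0, 30.0],
              [30.0, 25.0],
              [70.0, 20.0],
              [40.0, 40.0]])

def eco_efficiency(k):
    """Relative eco-efficiency of farm k: minimise the weighted pressure
    w.z_k subject to w.z_j >= v_j for every farm j and w >= 0, then
    score v_k / (w*.z_k). Frontier farms score exactly 1."""
    res = linprog(c=Z[k],                        # weighted pressure of farm k
                  A_ub=-Z, b_ub=-v,              # -w.z_j <= -v_j  <=>  w.z_j >= v_j
                  bounds=[(0, None)] * Z.shape[1])
    return float(v[k] / res.fun)

scores = [round(eco_efficiency(k), 3) for k in range(len(v))]
print(scores)
```

Farms scoring below 1 can inspect the optimal weights as the pressure trade-offs under which they look best, which is what points towards farm-specific improvements.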

Relevance:

20.00%

Publisher:

Abstract:

We evaluate the ability of process-based models to reproduce observed global mean sea-level change. When the models are forced by changes in natural and anthropogenic radiative forcing of the climate system and anthropogenic changes in land-water storage, the average of the modelled sea-level change for the periods 1900–2010, 1961–2010 and 1990–2010 is about 80%, 85% and 90% of the observed rise, respectively. The modelled rate of rise is over 1 mm yr−1 prior to 1950, decreases to less than 0.5 mm yr−1 in the 1960s, and increases to 3 mm yr−1 by 2000. When observed regional climate changes are used to drive a glacier model and an allowance is included for an ongoing adjustment of the ice sheets, the modelled sea-level rise is about 2 mm yr−1 prior to 1950, similar to the observations. The model results encompass the observed rise, and the model average is within 20% of the observations (about 10% when the observed ice-sheet contributions since 1993 are added), increasing confidence in future projections for the 21st century. The increased rate of rise since 1990 is not part of a natural cycle but a direct response to increased radiative forcing (both anthropogenic and natural), which will continue to grow with ongoing greenhouse gas emissions.

Relevance:

20.00%

Publisher:

Abstract:

We utilize energy budget diagnostics from the Coupled Model Intercomparison Project phase 5 (CMIP5) to evaluate the models' climate forcing since preindustrial times, employing an established regression technique. The climate forcing evaluated this way, termed the adjusted forcing (AF), includes a rapid adjustment term associated with cloud changes and other tropospheric and land-surface changes. We estimate a 2010 total anthropogenic and natural AF from CMIP5 models of 1.9 ± 0.9 W m−2 (5–95% range). The projected AF of the Representative Concentration Pathway simulations is lower than the expected radiative forcing (RF) in 2095 but agrees well with efficacy-weighted forcings from integrated assessment models. The smaller AF, compared to RF, is likely due to cloud adjustment. Multimodel time series of temperature change and AF from 1850 to 2100 have large intermodel spreads throughout the period. The intermodel spread of temperature change is principally driven by forcing differences in the present day and by climate feedback differences in 2095, although forcing differences are still important for model spread at 2095. We find no significant relationship between the equilibrium climate sensitivity (ECS) of a model and its 2003 AF, in contrast to what was found in older models, where higher-ECS models generally had less forcing. Given the large present-day model spread, there is no indication of any tendency by modelling groups to adjust their aerosol forcing in order to produce observed trends. Instead, some CMIP5 models have a relatively large positive forcing and overestimate the observed temperature change.
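Regression-based forcing diagnostics of this kind boil down to an ordinary least-squares fit of the top-of-atmosphere energy imbalance N against surface temperature change ΔT, assuming N = F − λΔT: the intercept estimates the forcing F and the slope the (negative of the) feedback parameter λ. A minimal sketch on synthetic data (the forcing and feedback values below are illustrative assumptions, not CMIP5 results):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic step-forcing experiment: imbalance N = F - lam*dT plus noise.
F_true, lam_true = 3.7, 1.2            # W m-2 and W m-2 K-1 (assumed values)
dT = np.linspace(0.5, 3.0, 150)        # warming as the run equilibrates
N = F_true - lam_true * dT + rng.normal(0, 0.3, dT.size)

# OLS of N on dT: intercept = diagnosed forcing, slope = -feedback.
slope, intercept = np.polyfit(dT, N, 1)
print(round(intercept, 2), round(-slope, 2))
```

With clean synthetic data the fit recovers the assumed forcing and feedback closely; with real model output the scatter and any time-variation of feedbacks are what make metric and period choices matter.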

Relevance:

20.00%

Publisher:

Abstract:

The techno-economic performance of a small wind turbine is very sensitive to the available wind resource. However, due to financial and practical constraints, installers rely on low-resolution wind speed databases to assess a potential site. This study investigates whether the two site assessment tools currently used in the UK, NOABL and the Energy Saving Trust wind speed estimator, are accurate enough to estimate the techno-economic performance of a small wind turbine. Both tools tend to overestimate the wind speed, with mean errors of 23% and 18% for the NOABL and Energy Saving Trust tools respectively. A techno-economic assessment of 33 small wind turbines at each site has shown that these errors can have a significant impact on the estimated load factor of an installation. Consequently, site/turbine combinations which are not economically viable can be predicted to be viable. Furthermore, both models tend to underestimate the wind resource at relatively high wind speed sites; this can lead to missed opportunities, as economically viable turbine/site combinations are predicted to be non-viable. These results show that a better understanding of the local wind resource is required to make small wind turbines a viable technology in the UK.
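The sensitivity to wind-speed error follows from the roughly cubic relation between wind speed and power output in the sub-rated operating region, so a modest speed overestimate inflates the predicted energy yield, and hence the load factor, disproportionately. A minimal sketch (the cubic power law and the 5 m/s site speed are illustrative assumptions, not turbine data from the study):

```python
# Illustrative only: in the sub-rated region a turbine's output scales
# roughly with the cube of wind speed, so database wind-speed errors
# are strongly amplified in predicted energy yield.
def energy_ratio(predicted_speed, actual_speed):
    """Ratio of predicted to actual energy under a cubic power law."""
    return (predicted_speed / actual_speed) ** 3

# A 23% overestimate (the mean NOABL error reported above) applied to
# a hypothetical 5 m/s site: predicted yield is ~86% too high.
print(round(energy_ratio(5.0 * 1.23, 5.0), 2))
```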

Relevance:

20.00%

Publisher:

Abstract:

The present study compares the impact of thermal and high pressure high temperature (HPHT) processing on the volatile profile (via non-targeted headspace fingerprinting) and on structural and nutritional quality parameters (via targeted approaches) of orange and yellow carrot purees. The effect of oil enrichment was also considered. Since oil enrichment affects compound volatility, the effect of oil was not studied when comparing the volatile fraction. For the targeted part, as yellow carrot purees were shown to contain a very low amount of carotenoids, the focus was on orange carrot purees. The results of the non-targeted approach demonstrated that HPHT processing exerts a distinct effect on the volatile fractions compared to thermal processing. In addition, differently coloured carrot varieties are characterized by distinct headspace fingerprints. From a structural point of view, limited or no difference could be observed between orange carrot purees treated with HPHT or thermal processes, both for samples without and with oil. From a nutritional point of view, significant isomerisation of all-trans-β-carotene occurred due to both processes, but only in samples with oil. Overall, for this type of product and the selected conditions, HPHT processing seems to have a different impact on the volatile profile but a rather similar impact on the structural and nutritional attributes compared to thermal processing.

Relevance:

20.00%

Publisher:

Abstract:

The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on a core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and non-standard task-to-core mappings can dramatically alter performance. Discovering this, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
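The benchmark-then-interpolate approach described above can be sketched as follows. The timing numbers and problem sizes below are invented placeholders standing in for measured application-specific benchmarks; the structure (separate curves for the loop-based updates and the halo exchanges, queried for a candidate deployment's local problem size) follows the two-part model:

```python
import numpy as np

# Hypothetical benchmark results (seconds per timestep) for the two
# work types, measured at several local problem sizes.
sizes        = np.array([128, 256, 512, 1024])
compute_time = np.array([0.8e-3, 3.1e-3, 12.5e-3, 50.2e-3])
halo_time    = np.array([0.12e-3, 0.2e-3, 0.41e-3, 0.8e-3])

def predict_step_time(local_size):
    """Model: step time = loop-based update + halo exchange, each
    interpolated from its own benchmark curve."""
    return (np.interp(local_size, sizes, compute_time)
            + np.interp(local_size, sizes, halo_time))

# Query candidate local sizes arising from different decompositions
# of the same global grid:
for n in (512, 768):
    print(round(float(predict_step_time(n)) * 1e3, 3), "ms")
```

In the full model the benchmark curves would also be parameterised by the execution scenario (node population, task-to-core mapping), with one curve per scenario.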

Relevance:

20.00%

Publisher:

Abstract:

This commentary seeks to prompt new discussion about the place of urban planning history in the era of contemporary globalisation. Given the deep historic engagement of urban planning thought and practice with ‘place’ shaping, and thus with the constitution of society, culture and politics, we ask how relevant planning's legacy is to the shaping of present-day cities. Late twentieth-century urban sociology, cultural and economic geography have demonstrated the increasing significance of intercity relations and the functional porosity of metropolitan boundaries in the network society; however, statutory urban planning systems remain tied to the administrative geographies of states. This ‘territorial fixing’ of practice constrains the operational space of planning and, we argue, also limits its vision to geopolitical scales and agendas that have receding relevance for emerging urban relations. We propose that a re-evaluation of planning history could have an important part to play in addressing this spatial conundrum.

Relevance:

20.00%

Publisher:

Abstract:

Research evaluating perceptual responses to music has identified many structural features as correlates that might be incorporated in computer music systems for affectively charged algorithmic composition and/or expressive music performance. In order to investigate the possible integration of isolated musical features into such a system, a discrete feature known to correlate with emotional responses – rhythmic density – was selected from a literature review and incorporated into a prototype system. This system produces variation in rhythmic density via a transformative process. A stimulus set created using this system was then subjected to a perceptual evaluation. Pairwise comparisons were used to scale differences between 48 stimuli. Listener responses were analysed with multidimensional scaling (MDS). The two-dimensional solution was then rotated to place the stimuli with the largest range of variation across the horizontal plane. Stimuli with variation in rhythmic density were placed further from the source material than stimuli that were generated by random permutation. This, combined with the striking similarity between the MDS scaling and the two-dimensional emotional model used by some affective algorithmic composition systems, suggests that isolated musical feature manipulation can now be used to parametrically control affectively charged automated composition in a larger system.
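The MDS step can be made concrete with a small sketch. The study's 48 stimuli and listener judgements are not reproduced here; the 4×4 dissimilarity matrix below is a toy stand-in, and classical (Torgerson) MDS is used as a simple, deterministic instance of the family of scaling techniques:

```python
import numpy as np

# Toy pairwise dissimilarities for 4 stimuli (hypothetical values
# standing in for scaled pairwise-comparison judgements).
D = np.array([[0.0, 1.0, 4.0, 5.0],
              [1.0, 0.0, 3.0, 4.0],
              [4.0, 3.0, 0.0, 1.0],
              [5.0, 4.0, 1.0, 0.0]])

def classical_mds(D, dims=2):
    """Classical (Torgerson) MDS: double-centre the squared
    dissimilarities, then embed via the top eigenvectors."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centring matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)             # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:dims]        # keep the largest ones
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

X = classical_mds(D)
# Distances in the embedding approximate the input dissimilarities.
print(round(float(np.linalg.norm(X[0] - X[1])), 1))
```

A rotation of the solution, as described above, leaves all inter-stimulus distances unchanged, so it is purely for interpretability of the axes.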

Relevance:

20.00%

Publisher:

Abstract:

Using lessons from idealised predictability experiments, we discuss some issues and perspectives on the design of operational seasonal to inter-annual Arctic sea-ice prediction systems. We first review the opportunities to use a hierarchy of different types of experiment to learn about the predictability of Arctic climate. We also examine key issues for ensemble system design, such as measuring skill, the role of ensemble size and the generation of ensemble members. When assessing the potential skill of a set of prediction experiments, using more than one metric is essential, as different choices can significantly alter conclusions about the presence or lack of skill. We find that increasing both the number of hindcasts and the ensemble size is important for reliably assessing the correlation and expected error in forecasts. For other metrics, such as dispersion, increasing ensemble size is most important. Probabilistic measures of skill can also provide useful information about the reliability of forecasts. In addition, various methods for generating the different ensemble members are tested. The range of techniques can produce surprisingly different ensemble spread characteristics. The lessons learnt should help inform the design of future operational prediction systems.
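The point about metric choice can be illustrated with a toy hindcast set. Everything below is an assumption for illustration (error magnitudes, ensemble size, and the choice of correlation, RMSE and spread as the metrics), not output from the experiments discussed:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic hindcasts: n_years start dates, n_mem members, each
# predicting one scalar (e.g. a sea-ice extent anomaly).
n_years, n_mem = 30, 10
truth = rng.normal(0, 1, n_years)
forecasts = (truth[:, None]
             + rng.normal(0, 0.5, (n_years, 1))      # shared forecast error
             + rng.normal(0, 0.5, (n_years, n_mem))) # member-to-member noise

ens_mean = forecasts.mean(axis=1)
corr   = np.corrcoef(ens_mean, truth)[0, 1]             # skill metric 1
rmse   = np.sqrt(np.mean((ens_mean - truth) ** 2))      # skill metric 2
spread = np.sqrt(forecasts.var(axis=1, ddof=1).mean())  # dispersion

# A well-calibrated ensemble has spread comparable to the RMSE of the
# ensemble mean; judging "skill" by correlation alone would miss this.
print(round(corr, 2), round(rmse, 2), round(spread, 2))
```

Averaging the members damps the independent noise but not the shared error, which is why a high correlation can coexist with an under- or over-dispersive ensemble.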

Relevance:

20.00%

Publisher:

Abstract:

There has been growing concern about bacterial resistance to antimicrobials in the farmed livestock sector. Attention has turned to sub-optimal use of antimicrobials as a driver of resistance. Recent reviews have identified a lack of data on the pattern of antimicrobial use as an impediment to the design of measures to tackle this growing problem. This paper reports on a study that explored the use of antibiotics by dairy farmers and the factors influencing their decision-making around this usage. We found that respondents had either recently reduced their use of antibiotics, or planned to do so. Advice from their veterinarian was instrumental in this. Over 70% thought reducing antibiotic usage would be a good thing to do. The most influential source of information used was their own veterinarian. Some 50% were unaware of the available guidelines on use in cattle production. However, 97% thought it important to keep treatment records. The Theory of Planned Behaviour was used to identify dairy farmers’ drivers of, and barriers to, reducing use of antibiotics. Intention to reduce usage was weakly correlated with current and past antibiotic use, whilst the strongest driver was respondents’ belief that their social and advisory network would approve of them doing so. The higher the proportion of income from milk production and the greater the likelihood of remaining in milk production, the significantly higher the likelihood of farmers exhibiting a positive intention to reduce antibiotic usage. Such farmers may be more commercially minded than others and thus more cost-conscious or, perhaps, more aware of possible future restrictions. A strong correlation was found between farmers’ perception of their social referents’ beliefs and farmers’ intent to reduce antibiotic use. Policy makers should target these social referents, especially veterinarians, with information on the benefits from, and the means to, achieving reductions in antibiotic usage. 
Information on sub-optimal use of antibiotics as a driver of resistance in dairy herds and in humans, along with advice on best farm practice to minimise the risk of disease and ensure animal welfare, complemented with data on potential cost savings from reduced antibiotic use, would help improve poor practice.