899 results for range analysis


Relevance:

30.00%

Publisher:

Abstract:

From the early Roman period, there is archaeological evidence for the exploitation of the Flemish coastal plain (Belgium) for a range of activities, such as sheep herding on the then developing salt-marshes and salt-meadows for the production of wool. During the early Middle Ages, this culminated in the establishment of dedicated ‘sheep estates’. This phase of exploitation was followed by extensive drainage and land reclamation measures in the high Medieval period, transforming areas into grassland, suited for cattle breeding. As part of a larger project investigating the onset, intensification and final decline of sheep management in coastal Flanders in the historical period, this pilot study presents the results of sequential sampling and oxygen isotope analysis of a number of sheep teeth (M2, n = 8) from four late Roman and Medieval sites (dating from 4th to 15th century AD), in order to assess potential variations in season of birth between the different sites and through time. In comparison with published data from herds of known birth season, incremental enamel data from the Flemish sites are consistent with late winter/spring births, with the possibility of some instances of slightly earlier parturition. These findings suggest that manipulation of season of birth was not a feature of the sheep husbandry-based economies of early historic Flanders, further evidencing that wool production was the main purpose of contemporary sheep rearing in the region. Manipulation of season of birth is not likely to have afforded economic advantage in wool-centred economies, unlike in some milk- or meat-based regimes.

Relevance:

30.00%

Publisher:

Abstract:

Current UK intake of non-milk extrinsic sugars (NMES) is above recommendations. Reducing the sugar content of processed high-sugar foods through reformulation is one option for reducing consumption of NMES at a population level. However, reformulation can alter the sensory attributes of food products and influence consumer liking. This study evaluated consumer acceptance of a selection of products that are commercially available in the UK; these included regular and sugar-reduced baked beans, strawberry jam, milk chocolate, cola and cranberry & raspberry juice. Sweeteners were present in the reformulated chocolate (maltitol), cola (aspartame and acesulfame-K) and juice (sucralose) samples. Healthy, non-smoking consumers (n = 116; 55 men, 61 women; age: 33 ± 9 years; BMI: 25.7 ± 4.6 kg/m2) rated the products for overall liking and for liking of appearance, flavor and texture using a nine-point hedonic scale. There were significant differences between standard and reduced-sugar products in consumers' overall liking and in liking of each modality (appearance, flavor and texture; all P < 0.0001). For overall liking, only the regular beans and cola were significantly more liked than their reformulated counterparts (P < 0.0001). Cluster analysis identified three consumer clusters representative of different patterns of consumer liking. For the largest cluster (cluster 3: 45%), there was a significant difference in mean liking scores across all products except jam. Differences in liking were predominantly driven by sweet taste in two of the three clusters. The current research demonstrates that a high proportion of consumers prefer conventional products over sugar-reduced products, either across a wide range of product types (45%) or across selected products (27%), when tasted unbranded, so there is room for further optimization of the commercial reduced-sugar products evaluated in the current study.
Future work should evaluate strategies to facilitate compliance with dietary recommendations on NMES and free sugars, such as the impact of exposure to sugar-reduced foods on their acceptance.
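As an illustration of the clustering step, the sketch below groups synthetic nine-point hedonic ratings into preference clusters with a minimal k-means implementation. The consumer count echoes the study (n = 116), but all ratings, the number of products and the choice of k = 2 are invented for the example; this is not the study's actual procedure or data.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each consumer to the nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centroids (keep the old one if a cluster empties)
        centroids = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                              else centroids[j] for j in range(k)])
    return centroids, labels

# Synthetic 9-point hedonic ratings (rows: consumers, cols: 5 regular
# products then 5 reduced-sugar products). One group prefers regular,
# the other prefers the reformulated versions.
rng = np.random.default_rng(1)
ratings = np.vstack([
    np.hstack([rng.integers(6, 10, (60, 5)), rng.integers(2, 6, (60, 5))]),
    np.hstack([rng.integers(2, 6, (56, 5)), rng.integers(6, 10, (56, 5))]),
]).astype(float)

centroids, labels = kmeans(ratings, k=2)
print(np.bincount(labels))  # cluster sizes
```

In practice, hierarchical clustering or k-means on standardised liking scores, with the number of clusters chosen by a validity criterion, is the usual approach for this kind of segmentation.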

Relevance:

30.00%

Publisher:

Abstract:

The automatic transformation of sequential programs for efficient execution on parallel computers involves a number of analyses and restructurings of the input. Some of these analyses are based on computing array sections, a compact description of a range of array elements. Array sections describe the set of array elements that are either read or written by program statements. These sections can be compactly represented using shape descriptors such as regular sections, simple sections, or generalized convex regions. However, binary operations such as Union performed on these representations do not satisfy a straightforward closure property: for example, if the operands to Union are convex, the result may be nonconvex. Approximations are therefore used to restore closure. These approximations introduce imprecision into the analyses and, furthermore, the imprecision resulting from successive operations accumulates. Delayed merging is a technique suggested and used in some existing analyses to minimize the effects of approximation; however, it does not guarantee an exact solution in a general setting. This article presents a generalized technique to compute Union precisely, overcoming these imprecisions.
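A minimal sketch of the closure problem for one-dimensional regular sections: the exact union of two strided sections is generally not itself a regular section, so the operation below returns the smallest enclosing regular section, introducing exactly the kind of imprecision described above. The triplet representation and the gcd-based union rule are illustrative simplifications, not the article's algorithm.

```python
from math import gcd

class RegSection:
    """1-D regular array section: {lo, lo+step, ..., up to hi}."""
    def __init__(self, lo, hi, step):
        self.lo, self.hi, self.step = lo, hi, step

    def elements(self):
        return set(range(self.lo, self.hi + 1, self.step))

    def union(self, other):
        # The exact union of two strided sets is generally NOT a regular
        # section, so we over-approximate with the smallest regular
        # section containing both operands (the closure problem).
        lo = min(self.lo, other.lo)
        hi = max(self.hi, other.hi)
        step = gcd(self.step, other.step, abs(self.lo - other.lo)) or 1
        return RegSection(lo, hi, step)

a = RegSection(0, 4, 2)    # {0, 2, 4}
b = RegSection(10, 14, 2)  # {10, 12, 14}
u = a.union(b)             # {0, 2, 4, ..., 14}
extra = u.elements() - (a.elements() | b.elements())
print(sorted(extra))       # [6, 8] -- elements in neither operand
```

A chain of such unions compounds the over-approximation, which is why delayed merging helps but, as the article notes, cannot guarantee exactness in general.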

Relevance:

30.00%

Publisher:

Abstract:

This letter presents an accurate delay analysis in prioritised wireless sensor networks (WSN). The analysis is an enhancement of the existing analysis proposed by Choobkar and Dilmaghani, which is only applicable to the case where the lower priority nodes always have packets to send in the empty slots of the higher priority node. The proposed analysis is applicable for any pattern of packet arrival, which includes the general case where the lower priority nodes may or may not have packets to send in the empty slots of the higher priority nodes. Evaluation of both analyses showed that the proposed delay analysis has better accuracy over the full range of loads and provides an excellent match to simulation results.
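To make the setting concrete, the toy simulation below models two nodes sharing a slotted channel, where the low-priority node may transmit only in slots the high-priority node leaves empty, and measures the mean low-priority queueing delay under Bernoulli packet arrivals. It is a generic illustration of the prioritised-access model only, not the analysis of Choobkar and Dilmaghani or the enhancement proposed in the letter.

```python
import random
from collections import deque

def simulate(p_high, p_low, n_slots=200_000, seed=0):
    """Mean queueing delay (in slots) of low-priority packets that may
    only use the slots the high-priority node leaves empty."""
    rng = random.Random(seed)
    hq, lq = deque(), deque()   # FIFO queues of packet arrival times
    delays = []
    for t in range(n_slots):
        if rng.random() < p_high:
            hq.append(t)
        if rng.random() < p_low:
            lq.append(t)
        if hq:                  # high priority transmits first
            hq.popleft()
        elif lq:                # low priority uses the empty slot
            delays.append(t - lq.popleft())
    return sum(delays) / len(delays)

# Low-priority delay grows as the high-priority load eats the empty slots.
light = simulate(p_high=0.2, p_low=0.3)
heavy = simulate(p_high=0.6, p_low=0.3)
print(f"mean LP delay: light load {light:.2f}, heavy load {heavy:.2f} slots")
```

An analytical delay model would be validated against exactly this kind of simulation across the full range of loads.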

Relevance:

30.00%

Publisher:

Abstract:

An analysis of diabatic heating and moistening processes from 12–36 hour lead-time forecasts from 12 global circulation models is presented as part of the "Vertical structure and physical processes of the Madden–Julian Oscillation (MJO)" project. A lead time of 12–36 hours is chosen to constrain the large-scale dynamics and thermodynamics to be close to observations, while avoiding being too close to the initial spin-up as the models adjust to being driven from the YOTC analysis. A comparison of the vertical velocity and rainfall with the observations and the YOTC analysis suggests that the phases of convection associated with the MJO are constrained in most models at this lead time, although rainfall in the suppressed phase is typically overestimated. Although the large-scale dynamics are reasonably constrained, moistening and heating profiles show large inter-model spread. In particular, there are large spreads in convective heating and moistening at mid-levels during the transition to active convection. Radiative heating and cloud parameters have the largest relative spread across models at upper levels during the active phase. A detailed analysis of time-step behaviour shows that some models exhibit strong intermittency in rainfall, and that the relationship between precipitation and dynamics differs between models. The wealth of model output archived during this project is a very valuable resource for model developers beyond the study of the MJO. In addition, the findings of this study can inform the design of process-model experiments, and inform the priorities for field experiments and future observing systems.
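The inter-model spread used above can be computed as a per-level standard deviation across models. The sketch below does this for synthetic heating profiles (12 models, 20 pressure levels) in which the disagreement is deliberately concentrated at mid-levels; the profile shapes and noise model are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_models, n_levels = 12, 20
pressure = np.linspace(1000.0, 100.0, n_levels)   # hPa

# Synthetic convective-heating profiles (K/day): a common mid-tropospheric
# shape plus model-to-model perturbations that are largest at mid-levels.
base = 2.0 * np.exp(-((pressure - 550.0) / 250.0) ** 2)
mid_weight = np.exp(-((pressure - 550.0) / 200.0) ** 2)
profiles = base + rng.normal(0.0, 1.0, (n_models, n_levels)) * mid_weight

spread = profiles.std(axis=0, ddof=1)   # inter-model spread at each level
worst = pressure[spread.argmax()]       # level of largest disagreement
print(f"largest inter-model spread at {worst:.0f} hPa")
```

The same one-liner applied to the archived model output, level by level and phase by phase, is what produces statements like "large spreads in convective heating and moistening at mid-levels".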

Relevance:

30.00%

Publisher:

Abstract:

Climate controls fire regimes through its influence on the amount and types of fuel present and their dryness. CO2 concentration constrains primary production by limiting photosynthetic activity in plants. However, although fuel accumulation depends on biomass production, and hence on CO2 concentration, the quantitative relationship between atmospheric CO2 concentration and biomass burning is not well understood. Here a fire-enabled dynamic global vegetation model (the Land surface Processes and eXchanges model, LPX) is used to attribute glacial–interglacial changes in biomass burning to two factors: the increase in CO2, which would be expected to increase primary production and therefore fuel loads even in the absence of climate change, and the effects of climate change itself. Four general circulation models provided last glacial maximum (LGM) climate anomalies, that is, differences from the pre-industrial (PI) control climate, from the Palaeoclimate Modelling Intercomparison Project Phase 2, allowing the construction of four scenarios for LGM climate. Modelled carbon fluxes from biomass burning were corrected for the model's observed prediction biases in contemporary regional average values for biomes. With LGM climate and low CO2 (185 ppm) effects included, the modelled global flux at the LGM was in the range of 1.0–1.4 Pg C per year, about a third less than that modelled for PI time. LGM climate with pre-industrial CO2 (280 ppm) yielded unrealistic results, with global biomass-burning fluxes similar to or even greater than in the pre-industrial climate. It is inferred that a substantial part of the increase in biomass burning after the LGM must be attributed to the effect of increasing CO2 concentration on primary production and fuel load. Today, by analogy, both rising CO2 and global warming must be considered risk factors for increasing biomass burning, and both effects need to be included in models to project future fire risks.

Relevance:

30.00%

Publisher:

Abstract:

A new online method to analyse water isotopes of speleothem fluid inclusions using a wavelength-scanned cavity ring-down spectroscopy (WS-CRDS) instrument is presented. This novel technique allows hydrogen and oxygen isotopes to be measured simultaneously for a released aliquot of water. To do so, we designed a simple new line that allows the online water extraction and isotope analysis of speleothem samples. The specificity of the method lies in the fact that fluid-inclusion release takes place against a standard-water background, which mainly improves the δD robustness. To saturate the line, a peristaltic pump continuously injects standard water into the line, which is permanently heated to 140 °C and flushed with dry nitrogen gas. This permits instantaneous and complete vaporisation of the standard water, resulting in an artificial water background with well-known δD and δ18O values. The speleothem sample is placed in a copper tube attached to the line and, after system stabilisation, is crushed using a simple hydraulic device to liberate the water held in the speleothem fluid inclusions. The released water is carried by the nitrogen/standard-water gas stream directly to a Picarro L1102-i for isotope determination. To test the accuracy and reproducibility of the line, and to measure standard water during speleothem measurements, a syringe injection unit was added to the line. Peak evaluation is done as in gas chromatography to obtain the δD and δ18O isotopic compositions of the measured water aliquots. Precision is better than 1.5 ‰ for δD and 0.4 ‰ for δ18O over an extended range (−210 to 0 ‰ for δD and −27 to 0 ‰ for δ18O), depending primarily on the amount of water released from the speleothem fluid inclusions and secondarily on the isotopic composition of the sample.
The results show that WS-CRDS technology is suitable for speleothem fluid inclusion measurements and gives results that are comparable to the isotope ratio mass spectrometry (IRMS) technique.
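The peak-evaluation idea can be sketched as a two-component unmixing problem: integrate the baseline-subtracted peak and remove the known standard-water background contribution. The trace below is synthetic, and the numbers (background level, peak shape, δ values) are invented; it illustrates the arithmetic only, not the actual Picarro processing chain.

```python
import numpy as np

# Synthetic analyser trace: constant standard-water background plus a
# Gaussian pulse of sample water released when the inclusion is crushed.
t = np.linspace(0.0, 120.0, 1201)                 # time (s)
dt = t[1] - t[0]
bg_water = 2.0                                    # background water flux (a.u.)
peak = 1.5 * np.exp(-((t - 60.0) / 5.0) ** 2)     # released sample water
total = bg_water + peak

d18O_bg = -10.0            # known background δ18O (‰)
d18O_sample_true = -25.0   # the value we want to recover
# Two-component mixing: isotopic value of the combined stream at each instant.
d18O_mixed = (bg_water * d18O_bg + peak * d18O_sample_true) / total

# Peak evaluation, as in gas chromatography: integrate the amount-weighted
# isotope signal, subtract the known background contribution, and divide
# by the integrated (baseline-subtracted) sample amount.
sample_amount = peak.sum() * dt
d18O_sample = ((total * d18O_mixed).sum() * dt
               - bg_water * d18O_bg * t.size * dt) / sample_amount
print(f"recovered sample d18O = {d18O_sample:.2f} permil")  # ≈ -25.00
```

Running the known standard through the same arithmetic via the syringe injection unit is what lets the method check its own accuracy during a measurement sequence.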

Relevance:

30.00%

Publisher:

Abstract:

Background Cognitive–behavioural therapy (CBT) for childhood anxiety disorders is associated with modest outcomes in the context of parental anxiety disorder. Objectives This study evaluated whether or not the outcome of CBT for children with anxiety disorders in the context of maternal anxiety disorders is improved by the addition of (i) treatment of maternal anxiety disorders, or (ii) treatment focused on maternal responses. The incremental cost-effectiveness of the additional treatments was also evaluated. Design Participants were randomised to receive (i) child cognitive–behavioural therapy (CCBT); (ii) CCBT with CBT to target maternal anxiety disorders [CCBT + maternal cognitive–behavioural therapy (MCBT)]; or (iii) CCBT with an intervention to target mother–child interactions (MCIs) (CCBT + MCI). Setting An NHS university clinic in Berkshire, UK. Participants Two hundred and eleven children with a primary anxiety disorder, whose mothers also had an anxiety disorder. Interventions All families received eight sessions of individual CCBT. Mothers in the CCBT + MCBT arm also received eight sessions of CBT targeting their own anxiety disorders. Mothers in the MCI arm received 10 sessions targeting maternal parenting cognitions and behaviours. Non-specific interventions were delivered to balance groups for therapist contact. Main outcome measures Primary clinical outcomes were the child's primary anxiety disorder status and degree of improvement at the end of treatment. Follow-up assessments were conducted at 6 and 12 months. Outcomes in the economic analyses were identified and measured using estimated quality-adjusted life-years (QALYs). QALYs were combined with treatment, health and social care costs and presented within an incremental cost–utility analysis framework with associated uncertainty.
Results MCBT was associated with significant short-term improvement in maternal anxiety; however, after children had received CCBT, group differences were no longer apparent. CCBT + MCI was associated with a reduction in maternal overinvolvement and more confident expectations of the child. However, neither CCBT + MCBT nor CCBT + MCI conferred a significant post-treatment benefit over CCBT in terms of child anxiety disorder diagnoses [CCBT + MCBT vs. CCBT: adjusted risk ratio (RR) 1.18, 95% confidence interval (CI) 0.87 to 1.62, p = 0.29; CCBT + MCI vs. CCBT: adjusted RR 1.22, 95% CI 0.90 to 1.67, p = 0.20] or global improvement ratings (adjusted RR 1.25, 95% CI 1.00 to 1.59, p = 0.05; adjusted RR 1.20, 95% CI 0.95 to 1.53, p = 0.13, respectively). CCBT + MCI outperformed CCBT on some secondary outcome measures. Furthermore, primary economic analyses suggested that, at commonly accepted thresholds of cost-effectiveness, the probability that CCBT + MCI will be cost-effective in comparison with CCBT (plus non-specific interventions) is about 75%. Conclusions Good outcomes were achieved for children and their mothers across treatment conditions. There was no evidence of a benefit to child outcome of supplementing CCBT with either an intervention focusing on maternal anxiety disorder or one focusing on maternal cognitions and behaviours. However, supplementing CCBT with treatment that targeted maternal cognitions and behaviours represented a cost-effective use of resources, although the high percentage of missing data on some economic variables is a shortcoming. Future work should consider whether or not effects of the adjunct interventions are enhanced in particular contexts. The economic findings highlight the utility of considering the use of a broad range of services when evaluating interventions with this client group. Trial registration Current Controlled Trials ISRCTN19762288.
Funding This trial was funded by the Medical Research Council (MRC) and Berkshire Healthcare Foundation Trust and managed by the National Institute for Health Research (NIHR) on behalf of the MRC–NIHR partnership (09/800/17) and will be published in full in Health Technology Assessment; Vol. 19, No. 38.
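For readers unfamiliar with the risk-ratio statistics quoted above, the sketch below computes an unadjusted risk ratio with a 95% Wald confidence interval from hypothetical 2 × 2 counts. The trial's figures are adjusted RRs from regression models, which this simple calculation does not reproduce.

```python
import math

def risk_ratio(events_t, n_t, events_c, n_c):
    """Unadjusted risk ratio with a 95% Wald CI computed on the log scale."""
    rr = (events_t / n_t) / (events_c / n_c)
    # standard error of log(RR) for independent binomial proportions
    se = math.sqrt(1/events_t - 1/n_t + 1/events_c - 1/n_c)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: recovery from the primary anxiety diagnosis in two arms.
rr, lo, hi = risk_ratio(events_t=42, n_t=70, events_c=35, n_c=70)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")  # RR = 1.20 (0.89 to 1.62)
```

A CI that straddles 1 (as here, and as in the trial's primary comparisons) indicates no statistically significant benefit of the supplemented arm.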

Relevance:

30.00%

Publisher:

Abstract:

The human gut is a complex ecosystem occupied by a diverse microbial community. Modulation of this microbiota impacts health and disease. The definitive way to investigate the impact of dietary intervention on the gut microbiota is a human trial. However, human trials are expensive and can be difficult to control; thus, initial screening is desirable. Utilization of a range of in vitro and in vivo models means that useful information can be gathered prior to the necessity for human intervention. This review discusses the benefits and limitations of these approaches.

Relevance:

30.00%

Publisher:

Abstract:

Social networks have gained remarkable attention in the last decade. Accessing social network sites such as Twitter, Facebook, LinkedIn and Google+ through the internet and Web 2.0 technologies has become more affordable. People are becoming more interested in, and more reliant on, social networks for information, news and the opinions of other users on diverse subject matters. This heavy reliance on social network sites causes them to generate massive data characterised by three computational issues, namely size, noise and dynamism. These issues often make social network data too complex to analyse manually, motivating the use of computational means of analysis. Data mining provides a wide range of techniques for detecting useful knowledge, such as trends, patterns and rules, from massive datasets [44]. Data mining techniques are used for information retrieval, statistical modelling and machine learning. These techniques employ data pre-processing, data analysis, and data interpretation processes in the course of data analysis. This survey discusses the data mining techniques used in mining diverse aspects of social networks over the past decades, from historical techniques to up-to-date models, including our novel technique named TRCM. All the techniques covered in this survey are listed in Table 1, together with the tools employed and the names of their authors.

Relevance:

30.00%

Publisher:

Abstract:

The efficiency of a Wireless Power Transfer (WPT) system is greatly dependent on both the geometry and operating frequency of the transmitting and receiving structures. By using Coupled Mode Theory (CMT), the figure of merit is calculated for resonantly-coupled loop and dipole systems. An in-depth analysis of the figure of merit is performed with respect to the key geometric parameters of the loops and dipoles, along with the resonant frequency, in order to identify the key relationships leading to high-efficiency WPT. For systems consisting of two identical single-turn loops, it is shown that the choice of both the loop radius and resonant frequency are essential in achieving high-efficiency WPT. For the dipole geometries studied, it is shown that the choice of length is largely irrelevant and that, as a result of their capacitive nature, low-MHz frequency dipoles are able to produce significantly higher figures of merit than those of the loops considered. The results of the figure of merit analysis are used to propose and subsequently compare two mid-range loop and dipole WPT systems of equal size and operating frequency, where it is shown that the dipole system is able to achieve higher efficiencies than the loop system over the distance range examined.
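The link between the figure of merit and efficiency can be made concrete with the standard coupled-mode-theory result for two coupled resonators, in which the maximum power-transfer efficiency depends only on U = κ/√(Γ₁Γ₂). The sketch below evaluates this textbook CMT expression for a few arbitrary values of U; the specific values are not taken from the letter.

```python
import math

def max_efficiency(U):
    """Maximum power-transfer efficiency of a resonantly coupled pair,
    for figure of merit U = kappa / sqrt(Gamma_1 * Gamma_2)
    (standard coupled-mode-theory result with optimal load matching)."""
    return U**2 / (1.0 + math.sqrt(1.0 + U**2))**2

for U in (0.5, 1.0, 3.0, 10.0):
    print(f"U = {U:4.1f} -> eta_max = {max_efficiency(U):.3f}")
```

Because efficiency grows monotonically with U, maximising the figure of merit over geometry and frequency, as the letter does for loops and dipoles, directly maximises the achievable transfer efficiency.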

Relevance:

30.00%

Publisher:

Abstract:

This paper concerns the innovative use of a blend of systems thinking ideas in the 'Munro Review of Child Protection', a high-profile examination of child protection activities in England, conducted for the Department for Education. We go 'behind the scenes' to describe the OR methodologies and processes employed. The circumstances that led to the Review are outlined. Three specific contributions that systems thinking made to the Review are then described. First, the systems-based analysis and visualisation of how a 'compliance culture' had grown up. Second, the creation of a large, complex systems map of current operations and the effects of past policies on them. Third, how the map gave shape to the range of issues the Review addressed and acted as an organising framework for the systemically coherent set of recommendations made. The paper closes with an outline of the main implementation steps taken so far to create a child protection system with the critically reflective properties of a learning organisation, and methodological reflections on the benefits of systems thinking to support organisational analysis.

Relevance:

30.00%

Publisher:

Abstract:

Little information exists on the effects of ensiling on condensed tannins or proanthocyanidins. The acetone–butanol–HCl assay is suitable for measuring proanthocyanidin contents in a wide range of samples, silages included, but provides limited information on proanthocyanidin composition, which is of interest for deciphering the relationships between tannins and their bioactivities in terms of animal nutrition or health. Degradation with benzyl mercaptan (thiolysis) provides information on proanthocyanidin composition, but proanthocyanidins in several sainfoin silages have proved resistant to thiolysis. We now report that a pretreatment step with sodium hydroxide prior to thiolysis was needed to enable their analysis. This alkaline treatment increased their extractability from ensiled sainfoin and facilitated especially the release of larger proanthocyanidins. Ensiling reduced assayable proanthocyanidins by 29%, but the composition of the remaining proanthocyanidins in silage resembled that of the fresh plants.

Relevance:

30.00%

Publisher:

Abstract:

Demand for organic meat is partially driven by consumer perceptions that organic foods are more nutritious than non-organic foods. However, there have been no systematic reviews comparing specifically the nutrient content of organic and conventionally produced meat. In this study, we report results of a meta-analysis based on sixty-seven published studies comparing the composition of organic and non-organic meat products. For many nutritionally relevant compounds (e.g. minerals, antioxidants and most individual fatty acids (FA)), the evidence base was too weak for meaningful meta-analyses. However, significant differences in FA profiles were detected when data from all livestock species were pooled. Concentrations of SFA and MUFA were similar or slightly lower, respectively, in organic compared with conventional meat. Larger differences were detected for total PUFA and n-3 PUFA, which were an estimated 23 (95 % CI 11, 35) % and 47 (95 % CI 10, 84) % higher in organic meat, respectively. However, for these and many other composition parameters, for which meta-analyses found significant differences, heterogeneity was high, and this could be explained by differences between animal species/meat types. Evidence from controlled experimental studies indicates that the high grazing/forage-based diets prescribed under organic farming standards may be the main reason for differences in FA profiles. Further studies are required to enable meta-analyses for a wider range of parameters (e.g. antioxidant, vitamin and mineral concentrations) and to improve both precision and consistency of results for FA profiles for all species. Potential impacts of composition differences on human health are discussed.
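The pooling step of a meta-analysis can be illustrated with a fixed-effect inverse-variance calculation. The per-study estimates and standard errors below are hypothetical, and given the high heterogeneity reported above, a random-effects model would be the appropriate choice in practice; this sketch shows only the basic weighting arithmetic.

```python
import math

# Hypothetical per-study estimates of the % difference in n-3 PUFA
# (organic vs conventional) with their standard errors.
studies = [(35.0, 20.0), (60.0, 30.0), (45.0, 25.0), (50.0, 40.0)]

# Fixed-effect inverse-variance pooling: weight each study by 1/SE^2.
weights = [1.0 / se**2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"pooled difference = {pooled:.1f}% (95% CI {lo:.1f} to {hi:.1f})")
```

When between-study heterogeneity is high, a random-effects model adds an estimated between-study variance to each study's weight, widening the confidence interval accordingly.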

Relevance:

30.00%

Publisher:

Abstract:

Background Pseudomonas syringae can cause stem necrosis and canker in a wide range of woody species including cherry, plum, peach, horse chestnut and ash. The detection and quantification of lesion progression over time in woody tissues is a key trait for breeders to select upon for resistance. Results In this study a general, rapid and reliable approach to lesion quantification using image recognition and an artificial neural network model was developed. This was applied to screen both the virulence of a range of P. syringae pathovars and the resistance of a set of cherry and plum accessions to bacterial canker. The method developed was more objective than scoring by eye and allowed the detection of putatively resistant plant material for further study. Conclusions Automated image analysis will facilitate rapid screening of material for resistance to bacterial and other phytopathogens, allowing more efficient selection and quantification of resistance responses.
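A minimal stand-in for the lesion-quantification step: threshold a synthetic greyscale image to classify necrotic pixels and convert the count to an area. The study's pipeline used image recognition with an artificial neural network; the simple threshold, image scale and lesion geometry here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic greyscale image of stem tissue: bright healthy tissue with a
# darker circular necrotic lesion.
img = rng.normal(0.8, 0.05, (100, 100))                 # healthy tissue
rr, cc = np.ogrid[:100, :100]
lesion_mask = (rr - 50) ** 2 + (cc - 40) ** 2 < 15 ** 2 # true lesion extent
img[lesion_mask] = rng.normal(0.3, 0.05, lesion_mask.sum())

# Classify pixels (dark = lesion) and quantify the lesion area.
predicted = img < 0.55
mm_per_px = 0.1                                         # hypothetical scale
area_mm2 = predicted.sum() * mm_per_px ** 2
accuracy = (predicted == lesion_mask).mean()
print(f"lesion area ~ {area_mm2:.1f} mm^2, pixel accuracy {accuracy:.3f}")
```

A trained classifier replaces the fixed threshold when lesions vary in colour and texture, but the downstream quantification (pixel count times image scale) is the same, and that per-accession area is what breeders would score for resistance.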