106 results for SHORT RETROPOSONS
Abstract:
In an adaptive seamless phase II/III clinical trial interim analysis, data are used for treatment selection, enabling resources to be focused on comparison of more effective treatment(s) with a control. In this paper, we compare two methods recently proposed to enable use of short-term endpoint data for decision-making at the interim analysis. The comparison focuses on the power and the probability of correctly identifying the most promising treatment. We show that the choice of method depends on how well short-term data predict the best treatment, which may be measured by the correlation between treatment effects on short- and long-term endpoints.
Abstract:
The Green Feed (GF) system (C-Lock Inc., Rapid City, USA) is used to estimate total daily methane emissions of individual cattle using short-term measurements obtained over several days. Our objective was to compare measurements of methane emission by growing cattle obtained using the GF system with measurements using respiration chambers (RC) or sulphur hexafluoride tracer (SF6). It was hypothesised that estimates of methane emission for individual animals and treatments would be similar for GF compared to RC or SF6 techniques. In experiment 1, maize or grass silage-based diets were fed to four growing Holstein heifers, whilst for experiment 2, four different heifers were fed four haylage treatments. Both experiments were a 4 × 4 Latin square design with 33-day periods. Green Feed measurements of methane emission were obtained over 7 days (days 22–28) and compared to subsequent RC measurements over 4 days (days 29–33). For experiment 3, 12 growing heifers rotationally grazed three swards for 26 days, with simultaneous GF and SF6 measurements over two 4-day measurement periods (days 15–19 and days 22–26). Overall methane emissions (g/day and g/kg dry matter intake [DMI]) measured using GF in experiments 1 (198 and 26.6, respectively) and 2 (208 and 27.8, respectively) were similar to averages obtained using RC (218 and 28.3, respectively, for experiment 1; and 209 and 27.7, respectively, for experiment 2); but there was poor concordance between the two methods (0.1043 for experiments 1 and 2 combined). Overall, methane emissions measured using SF6 were higher (P<0.001) than GF during grazing (186 vs. 164 g/day), but there was significant (P<0.01) concordance between the two methods (0.6017). There were fewer methane measurements by GF under grazing conditions in experiment 3 (1.60/day) compared to indoor measurements in experiments 1 (2.11/day) and 2 (2.34/day).
Significant treatment effects on methane emission measured using RC and SF6 were not evident for GF measurements, and the ranking of treatments and individual animals differed using the GF system. We conclude that under our conditions of use the GF system was unable to detect significant treatment and individual animal differences in methane emissions that were identified using both RC and SF6 techniques, in part due to the limited number and timing of measurements obtained. Our data suggest that successful use of the GF system is reliant on the number and timing of measurements obtained relative to diurnal patterns of methane emission.
Abstract:
The treatment of auditory-verbal short-term memory (STM) deficits in aphasia is a growing avenue of research (Martin & Reilly, 2012; Murray, 2012). STM treatment requires precise timing, which makes it well suited to computerised delivery. We have designed software that provides STM treatment for aphasia. The treatment is based on matching listening span tasks (Howard & Franklin, 1990), aiming to improve the temporal maintenance of multi-word sequences (Salis, 2012). The person listens to pairs of word lists that differ in word order and decides whether the pairs are the same or different. This approach does not require speech output and is suitable for persons with aphasia who have limited or no output. We describe the software and how clinicians' reviews of it shaped its design.
Abstract:
Short-term memory (STM) impairments are prevalent in adults with acquired brain injuries. While there are several published tests to assess these impairments, the majority require speech production, e.g. digit span (Wechsler, 1987). This feature may make them unsuitable for people with aphasia and motor speech disorders because of word-finding difficulties and speech demands respectively. If patients perceive the speech demands of the test to be high, they may not engage with testing. Furthermore, existing STM tests are mainly ‘pen-and-paper’ tests, which can jeopardise scoring accuracy. To address these shortcomings, we designed and standardised a novel computerised test that does not require speech output and whose computerised delivery enables clinicians to identify STM impairments with greater precision than current tests. A matching listening span task, similar to the non-normed PALPA 13 (Kay, Lesser & Coltheart, 1992), is used to test short-term memory for the serial order of spoken items. Sequences of digits are presented in pairs. The person hears the first sequence, followed by the second sequence, and decides whether the two sequences are the same or different. In the computerised test, the sequences are presented as live-voice recordings on a portable computer through a software application (Molero Martin, Laird, Hwang & Salis, 2013). We collected normative data from healthy older adults (N=22-24) using digits, real words (one- and two-syllables) and non-words (one- and two-syllables). Performance was scored using two systems; in the Highest Span system, the score was the highest span length (e.g. 2-8) at which a participant responded correctly to more than 7 out of 10 trials. Test re-test reliability was also tested in a subgroup of participants. The test will be available free of charge for clinicians and researchers to use.
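The Highest Span scoring rule described in this abstract can be sketched in code. This is an illustrative reconstruction only: the function name and the data layout (a mapping from span length to ten per-trial correct/incorrect outcomes) are assumptions, not the actual interface of the published software.

```python
# Illustrative sketch of the "Highest Span" scoring rule: the score is the
# longest span length at which the participant answered more than 7 of the
# 10 trials correctly. The data layout here is assumed for illustration.

def highest_span(results):
    """results maps span length -> list of 10 booleans (trial correct?)."""
    best = 0
    for span in sorted(results):
        if sum(results[span]) > 7:  # more than 7 of 10 trials correct
            best = span
    return best

# Example: correct on 10/10 at span 2, 9/10 at span 3, 6/10 at span 4.
trials = {2: [True] * 10, 3: [True] * 9 + [False], 4: [True] * 6 + [False] * 4}
print(highest_span(trials))  # -> 3
```

Scanning spans in ascending order and keeping the last passing one means a single lapse at an intermediate span does not terminate scoring early, which matches the "highest span length" wording above.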
Abstract:
An analysis of diabatic heating and moistening processes from 12-36 hour lead time forecasts from 12 Global Circulation Models is presented as part of the "Vertical structure and physical processes of the Madden-Julian Oscillation (MJO)" project. A lead time of 12-36 hours is chosen to constrain the large-scale dynamics and thermodynamics to be close to observations while avoiding being too close to the initial spin-up as the models adjust to being driven from the YOTC analysis. A comparison of the vertical velocity and rainfall with the observations and the YOTC analysis suggests that the phases of convection associated with the MJO are constrained in most models at this lead time, although the rainfall in the suppressed phase is typically overestimated. Although the large-scale dynamics are reasonably constrained, moistening and heating profiles show large inter-model spread. In particular, there are large spreads in convective heating and moistening at mid-levels during the transition to active convection. Radiative heating and cloud parameters have the largest relative spread across models at upper levels during the active phase. A detailed analysis of time-step behaviour shows that some models exhibit strong intermittency in rainfall and that the relationship between precipitation and dynamics differs between models. The wealth of model outputs archived during this project is a very valuable resource for model developers beyond the study of the MJO. In addition, the findings of this study can inform the design of process model experiments, and inform the priorities for field experiments and future observing systems.
Abstract:
Policies to control air quality focus on mitigating emissions of aerosols and their precursors, and other short-lived climate pollutants (SLCPs). On a local scale, these policies will have beneficial impacts on health and crop yields, by reducing particulate matter (PM) and surface ozone concentrations; however, the climate impacts of reducing emissions of SLCPs are less straightforward to predict. In this paper we consider a set of idealised, extreme mitigation strategies, in which the total anthropogenic emissions of individual SLCP emissions species are removed. This provides an upper bound on the potential climate impacts of such air quality strategies. We focus on evaluating the climate responses to changes in anthropogenic emissions of aerosol precursor species: black carbon (BC), organic carbon (OC) and sulphur dioxide (SO2). We perform climate integrations with four fully coupled atmosphere-ocean global climate models (AOGCMs), and examine the effects on global and regional climate of removing the total land-based anthropogenic emissions of each of the three aerosol precursor species. We find that the SO2 emissions reductions lead to the strongest response, with all three models showing an increase in surface temperature focussed in the northern hemisphere high latitudes, and a corresponding increase in global mean precipitation and run-off. Changes in precipitation and run-off patterns are driven mostly by a northward shift in the ITCZ, consistent with the hemispherically asymmetric warming pattern driven by the emissions changes. The BC and OC emissions reductions give a much weaker forcing signal, and there is some disagreement between models in the sign of the climate responses to these perturbations. These differences between models are due largely to natural variability in sea-ice extent, circulation patterns and cloud changes. 
This large natural variability component to the signal when the ocean circulation and sea-ice are free-running means that the BC and OC mitigation measures do not necessarily lead to a discernible climate response.
Abstract:
This study investigates the effects of a short-term pedagogic intervention on the development of L2 fluency among learners studying English for Academic Purposes (EAP) at a university in the UK. It also examines the interaction between the development of fluency and that of complexity and accuracy. Through a pre-test, post-test design, data were collected over a period of four weeks from learners performing monologic tasks. While the Control Group (CG) focused on developing general speaking and listening skills, the Experimental Group (EG) received awareness-raising activities and fluency strategy training in addition to general speaking and listening practice, i.e. following the syllabus. The data, coded in terms of a range of measures of fluency, accuracy and complexity, were subjected to repeated-measures MANOVA, t-tests and correlations. The results indicate that after the intervention, while some fluency gains were achieved by the CG, the EG produced statistically significantly more fluent language, demonstrating a faster speech and articulation rate, longer runs and higher phonation-time ratios. The significant correlations obtained between measures of accuracy and learners’ pauses in the CG suggest that pausing opportunities may have been linked to accuracy. The findings of the study have significant implications for L2 pedagogy, highlighting the positive impact of instruction on the development of fluency.
Abstract:
Seamless phase II/III clinical trials in which an experimental treatment is selected at an interim analysis have been the focus of much recent research interest. Many of the methods proposed are based on the group sequential approach. This paper considers designs of this type in which the treatment selection can be based on short-term endpoint information for more patients than have primary endpoint data available. We show that in such a case, the familywise type I error rate may be inflated if previously proposed group sequential methods are used and the treatment selection rule is not specified in advance. A method is proposed to avoid this inflation by considering the treatment selection that maximises the conditional error given the data available at the interim analysis. A simulation study is reported that illustrates the type I error rate inflation and compares the power of the new approach with two other methods: a combination testing approach and a group sequential method that does not use the short-term endpoint data, both of which also strongly control the type I error rate. The new method is also illustrated through application to a study in Alzheimer's disease. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
Abstract:
The aim of this study was to evaluate the effects of inulin as a fat replacer on short-dough biscuits and their corresponding doughs. A control formulation, with no replacement, and four formulations in which 10, 20, 30, and 40 % of the shortening was replaced by inulin were studied. In the dough, shortening was observed surrounding flour components. At higher fat-replacement levels, flour was more available for hydration, leading to significantly (P<0.05) harder doughs: from 2.76 (0.12) N in 10 % fat-replaced doughs to 5.81 (1.56) N in 30 % fat-replaced ones. Biscuit structure was more continuous than dough structure. A continuous fat layer coated the matrix surface, where starch granules were embedded. In general, weight loss during baking and water activity decreased significantly (P<0.05) as fat replacement increased. Biscuit dimensions and aeration decreased when fat replacement increased; e.g., width gain was +1.20 mm in 10 % fat-replaced biscuits and only +0.32 mm in 40 % fat-replaced ones. Panellists found biscuits with 20 % fat replacement slightly harder than control biscuits. It can be concluded that shortening may be partially replaced, up to 20 %, with inulin. These low-fat biscuits are similar to the control biscuits, and they can have additional health benefits derived from the presence of inulin.
Abstract:
This paper presents a summary of the work done within the European Union's Seventh Framework Programme project ECLIPSE (Evaluating the Climate and Air Quality Impacts of Short-Lived Pollutants). ECLIPSE had a unique systematic concept for designing a realistic and effective mitigation scenario for short-lived climate pollutants (SLCPs; methane, aerosols and ozone, and their precursor species) and quantifying its climate and air quality impacts, and this paper presents the results in the context of this overarching strategy. The first step in ECLIPSE was to create a new emission inventory based on current legislation (CLE) for the recent past and until 2050. Substantial progress compared to previous work was made by including previously unaccounted types of sources such as flaring of gas associated with oil production, and wick lamps. These emission data were used for present-day reference simulations with four advanced Earth system models (ESMs) and six chemistry transport models (CTMs). The model simulations were compared with a variety of ground-based and satellite observational data sets from Asia, Europe and the Arctic. It was found that the models still underestimate the measured seasonality of aerosols in the Arctic but to a lesser extent than in previous studies. Problems likely related to the emissions were identified for northern Russia and India, in particular. To estimate the climate impacts of SLCPs, ECLIPSE followed two paths of research: the first path calculated radiative forcing (RF) values for a large matrix of SLCP species emissions, for different seasons and regions independently. Based on these RF calculations, the Global Temperature change Potential metric for a time horizon of 20 years (GTP20) was calculated for each SLCP emission type. This climate metric was then used in an integrated assessment model to identify all emission mitigation measures with a beneficial air quality and short-term (20-year) climate impact. 
These measures together defined a SLCP mitigation (MIT) scenario. Compared to CLE, the MIT scenario would reduce global methane (CH4) and black carbon (BC) emissions by about 50 and 80 %, respectively. For CH4, measures on shale gas production, waste management and coal mines were most important. For non-CH4 SLCPs, elimination of high-emitting vehicles and wick lamps, as well as reducing emissions from gas flaring, coal and biomass stoves, agricultural waste, solvents and diesel engines were most important. These measures lead to large reductions in calculated surface concentrations of ozone and particulate matter. We estimate that in the EU, the loss of statistical life expectancy due to air pollution was 7.5 months in 2010, which will be reduced to 5.2 months by 2030 in the CLE scenario. The MIT scenario would reduce this value by another 0.9 to 4.3 months. Substantially larger reductions due to the mitigation are found for China (1.8 months) and India (11–12 months). The climate metrics cannot fully quantify the climate response. Therefore, a second research path was taken. Transient climate ensemble simulations with the four ESMs were run for the CLE and MIT scenarios, to determine the climate impacts of the mitigation. In these simulations, the CLE scenario resulted in a surface temperature increase of 0.70 ± 0.14 K between the years 2006 and 2050. For the decade 2041–2050, the warming was reduced by 0.22 ± 0.07 K in the MIT scenario, and this result was in almost exact agreement with the response calculated based on the emission metrics (reduced warming of 0.22 ± 0.09 K). The metrics calculations suggest that non-CH4 SLCPs contribute ~ 22 % to this response and CH4 78 %. This could not be fully confirmed by the transient simulations, which attributed about 90 % of the temperature response to CH4 reductions. 
Attribution of the observed temperature response to non-CH4 SLCP emission reductions, and to BC specifically, is hampered in the transient simulations by the small forcing and by the co-emitted species in the emission basket chosen. Nevertheless, an important conclusion is that our mitigation basket as a whole would lead to clear benefits for both air quality and climate. The climate response from BC reductions in our study is smaller than reported previously, possibly because our study is one of the first to use fully coupled climate models, where unforced variability and sea-ice responses cause relatively strong temperature fluctuations that may counteract (and, thus, mask) the impacts of small emission reductions. The temperature responses to the mitigation were generally stronger over the continents than over the oceans, with the largest warming reduction, of 0.44 (0.39–0.49) K, over the Arctic. Our calculations suggest particularly beneficial climate responses in southern Europe, where surface warming was reduced by about 0.3 K and precipitation rates were increased by about 15 (6–21) mm yr−1 (more than 4 % of total precipitation) from spring to autumn. Thus, the mitigation could help to alleviate expected future drought and water shortages in the Mediterranean area. We also report other important results of the ECLIPSE project.
Abstract:
Research has shown that verbal short‐term memory span is shorter in individuals with Down syndrome than in typically developing individuals of equivalent mental age, but little attention has been given to variations within or across groups. Differences in the environment, and in particular in educational experiences, may play a part in the relative ease or difficulty with which children remember verbal material. This article explores the performance of 26 Egyptian pupils with Down syndrome and 26 Egyptian typically developing children on two verbal short‐term memory tests: digit recall and non-word repetition tasks. The findings of the study revealed that typically developing children showed superior performance on these tasks to that of pupils with Down syndrome, whose performance was both lower and revealed a narrower range of attainment. Not only did the children with Down syndrome perform more poorly than the typically developing children, but their profile also appeared worse than the results of studies of children with Down syndrome of a similar mental age carried out in western countries. The results from this study suggested that, while deficits in verbal short‐term memory in Down syndrome may well be universal, it is important to recognise that performance may vary as a consequence of culture and educational experiences. The significance of these findings is explored with reference to approaches to education and how these are conceptualised in relation to children with disabilities.
Abstract:
Partial budgeting was used to estimate the net benefit of blending Jersey milk with Holstein-Friesian milk for Cheddar cheese production. Jersey milk increases Cheddar cheese yield. However, the cost of Jersey milk is also higher; thus, determining the balance of profitability is necessary, including consideration of seasonal effects. Input variables were based on a pilot plant experiment run from 2012 to 2013 and industry milk and cheese prices during this period. When Jersey milk was used at increasing rates with Holstein-Friesian milk (25, 50, 75, and 100% Jersey milk), it resulted in an increase in average net profit of 3.41, 6.44, 8.57, and 11.18 pence per kilogram of milk, respectively, and this additional profit was constant throughout the year. Sensitivity analysis showed that the most influential input on additional profit was cheese yield, whereas cheese price and milk price had a small effect. The minimum increase in yield necessary for the use of Jersey milk to be profitable was 2.63, 7.28, 9.95, and 12.37% at 25, 50, 75, and 100% Jersey milk, respectively. Including Jersey milk did not affect the quantity of whey butter and powder produced. Although further research is needed to ascertain the amount of additional profit that would be found on a commercial scale, the results indicate that using Jersey milk for Cheddar cheese making would lead to an improvement in profit for cheese makers, especially at higher inclusion rates.
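The partial-budgeting logic in this abstract (extra cheese revenue from a yield gain, offset by the milk-cost premium, with a breakeven yield increase) can be sketched numerically. All prices and yields below are hypothetical placeholders, not the study's actual inputs.

```python
# Minimal sketch of the partial-budgeting idea, under assumed inputs:
# net benefit per kg of milk = extra cheese revenue from the higher yield
# minus the extra cost of the milk blend.

def net_benefit_per_kg_milk(yield_gain, base_yield, cheese_price, milk_premium):
    """yield_gain: fractional increase in cheese yield (0.05 = +5 %);
    base_yield: kg cheese per kg milk; cheese_price: pence per kg cheese;
    milk_premium: extra cost of the blend, pence per kg milk."""
    return yield_gain * base_yield * cheese_price - milk_premium

def breakeven_yield_gain(base_yield, cheese_price, milk_premium):
    """Minimum fractional yield increase at which the blend breaks even."""
    return milk_premium / (base_yield * cheese_price)

# Hypothetical numbers: 0.10 kg cheese/kg milk, 300 p/kg cheese,
# 2 p/kg milk premium -> breakeven yield gain of about 6.7 %.
print(round(breakeven_yield_gain(0.10, 300, 2) * 100, 1))  # -> 6.7
```

This mirrors the structure of the sensitivity result above: the net benefit scales linearly with the yield gain, so cheese yield dominates, while the milk premium only shifts the breakeven point.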