14 results for TEST-DAY RECORDS

in CentAUR: Central Archive, University of Reading - UK


Relevance:

40.00%

Publisher:

Abstract:

Fossil pollen data supplemented by tree macrofossil records were used to reconstruct the vegetation of the Former Soviet Union and Mongolia at 6000 years. Pollen spectra were assigned to biomes using the plant-functional-type method developed by Prentice et al. (1996). Surface pollen data and a modern vegetation map provided a test of the method. This is the first time such a broad-scale vegetation reconstruction for the greater part of northern Eurasia has been attempted with objective techniques. The new results confirm previous regional palaeoenvironmental studies of the mid-Holocene while providing a comprehensive synopsis and firmer conclusions. West of the Ural Mountains temperate deciduous forest extended both northward and southward from its modern range. The northern limits of cool mixed and cool conifer forests were also further north than present. Taiga was reduced in European Russia, but was extended into Yakutia where now there is cold deciduous forest. The northern limit of taiga was extended (as shown by increased Picea pollen percentages, and by tree macrofossil records north of the present-day forest limit) but tundra was still present in north-eastern Siberia. The boundary between forest and steppe in the continental interior did not shift substantially, and dry conditions similar to present existed in western Mongolia and north of the Aral Sea.
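
The biomisation step mentioned above can be pictured with a short sketch: each biome is scored on the square-root-transformed percentages of the pollen taxa assigned to it (via plant functional types), and the spectrum is allocated to the biome with the highest affinity score. The taxon lists, threshold and example spectrum below are invented for illustration and are not the Prentice et al. (1996) assignments.

import numpy as np

# Illustrative biome assignment by affinity score: each biome gets the sum of
# square-root-transformed percentages of its characteristic taxa, and the
# spectrum is assigned to the biome with the highest score. Taxon lists,
# threshold and example spectrum are hypothetical.
BIOME_TAXA = {
    "taiga": {"Picea", "Pinus", "Betula"},
    "tundra": {"Betula", "Cyperaceae", "Salix"},
    "steppe": {"Artemisia", "Chenopodiaceae", "Poaceae"},
}

def assign_biome(spectrum, threshold=0.5):
    """spectrum: dict of taxon -> pollen percentage; returns (biome, scores)."""
    scores = {}
    for biome, taxa in BIOME_TAXA.items():
        values = [spectrum.get(taxon, 0.0) for taxon in taxa]
        scores[biome] = sum(np.sqrt(v) for v in values if v > threshold)
    return max(scores, key=scores.get), scores

biome, scores = assign_biome({"Picea": 35.0, "Betula": 20.0, "Artemisia": 5.0})
print(biome, scores)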

Relevance:

40.00%

Publisher:

Abstract:

Accurate knowledge of species’ habitat associations is important for conservation planning and policy. Assessing habitat associations is a vital precursor to selecting appropriate indicator species for prioritising sites for conservation or assessing trends in habitat quality. However, much existing knowledge is based on qualitative expert opinion or local-scale studies, and may not remain accurate across different spatial scales or geographic locations. Data from biological recording schemes have the potential to provide objective measures of habitat association, with the ability to account for spatial variation. We used data on 50 British butterfly species as a test case to investigate the correspondence of data-derived measures of habitat association with expert opinion, using two different butterfly recording schemes. One scheme collected large quantities of occurrence data (c. 3 million records) and the other, lower quantities of standardised monitoring data (c. 1400 sites). We used general linear mixed-effects models to derive scores of association with broad-leaf woodland for both datasets and compared them with scores canvassed from experts. Scores derived from occurrence and abundance data both showed strongly positive correlations with expert opinion. However, only for the occurrence data did these fall within the range of correlations between experts. Data-derived scores showed regional spatial variation in the strength of butterfly associations with broad-leaf woodland, with a significant latitudinal trend in 26% of species. Sub-sampling of the data suggested that a mean sample size of 5000 occurrence records per species is needed for an accurate estimate of habitat association, although habitat specialists are likely to be readily detected using several hundred records. Occurrence data from recording schemes can thus provide easily obtained, objective, quantitative measures of habitat association.
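
As an illustration of how an association score can be derived from occurrence records, the sketch below regresses presence/absence on the proportion of broad-leaf woodland around each site and treats the fitted slope as the score. The study used general linear mixed-effects models with additional terms; this simplified fixed-effects logistic regression on synthetic data is only meant to convey the idea.

import numpy as np
import statsmodels.api as sm

# Synthetic occurrence data: presence/absence of a species at sites with a
# known proportion of broad-leaf woodland in the surrounding landscape.
rng = np.random.default_rng(42)
woodland = rng.uniform(0, 1, 500)                 # proportion of broad-leaf woodland
true_logit = -1.0 + 2.5 * woodland                # hypothetical true relationship
presence = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

# Logistic regression of presence on woodland cover; the slope is taken as a
# simple data-derived score of association with broad-leaf woodland.
X = sm.add_constant(woodland)
fit = sm.GLM(presence, X, family=sm.families.Binomial()).fit()
print(f"woodland association score (logit slope): {fit.params[1]:.2f}")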

Relevance:

30.00%

Publisher:

Abstract:

In a series of experiments the toxicity of lead to worms in soil was determined following the draft OECD earthworm reproduction toxicity protocol, except that lead was added as solid lead nitrate, carbonate and sulphide rather than as lead nitrate solution as would normally be the case. The compounds were added to the test soil to give lead concentrations of 625-12,500 µg Pb g-1 of soil. Calculated toxicities of the lead decreased in the order nitrate > carbonate > sulphide, the same order as the decrease in the solubility of the metal compounds used. The 7-day LC50 (the concentration lethal to 50% of the population) for the nitrate was 5321 ± 275 µg Pb g-1 of soil and this did not change with time. The LC50 values for carbonate and sulphide could not be determined at the concentration ranges used. The only parameter sensitive enough to distinguish the toxicities of the three compounds was cocoon (egg) production. The EC50s for cocoon production (the concentration producing a 50% reduction in cocoon production) were 993, 8604 and 10,246 µg Pb g-1 of soil for lead nitrate, carbonate and sulphide, respectively. Standard toxicity tests need to take into account the form in which the contaminant is present in the soil to be of environmental relevance.
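
A common way to obtain an LC50 or EC50 from such a test is to fit a log-logistic dose-response curve to the observed response at each concentration and read off the concentration giving a 50% effect. The sketch below does this with invented mortality data; the concentrations and responses are placeholders, not the study's measurements.

import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, ec50, slope):
    """Fraction of the population responding at a given concentration."""
    return 1.0 / (1.0 + (ec50 / conc) ** slope)

# Invented dose-response data: soil Pb concentration (µg Pb per g soil) and
# the fraction of worms killed; these are placeholders, not the study's data.
conc = np.array([625.0, 1250.0, 2500.0, 5000.0, 10000.0, 12500.0])
dead = np.array([0.00, 0.05, 0.15, 0.45, 0.90, 1.00])

params, _ = curve_fit(log_logistic, conc, dead, p0=[5000.0, 2.0])
print(f"estimated LC50 ≈ {params[0]:.0f} µg Pb per g soil (slope {params[1]:.2f})")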

Relevance:

30.00%

Publisher:

Abstract:

Internationally agreed standard protocols for assessing the chemical toxicity of contaminants in soil to worms assume that the test soil does not need to equilibrate with the chemical to be tested prior to the addition of the test organisms, and that the chemical will exert any toxic effect upon the test organism within 28 days. Three experiments were carried out to investigate these assumptions. The first experiment was a standard toxicity test in which lead nitrate was added to a soil in solution to give a range of concentrations. The mortality of the worms and the concentration of lead in the survivors were determined. The LC50s for 14 and 28 days were 5311 and 5395 µg Pb g-1 of soil, respectively. The second experiment was a timed lead accumulation study with worms cultivated in soil containing either 3000 or 5000 µg Pb g-1 of soil. The concentration of lead in the worms was determined at various sampling times. Uptake at both concentrations was linear with time. Worms in the 5000 µg g-1 soil accumulated lead at a faster rate (3.16 µg Pb g-1 tissue day-1) than those in the 3000 µg g-1 soil (2.21 µg Pb g-1 tissue day-1). The third experiment was a timed experiment with worms cultivated in soil containing 7000 µg Pb g-1 of soil. Soil and lead nitrate solution were mixed and stored at 20 °C. Worms were added at various times over a 35-day period. The time to death increased from 23 h, when worms were added directly after the lead was added to the soil, to 67 h when worms were added after the soil had equilibrated with the lead for 35 days. In artificially Pb-amended soils the worms accumulate Pb over the duration of their exposure to the Pb. Thus time-limited toxicity tests may be terminated before the worm body load has reached a toxic level. This could result in under-estimates of the toxicity of Pb to worms. As the equilibration time of artificially amended Pb-bearing soils increases, the bioavailability of Pb decreases. Thus addition of worms shortly after addition of Pb to soils may result in an over-estimate of Pb toxicity to worms. The current OECD acute worm toxicity test fails to take these two phenomena into account, thereby reducing the environmental relevance of the contaminant toxicities it is used to calculate.
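
The uptake rates quoted above are essentially slopes of tissue concentration against exposure time. A minimal sketch of that calculation, using hypothetical sampling times and tissue concentrations rather than the study's data, is:

import numpy as np

# Hypothetical timed accumulation data: exposure time (days) and tissue Pb
# concentration (µg Pb per g tissue); if uptake is linear with time, the
# least-squares slope is the uptake rate.
days = np.array([0.0, 7.0, 14.0, 21.0, 28.0])
tissue_pb = np.array([5.0, 28.0, 49.0, 73.0, 94.0])

slope, intercept = np.polyfit(days, tissue_pb, deg=1)
print(f"uptake rate ≈ {slope:.2f} µg Pb per g tissue per day")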

Relevance:

30.00%

Publisher:

Abstract:

Accurate estimation of the soil water balance (SWB) is important for a number of applications (e.g. environmental, meteorological, agronomical and hydrological). The objective of this study was to develop and test techniques for the estimation of soil water fluxes and SWB components (particularly infiltration, evaporation and drainage below the root zone) from soil water records. The work presented here is based on profile soil moisture data measured using dielectric methods, at 30-min resolution, at an experimental site with different vegetation covers (barley, sunflower and bare soil). Estimates of infiltration were derived by assuming that observed gains in the soil profile water content during rainfall were due to infiltration. Inaccuracies related to diurnal fluctuations present in the dielectric-based soil water records were resolved by filtering the data with appropriate threshold values. Inconsistencies caused by the redistribution of water after rain events were corrected by allowing for a redistribution period before computing water gains. Estimates of evaporation and drainage were derived from water losses above and below the deepest zero flux plane (ZFP), respectively. The evaporation estimates for the sunflower field were compared to evaporation data obtained with an eddy covariance (EC) system located elsewhere in the field. The EC estimate of total evaporation for the growing season was about 25% larger than that derived from the soil water records. This was consistent with differences in crop growth (based on direct measurements of biomass, and field mapping of vegetation using laser altimetry) between the EC footprint and the area of the field used for soil moisture monitoring.
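
The infiltration estimate described above amounts to summing gains in profile water storage during rainfall, after filtering out small diurnal fluctuations with a threshold. The sketch below illustrates this on a few invented 30-min storage values; the threshold and rainfall flags are assumptions, not values from the paper.

import numpy as np

def infiltration_from_storage(storage_mm, raining, threshold_mm=0.2):
    """Sum positive 30-min gains in profile water storage during rainfall.

    Gains smaller than threshold_mm are treated as diurnal noise in the
    dielectric signal and ignored.
    """
    gains = np.diff(storage_mm)
    during_rain = raining[1:]            # align the rainfall flags with the gains
    significant = gains > threshold_mm
    return gains[during_rain & significant].sum()

# Invented 30-min profile storage values (mm of water) and rainfall flags.
storage = np.array([120.0, 120.1, 123.5, 126.0, 126.1, 125.8])
raining = np.array([False, False, True, True, False, False])
print(f"estimated infiltration: {infiltration_from_storage(storage, raining):.1f} mm")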

Relevance:

30.00%

Publisher:

Abstract:

Objective To assess the impact of a closed-loop electronic prescribing and automated dispensing system on the time spent providing a ward pharmacy service and the activities carried out. Setting Surgical ward, London teaching hospital. Method All data were collected two months pre- and one year post-intervention. First, the ward pharmacist recorded the time taken each day for four weeks. Second, an observational study was conducted over 10 weekdays, using two-dimensional work sampling, to identify the ward pharmacist's activities. Finally, medication orders were examined to identify pharmacists' endorsements that should have been, and were actually, made. Key findings Mean time to provide a weekday ward pharmacy service increased from 1 h 8 min to 1 h 38 min per day (P = 0.001; unpaired t-test). There were significant increases in time spent prescription monitoring, recommending changes in therapy/monitoring, giving advice or information, and non-productive time. There were decreases for supply, looking for charts and checking patients' own drugs. There was an increase in the amount of time spent with medical and pharmacy staff, and with 'self'. Seventy-eight per cent of patients' medication records could be assessed for endorsements pre- and 100% post-intervention. Endorsements were required for 390 (50%) of 787 medication orders pre-intervention and 190 (21%) of 897 afterwards (P < 0.0001; chi-square test). Endorsements were made for 214 (55%) of endorsement opportunities pre-intervention and 57 (30%) afterwards (P < 0.0001; chi-square test). Conclusion The intervention increased the overall time required to provide a ward pharmacy service and changed the types of activity undertaken. Contact time with medical and pharmacy staff increased. There was no significant change in time spent with patients. Fewer pharmacy endorsements were required post-intervention, but a lower percentage were actually made. The findings have important implications for the design, introduction and use of similar systems.
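
The chi-square comparison of endorsement requirements can be reproduced directly from the counts quoted in the abstract (390 of 787 medication orders pre-intervention, 190 of 897 post-intervention), for example:

import numpy as np
from scipy.stats import chi2_contingency

# 2x2 table: medication orders requiring vs not requiring endorsement,
# pre- and post-intervention (counts as reported in the abstract).
table = np.array([
    [390, 787 - 390],   # pre-intervention
    [190, 897 - 190],   # post-intervention
])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, p = {p_value:.2g}")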

Relevance:

30.00%

Publisher:

Abstract:

In this paper a robust method is developed for the analysis of data consisting of repeated binary observations taken at up to three fixed time points on each subject. The primary objective is to compare outcomes at the last time point, using earlier observations to predict this for subjects with incomplete records. A score test is derived. The method is developed for application to sequential clinical trials, as at interim analyses there will be many incomplete records occurring in non-informative patterns. Motivation for the methodology comes from experience with clinical trials in stroke and head injury, and data from one such trial are used to illustrate the approach. Extensions to more than three time points and to allow for stratification are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Introduction A high saturated fatty acid intake is a well recognized risk factor for coronary heart disease development. More recently, a high intake of n-6 polyunsaturated fatty acids (PUFA) in combination with a low intake of the long-chain n-3 PUFAs eicosapentaenoic acid and docosahexaenoic acid has also been implicated as an important risk factor. Aim To compare total dietary fat and fatty acid intake measured by chemical analysis of duplicate diets with nutritional database analysis of estimated dietary records, collected over the same 3-day study period. Methods Total fat was analysed using Soxhlet extraction and subsequently the individual fatty acid content of the diet was determined by gas chromatography. Estimated dietary records were analysed using a nutrient database which was supplemented with a selection of dishes commonly consumed by study participants. Results Bland & Altman statistical analysis demonstrated a lack of agreement between the two dietary assessment techniques for determining dietary fat and fatty acid intake. Conclusion The lack of agreement observed between dietary evaluation techniques may be attributed to inadequacies in either or both assessment techniques. This study highlights the difficulties that may be encountered when attempting to accurately evaluate dietary fat intake in the population.
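
For readers unfamiliar with the Bland & Altman approach, the core calculation is the mean difference (bias) between paired measurements and its limits of agreement (bias ± 1.96 SD of the differences). The sketch below uses invented paired fat intakes, not the study's data.

import numpy as np

# Invented paired fat intakes (g/day) from the two assessment methods.
chemical = np.array([72.0, 85.0, 60.0, 95.0, 78.0, 66.0])   # duplicate-diet analysis
database = np.array([68.0, 92.0, 55.0, 88.0, 84.0, 70.0])   # estimated dietary records

differences = database - chemical
bias = differences.mean()
half_width = 1.96 * differences.std(ddof=1)
print(f"bias = {bias:.1f} g/day, "
      f"limits of agreement = {bias - half_width:.1f} to {bias + half_width:.1f} g/day")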

Relevance:

30.00%

Publisher:

Abstract:

The collection of wind speed time series by means of digital data loggers occurs in many domains, including civil engineering, environmental sciences and wind turbine technology. Since averaging intervals are often significantly larger than typical system time scales, the information lost has to be recovered in order to reconstruct the true dynamics of the system. In the present work we describe a simple algorithm capable of generating a real-time wind speed time series from data logger records containing the average, maximum, and minimum values of the wind speed in a fixed interval, as well as the standard deviation. The signal is generated from a generalized random Fourier series. The spectrum can be matched to any desired theoretical or measured frequency distribution. Extreme values are specified through a postprocessing step based on the concept of constrained simulation. Applications of the algorithm to 10-min wind speed records logged at a test site at 60 m height above the ground show that the recorded 10-min values can be reproduced by the simulated time series to a high degree of accuracy.
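
A stripped-down version of the reconstruction idea is sketched below: a random-phase Fourier series provides the fluctuating signal, which is then rescaled to the recorded 10-min mean and standard deviation. The paper additionally matches a prescribed spectrum and imposes the recorded extremes through constrained simulation; the 1/f-like amplitudes and the omission of the extreme-value step here are simplifying assumptions.

import numpy as np

def reconstruct_interval(mean, std, n=600, n_modes=50, seed=0):
    """Generate an n-sample signal with the given mean and standard deviation
    from a random-phase Fourier series (assumed 1/f-like amplitudes)."""
    rng = np.random.default_rng(seed)
    t = np.arange(n) / n
    amplitudes = 1.0 / np.arange(1, n_modes + 1)
    phases = rng.uniform(0.0, 2.0 * np.pi, n_modes)
    signal = sum(a * np.cos(2.0 * np.pi * k * t + p)
                 for k, (a, p) in enumerate(zip(amplitudes, phases), start=1))
    signal = (signal - signal.mean()) / signal.std()   # standardise
    return mean + std * signal                         # match the logged moments

# 1-s samples over a 10-min interval whose logged mean and std are given.
series = reconstruct_interval(mean=7.2, std=1.1)
print(f"reconstructed mean {series.mean():.2f} m/s, std {series.std():.2f} m/s")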

Relevance:

30.00%

Publisher:

Abstract:

Whilst common sense knowledge has been well researched in terms of intelligence and (in particular) artificial intelligence, specific, factual knowledge also plays a critical part in practice. When it comes to testing for intelligence, testing for factual knowledge is, in every-day life, frequently used as a front line tool. This paper presents new results which were the outcome of a series of practical Turing tests held on 23rd June 2012 at Bletchley Park, England. The focus of this paper is on the employment of specific knowledge testing by interrogators. Of interest are prejudiced assumptions made by interrogators as to what they believe should be widely known and subsequently the conclusions drawn if an entity does or does not appear to know a particular fact known to the interrogator. The paper is not at all about the performance of machines or hidden humans but rather the strategies based on assumptions of Turing test interrogators. Full, unedited transcripts from the tests are shown for the reader as working examples. As a result, it might be possible to draw critical conclusions with regard to the nature of human concepts of intelligence, in terms of the role played by specific, factual knowledge in our understanding of intelligence, whether this is exhibited by a human or a machine. This is specifically intended as a position paper, firstly by claiming that practicalising Turing's test is a useful exercise throwing light on how we humans think, and secondly, by taking a potentially controversial stance, because some interrogators adopt a solipsist questioning style of hidden entities with a view that it is a thinking intelligent human if it thinks like them and knows what they know. The paper is aimed at opening discussion with regard to the different aspects considered.

Relevance:

30.00%

Publisher:

Abstract:

Background: Advances in nutritional assessment are continuing to embrace developments in computer technology. The online Food4Me food frequency questionnaire (FFQ) was created as an electronic system for the collection of nutrient intake data. To ensure its accuracy in assessing both nutrient and food group intake, further validation against data obtained using a reliable, but independent, instrument and assessment of its reproducibility are required. Objective: The aim was to assess the reproducibility and validity of the Food4Me FFQ against a 4-day weighed food record (WFR). Methods: Reproducibility of the Food4Me FFQ was assessed using test-retest methodology by asking participants to complete the FFQ on 2 occasions 4 weeks apart. To assess the validity of the Food4Me FFQ against the 4-day WFR, half the participants were also asked to complete a 4-day WFR 1 week after the first administration of the Food4Me FFQ. Level of agreement between nutrient and food group intakes estimated by the repeated Food4Me FFQ and the Food4Me FFQ and 4-day WFR were evaluated using Bland-Altman methodology and classification into quartiles of daily intake. Crude unadjusted correlation coefficients were also calculated for nutrient and food group intakes. Results: In total, 100 people participated in the assessment of reproducibility (mean age 32, SD 12 years), and 49 of these (mean age 27, SD 8 years) also took part in the assessment of validity. Crude unadjusted correlations for repeated Food4Me FFQ ranged from .65 (vitamin D) to .90 (alcohol). The mean cross-classification into “exact agreement plus adjacent” was 92% for both nutrient and food group intakes, and Bland-Altman plots showed good agreement for energy-adjusted macronutrient intakes. Agreement between the Food4Me FFQ and 4-day WFR varied, with crude unadjusted correlations ranging from .23 (vitamin D) to .65 (protein, % total energy) for nutrient intakes and .11 (soups, sauces and miscellaneous foods) to .73 (yogurts) for food group intake. The mean cross-classification into “exact agreement plus adjacent” was 80% and 78% for nutrient and food group intake, respectively. There were no significant differences between energy intakes estimated using the Food4Me FFQ and 4-day WFR, and Bland-Altman plots showed good agreement for both energy and energy-controlled nutrient intakes. Conclusions: The results demonstrate that the online Food4Me FFQ is reproducible for assessing nutrient and food group intake and has moderate agreement with the 4-day WFR for assessing energy and energy-adjusted nutrient intakes. The Food4Me FFQ is a suitable online tool for assessing dietary intake in healthy adults.
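
The quartile cross-classification used in the validity analysis can be sketched as follows: intakes from the two instruments are each split into quartiles and the proportion of participants in the same or an adjacent quartile is reported. The synthetic intakes below stand in for the Food4Me FFQ and WFR data, which are not reproduced here.

import numpy as np

def quartile(x):
    """Quartile index (0-3) of each value of x."""
    return np.digitize(x, np.quantile(x, [0.25, 0.5, 0.75]))

# Synthetic intakes standing in for the two instruments.
rng = np.random.default_rng(1)
ffq = rng.normal(2000, 400, 200)              # e.g. energy intake from the FFQ
wfr = ffq + rng.normal(0, 250, 200)           # correlated intake from the food record

agreement = (np.abs(quartile(ffq) - quartile(wfr)) <= 1).mean()
print(f"exact plus adjacent quartile agreement: {agreement:.0%}")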

Relevance:

30.00%

Publisher:

Abstract:

An improved understanding of present-day climate variability and change relies on high-quality data sets from the past 2 millennia. Global efforts to model regional climate modes are in the process of being validated against, and integrated with, records of past vegetation change. For South America, however, the full potential of vegetation records for evaluating and improving climate models has hitherto not been sufficiently acknowledged due to an absence of information on the spatial and temporal coverage of study sites. This paper therefore serves as a guide to high-quality pollen records that capture environmental variability during the last 2 millennia. We identify 60 vegetation (pollen) records from across South America which satisfy geochronological requirements set out for climate modelling, and we discuss their sensitivity to the spatial signature of climate modes throughout the continent. Diverse patterns of vegetation response to climate change are observed, with more similar patterns of change in the lowlands and varying intensity and direction of responses in the highlands. Pollen records display local-scale responses to climate modes; thus, it is necessary to understand how vegetation–climate interactions might diverge under variable settings. We provide a qualitative translation from pollen metrics to climate variables. Additionally, pollen is an excellent indicator of human impact through time. We discuss evidence for human land use in pollen records and provide an overview considered useful for archaeological hypothesis testing and important in distinguishing natural from anthropogenically driven vegetation change. We stress the need for the palynological community to be more familiar with climate variability patterns to correctly attribute the potential causes of observed vegetation dynamics. This manuscript forms part of the wider LOng-Term multi-proxy climate REconstructions and Dynamics in South America – 2k initiative that provides the ideal framework for the integration of the various palaeoclimatic subdisciplines and palaeo-science, thereby jump-starting and fostering multidisciplinary research into environmental change on centennial and millennial timescales.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Using continuing professional development (CPD) as part of the revalidation of pharmacy professionals has been proposed in the UK but not implemented. We developed a CPD Outcomes Framework (‘the framework’) for scoring CPD records, where the score range was -100 to +150 based on demonstrable relevance and impact of the CPD on practice. OBJECTIVE: This exploratory study aimed to test the outcome of training people to use the framework, through distance-learning material (active intervention), by comparing CPD scores before and after training. SETTING: Pharmacy professionals were recruited in the UK in Reading, Banbury, Southampton, Kingston-upon-Thames and Guildford in 2009. METHOD: We conducted a randomised, double-blinded, parallel-group, before and after study. The control group simply received information on new CPD requirements through the post; the active intervention group also received the framework and associated training. Altogether 48 participants (25 control, 23 active) completed the study. All participants submitted CPD records to the research team before and after receiving the posted resources. The records (n=226) were scored blindly by the researchers using the framework. A subgroup of CPD records (n=96) submitted first (before-stage) and rewritten (after-stage) were analysed separately. MAIN OUTCOME MEASURE: Scores for CPD records received before and after distributing group-dependent material through the post. RESULTS: Using a linear-regression model both analyses found an increase in CPD scores in favour of the active intervention group. For the complete set of records, the effect was a mean difference of 9.9 (95% CI = 0.4 to 19.3), p-value = 0.04. For the subgroup of rewritten records, the effect was a mean difference of 17.3 (95% CI = 5.6 to 28.9), p-value = 0.0048. CONCLUSION: The intervention improved participants’ CPD behaviour. Training pharmacy professionals to use the framework resulted in better CPD activities and CPD records, potentially helpful for revalidation of pharmacy professionals. IMPACT: • Using a bespoke Continuing Professional Development outcomes framework improves the value of pharmacy professionals’ CPD activities and CPD records, with the potential to improve patient care. • The CPD outcomes framework could be helpful to pharmacy professionals internationally who want to improve the quality of their CPD activities and CPD records. • Regulators and officials across Europe and beyond can assess the suitability of the CPD outcomes framework for use in pharmacy CPD and revalidation in their own setting.
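
The analysis described above (a linear regression comparing post-intervention CPD scores between groups) can be pictured with the simplified sketch below, which adjusts for the baseline score. Group sizes follow the abstract (23 active, 25 control), but the scores are simulated placeholders and the model omits the clustering of multiple records per participant.

import numpy as np
import statsmodels.api as sm

# Simulated before/after CPD scores; group sizes follow the abstract
# (23 active intervention, 25 control) but the scores are placeholders.
rng = np.random.default_rng(7)
group = np.array([1] * 23 + [0] * 25)                  # 1 = received the framework
pre = rng.normal(40, 20, group.size)
post = pre + 10 * group + rng.normal(0, 15, group.size)

# Linear regression of post-intervention score on group, adjusted for the
# pre-intervention score.
X = sm.add_constant(np.column_stack([group, pre]))
fit = sm.OLS(post, X).fit()
low, high = fit.conf_int()[1]
print(f"adjusted mean difference: {fit.params[1]:.1f} (95% CI {low:.1f} to {high:.1f})")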

Relevance:

30.00%

Publisher:

Abstract:

Although the sunspot-number series have existed since the mid-19th century, they are still the subject of intense debate, with the largest uncertainty being related to the "calibration" of the visual acuity of individual observers in the past. Daisy-chain regression methods are applied to inter-calibrate the observers, which may lead to significant bias and error accumulation. Here we present a novel method to calibrate the visual acuity of the key observers to the reference data set of Royal Greenwich Observatory sunspot groups for the period 1900-1976, using the statistics of the active-day fraction. For each observer we independently evaluate their observational threshold [S_S], defined such that the observer is assumed to miss all of the groups with an area smaller than S_S and to report all the groups larger than S_S. Next, using a Monte Carlo method, we construct, from the reference data set, a correction matrix for each observer. The correction matrices are significantly non-linear and cannot be approximated by a linear regression or proportionality. We emphasize that corrections based on a linear proportionality between annually averaged data lead to serious biases and distortions of the data. The correction matrices are applied to the original sunspot group records for each day, and finally the composite corrected series is produced for the period since 1748. The corrected series displays secular minima around 1800 (Dalton minimum) and 1900 (Gleissberg minimum), as well as the Modern grand maximum of activity in the second half of the 20th century. The uniqueness of the grand maximum is confirmed for the last 250 years. It is shown that the adoption of a linear relationship between the data of Wolf and Wolfer results in grossly inflated group numbers in the 18th and 19th centuries in some reconstructions.
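
The construction of an observer's correction matrix can be caricatured as follows: assume the observer misses every group with area below the threshold S_S, take (or, as here, simulate) daily group lists, and accumulate the distribution of true group counts conditional on the counts the observer would have reported. The synthetic group areas, the Poisson daily counts and the threshold value below are illustrative assumptions; the paper builds the matrices from the Royal Greenwich Observatory record rather than simulated days.

import numpy as np

rng = np.random.default_rng(3)
S_S = 50.0                    # assumed observer threshold (group area units)
n_days, max_groups = 20000, 15
counts = np.zeros((max_groups + 1, max_groups + 1))

for _ in range(n_days):
    n_true = rng.poisson(4)                          # true number of groups that day
    areas = rng.lognormal(mean=4.0, sigma=1.0, size=n_true)
    n_obs = int((areas >= S_S).sum())                # groups the observer would report
    if n_true <= max_groups:
        counts[n_obs, n_true] += 1

# Row-normalise to get P(true count | observed count); the resulting matrix
# is clearly non-linear, which is why a simple proportionality is inadequate.
row_sums = counts.sum(axis=1, keepdims=True)
prob = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
print(prob[3, :8].round(2))   # distribution of true counts when three groups are seen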