87 results for Electronic record
in CentAUR: Central Archive University of Reading - UK
Abstract:
Objectives: To assess the impact of a closed-loop electronic prescribing, automated dispensing, barcode patient identification and electronic medication administration record (EMAR) system on prescribing and administration errors, confirmation of patient identity before administration, and staff time. Design, setting and participants: Before-and-after study in a surgical ward of a teaching hospital, involving patients and staff of that ward. Intervention: Closed-loop electronic prescribing, automated dispensing, barcode patient identification and EMAR system. Main outcome measures: Percentage of new medication orders with a prescribing error, percentage of doses with medication administration errors (MAEs) and percentage given without checking patient identity. Time spent prescribing and providing a ward pharmacy service. Nursing time on medication tasks. Results: Prescribing errors were identified in 3.8% of 2450 medication orders pre-intervention and 2.0% of 2353 orders afterwards (p<0.001; χ2 test). MAEs occurred in 7.0% of 1473 non-intravenous doses pre-intervention and 4.3% of 1139 afterwards (p = 0.005; χ2 test). Patient identity was not checked for 82.6% of 1344 doses pre-intervention and 18.9% of 1291 afterwards (p<0.001; χ2 test). Medical staff required 15 s to prescribe a regular inpatient drug pre-intervention and 39 s afterwards (p = 0.03; t test). Time spent providing a ward pharmacy service increased from 68 min to 98 min each weekday (p = 0.001; t test); 22% of drug charts were unavailable pre-intervention. Time per drug administration round decreased from 50 min to 40 min (p = 0.006; t test); nursing time on medication tasks outside of drug rounds increased from 21.1% to 28.7% (p = 0.006; χ2 test). Conclusions: A closed-loop electronic prescribing, dispensing and barcode patient identification system reduced prescribing errors and MAEs, and increased confirmation of patient identity before administration. Time spent on medication-related tasks increased.
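To make the headline comparison above concrete, the sketch below reconstructs the 2x2 chi-squared test on prescribing errors from the reported figures (3.8% of 2450 orders pre-intervention versus 2.0% of 2353 afterwards). It assumes Python with SciPy and derives approximate error counts from the quoted percentages; it is an illustration, not the authors' analysis code.

```python
# Illustrative 2x2 chi-squared test of prescribing error rates before and
# after the closed-loop system. Error counts are reconstructed (approximately)
# from the percentages quoted in the abstract; this is not the study's code.
from scipy.stats import chi2_contingency

orders_pre, orders_post = 2450, 2353
errors_pre = round(0.038 * orders_pre)    # ~93 orders with a prescribing error
errors_post = round(0.020 * orders_post)  # ~47 orders with a prescribing error

table = [
    [errors_pre, orders_pre - errors_pre],
    [errors_post, orders_post - errors_post],
]
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi-squared = {chi2:.1f}, p = {p_value:.4f}")  # p well below 0.001
```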
Abstract:
Background: Advances in nutritional assessment are continuing to embrace developments in computer technology. The online Food4Me food frequency questionnaire (FFQ) was created as an electronic system for the collection of nutrient intake data. To ensure its accuracy in assessing both nutrient and food group intake, further validation against data obtained using a reliable, but independent, instrument and assessment of its reproducibility are required. Objective: The aim was to assess the reproducibility and validity of the Food4Me FFQ against a 4-day weighed food record (WFR). Methods: Reproducibility of the Food4Me FFQ was assessed using test-retest methodology by asking participants to complete the FFQ on 2 occasions 4 weeks apart. To assess the validity of the Food4Me FFQ against the 4-day WFR, half the participants were also asked to complete a 4-day WFR 1 week after the first administration of the Food4Me FFQ. Agreement between nutrient and food group intakes estimated by the repeated Food4Me FFQ, and between the Food4Me FFQ and the 4-day WFR, was evaluated using Bland-Altman methodology and classification into quartiles of daily intake. Crude unadjusted correlation coefficients were also calculated for nutrient and food group intakes. Results: In total, 100 people participated in the assessment of reproducibility (mean age 32, SD 12 years), and 49 of these (mean age 27, SD 8 years) also took part in the assessment of validity. Crude unadjusted correlations for repeated Food4Me FFQ ranged from .65 (vitamin D) to .90 (alcohol). The mean cross-classification into “exact agreement plus adjacent” was 92% for both nutrient and food group intakes, and Bland-Altman plots showed good agreement for energy-adjusted macronutrient intakes. Agreement between the Food4Me FFQ and 4-day WFR varied, with crude unadjusted correlations ranging from .23 (vitamin D) to .65 (protein, % total energy) for nutrient intakes and .11 (soups, sauces and miscellaneous foods) to .73 (yogurts) for food group intake. The mean cross-classification into “exact agreement plus adjacent” was 80% and 78% for nutrient and food group intake, respectively. There were no significant differences between energy intakes estimated using the Food4Me FFQ and 4-day WFR, and Bland-Altman plots showed good agreement for both energy and energy-controlled nutrient intakes. Conclusions: The results demonstrate that the online Food4Me FFQ is reproducible for assessing nutrient and food group intake and has moderate agreement with the 4-day WFR for assessing energy and energy-adjusted nutrient intakes. The Food4Me FFQ is a suitable online tool for assessing dietary intake in healthy adults.
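The agreement statistics above rest on Bland-Altman methodology; the following is a minimal sketch of that calculation in Python with NumPy. The paired intake values are invented placeholders purely to show the mechanics, not data from the Food4Me study.

```python
# Minimal Bland-Altman sketch: mean bias and 95% limits of agreement between
# two dietary assessment methods (e.g. an FFQ and a weighed food record).
# The paired energy intakes below are invented placeholders, not study data.
import numpy as np

ffq = np.array([2100.0, 1850.0, 2400.0, 1990.0, 2250.0])  # kcal/day, method 1
wfr = np.array([2010.0, 1900.0, 2300.0, 2080.0, 2150.0])  # kcal/day, method 2

diff = ffq - wfr
means = (ffq + wfr) / 2               # x-axis values of a Bland-Altman plot
bias = diff.mean()                    # mean difference between the methods
half_width = 1.96 * diff.std(ddof=1)  # half-width of the 95% limits of agreement

print(f"bias = {bias:.0f} kcal/day, limits of agreement = "
      f"({bias - half_width:.0f}, {bias + half_width:.0f}) kcal/day")
```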
Abstract:
Point defects in metal oxides such as TiO2 are key to their applications in numerous technologies. The investigation of thermally induced nonstoichiometry in TiO2 is complicated by the difficulties in preparing and determining a desired degree of nonstoichiometry. We study controlled self-doping of TiO2 by adsorption of 1/8 and 1/16 monolayer Ti at the (110) surface using a combination of experimental and computational approaches to unravel the details of the adsorption process and the oxidation state of Ti. Upon adsorption of Ti, x-ray and ultraviolet photoemission spectroscopy (XPS and UPS) show formation of reduced Ti. Comparison of pure density functional theory (DFT) with experiment shows that pure DFT provides an inconsistent description of the electronic structure. To surmount this difficulty, we apply DFT corrected for on-site Coulomb interaction (DFT+U) to describe reduced Ti ions. The optimal value of U is 3 eV, determined from comparison of the computed Ti 3d electronic density of states with the UPS data. DFT+U and UPS show the appearance of a Ti 3d adsorbate-induced state at 1.3 eV above the valence band and 1.0 eV below the conduction band. The computations show that the adsorbed Ti atom is oxidized to Ti2+ and a fivefold coordinated surface Ti atom is reduced to Ti3+, while the remaining electron is distributed among other surface Ti atoms. The UPS data are best fitted with reduced Ti2+ and Ti3+ ions. These results demonstrate that the complexity of doped metal oxides is best understood with a combination of experiment and appropriate computations.
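For readers unfamiliar with the DFT+U approach mentioned above, a widely used rotationally invariant (Dudarev-type) form of the on-site correction is shown below for reference; this is a standard textbook expression rather than a formula quoted from the paper, with U_eff playing the role of the U = 3 eV value determined against the UPS data.

```latex
E_{\mathrm{DFT}+U} = E_{\mathrm{DFT}}
  + \frac{U_{\mathrm{eff}}}{2} \sum_{\sigma}
    \left[ \operatorname{Tr}\rho^{\sigma}
         - \operatorname{Tr}\!\left(\rho^{\sigma}\rho^{\sigma}\right) \right]
```

Here ρ^σ is the occupation matrix of the localised Ti 3d states for spin σ; the correction penalises fractional occupations and so favours the localised, reduced Ti solutions (Ti2+/Ti3+) of the kind identified in the study.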
Abstract:
Deposits of coral-bearing, marine shell conglomerate exposed at elevations higher than 20 m above present-day mean sea level (MSL) in Bermuda and the Bahamas have previously been interpreted as relict intertidal deposits formed during marine isotope stage (MIS) 11, ca. 360-420 ka before present. On the strength of this evidence, a sea level highstand more than 20 m higher than present-day MSL was inferred for the MIS 11 interglacial, despite a lack of clear supporting evidence in the oxygen-isotope records of deep-sea sediment cores. We have critically re-examined the elevated marine deposits in Bermuda, and find their geological setting, sedimentary relations, and microfaunal assemblages to be inconsistent with intertidal deposition over an extended period. Rather, these deposits, which comprise a poorly sorted mixture of reef, lagoon and shoreline sediments, appear to have been carried tens of meters inside karst caves, presumably by large waves, at some time earlier than ca. 310-360 ka before present (MIS 9-11). We hypothesize that these deposits are the result of a large tsunami during the mid-Pleistocene, in which Bermuda was impacted by a wave set that carried sediments from the surrounding reef platform and nearshore waters over the eolianite atoll. Likely causes for such a megatsunami are the flank collapse of an Atlantic island volcano, such as the roughly synchronous Julan or Orotava submarine landslides in the Canary Islands, or a giant submarine landslide on the Atlantic continental margin.
Abstract:
Our recent paper [McMurtry, G.M., Tappin, D.R., Sedwick, P.N., Wilkinson, I., Fietzke, J. and Sellwood, B., 2007a. Elevated marine deposits in Bermuda record a late Quaternary megatsunami. Sedimentary Geol. 200, 155-165.] critically re-examined elevated marine deposits in Bermuda, and concluded that their geological setting, sedimentary relations, micropetrography and microfaunal assemblages were inconsistent with sustained intertidal deposition. Instead, we hypothesized that these deposits were the result of a large tsunami that impacted the Bermuda island platform during the mid-Pleistocene. Hearty and Olson [Hearty, P.J., and Olson, S.L., in press. Mega-highstand or megatsunami? Discussion of McMurtry et al. "Elevated marine deposits in Bermuda record a late Quaternary megatsunami": Sedimentary Geology, 200, 155-165, 2007 (Aug. 07). Sedimentary Geol. 200, 155-165.], in their response, attempt to refute our conclusions and claim the deposits to be the result of a +21 m eustatic sea level highstand during marine isotope stage (MIS) 11. In our reply we answer the issues raised by Hearty and Olson [Hearty, P.J., and Olson, S.L., in press. Mega-highstand or megatsunami? Discussion of McMurtry et al. "Elevated marine deposits in Bermuda record a late Quaternary megatsunami": Sedimentary Geology, 200, 155-165, 2007 (Aug. 07). Sedimentary Geol. 200, 155-165.] and conclude that the Bermuda deposits do not provide unequivocal evidence of a prolonged +21 m eustatic sea level highstand. Rather, the sediments are more likely the result of a past megatsunami in the North Atlantic basin.
Abstract:
Lacustrine sediments from southeastern Arabia reveal variations in lake level corresponding to changes in the strength and duration of Indian Ocean Monsoon (IOM) summer rainfall and winter cyclonic rainfall. The late glacial/Holocene transition of the region was characterised by the development of mega-linear dunes. These dunes became stabilised and vegetated during the early Holocene and interdunal lakes formed in response to the incursion of the IOM at approximately 8500 cal yr BP with the development of C3-dominated savanna grasslands. The IOM weakened ca. 6000 cal yr BP with the onset of regional aridity, aeolian sedimentation and dune reactivation and accretion. Despite this reduction in precipitation, the lake was maintained by winter-dominated rainfall. There was a shift to drier-adapted C4 grasslands across the dune field. Lake sediment geochemical analyses record precipitation minima at 8200, 5000 and 4200 cal yr BP that coincide with Bond events in the North Atlantic. A number of these events correspond with changes in cultural periods, suggesting that climate was a key mechanism affecting human occupation and exploitation of this region.
Abstract:
Coral growth rate can be affected by environmental parameters such as seawater temperature, depth, and light intensity. The natural reef environment is also disturbed by human influences such as anthropogenic pollutants, which in Barbados are released close to the reefs. Here we describe a relatively new method of assessing the history of pollution and explain how these effects have influenced the coral communities off the west coast of Barbados. We evaluate the relative impact of both anthropogenic pollutants and natural stresses. Sclerochronology documents framework and skeletal growth rate and records pollution history (expressed as reduced growth) for a suite of sampled Montastraea annularis coral cores. X-radiography shows annual growth band patterns of the corals extending back over several decades and indicates significantly lower growth rates at polluted sites. Results using laser-ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) on the whole sample (aragonite, organic matter, trapped particulate matter, etc.) have shown contrasting concentrations of the trace elements (Cu, Sn, Zn, and Pb) between corals at different locations and within a single coral. Deepwater corals 7 km apart record different levels of Pb and Sn, suggesting that a current transported the metal pollution in the water. In addition, the 1995 hurricanes are associated with anomalous values for Sn and Cu from most sites. These are believed to result from dispersion of nearshore polluted water. We compared the concentrations of trace elements in the coral growth of particular years to those in the relevant contemporaneous seawater. Mean values for the concentration factor in the coral, relative to the water, ranged from 10 for Cu and Ni to 2.4 and 0.7 for Cd and Zn, respectively. Although the uncertainties are large (60-80%), the coral record enabled us to demonstrate the possibility of calculating a history of seawater pollution for these elements from the 1940s to 1997. Our values were much higher than those obtained from analysis of carefully cleaned coral aragonite; they demonstrate the incorporation of more contamination, including that from particulate material as well as dissolved metals.
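The concentration factor referred to above is simply the ratio of an element's concentration in the sampled coral material to its concentration in contemporaneous seawater; written out (our notation, not the paper's):

```latex
\mathrm{CF}_{X} = \frac{[X]_{\mathrm{coral}}}{[X]_{\mathrm{seawater}}}
```

So a concentration factor of 10 for Cu, for example, means the sampled coral material carries roughly ten times the copper concentration of the surrounding water.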
Abstract:
General circulation models (GCMs) use the laws of physics and an understanding of past geography to simulate climatic responses. They are objective in character. However, they tend to require powerful computers to handle vast numbers of calculations. Nevertheless, it is now possible to compare results from different GCMs for a range of times and over a wide range of parameterisations for the past, present and future (e.g. in terms of predictions of surface air temperature, surface moisture, precipitation, etc.). GCMs are currently producing simulated climate predictions for the Mesozoic, which compare favourably with the distributions of climatically sensitive facies (e.g. coals, evaporites and palaeosols). They can be used effectively in the prediction of oceanic upwelling sites and the distribution of petroleum source rocks and phosphorites. Models also produce evaluations of other parameters that do not leave a geological record (e.g. cloud cover, snow cover) and equivocal phenomena such as storminess. Parameterisation of sub-grid scale processes is the main weakness in GCMs (e.g. land surfaces, convection, cloud behaviour) and model output for continental interiors is still too cold in winter by comparison with palaeontological data. The sedimentary and palaeontological record provides an important way that GCMs may themselves be evaluated and this is important because the same GCMs are being used currently to predict possible changes in future climate. The Mesozoic Earth was, by comparison with the present, an alien world, as we illustrate here by reference to late Triassic, late Jurassic and late Cretaceous simulations. Dense forests grew close to both poles but experienced months-long daylight in warm summers and months-long darkness in cold snowy winters. Ocean depths were warm (8 degrees C or more to the ocean floor) and reefs, with corals, grew 10 degrees of latitude further north and south than at the present time. The whole Earth was warmer than now by 6 degrees C or more, giving more atmospheric humidity and a greatly enhanced hydrological cycle. Much of the rainfall was predominantly convective in character, often focused over the oceans and leaving major desert expanses on the continental areas. Polar ice sheets are unlikely to have been present because of the high summer temperatures achieved. The model indicates extensive sea ice in the nearly enclosed Arctic seaway through a large portion of the year during the late Cretaceous, and the possibility of sea ice in adjacent parts of the Midwest Seaway over North America. The Triassic world was a predominantly warm world, the model output for evaporation and precipitation conforming well with the known distributions of evaporites, calcretes and other climatically sensitive facies for that time. The message from the geological record is clear. Through the Phanerozoic, Earth's climate has changed significantly, both on a variety of time scales and over a range of climatic states, usually baldly referred to as "greenhouse" and "icehouse", although these terms disguise more subtle states between these extremes. Any notion that the climate can remain constant for the convenience of one species of anthropoid is a delusion (although the recent rate of climatic change is exceptional).
Abstract:
Aims To investigate the effects of electronic prescribing (EP) on prescribing quality, as indicated by prescribing errors and pharmacists' clinical interventions, in a UK hospital. Methods Prescribing errors and pharmacists' interventions were recorded by the ward pharmacist during a 4 week period both pre- and post-EP, with a second check by the principal investigator. The percentage of new medication orders with a prescribing error and/or pharmacist's intervention was calculated for each study period. Results Following the introduction of EP, there was a significant reduction in both pharmacists' interventions and prescribing errors. Interventions reduced from 73 (3.0% of all medication orders) to 45 (1.9%) (95% confidence interval (CI) for the absolute reduction 0.2, 2.0%), and errors from 94 (3.8%) to 48 (2.0%) (95% CI 0.9, 2.7%). Ten EP-specific prescribing errors were identified. Only 52% of pharmacists' interventions related to a prescribing error pre-EP, and 60% post-EP; only 40% and 56% of prescribing errors resulted in an intervention pre- and post-EP, respectively. Conclusions EP improved the quality of prescribing by reducing both prescribing errors and pharmacists' clinical interventions. Prescribers and pharmacists need to be aware of new types of error with EP, so that they can best target their activities to reduce clinical risk. Pharmacists may need to change the way they work to complement, rather than duplicate, the benefits of EP.
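The confidence intervals quoted above are for the absolute difference between two proportions; the sketch below reproduces that kind of calculation with a simple normal-approximation (Wald) interval. The error counts (94 pre-EP, 48 post-EP) are from the abstract, while the denominators of 2450 and 2353 medication orders are assumed from the related closed-loop study above, so the result should land close to, but not necessarily exactly on, the reported interval.

```python
# Illustrative Wald 95% CI for the absolute reduction in the proportion of
# medication orders with a prescribing error. Error counts are from the
# abstract; the denominators are assumed from the related study above.
from math import sqrt

p_pre, n_pre = 94 / 2450, 2450     # pre-EP error proportion
p_post, n_post = 48 / 2353, 2353   # post-EP error proportion

reduction = p_pre - p_post
se = sqrt(p_pre * (1 - p_pre) / n_pre + p_post * (1 - p_post) / n_post)
ci_low, ci_high = reduction - 1.96 * se, reduction + 1.96 * se

print(f"absolute reduction = {reduction:.1%}, "
      f"95% CI ({ci_low:.1%}, {ci_high:.1%})")  # close to the reported 0.9-2.7%
```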
Abstract:
Objective To assess the impact of a closed-loop electronic prescribing and automated dispensing system on the time spent providing a ward pharmacy service and the activities carried out. Setting Surgical ward, London teaching hospital. Method All data were collected two months pre- and one year post-intervention. First, the ward pharmacist recorded the time taken each day for four weeks. Second, an observational study was conducted over 10 weekdays, using two-dimensional work sampling, to identify the ward pharmacist's activities. Finally, medication orders were examined to identify pharmacists' endorsements that should have been, and were actually, made. Key findings Mean time to provide a weekday ward pharmacy service increased from 1 h 8 min to 1 h 38 min per day (P = 0.001; unpaired t-test). There were significant increases in time spent prescription monitoring, recommending changes in therapy/monitoring, giving advice or information, and non-productive time. There were decreases for supply, looking for charts and checking patients' own drugs. There was an increase in the amount of time spent with medical and pharmacy staff, and with 'self'. Seventy-eight per cent of patients' medication records could be assessed for endorsements pre- and 100% post-intervention. Endorsements were required for 390 (50%) of 787 medication orders pre-intervention and 190 (21%) of 897 afterwards (P < 0.0001; chi-square test). Endorsements were made for 214 (55%) of endorsement opportunities pre-intervention and 57 (30%) afterwards (P < 0.0001; chi-square test). Conclusion The intervention increased the overall time required to provide a ward pharmacy service and changed the types of activity undertaken. Contact time with medical and pharmacy staff increased. There was no significant change in time spent with patients. Fewer pharmacy endorsements were required post-intervention, but a lower percentage were actually made. The findings have important implications for the design, introduction and use of similar systems.
Abstract:
We describe the main features of a program written to perform electronic marking of quantitative or simple text questions. One of the main benefits is that it can check answers for consistency with earlier errors, and so can cope with a range of numerical questions. We summarise our experience of using it in a statistics course taught to 200 bioscience students.
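Checking answers for consistency with earlier errors is a form of error-carried-forward marking; the sketch below illustrates the idea for a two-part numerical question. The function and variable names are hypothetical and the tolerance is arbitrary; this is not the program described in the paper.

```python
# Illustrative error-carried-forward marking for a two-part numerical question:
# part (b) is accepted if it matches either the true value or the value implied
# by the student's own (possibly wrong) part (a) answer.
# Names and the tolerance are hypothetical, not taken from the paper.

def close(x: float, y: float, rel_tol: float = 0.01) -> bool:
    """Accept answers within a 1% relative tolerance."""
    return abs(x - y) <= rel_tol * max(abs(x), abs(y), 1e-12)

def mark_part_b(student_a, student_b, true_a, part_b_from_a):
    """Mark part (b), giving follow-through credit for a wrong part (a)."""
    if close(student_b, part_b_from_a(true_a)):
        return "correct"
    if close(student_b, part_b_from_a(student_a)):
        return "correct (consistent with earlier error)"
    return "incorrect"

# Example: part (a) asks for a sample mean; part (b) asks for twice that mean.
print(mark_part_b(student_a=5.2, student_b=10.4, true_a=5.0,
                  part_b_from_a=lambda a: 2 * a))
# -> "correct (consistent with earlier error)"
```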
Neat but not gaudy: planning and creating an electronic induction tutorial at the University of Bath
Abstract:
This paper describes a case study of an electronic data management system developed in-house by the Facilities Management Directorate (FMD) of an educational institution in the UK. The FMD Maintenance and Business Services department is responsible for the maintenance of the built estate owned by the university. The department needs a clear definition of the type of work undertaken and of the administration that enables any maintenance work to be carried out, including the management of resources, budget, cash flow and the workflow of reactive, preventative and planned maintenance across the campus. In order to support this business process more efficiently, the FMD decided to move from a paper-based information system to an electronic system, WREN. Some of the main advantages of WREN are that it is tailor-made to fit the purpose of its users, it is cost-effective when modifications to the system are required, and its database can also be used as a knowledge management tool. There is a trade-off: because WREN is tailored to the specific requirements of the FMD, it may not be easy to implement in a different institution without extensive modification. Nevertheless, WREN not only allows the FMD to carry out the tasks of maintaining and looking after the university's built estate, but has also achieved its aim of minimising costs and maximising efficiency.