4 results for Reducing power
in CentAUR: Central Archive University of Reading - UK
Abstract:
BACKGROUND: Monitoring of fruit and vegetable (F&V) intake is fraught with difficulties. Available dietary assessment methods are associated with considerable error, and the use of biomarkers offers an attractive alternative. Few studies to date have examined the use of plasma biomarkers to monitor or predict the F&V intake of volunteers consuming a wide range of intakes from both habitual F&V and manipulated diets. OBJECTIVE: This study tested the hypothesis that an integrated biomarker calculated from a combination of plasma vitamin C, cholesterol-adjusted carotenoid concentration, and Ferric Reducing Antioxidant Power (FRAP) had more power to predict F&V intake than each individual biomarker. METHODS: Data were used from a randomized controlled dietary intervention study [FLAVURS (Flavonoids University of Reading Study); n = 154] in which the test groups observed sequential increases of 2.3, 3.2, and 4.2 portions of F&Vs every 6 wk across an 18-wk period. RESULTS: An integrated plasma biomarker was devised that included plasma vitamin C, total cholesterol-adjusted carotenoids, and FRAP values; it correlated better with F&V intake (r = 0.47, P < 0.001) than the individual biomarkers did (r = 0.33, P < 0.01; r = 0.37, P < 0.001; and r = 0.14, P = 0.099, respectively). Inclusion of urinary potassium concentration did not significantly improve the correlation. The integrated plasma biomarker predicted F&V intake more accurately than did plasma total cholesterol-adjusted carotenoid concentration, with the difference being significant at visit 2 (P < 0.001) and showing a tendency toward significance at visit 1 (P = 0.07). CONCLUSION: Either plasma total cholesterol-adjusted carotenoid concentration or the integrated biomarker could be used to distinguish between high and moderate F&V consumers. This trial was registered at www.controlled-trials.com as ISRCTN47748735.
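The abstract does not state how the three plasma measures were combined into the integrated biomarker. Below is a minimal sketch of one plausible scheme, averaging z-standardized values of the three measures and correlating the result with intake; all data, units, and the combination rule itself are illustrative assumptions, not the FLAVURS method.

```python
# Hypothetical sketch: combine three plasma measures into a single
# integrated biomarker by averaging their z-scores, then correlate each
# candidate biomarker with F&V intake. All numbers are synthetic; the
# combination rule is an assumption, not the method used in the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 154                                    # participants, as in FLAVURS
fv_intake = rng.uniform(1, 10, n)          # portions/day (synthetic)

# Synthetic plasma measures loosely driven by intake plus noise
vit_c = 30 + 4 * fv_intake + rng.normal(0, 12, n)               # umol/L
carotenoids = 0.3 + 0.05 * fv_intake + rng.normal(0, 0.15, n)   # adjusted
frap = 900 + 10 * fv_intake + rng.normal(0, 120, n)             # umol/L

def zscore(x):
    return (x - x.mean()) / x.std(ddof=1)

# Integrated biomarker: mean of the three standardized measures
integrated = (zscore(vit_c) + zscore(carotenoids) + zscore(frap)) / 3

for name, marker in [("vitamin C", vit_c), ("carotenoids", carotenoids),
                     ("FRAP", frap), ("integrated", integrated)]:
    r, p = stats.pearsonr(marker, fv_intake)
    print(f"{name:12s} r = {r:.2f}, P = {p:.3g}")
```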
Abstract:
Background: Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost-effectiveness and acceptability of a pharmacist-led, information-technology-based complex intervention, compared with simple feedback, in reducing the proportion of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice.
Methods: Research subject group: "at-risk" patients registered with computerised general practices in two geographical regions in England.
Design: Parallel-group pragmatic cluster randomised trial.
Interventions: Practices will be randomised to either (i) computer-generated feedback or (ii) a pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support.
Primary outcome measures: The proportion of patients in each practice at six and 12 months post-intervention:
- with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs;
- with a computer-recorded diagnosis of asthma being prescribed beta-blockers;
- aged 75 years and older receiving long-term prescriptions for angiotensin-converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months.
Secondary outcome measures: These relate to a number of other examples of potentially hazardous prescribing and medicines management.
Economic analysis: An economic evaluation of the cost per error avoided will be conducted from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback.
Qualitative analysis: A qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and to investigate possible reasons why the interventions prove effective or ineffective.
Sample size: 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm, compared with an 11% reduction in the simple feedback arm (an illustrative calculation of this form is sketched below).
Discussion: At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken.
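For context on the stated sample-size calculation, here is a rough sketch of a textbook two-proportion power calculation inflated by a design effect for clustering by practice. The baseline error rate, average cluster size, and intracluster correlation coefficient (ICC) below are placeholder assumptions rather than values from the protocol, so the output will not reproduce the trial's figure of 34 practices per arm exactly.

```python
# Illustrative cluster-RCT sample-size sketch (normal approximation).
# Baseline rate, cluster size, and ICC are assumed placeholders, not
# values taken from the trial protocol.
from scipy.stats import norm

alpha, power = 0.05, 0.80
p0 = 0.10                      # assumed baseline error rate
p1 = p0 * (1 - 0.50)           # 50% reduction (pharmacist-led arm)
p2 = p0 * (1 - 0.11)           # 11% reduction (simple feedback arm)

z_a = norm.ppf(1 - alpha / 2)  # 1.96 for two-tailed alpha = 0.05
z_b = norm.ppf(power)          # 0.84 for 80% power

# Patients per arm under individual randomisation (two-proportion formula)
n_ind = (z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2

# Inflate for clustering by practice via the design effect
m, icc = 100, 0.05             # assumed at-risk patients/practice and ICC
deff = 1 + (m - 1) * icc
practices_per_arm = n_ind * deff / m
print(f"{n_ind:.0f} patients per arm; ~{practices_per_arm:.0f} practices "
      f"of {m} patients each (design effect = {deff:.2f})")
```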
Abstract:
This paper presents an evaluation of the power consumption of gated-clock pipelined circuits with different register configurations in Virtex-based FPGA devices. The power impact of a gated-clock circuit aimed at reducing the flip-flop output rate at the bit level is studied. Power performance is also given for pipeline stages based on the implementation of a double edge-triggered flip-flop. Using a pipelined CORDIC core circuit as an example, the study found no evidence of power benefits from either bit-level clock gating or double edge-triggered flip-flops when synthesized with FPGA logic resources.
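To see why bit-level clock gating can fail to pay off on an FPGA, consider the first-order dynamic power model P = α·C·V²·f: gating lowers the switching activity α at the flip-flop clock pins, but the gating circuitry adds its own switched capacitance, and when that overhead is comparable to the saving the net benefit vanishes. The sketch below illustrates this trade-off with component values that are assumptions, not measurements from the paper.

```python
# First-order dynamic power model: P = alpha * C * V^2 * f.
# All component values are illustrative assumptions; this is a sketch of
# the trade-off, not the measurement methodology used in the paper.
def dynamic_power(alpha, cap_f, vdd, freq_hz):
    """Switching power of a node with activity factor alpha (watts)."""
    return alpha * cap_f * vdd ** 2 * freq_hz

VDD, F = 1.2, 100e6            # assumed supply (V) and clock rate (Hz)
C_FF = 10e-15                  # assumed clock-pin capacitance per FF (F)
N_FF = 1000                    # flip-flops in the pipeline registers

# Ungated: every FF clock pin toggles every cycle (alpha = 1)
p_ungated = N_FF * dynamic_power(1.0, C_FF, VDD, F)

# Gated: clock pins toggle only when data changes (assumed alpha = 0.3),
# but each bit-level gate adds extra switched capacitance
C_GATE = 8e-15                 # assumed gating-logic capacitance per bit
p_gated = N_FF * (dynamic_power(0.3, C_FF, VDD, F)
                  + dynamic_power(1.0, C_GATE, VDD, F))

print(f"ungated: {p_ungated * 1e6:.0f} uW, gated: {p_gated * 1e6:.0f} uW")
```

With these assumed numbers the gated design actually consumes more power, mirroring the paper's negative result: on an FPGA the gating logic is built from general-purpose resources, so its overhead can easily outweigh the activity reduction.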
Abstract:
Flood simulation models and hazard maps are only as good as the underlying data against which they are calibrated and tested. However, extreme flood events are by definition rare, so observational data on flood inundation extent are limited in both quality and quantity. The relative importance of these observational uncertainties has increased now that computing power and accurate lidar scans make it possible to run high-resolution 2D models to simulate floods in urban areas. However, the value of these simulations is limited by the uncertainty in the true extent of the flood. This paper addresses that challenge by analyzing a point dataset of maximum water extent from a flood event on the River Eden at Carlisle, United Kingdom, in January 2005. The observation dataset is based on a collection of wrack and water marks from two post-event surveys. A smoothing algorithm for identifying, quantifying, and reducing localized inconsistencies in the dataset is proposed and evaluated, with positive results. The proposed smoothing algorithm can be applied to improve the assessment of flood inundation models and the determination of risk zones on the floodplain.
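The abstract does not describe the smoothing algorithm itself. Below is a minimal sketch of one plausible local-consistency approach, replacing any surveyed mark whose level deviates strongly from the median of nearby marks; the neighbourhood radius, deviation threshold, and minimum neighbour count are assumed values for illustration, not the authors' parameters.

```python
# Hypothetical local-consistency smoother for surveyed maximum water
# levels (wrack and water marks). Not the algorithm from the paper;
# a sketch of the general idea: compare each mark with nearby marks
# and pull outliers toward the local median.
import numpy as np

def smooth_marks(xy, level, radius=100.0, max_dev=0.25):
    """xy: (n, 2) mark coordinates (m); level: (n,) water levels (m).
    Replaces marks deviating more than max_dev from the median level
    of marks within `radius`; returns new levels and a flag mask."""
    level = np.asarray(level, dtype=float)
    out = level.copy()
    adjusted = np.zeros(level.size, dtype=bool)
    for i in range(level.size):
        d = np.linalg.norm(xy - xy[i], axis=1)
        nbrs = (d > 0) & (d < radius)
        if nbrs.sum() < 3:
            continue                     # too few neighbours to judge
        med = np.median(level[nbrs])     # medians use original levels
        if abs(level[i] - med) > max_dev:
            out[i] = med                 # damp the inconsistent mark
            adjusted[i] = True
    return out, adjusted

# Tiny usage example with synthetic marks (one deliberate outlier)
xy = np.array([[0, 0], [30, 10], [60, 0], [45, 40], [500, 500]])
lvl = np.array([12.1, 12.2, 13.4, 12.15, 12.0])  # 13.4 m is inconsistent
smoothed, flagged = smooth_marks(xy, lvl)
print(smoothed, flagged)                 # only the 13.4 m mark is flagged
```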