8 results for error management
in CentAUR: Central Archive University of Reading - UK
Abstract:
Rationale: In UK hospitals, the preparation of all total parenteral nutrition (TPN) products must be carried out in the pharmacy, as TPNs are categorised as high-risk injectables (NPSA/2007/20). The National Aseptic Error Reporting Scheme has been collecting data on pharmacy compounding errors in the UK since August 2003. This study reports on the types of error associated with the preparation of TPNs, including the stage at which these were identified and potential and actual patient outcomes. Methods: Reports of compounding errors for the period 1/2004 - 3/2007 were analysed in an Excel spreadsheet. Results: Of a total of 3691 compounding error reports, 674 (18%) related to TPN products; 548 adult vs. 126 paediatric. A significantly higher proportion of adult TPNs (28% vs. 13% paediatric) were associated with labelling errors, and a significantly higher proportion of paediatric TPNs (25% vs. 15% adult) were associated with incorrect transcriptions (Chi-Square Test; p<0.005). Labelling errors were identified roughly equally by pharmacists (42%) and technicians (48%), with technicians detecting mainly at first check and pharmacists at final check. Transcription errors were identified mainly by technicians (65% vs. 27% pharmacist) at first check. Incorrect drug selection (13%) and calculation errors (9%) were associated with adult and paediatric TPN preparations in the same ratio. One paediatric TPN error detected at first check was considered potentially catastrophic; 31 (5%) errors were considered of major and 38 (6%) of moderate potential consequence. Five errors (2 moderate, 1 minor) were identified during or after administration. Conclusions: While recent UK patient safety initiatives are aimed at improving the safety of injectable medicines in clinical areas, the current study highlights safety problems that exist within pharmacy production units. These findings could inform the creation of an error management tool for TPN compounding processes within hospital pharmacies.
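The chi-square comparison quoted above (28% of adult vs. 13% of paediatric TPNs with labelling errors) can be sketched as follows. This is an illustrative reconstruction: the 2x2 counts are back-calculated from the reported percentages and totals, so they are approximations, not the study's raw data.

```python
# Hedged sketch: reconstructing the 2x2 contingency table for labelling
# errors from the percentages in the abstract (28% of 548 adult vs 13%
# of 126 paediatric TPNs). The exact counts are assumptions.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table (no continuity correction)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

adult_label = round(0.28 * 548)   # ~153 adult TPNs with labelling errors
adult_other = 548 - adult_label
paed_label = round(0.13 * 126)    # ~16 paediatric TPNs with labelling errors
paed_other = 126 - paed_label

chi2 = chi_square_2x2(adult_label, adult_other, paed_label, paed_other)

# With df = 1, the critical value for p = 0.005 is 7.879, so a statistic
# above that is consistent with the reported p < 0.005.
print(f"chi2 = {chi2:.2f}")
```

With these approximate counts the statistic comfortably exceeds the 0.005 critical value, matching the significance level the abstract reports.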
Abstract:
Models of the dynamics of nitrogen in soil (soil-N) can be used to aid the fertilizer management of a crop. The predictions of soil-N models can be validated by comparison with observed data. Validation generally involves calculating non-spatial statistics of the observations and predictions, such as their means, their mean squared difference, and their correlation. However, when the model predictions are spatially distributed across a landscape the model requires validation with spatial statistics. There are three reasons for this: (i) the model may be more or less successful at reproducing the variance of the observations at different spatial scales; (ii) the correlation of the predictions with the observations may be different at different spatial scales; (iii) the spatial pattern of model error may be informative. In this study we used a model, parameterized with spatially variable input information about the soil, to predict the mineral-N content of soil in an arable field, and compared the results with observed data. We validated the performance of the N model spatially with a linear mixed model of the observations and model predictions, estimated by residual maximum likelihood. This novel approach allowed us to describe the joint variation of the observations and predictions as: (i) independent random variation that occurred at a fine spatial scale; (ii) correlated random variation that occurred at a coarse spatial scale; (iii) systematic variation associated with a spatial trend. The linear mixed model revealed that, in general, the performance of the N model changed depending on the spatial scale of interest. At the scales associated with random variation, the N model underestimated the variance of the observations, and the predictions were correlated poorly with the observations. At the scale of the trend, the predictions and observations shared a common surface.
The spatial pattern of the error of the N model suggested that the observations were affected by the local soil condition, but this was not accounted for by the N model. In summary, the N model would be well suited to field-scale management of soil nitrogen, but poorly suited to management at finer spatial scales. This information was not apparent with a non-spatial validation.
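The core idea that validation statistics can change with spatial scale can be illustrated with a toy example. This is not the paper's REML linear mixed model; it is a much simpler sketch with invented synthetic data, showing that two series sharing a coarse trend but carrying independent fine-scale noise correlate better once block-averaged.

```python
# Illustrative sketch (not the paper's REML approach): why validation can
# differ with spatial scale. Synthetic observations and predictions share
# a coarse trend but have independent fine-scale noise, so correlation
# rises when values are block-averaged. All numbers here are invented.
import random

random.seed(42)
n = 400
trend = [0.05 * i for i in range(n)]              # shared spatial trend
obs = [t + random.gauss(0, 3) for t in trend]     # fine-scale noise
pred = [t + random.gauss(0, 3) for t in trend]    # independent fine noise

def correlation(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def block_means(x, size):
    return [sum(x[i:i + size]) / size for i in range(0, len(x), size)]

r_fine = correlation(obs, pred)                               # point scale
r_coarse = correlation(block_means(obs, 40), block_means(pred, 40))
print(f"point-scale r = {r_fine:.2f}, block-scale r = {r_coarse:.2f}")
```

The block-scale correlation exceeds the point-scale one because averaging suppresses the independent fine-scale component, which is the qualitative behaviour the spatial validation in the abstract was designed to detect.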
Abstract:
Background: Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost-effectiveness and acceptability of a pharmacist-led information-technology-based complex intervention compared with simple feedback in reducing proportions of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. Methods: Research subject group: "At-risk" patients registered with computerised general practices in two geographical regions in England. Design: Parallel group pragmatic cluster randomised trial. Interventions: Practices will be randomised to either: (i) computer-generated feedback; or (ii) a pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. Primary outcome measures: The proportion of patients in each practice at six and 12 months post-intervention:
- with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs;
- with a computer-recorded diagnosis of asthma being prescribed beta-blockers;
- aged 75 years and older receiving long-term prescriptions for angiotensin converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months.
Secondary outcome measures: These relate to a number of other examples of potentially hazardous prescribing and medicines management. Economic analysis: An economic evaluation will be conducted of the cost per error avoided, from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. Qualitative analysis: A qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and to investigate possible reasons why the interventions prove effective or, conversely, ineffective.
Sample size: 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm compared with an 11% reduction in the simple feedback arm. Discussion: At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken.
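The shape of a sample-size calculation like the one summarised above can be sketched with the standard two-proportion formula plus a clustering design effect. Only the alpha (0.05, two-tailed), the power (80%) and the 50% vs. 11% reductions come from the abstract; the baseline error rate, cluster size and intracluster correlation (ICC) below are invented purely for illustration.

```python
# Hedged sketch: two-proportion sample-size formula with a clustering
# design effect. Baseline rate (10%), cluster size (100 patients per
# practice) and ICC (0.05) are hypothetical; alpha, power and the two
# reduction figures are taken from the abstract.
from math import ceil

z_alpha, z_beta = 1.96, 0.8416          # two-tailed 5% alpha, 80% power
p_base = 0.10                           # assumed baseline error rate
p_int = p_base * (1 - 0.50)             # 50% reduction (intervention arm)
p_ctl = p_base * (1 - 0.11)             # 11% reduction (feedback arm)

n_individual = ((z_alpha + z_beta) ** 2
                * (p_int * (1 - p_int) + p_ctl * (1 - p_ctl))
                / (p_int - p_ctl) ** 2)

cluster_size, icc = 100, 0.05           # both hypothetical
design_effect = 1 + (cluster_size - 1) * icc
practices_per_arm = ceil(n_individual * design_effect / cluster_size)
print(f"~{ceil(n_individual)} patients/arm, ~{practices_per_arm} practices/arm")
```

With these assumed inputs the result lands in the same order of magnitude as the trial's 34 practices per arm, but the point of the sketch is the structure of the calculation, not the specific numbers.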
Abstract:
The question as to whether active management adds any value above that of the fund's investment policy is one of continual interest to investors. In order to investigate this issue in the UK real estate market we examine a number of related questions. First, how much return variability is explained by investment policy? Second, how similar are the policies across funds? Third, how much of a fund's return is determined by investment policy? Finally, how was this added value achieved? Using data for 19 real estate funds, we find that investment policy explains less than half of the variability in returns over time, none of the variation across funds, and that more than 100% of the level of return is attributable to investment policy. The results also show that UK real estate funds focus exclusively on trying to pick winners to add value, and that in pursuit of active return fund managers incur high tracking-error risk; consequently, successful active management is very difficult to achieve. In addition, the results are dependent on the benchmark used to represent the investment policy of the fund. Nonetheless, active management can indeed add value to a real estate fund's performance. This is the good news. The bad news is that adding value is much more difficult to achieve than is generally accepted.
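The first question above, how much return variability is explained by investment policy, is conventionally answered with the R-squared of a regression of fund returns on policy (benchmark) returns. The sketch below uses synthetic monthly returns, with the fund built as benchmark plus an active component; every number is invented, and real fund and benchmark series would replace both.

```python
# Illustrative sketch of the attribution question: what share of a fund's
# return variability over time is explained by its policy (benchmark)
# return? Fund returns here are synthetic: benchmark plus an active
# component of similar volatility.
import random

random.seed(7)
benchmark = [random.gauss(0.006, 0.02) for _ in range(120)]  # monthly policy returns
active = [random.gauss(0.001, 0.02) for _ in range(120)]     # active management
fund = [b + a for b, a in zip(benchmark, active)]

def r_squared(y, x):
    """R-squared of a simple regression of y on x (equals squared correlation)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy ** 2 / (sxx * syy)

share = r_squared(fund, benchmark)
tracking_error = (sum(a * a for a in active) / len(active)) ** 0.5
print(f"policy explains {share:.0%} of time-series variance; "
      f"tracking error {tracking_error:.1%}/month")
```

Because the synthetic active component is as volatile as the benchmark, policy explains only about half the time-series variance here, which mirrors the abstract's "less than half" finding and shows how heavy active bets inflate tracking error.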
Abstract:
Recent research has suggested that forecast evaluation on the basis of standard statistical loss functions could prefer models which are sub-optimal when used in a practical setting. This paper explores a number of statistical models for predicting the daily volatility of several key UK financial time series. The out-of-sample forecasting performance of various linear and GARCH-type models of volatility is compared with forecasts derived from a multivariate approach. The forecasts are evaluated using traditional metrics, such as mean squared error, and also by how adequately they perform in a modern risk management setting. We find that the relative accuracies of the various methods are highly sensitive to the measure used to evaluate them. Such results have implications for any econometric time series forecasts which are subsequently employed in financial decision-making.
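The central point, that a statistical loss and a risk-management criterion can rank forecasts differently, can be demonstrated with a toy comparison. Everything below is synthetic: a flat "smooth" forecast and a deliberately upward-biased one are invented rivals, not the paper's models.

```python
# Hedged sketch of the paper's point: ranking volatility forecasts by a
# statistical loss (MSE) need not match ranking by a risk-management
# criterion such as 95% VaR coverage. Data and forecasts are synthetic.
import random

random.seed(1)
n = 1000
true_vol = [0.01 + 0.005 * abs(random.gauss(0, 1)) for _ in range(n)]
returns = [random.gauss(0, v) for v in true_vol]

# Two rival "forecasts": a flat one near the mean vol, and one that
# tracks volatility but deliberately overstates it by 25%.
smooth = [0.014] * n
biased = [v * 1.25 for v in true_vol]

def mse(forecast):
    return sum((f - v) ** 2 for f, v in zip(forecast, true_vol)) / n

def var_violation_rate(forecast, z=1.645):
    """Share of days the return falls below the 95% VaR implied by the forecast."""
    return sum(r < -z * f for r, f in zip(returns, forecast)) / n

for name, f in [("smooth", smooth), ("biased", biased)]:
    print(name, round(mse(f), 8), round(var_violation_rate(f), 3))
```

The flat forecast wins on MSE, yet the biased forecast produces the more conservative VaR coverage; which model is "better" depends on the evaluation measure, exactly the sensitivity the abstract describes.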
Abstract:
Background: Medication errors in general practice are an important source of potentially preventable morbidity and mortality. Building on previous descriptive, qualitative and pilot work, we sought to investigate the effectiveness, cost-effectiveness and likely generalisability of a complex pharmacist-led IT-based intervention aiming to improve prescribing safety in general practice. Objectives: We sought to:
• Test the hypothesis that a pharmacist-led IT-based complex intervention using educational outreach and practical support is more effective than simple feedback in reducing the proportion of patients at risk from errors in prescribing and medicines management in general practice.
• Conduct an economic evaluation of the cost per error avoided, from the perspective of the National Health Service (NHS).
• Analyse data recorded by pharmacists, summarising the proportions of patients judged to be at clinical risk, the actions recommended by pharmacists, and the actions completed in the practices.
• Explore the views and experiences of healthcare professionals and NHS managers concerning the intervention; investigate potential explanations for the observed effects, and inform decisions on the future roll-out of the pharmacist-led intervention.
• Examine secular trends in the outcome measures of interest, allowing for informal comparison between trial practices and practices that did not participate in the trial but contributed to the QRESEARCH database.
Methods: Two-arm cluster randomised controlled trial of 72 English general practices with embedded economic analysis and longitudinal descriptive and qualitative analysis. Informal comparison of the trial findings with a national descriptive study investigating secular trends, undertaken using data from practices contributing to the QRESEARCH database. The main outcomes of interest were prescribing errors and medication monitoring errors at six and 12 months following the intervention.
Results: Participants in the pharmacist intervention arm practices were significantly less likely to have been prescribed a non-selective NSAID without a proton pump inhibitor (PPI) if they had a history of peptic ulcer (OR 0.58, 95% CI 0.38, 0.89), to have been prescribed a beta-blocker if they had asthma (OR 0.73, 95% CI 0.58, 0.91) or (in those aged 75 years and older) to have been prescribed an ACE inhibitor or diuretic without a measurement of urea and electrolytes in the last 15 months (OR 0.51, 95% CI 0.34, 0.78). The economic analysis suggests that the PINCER pharmacist intervention has a 95% probability of being cost effective if the decision-maker's ceiling willingness to pay reaches £75 (6 months) or £85 (12 months) per error avoided. The intervention addressed an issue that was important to professionals and their teams and was delivered in a way that was acceptable to practices, with minimum disruption of normal work processes. Comparison of the trial findings with changes seen in QRESEARCH practices indicated that any reductions achieved in the simple feedback arm were likely, in the main, to have been related to secular trends rather than the intervention. Conclusions: Compared with simple feedback, the pharmacist-led intervention resulted in reductions in proportions of patients at risk of prescribing and monitoring errors for the primary outcome measures and the composite secondary outcome measures at six months and (with the exception of the NSAID/peptic ulcer outcome measure) 12 months post-intervention. The intervention is acceptable to pharmacists and practices, and is likely to be seen as cost-effective by decision makers.
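Odds ratios like those reported above are typically computed from a 2x2 table with a Woolf-type confidence interval on the log scale. The sketch below uses invented counts chosen only to produce an OR in the same neighbourhood as the trial's; it is a demonstration of the method, not the trial's data.

```python
# Hedged sketch: odds ratio with a Woolf-type 95% CI from a 2x2 table.
# The counts below are hypothetical, not taken from the trial.
from math import exp, log, sqrt

# at-risk / not-at-risk patients in each arm (all four counts invented)
a, b = 30, 970    # intervention: at risk, not at risk
c, d = 50, 950    # control: at risk, not at risk

or_ = (a * d) / (b * c)
se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)          # SE of log(OR)
lo, hi = exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se)
print(f"OR {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

An interval that lies wholly below 1, as here and as in the trial's primary outcomes, indicates a statistically significant reduction in the odds of being at risk.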
Abstract:
A dynamic, mechanistic model of enteric fermentation was used to investigate the effect of type and quality of grass forage, dry matter intake (DMI) and proportion of concentrates in dietary dry matter (DM) on variation in methane (CH(4)) emission from enteric fermentation in dairy cows. The model represents substrate degradation and microbial fermentation processes in the rumen and hindgut and, in particular, the effects of type of substrate fermented and of pH on the production of individual volatile fatty acids and CH(4) as end-products of fermentation. Effects of type and quality of fresh and ensiled grass were evaluated by distinguishing two N fertilization rates of grassland and two stages of grass maturity. Simulation results indicated a strong impact of the amount and type of grass consumed on CH(4) emission, with a maximum difference (across all forage types and all levels of DMI) of 49 and 77% in g CH(4)/kg fat and protein corrected milk (FCM) for diets with a proportion of concentrates in dietary DM of 0.1 and 0.4, respectively (values ranging from 10.2 to 19.5 g CH(4)/kg FCM). The lowest emission was established for early-cut, high-fertilized grass silage (GS) and high-fertilized grass herbage (GH). The highest emission was found for late-cut, low-fertilized GS. The N fertilization rate had the largest impact, followed by stage of grass maturity at harvesting and by the distinction between GH and GS. Emission expressed in g CH(4)/kg FCM declined on average by 14% with an increase of DMI from 14 to 18 kg/day for grass forage diets with a proportion of concentrates of 0.1, and on average by 29% with an increase of DMI from 14 to 23 kg/day for diets with a proportion of concentrates of 0.4. Simulation results indicated that a high proportion of concentrates in dietary DM may lead to a further reduction of CH(4) emission per kg FCM, mainly as a result of a higher DMI and milk yield, in comparison to low-concentrate diets.
Simulation results were evaluated against independent data obtained at three different laboratories in indirect calorimetry trials with cows consuming mainly GH. The model predicted the average of observed values reasonably well, but systematic deviations remained between individual laboratories, and the root mean squared prediction error was a proportion of 0.12 of the observed mean. Both observed and predicted emission expressed in g CH(4)/kg DM intake decreased upon an increase in dietary N:organic matter (OM) ratio. The model reproduced reasonably well the variation in measured CH(4) emission in cattle sheds on Dutch dairy farms and indicated that on average a fraction of 0.28 of the total emissions must have originated from manure under these circumstances.
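The validation statistic quoted above, root mean squared prediction error (RMSPE) as a proportion of the observed mean, is straightforward to compute. The observed and predicted values below are invented placeholders in plausible units; real calorimetry data would replace them.

```python
# Sketch of the validation statistic quoted above: RMSPE expressed as a
# proportion of the observed mean. The CH4 values below (g CH4/kg DM
# intake) are invented for illustration.
observed = [18.1, 20.4, 16.7, 19.2, 21.0, 17.5]
predicted = [17.0, 22.1, 15.2, 20.5, 19.0, 18.9]

n = len(observed)
rmspe = (sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n) ** 0.5
relative_rmspe = rmspe / (sum(observed) / n)
print(f"RMSPE = {rmspe:.2f}, {relative_rmspe:.1%} of observed mean")
```

In the study's terms, a relative RMSPE of 0.12 means the typical prediction error was about 12% of the mean observed emission.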
Abstract:
Human brain imaging techniques, such as Magnetic Resonance Imaging (MRI) or Diffusion Tensor Imaging (DTI), have been established as scientific and diagnostic tools and their adoption is growing in popularity. Statistical methods, machine learning and data mining algorithms have successfully been adopted to extract predictive and descriptive models from neuroimage data. However, the knowledge discovery process typically also requires pre-processing, post-processing and visualisation techniques in complex data workflows. Currently, a main obstacle to the integrated pre-processing and mining of MRI data is the lack of comprehensive platforms that avoid the manual invocation of pre-processing and mining tools, which leads to an error-prone and inefficient process. In this work we present K-Surfer, a novel plug-in for the Konstanz Information Miner (KNIME) workbench that automates the pre-processing of brain images and leverages the mining capabilities of KNIME in an integrated way. K-Surfer supports the importing, filtering, merging and pre-processing of neuroimage data from FreeSurfer, a tool for human brain MRI feature extraction and interpretation. K-Surfer automates the steps for importing FreeSurfer data, reducing time costs, eliminating human errors and enabling the design of complex analytics workflows for neuroimage data by leveraging the rich functionality available in the KNIME workbench.