Abstract:
A 75-year-old man diagnosed with lower esophageal adenocarcinoma suffered from epirubicin extravasation during the second cycle of neoadjuvant chemotherapy with epirubicin and oxaliplatin. A full recovery was achieved after treatment with dexrazoxane (Cardioxane®). This is the first time in our hospital that extravasation of an anthracycline has been treated with dexrazoxane. We used Cardioxane®, approved for the prevention of anthracycline-induced cardiotoxicity, while Savene® is indicated for the treatment of anthracycline extravasation. The treatment was effective, and the selection of Cardioxane® (seven-fold cheaper than Savene®) yielded a cost saving. Consequently, Cardioxane® has been included in our guidelines for anthracycline extravasation.
Abstract:
IMPORTANCE: The 2013 American College of Cardiology/American Heart Association (ACC/AHA) guidelines introduced a prediction model and lowered the threshold for treatment with statins to a 7.5% 10-year hard atherosclerotic cardiovascular disease (ASCVD) risk. Implications of the new guideline's threshold and model have not been addressed in non-US populations or compared with previous guidelines. OBJECTIVE: To determine population-wide implications of the ACC/AHA, the Adult Treatment Panel III (ATP-III), and the European Society of Cardiology (ESC) guidelines using a cohort of Dutch individuals aged 55 years or older. DESIGN, SETTING, AND PARTICIPANTS: We included 4854 Rotterdam Study participants recruited in 1997-2001. We calculated 10-year risks for "hard" ASCVD events (including fatal and nonfatal coronary heart disease [CHD] and stroke) (ACC/AHA), hard CHD events (fatal and nonfatal myocardial infarction, CHD mortality) (ATP-III), and atherosclerotic CVD mortality (ESC). MAIN OUTCOMES AND MEASURES: Events were assessed until January 1, 2012. Per guideline, we calculated proportions of individuals for whom statins would be recommended and determined calibration and discrimination of risk models. RESULTS: The mean age was 65.5 (SD, 5.2) years. Statins would be recommended for 96.4% (95% CI, 95.4%-97.1%; n = 1825) of men and 65.8% (95% CI, 63.8%-67.7%; n = 1523) of women by the ACC/AHA, 52.0% (95% CI, 49.8%-54.3%; n = 985) of men and 35.5% (95% CI, 33.5%-37.5%; n = 821) of women by the ATP-III, and 66.1% (95% CI, 64.0%-68.3%; n = 1253) of men and 39.1% (95% CI, 37.1%-41.2%; n = 906) of women by ESC guidelines. With the ACC/AHA model, average predicted risk vs observed cumulative incidence of hard ASCVD events was 21.5% (95% CI, 20.9%-22.1%) vs 12.7% (95% CI, 11.1%-14.5%) for men (192 events) and 11.6% (95% CI, 11.2%-12.0%) vs 7.9% (95% CI, 6.7%-9.2%) for women (151 events). 
Similar overestimation occurred with the ATP-III model (98 events in men and 62 events in women) and ESC model (50 events in men and 37 events in women). The C statistic was 0.67 (95% CI, 0.63-0.71) in men and 0.68 (95% CI, 0.64-0.73) in women for hard ASCVD (ACC/AHA), 0.67 (95% CI, 0.62-0.72) in men and 0.69 (95% CI, 0.63-0.75) in women for hard CHD (ATP-III), and 0.76 (95% CI, 0.70-0.82) in men and 0.77 (95% CI, 0.71-0.83) in women for CVD mortality (ESC). CONCLUSIONS AND RELEVANCE: In this European population aged 55 years or older, proportions of individuals eligible for statins differed substantially among the guidelines. The ACC/AHA guideline would recommend statins for nearly all men and two-thirds of women, proportions exceeding those with the ATP-III or ESC guidelines. All 3 risk models provided poor calibration and moderate to good discrimination. Improving risk predictions and setting appropriate population-wide thresholds are necessary to facilitate better clinical decision making.
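The two properties assessed above, calibration (mean predicted risk vs observed incidence) and discrimination (the C statistic), can be illustrated with a minimal sketch. The function names and all data below are invented for illustration; they are not the Rotterdam Study data or the guideline risk equations.

```python
# Hypothetical sketch of how a risk model's calibration and discrimination
# are checked. Calibration compares average predicted risk with observed
# incidence; discrimination is the concordance (C) statistic.

def mean_predicted_risk(risks):
    """Average predicted 10-year risk across the cohort."""
    return sum(risks) / len(risks)

def observed_incidence(events):
    """Crude cumulative incidence: proportion of subjects with an event."""
    return sum(events) / len(events)

def c_statistic(risks, events):
    """Among all (event, non-event) pairs, the fraction in which the event
    case received the higher predicted risk (ties count one half)."""
    cases = [r for r, e in zip(risks, events) if e == 1]
    controls = [r for r, e in zip(risks, events) if e == 0]
    concordant = 0.0
    for rc in cases:
        for rn in controls:
            if rc > rn:
                concordant += 1.0
            elif rc == rn:
                concordant += 0.5
    return concordant / (len(cases) * len(controls))

# Toy cohort: overestimation shows up as mean predicted risk
# well above the observed incidence (~0.27 vs ~0.17 here).
risks = [0.5, 0.4, 0.3, 0.2, 0.1, 0.1]
events = [1, 0, 0, 0, 0, 0]
```

A model can discriminate well (high C statistic) yet still overestimate absolute risk, which is exactly the pattern the abstract reports for all three guidelines.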
Abstract:
OBJECTIVE AND METHOD: Isolated office hypertension, defined as hypertensive blood pressure values in a medical setting but normal self-measured or ambulatory-recorded blood pressures, is frequently encountered in clinical practice. Yet, whether this condition represents a transient state in the development of a sustained ambulatory hypertension is still unknown as no long-term analysis of the evolution of ambulatory blood pressure has been carried out in patients with isolated office hypertension. To evaluate whether such patients should be considered as truly normotensive or hypertensive, we have studied the long-term changes in office and ambulatory blood pressures in 81 patients in whom isolated office hypertension was observed between 1982 and 1988. RESULTS: After a 5-6 year follow-up, 60 of the 81 patients had a mean 12 h daytime ambulatory blood pressure greater than 140/90 mmHg, suggesting an evolution towards ambulatory hypertension. The development of hypertension could not be predicted on the basis of the follow-up office blood pressures as these tended to decrease during the follow-up period. CONCLUSIONS: The results of this study suggest that patients with isolated office hypertension should not be considered as truly normotensive individuals. Hence, these patients require a careful medical follow-up. Office blood pressure readings alone, however, do not appear to provide a good indicator of the long-term outcome of isolated office hypertension.
Abstract:
Supportive breeding is an important tool in conservation management, but its long-term genetic consequences are not well understood. Among the factors that could affect the genetics of the offspring is sperm competition as a consequence of mixed-milt fertilizations, which is still a common practice in many hatcheries. Here, we measured and combined the relevant factors to predict the genetic consequences of various kinds of hatchery-induced sperm competition. We drew a random sample of male Coregonus zugensis (an Alpine whitefish) from a hatchery program and quantified their in vitro sperm potency by integrating sperm velocity during the first minute after activation, and their in vitro milt potency by multiplying sperm potency by milt volume and sperm cell density. We found that not controlling for sperm density and/or milt volume would, at a constant population size, decrease the variance effective number of male breeders (N_e) by around 40-50%. This loss would decrease with increasing population growth rates. Partial multifactorial breeding and the separate rearing of, in total, 799 batches of eggs revealed that neither sperm nor milt potency was significantly linked to egg survival. Sperm and milt potency were also not significantly correlated with other potential quality measures such as breeding tubercles or condition factor. However, sperm potency was correlated with male age, and milt potency with male growth rate. Our findings suggest that hatchery-induced sperm competition not only increases the loss of genetic variation but may also induce artificial selection, depending on the fertilization protocol. By not equalizing milt volume in multi-male fertilizations, hatchery managers lose relatively more genetic variation and give fast-growing males a reproductive advantage, while equalizing milt volume reduces the loss of genetic variation and favors younger males, who may have fast sperm to compensate for their subdominance at the spawning place.
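The mechanism described above, skewed reproductive success shrinking the effective number of breeders, can be sketched numerically. The milt-potency recipe follows the abstract (sperm potency × milt volume × sperm density); the effective-size expression is the standard Crow-Kimura formula, N_e = (N·k̄ − 1)/(k̄ − 1 + V_k/k̄), used here for illustration only. Function names and all numbers are our own, not the study's.

```python
# Hypothetical sketch: unequal milt contributions skew offspring numbers
# among males, which reduces the variance effective number of breeders.

def milt_potency(sperm_potency, milt_volume_ml, sperm_density):
    """Relative fertilization potential of one male's milt contribution,
    following the recipe quoted in the abstract."""
    return sperm_potency * milt_volume_ml * sperm_density

def variance_effective_number(offspring_counts):
    """Crow-Kimura variance effective number for one parental sex:
    N_e = (N*kbar - 1) / (kbar - 1 + V_k/kbar)."""
    n = len(offspring_counts)
    kbar = sum(offspring_counts) / n
    vk = sum((k - kbar) ** 2 for k in offspring_counts) / n
    return (n * kbar - 1) / (kbar - 1 + vk / kbar)

# Equalized milt: every male sires a similar number of offspring.
equalized = [10, 10, 10, 10]
# Mixed-milt fertilization: high-potency males dominate.
skewed = [25, 10, 4, 1]

# The skewed scheme yields a smaller effective number of male breeders
# at the same census size and total offspring number.
```

Running `variance_effective_number` on the two schemes shows the equalized scheme retaining the larger effective size, which is the qualitative effect the abstract quantifies at around 40-50%.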
Abstract:
Obesity-induced chronic inflammation leads to activation of the immune system that causes alterations of iron homeostasis, including hypoferraemia, iron-restricted erythropoiesis, and finally mild-to-moderate anaemia. Thus, preoperative anaemia and iron deficiency are common among obese patients scheduled for bariatric surgery (BS). Assessment of patients should include a complete haematological and biochemical laboratory work-up, including measurement of iron stores, vitamin B12, and folate. In addition, gastrointestinal evaluation is recommended for most patients with iron-deficiency anaemia. On the other hand, BS is a long-lasting inflammatory stimulus in itself and entails a reduction of gastric capacity and/or exclusion of part of the gastrointestinal tract, which impairs the absorption of nutrients, including dietary iron. Chronic gastrointestinal blood loss and iron-losing enteropathy may also contribute to iron deficiency after BS. Perioperative anaemia has been linked to increased postoperative morbidity and mortality and decreased quality of life after major surgery, whereas treatment of perioperative anaemia, and even of haematinic deficiency without anaemia, has been shown to improve patient outcomes and quality of life. However, long-term follow-up data on the prevalence, severity, and causes of anaemia after BS are mostly absent. Iron supplements should be administered to patients after BS, but compliance with oral iron is poor. In addition, once iron deficiency has developed, it may prove refractory to oral treatment. In these situations, IV iron (which can circumvent the iron blockade at enterocytes and macrophages) has emerged as a safe and effective alternative for perioperative anaemia management. Monitoring should continue indefinitely even after initial iron repletion and anaemia resolution, and maintenance IV iron treatment should be provided as required.
New IV preparations, such as ferric carboxymaltose, are safe and easy to use, and up to 1,000 mg can be given in a single session, thus providing an excellent tool to avoid or treat iron deficiency in this patient population.
Abstract:
Compared with total allergenic extracts, the recombinant allergens available for specific IgE measurement represent an important advance in the diagnosis and treatment of IgE-mediated allergies. Recombinant allergens make it possible to define the sensitization profile of allergic patients, to identify markers of sensitization, and to better understand polysensitizations related to cross-reactions, as well as markers of severity of allergic reactions. They also contribute to the decision to initiate tolerance induction (allergen-specific immunotherapy) and to the optimal selection of the allergenic composition of the vaccine.
Abstract:
INTRODUCTION: According to several series, hospital hyponutrition affects 30-50% of hospitalized patients. This high prevalence justifies the need for early detection from admission. There are several classical screening tools, but they show important limitations in their systematic application in daily clinical practice. OBJECTIVES: To analyze the relationship between hyponutrition, detected by our screening method, and mortality, hospital stay, or re-admissions; to analyze, as well, the relationship between hyponutrition and the prescription of nutritional support; to compare different nutritional screening methods at admission in a random sample of hospitalized patients; and to validate the INFORNUT method for nutritional screening. MATERIAL AND METHODS: In a phase prior to the study design, a retrospective analysis of data from 2003 was carried out to assess the situation of hyponutrition in the Virgen de la Victoria Hospital, Malaga, gathering data from the MBDS (Minimal Basic Data Set), laboratory markers of nutritional risk (FILNUT filter), and prescription of nutritional support. In the experimental phase, a cross-sectional cohort study was done with a random sample of 255 patients in May 2004. An anthropometrical study, Subjective Global Assessment (SGA), Mini-Nutritional Assessment (MNA), Nutritional Risk Screening (NRS), Gassull's method, CONUT, and INFORNUT were performed. The settings of the INFORNUT filter were: albumin < 3.5 g/dL, and/or total proteins < 5 g/dL, and/or prealbumin < 18 mg/dL, with or without a total lymphocyte count < 1,600 cells/mm3 and/or total cholesterol < 180 mg/dL. To compare the different methods, a gold standard was created based on the recommendations of the SENPE on anthropometrical and laboratory data. Statistical association was analyzed by the chi-squared test (α = 0.05) and agreement by the kappa (κ) index.
RESULTS: In the preliminary-phase study, the prevalence of hospital hyponutrition was 53.9%. One thousand six hundred and forty-four patients received nutritional support, of whom 66.9% suffered from hyponutrition. We also observed that hyponutrition is one of the factors favoring increased mortality (hyponourished patients 15.19% vs. non-hyponourished 2.58%), hospital stay (20.95 days vs. 8.75 days), and re-admissions (14.30% vs. 6%). The results of the experimental study were as follows: the prevalence of hyponutrition obtained by the gold standard was 61%, and by INFORNUT 60%. Agreement between INFORNUT, CONUT, and Gassull was good or very good (κ: 0.67 INFORNUT with CONUT; κ: 0.94 INFORNUT with Gassull), as was their agreement with the gold standard (κ: 0.83 INFORNUT; κ: 0.64 CONUT; κ: 0.89 Gassull). However, the structured tests (SGA, MNA, NRS) showed low agreement with the gold standard and with the laboratory or mixed tests (Gassull), and only low to intermediate agreement with one another (κ: 0.489 NRS with SGA). INFORNUT showed a sensitivity of 92.3%, a positive predictive value of 94.1%, and a specificity of 91.2%. After the filter phase, a preliminary report is sent, to which anthropometrical and intake data are added, and a Nutritional Risk Report is produced. CONCLUSIONS: The hyponutrition prevalence in our study (60%) is similar to that found by other authors. Hyponutrition is associated with increased mortality, hospital stay, and re-admission rate. No existing tool has proven effective for the early detection of hyponutrition in the hospital setting without important limitations on its applicability. FILNUT, as the first phase of the INFORNUT filter process, represents a valid tool: it has adequate sensitivity and specificity for nutritional screening at admission.
The main advantages of the process would be the early detection of patients at risk of hyponutrition; a teaching and awareness-raising function for healthcare staff, involving them in the nutritional assessment of their patients; and the recording of the hyponutrition diagnosis and of the need for nutritional support in the discharge report, to be registered by the Clinical Documentation Department. INFORNUT would therefore be a universal screening method with a good cost-effectiveness ratio.
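The FILNUT laboratory filter quoted in the abstract is a simple rule set, and its trigger logic can be sketched directly. The thresholds below are those stated above; the function name and parameter names are ours. Per the abstract's "with or without" wording, the lymphocyte and cholesterol criteria are treated as ancillary markers that do not by themselves trigger the flag.

```python
# Illustrative sketch of the FILNUT laboratory filter described in the
# abstract: albumin < 3.5 g/dL, and/or total proteins < 5 g/dL, and/or
# prealbumin < 18 mg/dL, with or without lymphocytes < 1,600 cells/mm3
# and/or total cholesterol < 180 mg/dL.

def filnut_flags_risk(albumin_g_dl=None, total_protein_g_dl=None,
                      prealbumin_mg_dl=None, lymphocytes_per_mm3=None,
                      cholesterol_mg_dl=None):
    """Return True if at least one protein marker is below its threshold.

    Missing values (None) are simply skipped, since a screening filter
    must cope with incomplete laboratory panels.
    """
    protein_markers = (
        (albumin_g_dl, 3.5),        # g/dL
        (total_protein_g_dl, 5.0),  # g/dL
        (prealbumin_mg_dl, 18.0),   # mg/dL
    )
    # Ancillary markers (lymphocytes, cholesterol) may refine the risk
    # report but, per the "with or without" wording, do not trigger it.
    return any(v is not None and v < t for v, t in protein_markers)
```

For example, a patient with albumin of 3.2 g/dL is flagged even with a normal lymphocyte count, while normal albumin and prealbumin alone do not trigger the filter.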
Abstract:
BACKGROUND: Only multifaceted, hospital-wide interventions have been successful in achieving sustained improvements in hand hygiene (HH) compliance. METHODOLOGY/PRINCIPAL FINDINGS: Pre-post intervention study of HH performance at baseline (October 2007-December 2009) and during a two-phase intervention. Phase 1 (2010) comprised the multimodal WHO approach. Phase 2 (2011) added Continuous Quality Improvement (CQI) tools and was based on: a) increased placement of alcohol hand rub (AHR) dispensers (from 0.57 dispensers/bed to 1.56); b) increased frequency of audits (three days every three weeks: the "3/3 strategy"); c) implementation of a standardized register form for HH corrective actions; d) Statistical Process Control (SPC) as the time-series analysis methodology, through appropriate control charts. During the intervention period we performed 819 scheduled direct-observation audits, which provided data from 11,714 HH opportunities. The most remarkable findings were: a) significant improvement in HH compliance with respect to baseline (25% mean increase); b) a sustained high level (82%) of HH compliance during the intervention; c) a significant increase in AHR consumption over time; d) a significant decrease in the rate of healthcare-acquired MRSA; e) a small but significant improvement in HH compliance in phase 2 compared with phase 1 [79.5% (95% CI: 78.2-80.7) vs 84.6% (95% CI: 83.8-85.4), p < 0.05]; f) successful use of control charts to identify significant negative and positive deviations (special causes) in the HH compliance process over time ("positive": 90.1%, the highest HH compliance, coinciding with "World Hygiene Day"; "negative": 73.7%, the lowest HH compliance, coinciding with a statutory lay-off proceeding). CONCLUSIONS/SIGNIFICANCE: CQI tools may be a key addition to the WHO strategy to maintain good HH performance over time.
In addition, SPC has been shown to be a powerful methodology for detecting special causes (positive and negative) in HH performance and for helping to establish adequate feedback to healthcare workers.
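The SPC control charts mentioned above are, for compliance proportions, typically attribute charts (p-charts) with 3-sigma limits around the pooled compliance rate. The sketch below uses the standard p-chart formula; the audit data, function names, and thresholds are invented for illustration and are not the study's figures.

```python
# Hypothetical sketch of a p-chart for hand-hygiene compliance: each audit
# yields c compliant actions out of n observed opportunities. Points
# outside the 3-sigma control limits signal special-cause variation.

def p_chart_limits(compliant, opportunities):
    """Pooled compliance rate and per-audit 3-sigma control limits.

    Limits vary per audit because the standard error depends on each
    audit's number of opportunities n: pbar +/- 3*sqrt(pbar*(1-pbar)/n).
    """
    pbar = sum(compliant) / sum(opportunities)
    limits = []
    for n in opportunities:
        sigma = (pbar * (1 - pbar) / n) ** 0.5
        limits.append((max(0.0, pbar - 3 * sigma), min(1.0, pbar + 3 * sigma)))
    return pbar, limits

def special_causes(compliant, opportunities):
    """Indices of audits whose compliance falls outside the control limits."""
    pbar, limits = p_chart_limits(compliant, opportunities)
    flagged = []
    for i, (c, n) in enumerate(zip(compliant, opportunities)):
        p = c / n
        lo, hi = limits[i]
        if p < lo or p > hi:
            flagged.append(i)
    return flagged
```

On a toy series such as `special_causes([80, 82, 81, 50], [100, 100, 100, 100])`, only the final audit (50% compliance) falls below the lower limit, mirroring how a "negative" special cause like the lay-off period would surface on the chart.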
Assessment of drug-induced hepatotoxicity in clinical practice: a challenge for gastroenterologists.
Abstract:
Pharmaceutical preparations are currently serious contributors to liver disease, with hepatotoxicity ranking as the most frequent cause of acute liver failure and of post-marketing regulatory decisions. The diagnosis of hepatotoxicity remains a difficult task because of the lack of reliable markers for use in general clinical practice. Incriminating any given drug in an episode of liver dysfunction is a step-by-step process that requires a high degree of suspicion, a compatible chronology, awareness of the drug's hepatotoxic potential, the exclusion of alternative causes of liver damage, and the ability to detect subtle data favoring a toxic etiology. This process is time-consuming and the final result is frequently inaccurate. Diagnostic algorithms may add consistency to the diagnostic process by translating the suspicion into a quantitative score. Such scales are useful because they provide a framework that emphasizes the features meriting attention in cases of suspected hepatic adverse reaction. Current efforts to collect bona fide cases of drug-induced hepatotoxicity will make refinements of existing scales feasible. It is now relatively easy to accommodate relevant data within the scoring system and to delete low-impact items. Efforts should also be directed toward the development of an abridged instrument for use in evaluating suspected drug-induced hepatotoxicity at the very beginning of the diagnosis and treatment process, when clinical decisions need to be made. The instrument chosen would enable a confident diagnosis to be made on admission of the patient and treatment to be fine-tuned as further information is collected.
Abstract:
Despite intense efforts, the socioeconomic burden of cancer remains unacceptably high and treatment advances for many common cancers have been limited, suggesting a need for a new approach to drug development. One issue central to this lack of progress is the heterogeneity and genetic complexity of many tumours. This results in considerable variability in therapeutic response and requires knowledge of the molecular profile of the tumour to guide appropriate treatment selection for individual patients. While recent advances in the molecular characterisation of different cancer types have the potential to transform cancer treatment through precision medicine, such an approach presents a major economic challenge for drug development, since novel targeted agents may only be suitable for a small cohort of patients. Identifying the patients who would benefit from individual therapies and recruiting sufficient numbers of patients with particular cancer subtypes into clinical trials is challenging, and will require collaborative efforts from research groups and industry in order to accelerate progress. A number of molecular screening platforms have already been initiated across Europe, and it is hoped that these networks, along with future collaborations, will benefit not only patients but also society through cost reductions as a result of more efficient use of resources. This review discusses how current developments in translational oncology may be applied in clinical practice in the future, assesses current programmes for the molecular characterisation of cancer and describes possible collaborative approaches designed to maximise the benefits of translational science for patients with cancer.
Abstract:
Neuropsychology is a scientific discipline, born in the 19th century, that bridges the fields of neurology and psychology. Neuropsychologists apply scientific knowledge about the relationship between brain function and mental performance. The major clinical role of a neuropsychological evaluation is to help establish the medical and functional diagnosis in patients (adults or infants) with various neurological pathologies such as stroke, traumatic brain injury, dementia, or epilepsy. Such an analysis requires accurate observation of behaviour and the administration of tests of mental abilities (e.g. language, memory). Test results can also help clarify the nature of cognitive difficulties and support the formulation of plans for neuropsychological therapy and functional adjustment in everyday life.
Abstract:
RATIONALE AND OBJECTIVE: The information assessment method (IAM) permits health professionals to systematically document the relevance, cognitive impact, use, and health outcomes of information objects delivered by or retrieved from electronic knowledge resources. The companion review paper (Part 1) critically examined the literature and proposed a 'Push-Pull-Acquisition-Cognition-Application' evaluation framework, which is operationalized by IAM. The purpose of the present paper (Part 2) is to examine the content validity of the IAM cognitive checklist when linked to email alerts. METHODS: A qualitative component of a mixed-methods study was conducted with 46 doctors reading and rating research-based synopses sent by email. The unit of analysis was a doctor's explanation of a rating of one item regarding one synopsis. Interviews with participants provided 253 units, which were analysed to assess concordance with item definitions. RESULTS AND CONCLUSION: The content relevance of seven items was supported. For three items, revisions were needed. Interviews suggested one new item. This study has yielded a 2008 version of IAM.