9 results for Practice patterns
Abstract:
BACKGROUND. Total knee replacement (TKR) and total hip replacement (THR), collectively arthroplasty, are effective surgical procedures that relieve pain, improve patients' quality of life and increase functional capacity. Studies on variations in medical practice usually find the indications for performing these procedures to be highly variable, because surgeons appear to follow different criteria when recommending surgery for patients with different severity levels. We therefore proposed a study to evaluate inter-hospital variability in arthroplasty indication. METHODS. The pre-surgical condition of the 1603 patients included was compared in terms of personal characteristics, clinical situation and self-perceived health status. Patients were asked to complete two health-related quality of life questionnaires: the generic SF-12 (Short Form) and the disease-specific WOMAC (Western Ontario and McMaster Universities) scale. The type of patient undergoing primary arthroplasty was similar across the 15 hospitals evaluated. The variability in baseline WOMAC score between hospitals in THR and TKR indication was described by range, mean and standard deviation (SD), mean and standard deviation weighted by the number of procedures at each hospital, high/low ratio or extremal quotient (EQ5-95), variation coefficient (CV5-95) and weighted variation coefficient (WCV5-95) for the 5-95 percentile range. The variability in subjective and objective signs was evaluated using median, range and WCV5-95. The appropriateness of the procedures performed was assessed using a specific threshold for pain and functional capacity proposed by Quintana et al. RESULTS. The variability expressed as WCV5-95 was very low, between 0.05 and 0.11, for all three dimensions of the WOMAC scale for both types of procedure in all participating hospitals. The variability in the physical and mental SF-12 components was also very low for both types of procedure (0.08 and 0.07 for hip and 0.03 and 0.07 for knee surgery patients). However, moderate-to-high variability was detected in subjective and objective signs. Of all the surgeries performed, approximately a quarter could be considered inappropriate. CONCLUSIONS. Greater inter-hospital variability was observed for objective than for subjective signs for both procedures, suggesting that differences in the clinical criteria surgeons follow when indicating arthroplasty are mainly responsible for the variation in surgery rates.
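The variation statistics named above (EQ5-95, CV5-95, WCV5-95) are standard small-area variation measures. As a rough illustration, the sketch below computes them from per-hospital mean baseline WOMAC scores and procedure counts; the variable names and the percentile-trimming step are assumptions for illustration, not the authors' actual computation.

```python
import numpy as np

def variation_stats(means, counts):
    """Small-area variation statistics over the 5-95 percentile range.

    means  : per-hospital mean baseline WOMAC score
    counts : number of procedures performed at each hospital
    Illustrative reconstruction; the paper's exact computation may differ.
    """
    means, counts = np.asarray(means, float), np.asarray(counts, float)
    lo, hi = np.percentile(means, [5, 95])
    keep = (means >= lo) & (means <= hi)           # trim hospitals outside P5-P95
    m, w = means[keep], counts[keep]

    eq = m.max() / m.min()                         # extremal quotient EQ5-95
    cv = m.std(ddof=1) / m.mean()                  # variation coefficient CV5-95
    wmean = np.average(m, weights=w)               # mean weighted by volume
    wsd = np.sqrt(np.average((m - wmean) ** 2, weights=w))
    return eq, cv, wsd / wmean                     # last value is WCV5-95
```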
Abstract:
We present an update of the levels of scientific evidence and the corresponding grades of recommendation for asymptomatic patients, indicating which procedures are most appropriate and which should be avoided. Among all the systems described, we sought one that meets the principles of simplicity and utility. For our setting we chose the classification of the Centre for Evidence-Based Medicine, Oxford (OCEBM). This classification has the advantage of ensuring knowledge of each clinical scenario, given its high degree of specialization. It also has the merit of clarifying how a lack of methodological rigor in study design reduces a study's assessment, not only in the grading of the evidence but also in the strength of the recommendations.
Abstract:
BACKGROUND Lung cancer remains one of the most prevalent forms of cancer. Radiotherapy, with or without other therapeutic modalities, is an effective treatment. Our objective was to report on the use of radiotherapy for lung cancer and its variability in our region, and to compare our results with those of the previous study done in 2004 (VARA-I) in our region and with other published data. METHODS We reviewed the clinical records and radiotherapy treatment sheets of all patients undergoing radiotherapy for lung cancer during 2007 in the 12 public hospitals in Andalusia, an autonomous region of Spain. Data were gathered on hospital, patient type and histological type, radiotherapy treatment characteristics, and tumor stage. RESULTS 610 patients underwent initial radiotherapy. 37% of cases had stage III squamous cell lung cancer and were treated with radical therapy. 81% of patients with non-small cell and small cell lung cancer were treated with concomitant chemo-radiotherapy, and the administered total dose was ≥60 Gy and ≥45 Gy, respectively. The most common regimen for patients treated with palliative intent (44.6%) was 30 Gy. The total irradiation rate was 19.6%, with significant differences among provinces (range, 8.5-25.6%; p<0.001). These differences were significantly correlated with the geographical distribution of radiation oncologists (r=0.78; p=0.02). CONCLUSIONS Our results show no differences from other published data or from the data gathered in the VARA-I study. There is still wide variability in the application of radiotherapy for lung cancer in our setting, which significantly correlates with the geographical distribution of radiation oncologists.
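The provincial correlation reported above is a plain Pearson correlation between two per-province series; a minimal sketch with made-up values (not the study data) follows.

```python
from scipy import stats

# Hypothetical per-province figures for illustration only (Andalusia has 8 provinces);
# these are NOT the study's data.
irradiation_rate = [8.5, 14.2, 17.9, 19.6, 21.3, 23.0, 24.8, 25.6]   # % of patients irradiated
oncologists_per_million = [2.1, 3.0, 3.4, 3.8, 4.2, 4.6, 5.1, 5.3]   # workforce density

r, p = stats.pearsonr(irradiation_rate, oncologists_per_million)
print(f"r = {r:.2f}, p = {p:.3f}")   # the paper reports r=0.78, p=0.02 on the real data
```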
Abstract:
BACKGROUND: Despite the progress over recent decades in developing community mental health services internationally, many people still receive treatment and care in institutional settings. Those most likely to reside longest in these facilities have the most complex mental health problems and are at most risk of potential abuses of care and exploitation. This study aimed to develop an international, standardised toolkit to assess the quality of care in longer-term hospital and community-based mental health units, including the degree to which human rights, social inclusion and autonomy are promoted. METHOD: The domains of care included in the toolkit were identified from a systematic literature review, an international expert Delphi exercise, and a review of care standards in ten European countries. The draft toolkit comprised 154 questions for unit managers. Inter-rater reliability was tested in 202 units across ten countries at different stages of deinstitutionalisation and development of community mental health services. Exploratory factor analysis was used to corroborate the allocation of items to domains. Feedback was collected from those using the toolkit about its usefulness and ease of completion. RESULTS: The toolkit had excellent inter-rater reliability, and few items showed a narrow spread of responses. Unit managers found the content highly relevant and were able to complete it in around 90 minutes. Minimal refinement was required, and the final version comprised 145 questions assessing seven domains of care. CONCLUSIONS: Triangulation of qualitative and quantitative evidence directed the development of a robust and comprehensive international quality assessment toolkit for units in highly variable socioeconomic and political contexts.
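Itemwise inter-rater reliability of the kind reported here is commonly quantified with Cohen's kappa; the sketch below shows one plausible way to compute it per toolkit item, and is illustrative rather than the study's actual analysis.

```python
from sklearn.metrics import cohen_kappa_score

def itemwise_kappa(rater_a, rater_b):
    """Cohen's kappa for each questionnaire item, given two raters' answers.

    rater_a, rater_b : lists of per-unit response vectors (one value per item).
    Illustrative only; the study's reliability analysis may have differed.
    """
    n_items = len(rater_a[0])
    kappas = []
    for i in range(n_items):
        a = [unit[i] for unit in rater_a]   # item i as scored by rater A across units
        b = [unit[i] for unit in rater_b]   # item i as scored by rater B across units
        kappas.append(cohen_kappa_score(a, b))
    return kappas
```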
Abstract:
Objectives Assessment of exposure to a single pesticide does not capture the complexity of occupational exposure. Recently, analysis of pesticide use patterns has emerged as an alternative way to study these exposures. The aim of this study was to identify the pesticide use patterns among flower growers in Mexico participating in a study on the endocrine and reproductive effects associated with pesticide exposure. Methods A cross-sectional study was carried out to gather retrospective information on pesticide use by applying a questionnaire to the person in charge of each participating flower-growing farm. Information was obtained on the seasonal frequency of pesticide use (rainy and dry seasons) for the years 2004 and 2005. Principal components analysis was performed. Results Complete information was obtained for 88 farms, and 23 pesticides were included in the analysis. Six principal components were selected, which together explained more than 70% of the variability in the data. The pesticide use patterns identified during both years were: 1. the fungicides benomyl, carbendazim, thiophanate and metalaxyl (both seasons), plus triadimefon during the rainy season and chlorothalonil and the insecticide permethrin during the dry season; 2. the insecticides oxamyl and bifenthrin and the fungicide iprodione (both seasons), plus the insecticide methomyl during the dry season; 3. the fungicide mancozeb and the herbicide glyphosate (rainy season only); 4. the insecticides methamidophos and parathion (both seasons); 5. the insecticides omethoate and methomyl (rainy season only); and 6. the insecticides abamectin and carbofuran (dry season only). Some pesticides did not show a clear pattern of seasonal use during the years studied. Conclusions Principal component analysis is useful for summarising a large set of exposure variables into a smaller number of exposure patterns, identifying mixtures of pesticides in the occupational environment that may have an interactive effect on a particular health outcome.
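As a sketch of the method described, principal components can be extracted from a farms x pesticides matrix of use frequencies as below; the standardisation step and the 70%-variance cutoff are assumptions based on the abstract, not the authors' code.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def pesticide_patterns(X, var_target=0.70):
    """X : farms x pesticides matrix of seasonal use frequencies (88 x 23 here).

    Returns the loadings of the first components that together explain at
    least `var_target` of the variance. Illustrative reconstruction only.
    """
    Z = StandardScaler().fit_transform(X)          # standardise each pesticide column
    pca = PCA().fit(Z)
    cum = np.cumsum(pca.explained_variance_ratio_)
    k = int(np.searchsorted(cum, var_target)) + 1  # components needed to reach the target
    return pca.components_[:k]                     # one row of loadings per use pattern
```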
Abstract:
INTRODUCTION Functional imaging studies of addiction following protracted abstinence have not systematically examined the associations between severity of use of different drugs and brain dysfunction. Findings from such studies may be relevant for implementing specific treatment interventions. The aim of this study was to examine the association between resting-state regional brain metabolism, measured with 18F-fluorodeoxyglucose positron emission tomography (FDG-PET), and the severity of use of cocaine, heroin, alcohol, MDMA and cannabis in a sample of polysubstance users with prolonged abstinence from all drugs used. METHODS Our sample consisted of 49 polysubstance users enrolled in residential treatment. We conducted correlation analyses between estimates of use of cocaine, heroin, alcohol, MDMA and cannabis and brain metabolism (BM), using Statistical Parametric Mapping voxel-based (VB) whole-brain analyses. In all correlation analyses conducted for each of the drugs, we controlled for co-abuse of the other drugs used. RESULTS The analysis showed significant negative correlations between severity of heroin, alcohol, MDMA and cannabis use and BM in the dorsolateral prefrontal cortex (DLPFC) and temporal cortex. Alcohol use was further associated with lower metabolism in the frontal premotor cortex and putamen, and stimulant use with lower metabolism in the parietal cortex. CONCLUSIONS Duration of use of the different drugs correlated negatively with overlapping regions in the DLPFC, whereas severity of cocaine, heroin and alcohol use selectively impacted parietal, temporal, and frontal-premotor/basal ganglia regions, respectively. Knowledge of these associations could be useful in clinical practice, since different brain alterations have been associated with different patterns of task performance that may affect the rehabilitation of these patients.
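Controlling each drug-metabolism correlation for co-abuse of the other drugs amounts to a partial correlation. The sketch below shows the idea for a single region or voxel; it is a simplification of SPM's voxel-wise linear model, not the authors' pipeline.

```python
import numpy as np

def partial_corr(x, y, covars):
    """Correlation between x and y after regressing out `covars` from both.

    x      : severity of use of the drug of interest (one value per subject)
    y      : regional brain metabolism at one voxel or region
    covars : subjects x k matrix with severity of use of the other drugs
    Illustrative sketch of the adjustment described in the abstract.
    """
    C = np.column_stack([np.ones(len(x)), covars])        # design matrix with intercept
    rx = x - C @ np.linalg.lstsq(C, x, rcond=None)[0]     # residualise x against covariates
    ry = y - C @ np.linalg.lstsq(C, y, rcond=None)[0]     # residualise y against covariates
    return np.corrcoef(rx, ry)[0, 1]
```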
Abstract:
OBJECTIVE To describe what is, to our knowledge, the first nosocomial outbreak of infection with pan-drug-resistant (including colistin-resistant) Acinetobacter baumannii, to determine the risk factors associated with these infections, and to determine their clinical impact. DESIGN Nested case-control cohort study and a clinical-microbiological study. SETTING A 1,521-bed tertiary care university hospital in Seville, Spain. PATIENTS Case patients were inpatients who had a pan-drug-resistant A. baumannii isolate recovered from a clinical or surveillance sample obtained at least 48 hours after admission to an intensive care unit (ICU) during the time of the epidemic outbreak. Control patients were patients who were admitted to any of the "boxes" (ie, rooms that partition off a distinct area for a patient's bed and the equipment needed to care for the patient) of an ICU for at least 48 hours during the time of the epidemic outbreak. RESULTS All the clinical isolates had similar antibiotic susceptibility patterns (ie, they were resistant to all the antibiotics tested, including colistin), and repetitive extragenic palindromic polymerase chain reaction showed that all of them belonged to the same clone. Previous use of quinolones and glycopeptides and ICU stay were associated with the acquisition of infection or colonization with pan-drug-resistant A. baumannii. To control the outbreak, we implemented a multicomponent intervention program: environmental decontamination of the ICUs involved, an environmental survey, revision of cleaning protocols, active surveillance for colonization with pan-drug-resistant A. baumannii, educational programs for the staff, and the display of posters illustrating contact isolation measures and antimicrobial use recommendations. CONCLUSIONS We were not able to identify a common source for these cases of infection, but the measures adopted proved effective in controlling the outbreak.
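Risk-factor associations in a nested case-control design like this one are typically reported as odds ratios from 2x2 exposure tables; a minimal sketch with hypothetical counts (not the study's data) follows.

```python
import math

def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Odds ratio with a Woolf 95% confidence interval.

    A generic epidemiological helper, not code from the paper.
    """
    a, b, c, d = exposed_cases, unexposed_cases, exposed_controls, unexposed_controls
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# e.g., prior quinolone use among cases vs. controls (made-up counts):
print(odds_ratio(12, 5, 8, 30))
```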
Abstract:
Ambulatory blood pressure (BP) monitoring has become useful in the diagnosis and management of hypertensive individuals. In addition to 24-hour values, the circadian variation of BP adds prognostic significance in predicting cardiovascular outcome. However, the prevalence of the different circadian BP patterns has hardly been examined in large studies. Our aims were to determine the prevalence of circadian BP patterns and to assess the clinical conditions associated with nondipping status in groups of treated and untreated hypertensive subjects, studied separately. Clinical data and 24-hour ambulatory BP monitoring were obtained from 42,947 hypertensive patients included in the Spanish Society of Hypertension Ambulatory Blood Pressure Monitoring Registry: 8,384 previously untreated and 34,563 treated hypertensives. Twenty-four-hour ambulatory BP monitoring was performed with an oscillometric device (SpaceLabs 90207). A nondipping pattern was defined as a nocturnal systolic BP dip of <10% of daytime systolic BP. The prevalence of nondipping was 41% in the untreated group and 53% in treated patients. In both groups, advanced age, obesity, diabetes mellitus, and overt cardiovascular or renal disease were associated with a blunted nocturnal BP decline (P<0.001). In treated patients, nondipping was associated with the use of a higher number of antihypertensive drugs but not with the time of day at which the drugs were administered. In conclusion, a blunted nocturnal BP dip (the nondipping pattern) is common in hypertensive patients. A clinical pattern of high cardiovascular risk is associated with nondipping, suggesting that the blunted nocturnal BP dip may be merely a marker of high cardiovascular risk.
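The dipping classification used in the registry is simple arithmetic: the nocturnal fall in systolic BP expressed as a fraction of the daytime value. A minimal sketch, assuming mean daytime and night-time systolic readings per patient:

```python
def is_nondipper(day_sbp, night_sbp):
    """Nondipping pattern: nocturnal systolic BP dip < 10% of daytime systolic BP."""
    dip = (day_sbp - night_sbp) / day_sbp
    return dip < 0.10

print(is_nondipper(135.0, 128.0))   # dip ~5.2%  -> True  (nondipper)
print(is_nondipper(135.0, 118.0))   # dip ~12.6% -> False (dipper)
```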
Assessment of drug-induced hepatotoxicity in clinical practice: a challenge for gastroenterologists.
Abstract:
Pharmaceutical preparations are currently serious contributors to liver disease, with hepatotoxicity ranking as the most frequent cause of acute liver failure and of post-marketing regulatory decisions. The diagnosis of hepatotoxicity remains a difficult task because of the lack of reliable markers for use in general clinical practice. Incriminating a given drug in an episode of liver dysfunction is a step-by-step process that requires a high degree of suspicion, a compatible chronology, awareness of the drug's hepatotoxic potential, exclusion of alternative causes of liver damage, and the ability to detect subtle data that favor a toxic etiology. This process is time-consuming and the final result is frequently inaccurate. Diagnostic algorithms may add consistency to the diagnostic process by translating the suspicion into a quantitative score. Such scales are useful because they provide a framework that emphasizes the features meriting attention in cases of suspected hepatic adverse reaction. Current efforts to collect bona fide cases of drug-induced hepatotoxicity will make refinement of the existing scales feasible: it is now relatively easy to accommodate relevant data within a scoring system and to delete low-impact items. Efforts should also be directed toward the development of an abridged instrument for use in evaluating suspected drug-induced hepatotoxicity at the very beginning of the diagnosis and treatment process, when clinical decisions need to be made. Such an instrument would enable a confident diagnosis on admission of the patient and allow treatment to be fine-tuned as further information is collected.
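Diagnostic algorithms of the kind described translate clinical suspicion into a quantitative score. The sketch below is a deliberately toy additive scaffold in the spirit of such scales; the items, weights and cut-offs are invented placeholders, not the CIOMS/RUCAM instrument or any published scale.

```python
def causality_score(chronology_compatible, improves_on_withdrawal,
                    alternative_causes_excluded, known_hepatotoxin,
                    positive_rechallenge):
    """Toy additive causality score for suspected drug-induced liver injury.

    All items and weights are illustrative placeholders only.
    """
    score = 0
    score += 2 if chronology_compatible else -2        # compatible time course
    score += 2 if improves_on_withdrawal else 0        # dechallenge response
    score += 2 if alternative_causes_excluded else -1  # competing etiologies ruled out
    score += 1 if known_hepatotoxin else 0             # drug's known hepatotoxic potential
    score += 3 if positive_rechallenge else 0          # strongest single item
    if score >= 6:
        return score, "probable or highly probable"
    if score >= 3:
        return score, "possible"
    return score, "unlikely"
```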