13 results for Withholding and withdrawing treatment
at Université de Lausanne, Switzerland
Abstract:
PRINCIPLES: This retrospective study analyzes the long-term results of endoscopic and surgical treatment of vesico-ureteral reflux in children. METHODS: A cohort of 130 patients (67 girls and 63 boys) with a mean age of 30 months was treated either by endoscopic subureteral collagen injection (SCIN) in 92 refluxing ureteral units or by Cohen reimplantation surgery in 123. Mean follow-up was 4.2 years, ranging from 1 to 8.7 years. Reflux recurrence, urinary tract infection (UTI) and renal function were evaluated. RESULTS: After SCIN, reflux was absent in 64% at 6 months. Of the 92 initially refluxing ureters, 20% were injected twice. After one or two injections, reflux was absent in 71%. In 21%, recurrent reflux was of grade I or II, not requiring further treatment. UTI was observed in 27%. After Cohen ureteral reimplantation, reflux was absent in 96% at 6 months. UTI was observed in 23%. Renal function at diagnosis and follow-up was compared only in children with bilateral grade III reflux. In patients treated with SCIN it was normal in 77% preoperatively and in 90% at follow-up; in patients treated by open surgery it was normal in 47% preoperatively and in 76% at follow-up. CONCLUSION: For high-grade vesico-ureteral reflux, reimplantation surgery remains the gold standard. SCIN is indicated for low- and medium-grade reflux. Recurrent bacteriuria was observed more often after SCIN and pyelonephritis more often after open surgery. Renal function appears to be preserved with both techniques.
Abstract:
An assessment of sewage workers' exposure to airborne cultivable bacteria, fungi and inhaled endotoxins was performed at 11 sewage treatment plants. We sampled the enclosed and unenclosed treatment areas in each plant and evaluated the influence of season (summer and winter) on bioaerosol levels. We also measured workers' personal exposure to endotoxins during special operations in which a higher risk of bioaerosol inhalation was assumed. Results show that only fungi are present in significantly higher concentrations in summer than in winter (2331 +/- 858 versus 329 +/- 95 CFU m(-3)). We also found significantly more bacteria in the enclosed area, near the particle grids for incoming water, than in the unenclosed area near the aeration basins (9455 +/- 2661 versus 2435 +/- 985 CFU m(-3) in summer and 11 081 +/- 2299 versus 2002 +/- 839 CFU m(-3) in winter). All bioaerosols were frequently above the recommended occupational exposure values. Workers carrying out special tasks such as cleaning tanks were exposed to very high levels of endotoxins (up to 500 EU m(-3)) compared to routine work. The species composition and concentration of airborne Gram-negative bacteria were also studied. A broad spectrum of species within the Pseudomonadaceae and Enterobacteriaceae families was predominant in nearly all plants investigated.
Abstract:
Low motivation is frequent in chronic disorders such as psychosis and may limit treatment efficacy. Although some evidence supports this view in adults, few studies so far have focused on adolescents. We assessed the impact of baseline symptoms, cognitive deficits and cognitive treatment characteristics on treatment motivation (TM), and examined whether TM affected treatment outcome. Twenty-eight adolescents with psychotic disorders participated in 16 sessions of computerized cognitive remediation or games. TM was assessed for each session. Lower TM was predicted by more severe symptoms at baseline, and was associated with smaller improvements in symptoms and in both cognitive and psychosocial functioning at the end of the intervention. Experiencing success in the treatment exercises enhanced TM in all patients.
Abstract:
HIV-positive adolescents face a number of challenges in dealing with their disease and its treatment. In this qualitative study, twenty-nine HIV-positive adolescents aged 13 to 20 years (22 girls), who live in Switzerland, were asked, in a semi-structured interview (duration of 40-110 minutes), to describe their perceptions and experiences with the disease itself and with therapeutic adherence. While younger adolescents most often thought of their disease as fate, older adolescents usually knew that they had received it through vertical transmission, although the topic appeared to be particularly difficult to discuss for those living with their HIV-positive mothers. Based on their attending physician's assessment, 18 subjects were judged highly adherent, 4 fairly and 7 poorly adherent. High adherence appeared linked with adequate psychological adjustment and effective coping mechanisms, as well as with the discussion and adoption of explicit medication-taking strategies. The setting and organisation of health care teams should allow for ongoing discussions with HIV-positive adolescents that focus on their perceptions of their disease, how they cope with it and with the treatment, and how they could improve their adherence.
Abstract:
STUDY DESIGN: Prospective, controlled, observational outcome study using clinical, radiographic, and patient/physician-based questionnaire data, with patient outcomes at 12-month follow-up. OBJECTIVE: To validate appropriateness criteria for low back surgery. SUMMARY OF BACKGROUND DATA: Most surgical treatment failures are attributed to poor patient selection, but no widely accepted consensus exists on detailed indications for appropriate surgery. METHODS: Appropriateness criteria for low back surgery were developed by a multispecialty panel using the RAND appropriateness method. Based on the panel criteria, a prospective study compared outcomes of patients appropriately and inappropriately treated at a single institution, with a 12-month follow-up assessment. Included were patients with low back pain and/or sciatica referred to the neurosurgical department. Information about symptoms, neurologic signs, health-related quality of life (SF-36), disability status (Roland-Morris), and pain intensity (VAS) was assessed at baseline, at 6 months, and at 12 months of follow-up. The appropriateness criteria were administered prospectively to each clinical situation and outside of the clinical setting, with the surgeon and patients blinded to the results of the panel decision. The patients were further stratified into 2 groups: the appropriate treatment group (ATG) and the inappropriate treatment group (ITG). RESULTS: Overall, 398 patients completed all forms at 12 months. Treatment was considered appropriate for 365 participants and inappropriate for 33 participants. The mean improvement in the SF-36 physical component score at 12 months was significantly higher in the ATG (mean: 12.3 points) than in the ITG (mean: 6.8 points) (P = 0.01), as was the mean improvement in the SF-36 mental component score (ATG mean: 5.0 points; ITG mean: -0.5 points) (P = 0.02). Improvement was also significantly higher in the ATG for mean VAS back pain (ATG mean: 2.3 points; ITG mean: 0.8 points; P = 0.02) and the Roland-Morris disability score (ATG mean: 7.7 points; ITG mean: 4.2 points; P = 0.004). The ATG also had a higher improvement in mean VAS for sciatica (4.0 points) than the ITG (2.8 points), but the difference was not significant (P = 0.08). The SF-36 General Health score declined in both groups after 12 months; however, the decline was worse in the ITG (mean decline: 8.2 points) than in the ATG (mean decline: 1.2 points) (P = 0.04). Overall, in comparison to ITG patients, ATG patients had significantly higher improvement at 12 months, both statistically and clinically. CONCLUSION: Compared with the previously reported literature, our study is the first to assess the utility of appropriateness criteria for low back surgery at 1-year follow-up with multiple outcome dimensions. Our results confirm the hypothesis that application of appropriateness criteria can significantly improve patient outcomes.
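For concreteness, the sketch below (Python, pandas and scipy) shows one way the 12-month improvement could be compared between the appropriate (ATG) and inappropriate (ITG) treatment groups for a single outcome dimension. The abstract does not state which statistical test was used, so the independent-samples t-test here is an assumption, and the data file and column names (group, sf36_pcs_baseline, sf36_pcs_12m) are hypothetical placeholders.

import pandas as pd
from scipy import stats

df = pd.read_csv("low_back_outcomes.csv")  # hypothetical dataset

# Improvement in the SF-36 physical component score over 12 months
df["pcs_improvement"] = df["sf36_pcs_12m"] - df["sf36_pcs_baseline"]

atg = df.loc[df["group"] == "ATG", "pcs_improvement"]
itg = df.loc[df["group"] == "ITG", "pcs_improvement"]

# Compare mean improvement between the two groups (unequal variances assumed)
t_stat, p_value = stats.ttest_ind(atg, itg, equal_var=False)
print(f"ATG mean: {atg.mean():.1f}, ITG mean: {itg.mean():.1f}, p = {p_value:.3f}")

The same pattern would be repeated for each of the other outcome dimensions (SF-36 mental component, VAS back pain, VAS sciatica, Roland-Morris).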
Abstract:
OBJECTIVE: Study of the uptake of new medical technologies provides useful information on the transfer of published evidence into usual practice. We conducted an audit of selected hospitals in three countries (Canada, France, and Switzerland) to identify clinical predictors of low-molecular-weight (LMW) heparin use and outpatient treatment, and to compare the pace of uptake of these new therapeutic approaches across hospitals. DESIGN: Historical review of medical records. SETTING AND PARTICIPANTS: We reviewed the medical records of 3043 patients diagnosed with deep vein thrombosis (DVT) in five Canadian, two French, and two Swiss teaching hospitals from 1994 to 1998. MEASURES: We explored independent clinical variables associated with LMW heparin use and outpatient treatment, and determined crude and adjusted rates of LMW heparin use and outpatient treatment across hospitals. RESULTS: For the years studied, the overall rates of LMW heparin use and outpatient treatment in the study sample were 34.1 and 15.8%, respectively, with higher rates of use in later years. Many comorbidities were negatively associated with outpatient treatment, and risk-adjusted rates of use of these new approaches varied significantly across hospitals. CONCLUSION: There has been a relatively rapid uptake of LMW heparins and outpatient treatment for DVT in their early years of availability, but the pace of uptake has varied considerably across hospitals and countries.
Abstract:
BACKGROUND: Vitamin D is an important immune modulator, and preliminary data indicated an association between vitamin D deficiency and sustained virologic response (SVR) rates in patients with chronic hepatitis C. We therefore performed a comprehensive analysis of the impact of vitamin D serum levels and of genetic polymorphisms within the vitamin D cascade on chronic hepatitis C and its treatment. METHODS: Vitamin D serum levels and genetic polymorphisms within the vitamin D receptor and the 1α-hydroxylase were determined in a cohort of 468 HCV genotype 1, 2 and 3 infected patients who were treated with interferon-alfa-based regimens. RESULTS: Chronic hepatitis C was associated with a high incidence of severe vitamin D deficiency compared to controls (25(OH)D3 <10 ng/mL in 25% versus 12%, p<0.00001), which was in part reversible after HCV eradication. 25(OH)D3 deficiency correlated with SVR in HCV genotype 2 and 3 patients (63% and 83% SVR for patients with and without severe vitamin D deficiency, respectively, p<0.001). In addition, the CYP27B1-1260 promoter polymorphism rs10877012 had a substantial impact on 1,25-dihydroxyvitamin D serum levels and SVR rates in HCV genotype 1, 2 and 3 infected patients. CONCLUSIONS: Chronic hepatitis C virus infection is associated with vitamin D deficiency. Reduced 25-hydroxyvitamin D levels and the CYP27B1-1260 promoter polymorphism are associated with failure to achieve SVR in HCV genotype 1, 2 and 3 infected patients.
Abstract:
Introduction. Preoperative malnutrition is a major risk factor for increased postoperative morbidity and mortality. The definition and diagnosis of malnutrition and its treatment remain subjects of controversy. Furthermore, the practical implementation of nutrition-related guidelines is unknown. Methods. A review of the available literature and of current guidelines on perioperative nutrition was conducted. We focused on nutritional screening and perioperative nutrition in patients undergoing digestive surgery, and we assessed the translation of recent guidelines into clinical practice. Results and Conclusions. Malnutrition is a well-recognized risk factor for poor postoperative outcome. The prevalence of malnutrition depends largely on its definition; about 40% of patients undergoing major surgery fulfil current diagnostic criteria for being at nutritional risk. The Nutritional Risk Score is a pragmatic and validated tool to identify patients who should benefit from nutritional support. Adequate nutritional intervention reduces (infectious) complications, hospital stay, and costs. Preoperative oral supplementation for a minimum of five days is preferable; depending on the patient and the type of surgery, immune-enhancing formulas are recommended. However, surgeons' compliance with evidence-based guidelines remains poor, and efforts are necessary to implement routine nutritional screening and nutritional support.
Abstract:
Repeated antimalarial treatment for febrile episodes and self-treatment are common in malaria-endemic areas. The intake of antimalarials prior to participation in an in vivo study may alter treatment outcome and affect the interpretation of both efficacy and safety outcomes. We report the findings from baseline plasma sampling of malaria patients prior to inclusion in an in vivo study in Tanzania and discuss the implications of residual concentrations of antimalarials in this setting. In an in vivo study conducted in a rural area of Tanzania in 2008, baseline plasma samples from patients reporting no antimalarial intake within the previous 28 days were screened for the presence of 14 antimalarials (parent drugs or metabolites) using liquid chromatography-tandem mass spectrometry. Among the 148 patients enrolled, 110 (74.3%) had at least one antimalarial in their plasma: 80 (54.1%) had lumefantrine above the lower limit of calibration (LLC = 4 ng/mL), 7 (4.7%) desbutyl-lumefantrine (4 ng/mL), 77 (52.0%) sulfadoxine (0.5 ng/mL), 15 (10.1%) pyrimethamine (0.5 ng/mL), 16 (10.8%) quinine (2.5 ng/mL), and none chloroquine (2.5 ng/mL). The proportion of patients with detectable antimalarial drug levels prior to enrollment in the study is worrying. Indeed, artemether-lumefantrine was supposed to be available only at government health facilities. Although sulfadoxine-pyrimethamine is only recommended for intermittent preventive treatment in pregnancy (IPTp), it was still widely used in public and private health facilities and sold in drug shops. Self-reporting of previous drug intake is unreliable, and screening for antimalarial drug levels should therefore be considered in future in vivo studies to allow accurate assessment of treatment outcome. Furthermore, persisting sub-therapeutic drug levels of antimalarials in a population could promote the spread of drug resistance. Knowledge of drug pressure in a given population is important for monitoring the implementation of standard treatment policy.
Abstract:
In this study, we assessed whether the white-coat effect (the difference between office and daytime blood pressure (BP)) is associated with nondipping (the absence of a nocturnal BP decrease). Data were available for 371 individuals of African descent from 74 families selected from a population-based hypertension register in the Seychelles Islands and for 295 Caucasian individuals randomly selected from a population-based study in Switzerland. We used standard multiple linear regression in the Swiss data and generalized estimating equations to account for familial correlations in the Seychelles data. The prevalence of systolic nondipping, diastolic nondipping (<10% nocturnal BP decrease) and white-coat hypertension (WCH) was 51%, 46%, and 4%, respectively, in blacks and 33%, 37%, and 7% in whites. When the white-coat effect and nocturnal dipping were taken as continuous variables (mm Hg), systolic (SBP) and diastolic BP (DBP) dipping were inversely and independently associated with the white-coat effect (P < 0.05) in both populations. Analogously, the difference between office and daytime heart rate was inversely associated with the difference between daytime and night-time heart rate in the two populations. These results did not change after adjustment for potential confounders. The white-coat effect is associated with BP nondipping. The similar associations between office-daytime and daytime-night-time values for both BP and heart rate suggest that the sympathetic nervous system might play a role. Our findings also underscore the value, for clinicians, of assessing the presence of a white-coat effect as a means to identify patients at increased cardiovascular risk and to guide treatment accordingly.
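As an illustration of the analytic approach described above, the sketch below (Python, pandas and statsmodels) shows how the inverse association between the white-coat effect and nocturnal dipping could be estimated, with ordinary least squares for the unrelated Swiss sample and generalized estimating equations for the family-based Seychelles sample. The file name and the column names (office_sbp, daytime_sbp, night_sbp, family_id, age, sex) are hypothetical placeholders, not the study's actual variables.

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("abpm_cohort.csv")  # hypothetical dataset

# Continuous variables in mm Hg, as in the abstract
df["white_coat_effect"] = df["office_sbp"] - df["daytime_sbp"]
df["nocturnal_dip"] = df["daytime_sbp"] - df["night_sbp"]

# Standard multiple linear regression (e.g., for the Swiss sample)
ols_fit = smf.ols("nocturnal_dip ~ white_coat_effect + age + sex", data=df).fit()
print(ols_fit.params, ols_fit.pvalues)

# Generalized estimating equations with an exchangeable working correlation
# to account for familial clustering (e.g., for the Seychelles sample)
gee_fit = smf.gee(
    "nocturnal_dip ~ white_coat_effect + age + sex",
    groups="family_id",
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(gee_fit.summary())

A negative coefficient on white_coat_effect in both models would correspond to the inverse association reported in the abstract.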
Abstract:
Background: The public health burden of coronary artery disease (CAD) is substantial. Perfusion cardiac magnetic resonance (CMR) is generally accepted for detecting and monitoring CAD. Few studies have so far addressed its costs and cost-effectiveness. Objectives: To compare, in a large CMR registry, the costs of a CMR-guided strategy versus two hypothetical invasive strategies for the diagnosis and treatment of patients with suspected CAD. Methods: In 3647 patients with suspected CAD included prospectively in the EuroCMR Registry (59 centers; 18 countries), costs were calculated for diagnostic examinations, revascularizations, and complication management over a 1-year follow-up. Patients with ischemia-positive CMR underwent invasive X-ray coronary angiography (CXA) and revascularization at the discretion of the treating physician (CMR+CXA strategy). Ischemia was found in 20.9% of patients, and 17.4% of them were revascularized. In patients with ischemia-negative CMR, cardiac death and non-fatal myocardial infarction occurred in 0.38%/year. In a hypothetical invasive arm, costs were calculated for an initial CXA followed by FFR testing in vessels with ≥50% diameter stenoses (CXA+FFR strategy). To model this hypothetical arm, the same proportion of ischemic patients and the same outcomes were assumed as for the CMR+CXA strategy. The coronary stenosis-FFR relationship reported in the literature was used to derive the proportion of patients with ≥50% diameter stenoses (Psten) in the study cohort. The costs of a CXA-only strategy were also calculated. Calculations were performed from a third-party payer perspective for the German, UK, Swiss, and US healthcare systems.
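To make the strategy comparison concrete, here is a minimal sketch (Python) of how per-patient expected costs could be aggregated for the two strategies described above. The unit costs are placeholders rather than the published tariffs of the four healthcare systems, and the 40% stenosis proportion stands in for the literature-derived value (Psten); only the 20.9% ischemia rate and the 17.4% revascularization rate among ischemic patients come from the abstract.

# Placeholder unit costs in an arbitrary currency (not the study's tariffs)
COST = {
    "cmr": 500.0,
    "cxa": 1500.0,
    "ffr": 300.0,
    "revascularization": 8000.0,
}

def cost_cmr_cxa(p_ischemia=0.209, p_revasc_if_ischemic=0.174):
    """Everyone gets CMR; only ischemia-positive patients proceed to CXA,
    and a fraction of those are revascularized."""
    return (COST["cmr"]
            + p_ischemia * COST["cxa"]
            + p_ischemia * p_revasc_if_ischemic * COST["revascularization"])

def cost_cxa_ffr(p_stenosis=0.40, p_ischemia=0.209, p_revasc_if_ischemic=0.174):
    """Everyone gets CXA; FFR is added in vessels with >=50% stenoses
    (p_stenosis is a placeholder for Psten); the same ischemia and
    revascularization rates are assumed as for the CMR+CXA strategy."""
    return (COST["cxa"]
            + p_stenosis * COST["ffr"]
            + p_ischemia * p_revasc_if_ischemic * COST["revascularization"])

print(f"CMR+CXA strategy: {cost_cmr_cxa():.0f} per patient")
print(f"CXA+FFR strategy: {cost_cxa_ffr():.0f} per patient")

In the registry analysis, complication management over the 1-year follow-up would be added to each arm in the same way; it is omitted here for brevity.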
Abstract:
AIMS: Smoking cessation has been suggested to increase the short-term risk of type 2 diabetes mellitus (T2DM). This study aimed to assess the association between smoking cessation and the incidence of T2DM and impaired fasting glucose (IFG). METHODS: Data from participants in the CoLaus study, Switzerland, aged 35-75 at baseline and followed for 5.5 years were used. Participants were classified at baseline as smokers, recent quitters (≤5 years), long-term quitters (>5 years), and non-smokers. Outcomes were IFG (fasting serum glucose (FSG) 5.6-6.99 mmol/l) and T2DM (FSG ≥7.0 mmol/l and/or treatment) at follow-up. RESULTS: 3,166 participants (63% women) had normal baseline FSG, of whom 26.7% were smokers, 6.5% recent quitters, and 23.5% long-term quitters. During follow-up, 1,311 participants (41.4%) developed IFG (33.6% of women, 54.7% of men) and 47 (1.5%) developed T2DM (1.1% of women, 2.1% of men). Former smokers did not have significantly increased odds of IFG compared with smokers after adjustment for age, education, physical activity, hypercholesterolemia, hypertension and alcohol intake, with ORs of 1.29 [95% confidence interval 0.94-1.76] for recent quitters and 1.03 [0.84-1.27] for long-term quitters. Former smokers did not have significantly increased odds of T2DM compared with smokers, with multivariable-adjusted ORs of 1.53 [0.58-4.00] for recent quitters and 0.64 [0.27-1.48] for long-term quitters. Adjustment for body-mass index and waist circumference attenuated the association between recent quitting and IFG (OR 1.07 [0.78-1.48]) and T2DM (OR 1.28 [0.48-3.40]). CONCLUSION: In this middle-aged population, smoking cessation was not associated with an increased risk of IFG or T2DM.
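As a rough illustration of the kind of multivariable model behind the adjusted odds ratios reported above, the sketch below (Python, pandas and statsmodels) fits a logistic regression for incident IFG with current smokers as the reference category. The file name and the column names (ifg_followup, smoking_cat, and the covariates) are hypothetical; the actual CoLaus variable coding is not specified in the abstract.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("colaus_followup.csv")  # hypothetical dataset

# smoking_cat levels: "smoker" (reference), "recent_quitter",
# "long_term_quitter", "non_smoker"
fit = smf.logit(
    "ifg_followup ~ C(smoking_cat, Treatment(reference='smoker'))"
    " + age + education + physical_activity + hypercholesterolemia"
    " + hypertension + alcohol_intake",
    data=df,
).fit()

# Adjusted odds ratios with 95% confidence intervals
or_ci = np.exp(fit.conf_int())
or_ci["OR"] = np.exp(fit.params)
print(or_ci)

Adding body-mass index and waist circumference to the formula would reproduce the further-adjusted model described at the end of the abstract.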