935 results for THERAPY USE
Abstract:
Objective: The aim of postoperative treatment after cardiac surgery is to avoid low cardiac output syndrome (LCOS). Levosimendan, a new inotropic agent, has been demonstrated in adult patients to be an effective treatment for this purpose when classical therapy is not effective. It has a positive effect on cardiac output, with fewer adverse effects and lower mortality than dopamine. There are very few data on its benefit in the paediatric population. The aim of this study was to evaluate the effect of levosimendan in children with LCOS after cardiac surgery. Methods: Retrospective analysis of 25 children hospitalised in our PICU after cardiac surgery who demonstrated LCOS not responding to classical catecholamine therapy and who received levosimendan as rescue therapy. LCOS parameters such as urine output, mixed venous oxygen saturation (SvO2), arterio-venous difference in CO2 (AVCO2) and plasma lactate were compared before therapy and at 12, 24, 48 and 72 hours after the start of the levosimendan infusion. We also analyzed the effect on the use of amines (amine score), adverse events and mortality. Results: After the start of the levosimendan infusion, urine output (3.1 vs 5.3 ml/kg/h, p=0.003) and SvO2 (56% vs 64%, p=0.001) increased significantly during the first 72 hours, while plasma lactate (2.6 vs 1.4 mmol/l, p<0.001), AVCO2 (11 vs 8 mmHg, p=0.002) and the amine score (63 vs 39, p=0.007) decreased significantly. No side effects were noted during administration of levosimendan. In this group of patients, mortality was 0%. Conclusion: Levosimendan is an effective treatment in children after congenital heart surgery. Our study, with a larger sample of patients than previous studies, confirms the improvement in cardiac output already shown in other paediatric studies.
Abstract:
SUMMARY: Reluctance to treat chronic hepatitis C in active intravenous (IV) drug users (IDUs) has been expressed both in international guidelines and in routine clinical practice. However, the medical literature provides no evidence supporting an unequivocal deferral of treatment in this risk group. We retrospectively analyzed the direct effect of IV drug use on treatment outcome in 500 chronic hepatitis C patients enrolled in the Swiss Hepatitis C Cohort Study. Patients were eligible for the study if their serum hepatitis C virus (HCV) RNA had been tested 6 months after the end of treatment and they had at least one visit during antiviral therapy documenting drug use status. Five hundred patients fulfilled the inclusion criteria (199 IDUs and 301 controls). A minimum exposure to 80% of the scheduled cumulative dose of antivirals was reached in 66.0% of IDUs and 60.5% of controls (P = NS). The overall sustained virological response (SVR) rate was 63.6%. Active IDUs reached an SVR of 69.3%, not statistically significantly different from controls (59.8%). A multivariate analysis of treatment success showed no significant negative influence of active IV drug use. In conclusion, our study shows no relevant direct influence of IV drug use on the efficacy of anti-HCV therapy among adherent patients.
Abstract:
Background/Purpose: The primary treatment goals for gouty arthritis (GA) are rapid relief of pain and inflammation during acute attacks, and long-term hyperuricemia management. A post-hoc analysis of 2 pivotal trials was performed to assess the efficacy and safety of canakinumab (CAN), a fully human monoclonal anti-IL-1β antibody, vs triamcinolone acetonide (TA) in GA patients unable to use NSAIDs and colchicine, and who were on stable urate-lowering therapy (ULT) or unable to use ULT. Methods: In these 12-week, randomized, multicenter, double-blind, double-dummy, active-controlled studies (β-RELIEVED and β-RELIEVED II), patients had to have frequent attacks (≥3 attacks in the previous year) meeting the preliminary 1977 ACR criteria for GA, and were unresponsive or intolerant to, or contraindicated for, NSAIDs and/or colchicine; if on ULT, ULT was stable. Patients were randomized during an acute attack to a single dose of CAN 150 mg s.c. or TA 40 mg i.m. and were redosed "on demand" for each new attack. Patients completing the core studies were enrolled into blinded 12-week extension studies to further investigate on-demand use of CAN vs TA for new attacks. The subpopulation selected for this post-hoc analysis was (a) unable to use NSAIDs and colchicine due to contraindication, intolerance, or lack of efficacy, and (b) currently on ULT, or with a contraindication to or previous failure of ULT, as determined by the investigators. The subpopulation comprised 101 patients (51 CAN; 50 TA) out of 454 total. Results: Several co-morbidities, including hypertension (56%), obesity (56%), diabetes (18%), and ischemic heart disease (13%), were reported in 90% of this subpopulation. Pain intensity (VAS, 100 mm scale) was comparable between the CAN and TA treatment groups at baseline (least-square [LS] mean 74.6 and 74.4 mm, respectively). A significantly lower pain score was reported with CAN vs TA at 72 hours post dose (1st co-primary endpoint on the baseline flare; LS mean 23.5 vs 33.6 mm; difference −10.2 mm; 95% CI −19.9 to −0.4; P = 0.0208 [1-sided]). CAN significantly reduced the risk of a first new attack by 61% vs TA (HR 0.39; 95% CI 0.17-0.91; P = 0.0151 [1-sided]) over the first 12 weeks (2nd co-primary endpoint), and by 61% vs TA (HR 0.39; 95% CI 0.19-0.79; P = 0.0047 [1-sided]) over 24 weeks. Serum urate levels increased for CAN vs TA, with the mean change from baseline reaching a maximum of +0.7 ± 2.0 vs −0.1 ± 1.8 mg/dL at 8 weeks, and +0.3 ± 2.0 vs −0.2 ± 1.4 mg/dL at the end of the study (all patients had a GA attack at baseline). Adverse events (AEs) were reported in 33 (66%) CAN and 24 (47.1%) TA patients. Infections and infestations were the most common AEs, reported in 10 (20%) and 5 (10%) patients treated with CAN and TA, respectively. The incidence of SAEs was comparable between the CAN (gastritis, gastroenteritis, chronic renal failure) and TA (aortic valve incompetence, cardiomyopathy, aortic stenosis, diarrhoea, nausea, vomiting, bicuspid aortic valve) groups (2 [4.0%] vs 2 [3.9%]). Conclusion: CAN provided superior pain relief and reduced the risk of new attacks in highly comorbid GA patients unable to use NSAIDs and colchicine who were on stable ULT or unable to use ULT. The safety profile in this post-hoc subpopulation was consistent with that of the overall β-RELIEVED and β-RELIEVED II population.
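Editor's note: the percentage risk reductions quoted above follow directly from the hazard ratios via relative risk reduction = 1 − HR. The minimal Python sketch below applies that conversion to the point estimate and confidence bounds reported in this abstract; it is purely illustrative and not part of the original analysis.

```python
def risk_reduction(hazard_ratio: float) -> float:
    """Relative risk reduction implied by a hazard ratio (1 - HR)."""
    return 1.0 - hazard_ratio

# Figures quoted in the abstract: HR 0.39 (95% CI 0.17-0.91) over 12 weeks.
hr, ci_low, ci_high = 0.39, 0.17, 0.91

print(f"Point estimate: {risk_reduction(hr):.0%} reduction")                     # ~61%
print(f"95% CI: {risk_reduction(ci_high):.0%} to {risk_reduction(ci_low):.0%}")  # ~9% to ~83%
```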
Abstract:
OBJECTIVES: We studied the influence of noninjecting and injecting drug use on mortality, dropout rate, and the course of antiretroviral therapy (ART) in the Swiss HIV Cohort Study (SHCS). METHODS: Cohort participants registered prior to April 2007 who had completed at least one drug use questionnaire by May 2013 were categorized according to their self-reported drug use behaviour. The probabilities of death and dropout were analysed separately using multivariable competing-risks proportional hazards regression models with mutual correction for the other endpoint. Furthermore, we describe the influence of drug use on the course of ART. RESULTS: A total of 6529 participants (including 31% women) were followed for 31 215 person-years; 5.1% of participants died and 10.5% were lost to follow-up. Among persons with homosexual or heterosexual HIV transmission, noninjecting drug use was associated with higher all-cause mortality [subhazard ratio (SHR) 1.73; 95% confidence interval (CI) 1.07-2.83] compared with no drug use. Mortality was also increased among former injecting drug users (IDUs) who reported noninjecting drug use (SHR 2.34; 95% CI 1.49-3.69). Noninjecting drug use was associated with higher dropout rates. The mean proportion of time with suppressed viral replication was 82.2% in all participants, irrespective of ART status, and 91.2% in those on ART. Drug use lowered adherence and increased the rates of ART change and ART interruption. Virological failure on ART was more frequent in participants who reported concomitant drug injection while on opiate substitution and in current IDUs, but not among noninjecting drug users. CONCLUSIONS: Noninjecting drug use and injecting drug use are modifiable risk factors for death; they also lower retention in the cohort and complicate ART.
Abstract:
PURPOSE: To investigate whether prophylactic use of bevacizumab reduces the rate of rubeosis after proton therapy for uveal melanoma and improves the ability to treat ischemic, reattached retina with laser photocoagulation. DESIGN: Comparative retrospective case series. METHODS: Uveal melanoma patients with ischemic retinal detachment treated with proton therapy were included in this institutional study. Twenty-four eyes received prophylactic intravitreal bevacizumab injections and were compared with a control group of 44 eyes without bevacizumab treatment. Bevacizumab injections were performed at the time of tantalum clip insertion and were repeated every 2 months during the first 6 months, and every 3 months thereafter. Ultra-widefield angiography allowed determination of the extent of retinal ischemia, which was treated with laser photocoagulation after retinal reattachment. The main outcome measures were the time to rubeosis, the time to retinal reattachment, and the time to laser photocoagulation of the ischemic retina. RESULTS: Baseline characteristics were balanced between the groups, except for thicker tumors and larger retinal detachments in the bevacizumab group, potentially to the disadvantage of the study group. Nevertheless, bevacizumab prophylaxis significantly reduced the rate of iris rubeosis from 36% to 4% (log-rank test, P = .02) and tended to shorten the time to retinal reattachment, after which laser photocoagulation of the nonperfused areas could be performed. CONCLUSIONS: Prophylactic intravitreal bevacizumab in patients treated with proton therapy for uveal melanoma with ischemic retinal detachment prevented anterior segment neovascularization until laser photocoagulation of the reattached retina could be performed.
Abstract:
BACKGROUND: The risk of catheter-related infection or bacteremia with initial and extended use of femoral versus nonfemoral sites for double-lumen vascular catheters (DLVCs) during continuous renal replacement therapy (CRRT) is unclear. STUDY DESIGN: Retrospective observational cohort study. SETTING & PARTICIPANTS: Critically ill patients on CRRT in a combined intensive care unit of a tertiary institution. FACTOR: Femoral versus nonfemoral venous DLVC placement. OUTCOMES: Catheter-related colonization (CRCOL) and catheter-related bloodstream infection (CRBSI). MEASUREMENTS: CRCOL and CRBSI rates expressed per 1,000 catheter-days. RESULTS: We studied 458 patients (median age, 65 years; 60% male) and 647 DLVCs. Of 405 patients with single-site DLVC use only, 82% received exclusively femoral DLVCs (419 catheters) and 18% exclusively jugular or subclavian DLVCs (82 catheters). The corresponding DLVC indwelling durations were 6±4 versus 7±5 days (P=0.03). The corresponding CRCOL and CRBSI rates (per 1,000 catheter-days) were 9.7 versus 8.8 events (P=0.8) and 1.2 versus 3.5 events (P=0.3), respectively. Overall, 96 patients with extended CRRT received femoral-site insertion first with a subsequent site change, including 53 femoral guidewire exchanges, 53 new femoral venipunctures, and 47 new jugular/subclavian sites. CRCOL and CRBSI rates were similar for all such approaches (P=0.7 and P=0.9, respectively). On multivariate analysis, CRCOL risk was higher in patients older than 65 years and weighing >90 kg (ORs of 2.1 and 2.2, respectively; P<0.05). This association between higher weight and greater CRCOL risk was significant for femoral DLVCs, but not for nonfemoral sites. Other covariates, including initial or specific DLVC site, guidewire exchange versus new venipuncture, and primary versus secondary DLVC placement, did not significantly affect CRCOL rates. LIMITATIONS: Nonrandomized retrospective design and single-center evaluation. CONCLUSIONS: CRCOL and CRBSI rates in patients on CRRT are low and not significantly influenced by initial or serial femoral catheterization with guidewire exchange or new venipuncture. CRCOL risk is higher in older and heavier patients, the latter especially with femoral sites.
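Editor's note: to make the "events per 1,000 catheter-days" unit used above concrete, the short Python sketch below shows how such an incidence rate is computed. The event count and catheter-day total are hypothetical illustrations, not the study's data.

```python
def rate_per_1000_catheter_days(events: int, catheter_days: float) -> float:
    """Incidence rate expressed per 1,000 catheter-days."""
    return 1000.0 * events / catheter_days

# Hypothetical example (not the study's counts): 12 colonization events
# observed over 1,500 cumulative catheter-days.
print(f"{rate_per_1000_catheter_days(12, 1500):.1f} events per 1,000 catheter-days")  # 8.0
```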
Abstract:
Argatroban was introduced as an alternative parenteral anticoagulant for HIT patients in several European countries in 2005. In 2009, a panel of experts discussed their clinical experience with argatroban, balancing the risks and benefits of argatroban treatment in managing the highly procoagulant state of HIT patients. This article summarizes the main conclusions of this round-table discussion. An ongoing issue is the appropriate dosing of argatroban in special patient groups. Therefore, dosing recommendations for different HIT patient groups (ICU patients, non-ICU patients, paediatric patients, and patients undergoing renal replacement therapies) are summarized in this consensus statement. Because of the strong correlation between argatroban dosing requirements and the scores used to characterize severity of illness (APACHE, SAPS, SOFA), suitable dosing nomograms are given. This consensus statement contributes clinically relevant information on the appropriate use and monitoring of argatroban based on the current literature, and provides additional information from clinical experience. As the two other approved drugs for HIT, danaparoid and lepirudin, are either currently unavailable due to manufacturing problems (danaparoid) or will be withdrawn from the market in 2012 (lepirudin), this report should guide physicians who have limited experience with argatroban in using this drug safely in patients with HIT.
Abstract:
The suitability of IgM antibodies to PGL-1 for monitoring the response to multidrug therapy (MDT) was tested sequentially by ELISA in 105 leprosy patients, and bacterial indices (BI) were also determined. Patients were divided into 3 groups: group 1, 34 multibacillary (MB) patients treated for 12 months with MDT-MB; group 2, 33 MB patients treated for 24 months with MDT-MB; and group 3, 38 paucibacillary (PB) patients treated for 6 months with MDT-PB. Untreated MB patients in group 1 (6.95 ± 1.35) and group 2 (12.53 ± 2.02) exhibited higher antibody levels (mean ± SEM) than untreated PB patients (1.28 ± 0.35). There was a significant difference (P < 0.01) in anti-PGL-1 levels in group 1 patients between the untreated (6.95 ± 1.35) and 12-month-treated (2.78 ± 0.69) values, and in group 2 patients between the untreated (12.53 ± 2.02) and 24-month-treated (2.62 ± 0.79) values. There was no significant difference between untreated (1.28 ± 0.35) and treated (0.62 ± 0.12) PB patients. Antibody levels correlated with BI. The correlation coefficient (Pearson's r) was 0.72 before and 0.23 (P < 0.05) after treatment in group 1, and 0.67 before and 0.96 (P < 0.05) after treatment in group 2. BI was significantly reduced (P < 0.01) after 12 and 24 months of MDT (group 1: 1.26 to 0.26; group 2: 1.66 to 0.36). Our data indicate that monitoring anti-PGL-1 levels during MDT may be a sensitive tool for evaluating treatment efficacy. These data also indicate that control of leprosy infection can be obtained with 12 months of MDT in MB patients.
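Editor's note: for readers unfamiliar with the correlation statistic cited above, the minimal Python sketch below shows how a Pearson's r is computed from paired anti-PGL-1 and BI measurements. The paired values are hypothetical illustrations, not data from the study.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

# Hypothetical paired measurements (not study data): anti-PGL-1 ELISA level
# and bacterial index (BI) for six multibacillary patients.
anti_pgl1 = [6.2, 9.8, 3.1, 12.4, 7.5, 2.0]
bi        = [1.0, 1.8, 0.5, 2.3, 1.2, 0.3]

print(f"Pearson r = {pearson_r(anti_pgl1, bi):.2f}")
```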
Abstract:
Recent biotechnological advances have permitted the manipulation of genetic sequences to treat several diseases in a process called gene therapy. However, the advance of gene therapy has opened the door to the possibility of using genetic manipulation (GM) to enhance athletic performance. In such ‘gene doping’, exogenous genetic sequences are inserted into a specific tissue, altering cellular gene activity or leading to the expression of a protein product. The exogenous genes most likely to be utilized for gene doping include erythropoietin (EPO), vascular endothelial growth factor (VEGF), insulin-like growth factor type 1 (IGF-1), myostatin antagonists, and endorphin. However, many other genes could also be used, such as those involved in glucose metabolic pathways. Because gene doping would be very difficult to detect, it is inherently very attractive for those involved in sports who are prepared to cheat. Moreover, the field of gene therapy is constantly and rapidly progressing, and this is likely to generate many new possibilities for gene doping. Thus, as part of the general fight against all forms of doping, it will be necessary to develop and continually improve means of detecting exogenous gene sequences (or their products) in athletes. Nevertheless, some bioethicists have argued for a liberal approach to gene doping.
Abstract:
This research investigated the impact of stress management and relaxation techniques on psoriasis. It had a dual purpose: to see whether stress management and relaxation techniques, as an adjunct to traditional medical treatment, would improve the skin condition of psoriasis, and to provide psoriasis patients with a sense of control over their illness by educating them about the connection between mind and body through learning stress management and relaxation techniques. The former purpose was addressed quantitatively, while the latter was addressed qualitatively. Using an experimental design, the quantitative study tested the efficacy of stress management and relaxation techniques in 38 dermatological patients from St. John's, Newfoundland. The study, which lasted ten weeks, suggested a weak relationship between psoriasis and stress; this relationship was not statistically significant. The qualitative data were gathered through unstructured interviews and evaluated using descriptive/interpretative analysis. Patients in the experimental group believed in the mind-body connection as it related to their illness and stress. The findings also showed that patients believed the stress reduction and relaxation techniques improved their quality of life, their level of psoriasis, and their ability to live with the condition. Given the contradictory nature of the findings, further research is needed. Replication of this study would be greatly improved by increasing the sample size, thereby increasing the possibility of detecting significant effects. As well, increasing the length of the experiment would control for the possibility of a lag effect. Finally, this study examined only linear relationships between stress and psoriasis; further study should ascertain whether the relationship might be nonlinear.
Abstract:
Organizations offering therapeutic wilderness programming have a responsibility to ensure the well-being of their front-line employees. A system of social support, formed through communication with others either personally or professionally, can help field instructors effectively manage the demands arising from their work. Phenomenological analysis of semi-structured interview transcripts from seven participants provided insight into perceptions of the necessity, accessibility, and use of social support. Fourteen main themes and thirteen subthemes emerged from the data. Findings are presented using the six components of Parsons' (1980) staff development model and strongly suggest that program managers consider and apply specific measures aimed at increasing social support for front-line field instructors in a wilderness therapy work context.
Abstract:
This paper investigates the effectiveness of a group-based psychosocial rehabilitation program for cochlear implant patients and their frequent communication partners.