833 results for Colitis -- drug therapy
Abstract:
An extensive array of compounds has been studied for the treatment of ulcerative colitis (UC). The most frequently used nonbiologic drugs for the oral and intravenous treatment of UC include 5-aminosalicylate (5-ASA) drugs (mesalamine and derivatives), sulfasalazine and other azo-bonded 5-ASA molecules, steroids, calcineurin inhibitors (cyclosporine, tacrolimus, and sirolimus), thiopurines (azathioprine, 6-mercaptopurine), and methotrexate, which are already presented in other sections of this book and are thus not considered in this chapter. The therapies presented in this section should be considered potential alternatives, mostly for mild-to-moderate UC. They are substances mostly used without an FDA indication, such as heparin, nicotine, rosiglitazone, and N-acetylcysteine, as well as “natural” compounds suggested to have anti-inflammatory or reparative properties, such as aloe vera, curcumin, short-chain fatty acids, and Bowman-Birk inhibitor.
Abstract:
Background Tumor necrosis factor (TNF) inhibition is central to the therapy of inflammatory bowel diseases (IBD). However, loss of response (LOR) is frequent, and additional tests to help decision making with costly anti-TNF therapy are needed. Methods Consecutive IBD patients receiving anti-TNF therapy (infliximab (IFX), or adalimumab after IFX LOR) from Bern University Hospital were identified and followed prospectively. Patient whole blood was stimulated with a dose titration of two triggers, human TNF and LPS. The median fluorescence intensity of CD62L on the surface of granulocytes was quantified by surface staining with specific antibodies (CD33, CD62L) and flow cytometry; fitting logistic curves to these data permits calculation of the EC50, the half-maximal effective TNF concentration needed to induce shedding [1]. A shift in the concentration at which CD62L shedding occurred was seen before and after administration of the anti-TNF agent, which permits prediction of the response to the drug. This predicted response was correlated with the clinical evolution of the patients in order to analyze the ability of this test to identify LOR to IFX. Results We collected prospective clinical data and blood samples, before and after anti-TNF agent administration, on 33 IBD patients (25 Crohn's disease (CD) and 8 ulcerative colitis (UC) patients; 45% female) between June 2012 and November 2013. The assay showed functional blockade by IFX (predicted functional response, PFR) for 22 patients (17 CD and 5 UC), whereas 11 (8 CD and 3 UC) had no functional response (NR) to IFX. Clinical characteristics (e.g. diagnosis, disease location, smoking status, BMI, and number of infusions) were not significantly different between predicted PFR and NR. Among the 22 patients with PFR, only 1 was a clinical non-responder (LOR to IFX), based on prospective clinical evaluation by IBD gastroenterologists (PJ, AM), and among the 11 predicted NR, 3 had no clinical LOR. The sensitivity of this test was 95%, the specificity 73%, and the AUC adjusted for age and gender was 0.81 (Figure 1). During follow-up (median 10 months, range 3–15), 8 "hard" outcomes occurred (3 medical flares, 4 resections, and 1 new fistula), 2 in the PFR group and 6 in the NR group (25% vs. 75%; p < 0.01). The correlation with clinical response is presented in Figure 2 (clinical response vs. log EC50 change: 1 = no, 2 = partial, 3 = complete clinical response). Conclusion CD62L (L-selectin) shedding is the first validated test of functional blockade of TNF alpha in anti-TNF-treated IBD patients and will be a useful tool to guide medical decisions on the use of anti-TNF agents. Comparative studies with ATI and IFX trough levels are ongoing. 1. Patuto N, Slack E, Seibold F, Macpherson AJ (2011). Quantitating Anti-TNF Functionality to Inform Dosing and Choice of Therapy. Gastroenterology 140(5, Suppl. 1): S689.
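The EC50 readout described in this abstract comes from fitting a logistic (sigmoidal) curve to granulocyte CD62L MFI measured across a TNF dose titration. Below is a minimal sketch of that fitting step, assuming a four-parameter logistic model; the concentrations and MFI values are illustrative, not data from the study.

```python
# Sketch: estimate EC50 by fitting a four-parameter logistic curve to CD62L
# median fluorescence intensity (MFI) measured over a TNF dose titration.
# All numbers below are hypothetical and only illustrate the procedure.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ec50, hill):
    """Four-parameter logistic: MFI as a function of TNF concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

# Hypothetical dose titration (ng/mL) and granulocyte CD62L MFI readings.
tnf_conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
cd62l_mfi = np.array([950.0, 930.0, 820.0, 540.0, 260.0, 150.0, 130.0])

popt, _ = curve_fit(
    four_pl, tnf_conc, cd62l_mfi,
    p0=[100.0, 1000.0, 3.0, 1.0],  # rough starting guesses
    maxfev=10000,
)
bottom, top, ec50, hill = popt
print(f"Estimated EC50: {ec50:.2f} ng/mL")
# Comparing log(EC50) before and after anti-TNF dosing gives the shift the
# assay uses to classify functional blockade (PFR) vs. no functional response (NR).
```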
Abstract:
OBJECTIVES Pre-antiretroviral therapy (ART) inflammation and coagulation activation predict clinical outcomes in HIV-positive individuals. We assessed whether pre-ART inflammatory marker levels predicted the CD4 count response to ART. METHODS Analyses were based on data from the Strategic Management of Antiretroviral Therapy (SMART) trial, an international trial evaluating continuous vs. interrupted ART, and the Flexible Initial Retrovirus Suppressive Therapies (FIRST) trial, which evaluated three first-line ART regimens with at least two drug classes. For this analysis, participants had to be ART-naïve or off ART at randomization and (re)starting ART, and to have C-reactive protein (CRP), interleukin-6 (IL-6) and D-dimer measured pre-ART. Using random-effects linear models, we assessed the association between each of the biomarker levels, categorized as quartiles, and change in CD4 count from ART initiation to 24 months post-ART. Analyses adjusted for CD4 count at ART initiation (baseline), study arm, follow-up time and other known confounders. RESULTS Overall, 1084 individuals [659 from SMART (26% ART naïve) and 425 from FIRST] met the eligibility criteria, providing 8264 CD4 count measurements. Seventy-five per cent of individuals were male, with a mean age of 42 years. The median (interquartile range) baseline CD4 counts were 416 (350-530) and 100 (22-300) cells/μL in SMART and FIRST, respectively. All of the biomarkers were inversely associated with baseline CD4 count in FIRST but not in SMART. In adjusted models, there was no clear relationship between increasing biomarker levels and mean change in CD4 count post-ART (P for trend: CRP, P = 0.97; IL-6, P = 0.25; and D-dimer, P = 0.29). CONCLUSIONS Pre-ART inflammation and coagulation activation do not predict the CD4 count response to ART and appear to influence the risk of clinical outcomes through mechanisms other than blunting long-term CD4 count gain.
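The random-effects analysis described above amounts to a linear mixed model: repeated CD4 count changes regressed on biomarker quartile and confounders, with a random intercept per participant. The sketch below uses a synthetic dataset and hypothetical column names purely to show the model structure, not the actual SMART/FIRST data.

```python
# Sketch: random-effects (mixed) linear model of CD4 change on IL-6 quartile,
# adjusted for baseline CD4, study arm and follow-up time. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
visits = [6, 12, 18, 24]                         # months post-ART
base = pd.DataFrame({
    "pid": np.arange(n),                         # participant identifier
    "il6_quartile": rng.integers(1, 5, n),       # pre-ART IL-6 quartile (1-4)
    "baseline_cd4": rng.normal(300, 120, n).clip(10),
    "study_arm": rng.integers(0, 2, n),
})
df = base.loc[base.index.repeat(len(visits))].reset_index(drop=True)
df["months"] = np.tile(visits, n)
df["cd4_change"] = 8 * df["months"] + rng.normal(0, 60, len(df))

model = smf.mixedlm(
    "cd4_change ~ C(il6_quartile) + baseline_cd4 + C(study_arm) + months",
    data=df,
    groups=df["pid"],        # random intercept for repeated measures per participant
)
print(model.fit().summary())  # inspect the quartile coefficients and their trend
```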
Abstract:
OBJECTIVES Gender-specific data on the outcome of combination antiretroviral therapy (cART) are a subject of controversy. We aimed to compare treatment responses between genders in a setting of equal access to cART over a 14-year period. METHODS Analyses included treatment-naïve participants in the Swiss HIV Cohort Study starting cART between 1998 and 2011 and were restricted to patients infected through heterosexual contact or injecting drug use, excluding men who have sex with men. RESULTS A total of 3925 patients (1984 men and 1941 women) were included in the analysis. Women were younger and had higher CD4 cell counts and lower HIV RNA at baseline than men. Women were less likely to achieve virological suppression (< 50 HIV-1 RNA copies/mL) at 1 year (75.2% versus 78.1% of men; P = 0.029) and at 2 years (77.5% versus 81.1%, respectively; P = 0.008), whereas no difference between sexes was observed at 5 years (81.3% versus 80.5%, respectively; P = 0.635). The probability of virological suppression increased in both genders over time (test for trend, P < 0.001). The median increase in CD4 cell count at 1, 2 and 5 years was generally higher in women throughout the study period, and it gradually improved over time in both sexes (P < 0.001). Women were also more likely to switch or stop treatment during the first year of cART, and stops were only partly driven by pregnancy. In multivariate analysis, after adjustment for sociodemographic factors, HIV-related factors, cART and calendar period, female gender was no longer associated with lower odds of virological suppression. CONCLUSIONS Gender inequalities in the response to cART are mainly explained by the different prevalence of socioeconomic characteristics in women compared with men.
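The multivariate analysis mentioned above is, in essence, a logistic regression of virological suppression on gender plus the listed confounders, with the gender coefficient reported as an adjusted odds ratio. A minimal sketch on synthetic data with hypothetical variable names (not the Swiss HIV Cohort Study dataset):

```python
# Sketch: adjusted odds ratio for female gender from a logistic regression of
# 1-year virological suppression on gender and confounders. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 3000
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "age": rng.normal(38, 10, n),
    "baseline_cd4": rng.normal(350, 150, n).clip(1),
    "log10_rna": rng.normal(4.5, 0.8, n),
    "idu": rng.integers(0, 2, n),        # injecting drug use vs. heterosexual contact
    "period": rng.integers(0, 3, n),     # calendar period
})
logit_p = 1.2 - 0.002 * (400 - df["baseline_cd4"]) - 0.2 * df["idu"]
df["suppressed_1y"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit(
    "suppressed_1y ~ female + age + baseline_cd4 + log10_rna + C(idu) + C(period)",
    data=df,
).fit(disp=False)
print(np.exp(model.params["female"]))   # adjusted odds ratio, women vs. men
```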
Abstract:
OBJECTIVE To systematically review evidence on genetic variants influencing outcomes during warfarin therapy and provide practice recommendations addressing the key questions: (1) Should genetic testing be performed in patients with an indication for warfarin therapy to improve achievement of stable anticoagulation and reduce adverse effects? (2) Are there subgroups of patients who may benefit more from genetic testing than others? (3) How should patients with an indication for warfarin therapy be managed based on their genetic test results? METHODS A systematic literature search was performed for VKORC1 and CYP2C9 and their association with warfarin therapy. Evidence was critically appraised, and clinical practice recommendations were developed based on expert group consensus. RESULTS Testing of VKORC1 (-1639G>A), CYP2C9*2, and CYP2C9*3 should be considered for all patients, including pediatric patients, within the first 2 weeks of therapy or after a bleeding event. Testing for CYP2C9*5, *6, *8, or *11 and CYP4F2 (V433M) is currently not recommended. Testing should also be considered for all patients who are at increased risk of bleeding complications, who consistently show out-of-range international normalized ratios, or who suffer adverse events while receiving warfarin. Genotyping results should be interpreted using a pharmacogenetic dosing algorithm to estimate the required dose. SIGNIFICANCE This review provides the latest update on genetic markers for warfarin therapy and clinical practice recommendations as a basis for informed decision making regarding genotype-guided dosing in patients with an indication for warfarin therapy, and it identifies knowledge gaps to guide future research.
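Pharmacogenetic dosing algorithms of the kind recommended above typically regress the square root of the stable weekly dose on genotype counts and clinical covariates. The sketch below only illustrates that structure; its coefficients are placeholders invented for illustration, not a validated model such as the published IWPC algorithm, which should be used in practice.

```python
# Sketch: structure of a genotype-guided warfarin dose estimate. Coefficients
# are illustrative placeholders only; do not use for dosing.
def estimated_weekly_dose_mg(age_decades, height_cm, weight_kg,
                             vkorc1_a_alleles, cyp2c9_star2_alleles,
                             cyp2c9_star3_alleles, on_amiodarone):
    """Return an illustrative warfarin weekly dose estimate in mg."""
    sqrt_dose = (
        5.6                        # intercept (illustrative)
        - 0.26 * age_decades       # dose falls with age
        + 0.009 * height_cm
        + 0.013 * weight_kg
        - 0.85 * vkorc1_a_alleles      # each VKORC1 -1639 A allele lowers the dose
        - 0.50 * cyp2c9_star2_alleles  # reduced-function CYP2C9 alleles lower it further
        - 0.90 * cyp2c9_star3_alleles
        - 0.55 * (1 if on_amiodarone else 0)
    )
    return max(sqrt_dose, 0.0) ** 2    # such models are fit on the sqrt-of-dose scale

# Example: 65-year-old, 170 cm, 70 kg, VKORC1 A/G, CYP2C9 *1/*3, no amiodarone.
print(estimated_weekly_dose_mg(6.5, 170, 70,
                               vkorc1_a_alleles=1,
                               cyp2c9_star2_alleles=0,
                               cyp2c9_star3_alleles=1,
                               on_amiodarone=False))
```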
Abstract:
Plasma drug-resistant minority HIV-1 variants (DRMVs) increase the risk of virological failure of first-line NNRTI antiretroviral therapy (ART). The origin of DRMVs in ART-naive patients, however, remains unclear. In a large pan-European case-control study investigating the clinical relevance of pre-existing DRMVs using 454 pyrosequencing, the six most prevalent plasma DRMVs detected corresponded to G-to-A nucleotide mutations (V90I, V106I, V108I, E138K, M184I and M230I). Here, we evaluated whether such DRMVs could have emerged from APOBEC3G/F activity. Out of 236 evaluated ART-naïve subjects, APOBEC3G/F hypermutation signatures were detected in plasma viruses of 14 (5.9%) individuals. Samples with minority E138K, M184I, and M230I mutations, but not those with V90I, V106I, or V108I, were significantly associated with APOBEC3G/F activity (Fisher's p<0.005), defined as the presence of >0.5% of sample sequences with an APOBEC3G/F signature. Mutations E138K, M184I and M230I co-occurred in the same sequence as APOBEC3G/F signatures in 3/9 (33%), 5/11 (45%) and 4/8 (50%) of samples, respectively; such linkage was not found for V90I, V106I or V108I. In-frame STOP codons were observed in 1.5% of all clonal sequences; 14.8% of them co-occurred with APOBEC3G/F signatures. APOBEC3G/F-associated E138K, M184I and M230I appeared within clonal sequences containing in-frame STOP codons in 2/3 (66%), 5/5 (100%) and 4/4 (100%) of the samples. In a reanalysis of the parent case-control study, the presence of APOBEC3G/F signatures was not associated with virological failure. In conclusion, the contribution of APOBEC3G/F editing to the development of DRMVs is very limited and does not affect the efficacy of NNRTI ART.
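The association reported above (Fisher's p < 0.005) is a 2x2 exact test of mutation presence against APOBEC3G/F signature presence across samples. A minimal sketch of that test with illustrative counts (not the study's actual contingency table):

```python
# Sketch: Fisher's exact test of whether samples carrying a given minority
# mutation (e.g. E138K) are enriched for APOBEC3G/F hypermutation signatures.
# Counts below are hypothetical.
from scipy.stats import fisher_exact

#                       APOBEC3G/F signature   no signature
# mutation present                a                  b
# mutation absent                 c                  d
table = [[9, 20],
         [5, 202]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, Fisher's exact p = {p_value:.4f}")
```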
Abstract:
The combined use of androgen deprivation therapy (ADT) and image-guided radiotherapy (IGRT) can improve overall survival in aggressive, localized prostate cancer. However, owing to the adverse effects of prolonged ADT, it is imperative to identify the patients who would benefit from this combined-modality therapy relative to the use of IGRT alone. Opportunities exist for more personalized approaches in treating aggressive, locally advanced prostate cancer. Biomarkers--such as disseminated tumour cells, circulating tumour cells, genomic signatures and molecular imaging techniques--could identify the patients who are at greatest risk for systemic metastases and who would benefit from the addition of systemic ADT. By contrast, when biomarkers of systemic disease are not present, treatment could proceed using local IGRT alone. The choice of drug, treatment duration and timing of ADT relative to IGRT could be predicated on these personalized approaches to prostate cancer medicine. These novel treatment intensification and reduction strategies could result in improved prostate-cancer-specific survival and overall survival, without incurring the added expense of metabolic syndrome and other adverse effects of ADT in all patients.
Abstract:
Minimal residual disease (MRD) is a major hurdle in the eradication of malignant tumors. Despite the high sensitivity of various cancers to treatment, some residual cancer cells persist and lead to tumor recurrence and treatment failure. Obvious reasons for residual disease include mechanisms of secondary therapy resistance, such as the presence of mutant cells that are insensitive to the drugs, or the presence of cells that become drug resistant due to activation of survival pathways. In addition to such unambiguous resistance modalities, several patients with relapsing tumors do not show refractory disease and respond again when the initial therapy is repeated. These cases cannot be explained by the selection of mutant tumor cells, and the precise mechanisms underlying this clinical drug resistance are ill-defined. In the current review, we put special emphasis on cell-intrinsic and -extrinsic mechanisms that may explain mechanisms of MRD that are independent of secondary therapy resistance. In particular, we show that studying genetically engineered mouse models (GEMMs), which highly resemble the disease in humans, provides a complementary approach to understand MRD. In these animal models, specific mechanisms of secondary resistance can be excluded by targeted genetic modifications. This allows a clear distinction between the selection of cells with stable secondary resistance and mechanisms that result in the survival of residual cells but do not provoke secondary drug resistance. Mechanisms that may explain the latter feature include special biochemical defense properties of cancer stem cells, metabolic peculiarities such as the dependence on autophagy, drug-tolerant persisting cells, intratumoral heterogeneity, secreted factors from the microenvironment, tumor vascularization patterns and immunosurveillance-related factors. We propose in the current review that a common feature of these various mechanisms is cancer cell dormancy. Therefore, dormant cancer cells appear to be an important target in the attempt to eradicate residual cancer cells, and eventually cure patients who repeatedly respond to anticancer therapy but lack complete tumor eradication.
Abstract:
AIMS In the Dual Antiplatelet Therapy (DAPT) study, continued thienopyridine beyond 12 months after drug-eluting stent placement was associated with increased mortality compared with placebo. We sought to evaluate factors related to mortality in randomized patients receiving either drug-eluting or bare metal stents in the DAPT study. METHODS AND RESULTS Patients were enrolled after coronary stenting, given thienopyridine and aspirin for 12 months, randomly assigned to continued thienopyridine or placebo for an additional 18 months (while taking aspirin), and subsequently treated with aspirin alone for another 3 months. A blinded independent adjudication committee evaluated deaths. Among 11 648 randomized patients, rates of all-cause mortality were 1.9% vs. 1.5% (continued thienopyridine vs. placebo, P = 0.07), cardiovascular mortality 1.0% vs. 1.0% (P = 0.97), and non-cardiovascular mortality 0.9% vs. 0.5% (P = 0.01) over the randomized period (Months 12-30). Rates of fatal bleeding were 0.2% vs. 0.1% (P = 0.81), and deaths related to any prior bleeding were 0.3% vs. 0.2% (P = 0.36; Months 12-33). Cancer incidence did not differ (2.0% vs. 1.6%, P = 0.12). Cancer-related deaths occurred in 0.6% vs. 0.3% (P = 0.02) and were rarely related to bleeding (0.1% vs. 0, P = 0.25). After excluding those occurring in patients with cancer diagnosed before enrolment, rates were 0.4% vs. 0.3% (P = 0.16). CONCLUSION Bleeding accounted for a minority of deaths among patients treated with continued thienopyridine. Cancer-related death in association with thienopyridine therapy was mainly unrelated to bleeding and may be a chance finding. Caution is warranted when considering extended thienopyridine therapy in patients with advanced cancer. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00977938.
Abstract:
OBJECTIVES This study sought to compare rates of stent thrombosis and major adverse cardiac and cerebrovascular events (MACCE) (a composite of death, myocardial infarction, or stroke) after coronary stenting with drug-eluting stents (DES) versus bare-metal stents (BMS) in patients who participated in the DAPT (Dual Antiplatelet Therapy) study, an international multicenter randomized trial comparing 30 versus 12 months of dual antiplatelet therapy in subjects undergoing coronary stenting with either DES or BMS. BACKGROUND Despite the antirestenotic efficacy of coronary DES compared with BMS, the relative risk of stent thrombosis and adverse cardiovascular events is unclear. Many clinicians perceive BMS to be associated with fewer adverse ischemic events and to require shorter-duration dual antiplatelet therapy than DES. METHODS A prospective propensity-matched analysis of subjects enrolled in a randomized trial of dual antiplatelet therapy duration was performed. DES- and BMS-treated subjects were propensity-score matched in a many-to-one fashion. The study design was observational for all subjects from 0 to 12 months following stenting. A subset of eligible subjects without major ischemic or bleeding events was randomized at 12 months to continued thienopyridine versus placebo; all subjects were followed through 33 months. RESULTS Among 10,026 propensity-matched subjects, DES-treated subjects (n = 8,308) had a lower rate of stent thrombosis through 33 months compared with BMS-treated subjects (n = 1,718; 1.7% vs. 2.6%; weighted risk difference -1.1%, p = 0.01) and a noninferior rate of MACCE (11.4% vs. 13.2%, respectively; weighted risk difference -1.8%, p = 0.053, noninferiority p < 0.001). CONCLUSIONS DES-treated subjects have long-term rates of stent thrombosis that are lower than those of BMS-treated subjects. (The Dual Antiplatelet Therapy Study [DAPT study]; NCT00977938).
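The many-to-one propensity matching described in the methods can be sketched in two steps: estimate each subject's probability of receiving a DES from baseline covariates, then match each BMS subject to the nearest DES subjects on that score. The synthetic data, covariate names, and matching ratio below are assumptions for illustration, not the DAPT analysis itself.

```python
# Sketch: propensity-score estimation and many-to-one nearest-neighbor matching
# of DES- vs. BMS-treated subjects. Data are synthetic and covariates hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "age": rng.normal(63, 10, n),
    "diabetes": rng.integers(0, 2, n),
    "prior_mi": rng.integers(0, 2, n),
})
# Treatment assignment depends on covariates, so raw groups are not comparable.
p_des = 1 / (1 + np.exp(-(1.5 - 0.02 * (df["age"] - 63) + 0.3 * df["diabetes"])))
df["des"] = (rng.random(n) < p_des).astype(int)

covariates = ["age", "diabetes", "prior_mi"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["des"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]   # P(DES | covariates)

# Many-to-one matching: each BMS subject is matched to its k nearest DES subjects.
des, bms = df[df["des"] == 1], df[df["des"] == 0]
k = 4
nn = NearestNeighbors(n_neighbors=k).fit(des[["pscore"]])
_, idx = nn.kneighbors(bms[["pscore"]])
matched_des = des.iloc[idx.ravel()]
print(len(bms), "BMS subjects matched to", len(matched_des), "DES rows")
# Outcome rates (e.g. stent thrombosis) are then compared within matched sets.
```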
Abstract:
The choice and duration of antiplatelet therapy for secondary prevention of coronary artery disease (CAD) is determined by the clinical context and treatment strategy. Oral antiplatelet agents for secondary prevention include the cyclo-oxygenase-1 inhibitor aspirin and the ADP-dependent P2Y12 inhibitors clopidogrel, prasugrel and ticagrelor. Aspirin constitutes the cornerstone of secondary prevention of CAD and is complemented by clopidogrel in patients with stable CAD undergoing percutaneous coronary intervention. Among patients with acute coronary syndrome, prasugrel and ticagrelor improve net clinical outcome by reducing ischaemic adverse events at the expense of an increased risk of bleeding compared with clopidogrel. Prasugrel appears particularly effective among patients with ST elevation myocardial infarction in reducing the risk of stent thrombosis compared with clopidogrel, and offers a greater net clinical benefit among patients with diabetes than among those without. Ticagrelor is associated with reduced mortality without increasing the rate of coronary artery bypass graft (CABG)-related bleeding as compared with clopidogrel. Dual antiplatelet therapy should be continued for a minimum of 1 year among patients with acute coronary syndrome irrespective of stent type; among patients with stable CAD treated with new-generation drug-eluting stents, available data suggest no benefit of prolonging antiplatelet treatment beyond 6 months.
Abstract:
BACKGROUND The benefits and risks of prolonged dual antiplatelet therapy may be different for patients with acute myocardial infarction (MI) compared with more stable presentations. OBJECTIVES This study sought to assess the benefits and risks of 30 versus 12 months of dual antiplatelet therapy among patients undergoing coronary stent implantation with and without MI. METHODS The Dual Antiplatelet Therapy Study, a randomized double-blind, placebo-controlled trial, compared 30 versus 12 months of dual antiplatelet therapy after coronary stenting. The effect of continued thienopyridine on ischemic and bleeding events among patients initially presenting with versus without MI was assessed. The coprimary endpoints were definite or probable stent thrombosis and major adverse cardiovascular and cerebrovascular events (MACCE). The primary safety endpoint was GUSTO (Global Utilization of Streptokinase and Tissue Plasminogen Activator for Occluded Arteries) moderate or severe bleeding. RESULTS Of 11,648 randomized patients (9,961 treated with drug-eluting stents, 1,687 with bare-metal stents), 30.7% presented with MI. Between 12 and 30 months, continued thienopyridine reduced stent thrombosis compared with placebo in patients with and without MI at presentation (MI group, 0.5% vs. 1.9%, p < 0.001; no MI group, 0.4% vs. 1.1%, p < 0.001; interaction p = 0.69). The reduction in MACCE for continued thienopyridine was greater for patients with MI (3.9% vs. 6.8%; p < 0.001) compared with those with no MI (4.4% vs. 5.3%; p = 0.08; interaction p = 0.03). In both groups, continued thienopyridine reduced MI (2.2% vs. 5.2%, p < 0.001 for MI; 2.1% vs. 3.5%, p < 0.001 for no MI; interaction p = 0.15) but increased bleeding (1.9% vs. 0.8%, p = 0.005 for MI; 2.6% vs. 1.7%, p = 0.007 for no MI; interaction p = 0.21). CONCLUSIONS Compared with 12 months of therapy, 30 months of dual antiplatelet therapy reduced the risk of stent thrombosis and MI in patients with and without MI, and increased bleeding. (The Dual Antiplatelet Therapy Study [The DAPT Study]; NCT00977938).
Abstract:
Although recent guidelines recommend the combination of calcium channel blockers (CCBs) and thiazide(-like) diuretics, this combination is not widely used in clinical practice. The aim of this meta-analysis was to assess the efficacy and safety of this combination regarding the following endpoints: all-cause and cardiovascular mortality, myocardial infarction, and stroke. Four studies with a total of 30,791 patients met the inclusion criteria. The CCB/thiazide(-like) diuretic combination was associated with a significant risk reduction for myocardial infarction (risk ratio [RR], 0.83; 95% confidence interval [CI], 0.73-0.95) and stroke (RR, 0.77; CI, 0.64-0.92) compared with other combinations, whereas it was similarly effective to other combinations in reducing the risk of all-cause (RR, 0.89; CI, 0.75-1.06) and cardiovascular (RR, 0.89; CI, 0.71-1.10) mortality. Elderly patients with isolated systolic hypertension may particularly benefit from such a combination, since both drug classes have been shown to confer cerebrovascular protection.
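Pooled risk ratios like those quoted above are the output of an inverse-variance meta-analysis on the log risk-ratio scale. A minimal fixed-effect sketch with placeholder per-study estimates (not the four included trials):

```python
# Sketch: fixed-effect inverse-variance pooling of risk ratios.
# Per-study values below are hypothetical placeholders.
import numpy as np

# Hypothetical per-study risk ratios and 95% CIs for myocardial infarction.
rr = np.array([0.80, 0.85, 0.90, 0.78])
ci_low = np.array([0.65, 0.70, 0.72, 0.60])
ci_high = np.array([0.98, 1.03, 1.12, 1.01])

log_rr = np.log(rr)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE recovered from CI width
weights = 1.0 / se**2                                   # inverse-variance weights

pooled_log_rr = np.sum(weights * log_rr) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
pooled_rr = np.exp(pooled_log_rr)
ci = np.exp(pooled_log_rr + np.array([-1.96, 1.96]) * pooled_se)
print(f"Pooled RR = {pooled_rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```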
Abstract:
Tyrosine kinase inhibitors represent today's treatment of choice in chronic myeloid leukemia (CML). Allogeneic hematopoietic stem cell transplantation (HSCT) is regarded as salvage therapy. The prospective randomized CML study IIIA recruited 669 patients with newly diagnosed CML between July 1997 and January 2004 from 143 centers. Of these, 427 patients were considered eligible for HSCT and were randomized, by availability of a matched family donor, between primary HSCT (group A; N=166 patients) and best available drug treatment (group B; N=261). The primary end point was long-term survival. Survival probabilities were not different between groups A and B (10-year survival: 0.76 (95% confidence interval (CI): 0.69-0.82) vs 0.69 (95% CI: 0.61-0.76)), but were influenced by disease and transplant risk. Patients with a low transplant risk showed superior survival compared with patients with high- (P<0.001) and non-high-risk disease (P=0.047) in group B; after entering blast crisis, survival was not different with or without HSCT. Significantly more patients in group A were in molecular remission (56% vs 39%; P=0.005) and free of drug treatment (56% vs 6%; P<0.001). Differences in symptoms and Karnofsky score were not significant. In the era of tyrosine kinase inhibitors, HSCT remains a valid option when both disease and transplant risk are considered. Leukemia advance online publication, 20 November 2015; doi:10.1038/leu.2015.281.
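The 10-year survival probabilities and confidence intervals quoted above are typically Kaplan-Meier estimates computed per randomization group. A minimal sketch on a synthetic per-patient dataset (group labels, follow-up times and event indicators are assumptions for illustration):

```python
# Sketch: Kaplan-Meier estimate of 10-year survival per randomization group.
# Data are synthetic; the real study dataset is not reproduced here.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 400
df = pd.DataFrame({
    "group": np.repeat(["A (HSCT)", "B (drug)"], n // 2),
    "years": rng.exponential(25, n).clip(0, 12),   # follow-up capped at 12 years
})
# Event indicator: death observed before the end of follow-up, otherwise censored.
df["died"] = ((df["years"] < 12) & (rng.random(n) < 0.7)).astype(int)

for group, sub in df.groupby("group"):
    km = KaplanMeierFitter()
    km.fit(sub["years"], event_observed=sub["died"], label=group)
    surv_10y = km.survival_function_at_times(10).iloc[0]
    print(f"{group}: 10-year survival = {surv_10y:.2f}")
    # km.confidence_interval_ holds the pointwise 95% CI band around each curve.
```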