927 results for recommended drug dose


Relevance: 30.00%

Abstract:

BACKGROUND Dual antiplatelet therapy is recommended after coronary stenting to prevent thrombotic complications, yet the benefits and risks of treatment beyond 1 year are uncertain. METHODS Patients were enrolled after they had undergone a coronary stent procedure in which a drug-eluting stent was placed. After 12 months of treatment with a thienopyridine drug (clopidogrel or prasugrel) and aspirin, patients were randomly assigned to continue receiving thienopyridine treatment or to receive placebo for another 18 months; all patients continued receiving aspirin. The coprimary efficacy end points were stent thrombosis and major adverse cardiovascular and cerebrovascular events (a composite of death, myocardial infarction, or stroke) during the period from 12 to 30 months. The primary safety end point was moderate or severe bleeding. RESULTS A total of 9961 patients were randomly assigned to continue thienopyridine treatment or to receive placebo. Continued treatment with thienopyridine, as compared with placebo, reduced the rates of stent thrombosis (0.4% vs. 1.4%; hazard ratio, 0.29 [95% confidence interval {CI}, 0.17 to 0.48]; P<0.001) and major adverse cardiovascular and cerebrovascular events (4.3% vs. 5.9%; hazard ratio, 0.71 [95% CI, 0.59 to 0.85]; P<0.001). The rate of myocardial infarction was lower with thienopyridine treatment than with placebo (2.1% vs. 4.1%; hazard ratio, 0.47; P<0.001). The rate of death from any cause was 2.0% in the group that continued thienopyridine therapy and 1.5% in the placebo group (hazard ratio, 1.36 [95% CI, 1.00 to 1.85]; P=0.05). The rate of moderate or severe bleeding was increased with continued thienopyridine treatment (2.5% vs. 1.6%, P=0.001). An elevated risk of stent thrombosis and myocardial infarction was observed in both groups during the 3 months after discontinuation of thienopyridine treatment. CONCLUSIONS Dual antiplatelet therapy beyond 1 year after placement of a drug-eluting stent, as compared with aspirin therapy alone, significantly reduced the risks of stent thrombosis and major adverse cardiovascular and cerebrovascular events but was associated with an increased risk of bleeding. (Funded by a consortium of eight device and drug manufacturers and others; DAPT ClinicalTrials.gov number, NCT00977938.).
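
As a rough numerical illustration only (the trial's published estimates are time-to-event hazard ratios from a Cox model), the crude risk ratios implied by the reported event percentages can be computed directly. The sketch below assumes roughly equal arm sizes of about 9961/2, which is an approximation.

```python
# Back-of-the-envelope check of the reported event rates (illustration only;
# the published analysis used time-to-event hazard ratios, not crude risk ratios).
import math

n_per_arm = 9961 // 2  # assumed ~1:1 randomization

endpoints = {
    # endpoint: (thienopyridine %, placebo %)
    "stent thrombosis": (0.4, 1.4),
    "MACCE (death, MI, or stroke)": (4.3, 5.9),
    "myocardial infarction": (2.1, 4.1),
    "moderate/severe bleeding": (2.5, 1.6),
}

for name, (p_treat, p_placebo) in endpoints.items():
    rr = p_treat / p_placebo  # crude risk ratio
    # Wald 95% CI on the log risk ratio, using approximate event counts per arm
    a = p_treat / 100 * n_per_arm
    b = p_placebo / 100 * n_per_arm
    se = math.sqrt(1 / a - 1 / n_per_arm + 1 / b - 1 / n_per_arm)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    print(f"{name}: crude RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

For stent thrombosis this gives a crude risk ratio of about 0.29, close to the reported hazard ratio, which is expected when follow-up is similar in both arms.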

Relevance: 30.00%

Abstract:

BACKGROUND After heart transplantation (HTx), the interindividual pharmacokinetic variability of immunosuppressive drugs represents a major therapeutic challenge due to the narrow therapeutic window between over-immunosuppression causing toxicity and under-immunosuppression leading to graft rejection. Although genetic polymorphisms have been shown to influence pharmacokinetics of immunosuppressants, data in the context of HTx are scarce. We thus assessed the role of genetic variation in CYP3A4, CYP3A5, POR, NR1I2, and ABCB1 acting jointly in immunosuppressive drug pathways in tacrolimus (TAC) and ciclosporin (CSA) dose requirement in HTx recipients. METHODS Associations between 7 functional genetic variants and blood dose-adjusted trough (C0) concentrations of TAC and CSA at 1, 3, 6, and 12 months after HTx were evaluated in cohorts of 52 and 45 patients, respectively. RESULTS Compared with CYP3A5 nonexpressors (*3/*3 genotype), CYP3A5 expressors (*1/*3 or *1/*1 genotype) required around 2.2- to 2.6-fold higher daily TAC doses to reach the targeted C0 concentration at all studied time points (P ≤ 0.003). Additionally, the POR*28 variant carriers showed higher dose-adjusted TAC-C0 concentrations at all time points resulting in significant differences at 3 (P = 0.025) and 6 months (P = 0.047) after HTx. No significant associations were observed between the genetic variants and the CSA dose requirement. CONCLUSIONS The CYP3A5*3 variant has a major influence on the required TAC dose in HTx recipients, whereas the POR*28 may additionally contribute to the observed variability. These results support the importance of genetic markers in TAC dose optimization after HTx.
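
The dose-adjusted trough metric used in such analyses is simply the measured C0 divided by the daily dose. The sketch below illustrates it with hypothetical patient values (not study data), assuming C0 in ng/mL and dose in mg.

```python
# Minimal sketch of the dose-adjusted trough metric: C0 (ng/mL) per mg of daily
# tacrolimus dose. All patient records and genotype labels are hypothetical.
from statistics import mean

patients = [
    # (CYP3A5 genotype, trough C0 in ng/mL, daily dose in mg)
    ("*3/*3", 8.2, 4.0),   # nonexpressor
    ("*3/*3", 7.5, 3.5),
    ("*1/*3", 7.9, 9.0),   # expressor: similar trough despite a much higher dose
    ("*1/*3", 8.4, 10.0),
]

def dose_adjusted_c0(c0_ng_ml, dose_mg):
    """Dose-adjusted trough concentration: (ng/mL) per mg of daily dose."""
    return c0_ng_ml / dose_mg

expressors = [dose_adjusted_c0(c0, d) for g, c0, d in patients if g != "*3/*3"]
nonexpressors = [dose_adjusted_c0(c0, d) for g, c0, d in patients if g == "*3/*3"]
print(f"mean dose-adjusted C0, CYP3A5 expressors:    {mean(expressors):.2f}")
print(f"mean dose-adjusted C0, CYP3A5 nonexpressors: {mean(nonexpressors):.2f}")
```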

Relevance: 30.00%

Abstract:

OBJECTIVE To systematically review evidence on genetic variants influencing outcomes during warfarin therapy and provide practice recommendations addressing the key questions: (1) Should genetic testing be performed in patients with an indication for warfarin therapy to improve achievement of stable anticoagulation and reduce adverse effects? (2) Are there subgroups of patients who may benefit more from genetic testing compared with others? (3) How should patients with an indication for warfarin therapy be managed based on their genetic test results? METHODS A systematic literature search was performed for VKORC1 and CYP2C9 and their association with warfarin therapy. Evidence was critically appraised, and clinical practice recommendations were developed based on expert group consensus. RESULTS Testing of VKORC1 (-1639G>A), CYP2C9*2, and CYP2C9*3 should be considered for all patients, including pediatric patients, within the first 2 weeks of therapy or after a bleeding event. Testing for CYP2C9*5, *6, *8, or *11 and CYP4F2 (V433M) is currently not recommended. Testing should also be considered for all patients who are at increased risk of bleeding complications, who consistently show out-of-range international normalized ratios, or suffer adverse events while receiving warfarin. Genotyping results should be interpreted using a pharmacogenetic dosing algorithm to estimate the required dose. SIGNIFICANCE This review provides the latest update on genetic markers for warfarin therapy, clinical practice recommendations as a basis for informed decision making regarding the use of genotype-guided dosing in patients with an indication for warfarin therapy, and identifies knowledge gaps to guide future research.
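
For context, genotype-guided dosing algorithms of the kind referred to here are regression models on the square-root-of-dose scale that combine clinical covariates with VKORC1 and CYP2C9 genotype terms. The sketch below shows only that general shape; every coefficient is a hypothetical placeholder, not a value from the published IWPC or Gage algorithms.

```python
# Sketch of the general shape of a pharmacogenetic warfarin dosing algorithm.
# All coefficients are hypothetical placeholders for illustration only; they are
# NOT the published IWPC or Gage model coefficients.
def predicted_weekly_dose_mg(age_years, vkorc1_1639_a_alleles,
                             cyp2c9_star2, cyp2c9_star3, amiodarone=False):
    sqrt_dose = (
        6.0                               # hypothetical intercept
        - 0.25 * (age_years // 10)        # dose decreases with age
        - 0.9 * vkorc1_1639_a_alleles     # 0, 1 or 2 copies of VKORC1 -1639A
        - 0.5 * cyp2c9_star2              # copies of CYP2C9*2
        - 1.0 * cyp2c9_star3              # copies of CYP2C9*3
        - 0.6 * (1 if amiodarone else 0)  # interacting co-medication
    )
    return max(sqrt_dose, 0.0) ** 2       # model is fitted on the square-root scale

# Example: 65-year-old, VKORC1 -1639 AA, CYP2C9 *1/*3
print(f"{predicted_weekly_dose_mg(65, 2, 0, 1):.1f} mg/week (illustrative only)")
```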

Relevance: 30.00%

Abstract:

BACKGROUND Inability to predict the therapeutic effect of a drug in individual pain patients prolongs the process of drug and dose finding until satisfactory pharmacotherapy can be achieved. Many chronic pain conditions are associated with hypersensitivity of the nervous system or impaired endogenous pain modulation. Pharmacotherapy often aims at influencing these disturbed nociceptive processes. Its effect might therefore depend on the extent to which they are altered. Quantitative sensory testing (QST) can evaluate various aspects of pain processing and might therefore be able to predict the analgesic efficacy of a given drug. In the present study three drugs commonly used in the pharmacological management of chronic low back pain are investigated. The primary objective is to examine the ability of QST to predict pain reduction. As a secondary objective, the analgesic effects of these drugs and their effect on QST are evaluated. METHODS/DESIGN In this randomized, double-blind, placebo-controlled cross-over study, patients with chronic low back pain are randomly assigned to imipramine, oxycodone or clobazam versus active placebo. QST is assessed at baseline, 1 and 2 h after drug administration. Pain intensity, side effects and patients' global impression of change are assessed at 30-min intervals up to 2 h after drug intake. Baseline QST is used as an explanatory variable to predict drug effect. The change in QST over time is analyzed to describe the pharmacodynamic effects of each drug on experimental pain modalities. Genetic polymorphisms are analyzed as co-variables. DISCUSSION Pharmacotherapy is a mainstay in chronic pain treatment. Antidepressants, anticonvulsants and opioids are frequently prescribed in a "trial and error" fashion, without knowing, however, which drug best suits which patient. The present study addresses the important need to translate recent advances in pain research to clinical practice. Assessing the predictive value of central hypersensitivity and endogenous pain modulation could allow for the implementation of a mechanism-based treatment strategy in individual patients. TRIAL REGISTRATION Clinicaltrials.gov, NCT01179828.
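
A minimal sketch of the analysis idea, regressing drug-induced pain reduction on a baseline QST measure; all variable names and numbers are hypothetical, and the actual study analysis will involve cross-over and covariate adjustments not shown here.

```python
# Sketch: does baseline QST (e.g. a pressure pain threshold) predict the pain
# reduction observed after drug intake? Simple ordinary least squares on
# hypothetical data, for illustration only.
def ols(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

baseline_qst = [210, 340, 180, 400, 260, 310]     # kPa, hypothetical thresholds
pain_reduction = [3.1, 1.2, 3.6, 0.8, 2.4, 1.5]   # NRS points, hypothetical

slope, intercept = ols(baseline_qst, pain_reduction)
print(f"predicted reduction = {intercept:.2f} + {slope:.4f} * baseline QST")
```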

Relevance: 30.00%

Abstract:

Reluctance has been expressed about treating chronic hepatitis C in active intravenous (IV) drug users (IDUs), both in international guidelines and in routine clinical practice. However, the medical literature provides no evidence for an unequivocal treatment deferral of this risk group. We retrospectively analyzed the direct effect of IV drug use on treatment outcome in 500 chronic hepatitis C patients enrolled in the Swiss Hepatitis C Cohort Study. Patients were eligible for the study if they had their serum hepatitis C virus (HCV) RNA tested 6 months after the end of treatment and at least one visit during antiviral therapy documenting their drug use status. Five hundred patients fulfilled the inclusion criteria (199 IDUs and 301 controls). A minimum exposure to 80% of the scheduled cumulative dose of antivirals was reached in 66.0% of IDUs and 60.5% of controls (P = NS). The overall sustained virological response (SVR) rate was 63.6%. Active IDUs reached an SVR of 69.3%, which was not statistically significantly different from controls (59.8%). A multivariate analysis for treatment success showed no significant negative influence of active IV drug use. In conclusion, our study shows no relevant direct influence of IV drug use on the efficacy of anti-HCV therapy among adherent patients.
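
The exposure criterion used above (at least 80% of the scheduled cumulative antiviral dose) reduces to a simple ratio; the sketch below illustrates it with a hypothetical regimen, not actual study data.

```python
# Sketch of the adherence/exposure metric: did a patient receive at least 80%
# of the scheduled cumulative antiviral dose? The regimen figures are hypothetical.
def reached_80_percent(received_mg, scheduled_mg):
    return received_mg / scheduled_mg >= 0.80

# Hypothetical example: 1000 mg of ribavirin daily scheduled for 180 days,
# but treatment stopped after 150 days.
scheduled = 180 * 1000
received = 150 * 1000
print(reached_80_percent(received, scheduled))  # True (about 83% of the scheduled dose)
```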

Relevance: 30.00%

Abstract:

BACKGROUND High-dose benzodiazepine (BZD) dependence is associated with a wide variety of negative health consequences. Affected individuals are reported to suffer from severe mental disorders and are often unable to achieve long-term abstinence via recommended discontinuation strategies. Although it is increasingly understood that treatment interventions should take subjective experiences and beliefs into account, the perceptions of this group of individuals remain under-investigated. METHODS We conducted an exploratory qualitative study with 41 adult subjects meeting criteria for (high-dose) BZD-dependence, as defined by ICD-10. One-on-one in-depth interviews allowed for an exploration of this group's views on the reasons behind their initial and then continued use of BZDs, as well as their procurement strategies. Mayring's qualitative content analysis was used to evaluate our data. RESULTS In this sample, all participants had developed explanatory models for why they began using BZDs. We identified a multitude of reasons that we grouped into four broad categories, as explaining continued BZD use: (1) to cope with symptoms of psychological distress or mental disorder other than substance use, (2) to manage symptoms of physical or psychological discomfort associated with somatic disorder, (3) to alleviate symptoms of substance-related disorders, and (4) for recreational purposes, that is, sensation-seeking and other social reasons. Subjects often considered BZDs less dangerous than other substances and associated their use more often with harm reduction than as recreational. Specific obtainment strategies varied widely: the majority of participants oscillated between legal and illegal methods, often relying on the black market when faced with treatment termination. CONCLUSIONS Irrespective of comorbidity, participants expressed a clear preference for medically related explanatory models for their BZD use. We therefore suggest that clinicians consider patients' motives for long-term, high-dose BZD use when formulating treatment plans for this patient group, especially since it is known that individuals are more compliant with approaches they perceive to be manageable, tolerable, and effective.

Relevance: 30.00%

Abstract:

BACKGROUND High-dose benzodiazepine dependence constitutes a major clinical concern. Although withdrawal treatment is recommended, it is unsuccessful for a significant proportion of affected patients. More recently, a benzodiazepine maintenance approach has been suggested as an alternative for patients' failing discontinuation treatment. While there is some data supporting its effectiveness, patients' perceptions of such an intervention have not been investigated. METHODS An exploratory qualitative study was conducted among a sample of 41 high-dose benzodiazepine (BZD)-dependent patients, with long-term use defined as doses equivalent to more than 40 mg diazepam per day and/or otherwise problematic use, such as mixing substances, dose escalation, recreational use, or obtainment by illegal means. A qualitative content analysis approach was used to evaluate findings. RESULTS Participants generally favored a treatment discontinuation approach with abstinence from BZD as its ultimate aim, despite repeated failed attempts at withdrawal. A maintenance treatment approach with continued prescription of a slow-onset, long-acting agonist was viewed ambivalently, with responses ranging from positive and welcoming to rejection. Three overlapping themes of maintenance treatment were identified: "Only if I can try to discontinue…and please don't call it that," "More stability and less criminal activity…and that is why I would try it," and "No cure, no brain and no flash…and thus, just for everybody else!" CONCLUSIONS Some patients experienced slow-onset, long-acting BZDs as having stabilized their symptoms and viewed these BZDs as having helped avoid uncontrolled withdrawal and abstain from criminal activity. We therefore encourage clinicians to consider treatment alternatives if discontinuation strategies fail.

Relevance: 30.00%

Abstract:

Cochlear implants are neuroprostheses that are inserted into the inner ear to directly electrically stimulate the auditory nerve, thus replacing lost cochlear receptors, the hair cells. Reducing the gap between electrodes and nerve cells will contribute to technological solutions that simultaneously increase the frequency resolution, the sound quality and the amplification of the signal. Recent findings indicate that neurotrophins (NTs) such as brain-derived neurotrophic factor (BDNF) stimulate the neurite outgrowth of auditory nerve cells by activating Trk receptors on the cellular surface (1–3). Furthermore, small-size TrkB receptor agonists such as di-hydroxyflavone (DHF) are now available, which activate the TrkB receptor with an efficiency similar to that of BDNF but are much more stable (4). Experimentally, such molecules are currently used to attract nerve cells towards, for example, the electrodes of cochlear implants. This paper analyses low-dose scenarios for the controlled release of small-size Trk receptor agonists from the coated CI electrode array into the inner ear. The control must first ensure a sufficient dose for the onset of neurite growth. Secondly, a concentration gradient needs to be maintained to allow directed growth of neurites through the perilymph-filled gap towards the electrodes of the implant. We used fluorescein as a test molecule because of its similarity in molecular size to DHF and investigated two different transport mechanisms of drug dispensing, both of which have the potential to fulfil the requirements of controlled low-throughput drug delivery. The first is based on the release of aqueous fluorescein into water through well-defined arrays of 60-μm holes in a membrane by pure osmosis. The release was both simulated using the software COMSOL and observed experimentally. In the second approach, solid fluorescein crystals were encapsulated in a thin layer of parylene (PPX), hence creating random nanometer-sized pinholes. In this approach, the release occurred through subsequent water diffusion through the pinholes, dissolution of the fluorescein and then release by out-diffusion. Surprisingly, the release rate of solid fluorescein through the nanoscopic holes was found to be of the same order of magnitude as that of liquid fluorescein released through microscopic holes.
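
For intuition about the diffusive pathway, a crude order-of-magnitude estimate can be made with the classical steady-state result that diffusion through a small disc-shaped aperture into a large bath carries roughly 4·D·C·a per hole (ignoring membrane thickness and osmotic flow). All numbers below are assumptions chosen only to illustrate the scale of such an estimate; this is not the COMSOL model used in the work.

```python
# Order-of-magnitude sketch of diffusive release through an array of small
# circular apertures: steady-state flow per disc-shaped hole ~ 4 * D * C * a.
# All numerical values are hypothetical.
D = 5e-10        # m^2/s, assumed diffusion coefficient of a fluorescein-sized molecule in water
C = 1.0          # mol/m^3, assumed reservoir concentration (~1 mM)
a = 30e-6        # m, hole radius (60-um-diameter holes)
n_holes = 10     # assumed number of holes in the membrane

flow_per_hole = 4 * D * C * a          # mol/s through one aperture
total_flow = n_holes * flow_per_hole
print(f"~{total_flow:.1e} mol/s total release")  # ~6e-13 mol/s for these assumptions
```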

Relevance: 30.00%

Abstract:

Fatal hyperammonemia secondary to chemotherapy for hematological malignancies or following bone marrow transplantation has been described in few patients so far. In these, the pathogenesis of hyperammonemia remained unclear and was suggested to be multifactorial. We observed severe hyperammonemia (maximum 475 μmol/L) in a 2-year-old male patient, who underwent high-dose chemotherapy with carboplatin, etoposide and melphalan, and autologous hematopoietic stem cell transplantation for a neuroblastoma stage IV. Despite intensive care treatment, hyperammonemia persisted and the patient died due to cerebral edema. The biochemical profile with elevations of ammonia and glutamine (maximum 1757 μmol/L) suggested urea cycle dysfunction. In liver homogenates, enzymatic activity and protein expression of the urea cycle enzyme carbamoyl phosphate synthetase 1 (CPS1) were virtually absent. However, no mutation was found in CPS1 cDNA from liver and CPS1 mRNA expression was only slightly decreased. We therefore hypothesized that the acute onset of hyperammonemia was due to an acquired, chemotherapy-induced (posttranscriptional) CPS1 deficiency. This was further supported by in vitro experiments in HepG2 cells treated with carboplatin and etoposide showing a dose-dependent decrease in CPS1 protein expression. Due to severe hyperlactatemia, we analysed oxidative phosphorylation complexes in liver tissue and found reduced activities of complexes I and V, which suggested a more general mitochondrial dysfunction. This study adds to the understanding of chemotherapy-induced hyperammonemia as drug-induced CPS1 deficiency is suggested. Moreover, we highlight the need for urgent diagnostic and therapeutic strategies addressing a possible secondary urea cycle failure in future patients with hyperammonemia during chemotherapy and stem cell transplantation.

Relevance: 30.00%

Abstract:

Ninety-one Swiss veal farms producing under a label with improved welfare standards were visited between August and December 2014 to investigate risk factors related to antimicrobial drug use and mortality. All herds consisted of own and purchased calves, with a median of 77.4% purchased calves. The calves' mean age was 29 ± 15 days at purchase and the fattening period lasted on average 120 ± 28 days. The mean carcass weight was 125 ± 12 kg. A mean of 58 ± 33 calves were fattened per farm and year, and purchased calves were bought from a mean of 20 ± 17 farms of origin. Antimicrobial drug treatment incidence was calculated with the defined daily dose methodology. The mean treatment incidence (TIADD) was 21 ± 15 daily doses per calf and year. The mean mortality risk was 4.1%, calves died at a mean age of 94 ± 50 days, and the main causes of death were bovine respiratory disease (BRD, 50%) and gastro-intestinal disease (33%). Two multivariable models were constructed: one for antimicrobial drug treatment incidence (53 farms) and one for mortality (91 farms). No quarantine, shared air space for several groups of calves, and no clinical examination upon arrival at the farm were associated with increased antimicrobial treatment incidence. Maximum group size and weight differences >100 kg within a group were associated with increased mortality risk, while vaccination and beef breed were associated with decreased mortality risk. The majority of antimicrobial treatments (84.6%) were given as group treatments with oral powder fed through an automatic milk feeding system. Combination products containing chlortetracycline with tylosin and sulfadimidine or with spiramycin were used for 54.9%, and amoxicillin for 43.7%, of the oral group treatments. The main indication for individual treatment was BRD (73%). The mean age at the time of treatment was 51 days, corresponding to an estimated weight of 80–100 kg. Individual treatments were mainly applied through injections (88.5%), and included administration of fluoroquinolones in 38.3%, penicillins (amoxicillin or benzylpenicillin) in 25.6%, macrolides in 13.1%, tetracyclines in 12.0%, 3rd- and 4th-generation cephalosporins in 4.7%, and florfenicol in 3.9% of the cases. The present study allowed the identification of risk factors for increased antimicrobial drug treatment and mortality. This is an important basis for future studies aiming at reducing treatment incidence and mortality in veal farms. Our results indicate that improvement is needed in the selection of drugs for the treatment of veal calves according to the principles of prudent use of antibiotics.
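
The defined (animal) daily dose methodology mentioned above divides the total amount of active substance used by a standard daily dose and a standard animal weight to obtain a number of daily doses, here expressed per calf and year. The sketch below shows one plausible form of that calculation; the drug amount, ADD value, standard weight and normalization are hypothetical and may differ from the study's exact definitions.

```python
# Sketch of a defined/animal daily dose (ADD) treatment-incidence calculation:
# how many standard daily doses were dispensed per calf and year. All inputs
# are hypothetical illustrations.
def treatment_incidence_add(total_mg_used, add_mg_per_kg_day, standard_weight_kg,
                            n_calves, fattening_days):
    daily_doses = total_mg_used / (add_mg_per_kg_day * standard_weight_kg)
    per_calf = daily_doses / n_calves
    return per_calf * (365 / fattening_days)   # normalize to doses per calf and year

# Hypothetical farm: 1.2 kg of amoxicillin used in a fattening period,
# ADD 20 mg/kg/day, 80 kg standard weight, 58 calves, 120-day fattening period.
print(f"{treatment_incidence_add(1.2e6, 20, 80, 58, 120):.1f} daily doses per calf and year")
```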

Relevance: 30.00%

Abstract:

OBJECTIVE Recent small single-center data indicate that the current hemodynamic parameters used to diagnose critical limb ischemia are insensitive. We investigated the validity of the societal guidelines-recommended hemodynamic parameters against core laboratory-adjudicated angiographic data from the multicenter IN.PACT DEEP (RandomIzed AmPhirion DEEP DEB vs StAndard PTA for the treatment of below the knee Critical limb ischemia) Trial. METHODS Of the 358 patients in the IN.PACT DEEP Trial to assess drug-eluting balloon vs standard balloon angioplasty for infrapopliteal disease, 237 had isolated infrapopliteal disease with an available ankle-brachial index (ABI), and only 40 of the latter had available toe pressure measurements. The associations between ABI, ankle pressure, and toe pressure with tibial runoff, Rutherford category, and plantar arch were examined according to the cutoff points recommended by the societal guidelines. Abnormal tibial runoff was defined as severely stenotic (≥70%) or occluded and scored as one-, two-, or three-vessel disease. A stenotic or occluded plantar arch was considered abnormal. RESULTS Only 14 of 237 patients (6%) had an ABI <0.4. Abnormal ankle pressure, defined as <50 mm Hg if Rutherford category 4 and <70 mm Hg if Rutherford category 5 or 6, was found only in 37 patients (16%). Abnormal toe pressure, defined as <30 mm Hg if Rutherford category 4 and <50 mm Hg if Rutherford category 5 or 6, was found in 24 of 40 patients (60%) with available measurements. Importantly, 29% of these 24 patients had an ABI within normal reference ranges. A univariate multinomial logistic regression found no association between the above hemodynamic parameters and the number of diseased infrapopliteal vessels. However, there was a significant paradoxic association where patients with Rutherford category 6 had higher ABI and ankle pressure than those with Rutherford category 5. Similarly, there was no association between ABI and pedal arch patency. CONCLUSIONS The current recommended hemodynamic parameters fail to identify a significant portion of patients with lower extremity ulcers and angiographically proven severe disease. Toe pressure has better sensitivity and should be considered in all patients with critical limb ischemia.
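
The guideline cutoffs cited above translate into simple threshold logic; the sketch below applies them for Rutherford categories 4 to 6 (the function and its example values are illustrative, not part of the study).

```python
# Sketch of the threshold logic described above: flag hemodynamic measurements
# as abnormal given the Rutherford category (assumes categories 4-6, i.e.
# critical limb ischemia). Example values are hypothetical.
def abnormal_hemodynamics(rutherford, abi=None, ankle_mmHg=None, toe_mmHg=None):
    flags = {}
    if abi is not None:
        flags["abi"] = abi < 0.4
    if ankle_mmHg is not None:
        flags["ankle"] = ankle_mmHg < (50 if rutherford == 4 else 70)
    if toe_mmHg is not None:
        flags["toe"] = toe_mmHg < (30 if rutherford == 4 else 50)
    return flags

# Example: Rutherford 5 patient with a "normal" ABI but a low toe pressure
print(abnormal_hemodynamics(5, abi=0.95, ankle_mmHg=85, toe_mmHg=42))
# -> {'abi': False, 'ankle': False, 'toe': True}
```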

Relevance: 30.00%

Abstract:

Pregnant BALB/c mice have been widely used as an in vivo model to study Neospora caninum infection biology and to provide proof-of-concept for assessments of drugs and vaccines against neosporosis. The fact that this model has been used with different isolates of variable virulence, varying infection routes and differing methods to prepare the parasites for infection has rendered the comparison of results from different laboratories impossible. In most studies, mice were infected with a similar number of parasites (2 × 10^6) as that employed in ruminant models (10^7 for cows and 10^6 for sheep), which seems inappropriate considering the enormous differences in the weight of these species. Thus, for achieving meaningful results in vaccination and drug efficacy experiments, a refinement and standardization of this experimental model is necessary. Accordingly, 2 × 10^6, 10^5, 10^4, 10^3 and 10^2 tachyzoites of the highly virulent and well-characterised Nc-Spain7 isolate were subcutaneously inoculated into mice at day 7 of pregnancy, and clinical outcome, vertical transmission, parasite burden and antibody responses were compared. Dams from all infected groups presented nervous signs, and the percentage of surviving pups at day 30 postpartum was surprisingly low (24%) in mice infected with only 10^2 tachyzoites. Importantly, infection with 10^5 tachyzoites resulted in antibody levels, cerebral parasite burden in dams and a 100% mortality rate in pups identical to those observed after infection with 2 × 10^6 tachyzoites. Considering these results, it is reasonable to lower the challenge dose to 10^5 tachyzoites in further experiments when assessing drugs or vaccine candidates.

Relevance: 30.00%

Abstract:

Maternal ingestion of high concentrations of radon-222 (Rn-222) in drinking water during pregnancy may pose a significant radiation hazard to the developing embryo. The effects of ionizing radiation on the embryo and fetus have been the subject of research, analyses, and the development of a number of radiation dosimetric models for a variety of radionuclides. Currently, essentially all of the biokinetic and dosimetric models that have been developed by national and international radiation protection agencies and organizations recommend calculating the dose to the mother's uterus as a surrogate for estimating the dose to the embryo. Heretofore, the traditional radiation dosimetry models have considered neither the embryo as a distinct and rapidly developing entity, nor the fact that it is implanted in the endometrial layer of the uterus, nor the physiological interchanges that take place between maternal and embryonic cells following the implantation of the blastocyst in the endometrium. The purpose of this research was to propose a new approach and mathematical model for calculating the absorbed radiation dose to the embryo by utilizing a semiclassical treatment of alpha particle decay and subsequent scattering of energy deposition in uterine and embryonic tissue. The new approach and model were compared and contrasted with the currently recommended biokinetic and dosimetric models for estimating the radiation dose to the embryo. The results obtained in this research demonstrate that the estimated absorbed dose for an embryo implanted in the endometrial layer of the uterus during the fifth week of embryonic development is greater than the estimated absorbed dose for an embryo implanted in the uterine muscle on the last day of the eighth week of gestation. This research provides compelling evidence that the recommended methodologies and dosimetric models of the Nuclear Regulatory Commission and International Commission on Radiological Protection employed for calculating the radiation dose to the embryo from maternal intakes of radionuclides, including maternal ingestion of Rn-222 in drinking water, would result in an underestimation of dose.
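
For orientation, the basic absorbed-dose arithmetic underlying any such model is energy deposited per unit tissue mass, D = N·E/m. The sketch below shows only this simplest form with assumed inputs; it does not reproduce the proposed semiclassical alpha-scattering model or the NRC/ICRP biokinetic models.

```python
# Very simplified absorbed-dose arithmetic: energy deposited by alpha decays in
# a small tissue mass, D = N * E / m (1 Gy = 1 J/kg). Ignores daughter products,
# geometry and scattering; all inputs are hypothetical illustrations.
MEV_TO_J = 1.602e-13
E_alpha_J = 5.5 * MEV_TO_J   # ~5.5 MeV alpha particle from Rn-222 decay
n_decays = 1.0e4             # assumed number of decays depositing energy locally
tissue_mass_kg = 1.0e-6      # ~1 mg of embryonic/endometrial tissue (assumption)

dose_gray = n_decays * E_alpha_J / tissue_mass_kg
print(f"absorbed dose ~ {dose_gray:.2e} Gy")
```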

Relevance: 30.00%

Abstract:

Objective. Itraconazole is recommended life-long for preventing relapse of disseminated histoplasmosis in HIV-infected patients. I sought to determine if serum itraconazole levels are affected by the type of Highly Active Anti-Retroviral Therapy (NNRTI or PI) being taken concomitantly to treat HIV. Design. Retrospective cohort. Methods. De-identified data were used from an IRB-approved parent study which identified patients on HAART and maintenance itraconazole for confirmed disseminated histoplasmosis between January 2003 and December 2006. Available itraconazole blood levels were abstracted as well as medications taken by each patient at the time of the blood tests. Mean itraconazole levels were compared using Student's t-test. Results. 11 patients met study criteria. Patient characteristics were: median age 36 years; 91% men; 18% white, 18% black, 55% Hispanic and 9% Asian; median CD4 cell count 120 cells/mm3. 14 blood levels were available for analysis: 8 on PI, 4 on NNRTI and 2 on both. 8/8 itraconazole levels obtained while taking concomitant PI were therapeutic (>0.4 μg/mL) in contrast to 0/4 obtained while taking NNRTI. Two patients switched from NNRTI to PI and reached therapeutic levels. Mean levels on NNRTI (0.05 μg/mL, s.d. 0.0) and on PI (2.45 μg/mL, s.d. 0.21) for these two patients were compared via a paired t-test (t = 16.00, d.f. = 1, P = 0.04). Remaining patient levels were compared using an unpaired t-test. Mean itraconazole on concomitant PI (n = 6) was 1.37 μg/mL (s.d. 0.74), while the mean on concomitant NNRTI was 0.05 μg/mL (s.d. 0.0), t = 2.39, d.f. = 6, P = 0.05. Conclusions. Co-administration of NNRTI and itraconazole results in significant decreases in itraconazole blood levels, likely by inducing the CYP3A4 enzyme system. Itraconazole drug levels should be monitored in patients on concomitant NNRTI. PI-based HAART may be preferred over NNRTI-based HAART when using itraconazole to treat HIV-infected patients with disseminated histoplasmosis.
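
The reported unpaired comparison can be reconstructed from the summary statistics given above with a pooled-variance t-test; an NNRTI group size of 2 is assumed here, which is consistent with the reported t = 2.39 and d.f. = 6.

```python
# Reconstructing the unpaired (pooled-variance) t-test from the summary statistics:
# PI group mean 1.37 ug/mL (s.d. 0.74, n = 6) vs NNRTI group mean 0.05 ug/mL
# (s.d. 0.0); n = 2 for the NNRTI group is an assumption consistent with d.f. = 6.
import math

def pooled_t(mean1, sd1, n1, mean2, sd2, n2):
    df = n1 + n2 - 2
    sp2 = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df  # pooled variance
    t = (mean1 - mean2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, df

t, df = pooled_t(1.37, 0.74, 6, 0.05, 0.0, 2)
print(f"t = {t:.2f}, d.f. = {df}")   # ~2.39 and 6, matching the reported values
```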

Relevance: 30.00%

Abstract:

5-aza-2'-deoxycytidine (DAC) is a cytidine analogue that strongly inhibits DNA methylation and was recently approved for the treatment of myelodysplastic syndromes (MDS). To maximize clinical results with DAC, we investigated its use as an anti-cancer drug. We also investigated mechanisms of resistance to DAC in vitro in cancer cell lines and in vivo in MDS patients after relapse. We found that DAC sensitized cells to the effect of 1-β-D-arabinofuranosylcytosine (Ara-C). The combination of DAC and Ara-C or Ara-C following DAC showed additive or synergistic effects on cell death in four human leukemia cell lines in vitro, but antagonism in terms of global methylation, RIL gene activation and H3 Lys-9 acetylation of short interspersed elements (Alu). One possible explanation is that hypomethylated cells are sensitized to cell killing by Ara-C. Turning to resistance, we found that the IC50 of DAC differed 1000-fold among the cell lines and was correlated with the dose of DAC that induced peak hypomethylation of long interspersed nuclear elements (LINE) (r=0.94, P<0.001), but not with LINE methylation at baseline (r=0.05, P=0.97). Sensitivity to DAC did not significantly correlate with sensitivity to another hypomethylating agent, 5-azacytidine (AZA) (r=0.44, P=0.11). The cell lines most resistant to DAC had low dCK, hENT1, and hENT2 transporters and high cytosine deaminase (CDA). In an HL60 leukemia cell line, resistance to DAC could be rapidly induced by drug exposure, and was related to a switch from monoallelic to biallelic mutation of dCK or loss of the wild-type DCK allele. Furthermore, we showed that DAC induced DNA breaks, as evidenced by histone H2AX phosphorylation, and increased homologous recombination rates 7- to 10-fold. Finally, we found there were no dCK mutations in MDS patients after relapse. Cytogenetics showed that three of the patients acquired new abnormalities at relapse. These data suggest that in vitro spontaneous and acquired resistance to DAC can be explained by insufficient incorporation of drug into DNA. In vivo resistance to DAC is likely due to methylation-independent pathways such as chromosome changes. The lack of cross-resistance between DAC and AZA is of potential clinical relevance, as is the combination of DAC and Ara-C.
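
As an aside on how an IC50 of the kind compared above can be read off a dose-response experiment, the sketch below interpolates on the log-dose scale between the two concentrations bracketing 50% viability; the dose-response values are hypothetical, not data from this work.

```python
# Sketch: estimate an IC50 by log-linear interpolation between the two tested
# concentrations that bracket 50% viability. Viability values are hypothetical
# fractions of an untreated control.
import math

doses_nM  = [1, 10, 100, 1000, 10000]
viability = [0.98, 0.85, 0.60, 0.30, 0.10]

def ic50_interp(doses, viab, target=0.5):
    for (d1, v1), (d2, v2) in zip(zip(doses, viab), zip(doses[1:], viab[1:])):
        if v1 >= target >= v2:
            frac = (v1 - target) / (v1 - v2)       # position between the two points
            return 10 ** (math.log10(d1) + frac * (math.log10(d2) - math.log10(d1)))
    return None  # 50% viability never crossed within the tested range

print(f"IC50 ~ {ic50_interp(doses_nM, viability):.0f} nM")  # ~215 nM for these data
```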