108 results for drug dose increase
Abstract:
BACKGROUND After heart transplantation (HTx), the interindividual pharmacokinetic variability of immunosuppressive drugs represents a major therapeutic challenge due to the narrow therapeutic window between over-immunosuppression causing toxicity and under-immunosuppression leading to graft rejection. Although genetic polymorphisms have been shown to influence the pharmacokinetics of immunosuppressants, data in the context of HTx are scarce. We thus assessed the role of genetic variation in CYP3A4, CYP3A5, POR, NR1I2, and ABCB1, which act jointly in immunosuppressive drug pathways, on tacrolimus (TAC) and ciclosporin (CSA) dose requirements in HTx recipients. METHODS Associations between 7 functional genetic variants and blood dose-adjusted trough (C0) concentrations of TAC and CSA at 1, 3, 6, and 12 months after HTx were evaluated in cohorts of 52 and 45 patients, respectively. RESULTS Compared with CYP3A5 nonexpressors (*3/*3 genotype), CYP3A5 expressors (*1/*3 or *1/*1 genotype) required around 2.2- to 2.6-fold higher daily TAC doses to reach the targeted C0 concentration at all studied time points (P ≤ 0.003). Additionally, POR*28 variant carriers showed higher dose-adjusted TAC-C0 concentrations at all time points, resulting in significant differences at 3 (P = 0.025) and 6 months (P = 0.047) after HTx. No significant associations were observed between the genetic variants and the CSA dose requirement. CONCLUSIONS The CYP3A5*3 variant has a major influence on the required TAC dose in HTx recipients, whereas POR*28 may additionally contribute to the observed variability. These results support the importance of genetic markers in TAC dose optimization after HTx.
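The reported genotype effect can be illustrated with a short sketch. This is not a validated dosing tool: the base dose, the genotype encoding, and the 2.4-fold multiplier (midpoint of the abstract's reported 2.2- to 2.6-fold range) are illustrative assumptions.

```python
# Illustrative sketch of CYP3A5 genotype-informed tacrolimus dosing.
# NOT a clinical dosing tool: base_dose_mg and the 2.4-fold multiplier
# (midpoint of the reported 2.2- to 2.6-fold range) are assumptions.

def suggested_tac_dose(base_dose_mg: float, cyp3a5_genotype: str) -> float:
    """Scale a standard starting dose by CYP3A5 expressor status."""
    expressors = {"*1/*1", "*1/*3"}   # functional CYP3A5 -> faster clearance
    nonexpressors = {"*3/*3"}         # no functional CYP3A5
    if cyp3a5_genotype in expressors:
        return base_dose_mg * 2.4
    if cyp3a5_genotype in nonexpressors:
        return base_dose_mg
    raise ValueError(f"unexpected CYP3A5 genotype: {cyp3a5_genotype}")

print(suggested_tac_dose(4.0, "*1/*3"))  # expressor: 9.6
print(suggested_tac_dose(4.0, "*3/*3"))  # nonexpressor: 4.0
```

In practice, dose individualization would additionally account for POR*28 status, measured trough concentrations, and drug interactions, as the abstract notes.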
Abstract:
Interactions between stressors contribute to the recently reported increase in losses of honey bee colonies. Here we demonstrated that the synergistic effect on mortality of the commonly used, low-toxicity neonicotinoid thiacloprid and the nearly ubiquitous gut parasite Nosema ceranae is dependent on the pesticide dose. Furthermore, thiacloprid had a negative influence on N. ceranae reproduction. Our results highlight that interactions among honey bee health stressors can be dynamic and should be studied across a broader range of combinations.
Abstract:
Plasma drug-resistant minority HIV-1 variants (DRMV) increase the risk of virological failure to first-line NNRTI antiretroviral therapy (ART). The origin of DRMVs in ART-naive patients, however, remains unclear. In a large pan-European case-control study investigating the clinical relevance of pre-existing DRMVs using 454 pyrosequencing, the six most prevalent plasma DRMVs detected corresponded to G-to-A nucleotide mutations (V90I, V106I, V108I, E138K, M184I and M230I). Here, we evaluated whether such DRMVs could have emerged from APOBEC3G/F activity. Of the 236 evaluated ART-naive subjects, APOBEC3G/F hypermutation signatures were detected in plasma viruses of 14 (5.9%) individuals. Samples with minority E138K, M184I, and M230I mutations, but not those with V90I, V106I, or V108I, were significantly associated with APOBEC3G/F activity (Fisher's p<0.005), defined as presence of >0.5% of sample sequences with an APOBEC3G/F signature. Mutations E138K, M184I and M230I co-occurred in the same sequence as APOBEC3G/F signatures in 3/9 (33%), 5/11 (45%) and 4/8 (50%) of samples, respectively; such linkage was not found for V90I, V106I or V108I. In-frame STOP codons were observed in 1.5% of all clonal sequences; 14.8% of them co-occurred with APOBEC3G/F signatures. APOBEC3G/F-associated E138K, M184I and M230I appeared within clonal sequences containing in-frame STOP codons in 2/3 (66%), 5/5 (100%) and 4/4 (100%) of the samples. In a reanalysis of the parent case-control study, presence of APOBEC3G/F signatures was not associated with virological failure. In conclusion, the contribution of APOBEC3G/F editing to the development of DRMVs is very limited and does not affect the efficacy of NNRTI ART.
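The signature analysis described above can be sketched minimally: count G-to-A substitutions by dinucleotide context, since APOBEC3G preferentially deaminates in a GG context and APOBEC3F in a GA context (plus strand, relative to the reference). This toy function is illustrative only; published analyses use dedicated tools such as Hypermut with proper statistical testing.

```python
# Minimal sketch of APOBEC3G/F hypermutation signature counting.
# Illustrative only; real analyses use dedicated tools (e.g. Hypermut)
# with statistical testing on aligned sequences.

def apobec_ga_counts(reference: str, query: str) -> dict:
    """Count G->A changes by reference dinucleotide context (plus strand).

    APOBEC3G preferentially mutates G in a GG context, APOBEC3F in GA.
    """
    counts = {"GG": 0, "GA": 0, "other": 0}
    for i, (r, q) in enumerate(zip(reference, query)):
        if r == "G" and q == "A":
            context = reference[i:i + 2]  # mutated base + next reference base
            counts[context if context in counts else "other"] += 1
    return counts

ref = "ATGGGATGAC"
qry = "ATGAGATAAC"  # G->A at index 3 (GG context) and index 7 (GA context)
print(apobec_ga_counts(ref, qry))  # {'GG': 1, 'GA': 1, 'other': 0}
```

A sample could then be flagged as hypermutated when context-specific G-to-A counts significantly exceed the background mutation rate, in the spirit of the >0.5% signature threshold used in the study.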
Abstract:
BACKGROUND Inability to predict the therapeutic effect of a drug in individual pain patients prolongs the process of drug and dose finding until satisfactory pharmacotherapy can be achieved. Many chronic pain conditions are associated with hypersensitivity of the nervous system or impaired endogenous pain modulation. Pharmacotherapy often aims at influencing these disturbed nociceptive processes. Its effect might therefore depend on the extent to which they are altered. Quantitative sensory testing (QST) can evaluate various aspects of pain processing and might therefore be able to predict the analgesic efficacy of a given drug. In the present study three drugs commonly used in the pharmacological management of chronic low back pain are investigated. The primary objective is to examine the ability of QST to predict pain reduction. As a secondary objective, the analgesic effects of these drugs and their effect on QST are evaluated. METHODS/DESIGN In this randomized, double blinded, placebo controlled cross-over study, patients with chronic low back pain are randomly assigned to imipramine, oxycodone or clobazam versus active placebo. QST is assessed at baseline, 1 and 2 h after drug administration. Pain intensity, side effects and patients' global impression of change are assessed at intervals of 30 min up to two hours after drug intake. Baseline QST is used as explanatory variable to predict drug effect. The change in QST over time is analyzed to describe the pharmacodynamic effects of each drug on experimental pain modalities. Genetic polymorphisms are analyzed as co-variables. DISCUSSION Pharmacotherapy is a mainstay in chronic pain treatment. Antidepressants, anticonvulsants and opioids are frequently prescribed in a "trial and error" fashion, without knowing, however, which drug best suits which patient. The present study addresses the important need to translate recent advances in pain research to clinical practice.
Assessing the predictive value of central hypersensitivity and endogenous pain modulation could allow for the implementation of a mechanism-based treatment strategy in individual patients. TRIAL REGISTRATION Clinicaltrials.gov, NCT01179828.
Abstract:
Reluctance to treat chronic hepatitis C in active intravenous (IV) drug users (IDUs) has been expressed both in international guidelines and in routine clinical practice. However, the medical literature provides no evidence for an unequivocal treatment deferral of this risk group. We retrospectively analyzed the direct effect of IV drug use on treatment outcome in 500 chronic hepatitis C patients enrolled in the Swiss Hepatitis C Cohort Study. Patients were eligible for the study if they had their serum hepatitis C virus (HCV) RNA tested 6 months after the end of treatment and at least one visit during the antiviral therapy documenting the drug use status. Five hundred patients fulfilled the inclusion criteria (199 IDUs and 301 controls). A minimum exposure to 80% of the scheduled cumulative dose of antivirals was reached in 66.0% of IDUs and 60.5% of controls (P = NS). The overall sustained virological response (SVR) rate was 63.6%. Active IDUs reached an SVR of 69.3%, not statistically significantly different from controls (59.8%). A multivariate analysis for treatment success showed no significant negative influence of active IV drug use. In conclusion, our study shows no relevant direct influence of IV drug use on the efficacy of anti-HCV therapy among adherent patients.
Abstract:
Cisplatin, a major antineoplastic drug used in the treatment of solid tumors, is a known nephrotoxin. This retrospective cohort study evaluated the prevalence and severity of cisplatin nephrotoxicity in 54 children and its impact on height and weight. We recorded the weight, height, serum creatinine, and electrolytes in each cisplatin cycle and after 12 months of treatment. Nephrotoxicity was graded as follows: normal renal function (Grade 0); asymptomatic electrolyte disorders, including an increase in serum creatinine, up to 1.5 times baseline value (Grade 1); need for electrolyte supplementation <3 months and/or increase in serum creatinine 1.5 to 1.9 times from baseline (Grade 2); increase in serum creatinine 2 to 2.9 times from baseline or need for electrolyte supplementation for more than 3 months after treatment completion (Grade 3); and increase in serum creatinine ≥3 times from baseline or renal replacement therapy (Grade 4). Nephrotoxicity was observed in 41 subjects (75.9%). Grade 1 nephrotoxicity was observed in 18 patients (33.3%), Grade 2 in 5 patients (9.2%), and Grade 3 in 18 patients (33.3%). None had Grade 4 nephrotoxicity. Patients with nephrotoxicity were younger and received a higher cisplatin dose; they also had impaired longitudinal growth, manifested as statistically significant worsening of the height Z score at 12 months after treatment. We used a multiple logistic regression model with the delta of height Z score (baseline to 12 months) as the dependent variable to adjust for the main confounders: germ cell tumor, cisplatin total dose, serum magnesium levels at 12 months, gender, and nephrotoxicity grade. Patients with nephrotoxicity Grade 1 were at higher risk of not growing (OR 5.1, 95% CI 1.07-24.3, P=0.04). The cisplatin total dose had a significant negative relationship with magnesium levels at 12 months (Spearman r=-0.527, P<0.001).
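The grading scheme above translates directly into a small decision function. The input encoding (creatinine ratio versus baseline, months of electrolyte supplementation, a flag for asymptomatic electrolyte disorders, and a renal replacement flag) is an assumption for illustration; borderline cases (e.g. exactly 3 months of supplementation) are not specified by the abstract.

```python
# Sketch of the study's cisplatin nephrotoxicity grading (Grades 0-4).
# Input encoding is assumed; the highest matching grade wins.

def nephrotoxicity_grade(creatinine_ratio: float,
                         supplementation_months: float,
                         electrolyte_disorder: bool,
                         renal_replacement: bool = False) -> int:
    if renal_replacement or creatinine_ratio >= 3.0:
        return 4  # creatinine >=3x baseline or renal replacement therapy
    if (creatinine_ratio >= 2.0) or (supplementation_months > 3):
        return 3  # creatinine 2-2.9x or supplementation >3 months
    if (creatinine_ratio >= 1.5) or (supplementation_months > 0):
        return 2  # creatinine 1.5-1.9x and/or supplementation <3 months
    if electrolyte_disorder:
        return 1  # asymptomatic disorders, creatinine rise below 1.5x
    return 0      # normal renal function

print(nephrotoxicity_grade(1.7, 0, False))  # 2
print(nephrotoxicity_grade(1.0, 4, False))  # 3
```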
Abstract:
The radiation dose rates at flight altitudes can increase by orders of magnitude for a short time during energetic solar cosmic ray events, so-called ground level enhancements (GLEs). Especially at high latitudes and flight altitudes, solar energetic particles superposed on galactic cosmic rays may cause radiation exposure that exceeds the maximum dose limit allowed for the general public. Therefore, the determination of the radiation dose rate during GLEs should be as reliable as possible. Radiation dose rates along flight paths are typically determined by computer models that are based on cosmic ray flux and anisotropy parameters derived from neutron monitor and/or satellite measurements. The characteristics of the GLE on 15 April 2001 (GLE60) were determined and published by various authors. In this work we compare these results and investigate the consequences for the computed radiation dose rates along selected flight paths. In addition, we compare the computed radiation dose rates with measurements made during GLE60 on board two transatlantic flights.
Abstract:
OBJECTIVE Cochlear implants (CI) are standard treatment for prelingually deafened children and postlingually deafened adults. Computed tomography (CT) is the standard method for postoperative imaging of the electrode position. CT scans accurately reflect electrode depth and position, which is essential prior to use. However, routine CT examinations expose patients to radiation, which is especially problematic in children. We examined whether new CT protocols could reduce radiation doses while preserving diagnostic accuracy. METHODS To investigate whether electrode position can be assessed by low-dose CT protocols, a cadaveric lamb model was used because the inner ear morphology is similar to that of humans. The scans were performed at various volumetric CT dose-index (CTDIvol)/kV combinations. For each constant CTDIvol the tube voltage was varied (i.e., 80, 100, 120 and 140kV). This procedure was repeated at different CTDIvol values (21mGy, 11mGy, 5.5mGy, 2.8mGy and 1.8mGy). To keep the CTDIvol constant at different tube voltages, the tube current values were adjusted. Independent evaluations of the images were performed by two experienced and blinded neuroradiologists. The criteria of diagnostic usefulness, image quality and artifacts (each scaled 1-4) were assessed in 14 cochlear-implanted cadaveric lamb heads with variable tube voltages. RESULTS Results showed that the standard CT dose could be substantially reduced without sacrificing diagnostic accuracy of electrode position assessment. The assessment of the CI electrode position was feasible in almost all cases down to a CTDIvol of 2-3mGy. The number of artifacts did not increase for images within this dose range as compared to higher doses. The extent of the artifacts caused by the implanted metal-containing CI electrode does not depend on the radiation dose and is not perceptibly influenced by changes in the tube voltage. In summary, evaluation of the CI electrode position is possible even at a very low radiation dose.
CONCLUSIONS CT imaging of the temporal bone for postoperative electrode position control of the CI is possible with a very low, significantly reduced radiation dose. The tube current-time product and voltage can be reduced by 50% without increasing artifacts. Low-dose postoperative CT scans are sufficient for localizing the CI electrode.
Abstract:
The long-term risk associated with different coronary artery disease (CAD) presentations in women undergoing percutaneous coronary intervention (PCI) with drug-eluting stents (DES) is poorly characterized. We pooled patient-level data for women enrolled in 26 randomized clinical trials. Of 11,577 women included in the pooled database, 10,133 with known clinical presentation received a DES. Of them, 5,760 (57%) had stable angina pectoris (SAP), 3,594 (35%) had unstable angina pectoris (UAP) or non-ST-segment-elevation myocardial infarction (NSTEMI), and 779 (8%) had ST-segment-elevation myocardial infarction (STEMI) as clinical presentation. A stepwise increase in 3-year crude cumulative mortality was observed in the transition from SAP to STEMI (4.9% vs 6.1% vs 9.4%; p <0.01). Conversely, no differences in crude mortality rates were observed between 1 and 3 years across clinical presentations. After multivariable adjustment, STEMI was independently associated with greater risk of 3-year mortality (hazard ratio [HR] 3.45; 95% confidence interval [CI] 1.99 to 5.98; p <0.01), whereas no differences were observed between UAP or NSTEMI and SAP (HR 0.99; 95% CI 0.73 to 1.34; p = 0.94). In women with acute coronary syndromes (ACS), use of new-generation DES was associated with reduced risk of major adverse cardiac events (HR 0.58; 95% CI 0.34 to 0.98). The magnitude and direction of the effect with new-generation DES was uniform between women with or without ACS (p for interaction = 0.66). In conclusion, in women across the clinical spectrum of CAD, STEMI was associated with a greater risk of long-term mortality. Conversely, the adjusted risk of mortality between UAP or NSTEMI and SAP was similar. New-generation DESs provide improved long-term clinical outcomes irrespective of the clinical presentation in women.
Abstract:
Cochlear implants are neuroprostheses that are inserted into the inner ear to directly electrically stimulate the auditory nerve, thus replacing lost cochlear receptors, the hair cells. The reduction of the gap between electrodes and nerve cells will contribute to technological solutions simultaneously increasing the frequency resolution, the sound quality and the amplification of the signal. Recent findings indicate that neurotrophins (NTs) such as brain derived neurotrophic factor (BDNF) stimulate the neurite outgrowth of auditory nerve cells by activating Trk receptors on the cellular surface (1–3). Furthermore, small-size TrkB receptor agonists such as di-hydroxyflavone (DHF) are now available, which activate the TrkB receptor with similar efficiency as BDNF, but are much more stable (4). Experimentally, such molecules are currently used to attract nerve cells towards, for example, the electrodes of cochlear implants. This paper analyses scenarios for the controlled low-dose release of small-size Trk receptor agonists from a coated CI electrode array into the inner ear. The control must first ensure a sufficient dose for the onset of neurite growth. Secondly, a gradient in concentration needs to be maintained to allow directive growth of neurites through the perilymph-filled gap towards the electrodes of the implant. We used fluorescein as a test molecule for its molecular size similarity to DHF and investigated two different transport mechanisms of drug dispensing, both of which have the potential to fulfil the requirements of controlled low-throughput drug delivery. The first is based on the release of aqueous fluorescein into water by pure osmosis through arrays of well-defined 60-μm holes in a membrane. The release was both simulated using the software COMSOL and observed experimentally. In the second approach, solid fluorescein crystals were encapsulated in a thin layer of parylene (PPX), hence creating random nanometer-sized pinholes.
In this approach, the release occurred through subsequent water diffusion through the pinholes, dissolution of the fluorescein, and release by out-diffusion. Surprisingly, the release rate of solid fluorescein through the nanoscopic holes was found to be of the same order of magnitude as that of liquid fluorescein through the microscopic holes.
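The two release mechanisms can be compared with a back-of-the-envelope steady-state diffusion estimate (Fick's first law through a short pore). All numerical values below are illustrative assumptions, not the study's measurements or its COMSOL simulation.

```python
import math

# Back-of-the-envelope steady-state diffusive release through an array of
# holes, via Fick's first law: J = n * D * A * (C_in - C_out) / L.
# All parameter values are illustrative assumptions, not study data.

def release_rate_mol_per_s(n_holes: int,
                           hole_diameter_m: float,
                           membrane_thickness_m: float,
                           diffusivity_m2_s: float,
                           conc_inside_mol_m3: float,
                           conc_outside_mol_m3: float = 0.0) -> float:
    """Total molar release rate through n identical cylindrical pores."""
    area = math.pi * (hole_diameter_m / 2) ** 2
    gradient = (conc_inside_mol_m3 - conc_outside_mol_m3) / membrane_thickness_m
    return n_holes * diffusivity_m2_s * area * gradient

# Example: 100 holes of 60 um diameter in a 10-um membrane,
# D ~ 5e-10 m^2/s (small dye molecule in water), 1 mol/m^3 inside.
rate = release_rate_mol_per_s(100, 60e-6, 10e-6, 5e-10, 1.0)
print(f"{rate:.2e} mol/s")
```

This pure-diffusion estimate ignores the osmotic flow driving the first mechanism and the dissolution kinetics of the second, but it gives a feel for how hole count, hole size, and membrane thickness trade off against each other.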
Abstract:
Fatal hyperammonemia secondary to chemotherapy for hematological malignancies or following bone marrow transplantation has been described in a few patients so far. In these patients, the pathogenesis of hyperammonemia remained unclear and was suggested to be multifactorial. We observed severe hyperammonemia (maximum 475 μmol/L) in a 2-year-old male patient, who underwent high-dose chemotherapy with carboplatin, etoposide and melphalan, and autologous hematopoietic stem cell transplantation for a neuroblastoma stage IV. Despite intensive care treatment, hyperammonemia persisted and the patient died due to cerebral edema. The biochemical profile with elevations of ammonia and glutamine (maximum 1757 μmol/L) suggested urea cycle dysfunction. In liver homogenates, enzymatic activity and protein expression of the urea cycle enzyme carbamoyl phosphate synthetase 1 (CPS1) were virtually absent. However, no mutation was found in CPS1 cDNA from liver, and CPS1 mRNA expression was only slightly decreased. We therefore hypothesized that the acute onset of hyperammonemia was due to an acquired, chemotherapy-induced (posttranscriptional) CPS1 deficiency. This was further supported by in vitro experiments in HepG2 cells treated with carboplatin and etoposide showing a dose-dependent decrease in CPS1 protein expression. Due to severe hyperlactatemia, we analysed oxidative phosphorylation complexes in liver tissue and found reduced activities of complexes I and V, which suggested a more general mitochondrial dysfunction. This study adds to the understanding of chemotherapy-induced hyperammonemia by suggesting drug-induced CPS1 deficiency as a cause. Moreover, we highlight the need for urgent diagnostic and therapeutic strategies addressing a possible secondary urea cycle failure in future patients with hyperammonemia during chemotherapy and stem cell transplantation.
Abstract:
In chronic myelogenous leukemia (CML), oncogenic BCR-ABL1 activates the Wnt pathway, which is fundamental for leukemia stem cell (LSC) maintenance. Tyrosine kinase inhibitor (TKI) treatment reduces Wnt signaling in LSCs and often results in molecular remission of CML; however, LSCs persist long term despite BCR-ABL1 inhibition, ultimately causing disease relapse. We demonstrate that TKIs induce the expression of the tumor necrosis factor (TNF) family ligand CD70 in LSCs by down-regulating microRNA-29, resulting in reduced CD70 promoter DNA methylation and up-regulation of the transcription factor specificity protein 1. The resulting increase in CD70 triggered CD27 signaling and compensatory Wnt pathway activation. Combining TKIs with CD70 blockade effectively eliminated human CD34(+) CML stem/progenitor cells in xenografts and LSCs in a murine CML model. Therefore, targeting TKI-induced expression of CD70 and compensatory Wnt signaling resulting from the CD70/CD27 interaction is a promising approach to overcoming treatment resistance in CML LSCs.
Abstract:
Ninety-one Swiss veal farms producing under a label with improved welfare standards were visited between August and December 2014 to investigate risk factors related to antimicrobial drug use and mortality. All herds consisted of own and purchased calves, with a median of 77.4% purchased calves. The calves' mean age was 29±15 days at purchase, and the fattening period lasted on average 120±28 days. The mean carcass weight was 125±12kg. A mean of 58±33 calves were fattened per farm per year, and purchased calves were bought from a mean of 20±17 farms of origin. Antimicrobial drug treatment incidence was calculated with the defined daily dose methodology. The mean treatment incidence (TIADD) was 21±15 daily doses per calf per year. The mean mortality risk was 4.1%; calves died at a mean age of 94±50 days, and the main causes of death were bovine respiratory disease (BRD, 50%) and gastro-intestinal disease (33%). Two multivariable models were constructed, one for antimicrobial drug treatment incidence (53 farms) and one for mortality (91 farms). No quarantine, shared air space for several groups of calves, and no clinical examination upon arrival at the farm were associated with increased antimicrobial treatment incidence. Maximum group size and weight differences >100kg within a group were associated with increased mortality risk, while vaccination and beef breed were associated with decreased mortality risk. The majority of antimicrobial treatments (84.6%) were given as group treatments with oral powder fed through an automatic milk feeding system. Combination products containing chlortetracycline with tylosin and sulfadimidine or with spiramycin were used for 54.9%, and amoxicillin for 43.7%, of the oral group treatments. The main indication for individual treatment was BRD (73%). The mean age at the time of treatment was 51 days, corresponding to an estimated weight of 80-100kg. Individual treatments were mainly applied by injection (88.5%) and included administration of fluoroquinolones in 38.3%, penicillins (amoxicillin or benzylpenicillin) in 25.6%, macrolides in 13.1%, tetracyclines in 12.0%, 3rd- and 4th-generation cephalosporins in 4.7%, and florfenicol in 3.9% of the cases. The present study allowed the identification of risk factors for increased antimicrobial drug treatment and mortality. This is an important basis for future studies aiming to reduce treatment incidence and mortality on veal farms. Our results indicate that improvement is needed in the selection of drugs for the treatment of veal calves according to the principles of prudent use of antibiotics.
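The defined daily dose methodology mentioned above amounts to normalizing total drug consumption by a standard daily dose and a standard animal weight, then scaling to one year. A minimal sketch follows; the DDD value, standard weight, and example figures are illustrative assumptions, not the study's parameters.

```python
# Sketch of a defined-daily-dose treatment incidence (TIADD) calculation:
# number of standard daily doses administered per calf per year.
# The DDD value and standard weight below are illustrative assumptions.

def treatment_incidence_add(total_drug_mg: float,
                            ddd_mg_per_kg_per_day: float,
                            standard_weight_kg: float,
                            n_calves: int,
                            period_days: float) -> float:
    """Daily doses per calf, scaled to one year (365 days)."""
    daily_doses = total_drug_mg / (ddd_mg_per_kg_per_day * standard_weight_kg)
    return daily_doses / n_calves * (365.0 / period_days)

# Example: 600 g of a drug with an assumed DDD of 10 mg/kg/day, on a farm
# fattening 50 calves (standard weight 100 kg) over a 120-day period.
print(round(treatment_incidence_add(600_000, 10, 100, 50, 120), 1))  # 36.5
```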
Abstract:
OBJECTIVE Due to reduction of immune-suppressive drugs, patients with rheumatic diseases can experience an increase in disease activity during pregnancy. In such cases, TNF-inhibitors may be prescribed. However, monoclonal antibodies with the Fc moiety are actively transported across the placenta, resulting in therapeutic drug levels in the newborn. As certolizumab (CZP) lacks the Fc moiety, it may bear a lower risk for the child. METHODS We report a case series of thirteen patients (5 with rheumatoid arthritis and 8 with spondyloarthritis) treated with CZP during late pregnancy to control disease activity. RESULTS CZP measured in cord blood of eleven infants ranged between undetectable levels and 1μg/mL, whereas the median CZP level in maternal plasma was 32.97μg/mL. Three women developed an infection during the third trimester, of whom one had a severe infection and one had an infection that resulted in a pre-term delivery. During the postpartum period, 6 patients remained on CZP while breastfeeding. CZP levels in the breast milk of two breastfeeding patients were undetectable. CONCLUSIONS The lack of active transplacental transfer of CZP makes it possible to treat inflammatory arthritis during late gestation without potential harm to the newborn. However, in pregnant women treated with TNF-inhibitors and prednisone, attention should be given to the increased susceptibility to infections, which might cause prematurity. CZP treatment can be continued while breastfeeding.
Abstract:
Small chemicals like drugs tend to bind to proteins via noncovalent bonds, e.g. hydrogen bonds, salt bridges or electrostatic interactions. Some chemicals interact with molecules other than their intended target, representing so-called 'off-target' activities of drugs. Such interactions are a main cause of adverse side effects of drugs and are normally classified as predictable type A reactions. Detailed analysis of drug-induced immune reactions revealed that off-target activities also affect immune receptors, such as highly polymorphic human leukocyte antigens (HLA) or T cell receptors (TCR). Such drug interactions with immune receptors may lead to T cell stimulation, resulting in clinical symptoms of delayed-type hypersensitivity. These interactions are described by the 'pharmacological interaction with immune receptors' (p-i) concept. Analysis of p-i has revealed that drugs bind preferentially or exclusively to distinct HLA molecules (p-i HLA) or to distinct TCR (p-i TCR). P-i reactions differ from 'conventional' off-target drug reactions in that the outcome is not due to the effect on the drug-modified cells themselves, but is the consequence of reactive T cells. Hence, the complex and diverse clinical manifestations of delayed-type hypersensitivity are caused by the functional heterogeneity of T cells. In the abacavir model of p-i HLA, the drug binding to HLA may result in alteration of the presented peptides. More importantly, the drug binding to HLA generates a drug-modified HLA, which stimulates T cells directly, like an allo-HLA. In the sulfamethoxazole model of p-i TCR, responsive T cells likely require costimulation for full T cell activation. These findings may explain the similarity of delayed-type hypersensitivity reactions to graft-versus-host disease, and how systemic viral infections increase the risk of delayed-type hypersensitivity reactions.