70 results for Route of drug intake
Abstract:
BACKGROUND: The most prevalent drug hypersensitivity reactions are T-cell mediated. The only established in vitro test for detecting T-cell sensitization to drugs is the lymphocyte transformation test, which is of limited practicability. To find an alternative in vitro method to detect drug-sensitized T cells, we screened the in vitro secretion of 17 cytokines/chemokines by peripheral blood mononuclear cells (PBMC) of patients with well-documented drug allergies, in order to identify the most promising cytokines/chemokines for detection of T-cell sensitization to drugs. METHODS: PBMC of 10 patients, five allergic to beta-lactams and five to sulfonamides, and of five healthy controls were incubated for 3 days with the drug antigen. Cytokine concentrations were measured in the supernatants using commercially available 17-plex bead-based immunoassay kits. RESULTS: Among the 17 cytokines/chemokines analysed, secretion of interleukin-2 (IL-2), IL-5, IL-13 and interferon-gamma (IFN-gamma) in response to the drugs was significantly increased in patients compared with healthy controls. No difference in cytokine secretion patterns between sulfonamide- and beta-lactam-reactive PBMC could be observed. The secretion of other cytokines/chemokines showed high variability among patients. CONCLUSION: The measurement of IL-2, IL-5, IL-13 or IFN-gamma, or a combination thereof, might be a useful in vitro tool for detecting T-cell sensitization to drugs. Secretion of these cytokines seems independent of the type of drug antigen and the phenotype of the drug reaction. A study including a larger number of patients and controls will be needed to determine the exact sensitivity and specificity of this test.
Abstract:
BACKGROUND: Accurate quantification of the prevalence of human immunodeficiency virus type 1 (HIV-1) drug resistance in patients who are receiving antiretroviral therapy (ART) is difficult, and results from previous studies vary. We attempted to assess the prevalence and dynamics of resistance in a highly representative patient cohort from Switzerland. METHODS: On the basis of genotypic resistance test results and clinical data, we grouped patients according to their risk of harboring resistant viruses. Estimates of resistance prevalence were calculated on the basis of either the proportion of individuals with a virologic failure or confirmed drug resistance (lower estimate) or the frequency-weighted average of risk group-specific probabilities for the presence of drug resistance mutations (upper estimate). RESULTS: Lower and upper estimates of drug resistance prevalence in 8064 ART-exposed patients were 50% and 57% in 1999 and 37% and 45% in 2007, respectively. This decrease was driven by 2 mechanisms: loss to follow-up or death of high-risk patients exposed to mono- or dual-nucleoside reverse-transcriptase inhibitor therapy (lower estimates range from 72% to 75%) and continued enrollment of low-risk patients who were taking combination ART containing boosted protease inhibitors or nonnucleoside reverse-transcriptase inhibitors as first-line therapy (lower estimates range from 7% to 12%). A subset of 4184 participants (52%) had ≥1 study visit per year during 2002-2007. In this subset, lower and upper estimates increased from 45% to 49% and from 52% to 55%, respectively. Yearly increases in prevalence were becoming smaller in later years. CONCLUSIONS: Contrary to earlier predictions, in situations of free access to drugs, close monitoring, and rapid introduction of new potent therapies, the emergence of drug-resistant viruses can be minimized at the population level. Moreover, this study demonstrates the necessity of interpreting time trends in the context of evolving cohort populations.
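The lower and upper estimates described above can be illustrated with a small sketch. This is not the study's code; the risk groups, counts, and group-specific probabilities are made-up placeholders used only to show the two calculations (confirmed-cases proportion versus frequency-weighted average).

```python
# Illustrative only: lower/upper bounds on drug-resistance prevalence as
# described in the abstract. All numbers below are fabricated placeholders.

risk_groups = {
    # group name: patients in group, patients with confirmed resistance,
    # assumed probability that an untested patient in this group harbors resistance
    "mono/dual NRTI era":    dict(n=1500, confirmed=1100, p_resistant=0.80),
    "early combination ART": dict(n=3000, confirmed=1200, p_resistant=0.45),
    "modern first-line ART": dict(n=3564, confirmed=250,  p_resistant=0.10),
}

total = sum(g["n"] for g in risk_groups.values())

# Lower estimate: only patients with confirmed resistance (or virologic failure)
lower = sum(g["confirmed"] for g in risk_groups.values()) / total

# Upper estimate: frequency-weighted average of group-specific probabilities
upper = sum(g["n"] * g["p_resistant"] for g in risk_groups.values()) / total

print(f"lower estimate: {lower:.1%}, upper estimate: {upper:.1%}")
```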
Abstract:
The early phase of psychotherapy has been regarded as a sensitive period in which the course of treatment toward positive outcomes unfolds. However, there is disagreement about the degree to which early (especially relationship-related) session experiences predict outcome over and above initial levels of distress and early response to treatment. The goal of the present study was to simultaneously examine post-treatment outcome as a function of (a) intake symptom and interpersonal distress as well as early change in well-being and symptoms, (b) the patient's early session experiences, (c) the therapist's early session experiences/interventions, and (d) their interactions. The data of 430 psychotherapy completers treated by 151 therapists were analyzed using hierarchical linear models. Results indicate that early positive intra- and interpersonal session experiences, as reported by patients and therapists after the sessions, explained 58% of the variance in a composite outcome measure, taking intake distress and early response into account. All predictors (other than problem-activating therapist interventions) contributed to later treatment outcomes when entered as single predictors. However, the multi-predictor analyses indicated that interpersonal distress at intake as well as the early interpersonal session experiences of patients and therapists remained robust predictors of outcome. The findings underscore that, early in therapy, therapists (and their supervisors) need to understand and monitor multiple interconnected components simultaneously.
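As a rough illustration of the hierarchical (multilevel) modelling approach mentioned above, the sketch below fits a linear mixed model with a random intercept per therapist. It is an assumed, minimal example with fabricated data and invented variable names, not the study's actual analysis.

```python
# Minimal sketch of a hierarchical linear model: patients nested within
# therapists, outcome predicted by intake distress and early session experience.
# Data and variable names are fabricated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_patients, n_therapists = 430, 151
df = pd.DataFrame({
    "therapist_id": rng.integers(0, n_therapists, n_patients),
    "intake_distress": rng.normal(size=n_patients),
    "early_session_experience": rng.normal(size=n_patients),
})
df["outcome"] = (0.4 * df["early_session_experience"]
                 - 0.3 * df["intake_distress"]
                 + rng.normal(scale=0.8, size=n_patients))

# Random intercept for each therapist; fixed effects for the two predictors
model = smf.mixedlm("outcome ~ intake_distress + early_session_experience",
                    data=df, groups=df["therapist_id"])
print(model.fit().summary())
```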
Abstract:
BACKGROUND Inability to predict the therapeutic effect of a drug in individual pain patients prolongs the process of drug and dose finding until satisfactory pharmacotherapy can be achieved. Many chronic pain conditions are associated with hypersensitivity of the nervous system or impaired endogenous pain modulation. Pharmacotherapy often aims at influencing these disturbed nociceptive processes. Its effect might therefore depend on the extent to which they are altered. Quantitative sensory testing (QST) can evaluate various aspects of pain processing and might therefore be able to predict the analgesic efficacy of a given drug. In the present study, three drugs commonly used in the pharmacological management of chronic low back pain are investigated. The primary objective is to examine the ability of QST to predict pain reduction. As a secondary objective, the analgesic effects of these drugs and their effect on QST are evaluated. METHODS/DESIGN In this randomized, double-blind, placebo-controlled cross-over study, patients with chronic low back pain are randomly assigned to imipramine, oxycodone or clobazam versus active placebo. QST is assessed at baseline and at 1 and 2 h after drug administration. Pain intensity, side effects and patients' global impression of change are assessed at 30-min intervals up to two hours after drug intake. Baseline QST is used as an explanatory variable to predict drug effect. The change in QST over time is analyzed to describe the pharmacodynamic effects of each drug on experimental pain modalities. Genetic polymorphisms are analyzed as covariates. DISCUSSION Pharmacotherapy is a mainstay of chronic pain treatment. Antidepressants, anticonvulsants and opioids are frequently prescribed in a "trial and error" fashion, without knowing, however, which drug best suits which patient. The present study addresses the important need to translate recent advances in pain research into clinical practice. Assessing the predictive value of central hypersensitivity and endogenous pain modulation could allow for the implementation of a mechanism-based treatment strategy in individual patients. TRIAL REGISTRATION Clinicaltrials.gov, NCT01179828.
Abstract:
OBJECTIVES We studied the influence of noninjecting and injecting drug use on mortality, dropout rate, and the course of antiretroviral therapy (ART) in the Swiss HIV Cohort Study (SHCS). METHODS Cohort participants, registered prior to April 2007 and with at least one drug use questionnaire completed by May 2013, were categorized according to their self-reported drug use behaviour. The probabilities of death and dropout were analysed separately using multivariable competing risks proportional hazards regression models with mutual correction for the other endpoint. Furthermore, we describe the influence of drug use on the course of ART. RESULTS A total of 6529 participants (including 31% women) were followed for 31 215 person-years; 5.1% of participants died; 10.5% were lost to follow-up. Among persons with homosexual or heterosexual HIV transmission, noninjecting drug use was associated with higher all-cause mortality [subhazard ratio (SHR) 1.73; 95% confidence interval (CI) 1.07-2.83] compared with no drug use. Mortality was also increased among former injecting drug users (IDUs) who reported noninjecting drug use (SHR 2.34; 95% CI 1.49-3.69). Noninjecting drug use was associated with higher dropout rates. The mean proportion of time with suppressed viral replication was 82.2% in all participants, irrespective of ART status, and 91.2% in those on ART. Drug use lowered adherence and increased rates of ART change and ART interruption. Virological failure on ART was more frequent in participants who reported concomitant drug injections while on opiate substitution, and in current IDUs, but not among noninjecting drug users. CONCLUSIONS Noninjecting drug use and injecting drug use are modifiable risk factors for death; they also lower retention in the cohort and complicate ART.
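The mortality analysis above relies on competing risks methodology (subhazard regression with death and dropout as competing endpoints). The sketch below is not that regression; it shows a simpler, related building block, the Aalen-Johansen estimate of the cumulative incidence of death when dropout is treated as a competing event, using fabricated data.

```python
# Illustrative only: cumulative incidence of death with dropout as a competing
# event (Aalen-Johansen estimator). The study itself fitted competing-risks
# proportional (sub)hazards regression; durations and event codes are fabricated.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.DataFrame({
    "years": [2.1, 5.0, 7.3, 1.2, 6.8, 4.4, 3.3, 8.1],
    # 0 = censored, 1 = death, 2 = dropout (competing event)
    "event": [1,   0,   2,   1,   0,   2,   0,   1],
})

ajf = AalenJohansenFitter()
ajf.fit(df["years"], df["event"], event_of_interest=1)  # cumulative incidence of death
print(ajf.cumulative_density_)
```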
Abstract:
Tyrosine kinase inhibitors represent today's treatment of choice in chronic myeloid leukemia (CML). Allogeneic hematopoietic stem cell transplantation (HSCT) is regarded as salvage therapy. This prospective randomized CML-study IIIA recruited 669 patients with newly diagnosed CML between July 1997 and January 2004 from 143 centers. Of these, 427 patients were considered eligible for HSCT and were randomized by availability of a matched family donor between primary HSCT (group A; N=166 patients) and best available drug treatment (group B; N=261). Primary end point was long-term survival. Survival probabilities were not different between groups A and B (10-year survival: 0.76 (95% confidence interval (CI): 0.69-0.82) vs 0.69 (95% CI: 0.61-0.76)), but influenced by disease and transplant risk. Patients with a low transplant risk showed superior survival compared with patients with high- (P<0.001) and non-high-risk disease (P=0.047) in group B; after entering blast crisis, survival was not different with or without HSCT. Significantly more patients in group A were in molecular remission (56% vs 39%; P=0.005) and free of drug treatment (56% vs 6%; P<0.001). Differences in symptoms and Karnofsky score were not significant. In the era of tyrosine kinase inhibitors, HSCT remains a valid option when both disease and transplant risk are considered. Leukemia advance online publication, 20 November 2015; doi:10.1038/leu.2015.281.
Abstract:
BACKGROUND The pathomechanisms underlying very late stent thrombosis (VLST) after implantation of drug-eluting stents (DES) are incompletely understood. Using optical coherence tomography, we investigated potential causes of this adverse event. METHODS AND RESULTS Between August 2010 and December 2014, 64 patients were investigated at the time point of VLST as part of an international optical coherence tomography registry. Optical coherence tomography pullbacks were performed after restoration of flow and analyzed at 0.4-mm intervals. A total of 38 early- and 20 newer-generation drug-eluting stents were suitable for analysis. VLST occurred at a median of 4.7 years (interquartile range, 3.1-7.5 years). An underlying putative cause by optical coherence tomography was identified in 98% of cases. The most frequent findings were strut malapposition (34.5%), neoatherosclerosis (27.6%), uncovered struts (12.1%), and stent underexpansion (6.9%). Uncovered and malapposed struts were more frequent in thrombosed compared with nonthrombosed regions (ratio of percentages, 8.26; 95% confidence interval, 6.82-10.04; P<0.001 and 13.03; 95% confidence interval, 10.13-16.93; P<0.001, respectively). The maximal length of malapposed or uncovered struts (3.40 mm; 95% confidence interval, 2.55-4.25; versus 1.29 mm; 95% confidence interval, 0.81-1.77; P<0.001), but not the maximal or average axial malapposition distance, was greater in thrombosed compared with nonthrombosed segments. The associations of both uncovered and malapposed struts with thrombus were consistent among early- and newer-generation drug-eluting stents. CONCLUSIONS The leading associated findings in VLST patients, in descending order, were malapposition, neoatherosclerosis, uncovered struts, and stent underexpansion, without differences between patients treated with early- and newer-generation drug-eluting stents. The longitudinal extent of malapposed and uncovered struts was the most important correlate of thrombus formation in VLST.
Abstract:
BACKGROUND Drug resistance is a major barrier to successful antiretroviral treatment (ART). Therefore, it is important to monitor time trends at a population level. METHODS We included 11,084 ART-experienced patients from the Swiss HIV Cohort Study (SHCS) between 1999 and 2013. The SHCS is highly representative and includes 72% of patients receiving ART in Switzerland. Drug resistance was defined as the presence of at least one major mutation in a genotypic resistance test. To estimate the prevalence of drug resistance, data for patients with no resistance test were imputed based on the patient's risk of harboring drug-resistant viruses. RESULTS The emergence of new drug resistance mutations declined dramatically, from 401 patients in 1999 to 23 in 2013. The upper estimated prevalence limit of drug resistance among ART-experienced patients decreased from 57.0% in 1999 to 37.1% in 2013. The prevalence of three-class resistance decreased from 9.0% to 4.4% and was always <0.4% for patients who initiated ART after 2006. Most patients actively participating in the SHCS in 2013 with drug-resistant viruses had initiated ART before 1999 (59.8%). Nevertheless, in 2013, 94.5% of patients who initiated ART before 1999 had good remaining treatment options based on the Stanford algorithm. CONCLUSION HIV-1 drug resistance among ART-experienced patients in Switzerland is a well-controlled relic from the pre-combination ART era. The emergence of drug resistance can be virtually stopped with new potent therapies and close monitoring.
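The imputation step described in the methods above can be sketched in a few lines: untested patients are assigned a resistance status drawn from an assumed, group-specific probability, and prevalence is averaged over repeated imputations. This is a hypothetical illustration, not the SHCS analysis code; all statuses and probabilities are invented.

```python
# Hypothetical sketch of risk-based imputation of missing resistance tests.
# Observed statuses and group probabilities below are invented placeholders.
import numpy as np

rng = np.random.default_rng(2)

# Tested patients: observed resistance (1) or no resistance (0)
tested = np.array([1, 0, 0, 1, 0, 1, 0, 0])
# Untested patients: assumed probability of harboring resistance, by risk group
p_untested = np.array([0.8, 0.8, 0.45, 0.45, 0.45, 0.1, 0.1])

n_imputations = 1000
prevalences = []
for _ in range(n_imputations):
    imputed = (rng.random(p_untested.size) < p_untested).astype(int)  # draw status
    all_patients = np.concatenate([tested, imputed])
    prevalences.append(all_patients.mean())

print(f"estimated prevalence: {np.mean(prevalences):.1%} "
      f"(2.5th-97.5th percentile: {np.percentile(prevalences, 2.5):.1%}"
      f"-{np.percentile(prevalences, 97.5):.1%})")
```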
Abstract:
Allergic reactions to drugs are a serious public health concern. In 2013, the Division of Allergy, Immunology, and Transplantation of the National Institute of Allergy and Infectious Diseases sponsored a workshop on drug allergy. International experts in the field of drug allergy with backgrounds in allergy, immunology, infectious diseases, dermatology, clinical pharmacology, and pharmacogenomics discussed the current state of drug allergy research. These experts were joined by representatives from several National Institutes of Health institutes and the US Food and Drug Administration. The participants identified important advances that make new research directions feasible and made suggestions for research priorities and for development of infrastructure to advance our knowledge of the mechanisms, diagnosis, management, and prevention of drug allergy. The workshop summary and recommendations are presented herein.
Abstract:
BACKGROUND AND PURPOSE Acute stroke patients with severely impaired oral intake are at risk of malnutrition and dehydration. Rapid identification of these patients is necessary to establish early enteral tube feeding. We analysed whether specific lesion locations predict early tube dependency and assessed the neural correlates of impaired oral intake after hemispheric ischaemic stroke. METHODS Tube dependency and functional oral intake were evaluated with a standardized comprehensive swallowing assessment within the first 48 h after magnetic resonance imaging-proven first-time acute supratentorial ischaemic stroke. Voxel-based lesion symptom mapping (VLSM) was performed to compare lesion locations between tube-dependent patients and patients without tube feeding, and between patients with impaired and unimpaired oral intake. RESULTS Of the 119 included patients, 43 (36%) had impaired oral intake and 12 (10%) were tube dependent. Both tube dependency and impaired oral intake were significantly associated with a higher National Institutes of Health Stroke Scale score and larger infarct volume, and these patients had worse clinical outcomes at discharge. Clinical characteristics did not differ between left and right hemispheric strokes. In the VLSM analysis, mildly impaired oral intake correlated with lesions of the Rolandic operculum, the insular cortex, the superior corona radiata and, to a lesser extent, the putamen, the external capsule and the superior longitudinal fascicle. Tube dependency was significantly associated with involvement of the anterior insular cortex. CONCLUSIONS Mild impairment of oral intake correlates with damage to a widespread operculo-insular swallowing network. However, specific lesions of the anterior insula lead to severe impairment and tube dependency, and clinicians might consider early enteral tube feeding in these patients.
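The core VLSM idea referenced above can be outlined in a short sketch: at each voxel, patients whose lesion covers that voxel are compared with those whose lesion does not, followed by a correction for the many voxel-wise tests. This is an assumed, minimal illustration with fabricated lesion maps and scores, not the study's imaging pipeline.

```python
# Minimal VLSM-style sketch: voxel-wise group comparison on fabricated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_patients, n_voxels = 119, 5000
lesions = rng.random((n_patients, n_voxels)) < 0.05   # binary lesion maps (True = lesioned)
scores = rng.normal(size=n_patients)                  # e.g. a functional oral intake score

p_values = np.full(n_voxels, np.nan)
min_group = 5                                         # skip voxels lesioned in too few patients
for v in range(n_voxels):
    lesioned = lesions[:, v]
    if lesioned.sum() < min_group or (~lesioned).sum() < min_group:
        continue
    # two-sample t-test: lesioned versus non-lesioned patients at this voxel
    _, p_values[v] = stats.ttest_ind(scores[lesioned], scores[~lesioned], equal_var=False)

tested = ~np.isnan(p_values)
alpha = 0.05 / tested.sum()                           # crude Bonferroni correction
significant = np.zeros(n_voxels, dtype=bool)
significant[tested] = p_values[tested] < alpha
print(f"{significant.sum()} of {tested.sum()} tested voxels significant")
```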