Abstract:
Chagas disease, a neglected illness, affects nearly 12-14 million people in endemic areas of Latin America. Although the occurrence of acute cases has declined sharply due to Southern Cone Initiative efforts to control vector transmission, serious challenges remain, including the maintenance of sustainable public policies for Chagas disease control and the urgent need for better drugs to treat chagasic patients. Since the introduction of benznidazole and nifurtimox approximately 40 years ago, many natural and synthetic compounds have been assayed against Trypanosoma cruzi, yet only a few compounds have advanced to clinical trials. This reflects, at least in part, the lack of consensus regarding appropriate in vitro and in vivo screening protocols as well as the lack of biomarkers for treating parasitaemia. The development of more effective drugs requires (i) the identification and validation of parasite targets, (ii) compounds to be screened against the targets or the whole parasite and (iii) a panel of minimum standardised procedures to advance leading compounds to clinical trials. This third aim was the topic of the workshop entitled Experimental Models in Drug Screening and Development for Chagas Disease, held in Rio de Janeiro, Brazil, on the 25th and 26th of November 2008 by the Fiocruz Program for Research and Technological Development on Chagas Disease and the Drugs for Neglected Diseases Initiative. During the meeting, the minimum steps, requirements and decision gates for the determination of the efficacy of novel drugs for T. cruzi control were evaluated by interdisciplinary experts, and an in vitro and in vivo flowchart was designed to serve as a general and standardised protocol for screening potential drugs for the treatment of Chagas disease.
Abstract:
Hepatitis C virus (HCV) is the major infectious disease agent among injecting drug users (IDUs), with seroprevalence ranging from 50-90%. In this paper, serological and virological parameters were investigated among 194 IDUs, 94 ex-IDUs and 95 non-IDUs that were sampled by the "snowball" technique in three localities renowned for both intense drug use and trafficking activities in Salvador, Brazil. The majority of the participants were male, but sex and mean age differed significantly between IDUs/ex-IDUs and non-IDUs (p < 0.05). Anti-HCV screening revealed that 35.6%, 29.8% and 5.3% of samples from IDUs, ex-IDUs and non-IDUs, respectively, were seropositive. HCV-RNA detection confirmed that the prevalence of infection was 29.4%, 21.3% and 5.3% for IDUs, ex-IDUs and non-IDUs, respectively. Genotyping analysis among IDUs/ex-IDUs determined that 76.9% were infected with genotype 1, 18.5% with genotype 3 and 4.6% with a mixed genotype; this result differed significantly from non-IDUs, where genotype 3 was the most frequent (60%), followed by genotype 1 (20%) and a mixed genotype (20%). We report a significantly higher prevalence of HCV infection in IDUs/ex-IDUs compared to the control group (p < 0.001). Although the sample size of our study was small, the differences in HCV genotype distribution reported herein for IDUs/ex-IDUs and non-IDUs warrant further investigation.
Abstract:
OBJECTIVE To describe what is, to our knowledge, the first nosocomial outbreak of infection with pan-drug-resistant (including colistin-resistant) Acinetobacter baumannii, to determine the risk factors associated with these types of infections, and to determine their clinical impact. DESIGN Nested case-control cohort study and a clinical-microbiological study. SETTING A 1,521-bed tertiary care university hospital in Seville, Spain. PATIENTS Case patients were inpatients who had a pan-drug-resistant A. baumannii isolate recovered from a clinical or surveillance sample obtained at least 48 hours after admission to an intensive care unit (ICU) during the time of the epidemic outbreak. Control patients were patients who were admitted to any of the "boxes" (ie, rooms that partition off a distinct area for a patient's bed and the equipment needed to care for the patient) of an ICU for at least 48 hours during the time of the epidemic outbreak. RESULTS All the clinical isolates had similar antibiotic susceptibility patterns (ie, they were resistant to all the antibiotics tested, including colistin), and, on the basis of repetitive extragenic palindromic-polymerase chain reaction, it was determined that all of them were of the same clone. The previous use of quinolones and glycopeptides and an ICU stay were associated with the acquisition of infection or colonization with pan-drug-resistant A. baumannii. To control this outbreak, we implemented the following multicomponent intervention program: the performance of environmental decontamination of the ICUs involved, an environmental survey, a revision of cleaning protocols, active surveillance for colonization with pan-drug-resistant A. baumannii, educational programs for the staff, and the display of posters that illustrate contact isolation measures and antimicrobial use recommendations. 
CONCLUSIONS We were not able to identify the common source for these cases of infection, but the adopted measures have proven to be effective at controlling the outbreak.
Abstract:
BACKGROUND: Poor medication adherence is a frequent cause of treatment failure but is difficult to diagnose. In this study we evaluated the impact of measuring adherence to cinacalcet-HCl and phosphate binders in dialysis patients with uncontrolled secondary hyperparathyroidism. METHODS: 7 chronic dialysis patients with iPTH levels >= 300 pg/ml despite treatment with >= 60 mg cinacalcet-HCl were included. Medication adherence was measured using the "Medication Events Monitoring System" over 3 months, followed by another 3-month period without monitoring. The adherence results, as well as strategies to improve them, were discussed monthly with the patients. RESULTS: During monitoring, the percentage of prescribed doses taken was higher for cinacalcet-HCl (87.4%) and sevelamer (86.3%) than for calcium acetate (76.1%), as was the taking adherence (81.9% vs. 57.3% vs. 49.1%) but not the percentage of drug holidays (12.3% vs. 4.5% vs. 3.6%). Mean PO4 levels (from 2.24 +/- 0.6 mmol/l to 1.73 +/- 0.41 mmol/l; p = 0.14) and the Ca++ x PO4 product (4.73 +/- 1.43 to 3.41 +/- 1.04 mmol2/l2; p = 0.12) improved, and iPTH levels improved significantly from 916 +/- 618 pg/ml to 442 +/- 326 pg/ml (p = 0.04), without any change in medication. However, once drug monitoring was interrupted, all laboratory parameters worsened again. CONCLUSIONS: Assessment of drug adherence helped to document episodes of non-compliance and to avoid seemingly necessary dose increases.
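The MEMS metrics reported above (percentage of prescribed doses taken, taking adherence, drug holidays) can be sketched from daily dosing-event counts. The definitions below are illustrative assumptions, not the study's actual algorithm: a drug holiday is taken to be any monitored day with no dose, and taking adherence the share of days on which exactly the prescribed number of doses was taken.

```python
# Illustrative MEMS-style adherence metrics (hypothetical definitions,
# not the algorithm used in the study).
def adherence_metrics(doses_taken_per_day, prescribed_per_day):
    """doses_taken_per_day: list of ints, one entry per monitored day."""
    n_days = len(doses_taken_per_day)
    total_prescribed = prescribed_per_day * n_days
    # Percentage of prescribed doses taken (extra doses not credited).
    taken = sum(min(d, prescribed_per_day) for d in doses_taken_per_day)
    pct_doses_taken = 100.0 * taken / total_prescribed
    # Taking adherence: share of days with exactly the prescribed count.
    taking_adherence = 100.0 * sum(
        1 for d in doses_taken_per_day if d == prescribed_per_day) / n_days
    # Drug holidays: share of monitored days with no dose at all.
    pct_drug_holidays = 100.0 * sum(
        1 for d in doses_taken_per_day if d == 0) / n_days
    return pct_doses_taken, taking_adherence, pct_drug_holidays
```

For a patient prescribed 2 doses/day who takes 2, 2, 0 and 1 doses over four monitored days, this yields 62.5% of doses taken, 50% taking adherence and 25% drug holidays.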
Abstract:
There is a little-noticed trend involving human immunodeficiency virus (HIV)-infected patients suspected of having tuberculosis: the triple-treatment regimen recommended in Brazil for years has been potentially ineffective in over 30% of the cases. This proportion may be attributable to drug resistance (to at least 1 drug) and/or to infection with non-tuberculous mycobacteria. This evidence was not disclosed in official statistics, but arose from a systematic review of a few regional studies in which the diagnosis was reliably confirmed by mycobacterial culture. This paper clarifies that there has long been ample evidence for the potential benefits of a four-drug regimen for co-infected patients in Brazil and it reinforces the need for determining the species and drug susceptibility in all positive cultures from HIV-positive patients.
Abstract:
Aims: Therapeutic Drug Monitoring (TDM) is an established tool to optimize pharmacotherapy with immunosuppressants, antibiotics, antiretroviral agents, anticonvulsants and psychotropic drugs. The TDM expert group of the Association of Neuropsychopharmacology and Pharmacopsychiatry recommended clinical guidelines for TDM of psychotropic drugs in 2004 and in 2011. They allocate 4 levels of recommendation based on studies reporting plasma concentrations and clinical outcomes. To evaluate the additional benefit for drugs without direct evidence for TDM and to verify the recommendation levels of the expert group, the authors built a new rating scale. Methods: This rating scale included 28 items and was divided into 5 categories: efficacy, toxicity, pharmacokinetics, patient characteristics and cost effectiveness. A literature search was performed for 10 antidepressants, 10 antipsychotics, 8 drugs used in the treatment of substance-related disorders and lithium; thereafter, a comparison with the assessment of the TDM expert group was carried out. Results: The antidepressants as well as the antipsychotics showed a high and significant correlation with the recommendations in the consensus guidelines. However, deviations could be detected for the drugs used in the therapy of substance-related disorders, for which TDM is mostly not yet established. The results for the antidepressants and antipsychotics permit a classification of the reachable points: above 13, TDM strongly recommended; 10 to 13, TDM recommended; 8 to 10, TDM useful; and below 8, TDM potentially useful. Conclusion: These results suggest this rating scale is sensitive enough to detect the appropriateness of TDM for drug treatment. For those drugs for which TDM is not established, a more objective estimation is possible; thus, the scoring helps to focus on the drugs most likely to require TDM.
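The score bands in the results above can be expressed as a small classification function. Note that the bands share their boundary values (13, 10 and 8 each appear in two ranges), so the >= versus > conventions chosen below are an assumption made only for illustration.

```python
# Illustrative mapping of a total rating-scale score to the TDM
# recommendation levels named in the abstract. The boundary handling
# (>= vs >) is an assumption; the published ranges overlap at 13, 10 and 8.
def tdm_recommendation(score):
    if score > 13:
        return "TDM strongly recommended"
    if score >= 10:
        return "TDM recommended"
    if score >= 8:
        return "TDM useful"
    return "TDM potentially useful"
```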
Abstract:
The activity of the antineoplastic drug tamoxifen was evaluated against Trypanosoma cruzi. In vitro activity was determined against the epimastigote, trypomastigote and amastigote forms of the CL14, Y and benznidazole-resistant Y strains of T. cruzi. Regardless of the strain used, the drug was active against all life-cycle stages of the parasite, with a half maximal effective concentration ranging from 0.7-17.9 µM. Two experimental models of acute Chagas disease were used to evaluate the in vivo efficacy of treatment with tamoxifen. No differences in parasitemia and mortality were observed between mock-treated control and tamoxifen-treated mice.
Abstract:
Objectives: The study objective was to derive reference pharmacokinetic curves of antiretroviral drugs (ART) based on available population pharmacokinetic (Pop-PK) studies that can be used to optimize therapeutic drug monitoring-guided dosage adjustment.
Methods: A systematic search of Pop-PK studies of 8 ART in adults was performed in PubMed. To simulate reference PK curves, a summary of the PK parameters was obtained for each drug based on a meta-analysis approach. Most models used a one-compartment model, which was therefore chosen as the reference model. Models using bi-exponential disposition were simplified to one compartment, since the first distribution phase was rapid and not determinant for the description of the terminal elimination phase, which was the most relevant for this project. Different absorption models were standardized to first-order absorption processes.
Apparent clearance (CL), apparent volume of distribution of the terminal phase (Vz), absorption rate constant (ka) and inter-individual variability were pooled into summary mean values, weighted by the number of plasma levels; intra-individual variability was weighted by the number of individuals in each study.
Simulations based on summary PK parameters served to construct concentration PK percentiles (NONMEM®).
Concordance between individual and summary parameters was assessed graphically using forest plots. To test robustness, the difference in simulated curves based on published versus summary parameters was calculated using efavirenz as a probe drug.
Results: CL was readily accessible from all studies. For studies with a one-compartment model, Vz was the central volume of distribution; for two-compartment models, Vz was CL/λz. ka was used directly or was derived from the mean absorption time (MAT) for more complicated absorption models, assuming MAT = 1/ka.
The value of CL for each drug was in excellent agreement throughout all Pop-PK models, suggesting that the minimal concentration derived from the summary models was adequately characterized.
The comparison of the concentration vs. time profile for efavirenz between published and summary PK parameters revealed no more than a 20% difference. Although our approach appears adequate for estimation of the elimination phase, the simplification of the absorption phase might lead to a small bias shortly after drug intake.
Conclusions: Simulated reference percentile curves based on such an approach represent a useful tool for interpreting drug concentrations. This Pop-PK meta-analysis approach should be further validated and could be extended to elaborate a more sophisticated computerized tool for the Bayesian TDM of ART.
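The reference model described in the Methods (one compartment with first-order absorption) has a closed-form concentration-time curve that a simulation can sample directly. The sketch below assumes a single oral dose with complete bioavailability (F = 1); in practice the parameter values would come from the pooled summary estimates, and the numbers used here are purely hypothetical.

```python
import math

# One-compartment model with first-order absorption, the reference model
# described in the abstract. Parameter values used with this function are
# hypothetical, chosen only to illustrate the curve, not summary
# estimates for any particular ART.
def concentration(t, dose, cl, vz, ka):
    """Plasma concentration at time t after a single oral dose.

    C(t) = dose*ka / (Vz*(ka - ke)) * (exp(-ke*t) - exp(-ka*t)),
    with elimination rate constant ke = CL/Vz (assumes ka != ke and
    complete bioavailability, F = 1).
    """
    ke = cl / vz
    return dose * ka / (vz * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))
```

When a study reports only a mean absorption time, ka can be derived as ka = 1/MAT, as the abstract notes.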
Abstract:
The management of patients scheduled for surgery with a coronary stent, and receiving 1 or more antiplatelet drugs, is subject to many controversies. The premature discontinuation of antiplatelet drugs substantially increases the risk of stent thrombosis (ST), myocardial infarction, and cardiac death, while surgery under altered platelet function could lead to an increased risk of bleeding in the perioperative period. Because of the conflicting recommendations, this article reviews the current antiplatelet protocols after positioning of a coronary stent, the evidence of increased risk of ST associated with the withdrawal of antiplatelet drugs and of increased bleeding risk associated with their maintenance, the different perioperative antiplatelet protocols when patients are scheduled for surgery or need an urgent operation, and the therapeutic options if excessive bleeding occurs.
Abstract:
QUESTION UNDER STUDY: To investigate the change over time in the number of ED admissions with a positive blood alcohol concentration (BAC) and to evaluate predictors of BAC level. METHODS: We conducted a single-site retrospective study at the ED of a tertiary referral hospital (western part of Switzerland) and obtained all the BAC tests performed from 2002 to 2011. We determined the proportion of ED admissions with positive BAC (number of positive BAC/number of admissions). Regression models assessed trends in the proportion of admissions with positive BAC and the predictors of BAC level among patients with positive BAC. RESULTS: A total of 319,489 admissions were recorded and 20,021 BAC tests were performed, of which 14,359 were positive; 34.5% of these involved female and 65.5% male patients. The mean (SD) age was 41.7 (16.8) years and the mean BAC was 2.12 (1.04) per mille (g of ethanol/liter of blood). An increase in the number of positive BAC was observed, from 756 in 2002 to 1,819 in 2011. The total number of admissions also increased, but less steeply: a 1.2-fold increase versus a 2.4-fold increase in admissions with positive BAC. Being male was independently associated with a higher (+0.19 per mille) BAC, as was each passing year (+0.03). A significant quadratic association with age indicated a maximum BAC at age 53. CONCLUSION: We observed an increase in the percentage of admissions with positive BAC that was not limited to younger individuals. Given the potential consequences of alcohol intoxication, and the large burden imposed on ED teams, communities should be encouraged to take measures aimed at reducing alcohol intoxication.
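The reported maximum BAC at age 53 follows from the vertex of the fitted quadratic: for BAC(age) = b0 + b1*age + b2*age², the peak lies at age = -b1/(2*b2) whenever b2 < 0. The coefficients in the example below are hypothetical, since the abstract reports only the location of the maximum, not the fitted model.

```python
# Vertex of a fitted quadratic regression term: the age at which
# BAC(age) = b0 + b1*age + b2*age**2 is maximal. Coefficient values
# passed in are hypothetical; the study did not publish them.
def peak_age(b1, b2):
    if b2 >= 0:
        raise ValueError("a maximum requires a negative quadratic term")
    return -b1 / (2 * b2)
```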
Abstract:
Recently, it has been proposed that drug permeation is essentially carrier-mediated only and that passive lipoidal diffusion is negligible. This opposes the prevailing hypothesis of drug permeation through biological membranes, which integrates the contribution of multiple permeation mechanisms, including both carrier-mediated and passive lipoidal diffusion, depending on the compound's properties, membrane properties, and solution properties. The prevailing hypothesis of drug permeation continues to be successful for application and prediction in drug development. Proponents of the carrier-mediated only concept argue against passive lipoidal diffusion. However, the arguments are not supported by broad pharmaceutics literature. The carrier-mediated only concept lacks substantial supporting evidence and successful applications in drug development.
Assessment of drug-induced hepatotoxicity in clinical practice: a challenge for gastroenterologists.
Abstract:
Currently, pharmaceutical preparations are serious contributors to liver disease, with hepatotoxicity ranking as the most frequent cause of acute liver failure and of post-commercialization regulatory decisions. The diagnosis of hepatotoxicity remains a difficult task because of the lack of reliable markers for use in general clinical practice. Incriminating any given drug in an episode of liver dysfunction is a step-by-step process that requires a high degree of suspicion, a compatible chronology, awareness of the drug's hepatotoxic potential, the exclusion of alternative causes of liver damage and the ability to detect subtle data that favor a toxic etiology. This process is time-consuming and the final result is frequently inaccurate. Diagnostic algorithms may add consistency to the diagnostic process by translating the suspicion into a quantitative score. Such scales are useful because they provide a framework that emphasizes the features that merit attention in cases of suspected hepatic adverse reaction. Current efforts in collecting bona fide cases of drug-induced hepatotoxicity will make refinements of existing scales feasible. It is now relatively easy to accommodate relevant data within the scoring system and to delete low-impact items. Efforts should also be directed toward the development of an abridged instrument for use in evaluating suspected drug-induced hepatotoxicity at the very beginning of the diagnosis and treatment process, when clinical decisions need to be made. Such an instrument would enable a confident diagnosis to be made on admission of the patient and treatment to be fine-tuned as further information is collected.
Abstract:
In this chapter we summarize some aspects of the structure-function relationship of the alpha 1a- and alpha 1b-adrenergic receptor subtypes related to the receptor activation process, as well as the effect of different alpha-blockers on the constitutive activity of the receptor. Molecular modeling of the alpha 1a- and alpha 1b-adrenergic receptor subtypes and computational simulation of receptor dynamics were useful for interpreting the experimental findings derived from site-directed mutagenesis studies.