22 results for DRUG TESTING
Abstract:
BACKGROUND: Epidermal growth factor receptor (EGFR) and its downstream factors KRAS and BRAF are mutated in several types of cancer, affecting the clinical response to EGFR inhibitors. Mutations in the EGFR kinase domain predict sensitivity to the tyrosine kinase inhibitors gefitinib and erlotinib in lung adenocarcinoma, while activating point mutations in KRAS and BRAF confer resistance to the anti-EGFR monoclonal antibody cetuximab in colorectal cancer. The development of new-generation methods for systematic mutation screening of these genes will allow more appropriate therapeutic choices. METHODS: We describe a high resolution melting (HRM) assay for mutation detection in EGFR exons 19-21, KRAS codons 12/13 and BRAF V600 using formalin-fixed paraffin-embedded samples. Somatic variation of KRAS exon 2 was also analysed by massively parallel pyrosequencing of amplicons on the GS Junior 454 platform. RESULTS: We tested 120 routine diagnostic specimens from patients with colorectal or lung cancer. Mutations in KRAS, BRAF and EGFR were observed in 41.9%, 13.0% and 11.1% of the overall samples, respectively, and were mutually exclusive. For KRAS, six types of substitutions were detected (17 G12D, 9 G13D, 7 G12C, 2 G12A, 2 G12V, 2 G12S), while V600E accounted for all the BRAF activating mutations. Regarding EGFR, two cases showed exon 19 deletions (delE746-A750 and delE746-T751insA) and two others showed substitutions in exon 21 (one carried L858R together with the resistance mutation T790M in exon 20, and the other carried P848L). Consistent with earlier reports, our results show that KRAS and BRAF mutation frequencies in colorectal cancer were 44.3% and 13.0%, respectively, while EGFR mutations were detected in 11.1% of the lung cancer specimens. Ultra-deep amplicon pyrosequencing successfully validated the HRM results and allowed detection and quantitation of KRAS somatic mutations. CONCLUSIONS: HRM is a rapid and sensitive method for moderate-throughput, cost-effective screening of oncogene mutations in clinical samples. Compared with Sanger sequencing for validation, next-generation sequencing yields more accurate quantitation of somatic variation and can be scaled to higher throughput.
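As a minimal illustration of how the KRAS substitution counts reported above translate into a mutation spectrum, the following Python sketch tallies the abstract's own figures; the script is ours and purely illustrative, and the percentages are shares of KRAS-mutant cases, not of the full 120-sample cohort.

```python
# Illustrative tally of the KRAS substitution spectrum reported above.
# Counts are taken directly from the abstract; percentages are shares
# of all KRAS-mutant cases, not of the 120 tested specimens.
kras_counts = {
    "G12D": 17, "G13D": 9, "G12C": 7,
    "G12A": 2, "G12V": 2, "G12S": 2,
}

total_mutant = sum(kras_counts.values())  # 39 KRAS-mutant cases
for variant, n in sorted(kras_counts.items(), key=lambda kv: -kv[1]):
    print(f"{variant}: {n}/{total_mutant} ({100 * n / total_mutant:.1f}%)")
```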
Abstract:
Introduction: Testing for HIV tropism is recommended before prescribing a chemokine receptor blocker. To date, in most European countries HIV tropism has been determined using a phenotypic test. Recently, new data have emerged supporting the use of genotypic HIV V3-loop sequence analysis as the basis for tropism determination. The European guidelines group on clinical management of HIV-1 tropism testing was established to make recommendations to clinicians and virologists. Methods: We searched online databases for articles from January 2006 to March 2010 with the terms: tropism or CCR5-antagonist or CCR5 antagonist or maraviroc or vicriviroc. Additional articles and/or conference abstracts were identified by hand searching. This strategy identified 712 potential articles and 1,240 abstracts. All were reviewed, and 57 papers and 42 abstracts were finally included and used by the panel to reach a consensus statement. Results: The panel recommends HIV tropism testing for the following indications: i) drug-naïve patients in whom toxicity or limited therapeutic options are foreseen; ii) patients experiencing therapy failure, whenever a treatment change is considered. Both the phenotypic Enhanced Sensitivity Trofile Assay (ESTA) and genotypic population sequencing of the V3 loop are recommended for use in clinical practice. Although the panel does not recommend one methodology over the other, it is anticipated that genotypic testing will be used more frequently because of its greater accessibility, lower cost and shorter turnaround time. The panel also provides guidance on technical aspects and interpretation issues. If genotypic methods are used, triplicate PCR amplification and sequencing is advised, interpreted with the geno2pheno (G2P) tool (clonal model) at a false-positive rate (FPR) cutoff of 10%. If the viral load is below the level of reliable amplification, proviral DNA can be used, and the panel again recommends triplicate testing with an FPR of 10%. If genotypic DNA testing is not performed in triplicate, the FPR cutoff should be increased to 20%. Conclusions: The European guidelines on clinical management of HIV-1 tropism testing provide an overview of the current literature, evidence-based recommendations for the clinical use of tropism testing, and expert guidance on unresolved issues and current developments. Current data support the use of both genotypic population sequencing and ESTA for co-receptor tropism determination. For practical reasons, genotypic population sequencing is the preferred method in Europe.
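A minimal sketch of the triplicate interpretation rule described above, assuming the common convention that the lowest geno2pheno FPR across replicates governs the call; the function and variable names are ours, not from the guidelines.

```python
# Hedged sketch of the genotypic tropism rule summarized above.
# Assumption: the lowest FPR across replicates determines the call,
# so any replicate below the cutoff yields a non-R5 (X4-capable) report.

def call_tropism(fprs, triplicate=True):
    """Return 'R5' or 'non-R5' from geno2pheno FPR values (percent)."""
    # 10% cutoff for triplicate testing (viral RNA or proviral DNA);
    # 20% if DNA testing is not performed in triplicate.
    cutoff = 10.0 if triplicate else 20.0
    return "non-R5" if min(fprs) < cutoff else "R5"

print(call_tropism([35.2, 12.8, 41.0]))        # R5: all replicates above 10%
print(call_tropism([35.2, 7.5, 41.0]))         # non-R5: one replicate below 10%
print(call_tropism([15.0], triplicate=False))  # non-R5 under the 20% single-test cutoff
```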
Abstract:
The agar dilution, broth microdilution, and disk diffusion methods were compared to determine the in vitro susceptibility to fosfomycin of 428 extended-spectrum-beta-lactamase (ESBL)-producing Escherichia coli and Klebsiella pneumoniae isolates. Fosfomycin showed very high activity against all ESBL-producing strains. Excellent agreement between the three susceptibility methods was found for E. coli, whereas marked discrepancies were observed for K. pneumoniae.
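For readers unfamiliar with how agreement between susceptibility methods is typically expressed, here is a small illustrative calculation of categorical agreement between paired S/I/R calls; the data below are hypothetical, not from the study.

```python
# Illustrative categorical-agreement calculation between two
# susceptibility testing methods. The paired S/I/R calls are
# hypothetical; the study's actual data are not reproduced here.
agar = ["S", "S", "S", "R", "S", "I", "S", "S"]
disk = ["S", "S", "R", "R", "S", "S", "S", "S"]

matches = sum(a == d for a, d in zip(agar, disk))
print(f"Categorical agreement: {matches}/{len(agar)} "
      f"({100 * matches / len(agar):.1f}%)")
```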
Abstract:
The chemotherapeutic drug 5-fluorouracil (5-FU) is widely used in the treatment of a range of cancers, but resistance to the drug remains a major clinical problem. Since defects in the mediators of apoptosis may account for chemoresistance, the identification of new targets involved in 5-FU-induced apoptosis is of major clinical interest. We have identified the double-stranded RNA-dependent protein kinase (PKR) as a key molecular target of 5-FU involved in apoptosis induction in human colon and breast cancer cell lines. PKR distribution and activation, apoptosis induction and cytotoxic effects were analyzed during 5-FU and 5-FU/IFNalpha treatment in several colon and breast cancer cell lines with different p53 status. PKR protein was activated by 5-FU treatment in a p53-independent manner, inducing phosphorylation of the translation initiation factor eIF-2alpha and cell death by apoptosis. Furthermore, PKR knockdown led to a decreased response to 5-FU treatment, and these cells were not affected by the synergistic antitumor activity of the 5-FU/IFNalpha combination. Taken together, these results provide evidence that PKR is a key molecular target of 5-FU, with potential relevance for the clinical use of this drug.
Abstract:
Influenza surveillance networks must detect the viruses that will cause the forthcoming annual epidemics early and isolate the strains for further characterization. We obtained the highest sensitivity (95.4%) with a diagnostic tool that combined a shell-vial assay and reverse transcription-PCR on cell culture supernatants at 48 h and, indeed, recovered the strain.
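As a worked illustration of how a sensitivity figure such as the 95.4% above is computed, the following sketch applies the standard definition; the counts are hypothetical, chosen only to reproduce a similar value.

```python
# Sensitivity = true positives / (true positives + false negatives).
# The counts below are hypothetical, chosen only to reproduce a value
# close to the 95.4% reported above.
true_positives = 104
false_negatives = 5

sensitivity = true_positives / (true_positives + false_negatives)
print(f"Sensitivity: {100 * sensitivity:.1f}%")  # 95.4%
```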
Abstract:
OBJECTIVE To describe what is, to our knowledge, the first nosocomial outbreak of infection with pan-drug-resistant (including colistin-resistant) Acinetobacter baumannii, to determine the risk factors associated with such infections, and to determine their clinical impact. DESIGN Nested case-control cohort study and a clinical-microbiological study. SETTING A 1,521-bed tertiary care university hospital in Seville, Spain. PATIENTS Case patients were inpatients who had a pan-drug-resistant A. baumannii isolate recovered from a clinical or surveillance sample obtained at least 48 hours after admission to an intensive care unit (ICU) during the time of the epidemic outbreak. Control patients were patients who were admitted to any of the "boxes" (ie, rooms that partition off a distinct area for a patient's bed and the equipment needed to care for the patient) of an ICU for at least 48 hours during the time of the epidemic outbreak. RESULTS All the clinical isolates had similar antibiotic susceptibility patterns (ie, they were resistant to all the antibiotics tested, including colistin), and, on the basis of repetitive extragenic palindromic polymerase chain reaction (REP-PCR), all of them were determined to be of the same clone. Previous use of quinolones and glycopeptides and an ICU stay were associated with the acquisition of infection or colonization with pan-drug-resistant A. baumannii. To control this outbreak, we implemented a multicomponent intervention program comprising environmental decontamination of the ICUs involved, an environmental survey, a revision of cleaning protocols, active surveillance for colonization with pan-drug-resistant A. baumannii, educational programs for the staff, and the display of posters illustrating contact isolation measures and antimicrobial use recommendations. CONCLUSIONS We were not able to identify a common source for these cases of infection, but the adopted measures proved effective at controlling the outbreak.
Abstract:
The management of patients scheduled for surgery who have a coronary stent and are receiving 1 or more antiplatelet drugs remains controversial. Premature discontinuation of antiplatelet drugs substantially increases the risk of stent thrombosis (ST), myocardial infarction, and cardiac death, while surgery under altered platelet function carries an increased risk of bleeding in the perioperative period. Because the recommendations conflict, this article reviews the current antiplatelet protocols after coronary stent placement, the evidence for an increased risk of ST associated with the withdrawal of antiplatelet drugs and for an increased bleeding risk associated with their maintenance, the different perioperative antiplatelet protocols when patients are scheduled for surgery or need an urgent operation, and the therapeutic options if excessive bleeding occurs.
Assessment of drug-induced hepatotoxicity in clinical practice: a challenge for gastroenterologists.
Abstract:
Currently, pharmaceutical preparations are serious contributors to liver disease, with hepatotoxicity ranking as the most frequent cause of acute liver failure and of post-marketing regulatory decisions. The diagnosis of hepatotoxicity remains a difficult task because of the lack of reliable markers for use in general clinical practice. Incriminating a given drug in an episode of liver dysfunction is a step-by-step process that requires a high degree of suspicion, a compatible chronology, awareness of the drug's hepatotoxic potential, the exclusion of alternative causes of liver damage, and the ability to detect subtle data that favor a toxic etiology. This process is time-consuming and the final result is frequently inaccurate. Diagnostic algorithms may add consistency to the diagnostic process by translating the suspicion into a quantitative score. Such scales are useful because they provide a framework that emphasizes the features meriting attention in cases of suspected hepatic adverse reactions. Current efforts to collect bona fide cases of drug-induced hepatotoxicity will make refinements of existing scales feasible: it is now relatively easy to accommodate relevant data within the scoring system and to delete low-impact items. Efforts should also be directed toward the development of an abridged instrument for evaluating suspected drug-induced hepatotoxicity at the very beginning of the diagnosis and treatment process, when clinical decisions need to be made. Such an instrument would enable a confident diagnosis to be made on admission of the patient and treatment to be fine-tuned as further information is collected.
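To make concrete how such scales translate suspicion into a quantitative score, here is a deliberately simplified sketch of a causality-scoring function. The items, weights, and category thresholds below are illustrative inventions, not the actual CIOMS/RUCAM or any other published instrument.

```python
# Deliberately simplified causality-score sketch in the spirit of the
# diagnostic scales discussed above. Items, weights, and thresholds
# are illustrative inventions, NOT a published instrument.

def causality_score(compatible_chronology, alternatives_excluded,
                    known_hepatotoxin, positive_rechallenge):
    score = 0
    score += 2 if compatible_chronology else -1
    score += 2 if alternatives_excluded else -2
    score += 1 if known_hepatotoxin else 0
    score += 3 if positive_rechallenge else 0
    return score

def category(score):
    if score >= 6:
        return "highly probable"
    if score >= 3:
        return "probable"
    if score >= 1:
        return "possible"
    return "unlikely"

s = causality_score(True, True, True, False)
print(s, category(s))  # 5 probable
```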
Abstract:
Antibiotics used by general practitioners frequently appear in adverse-event reports of drug-induced hepatotoxicity. Most cases are idiosyncratic (the adverse reaction cannot be predicted from the drug's pharmacological profile or from pre-clinical toxicology tests) and occur via an immunological reaction or in response to the presence of hepatotoxic metabolites. With the exception of trovafloxacin and telithromycin (now severely restricted), the crude incidence of hepatotoxicity remains low overall but variable. Thus, amoxicillin/clavulanate and co-trimoxazole, as well as flucloxacillin, cause hepatotoxic reactions at rates that make them visible in general practice (cases are often isolated, may have a delayed onset, sometimes appear only after cessation of therapy, and can produce an array of hepatic lesions that mirror hepatobiliary disease, often making causality difficult to establish). Conversely, hepatotoxic reactions related to macrolides, tetracyclines and fluoroquinolones (in that order, from high to low) are much rarer and are identifiable only through large-scale studies or worldwide pharmacovigilance reporting. For antibiotics specifically used for tuberculosis, adverse effects range from asymptomatic increases in liver enzymes to acute hepatitis and fulminant hepatic failure; yet it is difficult to single out individual drugs, as treatment always involves drug combinations. Patients at risk are mainly those with a previous hepatotoxic reaction to antibiotics, the elderly, and those with impaired hepatic function in the absence of close monitoring, making it important to carefully balance potential risks against expected benefits in primary care. Pharmacogenetic testing using the new genome-wide association study approach holds promise for a better understanding of the mechanism(s) underlying hepatotoxicity.
Abstract:
In this paper we discuss the consensus view on the use of qualifying biomarkers in drug safety, reached within the framework of the XXIV meeting of the Spanish Society of Clinical Pharmacology held in Málaga (Spain) in October 2011. The widespread use of biomarkers as surrogate endpoints is a goal that scientists have long pursued. Thirty years ago, when molecular pharmacogenomics emerged, we anticipated that these genetic biomarkers would soon obviate the routine use of drug therapies in which the patient must adapt to the therapy rather than the other way around. This expected revolution in routine clinical practice never took place as quickly or as intensely as initially expected. The concerted action of operating multicenter networks holds great promise for future studies to identify biomarkers related to drug toxicity and to provide better insight into the underlying pathogenesis. Today some pharmacogenomic advances are already widely accepted, but pharmacogenomics still needs further development to elaborate more precise algorithms, and many barriers to implementing individualized medicine remain. We briefly discuss our view of these barriers and provide suggestions and areas of focus for advancing the field.
Abstract:
Nucleic acid amplification techniques are currently in common use to diagnose viral diseases and to manage patients with such illnesses. These techniques have followed a rapid but unconventional route of development over the last 30 years, with the discovery and introduction of several assays into clinical diagnosis. The increase in the number of commercially available methods has facilitated the use of this technology in the majority of laboratories worldwide, and it has reduced the use of other techniques, such as viral culture-based methods and serological assays, in the clinical virology laboratory. Moreover, nucleic acid amplification techniques are now the reference methods, and also the most useful assays, for the diagnosis of several diseases. The introduction of these techniques and their automation provides new opportunities for the clinical laboratory to affect patient care. The main objective in performing nucleic acid tests in this field is to provide timely results useful for high-quality patient care at a reasonable cost, because rapid results are associated with improvements in patient care. The use of amplification techniques such as polymerase chain reaction, real-time polymerase chain reaction or nucleic acid sequence-based amplification for virus detection, genotyping and quantification offers advantages such as high sensitivity and reproducibility, as well as a broad dynamic range. This review provides an update on the main nucleic acid amplification techniques and their clinical applications, and on the special challenges and opportunities that these techniques currently present for the clinical virology laboratory.
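Since viral quantification by real-time PCR rests on a simple piece of arithmetic, a short sketch may help: the cycle threshold (Ct) is approximately linear in the log10 of input copies, so unknowns can be read off a line fitted to serial-dilution standards. The Ct values and copy numbers below are hypothetical.

```python
# Sketch of real-time PCR quantification via a standard curve:
# Ct is (roughly) linear in log10(input copies), so unknown samples
# are quantified from a line fitted to serial-dilution standards.
# All values below are hypothetical.
import numpy as np

log10_copies = np.array([7, 6, 5, 4, 3])         # standards: 10^7 .. 10^3 copies
ct_values    = np.array([15.1, 18.5, 21.9, 25.2, 28.6])

slope, intercept = np.polyfit(log10_copies, ct_values, 1)
efficiency = 10 ** (-1 / slope) - 1              # ~1.0 means 100% efficient

def quantify(ct):
    """Estimate input copies for an unknown sample from its Ct."""
    return 10 ** ((ct - intercept) / slope)

print(f"Amplification efficiency: {efficiency:.2f}")
print(f"Unknown at Ct 23.4: {quantify(23.4):.2e} copies")
```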
Abstract:
Despite the stringent requirements for drug development imposed by regulatory agencies, drug-induced liver injury (DILI) is an increasing health problem and a significant cause of failure to approve drugs, market withdrawal of commercialized medications, and the adoption of regulatory measures. The pathogenesis is as yet undefined, though the rare occurrence of idiosyncratic DILI (1/100,000–1/10,000) and the fact that hepatotoxicity often recurs after re-exposure to the culprit drug under different environmental conditions strongly point toward a major role for genetic variation in the underlying mechanism and susceptibility. Pharmacogenetic studies in DILI have to a large extent focused on genes involved in drug metabolism, as polymorphisms in these genes may generate increased plasma drug concentrations and lower clearance rates under standard medication doses. A range of studies have identified a number of genetic variants in drug-metabolism Phase I, II, and III genes, including cytochrome P450 (CYP) 2E1, N-acetyltransferase 2 (NAT2), UDP-glucuronosyltransferase 2B7, glutathione S-transferase M1/T1 (GSTM1/GSTT1), ABCB11, and ABCC2, that enhance DILI susceptibility (Andrade et al., 2009; Agundez et al., 2011). Several metabolic gene variants, such as CYP2E1 c1 and NAT2 slow-acetylator alleles, have been associated with DILI induced by specific drugs, based on information about the metabolism of the individual drug. Others, such as the GSTM1 and GSTT1 null alleles, have been associated with an enhanced risk of DILI induced by a large range of drugs; hence, these variants appear to have a more general role in DILI susceptibility, owing to their role in reducing the cell's antioxidative capacity (Lucena et al., 2008). Mitochondrial superoxide dismutase (SOD2) and glutathione peroxidase 1 (GPX1) are two additional enzymes involved in combating oxidative stress, with specific genetic variants shown to enhance the risk of developing DILI.
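Associations like those cited above are typically quantified in case-control designs as an odds ratio with a 95% confidence interval computed from a 2x2 table. A minimal sketch follows; all counts are hypothetical and do not come from the cited studies.

```python
# Hedged sketch of how pharmacogenetic case-control associations are
# typically quantified: an odds ratio with a 95% confidence interval
# from a 2x2 table. All counts are hypothetical.
import math

a, b = 60, 40   # cases:    variant carriers, non-carriers
c, d = 45, 55   # controls: variant carriers, non-carriers

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```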
Abstract:
BACKGROUND Persistence of anti-tumor necrosis factor (TNF) therapy in rheumatoid arthritis (RA) is an overall marker of treatment success. OBJECTIVE To assess the survival of anti-TNF treatment and to define potential predictors of drug discontinuation in RA, in order to verify the adequacy of current practices. DESIGN An observational, descriptive, longitudinal, retrospective study. SETTING The Hospital Clínico Universitario de Valladolid, Valladolid, Spain. PATIENTS RA patients treated with anti-TNF therapy between January 2011 and January 2012. MEASUREMENTS Demographic information and therapy assessments were gathered from medical and pharmaceutical records. Data are expressed as means (standard deviations) for quantitative variables and as frequency distributions for qualitative variables. Kaplan-Meier survival analysis was used to assess persistence, and Cox multivariate regression models were used to assess potential predictors of treatment discontinuation. RESULTS In total, 126 treatment series with infliximab (n = 53), etanercept (n = 51) or adalimumab (n = 22) were administered to 91 patients. Infliximab was mostly used as a first-line treatment, but it was the drug with the shortest time until a change of treatment. Significant predictors of drug survival were age, the anti-TNF agent, and the previous response to an anti-TNF drug. LIMITATION The small sample size. CONCLUSION The overall efficacy of anti-TNF drugs diminishes with time, with infliximab having the shortest time until a change of treatment. The management of biologic therapy in patients with RA should be reconsidered in order to achieve disease control with a reduction in costs.
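For readers who want to reproduce this kind of analysis, here is a minimal sketch of Kaplan-Meier persistence estimation and a Cox model using the lifelines Python library. The data frame and column names (months, discontinued, age, prior_anti_tnf) are hypothetical stand-ins, not the study's dataset.

```python
# Sketch of the survival analyses described above using the lifelines
# library. The data and column names are hypothetical stand-ins for
# per-series follow-up time (months), a discontinuation indicator
# (1 = stopped, 0 = censored), and candidate predictors.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "months":         [6, 14, 24, 3, 30, 18, 9, 27],
    "discontinued":   [1,  1,  0, 1,  0,  1, 1,  1],
    "age":            [55, 62, 48, 70, 51, 66, 59, 44],
    "prior_anti_tnf": [0,  1,  1, 1,  0,  1, 0,  0],
})

# Kaplan-Meier estimate of overall drug persistence
kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["discontinued"])
print(kmf.median_survival_time_)

# Cox proportional-hazards model for predictors of discontinuation
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="discontinued")
cph.print_summary()
```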
Abstract:
Nonimmediate drug hypersensitivity reactions (DHRs) are difficult to manage in daily clinical practice, mainly owing to their heterogeneous clinical manifestations and the lack of selective biological markers. In vitro methods are necessary to establish a diagnosis, especially given the low sensitivity of skin tests and the inherent risks of drug provocation testing. In vitro evaluation of nonimmediate DHRs must include approaches that can be applied during the different phases of the reaction. During the acute phase, monitoring markers in both skin and peripheral blood helps to discriminate between immediate and nonimmediate DHRs with cutaneous responses and to distinguish between reactions that, although they present with similar clinical symptoms, are produced by different immunological mechanisms and therefore require different treatment and carry a different prognosis. During the resolution phase, in vitro testing is used to detect the response of T cells to drug stimulation; however, this approach has certain limitations, such as the lack of validated studies assessing sensitivity. Moreover, a positive in vitro test indicates an immune response that is not always related to a DHR. In this review, members of the Immunology and Drug Allergy Committee of the Spanish Society of Allergy and Clinical Immunology (SEAIC) provide an overview of the most widely used in vitro tests for evaluating nonimmediate DHRs.