17 results for pre-symptomatic testing
Abstract:
OBJECTIVE: To explore the potential of deep HIV-1 sequencing for adding clinically relevant information relative to viral population sequencing in heavily pre-treated HIV-1-infected subjects. METHODS: In a proof-of-concept study, deep sequencing was compared to population sequencing in HIV-1-infected individuals with previous triple-class virological failure who also developed virological failure to deep salvage therapy including at least one of darunavir, tipranavir, etravirine or raltegravir. Viral susceptibility was inferred before salvage therapy initiation and at virological failure using deep and population sequencing genotypes interpreted with the HIVdb, Rega and ANRS algorithms. The threshold level for mutant detection with deep sequencing was 1%. RESULTS: Seven subjects with previous exposure to a median of 15 antiretrovirals over a median of 13 years were included. Deep salvage therapy included darunavir, tipranavir, etravirine or raltegravir in 4, 2, 2 and 5 subjects, respectively. Self-reported treatment adherence was adequate in 4 subjects and partial in 2; one individual interrupted treatment during follow-up. Deep sequencing detected all mutations found by population sequencing and identified additional resistance mutations in all but one individual, predominantly after virological failure of the deep salvage therapy. The additional genotypic information led to consistent decreases in predicted susceptibility to etravirine, efavirenz, nucleoside reverse transcriptase inhibitors and indinavir in 2, 1, 2 and 1 subject, respectively. Deep sequencing data did not consistently modify the susceptibility predictions obtained with population sequencing for darunavir, tipranavir or raltegravir. CONCLUSIONS: In this subset of heavily pre-treated individuals, deep sequencing improved the assessment of genotypic resistance to etravirine but did not consistently provide additional information on darunavir, tipranavir or raltegravir susceptibility. These data may inform the design of future studies addressing the clinical value of minority drug-resistant variants in treatment-experienced subjects.
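As a rough illustration of the 1% detection threshold described above, the following Python sketch flags minority resistance mutations from deep-sequencing read counts and notes which ones a population (Sanger) genotype would likely miss. The read counts, mutation names and the ~20% Sanger detection limit are illustrative assumptions, not data from the study.

```python
# Sketch: flagging minority drug-resistance variants from deep-sequencing
# read counts. The 1% threshold comes from the study; the read counts and
# the ~20% Sanger detection limit are illustrative assumptions.

DEEP_THRESHOLD = 0.01     # mutant detection limit used for deep sequencing
SANGER_THRESHOLD = 0.20   # commonly cited detection limit of population sequencing

# Hypothetical (mutant reads, total reads) at resistance-associated positions.
reads = {
    "RT:Y181C": (140, 10000),
    "RT:K103N": (2600, 9400),
    "PR:V32I":  (55, 11000),
}

for mutation, (mutant, total) in reads.items():
    freq = mutant / total
    if freq >= DEEP_THRESHOLD:
        seen_by_sanger = freq >= SANGER_THRESHOLD
        print(f"{mutation}: {freq:.1%} "
              f"({'also visible' if seen_by_sanger else 'likely missed'} "
              f"by population sequencing)")
```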
Abstract:
Objective: To examine the association between pre-diagnostic circulating vitamin D concentration, dietary intake of vitamin D and calcium, and the risk of colorectal cancer in European populations. Design: Nested case-control study. Setting: The study was conducted within the EPIC study, a cohort of more than 520,000 participants from 10 western European countries. Participants: 1248 cases of incident colorectal cancer, which developed after enrolment into the cohort, were matched to 1248 controls. Main outcome measures: Circulating vitamin D concentration (25-hydroxyvitamin D, 25-(OH)D) was measured by enzyme immunoassay. Dietary and lifestyle data were obtained from questionnaires. Incidence rate ratios and 95% confidence intervals for the risk of colorectal cancer by 25-(OH)D concentration and by levels of dietary calcium and vitamin D intake were estimated from multivariate conditional logistic regression models, with adjustment for potential dietary and other confounders. Results: 25-(OH)D concentration showed a strong inverse linear dose-response association with risk of colorectal cancer (P for trend <0.001). Compared with a pre-defined mid-level concentration of 25-(OH)D (50.0-75.0 nmol/l), lower levels were associated with higher colorectal cancer risk (<25.0 nmol/l: incidence rate ratio 1.32 (95% confidence interval 0.87 to 2.01); 25.0-49.9 nmol/l: 1.28 (1.05 to 1.56)), and higher concentrations were associated with lower risk (75.0-99.9 nmol/l: 0.88 (0.68 to 1.13); ≥100.0 nmol/l: 0.77 (0.56 to 1.06)). In analyses by quintile of 25-(OH)D concentration, participants in the highest quintile had a 40% lower risk of colorectal cancer than those in the lowest quintile (P<0.001). Subgroup analyses showed a strong association for colon but not rectal cancer (P for heterogeneity=0.048). Greater dietary intake of calcium was associated with a lower colorectal cancer risk, whereas dietary vitamin D was not associated with disease risk. Findings did not vary by sex and were not altered by correction for season or month of blood donation. Conclusions: The results of this large observational study indicate a strong inverse association between pre-diagnostic 25-(OH)D concentration and risk of colorectal cancer in western European populations. Randomised trials are needed to assess whether increasing circulating 25-(OH)D concentration can effectively decrease the risk of colorectal cancer.
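The matched design above is typically analysed with conditional logistic regression stratified on the matched sets. Below is a minimal sketch of such a model using the ConditionalLogit class from statsmodels; the data are synthetic and the variable names are assumptions, not the EPIC dataset.

```python
# Sketch: conditional logistic regression for a 1:1 matched nested
# case-control study, as used to estimate incidence rate ratios by
# 25-(OH)D category. Data are synthetic; this is not the EPIC dataset.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(0)
n_pairs = 500

df = pd.DataFrame({
    "pair": np.repeat(np.arange(n_pairs), 2),       # matched set identifier
    "case": np.tile([1, 0], n_pairs),               # 1 = case, 0 = control
    "low_25ohd": rng.integers(0, 2, 2 * n_pairs),   # below 50 nmol/l (yes/no)
    "calcium": rng.normal(1000, 250, 2 * n_pairs),  # dietary calcium, mg/day
})

# No intercept: it is absorbed into the matched-set strata.
model = ConditionalLogit(
    df["case"],
    df[["low_25ohd", "calcium"]],
    groups=df["pair"],
)
result = model.fit()

# With incidence density sampling, the odds ratios approximate
# incidence rate ratios.
print(np.exp(result.params))
```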
Abstract:
BACKGROUND: Epidermal growth factor receptor (EGFR) and its downstream factors KRAS and BRAF are mutated in several types of cancer, affecting the clinical response to EGFR inhibitors. Mutations in the EGFR kinase domain predict sensitivity to the tyrosine kinase inhibitors gefitinib and erlotinib in lung adenocarcinoma, while activating point mutations in KRAS and BRAF confer resistance to the anti-EGFR monoclonal antibody cetuximab in colorectal cancer. The development of new-generation methods for systematic mutation screening of these genes will allow more appropriate therapeutic choices. METHODS: We describe a high-resolution melting (HRM) assay for mutation detection in EGFR exons 19-21, KRAS codons 12/13 and BRAF V600 using formalin-fixed paraffin-embedded samples. Somatic variation of KRAS exon 2 was also analysed by massively parallel pyrosequencing of amplicons on the GS Junior 454 platform. RESULTS: We tested 120 routine diagnostic specimens from patients with colorectal or lung cancer. Mutations in KRAS, BRAF and EGFR were observed in 41.9%, 13.0% and 11.1% of the overall samples, respectively, and were mutually exclusive. For KRAS, six types of substitutions were detected (17 G12D, 9 G13D, 7 G12C, 2 G12A, 2 G12V, 2 G12S), while V600E accounted for all the BRAF activating mutations. Regarding EGFR, two cases showed exon 19 deletions (delE746-A750 and delE746-T751insA) and another two carried substitutions in exon 21 (one showed L858R together with the resistance mutation T790M in exon 20, and the other carried the P848L mutation). Consistent with earlier reports, our results show that KRAS and BRAF mutation frequencies in colorectal cancer were 44.3% and 13.0%, respectively, while EGFR mutations were detected in 11.1% of the lung cancer specimens. Ultra-deep amplicon pyrosequencing successfully validated the HRM results and allowed detection and quantitation of KRAS somatic mutations. CONCLUSIONS: HRM is a rapid and sensitive method for moderate-throughput, cost-effective screening of oncogene mutations in clinical samples. Compared with Sanger sequencing for validation, next-generation sequencing yields more accurate quantification of somatic variants and can be performed at higher throughput.
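Variant quantitation from amplicon pyrosequencing, as described above, amounts to counting reads that carry each codon 12/13 substitution. A minimal sketch follows; the read data and counts are invented for illustration, and real pipelines would align reads and model sequencing error.

```python
# Sketch: quantifying KRAS codon 12/13 somatic mutations from amplicon
# reads. Read data are illustrative assumptions.
from collections import Counter

WILD_TYPE = "GGTGGC"  # KRAS codons 12-13 (Gly-Gly)

CODON_CALLS = {
    "GATGGC": "G12D",  # GGT -> GAT at codon 12
    "GGTGAC": "G13D",  # GGC -> GAC at codon 13
    "TGTGGC": "G12C",  # GGT -> TGT at codon 12
}

# Hypothetical codon 12/13 sequences extracted from aligned reads.
reads = ["GGTGGC"] * 880 + ["GATGGC"] * 110 + ["TGTGGC"] * 10

counts = Counter(reads)
total = sum(counts.values())
for seq, n in counts.most_common():
    call = CODON_CALLS.get(seq, "wild type" if seq == WILD_TYPE else "other")
    print(f"{call}: {n} reads ({n / total:.1%})")
```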
Abstract:
Introduction: Testing for HIV tropism is recommended before prescribing a chemokine receptor blocker. To date, in most European countries HIV tropism is determined using a phenotypic test. Recently, new data have emerged supporting the use of genotypic HIV V3-loop sequence analysis as the basis for tropism determination. The European guidelines group on clinical management of HIV-1 tropism testing was established to make recommendations to clinicians and virologists. Methods: We searched online databases for articles published from January 2006 to March 2010 containing the terms tropism, CCR5-antagonist, CCR5 antagonist, maraviroc or vicriviroc. Additional articles and/or conference abstracts were identified by hand searching. This strategy identified 712 potential articles and 1240 abstracts. All were reviewed, and 57 papers and 42 abstracts were finally included and used by the panel to reach a consensus statement. Results: The panel recommends HIV tropism testing for the following indications: i) drug-naïve patients in whom toxicity or limited therapeutic options are foreseen; ii) patients experiencing therapy failure, whenever a treatment change is considered. Both the phenotypic enhanced sensitivity Trofile assay (ESTA) and genotypic population sequencing of the V3 loop are recommended for use in clinical practice. Although the panel does not recommend one methodology over another, it is anticipated that genotypic testing will be used more frequently because of its greater accessibility, lower cost and shorter turnaround time. The panel also provides guidance on technical aspects and interpretation issues. If genotypic methods are used, triplicate PCR amplification and sequencing is advised, with the geno2pheno (G2P) interpretation tool (clonal model) at a false-positive rate (FPR) of 10%. If the viral load is below the level of reliable amplification, proviral DNA can be used, and the panel recommends performing triplicate testing with an FPR of 10%. If genotypic DNA testing is not performed in triplicate, the FPR should be increased to 20%. Conclusions: The European guidelines on clinical management of HIV-1 tropism testing provide an overview of the current literature, evidence-based recommendations for the clinical use of tropism testing, and expert guidance on unresolved issues and current developments. Current data support the use of both genotypic population sequencing and ESTA for co-receptor tropism determination. For practical reasons, genotypic population sequencing is the preferred method in Europe.
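The FPR cut-offs in the recommendations above lend themselves to a simple decision function. The sketch below encodes the thresholds quoted in the abstract (10% for triplicate RNA or DNA testing, 20% for a single DNA test); the rule that a single replicate below the cut-off classifies the virus as non-R5 is a common interpretation and is an assumption here, not a quotation from the guidelines.

```python
# Sketch of the geno2pheno (G2P) FPR cut-offs quoted in the abstract.
# Assumption: any replicate with FPR below the cut-off classifies the
# virus as non-R5 (X4-capable), arguing against a CCR5 antagonist.

def tropism_call(fprs, material="RNA", triplicate=True):
    """fprs: list of G2P false-positive rates (%), one per replicate."""
    if material == "DNA" and not triplicate:
        cutoff = 20.0   # single proviral DNA test
    else:
        cutoff = 10.0   # triplicate RNA or triplicate DNA testing
    return "non-R5" if any(f < cutoff for f in fprs) else "R5"

print(tropism_call([12.3, 45.0, 30.1]))                        # R5
print(tropism_call([8.9, 45.0, 30.1]))                         # non-R5
print(tropism_call([15.0], material="DNA", triplicate=False))  # non-R5
```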
Abstract:
The agar dilution, broth microdilution, and disk diffusion methods were compared to determine the in vitro susceptibility to fosfomycin of 428 extended-spectrum-beta-lactamase (ESBL)-producing Escherichia coli and Klebsiella pneumoniae isolates. Fosfomycin showed very high activity against all ESBL-producing strains. Excellent agreement between the three susceptibility methods was found for E. coli, whereas marked discrepancies were observed for K. pneumoniae.
Abstract:
Hematogones are normal B-lymphoid precursors that proliferate in the bone marrow of young children and of adults with iron-deficiency anaemia, neuroblastoma or idiopathic thrombocytopenic purpura. They are not normally found in peripheral blood, and their immunophenotype is virtually indistinguishable from that of B lymphoblasts. We discuss the case of a 3-month-old infant with an active cytomegalovirus infection, with hepatitis and pancytopenia associated with 13% hematogones in the bone marrow.
Abstract:
Influenza surveillance networks must detect, at an early stage, the viruses that will cause the forthcoming annual epidemics, and must isolate the strains for further characterization. We obtained the highest sensitivity (95.4%) with a diagnostic approach that combined a shell-vial assay with reverse transcription-PCR on cell culture supernatants at 48 h, while also recovering the strain.
Abstract:
Aim: The aim of the study was to investigate the influence of dietary intake of a commercial hydrolyzed collagen (Gelatine Royal®) on bone remodeling in pre-pubertal children. Methods: A randomized double-blind study was carried out in 60 children (9.42 ± 1.31 years) divided into three groups according to the amount of partially hydrolyzed collagen taken daily for 4 months: placebo (G-I, n = 18), collagen (G-II, n = 20) and collagen + calcium (G-III, n = 22). The following biochemical markers were analysed: total and bone alkaline phosphatase (tALP and bALP), osteocalcin, tartrate-resistant acid phosphatase (TRAP), type I collagen carboxy-terminal telopeptide, lipids, calcium, 25-hydroxyvitamin D, insulin-like growth factor 1 (IGF-1), thyroid-stimulating hormone, free thyroxine and intact parathyroid hormone. Results: There was a significantly greater increase in serum IGF-1 in G-III than in G-II (p < 0.01) or G-I (p < 0.05) during the study period, and a significantly greater increase in plasma tALP in G-III than in G-I (p < 0.05). The change in serum bALP differed significantly (p < 0.05) between G-II (increase) and G-I (decrease), and the change in plasma TRAP differed significantly between G-II and G-I (p < 0.01) and between G-III and G-II (p < 0.05). Conclusion: Daily dietary intake of hydrolyzed collagen seems to have a potential role in enhancing bone remodeling at key stages of growth and development.
Abstract:
Background: Mortality from invasive meningococcal disease (IMD) has remained stable over the last thirty years, and it is unclear whether pre-hospital antibiotic therapy actually decreases this mortality. Our aim was to examine whether pre-hospital oral antibiotic therapy reduces mortality from IMD, adjusting for indication bias. Methods: A retrospective analysis was made of the clinical records of all patients (n = 848) diagnosed with IMD from 1995 to 2000 in Andalusia and the Canary Islands, Spain, and of the relationship between the use of pre-hospital oral antibiotic therapy and mortality. Indication bias was controlled for by the propensity score technique: a multivariate analysis was performed to determine the probability of each patient receiving antibiotics, according to the symptoms identified before admission. Data on in-hospital death, use of antibiotics and demographic variables were collected. A logistic regression analysis was then carried out using death as the dependent variable, and pre-hospital antibiotic use, age, time from onset of symptoms to parenteral antibiotics and the propensity score as independent variables. Results: Data were recorded on 848 patients, 49 (5.72%) of whom died. Of the total number of patients, 226 had received oral antibiotics, mainly beta-lactams, during the 48 hours before admission. After the association between antibiotic use and death was adjusted for age, time between onset of symptoms and in-hospital antibiotic treatment, and the propensity score, pre-hospital oral antibiotic therapy remained a significant protective factor (odds ratio for death 0.37, 95% confidence interval 0.15-0.93). Conclusion: Pre-hospital oral antibiotic therapy appears to reduce IMD mortality.
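The propensity-score adjustment described above can be sketched in a few lines: one logistic model for the probability of receiving pre-hospital antibiotics given pre-admission symptoms, and a second logistic model for death that includes the fitted score. The data below are synthetic, and the covariates are stand-ins for the symptom variables used in the study.

```python
# Sketch: controlling indication bias with a propensity score, as in the
# analysis described above. All data are synthetic; covariates stand in
# for the pre-admission symptom variables of the study.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 848

df = pd.DataFrame({
    "fever": rng.integers(0, 2, n),
    "rash": rng.integers(0, 2, n),
    "age": rng.uniform(0, 80, n),
    "antibiotic": rng.integers(0, 2, n),   # pre-hospital oral antibiotics
    "death": rng.integers(0, 2, n),        # in-hospital death
})

# Step 1: propensity of receiving antibiotics given observed symptoms.
ps_model = sm.Logit(df["antibiotic"],
                    sm.add_constant(df[["fever", "rash", "age"]])).fit(disp=0)
df["propensity"] = ps_model.predict()

# Step 2: outcome model for death, adjusted for the propensity score.
out_model = sm.Logit(df["death"],
                     sm.add_constant(df[["antibiotic", "age", "propensity"]])).fit(disp=0)
print(np.exp(out_model.params["antibiotic"]))  # adjusted odds ratio for death
```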
Abstract:
Antibiotics used by general practitioners frequently appear in adverse-event reports of drug-induced hepatotoxicity. Most cases are idiosyncratic (the adverse reaction cannot be predicted from the drug's pharmacological profile or from pre-clinical toxicology tests) and occur via an immunological reaction or in response to the presence of hepatotoxic metabolites. With the exception of trovafloxacin and telithromycin (now severely restricted), the crude incidence of hepatotoxicity remains low overall but variable. Amoxicillin/clavulanate, co-trimoxazole and flucloxacillin cause hepatotoxic reactions at rates that make them visible in general practice; such cases are often isolated, may have a delayed onset, sometimes appear only after cessation of therapy, and can produce an array of hepatic lesions that mirror hepatobiliary disease, often making causality difficult to establish. Conversely, hepatotoxic reactions related to macrolides, tetracyclines and fluoroquinolones (in that order, from higher to lower frequency) are much rarer and are identifiable only through large-scale studies or worldwide pharmacovigilance reporting. For antibiotics specifically used for tuberculosis, adverse effects range from asymptomatic increases in liver enzymes to acute hepatitis and fulminant hepatic failure; it is difficult to single out individual drugs, however, as treatment always entails drug combinations. Patients at greatest risk are mainly those with a previous hepatotoxic reaction to antibiotics, the elderly, and those with impaired hepatic function who are not closely monitored, making it important to carefully balance potential risks against expected benefits in primary care. Pharmacogenetic testing using the new genome-wide association study approach holds promise for a better understanding of the mechanism(s) underlying hepatotoxicity.
Abstract:
The European Prospective Investigation into Cancer and Nutrition (EPIC) is a long-term, multicentre prospective study investigating the relationships between cancer and nutrition in Europe. The study has served as a basis for a number of genome-wide association studies (GWAS) and other types of genetic analyses. Over a period of 5 years, 52,256 EPIC DNA samples were extracted using an automated DNA extraction platform. Here we evaluate the pre-analytical factors affecting DNA yield, including anthropometric, epidemiological and technical factors such as centre of subject recruitment, age, gender, body mass index, disease case or control status, tobacco consumption, number of aliquots of buffy coat used for DNA extraction, extraction machine or procedure, DNA quantification method, degree of haemolysis, and variations in the timing of sample processing. We show that the largest significant variations in DNA yield were associated with degree of haemolysis and with centre of subject recruitment. Age, gender, body mass index, cancer case or control status and tobacco consumption also significantly affected DNA yield. Feedback from laboratories that have analysed this DNA with different SNP genotyping technologies demonstrates that the vast majority of samples (approximately 88%) performed adequately in different types of assays. To our knowledge, this study is the largest to date to evaluate the sources of pre-analytical variation in DNA extracted from peripheral leucocytes. The results provide a strong evidence-based rationale for standardized recommendations on blood collection and processing protocols for large-scale genetic studies.
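An analysis of pre-analytical effects like the one described above can be framed as a linear model of DNA yield on technical and epidemiological covariates. A minimal sketch with synthetic data follows; the variable names, units and coding are assumptions.

```python
# Sketch: modelling DNA yield as a function of pre-analytical factors,
# in the spirit of the evaluation described above. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1000

df = pd.DataFrame({
    "dna_yield": rng.gamma(5.0, 10.0, n),              # micrograms (assumed)
    "haemolysis": rng.choice(["none", "mild", "severe"], n),
    "centre": rng.choice(["A", "B", "C", "D"], n),
    "age": rng.uniform(35, 70, n),
    "sex": rng.choice(["F", "M"], n),
    "bmi": rng.normal(26, 4, n),
})

# Categorical factors (haemolysis grade, recruitment centre, sex) enter
# as dummy-coded terms; age and BMI enter as continuous covariates.
model = smf.ols(
    "dna_yield ~ C(haemolysis) + C(centre) + age + C(sex) + bmi", data=df
).fit()
print(model.summary())
```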
Abstract:
Nucleic acid amplification techniques are now commonly used to diagnose viral diseases and manage patients with such illnesses. These techniques have followed a rapid but unconventional route of development over the last 30 years, with the discovery and introduction of several assays into clinical diagnosis. The growing number of commercially available methods has facilitated the use of this technology in the majority of laboratories worldwide, and has reduced the use of other techniques in the clinical virology laboratory, such as viral culture-based methods and serological assays. Moreover, nucleic acid amplification techniques are now the reference methods, and also the most useful assays, for the diagnosis of several diseases. The introduction of these techniques and their automation provides new opportunities for the clinical laboratory to influence patient care. The main objective of performing nucleic acid tests in this field is to provide timely results useful for high-quality patient care at a reasonable cost, since rapid results are associated with improvements in patient care. The use of amplification techniques such as the polymerase chain reaction, real-time polymerase chain reaction and nucleic acid sequence-based amplification for virus detection, genotyping and quantification offers advantages such as high sensitivity and reproducibility, as well as a broad dynamic range. This review provides an update on the main nucleic acid techniques and their clinical applications, and on the particular challenges and opportunities that these techniques currently present for the clinical virology laboratory.
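Quantification by real-time PCR, one of the techniques reviewed above, typically rests on a standard curve relating threshold cycle (Ct) to log10 input copies. A small sketch of that calculation follows; the curve parameters are invented for illustration.

```python
# Sketch: quantifying a target from a real-time PCR standard curve.
# Ct = slope * log10(copies) + intercept; parameters below are invented.
import math

SLOPE = -3.32      # close to the theoretical value for 100% efficiency
INTERCEPT = 38.0   # Ct of a single input copy under these assumptions

def copies_from_ct(ct):
    """Invert the standard curve to estimate input copy number."""
    return 10 ** ((ct - INTERCEPT) / SLOPE)

efficiency = 10 ** (-1 / SLOPE) - 1   # amplification efficiency per cycle
print(f"Efficiency: {efficiency:.1%}")
print(f"Ct 25.0 -> {copies_from_ct(25.0):,.0f} copies/reaction")
```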
Abstract:
The accuracy of the MicroScan WalkAway, BD Phoenix, and Vitek-2 systems for susceptibility testing of quinolones and aminoglycosides against 68 enterobacteria containing qnrB, qnrS, and/or aac(6')-Ib-cr was evaluated against the reference microdilution method. Overall, 1 very major error (0.09%), 6 major errors (0.52%), and 45 minor errors (3.89%) were noted.
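The error categories quoted above follow the standard scheme for comparing an automated system against reference microdilution: a very major error is a false-susceptible call, a major error a false-resistant call, and a minor error any disagreement involving the intermediate category. A small sketch of that classification follows; the example results are invented.

```python
# Sketch: classifying discrepancies between an automated susceptibility
# system and reference microdilution into very major, major and minor
# errors. Categories: S(usceptible), I(ntermediate), R(esistant).
# The example calls are invented.

def error_category(system, reference):
    if system == reference:
        return "agreement"
    if system == "S" and reference == "R":
        return "very major"   # false susceptibility
    if system == "R" and reference == "S":
        return "major"        # false resistance
    return "minor"            # any disagreement involving 'I'

results = [("S", "S"), ("S", "R"), ("R", "S"), ("I", "R"), ("S", "I")]
for system, reference in results:
    print(system, reference, "->", error_category(system, reference))
```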
Abstract:
OBJECTIVE: To study the factors associated with choice of therapy and prognosis in octogenarians with severe symptomatic aortic stenosis (AS). STUDY DESIGN: Prospective, observational, multicenter registry. Centralized follow-up included survival status and, where possible, mode of death and Katz index. SETTING: Transnational registry in Spain. SUBJECTS: We included 928 patients aged ≥80 years with severe symptomatic AS. INTERVENTIONS: Aortic valve replacement (AVR), transcatheter aortic valve implantation (TAVI) or conservative therapy. MAIN OUTCOME MEASURES: All-cause death. RESULTS: Mean age was 84.2 ± 3.5 years, and only 49.0% of patients were independent (Katz index A). The most frequent planned management was conservative therapy, in 423 (46%) patients, followed by TAVI in 261 (28%) and AVR in 244 (26%). The main reasons against recommending AVR in the 684 patients managed otherwise were high surgical risk [322 (47.1%)], other medical reasons [193 (28.2%)], patient refusal [134 (19.6%)] and family refusal in the case of incompetent patients [35 (5.1%)]. The mean time from treatment decision to AVR was 4.8 ± 4.6 months and to TAVI 2.1 ± 3.2 months (P < 0.001). During follow-up (11.2-38.9 months), 357 patients (38.5%) died. Survival rates at 6, 12, 18 and 24 months were 81.8%, 72.6%, 64.1% and 57.3%, respectively. Planned intervention, adjusted for a multiple propensity score, was associated with lower mortality than planned conservative treatment: TAVI, hazard ratio (HR) 0.68 (95% confidence interval [CI] 0.49-0.93; P = 0.016), and AVR, HR 0.56 (95% CI 0.39-0.80; P = 0.002). CONCLUSION: Octogenarians with severe symptomatic AS are frequently managed conservatively, and planned conservative management is associated with a poor prognosis.
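The survival comparison above (planned TAVI or AVR versus conservative management, adjusted for a multiple propensity score) corresponds to a Cox proportional-hazards model. Below is a hedged sketch using the lifelines library with synthetic data; the variable names and the use of lifelines are assumptions, not the registry's actual analysis code.

```python
# Sketch: Cox model for all-cause death by planned management strategy,
# adjusted for a propensity score, as in the registry analysis above.
# Data are synthetic; 'lifelines' is one common choice of library.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 928

df = pd.DataFrame({
    "months": rng.exponential(24, n),         # follow-up time
    "death": rng.integers(0, 2, n),           # all-cause death indicator
    "tavi": rng.integers(0, 2, n),            # planned TAVI vs conservative
    "avr": 0,                                 # placeholder, filled in below
    "propensity": rng.uniform(0.1, 0.9, n),   # multiple propensity score
})
df["avr"] = ((df["tavi"] == 0) & (rng.random(n) < 0.5)).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="death")
print(cph.summary[["exp(coef)", "p"]])  # hazard ratios vs conservative arm
```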
Abstract:
Fragile X syndrome is the most common inherited form of intellectual disability. Here we report on a study, based on a collaborative registry involving 12 Spanish centres, of molecular diagnostic tests in 1105 fragile X families comprising 5062 individuals, of whom 1655 carried a full mutation or were mosaic, 3 had deletions, 1840 had a premutation, and 102 had intermediate alleles. Two patients with the full mutation also had Klinefelter syndrome. We used this registry to assess the risk of expansion from parents to children. For mothers with a premutation, the overall rate of allele expansion to a full mutation was 52.5%, and this rate was higher for male than for female offspring (63.6% versus 45.6%; P < 0.001). Furthermore, in mothers with intermediate alleles (45-54 repeats), there were 10 cases of expansion to a premutation allele, and for the smallest premutation alleles (55-59 repeats) there was a 6.4% risk of expansion to a full mutation, with 56 repeats being the smallest allele that expanded to a full mutation in a single meiosis. Hence, in our series the risk for alleles of <59 repeats was somewhat higher than in other published series. These findings are important for genetic counselling.
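The sex difference in expansion risk reported above (63.6% versus 45.6%, P < 0.001) is the kind of comparison a two-proportion test handles. A sketch follows with counts invented only to match those percentages approximately; the study's actual denominators are not given in the abstract.

```python
# Sketch: comparing full-mutation expansion rates in male vs female
# offspring of premutation mothers. Counts are invented to roughly
# match the reported 63.6% vs 45.6%; they are not the study's data.
from statsmodels.stats.proportion import proportions_ztest

expanded = [420, 342]   # offspring expanding to a full mutation (male, female)
total = [660, 750]      # offspring examined (male, female)

stat, pvalue = proportions_ztest(expanded, total)
print(f"male: {expanded[0]/total[0]:.1%}, female: {expanded[1]/total[1]:.1%}")
print(f"z = {stat:.2f}, P = {pvalue:.2g}")
```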