938 results for VERSUS-HOST-DISEASE
Abstract:
Chagas disease prevention remains mostly based on triatomine vector control to reduce or eliminate house infestation with these bugs. The level of adaptation of triatomines to human housing is a key part of vector competence and needs to be precisely evaluated to allow for the design of effective vector control strategies. In this review, we examine how the domiciliation/intrusion level of different triatomine species/populations has been defined and measured and discuss how these concepts may be improved for a better understanding of their ecology and evolution, as well as for the design of more effective control strategies against a large variety of triatomine species. We suggest that a major limitation of current criteria for classifying triatomines into sylvatic, intrusive, domiciliary and domestic species is that these are essentially qualitative and do not rely on quantitative variables measuring population sustainability and fitness in their different habitats. However, such assessments may be derived from further analysis and modelling of field data. Such approaches can shed new light on the domiciliation process of triatomines and may represent a key tool for decision-making and the design of vector control interventions.
Abstract:
Introduction: According to guidelines, patients with coronary artery disease (CAD) should undergo revascularization if myocardial ischemia is present. While coronary angiography (CXA) allows the morphological assessment of CAD, the fractional flow reserve (FFR) has proved to be a complementary invasive test to assess the functional significance of CAD, i.e., to detect ischemia. Perfusion cardiac magnetic resonance (CMR) has proven to be a robust non-invasive technique to assess myocardial ischemia. Objective: To compare the cost-effectiveness ratio, defined as the costs per patient correctly diagnosed, of two algorithms used to diagnose hemodynamically significant CAD in relation to the pretest likelihood of CAD: 1) CMR to assess ischemia before referring positive patients to CXA (CMR + CXA); 2) CXA in all patients combined with an FFR test in patients with angiographically positive stenoses (CXA + FFR). Methods: The costs, evaluated from the health care system perspective in the Swiss, German, United Kingdom (UK) and United States (US) contexts, included public prices of the different tests considered as outpatient procedures, complication costs and costs induced by diagnostic errors (false negatives). The effectiveness criterion was the ability to accurately identify a patient with significant CAD. Test performances used in the model were based on the clinical literature. Using a mathematical model, we compared the cost-effectiveness ratio of both algorithms for hypothetical patient cohorts with different pretest likelihoods of CAD. Results: The cost-effectiveness ratio decreased hyperbolically with increasing pretest likelihood of CAD for both strategies. CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 67% in Germany, 83% in the UK and 84% in the US, with costs of CHF 5'794, EUR 1'472, £ 2'685 and $ 2'126 per patient correctly diagnosed. Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. Implications for the health care system/professionals/patients/society: These results facilitate decision making for the clinical use of new generations of imaging procedures to detect ischemia. They show to what extent the cost-effectiveness of diagnosing CAD depends on the prevalence of the disease.
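As a worked illustration of this kind of comparison (not the authors' actual model), the sketch below computes the cost per patient correctly diagnosed for a serial two-test algorithm as a function of pretest likelihood. All sensitivities, specificities and test costs are hypothetical placeholders, and conditional independence of the two tests is assumed.

```python
# Minimal sketch of a cost-per-correct-diagnosis comparison for two serial
# testing algorithms. All sensitivities, specificities and costs below are
# hypothetical placeholders, not the values used in the study.

def cost_per_correct_diagnosis(prev, sens1, spec1, cost1, sens2, spec2, cost2):
    """Serial strategy: everyone gets test 1; only test-1 positives get test 2.
    Conditional independence of the two tests is assumed."""
    p_pos1 = prev * sens1 + (1 - prev) * (1 - spec1)   # fraction referred to test 2
    sens = sens1 * sens2                               # disease called only if both tests positive
    spec = 1 - (1 - spec1) * (1 - spec2)
    mean_cost = cost1 + p_pos1 * cost2                 # expected cost per patient
    correct = prev * sens + (1 - prev) * spec          # fraction correctly diagnosed
    return mean_cost / correct

for prev in (0.2, 0.4, 0.6, 0.8):                      # pretest likelihood of CAD
    cmr_cxa = cost_per_correct_diagnosis(prev, 0.89, 0.76, 800, 0.92, 0.90, 1200)
    cxa_ffr = cost_per_correct_diagnosis(prev, 0.92, 0.90, 1200, 0.90, 0.90, 700)
    print(f"pretest {prev:.0%}: CMR+CXA {cmr_cxa:7.0f}  CXA+FFR {cxa_ffr:7.0f}")
```

With cheaper first-line imaging, the ratio for CMR + CXA falls below that of CXA + FFR at low pretest likelihoods, which mirrors the direction of the reported thresholds; the exact crossover depends entirely on the input values chosen.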
Abstract:
Purpose: To compare entero-MDCT with entero-MRI performed for suspected acute exacerbation of known Crohn's disease. Methods and Materials: Fifty-seven patients (mean age 33.5 years) with histologically proven Crohn's disease were prospectively included. They presented to the emergency department with clinical symptoms suggesting acute exacerbation. After oral administration of 1-2 l of 5% methylcellulose (plus syrup), entero-MDCT and entero-MRI were performed on each patient (mean delay 1 day). Three experienced radiologists blindly and independently evaluated each examination for technical quality, eight pathological CT features (bowel wall thickening, pathological wall enhancement, stenosis, lymphadenopathy, mesenteric haziness, intraperitoneal fluid, abscess, fistula) and the final main diagnosis. Interobserver agreement (kappa) was calculated. Sensitivity and specificity resulted from comparison with the reference standard, consisting of surgery (n=30) or long-term follow-up in cases of conservative treatment (n=27). Results: Entero-MDCT demonstrated considerably fewer artefacts than entero-MRI (p < 0.0001). In 9 entero-MDCT/MRI examinations, no activity of Crohn's disease was seen, whereas in 48 active disease could be demonstrated, such as intraperitoneal abscesses (n=11), fistulas (n=13), stenoses (n=23), and acute (n=15) or chronic (n=23) inflammation. Interobserver agreement among the three readers was not significantly different between entero-MDCT and entero-MRI, nor were sensitivity (range 60-89%) and specificity (range 75-100%) for each of the eight pathological features or for the main diagnosis. Conclusion: Entero-MRI is statistically of similar diagnostic value to entero-MDCT for acute complications of Crohn's disease. Therefore, entero-MRI, which avoids harmful irradiation, should become the preferred imaging modality, since these are young patients who are likely to undergo frequent imaging follow-up in the future.
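The abstract reports interobserver agreement as a kappa statistic without specifying how it was computed; the sketch below shows one common way to obtain a pairwise Cohen's kappa for two readers' binary calls. The labels are synthetic, and agreement among all three readers would instead require a multi-rater statistic such as Fleiss' kappa.

```python
# Pairwise Cohen's kappa for two readers' binary feature calls.
# The labels are synthetic examples, not the study's ratings.
from sklearn.metrics import cohen_kappa_score

reader_1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1]   # 1 = feature present, 0 = absent
reader_2 = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1, 0, 0]

kappa = cohen_kappa_score(reader_1, reader_2)      # chance-corrected agreement in [-1, 1]
print(f"Cohen's kappa: {kappa:.2f}")
```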
Abstract:
BACKGROUND: Human immunodeficiency virus (HIV) takes advantage of multiple host proteins to support its own replication. The gene ZNRD1 (zinc ribbon domain-containing 1) has been identified as encoding a potential host factor that influenced disease progression in HIV-positive individuals in a genome-wide association study and also significantly affected HIV replication in a large-scale in vitro short interfering RNA (siRNA) screen. Genes and polymorphisms identified by large-scale analysis need to be followed up by means of functional assays and resequencing efforts to more precisely map causal genes. METHODS: Genotyping and ZNRD1 gene resequencing for 208 HIV-positive subjects (119 who experienced long-term nonprogression [LTNP] and 89 who experienced normal disease progression) were performed by either TaqMan genotyping assays or direct sequencing. Genetic association analysis was performed with the SNPassoc package and Haploview software. siRNA and short hairpin RNA (shRNA) specifically targeting ZNRD1 were used to transiently or stably down-regulate ZNRD1 expression in both lymphoid and nonlymphoid cells. Cells were infected with X4 and R5 HIV strains, and the efficiency of infection was assessed by reporter gene assay or p24 assay. RESULTS: Genetic association analysis found a strong, statistically significant correlation with the LTNP phenotype (single-nucleotide polymorphism rs1048412; [Formula: see text]), independent of HLA-A10 influence. siRNA-based functional analysis showed that ZNRD1 down-regulation by siRNA or shRNA impaired HIV-1 replication at the transcription level in both lymphoid and nonlymphoid cells. CONCLUSION: Genetic association analysis unequivocally identified ZNRD1 as an independent marker of LTNP to AIDS. Moreover, in vitro experiments pointed to viral transcription as the inhibited step. Thus, our data strongly suggest that ZNRD1 is a host cellular factor that influences HIV-1 replication and disease progression in HIV-positive individuals.
Abstract:
OBJECTIVES: The purpose of this study was to determine whether thoracic endovascular aortic repair (TEVAR) reduces death and morbidity compared with open surgical repair for descending thoracic aortic disease. BACKGROUND: The role of TEVAR versus open surgery remains unclear. Metaregression can be used to maximally inform adoption of new technologies by utilizing evidence from existing trials. METHODS: Data from comparative studies of TEVAR versus open repair of the descending aorta were combined through meta-analysis. Metaregression was performed to account for baseline risk factor imbalances, study design, and thoracic pathology. Due to significant heterogeneity, registry data were analyzed separately from comparative studies. RESULTS: Forty-two nonrandomized studies involving 5,888 patients were included (38 comparative studies, 4 registries). Patient characteristics were balanced except for age, as TEVAR patients were usually older than open surgery patients (p = 0.001). Registry data suggested overall perioperative complications were reduced. In comparative studies, all-cause mortality at 30 days (odds ratio [OR]: 0.44, 95% confidence interval [CI]: 0.33 to 0.59) and paraplegia (OR: 0.42, 95% CI: 0.28 to 0.63) were reduced for TEVAR versus open surgery. In addition, cardiac complications, transfusions, reoperation for bleeding, renal dysfunction, pneumonia, and length of stay were reduced. There was no significant difference in stroke, myocardial infarction, aortic reintervention, and mortality beyond 1 year. Metaregression to adjust for age imbalance, study design, and pathology did not materially change the results. CONCLUSIONS: Current data from nonrandomized studies suggest that TEVAR may reduce early death, paraplegia, renal insufficiency, transfusions, reoperation for bleeding, cardiac complications, pneumonia, and length of stay compared with open surgery. Sustained benefits on survival have not been proven.
Abstract:
BACKGROUND: Results from cohort studies evaluating the severity of respiratory viral co-infections are conflicting. We conducted a systematic review and meta-analysis to assess the clinical severity of viral co-infections as compared to single viral respiratory infections. METHODS: We searched electronic databases and other sources for studies published up to January 28, 2013. We included observational studies of inpatients with respiratory illnesses that compared the clinical severity of viral co-infections to single viral infections as detected by molecular assays. The primary outcome reflecting clinical disease severity was length of hospital stay (LOS). A random-effects model was used to conduct the meta-analyses. RESULTS: Twenty-one studies involving 4,280 patients were included. The overall quality of evidence, applying the GRADE approach, ranged from moderate for oxygen requirements to low for all other outcomes. No significant differences in LOS (mean difference (MD) -0.20 days, 95% CI -0.94, 0.53, p = 0.59) or mortality (RR 2.44, 95% CI 0.86, 6.91, p = 0.09) were documented in subjects with viral co-infections compared to those with a single viral infection. There was no evidence of differences in effects across age subgroups in post hoc analyses, with the exception of higher mortality in preschool children (RR 9.82, 95% CI 3.09, 31.20, p < 0.001) with viral co-infection as compared to other age groups (I² for subgroup analysis 64%, p = 0.04). CONCLUSIONS: No differences in clinical disease severity between viral co-infections and single respiratory infections were documented. The suggested increased risk of mortality observed amongst children with viral co-infections requires further investigation.
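The review pools study-level estimates with a random-effects model; the sketch below shows one standard implementation, DerSimonian-Laird inverse-variance pooling, applied to made-up mean differences in LOS. The per-study numbers and the choice of estimator are assumptions for illustration only, not the review's data or software.

```python
# Illustrative DerSimonian-Laird random-effects pooling of mean differences
# in length of stay; the per-study estimates and standard errors are synthetic.
import math

studies = [  # (mean difference in days, standard error) -- hypothetical
    (-0.5, 0.4), (0.3, 0.6), (-0.1, 0.3), (0.8, 0.7), (-0.4, 0.5),
]

y = [md for md, _ in studies]
w = [1 / se**2 for _, se in studies]                     # fixed-effect (inverse-variance) weights

y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
q = sum(wi * (yi - y_fixed)**2 for wi, yi in zip(w, y))  # Cochran's Q
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(studies) - 1)) / c)            # between-study variance

w_star = [1 / (se**2 + tau2) for _, se in studies]       # random-effects weights
pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
se_pooled = math.sqrt(1 / sum(w_star))
print(f"pooled MD = {pooled:.2f} days, 95% CI "
      f"({pooled - 1.96 * se_pooled:.2f}, {pooled + 1.96 * se_pooled:.2f}), tau^2 = {tau2:.3f}")
```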
Abstract:
Pneumocystis jirovecii is a fungus causing severe pneumonia in immunocompromised patients. Progress in understanding its pathogenicity and epidemiology has been hampered by the lack of a long-term in vitro culture method. Obligate parasitism of this pathogen has been suggested on the basis of various features but remains controversial. We analysed the 7.0 Mb draft genome sequence of the closely related species Pneumocystis carinii infecting rats, which is a well established experimental model of the disease. We predicted 8'085 (redundant) peptides, and 14.9% of them were mapped onto the KEGG biochemical pathways. The proteome of the closely related yeast Schizosaccharomyces pombe was used as a control for the annotation procedure (4'974 genes, 14.1% mapped). About two thirds of the mapped peptides of each organism (65.7% and 73.2%, respectively) corresponded to crucial enzymes for basal metabolism and standard cellular processes. However, the proportion of P. carinii genes relative to those of S. pombe was significantly smaller for the "amino acid metabolism" category of pathways than for all other categories taken together (40 versus 114 against 278 versus 427, P<0.002). Importantly, we identified in P. carinii only 2 enzymes specifically dedicated to the synthesis of the 20 standard amino acids. By contrast, all 54 enzymes dedicated to this synthesis that are reported in the KEGG atlas for S. pombe were detected upon reannotation of the S. pombe proteome (2 versus 54 against 278 versus 427, P<0.0001). This finding strongly suggests that species of the genus Pneumocystis scavenge amino acids from their host's lung environment. Consequently, they would have no form able to live independently of another organism, and these parasites would be obligate in addition to being opportunistic. These findings have implications for the management of patients susceptible to P. jirovecii infection, given that the only source of infection would be other humans.
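The abstract compares gene-count proportions between the two genomes but does not name the statistical test; a contingency-table test such as Fisher's exact test is a plausible reconstruction. The sketch below applies it to the counts quoted for the amino acid synthesis comparison.

```python
# Plausible reconstruction of the reported 2x2 comparison of gene counts
# (the abstract does not state which test was used); counts are from the abstract.
from scipy.stats import fisher_exact

#         P. carinii  S. pombe
table = [[2,   54],   # enzymes dedicated to amino acid synthesis
         [278, 427]]  # genes mapped to all other pathway categories

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.3f}, p = {p_value:.2e}")
```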
Abstract:
Background: Various patterns of HIV-1 disease progression are described in clinical practice and in research. There is a need to assess the specificity of commonly used definitions of long-term non-progressor (LTNP) elite controllers (LTNP-EC), viremic controllers (LTNP-VC), and viremic non-controllers (LTNP-NC), as well as of chronic progressors (P) and rapid progressors (RP). Methodology and Principal Findings: We re-evaluated the HIV-1 clinical definitions, summarized in Table 1, using the information provided by a selected number of host genetic markers and viral factors. There is a continuous decrease of protective factors and an accumulation of risk factors from LTNP-EC to RP. Statistical differences in the frequency of protective HLA-B alleles (p = 0.01), HLA-C rs9264942 (p = 0.06), and protective CCR5/CCR2 haplotypes (p = 0.02) across groups, and the presence of viruses with an ancestral genotype in the "viral dating" (i.e., nucleotide sequences with low viral divergence from the most recent common ancestor), support the differences among the principal clinical groups of HIV-1 infected individuals. Conclusions: A combination of host genetic and viral factors supports current clinical definitions that discriminate among patterns of HIV-1 progression. The study also emphasizes the need to apply a standardized and accepted set of clinical definitions for the purpose of disease stratification and research.
Abstract:
BACKGROUND: There is an ever-increasing volume of data on host genes that are modulated during HIV infection, influence disease susceptibility or carry genetic variants that impact HIV infection. We created GuavaH (Genomic Utility for Association and Viral Analyses in HIV, http://www.GuavaH.org), a public resource that supports multipurpose analysis of genome-wide genetic variation and gene expression profiles across multiple phenotypes relevant to HIV biology. FINDINGS: We included original data from 8 genome and transcriptome studies addressing viral and host responses in vivo and ex vivo. These studies cover phenotypes such as HIV acquisition, plasma viral load, disease progression, viral replication cycle, latency and viral-host genome interaction. This represents genome-wide association data from more than 4,000 individuals, exome sequencing data from 392 individuals, in vivo transcriptome microarray data from 127 patients/conditions, and 60 sets of RNA-seq data. Additionally, GuavaH allows visualization of protein variation in ~8,000 individuals from the general population. The publicly available GuavaH framework supports queries on (i) individual single nucleotide polymorphisms across different HIV-related phenotypes, (ii) gene structure and variation, (iii) in vivo gene expression in the setting of human infection (CD4+ T cells), and (iv) in vitro gene expression data in models of permissive infection, latency and reactivation. CONCLUSIONS: The complexity of analysing host genetic influences on HIV biology and pathogenesis calls for comprehensive research resources built on curated data. The tool developed here allows queries and supports validation of the rapidly growing body of host genomic information pertinent to HIV research.
Abstract:
OBJECTIVES: Reactivation of latent tuberculosis (TB) in inflammatory bowel disease (IBD) patients treated with anti-tumor necrosis factor-alpha medication is a serious problem. Currently, TB screening includes chest x-rays and a tuberculin skin test (TST). The interferon-gamma release assay (IGRA) QuantiFERON-TB Gold In-Tube (QFT-G-IT) shows better specificity for diagnosing TB than the skin test. This study evaluates the two test methods among IBD patients. METHODS: Both TST and IGRA were performed on 212 subjects (114 with Crohn's disease, 44 with ulcerative colitis, 10 with indeterminate colitis, 44 controls). RESULTS: Eighty-one percent of IBD patients were under immunosuppressive therapy; 71% of all subjects were vaccinated with Bacille Calmette-Guérin; 18% of IBD patients and 43% of controls tested positive with the skin test (P < 0.0001). Vaccinated controls tested positive more often with the skin test (52%) than did vaccinated IBD patients (23%) (P = 0.011). Significantly fewer immunosuppressed patients tested positive with the skin test than did patients not receiving therapy (P = 0.007); 8% of IBD patients tested positive with the QFT-G-IT test (14/168) compared to 9% (4/44) of controls. Test agreement was significantly higher in the controls (P = 0.044) than in the IBD group. CONCLUSIONS: Agreement between the two test methods is poor in IBD patients. In contrast to the QFT-G-IT test, the TST is negatively influenced by immunosuppressive medication and vaccination status, and should thus be replaced by the IGRA for TB screening in immunosuppressed patients with IBD.
Abstract:
Atrial arrhythmias (AAs) are a common complication in adult patients with congenital heart disease. We sought to compare the lifetime prevalence of AAs in patients with right- versus left-sided congenital cardiac lesions and their effect on prognosis. A congenital heart disease diagnosis was assigned using International Classification of Diseases, Ninth Revision (ICD-9), diagnostic codes in the administrative databases of Quebec from 1983 to 2005. Patients with AAs were those diagnosed with an ICD-9 code for atrial fibrillation or intra-atrial reentry tachycardia. To ensure that the diagnosis of AA was new, a washout period of 5 years after entry into the database was used, during which the patient could not have received an ICD-9 code for AA. The cumulative lifetime risk of AA was estimated using the Practical Incidence Estimators method. The hazard ratios (HRs) for mortality, morbidity, and cardiac interventions were compared between those with right- and left-sided lesions after adjustment for age, gender, disease severity, and cardiac risk factors. In a population of 71,467 patients, 7,756 adults developed AAs (isolated right-sided, 2,229; isolated left-sided, 1,725). The lifetime risk of developing AAs was significantly greater in patients with right-sided than in patients with left-sided lesions (61.0% vs 55.4%, p <0.001). The HRs for mortality and for the development of stroke or heart failure were similar in both groups (HR 0.96, 95% confidence interval [CI] 0.86 to 1.09; HR 0.94, 95% CI 0.80 to 1.09; and HR 1.10, 95% CI 0.98 to 1.23, respectively). However, the rates of cardiac catheterization (HR 0.63, 95% CI 0.55 to 0.72), cardiac surgery (HR 0.40, 95% CI 0.36 to 0.45), and arrhythmia surgery (HR 0.77, 95% CI 0.6 to 0.98) were significantly lower for patients with right-sided lesions. In conclusion, patients with right-sided lesions had a greater lifetime burden of AAs. However, their morbidity and mortality were no lower than in those with left-sided lesions, although the rate of intervention was substantially different.
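The adjusted hazard ratios reported here come from survival models; the sketch below shows one way such an adjusted HR could be estimated with a Cox proportional hazards model. The data frame, covariates, effect sizes and follow-up scheme are entirely synthetic, and lifelines is simply one convenient library choice, not the software used in the study.

```python
# Illustrative Cox model for an adjusted hazard ratio (right- vs left-sided
# lesion) on a synthetic cohort; not the Quebec data or the study's methods.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "right_sided": rng.integers(0, 2, n),        # 1 = right-sided lesion
    "age": rng.normal(45, 12, n),                # years at baseline
    "severe_lesion": rng.integers(0, 2, n),      # crude severity marker
})

# Synthetic event times: hazard increases with age and severity (assumed effects).
rate = 0.02 * np.exp(0.02 * (df["age"] - 45) + 0.3 * df["severe_lesion"])
time = rng.exponential(1.0 / rate)
df["event"] = (time <= 20).astype(int)           # 1 = event observed during follow-up
df["time"] = np.minimum(time, 20.0)              # administrative censoring at 20 years

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()                              # exp(coef) column gives adjusted HRs
```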
Abstract:
ABSTRACT: BACKGROUND: Perfusion cardiovascular magnetic resonance (CMR) is generally accepted as an alternative to SPECT to assess myocardial ischemia non-invasively. However, its performance versus gated-SPECT and in subpopulations is not fully established. The goal was to compare, in a multicenter setting, the diagnostic performance of perfusion-CMR and gated-SPECT for the detection of CAD in various populations, using conventional x-ray coronary angiography (CXA) as the standard of reference. METHODS: In 33 centers (in the US and Europe), 533 patients eligible for CXA or SPECT were enrolled in this multivendor trial. SPECT and CXA were performed within 4 weeks before or after CMR in all patients. The prevalence of CAD in the sample was 49%, and 515 patients received MR contrast medium. Drop-out rates for CMR and SPECT were 5.6% and 3.7%, respectively (ns). The study was powered for the primary endpoint of non-inferiority of CMR versus SPECT for both sensitivity and specificity for the detection of CAD (using a single-threshold reading); the results for the primary endpoint were reported elsewhere. In this article secondary endpoints are presented, i.e., the diagnostic performance of CMR versus SPECT in subpopulations such as patients with multi-vessel disease (MVD), men, women, and patients without prior myocardial infarction (MI). For the assessment of diagnostic performance, the area under the receiver-operating-characteristic curve (AUC) was calculated. Readers were blinded to clinical data, CXA, and the other imaging results. RESULTS: The diagnostic performance (AUC) of CMR was superior to SPECT (p = 0.0004, n = 425) and to gated-SPECT (p = 0.018, n = 253). CMR performed better than SPECT in MVD (p = 0.003 vs all SPECT, p = 0.04 vs gated-SPECT), in men (p = 0.004, n = 313) and in women (p = 0.03, n = 112), as well as in the non-infarct patients (p = 0.005, n = 186 in 1-3 vessel disease and p = 0.015, n = 140 in MVD). CONCLUSION: In this large multicenter, multivendor study the diagnostic performance of perfusion-CMR to detect CAD was superior to perfusion-SPECT in the entire population and in subgroups. Perfusion-CMR can be recommended as an alternative to SPECT imaging. TRIAL REGISTRATION: ClinicalTrials.gov, Identifier: NCT00977093.
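Diagnostic performance here is summarized as the area under the ROC curve; the sketch below shows how per-patient test scores could be turned into AUCs against the CXA reference. The scores are simulated (with CMR assumed to discriminate slightly better), and a formal comparison of two AUCs, as behind the trial's p-values, would additionally require a paired test such as DeLong's, which is not shown.

```python
# Illustrative AUC computation for two imaging tests against an angiographic
# reference standard; the scores below are simulated, not trial data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
cad = rng.integers(0, 2, n)                      # reference: 1 = significant CAD on CXA

# Simulated per-patient ischemia scores (higher = more abnormal), with CMR
# assumed here to separate diseased from non-diseased slightly better than SPECT.
cmr_score = cad * 1.2 + rng.normal(0, 1, n)
spect_score = cad * 0.8 + rng.normal(0, 1, n)

print(f"CMR AUC:   {roc_auc_score(cad, cmr_score):.2f}")
print(f"SPECT AUC: {roc_auc_score(cad, spect_score):.2f}")
```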
Abstract:
Intense immune responses are observed during human or experimental infection with the digenetic protozoan parasite Trypanosoma cruzi. The reasons why such immune responses are unable to completely eliminate the parasites are unknown. The survival of the parasite leads to the parasite-host equilibrium found during the chronic phase of chagasic infection in most individuals. Parasite persistence is recognized as the most likely cause of the chronic chagasic pathologies. Therefore, a key question in Chagas' disease is to understand how this equilibrium is established and maintained for a long period. Understanding the basis for this equilibrium may lead to new interventional approaches that could help millions of individuals at risk for infection or already infected with T. cruzi. Here, we propose that the phenomenon of immunodominance may be significant in regulating the host-parasite equilibrium observed in Chagas' disease. T. cruzi infection restricts the repertoire of specific T cells, in some cases generating an intense immunodominant phenotype and in others causing dramatic interference with the response to distinct epitopes. This immune response is sufficiently strong to keep the host alive during the acute phase, carrying it into the chronic phase, where transmission usually occurs. At the same time, immunodominance interferes with the development of a stronger and broader immune response that could completely eliminate the parasite. Based on this, we discuss how we can interfere with or take advantage of immunodominance in order to provide an immunotherapeutic alternative for chagasic individuals.
Abstract:
Inflammatory bowel disease (IBD), which includes Crohn's disease (CD) and ulcerative colitis (UC), is a chronic disorder that affects thousands of people around the world. These diseases are characterized by exacerbated, uncontrolled intestinal inflammation that leads to poor quality of life in affected patients. Although the exact cause of IBD still remains unknown, compelling evidence suggests that the interplay among immune dysregulation, environmental factors, and genetic polymorphisms contributes to the multifactorial nature of the disease. Therefore, in this review we present classical and novel findings regarding IBD etiopathogenesis. Regarding the genetic causes of the diseases, alterations in about 100 genes or allelic variants, most of them in components of the immune system, have been related to IBD susceptibility. Dysbiosis of the intestinal microbiota also plays a role in the initiation or perpetuation of gut inflammation, which develops under altered or impaired immune responses. In this context, unbalanced innate and especially adaptive immunity has been considered one of the major contributing factors to IBD development, with the involvement of the Th1, Th2, and Th17 effector populations in addition to impaired regulatory responses in CD or UC. Finally, an understanding of the interplay among pathogenic triggers of IBD will improve knowledge about the immunological mechanisms of gut inflammation, thus providing novel tools for IBD control.