961 results for GRAFT-VS-HOST DISEASE


Relevance: 40.00%

Abstract:

Aims: Perfusion-cardiac magnetic resonance (CMR) has emerged as a potential alternative to single-photon emission computed tomography (SPECT) to assess myocardial ischaemia non-invasively. The goal was to compare the diagnostic performance of perfusion-CMR and SPECT for the detection of coronary artery disease (CAD) using conventional X-ray coronary angiography (CXA) as the reference standard. Methods and results: In this multivendor trial, 533 patients, eligible for CXA or SPECT, were enrolled in 33 centres (USA and Europe), with 515 patients receiving MR contrast medium. Single-photon emission computed tomography and CXA were performed within 4 weeks before or after CMR in all patients. The prevalence of CAD in the sample was 49%. Drop-out rates for CMR and SPECT were 5.6% and 3.7%, respectively (P = 0.21). The primary endpoint was non-inferiority of CMR vs. SPECT for both sensitivity and specificity for the detection of CAD. Readers were blinded to clinical data, CXA, and imaging results. As a secondary endpoint, the safety profile of the CMR examination was evaluated. For CMR and SPECT, the sensitivity scores were 0.67 and 0.59, respectively, with the lower confidence limit for the difference at +0.02, indicating superiority of CMR over SPECT. The specificity scores for CMR and SPECT were 0.61 and 0.72, respectively (lower confidence limit for the difference: -0.17), indicating inferiority of CMR vs. SPECT. No severe adverse events occurred in the 515 patients. Conclusion: In this large multicentre, multivendor study, the sensitivity of perfusion-CMR to detect CAD was superior to SPECT, while its specificity was inferior to SPECT. Cardiac magnetic resonance is a safe alternative to SPECT to detect perfusion deficits in CAD.
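The trial's primary endpoint turns on the lower confidence bound for the difference in sensitivity (and specificity) between the two modalities. As a rough sketch of that criterion, the Python snippet below computes a simple Wald confidence interval for a difference of proportions from hypothetical counts; the counts, the non-inferiority margin, and the unpaired interval method are all assumptions, since the trial used paired readings and its own prespecified margins.

import math

def wald_ci_diff(p1, n1, p2, n2, z=1.96):
    """Approximate 95% Wald CI for the difference of two proportions, p1 - p2."""
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p1 - p2
    return diff - z * se, diff + z * se

# Hypothetical counts among patients with CAD on coronary angiography.
n_cad = 250                # assumed number of CAD-positive patients
sens_cmr = 168 / n_cad     # ~0.67, the sensitivity reported for perfusion-CMR
sens_spect = 148 / n_cad   # ~0.59, the sensitivity reported for SPECT

low, high = wald_ci_diff(sens_cmr, n_cad, sens_spect, n_cad)
margin = -0.10             # assumed non-inferiority margin, for illustration only
print(f"sensitivity difference {sens_cmr - sens_spect:+.2f}, 95% CI ({low:.2f}, {high:.2f})")
print("non-inferiority met" if low > margin else "non-inferiority not demonstrated")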

Relevance: 40.00%

Abstract:

Many cell types are currently being studied as potential sources of cardiomyocytes for cell transplantation therapy to repair and regenerate damaged myocardium. The question remains as to which progenitor cell represents the best candidate. Bone marrow-derived cells and endothelial progenitor cells have been tested in clinical studies. These cells are safe, but their cardiogenic potential is controversial. The functional benefits observed are probably due to enhanced angiogenesis, reduced ventricular remodeling, or to cytokine-mediated effects that promote the survival of endogenous cells. Human embryonic stem cells represent an unlimited source of cardiomyocytes due to their great differentiation potential, but each step of differentiation must be tightly controlled due to the high risk of teratoma formation. These cells, however, face ethical barriers, and there is a risk of graft rejection. These last two problems can be avoided by using induced pluripotent stem cells (iPS), which can be autologously derived, but the high risk of teratoma formation remains. Cardiac progenitor cells have the advantage of being cardiac committed, but important questions remain unanswered, such as: what is the best marker to identify and isolate these cells? To date, the different markers used to identify adult cardiac progenitor cells also recognize progenitor cells that are outside the heart. Thus, it cannot be determined whether the cardiac progenitor cells identified in the adult heart represent resident cells present since fetal life or extracardiac cells that colonized the heart after cardiac injury. Developmental studies have identified markers of multipotent progenitors, but it is unknown whether these markers are specific for adult progenitors when expressed in the adult myocardium. Cardiac regeneration is dependent on the stability of the cells transplanted into the host myocardium and on the electromechanical coupling with the endogenous cells. Finally, the promotion of endogenous regenerative processes by mobilizing endogenous progenitors represents a complementary approach to cell transplantation therapy.

Relevance: 40.00%

Abstract:

Background: Leishmaniasis is a common parasitic disease in Southern Europe, caused by Leishmania infantum. The failures of current treatment with pentavalent antimonials are partially attributable to the emergence of antimony-resistant Leishmania strains. This study analyses the in vitro susceptibility to pentavalent antimony of intracellular amastigotes from a range of L. infantum strains derived from the same infected animal, during in vitro and in vivo passages and after host treatment with meglumine antimoniate. Results: Sb(V) IC50 values for strains from two distinct isolates from the same host, and for one stock after two years of culture in NNN medium and subsequent passage in hamster, were similar (5.0 ± 0.2, 4.9 ± 0.2 and 4.4 ± 0.1 mg Sb(V)/L, respectively). In contrast, a significant difference (P < 0.01, t test) was observed between the mean Sb(V) IC50 values of the stocks obtained before and after treatment of hosts with meglumine antimoniate (4.7 ± 0.4 mg Sb(V)/L vs. 7.7 ± 1.5 mg Sb(V)/L). Drug resistance after drug pressure in experimentally infected dogs increased over repeated drug administration (6.4 ± 0.5 mg Sb(V)/L after the first treatment vs. 8.6 ± 1.4 mg Sb(V)/L after the second) (P < 0.01, t test). Conclusions: These results confirm previous observations on strains from Leishmania/HIV co-infected patients and indicate that the increasing use of antimony derivatives for the treatment of canine leishmaniasis in endemic areas contributes to the emergence of antimony-resistant Leishmania strains.
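The comparison of mean Sb(V) IC50 values before and after host treatment rests on a two-sample t test. A minimal sketch of that calculation is given below, using scipy.stats.ttest_ind on made-up replicate IC50 values scattered around the reported means; the replicates are illustrative assumptions, not the study's measurements.

from scipy import stats

# Assumed replicate Sb(V) IC50 values (mg Sb(V)/L), scattered around the reported
# means of ~4.7 (before treatment) and ~7.7 (after treatment); illustrative only.
ic50_before = [4.3, 4.6, 4.8, 5.0, 4.8]
ic50_after = [6.4, 7.2, 8.1, 8.9, 7.9]

t_stat, p_value = stats.ttest_ind(ic50_before, ic50_after)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A p-value below 0.01 would mirror the significant shift reported in the study.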

Relevance: 40.00%

Abstract:

We assessed the impact of antiviral prophylaxis and preemptive therapy on the incidence and outcomes of cytomegalovirus (CMV) disease in a nationwide prospective cohort of solid organ transplant recipients. Risk factors associated with CMV disease and graft failure-free survival were analyzed using Cox regression models. One thousand two hundred thirty-nine patients transplanted from May 2008 until March 2011 were included; 466 (38%) patients received CMV prophylaxis and 522 (42%) patients were managed preemptively. Overall incidence of CMV disease was 6.05% and was linked to CMV serostatus (D+/R- vs. R+, hazard ratio [HR] 5.36 [95% CI 3.14-9.14], p < 0.001). No difference in the incidence of CMV disease was observed in patients receiving antiviral prophylaxis as compared to the preemptive approach (HR 1.16 [95% CI 0.63-2.17], p = 0.63). CMV disease was not associated with a lower graft failure-free survival (HR 1.27 [95% CI 0.64-2.53], p = 0.50). Nevertheless, patients followed by the preemptive approach had an inferior graft failure-free survival after a median of 1.05 years of follow-up (HR 1.63 [95% CI 1.01-2.64], p = 0.044). The incidence of CMV disease in this cohort was low and not influenced by the preventive strategy used. However, patients on CMV prophylaxis were more likely to be free from graft failure.
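The hazard ratios quoted above come from Cox proportional-hazards models. The sketch below shows how such a model might be fitted with the lifelines package on a small, made-up data frame; the column names, coding and covariates are assumptions chosen for illustration, not the cohort's actual variables.

import pandas as pd
from lifelines import CoxPHFitter

# Made-up cohort: follow-up time, graft failure-free survival event indicator,
# and two binary covariates; names and coding are assumptions for illustration.
df = pd.DataFrame({
    "years_followup": [0.4, 1.1, 0.9, 2.0, 1.5, 0.7, 1.8, 2.3, 1.2, 2.6],
    "graft_failure_or_death": [1, 0, 1, 0, 1, 0, 0, 1, 1, 0],
    "preemptive_strategy": [1, 0, 1, 1, 0, 1, 0, 0, 1, 1],  # 1 = preemptive, 0 = prophylaxis
    "d_pos_r_neg": [1, 0, 0, 1, 0, 1, 0, 0, 1, 0],          # CMV D+/R- serostatus
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followup", event_col="graft_failure_or_death")
cph.print_summary()  # hazard ratios with 95% CIs, analogous to the HRs quoted above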

Relevance: 40.00%

Abstract:

Background: Various patterns of HIV-1 disease progression are described in clinical practice and in research. There is a need to assess the specificity of commonly used definitions of long-term non-progressors (LTNP): elite controllers (LTNP-EC), viremic controllers (LTNP-VC), and viremic non-controllers (LTNP-NC), as well as of chronic progressors (P) and rapid progressors (RP). Methodology and Principal Findings: We re-evaluated the HIV-1 clinical definitions, summarized in Table 1, using the information provided by a selected number of host genetic markers and viral factors. There is a continuous decrease of protective factors and an accumulation of risk factors from LTNP-EC to RP. Statistical differences in the frequency of protective HLA-B alleles (p = 0.01), HLA-C rs9264942 (p = 0.06), and protective CCR5/CCR2 haplotypes (p = 0.02) across groups, and the presence of viruses with an ancestral genotype in the "viral dating" (i.e., nucleotide sequences with low viral divergence from the most recent common ancestor), support the differences among the principal clinical groups of HIV-1 infected individuals. Conclusions: A combination of host genetic and viral factors supports current clinical definitions that discriminate among patterns of HIV-1 progression. The study also emphasizes the need to apply a standardized and accepted set of clinical definitions for the purpose of disease stratification and research.
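Group-wise frequency comparisons of the kind reported here (for example, protective HLA-B allele carriage across the five clinical groups) can be illustrated with a chi-square test on a contingency table. The counts in the sketch below are hypothetical and the test choice is an assumption; the study's own statistical methods may differ.

import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical numbers of carriers / non-carriers of protective HLA-B alleles in
# each clinical group (LTNP-EC, LTNP-VC, LTNP-NC, P, RP); not the study's data.
carriers = [12, 10, 6, 15, 3]
non_carriers = [8, 14, 18, 85, 37]

chi2, p, dof, expected = chi2_contingency(np.array([carriers, non_carriers]))
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# A falling carrier frequency from LTNP-EC to RP would echo the trend described above.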

Relevance: 40.00%

Abstract:

BACKGROUND: There is an ever-increasing volume of data on host genes that are modulated during HIV infection, influence disease susceptibility or carry genetic variants that impact HIV infection. We created GuavaH (Genomic Utility for Association and Viral Analyses in HIV, http://www.GuavaH.org), a public resource that supports multipurpose analysis of genome-wide genetic variation and gene expression profiles across multiple phenotypes relevant to HIV biology. FINDINGS: We included original data from 8 genome and transcriptome studies addressing viral and host responses in vivo and ex vivo. These studies cover phenotypes such as HIV acquisition, plasma viral load, disease progression, viral replication cycle, latency and viral-host genome interaction. This represents genome-wide association data from more than 4,000 individuals, exome sequencing data from 392 individuals, in vivo transcriptome microarray data from 127 patients/conditions, and 60 sets of RNA-seq data. Additionally, GuavaH allows visualization of protein variation in ~8,000 individuals from the general population. The publicly available GuavaH framework supports queries on (i) unique single nucleotide polymorphisms across different HIV-related phenotypes, (ii) gene structure and variation, (iii) in vivo gene expression in the setting of human infection (CD4+ T cells), and (iv) in vitro gene expression data in models of permissive infection, latency and reactivation. CONCLUSIONS: The complexity of the analysis of host genetic influences on HIV biology and pathogenesis calls for comprehensive research tools built on curated data. The tool developed here allows queries and supports validation of the rapidly growing body of host genomic information pertinent to HIV research.

Relevance: 40.00%

Abstract:

Extracranial carotid aneurysm is a rare vascular manifestation of Behçet disease. To our knowledge, only 32 cases have been reported. This article presents a complex case of a 28-year-old man who was first treated by vein graft reconstruction. At 12 months of follow-up, a nonanastomotic false aneurysm of the vein graft occurred and was treated by interposition of prosthetic graft. Two months later, an anastomotic pseudoaneurysm between the two grafts was excluded by two stent grafts. Based on our experience and a review of the literature, we compared the outcomes of prosthetic and autologous vein reconstructions and discussed the role of carotid ligation and immunosuppressive treatment.

Relevance: 40.00%

Abstract:

We conducted a retrospective analysis of the influence of full doses of calcineurin inhibitors [8-10 mg kg⁻¹ day⁻¹ cyclosporine (N = 80), or 0.2-0.3 mg kg⁻¹ day⁻¹ tacrolimus (N = 68)] administered from day 1 after transplantation on the transplant outcomes of a high-risk population. Induction therapy was used in 13% of the patients. Patients also received azathioprine (2 mg kg⁻¹ day⁻¹, N = 58) or mycophenolate mofetil (2 g/day, N = 90), and prednisone (0.5 mg kg⁻¹ day⁻¹, N = 148). Mean time on dialysis was 79 ± 41 months, 12% of the cases were re-transplants, and 21% had panel reactive antibodies >10%. In 43% of donors the cause of death was cerebrovascular disease and 27% showed creatinine above 1.5 mg/dL. The incidence of slow graft function (SGF) and delayed graft function (DGF) was 15 and 60%, respectively. Mean time to last dialysis and to nadir creatinine were 18 ± 15 and 34 ± 20 days, respectively. Mean creatinine at 1 year after transplantation was 1.48 ± 0.50 mg/dL (DGF 1.68 ± 0.65 vs SGF 1.67 ± 0.66 vs immediate graft function (IGF) 1.41 ± 0.40 mg/dL, P = 0.089). The incidence of biopsy-confirmed acute rejection was 22% (DGF 31%, SGF 10%, IGF 8%). One-year patient and graft survival was 92.6 and 78.4%, respectively. The incidence of cytomegalovirus disease, post-transplant diabetes mellitus and malignancies was 28, 8.1, and 0%, respectively. Compared to previous studies, the use of initial full doses of calcineurin inhibitors without antibody induction in patients with SGF or DGF had no negative impact on patient and graft survival.

Relevance: 40.00%

Abstract:

Intense immune responses are observed during human or experimental infection with the digenetic protozoan parasite Trypanosoma cruzi. The reasons why such immune responses are unable to completely eliminate the parasites are unknown. The survival of the parasite leads to a parasite-host equilibrium found during the chronic phase of chagasic infection in most individuals. Parasite persistence is recognized as the most likely cause of the chronic chagasic pathologies. Therefore, a key question in Chagas' disease is to understand how this equilibrium is established and maintained for a long period. Understanding the basis for this equilibrium may lead to new approaches to interventions that could help millions of individuals at risk for infection or who are already infected with T. cruzi. Here, we propose that the phenomenon of immunodominance may be significant in regulating the host-parasite equilibrium observed in Chagas' disease. T. cruzi infection restricts the repertoire of specific T cells, generating in some cases an intense immunodominant phenotype and in others dramatically interfering with the response to distinct epitopes. This immune response is sufficiently strong to keep the host alive during the acute phase, carrying it into the chronic phase, when transmission usually occurs. At the same time, immunodominance interferes with the development of a stronger and broader immune response that could completely eliminate the parasite. Based on this, we discuss how we can interfere with or take advantage of immunodominance in order to provide an immunotherapeutic alternative for chagasic individuals.

Relevance: 40.00%

Abstract:

Inflammatory bowel disease (IBD), which includes Crohn's disease (CD) and ulcerative colitis (UC), is a chronic disorder that affects thousands of people around the world. These diseases are characterized by exacerbated, uncontrolled intestinal inflammation that leads to poor quality of life in affected patients. Although the exact cause of IBD still remains unknown, compelling evidence suggests that the interplay among immune deregulation, environmental factors, and genetic polymorphisms contributes to the multifactorial nature of the disease. Therefore, in this review we present classical and novel findings regarding IBD etiopathogenesis. Considering the genetic causes of the diseases, alterations in about 100 genes or allelic variants, most of them in components of the immune system, have been related to IBD susceptibility. Dysbiosis of the intestinal microbiota also plays a role in the initiation or perpetuation of gut inflammation, which develops under altered or impaired immune responses. In this context, unbalanced innate and especially adaptive immunity has been considered one of the major contributing factors to IBD development, with the involvement of Th1, Th2, and Th17 effector populations in addition to impaired regulatory responses in CD or UC. Finally, an understanding of the interplay among pathogenic triggers of IBD will improve knowledge about the immunological mechanisms of gut inflammation, thus providing novel tools for IBD control.

Relevance: 40.00%

Abstract:

The oxygen uptake efficiency slope (OUES) is a submaximal index incorporating cardiovascular, peripheral, and pulmonary factors that determine the ventilatory response to exercise. The purpose of this study was to evaluate the effects of continuous exercise training and interval exercise training on the OUES in patients with coronary artery disease. Thirty-five patients (59.3±1.8 years old; 28 men, 7 women) with coronary artery disease were randomly divided into two groups: continuous exercise training (n=18) and interval exercise training (n=17). All patients performed graded exercise tests with respiratory gas analysis before and 3 months after the exercise-training program to determine ventilatory anaerobic threshold (VAT), respiratory compensation point, and peak oxygen consumption (peak VO2). The OUES was assessed based on data from the second minute of exercise until exhaustion by calculating the slope of the linear relation between oxygen uptake and the logarithm of total ventilation. After the interventions, both groups showed increased aerobic fitness (P<0.05). In addition, both the continuous exercise and interval exercise training groups demonstrated an increase in OUES (P<0.05). Significant associations were observed in both groups: 1) continuous exercise training (OUES and peak VO2 r=0.57; OUES and VO2 VAT r=0.57); 2) interval exercise training (OUES and peak VO2 r=0.80; OUES and VO2 VAT r=0.67). Continuous and interval exercise training resulted in a similar increase in OUES among patients with coronary artery disease. These findings suggest that improvements in OUES among CAD patients after aerobic exercise training may be dependent on peripheral and central mechanisms.
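Since the OUES is defined as the slope a in VO2 = a·log10(VE) + b, it can be obtained by a simple linear fit of oxygen uptake against the logarithm of minute ventilation. The sketch below does this with numpy.polyfit on synthetic breath-averaged values; the numbers are illustrative, not patient data.

import numpy as np

# Synthetic, breath-averaged values from a ramp test: minute ventilation VE (L/min)
# and oxygen uptake VO2 (mL/min); illustrative only, not patient data.
ve = np.array([15, 22, 30, 41, 55, 72, 90, 110])
vo2 = np.array([600, 850, 1100, 1400, 1750, 2100, 2400, 2650])

# OUES is the slope of VO2 regressed on log10(VE) over the whole test.
slope, intercept = np.polyfit(np.log10(ve), vo2, 1)
print(f"OUES = {slope:.0f} mL/min per tenfold rise in VE")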

Relevance: 40.00%

Abstract:

BACKGROUND: Dyslipidemia is recognized as a major cause of coronary heart disease (CHD). Emerging evidence suggests that the combination of triglycerides (TG) and waist circumference can be used to predict the risk of CHD. However, considering the known limitations of TG, a model based on non-high-density lipoprotein cholesterol (non-HDL = total cholesterol - HDL cholesterol) and waist circumference may be a better predictor of CHD. PURPOSE: The Framingham Offspring Study data were used to determine whether combined non-HDL cholesterol and waist circumference is equivalent to or better than TG and waist circumference (the hypertriglyceridemic waist phenotype) in predicting the risk of CHD. METHODS: A total of 3,196 individuals from the Framingham Offspring Study, aged ~ 40 years old, who fasted overnight for ~ 9 hours and had no missing information on non-HDL cholesterol, TG levels, and waist circumference measurements, were included in the analysis. The Receiver Operating Characteristic (ROC) curve Area Under the Curve (AUC) was used to compare the predictive ability of non-HDL cholesterol and waist circumference versus TG and waist circumference. Cox proportional-hazards models were used to examine the association between the joint distributions of non-HDL cholesterol, waist circumference, and non-fatal CHD; TG, waist circumference, and non-fatal CHD; and the joint distribution of non-HDL cholesterol and TG by waist circumference strata, after adjusting for age, gender, smoking, alcohol consumption, diabetes, and hypertension status. RESULTS: The ROC AUCs associated with non-HDL cholesterol and waist circumference and with TG and waist circumference were 0.6428 (CI: 0.6183, 0.6673) and 0.6299 (CI: 0.6049, 0.6548), respectively. The difference in ROC AUC is 1.29%. The p-value testing whether the difference in the ROC AUCs between the two models is zero was 0.10. There was a stronger positive association between non-HDL cholesterol and the risk of non-fatal CHD within each TG level than for TG within each level of non-HDL cholesterol, especially in individuals with high waist circumference. CONCLUSION: The results suggest that the model including non-HDL cholesterol and waist circumference may be superior at predicting CHD compared to the model including TG and waist circumference.
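The central comparison is between the ROC AUCs of two predictor sets: non-HDL cholesterol plus waist circumference versus TG plus waist circumference. A rough sketch of that comparison is shown below using scikit-learn on synthetic data; the variable distributions, the logistic models and the simulated outcome are assumptions, and no formal test of the AUC difference (such as DeLong's test) is included.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Simulated cohort: waist circumference (cm), non-HDL cholesterol and TG (mmol/L),
# with a synthetic non-fatal CHD outcome; all values are assumptions.
rng = np.random.default_rng(0)
n = 500
waist = rng.normal(95, 12, n)
non_hdl = rng.normal(4.0, 0.9, n)
tg = rng.normal(1.6, 0.7, n)
linpred = 0.03 * (waist - 95) + 0.8 * (non_hdl - 4.0) + 0.3 * (tg - 1.6) - 2.0
chd = rng.random(n) < 1 / (1 + np.exp(-linpred))

def model_auc(features):
    """In-sample ROC AUC of a logistic model for the synthetic CHD outcome."""
    fit = LogisticRegression(max_iter=1000).fit(features, chd)
    return roc_auc_score(chd, fit.predict_proba(features)[:, 1])

auc_nonhdl = model_auc(np.column_stack([non_hdl, waist]))
auc_tg = model_auc(np.column_stack([tg, waist]))
print(f"AUC non-HDL + waist: {auc_nonhdl:.3f}, AUC TG + waist: {auc_tg:.3f}")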

Relevance: 40.00%

Abstract:

The overall aim of the work presented was to evaluate soil health management with a specific focus on soil-borne diseases of peas. For that purpose, field experiments were carried out from 2009 until 2013 to assess crop performance and pathogen occurrence in the rotation winter pea-maize-winter wheat and to determine whether the application of compost can improve system performance. The winter peas were left untreated or inoculated with Phoma medicaginis, in the presence or absence of yard waste compost at a rate of 5 t dry matter ha⁻¹. A second application of compost was made to the winter wheat. Fusarium spp. were isolated and identified from the roots of all three crops, and the Ascochyta complex pathogens from peas. Bioassays were conducted under controlled conditions to assess the susceptibility of two pea varieties to Fusarium avenaceum, F. solani, P. medicaginis and Didymella pinodes, and of nine plant species to F. avenaceum. The effects of compost applications and temperature on pea diseases were also assessed. Application of compost overall stabilized crop performance, but it neither led to significant yield increases nor affected pathogen composition and occurrence. Phoma medicaginis dominated the pathogen complex on peas. F. graminearum, F. culmorum, F. proliferatum, Microdochium nivale, F. crookwellense, F. sambucinum, F. oxysporum, F. avenaceum and F. equiseti were frequently isolated from maize and winter wheat, with no obvious influence of the pre-crop on the Fusarium species composition. The spring pea Santana was considerably more susceptible to the pathogens tested than the winter pea EFB33 in both sterile sand and non-sterilized field soil. F. avenaceum was the most aggressive pathogen, followed by P. medicaginis, D. pinodes, and F. solani. Aggressiveness of all pathogens was greatly reduced in non-sterile field soil. F. avenaceum caused severe symptoms on the roots of all nine plant species tested. Especially susceptible were Trifolium repens, T. subterraneum, Brassica juncea and Sinapis alba, in addition to peas. Reducing growing temperatures from 19/16°C day/night to 16/12°C and 13/10°C did not affect the efficacy of compost; it reduced plant growth and slightly increased disease on EFB33, whereas the highest disease severity on Santana was observed at the highest temperature, 19/16°C. Application of 20% v/v compost reduced disease on peas caused by all four pathogens, depending on pea variety, pathogen and growing medium used. Suppression was also achieved with a lower application rate of 3.5% v/v. Tests with γ-sterilized compost suggest that the suppression of disease caused by Fusarium spp. is biological in origin, whereas chemical and physical properties of compost play an additional role in the suppression of disease caused by D. pinodes and P. medicaginis.

Relevance: 40.00%

Abstract:

Objective: In previous studies, cholesterol-rich nanoemulsions (LDE) resembling low-density lipoprotein were shown to concentrate in atherosclerotic lesions of rabbits, and lesions were markedly reduced by treatment with paclitaxel associated with LDE. This study aimed to test whether LDE-paclitaxel concentrates in the grafted hearts of rabbits and ameliorates coronary allograft vasculopathy after the transplantation procedure. Methods: Twenty-one New Zealand rabbits fed 0.5% cholesterol underwent heterotopic heart transplantation at the cervical position. All rabbits undergoing transplantation were treated with cyclosporin A (10 mg kg⁻¹ day⁻¹ by mouth). Eleven rabbits were treated with LDE-paclitaxel (4 mg/kg body weight paclitaxel per week, administered intravenously for 6 weeks), and 10 control rabbits were treated with 3 mL/wk intravenous saline. Four control animals were injected with LDE labeled with [¹⁴C]cholesteryl oleate ether to determine tissue uptake. Results: Radioactive LDE uptake by grafts was 4-fold that of native hearts. In both groups the coronary arteries of native hearts showed no stenosis, but treatment with LDE-paclitaxel reduced the degree of stenosis in grafted hearts by 50%. The arterial luminal area in grafts of the treated group was 3-fold larger than in control animals. LDE-paclitaxel treatment resulted in a 7-fold reduction of macrophage infiltration. In grafted hearts, LDE-paclitaxel treatment reduced the width of the intimal layer and inhibited the destruction of the medial layer. No toxicity was observed in rabbits receiving LDE-paclitaxel treatment. Conclusions: LDE-paclitaxel improved posttransplantation injury to the grafted heart. The novel therapeutic approach for heart transplantation management validated here is thus a promising strategy to be explored in future clinical studies. (J Thorac Cardiovasc Surg 2011;141:1522-8)