22 results for graft failure
at Duke University
Abstract:
Vein grafting results in the development of intimal hyperplasia with accompanying changes in guanine nucleotide-binding (G) protein expression and function. Several serum mitogens that act through G protein-coupled receptors, such as lysophosphatidic acid, stimulate proliferative pathways that are dependent on the G protein betagamma subunit (Gbetagamma)-mediated activation of p21ras. This study examines the role of Gbetagamma signaling in intimal hyperplasia by targeting a gene encoding a specific Gbetagamma inhibitor in an experimental rabbit vein graft model. This inhibitor, the carboxyl terminus of the beta-adrenergic receptor kinase (betaARK(CT)), contains a Gbetagamma-binding domain. Vein graft intimal hyperplasia was significantly reduced by 37% (P<0.01), and physiological studies demonstrated that the normal alterations in G protein coupling phenotypically seen in this model were blocked by betaARK(CT) treatment. Thus, it appears that Gbetagamma-mediated pathways play a major role in intimal hyperplasia and that targeting inhibitors of Gbetagamma signaling offers novel intraoperative therapeutic modalities to inhibit the development of vein graft intimal hyperplasia and subsequent vein graft failure.
Abstract:
OBJECTIVES: Identification of patient subpopulations susceptible to developing myocardial infarction (MI) or, conversely, those displaying either intrinsic cardioprotective phenotypes or high responsiveness to protective interventions remains a high-priority knowledge gap. We sought to identify novel common genetic variants associated with perioperative MI in patients undergoing coronary artery bypass grafting using genome-wide association methodology. SETTING: 107 secondary and tertiary cardiac surgery centres across the USA. PARTICIPANTS: We conducted a stage I genome-wide association study (GWAS) in 1433 ethnically diverse patients of both genders (112 cases/1321 controls) from the Genetics of Myocardial Adverse Outcomes and Graft Failure (GeneMAGIC) study, and a stage II analysis in an expanded population of 2055 patients (225 cases/1830 controls) combined from the GeneMAGIC and Duke Perioperative Genetics and Safety Outcomes (PEGASUS) studies. Patients undergoing primary non-emergent coronary bypass grafting were included. PRIMARY AND SECONDARY OUTCOME MEASURES: The primary outcome variable was perioperative MI, defined as creatine kinase MB isoenzyme (CK-MB) values ≥10× the upper limit of normal during the first postoperative day and not attributable to preoperative MI. Secondary outcomes included postoperative CK-MB as a quantitative trait, or a dichotomised phenotype based on extreme quartiles of the CK-MB distribution. RESULTS: Following quality control and adjustment for clinical covariates, we identified 521 single nucleotide polymorphisms in the stage I GWAS analysis. Among these, 8 common variants in 3 genes or intergenic regions met p<10(-5) in stage II. A secondary analysis using CK-MB as a quantitative trait (minimum p=1.26×10(-3) for rs609418), or a dichotomised phenotype based on extreme CK-MB values (minimum p=7.72×10(-6) for rs4834703), supported these findings. Pathway analysis revealed that genes harbouring top-scoring variants cluster in pathways of biological relevance to extracellular matrix remodelling, endoplasmic reticulum-to-Golgi transport and inflammation. CONCLUSIONS: Using a two-stage GWAS and pathway analysis, we identified and prioritised several potential susceptibility loci for perioperative MI.
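For readers unfamiliar with how such endpoints are constructed, the sketch below shows one plausible way to derive the outcome variables described above (perioperative MI as CK-MB ≥10× the upper limit of normal on postoperative day 1, the quantitative CK-MB trait, and the extreme-quartile dichotomization) from raw laboratory values. The column names and the ULN constant are illustrative assumptions, not details of the GeneMAGIC/PEGASUS datasets.

```python
import pandas as pd

# Hypothetical sketch of the phenotype definitions described in the abstract.
# Column names ("ckmb_postop_day1", "preop_mi") and the ULN value are assumptions.
CKMB_ULN = 5.0  # assumed upper limit of normal for CK-MB (ng/mL); lab-specific in practice

def derive_phenotypes(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Primary outcome: perioperative MI = CK-MB >= 10x ULN on postoperative day 1,
    # excluding patients whose elevation is attributable to a preoperative MI (boolean column).
    out["periop_mi"] = (out["ckmb_postop_day1"] >= 10 * CKMB_ULN) & (~out["preop_mi"])
    # Secondary outcome 1: postoperative CK-MB treated as a quantitative trait.
    out["ckmb_quant"] = out["ckmb_postop_day1"]
    # Secondary outcome 2: dichotomized phenotype from the extreme quartiles of CK-MB;
    # the middle quartiles are left undefined (excluded from that analysis).
    q1, q3 = out["ckmb_postop_day1"].quantile([0.25, 0.75])
    out["ckmb_extreme"] = pd.NA
    out.loc[out["ckmb_postop_day1"] <= q1, "ckmb_extreme"] = 0
    out.loc[out["ckmb_postop_day1"] >= q3, "ckmb_extreme"] = 1
    return out
```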
Development and validation of a rapid, aldehyde dehydrogenase bright-based cord blood potency assay.
Abstract:
Banked, unrelated umbilical cord blood provides access to hematopoietic stem cell transplantation for patients lacking matched bone marrow donors, yet 10% to 15% of patients experience graft failure or delayed engraftment. This may be due, at least in part, to inadequate potency of the selected cord blood unit (CBU). CBU potency is typically assessed before cryopreservation, neglecting changes in potency occurring during freezing and thawing. Colony-forming units (CFUs) have been previously shown to predict CBU potency, defined as the ability to engraft in patients by day 42 posttransplant. However, the CFU assay is difficult to standardize and requires 2 weeks to perform. Consequently, we developed a rapid multiparameter flow cytometric CBU potency assay that enumerates cells expressing high levels of the enzyme aldehyde dehydrogenase (ALDH bright [ALDH(br)]), along with viable CD45(+) or CD34(+) cell content. These measurements are made on a segment that was attached to a cryopreserved CBU. We validated the assay with prespecified criteria testing accuracy, specificity, repeatability, intermediate precision, and linearity. We then prospectively examined the correlations among ALDH(br), CD34(+), and CFU content of 3908 segments over a 5-year period. ALDH(br) (r = 0.78; 95% confidence interval [CI], 0.76-0.79), but not CD34(+) (r = 0.25; 95% CI, 0.22-0.28), was strongly correlated with CFU content as well as ALDH(br) content of the CBU. These results suggest that the ALDH(br) segment assay (based on unit characteristics measured before release) is a reliable assessment of potency that allows rapid selection and release of CBUs from the cord blood bank to the transplant center for transplantation.
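Correlations of the kind reported here (e.g., r = 0.78 with a 95% CI of 0.76-0.79 for ALDH(br) versus CFU content) can be computed with a Pearson correlation and a Fisher z-transform confidence interval. The sketch below illustrates that calculation on placeholder data; it is not the authors' analysis code, and the simulated values only stand in for per-segment ALDH(br) and CFU counts.

```python
import numpy as np
from scipy import stats

def pearson_with_ci(x, y, alpha=0.05):
    """Pearson r with an approximate (1 - alpha) CI via the Fisher z-transform."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    r, _ = stats.pearsonr(x, y)
    n = len(x)
    z = np.arctanh(r)                      # Fisher z-transform of r
    se = 1.0 / np.sqrt(n - 3)              # standard error of z
    zcrit = stats.norm.ppf(1 - alpha / 2)
    lo, hi = np.tanh(z - zcrit * se), np.tanh(z + zcrit * se)
    return r, (lo, hi)

# Placeholder data standing in for per-segment ALDH(br) counts and CFU counts.
rng = np.random.default_rng(0)
aldh_br = rng.lognormal(3, 0.5, 200)
cfu = 0.8 * aldh_br + rng.normal(0, 5, 200)
r, ci = pearson_with_ci(aldh_br, cfu)
print(f"r = {r:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```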
Abstract:
Hematopoietic stem cell transplantation (HSCT) is the only curative treatment for most children with osteopetrosis (OP). Timing of HSCT is critical; therefore, umbilical cord blood transplantation (UCBT) is an attractive option. We analyzed outcomes after UCBT in 51 OP children. Median age at UCBT was 6 months. Seventy-seven percent of the cord blood grafts had 0 or 1 HLA disparity with the recipient. The conditioning regimen was myeloablative (busulfan-based in 84% and treosulfan-based in 10% of patients). Antithymocyte globulin was given to 90% of patients. Median numbers of total nucleated and CD34(+) cells infused were 14 × 10(7)/kg and 3.4 × 10(5)/kg, respectively. Median follow-up for survivors was 74 months. Cumulative incidence (CI) of neutrophil recovery was 67%, with a median time to recovery of 23 days; 33% of patients had graft failure, 81% of engrafted patients had full donor engraftment, and 19% had mixed donor chimerism. Day 100 CI of acute graft-versus-host disease (grades II to IV) was 31%, and 6-year CI of chronic graft-versus-host disease was 21%. Mechanical ventilation was required in 28%, and veno-occlusive disease was diagnosed in 16% of cases. The 6-year overall survival rate was 46%. Comparative studies with other alternative donors should be performed to evaluate whether UCBT remains a valid alternative for children with OP without an HLA-matched donor.
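Cumulative incidence in this setting treats graft failure (or death) as a competing event for neutrophil recovery rather than censoring it. The sketch below is a minimal, hand-rolled Aalen-Johansen-style estimator illustrating that idea; the event codes and data layout are hypothetical and not drawn from the study.

```python
import numpy as np

def cumulative_incidence(times, events, cause):
    """Aalen-Johansen cumulative incidence for one cause in the presence of competing risks.

    times  : event or censoring time for each subject
    events : 0 = censored, otherwise an integer cause code
    cause  : cause of interest (e.g. 1 = neutrophil recovery, 2 = graft failure/death)
    """
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]

    at_risk = len(times)
    surv = 1.0          # overall event-free survival just before the current time
    cif = 0.0
    out_t, out_cif = [0.0], [0.0]
    for t in np.unique(times):
        mask = times == t
        d_any = np.count_nonzero(events[mask] != 0)
        d_cause = np.count_nonzero(events[mask] == cause)
        if at_risk > 0:
            cif += surv * d_cause / at_risk     # increment CIF for the cause of interest
            surv *= 1.0 - d_any / at_risk       # update overall event-free survival
        at_risk -= np.count_nonzero(mask)       # remove events and censorings at t
        out_t.append(t)
        out_cif.append(cif)
    return np.array(out_t), np.array(out_cif)
```

Treating the competing event this way avoids the upward bias that a naive Kaplan-Meier estimate (censoring graft failures) would introduce.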
Abstract:
Late outgrowth endothelial progenitor cells (EPCs) derived from the peripheral blood of patients with significant coronary artery disease were sodded into the lumens of small-diameter expanded polytetrafluoroethylene (ePTFE) vascular grafts. Grafts (1 mm inner diameter) were denucleated and sodded either with native EPCs or with EPCs transfected with an adenoviral vector containing the gene for human thrombomodulin (EPC+AdTM). EPC+AdTM was shown to increase the in vitro rate of activated protein C (APC) production by the graft 4-fold over grafts sodded with untransfected EPCs (p<0.05). Unsodded control, EPC-sodded, and EPC+AdTM-sodded grafts were implanted bilaterally into the femoral arteries of athymic rats for 7 or 28 days. Unsodded control grafts, both with and without denucleation treatment, each exhibited 7-day patency rates of 25%. Unsodded grafts showed extensive thrombosis and were not tested for patency over 28 days. In contrast, grafts sodded with untransfected EPCs or EPC+AdTM both had 7-day patency rates of 88-89% and 28-day patency rates of 75-88%. Intimal hyperplasia was observed near both the proximal and distal anastomoses in all sodded graft conditions but did not appear to be the primary occlusive failure event. This in vivo study suggests that autologous EPCs derived from the peripheral blood of patients with coronary artery disease may improve the performance of synthetic vascular grafts, although no differences were observed between untransfected and TM-transfected EPCs.
Abstract:
Thymic graft-versus-host disease (tGVHD) can contribute to profound T cell deficiency and repertoire restriction after allogeneic BM transplantation (allo-BMT). However, the cellular mechanisms of tGVHD and interactions between donor alloreactive T cells and thymic tissues remain poorly defined. Using clinically relevant murine allo-BMT models, we show here that even minimal numbers of donor alloreactive T cells, which caused mild nonlethal systemic graft-versus-host disease, were sufficient to damage the thymus, delay T lineage reconstitution, and compromise donor peripheral T cell function. Furthermore, to mediate tGVHD, donor alloreactive T cells required trafficking molecules, including CCR9, L selectin, P selectin glycoprotein ligand-1, the integrin subunits alphaE and beta7, CCR2, and CXCR3, and costimulatory/inhibitory molecules, including Ox40 and carcinoembryonic antigen-associated cell adhesion molecule 1. We found that radiation in BMT conditioning regimens upregulated expression of the death receptors Fas and death receptor 5 (DR5) on thymic stromal cells (especially epithelium), while decreasing expression of the antiapoptotic regulator cellular caspase-8-like inhibitory protein. Donor alloreactive T cells used the cognate proteins FasL and TNF-related apoptosis-inducing ligand (TRAIL) (but not TNF or perforin) to mediate tGVHD, thereby damaging thymic stromal cells, cytoarchitecture, and function. Strategies that interfere with Fas/FasL and TRAIL/DR5 interactions may therefore represent a means to attenuate tGVHD and improve T cell reconstitution in allo-BMT recipients.
Abstract:
BACKGROUND: Heart failure is characterized by abnormalities in beta-adrenergic receptor (betaAR) signaling, including increased level of myocardial betaAR kinase 1 (betaARK1). Our previous studies have shown that inhibition of betaARK1 with the use of the Gbetagamma sequestering peptide of betaARK1 (betaARKct) can prevent cardiac dysfunction in models of heart failure. Because inhibition of betaARK activity is pivotal for amelioration of cardiac dysfunction, we investigated whether the level of betaARK1 inhibition correlates with the degree of heart failure. METHODS AND RESULTS: Transgenic (TG) mice with varying degrees of cardiac-specific expression of betaARKct peptide underwent transverse aortic constriction (TAC) for 12 weeks. Cardiac function was assessed by serial echocardiography in conscious mice, and the level of myocardial betaARKct protein was quantified at termination of the study. TG mice showed a positive linear relationship between the level of betaARKct protein expression and fractional shortening at 12 weeks after TAC. TG mice with low betaARKct expression developed severe heart failure, whereas mice with high betaARKct expression showed significantly less cardiac deterioration than wild-type (WT) mice. Importantly, mice with a high level of betaARKct expression had preserved isoproterenol-stimulated adenylyl cyclase activity and normal betaAR densities in the cardiac membranes. In contrast, mice with low expression of the transgene had marked abnormalities in betaAR function, similar to the WT mice. CONCLUSIONS: These data show that the level of betaARK1 inhibition determines the degree to which cardiac function can be preserved in response to pressure overload and has important therapeutic implications when betaARK1 inhibition is considered as a molecular target.
Abstract:
Antibodies specific for the beta(1)-adrenergic receptor are found in patients with chronic heart failure of various etiologies. From work presented in this issue of the JCI, we can now infer that these antibodies actually contribute to the pathogenesis of chronic heart failure. This commentary discusses mechanisms by which these antibodies may engender cardiomyopathy.
Abstract:
Chronic human heart failure is characterized by abnormalities in beta-adrenergic receptor (betaAR) signaling, including increased levels of betaAR kinase 1 (betaARK1), which seems critical to the pathogenesis of the disease. To determine whether inhibition of betaARK1 is sufficient to rescue a model of severe heart failure, we mated transgenic mice overexpressing a peptide inhibitor of betaARK1 (betaARKct) with transgenic mice overexpressing the sarcoplasmic reticulum Ca(2+)-binding protein, calsequestrin (CSQ). CSQ mice have a severe cardiomyopathy and markedly shortened survival (9 +/- 1 weeks). In contrast, CSQ/betaARKct mice exhibited a significant increase in mean survival age (15 +/- 1 weeks; P < 0.0001) and showed less cardiac dilation, and cardiac function was significantly improved (CSQ vs. CSQ/betaARKct, left ventricular end diastolic dimension 5.60 +/- 0.17 mm vs. 4.19 +/- 0.09 mm, P < 0.005; % fractional shortening, 15 +/- 2 vs. 36 +/- 2, P < 0.005). The enhancement of the survival rate in CSQ/betaARKct mice was substantially potentiated by chronic treatment with the betaAR antagonist metoprolol (CSQ/betaARKct nontreated vs. CSQ/betaARKct metoprolol treated, 15 +/- 1 weeks vs. 25 +/- 2 weeks, P < 0.0001). Thus, overexpression of the betaARKct resulted in a marked prolongation in survival and improved cardiac function in a mouse model of severe cardiomyopathy that can be potentiated with beta-blocker therapy. These data demonstrate a significant synergy between an established heart-failure treatment and the strategy of betaARK1 inhibition.
Abstract:
Heart failure is accompanied by severely impaired beta-adrenergic receptor (betaAR) function, which includes loss of betaAR density and functional uncoupling of remaining receptors. An important mechanism for the rapid desensitization of betaAR function is agonist-stimulated receptor phosphorylation by the betaAR kinase (betaARK1), an enzyme known to be elevated in failing human heart tissue. To investigate whether alterations in betaAR function contribute to the development of myocardial failure, transgenic mice with cardiac-restricted overexpression of either a peptide inhibitor of betaARK1 or the beta2AR were mated into a genetic model of murine heart failure (MLP-/-). In vivo cardiac function was assessed by echocardiography and cardiac catheterization. Both MLP-/- and MLP-/-/beta2AR mice had enlarged left ventricular (LV) chambers with significantly reduced fractional shortening and mean velocity of circumferential fiber shortening. In contrast, MLP-/-/betaARKct mice had normal LV chamber size and function. Basal LV contractility in the MLP-/-/betaARKct mice, as measured by LV dP/dtmax, was increased significantly compared with the MLP-/- mice but less than controls. Importantly, heightened betaAR desensitization in the MLP-/- mice, measured in vivo (responsiveness to isoproterenol) and in vitro (isoproterenol-stimulated membrane adenylyl cyclase activity), was completely reversed with overexpression of the betaARK1 inhibitor. We report here the striking finding that overexpression of this inhibitor prevents the development of cardiomyopathy in this murine model of heart failure. These findings implicate abnormal betaAR-G protein coupling in the pathogenesis of the failing heart and point the way toward development of agents to inhibit betaARK1 as a novel mode of therapy.
Abstract:
Grafts can be rejected even when matched for MHC because of differences in the minor histocompatibility Ags (mH-Ags). H4- and H60-derived epitopes are known as immunodominant mH-Ags in H2(b)-compatible BALB.B to C57BL/6 transplantation settings. Although multiple explanations have been offered for the immunodominance of Ags, the role of vascularization of the graft has yet to be determined. In this study, we used heart (vascularized) and skin (nonvascularized) transplantations to determine the role of primary vascularization of the graft. A higher IFN-γ response toward the H60 peptide occurred in heart recipients. In contrast, a higher IFN-γ response was generated against the H4 peptide in skin transplant recipients. Peptide-loaded tetramer staining revealed a distinct antigenic hierarchy between heart and skin transplantation: H60-specific CD8(+) T cells were the most abundant after heart transplantation, whereas H4-specific CD8(+) T cells were more abundant after skin grafting. Neither the tissue-specific distribution of mH-Ags nor the draining lymph node-derived dendritic cells correlated with the observed immunodominance. Interestingly, non-primarily vascularized cardiac allografts mimicked skin grafts in the observed immunodominance, and H60 immunodominance was observed in primarily vascularized skin grafts. However, T cell depletion from the BALB.B donor prior to cardiac allografting induced H4 immunodominance in vascularized cardiac allografts. Collectively, our data suggest that immediate transmigration of donor T cells via primary vascularization is responsible for the immunodominance of the H60 mH-Ag in organ and tissue transplantation.
Abstract:
BACKGROUND: In Tanzania, HIV-1 RNA testing is rarely available and is not standard of care. Determining virologic failure is challenging, and resistance mutations accumulate, thereby compromising second-line therapy. We evaluated durability of antiretroviral therapy (ART) and predictors of virologic failure among a pediatric cohort at four-year follow-up. METHODS: This was a prospective cross-sectional study with retrospective chart review evaluating a perinatally HIV-infected Tanzanian cohort enrolled in 2008-09 with repeat HIV-1 RNA in 2012-13. Demographic, clinical, and laboratory data were extracted from charts, resistance mutations from 2008-09 were analyzed, and prospective HIV RNA was obtained. RESULTS: 161 (78%) participants of the original cohort consented to repeat HIV RNA testing. The average age was 12.2 years (55% adolescents ≥12 years). Average time on ART was 6.4 years, with 41% receiving second-line (protease inhibitor-based) therapy. Among those originally suppressed on a first-line (non-nucleoside reverse transcriptase inhibitor-based) regimen, 76% remained suppressed. Of those originally failing first-line therapy, 88% were switched to second-line and 72% had suppressed virus. Increased level of viremia and duration of ART trended with an increased number of thymidine analogue mutations (TAMs). Increased TAMs increased the odds of virologic failure (p = 0.18), as did adolescent age (p < 0.01). CONCLUSIONS: After viral load testing in 2008-09, many participants switched to second-line therapy. The majority achieved virologic suppression despite multiple resistance mutations. Although virologic testing would likely hasten the switch to second-line therapy among those failing, methods to improve adherence are critical to maximize durability of ART and improve virologic outcomes among youth in resource-limited settings.
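The odds estimates above imply a logistic model of virologic failure on the number of TAMs and adolescent age. The sketch below shows what such a model might look like on simulated placeholder data; the variable names, effect sizes, and data are assumptions for illustration only, not the study's dataset or analysis code.

```python
import numpy as np
import statsmodels.api as sm

# Simulated placeholder data: number of TAMs, adolescent indicator, and failure outcome.
rng = np.random.default_rng(1)
n = 161
tams = rng.poisson(1.5, n)
adolescent = rng.integers(0, 2, n)
logit_p = -1.5 + 0.3 * tams + 0.9 * adolescent        # assumed true effects
failure = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Logistic regression of virologic failure on TAM count and adolescent age.
X = sm.add_constant(np.column_stack([tams, adolescent]))
model = sm.Logit(failure, X).fit(disp=0)
odds_ratios = np.exp(model.params)  # odds ratios for intercept, TAMs, adolescent age
print(odds_ratios)
```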
Abstract:
BACKGROUND: Blocking leukocyte function-associated antigen (LFA)-1 in organ transplant recipients prolongs allograft survival. However, the precise mechanisms underlying the therapeutic potential of LFA-1 blockade in preventing chronic rejection are not fully elucidated. Cardiac allograft vasculopathy (CAV) is the preeminent cause of late cardiac allograft failure, characterized histologically by concentric intimal hyperplasia. METHODS: Anti-LFA-1 monoclonal antibody was used in a multiple minor antigen-mismatched, BALB.B (H-2(b)) to C57BL/6 (H-2(b)) cardiac allograft model. Endogenous donor-specific CD8 T cells were tracked using major histocompatibility complex multimers against the immunodominant H4, H7, H13, H28, and H60 minor Ags. RESULTS: LFA-1 blockade prevented acute rejection and preserved palpable beating quality, with reduced CD8 T-cell graft infiltration. Interestingly, the lower CD8 T-cell infiltration was secondary to a reduction in T-cell expansion rather than in trafficking. LFA-1 blockade significantly suppressed the clonal expansion of minor histocompatibility antigen-specific CD8 T cells during the expansion and contraction phases. CAV development was evaluated by morphometric analysis at postoperative day 100. LFA-1 blockade profoundly attenuated neointimal hyperplasia (61.6% vs 23.8%; P < 0.05), CAV-affected vessel number (55.3% vs 15.9%; P < 0.05), and myocardial fibrosis (grade 3.29 vs 1.8; P < 0.05). Finally, short-term LFA-1 blockade promoted long-term donor-specific regulation, which resulted in attenuated transplant arteriosclerosis. CONCLUSIONS: Taken together, LFA-1 blockade inhibits initial endogenous alloreactive T-cell expansion and induces more regulation. Such a mechanism supports a pulse tolerance induction strategy with anti-LFA-1 rather than long-term treatment.
Abstract:
BACKGROUND: Recent studies suggest that there is a learning curve for metal-on-metal hip resurfacing. The purpose of this study was to assess whether implant positioning changed with surgeon experience and whether positioning and component sizing were associated with implant longevity. METHODS: We evaluated the first 361 consecutive hip resurfacings performed by a single surgeon, with a mean follow-up of 59 months (range, 28 to 87 months). Pre- and postoperative radiographs were assessed to determine the inclination of the acetabular component, as well as the sagittal and coronal femoral stem-neck angles. Changes in the precision of component placement were determined by assessing changes in the standard deviation of each measurement using variance ratio and linear regression analysis. Additionally, the cup and stem-shaft angles, as well as component sizes, were compared between the 31 hips that failed over the follow-up period and the surviving components to assess for any differences that might have been associated with an increased risk of failure. RESULTS: Surgeon experience was correlated with improved precision of the anteroposterior and lateral positioning of the femoral component. However, femoral and acetabular radiographic implant positioning angles were not different between the surviving hips and the failures. The failures had smaller mean femoral component diameters than the non-failure group (44 versus 47 millimeters). CONCLUSIONS: These results suggest that there may be differences in implant positioning in early versus late learning curve procedures, but that in the absence of recognized risk factors such as intraoperative notching of the femoral neck and cup inclination in excess of 50 degrees, component positioning does not appear to be associated with failure. Nevertheless, surgeons should exercise caution when operating on patients with small femoral necks, especially early in the learning curve.
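The variance ratio analysis mentioned above compares the spread of a positioning measurement (for example, acetabular cup inclination) between earlier and later cases in the series. The sketch below shows a generic two-sided F test of that kind on simulated angles; the grouping into "early" and "late" cases and the values themselves are illustrative assumptions, not the study data.

```python
import numpy as np
from scipy import stats

def variance_ratio_test(early, late):
    """Two-sided F test comparing the variances of two groups of angle measurements."""
    early, late = np.asarray(early, float), np.asarray(late, float)
    f = np.var(early, ddof=1) / np.var(late, ddof=1)   # ratio of sample variances
    dfn, dfd = len(early) - 1, len(late) - 1
    # Two-sided p-value for the variance ratio under the F distribution.
    p = 2 * min(stats.f.cdf(f, dfn, dfd), stats.f.sf(f, dfn, dfd))
    return f, p

# Placeholder acetabular inclination angles (degrees) for early vs late cases.
rng = np.random.default_rng(2)
early_cases = rng.normal(45, 6.0, 100)   # wider spread early in the learning curve (assumed)
late_cases = rng.normal(45, 3.5, 100)    # tighter spread with experience (assumed)
f, p = variance_ratio_test(early_cases, late_cases)
print(f"F = {f:.2f}, p = {p:.4f}")
```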