16 results for "Chronic allograft nephropathy"
at Duke University
Abstract:
Chronic allograft rejection is a major impediment to long-term transplant success. Humoral immune responses to alloantigens are a growing clinical problem in transplantation, with mounting evidence associating alloantibodies with the development of chronic rejection. Nearly a third of transplant recipients develop de novo alloantibodies, which no established therapy can prevent or eliminate, highlighting the need for a nonhuman primate model of antibody-mediated rejection. In this report, we demonstrate that depletion using anti-CD3 immunotoxin (IT), combined with maintenance immunosuppression that included tacrolimus with or without alefacept, reliably prolonged renal allograft survival in rhesus monkeys. In these animals, a preferential skewing toward CD4 repopulation and proliferation was observed, particularly with the addition of alefacept. Furthermore, alefacept-treated animals demonstrated increased alloantibody production (100%) and morphologic features of antibody-mediated injury. In vitro, alefacept was found to enhance CD4 effector memory T cell proliferation. In conclusion, alefacept administration after depletion, together with tacrolimus, promotes a CD4+ memory T cell and alloantibody response, with morphologic changes reflecting antibody-mediated allograft injury. Early and consistent de novo alloantibody production with associated histologic changes makes this nonhuman primate model an attractive candidate for evaluating targeted therapeutics.
Abstract:
Although the etiology of chronic rejection (CR) is multifactorial, donor-specific antibody (DSA) is considered to have a causal role in CR development. The antibody-mediated mechanisms operating during CR are currently poorly understood owing to a lack of suitable animal models and tools. In a clinical setting, we previously demonstrated that induction therapy by lymphocyte depletion with alemtuzumab (anti-human CD52) is associated with an increased incidence of serum alloantibody, C4d deposition, and antibody-mediated rejection in human patients. In this study, the effects of T cell depletion on the development of antibody-mediated rejection were examined using human CD52 transgenic (CD52Tg) mice treated with alemtuzumab. Fully mismatched cardiac allografts transplanted into alemtuzumab-treated CD52Tg mice showed no acute rejection, whereas untreated recipients acutely rejected their grafts. However, approximately half of the long-term recipients showed an increased degree of vasculopathy, fibrosis, and perivascular C3d deposition at posttransplant day 100. The development of CR correlated with DSA and C3d deposition in the graft. Using novel tracking tools to monitor donor-specific B cells, alloreactive B cells were shown to increase in parallel with DSA detection. This animal model could provide a means of understanding the mechanisms of chronic rejection and of testing therapeutic approaches to prevent it.
Abstract:
BACKGROUND: Blocking leukocyte function-associated antigen (LFA)-1 in organ transplant recipients prolongs allograft survival. However, the precise mechanisms underlying the therapeutic potential of LFA-1 blockade in preventing chronic rejection are not fully elucidated. Cardiac allograft vasculopathy (CAV) is the preeminent cause of late cardiac allograft failure and is characterized histologically by concentric intimal hyperplasia. METHODS: Anti-LFA-1 monoclonal antibody was used in a multiple minor antigen-mismatched, BALB.B (H-2b) to C57BL/6 (H-2b), cardiac allograft model. Endogenous donor-specific CD8 T cells were tracked using major histocompatibility complex multimers against the immunodominant H4, H7, H13, H28, and H60 minor antigens. RESULTS: LFA-1 blockade prevented acute rejection and preserved palpable beating quality with reduced CD8 T-cell graft infiltration. Interestingly, the reduced CD8 T-cell infiltration reflected diminished T-cell expansion rather than reduced trafficking. LFA-1 blockade significantly suppressed the clonal expansion of minor histocompatibility antigen-specific CD8 T cells during both the expansion and contraction phases. CAV development was evaluated by morphometric analysis at postoperative day 100. LFA-1 blockade profoundly attenuated neointimal hyperplasia (61.6% vs 23.8%; P < 0.05), the proportion of CAV-affected vessels (55.3% vs 15.9%; P < 0.05), and myocardial fibrosis (grade 3.29 vs 1.8; P < 0.05). Finally, short-term LFA-1 blockade promoted long-term donor-specific regulation, which resulted in attenuated transplant arteriosclerosis. CONCLUSIONS: Taken together, LFA-1 blockade inhibits the initial endogenous alloreactive T-cell expansion and induces greater regulation. Such a mechanism supports a pulse tolerance induction strategy with anti-LFA-1 rather than long-term treatment.
Abstract:
The autosomal recessive kidney disease nephronophthisis (NPHP) constitutes the most frequent genetic cause of terminal renal failure in the first 3 decades of life. Ten causative genes (NPHP1-NPHP9 and NPHP11), whose products localize to the primary cilia-centrosome complex, support the unifying concept that cystic kidney diseases are "ciliopathies". Using genome-wide homozygosity mapping, we report here what we believe to be a new locus (NPHP-like 1 [NPHPL1]) for an NPHP-like nephropathy. In 2 families with an NPHP-like phenotype, we detected homozygous frameshift and splice-site mutations, respectively, in the X-prolyl aminopeptidase 3 (XPNPEP3) gene. In contrast to all known NPHP proteins, XPNPEP3 localizes to mitochondria of renal cells. However, in vivo analyses also revealed a likely cilia-related function; suppression of zebrafish xpnpep3 phenocopied the developmental phenotypes of ciliopathy morphants, and this effect was rescued by human XPNPEP3 that was devoid of a mitochondrial localization signal. Consistent with a role for XPNPEP3 in ciliary function, several ciliary cystogenic proteins were found to be XPNPEP3 substrates, for which resistance to N-terminal proline cleavage resulted in attenuated protein function in vivo in zebrafish. Our data highlight an emerging link between mitochondria and ciliary dysfunction, and suggest that further understanding the enzymatic activity and substrates of XPNPEP3 will illuminate novel cystogenic pathways.
Abstract:
BACKGROUND: Heart failure is characterized by abnormalities in β-adrenergic receptor (βAR) signaling, including increased levels of myocardial βAR kinase 1 (βARK1). Our previous studies have shown that inhibition of βARK1 with the Gβγ-sequestering peptide of βARK1 (βARKct) can prevent cardiac dysfunction in models of heart failure. Because inhibition of βARK activity is pivotal for amelioration of cardiac dysfunction, we investigated whether the level of βARK1 inhibition correlates with the degree of heart failure. METHODS AND RESULTS: Transgenic (TG) mice with varying degrees of cardiac-specific expression of the βARKct peptide underwent transverse aortic constriction (TAC) for 12 weeks. Cardiac function was assessed by serial echocardiography in conscious mice, and the level of myocardial βARKct protein was quantified at termination of the study. TG mice showed a positive linear relationship between the level of βARKct protein expression and fractional shortening at 12 weeks after TAC. TG mice with low βARKct expression developed severe heart failure, whereas mice with high βARKct expression showed significantly less cardiac deterioration than wild-type (WT) mice. Importantly, mice with a high level of βARKct expression had preserved isoproterenol-stimulated adenylyl cyclase activity and normal βAR densities in cardiac membranes. In contrast, mice with low expression of the transgene had marked abnormalities in βAR function, similar to the WT mice. CONCLUSIONS: These data show that the level of βARK1 inhibition determines the degree to which cardiac function can be preserved in response to pressure overload, and has important therapeutic implications when βARK1 inhibition is considered as a molecular target.
Abstract:
RATIONALE: Asthma is prospectively associated with age-related chronic diseases and mortality, suggesting the hypothesis that asthma may relate to a general, multisystem phenotype of accelerated aging. OBJECTIVES: To test whether chronic asthma is associated with a proposed biomarker of accelerated aging, leukocyte telomere length. METHODS: Asthma was ascertained prospectively in the Dunedin Multidisciplinary Health and Development Study cohort (n = 1,037) at nine in-person assessments spanning ages 9-38 years. Leukocyte telomere length was measured at ages 26 and 38 years. Asthma was classified as life-course-persistent, childhood-onset not meeting criteria for persistence, and adolescent/adult-onset. We tested associations between asthma and leukocyte telomere length using regression models. We tested for confounding of asthma-leukocyte telomere length associations using covariate adjustment. We tested serum C-reactive protein and white blood cell counts as potential mediators of asthma-leukocyte telomere length associations. MEASUREMENTS AND MAIN RESULTS: Study members with life-course-persistent asthma had shorter leukocyte telomere length as compared with sex- and age-matched peers with no reported asthma. In contrast, leukocyte telomere length in study members with childhood-onset and adolescent/adult-onset asthma was not different from leukocyte telomere length in peers with no reported asthma. Adjustment for life histories of obesity and smoking did not change results. Study members with life-course-persistent asthma had elevated blood eosinophil counts. Blood eosinophil count mediated 29% of the life-course-persistent asthma-leukocyte telomere length association. CONCLUSIONS: Life-course-persistent asthma is related to a proposed biomarker of accelerated aging, possibly via systemic eosinophilic inflammation. Life histories of asthma can inform studies of aging.
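The "29% mediated" figure above is the share of the total asthma-telomere association accounted for by the eosinophil pathway. A minimal sketch of the difference-method calculation behind such a statement, using hypothetical regression coefficients for illustration (the abstract does not report the actual coefficients):

```python
def proportion_mediated(total_effect: float, direct_effect: float) -> float:
    """Difference method: fraction of the total exposure effect that
    disappears once the mediator is added to the regression model."""
    return (total_effect - direct_effect) / total_effect

# Hypothetical standardized coefficients, chosen only to illustrate:
# total effect of persistent asthma on telomere length     = -0.100
# direct effect after adjusting for blood eosinophil count = -0.071
share = proportion_mediated(-0.100, -0.071)
print(f"proportion mediated = {share:.0%}")
```

A share of 0.29 under these made-up coefficients corresponds to the kind of "29% mediated" statement reported in the study; formal mediation analyses additionally estimate confidence intervals for this quantity.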
Abstract:
The role of antibodies in chronic injury to organ transplants has been suggested for many years but has recently been emphasized by new data. We have observed that when immunosuppressive potency decreases, either through intentional weaning of maintenance agents or through homeostatic repopulation after immune cell depletion, the threshold of B cell activation may be lowered. In human transplant recipients the result may be donor-specific antibody (DSA), C4d+ injury, and chronic rejection. This scenario has precise parallels in a rhesus monkey renal allograft model in which T cells are depleted with CD3 immunotoxin, and in a CD52 T cell transgenic mouse model using alemtuzumab to deplete T cells. Such animal models may be useful for testing therapeutic strategies to prevent DSA. We agree with others who suggest that weaning of immunosuppression may place transplant recipients at risk of chronic antibody-mediated rejection, and that strategies to prevent this scenario are needed if we are to improve long-term graft and patient outcomes in transplantation. We believe that animal models will play a crucial role in defining the pathophysiology of antibody-mediated rejection and in developing effective therapies to prevent graft injury. Two such animal models are described herein.
Abstract:
Significance: This review article provides an overview of the critical roles of the innate immune system in wound healing. It explores how dysregulation of individual innate immune elements can compromise wound repair and promote nonhealing wounds. Understanding the key mechanisms by which wound healing fails will provide seed concepts for the development of new therapeutic approaches. Recent Advances: Our understanding of the complex interactions of the innate immune system in wound healing has improved significantly, particularly regarding the role of antimicrobial peptides and the nature of the switch from inflammatory to reparative processes. This takes place against an emerging understanding of the relationship between human cells and commensal bacteria in the skin. Critical Issues: It is well established that early local inflammatory mediators in the wound bed function as an immunological vehicle to facilitate immune cell infiltration and microbial clearance upon injury to the skin barrier. Both impaired and excessive innate immune responses can promote nonhealing wounds. The switch from the inflammatory to the proliferative phase appears to be tightly regulated and mediated, at least in part, by a change in macrophage phenotype. Defining the factors that initiate this switch in macrophage phenotype and function is the subject of multiple investigations. Future Directions: The review highlights processes that may be useful targets for further investigation, particularly the switch from M1 to M2 macrophages, which appears critical because its dysregulation occurs during defective wound healing.
Abstract:
Opioids are efficacious and cost-effective analgesics, but tolerance limits their effectiveness. This paper presents no new clinical or experimental data but demonstrates that ascending sensory pathways exist that contain few opioid receptors. These pathways have been localized by brain PET scans and spinal cord autoradiography. They include portions of the ventral spinothalamic tract originating in Rexed layers VI-VIII, thalamocortical fibers that project to the primary somatosensory cortex (S1), and possibly a midline dorsal column visceral pathway. One hypothesis is that opioid tolerance and opioid-induced hyperalgesia may be caused by homeostatic upregulation of nonopioid-dependent ascending pain pathways during opioid exposure. Upregulation of sensory pathways is not a new concept and has been demonstrated in individuals with deafness or blindness. A second hypothesis is that adjuvant nonopioid therapies may inhibit ascending nonopioid-dependent pathways, which would support the clinical observation that monotherapy with opioids usually fails. The uniqueness of opioid tolerance compared with tolerance to other central nervous system medications, and the absence of tolerance to excess hormone production, is discussed, as is experimental work that could prove or disprove these concepts, along with their potential flaws.
Abstract:
To identify patients at increased risk of cardiovascular (CV) outcomes, apparent treatment-resistant hypertension (aTRH) is defined as a blood pressure above goal despite the use of 3 or more antihypertensive therapies of different classes at maximally tolerated doses, ideally including a diuretic. Recent epidemiologic studies in selected populations estimate the prevalence of aTRH at 10% to 15% among patients with hypertension and show that aTRH is associated with an elevated risk of CV and renal outcomes. Additionally, aTRH and chronic kidney disease (CKD) are associated. Although the pathogenesis of aTRH is multifactorial, the kidney is believed to play a significant role. Increased volume expansion, aldosterone concentration, mineralocorticoid receptor activity, arterial stiffness, and sympathetic nervous system activity are central to the pathogenesis of aTRH and are targets of therapy. Although diuretics form the basis of therapy in aTRH, pathophysiologic and clinical data suggest an important role for aldosterone antagonism. Interventional techniques such as renal denervation and carotid baroreceptor activation modulate the sympathetic nervous system and are currently in phase III trials for the treatment of aTRH. These technologies are as yet unproven and have not been investigated in relation to CV outcomes or in patients with CKD.
Abstract:
Mild traumatic brain injury (TBI) is a common source of morbidity from the wars in Iraq and Afghanistan. With no overt lesions on structural MRI, diagnosis of chronic mild TBI in military veterans relies on obtaining an accurate history and assessing behavioral symptoms that are also associated with frequent comorbid disorders, particularly posttraumatic stress disorder (PTSD) and depression. Military veterans from Iraq and Afghanistan with mild TBI and comorbid PTSD and depression (n = 30) and non-TBI participants from primary (n = 42) and confirmatory (n = 28) control groups were assessed with high angular resolution diffusion imaging (HARDI). White matter-specific registration followed by whole-brain voxelwise analysis of crossing fibers provided separate partial volume fractions reflecting the integrity of primary fibers and secondary (crossing) fibers. Loss of white matter integrity in primary fibers (P < 0.05; corrected) was associated with chronic mild TBI in a widely distributed pattern of major fiber bundles and smaller peripheral tracts, including the corpus callosum (genu, body, and splenium), forceps minor, forceps major, superior and posterior corona radiata, internal capsule, superior longitudinal fasciculus, and others. Distributed loss of white matter integrity correlated with duration of loss of consciousness and most notably with "feeling dazed or confused," but not with a diagnosis of PTSD or with depressive symptoms. This widespread spatial extent of white matter damage has typically been reported in moderate to severe TBI. The diffuse loss of white matter integrity appears consistent with systemic mechanisms of damage shared by blast- and impact-related mild TBI that involve a cascade of inflammatory and neurochemical events. © 2012 Wiley Periodicals, Inc.
Abstract:
BACKGROUND: In recent decades, low-level laser therapy (LLLT) has been widely used to relieve pain caused by various musculoskeletal disorders, yet its reported therapeutic outcomes are varied and conflicting. Results similarly conflict regarding its use in patients with nonspecific chronic low back pain (NSCLBP). This study investigated the efficacy of LLLT for the treatment of NSCLBP through a systematic literature search with meta-analyses of selected studies. METHOD: MEDLINE, EMBASE, ISI Web of Science, and the Cochrane Library were systematically searched from January 2000 to November 2014. Included studies were randomized controlled trials (RCTs) written in English that compared LLLT with placebo treatment in NSCLBP patients. The effect size was estimated by the weighted mean difference (WMD). Standard random-effects meta-analysis was used, and inconsistency was evaluated by the I² index. RESULTS: Of 221 studies, seven RCTs (one triple-blind, four double-blind, one single-blind, and one that did not mention blinding; 394 patients in total) met the inclusion criteria. Based on five studies, the WMD in visual analog scale (VAS) pain score after treatment was significantly lower in the LLLT group than with placebo (WMD = -13.57 [95% CI = -17.42, -9.72], I² = 0%). No significant treatment effect was identified for disability scores or spinal range-of-motion outcomes. CONCLUSIONS: Our findings indicate that LLLT is an effective method for relieving pain in NSCLBP patients. However, there is still a lack of evidence supporting its effect on function.
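The random-effects pooling with an I² inconsistency index described above can be sketched with the standard DerSimonian-Laird estimator. This is a generic illustration with made-up effect sizes and variances, not the trial data from the review:

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects meta-analysis.

    effects   : per-study mean differences (e.g. VAS change, LLLT - placebo)
    variances : per-study sampling variances of those differences
    Returns (pooled WMD, 95% CI tuple, I-squared in percent).
    """
    k = len(effects)
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    y_fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, effects))
    df = k - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                         # between-study variance
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # inconsistency index
    w_re = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, ci, i2

# Hypothetical per-study mean differences in VAS score (negative favors LLLT):
pooled, ci, i2 = dersimonian_laird([-14.0, -12.0, -15.0], [4.0, 5.0, 6.0])
```

When the between-study variability Q does not exceed its degrees of freedom, τ² is truncated to zero and the estimator reduces to a fixed-effect analysis, which matches the I² = 0% situation reported for the VAS outcome.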