12 results for Patellar ligament allograft

at Duke University


Relevance: 20.00%

Abstract:

Human adipose stem cells (hASCs) can differentiate into a variety of phenotypes. Native extracellular matrix (e.g., demineralized bone matrix or small intestinal submucosa) can influence the growth and differentiation of stem cells. The hypothesis of this study was that a novel ligament-derived matrix (LDM) would enhance expression of a ligamentous phenotype in hASCs compared to collagen gel alone. LDM prepared using phosphate-buffered saline or 0.1% peracetic acid was mixed with collagen gel (COL) and evaluated for its ability to induce proliferation, differentiation, and extracellular matrix synthesis in hASCs over 28 days in culture at different seeding densities (0, 0.25 × 10^6, 1 × 10^6, or 2 × 10^6 hASCs/mL). Biochemical and gene expression data were analyzed using analysis of variance, with Fisher's least significant difference test used to determine differences between treatments. hASCs in either LDM or COL demonstrated changes in gene expression consistent with ligament development. hASCs cultured with LDM demonstrated greater dsDNA content, sulfated glycosaminoglycan accumulation, and type I and III collagen synthesis, and released more sulfated glycosaminoglycan and collagen into the medium than hASCs in COL. These findings suggest that LDM enhances expression of a ligament phenotype in hASCs and may provide a novel scaffold material for ligament engineering applications.

Relevance: 20.00%

Abstract:

BACKGROUND: Anterior cruciate ligament (ACL) reconstruction is associated with a high incidence of second tears (graft tears and contralateral ACL tears). These secondary tears have been attributed to asymmetrical lower extremity mechanics. Knee bracing is one potential intervention during rehabilitation with the potential to normalize lower extremity asymmetry; however, little is known about the effect of bracing on movement asymmetry in patients following ACL reconstruction. HYPOTHESIS: Wearing a knee brace would increase knee joint flexion, and joint mechanics would become more symmetrical in the braced condition. OBJECTIVE: To examine how knee bracing affects knee joint function and symmetry in patients 6 months following ACL reconstruction. STUDY DESIGN: Controlled laboratory study. LEVEL OF EVIDENCE: Level 3. METHODS: Twenty-three adolescent patients rehabilitating from ACL reconstruction surgery were recruited. Six months after surgery, all subjects underwent a motion analysis assessment during a stop-jump activity with and without a functional knee brace that resisted extension on the surgical side. Statistical analysis used a 2 × 2 (limb × brace) analysis of variance with a significance level of 0.05. RESULTS: Subjects had increased knee flexion on the surgical side when braced. The brace condition increased knee flexion velocity, decreased the initial knee flexion angle, and increased the ground reaction force and knee extension moment on both limbs. Side-to-side asymmetry was present across conditions for the vertical ground reaction force and knee extension moment. CONCLUSION: Wearing a knee brace appears to increase lower extremity compliance and promote normalized loading on the surgical side.
CLINICAL RELEVANCE: Knee extension constraint bracing in postoperative ACL patients may improve symmetry of lower extremity mechanics, which is potentially beneficial in progressing rehabilitation and reducing the incidence of second ACL tears.

Relevance: 20.00%

Abstract:

Asymmetries in sagittal plane knee kinetics have been identified as a risk factor for anterior cruciate ligament (ACL) re-injury, and clinical tools are needed to identify these asymmetries. This study examined the relationships between knee kinetic asymmetries and ground reaction force (GRF) asymmetries during athletic tasks in adolescent patients following ACL reconstruction (ACL-R). Kinematic and GRF data were collected during a stop-jump task and a side-cutting task for 23 patients. Asymmetry indices between the surgical and non-surgical limbs were calculated for GRF and knee kinetic variables. For the stop-jump task, knee kinetic asymmetry indices were correlated with all GRF asymmetry indices (P < 0.05) except loading rate. The vertical GRF impulse asymmetry index predicted the peak knee moment, average knee moment, and knee work asymmetry indices (R² ≥ 0.78, P < 0.01). For the side-cutting task, knee kinetic asymmetry indices were correlated with the peak propulsion vertical GRF and vertical GRF impulse asymmetry indices (P < 0.05). The vertical GRF impulse asymmetry index predicted the peak knee moment, average knee moment, and knee work asymmetry indices (R² ≥ 0.55, P < 0.01). Vertical GRF asymmetries may thus be a viable surrogate for knee kinetic asymmetries and may assist in optimizing rehabilitation outcomes and minimizing re-injury rates.
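The asymmetry-index approach described above can be sketched in a few lines. The abstract does not give the study's exact formula, so the symmetric-percentage form below, the half-sine force traces, and all numbers are illustrative assumptions only:

```python
import math

def asymmetry_index(surgical, nonsurgical):
    # One common limb-symmetry formulation (an assumption; not necessarily
    # the study's): signed percentage difference normalized by the
    # between-limb mean. 0 means perfect symmetry.
    return 100.0 * (nonsurgical - surgical) / (0.5 * (surgical + nonsurgical))

def impulse(force, dt):
    # Vertical GRF impulse: trapezoidal integration of force over stance time.
    return dt * (sum(force) - 0.5 * (force[0] + force[-1]))

# Hypothetical vertical GRF traces for one stop-jump landing (1 kHz sampling),
# modeled as half-sine pulses with a lower peak on the surgical limb.
dt = 0.001
n = 250
surgical = [800.0 * math.sin(math.pi * i / (n - 1)) for i in range(n)]
nonsurgical = [1000.0 * math.sin(math.pi * i / (n - 1)) for i in range(n)]

ai = asymmetry_index(impulse(surgical, dt), impulse(nonsurgical, dt))
print(round(ai, 1))  # → 22.2
```

An index near zero would indicate symmetric loading; in the study, larger vertical GRF impulse asymmetry predicted larger knee kinetic asymmetry (R² ≥ 0.78 for the stop-jump task).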

Relevance: 20.00%

Abstract:

Even though the etiology of chronic rejection (CR) is multifactorial, donor-specific antibody (DSA) is considered to have a causal effect on CR development. Currently, the antibody-mediated mechanisms underlying CR are poorly understood due to a lack of proper animal models and tools. In a clinical setting, we previously demonstrated that induction therapy by lymphocyte depletion, using alemtuzumab (anti-human CD52), is associated with an increased incidence of serum alloantibody, C4d deposition, and antibody-mediated rejection in human patients. In this study, the effects of T cell depletion on the development of antibody-mediated rejection were examined using human CD52 transgenic (CD52Tg) mice treated with alemtuzumab. Fully mismatched cardiac allografts transplanted into alemtuzumab-treated CD52Tg mice showed no acute rejection, while untreated recipients acutely rejected their grafts. However, approximately half of the long-term recipients showed an increased degree of vasculopathy, fibrosis, and perivascular C3d deposition at posttransplant day 100. The development of CR correlated with DSA and C3d deposition in the graft. Using novel tracking tools to monitor donor-specific B cells, alloreactive B cells were shown to increase in accordance with DSA detection. This animal model could provide a means of dissecting the mechanisms of chronic rejection and of testing therapeutic approaches to prevent it.

Relevance: 20.00%

Abstract:

BACKGROUND: Blocking leukocyte function-associated antigen (LFA)-1 in organ transplant recipients prolongs allograft survival. However, the precise mechanisms underlying the therapeutic potential of LFA-1 blockade in preventing chronic rejection are not fully elucidated. Cardiac allograft vasculopathy (CAV) is the preeminent cause of late cardiac allograft failure, characterized histologically by concentric intimal hyperplasia. METHODS: Anti-LFA-1 monoclonal antibody was used in a multiple minor antigen-mismatched BALB.B (H-2b) to C57BL/6 (H-2b) cardiac allograft model. Endogenous donor-specific CD8 T cells were tracked using major histocompatibility complex multimers against the immunodominant H4, H7, H13, H28, and H60 minor antigens. RESULTS: LFA-1 blockade prevented acute rejection and preserved palpable beating quality with reduced CD8 T-cell graft infiltration. Interestingly, the reduced CD8 T-cell infiltration was secondary to a reduction in T-cell expansion rather than reduced trafficking. LFA-1 blockade significantly suppressed the clonal expansion of minor histocompatibility antigen-specific CD8 T cells during the expansion and contraction phases. CAV development was evaluated by morphometric analysis at postoperative day 100. LFA-1 blockade profoundly attenuated neointimal hyperplasia (61.6% vs 23.8%; P < 0.05), the number of CAV-affected vessels (55.3% vs 15.9%; P < 0.05), and myocardial fibrosis (grade 3.29 vs 1.8; P < 0.05). Finally, short-term LFA-1 blockade promoted long-term donor-specific regulation, which resulted in attenuated transplant arteriosclerosis. CONCLUSIONS: Taken together, LFA-1 blockade inhibits initial endogenous alloreactive T-cell expansion and induces greater regulation. Such a mechanism supports a pulse tolerance induction strategy with anti-LFA-1 rather than long-term treatment.

Relevance: 10.00%

Abstract:

BACKGROUND: Cryptococcosis occurring ≤30 days after transplantation is an unusual event, and its characteristics are not known. METHODS: Patients included 175 solid-organ transplant (SOT) recipients with cryptococcosis in a multicenter cohort. Very early-onset and late-onset cryptococcosis were defined as disease occurring ≤30 days or >30 days after transplantation, respectively. RESULTS: Very early-onset disease developed in 9 (5%) of the 175 patients at a mean of 5.7 days after transplantation. Overall, 55.6% (5 of 9) of the patients with very early-onset disease versus 25.9% (43 of 166) of the patients with late-onset disease were liver transplant recipients (P = .05). Very early cases were more likely to present with disease at unusual locations, including the transplanted allograft and surgical fossa/site infections (55.6% vs 7.2%; P < .001). Two very early cases with onset on day 1 after transplantation (in a liver transplant recipient with Cryptococcus isolated from the lung and a heart transplant recipient with fungemia) were likely the result of undetected pretransplant disease. An additional 5 cases involving the allograft or surgical sites were likely the result of donor-acquired infection. CONCLUSIONS: A subset of SOT recipients with cryptococcosis present very early after transplantation, with disease that appears to occur preferentially in liver transplant recipients and to involve unusual sites, such as the transplanted organ or the surgical site. These patients may have unrecognized pretransplant or donor-derived cryptococcosis.

Relevance: 10.00%

Abstract:

Depletional strategies directed toward achieving tolerance induction in organ transplantation have been associated with an increased incidence and risk of antibody-mediated rejection (AMR) and graft injury. Our clinical data suggest a correlation of increased serum B cell activating factor (BAFF) with increased risk of antibody-mediated rejection in alemtuzumab-treated patients. In the present study, we tested the ability of BAFF blockade (TACI-Ig) in a nonhuman primate AMR model to prevent alloantibody production and prolong allograft survival. Three animals received the AMR-inducing regimen (CD3-IT/alefacept/tacrolimus) with TACI-Ig (atacicept), compared with five control animals treated with the AMR-inducing regimen only. TACI-Ig treatment led to decreased levels of donor-specific antibody (DSA) in treated animals at 2 and 4 weeks posttransplantation (p < 0.05). In addition, peripheral B cell numbers were significantly lower at 6 weeks posttransplantation. However, the treatment provided only a modest, statistically nonsignificant prolongation of graft survival (59 ± 22 vs. 102 ± 47 days; p = 0.11). Histological analysis revealed a substantial reduction in findings typically associated with humoral rejection in atacicept-treated animals. More T cell-mediated rejection findings, with increased graft T cell infiltration, were observed with atacicept treatment, likely secondary to the prolonged graft survival. We show that BAFF/APRIL blockade using concomitant TACI-Ig treatment reduced the humoral component of rejection in our depletion-induced preclinical AMR model.

Relevance: 10.00%

Abstract:

The role of antibodies in chronic injury to organ transplants has been suggested for many years but recently emphasized by new data. We have observed that when immunosuppressive potency decreases, either through intentional weaning of maintenance agents or through homeostatic repopulation after immune cell depletion, the threshold of B cell activation may be lowered. In human transplant recipients the result may be donor-specific antibody (DSA), C4d+ injury, and chronic rejection. This scenario has precise parallels in a rhesus monkey renal allograft model in which T cells are depleted with CD3 immunotoxin, and in a CD52 T cell transgenic mouse model using alemtuzumab to deplete T cells. Such animal models may be useful for testing therapeutic strategies to prevent DSA. We agree with others who suggest that weaning of immunosuppression may place transplant recipients at risk of chronic antibody-mediated rejection, and that strategies to prevent this scenario are needed if we are to improve long-term graft and patient outcomes in transplantation. We believe that animal models will play a crucial role in defining the pathophysiology of antibody-mediated rejection and in developing effective therapies to prevent graft injury. Two such animal models are described herein.

Relevance: 10.00%

Abstract:

Chronic allograft rejection is a major impediment to long-term transplant success. Humoral immune responses to alloantigens are a growing clinical problem in transplantation, with mounting evidence associating alloantibodies with the development of chronic rejection. Nearly a third of transplant recipients develop de novo alloantibodies, which no established therapy effectively prevents or eliminates, highlighting the need for a nonhuman primate model of antibody-mediated rejection. In this report, we demonstrate that depletion using anti-CD3 immunotoxin (IT), combined with maintenance immunosuppression that included tacrolimus with or without alefacept, reliably prolonged renal allograft survival in rhesus monkeys. In these animals, a preferential skewing toward CD4 repopulation and proliferation was observed, particularly with the addition of alefacept. Furthermore, alefacept-treated animals demonstrated increased alloantibody production (100%) and morphologic features of antibody-mediated injury. In vitro, alefacept was found to enhance CD4 effector memory T cell proliferation. In conclusion, alefacept administration after depletion, together with tacrolimus, promotes a CD4+ memory T cell and alloantibody response, with morphologic changes reflecting antibody-mediated allograft injury. Early and consistent de novo alloantibody production with associated histological changes makes this nonhuman primate model an attractive candidate for evaluating targeted therapeutics.

Relevance: 10.00%

Abstract:

Grafts can be rejected even when matched for MHC because of differences in minor histocompatibility antigens (mH-Ags). H4- and H60-derived epitopes are known as immunodominant mH-Ags in H2b-compatible BALB.B to C57BL/6 transplantation settings. Although multiple explanations have been offered for the immunodominance of antigens, the role of vascularization of the graft is yet to be determined. In this study, we used heart (vascularized) and skin (nonvascularized) transplantation to determine the role of primary vascularization of the graft. A higher IFN-γ response toward H60 peptide occurred in heart recipients, whereas a higher IFN-γ response was generated against H4 peptide in skin transplant recipients. Peptide-loaded tetramer staining revealed a distinct antigenic hierarchy between heart and skin transplantation: H60-specific CD8+ T cells were the most abundant after heart transplantation, whereas H4-specific CD8+ T cells were more abundant after skin grafting. Neither the tissue-specific distribution of mH-Ags nor the draining lymph node-derived dendritic cells correlated with the observed immunodominance. Interestingly, non-primarily vascularized cardiac allografts mimicked skin grafts in the observed immunodominance, and H60 immunodominance was observed in primarily vascularized skin grafts. However, T cell depletion from the BALB.B donor prior to cardiac allografting induced H4 immunodominance in vascularized cardiac allografts. Collectively, our data suggest that immediate transmigration of donor T cells via primary vascularization is responsible for the immunodominance of the H60 mH-Ag in organ and tissue transplantation.

Relevance: 10.00%

Abstract:

BACKGROUND: Interleukin (IL)-15 is a chemotactic factor for T cells. It induces proliferation and promotes survival of activated T cells. IL-15 receptor blockade in mouse cardiac and islet allotransplant models has led to long-term engraftment and a regulatory T-cell environment. This study investigated the efficacy of IL-15 receptor blockade using Mut-IL-15/Fc in an outbred non-human primate model of renal allotransplantation. METHODS: Male cynomolgus macaque donor-recipient pairs were selected based on ABO typing, major histocompatibility complex class I typing, and carboxyfluorescein diacetate succinimidyl ester-based mixed lymphocyte responses. Once animals were assigned to one of six treatment groups, they underwent renal transplantation and bilateral native nephrectomy. Serum creatinine level was monitored twice weekly and as indicated, and protocol biopsies were performed. Rejection was defined as an increase in serum creatinine to 1.5 mg/dL or higher and was confirmed histologically. Complete blood counts and flow cytometric analyses were performed periodically posttransplant, and pharmacokinetic parameters of Mut-IL-15/Fc were assessed. RESULTS: Compared with control animals, Mut-IL-15/Fc-treated animals did not demonstrate increased graft survival despite adequate serum levels of Mut-IL-15/Fc. Flow cytometric analysis of white blood cell subgroups demonstrated a decrease in CD8 T-cell and natural killer cell numbers, although this did not reach statistical significance. Interestingly, two animals receiving Mut-IL-15/Fc developed infectious complications, whereas no infection was seen in control animals. Renal pathology varied widely. CONCLUSIONS: Peritransplant IL-15 receptor blockade does not prolong allograft survival in non-human primate renal transplantation; however, it reduces the number of CD8 T cells and natural killer cells in the peripheral blood.

Relevance: 10.00%

Abstract:

BACKGROUND: Controversies exist regarding the indications for unicompartmental knee arthroplasty. The objective of this study was to report the mid-term results and examine predictors of failure in a metal-backed unicompartmental knee arthroplasty design. METHODS: At a mean follow-up of 60 months, 80 medial unicompartmental knee arthroplasties (68 patients) were evaluated. Implant survivorship was analyzed using the Kaplan-Meier method. Knee Society objective and functional scores and radiographic characteristics were compared before surgery and at final follow-up. A Cox proportional hazards model was used to examine the association of patient age, gender, obesity (body mass index > 30 kg/m²), diagnosis, Knee Society scores, and patellar arthrosis with failure. RESULTS: There were 9 failures during the follow-up period. The mean Knee Society objective and functional scores were, respectively, 49 and 48 points preoperatively and 95 and 92 points postoperatively. The survival rate was 92% at 5 years and 84% at 10 years. Mean age was younger in the failure group than in the non-failure group (p < 0.01); however, none of the factors assessed was independently associated with failure in the Cox proportional hazards model. CONCLUSION: Gender, preoperative diagnosis, preoperative objective and functional scores, and patellar osteophytes were not independent predictors of failure of unicompartmental knee implants, although high body mass index trended toward significance. The findings suggest that the standard criteria for unicompartmental knee arthroplasty (UKA) may be expanded without compromising outcomes, although caution may be warranted in patients with very high body mass index pending additional data. LEVEL OF EVIDENCE: IV.
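The survival figures above (92% at 5 years, 84% at 10 years) come from a Kaplan-Meier analysis. As a rough sketch of how the estimator handles censored follow-up, the tiny cohort below is invented for illustration and is not the study's data:

```python
def kaplan_meier(times, events):
    # Kaplan-Meier survival estimator.
    #   times  : follow-up time for each implant (months)
    #   events : 1 if the implant failed (revision), 0 if censored
    # Returns a list of (event_time, survival_probability) steps.
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    survival = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        # Count failures and total exits (failures + censorings) at this time
        deaths = exits = 0
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            exits += 1
            i += 1
        if deaths:
            survival *= 1.0 - deaths / at_risk
            curve.append((t, survival))
        at_risk -= exits
    return curve

# Hypothetical cohort: failures at 24 and 60 months, the rest censored.
times = [24, 36, 48, 60, 72, 84, 96, 108, 120, 120]
events = [1, 0, 0, 1, 0, 0, 0, 0, 0, 0]
curve = kaplan_meier(times, events)
print(curve)  # steps at the two failure times: S(24) = 0.9, S(60) = 0.9 * 6/7
```

Because censored implants leave the at-risk pool before later failure times, each subsequent step drop is taken from a smaller denominator, which is why long-term estimates such as the 84% 10-year survival carry wider uncertainty.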