16 results for Artefact rejection
Abstract:
This study tested a developmental cascade model of peer rejection, social information processing (SIP), and aggression using data from 585 children assessed at 12 time points from kindergarten through Grade 3. Peer rejection had direct effects on subsequent SIP problems and aggression. SIP had direct effects on subsequent peer rejection and aggression. Aggression had direct effects on subsequent peer rejection. Each construct also had indirect effects on each of the other constructs. These findings advance the literature beyond a simple mediation approach by demonstrating how each construct effects changes in the others in a snowballing cycle over time. The progressions of SIP problems and aggression cascaded through lower peer liking, and both better SIP skills and lower aggression promoted growth in social preference. Findings are discussed in terms of the dynamic, developmental relations among social environments, cognitions, and behavioral adjustment.
Abstract:
The relation between social rejection and growth in antisocial behavior was investigated. In Study 1, 259 boys and girls (34% African American) were followed from Grades 1 to 3 (ages 6-8 years) to Grades 5 to 7 (ages 10-12 years). Early peer rejection predicted growth in aggression. In Study 2, 585 boys and girls (16% African American) were followed from kindergarten to Grade 3 (ages 5-8 years), and findings were replicated. Furthermore, early aggression moderated the effect of rejection, such that rejection exacerbated antisocial development only among children initially disposed toward aggression. In Study 3, social information-processing patterns measured in Study 1 were found to mediate partially the effect of early rejection on later aggression. In Study 4, processing patterns measured in Study 2 replicated the mediation effect. Findings are integrated into a recursive model of antisocial development.
Abstract:
Even though the etiology of chronic rejection (CR) is multifactorial, donor-specific antibody (DSA) is considered to have a causal effect on CR development. Currently, the antibody-mediated mechanisms of CR are poorly understood due to a lack of proper animal models and tools. In a clinical setting, we previously demonstrated that induction therapy by lymphocyte depletion, using alemtuzumab (anti-human CD52), is associated with an increased incidence of serum alloantibody, C4d deposition, and antibody-mediated rejection in human patients. In this study, the effects of T cell depletion on the development of antibody-mediated rejection were examined using human CD52 transgenic (CD52Tg) mice treated with alemtuzumab. Fully mismatched cardiac allografts transplanted into alemtuzumab-treated CD52Tg mice showed no acute rejection, while untreated recipients acutely rejected their grafts. However, approximately half of the long-term recipients showed an increased degree of vasculopathy, fibrosis, and perivascular C3d deposition at posttransplant day 100. The development of CR correlated with DSA and C3d deposition in the graft. Novel tracking tools for monitoring donor-specific B cells showed that alloreactive B cells increased in accordance with DSA detection. This animal model could provide a means of understanding the mechanisms of chronic rejection and of testing therapeutic approaches to prevent it.
Abstract:
Belying the spectacular success of solid organ transplantation and improvements in immunosuppressive therapy is the reality that long-term graft survival rates remain relatively unchanged, in large part due to chronic and insidious alloantibody-mediated graft injury. Half of heart transplant recipients develop chronic rejection within 10 years - a daunting statistic, particularly for young patients expecting to achieve longevity by enduring the rigors of a transplant. The current immunosuppressive pharmacopeia is relatively ineffective in preventing late alloantibody-associated chronic rejection. In this issue of the JCI, Kelishadi et al. report that preemptive deletion of B cells prior to heart transplantation in cynomolgus monkeys, in addition to conventional posttransplant immunosuppressive therapy with cyclosporine, markedly attenuated not only acute graft rejection but also alloantibody elaboration and chronic graft rejection. The success of this preemptive strike implies a central role for B cells in graft rejection, and this approach may help to delay or prevent chronic rejection after solid organ transplantation.
Abstract:
De novo donor-specific antibody (DSA) after organ transplantation promotes antibody-mediated rejection (AMR) and causes late graft loss. Previously, we demonstrated that depletion using anti-CD3 immunotoxin combined with tacrolimus and alefacept (AMR regimen) reliably induced early DSA production with AMR in a nonhuman primate kidney transplant model. Five animals were assigned as positive AMR controls, four received additional belatacept, and four received additional anti-CD40 mAb (2C10R4). Notably, production of early de novo DSA was completely attenuated with additional belatacept or 2C10R4 treatment. In accordance with this, while positive controls experienced a decrease in peripheral IgM(+) B cells, the belatacept- and 2C10R4-added groups maintained a predominant population of IgM(+) B cells, potentially indicating decreased isotype switching. Central memory T cells (CD4(+) CD28(+) CD95(+)) as well as PD-1(hi) CD4(+) T cells were decreased in both the belatacept-added and 2C10R4-added groups. In situ analysis of germinal center (GC) reactions in lymph nodes further revealed a reduction of B cell clonal expansion, GC-follicular helper T (Tfh) cells, and IL-21 production inside GCs with additional belatacept or 2C10R4 treatment. Here we provide evidence that belatacept and 2C10R4 selectively suppress the humoral response via regulation of Tfh cells and prevent AMR in this nonhuman primate model.
Abstract:
Purpose: Computed Tomography (CT) is one of the standard diagnostic imaging modalities for the evaluation of a patient’s medical condition. In comparison to other imaging modalities such as Magnetic Resonance Imaging (MRI), CT is a fast-acquisition imaging device with higher spatial resolution and a higher contrast-to-noise ratio (CNR) for bony structures. CT images are presented through a gray scale of independent values in Hounsfield units (HU), where higher HU values represent higher density. High-density materials, such as metal, tend to erroneously increase the HU values around them due to reconstruction software limitations. This problem of increased HU values due to the presence of metal is referred to as metal artefacts. Hip prostheses, dental fillings, aneurysm clips, and spinal clips are a few examples of metal objects that are of clinical relevance. These implants create artefacts such as beam hardening and photon starvation that distort CT images and degrade image quality. This is of great significance because the distortions may cause improper evaluation of images and inaccurate dose calculation in the treatment planning system. Different algorithms are being developed to reduce these artefacts for better image quality for both diagnostic and therapeutic purposes. However, very limited information is available about the effect of artefact correction on dose calculation accuracy. This study evaluates the dosimetric effect of metal artefact reduction algorithms on CT images with severe artefacts, using the Gemstone Spectral Imaging (GSI)-based MAR algorithm, the projection-based Metal Artefact Reduction (MAR) algorithm, and the Dual-Energy method.
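As an aside to the HU scale described above, the mapping from attenuation to Hounsfield units can be sketched in a few lines. This is a minimal illustration assuming the standard definition (water maps to 0 HU, air to -1000 HU); the attenuation coefficient value is illustrative and not taken from the study:

```python
def hounsfield_units(mu, mu_water=0.19, mu_air=0.0):
    """Convert a linear attenuation coefficient mu (cm^-1) to Hounsfield units.

    Standard CT scale: HU = 1000 * (mu - mu_water) / (mu_water - mu_air),
    so water maps to 0 HU and air to -1000 HU. Dense materials such as
    metal map far above soft tissue, which is why metal implants saturate
    the scale and seed beam-hardening artefacts in reconstruction.
    """
    return 1000.0 * (mu - mu_water) / (mu_water - mu_air)

# mu_water = 0.19 cm^-1 is an illustrative value for diagnostic energies.
print(hounsfield_units(0.19))  # water -> 0.0
print(hounsfield_units(0.0))   # air   -> -1000.0
```

Materials with attenuation above water's produce positive HU values, which is the sense in which "high HU-valued materials represent higher density" in the abstract.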
Materials and Methods: The Gemstone Spectral Imaging (GSI)-based and SMART Metal Artefact Reduction (MAR) algorithms are metal artefact reduction protocols embedded in two different CT scanner models by General Electric (GE), and the Dual-Energy Imaging Method was developed at Duke University. All three approaches were applied in this research for dosimetric evaluation on CT images with severe metal artefacts. The first part of the research used a water phantom with four iodine syringes. Two sets of plans, multi-arc plans and single-arc plans, using the Volumetric Modulated Arc therapy (VMAT) technique were designed to avoid or minimize influences from high-density objects. The second part of the research used projection-based MAR Algorithm and the Dual-Energy Method. Calculated Doses (Mean, Minimum, and Maximum Doses) to the planning treatment volume (PTV) were compared and homogeneity index (HI) calculated.
Results: (1) Without the GSI-based MAR application, a percent error between the mean dose and the absolute dose ranging from 3.4-5.7% per fraction was observed. In contrast, the error decreased to a range of 0.09-2.3% per fraction with the GSI-based MAR algorithm, a percent difference of 1.7-4.2% per fraction between plans with and without the algorithm. (2) A difference of 0.1-3.2% was observed for the maximum dose values, 1.5-10.4% for the minimum doses, and 1.4-1.7% for the mean doses. Homogeneity indices (HI) ranging from 0.065-0.068 for the Dual-Energy method and 0.063-0.141 with the projection-based MAR algorithm were also calculated.
Conclusion: (1) Without the GSI-based MAR algorithm, the percent error may reach 5.7%, which undermines the goal of radiation therapy to deliver a precise treatment. The GSI-based MAR algorithm is therefore desirable for its better dose calculation accuracy. (2) Based on direct numerical observation, there was no apparent deviation between the mean doses of the different techniques, but deviation was evident in the maximum and minimum doses. The HI for the Dual-Energy method nearly achieved the desirable null value. In conclusion, the Dual-Energy method gave better dose calculation accuracy to the planning treatment volume (PTV) for images with metal artefacts than either the GE MAR algorithm or no correction.
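The two accuracy metrics reported in this abstract reduce to simple arithmetic. The sketch below assumes the common homogeneity-index definition HI = (Dmax - Dmin) / Dmean (the abstract does not state which HI formula was applied), and the dose numbers are illustrative, not data from the study:

```python
def percent_error(calculated_dose, measured_dose):
    """Percent error of a calculated dose relative to the measured (absolute) dose."""
    return abs(calculated_dose - measured_dose) / measured_dose * 100.0

def homogeneity_index(d_max, d_min, d_mean):
    """Homogeneity index of a PTV dose distribution.

    Assumes HI = (Dmax - Dmin) / Dmean; an HI of 0 means a perfectly
    homogeneous dose, which is why values near zero are desirable.
    """
    return (d_max - d_min) / d_mean

# Illustrative doses in Gy (not values from the study):
print(percent_error(2.11, 2.00))            # ~5.5%, on the order of the worst no-MAR error
print(homogeneity_index(2.07, 1.94, 2.00))  # ~0.065, within the reported Dual-Energy range
```

Under this definition, the near-zero HI values quoted for the Dual-Energy method correspond to a PTV dose spread of only a few percent of the mean dose.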
Abstract:
Attempts were made to measure the fraction of elemental carbon (EC) in ultrafine aerosol by modifying an Ambient Carbonaceous Particulate Monitor (ACPM, R&P 5400). The main modification consisted of placing a quartz filter in one of the sampling lines of this dual-channel instrument. With the filter, all aerosol, and the EC contained in it, is collected, while in the other line of the instrument the standard impactor samples only particles larger than 0.14 μm. The fraction of EC in particles smaller than 0.14 μm is derived from the difference in concentration measured via the two sampling lines. Measurements with the modified instrument were made at a suburban site in Amsterdam, The Netherlands. An apparent adsorption artefact, which could not be eliminated by the use of denuders, precluded meaningful evaluation of the data for total carbon. Blanks in the measurements of EC were negligible, and the EC data were hence evaluated further. We found that the concentration of EC obtained via the channel with the impactor was systematically lower than that in the filter line. The average ratio of the concentrations was close to 0.6, indicating that approximately 40% of the EC was in particles smaller than 0.14 μm. Alternative explanations for the difference between the two sampling lines could be excluded: a difference in the extent of oxidation would have to be a function of loading, which was not observed, and loss of material from the impactor by particle bounce is very unlikely in The Netherlands owing to co-deposition of abundant deliquesced, and thus viscous, ammonium compounds. The conclusion is that a further modification to assess the true fraction of ultrafine EC, by installing an impactor with a cut-off diameter of 0.1 μm, would be worth pursuing.
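The two-channel subtraction described above reduces to simple arithmetic: the filter line collects EC from all particle sizes, the impactor line only from particles above the 0.14 μm cut, so the ultrafine fraction is the relative difference between the channels. A minimal sketch (the concentration values are illustrative, not measurements from the study):

```python
def ultrafine_ec_fraction(ec_impactor, ec_filter):
    """Fraction of elemental carbon (EC) below the impactor cut-off diameter.

    ec_filter   : EC concentration from the filter line (all particle sizes)
    ec_impactor : EC concentration from the impactor line (> 0.14 um only)
    The sub-cut-off fraction is the relative difference between the channels.
    """
    return 1.0 - ec_impactor / ec_filter

# A channel concentration ratio of 0.6 reproduces the ~40% ultrafine EC
# reported in the abstract (units cancel, e.g. ug/m^3).
print(ultrafine_ec_fraction(0.6, 1.0))  # -> 0.4
```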
Abstract:
Depletional strategies directed toward achieving tolerance induction in organ transplantation have been associated with an increased incidence and risk of antibody-mediated rejection (AMR) and graft injury. Our clinical data suggest a correlation of increased serum B cell activating factor/survival factor (BAFF) with increased risk of antibody-mediated rejection in alemtuzumab-treated patients. In the present study, we tested the ability of BAFF blockade (TACI-Ig) in a nonhuman primate AMR model to prevent alloantibody production and prolong allograft survival. Three animals received the AMR-inducing regimen (CD3-IT/alefacept/tacrolimus) with TACI-Ig (atacicept), compared to five control animals treated with the AMR-inducing regimen only. TACI-Ig treatment led to decreased levels of DSA in treated animals at 2 and 4 weeks posttransplantation (p < 0.05). In addition, peripheral B cell numbers were significantly lower at 6 weeks posttransplantation. However, it provided only a marginal increase in graft survival (59 ± 22 vs. 102 ± 47 days; p = 0.11). Histological analysis revealed a substantial reduction in findings typically associated with humoral rejection under atacicept treatment. More T cell-mediated rejection findings, with increased graft T cell infiltration, were observed with atacicept treatment, likely secondary to the graft prolongation. We show that BAFF/APRIL blockade using concomitant TACI-Ig treatment reduced the humoral component of rejection in our depletion-induced preclinical AMR model.
Abstract:
The role of antibodies in chronic injury to organ transplants has been suggested for many years, but recently emphasized by new data. We have observed that when immunosuppressive potency decreases either by intentional weaning of maintenance agents or due to homeostatic repopulation after immune cell depletion, the threshold of B cell activation may be lowered. In human transplant recipients the result may be donor-specific antibody, C4d+ injury, and chronic rejection. This scenario has precise parallels in a rhesus monkey renal allograft model in which T cells are depleted with CD3 immunotoxin, or in a CD52-T cell transgenic mouse model using alemtuzumab to deplete T cells. Such animal models may be useful for the testing of therapeutic strategies to prevent DSA. We agree with others who suggest that weaning of immunosuppression may place transplant recipients at risk of chronic antibody-mediated rejection, and that strategies to prevent this scenario are needed if we are to improve long-term graft and patient outcomes in transplantation. We believe that animal models will play a crucial role in defining the pathophysiology of antibody-mediated rejection and in developing effective therapies to prevent graft injury. Two such animal models are described herein.
Abstract:
Chronic allograft rejection is a major impediment to long-term transplant success. Humoral immune responses to alloantigens are a growing clinical problem in transplantation, with mounting evidence associating alloantibodies with the development of chronic rejection. Nearly a third of transplant recipients develop de novo antibodies, which no established therapies effectively prevent or eliminate, highlighting the need for a nonhuman primate model of antibody-mediated rejection. In this report, we demonstrate that depletion using anti-CD3 immunotoxin (IT) combined with maintenance immunosuppression that included tacrolimus, with or without alefacept, reliably prolonged renal allograft survival in rhesus monkeys. In these animals, a preferential skewing toward CD4 repopulation and proliferation was observed, particularly with the addition of alefacept. Furthermore, alefacept-treated animals demonstrated increased alloantibody production (100%) and morphologic features of antibody-mediated injury. In vitro, alefacept was found to enhance CD4 effector memory T cell proliferation. In conclusion, alefacept administration after depletion and with tacrolimus promotes a CD4+ memory T cell and alloantibody response, with morphologic changes reflecting antibody-mediated allograft injury. Early and consistent de novo alloantibody production with associated histological changes makes this nonhuman primate model an attractive candidate for evaluating targeted therapeutics.
Abstract:
Grafts can be rejected even when matched for MHC because of differences in minor histocompatibility Ags (mH-Ags). H4- and H60-derived epitopes are known as immunodominant mH-Ags in H2(b)-compatible BALB.B to C57BL/6 transplantation settings. Although multiple explanations have been offered for the immunodominance of Ags, the role of vascularization of the graft has yet to be determined. In this study, we used heart (vascularized) and skin (nonvascularized) transplantations to determine the role of primary vascularization of the graft. A higher IFN-γ response toward the H60 peptide occurred in heart recipients, whereas a higher IFN-γ response was generated against the H4 peptide in skin transplant recipients. Peptide-loaded tetramer staining revealed a distinct antigenic hierarchy between heart and skin transplantation: H60-specific CD8(+) T cells were the most abundant after heart transplantation, whereas H4-specific CD8(+) T cells were more abundant after skin grafting. Neither the tissue-specific distribution of mH-Ags nor the draining lymph node-derived dendritic cells correlated with the observed immunodominance. Interestingly, non-primarily vascularized cardiac allografts mimicked skin grafts in the observed immunodominance, and H60 immunodominance was observed in primarily vascularized skin grafts. However, T cell depletion from the BALB.B donor prior to cardiac transplantation induced H4 immunodominance in the vascularized cardiac allograft. Collectively, our data suggest that immediate transmigration of donor T cells via primary vascularization is responsible for the immunodominance of the H60 mH-Ag in organ and tissue transplantation.
Abstract:
BACKGROUND: Interleukin (IL)-15 is a chemotactic factor for T cells. It induces proliferation and promotes survival of activated T cells. IL-15 receptor blockade in mouse cardiac and islet allotransplant models has led to long-term engraftment and a regulatory T-cell environment. This study investigated the efficacy of IL-15 receptor blockade using Mut-IL-15/Fc in an outbred non-human primate model of renal allotransplantation. METHODS: Male cynomolgus macaque donor-recipient pairs were selected based on ABO typing, major histocompatibility complex class I typing, and carboxy-fluorescein diacetate succinimidyl ester-based mixed lymphocyte responses. Once animals were assigned to one of six treatment groups, they underwent renal transplantation and bilateral native nephrectomy. Serum creatinine level was monitored twice weekly and as indicated, and protocol biopsies were performed. Rejection was defined as an increase in serum creatinine to 1.5 mg/dL or higher and was confirmed histologically. Complete blood counts and flow cytometric analyses were performed periodically posttransplant; pharmacokinetic parameters of Mut-IL-15/Fc were assessed. RESULTS: Compared with control animals, Mut-IL-15/Fc-treated animals did not demonstrate increased graft survival despite adequate serum levels of Mut-IL-15/Fc. Flow cytometric analysis of white blood cell subgroups demonstrated a decrease in CD8 T-cell and natural killer cell numbers, although this did not reach statistical significance. Interestingly, two animals receiving Mut-IL-15/Fc developed infectious complications, but no infection was seen in control animals. Renal pathology varied widely. CONCLUSIONS: Peritransplant IL-15 receptor blockade does not prolong allograft survival in non-human primate renal transplantation; however, it reduces the number of CD8 T cells and natural killer cells in the peripheral blood.
Abstract:
BACKGROUND: Blocking leukocyte function-associated antigen (LFA)-1 in organ transplant recipients prolongs allograft survival. However, the precise mechanisms underlying the therapeutic potential of LFA-1 blockade in preventing chronic rejection are not fully elucidated. Cardiac allograft vasculopathy (CAV) is the preeminent cause of late cardiac allograft failure, characterized histologically by concentric intimal hyperplasia. METHODS: Anti-LFA-1 monoclonal antibody was used in a multiple minor antigen-mismatched, BALB.B (H-2b) to C57BL/6 (H-2b), cardiac allograft model. Endogenous donor-specific CD8 T cells were tracked using major histocompatibility complex multimers against the immunodominant H4, H7, H13, H28, and H60 minor Ags. RESULTS: LFA-1 blockade prevented acute rejection and preserved palpable beating quality with reduced CD8 T-cell graft infiltration. Interestingly, the reduced CD8 T-cell infiltration was secondary to reduced T-cell expansion rather than reduced trafficking. LFA-1 blockade significantly suppressed the clonal expansion of minor histocompatibility antigen-specific CD8 T cells during the expansion and contraction phases. CAV development was evaluated with morphometric analysis at postoperative day 100. LFA-1 blockade profoundly attenuated neointimal hyperplasia (61.6 vs 23.8%; P < 0.05), CAV-affected vessel number (55.3 vs 15.9%; P < 0.05), and myocardial fibrosis (grade 3.29 vs 1.8; P < 0.05). Finally, short-term LFA-1 blockade promoted long-term donor-specific regulation, which resulted in attenuated transplant arteriosclerosis. CONCLUSIONS: Taken together, LFA-1 blockade inhibits initial endogenous alloreactive T-cell expansion and induces more regulation. Such a mechanism supports a pulse tolerance-induction strategy with anti-LFA-1 rather than long-term treatment.
Abstract:
BACKGROUND: Some of the 600,000 patients with solid organ allotransplants need reconstruction with a composite tissue allotransplant, such as the hand, abdominal wall, or face. The aim of this study was to develop a rat model for assessing the effects of a secondary composite tissue allotransplant on a primary heart allotransplant. METHODS: Hearts of Wistar Kyoto rats were harvested and transplanted heterotopically to the neck of recipient Fisher 344 rats. The anastomoses were performed between the donor brachiocephalic artery and the recipient left common carotid artery, and between the donor pulmonary artery and the recipient external jugular vein. Recipients received cyclosporine A for 10 days only. Heart rate was assessed noninvasively. The sequential composite tissue allotransplant consisted of a 3 x 3-cm abdominal musculocutaneous flap harvested from Lewis rats and transplanted to the abdomen of the heart allotransplant recipients. The abdominal flap vessels were connected to the femoral vessels. No further immunosuppression was administered following the composite tissue allotransplant. Ten days after composite tissue allotransplantation, rejection of the heart and abdominal flap was assessed histologically. RESULTS: The rat survival rate of the two-stage transplant surgery was 80 percent. The transplanted heart rate decreased from 150 +/- 22 beats per minute immediately after transplant to 83 +/- 12 beats per minute on day 20 (10 days after stopping immunosuppression). CONCLUSIONS: This sequential allotransplant model is technically demanding. It will facilitate investigation of the effects of a secondary composite tissue allotransplant following primary solid organ transplantation and could be useful in developing future immunotherapeutic strategies.