932 results for ALLOGRAFT TOLERANCE
Abstract:
This master's thesis introduces the fuzzy tolerance/equivalence relation and its application in cluster analysis. It presents the construction of fuzzy equivalence relations using increasing generators, investigating the role of increasing generators in the creation of intersection, union, and complement operators. The objective is to develop different varieties of fuzzy tolerance/equivalence relations from different varieties of increasing generators. Finally, we perform a comparative study of these fuzzy tolerance/equivalence relations as applied to a clustering method.
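The pipeline the abstract describes, starting from a fuzzy tolerance relation and obtaining a fuzzy equivalence relation usable for clustering, can be sketched in code. The sketch below is a minimal Python illustration using the standard max-min composition rather than the thesis's generator-based operators; the function names and the example matrix are hypothetical, not taken from the thesis.

```python
import numpy as np

def is_tolerance(R, atol=1e-9):
    """A fuzzy tolerance relation is reflexive (diagonal of 1s) and symmetric."""
    R = np.asarray(R, dtype=float)
    return np.allclose(np.diag(R), 1.0, atol=atol) and np.allclose(R, R.T, atol=atol)

def transitive_closure(R):
    """Max-min transitive closure of a fuzzy tolerance relation.

    Repeatedly takes the max-min composition
        (R o R)[i, j] = max_k min(R[i, k], R[k, j])
    and joins it with R until nothing changes; the fixed point is a
    fuzzy equivalence relation, whose alpha-cuts give crisp clusters."""
    R = np.asarray(R, dtype=float)
    while True:
        # minimum over broadcast [i, k, j] pairs, then max over k (axis 1)
        comp = np.max(np.minimum(R[:, :, None], R[None, :, :]), axis=1)
        R_next = np.maximum(R, comp)
        if np.array_equal(R_next, R):
            return R_next
        R = R_next

# Hypothetical similarity data for three objects
R = np.array([[1.0, 0.8, 0.0],
              [0.8, 1.0, 0.4],
              [0.0, 0.4, 1.0]])
E = transitive_closure(R)
```

Thresholding `E` at an alpha-cut then partitions the objects into clusters; replacing min/max here with intersection and union operators built from different increasing generators would yield the other varieties the thesis compares.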
Abstract:
The gut mucosa is a major site of contact with antigens from food and microbiota. Usually, these daily contacts with natural antigens do not result in inflammatory reactions; instead they result in a state of systemic hyporesponsiveness named oral tolerance. Inflammatory bowel diseases (IBD) are associated with the breakdown of the immunoregulatory mechanisms that maintain oral tolerance. Several animal models of IBD/colitis are available. In mice, these include targeted disruptions of the genes encoding cytokines, T cell subsets or signaling proteins. Colitis can also be induced by intrarectal administration of chemical substances such as 2,4,6-trinitrobenzene sulfonic acid in 50% ethanol. We report here a novel model of colitis induced by intrarectal administration of 50% ethanol alone. Ethanol-treated mice develop an inflammatory reaction in the colon characterized by an intense inflammatory infiltrate in the mucosa and submucosa of the large intestine. They also present up-regulation of both interferon gamma (IFN-gamma) and interleukin-4 (IL-4) production by cecal lymph node and splenic cells. These results suggest a mixed type of inflammation as the substrate of the colitis. Interestingly, cells from mesenteric lymph nodes of ethanol-treated mice present an increase in IFN-gamma production and a decrease in IL-4 production indicating that the cytokine balance is altered throughout the gut mucosa. Moreover, induction of oral tolerance to ovalbumin is abolished in these animals, strongly suggesting that ethanol-induced colitis interferes with immunoregulatory mechanisms in the intestinal mucosa. This novel model of colitis resembles human IBD. It is easy to reproduce and may help us to understand the mechanisms involved in IBD pathogenesis.
Abstract:
The effect of different contextual stimuli on different ethanol-induced internal states was investigated during the time course of both the hypothermic effect of the drug and of drug tolerance. Minimitters were surgically implanted in 16 Wistar rats to assess changes in their body temperature under the effect of ethanol. Rat groups were submitted to ethanol or saline trials every other day. The animals were divided into two groups, one receiving a constant dose (CD) of ethanol injected intraperitoneally, and the other receiving increasing doses (ID) during the 10 training sessions. During the ethanol training sessions, conditioned stimuli A (tone) and B (buzzer) were presented at "state +" (35 min after drug injection) and "state -" (170 min after drug injection), respectively. Conditioned stimuli C (bip) and D (white noise) were presented at moments equivalent to stimuli A and B, respectively, but during the saline training sessions. All stimuli lasted 15 min. The CD group, but not the ID group, developed tolerance to the hypothermic effect of ethanol. Stimulus A (associated with drug "state +") induced hyperthermia with saline injection in the ID group. Stimulus B (associated with drug "state -") reduced ethanol tolerance in the CD group and modulated the hypothermic effect of the drug in the ID group. These results indicate that contextual stimuli acquire modulatory conditioned properties that are associated with the time course of both the action of the drug and the development of drug tolerance.
Abstract:
In order to evaluate the performance of a 1-h 75-g oral glucose tolerance test (OGTT) for the diagnosis of gestational diabetes mellitus (GDM), a cohort of 4998 women, 20 years or older, without previous diabetes being treated in prenatal care clinics in Brazil answered a questionnaire and performed a 75-g OGTT including fasting, 1-h and 2-h glucose measurements between their 24th and 28th gestational weeks. Pregnancy outcomes were transcribed from medical registries. GDM was defined according to WHO criteria (fasting: ≥126 mg/dL; 2-h value: ≥140 mg/dL) and macrosomia as a birth weight equal to or higher than 4000 g. Areas under the receiver operator characteristic curve (AUC) were compared and diagnostic properties of various cut-off points were evaluated. The AUCs for the prediction of macrosomia were 0.606 (0.572-0.637) for the 1-h and 0.589 (0.557-0.622) for the 2-h plasma glucose test. Similar predictability was demonstrable regarding combined adverse outcomes: 0.582 (0.559-0.604) for the 1-h test and 0.572 (0.549-0.595) for the 2-h test. When the 1-h glucose test was evaluated against a diagnosis of GDM defined by the 2-h glucose test, the AUC was 0.903 (0.886-0.919). The cut-off point that maximized sensitivity (83%) and specificity (83%) was 141 mg/dL, identifying 21% of the women as positive. A cut-off point of 160 mg/dL, with lower sensitivity (62%), had higher specificity (94%), labeling 8.6% as positive. Detection of GDM can be done with a 1-h 75-g OGTT: the value of 160 mg/dL has the same diagnostic performance as the conventional 2-h value (140 mg/dL). The simplification of the test may improve coverage and timing of the diagnosis of GDM.
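The diagnostic properties reported for each cut-off (sensitivity, specificity, proportion labeled positive) follow from a standard confusion-matrix computation over a "positive if glucose ≥ cut-off" rule. A minimal sketch of that arithmetic is shown below; the function name and the toy glucose values are hypothetical, not data from the study.

```python
def diagnostic_properties(values, labels, cutoff):
    """Sensitivity, specificity, and positive rate of the rule
    'test positive if value >= cutoff' against a binary reference
    diagnosis (1 = GDM, 0 = no GDM)."""
    tp = sum(1 for v, y in zip(values, labels) if v >= cutoff and y == 1)
    fn = sum(1 for v, y in zip(values, labels) if v < cutoff and y == 1)
    tn = sum(1 for v, y in zip(values, labels) if v < cutoff and y == 0)
    fp = sum(1 for v, y in zip(values, labels) if v >= cutoff and y == 0)
    sensitivity = tp / (tp + fn)        # fraction of true cases detected
    specificity = tn / (tn + fp)        # fraction of non-cases cleared
    positive_rate = (tp + fp) / len(values)  # fraction labeled positive
    return sensitivity, specificity, positive_rate

# Hypothetical 1-h glucose values (mg/dL) and reference diagnoses
glucose = [120, 150, 170, 130, 145]
gdm = [0, 1, 1, 0, 1]
sens, spec, pos_rate = diagnostic_properties(glucose, gdm, cutoff=141)
```

Sweeping the cut-off over all observed values and plotting sensitivity against 1 − specificity traces the ROC curve whose area the study reports.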
Abstract:
A major problem in renal transplantation is identifying a grading system that can predict long-term graft survival. The present study determined the extent to which the two existing grading systems (Banff 97 and chronic allograft damage index, CADI) correlate with each other and with graft loss. A total of 161 transplant patient biopsies with chronic allograft nephropathy (CAN) were studied. The samples were coded and evaluated blindly by two pathologists using the two grading systems. Logistic regression analyses were used to evaluate the best predictor index for renal allograft loss. Patients with higher Banff 97 and CADI scores had higher rates of graft loss. Moreover, these measures also correlated with worse renal function and higher proteinuria levels at the time of CAN diagnosis. Logistic regression analyses showed that the use of angiotensin-converting enzyme inhibitor (ACEI), hepatitis C virus (HCV), tubular atrophy, and the use of mycophenolate mofetil (MMF) were associated with graft loss in the CADI, while the use of ACEI, HCV, moderate interstitial fibrosis and tubular atrophy and the use of MMF were associated in the Banff 97 index. Although Banff 97 and CADI analyze different parameters in different renal compartments, only some isolated parameters correlated with graft loss. This suggests that we need to review the CAN grading systems in order to devise a system that includes all parameters able to predict long-term graft survival, including chronic glomerulopathy, glomerular sclerosis, vascular changes, and severity of chronic interstitial fibrosis and tubular atrophy.
Abstract:
We evaluated changes in glucose tolerance of 17 progressors and 62 non-progressors over 9 years to improve our understanding of the pathogenesis of type 2 diabetes mellitus. Changes in anthropometric measurements and responses to an oral glucose tolerance test (OGTT) were analyzed. We identified 14 pairs of individuals, one from each group, who were initially normal glucose tolerant and were matched for gender, age, weight, and girth. We compared initial plasma glucose and insulin curves (from the OGTT), insulin secretion (first and second phases) and insulin sensitivity indices (from the hyperglycemic clamp assay) for both groups. In the normal glucose tolerant phase, progressors presented: 1) a higher OGTT blood glucose response with hyperglycemia in the second hour and a similar insulin response vs non-progressors; 2) a reduced first-phase insulin secretion (2.0 ± 0.3 vs 2.3 ± 0.3 pmol/L; P < 0.02) with a similar insulin sensitivity index and a lower disposition index (3.9 ± 0.2 vs 4.1 ± 0.2 µmol·kg-1·min-1; P < 0.05) vs non-progressors. After 9 years, both groups presented similar increases in weight and fasting blood glucose levels, and progressors had an increased glycemic response at 120 min (P < 0.05) and a reduced early insulin response to the OGTT (progressors, 1st: 2.10 ± 0.34 vs 2nd: 1.87 ± 0.25 pmol/mmol; non-progressors, 1st: 2.15 ± 0.28 vs 2nd: 2.03 ± 0.39 pmol/mmol; P < 0.05). These data suggest that β-cell dysfunction might be a risk factor for type 2 diabetes mellitus.
Abstract:
Experimental data and a few non-randomized clinical studies have shown that inhibition of the renin-angiotensin system with angiotensin-converting enzyme inhibitors (ACEI), associated or not with the use of mycophenolate mofetil (MMF), could delay or even halt the progression of chronic allograft nephropathy (CAN). In this retrospective historical study, we investigated whether ACEI therapy, associated or not with the use of MMF, has the same effect in humans as in experimental studies, and what factors are associated with a clinical response. A total of 160 transplant patients with biopsy-proven CAN were enrolled. Eighty-one of them were on ACEI therapy (G1) and 80 on ACEI-free therapy (G2). Patients were further stratified by the use of MMF. G1 patients showed a marked decrease in proteinuria and stabilized serum creatinine over time. Five-year graft survival after CAN diagnosis was higher in G1 (86.9 vs 67.7%; P < 0.05). In patients on ACEI-free therapy, the use of MMF was associated with better graft survival. ACEI therapy protected 79% of the patients against graft loss (OR = 0.079, 95%CI = 0.015-0.426; P = 0.003). ACEI with MMF, or MMF alone, after CAN diagnosis conferred protection against graft loss. This finding correlates well with experimental studies in which ACEI and MMF interrupt the progression of chronic allograft dysfunction and injury. The use of ACEI alone or in combination with MMF significantly reduced proteinuria and stabilized serum creatinine, consequently improving renal allograft survival.
Abstract:
Oral tolerance can be induced in some mouse strains by gavage or spontaneous ingestion of dietary antigens. In the present study, we determined the influence of aging and oral tolerance on the secretion of co-stimulatory molecules by dendritic cells (DC), and on the ability of DC to induce proliferation and cytokine secretion by naive T cells from BALB/c and OVA transgenic (DO11.10) mice. We observed that oral tolerance could be induced in BALB/c mice (N = 5 in each group) of all ages (8, 20, 40, 60, and 80 weeks old), although a decline in specific antibody levels was observed in the sera of both tolerized and immunized mice with advancing age (40 to 80 weeks old). DC obtained from young, adult and middle-aged (8, 20, and 40 weeks old) tolerized mice were less efficient (65, 17 and 20%, respectively) than DC from immunized mice (P < 0.05) in inducing antigen-specific proliferation of naive T cells from both BALB/c and DO11.10 young mice, or in stimulating IFN-γ, IL-4 and IL-10 production. However, TGF-β levels were significantly elevated in co-cultures carried out with DC from tolerant mice (P < 0.05). DC from both immunized and tolerized old and very old (60 and 80 weeks old) mice were equally ineffective in inducing T cell proliferation and cytokine production (P < 0.05). A marked reduction in CD86+ marker expression was observed in DC isolated from both old and tolerized mice (75 and 50%, respectively). The results indicate that the aging process does not interfere with the establishment of oral tolerance in BALB/c mice, but reduces DC functions, probably due to the decline of the expression of the CD86 surface marker.
Abstract:
Sepsis is a systemic inflammatory response that can lead to tissue damage and death. In order to increase our understanding of sepsis, experimental models are needed that produce relevant immune and inflammatory responses during a septic event. We describe a lipopolysaccharide tolerance mouse model to characterize the cellular and molecular alterations of immune cells during sepsis. The model presents a typical lipopolysaccharide tolerance pattern in which tolerance is related to decreased production and secretion of cytokines after a subsequent exposure to a lethal dose of lipopolysaccharide. The initial lipopolysaccharide exposure also altered the expression patterns of cytokines and was followed by 8-fold and 1.5-fold increases in the T helper 1 and T helper 2 cell subpopulations, respectively. Behavioral data indicate a decrease in spontaneous activity and an increase in body temperature following exposure to lipopolysaccharide. In contrast, tolerant animals maintained production of reactive oxygen species and nitric oxide when terminally challenged by cecal ligation and puncture (CLP). A survival study after CLP showed protection in tolerant compared to naive animals. Spleen mass increased in tolerant animals, accompanied by increases in B lymphocytes and the Th1 cell subpopulation. An increase in the number of stem cells was found in spleen and bone marrow. We also showed that administration of spleen or bone marrow cells from tolerant to naive animals transfers the acquired resistance status. In conclusion, lipopolysaccharide tolerance is a natural reprogramming of the immune system that increases the number of immune cells, particularly T helper 1 cells, and does not reduce oxidative stress.
Abstract:
Interstitial fibrosis and tubular atrophy (IF/TA) are the most common cause of renal graft failure. Chronic transplant glomerulopathy (CTG) is present in approximately 1.5-3.0% of all renal grafts. We retrospectively studied the contribution of CTG and recurrent post-transplant glomerulopathies (RGN) to graft loss. We analyzed 123 patients with chronic renal allograft dysfunction and divided them into three groups: CTG (N = 37), RGN (N = 21), and IF/TA (N = 65). Demographic data were analyzed and the variables related to graft function were identified by statistical methods. The CTG group had significantly lower allograft survival than the IF/TA group. In a multivariate analysis, the factors associated with allograft outcomes were: use of angiotensin-converting enzyme inhibitor (ACEI; hazard ratio (HR) = 0.12, P = 0.001), mycophenolate mofetil (MMF; HR = 0.17, P = 0.026), hepatitis C virus (HR = 7.29, P = 0.003), delayed graft function (HR = 5.32, P = 0.016), serum creatinine ≥1.5 mg/dL at the 1st year post-transplant (HR = 0.20, P = 0.011), and proteinuria ≥0.5 g/24 h at the 1st year post-transplant (HR = 0.14, P = 0.004). The presence of glomerular damage is a risk factor for allograft loss (HR = 4.55, P = 0.015). The presence of some degree of chronic glomerular damage in addition to the diagnosis of IF/TA was the most important risk factor associated with allograft loss, since it could indicate chronic active antibody-mediated rejection. ACEI and MMF were associated with better outcomes, indicating that they might improve graft survival.
Abstract:
7-Nitroindazole (7-NI) inhibits neuronal nitric oxide synthase in vivo and reduces l-DOPA-induced dyskinesias in a rat model of parkinsonism. The aim of the present study was to determine if the anti-dyskinetic effect of 7-NI was subject to tolerance after repeated treatment and if this drug could interfere with the priming effect of l-DOPA. Adult male Wistar rats (200-250 g) with unilateral depletion of dopamine in the substantia nigra compacta were treated with l-DOPA (30 mg/kg) for 34 days. On the 1st day, 6 rats received ip saline and 6 received ip 7-NI (30 mg/kg) before l-DOPA. From the 2nd to the 26th day, all rats received l-DOPA daily and, from the 27th to the 34th day, they also received 7-NI before l-DOPA. Animals were evaluated before the drug and 1 h after l-DOPA using an abnormal involuntary movement scale and a stepping test. All rats had a similar initial motor deficit. 7-NI decreased abnormal involuntary movements induced by l-DOPA, and the effect was maintained throughout the experiment (median (interquartile interval) scores: day 26: 16.75 (15.88-17.00); day 28: 0.00 (0.00-9.63); day 29: 13.75 (2.25-15.50); day 30: 0.50 (0.00-6.25); day 31: 4.00 (0.00-7.13); day 34: 0.50 (0.00-14.63); Friedman followed by Wilcoxon test vs day 26, P < 0.05). The response to l-DOPA alone was not modified by the use of 7-NI before the first administration of the drug (l-DOPA vs time interaction, F(1,10) = 1.5, NS). The data suggest that tolerance to the anti-dyskinetic effects of a neuronal nitric oxide synthase inhibitor does not develop over a short-term period of repeated administration. These observations open a possible new therapeutic approach to motor complications of chronic l-DOPA therapy in patients with Parkinson's disease.
Abstract:
The Banff classification was introduced to achieve uniformity in the assessment of renal allograft biopsies. The primary aim of this study was to evaluate the impact of specimen adequacy on the Banff classification. All renal allograft biopsies obtained between July 2010 and June 2012 for suspicion of acute rejection were included. Pre-biopsy clinical data on suspected diagnosis and time from renal transplantation were provided to a nephropathologist who was blinded to the original pathological report. Second pathological readings were compared with the original to assess agreement stratified by specimen adequacy. Cohen's kappa test and Fisher's exact test were used for statistical analyses. Forty-nine specimens were reviewed. Among these specimens, 81.6% were classified as adequate, 6.12% as minimal, and 12.24% as unsatisfactory. The agreement analysis between the first and second readings revealed a kappa value of 0.97. Full agreement between readings was found in 75% of the adequate specimens, and in 66.7 and 50% of the minimal and unsatisfactory specimens, respectively. There was no agreement between readings in 5% of the adequate specimens and 16.7% of the unsatisfactory specimens. For the entire sample, full agreement was found in 71.4%, partial agreement in 20.4%, and no agreement in 8.2% of the specimens. Statistical analysis using Fisher's exact test yielded a P value above 0.25, showing that the results were not statistically significant, probably due to the small sample size. Specimen adequacy may be a determinant of diagnostic agreement in renal allograft specimen assessment. While additional studies including larger case numbers are required to further delineate the impact of specimen adequacy on the reliability of histopathological assessments, specimen quality must be considered during clinical decision making when dealing with biopsy reports based on minimal or unsatisfactory specimens.
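Cohen's kappa, used above to quantify inter-reader agreement, is computed from the two readers' agreement table as kappa = (po − pe)/(1 − pe), where po is the observed agreement and pe the agreement expected by chance from the marginal frequencies. A minimal sketch of that computation, with a hypothetical 2×2 table (the study's actual counts are not reproduced here):

```python
def cohens_kappa(confusion):
    """Cohen's kappa for two raters from a square agreement table
    (rows: reader 1's categories, columns: reader 2's categories)."""
    n = len(confusion)
    total = sum(sum(row) for row in confusion)
    # Observed agreement: proportion of cases on the diagonal
    po = sum(confusion[i][i] for i in range(n)) / total
    # Chance agreement from the row and column marginals
    row_marg = [sum(row) / total for row in confusion]
    col_marg = [sum(confusion[i][j] for i in range(n)) / total
                for j in range(n)]
    pe = sum(r * c for r, c in zip(row_marg, col_marg))
    return (po - pe) / (1 - pe)

# Hypothetical agree/disagree table for two pathology readings
table = [[20, 5],
         [10, 15]]
kappa = cohens_kappa(table)
```

Kappa of 1 means perfect agreement and 0 means agreement no better than chance, which is why the 0.97 reported above indicates near-perfect concordance between readings.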
Abstract:
The occurrence of green soybean seed due to forced maturation or premature plant death caused by drought or foliar and/or root diseases has been common in several Brazilian production areas. Seed lots containing green seed may have reduced germination and vigor potentials and may therefore be discarded by the grain industry. The objective of this experiment was to determine the maximum tolerated level of green seed in soybean seed lots, information of major importance for seed producers when deciding whether to sell these lots. Soybean seed of the cultivars CD 206, produced in Ubiratã, Paraná, and FMT Tucunaré, produced in Alto Garças, Mato Grosso, were used in the study. Green seed and yellow seed of both cultivars were mixed in the following proportions: 0%, 3%, 6%, 9%, 12%, 15%, 20%, 30%, 40%, 50%, 75% and 100%. Seed quality was evaluated by the germination, accelerated aging, tetrazolium and electrical conductivity tests. The contents of chlorophyll a, chlorophyll b and total chlorophyll in the seed were also determined. A complete randomized block design in a factorial scheme (two cultivars x 12 levels of green seed) was used. Seed quality was negatively affected, and chlorophyll contents increased, with the increase in the percentage of green seed. Seed germination, viability and vigor, measured by the accelerated aging test, were not reduced at levels of up to 3% green seed for either cultivar. Levels above 6% green seed significantly reduced seed quality. The quality of seed lots with 9% or more green seed was reduced to the point that their commercialization is not recommended.
Abstract:
The aim of this study was to assess the desiccation tolerance and DNA integrity of Eugenia pleurantha seeds dehydrated to different moisture contents (MCs). Seeds extracted from mature fruits were dried in silica gel and evaluated at every five percentage points of decrease from the initial MC (35.5%, fresh weight basis). The effects of dehydration on the seeds were verified through germination tests and DNA integrity assessment. Undried seeds achieved 87% germination, a value reduced to 36% after drying to 9.8% MC. When dried slightly further, to 7.4% MC, the seeds were no longer able to germinate, suggesting an intermediate behavior with respect to desiccation tolerance. DNA degradation was observed in seeds at 7.4% MC, which might have contributed to the loss of seed germination.