998 results for Renal complication
Abstract:
The effects of strenuous exercise before and during pregnancy on the renal function and morphological alterations of the progeny were determined in a study on female Wistar rats. This research was based on a previous study carried out in our laboratory, which showed morphological alterations in rats submitted to this kind of exercise. Since form is related to function, such morphological alterations could influence kidney function, which gives physiological relevance to submitting a pregnant female to a high-intensity exercise training regimen. The animals were assigned to one of two groups: control animals that did not exercise during pregnancy, and trained animals that swam for 120 min 5 days a week for 8 weeks before pregnancy and then daily for 60 min over a period of 8 weeks starting on the second day of pregnancy. Seven rats from each group were analyzed for morphological alterations and for renal function. The progeny of the rats used for morphological evaluation were delivered by cesarean section, and the progeny of the animals used to evaluate renal function were delivered normally. The progeny were two months old when renal function was evaluated. Fertility and morbidity were the same in both groups. Strenuous maternal exercise had no significant influence on glomerular filtration rate (GFR), but renal plasma flow was lower in the progeny of the trained group (mean ± SD, 16.65 ± 3.77 ml min-1 kg-1) than in the progeny of the control group (33.42 ± 2.56 ml min-1 kg-1). Antidiuretic and antinatriuretic effects were observed in the progeny of the trained group, since urine flow as a percentage of GFR and the fractional urinary sodium excretion were lower in this group (1.38 ± 0.10 and 0.60 ± 0.04%, respectively) than in the progeny of the control group (2.36 ± 0.11 and 1.55 ± 0.20%, respectively). Moreover, in this exercise program, fetuses from trained animals were smaller (2.45 ± 0.19 vs 4.66 ± 2.45 g for control animals) and less differentiated than fetuses from the control group. These effects were probably caused by caloric restriction, hypoxia and reduction of umbilical cord length.
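The indices reported in this abstract (urine flow as a percentage of GFR and fractional sodium excretion) follow standard clearance arithmetic. The Python sketch below illustrates how such indices are commonly derived from plasma and urine measurements; the GFR marker, variable names, and values are illustrative assumptions, not data or methods taken from the study.

    # Standard clearance-based renal indices (illustrative values only).

    def clearance(urine_conc, urine_flow, plasma_conc):
        # C_x = (U_x * V) / P_x, in the same volume/time units as urine_flow
        return urine_conc * urine_flow / plasma_conc

    # Hypothetical measurements for one animal (not from the abstract):
    urine_flow = 0.02                            # ml/min
    plasma_inulin, urine_inulin = 0.25, 30.0     # mg/ml (assumed GFR marker)
    plasma_na, urine_na = 140.0, 60.0            # mEq/l

    gfr = clearance(urine_inulin, urine_flow, plasma_inulin)   # ml/min
    c_na = clearance(urine_na, urine_flow, plasma_na)          # ml/min

    urine_flow_pct_gfr = 100.0 * urine_flow / gfr   # urine flow as % of GFR
    fe_na = 100.0 * c_na / gfr                      # fractional sodium excretion, %

    print(f"GFR = {gfr:.2f} ml/min, V/GFR = {urine_flow_pct_gfr:.2f}%, FENa = {fe_na:.2f}%")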
Abstract:
In the present study we determined the effect of chronic dietary supplementation with n-3 PUFA on the renal function of healthy and cachectic animals: fish oil (1 g/kg body weight) was provided to female rats throughout pregnancy and lactation and then to their offspring after weaning, and renal function parameters were examined during the offspring's adulthood. The animals were divided into four groups of 5-10 rats each: control, control supplemented with fish oil (P), cachectic Walker 256 tumor-bearing (W), and W supplemented with fish oil (WP). Food intake was significantly lower in the W group than in the control group (12.66 ± 4.24 vs 25.30 ± 1.07 g/day), and treatment with fish oil significantly reversed this reduction (22.70 ± 2.94 g/day). Tumor growth rate was markedly reduced by supplementation (16.41 ± 2.09 g for WP vs 24.06 ± 2.64 g for W). The WP group showed a significant increase in mean glomerular filtration rate compared to the P and control groups (1.520 ± 0.214 ml min-1 kg body weight-1; P < 0.05). Tumor-bearing groups had lower urine osmolality than control rats. Fractional sodium excretion decreased in the W group compared to control (0.43 ± 0.16 vs 2.99 ± 0.87%; P < 0.05) and partially recovered in the WP group (0.90 ± 0.20%). In summary, the chronic fish oil supplementation used in this study increased the amount of fat in the diet by only 0.1%, yet caused remarkable changes in tumor growth rate and cachexia and also had a renoprotective effect.
Abstract:
Because thalidomide and pentoxifylline inhibit the synthesis and release of tumor necrosis factor-alpha (TNF-alpha), we determined the effect of these drugs on the renal damage induced by supernatants of macrophages activated with Crotalus durissus cascavella venom, in order to identify the role of TNF-alpha in the process. Rat peritoneal macrophages were collected with RPMI medium, stimulated in vitro with C.d. cascavella venom (10 µg/ml) in the absence and presence of thalidomide (15 µM) or pentoxifylline (500 µM) for 1 h, then washed and kept in culture for 2 h. Supernatant (1 ml) was tested on an isolated perfused rat kidney (N = 6 for each group). The first 30 min of each experiment were used as control; the supernatant was then added to the perfusion system. All experiments lasted 120 min. The toxic effect of the preparation of venom-stimulated macrophages on renal parameters was determined. Thalidomide and pentoxifylline were added 30 min before and were present during the stimulation of macrophages with C.d. cascavella venom. At 120 min, thalidomide (Thalid) and pentoxifylline (Ptx) inhibited (P < 0.05) the venom-induced increase in perfusion pressure (control = 114.0 ± 1.3; venom = 137.1 ± 1.5; Thalid = 121.0 ± 2.5; Ptx = 121.4 ± 4.0 mmHg), renal vascular resistance (control = 4.5 ± 0.2; venom = 7.3 ± 0.6; Thalid = 4.5 ± 0.9; Ptx = 4.8 ± 0.6 mmHg/ml g-1 min-1), urinary flow (control = 0.23 ± 0.001; venom = 0.44 ± 0.01; Thalid = 0.22 ± 0.007; Ptx = 0.21 ± 0.009 ml g-1 min-1), and glomerular filtration rate (control = 0.72 ± 0.06; venom = 1.91 ± 0.11; Thalid = 0.75 ± 0.04; Ptx = 0.77 ± 0.05 ml g-1 min-1), as well as the decrease in percent tubular sodium transport (control = 77.0 ± 0.9; venom = 73.9 ± 0.66; Thalid = 76.6 ± 1.1; Ptx = 81.8 ± 2.0%), percent tubular chloride transport (control = 77.1 ± 1.2; venom = 71.4 ± 1.1; Thalid = 77.6 ± 1.7; Ptx = 76.8 ± 1.2%), and percent tubular potassium transport (control = 72.7 ± 1.1; venom = 63.0 ± 1.1; Thalid = 72.6 ± 1.0; Ptx = 74.8 ± 1.0%). These data suggest the participation of TNF-alpha in the renal effects induced by supernatant of macrophages activated with C.d. cascavella venom.
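In isolated perfused kidney preparations, renal vascular resistance and percent tubular transport are derived quantities: resistance is perfusion pressure divided by perfusate flow per gram of kidney, and percent transport is the filtered load minus urinary excretion, expressed as a fraction of the filtered load. The Python sketch below illustrates these standard definitions with assumed values; it does not reproduce the study's measurements.

    # Derived perfused-kidney parameters (assumed sample values, not study data).

    perfusion_pressure = 115.0           # mmHg
    perfusate_flow = 25.0                # ml min-1 g kidney-1
    gfr = 0.75                           # ml min-1 g-1
    urine_flow = 0.22                    # ml min-1 g-1
    plasma_na, urine_na = 140.0, 70.0    # mEq/l

    rvr = perfusion_pressure / perfusate_flow        # mmHg/ml g-1 min-1

    filtered_na = gfr * plasma_na                    # filtered sodium load
    excreted_na = urine_flow * urine_na              # urinary sodium excretion
    pct_na_transport = 100.0 * (filtered_na - excreted_na) / filtered_na

    print(f"RVR = {rvr:.1f} mmHg/ml g-1 min-1, %TNa+ = {pct_na_transport:.1f}%")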
Abstract:
Atherosclerosis is a major complication of chronic renal failure. Microinflammation is involved in atherogenesis and is associated with uremia and dialysis. The role of dialysate water contamination in inducing inflammation has been debated. Our aim was to study inflammatory markers in patients on chronic dialysis before and 3 to 6 months after switching the water purification system from deionization to reverse osmosis. Demographic, clinical and nutritional information was collected and blood was drawn for determination of albumin, ferritin, C-reactive protein (CRP), interleukin-6, and tumor necrosis factor-alpha in both situations. Acceptable levels of water purity were less than 200 colony-forming units of bacteria and less than 1 ng/ml of endotoxin. Sixteen patients died. They had higher median CRP (26.6 vs 11.2 mg/dl, P = 0.007) and lower median albumin levels (3.1 vs 3.9 g/l, P < 0.05) compared to the 31 survivors. Eight patients were excluded because of obvious inflammatory conditions. Of the 23 remaining patients (mean age ± SD: 51.3 ± 13.9 years), 18 had a decrease in CRP after the water treatment system was changed. Overall, median CRP was lower with reverse osmosis than with deionization (deionization: 13.2 vs reverse osmosis: 4.5 mg/l, P = 0.022, N = 23). There was no difference in albumin, cytokines, subjective global evaluation, or clinical and biochemical parameters. In conclusion, uremic patients presented a clinically significant reduction in CRP levels when the dialysate water purification system was switched from deionization to reverse osmosis. It is possible that better water treatment induces less inflammation and eventually less atherosclerosis in hemodialysis patients.
Abstract:
The aim of the present study was to evaluate the effect of amiodarone on mean arterial pressure (MAP), heart rate (HR), the baroreflex, the Bezold-Jarisch reflex, and the peripheral chemoreflex in normotensive and chronic one-kidney, one-clip (1K1C) hypertensive rats (N = 9 to 11 rats in each group). Amiodarone (50 mg/kg, iv) elicited hypotension and bradycardia in normotensive (-10 ± 1 mmHg, -57 ± 6 bpm) and hypertensive rats (-37 ± 7 mmHg, -39 ± 19 bpm). The baroreflex index (ΔHR/ΔMAP) was significantly attenuated by amiodarone in both normotensive (-0.61 ± 0.12 vs -1.47 ± 0.14 bpm/mmHg for reflex bradycardia and -1.15 ± 0.19 vs -2.63 ± 0.26 bpm/mmHg for reflex tachycardia) and hypertensive rats (-0.26 ± 0.05 vs -0.72 ± 0.16 bpm/mmHg for reflex bradycardia and -0.92 ± 0.19 vs -1.51 ± 0.19 bpm/mmHg for reflex tachycardia). The slope of the linear regression of Δpulse interval on ΔMAP was attenuated for both reflex bradycardia and tachycardia in normotensive rats (-0.47 ± 0.13 vs -0.94 ± 0.19 ms/mmHg and -0.80 ± 0.13 vs -1.11 ± 0.13 ms/mmHg), but only for reflex bradycardia in hypertensive rats (-0.15 ± 0.02 vs -0.23 ± 0.3 ms/mmHg). In addition, the MAP and HR responses to the Bezold-Jarisch reflex were 20-30% smaller in amiodarone-treated normotensive and hypertensive rats. The bradycardic response to peripheral chemoreflex activation with intravenous potassium cyanide was also attenuated by amiodarone in both normotensive (-30 ± 6 vs -49 ± 8 bpm) and hypertensive rats (-34 ± 13 vs -42 ± 10 bpm). On the basis of the well-known electrophysiological effects of amiodarone, the sinus node might be responsible for the attenuation of the cardiovascular reflexes found in the present study.
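The baroreflex indices above are either ratios (ΔHR/ΔMAP) or slopes of a linear regression of the reflex pulse-interval change on the drug-induced MAP change. A minimal Python sketch of such a slope estimate is shown below; the paired data points are invented for illustration and are not taken from the study.

    # Baroreflex sensitivity as the slope of reflex HR changes vs MAP changes
    # (hypothetical data points; regression with numpy.polyfit).
    import numpy as np

    delta_map = np.array([10.0, 20.0, 30.0, 40.0])      # mmHg (pressor responses)
    delta_hr = np.array([-14.0, -30.0, -43.0, -60.0])   # bpm (reflex bradycardia)

    slope, intercept = np.polyfit(delta_map, delta_hr, 1)
    print(f"baroreflex index ~ {slope:.2f} bpm/mmHg")

    # The same fit of delta pulse interval (ms) on delta MAP gives the ms/mmHg index.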
Abstract:
The present study evaluated the acute effect of the intraperitoneal (ip) administration of a whey protein hydrolysate (WPH) on systolic arterial blood pressure (SBP) and renal sodium handling in conscious spontaneously hypertensive rats (SHR). The ip administration of WPH in a volume of 1 ml dose-dependently lowered the SBP of SHR 2 h after administration at doses of 0.5 g/kg (0.15 M NaCl: 188.5 ± 9.3 mmHg vs WPH: 176.6 ± 4.9 mmHg, N = 8, P = 0.001) and 1.0 g/kg (0.15 M NaCl: 188.5 ± 9.3 mmHg vs WPH: 163.8 ± 5.9 mmHg, N = 8, P = 0.0018). Creatinine clearance decreased significantly (P = 0.0084) in the WPH-treated group (326 ± 67 µL min-1 100 g body weight-1) compared to 0.15 M NaCl-treated (890 ± 26 µL min-1 100 g body weight-1) and captopril-treated (903 ± 72 µL min-1 100 g body weight-1) rats. The ip administration of 1.0 g WPH/kg also decreased fractional sodium excretion to 0.021 ± 0.019%, compared to 0.126 ± 0.041 and 0.66 ± 0.015% in 0.15 M NaCl- and captopril-treated rats, respectively (P = 0.033). Similarly, fractional potassium excretion in WPH-treated rats (0.25 ± 0.05%) was significantly lower (P = 0.0063) than in control (0.91 ± 0.15%) and captopril-treated rats (1.24 ± 0.30%). The present study shows that WPH administration decreased SBP in SHR in association with increased tubular sodium reabsorption, despite the in vitro angiotensin I-converting enzyme (ACE)-inhibitory activity of the hydrolysate (IC50 = 0.68 mg/mL). These findings suggest a pathway involving ACE inhibition, but measurements of plasma ACE activity and angiotensin II levels are needed to support this suggestion.
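Creatinine clearance in this abstract is expressed per 100 g body weight. The short Python sketch below shows the usual normalization of a clearance (U·V/P) to body weight; all input values are hypothetical and are not taken from the study.

    # Creatinine clearance normalized to 100 g body weight (hypothetical inputs).

    urine_creatinine = 60.0      # mg/dl
    plasma_creatinine = 0.6      # mg/dl
    urine_flow = 10.0            # microliters/min
    body_weight = 300.0          # g

    ccr = urine_creatinine * urine_flow / plasma_creatinine   # microliters/min
    ccr_per_100g = ccr * 100.0 / body_weight
    print(f"CCr = {ccr_per_100g:.0f} uL min-1 100 g body weight-1")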
Abstract:
Low bone remodeling and relatively low serum parathyroid hormone (PTH) levels characterize adynamic bone disease (ABD). The impact of renal transplantation (RT) on the course of ABD is unknown. We studied prospectively 13 patients with biopsy-proven ABD after RT. Bone histomorphometry and bone mineral density (BMD) measurements were performed in the 1st and 12th months after RT. Serum PTH, 25-hydroxyvitamin D, 1,25-dihydroxyvitamin D, and osteocalcin were measured regularly throughout the study. Serum PTH levels were slightly elevated at transplantation, normalized at the end of the third month and remained stable thereafter. Bone biopsies performed in the first month after RT revealed low bone turnover in all patients, with positive bone aluminum staining in 5. In the 12th month, second biopsies were performed on 12 patients. Bone histomorphometric dynamic parameters improved in 9 and were completely normalized in 6, whereas no bone mineralization was detected in 3 of these 12 patients. At 12 months post-RT, no bone aluminum was detected in any patient. We also found a decrease in lumbar BMD and an increase in femoral BMD. Patients suffering from ABD, even those with a reduction in PTH levels, may present partial or complete recovery of bone turnover after successful renal transplantation. However, it is not possible to positively identify the mechanisms responsible for the improvement. Identifying these mechanisms should lead to a better understanding of the physiopathology of ABD and to the development of more effective treatments.
Abstract:
We evaluated the prevalence of low bone mineral density (BMD) and osteoporotic fractures in kidney transplantation (KT) patients and determined the risk factors associated with osteoporotic fractures. The study was conducted on 191 patients (94 men and 97 women) who had received a first KT 3 years or more earlier and presented stable, preserved renal function (serum creatinine levels lower than 2.5 mg/dl). KT patients were on immunosuppressive therapy and the cumulative doses of these drugs were also evaluated. BMD was determined by dual-energy X-ray absorptiometry at multiple sites (spine, femur and total body). Quantitative ultrasound of the calcaneus (broadband ultrasound attenuation, speed of sound, and stiffness index, SI) was also performed. Twenty-four percent of all patients (46) had either vertebral (29/46) or appendicular (17/46) fractures. We found osteoporosis and osteopenia in 8.5-13.4 and 30.9-35.1% of KT patients, respectively. Women had more fractures than men. In women, prevalent fractures were associated with diabetes mellitus [OR = 11.5, 95% CI (2.4-55.7)], time since menopause [OR = 3.7, 95% CI (1.2-11.9)], femoral neck BMD [OR = 1.99, 95% CI (1.4-2.8)], cumulative dose of steroids [OR = 1.1, 95% CI (1.02-1.12)], and low SI [OR = 1.1, 95% CI (1.0-1.2)]. In men, fractures were associated with lower lumbar spine BMD [OR = 1.75, 95% CI (1.1-2.7)], lower SI [OR = 1.1, 95% CI (1.03-1.13)], duration of dialysis [OR = 1.3, 95% CI (1.13-2.7)], and lower body mass index [OR = 1.24, 95% CI (1.1-1.4)]. Our results demonstrate a high prevalence of low BMD and osteoporotic fractures in patients with a successful kidney transplant and indicate the need for specific intervention to prevent osteoporosis in this population.
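The odds ratios and 95% confidence intervals quoted above are typical logistic-regression outputs, related to the model coefficient by OR = exp(beta) and CI = exp(beta ± 1.96·SE). The Python sketch below back-calculates beta and SE from the diabetes mellitus estimate reported for women (OR = 11.5, CI 2.4-55.7) and then reproduces the interval approximately; it only illustrates the arithmetic and is not a re-analysis of the study data.

    # Relationship between a logistic-regression coefficient and a reported
    # OR / 95% CI, illustrated with the diabetes estimate quoted above.
    import math

    or_reported, ci_low, ci_high = 11.5, 2.4, 55.7
    beta = math.log(or_reported)                               # ~2.44
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)   # ~0.80

    # Forward direction: OR and approximate CI from beta and SE.
    print(f"OR = {math.exp(beta):.1f}, 95% CI "
          f"({math.exp(beta - 1.96 * se):.1f}-{math.exp(beta + 1.96 * se):.1f})")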
Abstract:
The objective of the present study was to determine the frequency of the most common clinical features in patients with autosomal dominant polycystic kidney disease in a sample of the Brazilian population. The medical records of 92 patients with autosomal dominant polycystic kidney disease attended during the period from 1985 to 2003 were reviewed. The following data were recorded: age at diagnosis, gender, associated clinical manifestations, occurrence of stroke, age at loss of renal function (beginning of dialysis), and presence of a family history. The involvement of abdominal viscera was investigated by ultrasonography. Intracranial alterations were prospectively investigated by magnetic resonance angiography in 42 asymptomatic patients, and complemented with digital subtraction arteriography when indicated. Mean age at diagnosis was 35.1 ± 14.9 years, and mean serum creatinine at referral was 2.4 ± 2.8 mg/dL. The most frequent clinical manifestations during the disease were arterial hypertension (63.3%), lumbar pain (55.4%), an abdominal mass (47.8%), and urinary infection (35.8%). Loss of renal function occurred in 27 patients (mean age: 45.4 ± 9.5 years). The liver was the second organ most frequently affected (39.1%). Stroke occurred in 7.6% of the patients. Asymptomatic intracranial aneurysm was detected in 3 patients and arachnoid cysts in 3 other patients. In conclusion, the most common clinical features were lumbar pain, arterial hypertension, abdominal mass, and urinary infection, and the most serious complications were chronic renal failure and stroke. Both intracranial aneurysms and arachnoid cysts occurred in asymptomatic patients at a frequency of 7.14%.
Abstract:
Mitogen-activated protein kinases (MAPK) may be involved in the pathogenesis of acute renal failure. This study investigated the expression of p-p38 MAPK and nuclear factor kappa B (NF-kappaB) in the renal cortex of rats treated with gentamicin. Twenty rats were injected with gentamicin, 40 mg/kg, im, twice a day for 9 days, 20 with gentamicin + pyrrolidine dithiocarbamate (PDTC, an NF-kappaB inhibitor), 14 with 0.15 M NaCl, im, twice a day for 9 days, and 14 with 0.15 M NaCl, im, twice a day for 9 days plus PDTC, 50 mg kg-1 day-1, ip, twice a day for 15 days. The animals were killed 5 and 30 days after the last injection and the kidneys were removed for histological, immunohistochemical and Western blot analysis and for nitrate determination. The results of the immunohistochemical study were evaluated by counting the p-p38 MAPK-positive cells per 0.05-mm² area of renal cortex. Creatinine was measured by the Jaffé method in blood samples collected 5 and 30 days after the end of the treatments. Gentamicin-treated rats presented a transitory increase in plasma creatinine levels. In addition, animals killed 5 days after the end of gentamicin treatment presented acute tubular necrosis and increased nitrate levels in the renal cortex. Increased expression of p-p38 MAPK and NF-kappaB was also observed in the kidneys from these animals. The animals killed 30 days after gentamicin treatment showed residual areas of interstitial fibrosis in the renal cortex, although the expression of p-p38 MAPK in their kidneys did not differ from control. Treatment with PDTC reduced the functional and structural changes induced by gentamicin, as well as the expression of p-p38 MAPK and NF-kappaB. The increased expression of p-p38 MAPK and NF-kappaB observed in these rats suggests that these signaling molecules may be involved in the pathogenesis of tubulointerstitial nephritis induced by gentamicin.
Abstract:
Treatment with indinavir (IDV), a protease inhibitor, is frequently associated with renal abnormalities. We determined the incidence of renal failure (creatinine clearance <80 mL min-1 1.73 (m²)-1) in HIV patients treated with highly active antiretroviral therapy, including IDV, and investigated the possible mechanisms and risk factors of IDV nephrotoxicity. Thirty-six patients receiving IDV were followed for 3 years. All were assessed for age, body weight, duration of infection, duration of IDV treatment, sulfur-derivative use, total cholesterol, triglycerides, magnesium, sodium, potassium, creatinine, and urinalysis. We also determined renal function in terms of creatinine clearance, urine osmolality and fractional excretion of sodium, potassium, and water. Urinary nitrate (NO3) excretion was measured in 18 IDV-treated patients and compared with that of 8 patients treated with efavirenz, a drug without renal side effects. Sterile leukocyturia occurred in 80.5% of the IDV-treated patients. Creatinine clearance <80 mL min-1 1.73 (m²)-1 was observed in 22 patients (61%) and was associated with low body weight and the use of sulfur-derivatives. These patients also had lower osmolality, lower urine volume and a higher fractional excretion of water compared to the normal renal function group. Urinary NO3 excretion was significantly lower in IDV-treated patients (809 ± 181 µM NO3-/mg creatinine) than in efavirenz-treated patients (2247 ± 648 µM NO3-/mg creatinine, P < 0.01). The lower NO3 excretion suggests that IDV decreases nitric oxide production.
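Creatinine clearance in this abstract is indexed to 1.73 m² of body surface area (BSA). The Python sketch below shows the usual normalization; the Du Bois BSA formula and all input values are assumptions for illustration, since the abstract does not state how BSA was obtained.

    # Creatinine clearance indexed to 1.73 m2 of body surface area.
    # The Du Bois formula and the input values are illustrative assumptions.

    def bsa_dubois(weight_kg, height_cm):
        return 0.007184 * (weight_kg ** 0.425) * (height_cm ** 0.725)

    measured_ccr = 70.0                    # mL/min, from a timed urine collection
    weight_kg, height_cm = 58.0, 168.0

    bsa = bsa_dubois(weight_kg, height_cm)
    ccr_indexed = measured_ccr * 1.73 / bsa        # mL min-1 1.73 (m2)-1
    print(f"BSA = {bsa:.2f} m2, CCr = {ccr_indexed:.1f} mL min-1 1.73 (m2)-1")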
Abstract:
Significant improvements have been noted in heart transplantation with the advent of cyclosporine. However, cyclosporine use is associated with significant side effects, such as chronic renal failure. We were interested in evaluating the incidence of long-term renal dysfunction in heart transplant recipients. Fifty-three heart transplant recipients were enrolled in the study and 43 patients completed the entire evaluation and follow-up. Glomerular function (serum creatinine, measured creatinine clearance, and calculated creatinine clearance) and tubular function (urinary retinol-binding protein, uRBP) were re-analyzed after 18 months. At enrollment, the prevalence of renal failure ranged from 37.7 to 54% according to the criteria used to define it (serum creatinine ≥1.5 mg/dL and creatinine clearance <60 mL/min). Mean serum creatinine was 1.61 ± 1.31 mg/dL (range 0.7 to 9.8 mg/dL), and calculated and measured creatinine clearances were 67.7 ± 25.9 and 61.18 ± 25.04 mL min-1 (1.73 m²)-1, respectively. Sixteen of the 43 patients who completed the follow-up (37.2%) had tubular dysfunction detected by increased levels of uRBP (median 1.06, 0.412-6.396 mg/dL). Eleven of the 16 patients (68.7%) with elevated uRBP had poorer renal function after 18 months of follow-up, compared with only eight of the 27 patients (29.6%) with normal uRBP (RR = 3.47, P = 0.0095). Interestingly, cyclosporine trough levels did not differ between patients with and without tubular and glomerular dysfunction. Renal function impairment is common after heart transplantation. Tubular dysfunction, assessed by uRBP, correlates with a worsening of glomerular filtration and can be a useful tool for the early detection of renal dysfunction.
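The abstract distinguishes measured from calculated creatinine clearance but does not name the estimating equation. As a hedged illustration only, the Python sketch below uses the Cockcroft-Gault formula, one commonly used way to calculate creatinine clearance; it is an assumption, not the method stated in the study.

    # Cockcroft-Gault estimate of creatinine clearance (an assumed example of a
    # "calculated" clearance; the study does not specify which equation was used).

    def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female=False):
        ccr = (140 - age_years) * weight_kg / (72.0 * serum_creatinine_mg_dl)
        return ccr * 0.85 if female else ccr     # mL/min

    print(f"{cockcroft_gault(55, 72, 1.6):.0f} mL/min")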
Abstract:
The objective of the present study was to assess the incidence, risk factors and outcome of patients who develop acute renal failure (ARF) in the intensive care unit. In this prospective observational study, 221 patients with a minimum stay of 48 h, a minimum age of 18 years, and no overt acute or chronic renal failure were included; organ donors and renal transplant patients were excluded. ARF was defined as a creatinine level above 1.5 mg/dL. Statistical analyses were performed using Pearson's chi-square test, the Student t-test, and the Wilcoxon test. The multivariate analysis was run using all variables with P < 0.1 in the univariate analysis. ARF developed in 19.0% of the patients, 76.19% of whom died. The main risk factors (univariate analysis) were: higher intra-operative hydration and bleeding, higher death risk by APACHE II score, Logistic Organ Dysfunction System score on the first day, mechanical ventilation, shock due to systemic inflammatory response syndrome (SIRS)/sepsis, noradrenaline use, and plasma creatinine and urea levels on admission. Heart rate on admission (OR = 1.023 (1.002-1.044)), male gender (OR = 4.275 (1.340-13.642)), shock due to SIRS/sepsis (OR = 8.590 (2.710-27.229)), higher intra-operative hydration (OR = 1.002 (1.000-1.004)), and plasma urea on admission (OR = 1.012 (0.980-1.044)) remained significant in the multivariate analysis. The mortality risk factors (univariate analysis) were shock due to SIRS/sepsis, mechanical ventilation, blood stream infection, and potassium and bicarbonate levels; only potassium levels remained significant (P = 0.037). In conclusion, ARF occurring in the intensive care unit has a high incidence, morbidity and mortality, and is very closely associated with hemodynamic status and multiple organ dysfunction.
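The analysis plan described here (univariate tests, then a multivariate model including every variable with P < 0.1) is a conventional two-step screen. The Python sketch below illustrates that workflow with scipy and statsmodels on synthetic data; the variable names, effect sizes, and values are assumptions and do not represent the study dataset.

    # Two-step screen: univariate tests, then multivariate logistic regression on
    # the variables with P < 0.1 (synthetic, illustrative data only).
    import numpy as np
    from scipy import stats
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 221
    male = rng.integers(0, 2, n)
    heart_rate = rng.normal(95.0, 15.0, n)
    urea = rng.normal(60.0, 25.0, n)

    # Synthetic outcome loosely driven by the predictors (illustration only).
    logit = -2.0 + 1.0 * male + 0.03 * (urea - 60.0)
    arf = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

    predictors = {"male": male, "heart_rate": heart_rate, "urea": urea}
    selected = []
    for name, x in predictors.items():
        if name == "male":   # categorical predictor: chi-square on the 2 x 2 table
            table = [[np.sum((x == i) & (arf == j)) for j in (0, 1)] for i in (0, 1)]
            chi2, p, dof, expected = stats.chi2_contingency(table)
        else:                # continuous predictor: Student t-test between groups
            p = stats.ttest_ind(x[arf == 1], x[arf == 0]).pvalue
        if p < 0.1:
            selected.append(name)

    if selected:             # multivariate model on the screened variables
        X = sm.add_constant(np.column_stack([predictors[k] for k in selected]))
        fit = sm.Logit(arf, X).fit(disp=0)
        print(dict(zip(["intercept"] + selected, np.round(np.exp(fit.params), 3))))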
Abstract:
Renal ischemia-reperfusion (IR) injury is the major cause of acute renal failure in native and transplanted kidneys. Mononuclear leukocytes have been reported in renal tissue as part of the innate and adaptive responses triggered by IR. We investigated the participation of CD4+ T lymphocytes in the pathogenesis of renal IR injury. Male mice (C57BL/6, 8 to 12 weeks old) were submitted to 45 min of ischemia by renal pedicle clamping followed by reperfusion. We evaluated the role of CD4+ T cells using a monoclonal depleting antibody against CD4 (GK1.5, 50 µ, ip) and class II-major histocompatibility complex molecule knockout mice. Both CD4-depleted groups showed a marked improvement in renal function compared to the ischemic group, despite the fact that GK1.5 mAb treatment promoted a profound CD4 depletion (to less than 5% compared to normal controls) only within the first 24 h after IR. CD4-depleted groups presented a significant improvement in 5-day survival (84 vs 80 vs 39%; antibody-treated, knockout mice and non-depleted groups, respectively) and also a significant reduction in the tubular necrosis area, with an early tubular regeneration pattern. The peak of CD4-positive cell infiltration occurred on day 2, coinciding with the high expression of βC mRNA and increased urea levels. CD4 depletion did not alter the CD11b infiltrate or the IFN-γ and granzyme-B mRNA expression in renal tissue. These data indicate that a CD4+ subset of T lymphocytes may be implicated as key mediators of very early inflammatory responses after renal IR injury and that targeting CD4+ T lymphocytes may yield novel therapies.
HLA-DRB1 alleles in juvenile-onset systemic lupus erythematosus: renal histologic class correlations
Abstract:
Human leukocyte antigens (HLA) DRB1*03 and DRB1*02 have been associated with systemic lupus erythematosus (SLE) in Caucasian and black populations. It has been observed that certain HLA alleles show stronger associations with SLE autoantibodies and clinical subsets, although they have rarely been associated with lupus renal histologic class. In the present study, HLA-DRB1 allele correlations with clinical features, autoantibodies and renal histologic class were analyzed in a cohort of racially mixed Brazilian patients with juvenile-onset SLE. HLA-DRB1 typing was carried out by polymerase chain reaction amplification with sequence-specific primers using genomic DNA from 55 children and adolescents fulfilling at least four of the American College of Rheumatology criteria for SLE. Significance was determined by the chi-square test applied to 2 x 2 tables. The HLA-DRB1*15 allele was most frequent in patients with renal, musculoskeletal, cutaneous, hematologic, cardiac, and neuropsychiatric involvement, as well as in patients positive for anti-dsDNA, anti-Sm, anti-U1-RNP, and anti-SSA/Ro antibodies, although no association between HLA alleles and SLE clinical features or autoantibodies was observed. The HLA-DRB1*17, HLA-DRB1*10, HLA-DRB1*15, and HLA-DRB1*07 alleles were significantly more frequent in patients with renal histologic class I, class IIA, class IIB, and class V, respectively. The present results suggest that the contribution of HLA-DRB1 alleles to juvenile-onset SLE may not be related to clinical or serological subsets of the disease, but may be related to renal histologic class, especially classes I, IIA, IIB, and V. The latter correlations have not been previously reported in the literature.
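Significance testing of an allele-by-histologic-class association with a chi-square test on a 2 x 2 table, as stated above, amounts to comparing observed and expected cell counts. A minimal Python sketch follows; the counts are hypothetical and are not the cohort's data. With cells this small, Fisher's exact test is often preferred, which is one design consideration such an analysis has to weigh.

    # Chi-square test on a 2 x 2 table of allele carriage vs one renal histologic
    # class (hypothetical counts, not the study data).
    from scipy.stats import chi2_contingency, fisher_exact

    #        class present, class absent
    table = [[6, 4],        # allele carriers
             [5, 40]]       # non-carriers

    chi2, p, dof, expected = chi2_contingency(table)     # Yates-corrected by default
    print(f"chi-square = {chi2:.2f}, P = {p:.4f}")
    print(f"Fisher exact P = {fisher_exact(table)[1]:.4f}")   # small-cell alternative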