61 results for ba
Abstract:
Joint pedological, geochemical, hydrological and geophysical investigations were performed to study the coexistence of saline and freshwater lakes in close proximity and under similar climatic conditions in the Nhecolandia region, Pantanal wetlands, Brazil. The saline lakes are concentrically surrounded by green sandy loam horizons, which cause differential hydrological regimes. Mg-calcite, K-silicates, and amorphous silica precipitate in the soil cover, whereas Mg-silicates and more soluble Na-carbonates are concentrated in the topsoil along the shore of the saline lakes. In saline solutions, some minor elements (As, Se) reach values above the water quality recommendations, whereas others are controlled and incorporated into solid phases (Ba, Sr). Locally, the destruction of the sandy loam horizons generates a very acidic soil solution (pH ~3.5) through a process not yet understood. The soil distributions indicate that some freshwater lakes are former saline lakes. They are invaded by freshwater after destruction of the green sandy loam horizons; the freshwater then becomes enriched in K(+), SO(4)(2-), Fe, Al, and a range of minor and trace elements. The formation of these green sandy loam horizons in the saline environment and their destruction in the non-saline one emphasize the dynamic nature of this environment. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
Biliary atresia (BA) is the most important disease requiring liver transplantation in children. Common bile duct ligation (BDL) in rats is a classic experimental model for studying biliary obstruction. The response of the neonatal animal to BDL is not yet completely understood, and few reports have focused on the differences in liver behavior between neonatal and adult animals. Ninety newborn Wistar rats aged six days, weighing 8.0-13.9 g, and 90 adult Wistar rats weighing 199.7-357.0 g, were submitted to BDL. After surgery, they were randomly divided and killed on the 3rd, 5th, 7th, 14th, 21st and 28th day post-BDL. Hepatic biopsies were obtained and the following were measured: (i) semiquantification of bile ductule proliferation and inflammatory infiltrate with the HE stain; (ii) quantification of portal and periportal fibrosis with the Sirius-red stain. Although the initial response of ductule proliferation and inflammatory infiltrate was less intense in the newborn animals, portal and periportal fibrosis was higher when compared with adult animals (p < 0.0491). These findings may contribute to the understanding of the pathophysiology of BA.
Abstract:
Aims: There remains significant concern about the long-term safety of drug-eluting stents (DES). However, bare metal stents (BMS) have been used safely for over two decades. There is therefore a pressing need to explore alternative strategies for reducing restenosis with BMS. This study was designed to examine whether IVUS-guided cutting balloon angioplasty (CBA) with BMS could achieve restenosis rates similar to those of DES. Methods and results: In the randomised REstenosis reDUction by Cutting balloon angioplasty Evaluation (REDUCE III) study, 521 patients were divided into four groups based on device and IVUS use before BMS (IVUS-CBA-BMS: 137 patients; Angio-CBA-BMS: 123; IVUS-BA-BMS: 142; and Angio-BA-BMS: 119). At follow-up, the IVUS-CBA-BMS group had a significantly lower restenosis rate (6.6%) than the other groups (p=0.016). We performed a quantitative coronary angiography (QCA) based matched comparison between an IVUS-guided CBA-BMS strategy (REDUCE III) and a DES strategy (Rapamycin-Eluting-Stent Evaluation At Rotterdam Cardiology Hospital, the RESEARCH study). We matched the presence of diabetes, vessel size, and lesion severity by QCA. Restenosis (>50% diameter stenosis at follow-up) and target vessel revascularisation (TVR) were examined. QCA-matched comparison resulted in 120 paired lesions. While acute gain was significantly greater in IVUS-CBA-BMS than DES (1.65 +/- 0.41 mm vs. 1.28 +/- 0.57 mm, p=0.001), late loss was significantly less with DES than with IVUS-CBA-BMS (0.03 +/- 0.42 mm vs. 0.80 +/- 0.47 mm, p=0.001). However, no difference was found in restenosis rates (IVUS-CBA-BMS: 6.6% vs. DES: 5.0%, p=0.582) or TVR (6.6% and 6.6%, respectively). Conclusions: An IVUS-guided CBA-BMS strategy yielded restenosis rates similar to those achieved by DES and provided an effective alternative to the use of DES.
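The matched comparison above pairs each lesion treated with one strategy to a lesion with similar characteristics treated with the other. A minimal sketch of such 1:1 matching is given below; all lesion values and the distance weighting are invented for illustration, and the abstract does not specify the investigators' actual matching procedure.

```python
# Toy 1:1 matching on lesion characteristics (all values invented).
# Each lesion: (diabetes: 0/1, vessel size in mm, % diameter stenosis).
bms_lesions = [(0, 3.0, 70), (1, 2.5, 80), (0, 3.2, 65)]
des_lesions = [(0, 3.1, 68), (0, 2.9, 72), (1, 2.6, 79), (1, 3.3, 85)]

def match_pairs(group_a, group_b):
    """Greedy 1:1 match: require identical diabetes status, then pick the
    closest unused candidate by vessel size and stenosis severity.
    The /100 weight that balances mm against percent is an arbitrary choice."""
    used, pairs = set(), []
    for a in group_a:
        best, best_d = None, None
        for j, b in enumerate(group_b):
            if j in used or a[0] != b[0]:      # diabetes must match exactly
                continue
            d = abs(a[1] - b[1]) + abs(a[2] - b[2]) / 100.0
            if best_d is None or d < best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            pairs.append((a, group_b[best]))
    return pairs

pairs = match_pairs(bms_lesions, des_lesions)
print(len(pairs))  # 3 -- every BMS lesion found a DES partner here
```

Greedy matching is order-dependent; optimal-pairing methods exist, but the greedy version is enough to show the idea.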
Abstract:
Aneas I, Rodrigues MV, Pauletti BA, Silva GJ, Carmona R, Cardoso L, Kwitek AE, Jacob HJ, Soler JM, Krieger JE. Congenic strains provide evidence that four mapped loci in chromosomes 2, 4, and 16 influence hypertension in the SHR. Physiol Genomics 37: 52-57, 2009. First published January 6, 2009; doi: 10.1152/physiolgenomics.90299.2008. - To dissect the genetic architecture controlling blood pressure (BP) regulation in the spontaneously hypertensive rat (SHR), we derived congenic rat strains for four previously mapped BP quantitative trait loci (QTLs) in chromosomes 2, 4, and 16. Target chromosomal regions from the Brown Norway rat (BN) averaging 13-29 cM were introgressed by marker-assisted breeding onto the SHR genome in 12 or 13 generations. Under normal salt intake, QTLs on chromosomes 2a, 2c, and 4 were associated with significant changes in systolic BP (13, 20, and 15 mmHg, respectively), whereas the QTL on chromosome 16 had no measurable effect. On high salt intake (1% NaCl in drinking water for 2 wk), the chromosome 16 QTL had a marked impact on SBP, as did the QTLs on chromosomes 2a and 2c (18, 17, and 19 mmHg, respectively), but not the QTL on chromosome 4. Thus these four QTLs affected BP phenotypes differently: 1) only in the presence of high salt intake (chromosome 16), 2) only under normal salt intake (chromosome 4), and 3) regardless of salt intake (chromosomes 2c and 2a). Moreover, salt sensitivity was abrogated in congenics SHR.BN2a and SHR.BN16. Finally, we provide evidence for the influence of genetic background on the expression of the mapped QTLs individually or as a group. Collectively, these data reveal previously unsuspected nuances of the physiological roles of each of the four mapped BP QTLs in the SHR under basal and/or salt loading conditions unforeseen by the analysis of the F2 cross.
Abstract:
Objectives: To describe clinical and radiological findings and outcome in a multiethnic population of stroke survivors with basilar artery occlusive disease (BAOC). Methods: Forty patients with infarcts in the basilar artery (BA) territory, alive 30 days after the ictus, participated in the study. BA stenosis (>50%) or occlusion was shown by magnetic resonance or digital subtraction angiography in all patients. Demographic, clinical and radiological characteristics were described. Modified Rankin Scale (MRS) scores at 30 days and 6 months after the ischemic event were evaluated. Associations between demographic, clinical and radiological features and outcome were analyzed with Chi-square and Fisher's exact tests. MRS scores at 30 days and 6 months were compared with the Wilcoxon test. Results: Sixty percent of the patients were men, and 33% were Afro-Brazilian. Mean age was 55.8 +/- 12.9 years. Most (90%) had multiple vascular risk factors. Stroke was preceded by TIA in 48% of the patients, and 80% had a history of arterial hypertension. The most common neurological symptom was vertigo/dizziness (60%) and the most common sign, hemiparesis (60%). Most of the infarcts were located in the pons (85%) and the BA middle third was the most frequently affected segment (33%). BA occlusion occurred in 58% of the patients. More severe vascular occlusive lesions were present in Whites (p = 0.002) and in patients with involvement of the middle third of the BA (p = 0.021). Large-artery atherosclerosis was the most common stroke etiology (88%) and was more frequent in older patients (p < 0.001). Most patients were treated with anticoagulation. MRS scores improved significantly at 6 months (p < 0.001): at this time, 78% of the patients had MRS scores between 0 and 2. Conclusions: We observed different results compared with other series: a greater proportion of Afro-descendants and a higher frequency of atherosclerosis and BA occlusion.
Rates of preceding TIAs and good outcome at 6 months were similar to previously published data. These results represent a step forward towards understanding BAOC in a multiethnic context. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
PURPOSE: To compare the ability of Fourier-domain (FD) optical coherence tomography (3D OCT-1000; Topcon, Tokyo, Japan) and time-domain (TD) OCT (Stratus; Carl Zeiss Meditec Inc, Dublin, California, USA) to detect axonal loss in eyes with band atrophy (BA) of the optic nerve. DESIGN: Cross-sectional study. METHODS: Thirty-six eyes from 36 patients with BA and temporal visual field (VF) defect from chiasmal compression and 36 normal eyes were studied. Subjects were submitted to standard automated perimetry, and macular and retinal nerve fiber layer (RNFL) measurements were taken using 3D OCT-1000 and Stratus OCT. Receiver operating characteristic (ROC) curves were calculated for each parameter. Spearman correlation coefficients were obtained to evaluate the relationship between RNFL and macular thickness parameters and severity of VF loss. Measurements from the two devices were compared. RESULTS: Regardless of OCT device, all RNFL and macular thickness parameters were significantly lower in eyes with BA compared with normal eyes, but no statistically significant difference was found with regard to the area under the ROC curve. Structure-function relationships were also similar for the two devices. In both groups, RNFL and macular thickness measurements were generally, and in some cases significantly, smaller with 3D OCT-1000 than with Stratus OCT. CONCLUSIONS: The introduction of FD technology did not lead to better discrimination ability for detecting BA of the optic nerve compared with TD technology when using the software currently provided by the manufacturer. 3D OCT-1000 FD OCT RNFL and macular measurements were generally smaller than TD Stratus OCT measurements. Investigators should be aware of this fact when comparing measurements obtained with these two devices. (Am J Ophthalmol 2009;147:56-63. (c) 2009 by Elsevier Inc. All rights reserved.)
Abstract:
Otitis media with effusion (OME) affects 28-38% of pre-school children, and it occurs due to dysfunction of the auditory tube. Anatomical development of the auditory tube depends on craniofacial growth and development. Deviations from normal craniofacial morphology and growth, identified in cephalometric studies, may predict the evolution of otitis. Our goal in this paper is to determine whether there are differences in craniofacial morphology between children with adenoid enlargement with and without otitis media with effusion. This is a prospective study in which the sample consisted of 67 children (male and female) from 5 to 10 years old. All patients presented chronic upper airway obstruction due to tonsil and adenoid enlargement (>80% degree of obstruction). Thirty-three patients presented otitis media with effusion for more than 3 months and 34 did not; the latter composed the control group. Standardized lateral head radiographs were obtained for all subjects. Radiographs were taken with the patient positioned by a cephalostat, with the mandible in centric occlusion and lips at rest. Radiographs were digitalized and specific landmarks were identified using the computer program Radiocef 2003, 5th edition. Measurements, angles and lines were taken of the basicranium, maxilla and mandible according to the modified Ricketts analysis. In addition, facial height and facial axis were determined. Children with otitis media with effusion present differences in the morphology of the face regarding these measures: N-S (anterior cranial base length), N-ANS (upper facial height), ANS-PNS (size of the hard palate), Po-Or.N-Pog (facial depth), Ba-N.Ptm-Gn (facial axis), Go-Me (mandibular length) and Vaia-Vaip (inferior pharyngeal airway). (C) 2008 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Aim To compare the ability of a scanning laser polarimeter (SLP) with variable corneal compensation (GDx VCC) and an optical coherence tomograph (Stratus OCT) to discriminate between eyes with band atrophy (BA) of the optic nerve and healthy eyes. Methods The study included 37 eyes with BA and temporal visual field (VF) defects from chiasmal compression, and 29 normal eyes. Subjects underwent standard automated perimetry (SAP) and retinal nerve fibre layer (RNFL) scans using GDx VCC and Stratus OCT. The severity of the VF defects was evaluated by the temporal mean defect (TMD), calculated as the average of 22 values of the temporal total deviation plot on SAP. Receiver operating characteristic (ROC) curves were calculated. Pearson's correlation coefficients were used to evaluate the relationship between RNFL thickness parameters and the TMD. Results No significant difference was found between the areas under the ROC curves (AUCs) for the GDx VCC and Stratus OCT with regard to average RNFL thickness (0.98 and 0.99, respectively) and the superior (0.94; 0.95), inferior (0.96; 0.97), and nasal (0.92; 0.96) quadrants. However, the AUC in the temporal quadrant (0.77) was significantly smaller (P < 0.001) with GDx VCC than with Stratus OCT (0.98). Lower TMD values were associated with smaller RNFL thickness in most parameters from both instruments. Conclusion Adding VCC resulted in improved SLP performance when evaluating eyes with BA, and both technologies are sensitive in detecting average, superior, inferior, and nasal quadrant RNFL loss. However, GDx VCC still discriminates RNFL loss in the temporal quadrant poorly when compared with Stratus OCT.
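The AUC values compared above have a simple probabilistic reading: the AUC equals the probability that a randomly chosen case from one group scores higher than a randomly chosen case from the other (the Mann-Whitney statistic). A minimal sketch, using invented RNFL thickness values rather than the study's data:

```python
# Hypothetical RNFL thickness values in micrometers (not the study's data).
ba_eyes     = [52, 48, 60, 55, 50, 47]   # eyes with band atrophy (thinner)
normal_eyes = [95, 88, 102, 91, 85, 99]  # healthy controls (thicker)

def roc_auc(negatives, positives):
    """AUC as the Mann-Whitney statistic: the fraction of (positive, negative)
    pairs in which the positive scores higher, with ties counting half."""
    wins = 0.0
    for p in positives:
        for n in negatives:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(positives) * len(negatives))

# Thinner RNFL indicates atrophy, so "positive" here is the thicker normal
# group; a parameter that separates the groups perfectly gives AUC = 1.0.
print(roc_auc(ba_eyes, normal_eyes))  # 1.0 for these toy values
```

An AUC of 0.5 corresponds to no discrimination; the temporal-quadrant AUC of 0.77 reported above sits between chance and the near-perfect 0.98 of the competing device.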
Abstract:
The epiphytic lichenized fungus Canoparmelia texana was used to monitor atmospheric pollution in the Sao Paulo metropolitan region, SP, Brazil. Cluster analysis applied to the element concentration values confirmed the grouping of sites by level of pollution due to industrial and vehicular emissions. In the distribution maps of element concentrations, higher concentrations of Ba and Mn were observed in the vicinity of industries and of a petrochemical complex. The highest concentration of Co, found in lichens from the Sao Miguel Paulista site, is due to emissions from a metallurgical processing plant that produces this element. For Br and Zn, the highest concentrations could be associated with both vehicular and industrial emissions. Exploratory analyses revealed that the accumulation of toxic elements in C. texana may be of use in evaluating the human risk of cardiopulmonary mortality due to prolonged exposure to ambient levels of air pollution. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Particulate matter, especially PM2.5, is associated with increased morbidity and mortality from respiratory diseases. Studies that focus on the chemical composition of the material are frequent in the literature, but those that characterize the biological fraction are rare. The objectives of this study were to characterize samples collected in Sao Paulo, Brazil with respect to the quantity of fungi and endotoxins associated with PM2.5, correlating these with the mass of particulate matter, chemical composition and meteorological parameters. This was done using Principal Component Analysis (PCA) and multiple linear regression. The results show that fungi and endotoxins represent a significant portion of PM2.5, reaching average concentrations of 772.23 spores µg(-1) of PM2.5 (SD: 400.37) and 5.52 EU mg(-1) of PM2.5 (SD: 4.51 EU mg(-1)), respectively. Hyaline basidiospores, Cladosporium and total spore counts were correlated with the Ba/Ca/Fe/Zn/K/Si factor of PM2.5 (p < 0.05). The genera Pen/Asp were correlated with the total mass of PM2.5 (p < 0.05) and colorless ascospores were correlated with humidity (p < 0.05). Endotoxin was positively correlated with atmospheric temperature (p < 0.05). This study has shown that bioaerosol is present in considerable amounts in PM2.5 in the atmosphere of Sao Paulo, Brazil. Some fungi were correlated with soil particle resuspension and with the mass of particulate matter. Therefore, the relative contribution of bioaerosol to PM2.5 should be considered in future studies aimed at evaluating the clinical impact of exposure to air pollution. (C) 2010 Elsevier Ltd. All rights reserved.
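PCA of the kind used above reduces a table of correlated measurements (such as the Ba/Ca/Fe/Zn/K/Si concentrations) to a few underlying factors. The following is a minimal sketch using NumPy's SVD on an invented data matrix; the study's actual software and data are not stated in the abstract.

```python
import numpy as np

# Invented matrix: 6 samples x 3 strongly correlated "concentration" columns.
X = np.array([[1.0,  2.1, 0.9],
              [2.0,  4.2, 2.1],
              [3.0,  5.9, 2.9],
              [4.0,  8.1, 4.2],
              [5.0,  9.8, 5.1],
              [6.0, 12.2, 5.8]])

def pca(X, n_components):
    """Project centered data onto its leading principal components."""
    Xc = X - X.mean(axis=0)               # center each column
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / np.sum(s**2)       # variance fraction per component
    scores = Xc @ Vt[:n_components].T     # sample scores in the reduced space
    return scores, explained

scores, explained = pca(X, 1)
# The three columns are nearly proportional, so a single component
# captures almost all of the variance.
print(round(explained[0], 3))
```

A sample's score on a component plays the role of the "factor" value that is then correlated with spore counts or entered into a regression.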
Abstract:
Objective. The objective of this study was to report our experience with pediatric orthotopic liver transplantation (OLT) with living related donors. Methods. We performed a retrospective chart analysis of 121 living related donor liver transplantations (LRDLT) from June 1998 to June 2010. Results. Indications were biliary atresia (BA; n = 81), primary sclerosing cholangitis (n = 5), alpha-1 antitrypsin deficiency (n = 4), cholestasis (n = 9), fulminant hepatic failure (n = 8), autoimmune hepatitis (n = 2), Alagille syndrome (n = 4), hepatoblastoma (n = 3), tyrosinemia (n = 2), and congenital hepatic fibrosis (n = 3). The age of the recipients ranged from 7-174 months (median, 22) and the weights ranged from 6-58 kg (median, 10). Forty-nine children (40.5%) weighed <= 10 kg. The grafts included the left lateral segment (n = 108), the left lobe (n = 12), and the right lobe (n = 1). The donors included 71 mothers, 45 fathers, 2 uncles, 1 grandmother, 1 grandfather, and 1 sister with a median age of 29 years (range, 16-53 years) and a median weight of 68 kg (range, 47-106). Sixteen patients (12.9%) required retransplantation, most commonly due to hepatic artery thrombosis (HAT; n = 13; 10.7%). The other complications were biliary stenosis (n = 25; 20.6%), portal vein thrombosis (PVT; n = 11; 9.1%), portal vein stenosis (n = 5; 4.1%), hepatic vein stenosis (n = 6; 4.9%), and lymphoproliferative disorders (n = 8; 6.6%). The ultimate survival rate of recipients was 90.3% after 1 year and 75.8% after 3 years. Causes of early death within 1 month were HAT (n = 6), PVT (n = 2), severe graft dysfunction (n = 1), sepsis (n = 1), and intraoperative death in children with acute liver failure (n = 2). Causes of late deaths included lymphoproliferative disease (n = 3), chronic rejection (n = 2), biliary complications (n = 3), and recurrent disease (n = 3; hepatoblastoma and primary sclerosing cholangitis). Conclusions.
Despite the heightened possibility of complications (mainly vascular), LRDLT represented a good alternative to transplantation from cadaveric donors in pediatric populations. It was associated with a high survival rate.
Abstract:
Introduction. The use of arterial grafts (AG) in pediatric orthotopic liver transplantation (OLT) is an alternative in cases of poor hepatic arterial inflow, small or anomalous recipient hepatic arteries, and retransplantations (re-OLT) due to hepatic artery thrombosis (HAT). AG have been crucial to the success of the procedure among younger children. Herein we report our experience with AG. Methods. We retrospectively reviewed data from June 1989 to June 2010 among OLT in which we used AG, analyzing indications, short-term complications, and long-term outcomes. Results. Among 437 pediatric OLT, 58 children required an AG. A common iliac artery interposition graft was used in 57 cases and a donor carotid artery in 1 case. In 38 children the graft was used primarily, including 94% (36/38) in which it was due to poor hepatic arterial inflow. Ductopenia syndromes (n = 14), biliary atresia (BA; n = 11), and fulminant hepatitis (n = 8) were the main preoperative diagnoses among these children. Their mean weight was 18.4 kg and mean age was 68 months. At the mean follow-up of 27 months, multiple-organ failure and primary graft nonfunction (PNF) were the short-term causes of death in 9 children (26.5%). Among the remaining 29 patients, 2 (6.8%) developed early graft thrombosis requiring re-OLT; 5 (17%) developed biliary complications, and 1 (3.4%) had asymptomatic arterial stenosis. In 20 children, a graft was used during retransplantation. The main indication was HAT (75%). BA (n = 15), ductopenia syndromes (n = 2), and primary sclerosing cholangitis (n = 2) were the main diagnoses. Their mean weight was 16.7 kg and age was 65 months. At a mean follow-up of 53 months, 7 children died due to multiple-organ failure or PNF. Among the remaining 13 patients, 3 developed biliary complications and 1 had arterial stenosis. No thrombosis was observed. Conclusion. The data suggested that use of an AG is a useful alternative in pediatric OLT.
The technique is safe with a low risk of thrombosis.
Abstract:
Introduction. Biliary atresia (BA) is the leading indication for orthotopic liver transplantation (OLT) among children. However, there are technical difficulties, including the limited dimensions of anatomical structures, hypoplasia and/or thrombosis of the portal vein and previous portoenterostomy procedures. Objective. The objective of this study was to present our experience of 239 children with BA who underwent OLT between September 1989 and June 2010 compared with OLT performed for other causes. Methods. We performed a retrospective analysis of patient charts and analysis of complications and survival. Results. BA was the most common indication for OLT (207/409; 50.6%). The median age of subjects was 26 months (range, 7-192). Their median weight was 11 kg (range, 5-63) with 110 children (53.1%) weighing <= 10 kg. We performed 126 transplantations from cadaveric donors (60.8%) and 81 from living-related donors (LRD) (39.2%). Retransplantation was required for 31 recipients (14.9%), primarily due to hepatic artery thrombosis (HAT; 64.5%). Other complications included the following: portal vein thrombosis (PVT; 13.0%), biliary stenosis and/or fistula (22.2%), bowel perforation (7.0%), and posttransplantation lymphoproliferative disorder (PTLD; 5.3%). Among the cases of OLT for other causes, the median age of recipients was 81 months (range, 11-17 years), which was higher than that for children with BA. Retransplantation was required in 3.5% of these patients (P < .05), mostly due to HAT. The incidences of PVT, bowel perforation, and PTLD were significantly lower (P < .05). There was no significant difference between biliary complications in the 2 groups. The overall survival rates at 1 versus 5 years were 79.7% versus 68.1% for BA, and 81.2% versus 75.7% for other causes, respectively. Conclusions. Children who undergo OLT for BA are younger than those engrafted for other causes, displaying a higher risk of complications and retransplantations.
Abstract:
PURPOSE: To compare the abilities of scanning laser polarimetry (SLP) with enhanced corneal compensation (ECC) and variable corneal compensation (VCC) modes for detection of retinal nerve fiber layer (RNFL) loss in eyes with band atrophy (BA) of the optic nerve. DESIGN: Cross-sectional study. METHODS: Thirty-seven eyes from 37 patients with BA and temporal visual field defect from chiasmal compression and 40 eyes from 40 healthy subjects were studied. Subjects underwent standard automated perimetry and RNFL measurements using an SLP device equipped with VCC and ECC. Receiver operating characteristic (ROC) curves were calculated for each parameter. Pearson correlation coefficients were obtained to evaluate the relationship between RNFL thickness parameters and severity of visual field loss, as assessed by the temporal mean defect. RESULTS: All RNFL thickness parameters were significantly lower in eyes with BA compared with normal eyes with both compensation modes. However, no statistically significant differences were observed in the areas under the ROC curves for the different parameters between GDx VCC and ECC (Carl Zeiss Meditec, Inc, Dublin, California, USA). Structure-function relationships also were similar for both compensation modes. CONCLUSIONS: No significant differences were found between the diagnostic accuracy of GDx ECC and that of VCC for detection of BA of the optic nerve. The use of GDx ECC does not seem to provide a better evaluation of RNFL loss on the temporal and nasal sectors of the peripapillary retina in subjects with BA of the optic nerve.
Abstract:
Background: Birth weight is positively associated with adult bone mass. However, it is not clear if its effect is already evident in early adulthood. Objective: To investigate the association between birth weight, adult body size, the interaction between them, and bone mass in young adults. Methods: Bone densitometry by DXA was performed on 496 individuals (240 men) aged 23-24 years from the 1978/79 Ribeirao Preto (southern Brazil) birth cohort, who were born and still residing in the city in 2002. Birth weight and length as well as adult weight and height were directly measured and converted to z-scores. The influence of birth weight and length, and adult weight and height, on bone area (BA), bone mineral content (BMC) and bone mineral density (BMD) at the lumbar spine, proximal femur and femoral neck was investigated through simple and multiple linear regression models. Adjustments were made for sex, skin color, gestational age, physical activity level, smoking status and dietary consumption of protein, calcium and alcohol. Interaction terms between birth weight and adult weight, and birth length and adult height, were tested. Results: Men in the highest tertile of birth weight distribution had greater BA and BMC at all three bone sites when compared with their counterparts in the lowest tertile (p<0.008). For BMD, this trend was observed only in the lumbar spine. Adult weight and height were positively associated with BA and BMC at all three bone sites (p<0.05). For BMD, these associations were seen for adult weight, but for adult height an association was observed only in the lumbar spine. Birth weight retained positive associations with proximal femur BA and BMC after adjustments for current weight and height. No interaction was observed between variables measuring prenatal growth and adult body size. Conclusion: Birth weight and postnatal growth are independent determinants of adult bone mass in a sample of Brazilian adults.
This effect is already evident in early adulthood. (C) 2010 Elsevier Inc. All rights reserved.
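The modelling step described above, converting raw measurements to z-scores and then regressing a bone outcome on them, can be sketched as follows. All numbers are invented for illustration; the cohort's real covariates and adjustment set are those listed in the abstract.

```python
# Invented data: birth weight (kg) and bone mineral content (g) for 6 adults.
birth_weight = [2.8, 3.1, 3.4, 3.6, 3.9, 4.2]
bmc          = [2450, 2510, 2590, 2620, 2700, 2760]

def zscores(xs):
    """Convert raw values to z-scores (mean 0, sample SD 1)."""
    m = sum(xs) / len(xs)
    sd = (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5
    return [(x - m) / sd for x in xs]

def fit_line(x, y):
    """Ordinary least-squares fit y = a + b*x, closed form."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

z = zscores(birth_weight)
a, b = fit_line(z, bmc)
# b is the estimated change in BMC per 1-SD increase in birth weight,
# which is how z-scored exposures are usually reported.
print(round(b, 1))
```

The multiple-regression case adds further standardized covariates (adult weight, height, and the interaction terms the abstract mentions) to the same least-squares machinery.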