999 results for ba
Abstract:
Systems approaches can help to evaluate and improve the agronomic and economic viability of nitrogen application in frequently water-limited environments. This requires a sound understanding of crop physiological processes and well-tested simulation models. Thus, this experiment on spring wheat aimed to better quantify water × nitrogen effects on wheat by deriving some key crop physiological parameters that have proven useful in simulating crop growth. For spring wheat grown in Northern Australia under four levels of nitrogen (0 to 360 kg N ha(-1)) and either entirely on stored soil moisture or under full irrigation, kernel yields ranged from 343 to 719 g m(-2). Yield increases were strongly associated with increases in kernel number (9150-19950 kernels m(-2)), indicating the sensitivity of this parameter to water and N availability. Total water extraction under a rain shelter was 240 mm with a maximum extraction depth of 1.5 m. A substantial amount of mineral nitrogen available deep in the profile (below 0.9 m) was taken up by the crop. This was the source of nitrogen uptake observed after anthesis. Under dry conditions this late uptake accounted for approximately 50% of total nitrogen uptake and resulted in high (>2%) kernel nitrogen percentages even when no nitrogen was applied. Anthesis LAI values were reduced by 63% under sub-optimal water supply and by 50% under sub-optimal nitrogen supply. Radiation use efficiency (RUE) based on total incident short-wave radiation was 1.34 g MJ(-1) and did not differ among treatments. The conservative nature of RUE was the result of the crop reducing leaf area rather than leaf nitrogen content (which would have affected photosynthetic activity) under these moderate levels of nitrogen limitation. The transpiration efficiency coefficient was also conservative and averaged 4.7 Pa in the dry treatments. Kernel nitrogen percentage varied from 2.08 to 2.42%. The study provides a data set and a basis to consider ways to improve simulation capabilities of water and nitrogen effects on spring wheat. (C) 1997 Elsevier Science B.V.
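As a concrete illustration of the two "conservative" parameters reported above, the sketch below shows one common way to derive radiation use efficiency and the transpiration efficiency coefficient from field measurements. It is a hedged example only: the function names, the Tanner-Sinclair-style formulation of the transpiration coefficient, and all numbers are placeholders, not the authors' code or data.

```python
def radiation_use_efficiency(biomass_g_m2: float, radiation_mj_m2: float) -> float:
    """RUE (g MJ-1): above-ground biomass produced per unit of cumulative
    incident short-wave radiation (the abstract bases RUE on incident,
    not intercepted, radiation)."""
    return biomass_g_m2 / radiation_mj_m2


def transpiration_efficiency_coefficient(biomass_g_m2: float,
                                         transpiration_mm: float,
                                         vpd_pa: float) -> float:
    """Transpiration efficiency coefficient k (Pa), assuming the common
    Tanner-Sinclair form biomass/transpiration = k / VPD, i.e.
    k = (biomass / transpiration) * VPD.  One mm of water equals 1 kg m-2,
    so biomass is converted from g m-2 to kg m-2 to make the ratio
    dimensionless before multiplying by VPD (Pa)."""
    biomass_kg_m2 = biomass_g_m2 / 1000.0
    return (biomass_kg_m2 / transpiration_mm) * vpd_pa


# Placeholder numbers of roughly the right order of magnitude for a wheat crop;
# they are NOT measurements from the study.
print(radiation_use_efficiency(biomass_g_m2=1340.0, radiation_mj_m2=1000.0))   # ~1.34 g MJ-1
print(transpiration_efficiency_coefficient(biomass_g_m2=750.0,
                                            transpiration_mm=240.0,
                                            vpd_pa=1500.0))                    # ~4.7 Pa
```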
Abstract:
Background. Physical inactivity is recognized as an important public health issue. Yet little is known about doctors' knowledge, attitude, skills, and resources specifically relating to the promotion of physical activity. Our survey assessed the current practice, perceived desirable practice, confidence, and barriers related to the promotion of physical activity in family practice. Methods. A questionnaire was developed and distributed to all 1,228 family practitioners in Perth, Western Australia. Results. We received a 71% response (n = 789). Family practitioners are most likely to recommend walking to sedentary adults to improve fitness, and they are aware of the major barriers to patients participating in physical activity. Doctors are less confident at providing specific advice on exercise and may require further skills, knowledge, and experience. Although they promote exercise to patients through verbal advice in the consultation, few use written materials or referral systems. Conclusions. There are significant differences between self-reports of current practice and perceived desirable practice in the promotion of physical activity by doctors. Future strategies need to address the self-efficacy of family physicians and involve resources of proven effectiveness. The potential of referral systems for supporting efforts to increase physical activity by Australians should be explored. (C) 1997 Academic Press.
Abstract:
Empowering front-line staff to deal with service failures has been proposed as a method of recovering from service breakdown and ensuring greater customer satisfaction. However, no empirical study has investigated consumer responses to empowerment strategies. This research investigates the effect on customer satisfaction and service quality of two employee characteristics: the degree to which the employee is empowered (full, limited, and none), and the employee's communication style (accommodative: informal and personal; underaccommodative: formal and impersonal). These employee characteristics are studied within the context of service failures. Subjects were shown videotaped service scenarios and asked to complete satisfaction and service quality ratings. Results revealed that the fully empowered employee produced more customer satisfaction than the other conditions, but only when the service provider used an accommodating style of communication. Fully empowered and nonempowered employees were not judged differently when an underaccommodating style of communication was adopted. (C) 1997 John Wiley & Sons, Inc.
Abstract:
PURPOSE: To compare the ability of Fourier-domain (FD) optical coherence tomography (3D OCT-1000; Topcon, Tokyo, Japan) and time-domain (TD) OCT (Stratus; Carl Zeiss Meditec Inc, Dublin, California, USA) to detect axonal loss in eyes with band atrophy (BA) of the optic nerve. DESIGN: Cross-sectional study. METHODS: Thirty-six eyes from 36 patients with BA and temporal visual field (VF) defect from chiasmal compression and 36 normal eyes were studied. Subjects underwent standard automated perimetry, and macular and retinal nerve fiber layer (RNFL) measurements were taken using 3D OCT-1000 and Stratus OCT. Receiver operating characteristic (ROC) curves were calculated for each parameter. Spearman correlation coefficients were obtained to evaluate the relationship between RNFL and macular thickness parameters and severity of VF loss. Measurements from the two devices were compared. RESULTS: Regardless of OCT device, all RNFL and macular thickness parameters were significantly lower in eyes with BA compared with normal eyes, but no statistically significant difference was found with regard to the area under the ROC curve. Structure-function relationships were also similar for the two devices. In both groups, RNFL and macular thickness measurements were generally smaller, and in some cases significantly smaller, with 3D OCT-1000 than with Stratus OCT. CONCLUSIONS: The introduction of FD technology did not lead to better discrimination ability for detecting BA of the optic nerve compared with TD technology when using the software currently provided by the manufacturer. 3D OCT-1000 FD OCT RNFL and macular measurements were generally smaller than TD Stratus OCT measurements. Investigators should be aware of this fact when comparing measurements obtained with these two devices. (Am J Ophthalmol 2009;147:56-63. (c) 2009 by Elsevier Inc. All rights reserved.)
Abstract:
Otitis media with effusion (OME) affects 28-38% of pre-school children, and it occurs due to dysfunction of the auditory tube. Anatomical development of the auditory tube depends on craniofacial growth and development. Deviations from normal craniofacial morphology and growth, assessed using cephalometric studies, may predict the evolution of otitis. Our goal in this paper is to determine whether there are differences in craniofacial morphology between children with adenoid enlargement, with and without otitis media with effusion. This is a prospective study in which the sample consisted of 67 children (male and female) from 5 to 10 years old. All patients presented chronic upper airway obstruction due to tonsil and adenoid enlargement (>80% degree of obstruction). Thirty-three patients had presented otitis media with effusion for more than 3 months and 34 had not; the latter composed the control group. Standardized lateral head radiographs were obtained for all subjects. Radiographs were taken with the patient positioned in a cephalostat, with the mandible in centric occlusion and lips at rest. Radiographs were digitized and specific landmarks were identified using the computer program Radiocef 2003, 5th edition. Measurements, angles, and lines were taken of the basicranium, maxilla, and mandible according to the modified Ricketts analysis. In addition, facial height and facial axis were determined. Children with otitis media with effusion present differences in the morphology of the face regarding these measures: N-S (anterior cranial base length), N-ANS (upper facial height), ANS-PNS (size of the hard palate), Po-Or.N-Pog (facial depth), Ba-N.Ptm-Gn (facial axis), Go-Me (mandibular length), and Vaia-Vaip (inferior pharyngeal airway). (C) 2008 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Aim To compare the ability of scanning laser polarimetry (SLP) with variable corneal compensation (GDx VCC) and optical coherence tomography (Stratus OCT) to discriminate between eyes with band atrophy (BA) of the optic nerve and healthy eyes. Methods The study included 37 eyes with BA and temporal visual field (VF) defects from chiasmal compression, and 29 normal eyes. Subjects underwent standard automated perimetry (SAP) and retinal nerve fibre layer (RNFL) scans using GDx VCC and Stratus OCT. The severity of the VF defects was evaluated by the temporal mean defect (TMD), calculated as the average of the 22 values of the temporal total deviation plot on SAP. Receiver operating characteristic (ROC) curves were calculated. Pearson's correlation coefficients were used to evaluate the relationship between RNFL thickness parameters and the TMD. Results No significant difference was found between the areas under the ROC curves (AUCs) for GDx VCC and Stratus OCT with regard to average RNFL thickness (0.98 and 0.99, respectively) and the superior (0.94; 0.95), inferior (0.96; 0.97), and nasal (0.92; 0.96) quadrants. However, the AUC in the temporal quadrant (0.77) was significantly smaller (P < 0.001) with GDx VCC than with Stratus OCT (0.98). Lower TMD values were associated with smaller RNFL thickness for most parameters from both instruments. Conclusion Adding VCC resulted in improved SLP performance when evaluating eyes with BA, and both technologies are sensitive in detecting average, superior, inferior, and nasal quadrant RNFL loss. However, GDx VCC still discriminates RNFL loss in the temporal quadrant poorly when compared with Stratus OCT.
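The sketch below illustrates, under stated assumptions, the two quantities the abstract describes: the temporal mean defect computed as the average of the 22 temporal total-deviation values, and an area under the ROC curve for one RNFL parameter. The scikit-learn call and all numbers are stand-ins for whatever software and data the authors actually used.

```python
import numpy as np
from sklearn.metrics import roc_auc_score


def temporal_mean_defect(temporal_total_deviation_db: np.ndarray) -> float:
    """TMD (dB): average of the 22 temporal total-deviation values from SAP."""
    assert temporal_total_deviation_db.size == 22
    return float(temporal_total_deviation_db.mean())


def rnfl_auc(rnfl_thickness_um: np.ndarray, has_band_atrophy: np.ndarray) -> float:
    """AUC for discriminating band-atrophy eyes from normal eyes.
    Thinner RNFL indicates disease, so the sign is flipped before scoring."""
    return roc_auc_score(has_band_atrophy, -rnfl_thickness_um)


# Toy usage with made-up numbers (not the study's measurements):
rng = np.random.default_rng(0)
print(temporal_mean_defect(rng.uniform(-15, 0, size=22)))
labels = np.array([1] * 37 + [0] * 29)                       # 37 BA eyes, 29 normal eyes
thickness = np.concatenate([rng.normal(60, 10, 37), rng.normal(95, 10, 29)])
print(rnfl_auc(thickness, labels))
```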
Abstract:
The epiphytic lichenized fungus Canoparmelia texana was used to monitor atmospheric pollution in the Sao Paulo metropolitan region, SP, Brazil. The cluster analysis applied to the element concentration values confirmed the site groups of different levels of pollution due to industrial and vehicular emissions. In the distribution maps of element concentrations, higher concentrations of Ba and Mn were observed in the vicinity of industries and of a petrochemical complex. The highest concentration of Co, found in lichens from the Sao Miguel Paulista site, is due to the emissions from a metallurgical processing plant that produces this element. For Br and Zn, the highest concentrations could be associated with both vehicular and industrial emissions. Exploratory analyses revealed that the accumulation of toxic elements in C. texana may be of use in evaluating the human risk of cardiopulmonary mortality due to prolonged exposure to ambient levels of air pollution. (c) 2007 Elsevier Ltd. All rights reserved.
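A minimal sketch of the kind of cluster analysis described above, assuming standardised element concentrations and Ward-linkage hierarchical clustering; the site names, element list, and values are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

sites = ["site_A", "site_B", "site_C", "site_D"]
elements = ["Ba", "Mn", "Co", "Br", "Zn"]

# Rows = monitoring sites, columns = element concentrations (arbitrary units).
concentrations = np.array([
    [120.0, 90.0, 0.8, 4.0, 60.0],   # industrial-type profile
    [115.0, 95.0, 0.7, 4.2, 58.0],   # industrial-type profile
    [ 30.0, 20.0, 0.2, 1.0, 15.0],   # background profile
    [ 35.0, 25.0, 0.3, 1.1, 18.0],   # background profile
])

# Standardise each element, then group sites by pollution profile.
z = (concentrations - concentrations.mean(axis=0)) / concentrations.std(axis=0)
groups = fcluster(linkage(z, method="ward"), t=2, criterion="maxclust")
print(dict(zip(sites, groups)))
```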
Abstract:
Particulate matter, especially PM2.5, is associated with increased morbidity and mortality from respiratory diseases. Studies that focus on the chemical composition of the material are frequent in the literature, but those that characterize the biological fraction are rare. The objectives of this study were to characterize samples collected in Sao Paulo, Brazil with respect to the quantity of fungi and endotoxins associated with PM2.5, correlating these with the mass of particulate matter, chemical composition, and meteorological parameters. This was done using Principal Component Analysis (PCA) and multiple linear regression. The results showed that fungi and endotoxins represent a significant portion of PM2.5, reaching average concentrations of 772.23 spores μg(-1) of PM2.5 (SD: 400.37) and 5.52 EU mg(-1) of PM2.5 (SD: 4.51 EU mg(-1)), respectively. Hyaline basidiospores, Cladosporium, and total spore counts were correlated with the Ba/Ca/Fe/Zn/K/Si factor of PM2.5 (p < 0.05). The Pen/Asp genera were correlated with the total mass of PM2.5 (p < 0.05), and colorless ascospores were correlated with humidity (p < 0.05). Endotoxin was positively correlated with atmospheric temperature (p < 0.05). This study has shown that bioaerosol is present in considerable amounts in PM2.5 in the atmosphere of Sao Paulo, Brazil. Some fungi were correlated with soil particle resuspension and with the mass of particulate matter. Therefore, the relative contribution of bioaerosol to PM2.5 should be considered in future studies aimed at evaluating the clinical impact of exposure to air pollution. (C) 2010 Elsevier Ltd. All rights reserved.
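The following sketch shows, purely as an assumption about the workflow, how PCA on the PM2.5 elemental composition followed by multiple linear regression against a bioaerosol count might be set up; the synthetic data, column names, and scikit-learn tooling are illustrative and not taken from the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)

# Synthetic element concentrations (rows = daily samples, columns = elements).
elements = ["Ba", "Ca", "Fe", "Zn", "K", "Si"]
X = rng.lognormal(mean=0.0, sigma=0.5, size=(60, len(elements)))

# PCA on standardised concentrations; the leading components play the role of
# source "factors" (e.g. a soil-resuspension factor loading on Ba/Ca/Fe/Zn/K/Si).
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))

# Synthetic spore counts regressed on the factor scores plus a meteorological
# covariate (relative humidity), mirroring the multiple-regression step.
humidity = rng.uniform(40, 95, size=60)
spores = 500 + 80 * scores[:, 0] + 2 * humidity + rng.normal(0, 50, size=60)

predictors = np.column_stack([scores, humidity])
model = LinearRegression().fit(predictors, spores)
print("R^2:", model.score(predictors, spores))
print("coefficients:", model.coef_)
```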
Abstract:
Objective. The objective of this study was to report our experience with pediatric orthotopic liver transplantation (OLT) with living related donors. Methods. We performed a retrospective chart analysis of 121 living related donor liver transplantations (LRDLT) from June 1998 to June 2010. Results. Indications were biliary atresia (BA; n = 81), primary sclerosing cholangitis (n = 5), alpha-1 antitrypsin deficiency (n = 4), cholestasis (n = 9), fulminant hepatic failure (n = 8), autoimmune hepatitis (n = 2), Alagille syndrome (n = 4), hepatoblastoma (n = 3), tyrosinemia (n = 2), and congenital hepatic fibrosis (n = 3). The age of the recipients ranged from 7 to 174 months (median, 22) and their weights ranged from 6 to 58 kg (median, 10). Forty-nine children (40.5%) weighed <= 10 kg. The grafts included the left lateral segment (n = 108), the left lobe (n = 12), and the right lobe (n = 1). The donors included 71 mothers, 45 fathers, 2 uncles, 1 grandmother, 1 grandfather, and 1 sister, with a median age of 29 years (range, 16-53 years) and a median weight of 68 kg (range, 47-106). Sixteen patients (12.9%) required retransplantation, most commonly due to hepatic artery thrombosis (HAT; n = 13; 10.7%). The other complications were biliary stenosis (n = 25; 20.6%), portal vein thrombosis (PVT; n = 11; 9.1%), portal vein stenosis (n = 5; 4.1%), hepatic vein stenosis (n = 6; 4.9%), and lymphoproliferative disorders (n = 8; 6.6%). The overall survival rate of recipients was 90.3% after 1 year and 75.8% after 3 years. Causes of early death within 1 month were HAT (n = 6), PVT (n = 2), severe graft dysfunction (n = 1), sepsis (n = 1), and intraoperative death in children with acute liver failure (n = 2). Causes of late death included lymphoproliferative disease (n = 3), chronic rejection (n = 2), biliary complications (n = 3), and recurrent disease (n = 3; hepatoblastoma and primary sclerosing cholangitis). Conclusions. Despite the heightened possibility of complications (mainly vascular), LRDLT represented a good alternative to transplantation from cadaveric donors in pediatric populations and was associated with a high survival rate.
Abstract:
Introduction. The use of arterial grafts (AG) in pediatric orthotopic liver transplantation (OLT) is an alternative in cases of poor hepatic arterial inflow, small or anomalous recipient hepatic arteries, and retransplantation (re-OLT) due to hepatic artery thrombosis (HAT). AG have been crucial to the success of the procedure among younger children. Herein we report our experience with AG. Methods. We retrospectively reviewed data from June 1989 to June 2010 on OLT in which an AG was used, analyzing indications, short-term complications, and long-term outcomes. Results. Among 437 pediatric OLT, 58 children required an AG. A common iliac artery interposition graft was used in 57 cases and a donor carotid artery in 1 case. In 38 children the graft was used primarily, including 94% (36/38) in which it was due to poor hepatic arterial inflow. Ductopenia syndromes (n = 14), biliary atresia (BA; n = 11), and fulminant hepatitis (n = 8) were the main preoperative diagnoses among these children. Their mean weight was 18.4 kg and mean age was 68 months. At a mean follow-up of 27 months, multiple-organ failure and primary graft nonfunction (PNF) were the short-term causes of death in 9 children (26.5%). Among the remaining 29 patients, 2 (6.8%) developed early graft thrombosis requiring re-OLT, 5 (17%) developed biliary complications, and 1 (3.4%) had asymptomatic arterial stenosis. In 20 children, a graft was used during retransplantation; the main indication was HAT (75%). BA (n = 15), ductopenia syndromes (n = 2), and primary sclerosing cholangitis (n = 2) were the main diagnoses. Their mean weight was 16.7 kg and mean age was 65 months. At a mean follow-up of 53 months, 7 children died due to multiple-organ failure or PNF. Among the remaining 13 patients, 3 developed biliary complications and 1 had arterial stenosis. No thrombosis was observed. Conclusion. The data suggest that the use of an AG is a useful alternative in pediatric OLT. The technique is safe, with a low risk of thrombosis.
Abstract:
Introduction. Biliary atresia (BA) is the leading indication for orthotopic liver transplantation (OLT) among children. However, there are technical difficulties, including the limited dimensions of anatomical structures, hypoplasia and/or thrombosis of the portal vein, and previous portoenterostomy procedures. Objective. The objective of this study was to present our experience with 239 children with BA who underwent OLT between September 1989 and June 2010, compared with OLT performed for other causes. Methods. We performed a retrospective analysis of patient charts and an analysis of complications and survival. Results. BA was the most common indication for OLT (207/409; 50.6%). The median age of subjects was 26 months (range, 7-192). Their median weight was 11 kg (range, 5-63), with 110 children (53.1%) weighing <= 10 kg. We performed 126 transplantations from cadaveric donors (60.8%) and 81 from living-related donors (LRD) (39.2%). Retransplantation was required for 31 recipients (14.9%), primarily due to hepatic artery thrombosis (HAT; 64.5%). Other complications included the following: portal vein thrombosis (PVT; 13.0%), biliary stenosis and/or fistula (22.2%), bowel perforation (7.0%), and posttransplantation lymphoproliferative disorder (PTLD; 5.3%). Among the cases of OLT for other causes, the median age of recipients was 81 months (range, 11-17 years), which was higher than that for children with BA. Retransplantation was required in 3.5% of these patients (P < .05), mostly due to HAT. The incidences of PVT, bowel perforation, and PTLD were significantly lower (P < .05). There was no significant difference in biliary complications between the 2 groups. The overall survival rates at 1 versus 5 years were 79.7% versus 68.1% for BA, and 81.2% versus 75.7% for other causes, respectively. Conclusions. Children who undergo OLT for BA are younger than those engrafted for other causes and display a higher risk of complications and retransplantation.
Abstract:
PURPOSE: To compare the abilities of scanning laser polarimetry (SLP) with enhanced corneal compensation (ECC) and variable corneal compensation (VCC) modes for detection of retinal nerve fiber layer (RNFL) loss in eyes with band atrophy (BA) of the optic nerve. DESIGN: Cross-sectional study. METHODS: Thirty-seven eyes from 37 patients with BA and temporal visual field defect from chiasmal compression and 40 eyes from 40 healthy subjects were studied. Subjects underwent standard automated perimetry and RNFL measurements using an SLP device equipped with VCC and ECC. Receiver operating characteristic (ROC) curves were calculated for each parameter. Pearson correlation coefficients were obtained to evaluate the relationship between RNFL thickness parameters and severity of visual field loss, as assessed by the temporal mean defect. RESULTS: All RNFL thickness parameters were significantly lower in eyes with BA compared with normal eyes with both compensation modes. However, no statistically significant differences were observed in the areas under the ROC curves for the different parameters between GDx VCC and ECC (Carl Zeiss Meditec, Inc, Dublin, California, USA). Structure-function relationships also were similar for both compensation modes. CONCLUSIONS: No significant differences were found between the diagnostic accuracy of GDx ECC and that of VCC for detection of BA of the optic nerve. The use of GDx ECC does not seem to provide a better evaluation of RNFL loss in the temporal and nasal sectors of the peripapillary retina in subjects with BA of the optic nerve.
Abstract:
Background: Birth weight is positively associated with adult bone mass. However, it is not clear if its effect is already evident in early adulthood. Objective: To investigate the association between birth weight, adult body size, the interaction between them, and bone mass in young adults. Methods: Bone densitometry by DXA was performed on 496 individuals (240 men) aged 23-24 years from the 1978/79 Ribeirao Preto (southern Brazil) birth cohort who were born and still residing in the city in 2002. Birth weight and length as well as adult weight and height were directly measured and converted to z-scores. The influence of birth weight and length, and of adult weight and height, on bone area (BA), bone mineral content (BMC), and bone mineral density (BMD) at the lumbar spine, proximal femur, and femoral neck was investigated through simple and multiple linear regression models. Adjustments were made for sex, skin color, gestational age, physical activity level, smoking status, and dietary consumption of protein, calcium, and alcohol. Interaction terms between birth weight and adult weight, and between birth length and adult height, were tested. Results: Men in the highest tertile of the birth weight distribution had greater BA and BMC at all three bone sites when compared with their counterparts in the lowest tertile (p<0.008). For BMD, this trend was observed only in the lumbar spine. Adult weight and height were positively associated with BA and BMC at all three bone sites (p<0.05). For BMD, these associations were seen for adult weight, but for adult height an association was observed only in the lumbar spine. Birth weight retained positive associations with proximal femur BA and BMC after adjustment for current weight and height. No interaction was observed between variables measuring prenatal growth and adult body size. Conclusion: Birth weight and postnatal growth are independent determinants of adult bone mass in a sample of Brazilian adults. This effect is already evident in early adulthood. (C) 2010 Elsevier Inc. All rights reserved.
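A brief sketch of the regression setup described above, assuming z-scored birth-weight and adult-weight predictors and a birth weight by adult weight interaction term; the variable names and synthetic data are illustrative only, not the cohort's data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 496                                   # cohort size reported in the abstract

birth_weight_z = rng.normal(size=n)
adult_weight_z = rng.normal(size=n)
interaction = birth_weight_z * adult_weight_z

# Synthetic bone mineral content with independent birth and adult effects
# and no true interaction, echoing the pattern of findings in the abstract.
bmc = 60 + 1.5 * birth_weight_z + 4.0 * adult_weight_z + rng.normal(0, 5, size=n)

X = np.column_stack([birth_weight_z, adult_weight_z, interaction])
model = LinearRegression().fit(X, bmc)
print("birth-weight, adult-weight, interaction coefficients:", model.coef_)
```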
Abstract:
Objective: The emergence of periodontal medicine has increased interest in defining the behaviour of peripheral blood cells in periodontitis subjects in comparison with a healthy group. The aim of this study was to evaluate the levels of interleukin (IL)-8, tumour necrosis factor-alpha (TNF-alpha), IL-6, and IL-10 released by Escherichia coli lipopolysaccharide (LPS)-stimulated peripheral blood mononuclear cells (PBMC) obtained from the peripheral blood of chronic periodontitis subjects. Design: PBMC samples were isolated from 19 systemically healthy donors, divided into generalized chronic periodontitis (n = 10) and healthy (n = 9) subjects. Cells were incubated for 24-48 h in 500 μL wells containing RPMI 1640 and stimulated with 1.0 ng/mL of E. coli LPS. Supernatants were used to quantify the amounts of IL-8, TNF-alpha, IL-6, and IL-10 released, using enzyme-linked immunosorbent assay (ELISA). Results: PBMC from periodontitis subjects released higher levels of TNF-alpha and IL-6 than those from healthy subjects (P < 0.05). Conversely, the supernatants of the stimulated PBMC obtained from healthy subjects presented higher amounts of IL-8 than those from periodontitis subjects (P < 0.05). No differences were observed in the levels of IL-10 (P > 0.05) between groups. Conclusion: The results of the present study showed that E. coli LPS-stimulated PBMC from subjects with periodontitis present a different pattern of cytokine release when compared with PBMC from healthy subjects. This phenomenon could have implications locally, in periodontitis, as well as in systemic diseases. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
Hyperplastic polyposis is a loosely defined syndrome initially thought not to confer a clinically important predisposition to colorectal cancer. The aim of the current study was to examine the clinical, histologic, and molecular features of a prospective series of cases meeting a strict definition of the condition. Twelve patients were identified, seven of whom had developed colorectal cancer. Most polyps were hyperplastic, but 11 patients also had polyps containing dysplasia, as serrated adenomas, mixed polyps, or traditional adenomas. The mean percentage of dysplastic polyps was 35% in patients with cancer and 11% in patients without cancer (p < 0.05). Microsatellite instability (MSI) was present in 3 of 47 hyperplastic polyps and two of eight serrated adenomas. Kras was mutated in 8 of 47 hyperplastic polyps and two of eight serrated adenomas. No polyps showed loss of heterozygosity of chromosomes 5q, 1p, or 18q. Two of seven cancers showed a high level of MSI. It is concluded that hyperplastic polyposis is associated with a high risk of colorectal cancer. Hyperplastic polyps are the dominant type of polyp, but most cases have some dysplastic epithelium. A higher proportion of dysplastic polyps is associated with increased cancer risk. Clonal genetic changes are observed in some hyperplastic polyps and serrated adenomas.