985 results for safety monitoring
Abstract:
The epiphytic lichenized fungus Canoparmelia texana was used to monitor atmospheric pollution in the Sao Paulo metropolitan region, SP, Brazil. Cluster analysis applied to the element concentration values confirmed the grouping of sites by level of pollution from industrial and vehicular emissions. In the distribution maps of element concentrations, higher concentrations of Ba and Mn were observed in the vicinity of industries and of a petrochemical complex. The highest concentration of Co, found in lichens from the Sao Miguel Paulista site, is attributable to emissions from a metallurgical plant that processes this element. For Br and Zn, the highest concentrations could be associated with both vehicular and industrial emissions. Exploratory analyses revealed that the accumulation of toxic elements in C. texana may be useful for evaluating the human risk of cardiopulmonary mortality due to prolonged exposure to ambient levels of air pollution. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
This was a prospective study carried out over a period of more than 2 years (May/2006-September/2008) with a cohort of 1,099 individuals of both genders, aged 1 year and older, from an area endemic for American visceral leishmaniasis (AVL) in Para state, Brazil. The objective was to analyze the prevalence and incidence of human Leishmania (L.) infantum chagasi infection, as well as the evolutionary dynamics of its previously identified clinical-immunological profiles: (1) asymptomatic infection (AI); (2) symptomatic infection (SI = AVL); (3) sub-clinical oligosymptomatic infection (SOI); (4) sub-clinical resistant infection (SRI); and (5) indeterminate initial infection (III). Infection was diagnosed using the indirect fluorescent antibody test and the leishmanin skin test, with amastigote and promastigote antigens of L. (L.) i. chagasi, respectively. A total of 187 cases of infection were recorded in the prevalence survey (17%), 117 in the final incidence (6.9%), and 304 in the accumulated prevalence (26.7%), distributed across the clinical-immunological profiles as follows: AI, 51.6%; III, 22.4%; SRI, 20.1%; SOI, 4.3%; and SI (=AVL), 1.6%. The major finding regarding the evolution of infection concerned the III profile, from which cases evolved either to the resistant profiles, SRI (21 cases, 30.8%) and AI (30 cases, 44.1%), or to the susceptible SI (=AVL; 1 case, 1.5%); the remaining 16 cases stayed as III until the end of the study. These results support the conclusion that this diagnostic approach may be useful for monitoring human L. (L.) i. chagasi infection in endemic areas and for preventing the high morbidity of severe AVL cases.
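As a quick arithmetic check on the proportions this abstract reports (a minimal sketch; the counts and the one-decimal rounding style are taken directly from the abstract):

```python
# Sanity-check the reported cohort proportions.
cohort = 1099
prevalent_cases = 187

prevalence = round(100 * prevalent_cases / cohort, 1)
print(prevalence)  # 17.0, matching the reported 17%

# The five clinical-immunological profile percentages should sum to 100%.
profiles = {"AI": 51.6, "III": 22.4, "SRI": 20.1, "SOI": 4.3, "SI": 1.6}
print(round(sum(profiles.values()), 1))  # 100.0
```

The profile percentages are internally consistent, which supports the 304-case accumulated-prevalence denominator used in the abstract.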
Abstract:
Group C rotavirus (GpCRV) has a worldwide distribution; however, its epidemiology and ecology are still unclear. A possible zoonotic role has recently been postulated for strains from Brazilian children. The aim of this study was to monitor GpCRV in children <= 15 years of age with acute gastroenteritis during the 2007-2010 national Brazilian rotavirus surveillance, and to undertake the molecular characterization of the major VP6 capsid protein. A total of 3,019 fecal samples were first screened for Group A rotavirus (GpARV). The 2,205 GpARV ELISA-negative samples were tested further for the presence of GpCRV by SDS-PAGE, electron microscopy, and RT-PCR for the VP6 gene. The genetic diversity of GpCRV was assessed by sequencing the VP6 gene. GpARV and GpCRV infections were detected in 24.6% (742/3,019) and 0.3% (8/3,019) of samples, respectively. The GpCRV detection rate increased from 0.2% (1/422) in 2007 to 1% (7/708) in 2008, and no GpCRV cases were detected in 2009 and 2010. Phylogenetic analysis indicated that the strains belonged to the human lineage and showed a genetic relationship with a GpCRV strain isolated in Japan in 2009. None of the study sequences was closely related to animal GpCRV strains. This study provides further evidence that GpCRV is a minor cause of acute childhood gastroenteritis in Brazil, and does not suggest that GpCRV will assume epidemiological importance in the future, even after the introduction of a GpARV vaccine. In addition, the molecular analyses of the GpCRV samples in this study do not support the zoonotic hypothesis. J. Med. Virol. 83: 1631-1636, 2011. (C) 2011 Wiley-Liss, Inc.
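The detection rates quoted above can be reproduced from the stated counts (a minimal check in Python; all numerators and denominators come from the abstract itself):

```python
# Verify the rotavirus detection rates against the reported fractions.
total_samples = 3019
gparv_positive = 742
gpcrv_positive = 8

print(round(100 * gparv_positive / total_samples, 1))  # 24.6
print(round(100 * gpcrv_positive / total_samples, 1))  # 0.3

# Year-by-year GpCRV detection rates.
print(round(100 * 1 / 422, 1))  # 0.2 (2007)
print(round(100 * 7 / 708, 1))  # 1.0 (2008)
```

All four percentages match the abstract to one decimal place.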
Abstract:
Objectives We studied the relationship between changes in body composition and changes in blood pressure levels. Background The mechanisms underlying the frequently observed progression from pre-hypertension to hypertension are poorly understood. Methods We examined 1,145 subjects from a population-based survey at baseline in 1994/1995 and at follow-up in 2004/2005. First, we studied individuals pre-hypertensive at baseline who, during 10 years of follow-up, either had normalized blood pressure (PreNorm, n = 48), persistently had pre-hypertension (PrePre, n = 134), or showed progression to hypertension (PreHyp, n = 183). In parallel, we studied predictors for changes in blood pressure category in individuals hypertensive at baseline (n = 429). Results After 10 years, the PreHyp group was characterized by a marked increase in body weight (+5.71% [95% confidence interval (CI): 4.60% to 6.83%]) that was largely the result of an increase in fat mass (+17.8% [95% CI: 14.5% to 21.0%]). In the PrePre group, both the increases in body weight (+1.95% [95% CI: 0.68% to 3.22%]) and fat mass (+8.09% [95% CI: 4.42% to 11.7%]) were significantly less pronounced than in the PreHyp group (p < 0.001 for both). The PreNorm group showed no significant change in body weight (-1.55% [95% CI: -3.70% to 0.61%]) or fat mass (+0.20% [95% CI: -6.13% to 6.52%], p < 0.05 for both, vs. the PrePre group). Conclusions After 10 years of follow-up, hypertension developed in 50.1% of individuals with pre-hypertension, and only 6.76% went from hypertensive to pre-hypertensive blood pressure levels. An increase in body weight and fat mass was a risk factor for the development of sustained hypertension, whereas a decrease was predictive of a decrease in blood pressure. (J Am Coll Cardiol 2010; 56: 65-76) (C) 2010 by the American College of Cardiology Foundation
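The 50.1% progression figure in the conclusions follows directly from the three group sizes given in the methods (a minimal reconstruction; group sizes are those stated in the abstract):

```python
# Reconstruct the progression rate from the baseline pre-hypertensive groups.
pre_norm = 48   # normalized blood pressure at follow-up
pre_pre = 134   # persistently pre-hypertensive
pre_hyp = 183   # progressed to hypertension

total_prehypertensive = pre_norm + pre_pre + pre_hyp  # 365 at baseline
progression_pct = round(100 * pre_hyp / total_prehypertensive, 1)
print(progression_pct)  # 50.1
```

So "hypertension developed in 50.1% of individuals with pre-hypertension" is 183 of 365 baseline pre-hypertensives.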
Abstract:
Objectives This prospective study evaluated the association of obesity and hypertension with left atrial (LA) volume over 10 years. Background Although left atrial enlargement (LAE) is an independent risk factor for atrial fibrillation, stroke, and death, little information is available about determinants of LA size in the general population. Methods Participants (1,212 men and women, age 25 to 74 years) originated from a sex- and age-stratified random sample of German residents of the Augsburg area (MONICA S3). Left atrial volume was determined by standardized echocardiography at baseline and again after 10 years. Left atrial volume was indexed to body height (iLA). Left atrial enlargement was defined as iLA >= 35.7 and >= 33.7 ml/m in men and women, respectively. Results At baseline, the prevalence of LAE was 9.8%. Both obesity and hypertension were independent predictors of LAE, obesity (odds ratio [OR]: 2.4; p < 0.001) being numerically stronger than hypertension (OR: 2.2; p < 0.001). Adjusted mean values for iLA were significantly lower in normal-weight hypertensive patients (25.4 ml/m) than in obese normotensive individuals (27.3 ml/m; p = 0.016). The highest iLA was found in the obese hypertensive subgroup (30.0 ml/m; p < 0.001 vs. all other groups). This group also presented with the highest increase in iLA (+6.0 ml/m) and the highest incidence (31.6%) of LAE at follow-up. Conclusions In the general population, obesity appears to be the most important risk factor for LAE. Given the increasing prevalence of obesity, early interventions, especially in young obese individuals, are essential to prevent premature onset of cardiac remodeling at the atrial level. (J Am Coll Cardiol 2009; 54: 1982-9) (C) 2009 by the American College of Cardiology Foundation
Abstract:
BACKGROUND: The most common laparoscopic complications are associated with trocar insertion. The purpose of this study was to develop an objective method of evaluating the safety profile of various access devices used in laparoscopic surgery. STUDY DESIGN: In 20 swine, 6 bladed and 2 needle access devices were evaluated. A force profile was determined by measuring the force required to drive the trocar or needle through the fascia and into the peritoneum, at 0 and 10 mmHg. The amount of tissue deformation, the length of blade exposed, and the duration of exposure were measured using a high-speed digital imaging system. RESULTS: The needle system without the sheath required the least driving force and had the most favorable force profile. In contrast, the bladed, nonretractable trocar system required a higher driving force and showed a rapid loss of resistance. Insertion under a pneumoperitoneum did not significantly alter the force profile of the various access devices except for the amount of tissue deformation. With the bladed system, the blade itself was exposed for an average of 0.5 to 1.0 seconds over a distance of 4.5 to 5.0 cm. In comparison, the needle system was exposed for 0.2 seconds over a distance of 1.8 cm. CONCLUSIONS: We developed a reproducible method of measuring the forces required to place the access systems, their pattern of resistance loss, and the characteristics of the blade exposure. These parameters may provide an adjunctive and objective measurement of safety, allowing for more direct comparison between various trocar designs. (J Am Coll Surg 2009;209:222-232. (C) 2009 by the American College of Surgeons)
Abstract:
Blood pressure (BP) measurement is the basis for the diagnosis and management of arterial hypertension. The aim of this study was to compare BP measurements performed in the office and at home (home blood pressure monitoring, HBPM) in children and adolescents with chronic arterial hypertension. HBPM was performed by the patient or by his/her legal guardian. During a 14-day period, three BP measurements were performed in the morning or in the afternoon (daytime measurement) and three in the evening (night-time measurement), with 1-min intervals between measurements, totalling six measurements per day. HBPM was defined for both systolic blood pressure (SBP) and diastolic blood pressure (DBP) values. HBPM was evaluated in 40 patients (26 boys), mean age 12.1 years (4-18 years). SBP and DBP records were analysed. The mean differences between average HBPM and doctor's office BP were 0.6 +/- 14 and 4 +/- 13 mm Hg for SBP and DBP, respectively. Average systolic HBPM (daytime and night-time) did not differ from average office BP, whereas diastolic HBPM (daytime and night-time) was statistically lower than office BP. Comparison of individual BP measurements across the study period (13 days) using the s.d. of differences showed a significant decline only for DBP values from day 5 onward, with the difference tending to disappear toward the end of the study. Mean daytime and night-time SBP and DBP values remained stable throughout the study period, confirming HBPM as an acceptable methodology for BP evaluation in hypertensive children and adolescents. Journal of Human Hypertension (2009) 23, 464-469; doi:10.1038/jhh.2008.167; published online 12 March 2009
Abstract:
Introduction: Pediatric percutaneous renal biopsy (Bx) is a routine procedure in pediatric nephrology to obtain renal tissue for histological study. We evaluated the safety, efficacy, indications and renal findings of this procedure at a tertiary care pediatric university hospital and compared our findings with the literature. Methods: Retrospective study based on medical records from January 1993 to June 2006. Results: In the study period, 305 Bx were performed in 262 patients, 127 (48.5%) male, aged 9.8 +/- 4.2 years. A 16-gauge needle was used in 56/305 Bx and an 18-gauge needle in 252/305 Bx (82.6%). Of the Bx, 56.1% were performed under sedation plus local anesthesia and 43.9% under general anesthesia. The number of punctures per Bx was 3.1 +/- 1.3. Minor complications occurred in 8.6% of procedures. The 16-gauge needle caused a higher frequency of renal hematomas (p = 0.05). The number of glomeruli per puncture was >= 5 in 96.7% and >= 7 in 92% of cases. Neither the number of glomeruli per puncture nor the frequency of complications differed according to the type of anesthesia used. A renal pathology diagnosis was achieved in 93.1% of Bx. The main indications for Bx were nephrotic syndrome (NS), lupus nephritis (LN) and hematuria (HE). The diagnoses of minimal change disease (MCD) (61.3%), class V (35.6%) and IgA nephropathy (26.3%) predominated in NS, LN and HE patients, respectively. Conclusion: Pediatric real-time ultrasound-guided percutaneous renal biopsy was safe and effective. The main clinical indications for Bx were NS and LN, and the predominant renal pathology diagnoses were MCD and class V LN.
Abstract:
Introduction: The ACCM/PALS guidelines address early correction of paediatric septic shock using conventional measures. In the evolution of these recommendations, indirect measures of the balance between systemic oxygen delivery and demand, using central venous or superior vena cava oxygen saturation (ScvO(2) >= 70%) in a goal-directed approach, have been added. However, while these additional goal-directed endpoints are based on evidence from adult studies, their extrapolation to the paediatric patient remains unvalidated. Objective: The purpose of this study was to compare treatment according to ACCM/PALS guidelines, performed with and without ScvO(2) goal-directed therapy, on the morbidity and mortality of children with severe sepsis and septic shock. Design, participants and interventions: Children and adolescents with severe sepsis or fluid-refractory septic shock were randomly assigned to ACCM/PALS with or without ScvO(2) goal-directed resuscitation. Measurements: Twenty-eight-day mortality was the primary endpoint. Results: Of the 102 enrolled patients, 51 received ACCM/PALS with ScvO(2) goal-directed therapy and 51 received ACCM/PALS without it. ScvO(2) goal-directed therapy resulted in lower mortality (28-day mortality 11.8% vs. 39.2%, p = 0.002) and fewer new organ dysfunctions (p = 0.03). ScvO(2) goal-directed therapy resulted in more crystalloid (28 (20-40) vs. 5 (0-20) ml/kg, p < 0.0001), blood transfusion (45.1% vs. 15.7%, p = 0.002) and inotropic (29.4% vs. 7.8%, p = 0.01) support in the first 6 h. Conclusions: This study supports the current ACCM/PALS guidelines. Goal-directed therapy using the endpoint of a ScvO(2) >= 70% has a significant and additive impact on the outcome of children and adolescents with septic shock.
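With 51 patients per arm, the reported 28-day mortality rates imply whole-number death counts (a minimal sketch; the absolute counts below are inferred from the percentages, not stated in the abstract):

```python
# Back out the death counts implied by the 28-day mortality rates.
n_per_arm = 51
goal_directed_rate = 11.8  # % mortality, ScvO2 goal-directed arm
control_rate = 39.2        # % mortality, conventional arm

deaths_goal = round(n_per_arm * goal_directed_rate / 100)  # 6
deaths_control = round(n_per_arm * control_rate / 100)     # 20
print(deaths_goal, deaths_control)  # 6 20

# Round-trip check: the inferred counts reproduce the reported rates.
print(round(100 * deaths_goal / n_per_arm, 1))     # 11.8
print(round(100 * deaths_control / n_per_arm, 1))  # 39.2
```

The percentages are consistent with 6 vs. 20 deaths per 51-patient arm, the only integer counts that round to the reported rates.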