31 results for POTENTIAL RISK
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Ingestion of vegetables containing heavy metals is one of the main ways in which these elements enter the human body. Once absorbed, heavy metals are deposited in bone and fat tissues, displacing essential minerals, and are slowly released into the body, where they can cause an array of diseases. This study aimed to investigate the concentrations of cadmium, nickel, lead, cobalt and chromium in the foodstuffs most frequently consumed in São Paulo State, Brazil, and to compare the heavy metal contents with the permissible limits established by Brazilian legislation. Dietary intake of heavy metals was also estimated to assess the risk to human health. Vegetable samples were collected at the São Paulo General Warehousing and Centers Company, and the heavy metal content was determined by atomic absorption spectrophotometry. All sampled vegetables presented average concentrations of Cd and Ni below the permissible limits established by Brazilian legislation, whereas Pb and Cr exceeded the limits in 44% of the analyzed samples. Brazilian legislation does not establish a permissible limit for Co content. Given the consumption habits of the population of São Paulo State, the daily ingestion of heavy metals was below the oral reference dose; therefore, consumption of these vegetables can be considered safe and without risk to human health.
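The safety conclusion rests on comparing estimated daily intake (EDI) with the oral reference dose, a comparison often expressed as a hazard quotient. Below is a minimal sketch of that calculation; the function names and all numeric values are hypothetical illustrations, not figures from the study.

```python
# Sketch of an estimated daily intake (EDI) and hazard quotient (HQ)
# calculation, as commonly used in dietary exposure studies. All numbers
# below are hypothetical placeholders, not values from the study.

def estimated_daily_intake(metal_conc_mg_per_kg: float,
                           daily_consumption_kg: float,
                           body_weight_kg: float = 60.0) -> float:
    """EDI in mg of metal per kg of body weight per day."""
    return metal_conc_mg_per_kg * daily_consumption_kg / body_weight_kg

def hazard_quotient(edi: float, oral_reference_dose: float) -> float:
    """HQ < 1 suggests intake below the level of concern."""
    return edi / oral_reference_dose

# Example: hypothetical Cd concentration of 0.05 mg/kg in lettuce,
# 0.1 kg consumed per day, against a reference dose of 0.001 mg/kg/day.
edi = estimated_daily_intake(0.05, 0.1)
print(f"EDI = {edi:.5f} mg/kg/day, HQ = {hazard_quotient(edi, 0.001):.2f}")
```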
Abstract:
The occupational exposure limits of different risk factors for the development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are relatively unknown. The industrial ergonomist's role is further complicated because the potential risk factors that may contribute to the onset of LBDs interact in a complex manner, making it difficult to discriminate in detail among the jobs that place workers at high or low risk of LBDs. The purpose of this paper was to compare predictions based on the neural network model proposed by Zurada, Karwowski & Marras (1997) with those of a linear discriminant analysis model for classifying industrial jobs according to their potential risk of low back disorders due to workplace design. The results showed that the discriminant analysis model is as effective as the neural network model, while offering cost and time savings for future data gathering.
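As a rough illustration of the modelling comparison, the sketch below fits a linear discriminant classifier to label jobs as high or low risk; the synthetic features stand in for workplace measurements (the actual predictors used by Zurada, Karwowski & Marras are not reproduced here).

```python
# Minimal sketch of a linear discriminant analysis (LDA) classifier for
# labeling jobs as high/low risk of low back disorders. The features and
# data are hypothetical stand-ins for workplace measurements such as
# lifting frequency and trunk motion.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))  # e.g. lift rate, load moment, trunk velocity, twist
y = (X @ np.array([1.0, 0.8, 0.5, 0.3]) + rng.normal(size=200) > 0).astype(int)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```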
Physical and psychosocial risk factors for musculoskeletal disorders in Brazilian and Italian nurses
Abstract:
As part of the international CUPID investigation, we compared physical and psychosocial risk factors for musculoskeletal disorders among nurses in Brazil and Italy. Using questionnaires, we collected information on musculoskeletal disorders and potential risk factors from 751 nurses employed in public hospitals. By fitting country-specific multiple logistic regression models, we investigated the association of stressful physical activities and psychosocial characteristics with site-specific and multisite pain, and associated sickness absence. We found no clear relationship between low back pain and occupational lifting, but neck and shoulder pain were more common among nurses who reported prolonged work with the arms in an elevated position. After adjustment for potential confounding variables, pain in the low back, neck and shoulder, multisite pain, and sickness absence were all associated with somatizing tendency in both countries. Our findings support a role of somatizing tendency in predisposition to musculoskeletal disorders, acting as an important mediator of the individual response to triggering exposures, such as workload.
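A minimal sketch of the country-specific modelling strategy described above: one multiple logistic regression per country, reporting odds ratios as exponentiated coefficients. The variable names and simulated data are assumptions for illustration, not the CUPID questionnaire items.

```python
# Sketch of per-country multiple logistic regression with odds ratios.
# Data are simulated; variable names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 400
df = pd.DataFrame({
    "country": rng.choice(["Brazil", "Italy"], size=n),
    "arms_elevated_work": rng.integers(0, 2, size=n),
    "somatizing_tendency": rng.integers(0, 4, size=n),
})
# Simulate an outcome that depends on both predictors.
logit_p = -1.0 + 0.8 * df["arms_elevated_work"] + 0.4 * df["somatizing_tendency"]
df["neck_shoulder_pain"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Fit one model per country and report odds ratios (exp of coefficients).
for country, sub in df.groupby("country"):
    fit = smf.logit(
        "neck_shoulder_pain ~ arms_elevated_work + somatizing_tendency",
        data=sub,
    ).fit(disp=False)
    print(country, np.exp(fit.params).round(2).to_dict())
```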
Abstract:
Introduction: Nurse understaffing is frequently hypothesized as a potential risk factor for healthcare-associated infections (HAI). This study aimed to evaluate the role of nursing workload in the occurrence of HAI, using the Nursing Activities Score (NAS). Methods: This prospective cohort study enrolled all patients admitted to three medical ICUs and one step-down unit over three months in 2009. Patients were followed up until HAI, discharge or death. Information was obtained from direct daily observation of medical and nursing rounds, chart review and monitoring of the laboratory system. Nursing workload was determined using the NAS, and non-compliance with the nurses' patient care plans (NPC) was identified. Demographic data, clinical severity, invasive procedures, hospital interventions, and the occurrence of other adverse events were also recorded. Patients who developed HAI were compared with those who did not. Results: 195 patients were included and 43 (22%) developed HAI: 16 pneumonia, 12 urinary tract, 8 bloodstream, 2 surgical site, 2 other respiratory infections and 3 other. The average NAS and the average proportion of non-compliance with NPC were significantly higher in HAI patients, who were also more likely to suffer other adverse events. Only excessive nursing workload (OR = 11.41; p = 0.019) and severity of the patient's clinical condition (OR = 1.13; p = 0.015) remained as risk factors for HAI. Conclusions: Excessive nursing workload was the main risk factor for HAI when evaluated together with invasive devices other than mechanical ventilation. To our knowledge, this study is the first to evaluate prospectively nursing workload as a potential risk factor for HAI using the NAS.
Abstract:
OBJECTIVE: Many changes in mucosal morphology are observed following ileal pouch construction, including colonic metaplasia and dysplasia. Additionally, one rare but potential complication is the development of adenocarcinoma of the reservoir. The aim of this study was to evaluate the most frequently observed histopathological changes in ileal pouches and to correlate these changes with potential risk factors for complications. METHODS: A total of 41 patients were enrolled in the study and divided into three groups: a non-pouchitis group (group 1; n = 20; 8 males; mean age: 47.5 years) demonstrating optimal outcome; a pouchitis-without-antibiotics group (group 2; n = 14; 4 males; mean age: 47 years), containing individuals with pouchitis who did not receive antibiotic treatment; and a pouchitis-plus-antibiotics group (group 3; n = 7; 3 males; mean age: 41 years), containing patients with pouchitis who were administered antibiotics. Ileal pouch endoscopy was performed, and tissue biopsy samples were collected for histopathological analysis. RESULTS: Colonic metaplasia was found in 15 (36.6%) of the 41 patients evaluated; of these, five (25%) were from group 1, eight (57.1%) from group 2, and two (28.6%) from group 3. However, no correlation was established between the presence of metaplasia and pouchitis (p = 0.17), and no differences in mucosal atrophy or in the degree of chronic or acute inflammation were observed between groups 1, 2, and 3 (p > 0.45). Moreover, no dysplasia or neoplastic changes were detected. However, the degree of mucosal atrophy correlated with the length of postoperative follow-up (p = 0.05). CONCLUSIONS: The degree of mucosal atrophy, the presence of colonic metaplasia, and the degree of acute or chronic inflammation do not appear to constitute risk factors for the development of pouchitis. Moreover, longer postoperative follow-up times were associated with greater degrees of mucosal atrophy.
Abstract:
This study verified the extent to which cognitive and affective/emotional variables could distinguish caregivers accused of committing physical abuse (G1) from those without physical abuse records (G2). The Child Abuse Potential Inventory (CAP), an instrument designed to assess psychological risk factors in caregivers, was used, along with a socio-demographic questionnaire and an economic classification questionnaire to equate the groups. G1 presented a greater potential risk than G2, with higher levels of Distress, Rigidity, Problems with the Child and with Themselves, and Problems with Others, and a lower level of Ego Strength. These variables contribute to the composition of physical abuse risk since, according to the Social Information Processing Model, they relate to basic cognitive and affective processes underlying the perceptions and evaluations/interpretations associated with abusive parental behavior.
Abstract:
Background: Lower respiratory tract infection (LRTI) is a major cause of pediatric morbidity and mortality, especially in non-affluent communities. In this study we determined the impact of respiratory viruses and how viral co-detections/infections can affect clinical LRTI severity in children in a hospital setting. Methods: Patients younger than 3 years of age admitted to a tertiary hospital in Brazil during the months of high prevalence of respiratory viruses had samples collected by nasopharyngeal aspiration. These samples were tested for 13 different respiratory viruses by real-time PCR. Patients were followed during hospitalization, and clinical data and population characteristics were collected during that period and at discharge to evaluate severity markers, especially length of hospital stay and oxygen use. Univariate regression analyses identified potential risk factors, and multivariate logistic regressions were used to determine the impact of specific viral detections as well as viral co-detections on clinical outcomes. Results: We analyzed 260 episodes of LRTI with a viral detection rate of 85% (n = 222). Co-detection was observed in 65% of all virus-positive episodes. The most prevalent virus was Respiratory Syncytial Virus (RSV) (54%), followed by Human Metapneumovirus (hMPV) (32%) and Human Rhinovirus (HRV) (21%). In the multivariate models, infants with co-detection of HRV + RSV stayed 4.5 extra days (p = 0.004) compared with infants without the co-detection. The same trend was observed for days of supplemental oxygen use. Conclusions: Although RSV remains the main cause of LRTI in infants, our study indicates an increase in the length of hospital stay and oxygen use in infants with HRV detected by real-time PCR compared with those without HRV. Moreover, one can speculate that when HRV is detected simultaneously with RSV there is an additive effect that may be reflected in a more severe clinical outcome. Our study also identified a significant number of children infected by recently identified viruses, such as hMPV and Human Bocavirus (HBoV), a novel finding for poor communities in developing countries.
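The "4.5 extra days" estimate is the kind of number that comes out of a regression of length of stay on a co-detection indicator with adjustment for covariates. A hedged sketch on simulated data follows; variable names such as hrv_rsv_codetect are invented for illustration.

```python
# Sketch of modelling length of stay against a viral co-detection
# indicator, adjusting for a covariate. Data and variable names are
# hypothetical; the coefficient on 'hrv_rsv_codetect' plays the role
# of the "extra days" estimate reported in the abstract.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 260
df = pd.DataFrame({
    "hrv_rsv_codetect": rng.integers(0, 2, size=n),
    "age_months": rng.uniform(0, 36, size=n),
})
df["los_days"] = (4 + 4.5 * df["hrv_rsv_codetect"]
                  - 0.05 * df["age_months"]
                  + rng.normal(scale=2, size=n))

fit = smf.ols("los_days ~ hrv_rsv_codetect + age_months", data=df).fit()
print(fit.params.round(2))  # coefficient on hrv_rsv_codetect ~ 4.5 extra days
```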
Abstract:
Background: Chronic liver disease caused by hepatitis C is a major cause of liver transplantation in developed countries. This article reports the first nationwide population-based survey conducted to estimate the seroprevalence of HCV antibodies and associated risk factors in the urban population of Brazil. Methods: The cross-sectional study was conducted in all Brazilian macro-regions from 2005 to 2009, as a stratified multistage cluster sample of 19,503 inhabitants aged between 10 and 69 years, representing individuals living in all 26 state capitals and the Federal District. Hepatitis C antibodies were detected by a third-generation enzyme immunoassay. Seropositive individuals were retested by polymerase chain reaction and genotyped. Adjusted prevalence was estimated by macro-region. Potential risk factors associated with HCV infection were assessed by calculating crude and adjusted odds ratios, 95% confidence intervals (95% CI) and p values. Population attributable risk was estimated for multiple factors using a case-control approach. Results: The overall weighted prevalence of hepatitis C antibodies was 1.38% (95% CI: 1.12%-1.64%). Prevalence of infection increased in older groups but was similar for both sexes. The multivariate model showed the following to be predictors of HCV infection: age, injected drug use (OR = 6.65), sniffed drug use (OR = 2.59), hospitalization (OR = 1.90), social deprivation indicated by lack of sewage disposal (OR = 2.53), and injection with a glass syringe (OR = 1.52, with a borderline p value). Genotypes 1 (subtypes 1a, 1b), 2b and 3a were identified. The estimated population attributable risk for the ensemble of risk factors was 40%, and approximately 1.3 million individuals would be expected to be anti-HCV-positive in the country. Conclusions: The large estimated absolute number of infected individuals reveals the burden of the disease in the near future, giving rise to costs for the health care system and society at large. The known risk factors explain less than 50% of the infected cases, limiting prevention strategies. Our findings regarding risk behaviors associated with HCV infection show that there is still room for improving strategies to reduce transmission among drug users and nosocomial infection, as well as a need for specific prevention and control strategies targeting individuals living in poverty.
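The "40%" figure is a population attributable risk (PAR) estimate. A minimal sketch of Levin's formula is shown below, using the odds ratio as an approximation of relative risk (reasonable for a rare outcome); the exposure prevalence is a made-up illustration, not the survey's value.

```python
# Sketch of Levin's population attributable risk (PAR) formula, the kind
# of estimate behind the "40%" figure quoted above. The odds ratio stands
# in for the relative risk; the exposure prevalence is hypothetical.

def levin_par(exposure_prevalence: float, relative_risk: float) -> float:
    """Fraction of cases attributable to the exposure in the population."""
    excess = exposure_prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# Example: if 2% of the population injected drugs and OR = 6.65,
# that single factor would account for roughly 10% of infections.
print(f"PAR = {levin_par(0.02, 6.65):.1%}")
```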
Abstract:
Objective: To evaluate the frequency of anti-Toxocara spp. antibodies in a healthy adult population. Methods: The study interviewed 253 blood donors, from 19 to 65 years of age, at a hematology centre in Presidente Prudente, São Paulo, southeast Brazil. A survey was applied to blood donors in order to evaluate the possible factors associated with the presence of antibodies, including individual information (gender and age), socioeconomic information (education level, family income and sanitary facilities) and habits (contact with soil, geophagy, onychophagy and intake of raw/undercooked meat), as well as the presence of dogs or cats in the household. An ELISA test was run for detection of anti-Toxocara spp. IgG antibodies. Bivariate analysis followed by logistic regression was performed to evaluate the potential risk factors associated with seropositivity. Results: The overall prevalence observed in this study was 8.7% (22/253). Contact with soil was the only risk factor associated with the presence of antibodies (P = 0.0178; OR = 3.52; 95% CI = 1.244-9.995). Conclusions: The results of this study reinforce the need to promote preventive public health measures, even for healthy adult individuals, particularly measures related to deworming pets to avoid soil contamination and to hygiene education of the population.
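The bivariate step described above amounts to computing a crude odds ratio with a confidence interval from a 2x2 table. The sketch below uses Woolf's method on hypothetical counts; the numbers are not the study's data.

```python
# Crude odds ratio with a Woolf 95% confidence interval from a 2x2 table.
# The counts are hypothetical placeholders, not the study's data.
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """a = exposed seropositive, b = exposed seronegative,
    c = unexposed seropositive, d = unexposed seronegative."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = (math.exp(math.log(or_) + s * z * se_log) for s in (-1, 1))
    return or_, lo, hi

print("OR = %.2f (95%% CI %.2f-%.2f)" % odds_ratio_ci(15, 85, 7, 146))
```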
Abstract:
The present study aimed to evaluate the interactions of the pesticide Vertimec® 18EC in aquatic ecosystems. Soil plots were contaminated with Vertimec® 18EC at the concentration indicated for strawberry crops (0.125 L of solution per m²). After the contamination, torrential rainfall was simulated and the surface runoff was collected and transferred to mesocosm tanks in five treatments, run in triplicate: (1) control (C); (2) runoff from an uncontaminated plot (UR); (3) runoff from the plot contaminated with Vertimec® 18EC (CR); (4) direct application of Vertimec® 18EC to the water (V); and (5) water samples gathered randomly to verify whether there was contamination between the mesocosms (RS). Water samples from these tanks were also submitted to ecotoxicological tests with Daphnia similis and to analyses of limnological characteristics, in five collection periods over 10 days (240 h). Physical and chemical differences were observed in the water samples, mainly related to increased turbidity, suspended solids and nutrients (nitrogen and phosphate forms). Acute toxicity was observed for the direct application treatment throughout the experimental period, and in some periods for the CR treatment (from 48 h to 168 h). The results suggest that the pesticide did not fully degrade during the study period (10 days) in the direct application treatment, demonstrating that the presence of other substances in the commercial formulation contributes to the maintenance of toxicity. This represents a potential risk for aquatic ecosystems in areas adjacent to where the chemical is applied.
Abstract:
Background: Iron supplementation is commonly recommended to chronic kidney disease patients undergoing hemodialysis (HD). However, iron excess is closely associated with lipid peroxidation, and it is well known that electronegative low-density lipoproteins (LDL(-)) are present at higher plasma concentrations in diseases with high cardiovascular risk such as chronic kidney disease. Thus, the aim of this study was to investigate whether ferritin levels are associated with LDL(-) levels in HD patients. Design: This was a cross-sectional study. Setting: This study was conducted at a private clinic in Rio de Janeiro, Brazil. Patients: The study included 27 HD patients and 15 healthy subjects. Methods and Procedures: Twenty-seven HD patients (14 men, 58.6 ± 10 years, 62.2 ± 51.4 months on dialysis, body mass index: 24.4 ± 4.2 kg/m²) were studied and compared with 15 healthy individuals (6 men, 53.8 ± 15.4 years, body mass index: 24.5 ± 4.3 kg/m²). Serum LDL(-) levels were measured by enzyme-linked immunosorbent assay; ferritin levels by commercially available kits; and tumor necrosis factor-alpha, interleukin-6, monocyte chemoattractant protein-1, and plasminogen activator inhibitor-1 were determined with a multiplex assay kit manufactured by R&D Systems. Results: The HD patients presented higher LDL(-) and tumor necrosis factor-alpha levels (0.15 ± 0.13 U/L and 5.9 ± 2.3 pg/mL, respectively) than healthy subjects (0.07 ± 0.05 U/L and 2.3 ± 1.3 pg/mL, respectively) (P = .0001). The mean ferritin level in HD patients was 1,117.5 ± 610.4 ng/mL, and 90% of patients showed ferritin levels exceeding 500 ng/mL. We found a positive correlation between LDL(-) and ferritin in the patients (r = 0.48; P = .01), and ferritin was a significant contributor to LDL(-) concentrations independent of inflammation. Conclusions: Excess body iron stores in HD patients were associated with signs of increased oxidative stress, as reflected by increased LDL(-) levels.
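The reported association is a Pearson correlation between ferritin and LDL(-). A minimal sketch on fabricated data follows, with scipy.stats.pearsonr doing the work; the simulated means roughly echo the magnitudes quoted above but are not the patient data.

```python
# Pearson correlation between ferritin and LDL(-) on fabricated data.
# Units and magnitudes loosely follow the abstract; values are simulated.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
ferritin = rng.normal(1100, 600, size=27)                      # ng/mL, hypothetical
ldl_minus = 0.0001 * ferritin + rng.normal(0, 0.05, size=27)   # U/L, hypothetical

r, p = pearsonr(ferritin, ldl_minus)
print(f"r = {r:.2f}, p = {p:.3f}")
```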
Abstract:
Background: The CUPID (Cultural and Psychosocial Influences on Disability) study was established to explore the hypothesis that common musculoskeletal disorders (MSDs) and associated disability are importantly influenced by culturally determined health beliefs and expectations. This paper describes the methods of data collection and various characteristics of the study sample. Methods/Principal Findings: A standardised questionnaire covering musculoskeletal symptoms, disability and potential risk factors was used to collect information from 47 samples of nurses, office workers, and other (mostly manual) workers in 18 countries on six continents. In addition, local investigators provided data on economic aspects of employment for each occupational group. Participation exceeded 80% in 33 of the 47 occupational groups, and after pre-specified exclusions, analysis was based on 12,426 subjects (92 to 1,018 per occupational group). As expected, there was high usage of computer keyboards by office workers, while nurses had the highest prevalence of heavy manual lifting in all but one country. There was substantial heterogeneity between occupational groups in economic and psychosocial aspects of work; three- to fivefold variation in awareness of someone outside work with musculoskeletal pain; and more than tenfold variation in the prevalence of adverse health beliefs about back and arm pain and in awareness of terms such as "repetitive strain injury" (RSI). Conclusions/Significance: The large differences in psychosocial risk factors (including knowledge and beliefs about MSDs) between occupational groups should allow the study hypothesis to be addressed effectively.
Abstract:
This study evaluated the presence of fungi and mycotoxins [aflatoxins (AFs), cyclopiazonic acid (CPA), and aspergillic acid] in stored samples of the peanut cultivars Runner IAC Caiapó and Runner IAC 886 over 6 months. A total of 70 pod and 70 kernel samples were directly seeded onto Aspergillus flavus and Aspergillus parasiticus agar for fungal isolation and aspergillic acid detection, and AFs and CPA were analyzed by high-performance liquid chromatography. The results showed a predominance of Aspergillus section Flavi strains, Aspergillus section Nigri strains, Fusarium spp., Penicillium spp. and Rhizopus spp. in both peanut cultivars. AFs were detected in 11.4% of kernel samples of the two cultivars and in 5.7% and 8.6% of pod samples of the Caiapó and 886 cultivars, respectively. CPA was detected in 60.0% and 74.3% of kernel samples of the Caiapó and 886 cultivars, respectively. Co-occurrence of both mycotoxins was observed in 11.4% of kernel samples of the two cultivars. These results indicate a potential risk of aflatoxin production if good storage practices are not applied. In addition, the large number of samples contaminated with CPA and the simultaneous detection of AFs and CPA highlight the need to investigate factors related to the control and co-occurrence of these toxins in peanuts.
Abstract:
To assess the epidemiological potential of Culicidae species in remaining areas of the Brazilian Atlantic Forest, specimens of this family were collected in wild and anthropic environments. A total of 9,403 adult mosquitoes were collected from May 2009 to June 2010. The most prevalent species collected in the wild environment were Anopheles (Kerteszia) cruzii, the Melanoconion section of Culex (Melanoconion), and Aedes serratus, while the most common in the anthropic site were Coquillettidia chrysonotum/albifera, the Culex (Culex) coronator group, and An. (Ker.) cruzii. Mosquito richness was similar between environments, although the abundance of individuals of different species varied. When diversity patterns were compared between environments, anthropic sites exhibited higher richness and evenness, suggesting that environmental stress increased the number of favorable niches for culicids, promoting diversity. The increased abundance of opportunistic species in the anthropic environment enhances contact with culicids that transmit vector-borne diseases.
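Richness, Shannon diversity, and Pielou evenness are the usual quantities behind such comparisons. The sketch below computes H' and J' from abundance counts; the counts are invented placeholders, not the collection data.

```python
# Shannon diversity (H') and Pielou evenness (J') from abundance counts.
# Counts are hypothetical illustrations, not the study's collection data.
import math

def shannon_evenness(counts):
    n = sum(counts)
    h = -sum((c / n) * math.log(c / n) for c in counts if c > 0)
    j = h / math.log(len(counts)) if len(counts) > 1 else 0.0
    return h, j

wild = [1200, 900, 400, 50, 30]        # one dominant species, low evenness
anthropic = [300, 280, 260, 240, 220]  # more even abundances

for name, counts in [("wild", wild), ("anthropic", anthropic)]:
    h, j = shannon_evenness(counts)
    print(f"{name}: H' = {h:.2f}, J' = {j:.2f}")
```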
Abstract:
OBJECTIVE: Differentiation between benign and malignant ovarian neoplasms is essential for creating a system for patient referrals. Therefore, the contributions of the tumor markers CA125 and human epididymis protein 4 (HE4), as well as the Risk of Ovarian Malignancy Algorithm (ROMA) and Risk of Malignancy Index (RMI) values, were considered individually and in combination to evaluate their utility for establishing this type of patient referral system. METHODS: Patients who had been diagnosed with ovarian masses through imaging analyses (n = 128) were assessed for the tumor markers CA125 and HE4, and their ROMA and RMI values were determined. The sensitivity and specificity of each parameter were calculated using receiver operating characteristic curves according to the area under the curve (AUC) for each method. RESULTS: The sensitivities of CA125, HE4, ROMA, and RMI for distinguishing malignant from benign ovarian masses were 70.4%, 79.6%, 74.1%, and 63%, respectively. Among carcinomas, the sensitivities of CA125, HE4, ROMA (pre- and post-menopausal), and RMI were 93.5%, 87.1%, 80%, 95.2%, and 87.1%, respectively. The most accurate numerical values were obtained with RMI, although the four parameters were shown to be statistically equivalent. CONCLUSION: There were no differences in accuracy between CA125, HE4, ROMA, and RMI for differentiating between types of ovarian masses. RMI had the lowest sensitivity but was the most numerically accurate method. HE4 demonstrated the best overall sensitivity for the evaluation of malignant ovarian tumors and the differential diagnosis of endometriosis. All of the parameters demonstrated increased sensitivity when tumors with low malignancy potential were considered low-risk, which may be used as an acceptable assessment method for referring patients to reference centers.
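The ROC analysis described above can be sketched in a few lines: compute the AUC, then pick a cutoff (here via Youden's J) and read off sensitivity and specificity. The scores and labels below are simulated, not patient data.

```python
# ROC/AUC sketch for a single tumor marker. Labels and marker scores are
# simulated placeholders, not patient data from the study.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(3)
y = rng.integers(0, 2, size=128)  # 0 = benign, 1 = malignant
scores = y * rng.normal(60, 30, size=128) + rng.normal(35, 15, size=128)

auc = roc_auc_score(y, scores)
fpr, tpr, thresholds = roc_curve(y, scores)
best = np.argmax(tpr - fpr)  # Youden's J picks a cutoff
print(f"AUC = {auc:.2f}; at cutoff {thresholds[best]:.1f}: "
      f"sensitivity = {tpr[best]:.1%}, specificity = {1 - fpr[best]:.1%}")
```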