994 results for Chemical risk
Abstract:
Background: Regarding children aged ≤10 years, only a few international studies have been conducted to determine the prevalence of and risk factors for back pain. Although studies on older Portuguese children point to a prevalence between 17% and 39%, none exists for this specific age group. Thus, this study was conducted to establish the prevalence of and risk factors for back pain in schoolchildren aged 7–10 years. Methods: A cross-sectional survey among 637 children was conducted. A self-rating questionnaire was used to assess the prevalence and duration of back pain, life habits, school absence, medical treatment and limitation of activities. For posture assessment, photographic records with bio-photogrammetric analysis were used to obtain data on head, acromion and pelvic alignment, horizontal alignment of the scapulae, vertical alignment of the trunk and vertical body alignment. Results: Postural problems were found in 25.4% of the children, especially in the 8- and 9-year-old groups. Back pain occurred in 12.7%, with the highest values among the 7- and 10-year-old children. The probability of back pain increased 7 times when the children presented a history of school absences, 4.3 times when they experienced sleeping difficulties, 4.4 times when school furniture was uncomfortable, 4.7 times if the children perceived an occurrence of parental back pain and 2.5 times when children presented incorrect posture. Conclusions: The combination of school absences, parental pain, sleeping difficulties, inappropriate school furniture and postural deviations in the sagittal and frontal planes seems to confirm the multifactorial aetiology of back pain.
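The risk multipliers reported above (e.g., a 7-fold increase in back pain among children with a history of school absences) are most naturally read as crude odds ratios from a 2x2 exposure-outcome table. The sketch below shows how such a ratio is computed; the counts are purely hypothetical and are not data from the study.

```python
# Minimal sketch: crude odds ratio from a 2x2 table (hypothetical counts).
#                   back pain   no back pain
# school absences       a            b
# no absences           c            d
a, b = 30, 40      # exposed children (history of school absences)
c, d = 51, 516     # unexposed children

odds_exposed = a / b          # odds of back pain among absent children
odds_unexposed = c / d        # odds of back pain among the rest
odds_ratio = odds_exposed / odds_unexposed

print(f"Crude odds ratio: {odds_ratio:.1f}")  # about 7.6 with these made-up counts
```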
Risk Acceptance in the Furniture Sector: Analysis of Acceptance Level and Relevant Influence Factors
Abstract:
Risk acceptance has been broadly discussed in relation to hazardous activities and/or technologies. A better understanding of risk acceptance in occupational settings is also important; however, studies on this topic are scarce. It seems important to understand the level of risk that stakeholders consider sufficiently low, how stakeholders form their opinion about risk, and why they adopt a certain attitude toward risk. Accordingly, the aim of this study is to examine risk acceptance in regard to occupational accidents in furniture industries. The safety climate analysis was conducted through the application of the Safety Climate in Wood Industries questionnaire. Judgments about risk acceptance, trust, risk perception, benefit perception, emotions, and moral values were measured. Several models were tested to explain occupational risk acceptance. The results showed that the level of risk acceptance decreased as the risk level increased. High-risk and death scenarios were assessed as unacceptable. Risk perception, emotions, and trust had an important influence on risk acceptance. Safety climate was correlated with risk acceptance and other variables that influence risk acceptance. These results are important for the risk assessment process in terms of defining risk acceptance criteria and strategies to reduce risks.
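A hedged sketch of how models of risk acceptance of this kind might be compared: an ordinary least-squares regression of acceptance judgements on risk perception, trust and emotion scores. The variable names and synthetic data are assumptions for illustration only, not the study's actual model or results.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic questionnaire scores (illustrative only): higher risk perception
# and stronger negative emotion lower acceptance; trust raises it.
rng = np.random.default_rng(42)
n = 200
risk_perception = rng.uniform(1, 7, n)
trust = rng.uniform(1, 7, n)
emotion = rng.uniform(1, 7, n)
acceptance = (5 - 0.6 * risk_perception + 0.4 * trust - 0.3 * emotion
              + rng.normal(0, 0.5, n))

X = sm.add_constant(np.column_stack([risk_perception, trust, emotion]))
model = sm.OLS(acceptance, X).fit()
print(model.params)      # estimated effect of each predictor on acceptance
print(model.rsquared)    # share of variance in acceptance explained
```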
Abstract:
This study aims to analyse the relationship between safety climate and the level of risk acceptance, as well as its relationship with workplace safety performance. The sample includes 14 companies and 403 workers. The safety climate assessment was performed by the application of the Safety Climate in Wood Industries questionnaire and safety performance was assessed with a checklist. Judgements about risk acceptance were measured through questionnaires together with four other variables: trust, risk perception, benefit perception and emotion. Safety climate was found to be correlated with workgroup safety performance, and it also plays an important role in workers’ risk acceptance levels. Risk acceptance tends to be lower when safety climate scores of workgroups are high, and subsequently, their safety performance is better. These findings seem to be relevant, as they provide Occupational Safety and Health practitioners with a better understanding of workers’ risk acceptance levels and of the differences among workgroups.
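As an illustration of the workgroup-level relationship reported above, the sketch below computes a rank correlation between hypothetical workgroup safety-climate scores and checklist-based safety-performance scores; all values are invented and serve only to show the kind of calculation involved.

```python
from scipy import stats

# Hypothetical per-workgroup averages (illustrative values only).
safety_climate = [3.1, 3.8, 2.9, 4.2, 3.5, 4.0, 2.7, 3.9, 3.3, 4.4]
safety_performance = [62, 78, 55, 88, 70, 81, 50, 84, 66, 90]  # checklist score, %

rho, p_value = stats.spearmanr(safety_climate, safety_performance)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```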
Abstract:
Background: According to the Report on Carcinogens, formaldehyde ranks 25th in overall U.S. chemical production, with more than 5 million tons produced each year. Given its economic importance and widespread use, many people are exposed to formaldehyde environmentally and/or occupationally. Presently, the International Agency for Research on Cancer classifies formaldehyde as carcinogenic to humans (Group 1), based on sufficient evidence in humans and in experimental animals. Many in vitro studies have clearly indicated that formaldehyde can induce genotoxic effects in proliferating cultured mammalian cells. Furthermore, some in vivo studies have found changes in epithelial cells and in peripheral blood lymphocytes related to formaldehyde exposure. Methods: A study was carried out in Portugal with 80 workers occupationally exposed to formaldehyde vapours: 30 workers from a formaldehyde and formaldehyde-based resin production factory and 50 from 10 pathology and anatomy laboratories. A control group of 85 non-exposed subjects was also considered. Exposure assessment was performed by simultaneously applying two air-monitoring techniques: NIOSH Method 2541 and Photo Ionization Detection equipment with simultaneous video recording. Genotoxic effects were evaluated by applying the micronucleus test to exfoliated epithelial cells from the buccal mucosa and to peripheral blood lymphocytes. Results: Time-weighted average concentrations did not exceed the reference value (0.75 ppm) in either of the two occupational settings studied. Ceiling concentrations, on the other hand, were higher than the reference value (0.3 ppm) in both. The frequency of micronuclei in peripheral blood lymphocytes and in epithelial cells was significantly higher in both exposed groups than in the control group (p < 0.001). Moreover, the frequency of micronuclei in peripheral blood lymphocytes was significantly higher in the laboratory group than in the factory workers (p < 0.05). A moderate positive correlation was found between the duration of occupational exposure to formaldehyde (years of exposure) and micronucleus frequency in peripheral blood lymphocytes (r = 0.401; p < 0.001) and in epithelial cells (r = 0.209; p < 0.01). Conclusions: The population studied is subject to long-term exposure with high peak concentrations of formaldehyde. Cumulatively, these two aspects may explain the observed genotoxic effects. The association of these cytogenetic effects with formaldehyde exposure provides important information for the risk assessment process and may also be used to assess health risks for exposed workers.
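The contrast between time-weighted averages and ceiling values can be made concrete with the standard formula TWA = Σ(C_i t_i) / Σ t_i. The sketch below uses invented task-by-task readings; only the reference values (0.75 ppm TWA, 0.3 ppm ceiling) come from the abstract.

```python
# Hypothetical formaldehyde readings for one shift:
# (concentration in ppm, duration in hours). Values are illustrative only.
readings = [(0.12, 3.0), (0.55, 0.5), (0.08, 3.5), (0.90, 1.0)]

twa = sum(c * t for c, t in readings) / sum(t for _, t in readings)
peak = max(c for c, _ in readings)

TWA_REF, CEILING_REF = 0.75, 0.3  # reference values cited in the abstract
print(f"8-h TWA = {twa:.2f} ppm (reference {TWA_REF} ppm)")
print(f"Peak = {peak:.2f} ppm (ceiling reference {CEILING_REF} ppm)")
```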
Abstract:
We compared the indirect immunofluorescence assay (IFA) with Western blot (Wb) as a confirmatory method to detect anti-retroviral antibodies (HIV-1 and HTLV-I/II). Positive and negative HIV-1 and HTLV-I/II serum samples from different risk populations were studied. Sensitivity, specificity, positive and negative predictive values and the kappa index were calculated to assess the efficiency of the IFA versus Wb. The following cell lines were used as a source of viral antigens: H9 (HTLV-IIIB); MT-2 and MT-4 (persistently infected with HTLV-I) and MO-T (persistently infected with HTLV-II). Sensitivity and specificity rates for HIV-1 were 96.80% and 98.60% respectively, while positive and negative predictive values were 99.50% and 92.00% respectively. No differences were found in HIV IFA performance between the various populations studied. As for the IFA HTLV system, the sensitivity and specificity values were 97.91% and 100% respectively, with positive and negative predictive values of 100% and 97.92%. Moreover, the sensitivity of the IFA for HTLV-I/II proved to be higher when the samples were tested simultaneously against both antigens (HTLV-I-MT-2 and HTLV-II-MO-T). The overall IFA efficiency for HIV-1 and HTLV-I/II-MT-2 antibody detection proved to be very satisfactory, with an excellent correlation with Wb (kappa indexes of 0.93 and 0.98, respectively). These results confirm that the IFA is a sensitive and specific alternative method for the confirmatory diagnosis of HIV-1 and HTLV-I/II infection in populations at different levels of risk of acquiring the infection, and suggest that the IFA could be included in the serologic diagnostic algorithm.
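For readers unfamiliar with how these figures relate to one another, here is a minimal sketch computing sensitivity, specificity, predictive values and Cohen's kappa from a confusion matrix of IFA results against Wb as the reference. The counts are hypothetical, chosen only to give values in the same general range as those reported.

```python
# Hypothetical confusion matrix of IFA versus Western blot (Wb as reference).
tp, fn = 121, 4     # Wb-positive samples: IFA positive / IFA negative
fp, tn = 1, 71      # Wb-negative samples: IFA positive / IFA negative
n = tp + fn + fp + tn

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)   # positive predictive value
npv = tn / (tn + fn)   # negative predictive value

# Cohen's kappa: observed agreement corrected for chance agreement.
p_observed = (tp + tn) / n
p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
kappa = (p_observed - p_chance) / (1 - p_chance)

print(sensitivity, specificity, ppv, npv, kappa)
```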
Abstract:
This study was carried out in order to obtain baseline data on the epidemiology of American visceral leishmaniasis and Chagas disease in an indigenous population with whom the government is starting a dwelling improvement programme. Information was collected from 242 dwellings (1,440 people) by means of house-to-house interviews about socio-economic and environmental factors associated with the risk of Leishmania chagasi and Trypanosoma cruzi transmission. A leishmanin skin test was applied to 385 people and 454 blood samples were collected on filter paper in order to detect L. chagasi antibodies by ELISA and IFAT and T. cruzi antibodies by ELISA. T. cruzi seroprevalence was 8.7% by ELISA; L. chagasi seroprevalence was 4.6% and 5.1% by IFAT and ELISA, respectively. ELISA sensitivity and specificity for L. chagasi antibodies were 57% and 97.5%, respectively, as compared to the IFAT. Leishmanin skin test positivity was 19%. The prevalence of L. chagasi infection, defined as a positive result in the three immunodiagnostic tests, was 17.1%. Additionally, 2.7% of the population studied was positive for both L. chagasi and T. cruzi, suggesting a possible cross-reaction. L. chagasi and T. cruzi seropositivity increased with age, while no association with gender was observed. Age (p<0.007), number of inhabitants (p<0.05), floor material (p<0.03) and recognition of the vector (p<0.01) were associated with T. cruzi infection, whilst age (p<0.007) and dwelling improvement (p<0.02) were associated with L. chagasi infection. It is necessary to evaluate the long-term impact of the dwelling improvement programme on these parasitic infections in this community.
Abstract:
Antigenic preparations (saline, methylic, metabolic and exoantigens) of four agents of chromoblastomycosis, Fonsecaea pedrosoi, Phialophora verrucosa, Cladophialophora (Cladosporium) carrionii and Rhinocladiella aquaspersa, were obtained. Partial chemical characterization of these antigenic preparations was performed by determining the levels of total lipids, protein and carbohydrates, and by identifying the main sterols and carbohydrates. Methylic antigens presented the highest lipid contents, whereas metabolic antigens showed the highest carbohydrate content. Total lipid, protein and carbohydrate levels were in the range of 2.33 to 2.00 mg/ml, 0.04 to 0.02 mg/ml and 0.10 to 0.02 mg/ml, respectively, in the methylic antigens, and in the range of 0.53 to 0.18 mg/ml, 0.44 to 0.26 mg/ml and 1.82 to 1.02 mg/ml, respectively, in the saline antigens. Total lipid, protein and carbohydrate contents were in the range of 0.55 to 0.20 mg/ml, 0.69 to 0.57 mg/ml and 10.73 to 5.93 mg/ml, respectively, in the metabolic antigens, and in the range of 0.55 to 0.15 mg/ml, 0.62 to 0.20 mg/ml and 3.55 to 0.42 mg/ml, respectively, in the exoantigens. Phospholipids were not detected in the preparations. The saline and metabolic antigens and the exoantigens presented hexose, and the methylic antigens revealed additional pentose units in their composition. The UV light absorption spectra of the sterols revealed squalene and an ergosterol fraction in the antigens. The characterization of these antigenic preparations may be useful for the serological evaluation of patients with chromoblastomycosis.
Abstract:
A case-control study was conducted to identify risk factors for death from tetanus in the State of Pernambuco, Brazil. Information was obtained from the medical records of 152 cases and 152 controls admitted to the tetanus unit of the State University Hospital, in Recife, from 1990 to 1995. Variables were grouped in three different sets. Crude and adjusted odds ratios, p-values and 95% confidence intervals were estimated. The variables selected in the multivariate analysis in each set were controlled for the effect of those selected in the others. All factors related to disease progression (incubation period, time elapsed between the occurrence of the first tetanus symptom and admission, and period of onset) showed a statistically significant association with death from tetanus. Similarly, signs and/or symptoms occurring on admission or in the following 24 hours (second set: reflex spasms, neck stiffness and respiratory signs/symptoms) and respiratory failure requiring artificial ventilation (third set) were associated with death from tetanus, even when adjusted for the effect of the others.
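To make the idea of crude versus adjusted odds ratios concrete, here is a sketch fitting a logistic regression to synthetic data with two binary predictors; exponentiating the coefficients yields odds ratios adjusted for each other, with 95% confidence intervals. The predictor names and data are assumptions for illustration only, not the study's variables or results.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic case-control-style data (illustrative only).
rng = np.random.default_rng(0)
n = 300
short_incubation = rng.integers(0, 2, n)       # hypothetical binary predictor 1
respiratory_failure = rng.integers(0, 2, n)    # hypothetical binary predictor 2
logit = -2.0 + 1.2 * short_incubation + 1.8 * respiratory_failure
death = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([short_incubation, respiratory_failure]))
fit = sm.Logit(death, X).fit(disp=0)

adjusted_or = np.exp(fit.params[1:])     # odds ratios adjusted for each other
ci_95 = np.exp(fit.conf_int()[1:])       # 95% confidence intervals
print(adjusted_or, ci_95, sep="\n")
```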
Abstract:
The objective of this study was to evaluate the prevalence of and risk factors associated with HCV infection in a group of HIV-seropositive patients. We analyzed the medical records of 1,457 patients. All patients were tested for HCV infection by third-generation ELISA. Whenever possible, a sample from the positive patients was also tested for HCV by PCR. HCV-positive patients were analyzed according to their risk factors for both infections. The prevalence of anti-HCV-positive patients was 17.7% (258 patients). Eighty-two (82) of these patients were also tested by PCR and 81 were positive for HCV (98%). One hundred fifty-one (58.5%) were intravenous drug users (IDU); 42 (16.3%) were sexual partners of HIV patients; 23 (8.9%) were homosexual males; 12 (4.7%) had received blood transfusions; 61 (17.5%) had promiscuous sexual habits; 14 (5.4%) denied any risk factor; and 12 (4.7%) were sexual partners of IDU. Two hundred four patients mentioned only one risk factor. Among them, 28 (10.9%) were sexual partners of HIV-positive patients. Although intravenous drug use was the most important risk factor for co-infection, sexual transmission seemed to contribute to the high HCV seroprevalence in this group of patients.
Abstract:
It is well recognized that professional musicians are at risk of hearing damage due to exposure to high sound pressure levels during music playing. However, it is important to recognize that musicians’ exposure may start early in the course of their training, as students in the classroom and at home. Studies regarding the sound exposure of music students and their hearing disorders are scarce and do not take important influencing variables into account. Therefore, this study aimed to describe the sound level exposure of music students across different music styles and classes, and according to the instrument played. Further, this investigation attempted to analyze the students’ perceptions of exposure to loud music and the consequent health risks, as well as to characterize preventive behaviors. The results showed that music students are exposed to high sound levels in the course of their academic activity. This exposure is potentiated by practice outside the school and other external activities. Differences were found between music styles, instruments, and classes. Tinnitus, hyperacusis, diplacusis, and sound distortion were reported by the students. However, students were not entirely aware of the health risks related to exposure to high sound pressure levels. These findings reflect the importance of starting intervention in relation to noise risk reduction at an early stage, when musicians are commencing their activity as students.
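Sound exposure studies of this kind typically combine equivalent sound levels over several activities into a daily exposure level, L_EX,8h = 10 log10((1/8 h) Σ t_i 10^(L_i/10)), normalised to an 8-hour day (as in ISO 9612). The sketch below applies that standard formula to invented levels and durations; the activity list is hypothetical.

```python
import math

# Hypothetical daily activities of a music student: (LAeq in dB(A), hours).
activities = [(92.0, 2.0),   # ensemble/orchestra class
              (88.0, 1.5),   # individual practice at school
              (85.0, 2.0)]   # practice at home

T0 = 8.0  # reference 8-hour working day
lex_8h = 10 * math.log10(sum((t / T0) * 10 ** (laeq / 10)
                             for laeq, t in activities))

print(f"L_EX,8h = {lex_8h:.1f} dB(A)")  # compare against, e.g., an 85 dB(A) limit
```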
Abstract:
The prevalence of TT virus (TTV) infection was investigated by Polymerase Chain Reaction (PCR) in low- (blood donors and healthy children/adolescents) and high-risk (hemophiliacs) groups from São Paulo, Brazil. Primers based on the untranslated region (UTR) of the viral genome proved to be much more ubiquitous, leading to much higher frequencies for both groups (≥ 81%) than the earlier N22-PCR directed to open reading frame 1 (blood donors, 5.5%, and hemophiliacs, 42.3%). The UTR-PCR also revealed an interesting profile for healthy children/adolescents: very high prevalence in the early years and a significant decrease in male teenagers. The N22-PCR, in turn, demonstrated a higher frequency in hemophiliacs treated with fresh blood products (58%) than in those treated with virus-inactivated clotting factors (9.4%) and blood donors (5.5%).
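Comparing the two primer systems amounts to comparing detection frequencies between groups, which is commonly done with a chi-square test on a contingency table. The sketch below uses hypothetical counts chosen only to mirror the reported percentages for blood donors; it should be read as illustrative, not as the study's analysis.

```python
from scipy.stats import chi2_contingency

# Hypothetical contingency table (illustrative counts, not the study's data):
# rows = primer system, columns = TTV DNA detected / not detected.
table = [[162, 38],    # UTR-PCR positives / negatives (81% positive)
         [ 11, 189]]   # N22-PCR positives / negatives (5.5% positive)

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p_value:.2e}")
```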
Abstract:
The use of appropriate acceptance criteria in the risk assessment process for occupational accidents is an important issue, but one that is often overlooked in the literature, particularly when new risk assessment methods are proposed and discussed. In most cases, there is no information on how or by whom they were defined, or even how companies can adapt them to their own circumstances. Bearing this in mind, this study analysed the problem of the definition of risk acceptance criteria for occupational settings, defining quantitative acceptance criteria for the specific case of the Portuguese furniture industrial sector. The key steps to be considered in formulating acceptance criteria were analysed in the literature review. By applying the identified steps, the acceptance criteria for the furniture industrial sector were then defined. The Cumulative Distribution Function (CDF) of the injury statistics of the industrial sector was identified as the maximum tolerable risk level. The acceptable threshold was defined by adjusting the CDF to the Occupational Safety & Health (OSH) practitioners’ risk acceptance judgement. Adjustments of the acceptance criteria to companies’ safety cultures were exemplified by adjusting the Burr distribution parameters. An example of a risk matrix was also used to demonstrate the integration of the defined acceptance criteria into a risk metric. This work provides substantial contributions to the issue of acceptance criteria for occupational accidents, which may be useful in overcoming the practical difficulties faced by authorities, companies and experts.
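As a sketch of how a cumulative distribution of injury statistics might be turned into acceptance thresholds, the snippet below fits a Burr (Type XII) distribution to synthetic severity data and reads tolerable and acceptable levels off chosen percentiles. The data, percentiles and parameter choices are assumptions for illustration, not the criteria defined in the study.

```python
import numpy as np
from scipy import stats

# Hypothetical severity scores (e.g., lost workdays) for recorded accidents.
rng = np.random.default_rng(1)
severity = rng.lognormal(mean=1.5, sigma=0.8, size=500)

# Fit a Burr (Type XII) distribution to the injury statistics.
c, d, loc, scale = stats.burr12.fit(severity, floc=0)

# Maximum tolerable risk: severity at a chosen CDF percentile (assumed 95%).
max_tolerable = stats.burr12.ppf(0.95, c, d, loc=loc, scale=scale)

# Acceptable threshold: shifted toward a stricter judgement, illustrated here
# by simply taking a lower percentile of the fitted CDF (assumed 80%).
acceptable = stats.burr12.ppf(0.80, c, d, loc=loc, scale=scale)
print(max_tolerable, acceptable)
```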
Abstract:
Barra de Guaratiba is a coastal area of the city of Rio de Janeiro where American visceral leishmaniasis (AVL) is endemic. Although control measures, including the culling of dogs and the use of insecticides, have been applied at this locality, the canine seroprevalence remains at 25%, and during 1995 and 1997 eight autochthonous human cases were notified. In order to evaluate factors related to an increased risk of Leishmania (Leishmania) chagasi infection in dogs, we screened 365 dogs by the anti-Leishmania immunofluorescent antibody test (IFAT) and captured sandflies in the domestic and peridomestic environment. Some variables related to the infection were assessed by uni- and multivariate analysis. The distance of the residence from the forest border, its altitude and the presence of the opossum Didelphis marsupialis in the backyard were found to be predictive factors for L. (L.) chagasi infection in dogs in Barra de Guaratiba. The presence of Lutzomyia longipalpis in the peridomestic environment indicates the possibility of the appearance of new human cases. Our data also suggest the presence of a sylvatic enzootic cycle at this locality.