96 results for Health Sciences, Occupational Health and Safety | Health Sciences, Toxicology
Abstract:
Objective: To assess the indoor environment of two different types of dental practices with regard to VOC, PM2.5, and ultrafine particulate concentrations and to examine the relationship between specific dental activities and contaminant levels. Method: The indoor environments of two selected dental settings (a private practice and a community health center) were assessed for VOC, PM2.5, and ultrafine particulate concentrations, as well as other indoor air quality parameters (CO2, CO, temperature, and relative humidity). The sampling duration was four working days for each dental practice. Continuous monitoring and integrated sampling methods were used, and the number of occupants and the frequency, type, and duration of dental procedures or activities were recorded. Measurements were compared to indoor air quality standards and guidelines. Results: The private practice had higher CO2, CO, and most VOC concentrations than the community health center, but the community health center had higher PM2.5 and ultrafine PM concentrations. Concentrations of p-dichlorobenzene and PM2.5 exceeded some guidelines. Outdoor concentrations greatly influenced indoor concentrations. There were no significant differences in contaminant levels between the operatory and the general area. Indoor concentrations during the working period were not consistently higher than during the nonworking period. Peaks in particulate matter concentration occurred during root canal and composite procedures.
Abstract:
The study aim was to determine whether using automated side loader (ASL) trucks in higher proportions than other types of trucks for residential waste collection results in lower injury rates (from all causes). The primary hypothesis was that the risk of injury was lower for workers who operate ASL trucks than for workers who operate the other types of trucks used in residential waste collection. To test this hypothesis, data were collected from one of the nation's largest companies in the solid waste management industry. Different local operating units (i.e., facilities) in the company used different types of trucks to varying degrees, which created a special opportunity to examine refuse collection injuries and illnesses and the risk-reduction potential of ASL trucks.
The study design was ecological and analyzed end-of-year data provided by the company for calendar year 2007. During 2007, a total of 345 facilities provided residential services. Each facility represented one observation.
The dependent variable, injury and illness rate, was defined as a facility's total case incidence rate (TCIR) recorded in accordance with federal OSHA requirements for the year 2007. The TCIR is the rate of total recordable injury and illness cases per 100 full-time workers. The independent variable, percent of ASL trucks, was calculated by dividing the number of ASL trucks by the total number of residential trucks at each facility.
Multiple linear regression models were estimated for the impact of the percent of ASL trucks on TCIR per facility. Adjusted analyses included three covariates: median number of hours worked per week for residential workers, median number of months of work experience for residential workers, and median age of residential workers. All analyses were performed with the statistical software Stata IC (version 11.0).
The analyses included three approaches to classifying the exposure, percent of ASL trucks. The first approach included two levels of exposure: (1) 0% and (2) >0 to <100%. The second approach included three levels of exposure: (1) 0%, (2) ≥1 to <100%, and (3) 100%. The third approach included six levels of exposure to improve detection of a dose-response relationship: (1) 0%, (2) 1 to <25%, (3) 25 to <50%, (4) 50 to <75%, (5) 75 to <100%, and (6) 100%. None of the relationships between injury and illness rate and percent ASL truck exposure levels was statistically significant (p < 0.05), even after adjustment for all three covariates.
In summary, the present study found some evidence of a risk-reduction effect of ASL trucks, but it was not statistically significant. The covariates demonstrated a varied yet more modest impact on the injury and illness rate, but again, none of the relationships between the injury and illness rate and the covariates was statistically significant (p < 0.05). As an ecological study, the present study has the limitations inherent in such designs and warrants replication in an individual-level cohort design. Stronger conclusions are not warranted.
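For concreteness, here is a minimal Python sketch of the two facility-level variables and the six-level exposure classification described above. The function names, example counts, and hours are illustrative placeholders, not the company's data; the 200,000-hour normalization is the standard OSHA convention for rates per 100 full-time workers.

```python
# Sketch of the facility-level variables in the ASL truck study.
# Example values are placeholders, not the study's data.
def tcir(recordable_cases: int, hours_worked: float) -> float:
    """OSHA total case incidence rate per 100 full-time workers."""
    return recordable_cases * 200_000 / hours_worked

def pct_asl(asl_trucks: int, total_residential_trucks: int) -> float:
    """Percent of residential trucks that are automated side loaders."""
    return 100.0 * asl_trucks / total_residential_trucks

def exposure_level_six(pct: float) -> int:
    """Six-level exposure classification used to probe for dose-response."""
    if pct == 0:
        return 1
    if pct == 100:
        return 6
    return 2 + int(pct // 25)  # 1-<25 -> 2, 25-<50 -> 3, 50-<75 -> 4, 75-<100 -> 5

print(tcir(12, 400_000))                    # 6.0 cases per 100 full-time workers
print(exposure_level_six(pct_asl(6, 10)))   # 60% ASL -> level 4
```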
Abstract:
The primary objective of this study was to determine whether there is a change in permeation rates when limited-use protective fabrics undergo repeated exposure and wash cycles. The null hypothesis was that no substantial change in permeation takes place after the test material is subjected to repeated contact with a strong acid or base and has undergone repeated wash cycles.
The materials tested were DuPont Tychem® CPF 3 and CPF 4 fabrics. The challenge chemicals were ninety-eight percent sulfuric acid and fifty percent sodium hydroxide. Permeation testing was conducted using ASTM F739-99a, Standard Test Method for Resistance of Protective Clothing Materials to Permeation by Liquids or Gases Under Conditions of Continuous Contact.
In this study, no change in permeation rates of either challenge chemical was detected for CPF 3 or CPF 4 limited-use protective fabrics after repeated exposure and wash cycles. Certain unexposed areas of the fabric suffered structural degradation unrelated to chemical exposure, which may be due to the multiple washings.
Abstract:
Results from epidemiologic studies suggest that persons working in occupations with presumed electric and magnetic field (EMF) exposures are at increased risk of brain cancer. This study utilized data from a completed, population-based, interview case-control study of central nervous system (CNS) tumors and employment in the petrochemical industry to test the hypothesis that employment in EMF-related occupations increases CNS tumor risk. A total of 375 male residents of the Texas-Louisiana Gulf Coast area, aged 20 to 79, with primary neuroglial CNS tumors diagnosed during the period 1980-84 were identified. A population-based comparison group of 450 males matched on age, race, and geography was selected. Occupational histories and potential risk factor data were collected via personal interviews with study subjects or their next of kin.
Adjusted odds ratios were less than 1.0 for persons ever employed in an electrical occupation (OR = 0.65; 95% CI = 0.40-1.09) or whose usual occupation was electrical (OR = 0.76; 95% CI = 0.33-1.73). Relative risk estimates did not increase significantly as time since first employment or duration of employment increased. Examination of CNS tumor risk by high (OR = 0.80), medium (OR = 0.88), and low (OR = 0.45) exposure categories for persons whose usual occupation was electrical did not indicate a dose-response pattern. In addition, the mean age of exposed cases was not significantly younger than that of unexposed cases. Analysis of risk by probability of exposure to EMFs showed non-significant elevations in the adjusted odds ratio for definitely exposed workers defined by their usual occupation (OR = 1.78; 95% CI = 0.70-4.51) and by ever/never employed status (OR = 1.54; 95% CI = 0.17-4.91).
These findings suggest that employment in occupations with presumed EMF exposures does not increase CNS tumor risk as suggested by previous investigations. The results of this study also do not support the EMF-tumor promotion hypothesis.
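As an illustration of the kind of estimates reported above, the following Python sketch computes a crude odds ratio with a 95% confidence interval using the standard log (Woolf) method. The counts are invented for the example; the study's adjusted odds ratios came from models controlling for matching factors, which this crude calculation omits.

```python
# Crude odds ratio and 95% CI for an ever/never-exposed case-control table.
# Counts below are invented placeholders, not the study's data.
import math

def odds_ratio_ci(exp_cases, unexp_cases, exp_controls, unexp_controls):
    or_ = (exp_cases * unexp_controls) / (unexp_cases * exp_controls)
    se_log = math.sqrt(1 / exp_cases + 1 / unexp_cases
                       + 1 / exp_controls + 1 / unexp_controls)
    return or_, or_ * math.exp(-1.96 * se_log), or_ * math.exp(1.96 * se_log)

# Example: 30 exposed / 345 unexposed cases vs. 50 exposed / 400 unexposed controls
print(odds_ratio_ci(30, 345, 50, 400))  # OR ~0.70 with its 95% CI
```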
Abstract:
Workplace wellness programs have demonstrated substantial benefits for both the employer and the employee. Examples include decreases in absenteeism, turnover rate, and medical claims, and increases in employee satisfaction, productivity, and return on investment. However, the implementation approach requires close attention, since the program design and the financial and/or non-financial incentives chosen have been shown to significantly affect employee participation and thus the savings the organization realizes. A systematic review was conducted to evaluate the overall effectiveness of workplace wellness programs on employee health status and lifestyle change, to identify the most common types of returns observed from such programs, and to determine whether financial or non-financial incentives had a greater effect on the employee. Overall, employee health status improved with participation in wellness programs. The dominant indirect benefit for the organization was employee weight loss leading to a decrease in absenteeism; direct benefits included decreases in medical claims and increases in return on investment. In general, participation rates and health status changes were most influenced when a financial incentive was provided in the wellness program. Providing a program with effective incentives depends on the employer's engagement at every level of the organization in planning, implementing, and strategizing the optimal approach for improving employees' wellbeing and productivity, and thus the organization's overall returns.
Abstract:
Background: As obesity increases among U.S. workers, employers are implementing programs to increase physical activity and improve diets. Although programs addressing individual determinants of obesity have been evaluated, less is known about the effects of workplace programs that change environmental factors, because most reviews have not isolated environmental programs; the one that did was published in 2005.
Objective: To update the 2005 review to determine the effectiveness of workplace environmental interventions.
Methods: The Medline database was searched for published English-language reports (2003-2011) of randomized controlled trials (RCTs) or quasi-experimental trials (NRCTs) that evaluated strategies to modify physical activity opportunities or food services, targeted employees at least 18 years old (excluding retirees), and provided data for at least one physical activity, dietary, or health risk indicator. Three coders independently extracted study characteristics and scored the quality of study methods. Program effectiveness was determined using the 2005 review's best-evidence approach.
Results: Seven studies reported in nine publications met the eligibility criteria; three focused on diet and the remainder targeted both diet and physical activity. All but one study received a high quality score for internal validity. The evidence for the effectiveness of workplace environmental interventions was, at best, inconclusive for diet and physical activity and limited for health risk indicators. The outcome constructs were inconsistent across the studies.
Conclusions: Limitations carried over from the methods of the 2005 review made it challenging to draw conclusions for this review. These include variation in outcome measures, reliance on distal measures without proximal behavior-change measures, no distinction between changes at the workplace and outside it, and inappropriate analyses of cluster designs that biased findings toward statistical significance. The best-evidence approach relied on vote counting, using statistical significance alone rather than effect sizes and confidence intervals. Future research should address these limitations and use more rigorous methods; systematic reviews should use meta-analytic methods to summarize study findings. These recommendations will help employers better understand how environmental modifications in the workplace can support their efforts to combat the effects of obesity among employees.
Abstract:
Personnel involved in natural or man-made disaster response and recovery efforts may be exposed to a wide variety of physical and mental stressors that can produce long-lasting and detrimental psychopathological outcomes. In a disaster situation, huge numbers of "secondary" responders can be involved in contaminant clean-up and debris removal and can be at risk of developing stress-related mental health outcomes. The Occupational Safety and Health Administration (OSHA) worker training hierarchy typically required for response workers, known as "Hazardous Waste Operations and Emergency Response" (HAZWOPER), does not address the mental health and safety concerns of workers. This study focused on the prevalence of traumatic stress experienced by secondary responders who had received, or expressed interest in receiving, HAZWOPER training through the National Institute of Environmental Health Sciences Worker Education and Training Program (NIEHS WETP).
The study involved the modification of two preexisting and validated survey tools to assess secondary responders' awareness of the effects of physical, mental, and traumatic stressors on mental health, and sought to determine whether a need exists to include traumatic stress-related mental health education in the current HAZWOPER training regimen. The study evaluated post-traumatic stress disorder (PTSD), resiliency, mental distress, and negative effects within a secondary responder population of 176 respondents. Elevated PTSD levels were seen in the study population compared to a general responder population (32.9% positive vs. 8%-22.5% positive). Results indicated that HAZWOPER-trained disaster responders were likely to test positive for PTSD, whereas untrained responders with no disaster experience and responders who possessed either training or disaster experience only were likely to test PTSD negative. A majority (68.75%) of the population tested below the mean resiliency-to-cope score (80.4) of the average worker population. Results indicated that those who were trained only, or who possessed both training and disaster work experience, were more likely to have lower resiliency scores than those with no training or experience. There were direct correlations between testing PTSD positive and having worked at a disaster site and experiencing mental distress and negative effects. However, HAZWOPER training status did not significantly correlate with mental distress or negative effects.
The survey indicated clear support (91% of respondents) for mental health education. The development of a pre- and post-deployment training module is recommended. Such training could provide responders with the knowledge and skills needed to recognize the symptomology of PTSD, mental stressors, and physical and traumatic stressors, empowering them to employ protective strategies or seek professional help if needed. It is further recommended that pre-deployment mental health education be included in the current HAZWOPER 24- and 40-hour course curriculums, and that consideration be given to integrating a stand-alone post-deployment mental health education course into the current HAZWOPER hierarchy.
Abstract:
Accurate quantitative estimation of exposure using retrospective data has been one of the most challenging tasks in the exposure assessment field. To improve these estimates, models have been developed using published exposure databases with their corresponding exposure determinants. These models are designed to be applied to reported exposure determinants obtained from study subjects or to exposure levels assigned by an industrial hygienist, so that quantitative exposure estimates can be obtained.
In an effort to improve the prediction accuracy and generalizability of these models, and considering that the limitations encountered in previous studies might stem from limits in the applicability of traditional statistical methods and concepts, this study proposed and explored computer science-derived data analysis methods, predominantly machine learning approaches.
The goal of this study was to develop a set of models using decision tree/ensemble and neural network methods to predict occupational exposure outcomes from literature-derived databases, and to compare, using cross-validation and data-splitting techniques, the resulting prediction capacity to that of traditional regression models. Two cases were addressed: the categorical case, where the exposure level was measured as an exposure rating following the American Industrial Hygiene Association guidelines, and the continuous case, where the exposure is expressed as a concentration value. Previously developed literature-based exposure databases for 1,1,1-trichloroethane, methylene dichloride, and trichloroethylene were used.
When compared to regression estimates, the results showed better accuracy for decision tree/ensemble techniques in the categorical case, while neural networks were better for estimating continuous exposure values. Overrepresentation of classes and overfitting were the main causes of poor neural network performance and accuracy. Estimates based on literature-derived databases using machine learning techniques might provide an advantage when applied to methodologies that combine expert inputs with current exposure measurements, such as the Bayesian Decision Analysis tool. The use of machine learning techniques to more accurately estimate exposures from literature-based exposure databases might represent a starting point for independence from expert judgment.
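The following Python sketch illustrates the kind of comparison described above for the categorical case: an ensemble classifier versus a regression baseline, scored by cross-validation. The feature matrix, labels, and model settings are placeholders; the study's actual databases and model configurations are not reproduced here.

```python
# Illustrative cross-validated comparison of an ensemble classifier against
# a multinomial logistic regression baseline for AIHA-style exposure ratings.
# X (exposure determinants) and y (rating categories) are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 6))     # e.g., task, ventilation, year, quantity, ...
y = rng.integers(0, 5, size=300)  # exposure rating categories 0-4

forest = RandomForestClassifier(n_estimators=200, random_state=0)
logit = LogisticRegression(max_iter=1000)

print("ensemble accuracy:  ", cross_val_score(forest, X, y, cv=5).mean())
print("regression accuracy:", cross_val_score(logit, X, y, cv=5).mean())
```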
Abstract:
An exposure system was constructed to evaluate the performance of a personal organic vapor dosimeter (3520 OVM) at ppb concentrations of nine selected target volatile organic compounds (VOCs). These concentration levels are generally encountered in community air environments, both indoor and outdoor. It was demonstrated that the chamber system could provide the closely controlled conditions of VOC concentration, temperature, and relative humidity (RH) required for the experiments. The target experimental conditions included combinations of three VOC concentrations (10, 20, and 200 µg/m³), three temperatures (10, 25, and 40 °C), and three RHs (12, 50, and 90%), for a total of 27 exposure conditions. No background levels of target VOCs were found in the exposure chamber system. In the exposure chamber, the temperature was controlled to within ±1 °C, and the RH was controlled to within ±1.5% at 12% RH, ±2% at 50% RH, and ±3% at 90% RH. High-emission permeation tubes were used to generate the target VOCs. Various patterns in the permeation rates were observed over time. The lifetimes and permeation rates of the tubes differed by compound, tube length, and manufacturer. By carefully selecting the source and length of the tubes, and closely monitoring tube weight loss over time, the permeation tubes can be used to deliver low and stable concentrations of VOCs over multiple days.
The results of this study indicate that the performance of the 3520 OVM is compound-specific and depends on concentration, temperature, and humidity. With the exception of 1,3-butadiene under most conditions, and styrene and methylene chloride at very high relative humidities, recoveries were generally within ±25% of theory, indicating that the 3520 OVM can be used effectively over the range of concentrations and environmental conditions tested with a 24-hour sampling period. Increasing humidity resulted in increasing negative bias from full recovery. Reverse diffusion tests conducted at 200 µg/m³ and five temperature/humidity combinations indicated severe diffusion losses only for 1,3-butadiene, methylene chloride, and styrene under increased humidity. Overall, the results of this study do not support the need to employ diffusion samplers with backup sections for the exposure conditions tested.
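A hypothetical sketch of the percent-recovery criterion used above: measured analyte mass compared with the mass theoretically collected by a diffusive sampler over a 24-hour period. The uptake-rate value is an assumed placeholder, not the published 3520 OVM rate for any particular compound.

```python
# Percent recovery relative to the theoretical mass collected by a diffusive
# sampler. The 30 mL/min uptake rate below is an assumed example value.
def percent_recovery(measured_ug: float,
                     concentration_ug_m3: float,
                     uptake_rate_ml_min: float,
                     duration_min: float = 24 * 60) -> float:
    """Return recovery as a percentage of the theoretical collected mass."""
    volume_m3 = uptake_rate_ml_min * duration_min / 1e6  # mL -> m^3
    theoretical_ug = concentration_ug_m3 * volume_m3
    return 100.0 * measured_ug / theoretical_ug

# Example: 20 ug/m^3 challenge, assumed 30 mL/min uptake, 0.75 ug recovered
print(percent_recovery(0.75, 20.0, 30.0))  # ~86.8%, within the ±25% criterion
```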
Abstract:
This study assessed whether hospital-wide implementation of a needleless intravenous connection system reduces the number of reported percutaneous injuries, overall and specifically those due to intravenous connection activities.
Incidence rates were compared before and after hospital-wide implementation of a needleless intravenous system at two hospitals, a full-service general hospital and a pediatric hospital. The years 1989-1991 were designated as pre-implementation and 1993 as post-implementation. Data from 1992 were not included in the effectiveness evaluation, to allow employees to become familiar with the new device. The two hospitals showed rate ratios of 1.37 (95% CI = 1.22-1.54, p ≤ .0001) and 1.63 (95% CI = 1.34-1.97, p ≤ .0001), corresponding to a 27.1% and a 38.6% reduction in the overall injury rate, respectively. Rate ratios for intravenous connection injuries were 2.67 (95% CI = 1.89-3.78, p ≤ .0001) and 3.35 (95% CI = 1.87-6.02, p ≤ .0001), corresponding to a 62.5% and a 69.9% reduction in injury rate, respectively. Rate ratios for all non-intravenous connection injuries were calculated to control for factors other than device implementation that may have been acting to reduce the injury rate. These rate ratios were lower, 1.21 and 1.44, demonstrating the magnitude of injury reduction due to factors other than device implementation. It was concluded that the device was effective in reducing the number of reported percutaneous injuries.
Use-effectiveness of the system was also assessed by a survey of randomly selected device users to determine satisfaction with the device, frequency of use, and barriers to use. Four hundred seventy-eight surveys were returned, for a response rate of 50.9%. Approximately 94% of respondents at both hospitals expressed satisfaction with the needleless system and recommended its continued use. The survey also revealed that even though over 50% of respondents reported using the device "always" or "most of the time" for intravenous medication administration, flushing lines, and connecting secondary intravenous lines, needles were still being used for these same activities. Compatibility, accessibility, and other technical problems were reported as reasons for using needles for these activities. These problems must be addressed, by both manufacturers and users, before the needleless system can be effective in preventing all intravenous connection injuries.
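The rate ratios above can be reproduced in form (though not with the study's exact data) using the standard log-normal approximation for a ratio of Poisson rates, as in this Python sketch; the example counts and person-time denominators are invented.

```python
# Pre/post rate ratio with a 95% CI via the log-normal approximation for
# Poisson counts. Inputs below are made-up placeholders, not the study's data.
import math

def rate_ratio_ci(cases_pre, persontime_pre, cases_post, persontime_post):
    """Rate ratio (pre/post) and its 95% confidence interval."""
    rr = (cases_pre / persontime_pre) / (cases_post / persontime_post)
    se_log = math.sqrt(1 / cases_pre + 1 / cases_post)
    return rr, rr * math.exp(-1.96 * se_log), rr * math.exp(1.96 * se_log)

# Example: 300 injuries over 3 pre-implementation years vs. 73 over 1 post year
# (person-time expressed in 100 full-time-worker years)
print(rate_ratio_ci(300, 30.0, 73, 10.0))  # RR ~1.37 with its 95% CI
```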
Abstract:
A cross-sectional study of the use of three pesticides and their presence in drinking water sources was conducted in the Githunguri/Kiaria community between January 1994 and March 1995. The main objective of the study was to determine the extent to which some of the pesticides used by the Githunguri/Kiaria agricultural community were polluting their drinking water sources. Due to monetary and physical limitations, only DDT (with its isomers and metabolites), carbofuran, and carbaryl were identified and used as surrogates of pollution for the other pesticides.
The study area was divided into high-lying and low-lying geographic surface areas, from which 34 and 38 water sampling sites were randomly selected, respectively. During the wet and dry seasons, a total of 144 water samples were collected and analyzed at the Kenya Bureau of Standards Laboratory in Nairobi. Gas chromatography was used to analyze samples for the possible presence of DDT, its isomers, and metabolites, while high-pressure liquid chromatography was used to analyze samples for carbofuran and carbaryl.
Six sites testing positive for DDT, its isomers, and metabolites represented 19.4% of the total sampled sites, with mean concentrations of 0.00310 ppb in the dry season and 0.0130 ppb in the wet season. All six of these sites exceeded the European maximum contaminant limit (MCL) in the wet season, and only one site exceeded the European MCL in the dry season.
Sites testing positive for carbofuran and carbaryl represented 5.6% of the total sampled sites. The mean concentration of carbofuran at these sites was 2.500 ppb in the dry season and 1.590 ppb in the wet season. Similarly, the mean concentration of carbaryl was 0.281 ppb in the dry season and 0.326 ppb in the wet season.
One site testing positive for carbofuran exceeded the European MCL and the WHO limit in the wet season, while one site testing positive for the same pesticide exceeded the USA, Canadian, European, and WHO MCLs in the dry season. Similarly, one site testing positive for carbaryl exceeded the European MCL in both seasons.
Of the 2,587 community members in the study area, 333 (13%) were exposed through their drinking water sources to the three pesticides investigated by this study. As a public health measure, integrated pest management (IPM) approaches, protection of the wells, and education of the community are necessary to minimize pollution of the environment and safeguard the drinking water sources from pollution by pesticides.
Abstract:
Several studies have shown that successful Employee Assistance Programs (EAPs) have strong management endorsement, defined as positive support for utilizing EAP services for themselves and their employees. This study focuses solely on middle management, as opposed to upper or general management, support. The study further measures the success of an EAP by its utilization rate, defined as the number of employees over a one-year period who access EAP services.
An analytical cross-sectional design was used to compare and observe differences between two groups of middle managers (utilizers and nonutilizers). Middle manager data were collected through a mail questionnaire. The study focused on identifying predictors that influence middle managers' utilization, specifically: attitude toward EAPs, EAP knowledge level, attitude toward mental health professionals, age, gender, years worked as a middle manager, education level, training, and other possible predictors of utilization. The overall hypothesis was that middle manager utilizers of EAP services have more positive attitudes toward, and a better understanding of, their EAP than middle manager nonutilizers.
As predicted, nonparametric bivariate results showed significant differences between the two groups. Middle managers in the utilization group (n = 473) tended to show more positive attitudes toward their EAP and mental health professionals and demonstrated greater EAP knowledge than the nonutilization group (n = 154). These findings support past studies on variables that influence EAP utilization rates.
Additional variables influencing middle management utilization were identified by multivariate logistic regression. These variables were gender (female supervisors), educational level of employees supervised (employees with lower levels of education), number of employees supervised (the greater the number supervised, the more likely to utilize), managerial EAP training (trained supervisors), and awareness that problems influence an employee's productivity.
These findings strengthen the assertion that middle management's attitudes, as well as other variables, may influence utilization. The study findings add new information about variables specifically influencing middle managers who utilize EAPs. An understanding of these variables is essential in developing competent EAP training and orientation programs for middle managers.
Abstract:
Surveys of national religious denominational offices and of churches in Texas were conducted to evaluate the prevalence of HIV/AIDS policies for members and employees, and to obtain feedback on a proposed HIV/AIDS policy. Most religious organizations in Texas do not have an HIV/AIDS policy for their employees. Analysis of the data from 77 church questionnaire surveys revealed only 17 (22.1%) policies in existence. In these data, policies for employees were most prevalent among Catholic churches, with 8 (47.1%), and Baptist churches, with 7 (41.2%). Nine of the churches (52.9%) that had HIV/AIDS policies for their employees were categorized as having 2,501-5,000 members. The largest number of policies developed by churches, 8 (47.1%), were developed in 1994 and 1995. The findings of this exploratory study in Texas were consistent with the survey of 7 national denominational offices, in which only the Lutheran church had a policy (14.3%). The literature is consistent with the finding that some churches have decided no separate HIV/AIDS policy is needed for employees: more than half of employers reporting an HIV/AIDS-related experience still felt they did not need a specific policy (CDC, 1992). The number of employees in the churches varied widely, from a high of 54.5% of churches with 15-50 employees to a low of 7.8% of churches with more than 100 employees. Seventy-one of the churches (92.2%) reported that they had no employees infected with HIV/AIDS, while 1 church (1.3%) reported having more than 1 employee infected with HIV/AIDS. This indicates that churches react to the incidence of HIV/AIDS infection rather than preparing in advance. The results of this study clearly indicate the need to develop a comprehensive HIV/AIDS policy for employees in religious communities. Church employers must carefully consider all the issues in the workplace when adopting and implementing an HIV/AIDS policy. A comprehensive policy was developed and guidelines are suggested.
Abstract:
Current toxic tort cases have increased national awareness of health concerns and present an important avenue in which public health scientists can perform a vital function, both in litigation and in the public health initiatives and promotions that may result. This review presents a systematic approach, using the paradigm of interactive public health disciplines, for the design of a matrix framework for medical surveillance of workers exposed to toxic substances. The matrix framework design addresses the scientific bases required to support the legal remedy of medical monitoring for workers injured as a result of their exposure to toxic agents. A background of recent legal developments with a direct impact on the use of scientific expertise in litigation is examined in the context of toxic exposure litigation and the attainment of public health goals. The matrix model is applied to five different workplace exposures: dental mercury, firefighting, vinyl chloride manufacture, radon in mining, and silica. An exposure matrix designed by the Department of Energy for government nuclear workers is included as a reference comparison to the design matrix.
Abstract:
Firefighting is widely known to be one of the most physically demanding civilian occupations. A subset of this is industrial firefighting, which places similarly high physical demands on industrial firefighters. Although there are some studies on community firefighters, the literature is scant on industrial firefighters and their physical fitness.
A cross-sectional study of industrial firefighters in petrochemical companies in Texas was carried out in 1996-98 to assess their physical fitness and to develop a set of physical agility criteria useful for selection and ongoing fitness-for-duty evaluations.
Physical agility criteria and a fitness scorecard were developed based on seven parameters (resting heart rate, diastolic blood pressure, aerobic capacity, muscle strength, muscle endurance, trunk flexibility, and total body fat) of musculoskeletal and cardiorespiratory fitness. Each indicator received a minimum of 0 and a maximum of 20 points, based on individual performance; the minimum and maximum achievable scores for the entire battery of tests were therefore 0 and 140, respectively. Of the 111 study subjects, 5 (4.5%) were far above average, 31 (28%) above average, 46 (41.5%) average, 29 (26%) below average, and 0 (0%) far below average, as classified by the physical fitness scorecard. The mean score was 77 (±23), with minimum and maximum scores of 35 and 135, respectively.
Following univariate analysis, an exploratory factor analysis was conducted to group variables and to assess the overall role of the constituent variables in a firefighter's total fitness. This was followed by a stepwise linear regression analysis using aerobic capacity as the dependent variable.
Finally, a graded implementation strategy was devised so that all existing industrial firefighters would have an opportunity to improve or maintain their physical fitness at or above the average level as defined by the fitness scorecard.
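A minimal Python sketch of the scorecard logic described above: seven parameters scored 0-20 each, summed to a 0-140 total and banded into the five categories. The abstract does not give the per-parameter scoring rules or the band cut points, so both are assumed here for illustration.

```python
# Sketch of the 0-140 fitness scorecard. Per-parameter scores (0-20 each) are
# assumed to be computed elsewhere; band cut points are hypothetical.
PARAMETERS = ["resting_heart_rate", "diastolic_bp", "aerobic_capacity",
              "muscle_strength", "muscle_endurance", "trunk_flexibility",
              "total_body_fat"]

def total_score(scores: dict) -> int:
    """Sum the seven 0-20 parameter scores into a 0-140 total."""
    assert set(scores) == set(PARAMETERS), "all seven parameters required"
    assert all(0 <= s <= 20 for s in scores.values())
    return sum(scores.values())

def fitness_band(total: int) -> str:
    """Map a total score to one of the five categories (illustrative cut points)."""
    for cutoff, label in [(112, "far above average"), (84, "above average"),
                          (56, "average"), (28, "below average")]:
        if total >= cutoff:
            return label
    return "far below average"

example = {p: 11 for p in PARAMETERS}  # 7 x 11 = 77, the reported mean score
print(total_score(example), fitness_band(total_score(example)))  # 77 average
```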