6 results for Disposal of solid waste
in DigitalCommons@The Texas Medical Center
Abstract:
Inefficiencies in the management of healthcare waste can give rise to undesirable health effects, such as transmission of infections and environmental pollution, within and beyond the health facilities generating these wastes. Factors such as the prevalence of disease, conflict, and the efflux of intellectual capacity make low-income countries more susceptible to these adverse health effects. The purpose of this systematic review was to describe the effectiveness of interventions aimed at better managing the generation, collection, transport, treatment, and disposal of medical waste, as they have been applied in low- and middle-income countries.

Using a systematic search strategy and evaluation of study quality, this review examined the published literature on healthcare waste management interventions carried out in developing countries, specifically low- and lower-middle-income countries, from 2000 to the present. From an initially identified set of 829 studies, only three ultimately met all inclusion, exclusion, and quality criteria. A multi-component intervention in the Syrian Arab Republic, conducted in 2007, aimed to improve waste segregation practice in a hospital setting; the intervention increased the use of segregation boxes and reduced rates of sharps injury among staff. Another study, conducted in 2008, trained medical students as monitors of waste segregation practice in an Indian teaching hospital; practice improved in wards and laboratories but not in the intensive care units. The third study, performed in 2008 in China, modified the components of a medical waste incinerator to improve efficiency and reduce stack emissions. Gaseous pollutants emitted, except polychlorodibenzofurans (PCDFs), were below US EPA permissible exposure limits; heavy metal residues in the fly ash remained unchanged.

Due to the paucity of well-designed studies, there is insufficient evidence in the literature to draw conclusions about the effectiveness of such interventions in low-income settings. There is suggestive but insufficient evidence that multi-component interventions aimed at improving waste segregation through behavior modification, provision of segregation tools, and training of monitors are effective in low-income settings.
Abstract:
The aim of this study was to determine whether using automated side loader (ASL) trucks in higher proportions than other types of trucks for residential waste collection results in lower injury rates (from all causes). The primary hypothesis was that the risk of injury was lower for workers who work with ASL trucks than for workers who work with other types of trucks used in residential waste collection. To test this hypothesis, data were collected from one of the nation's largest companies in the solid waste management industry. Different local operating units (i.e., facilities) in the company used different types of trucks to varying degrees, which created a special opportunity to examine refuse collection injuries and illnesses and the risk-reduction potential of ASL trucks.

The study design was ecological and analyzed end-of-year data provided by the company for calendar year 2007. During 2007, a total of 345 facilities provided residential services; each facility represented one observation.

The dependent variable, the injury and illness rate, was defined as a facility's total case incidence rate (TCIR) recorded in accordance with federal OSHA requirements for 2007. The TCIR is the rate of total recordable injury and illness cases per 100 full-time workers. The independent variable, percent of ASL trucks, was calculated by dividing the number of ASL trucks by the total number of residential trucks at each facility.

Multiple linear regression models were estimated for the impact of the percent of ASL trucks on TCIR per facility. Adjusted analyses included three covariates: median number of hours worked per week, median number of months of work experience, and median age of residential workers. All analyses were performed with the statistical software Stata IC (version 11.0).

The analyses took three approaches to classifying the exposure, percent of ASL trucks. The first used two levels: (1) 0% and (2) >0% to <100%. The second used three levels: (1) 0%, (2) ≥1% to <100%, and (3) 100%. The third used six levels to improve detection of a dose-response relationship: (1) 0%, (2) 1% to <25%, (3) 25% to <50%, (4) 50% to <75%, (5) 75% to <100%, and (6) 100%. None of the relationships between the injury and illness rate and the percent ASL truck exposure levels was statistically significant (p < 0.05), even after adjustment for all three covariates.

In summary, the present study shows some risk-reduction impact of ASL trucks, but the effect was not statistically significant. The covariates demonstrated a varied but more modest impact on the injury and illness rate; again, none of the relationships between the injury and illness rate and the covariates was statistically significant (p < 0.05). As an ecological study, the present study has the limitations inherent in such designs and warrants replication in an individual-level cohort design; stronger conclusions are not warranted.
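For context, the OSHA total case incidence rate is computed on a base of 200,000 hours, corresponding to 100 full-time workers at 2,000 hours per year. The sketch below illustrates the facility-level variables and the adjusted linear model described in this abstract; it is written in Python with hypothetical column names and input file, whereas the study itself used Stata IC.

    # Sketch of the facility-level analysis described in the abstract.
    # Column names and the input file are hypothetical; the study itself
    # used Stata IC (version 11.0), not Python.
    import pandas as pd
    import statsmodels.formula.api as smf

    def tcir(recordable_cases, hours_worked):
        # OSHA total case incidence rate: cases per 100 full-time workers,
        # where 200,000 = 100 workers x 2,000 hours per year.
        return recordable_cases * 200_000 / hours_worked

    facilities = pd.read_csv("facilities_2007.csv")  # one row per facility
    facilities["tcir"] = tcir(facilities["recordable_cases"],
                              facilities["hours_worked"])
    facilities["pct_asl"] = (facilities["asl_trucks"]
                             / facilities["residential_trucks"])

    # Adjusted model: TCIR regressed on percent ASL trucks plus the three
    # covariates named in the abstract.
    model = smf.ols(
        "tcir ~ pct_asl + median_hours_per_week"
        " + median_months_experience + median_age",
        data=facilities,
    ).fit()
    print(model.summary())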
Abstract:
Electronic waste is a fairly new and largely unknown phenomenon. Accordingly, governments have only recently acknowledged electronic waste as a threat to the environment and public health. In attempting to mitigate the hazards associated with this rapidly growing toxic waste stream, governments at all levels have started to implement e-waste management programs. The legislation enacted to create these programs is based on extended producer responsibility (EPR) policy.

EPR shifts the burden of final disposal of e-waste from the consumer or municipal solid waste system to the manufacturer of electronic equipment. Applying an EPR policy is intended to send signals up the production chain to the manufacturer. The desired outcome is to change methods of production in order to reduce production inputs and outputs, with the ultimate goal of changing product design. This thesis performs a policy analysis of current e-waste policies at the federal and state levels of government, focusing specifically on Texas e-waste policies.

The Texas e-waste law, known as HB 2714 or the Texas Computer TakeBack Law, requires manufacturers to provide individual consumers with a free and convenient method for returning their used computers to manufacturers. The law is based on individual producer responsibility and shared responsibility among consumers, retailers, recyclers, and the TCEQ.

Using a set of evaluation criteria created by the Organisation for Economic Co-operation and Development, the Texas e-waste law was examined to determine its effectiveness at reducing the threat of e-waste in Texas. Based on the outcomes of the analysis, recommendations were made for the legislature to incorporate into HB 2714.

The results of the policy analysis show that HB 2714 is a poorly constructed law and does not provide the desired results seen in other states with EPR policies. The TakeBack Law does little to change the collection methods of manufacturers and even less to change their production habits. If the e-waste problem is to be taken seriously, HB 2714 must be amended to reflect the changes proposed in this thesis.
Abstract:
We have investigated the in vivo safety, efficacy, and persistence of autologous Epstein-Barr virus (EBV)-specific cytotoxic T lymphocytes (CTLs) for the treatment of solid organ transplant (SOT) recipients at high risk for EBV-associated posttransplantation lymphoproliferative disease (PTLD). EBV-CTLs generated from 35 patients expanded with normal kinetics, contained both CD8 and CD4 lymphocytes, and produced significant specific killing of autologous EBV-transformed B lymphoblastoid cell lines (LCLs). Twelve SOT recipients at high risk for PTLD, or with active disease, received autologous CTL infusions without toxicity. Real-time polymerase chain reaction (PCR) monitoring of EBV-DNA showed a transient increase in plasma EBV-DNA suggestive of lysis of EBV-infected cells, although there was no consistent decrease in virus load in peripheral-blood mononuclear cells. Interferon-gamma enzyme-linked immunospot (ELISPOT) assay and tetramer analysis showed an increase in the frequency of EBV-responsive T cells, which returned to preinfusion levels after 2 to 6 months. None of the treated patients developed PTLD. One patient with liver PTLD showed a complete response, and one with ocular disease has had a partial response stable for over one year. These data are consistent with an expansion and persistence of adoptively transferred EBV-CTLs that is limited in the presence of continued immunosuppression but that nonetheless produces clinically useful antiviral activity.
Abstract:
Background. Previous studies suggest an association between the timing of introduction of solid food and increased risk of obesity in preschool-aged children, but no study has included a representative sample of US children. We sought to examine whether there was any association between the timing of solid food introduction and overweight/obesity in preschool-aged children.

Design/methods. Cross-sectional study of a nationally representative sample (N = 2050) of US children aged 2 to 5 years with information on infant feeding practices and measured weight and height from the National Health and Nutrition Examination Survey 2003–2008. The main outcome measure was BMI for age and sex ≥ 85th percentile. The main exposure was timing of solid food introduction at <4, 4–5, or ≥6 months of age. Binomial logistic regression was used in the analysis, controlling for the child's sex, birth weight, and breastfeeding status, as well as maternal age at birth, smoking status, and socio-demographic variables.

Results. Two thousand and fifty children were included in the sample: 51% male and 49% female; 57.1% non-Hispanic White, 21.9% Hispanic, 14.0% non-Hispanic Black, and 7% other race/ethnicity. Twenty-two percent of the children were overweight or obese. Sixty-nine percent were breastfed or fed breast milk at birth, and 36% continued breastfeeding for ≥6 months. Solid foods were introduced before 4 months of age for 11.2% of the children; 30.3% received solid foods between 4 and 5 months; and 58.6% received solid foods at 6 months or later. Timing of solid food introduction was not associated with weight status (OR = 1.36, 95% CI [0.83–2.24]). Formula-fed infants and infants breastfed for <4 months had increased odds of overweight and obesity (OR = 1.54, 95% CI [1.05–2.27] and OR = 1.60, 95% CI [1.05–2.44], respectively) when compared to infants breastfed for ≥6 months.

Conclusion. Timing of solid food introduction was not associated with weight status in a national sample of US children aged 2 to 5 years. More focus should be placed on promoting breastfeeding and healthy infant feeding practices as strategies to prevent obesity in children.
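The sketch below illustrates the kind of adjusted binomial logistic regression this abstract describes, with the ≥6 months exposure group as the reference category. All variable names and the input file are hypothetical, and a faithful NHANES analysis would additionally incorporate the survey's complex design and sampling weights.

    # Sketch of the adjusted binomial logistic regression described above.
    # Variable names and the input file are hypothetical; a full NHANES
    # analysis would also account for the survey design and weights.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    children = pd.read_csv("nhanes_2003_2008_ages2to5.csv")

    # Outcome: overweight = 1 if BMI for age and sex >= 85th percentile.
    # Exposure: solid_food_timing in {"lt4mo", "4to5mo", "ge6mo"},
    # with >= 6 months as the reference category.
    model = smf.logit(
        "overweight ~ C(solid_food_timing, Treatment('ge6mo'))"
        " + sex + birth_weight + breastfeeding_status"
        " + maternal_age + maternal_smoking + socio_demo_index",
        data=children,
    ).fit()

    print(np.exp(model.params))      # odds ratios
    print(np.exp(model.conf_int()))  # 95% confidence intervals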
Abstract:
Diarrheal disease associated with enterotoxigenic Escherichia coli (ETEC) infection is one of the major public health problems in many developing countries, especially among infants and young children. Because tests suitable for field laboratories have been developed only relatively recently, the literature on the environmental risk factors associated with ETEC is not as complete as for many other pathogens or for diarrhea of unspecified etiology.

Data from a diarrheal disease surveillance project in rural Egypt, in which stool samples were tested for a variety of pathogens and an environmental questionnaire was completed for the same study households (HHs), provided an opportunity to test for an association between ETEC and various risk factors present in those households. ETEC laboratory-positive specimens were compared with ETEC laboratory-negative specimens for both symptomatic and asymptomatic children less than three years of age, at the individual and household levels, using a case-comparison design.

Individual children were more likely to have heat-labile toxin (LT) ETEC infection if they lived in HHs that had cooked food stored for subsequent consumption at the time of the visit, where caretakers used water but not soap to clean an infant after a diarrheal stool, and that had an indoor, private water source. LT was more common in HHs where the caretaker did not clean an infant with soap after a diarrheal stool, and where a sleeping infant was not covered with a net. At both the individual and HH levels, LT was significantly associated with a good water supply in terms of quantity and storage.

Heat-stable toxin (ST) ETEC was isolated more frequently at the individual level where a sleeping infant was covered with a net, where large animals were kept in or around the house, where water was always available but not potable, and where the water container was not covered. At the HH level, the absence of a toilet or latrine and the indiscriminate disposal of animal waste decreased risk. Using animal feces for fertilizer, the presence of large animals, and poor water quality were associated with ST at both the individual and HH levels.

These findings are mostly consistent with those of other studies and/or are biologically plausible, with the obvious exception of those from this study in which poorer water supplies were associated with less infection, at least in the case of LT. More direct observation of how animal ownership and feces disposal relate to different types of water supply and usage might clarify mechanisms through which some ETEC infection could be prevented in similar settings.
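The case-comparison contrasts described above reduce, for each risk factor, to a 2×2 comparison of ETEC laboratory-positive versus laboratory-negative specimens. A minimal, purely illustrative odds-ratio calculation is sketched below; the counts are invented, not the study's data.

    # Illustrative 2x2 odds-ratio calculation for a single risk factor;
    # the counts below are invented, not the study's data.
    from scipy.stats import fisher_exact

    #        exposed, unexposed
    table = [[30, 70],   # ETEC laboratory-positive (cases)
             [20, 80]]   # ETEC laboratory-negative (comparisons)

    odds_ratio, p_value = fisher_exact(table)
    print(f"OR = {odds_ratio:.2f}, Fisher exact p = {p_value:.3f}")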