999 results for U.S. Army Environmental Hygiene Agency.
Abstract:
Methane-rich landfill gas is generated when biodegradable organic wastes disposed of in landfills decompose under anaerobic conditions. Methane is a significant greenhouse gas, and landfills are its largest source in Finland. Methane production in a landfill depends on many factors, such as the composition of the waste and landfill conditions, and it can vary considerably in time and space. Methane generation from waste can be estimated with various models. In this thesis, three spreadsheet applications, a reaction equation, and a triangular model for estimating gas generation are introduced. The spreadsheet models are the IPCC Waste Model (2006), Metaanilaskentamalli by Jouko Petäjä of the Finnish Environment Institute, and LandGEM (3.02) of the U.S. Environmental Protection Agency; all are based on the first-order decay (FOD) method. Gas recovery methods and gas emission measurements were also examined. Vertical wells and horizontal trenches are the most commonly used gas collection systems. The emission measurement techniques discussed include the chamber method, the tracer method, soil core and isotope measurements, micrometeorological mass-balance and eddy covariance methods, and gas-measuring FID technology. Methane production at the Ämmässuo landfill of HSY Helsinki Region Environmental Services Authority was estimated with the methane generation models, and the results were compared with the volumes of collected gas. All spreadsheet models underestimated the methane generation at some point. LandGEM with default parameters and Metaanilaskentamalli with modified parameters corresponded best with the gas recovery figures. One possible reason for the differences between estimated and collected volumes is that the parameter values for degradable organic carbon (DOC) and the fraction of decomposable degradable organic carbon (DOCf) do not represent the true values well enough. Considerable uncertainty is associated with the modelling results and model parameters, and no simple explanation for the observed differences can be given within this thesis.
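The first-order decay (FOD) approach shared by these models treats each year's waste as decaying exponentially, so that methane generated in year n is roughly Q(n) = sum_i k * L0 * M_i * exp(-k * (n - i)). The Python sketch below is a minimal illustration of that idea only; the decay constant k and methane generation potential L0 are illustrative placeholders, and the real models add waste-category, DOC/DOCf, and sub-annual detail not reproduced here.

```python
import math

def fod_methane(annual_waste_mg, k=0.05, L0=100.0):
    """Yearly methane generation (m^3 CH4/yr) from a simplified first-order decay model.

    annual_waste_mg : waste accepted each year (Mg); index 0 = first year
    k               : decay rate constant (1/yr); illustrative value
    L0              : methane generation potential (m^3 CH4/Mg waste); illustrative value
    """
    results = []
    for n in range(1, len(annual_waste_mg) + 1):          # evaluate at the end of year n
        q_n = sum(
            k * L0 * m_i * math.exp(-k * (n - i))         # contribution of waste placed in year i
            for i, m_i in enumerate(annual_waste_mg[:n], start=1)
        )
        results.append(q_n)
    return results

# Example: 50,000 Mg of mixed waste accepted every year for 20 years
print([round(q) for q in fod_methane([50_000] * 20)[:5]])
```

Varying k and L0 in such a sketch also shows how strongly the estimated generation curve depends on those parameters, which is the uncertainty the thesis points to.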
Abstract:
Exposure to cement dust and silica has been studied for years in countries such as the United States and Canada; when cement dust is inhaled during different activities, it can affect the respiratory tract of exposed workers. The study "Perfil de exposición ocupacional a polvo de cemento y sílice cristalina en procesos de cementación y fracturamiento hidráulico en el sector Oil & Gas en Colombia: un estudio retrospectivo (2009-2013)" made it possible to identify the functional activities that pose a potential risk due to the presence of airborne particles; to analyze a database of roughly 18,298 industrial hygiene evaluation records from the Oil & Gas sector; to calculate respirable particulate matter and crystalline silica concentrations for each process and process the data statistically; and to compare these statistical estimators with the permissible limit values defined by the national government. The results included the characterization of an occupational exposure profile by functional activity for the cementing process, the identification of the most highly exposed workers according to their exposure conditions, and the determination of which of these profiles exceed the maximum permissible limits for a 12-hour work shift. This information will allow occupational health and hygiene professionals to target follow-up, surveillance, and control activities at specific similar exposure groups. For the hydraulic fracturing process, the data found were not statistically significant.
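A common way to turn a set of industrial hygiene measurements into an exposure profile is to fit a lognormal distribution and compare an upper percentile against the permissible limit, adjusted for the 12-hour shift. The sketch below is a minimal illustration with hypothetical sample values and the Brief & Scala shift adjustment; it is an assumption about the kind of statistics involved, not the study's actual procedure or data.

```python
import math
import statistics

def exposure_profile(samples_mg_m3, oel_mg_m3, shift_hours=12):
    """Summary statistics for a similar exposure group (SEG), compared against an
    occupational exposure limit (OEL) adjusted for an extended shift.

    The shift adjustment shown is the Brief & Scala reduction factor; this is an
    assumption for illustration, not necessarily the criterion applied in the study.
    """
    logs = [math.log(c) for c in samples_mg_m3]
    mu, sigma = statistics.mean(logs), statistics.stdev(logs)
    gm = math.exp(mu)                      # geometric mean
    gsd = math.exp(sigma)                  # geometric standard deviation
    p95 = math.exp(mu + 1.645 * sigma)     # estimated 95th percentile (lognormal)
    adj_oel = oel_mg_m3 * (8 / shift_hours) * ((24 - shift_hours) / 16)
    return {"GM": gm, "GSD": gsd, "P95": p95,
            "adjusted_OEL": adj_oel, "exceeds_limit": p95 > adj_oel}

# Hypothetical respirable crystalline silica samples (mg/m^3) against a 0.05 mg/m^3 OEL
print(exposure_profile([0.012, 0.030, 0.025, 0.060, 0.018], oel_mg_m3=0.05))
```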
Abstract:
Objective: To carry out a systematic review of exogenous lipoid pneumonia (ELP) in order to consolidate and synthesize fragmented knowledge and to report the current state of this topic as an occupational hazard. Methodology: Lipoid pneumonia is an uncommon condition resulting from the presence of lipids within the alveolar space; its current incidence is unknown. Exogenous lipoid pneumonia is caused by the aspiration or inhalation of oily substances. It is a rare pathology in patients without underlying diseases predisposing them to bronchoaspiration, and when it occurs in healthy individuals an occupational origin should be suspected. A literature review was carried out following a standardized methodology, including case reports, descriptions of the disease, and the use of diagnostic techniques. The databases were OVID and GOOGLE ACADÉMICO; the specific search engines were MEDLINE, CHEST, PUBMED, REDALYC, SCIELO, Europe PubMed Central (Europe PMC), and ELSEVIER. The search was also guided by a series of guiding questions about exogenous lipoid pneumonia (ELP) as an occupational hazard. The articles that met the inclusion criteria were classified according to study type and article quality and were finally evaluated using a checklist adapted for this purpose. Results: A total of 71 studies were selected, including case reports, descriptions of the disease, and diagnostic techniques, published in 21 countries. A total of 63 cases were reported, 31 in women (49.2%) and 32 in men (50.8%); 7.93% of the cases were attributed to exposure to agents in the work environment: fuels, paraffin/paint/oily sprays, and diesel, environments typical of paint-warehouse workers and vehicle drivers. The reported symptoms of exogenous lipoid pneumonia (ELP) were highly variable: the disease may be asymptomatic or present with fever, dyspnea, and irritative cough; chest pain; and, in some cases, hemoptysis, cyanosis, and weight loss. Physical examination is generally normal, although it may reveal wheezing or rhonchi. Pulmonary function tests such as spirometry show a restrictive pattern, and a decrease in the diffusing capacity for carbon monoxide may also be found. The blood count may show leukocytosis with a predominance of neutrophils and an increased erythrocyte sedimentation rate, findings that can also be produced by a concomitant infection. Conclusions: The literature identifies different causal agents present in the work environment that may contribute to the onset of the disease: fuels, paraffin/paint/oily sprays, and diesel, environments typical of paint-warehouse workers and vehicle drivers. Acute exogenous lipoid pneumonia (ELP) occurred 30 minutes to 24 hours after massive exposure to oils and oily substances, and after eight months to 9.5 years of continuous secondary inhalation. The methods most commonly used to diagnose the disease include bronchoalveolar lavage (BAL), lung biopsy, and chest radiography. Finally, patients with exogenous lipoid pneumonia may develop associated complications such as superinfection, fibrosis, retraction of the lesions, lung cancer, recurrent bronchopneumonia, and hypercalcemia.
Abstract:
The aim of this research is to analyze the impact of cyber attacks on the information technology development of the United States Armed Forces. The various studies conducted on cyberspace have focused on the role of the individual as the principal actor and have set aside the repercussions cyberspace has had for the State as a new axis of threats. With this in mind, this research will demonstrate, using the concept of securitization, that "cybersecurity" is being prioritized within the U.S. government's agenda. Since this study addresses concrete experiences over a period of more than 10 years, the methodological design of the research is longitudinal, encompassing studies, articles, texts, and resolutions produced from 2003 to the present.
Abstract:
This work focuses on analyzing the activities surrounding the printing-process services offered by the organization DATAPOINT de Colombia SAS, in order to identify the critical points in the management of printing waste and the decisions made by those involved throughout the process (suppliers, customers, and the company itself), with the aim of reviewing measures and strategies to strengthen the integrated management of printing waste based on a review and comparison of the best practices proposed by actors in the sector. Recommendations were also made for improvement actions that could be implemented to mitigate the environmental impact generated by this waste. To this end, a study of the organization, its customers, and its suppliers was first carried out in order to understand the entire value chain around toner cartridges and their reverse logistics, as well as the regulatory environment at both the national and international levels. Next, points for improvement were identified by comparing what the supplier proposes with what those involved in the process actually do; this work was carried out in the field with customers to understand the current situation, their needs, and the basis on which they make decisions about the handling of printing waste. Finally, a series of improvement actions and recommendations is listed that can be incorporated into DATAPOINT's critical processes.
Abstract:
This paper reviews the concept of "organic", its meaning, and compares organic with conventional goods. It traces the background of organic goods over the past 20 years, quotes different definitions of "organic", and develops a working definition. It also establishes certain criteria and variables in order to carry out a deeper business analysis. Its objective is to define the advantages, disadvantages, key points, and strategies for companies that want to venture into organic production, and to assess whether doing so is advisable. After a cross-case and SWOT analysis, it is possible to determine, depending on the core strategy and type of company, whether an enterprise should venture into the organic market.
Abstract:
Common Loon (Gavia immer) is considered an emblematic and ecologically important example of aquatic-dependent wildlife in North America. The northern breeding range of Common Loon has contracted over the last century as a result of habitat degradation from human disturbance and lakeshore development. We focused on the state of New Hampshire, USA, where a long-term monitoring program conducted by the Loon Preservation Committee has been collecting biological data on Common Loon since 1976. The Common Loon population in New Hampshire is distributed throughout the state across a wide range of lake-specific habitats, water quality conditions, and levels of human disturbance. We used a multiscale approach to evaluate the association of Common Loon and breeding habitat within three natural physiographic ecoregions of New Hampshire. These multiple scales reflect Common Loon-specific extents such as territories, home ranges, and lake-landscape influences. We developed ecoregional multiscale models and compared them to single-scale models to evaluate model performance in distinguishing Common Loon breeding habitat. Based on information-theoretic criteria, there is empirical support for both multiscale and single-scale models across all three ecoregions, warranting a model-averaging approach. Our results suggest that the Common Loon responds to both ecological and anthropogenic factors at multiple scales when selecting breeding sites. These multiscale models can be used to identify and prioritize the conservation of preferred nesting habitat for Common Loon populations.
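The model-averaging step mentioned above rests on information-theoretic weights. The sketch below shows, under hypothetical AIC scores, how Akaike weights quantify the relative support for a multiscale versus a single-scale model; the study's actual models and criteria are not reproduced here.

```python
import math

def akaike_weights(aic_scores):
    """Akaike weights: the relative empirical support for each candidate model,
    computed from differences in AIC (information-theoretic model selection)."""
    delta = [a - min(aic_scores) for a in aic_scores]
    likes = [math.exp(-d / 2.0) for d in delta]
    total = sum(likes)
    return [l / total for l in likes]

# Hypothetical AIC scores for a multiscale model and a single-scale model:
# comparable weights indicate support for both, which motivates model averaging.
weights = akaike_weights([412.3, 413.9])
print([round(w, 2) for w in weights])   # e.g. [0.69, 0.31]
```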
Abstract:
Acid mine drainage (AMD) is a widespread environmental problem associated with both working and abandoned mining operations. As part of an overall strategy to determine a long-term treatment option for AMD, a pilot passive treatment plant was constructed in 1994 at Wheal Jane Mine in Cornwall, UK. The plant consists of three separate systems, each containing aerobic reed beds, an anaerobic cell, and rock filters, and represents the largest European experimental facility of its kind. The systems differ only in the type of pre-treatment used to increase the pH of the influent minewater (pH < 4): lime-dosed (LD), anoxic limestone drain (ALD), and lime-free (LF), which receives no form of pre-treatment. The Wheal Jane pilot plant offered a unique facility, and a major research project was established to evaluate the pilot plant and to study in detail the biological mechanisms and the geochemical and physical processes that control passive treatment systems. The project has produced data, knowledge, models, and design criteria for the future design, planning, and sustainable management of passive treatment systems. A multidisciplinary team of scientists and managers from U.K. universities, the Environment Agency, and the mining industry was put together to obtain the maximum advantage from the excellent facilities at Wheal Jane. (C) 2004 Elsevier B.V. All rights reserved.
Abstract:
The Pax Americana and the grand strategy of hegemony (or "Primacy") that underpins it may be becoming unsustainable. Particularly in the wake of exhausting wars, the Global Financial Crisis, and the shift of wealth from West to East, it may no longer be possible or prudent for the United States to act as the unipolar sheriff or guardian of a world order. But how viable are the alternatives, and what difficulties will these alternatives entail in their design and execution? This monograph offers a sympathetic but critical analysis of the alternative U.S. national security strategies of "retrenchment" proposed by critics of American diplomacy. In these strategies, the United States would anticipate the coming of a more multipolar world and organize its behavior around the dual principles of "concert" and "balance," seeking a collaborative relationship with other great powers while being prepared to counterbalance any hostile aggressor that threatens world order. The proponents of such strategies argue that by scaling back its global military presence and its commitments, the United States can trade prestige for security, shift burdens, and gain a freer hand. To support this theory, they often look to the 19th-century Concert of Europe as a model of a successful security regime and to general theories about the natural balancing behavior of states. This monograph examines that precedent and measures its usefulness for contemporary statecraft in order to identify how great-power concerts are sustained and how they break down. It also applies competing theories of how states might behave if world politics are in transition: Will they balance, bandwagon, or hedge? This demonstrates the multiple possible futures that could shape, and be shaped by, a new strategy. A new strategy based on an acceptance of multipolarity and the limits of power is prudent, and there is scope for such a shift. The convergence of several trends, including transnational problems needing collaborative efforts, the military advantages of defenders, the reluctance of states to engage in unbridled competition, and hegemony fatigue among the American people, means that an opportunity exists internationally and at home for a shift to a new strategy. But a Concert-Balance strategy will still need to deal with several potential dilemmas. These include the difficulty of reconciling competitive balancing with cooperative concerts, the limits of balancing without a forward-reaching onshore military capability, possible unanticipated consequences such as a rise in regional power competition or the emergence of blocs (such as a Chinese East Asia or an Iranian Gulf), and the challenge of sustaining domestic political support for a strategy that voluntarily abdicates world leadership. These difficulties can be mitigated, but doing so will require pragmatic and gradual implementation as well as elegant theorizing, and care to avoid swapping one ironclad, doctrinaire grand strategy for another.
Abstract:
An extensive data set of total arsenic analyses for 901 polished (white) rice grain samples, originating from 10 countries on 4 continents, was compiled. The samples represented the baseline (i.e., they were not specifically collected from arsenic-contaminated areas), and all were for market sale in major conurbations. Median total arsenic contents of rice varied 7-fold, with Egypt (0.04 mg/kg) and India (0.07 mg/kg) having the lowest arsenic content and the U.S. (0.25 mg/kg) and France (0.28 mg/kg) the highest. The global distribution of total arsenic in rice was modeled by weighting each country's arsenic distribution by that country's contribution to global production. A subset of 63 samples from Bangladesh, China, India, Italy, and the U.S. was analyzed for arsenic species. The relationship between inorganic arsenic content and total arsenic content differed significantly among countries, with Bangladesh and India having the steepest slopes in linear regression and the U.S. the shallowest. Using country-specific rice consumption data, the daily intake of inorganic arsenic was estimated and the associated internal cancer risk was calculated using the U.S. Environmental Protection Agency (EPA) cancer slope factor. Median excess internal cancer risks posed by inorganic arsenic in rice varied 30-fold across the 5 countries examined, ranging from 0.7 per 10,000 for Italians to 22 per 10,000 for Bangladeshis, assuming a 60 kg person.
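The risk arithmetic here is daily intake multiplied by a cancer slope factor. The sketch below illustrates it with hypothetical consumption and concentration values (not the paper's data), assuming the EPA oral cancer slope factor of 1.5 (mg/kg-day)^-1 for inorganic arsenic and the 60 kg body weight used in the abstract.

```python
def inorganic_as_cancer_risk(rice_g_per_day, inorganic_as_mg_per_kg,
                             body_weight_kg=60.0, slope_factor=1.5):
    """Daily inorganic arsenic intake (mg/kg-day) and lifetime excess cancer risk.

    slope_factor: assumed U.S. EPA oral cancer slope factor for inorganic arsenic,
    in (mg/kg-day)^-1; excess risk = intake * slope factor.
    """
    intake = rice_g_per_day * inorganic_as_mg_per_kg / 1000.0 / body_weight_kg
    excess_risk = intake * slope_factor
    return intake, excess_risk * 10_000   # risk expressed per 10,000 people

# Hypothetical illustration only (not the paper's data): 400 g rice/day
# containing 0.10 mg/kg inorganic arsenic, eaten by a 60 kg adult.
intake, risk_per_10k = inorganic_as_cancer_risk(400, 0.10)
print(f"intake = {intake:.2e} mg/kg-day, excess risk ~ {risk_per_10k:.1f} per 10,000")
```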
Abstract:
Of the many sources of urban greenhouse gas (GHG) emissions, solid waste is the only one for which management decisions are undertaken primarily by municipal governments themselves, and it is hence often the largest component of cities' corporate inventories. It is essential that decision-makers select an appropriate quantification methodology and have an appreciation of its methodological strengths and shortcomings. This work compares four different waste emissions quantification methods: the Intergovernmental Panel on Climate Change (IPCC) 1996 guidelines, the IPCC 2006 guidelines, the U.S. Environmental Protection Agency (EPA) Waste Reduction Model (WARM), and the Federation of Canadian Municipalities-Partners for Climate Protection (FCM-PCP) quantification tool. Waste disposal data for the greater Toronto area (GTA) in 2005 are used for all methodologies; treatment options (including landfill, incineration, compost, and anaerobic digestion) are examined where available in the methodologies. Landfill was shown to be the greatest source of GHG emissions, contributing more than three-quarters of total emissions associated with waste management. Results from the different landfill gas (LFG) quantification approaches ranged from an emissions source of 557 kt carbon dioxide equivalents (CO2e) (FCM-PCP) to a carbon sink of -53 kt CO2e (EPA WARM). Similar values were obtained with the two IPCC approaches. The IPCC 2006 method was found to be more appropriate for inventorying applications because it uses a waste-in-place (WIP) approach rather than a methane commitment (MC) approach, despite the perceived onerous data requirements of WIP. MC approaches were found to be useful from a planning standpoint; however, the uncertainty associated with their projections of future parameter values limits their applicability to GHG inventorying. MC and WIP methods provided similar results in this case study; however, this is case-specific, owing to similar assumptions about present and future landfill parameters and to the relatively consistent quantities of waste deposited annually in recent years.
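The distinction between methane commitment (MC) and waste-in-place (WIP) accounting drives the differences reported above. The sketch below contrasts the two under hypothetical, illustrative parameters (decay constant k and generation potential L0); it is not the IPCC or EPA WARM implementation, only the accounting idea.

```python
import math

def methane_commitment(mass_this_year_mg, L0=100.0):
    """Methane commitment (MC): all methane that this year's landfilled waste will
    eventually generate is attributed to this year (m^3 CH4)."""
    return L0 * mass_this_year_mg

def waste_in_place(waste_history_mg, inventory_year, k=0.05, L0=100.0):
    """Waste-in-place (WIP, first-order decay): methane generated during the
    inventory year by all waste already in the landfill (m^3 CH4)."""
    return sum(
        k * L0 * mass * math.exp(-k * (inventory_year - placed_year))
        for placed_year, mass in enumerate(waste_history_mg[:inventory_year + 1])
    )

history = [50_000] * 20   # hypothetical constant annual tonnage (Mg)
print(round(methane_commitment(history[-1])))            # MC estimate for the latest year
print(round(waste_in_place(history, len(history) - 1)))  # WIP estimate for the same year
```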
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Air pollution is a worldwide environmental issue and frequently has negative effects on population health and on urban ecosystems. The relationship between climate and atmospheric pollution can be used as a surrogate for the intensity of air pollution. The presence and quantity of certain gases can be used as indicators of air quality: particulate matter (PM), sulfur dioxide (SO2), carbon monoxide (CO), ozone (O3), and nitrogen dioxide (NO2). Among these gases, CO has its major source within cities, where automobiles are the main emitters. Measuring pollutant concentrations is challenging, partly because good equipment is costly and partly because the wide variety of instrument models differ in precision, measurement method, and vendor availability. Modeling is useful when the aim is to evaluate air pollution and its sources and to assess scenarios. This work uses the CAL3QHCR model developed by the U.S. Environmental Protection Agency (EPA) to generate predictive surfaces of CO concentration for a site within the city of Campinas, São Paulo state, Brazil. The CAL3QHCR model uses urban automobile circulation data to generate spatial results for CO distribution. We observed that the pollutant concentrations in our modeling were lower than the concentrations measured by Companhia Ambiental do Estado de São Paulo (CETESB), the main environmental agency of São Paulo state. The correlation between our model's average estimates and the CETESB measurements was also weak, indicating that the model used in this study needs to be better parameterized or that the scale at which CO emissions were estimated needs to be adjusted. Although the model failed to correlate with the CETESB data, the estimated emissions distributed across the sites may still help in understanding the spatial distribution of CO in the region, and the information generated can be used in other studies, for example to help explain heat islands.
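The validation step, comparing modeled CO with CETESB measurements, comes down to a correlation and a bias check. The sketch below shows one minimal way to do it, with made-up hourly values; it is an illustration of the comparison, not the study's actual data or procedure.

```python
import statistics

def compare_model_to_monitor(modeled_ppm, measured_ppm):
    """Pearson correlation and mean bias (modeled - measured) between dispersion-model
    estimates and monitoring-station observations."""
    r = statistics.correlation(modeled_ppm, measured_ppm)   # Python 3.10+
    bias = statistics.mean(m - o for m, o in zip(modeled_ppm, measured_ppm))
    return r, bias

# Hypothetical hourly CO averages (ppm): CAL3QHCR-style estimates vs a CETESB monitor
modeled  = [0.4, 0.6, 0.5, 0.9, 0.7]
measured = [1.1, 1.3, 1.0, 1.6, 1.2]
r, bias = compare_model_to_monitor(modeled, measured)
print(f"r = {r:.2f}, mean bias = {bias:.2f} ppm")
```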
Abstract:
In 1986, the U.S. Environmental Protection Agency (EPA) initiated an effort to comply more fully with the Endangered Species Act. This effort became their "Endangered Species Protection Program." The possibility of such a program was forecast in 1982 when Donald A. Spencer gave a presentation to the Tenth Vertebrate Pest Conference on "Vertebrate Pest Management and Changing Times." This paper focuses on current plans for implementing the EPA's Endangered Species Protection Program as it relates to the USDA Forest Service. It analyzes the potential effects this program will have on the agency, using the pocket gopher (Thomomys spp.), strychnine, and the grizzly bear (Ursus arctos horribilis) as examples of an affected pest, pesticide, and predator.
Abstract:
Airports worldwide are at a disadvantage when it comes to spotting birds and warning aircrews about the location of flocks, either on the ground or close to the airfield. Birds simply cannot be easily seen during the day and are nearly invisible targets for planes at night or during low visibility. Thermal imaging (infrared) devices can be used to allow ground and tower personnel to pinpoint bird locations day or night, thus giving airport operators the ability to launch countermeasures or simply warn the aircrews. This technology is available now, though it has been predominantly limited to medical and military system applications. The cost of these devices has dropped significantly in recent years as technology, capability, and availability have continued to increase. Davison Army Airfield (DAAF), located about 20 miles south of Ronald Reagan National Airport in Washington, DC, is the transient home to many bird species, including an abundance of ducks, seagulls, pigeons, and migrating Canada geese. Over the past few years, DAAF implemented a variety of measures in an attempt to control the bird hazards on the airfield. Unfortunately, when it came to controlling these birds on or near our runways and aircraft movement areas, we were more reactive than proactive. We would do airfield checks several times an hour to detect and deter any birds in these areas. The deterrents used included vehicle/human presence, pyrotechnics, and the periodic use of a trained border collie. At the time, we felt we were doing all we could to reduce the threat to aircraft and human life. It was not until a near-fatal accident in October 1998 that we truly realized how dangerous our operating environment really was for aircraft at or near the airfield. At that time, a C-12 (twin-engine passenger plane) landed on our primary runway at night. The tower cleared the aircraft to land, and upon touchdown the aircraft collided with a flock of geese. Neither the tower nor the crew of the aircraft saw the geese because they were obscured in the darkness. The end result was 12 dead geese and $374,000 in damage to the C-12. Fortunately, there were no human fatalities, but it was painfully clear that we needed to improve our method of clearing the runway at night and during low-visibility conditions. It was through this realization that we turned to the U.S. Army Communications and Electronics Command for ideas on ways to deal with our threat. It was through a sub-organization within this command, Night Vision Labs, that we realized the possibilities of adapting thermal imagery and infrared technology to detect wildlife on airports.