46 results for Smog


Relevance:

20.00%

Publisher:

Abstract:

An economic air pollution control model, which determines the least cost of reaching various air quality levels, is formulated. The model takes the form of a general, nonlinear, mathematical programming problem. Primary contaminant emission levels are the independent variables. The objective function is the cost of attaining various emission levels and is to be minimized subject to constraints that given air quality levels be attained.

The model is applied to a simplified statement of the photochemical smog problem in Los Angeles County in 1975, with emissions specified by a two-dimensional vector: total reactive hydrocarbon (RHC) and nitrogen oxide (NOx) emissions. Air quality, also two-dimensional, is measured by the expected number of days per year that nitrogen dioxide (NO2) and mid-day ozone (O3) exceed standards in Central Los Angeles.

The minimum cost of reaching various emission levels is found by a linear programming model. The base or "uncontrolled" emission levels are those that will exist in 1975 with the present new car control program and with the degree of stationary source control existing in 1971. Controls, basically "add-on devices", are considered here for used cars, aircraft, and existing stationary sources. It is found that with these added controls, Los Angeles County emission levels [(1300 tons/day RHC, 1000 tons/day NOx) in 1969] and [(670 tons/day RHC, 790 tons/day NOx) at the base 1975 level] can be reduced to 260 tons/day RHC (minimum RHC program) and 460 tons/day NOx (minimum NOx program).

"Phenomenological" or statistical air quality models provide the relationship between air quality and emissions. These models estimate the relationship by using atmospheric monitoring data taken at one (yearly) emission level and by using certain simple physical assumptions, (e. g., that emissions are reduced proportionately at all points in space and time). For NO2, (concentrations assumed proportional to NOx emissions), it is found that standard violations in Central Los Angeles, (55 in 1969), can be reduced to 25, 5, and 0 days per year by controlling emissions to 800, 550, and 300 tons /day, respectively. A probabilistic model reveals that RHC control is much more effective than NOx control in reducing Central Los Angeles ozone. The 150 days per year ozone violations in 1969 can be reduced to 75, 30, 10, and 0 days per year by abating RHC emissions to 700, 450, 300, and 150 tons/day, respectively, (at the 1969 NOx emission level).

The control cost-emission level and air quality-emission level relationships are combined in a graphical solution of the complete model to find the cost of various air quality levels. Best possible air quality levels with the controls considered here are 8 O3 and 10 NO2 violations per year (minimum ozone program) or 25 O3 and 3 NO2 violations per year (minimum NO2 program) with an annualized cost of $230,000,000 (above the estimated $150,000,000 per year for the new car control program for Los Angeles County motor vehicles in 1975).
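
A minimal sketch of the least-cost control formulation described above, written as a linear program in the spirit of the study: choose RHC and NOx abatement levels that meet air-quality-derived emission targets at minimum annualized cost. The cost coefficients and abatement targets below are hypothetical placeholders, not figures from the paper.

```python
# Least-cost emission-control LP (illustrative numbers only).
import numpy as np
from scipy.optimize import linprog

# Decision variables: tons/day abated below the 1975 base (670 RHC, 790 NOx).
cost_per_ton_day = np.array([0.30, 0.45])   # $M per ton/day abated (illustrative)

# Air-quality constraints translated into minimum abatement (illustrative):
# abate at least 300 tons/day RHC and 200 tons/day NOx.
A_ub = -np.eye(2)
b_ub = -np.array([300.0, 200.0])

# The add-on controls considered can deliver at most (670-260, 790-460) tons/day.
bounds = [(0.0, 410.0), (0.0, 330.0)]

res = linprog(cost_per_ton_day, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("abatement [RHC, NOx] tons/day:", res.x, "cost ($M/yr):", round(res.fun, 1))
```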

Relevance:

20.00%

Publisher:

Abstract:

While several studies have investigated winter-time air pollution with a wide range of concentration levels, hardly any results are available for longer time periods covering several winter-smog episodes at various locations; for example, often only a few weeks from a single winter are investigated. Here, we present source apportionment results of winter-smog episodes from 16 air pollution monitoring stations across Switzerland from five consecutive winters. Radiocarbon (14C) analyses of the elemental (EC) and organic (OC) carbon fractions, as well as levoglucosan, major water-soluble ionic species and gas-phase pollutant measurements, were used to characterize the different sources of PM10. The most important contributions to PM10 during winter-smog episodes in Switzerland were on average the secondary inorganic constituents (sum of nitrate, sulfate and ammonium = 41 ± 15%), followed by organic matter (OM) (34 ± 13%) and EC (5 ± 2%). The non-fossil fractions of OC (fNF,OC) ranged on average from 69 to 85% and from 80 to 95% for stations north and south of the Alps, respectively, showing that traffic contributes on average only up to ~30% of OC. The non-fossil fraction of EC (fNF,EC), entirely attributable to primary wood burning, was on average 42 ± 13% and 49 ± 15% for north and south of the Alps, respectively. While a high correlation was observed between fossil EC and nitrogen oxides, both primarily emitted by traffic, these species did not significantly correlate with fossil OC (OCF), which suggests that a considerable amount of OCF is secondary, formed from fossil precursors. Elevated fNF,EC and fNF,OC values, and the high correlation of the latter with other wood-burning markers, including levoglucosan and water-soluble potassium (K+), indicate that residential wood burning is the major source of carbonaceous aerosols during winter-smog episodes in Switzerland. Inspection of the non-fossil OC and EC levels and their relation to levoglucosan and water-soluble K+ shows different ratios for stations north and south of the Alps, most likely because of differences in burning technologies between these two regions of Switzerland.
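
The fossil/non-fossil split rests on the 14C "fraction modern" of each carbon fraction. The sketch below is only a rough illustration of that conversion; the sample values and the non-fossil reference factor are assumptions, not the study's data.

```python
# Convert 14C "fraction modern" into non-fossil fractions and mass splits.
def non_fossil_fraction(f_modern_sample, f_modern_reference=1.10):
    """Non-fossil fraction: measured fraction modern divided by the fraction
    modern expected for purely non-fossil carbon (reference assumed ~1.10 to
    account for the bomb-14C excess in wood burned today)."""
    return f_modern_sample / f_modern_reference

oc_total = 6.0   # ug/m3, measured OC (illustrative)
ec_total = 1.0   # ug/m3, measured EC (illustrative)

f_nf_oc = non_fossil_fraction(0.90)   # ~0.82, within the 0.69-0.95 range reported
f_nf_ec = non_fossil_fraction(0.50)   # ~0.45, close to the ~42-49% reported

print("fossil OC:", round((1 - f_nf_oc) * oc_total, 2), "ug/m3")
print("wood-burning EC:", round(f_nf_ec * ec_total, 2), "ug/m3")
```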

Relevance:

10.00%

Publisher:

Abstract:

We report on an inter-comparison of six different hygroscopicity tandem differential mobility analysers (HTDMAs). These HTDMAs are used worldwide in laboratories and in field campaigns to measure the water uptake of aerosol particles and had never been intercompared before. After an investigation of the different instrument designs, with their advantages and drawbacks, the methods for calibration, validation and analysis are presented. Measurements of nebulised ammonium sulphate as well as of secondary organic aerosol generated in a smog chamber were performed. Agreement and discrepancies between the instruments and with theory are discussed, and final recommendations for a standard instrument are given as a benchmark for laboratory or field experiments to ensure a high quality of HTDMA data.
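
The core HTDMA quantity is the diameter growth factor at a controlled relative humidity. The sketch below uses illustrative numbers and a kappa-Köhler simplification that neglects the Kelvin term; it is not part of the intercomparison protocol itself.

```python
# Growth factor and a single-parameter kappa estimate (Kelvin term neglected).
def growth_factor(d_wet_nm, d_dry_nm):
    """Diameter growth factor GF = D_wet / D_dry at a given relative humidity."""
    return d_wet_nm / d_dry_nm

def kappa_from_gf(gf, rh_fraction):
    """kappa from GF^3 = 1 + kappa * aw / (1 - aw), with water activity ~ RH."""
    aw = rh_fraction
    return (gf**3 - 1.0) * (1.0 - aw) / aw

gf = growth_factor(170.0, 100.0)   # 100 nm dry ammonium sulphate at 90% RH (illustrative)
print("GF =", round(gf, 2), " kappa ~", round(kappa_from_gf(gf, 0.90), 2))
```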

Relevance:

10.00%

Publisher:

Abstract:

The Street Computing workshop, held in conjunction with OZCHI 2009, solicits papers discussing new research directions, early research results, works-in-progress and critical surveys of prior research work in the areas of ubiquitous computing and interaction design for urban environments. Urban spaces have unique characteristics. Typically, they are densely populated, buzzing with life twenty-four hours a day, seven days a week. These traits afford many opportunities, but they also present many challenges: traffic jams, smog and pollution, stress placed on public services, and more. Computing technology, particularly the kind that can be placed in the hands of citizens, holds much promise in combating some of these challenges. Yet computation is not merely a tool for overcoming challenges; rather, when embedded appropriately in our everyday lives, it becomes a tool of opportunity: for shaping how our cities evolve, for enabling us to interact with our city and its people in new ways, and for uncovering useful but hidden relationships and correlations between elements of the city. The increasing availability of an urban computing infrastructure has led to new and exciting ways in which inhabitants can interact with their city. This includes interaction with a wide range of services (e.g. public transport, public services), conceptual representations of the city (e.g. local weather and traffic conditions), the availability of a variety of shared and personal displays (e.g. public, ambient, mobile) and the use of different interaction modes (e.g. tangible, gesture-based, token-based). This workshop solicits papers that address the above themes in some way. We encourage researchers to submit work that deals with the challenges and possibilities posed by the availability of urban computing infrastructure such as sensors and middleware for sensor networks. This includes new and innovative ways of interacting with and within urban environments; user experience design and participatory design approaches for urban environments; social aspects of urban computing; and other related areas.

Relevance:

10.00%

Publisher:

Abstract:

An experimental investigation has been made of a round, non-buoyant plume of nitric oxide, NO, in a turbulent grid flow of ozone, O3, using the Turbulent Smog Chamber at the University of Sydney. The measurements have been made at a resolution not previously reported in the literature. The reaction is conducted at non-equilibrium, so there is significant interaction between turbulent mixing and chemical reaction. The plume has been characterized by a set of constant initial reactant concentration measurements consisting of radial profiles at various axial locations. Whole-plume behaviour can thus be characterized, and parameters are selected for a second set of fixed physical location measurements in which the effects of varying the initial reactant concentrations are investigated. Careful experiment design and specially developed chemiluminescent analysers, which measure fluctuating concentrations of reactive scalars, ensure that spatial and temporal resolutions are adequate to measure the quantities of interest. Conserved scalar theory is used to define a conserved scalar from the measured reactive scalars and to define frozen, equilibrium and reaction-dominated cases for the reactive scalars. Reactive scalar means and the mean reaction rate are bounded by the frozen and equilibrium limits, but this is not always the case for the reactant variances and covariances. The plume reactant statistics are closer to the equilibrium limit than those for the ambient reactant. The covariance term in the mean reaction rate is found to be negative and significant for all measurements made. The Toor closure was found to overestimate the mean reaction rate by 15 to 65%. Gradient-model turbulent diffusivities had significant scatter and were not observed to be affected by reaction. The ratio of the turbulent diffusivity for the conserved scalar mean to that for its r.m.s. was found to be approximately 1. Estimates of the ratio of the dissipation timescales of around 2 were found downstream. Estimates of the correlation coefficient between the conserved scalar and its dissipation (parallel to the mean flow) were found to be between 0.25 and the significant value of 0.5. Scalar dissipations for non-reactive and reactive scalars were found to be significantly different. Conditional statistics are found to be a useful way of investigating the reactive behaviour of the plume, effectively decoupling the interaction of chemical reaction and turbulent mixing. It is found that conditional reactive scalar means lack significant transverse dependence, as has previously been found theoretically by Klimenko (1995). It is also found that the conditional variance around the conditional reactive scalar means is relatively small, simplifying the closure for the conditional reaction rate. These properties are important for the Conditional Moment Closure (CMC) model for turbulent reacting flows recently proposed by Klimenko (1990) and Bilger (1993). Preliminary CMC model calculations are carried out for this flow using a simple model for the conditional scalar dissipation. Model predictions and measured conditional reactive scalar means compare favorably. The reaction-dominated limit is found to indicate the maximum reactedness of a reactive scalar and is a limiting case of the CMC model. Conventional (unconditional) reactive scalar means obtained from the preliminary CMC predictions using the conserved scalar p.d.f. compare favorably with those found from experiment, except where the measuring position is relatively far upstream of the stoichiometric distance. Recommendations include applying a full CMC model to the flow and investigating both the less significant terms in the conditional mean species equation and the small variation of the conditional mean with radius. Forms for the p.d.f.s, in addition to those found from experiments, could be useful for extending the CMC model to reactive flows in the atmosphere.
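
The role of the covariance term in the mean reaction rate can be seen in a simple second-moment decomposition. The sketch below uses synthetic concentration series for the NO + O3 system, not the Sydney chamber data; the rate constant and statistics are illustrative.

```python
# Second-moment decomposition: <c1 c2> = <c1><c2> + cov(c1, c2).
import numpy as np

rng = np.random.default_rng(0)
k = 4.0e-4  # illustrative rate-constant scale, ppm^-1 s^-1

c_no = np.clip(rng.normal(0.10, 0.04, 10_000), 0.0, None)                      # plume reactant
c_o3 = np.clip(0.20 - 0.8 * c_no + rng.normal(0.0, 0.02, 10_000), 0.0, None)   # ambient reactant

mean_rate = k * np.mean(c_no * c_o3)
mean_term = k * c_no.mean() * c_o3.mean()
cov_term = k * np.cov(c_no, c_o3, bias=True)[0, 1]

print(f"mean rate       : {mean_rate:.3e}")
print(f"<c1><c2> term   : {mean_term:.3e}")
print(f"covariance term : {cov_term:.3e} (negative, as in the measurements)")
# A closure that neglects or mis-estimates this negative covariance will
# overestimate the mean rate, consistent with the 15-65% overestimate reported.
```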

Relevance:

10.00%

Publisher:

Abstract:

This work investigates the computer modelling of the photochemical formation of smog products such as ozone and aerosol in a system containing toluene, NOx and water vapour. In particular, the problem of modelling this process in the Commonwealth Scientific and Industrial Research Organization (CSIRO) smog chambers, which utilize outdoor exposure, is addressed. The primary requirement for such modelling is a knowledge of the photolytic rate coefficients. Photolytic rate coefficients of species other than NO2 are often related to JNO2 (the rate coefficient for the photolysis of NO2) by a simple factor, but for outdoor chambers this method is prone to error, as the diurnal profiles may not be similar in shape. Three methods for the calculation of diurnal JNO2 are investigated. The most suitable method for incorporation into a general model is found to be one which determines the photolytic rate coefficients for NO2, as well as several other species, from actinic flux, absorption cross-section and quantum yields. A computer model was developed, based on this method, to calculate in-chamber photolysis rate coefficients for the CSIRO smog chambers, in which ex-chamber rate coefficients are adjusted by accounting for variation in light intensity due to transmittance through the Teflon walls, albedo from the chamber floor and radiation attenuation due to clouds. The photochemical formation of secondary aerosol is investigated in a series of toluene-NOx experiments performed in the CSIRO smog chambers. Three stages of aerosol formation are identified in plots of total particulate volume versus time: a delay period in which no significant mass of aerosol is formed, a regime of rapid aerosol formation (regime 1) and a second regime of slowed aerosol formation (regime 2). Two models are presented which were developed from the experimental data. One model is empirically based on observations of discrete stages of aerosol formation and readily allows aerosol growth profiles to be calculated. The second model is based on an adaptation of published toluene photooxidation mechanisms and provides some chemical information about the oxidation products. Both models compare favorably against the experimental data. The gross effects of precursor concentrations (toluene, NOx and H2O) and ambient conditions (temperature, photolysis rate) on the formation of secondary aerosol are also investigated, primarily using the mechanism model. An increase in [NOx]0 results in an increased delay time, rate of aerosol formation in regime 1 and volume of aerosol formed in regime 1. This is due to increased formation of dinitrocresol and furanone products. An increase in toluene results in a decrease in the delay time and an increase in the rate of aerosol formation in regime 1, due to enhanced reactivity from the toluene products, such as the radicals from the photolysis of benzaldehyde. Water vapour has very little effect on the formation of aerosol volume, except that rates are slightly increased due to more OH radicals from reaction with O(1D) from ozone photolysis. Increased temperature results in an increased volume of aerosol formed in regime 1 (increased dinitrocresol formation), while an increased photolysis rate results in an increased rate of aerosol formation in regime 1. Both the rate and volume of aerosol formed in regime 2 are increased by increased temperature or photolysis rate. Both models indicate that the yield of secondary particulates from hydrocarbons (mass concentration of aerosol formed/mass concentration of hydrocarbon precursor) is proportional to the ratio [NOx]0/[hydrocarbon]0.
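
The selected method amounts to a wavelength integral of actinic flux, absorption cross-section and quantum yield, followed by in-chamber attenuation factors. The sketch below uses made-up spectra and correction factors purely to illustrate the calculation; it is not the CSIRO chamber model itself.

```python
# Photolysis rate coefficient J = integral of F(lambda)*sigma(lambda)*phi(lambda) d(lambda),
# then adjusted for Teflon-wall transmittance, floor albedo and cloud attenuation.
import numpy as np

wavelength = np.linspace(300.0, 420.0, 121)                                    # nm
actinic_flux = 1.0e14 * np.exp(-((wavelength - 380.0) / 60.0) ** 2)            # photons cm-2 s-1 nm-1 (illustrative)
cross_section = 2.0e-19 * np.exp(-((wavelength - 400.0) / 50.0) ** 2)          # cm2 molecule-1 (illustrative)
quantum_yield = np.where(wavelength <= 398.0, 1.0, 0.0)                        # step-like cutoff (illustrative)

j_ambient = np.trapz(actinic_flux * cross_section * quantum_yield, wavelength)  # s-1

# In-chamber correction factors (all illustrative values).
teflon_transmittance = 0.85
floor_albedo_factor = 1.10
cloud_attenuation = 0.90
j_chamber = j_ambient * teflon_transmittance * floor_albedo_factor * cloud_attenuation

print(f"ex-chamber J ~ {j_ambient:.2e} s-1, in-chamber J ~ {j_chamber:.2e} s-1")
```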

Relevance:

10.00%

Publisher:

Abstract:

A significant fraction of the total nitrogen entering coastal and estuarine ecosystems along the eastern U.S. coast arises from atmospheric deposition; however, the exact role of atmospherically derived nitrogen in the decline of the health of coastal, estuarine, and inland waters is still uncertain. From the perspective of coastal ecosystem eutrophication, nitrogen compounds from the air, along with nitrogen from sewage, industrial effluent, and fertilizers, become a source of nutrients to the receiving ecosystem. Eutrophication, however, is only one of the detrimental impacts of the emission of nitrogen-containing compounds to the atmosphere. Other adverse effects include the production of tropospheric ozone, acid deposition, and decreased visibility (photochemical smog). Assessments of the coastal eutrophication problem indicate that the atmospheric deposition loading is most important in the region extending from Albemarle/Pamlico Sounds to the Gulf of Maine; however, these assessments are based on model outputs supported by a meager amount of actual data. The data shortage is severe. The National Research Council specifically mentions the atmospheric role in its recent publication for the Committee on Environmental and Natural Resources, Priorities for Coastal Ecosystem Science (1994). It states that "Problems associated with changes in the quantity and quality of inputs to coastal environments from runoff and atmospheric deposition are particularly important [to coastal ecosystem integrity]. These include nutrient loading from agriculture and fossil fuel combustion, habitat losses from eutrophication, widespread contamination by toxic materials, changes in riverborne sediment, and alteration of coastal hydrodynamics."

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Health and health promotion are closely related to individuals' level of education or literacy. In community pharmacies, written information in the form of information leaflets is commonly used for patient education. This tool gives its users autonomy, depending on their individual literacy skills. Health literacy is the individual ability to obtain, process and interpret information about health and health services in order to make informed decisions. Measuring the level of difficulty in reading and comprehending English-language texts has been studied since 1920, when the first formulas were proposed to classify the minimum level of schooling that a text requires for its reading and comprehension; these are known as readability formulas. Materials and methods: The main objective of this work was to test readability formulas, in particular SMOG and the Flesch-Kincaid index, for analysing the reading and interpretation complexity of information leaflets, which are widely used in Portuguese pharmacies. The two formulas were applied to a sample of 4 leaflets, thereby estimating the number of years of schooling needed for adequate comprehension of these texts. Two independent technical translators translated the leaflets into English, and a consensus version was obtained for each leaflet. Back-translations were assessed to confirm that the translations matched the original texts. Applying the two readability formulas simultaneously made it possible to study the degree of convergent validity and the reliability of the results. Results: The mean SMOG value was 11.13 (SD = 0.81), while the mean Flesch-Kincaid value was 8.32 (SD = 0.90). These results correspond to a minimum schooling level in Portugal of about 10 years. The Spearman correlation between the scales was 0.948 (p = 0.51), a convergence that confirmed the validity of the results. For the set of information leaflets analysed, the minimum formal education needed to read and understand their content was 9 completed years of schooling. A significant proportion of the population that uses pharmacies as an accessible and credible source of health information, in particular chronic patients and the elderly, has schooling levels that are usually below the 9th year. Thus, the current information leaflets may not be fully understood by their intended readers. Although they are an important tool for health-related decisions, the current leaflets may not be as useful as might initially have been anticipated. Conclusions: Readability formulas are important tools for assessing and adapting pieces of written information to their potential users, serving as feasible alternatives to classical population literacy tests and contributing to the success of health education strategies. It would be desirable to develop and validate readability tools for our own language.
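
For reference, the two formulas applied in the study are straightforward to compute once sentences, words and syllables are counted. The sketch below uses a crude English vowel-group syllable counter as a placeholder assumption; a proper syllable routine (and a Portuguese-adapted one) would be needed in practice.

```python
# SMOG grade and Flesch-Kincaid grade level for a piece of text.
import re
from math import sqrt

def count_syllables(word: str) -> int:
    """Crude English syllable estimate: count groups of vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = [count_syllables(w) for w in words]
    polysyllables = sum(1 for s in syllables if s >= 3)

    smog = 1.0430 * sqrt(polysyllables * (30.0 / len(sentences))) + 3.1291
    fk = 0.39 * (len(words) / len(sentences)) + 11.8 * (sum(syllables) / len(words)) - 15.59
    return {"SMOG": round(smog, 2), "Flesch-Kincaid": round(fk, 2)}

print(readability("Take one tablet twice daily. Do not exceed the stated dose. "
                  "Consult your pharmacist if symptoms persist."))
```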

Relevance:

10.00%

Publisher:

Abstract:

This project characterizes the cement industry in Colombia, starting from a contextualization and description of this activity worldwide and in the country, and describing and analysing matters related to its supply chain. It takes into account the direct and indirect actors that interact within it and topics such as international trade, the performance and regional competitiveness of the national industry, business perception, and factors such as social responsibility and good practices, in addition to describing national factors that affect the industry, such as infrastructure, transport and the country's logistics performance. The global contextualization covers topics such as worldwide production, supply and demand for this product and the main logistics trends and practices that characterize the industry internationally; the national overview characterizes the country's competitiveness and logistics performance and the factors affecting the cement industry. The origins and history of the industry in the country are presented, together with data on its production, shipments and international trade, a brief description of its supply chain and a characterization of the relationships between its links. Finally, the bodies with which the industry interacts are described, the most relevant aspects of environmental policy, social responsibility and good practices of the country's main producers are outlined, and the work concludes with a characterization of the overall competitiveness of the industry at the national level and the challenges and problems facing the sector that limit its ability to improve its regional performance.

Relevance:

10.00%

Publisher:

Abstract:

The dual role of ozone in the environment is described, together with the chemical processes that produce or alter it and the agents involved in those processes. This chemical species occurs in two forms: stratospheric ozone (the ozone layer) and urban ozone (photochemical smog). The former, essential for life on Earth, filters out about 90 per cent of UV radiation. The latter forms in the lower layers of the atmosphere as a result of various photochemical reactions involving polluting substances. Reducing the emission of polluting substances is recommended in order to maintain the equilibrium of the stratospheric ozone layer and to prevent the increase of ozone in the lower atmosphere, where it is harmful.

Relevance:

10.00%

Publisher:

Abstract:

There is ongoing debate concerning the possible environmental and human health impacts of growing genetically modified (GM) crops. Here, we report the results of a life-cycle assessment (LCA) comparing the environmental and human health impacts of conventional sugar beet growing regimes in the UK and Germany with those that might be expected if GM glyphosate-tolerant sugar beet is commercialized. The results presented for a number of environmental and human health impact categories suggest that growing the GM herbicide-tolerant crop would be less harmful to the environment and human health than growing the conventional crop, largely due to lower emissions from herbicide manufacture, transport and field operations. Emissions contributing to negative environmental impacts, such as global warming, ozone depletion, ecotoxicity of water and acidification and nutrification of soil and water, were much lower for the herbicide-tolerant crop than for the conventional crop. Emissions contributing to summer smog, toxic particulate matter and carcinogenicity, which have negative human health impacts, were also substantially lower for the herbicide-tolerant crop. The environmental and human health impacts of growing GM crops need to be assessed on a case-by-case basis using a holistic approach. LCA is a valuable technique for helping to undertake such assessments.
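
In LCA, each impact category score is obtained by weighting inventory emissions with characterization factors and summing. The sketch below uses a hypothetical inventory and GWP100 factors purely to show the mechanics; it is not data from the study.

```python
# Aggregate a life-cycle inventory into one impact category score (global warming).
emissions_per_ha = {          # life-cycle inventory, kg per hectare (illustrative)
    "CO2": 900.0,
    "CH4": 1.2,
    "N2O": 2.5,
}
gwp100 = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}   # characterization factors, kg CO2-eq/kg

global_warming_score = sum(mass * gwp100[species] for species, mass in emissions_per_ha.items())
print(f"global warming potential: {global_warming_score:.0f} kg CO2-eq per hectare")
```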

Relevance:

10.00%

Publisher:

Abstract:

Life-Cycle Assessment (LCA) was used to assess the potential environmental and human health impacts of growing genetically-modified (GM), herbicide-tolerant sugar beet in the UK and Germany compared with conventional sugar beet varieties. The GM variety results in lower potential environmental impacts on global warming, airborne nutrification, ecotoxicity (of soil and water) and watercourse enrichment, and lower potential human health impacts in terms of production of toxic particulates, summer smog, carcinogens and ozone depletion. Although the overall contribution of GM sugar beet to reducing harmful emissions to the environment would be relatively small, the potential for GM crops to reduce pollution from agriculture, including diffuse water pollution, is highlighted.

Relevance:

10.00%

Publisher:

Abstract:

The formation and composition of secondary organic aerosol (SOA) from the photooxidation of benzene, p-xylene, and 1,3,5-trimethylbenzene has been simulated using the Master Chemical Mechanism version 3.1 (MCM v3.1) coupled to a representation of the transfer of organic material from the gas to the particle phase. The combined mechanism was tested against data obtained from a series of experiments conducted at the European Photoreactor (EUPHORE) outdoor smog chamber in Valencia, Spain. Simulated aerosol mass concentrations compared reasonably well with the measured SOA data only after absorptive partitioning coefficients were increased by a factor of between 5 and 30. The requirement of such scaling was interpreted in terms of the occurrence of unaccounted-for association reactions in the condensed organic phase leading to the production of relatively more nonvolatile species. Comparisons were made between the relative aerosol-forming efficiencies of benzene, toluene, p-xylene, and 1,3,5-trimethylbenzene, and the differences in the OH-initiated degradation mechanisms of these aromatic hydrocarbons. A strong, nonlinear relationship was observed between measured (reference) yields of SOA and (proportional) yields of unsaturated dicarbonyl aldehyde species resulting from ring-fragmenting pathways. This observation, and the results of the simulations, are strongly suggestive of the involvement of reactive aldehyde species in association reactions occurring in the aerosol phase, thus promoting SOA formation and growth. The effect of NOx concentrations on SOA formation efficiencies (and formation mechanisms) is discussed.
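
Absorptive partitioning of the kind used in the coupled model assigns each semivolatile product a particle-phase fraction that grows with its partitioning coefficient K_p and with the absorbing organic mass M_o. The sketch below, with illustrative parameter values only, shows why scaling K_p by a factor of 5 to 30 markedly increases the simulated aerosol mass.

```python
# Particle-phase fraction of a semivolatile product under absorptive partitioning:
# fraction = K_p * M_o / (1 + K_p * M_o).
def particle_fraction(kp_m3_per_ug: float, m_o_ug_per_m3: float) -> float:
    """Fraction of a semivolatile product found in the condensed phase."""
    return kp_m3_per_ug * m_o_ug_per_m3 / (1.0 + kp_m3_per_ug * m_o_ug_per_m3)

m_o = 20.0        # absorbing organic aerosol mass, ug m-3 (illustrative)
kp_base = 0.005   # base partitioning coefficient, m3 ug-1 (illustrative)

for scale in (1, 5, 30):
    print(f"K_p x{scale:>2}: particle-phase fraction = "
          f"{particle_fraction(scale * kp_base, m_o):.2f}")
```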

Relevance:

10.00%

Publisher:

Abstract:

The risks of accident, illness and early death are part of life on Earth and touch all living creatures, including mankind. Because of modern mass communication media, the emotional impact of current risks is greater than in the past. Many unnecessary risks can and should be avoided, and others can be drastically reduced. In addition, some risks should be confronted in order to avoid greater ones. In any risk analysis, the eventual benefits should be taken into consideration, along with the risks surrounding other activities and other factors. Some examples of risks and their implications are presented and discussed in general terms. Nuclear energy is treated specifically, but the discussion also refers to and comments on the risks surrounding other human activities (airplanes, automobiles, smog, gasoline, DDT, and coal energy). As in the history of aviation, the nuclear industry has a history of more successes than failures. Nonetheless, in both cases serious accidents deserve careful thought, including the strengthening of safety standards. The current fear of nuclear energy among some people is compared to the unfounded fear at the advent of gasoline in the last century. Risks, naturally, should not be exaggerated, but neither can they be disregarded. The main intention of the author is to discuss the complexity of the problem and to see that risks are evaluated and accepted. In relation to nuclear energy, the author only mentions his point of view, defended in other publications, that it involves very high risks.