973 results for particle number emissions


Relevance: 30.00%

Abstract:

Compression ignition (CI) engine design is subject to many constraints, which together present a multi-criteria optimisation problem that the engine researcher must solve. In particular, the modern CI engine must not only be efficient, but must also deliver low gaseous, particulate and life-cycle greenhouse gas emissions, so that its impact on urban air quality, human health and global warming is minimised. Consequently, this study undertakes a multi-criteria analysis to identify alternative fuels, injection technologies and combustion strategies that could potentially satisfy these CI engine design constraints. Three datasets are analysed with the Preference Ranking Organization Method for Enrichment Evaluations and Geometrical Analysis for Interactive Aid (PROMETHEE-GAIA) algorithm to explore the impact of: (1) an ethanol fumigation system; (2) alternative fuels (20 % biodiesel and synthetic diesel) and alternative injection technologies (mechanical direct injection and common rail injection); and (3) various biodiesel fuels made from three feedstocks (soy, tallow and canola) tested at several blend percentages (20-100 %) on the resulting emissions and efficiency profile of the various test engines. The results show that moderate ethanol substitutions (~20 % by energy) at moderate load, high-percentage soy blends (60-100 %), and alternative fuels (biodiesel and synthetic diesel) provide an efficiency and emissions profile that yields the most “preferred” solutions to this multi-criteria engine design problem. Further research is, however, required to reduce Reactive Oxygen Species (ROS) emissions with alternative fuels, and to deliver technologies that do not significantly reduce the median diameter of particle emissions.
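PROMETHEE ranks alternatives by comparing them pairwise on each criterion and aggregating the weighted preferences into a net outranking flow. As a rough illustration, here is a minimal sketch of a PROMETHEE-style net-flow calculation using the simple "usual" preference function; the fuel strategies, criteria values and weights below are hypothetical, not data from the study:

```python
def net_flows(scores, weights, maximise):
    """PROMETHEE-style net outranking flows with the 'usual' preference
    function (full preference for any favourable difference).
    scores[a][c]: value of alternative a on criterion c."""
    n = len(scores)
    phi = []
    for a in range(n):
        total = 0.0
        for b in range(n):
            if a == b:
                continue
            for c, w in enumerate(weights):
                d = scores[a][c] - scores[b][c]
                if not maximise[c]:
                    d = -d  # for a 'minimise' criterion, smaller is better
                total += w * ((1.0 if d > 0 else 0.0) - (1.0 if d < 0 else 0.0))
        phi.append(total / (n - 1))
    return phi

# Three hypothetical strategies scored on efficiency (maximise) and
# particulate emissions (minimise), with hypothetical weights.
scores = [[0.42, 0.03],   # e.g. baseline diesel
          [0.41, 0.01],   # e.g. an ethanol-fumigation strategy
          [0.38, 0.05]]   # e.g. a low-blend alternative fuel
phi = net_flows(scores, weights=[0.4, 0.6], maximise=[True, False])
best = max(range(len(phi)), key=lambda i: phi[i])
```

The alternative with the highest net flow is the most "preferred" compromise between the conflicting criteria; GAIA then visualises the same preference structure geometrically.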

Relevance: 30.00%

Abstract:

Nonthermal plasma (NTP) treatment of exhaust gas, in which plasma is introduced into the exhaust stream, is a promising technology for reducing both nitrogen oxides (NOX) and particulate matter (PM). This paper considers the effect of NTP on PM mass reduction, PM size distribution and PM removal efficiency. The experiments are performed on real exhaust gases from a diesel engine. The NTP is generated by applying high-voltage pulses from a pulsed power supply across a dielectric barrier discharge (DBD) reactor. The effects of applied high-voltage pulses up to 19.44 kVpp at a repetition rate of 10 kHz are investigated. It is shown that PM removal efficiency and PM size distribution need to be considered together, as it is possible to achieve high PM removal efficiency with an undesirable increase in the number of small particles. Considering both factors, a voltage level of 17 kVpp is determined to be the optimum for the given configuration. Moreover, particle deposition on the surface of the DBD reactor is found to be a significant phenomenon, which should be considered in all plasma PM removal tests.
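The headline metric here, PM removal efficiency, is conventionally the fractional reduction in PM concentration across the reactor. A one-line sketch; the concentration values in the usage example are illustrative, not measurements from the paper:

```python
def removal_efficiency(c_in, c_out):
    """Fractional PM removal across the reactor: (C_in - C_out) / C_in."""
    return (c_in - c_out) / c_in

# e.g. an inlet PM concentration of 100 units reduced to 20 units at the outlet
eta = removal_efficiency(100.0, 20.0)  # 0.8, i.e. 80 % removal
```

As the abstract notes, a high value of this single number can mask an increase in the count of small particles, which is why the size distribution must be examined alongside it.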

Relevance: 30.00%

Abstract:

The surface area of inhaled particles deposited in the alveolar region, as reported by the TSI nanoparticle surface area monitor (NSAM), was compared with the corresponding value estimated by a TSI scanning mobility particle sizer (SMPS) for a range of environmentally relevant aerosols, including petrol emissions, environmental tobacco smoke (ETS), laser printer emissions, cooking emissions and ambient aerosols. The SMPS values were based on a mobility size distribution, assuming spherical particles and using the appropriate size-dependent alveolar deposition factors provided by the ICRP. In most cases, the two instruments showed good linear agreement. With petrol emissions and ETS, the linearity extended to over 10³ μm² cm⁻³. With printer emissions, there was good linearity up to about 300 μm² cm⁻³, while the NSAM increasingly overestimated the surface area at higher concentrations. The presence of a nucleation event in ambient air caused the NSAM to overestimate the surface area by a factor of 2. We summarize these results, conclude that the maximum number concentration up to which the NSAM is accurate clearly depends on the type of aerosol being sampled, and provide guidance for the use of the instrument.
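The SMPS-based estimate described above amounts to summing, over the measured size bins, the surface area of a sphere at each mobility diameter, weighted by the bin's number concentration and its alveolar deposition fraction. A minimal sketch; the size bins and deposition fractions in the example are illustrative placeholders, not the ICRP values used in the study:

```python
import math

def alveolar_surface_area(diam_nm, number_cm3, dep_frac):
    """Alveolar-deposited surface area concentration (um^2 cm^-3), assuming
    spherical particles: sum over bins of pi*d^2 * N * DF_alv(d)."""
    total = 0.0
    for d, n, df in zip(diam_nm, number_cm3, dep_frac):
        area_um2 = math.pi * (d / 1000.0) ** 2  # surface area of one sphere, um^2
        total += area_um2 * n * df
    return total

# Illustrative three-bin distribution with placeholder deposition fractions
sa = alveolar_surface_area([20, 50, 100], [1e4, 5e3, 1e3], [0.2, 0.3, 0.3])
```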

Relevance: 30.00%

Abstract:

Sugarcane bagasse is an abundant and sustainable resource, generated as a by-product of sugarcane milling. The cellulosic material within bagasse can be broken down into glucose molecules and fermented to produce ethanol, making it a promising feedstock for biofuel production. Mild acid pretreatment hydrolyses the hemicellulosic component of biomass, allowing enzymes greater access to the cellulosic substrate during saccharification. A particle-scale mathematical model describing the mild acid pretreatment of sugarcane bagasse has been developed using a volume-averaged framework. Discrete population-balance equations are used to characterise the polymer degradation kinetics, and diffusive effects account for mass transport within the cell wall of the bagasse. As the fibrous material hydrolyses over time, variations in the porosity of the cell wall, and the downstream effects on the reaction kinetics, are accounted for using conservation-of-volume arguments. Non-dimensionalisation of the model equations reduces the number of parameters in the system to a set of four dimensionless ratios that compare the timescales of the different reaction and diffusion events. Theoretical yield curves are compared to macroscopic experimental observations from the literature, and inferences are made as to constraints on these “unknown” parameters. These results enable connections to be made between experimental data and the underlying thermodynamics of acid pretreatment. Consequently, the results suggest that data-fitting techniques used to obtain kinetic parameters should be applied carefully, with prudent consideration given to the chemical and physiological processes being modelled.
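The discrete population-balance idea can be illustrated with its simplest case: random scission of polymer chains into shorter ones, ignoring the diffusion and porosity evolution described above. A chain of length i has i−1 breakable bonds, and breaks of longer chains j produce, summed over bond positions, two fragments of each shorter length. The rate constant, initial distribution and explicit-Euler time step below are hypothetical, not fitted values from the paper:

```python
def scission_step(P, k, dt):
    """One explicit-Euler step of a random-scission population balance.
    P[idx] is the concentration of chains of length idx+1:
        dP_i/dt = -k*(i-1)*P_i + 2*k*sum_{j>i} P_j
    Total monomer mass (sum of i*P_i) is conserved by this system."""
    N = len(P)
    dP = [0.0] * N
    for idx in range(N):
        i = idx + 1
        dP[idx] -= k * (i - 1) * P[idx]                            # loss: i-1 bonds can break
        dP[idx] += 2.0 * k * sum(P[j] for j in range(idx + 1, N))  # gain from every longer chain
    return [p + dt * d for p, d in zip(P, dP)]

# Start with all chains of length 10 and integrate a short while
P = [0.0] * 9 + [1.0]
for _ in range(100):
    P = scission_step(P, k=0.5, dt=0.01)
mass = sum((i + 1) * p for i, p in enumerate(P))  # stays at 10.0
```

In the paper's full model these kinetics are coupled to diffusion through the cell wall and non-dimensionalised; this sketch shows only the reaction term.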

Relevance: 30.00%

Abstract:

Despite the existence of air quality guidelines in Australia and New Zealand, the concentrations of particulate matter have exceeded these guidelines on several occasions. To identify the sources of particulate matter, examine the contributions of these sources to the air quality of specific areas, and estimate the most likely locations of the sources, a growing number of source apportionment studies have been conducted. This paper provides an overview of the locations of the studies and the salient features of the results obtained, and offers some perspectives for the improvement of future receptor modelling of air quality in these countries. The review revealed that, because of its advantages over alternative models, Positive Matrix Factorisation (PMF) was the most commonly applied model in the studies. Although there were differences in the sources identified in the studies, some general trends were observed. While biomass burning was a common problem in both countries, the characteristics of this source varied from one location to another. In New Zealand, domestic heating was the highest contributor to particle levels on days when the guidelines were exceeded, whereas forest back-burning was a concern in Brisbane, and marine aerosol was a major source in most studies. Secondary sulphate, traffic emissions, industrial emissions and re-suspended soil were also identified as important sources. Unique species and metrics, for example volatile organic compounds and particle size distributions, were incorporated into some of the studies, with results that have significant ramifications for the improvement of air quality. Overall, the application of source apportionment models provided useful information that can assist the design of epidemiological studies and refine air pollution reduction strategies in Australia and New Zealand.
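PMF factorises the measured samples-by-species matrix X into non-negative source contributions G and source profiles F so that X ≈ GF, weighting each residual by its measurement uncertainty. As a rough stand-in, the sketch below fits the same non-negative factorisation with unweighted multiplicative updates (plain NMF, not EPA PMF); the matrices are tiny synthetic examples:

```python
import random

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def nmf(X, k, iters=500):
    """Unweighted non-negative factorisation X ~ G*F via multiplicative
    updates (Lee-Seung). PMF additionally down-weights uncertain entries."""
    random.seed(0)
    n, m, eps = len(X), len(X[0]), 1e-12
    G = [[random.random() + 0.1 for _ in range(k)] for _ in range(n)]
    F = [[random.random() + 0.1 for _ in range(m)] for _ in range(k)]
    for _ in range(iters):
        num, den = matmul(transpose(G), X), matmul(transpose(G), matmul(G, F))
        F = [[F[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)] for i in range(k)]
        num, den = matmul(X, transpose(F)), matmul(G, matmul(F, transpose(F)))
        G = [[G[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)] for i in range(n)]
    return G, F

# Synthetic data: 4 samples x 3 species built from 2 known "sources"
true_G = [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0], [0.2, 0.8]]
true_F = [[2.0, 1.0, 0.0], [0.0, 1.0, 3.0]]
X = matmul(true_G, true_F)
G, F = nmf(X, k=2)
err = sum((x - y) ** 2 for rx, ry in zip(X, matmul(G, F)) for x, y in zip(rx, ry))
```

The non-negativity constraint is what lets the recovered factors be interpreted physically as source profiles and contributions, which is the central advantage PMF holds over unconstrained receptor models.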

Relevance: 30.00%

Abstract:

Global awareness of cleaner and renewable energy is transforming the electricity sector at many levels. New technologies are being increasingly integrated into the electricity grid at high, medium and low voltage levels, new taxes on carbon emissions are being introduced, and individuals can now produce electricity, mainly through rooftop photovoltaic (PV) systems. While leading to improvements, these changes also introduce challenges, and a question that often arises is: how can we manage this constantly evolving grid? The Queensland Government and Ergon Energy, one of the two Queensland distribution companies, have partnered with Australian and German universities on a project to answer this question in a holistic manner. The project investigates the impact that the integration of renewables and other new technologies has on the physical structure of the grid, and how this evolving system can be managed in a sustainable and economical manner. To aid understanding of what the future might bring, a software platform has been developed that integrates two modelling techniques: agent-based modelling (ABM), to capture the characteristics of the different system units accurately and dynamically, and particle swarm optimisation (PSO), to find the most economical mix of network extension and integration of distributed generation over long periods of time. Using data from Ergon Energy, two types of networks have been modelled: three-phase networks, usually used in dense networks such as urban areas, and Single Wire Earth Return (SWER) networks, widely used in rural Queensland. Simulations can be performed on these networks to identify the required upgrades, following a three-step process: (a) assess what is already in place and how it performs under current and future loads, (b) determine what can be done to manage it and plan the future grid, and (c) evaluate how these upgrades and new installations will perform over time. The number of small-scale distributed generators, e.g. PV and battery systems, is now sufficient (and expected to increase) to impact the operation of the grid, which in turn needs to be considered by the distribution network manager when planning upgrades and/or installations to stay within regulatory limits. Different scenarios can be simulated, with different levels of distributed generation, in place as well as expected, so that a large number of options can be assessed (Step a). Once the location, sizing and timing of asset upgrades and/or installations are found using optimisation techniques (Step b), it is possible to assess the adequacy of their daily performance using agent-based modelling (Step c). One distinguishing feature of this software is that it is possible to analyse a whole area at once, while still having a tailored solution for each of the sub-areas. To illustrate this, using the impact that battery and PV installations can have on the two types of networks mentioned above, three design conditions can be identified (amongst others):

· Urban conditions
  o Feeders that have a low take-up of solar generators may benefit from adding solar panels
  o Feeders that need voltage support at specific times may be assisted by installing batteries
· Rural conditions (SWER network)
  o Feeders that need voltage support as well as peak lopping may benefit from both battery and solar panel installations

This small example demonstrates that no single solution can be applied across all three areas, and there is a need to be selective about which one is applied to each branch of the network. This is currently the function of the engineer, who can define various scenarios against a configuration, test them, and iterate towards an appropriate solution. Future work will focus on increasing the level of automation in identifying areas where particular solutions are applicable.
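The PSO half of the platform can be illustrated with a generic global-best particle swarm minimising a toy upgrade-cost function; the cost model, bounds and PSO coefficients here are hypothetical stand-ins for the project's actual planning objective:

```python
import random

def pso(cost, dim, lo, hi, n_particles=20, iters=100, seed=1):
    """Minimal global-best particle swarm optimiser."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]               # each particle's best position so far
    pbest_f = [cost(x) for x in X]
    g = min(range(n_particles), key=pbest_f.__getitem__)
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # swarm's best position so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (w * V[i][d] + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))  # clamp to bounds
            f = cost(X[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = X[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = X[i][:], f
    return gbest, gbest_f

# Hypothetical cost: ideal battery size 3 and PV capacity 5, plus a fixed cost of 2
cost = lambda x: (x[0] - 3.0) ** 2 + (x[1] - 5.0) ** 2 + 2.0
best_x, best_f = pso(cost, dim=2, lo=0.0, hi=10.0)
```

In the platform, a cost function of this shape would encode upgrade and installation costs over the planning horizon, and ABM would then check the daily performance of the configuration the swarm converges to.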

Relevance: 30.00%

Abstract:

The wide applicability of correlation analysis motivated the development of this paper, in which a new correlated modified particle swarm optimization (COM-PSO) is developed. A Correlation Adjustment algorithm is proposed to recover the correlation between the considered variables of all particles at each iteration. It is shown that the best solution, the mean and standard deviation of the solutions over multiple runs, and the convergence speed were improved when the correlation between the variables was increased; however, for some rotated benchmark functions, contrary results were obtained. Moreover, the best solution and the mean and standard deviation of the solutions improve as the number of correlated variables of the benchmark functions increases. The results of simulations and convergence performance are compared with the original PSO. The improvement of the results, the convergence speed, and the ability to simulate correlated phenomena with the proposed COM-PSO are discussed in light of the experimental results.
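The abstract does not specify how the Correlation Adjustment algorithm works, but a common way to impose a target correlation r between two variables is the two-variable Cholesky-style transform y₂ = r·z₁ + √(1−r²)·z₂ on standardised values. The sketch below is one plausible reading of such a step, not the authors' actual algorithm:

```python
import math
import random

def standardise(v):
    """Shift and scale a list to zero mean and unit (population) variance."""
    m = sum(v) / len(v)
    s = math.sqrt(sum((x - m) ** 2 for x in v) / len(v))
    return [(x - m) / s for x in v]

def impose_correlation(x1, x2, r):
    """Return values correlated with x1 at approximately r (for independent
    inputs), via the 2-variable Cholesky transform on standardised values."""
    z1, z2 = standardise(x1), standardise(x2)
    return [r * a + math.sqrt(1.0 - r * r) * b for a, b in zip(z1, z2)]

def pearson(u, v):
    zu, zv = standardise(u), standardise(v)
    return sum(a * b for a, b in zip(zu, zv)) / len(u)

# Check on independent samples: the imposed correlation should be near 0.8
rng = random.Random(3)
x1 = [rng.gauss(0, 1) for _ in range(20000)]
x2 = [rng.gauss(0, 1) for _ in range(20000)]
y2 = impose_correlation(x1, x2, r=0.8)
rho = pearson(x1, y2)
```

In a COM-PSO-style loop, a step like this would be applied to the corresponding coordinate columns of the swarm after each position update, restoring the desired correlation structure between decision variables.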

Relevance: 30.00%

Abstract:

Cluster ions and charged and neutral nanoparticle concentrations were monitored using a neutral cluster and air ion spectrometer (NAIS) over a period of one year in Brisbane, Australia. The study yielded 242 complete days of usable data, of which particle formation events were observed on 101 days. Small, intermediate and large ion concentrations were evaluated in real time. In the diurnal cycle, the small ion concentration was highest during the second half of the night, while large ion concentrations reached a maximum during the day. The small ion concentration decreased when the large ion concentration increased. Particle formation was generally followed by a peak in the intermediate ion concentration, and the rate of increase of intermediate ions was used as the criterion for identifying particle formation events. Such events were followed by a period of growth to larger sizes and usually occurred between 8 am and 2 pm. Particle formation events were found to be related to the wind direction. The gaseous precursors for the production of secondary particles in the urban environment of Brisbane have been shown to be ammonia and sulfuric acid. During these events, the nanoparticle number concentrations in the size range 1.6-42 nm, which were normally lower than 1×10⁴ cm⁻³, often exceeded 5×10⁴ cm⁻³, with occasional values over 1×10⁵ cm⁻³. Cluster ions generally occurred in number concentrations between 300 and 600 cm⁻³, but decreased significantly, to about 200 cm⁻³, during particle formation events; this was accompanied by an increase in the large ion concentration. We calculated the fraction of nanoparticles that were charged and investigated the occurrence of possible overcharging during particle formation events. Overcharging is defined as the condition where the charged fraction of particles is higher than in charge equilibrium. This can occur when cluster ions attach to neutral particles in the atmosphere, giving rise to larger concentrations of charged particles in the short term. Ion-induced nucleation is one of the mechanisms of particle formation in the atmosphere, and overcharging has previously been considered an indicator of this process. The possible role of ions in particle formation was investigated.
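For reference, the equilibrium charged fraction against which overcharging is judged can be estimated from the Boltzmann charge distribution, in which the fraction of particles of diameter d carrying n elementary charges is proportional to exp(−K_E·n²e²/(d·kT)). This approximation is only reasonable for particles larger than roughly 100 nm; for the ultrafine sizes in this study a more detailed model (e.g. Wiedensohler's approximation) would normally be used, so the sketch below is illustrative only:

```python
import math

K_E = 8.9875517873681764e9   # Coulomb constant, N m^2 C^-2
E = 1.602176634e-19          # elementary charge, C
K_B = 1.380649e-23           # Boltzmann constant, J K^-1

def boltzmann_charge_fraction(n, d_m, temp_k=298.0):
    """Fraction of particles of diameter d_m (metres) carrying n elementary
    charges, in the Boltzmann equilibrium approximation."""
    a = K_E * E ** 2 / (d_m * K_B * temp_k)
    return math.sqrt(a / math.pi) * math.exp(-a * n ** 2)

def charged_fraction(d_m, temp_k=298.0):
    """Equilibrium fraction of particles carrying at least one charge."""
    return 1.0 - boltzmann_charge_fraction(0, d_m, temp_k)

f = charged_fraction(100e-9)  # roughly 0.58 for 100 nm particles at 298 K
```

A measured charged fraction well above this equilibrium value during a nucleation event is what the abstract terms overcharging.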

Relevance: 30.00%

Abstract:

The emission of particles in the ultrafine range (<100 nm) from laser printers has not been reported until recently (Uhde et al., 2006; He et al., 2007; Morawska et al., 2009). The research reported to date has provided a body of information about printer emissions and shed light on particle formation mechanisms. However, until now, the effect of fuser roller temperature on particle emissions had not been comprehensively investigated...

Relevance: 30.00%

Abstract:

Analysis of the particle size and number concentration emissions from a fleet of inner-city medium-duty CNG buses was conducted using the newly available Diffusion Size Classifier (DiSC), in comparison with the more traditional scanning mobility particle sizers (SMPSs) and condensation particle counters (CPCs). Studies were conducted at both steady-state and transient driving modes on a vehicle dynamometer utilising a CVS dilution system. Comparative analysis of the results showed that the DiSC provided equivalent information during steady-state conditions and was able to provide additional information during transient conditions, namely the modal diameter of the particle size distribution.
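The modal diameter highlighted above is simply the size at which the number distribution peaks. A minimal sketch over binned data; the bin values below are invented for illustration:

```python
def modal_diameter(diam_nm, dndlogdp):
    """Bin-midpoint diameter (nm) at which dN/dlogDp is largest."""
    return max(zip(diam_nm, dndlogdp), key=lambda pair: pair[1])[0]

# Invented bins for a distribution peaking at 50 nm
mode = modal_diameter([20, 50, 80, 120], [1.0e3, 5.0e3, 2.0e3, 0.5e3])
```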

Relevance: 30.00%

Abstract:

Agriculture is responsible for a significant proportion of total anthropogenic greenhouse gas emissions (perhaps 18 % globally), and therefore has the potential to contribute to efforts to reduce emissions as a means of minimising the risk of dangerous climate change. The largest contributions to emissions are attributed to ruminant methane production and to nitrous oxide from animal waste and fertilised soils. Further, livestock, including ruminants, are an important component of global and Australian food production, and there is a growing demand for animal protein sources. At the same time as governments and the community strengthen objectives to reduce greenhouse gas emissions, there are growing concerns about global food security. This paper provides an overview of a number of options for reducing methane and nitrous oxide emissions from ruminant production systems in Australia while maintaining productivity, thereby contributing to both objectives. Options include strategies for feed modification, animal breeding and herd management, rumen manipulation, and animal waste and fertiliser management. Using currently available strategies, some reductions in emissions can be achieved, but practical, commercially available techniques for significant reductions in methane emissions, particularly from extensive livestock production systems, will require greater time and resource investment. Decreases in the emissions intensity of these ruminant systems (i.e., the amount of emissions per unit of product, such as meat) have already been achieved. However, the technology has not yet been developed for eliminating the production of methane from the rumen of cattle and sheep digesting the cellulose- and lignin-rich grasses that make up a large part of the diet of animals grazing natural pastures, particularly in arid and semi-arid grazing lands. Nevertheless, the abatement that can be achieved will contribute significantly towards reaching greenhouse gas emissions reduction targets, and research will achieve further advances.

Relevance: 30.00%

Abstract:

This study elucidated the shadow price of greenhouse gas (GHG) emissions for 1,024 international companies surveyed across 15 industries in 37 major countries. Our results indicate that the shadow price of GHG at the firm level is much higher than indicated in previous studies; this higher shadow price results from the use of Scope 3 GHG emissions data. The results indicate that a firm would carry a high cost of GHG emissions if Scope 3 GHG emissions were the focus of the discussion of corporate social responsibility. In addition, such shadow prices were determined to differ substantially among countries, among sectors, and within sectors. Although a number of studies have calculated the shadow price of GHG emissions, they have employed country-level or industry-level data, or a small sample of firm-level data in one country. These new data from a worldwide firm-level analysis of the shadow price of GHG emissions can play an important role in developing climate policy and promoting sustainable development.

Relevance: 30.00%

Abstract:

The international shipping sector is a major contributor to global greenhouse gas (GHG) emissions. The International Maritime Organisation (IMO) has adopted some technical and operational measures to reduce GHG emissions from international shipping. However, these measures may not be enough to reduce the amount of GHG emissions from international shipping to an acceptable level. Therefore, the IMO Member States are currently considering a number of proposals for the introduction of market-based measures (MBMs). During the negotiation process, some leading developing countries raised questions about the probable conflict of the proposed MBMs with the rules of the World Trade Organisation (WTO). This article comprehensively examines this issue and argues that none of the MBM proposals currently under consideration by the IMO has any conflict with the WTO rules.

Relevance: 30.00%

Abstract:

Throughout the world, there is increasing pressure on governments, companies, regulators and standard-setters to respond to the global challenge of climate change. The growing number of regulatory requirements for organisations to disclose their greenhouse gas (GHG) emissions and emergent national, regional and international emissions trading schemes (ETSs) reflect key government responses to this challenge. Assurance of GHG emissions disclosures enhances the credibility of these disclosures and any associated trading schemes. The auditing and assurance profession has an important role to play in the provision of such assurance, highlighted by the International Auditing and Assurance Standards Board’s (IAASB) decision to develop an international GHG emissions assurance standard. This article sets out the developments to date on an international standard for the assurance of GHG emissions disclosures. It then provides information on the way Australian companies have responded to the challenge of GHG reporting and assurance. Finally, it outlines the types of assurance that assurance providers in Australia are currently providing in this area.