4 results for Air Pollution, Air Quality Data, Multifractal Analysis, Multifractal Models

in CORA - Cork Open Research Archive - University College Cork - Ireland


Relevance: 100.00%

Abstract:

A detailed series of simulation chamber experiments has been performed on the atmospheric degradation pathways of the primary air pollutant naphthalene and two of its photooxidation products, phthaldialdehyde and 1-nitronaphthalene. The measured yields of secondary organic aerosol (SOA) arising from the photooxidation of naphthalene varied from 6 to 20%, depending on the concentrations of naphthalene and nitrogen oxides as well as relative humidity. A range of carbonyls, nitro-compounds, phenols and carboxylic acids was identified among the gas- and particle-phase products. On-line analysis of the chemical composition of naphthalene SOA was performed using aerosol time-of-flight mass spectrometry (ATOFMS) for the first time. The results indicate that enhanced formation of carboxylic acids may contribute to the observed increase in SOA yields at higher relative humidity. The photolysis of phthaldialdehyde and 1-nitronaphthalene was investigated using natural light at the European Photoreactor (EUPHORE) in Valencia, Spain. The photolysis rate coefficients were measured directly and used to confirm that photolysis is the major atmospheric loss process for these compounds. For phthaldialdehyde, the main gas-phase products were phthalide and phthalic anhydride. SOA yields in the range 2-11% were observed, with phthalic acid and dihydroxyphthalic acid identified among the particle-phase products. The photolysis of 1-nitronaphthalene yielded nitric oxide and a naphthoxy radical, which reacted to form several products. SOA yields in the range 57-71% were observed, with 1,4-naphthoquinone, 1-naphthol and 1,4-naphthalenediol identified in the particle phase. On-line analysis of the SOA generated in an indoor chamber using ATOFMS provided evidence for the formation of high-molecular-weight products. Further investigations revealed that these products are oxygenated polycyclic compounds most likely produced from the dimerization of naphthoxy radicals. The results of this work indicate that naphthalene is a potentially large source of SOA in urban areas and should be included in atmospheric models. The kinetic and mechanistic information could be combined with existing literature data to produce an overall degradation mechanism for naphthalene suitable for inclusion in photochemical models that are used to predict the effect of emissions on air quality.
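For context, the two quantities central to this abstract are conventionally defined as follows; these are standard expressions from the atmospheric chemistry literature, not equations quoted from the thesis itself:

\[
Y_{\mathrm{SOA}} = \frac{\Delta M_{o}}{\Delta[\mathrm{HC}]}, \qquad \tau_{\mathrm{photolysis}} = \frac{1}{J}
\]

where \(Y_{\mathrm{SOA}}\) is the secondary organic aerosol yield, \(\Delta M_{o}\) is the mass concentration of organic aerosol formed, \(\Delta[\mathrm{HC}]\) is the mass concentration of the parent hydrocarbon (here naphthalene) consumed, and \(\tau_{\mathrm{photolysis}}\) is the atmospheric lifetime with respect to photolysis for a measured first-order photolysis rate coefficient \(J\).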

Relevance: 100.00%

Abstract:

It is the aim of this thesis to investigate the use of Health Impact Assessment (HIA) in public policy formulation in Northern Ireland and in the Republic of Ireland, and to examine the influences affecting that use. Four case studies where HIA has been conducted are used for the research analysis: HIAs on traffic and transport in Dublin, Traveller accommodation in Donegal, a draft air quality action plan in Belfast and a social housing regeneration project in Derry. HIA aims to identify the possible intended and unintended consequences that a project, policy or programme will have on the affected population's health. Although it has been acknowledged as a worthwhile tool to inform decision-makers, the extent to which it is used in policy in Ireland is subject to scrutiny. A theoretical framework drawing from institutionalist, impact assessment and knowledge utilisation theories and schools of literature underpins this study. The unit of analysis consists of the HIA steering groups, which are made up of local authority decision-makers, statutory health practitioners and community representatives. The overarching structure and underlying values hypothesised to be present in each HIA case are investigated in this research. Chapters 2 and 3 outline the main literature in the area, which includes theories from the public health and health promotion paradigm, the policy sciences and impact assessment techniques. Chapter 4 describes the methodology of the research, a multiple case study design. This is followed by an analysis of the cases; the thesis concludes with practical recommendations for HIA in Ireland and the theoretical conclusions of the research.

Relevance: 100.00%

Abstract:

Comfort is, in essence, satisfaction with the environment; with respect to the indoor environment it is primarily satisfaction with the thermal conditions and air quality. Improving comfort has social, health and economic benefits, and is more financially significant than any other building cost. Despite this, comfort is not strictly managed throughout the building life-cycle, mainly because there is no appropriate system to adequately manage comfort knowledge through the construction process and into operation. Previous proposals to improve knowledge management have not been successfully adopted by the construction industry. To address this, the BabySteps approach was devised. BabySteps is an approach, proposed by this research, which states that for an innovation to be adopted into the industry it must be implementable through a number of small changes.

This research proposes that improving the management of comfort knowledge will improve comfort. ComMet is a new methodology, proposed by this research, that manages comfort knowledge. It enables comfort knowledge to be captured, stored and accessed throughout the building life-cycle, allowing it to be re-used in future stages of the building project and in future projects. It does this using the following:

Comfort Performances – simplified numerical representations of the comfort of the indoor environment. Comfort Performances quantify the comfort at each stage of the building life-cycle using standard comfort metrics.

Comfort Ratings – a means of classifying the comfort conditions of the indoor environment according to an appropriate standard. Comfort Ratings are generated by comparing different Comfort Performances, and they provide additional information relating to the comfort conditions of the indoor environment which is not readily determined from the individual Comfort Performances.

Comfort History – a continuous descriptive record of the comfort throughout the project, with a focus on documenting the items and activities, proposed and implemented, which could potentially affect comfort. Each aspect of the Comfort History is linked to the relevant comfort entity it references.

These three components create a comprehensive record of the comfort throughout the building life-cycle. They are stored and made available in a common format in a central location, which allows them to be re-used ad infinitum.

The LCMS System was developed to implement the ComMet methodology. It uses current and emerging technologies to capture, store and allow easy access to comfort knowledge as specified by ComMet. LCMS is an IT system that combines the following six components: Building Standards; Modelling & Simulation; Physical Measurement, through the specially developed Egg-Whisk (Wireless Sensor) Network; Data Manipulation; Information Recording; and Knowledge Storage and Access. Results from a test case application of the LCMS system (an existing office room at a research facility) highlighted that while some aspects of comfort were being maintained, the building's environment was not in compliance with the acceptable levels stipulated by the relevant building standards. The implementation of ComMet, through LCMS, demonstrates how comfort, typically only considered during early design, can be measured and managed appropriately through systematic application of the methodology as a means of ensuring a healthy internal environment in the building.
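To illustrate how a Comfort Performance might be mapped to a Comfort Rating, the short sketch below classifies a predicted mean vote (PMV) value against the ISO 7730 comfort categories. The class, function names and example values are illustrative assumptions made here, not the actual ComMet or LCMS implementation.

# Illustrative sketch only: maps a simple Comfort Performance (a PMV value)
# to a Comfort Rating using the ISO 7730 category thresholds.
from dataclasses import dataclass

@dataclass
class ComfortPerformance:
    stage: str   # building life-cycle stage, e.g. "design" or "operation"
    pmv: float   # predicted mean vote for the indoor environment

def comfort_rating(performance: ComfortPerformance) -> str:
    """Classify a Comfort Performance against the ISO 7730 categories A-C."""
    pmv = abs(performance.pmv)
    if pmv < 0.2:
        return "Category A"
    if pmv < 0.5:
        return "Category B"
    if pmv < 0.7:
        return "Category C"
    return "Outside ISO 7730 comfort categories"

# Comparing performances from different life-cycle stages gives the kind of
# additional information a Comfort Rating is intended to capture.
design = ComfortPerformance(stage="design", pmv=0.15)
operation = ComfortPerformance(stage="operation", pmv=0.62)
print(comfort_rating(design))     # Category A
print(comfort_rating(operation))  # Category C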

Relevance: 100.00%

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner.

Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the systems described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is whether a generic solution for the monitoring and analysis of data can be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner.

The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions.

To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
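As a minimal sketch of the production/interpretation/consumption workflow with a maintainable provenance record described above, the following Python fragment applies two different analysis techniques to the same raw data while logging how each result was derived. The class and function names are hypothetical illustrations, not the API of the platform developed in the dissertation.

import hashlib
import json
import time

class ProvenanceLog:
    """A maintainable record of how each derived result was produced."""

    def __init__(self):
        self.entries = []

    def record(self, step, technique, data):
        # Fingerprint the data so an independent third party can check
        # that the same inputs lead to the same derived conclusions.
        digest = hashlib.sha256(
            json.dumps(data, sort_keys=True, default=str).encode()
        ).hexdigest()
        self.entries.append({
            "step": step,            # production, interpretation or consumption
            "technique": technique,  # analysis technique applied at this step
            "sha256": digest,
            "timestamp": time.time(),
        })

def analyse(raw_samples, technique, transform, provenance):
    """Apply one analysis technique to the raw data, recording provenance."""
    provenance.record("production", "raw-capture", raw_samples)
    result = transform(raw_samples)
    provenance.record("interpretation", technique, result)
    return result

# Two different techniques applied to the same raw stream; neither imposes
# its constraints on the other, and the log records how each result arose.
raw = [0.8, 1.2, 0.7, 3.9, 0.9]
log = ProvenanceLog()
mean_value = analyse(raw, "running-mean", lambda xs: sum(xs) / len(xs), log)
peak_value = analyse(raw, "peak-detect", max, log)
print(mean_value, peak_value, len(log.entries))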