929 results for Source analysis


Relevance: 30.00%

Abstract:

In this work, new tools in atmospheric pollutant sampling and analysis were applied in order to deepen source apportionment studies. The project focused on atmospheric emission sources in a suburban area influenced by a municipal solid waste incinerator (MSWI), a medium-sized coastal tourist town and a motorway. Two main research lines were followed. In the first line, the potential of PM samplers coupled with a wind-select sensor was assessed. Results showed that they can be a valid support in source apportionment studies, although meteorological and territorial conditions may strongly affect the results. Moreover, new markers were investigated, with particular focus on biomass burning processes. OC proved to be a good indicator of biomass combustion, as did all of the organic compounds determined. Among metals, lead and aluminium were well correlated with biomass combustion. Surprisingly, PM was not enriched in potassium during the bonfire event. The second research line consisted of the application of Positive Matrix Factorization (PMF), a statistical tool for data analysis. The technique was applied to datasets with different time resolutions. PMF applied to atmospheric deposition fluxes identified six main sources affecting the area; the incinerator's relative contribution appeared negligible. PMF was then applied to PM2.5 collected with samplers coupled with a wind-select sensor. The larger number of environmental indicators determined allowed more detailed results to be obtained on the sources affecting the area; vehicular traffic proved to be the source of greatest concern for the study area, and again the incinerator's relative contribution appeared negligible. Finally, applying PMF to hourly aerosol data demonstrated that the higher the temporal resolution of the data, the closer the source profiles were to the real ones.
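PMF factorizes the receptor data matrix X (samples × species) into non-negative source contributions G and source profiles F. As an illustrative stand-in, the sketch below runs plain non-negative matrix factorization with Lee–Seung multiplicative updates on an invented two-source dataset; real PMF additionally weights each residual by its measurement uncertainty, which this toy version omits.

```python
import random

def matmul(A, B):
    """Multiply two matrices stored as lists of lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def nmf(X, k, iters=2000, eps=1e-9):
    """Lee-Seung multiplicative updates: X (n x m) ~ G (n x k) . F (k x m),
    all entries kept non-negative -- the core idea behind PMF (real PMF
    additionally weights residuals by measurement uncertainty)."""
    random.seed(0)
    n, m = len(X), len(X[0])
    G = [[random.random() for _ in range(k)] for _ in range(n)]
    F = [[random.random() for _ in range(m)] for _ in range(k)]
    for _ in range(iters):
        GF = matmul(G, F)
        Gt = transpose(G)
        num, den = matmul(Gt, X), matmul(Gt, GF)
        F = [[F[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)] for i in range(k)]
        GF = matmul(G, F)
        Ft = transpose(F)
        num, den = matmul(X, Ft), matmul(GF, Ft)
        G = [[G[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)] for i in range(n)]
    return G, F

# Invented 4-sample x 3-species dataset mixing two "sources"
X = [[2.0, 1.0, 0.1], [4.0, 2.0, 0.2], [0.2, 1.0, 2.0], [0.4, 2.0, 4.0]]
G, F = nmf(X, k=2)
R = matmul(G, F)
print(max(abs(R[i][j] - X[i][j]) for i in range(4) for j in range(3)))
```

The reconstruction error shrinks toward zero because the toy matrix is exactly rank 2; the rows of F then play the role of the recovered source profiles.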

Relevance: 30.00%

Abstract:

The objective of this thesis is the analysis of power transients in experimental devices placed within the reflector of the Jules Horowitz Reactor (JHR). The JHR material testing facility is designed to achieve 100 MW core thermal power, and its large reflector hosts fissile material samples that are irradiated up to a total relevant power of 3 MW. MADISON devices are expected to attain 130 kW, while the nominal power of ADELINE is about 60 kW. In addition, MOLFI test samples are envisaged to reach 360 kW in the LEU configuration and up to 650 kW in the HEU configuration. Safety issues concern shutdown transients and require particular verification of the thermal power decrease of these fissile samples with respect to core kinetics, as far as the determination of single-device reactivity is concerned. A calculation model is conceived and applied to properly account for the different nuclear heating processes and the related time-dependent features of device transients. An innovative methodology is developed in which the flux-shape modification during control-rod insertions is investigated with regard to its impact on device power through core-reflector coupling coefficients; previous methods, which considered only nominal core-reflector parameters, are thereby improved. Moreover, the effect of delayed emissions is evaluated with respect to the spatial impact on the devices of a diffuse in-core delayed neutron source. Delayed gamma transport, related to fission product concentrations, is taken into account through evolution calculations of different fuel compositions over an equilibrium cycle. With accurate control of device reactivity ensured, power transients are then computed for every sample according to the envisaged shutdown procedures. The results obtained in this study are aimed at design feedback and reactor management optimization by the JHR project team; moreover, the Safety Report is intended to use the present analysis for improved device characterization.
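The shutdown behaviour discussed above can be illustrated with one-delayed-group point kinetics: after a negative reactivity step the power undergoes a prompt drop and then decays with the delayed-neutron precursors. A rough Euler-integration sketch with invented, generic parameters (not JHR values):

```python
import math

def shutdown_transient(rho, beta=0.0065, Lambda=5e-5, lam=0.08, dt=1e-4, t_end=5.0):
    """One-delayed-group point kinetics after a reactivity step rho:
        dP/dt = (rho - beta)/Lambda * P + lam * C
        dC/dt = beta/Lambda * P - lam * C
    Returns relative power P(t_end)/P0. Parameter values are illustrative only."""
    P = 1.0
    C = beta * P / (Lambda * lam)   # precursor concentration at equilibrium
    t = 0.0
    while t < t_end:
        dP = ((rho - beta) / Lambda * P + lam * C) * dt
        dC = (beta / Lambda * P - lam * C) * dt
        P += dP
        C += dC
        t += dt
    return P

# Invented -5000 pcm scram: prompt drop to ~beta/(beta-rho), then slow decay
print(round(shutdown_transient(rho=-0.05), 4))
```

The prompt-jump estimate beta/(beta-rho) ≈ 0.115 bounds the power right after the drop; the integrated value at 5 s lies below it because the precursor population is already decaying.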

Relevance: 30.00%

Abstract:

This study focuses on radio-frequency inductively coupled thermal plasma (ICP) synthesis of nanoparticles, combining experimental and modelling approaches towards process optimization and industrial scale-up, in the framework of the FP7-NMP SIMBA European project (Scaling-up of ICP technology for continuous production of Metallic nanopowders for Battery Applications). First, the state of the art of nanoparticle production through conventional and plasma routes is summarized. Results are then presented for the characterization of the plasma source and the investigation of the nanoparticle synthesis phenomenon, aiming to highlight fundamental process parameters while adopting a design-oriented modelling approach. In particular, an energy balance of the torch and of the reaction chamber, based on a calorimetric method, is presented, and results of three- and two-dimensional modelling of an ICP system are compared with calorimetric and enthalpy-probe measurements to validate the temperature field predicted by the model, which is used to characterize the ICP system under powder-free conditions. Moreover, results from the modelling of critical phases of the ICP synthesis process, such as precursor evaporation, vapour conversion into nanoparticles and nanoparticle growth, are presented, with the aim of providing useful insights both for the design and optimization of the process and into the underlying physical phenomena. Precursor evaporation, one of the phases with the highest impact on the industrial feasibility of the process, is discussed in detail: by employing models describing particle trajectories and thermal histories, adapted from models originally developed for other plasma technologies and applications such as DC non-transferred arc torches and powder spheroidization, the evaporation of a micro-sized solid Si precursor in a laboratory-scale ICP system is investigated.
Finally, the role of the thermo-fluid-dynamic fields in nanoparticle formation is discussed, together with a study of the effect of the reaction-chamber geometry on the characteristics of the produced nanoparticles and on process yield.
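The calorimetric energy balance mentioned above reduces to measuring how much of the input power ends up in the cooling water: P_water = m_dot * cp * dT. A minimal sketch with invented numbers:

```python
def coupling_efficiency(p_input_w, m_dot_kg_s, dT_k, cp=4186.0):
    """Fraction of the input power recovered in the cooling water, as in a
    calorimetric energy balance of an ICP torch.
    cp: specific heat of water [J/(kg K)]."""
    p_water = m_dot_kg_s * cp * dT_k
    return p_water / p_input_w

# Invented example: 50 kW input, 0.5 kg/s cooling water heated by 10 K
eta = coupling_efficiency(50e3, 0.5, 10.0)
print(f"{eta:.2%}")   # → 41.86%
```

The complement of this fraction is the power actually delivered to the plasma and the reaction chamber, which is why the balance is a basic characterization step before any powder is injected.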

Relevance: 30.00%

Abstract:

Supernovae are among the most energetic events occurring in the universe and are so far the only verified extrasolar source of neutrinos. As the explosion mechanism is still not well understood, recording a burst of neutrinos from such a stellar explosion would be an important benchmark for particle physics as well as for core collapse models. The neutrino telescope IceCube is located at the geographic South Pole and monitors the Antarctic glacier for Cherenkov photons. Even though it was conceived for the detection of high-energy neutrinos, it is capable of identifying a burst of low-energy neutrinos ejected from a supernova in the Milky Way by exploiting the low photomultiplier noise in the Antarctic ice and extracting a collective rate increase. A signal Monte Carlo specifically developed for water Cherenkov telescopes is presented. With its help, we investigate how well IceCube can distinguish between core collapse models and oscillation scenarios. In the second part, nine years of data taken with the IceCube precursor AMANDA are analyzed. Intensive data cleaning methods are presented along with a background simulation. From the result, an upper limit on the expected occurrence of supernovae within the Milky Way is determined.
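The collective rate increase can be quantified with a simple counting estimate: summing N independent modules with Poisson noise, a uniform per-module excess stands out at roughly signal/sqrt(background) sigma. The numbers below are invented for illustration, not actual IceCube parameters:

```python
import math

def burst_significance(n_doms, noise_hz, bin_s, excess_hz_per_dom):
    """Significance (in sigma) of a collective rate increase summed over all
    optical modules, assuming independent Poisson noise in each module."""
    expected = n_doms * noise_hz * bin_s          # summed background counts
    signal = n_doms * excess_hz_per_dom * bin_s   # summed excess counts
    return signal / math.sqrt(expected)

# Invented numbers: 4800 modules, 500 Hz noise each, 0.5 s bin, 10 Hz excess per module
print(round(burst_significance(4800, 500.0, 0.5, 10.0), 1))   # → 21.9
```

The sqrt(N) gain from summing many quiet modules is what lets a high-energy detector see MeV-scale supernova neutrinos at all: no single module sees a significant excess, but the array does.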

Relevance: 30.00%

Abstract:

The discovery of the Cosmic Microwave Background (CMB) radiation in 1965 is one of the fundamental milestones supporting the Big Bang theory, and the CMB is one of the most important sources of information in cosmology. The excellent accuracy of the recent CMB data from the WMAP and Planck satellites confirmed the validity of the standard cosmological model and set a new challenge for data analysis processes and their interpretation. In this thesis we deal with several aspects and useful tools of the data analysis, focusing on their optimization in order to fully exploit the Planck data and contribute to the final published results. The issues investigated are: the change of coordinates of CMB maps using the HEALPix package; the problem of the aliasing effect in the generation of low-resolution maps; and the comparison of the Angular Power Spectrum (APS) extraction performance of the optimal QML method, implemented in the code BolPol, with that of the pseudo-Cl method, implemented in Cromaster. The QML method has then been applied to the Planck data at large angular scales to extract the CMB APS. The same method has also been applied to analyze the TT parity and Low Variance anomalies in the Planck maps, showing a consistent deviation from the standard cosmological model; possible origins of these results are discussed. The Cromaster code has instead been applied to the 408 MHz and 1.42 GHz surveys, focusing on the analysis of the APS of selected regions of the synchrotron emission. The new generation of CMB experiments will be dedicated to polarization measurements, which require high-accuracy devices for separating the polarizations. Here a new technology, Photonic Crystals, is exploited to develop a new polarization splitter device, and its performance is compared with that of the devices used nowadays.
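For a full-sky, noise-free map the APS estimator is simply C_l = (1/(2l+1)) * sum_m |a_lm|^2. The toy sketch below draws real Gaussian a_lm (which reproduces the same chi-square statistics as the properly complex coefficients) and recovers an invented input spectrum; real pipelines such as pseudo-Cl and QML must additionally handle sky masks and noise, which is the whole point of comparing them.

```python
import random

def estimate_cl(cl_true, ell, seed=0):
    """Draw the 2*ell+1 Gaussian a_lm of an idealized full-sky map and apply
    the naive estimator C_l = sum_m |a_lm|^2 / (2*ell+1)."""
    rng = random.Random(seed)
    sigma = cl_true ** 0.5
    alm = [rng.gauss(0.0, sigma) for _ in range(2 * ell + 1)]
    return sum(a * a for a in alm) / (2 * ell + 1)

# Cosmic variance 2/(2*ell+1) shrinks with ell, so at ell=500 the
# single-realization estimate is already close to the input value
print(round(estimate_cl(1.0, ell=500), 3))
```

The irreducible scatter of this estimator at low multipoles is exactly why the large angular scales discussed in the abstract need the optimal QML treatment.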

Relevance: 30.00%

Abstract:

Group B Streptococcus (GBS), in its transition from commensal to pathogen, encounters diverse host environments and must therefore coordinately control its transcriptional responses to these changes. This work aimed at better understanding the role of two-component signal transduction systems (TCS) in GBS pathophysiology through a systematic screening procedure. We first performed a complete inventory and sensory-mechanism classification of all putative GBS TCS by genomic analysis. Five TCS were further investigated by the generation of knock-out strains, and in vitro transcriptome analysis identified genes regulated by these systems, ranging from 0.1% to 3% of the genome. Interestingly, two sugar phosphotransferase systems appeared differently regulated in the knock-out mutant of TCS-16, suggesting an involvement in monitoring carbon source availability. High-throughput analysis of bacterial growth on different carbon sources showed that TCS-16 was necessary for growth of GBS on fructose-6-phosphate. Additional transcriptional analysis provided further evidence for a stimulus-response circuit in which extracellular fructose-6-phosphate leads to autoinduction of TCS-16, with concomitant dramatic up-regulation of the adjacent operon encoding a phosphotransferase system. The TCS-16-deficient strain exhibited decreased persistence in a model of vaginal colonization and impaired growth/survival in the presence of vaginal mucoid components. All mutant strains were also characterized in a murine model of systemic infection, and inactivation of TCS-17 (also known as RgfAC) resulted in hypervirulence. Our data suggest a role for the previously uncharacterized TCS-16, here named FspSR, in bacterial fitness and carbon metabolism during host colonization, and also provide experimental evidence for the involvement of TCS-17/RgfAC in virulence.
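Transcriptome comparisons like the one above are commonly summarized as log2 fold changes between mutant and wild type. A minimal sketch with invented normalized counts (the gene and values are hypothetical, not taken from the study):

```python
import math

def log2_fold_change(expr_mutant, expr_wildtype, pseudo=1.0):
    """log2 ratio of normalized expression values between two conditions;
    a pseudocount avoids division by zero for unexpressed genes."""
    return math.log2((expr_mutant + pseudo) / (expr_wildtype + pseudo))

# Invented normalized counts for a hypothetical PTS-operon gene,
# strongly up-regulated in the mutant
print(round(log2_fold_change(1599.0, 99.0), 1))   # → 4.0
```

A cutoff such as |log2 fold change| > 1 (a two-fold change) is a typical first filter when compiling the 0.1-3% regulon lists mentioned above.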

Relevance: 30.00%

Abstract:

Folates (vitamin B9) are essential water-soluble vitamins whose deficiency in humans may contribute to the onset of several diseases, such as anaemia, cancer, cardiovascular diseases and neurological problems, as well as defects in embryonic development. Humans and other mammals are unable to synthesize folate de novo and obtain it from exogenous sources via intestinal absorption. Recently the gut microbiota has been identified as an important source of folates, and the selection and use of folate-producing microorganisms represents an innovative strategy to increase human folate levels. The aim of this thesis was to gain a fundamental understanding of folate metabolism in Bifidobacterium adolescentis. The work was subdivided into three main phases, each also aimed at solving problems encountered when working with Bifidobacterium strains. First, a new identification method, based on PCR-RFLP of the hsp60 gene, was specifically developed to identify Bifidobacterium strains. Secondly, the biodiversity of Bifidobacterium adolescentis was explored in order to select representative strains of this species to be screened for their folate production ability. Results showed that this species is characterized by wide variability and support the idea that a new taxonomic reorganization may be required. Finally, B. adolescentis folate metabolism was studied using a twofold approach: a quantitative analysis of folate content was complemented by the examination of expression levels of genes involved in folate-related pathways. For the normalization step required to increase the robustness of the qRT-PCR analysis, an appropriate set of reference genes was tested using two different algorithms. The results demonstrate that B. adolescentis strains may represent an endogenous source of natural folate and could be used to fortify fermented dairy products.
This bio-fortification strategy presents many advantages for the consumer, providing native folate forms that are more bioavailable and not implicated in the ongoing controversy concerning the safety of high intakes of synthetic folic acid.
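The qRT-PCR normalization step can be illustrated with the widely used Livak 2^-ddCt method, in which the target gene is first normalized to a reference gene and then compared between conditions. The Ct values below are invented for illustration:

```python
def ddct_fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    """Livak 2^-ddCt relative expression: target gene normalized to a
    reference gene, then compared between sample and control conditions."""
    dct_sample = ct_target_sample - ct_ref_sample
    dct_control = ct_target_control - ct_ref_control
    return 2.0 ** (-(dct_sample - dct_control))

# Invented Ct values for a folate-pathway target vs. a reference gene
print(ddct_fold_change(22.0, 18.0, 25.0, 18.0))   # → 8.0
```

The method assumes near-100% amplification efficiency and a stably expressed reference, which is exactly why the thesis validates its reference-gene set with two independent algorithms.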

Relevance: 30.00%

Abstract:

Over time, Twitter has become a fundamental source of news. As a step forward, researchers have tried to analyse whether tweets contain predictive power. In the financial field, much past research has been devoted to proposing functions that take as input all the tweets for a particular stock or index s, analyse them, and predict the stock or index price of s. In this work we take an alternative approach: using stock price and tweet information, we investigate the following questions.
1. Is there any relation between the number of tweets being generated and the volume of stock being traded?
2. Is there any relation between the sentiment of the tweets and stock prices?
3. What is the structure of the graph that describes the relationships between users?
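Questions 1 and 2 reduce to correlating a tweet-derived series with a market series, for which the Pearson coefficient is the standard first tool. A pure-Python sketch on invented daily data:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented daily series: tweet counts about a stock vs. traded volume (millions)
tweets = [120, 95, 180, 210, 160]
volume = [1.1, 0.9, 1.6, 2.0, 1.4]
print(round(pearson(tweets, volume), 3))
```

Correlation alone of course says nothing about direction of causality or lead-lag structure; establishing those would require lagged versions of the same comparison.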

Relevance: 30.00%

Abstract:

Addressing current limitations of state-of-the-art instrumentation in aerosol research, the aim of this work was to explore and assess the applicability of a novel soft ionization technique, flowing atmospheric-pressure afterglow (FAPA), for the mass spectrometric analysis of airborne particulate organic matter. Among other soft ionization methods, the FAPA technique was developed in the last decade during the advent of ambient desorption/ionization mass spectrometry (ADI–MS). Based on a helium glow discharge plasma at atmospheric pressure, excited helium species and primary reagent ions are generated which exit the discharge region through a capillary electrode, forming the so-called afterglow region where desorption and ionization of the analytes occur. Fragmentation of the analytes during ionization is commonly reported to occur only to a minimal extent, predominantly resulting in the formation of quasimolecular ions, i.e. [M+H]+ and [M–H]– in the positive and negative ion mode, respectively. This facilitates the identification and detection of signals and their corresponding compounds in the acquired mass spectra. The first part of this study focuses on the application, characterization and assessment of FAPA–MS in the offline mode, i.e. desorption and ionization of the analytes from surfaces. Experiments in both positive and negative ion mode revealed ionization patterns for a variety of compound classes comprising alkanes, alcohols, aldehydes, ketones, carboxylic acids, organic peroxides and alkaloids. Besides the frequently emphasized detection of quasimolecular ions, a broad range of signals for adducts and losses was found. Additionally, the capabilities and limitations of the technique were studied in three proof-of-principle applications. In general, the method proved best suited for polar analytes with high volatilities and low molecular weights, ideally containing nitrogen and/or oxygen functionalities.
However, for compounds with low vapor pressures, long carbon chains and/or high molecular weights, desorption and ionization are in direct competition with oxidation of the analytes, leading to the formation of adducts and oxidation products which impede a clear signal assignment in the acquired mass spectra. Nonetheless, FAPA–MS proved capable of detecting and identifying common limonene oxidation products in secondary OA (SOA) particles on a filter sample and is thus considered a suitable method for offline analysis of OA particles. In the second and subsequent parts, FAPA–MS was applied online, i.e. for real-time analysis of OA particles suspended in air; hence, the acronym AeroFAPA–MS (Aerosol FAPA–MS) is used to refer to this method. After optimization and characterization, the method was used to measure a range of model compounds and to evaluate typical ionization patterns in the positive and negative ion mode. In addition, results from laboratory studies as well as from a field campaign in Central Europe (F–BEACh 2014) are presented and discussed. During the F–BEACh campaign AeroFAPA–MS was used in combination with complementary MS techniques, giving a comprehensive characterization of the sampled OA particles. For example, several common SOA marker compounds were identified in real time by MSn experiments, indicating that photochemically aged SOA particles were present during the campaign period. Moreover, AeroFAPA–MS was capable of detecting highly oxidized sulfur-containing compounds in the particle phase, representing the first real-time measurements of this compound class. Further comparisons with data from other aerosol and gas-phase measurements suggest that both particulate sulfate and highly oxidized peroxy radicals in the gas phase might play a role in the formation of these species.
Besides applying AeroFAPA–MS to the analysis of aerosol particles, desorption processes of particles in the afterglow region were investigated in order to gain a more detailed understanding of the method. While in the previous measurements aerosol particles were pre-evaporated prior to AeroFAPA–MS analysis, in this part no external heat source was applied. Particle size distribution measurements before and after the AeroFAPA source revealed that only an interfacial layer of OA particles is desorbed and, thus, chemically characterized. For particles with initial diameters of 112 nm, desorption radii of 2.5–36.6 nm were found at discharge currents of 15–55 mA. In addition, the method was applied to laboratory-generated core-shell particles in a proof-of-principle study. As expected, predominantly compounds residing in the shell of the particles were desorbed and ionized at increasing probing depths, suggesting that AeroFAPA–MS might represent a promising technique for depth profiling of OA particles in future studies.
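The quasimolecular ions central to FAPA-MS have predictable m/z values: [M+H]+ adds, and [M-H]- subtracts, one proton mass from the monoisotopic mass M. A sketch using pinic acid, a common monoterpene-SOA marker; its use here is purely illustrative, not taken from the campaign data:

```python
# Monoisotopic masses (u) of common elements and the proton
MASS = {"C": 12.0, "H": 1.007825, "N": 14.003074, "O": 15.994915}
PROTON = 1.007276

def monoisotopic(formula):
    """Sum monoisotopic masses for a composition dict, e.g. {"C": 9, "H": 14, "O": 4}."""
    return sum(MASS[el] * n for el, n in formula.items())

def quasimolecular_mz(formula, mode="+"):
    """m/z of the singly charged [M+H]+ or [M-H]- quasimolecular ion."""
    m = monoisotopic(formula)
    return m + PROTON if mode == "+" else m - PROTON

# Pinic acid (C9H14O4), a common monoterpene-SOA marker compound
mz = quasimolecular_mz({"C": 9, "H": 14, "O": 4}, "+")
print(round(mz, 4))   # → 187.0965
```

Matching measured peaks against such calculated exact masses (within the instrument's accuracy) is how the SOA marker compounds mentioned above are assigned in the spectra.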

Relevance: 30.00%

Abstract:

For crime scene investigation in cases of homicide, the pattern of bloodstains at the incident site is of critical importance. The morphology of the bloodstain pattern serves to determine the approximate blood source locations, the minimum number of blows and the positioning of the victim. In the present work, the benefits of three-dimensional bloodstain pattern analysis, including the ballistic approximation of the trajectories of the blood drops, are demonstrated using two illustrative cases. The crime scenes were documented in 3D using the non-contact methods of digital photogrammetry, tachymetry and laser scanning. Accurate, true-to-scale 3D models of the crime scenes, including the bloodstain patterns and the traces, were created. For the determination of the areas of origin of the bloodstain pattern, the trajectories of up to 200 well-defined bloodstains were analysed in CAD and photogrammetry software; the ballistic determination of the trajectories was performed using ballistics software. The advantages of this method are the short preparation time on site, the non-contact measurement of the bloodstains and the high accuracy of the bloodstain analysis. This method can be expected to deliver accurate results regarding the number and position of the areas of origin of bloodstains; in particular, the vertical component is determined more precisely than with conventional methods. In both cases, relevant forensic conclusions regarding the course of events were enabled by the ballistic bloodstain pattern analysis.
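The geometric core of bloodstain pattern analysis is the impact-angle relation alpha = arcsin(width/length) for an elliptical stain; combined with the distance to the area of convergence, the traditional straight-line (tangent) method estimates the height of origin, which the ballistic treatment described above corrects downward by accounting for gravity and drag. A sketch with invented stain measurements:

```python
import math

def impact_angle_deg(width_mm, length_mm):
    """alpha = arcsin(w/l) for an elliptical bloodstain."""
    return math.degrees(math.asin(width_mm / length_mm))

def straight_line_height(distance_m, width_mm, length_mm):
    """Tangent method: height of origin above the stain plane assuming a
    straight flight path (ballistic models with gravity and drag give
    lower, more realistic values, especially for the vertical component)."""
    alpha = math.radians(impact_angle_deg(width_mm, length_mm))
    return distance_m * math.tan(alpha)

# Invented stain: 6 mm wide, 12 mm long, 1.5 m from the convergence point
print(round(impact_angle_deg(6.0, 12.0), 1))           # → 30.0
print(round(straight_line_height(1.5, 6.0, 12.0), 2))  # → 0.87
```

Because real drop trajectories curve downward, the straight-line estimate systematically overestimates the height; this overestimate is precisely what the ballistic approximation in the study removes.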

Relevance: 30.00%

Abstract:

Commonality of activation of spontaneously forming and stimulus-induced mental representations is an often-made but rarely tested assumption in neuroscience. In a conjunction analysis of two earlier studies, brain electric activity during visual-concrete and abstract thoughts was studied. The conditions were: in study 1, spontaneous stimulus-independent thinking (visual imagery or abstract thought was identified post hoc); in study 2, reading of single nouns ranking high or low on a visual imagery scale. In both studies, the subjects' tasks were similar: when prompted, they had to recall the last thought (study 1) or the last word (study 2). In both studies, subjects had no instruction to classify or visually imagine their thoughts, and accordingly were not aware of the studies' aim. Brain electric data were analyzed into functional topographic brain images (using LORETA) of the last microstate before the prompt (study 1) and of the word-type-discriminating event-related microstate after word onset (study 2). Conjunction analysis across the two studies yielded commonality of activation of core networks for abstract thought content in left anterior superior regions, and for visual-concrete thought content in right temporal posterior inferior regions. The results suggest that two different core networks are automatically activated when abstract or visual-concrete information, respectively, enters working memory, without any subject task or instruction about the two classes of information, and regardless of internal or external origin and of input modality. These core machineries of working memory are thus invariant to the source or modality of input when treating the two types of information.

Relevance: 30.00%

Abstract:

Background

Since late 2003, highly pathogenic influenza A H5N1 has initiated several outbreak waves that have swept across Eurasia and Africa. Preparing for reassortment or mutation of H5N1 viruses has become a global priority. Although the spreading mechanism of H5N1 has been studied from different perspectives, the questions of its main transmission agents and spread routes remain unsolved.

Methodology/Principal Findings

Based on a compilation of the times and locations of global H5N1 outbreaks from November 2003 to December 2006, we report an interdisciplinary effort that combines a geospatial informatics approach with a bioinformatics approach to form an improved understanding of the transmission mechanisms of the H5N1 virus. Through a spherical-coordinate-based analysis, not conventionally done in geographical analyses, we reveal clear spatial and temporal clusters of global H5N1 cases on different scales, which we consider to be associated with two different transmission modes of H5N1 viruses. Then, through a combined geographic and phylogenetic analysis, we obtain an H5N1 spreading route map. Our results provide insight into competing hypotheses as to which avian hosts are responsible for the spread of H5N1.

Conclusions/Significance

We found that although South China and Southeast Asia may be the virus pool of avian flu, East Siberia may be the source of the H5N1 epidemic. The concentration of migratory birds from different places increases the possibility of gene mutation. Special attention should be paid to East Siberia, Middle Siberia and South China for improved surveillance of H5N1 viruses and monitoring of migratory birds.
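Spherical-coordinate clustering of outbreak records needs great-circle rather than planar distances between locations. A minimal haversine sketch in Python; the coordinates below are approximate and chosen only for illustration:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r_earth=6371.0):
    """Great-circle distance between two (lat, lon) points given in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r_earth * math.asin(math.sqrt(a))

# Approximate coordinates: Qinghai Lake (China) to Lake Baikal (East Siberia)
d = haversine_km(36.9, 100.2, 53.5, 108.2)
print(round(d))
```

Distances computed this way, combined with outbreak dates, are the raw material for the spatio-temporal clusters described above; planar (projected) distances would distort exactly the long continental-scale links that matter for migratory-bird hypotheses.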

Relevance: 30.00%

Abstract:

To date, no research has rigorously addressed the concern that local and regional procurement (LRP) of food aid could affect food prices and food price volatility in food aid source and recipient countries. We assemble spatially and temporally disaggregated data and estimate the relationship between food prices and their volatility and local food aid procurement and distribution across seven countries for several commodities. In most cases, LRP activities have no statistically significant relationship with either local price levels or food price volatility. The few exceptions underscore the importance of market monitoring.
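Food price volatility in studies like this one is commonly measured as the standard deviation of log price changes. A sketch on invented monthly price series:

```python
import math

def log_return_volatility(prices):
    """Sample standard deviation of period-to-period log price changes --
    a common working definition of price volatility."""
    rets = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    mean = sum(rets) / len(rets)
    var = sum((r - mean) ** 2 for r in rets) / (len(rets) - 1)
    return math.sqrt(var)

# Invented monthly maize prices (local currency per kg): a calm and a spiky market
calm = [10.0, 10.2, 10.1, 10.3, 10.2, 10.4]
spiky = [10.0, 12.0, 9.0, 13.0, 8.5, 12.5]
print(round(log_return_volatility(calm), 4), round(log_return_volatility(spiky), 4))
```

Estimating the relationship in the paper then amounts to regressing such market-level volatility measures on indicators of LRP purchase and distribution activity.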

Relevance: 30.00%

Abstract:

Urban agriculture is a phenomenon that can be observed world-wide, particularly in cities of developing countries. It contributes significantly to food security and food safety and has sustained the livelihoods of low-income urban and peri-urban dwellers in developing countries for many years. Population increase due to rural-urban migration and natural growth, as well as formal and informal urbanisation, is competing with urban farming for available space and scarce water resources. A multitemporal and multisensoral urban change analysis over a period of 25 years (1982-2007) was performed in order to measure and visualise the urban expansion along the Kizinga and Mzinga valleys in the south of Dar es Salaam. Aerial photographs and VHR satellite data were analysed using a combination of anisotropic textural measures and spectral information. The study revealed that the unplanned built-up area is expanding continuously, while vegetation cover and agricultural land decline at a fast rate. The validation showed that the overall classification accuracy varied depending on the database. The extracted built-up areas were used for visual interpretation and mapping purposes and served as an information source for another research project. The maps visualise urban congestion and an expansion covering nearly 18% of the total analysed area in the Kizinga valley between 1982 and 2007. The same development can be observed in the less developed and more remote Mzinga valley between 1981 and 2002. Both areas underwent fast changes in which land prices still tend to rise and a continuous influx of people from both rural and urban areas increases the density, with the consequence of increasing competition among multiple land-use interests.
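Textural measures exploit the fact that built-up areas are spatially heterogeneous while vegetation and bare fields are smooth. As a much-simplified, isotropic stand-in for the anisotropic measures used in the study, the sketch below computes local variance in a moving window on an invented grey-level patch:

```python
def window_variance(img, i, j, half=1):
    """Population variance of pixel values in a (2*half+1)^2 window centred
    on (i, j): a crude texture measure separating rough built-up texture
    from smooth vegetation. img is a list of equal-length rows."""
    vals = [img[a][b]
            for a in range(i - half, i + half + 1)
            for b in range(j - half, j + half + 1)]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

# Invented 5x5 grey-level patch: smooth "vegetation" left, rough "built-up" right
img = [
    [40, 41, 40, 200, 90],
    [41, 40, 41, 60, 210],
    [40, 41, 40, 190, 70],
    [41, 40, 41, 80, 220],
    [40, 41, 40, 210, 60],
]
print(window_variance(img, 2, 1), round(window_variance(img, 2, 3), 1))
```

Real change analyses compute several such measures per pixel (including direction-dependent ones) and combine them with the spectral bands before classification, which is what the abstract refers to.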

Relevance: 30.00%

Abstract:

Dual carbon isotope analysis of marine aerosol samples has been performed for the first time, demonstrating a potential for apportioning organic matter between three principal sources (marine, terrestrial non-fossil, and fossil fuel) thanks to their unique isotopic signatures. The results presented here, utilizing combinations of dual carbon isotope analysis, provide conclusive evidence of a dominant biogenic organic fraction in organic aerosol over biologically active oceans. In particular, the NE Atlantic, which is also subject to notable anthropogenic influences via pollution transport processes, was found to contain 80% organic aerosol matter of biogenic origin, directly linked to plankton emissions. The remaining carbonaceous aerosol was of terrestrial origin. By contrast, for polluted air advected out from Europe into the NE Atlantic, the source apportionment is 30% marine biogenic, 40% fossil fuel, and 30% continental non-fossil fuel. The dominant marine organic aerosol source in the atmosphere has significant implications for climate change feedback processes.
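The three-source apportionment rests on an isotopic mass balance: the measured Delta-14C and delta-13C values are linear mixtures of the source signatures, with the three fractions summing to one. The sketch below solves the resulting 3x3 linear system by Cramer's rule; all endmember signatures are invented placeholders, not the values used in the study:

```python
def three_source_fractions(meas, marine, terrestrial, fossil):
    """Solve the mixing model
        f_m + f_t + f_f                            = 1
        f_m*D14C_m + f_t*D14C_t + f_f*D14C_f       = D14C_meas
        f_m*d13C_m + f_t*d13C_t + f_f*d13C_f       = d13C_meas
    by Cramer's rule. Each source is a (Delta14C, delta13C) pair; all
    signature values here are invented for illustration."""
    A = [
        [1.0, 1.0, 1.0],
        [marine[0], terrestrial[0], fossil[0]],
        [marine[1], terrestrial[1], fossil[1]],
    ]
    b = [1.0, meas[0], meas[1]]

    def det3(M):
        return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
                - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
                + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

    d = det3(A)
    fracs = []
    for col in range(3):
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][col] = b[r]
        fracs.append(det3(Ai) / d)
    return fracs  # [f_marine, f_terrestrial, f_fossil]

# Invented endmember signatures (Delta14C permil, delta13C permil)
marine, terr, fossil = (50.0, -21.0), (100.0, -27.0), (-1000.0, -26.0)
# "Measured" aerosol constructed as a 30/30/40 mix, for illustration
meas = (0.3 * 50 + 0.3 * 100 - 0.4 * 1000,
        0.3 * -21 + 0.3 * -27 + 0.4 * -26)
f = three_source_fractions(meas, marine, terr, fossil)
print([round(x, 2) for x in f])   # → [0.3, 0.3, 0.4]
```

The method works because fossil carbon is completely depleted in 14C while delta-13C separates marine from terrestrial carbon, so the two isotope ratios together pin down all three fractions.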