989 results for particle physics - cosmology connection
Abstract:
Future experiments in nuclear and particle physics are moving towards the high-luminosity regime in order to access rare processes. In this framework, particle detectors require high rate capability together with excellent timing resolution for precise event reconstruction. To achieve this, the development of dedicated Front-End Electronics (FEE) for detectors has become increasingly challenging and expensive. A current trend in R&D is therefore towards flexible FEE that can be easily adapted to a great variety of detectors without impairing the required high performance. This thesis reports on a novel FEE for two different detector types: imaging Cherenkov counters and plastic scintillator arrays. The former requires high sensitivity and precision for the detection of single-photon signals, while the latter is characterized by the slower and larger signals typical of scintillation processes. The FEE design was developed using high-bandwidth preamplifiers and fast discriminators which provide Time-over-Threshold (ToT). The use of discriminators allowed for low power consumption, minimal dead time and self-triggering capability, all fundamental for high-rate applications. The output signals of the FEE are read out by a high-precision FPGA-based TDC system. A full characterization of the analogue signals under realistic conditions proved that the ToT information can be used in a novel way for charge measurements or walk corrections, thus improving the obtainable timing resolution. Detailed laboratory investigations proved the feasibility of the ToT method. The full readout chain was investigated in test experiments at the Mainz Microtron: counting rates of several MHz per channel were achieved, and a timing resolution better than 100 ps after ToT-based walk correction was obtained. Ongoing applications to fast Time-of-Flight counters and future FEE developments have also recently been investigated.
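The ToT-based walk correction mentioned above lends itself to a compact illustration. The sketch below is not the thesis's actual procedure: it merely assumes the time walk can be parameterised as a smooth function of ToT and fitted on calibration data against a reference counter (all function names and numbers are hypothetical).

```python
import numpy as np

def fit_walk_correction(tot_ns, t_res_ps, deg=3):
    """Fit a polynomial walk model t_err(ToT) on calibration data.

    tot_ns:   measured Time-over-Threshold values (ns)
    t_res_ps: timing residuals vs. a reference counter (ps)
    """
    return np.polynomial.Polynomial.fit(tot_ns, t_res_ps, deg)

def apply_walk_correction(t_raw_ps, tot_ns, walk_poly):
    """Subtract the ToT-dependent walk from raw discriminator times."""
    return t_raw_ps - walk_poly(tot_ns)

# Synthetic calibration run (illustrative values only).
rng = np.random.default_rng(0)
tot = rng.uniform(5.0, 40.0, 10_000)                  # ToT in ns
true_walk = 800.0 / tot                               # small pulses cross threshold late
t_res = true_walk + rng.normal(0.0, 90.0, tot.size)   # 90 ps intrinsic jitter

poly = fit_walk_correction(tot, t_res)
corrected = apply_walk_correction(t_res, tot, poly)
print(f"sigma before: {t_res.std():6.1f} ps, after: {corrected.std():6.1f} ps")
```

With the walk removed, the residual spread approaches the intrinsic jitter, which is the sense in which ToT improves the obtainable timing resolution.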
Abstract:
Quarks were introduced 50 years ago, opening the road towards our understanding of the elementary constituents of matter and their fundamental interactions. Since then, spectacular progress has been made, with important discoveries that led to the establishment of the Standard Theory, which accurately describes the basic constituents of observable matter, namely quarks and leptons, interacting through the exchange of three fundamental forces: the weak, electromagnetic and strong force. Particle physics is now entering a new era driven by the quest to understand the composition of our Universe, including the unobservable (dark) matter, the hierarchy of masses and forces, the unification of all fundamental interactions with gravity in a consistent quantum framework, and several other important questions. A candidate theory providing answers to many of these questions is string theory, which replaces the notion of point particles by extended objects such as closed and open strings. In this short note, I give a brief overview of string unification, describe in particular how quarks and leptons can emerge, and discuss possible predictions for particle physics and cosmology that could test these ideas.
Abstract:
This Habilitationsschrift (Habilitation thesis) focuses on my research activities on medical applications of particle physics and was written in 2013 to obtain the Venia Docendi (Habilitation) in experimental physics at the University of Bern. It is based on selected publications, which represented at that time my major scientific contributions as an experimental physicist to the field of particle accelerators and detectors applied to medical diagnostics and therapy. The thesis is structured in two parts. In Part I, Chapter 1 presents an introduction to accelerators and detectors applied to medicine, with particular focus on cancer hadrontherapy and on the production of radioactive isotopes. In Chapter 2, my publications on medical particle accelerators are introduced and put into perspective. In particular, high-frequency linear accelerators for hadrontherapy are discussed together with the new Bern cyclotron laboratory. Chapter 3 is dedicated to particle detectors, with particular emphasis on three instruments that I helped propose and develop: segmented ionization chambers for hadrontherapy, a proton radiography apparatus with nuclear emulsion films, and a beam monitor detector for ion beams based on doped silica fibres. Selected research and review papers are contained in Part II. For copyright reasons, they are only listed and not reprinted in this online version; they are available on the journals' websites.
Abstract:
Heading into the 2020s, physics and astronomy are undergoing experimental revolutions that will reshape our picture of the fabric of the Universe. The Large Hadron Collider (LHC), the largest particle physics project in the world, produces 30 petabytes of data annually that need to be sifted through, analysed, and modelled. In astrophysics, the Large Synoptic Survey Telescope (LSST) will be taking a high-resolution image of the full sky every 3 days, leading to data rates of 30 terabytes per night over ten years. These experiments endeavour to answer the question of why 96% of the content of the universe currently eludes our physical understanding. Both the LHC and LSST share the 5-dimensional nature of their data, with position, energy and time being the fundamental axes. This talk will present an overview of the experiments and the data they gather, and outline the challenges in extracting information. The strategies commonly employed are very similar to those of industrial data science problems (e.g., data filtering, machine learning, statistical interpretation) and provide a seed for the exchange of knowledge between academia and industry. Speaker Biography: Mark Sullivan is a Professor of Astrophysics in the Department of Physics and Astronomy. Mark completed his PhD at Cambridge and, following postdoctoral study in Durham, Toronto and Oxford, now leads a research group at Southampton studying dark energy using exploding stars called "type Ia supernovae". Mark has many years' experience of research that involves repeatedly imaging the night sky to track the arrival of transient objects, involving significant challenges in data handling, processing, classification and analysis.
Abstract:
The aim of this study was to determine the collection efficiency of ultrafine particles in an impinger fitted with a fritted nozzle tip as a means to increase the contact surface area between the aerosol and the liquid. The influence of liquid sampling volume, frit porosity and the nature of the sampling liquid was explored, and all three were shown to affect the collection efficiency of particles smaller than 220 nm. The values obtained for overall collection efficiency were substantially higher (~30–95%) than previously reported, mainly due to the high deposition of particles in the fritted nozzle tip, especially in the case of finer-porosity frits and smaller particles. Values for the capture efficiency of the solvent alone ranged from 20 to 45%, depending on the type and volume of solvent. Additionally, our results show that dispersing the airstream into bubbles improves particle trapping by the liquid and that collection efficiencies differ with the nature and volume of the solvent used.
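For orientation, overall collection efficiency in studies of this kind is conventionally computed from particle concentrations measured upstream and downstream of the collector. The sketch below shows that standard calculation; the function name and the numbers are illustrative, not taken from this study.

```python
def collection_efficiency(c_upstream, c_downstream):
    """Overall collection efficiency: fraction of incoming particles retained.

    c_upstream:   particle number concentration entering the impinger (#/cm^3)
    c_downstream: concentration remaining in the outlet airstream (#/cm^3)
    """
    return 1.0 - c_downstream / c_upstream

# Illustrative counts for one hypothetical particle size.
print(f"E = {collection_efficiency(12_000.0, 1_800.0):.0%}")  # -> E = 85%
```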
Abstract:
The aim of this work was to review the existing instrumental methods for monitoring airborne nanoparticles in different types of indoor and outdoor environments, in order to detect their presence and characterise their properties. First, the terminology and definitions used in this field are discussed, followed by a review of methods to measure particle physical characteristics, including number concentration, size distribution and surface area. An extensive discussion is provided of direct methods for measuring particle elemental composition, as well as of indirect methods providing information on particle volatility and solubility, and thus in turn on the volatile and semivolatile compounds of which the particle is composed. A brief summary of broader considerations related to nanoparticle monitoring in different environments concludes the paper.
Abstract:
This paper reports the application of multicriteria decision-making techniques, PROMETHEE and GAIA, and receptor models, PCA/APCS and PMF, to data from an air monitoring site located on the campus of Queensland University of Technology in Brisbane, Australia, operated by the Queensland Environmental Protection Agency (QEPA). The data consisted of the concentrations of 21 chemical species and meteorological data collected between 1995 and 2003. PROMETHEE/GAIA separated the samples into those collected when leaded and when unleaded petrol were used to power vehicles in the region. The number and source profiles of the factors obtained from the PCA/APCS and PMF analyses were compared. There are noticeable differences in the outcomes, possibly because of the non-negative constraints imposed on the PMF analysis: while PCA/APCS identified 6 sources, PMF resolved 9 factors. Each factor had a distinctive composition suggesting that motor vehicle emissions, controlled burning of forests, secondary sulphate, sea salt and road dust/soil were the most important sources of fine particulate matter at the site. The most plausible locations of the sources were identified by combining the results obtained from the receptor models with meteorological data. The study demonstrated the potential benefits of combining results from multicriteria decision-making analysis with those from receptor models to gain insights that could enhance the development of air pollution control measures.
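For readers unfamiliar with receptor modelling, the sketch below illustrates the two factor-analysis approaches compared above on a synthetic species-by-sample matrix, using scikit-learn's PCA and, as a stand-in for PMF, its non-negatively constrained cousin NMF (true PMF additionally weights each entry by its measurement uncertainty, which plain NMF does not). All data here are synthetic; nothing is taken from the study.

```python
import numpy as np
from sklearn.decomposition import PCA, NMF

# Synthetic "receptor" data: 500 samples x 21 chemical species,
# mixed from 3 hypothetical non-negative source profiles.
rng = np.random.default_rng(42)
profiles = rng.random((3, 21))                  # source compositions
contributions = rng.gamma(2.0, 1.0, (500, 3))   # source strengths per sample
X = contributions @ profiles + rng.normal(0.0, 0.01, (500, 21)).clip(0)

# PCA: orthogonal factors whose loadings may be negative
# (hence the APCS post-processing step in the paper's PCA/APCS).
pca = PCA(n_components=3).fit(X)
print("PCA explained variance:", pca.explained_variance_ratio_.round(2))

# NMF: non-negative factors, closer in spirit to PMF's constraints.
nmf = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
G = nmf.fit_transform(X)    # sample-by-factor contributions
F = nmf.components_         # factor-by-species profiles
print("NMF reconstruction error:", round(nmf.reconstruction_err_, 3))
```

The non-negativity constraint is why the two methods can split the same data into different numbers of physically interpretable factors, as observed above.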
Abstract:
High time resolution aerosol mass spectrometry measurements were conducted during a field campaign at Mace Head Research Station, Ireland, in June 2007. Observations on one particular day of the campaign clearly indicated advection of aerosol from volcanoes and desert plains in Iceland, which could be traced with NOAA HYSPLIT air mass back-trajectories and satellite images. In conjunction with this event, elevated levels of sulphate and light-absorbing particles were encountered at Mace Head. While the sulphate concentration was continuously increasing, nitrate levels remained low, indicating no significant contribution from anthropogenic pollutants. The sulphate concentration increased by about 3.8 µg/m³ relative to background conditions. The corresponding sulphur flux from volcanic emissions was estimated at about 0.3 Tg S/yr, suggesting that a large amount of the sulphur released from Icelandic volcanoes may be distributed over distances larger than 1000 km. Overall, our results corroborate that transport of volcanogenic sulphate and dust particles can significantly change the chemical composition, size distribution, and optical properties of aerosol over the North Atlantic Ocean and should be considered accordingly by regional climate models.
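As a plausibility check, an enhancement of a few µg/m³ can indeed translate into a flux of this magnitude under simple transport assumptions. The sketch below is a back-of-the-envelope estimate, not the authors' method (whose assumptions are not given in the abstract); every transport parameter in it is an assumption for illustration.

```python
# Order-of-magnitude sulphur flux from a concentration enhancement.
# All transport parameters below are assumptions for illustration,
# not values taken from the study.
SECONDS_PER_YEAR = 3.15e7

d_sulphate = 3.8e-6        # reported enhancement, g/m^3 (3.8 ug/m^3)
s_fraction = 32.0 / 96.0   # mass fraction of S in sulphate (SO4^2-)
wind_speed = 10.0          # assumed advection speed, m/s
mixing_height = 1.0e3      # assumed boundary-layer depth, m
plume_width = 1.0e6        # assumed lateral plume extent, m (~1000 km)

flux_g_per_s = d_sulphate * s_fraction * wind_speed * mixing_height * plume_width
flux_tg_per_yr = flux_g_per_s * SECONDS_PER_YEAR / 1e12
print(f"~{flux_tg_per_yr:.1f} Tg S/yr")  # ~0.4 Tg S/yr, same order as reported
```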
Abstract:
We report on an inter-comparison of six different hygroscopicity tandem differential mobility analysers (HTDMAs). These HTDMAs are used worldwide in laboratories and in field campaigns to measure the water uptake of aerosol particles, yet had never been intercompared. After an investigation of the different instrument designs, with their respective advantages and drawbacks, the methods for calibration, validation and analysis are presented. Measurements were performed on nebulised ammonium sulphate as well as on secondary organic aerosol generated in a smog chamber. Agreement and discrepancies among the instruments and with theory are discussed, and final recommendations for a standard instrument are given as a benchmark for laboratory or field experiments, to ensure high-quality HTDMA data.
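The core quantity an HTDMA reports is the hygroscopic growth factor, the ratio of the humidified to the dry mobility diameter. The sketch below shows that calculation; the numbers are illustrative (a growth factor near 1.7 at 90% RH is roughly what ammonium sulphate exhibits), not data from this study.

```python
def growth_factor(d_wet_nm: float, d_dry_nm: float) -> float:
    """Hygroscopic growth factor GF(RH) = D(RH) / D_dry, from HTDMA diameters."""
    return d_wet_nm / d_dry_nm

# Illustrative values only: a 100 nm dry particle humidified to 90% RH.
print(f"GF(90% RH) = {growth_factor(170.0, 100.0):.2f}")  # -> GF(90% RH) = 1.70
```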
Abstract:
The ‘anti-’ of ‘(Anti)Queer’ is a queer anti. In particle physics, a domain of science which was for a long time peddled as ultimately knowable, rational and objective, the postmodern turn has made everything queer (or chaotic, as the scientific version of this turn is perhaps more commonly named). This is a world where not only do two wrongs not make a right, but a negative and a positive do not calmly cancel each other out to leave nothing, as mathematics might suggest. When matter meets with anti-matter, the resulting explosion can produce not only energy - heat and light? - but new matter. We live in a world whose very basics are no longer the electron and the positron, but an ever-proliferating number of chaotic, unpredictable - queer? - subatomic particles. Some are ‘charmed’, others merely ‘strange’. Weird science indeed. The ‘anti-’ of ‘(Anti)Queer’ does not place itself neatly into binaries. This is not a refutation of all that queer has been or will be. It is explicitly a confrontation, a challenge, an attempt to take seriously not only the claims made for queer but the potent contradictions and silences which stand proudly when any attempt is made to write a history of the term. Specifically, ‘(Anti)Queer’ is not Beyond Queer, the title of Bruce Bawer’s 1996 book which calmly and self-confidently explains the failings of queer, extols a return to a liberal political theory of cultural change and places its own marker on queer as a movement whose purpose has been served. We are not Beyond Queer. And if we are Anti-Queer, it is only to challenge those working in the arena to acknowledge and work with some of the facts of the movement’s history whose productivity has been erased with a gesture which has proved, bizarrely, to be reductive and homogenising.