990 results for Particle physics, QCD


Relevance:

90.00%

Publisher:

Abstract:

Abelian and non-Abelian gauge theories are of central importance in many areas of physics. In condensed matter physics, Abelian U(1) lattice gauge theories arise in the description of certain quantum spin liquids. In quantum information theory, Kitaev’s toric code is a Z(2) lattice gauge theory. In particle physics, Quantum Chromodynamics (QCD), the non-Abelian SU(3) gauge theory of the strong interactions between quarks and gluons, is nonperturbatively regularized on a lattice. Quantum link models extend the concept of lattice gauge theories beyond the Wilson formulation and are well suited for both digital and analog quantum simulation using ultracold atomic gases in optical lattices. Since quantum simulators do not suffer from the notorious sign problem, they open the door to studies of the real-time evolution of strongly coupled quantum systems, which are impossible with classical simulation methods. A plethora of interesting lattice gauge theories suggests itself for quantum simulation, which should allow us to address very challenging problems, ranging from confinement and deconfinement, or chiral symmetry breaking and its restoration at finite baryon density, to color superconductivity and the real-time evolution of heavy-ion collisions, first in simpler model gauge theories and ultimately in QCD.
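
As an aside that is not part of the abstract, the claim that Kitaev's toric code is a Z(2) lattice gauge theory can be made concrete with a small numerical check: on a periodic lattice with one qubit per link, the vertex ("star") operators act as the Z(2) Gauss law and commute with every magnetic plaquette operator. The Python sketch below is illustrative only; the 2x2 lattice size, the link indexing and the dense-matrix construction are my own choices.

```python
# Minimal sketch: the toric code as a Z(2) lattice gauge theory on a 2x2
# periodic lattice with one qubit per link (8 qubits, 256-dim Hilbert space).
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])   # sigma^x
Z = np.array([[1, 0], [0, -1]])  # sigma^z

L = 2                       # linear lattice size (periodic boundaries)
n_links = 2 * L * L         # one horizontal + one vertical link per site

def h(i, j):                # index of the horizontal link leaving site (i, j)
    return (j % L) * L + (i % L)

def v(i, j):                # index of the vertical link leaving site (i, j)
    return L * L + (j % L) * L + (i % L)

def many_body(op, links):
    """Tensor product acting with `op` on the given links, identity elsewhere."""
    factors = [op if k in links else I2 for k in range(n_links)]
    out = factors[0]
    for f in factors[1:]:
        out = np.kron(out, f)
    return out

def star(i, j):             # Z(2) Gauss-law / vertex operator A_s = prod sigma^x
    return many_body(X, {h(i, j), h(i - 1, j), v(i, j), v(i, j - 1)})

def plaquette(i, j):        # magnetic plaquette operator B_p = prod sigma^z
    return many_body(Z, {h(i, j), h(i, j + 1), v(i, j), v(i + 1, j)})

# A star and a plaquette share 0 or 2 links, so the sigma^x / sigma^z
# anticommutations cancel pairwise and all operators commute.
stars = [star(i, j) for i in range(L) for j in range(L)]
plaqs = [plaquette(i, j) for i in range(L) for j in range(L)]
for A in stars:
    for B in plaqs:
        assert np.allclose(A @ B, B @ A)
print("all star/plaquette operators commute")
```

For anything beyond a toy lattice one would switch to sparse matrices or a dedicated simulator, but the commutation structure is already visible at this size.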

Relevance:

90.00%

Publisher:

Abstract:

This Habilitationsschrift (Habilitation thesis) is focused on my research activities on medical applications of particle physics and was written in 2013 to obtain the Venia Docendi (Habilitation) in experimental physics at the University of Bern. It is based on selected publications, which represented at that time my major scientific contributions as an experimental physicist to the field of particle accelerators and detectors applied to medical diagnostics and therapy. The thesis is structured in two parts. In Part I, Chapter 1 presents an introduction to accelerators and detectors applied to medicine, with particular focus on cancer hadrontherapy and on the production of radioactive isotopes. In Chapter 2, my publications on medical particle accelerators are introduced and put into perspective. In particular, high-frequency linear accelerators for hadrontherapy are discussed together with the new Bern cyclotron laboratory. Chapter 3 is dedicated to particle detectors, with particular emphasis on three instruments that I helped to propose and develop: segmented ionization chambers for hadrontherapy, a proton radiography apparatus with nuclear emulsion films, and a beam monitor detector for ion beams based on doped silica fibres. Selected research and review papers are contained in Part II. For copyright reasons, they are only listed and not reprinted in this online version; they are available on the websites of the journals.

Relevance:

90.00%

Publisher:

Abstract:

Most experiments in particle physics are scattering experiments, the analysis of which leads to masses, scattering phases, decay widths and other properties of one- or multi-particle systems. Until the advent of Lattice Quantum Chromodynamics (LQCD) it was difficult to compare experimental results on low-energy hadron-hadron scattering processes to the predictions of QCD, the current theory of the strong interactions. The reason is that at low energies the QCD coupling constant becomes large and the perturbation expansion for scattering amplitudes does not converge. To overcome this, one puts the theory onto a lattice, which imposes a momentum cutoff, and computes the path integral numerically. For particle masses, predictions of LQCD agree with experiment, but the area of decay widths is largely unexplored.

LQCD provides ab initio access to unusual hadrons such as exotic mesons, which are predicted to contain real gluonic structure. To study decays of such resonances, the energy spectra of a two-particle decay state in a finite volume of dimension L can be related to the associated scattering phase shift δ(k) at momentum k through exact formulae derived by Lüscher. Because the spectra can be computed using numerical Monte Carlo techniques, the scattering phases can be determined using Lüscher's formulae, and the corresponding decay widths can be found by fitting Breit-Wigner functions.

Results of such a decay-width calculation for an exotic hybrid (h) meson (J^PC = 1^-+) are presented for the decay channel h → πa1. This calculation employed Lüscher's formulae and an approximation of LQCD called the quenched approximation. Energy spectra for the h and πa1 systems were extracted using eigenvalues of a correlation matrix, and the corresponding scattering phase shifts were determined for a discrete set of πa1 momenta. Although the number of phase-shift data points was sparse, fits to a Breit-Wigner model were made, resulting in a decay width of about 60 MeV.
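
As an illustration of the final fitting step only, the sketch below fits a Breit-Wigner phase-shift curve to a few (energy, phase shift) points with scipy, the way one would treat the discrete phase shifts obtained from Lüscher's formulae. The energies, phase shifts and uncertainties are synthetic placeholders, not the lattice results of the thesis, and the simple nonrelativistic Breit-Wigner form is an assumed parametrisation.

```python
# Illustrative only: extract a resonance mass M and width Gamma by fitting a
# Breit-Wigner phase shift to a handful of (energy, phase shift) points.
import numpy as np
from scipy.optimize import curve_fit

def bw_phase(E, M, Gamma):
    """Breit-Wigner phase shift delta(E) in radians; runs from 0 to pi through the resonance."""
    return np.arctan2(Gamma / 2.0, M - E)

# synthetic energies (GeV), phase shifts (radians) and uncertainties
E_data     = np.array([1.85, 1.95, 2.05, 2.15])
delta_data = np.array([0.35, 0.80, 2.10, 2.65])
delta_err  = np.array([0.15, 0.20, 0.25, 0.20])

popt, pcov = curve_fit(bw_phase, E_data, delta_data,
                       p0=[2.0, 0.1], sigma=delta_err, absolute_sigma=True)
M_fit, Gamma_fit = popt
M_err, Gamma_err = np.sqrt(np.diag(pcov))
print(f"M = {M_fit:.3f} +/- {M_err:.3f} GeV, "
      f"Gamma = {Gamma_fit*1e3:.0f} +/- {Gamma_err*1e3:.0f} MeV")
```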

Relevance:

90.00%

Publisher:

Abstract:

Heading into the 2020s, physics and astronomy are undergoing experimental revolutions that will reshape our picture of the fabric of the Universe. The Large Hadron Collider (LHC), the largest particle physics project in the world, produces 30 petabytes of data annually that need to be sifted through, analysed, and modelled. In astrophysics, the Large Synoptic Survey Telescope (LSST) will be taking a high-resolution image of the full sky every 3 days, leading to data rates of 30 terabytes per night over ten years. These experiments endeavour to answer the question of why 96% of the content of the Universe currently eludes our physical understanding. Both the LHC and LSST share the 5-dimensional nature of their data, with position, energy and time being the fundamental axes. This talk will present an overview of the experiments and the data that are gathered, and outline the challenges in extracting information. The common strategies employed are very similar to those of industrial data science problems (e.g., data filtering, machine learning, statistical interpretation) and provide a seed for the exchange of knowledge between academia and industry.

Speaker biography (Professor Mark Sullivan): Mark Sullivan is a Professor of Astrophysics in the Department of Physics and Astronomy. Mark completed his PhD at Cambridge and, following postdoctoral study in Durham, Toronto and Oxford, now leads a research group at Southampton studying dark energy using exploding stars called "type Ia supernovae". Mark has many years' experience of research that involves repeatedly imaging the night sky to track the arrival of transient objects, involving significant challenges in data handling, processing, classification and analysis.

Relevance:

80.00%

Publisher:

Abstract:

The aim of this study was to determine the collection efficiency of ultrafine particles in an impinger fitted with a fritted nozzle tip as a means to increase the contact surface area between the aerosol and the liquid. The influence of liquid sampling volume, frit porosity and the nature of the sampling liquid was explored, and it was shown that all three affect the collection efficiency of particles smaller than 220 nm. The values obtained for overall collection efficiency were substantially higher (~30–95%) than previously reported, mainly due to the high deposition of particles in the fritted nozzle tip, especially in the case of finer-porosity frits and smaller particles. Values for the capture efficiency of the solvent alone ranged from 20 to 45%, depending on the type and the volume of solvent. Additionally, our results show that dispersing the airstream into bubbles improves particle trapping by the liquid and that there is a difference in collection efficiencies based on the nature and volume of the solvent used.

Relevance:

80.00%

Publisher:

Abstract:

The aim of this work was to review the existing instrumental methods for monitoring airborne nanoparticles in different types of indoor and outdoor environments, in order to detect their presence and to characterise their properties. First, the terminology and definitions used in this field are discussed, followed by a review of the methods to measure particle physical characteristics, including number concentration, size distribution and surface area. An extensive discussion is provided of the direct methods for measuring particle elemental composition, as well as of indirect methods providing information on particle volatility and solubility, and thus in turn on the volatile and semivolatile compounds of which the particles are composed. A brief summary of broader considerations related to nanoparticle monitoring in different environments concludes the paper.

Relevance:

80.00%

Publisher:

Abstract:

This paper reports the application of multicriteria decision-making techniques, PROMETHEE and GAIA, and receptor models, PCA/APCS and PMF, to data from an air monitoring site located on the campus of Queensland University of Technology in Brisbane, Australia, and operated by the Queensland Environmental Protection Agency (QEPA). The data consisted of the concentrations of 21 chemical species and meteorological data collected between 1995 and 2003. PROMETHEE/GAIA separated the samples into those collected when leaded and when unleaded petrol were used to power vehicles in the region. The number and source profiles of the factors obtained from the PCA/APCS and PMF analyses were compared. There are noticeable differences in the outcomes, possibly because of the non-negativity constraints imposed in the PMF analysis: while PCA/APCS identified 6 sources, PMF reduced the data to 9 factors. Each factor had a distinctive composition suggesting that motor vehicle emissions, controlled burning of forests, secondary sulphate, sea salt and road dust/soil were the most important sources of fine particulate matter at the site. The most plausible locations of the sources were identified by combining the results obtained from the receptor models with the meteorological data. The study demonstrated the potential benefits of combining results from multicriteria decision-making analysis with those from receptor models in order to gain insights that could enhance the development of air pollution control measures.
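
The remark about non-negativity constraints can be illustrated with a toy factorisation. The sketch below is not the study's pipeline or data: it builds a synthetic sample-by-species concentration matrix and contrasts an unconstrained decomposition (PCA) with a non-negative one (scikit-learn's NMF, used here only as a rough stand-in for PMF), whose factors are non-negative by construction.

```python
# Illustrative only: PCA vs a non-negative factorisation on synthetic data.
# Neither the data nor the number of factors corresponds to the QEPA dataset.
import numpy as np
from sklearn.decomposition import PCA, NMF

rng = np.random.default_rng(0)
n_samples, n_species, n_sources = 500, 21, 4

# synthetic non-negative source profiles and time-varying contributions
profiles      = rng.gamma(2.0, 1.0, size=(n_sources, n_species))
contributions = rng.gamma(1.5, 1.0, size=(n_samples, n_sources))
noise = rng.normal(0, 0.05, size=(n_samples, n_species))
X = np.clip(contributions @ profiles + noise, 0, None)

pca = PCA(n_components=n_sources).fit(X)
nmf = NMF(n_components=n_sources, init="nndsvda", max_iter=1000).fit(X)

# PCA loadings may be negative, which has no physical meaning for a source
# profile; NMF/PMF factors are non-negative by construction.
print("negative entries in PCA components:", int((pca.components_ < 0).sum()))
print("negative entries in NMF components:", int((nmf.components_ < 0).sum()))
```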

Relevance:

80.00%

Publisher:

Abstract:

High-time-resolution aerosol mass spectrometry measurements were conducted during a field campaign at the Mace Head Research Station, Ireland, in June 2007. Observations on one particular day of the campaign clearly indicated advection of aerosol from volcanoes and desert plains in Iceland, which could be traced with NOAA HYSPLIT air-mass back trajectories and satellite images. In conjunction with this event, elevated levels of sulphate and light-absorbing particles were encountered at Mace Head. While the sulphate concentration was continuously increasing, nitrate levels remained low, indicating no significant contribution from anthropogenic pollutants. The sulphate concentration increased by about 3.8 µg/m³ in comparison with background conditions. The corresponding sulphur flux from volcanic emissions was estimated at about 0.3 Tg S/yr, suggesting that a large amount of the sulphur released from Icelandic volcanoes may be distributed over distances larger than 1000 km. Overall, our results corroborate that transport of volcanogenic sulphate and dust particles can significantly change the chemical composition, size distribution, and optical properties of aerosol over the North Atlantic Ocean and should be considered accordingly by regional climate models.
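
Purely as a back-of-the-envelope illustration of how a concentration enhancement translates into a flux, the sketch below runs a simple box-model estimate. Only the 3.8 µg/m³ sulphate enhancement is taken from the abstract; the sulphur mass fraction is basic chemistry, while the wind speed, mixing height and plume width are assumed values of my own, and the paper's actual method may differ.

```python
# Back-of-envelope sketch: concentration enhancement -> sulphur mass flux.
SO4_ENHANCEMENT  = 3.8e-9          # kg sulphate per m3 (3.8 ug/m3, from the abstract)
S_FRACTION       = 32.06 / 96.06   # sulphur mass fraction of sulphate (S / SO4)
WIND_SPEED       = 10.0            # m/s   (assumed advection speed)
MIXING_HEIGHT    = 1.0e3           # m     (assumed boundary-layer depth)
PLUME_WIDTH      = 1.0e6           # m     (assumed lateral extent, ~1000 km)
SECONDS_PER_YEAR = 3.15e7

flux_kg_per_s   = SO4_ENHANCEMENT * S_FRACTION * WIND_SPEED * MIXING_HEIGHT * PLUME_WIDTH
flux_TgS_per_yr = flux_kg_per_s * SECONDS_PER_YEAR / 1e9   # 1 Tg = 1e9 kg

print(f"approximate sulphur flux: {flux_TgS_per_yr:.2f} Tg S / yr")
# With these assumptions the result is a few tenths of a Tg S per year,
# the same order as the ~0.3 Tg S/yr quoted in the abstract.
```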

Relevance:

80.00%

Publisher:

Abstract:

We report on an intercomparison of six different hygroscopicity tandem differential mobility analysers (HTDMAs). These HTDMAs are used worldwide in laboratories and in field campaigns to measure the water uptake of aerosol particles, yet they had never been intercompared. After an examination of the different instrument designs, with their respective advantages and drawbacks, the methods for calibration, validation and analysis are presented. Measurements of nebulised ammonium sulphate as well as of secondary organic aerosol generated in a smog chamber were performed. Agreement and discrepancies between the instruments and with theory are discussed, and final recommendations for a standard instrument are given as a benchmark for laboratory or field experiments, to ensure a high quality of HTDMA data.

Relevance:

80.00%

Publisher:

Abstract:

The ‘anti-’ of ‘(Anti)Queer’ is a queer anti. In particle physics, a domain of science which was for a long time peddled as ultimately knowable, rational and objective, the postmodern turn has made everything queer (or chaotic, as the scientific version of this turn is perhaps more commonly named). This is a world where not only do two wrongs not make a right, but a negative and a positive do not calmly cancel each other out to leave nothing, as mathematics might suggest. When matter meets anti-matter, the resulting explosion can produce not only energy - heat and light? - but new matter. We live in a world whose very basics are no longer the electron and the positron, but an ever-proliferating number of chaotic, unpredictable - queer? - subatomic particles. Some are ‘charmed’, others merely ‘strange’. Weird science indeed. The ‘Anti-’ of ‘(Anti)Queer’ does not place itself neatly into binaries. This is not a refutation of all that queer has been or will be. It is explicitly a confrontation, a challenge, an attempt to take seriously not only the claims made for queer but the potent contradictions and silences which stand proudly when any attempt is made to write a history of the term. Specifically, ‘(Anti)Queer’ is not Beyond Queer, the title of Bruce Bawer’s 1996 book which calmly and self-confidently explains the failings of queer, extols a return to a liberal political theory of cultural change and places its own marker on queer as a movement whose purpose has been served. We are not Beyond Queer. And if we are Anti-Queer, it is only to challenge those working in the arena to acknowledge and work with some of the facts of the movement’s history whose productivity has been erased with a gesture which has proved, bizarrely, to be reductive and homogenising.

Relevance:

80.00%

Publisher: