990 results for particle physics


Relevance:

70.00%

Publisher:

Abstract:

Within the dinuclear system (DNS) concept, the master equation is solved numerically, instead of solving the Fokker-Planck equation (FPE) analytically, to calculate the fusion probability of super-heavy nuclei; the harmonic-oscillator approximation to the potential energy of the DNS is thus avoided. The relaxation of the relative motion in energy, angular momentum and fragment deformation is explicitly treated and coupled to the diffusion process. The nucleon transition probabilities, which are derived microscopically, are related to the energy dissipation of the relative motion and are therefore time dependent. Compared with the analytical solution of the FPE at equilibrium, our time-dependent results preserve more dynamical effects. The calculated evaporation-residue cross sections for the one-neutron emission channel of Pb-based reactions agree with the known experimental data to within one order of magnitude.
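The numerical approach described above can be sketched schematically. The snippet below evolves a one-dimensional master equation for the occupation probabilities of DNS configurations, with hypothetical nearest-neighbour transition rates whose time dependence mimics the dissipation of the relative motion; it is an illustration of the technique, not the authors' code.

```python
import numpy as np

N = 50                      # number of mass-asymmetry configurations of the DNS
n_bg = 45                   # index beyond which the system is counted as fused
P = np.zeros(N)
P[10] = 1.0                 # initial projectile-target mass partition

def rates(t):
    # Placeholder time-dependent nearest-neighbour transition rates; in the
    # DNS model they are derived microscopically from the dissipated energy.
    w = 0.5 * (1.0 - np.exp(-t))
    return w, w             # (up, down) rates, here symmetric

dt, T = 0.01, 20.0
for step in range(int(T / dt)):
    up, down = rates(step * dt)
    dP = np.zeros(N)
    dP[1:]  += up * P[:-1]      # gain in n from hops n-1 -> n
    dP[:-1] -= up * P[:-1]      # matching loss from n-1
    dP[:-1] += down * P[1:]     # gain in n from hops n+1 -> n
    dP[1:]  -= down * P[1:]     # matching loss from n+1
    P += dt * dP                # explicit Euler step (probability-conserving)

fusion_probability = P[n_bg:].sum()
```

The pairing of each gain term with its loss term conserves total probability exactly, which is a useful sanity check on any such scheme.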

Relevance:

70.00%

Publisher:

Abstract:

An immense variety of problems in theoretical physics are non-linear. Non-linear partial differential equations (NPDEs) have almost become the rule rather than the exception in diverse branches of physics such as fluid mechanics, field theory, particle physics, statistical physics and optics, and the construction of exact solutions of these equations constitutes one of the most vigorous activities in theoretical physics today. The thesis, entitled 'Some Non-linear Problems in Theoretical Physics', addresses various aspects of this problem at the classical level. For obtaining exact solutions we have used mathematical tools such as the bilinear operator method, the base-equation technique and the similarity method, with emphasis on their group-theoretical aspects. The thesis deals with certain methods of finding exact solutions of a number of non-linear partial differential equations of importance to theoretical physics. Some of these new solutions are relevant to applications in diverse branches such as elementary particle physics, field theory, solid-state physics and non-linear optics, and give some insight into the stable or unstable behaviour of dynamical systems. The thesis consists of six chapters.

Relevance:

70.00%

Publisher:

Abstract:

In a recent paper, the hydrodynamic code NEXSPheRIO was used in conjunction with STAR analysis methods to study two-particle correlations as a function of Δη and Δφ. The various structures observed in the data were reproduced. In this work, we discuss the origin of these structures and present new results.

Relevance:

70.00%

Publisher:

Abstract:

A solution to a version of the Stieltjes moment problem is presented. Using this solution, we construct a family of coherent states of a charged particle in a uniform magnetic field. We prove that these states form an overcomplete set that is normalized and resolves the unity. With the help of these coherent states we construct the Fock-Bargmann representation related to the particle quantization. This quantization procedure takes into account the circle topology of the classical motion.
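In the generic textbook form of such constructions (illustrative notation, not necessarily the paper's), the resolution of unity reduces exactly to a Stieltjes moment problem for the weight function of the measure:

```latex
% Coherent states built on a positive sequence \rho(n):
|z\rangle \;=\; \mathcal{N}(|z|^{2})^{-1/2}\sum_{n=0}^{\infty}
               \frac{z^{n}}{\sqrt{\rho(n)}}\,|n\rangle ,
\qquad
\mathcal{N}(x) \;=\; \sum_{n=0}^{\infty}\frac{x^{n}}{\rho(n)} .
% The resolution of unity
\int |z\rangle\langle z|\, d\mu(z) \;=\; \mathbb{1},
\qquad
d\mu(z) \;=\; \mathcal{N}(|z|^{2})\,W(|z|^{2})\,\frac{d^{2}z}{\pi},
% holds precisely when W solves the Stieltjes moment problem
\int_{0}^{\infty} x^{n}\,W(x)\,dx \;=\; \rho(n), \qquad n = 0, 1, 2, \dots
```

Expanding $|z\rangle\langle z|$ and integrating over the phase of $z$ shows term by term that the moment condition is exactly what makes the $|n\rangle\langle n|$ coefficients equal to one.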

Relevance:

70.00%

Publisher:

Abstract:

The Large Hadron Collider presents an unprecedented opportunity to probe the realm of new physics in the TeV region and to shed light on some of the core unresolved issues of particle physics. These include the nature of electroweak symmetry breaking, the origin of mass, the possible constituents of cold dark matter, new sources of CP violation needed to explain the baryon excess in the universe, the possible existence of extra gauge groups and extra matter, and, importantly, the path Nature chooses to resolve the hierarchy problem: is it supersymmetry or extra dimensions? Many models of new physics beyond the Standard Model contain a hidden sector which can be probed at the LHC. Additionally, the LHC will be a top factory, and accurate measurements of the properties of the top quark and its rare decays will provide a window to new physics. Further, the LHC could shed light on the origin of neutrino masses if the new physics associated with their generation lies in the TeV region. Finally, the LHC is also a laboratory to test the hypothesis of TeV-scale strings and D-brane models. An overview of these possibilities is presented in the spirit that it will serve as a companion to the Technical Design Reports (TDRs) of the particle detector groups ATLAS and CMS, to facilitate the test of these new theoretical ideas at the LHC. Which of these ideas stands the test of the LHC data will govern the course of particle physics in the subsequent decades.

Relevance:

70.00%

Publisher:

Abstract:

CMS is a general-purpose experiment designed to study the physics of pp collisions at 14 TeV at the Large Hadron Collider (LHC). It currently involves more than 2000 physicists from more than 150 institutes in 37 countries. The LHC will provide extraordinary opportunities for particle physics, based on its unprecedented collision energy and luminosity, when it begins operation in 2007. The principal aim of this report is to present the strategy of CMS for exploring the rich physics programme offered by the LHC. This volume demonstrates the physics capability of the CMS experiment. The prime goals of CMS are to explore physics at the TeV scale and to study the mechanism of electroweak symmetry breaking, through the discovery of the Higgs particle or otherwise. To carry out this task, CMS must be prepared to search for new particles, such as the Higgs boson or supersymmetric partners of the Standard Model particles, from the start-up of the LHC, since new physics at the TeV scale may manifest itself with modest data samples of the order of a few fb^-1 or less. The analysis tools that have been developed are applied, with the full methodology of an analysis on CMS data, to study in detail specific benchmark processes against which the performance of CMS can be gauged. These processes cover several Higgs boson decay channels, the production and decay of new particles such as Z' and supersymmetric particles, B_s production, and processes in heavy-ion collisions. The simulation of these benchmark processes includes subtle effects such as possible detector miscalibration and misalignment. Beyond these benchmark processes, the physics reach of CMS is studied for a large number of signatures arising in the Standard Model, and also in theories beyond the Standard Model, for integrated luminosities ranging from 1 fb^-1 to 30 fb^-1.
The Standard Model processes include QCD, B physics, diffraction, detailed studies of top quark properties, and electroweak physics topics such as the W and Z^0 boson properties. The production and decay of the Higgs particle is studied for many observable decays, and the precision with which the Higgs boson properties can be determined is evaluated. About ten supersymmetry benchmark points are analysed using full simulation. The CMS discovery reach is evaluated in the SUSY parameter space covering a large variety of decay signatures. Furthermore, the discovery reach for a plethora of alternative models of new physics is explored, notably extra dimensions, new high-mass vector boson states, little Higgs models, technicolour and others. Methods to discriminate between models have been investigated. This report is organized as follows. Chapter 1, the Introduction, describes the context of this document. Chapters 2-6 describe examples of full analyses with photons, electrons, muons, jets, missing E_T, B mesons and taus, and with quarkonia in heavy-ion collisions. Chapters 7-15 describe the physics reach for Standard Model processes, Higgs discovery, and searches for new physics beyond the Standard Model.

Relevance:

70.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

70.00%

Publisher:

Abstract:

In this thesis, the phenomenology of the Randall-Sundrum setup is investigated. In this context models with and without an enlarged SU(2)_L x SU(2)_R x U(1)_X x P_{LR} gauge symmetry, which removes corrections to the T parameter and to the Z b_L \bar b_L coupling, are compared with each other. The Kaluza-Klein decomposition is formulated within the mass basis, which allows for a clear understanding of various model-specific features. A complete discussion of tree-level flavor-changing effects is presented. Exact expressions for five dimensional propagators are derived, including Yukawa interactions that mediate flavor-off-diagonal transitions. The symmetry that reduces the corrections to the left-handed Z b \bar b coupling is analyzed in detail. In the literature, Randall-Sundrum models have been used to address the measured anomaly in the t \bar t forward-backward asymmetry. However, it will be shown that this is not possible within a natural approach to flavor. The rare decays t \to cZ and t \to ch are investigated, where in particular the latter could be observed at the LHC. A calculation of \Gamma_{12}^{B_s} in the presence of new physics is presented. It is shown that the Randall-Sundrum setup allows for an improved agreement with measurements of A_{SL}^s, S_{\psi\phi}, and \Delta\Gamma_s. For the first time, a complete one-loop calculation of all relevant Higgs-boson production and decay channels in the custodial Randall-Sundrum setup is performed, revealing a sensitivity to large new-physics scales at the LHC.

Relevance:

70.00%

Publisher:

Abstract:

Future experiments in nuclear and particle physics are moving towards the high-luminosity regime in order to access rare processes. In this framework, particle detectors require high rate capability together with excellent timing resolution for precise event reconstruction. To achieve this, the development of dedicated front-end electronics (FEE) for detectors has become increasingly challenging and expensive. A current trend in R&D is therefore towards flexible FEE that can be easily adapted to a great variety of detectors without impairing the required high performance. This thesis reports on a novel FEE for two different detector types: imaging Cherenkov counters and plastic scintillator arrays. The former requires high sensitivity and precision for the detection of single-photon signals, while the latter is characterized by the slower and larger signals typical of scintillation processes. The FEE design was developed using high-bandwidth preamplifiers and fast discriminators which provide Time-over-Threshold (ToT) information. The use of discriminators allows for low power consumption, minimal dead time and self-triggering capability, all fundamental for high-rate applications. The output signals of the FEE are read out by a high-precision FPGA-based TDC system. A full characterization of the analogue signals under realistic conditions proved that the ToT information can be used in a novel way for charge measurements or walk corrections, thus improving the obtainable timing resolution. Detailed laboratory investigations proved the feasibility of the ToT method. The full readout chain was investigated in test experiments at the Mainz Microtron: counting rates of several MHz per channel were achieved, and a timing resolution of better than 100 ps after ToT-based walk correction was obtained. Applications to fast Time-of-Flight counters and further developments of the FEE have also been investigated.
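The ToT-based walk correction mentioned above can be illustrated with synthetic data. Everything below (the amplitude-dependent walk model, the jitter, the fit form) is an assumption for illustration, not the thesis' actual calibration: a leading-edge discriminator fires earlier for larger pulses, so the recorded time depends on amplitude, and ToT serves as the amplitude proxy against which the trend is fitted and subtracted.

```python
import numpy as np

rng = np.random.default_rng(0)
tot = rng.uniform(5.0, 40.0, 5000)              # ns, synthetic ToT values
true_t0 = 100.0                                  # ns, common true arrival time
walk = 8.0 / np.sqrt(tot)                        # assumed amplitude-dependent walk
t_meas = true_t0 + walk + rng.normal(0.0, 0.05, tot.size)  # 50 ps intrinsic jitter

# Calibrate: fit the measured time as a function of ToT, then subtract the trend.
x = 1.0 / np.sqrt(tot)                           # fit variable matching the walk model
coeff = np.polyfit(x, t_meas, 1)
t_corr = t_meas - np.polyval(coeff, x) + t_meas.mean()

print(f"resolution before walk correction: {np.std(t_meas) * 1e3:.0f} ps")
print(f"resolution after  walk correction: {np.std(t_corr) * 1e3:.0f} ps")
```

With these assumptions the spread collapses from the walk-dominated value to the intrinsic jitter; in practice the fit function would be determined empirically from calibration data.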

Relevance:

70.00%

Publisher:

Abstract:

This Habilitationsschrift (Habilitation thesis) focuses on my research activities on medical applications of particle physics and was written in 2013 to obtain the Venia Docendi (Habilitation) in experimental physics at the University of Bern. It is based on selected publications which represented, at that time, my major scientific contributions as an experimental physicist to the field of particle accelerators and detectors applied to medical diagnostics and therapy. The thesis is structured in two parts. In Part I, Chapter 1 presents an introduction to accelerators and detectors applied to medicine, with particular focus on cancer hadrontherapy and on the production of radioactive isotopes. In Chapter 2, my publications on medical particle accelerators are introduced and put into perspective. In particular, high-frequency linear accelerators for hadrontherapy are discussed, together with the new Bern cyclotron laboratory. Chapter 3 is dedicated to particle detectors, with particular emphasis on three instruments that I contributed to proposing and developing: segmented ionization chambers for hadrontherapy, a proton radiography apparatus based on nuclear emulsion films, and a beam monitor for ion beams based on doped silica fibres. Selected research and review papers are contained in Part II. For copyright reasons, they are only listed and not reprinted in this online version; they are available on the websites of the journals.

Relevance:

70.00%

Publisher:

Abstract:

Heading into the 2020s, Physics and Astronomy are undergoing experimental revolutions that will reshape our picture of the fabric of the Universe. The Large Hadron Collider (LHC), the largest particle-physics project in the world, produces 30 petabytes of data annually that need to be sifted through, analysed, and modelled. In astrophysics, the Large Synoptic Survey Telescope (LSST) will take a high-resolution image of the full sky every 3 days, leading to data rates of 30 terabytes per night over ten years. These experiments endeavour to answer the question of why 96% of the content of the universe currently eludes our physical understanding. Both the LHC and the LSST share the 5-dimensional nature of their data, with position, energy and time being the fundamental axes. This talk will present an overview of the experiments and the data they gather, and outline the challenges in extracting information. The strategies employed are very similar to industrial data-science problems (e.g., data filtering, machine learning, statistical interpretation) and provide a seed for the exchange of knowledge between academia and industry.

Speaker biography: Mark Sullivan is a Professor of Astrophysics in the Department of Physics and Astronomy. Mark completed his PhD at Cambridge and, following postdoctoral study in Durham, Toronto and Oxford, now leads a research group at Southampton studying dark energy using exploding stars called "type Ia supernovae". Mark has many years' experience of research that involves repeatedly imaging the night sky to track the arrival of transient objects, involving significant challenges in data handling, processing, classification and analysis.
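As a quick sanity check of the quoted data rates (assuming, for the sake of the arithmetic, that the LSST observes every night for the full ten years):

```python
TB = 1e12   # bytes per terabyte
PB = 1e15   # bytes per petabyte

lhc_per_year = 30 * PB                      # LHC: 30 PB annually
lsst_per_night = 30 * TB                    # LSST: 30 TB per night
lsst_total = lsst_per_night * 365 * 10      # ten years of nightly observing

print(f"LSST over ten years: {lsst_total / PB:.1f} PB")
print(f"LHC over ten years:  {10 * lhc_per_year / PB:.1f} PB")
```

Both experiments thus accumulate data volumes of order a hundred petabytes over a decade, which is why the data-handling challenges are comparable.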

Relevance:

60.00%

Publisher:

Abstract:

The aim of this study was to determine the collection efficiency of ultrafine particles in an impinger fitted with a fritted nozzle tip as a means to increase the contact surface area between the aerosol and the liquid. The influence of liquid sampling volume, frit porosity and the nature of the sampling liquid was explored, and it was shown that all of these impact the collection efficiency of particles smaller than 220 nm. The values obtained for overall collection efficiency were substantially higher (~30–95%) than previously reported, mainly due to the high deposition of particles in the fritted nozzle tip, especially in the case of finer-porosity frits and smaller particles. Values for the capture efficiency of the solvent alone ranged from 20 to 45%, depending on the type and volume of solvent. Additionally, our results show that dispersing the airstream into bubbles improves particle trapping by the liquid, and that collection efficiencies differ with the nature and volume of the solvent used.

Relevance:

60.00%

Publisher:

Abstract:

The aim of this work was to review the existing instrumental methods for monitoring airborne nanoparticles in different types of indoor and outdoor environments, in order to detect their presence and characterise their properties. First, the terminology and definitions used in this field are discussed; this is followed by a review of methods to measure particle physical characteristics, including number concentration, size distribution and surface area. An extensive discussion is provided of the direct methods for measuring particle elemental composition, as well as of indirect methods providing information on particle volatility and solubility, and thus in turn on the volatile and semi-volatile compounds of which the particles are composed. A brief summary of broader considerations related to nanoparticle monitoring in different environments concludes the paper.

Relevance:

60.00%

Publisher:

Abstract:

This paper reports the application of multi-criteria decision-making techniques, PROMETHEE and GAIA, and receptor models, PCA/APCS and PMF, to data from an air monitoring site located on the campus of the Queensland University of Technology in Brisbane, Australia, operated by the Queensland Environmental Protection Agency (QEPA). The data consisted of the concentrations of 21 chemical species and meteorological data collected between 1995 and 2003. PROMETHEE/GAIA separated the samples into those collected when leaded and unleaded petrol were used to power vehicles in the region. The number and source profiles of the factors obtained from the PCA/APCS and PMF analyses were compared. There are noticeable differences in the outcomes, possibly because of the non-negativity constraints imposed in the PMF analysis. While PCA/APCS identified 6 sources, PMF resolved the data into 9 factors. Each factor had a distinctive composition suggesting that motor vehicle emissions, controlled burning of forests, secondary sulphate, sea salt and road dust/soil were the most important sources of fine particulate matter at the site. The most plausible locations of the sources were identified by combining the results obtained from the receptor models with meteorological data. The study demonstrated the potential benefits of combining results from multi-criteria decision-making analysis with those from receptor models in order to gain insights that could enhance the development of air pollution control measures.
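The non-negativity constraint that distinguishes PMF from PCA can be sketched with a toy factorisation. The multiplicative-update scheme below is a generic stand-in (PMF proper minimises an uncertainty-weighted least-squares objective); the matrix sizes echo the 21 chemical species in the data set, but all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_species, n_factors = 60, 21, 3

# Synthetic data: non-negative source contributions (G) times source profiles (F).
G_true = rng.random((n_samples, n_factors))
F_true = rng.random((n_factors, n_species))
X = G_true @ F_true                              # samples-by-species matrix

# Factor X ~ G @ F with G, F >= 0 via multiplicative updates; unlike PCA,
# each update multiplies by a non-negative ratio, so entries never go negative.
G = rng.random((n_samples, n_factors))
F = rng.random((n_factors, n_species))
for _ in range(500):
    F *= (G.T @ X) / (G.T @ G @ F + 1e-12)       # update source profiles
    G *= (X @ F.T) / (G @ F @ F.T + 1e-12)       # update source contributions

residual = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
print(f"relative residual: {residual:.3f}")
```

Because the rows of F stay non-negative they can be read directly as chemical source profiles, which is precisely why PMF-style factors are easier to interpret physically than PCA loadings of mixed sign.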