829 results for LHC
Abstract:
The Large Hadron Collider (LHC) is the world's largest and most powerful particle accelerator and one of the largest research projects ever undertaken. Its operation is divided into phases: the first runs from 2009 to 2020, after which a second phase will implement a series of upgrades. One of these upgrades is an increase in the collision rate, the luminosity, which is the main objective of one of the most important upgrade projects: the High-Luminosity LHC (Hi-Lumi LHC). The luminosity increase relies on replacing the NbTi superconducting magnets at the two main interaction points with magnets made of Nb3Sn. Before this substitution can be implemented, several aspects must be analysed; one of them is the quality of the magnetic field. The tool used so far is ROXIE, a software package developed at CERN by S. Russenschuck. One of its main features, the time-transient analysis, is based on three magnetization models that are accurate for fields above 1.5 T but considerably less so for lower fields. The aim of this project is therefore to evaluate a fourth, more accurate model, the Classical Preisach Model of Hysteresis, in order to better analyse the quality of the field induced in the new Nb3Sn magnets.
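As a purely illustrative sketch of the technique named in the abstract, the classical Preisach model represents the magnetization as a weighted sum of two-state relay operators (hysterons). The grid size, weight density and field sweep below are arbitrary assumptions, not ROXIE's implementation or measured Nb3Sn data.

```python
import numpy as np

class PreisachModel:
    """Minimal discretized classical Preisach model of hysteresis.

    The magnetization is a weighted sum of two-state relay operators
    (hysterons), each defined by switch-up/switch-down thresholds
    (alpha >= beta). The weight density here is a simple illustrative
    Gaussian, not a measured Preisach distribution.
    """

    def __init__(self, h_sat=1.0, n=50):
        # Grid of hysterons on the half-plane alpha >= beta.
        a = np.linspace(-h_sat, h_sat, n)
        self.alpha, self.beta = np.meshgrid(a, a, indexing="ij")
        self.mask = self.alpha >= self.beta
        # Illustrative weight (Preisach) density, normalized on the triangle.
        w = np.exp(-(self.alpha**2 + self.beta**2)) * self.mask
        self.weight = w / w.sum()
        # All hysterons start in the "down" (-1) state.
        self.state = -np.ones_like(self.alpha)

    def apply_field(self, h):
        """Update hysteron states for an applied field value h and return M."""
        self.state[(h >= self.alpha) & self.mask] = +1.0
        self.state[(h <= self.beta) & self.mask] = -1.0
        return float(np.sum(self.weight * self.state))


# Trace a major hysteresis loop for a triangular field sweep.
model = PreisachModel()
sweep = np.concatenate([np.linspace(0, 1, 100),
                        np.linspace(1, -1, 200),
                        np.linspace(-1, 1, 200)])
loop = [model.apply_field(h) for h in sweep]
```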
Abstract:
With the passage of time, and spurred by technological development, the number and complexity of the experiments carried out at the well-known particle physics laboratory, CERN, have reached unprecedented levels, and this evolution is accentuated and motivated by each new discovery. Proof of this eagerness to unveil the inner workings and secrets of the universe is the 13 TeV collision that took place last May: it not only marked unequivocally the complex's expectations for the new operating cycle, but also fired the starting gun for the race that culminated in the discovery of the pentaquarks. At the engineering level, this improvement of the complex's capabilities implies an exponential tightening of the requirements imposed on the systems employed. Consequently, and inevitably, the conditions inside the accelerator shift towards ever more drastic levels; so much so that the radiation levels currently reached significantly restrict personnel access to the accelerator, which translates into longer maintenance and repair times. At present these delays are mitigated through the use of remotely operated mobile robots. Among them, the one known by the acronym T.I.M (Train for RP Survey and visual inspection in LHC) stands out. This train, made up of 5 wagons, travels along the particle accelerator measuring radiation and oxygen levels while providing visual feedback. The present project proposes improving inspection and maintenance tasks by integrating a 6-degree-of-freedom robotic manipulator into one of the wagons of this train. The result is a system able to reach any point of the accelerator in record time and carry out a wide range of maintenance tasks, from simple visual inspections to complex operations such as unscrewing or extracting damaged components. In addition, a second development is proposed to underpin the design: the "construction of a high-fidelity robotic simulator based on ROS and Gazebo". This software tool also serves complementary purposes: it acts as a springboard for future developments aimed at improving the mechanical system, it provides a low-cost tool with which to analyse the integration of new robotic milestones, and it allows the adoption of a new programming paradigm built around ROS to be evaluated.
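As a hedged illustration of how such a simulated manipulator might be commanded from ROS, the sketch below publishes a single joint-space target through a rospy node; the controller topic and joint names are hypothetical placeholders, not the project's actual interface.

```python
#!/usr/bin/env python
# Minimal sketch: command a simulated 6-DOF arm in Gazebo through a
# ROS joint-trajectory topic. Topic and joint names are hypothetical.
import rospy
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

def send_pose(joint_angles, duration=3.0):
    pub = rospy.Publisher('/arm_controller/command', JointTrajectory,
                          queue_size=1, latch=True)
    traj = JointTrajectory()
    traj.joint_names = ['joint_%d' % i for i in range(1, 7)]  # 6 DOF
    point = JointTrajectoryPoint()
    point.positions = list(joint_angles)
    point.time_from_start = rospy.Duration(duration)
    traj.points.append(point)
    rospy.sleep(0.5)          # give the publisher time to connect
    pub.publish(traj)

if __name__ == '__main__':
    rospy.init_node('tim_arm_demo')
    # Move the arm to an arbitrary inspection pose (radians).
    send_pose([0.0, -1.2, 1.0, 0.0, 0.5, 0.0])
```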
Abstract:
Previous studies of photosynthetic acclimation to elevated CO2 have focused on the most recently expanded, sunlit leaves in the canopy. We examined acclimation in a vertical profile of leaves through a canopy of wheat (Triticum aestivum L.). The crop was grown at an elevated CO2 partial pressure of 55 Pa within a replicated field experiment using free-air CO2 enrichment. Gas exchange was used to estimate in vivo carboxylation capacity and the maximum rate of ribulose-1,5-bisphosphate-limited photosynthesis. Net photosynthetic CO2 uptake was measured for leaves in situ within the canopy. Leaf contents of ribulose-1,5-bisphosphate carboxylase/oxygenase (Rubisco), light-harvesting-complex (LHC) proteins, and total N were determined. Elevated CO2 did not affect carboxylation capacity in the most recently expanded leaves but led to a decrease in lower, shaded leaves during grain development. Despite this acclimation, in situ photosynthetic CO2 uptake remained higher under elevated CO2. Acclimation at elevated CO2 was accompanied by decreases in both Rubisco and total leaf N contents and an increase in LHC content. Elevated CO2 led to a larger increase in LHC/Rubisco in lower canopy leaves than in the uppermost leaf. Acclimation of leaf photosynthesis to elevated CO2 therefore depended on both vertical position within the canopy and the developmental stage.
Abstract:
In the dinoflagellate Amphidinium carterae, photoadaptation involves changes in the transcription of genes encoding both of the major classes of light-harvesting proteins, the peridinin chlorophyll a proteins (PCPs) and the major a/c-containing intrinsic light-harvesting proteins (LHCs). PCP and LHC transcript levels were increased up to 86- and 6-fold higher, respectively, under low-light conditions relative to cells grown at high illumination. These increases in transcript abundance were accompanied by decreases in the extent of methylation of CpG and CpNpG motifs within or near PCP- and LHC-coding regions. Cytosine methylation levels in A. carterae are therefore nonstatic and may vary with environmental conditions in a manner suggestive of involvement in the regulation of gene expression. However, chemically induced undermethylation was not sufficient to activate transcription, because treatment with two methylation inhibitors had no effect on PCP mRNA or protein levels. Regulation of gene activity through changes in DNA methylation has traditionally been assumed to be restricted to higher eukaryotes (deuterostomes and green plants); however, the atypically large genomes of dinoflagellates may have generated the requirement for systems of this type in a relatively “primitive” organism. Dinoflagellates may therefore provide a unique perspective on the evolution of eukaryotic DNA-methylation systems.
Azimuthal asymmetry in the risetime of the surface detector signals of the Pierre Auger Observatory.
Abstract:
The azimuthal asymmetry in the risetime of signals in Auger surface detector stations is a source of information on shower development. The azimuthal asymmetry is due to a combination of the longitudinal evolution of the shower and geometrical effects related to the angles of incidence of the particles into the detectors. The magnitude of the effect depends upon the zenith angle and state of development of the shower and thus provides a novel observable, (sec θ)_max, sensitive to the mass composition of cosmic rays above 3 × 10^18 eV. By comparing measurements with predictions from shower simulations, we find for both of our adopted models of hadronic physics (QGSJETII-04 and EPOS-LHC) an indication that the mean cosmic-ray mass increases slowly with energy, as has been inferred from other studies. However, the mass estimates are dependent on the shower model and on the range of distance from the shower core selected. Thus the method has uncovered further deficiencies in our understanding of shower modeling that must be resolved before the mass composition can be inferred from (sec θ)_max.
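A rough sketch of how the (sec θ)_max observable described here can be extracted is given below: the azimuthal modulation of the risetime is fitted in each zenith-angle bin, and the zenith angle at which the asymmetry amplitude peaks is located. The functional forms and starting values are illustrative assumptions, not the Auger analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def azimuthal_model(zeta, a, b):
    """Risetime vs azimuth in the shower plane: t = a + b*cos(zeta)."""
    return a + b * np.cos(zeta)

def asymmetry_amplitude(zeta, risetime):
    """Fit one zenith-angle bin and return the asymmetry ratio b/a."""
    (a, b), _ = curve_fit(azimuthal_model, zeta, risetime, p0=(300.0, 30.0))
    return b / a

def sec_theta_max(ln_sec_theta, asym):
    """Locate the maximum of b/a versus ln(sec theta) with a parabola."""
    c2, c1, c0 = np.polyfit(ln_sec_theta, asym, 2)
    return np.exp(-c1 / (2.0 * c2))   # sec(theta) at the fitted maximum
```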
Abstract:
ALICE is one of the four major experiments at the LHC particle accelerator, installed at the European laboratory CERN. The LHC management committee has recently approved an upgrade program for this experiment. Among the upgrades planned for the coming years of the ALICE experiment are improving the resolution and tracking efficiency while maintaining the excellent particle identification ability, and increasing the read-out event rate to 100 kHz. To achieve this, the Time Projection Chamber (TPC) and Muon tracking chamber (MCH) detectors must be upgraded by modifying their read-out electronics, which are not suitable for this migration. To overcome this limitation, the design, fabrication and experimental testing of a new ASIC named SAMPA has been proposed. This ASIC will support both positive and negative polarities, with 32 channels per chip and continuous data read-out at lower power consumption than previous versions. This work covers the design, fabrication and experimental testing of a read-out front-end in 130 nm CMOS technology with configurable polarity (positive/negative), peaking time and sensitivity, so that the new SAMPA ASIC can be used in both chambers (TPC and MCH). The proposed front-end is composed of a Charge Sensitive Amplifier (CSA) and a semi-Gaussian shaper. To integrate 32 channels per chip, the front-end design requires small area and low power consumption, but at the same time low noise. To this end, a new technique for improving the noise and PSRR (Power Supply Rejection Ratio) of the CSA, without any power or area penalty, is proposed in this work. The analysis and equations of the proposed circuit are presented and were verified by electrical simulations and by experimental tests of a fabricated chip containing 5 channels of the designed front-end. The measured equivalent noise charge was below 550 e− for a sensitivity of 30 mV/fC at an input capacitance of 18.5 pF. The total core area of the front-end was 2300 µm × 150 µm, and the measured total power consumption was 9.1 mW per channel.
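To illustrate the signal chain described in this abstract, the sketch below models a generic CR-(RC)^n semi-Gaussian shaper whose peak amplitude is set by the quoted 30 mV/fC sensitivity; the shaper order and peaking time are assumed values, not the SAMPA design parameters.

```python
import numpy as np

def shaper_output(t, q_in=1.0, sensitivity_mv_per_fc=30.0,
                  peaking_time_ns=160.0, order=4):
    """Voltage pulse (mV) at the shaper output for an input charge q_in (fC).

    The CSA integrates the detector charge; the semi-Gaussian shaper turns
    the step into a pulse v(t) ~ (t/tau)^n * exp(n*(1 - t/tau)), normalized
    so that the peak equals sensitivity * q_in at t = tau.
    """
    tau = peaking_time_ns
    x = np.clip(t / tau, 0.0, None)
    pulse = (x ** order) * np.exp(order * (1.0 - x))
    return sensitivity_mv_per_fc * q_in * pulse

# Peak amplitude check: a 10 fC input should peak at ~300 mV at t = tau.
t = np.linspace(0.0, 1000.0, 2001)          # ns
v = shaper_output(t, q_in=10.0)
print(round(v.max(), 1), "mV at", t[np.argmax(v)], "ns")
```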
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
For the first time, the Z0 boson angular distribution in the center-of-momentum frame is measured in proton-proton collisions at √s = 7 TeV at the CERN LHC. The data sample, recorded with the CMS detector, corresponds to an integrated luminosity of approximately 36 pb−1. Events in which there is a Z0 and at least one jet, with a jet transverse momentum threshold of 20 GeV and absolute jet rapidity less than 2.4, are selected for the analysis. Only the Z0's muon decay channel is studied. Within experimental and theoretical uncertainties, the measured angular distribution is in agreement with next-to-leading order perturbative QCD predictions.
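As a generic illustration of this kind of measurement, the sketch below builds a Z0 candidate from two muons, boosts it into the Z0-plus-jet centre-of-momentum frame and computes the cosine of its polar angle there; the event kinematics are invented and the exact angle convention of the thesis is not reproduced.

```python
import numpy as np

def four_vector(pt, eta, phi, m):
    """(E, px, py, pz) from collider kinematics."""
    px, py = pt * np.cos(phi), pt * np.sin(phi)
    pz = pt * np.sinh(eta)
    e = np.sqrt(px**2 + py**2 + pz**2 + m**2)
    return np.array([e, px, py, pz])

def boost_to_rest_frame(p, frame):
    """Lorentz-boost four-vector p into the rest frame of 'frame'."""
    beta = -frame[1:] / frame[0]
    b2 = beta @ beta
    if b2 == 0.0:
        return p.copy()
    gamma = 1.0 / np.sqrt(1.0 - b2)
    bp = beta @ p[1:]
    e = gamma * (p[0] + bp)
    coeff = gamma**2 / (gamma + 1.0) * bp + gamma * p[0]
    return np.concatenate(([e], p[1:] + coeff * beta))

# Hypothetical event: two muons and one jet (pt [GeV], eta, phi, m [GeV]).
mu_plus  = four_vector(45.0,  0.3,  1.2, 0.105)
mu_minus = four_vector(40.0, -0.1, -1.9, 0.105)
jet      = four_vector(60.0,  1.1,  2.6, 10.0)

z_boson = mu_plus + mu_minus
cm      = z_boson + jet
z_in_cm = boost_to_rest_frame(z_boson, cm)
cos_theta = z_in_cm[3] / np.linalg.norm(z_in_cm[1:])
print(round(cos_theta, 3))
```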
Abstract:
At the Large Hadron Collider (LHC), more than 30 petabytes of collision data are recorded in every year of data taking. Processing these data requires producing a large volume of simulated events using Monte Carlo techniques. In addition, physics analysis requires daily access to derived data formats for hundreds of users. The Worldwide LHC Computing Grid (WLCG) is an international collaboration of scientists and computing centres that has tackled the technological challenges of the LHC, making its scientific programme possible. With data taking continuing and with the recent approval of ambitious projects such as the High-Luminosity LHC, the limits of the current computing capacity will soon be reached. One of the keys to meeting these challenges over the next decade, also in view of the budget constraints of the various national funding agencies, is to optimise the use of the available computing resources. This work aims to develop and evaluate tools that improve the understanding of how both production and analysis data are monitored in CMS. The work therefore consists of two parts. The first, concerning distributed analysis, is the development of a tool that quickly analyses the log files of completed job submissions so that, at the next submission, the user can make better use of the computing resources. The second part, concerning the monitoring of both production and analysis jobs, exploits Big Data technologies to provide a more efficient and flexible monitoring service. A noteworthy aspect of these improvements is the possibility of avoiding a high level of data aggregation already at an early stage, as well as of collecting monitoring data at a high granularity that nevertheless allows later reprocessing and on-demand aggregation.
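A minimal sketch of the first part, the post-mortem analysis of finished-job log files, is given below; the key=value log layout, field names and directory are hypothetical placeholders rather than the CMS log format.

```python
import re
from collections import Counter
from pathlib import Path

LINE_RE = re.compile(r"^(ExitCode|WallTime|Site)\s*=\s*(\S+)")

def summarize_logs(log_dir):
    """Tally exit codes, sites and wall times from finished-job log files."""
    exit_codes, sites, wall_times = Counter(), Counter(), []
    for log_file in Path(log_dir).glob("*.log"):
        fields = {}
        for line in log_file.read_text().splitlines():
            match = LINE_RE.match(line)
            if match:
                fields[match.group(1)] = match.group(2)
        exit_codes[fields.get("ExitCode", "unknown")] += 1
        sites[fields.get("Site", "unknown")] += 1
        if "WallTime" in fields:
            wall_times.append(float(fields["WallTime"]))
    avg_wall = sum(wall_times) / len(wall_times) if wall_times else 0.0
    return exit_codes, sites, avg_wall

if __name__ == "__main__":
    codes, sites, avg_wall = summarize_logs("finished_jobs")
    print("most common exit codes:", codes.most_common(3))
    print("jobs per site:", sites.most_common(3))
    print("average wall time [s]:", round(avg_wall, 1))
```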
Abstract:
A search for new heavy resonances decaying to boson pairs (WZ, WW or ZZ) using 20.3 inverse femtobarns of proton-proton collision data at a center-of-mass energy of 8 TeV is presented. The data were recorded by the ATLAS detector at the Large Hadron Collider (LHC) in 2012. The analysis combines several search channels with leptonic, semi-leptonic and fully hadronic final states. The diboson invariant mass spectrum is studied for local excesses above the Standard Model background prediction, and no significant excess is observed in the combined analysis. 95% confidence limits are set on the cross section times branching ratios for three signal models: an extended gauge model with a heavy W boson, a bulk Randall-Sundrum model with a spin-2 graviton, and a simplified model with a heavy vector triplet. Among the individual search channels, the fully hadronic channel is presented in the most detail; it relies on boson-tagging techniques and jet-substructure cuts. Local excesses are found in the dijet mass distribution around 2 TeV, corresponding to a global significance of 2.5 standard deviations. This deviation from the Standard Model prediction has prompted many theoretical explanations, and the possibilities can be explored further using LHC Run 2 data.
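As a hedged illustration of the local-versus-global significance language used above, the sketch below converts between one-sided p-values and significances and applies a naive trials factor; the numbers are invented and the actual look-elsewhere correction in the analysis is more sophisticated.

```python
from scipy.stats import norm

def significance_to_pvalue(z):
    """One-sided p-value corresponding to a significance of z sigma."""
    return norm.sf(z)

def pvalue_to_significance(p):
    """Significance in sigma corresponding to a one-sided p-value."""
    return norm.isf(p)

local_z = 3.4              # hypothetical local excess
p_local = significance_to_pvalue(local_z)
trials = 30                # hypothetical number of independent mass bins
p_global = 1.0 - (1.0 - p_local) ** trials
print("local: %.1f sigma, global: %.1f sigma"
      % (local_z, pvalue_to_significance(p_global)))
```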
Abstract:
Searches for the supersymmetric partner of the top quark (stop) are motivated by natural supersymmetry, where the stop has to be light to cancel the large radiative corrections to the Higgs boson mass. This thesis presents three different searches for the stop at √s = 8 TeV and √s = 13 TeV using data from the ATLAS experiment at CERN’s Large Hadron Collider. The thesis also includes a study of the primary vertex reconstruction performance in data and simulation at √s = 7 TeV using tt and Z events. All stop searches presented are carried out in final states with a single lepton, four or more jets and large missing transverse energy. A search for direct stop pair production is conducted with 20.3 fb−1 of data at a center-of-mass energy of √s = 8 TeV. Several stop decay scenarios are considered, including those to a top quark and the lightest neutralino and to a bottom quark and the lightest chargino. The sensitivity of the analysis is also studied in the context of various phenomenological MSSM models in which more complex decay scenarios can be present. Two different analyses are carried out at √s = 13 TeV. The first one is a search for both gluino-mediated and direct stop pair production with 3.2 fb−1 of data while the second one is a search for direct stop pair production with 13.2 fb−1 of data in the decay scenario to a bottom quark and the lightest chargino. The results of the analyses show no significant excess over the Standard Model predictions in the observed data. Consequently, exclusion limits are set at 95% CL on the masses of the stop and the lightest neutralino.
Abstract:
The aim of this thesis is to estimate the performance of the ALICE detector in the reconstruction of the Lambda_c baryon in Pb-Pb collisions, using an innovative approach to particle identification. The main idea of the new approach is to replace the usual particle selection, based on cuts applied to the detector signals, with a selection that uses the probabilities derived from Bayes' theorem (for this reason it is called "Bayesian weighting"). To establish which method is the most efficient, a comparison with other standard approaches used in ALICE is presented. For this purpose a "fast" Monte Carlo simulation was implemented, configured with the particle abundances expected in the new LHC energy regime and with the observed performance of the detector. A realistic estimate of Lambda_c production was then derived by combining known results from previous experiments, and this was used to estimate the significance attainable with the statistics of LHC Run 2 and Run 3. The physics of ALICE is described, including the Standard Model, quantum chromodynamics and the quark-gluon plasma; some recent experimental results (RHIC and LHC) are then discussed; the operation of ALICE and its components is described; and finally the results obtained are analysed. These show that the method has a higher efficiency than the usual approaches in ALICE and that, consequently, a "full" simulation should be run to quantify the performance of the new method even better, so as to verify the results obtained in a fully realistic scenario.
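A minimal sketch of the Bayesian weighting idea described above follows: per-species detector-response likelihoods are combined with prior abundances through Bayes' theorem. The Gaussian response model, priors and signal values are made-up placeholders, not ALICE's calibration.

```python
import numpy as np

SPECIES = ["pion", "kaon", "proton"]
PRIORS = np.array([0.80, 0.12, 0.08])          # hypothetical abundances

def likelihoods(measured_signal, expected, resolution):
    """Gaussian probability of the measured signal for each species."""
    z = (measured_signal - expected) / resolution
    return np.exp(-0.5 * z**2) / (np.sqrt(2.0 * np.pi) * resolution)

def bayes_probabilities(measured_signal, expected, resolution):
    """Posterior probability for each species given one detector signal."""
    post = PRIORS * likelihoods(measured_signal, expected, resolution)
    return post / post.sum()

# Hypothetical dE/dx-like signal with per-species expectations and resolutions.
expected = np.array([50.0, 58.0, 75.0])
resolution = np.array([4.0, 4.0, 5.0])
probs = bayes_probabilities(56.0, expected, resolution)
for name, p in zip(SPECIES, probs):
    print(f"{name}: {p:.2f}")
```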
Abstract:
Heading into the 2020s, Physics and Astronomy are undergoing experimental revolutions that will reshape our picture of the fabric of the Universe. The Large Hadron Collider (LHC), the largest particle physics project in the world, produces 30 petabytes of data annually that need to be sifted through, analysed, and modelled. In astrophysics, the Large Synoptic Survey Telescope (LSST) will be taking a high-resolution image of the full sky every 3 days, leading to data rates of 30 terabytes per night over ten years. These experiments endeavour to answer the question of why 96% of the content of the universe currently eludes our physical understanding. Both the LHC and LSST share the 5-dimensional nature of their data, with position, energy and time being the fundamental axes. This talk presents an overview of the experiments and the data they gather, and outlines the challenges in extracting information. The strategies employed are very similar to industrial data-science problems (e.g., data filtering, machine learning, statistical interpretation) and provide a seed for the exchange of knowledge between academia and industry. Speaker biography: Mark Sullivan is a Professor of Astrophysics in the Department of Physics and Astronomy. Mark completed his PhD at Cambridge and, following postdoctoral study in Durham, Toronto and Oxford, now leads a research group at Southampton studying dark energy using exploding stars called "type Ia supernovae". Mark has many years' experience of research that involves repeatedly imaging the night sky to track the arrival of transient objects, involving significant challenges in data handling, processing, classification and analysis.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08