863 results for Event Generator
Abstract:
We report on the event structure and double-helicity asymmetry (A_LL) of jet production in longitudinally polarized p+p collisions at √s = 200 GeV. Photons and charged particles were measured by the PHENIX experiment at midrapidity |η| < 0.35 with the requirement of a high-momentum (> 2 GeV/c) photon in the event. Event-structure variables, such as multiplicity, pT density, and thrust within the PHENIX acceptance, were measured and compared with the results from the PYTHIA event generator and the GEANT detector simulation. The shapes of jets and the underlying event were well reproduced at this collision energy. For the measurement of jet A_LL, photons and charged particles were clustered with a seed-cone algorithm to obtain the cluster pT sum (pT^reco). The effect of detector response and of the underlying event on pT^reco was evaluated with the simulation. The production rate of reconstructed jets is satisfactorily reproduced by the next-to-leading-order perturbative-quantum-chromodynamics jet production cross section. For 4 < pT^reco < 12 GeV/c, with an average beam polarization of ⟨P⟩ = 49%, we measured A_LL = -0.0014 ± 0.0037(stat) in the lowest pT^reco bin (4-5 GeV/c) and -0.0181 ± 0.0282(stat) in the highest pT^reco bin (10-12 GeV/c), with a beam-polarization scale error of 9.4% and a pT scale error of 10%. Jets in the measured pT^reco range arise primarily from hard-scattered gluons with momentum fraction 0.02 < x < 0.3 according to PYTHIA. The measured A_LL is compared with predictions that assume various ΔG(x) distributions based on the Glück-Reya-Stratmann-Vogelsang parameterization. The present result imposes the limit -1.1 < ∫_{0.02}^{0.3} dx ΔG(x, μ² = 1 GeV²) < 0.4 at 95% confidence level, or ∫_{0.02}^{0.3} dx ΔG(x, μ² = 1 GeV²) < 0.5 at 99% confidence level.
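The asymmetry extraction sketched in this abstract lends itself to a compact illustration. Below is a minimal, hedged Python sketch of a double-helicity asymmetry computed from spin-sorted yields and a relative luminosity, a standard construction in polarized-collision analyses; it is not PHENIX analysis code, and all names and toy numbers are hypothetical.

```python
import numpy as np

def double_helicity_asymmetry(n_same, n_opp, rel_lumi, pol_blue, pol_yellow):
    """A_LL from jet yields in same- and opposite-helicity bunch crossings.

    rel_lumi is the relative luminosity R = L_same / L_opp; the beam
    polarizations enter as the product P_B * P_Y (e.g. ~0.49 each here).
    """
    denom = n_same + rel_lumi * n_opp
    raw = (n_same - rel_lumi * n_opp) / denom
    a_ll = raw / (pol_blue * pol_yellow)
    # Statistical error from Poisson yields only (luminosity and
    # polarization scale uncertainties are quoted separately in the text).
    stat = (2.0 * rel_lumi * np.sqrt(n_same * n_opp * (n_same + n_opp))
            / denom**2 / (pol_blue * pol_yellow))
    return a_ll, stat

# Toy yields, not the published ones:
print(double_helicity_asymmetry(1.02e6, 1.00e6, 1.0, 0.49, 0.49))
```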
Abstract:
We present a generator for single top-quark production via flavour-changing neutral currents. The MEtop event generator allows for next-to-leading-order direct top production pp → t and leading-order production of several other single-top processes. A few packages with definite sets of dimension-six operators are available. We discuss how to improve the bounds on the effective operators and how well new physics can be probed with each set of independent dimension-six operators.
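For orientation, a common parameterization of the strong FCNC vertex behind direct top production (an assumption for illustration; MEtop's actual operator bases may differ) is the dimension-six chromomagnetic operator:

```latex
\mathcal{L}_{\mathrm{FCNC}} \;=\; \frac{g_s}{\Lambda}\,\kappa_{gqt}\,
\bar{q}\,\sigma^{\mu\nu}\,\frac{\lambda^a}{2}\,t\,G^a_{\mu\nu} \;+\; \mathrm{h.c.}
```

Here Λ is the new-physics scale and κ_{gqt} the dimensionless coupling whose bounds the paper discusses.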
Abstract:
Measurements are presented of the production of primary K_S^0 and Λ particles in proton-proton collisions at √s = 7 TeV in the region transverse to the leading charged-particle jet in each event. The average multiplicity and average scalar transverse-momentum sum of K_S^0 and Λ particles measured at pseudorapidities |η| < 2 rise with increasing charged-particle jet pT in the range 1-10 GeV/c and saturate in the region 10-50 GeV/c. The rise and saturation of the strange-particle yields and transverse-momentum sums in the underlying event are similar to those observed for inclusive charged particles, which confirms the impact-parameter picture of multiple parton interactions. The results are compared to recent tunes of the PYTHIA Monte Carlo event generator. The PYTHIA simulations underestimate the data by 15%-30% for K_S^0 mesons and by about 50% for Λ baryons, a deficit similar to that observed for inclusive strange-particle production in non-single-diffractive proton-proton collisions. The constant strange- to charged-particle activity ratios with respect to the leading jet pT and the similar trends for mesons and baryons indicate that the multiparton-interaction dynamics is decoupled from parton hadronization, which occurs at a later stage. © 2013 CERN, for the CMS Collaboration. Published by the American Physical Society under the terms of the Creative Commons Attribution 3.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.
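The "transverse to the leading jet" region used above is conventionally the azimuthal wedge 60° < |Δφ| < 120° relative to the leading charged-particle jet. A minimal sketch under that assumption (not CMS code; names are illustrative):

```python
import math

def transverse_region_activity(particles, phi_leading_jet):
    """Multiplicity and scalar pT sum in the underlying-event transverse region.

    particles: iterable of (pt, phi) for charged particles or V0 candidates
    phi_leading_jet: azimuth of the leading charged-particle jet
    Returns (multiplicity, scalar pT sum) for 60 deg < |dphi| < 120 deg.
    """
    n, pt_sum = 0, 0.0
    for pt, phi in particles:
        dphi = abs(phi - phi_leading_jet)
        if dphi > math.pi:
            dphi = 2.0 * math.pi - dphi  # wrap into [0, pi]
        if math.pi / 3.0 < dphi < 2.0 * math.pi / 3.0:
            n += 1
            pt_sum += pt
    return n, pt_sum
```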
Abstract:
Catastrophe risk models used by the insurance industry are likely subject to significant uncertainty, but due to their proprietary nature and strict licensing conditions they are not available for experimentation. In addition, even if such experiments were conducted, they would not be repeatable by other researchers because commercial confidentiality issues prevent the details of proprietary catastrophe model structures from being described in public-domain documents. However, such experimentation is urgently required to improve decision making in both insurance and reinsurance markets. In this paper we therefore construct our own catastrophe risk model for flooding in Dublin, Ireland, in order to assess the impact of typical precipitation data uncertainty on loss predictions. As we consider only a city region rather than a whole territory, and have access to detailed data and computing resources typically unavailable to industry modellers, our model is significantly more detailed than most commercial products. The model consists of four components: a stochastic rainfall module, a hydrological and hydraulic flood hazard module, a vulnerability module, and a financial loss module. Using these we undertake a series of simulations to test the impact of driving the stochastic event generator with four different rainfall data sets: ground gauge data, gauge-corrected rainfall radar, meteorological reanalysis data (European Centre for Medium-Range Weather Forecasts Reanalysis-Interim, ERA-Interim) and a satellite rainfall product (the Climate Prediction Center morphing method, CMORPH). Catastrophe models are unusual because they use the upper three components of the modelling chain to generate a large synthetic database of unobserved and severe loss-driving events for which estimated losses are calculated. We find the loss estimates to be more sensitive to uncertainties propagated from the driving precipitation data sets than to other uncertainties in the hazard and vulnerability modules, suggesting that the range of uncertainty within catastrophe model structures may be greater than commonly believed.
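The four-module chain described above can be caricatured in a few lines. This is a deliberately toy sketch of the structure (stochastic rainfall → hazard → vulnerability → loss), not the authors' Dublin model: every function, curve, and number below is a hypothetical placeholder.

```python
import random

def sample_rainfall_event(rng):
    # Stochastic rainfall module: draw a heavy-tailed synthetic storm depth (mm)
    return rng.lognormvariate(3.0, 0.8)

def flood_depth(rain_mm):
    # Hazard module stand-in: map rainfall above a threshold to inundation depth (m)
    return max(0.0, (rain_mm - 40.0) / 100.0)

def damage_fraction(depth_m):
    # Vulnerability module stand-in: simple depth-damage curve, saturating at 1
    return min(1.0, 0.3 * depth_m)

def event_loss(depth_m, exposure_eur=1e9, deductible_eur=5e6):
    # Financial module stand-in: ground-up loss minus a deductible
    gross = damage_fraction(depth_m) * exposure_eur
    return max(0.0, gross - deductible_eur)

rng = random.Random(42)
losses = [event_loss(flood_depth(sample_rainfall_event(rng))) for _ in range(10000)]
print("expected loss per synthetic event (toy):", sum(losses) / len(losses))
```

Swapping the argument of sample_rainfall_event for distributions fitted to the four rainfall data sets is, in spirit, the sensitivity experiment the paper performs.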
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
A measurement of the exclusive two-photon production of muon pairs in proton-proton collisions at √s = 7 TeV, pp → pμ⁺μ⁻p, is reported using data corresponding to an integrated luminosity of 40 pb⁻¹. For muon pairs with invariant mass greater than 11.5 GeV, transverse momentum pT(μ) > 4 GeV and pseudorapidity |η(μ)| < 2.1, a fit to the dimuon pT(μ⁺μ⁻) distribution results in a measured cross section of σ(pp → pμ⁺μ⁻p) = 3.38 +0.58/−0.55 (stat.) ± 0.16 (syst.) ± 0.14 (lumi.) pb, consistent with the theoretical prediction evaluated with the event generator LPAIR. The ratio to the predicted cross section is 0.83 +0.14/−0.13 (stat.) ± 0.04 (syst.) ± 0.03 (lumi.). The characteristic distributions of the muon pairs produced via γγ fusion, such as the muon acoplanarity, the muon-pair invariant mass and the transverse momentum, agree with those from the theory.
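Among the characteristic distributions mentioned, the muon acoplanarity has a simple conventional definition, 1 − |Δφ(μ⁺μ⁻)|/π, which is close to zero for the back-to-back pairs typical of γγ fusion. A minimal sketch assuming that convention (names are illustrative):

```python
import math

def acoplanarity(phi_mu_plus, phi_mu_minus):
    """1 - |dphi|/pi: near 0 for back-to-back exclusive dimuon pairs."""
    dphi = abs(phi_mu_plus - phi_mu_minus)
    if dphi > math.pi:
        dphi = 2.0 * math.pi - dphi  # wrap into [0, pi]
    return 1.0 - dphi / math.pi

print(acoplanarity(0.1, 0.1 + math.pi))  # exactly back-to-back -> 0.0
```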
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
A measurement of the multi-strange Ξ⁻ and Ω⁻ baryons and their antiparticles by the ALICE experiment at the CERN Large Hadron Collider (LHC) is presented for inelastic proton-proton collisions at a centre-of-mass energy of 7 TeV. The transverse-momentum (pT) distributions were studied at mid-rapidity (|y| < 0.5) in the range 0.6 < pT < 8.5 GeV/c for Ξ⁻ and anti-Ξ⁺ baryons, and in the range 0.8 < pT < 5 GeV/c for Ω⁻ and anti-Ω⁺. Baryons and antibaryons were measured as separate particles, and we find that the baryon-to-antibaryon ratio of both particle species is consistent with unity over the entire range of the measurement. The statistical precision of the current data has allowed us to measure a difference between the mean pT of Ξ⁻ (anti-Ξ⁺) and Ω⁻ (anti-Ω⁺). Particle yields, mean pT, and the spectra in the intermediate pT range are not well described by the PYTHIA Monte Carlo event generator with the Perugia 2011 tune, which has been tuned to reproduce the early LHC data. The discrepancy is largest for Ω⁻ (anti-Ω⁺). This PYTHIA tune approaches the pT spectra of Ξ⁻ and anti-Ξ⁺ baryons below pT < 0.85 GeV/c and describes the Ξ⁻ and anti-Ξ⁺ spectra above pT > 6.0 GeV/c. We also illustrate the difference between the experimental data and the model by comparing the corresponding ratios of (Ω⁻ + anti-Ω⁺)/(Ξ⁻ + anti-Ξ⁺) as a function of transverse mass. © 2012 CERN. Published by Elsevier B.V. All rights reserved.
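The transverse-mass variable used for the (Ω⁻ + anti-Ω⁺)/(Ξ⁻ + anti-Ξ⁺) ratio has the standard definition m_T = √(m² + pT²). A hedged sketch (yields and function names are illustrative, not ALICE code):

```python
import math

def transverse_mass(mass_gev, pt_gev):
    """m_T = sqrt(m^2 + pT^2), in GeV/c^2."""
    return math.sqrt(mass_gev ** 2 + pt_gev ** 2)

M_XI, M_OMEGA = 1.32171, 1.67245  # PDG masses in GeV/c^2

def omega_to_xi_ratio(n_omega, n_omega_bar, n_xi, n_xi_bar):
    # Ratio of summed particle + antiparticle yields in one m_T bin
    return (n_omega + n_omega_bar) / (n_xi + n_xi_bar)
```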
Abstract:
The electromagnetic form factors of the proton are fundamental quantities sensitive to the distribution of charge and magnetization inside the proton. Precise knowledge of the form factors, in particular of the charge and magnetization radii, provides strong tests for theory in the non-perturbative regime of QCD. However, the existing data at Q^2 below 1 (GeV/c)^2 are not precise enough for a hard test of theoretical predictions.

For a more precise determination of the form factors, within this work more than 1400 cross sections of the reaction H(e,e′)p were measured at the Mainz Microtron MAMI using the 3-spectrometer facility of the A1 collaboration. The data were taken in three periods in the years 2006 and 2007 using beam energies of 180, 315, 450, 585, 720 and 855 MeV. They cover the Q^2 region from 0.004 to 1 (GeV/c)^2 with counting-rate uncertainties below 0.2% for most of the data points. The relative luminosity of the measurements was determined using one of the spectrometers as a luminosity monitor. The overlapping acceptances of the measurements maximize the internal redundancy of the data and allow, together with several additions to the standard experimental setup, for tight control of systematic uncertainties.

To account for the radiative processes, an event generator was developed and implemented in the simulation package of the analysis software; it works without the peaking approximation by explicitly calculating the Bethe-Heitler and Born Feynman diagrams for each event.

To separate the form factors and to determine the radii, the data were analyzed by fitting a wide selection of form-factor models directly to the measured cross sections. These fits also determined the absolute normalization of the different data subsets. The validity of this method was tested with extensive simulations. The results were compared to an extraction via the standard Rosenbluth technique.

The dip structure in G_E that was seen in the analysis of the previous world data shows up in a modified form. When compared to the standard-dipole form factor as a smooth curve, the extracted G_E exhibits a strong change of slope around 0.1 (GeV/c)^2, and in the magnetic form factor a dip around 0.2 (GeV/c)^2 is found. This may be taken as an indication for a pion cloud. For higher Q^2, the fits yield larger values for G_M than previous measurements, in agreement with form-factor ratios from recent precise polarized measurements in the Q^2 region up to 0.6 (GeV/c)^2.

The charge and magnetic rms radii are determined as
⟨r_e⟩ = 0.879 ± 0.005(stat.) ± 0.004(syst.) ± 0.002(model) ± 0.004(group) fm,
⟨r_m⟩ = 0.777 ± 0.013(stat.) ± 0.009(syst.) ± 0.005(model) ± 0.002(group) fm.
This charge radius is significantly larger than theoretical predictions and than the radius of the standard dipole. However, it is in agreement with earlier results measured at the Mainz linear accelerator and with determinations from hydrogen Lamb-shift measurements. The extracted magnetic radius is smaller than previous determinations and than the standard-dipole value.
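The standard Rosenbluth technique referenced above separates the form factors through the linearity of the reduced cross section in the virtual-photon polarization ε: σ_red = ε G_E²(Q²) + τ G_M²(Q²) with τ = Q²/(4M_p²), so a straight-line fit at fixed Q² yields G_E² (slope) and τ G_M² (intercept). A minimal sketch of that textbook separation (not the thesis fitting code; toy inputs):

```python
import numpy as np

M_P = 0.93827  # proton mass in GeV/c^2

def rosenbluth_separation(eps, sigma_red, q2):
    """Linear fit sigma_red = GE^2 * eps + tau * GM^2 at fixed Q^2 [(GeV/c)^2]."""
    tau = q2 / (4.0 * M_P ** 2)
    slope, intercept = np.polyfit(eps, sigma_red, 1)
    ge = np.sqrt(slope)            # slope     = GE^2
    gm = np.sqrt(intercept / tau)  # intercept = tau * GM^2
    return ge, gm

# Toy inputs at Q^2 = 0.3 (GeV/c)^2 with dipole-like GE = GM/mu_p = 0.75:
eps = np.array([0.2, 0.5, 0.8])
tau = 0.3 / (4 * M_P ** 2)
sig = 0.75 ** 2 * eps + tau * (2.793 * 0.75) ** 2
print(rosenbluth_separation(eps, sig, 0.3))  # recovers (0.75, 2.09...)
```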
Abstract:
In this thesis, we develop high-precision tools for the simulation of slepton pair production processes at hadron colliders and apply them to phenomenological studies at the LHC. Our approach is based on the POWHEG method for matching next-to-leading-order results in perturbation theory to parton showers. We calculate matrix elements for slepton pair production and for the production of a slepton pair in association with a jet perturbatively at next-to-leading order in supersymmetric quantum chromodynamics. Both processes are subsequently implemented in the POWHEG BOX, a publicly available software tool that contains general parts of the POWHEG matching scheme. We investigate phenomenological consequences of our calculations in several setups that respect experimental exclusion limits for supersymmetric particles and provide precise predictions for slepton signatures at the LHC. The inclusion of QCD emissions in the partonic matrix elements allows for an accurate description of hard jets. Interfacing our codes to the multi-purpose Monte Carlo event generator PYTHIA, we simulate parton showers and slepton decays in fully exclusive events. Advanced kinematical variables and specific search strategies are examined as means for slepton discovery in experimentally challenging setups.
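As a purely illustrative aside (not the thesis code), a typical slepton search selection on fully exclusive events asks for two opposite-sign, same-flavour leptons plus sizable missing transverse momentum; the event fields below are hypothetical:

```python
def passes_slepton_selection(event, pt_min=20.0, met_min=100.0):
    """Toy selection: two opposite-sign same-flavour leptons + large MET.

    `event` is assumed to expose hypothetical fields:
      event.leptons : list of (pt [GeV], flavour, charge)
      event.met     : missing transverse momentum in GeV
    """
    hard = [l for l in event.leptons if l[0] > pt_min]
    if len(hard) != 2:
        return False
    (pt1, fl1, q1), (pt2, fl2, q2) = hard
    sf_os = (fl1 == fl2) and (q1 * q2 < 0)  # same flavour, opposite sign
    return sf_os and event.met > met_min
```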
Abstract:
During a project, managers encounter numerous contingencies and face the challenging task of making decisions that will effectively keep the project on track. This task is very challenging because construction projects are non-prototypical and their processes are irreversible. Therefore, it is critical to apply a methodological approach to develop a few alternative management decision strategies during the planning phase, which can be deployed to manage alternative scenarios resulting from expected and unexpected disruptions to the as-planned schedule. Such a methodology should have the following features, which are missing from the existing research: (1) looking at the effects of local decisions on the global project outcomes; (2) studying how a schedule responds to decisions and disruptive events, because the risk in a schedule is a function of the decisions made; (3) establishing a method to assess and improve the management decision strategies; and (4) developing project-specific decision strategies, because each construction project is unique and the lessons from a particular project cannot be easily applied to projects with different contexts. The objective of this dissertation is to develop a schedule-based simulation framework to design, assess, and improve sequences of decisions for the execution stage. The contribution of this research is the introduction of decision strategies to manage a project and the establishment of an iterative methodology to continuously assess and improve decision strategies and schedules. Project managers or schedulers can implement the methodology to develop and identify schedules, accompanied by suitable decision strategies, to manage a project at the planning stage. The developed methodology also lays the foundation for an algorithm for continuously and automatically generating satisfactory schedules and strategies throughout the construction life of a project. Different from studying isolated daily decisions, the proposed framework introduces the notion of decision strategies to manage the construction process. A decision strategy is a sequence of interdependent decisions determined by resource allocation policies for labor, material, equipment, and space. The schedule-based simulation framework consists of two parts: experiment design and result assessment. The core of the experiment design is the establishment of an iterative method to test and improve decision strategies and schedules, which is based on the introduction of decision strategies and the development of a schedule-based simulation testbed. The simulation testbed used is the Interactive Construction Decision Making Aid (ICDMA). ICDMA has an emulator, developed previously, that duplicates the construction process, and a random event generator that allows the decision-maker to respond to disruptions in the emulation. It is used to study how the schedule responds to these disruptions and to the corresponding decisions made over the duration of the project, while accounting for cascading impacts and dependencies between activities. The dissertation is organized into two parts. The first part presents the existing research, identifies the departure points of this work, and develops a schedule-based simulation framework to design, assess, and improve decision strategies. In the second part, the proposed schedule-based simulation framework is applied to investigate specific research problems.
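The interplay of a random event generator with a decision strategy can be caricatured as follows. This is a toy stand-in, not ICDMA (which is interactive); the policy, numbers, and activity names are all hypothetical:

```python
import random

def simulate_schedule(activities, policy, rng, disruption_prob=0.1):
    """Run activities in order; a random event generator injects delays,
    and a decision strategy (policy) chooses a response per disruption.

    activities: list of (name, planned_duration_days)
    policy: function(name, delay_days) -> days recovered, a stand-in for
            resource allocation policies (labor, equipment, space)
    """
    total = 0.0
    for name, duration in activities:
        delay = rng.uniform(1, 5) if rng.random() < disruption_prob else 0.0
        recovered = policy(name, delay)  # e.g. add labour to claw back time
        total += duration + max(0.0, delay - recovered)
    return total

# Toy strategy: recover aggressively only on critical-path activities
crash_critical = lambda name, delay: delay * (0.8 if name.startswith("crit") else 0.2)
rng = random.Random(7)
plan = [("crit-foundation", 20), ("framing", 30), ("crit-roof", 15)]
print("simulated duration (days):", simulate_schedule(plan, crash_critical, rng))
```

Repeating such runs for competing strategies and comparing the outcome distributions mirrors, in miniature, the iterative assess-and-improve loop the dissertation proposes.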
Abstract:
Starting from the definition of landscape as the construction of a virtual space from a referential position, this research is an immersion in listening as a specific way of looking in order to carry that construction out. The research focuses on the search for a series of strategies: tools of interpretation and models for analysing reality that can interact with the spatial environment. By studying the territory through parameters linked to the dynamic energies present in it, the aim is to provide a new reading, a transcription of the sound event that generates the being of a place, employing the architectural language and spatio-temporal notation. The Pointe du Hoc cliff is taken as a first case study for the proposed analysis of the sound dimension of the territory, through the fabrication of spatial intermediaries, which constitute a model applicable to other places whose identity and sound energy form a significant feature of their existence. KEYWORDS: Soundscape, sound, spatial intermediary, territory