41 results for distributed system data-grid cloud computing CERN LHC Hazelcast Elasticsearch
Refined Physical Retrieval of Integrated Water Vapor and Cloud Liquid for Microwave Radiometer Data
Abstract:
Measurements are presented of production properties and couplings of the recently discovered Higgs boson using the decays into boson pairs, H → γγ, H → ZZ* → 4 leptons and H → WW* → 2 leptons + 2 neutrinos. The results are based on the complete pp collision data sample recorded by the ATLAS experiment at the CERN Large Hadron Collider at centre-of-mass energies of 7 TeV and 8 TeV, corresponding to an integrated luminosity of about 25 fb−1. Evidence for Higgs boson production through vector-boson fusion is reported. Results of combined fits probing Higgs boson couplings to fermions and bosons, as well as anomalous contributions to loop-induced production and decay modes, are presented. All measurements are consistent with expectations for the Standard Model Higgs boson.
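The "combined fits" mentioned above are, at their core, maximum-likelihood fits of a signal-strength parameter μ, the ratio of the observed yield to the Standard Model expectation (μ = 1 is the SM). A minimal sketch for a single counting channel, with purely illustrative yields (not ATLAS numbers) and an assumed Poisson model:

```python
from scipy.stats import poisson
from scipy.optimize import minimize_scalar

def nll(mu, n_obs, s_exp, b_exp):
    """Negative log-likelihood of one counting channel with signal
    strength mu: n_obs ~ Poisson(mu * s_exp + b_exp)."""
    return -poisson.logpmf(n_obs, mu * s_exp + b_exp)

# Illustrative inputs only: 120 events observed, 100 expected background,
# 25 expected Standard Model Higgs signal events.
n_obs, s_exp, b_exp = 120, 25.0, 100.0

fit = minimize_scalar(nll, bounds=(0.0, 5.0),
                      args=(n_obs, s_exp, b_exp), method="bounded")
print(f"best-fit signal strength mu = {fit.x:.2f}")  # (120 - 100) / 25 = 0.8
```

The actual ATLAS fits combine many such channels (a product of per-channel likelihoods) and profile systematic uncertainties.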
Abstract:
The large difference between the Planck scale and the electroweak scale, known as the hierarchy problem, is addressed in certain models through the postulate of extra spatial dimensions. A search for evidence of extra spatial dimensions in the diphoton channel has been performed using the full set of proton-proton collisions at √s = 7 TeV recorded in 2011 with the ATLAS detector at the CERN Large Hadron Collider. This dataset corresponds to an integrated luminosity of 4.9 fb−1. The diphoton invariant mass spectrum is observed to be in good agreement with the Standard Model expectation. In the context of the model proposed by Arkani-Hamed, Dimopoulos and Dvali, 95% confidence level lower limits of between 2.52 and 3.92 TeV are set on the ultraviolet cutoff scale M_S, depending on the number of extra dimensions and the theoretical formalism used. In the context of the Randall-Sundrum model, a lower limit of 2.06 (1.00) TeV at 95% confidence level is set on the mass of the lightest graviton for couplings of k/M̄_Pl = 0.1 (0.01). Combining with the ATLAS dilepton searches based on the 2011 data, the 95% confidence level lower limit on the Randall-Sundrum graviton mass is further tightened to 2.23 (1.03) TeV for k/M̄_Pl = 0.1 (0.01).
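For context on how such 95% confidence-level lower limits arise: a signal hypothesis is excluded when a statistical test against the observed data falls below 5%, and the mass limit sits where the exclusion stops. ATLAS uses a profile-likelihood CLs machinery; the toy below sketches only the CLs idea for a single counting experiment, with invented yields:

```python
from scipy.stats import poisson

def cls(n_obs, s_exp, b_exp):
    """CLs for a counting experiment: P(N <= n_obs | s+b) / P(N <= n_obs | b).
    The signal hypothesis is excluded at 95% CL when CLs < 0.05."""
    return poisson.cdf(n_obs, s_exp + b_exp) / poisson.cdf(n_obs, b_exp)

# Invented yields: the predicted graviton signal falls with mass, so low
# masses are excluded and the lower limit sits at the first allowed point.
n_obs, b_exp = 9, 10.0
for mass, s_exp in [(1.5, 30.0), (2.0, 12.0), (2.5, 5.0), (3.0, 2.0)]:
    value = cls(n_obs, s_exp, b_exp)
    print(f"m = {mass} TeV: CLs = {value:.3f}",
          "excluded" if value < 0.05 else "allowed")
```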
Abstract:
This paper presents the electron and photon energy calibration achieved with the ATLAS detector using about 25 fb−1 of LHC proton–proton collision data taken at centre-of-mass energies of √s = 7 and 8 TeV. The reconstruction of electron and photon energies is optimised using multivariate algorithms. The response of the calorimeter layers is equalised in data and simulation, and the longitudinal profile of the electromagnetic showers is exploited to estimate the passive material in front of the calorimeter and reoptimise the detector simulation. After all corrections, the Z resonance is used to set the absolute energy scale. For electrons from Z decays, the achieved calibration is typically accurate to 0.05% in most of the detector acceptance, rising to 0.2% in regions with large amounts of passive material. The remaining inaccuracy is less than 0.2–1% for electrons with a transverse energy of 10 GeV, and is on average 0.3% for photons. The detector resolution is determined with a relative inaccuracy of less than 10% for electrons and photons up to 60 GeV transverse energy, rising to 40% for transverse energies above 500 GeV.
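The final calibration step above, fixing the absolute scale with the Z resonance, amounts to extracting a scale factor α from the position of the Z → ee mass peak in data relative to simulation and correcting energies by E → E/(1 + α). A heavily simplified sketch: a median stands in for the real lineshape fit, and all numbers are invented:

```python
import numpy as np

def scale_factor(m_data, m_mc):
    """alpha such that E -> E / (1 + alpha) aligns the data Z -> ee peak
    with the simulated one (crude peak estimate via the median; the real
    calibration fits the Z lineshape per detector region)."""
    return np.median(m_data) / np.median(m_mc) - 1.0

rng = np.random.default_rng(0)
m_mc = rng.normal(91.19, 2.5, 10_000)            # simulated dielectron mass peak
m_data = rng.normal(91.19 * 1.003, 2.5, 10_000)  # data with a 0.3% scale offset

alpha = scale_factor(m_data, m_mc)
print(f"alpha = {alpha:.4f}")  # ~0.003; apply E_corrected = E / (1 + alpha)
```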
Abstract:
This paper presents the performance of the ATLAS muon reconstruction during the LHC run with pp collisions at √s = 7–8 TeV in 2011–2012, focusing mainly on data collected in 2012. Measurements of the reconstruction efficiency and of the momentum scale and resolution, based on large reference samples of J/ψ → μμ, Z → μμ and ϒ → μμ decays, are presented and compared to Monte Carlo simulations. Corrections to the simulation, to be used in physics analysis, are provided. Over most of the covered phase space (muon |η| < 2.7 and 5 ≲ pT ≲ 100 GeV) the efficiency is above 99% and is measured with per-mille precision. The momentum resolution ranges from 1.7% at central rapidity and for transverse momentum pT ≅ 10 GeV, to 4% at large rapidity and pT ≅ 100 GeV. The momentum scale is known with an uncertainty of 0.05% to 0.2% depending on rapidity. A method for the recovery of final state radiation from the muons is also presented.
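The "corrections to the simulation" are typically applied as an η-dependent momentum-scale factor plus additional Gaussian smearing, so that simulated muons reproduce the scale and resolution measured in data. A minimal sketch of that functional form, with made-up correction values:

```python
import numpy as np

rng = np.random.default_rng(1)

def correct_mc_pt(pt, eta, scale, extra_res):
    """Scale and smear simulated muon pT to match data
    (illustrative form; the correction values below are invented)."""
    return scale(eta) * pt * (1.0 + extra_res(eta) * rng.standard_normal(pt.shape))

scale = lambda eta: 1.001                                           # 0.1% scale shift
extra_res = lambda eta: np.where(np.abs(eta) < 1.05, 0.005, 0.015)  # extra smearing

pt = rng.uniform(10.0, 100.0, 5)
eta = rng.uniform(-2.7, 2.7, 5)
print(correct_mc_pt(pt, eta, scale, extra_res))
```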
Abstract:
The liquid argon calorimeter is a key component of the ATLAS detector installed at the CERN Large Hadron Collider. The primary purpose of this calorimeter is the measurement of electron and photon kinematic properties. It also provides a crucial input for measuring jets and missing transverse momentum. An advanced data monitoring procedure was designed to quickly identify issues that would affect detector performance and ensure that only the best quality data are used for physics analysis. This article presents the validation procedure developed during the 2011 and 2012 LHC data-taking periods, in which more than 98% of the proton-proton luminosity recorded by ATLAS at a centre-of-mass energy of 7–8 TeV had calorimeter data quality suitable for physics analysis.
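Monitoring of this kind is commonly bookkept per luminosity block: each block is flagged usable or not for a given subsystem, and the good-for-physics fraction (the >98% figure above) follows from the flagged luminosity. A schematic sketch with invented flags and luminosities:

```python
# Hypothetical per-luminosity-block flags: (block id, luminosity in /pb, calo OK?).
blocks = [(1, 0.52, True), (2, 0.49, True), (3, 0.51, False), (4, 0.50, True)]

good_lumi = sum(lumi for _, lumi, ok in blocks if ok)
total_lumi = sum(lumi for _, lumi, _ in blocks)
good_blocks = [bid for bid, _, ok in blocks if ok]

print(f"good-for-physics fraction: {100 * good_lumi / total_lumi:.1f}%")
print(f"usable luminosity blocks: {good_blocks}")
```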
Abstract:
Many of the interesting physics processes to be measured at the LHC have a signature involving one or more isolated electrons. The electron reconstruction and identification efficiencies of the ATLAS detector at the LHC have been evaluated using proton–proton collision data collected in 2011 at √s = 7 TeV and corresponding to an integrated luminosity of 4.7 fb−1. Tag-and-probe methods using events with leptonic decays of W and Z bosons and J/ψ mesons are employed to benchmark these performance parameters. The combination of all measurements results in identification efficiencies determined with an accuracy at the few per-mille level for electron transverse energy greater than 30 GeV.
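In a tag-and-probe measurement, one leg of the resonance candidate is tightly identified (the tag), and the other leg (the probe) gives an unbiased sample on which the identification efficiency is simply ε = N_pass / N_probes. A minimal sketch with toy counts and the plain binomial uncertainty; the real measurement also subtracts background and propagates systematic uncertainties:

```python
import math

def tag_and_probe_efficiency(n_pass, n_probes):
    """Efficiency from probe electrons with a binomial uncertainty."""
    eff = n_pass / n_probes
    err = math.sqrt(eff * (1.0 - eff) / n_probes)
    return eff, err

eff, err = tag_and_probe_efficiency(n_pass=95_320, n_probes=100_000)  # toy counts
print(f"efficiency = {eff:.4f} +/- {err:.4f}")
```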
Abstract:
The article proposes granular computing as a theoretical, formal and methodological basis for the newly emerging research field of human–data interaction (HDI). We argue that the ability to represent and reason with information granules is a prerequisite for data legibility. As such, it allows the research agenda of HDI to be extended to encompass the topic of collective intelligence amplification, which is seen as an opportunity afforded by today’s increasingly pervasive computing environments. As an example of collective intelligence amplification in HDI, we introduce a collaborative urban planning use case in a cognitive city environment and show how an iterative process of user input and human-oriented automated data processing can support collective decision making. As a basis for automated human-oriented data processing, we use the spatial granular calculus of granular geometry.
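As a hedged toy of what "representing and reasoning with information granules" can look like in the urban-planning use case: each participant's imprecise site preference is modelled as a rectangular granule, and a consensus region is sought by coarsening the granulation until all preferences overlap. This only illustrates the flavour of granular reasoning, not the article's granular-geometry calculus:

```python
# A granule is an axis-aligned box (x_min, x_max, y_min, y_max) standing
# for one participant's imprecise preference for a site location.
def intersect(a, b):
    box = (max(a[0], b[0]), min(a[1], b[1]), max(a[2], b[2]), min(a[3], b[3]))
    return box if box[0] <= box[1] and box[2] <= box[3] else None

def coarsen(box, r):
    """Widen a granule by r in every direction (a coarser granulation)."""
    return (box[0] - r, box[1] + r, box[2] - r, box[3] + r)

prefs = [(0, 4, 0, 4), (3, 7, 1, 5), (5, 9, 2, 6)]  # invented preferences

# Iteratively coarsen until all preferences share a common region.
r = 0.0
while True:
    consensus = coarsen(prefs[0], r)
    for p in prefs[1:]:
        consensus = consensus and intersect(consensus, coarsen(p, r))
    if consensus:
        break
    r += 0.5
print(f"consensus granule at coarsening r = {r}: {consensus}")
```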