973 results for Data-fusion
Abstract:
The ATLAS experiment at the LHC has measured the Higgs boson couplings and mass, and searched for invisible Higgs boson decays, using multiple production and decay channels with up to 4.7 fb−1 of pp collision data at √s = 7 TeV and 20.3 fb−1 at √s = 8 TeV. In the current study, the measured production and decay rates of the observed Higgs boson in the γγ, ZZ, WW, Zγ, bb, ττ, and μμ decay channels, along with results from the associated production of a Higgs boson with a top-quark pair, are used to probe the scaling of the couplings with mass. Limits are set on parameters in extensions of the Standard Model including a composite Higgs boson, an additional electroweak singlet, and two-Higgs-doublet models. Together with the measured mass of the scalar Higgs boson in the γγ and ZZ decay modes, a lower limit is set on the pseudoscalar Higgs boson mass of mA > 370 GeV in the “hMSSM” simplified Minimal Supersymmetric Standard Model. Results from direct searches for heavy Higgs bosons are also interpreted in the hMSSM. Direct searches for invisible Higgs boson decays in the vector-boson fusion and associated production of a Higgs boson with W/Z (Z → ℓℓ, W/Z → jj) modes are statistically combined to set an upper limit on the Higgs boson invisible branching ratio of 0.25. The use of the measured visible decay rates in a more general coupling fit improves the upper limit to 0.23, constraining a Higgs portal model of dark matter.
Abstract:
A search for Higgs boson production in association with a W or Z boson, in the H → WW∗ decay channel, is performed with a data sample collected with the ATLAS detector at the LHC in proton-proton collisions at centre-of-mass energies √s = 7 TeV and 8 TeV, corresponding to integrated luminosities of 4.5 fb−1 and 20.3 fb−1, respectively. The WH production mode is studied in two-lepton and three-lepton final states, while two-lepton and four-lepton final states are used to search for the ZH production mode. The observed significance, for the combined WH and ZH production, is 2.5 standard deviations while a significance of 0.9 standard deviations is expected in the Standard Model Higgs boson hypothesis. The ratio of the combined WH and ZH signal yield to the Standard Model expectation, μVH, is found to be μVH = 3.0 +1.3 −1.1 (stat.) +1.0 −0.7 (sys.) for the Higgs boson mass of 125.36 GeV. The WH and ZH production modes are also combined with the gluon fusion and vector boson fusion production modes studied in the H → WW∗ → ℓνℓν decay channel, resulting in an overall observed significance of 6.5 standard deviations and μggF+VBF+VH = 1.16 +0.16 −0.15 (stat.) +0.18 −0.15 (sys.). The results are interpreted in terms of scaling factors of the Higgs boson couplings to vector bosons (κV) and fermions (κF); the combined results are: |κV| = 1.06 +0.10 −0.10, |κF| = 0.85 +0.26 −0.20.
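As a rough illustration of how the quoted uncertainties relate, the sketch below (plain Python, not ATLAS code) combines the statistical and systematic components of μVH in quadrature into an approximate total uncertainty; the actual analysis uses a profile-likelihood fit, so this naive combination is only indicative.

```python
import math

def combine_quadrature(*components):
    """Combine independent uncertainty components in quadrature (an approximation for asymmetric errors)."""
    return math.sqrt(sum(c * c for c in components))

# Quoted result from the abstract above, at m_H = 125.36 GeV:
# mu_VH = 3.0  +1.3 -1.1 (stat.)  +1.0 -0.7 (sys.)
mu_vh = 3.0
up_total = combine_quadrature(1.3, 1.0)    # upper stat and sys components
down_total = combine_quadrature(1.1, 0.7)  # lower stat and sys components

print(f"mu_VH = {mu_vh} +{up_total:.1f} -{down_total:.1f} (approx. total)")
```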
Abstract:
It was found that the non-perturbative corrections calculated using Pythia with the Perugia 2011 tune did not include the effect of the underlying event. The affected correction factors were recomputed using the Pythia 6.427 generator. These corrections are applied as a baseline to the NLO pQCD calculations, and thus the central values of the theoretical predictions have changed by a few percent with the new corrections. This has a minor impact on the agreement between the data and the theoretical predictions. Figures 2 and 6 to 13, and all the tables, have been updated with the new values. A few sentences in the discussion in sections 5.2 and 9 were altered or removed.
Abstract:
This paper describes the concept, technical realisation and validation of a largely data-driven method to model events with Z→ττ decays. In Z→μμ events selected from proton-proton collision data recorded at √s = 8 TeV with the ATLAS experiment at the LHC in 2012, the Z decay muons are replaced by τ leptons from simulated Z→ττ decays at the level of reconstructed tracks and calorimeter cells. The τ lepton kinematics are derived from the kinematics of the original muons. Thus, only the well-understood decays of the Z boson and τ leptons as well as the detector response to the τ decay products are obtained from simulation. All other aspects of the event, such as the Z boson and jet kinematics as well as effects from multiple interactions, are given by the actual data. This so-called τ-embedding method is particularly relevant for Higgs boson searches and analyses in ττ final states, where Z→ττ decays constitute a large irreducible background that cannot be obtained directly from data control samples.
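The following is a minimal, hypothetical sketch of the embedding workflow described above; the class and function names are illustrative only and do not correspond to ATLAS software.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Particle:
    pt: float
    eta: float
    phi: float
    origin: str  # e.g. "muon", "tau_decay", "other"

@dataclass
class Event:
    tracks: List[Particle] = field(default_factory=list)
    calo_cells: List[Particle] = field(default_factory=list)

def selected_muons(event: Event) -> List[Particle]:
    """Stand-in for the Z->mumu selection: the two reconstructed muons."""
    return [p for p in event.tracks if p.origin == "muon"]

def embed(data_event: Event,
          simulate_z_tautau: Callable[[List[Particle]], Event]) -> Event:
    """Replace the data muons by simulated tau decay products.

    `simulate_z_tautau` stands in for generating and simulating a Z->tautau
    decay whose tau kinematics are derived from the original muons.
    """
    muons = selected_muons(data_event)
    assert len(muons) == 2, "expects a selected Z->mumu candidate event"
    tau_event = simulate_z_tautau(muons)

    # Keep everything in the data event except the muon signature, then merge
    # the simulated tau decay products at the track / calorimeter-cell level,
    # so jets, pile-up and the Z boson kinematics are still taken from data.
    return Event(
        tracks=[p for p in data_event.tracks if p.origin != "muon"] + tau_event.tracks,
        calo_cells=[c for c in data_event.calo_cells if c.origin != "muon"] + tau_event.calo_cells,
    )
```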
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Doctoral thesis in Sciences (Specialisation in Mathematics)
Abstract:
The MAP-i Doctoral Programme in Informatics, of the Universities of Minho, Aveiro and Porto
Abstract:
Distributed data aggregation is an important task, allowing the decentralized determination of meaningful global properties, which can then be used to direct the execution of other applications. These values result from the distributed computation of functions such as count, sum and average. Application examples include determining the network size, total storage capacity, average load, majorities and many others. In the last decade, many different approaches have been proposed, with different trade-offs in terms of accuracy, reliability, message and time complexity. Due to the considerable amount and variety of aggregation algorithms, it can be difficult and time consuming to determine which techniques will be more appropriate to use in specific settings, justifying the existence of a survey to aid in this task. This work reviews the state of the art on distributed data aggregation algorithms, providing three main contributions. First, it formally defines the concept of aggregation, characterizing the different types of aggregation functions. Second, it succinctly describes the main aggregation techniques, organizing them in a taxonomy. Finally, it provides some guidelines toward the selection and use of the most relevant techniques, summarizing their principal characteristics.
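As one concrete illustration of the kind of technique such a survey covers, the sketch below simulates gossip-based averaging in the style of push-sum; it is a single-process simulation with illustrative names, not a distributed implementation.

```python
import random

def push_sum(values, rounds=50, seed=0):
    """Simulate push-sum gossip: each node keeps a (sum, weight) pair, repeatedly
    shares half of it with a random peer, and estimates the average as sum / weight."""
    rng = random.Random(seed)
    n = len(values)
    s = list(values)          # per-node "sum" share, initialised to the local value
    w = [1.0] * n             # per-node "weight" share, initialised to 1
    for _ in range(rounds):
        for i in range(n):
            j = rng.randrange(n)          # pick a random peer
            half_s, half_w = s[i] / 2, w[i] / 2
            s[i], w[i] = half_s, half_w   # keep half locally
            s[j] += half_s                # push the other half to the peer
            w[j] += half_w
    return [si / wi for si, wi in zip(s, w)]  # per-node estimates of the global average

loads = [10.0, 4.0, 7.0, 3.0]
print(push_sum(loads))  # every node's estimate converges towards 6.0
```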
Abstract:
Large-scale distributed data stores rely on optimistic replication to scale and remain highly available in the face of network partitions. Managing data without coordination results in eventually consistent data stores that allow for concurrent data updates. These systems often use anti-entropy mechanisms (like Merkle trees) to detect and repair divergent data versions across nodes. However, in practice hash-based data structures are too expensive for large amounts of data and create too many false conflicts. Another aspect of eventual consistency is detecting write conflicts. Logical clocks are often used to track data causality, which is necessary to detect causally concurrent writes on the same key. However, there is a non-negligible metadata overhead per key, which also keeps growing with time, proportionally to the node churn rate. Another challenge is deleting keys while respecting causality: while the values can be deleted, per-key metadata cannot be permanently removed without coordination. We introduce a new causality management framework for eventually consistent data stores that leverages node logical clocks (Bitmapped Version Vectors) and a new key logical clock (Dotted Causal Container) to provide advantages on multiple fronts: 1) a new efficient and lightweight anti-entropy mechanism; 2) greatly reduced per-key causality metadata size; 3) accurate key deletes without permanent metadata.
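For context, the sketch below illustrates the basic version-vector comparison that underlies per-key causality tracking; it is a generic illustration and does not reproduce the Bitmapped Version Vector or Dotted Causal Container structures introduced in this work.

```python
from typing import Dict

VersionVector = Dict[str, int]  # node id -> number of events seen from that node

def dominates(a: VersionVector, b: VersionVector) -> bool:
    """True if `a` has seen every event that `b` has seen (a is causally newer or equal)."""
    return all(a.get(node, 0) >= count for node, count in b.items())

def concurrent(a: VersionVector, b: VersionVector) -> bool:
    """Two versions are concurrent (a write conflict) when neither dominates the other."""
    return not dominates(a, b) and not dominates(b, a)

# Two replicas write the same key without seeing each other's update:
v1 = {"node_a": 2, "node_b": 1}
v2 = {"node_a": 1, "node_b": 2}
print(concurrent(v1, v2))                          # True: causally concurrent writes
print(dominates({"node_a": 3, "node_b": 2}, v1))   # True: causally newer version
```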