6 results for LHC, CMS, Big Data
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
With the increasing production of information from e-government initiatives, there is also the need to transform a large volume of unstructured data into useful information for society. All this information should be easily accessible and made available in a meaningful and effective way in order to achieve semantic interoperability in electronic government services, a challenge to be pursued by governments around the world. Our aim is to discuss the context of e-Government Big Data and to present a framework to promote semantic interoperability through the automatic generation of ontologies from unstructured information found on the Internet. We propose the use of fuzzy mechanisms to deal with natural-language terms and present some related work in this area. The results achieved in this study are based on the architectural definition and the major components and requirements that compose the proposed framework. With this, it is possible to take advantage of the large volume of information generated by e-Government initiatives and use it to benefit society.
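As a rough illustration of the fuzzy mechanisms mentioned in this abstract, the sketch below clusters near-duplicate natural-language terms into candidate ontology concepts using a string-similarity ratio. The threshold, function names, and sample terms are illustrative assumptions, not details taken from the proposed framework.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Fuzzy similarity in [0, 1] between two natural-language terms."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def cluster_terms(terms, threshold=0.8):
    """Greedily group terms whose similarity to a cluster's first term
    exceeds the threshold. Each cluster is a candidate concept for the
    generated ontology (threshold value is an assumption)."""
    clusters = []
    for term in terms:
        for cluster in clusters:
            if similarity(term, cluster[0]) >= threshold:
                cluster.append(term)
                break
        else:
            clusters.append([term])
    return clusters

# Hypothetical terms scraped from e-government pages
terms = ["tax payment", "tax payments", "Tax Payment", "driving licence"]
print(cluster_terms(terms))
```

A real pipeline would of course replace the character-level ratio with a linguistically informed fuzzy membership function, but the merge step would have the same shape.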
Abstract:
The ATLAS and CMS collaborations have recently shown data suggesting the presence of a Higgs boson in the vicinity of 125 GeV. We show that a two-Higgs-doublet model spectrum, with the pseudoscalar state being the lightest, could be responsible for the diphoton signal events. In this model, the other scalars are considerably heavier and are not excluded by the current LHC data. If this assumption is correct, future LHC data should show a strengthening of the gamma gamma signal, while the signals in the ZZ(*) -> 4l and WW(*) -> 2l 2nu channels should diminish and eventually disappear, due to the absence of diboson tree-level couplings of the CP-odd state. The heavier CP-even neutral scalars can now decay into channels involving the CP-odd light scalar which, together with their larger masses, allow them to avoid the existing bounds on Higgs searches. We suggest additional signals to confirm this scenario at the LHC, in the decay channels of the heavier scalars into AA and AZ. Finally, this inverted two-Higgs-doublet spectrum is characteristic in models where fermion condensation leads to electroweak symmetry breaking. We show that in these theories it is possible to obtain the observed diphoton signal at or somewhat above the prediction for the standard model Higgs for the typical values of the parameters predicted.
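The argument above can be restated in terms of signal strengths relative to the standard-model Higgs; the notation below is a schematic summary of the abstract's reasoning, not formulas taken from the paper itself.

```latex
% Signal strength of the light CP-odd state A in channel X,
% normalized to the standard-model Higgs expectation:
\mu_{XX} \;=\;
\frac{\sigma(pp \to A)\,\mathrm{BR}(A \to XX)}
     {\sigma(pp \to h_{\mathrm{SM}})\,\mathrm{BR}(h_{\mathrm{SM}} \to XX)}

% A CP-odd scalar has no tree-level diboson couplings, so
g_{AZZ} = g_{AWW} = 0
\quad\Longrightarrow\quad
\mu_{ZZ^{*}\to 4\ell} \approx \mu_{WW^{*}\to 2\ell 2\nu} \approx 0 ,
\qquad \mu_{\gamma\gamma} \gtrsim 1 .
```

The loop-induced diphoton rate survives (and can exceed the standard-model value), while the tree-level diboson channels vanish, which is the distinguishing prediction of the scenario.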
Abstract:
Several extensions of the standard model predict the existence of new neutral spin-1 resonances associated with the electroweak symmetry breaking sector. Using the data from ATLAS (with integrated luminosity of L = 1.02 fb^-1) and CMS (with integrated luminosity of L = 1.55 fb^-1) on the production of W+W- pairs through the process pp -> l(+) l'(-) + missing transverse energy, we place model-independent bounds on the masses, couplings, and widths of these new vector resonances. Our analyses show that the present data exclude new neutral vector resonances with masses up to 1-2.3 TeV, depending on their couplings and widths. We also demonstrate how to extend our analysis framework to different models with a specific example.
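The exclusion logic behind bounds like these can be illustrated with a toy single-bin counting experiment: a mass hypothesis is excluded when its predicted signal yield exceeds the 95% CL upper limit implied by the observed and expected event counts. This is not the paper's actual statistical procedure (which uses full W+W- kinematic distributions); the counts below are hypothetical, for illustration only.

```python
from math import exp

def poisson_cdf(n: int, mu: float) -> float:
    """P(N <= n) for N ~ Poisson(mu)."""
    term, total = exp(-mu), exp(-mu)
    for k in range(1, n + 1):
        term *= mu / k
        total += term
    return total

def upper_limit_95(n_obs: int, background: float, step: float = 0.01) -> float:
    """Largest signal s compatible with the data at 95% CL, i.e. the
    first s with P(N <= n_obs | b + s) < 0.05 (classic single-bin
    counting-experiment limit)."""
    s = 0.0
    while poisson_cdf(n_obs, background + s) >= 0.05:
        s += step
    return s

# Hypothetical counts for one resonance-mass hypothesis (illustrative only)
n_obs, b = 8, 6.0           # observed events and expected background
s95 = upper_limit_95(n_obs, b)
predicted_signal = 12.5     # toy model prediction at this mass point
print(f"s95 = {s95:.2f}; excluded: {predicted_signal > s95}")
```

Scanning this test over resonance mass, coupling, and width then traces out an exclusion region of the kind quoted in the abstract.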
Abstract:
We present measurements of Underlying Event observables in pp collisions at sqrt(s) = 0.9 and 7 TeV. The analysis is performed as a function of the highest charged-particle transverse momentum, p_T,LT, in the event. Different regions are defined with respect to the azimuthal direction of the leading (highest transverse momentum) track: Toward, Transverse and Away. The Toward and Away regions collect the fragmentation products of the hardest partonic interaction. The Transverse region is expected to be most sensitive to the Underlying Event activity. The study is performed with charged particles above three different p_T thresholds: 0.15, 0.5 and 1.0 GeV/c. In the Transverse region we observe an increase in the multiplicity by a factor of 2-3 between the lower and higher collision energies, depending on the track p_T threshold considered. Data are compared to PYTHIA 6.4, PYTHIA 8.1 and PHOJET. On average, all models considered underestimate the multiplicity and summed p_T in the Transverse region by about 10-30%.
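The Toward/Transverse/Away classification described above can be sketched as a function of the azimuthal difference to the leading track. The abstract does not spell out the angular boundaries; the cuts below (|dphi| < 60 degrees Toward, 60-120 degrees Transverse, above 120 degrees Away) are the conventional ones in Underlying Event analyses and are stated here as an assumption.

```python
from math import pi

def delta_phi(phi: float, phi_lead: float) -> float:
    """Azimuthal difference folded into [0, pi]."""
    d = abs(phi - phi_lead) % (2 * pi)
    return 2 * pi - d if d > pi else d

def ue_region(phi: float, phi_lead: float) -> str:
    """Classify a track relative to the leading track's azimuth using
    the conventional UE boundaries (assumed): Toward |dphi| < pi/3,
    Transverse pi/3 <= |dphi| < 2pi/3, Away |dphi| >= 2pi/3."""
    d = delta_phi(phi, phi_lead)
    if d < pi / 3:
        return "Toward"
    if d < 2 * pi / 3:
        return "Transverse"
    return "Away"

# Hypothetical tracks as (p_T in GeV/c, phi in rad)
tracks = [(1.2, 0.1), (0.6, 1.8), (0.8, 3.0)]
phi_lead = max(tracks)[1]          # leading track = highest p_T
print([ue_region(phi, phi_lead) for _, phi in tracks])
```

Multiplicity and summed p_T are then accumulated per region and per p_T threshold, with the Transverse region isolating the Underlying Event contribution.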
Abstract:
In the framework of gauged flavour symmetries, new fermions in parity-symmetric representations of the standard model are generically needed to compensate mixed anomalies. The key point is that their masses are also protected by flavour symmetries, and some of them are expected to lie well below the flavour symmetry breaking scale(s), which has to occur many orders of magnitude above the electroweak scale to be compatible with the available data from flavour-changing neutral currents and CP violation experiments. We argue that some of these fermions would plausibly get masses within the LHC range. If they are taken to be heavy quarks and leptons, in (bi)-fundamental representations of the standard model symmetries, their mixings with the light ones are strongly constrained by electroweak precision data to be very small. The alternative chosen here is to forbid such mixings exactly by breaking the flavour symmetries down to an exact discrete symmetry, the so-called proton-hexality, primarily suggested to avoid proton decay. As a consequence of the large value needed for the flavour breaking scale, those heavy particles are long-lived and well suited for current and future LHC searches for quasi-stable hadrons and leptons. In fact, the LHC experiments have already started to look for them.
Abstract:
Current scientific applications produce large amounts of data. The processing, handling and analysis of such data require large-scale computing infrastructures such as clusters and grids. In this area, studies aim at improving the performance of data-intensive applications by optimizing data accesses. In order to achieve this goal, distributed storage systems have considered techniques of data replication, migration, distribution, and access parallelism. However, the main drawback of those studies is that they do not take application behavior into account when performing data access optimization. This limitation motivated this paper, which applies strategies to support the online prediction of application behavior in order to optimize data access operations on distributed systems, without requiring any information on past executions. To accomplish this goal, the approach organizes application behaviors as time series and then analyzes and classifies those series according to their properties. By knowing these properties, the approach selects modeling techniques to represent the series and perform predictions, which are later used to optimize data access operations. This new approach was implemented and evaluated using the OptorSim simulator, sponsored by the LHC-CERN project and widely employed by the scientific community. Experiments confirm that this new approach reduces application execution time by about 50 percent, especially when handling large amounts of data.
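The classify-then-predict idea described above can be sketched in miniature: label an access-pattern series by a simple property (trending vs. stationary) and pick a predictor accordingly. The paper's actual classifiers, models, and OptorSim integration are not reproduced here; all function names, thresholds, and sample series below are assumptions for illustration.

```python
def classify_series(series, trend_threshold=0.1):
    """Crudely label a series of data-access rates as 'trending' or
    'stationary' by comparing the means of its two halves
    (threshold value is an assumption)."""
    half = len(series) // 2
    first = sum(series[:half]) / half
    second = sum(series[half:]) / (len(series) - half)
    if first == 0:
        return "trending" if second > 0 else "stationary"
    return ("trending"
            if abs(second - first) / abs(first) > trend_threshold
            else "stationary")

def predict_next(series):
    """Select a modeling technique based on the series class: naive
    linear extrapolation for trending series, a short moving average
    otherwise. The prediction can drive prefetch/replication decisions."""
    if classify_series(series) == "trending":
        return series[-1] + (series[-1] - series[-2])
    window = series[-4:]
    return sum(window) / len(window)

# Hypothetical block-access counts per time slot for two files
rising = [10, 12, 15, 19, 24, 30]
flat = [20, 21, 19, 20, 20, 21]
print(classify_series(rising), predict_next(rising))
print(classify_series(flat), predict_next(flat))
```

In the full system the prediction would feed the storage layer online, e.g. to prefetch or replicate data for files whose access series is classified as growing.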