6 results for Massive spin-2

at Duke University


Relevance:

80.00%

Publisher:

Abstract:

A search for new heavy resonances decaying to boson pairs (WZ, WW, or ZZ) using 20.3 inverse femtobarns of proton-proton collision data at a center-of-mass energy of 8 TeV is presented. The data were recorded by the ATLAS detector at the Large Hadron Collider (LHC) in 2012. The analysis combines several search channels with leptonic, semi-leptonic, and fully hadronic final states. The diboson invariant mass spectrum is examined for local excesses above the Standard Model background prediction, and no significant excess is observed in the combined analysis. 95% confidence-level limits are set on the cross section times branching ratio for three signal models: an extended gauge model with a heavy W′ boson, a bulk Randall-Sundrum model with a spin-2 graviton, and a simplified model with a heavy vector triplet. Among the individual search channels, the fully hadronic channel is presented in the most detail; it employs boson-tagging techniques and jet-substructure cuts. Local excesses are found in the dijet mass distribution around 2 TeV, corresponding to a global significance of 2.5 standard deviations. This deviation from the Standard Model prediction has prompted many theoretical explanations, which could be further explored using LHC Run 2 data.
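For readers unfamiliar with how such 95% confidence limits arise, a toy counting-experiment sketch may help (this is an illustration only, not the ATLAS statistical procedure, which uses a full CLs treatment with systematic uncertainties): if zero events are observed and background is negligible, the 95% CL upper limit on the expected signal count s solves exp(-s) = 0.05.

```python
import math

def poisson_upper_limit_zero_events(cl=0.95):
    """Upper limit on the Poisson mean when 0 events are observed.

    P(0 events | s) = exp(-s); the limit is the s at which this
    probability drops to 1 - cl. Toy illustration only -- a real
    diboson search sets limits with a far more complete statistical
    treatment, not this formula.
    """
    return -math.log(1.0 - cl)

s_up = poisson_upper_limit_zero_events()  # about 3 expected signal events
```

Dividing such an event-count limit by luminosity and efficiency is what turns it into a limit on cross section times branching ratio.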

Relevance:

30.00%

Publisher:

Abstract:

It is known that the exact density functional must give ground-state energies that are piecewise linear as a function of electron number. In this work we prove that this is also true for the lowest-energy excited states of different spin or spatial symmetry. This has three important consequences for chemical applications: the ground state of a molecule must correspond to the state with the maximum highest-occupied-molecular-orbital (HOMO) energy, the minimum lowest-unoccupied-molecular-orbital (LUMO) energy, and the maximum chemical hardness. The beryllium, carbon, and vanadium atoms, as well as the CH2 and C3H3 molecules, are considered as illustrative examples. Our result also directly and rigorously connects the ionization potential and electron affinity to the stability of spin states.
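The piecewise-linearity condition and the frontier-orbital quantities invoked above can be summarized in standard conceptual-DFT notation (this is textbook background, not a formula quoted from the abstract):

```latex
% Piecewise linearity of the exact functional between integer
% electron numbers: for fractional charge N + \omega, 0 \le \omega \le 1,
E(N + \omega) = (1 - \omega)\, E(N) + \omega\, E(N + 1).

% The left and right slopes at integer N give the frontier-orbital
% energies in terms of ionization potential I and electron affinity A:
\varepsilon_{\mathrm{HOMO}} = E(N) - E(N-1) = -I, \qquad
\varepsilon_{\mathrm{LUMO}} = E(N+1) - E(N) = -A.

% Chemical hardness is the jump in slope at the integer:
\eta = I - A = \varepsilon_{\mathrm{LUMO}} - \varepsilon_{\mathrm{HOMO}}.
```

With these identities, maximizing the HOMO energy, minimizing the LUMO energy, and maximizing the hardness are seen to be three faces of the same stability statement.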

Relevance:

30.00%

Publisher:

Abstract:

This article describes advances in statistical computation for large-scale data analysis in structured Bayesian mixture models via graphics processing unit (GPU) programming. The developments are partly motivated by computational challenges arising in fitting models of increasing heterogeneity to increasingly large datasets. An example context concerns common biological studies using high-throughput technologies, which generate many very large datasets and require increasingly high-dimensional mixture models with large numbers of mixture components. We outline important strategies and processes for GPU computation in Bayesian simulation and optimization approaches, give examples of the benefits of GPU implementations in terms of processing speed and scale-up in the ability to analyze large datasets, and provide a detailed, tutorial-style exposition that will benefit readers interested in developing GPU-based approaches for other statistical models. Novel, GPU-oriented approaches to modifying existing algorithms and software design can lead to vast speed-ups and, critically, enable statistical analyses that presently are not performed due to compute-time limitations in traditional computational environments. Supplemental materials are provided with all source code, example data, and details that will enable readers to implement and explore the GPU approach in this mixture-modeling context. © 2010 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
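As a concrete illustration of why mixture-model fitting maps well onto GPUs: the dominant cost in each iteration is evaluating every component's density at every data point, and each (point, component) cell is independent of all the others — exactly the data-parallel structure a GPU kernel exploits with one thread per cell. A minimal pure-Python sketch of that step for a 1-D Gaussian mixture (a generic example, not the authors' code):

```python
import math

def e_step(data, weights, means, sds):
    """Compute responsibilities r[i][k] = P(component k | data[i]).

    Every (i, k) cell below is independent of the others, so on a GPU
    the two loops collapse into one kernel launch with a thread per cell.
    """
    resp = []
    for x in data:
        # Unnormalized posterior for each mixture component.
        row = [
            w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
            for w, m, s in zip(weights, means, sds)
        ]
        total = sum(row)
        resp.append([r / total for r in row])
    return resp

resp = e_step(data=[-2.0, 0.1, 3.0],
              weights=[0.5, 0.5], means=[0.0, 3.0], sds=[1.0, 1.0])
# Each row sums to 1; points near a component's mean load onto that component.
```

With thousands of components and millions of observations, this table is exactly the object whose computation the article accelerates.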

Relevance:

30.00%

Publisher:

Relevance:

30.00%

Publisher:

Abstract:

Uncertainty quantification (UQ) is both an old and a new concept. The current novelty lies in the interaction and synthesis of mathematical models, computer experiments, statistics, field/real experiments, and probability theory, with particular emphasis on large-scale simulation by computer models. The challenges come not only from the complexity of the scientific questions, but also from the sheer size of the information involved. The focus of this thesis is to provide statistical models that scale to the massive data produced in computer experiments and real experiments, through fast and robust statistical inference.

Chapter 2 provides a practical approach for simultaneously emulating/approximating a massive number of functions, with an application to hazard quantification for the Soufrière Hills volcano on the island of Montserrat. Chapter 3 addresses another massive-data problem, in which the number of observations of a function is large: an exact algorithm that is linear in time is developed for the interpolation of methylation levels. Chapters 4 and 5 both concern robust inference for these models. Chapter 4 proposes a new robustness criterion for parameter estimation, and several inference methods are shown to satisfy it. Chapter 5 develops a new prior that satisfies additional criteria and is therefore proposed for use in practice.
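The linear-in-time scaling highlighted for Chapter 3 can be illustrated with a generic piecewise-linear interpolation sketch (a hypothetical example of the O(n + m) pattern, not the exact algorithm developed in the thesis): when both the observation sites and the query points are sorted, a single forward sweep answers every query without restarting the search.

```python
def interpolate_linear(xs, ys, queries):
    """Piecewise-linear interpolation at sorted query points.

    With `xs` and `queries` both sorted, the bracketing index `j` only
    ever moves forward, so all m queries over n sites cost O(n + m) --
    the kind of linear-in-time scaling needed when a function is
    observed at a huge number of sites.
    """
    out = []
    j = 0  # left end of the current bracketing interval
    for q in queries:
        while j + 1 < len(xs) - 1 and xs[j + 1] < q:
            j += 1
        x0, x1 = xs[j], xs[j + 1]
        t = (q - x0) / (x1 - x0)
        out.append(ys[j] + t * (ys[j + 1] - ys[j]))
    return out

# Methylation-style example: levels in [0, 1] observed at a few sites.
levels = interpolate_linear(xs=[0.0, 10.0, 20.0],
                            ys=[0.2, 0.8, 0.4],
                            queries=[5.0, 10.0, 15.0])
# approximately [0.5, 0.8, 0.6]
```

A naive per-query binary search would instead cost O(m log n); the sweep's advantage is precisely what "linear in time" buys at genome scale.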