984 results for Minimal Supersymmetric Standard Model (MSSM)


Relevance:

100.00%

Publisher:

Abstract:

Two searches for supersymmetric particles in final states containing a same-flavour opposite-sign lepton pair, jets and large missing transverse momentum are presented. The proton-proton collision data used in these searches were collected at a centre-of-mass energy $\sqrt{s} = 8$ TeV by the ATLAS detector at the Large Hadron Collider and correspond to an integrated luminosity of 20.3 fb$^{-1}$. Two leptonic production mechanisms are considered: decays of squarks and gluinos with Z bosons in the final state, resulting in a peak in the dilepton invariant-mass distribution around the Z-boson mass; and decays of neutralinos (e.g. $\tilde{\chi}^0_2 \to \ell^+\ell^-\tilde{\chi}^0_1$), resulting in a kinematic endpoint in the dilepton invariant-mass distribution. For the former, an excess of events above the expected Standard Model background is observed, with a significance of 3 standard deviations. In the latter case, the data are well described by the expected Standard Model background. The results from each channel are interpreted in the context of several supersymmetric models involving the production of squarks and gluinos.

Relevance:

100.00%

Publisher:

Abstract:

Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb$^{-1}$ of $\sqrt{s} = 8$ TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with $p_\mathrm{T} > 120$ GeV and no leptons. Nine signal regions are considered with increasing missing transverse momentum requirements between $E_\mathrm{T}^\mathrm{miss} > 150$ GeV and $E_\mathrm{T}^\mathrm{miss} > 700$ GeV. Good agreement is observed between the number of events in data and Standard Model expectations. The results are translated into exclusion limits on models with large extra spatial dimensions, pair production of weakly interacting dark matter candidates, and production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.

Relevance:

100.00%

Publisher:

Abstract:

A search for a new resonance decaying to a W or Z boson and a Higgs boson in the $\ell\ell/\ell\nu/\nu\nu + b\bar{b}$ final states is performed using 20.3 fb$^{-1}$ of pp collision data recorded at $\sqrt{s} = 8$ TeV with the ATLAS detector at the Large Hadron Collider. The search is conducted by examining the WH/ZH invariant mass distribution for a localized excess. No significant deviation from the Standard Model background prediction is observed. The results are interpreted in terms of constraints on the Minimal Walking Technicolor model and on a simplified approach based on a phenomenological Lagrangian of Heavy Vector Triplets.

Relevance:

100.00%

Publisher:

Abstract:

A search is presented for photonic signatures motivated by generalised models of gauge-mediated supersymmetry breaking. This search makes use of 20.3 fb$^{-1}$ of proton-proton collision data at $\sqrt{s} = 8$ TeV recorded by the ATLAS detector at the LHC, and explores models dominated by both strong and electroweak production of supersymmetric partner states. Four experimental signatures incorporating an isolated photon and significant missing transverse momentum are explored. These signatures include events with an additional photon, lepton, b-quark jet, or jet activity not associated with any specific underlying quark flavor. No significant excess of events is observed above the Standard Model prediction and model-dependent 95% confidence-level exclusion limits are set.

Relevance:

100.00%

Publisher:

Abstract:

Many extensions of the Standard Model posit the existence of heavy particles with long lifetimes. This article presents the results of a search for events containing at least one long-lived particle that decays at a significant distance from its production point into two leptons or into five or more charged particles. This analysis uses a data sample of proton-proton collisions at $\sqrt{s} = 8$ TeV corresponding to an integrated luminosity of 20.3 fb$^{-1}$ collected in 2012 by the ATLAS detector operating at the Large Hadron Collider. No events are observed in any of the signal regions, and limits are set on model parameters within supersymmetric scenarios involving R-parity violation, split supersymmetry, and gauge mediation. In some of the search channels, the trigger and search strategy are based only on the decay products of individual long-lived particles, irrespective of the rest of the event. In these cases, the provided limits can easily be reinterpreted in different scenarios.

Relevance:

100.00%

Publisher:

Abstract:

This paper considers the instrumental variable regression model when there is uncertainty about the set of instruments, exogeneity restrictions, the validity of identifying restrictions and the set of exogenous regressors. This uncertainty can result in a huge number of models. To avoid statistical problems associated with standard model selection procedures, we develop a reversible jump Markov chain Monte Carlo algorithm that allows us to do Bayesian model averaging. The algorithm is very flexible and can be easily adapted to analyze any of the different priors that have been proposed in the Bayesian instrumental variables literature. We show how to calculate the probability of any relevant restriction (e.g. the posterior probability that over-identifying restrictions hold) and discuss diagnostic checking using the posterior distribution of discrepancy vectors. We illustrate our methods in a returns-to-schooling application.
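As a rough, self-contained illustration of the model-averaging idea only (not the paper's algorithm, which jumps between instrumental variable models with different identifying restrictions), the Python sketch below runs a simple model-jumping Metropolis sampler over subsets of candidate instruments, using a BIC-based score as a stand-in for the log marginal likelihood. The data, the score, and all names are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: x is the endogenous regressor, Z holds six candidate
# instruments, of which only the first two are actually relevant.
n, K = 200, 6
Z = rng.normal(size=(n, K))
x = Z[:, :2] @ np.array([1.0, 0.5]) + rng.normal(size=n)

def log_score(include):
    # BIC-based stand-in for the log marginal likelihood of the
    # first-stage regression on the selected instruments.
    if not include.any():
        return -np.inf
    Zs = Z[:, include]
    resid = x - Zs @ np.linalg.lstsq(Zs, x, rcond=None)[0]
    k = include.sum()
    return -0.5 * (n * np.log(resid @ resid / n) + k * np.log(n))

# Metropolis moves that add or drop one instrument at a time; the flip
# proposal is symmetric, so the acceptance ratio is just the score ratio.
include = np.ones(K, dtype=bool)
visits = {}
for _ in range(5000):
    proposal = include.copy()
    j = rng.integers(K)
    proposal[j] = ~proposal[j]
    if np.log(rng.uniform()) < log_score(proposal) - log_score(include):
        include = proposal
    key = tuple(np.flatnonzero(include))
    visits[key] = visits.get(key, 0) + 1

# Posterior model probabilities are approximated by visit frequencies.
for model, count in sorted(visits.items(), key=lambda kv: -kv[1])[:3]:
    print(model, count / 5000)

In a genuine reversible jump setting the parameter dimension changes with each move and the acceptance ratio carries a Jacobian term; the flip move above sidesteps this only because the score integrates the model parameters out.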

Relevance:

100.00%

Publisher:

Abstract:

We construct a cofibrantly generated Thomason model structure on the category of small n-fold categories and prove that it is Quillen equivalent to the standard model structure on the category of simplicial sets. An n-fold functor is a weak equivalence if and only if the diagonal of its n-fold nerve is a weak equivalence of simplicial sets. We introduce an n-fold Grothendieck construction for multisimplicial sets, and prove that it is a homotopy inverse to the n-fold nerve. As a consequence, the unit and counit of the adjunction between simplicial sets and n-fold categories are natural weak equivalences.

Relevance:

100.00%

Publisher:

Abstract:

We consider the two-Higgs-doublet extension of the standard model in the limit where all physical scalar particles are very heavy, too heavy, in fact, to be experimentally produced in forthcoming experiments. The symmetry-breaking sector can thus be described by an effective chiral Lagrangian. We obtain the values of the coefficients of the $O(p^4)$ operators relevant to the oblique corrections and investigate to what extent some nondecoupling effects may remain at low energies. A comparison with recent CERN LEP data shows that this model is indistinguishable from the standard model with one doublet and with a heavy Higgs boson, unless the scalar mass splittings are large.
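For orientation, the oblique corrections referred to here are conventionally encoded in the Peskin-Takeuchi parameters; in one common convention (standard background material, not quoted from this abstract),

\alpha T = \frac{\Pi_{WW}(0)}{m_W^2} - \frac{\Pi_{ZZ}(0)}{m_Z^2},
\qquad
\alpha S = 4 s_W^2 c_W^2 \left[ \Pi'_{ZZ}(0)
  - \frac{c_W^2 - s_W^2}{s_W c_W}\,\Pi'_{Z\gamma}(0)
  - \Pi'_{\gamma\gamma}(0) \right],

where the $\Pi$ are gauge-boson self-energies and $s_W$, $c_W$ are the sine and cosine of the weak mixing angle. The $O(p^4)$ chiral-Lagrangian coefficients of a heavy scalar sector contribute to $S$ and $T$ through these self-energies, which is how the scalar mass splittings mentioned above become observable at LEP.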

Relevance:

100.00%

Publisher:

Abstract:

With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically up to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs.

Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural one within this field: digital filters are typically described with boxes and arrows in textbooks as well. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of in a conventional programming language makes the coder more readable, as it describes how the video data flows through the different coding tools.

While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used. The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic models where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is then a set of static schedules, as small as possible, that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model to be used for model checking.
The model must describe everything that may affect scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined that can produce quasi-static schedulers for a wide range of applications. The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
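To make the dataflow vocabulary of this abstract concrete, here is a minimal Python sketch (an illustration of the general actor model, not RVC-CAL and not the thesis' scheduler) of nodes that communicate only through queues and fire independently once their inputs are available, driven by a naive dynamic round-robin scheduler; all names are hypothetical.

from collections import deque

class Actor:
    # A dataflow node: it may fire whenever every input queue holds a
    # token, independently of all other nodes; queues are the only
    # communication channel, which makes the parallelism explicit.
    def __init__(self, fn, inputs, outputs):
        self.fn, self.inputs, self.outputs = fn, inputs, outputs

    def can_fire(self):
        return all(self.inputs)  # firing rule: one token on every input

    def fire(self):
        result = self.fn(*(q.popleft() for q in self.inputs))
        for q in self.outputs:
            q.append(result)

# Wire a tiny graph: two source queues feeding an adder into a sink queue.
a, b, sink = deque([1, 2, 3]), deque([10, 20, 30]), deque()
actors = [Actor(lambda u, v: u + v, inputs=[a, b], outputs=[sink])]

# Naive dynamic scheduler: keep firing any ready actor until none remains.
# Quasi-static scheduling would replace most of these run-time checks
# with pre-computed static firing sequences.
while any(n.can_fire() for n in actors):
    for n in actors:
        if n.can_fire():
            n.fire()

print(list(sink))  # [11, 22, 33]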

Relevance:

100.00%

Publisher:

Abstract:

Multi-country models have not been very successful in replicating important features of the international transmission of business cycles. Standard models predict cross-country correlations of output and consumption that are, respectively, too low and too high. In this paper, we build a multi-country model of the business cycle with multiple sectors in order to analyze the role of sectoral shocks in the international transmission of the business cycle. We find that a model with multiple sectors generates a higher cross-country correlation of output than standard one-sector models, and a lower cross-country correlation of consumption. In addition, it predicts cross-country correlations of employment and investment that are closer to the data than the standard model. We also analyze the relative effects of multiple sectors, trade in intermediate goods, imperfect substitution between domestic and foreign goods, home preference, capital adjustment costs, and capital depreciation on the international transmission of the business cycle.

Relevance:

100.00%

Publisher:

Abstract:

Astronomical and cosmological observations strongly suggest the presence of an exotic, non-relativistic and non-baryonic form of matter that would account for 26% of the mass-energy content of the present Universe. This so-called cold dark matter would be composed of neutral, massive particles that interact weakly with ordinary matter (WIMPs: Weakly Interacting Massive Particles). The PICASSO project (Projet d'Identification des CAndidats Supersymétriques de la matière SOmbre) is one of the experiments installed in the SNOLAB underground laboratory in Sudbury, Ontario, which attempts to directly detect one of the dark matter candidates proposed in supersymmetric extensions of the Standard Model: the neutralino. For this purpose, PICASSO uses superheated C$_4$F$_{10}$ droplet detectors based on the bubble-chamber principle. Phase transitions in the superheated liquid can be triggered by the recoil of a $^{19}$F nucleus caused by an elastic collision with a neutralino. The nucleation of a droplet generates an acoustic wave recorded by piezoelectric sensors. This thesis presents recent progress of the PICASSO experiment that has led to a substantial increase in its sensitivity in the search for the neutralino. New fabrication and purification procedures have reduced by a factor of 10 the dominant detector contamination, caused by alpha emitters. The study of this contamination in the detectors made it possible to localize the source of these emitters. Efforts in the data analysis improved the discrimination between events generated by alpha particles and those generated by nuclear recoils. New analysis tools were also implemented in order to discriminate particle-induced events from electronic or acoustic backgrounds. In addition, an important mechanism for suppressing unwanted background at high temperature now makes the PICASSO experiment sensitive to low-mass WIMPs.

Relevance:

100.00%

Publisher:

Abstract:

The human electroencephalogram (EEG) is globally characterized by a 1/f power spectrum superimposed with certain peaks, whereby the "alpha peak" in a frequency range of 8-14 Hz is the most prominent one for relaxed states of wakefulness. We present simulations of a minimal dynamical network model of leaky integrator neurons attached to the nodes of an evolving directed and weighted random graph (an Erdős-Rényi graph). We derive a model of the dendritic field potential (DFP) for the neurons, leading to a simulated EEG that describes the global activity of the network. Depending on the network size, we find an oscillatory transition of the simulated EEG when the network reaches a critical connectivity. This transition, indicated by a suitably defined order parameter, is reflected by a sudden change of the network's topology when super-cycles are formed from merging isolated loops. After the oscillatory transition, the power spectra of simulated EEG time series exhibit a 1/f continuum superimposed with certain peaks.
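As a toy illustration of this kind of simulation (assumed parameter values on a fixed random graph; the paper's model uses an evolving directed graph and a derived dendritic field potential, neither reproduced here), the Python sketch below integrates leaky-integrator units coupled through Erdős-Rényi connectivity and takes the mean activity as a crude EEG proxy:

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters: N leaky integrators on an Erdos-Renyi graph
# with connection probability p and random coupling weights.
N, p, leak, steps = 200, 0.03, 0.2, 2000
W = (rng.random((N, N)) < p) * rng.normal(0.0, 1.0, (N, N))
np.fill_diagonal(W, 0.0)

x = rng.normal(0.0, 0.1, N)
eeg = np.empty(steps)
for t in range(steps):
    # Leaky integration of the weighted network input plus noise.
    x = (1.0 - leak) * x + np.tanh(W @ x) + 0.01 * rng.normal(size=N)
    eeg[t] = x.mean()  # crude stand-in for the summed dendritic potentials

# Power spectrum of the simulated EEG; sweeping p past the critical
# connectivity is what would reveal the oscillatory transition.
freqs = np.fft.rfftfreq(steps, d=1.0)
power = np.abs(np.fft.rfft(eeg - eeg.mean())) ** 2
print("dominant frequency bin:", freqs[1:][np.argmax(power[1:])])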

Relevance:

100.00%

Publisher:

Abstract:

Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated, resulting in too much land carbon loss, or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several idealized, one-thousand-year-long 2×CO2 and 4×CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities, and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The response of surface air temperature is the linear sum of the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the preindustrial portions of the last millennium simulations are used to assess historical model climate–carbon feedbacks. Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.

Relevance:

100.00%

Publisher:

Abstract:

The $SU(3)_L \otimes U(1)_N$ electroweak model predicts new Higgs bosons beyond the one of the standard model. In this work we investigate the signature and production of neutral $SU(3)_L \otimes U(1)_N$ Higgs bosons at the $e^-e^+$ Next Linear Collider and at the CERN Linear Collider. We compute the branching ratios of two of the $SU(3)_L \otimes U(1)_N$ neutral Higgs bosons and study the possibility of detecting them, as well as the model's extra neutral boson Z'.

Relevance:

100.00%

Publisher:

Abstract:

The $SU(3)_L \otimes U(1)_N$ electroweak model predicts new Higgs bosons beyond the one of the standard model. In this work we investigate the signature and production of doubly charged Higgs bosons at the $e^+e^-$ International Linear Collider and at the CERN Linear Collider. We compute the branching ratios for the doubly charged gauge bosons of the model.