981 results for Momentum Theorem
Abstract:
Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb−1 of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with pT > 120 GeV and no leptons. Nine signal regions are considered with increasing missing transverse momentum requirements between ETmiss > 150 GeV and ETmiss > 700 GeV. Good agreement is observed between the number of events in data and Standard Model expectations. The results are translated into exclusion limits on models with large extra spatial dimensions, pair production of weakly interacting dark matter candidates, and production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.
Abstract:
The results of a search for supersymmetry in final states containing at least one isolated lepton (electron or muon), jets and large missing transverse momentum with the ATLAS detector at the Large Hadron Collider (LHC) are reported. The search is based on proton-proton collision data at a centre-of-mass energy √s = 8 TeV collected in 2012, corresponding to an integrated luminosity of 20 fb−1. No significant excess above the Standard Model expectation is observed. Limits are set on the parameters of a minimal universal extra dimensions model, excluding a compactification radius of 1/Rc = 950 GeV for a cut-off scale times radius (ΛRc) of approximately 30, as well as on sparticle masses for various supersymmetric models. Depending on the model, the search excludes gluino masses up to 1.32 TeV and squark masses up to 840 GeV.
Abstract:
Results of a search for new phenomena in events with large missing transverse momentum and a Higgs boson decaying to two photons are reported. Data from proton-proton collisions at a center-of-mass energy of 8 TeV and corresponding to an integrated luminosity of 20.3 fb−1 have been collected with the ATLAS detector at the LHC. The observed data are well described by the expected Standard Model backgrounds. Upper limits on the cross section of events with large missing transverse momentum and a Higgs boson candidate are also placed. Exclusion limits are presented for models of physics beyond the Standard Model featuring dark-matter candidates.
Abstract:
The verification and analysis of programs with probabilistic features is a necessary task in current scientific and technological practice. The success, and subsequent widespread adoption, of hardware-level implementations of communication protocols and of probabilistic solutions to distributed problems make the use of stochastic agents as programming elements more than interesting. In many of these cases the use of random agents produces better and more efficient solutions; in others it provides solutions where traditional methods cannot find any. These algorithms are generally embedded in multiple hardware mechanisms, so an error in them can produce an undesired multiplication of their harmful effects. Currently, the main effort in the analysis of probabilistic programs is devoted to the study and development of tools called probabilistic model checkers. Given a finite model of the stochastic system, these tools automatically compute several performance measures of it. Although this can be quite useful when verifying programs, for general-purpose systems it becomes necessary to check more complete specifications bearing on the correctness of the algorithm. It would also be interesting to obtain the properties of the system automatically, in the form of invariants and counterexamples. This project aims to address the static analysis of probabilistic programs through deductive tools such as theorem provers and SMT solvers, which have shown their maturity and effectiveness in attacking problems of traditional programming. In order not to lose automation, we will work within the framework of "Abstract Interpretation", which provides a guideline for our theoretical development.
At the same time, we will put these foundations into practice through concrete implementations that use those tools.
Abstract:
The classical central limit theorem states the uniform convergence of the distribution functions of the standardized sums of independent and identically distributed square integrable real-valued random variables to the standard normal distribution function. While first versions of the central limit theorem are already due to de Moivre (1730) and Laplace (1812), a systematic study of this topic started at the beginning of the last century with the fundamental work of Lyapunov (1900, 1901). Meanwhile, extensions of the central limit theorem are available for a multitude of settings. This includes, e.g., Banach space valued random variables as well as substantial relaxations of the assumptions of independence and identical distributions. Furthermore, explicit error bounds are established and asymptotic expansions are employed to obtain better approximations. Classical error estimates like the famous bound of Berry and Esseen are stated in terms of absolute moments of the random summands and therefore do not reflect a potential closeness of the distributions of the single random summands to a normal distribution. Non-classical approaches take this issue into account by providing error estimates based on, e.g., pseudomoments. The latter field of investigation was initiated by work of Zolotarev in the 1960s and is still in its infancy compared to the development of the classical theory. For example, non-classical error bounds for asymptotic expansions seem not to be available up to now ...
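In standard notation (our restatement for the reader's convenience, not quoted from the abstract itself), the classical theorem and the Berry-Esseen bound mentioned above read:

```latex
% X_1, X_2, \dots i.i.d. with E X_i = \mu and \operatorname{Var} X_i = \sigma^2 < \infty;
% S_n = X_1 + \dots + X_n; \Phi is the standard normal distribution function.
\sup_{x \in \mathbb{R}}
  \left| \Pr\!\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right|
  \longrightarrow 0 \qquad (n \to \infty).
% Berry--Esseen: if additionally E|X_1 - \mu|^3 < \infty, then with an absolute constant C,
\sup_{x \in \mathbb{R}}
  \left| \Pr\!\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right|
  \le \frac{C \, E|X_1 - \mu|^3}{\sigma^3 \sqrt{n}}.
```

The Berry-Esseen bound illustrates the point made in the abstract: it depends only on the absolute third moment of the summands, so it cannot improve when the summands are themselves nearly normal.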
Abstract:
The main aim of this short paper is to advertise the Koosis theorem in the mathematical community, especially among those who study orthogonal polynomials. We (try to) do this by proving a new theorem about asymptotics of orthogonal polynomials for which the Koosis theorem seems to be the most natural tool. Namely, we consider the case when a Szegő measure on the unit circle is perturbed by an arbitrary measure inside the unit disk and an arbitrary Blaschke sequence of point masses outside the unit disk.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
We present Shelah’s famous theorem in a version for modules, together with a self-contained proof and some examples. This exposition is based on lectures given at CRM in October 2006.
Abstract:
We prove a double commutant theorem for hereditary subalgebras of a large class of C*-algebras, partially resolving a problem posed by Pedersen [8]. Double commutant theorems originated with von Neumann, whose seminal result evolved into an entire field now called von Neumann algebra theory. Voiculescu proved a C*-algebraic double commutant theorem for separable subalgebras of the Calkin algebra. We prove a similar result for hereditary subalgebras which holds for arbitrary corona C*-algebras. (It is not clear how generally Voiculescu's double commutant theorem holds.)
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
We examine the proof of a classical localization theorem of Bousfield and Friedlander and we remove the assumption that the underlying model category be right proper. The key to the argument is a lemma about factoring morphisms in the arrow category of a model category.
Abstract:
In this paper, we consider an exchange economy à la Shitovitz (1973), with atoms and an atomless set. We associate with it a strategic market game of the kind first proposed by Lloyd S. Shapley and known as the Shapley window model. We analyze the relationship between the set of the Cournot-Nash equilibrium allocations of the strategic market game and the Walras equilibrium allocations of the exchange economy with which it is associated. We show, with an example, that even when atoms are countably infinite, no Cournot-Nash equilibrium allocation of the game is a Walras equilibrium of the underlying exchange economy. Accordingly, in the original spirit of Cournot (1838), we partially replicate the mixed exchange economy by increasing the number of atoms, without affecting the atomless part, and ensuring that the measure space of agents remains finite. We show that any sequence of Cournot-Nash equilibrium allocations of the strategic market games associated with the partially replicated exchange economies approximates a Walras equilibrium allocation of the original exchange economy.
Abstract:
We present an envelope theorem for establishing first-order conditions in decision problems involving continuous and discrete choices. Our theorem accommodates general dynamic programming problems, even with unbounded marginal utilities. And, unlike classical envelope theorems that focus only on differentiating value functions, we accommodate other endogenous functions such as default probabilities and interest rates. Our main technical ingredient is how we establish the differentiability of a function at a point: we sandwich the function between two differentiable functions from above and below. Our theory is widely applicable. In unsecured credit models, neither interest rates nor continuation values are globally differentiable. Nevertheless, we establish an Euler equation involving marginal prices and values. In adjustment cost models, we show that first-order conditions apply universally, even if optimal policies are not (S,s). Finally, we incorporate indivisible choices into a classic dynamic insurance analysis.
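The sandwiching step described above can be read as an instance of a standard calculus lemma (this restatement is ours, not quoted from the paper):

```latex
% Sandwich differentiability lemma (standard form):
% if g \le f \le h near x_0, the three functions agree at x_0,
% and g, h are differentiable at x_0 with equal derivatives,
% then f is differentiable at x_0 with that same derivative.
g(x) \le f(x) \le h(x) \text{ near } x_0,
\qquad g(x_0) = f(x_0) = h(x_0),
\qquad g'(x_0) = h'(x_0) = d
\;\Longrightarrow\; f'(x_0) = d.
```

This is presumably why the approach tolerates value functions that are not globally differentiable: differentiability is only needed at the point of interest, where suitable smooth bounds can be constructed.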