975 results for SUPERSYMMETRIC POLYNOMIALS


Relevance:

10.00%

Publisher:

Abstract:

[EN] We present a new strategy for constructing tensor product spline spaces over quadtree and octree T-meshes. The proposed technique includes some simple rules for inferring local knot vectors to define spline blending functions. These rules allow us to obtain, for a given T-mesh, a set of cubic spline functions that span a space with desirable properties: it reproduces cubic polynomials, the functions are C2-continuous and linearly independent, and spaces spanned by nested T-meshes are also nested. To span spaces with these properties using the proposed rules, the T-mesh only needs to be a 0-balanced quadtree or octree.
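
As a small illustration of what a blending function defined by a local knot vector is (the standard Cox-de Boor recursion, not the inference rules proposed in the paper), the sketch below evaluates one cubic spline blending function; the uniform local knot vector is an assumption chosen only for the example.

import numpy as np

def bspline_basis(knots, degree, t):
    """Evaluate the single B-spline blending function defined by the local knot
    vector `knots` (length degree + 2) at parameters t, via Cox-de Boor recursion."""
    knots = np.asarray(knots, dtype=float)
    t = np.atleast_1d(np.asarray(t, dtype=float))

    def N(i, p, x):
        if p == 0:
            return ((knots[i] <= x) & (x < knots[i + 1])).astype(float)
        left = np.zeros_like(x)
        right = np.zeros_like(x)
        if knots[i + p] > knots[i]:
            left = (x - knots[i]) / (knots[i + p] - knots[i]) * N(i, p - 1, x)
        if knots[i + p + 1] > knots[i + 1]:
            right = (knots[i + p + 1] - x) / (knots[i + p + 1] - knots[i + 1]) * N(i + 1, p - 1, x)
        return left + right

    return N(0, degree, t)

# one cubic blending function on the assumed local knot vector [0, 1, 2, 3, 4]
t = np.linspace(0.0, 4.0, 9)
print(bspline_basis([0, 1, 2, 3, 4], degree=3, t=t))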

Relevance:

10.00%

Publisher:

Abstract:

[EN] We present a new strategy for constructing spline spaces over hierarchical T-meshes with a quadtree or octree subdivision scheme. The proposed technique includes some simple rules for inferring local knot vectors to define C2-continuous cubic tensor product spline blending functions. Our conjecture is that these rules yield, for a given T-mesh, a set of linearly independent spline functions with the property that spaces spanned by nested T-meshes are also nested, and therefore that the functions can reproduce cubic polynomials. To span spaces with these properties using the proposed rules, the T-mesh only needs to be 0-balanced.

Relevance:

10.00%

Publisher:

Abstract:

In this thesis I analyze higher spin field theories from a first-quantized perspective, finding in particular new equations describing complex higher spin fields on Kähler manifolds. They are studied by means of worldline path integrals and canonical quantization, in the framework of supersymmetric spinning particle theories, in order to investigate their quantum properties in both flat and curved backgrounds. For instance, by quantizing a spinning particle with one complex extended supersymmetry, I describe quantum massless (p,0)-forms and find a worldline representation for their effective action on a Kähler background, as well as exact duality relations. Interesting results are also found in the definition of the functional integral for the so-called O(N) spinning particles, which will allow the study of real higher spins on curved spaces. In the second part, I study Weyl invariant field theories using a particular mathematical framework known as tractor calculus, which makes it possible to keep Weyl covariance manifest at every step.

Relevance:

10.00%

Publisher:

Abstract:

Non-equilibrium statistical mechanics is a broad subject. Roughly speaking, it deals with systems which have not yet relaxed to an equilibrium state, with systems which are in a steady non-equilibrium state, or with more general situations. Such systems are characterized by external forcing and internal fluxes, resulting in a net production of entropy which quantifies dissipation and the extent to which, by the Second Law of Thermodynamics, time-reversal invariance is broken. In this thesis we discuss some of the mathematical structures involved in generic discrete-state-space non-equilibrium systems, which we depict with networks entirely analogous to electrical networks. We define suitable observables and derive their linear-regime relationships; we discuss a duality between external and internal observables that reverses the roles of the system and of the environment; and we show that network observables serve as constraints for a derivation of the minimum entropy production principle. We dwell on deep combinatorial aspects of linear-response determinants, which are related to spanning-tree polynomials in graph theory, and we give a geometrical interpretation of observables in terms of Wilson loops of a connection and gauge degrees of freedom. We specialize the formalism to continuous-time Markov chains, give a physical interpretation of the observables in terms of locally detailed balanced rates, prove many variants of the fluctuation theorem, and show that a well-known expression for the entropy production due to Schnakenberg follows from considerations of gauge invariance, where the gauge symmetry is related to the freedom in the choice of a prior probability distribution. As an additional topic of geometrical flavor related to continuous-time Markov chains, we discuss the Fisher-Rao geometry of non-equilibrium decay modes, showing that the Fisher matrix contains information about many aspects of non-equilibrium behavior, including non-equilibrium phase transitions and superposition of modes. We establish a sort of statistical equivalence principle and discuss the behavior of the Fisher matrix under time reversal. To conclude, we propose that geometry and combinatorics might greatly increase our understanding of non-equilibrium phenomena.
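
The entropy production expression due to Schnakenberg mentioned above admits a compact numerical illustration; the sketch below, with an arbitrary biased three-state cycle chosen only as an example, computes the stationary distribution of a continuous-time Markov chain and its steady-state entropy production rate.

import numpy as np

def stationary(Q):
    """Stationary distribution of a generator Q (Q[i, j] = rate i -> j for i != j, rows sum to 0)."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])           # solve pi Q = 0 together with sum(pi) = 1
    b = np.zeros(n + 1); b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

def entropy_production(Q):
    """sigma = 1/2 sum_{i != j} (pi_i Q_ij - pi_j Q_ji) log(pi_i Q_ij / (pi_j Q_ji))."""
    pi = stationary(Q)
    sigma = 0.0
    n = Q.shape[0]
    for i in range(n):
        for j in range(n):
            if i != j and Q[i, j] > 0 and Q[j, i] > 0:
                J = pi[i] * Q[i, j] - pi[j] * Q[j, i]
                sigma += 0.5 * J * np.log((pi[i] * Q[i, j]) / (pi[j] * Q[j, i]))
    return sigma

# toy three-state cycle whose bias breaks detailed balance (hypothetical rates)
w = np.array([[0.0, 2.0, 0.5],
              [0.5, 0.0, 2.0],
              [2.0, 0.5, 0.0]])
Q = w - np.diag(w.sum(axis=1))
print(entropy_production(Q))                   # strictly positive for the biased cycle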

Relevance:

10.00%

Publisher:

Abstract:

By using a symbolic method, known in the literature as the classical umbral calculus, a symbolic representation of Lévy processes is given and a new family of time-space harmonic polynomials with respect to such processes, which includes and generalizes the exponential complete Bell polynomials, is introduced. The usefulness of time-space harmonic polynomials with respect to Lévy processes lies in the fact that the stochastic process obtained by replacing the indeterminate x of the polynomials with a Lévy process is a martingale, whereas the Lévy process itself does not necessarily have this property. Finding such polynomials can therefore be particularly meaningful for applications. This new family includes Hermite polynomials, which are time-space harmonic with respect to Brownian motion; Poisson-Charlier polynomials, with respect to Poisson processes; Laguerre and actuarial polynomials, with respect to Gamma processes; Meixner polynomials of the first kind, with respect to Pascal processes; and Euler, Bernoulli, Krawtchuk, and pseudo-Narumi polynomials, with respect to suitable random walks. The role played by cumulants is stressed and brought to light, both in the symbolic representation of Lévy processes and their infinite divisibility property, and in the generalization, via the umbral Kailath-Segall formula, of the well-known formulae giving elementary symmetric polynomials in terms of power sum symmetric polynomials. The expression of the family of time-space harmonic polynomials introduced here has some connections with the so-called moment representation of various families of multivariate polynomials. Such a moment representation is studied here for the first time in connection with the time-space harmonic property with respect to suitable symbolic multivariate Lévy processes. In particular, multivariate Hermite polynomials and their properties are studied in connection with a symbolic version of the multivariate Brownian motion, while multivariate Bernoulli and Euler polynomials are represented as powers of multivariate polynomials which are time-space harmonic with respect to suitable multivariate Lévy processes.
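
A standard worked example of the time-space harmonic property described above (textbook material, not a result specific to this work): a family P_n(x,t) is time-space harmonic for a Lévy process (X_t) exactly when

\mathbb{E}\bigl[ P_n(X_t, t) \,\big|\, \mathcal{F}_s \bigr] = P_n(X_s, s), \qquad 0 \le s \le t,

and for standard Brownian motion the Hermite-type polynomials arise as coefficients of the exponential martingale,

\exp\!\bigl( \theta x - \tfrac{1}{2}\theta^{2} t \bigr) = \sum_{n \ge 0} \frac{\theta^{n}}{n!}\, P_n(x,t),
\qquad P_1(x,t) = x, \quad P_2(x,t) = x^{2} - t, \quad P_3(x,t) = x^{3} - 3tx.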

Relevance:

10.00%

Publisher:

Abstract:

This thesis provides efficient and robust algorithms for the computation of the intersection curve between a torus and a simple surface (e.g. a plane, a natural quadric, or another torus), based on algebraic and numerical methods. The algebraic part includes the classification of the topological type of the intersection curve and the detection of degenerate situations such as embedded conic sections and singularities. Moreover, reference points for each connected intersection curve component are determined. The required computations are realised efficiently, by solving polynomials of degree at most four, and exactly, by using exact arithmetic. The numerical part includes algorithms for tracing each intersection curve component, starting from the previously computed reference points. Using interval arithmetic, accidental errors such as jumping between branches or skipping parts of the curve are prevented. Furthermore, the neighbourhoods of singularities are treated correctly. Our algorithms are complete in the sense that any kind of input can be handled, including degenerate and singular configurations. They are verified, since the results are topologically correct and approximate the real intersection curve up to any given error bound. The algorithms are robust, since no human intervention is required, and they are efficient in that the treatment of algebraic equations of high degree is avoided.
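
The tracing step can be pictured with a minimal numerical sketch (not the thesis algorithm, which additionally relies on exact and interval arithmetic to guarantee correctness): march along the tangent given by the cross product of the two surface gradients and correct back onto both surfaces with a least-squares Newton step. The torus radii, the plane, the seed point and the step size below are illustrative assumptions.

import numpy as np

R, r = 2.0, 0.5                                 # torus radii (illustrative)
n, d = np.array([0.0, 0.3, 1.0]), 0.2           # plane n . x = d (illustrative)

def f(p):                                       # implicit torus: (sqrt(x^2 + y^2) - R)^2 + z^2 - r^2
    x, y, z = p
    return (np.hypot(x, y) - R) ** 2 + z ** 2 - r ** 2

def g(p):                                       # implicit plane
    return n @ p - d

def grad(F, p, h=1e-6):                         # central-difference gradient
    e = np.eye(3)
    return np.array([(F(p + h * e[i]) - F(p - h * e[i])) / (2 * h) for i in range(3)])

def correct(p, iters=10):
    """Project a point back onto f = 0 and g = 0 with least-squares Newton steps."""
    for _ in range(iters):
        J = np.vstack([grad(f, p), grad(g, p)])          # 2 x 3 Jacobian
        res = np.array([f(p), g(p)])
        dp, *_ = np.linalg.lstsq(J, -res, rcond=None)    # minimal-norm update
        p = p + dp
    return p

def trace(p0, step=0.05, n_steps=200):
    p = correct(np.asarray(p0, dtype=float))
    pts = [p]
    for _ in range(n_steps):
        t = np.cross(grad(f, p), grad(g, p))             # tangent of the intersection curve
        p = correct(p + step * t / np.linalg.norm(t))    # predictor + corrector
        pts.append(p)
    return np.array(pts)

curve = trace([R + r, 0.0, 0.0])                # seed near the outer equator (illustrative)
print(curve[:3])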

Relevance:

10.00%

Publisher:

Abstract:

The Standard Model of particle physics, which describes three of the four fundamental interactions, has so far agreed very well with the measurements of the experiments at CERN, Fermilab and other research facilities. However, not all questions of particle physics can be answered within this model. For example, the fourth fundamental force, gravity, cannot be incorporated into the Standard Model. Moreover, the Standard Model provides no candidate for dark matter, which according to cosmological measurements makes up about 25% of our universe. Supersymmetry, which introduces a symmetry between fermions and bosons, is regarded as one of the most promising solutions to these open questions. This model gives rise to so-called supersymmetric particles, each of which is assigned a Standard Model particle as its partner. If supersymmetry is realized in nature, one possible model of this symmetry is the R-parity-conserving mSUGRA model. In this model the lightest supersymmetric particle (LSP) is neutral and weakly interacting, so it cannot be detected directly in the detector but must be detected indirectly through the energy carried away by the LSP, the missing transverse energy (etmiss). In 2010 the ATLAS experiment will begin the search for new physics at the pp collider LHC, at a centre-of-mass energy of sqrt(s) = 7-10 TeV and a luminosity of 10^32 cm^-2 s^-1. Because of the very high data rate, resulting from the roughly 10^8 readout channels of the ATLAS detector at a bunch-crossing rate of 40 MHz, a trigger system is required to reduce the amount of data to be stored. A compromise must be found between the available trigger rate and a very high trigger efficiency for the interesting events, since only about one in 10^8 events is interesting for the search for new physics. To meet these requirements the experiment uses a three-level trigger system, in which by far the largest data reduction takes place at the first trigger level. Within this thesis, on the one hand, a substantial contribution is made to the fundamental understanding of the properties of the missing transverse energy at the first trigger level. On the other hand, methods are presented with which the etmiss trigger efficiency can be determined from data for Standard Model processes and possible mSUGRA scenarios. For the optimization of the etmiss trigger thresholds at the first trigger level, the trigger rate was fixed at 100 Hz for a luminosity of 10^33 cm^-2 s^-1. The trigger optimization required various simulations, into which the author's own development work has gone. Using these simulations and the optimization algorithms developed here, it is shown that, despite the low trigger rate, combining the etmiss threshold with lepton or jet trigger thresholds increases the discovery potential (for a signal significance of at least 5 sigma) by up to 66% compared with the existing ATLAS first-level trigger menu.
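
The data-driven determination of the etmiss trigger efficiency mentioned above can be illustrated schematically (a generic turn-on measurement, not the procedure of the thesis): on a sample selected by an independent trigger, the efficiency is the fraction of events passing the etmiss trigger in bins of the offline missing transverse energy. All thresholds and the toy spectrum below are hypothetical.

import numpy as np

def turn_on_curve(met_offline, fired, bins):
    """Per-bin trigger efficiency with simple binomial uncertainties.

    met_offline: offline missing transverse energy [GeV] from an orthogonal-trigger sample
    fired:       boolean array, True if the etmiss trigger accepted the event
    """
    total, _ = np.histogram(met_offline, bins=bins)
    passed, _ = np.histogram(met_offline[fired], bins=bins)
    eff = np.divide(passed, total, out=np.zeros(len(total)), where=total > 0)
    err = np.sqrt(np.maximum(eff * (1.0 - eff), 0.0) / np.maximum(total, 1))
    centres = 0.5 * (bins[:-1] + bins[1:])
    return centres, eff, err

# toy spectrum and a smeared 30 GeV online threshold (hypothetical numbers)
rng = np.random.default_rng(0)
met = rng.exponential(40.0, size=20000)
fired = rng.normal(met, 8.0) > 30.0             # online MET modelled as a smeared offline MET
x, eff, err = turn_on_curve(met, fired, bins=np.linspace(0.0, 150.0, 31))
for xi, ei in zip(x[::5], eff[::5]):
    print(f"MET ~ {xi:5.1f} GeV   efficiency ~ {ei:.2f}")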

Relevance:

10.00%

Publisher:

Abstract:

In the present dissertation we consider Feynman integrals in the framework of dimensional regularization. As all such integrals can be expressed in terms of scalar integrals, we focus on this latter kind of integral in its Feynman parametric representation and study its mathematical properties, partially applying graph theory, algebraic geometry and number theory. The three main topics are the graph-theoretic properties of the Symanzik polynomials, the termination of the sector decomposition algorithm of Binoth and Heinrich, and the arithmetic nature of the Laurent coefficients of Feynman integrals.

The integrand of an arbitrary dimensionally regularised scalar Feynman integral can be expressed in terms of the two well-known Symanzik polynomials. We give a detailed review of the graph-theoretic properties of these polynomials. Due to the matrix-tree theorem, the first of these polynomials can be constructed from the determinant of a minor of the generic Laplacian matrix of a graph. By use of a generalization of this theorem, the all-minors matrix-tree theorem, we derive a new relation which furthermore relates the second Symanzik polynomial to the Laplacian matrix of a graph.

Starting from the Feynman parametric representation, the sector decomposition algorithm of Binoth and Heinrich serves for the numerical evaluation of the Laurent coefficients of an arbitrary Feynman integral in the Euclidean momentum region. This widely used algorithm contains an iterated step, consisting of an appropriate decomposition of the domain of integration and the deformation of the resulting pieces. This procedure leads to a disentanglement of the overlapping singularities of the integral. By giving a counter-example we exhibit the problem that this iterative step of the algorithm does not terminate in every possible case. We solve this problem by presenting an appropriate extension of the algorithm, which is guaranteed to terminate. This is achieved by mapping the iterative step to an abstract combinatorial problem, known as Hironaka's polyhedra game. We present a publicly available implementation of the improved algorithm. Furthermore, we explain the relationship of the sector decomposition method with the resolution of singularities of a variety, given by a sequence of blow-ups, in algebraic geometry.

Motivated by the connection between Feynman integrals and topics of algebraic geometry, we consider the set of periods as defined by Kontsevich and Zagier. This special set of numbers contains the set of multiple zeta values and certain values of polylogarithms, which in turn are known to be present in results for Laurent coefficients of certain dimensionally regularized Feynman integrals. By use of the extended sector decomposition algorithm we prove a theorem which implies that the Laurent coefficients of an arbitrary Feynman integral are periods if the masses and kinematical invariants take values in the Euclidean momentum region. The statement is formulated for an even more general class of integrals, allowing for an arbitrary number of polynomials in the integrand.
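
The matrix-tree construction of the first Symanzik polynomial described above can be sketched in a few lines of computer algebra; the convention U = sum over spanning trees T of the product of the Feynman parameters of the edges not in T is assumed, and the one-loop bubble at the end serves only as a check.

import sympy as sp

def first_symanzik(n_vertices, edges):
    """First Symanzik polynomial of a graph with edges given as (u, v) vertex pairs."""
    xs = sp.symbols(f"x1:{len(edges) + 1}")
    L = sp.zeros(n_vertices, n_vertices)
    for (u, v), x in zip(edges, xs):
        w = 1 / x                                # weight 1/x_e so the cofactor sums prod_{e in T} 1/x_e
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    cofactor = L[1:, 1:].det()                   # delete one row and column (any choice works)
    return sp.expand(sp.cancel(sp.prod(xs) * cofactor)), xs

# one-loop bubble: two vertices joined by two propagators, expect U = x1 + x2
U, xs = first_symanzik(2, [(0, 1), (0, 1)])
print(U)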

Relevance:

10.00%

Publisher:

Abstract:

In this thesis, my work within the Compact Muon Solenoid (CMS) experiment on the search for the neutral Minimal Supersymmetric Standard Model (MSSM) Higgs boson decaying into two muons is presented. The search is performed on the full dataset collected during the years 2011 and 2012 by CMS in proton-proton collisions at the CERN Large Hadron Collider (LHC). The MSSM is explored within the most conservative benchmark scenario, m_h^{max}, and within its modified versions, m_h^{mod+} and m_h^{mod-}. The search is sensitive to MSSM Higgs boson production in association with a b\bar{b} quark pair and to the gluon-gluon fusion process. In the m_h^{max} scenario, the results exclude values of tan(beta) larger than 15 in the m_A range 115-200 GeV, and values of tan(beta) greater than 30 in the m_A range up to 300 GeV. There are no significant differences among the results obtained within the three scenarios considered. Comparisons with other neutral MSSM Higgs searches are shown.

Relevance:

10.00%

Publisher:

Abstract:

The diameters of traditional dish concentrators can reach several tens of meters, and the construction of monolithic mirrors is difficult at these scales: cheap flat reflecting facets mounted on a common frame generally reproduce a paraboloidal surface. When a standard imaging mirror is coupled with a PV dense array, problems arise because the focused solar image is intrinsically circular. Moreover, the corresponding irradiance distribution is bell-shaped, in contrast with the requirement of having all the cells under the same illumination. Mismatch losses occur when interconnected cells experience different conditions, in particular in series connections. In this PhD thesis we aim at solving these issues with a multidisciplinary approach, exploiting optical concepts and applications developed specifically for astronomical use, where the improvement of image quality is a very important issue. The strategy we propose is to improve the spot uniformity by acting solely on the primary reflector, avoiding the segmentation of the large mirror into numerous smaller elements that would need to be accurately mounted and aligned. In the proposed method, the shape of the mirror is described analytically by Zernike polynomials, and its optimization is carried out numerically to obtain a non-imaging optic able to produce a quasi-square spot, spatially uniform and with a prescribed concentration level. The freeform primary optic leads to a substantial gain in efficiency without secondary optics. Only simple electrical schemes for the receiver are required. The concept has been investigated theoretically by modeling an example of a CPV dense array application, including the development of non-optical aspects such as the design of the detector and of the supporting mechanics. For the method proposed and the specific CPV system described, a patent application has been filed in Italy with the number TO2014A000016. The patent has been developed thanks to the collaboration between the University of Bologna and INAF (National Institute for Astrophysics).
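
The surface description by Zernike polynomials mentioned above can be sketched with the standard definition (the usual textbook form, not the optimisation of the thesis); the indices and coefficients in the example are illustrative.

import math
import numpy as np

def zernike_radial(n, m, rho):
    """Radial Zernike polynomial R_n^m(rho) for 0 <= |m| <= n with n - |m| even."""
    m = abs(m)
    if (n - m) % 2:
        return np.zeros_like(rho)
    R = np.zeros_like(rho, dtype=float)
    for k in range((n - m) // 2 + 1):
        c = ((-1) ** k * math.factorial(n - k)
             / (math.factorial(k)
                * math.factorial((n + m) // 2 - k)
                * math.factorial((n - m) // 2 - k)))
        R += c * rho ** (n - 2 * k)
    return R

def zernike(n, m, rho, theta):
    """Unnormalised Zernike polynomial Z_n^m on the unit disk."""
    ang = np.cos(m * theta) if m >= 0 else np.sin(-m * theta)
    return zernike_radial(n, m, rho) * ang

# sag of a freeform surface as a short sum of Zernike terms (illustrative coefficients)
rho, theta = np.meshgrid(np.linspace(0.0, 1.0, 101), np.linspace(0.0, 2.0 * np.pi, 181))
sag = 1.0 * zernike(2, 0, rho, theta) + 0.05 * zernike(4, 0, rho, theta) + 0.02 * zernike(4, 4, rho, theta)
print(sag.shape, float(sag.max()))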

Relevance:

10.00%

Publisher:

Abstract:

The aim of this dissertation is to improve the knowledge of knots and links in lens spaces. If the lens space L(p,q) is defined as a 3-ball with suitable boundary identifications, then a link in L(p,q) can be represented by a disk diagram, i.e. a regular projection of the link on a disk. In this context, we obtain a complete finite set of Reidemeister-type moves establishing equivalence up to ambient isotopy. Moreover, the connections of this new diagram with both grid and band diagrams for links in lens spaces are shown. A Wirtinger-type presentation for the group of the link and a diagrammatic method giving the first homology group are described. A class of twisted Alexander polynomials for links in lens spaces is computed, showing its correlation with the Reidemeister torsion. One of the most important geometric invariants of a link L in L(p,q) is its lift to the 3-sphere, that is, the preimage of L under the universal covering of L(p,q). Starting from the disk diagram of the link, we obtain a diagram of the lift in the 3-sphere. Using this construction it is possible to find different knots and links in L(p,q) having equivalent lifts; hence we cannot distinguish different links in lens spaces from their lift alone. The two final chapters investigate whether several existing invariants for links in lens spaces are essential, i.e. whether they may assume different values on links with equivalent lift. Namely, we consider the fundamental quandle, the group of the link, the twisted Alexander polynomials, the Kauffman bracket skein module, and an HOMFLY-PT-type invariant.
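
A standard covering-space fact may help to picture the lift construction (general topology, not a result specific to this dissertation): for a knot K in L(p,q) whose homology class is \delta \in H_1(L(p,q)) \cong \mathbb{Z}_p, the preimage under the universal p-fold cyclic covering \pi : S^3 \to L(p,q) satisfies

\#\{\text{components of } \pi^{-1}(K)\} \;=\; \gcd(\delta, p),

so, for instance, a null-homologous knot lifts to p disjoint copies, while a knot generating the homology lifts to a single knot.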

Relevance:

10.00%

Publisher:

Abstract:

After briefly discussing, in chapter one, the natural homogeneous Lie group structure induced by Kolmogorov equations, we define an intrinsic version of Taylor polynomials and Hölder spaces in chapter two. We also compare our definition with others already known in the literature. In chapter three we prove an analogue of the Taylor formula, that is, an estimate of the remainder in terms of the homogeneous metric.

Relevance:

10.00%

Publisher:

Abstract:

In my work I derive closed-form pricing formulas for volatility-based options by suitably approximating the risk-neutral density function of the volatility process. I exploit and adapt the idea behind popular techniques already employed in the context of equity options, such as Edgeworth and Gram-Charlier expansions: approximating the density of the underlying process as a sum of particular polynomials weighted by a kernel, typically a Gaussian distribution. I propose instead a Gamma kernel, to adapt the methodology to the context of volatility options. Closed-form pricing formulas for VIX vanilla options are derived, and their accuracy is tested for the Heston (1993) model as well as for the jump-diffusion SVJJ model proposed by Duffie et al. (2000).
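
The kernel-expansion idea with a Gamma kernel can be illustrated numerically using generalized Laguerre polynomials, which are orthogonal with respect to the Gamma weight; this is a generic sketch of the technique, not the closed-form formulas of the thesis, and the toy distribution, the truncation order and the strike are assumptions.

import numpy as np
from scipy.special import eval_genlaguerre, gammaln
from scipy.stats import gamma

rng = np.random.default_rng(1)
samples = rng.lognormal(mean=-1.7, sigma=0.5, size=200_000)     # toy "volatility" samples

# Gamma kernel matched to the first two sample moments
m, v = samples.mean(), samples.var()
shape, scale = m ** 2 / v, v / m
alpha = shape - 1.0                                             # Laguerre parameter of the weight x^alpha e^{-x}

def coeff(n):
    """c_n in f(x) ~ gamma_pdf(x) * sum_n c_n L_n^{(alpha)}(x / scale)."""
    norm = np.exp(gammaln(n + 1) + gammaln(alpha + 1) - gammaln(n + alpha + 1))
    return norm * eval_genlaguerre(n, alpha, samples / scale).mean()

N = 6
c = [coeff(n) for n in range(N)]

x = np.linspace(1e-4, 1.5, 2000)
density = gamma.pdf(x, a=shape, scale=scale) * sum(c[n] * eval_genlaguerre(n, alpha, x / scale) for n in range(N))

K = 0.2                                                         # hypothetical strike on the volatility level
dx = x[1] - x[0]
price_expansion = np.sum(np.maximum(x - K, 0.0) * density) * dx
price_mc = np.maximum(samples - K, 0.0).mean()
print(price_expansion, price_mc)                                # the two estimates should be close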

Relevance:

10.00%

Publisher:

Abstract:

In 2011 the ATLAS experiment at the Large Hadron Collider recorded a dataset of 4.7 inverse femtobarns at a centre-of-mass energy of 7 TeV. Part of the extensive physics programme of the ATLAS experiment is the search for physics beyond the Standard Model. Supersymmetry, a new symmetry between bosons and fermions, is regarded as the most promising candidate for new physics, and numerous direct and indirect searches for supersymmetry have already been carried out over the past decades. In this thesis a direct search for supersymmetry is performed in final states with jets, missing transverse energy, and exactly one electron or muon. The analysed dataset of 4.7 inverse femtobarns comprises the entire data recorded by the ATLAS experiment at a centre-of-mass energy of 7 TeV. The results of the analysis are combined with several other leptonic search channels to maximise the sensitivity to various supersymmetric production and decay modes. The measured data are compatible with the Standard Model expectation, and new exclusion limits in various supersymmetric models are calculated.