983 results for Superstrings and Heterotic Strings
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
In a recent paper, the partition function (character) of ten-dimensional pure spinor worldsheet variables was calculated explicitly up to the fifth mass-level. In this letter, we propose a novel application of Padé approximants as a tool for computing the character of pure spinors. We get results up to the twelfth mass-level. This work is a first step towards an explicit construction of the complete pure spinor partition function.
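As an illustration of the extrapolation technique (a minimal sketch with placeholder coefficients, not the actual pure spinor character; scipy's `pade` routine stands in for whatever implementation the authors used):

```python
# Sketch: extrapolating a truncated series with a Pade approximant.
# The coefficients below are placeholders, NOT the actual pure spinor character.
from scipy.interpolate import pade

# Hypothetical low mass-level coefficients of a character Z(t) = sum_k c_k t^k
coeffs = [1.0, 10.0, 65.0, 320.0, 1310.0, 4736.0]   # placeholder values

# Pade approximant with denominator degree 2; the numerator takes the remaining degrees.
p, q = pade(coeffs, 2)

def series_of_ratio(p, q, order):
    """Re-expand p(t)/q(t) as a power series up to the given order."""
    c = []
    for k in range(order + 1):
        pk = p[k] if k <= p.order else 0.0
        acc = sum(q[j] * c[k - j] for j in range(1, min(k, q.order) + 1))
        c.append((pk - acc) / q[0])
    return c

# The first len(coeffs) terms reproduce the input by construction;
# the later ones are the approximant's predictions for higher mass-levels.
print(series_of_ratio(p, q, 8))
```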
Abstract:
Using the pure spinor formalism, a quantizable sigma model has been constructed for the superstring in an AdS5 x S5 background with manifest PSU(2,2|4) invariance. The PSU(2,2|4) metric g_AB has both vector components g_ab and spinor components g_αβ, and in the limit where the spinor components g_αβ are taken to infinity, the AdS5 x S5 sigma model reduces to the worldsheet action in a flat background. In this paper, we instead consider the limit where the vector components g_ab are taken to infinity. In this limit, the AdS5 x S5 sigma model simplifies to a topological A-model constructed from fermionic N=2 superfields whose bosonic components transform like twistor variables. Just as d=3 Chern-Simons theory can be described by the open string sector of a topological A-model, the open string sector of this topological A-model describes d=4 N=4 super-Yang-Mills. These results might be useful for constructing a worldsheet proof of the Maldacena conjecture analogous to the Gopakumar-Vafa-Ooguri worldsheet proof of Chern-Simons/conifold duality.
Abstract:
Two experiments examined the claim for distinct implicit and explicit learning modes in the artificial grammar-learning task (Reber, 1967, 1989). Subjects initially attempted to memorize strings of letters generated by a finite-state grammar and then classified new grammatical and nongrammatical strings. Experiment 1 showed that subjects' assessment of isolated parts of strings was sufficient to account for their classification performance but that the rules elicited in free report were not sufficient. Experiment 2 showed that performing a concurrent random number generation task under different priorities interfered with free report and classification performance equally. Furthermore, giving different groups of subjects incidental or intentional learning instructions did not affect classification or free report.
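For readers unfamiliar with the task, a finite-state grammar of this kind can be written as a transition table from which "grammatical" letter strings are sampled; the sketch below uses an illustrative grammar, not necessarily Reber's original one:

```python
import random

# Illustrative finite-state grammar (not Reber's exact 1967 grammar).
# Each state maps to a list of (letter emitted, next state); None marks the exit.
GRAMMAR = {
    0: [("T", 1), ("V", 2)],
    1: [("P", 1), ("T", 3)],
    2: [("X", 2), ("V", 3)],
    3: [("X", 1), ("S", None)],
}

def generate_string(max_len=8):
    """Walk the grammar from the start state, emitting letters until the exit state."""
    while True:
        state, letters = 0, []
        while state is not None:
            letter, state = random.choice(GRAMMAR[state])
            letters.append(letter)
        if len(letters) <= max_len:          # keep only reasonably short items
            return "".join(letters)

# Grammatical items for the memorization phase; nongrammatical test items
# would be strings that violate these transitions.
print([generate_string() for _ in range(5)])
```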
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
We report the results of a multimessenger search for coincident signals from the LIGO and Virgo gravitational-wave observatories and the partially completed IceCube high-energy neutrino detector, including periods of joint operation between 2007-2010. These include parts of the 2005-2007 run and the 2009-2010 run for LIGO-Virgo, and IceCube's observation periods with 22, 59 and 79 strings. We find no significant coincident events, and use the search results to derive upper limits on the rate of joint sources for a range of source emission parameters. For the optimistic assumption of gravitational-wave emission energy of 10^-2 M_⊙ c^2 at ~150 Hz with ~60 ms duration, and high-energy neutrino emission of 10^51 erg comparable to the isotropic gamma-ray energy of gamma-ray bursts, we limit the source rate below 1.6 x 10^-2 Mpc^-3 yr^-1. We also examine how combining information from gravitational waves and neutrinos will aid discovery in the advanced gravitational-wave detector era.
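For readers unfamiliar with how such limits are set, a rate upper limit from a null search is typically the Poisson upper limit on the expected event count divided by the surveyed space-time volume; the sketch below uses placeholder numbers, not the values from this search:

```python
# Sketch of a zero-event Poisson rate upper limit; the volume and live time
# below are placeholders, NOT the values used in the LIGO-Virgo-IceCube search.
n_90 = 2.3            # 90% CL upper limit on the mean count for 0 observed events
volume_mpc3 = 100.0   # hypothetical effective surveyed volume [Mpc^3]
livetime_yr = 1.0     # hypothetical joint live time [yr]

rate_limit = n_90 / (volume_mpc3 * livetime_yr)
print(f"rate upper limit ~ {rate_limit:.1e} Mpc^-3 yr^-1")
```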
Abstract:
Quarks were introduced 50 years ago, opening the road towards our understanding of the elementary constituents of matter and their fundamental interactions. Since then, spectacular progress has been made, with important discoveries that led to the establishment of the Standard Theory, which accurately describes the basic constituents of observable matter, namely quarks and leptons, interacting through the exchange of three fundamental forces: the weak, electromagnetic and strong force. Particle physics is now entering a new era driven by the quest to understand the composition of our Universe, including the unobservable (dark) matter, the hierarchy of masses and forces, the unification of all fundamental interactions with gravity in a consistent quantum framework, and several other important questions. A candidate theory providing answers to many of these questions is string theory, which replaces the notion of point particles by extended objects, such as closed and open strings. In this short note, I will give a brief overview of string unification, describe in particular how quarks and leptons can emerge, and discuss possible predictions for particle physics and cosmology that could test these ideas.
Abstract:
Using the fact that the BTZ black hole is a quotient of AdS(3), we show that classical string propagation in the BTZ background is integrable. We construct the flat connection and its monodromy matrix, which generates the non-local charges. From examining the general behaviour of the eigenvalues of the monodromy matrix, we determine the set of integral equations which constrain them. These equations imply that each classical solution is characterized by a density function in the complex plane. For classical solutions which correspond to geodesics and winding strings, we solve for the eigenvalues of the monodromy matrix explicitly and show that geodesics correspond to zero density in the complex plane. We solve the integral equations for BMN and magnon-like solutions and obtain their dispersion relations. We show that the set of integral equations constraining the eigenvalues of the monodromy matrix can be identified with the continuum limit of the Bethe equations of a twisted SL(2,R) spin chain at one loop. The Landau-Lifshitz equations from the spin chain can also be identified with the sigma model equations of motion.
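For orientation, the flat connection and monodromy matrix referred to here have the following schematic form for a string on a group manifold (a generic sketch; signs, the spectral-parameter dependence, and the coset refinements relevant to the BTZ quotient may differ from the paper's conventions):

```latex
% Schematic Lax construction for a sigma model on a group manifold;
% conventions are generic, not necessarily those used for the BTZ background.
\begin{align}
  j &= g^{-1}\mathrm{d}g, \qquad
  J(\lambda) = \frac{j + \lambda\,{*}j}{1 - \lambda^{2}}, \\
  0 &= \mathrm{d}J(\lambda) + J(\lambda)\wedge J(\lambda)
      \ \ \text{for all } \lambda
  \;\Longleftrightarrow\;
  \mathrm{d}{*}j = 0 \ \text{ and } \ \mathrm{d}j + j\wedge j = 0, \\
  M(\lambda) &= \mathrm{P}\exp\!\left(\oint \mathrm{d}\sigma\, J_{\sigma}(\lambda)\right).
\end{align}
```

Flatness for every value of λ packages both the equations of motion and the Maurer-Cartan identity, and the eigenvalues of the monodromy matrix M(λ) over the closed-string circle generate the non-local charges analysed in the abstract.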
Abstract:
We dimensionally reduce the ABJM model, obtaining a two-dimensional theory that can be thought of as a 'master action'. This encodes information about both T- and S-duality, i.e. it describes fundamental strings (F1) and D-strings (D1) in 9 and 10 dimensions. The Higgsed theory at large VEV ṽ and large k yields D1-brane actions in 9d and 10d, depending on which auxiliary fields are integrated out. For N = 1 there is a map to a Green-Schwarz string wrapping a nontrivial circle in C^4/Z_k.
Abstract:
Graduate Program in Agronomy - FEIS
A simpler prescription to calculate MHV amplitudes for gravitons at tree level in superstring theory
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The Final Degree Project, Regular Expressions Web Monitor (MWRegEx), is a tool based on web technologies, developed using the Visual Studio environment. The main objective of the application is to support the teaching of regular expressions, within the framework of teaching character-string handling in the programming courses of the Degree in Computer Engineering (Grado en Ingeniería Informática). The application produces a drawing of the automaton corresponding to a regular expression, making the expression easier to understand; it also allows the expression to be applied to different character strings, showing the matches found, and offers a version of the expression adapted for use in string literals of languages such as Java and others. The tool has been implemented in two parts: a web service, written in C#, which performs all the analysis of the regular expressions and of the strings to be matched; and a web client, implemented using asp.net technology with JavaScript and jQuery, which manages the user interface and displays the results. This separation allows the web service to be reused by other client applications. The automaton representing a regular expression is drawn using the Raphaël JavaScript library, which handles SVG elements. Each element of the regular expression has a different, unique drawing so that it can be distinguished. The entire graphical user interface is internationalized so that it can be adapted to different languages and regions without requiring engineering or code changes. Both the web service and the client side are structured so that new modifications can be added without creating a ripple effect across the existing classes.
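By way of illustration only (a Python sketch, not the tool's actual C# web-service code), the two operations described above, matching a regular expression against test strings and adapting it to a source-code string literal, look roughly like this:

```python
import re

pattern = r"\d{2}-[A-Z]+"           # example regular expression
tests = ["31-ABC", "7-xy", "99-Z"]  # character strings to check

# Report the matches found in each test string, as the web client does.
for s in tests:
    print(s, re.findall(pattern, s))

# Version of the expression adapted to a string literal in Java-like languages:
# backslashes and quotes must be escaped before embedding in source code.
literal = '"' + pattern.replace("\\", "\\\\").replace('"', '\\"') + '"'
print(literal)   # -> "\\d{2}-[A-Z]+"
```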
Abstract:
SQL Injection Attack (SQLIA) remains a technique used by computer network intruders to pilfer an organisation's confidential data. An intruder re-crafts a web form's input and query strings used in web requests with malicious intent to compromise the security of the organisation's confidential data stored in the back-end database. The database is the most valuable data source, and thus intruders are unrelenting in evolving new techniques to bypass the signature-based solutions currently provided in Web Application Firewalls (WAF) to mitigate SQLIA. There is therefore a need for an automated, scalable methodology for pre-processing SQLIA features fit for a supervised learning model. However, obtaining a ready-made, scalable dataset whose items are feature-engineered into numerical attributes for training Artificial Neural Network (ANN) and Machine Learning (ML) models is a known issue in applying artificial intelligence to effectively address ever-evolving novel SQLIA signatures. The proposed approach applies a numerical-attribute encoding ontology to encode features (both legitimate web requests and SQLIA) as numerical data items, so as to extract a scalable dataset for input to a supervised learning model, moving towards an ML SQLIA detection and prevention model. In the numerical-attribute encoding of features, the proposed model explores a hybrid of static and dynamic pattern matching by implementing a Non-Deterministic Finite Automaton (NFA). This is combined with a proxy and a SQL parser Application Programming Interface (API) to intercept and parse web requests in transit to the back-end database. In developing a solution to address SQLIA, this model allows web requests processed at the proxy and deemed to contain an injected query string to be excluded from reaching the target back-end database. This paper evaluates the performance metrics of a dataset obtained by numerical encoding of the features ontology in Microsoft Azure Machine Learning (MAML) studio using a Two-Class Support Vector Machine (TCSVM) binary classifier. This methodology then forms the subject of the empirical evaluation.
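To make the pipeline concrete, here is a minimal sketch of numerical feature encoding followed by a binary SVM, using scikit-learn's SVC as a stand-in for the Two-Class Support Vector Machine in MAML studio; the token features and example requests are illustrative, not the paper's encoding ontology:

```python
# Toy numerical encoding of web-request strings plus a binary SVM classifier.
# The features and example requests are illustrative; the paper's ontology-based
# encoding and MAML TCSVM setup are not reproduced here.
from sklearn.svm import SVC

SQL_TOKENS = ("'", "--", " or ", " union ", ";", "=")

def encode(request: str):
    """Map a raw query string to numerical attributes (token counts + length)."""
    low = request.lower()
    return [low.count(t) for t in SQL_TOKENS] + [len(request)]

legit = ["id=42", "name=alice", "page=3&sort=asc"]
sqlia = ["id=42' or '1'='1", "name=x'; drop table users;--", "id=1 union select *"]

X = [encode(r) for r in legit + sqlia]
y = [0] * len(legit) + [1] * len(sqlia)   # 0 = legitimate, 1 = SQLIA

clf = SVC(kernel="linear").fit(X, y)
# Likely [0 1] on this small, well-separated toy data.
print(clf.predict([encode("id=7"), encode("id=7' or '1'='1")]))
```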