961 results for Set covering theory
Abstract:
The production of a W boson decaying to eν or μν in association with a W or Z boson decaying to two jets is studied using 4.6 fb−1 of proton-proton collision data at √s = 7 TeV recorded with the ATLAS detector at the LHC. The combined WW+WZ cross section is measured with a significance of 3.4σ and is found to be 68±7 (stat.)±19 (syst.) pb, in agreement with the Standard Model expectation of 61.1±2.2 pb. The distribution of the transverse momentum of the dijet system is used to set limits on anomalous contributions to the triple gauge coupling vertices and on parameters of an effective-field-theory model.
Abstract:
A search for a massive W′ gauge boson is performed with the ATLAS detector at the LHC in pp collisions at a centre-of-mass energy of √s = 8 TeV, corresponding to 20.3 fb−1 of integrated luminosity. This analysis is done in the W′→tb→qqbb mode for W′ masses above 1.5 TeV, where the W′ decay products are highly boosted. Novel jet substructure techniques are used to identify jets from high-momentum top quarks to ensure high sensitivity, independent of W′ mass, up to 3 TeV; b-tagging is also used to identify jets originating from b-quarks. The data are consistent with Standard Model background-only expectations, and upper limits at 95% confidence level are set on the W′→tb cross section times branching ratio, ranging from 0.16 pb to 0.33 pb for left-handed W′ bosons and from 0.10 pb to 0.21 pb for W′ bosons with purely right-handed couplings. Upper limits at 95% confidence level are set on the W′-boson coupling to tb as a function of the W′ mass using an effective field theory approach, which is independent of the details of particular models predicting a W′ boson.
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management.
Abstract:
The computation of the optical conductivity of strained and deformed graphene is discussed within the framework of quantum field theory in curved spaces. The analytical solutions of the Dirac equation in an arbitrary static background geometry for one-dimensional periodic deformations are computed, together with the corresponding Dirac propagator. Analytical expressions are given for the optical conductivity of strained and deformed graphene associated with both intra- and interband transitions. The special case of small deformations is discussed and the result compared to the prediction of the tight-binding model.
Abstract:
A modified version of the metallic-phase pseudofermion dynamical theory (PDT) of the 1D Hubbard model is introduced for the spin dynamical correlation functions of the half-filled 1D Hubbard model Mott–Hubbard phase. The Mott–Hubbard insulator phase PDT is applied to the study of the model's longitudinal and transverse spin dynamical structure factors at finite magnetic field h, focusing in particular on the singularities at excitation energies in the vicinity of the lower thresholds. The relation of our theoretical results to both condensed-matter and ultra-cold-atom systems is discussed.
Abstract:
Measurements of differential cross sections for J/ψ production in p+Pb collisions at √s_NN = 5.02 TeV at the CERN Large Hadron Collider with the ATLAS detector are presented. The data set used corresponds to an integrated luminosity of 28.1 nb−1. The J/ψ mesons are reconstructed in the dimuon decay channel over the transverse momentum range 8
Abstract:
We study the low frequency absorption cross section of spherically symmetric nonextremal d-dimensional black holes. In the presence of α′ corrections, this quantity must have an explicit dependence on the Hawking temperature of the form 1/T_H. This property of the low frequency absorption cross section is shared by the D1-D5 system from type IIB superstring theory already at the classical level, without α′ corrections. We apply our formula to the simplest example, the classical d-dimensional Reissner-Nordström solution, checking that the obtained formula for the cross section has a smooth extremal limit. We also apply it to a d-dimensional Tangherlini-like solution with α′³ corrections.
Abstract:
We analyze the low frequency absorption cross section of minimally coupled massless scalar fields by different kinds of charged static black holes in string theory, namely the D1–D5 system in d=5 and a four-dimensional dyonic four-charged black hole. In each case we show that this cross section always has the form of some parameter of the solution divided by the black hole Hawking temperature. We also verify in each case that, despite its explicit temperature dependence, this quotient is finite in the extremal limit, giving a well defined cross section. We show that this precise explicit temperature dependence also arises in the same cross section for black holes with string α′ corrections: it is in fact induced by them.
Abstract:
Doctoral thesis in Philosophy (specialization in Modern and Contemporary Philosophy).
Abstract:
In this study, a mathematical model for the production of fructo-oligosaccharides (FOS) by Aureobasidium pullulans is developed. The model contains a relatively large set of unknown parameters, and the identification problem is analyzed using simulation data as well as experimental data. Batch experiments were not sufficiently informative to uniquely estimate all the unknown parameters; additional experiments therefore have to be performed in fed-batch mode to supply the missing information. © 2015 IEEE.
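The identifiability issue the abstract describes can be illustrated on a toy model. The sketch below is a minimal, hypothetical example, not the paper's FOS model: a Monod-type batch fermentation model with two unknown kinetic parameters is fitted to synthetic biomass measurements by a coarse grid search. All names and values here are illustrative assumptions.

```python
def simulate_batch(mu_max, K_s, X0=0.1, S0=10.0, dt=0.01, n_steps=1000):
    """Euler integration of a Monod-type batch model (illustrative only):
       dX/dt = mu(S)*X,  dS/dt = -mu(S)*X/Y,  mu(S) = mu_max*S/(K_s+S).
       Returns the biomass trajectory X at steps 0..n_steps."""
    Y = 0.5  # yield coefficient, assumed known
    X, S = X0, S0
    xs = [X]
    for _ in range(n_steps):
        growth = mu_max * S / (K_s + S) * X
        X, S = X + dt * growth, max(S - dt * growth / Y, 0.0)
        xs.append(X)
    return xs

def sse(params, obs):
    """Sum of squared errors between simulated and observed biomass."""
    xs = simulate_batch(*params)
    return sum((xs[i] - x_obs) ** 2 for i, x_obs in obs)

# Synthetic "measurements": sample the trajectory of the true parameters.
true_params = (0.4, 1.5)  # (mu_max, K_s)
xs_true = simulate_batch(*true_params)
obs = [(i, xs_true[i]) for i in range(0, 1001, 100)]

# Coarse grid search over (mu_max, K_s); a real identification study
# would use a proper optimiser plus an identifiability analysis, which
# is where the batch-vs-fed-batch question in the abstract arises.
grid = ((m / 100, k / 10) for m in range(10, 81) for k in range(5, 31))
best = min(grid, key=lambda p: sse(p, obs))
```

With noise-free synthetic data the grid search recovers the generating parameters; with real, noisy batch data, near-flat ridges in the error surface are what make extra fed-batch experiments necessary.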
Abstract:
Doctoral thesis in Philosophy, specialty in Philosophy of Mind.
Abstract:
NIPE WP 05/2016
Abstract:
Lipid nanoballoons integrating multiple emulsions of the water-in-oil-in-water type enclose, at least in theory, a biomimetic aqueous core suitable for housing hydrophilic biomolecules such as proteins, peptides and bacteriophage particles. The research effort reported in this paper describes a full statistical 2³×3¹ factorial design study (three variables at two levels and one variable at three levels) to optimize biomimetic aqueous-core lipid nanoballoons for housing hydrophilic protein entities. The concentrations of protein, lipophilic and hydrophilic emulsifiers, and the homogenization speed were set as the four independent variables, whereas the mean particle hydrodynamic size (HS), zeta potential (ZP) and polydispersity index (PI) were set as the dependent variables. The 2³×3¹ factorial design constructed led to optimization of the higher (+1) and lower (-1) levels, with triplicate testing for the central (0) level, thus producing thirty-three experiments and leading to selection of the optimized processing parameters as 0.015% (w/w) protein entity, 0.75% (w/w) lipophilic emulsifier (soybean lecithin) and 0.50% (w/w) hydrophilic emulsifier (poloxamer 188). In the present research effort, statistical optimization and production of protein derivatives encompassing full stabilization of their three-dimensional structure have been attempted by housing said molecular entities within biomimetic aqueous-core lipid nanoballoons integrating a multiple (W/O/W) emulsion.
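The base runs of a full factorial design like the one described (three two-level factors plus one three-level factor) can be enumerated mechanically. The sketch below generates the 24 factor-level combinations in coded units; which factor carries three levels is an assumption for illustration, and the triplicate center-point runs the study adds on top of these are omitted.

```python
from itertools import product

# Coded levels: -1/+1 for the three two-level factors (here assumed to be
# the protein, lipophilic-emulsifier and hydrophilic-emulsifier
# concentrations), -1/0/+1 for the three-level factor (assumed to be
# the homogenization speed).
two_level = (-1, 1)
three_level = (-1, 0, 1)

# Full 2^3 x 3^1 base design: every combination of coded levels.
design = list(product(two_level, two_level, two_level, three_level))

for run, levels in enumerate(design, start=1):
    print(run, levels)
```

Each row is one experiment; the response variables (HS, ZP, PI) would be measured per row and regressed against the coded levels.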
Abstract:
Among the factors that help predict academic achievement are those reflecting cognitive abilities (e.g. intelligence) and those individual differences considered non-cognitive (e.g. personality traits). In recent years, General Knowledge (GK) has also come to be considered a criterion for academic success (see Ackerman, 1997), since prior knowledge has been shown to aid the acquisition of new knowledge (Hambrick & Engle, 2001). One goal of educational psychology is to identify the main variables that explain academic achievement, as well as to propose theoretical models explaining the relations among these variables. The PPIK model (Intelligence-as-Process, Personality, Interests and Intelligence-as-Knowledge) proposed by Ackerman (1996) holds that the knowledge and skills acquired in a particular domain result from the cognitive resources a person devotes to that domain over a prolonged period of time. The model further proposes that personality traits, individual/vocational interests and motivational aspects are integrated as trait complexes that determine the direction and intensity of the cognitive resources a person devotes to learning (Ackerman, 2003). In our context (Córdoba, Argentina), a group of researchers has developed a series of technical resources needed to assess some of the constructs proposed by this model; however, no measure of General Knowledge is yet available.
This project therefore proposes the construction of an instrument to measure General Knowledge (GK), essential both for establishing parameters on the knowledge level of the university population and for testing, in future work, the postulates of PPIK theory (Ackerman, 1996).
Abstract:
This project concerns the use of formal methods (more precisely, type theory) to guarantee the absence of errors in programs. It follows three directions. 1) The design of new type-checking algorithms based on normalization by evaluation that extend to other type systems. In the near future we will extend results we obtained recently [16,17] to: simplify the work done for systems without the eta rule (two systems will be studied, à la Martin-Löf and à la PTS); formulate these checkers for systems with variables (in addition to the systems of [16,17], which are à la de Bruijn); generalize the notion of category with families used to give semantics to type theory; obtain a categorical formulation of normalization by evaluation; and, finally, apply these algorithms to systems with rewrite rules. For the first of these expected results, our method will be to adapt the proofs of [16,17] to the new systems. The importance of this line is that it will make proof assistants based on type theory more automatable, and hence easier to use. 2) The use of type theory to certify compilers, pursuing the never-explored proposal of [22] to use an abstract approach based on functor categories for Algol-like languages. The method will be to certify the language "Peal" [29] and then successively add functionality until a correct compiler for Forsythe [23] is obtained; during this period we expect to add several extensions. The importance of this line is that only a certified compiler guarantees that a correct source program is compiled to a correct object program, which is crucial for any verification process based on verifying source code. 3) The formalization of systems with session types. Several formulations of these systems have been shown to be faulty [30], which makes their formalization worthwhile. Over the course of the project we expect to produce a formalization that yields a type-checking algorithm and to prove the usual properties of such systems. The contribution is to shed some light on formulations whose errors reveal that the topic has not yet reached sufficient maturity or understanding in the community.
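The first research line relies on normalization by evaluation: interpret terms into a semantic domain, then read the values back as normal forms. As a toy illustration only (not the algorithms of [16,17], which target richer type theories), here is a minimal NbE normalizer for the untyped lambda calculus with de Bruijn indices:

```python
# Terms are tagged tuples with de Bruijn indices:
#   ("var", i) | ("lam", body) | ("app", fn, arg)

def evaluate(term, env):
    """Evaluate a term to a semantic value (closure, level, or stuck app)."""
    tag = term[0]
    if tag == "var":
        return env[term[1]]
    if tag == "lam":
        body = term[1]
        return ("closure", lambda v: evaluate(body, [v] + env))
    fn, arg = evaluate(term[1], env), evaluate(term[2], env)
    if fn[0] == "closure":
        return fn[1](arg)
    return ("stuck-app", fn, arg)  # application blocked on a free variable

def reify(value, depth):
    """Read a semantic value back into a beta-normal term."""
    if value[0] == "closure":
        fresh = ("level", depth)              # fresh variable as a de Bruijn level
        return ("lam", reify(value[1](fresh), depth + 1))
    if value[0] == "level":
        return ("var", depth - value[1] - 1)  # convert level back to an index
    _, fn, arg = value
    return ("app", reify(fn, depth), reify(arg, depth))

def normalize(term):
    """Normalization by evaluation: evaluate, then read back."""
    return reify(evaluate(term, []), 0)

# (lambda x. lambda y. x) (lambda z. z)  normalizes to  lambda y. lambda z. z
k_combinator = ("lam", ("lam", ("var", 1)))
identity = ("lam", ("var", 0))
```

A type checker for a dependently typed language uses the same evaluate/reify pair to decide definitional equality of types; the project's contribution lies in extending and verifying such algorithms, not in this untyped sketch.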