965 results for Algebra, Boolean.
Abstract:
Formal correctness of complex multi-party network protocols can be difficult to verify. While models of specific fixed compositions of agents can be checked against design constraints, protocols that admit arbitrarily many compositions of agents, such as the chaining of proxies or the peering of routers, are more difficult to verify because they represent potentially infinite state spaces and may exhibit emergent behaviors that do not materialize under particular fixed compositions. We address this challenge by developing an algebraic approach that enables us to reduce arbitrary compositions of network agents into a behaviorally-equivalent (with respect to some correctness property), compact, canonical representation, which is amenable to mechanical verification. Our approach consists of an algebra and a set of property-preserving rewrite rules for the Canonical Homomorphic Abstraction of Infinite Network protocol compositions (CHAIN). Using CHAIN, an expression over our algebra (i.e., a set of configurations of network protocol agents) can be reduced to another behaviorally-equivalent expression (i.e., a smaller set of configurations). Repeated application of such rewrite rules produces a canonical expression which can be checked mechanically. We demonstrate our approach by characterizing deadlock-prone configurations of HTTP agents, as well as establishing useful properties of an overlay protocol for scheduling MPEG frames and of a protocol for Web intra-cache consistency.
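The reduction idea can be illustrated with a toy rewrite system. The agent names and the single rule below ("two adjacent proxies behave like one") are invented for illustration only; CHAIN's actual rules are property-preserving and established in the paper.

```python
# Toy sketch of rewrite-rule reduction toward a canonical form.
# Agent names and the rule are hypothetical, not CHAIN's real rules.

def rewrite_once(chain):
    """Apply the first matching rule; return (new_chain, changed_flag)."""
    for i in range(len(chain) - 1):
        # Hypothetical rule: Proxy . Proxy  ~  Proxy (behavioral equivalence)
        if chain[i] == "Proxy" and chain[i + 1] == "Proxy":
            return chain[:i] + ["Proxy"] + chain[i + 2:], True
    return chain, False

def canonicalize(chain):
    """Apply rewrite rules to a fixpoint, yielding a canonical expression."""
    changed = True
    while changed:
        chain, changed = rewrite_once(chain)
    return chain

config = ["Client", "Proxy", "Proxy", "Proxy", "Server"]
print(canonicalize(config))  # ['Client', 'Proxy', 'Server']
```

Any chain of proxies, however long, reduces to the same small configuration, which is what makes mechanical checking of the canonical form feasible.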
Abstract:
In our previous work, we developed TRAFFIC(X), a specification language for modeling bi-directional network flows featuring a type system with constrained polymorphism. In this paper, we present two ways to customize the constraint system: (1) when using linear inequality constraints for the constraint system, TRAFFIC(X) can describe flows with numeric properties such as MTU (maximum transmission unit), RTT (round trip time), traversal order, and bandwidth allocation over parallel paths; (2) when using Boolean predicate constraints for the constraint system, TRAFFIC(X) can describe routing policies of an IP network. These examples illustrate how to use the customized type system.
Abstract:
We consider a fault model of Boolean gates, both classical and quantum, in which some of the inputs may not be connected to the actual gate hardware. This model is somewhat similar to the stuck-at model that is widely used in testing Boolean circuits. We consider the problem of detecting such faults; the detection algorithm can query the faulty gate, and its complexity is the number of such queries. This problem is related to determining the sensitivity of Boolean functions. We show how quantum parallelism can be used to detect such faults. Specifically, we show that a quantum algorithm can detect such faults more efficiently than a classical algorithm for a Parity gate and an AND gate. We give explicit constructions of quantum detector algorithms and show lower bounds for classical algorithms. We show that the model for detecting such faults is similar to algebraic decision trees, and we extend some known results from quantum query complexity to prove some of our results.
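A classical baseline for the Parity gate can be sketched as follows. We model a disconnected wire as reading a fixed 0 (one variant of the fault model; the abstract's model is more general), and probe the gate on each unit vector: a healthy parity gate outputs 1 on every unit vector, while a disconnected input yields 0. This uses n classical queries; the paper's point is that a quantum algorithm detects the fault with fewer.

```python
# Classical detection sketch for disconnected inputs on an n-input PARITY
# gate. Disconnected wires are assumed to read 0 (an illustrative choice).

def make_parity_gate(n, disconnected=frozenset()):
    """Return a parity gate that ignores the inputs in `disconnected`."""
    def gate(bits):
        return sum(b for i, b in enumerate(bits) if i not in disconnected) % 2
    return gate

def detect_faults(gate, n):
    """Probe each unit vector e_i; input i is dead iff gate(e_i) == 0."""
    faults = set()
    for i in range(n):
        e_i = [1 if j == i else 0 for j in range(n)]
        if gate(e_i) == 0:  # a healthy parity gate outputs 1 on e_i
            faults.add(i)
    return faults            # n classical queries in total

gate = make_parity_gate(4, disconnected=frozenset({2}))
print(detect_faults(gate, 4))  # {2}
```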
Abstract:
Existing type systems for object calculi are based on invariant subtyping. Subtyping invariance is required for soundness of static typing in the presence of method overrides, but it often limits the expressive power of the type system. Flexibility of static typing can be recovered in different ways: in first-order systems, by adopting object types with variance annotations; in second-order systems, by resorting to Self types. Type inference is known to be P-complete for first-order systems of finite and recursive object types, and NP-complete for a restricted version of Self types. The complexity of type inference for systems with variance annotations is yet unknown. This paper presents a new object type system based on the notion of Split types, a form of object types in which every method is assigned two types, namely an update type and a select type. The subtyping relation that arises for Split types is variant and, as a result, subtyping can be performed both in width and in depth. The new type system generalizes all the existing first-order type systems for objects, including systems based on variance annotations. Interestingly, the additional expressive power does not affect the complexity of the type inference problem, as we show by presenting an O(n^3) inference algorithm.
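The width/depth subtyping that the two-type split enables can be illustrated with a toy checker: select types are covariant (reading yields at least what the supertype promises), update types are contravariant (writing accepts at least what the supertype requires), and extra methods may be dropped. The base-type lattice and object encoding below are invented for the example and are not the paper's formal system.

```python
# Toy subtyping check for Split-style object types: each method maps to an
# (update_type, select_type) pair. Base types and objects are hypothetical.

BASE_SUB = {("Int", "Int"), ("Real", "Real"), ("Int", "Real")}  # Int <: Real

def base_sub(s, t):
    return (s, t) in BASE_SUB

def split_sub(a, b):
    """a <: b for objects mapping method -> (update_type, select_type)."""
    for m, (upd_b, sel_b) in b.items():
        if m not in a:                   # width: a may have extra methods
            return False
        upd_a, sel_a = a[m]
        if not base_sub(sel_a, sel_b):   # select types: covariant (depth)
            return False
        if not base_sub(upd_b, upd_a):   # update types: contravariant
            return False
    return True

point = {"x": ("Int", "Int"), "move": ("Int", "Int")}
readable_point = {"x": ("Int", "Real")}  # reads x only, as a Real
print(split_sub(point, readable_point))  # True: width and depth combined
```

With a single invariant type per method, the depth step (`Int` select seen as `Real`) would be rejected, which is exactly the flexibility the split recovers.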
Abstract:
Much work has been done on learning from failure in search to boost solving of combinatorial problems, such as clause-learning and clause-weighting in boolean satisfiability (SAT), nogood and explanation-based learning, and constraint weighting in constraint satisfaction problems (CSPs). Many of the top solvers in SAT use clause learning to good effect. A similar approach (nogood learning) has not had as large an impact in CSPs. Constraint weighting is a less fine-grained approach where the information learnt gives an approximation as to which variables may be the sources of greatest contention. In this work we present two methods for learning from search using restarts, in order to identify these critical variables prior to solving. Both methods are based on the conflict-directed heuristic (weighted-degree heuristic) introduced by Boussemart et al. and are aimed at producing a better-informed version of the heuristic by gathering information through restarting and probing of the search space prior to solving, while minimizing the overhead of these restarts. We further examine the impact of different sampling strategies and different measurements of contention, and assess different restarting strategies for the heuristic. Finally, two applications for constraint weighting are considered in detail: dynamic constraint satisfaction problems and unary resource scheduling problems.
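The weighted-degree idea underlying both methods can be sketched briefly: every constraint starts with weight 1, the weight is incremented each time the constraint causes a failure during (restarted) probing, and variables are then ordered by smallest dom/wdeg. The data structures below are invented for illustration; they are not the authors' implementation.

```python
# Sketch of the weighted-degree (wdeg) heuristic with failure-driven
# weight updates. Constraint and variable names are hypothetical.
from collections import defaultdict

weights = defaultdict(lambda: 1)     # constraint -> weight, initially 1

def on_failure(constraint):
    """Called whenever propagating `constraint` wipes out a domain."""
    weights[constraint] += 1

def wdeg(var, constraints_of):
    """Sum of the weights of the constraints involving `var`."""
    return sum(weights[c] for c in constraints_of[var])

def pick_variable(domains, constraints_of):
    """dom/wdeg: prefer small domains attached to high-weight constraints."""
    return min(domains, key=lambda v: len(domains[v]) / wdeg(v, constraints_of))

constraints_of = {"x": ["c1", "c2"], "y": ["c2"]}
domains = {"x": {1, 2, 3}, "y": {1, 2}}
on_failure("c1"); on_failure("c1")   # c1 proved contentious while probing
print(pick_variable(domains, constraints_of))  # 'x': 3/4 beats 2/1
```

Probing with restarts, as the abstract describes, amounts to accumulating these weights across several short runs before the real search begins, so the first branching decisions already target the contentious variables.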
Abstract:
We prove that the first complex homology of the Johnson subgroup of the Torelli group T_g is a non-trivial, unipotent T_g-module for all g ≥ 4 and give an explicit presentation of it as a Sym H_1(T_g, C)-module when g ≥ 6. We do this by proving that, for a finitely generated group G satisfying an assumption close to formality, the triviality of the restricted characteristic variety implies that the first homology of its Johnson kernel is a nilpotent module over the corresponding Laurent polynomial ring, isomorphic to the infinitesimal Alexander invariant of the associated graded Lie algebra of G. In this setup, we also obtain a precise nilpotence test. © European Mathematical Society 2014.
Abstract:
Using BRST-cohomological techniques, we analyze the consistent deformations of theories describing free tensor gauge fields whose symmetries are represented by Young tableaux made of two columns of equal length p, p > 1. Under the assumptions of locality and Poincaré invariance, we find that there is no consistent deformation of these theories that non-trivially modifies the gauge algebra and/or the gauge transformations. Adding the requirement that the deformation contains no more than two derivatives, the only possible deformation is a cosmological-constant-like term. © SISSA/ISAS 2004.
Abstract:
We investigate the problem of introducing consistent self-couplings in free theories for mixed tensor gauge fields whose symmetry properties are characterized by Young diagrams made of two columns of arbitrary (but different) lengths. We prove that, in flat space, these theories admit no local, Poincaré-invariant, smooth, self-interacting deformation with at most two derivatives in the Lagrangian. Relaxing the derivative and Lorentz-invariance assumptions, there is still no deformation that modifies the gauge algebra, and in most cases no deformation that alters the gauge transformations. Our approach is based on a Becchi-Rouet-Stora-Tyutin (BRST) cohomology deformation procedure. © 2005 American Institute of Physics.
Abstract:
We study the problem of consistent interactions for spin-3 gauge fields in flat spacetime of arbitrary dimension n > 3. Under the sole assumptions of Poincaré and parity invariance, and local and perturbative deformation of the free theory, we determine all nontrivial consistent deformations of the abelian gauge algebra and classify the corresponding deformations of the quadratic action, at first order in the deformation parameter. We prove that all such vertices are cubic, contain a total of either three or five derivatives, and are uniquely characterized by a rank-three constant tensor (an internal algebra structure constant). The covariant cubic vertex containing three derivatives is the vertex discovered by Berends, Burgers and van Dam, which however leads to inconsistencies at second order in the deformation parameter. In dimension n > 4 and for a completely antisymmetric structure constant tensor, another covariant cubic vertex exists, which contains five derivatives and passes the consistency test where the previous vertex failed. © SISSA 2006.
Abstract:
The problem of constructing consistent parity-violating interactions for spin-3 gauge fields is considered in Minkowski space. Under the assumptions of locality, Poincaré invariance, and parity noninvariance, we classify all the nontrivial perturbative deformations of the Abelian gauge algebra. In space-time dimensions n=3 and n=5, deformations of the free theory are obtained which make the gauge algebra non-Abelian and give rise to nontrivial cubic vertices in the Lagrangian, at first order in the deformation parameter g. At second order in g, consistency conditions are obtained which the five-dimensional vertex obeys, but which rule out the n=3 candidate. Moreover, in the five-dimensional first-order deformation case, the gauge transformations are modified by a new term which involves the second de Wit-Freedman connection in a simple and suggestive way. © 2006 The American Physical Society.
Abstract:
Pattern generalization is considered one of the prominent routes for introducing students to algebra. However, not all generalizations are algebraic. In the use of pattern generalization as a route to algebra, we, teachers and educators, thus have to remain vigilant in order not to confound algebraic generalizations with other forms of dealing with the general. But how to distinguish between algebraic and non-algebraic generalizations? On epistemological and semiotic grounds, in this article I suggest a characterization of algebraic generalizations. This characterization helps to bring about a typology of algebraic and arithmetic generalizations. The typology is illustrated with classroom examples.
Abstract:
In response to the many concerns raised by pre-service mathematics teachers about the difficulties and errors shown by lower- and upper-secondary students in constructing algebraic thinking, this paper presents, for the case of algebraic generalization, the findings of a research study. The study first reviews the theoretical and conceptual referents, the relevant definitions, and the classification of difficulties and errors in mathematics education, especially in the case of algebra; it likewise details characteristics of, and conceptual agreements around, reasoning and algebraic reasoning. The paper lays out assumptions and ideals for mathematics education and the teaching of algebra, and finally establishes the conceptual relationship among representation systems (errors), difficulties (comprehension), and algebraic reasoning. Through examples drawn from classroom experience and analyzed as part of the study's fieldwork, it offers, by way of proposal, comments, reflections, and recommendations that should enable future mathematics teachers to design a formal and cognitive competence model for understanding and acting in the teaching situations that arise in educational settings, particularly those involving algebraic reasoning.
Abstract:
At the school level, the notion of the derivative is taught in regular calculus courses, but generally in the form in which it was defined by Cauchy, whose procedure makes a factorization necessary. Constantin Carathéodory established a different definition, which has three noteworthy didactic aspects. First, it shows that the process by which the slopes of the secants approach the slope of the tangent is continuous, and therefore that continuity is essential for differentiability. Second, it presents differentiation as a repeated factorization process rather than as a computation of limits, which simplifies the proofs of linearity, the chain rule, and the algebra of derivatives (sum, product, and quotient), applied to real-valued polynomial functions. Third, at the school level it opens up alternatives for teaching calculus through the introduction of new concepts, avoiding the tedious procedures that accompany traditional definitions such as Cauchy's.
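The definition the abstract refers to can be stated as follows (a standard rendering of Carathéodory's formulation, with the worked example added here for illustration):

```latex
% Caratheodory: f is differentiable at a iff there is a function
% \varphi, continuous at a, that factors the increment of f:
\[
  f(x) - f(a) = \varphi(x)\,(x - a) \quad \text{for all } x \text{ near } a,
  \qquad f'(a) = \varphi(a).
\]
% Worked example: f(x) = x^2. A factorization, with no limit computed:
\[
  x^2 - a^2 = (x + a)(x - a)
  \;\Longrightarrow\;
  \varphi(x) = x + a
  \;\Longrightarrow\;
  f'(a) = \varphi(a) = 2a.
\]
```

Since differentiation becomes a factorization, the chain rule follows by composing the two factorizations of f and g, which is the simplification of proofs that the abstract highlights.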
Abstract:
In this one-session workshop, we propose activities that connect algebra with various real-world situations. The idea is for participants to work through the tasks so that they discover other ways to build concepts such as rate of change or slope, data modeling, lines of best fit, outliers, errors in experiments, foundations of civil engineering, the use of mathematical models to make predictions, and the cases in which mathematical models do not describe the reality of the experiments. Three activities will be carried out in the workshop: A. STRENGTH OF BEAMS; B. TYING KNOTS; C. BUILDING THE RHOMBIC TRIACONTAHEDRON (DANISH LAMP). Carrying out these experiences will help us understand the states of conflict that students enter when processing, acquiring, and consolidating knowledge.