906 results for Symbolic Computations
Abstract:
We have been investigating the cryptographic properties of infinite families of simple graphs of large girth with a special colouring of vertices for the last 10 years. Such families can be used for the development of cryptographic algorithms (in symmetric or public-key modes) and of turbocodes in error-correction theory. Only a few families of simple graphs of large unbounded girth and arbitrarily large degree are known. This paper is devoted to the more general theory of directed graphs of large girth and their cryptographic applications. It contains new explicit algebraic constructions of infinite families of such graphs. We show that they can be used for the implementation of secure and very fast symmetric encryption algorithms. The symbolic computation technique allows us to create a public-key mode for the encryption scheme based on algebraic graphs.
Abstract:
We discuss some main points of computer-assisted proofs based on reliable numerical computations. Such so-called self-validating numerical methods, in combination with exact symbolic manipulations, result in very powerful mathematical software tools. These tools allow proving mathematical statements (existence of a fixed point, of a solution of an ODE, of a zero of a continuous function, of a global minimum within a given range, etc.) using a digital computer. To validate the assertions of the underlying theorems, fast finite-precision arithmetic is used. The results are absolutely rigorous. To demonstrate the power of reliable symbolic-numeric computations, we investigate in some detail the verification of very long periodic orbits of chaotic dynamical systems. The verification is done directly in Maple, e.g. using the Maple Power Tool intpakX or, more efficiently, using the C++ class library C-XSC.
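To make the self-validating idea above concrete, here is a minimal C++ sketch, not the paper's intpakX or C-XSC code: a toy interval type with outward rounding proves, via the intermediate value theorem, that f(x) = x^2 - 2 has a zero in [1, 2]. The function f and the interval type are illustrative assumptions.

    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // Toy interval type: every operation widens its result by one ulp in each
    // direction, so the exact real result is guaranteed to be enclosed.
    struct Interval { double lo, hi; };

    Interval widen(double lo, double hi) {
        return { std::nextafter(lo, -INFINITY), std::nextafter(hi, INFINITY) };
    }

    Interval mul(Interval a, Interval b) {
        double p[] = { a.lo * b.lo, a.lo * b.hi, a.hi * b.lo, a.hi * b.hi };
        return widen(*std::min_element(p, p + 4), *std::max_element(p, p + 4));
    }

    Interval sub(Interval a, double c) { return widen(a.lo - c, a.hi - c); }

    // f(x) = x^2 - 2, evaluated in interval arithmetic
    Interval f(Interval x) { return sub(mul(x, x), 2.0); }

    int main() {
        Interval fa = f({1.0, 1.0}), fb = f({2.0, 2.0});  // point intervals
        // If f(1) is certainly negative and f(2) certainly positive, the
        // intermediate value theorem proves a zero of f (namely sqrt(2)) in [1, 2].
        if (fa.hi < 0.0 && fb.lo > 0.0)
            std::printf("verified: f has a zero in [1, 2]\n");
    }

The enclosure property, rather than the approximate values themselves, is what turns a floating-point computation into a rigorous proof step.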
Abstract:
I present a novel design methodology for the synthesis of automatic controllers, together with a computational environment---the Control Engineer's Workbench---integrating a suite of programs that automatically analyze and design controllers for high-performance, global control of nonlinear systems. This work demonstrates that difficult control synthesis tasks can be automated, using programs that actively exploit and efficiently represent knowledge of nonlinear dynamics and phase space and effectively use the representation to guide and perform the control design. The Control Engineer's Workbench combines powerful numerical and symbolic computations with artificial intelligence reasoning techniques. As a demonstration, the Workbench automatically designed a high-quality maglev controller that outperforms a previous linear design by a factor of 20.
Abstract:
The increasing precision of current and future experiments in high-energy physics requires a likewise increase in the accuracy of the calculation of theoretical predictions, in order to find evidence for possible deviations from the generally accepted Standard Model of elementary particles and interactions. Calculating the experimentally measurable cross sections of scattering and decay processes to a higher accuracy directly translates into including higher-order radiative corrections in the calculation. The large number of particles and interactions in the full Standard Model results in an exponentially growing number of Feynman diagrams contributing to any given process in higher orders. Additionally, the appearance of multiple independent mass scales makes even the calculation of single diagrams non-trivial. For over two decades now, the only way to cope with these issues has been to rely on the assistance of computers. The aim of the xloops project is to provide the necessary tools to automate the calculation procedures as far as possible, including the generation of the contributing diagrams and the evaluation of the resulting Feynman integrals. The latter is based on the techniques developed in Mainz for solving one- and two-loop diagrams in a general and systematic way using parallel/orthogonal space methods. These techniques involve a considerable amount of symbolic computation. During the development of xloops it was found that conventional computer algebra systems were not a suitable implementation environment. For this reason, a new system called GiNaC has been created, which allows the development of large-scale symbolic applications in an object-oriented fashion within the C++ programming language. This system, which is now also in use for other projects besides xloops, is the main focus of this thesis. The implementation of GiNaC as a C++ library sets it apart from other algebraic systems. Our results prove that a highly efficient symbolic manipulator can be designed in an object-oriented way, and that having a very fine granularity of objects is also feasible. The xloops-related parts of this work consist of a new implementation, based on GiNaC, of functions for calculating one-loop Feynman integrals that already existed in the original xloops program, as well as the addition of supplementary modules belonging to the interface between the library of integral functions and the diagram generator.
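Since GiNaC is the main focus of this thesis, a small hedged example of symbolic computation as ordinary C++ objects may help; the expressions are illustrative and not taken from xloops:

    // Build with something like: g++ demo.cpp $(pkg-config --cflags --libs ginac)
    #include <iostream>
    #include <ginac/ginac.h>
    using namespace GiNaC;

    int main() {
        symbol x("x"), y("y");

        // Symbolic expansion and differentiation
        ex e = pow(x + y, 3).expand();
        std::cout << e << std::endl;          // prints the expanded cubic
        std::cout << e.diff(x) << std::endl;  // its derivative with respect to x

        // Truncated series expansion, a basic building block of loop calculations
        std::cout << sin(x).series(x == 0, 6) << std::endl;
        return 0;
    }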
Abstract:
Substantial behavioural and neuropsychological evidence has been amassed to support the dual-route model of morphological processing, which distinguishes between a rule-based system for regular items (walk–walked, call–called) and an associative system for irregular items (go–went). Some neural-network models attempt to explain the neuropsychological and brain-mapping dissociations in terms of single-system associative processing. We show that there are problems with the accounts of homogeneous networks in the light of recent brain-mapping evidence of systematic double dissociation. We also examine the superior capabilities of more internally differentiated connectionist models, which, under certain conditions, display systematic double dissociations. It appears that the more internal differentiation the models show, the more easily they account for the dissociation patterns, yet without implementing symbolic computations.
Abstract:
BOOK REVIEWS: Multibody System Mechanics: Modelling, Stability, Control, and Robustness, by V. A. Konoplev and A. Cheremensky, Mathematics and its Applications Vol. 1, Union of Bulgarian Mathematicians, Sofia, 2001, XXII + 288 pp., $65.00, ISBN 954-8880-09-01
Abstract:
Irregular computations pose some of the most interesting and challenging problems in automatic parallelization. Irregularity appears in certain kinds of numerical problems and is pervasive in symbolic applications. Such computations often use dynamic data structures, which make heavy use of pointers. This complicates all the steps of a parallelizing compiler, from independence detection to task partitioning and placement. Starting in the mid-1980s there has been significant progress in the development of parallelizing compilers for logic programming (and, more recently, constraint programming), resulting in quite capable parallelizers. The typical applications of these paradigms frequently involve irregular computations and make heavy use of dynamic data structures with pointers, since logical variables represent in practice a well-behaved form of pointers. This arguably makes the techniques used in these compilers potentially interesting. In this paper, we introduce in a tutorial way some of the problems faced by parallelizing compilers for logic and constraint programs and provide pointers to some of the significant progress made in the area. In particular, this work has resulted in a series of achievements in the areas of inter-procedural pointer aliasing analysis for independence detection, cost models and cost analysis, cactus-stack memory management, and techniques for managing speculative and irregular computations through task granularity control and dynamic task allocation (such as work-stealing schedulers).
Abstract:
Irregular computations pose some of the most interesting and challenging problems in automatic parallelization. Irregularity appears in certain kinds of numerical problems and is pervasive in symbolic applications. Such computations often use dynamic data structures which make heavy use of pointers. This complicates all the steps of a parallelizing compiler, from independence detection to task partitioning and placement. In the past decade there has been significant progress in the development of parallelizing compilers for logic programming and, more recently, constraint programming. The typical applications of these paradigms frequently involve irregular computations, which arguably makes the techniques used in these compilers potentially interesting. In this paper we introduce in a tutorial way some of the problems faced by parallelizing compilers for logic and constraint programs. These include the need for inter-procedural pointer aliasing analysis for independence detection and having to manage speculative and irregular computations through task granularity control and dynamic task allocation. We also provide pointers to some of the progress made in these areas. In the associated talk we demonstrate representatives of several generations of these parallelizing compilers.
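As a generic illustration of the task granularity control mentioned in the two abstracts above (and not the actual mechanism of these logic-programming compilers), here is a hedged C++ sketch: a parallel recursion that falls back to sequential execution below a grain-size threshold, so that task-creation overhead does not swamp the useful work.

    #include <cstdio>
    #include <future>

    long fib(long n) { return n < 2 ? n : fib(n - 1) + fib(n - 2); }

    // Granularity control: only spawn a task when the subproblem is big enough.
    long pfib(long n, long grain) {
        if (n < grain) return fib(n);  // below the threshold, run sequentially
        auto left = std::async(std::launch::async, pfib, n - 1, grain);
        long right = pfib(n - 2, grain);
        return left.get() + right;
    }

    int main() {
        std::printf("%ld\n", pfib(30, 25));  // a grain of 25 keeps spawns cheap
    }

Production schedulers replace std::async with dynamic task allocation, e.g. work-stealing deques, but the sequential cutoff is the same idea.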
Abstract:
There are some interesting connections between the theory of quantum computation and quantum measurement. As an illustration, we present a scheme in which an ion trap quantum computer can be used to make arbitrarily accurate measurements of the quadrature phase variables for the collective vibrational motion of the ion. We also discuss some more general aspects of quantum computation and measurement in terms of the Feynman-Deutsch principle.
Abstract:
Objective: To evaluate the influence of oral contraceptives (OCs) containing 20 µg ethinylestradiol (EE) and 150 µg gestodene (GEST) on the autonomic modulation of heart rate (HR) in women. Methods: One hundred and fifty-five women aged 24 ± 2 years were divided into four groups according to their physical activity and the use or not of an OC: active-OC, active-non-OC (NOC), sedentary-OC, and sedentary-NOC. The heart rate was registered in real time from the electrocardiogram signal for 15 minutes in the supine position. The heart rate variability (HRV) was analysed using Shannon's entropy (SE), conditional entropy (complexity index [CInd] and normalised CInd [NCI]), and symbolic analysis (0V%, 1V%, 2LV%, and 2ULV%). For statistical analysis, the Kruskal-Wallis test with Dunn post hoc and the Wilcoxon test were applied (p < 0.05 was considered significant). Results: Treatment with this OC caused no significant changes in SE, CInd, NCI, or symbolic analysis in either the active or the sedentary groups. Active groups presented higher values for SE and 2ULV%, and lower values for 0V%, when compared to sedentary groups (p < 0.05). Conclusion: HRV patterns differed depending on lifestyle; the non-linear method applied was highly reliable for identifying these changes. The use of OCs containing 20 µg EE and 150 µg GEST does not influence HR autonomic modulation.
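For readers unfamiliar with the 0V%/1V%/2LV%/2ULV% indices, the following hedged C++ sketch shows the usual symbolic-analysis scheme (quantize the RR series into six levels, form three-beat words, classify each word by its variations); the RR values and implementation details are illustrative assumptions, not this study's actual pipeline.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        // Hypothetical RR intervals in seconds
        std::vector<double> rr = {0.80, 0.82, 0.79, 0.85, 0.85, 0.83, 0.78, 0.81};

        // Quantize the series into six levels spanning its range
        auto [mn, mx] = std::minmax_element(rr.begin(), rr.end());
        std::vector<int> sym;
        for (double v : rr)
            sym.push_back(std::min((int)(6.0 * (v - *mn) / (*mx - *mn + 1e-12)), 5));

        // Classify each 3-beat word by the number and direction of its variations
        int counts[4] = {0};  // 0V, 1V, 2LV, 2ULV
        for (std::size_t i = 0; i + 2 < sym.size(); ++i) {
            int d1 = sym[i + 1] - sym[i], d2 = sym[i + 2] - sym[i + 1];
            if (d1 == 0 && d2 == 0)      counts[0]++;  // no variation
            else if (d1 == 0 || d2 == 0) counts[1]++;  // one variation
            else if (d1 * d2 > 0)        counts[2]++;  // two like variations
            else                         counts[3]++;  // two unlike variations
        }
        int total = counts[0] + counts[1] + counts[2] + counts[3];
        const char *names[] = {"0V%", "1V%", "2LV%", "2ULV%"};
        for (int k = 0; k < 4; ++k)
            std::printf("%-5s = %5.1f\n", names[k], 100.0 * counts[k] / total);
    }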
Abstract:
In this work, we consider the numerical solution of a large eigenvalue problem resulting from a finite rank discretization of an integral operator. We are interested in computing a few eigenpairs with an iterative method, so a matrix representation that allows for fast matrix-vector products is required. Hierarchical matrices are appropriate for this setting, and also provide the cheap LU decompositions required by the spectral transformation technique. We illustrate the use of freely available software tools to address the problem, in particular SLEPc for the eigensolvers and HLib for the construction of H-matrices. The numerical tests are performed using an astrophysics application. Results show the benefits of the data-sparse representation compared to standard storage schemes, in terms of computational cost as well as memory requirements.
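A hedged sketch of the SLEPc side of such a computation follows; the toy tridiagonal matrix stands in for the H-matrix representation of the integral operator (the HLib coupling is omitted), the shift target is an arbitrary illustrative choice, and error checking is dropped for brevity.

    #include <slepceps.h>

    int main(int argc, char **argv) {
        Mat A;  EPS eps;  ST st;
        PetscInt n = 100, i, nconv;
        PetscScalar kr, ki;

        SlepcInitialize(&argc, &argv, NULL, NULL);

        // Toy tridiagonal matrix in place of the H-matrix operator
        MatCreate(PETSC_COMM_WORLD, &A);
        MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
        MatSetFromOptions(A);
        MatSetUp(A);
        for (i = 0; i < n; i++) {
            MatSetValue(A, i, i, 2.0, INSERT_VALUES);
            if (i > 0)     MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES);
            if (i < n - 1) MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES);
        }
        MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
        MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

        // Eigensolver with shift-and-invert spectral transformation; this is
        // the step where a cheap (H-matrix) LU decomposition pays off.
        EPSCreate(PETSC_COMM_WORLD, &eps);
        EPSSetOperators(eps, A, NULL);
        EPSSetProblemType(eps, EPS_HEP);
        EPSSetWhichEigenpairs(eps, EPS_TARGET_MAGNITUDE);
        EPSSetTarget(eps, 0.5);
        EPSGetST(eps, &st);
        STSetType(st, STSINVERT);
        EPSSetFromOptions(eps);
        EPSSolve(eps);

        EPSGetConverged(eps, &nconv);
        for (i = 0; i < nconv; i++) {
            EPSGetEigenpair(eps, i, &kr, &ki, NULL, NULL);
            PetscPrintf(PETSC_COMM_WORLD, "lambda = %g\n", (double)PetscRealPart(kr));
        }

        EPSDestroy(&eps);
        MatDestroy(&A);
        SlepcFinalize();
        return 0;
    }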
Abstract:
In this work we investigate the population dynamics of cooperative hunting, extending the McCann and Yodzis model for a three-species food chain system with a predator, a prey, and a resource species. The new model considers that a given fraction σ of the predators cooperates in hunting the prey, while the rest of the population, 1 − σ, hunts without cooperation. We use the theory of symbolic dynamics to study the topological entropy and the parameter-space ordering of the kneading sequences associated with one-dimensional maps that reproduce significant aspects of the dynamics of the species under several degrees of cooperative hunting. Our model also allows us to investigate the so-called deterministic extinction via chaotic crisis and transient chaos in the framework of cooperative hunting. The symbolic sequences allow us to identify a critical boundary in the parameter spaces (K, C0) and (K, σ) which separates two scenarios: (i) all-species coexistence and (ii) the predator's extinction via chaotic crisis. We show that the crisis value Kc of the carrying capacity decreases with increasing σ, indicating that predator populations with a high degree of cooperative hunting are more sensitive to chaotic crises. We also show that the control method of Dhamala and Lai [Phys. Rev. E 59, 1646 (1999)] can sustain the chaotic behavior after the crisis for systems with cooperative hunting. We finally analyze and quantify the inner structure of the target regions obtained with this control method for wider parameter values beyond the crisis, showing a power-law dependence of the extinction transients on such critical parameters.
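A minimal illustration of the kneading-sequence machinery invoked above, using the logistic map as a generic stand-in for the paper's one-dimensional maps (the parameter value is a hypothetical choice):

    #include <cstdio>
    #include <string>

    // Kneading sequence of the logistic map f(x) = r*x*(1-x): the symbolic
    // itinerary of the critical point c = 1/2 under iteration, with symbols
    // L (left of c), R (right of c), and C (on c).
    int main() {
        const double r = 3.83, c = 0.5;   // hypothetical parameter value
        double x = r * c * (1.0 - c);     // first image of the critical point
        std::string knead;
        for (int i = 0; i < 40; ++i) {
            knead += (x < c) ? 'L' : (x > c) ? 'R' : 'C';
            x = r * x * (1.0 - x);
        }
        std::printf("kneading sequence: %s\n", knead.c_str());
    }

Topological entropy and the parameter-space ordering studied in the paper are read off from exactly such symbol sequences.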
Abstract:
This work describes a methodology to extract symbolic rules from trained neural networks. In our approach, patterns in the network are codified using formulas of a Łukasiewicz logic. For this we take advantage of the fact that every connective in this multi-valued logic can be evaluated by a neuron in an artificial network having as activation function the identity truncated to zero and one. This fact simplifies symbolic rule extraction and allows the easy injection of formulas into a network architecture. We trained this type of neural network using a back-propagation algorithm based on the Levenberg-Marquardt algorithm, where in each learning iteration we restricted the knowledge dissemination in the network structure. This makes the descriptive power of the produced neural networks similar to the descriptive power of the Łukasiewicz logic language, minimizing the information loss in the translation between connectionist and symbolic structures. To avoid redundancy in the generated networks, the method simplifies them in a pruning phase, using the "Optimal Brain Surgeon" algorithm. We tested this method on the task of finding the formula used in the generation of a given truth table. For tests on real data, we selected the Mushrooms data set, available from the UCI Machine Learning Repository.
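The fact this abstract relies on, that every Łukasiewicz connective can be evaluated by a neuron whose activation is the identity truncated to zero and one, is easy to check in a few lines of C++; the weights and inputs below are illustrative:

    #include <algorithm>
    #include <cstdio>

    // A "neuron" with activation equal to the identity truncated to [0, 1]
    double neuron(double w1, double x1, double w2, double x2, double b) {
        return std::clamp(w1 * x1 + w2 * x2 + b, 0.0, 1.0);
    }

    int main() {
        double x = 0.7, y = 0.6;
        // Lukasiewicz conjunction max(0, x + y - 1): weights (1, 1), bias -1
        std::printf("x AND y = %.2f\n", neuron(1.0, x, 1.0, y, -1.0));  // 0.30
        // Lukasiewicz implication min(1, 1 - x + y): weights (-1, 1), bias 1
        std::printf("x -> y  = %.2f\n", neuron(-1.0, x, 1.0, y, 1.0)); // 0.90
    }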
Abstract:
The purpose of this paper is to introduce a symbolic formalism, based on kneading theory, which allows us to study the renormalization of non-autonomous periodic dynamical systems.