631 results for algorithmic skeletons
Abstract:
A large class of special functions are solutions of systems of linear difference and differential equations with polynomial coefficients. For a given function, these equations, considered as operator polynomials, generate a left ideal in a noncommutative algebra called an Ore algebra. Together with finitely many conditions, this ideal characterizes the function uniquely, so that Gröbner basis techniques can be applied. Many problems related to special functions that can be described by such ideals can be solved by eliminating appropriate noncommutative variables from these ideals. In this work, we mainly achieve the following: 1. We give an overview of the theoretical algebraic background as well as the algorithmic aspects of different methods that use noncommutative Gröbner elimination techniques in Ore algebras to solve problems related to special functions. 2. We describe in detail algorithms which are based on Gröbner elimination techniques and perform the creative telescoping method for sums and integrals of special functions. 3. We investigate and compare these algorithms on illustrative examples carried out in the computer algebra system Maple. The objective of this investigation is to test how far noncommutative Gröbner elimination techniques can be applied efficiently to perform creative telescoping.
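As a minimal illustration of the creative telescoping idea discussed above (not the thesis's Maple/Ore-algebra implementation), the following Python/SymPy sketch verifies a telescoping certificate for the sum S(n) = sum_k C(n, k); the certificate g(n, k) = -C(n, k-1) is assumed known here rather than computed by Gröbner elimination.

```python
import sympy as sp

n, k = sp.symbols('n k', integer=True)
f = sp.binomial(n, k)            # summand f(n, k) of S(n) = sum_k binomial(n, k)
g = -sp.binomial(n, k - 1)       # telescoping certificate (assumed known here)

# Creative telescoping asserts f(n+1, k) - 2*f(n, k) = g(n, k+1) - g(n, k);
# summing over k makes the right-hand side vanish, so S(n+1) = 2*S(n).
lhs = f.subs(n, n + 1) - 2 * f
rhs = g.subs(k, k + 1) - g

# Check the identity on a grid of integer points (exact arithmetic).
for N in range(0, 8):
    for K in range(-1, N + 3):
        assert (lhs - rhs).subs({n: N, k: K}) == 0
print("telescoping identity verified; hence S(n) = sum_k C(n, k) satisfies S(n+1) = 2 S(n)")
```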
Abstract:
I have designed and implemented a system for the multilevel verification of synchronous MOS VLSI circuits. The system, called Silica Pithecus, accepts the schematic of an MOS circuit and a specification of the circuit's intended digital behavior. Silica Pithecus determines whether the circuit meets its specification. If the circuit fails to meet its specification, Silica Pithecus returns to the designer the reason for the failure. Unlike earlier verifiers, which modelled primitives (e.g., transistors) as unidirectional digital devices, Silica Pithecus models primitives more realistically: transistors are modelled as bidirectional devices of varying resistances, and nodes are modelled as capacitors. Silica Pithecus operates hierarchically, interactively, and incrementally. Major contributions of this research include a formal understanding of the relationship between different behavioral descriptions (e.g., signal, boolean, and arithmetic descriptions) of the same device, and a formalization of the relationship between the structure, behavior, and context of a device. Given these formal structures, my methods find sufficient conditions on the inputs of a circuit which guarantee its correct operation in the desired descriptive domain. These methods are algorithmic and complete. They also handle complex phenomena such as races and charge sharing. Informal notions such as races and hazards are shown to be derivable from the correctness conditions used by my methods.
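The charge-sharing phenomenon mentioned above can be illustrated with a minimal sketch (hypothetical capacitances and thresholds, not Silica Pithecus itself): two capacitive nodes joined by a pass transistor settle to the charge-weighted average voltage, which may no longer be a valid logic level.

```python
VDD = 5.0                         # hypothetical supply voltage
V_IH, V_IL = 3.5, 1.5             # hypothetical thresholds for a valid logic 1 / logic 0

def share_charge(c1, v1, c2, v2):
    """Voltage after a pass transistor connects capacitors (c1, v1) and (c2, v2)."""
    return (c1 * v1 + c2 * v2) / (c1 + c2)

# A small storage node holding a logic 1 shares charge with a larger, discharged bus node.
v = share_charge(c1=10e-15, v1=VDD, c2=40e-15, v2=0.0)
print(f"settled voltage: {v:.2f} V")   # 1.00 V: the stored 1 is destroyed and reads as 0
```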
Abstract:
This is a presentation introducing students to algorithmic concepts such as sequencing, pseudocode and modularity. It includes a class exercise to define an algorithm for making a cup of tea.
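A minimal sketch of the kind of solution the class exercise aims at (the step names are illustrative, not taken from the presentation): the tea-making algorithm as a sequence of steps, with one step factored into its own function to show modularity.

```python
# Sequencing: steps run in order. Modularity: boil_water() is a reusable sub-step.
def boil_water():
    print("Fill the kettle")
    print("Switch the kettle on")
    print("Wait until the water boils")

def make_tea():
    boil_water()
    print("Put a tea bag in the cup")
    print("Pour the boiling water into the cup")
    print("Wait for the tea to brew")
    print("Remove the tea bag")
    print("Add milk if wanted")

make_tea()
```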
Abstract:
These are the materials for a course run at the University of Southampton to teach Algorithmic thinking to Information Technology in Organisation students. The course takes a lightweight approach, and is designed to be used alongside simple programming labs (for example, using Alice).
Abstract:
This presentation explains how we move from a problem definition to an algorithmic solution using simple tools like noun-verb analysis. It also looks at how we might judge the quality of a solution through coupling, cohesion and generalisation.
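A brief, hypothetical illustration of noun-verb analysis (the problem statement and names below are invented, not taken from the presentation): nouns in the problem suggest data, verbs suggest operations.

```python
# Hypothetical problem statement: "The library lends books to members."
class Book:                        # noun -> class
    def __init__(self, title):
        self.title = title         # noun -> attribute
        self.on_loan = False

class Library:                     # noun -> class
    def __init__(self):
        self.books = []

    def add(self, book):           # verb -> method
        self.books.append(book)

    def lend(self, book):          # verb -> method; cohesive: only lending logic lives here
        book.on_loan = True
```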
Abstract:
This document explains the role of Colombian insurance companies within the pension system and seeks, through an understanding of the evolution of the macroeconomic environment and the regulatory framework, to identify the challenges they face. Three challenges are explained in the document: the profitability challenge, the challenge posed by relatively frequent changes in regulation, and the asset-liability matching (“calce”) challenge. The document focuses mainly on the profitability challenge and develops an efficient-frontier exercise that uses expected returns calculated with the methodology of Damodaran (2012). The results of the exercise support the idea that expected returns will indeed be lower for any level of risk and suggest that, given this outlook, relaxing the restrictions imposed by the investment regime (Régimen de Inversiones) could ease insurers' concerns in this regard. Alternatives are also suggested for the other two challenges: Algorithmic Trading for the challenge imposed by regulatory changes, and Public-Private Partnerships to address the matching (“calce”) challenge.
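A minimal sketch of an efficient-frontier exercise of the kind described (all inputs are assumed placeholder values, not the paper's Damodaran-based estimates): for each target return, find the minimum-variance long-only portfolio.

```python
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.04, 0.06, 0.08])                 # assumed expected returns per asset
cov = np.array([[0.010, 0.002, 0.001],
                [0.002, 0.020, 0.003],
                [0.001, 0.003, 0.030]])           # assumed covariance matrix

def min_variance(target):
    n = len(mu)
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1},          # fully invested
            {"type": "eq", "fun": lambda w, t=target: w @ mu - t}] # hit the target return
    bounds = [(0, 1)] * n                          # long-only, mimicking regime restrictions
    res = minimize(lambda w: w @ cov @ w, np.full(n, 1 / n),
                   bounds=bounds, constraints=cons)
    return np.sqrt(res.fun), res.x

for t in np.linspace(mu.min(), mu.max(), 5):
    risk, w = min_variance(t)
    print(f"target {t:.3f}: risk {risk:.3f}, weights {np.round(w, 2)}")
```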
Abstract:
Pairs trading investment strategies are based on price deviations between pairs of correlated stocks and have been widely implemented by investment funds, which take long and short positions in the selected stocks when divergences arise and realize a profit by closing the position once prices converge. A mean-reversion model is described to analyze the dynamics followed by the price spread between ordinary and preferred shares of the same company in the same market. The long-run convergence mean is obtained with a moving-average filter; the parameters of the mean-reversion model are then estimated with a Kalman filter under a state-space formulation applied to the historical series. A backtest of the algorithmic pairs-trading strategy based on the proposed model indicates potential profits in financial markets observed out of equilibrium. Applications of these results could reveal opportunities to improve portfolio performance, correct pricing errors, and better withstand periods of low returns.
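A simplified sketch of the pairs-trading logic on synthetic data (using a rolling-window z-score rather than the paper's Kalman-filtered state-space estimation): trade the ordinary/preferred spread when it diverges from its moving-average mean and close when it converges.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
common = np.cumsum(rng.normal(0, 1.0, n))          # shared driver of both share classes
spread = np.empty(n)
spread[0] = 2.0
for t in range(1, n):                              # mean-reverting spread (discrete OU process)
    spread[t] = spread[t - 1] + 0.2 * (2.0 - spread[t - 1]) + rng.normal(0, 0.3)
ordinary = 100 + common + spread                   # synthetic ordinary-share price
preferred = 100 + common                           # synthetic preferred-share price

spread_obs = ordinary - preferred
window, entry, exit_band = 30, 1.5, 0.2
position, pnl = 0, 0.0
for t in range(window, n):
    hist = spread_obs[t - window:t]
    z = (spread_obs[t] - hist.mean()) / hist.std()
    pnl += position * (spread_obs[t] - spread_obs[t - 1])  # P&L of the position held into t
    if z > entry:
        position = -1          # spread too wide: sell ordinary, buy preferred
    elif z < -entry:
        position = 1           # spread too narrow: buy ordinary, sell preferred
    elif abs(z) < exit_band:
        position = 0           # spread has converged: close the position
print(f"synthetic backtest P&L: {pnl:.2f}")
```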
Abstract:
Abstract taken from the publication.
Abstract:
This article describes a novel algorithmic development extending the contour advective semi-Lagrangian model to include nonconservative effects. The Lagrangian contour representation of finescale tracer fields, such as potential vorticity, allows conservative, nondiffusive treatment of sharp gradients, permitting very high numerical Reynolds numbers. It has been widely employed in accurate geostrophic turbulence and tracer advection simulations. In the present, diabatic version of the model, the constraint of conservative dynamics is overcome by including a parallel Eulerian field that absorbs the nonconservative (diabatic) tendencies. The diabatic buildup in this Eulerian field is limited through regular, controlled transfers of this field to the contour representation. This transfer is done with a fast, newly developed contouring algorithm. The model has been implemented for several idealized geometries. In this paper a single-layer doubly periodic geometry is used to demonstrate the validity of the model. The present model converges faster than the analogous semi-Lagrangian models as resolution increases. At the same nominal spatial resolution the new model is 40 times faster than the analogous semi-Lagrangian model. Results of an orographically forced idealized storm track show a nontrivial dependency of storm-track statistics on resolution and on the numerical model employed. If this result is more generally applicable, it may have important consequences for future high-resolution climate modeling.
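A one-dimensional toy sketch of the splitting idea described above (not the actual contour advective semi-Lagrangian code; grid size, forcing, and transfer interval are arbitrary assumptions): the conserved part of a tracer is carried on Lagrangian points, nonconservative tendencies accumulate on a parallel Eulerian grid, and that field is periodically folded back onto the points.

```python
import numpy as np

nx, dt, u, L = 128, 0.01, 1.0, 1.0            # assumed grid size, time step, speed, domain length
x_grid = np.linspace(0.0, L, nx, endpoint=False)
x_pts = x_grid.copy()                         # Lagrangian points (a stand-in for contour nodes)
q_pts = np.exp(-((x_pts - 0.5) / 0.1) ** 2)   # conserved tracer values carried by the points
q_euler = np.zeros(nx)                        # parallel Eulerian field for diabatic tendencies

def diabatic_source(x, t):                    # hypothetical nonconservative forcing
    return 0.1 * np.sin(2 * np.pi * x / L)

for step in range(1, 501):
    t = step * dt
    x_pts = (x_pts + u * dt) % L                 # exact Lagrangian advection (conservative part)
    q_euler += dt * diabatic_source(x_grid, t)   # diabatic buildup on the Eulerian grid
    if step % 50 == 0:                           # regular, controlled transfer back to the points
        q_pts += np.interp(x_pts, x_grid, q_euler, period=L)
        q_euler[:] = 0.0

print("mean conserved part on points:", round(float(q_pts.mean()), 4),
      "; mean pending diabatic part:", round(float(q_euler.mean()), 4))
```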
Abstract:
Advances made over the past decade in structure determination from powder diffraction data are reviewed with particular emphasis on algorithmic developments and the successes and limitations of the technique. While global optimization methods have been successful in the solution of molecular crystal structures, new methods are required to make the solution of inorganic crystal structures more routine. The use of complementary techniques such as NMR to assist structure solution is discussed and the potential for the combined use of X-ray and neutron diffraction data for structure verification is explored. Structures that have proved difficult to solve from powder diffraction data are reviewed and the limitations of structure determination from powder diffraction data are discussed. Furthermore, the prospects of solving small protein crystal structures over the next decade are assessed.
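A toy sketch of the global-optimization approach mentioned above (a hypothetical one-dimensional two-atom "structure", not a real diffraction calculation): simulated annealing adjusts atomic coordinates until calculated reflection intensities match the observed ones. In this toy model the intensities fix only the interatomic separation (the origin is arbitrary), so that is what is reported.

```python
import numpy as np

rng = np.random.default_rng(1)
h = np.arange(1, 11)                           # reflection indices of the toy pattern
true_x = np.array([0.13, 0.58])                # hypothetical answer used to fake the observations

def intensities(x):
    F = np.exp(2j * np.pi * np.outer(h, x)).sum(axis=1)   # toy structure factors
    return np.abs(F) ** 2

I_obs = intensities(true_x)

def r_factor(x):                               # agreement between observed and calculated pattern
    return np.abs(I_obs - intensities(x)).sum() / I_obs.sum()

x = rng.random(2)                              # random starting model
cost, T = r_factor(x), 1.0
best_x, best_cost = x.copy(), cost
for _ in range(20000):                         # simulated-annealing search
    trial = (x + rng.normal(0.0, 0.05, 2)) % 1.0
    c = r_factor(trial)
    if c < cost or rng.random() < np.exp((cost - c) / T):
        x, cost = trial, c
        if c < best_cost:
            best_x, best_cost = trial.copy(), c
    T *= 0.9997                                # slow cooling

d = abs(best_x[0] - best_x[1]) % 1.0
print(f"best R-factor {best_cost:.3f}; refined separation {min(d, 1 - d):.3f} (true 0.45)")
```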
Abstract:
This study compares associations between demographic profiles, long bone lengths, bone mineral content, and frequencies of stress indicators in the preadult populations of two medieval skeletal assemblages from Denmark. One is from a leprosarium, and thus probably represents a disadvantaged group (Næstved). The other comes from a normal, and in comparison rather privileged, medieval community (Æbelholt). Previous studies of the adult population indicated differences between the two skeletal collections with regard to mortality, dental size, and metabolic and specific infectious disease. The two samples were analyzed against the view known as the "osteological paradox" (Wood et al. [1992] Curr. Anthropol. 33:343-370), according to which skeletons displaying pathological modification are likely to represent the healthier individuals of a population, whereas those without lesions would have died without acquiring modifications as a result of a depressed immune response. Results reveal that older age groups among the preadults from Næstved are shorter and have less bone mineral content than their peers from Æbelholt. On average, the Næstved children have a higher prevalence of stress indicators, and in some cases display skeletal signs of leprosy. This is likely a result of the combination of compromised health and social disadvantage, thus supporting a more traditional interpretation. The study provides insights into the health of children from two different biocultural settings of medieval Danish society and illustrates the importance of comparing samples of single age groups.
Abstract:
This article is about modeling count data with zero truncation. A parametric count density family is considered. The truncated mixture of densities from this family is different from the mixture of truncated densities from the same family. Whereas the former model is more natural to formulate and to interpret, the latter model is theoretically easier to treat. It is shown that for any mixing distribution leading to a truncated mixture, a (usually different) mixing distribution can be found so that the associated mixture of truncated densities equals the truncated mixture, and vice versa. This implies that the likelihood surfaces for both situations agree, and in this sense both models are equivalent. Zero-truncated count data models are used frequently in the capture-recapture setting to estimate population size, and it can be shown that the two Horvitz-Thompson estimators, associated with the two models, agree. In particular, it is possible to achieve strong results for mixtures of truncated Poisson densities, including reliable, global construction of the unique NPMLE (nonparametric maximum likelihood estimator) of the mixing distribution, implying a unique estimator for the population size. The benefit of these results lies in the fact that it is valid to work with the mixture of truncated count densities, which is less appealing for the practitioner but theoretically easier. Mixtures of truncated count densities form a convex linear model, for which a developed theory exists, including global maximum likelihood theory as well as algorithmic approaches. Once the problem has been solved in this class, it might readily be transformed back to the original problem by means of an explicitly given mapping. Applications of these ideas are given, particularly in the case of the truncated Poisson family.
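A minimal sketch of the capture-recapture use of a zero-truncated count model (simulated data from a homogeneous Poisson rather than a mixture): fit the zero-truncated Poisson by maximum likelihood, then estimate the population size in the Horvitz-Thompson spirit, N_hat = n / (1 - exp(-lambda_hat)).

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
N_true, lam_true = 1000, 1.2
counts = rng.poisson(lam_true, N_true)
observed = counts[counts > 0]                 # zero counts are never seen (zero truncation)
n = observed.size

def neg_loglik(lam):                          # zero-truncated Poisson log-likelihood (constants dropped)
    return -(np.sum(observed * np.log(lam) - lam) - n * np.log1p(-np.exp(-lam)))

lam_hat = minimize_scalar(neg_loglik, bounds=(1e-6, 20), method="bounded").x
p_seen = 1 - np.exp(-lam_hat)                 # estimated probability of being observed at all
N_hat = n / p_seen                            # Horvitz-Thompson population-size estimate
print(f"observed n = {n}, lambda_hat = {lam_hat:.3f}, N_hat = {N_hat:.0f} (true {N_true})")
```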
Abstract:
It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. ‘Emergent’ properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.
Abstract:
Design for low power in FPGAs is rather limited, since the technology factors affecting power are either fixed or limited for a given FPGA family. This paper investigates opportunities for power savings in a pipelined 2D IDCT design at the architecture and logic levels. We report power consumption savings of over 25%, achieved in FPGA circuits through a clock-gating implementation of optimizations made at the algorithmic level.
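A back-of-the-envelope sketch of why clock gating saves dynamic power (all numbers below are hypothetical, not the paper's measurements): dynamic power follows P_dyn = a * C * V^2 * f, and gating idle logic lowers the effective switching activity a.

```python
# Hypothetical figures for illustration only.
def dynamic_power(activity, capacitance, voltage, frequency):
    return activity * capacitance * voltage ** 2 * frequency

C, V, f = 2e-9, 1.5, 50e6                      # assumed switched capacitance (F), supply (V), clock (Hz)
p_base = dynamic_power(0.20, C, V, f)          # assumed switching activity without gating
p_gated = dynamic_power(0.14, C, V, f)         # assumed activity with gated idle pipeline stages
print(f"baseline {p_base*1e3:.1f} mW, gated {p_gated*1e3:.1f} mW, "
      f"saving {100*(p_base - p_gated)/p_base:.0f}%")
```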