503 results for algorithmic skeletons
Abstract:
These are the materials for a course run at the University of Southampton to teach algorithmic thinking to Information Technology in Organisations students. The course takes a lightweight approach and is designed to be used alongside simple programming labs (for example, using Alice).
Abstract:
This presentation explains how we move from a problem definition to an algorithmic solution using simple tools such as noun-verb analysis. It also looks at how we might judge the quality of a solution through coupling, cohesion, and generalisation.
Abstract:
This paper explains the role of Colombian insurance companies within the pension system and, by examining the evolution of the macroeconomic environment and the regulatory framework, seeks to identify the challenges they face. Three challenges are discussed: the profitability challenge, the challenge posed by relatively frequent regulatory changes, and the asset-liability matching ("calce") challenge. The paper focuses mainly on the profitability challenge and develops an efficient-frontier exercise using expected returns computed with the methodology of Damodaran (2012). The results of the exercise support the view that expected returns will indeed be lower for any level of risk, and suggest that, given this outlook, relaxing the restrictions imposed by the investment regime (Régimen de inversiones) could ease insurers' concerns in this area. Alternatives are also suggested for the other two challenges: algorithmic trading for the challenge posed by regulatory changes, and public-private partnerships for the matching challenge.
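The efficient-frontier exercise mentioned above can be illustrated with the standard Markowitz closed form. This is a minimal sketch under hypothetical expected returns and covariances (the numbers below are invented, not taken from the paper), with no short-sale or regulatory constraints:

```python
import numpy as np

# Hypothetical expected returns and covariance for three asset classes
mu = np.array([0.04, 0.07, 0.10])
Sigma = np.array([[0.010, 0.002, 0.001],
                  [0.002, 0.030, 0.004],
                  [0.001, 0.004, 0.060]])

inv = np.linalg.inv(Sigma)
ones = np.ones(len(mu))

# Scalars of the unconstrained Markowitz closed form
a = ones @ inv @ ones
b = ones @ inv @ mu
c = mu @ inv @ mu
d = a * c - b * b

def frontier_variance(target):
    """Minimum portfolio variance achieving the target expected return."""
    return (a * target**2 - 2 * b * target + c) / d

# Trace the frontier over a grid of target returns
targets = np.linspace(mu.min(), mu.max(), 5)
variances = [frontier_variance(t) for t in targets]
```

In this parametrisation the global minimum-variance portfolio sits at target return b/a with variance 1/a; adding the investment-regime restrictions the paper discusses would turn this into a constrained quadratic programme instead of a closed form.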
Abstract:
Pairs-trading investment strategies are based on price deviations between pairs of correlated stocks and have been widely implemented by investment funds, which take long and short positions in the selected stocks when divergences arise and realize a profit by closing the position upon convergence. A mean-reversion model is described to analyze the dynamics of the price spread between ordinary and preferred shares of the same company in the same market. The long-run convergence mean is obtained with a moving-average filter; the parameters of the mean-reversion model are then estimated with a Kalman filter under a state-space formulation applied to the historical series. A backtest of the algorithmic pairs-trading strategy on the proposed model indicates potential profits in financial markets observed out of equilibrium. Applications of the results could reveal opportunities to improve portfolio performance, correct valuation errors, and better weather periods of low returns.
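The moving-average step of the strategy can be sketched in a few lines. This is a toy illustration, not the paper's calibrated model: the spread is simulated from a discretized Ornstein-Uhlenbeck process with invented parameters, the long-run mean is estimated with a moving-average filter as the abstract describes, and a simple threshold rule replaces the Kalman-filter estimation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a mean-reverting (Ornstein-Uhlenbeck) spread between two share classes
kappa, mean, sigma, n = 0.1, 0.0, 0.5, 1000
spread = np.zeros(n)
for t in range(1, n):
    spread[t] = spread[t - 1] + kappa * (mean - spread[t - 1]) + sigma * rng.normal()

# Long-run convergence mean estimated with a moving-average filter
window = 50
ma = np.convolve(spread, np.ones(window) / window, mode="valid")

# Threshold rule: short the spread when far above its mean, long when far below,
# flat otherwise; the position is closed (profit taken) as the spread converges
dev = spread[window - 1:] - ma
threshold = 1.0
position = np.where(dev > threshold, -1, np.where(dev < -threshold, 1, 0))
```

In the paper's setup the reversion speed and mean would instead be estimated online with a Kalman filter in state-space form, rather than fixed in advance as here.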
Abstract:
Abstract taken from the publication.
Abstract:
This article describes a novel algorithmic development extending the contour advective semi-Lagrangian model to include nonconservative effects. The Lagrangian contour representation of finescale tracer fields, such as potential vorticity, allows a conservative, nondiffusive treatment of sharp gradients, permitting very high numerical Reynolds numbers. It has been widely employed in accurate geostrophic turbulence and tracer advection simulations. In the present, diabatic version of the model the constraint of conservative dynamics is overcome by including a parallel Eulerian field that absorbs the nonconservative (diabatic) tendencies. The diabatic buildup in this Eulerian field is limited through regular, controlled transfers to the contour representation, performed with a newly developed fast contouring algorithm. The model has been implemented for several idealized geometries. In this paper a single-layer doubly periodic geometry is used to demonstrate its validity. The present model converges faster than the analogous semi-Lagrangian models at increased resolutions, and at the same nominal spatial resolution the new model is 40 times faster. Results of an orographically forced idealized storm track show a nontrivial dependency of storm-track statistics on resolution and on the numerical model employed. If this result is more generally applicable, it may have important consequences for future high-resolution climate modeling.
Abstract:
Advances made over the past decade in structure determination from powder diffraction data are reviewed with particular emphasis on algorithmic developments and the successes and limitations of the technique. While global optimization methods have been successful in the solution of molecular crystal structures, new methods are required to make the solution of inorganic crystal structures more routine. The use of complementary techniques such as NMR to assist structure solution is discussed and the potential for the combined use of X-ray and neutron diffraction data for structure verification is explored. Structures that have proved difficult to solve from powder diffraction data are reviewed and the limitations of structure determination from powder diffraction data are discussed. Furthermore, the prospects of solving small protein crystal structures over the next decade are assessed.
Abstract:
This study compares associations between demographic profiles, long bone lengths, bone mineral content, and frequencies of stress indicators in the preadult populations of two medieval skeletal assemblages from Denmark. One is from a leprosarium, and thus probably represents a disadvantaged group (Næstved). The other comes from a normal, and in comparison rather privileged, medieval community (Æbelholt). Previous studies of the adult population indicated differences between the two skeletal collections with regard to mortality, dental size, and metabolic and specific infectious disease. The two samples were analyzed against the view known as the "osteological paradox" (Wood et al. [1992] Curr. Anthropol. 33:343-370), according to which skeletons displaying pathological modification are likely to represent the healthier individuals of a population, whereas those without lesions would have died without acquiring modifications as a result of a depressed immune response. Results reveal that older age groups among the preadults from Næstved are shorter and have less bone mineral content than their peers from Æbelholt. On average, the Næstved children have a higher prevalence of stress indicators, and in some cases display skeletal signs of leprosy. This is likely a result of the combination of compromised health and social disadvantage, thus supporting a more traditional interpretation. The study provides insights into the health of children from two different biocultural settings of medieval Danish society and illustrates the importance of comparing samples of single age groups.
Abstract:
This article is about modeling count data with zero truncation. A parametric count density family is considered. The truncated mixture of densities from this family is different from the mixture of truncated densities from the same family. Whereas the former model is more natural to formulate and to interpret, the latter model is theoretically easier to treat. It is shown that for any mixing distribution leading to a truncated mixture, a (usually different) mixing distribution can be found so that the associated mixture of truncated densities equals the truncated mixture, and vice versa. This implies that the likelihood surfaces for both situations agree, and in this sense both models are equivalent. Zero-truncated count data models are used frequently in the capture-recapture setting to estimate population size, and it can be shown that the two Horvitz-Thompson estimators, associated with the two models, agree. In particular, it is possible to achieve strong results for mixtures of truncated Poisson densities, including reliable, global construction of the unique NPMLE (nonparametric maximum likelihood estimator) of the mixing distribution, implying a unique estimator for the population size. The benefit of these results lies in the fact that it is valid to work with the mixture of truncated count densities, which is less appealing for the practitioner but theoretically easier. Mixtures of truncated count densities form a convex linear model, for which a developed theory exists, including global maximum likelihood theory as well as algorithmic approaches. Once the problem has been solved in this class, it might readily be transformed back to the original problem by means of an explicitly given mapping. Applications of these ideas are given, particularly in the case of the truncated Poisson family.
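The simplest instance of the capture-recapture setting above uses a single (unmixed) zero-truncated Poisson component. As a minimal sketch, not the paper's NPMLE machinery: the zero-truncated Poisson MLE satisfies xbar = lam / (1 - exp(-lam)), which can be solved by fixed-point iteration, and the Horvitz-Thompson estimator then scales the observed count by the estimated probability of being seen at least once:

```python
import math

def zt_poisson_mle(counts, iters=100):
    """MLE of the Poisson rate under zero truncation, via the fixed point
    lam = xbar * (1 - exp(-lam)), where xbar is the sample mean of the
    observed (all-positive) counts."""
    xbar = sum(counts) / len(counts)
    lam = xbar
    for _ in range(iters):
        lam = xbar * (1 - math.exp(-lam))
    return lam

def horvitz_thompson_size(counts):
    """Population-size estimate: the number of observed units divided by
    the estimated probability 1 - exp(-lam) of being captured at least once."""
    lam = zt_poisson_mle(counts)
    return len(counts) / (1 - math.exp(-lam))
```

For mixtures of truncated Poisson densities, the article's point is that the same Horvitz-Thompson form carries over once the mixing distribution is estimated (e.g. by the NPMLE), so this one-component sketch is the degenerate special case.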
Abstract:
It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. ‘Emergent’ properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.
Abstract:
Design for low power in FPGAs is rather constrained, since the technology factors affecting power are either fixed or limited within an FPGA family. This paper investigates opportunities for power savings in a pipelined 2D IDCT design at the architecture and logic level. We report power consumption savings of over 25% in FPGA circuits, achieved through a clock-gating implementation of optimizations made at the algorithmic level.
Abstract:
A new organization of a neural-based predictor is introduced, obtained by rearranging the original computation at the algorithmic level without altering its result. Its FPGA implementation yields circuits that are 1.7× faster than a direct implementation of the original algorithm. This faster clock rate makes it possible to implement predictors with longer history lengths using nearly the same hardware budget.
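The abstract does not specify which neural predictor is meant; the canonical example in the branch-prediction literature is the perceptron predictor of Jiménez and Lin, whose original computation is the dot product sketched below. This is a software illustration of that baseline algorithm (a single perceptron, invented history length), not the paper's rearranged FPGA organization:

```python
# Minimal perceptron branch predictor (Jimenez & Lin style): prediction is
# the sign of a dot product between signed weights and the branch history.
HISTORY = 8
THRESHOLD = int(1.93 * HISTORY + 14)  # commonly used training threshold

weights = [0] * (HISTORY + 1)   # weights[0] is the bias weight
history = [1] * HISTORY         # +1 = taken, -1 = not taken

def predict():
    """Return (dot-product output, predicted-taken flag)."""
    y = weights[0] + sum(w * h for w, h in zip(weights[1:], history))
    return y, y >= 0

def train(taken):
    """Update weights on a mispredict or low-confidence output, then shift history."""
    y, pred = predict()
    outcome = 1 if taken else -1
    if pred != taken or abs(y) <= THRESHOLD:
        weights[0] += outcome
        for i in range(HISTORY):
            weights[i + 1] += outcome * history[i]
    history.pop(0)
    history.append(outcome)
```

The hardware interest lies in how the adder tree for `y` is arranged; reordering that computation without changing its value is the kind of "unaltered rearrangement" the abstract describes.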
Abstract:
In this paper we analyse the applicability and robustness of Markov chain Monte Carlo algorithms for eigenvalue problems, restricting our consideration to real symmetric matrices. Almost Optimal Monte Carlo (MAO) algorithms for solving eigenvalue problems are formulated. Results on the structure of both the systematic and the probability error are presented; it is shown that the two errors can be controlled independently by different algorithmic parameters. The results show how the systematic error depends on the matrix spectrum. The analysis of the probability error shows that the closer (in some sense) the matrix under consideration is to a stochastic matrix, the smaller this error is. Sufficient conditions for constructing robust and interpolation Monte Carlo algorithms are obtained, and for stochastic matrices an interpolation Monte Carlo algorithm is constructed. A number of numerical tests on large symmetric dense matrices are performed in order to study experimentally how the systematic error depends on the structure of the matrix spectrum; we also study how the probability error depends on the balancing of the matrix. (c) 2007 Elsevier Inc. All rights reserved.
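The basic Markov chain scheme behind such methods can be illustrated with a Monte Carlo power method. This sketch is a generic textbook construction, not the paper's MAO algorithm: random walks move with transition probabilities proportional to |a_ij|, the walk weight is corrected so that its expectation equals a bilinear form of a matrix power, and the ratio of consecutive weights estimates the dominant eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_dominant_eigenvalue(A, walk_len=8, n_walks=2000):
    """Monte Carlo power method: E[W_m] equals (e, A^m e) for the weighted
    random walk below, so the ratio E[W_m] / E[W_{m-1}] tends to the
    dominant eigenvalue as walk_len grows."""
    n = A.shape[0]
    row_abs = np.abs(A).sum(axis=1)
    P = np.abs(A) / row_abs[:, None]     # transition probabilities per row
    num = den = 0.0
    for _ in range(n_walks):
        i = rng.integers(n)              # uniform start over the n states
        w = float(n)                     # weight correcting for the 1/n start
        for step in range(walk_len):
            j = rng.choice(n, p=P[i])
            w *= np.sign(A[i, j]) * row_abs[i]   # importance-sampling correction
            i = j
            if step == walk_len - 2:
                den += w                 # accumulate W_{m-1}
        num += w                         # accumulate W_m
    return num / den
```

The paper's point about probability error is visible here: for a stochastic (or nearly stochastic) matrix the per-step weight factor is (close to) constant, so the variance of the estimator collapses.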
Abstract:
The diagnosis of thalassaemia in archaeological populations has long been hindered by a lack of pathognomonic features and the non-specific nature of cribra orbitalia and porotic hyperostosis. However, clinical research has highlighted more specific diagnostic criteria for thalassaemia major and intermedia based on changes to the thorax ('rib-within-a-rib' and costal osteomas). A recent re-examination of 364 child skeletons from Romano-British Poundbury Camp, Dorset revealed children with general 'wasting' of the bones, and three children who demonstrated a variety of severe lesions (e.g. zygomatic bone and rib hypertrophy, porotic hyperostosis, rib lesions, osteopenia and pitted diaphyseal shafts) that are inconsistent with dietary deficiency alone and more consistent with a diagnosis of genetic anaemia. Two of these children displayed rib lesions typical of those seen in modern cases of thalassaemia. The children of Poundbury Camp represent the first cases of genetic anaemia identified in a British archaeological population. As thalassaemia is a condition strongly linked to Mediterranean communities, the presence of this condition in a child from England, found within a mausoleum, suggests that the child was born to wealthy immigrant parents living in this small Roman settlement in Dorset. This paper explores the diagnostic criteria for genetic anaemia in the archaeological literature and what its presence in ancient populations can contribute to our knowledge of past human migration.