35 results for Algoritmo FORM
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
One controversial idea in the debate on urban sustainability is that urban sprawl is a source of ecological stress. We have tested this popular assumption by measuring the ecological footprint of commuting and housing for the 163 municipalities of the Barcelona Metropolitan Region and by relating the estimated values to residential density and to accessibility, the fundamental determinant of residential density according to the Monocentric City Model.
Abstract:
This work analyzes the performance of four shared-memory multiprocessor compute nodes in solving the N-body problem. The serial algorithm is parallelized and coded in C extended with OpenMP. The result is two variants that follow two different optimization criteria: minimizing memory requirements and minimizing the volume of computation. The performance of the program on the compute nodes is then analyzed: the sequential and parallel variants of the application, as well as the compute nodes themselves, are modeled; the programs are instrumented and executed to obtain results in the form of several metrics; finally, the results are presented and interpreted, providing keys that explain inefficiencies and performance bottlenecks and possible lines of improvement. The experience of this particular study has allowed us to sketch an incipient methodology for performance analysis, problem identification, and tuning of algorithms to shared-memory multiprocessor compute nodes.
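As an illustration of the kind of code the abstract describes, the following is a minimal sketch of a direct-summation N-body acceleration computation in C parallelized with OpenMP; the function names, initial conditions and parameters are illustrative assumptions, not taken from the project.

#include <math.h>
#include <stdio.h>
#include <stdlib.h>
/* Build with OpenMP enabled, e.g. gcc -O2 -fopenmp nbody.c -lm */

typedef struct { double x, y, z, mass; } body_t;

/* Direct O(n^2) gravitational accelerations.  The outer loop is shared
   among OpenMP threads; each thread writes a disjoint slice of acc[],
   so no synchronization is required. */
void compute_accelerations(const body_t *b, double (*acc)[3],
                           long n, double G, double eps)
{
    #pragma omp parallel for schedule(static)
    for (long i = 0; i < n; ++i) {
        double ax = 0.0, ay = 0.0, az = 0.0;
        for (long j = 0; j < n; ++j) {
            if (j == i) continue;
            double dx = b[j].x - b[i].x;
            double dy = b[j].y - b[i].y;
            double dz = b[j].z - b[i].z;
            double r2 = dx * dx + dy * dy + dz * dz + eps * eps; /* softened */
            double inv_r3 = 1.0 / (r2 * sqrt(r2));
            ax += G * b[j].mass * dx * inv_r3;
            ay += G * b[j].mass * dy * inv_r3;
            az += G * b[j].mass * dz * inv_r3;
        }
        acc[i][0] = ax; acc[i][1] = ay; acc[i][2] = az;
    }
}

int main(void)
{
    long n = 1000;
    body_t *bodies = malloc((size_t)n * sizeof *bodies);
    double (*acc)[3] = malloc((size_t)n * sizeof *acc);
    for (long i = 0; i < n; ++i)   /* arbitrary toy initial conditions */
        bodies[i] = (body_t){ (double)i, 0.5 * i, 0.25 * i, 1.0 };
    compute_accelerations(bodies, acc, n, 1.0, 1e-3);
    printf("acc[0] = (%g, %g, %g)\n", acc[0][0], acc[0][1], acc[0][2]);
    free(bodies); free(acc);
    return 0;
}

One plausible reading of the two optimization criteria mentioned above is the classic trade-off between recomputing every pairwise interaction (lower memory traffic) and exploiting the symmetry of the pairwise forces to halve the arithmetic at the cost of extra storage and synchronization; the project's actual variants may of course differ.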
Abstract:
BOLD APS is a software package designed to solve the production planning problem. As such, it relies on an algorithm, under continuous development, whose role is to make the appropriate decisions to obtain a good task schedule. This project consists of two phases: the main one comprises the design and implementation of a new section within the production planning algorithm used by the company Global Planning Solution, with the aim of improving the quality of the current solutions; the secondary phase consists of debugging, cleaning, and organizing the code to make it easier to understand and to modify later.
Abstract:
This project deals with the generation of profitability and the distribution of its benefits. Inspired by Davis (1947, 1955), we define profitability as the ratio of revenue to cost. Profitability is not as popular a measure of business financial performance as profit, the difference between revenue and cost. Regardless of its popularity, however, profitability is surely a useful financial performance measure. Our primary objective in this project is to identify the factors that generate change in profitability. One set of factors, which we refer to as sources, consists of changes in quantities and prices of outputs and inputs. Individual quantity changes aggregate to the overall impact of quantity change on profitability change, which we call productivity change. Individual price changes aggregate to the overall impact of price change on profitability change, which we call price recovery change. In this framework profitability change consists exclusively of productivity change and price recovery change. A second set of factors, which we refer to as drivers, consists of phenomena such as technical change, change in the efficiency of resource allocation, and the impact of economies of scale. The ability of management to harness these factors drives productivity change, which is one component of profitability change. Thus the term sources refers to quantities and prices of individual outputs and inputs, whose changes influence productivity change or price recovery change, either of which influences profitability change. The term drivers refers to phenomena related to technology and management that influence productivity change (but not price recovery change), and hence profitability change.
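In index-number notation (one common way to write this decomposition; the symbols are ours, and the project may use a different formalization), with revenue \(R = P\,Y\) and cost \(C = W\,X\) split into aggregate price indices \(P, W\) and quantity indices \(Y, X\), profitability and its change between periods 0 and 1 are

\[
\Pi = \frac{R}{C} = \frac{P\,Y}{W\,X},
\qquad
\frac{\Pi^{1}}{\Pi^{0}}
= \underbrace{\frac{Y^{1}/Y^{0}}{X^{1}/X^{0}}}_{\text{productivity change}}
\times
\underbrace{\frac{P^{1}/P^{0}}{W^{1}/W^{0}}}_{\text{price recovery change}} .
\]

So, provided the quantity and price indices satisfy the product test in each period, any change in profitability is accounted for exactly by the quantity-side term (productivity change) and the price-side term (price recovery change), as stated in the abstract.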
Design and evaluation of a parallel algorithm for Gaussian elimination on multi-core processors
Abstract:
In the eighties, John Aitchison (1986) developed a new methodological approach for the statistical analysis of compositional data. This new methodology was implemented in Basic routines grouped under the name CODA and later NEWCODA in Matlab (Aitchison, 1997). After that, several other authors published extensions to this methodology: Marín-Fernández and others (2000), Barceló-Vidal and others (2001), Pawlowsky-Glahn and Egozcue (2001, 2002) and Egozcue and others (2003). (...)
Abstract:
The following article presents the work carried out in creating a free software application that graphically represents the generated routes and the distribution of the transported items inside a cargo vehicle.
Abstract:
Pollard's Rho algorithm is one of the best-known methods for solving the discrete logarithm problem. This project is an implementation of a parallelization of it using MPI on a cluster. In it the reader will find the parallelization algorithm used, as well as a set of tests and execution results, duly analyzed.
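For context, the sketch below shows (in C) the serial random-walk iteration and collision step that such a parallelization typically distributes across MPI processes; the toy parameters (p = 1019, g = 2, h = 5) and all names are illustrative assumptions, and the project's actual MPI scheme is not reproduced here.

#include <stdint.h>
#include <stdio.h>

/* Toy instance of the discrete logarithm problem: find x with g^x = h (mod p). */
static const uint64_t P = 1019;          /* prime modulus                          */
static const uint64_t N = 1018;          /* order of g in (Z/pZ)*                  */
static const uint64_t G = 2, H = 5;      /* base and target (here 2^10 = 5 mod p)  */

/* One step of the rho walk; the invariant y = g^a * h^b (mod P) is maintained. */
static void step(uint64_t *y, uint64_t *a, uint64_t *b)
{
    switch (*y % 3) {
    case 0:  *y = (*y * *y) % P; *a = (2 * *a) % N; *b = (2 * *b) % N; break;
    case 1:  *y = (*y * G) % P;  *a = (*a + 1) % N;                    break;
    default: *y = (*y * H) % P;  *b = (*b + 1) % N;                    break;
    }
}

/* Modular inverse via the extended Euclidean algorithm; returns 0 if none exists. */
static uint64_t inverse_mod(uint64_t a, uint64_t n)
{
    int64_t t = 0, newt = 1, r = (int64_t)n, newr = (int64_t)(a % n);
    while (newr != 0) {
        int64_t q = r / newr, tmp;
        tmp = t - q * newt; t = newt; newt = tmp;
        tmp = r - q * newr; r = newr; newr = tmp;
    }
    if (r != 1) return 0;
    return (uint64_t)((t % (int64_t)n + (int64_t)n) % (int64_t)n);
}

int main(void)
{
    /* Floyd cycle detection: the hare walks twice as fast as the tortoise. */
    uint64_t yt = 1, at = 0, bt = 0, yh = 1, ah = 0, bh = 0;
    do {
        step(&yt, &at, &bt);
        step(&yh, &ah, &bh);
        step(&yh, &ah, &bh);
    } while (yt != yh);

    /* Collision: g^at * h^bt = g^ah * h^bh, so (at - ah) = x * (bh - bt) mod N. */
    uint64_t da = (at + N - ah) % N, db = (bh + N - bt) % N;
    uint64_t inv = inverse_mod(db, N);
    if (inv == 0) {
        puts("gcd(bh - bt, N) > 1: restart the walk from another point");
        return 1;
    }
    uint64_t x = (da * inv) % N;

    /* Check the answer by square-and-multiply exponentiation. */
    uint64_t acc = 1, base = G, e = x;
    while (e) { if (e & 1) acc = (acc * base) % P; base = (base * base) % P; e >>= 1; }
    printf("x = %llu, g^x mod p = %llu, h = %llu\n",
           (unsigned long long)x, (unsigned long long)acc, (unsigned long long)H);
    return 0;
}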
Abstract:
Morales–Ramis theory is Galois theory in the context of dynamical systems and relates two different notions of integrability: integrability in the sense of Liouville of a Hamiltonian system and integrability in the sense of the differential Galois theory of a differential equation. This article presents some applications of Morales–Ramis theory to non-integrability problems for Hamiltonian systems whose normal variational equation along a particular integral curve is a second-order linear differential equation with rational function coefficients. The integrability of the normal variational equation is analyzed by means of the Kovacic algorithm.
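As general background (not the specific equations of the article), the Kovacic algorithm applies to a second-order linear equation with rational coefficients,

\[
\xi'' + a(x)\,\xi' + b(x)\,\xi = 0, \qquad a, b \in \mathbb{C}(x),
\]

which the substitution \(\xi = y\, e^{-\frac{1}{2}\int a\,dx}\) brings to the reduced form

\[
y'' = r(x)\, y, \qquad r = \frac{a^{2}}{4} + \frac{a'}{2} - b \in \mathbb{C}(x).
\]

The algorithm then decides whether this reduced equation admits Liouvillian solutions, i.e. whether its differential Galois group is a proper algebraic subgroup of \(\mathrm{SL}(2,\mathbb{C})\); this is the integrability test applied to the normal variational equation.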
Abstract:
This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-Spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
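In symbols (our notation, paraphrasing the abstract rather than reproducing the paper's formulas), the velocity field, the displacement it generates, and the energy being minimized are

\[
v(x,t) = \sum_{k} c_{k}\, B_{k}(x,t),
\qquad
\varphi(x,t) = x + \int_{0}^{t} v\bigl(\varphi(x,s), s\bigr)\, ds,
\]

with the integral evaluated by forward Euler steps \(\varphi_{n+1} = \varphi_{n} + \Delta t\, v(\varphi_{n}, t_{n})\), and

\[
E(v) = \sum_{t} \int_{\Omega} \bigl( I_{t}(\varphi(x,t)) - I_{\mathrm{ref}}(x) \bigr)^{2}\, dx
\; + \; \lambda\, \mathcal{R}_{\mathrm{incomp}}(v),
\]

where the \(B_{k}\) are spatiotemporal B-spline kernels with coefficients \(c_{k}\), \(I_{t}\) is the frame at time \(t\), \(I_{\mathrm{ref}}\) the chosen reference frame, and \(\mathcal{R}_{\mathrm{incomp}}\) the incompressibility-based regularization term with weight \(\lambda\).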
Abstract:
This paper deals with the form and use of reformulation markers in research papers written in English, Spanish and Catalan. Considering the form and frequency of the markers, English papers tend to prefer simple fixed markers and include fewer reformulators than Spanish and Catalan ones. By contrast, formal Catalan and Spanish papers include more markers, some of which are complex and allow for some structural variability. As for use, reformulation markers establish dynamic relationships between portions of discourse, which can be identified in our corpus with expansion, reduction, and permutation. The analysis of the corpus shows that English authors usually reformulate to add more information to the concept (expansion), whereas Catalan and Spanish authors reduce the contents or the implicatures of the previous formulation more frequently than English authors do.
Abstract:
In this paper we view bargaining and cooperation as an interaction superimposed on a strategic form game. A multistage bargaining procedure for N players, the proposer commitment procedure, is presented. It is inspired by Nash's two-player variable-threat model; a key feature is the commitment to threats. We establish links to classical cooperative game theory solutions, such as the Shapley value in the transferable utility case. However, we show that even in standard pure exchange economies the traditional coalitional function may not be adequate when utilities are not transferable.
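For reference, the transferable utility solution the abstract links to is the standard Shapley value (general background, not a result of the paper): for a coalitional function \(v\) on player set \(N\),

\[
\phi_{i}(v) = \sum_{S \subseteq N \setminus \{i\}} \frac{|S|!\,(|N|-|S|-1)!}{|N|!}\, \bigl( v(S \cup \{i\}) - v(S) \bigr),
\]

the average marginal contribution of player \(i\) over all orders in which the grand coalition can be assembled.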
Abstract:
In this paper we propose a general technique to develop first and second order closed-form approximation formulas for short-time options with random strikes. Our method is based on Malliavin calculus techniques and allows us to obtain simple closed-form approximation formulas depending on the derivative operator. The numerical analysis shows that these formulas are extremely accurate and improve some previous approaches on two-asset and three-asset spread options, such as Kirk's formula or the decomposition method presented in Alòs, Eydeland and Laurence (2011).
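For context, the payoff being approximated is the standard spread option payoff (a textbook definition, not notation from the paper): a two-asset spread option with strike \(K\) and maturity \(T\) pays \(\bigl(S_{1}(T) - S_{2}(T) - K\bigr)^{+}\), and the three-asset version pays \(\bigl(S_{1}(T) - S_{2}(T) - S_{3}(T) - K\bigr)^{+}\); Kirk's formula and the decomposition method of Alòs, Eydeland and Laurence (2011) are closed-form approximations to the prices of such payoffs.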
Abstract:
We explain why European trucking carriers are much smaller and rely more heavily on owner-operators (as opposed to employee drivers) than their US counterparts. Our analysis begins by ruling out differences in technology as the source of those disparities and confirms that standard hypotheses in organizational economics, which have been shown to explain the choice of organizational form in US industry, also apply in Europe. We then argue that the preference for subcontracting over vertical integration in Europe is the result of European institutions, particularly labor regulation and tax laws, that increase the costs of vertical integration.
Abstract:
Equivalence classes of normal form games are defined using the geometry of correspondences of standard equilibrium concepts like correlated, Nash, and robust equilibrium or risk dominance and rationalizability. The resulting equivalence classes are fully characterized and compared across different equilibrium concepts for 2 x 2 games. It is argued that the procedure can lead to broad and game-theoretically meaningful distinctions of games as well as to alternative ways of viewing and testing equilibrium concepts. Larger games are also briefly considered.
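As a reminder of the object being classified (a generic illustration, not an example from the paper), a 2 x 2 normal form game is specified by a payoff bimatrix

\[
\begin{array}{c|cc}
 & L & R \\ \hline
T & (a_{1}, b_{1}) & (a_{2}, b_{2}) \\
B & (a_{3}, b_{3}) & (a_{4}, b_{4})
\end{array}
\]

where the row player chooses \(T\) or \(B\), the column player chooses \(L\) or \(R\), and each cell lists the (row, column) payoffs; the equivalence classes discussed in the abstract group together payoff configurations whose equilibrium correspondences share the same geometry.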