942 results for "Program analysis techniques"
Abstract:
Traditional logic programming languages, such as Prolog, use a fixed left-to-right atom scheduling rule. Recent logic programming languages, however, usually provide more flexible scheduling in which computation generally proceeds left-to-right but in which some calls are dynamically "delayed" until their arguments are sufficiently instantiated to allow the call to run efficiently. Such dynamic scheduling has a significant cost. We give a framework for the global analysis of logic programming languages with dynamic scheduling and show that program analysis based on this framework supports optimizations which remove much of the overhead of dynamic scheduling.
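The delaying behavior the abstract describes can be illustrated with a toy scheduler (a sketch only, not the paper's framework or any Prolog engine's internals): goals run left-to-right, but a goal whose required arguments are unbound is delayed and woken once a later goal binds them.

```python
# Toy sketch of dynamic scheduling: each goal is (name, required_vars,
# produced_vars); a goal is delayed until its required vars are bound.
def run(goals, bindings=None):
    bindings = set(bindings or ())
    delayed, order = [], []
    pending = list(goals)
    while pending or delayed:
        # Wake any delayed goal whose arguments are now instantiated.
        ready = [g for g in delayed if set(g[1]) <= bindings]
        for g in ready:
            delayed.remove(g)
            pending.insert(0, g)
        if not pending:
            raise RuntimeError("floundering: goals delayed forever")
        name, req, prod = pending.pop(0)
        if set(req) <= bindings:
            order.append(name)
            bindings |= set(prod)
        else:
            delayed.append((name, req, prod))
    return order

# 'use_x' needs X, which only 'make_x' produces, so it is delayed past it.
print(run([("use_x", ["X"], []), ("make_x", [], ["X"])]))
# → ['make_x', 'use_x']
```

The bookkeeping (checking and re-checking delayed goals) is exactly the run-time overhead that the global analysis aims to remove, by proving statically when a call will already be sufficiently instantiated.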
Abstract:
The aim of program specialization is to optimize programs by exploiting certain knowledge about the context in which the program will execute. There exist many program manipulation techniques which allow specializing the program in different ways. Among them, one of the best known techniques is partial evaluation, often referred to simply as program specialization, which optimizes programs by specializing them for (partially) known input data. In this work we describe abstract specialization, a technique whose main features are: (1) specialization is performed with respect to "abstract" values rather than "concrete" ones, and (2) abstract interpretation rather than standard interpretation of the program is used in order to propagate information about execution states. The concept of abstract specialization is at the heart of the specialization system in CiaoPP, the Ciao system preprocessor. In this paper we present a unifying view of the different specialization techniques used in CiaoPP and discuss their potential applications by means of examples. The applications discussed include program parallelization, optimization of dynamic scheduling (concurrency), and integration of partial evaluation techniques.
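Classical partial evaluation, which the abstract contrasts with abstract specialization, can be sketched with the textbook power example (a concrete-value sketch only; abstract specialization would propagate abstract descriptions of the input instead):

```python
# Minimal partial-evaluation sketch: specialize power(x, n) for a
# statically known exponent n, unfolding the recursion into a residual
# expression over the dynamic input x.
def specialize_power(n):
    """Return source code for a power function specialized to exponent n."""
    expr = "1" if n == 0 else "*".join(["x"] * n)
    return f"def power_{n}(x):\n    return {expr}"

src = specialize_power(3)
print(src)
# def power_3(x):
#     return x*x*x
namespace = {}
exec(src, namespace)
print(namespace["power_3"](2))   # → 8
```

The residual program contains no loop and no test on n: the statically known part of the input has been evaluated away, which is the optimization partial evaluation delivers.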
Abstract:
Ciao Prolog incorporates a module system which allows separate compilation and sensible creation of standalone executables. We describe some of the main aspects of the Ciao modular compiler, ciaoc, which takes advantage of the characteristics of the Ciao Prolog module system to automatically perform separate and incremental compilation and efficiently build small, standalone executables with competitive run-time performance. ciaoc can also statically detect a larger number of programming errors. We also present a generic code processing library for handling modular programs, which provides an important part of the functionality of ciaoc. This library allows the development of program analysis and transformation tools in a way that is to some extent orthogonal to the details of module system design, and has been used in the implementation of ciaoc and other Ciao system tools. We also describe the different types of executables which can be generated by the Ciao compiler, which offer different tradeoffs between executable size, startup time, and portability, depending, among other factors, on the linking regime used (static, dynamic, lazy, etc.). Finally, we provide experimental data which illustrate these tradeoffs.
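The core decision behind separate/incremental compilation can be sketched with the classic timestamp check (an illustrative sketch, not ciaoc's actual algorithm): recompile a module only if its source, or any interface it depends on, is newer than its object file.

```python
# Timestamp-based incremental-compilation check (illustrative only).
import os

def needs_recompile(src, obj, deps=()):
    """True if `obj` is missing, or older than `src` or any file in `deps`."""
    if not os.path.exists(obj):
        return True
    obj_t = os.path.getmtime(obj)
    return any(os.path.getmtime(f) > obj_t
               for f in (src, *deps) if os.path.exists(f))

# e.g. needs_recompile("m.pl", "m.po", deps=["lib.itf"])
```

A build driver walks the module dependency graph calling a check like this for each module, which is how a module system avoids makefiles while still recompiling only what changed.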
Abstract:
Ciao is a public domain, next generation multi-paradigm programming environment with a unique set of features: Ciao offers a complete Prolog system, supporting ISO-Prolog, but its novel modular design allows both restricting and extending the language. As a result, it allows working with fully declarative subsets of Prolog and also extending these subsets (or ISO-Prolog) both syntactically and semantically. Most importantly, these restrictions and extensions can be activated separately on each program module so that several extensions can coexist in the same application for different modules. Ciao also supports (through such extensions) programming with functions, higher-order (with predicate abstractions), constraints, and objects, as well as feature terms (records), persistence, several control rules (breadth-first search, iterative deepening, ...), concurrency (threads/engines), a good base for distributed execution (agents), and parallel execution. Libraries also support WWW programming, sockets, external interfaces (C, Java, TclTk, relational databases, etc.), etc. Ciao offers support for programming in the large with a robust module/object system, module-based separate/incremental compilation (automatic, with no need for makefiles), an assertion language for declaring (optional) program properties (including types and modes, but also determinacy, non-failure, cost, etc.), automatic static inference and static/dynamic checking of such assertions, etc. Ciao also offers support for programming in the small, producing small executables (including only those builtins used by the program), and support for writing scripts in Prolog. The Ciao programming environment includes a classical top-level and a rich emacs interface with an embeddable source-level debugger and a number of execution visualization tools.
The Ciao compiler (which can be run outside the top-level shell) generates several forms of architecture-independent and standalone executables, which run with speed, efficiency, and executable size that are very competitive with other commercial and academic Prolog/CLP systems. Library modules can be compiled into compact bytecode or C source files, and linked statically, dynamically, or autoloaded. The novel modular design of Ciao enables, in addition to modular program development, effective global program analysis and static debugging and optimization via source-to-source program transformation. These tasks are performed by the Ciao preprocessor (ciaopp, distributed separately). The Ciao programming environment also includes lpdoc, an automatic documentation generator for LP/CLP programs. It processes Prolog files adorned with (Ciao) assertions and machine-readable comments and generates manuals in many formats including postscript, pdf, texinfo, info, HTML, man, etc., as well as on-line help, ascii README files, entries for indices of manuals (info, WWW, ...), and maintains WWW distribution sites.
Abstract:
Urban areas benefit from significant improvements in accessibility when a new high speed rail (HSR) project is built. These improvements, which are due mainly to a rise in efficiency, produce locational advantages and increase the attractiveness of these cities, thereby possibly enhancing their competitiveness and economic growth. However, there may be equity issues at stake, as the main accessibility benefits are primarily concentrated in urban areas with a HSR station, whereas other locations obtain only limited benefits. HSR extensions may contribute to an increase in spatial imbalance and lead to more polarized patterns of spatial development. Procedures for assessing the spatial impacts of HSR must therefore follow a twofold approach which addresses issues of both efficiency and equity. This analysis can be made by jointly assessing both the magnitude and distribution of the accessibility improvements deriving from a HSR project. This paper describes an assessment methodology for HSR projects which follows this twofold approach. The procedure uses spatial impact analysis techniques and is based on the computation of accessibility indicators, supported by a Geographical Information System (GIS). Efficiency impacts are assessed in terms of the improvements in accessibility resulting from the HSR project, with a focus on major urban areas; and spatial equity implications are derived from changes in the distribution of accessibility values among these urban agglomerations.
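The twofold efficiency/equity assessment can be sketched numerically (the indicator choices below are assumptions for illustration, not the paper's exact formulation): a population-weighted mean travel time as the accessibility indicator, and the coefficient of variation across cities as an equity proxy.

```python
# Accessibility sketch: weighted mean travel time per city, plus a
# coefficient of variation over cities as a simple (in)equity measure.
from statistics import mean, pstdev

def accessibility(city, times, population):
    """Population-weighted mean travel time from `city` to all others."""
    others = [c for c in times if c != city]
    w = sum(population[c] for c in others)
    return sum(times[city][c] * population[c] for c in others) / w

# Hypothetical 3-city network; a new HSR link cuts the A-B time only.
pop = {"A": 3.0, "B": 1.0, "C": 1.0}
before = {"A": {"B": 120, "C": 150}, "B": {"A": 120, "C": 90}, "C": {"A": 150, "B": 90}}
after  = {"A": {"B": 50,  "C": 150}, "B": {"A": 50,  "C": 90}, "C": {"A": 150, "B": 90}}

for label, t in (("before", before), ("after", after)):
    acc = {c: accessibility(c, t, pop) for c in t}
    cv = pstdev(acc.values()) / mean(acc.values())   # dispersion = inequity proxy
    print(label, {c: round(v, 1) for c, v in acc.items()}, "CV =", round(cv, 3))
```

In this toy network the HSR link improves A's and B's indicators while C's is unchanged, so the mean accessibility improves (efficiency gain) but the dispersion grows (equity loss) — exactly the trade-off the twofold approach is meant to expose.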
Abstract:
The figure of the health and safety coordinator in the construction sector first appeared in our legislation with the transposition of the European Directives (in our case, Royal Decree 1627/97 on minimum health and safety regulations in construction works), and the way coordinators are hired and their role in the construction industry differ across countries of the European Union. Coordinating health and safety is also a management process that requires certain competencies which are not based solely on technical or professional training but which, given the work environment, also call for strategies and tools related to experience and personal skills. Through research that took account of expert opinion on the matter, we identified the competencies a health and safety coordinator must possess in order to improve safety in the works they coordinate. The conclusions of the analyses, performed using appropriate statistical methods (comparison of means and multivariate analysis techniques), will make it possible to design training programmes and to ensure that the health and safety coordinators selected have the competencies required to carry out their duties.
Abstract:
Commercial computer-aided design systems support the geometric definition of product, but they lack utilities to support initial design stages. Typical tasks such as customer need capture, functional requirement formalization, or design parameter definition are conducted in applications that, for instance, support "quality function deployment" and "failure modes and effects analysis" techniques. Such applications are noninteroperable with the computer-aided design systems, leading to discontinuous design information flows. This study addresses this issue and proposes a method to enhance the integration of design information generated in the early design stages into a commercial computer-aided design system. To demonstrate the feasibility of the approach adopted, a prototype application was developed and two case studies were executed.
Abstract:
Fourier transform infrared (FTIR) spectroscopy was applied to determine the type of surface treatment and dose used on cork stoppers and to predict the friction between stopper and bottleneck. Agglomerated cork stoppers were finished with two different doses and using two surface treatments: P (paraffin and silicone), 15 and 25 mg/stopper, and S (only silicone), 10 and 15 mg/stopper. FTIR spectra were recorded at five points for each stopper by attenuated total reflectance (ATR). Absorbances at 1,010, 2,916, and 2,963 cm⁻¹ were obtained in each spectrum. Discriminant analysis techniques allowed the treatment and dose applied to each stopper to be identified from the absorbance values. Success rates of 91.2% were obtained from individual values and 96.0% from the mean values of each stopper. Spectrometric data also allowed treatment homogeneity to be determined on the stopper surface, and a multiple regression model was used to predict the friction index (If = Fe/Fc) (R² = 0.93).
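The classification step can be sketched with a nearest-centroid classifier (a deliberate simplification of the paper's discriminant analysis; the absorbance values and class labels below are made up for illustration): each stopper's three-band ATR-FTIR absorbance vector is assigned to the closest class centroid.

```python
# Nearest-centroid sketch of discriminant classification on
# absorbance vectors (bands at 1010, 2916, 2963 cm^-1).
import math

def fit_centroids(samples):
    """samples: {label: [feature vectors]} -> {label: centroid}."""
    return {lab: [sum(col) / len(col) for col in zip(*vecs)]
            for lab, vecs in samples.items()}

def classify(x, centroids):
    """Assign x to the class whose centroid is nearest (Euclidean)."""
    return min(centroids, key=lambda lab: math.dist(x, centroids[lab]))

# Hypothetical training spectra for two treatment/dose classes.
train = {
    "P15": [[0.42, 0.10, 0.08], [0.44, 0.11, 0.07]],
    "S10": [[0.30, 0.22, 0.18], [0.28, 0.20, 0.19]],
}
cents = fit_centroids(train)
print(classify([0.43, 0.10, 0.08], cents))   # → P15
```

Averaging the five per-stopper spectra before classifying, as the abstract reports, reduces measurement noise, which is consistent with the higher success rate obtained from mean values.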
Abstract:
A solar cell is a solid state device that converts the energy of sunlight directly into electricity by the photovoltaic effect. When light with photon energies greater than the band gap is absorbed by a semiconductor material, free electrons and free holes are generated by optical excitation in the material. The main characteristic of a photovoltaic device is the presence of an internal electric field able to separate the free electrons and holes so they can pass out of the material to the external circuit before they recombine. Numerical simulation of photovoltaic devices plays a crucial role in their design, performance prediction, and comprehension of the fundamental phenomena ruling their operation. The electrical transport and the optical behavior of the solar cells discussed in this work were studied with the simulation code D-AMPS-1D. This software is an updated version of the one-dimensional (1D) simulation program Analysis of Microelectronic and Photonic Devices (AMPS) that was initially developed at Penn State University, USA. Structures such as homojunctions, heterojunctions, multijunctions, etc., resulting from stacking layers of different materials can be studied by appropriately selecting characteristic parameters. In this work, examples of cell simulations made with D-AMPS-1D are shown. In particular, results for Ge photovoltaic devices are presented. The role of the InGaP buffer on the device was studied. Moreover, a comparison of the simulated electrical parameters with experimental results was performed.
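The carrier generation and separation described above leads, in lumped form, to the familiar single-diode I-V model (a hedged sketch only; D-AMPS-1D solves the full one-dimensional transport equations rather than this lumped model, and the parameter values here are illustrative):

```python
# Single-diode solar-cell model: I = I_ph - I_0 * (exp(qV / nkT) - 1).
import math

def cell_current(v, i_ph=0.035, i_0=1e-9, n=1.2, t=300.0):
    """Terminal current (A) at voltage v (V) for illustrative parameters."""
    vt = 1.380649e-23 * t / 1.602176634e-19   # thermal voltage kT/q
    return i_ph - i_0 * (math.exp(v / (n * vt)) - 1.0)

# At short circuit the diode term vanishes, so I equals the photocurrent.
print(round(cell_current(0.0), 4))            # → 0.035
# At forward bias the diode term subtracts from the photocurrent.
print(cell_current(0.5) < cell_current(0.0))  # → True
```

Even this crude model reproduces the qualitative I-V shape whose detailed prediction (for Ge cells with an InGaP buffer, material profiles, etc.) requires the full numerical simulation.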
Abstract:
In recent years a large number of footbridges have been built in response to the demand for new pedestrian crossings in cities. These structures have less demanding construction requirements than other bridge types, which has encouraged designs with new structural schemes, complex geometries, and innovative materials. As a result, footbridges tend to be slender, light, and lightly damped, which has occasionally led to vibration problems under pedestrian traffic once in service. Current codes are increasingly sensitive to this problem: they recommend designs whose natural frequencies lie away from typical pedestrian pacing rates, and they set comfort limits in the form of maximum permitted accelerations to guarantee the serviceability of the structure. This paper examines the problem from a practical point of view. The key points of the footbridge codes and design guides currently available in the literature are presented, the techniques commonly used in the experimental dynamic analysis of these structures are described, and the solutions generally adopted to improve their dynamic behavior are discussed. Finally, the work carried out on the Valladolid Science Museum Footbridge by Centro Tecnológico CARTIF in collaboration with the Universities of Valladolid and Castilla-La Mancha is presented. This work includes: (1) the dynamic study of the three steel spans of the footbridge, (2) the design and implementation of a tuned mass damper in the liveliest span, (3) the implementation of an active mass damper using an electrodynamic shaker, and (4) field tests to verify the serviceability of the footbridge.
Abstract:
The employment of nonlinear analysis techniques for automatic voice pathology detection systems has gained popularity due to the ability of such techniques to deal with the underlying nonlinear phenomena. In this respect, characterization using nonlinear analysis typically employs the classical Correlation Dimension and the largest Lyapunov Exponent, as well as some regularity quantifiers computing the system predictability. Mostly, regularity features depend strongly on a correct choice of some parameters. One of those, the delay time τ, is usually fixed to 1. Nonetheless, it has been stated that a unity τ cannot avoid linear correlation of the time series and hence may not correctly capture system nonlinearities. Therefore, the present work studies the influence of the τ parameter on the estimation of regularity features. Three τ estimates are considered: the baseline value 1; a τ based on the Average Automutual Information criterion; and a τ chosen from the embedding window. Testing results obtained for pathological voice suggest that an improved accuracy might be obtained by using a τ value different from 1, as it accounts for the underlying nonlinearities of the voice signal.
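The Average Automutual Information criterion mentioned above can be sketched as follows (a minimal histogram-based sketch; the signal, bin count, and search range are illustrative assumptions, not the paper's settings): compute the mutual information between x(t) and x(t+τ) for increasing τ and take the first local minimum.

```python
# AMI-based delay estimation: histogram mutual information between a
# series and its tau-delayed copy; the delay is the first local minimum.
import math

def ami(x, tau, bins=16):
    """Histogram estimate of I(x(t); x(t+tau)) in nats."""
    pairs = list(zip(x[:-tau], x[tau:]))
    lo, hi = min(x), max(x)
    idx = lambda v: min(int((v - lo) / (hi - lo) * bins), bins - 1)
    joint, px, py = {}, {}, {}
    for a, b in pairs:
        i, j = idx(a), idx(b)
        joint[(i, j)] = joint.get((i, j), 0) + 1
        px[i] = px.get(i, 0) + 1
        py[j] = py.get(j, 0) + 1
    n = len(pairs)
    return sum(c / n * math.log((c / n) / ((px[i] / n) * (py[j] / n)))
               for (i, j), c in joint.items())

def first_minimum_delay(x, max_tau=40):
    """First local minimum of AMI over tau = 1..max_tau."""
    vals = [ami(x, t) for t in range(1, max_tau + 1)]
    for t in range(1, len(vals) - 1):
        if vals[t] < vals[t - 1] and vals[t] <= vals[t + 1]:
            return t + 1          # taus are 1-based
    return max_tau

# A sine sampled 20 points per cycle: the AMI minimum lies well past 1.
x = [math.sin(2 * math.pi * k / 20) for k in range(2000)]
print(first_minimum_delay(x) > 1)   # → True
```

At τ = 1 neighboring samples are almost identical (high mutual information, the linear correlation the abstract warns about); the first AMI minimum marks a delay at which the delayed copy carries genuinely new information.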
Abstract:
This project applies noise analysis techniques to characterize the dynamic response of several temperature sensors, both platinum resistance temperature detectors (RTDs) and thermocouples. These sensors are essential for the proper operation of nuclear power plants and must be monitored to guarantee accurate measurements. Noise analysis techniques are passive: they do not affect plant operation and allow in situ monitoring of the sensors. Since temperature sensors can be treated as first-order systems, the main parameter to monitor is the response time, which can be obtained for each probe by means of techniques in the frequency domain (spectral analysis) or in the time domain (autoregressive models). Besides the response-time estimation, a statistical characterization of the probes is carried out. The goal is to understand the behavior of the sensors and monitor them so that faults can be diagnosed even at an incipient stage.
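The time-domain route can be sketched for the first-order case (a sketch under assumed details, not the project's exact procedure): fit an AR(1) model to the noise record by least squares and recover the time constant from the AR coefficient via x_k = a·x_{k-1} + e_k, so τ = -Δt / ln(a).

```python
# AR(1)-based response-time (time constant) estimation from sensor noise.
import math
import random

def ar1_time_constant(x, dt):
    """Least-squares AR(1) fit; time constant tau = -dt / ln(a)."""
    num = sum(x[k] * x[k - 1] for k in range(1, len(x)))
    den = sum(v * v for v in x[:-1])
    a = num / den
    return -dt / math.log(a)

# Synthetic first-order sensor noise: tau = 0.5 s sampled at dt = 0.1 s.
random.seed(0)
dt, tau_true = 0.1, 0.5
a_true = math.exp(-dt / tau_true)
x, prev = [], 0.0
for _ in range(20000):
    prev = a_true * prev + random.gauss(0.0, 1.0)
    x.append(prev)
print(round(ar1_time_constant(x, dt), 2))   # close to 0.5
```

Because the method only needs the noise already present in the measurement channel, it is passive in exactly the sense the abstract describes: no test signal is injected and plant operation is unaffected.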
Abstract:
In this work, a methodology is proposed to find the dynamic poles of a capacitive pressure transmitter in order to enhance and extend the online surveillance of this type of sensor based on the response time measurement, by applying noise analysis techniques and the dynamic data system procedure. Several measurements taken from a pressurized water reactor have been analyzed. The methodology proposes an autoregressive fit whose order is determined by the sensor dynamic poles. Nevertheless, the analyzed signals could not be filtered properly to remove the plant noise, so the plant noise was modeled as an additional pair of complex conjugate poles. With this methodology we have obtained the numerical value of the sensor's second real pole, in spite of its low influence on the sensor dynamic response. This opens up more accurate online sensor surveillance, since previous methods considered only one real pole.
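The link between the autoregressive fit and the dynamic poles can be sketched as follows (model order and coefficient values are illustrative assumptions): the poles of an AR model are the roots of its characteristic polynomial, so for an AR(2) fit x_k = a1·x_{k-1} + a2·x_{k-2} the poles solve z² - a1·z - a2 = 0, yielding either two real poles (sensor dynamics) or a complex conjugate pair (the unfiltered plant noise).

```python
# Poles of an AR(2) model via the quadratic formula on z^2 - a1*z - a2.
import cmath

def ar2_poles(a1, a2):
    """Roots of the AR(2) characteristic polynomial z^2 - a1*z - a2."""
    disc = cmath.sqrt(a1 * a1 + 4.0 * a2)
    return (a1 + disc) / 2.0, (a1 - disc) / 2.0

# Two real poles: z^2 - 1.3z + 0.4 = (z - 0.8)(z - 0.5).
print(ar2_poles(1.3, -0.4))   # roots 0.8 and 0.5
# A complex conjugate pair, as used here to model the plant noise.
print(ar2_poles(1.0, -0.5))   # roots 0.5 ± 0.5j
```

In the methodology described, the fitted AR order is chosen so that its characteristic roots account for both the sensor poles and the extra conjugate pair contributed by the plant noise, which is what allows the weak second real pole to be separated out.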