940 results for formal verification
Abstract:
Side-channel attack (SCA) differs from traditional mathematical attacks: instead of exhaustive mathematical computation, it targets specific points in the execution of a cryptographic algorithm to reveal confidential information from a running crypto-device. Since its introduction by Paul Kocher et al. [1], SCA has been considered one of the most critical threats to resource-restricted but security-demanding applications such as wireless sensor networks (WSNs). In this paper, we focus on SCA-oriented security verification of WSNs. A detailed description of the platform setup and an analysis of the results of DPA (differential power analysis) and EMA (electromagnetic analysis) attacks are presented. The setup is deliberately low-cost yet yields effective SCAs. We also survey the weaknesses of WSNs in resisting SCA, especially electromagnetic attacks. Finally, SCA-prevention suggestions based on a Differential Security Strategy for FPGA hardware implementations in WSNs are given, helping to reach an improved compromise between security and cost.
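As a concrete illustration of the DPA principle referred to above, the following sketch computes a classic single-bit difference of means over a set of power traces. It is a generic textbook-style example under invented data: the selection function, the trace array and the key search are assumptions for the sketch, not the platform or algorithm used in the paper.

```python
import numpy as np

def dpa_difference_of_means(traces, plaintexts, key_guess, bit=0):
    """Classic single-bit DPA: partition power traces according to a
    key-dependent prediction and return the difference of the mean traces.
    A real attack would predict a nonlinear S-box output bit; for brevity
    this sketch predicts a bit of (plaintext byte XOR key guess) directly."""
    predictions = ((plaintexts[:, 0] ^ key_guess) >> bit) & 1
    mean_one = traces[predictions == 1].mean(axis=0)
    mean_zero = traces[predictions == 0].mean(axis=0)
    return mean_one - mean_zero   # peaks where the guess matches the device's key use

# Usage with random stand-in data (no real measurements are included here):
rng = np.random.default_rng(0)
traces = rng.normal(size=(1000, 500))                 # 1000 traces, 500 samples each
plaintexts = rng.integers(0, 256, size=(1000, 16), dtype=np.uint8)
best_guess = max(range(256),
                 key=lambda k: np.abs(dpa_difference_of_means(traces, plaintexts, k)).max())
```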
Abstract:
Verifying compliance with a design specification in manufacturing requires metrological instruments to check whether the magnitude associated with the specification lies within the tolerance range. Such instrumentation, and its use during the measurement process, carries a measurement uncertainty whose value must be related to the tolerance being tested. Most papers dealing jointly with tolerances and measurement uncertainties focus on establishing an uncertainty-tolerance relationship without paying much attention to the impact on process cost. This paper analyzes the relationship between cost and measurement uncertainty, considering uncertainty as a productive factor in the process outcome. The analysis starts from a cost-tolerance model associated with the process; by means of this model, the effect of measurement uncertainty is quantified in terms of cost and its impact on the process is analyzed.
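To make the cost impact of measurement uncertainty concrete, the following sketch combines a generic reciprocal-power cost-tolerance model with a simple 2U guard band: the uncertainty narrows the tolerance the process can actually consume, which moves it up the cost curve. Both the model form C(t) = a + b/t^k and the numeric values are assumptions chosen for illustration, not the model used in the paper.

```python
def cost_of_tolerance(t, a=1.0, b=4.0, k=1.5):
    """Generic reciprocal-power cost-tolerance model C(t) = a + b / t**k."""
    return a + b / t**k

def uncertainty_cost_impact(tolerance, uncertainty):
    """Extra manufacturing cost caused by guard-banding the tolerance.

    Acceptance is restricted to an interval reduced by 2*U in total width,
    so the process must effectively hold a tighter tolerance."""
    effective_tolerance = tolerance - 2 * uncertainty
    if effective_tolerance <= 0:
        raise ValueError("uncertainty consumes the whole tolerance")
    return cost_of_tolerance(effective_tolerance) - cost_of_tolerance(tolerance)

# Example: a tolerance of 0.10 mm measured with U = 0.01 mm
extra_cost = uncertainty_cost_impact(0.10, 0.01)
```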
Abstract:
This work is based on the High Temperature Engineering Test Reactor (HTTR), a prototype of the Japan Atomic Energy Agency (JAEA). Its objective is to describe an adequate deterministic model to be used in the assessment of the reactor's design safety margins via damage domains. The concept of damage domain is defined and its relevance is shown in the ongoing effort to apply dynamic risk assessment methods and tools based on the Theory of Stimulated Dynamics (TSD). To illustrate, we present results of an abnormal control rod (CR) withdrawal during subcritical conditions and compare them with results obtained by JAEA. No attempt is made yet to assess the detailed scenarios in full; rather, the aim is to show how the approach may handle events of this kind.
Abstract:
We have designed and implemented a framework that unifies unit testing and run-time verification (as well as static verification and static debugging). A key contribution of our approach is that a unified assertion language is used for all of these tasks. We first propose methods for compiling runtime checks for (parts of) assertions which cannot be verified at compile-time via program transformation. This transformation allows checking preconditions and postconditions, including conditional postconditions, properties at arbitrary program points, and certain computational properties. The implemented transformation includes several optimizations to reduce run-time overhead. We also propose a minimal addition to the assertion language which allows defining unit tests to be run in order to detect possible violations of the (partial) specifications expressed by the assertions. This language can express, for example, the input data for performing the unit tests or the number of times that the unit tests should be repeated. We have implemented the framework within the Ciao/CiaoPP system and effectively applied it to the verification of ISO-Prolog compliance and to the detection of different types of bugs in the Ciao system source code. Several experimental results are presented that illustrate different trade-offs among program size, running time, and level of verbosity of the messages shown to the user.
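The transformation of assertions into run-time checks can be illustrated, in a language-neutral way, with a wrapper that checks pre- and postconditions around each call; the sketch below is only a Python analogue of that idea and assumes nothing about the actual Ciao assertion syntax or the CiaoPP transformation.

```python
from functools import wraps

def check_assertions(pre=None, post=None):
    """Wrap a function with run-time checks for the parts of a
    specification that could not be verified statically."""
    def decorate(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            if pre is not None and not pre(*args, **kwargs):
                raise AssertionError(f"precondition of {f.__name__} violated")
            result = f(*args, **kwargs)
            # Conditional postcondition: receives the result and the inputs.
            if post is not None and not post(result, *args, **kwargs):
                raise AssertionError(f"postcondition of {f.__name__} violated")
            return result
        return wrapper
    return decorate

# Usage sketch: a (partial) specification of integer square root.
@check_assertions(pre=lambda n: isinstance(n, int) and n >= 0,
                  post=lambda r, n: r * r <= n < (r + 1) * (r + 1))
def isqrt(n):
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
    return r
```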
Abstract:
CiaoPP is the abstract interpretation-based preprocessor of the Ciao multi-paradigm (Constraint) Logic Programming system. It uses modular, incremental abstract interpretation as a fundamental tool to obtain information about programs. In CiaoPP, the semantic approximations thus produced have been applied to perform high- and low-level optimizations during program compilation, including transformations such as multiple abstract specialization, parallelization, partial evaluation, resource usage control, and program verification. More recently, novel and promising applications of such semantic approximations are being applied in the more general context of program development such as program verification. In this work, we describe our extension of the system to incorporate Abstraction-Carrying Code (ACC), a novel approach to mobile code safety. ACC follows the standard strategy of associating safety certificates to programs, originally proposed in Proof-Carrying Code. A distinguishing feature of ACC is that we use an abstraction (or abstract model) of the program computed by standard static analyzers as a certificate. The validity of the abstraction on the consumer side is checked in a single pass by a very efficient and specialized abstract interpreter. We have implemented and benchmarked ACC within CiaoPP. The experimental results show that the checking phase is indeed faster than the proof generation phase, and that the sizes of certificates are reasonable. Moreover, the preprocessor is based on compile-time (and run-time) tools for the certification of CLP programs with resource consumption assurances.
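The distinguishing step of ACC is that the consumer never re-runs the fixpoint computation: it only checks, in a single pass, that the shipped abstraction is a safe (pre-)fixpoint of the abstract semantics of the received program. The sketch below illustrates that check for a toy dataflow problem over a sign domain; the domain, the program representation and the transfer functions are invented for the illustration and are not CiaoPP's actual abstract domains.

```python
# Toy sign domain: BOT < {NEG, ZERO, POS} < TOP, ordered by set inclusion.
BOT, TOP = frozenset(), frozenset({"NEG", "ZERO", "POS"})

def leq(a, b):        # partial order: subset inclusion
    return a <= b

def join(a, b):
    return a | b

def check_certificate(cfg, transfer, certificate, entry_value):
    """Single-pass ACC-style check: the certificate is accepted if, for every
    node, transferring the join of its predecessors' certified values stays
    below the certified value. No fixpoint iteration is done by the consumer."""
    preds = {n: [] for n in cfg}
    for n, succs in cfg.items():
        for s in succs:
            preds[s].append(n)
    for node in cfg:
        incoming = entry_value if not preds[node] else BOT
        for p in preds[node]:
            incoming = join(incoming, certificate[p])
        if not leq(transfer[node](incoming), certificate[node]):
            return False   # certificate rejected: not a safe approximation
    return True

# Usage sketch: x := 1; loop { x := x + 1 }  -- certificate claims x is always POS.
cfg = {"init": ["loop"], "loop": ["loop"]}
transfer = {"init": lambda _: frozenset({"POS"}),                              # x := 1
            "loop": lambda v: v if leq(v, frozenset({"POS"})) else TOP}        # POS + 1 stays POS
certificate = {"init": frozenset({"POS"}), "loop": frozenset({"POS"})}
assert check_certificate(cfg, transfer, certificate, BOT)
```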
Abstract:
The technique of Abstract Interpretation [13] has allowed the development of sophisticated program analyses which are provably correct and practical. The semantic approximations produced by such analyses have been traditionally applied to optimization during program compilation. However, recently, novel and promising applications of semantic approximations have been proposed in the more general context of program verification and debugging [3],[10],[7].
Abstract:
In an increasing number of applications (e.g., in embedded, real-time, or mobile systems) it is important or even essential to ensure conformance with respect to a specification expressing resource usages, such as execution time, memory, energy, or user-defined resources. In previous work we have presented a novel framework for data size-aware, static resource usage verification. Specifications can include both lower and upper bound resource usage functions. In order to statically check such specifications, both upper- and lower-bound resource usage functions (on input data sizes) approximating the actual resource usage of the program are automatically inferred and compared against the specification. The outcome of the static checking of assertions can express intervals for the input data sizes such that a given specification can be proved for some intervals but disproved for others. After an overview of the approach, in this paper we provide a number of novel contributions: we present a full formalization, and we report on and provide results from an implementation within the Ciao/CiaoPP framework (which provides a general, unified platform for static and run-time verification, as well as unit testing). We also generalize the checking of assertions to allow preconditions expressing intervals within which the input data size of a program is supposed to lie (i.e., intervals for which each assertion is applicable), and we extend the class of resource usage functions that can be checked.
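The "proved for some intervals, disproved for others" outcome can be illustrated with a small numeric sketch that compares inferred upper- and lower-bound functions against a specified upper bound over a range of input data sizes. The particular bound functions below are invented for the example and are not taken from the paper.

```python
def classify_sizes(spec_ub, inferred_ub, inferred_lb, sizes):
    """For each input data size, decide the status of the assertion
    'resource usage <= spec_ub(n)' using the inferred bounds."""
    status = {}
    for n in sizes:
        if inferred_ub(n) <= spec_ub(n):
            status[n] = "checked"   # proved: even the worst case fits
        elif inferred_lb(n) > spec_ub(n):
            status[n] = "false"     # disproved: even the best case exceeds it
        else:
            status[n] = "unknown"   # bounds not tight enough to decide
    return status

# Invented example: spec allows 50*n steps, analysis infers n**2 <= cost <= n**2 + 10*n.
spec_ub     = lambda n: 50 * n
inferred_lb = lambda n: n ** 2
inferred_ub = lambda n: n ** 2 + 10 * n
result = classify_sizes(spec_ub, inferred_ub, inferred_lb, range(1, 101))
# 'checked' for n <= 40, 'unknown' for 40 < n <= 50, 'false' for n > 50.
```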
Abstract:
Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing programs as functions of their input data sizes, without actually having to execute the programs. While a powerful resource analysis framework for object-oriented programs existed before this thesis, advanced aspects that improve the efficiency, the accuracy and the reliability of the results of the analysis still need to be investigated further. This thesis tackles this need from the following four different perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses which keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first extension is to consider array accesses (in addition to object fields), while the second extension focuses on handling cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to (re-)analyze fragments of code that are not affected by the changes. During software development, programs are continuously modified, but most analyzers still read and analyze the entire program at once in a non-incremental way. This thesis presents an incremental resource usage analysis which, after a change in the program is made, is able to reconstruct the upper bounds of all affected methods in an incremental way. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on the development of a formal framework for the verification of the resource guarantees obtained by the analyzers, instead of verifying the tools themselves. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code. COSTA derives upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate. The main contribution of our work is to show that the proposed cooperation between these tools can be used to automatically produce verified resource guarantees. (4) Distribution and concurrency are today mainstream. Concurrent objects form a well-established model for distributed concurrent systems. In this model, objects are the concurrency units, and they communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, keeps the cost of the diverse distributed components separate.
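Of the four contributions, the incremental analysis of point (2) is the easiest to sketch in isolation: when a method changes, only the methods that transitively depend on it need their cost recomputed, while previous results are reused elsewhere. The code below is a generic worklist-style sketch over an acyclic call graph with additive costs; it illustrates the incremental idea only, not the multi-domain fixed-point algorithm or the cost summaries of the thesis.

```python
def reanalyze_incrementally(call_graph, local_cost, old_cost, changed):
    """Recompute total cost only for methods affected by `changed`.
    For simplicity the call graph is assumed acyclic here.

    call_graph: method -> list of callees
    local_cost: method -> cost of its own body (excluding calls)
    old_cost:   previous results, reused unchanged for unaffected methods
    """
    # 1. Find every method that transitively calls a changed method.
    callers = {m: set() for m in call_graph}
    for m, callees in call_graph.items():
        for c in callees:
            callers[c].add(m)
    affected, worklist = set(changed), list(changed)
    while worklist:
        for caller in callers[worklist.pop()]:
            if caller not in affected:
                affected.add(caller)
                worklist.append(caller)
    # 2. Recompute costs only for affected methods, reusing old results elsewhere.
    new_cost = dict(old_cost)
    def cost(m):
        if m not in affected:
            return new_cost[m]          # unaffected: old result is still valid
        new_cost[m] = local_cost[m] + sum(cost(c) for c in call_graph[m])
        return new_cost[m]
    for m in affected:
        cost(m)
    return new_cost

# Usage sketch: main -> helper -> util; only the body of `util` changed.
cg = {"main": ["helper"], "helper": ["util"], "util": []}
lc = {"main": 5, "helper": 3, "util": 7}            # util's local cost after the edit
old = {"main": 10, "helper": 5, "util": 2}
print(reanalyze_incrementally(cg, lc, old, {"util"}))   # recomputes util, helper, main
```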
Abstract:
Many higher education institutions around the world have for some time been revitalizing their teaching in order to equip their graduates with the new competencies demanded by twenty-first-century society. Fortunately, in Spain there is also a strong commitment to making higher education more flexible, which materializes in initiatives such as the international MotoStudent competition.
Abstract:
In this paper we present a global description of a telematic voting system based on advanced cryptography and on the use of smart cards (the VOTESCRIPT system), whose most outstanding characteristic is the ability to verify that the tally carried out by the system is correct, meaning that the results published by the system correspond to the votes cast. The VOTESCRIPT system provides an individual verification mechanism allowing each Voter to confirm whether his vote has been correctly counted. The innovation with respect to other solutions lies in the fact that the verification process is private, so that Voters have no way of proving how they voted to a non-authorized third party. Vote buying and selling, or any other kind of extortion, are thereby prevented. The existence of the Intervention Systems allows the whole electoral process to be controlled by groups of citizens or authorized candidatures. In addition, the system can easily audit not only the final results but also the whole process. Global verification provides the Scrutineers with robust cryptographic evidence which enables unequivocal proof of whether the system has operated fraudulently.
Abstract:
The paper presents research conducted in the Flow work package of the EU-funded UPWIND project, which focuses on improving models for flow within and downwind of large wind farms in complex terrain and offshore. The main activity is modelling the behaviour of wind turbine wakes in order to improve power output predictions.
Abstract:
In this research work, the presence of Fencing as an educational subject within different formal educational Institutions in Madrid, over a period of 225 years (1725-1950), has been analysed. In order to carry out this research, a work schedule was drawn up based on the methodology of historical research, historiography. The search for information was primarily carried out in the Archivo Histórico Nacional, the Archivo General de la Administración, the Archivo Regional de Madrid, the Archivo General de la Universidad Complutense de Madrid, the Archivo del Real Conservatorio Superior de Música de Madrid, the Archivo General de Palacio, the Archivo de la Villa and the Biblioteca Nacional. The research presented covers five educational Institutions in which the presence of Fencing as formal education is studied: the Real Seminario de Nobles de Madrid, the Escuela Central de Profesores y Profesoras de Gimnástica, the Real Conservatorio de Música y Declamación, and the Cardenal Cisneros and San Isidro Secondary Schools. The different founding Constitutions, Syllabuses, interim working Regulations, founding Laws, books, subject Programmes and all the documents, official or internal, of each of the Centres dealt with were analysed. Moreover, the Masters in charge of teaching Fencing as a formal subject in the different Institutions were studied, the most prominent being Mr. Manuel Antonio de Brea, Mr. Francisco de la Macorra y Guijeño, Mr. Ángel Lancho Martín de la Fuente, Mr. Afrodisio Aparicio Aparicio and Mr. José Carbonell. The research concludes that Fencing was a perfectly systematized subject in each of the Institutions analysed, and that the figure of the Fencing Master was of great importance in all of them, in some cases being "Maestro Mayor del Reyno" and in others becoming Professors of Fencing, while all of them held a qualification that enabled them to practise their profession. We also find the presence of women in one of the Institutions, the Real Conservatorio de Música y Declamación, with the same rights as their male classmates to enrol and study Fencing. The Masters studied, principally Mr. Ángel Lancho, Mr. Afrodisio Aparicio and Mr. José Carbonell, contributed to popularizing Fencing and to its development as a sport, organizing tournaments for their students and competing themselves in numerous bouts. Over the 225 years covered by this research, Fencing evolved from Skill to Ability and later to Sport, without at any moment losing its importance, spreading to parts of society that it would never have reached were it not for the work of the Fencing Masters in charge of teaching it.
Abstract:
This paper presents a new verification procedure for sound source coverage according to ISO 140-5 requirements. The ISO 140-5 standard applies to the measurement of façade insulation and requires a sound source able to produce a sufficiently uniform sound field, in free-field conditions, on the façade under study. The proposed method involves the electroacoustic characterisation of the sound source in laboratory free-field conditions (anechoic room) and the subsequent prediction, by computer simulation, of the free sound field radiated onto a rectangular surface equal in size to the façade being measured. The loudspeaker is characterised in an anechoic room under laboratory-controlled conditions, carefully measuring its directivity, and a computer model is then designed to calculate the acoustic free-field coverage for different loudspeaker positions and façade sizes. For each sound source position, the method provides the maximum direct acoustic level difference on a façade specimen and therefore determines whether the loudspeaker meets the maximum allowed level difference of 5 dB (or 10 dB for façade dimensions greater than 5 m) required by the ISO standard. Additionally, the maximum horizontal dimension of the façade meeting the standard is calculated and provided for each sound source position, both with the 5 dB and the 10 dB criteria. In the last section of the paper, the proposed procedure is compared with another method used by the authors in the past for the same purpose: in situ outdoor measurements attempting to recreate free-field conditions. From this comparison, it is concluded that the proposed method reproduces the actual measurements with high accuracy. For example, the ground reflection effect, which is difficult to avoid in the outdoor measurement method at least at low frequencies, is fully eliminated with the proposed method, thus achieving the free-field requisite.
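The coverage criterion itself reduces to a geometric computation: for a given loudspeaker position, the direct free-field level at each point of the façade falls off with distance (and direction), and the spread between the loudest and quietest points must stay within 5 dB (10 dB for façade dimensions greater than 5 m). The sketch below checks that criterion for an omnidirectional source using spherical spreading only; a real application would fold in the measured directivity, and the geometry values are arbitrary examples, not data from the paper.

```python
import math

def direct_level_difference(src, width, height, grid=0.25):
    """Maximum difference of direct free-field SPL over a façade of the
    given size (metres), for a point source at `src` = (x, y, z).
    The façade lies in the plane y = 0, with x in [0, width], z in [0, height].
    Only 1/r**2 spreading is modelled (omnidirectional source assumed)."""
    levels = []
    x = 0.0
    while x <= width:
        z = 0.0
        while z <= height:
            r = math.dist(src, (x, 0.0, z))
            levels.append(-20 * math.log10(r))   # SPL relative to 1 m, free field
            z += grid
        x += grid
    return max(levels) - min(levels)

# Example geometry (arbitrary): 4 m x 3 m façade, loudspeaker 7 m out and 5 m below.
width, height = 4.0, 3.0
delta = direct_level_difference(src=(2.0, 7.0, -5.0), width=width, height=height)
limit = 5.0 if max(width, height) <= 5.0 else 10.0   # 10 dB allowed above 5 m per ISO 140-5
print(f"max direct level difference {delta:.1f} dB ->",
      "meets criterion" if delta <= limit else "fails criterion")
```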