922 results for Static verification


Relevance: 20.00%

Abstract:

Dissertation submitted to obtain the degree of Doctor in Biomedical Engineering

Relevance: 20.00%

Abstract:

Work presented in the context of the European Master's programme in Computational Logic, as a partial requirement for obtaining the Master of Science degree in Computational Logic

Relevance: 20.00%

Abstract:

Dissertation submitted to obtain the degree of Master in Electrical and Computer Engineering

Relevance: 20.00%

Abstract:

Conventionally, the problem of finding the best path in a network is treated as the shortest path problem. However, for the vast majority of today's networks this solution has limitations that directly affect their proper functioning and lead to an inefficient use of their capabilities. Problems arising in large networks, where graphs of high complexity are common, as well as the emergence of new services with their respective requirements, are intrinsically related to the inadequacy of this solution. To address the needs of these networks, a new approach to the best path problem must be explored. One solution that has attracted growing interest in the scientific community is the use of multiple paths between two network nodes, all of which may be regarded as best paths between those nodes. Routing is then no longer performed by minimizing a single metric, with only one path chosen between nodes, but by selecting one of many paths, thereby exploiting a greater diversity of the available paths (provided, of course, that the network allows it). Establishing multi-path routing in a given network has several operational advantages: it can improve the distribution of network traffic, shorten recovery time after failures, and give the administrator greater control over the network. These factors are even more relevant when networks are large and highly complex, such as the Internet, where multiple networks managed by different entities are interconnected. A large part of the growing need for multi-path protocols is associated with policy-based routing, in which paths with different characteristics can be assigned the same level of preference and thus all be part of the solution to the best path problem. Performing multi-path routing with protocols based only on the destination address has some limitations, but it is possible. Concepts from graph theory and algebraic structures can be used to describe how routes are computed and classified, making it possible to model the routing problem. This thesis studies and analyzes multi-path routing protocols from the literature and derives a new algebraic condition that allows these protocols to operate correctly without any restriction on the network. It also develops a set of software tools for planning and for the verification/validation of new protocol models according to the study carried out.
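To make the idea concrete, the following is a minimal sketch (in Python, with a hypothetical four-node topology and two additive link metrics chosen only for illustration) of how a partial order on path weights yields several incomparable "best" paths instead of a single shortest one; it illustrates the general principle, not the algebraic framework developed in the thesis.

# Multi-path selection under a partial order of path weights: instead of
# one shortest path, keep every Pareto-optimal (non-dominated) path, so
# several paths can tie as "best".

GRAPH = {  # hypothetical topology: node -> {neighbour: (delay, cost)}
    "a": {"b": (1, 4), "c": (2, 1)},
    "b": {"d": (1, 1)},
    "c": {"d": (1, 2)},
    "d": {},
}

def dominates(w1, w2):
    """w1 dominates w2 if it is no worse in every metric and better in one."""
    return all(x <= y for x, y in zip(w1, w2)) and w1 != w2

def pareto_paths(graph, src, dst):
    """Enumerate simple paths and keep only the non-dominated ones."""
    results = []  # list of (weight, path)

    def explore(node, path, weight):
        if node == dst:
            results.append((weight, path))
            return
        for nxt, (delay, cost) in graph[node].items():
            if nxt not in path:  # keep paths loop-free
                explore(nxt, path + [nxt], (weight[0] + delay, weight[1] + cost))

    explore(src, [src], (0, 0))
    return [(w, p) for w, p in results
            if not any(dominates(w2, w) for w2, _ in results)]

if __name__ == "__main__":
    for weight, path in pareto_paths(GRAPH, "a", "d"):
        print(weight, "->".join(path))
    # Both a->b->d (delay 2, cost 5) and a->c->d (delay 3, cost 3) survive:
    # neither dominates the other, so both count as best paths.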

Relevance: 20.00%

Abstract:

In a reconfigurable system, the response to contextual or internal change may trigger reconfiguration events which, in turn, activate scripts that change the system's architecture at runtime. To be safe, however, such reconfigurations are expected to obey the fundamental principles originally specified by the system's architect. This paper introduces an approach to ensure that such principles are observed along reconfigurations, by verifying them against concrete specifications in a suitable logic. Architectures, reconfiguration scripts, and principles are specified in Archery, an architectural description language with formal semantics. Principles are encoded as constraints, which become formulas of a two-layer graded hybrid logic, where the upper layer restricts reconfigurations and the lower layer constrains the resulting configurations. Constraints are verified by translating them into logic formulas, which are interpreted over models derived from Archery specifications of architectures and reconfigurations. Suitable notions of bisimulation and refinement, to which the architect may resort to compare configurations, are given, and their relationship with modal validity is discussed.
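As a rough illustration of the two-layer idea (and only that: the names, rules, and data structures below are hypothetical Python stand-ins, not Archery or its graded hybrid logic), one check can restrict which reconfiguration steps may fire while another constrains every configuration they produce.

from copy import deepcopy

def config_ok(cfg):
    """Lower layer: every client must stay attached to some server."""
    servers = {c for c, kind in cfg["components"].items() if kind == "server"}
    return all(any(dst in servers for src, dst in cfg["links"] if src == client)
               for client, kind in cfg["components"].items() if kind == "client")

def step_ok(before, after):
    """Upper layer: a reconfiguration may never remove all existing servers at once."""
    kept = set(before["components"]) & set(after["components"])
    return any(before["components"][c] == "server" for c in kept)

def apply_script(cfg, script):
    """Apply a reconfiguration script, verifying both layers step by step."""
    current = cfg
    for step in script:
        nxt = step(deepcopy(current))
        if not step_ok(current, nxt) or not config_ok(nxt):
            raise ValueError(f"reconfiguration violates constraints: {step.__name__}")
        current = nxt
    return current

def add_backup_server(cfg):
    cfg["components"]["s2"] = "server"
    return cfg

initial = {"components": {"s1": "server", "c1": "client"}, "links": {("c1", "s1")}}
final = apply_script(initial, [add_backup_server])
print(sorted(final["components"]))   # ['c1', 's1', 's2']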

Relevance: 20.00%

Abstract:

Problem identification and characterization. One of the most important problems associated with building software is its correctness. In the quest to guarantee that software works correctly, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. Because of their nature, applying formal methods requires considerable experience and knowledge, especially of mathematics and logic, which makes their application costly in practice. As a consequence, their main application has been limited to critical systems, that is, systems whose malfunction can cause major damage, even though the benefits these techniques provide are relevant to all kinds of software. Bringing the benefits of formal methods to software development contexts broader than critical systems would have a high impact on productivity in those contexts.

Hypothesis. Having automated analysis tools is an element of great importance. Examples are several powerful analysis tools based on formal methods that are applied directly to source code. In the vast majority of these tools, the gap between the notions developers are used to and those required to apply formal analysis remains too wide. Many tools use assertion languages that fall outside developers' usual knowledge and habits. Moreover, in many cases the output produced by the analysis tool requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artifacts to be analyzed grow (scalability). This limitation is widely known and is considered critical to the practical applicability of formal analysis methods. One way to attack this problem is to exploit information and characteristics of specific application domains.

Objectives. This project aims at building formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models, or code in the context of software development. More precisely, it seeks to identify specific settings in which certain automated analysis techniques, such as analysis based on SMT or SAT solving, or model checking, can reach levels of scalability beyond those known for these techniques in general settings. We will try to implement the adaptations of the chosen techniques in tools that can be used by developers familiar with the application context but not necessarily knowledgeable about the underlying methods or techniques.

Materials and methods. The materials to be used will be literature relevant to the area and computing equipment. Methods: those of discrete mathematics, logic, and software engineering will be employed.

Expected results. One of the expected results of the project is the identification of specific application domains for formal analysis methods. The project is also expected to produce analysis tools whose level of usability is adequate for developers without specific training in the formal methods involved.

Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the aim of increasing software quality and reliability.

A crucial factor for software quality is correctness. Traditionally, formal approaches to software development concentrate on functional correctness and tackle this problem by relying on well-defined notations founded on solid mathematical grounds. This makes formal methods better suited for analysis, due to their precise semantics, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. Their acceptance by software engineers is therefore rather restricted, and formal methods applications have been confined to critical systems, even though it is obvious that the advantages formal methods provide apply to any kind of software system. It is accepted that appropriate software tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated into widely used development environments. Still, most of these tools concentrate on code analysis and, in many cases, are still far from being simple enough to be employed by software engineers without experience in formal methods. Another important problem for the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and its techniques therefore do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification, or coding activities in software development processes where automated formal analysis techniques can be applied. By focusing on very specific application domains, we expect to find characteristics that might be exploited to increase the scalability of the corresponding analyses, compared to the general case.
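As an example of the kind of push-button analysis the project has in mind, the sketch below (assuming the z3-solver Python bindings and a deliberately tiny, made-up property) asks an SMT solver whether a clamp routine can ever return a value outside its declared range; an unsat answer means the property holds for all inputs.

from z3 import Int, If, Solver, Or, unsat

x, lo, hi = Int("x"), Int("lo"), Int("hi")
clamped = If(x < lo, lo, If(x > hi, hi, x))   # symbolic clamp(x, lo, hi)

s = Solver()
s.add(lo <= hi)                               # precondition
s.add(Or(clamped < lo, clamped > hi))         # negation of the postcondition

if s.check() == unsat:
    print("verified: clamp(x, lo, hi) always lands in [lo, hi]")
else:
    print("counterexample:", s.model())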

Relevance: 20.00%

Abstract:

This project falls within the use of formal methods (more precisely, the use of type theory) to guarantee the absence of errors in programs. On the one hand, it proposes the design of new type-checking algorithms. To this end, new algorithms based on the idea of normalization by evaluation are proposed, intended to be extensible to other type systems. In the near future we will extend results we have recently obtained [16,17] in order to: simplify the existing work for systems without the eta rule (two systems will be studied here: à la Martin-Löf and PTSs), formulate these checkers for systems with variables, generalize the notion of category with families used to give semantics to type theory, obtain a categorical formulation of the notion of normalization by evaluation, and, finally, apply these algorithms to systems with rewriting. For the first expected results mentioned, our method will be to adapt the proofs of [16,17] to the new systems. The importance lies in making proof assistants based on type theory more automatable and, hence, easier to use. On the other hand, type theory will be used to certify compilers, following the never-explored proposal of [22] to use an abstract approach based on functor categories. The method will consist of certifying the language "Peal" [29] and then successively adding functionality until Forsythe [23] is obtained. In this period we expect to be able to add several extensions. The importance of this line of work is that only a certified compiler guarantees that a correct source program is compiled into a correct object program; it is therefore crucial for any verification process based on verifying source code. Finally, we will address the formalization of systems with session types. These have been shown to contain flaws in their formulations [30], so their formalization seems worthwhile. During the course of the project we expect to obtain a formalization that yields a type-checking algorithm and to prove the usual properties of such systems. The contribution is to shed some light on formulations whose errors reveal that the topic has not yet reached sufficient maturity or understanding within the community.

This project is about using type theory to guarantee program correctness. It follows three different directions: 1) Finding new type-checking algorithms based on normalization by evaluation. First, we will show that recent results like [16,17] extend to other type systems: Martin-Löf's type theory without the eta rule, PTSs, type systems with variables (in addition to the systems in [16,17], which are à la de Bruijn), and systems with rewrite rules. This will be done by adjusting the proofs in [16,17] so that they apply to such systems as well. We will also try to obtain a more general definition of categories with families and of normalization by evaluation, formulated in categorical terms. We expect this may make proof assistants more automatic and useful. 2) Exploring the proposal in [22] of building compilers for Algol-like languages using functor categories. According to [22], such an approach is suitable for verifying compiler correctness, a claim which was never explored. First, the language Peal [29] will be certified in type theory, and we will gradually add functionality to it until a correct compiler for the language Forsythe [23] is obtained. 3) Formalizing systems of session types. Several proposals have been shown to be faulty [30], which means that a formalization may contribute to the general understanding of session types.
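For readers unfamiliar with normalization by evaluation, the toy Python normalizer below for the untyped lambda calculus (de Bruijn indices; purely illustrative and unrelated to the project's actual developments) shows the evaluate-then-reify structure on which such type-checking algorithms are built.

# Terms: ("var", i), ("lam", body), ("app", f, a), with de Bruijn indices.

def evaluate(term, env):
    """Evaluate a term to a semantic value (a Python function or a neutral term)."""
    tag = term[0]
    if tag == "var":
        return env[term[1]]
    if tag == "lam":
        return lambda arg: evaluate(term[1], [arg] + env)
    if tag == "app":
        fun, arg = evaluate(term[1], env), evaluate(term[2], env)
        return fun(arg) if callable(fun) else ("napp", fun, arg)
    raise ValueError(tag)

def reify(value, depth):
    """Read a semantic value back into a normal-form term."""
    if callable(value):
        fresh = ("nvar", depth)                  # fresh neutral variable at this level
        return ("lam", reify(value(fresh), depth + 1))
    if value[0] == "nvar":
        return ("var", depth - value[1] - 1)     # convert level back to de Bruijn index
    if value[0] == "napp":
        return ("app", reify(value[1], depth), reify(value[2], depth))
    raise ValueError(value)

def normalize(term):
    return reify(evaluate(term, []), 0)

# (\x. x) applied to (\x. \y. x) normalizes to \x. \y. x
ident = ("lam", ("var", 0))
const = ("lam", ("lam", ("var", 1)))
print(normalize(("app", ident, const)))   # ("lam", ("lam", ("var", 1)))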

Relevance: 20.00%

Abstract:

Magdeburg, University, Faculty of Computer Science, Dissertation, 2011

Relevance: 20.00%

Abstract:

Magdeburg, University, Faculty of Computer Science, Dissertation, 2015

Relevance: 20.00%

Abstract:

Magdeburg, University, Faculty of Electrical Engineering and Information Technology, Dissertation, 2015

Relevance: 20.00%

Abstract:

Otto-von-Guericke-Universität Magdeburg, Faculty of Electrical Engineering and Information Technology, Dissertation, 2015

Relevance: 20.00%

Abstract:

BACKGROUND: Adaptations of the internal (IR) and external (ER) rotator shoulder muscles that improve overhead throwing kinematics could lead to muscular strength imbalances, which may be considered an intrinsic risk factor for shoulder injury, as well as to a modified shoulder range of motion (RoM). OBJECTIVE: To establish profiles of internal and external rotation RoM and of isokinetic IR and ER strength in adolescent- and national-level javelin throwers. METHODS: Fourteen healthy subjects were included in this preliminary cross-sectional study, 7 javelin throwers (JTG) and 7 nonathletes (CG). Passive internal and external rotation RoM were measured at 90 degrees of shoulder abduction. Isokinetic strength of the dominant and non-dominant IR and ER was evaluated during concentric (60, 120 and 240 degrees/s) and eccentric (60 degrees/s) contractions with a Con-Trex® dynamometer, with the subject seated and the shoulder abducted 45 degrees in the scapular plane. RESULTS: We found significantly lower internal rotation and significantly higher external rotation RoM in JTG than in CG. Concentric and eccentric IR and ER strength were significantly higher for the dominant shoulder in JTG (P < 0.05), without significant differences in ER/IR ratios. CONCLUSIONS: The main finding of this preliminary study confirmed static and dynamic shoulder stabilizer adaptations due to javelin throwing in a population of adolescent- and national-level javelin throwers.

Relevance: 20.00%

Abstract:

This study investigates the in vitro growth of human urinary tract smooth muscle cells under static conditions and under mechanical stimulation. The cells were cultured on collagen type I- and laminin-coated silicone membranes. Using a Flexcell device for mechanical stimulation, a cyclic strain of 0-20% was applied in a strain-stress-time model (stretch, 104 min relaxation, 15 s), imitating physiological bladder filling and voiding. Cell proliferation and the expression of the phenotype markers alpha-actin, calponin, and caldesmon were analyzed. Nonstretched cells showed significantly better growth on laminin during the first 8 days, thereafter becoming comparable to cells grown on collagen type I. Cyclic strain significantly reduced cell growth on both surfaces; however, better growth was observed on laminin. Neither the type of surface nor mechanical stimulation influenced the expression pattern of the phenotype markers; alpha-actin was predominantly expressed. Coating with the extracellular matrix protein laminin thus improved the in vitro growth of human urinary tract smooth muscle cells.

Relevance: 20.00%

Abstract:

In this paper we consider a representative a priori unstable Hamiltonian system with 2+1/2 degrees of freedom, to which we apply the geometric mechanism for diffusion introduced in Delshams et al., Mem. Amer. Math. Soc., 2006, and generalized in Delshams and Huguet, Nonlinearity, 2009, and we provide explicit, concrete and easily verifiable conditions for the existence of diffusing orbits. The simplification of the hypotheses allows us to perform the computations along the proof explicitly, which helps to present the geometric mechanism of diffusion in an easily understandable way. In particular, we fully describe the construction of the scattering map and the combination of two types of dynamics on a normally hyperbolic invariant manifold.
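For orientation, a priori unstable systems of this kind are typically written as a pendulum coupled to a rotor through a small time-periodic perturbation (the concrete system treated in the paper may differ in its details):

H_\varepsilon(p, q, I, \varphi, t) = \pm\left(\frac{p^2}{2} + V(q)\right) + \frac{I^2}{2} + \varepsilon\, h(p, q, I, \varphi, t),

where the pendulum part \frac{p^2}{2} + V(q) carries the normally hyperbolic invariant manifold and its associated scattering map, and diffusion refers to orbits along which the action I drifts by an amount of order one for arbitrarily small \varepsilon > 0.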