933 results for correctness verification
Abstract:
Dissertation submitted to obtain the degree of Master in Informatics Engineering
Abstract:
Dissertation submitted to obtain the degree of Master in Electrical and Computer Engineering
Abstract:
Dissertation submitted to obtain the degree of Doctor in Informatics Engineering
Abstract:
Conventionally, the problem of finding the best path in a network is formulated as the shortest path problem. For the vast majority of present-day networks, however, this formulation has limitations that directly affect their proper functioning and lead to inefficient use of their capabilities. The shortcomings of this solution are intrinsically related to problems in large networks, where graphs of high complexity are common, and to the emergence of new services and their respective requirements. To meet the needs of these networks, a new approach to the best path problem must be explored. One solution that has attracted considerable interest in the scientific community considers the use of multiple paths between two network nodes, all of which may now be regarded as best paths between those nodes. Routing therefore ceases to be performed by minimizing a single metric, where only one path between nodes is chosen, and is instead performed by selecting one of many paths, allowing a greater diversity of the available paths to be used (provided, of course, that the network permits it). Establishing multipath routing in a given network has several operational advantages. Its use can improve the distribution of network traffic, shorten recovery time after failures, and give the administrator greater control over the network. These factors are even more relevant when networks are large and highly complex, such as the Internet, where multiple networks managed by different entities are interconnected. A large part of the growing need for multipath protocols is associated with policy-based routing, whereby paths with different characteristics can be assigned the same level of preference and thus become part of the solution to the best path problem. Performing multipath routing with protocols based only on the destination address has some limitations, but it is possible. Concepts from graph theory and algebraic structures can be used to describe how routes are computed and ranked, making it possible to model the routing problem. This thesis studies and analyzes multipath routing protocols from the literature and derives a new algebraic condition that guarantees the correct operation of these protocols without any restriction on the network. It also develops a set of software tools for planning and for the verification/validation of new protocol models according to the study performed.
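To make the algebraic view concrete, the sketch below is a toy illustration in Python, not the thesis's actual condition or tooling: it models a routing algebra by its extension operator and preference order and tests strict monotonicity, a classic algebraic property (from Sobrinho's line of work) under which destination-based path-vector routing converges correctly. All names in it are illustrative.

```python
# Minimal sketch of checking an algebraic routing property over finite test sets.

def strictly_monotonic(weights, links, extend, better):
    """True iff extending any weight w by any link l yields a strictly worse weight."""
    return all(better(w, extend(l, w)) for l in links for w in weights)

# Shortest-path algebra: weights are lengths, extension adds, smaller is better.
shortest = strictly_monotonic(
    weights=range(1, 20), links=range(1, 5),
    extend=lambda l, w: l + w,
    better=lambda a, b: a < b)

# Widest-path algebra: weights are bandwidths, extension takes the bottleneck,
# larger is better.  It is monotone but not *strictly* monotone.
widest = strictly_monotonic(
    weights=[10, 100, 1000], links=[10, 100, 1000],
    extend=lambda l, w: min(l, w),
    better=lambda a, b: a > b)

print(shortest, widest)  # True False
```

The contrast between the two algebras is the point of the sketch: shortest-path satisfies the strict condition, while widest-path does not, which is why weaker algebraic conditions of the kind the thesis derives are of interest.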
Abstract:
Nowadays, the data available to and used by companies is growing very fast, creating the need to use and manage it in the most efficient way. To this end, data is replicated over multiple datacenters using different replication protocols, chosen according to needs such as higher availability or a stronger consistency level. The costs associated with full data replication can be very high, and most of the time full replication is not needed, since information can be logically partitioned. Another problem is that, by using datacenters to store and process information, clients become heavily dependent on them. We propose a partial replication protocol called ParTree, which replicates data to clients and organizes the clients in a hierarchy, using communication between them to propagate information. This solution addresses some of these problems, namely by supporting partial data replication and an offline execution mode. Given the complexity of the protocol, the use of formal verification is crucial to ensure two correctness properties of the protocol: causal consistency and preservation of data. The use of the TLA+ language and tools to formally specify and verify the proposed protocol is also described.
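As a rough illustration of the two ideas combined above, a minimal Python sketch follows; the class and field names are assumptions for illustration, not the actual ParTree protocol. Clients sit in a tree, each replicating only the partitions it subscribes to, and an update is pushed only into subtrees that replicate the affected partition.

```python
# Illustrative sketch: hierarchical, partition-aware update propagation.

class Client:
    def __init__(self, name, partitions):
        self.name = name
        self.partitions = set(partitions)   # partitions this client replicates
        self.children = []
        self.store = {}                     # partition -> latest value

    def interest_set(self):
        """Partitions replicated anywhere in this subtree."""
        s = set(self.partitions)
        for child in self.children:
            s |= child.interest_set()
        return s

    def propagate(self, partition, value):
        """Apply an update locally if relevant, then push it down the hierarchy."""
        if partition in self.partitions:
            self.store[partition] = value
        for child in self.children:
            if partition in child.interest_set():   # skip uninterested subtrees
                child.propagate(partition, value)

# Usage: the root replicates everything; each leaf holds only its partition.
root = Client("root", {"A", "B"})
left, right = Client("left", {"A"}), Client("right", {"B"})
root.children = [left, right]
root.propagate("A", 42)
print(left.store, right.store)   # {'A': 42} {}
```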
Abstract:
In a reconfigurable system, the response to contextual or internal change may trigger reconfiguration events which, in turn, activate scripts that change the system's architecture at runtime. To be safe, however, such reconfigurations are expected to obey the fundamental principles originally specified by the system's architect. This paper introduces an approach for ensuring that such principles are observed along reconfigurations, by verifying them against concrete specifications in a suitable logic. Architectures, reconfiguration scripts, and principles are specified in Archery, an architectural description language with formal semantics. Principles are encoded as constraints, which become formulas of a two-layer graded hybrid logic, where the upper layer restricts reconfigurations and the lower layer constrains the resulting configurations. Constraints are verified by translating them into logic formulas, which are interpreted over models derived from the Archery specifications of architectures and reconfigurations. Suitable notions of bisimulation and refinement, to which the architect may resort in order to compare configurations, are given, and their relationship with modal validity is discussed.
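As a rough illustration of the last point, the sketch below assumes configurations modelled as labelled transition systems and checks whether two of them are bisimilar with a naive greatest-fixpoint computation; this is not Archery's actual semantics or tooling, and all names are illustrative.

```python
# Naive bisimulation check over labelled transition systems.
# t1, t2 map each state to a set of (label, successor) transitions.

def bisimilar(t1, t2, s1, s2):
    """Greatest-fixpoint bisimulation: start from the full relation and
    repeatedly remove pairs that violate the forth/back conditions."""
    rel = {(p, q) for p in t1 for q in t2}
    changed = True
    while changed:
        changed = False
        for (p, q) in list(rel):
            forth = all(any(a == b and (p2, q2) in rel for (b, q2) in t2[q])
                        for (a, p2) in t1[p])
            back = all(any(a == b and (p2, q2) in rel for (a, p2) in t1[p])
                       for (b, q2) in t2[q])
            if not (forth and back):
                rel.discard((p, q))
                changed = True
    return (s1, s2) in rel

# Usage: two configurations that answer the same "reconfig" step are bisimilar.
config_a = {"c0": {("reconfig", "c1")}, "c1": set()}
config_b = {"d0": {("reconfig", "d1")}, "d1": set()}
print(bisimilar(config_a, config_b, "c0", "d0"))   # True
```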
Abstract:
Concurrent programming remains a difficult task even for very experienced programmers. Concurrency research has provided a rich set of tools and mechanisms for dealing with data races and deadlocks, problems that arise from the incorrect use of synchronization. Verifying the most interesting properties of concurrent programs is considerably harder than for sequential programs, owing to the intrinsically non-deterministic nature of concurrent execution, which results in an explosion in the number of possible program states, making manual treatment almost impossible and posing a serious challenge even with the help of computers. Some approaches attempt to create programming languages with higher-level abstractions for expressing concurrency and synchronization. Other approaches try to develop reasoning techniques and methods to prove properties, using general theorem provers, model checking, or specific algorithms over a given type system. Lightweight static analysis approaches apply techniques such as abstract interpretation to detect certain kinds of bugs in a conservative way. These techniques generally scale well enough to be applied to large software projects, but the kinds of bugs they can detect are limited.
Some interesting properties are related to data races and deadlocks, while others concern security problems such as confidentiality and integrity of data. The main goals of this proposal are to identify some interesting properties to verify in concurrent systems and to develop techniques and tools for fully automatic verification. The main approach will be the application of type systems, such as dependent types, type-and-effect systems, and effect types sensitive to data and control flow. These type systems will be applied to models of concurrent programming such as Simple Concurrent Object-Oriented Programming (SCOOP) and Java. Other goals include the analysis of security properties, also using specific type systems.
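As a self-contained illustration of the kind of bug at stake (not taken from the proposal), the Python sketch below shows a shared counter updated with and without the synchronization discipline that a type-and-effect system would enforce.

```python
# A data race on a shared counter, and the lock that removes it.

import threading

counter = 0
lock = threading.Lock()

def unsafe_inc(n):
    global counter
    for _ in range(n):
        counter += 1          # read-modify-write: racy without synchronization

def safe_inc(n):
    global counter
    for _ in range(n):
        with lock:            # critical section: one thread at a time
            counter += 1

threads = [threading.Thread(target=safe_inc, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # always 400000 with the lock; with unsafe_inc it may be less
```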
Abstract:
Magdeburg, Univ., Faculty of Computer Science, Diss., 2015
Abstract:
Magdeburg, Univ., Faculty of Electrical Engineering and Information Technology, Diss., 2015
Abstract:
In this paper we consider a representative a priori unstable Hamiltonian system with 2 + 1/2 degrees of freedom, to which we apply the geometric mechanism for diffusion introduced in Delshams et al., Mem. Amer. Math. Soc., 2006, and generalized in Delshams and Huguet, Nonlinearity, 2009, and we provide explicit, concrete, and easily verifiable conditions for the existence of diffusing orbits. The simplification of the hypotheses allows us to carry out the computations along the proof explicitly, which helps present the geometric mechanism of diffusion in an easily understandable way. In particular, we fully describe the construction of the scattering map and the combination of two types of dynamics on a normally hyperbolic invariant manifold.
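For orientation, a priori unstable systems of this kind are commonly written in the generic form below, a pendulum coupled to a rotor under a small time-periodic perturbation; this generic form is stated here only for context and is an assumption, since the paper's concrete system is specified in the paper itself.

```latex
% Generic a priori unstable Hamiltonian with 2 + 1/2 degrees of freedom:
% pendulum variables (p, q), rotor variables (I, \varphi), perturbation size \varepsilon.
H_\varepsilon(p, q, I, \varphi, t)
  = \underbrace{\tfrac{p^2}{2} + \cos q - 1}_{\text{pendulum}}
  + \underbrace{\tfrac{I^2}{2}}_{\text{rotor}}
  + \varepsilon\, h(p, q, I, \varphi, t),
\qquad h \ \text{$2\pi$-periodic in $\varphi$ and $t$}.
```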
Abstract:
This study was conducted to assess whether fingerprint specialists could be influenced by extraneous contextual information during a verification process. Participants were separated into three groups: a control group (no contextual information was given), a low-bias group (minimal contextual information was given, in the form of a report prompting conclusions), and a high-bias group (an internationally recognized fingerprint expert provided conclusions and case information to deceive this group into believing that the case and conclusions were his). A similar experiment was later conducted with laypersons. The results showed that fingerprint experts were influenced by contextual information during fingerprint comparisons, but not towards making errors. Instead, fingerprint experts under the biasing conditions provided significantly fewer definitive and erroneous conclusions than the control group. In contrast, the novice participants were more influenced by the bias conditions and did tend to make incorrect judgments, especially when the biasing information pointed them towards an incorrect response.
Abstract:
Several superstructure design methodologies have been developed for low-volume road bridges by the Iowa State University Bridge Engineering Center. However, to date no standard abutment designs have been developed. Thus, there was a need to establish an easy-to-use design methodology, in addition to generating generic abutment standards and other design aids for the more common substructure systems used in Iowa. The final report for this project consists of three volumes. The first volume summarizes the research completed in this project. A survey of the Iowa county engineers was conducted, from which it was determined that while most counties use similar types of abutments, only 17 percent use some type of standard abutment designs or plans. A literature review revealed several possible alternative abutment systems for future use on low-volume road bridges, in addition to two separate substructure lateral load analysis methods: a linear and a non-linear method. The linear analysis method was used for this project due to its relative simplicity and the relative accuracy of the maximum pile moment when compared with values obtained from the more complex non-linear analysis method. The resulting design methodology was developed for single-span stub abutments supported on steel or timber piles, with bridge span lengths ranging from 20 to 90 ft and roadway widths of 24 and 30 ft. However, other roadway widths can be designed using the foundation design template provided. The backwall height is limited to a range of 6 to 12 ft, and the soil type is classified as cohesive or cohesionless. The design methodology was developed using the guidelines specified by the American Association of State Highway and Transportation Officials Standard Specifications, the Iowa Department of Transportation Bridge Design Manual, and the National Design Specification for Wood Construction. The second volume introduces and outlines the use of the various design aids developed for this project. Charts for determining dead and live gravity loads based on the roadway width, span length, and superstructure type are provided. A foundation design template was developed with which the engineer can check a substructure design by inputting basic bridge site information. Tables published by the Iowa Department of Transportation that provide values for estimating pile friction and end bearing for different combinations of soils and pile types are also included. Generic standard abutment plans were developed on which the engineer can provide the necessary bridge site information in the spaces provided. These tools enable engineers to design and detail county bridge substructures more efficiently. The third volume (this volume) provides two sets of calculations that demonstrate the application of the substructure design methodology developed in this project. These calculations also verify the accuracy of the foundation design template. The printouts from the foundation design template are provided at the end of each example. Several tables also provide various foundation details for a precast double-tee superstructure with different combinations of soil type, backwall height, and pile type.
Abstract:
Report on a review of selected general and application controls over the Iowa Department of Human Services’ Issuance Verification System for the period March 19, 2009 through April 17, 2009
Abstract:
The plant circadian clock controls a wide variety of physiological and developmental events, including the short-days (SDs)-specific promotion of hypocotyl elongation during de-etiolation and of petiole elongation during vegetative growth. In A. thaliana, the PIF4 gene, encoding a phytochrome-interacting basic helix-loop-helix (bHLH) transcription factor, plays crucial roles in this photoperiodic control of plant growth. According to the proposed external coincidence model, the PIF4 gene is transcribed precociously at the end of the night specifically in SDs, under which conditions the protein product accumulates stably, whereas in long days (LDs) PIF4 is expressed exclusively during the daytime, under which conditions the protein product is degraded by light-activated phyB and the residual proteins are inactivated by the DELLA family of proteins. A number of previous reports provided solid evidence to support this coincidence model, mainly at the transcriptional level of PIF4 and PIF4-target genes. Nevertheless, the diurnal oscillation profiles of PIF4 proteins, which were postulated to depend on photoperiod and ambient temperature, had not yet been demonstrated. Here we present such crucial evidence at the PIF4 protein level to further support the external coincidence model underlying the temperature-adaptive photoperiodic control of plant growth in A. thaliana.
Abstract:
We have designed and built an experimental device, which we call a "thermoelectric bridge." Its primary purpose is the simultaneous measurement of the relative Peltier and Seebeck coefficients. With this device the systematic errors for both coefficients are equal, and no manipulation is necessary between the measurement of one coefficient and the other. The device is therefore especially suitable for verifying the linear relation between them postulated by Lord Kelvin. Simultaneous measurement of thermal conductivity is also described in the text. A sample was made up of a nickel-platinum couple, with measurements taken in the range of -20 to 60 °C, establishing the dependence of each coefficient on temperature, with nearly equal random errors of ±0.2% and systematic errors estimated at about 0.5%. The aforementioned Kelvin relation is verified in this range from these results, showing that the behavioral deviations, of about 0.3%, are contained within the uncertainty of ±0.5% caused by the propagation of errors.
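For reference, the Kelvin relation under test links the relative Peltier coefficient and the relative Seebeck coefficient of the couple at absolute temperature T:

```latex
% Kelvin relation between the relative Peltier coefficient \Pi_{AB} and the
% relative Seebeck coefficient S_{AB} of a thermocouple at absolute temperature T:
\Pi_{AB}(T) = S_{AB}(T)\, T
```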