933 results for "correctness verification"


Relevance:

20.00%

Publisher:

Abstract:

The strategic plan for bridge engineering issued by AASHTO in 2005 identified extending the service life and optimizing the structural systems of bridges in the United States as two grand challenges in bridge engineering, with the objective of producing safer bridges that have a minimum service life of 75 years and reduced maintenance costs. Material deterioration was identified as one of the primary challenges to achieving the objective of extended life. In substructural applications (e.g., deep foundations), construction materials such as timber, steel, and concrete are subject to deterioration from environmental impacts. Using innovative new materials for foundation applications makes the AASHTO objective of a 75-year service life achievable. Ultra-High Performance Concrete (UHPC), with a compressive strength of 180 MPa (26,000 psi) and excellent durability, has been used in superstructure applications but not in geotechnical and foundation applications. This study explores the use of precast, prestressed UHPC piles in future foundations of bridges and other structures. An H-shaped UHPC section, 10 in. (250 mm) deep and similar in weight to an HP10×57 steel pile, was designed to improve constructability and reduce cost. In this project, instrumented UHPC piles were cast, and laboratory and field tests were conducted. Laboratory tests were used to verify the moment-curvature response of the UHPC pile section. In the field, two UHPC piles were successfully driven in glacial till clay soil and load tested under vertical and lateral loads. This report provides a complete set of results for the field investigation conducted on UHPC H-shaped piles. The test results, durability, drivability, and other material advantages over normal concrete and steel indicate that UHPC piles are a viable alternative for achieving the goals of the AASHTO strategic plan.
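As a quick sanity check of the strength figures quoted above (simple unit arithmetic; the conversion factor is standard, not data from the report):

    MPA_TO_PSI = 145.038  # standard conversion factor: 1 MPa = 145.038 psi

    strength_mpa = 180
    print(f"{strength_mpa} MPa = {strength_mpa * MPA_TO_PSI:,.0f} psi")
    # -> 180 MPa = 26,107 psi, consistent with the rounded 26,000 psi above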

Relevance:

20.00%

Publisher:

Abstract:

The current means and methods of verifying that high-strength bolts have been properly tightened are very laborious and time-consuming. In some cases, the techniques require special equipment and, in other cases, the verification itself may be somewhat subjective. While some commercially available verification techniques do exist, these options still have limitations and can be costly. The main objectives of this project were to explore high-strength bolt-tightening and verification techniques and to investigate the feasibility of developing and implementing new alternatives. A literature search and a survey of state departments of transportation (DOTs) were conducted to collect information on various bolt-tightening techniques and to gain an understanding of available and under-development techniques. During the literature review, the requirements for materials, inspection, and installation methods outlined in the Research Council on Structural Connections specification were also reviewed and summarized. To guide the search for new alternatives and technology development, a working group meeting was held at the Iowa State University Institute for Transportation on October 12, 2015. During the meeting, topics central to the research were discussed with Iowa DOT engineers and other professionals with relevant experience.

Relevance:

20.00%

Publisher:

Abstract:

A laboratory-scale former is essential for imitating the papermaking process. Although various formers exist in the paper industry, there is still room for a laboratory-scale papermaking method positioned between a pilot machine and the traditional laboratory sheet mould. A former that reproduces conditions and phenomena similar to real papermaking has been developed, and its operation has been tested at the Nalco Papermaking Centre of Excellence in Espoo. The former is connected to Nalco's approach-system simulator, and the hydro-chemical phenomena produced with the simulator can now also be examined from the sheets. The device comprises a headbox and a wire section. From the headbox, the stock flows onto the wire, which moves forward on top of the belts of a belt conveyor. The jet-to-wire ratio can be changed either by altering the flow rate or the wire speed, or by adjusting the headbox slice opening. Testing showed that the former works well technically and that the results are repeatable and logical. In the sheets the fibres are oriented, and the formation and the machine-direction to cross-direction (MD/CD) tensile strength ratio depend strongly on the jet-to-wire ratio, as on real paper machines.
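The jet-to-wire adjustment described above follows from simple continuity arithmetic. A minimal sketch in Python, with the jet velocity taken as volumetric flow divided by slice-opening area; all dimensions and speeds are invented example values, not measurements from the device:

    def jet_velocity(flow_m3_s, slice_height_m, slice_width_m):
        """Mean jet velocity out of the headbox slice (continuity: v = Q / A)."""
        return flow_m3_s / (slice_height_m * slice_width_m)

    def jet_to_wire(jet_m_s, wire_m_s):
        """> 1: the jet 'rushes' the wire; < 1: it 'drags'; both orient fibres."""
        return jet_m_s / wire_m_s

    v_jet = jet_velocity(flow_m3_s=0.002, slice_height_m=0.01, slice_width_m=0.2)
    for v_wire in (0.8, 1.0, 1.2):
        print(f"wire {v_wire:.1f} m/s -> jet/wire = {jet_to_wire(v_jet, v_wire):.2f}")

Either knob the abstract mentions moves the same ratio: a higher flow rate or a narrower slice opening raises the jet velocity, while a faster wire lowers the ratio.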

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study is to define a new statistic, PVL, based on the relative distance between the likelihood associated with the simulation replications and the likelihood of the conceptual model. Our results from several simulation experiments of a clinical trial show that the range of the PVL statistic can be a good measure of stability for establishing when a computational model verifies the underlying conceptual model. PVL also improves the analysis of simulation replications, because a single statistic is associated with all of them. The study further presents several verification scenarios, obtained by altering the simulation model, that show the usefulness of PVL. Additional simulation experiments suggest that a 0 to 20% range may define adequate limits for the verification problem when considered from the viewpoint of an equivalence test.
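The abstract does not give the formula for PVL, so the following Python sketch is only a hypothetical reading of "relative distance between likelihoods": the gap between the mean replication log-likelihood and the conceptual model's log-likelihood, expressed as a percentage of the latter.

    from statistics import mean

    def pvl(replication_logliks, conceptual_loglik):
        """One summary statistic over all replications, expressed in percent."""
        d = abs(mean(replication_logliks) - conceptual_loglik)
        return 100.0 * d / abs(conceptual_loglik)

    # Replications whose likelihoods stay close to the conceptual model's
    # fall inside the 0-20% band proposed above as a verification limit.
    reps = [-105.2, -98.7, -110.4, -101.9]
    print(f"PVL = {pvl(reps, conceptual_loglik=-100.0):.2f}%")  # PVL = 4.05%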

Relevance:

20.00%

Publisher:

Abstract:

Validation and verification operations encounter various challenges in the product development process, and requirements for a faster development cycle place new demands on the component development process. Verification and validation usually represent the largest activities, consuming up to 40-50% of R&D resources. This research studies validation and verification as part of the case company's component development process. The target is to define a framework for evaluating and developing validation and verification capability in display module development projects. The research reviews the definition and background of validation and verification, together with theories of project management, systems, organisational learning, and causality. The framework and the key findings are presented, and a feedback system based on the framework is defined and implemented at the case company. The research is divided into a theory part, conducted as a literature review, and an empirical part, conducted as a case study using the constructive and design research methods. As a result, a framework for capability evaluation and development was defined and developed. A key finding was that a double-loop learning approach combined with the validation and verification V+ model enables the definition of a feedback reporting solution; in addition, some minor changes to the validation and verification process were proposed. A few concerns are expressed about the validity and reliability of the study, the most important being the selected research method and model: the final state can be normative, since the researcher may set the study results before the actual study and, in the initial state, describe expectations for it. The reliability and validity of the work are therefore also examined.

Relevance:

20.00%

Publisher:

Abstract:

What makes necessary truths true? I argue that all truth supervenes on how things are, and that necessary truths are no exception. What makes them true are proofs. But if so, the notion of proof needs to be generalized to include verification-transcendent proofs: proofs whose correctness exceeds our ability to verify it. It is incumbent on me, therefore, to show that arguments such as Dummett's, that verification-transcendent truth is not compatible with the theory of meaning, are mistaken. The answer is that what we can conceive and construct far outstrips our actual abilities. I conclude by proposing a proof-theoretic account of modality, rejecting Armstrong's claim that modality can reside in non-modal truthmakers.

Relevance:

20.00%

Publisher:

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find them difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program, adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers.

Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs; it is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. It is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure; conditions that are not discharged automatically may be proved interactively using the PVS proof assistant.

The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. With the aid of the tool, the programmer reduces a large verification problem into a set of smaller problems (lemmas) and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm.

Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early, practically oriented programming courses. Our hypothesis is that verification could be introduced early in CS education and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
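The thesis develops its verified sorting algorithm inside Socos with PVS; purely as an illustration of the invariant-first style, here is a minimal sketch in plain Python assertions (not the Socos diagram notation), where the invariant is written before the loop body and rechecked after each extension of the code:

    def insertion_sort(a):
        """Sort a copy of a; the loop invariant is stated before the body."""
        a = list(a)
        for i in range(1, len(a)):
            # Invariant, established first: the prefix a[0:i] is sorted.
            assert all(a[k] <= a[k + 1] for k in range(i - 1))
            x, j = a[i], i
            while j > 0 and a[j - 1] > x:
                a[j] = a[j - 1]   # shift larger elements right
                j -= 1
            a[j] = x
            # Each extension of the code is checked against the invariant:
            # the sorted prefix now covers a[0:i+1].
            assert all(a[k] <= a[k + 1] for k in range(i))
        return a

    print(insertion_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]

In the method itself these checks are proof obligations discharged by the theorem prover rather than runtime assertions; the sketch only conveys the workflow of keeping the program consistent with its invariants at every step.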

Relevance:

20.00%

Publisher:

Abstract:

The oxidation capability of pulsed corona discharge toward aqueous impurities is limited with respect to certain refractory compounds. It may be enhanced by combining the discharge with catalysis/photocatalysis, as developed for homogeneous gas-phase reactions. The objective of this work was to test the hypothesis that the oxidation capability is enhanced when the discharge is combined with TiO2 photocatalysis, applied to aqueous solutions of refractory oxalate. Meglumine acridone acetate was included to meet practical needs. Experiments were carried out on the oxidation of aqueous solutions at various target pollutant concentrations, pH values, and pulse repetition rates, using both plain electrodes and electrodes with TiO2 attached to their surface. The results showed no positive influence of the photocatalyst: the pollutants were oxidized at rates identical within the accuracy of the measurements. Possible explanations for the observed inefficiency include low UV irradiance, the screening effect of water, and the generally low oxidation rate of photocatalytic reactions. Further studies might combine the electric discharge with ozone decomposition/radical formation catalysts.

Relevance:

20.00%

Publisher:

Abstract:

The capabilities, and thus the design complexity, of VLSI-based embedded systems have increased tremendously in recent years, riding the wave of Moore's law. Time-to-market requirements are also shrinking, imposing challenges on designers, who in turn seek to adopt new design methods to increase their productivity. As an answer to these pressures, modern systems have moved towards on-chip multiprocessing technologies, and new on-chip multiprocessing architectures have emerged to exploit the tremendous advances of fabrication technology. Platform-based design is a possible solution to these challenges. The principle behind the approach is to separate the functionality of an application from the organization and communication architecture of the hardware platform at several levels of abstraction. However, existing design methodologies for platform-based design do not provide full automation at every level of the design process, and the co-design of platform-based systems sometimes leads to sub-optimal systems. In addition, the design productivity gap in multiprocessor systems remains a key challenge under existing methodologies.

This thesis addresses these challenges and discusses the creation of a development framework for platform-based system design in the context of the SegBus platform, a distributed communication architecture. The research aims to provide automated procedures for platform design and application mapping. Structural verification support is also featured, ensuring correct-by-design platforms. The solution is based on a model-based process: both the platform and the application are modeled using the Unified Modeling Language. The thesis develops a Domain Specific Language to support platform modeling based on a corresponding UML profile, and Object Constraint Language constraints are used to support structurally correct platform construction. An emulator is introduced to allow performance estimation of the solution that is as accurate as possible at high abstraction levels. VHDL code is automatically generated in the form of "snippets" to be employed in the arbiter modules of the platform, as required by the application. The resulting framework is applied in building an actual design solution for an MP3 stereo audio decoder application.
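To make the structural-verification idea concrete, the following Python sketch shows the kind of well-formedness rule that such constraints enforce; the model classes and the rule itself are invented for illustration, and the actual SegBus UML profile and OCL constraints are not reproduced here:

    from dataclasses import dataclass

    @dataclass
    class Segment:
        name: str
        devices: list  # device identifiers attached to this bus segment

    @dataclass
    class PlatformModel:
        segments: list

    def check_structure(model):
        """Return violation messages; an empty list means the model passes."""
        errors = []
        seen = set()
        for seg in model.segments:
            if not seg.devices:
                errors.append(f"segment {seg.name} has no devices")
            for dev in seg.devices:
                if dev in seen:
                    errors.append(f"device {dev} is mapped to more than one segment")
                seen.add(dev)
        return errors

    model = PlatformModel(segments=[Segment("S0", ["dsp0", "mem0"]),
                                    Segment("S1", ["dsp0"])])
    print(check_structure(model))  # ['device dsp0 is mapped to more than one segment']

Rejecting ill-formed models before any code generation is what makes the resulting platforms correct by design rather than corrected after the fact.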

Relevance:

20.00%

Publisher:

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of various natures, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes, and powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes and sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification, and assessment of system reconfiguration mechanisms is a challenging and error-prone engineering task.

In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. The framework provides the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature: to ensure functional correctness it should rely on formal modelling and verification, while to assess the impact of changes on properties such as performance and reliability it should be combined with quantitative analysis. To ensure scalability, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving, and it has mature industrial-strength tool support: the Rodin platform. Proof-based verification, together with the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows developers to explicitly express and verify critical system-level properties.

Besides ensuring functional correctness, achieving resilience also requires analysing a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect overall system resilience. The approach is validated by a number of case studies from areas such as robotics, space, healthcare, and cloud computing.
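As a minimal sketch of the discrete-event side of that integration: SimPy is the real library named above, but the failure model, rates, and delays below are invented example values rather than anything from the case studies.

    import random
    import simpy  # discrete-event simulation library named in the abstract

    MTBF = 100.0            # mean time between component failures (invented)
    RECONFIG_DELAY = 5.0    # time to reconfigure after a failure (invented)
    HORIZON = 10_000.0      # simulated time horizon
    downtime = 0.0

    def lifecycle(env):
        """Alternate between operation and reconfiguration after failures."""
        global downtime
        while True:
            yield env.timeout(random.expovariate(1.0 / MTBF))  # run until failure
            start = env.now
            yield env.timeout(RECONFIG_DELAY)                  # adapt to the change
            downtime += env.now - start

    random.seed(1)
    env = simpy.Environment()
    env.process(lifecycle(env))
    env.run(until=HORIZON)
    print(f"estimated availability: {1 - downtime / HORIZON:.3f}")

Sweeping parameters such as RECONFIG_DELAY in a model like this is one way to quantify how a chosen reconfiguration strategy affects resilience, complementing the functional-correctness proofs carried out in Event-B.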

Relevance:

20.00%

Publisher:

Abstract:

Dynamic logic is an extension of modal logic originally intended for reasoning about computer programs. The method of proving correctness properties of a computer program using the well-known Hoare logic can be implemented by utilizing the expressiveness of dynamic logic. For a very broad range of languages and applications in program verification, the KIV (Karlsruhe Interactive Verifier) theorem prover has already been developed, but its high degree of automation and its complexity make it difficult to use for educational purposes. My research work is directed towards the design and implementation of a similar interactive theorem prover with educational use as its main design criterion. As the key purpose of the system is to serve as an educational tool, it is self-explanatory and explains every step of creating a derivation, i.e., proving a theorem. The deductive system is implemented in the platform-independent programming language Java. In addition, the popular combination of the lexical analyzer generator JFlex and the parser generator BYacc/J is used for parsing formulas and programs.
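For context on the Hoare-logic connection mentioned above (standard dynamic-logic material, not specific to this thesis): the box modality [α]ψ states that ψ holds after every terminating execution of program α, so a partial-correctness Hoare triple becomes an ordinary formula.

    % Standard embedding of Hoare logic into dynamic logic (textbook
    % material, not taken from this thesis): a partial-correctness
    % triple becomes an ordinary formula built with the box modality.
    \[
      \{\varphi\}\;\alpha\;\{\psi\}
      \quad\Longleftrightarrow\quad
      \varphi \rightarrow [\alpha]\,\psi
    \]
    % The assignment axiom, for instance, is expressed directly as
    \[
      [x := e]\,\psi \;\leftrightarrow\; \psi[e/x]
    \]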

Relevance:

20.00%

Publisher:

Abstract:

Hardware/software systems are becoming indispensable in every aspect of daily life. The growing presence of these systems in various products and services creates a need for methods to develop them efficiently. However, efficient design of these systems is limited by several factors, among them the growing complexity of applications, increasing integration density, the heterogeneous nature of products and services, and shrinking time-to-market. Transaction-level modeling (TLM) is considered a promising paradigm for managing design complexity and for providing means to explore and validate design alternatives at high levels of abstraction. This research proposes a methodology for expressing time in TLM based on an analysis of timing constraints. We propose to use a combination of two development paradigms to accelerate design: TLM on the one hand, and a methodology for expressing time between different transactions on the other. This synergy allows us to combine, in a single environment, high-performance simulation methods and formal analytical methods. We have proposed a new timing verification algorithm based on a linearization procedure for min/max constraints, together with an optimization technique that improves the algorithm's efficiency. We have completed the mathematical description of all the constraint types presented in the literature. We have developed methods for exploring and refining the communication system, which allowed us to apply the timing verification algorithms at different TLM levels. Since several definitions of TLM exist, within the scope of our research we defined a specification and simulation methodology for hardware/software systems based on the TLM paradigm, in which several modeling concepts can be considered separately. Based on modern software engineering technologies such as XML, XSLT, XSD, object-oriented programming, and several others provided by the .NET environment, the proposed methodology makes it possible to reuse intermediate models in order to cope with the time-to-market constraint. It provides a general approach to system modeling that separates different design aspects, such as the models of computation used to describe the system at multiple levels of abstraction. As a result, the functionality of the system can be clearly identified in the system model without details tied to the development platforms, which improves the "portability" of the application model.
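The thesis's own algorithm linearizes general min/max constraints; as a simplified, hypothetical illustration, the sketch below checks only plain difference constraints (a special case) using the standard Bellman-Ford reduction:

    def consistent(n_events, constraints):
        """constraints: triples (i, j, c) encoding t_j - t_i <= c.
        The set is satisfiable iff the constraint graph has no negative
        cycle (standard difference-constraint reduction, checked here by
        Bellman-Ford relaxation from an implicit zero-distance source)."""
        dist = [0.0] * n_events
        for _ in range(n_events + 1):
            changed = False
            for i, j, c in constraints:
                if dist[i] + c < dist[j]:
                    dist[j] = dist[i] + c
                    changed = True
            if not changed:
                return True   # reached a fixed point: a schedule exists
        return False          # still relaxing: negative cycle, unsatisfiable

    # A start (event 1) must follow a request (event 0) by at most 10 ns,
    # and a response (event 2) must come 5-20 ns after the start.
    print(consistent(3, [(0, 1, 10.0), (1, 2, 20.0), (2, 1, -5.0)]))  # True

Min/max constraints, where an event is triggered by the earliest or latest of several predecessors, do not reduce to such a graph directly, which is why the thesis introduces a linearization procedure before verification.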