989 results for Verification tool


Relevance:

100.00%

Publisher:

Abstract:

The aim of this work is to develop software that is capable of back-projecting primary fluence images obtained from EPID measurements through phantom and patient geometries in order to calculate 3D dose distributions. In the first instance, we aim to develop a tool for pre-treatment verification in IMRT. In our approach, a Geant4 application is used to back-project primary fluence values from each EPID pixel towards the source. Each beam is considered to be polyenergetic, with a spectrum obtained from Monte Carlo calculations for the LINAC in question. At each step of the ray-tracing process, the energy-differential fluence is corrected for attenuation and beam divergence. Subsequently, the TERMA is calculated and accumulated into an energy-differential 3D TERMA distribution. This distribution is then convolved with monoenergetic point-spread kernels, thus generating energy-differential 3D dose distributions. The resulting dose distributions are accumulated to yield the total dose distribution, which can then be used for pre-treatment verification of IMRT plans. Preliminary results were obtained for a test EPID image comprising 100 × 100 pixels of unity fluence. Back projection of this field into a 30 cm × 30 cm × 30 cm water phantom was performed, with TERMA distributions obtained in approximately 10 min (running on a single core of a 3 GHz processor). Point-spread kernels for monoenergetic photons in water were calculated using a separate Geant4 application. Following convolution and summation, the resulting 3D dose distribution produced familiar build-up and penumbral features. In order to validate the dose model, we will use EPID images recorded without any attenuating material in the beam for a number of MLC-defined square fields. The dose distributions in water will be calculated and compared to TPS predictions.
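
The TERMA-and-convolution step of such a pipeline can be sketched compactly. The following is a minimal Python sketch, not the authors' Geant4 implementation: it assumes a monoenergetic parallel beam instead of a divergent polyenergetic one, a water-only phantom, a hypothetical linear attenuation coefficient `mu`, and a Gaussian stand-in for the Monte Carlo point-spread kernels.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Illustrative only: monoenergetic parallel beam entering a water phantom
# along z. The described tool instead ray-traces a divergent, polyenergetic
# beam from EPID pixels back towards the source with Geant4.
mu = 0.05            # hypothetical linear attenuation coefficient (1/cm)
voxel = 0.3          # voxel size (cm); 100 voxels -> 30 cm phantom
fluence = np.ones((100, 100))                  # unity-fluence "EPID image"

# TERMA ~ (mu/rho) * energy fluence; constant factors are folded in here.
depth = np.arange(100) * voxel                 # depth of each z-slice
attenuation = np.exp(-mu * depth)              # primary attenuation with depth
terma = fluence[None, :, :] * attenuation[:, None, None]

# Convolve the TERMA distribution with a point-spread kernel to obtain dose.
# A Gaussian is used purely as a stand-in for the Geant4-calculated kernels.
dose = gaussian_filter(terma, sigma=(2.0, 1.0, 1.0))

print("central-axis dose (first 5 depths):", dose[:5, 50, 50])
```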

Relevance:

70.00%

Publisher:

Abstract:

We present a case study of the formal verification of a full-wave rectifier for analog and mixed-signal designs. We have used the CheckMate tool from CMU [1], which is a public-domain formal verification tool for hybrid systems. Restrictions imposed by CheckMate made it necessary to modify the CheckMate implementation in order to handle this complex, non-linear system. The full-wave rectifier was implemented using CheckMate custom blocks and Simulink blocks from MATLAB (MathWorks). After establishing the required changes in the CheckMate implementation, we are able to efficiently verify the safety properties of the full-wave rectifier.
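
As background on why a rectifier is naturally treated as a hybrid system, the sketch below simulates a full-wave rectifier with an RC load as two discrete modes (diodes conducting vs. blocking) with continuous dynamics in each mode. It is a hedged illustration only: the component values, the ideal-diode threshold and the Euler step are assumptions, and this is not the CheckMate model used in the case study.

```python
import numpy as np

# Two-mode hybrid model of a full-wave rectifier driving an RC load.
# All values are illustrative assumptions, not taken from the paper.
VD = 0.7                 # diode forward drop (V)
R, C = 1e3, 100e-6       # load resistance (ohm) and smoothing capacitance (F)
f, VPEAK = 50.0, 10.0    # mains frequency (Hz) and source amplitude (V)
dt, T = 1e-5, 0.1        # integration step and simulated time (s)

t = np.arange(0.0, T, dt)
v_out = np.zeros_like(t)
conducting_steps = 0
for k in range(1, len(t)):
    v_rect = abs(VPEAK * np.sin(2 * np.pi * f * t[k])) - 2 * VD
    if v_rect > v_out[k - 1]:
        # discrete mode "conducting": bridge diodes forward-biased,
        # the source clamps the capacitor voltage
        v_out[k] = v_rect
        conducting_steps += 1
    else:
        # discrete mode "blocking": diodes reverse-biased,
        # continuous dynamics C dv/dt = -v/R (forward-Euler step)
        v_out[k] = v_out[k - 1] * (1 - dt / (R * C))

half = v_out[len(t) // 2:]
print(f"time in conducting mode: {100 * conducting_steps / (len(t) - 1):.0f}%")
print(f"steady-state ripple: {half.max() - half.min():.2f} V "
      f"around {half.mean():.2f} V")
```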

Relevance:

70.00%

Publisher:

Abstract:

Formal specification is vital to the development of distributed real-time systems, as these systems are inherently complex and safety-critical. It is widely acknowledged that formal specification and automatic analysis of specifications can significantly increase system reliability. Although a number of specification techniques for real-time systems have been reported in the literature, most of these formalisms do not adequately address the constraints that the aspects of 'distribution' and 'real-time' impose on specifications. Further, an automatic verification tool is necessary to reduce human errors in the reasoning process. In this regard, this paper is an attempt towards the development of a novel executable specification language, DL, for distributed real-time systems. First, we give a precise characterization of the syntax and semantics of DL. Subsequently, we discuss the problems of model checking, automatic verification of the satisfiability of DL specifications, and testing the conformance of event traces with DL specifications. Effective solutions to these problems are presented as extensions to the classical first-order tableau algorithm. The use of the proposed framework is illustrated by specifying a sample problem.
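
For readers unfamiliar with the tableau method that these solutions extend, here is a minimal propositional tableau satisfiability checker in Python. It is only a sketch of the classical idea (expanding conjunctions, branching on disjunctions, and closing branches that contain a literal together with its negation); the paper's algorithm operates on first-order DL specifications with real-time and distribution extensions, which are not reflected here.

```python
# Minimal propositional tableau: formulas are nested tuples
#   ("atom", "p") | ("not", f) | ("and", f, g) | ("or", f, g)
# A formula is satisfiable iff some tableau branch stays open.

def negate(f):
    return f[1] if f[0] == "not" else ("not", f)

def is_literal(f):
    return f[0] == "atom" or (f[0] == "not" and f[1][0] == "atom")

def satisfiable(branch):
    # Pick a non-literal formula on the branch to expand, if any.
    for i, f in enumerate(branch):
        if is_literal(f):
            continue
        rest = branch[:i] + branch[i + 1:]
        if f[0] == "not":
            g = f[1]
            if g[0] == "not":                       # double negation
                return satisfiable(rest + [g[1]])
            if g[0] == "and":                       # ~(a & b) -> ~a | ~b
                return satisfiable(rest + [negate(g[1])]) or \
                       satisfiable(rest + [negate(g[2])])
            if g[0] == "or":                        # ~(a | b) -> ~a, ~b
                return satisfiable(rest + [negate(g[1]), negate(g[2])])
        if f[0] == "and":                           # alpha rule: add both
            return satisfiable(rest + [f[1], f[2]])
        if f[0] == "or":                            # beta rule: branch
            return satisfiable(rest + [f[1]]) or satisfiable(rest + [f[2]])
    # Only literals remain: the branch is open unless it is contradictory.
    lits = set(branch)
    return not any(negate(l) in lits for l in lits)

p, q = ("atom", "p"), ("atom", "q")
print(satisfiable([("and", ("or", p, q), ("not", p))]))   # True  (q holds)
print(satisfiable([("and", p, ("not", p))]))              # False (closed)
```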

Relevance:

70.00%

Publisher:

Abstract:

An extension to a formal verification approach for hybrid systems is proposed to verify analog and mixed-signal (AMS) designs. AMS designs can be formally modeled as hybrid systems and therefore lend themselves to the formal analysis and verification techniques applied to hybrid systems. The proposed approach employs simulation traces obtained from an actual design implementation of AMS circuit blocks (for example, in the form of SPICE netlists) to carry out formal analysis and verification. This enables the same platform used for formally validating an abstract model of an AMS design to also be used for validating its different refinements and design implementations, thereby providing a simple route to formal verification at different levels of implementation. The feasibility of the proposed approach is demonstrated with a case study based on a tunnel diode oscillator. Since the device characteristic of a tunnel diode is highly non-linear, with a negative-resistance region, the dynamic behavior of circuits in which it is employed is difficult to model, analyze and verify within a general hybrid-system formal verification tool. In the case study presented, the formal model and the proposed computational techniques have been incorporated into CheckMate, a formal verification tool based on MATLAB and the Simulink-Stateflow framework from MathWorks.
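
To illustrate why the tunnel diode's negative-resistance region makes the oscillator dynamics awkward for hybrid-system tools, the sketch below integrates a standard two-state tunnel-diode-oscillator model in Python. The cubic diode characteristic, the normalized component values and the initial condition are illustrative assumptions, not the benchmark model from the paper.

```python
import numpy as np

# Normalized two-state model of a tunnel diode oscillator:
#   C dVc/dt = -Id(Vc) + Il
#   L dIl/dt = -Vc - R*Il + Vin
# Id(v) is an illustrative cubic with a negative-resistance region
# (roughly 0.28 < v < 0.72); the operating point is placed inside that
# region so the circuit can sustain an oscillation.
C, L, R, VIN = 1.0, 1.0, 0.05, 0.5

def i_diode(v):
    return v**3 - 1.5 * v**2 + 0.6 * v

dt, steps = 1e-3, 200_000
vc, il = 0.3, 0.0
vc_trace = np.empty(steps)
for k in range(steps):
    dvc = (-i_diode(vc) + il) / C
    dil = (-vc - R * il + VIN) / L
    vc, il = vc + dt * dvc, il + dt * dil     # forward-Euler step
    vc_trace[k] = vc

half = vc_trace[steps // 2:]
print(f"Vc range over the second half of the run: "
      f"{half.min():.3f} .. {half.max():.3f}")
```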

Relevance:

70.00%

Publisher:

Abstract:

Purpose: To investigate the dosimetric properties of an electronic portal imaging device (EPID) for electron beam detection and to evaluate its potential for quality assurance (QA) of modulated electron radiotherapy (MERT). Methods: A commercially available EPID was used to detect electron beams shaped by a photon multileaf collimator (MLC) at a source-surface distance of 70 cm. Fundamental dosimetric properties, such as reproducibility, dose linearity, field size response, energy response, and saturation, were investigated for electron beams. A new method to acquire the flood-field for the EPID calibration was tested. For validation purposes, profiles of open fields and various MLC fields (square and irregular) were measured with a diode in water and compared to the EPID measurements. Finally, in order to use the EPID for QA of MERT delivery, a method was developed to reconstruct EPID two-dimensional (2D) dose distributions at a water-equivalent depth of 1.5 cm. Comparisons were performed with film measurements for static and dynamic monoenergy fields as well as for multienergy fields composed of several segments of different electron energies. Results: The advantageous EPID dosimetric properties already known for photons, such as reproducibility and linearity with dose and dose rate, were found to be identical for electron detection. The flood-field calibration method was proven to be effective, and the EPID was capable of accurately reproducing the dose measured in water at 1.0 cm depth for 6 MeV, 1.3 cm for 9 MeV, and 1.5 cm for 12, 15, and 18 MeV. The deviations between the output factors measured with the EPID and in water at these depths were within ±1.2% for all energies, with a mean deviation of 0.1%. The average gamma pass rate (criteria: 1.5%, 1.5 mm) for profile comparison between EPID and measurements in water was better than 99% for all the energies considered in this study. When comparing the reconstructed EPID 2D dose distributions at 1.5 cm depth to film measurements, the gamma pass rate (criteria: 2%, 2 mm) was better than 97% for all the tested cases. Conclusions: This study demonstrates the high potential of the EPID for electron dosimetry and, in particular, confirms the possibility of using it as an efficient verification tool for MERT delivery.
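
The gamma pass rates quoted above come from the standard gamma-index comparison of two dose distributions. The following Python sketch computes a brute-force 2D global gamma index for small grids; the dose-difference and distance-to-agreement criteria are passed in (e.g. 2%/2 mm), and the dose arrays and pixel spacing below are hypothetical, not the measured EPID or film data.

```python
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm, dd_percent, dta_mm, threshold=0.1):
    """Brute-force 2D global gamma analysis (small grids only).

    ref, eval_  : reference and evaluated dose arrays on the same grid
    spacing_mm  : pixel spacing in mm
    dd_percent  : dose-difference criterion as % of the reference maximum
    dta_mm      : distance-to-agreement criterion in mm
    threshold   : skip reference points below this fraction of the maximum
    """
    dd = dd_percent / 100.0 * ref.max()
    ys, xs = np.mgrid[0:ref.shape[0], 0:ref.shape[1]]
    gammas = []
    for (i, j), d_ref in np.ndenumerate(ref):
        if d_ref < threshold * ref.max():
            continue                                  # low-dose region ignored
        dist2 = ((ys - i) ** 2 + (xs - j) ** 2) * spacing_mm ** 2
        dose2 = (eval_ - d_ref) ** 2
        gammas.append(np.sqrt((dist2 / dta_mm**2 + dose2 / dd**2).min()))
    return 100.0 * np.mean(np.array(gammas) <= 1.0)

# Hypothetical example: a Gaussian "field" versus a slightly shifted copy.
y, x = np.mgrid[-30:30, -30:30].astype(float)
ref = np.exp(-(x**2 + y**2) / 400.0)
meas = np.exp(-((x - 0.5)**2 + y**2) / 400.0)
print(f"gamma pass rate (2%/2 mm): "
      f"{gamma_pass_rate(ref, meas, 1.0, 2.0, 2.0):.1f}%")
```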

Relevance:

60.00%

Publisher:

Abstract:

Doctoral thesis, Informatics (Computer Science), Universidade de Lisboa, Faculdade de Ciências, 2015

Relevance:

60.00%

Publisher:

Abstract:

Architectures based on Coordinated Atomic action (CA action) concepts have been used to build concurrent fault-tolerant systems. This conceptual model combines concurrent exception handling with action nesting to provide a general mechanism both for enclosing interactions among system components and for coordinating forward error recovery measures. This article presents an architectural model to guide the formal specification of concurrent fault-tolerant systems. The architecture provides built-in Communicating Sequential Processes (CSP) and predefined channels to coordinate the exception handling of user-defined components. Hence, some safety properties concerning action scoping and concurrent exception handling can be proved using the FDR (Failures-Divergence Refinement) verification tool. As a result, a formal and general architecture supporting software fault tolerance is ready to be used and proved as users define components with normal and exceptional behaviors.
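
As a rough illustration of the coordination pattern the architecture formalizes, the Python sketch below runs several participants of an "atomic action" concurrently; if any participant raises an exception, every participant runs its exception handler before the action completes. This is only an informal thread-based analogue of CA-action exception coordination, not the CSP specification checked with FDR, and all class and method names are hypothetical.

```python
import threading

class AtomicAction:
    """Toy coordinator for CA-action-style exception handling: participants
    run their bodies concurrently; if any of them raises, every participant
    runs its handler (forward error recovery) before the action completes."""

    def __init__(self, participants):
        self.participants = participants
        self.exceptions = []
        self.lock = threading.Lock()

    def _run_body(self, p):
        try:
            p.body()
        except Exception as exc:
            with self.lock:                      # collect concurrent exceptions
                self.exceptions.append(exc)

    def run(self):
        threads = [threading.Thread(target=self._run_body, args=(p,))
                   for p in self.participants]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        if self.exceptions:                      # coordinated handling phase
            for p in self.participants:
                p.handler(self.exceptions)
            return "exceptional outcome"
        return "normal outcome"

class Participant:
    def __init__(self, name, fail=False):
        self.name, self.fail = name, fail
    def body(self):
        if self.fail:
            raise RuntimeError(f"{self.name} failed")
    def handler(self, exceptions):
        print(f"{self.name}: handling {len(exceptions)} raised exception(s)")

action = AtomicAction([Participant("A"), Participant("B", fail=True)])
print(action.run())
```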

Relevance:

60.00%

Publisher:

Abstract:

Mobile agents have been implemented in e-commerce to search and filter information of interest from electronic markets. When the information is very sensitive and critical, it is important to develop a novel security protocol that can efficiently protect the information from malicious tampering as well as unauthorized disclosure, or at least detect any malicious act of intruders. In this chapter, we describe robust security techniques that ensure sound security of the information gathered throughout the agent's itinerary against various security attacks, including truncation attacks. A sound security protocol is described which implements the various security techniques that would jointly prevent, or at least detect, any malicious act of intruders. We reason about the soundness of the protocol using the Symbolic Trace Analyzer (STA), a formal verification tool based on symbolic techniques. We analyze the protocol in key configurations and show that it is free of flaws. We also show that the protocol fulfils the various security requirements of exchanged information in MAS, including data integrity, data confidentiality, data authenticity, origin confidentiality and data non-repudiability.
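
One standard way to make tampering and truncation of agent-gathered data detectable is to have each visited host bind its contribution to everything collected so far with a keyed chained digest, which the originator can later recompute to spot any modified, removed or reordered entry. The Python sketch below illustrates that generic idea with HMAC; it is not the specific protocol described in the chapter, and the keys and host names are hypothetical.

```python
import hashlib
import hmac

def chain_entry(prev_digest, host, offer, host_key):
    """Each host binds its offer to the running chain with a keyed digest."""
    payload = prev_digest + host.encode() + offer.encode()
    return hmac.new(host_key, payload, hashlib.sha256).digest()

# Hypothetical itinerary: each host contributes an offer and extends the chain.
host_keys = {"shopA": b"kA", "shopB": b"kB", "shopC": b"kC"}  # shared w/ origin
offers = [("shopA", "price=10"), ("shopB", "price=9"), ("shopC", "price=11")]

digest = b"agent-nonce"                        # anchor chosen by the originator
for host, offer in offers:
    digest = chain_entry(digest, host, offer, host_keys[host])

# Originator-side verification: recompute the chain over the returned data.
def verify(returned_offers, final_digest):
    d = b"agent-nonce"
    for host, offer in returned_offers:
        d = chain_entry(d, host, offer, host_keys[host])
    return hmac.compare_digest(d, final_digest)

print(verify(offers, digest))                   # True: data intact
print(verify(offers[:-1], digest))              # False: truncation detected
print(verify([offers[0], ("shopB", "price=1"), offers[2]], digest))  # tampered
```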

Relevance:

60.00%

Publisher:

Abstract:

Accurate static recrystallization (SRX) models are necessary to improve the properties of austenitic steels by thermo-mechanical operations. This relies heavily on a careful and accurate analysis of the interrupted test data and on conversion of the heterogeneous deformation data to the flow stress. A computational-experimental inverse method is presented and implemented here to analyze the SRX test data, taking into account the heterogeneous softening of the post-interruption test sample. Conventional and inverse methods were used to identify the SRX kinetics for a model austenitic steel deformed at 1273 K (with a strain rate of 1 s⁻¹) using the hot torsion test, in order to assess the merits of each method. Typical static recrystallization distribution maps in the test sample indicated that, at the onset of the second-pass deformation with less than a critical holding time and a given pre-strain, a partially recrystallized zone existed in the cylindrical core of the specimen near its center line. For the investigated scenario, the core was confined to the first half of the gauge radius when the holding time and the maximum pre-strain were below 29 s and 0.5, respectively. For maximum pre-strains smaller than 0.2, the specimen did not fully recrystallize, even at the gauge surface after holding for 50 s. Under such conditions, the conventional methods produced significant error.
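
For context on what identifying the SRX kinetics typically involves (the abstract does not spell out the model), the conventional route computes a fractional-softening value from the two flow curves of an interrupted (double-hit) test and fits it with an Avrami-type relation. The Python sketch below shows that standard calculation with made-up stress values; it is a generic illustration, not the paper's inverse method, and the Avrami form is an assumption on my part.

```python
import numpy as np
from scipy.optimize import curve_fit

def fractional_softening(sigma_y1, sigma_m, sigma_y2):
    """Conventional softening fraction from a double-hit test:
    sigma_y1 = yield stress of the first hit, sigma_m = flow stress at
    interruption, sigma_y2 = yield stress of the second hit."""
    return (sigma_m - sigma_y2) / (sigma_m - sigma_y1)

def avrami(t, t50, n):
    """JMAK/Avrami form commonly used for SRX kinetics (X = 0.5 at t = t50)."""
    return 1.0 - np.exp(-0.693 * (t / t50) ** n)

# Made-up interrupted-test data: holding times (s) and measured stresses (MPa).
hold_t = np.array([1.0, 3.0, 10.0, 30.0, 60.0])
sigma_y1, sigma_m = 80.0, 140.0
sigma_y2 = np.array([135.0, 125.0, 105.0, 88.0, 82.0])

X = fractional_softening(sigma_y1, sigma_m, sigma_y2)
(t50, n), _ = curve_fit(avrami, hold_t, X, p0=(10.0, 1.0))
print(f"softening fractions: {np.round(X, 2)}")
print(f"fitted Avrami parameters: t50 = {t50:.1f} s, n = {n:.2f}")
```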

Relevance:

60.00%

Publisher:

Abstract:

Networked systems have adopted Radio Frequency Identification (RFID) technology to automate their business processes. Networked RFID Systems (NRS) have some unique characteristics which raise new privacy and security concerns for organizations and their NRS systems. Businesses continually find new ways to realize their needs using NRS. One of the most recent realizations of NRS on large-scale distributed systems (such as the Internet of Things (IoT) and supply chains) is to ensure the visibility and traceability of an object throughout the chain. However, this requires assurance of security and privacy to ensure lawful business operation. In this paper, we propose a secure tracker protocol that ensures not only the visibility and traceability of an object but also the genuineness of the object and of its travel path on-site. The proposed protocol uses a Physically Unclonable Function (PUF), the Diffie-Hellman algorithm and simple cryptographic primitives to protect the privacy of the partners, prevent the injection of fake objects, and provide non-repudiation and unclonability. The tag only performs simple mathematical computations (such as combination, PUF evaluation and division), which makes the proposed protocol suitable for passive tags. To verify our security claims, we performed experiments on a Security Protocol Description Language (SPDL) model of the proposed protocol using the automated claim verification tool Scyther. Our experiments not only verified our claims but also helped us to eliminate possible attacks identified by Scyther.
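
To give a concrete feel for the kind of tag-side computation the abstract alludes to, the sketch below combines a textbook Diffie-Hellman exchange with a mocked PUF (a keyed hash standing in for the physical circuit) to derive a per-tag response bound to a reader challenge. The group parameters, the PUF stand-in and the way the values are combined are all illustrative assumptions, not the protocol defined in the paper.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters for illustration only; real deployments
# would use standardized groups or elliptic curves.
P = 2**127 - 1        # a known Mersenne prime, fine for a runnable demo
G = 3

def mock_puf(device_secret: bytes, challenge: bytes) -> bytes:
    """Stand-in for a hardware PUF: a device-unique, hard-to-clone mapping
    from challenge to response (here simulated with a keyed hash)."""
    return hashlib.sha256(device_secret + challenge).digest()

# Diffie-Hellman key agreement between the reader and the tag's partner.
reader_priv = secrets.randbelow(P - 2) + 1
tag_priv = secrets.randbelow(P - 2) + 1
reader_pub = pow(G, reader_priv, P)
tag_pub = pow(G, tag_priv, P)
shared_reader = pow(tag_pub, reader_priv, P)
shared_tag = pow(reader_pub, tag_priv, P)
assert shared_reader == shared_tag          # both sides derive the same secret

# Tag response: PUF output bound to the challenge and the shared secret.
device_secret = b"simulated-silicon-fingerprint"
challenge = secrets.token_bytes(16)
session_key = hashlib.sha256(str(shared_tag).encode()).digest()
response = hashlib.sha256(mock_puf(device_secret, challenge) +
                          session_key).digest()
print("tag response:", response.hex()[:32], "...")
```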

Relevance:

60.00%

Publisher:

Abstract:

In this paper, we propose a secure object tracking protocol to ensure the visibility and traceability of an object along its travel path, in support of the Internet of Things (IoT). The proposed protocol is based on a radio frequency identification system for the globally unique identification of IoT objects. To ensure secure object tracking, the proposed protocol uses lightweight cryptographic primitives and a physically unclonable function in the tags. We evaluated the proposed protocol both quantitatively and qualitatively. In our experiments, we modeled the protocol in the Security Protocol Description Language (SPDL) and simulated the SPDL model using the automated claim verification tool Scyther. The results show that the proposed protocol is more secure and requires less computation than existing similar protocols.

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this work was to define the processes through which the cooling of thermoplastic parts occurs inside the mold cavity in an injection process. Plastic materials have become widespread in the automobile industry and, among their manufacturing processes, injection molding is developing quickly, allowing the manufacture of quality parts in large volumes. Data were collected from the injection of the Volkswagen Gol NF 23X (Gol Generation 5). Using approximate methods to calculate the heat exchange inside the mold through the cooling system, the water flow required to properly cool the parts was determined. Comparing the obtained value with the project specifications, it was verified that the method, despite introducing some error, is efficient in determining the flow of cooling fluid; it serves as a verification tool for the parameters defined in the project and can be applied to simple projects. In practice, the definition of the cooling system depends on innumerable variables, and each case must be analyzed on its own, since the parameters for one product may not be ideal for another.
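
The underlying heat balance is straightforward: the heat that each shot deposits in the mold must be carried away by the cooling water within the cycle time, which fixes the required flow rate via the water-side balance q̇ = ṁ · c_p · ΔT. The Python sketch below works through that estimate with made-up part and process values; it is a generic illustration, not the data or method from the Gol NF 23X study.

```python
# Estimate of the cooling-water flow needed to remove the heat released by
# one shot within the cycle time. All numbers are illustrative assumptions;
# latent heat of crystallization and losses to the surroundings are neglected.
part_mass = 0.250                 # kg of polymer per shot
cp_polymer = 2000.0               # J/(kg*K), typical order for thermoplastics
delta_T_polymer = 230.0 - 60.0    # melt temperature -> ejection temperature (K)
cycle_time = 30.0                 # s per shot

# Heat to remove per shot and the corresponding average heat flow.
q_shot = part_mass * cp_polymer * delta_T_polymer       # J
q_dot = q_shot / cycle_time                              # W

# Water-side balance: q_dot = m_dot * cp_water * delta_T_water
cp_water = 4186.0                 # J/(kg*K)
delta_T_water = 3.0               # allowed coolant temperature rise (K)
m_dot = q_dot / (cp_water * delta_T_water)               # kg/s
flow_l_per_min = m_dot * 60.0                            # ~1 kg of water ~ 1 L

print(f"heat per shot: {q_shot/1000:.1f} kJ, average heat flow: {q_dot:.0f} W")
print(f"required water flow: {m_dot:.3f} kg/s ({flow_l_per_min:.1f} L/min)")
```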

Relevance:

60.00%

Publisher:

Abstract:

Process algebraic architectural description languages provide a formal means for modeling software systems and assessing their properties. In order to bridge the gap between system modeling and system implementation, this thesis proposes an approach for automatically generating multithreaded object-oriented code from process algebraic architectural descriptions, in a way that preserves, under certain assumptions, the properties proved at the architectural level. The approach is divided into three phases, which are illustrated by means of a running example based on an audio processing system. First, we develop an architecture-driven technique for thread coordination management, which is completely automated through a suitable package. Second, we address the translation of the algebraically specified behavior of the individual software units into thread templates, which have to be filled in by the software developer according to certain guidelines. Third, we discuss performance issues related to the suitability of synthesizing monitors rather than threads from software unit descriptions that satisfy specific constraints. In addition to the running example, we present two case studies, concerning a video animation repainting system and the implementation of a leader election algorithm, in order to summarize the whole approach. The outcome of this thesis is the implementation of the proposed approach in a translator called PADL2Java and its integration into the architecture-centric verification tool TwoTowers.
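
To make the notion of a "thread template" more concrete, the sketch below shows, in Python rather than the Java emitted by PADL2Java, the general shape such generated code could take: a run loop whose action ordering is fixed by the architectural description, with a stub that the developer fills in. The class, the toy process term and the channel names are hypothetical, and this is not actual PADL2Java output.

```python
import queue
import threading

class ThreadTemplate(threading.Thread):
    """Illustrative shape of a generated thread template: the coordination
    logic (which action may fire next) follows the architectural description,
    while the action bodies are stubs left for the developer."""

    def __init__(self, inbox: "queue.Queue", outbox: "queue.Queue"):
        super().__init__()
        self.inbox, self.outbox = inbox, outbox

    # Behavior from a toy process term: receive . process . send . repeat
    def run(self):
        while True:
            item = self.inbox.get()          # action "receive"
            if item is None:                 # termination signal
                break
            result = self.process(item)      # action "process" (developer code)
            self.outbox.put(result)          # action "send"

    def process(self, item):
        """Stub to be filled in by the developer, per the guidelines."""
        return item.upper()                  # placeholder body

inbox, outbox = queue.Queue(), queue.Queue()
worker = ThreadTemplate(inbox, outbox)
worker.start()
for chunk in ["audio-frame-1", "audio-frame-2"]:
    inbox.put(chunk)
inbox.put(None)
worker.join()
print([outbox.get() for _ in range(2)])
```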