922 results for Static verification


Relevance:

30.00%

Publisher:

Abstract:

Presented at the seminar "ACTION TEMPS RÉEL: INFRASTRUCTURES ET SERVICES SYSTÈMES", 10 Apr 2015, Brussels, Belgium.

Relevance:

30.00%

Publisher:

Abstract:

Concurrent programming remains a difficult task even for very experienced programmers. Concurrency research has produced a rich set of tools and mechanisms for dealing with data races and deadlocks, problems that arise from the incorrect use of synchronization. Verifying interesting properties of concurrent programs is considerably harder than for sequential programs because of the intrinsic non-determinism of concurrent execution, which causes an explosion in the number of possible program states and makes manual treatment nearly impossible, and even computer-assisted treatment a serious challenge. Some approaches create programming languages with high-level constructs for expressing concurrency and synchronization. Others develop reasoning methods to prove properties, using general theorem provers, model checking, or algorithms specific to a given type system. Lightweight static analysis approaches apply techniques such as abstract interpretation to find certain kinds of bugs in a conservative way; these techniques scale well enough to be applied to large software projects, but the kinds of bugs they can find are limited.
Some interesting properties concern data races and deadlocks, while others concern security problems such as confidentiality and integrity of data. The main goals of this proposal are to identify interesting properties to verify in concurrent systems and to develop techniques and tools for fully automatic verification. The main approach will be the application of type systems, such as dependent types, type-and-effect systems, and flow-sensitive effect types. These type systems will be applied to models of concurrent programming such as Simple Concurrent Object-Oriented Programming (SCOOP) and Java. Further goals include the analysis of security properties, also using specific type systems.
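
To make the kind of defect targeted here concrete, the following minimal Java sketch (an invented example, not taken from the proposal) shows an unsynchronized read-modify-write of the sort that data race detectors and concurrency type systems are designed to rule out, together with its lock-protected fix.

```java
// Minimal illustration (not from the proposal) of a data race:
// an unsynchronized read-modify-write on shared state.
public class RacyCounter {
    private int count = 0;               // shared mutable state

    public void increment() {
        count++;                         // race: read-modify-write is not atomic
    }

    public synchronized void incrementSafely() {
        count++;                         // lock-protected: no race
    }

    public static void main(String[] args) throws InterruptedException {
        RacyCounter c = new RacyCounter();
        Runnable task = () -> { for (int i = 0; i < 100_000; i++) c.increment(); };
        Thread t1 = new Thread(task), t2 = new Thread(task);
        t1.start(); t2.start();
        t1.join(); t2.join();
        // Typically prints fewer than 200000 due to lost updates.
        System.out.println("count = " + c.count);
    }
}
```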

Relevance:

30.00%

Publisher:

Abstract:

A design tool is developed for the analysis of damage tolerance in composites. The tool can predict the onset and propagation of interlaminar cracks. It can also be used to assess and plan the need to repair or replace components during their service life. The model developed can be used to simulate both static and fatigue loading. The proposed model is a thermodynamically consistent damage model that can simulate delamination in composites under variable loading. The model is formulated within the framework of Damage Mechanics, making use of cohesive zone models. A methodology is presented for determining the parameters of the constitutive model that allows the use of coarser finite element meshes than those typically usable. Finally, the model is also able to simulate delamination produced by fatigue loading.
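
For reference, cohesive zone models of the kind mentioned above are commonly formulated around a bilinear traction-separation law; the following generic form is illustrative and not necessarily the exact constitutive equations of this work.

```latex
% Generic bilinear cohesive-zone law (illustrative; not necessarily the
% exact constitutive model of this work). Traction t across the
% interface, with penalty stiffness K, opening displacement \delta,
% and a scalar damage variable d \in [0, 1]:
\[
  t = (1 - d)\,K\,\delta, \qquad
  d = \frac{\delta_f\,(\delta_{\max} - \delta_0)}{\delta_{\max}\,(\delta_f - \delta_0)},
\]
% where \delta_0 is the opening at damage onset, \delta_f the opening at
% full decohesion, and \delta_{\max} the maximum opening reached so far
% (which makes the damage irreversible under variable loading).
```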

Relevance:

30.00%

Publisher:

Abstract:

Static analysis tools report software defects that may or may not be detected by other verification methods. Two challenges complicating the adoption of these tools are spurious false positive warnings and legitimate warnings that are not acted on. This paper reports automated support to help address these challenges using logistic regression models that predict the foregoing types of warnings from signals in the warnings and implicated code. Because examining many potential signaling factors in large software development settings can be expensive, we use a screening methodology to quickly discard factors with low predictive power and cost-effectively build predictive models. Our empirical evaluation indicates that these models can achieve high accuracy in predicting accurate and actionable static analysis warnings, and suggests that the models are competitive with alternative models built without screening.
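
As a concrete illustration of how such a fitted model classifies a warning, the following Java sketch scores a warning with a logistic function; the feature names and coefficients are invented for illustration and do not come from the paper.

```java
// Sketch of scoring a static-analysis warning with a fitted logistic
// regression model. Feature names and coefficients are invented for
// illustration; the paper's actual screened factors differ.
public class WarningScorer {
    // W[0] is the intercept; the remaining weights pair with the
    // features of actionableProbability, in order.
    private static final double[] W = { -1.2, 0.8, -0.5, 1.1 };

    /** Probability that a warning is actionable, via the logistic function. */
    static double actionableProbability(double codeChurn,
                                        double fileDepth,
                                        double warningPriority) {
        double z = W[0] + W[1] * codeChurn + W[2] * fileDepth + W[3] * warningPriority;
        return 1.0 / (1.0 + Math.exp(-z));   // sigmoid
    }

    public static void main(String[] args) {
        System.out.printf("p(actionable) = %.3f%n",
                          actionableProbability(0.4, 3.0, 1.0));
    }
}
```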

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To investigate the dosimetric properties of an electronic portal imaging device (EPID) for electron beam detection and to evaluate its potential for quality assurance (QA) of modulated electron radiotherapy (MERT). Methods: A commercially available EPID was used to detect electron beams shaped by a photon multileaf collimator (MLC) at a source-surface distance of 70 cm. Fundamental dosimetric properties such as reproducibility, dose linearity, field size response, energy response, and saturation were investigated for electron beams. A new method to acquire the flood field for the EPID calibration was tested. For validation purposes, profiles of open fields and various MLC fields (square and irregular) were measured with a diode in water and compared to the EPID measurements. Finally, in order to use the EPID for QA of MERT delivery, a method was developed to reconstruct EPID two-dimensional (2D) dose distributions at a water-equivalent depth of 1.5 cm. Comparisons were performed with film measurements for static and dynamic monoenergy fields as well as for multienergy fields composed of several segments of different electron energies. Results: The advantageous EPID dosimetric properties already known for photons, such as reproducibility and linearity with dose and dose rate, were found to be identical for electron detection. The flood-field calibration method proved effective, and the EPID was capable of accurately reproducing the dose measured in water at 1.0 cm depth for 6 MeV, 1.3 cm for 9 MeV, and 1.5 cm for 12, 15, and 18 MeV. The deviations between the output factors measured with the EPID and in water at these depths were within ±1.2% for all energies, with a mean deviation of 0.1%. The average gamma pass rate (criteria: 1.5%, 1.5 mm) for profile comparisons between the EPID and measurements in water was better than 99% for all energies considered in this study. When comparing the reconstructed EPID 2D dose distributions at 1.5 cm depth to film measurements, the gamma pass rate (criteria: 2%, 2 mm) was better than 97% for all tested cases. Conclusions: This study demonstrates the high potential of the EPID for electron dosimetry and, in particular, confirms the possibility of using it as an efficient verification tool for MERT delivery.
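
The reported pass rates come from gamma analysis; the following simplified one-dimensional Java sketch of the gamma index (after Low et al.) shows the computation in principle, with placeholder dose values and criteria.

```java
// Simplified 1D gamma-index evaluation of the kind behind the reported
// pass rates (e.g., 2%/2 mm). Dose values and grid spacing are placeholders.
public class GammaIndex1D {
    /**
     * Gamma value at one evaluated point: minimum over reference points of
     * sqrt((spatial distance / dta)^2 + (dose difference / dose tolerance)^2).
     */
    static double gamma(double[] refDose, double spacingMm, int evalIndex,
                        double evalDose, double doseTolFrac, double dtaMm,
                        double normDose) {
        double best = Double.MAX_VALUE;
        for (int i = 0; i < refDose.length; i++) {
            double dist = (i - evalIndex) * spacingMm;
            double dd = (evalDose - refDose[i]) / (doseTolFrac * normDose);
            double g = Math.sqrt(Math.pow(dist / dtaMm, 2) + dd * dd);
            best = Math.min(best, g);
        }
        return best;   // the point passes if gamma <= 1
    }

    public static void main(String[] args) {
        double[] film = { 0.95, 1.00, 0.98, 0.90, 0.70 };   // reference (film)
        double[] epid = { 0.96, 1.01, 0.97, 0.88, 0.72 };   // evaluated (EPID)
        int pass = 0;
        for (int i = 0; i < epid.length; i++)
            if (gamma(film, 1.0, i, epid[i], 0.02, 2.0, 1.0) <= 1.0) pass++;
        System.out.printf("gamma pass rate: %.0f%%%n", 100.0 * pass / epid.length);
    }
}
```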

Relevance:

30.00%

Publisher:

Abstract:

CiaoPP is the abstract interpretation-based preprocessor of the Ciao multi-paradigm (Constraint) Logic Programming system. It uses modular, incremental abstract interpretation as a fundamental tool to obtain information about programs. In CiaoPP, the semantic approximations thus produced have been applied to perform high- and low-level optimizations during program compilation, including transformations such as multiple abstract specialization, parallelization, partial evaluation, resource usage control, and program verification. More recently, novel and promising applications of such semantic approximations are being applied in the more general context of program development, such as program verification. In this work, we describe our extension of the system to incorporate Abstraction-Carrying Code (ACC), a novel approach to mobile code safety. ACC follows the standard strategy of associating safety certificates with programs, originally proposed in Proof-Carrying Code. A distinguishing feature of ACC is that we use an abstraction (or abstract model) of the program, computed by standard static analyzers, as a certificate. The validity of the abstraction on the consumer side is checked in a single pass by a very efficient and specialized abstract interpreter. We have implemented and benchmarked ACC within CiaoPP. The experimental results show that the checking phase is indeed faster than the proof generation phase, and that the sizes of certificates are reasonable. Moreover, the preprocessor is based on compile-time (and run-time) tools for the certification of CLP programs with resource consumption assurances.
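
The single-pass checking idea can be sketched as follows: the certificate stores one abstract value per program point, and the checker merely verifies that each abstract transfer step is subsumed by the stored value, with no fixpoint iteration. The toy Java sketch below uses an invented sign domain and straight-line program; CiaoPP's actual checker operates over CLP abstract domains.

```java
import java.util.List;

// Toy sketch of ACC's single-pass certificate check: the producer ships
// an abstract value per program point; the consumer only verifies that
// each abstract transfer step stays within the shipped abstraction,
// instead of recomputing the fixpoint. Domain and program are invented.
public class CertificateChecker {
    enum Sign { BOT, NEG, ZERO, POS, TOP }      // tiny abstract domain

    /** Partial order of the domain: is a subsumed by b? */
    static boolean leq(Sign a, Sign b) {
        return a == Sign.BOT || b == Sign.TOP || a == b;
    }

    /** Abstract transfer function for one toy instruction: x := -x. */
    static Sign negate(Sign s) {
        switch (s) {
            case NEG: return Sign.POS;
            case POS: return Sign.NEG;
            default:  return s;
        }
    }

    /** One pass over the certificate: check each step, no iteration. */
    static boolean check(List<Sign> certificate) {
        for (int i = 0; i + 1 < certificate.size(); i++) {
            Sign computed = negate(certificate.get(i));
            if (!leq(computed, certificate.get(i + 1))) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        // Certificate for: x := -x; x := -x; starting from POS.
        System.out.println(check(List.of(Sign.POS, Sign.NEG, Sign.POS))); // true
        System.out.println(check(List.of(Sign.POS, Sign.POS, Sign.POS))); // false
    }
}
```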

Relevance:

30.00%

Publisher:

Abstract:

In an increasing number of applications (e.g., in embedded, real-time, or mobile systems) it is important or even essential to ensure conformance with respect to a specification expressing resource usages, such as execution time, memory, energy, or user-defined resources. In previous work we presented a novel framework for data size-aware, static resource usage verification. Specifications can include both lower and upper bound resource usage functions. In order to statically check such specifications, both upper- and lower-bound resource usage functions (on input data sizes) approximating the actual resource usage of the program are automatically inferred and compared against the specification. The outcome of the static checking of assertions can express intervals of input data sizes such that a given specification can be proved for some intervals but disproved for others. After an overview of the approach, in this paper we provide a number of novel contributions: we present a full formalization, and we report on and provide results from an implementation within the Ciao/CiaoPP framework (which provides a general, unified platform for static and run-time verification, as well as unit testing). We also generalize the checking of assertions to allow preconditions expressing intervals within which the input data size of a program is supposed to lie (i.e., intervals for which each assertion is applicable), and we extend the class of resource usage functions that can be checked.
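
A numeric Java sketch of the interval-style outcome described above: an inferred upper-bound cost function is compared against an asserted bound over a range of input data sizes, and the sizes are partitioned into proved and disproved intervals. Both functions are invented examples, not output of the actual system.

```java
import java.util.function.IntToDoubleFunction;

// Numeric sketch of interval-based bound checking: compare an inferred
// upper-bound cost function against an asserted bound over input data
// sizes and report where the specification is proved or disproved.
// Both functions below are invented for illustration.
public class ResourceBoundCheck {
    public static void main(String[] args) {
        IntToDoubleFunction inferredUpper = n -> 3.0 * n * n + 10;  // inferred cost bound
        IntToDoubleFunction specUpper     = n -> 50.0 * n;          // asserted bound

        int intervalStart = 1;
        boolean prevOk = inferredUpper.applyAsDouble(1) <= specUpper.applyAsDouble(1);
        for (int n = 2; n <= 20; n++) {
            boolean ok = inferredUpper.applyAsDouble(n) <= specUpper.applyAsDouble(n);
            if (ok != prevOk) {   // close the current interval and open a new one
                System.out.printf("sizes [%d, %d]: %s%n", intervalStart, n - 1,
                                  prevOk ? "proved" : "disproved");
                intervalStart = n;
                prevOk = ok;
            }
        }
        // Prints: sizes [1, 16]: proved / sizes [17, 20]: disproved
        System.out.printf("sizes [%d, 20]: %s%n", intervalStart,
                          prevOk ? "proved" : "disproved");
    }
}
```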

Relevance:

30.00%

Publisher:

Abstract:

The results of empirical studies are limited to particular contexts, difficult to generalise, and the studies themselves are expensive to perform. Despite these problems, empirical studies in software engineering can be made effective, and they are important to both researchers and practitioners. The key to their effectiveness lies in maximising the information that can be gained by examining existing studies, conducting power analyses for an accurate minimum sample size, and benefiting from previous studies through replication. This approach was applied in a controlled experiment examining the combination of automated static analysis tools and code inspection in the context of verification and validation (V&V) of concurrent Java components. The combination of these V&V technologies was shown to be cost-effective despite the size of the study, thus contributing to research in V&V technology evaluation.
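
For context, the minimum sample size in such power analyses is typically obtained from the standard two-group formula below (textbook form; the study's actual calculation may differ).

```latex
% Textbook minimum sample size per group for detecting a mean difference
% \delta with standard deviation \sigma, significance level \alpha, and
% power 1 - \beta (the study's actual power analysis may differ):
\[
  n \;\ge\; \frac{2\,(z_{1-\alpha/2} + z_{1-\beta})^{2}\,\sigma^{2}}{\delta^{2}}
\]
% e.g., \alpha = 0.05 and power 0.8 with \sigma = \delta gives
% n \ge 2\,(1.96 + 0.84)^2 \approx 15.7, i.e., 16 subjects per group.
```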

Relevance:

30.00%

Publisher:

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time and resource consuming. The increasing number of software configurations, input parameters, usage scenarios, supporting platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, the same is not true of software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively, the sequence covers of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with a high likelihood of containing the root cause from the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace back the common subsequences from the end to the root cause. A debugging tool is created to enable developers to use the approach and to integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need only inspect a small number of lines in order to find the root cause of the fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm running time and the output subsequence length.
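
The core extraction step can be illustrated with a classic longest-common-subsequence computation over two failing traces, as in the Java sketch below; real sequence covers and the dissertation's exact algorithm are more elaborate.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the core step described above: extracting a
// longest common subsequence from two failing test execution traces
// (traces here are lists of executed statement IDs; invented example).
public class CommonSubsequence {
    /** Classic dynamic-programming LCS over two statement-ID traces. */
    static List<String> lcs(List<String> a, List<String> b) {
        int[][] len = new int[a.size() + 1][b.size() + 1];
        for (int i = 1; i <= a.size(); i++)
            for (int j = 1; j <= b.size(); j++)
                len[i][j] = a.get(i - 1).equals(b.get(j - 1))
                        ? len[i - 1][j - 1] + 1
                        : Math.max(len[i - 1][j], len[i][j - 1]);
        // Backtrack through the table to recover one LCS.
        List<String> out = new ArrayList<>();
        for (int i = a.size(), j = b.size(); i > 0 && j > 0; ) {
            if (a.get(i - 1).equals(b.get(j - 1))) { out.add(0, a.get(i - 1)); i--; j--; }
            else if (len[i - 1][j] >= len[i][j - 1]) i--;
            else j--;
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> fail1 = List.of("s1", "s4", "s7", "s9", "s12");
        List<String> fail2 = List.of("s1", "s3", "s7", "s9", "s11");
        // The shared subsequence is a candidate faulty execution path.
        System.out.println(lcs(fail1, fail2));   // [s1, s7, s9]
    }
}
```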

Relevance:

20.00%

Publisher:

Abstract:

We examine, in the imaginary-time formalism, the high temperature behavior of n-point thermal loops in static Yang-Mills and gravitational fields. We show that in this regime, any hard thermal loop gives the same leading contribution as the one obtained by evaluating the loop integral at zero external energies and momenta.

Relevance:

20.00%

Publisher:

Abstract:

Bacurau, RFP, Monteiro, GA, Ugrinowitsch, C, Tricoli, V, Cabral, LF, and Aoki, MS. Acute effect of a ballistic and a static stretching exercise bout on flexibility and maximal strength. J Strength Cond Res 23(1): 304-308, 2009. Different stretching techniques have been used during warm-up routines. However, these routines may decrease force production. The purpose of this study was to compare the acute effect of a ballistic and a static stretching protocol on lower-limb maximal strength. Fourteen physically active women (169.3 ± 8.2 cm; 64.9 ± 5.9 kg; 23.1 ± 3.6 years) performed three experimental sessions: a control session (estimation of 45° leg press one-repetition maximum [1RM]), a ballistic session (20 minutes of ballistic stretching and 45° leg press 1RM), and a static session (20 minutes of static stretching and 45° leg press 1RM). Maximal strength decreased after static stretching (213.2 ± 36.1 to 184.6 ± 28.9 kg) but was unaffected by ballistic stretching (208.4 ± 34.8 kg). In addition, static stretching exercises produced a greater acute improvement in flexibility than ballistic stretching exercises. Consequently, static stretching may not be recommended before athletic events or physical activities that require high levels of force. On the other hand, ballistic stretching could be more appropriate because it seems less likely to decrease maximal strength.

Relevance:

20.00%

Publisher:

Abstract:

Samogin Lopes, FA, Menegon, EM, Franchini, E, Tricoli, V, and de M. Bertuzzi, RC. Is acute static stretching able to reduce the time to exhaustion at power output corresponding to maximal oxygen uptake? J Strength Cond Res 24(6): 1650-1656, 2010. This study analyzed the effect of an acute static stretching bout on the time to exhaustion (Tlim) at the power output corresponding to V̇O2max. Eleven physically active male subjects (age 22.3 ± 2.8 years, V̇O2max 2.7 ± 0.5 L·min⁻¹) completed an incremental cycle ergometer test, 2 muscle strength tests, and 2 maximal tests to exhaustion at the power output corresponding to V̇O2max, with and without a previous static stretching bout. Tlim was not significantly affected by the static stretching (164 ± 28 vs. 150 ± 26 seconds with and without stretching, respectively, p = 0.09), but the time to reach V̇O2max (118 ± 22 vs. 102 ± 25 seconds), blood lactate accumulation immediately after exercise (10.7 ± 2.9 vs. 8.0 ± 1.7 mmol·L⁻¹), and oxygen deficit (2.4 ± 0.9 vs. 2.1 ± 0.7 L) were significantly reduced (p ≤ 0.02). Thus, an acute static stretching bout did not reduce Tlim at the power output corresponding to V̇O2max, possibly by accelerating aerobic metabolism activation at the beginning of exercise. These results suggest that coaches and practitioners involved with aerobic-dependent activities may use static stretching as part of their warm-up routines without fear of diminishing high-intensity aerobic exercise performance.

Relevance:

20.00%

Publisher:

Abstract:

The development of Nb₃Al and Nb₃Sn superconductors is of great interest to the applied superconductivity area. These intermetallic composites are normally obtained by heat treatment reactions at high temperature. Processes that allow formation of the superconducting phases at lower temperatures (<1000 °C), particularly for Nb₃Al, are of great interest. The present work studies the phase formation and stability of the Nb₃Al and Nb₃Sn superconducting phases using mechanical alloying (high energy ball milling). Our main objective was to form composites near stoichiometry, which could be transformed into the superconducting phases using low-temperature heat treatments. High purity Nb-Sn and Nb-Al powders were mixed to generate the required superconducting phases (Nb-25at.%Sn and Nb-25at.%Al) in an argon atmosphere glove-box. After milling in a Fritsch mill, the samples were compressed in a hydraulic uniaxial press and encapsulated in evacuated quartz tubes for heat treatment. The compressed and heat treated samples were characterized using X-ray diffractometry. Microstructure and chemical analysis were accomplished using scanning electron microscopy and energy dispersive spectrometry. Nb₃Al XRD peaks were observed after sintering at 800 °C for the sample milled for 30 h. Nb₃Sn XRD peaks could be observed even before the heat treatment.

Relevance:

20.00%

Publisher:

Abstract:

A new two-dimensionally mapped infinite boundary element (IBE) is presented. The formulation is based on a triangular boundary element (BE) with linear shape functions instead of the quadrilateral IBEs usually found in the literature. The infinite solids analyzed are assumed to be three-dimensional, linear-elastic, and isotropic, and Kelvin fundamental solutions are employed. One advantage of the proposed formulation over quadratic or higher-order elements is that no additional degrees of freedom are added to the original BE mesh by the presence of the IBEs. Thus, the IBEs allow the mesh to be reduced without compromising the accuracy of the result. Two examples are presented, in which the numerical results show good agreement with those of authors using quadrilateral IBEs and with analytical solutions.
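
For orientation, the linear shape functions on the parent triangle take the standard form below, and mapped infinite elements typically append a decay mapping toward the unbounded direction; these are generic textbook forms, and the paper's exact mapping may differ.

```latex
% Standard linear shape functions on the parent triangle (generic forms;
% the paper's mapped infinite element modifies the interpolation toward
% the unbounded direction):
\[
  N_1(\xi,\eta) = 1 - \xi - \eta, \qquad
  N_2(\xi,\eta) = \xi, \qquad
  N_3(\xi,\eta) = \eta,
\]
% with a typical one-dimensional decay mapping toward infinity of the form
\[
  x(\xi) = x_0 + \frac{\xi}{1-\xi}\,(x_1 - x_0), \qquad \xi \in [0,1),
\]
% which sends \xi \to 1 to x \to \infty (generic mapped-infinite-element
% construction; the exact mapping used in the paper may differ).
```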

Relevance:

20.00%

Publisher:

Abstract:

This paper presents results of a verification test of a Direct Numerical Simulation code of mixed high order of accuracy using the method of manufactured solutions (MMS). The test is based on the formulation of an analytical solution for the Navier-Stokes equations modified by the addition of a source term. The present numerical code was aimed at simulating the temporal evolution of instability waves in a plane Poiseuille flow. The governing equations were solved in a vorticity-velocity formulation for a two-dimensional incompressible flow. The code employed two different numerical schemes. One used mixed high-order compact and non-compact finite differences from fourth to sixth order of accuracy. The other scheme used spectral methods instead of finite-difference methods for the streamwise direction, which was periodic. In the present test, particular attention was paid to the boundary conditions of the physical problem of interest. Indeed, the verification procedure using MMS can be more demanding than the often-used comparison with Linear Stability Theory, particularly because the latter test pays no attention to the nonlinear terms. For the present verification test, it was possible to manufacture an analytical solution that reproduced some aspects of an instability wave in a nonlinear stage. Although the results of the verification by MMS for this mixed-order numerical scheme had to be interpreted with care, the test was very useful, as it gave confidence that the code was free of programming errors.
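
Schematically, the MMS procedure used here can be summarized as follows (generic form of the method, with h the grid spacing and p the formal order of the scheme).

```latex
% Schematic form of the method of manufactured solutions: pick an
% analytical field \hat{u}, push it through the governing operator N to
% manufacture a source term, then verify that the code recovers \hat{u}
% at the expected order of accuracy.
\[
  s(\mathbf{x}, t) \;=\; N\!\left(\hat{u}(\mathbf{x}, t)\right), \qquad
  \text{solve } N(u) = s, \qquad
  \|u_h - \hat{u}\| \;=\; \mathcal{O}(h^{p}),
\]
% where h is the grid spacing and p the formal order of the scheme
% (fourth to sixth order for the finite-difference operators above).
```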