355 results for Correctness
Abstract:
Using formal methods, a developer can increase software's trustworthiness and correctness and, furthermore, concentrate on the functional requirements of the software. However, there is much resistance to adopting this software development approach. The main reason is the scarcity of adequate, easy-to-use, and useful tools. Developers typically write code and test it. These tests usually consist of executing the program and checking its output against its requirements; this, however, is not always an exhaustive discipline. Using formal methods, by contrast, one may be able to investigate the system's properties further. Unfortunately, specification languages do not always have tools such as animators or simulators, and sometimes there are no friendly graphical user interfaces. On the other hand, specification languages usually have a compiler, which normally generates a Labeled Transition System (LTS). This work proposes an application that provides graphical animation for formal specifications using the LTS as input. The application initially supports the languages B, CSP, and Z; however, given an LTS in a specified XML format, it is possible to animate further languages. Additionally, the tool provides trace visualization, showing the choices the user made as a graphical tree. The intention is to improve the comprehension of a specification by providing information about errors and by animating it, as developers do for programming languages such as Java and C++.
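The core idea, driving an animation from a language-independent LTS, can be illustrated with a minimal sketch. The XML schema and the vending-machine example below are illustrative assumptions; the abstract does not specify the tool's actual format.

```python
# Hypothetical sketch: stepping through an LTS supplied in a simple XML format.
# The schema (<lts initial=...> with <transition from/label/to> children) is an
# assumption for illustration, not the tool's specified format.
import xml.etree.ElementTree as ET

LTS_XML = """
<lts initial="s0">
  <transition from="s0" label="insert_coin" to="s1"/>
  <transition from="s1" label="dispense" to="s0"/>
</lts>
"""

def load_lts(xml_text):
    """Parse the XML into (initial state, {state: {label: next_state}})."""
    root = ET.fromstring(xml_text)
    transitions = {}
    for t in root.findall("transition"):
        transitions.setdefault(t.get("from"), {})[t.get("label")] = t.get("to")
    return root.get("initial"), transitions

def animate(initial, transitions, trace):
    """Follow a user-chosen trace of events, as a graphical animator would,
    returning the visited states (a KeyError means the event is not enabled)."""
    state = initial
    visited = [state]
    for label in trace:
        state = transitions[state][label]
        visited.append(state)
    return visited

initial, transitions = load_lts(LTS_XML)
print(animate(initial, transitions, ["insert_coin", "dispense"]))  # ['s0', 's1', 's0']
```

Because the animator only ever consults the state/label/next-state map, any specification language whose compiler emits such an LTS can be animated the same way.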
Abstract:
Tracing between the models of the requirements and architecture activities is a strategy that aims to prevent loss of information, reducing the gap between these two initial activities of the software life cycle. In the context of Software Product Lines (SPL), it is important to have this support, which allows the correspondence between these two activities with management of variability. In order to address this issue, this paper presents a bidirectional mapping process, defining transformation rules between elements of a goal-oriented requirements model (described in PL-AOVgraph) and elements of an architectural description (defined in PL-AspectualACME). These mapping rules are evaluated using a case study: the GingaForAll SPL. To automate this transformation, we developed the MaRiPLA tool (Mapping Requirements to Product Line Architecture) using MDD (Model-driven Development) techniques, including the Atlas Transformation Language (ATL) with Ecore metamodels specified jointly with Xtext, a DSL definition framework, and Acceleo, a code generation tool, in the Eclipse environment. Finally, the generated models are evaluated based on quality attributes such as variability, derivability, reusability, correctness, traceability, completeness, evolvability, and maintainability, extracted from the CAFÉ Quality Model.
Abstract:
PLCs (Programmable Logic Controllers) perform control operations, receiving information from the environment, processing it, and modifying that same environment according to the results produced. They are commonly used in industry in several applications, from mass transport to the petroleum industry. As the complexity of these applications increases, and as many of them are safety-critical, the need arises to ensure that they are reliable. Testing and simulation are the de facto methods used in industry to do so, but they can leave flaws undiscovered. Formal methods can provide more confidence in an application's safety, since they permit its mathematical verification. We make use of the B Method, which has been successfully applied in the formal verification of industrial systems, is supported by several tools, and can handle decomposition, refinement, and verification of correctness with respect to the specification. The method we developed and present in this work automatically generates B models from PLC programs and verifies them against safety constraints manually derived from the system requirements. The scope of our method is the PLC programming languages presented in the IEC 61131-3 standard, although we are also able to verify programs not fully compliant with the standard. Our approach aims to ease the integration of formal methods in industry by reducing the effort required to perform formal verification of PLCs.
Abstract:
The widespread growth in the use of smart cards (by banks, transport services, cell phones, etc.) has brought out an important fact that must be addressed: the need for tools that can be used to verify such cards, so as to guarantee the correctness of their software. As the vast majority of cards being developed nowadays use JavaCard technology as their software layer, the use of the Java Modeling Language (JML) to specify their programs appears as a natural solution. JML is a formal language tailored to Java. It was inspired by methodologies from Larch and Eiffel and has been widely adopted as the de facto language for the specification of any Java-related program. Various tools that make use of JML have already been developed, covering a wide range of functionalities, such as runtime and static checking. But the static-checking tools that exist so far are not fully automated, and those that are do not offer an adequate level of soundness and completeness. Our objective is to contribute a series of techniques that can be used to accomplish fully automated and confident verification of JavaCard applets, and in this work we present the first steps toward that goal. Using a software platform comprising Krakatoa, Why, and haRVey, we developed a set of techniques to reduce the size of the theory necessary to verify the specifications. These techniques have yielded very good results, with gains of almost 100% in all tested cases, and have proved valuable not only here but in most real-world problems related to automatic verification.
Abstract:
The area between the Galinhos and São Bento do Norte beaches, located on the northern coast of the state of Rio Grande do Norte, is subject to intense and constant littoral and aeolian transport processes, causing erosion, alterations in the sediment balance, and modifications of the shoreline. Beyond these natural factors, human interference is considerable in the surroundings due to the nearby Guamaré Petroliferous Pole, the largest onshore oil producer in Brazil. Given all these characteristics, the MAMBMARE and MARPETRO projects were organized with the main objective of carrying out geo-environmental monitoring of coastal areas in the northern portion of RN. There is a large volume of data from the study area, such as multitemporal geologic and geophysical data, hydrodynamic measurements, multitemporal remote sensing images, and thematic maps, among others; it is therefore of extreme importance to elaborate a Geographic Database (GD), one of the main components of a Geographic Information System (GIS), to store this information and allow access by researchers and users. The first part of this work consisted of elaborating a GD to store the data of the area between the towns of Galinhos and São Bento do Norte. The main goal was to use the potential of the GIS as a decision-support tool in the environmental monitoring of this region, a valuable target for oil exploration, salt companies, and shrimp farms. The collected data were stored as a virtual library to assist decision making, based on the results presented as digital thematic maps, tables, and reports, useful as a source of data for preventive planning and as guidelines for future research in both the regional and local contexts. The second stage of this work consisted of elaborating the Oil-Spill Environmental Sensitivity Maps.
These maps, based on the Environmental Sensitivity Index Maps for Oil Spills developed by the Ministry of the Environment, are cartographic products that supply complete information for decision making, contingency planning, and assessment in the event of an oil spill in any area. They represent the sensitivity of the areas to oil spills through basic data such as geology, geomorphology, oceanography, socio-economics, and biology. Parameters such as hydrodynamic data, sampling data, coastal type, the declivity of the beach face, the types of resources at risk (biological, economic, human, or cultural), and the land use of the area are some of the essential information used in the elaboration of environmental sensitivity maps. Using the available data, it was possible to develop sensitivity maps of the study area on different dates (June 2000 and December 2000) and to perceive a difference in the sensitivity indices generated. The area in December proved more sensitive to oil than in June, because the hydrodynamic data (wave and tide energy) allowed faster natural cleaning in June. The use of GIS for sensitivity mapping proved to be a powerful tool, since it was possible to manipulate geographic data correctly and to elaborate more accurate maps with a higher level of detail for the study area, which presented a medium index (3 to 4) along the shoreline and a high index (10) in the mangrove areas, which are highly vulnerable to oil spills.
Abstract:
The objective of this study was to apprehend the social representations of postpartum women regarding health care during the prenatal, delivery, and postpartum periods, in a regional context of public health services in the interior of the state of São Paulo. Following a qualitative research approach, data were collected through semi-structured interviews conducted in 2004 and organized according to the Collective Subject Discourse method, with the Programa de Humanização do Pré-natal e Nascimento (PHPN, Brazil's Program for the Humanization of Prenatal Care and Birth) as the theoretical framework for discussing the results. The women's perspective on health care during the pregnancy-postpartum cycle showed the importance of interpersonal relations, the essential role of the technical quality of care, and the aptness of the perception that the subject of care is the woman, who as such must participate effectively in it. We conclude that the PHPN guidelines should be incorporated more broadly into health practices directed at women, and we recommend the adoption of specific indicators to evaluate the dimensions of care evidenced by this study.
Abstract:
The element-free Galerkin method (EFGM) is a very attractive technique for the solution of partial differential equations, since it makes use of nodal point configurations which do not require a mesh. It therefore differs from FEM-like approaches by avoiding the need for meshing, a very demanding task for problems with complicated geometry. However, the imposition of boundary conditions is not straightforward, since the EFGM is based on moving-least-squares (MLS) approximations, which are not necessarily interpolants. This feature requires, for instance, the introduction of modified functionals with additional unknown parameters such as Lagrange multipliers, a serious drawback which leads to poor conditioning of the matrix equations. In this paper, an interpolatory formulation for MLS approximants is presented: it allows the direct introduction of boundary conditions, reducing the processing time and improving the condition numbers. The formulation is applied to the study of two-dimensional magnetohydrodynamic flow problems, and the computed results confirm the accuracy and correctness of the proposed formulation. (C) 2002 Elsevier B.V. All rights reserved.
Abstract:
In the past few years, vehicular ad hoc networks (VANETs) have been studied extensively by researchers. VANETs are a type of P2P network, though they have some distinct characteristics (fast movement, short-lived connections, etc.). In this paper, we present several limitations of current trust management schemes in VANETs and propose ways to counter them. We first review several trust management techniques in VANETs and argue that the ephemeral nature of VANETs renders them useless in practical situations. We identify that the problems of information cascading and oversampling, which commonly arise in social networks, also adversely affect trust management schemes in VANETs. To the best of our knowledge, we are the first to introduce information cascading and oversampling to VANETs. We show that simple voting for decision making leads to oversampling and gives incorrect results in VANETs. To overcome this problem, we propose a novel voting scheme in which each vehicle has a different voting weight according to its distance from the event: a vehicle closer to the event possesses a higher weight. Simulations show that our proposed algorithm performs better than simple voting, increasing the correctness of voting. © 2012 Springer Science + Business Media, LLC.
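The contrast between simple and distance-weighted voting can be sketched as follows. The inverse-distance weight and the report format are illustrative assumptions, not the authors' exact scheme:

```python
# Hypothetical sketch of distance-weighted voting on a VANET event report.
# Each report is (distance_to_event_in_m, vehicle_says_event_happened).
# The 1/(1+d) weight is an illustrative choice, not the paper's exact formula.

def simple_vote(reports):
    """Majority vote: every vehicle counts equally."""
    yes = sum(1 for _, says_event in reports if says_event)
    return yes > len(reports) / 2

def weighted_vote(reports):
    """Each vehicle's weight decays with its distance from the event."""
    yes = no = 0.0
    for distance_m, says_event in reports:
        w = 1.0 / (1.0 + distance_m)  # closer vehicles weigh more
        if says_event:
            yes += w
        else:
            no += w
    return yes > no

# Three distant vehicles relaying a cascaded (oversampled) "no" outvote one
# nearby direct observer under simple voting, but not under weighted voting.
reports = [(5.0, True), (400.0, False), (420.0, False), (450.0, False)]
print(simple_vote(reports))    # False
print(weighted_vote(reports))  # True
```

The example shows the failure mode the abstract describes: when distant vehicles merely echo a cascaded opinion, counting heads oversamples that opinion, while weighting by proximity lets the direct observer dominate.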
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Graduate Program in Information Science (Pós-graduação em Ciência da Informação) - FFC
Abstract:
Difficulties presented by deaf children in learning mathematics have led educators to develop special teaching procedures. The equivalence paradigm has been useful in explaining complex behaviors, such as numerical conceptual behaviors. An expansion of this paradigm involves the formation of classes of equivalent stimuli in sequence. The emergence of new relations through ordinal responding has already been documented in studies with three-term reinforcement contingencies. It remains to be verified whether these results stay stable under four- and five-term contingencies. Three experiments were designed to investigate, in deaf children, the emergence of ordinal relations under simple discriminative control, under conditional control (with and without trial randomization), and under contextual control. Experiment 1 involved five deaf children enrolled in a specialized public school. A microcomputer running dedicated software (REL 4.0, used in Experiments 1, 2a, and 2b and updated to version 5.0 in Experiment 3) was employed. In this study, participants were taught sequences of overlapping stimulus pairs, after which transitivity and connectivity tests were carried out. All participants reached the accuracy criterion and responded to the tests. The results replicated studies in the literature, confirming the efficiency of the stimulus-overlap teaching procedure in establishing ordinal relations. In Experiment 2a, four new participants and one with an experimental history were taught to select stimuli, in pairs, in ascending order in the presence of the color green and in descending order in the presence of the color red. Transitivity and connectivity tests under conditional control were applied, followed by a generalization test with stimuli from the school environment.
All participants reached the accuracy criterion and responded to the transitivity and connectivity tests. In the generalization tests, three participants responded consistently to the new stimuli, one responded partially, and one did not respond to the test. The results corroborated the efficiency of the stimulus-overlap teaching procedure under conditional control in deaf children. Experiment 2b involved the same participants from Experiment 1, with an experimental history, and the randomization of the trials with the conditional stimuli. All participants reached the accuracy criterion, and in the tests they demonstrated responding consistent with the baseline. Experiment 3 involved three children from Experiment 1 and two from Experiment 2a, who were exposed to the overlapping-pairs teaching procedure under the contextual control of two shapes, circle and triangle, and under the conditional control of the colors green and red (e.g., A1A2 in the presence of the circle and the color green, or A2A1 in the presence of the circle and the color red). All participants reached the accuracy criterion and responded to the transitivity and connectivity tests. The results indicated the efficiency of the stimulus-overlap teaching procedure under contextual control, suggesting that teaching through simple reinforcement contingencies and under conditional control was a prerequisite for the emergence of ordinal classes under contextual control. An extension of this study should increase the number of members in the sequence, investigate the emergence of new ordinal relations with longer sequences, and verify whether the order in which the sequence is taught affects ordinal responding.
Abstract:
This work presents the results of propagation-channel modeling based on multivariate time series, using data collected in measurement campaigns together with the main urbanization characteristics of eleven streets in the downtown area of Belém-PA. Transfer-function models were used to evaluate effects on the time series of received signal power (dBm), which served as the response variable, with building heights and the distances between buildings as explanatory variables. Since time-series models disregard possible correlations between neighboring samples, a geostatistical model was used to correct the error of this model. This phase of the work consisted of the set of procedures required by geostatistical techniques, aiming at a two-dimensional analysis of spatially distributed data, namely the interpolation of surfaces generated from the georeferenced samples of the received-signal-power residuals computed with the time-series model. The results obtained with the proposed model show good performance, with a mean squared error on the order of 0.33 dB relative to the measured signal, considering the data from the eleven streets of the downtown area of Belém-PA. From the spatial distribution map of the received signal power (dBm), one can easily identify the zones that are under- or over-dimensioned in terms of this variable, that is, favored or disadvantaged with respect to signal reception, which may lead the local operator (mobile telephone carrier) to invest more in the regions where the signal is weak.
Abstract:
The article investigates complex impulsive systems in which the so-called controlling-system jump effect emerges. In particular, this research addresses the correctness of the solution to the impulsive control system and approximation lemmas. A 3D model example is provided that illustrates the relevance of the considered approach to the study of complex impulsive systems.
Abstract:
In this work we compute the one-nucleon-induced nonmesonic hypernuclear decay rates of He-5(Lambda), C-12(Lambda) and C-13(Lambda) using a formalism based on the independent particle shell model in terms of laboratory coordinates. To ascertain the correctness and precision of the method, these results are compared with those obtained using a formalism in terms of center-of-mass coordinates, which has been previously reported in the literature. The formalism in terms of laboratory coordinates will be useful in the shell-model approach to two-nucleon-induced transitions.
Abstract:
The algorithm creates a buffer area around the cartographic features of interest in one of the images and compares it with the other image. During the comparison, the algorithm counts the equal and different points and uses these counts to calculate the statistical values of the analysis. One calculated statistical value is the correctness, which shows the user the percentage of points that were correctly extracted. Another is the completeness, which shows the percentage of points that really belong to the feature of interest. The third value expresses the quality achieved by the extraction method, since the algorithm calculates the quality from the previously computed correctness and completeness. In all the tests performed, the calculated statistical values could be used to represent quantitatively the quality achieved by the extraction method. It is therefore possible to say that the developed algorithm can be used to analyze extraction methods for cartographic features of interest, since the results obtained were promising.
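The buffer comparison and the three statistics can be sketched as below. The metric formulas follow the standard completeness/correctness/quality definitions used in feature-extraction evaluation; treating them as the thesis's exact formulas is an assumption, and the point-based buffer test is a simplification of the raster comparison described above.

```python
# Sketch of buffer-based evaluation of an extracted cartographic feature.
# ASSUMPTION: correctness = TP/(TP+FP), completeness = TP/(TP+FN),
# quality = TP/(TP+FP+FN), the standard definitions in extraction evaluation.
import math

def within_buffer(p, points, radius):
    """True if point p lies within `radius` of any point in `points`."""
    return any(math.dist(p, q) <= radius for q in points)

def evaluate(extracted, reference, radius=1.0):
    tp = sum(1 for p in extracted if within_buffer(p, reference, radius))      # matched
    fp = len(extracted) - tp                                                   # extracted but wrong
    fn = sum(1 for r in reference if not within_buffer(r, extracted, radius))  # missed
    correctness = tp / (tp + fp)   # share of extracted points that are correct
    completeness = tp / (tp + fn)  # share of reference points recovered
    quality = tp / (tp + fp + fn)  # single figure combining both
    return correctness, completeness, quality

extracted = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]  # two good points, one false alarm
reference = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]  # one reference point is missed
print(evaluate(extracted, reference, radius=0.5))  # (0.666..., 0.666..., 0.5)
```

Note that quality is always the strictest of the three values, since it is penalized by both false alarms and misses.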