846 results for "Automatic detection"
Abstract:
Through the adoption of the software product line (SPL) approach, several benefits are achieved compared to conventional development processes based on creating a single software system at a time. Developing an SPL differs from traditional software construction in that it has two essential phases: domain engineering, when the common and variable elements of the SPL are defined and implemented; and application engineering, when one or more applications (specific products) are derived by reusing the artifacts created in domain engineering. The testing activity is also fundamental and aims to detect defects in the artifacts produced during SPL development. However, the characteristics of an SPL bring new challenges to this activity that must be considered. Several approaches have recently been proposed for the product line testing process, but they have proven limited and provide only general guidelines. In addition, there is a lack of tools to support variability management and the customization of automated test cases for SPLs. In this context, this dissertation proposes a systematic approach to software product line testing. The approach offers: (i) automated SPL test strategies to be applied in domain and application engineering; (ii) explicit guidelines to support the implementation and reuse of automated test cases at the unit, integration, and system levels in domain and application engineering; and (iii) tool support for automating variability management and the customization of test cases. The approach is evaluated through its application to a software product line for web systems. The results show that the proposed approach can help developers deal with the challenges imposed by the characteristics of SPLs during the testing process.
Abstract:
Removing inconsistencies from a project is less expensive when done in the early stages of design. The use of formal methods improves the understanding of systems and offers various techniques, such as formal specification and verification, to identify these problems in the initial stages of a project. However, the transformation of a formal specification into a programming language is a non-trivial and error-prone task, especially when done manually. Tool support at this stage can bring great benefits to the final product. This work proposes the extension of a tool whose focus is the automatic translation of specifications written in CSPM into Handel-C. CSP is a formal description language suitable for concurrent systems, and CSPM is the notation used by its support tools. Handel-C is a programming language whose output can be compiled directly into FPGAs. Our extension increases the number of CSPM operators accepted by the tool, allowing the user to define local processes, to rename channels in a process, and to use Boolean guards on external choices. In addition, we propose the implementation of a communication protocol that eliminates some restrictions on the parallel composition of processes in the translation into Handel-C, allowing communication on the same channel between multiple processes to be mapped consistently and ensuring that improper communication on a channel, i.e., communication not allowed by the system specification, does not occur in the generated code.
Abstract:
Typically, Web services contain only syntactic information describing their interfaces. Due to this lack of semantic descriptions, service composition becomes a difficult task. To solve this problem, Web services can exploit ontologies for the semantic definition of a service's interface, thus facilitating the automation of service discovery, publication, mediation, invocation, and composition. However, ontology languages such as OWL-S have constructs that are not easy to understand, even for Web developers, and the existing tools that support their use involve many details that make them difficult to manipulate. This work presents an MDD tool called AutoWebS (Automatic Generation of Semantic Web Services) for developing OWL-S semantic Web services. AutoWebS uses an approach based on UML profiles and model transformations for the automatic generation of Web services and their semantic descriptions. AutoWebS offers an environment that provides many of the features required to model, implement, compile, and deploy semantic Web services.
An approach to verifying exceptional behavior based on design rules and tests
Abstract:
Checking the conformity between a system's implementation and its design rules is an important activity for ensuring that no degradation occurs between the architectural patterns defined for the system and what is actually implemented in the source code. Especially for systems that require a high level of reliability, it is important to define specific design rules for exceptional behavior. Such rules describe how exceptions should flow through the system, defining which elements are responsible for catching exceptions thrown by other system elements. However, current approaches for automatically checking design rules do not provide suitable mechanisms to define and verify rules related to an application's exception handling policy. This work proposes a practical approach to preserving the exceptional behavior of an application or family of applications, based on the definition and automatic runtime checking of design rules for exception handling in systems developed in Java or AspectJ. To support this approach, a tool called VITTAE (Verification and Information Tool to Analyze Exceptions) was developed in the context of this work; it extends the JUnit framework and automates test activities for exceptional design rules. We conducted a case study whose primary objective was to evaluate the effectiveness of the proposed approach on a software product line. In addition, an experiment was conducted to compare the proposed approach with one based on a tool called JUnitE, which also proposes testing exception handling code using JUnit tests. The results showed how exception handling design rules evolve across different versions of a system and that VITTAE can aid in detecting defects in exception handling code.
Abstract:
The widespread growth in the use of smart cards (by banks, transport services, cell phone operators, etc.) has brought up an important fact that must be addressed: the need for tools that can be used to verify such cards, so as to guarantee the correctness of their software. As the vast majority of cards being developed nowadays use JavaCard technology as their software layer, the use of the Java Modeling Language (JML) to specify their programs appears as a natural solution. JML is a formal language tailored to Java. It was inspired by methodologies from Larch and Eiffel and has been widely adopted as the de facto language for the specification of Java programs. Various tools that make use of JML have already been developed, covering a wide range of functionalities, such as runtime and static checking. But the existing tools for static checking are not fully automated, and those that are do not offer an adequate level of soundness and completeness. Our objective is to contribute a set of techniques that can be used to accomplish fully automated and trustworthy verification of JavaCard applets. In this work we present the first steps toward this goal. Using a software platform comprising Krakatoa, Why, and haRVey, we developed a set of techniques to reduce the size of the theory necessary to verify the specifications. These techniques have yielded very good results, with gains of almost 100% in all tested cases, and have proved valuable not only for this problem but for most real-world problems related to automatic verification.
Abstract:
Web services are software units that allow access to one or more resources, supporting the deployment of business processes on the Web. They use well-defined interfaces based on standard Web protocols, making communication between entities implemented on different platforms possible. Due to these features, Web services can be integrated into service compositions to form more robust, loosely coupled applications. Web services are subject to failures, unwanted situations that may compromise the business process partially or completely. Failures can occur both in the design and in the execution of compositions. As a result, it is essential to create mechanisms to make the execution of service compositions more robust and to handle failures. Specifically, we propose support for fault recovery in service compositions described in the PEWS language and executed on PEWS-AM, a graph reduction machine. To support fault recovery on PEWS-AM, we extend the PEWS language specification and adapt the rules for graph translation and reduction in this machine. These contributions were made both at the abstract machine model level and at the implementation level.
Abstract:
In February 2011, the Brazilian National Agency of Petroleum, Natural Gas and Biofuels (ANP) published a new Technical Regulation for Onshore Pipelines for the Transport of Petroleum, its Derivatives and Natural Gas (RTDT). Among other things, the RTDT made the use of monitoring and leak detection systems compulsory for all onshore pipelines in the country. This document presents a study of a leak detection method based on pressure transients. The study was conducted on an industrial pipeline, 16" in diameter and 9.8 km long. The pipeline is fully pressurized and carries a multiphase mixture of crude oil, water, and natural gas. For the study, an infrastructure for data acquisition and for the validation of detection algorithms was built. The system was designed with a SCADA architecture. Piezoresistive sensors were installed at the ends of the pipeline, and Digital Signal Processors (DSPs) were used for sampling, storing, and processing the data. The study was based on simulating leaks through valves and searching for patterns that characterize the occurrence of such phenomena.
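As a minimal illustration of the pressure-transient idea described above, a leak launches a negative pressure wave that shows up as an abrupt drop in the sensor signal. The sketch below flags such drops with a sliding-window comparison; the window size, threshold, and function name are illustrative assumptions, not the dissertation's actual DSP algorithm.

```python
def detect_transient(pressure, window=5, drop_threshold=0.5):
    """Return indices where the pressure falls by more than drop_threshold
    over `window` consecutive samples, a simple signature of the negative
    pressure wave produced by a leak. Parameters are illustrative."""
    alarms = []
    for i in range(window, len(pressure)):
        # Compare the current sample against the one `window` steps earlier.
        if pressure[i - window] - pressure[i] > drop_threshold:
            alarms.append(i)
    return alarms
```

In practice the threshold would be tuned against the simulated valve leaks mentioned in the study, trading false alarms against detection delay.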
Abstract:
This work presents contributions to the detection and identification of faults in multilevel inverters through the study of the converters' behavior under these operating conditions. Basically, the fault considered is an open circuit in any switch of a three-level diode-clamped inverter. The converter operation is characterized in the pre- and post-fault states. A waveform analysis of the pole voltage, phase current, and dc-bus current is also carried out, highlighting characteristics that allow fault detection and even, under favorable conditions, identification of the faulty device. A compensation strategy for the considered fault (open switch) is also investigated, with the purpose of keeping the drive system operational when a failure occurs. The proposed topology uses SCRs in parallel with the inner switches of the inverter, which allows, on some occasions, full utilization of the dc bus.
Abstract:
In this work, we propose a two-stage algorithm for real-time fault detection and identification in industrial plants. Our proposal is based on the analysis of selected features using recursive density estimation and a new evolving classifier algorithm. More specifically, the proposed approach for the detection stage is based on the concept of density in the data space, which is not the same as a probability density function but is a very useful measure for abnormality/outlier detection. This density can be expressed by a Cauchy function and calculated recursively, which makes it memory- and computation-efficient and, therefore, suitable for on-line applications. The identification/diagnosis stage is based on a self-developing (evolving) fuzzy rule-based classifier system proposed in this work, called AutoClass. An important property of AutoClass is that it can start learning "from scratch". Not only do the fuzzy rules not need to be prespecified, but neither does the number of classes (which may grow, with new class labels being added by the on-line learning process), in a fully unsupervised manner. If an initial rule base exists, AutoClass can evolve it further based on newly arriving fault-state data. To validate our proposal, we present experimental results from a didactic level-control process, where control and error signals are used as features for the fault detection and identification systems; the approach, however, is generic, and the number of features can be large thanks to the computationally lean methodology, since covariance matrices, more complex calculations, and the storage of old data are not required. The results obtained are significantly better than those of the traditional approaches used for comparison.
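The recursive, Cauchy-type density mentioned above can be sketched as follows. Only the running mean and the running mean of the squared norm are stored, so each sample is processed in constant time and memory. The class name and update formulas are a standard formulation of recursive density estimation, given here as an assumption about the abstract's method rather than the dissertation's exact code.

```python
import numpy as np

class RecursiveDensityEstimator:
    """On-line density for abnormality detection: density near 1 for points
    close to the bulk of the data, and small for outliers."""

    def __init__(self, dim):
        self.k = 0                  # number of samples seen so far
        self.mean = np.zeros(dim)   # recursive mean of the samples
        self.scalar = 0.0           # recursive mean of ||x||^2

    def update(self, x):
        """Ingest sample x and return its Cauchy-type density in (0, 1]."""
        x = np.asarray(x, dtype=float)
        self.k += 1
        w = (self.k - 1) / self.k
        self.mean = w * self.mean + x / self.k
        self.scalar = w * self.scalar + float(x @ x) / self.k
        dist2 = float((x - self.mean) @ (x - self.mean))
        var = self.scalar - float(self.mean @ self.mean)
        return 1.0 / (1.0 + dist2 + var)
```

A sample identical to everything seen so far scores density 1, while a distant sample scores close to 0, which is the abnormality signal used by the detection stage.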
Abstract:
The objective is to establish a methodology for monitoring oil spills on the sea surface in the Submerged Exploration Area of the Guamaré Pole region, in the State of Rio Grande do Norte, using orbital Synthetic Aperture Radar (SAR) images integrated with meteo-oceanographic products. This methodology was applied in the following stages: (1) creation of a base map of the Exploration Area; (2) processing of NOAA/AVHRR and ERS-2 images to generate meteo-oceanographic products; (3) processing of RADARSAT-1 images for oil spill monitoring; (4) integration of RADARSAT-1 images with NOAA/AVHRR and ERS-2 image products; and (5) structuring of a database. The integration of the RADARSAT-1 image of the Potiguar Basin of 21 May 1999 with the base map of the Exploration Area of the Guamaré Pole region, used to identify the probable sources of the oil slicks, was successful in detecting the probable oil slick near the outlet of the submarine outfall in the Exploration Area. To support the integration of RADARSAT-1 images with NOAA/AVHRR and ERS-2 image products, a methodology was developed for classifying the oil spills identified in RADARSAT-1 images. For this, the following unsupervised classification algorithms were tested: K-means, fuzzy K-means, and Isodata. These algorithms are part of the PCI Geomatics software, which was also used for filtering the RADARSAT-1 images. To validate the results, the oil spills submitted to unsupervised classification were compared with the results of the Semivariogram Textural Classifier (STC). This classifier was developed especially for oil spill classification and requires the PCI software for the whole processing of RADARSAT-1 images. Finally, the classification results were analyzed through visual analysis, size-proportion calculation, and statistical analysis.
Among the three classification algorithms tested, no significant differences were observed relative to the spills classified with the STC, in any of the analyses considered. Therefore, considering all the procedures, the described methodology can be successfully applied using the tested unsupervised classifiers, reducing the time needed to identify and classify oil spills compared with the use of the STC classifier.
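To make the K-means step concrete: oil slicks appear as dark patches (low backscatter) against the brighter sea surface in SAR images, so even a one-dimensional clustering of pixel intensities separates the two. The sketch below, on synthetic intensities, is an illustrative assumption; the study itself used the PCI Geomatics implementations on RADARSAT-1 scenes.

```python
import numpy as np

def kmeans_1d(pixels, k=2, iters=20, seed=0):
    """Cluster pixel intensities into k classes with plain K-means.
    Returns (labels, centers); parameters are illustrative."""
    rng = np.random.default_rng(seed)
    pixels = np.asarray(pixels, dtype=float)
    # Initialize centers with k distinct pixel values.
    centers = rng.choice(pixels, size=k, replace=False)
    for _ in range(iters):
        # Assign each pixel to its nearest center.
        labels = np.argmin(np.abs(pixels[:, None] - centers[None, :]), axis=1)
        # Move each center to the mean of its assigned pixels.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean()
    return labels, centers
```

With k=2 the darker cluster is the spill candidate; fuzzy K-means and Isodata differ mainly in soft memberships and in splitting/merging clusters, respectively.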
Abstract:
The northern portion of Rio Grande do Norte State is characterized by intense coastal dynamics affecting areas with ecosystems of moderate to high environmental sensitivity. The main socioeconomic activities of the state are installed in this region: the salt industry, shrimp farming, fruit growing, and the oil industry. The oil industry suffers the effects of coastal dynamics, which cause problems such as erosion and the exposure of wells and pipelines along the shore. This motivated the study of such changes, seeking to understand the processes that cause environmental impacts in order to detect and assess the areas most vulnerable to variation. Coastal areas under the influence of the oil industry are highly vulnerable and sensitive in the event of accidents involving oil spills in their vicinity. Therefore, geoenvironmental monitoring of the region was established with the aim of evaluating the evolution of the entire coastal area and checking the sensitivity of each site to the presence of oil. The goal of this work was the implementation of a computer system that meets the needs of inserting and visualizing thematic maps for the generation of Environmental Vulnerability maps, using Business Intelligence (BI) techniques on vector information previously stored in the database. The fundamental design goal was to implement a scalable system that serves diverse fields of study and is suitable for generating vulnerability maps online, automating the methodology so as to facilitate data manipulation and provide fast results for real-time operational decision-making. To develop the geographic database, a conceptual model of the selected data was created, and the Web system was implemented using the PostgreSQL database system, its spatial extension PostGIS, the Glassfish web server, and GeoServer for displaying maps on the Web.
Abstract:
This study presents the results of an analysis of areas susceptible to degradation, using remote sensing in a semi-arid region; degradation is a matter of concern that affects the whole population, and the process is catalyzed by deforestation of the savanna and improper soil-use practices. The objective of this research is to use biophysical parameters from MODIS/Terra and TM/Landsat-5 images to determine areas susceptible to degradation in the semi-arid region of Paraíba. The study area is located in the central interior of Paraíba, in the sub-basin of the Taperoá River, with average annual rainfall below 400 mm and an average annual temperature of 28 °C. To draw up the vegetation map, TM/Landsat-5 images were used, specifically the 5R4G3B color composition, commonly used for mapping land use. This map was produced by unsupervised maximum-likelihood classification. The legend corresponds to the following targets: sparse and dense savanna vegetation, riparian vegetation, and exposed soil. The MODIS biophysical parameters used were emissivity, albedo, and the Normalized Difference Vegetation Index (NDVI). The GIS programs used were the MODIS Reprojection Tool and the Georeferenced Information Processing System (SPRING), in which the MODIS and TM data were organized and processed, along with ArcGIS for producing more customizable maps. Initially, the behavior of vegetation emissivity was evaluated by adapting Bastiaanssen's equation relating emissivity to NDVI, in order to spatialize emissivity and observe its changes during 2006. The albedo was used to assess its percentage increase between December 2003 and December 2004. Landsat TM images were used for December 2005, according to image availability and periods of low emissivity. These applications were implemented in the Spatial Algebra Language (LEGAL), a programming routine of SPRING that allows various types of algebra to be performed on spatial data and maps.
The detection of areas susceptible to environmental degradation took into account the behavior of the savanna emissivity, which showed seasonality coinciding with the rainy season, reaching maximum emissivity from April to July and low emissivity in the remaining months. With the albedo images of December 2003 and 2004, the percentage increase was computed, which allowed two distinct classes to be generated: areas with a percentage increase of 1 to 11.6% and areas with an albedo change of less than 1%. It was then possible to generate the map of susceptibility to environmental degradation by intersecting the exposed-soil class with the albedo percentage variation, resulting in classes of susceptibility to environmental degradation.
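Of the biophysical parameters above, NDVI has the simplest closed form: the normalized difference of the near-infrared and red reflectances, (NIR - RED) / (NIR + RED), ranging from -1 to 1 and rising with vegetation density. The sketch below uses illustrative reflectance values, not actual MODIS/TM data.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index in [-1, 1].
    nir, red: reflectances in the near-infrared and red bands
    (scalars or arrays of the same shape)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)
```

Dense vegetation yields high NDVI while exposed soil yields a value near zero, which is what allows NDVI (and the emissivity derived from it via Bastiaanssen's equation) to separate the map classes used in the study.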
Abstract:
OBJECTIVE: To assist health professionals in identifying risk and protective factors and in managing patients at risk of suicide, through the clinical interview, in the medical emergency setting. METHOD: Selective review of the literature to identify relevant and illustrative clinical findings. RESULTS: The clinical interview is the best method for assessing suicide risk and has two objectives: 1) providing emotional support and establishing a bond; 2) gathering information. A considerable amount of information should be collected during the interview: risk and protective factors (predisposing and precipitating), epidemiological data, characterization of the act, psychodynamic aspects, personal and family history, identification models, data on physical health, and the social support network. Difficulties will be encountered throughout the interview, but with adequate knowledge and training the professional will be able to properly approach and help the patient. Although several scales have been proposed, none has proven effective for detecting suicide risk. CONCLUSION: There is no way to predict who will commit suicide, but it is possible to assess the individual risk each patient presents through the detailed and empathetic investigation of the clinical interview. Preventing the patient from killing himself or herself is the primary and fundamental rule.
Abstract:
OBJECTIVES: To identify the frequency of occurrence of ocular deviations and the characteristics of affected individuals in a population sample. METHODS: Cross-sectional, observational, probabilistic study, carried out between 2004 and 2005, involving 11 cities in the central-western region of São Paulo State. A total of 10,994 individuals were examined; for this study, a subsample of this population, identified by the diagnosis of strabismus, was used. The population was approached by a team trained and standardized in the research procedures. The data were analyzed statistically by means of descriptive analysis, frequency of occurrence, contingency analysis, and association tests (p<0.05). RESULTS: The frequency of occurrence of strabismus in the studied population was 1.4% (148 individuals with strabismus), with no difference between sexes. Esodeviations (ET) accounted for 46.3% of cases, exodeviations (XT) for 38.2%, and vertical deviations associated with horizontal ones or syndromes for 15.4%. Contingency analysis showed that 3 strabismic individuals (2.3%) had blindness and 7 (5.43%) had low vision in one eye. Both ET and XT were present in individuals with varying degrees of myopia (up to -5.75 for XT and -2.50 for ET) and hyperopia (up to +9.00 for XT and +8.00 for ET). The association between strabismus and the spherical equivalent obtained by static refraction showed no significant difference (p>0.05). CONCLUSION: The frequency of occurrence of strabismus in a population sample was 1.4%, with no difference between sexes or type of ocular deviation. The presence of blindness and low vision associated with ocular deviations reinforces the need for early treatment.