129 results for Confiabilidade (reliability)


Relevance:

10.00%

Publisher:

Abstract:

Java Card technology allows the development and execution of small applications embedded in smart cards. A Java Card application is composed of an external card client and of an on-card application that implements the services available to the client through an Application Programming Interface (API). These applications usually manipulate and store important information, such as cash and confidential data of their owners. It is therefore necessary to apply rigor when developing a smart card application in order to improve its quality and trustworthiness. The use of formal methods in the development of these applications is one way to reach these quality requirements. The B method is one of many formal methods for system specification. Development in B starts with the functional specification of the system, continues with the application of optional refinements to the specification and, from the last refinement level, code for some programming language can be generated. The B formalism has good tool support, and its application to Java Card is well suited, since the specification and development of APIs is one of the major applications of B. The BSmart method proposed here aims to promote the rigorous development of Java Card applications up to code generation, based on the refinement of their formal specification written in the B notation. This development is supported by the BSmart tool, composed of programs that automate each stage of the method, and by a library of B modules and Java Card classes that model primitive types, essential Java Card API classes, and reusable data structures.
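For readers unfamiliar with the target platform, the sketch below shows the shape of Java Card applet code that generated output must conform to. The applet, its instruction byte, and the balance field are hypothetical illustrations, not output of BSmart.

```java
import javacard.framework.APDU;
import javacard.framework.Applet;
import javacard.framework.ISO7816;
import javacard.framework.ISOException;

// Hypothetical wallet-style applet illustrating the Java Card
// programming model that generated code would follow.
public class WalletApplet extends Applet {
    private static final byte INS_GET_BALANCE = (byte) 0x30; // hypothetical INS byte
    private short balance = 0;

    private WalletApplet() {
        register(); // make the applet visible to the card runtime
    }

    public static void install(byte[] buf, short off, byte len) {
        new WalletApplet();
    }

    public void process(APDU apdu) {
        if (selectingApplet()) return;
        byte[] buffer = apdu.getBuffer();
        switch (buffer[ISO7816.OFFSET_INS]) {
            case INS_GET_BALANCE:
                // Return the two-byte balance in the response APDU.
                buffer[0] = (byte) (balance >> 8);
                buffer[1] = (byte) balance;
                apdu.setOutgoingAndSend((short) 0, (short) 2);
                break;
            default:
                ISOException.throwIt(ISO7816.SW_INS_NOT_SUPPORTED);
        }
    }
}
```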

Relevance:

10.00%

Publisher:

Abstract:

The process of choosing the best components to build systems has become increasingly complex. It becomes even more critical when many combinations of components must be considered in the context of an architectural configuration. These circumstances arise mainly in systems with critical requirements, such as timing constraints in distributed multimedia systems, network bandwidth in mobile applications, or reliability in real-time systems. This work proposes a process for the dynamic selection of architectural configurations based on the system's non-functional requirements, which can be used during a dynamic adaptation. The proposal uses Multi-Attribute Utility Theory (MAUT) for decision making over a finite set of possibilities involving multiple criteria to be analyzed. Additionally, a metamodel is proposed to describe the application's requirements in terms of non-functional criteria and their expected values, so that the desired configuration can be selected. As a proof of concept, MoSAC, a module that performs the dynamic choice of configurations, was implemented following a component-based development (CBD) approach; it selects architectural configurations through the proposed multi-criteria selection process. This work also presents a case study in which a Digital TV application was developed to evaluate the time the module takes to return a valid configuration to be used in a middleware with self-adaptive features, the AdaptTV middleware.
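At its core, MAUT's additive model scores each alternative as a weighted sum of per-criterion utilities and picks the maximum. A minimal sketch, with hypothetical criteria and weights (not MoSAC's actual interface):

```java
import java.util.List;
import java.util.Map;

// MAUT-style selection: score each candidate configuration by a
// weighted sum of normalized criterion utilities and pick the one
// with the highest aggregate utility.
public final class MautSelector {

    // utilities.get(config).get(criterion) must be normalized to [0, 1].
    public static String select(List<String> configs,
                                Map<String, Map<String, Double>> utilities,
                                Map<String, Double> weights) {
        String best = null;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (String config : configs) {
            double score = 0.0;
            for (Map.Entry<String, Double> w : weights.entrySet()) {
                score += w.getValue() * utilities.get(config).get(w.getKey());
            }
            if (score > bestScore) {
                bestScore = score;
                best = config;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Two hypothetical configurations rated on latency and reliability.
        Map<String, Map<String, Double>> u = Map.of(
            "configA", Map.of("latency", 0.9, "reliability", 0.6),
            "configB", Map.of("latency", 0.5, "reliability", 0.95));
        Map<String, Double> w = Map.of("latency", 0.4, "reliability", 0.6);
        System.out.println(select(List.of("configA", "configB"), u, w)); // configB
    }
}
```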

Relevance:

10.00%

Publisher:

Abstract:

Using formal methods, developers can increase software trustworthiness and correctness. Furthermore, they can concentrate on the functional requirements of the software. However, there is considerable resistance to adopting this development approach, mainly because of the scarcity of adequate, easy-to-use, and useful tools. Developers typically write code and test it, and these tests usually consist of executing the program and checking its output against the requirements. Such testing, however, is rarely exhaustive. With formal methods, on the other hand, one can investigate the system's properties much further. Unfortunately, specification languages do not always have tools such as animators or simulators, and sometimes they lack friendly graphical user interfaces. On the other hand, specification languages usually have a compiler that generates a Labeled Transition System (LTS). This work proposes an application that provides graphical animation for formal specifications using the LTS as input. The application initially supports the languages B, CSP, and Z; however, given an LTS in a specified XML format, it is possible to animate further languages. Additionally, the tool displays traces, that is, the choices the user made, as a graphical tree. The intention is to improve the comprehension of a specification by reporting errors and animating it, much as developers do for programming languages such as Java and C++.
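The heart of such an animator is small: given the LTS, repeatedly show the events enabled in the current state, apply the user's choice, and accumulate the trace. A minimal sketch with illustrative states and events:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;

// Minimal animation loop over a labeled transition system: list the
// events enabled in the current state, let the user choose one, and
// record the resulting trace. States and events are illustrative.
public final class LtsAnimator {

    record Transition(String from, String event, String to) {}

    public static void main(String[] args) {
        List<Transition> lts = List.of(
            new Transition("s0", "insertCard", "s1"),
            new Transition("s1", "enterPin", "s2"),
            new Transition("s1", "ejectCard", "s0"));

        String state = "s0";
        List<String> trace = new ArrayList<>();
        try (Scanner in = new Scanner(System.in)) {
            while (true) {
                final String here = state;
                List<Transition> enabled = lts.stream()
                    .filter(t -> t.from().equals(here)).toList();
                if (enabled.isEmpty()) {
                    System.out.println("No enabled events (deadlock). Trace: " + trace);
                    return;
                }
                for (int i = 0; i < enabled.size(); i++) {
                    System.out.println(i + ": " + enabled.get(i).event());
                }
                System.out.print("Choose event index (or -1 to quit): ");
                int choice = in.nextInt();
                if (choice < 0 || choice >= enabled.size()) break;
                trace.add(enabled.get(choice).event());
                state = enabled.get(choice).to();
            }
        }
        System.out.println("Trace so far: " + trace);
    }
}
```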

Relevance:

10.00%

Publisher:

Abstract:

The development of smart card applications requires a high level of reliability, and formal methods provide means for this reliability to be achieved. The BSmart method and tool support the development of smart card applications with the B method, generating Java Card code from B specifications. For development with BSmart to be effectively rigorous without overloading the user, a library of reusable components built in B is essential; the goal of KitSmart is to provide this support. The composition of this library was first investigated in a 2006 undergraduate project by Thiago Dutra at the Universidade Federal do Rio Grande do Norte, which resulted in a B specification of the Java Card primitive types byte, short, and boolean and in the creation of reusable components for application development. This work improves KitSmart by adding a B specification of the Java Card API and a guide for the creation of new components. The Java Card API in B, besides being available for application development, is also useful as documentation for each API class. The reusable components are modules that manipulate specific structures, such as date and time, which are natively available neither in B nor in Java Card; these Java Card components are generated from formally verified B specifications. The guide contains a quick reference on how to specify some structures and on how some situations were adapted from object orientation to the B method. This work was evaluated through a case study carried out with the BSmart tool, which makes use of the KitSmart library; the case study shows the contribution of the components to a B specification. The kit should be useful for B method users and Java Card application developers.
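To make the gap concrete: Java Card offers no java.util.Date, so a date component must be built from the primitive types the platform does support. A minimal sketch of what such a component could look like on the Java side; the class name and validation rules are illustrative, not the actual KitSmart component, whose code is generated from a verified B specification.

```java
// A date structure for Java Card, using only the short and boolean
// types available in the Java Card subset (no java.util, no String).
public class CardDate {
    private short day, month, year;

    // Stores the date only if it is plausible; leap years are
    // deliberately ignored to keep the sketch short.
    public boolean set(short d, short m, short y) {
        if (m < 1 || m > 12 || d < 1 || d > 31 || y < 0) return false;
        day = d; month = m; year = y;
        return true;
    }

    public short getDay()   { return day; }
    public short getMonth() { return month; }
    public short getYear()  { return year; }
}
```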

Relevance:

10.00%

Publisher:

Abstract:

Checking conformity between an implementation and its design rules is an important activity for ensuring that no degradation occurs between the architectural patterns defined for a system and what is actually implemented in the source code. Especially for systems that require a high level of reliability, it is important to define specific design rules for exceptional behavior. Such rules describe how exceptions should flow through the system by defining which elements are responsible for catching exceptions thrown by other system elements. However, current approaches to automatically checking design rules do not provide suitable mechanisms for defining and verifying rules related to the exception handling policy of applications. This work proposes a practical approach to preserving the exceptional behavior of an application or family of applications, based on the definition and automatic runtime checking of exception handling design rules for systems developed in Java or AspectJ. To support the approach, a tool called VITTAE (Verification and Information Tool to Analyze Exceptions) was developed in the context of this work; it extends the JUnit framework and automates the testing of exceptional design rules. A case study was conducted with the primary objective of evaluating the effectiveness of the proposed approach on a software product line. Besides this, an experiment compared the proposed approach with one based on a tool called JUnitE, which also proposes testing exception handling code with JUnit tests. The results showed how exception handling design rules evolve across different versions of a system and that VITTAE can aid in detecting defects in exception handling code.
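VITTAE's own rule notation is not reproduced here; the sketch below expresses one such rule as an ordinary JUnit 4 test, the mechanism the tool builds on. The rule checked: a persistence façade must translate low-level exceptions into a domain exception rather than let them leak. All application classes are illustrative stand-ins.

```java
import static org.junit.Assert.fail;

import org.junit.Test;

// Illustrative stand-ins for application classes; in a real project
// these live in the system under test.
class DataAccessException extends RuntimeException {
    DataAccessException(String msg, Throwable cause) { super(msg, cause); }
}

class PersistenceFacade {
    Object loadAccount(String id) {
        try {
            // Simulated low-level failure.
            throw new java.sql.SQLException("no row for " + id);
        } catch (java.sql.SQLException e) {
            // The design rule under test: wrap, never leak.
            throw new DataAccessException("account lookup failed", e);
        }
    }
}

public class ExceptionFlowRuleTest {

    // Rule: PersistenceFacade must never let a raw exception escape;
    // low-level causes must surface as DataAccessException.
    @Test
    public void facadeMustWrapLowLevelExceptions() {
        try {
            new PersistenceFacade().loadAccount("missing-id");
            fail("expected a DataAccessException for an unknown id");
        } catch (DataAccessException expected) {
            // Rule satisfied: the low-level cause was translated.
        } catch (RuntimeException leaked) {
            fail("design rule violated: a raw exception leaked: " + leaked);
        }
    }
}
```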

Relevance:

10.00%

Publisher:

Abstract:

When crosscutting concerns are identified from the beginning of development, during the activities of requirements engineering, there are many gains in terms of quality, cost, and efficiency throughout the software development lifecycle. This early identification supports the evolution of requirements, detects possible flaws in the requirements specification, improves traceability among requirements, provides better software modularity, and prevents possible rework. Despite these advantages, however, the identification of crosscutting concerns in requirements engineering faces several difficulties, such as the lack of systematization and of tools that support it. Furthermore, it is difficult to justify why some concerns are identified as crosscutting and others are not, since this identification is most often made without any methodology that systematizes and grounds it. In this context, this work proposes GT4CCI, an approach based on Grounded Theory for systematizing and grounding the process of identifying crosscutting concerns in the requirements document during the initial stages of the software development process. Grounded Theory is a renowned methodology for the qualitative analysis of data. Through GT4CCI it is possible to better understand, trace, and document concerns, adding gains in terms of quality, reliability, and modularity across the entire software lifecycle.

Relevance:

10.00%

Publisher:

Abstract:

Component-based development revolutionized the software development process, facilitating maintenance and providing more reliability and reuse. Nevertheless, even with all the advantages of developing with components, their composition remains an important concern. Verification through informal testing is not enough to achieve safe composition, because such tests are not based on formal semantic models with which a system's behavior can be described precisely. In this context, formal methods provide ways to specify systems accurately through mathematical notations, providing, among other benefits, more safety. The CSP formal method enables the specification of concurrent systems and the verification of their intrinsic properties, as well as refinement checking between different models. Some approaches apply CSP constraints to check the behavior of compositions between components, supporting early verification of those components. Hence, considering that the software market increasingly requires automation, reducing work and providing business agility, this work presents a tool that automates the verification of compositions among components while keeping all the complexity of the formal language hidden from users. Through a simple interface, the tool BST (BRIC-Tool-Suport) helps to create and compose components, predicting, in advance, undesirable behaviors in the system, such as deadlocks.
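To illustrate the kind of property such a check targets (this is not BST's actual algorithm, which works on CSP models via refinement checking): compose two components by synchronizing on shared events and interleaving the rest, then search the product for states with no enabled event. A minimal sketch with toy components that acquire two "locks" in opposite orders:

```java
import java.util.ArrayDeque;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Compose two components, each given as state -> event -> next state,
// synchronizing on shared events and interleaving the rest, then
// search the reachable product states for deadlocks.
public final class CompositionDeadlockCheck {

    static Set<String> alphabet(Map<String, Map<String, String>> lts) {
        Set<String> a = new HashSet<>();
        lts.values().forEach(m -> a.addAll(m.keySet()));
        return a;
    }

    public static void main(String[] args) {
        // Toy components acquiring two "locks" in opposite orders, so the
        // synchronized composition deadlocks in its initial state.
        Map<String, Map<String, String>> c1 = Map.of(
            "p0", Map.of("lockA", "p1"),
            "p1", Map.of("lockB", "p2"),
            "p2", Map.of());
        Map<String, Map<String, String>> c2 = Map.of(
            "q0", Map.of("lockB", "q1"),
            "q1", Map.of("lockA", "q2"),
            "q2", Map.of());

        Set<String> shared = alphabet(c1);
        shared.retainAll(alphabet(c2));

        ArrayDeque<String[]> queue = new ArrayDeque<>();
        Set<String> seen = new HashSet<>();
        queue.add(new String[] {"p0", "q0"});
        while (!queue.isEmpty()) {
            String[] s = queue.poll();
            if (!seen.add(s[0] + "|" + s[1])) continue;
            int moves = 0;
            for (Map.Entry<String, String> e : c1.get(s[0]).entrySet()) {
                if (shared.contains(e.getKey())) {
                    // Shared event: both components must move together.
                    String t2 = c2.get(s[1]).get(e.getKey());
                    if (t2 != null) { queue.add(new String[] {e.getValue(), t2}); moves++; }
                } else {
                    queue.add(new String[] {e.getValue(), s[1]}); moves++;
                }
            }
            for (Map.Entry<String, String> e : c2.get(s[1]).entrySet()) {
                if (!shared.contains(e.getKey())) {
                    queue.add(new String[] {s[0], e.getValue()}); moves++;
                }
            }
            if (moves == 0) {
                System.out.println("Deadlock at (" + s[0] + ", " + s[1] + ")");
            }
        }
    }
}
```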

Relevance:

10.00%

Publisher:

Abstract:

The current Internet has been suffering from several problems in terms of scalability, performance, mobility, and so on, due to the steep growth in the number of users and the emergence of new services with new demands, thus giving rise to the Future Internet. New proposals on content-oriented networks, such as the Entity Title architecture (ETArch), provide new services for this type of scenario, implemented over the software-defined networking paradigm. However, ETArch's transport model is equivalent to the best-effort model of the current Internet, which limits the reliability of its communications. In this work, ETArch is redesigned following the resource over-provisioning paradigm to achieve advanced resource allocation integrated with OpenFlow. As a result, the SMART framework (Suporte de Sessões Móveis com Alta Demanda de Recursos de Transporte, i.e., support for mobile sessions with high transport resource demand) allows the network to define the qualitative requirements of sessions semantically, in order to manage Quality of Service control and maintain the best possible Quality of Experience. The evaluation of the data and control planes took place on the testbed island of the OFELIA project, demonstrating support for mobile multimedia applications with high transport resource demand, with QoS and QoE guaranteed through a restricted signaling scheme in comparison with legacy ETArch.

Relevance:

10.00%

Publisher:

Abstract:

In this work, the study of some complex systems is carried out using two distinct procedures. In the first part, we studied the use of the wavelet transform in the analysis and characterization of (multi)fractal time series. We tested the reliability of the Wavelet Transform Modulus Maxima (WTMM) method with respect to the multifractal formalism by computing the singularity spectrum of time series whose fractality is well known a priori. Next, we used the WTMM method to study the fractality of lung crackle sounds, a biological time series. Since crackle sounds are due to the opening of initially closed pulmonary airways (bronchi, bronchioles, and alveoli), we can obtain information on the airway-opening cascade of the whole lung. As this phenomenon is associated with the architecture of the pulmonary tree, which displays fractal geometry, the analysis and fractal characterization of this sound may provide important parameters for comparing healthy lungs with lungs affected by disorders that alter the geometry of the airway tree, such as the obstructive and parenchymal degenerative diseases that occur, for example, in pulmonary emphysema. In the second part, we study a site percolation model on square lattices in which the percolating cluster grows governed by a control rule corresponding to an automatic search method. In this percolation model, which has characteristics of self-organized criticality, the automatic search does not rely on Leath's algorithm; it uses the control rule p_{t+1} = p_t + k(R_c - R_t), where p is the occupation probability, k is a kinetic parameter with 0 < k < 1, and R_t is the fraction of percolating finite square lattices of side L. This rule yields a time series corresponding to the dynamical evolution of the system, in particular of the percolation probability p, and we analyze the scaling of the signal obtained in this way. The model enables the study of the automatic search method for site percolation on square lattices, evaluating the dynamics of its parameters as the system approaches the critical point. We show that both the time elapsed until the system reaches the critical point and t_cor, the time required for the system to lose its correlations, scale inversely with k, the kinetic parameter of the control rule. We also verify that, afterwards, the system exhibits two distinct time scales: one in which it shows 1/f noise, indicating strong correlation, and another in which it shows white noise, indicating that the correlations have been lost. Over long time intervals, the dynamics of the system is ergodic.
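The control rule is a simple negative feedback: if fewer than a fraction R_c of sampled lattices percolate, raise p; if more, lower it. A minimal sketch of the loop, with R_t estimated by Monte Carlo on small lattices (lattice size, sample count, and parameter values are illustrative):

```java
import java.util.Random;

// Feedback rule p_{t+1} = p_t + k(R_c - R_t): R_t is estimated as the
// fraction of random LxL site configurations (occupation probability
// p_t) that percolate from the top row to the bottom row.
public final class PercolationSearch {
    static final int L = 32, SAMPLES = 200;
    static final Random RNG = new Random(42);

    // Depth-first flood fill from the top row; true if any bottom-row
    // site is reachable through occupied nearest neighbors.
    static boolean percolates(boolean[][] open) {
        boolean[][] seen = new boolean[L][L];
        java.util.ArrayDeque<int[]> stack = new java.util.ArrayDeque<>();
        for (int c = 0; c < L; c++)
            if (open[0][c]) { stack.push(new int[] {0, c}); seen[0][c] = true; }
        int[][] d = {{1, 0}, {-1, 0}, {0, 1}, {0, -1}};
        while (!stack.isEmpty()) {
            int[] s = stack.pop();
            if (s[0] == L - 1) return true;
            for (int[] v : d) {
                int r = s[0] + v[0], c = s[1] + v[1];
                if (r >= 0 && r < L && c >= 0 && c < L
                        && open[r][c] && !seen[r][c]) {
                    seen[r][c] = true;
                    stack.push(new int[] {r, c});
                }
            }
        }
        return false;
    }

    static double fractionPercolating(double p) {
        int hits = 0;
        for (int s = 0; s < SAMPLES; s++) {
            boolean[][] open = new boolean[L][L];
            for (int r = 0; r < L; r++)
                for (int c = 0; c < L; c++)
                    open[r][c] = RNG.nextDouble() < p;
            if (percolates(open)) hits++;
        }
        return hits / (double) SAMPLES;
    }

    public static void main(String[] args) {
        double p = 0.3, k = 0.1, rc = 0.5; // rc: target percolating fraction
        for (int t = 0; t < 100; t++) {
            double rt = fractionPercolating(p);
            System.out.printf("t=%d  p=%.4f  R=%.3f%n", t, p, rt);
            p += k * (rc - rt); // the control rule itself
        }
        // p should hover near the finite-size critical point (~0.593
        // for site percolation on the square lattice).
    }
}
```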

Relevance:

10.00%

Publisher:

Abstract:

The gravity inversion method is a mathematical process that can be used to estimate the basement relief of a sedimentary basin. However, the inverse problem in potential-field methods has neither a unique nor a stable solution, so additional information (other than gravity measurements) must be supplied by the interpreter to transform this problem into a well-posed one. This dissertation presents the application of a gravity inversion method to estimate the basement relief of the onshore Potiguar Basin. The density contrast between sediments and basement is assumed to be known and constant. The proposed methodology consists of discretizing the sedimentary layer into a grid of juxtaposed rectangular prisms whose thicknesses correspond to the depth to basement, which is the parameter to be estimated. To stabilize the inversion, I introduce constraints in accordance with the known geologic information. The method minimizes an objective function that requires the model not only to be smooth and close to the seismic-derived model, used as a reference model, but also to honor well-log constraints; the latter are introduced through logarithmic barrier terms in the objective function. The inversion process was applied so as to simulate different phases of the exploration development of a basin. The methodology consisted of applying the gravity inversion in distinct scenarios: the first used only gravity data and a plain reference model; the second was divided into two cases, incorporating either borehole log information or the seismic model into the process; finally, I incorporated the basement depth from seismic interpretation into the inversion as a reference model and imposed depth constraints from boreholes using the primal logarithmic barrier method. As a result, the estimated basement relief satisfactorily reproduced the basin framework in every scenario, and the incorporation of the constraints improved the definition of basement depth. The joint use of surface gravity data, seismic imaging, and borehole logging information makes the process more robust and improves the estimate, providing a result closer to the actual basement relief. In addition, it is worth remarking that the first scenario already provided a very coherent basement relief when compared to the known basin framework; this is significant given the differences in cost and environmental impact between gravimetric surveys on the one hand and seismic surveys and well drilling on the other.
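A generic form of such an objective (the exact weights and operators of this work may differ; this is a hedged sketch of the standard smoothness-plus-reference-plus-barrier formulation):

```latex
\min_{\mathbf{h}}\ \Phi(\mathbf{h}) =
  \underbrace{\lVert \mathbf{g}^{\mathrm{obs}} - \mathbf{g}(\mathbf{h}) \rVert^{2}}_{\text{data misfit}}
  + \lambda_{1} \underbrace{\lVert \mathbf{D}\mathbf{h} \rVert^{2}}_{\text{smoothness}}
  + \lambda_{2} \underbrace{\lVert \mathbf{h} - \mathbf{h}^{\mathrm{ref}} \rVert^{2}}_{\text{closeness to reference}}
  - \mu \sum_{j \in \mathcal{W}} \Bigl[ \ln\bigl(h_{j} - h_{j}^{\min}\bigr) + \ln\bigl(h_{j}^{\max} - h_{j}\bigr) \Bigr]
```

Here h collects the prism thicknesses, D is a discrete derivative operator enforcing smoothness, h^ref is the seismic-derived reference model, W indexes the prisms pierced by wells, and the logarithmic barrier keeps those depths within their measured bounds while the barrier parameter μ is driven toward zero.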

Relevance:

10.00%

Publisher:

Abstract:

The main hypothesis of this thesis is that, to develop industrial automation applications efficiently, a good structuring of the data to be handled is needed. Thus, with the aim of structuring the knowledge involved in the context of industrial processes, this thesis proposes an ontology called OntoAuto that conceptually models the elements involved in the description of industrial processes. To validate the proposed ontology, several applications are presented. In the first, two typical industrial processes are modeled conceptually: a DEA (diethanolamine) treatment unit and a kiln. In the second application, the ontology is used to perform semantic filtering of alarms, which, together with correlation analysis, provides temporal relationships between the alarms of an industrial process. In the third application, the ontology was used for modeling and analyzing the construction and operation costs of processes. In the fourth application, the ontology is adopted to analyze the reliability and availability of an industrial plant. For both the cost application and the reliability application it was necessary to create new ontologies, OntoEcon and OntoConf respectively, which import the knowledge represented in OntoAuto while adding domain-specific information. The main conclusion of the thesis is that ontology approaches are well suited for structuring the knowledge of industrial processes and that, based on them, various advanced applications in industrial automation can be developed.

Relevance:

10.00%

Publisher:

Abstract:

Humanity has reached a time of unprecedented technological development. Science has produced, and continues to produce, technologies that allow us to understand the universe and the laws that govern it ever better, and to try to coexist without destroying the planet we live on. One of the main challenges of the 21st century is to find and expand new sources of clean, renewable energy able to sustain our growth and lifestyle, and it is the duty of every researcher to engage in and contribute to this energy race. In this context, wind power presents itself as one of the great promises for the future of electricity generation. Despite being somewhat older than other renewable energy sources, wind power still offers a wide field for improvement. The development of new generator control techniques, along with research laboratories specializing in wind generation, is key to improving the performance, efficiency, and reliability of the system. Appropriate control of the back-to-back converter scheme allows wind turbines based on the doubly-fed induction generator (DFIG) to operate in variable-speed mode, whose benefits include maximum power extraction, reactive power injection, and mechanical stress reduction. The generator-side converter provides control of the active and reactive power injected into the grid, whereas the grid-side converter provides control of the DC-link voltage and bidirectional power flow. The conventional control structure uses PI controllers with feed-forward compensation of the cross-coupling dq terms. This technique is sensitive to model uncertainties, and the compensation of the dynamic dq terms results in a competing control strategy. To overcome these problems, this thesis proposes a robust internal-model-based state-feedback control structure that eliminates the cross-coupling terms and thereby improves the generator drive as well as its dynamic behavior during sudden changes in wind speed. The conventional approach is compared with the proposed technique for DFIG wind turbine control under both steady and gusty wind conditions. Moreover, this thesis also proposes a wind turbine emulator, developed to recreate realistic conditions in the laboratory and to subject the generator to several wind speed profiles.
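For reference, a commonly used textbook form of the DFIG rotor voltage equations in the dq frame (possibly differing in notation from the model adopted in this thesis) makes the cross-coupling explicit: the terms multiplied by the slip frequency ω_slip tie the d and q axes together, and these are precisely the terms that the conventional feed-forward compensation, or the proposed state feedback, must handle:

```latex
\begin{aligned}
v_{dr} &= R_{r}\, i_{dr} + \sigma L_{r} \frac{d i_{dr}}{dt}
          \;-\; \omega_{\mathrm{slip}}\, \sigma L_{r}\, i_{qr},\\
v_{qr} &= R_{r}\, i_{qr} + \sigma L_{r} \frac{d i_{qr}}{dt}
          \;+\; \omega_{\mathrm{slip}} \Bigl( \sigma L_{r}\, i_{dr} + \tfrac{L_{m}^{2}}{L_{s}}\, i_{ms} \Bigr),
\qquad \sigma = 1 - \frac{L_{m}^{2}}{L_{s} L_{r}}
\end{aligned}
```

A PI controller per axis handles the first two terms of each equation; the ω_slip-dependent terms are the cross-couplings that the conventional scheme cancels by feed-forward and that the proposed state-feedback structure absorbs into the closed loop.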

Relevance:

10.00%

Publisher:

Abstract:

About 10% of faults involving the electrical system occur in power transformers. Therefore, the protection applied to power transformers is essential to ensure the continuous operation of this device and the efficiency of the electrical system. Among the protection functions applied to power transformers, differential protection stands out as one of the main schemes, providing reliable discrimination between internal faults and external faults or inrush currents. However, because they rely on the low-frequency components of the differential currents flowing through the transformer, the main difficulty of conventional differential protection methods is the delay in detecting events. Internal faults, external faults, and other disturbances related to transformer operation produce transients and can therefore be appropriately detected by the wavelet transform. This work proposes a wavelet-based differential protection for detecting and identifying faults external to the transformer, internal faults, and transformer energization, using the wavelet coefficient energy of the differential currents. The results reveal the advantages of using the wavelet transform in differential protection compared to conventional protection, since it provides reliability and speed in detecting these events.
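The detection principle can be stated compactly: transients inject energy into the high-frequency detail coefficients of the differential current, so the energy of those coefficients serves as a fast discriminator. A minimal sketch with a one-level Haar decomposition and a synthetic signal (the thesis's actual wavelet, window, and thresholds are not reproduced here):

```java
// One-level Haar wavelet decomposition of a sampled differential
// current, with the energy of the detail coefficients used as a
// transient indicator. Threshold and signal are illustrative.
public final class WaveletEnergyDetector {

    // One-level Haar detail coefficients: d[i] = (x[2i] - x[2i+1]) / sqrt(2).
    static double[] haarDetail(double[] x) {
        double[] d = new double[x.length / 2];
        for (int i = 0; i < d.length; i++) {
            d[i] = (x[2 * i] - x[2 * i + 1]) / Math.sqrt(2.0);
        }
        return d;
    }

    static double energy(double[] c) {
        double e = 0.0;
        for (double v : c) e += v * v;
        return e;
    }

    public static void main(String[] args) {
        // Synthetic differential current: a small 60 Hz component sampled
        // at 3840 Hz, with a step-like transient injected mid-window.
        int n = 256;
        double[] current = new double[n];
        for (int i = 0; i < n; i++) {
            current[i] = 0.1 * Math.sin(2 * Math.PI * 60 * i / 3840.0);
            if (i >= 129) current[i] += 0.8; // simulated fault transient
        }
        double e = energy(haarDetail(current));
        double threshold = 0.05; // illustrative pickup value
        System.out.println("detail energy = " + e
            + (e > threshold ? "  -> transient detected" : "  -> quiescent"));
    }
}
```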

Relevance:

10.00%

Publisher:

Abstract:

Objective: To translate the Mobility Assessment Tool Physical Activity (MAT-PA) and evaluate its psychometric properties in Brazilian community-dwelling older adults. Methods: This is a study of the translation, cultural adaptation, and accuracy of the MAT-PA, in which 329 community-dwelling older adults aged 60 years or over were evaluated. The subjects completed an assessment form consisting of a sociodemographic and perceived-health questionnaire; a physical evaluation; the Leganés Cognitive Test (Prova Cognitiva de Leganés, PCL); the Center for Epidemiologic Studies Depression Scale (CES-D); the International Physical Activity Questionnaire (IPAQ); and the Mobility Assessment Tool Physical Activity (MAT-PA). Of this total sample, 42 older adults wore an accelerometer for 8 days. To assess the test-retest reliability of the MAT-PA, the instrument was reapplied to 34 subjects 8 days after the first evaluation. The statistical analyses used were Spearman correlation, the intraclass correlation coefficient, Cronbach's α coefficient, Bland-Altman analysis, and the paired t-test. Results: The correlations of the IPAQ and accelerometer data with the MAT-PA total score were significant, with Spearman correlation coefficients of 0.13 and 0.41, respectively. Reliability was also analyzed, yielding the following measures: internal consistency, by Cronbach's alpha coefficient (α = 0.70); test-retest agreement, by the intraclass correlation coefficient (ICC = 0.53; p < 0.001). Conclusion: The Brazilian version of the Mobility Assessment Tool Physical Activity (MAT-PA) proved to be a valid and reliable instrument for assessing physical activity in older adults.

Relevance:

10.00%

Publisher:

Abstract:

The principal effluent of the oil industry is produced water, which accompanies the produced oil. Its volume is substantial, and its discharge, if inappropriate, can affect the environment and society; careful management of this effluent is therefore indispensable. The traditional treatment of produced water usually includes both flocculation and flotation. In flocculation processes, traditional flocculant agents are expensive and poorly characterized in technical data sheets. The flotation process, in turn, is the step in which the suspended particles are separated from the effluent. Dissolved air flotation (DAF) is a technique that has been consolidating itself economically and environmentally, showing great reliability compared with other processes, and it is widely used in various fields of water and wastewater treatment around the globe. In this regard, this study aimed to evaluate the potential of an alternative natural flocculant agent based on Moringa oleifera to reduce the total oil and grease content (TOG) of produced water from the oil industry by the flocculation/DAF method. The natural flocculant was evaluated for its efficacy, as well as for its efficiency compared with two commercial flocculants normally used by the petroleum industry. The experiments followed an experimental design, and the overall efficiencies of all flocculants were analyzed statistically with the STATISTICA software, version 10.0. Contour surfaces were obtained from the experimental design and interpreted in terms of the response variable, TOG removal efficiency. The design also yielded mathematical models for calculating the response variable under the studied conditions. The commercial flocculants showed similar behavior, with an average overall oil-removal efficiency of 90%; hence, economic analysis is the decisive factor in choosing between them. The natural alternative flocculant based on Moringa oleifera showed lower separation efficiency (70% on average), but it causes less environmental impact and is less expensive.