69 results for residual generation in dynamic systems
at Universidade Federal do Rio Grande do Norte (UFRN)
Resumo:
The detection and diagnosis of faults, i.e., finding out how, where and why faults occur, has been an important area of study ever since machines began to replace human operators. However, no technique studied to date solves the problem definitively. The variety of dynamic systems, whether linear or nonlinear, time-variant or time-invariant, with physical or analytical redundancy, hampers the search for a single, general solution. In this work, a technique for fault detection and diagnosis (FDD) in dynamic systems is presented, using state observers in conjunction with other tools to create a hybrid FDD scheme. A modified state observer is used to generate a residual that also enables the detection and diagnosis of faults. A bank of fault signatures is built using statistical tools, and an approach based on the mean squared error (MSE) supports the study of fault-diagnosis behavior even in the presence of noise. The methodology is then applied to a didactic coupled-tank plant and to another plant with industrial instrumentation in order to validate the system.
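As a rough illustration of the residual-based scheme summarized above, the sketch below shows a discrete state observer producing a residual and an MSE comparison against a bank of fault signatures. The plant matrices, observer gain and signature bank are hypothetical placeholders, not values from the thesis.

```python
# Illustrative sketch only: a discrete Luenberger-style observer generating a
# residual, plus an MSE check against stored fault signatures.
import numpy as np

A = np.array([[0.95, 0.10], [0.00, 0.90]])   # assumed plant dynamics
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.4], [0.2]])                  # assumed observer gain

def observer_residual(u_seq, y_seq):
    """Run the observer over input/output sequences and return the residual."""
    x_hat = np.zeros((2, 1))
    residuals = []
    for u, y in zip(u_seq, y_seq):
        y_hat = (C @ x_hat).item()
        r = y - y_hat                         # residual: measured minus estimated output
        x_hat = A @ x_hat + B * u + L * r     # correct the estimate with the residual
        residuals.append(r)
    return np.array(residuals)

def diagnose(residual, signature_bank):
    """Pick the fault signature with the smallest mean squared error."""
    mse = {name: np.mean((residual - sig) ** 2) for name, sig in signature_bank.items()}
    return min(mse, key=mse.get), mse
```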
Resumo:
Industry is becoming more and more rigorous where safety is concerned, whether to avoid financial losses due to accidents and low productivity or to protect the environment. It was with major accidents around the world involving aircraft and industrial processes (nuclear, petrochemical and so on) in mind that we decided to invest in systems for fault detection and diagnosis (FDD). FDD systems can prevent failures by assisting operators in the maintenance and replacement of defective equipment. Nowadays, the issues of detection, isolation, diagnosis and fault-tolerant control are gathering strength in both academic and industrial environments. Based on this, in this work we discuss the importance of techniques that can assist in the development of FDD systems and propose a hybrid method for FDD in dynamic systems. We present a brief history to contextualize the techniques used in working environments. Fault detection in the proposed system is based on state observers in conjunction with statistical techniques. The main idea is to use the observer itself, besides serving as analytical redundancy, to generate a residual. This residual is used in FDD. A signature database assists in the identification of system faults: based on signatures derived from trend analysis of the residual signal and of its first difference, it classifies the faults using a decision tree. The FDD system is tested and validated on two plants: a simulated coupled-tank plant and a didactic plant with industrial instrumentation. All results collected in those tests are discussed.
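The following minimal sketch illustrates, under assumed thresholds and fault labels, how trend features of the residual and of its first difference could feed a small decision tree, in the spirit of the signature-based classification described above; it is not the dissertation's actual signature database or decision tree.

```python
# Hedged sketch: trend features of the residual drive a hand-written decision
# tree. Thresholds and fault classes are hypothetical.
import numpy as np

def signature(residual):
    diff = np.diff(residual)
    return {
        "level": residual.mean(),   # average offset of the residual
        "trend": diff.mean(),       # average rate of change (first difference)
    }

def classify(residual, eps=0.05):
    s = signature(residual)
    if abs(s["level"]) < eps and abs(s["trend"]) < eps:
        return "no fault"
    if s["trend"] > eps:
        return "drifting sensor"    # hypothetical fault class
    if s["level"] > eps:
        return "stuck actuator"     # hypothetical fault class
    return "unknown fault"
```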
Fault detection and isolation system for dynamic systems based on parametric identification
Resumo:
The present research aims to contribute to the area of fault detection and diagnosis through the proposal of a new architecture for a fault detection and isolation (FDI) system. The proposed architecture introduces innovations in the way the monitored physical quantities are linked to the FDI system and, as a consequence, in the way faults are detected, isolated and classified. A search for mathematical tools able to meet the objectives of the proposed architecture pointed to the Kalman Filter and its derivatives, the EKF (Extended Kalman Filter) and the UKF (Unscented Kalman Filter). The first is effective when the monitored process has a linear relation between the monitored physical quantities and its output; the other two are suited to nonlinear dynamics. A brief comparison of their features and capabilities in the context of fault detection concludes that the UKF is a better alternative than the EKF for composing the proposed FDI architecture when the process dynamics are nonlinear. The results shown at the end of the research refer to linear and nonlinear industrial processes. The efficiency of the proposed architecture can be observed in its application to both simulated and real processes. The contributions of this thesis are summarized at the end of the text.
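A minimal sketch of the underlying idea for the linear case is given below: the Kalman filter's innovation (measurement minus prediction) acts as the residual, and a fault is flagged when the normalized innovation grows too large. All matrices and the threshold are illustrative assumptions, not the architecture proposed in the thesis.

```python
# Illustrative linear Kalman filter residual (innovation) for fault detection.
import numpy as np

def kalman_residuals(y_seq, A, C, Q, R, x0, P0, threshold=3.0):
    """Return a fault alarm per sample when the normalized innovation is large."""
    x, P = x0, P0
    alarms = []
    for y in y_seq:
        # Prediction step
        x = A @ x
        P = A @ P @ A.T + Q
        # Innovation (residual) and its covariance
        v = y - C @ x
        S = C @ P @ C.T + R
        alarms.append(bool(abs(v).item() > threshold * np.sqrt(S).item()))
        # Update step
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ v
        P = (np.eye(len(x)) - K @ C) @ P
    return alarms
```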
Resumo:
This master's dissertation presents the development of a fault detection and isolation system based on neural networks. The system is composed of two parts: an identification subsystem and a classification subsystem. Both subsystems use neural network techniques based on the multilayer perceptron. Two approaches for the identification stage were analyzed. The fault classifier uses only the residual signals produced by the identification subsystem. To validate the proposal, we carried out simulated and real experiments on a level-control system with two water reservoirs. Several faults were induced on this plant, and the proposed fault detection system presented very acceptable behavior. At the end of this work we highlight the main difficulties found in real tests that do not arise when working only in simulation environments.
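As a loose illustration of the two-stage structure described above (identification followed by residual-based classification), the sketch below fits a one-step-ahead MLP model of the level dynamics with scikit-learn and computes the residual as the gap between measured and predicted output. The regressor features and network size are assumptions, not the dissertation's own networks.

```python
# Rough illustration: an MLP identifies the plant and the residual is the
# difference between measured and predicted level.
import numpy as np
from sklearn.neural_network import MLPRegressor

def train_identifier(u, y):
    """Fit y[k] from (y[k-1], u[k-1]): a simple one-step-ahead model."""
    X = np.column_stack([y[:-1], u[:-1]])
    model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    model.fit(X, y[1:])
    return model

def residual(model, u, y):
    """Residual used by the fault classifier."""
    X = np.column_stack([y[:-1], u[:-1]])
    return y[1:] - model.predict(X)
```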
Resumo:
A typical electrical power system is characterized by centralized power generation. However, with the restructuring of the electric system, this topology is changing with the insertion of generators in parallel with the distribution system (distributed generation), which provides several benefits by being located near energy consumers. Therefore, the integration of distributed generators, especially from renewable sources, into the Brazilian system has grown every year. However, this new system topology may bring new challenges in the fields of power system control, operation and protection. One of the main problems related to distributed generation is islanding, which can pose safety risks to people and to the power grid. Among the several islanding protection techniques, passive techniques have low implementation cost and are simple, requiring only voltage and current measurements to detect system problems. This work proposes a protection system based on the wavelet transform, with overcurrent and under/overvoltage functions as well as information about fault-induced transients, in order to provide fast detection and identification of faults in the system. The proposed protection scheme was evaluated through simulation and experimental studies, with performance similar to that of conventional overcurrent and under/overvoltage methods, but with the additional detection of the exact moment of the fault.
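The detection principle can be sketched as follows, under assumed wavelet, signal and threshold choices: the energy of the first-level wavelet detail coefficients of the current signal spikes at fault-induced transients, which allows the fault instant to be located. This is only an illustration of the idea, not the proposed protection scheme.

```python
# Hedged sketch of transient detection via first-level wavelet detail energy.
import numpy as np
import pywt

def detect_transient(current, wavelet="db4", threshold=5.0):
    """Return an approximate sample index of the first detected transient, or None."""
    _, detail = pywt.dwt(current, wavelet)   # first-level detail coefficients
    energy = detail ** 2
    baseline = np.median(energy) + 1e-12
    hits = np.where(energy > threshold * baseline)[0]
    if hits.size == 0:
        return None
    return 2 * hits[0]                       # account for the DWT downsampling by 2
```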
Resumo:
The great interest in nonlinear system identification is mainly due to the fact that a large number of real systems are complex and need to have their nonlinearities considered so that their models can be successfully used in applications of control, prediction and inference, among others. This work evaluates the application of Fuzzy Wavelet Neural Networks (FWNN) to identify nonlinear dynamical systems subjected to noise and outliers. These elements generally cause negative effects on the identification procedure, resulting in erroneous interpretations of the dynamical behavior of the system. The FWNN combines in a single structure the ability of fuzzy logic to deal with uncertainties, the multiresolution characteristics of wavelet theory, and the learning and generalization abilities of artificial neural networks. Usually, the learning procedure of these neural networks is carried out by a gradient-based method that uses the mean squared error as its cost function. This work proposes replacing this traditional function with an Information Theoretic Learning similarity measure called correntropy. With this similarity measure, higher-order statistics can be taken into account during the FWNN training process. For this reason, the measure is more suitable for non-Gaussian error distributions and makes training less sensitive to the presence of outliers. To evaluate this replacement, FWNN models are obtained in two identification case studies: a real nonlinear system, consisting of a multisection tank, and a simulated system based on a model of the human knee joint. The results demonstrate that using correntropy as the cost function of the error backpropagation algorithm makes the identification procedure with FWNN models more robust to outliers. However, this is only achieved if the Gaussian kernel width of the correntropy is properly adjusted.
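A small sketch of the correntropy criterion mentioned above is given below, assuming a Gaussian kernel with the normalization constant omitted; minimizing the negative correntropy of the prediction errors replaces the mean squared error, and the kernel width sigma must be tuned, as the abstract notes.

```python
# Correntropy of the prediction errors with a Gaussian kernel (sketch).
import numpy as np

def correntropy(error, sigma=1.0):
    """Average Gaussian kernel evaluated on the errors (higher means more similar)."""
    return np.mean(np.exp(-(error ** 2) / (2.0 * sigma ** 2)))

def correntropy_loss(y_true, y_pred, sigma=1.0):
    """Loss to minimize during training: negative correntropy of the errors."""
    return -correntropy(y_true - y_pred, sigma)
```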
Resumo:
The introduction of new digital services in cellular networks, at ever higher transmission rates, has stimulated recent research into ways of increasing data communication capacity and reducing delays in the forward and reverse links of third-generation WCDMA systems. These studies have resulted in new standards, known as 3.5G, published by the 3GPP group for the evolution of third-generation cellular systems. In this master's thesis, the performance of a 3G WCDMA system with several base stations and thousands of users is evaluated with the aid of the planning tool NPSW. Moreover, the performance of the 3.5G techniques of hybrid automatic retransmission and multi-user detection with interference cancellation, candidates for enhancing WCDMA uplink capacity, is verified by means of computational simulations in Matlab, assessing the increase in data communication capacity and the reduction of delays in the retransmission of information packets.
Resumo:
In the execution of civil engineering works, whether through waste generated while coating walls or through the demolition of gypsum walls, the generation of gypsum waste raises serious environmental concerns. These concerns are increased by the high demand for this raw material in the sector and by the difficulty of properly disposing of the generated byproduct. In the search for alternatives to minimize this problem, many research works are being conducted, with emphasis on using gypsum waste as filler in composite materials in order to improve their acoustic, thermal and mechanical performance. Through empirical testing, it was observed that the crystallization water contained in the residue (CaSO4.2H2O) could act as the primary expanding agent of polyurethane foam. Considering that polyurethanes produced from vegetable oils are biodegradable synthetic polymers and are acknowledged as an alternative to petrochemical synthetic polyurethanes, this research analyzes the thermal behavior of a composite whose matrix is obtained from an expansive resin derived from castor oil seed, with loads of 4%, 8%, 12% and 16% of gypsum waste replacing the polyol in the prepolymer blend. The analysis includes characterization of the raw material by Fourier transform infrared spectroscopy (FTIR), chemical analysis by X-ray fluorescence (XRF) and mineralogical analysis by X-ray diffraction (XRD), complemented by thermogravimetric analysis (TGA). To evaluate the thermophysical properties and thermal behavior of the composites manufactured in a closed die with contained expansion, tests were also carried out to determine the percentage of open pore volume using a gas pycnometer and scanning electron microscopy (SEM), in addition to flammability tests and tests of resistance to contact with hot surfaces. The analysis of the results indicates that it is possible to produce a new material which, with few changes in its thermophysical properties and thermal performance, brings significant and environmentally attractive benefits.
Resumo:
Some programs may have their input data specified by context-free grammars. This formalization facilitates the use of tools to systematize and improve the quality of their testing process. Among this category of programs, compilers were the first to use this kind of tool to automate their tests. In this work we present an approach for defining tests from the formal description of the program's inputs. Sentence generation is performed taking into account the syntactic aspects defined by the specification of the inputs, i.e., the grammar. For optimization, coverage criteria are used to limit the quantity of tests without diminishing their quality. Our approach uses these criteria to drive generation towards sentences that satisfy a specific coverage criterion. The approach is based on the Lua language, relying heavily on its coroutines and dynamic construction of functions. With these resources, we propose a simple and compact implementation that can be optimized and controlled in different ways in order to satisfy the different implemented coverage criteria. To make the tool simpler to use, the EBNF notation was adopted for specifying the inputs; its parser was specified in the Meta-Environment tool for rapid prototyping.
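The original implementation relies on Lua coroutines; the sketch below is only a loose Python analogue using generators and a toy grammar, showing the general idea of lazily producing sentences from a context-free grammar under a depth bound that stands in for a coverage criterion.

```python
# Lazily enumerate sentences of a toy context-free grammar (hypothetical example).
GRAMMAR = {
    "expr": [["num"], ["expr", "+", "num"]],
    "num": [["0"], ["1"]],
}

def sentences(symbol, depth):
    """Yield sentences derivable from `symbol` within `depth` expansion steps."""
    if symbol not in GRAMMAR:            # terminal symbol
        yield [symbol]
        return
    if depth == 0:
        return
    for production in GRAMMAR[symbol]:
        yield from expand(production, depth - 1)

def expand(production, depth):
    """Yield all concatenations of sentences for the symbols in `production`."""
    if not production:
        yield []
        return
    head, *tail = production
    for prefix in sentences(head, depth):
        for suffix in expand(tail, depth):
            yield prefix + suffix

# Example: sorted(" ".join(s) for s in sentences("expr", 3))
# -> ['0', '0 + 0', '0 + 1', '1', '1 + 0', '1 + 1']
```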
Resumo:
With the increasing complexity of software systems, there is also increased concern about their faults, which can cause financial losses and even loss of life. Therefore, we propose in this work to minimize faults in software by using formally specified tests. The combination of testing and formal specifications is gaining strength, mainly through Model-Based Testing (MBT). The development of software from formal specifications, when the whole refinement process is done rigorously, ensures that what is specified will be implemented in the application. Thus, an implementation generated from such specifications would accurately reflect what was specified. However, the specification is not always refined to the level of implementation and code generation, and in these cases tests generated from the specification tend to find faults. Additionally, the generation of so-called "invalid tests", i.e., tests that exercise application scenarios not addressed in the specification, complements the formal development process more significantly. Therefore, this work proposes a method, structured in pseudo-code, for generating tests from B formal specifications. The method is based on the systematization of the black-box testing techniques of boundary value analysis and equivalence partitioning, as well as the orthogonal pairs technique. The method was applied to a B specification, and B test machines were generated that produce test cases independent of the implementation language. To validate the method, the test cases were manually transformed into JUnit test cases, and the application, created from the B specification and developed in Java, was tested. Faults were found during the execution of the JUnit test cases.
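As a hedged illustration of the classical black-box techniques the method systematizes (not of the B-based method itself), the sketch below derives candidate test values for an integer input range using boundary value analysis and equivalence partitioning.

```python
# Classic black-box test-value selection for an integer range (illustrative).
def boundary_values(lo, hi):
    """Boundary value analysis candidates for a closed range [lo, hi]."""
    return [lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1]

def equivalence_representatives(lo, hi):
    """One representative per equivalence class: below range, inside, above."""
    return {"invalid_low": lo - 10, "valid": (lo + hi) // 2, "invalid_high": hi + 10}

# Example: boundary_values(1, 100) -> [0, 1, 2, 50, 99, 100, 101]
```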
Resumo:
The game industry has lately been experiencing a consistent increase in game production costs. Part of this increase stems from the current trend towards bigger, more interactive and more replayable environments. This trend translates into increases in both team size and development time, which make game development an even riskier investment and may reduce innovation in the area. As a possible solution to this problem, the scientific community has been focusing on procedural content generation and, more specifically, on procedurally generated levels. Given the great diversity and complexity of games, most works choose to deal with a specific genre, platform games being one of the most studied. This work proposes a procedural level generation method for platform/adventure games, a considerably more complex genre than most classic platformers and one that, so far, has not been the subject of other studies. The level generation process is divided into two steps, planning and visual generation, respectively responsible for generating a compact representation of the level and for determining its appearance. The planning stage is divided into game design and level design and uses a goal-oriented process to output a set of rooms. The visual generation step receives the set of rooms and fills their interiors with appropriate parts of previously authored geometry.
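A very loose sketch of the two-step structure described above (planning, then visual generation) is given below; the room fields, goals and geometry filling are invented for illustration and do not reproduce the method's actual rules.

```python
# Hypothetical pipeline: goal-oriented planning produces rooms, and a visual
# step assigns pre-authored geometry to each room.
from dataclasses import dataclass, field

@dataclass
class Room:
    name: str
    contents: list = field(default_factory=list)   # e.g. enemies, items, locks

def plan_level(goals):
    """Planning step: one room per goal, in order (illustrative only)."""
    return [Room(name=f"room_{i}", contents=[goal]) for i, goal in enumerate(goals)]

def generate_visuals(rooms, geometry_library):
    """Visual step: pick a pre-authored geometry chunk for each room's contents."""
    return {room.name: geometry_library.get(tuple(room.contents), "default_chunk")
            for room in rooms}

# Example: plan_level(["find_key", "open_door", "defeat_boss"])
```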
Resumo:
Removing inconsistencies from a project is less expensive when done in the early stages of design. The use of formal methods improves the understanding of systems and offers various techniques, such as formal specification and verification, to identify these problems in the initial stages of a project. However, the transformation from a formal specification into a programming language is a non-trivial and error-prone task, especially when done manually. Tool support at this stage can bring great benefits to the final product. This work proposes the extension of a tool whose focus is the automatic translation of specifications written in CSPm into Handel-C. CSP is a formal description language suitable for concurrent systems, and CSPm is its machine-readable notation used by supporting tools. Handel-C is a programming language whose output can be compiled directly onto FPGAs. Our extension increases the number of CSPm operators accepted by the tool, allowing the user to define local processes, to rename channels in a process and to use Boolean guards on external choices. In addition, we also propose the implementation of a communication protocol that removes some restrictions on the parallel composition of processes in the translation into Handel-C, so that communication on the same channel between multiple processes can be mapped consistently and improper communication on a channel, i.e., communication not allowed in the system specification, does not occur in the generated code.
Resumo:
The work proposed by Cleverton Hentz (2010) presented an approach to define tests from the formal description of a program's input. Since some programs, such as compilers, may have their inputs formalized through grammars, it is common to use context-free grammars to specify the set of their valid inputs. In the original work, the author developed a tool that automatically generates tests for compilers. In the present work we identify types of problems in various areas that are described by grammars, for example the specification of software configurations, which are potential situations for using LGen. In addition, we conducted case studies with grammars from different domains, and from these studies it was possible to evaluate the behavior and performance of LGen during sentence generation, considering aspects such as execution time, number of generated sentences and satisfaction of the coverage criteria available in LGen.
Resumo:
Web services are computational solutions designed according to the principles of Service-Oriented Computing. Web services can be built upon pre-existing services available on the Internet by using composition languages. We propose a method to generate WS-BPEL processes from abstract specifications provided with high-level control-flow information. The proposed method allows the composition designer to concentrate on high-level specifications, in order to increase productivity and generate specifications that are independent of specific web services. We consider service orchestrations, that is, compositions in which a central process coordinates all the operations of the application. The composition generation process is based on a rule rewriting algorithm, which has been extended to support basic control-flow information. We created a prototype of the extended refinement method and performed experiments on simple case studies.
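The sketch below gives a hypothetical picture of the rule-rewriting idea, not the authors' algorithm: an abstract orchestration is a nested structure of sequence/flow nodes, and rewrite rules replace abstract activities with concrete service invocations until none applies. The rule base and service names are invented.

```python
# Hypothetical rule rewriting of an abstract orchestration into concrete invocations.
RULES = {  # assumed rule base: abstract activity -> sub-structure or invocation
    "book_trip": ("sequence", ["book_flight", "book_hotel"]),
    "book_flight": ("invoke", "FlightService.book"),
    "book_hotel": ("invoke", "HotelService.book"),
}

def rewrite(node):
    """Recursively apply rewrite rules to an abstract composition node."""
    if isinstance(node, str):                      # abstract activity name
        return rewrite(RULES[node]) if node in RULES else ("invoke", node)
    kind, children = node
    if kind == "invoke":
        return node
    return (kind, [rewrite(child) for child in children])

# Example: rewrite("book_trip")
# -> ('sequence', [('invoke', 'FlightService.book'), ('invoke', 'HotelService.book')])
```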