34 results for "Identificação de sistemas" (system identification)
Abstract:
This paper presents a new multi-model identification technique based on ANFIS for nonlinear systems. The technique uses a Takagi-Sugeno fuzzy structure whose consequents are local linear models that represent the system at different operating points, and whose antecedents are membership functions adjusted during the learning phase of the ANFIS neuro-fuzzy technique. The models that represent the system at the different operating points can be obtained with linearization techniques such as the Least Squares method, which is robust against noise and simple to apply. The fuzzy system is responsible for indicating, through the membership functions, the proportion in which each model should be used. The membership functions can be adjusted by ANFIS using neural network algorithms, such as error backpropagation, so that the models found for each region are correctly interpolated, defining the contribution of each model for the possible inputs to the system. In multi-model approaches, this definition of each model's contribution is known as the metric and, since this work is based on ANFIS, it is here called the ANFIS metric. The ANFIS metric is thus used to interpolate the various models that compose the system to be identified. Unlike traditional ANFIS, the proposed technique necessarily represents the system, in several well-defined regions, by unaltered local models whose activations are weighted according to the membership functions. The selection of the regions in which the Least Squares method is applied is performed manually, from graphical analysis of the system behavior or from the physical characteristics of the plant. This selection serves as the starting point for defining the local linear models and for generating the initial configuration of the membership functions. The experiments are conducted on a didactic multi-section tank, designed and built to highlight the characteristics of the technique. The results obtained with this tank illustrate the performance achieved by the technique in the identification task, using several ANFIS configurations, and compare the developed technique both with multi-model schemes using simpler metrics and with the NNARX technique, also adapted for identification.
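Below is a minimal sketch of the multi-model idea described above: local affine models fitted by least squares are blended by normalized Gaussian membership functions, standing in for the ANFIS-tuned antecedents. The toy plant, the operating regions and the membership parameters are illustrative assumptions, not the experimental tank of the paper.

    import numpy as np

    def fit_local_model(u, y):
        """Least-squares fit of a local affine model y ~ a*u + b."""
        A = np.column_stack([u, np.ones_like(u)])
        (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
        return a, b

    def membership(u, center, width):
        """Gaussian membership function for one operating region."""
        return np.exp(-0.5 * ((u - center) / width) ** 2)

    rng = np.random.default_rng(0)
    u = np.linspace(0.0, 10.0, 200)                      # input over two regions
    y = np.sqrt(u) + 0.01 * rng.standard_normal(u.size)  # noisy nonlinear plant

    regions = [u < 5.0, u >= 5.0]                        # manual region selection
    centers, width = [2.5, 7.5], 2.0                     # initial membership setup
    models = [fit_local_model(u[m], y[m]) for m in regions]

    # Takagi-Sugeno output: normalized membership-weighted sum of local models.
    w = np.array([membership(u, c, width) for c in centers])
    w /= w.sum(axis=0)
    y_hat = sum(wi * (a * u + b) for wi, (a, b) in zip(w, models))

    print("RMS identification error:", np.sqrt(np.mean((y - y_hat) ** 2)))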
Abstract:
A new method to perform TCP/IP fingerprinting is proposed. TCP/IP fingerprinting is the process of identifying a remote machine through a TCP/IP-based computer network. This process has many applications related to network security: both intrusion and defence procedures may use it to achieve their objectives. There are many known methods that perform this process under favorable conditions; nowadays, however, there are many adversities that reduce identification performance. This work aims at the creation of a new OS fingerprinting tool that circumvents these current problems. The proposed method is based on the use of attractor reconstruction and neural networks to characterize and classify pseudo-random number generators.
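A minimal sketch of the attractor reconstruction step mentioned above, via time-delay embedding. The delay, the embedding dimension and the toy sequence are illustrative assumptions; in the fingerprinting setting the samples would come from the remote host's pseudo-random number generator, e.g. TCP initial sequence numbers.

    import numpy as np

    def delay_embed(x, dim=3, tau=1):
        """Delay-coordinate vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

    # Toy pseudo-random sequence standing in for probed ISN samples.
    rng = np.random.default_rng(42)
    seq = rng.integers(0, 2**16, size=500).astype(float)

    points = delay_embed(seq, dim=3, tau=1)
    print(points.shape)  # (498, 3): a point cloud approximating the attractor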
Abstract:
In this thesis, we study the application of spectral representations to the solution of problems in seismic exploration, the synthesis of fractal surfaces and the identification of correlations between one-dimensional signals. We apply a new approach, called Wavelet Coherency, to the study of stratigraphic correlation in well log signals, as an attempt to identify layers from the same geological formation, showing that the representation in wavelet space, with the introduction of the scale domain, can facilitate the process of comparing patterns in geophysical signals. We introduce a new model for the generation of anisotropic fractional Brownian surfaces based on the curvelet transform, a new multiscale tool which can be seen as a generalization of the wavelet transform that includes a directional component in multidimensional spaces. We tested our model with a modified version of the Directional Average Method (DAM) to evaluate the anisotropy of fractional Brownian surfaces. We also used the directional behavior of the curvelets to attack an important problem in seismic exploration: the attenuation of the ground roll, present in seismograms as a result of surface Rayleigh waves. The techniques employed are effective, leading to sparse representations of the signals and, consequently, to good resolution.
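The thesis's curvelet-based generator is not reproduced here; as a simpler point of reference, the sketch below synthesizes an isotropic fractional Brownian surface by Fourier filtering, shaping random phases with a power-law spectral filter whose exponent is set by an assumed Hurst parameter H. Grid size and H are illustrative.

    import numpy as np

    def fbm_surface(n=256, H=0.7, seed=0):
        """Fractional Brownian surface by spectral (Fourier filtering) synthesis."""
        rng = np.random.default_rng(seed)
        kx = np.fft.fftfreq(n)[:, None]
        ky = np.fft.fftfreq(n)[None, :]
        k = np.sqrt(kx**2 + ky**2)
        k[0, 0] = 1.0                                  # avoid division by zero at DC
        amplitude = k ** (-(H + 1.0))                  # 2D power law: S(k) ~ k^-(2H+2)
        phase = np.exp(2j * np.pi * rng.random((n, n)))
        field = np.fft.ifft2(amplitude * phase).real
        return (field - field.mean()) / field.std()

    surface = fbm_surface()
    print(surface.shape, round(surface.std(), 2))      # (256, 256) 1.0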
Abstract:
There are authentication models which use passwords, keys or personal identifiers (cards, tags etc.) to authenticate a particular user in the authentication/identification process. Other systems, however, can use biometric data, such as signature, fingerprint or voice, to authenticate an individual. On the other hand, the storage of biometric data can bring risks, such as consistency and protection problems for these data. It is therefore necessary to protect biometric databases to ensure the integrity and reliability of the system. For this purpose, there are models for the security/authentication of biometric identification, for example the Fuzzy Vault and Fuzzy Commitment schemes. Currently, these are the models most used for the protection of biometric data, but they have fragile elements in the protection process. Therefore, increasing the level of security of these methods, through changes in their structure or even by inserting new layers of protection, is one of the goals of this thesis. In other words, this work proposes the simultaneous use of encryption (the Papilio encryption algorithm) with template protection models (Fuzzy Vault and Fuzzy Commitment) in biometric identification systems. The objective is to improve two aspects of biometric systems: security and accuracy. Furthermore, it is necessary to maintain a reasonable level of efficiency for these data through the use of more elaborate classification structures, known as committees. In summary, we intend to propose a safer model for biometric identification systems.
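A minimal sketch of the Fuzzy Commitment scheme referenced above, under simplifying assumptions: a 3x repetition code stands in for the stronger error-correcting codes used in practice, and random bit strings stand in for binarized biometric templates.

    import hashlib
    import numpy as np

    def encode(bits):   # 3x repetition encoding
        return np.repeat(bits, 3)

    def decode(bits):   # majority-vote decoding
        return (bits.reshape(-1, 3).sum(axis=1) >= 2).astype(np.uint8)

    rng = np.random.default_rng(1)
    secret = rng.integers(0, 2, 64, dtype=np.uint8)          # key material to protect
    template = rng.integers(0, 2, 192, dtype=np.uint8)       # enrollment "biometric"

    commitment = encode(secret) ^ template                   # stored helper data
    secret_hash = hashlib.sha256(secret.tobytes()).digest()  # stored verifier

    # Verification with a noisy re-acquisition: two bit errors, in distinct
    # repetition groups, so majority decoding corrects them.
    query = template.copy()
    query[[5, 100]] ^= 1
    recovered = decode(commitment ^ query)

    print(hashlib.sha256(recovered.tobytes()).digest() == secret_hash)  # True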
Abstract:
The aim of this work was to perform the extraction and characterization of xylan from corn cobs and to prepare xylan-based microcapsules. For that purpose, an alkaline extraction of xylan was carried out, followed by the characterization of the polymer regarding its technological properties, such as angle of repose, Hausner factor, density, compressibility and compactability. A low-cost and rapid analytical procedure to identify xylan by means of infrared spectroscopy was also studied. Xylan was characterized as a yellowish fine powder with low density and poor flow properties. After the extraction and characterization of the polymer, xylan-based microcapsules were prepared by means of interfacial crosslinking polymerization and characterized in order to obtain gastroresistant multiparticulate systems. This work addressed the most suitable parameters for the preparation of the microcapsules as well as the study of the process, scale-up methodology and biological analysis. Magnetic nanoparticles were used as a model system to be encapsulated by the xylan microcapsules. According to the results, the xylan-based microcapsules were shown to be resistant to several conditions found along the gastrointestinal tract and were able to avoid the early degradation of the magnetic nanoparticles.
Abstract:
This thesis analyzes, from the perspective of the manager, the importance of the use of quality management tools and concepts in Federal Universities. It was motivated by the following research problem: do Federal University managers consider quality management in their institutions to be relevant? To that end, we sought to gather evidence for a satisfactory approach that addresses the complexity of the topics researched: quality, higher education and quality management systems. We chose to conduct an applied study, exploratory-descriptive as to its objective and quantitative-qualitative as to its approach to the problem. The object of study is composed of the Planning Provosts of the Federal Universities listed in the Folha University Ranking (RUF) in 2013, with the sample restricted to the provosts of the 20 best-placed Federal Universities in the ranking. The research instrument was composed of 26 questions: 6 questions designed to identify the profile of the manager; 16 perception questions (manifest variables) on the importance of quality management in the University, in which the managers assigned values (answers) to statements addressing the main topic of this thesis on a 5-point Likert scale; and 4 open, optional questions intended to identify the general management practices in use. Descriptive and factorial statistics were used for the data analysis. The responses collected through the questionnaire portray the managers' perception of the importance of quality management in their institutions. Sixteen variables were addressed; in the factor analysis their importance ratings were "Important" and "Very Important", with variable V2 rated "Important" and all others "Very Important". With this information it is possible to prioritize the areas that deserve immediate action, since some variables were "Very Important" for the vast majority of managers while others (e.g. V2, V10, V11) did not show the same result. It is concluded that managers perceive quality management in their institutions as relevant, but do not give it the same importance as the quality programs implemented in other segments of the economy, and that, despite the advances offered by SINAES, the model does not evaluate the institution in a global way. With these results, we hope to contribute to the advancement of the subject and to arouse the interest of Federal University managers, emphasizing the importance of quality management systems as a necessary tool to raise institutional quality.
Abstract:
This master's dissertation presents the development of a fault detection and isolation system based on neural networks. The system is composed of two parts: an identification subsystem and a classification subsystem. Both subsystems use neural network techniques with the multilayer perceptron training algorithm. Two approaches for the identification stage were analyzed. The fault classifier uses only the residual signals from the identification subsystem. To validate the proposal, we carried out simulated and real experiments on a level system with two water reservoirs. Several faults were applied to this plant and the proposed fault detection system presented very acceptable behavior. At the end of this work we highlight the main difficulties found in real tests that do not exist when working only in simulation environments.
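A minimal sketch of the residual-based detection scheme: a model identified from healthy data predicts the plant output, and a fault is flagged when the prediction residual exceeds a threshold. The first-order level dynamics and the actuator fault are illustrative assumptions, and a linear least-squares model stands in for the dissertation's multilayer perceptron.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(n, fault_at=None):
        """First-order level dynamics; the fault halves the actuator gain."""
        y = np.zeros(n)
        u = 0.5 + 0.1 * rng.standard_normal(n)
        for k in range(1, n):
            gain = 0.5 if (fault_at is not None and k >= fault_at) else 1.0
            y[k] = 0.9 * y[k-1] + 0.2 * gain * u[k-1] + 0.005 * rng.standard_normal()
        return y, u

    # Identification subsystem: fit y[k] = a*y[k-1] + b*u[k-1] on healthy data.
    y, u = simulate(500)
    phi = np.column_stack([y[:-1], u[:-1]])
    (a, b), *_ = np.linalg.lstsq(phi, y[1:], rcond=None)

    # Classification subsystem (simplified): threshold the residual.
    yf, uf = simulate(500, fault_at=250)
    residual = yf[1:] - (a * yf[:-1] + b * uf[:-1])
    threshold = 3 * residual[:200].std()       # healthy portion sets the threshold
    print("first fault alarm at k =", int(np.argmax(np.abs(residual) > threshold)))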
Abstract:
Industry is becoming more and more rigorous where safety is concerned, whether to avoid financial losses due to accidents and low productivity or to protect the environment. It was with the major accidents around the world involving aircraft and industrial processes (nuclear, petrochemical and so on) in mind that we decided to invest in systems for fault detection and diagnosis (FDD). FDD systems can prevent eventual faults by assisting in the maintenance and replacement of defective equipment. Nowadays, the issues involving detection, isolation, diagnosis and fault-tolerant control are gathering strength in academic and industrial environments. Based on this fact, in this work we discuss the importance of techniques that can assist in the development of FDD systems and propose a hybrid method for FDD in dynamic systems. We present a brief history to contextualize the techniques in their working environments. Fault detection in the proposed system is based on state observers in conjunction with other statistical techniques. The principal idea is to use the observer itself, besides serving as analytical redundancy, to allow the creation of a residual. This residual is used in the FDD. A signature database assists in the identification of the system faults: based on signatures derived from trend analysis of the residual signal and of its difference, the faults are classified purely by a decision tree. This FDD system is tested and validated on two plants: a simulated plant with coupled tanks and a didactic plant with industrial instrumentation. All the results collected in those tests are discussed.
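A minimal sketch of the observer-based residual generation at the core of the method: a discrete-time Luenberger observer tracks the plant and its output estimation error is the residual from which fault signatures are derived. The plant matrices, observer gain and the additive sensor fault are illustrative assumptions.

    import numpy as np

    A = np.array([[0.95, 0.05],
                  [0.00, 0.90]])          # coupled-tanks-like dynamics (assumed)
    B = np.array([[0.0], [0.1]])
    C = np.array([[1.0, 0.0]])
    L = np.array([[0.4], [0.2]])          # observer gain; A - L C is stable here

    rng = np.random.default_rng(0)
    n = 300
    x = np.zeros((2, 1))                  # plant state
    xh = np.zeros((2, 1))                 # observer state
    residual = np.zeros(n)

    for k in range(n):
        u = np.array([[1.0]])
        y = C @ x + 0.002 * rng.standard_normal((1, 1))
        if k >= 150:
            y += 0.2                      # additive sensor fault (assumed)
        residual[k] = (y - C @ xh).item() # innovation used as the residual
        xh = A @ xh + B @ u + L * residual[k]
        x = A @ x + B @ u

    print("mean |residual| before/after fault:",
          np.abs(residual[:150]).mean(), np.abs(residual[150:]).mean())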
Abstract:
RFID (Radio Frequency Identification) is a contactless automatic identification technique that identifies objects by means of radio frequency. This technology has shown its practical value and potential in the fields of manufacturing, retailing, logistics and hospital automation. Unfortunately, the key problem that limits the application of RFID systems is information security. Recently, researchers have demonstrated solutions to security threats in RFID technology, among them several key management protocols. This master's dissertation presents a performance evaluation of the Neural Cryptography and Diffie-Hellman protocols in RFID systems, measuring the processing time inherent in these protocols. The tests were carried out on an FPGA (Field-Programmable Gate Array) platform with a Nios II embedded processor. The research methodology is based on the aggregation of knowledge for the development of new RFID systems through a comparative analysis between these two protocols. The main contributions of this work are a performance evaluation of the two protocols (Diffie-Hellman and Neural Cryptography) on an embedded platform and a survey of RFID security threats. According to the results, the Diffie-Hellman key agreement protocol is more suitable for RFID systems.
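A minimal sketch of the Diffie-Hellman key agreement evaluated in the dissertation, as it would run between reader and tag. The toy prime and generator are illustrative assumptions chosen for readability; deployed systems use standardized groups and hardware-friendly implementations.

    import secrets

    # Toy group parameters: p is the Mersenne prime 2**127 - 1 and g = 5;
    # real deployments use standardized, larger groups.
    p = 2**127 - 1
    g = 5

    a = secrets.randbelow(p - 2) + 1    # reader's private exponent
    b = secrets.randbelow(p - 2) + 1    # tag's private exponent

    A = pow(g, a, p)                    # sent reader -> tag
    B = pow(g, b, p)                    # sent tag -> reader

    k_reader = pow(B, a, p)
    k_tag = pow(A, b, p)
    print(k_reader == k_tag)            # True: shared session key material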
Abstract:
This work presents a modelling and identification method for a wheeled mobile robot, including the actuator dynamics. Instead of the classic modelling approach, where the robot position coordinates (x, y) are used as state variables (resulting in a nonlinear model), the proposed discrete model is based on the travelled distance increment Delta_l. The resulting model is thus linear and time invariant, and it can be identified through classical methods such as Recursive Least Squares. This approach has one difficulty: Delta_l cannot be directly measured. In this work, the problem is solved using an estimate of Delta_l based on a second-order polynomial approximation. Experimental data were collected and the proposed method was used to identify the model of a real robot.
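A minimal sketch of recursive least squares (RLS) identification of a linear discrete model, the kind of estimator mentioned above. The second-order example system and the forgetting factor are illustrative assumptions; in the robot case the regressor would contain past values of the increment Delta_l and of the actuator inputs.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 400
    theta_true = np.array([1.2, -0.4, 0.5])  # y[k] = 1.2 y[k-1] - 0.4 y[k-2] + 0.5 u[k-1]

    u = rng.standard_normal(n)
    y = np.zeros(n)
    for k in range(2, n):
        y[k] = theta_true @ [y[k-1], y[k-2], u[k-1]] + 0.01 * rng.standard_normal()

    theta = np.zeros(3)                      # parameter estimate
    P = 1e3 * np.eye(3)                      # covariance of the estimate
    lam = 0.99                               # forgetting factor

    for k in range(2, n):
        phi = np.array([y[k-1], y[k-2], u[k-1]])
        gain = P @ phi / (lam + phi @ P @ phi)
        theta += gain * (y[k] - phi @ theta)
        P = (P - np.outer(gain, phi @ P)) / lam

    print("estimated parameters:", np.round(theta, 3))  # ~ [1.2, -0.4, 0.5]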
Abstract:
This work uses feature-based computer vision algorithms in the identification of medicine boxes for the visually impaired. The system is aimed at people whose vision is compromised by disease, hindering the identification of the correct medicine to be taken. We use the camera, available in several popular devices such as computers, televisions and phones, to identify the correct medicine box from the image and to provide, through audio, the information about the medication that the user cannot read, such as the dosage, indications and contraindications. We employ an object detection model whose algorithms identify the features on the medicine boxes and play the audio at the moment those features are detected. Experiments carried out with 15 people show that 93% consider the system useful and very helpful in identifying medicines by their boxes. This technology can therefore help many people with visual impairments to take the right medicine, at the time previously indicated by the physician.
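A minimal sketch of the feature-based identification step, assuming OpenCV's ORB detector and a brute-force Hamming matcher: the camera frame is matched against each registered box image and the best-scoring box would trigger the audio description. The file names and the match threshold are placeholders, not the thesis's exact pipeline.

    import cv2

    orb = cv2.ORB_create(nfeatures=500)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def descriptors(path):
        """ORB descriptors of a grayscale image (file must exist)."""
        img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, des = orb.detectAndCompute(img, None)
        return des

    # Hypothetical database of registered medicine boxes.
    database = {name: descriptors(name + ".png") for name in ("med_a", "med_b")}
    frame_des = descriptors("camera_frame.png")

    scores = {}
    for name, des in database.items():
        matches = matcher.match(frame_des, des)
        scores[name] = sum(1 for m in matches if m.distance < 40)  # "good" matches

    best = max(scores, key=scores.get)
    print("identified box:", best)  # here the system would play the audio description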
Abstract:
The traditional perimeter-based approach to computer network security (the castle-and-moat model) hinders the progress of enterprise systems and promotes, in both administrators and users, the delusion that systems are protected. To deal with the new range of threats, a new data-safety-oriented paradigm, called de-perimeterisation, began to be studied in the last decade. One of the requirements for the implementation of the de-perimeterised security model is the definition of a safe and effective mechanism for federated identity. This work seeks to fill this gap by presenting the specification, modelling and implementation of a mechanism for federated identity based on the combination of SAML and X.509 digital certificates stored in smart cards, following the A3 standard of ICP-Brasil (the Brazilian official certificate authority and PKI).
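The full SAML exchange is not reproduced here; as a minimal sketch of the certificate-based proof of identity underlying the mechanism, the snippet below signs a challenge with a private key and verifies it with the corresponding public key. A freshly generated RSA key pair stands in for the key bound to the user's X.509 certificate; on a real ICP-Brasil A3 token the private key never leaves the smart card.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    challenge = b"nonce-from-the-identity-provider"
    signature = key.sign(challenge, padding.PKCS1v15(), hashes.SHA256())

    # The relying party verifies with the public key taken from the user's
    # certificate; verify() raises InvalidSignature on failure.
    key.public_key().verify(signature, challenge, padding.PKCS1v15(), hashes.SHA256())
    print("signature valid")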
Abstract:
The study of precipitation variability is important for the planning of economic activities, enabling a more efficient and rational use of water resources. The objective of this research is therefore to characterize the state of Rio Grande do Norte with respect to the temporal variability of precipitation, to group it into homogeneous regions and to compare different clustering techniques. Two indices were used to study rainfall variability: the Precipitation Concentration Degree (PCD), which represents the degree to which precipitation is distributed throughout the year, and the Precipitation Concentration Period (PCP), which reflects the period in which precipitation is most concentrated. The variables chosen for the clustering were PCD, PCP, mean annual precipitation and mean monthly precipitation. Cluster analysis was then applied to obtain groups with similar characteristics. The results showed that precipitation is better distributed in the eastern region of the state, where the rainiest months run from May to August. The municipalities located in this area have two rainfall peaks, due to the action of two systems: trade-wind easterly wave disturbances (POAs) and the Intertropical Convergence Zone (ZCIT). In the regions located to the west, the months with the highest concentration of rainfall are March and April, with a single precipitation peak, due to the action of the ZCIT alone. The identification of homogeneous areas favors adequate planning according to the characteristics of each group formed, and the state (RN) could be divided into four homogeneous regions. The clustering techniques used presented similar results; nevertheless, the use of more than one technique is suggested, so that it is possible to analyze which of them best reflects the local reality. The study of precipitation variability, through the indices studied and the clustering performed, provides adequate tools for environmental and economic planning.
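A minimal sketch of the two concentration indices used above, following the usual vector-composition definition: each month's rainfall is a vector whose angle is the month's position on the annual circle; PCD is the length of the resultant relative to the annual total and PCP is its angle. The monthly totals below are illustrative values for a single station.

    import numpy as np

    monthly = np.array([30., 45., 120., 150., 90., 60., 40., 20., 10., 5., 10., 20.])

    theta = 2 * np.pi * np.arange(12) / 12        # month angles (January = 0)
    rx = np.sum(monthly * np.sin(theta))
    ry = np.sum(monthly * np.cos(theta))

    pcd = np.hypot(rx, ry) / monthly.sum()        # 0 = uniform, 1 = single month
    pcp = np.degrees(np.arctan2(rx, ry)) % 360    # angle of the concentration period

    print(f"PCD = {pcd:.2f}, PCP = {pcp:.1f} degrees (~month {pcp / 30 + 1:.1f})")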
Abstract:
When the identification of crosscutting concerns is performed from the beginning of development, in the activities involved in requirements engineering, there are many gains in terms of quality, cost and efficiency throughout the software development lifecycle. This early identification supports the evolution of requirements, detects possible flaws in the requirements specification, improves traceability among requirements, provides better software modularity and prevents possible rework. However, despite these advantages, the identification of crosscutting concerns in requirements engineering faces several difficulties, such as the lack of systematization and of tools that support it. Furthermore, it is difficult to justify why some concerns are identified as crosscutting and others are not, since this identification is, most often, made without any methodology that systematizes and grounds it. In this context, this work proposes an approach based on Grounded Theory, called GT4CCI, for systematizing and grounding the process of identifying crosscutting concerns in the requirements document, in the initial stages of the software development process. Grounded Theory is a renowned methodology for the qualitative analysis of data. Through the use of GT4CCI it is possible to better understand, track and document concerns, adding gains in terms of quality, reliability and modularity throughout the software lifecycle.