965 results for CNPQ::ENGENHARIAS::ENGENHARIA MECANICA::PROCESSOS DE FABRICACAO
Abstract:
This graduate thesis proposes a model for the asynchronous replication of heterogeneous databases. The model combines, in a systematic way and within a single project, different concepts, techniques, and paradigms from the areas of database replication and heterogeneous database management. One of the main advantages of replication is that it allows applications to continue processing information during the intervals when they are off the network, and to trigger database synchronization as soon as the network connection is reestablished. Accordingly, the model introduces a communication and update protocol that takes into account the asynchronous characteristics of the target environment. As part of the work, a tool was developed in Java, based on the model's premises, in order to process, test, simulate, and validate the proposed model.
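The offline-then-synchronize behavior described above can be sketched as a store-and-forward replica. This is an illustrative stand-in, not the thesis's actual protocol: the `ReplicaNode` class, the `Change` record, and the in-memory `applied` list (standing in for the remote database) are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Change:
    table: str
    row_id: int
    payload: dict

class ReplicaNode:
    """Queues local writes while offline and flushes them when the link returns."""
    def __init__(self):
        self.online = False
        self.pending = []    # store-and-forward log of offline changes
        self.applied = []    # stand-in for the remote (replicated) database

    def write(self, change):
        if self.online:
            self.applied.append(change)
        else:
            self.pending.append(change)   # keep processing while disconnected

    def reconnect(self):
        """Trigger synchronization as soon as the connection is reestablished."""
        self.online = True
        while self.pending:
            self.applied.append(self.pending.pop(0))

node = ReplicaNode()
node.write(Change("orders", 1, {"qty": 3}))   # queued: node starts offline
node.reconnect()                              # synchronization fires here
print(len(node.applied))                      # → 1
```

A real implementation would also need conflict resolution between replicas, which the sketch deliberately omits.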
Abstract:
Large efforts have been made by the scientific community on tasks involving the locomotion of mobile robots. To execute this kind of task, the robot must be given the ability to navigate through the environment safely, that is, without colliding with objects. This requires strategies that make obstacle detection possible. In this work, we address this problem by proposing a system that collects sensory information and estimates the likelihood of obstacles in the mobile robot's path. A pair of stereo cameras mounted in parallel on a structure coupled to the robot serves as the main sensory device, making it possible to generate a disparity map. Code optimizations and a strategy for data reduction and abstraction are applied to the images, yielding a substantial gain in execution time. This allows the high-level decision processes to perform obstacle avoidance in real time. The system can be employed both when the robot is remotely operated and when it must generate its own trajectories (the autonomous case).
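Once a disparity map is available, depth follows from the standard pinhole stereo relation Z = fB/d, and an obstacle flag can be raised for points closer than a safety distance. A minimal sketch follows; the focal length, baseline, and safety threshold are illustrative assumptions, not values from the thesis.

```python
def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.12):
    """Pinhole stereo relation: Z = f * B / d (depth in metres)."""
    if disparity_px <= 0:
        return float("inf")          # zero disparity -> point at infinity
    return focal_px * baseline_m / disparity_px

def is_obstacle(disparity_px, safety_m=1.0):
    """Flag any point closer than the safety distance as an obstacle."""
    return depth_from_disparity(disparity_px) < safety_m

print(round(depth_from_disparity(84), 2))   # → 1.0
print(is_obstacle(120), is_obstacle(20))    # → True False
```

Note the inverse relation: larger disparities mean nearer points, which is why near obstacles are the easiest to detect reliably.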
Abstract:
This work presents a set of intelligent algorithms whose purpose is to correct calibration errors in sensors and to reduce the required frequency of their calibrations. The algorithms were designed using artificial neural networks, owing to their capacity for learning, adaptation, and function approximation. Two approaches are presented. The first uses multilayer perceptron networks to approximate the successive shapes of the calibration curve of a sensor that drifts out of calibration at different points in time. This approach requires knowledge of the sensor's operating time, but this information is not always available. To overcome this limitation, a second approach using recurrent neural networks was proposed. Recurrent neural networks have a great capacity for learning the dynamics of the system on which they are trained, so they can learn the dynamics of a sensor's decalibration. Knowing either the sensor's operating time or its decalibration dynamics, it is possible to determine how far a sensor has drifted and to correct its measured value, thereby providing a more exact measurement. The algorithms proposed in this work can be implemented in a Foundation Fieldbus industrial network environment, whose function blocks offer good device-programming capabilities, making it possible to apply the algorithms directly to the measurement process.
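The first approach above maps operating time to a drift correction. The sketch below is a drastically simplified stand-in: instead of a multilayer perceptron, it assumes the drift grows linearly with operating time and fits the drift rate by least squares from calibration samples. All data values are invented for illustration.

```python
def fit_drift(samples):
    """Least-squares slope of (reading - truth) vs. operating time, no intercept.
    Toy replacement for the thesis's MLP approximation of the drift curve."""
    num = sum(t * (reading - truth) for t, reading, truth in samples)
    den = sum(t * t for t, _, _ in samples)
    return num / den

def correct(reading, t, k):
    """Undo the estimated drift k*t to recover a more exact measurement."""
    return reading - k * t

# Calibration history: (operating time, raw reading, reference value)
samples = [(10, 101.0, 100.0), (20, 102.0, 100.0), (30, 103.0, 100.0)]
k = fit_drift(samples)                   # → 0.1 drift units per unit time
print(round(correct(104.0, 40, k), 2))   # → 100.0
```

The recurrent-network approach would remove the explicit dependence on operating time `t` by learning the drift dynamics from the measurement sequence itself.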
Abstract:
Industry currently needs to integrate shop-floor data originating from several sources and to transform it into information useful for decision making, so the demand for information-visualization systems supporting this functionality keeps growing. At the same time, given today's highly competitive market, it is common practice to develop industrial systems with characteristics such as modularity, distribution, flexibility, scalability, adaptability, interoperability, reusability, and web access. These characteristics provide extra agility and make it easier to adapt to frequent changes in market demand. Based on these arguments, this work specifies a component-based architecture, together with the development of a system based on that architecture, for the visualization of industrial data. The system was conceived to supply on-line information and, optionally, historical information about shop-floor variables. The work shows that the component-based architecture developed possesses the requirements needed to obtain a robust, reliable, and easily maintained system, in agreement with industrial needs. The architecture also allows components to be added, removed, or updated at run time, through a web-based component manager, further streamlining the process of adapting and updating the system.
Abstract:
In recent years the predictive control technique has won a growing number of adherents, owing to the ease of tuning its parameters, the extension of its concepts to multi-input/multi-output (MIMO) systems, and the fact that nonlinear process models can be linearized around an operating point and used directly in the controller. Above all, it is the only methodology that can take into account, during controller design, the limitations on the control signals and on the process output. The time-varying weighting generalized predictive control (TGPC) studied in this work is one more alternative among the existing predictive controllers. It is a modification of generalized predictive control (GPC) that uses a reference model, computed according to design parameters previously established by the designer, together with a new criterion function whose minimization yields the best controller parameters. Genetic algorithms are used to minimize the proposed criterion function, and the robustness of TGPC is demonstrated through performance, stability, and robustness criteria. The results of the TGPC controller are compared with those of GPC and of proportional-integral-derivative (PID) controllers, with all techniques applied to stable, unstable, and non-minimum-phase plants. The simulated examples were carried out with MATLAB. The results show that the modifications implemented in TGPC confirm the efficiency of the algorithm.
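The genetic-algorithm step, minimizing a criterion function to pick controller parameters, can be sketched as follows. This is a generic real-coded GA under invented settings (population size, tournament selection, blend crossover, Gaussian mutation) and a stand-in quadratic cost, not the thesis's actual criterion function.

```python
import random

def genetic_minimize(cost, bounds, pop_size=30, generations=60, seed=42):
    """Minimal real-coded GA: tournament selection, blend crossover, mutation."""
    random.seed(seed)
    lo, hi = bounds
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)
            parent = a if cost(a) < cost(b) else b          # tournament selection
            mate = min(random.sample(pop, 2), key=cost)
            child = 0.5 * (parent + mate)                   # blend crossover
            child += random.gauss(0, 0.02 * (hi - lo))      # Gaussian mutation
            nxt.append(min(max(child, lo), hi))
        pop = nxt
    return min(pop, key=cost)

# Stand-in criterion function: quadratic cost with its optimum at gain k = 2.0
best = genetic_minimize(lambda k: (k - 2.0) ** 2, bounds=(0.0, 10.0))
```

For the real TGPC design, `cost` would evaluate the time-varying weighted criterion over the prediction horizon, and the chromosome would hold all tunable controller parameters rather than a single scalar.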
Abstract:
Industry is becoming increasingly rigorous about safety, whether to avoid the financial losses caused by accidents and low productivity or to protect the environment. Major accidents around the world involving aircraft and industrial processes (nuclear, petrochemical, and so on) have motivated investment in systems for fault detection and diagnosis (FDD). FDD systems can prevent eventual failures by assisting operators with maintenance and with the replacement of defective equipment. Today, topics involving fault detection, isolation, diagnosis, and fault-tolerant control are gaining strength in both academia and industry. Against this background, this work discusses the importance of techniques that can assist in the development of FDD systems and proposes a hybrid method for FDD in dynamic systems. A brief history is presented to put the techniques used in working environments into context. Fault detection in the proposed system is based on state observers combined with other statistical techniques. The central idea is to use the observer not only as analytical redundancy but also to generate a residual, which is then used for FDD. A signature database assists in identifying system faults: based on signatures derived from trend analysis of the residual signal and of its difference, faults are classified purely with a decision tree. The FDD system is tested and validated on two plants: a simulated coupled-tank plant and a didactic plant with industrial instrumentation. All results collected from these tests are discussed.
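The residual-based detection and signature-based classification steps can be sketched as follows. The observer output, thresholds, fault classes, and decision rules here are all illustrative assumptions; the thesis's decision tree and signature database are certainly richer.

```python
def detect_fault(measurements, estimates, threshold=0.5):
    """Flag samples whose residual |y - y_hat| exceeds the detection threshold."""
    residuals = [y - yh for y, yh in zip(measurements, estimates)]
    return [abs(r) > threshold for r in residuals], residuals

def classify(residual, diff, threshold=0.5):
    """Toy signature rule in the spirit of a decision tree: the residual's
    magnitude and its trend (difference) select the fault class."""
    if abs(residual) < threshold:
        return "no fault"
    return "drift fault" if abs(diff) > 0.05 else "abrupt bias"

# Healthy behaviour until sample 3, then a fault appears in the plant.
y    = [1.0, 1.1, 1.0, 2.4, 2.5]
yhat = [1.0, 1.0, 1.0, 1.0, 1.1]   # stand-in for a state-observer estimate
flags, res = detect_fault(y, yhat)
print(flags)  # → [False, False, False, True, True]
```

The observer supplying `yhat` is what makes this analytical redundancy: no extra physical sensor is needed to obtain the residual.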
Abstract:
This work proposes the specification of a new function block conforming to Foundation Fieldbus standards. The new block implements an artificial neural network, which may be useful in process-control applications. The specification includes the definition of the main algorithm, which implements the neural network, as well as the description of some accessory functions that provide safety characteristics for the block's operation. It also describes the block's attributes, emphasizing its parameters, which constitute the block's interfaces. Experimental results, obtained from an artificial neural network implemented with actual standard function blocks on a laboratory FF network, are also shown, in order to demonstrate both the possibility and the convenience of integrating a neural network into Fieldbus devices.
Abstract:
The evolution of semiconductor technologies allows devices to be developed with ever higher processing capability, so these components are being used in an ever wider range of fields. Many industrial environments, such as oil, mining, automotive, and hospital facilities, now use such devices routinely in their processes. The activities of these industries are directly related to environmental and health safety, so it is very important that these systems offer extra safety features yielding greater reliability, safety, and availability. The eOSI reference model presented in this work is intended to allow systems to be developed from a new perspective, one that improves and simplifies the choice of fault-tolerance strategies. As a way to validate the model, an FPGA-based architecture was developed.
Abstract:
This work presents the development of a new methodology for the verification of analog circuits, providing automated tools that help verification engineers obtain more trustworthy results. The main goal is a more automated verification process for certifying the functional behavior of analog circuits. The proposed methodology is based on the golden-model technique. A verification environment based on this methodology was built, and the results of a case study on the validation of an operational-amplifier design are offered as confirmation of its effectiveness. The results show that the verification process became more trustworthy because of the automation provided by the tool developed.
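The essence of the golden-model technique is to drive the design under test (DUT) and a trusted reference model with identical stimuli and compare their responses within a tolerance. A minimal sketch follows; the gain-10 amplifier models, the stimuli, and the tolerance are invented for illustration.

```python
def verify_against_golden(dut, golden, stimuli, tol=1e-3):
    """Apply the same stimuli to DUT and golden model; collect mismatches."""
    failures = []
    for x in stimuli:
        expected, actual = golden(x), dut(x)
        if abs(expected - actual) > tol:
            failures.append((x, expected, actual))
    return failures

# Golden model: ideal amplifier with gain 10. DUT: offset error above x = 0.4.
golden = lambda v: 10.0 * v
dut    = lambda v: 10.0 * v if v <= 0.4 else 10.0 * v + 0.05
fails = verify_against_golden(dut, golden, [0.1, 0.2, 0.3, 0.5])
print(len(fails))  # → 1
```

In practice the golden model would be a behavioral (e.g. Verilog-AMS or SPICE macro-level) description, and the comparison would run over waveforms rather than single points, but the pass/fail logic is the same.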
Abstract:
The objective of this research is to discuss the need for new alternatives in the implementation of metrological control: in the findings of initial and subsequent measurements, and in the procedures for controlling the measurement uncertainty applied when assessing the losses or surpluses found in bulk-liquid handling operations, when turbine meters are used for fiscal measurement in Petrobras operations, given the current national and international environment of legal and scientific metrology. With these alternatives we aim to standardize the minimization of random and systematic errors and the estimation of the remaining errors, as well as the management of metrological calibration procedures and of measurement-uncertainty control, and to contribute to changing how legal and scientific metrology is practiced, disseminating new information to change the management of metrological control, with an objective focus on supervising these activities in the control of the measurement uncertainties present in the Petrobras fiscal measurement system. Results, information, and comments are presented on the influence of measurement uncertainty on current fiscal and custody-transfer results. This emphasizes the need, among other things, to improve and expand the monitored metrological control, so as to better meet the demand for calibrating Petrobras equipment and measuring instruments. Finally, we intend to establish the need to improve the method of evaluating meter data applied to the current management of measurement-uncertainty control, proposing a methodology to address the problem and highlighting the expected results.
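Controlling measurement uncertainty in fiscal metering typically follows the GUM approach: independent standard uncertainty components are combined in root-sum-of-squares fashion and then multiplied by a coverage factor. The sketch below shows that arithmetic; the three-component turbine-meter budget is an invented example, not data from this work.

```python
import math

def combined_standard_uncertainty(components):
    """GUM-style root-sum-of-squares of independent standard uncertainties."""
    return math.sqrt(sum(u * u for u in components))

def expanded_uncertainty(components, k=2.0):
    """Expanded uncertainty U = k * u_c (k = 2 gives roughly 95 % coverage)."""
    return k * combined_standard_uncertainty(components)

# Illustrative turbine-meter budget: repeatability, calibration, temperature,
# all expressed as percent of reading and assumed independent.
u = [0.03, 0.04, 0.00]
print(round(combined_standard_uncertainty(u), 6))  # → 0.05
print(round(expanded_uncertainty(u), 6))           # → 0.1
```

Correlated components would require covariance terms in the sum, which is precisely the kind of refinement a stricter uncertainty-management procedure has to track.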
Abstract:
This work describes the design, implementation, and deployment of a system for industrial process control based on fuzzy logic, developed in Java, with support for industrial communication through the OPC (OLE for Process Control) protocol. Apart from the Java framework, the software is completely platform-independent. It provides friendly and functional tools for modeling, building, and editing complex fuzzy inference systems, and uses these systems to control a wide variety of industrial processes. The main requirements of the developed system were flexibility, robustness, reliability, and ease of expansion.
Abstract:
Nowadays, with market competition demanding better product quality and a constant search for cost savings and better use of raw materials, research into more efficient control strategies becomes vital. In natural gas processing units (NGPUs), as in most chemical processes, quality control is accomplished through product composition. However, chemical composition analysis has a long measurement time, even when performed by instruments such as gas chromatographs, which hinders the development of control strategies that could provide better process yield. Natural gas processing is one of the most important activities in the petroleum industry, and the main economic product of an NGPU is liquefied petroleum gas (LPG). LPG is ideally composed of propane and butane; in practice, however, its composition contains contaminants such as ethane and pentane. This work proposes an inferential system that uses neural networks to estimate the ethane and pentane mole fractions in the LPG and the propane mole fraction in the residual gas. The goal is to provide values for these estimated variables every minute using a single multilayer neural network, making it possible to apply inferential control techniques in order to monitor LPG quality and reduce the propane loss in the process. To develop this work, an NGPU composed of two distillation columns, a deethanizer and a debutanizer, was simulated in the HYSYS® software. The inference is performed from the process variables of the PID controllers present in the instrumentation of these columns. To reduce the complexity of the inferential neural network, the statistical technique of principal component analysis is used to decrease the number of network inputs, thus forming a hybrid inferential system. The work also proposes a simple strategy for correcting the inferential system in real time, based on measurements from whatever chromatographs may exist in the process under study.
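The input-reduction stage of such a hybrid system projects correlated process variables onto their leading principal components before they reach the network. A pure-Python sketch of extracting the first component by power iteration follows; the two-variable data set and the omission of the downstream neural network are simplifications for illustration.

```python
def first_principal_component(rows, iters=100):
    """Dominant PCA direction via power iteration on X^T X / n,
    applied without forming the covariance matrix explicitly."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]   # center the data
    v = [1.0] * d
    for _ in range(iters):
        proj = [sum(xi[j] * v[j] for j in range(d)) for xi in x]
        w = [sum(proj[i] * x[i][j] for i in range(n)) / n for j in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]                             # re-normalize
    return means, v

def reduce_inputs(row, means, v):
    """Project one sample onto the first principal component (its score)."""
    return sum((row[j] - means[j]) * v[j] for j in range(len(row)))

# Two strongly correlated 'process variables' (e.g. two column temperatures):
data = [[1.0, 2.0], [2.0, 4.1], [3.0, 5.9], [4.0, 8.0]]
means, v = first_principal_component(data)
score = reduce_inputs([2.0, 4.0], means, v)   # single input for the network
```

In the full system, a handful of such scores, rather than every PID process variable, would feed the multilayer network that estimates the mole fractions.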
Abstract:
Hospital automation is an area in constant growth. The emergence of new technologies and hardware is making hospital processes more efficient. Nevertheless, some of these processes are still performed manually, such as the monitoring of patients, which is considered critical because it involves human lives. One factor that must be taken into account during monitoring is how quickly any abnormality in a patient's vital signs is detected, and how quickly the medical team involved is warned of the anomaly. This master's thesis therefore aims to develop an architecture that automates this process of monitoring and of reporting possible alerts to a professional, so that emergency care can be delivered effectively. Mobile computing was used to improve communication, distributing messages between a central server located in the hospital and the mobile devices carried by the staff on duty.
Abstract:
The main purpose of this work is to develop an environment that allows the HYSYS® chemical process simulator to communicate with sensors and actuators on a Foundation Fieldbus industrial network. The environment is considered a hybrid resource, since it has a real portion (the industrial network) and a simulated one (the process), with all measurement and control signals also real. It can reproduce the dynamics of different industrial processes without requiring any physical modification of the network, enabling the simulation of situations that exist in a real industrial environment; this attests to the environment's flexibility. In this work, a distillation column is simulated in HYSYS® with all its variables measured and controlled by Foundation Fieldbus devices.
Abstract:
Water scarcity becomes a more serious problem every day and directly affects global society. Studies aim to raise awareness of the rational use of this natural asset, which is essential to our survival. Only 0.007% of the water available in the world is easily accessible and fit for human consumption; it is found in rivers, lakes, and similar sources. To make better use of the water consumed in homes and small businesses, reuse projects are often implemented, resulting in savings for the customers of water utilities. Reuse projects involve several areas of engineering, such as environmental, chemical, electrical, and computer engineering. The last two are responsible for controlling the process, which aims to make gray water (soapy water) and clear blue water (rainwater) suitable for consumption, or for uses such as watering gardens and flushing toilets. Water has several characteristics that must be taken into consideration in a reuse process, among them turbidity, temperature, electrical conductivity, and pH. This document proposes controlling the pH (hydrogen potential) with a microcontroller, using fuzzy logic as the control strategy. The controller was developed with the fuzzy toolbox of MATLAB®.
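A fuzzy pH controller of this kind fuzzifies the measured pH with membership functions, fires a small rule base, and defuzzifies into a dosing command. The zero-order Sugeno sketch below illustrates the mechanism in Python rather than the MATLAB® fuzzy toolbox; the membership shapes, rule base, and dosing outputs are invented for illustration.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ph_control(ph):
    """Zero-order Sugeno fuzzy controller (illustrative rule base):
    acidic -> dose base (+1), neutral -> no dosing (0), alkaline -> dose acid (-1)."""
    acidic   = tri(ph, 0.0, 4.0, 7.0)
    neutral  = tri(ph, 6.0, 7.0, 8.0)
    alkaline = tri(ph, 7.0, 10.0, 14.0)
    weights  = [acidic, neutral, alkaline]
    outputs  = [1.0, 0.0, -1.0]              # dosing command per rule
    total = sum(weights)
    if total == 0.0:
        return 0.0                           # no rule fired: hold dosing
    return sum(w * o for w, o in zip(weights, outputs)) / total

print(ph_control(7.0))        # → 0.0
print(ph_control(4.0) > 0.5)  # → True
```

Because neighboring membership functions overlap, the dosing command varies smoothly with pH, which is exactly why fuzzy control suits a nonlinear variable such as pH better than a simple on/off rule.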