Abstract:
Electrical Submersible Pumping (ESP) is an artificial lift method for oil wells employed both onshore and offshore. The economic revenue of petroleum production in a well depends on the oil flow rate and on the availability of the lifting equipment: the fewer the failures, the lower the revenue shortfall and the repair costs. The frequency with which failures occur depends on the operating conditions to which the pumps are subjected. In high-productivity offshore wells, monitoring is performed by operators with engineering support 24 hours a day, which is not economically viable for onshore fields. In this context, the automation of onshore wells has clear economic advantages. This work proposes a system capable of automatically controlling the operation of electrical submersible pumps installed in oil wells by adjusting the electric motor rotation speed based on signals provided by sensors installed at the surface and downhole, keeping the pump operating within the recommended range, as close as possible to the well's potential. Techniques are developed to estimate unmeasured variables, enabling the automation of wells that do not have all the required sensors. The automatic adjustment, performed by an algorithm running on a programmable logic controller, maintains the flow rate and submergence within acceptable limits, avoiding undesirable operating conditions such as gas interference and high motor temperature without resorting to stopping the motor, which would reduce its useful life. The control strategy, based on the modeling of the physical phenomena and on operational experience reported in the literature, is implemented as a rule-based fuzzy controller, and all generated information can be followed through a supervisory system.
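The rule-based fuzzy adjustment described above can be illustrated with a minimal sketch. This is not the thesis's controller: the membership functions, the normalized submergence error and the 1 Hz frequency step are illustrative assumptions, and defuzzification is done over singleton outputs (a Sugeno-style simplification of centroid defuzzification).

```python
# Minimal sketch of a rule-based fuzzy frequency adjustment, assuming
# triangular membership functions over a normalized submergence error
# (setpoint minus measured submergence, scaled to [-1, 1]) and singleton
# output defuzzification. Ranges and rule table are illustrative.

def tri(x, a, b, c):
    """Triangular membership function with vertices a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_frequency_step(error):
    """Map a normalized submergence error to a motor-frequency step in Hz."""
    # Rule base: low submergence -> slow down; on target -> hold;
    # high submergence -> speed up (avoids stopping the motor entirely).
    rules = [
        (tri(error, -1.5, -1.0, 0.0), -1.0),  # "low": decrease 1 Hz
        (tri(error, -1.0,  0.0, 1.0),  0.0),  # "on target": hold
        (tri(error,  0.0,  1.0, 1.5), +1.0),  # "high": increase 1 Hz
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

Because adjacent memberships overlap, the output varies smoothly with the error, which is what allows the controller to keep the pump in range without on/off cycling.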
Abstract:
Human populations have a significant number of polymorphic loci, whose uses and applications range from the construction of linkage maps and the study of population evolution to paternity determination, forensic medicine and migration studies. Currently, STR (Short Tandem Repeat) markers are considered the principal markers for human identification, mainly because of their abundance and high variability, and because they are easily amplified by PCR (Polymerase Chain Reaction), work with low amounts of DNA and are amenable to automated processes involving fluorescence detection. The creation of regional databases containing the allele frequencies of a population provides support for increasing the reliability of the results of genetic-link determination. This work aims to build a database of allele frequencies of 15 polymorphic molecular loci (D8S1179, D21S11, D7S820, CSF1PO, D19S433, vWA, TPOX, D18S51, D3S1358, TH01, D13S317, D16S539, D2S1338, D5S818 and FGA) in a population classified as born in the State of Rio Grande do Norte, Brazil, totaling 1100 unrelated individuals. To evaluate the frequencies, DNA samples were submitted to PCR amplification, followed by capillary electrophoresis in a genetic sequencer. The frequencies identified in this study were compared with those of the Brazilian population in general and of other Brazilian states. Except for the loci D21S11, D19S433 and D2S1338, the genotypes found were in Hardy-Weinberg equilibrium, and no significant differences among the frequencies were found in the populations studied. The most informative loci were D2S1338 and D18S51, and the least informative was TPOX.
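The Hardy-Weinberg check mentioned above compares observed genotype counts with the counts expected from the estimated allele frequencies. A hedged sketch for a single biallelic locus follows; real STR loci are multiallelic, but the same chi-square idea extends over all genotype classes, and the counts below are illustrative.

```python
# Hedged sketch: chi-square statistic for Hardy-Weinberg equilibrium at
# one biallelic locus (alleles A and a). The thesis tests multiallelic
# STR loci; this biallelic version only shows the principle.

def hwe_chi_square(n_AA, n_Aa, n_aa):
    """Return (chi2, p_A) for observed genotype counts at one locus."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)          # estimated frequency of A
    q = 1.0 - p
    expected = (p * p * n, 2 * p * q * n, q * q * n)
    observed = (n_AA, n_Aa, n_aa)
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    return chi2, p
```

A sample exactly at equilibrium (e.g. 25 AA, 50 Aa, 25 aa) yields a chi-square of zero; large values flag deviation from equilibrium, as found for three loci in this study.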
Abstract:
The conceptions of the judicial function, of the process and of the factors that legitimize the norm of decision change according to the model of State (liberal, social democratic and constitutional). The right of access to justice likewise follows the ideals present in the constitutional movements experienced at different historical moments. The legitimacy deficit of the judiciary is a recurring subject of study in the doctrine, especially in the face of the democratic standards that permeate the current paradigm of the State. Under procedural law, the essential element distinguishing states based on the rule of law (formal and material) from the democratic constitutional state lies in the democratic guarantee that the litigants participate in the process of elaborating the norm of decision. The concern with participatory democracy and the realization of fundamental rights presupposes the conception of popular sovereignty. With this in mind, civil procedure cannot remain oblivious to such considerations, especially when it justifies its constitutional conformation on the institutionalization of discourse within the procedural field (democratic principle) and on the democratization of access to justice, leading to the necessary contestation of the theory of the instrumentality of the process. The democratic prospects of civil procedure and the concern with the legitimacy of the norm of decision cannot be separated from the analysis of the judicial function and of the elements that influence the progress of the legal suit. The computerization of the judicial process entails extensive modification in the way the judicial function is exercised, in view of the automation of activities, the elimination of bureaucratic, manual and repetitive tasks, and the streamlining of the procedure.
The objective of this study is to analyze the dogmatic and practical changes resulting from the implementation of the Judicial Electronic Process (JEP), prepared by the National Council of Justice, under the parameters of procedural discourse and democratic access to justice. Two hypotheses are raised which, based on bibliographic-documentary, applied and exploratory research, are tested dialectically. The expansion of the publicity of procedural acts and the facilitation of their communication and practice are elements that contribute to the effective participation of the recipients of the norm of decision in its making and, therefore, to the democratic principle in the procedural field. Ensuring the parties' access to the case files and the reasonable duration of the process, along with the preservation of its founding principles (the adversarial principle, legal defense and isonomy), is essential to democratic access to justice within the virtual system.
Abstract:
In this study the objective is to implement Balanced Scorecard management for the development of a Strategic Map to support decision-making in the operations management of a medical-hospital care unit. The present work presents a case study developed at a private hospital in the State of Rio Grande do Norte. Data collection was carried out after the review of the literature, which provided the evaluation criteria used by the unit. The work concludes with the proposal of a strategic map that raises the return on investment (financial perspective) in terms of profitability and growth; pursues customer satisfaction (customer perspective), which already exists within the unit under study and needs only to be organized and aligned with the executive board and the other collaborators; addresses the requirements of competitiveness, information, innovation and technology (internal processes perspective), which were indispensable to eliminate rework and waste and to improve automation; and, finally, covers investment in and development of innovation mechanisms, which extend important competitive advantages in the value-creation processes through skills, attitude and knowledge (learning and growth perspective). As one of the results of this study, a strategic map based on the Balanced Scorecard was developed to support decision-making in the operations management of a medical-hospital care unit.
Abstract:
In this thesis, the robustness and stability analysis of a variable structure model reference adaptive controller is developed, considering the presence of disturbances and unmodeled dynamics. The controller is applied to uncertain, monovariable, linear time-invariant plants with relative degree one, and its development is based on indirect adaptive control. In the direct approach, well known in the literature, the switching laws are designed for the controller parameters. In the indirect one, they are designed for the plant parameters; thus, the selection of the relay upper bounds becomes more intuitive, since they are related to physical parameters whose uncertainties can be more easily bounded, such as resistances, capacitances, moments of inertia and friction coefficients. Two versions of the controller algorithm are presented together with their stability analysis. Global asymptotic stability with respect to a compact set is guaranteed in both cases. Simulation results under adverse operating conditions are presented in order to verify the theoretical results and to show the performance and robustness of the proposed controller. Moreover, for practical purposes, some simplifications of the original algorithm are developed.
Abstract:
In order to guarantee database consistency, a database system should synchronize the operations of concurrent transactions. The database component responsible for such synchronization is the scheduler. A scheduler synchronizes operations belonging to different transactions by means of concurrency control protocols, which may present different behaviors: in general, a scheduler's behavior can be classified as aggressive or conservative. This work presents the Intelligent Transaction Scheduler (ITS), which is able to synchronize the execution of concurrent transactions in an adaptive manner. This scheduler adapts its behavior (aggressive or conservative) to the characteristics of the computing environment in which it is inserted, using an expert system based on fuzzy logic. The ITS can implement different correctness criteria, such as conventional (syntactic) serializability and semantic serializability. In order to evaluate the performance of the ITS in relation to other schedulers with exclusively aggressive or conservative behavior, it was applied in a dynamic environment, a Mobile Database Community (MDBC). An MDBC simulator was developed and many sets of tests were run. The experimental results presented herein demonstrate the efficiency of the ITS in synchronizing transactions in a dynamic environment.
Abstract:
The objective of this thesis is to propose a method for a mobile robot to build a hybrid map of an indoor, semi-structured environment. The topological part of this map deals with the spatial relationships among rooms and corridors. It is a topology-based map in which the nodes of the graph are rooms or corridors, and each link between two distinct nodes represents a door. The metric part of the map consists of a set of parameters describing a geometric figure that adapts to the free space of the local environment. This figure is calculated from a set of points that sample the boundaries of the local free space; these points are obtained with range sensors and with knowledge of the robot's pose. A method based on the generalized Hough transform is applied to this set of points in order to obtain the geometric figure. The building of the hybrid map is an incremental procedure, accomplished while the robot explores the environment. Each room is associated with a local metric map and, consequently, with a node of the topological map. During the mapping procedure, the robot may use recent metric information about the environment to improve its global or relative pose.
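The Hough-based fitting step can be illustrated with the simplest member of that family: a standard line Hough transform over boundary points sampled by range sensors. The generalized Hough transform used in the thesis votes for the parameters of a whole figure rather than a single line, but the voting principle is the same; the discretization steps and the simulated wall below are illustrative assumptions.

```python
import math
from collections import Counter

# Illustrative sketch of Hough voting on boundary points: each point
# votes for every line (theta, rho) passing through it, and strong
# peaks in the accumulator reveal the environment's straight walls.

def hough_lines(points, rho_step=1.0, theta_steps=180):
    """Accumulate votes in the discretized (theta, rho) parameter space."""
    acc = Counter()
    for x, y in points:
        for t in range(theta_steps):
            theta = math.pi * t / theta_steps
            rho = x * math.cos(theta) + y * math.sin(theta)
            acc[(t, round(rho / rho_step))] += 1
    return acc

# A vertical wall x = 5 sampled at ten heights should win the vote
# with rho ~ 5 (distance of the wall from the origin).
wall = [(5.0, float(y)) for y in range(10)]
(best_t, best_rho), votes = hough_lines(wall).most_common(1)[0]
```

In the thesis's setting the accumulator peak would instead select the parameters of the figure that best matches the sampled free-space boundary.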
Abstract:
This work describes the study and implementation of vector speed control for a three-phase bearingless induction machine with a divided winding, 4 poles and 1.1 kW, using neural rotor flux estimation. The vector speed control operates together with the radial positioning controllers and with the current controllers of the stator phase windings. For radial positioning, the forces produced by the internal magnetic fields of the machine are used. To optimize the radial forces, a special rotor winding with independent circuits, which allows a low influence on the rotational torque, was used. The neural flux estimation applied to the vector speed control aims to compensate for the parameter dependence of conventional estimators with respect to machine parameter variations caused by temperature increases or rotor magnetic saturation. The implemented control system allows a direct comparison between the responses of the speed and radial positioning controllers when the machine is oriented by the neural rotor flux estimator and when it is oriented by the conventional flux estimator. The entire control system is executed by a program developed in ANSI C. The DSP resources used by the system are the analog-to-digital converter channels, the PWM outputs and the parallel and RS-232 serial interfaces, which are responsible, respectively, for the DSP programming and for data capture by the supervisory system.
Abstract:
The progressing cavity pump (PCP) artificial lift system is one of the main lift systems used in the oil production industry. As the application of this artificial lift method grows, knowledge of its dynamic behavior, the application of automatic control and the development of expert systems for equipment selection become more useful. This work presents tools for dynamic analysis, control techniques and an expert system for selecting lift equipment for this artificial lift technology. The PCP artificial lift system consists of a progressing cavity pump installed downhole at the end of the production tubing. The pump consists of two parts, a stator and a rotor, and is set in motion by the rotation of the rotor, transmitted through a rod string installed in the tubing; the surface equipment generates and transmits the rotation to the rod string. First, the development of a complete mathematical dynamic model of the PCP system is presented. This model is simplified for use under several conditions, including steady state for sizing PCP equipment such as the pump, the rod string and the drive head, and it is used to implement a computer simulator able to support system analysis and to behave as a virtual well with a controller, allowing control algorithms to be tested and developed. Next, control techniques are applied to the PCP system to optimize the pumping velocity so as to achieve productivity and durability of the downhole components. The mathematical model is linearized in order to apply conventional control techniques, including the analysis of the observability and controllability of the system and the development of design rules for a PI controller. Stability conditions are stated for the operating point of the system. A fuzzy rule-based control system is developed from the PI controller using an inference machine based on Mamdani operators, and fuzzy logic is also applied to develop the expert system that selects PCP equipment. The developed simulation techniques and the linearized model were used in an actual well where a control system was installed.
This control system consists of a pump intake pressure sensor, an industrial controller and a variable speed drive. The PI and fuzzy controllers were applied to optimize the operation of the simulated and actual wells, and the results were compared. The simulated and actual open-loop responses were also compared to validate the simulation, and a case study was carried out to validate the equipment selection expert system.
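The PI loop described above (intake pressure sensor, controller, variable speed drive) can be sketched as a discrete PI controller closed around a toy plant. The gains, time step and the first-order well/pump model below are illustrative assumptions, not the linearized model of the thesis; the sketch only shows the structure of the loop.

```python
# Hedged sketch of a discrete PI loop: the controller drives the pump
# speed from the intake-pressure error. Gains and the first-order
# plant model are illustrative, not taken from the thesis.

def pi_controller(kp, ki, dt):
    """Return a stateful PI control function u = f(error)."""
    integral = 0.0
    def step(error):
        nonlocal integral
        integral += error * dt
        return kp * error + ki * integral
    return step

def simulate(setpoint=50.0, steps=300, dt=0.1):
    """Toy first-order well: intake pressure falls as pump speed rises."""
    pressure, speed = 100.0, 0.0
    control = pi_controller(kp=2.0, ki=1.0, dt=dt)
    for _ in range(steps):
        speed = control(pressure - setpoint)   # error > 0 -> speed up
        pressure += (-0.2 * speed + 0.05 * (100.0 - pressure)) * dt
    return pressure
```

The integral term removes the steady-state error, so the pressure settles at the setpoint; the fuzzy controller of the thesis replaces the fixed-gain law while keeping the same loop structure.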
Abstract:
The use of maps obtained from remotely sensed orbital images submitted to digital processing has become fundamental to optimizing the conservation and monitoring of coral reefs. However, the accuracy reached in the mapping of submerged areas is limited by the variation of the water column, which degrades the signal received by the orbital sensor and introduces errors in the final result of the classification. The limited capacity of traditional methods, based on conventional statistical techniques, to solve the problems related to inter-class confusion motivated the search for alternative strategies in the area of Computational Intelligence. In this work, an ensemble of classifiers was built, based on the combination of Support Vector Machines and a Minimum Distance Classifier, with the objective of classifying remotely sensed images of a coral reef ecosystem. The system is composed of three stages, through which the progressive refinement of the classification process takes place: patterns that receive an ambiguous classification at a certain stage are re-evaluated in the subsequent stage, and an unambiguous prediction for all the data is reached through the reduction or elimination of false positives. The images were classified into five bottom types: deep water, underwater corals, intertidal corals, algal bottom and sandy bottom. The highest overall accuracy (89%) was obtained with an SVM with a polynomial kernel. Using an error matrix, the accuracy of the classified image was compared with the results obtained by other classification methods based on a single classifier (a neural network and the k-means algorithm). Finally, the comparison of the results demonstrated the potential of ensembles of classifiers as a tool for classifying images of submerged areas subject to the noise caused by atmospheric effects and the water column.
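The progressive-refinement idea (classify only when unambiguous, defer the rest) can be sketched with one stage built on a minimum-distance classifier. The centroids, the margin threshold and the margin-based ambiguity test are illustrative assumptions; the thesis's stages also involve SVMs, which are omitted here to keep the sketch dependency-free.

```python
import math

# Sketch of one stage of a cascaded ensemble: a pattern is labeled only
# when the two nearest class centroids are separated by a clear margin;
# otherwise the decision is deferred (returned as None) so a later
# stage can re-evaluate it. Centroids and margin are illustrative.

def min_distance_stage(x, centroids, margin=0.5):
    """Return a class label, or None if the decision is ambiguous."""
    dists = sorted(
        (math.dist(x, c), label) for label, c in centroids.items()
    )
    (d1, best), (d2, _) = dists[0], dists[1]
    return best if d2 - d1 >= margin else None

# Two illustrative bottom-type centroids in a 2-D feature space.
centroids = {"sand": (0.0, 0.0), "coral": (3.0, 3.0)}
```

Patterns returned as None would be passed to the next stage of the cascade, which is how the system reduces false positives stage by stage.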
Abstract:
Skin cancer is the most common of all cancers, and the increase in its incidence is due, in part, to people's behavior regarding sun exposure. In Brazil, non-melanoma skin cancer is the most incident in the majority of the regions. Dermatoscopy and videodermatoscopy are the main types of examination for the diagnosis of dermatological illnesses of the skin. The field involving the use of computational tools to help or support medical diagnosis of dermatological injuries is still very recent, and some methods have been proposed for the automatic classification of skin pathologies using images. The present work presents a new intelligent methodology for the analysis and classification of skin cancer images, based on digital image processing techniques for the extraction of color, shape and texture characteristics, using the Wavelet Packet Transform (WPT) and the learning technique called Support Vector Machine (SVM). The Wavelet Packet Transform is applied to extract texture characteristics from the images; it consists of a set of basis functions that represent the image in different frequency bands, each one with a distinct resolution corresponding to each scale. Moreover, the color characteristics of the injury, which depend on the visual context and are influenced by the colors existing in its surroundings, are also computed, and the shape attributes are obtained through Fourier descriptors. The Support Vector Machine, which is based on the structural risk minimization principle coming from statistical learning theory, is used for the classification task. The SVM constructs optimal hyperplanes that represent the separation between classes; the generated hyperplane is determined by a subset of the training patterns, called support vectors.
For the database used in this work, the results revealed a good performance, with an overall accuracy of 92.73% for melanoma and 86% for non-melanoma and benign injuries. The extracted descriptors and the SVM classifier constitute a method capable of recognizing and classifying the analyzed skin injuries.
Abstract:
Post-dispatch analysis of the signals obtained from digital disturbance recorders provides important information to identify and classify disturbances in power systems, aiming at a more efficient management of the supply. Techniques of digital signal processing can help enhance the task of identifying and classifying these disturbances, providing an automatic assessment. The Wavelet Transform has become a very efficient tool for the analysis of voltage or current signals obtained immediately after the occurrence of disturbances in the network. This work presents a methodology based on the Discrete Wavelet Transform to implement this process. It compares the energy distribution curves of signals with and without disturbances, at the different resolution levels of the decomposition, in order to obtain descriptors that permit their classification using artificial neural networks.
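The per-level energy descriptor described above can be sketched with a Haar DWT. The abstract does not fix the mother wavelet; Haar is assumed here only because it keeps the example dependency-free, and the number of levels is illustrative.

```python
# Sketch of the energy-distribution descriptor: decompose the signal
# with a Haar DWT and record the energy of the detail coefficients at
# each resolution level. Disturbed and clean signals are then compared
# through these per-level energy curves.

def haar_dwt_energies(signal, levels=3):
    """Return the detail-coefficient energy at each resolution level."""
    approx, energies = list(signal), []
    for _ in range(levels):
        pairs = zip(approx[0::2], approx[1::2])
        approx, details = [], []
        for a, b in pairs:
            approx.append((a + b) / 2 ** 0.5)   # low-pass half
            details.append((a - b) / 2 ** 0.5)  # high-pass half
        energies.append(sum(d * d for d in details))
    return energies
```

A smooth signal concentrates its energy in the approximation, while a transient disturbance injects energy into the detail levels, which is what makes these curves usable as classification descriptors.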
Abstract:
In this work, we propose a solution to the scalability problem found in large-scale collaborative virtual and mixed reality environments that use the hierarchical client-server model. Basically, we use a hierarchy of servers: when the capacity of a server is reached, a new server is created as a child of the first one, and the system load is distributed between them (parent and child). We propose efficient tools and techniques for solving problems inherent to the client-server model, such as the definition of clusters of users, the distribution and redistribution of users among the servers, and some mixing and filtering operations that are necessary to reduce the flow between servers. The new model was tested in simulation, in emulation and in interactive applications that were implemented. The results of these experiments show improvements over the traditional models, indicating the applicability of the proposed model to all-to-all communication problems. This is the case of interactive games and other applications intended for the Internet (including multi-user environments) and of interactive applications of the Brazilian Digital Television System, to be developed by the research group. Keywords: large-scale virtual environments, interactive digital tv, distributed
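The splitting rule described above (spawn a child server at capacity and divide the load) can be sketched as a toy server tree. The capacity value and the even-split policy are illustrative assumptions; the thesis additionally clusters users and redistributes them to minimize inter-server flow.

```python
# Toy sketch of the hierarchical load-splitting rule: when a server
# reaches capacity, a child server is created and the users are divided
# between parent and child. Capacity and the naive even split are
# illustrative; the real system clusters users before redistributing.

class Server:
    def __init__(self, capacity=4):
        self.capacity = capacity
        self.users = []
        self.children = []

    def add_user(self, user):
        if len(self.users) < self.capacity:
            self.users.append(user)
            return
        child = Server(self.capacity)          # spawn a child server
        half = len(self.users) // 2
        child.users, self.users = self.users[half:], self.users[:half]
        self.children.append(child)
        self.users.append(user)                # newcomer stays on parent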
Abstract:
The so-called Dual Mode Adaptive Robust Control (DMARC) is proposed. The DMARC is a control strategy that interpolates between Model Reference Adaptive Control (MRAC) and Variable Structure Model Reference Adaptive Control (VS-MRAC). The main idea is to combine the transient performance advantages of the VS-MRAC controller with the smooth steady-state control signal of the MRAC controller. Two basic algorithms are developed for the DMARC controller. In the first, the controller's adjustment is made in real time through the variation of a parameter in the adaptation law. In the second, the control law is generated using fuzzy logic with the Takagi-Sugeno model, to obtain a combination of the MRAC and VS-MRAC control laws. In both cases, the combined control structure is shown to be robust to parametric uncertainties and external disturbances, with a fast transient response, practically without oscillations, and a smooth steady-state control signal.
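The second algorithm's idea can be sketched as a Takagi-Sugeno-style blend of the two control laws, weighted by the tracking-error magnitude. The membership shape, the thresholds and the specific blend below are illustrative assumptions, not the thesis's design.

```python
# Hedged sketch of a Takagi-Sugeno blend of two control laws:
# u = (1 - w) * u_MRAC + w * u_VS-MRAC, where the weight w grows with
# the tracking-error magnitude, so VS-MRAC dominates during transients
# and MRAC in steady state. Thresholds are illustrative.

def dmarc_law(u_mrac, u_vsmrac, error, e_small=0.05, e_large=0.5):
    """Blend the two control laws by tracking-error magnitude."""
    e = abs(error)
    if e <= e_small:
        w = 0.0                       # steady state: pure MRAC
    elif e >= e_large:
        w = 1.0                       # large transient: pure VS-MRAC
    else:
        w = (e - e_small) / (e_large - e_small)   # linear transition
    return (1.0 - w) * u_mrac + w * u_vsmrac
```

Because the weight varies continuously, the combined law inherits the fast VS-MRAC transient without the chattering that a hard switch between the two laws would introduce.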
Abstract:
This work addresses issues related to the analysis and development of multivariable predictive controllers based on bilinear multi-models. Monovariable and multivariable linear Generalized Predictive Control (GPC) is reviewed, highlighting its properties, key features and industrial applications. Bilinear GPC, the basis for the development of this thesis, is presented through the time-step quasilinearization approach. Some results are presented for this controller in order to show its better performance when compared with linear GPC, since bilinear models represent the dynamics of certain processes more accurately. Time-step quasilinearization, because it is an approximation, causes a prediction error that limits the performance of this controller as the prediction horizon increases. To minimize this error, bilinear GPC with iterative compensation is presented, seeking a better performance than the classic bilinear GPC, and results of the iterative compensation algorithm are shown. The use of multi-models is discussed in this thesis in order to correct the deficiency of controllers based on a single model when they are applied to cases with large operating ranges. Methods of measuring the distance between models, also called metrics, are the main contribution of this thesis. Several applications to simulated distillation columns, which closely reproduce the behavior of actual columns, are carried out, and the results were satisfactory.
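The role such a model metric plays can be illustrated with one simple, generic choice: comparing the finite-horizon step responses of two models. This is not necessarily any of the metrics proposed in the thesis; the first-order discrete models and the Euclidean norm below are illustrative assumptions that only show how a multi-model scheme could rank candidate models by distance.

```python
# Illustrative sketch of a distance between two models: compare their
# step responses over a finite horizon. The models here are first-order
# discrete systems y(k+1) = a*y(k) + b*u(k); the thesis's metrics for
# bilinear multi-models may differ.

def step_response(a, b, n=50):
    """Response to a unit step input, starting from y(0) = 0."""
    y, out = 0.0, []
    for _ in range(n):
        y = a * y + b * 1.0
        out.append(y)
    return out

def model_distance(m1, m2, n=50):
    """Euclidean distance between finite-horizon step responses."""
    r1, r2 = step_response(*m1, n), step_response(*m2, n)
    return sum((p - q) ** 2 for p, q in zip(r1, r2)) ** 0.5
```

In a multi-model controller, such a distance would select, for the current operating point, the local model closest to the observed behavior.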