69 results for "Reactive power planning in decentralized multi-area systems" (Planejamento de reativos em sistemas descentralizados multi-áreas)
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
Equipment maintenance is the largest cost factor in industrial plants, which makes the development of fault-prediction techniques very important. Three-phase induction motors are key electrical equipment in industrial applications, mainly because of their low cost and robustness; nevertheless, they are not immune to faults such as shorted windings and broken rotor bars. Several acquisition, processing, and signal-analysis approaches are applied to improve their diagnosis, and the most effective techniques rely on current sensors and current-signature analysis. In this dissertation, starting from these sensors, the signals are analyzed through Park's vector, which provides good visualization capability. Because acquiring fault data is an arduous task, a methodology for building a fault database is developed. Park's transform in the stationary reference frame is applied to model the machine and solve its differential equations. Fault detection requires a detailed analysis of the variables and their influences, which makes diagnosis more complex. Pattern-recognition techniques allow systems to be generated automatically from patterns and data concepts, in most cases undetectable by specialists, supporting decision-making. Classification algorithms with diverse learning paradigms, namely k-Nearest Neighbors, Neural Networks, Decision Trees, and Naïve Bayes, are used to recognize machine fault patterns. Multi-classifier systems are used to reduce classification errors; the homogeneous algorithms Bagging and Boosting and the heterogeneous algorithms Vote, Stacking, and StackingC are investigated. The results show the effectiveness of the constructed model for fault modeling, as well as the feasibility of using multi-classifier algorithms for fault classification.
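As a companion to the current-signature analysis described above, the sketch below computes the Park's vector components from sampled three-phase stator currents. It assumes the common magnitude-invariant form of the transform and synthetic balanced currents for illustration; the dissertation's exact conventions and data are not reproduced here.

```python
import numpy as np

def parks_vector(ia: np.ndarray, ib: np.ndarray, ic: np.ndarray):
    """Compute the Park's vector components (id, iq) from three-phase currents.

    Uses the form commonly adopted in Park's Vector Approach fault diagnosis:
    a healthy, balanced machine traces a circle in the (id, iq) plane, while
    stator faults distort that locus into an ellipse.
    """
    i_d = np.sqrt(2.0 / 3.0) * ia - (1.0 / np.sqrt(6.0)) * ib - (1.0 / np.sqrt(6.0)) * ic
    i_q = (1.0 / np.sqrt(2.0)) * ib - (1.0 / np.sqrt(2.0)) * ic
    return i_d, i_q

# Illustrative use with synthetic balanced currents (hypothetical data).
t = np.linspace(0.0, 0.1, 1000)
w = 2 * np.pi * 60.0
ia = np.cos(w * t)
ib = np.cos(w * t - 2 * np.pi / 3)
ic = np.cos(w * t + 2 * np.pi / 3)
i_d, i_q = parks_vector(ia, ib, ic)  # points lie on a circle for a healthy machine
```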
Abstract:
Although some individual supervised Machine Learning (ML) techniques, also known as classifiers or classification algorithms, provide solutions that are usually considered efficient, experimental results obtained with large pattern sets, or with sets containing a significant amount of irrelevant or incomplete data, show a decrease in the accuracy of these techniques. In other words, such techniques cannot perform pattern recognition efficiently on complex problems. With the aim of improving the performance and efficiency of these ML techniques, the idea of making several ML algorithms work jointly was conceived, giving rise to the term Multi-Classifier System (MCS). An MCS has different ML algorithms, called base classifiers, as its components, and combines the results obtained by these algorithms to reach its final result. For an MCS to perform better than its base classifiers, the results obtained by each base classifier must exhibit a certain diversity, that is, a difference between the results obtained by each classifier that composes the system; there is no point in building an MCS whose base classifiers give identical answers to the same patterns. Although MCSs present better results than individual systems, there is a constant search for further improvement. Aiming at this improvement, at greater consistency of results, and at greater diversity among the classifiers of an MCS, methodologies based on weights, or confidence values, have recently been investigated. These weights can describe the importance that a given classifier assigns when associating each pattern with a given class, and they are used, combined with the classifiers' outputs, during the recognition (use) phase of the MCS. There are different ways of computing these weights, which can be divided into two categories: static weights and dynamic weights. The first category is characterized by values that do not change during the classification process, unlike the second category, whose values are modified during classification. In this work, an analysis is carried out to verify whether the use of weights, both static and dynamic, can increase the performance of MCSs compared with individual systems. Moreover, the diversity obtained by the MCSs is analyzed in order to verify whether there is a relation between the use of weights in MCSs and different levels of diversity.
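A minimal sketch of weight-based combination in an MCS, assuming each base classifier outputs a class-probability vector. The static weights (fixed, e.g., taken from validation accuracy) and the dynamic weights (recomputed per pattern from each classifier's own confidence) are illustrative choices, not necessarily the weighting schemes investigated in this work.

```python
import numpy as np

def combine_static(probas, static_weights):
    """Weighted average of base-classifier probability vectors with fixed weights."""
    probas = np.asarray(probas)                # shape: (n_classifiers, n_classes)
    w = np.asarray(static_weights, dtype=float)
    w = w / w.sum()
    return int(np.argmax(w @ probas))

def combine_dynamic(probas):
    """Weights recomputed for each pattern, here from each classifier's confidence."""
    probas = np.asarray(probas)
    w = probas.max(axis=1)                     # confidence of each classifier on this pattern
    w = w / w.sum()
    return int(np.argmax(w @ probas))

# Example: three base classifiers, three classes (hypothetical outputs).
outputs = [[0.6, 0.3, 0.1],
           [0.2, 0.5, 0.3],
           [0.1, 0.2, 0.7]]
print(combine_static(outputs, static_weights=[0.9, 0.7, 0.8]))
print(combine_dynamic(outputs))
```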
Abstract:
CARVALHO, Andréa Vasconcelos; ESTEBAN NAVARRO, Miguel Ángel. Auditoria de Inteligência: um método para o diagnóstico de sistemas de inteligência competitiva e organizacional. In: XI ENANCIB - Encontro Nacional de Pesquisa em Ciência da Informação, 2010, Rio de Janeiro. Anais do XI ENANCIB. Rio de Janeiro: ANCIB, 2010.
Abstract:
This work addresses the analysis and development of multivariable predictive controllers based on bilinear multi-models. Monovariable and multivariable linear Generalized Predictive Control (GPC) is presented, highlighting its properties, key features, and industrial applications. Bilinear GPC, the basis for this thesis, is developed through the time-step quasilinearization approach. Results obtained with this controller are presented to show its superior performance compared with linear GPC, since bilinear models better represent the dynamics of certain processes. Because time-step quasilinearization is an approximation, it introduces a prediction error that limits the controller's performance as the prediction horizon increases. To minimize this error, Bilinear GPC with iterative compensation is presented, seeking better performance than the classic Bilinear GPC, and results of the iterative compensation algorithm are shown. The use of multi-models is also discussed, in order to correct the deficiency of single-model controllers when applied to processes with wide operating ranges. Methods for measuring the distance between models, also called metrics, are the main contribution of this thesis. Several applications to simulated distillation columns, whose behavior is close to that of real columns, are carried out, and the results are satisfactory.
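The model-distance metrics are the stated contribution of the thesis, so the sketch below only illustrates the general mechanism: measuring how far each local model is from the current plant behavior and turning those distances into blending weights. The prediction-error-based distance and the inverse-distance weighting used here are assumptions for illustration, not the metrics proposed in the work.

```python
import numpy as np

def model_distances(y_measured, y_predicted_by_models):
    """Distance of each local model to the current plant behavior, measured here
    as the norm of its recent one-step prediction errors (an illustrative metric)."""
    y = np.asarray(y_measured)
    return np.array([np.linalg.norm(y - np.asarray(yhat)) for yhat in y_predicted_by_models])

def blending_weights(distances, eps=1e-9):
    """Convert distances into normalized weights: closer models weigh more."""
    inv = 1.0 / (np.asarray(distances) + eps)
    return inv / inv.sum()

# Example: three local models, window of recent measured outputs (hypothetical data).
y_meas = [1.00, 1.05, 1.12, 1.18]
y_hats = [[0.98, 1.04, 1.10, 1.17],   # model tuned near this operating point
          [0.90, 0.95, 1.00, 1.05],
          [1.20, 1.25, 1.30, 1.35]]
w = blending_weights(model_distances(y_meas, y_hats))
# w can then be used to blend the local models (or local GPC control actions).
```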
Abstract:
The occurrence of multiphase flow in the oil and gas industry is common throughout the fluid path: production, transportation, and refining. Multiphase flow is defined as the simultaneous flow of two or more immiscible phases with different properties. An important computational tool for the design, planning, and optimization of production systems is the simulation of multiphase flow in pipelines and porous media, usually performed with commercial multiphase flow simulators. The main purpose of these simulators is to predict pressure and temperature at any point in the production system. This work proposes the development of a multiphase flow simulator able to predict the dynamic pressure and temperature gradients in vertical, directional, and horizontal wells. Pressure and temperature profiles are predicted by numerical integration with a marching algorithm, using empirical correlations and a mechanistic model to compute the pressure gradient. The tool was implemented as a set of routines in Embarcadero C++ Builder® 2010, producing an executable compatible with Microsoft Windows® operating systems. The simulator was validated through computational experiments and comparison of its results with PIPESIM®. In general, the developed simulator achieved excellent agreement with PIPESIM and can be used as a tool to assist in the development of production systems.
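A minimal sketch of the marching (stepwise) integration used to build a pressure profile along a well. The pressure-gradient function below keeps only a hydrostatic term with hypothetical units; the simulator described above uses empirical correlations and a mechanistic model in its place, and also marches the temperature profile.

```python
import math

def march_pressure(p_wellhead, segments, dp_dl):
    """March the pressure profile along the well: at each segment, evaluate the
    pressure-gradient correlation at the current conditions and step forward."""
    profile = [p_wellhead]
    p = p_wellhead
    for length, angle in segments:           # segment length [m], inclination [rad]
        p += dp_dl(p, angle) * length        # simple explicit (Euler) march
        profile.append(p)
    return profile

def dp_dl(p, angle, rho=800.0, g=9.81):
    """Placeholder gradient: hydrostatic term only, in bar/m (hypothetical units);
    real correlations add friction and acceleration terms."""
    return rho * g * math.sin(angle) / 1e5

segments = [(100.0, math.pi / 2)] * 20       # 20 vertical segments of 100 m each
profile = march_pressure(p_wellhead=10.0, segments=segments, dp_dl=dp_dl)
```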
Abstract:
Urban stormwater can be regarded both as a potential water resource and as a problem for the proper functioning of the manifold activities of a city, a problem that results from inappropriate land use and occupation, usually due to poor planning of the occupation of developing areas with little regard for the environmental aspects of surface-runoff drainage. As a basic premise, mechanisms should be sought to preserve the natural flow regime at all stages of development of an urban area: preserving the soil infiltration capacity at the scale of the urban area, understanding the mechanisms of natural drainage, and preserving the naturally dynamic areas of watercourses, both in the main channel and in the secondary ones. These are challenges for sustainable urban development, in which modern development coexists harmoniously with economic, environmental, and social quality. Integrated studies involving both the quantity and the quality of stormwater are necessary to understand the problems and obtain appropriate technologies, covering both the drainage aspects and the use of the water when surface runoff is adequately managed, for example by accumulation in detention reservoirs with the possibility of use for other purposes. The purpose of this study is to develop a computational model, adjusted to the prevailing conditions of an experimental urban watershed, that enables the implementation of water-resource management practices through hydrological simulations of the quantity and, in a preliminary way, the quality of the stormwater that flows to a pond located at the downstream end of the basin. To this end, the distributed model SWMM was used together with basin data collected at the highest possible resolution, so as to allow the simulation of diffuse loads and of the heterogeneous characteristics of the basin, both in terms of hydrological and hydraulic parameters and in terms of land use and occupation. This work should improve the understanding of the phenomena simulated in the basin as well as the calibration of the models, supported by monitoring data acquired between 2006 and 2008 within the project MAPLU (Urban Stormwater Management), which belongs to the PROSAB network (Research Program in Basic Sanitation).
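Calibrating the model against the MAPLU monitoring data requires a goodness-of-fit criterion for the simulated hydrographs. The sketch below uses the Nash-Sutcliffe efficiency, a common hydrological choice assumed here purely for illustration; the study may adopt a different criterion.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 means a perfect fit, 0 means the model performs
    no better than the mean of the observations."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical observed and SWMM-simulated flows at the basin outlet (m3/s).
q_obs = [0.2, 0.8, 2.4, 1.6, 0.9, 0.4]
q_sim = [0.3, 0.9, 2.1, 1.7, 1.0, 0.5]
print(nash_sutcliffe(q_obs, q_sim))
```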
Abstract:
The present work was carried out in the municipality of Grossos (RN) and had as its main objectives the elaboration of a physical-environmental and socioeconomic survey and a multitemporal evaluation over 11 years, between 1986 and 1996, using remote sensing products, in order to assess changes in land use and to generate an information database for the implementation of a geographic information system (GIS) for the management of the municipality. Data from the two demographic censuses carried out by IBGE (1991 and 2000) were collected and compared, which made it possible to evaluate demographic aspects (degree of urbanization, age structure, educational level) and economic aspects (income, housing, vulnerability, human development). For the physical-environmental survey, maps of the natural resources were produced (simplified geology, hydrography, geomorphology, vegetation cover, soil associations, land use and occupation), based on field observations and orbital remote sensing products (SPOT-HRVIR, Landsat 5 TM, and IKONOS-II images) processed with digital image processing techniques. This survey is important for identifying the potentialities and fragilities of the local ecosystems, thus allowing adequate planning of socioeconomic development through efficient management. The project was part of a partnership between the municipal government of Grossos (RN) and the Geosciences graduate program of UFRN, more specifically the Geomatics laboratory (LAGEOMA).
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Abstract:
For centuries, Asian peoples have used seaweed as an important food source and remain its greatest consumers worldwide. The migration of these peoples to other countries has increased the demand for seaweed, driving an industry with annual revenues of around US$ 6 billion. The algal biomass used by this industry is either collected from natural stocks or cultivated. The market demand for seaweed-based products promotes an unsustainable exploitation of the natural banks, compromising their associated biological balance. In this context, seaweed farming appears as a viable alternative to prevent the depletion of these natural stocks. Geographic Information Systems (GIS) handle spatial data and produce information that can support the evaluation of the physical and socioeconomic characteristics relevant to the planning of seaweed farming. The objective of this study is to identify potential coastal areas for seaweed farming in the state of Rio Grande do Norte by integrating socio-environmental data in a GIS. To achieve this objective, a georeferenced database composed of geographic maps, nautical charts, and orbital digital images was assembled, together with a bank of attributes including physical and oceanographic variables (winds, currents, bathymetry, operational distance from the culture site) and social and environmental factors (main income, experience with seaweed harvesting, population density, proximity to sheltered coast, and distance from the banks). In the data modeling stage, the spatial database was integrated with the attribute bank to obtain the seaweed-farming potentiality map. Of a total of 2,011 ha analyzed by the GIS for seaweed farming, around 34% (682 ha) were classified as high potential, 55% (1,101 ha) as medium potential, and 11% (228 ha) as low potential. The good potentiality indices obtained in the studied localities demonstrate that there are adequate conditions for the installation of seaweed farms in the state of Rio Grande do Norte.
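A minimal sketch of the kind of multicriteria overlay that integrates the physical, oceanographic, and socio-environmental layers into a potentiality score per grid cell. The factor names, weights, and class thresholds below are illustrative assumptions, not the values adopted in the study.

```python
import numpy as np

# Hypothetical normalized factor layers (0 = unsuitable, 1 = ideal) over a small grid.
layers = {
    "bathymetry":     np.array([[0.9, 0.7], [0.4, 0.8]]),
    "currents":       np.array([[0.8, 0.6], [0.5, 0.9]]),
    "distance_coast": np.array([[0.7, 0.9], [0.6, 0.5]]),
    "social_factors": np.array([[1.0, 0.8], [0.3, 0.7]]),
}
weights = {"bathymetry": 0.3, "currents": 0.3, "distance_coast": 0.2, "social_factors": 0.2}

# Weighted linear combination of the layers, then classification into potential classes.
score = sum(weights[name] * layer for name, layer in layers.items())
classes = np.digitize(score, bins=[0.5, 0.75])   # 0 = low, 1 = medium, 2 = high potential
```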
Abstract:
This thesis analyzes, from the manager's perspective, the importance of using quality management tools and concepts in Federal Universities. It was motivated by the following research problem: do Federal University managers consider quality management in their institutions to be relevant? We therefore sought evidence for a satisfactory approach to the complexity of the topics researched: quality, higher education, and quality management systems. We chose an applied study, exploratory-descriptive as to its objective and quantitative-qualitative as to its approach to the problem. The object of study comprises the Planning Provosts of the Federal Universities listed in the Folha University Ranking (RUF) of 2013, and the sample was restricted to the provosts of the 20 best-placed Federal Universities in that ranking. The research instrument consisted of 26 questions: 6 designed to identify the manager's profile; 16 perception questions (manifest variables) on the importance of quality management in the university, in which the managers assigned values (answers) on a 5-point Likert scale to statements addressing the main topic of this thesis; and 4 open, optional questions intended to identify general management practices in use. Descriptive and factor statistics were used for the data analysis. The responses collected through the questionnaire portray the managers' perception of the importance of quality management in their institutions. Sixteen variables were addressed; the results of the factor analysis of importance were "Important" and "Very important", with variable V2 rated "Important" and all the others "Very important". Since some variables were "Very important" for the vast majority of managers while others (for example V2, V10, and V11) did not show the same result, this information makes it possible to prioritize areas that deserve immediate action. It is concluded that the managers perceive quality management in their institutions as relevant, but do not attach to it the same importance given to quality programs implemented in other segments of the economy, and that, despite the advances brought by SINAES, the model does not evaluate the institution globally. With these results, we hope to contribute to the advancement of the subject and to arouse the interest of Federal University managers, emphasizing the importance of quality management systems as a necessary tool to raise institutional quality.
Abstract:
We propose a new paradigm for collective learning in multi-agent systems (MAS) as a solution to the problem in which several agents acting over the same environment must learn how to perform tasks simultaneously, based on feedback given by each of the other agents. We introduce the proposed paradigm in the form of a reinforcement learning algorithm, naming it reinforcement learning with influence values. While learning from rewards, each agent evaluates the relation between its current state and/or the action executed in that state (its current belief) together with the reward obtained after all interacting agents perform their actions; this reward therefore reflects the interference of the others. The agent considers the opinions of all its colleagues when attempting to change the values of its states and/or actions. The idea is that the system as a whole must reach an equilibrium in which all agents are satisfied with the obtained results, meaning that the values of the state/action pairs match the reward obtained by each agent. This dynamic way of setting the values of states and/or actions makes this new reinforcement learning paradigm the first to incorporate, naturally, the fact that the presence of other agents in the environment makes it a dynamic model. As a direct result, we implicitly include the internal state, the actions, and the rewards obtained by all the other agents in the internal state of each agent. This makes our proposal the first complete solution to the conceptual problem that arises when applying reinforcement learning in multi-agent systems, caused by the mismatch between the environment and agent models. Based on the proposed model, we create the IVQ-learning algorithm, which is exhaustively tested in repeated games with two, three, and four agents and in stochastic games that require cooperation or collaboration. The algorithm proves to be a good option for obtaining solutions that guarantee convergence to the optimal Nash equilibrium in cooperative problems. The experiments clearly show that the proposed paradigm is theoretically and experimentally superior to traditional approaches. Furthermore, this new paradigm enlarges the set of reinforcement learning applications in MAS: besides traditional learning problems in MAS, such as task coordination in multi-robot systems, it becomes possible to apply reinforcement learning to problems that are essentially collaborative.
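A minimal sketch of a tabular Q-learning update augmented with an influence term taken from the other agents, in the spirit of the IVQ-learning described above. The exact form of the influence term and its weighting are assumptions made for illustration and do not reproduce the thesis's formulation.

```python
import numpy as np

class InfluenceQLearner:
    """Tabular Q-learning in which each agent's update also weighs an 'influence'
    signal derived from the other agents' evaluations. The influence term used
    here (mean of the colleagues' action values) is illustrative only."""

    def __init__(self, n_states, n_actions, alpha=0.1, gamma=0.9, beta=0.3):
        self.q = np.zeros((n_states, n_actions))
        self.alpha, self.gamma, self.beta = alpha, gamma, beta

    def influence(self, s, a, others):
        # Opinion of the colleagues about this state/action pair.
        return np.mean([other.q[s, a] for other in others]) if others else 0.0

    def update(self, s, a, reward, s_next, others):
        target = reward + self.gamma * self.q[s_next].max()
        td = target - self.q[s, a]
        self.q[s, a] += self.alpha * (td + self.beta * self.influence(s, a, others))

# Two agents learning on a toy 2-state, 2-action problem (hypothetical transitions).
a1, a2 = InfluenceQLearner(2, 2), InfluenceQLearner(2, 2)
a1.update(s=0, a=1, reward=1.0, s_next=1, others=[a2])
a2.update(s=0, a=1, reward=1.0, s_next=1, others=[a1])
```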
Abstract:
The usual programs for load-flow calculation were, in general, developed with the aim of simulating electric energy transmission, subtransmission, and distribution systems. However, the mathematical methods and algorithms used in their formulations were mostly based only on the characteristics of transmission systems, which were the main focus of engineers and researchers. The physical characteristics of these systems, though, are quite different from those of distribution systems. In transmission systems the voltage levels are high and the lines are generally very long; these aspects make the capacitive and inductive effects of the system considerably influence the quantities of interest, so they must be taken into account. Also, in transmission systems the loads have a macro nature, for example cities, neighborhoods, or large industries; such loads are, in general, practically balanced, which reduces the need for three-phase load-flow methodologies. Distribution systems, on the other hand, present different characteristics: the voltage levels are low compared with transmission, which practically cancels the capacitive effects of the lines, and the loads are transformers whose secondaries feed small, often single-phase consumers, so the probability of finding an unbalanced circuit is high. In this case the use of three-phase methodologies assumes an important dimension. Moreover, equipment such as voltage regulators, which simultaneously use the concepts of phase and line voltage in their operation, require a three-phase methodology in order to simulate their real behavior. For these reasons, a method for three-phase load-flow calculation was initially developed in this work to simulate the steady-state behavior of distribution systems. The Power Summation Algorithm, already widely tested and approved by researchers and engineers for the simulation of radial electric energy distribution systems, mainly in single-phase representation, was used as the basis for developing the three-phase method. In our formulation, the lines are modeled as three-phase circuits, considering the magnetic coupling between the phases, and the earth effect is handled through the Carson reduction. It is important to point out that, although the loads are normally connected to the transformer secondaries, the hypothesis of star- or delta-connected loads on the primary circuit was also considered. To simulate voltage regulators, a new model was used that allows the simulation of various configurations, according to their real operation. Finally, the representation of switches with current measurement at various points of the feeder was considered; the loads are adjusted during the iterative process so that the current at each switch converges to the measured value specified in the input data. In a second stage of the work, sensitivity parameters were derived from the described load flow, with the objective of supporting further optimization processes. These parameters are found by calculating partial derivatives of one variable with respect to another, in general voltages, losses, and reactive powers.
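A minimal single-phase sketch of the backward/forward sweep underlying the Power Summation Algorithm on a radial feeder. Branch losses, the three-phase representation, mutual coupling, and Carson's reduction are omitted here; impedances and loads are hypothetical.

```python
import numpy as np

def power_summation_sweep(v_source, branches, loads, n_iter=20):
    """Radial single-phase load flow by backward power summation and forward
    voltage drop. branches[k] = (parent_node, impedance_ohm) of node k+1;
    node 0 is the source, and a parent always has a smaller index than its child."""
    n = len(branches) + 1
    v = np.full(n, v_source, dtype=complex)
    for _ in range(n_iter):
        # Backward sweep: accumulate the downstream complex power of each node
        # into its parent (branch losses neglected in this sketch).
        s_branch = np.array(loads, dtype=complex)        # power entering nodes 1..n-1
        for k in reversed(range(1, n - 1)):
            parent, _ = branches[k]
            if parent >= 1:
                s_branch[parent - 1] += s_branch[k]
        # Forward sweep: update node voltages from the source downwards.
        for i, (parent, z) in enumerate(branches, start=1):
            current = np.conj(s_branch[i - 1] / v[i])
            v[i] = v[parent] - z * current
    return v

# Three-node toy feeder 0 -> 1 -> 2; series impedances and loads are hypothetical.
branches = [(0, 0.1 + 0.2j), (1, 0.15 + 0.3j)]           # ohms
loads = [100e3 + 50e3j, 80e3 + 30e3j]                    # VA at nodes 1 and 2
voltages = power_summation_sweep(v_source=13.8e3, branches=branches, loads=loads)
```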
After describing the calculation of the sensitivity parameters, the Gradient Method is presented, using these parameters to optimize an objective function defined for each type of study. The first study concerns the reduction of technical losses in a medium-voltage feeder through the installation of capacitor banks; the second concerns the correction of the voltage profile through the installation of capacitor banks or voltage regulators. For loss reduction, the objective function is the sum of the losses in all parts of the system; for voltage-profile correction, it is the sum of the squared voltage deviations at each node with respect to the rated voltage. At the end of the work, results of the application of the described methods to several feeders are presented in order to give insight into their performance and accuracy.
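A minimal sketch of the voltage-profile objective and one steepest-descent step over candidate capacitor reactive injections. The finite-difference gradient and the quadratic stand-in for the load flow are assumptions made for illustration; the work instead uses sensitivity parameters derived analytically from the load flow.

```python
import numpy as np

def voltage_deviation_objective(v_nodes, v_rated=1.0):
    """Sum of squared per-unit voltage deviations with respect to the rated voltage."""
    v = np.asarray(v_nodes, dtype=float)
    return float(np.sum((v - v_rated) ** 2))

def gradient_step(q_caps, objective_of_q, step=0.05, delta=1e-3):
    """One steepest-descent step on the capacitor reactive injections q_caps,
    with the gradient approximated by finite differences."""
    q = np.asarray(q_caps, dtype=float)
    grad = np.zeros_like(q)
    base = objective_of_q(q)
    for k in range(len(q)):
        q_pert = q.copy()
        q_pert[k] += delta
        grad[k] = (objective_of_q(q_pert) - base) / delta
    return q - step * grad

# Hypothetical wrapper: run the load flow for a given capacitor setting and return
# the voltage-deviation objective (here faked with a simple linear stand-in).
def objective_of_q(q):
    v_nodes = 0.95 + 0.02 * np.asarray(q)    # placeholder for a real load-flow result
    return voltage_deviation_objective(v_nodes)

q_new = gradient_step(q_caps=[1.0, 0.5], objective_of_q=objective_of_q)
```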
Abstract:
The color of effluents from textile dyeing processes has been one of the biggest environmental problems faced by the textile industry; reactive dyes, in particular, are highly resistant to conventional wastewater treatment methods. New technologies have been considered, some of which have been applied in industrial treatment plants, but efficient color removal has not been attained. Since microemulsion systems give good results in the extraction of heavy metals and proteins, their use for dye extraction has been suggested and investigated. In this work, a real textile wastewater from an exhausted dyebath was treated, containing the following reactive dyes: Procion Yellow H-E4R (CI Reactive Yellow 84), Procion Blue H-ERD (CI Reactive Blue 160), and Procion Red H-E3B (CI Reactive Red 120), in addition to the auxiliary compounds normally found in dyeing processes with reactive dyes. The dyes Remazol Blue RR and Remazol Turquoise Blue G (CI Reactive Blue 21) were also examined because of the presence of heavy metals in their molecules. The microemulsion system comprised dodecylammonium chloride as a cationic surfactant, water or wastewater as the aqueous phase, kerosene as the oil phase, and one of the following alcohols as cosurfactant: isoamyl alcohol, n-butyl alcohol, or n-octyl alcohol. Pseudo-ternary diagrams were constructed to delimit the Winsor equilibrium regions. The influence of parameters such as pH, cosurfactant/surfactant (C/S) ratio, distribution coefficient, initial dye concentration, salinity, temperature, relative amounts of the phases, loading capacity of the microemulsion phase, and dye re-extraction rate was also investigated. An experimental design (Scheffé net) was used to optimize the extraction process. The removal of color and metals reached levels as high as 99%.
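A minimal sketch of how a Scheffé simplex-lattice (Scheffé net) of mixture points can be generated for an experimental design over three formulation components. The component labels and the lattice degree below are illustrative assumptions, not the design actually run in the study.

```python
from itertools import product

def scheffe_lattice(n_components, degree):
    """Generate a {n_components, degree} Scheffé simplex-lattice: all mixtures
    whose proportions are multiples of 1/degree and sum to 1."""
    points = []
    for combo in product(range(degree + 1), repeat=n_components):
        if sum(combo) == degree:
            points.append(tuple(c / degree for c in combo))
    return points

# Example: a {3, 3} lattice over hypothetical mixture components of the
# microemulsion formulation (aqueous phase, oil phase, C/S amphiphile mix).
for aqueous, oil, amphiphile in scheffe_lattice(3, 3):
    print(f"aqueous={aqueous:.2f}  oil={oil:.2f}  C/S={amphiphile:.2f}")
```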