Resumo:
The increasing capacity to integrate transistors has made it possible to develop complete systems, with several components, on a single chip; these are called SoCs (Systems-on-Chip). However, the interconnection subsystem can limit the scalability of SoCs, as buses do, or can become an ad hoc solution, like a bus hierarchy. The ideal interconnection subsystem for SoCs is therefore the Network-on-Chip (NoC). NoCs allow simultaneous point-to-point channels between components and can be reused in other designs. However, NoCs can increase design complexity, chip area and dissipated power, so it is necessary either to modify the way they are used or to change the development paradigm. Thus, a NoC-based system is proposed in which applications are described as packages and executed in each router between source and destination, without traditional processors. To execute applications regardless of the number of instructions and of the NoC dimensions, the spiral complement algorithm was developed, which finds another destination router until all instructions have been executed. The objective, therefore, is to study the feasibility of developing this system, named the IPNoSys system. In this study, a cycle-accurate simulation tool was developed in SystemC to simulate the system executing applications, which were implemented in a package description language also developed for this study. Through the simulation tool, several results were obtained to evaluate the system's performance. The methodology used to describe an application transforms the high-level application into a data-flow graph, which then becomes one or more packages. This methodology was applied to three applications: a counter, a 2D DCT and a floating-point addition. The counter was used to evaluate a deadlock solution and to execute a parallel application. The DCT was used for comparison with the STORM platform. Finally, the floating-point addition evaluated the efficiency of a software routine that executes an instruction not implemented in hardware. The simulation results confirm the feasibility of developing the IPNoSys system. They show that it is possible to execute applications described in packages, sequentially or in parallel, without interruptions caused by deadlock, and also that the execution time of IPNoSys is better than that of the STORM platform.
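The abstract describes the spiral complement algorithm only at a high level. The following is a minimal sketch of one plausible interpretation, assuming a 2D mesh NoC addressed by (x, y) router coordinates; the function names and the outward-spiral search are illustrative assumptions, not taken from IPNoSys:

```python
# Hypothetical sketch: choose the next destination router by walking
# outward in a spiral from the current router of a 2D mesh NoC.

def spiral_offsets():
    """Yield (dx, dy) offsets spiraling outward from the origin."""
    x = y = 0
    dx, dy = 0, -1
    while True:
        yield (x, y)
        # Turn at the corners of the current spiral ring.
        if x == y or (x < 0 and x == -y) or (x > 0 and x == 1 - y):
            dx, dy = -dy, dx
        x, y = x + dx, y + dy

def next_destination(current, width, height):
    """Find the nearest other router, spiraling outward from `current`."""
    cx, cy = current
    gen = spiral_offsets()
    next(gen)  # skip (0, 0), the current router itself
    for _ in range(4 * (width + height) ** 2):  # bound the search
        dx, dy = next(gen)
        nx, ny = cx + dx, cy + dy
        if 0 <= nx < width and 0 <= ny < height:  # stay inside the mesh
            return (nx, ny)
    return None  # no other router inside the mesh

print(next_destination((3, 3), 4, 4))  # e.g. a neighbor of the corner router
```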
Resumo:
The use of intelligent agents in multi-classifier systems emerged as a way to turn the centralized decision process of a multi-classifier system into a distributed, flexible and incremental one. Based on this, the NeurAge (Neural Agents) system (Abreu et al. 2004) was proposed. This system has shown superior performance to some combination-centered methods (Abreu, Canuto, and Santana 2005). Negotiation is important to the performance of a multiagent system, but most negotiations are defined informally. One way to formalize the negotiation process is to use an ontology. In the context of classification tasks, an ontology provides an approach to formalize the concepts and the rules that govern the relations between these concepts. This work aims at using ontologies to give a formal description of the negotiation methods of a multi-agent system for classification tasks, more specifically the NeurAge system. Through ontologies, we intend to make the NeurAge system more formal and open, allowing new agents to join the system during the negotiation. To this end, the NeurAge system will be studied with respect to its functioning and scope, focusing mainly on the negotiation methods it uses. After that, some negotiation ontologies found in the literature will be studied, and those chosen for this work will be adapted to the negotiation methods used in NeurAge.
Resumo:
Despite the emergence of other forms of artificial lift, sucker rod pumping systems remain dominant because of their operational flexibility and lower investment cost compared to other lifting techniques. A successful rod pumping design necessarily involves delivering the estimated flow rate while controlling the wear of the pumping equipment used in the chosen configuration. Balancing these elements is particularly challenging, especially for the many designers who still lack the experience needed to produce good pumping designs in time. Even with the various computer applications on the market intended to facilitate this task, designers must face a grueling process of trial and error until they arrive at the most appropriate combination of equipment for installation in the well. This thesis proposes the creation of an expert system for the design of sucker rod pumping systems. Its mission is to guide a petroleum engineer in the task of selecting a set of equipment appropriate to the context given by the characteristics of the oil that will be lifted to the surface. Factors such as the level of gas separation, the presence of corrosive elements, and the possibility of sand production and waxing are taken into account in selecting the pumping unit, the sucker-rod string, the subsurface pump, and their operating mode. The system approximates the inference process to human reasoning, which leads to results closer to those obtained by a specialist. To this end, its production rules are based on the theory of fuzzy sets, able to model the vague concepts typically present in human reasoning. The operating parameters of the pumping system are calculated by the API RP 11L method. Based on the input information, the system returns to the user a set of pumping configurations that meet a given design flow rate without subjecting the selected equipment to efforts beyond those it can bear.
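A minimal sketch of how production rules based on fuzzy sets work, assuming triangular membership functions; the linguistic variable (free-gas fraction at the pump intake), the thresholds and the rules are hypothetical illustrations, not the thesis's actual rule base:

```python
def triangular(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms for free-gas fraction at the pump intake (%).
def gas_low(x):  return triangular(x, -1, 0, 15)
def gas_high(x): return triangular(x, 10, 30, 101)

def recommend_gas_separation(gas_fraction):
    """Fire two illustrative rules; each recommendation carries the
    firing strength of its rule rather than a crisp yes/no answer."""
    # Rule 1: IF gas is low  THEN a standard pump intake suffices.
    # Rule 2: IF gas is high THEN a downhole gas separator is recommended.
    return {
        "standard_intake": gas_low(gas_fraction),
        "gas_separator": gas_high(gas_fraction),
    }

print(recommend_gas_separation(20.0))
# -> {'standard_intake': 0.0, 'gas_separator': 0.5}
```

The graded firing strengths are what let such a system rank equipment configurations instead of rejecting borderline cases outright.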
Resumo:
The process of choosing the best components to build systems has become increasingly complex. It becomes even more critical when many combinations of components must be considered in the context of an architectural configuration. These circumstances arise mainly when we have to deal with systems involving critical requirements, such as timing constraints in distributed multimedia systems, network bandwidth in mobile applications, or reliability in real-time systems. This work proposes a process for the dynamic selection of architectural configurations based on the system's non-functional requirements, which can be used during a dynamic adaptation. The proposal uses MAUT (Multi-Attribute Utility Theory) for decision making over a finite set of possibilities involving multiple criteria to be analyzed. Additionally, a metamodel is proposed that describes the application's requirements in terms of non-functional criteria and their expected values, expressing them so that the desired configuration can be selected. As a proof of concept, a module that performs the dynamic choice of configurations, MoSAC, was implemented. This module was built using a component-based development (CBD) approach and selects architectural configurations according to the proposed multi-criteria selection process. This work also presents a case study in which an application was developed in the context of Digital TV to evaluate the time the module takes to return a valid configuration to be used in a middleware with self-adaptive features, the AdaptTV middleware.
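MAUT typically scores each alternative with an additive utility function, U(a) = sum_i w_i * u_i(a), where each u_i normalizes one attribute onto [0, 1]. A minimal sketch of a selection step under that standard formulation; the attribute names, weights and scales below are illustrative, not MoSAC's actual model:

```python
# Minimal additive-MAUT selection over candidate architectural configurations.

def normalize(value, worst, best):
    """Map an attribute value onto a [0, 1] utility scale."""
    return (value - worst) / (best - worst)

def utility(config, weights, scales):
    """Additive multi-attribute utility: weighted sum of normalized attributes."""
    return sum(w * normalize(config[attr], *scales[attr])
               for attr, w in weights.items())

configs = [
    {"name": "A", "latency_ms": 40, "bandwidth_mbps": 2.0, "reliability": 0.99},
    {"name": "B", "latency_ms": 15, "bandwidth_mbps": 5.0, "reliability": 0.95},
]
weights = {"latency_ms": 0.5, "bandwidth_mbps": 0.2, "reliability": 0.3}
scales = {  # (worst, best) per attribute; lower latency is better
    "latency_ms": (100, 0),
    "bandwidth_mbps": (0, 10),
    "reliability": (0.9, 1.0),
}

best = max(configs, key=lambda c: utility(c, weights, scales))
print(best["name"])  # -> "B" under these illustrative weights
```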
Resumo:
The increasing demand for processing power in recent years has pushed the integrated circuit industry to look for ways of providing more processing power with less heat dissipation, power consumption and chip area. This goal had been pursued by raising the circuit clock, but since there are physical limits to that approach, a new solution emerged: the multiprocessor system-on-chip (MPSoC). This approach demands new tools and basic software infrastructure to take advantage of the inherent parallelism of these architectures. One of the first activities of the oil exploration industry is the decision on exploring oil fields; those decisions are aided by reservoir simulations that demand high processing power, for which an MPSoC may offer greater performance if its parallelism is well exploited. This work presents a proposal for a micro-kernel operating system and auxiliary libraries aimed at the STORM MPSoC platform, analyzing their influence on the reservoir simulation problem.
Resumo:
The use of multi-agent systems for classification tasks has been proposed in order to overcome some drawbacks of multi-classifier systems and, as a consequence, to improve the performance of such systems. As a result, the NeurAge system was proposed. This system is composed of several neural agents which communicate and negotiate a common result for the testing patterns. In the NeurAge system, the negotiation method is very important to the overall performance, since the agents need to reach an agreement about a problem when there is a conflict among them. This thesis presents an extensive analysis of the NeurAge system extended to use all kinds of classifiers; this system is now named the ClassAge system. The aim is to analyze the reaction of this system to modifications in its topology and configuration.
Resumo:
This work presents tVoice, software that manipulates tag languages to extract information and, as an integral part of the VoiceProxy system, aids users with special needs in accessing the Web. This subsystem is responsible for searching and processing documents on the Web, extracting the textual information contained in those documents and eventually generating, through translation techniques, an audio script used by the interface subsystem of VoiceProxy, iVoice, in the voice synthesis process. At this stage, besides the tag language HTML, tVoice processes two other document formats, PDF and XHTML. Additionally, so that other interface subsystems besides iVoice can use tVoice through remote access, we propose distributed systems techniques based on the client-server model, providing the document processing operations in the manner of a proxy server.
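The core of the document treatment step is extracting the visible text from tagged markup. A minimal sketch of that technique using Python's standard html.parser module; this illustrates the idea, not tVoice's actual implementation:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text from an HTML document, skipping script/style."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0  # >0 while inside a skipped element

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth > 0:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._skip_depth == 0 and data.strip():
            self.chunks.append(data.strip())

parser = TextExtractor()
parser.feed("<html><body><h1>Title</h1><script>x()</script>"
            "<p>Hello</p></body></html>")
print(" ".join(parser.chunks))  # -> "Title Hello"
```

The resulting text stream is the kind of input an audio-script generator can then hand to a speech synthesizer.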
Resumo:
Collaboration in research is one of the central tasks of academia. Nowadays, many researchers use modern means of exchanging digital files, through asynchronous tools as well as more sophisticated, synchronous ones. At the same time, the growing number of papers being produced, increasingly complex, diversified and accumulating in a disorganized way, leaves the researcher with the difficult task of organizing them so as to extract their best content. This happens because a subarea of Software Engineering (SE) is still poorly exploited: Experimental Software Engineering (ESE). Among the types of experiments that ESE offers, systematic reviews stand out as a rather robust solution, in which the researcher can identify the existing knowledge in an area and properly plan his or her research, avoiding the repetition of mistakes made in research already carried out by others in the past. However, these two approaches, the virtual collaboration of researchers and the use of systematic reviews, have problems: in the first, collaborative systems are generally hard to configure and use; in the second, despite the robustness of the systematic review methodology, a rigorous literature review is still necessary to obtain a satisfactory result. Thus, aiming to unite these two approaches, this work proposes a way to produce systematic reviews in an organized fashion and with the possibility of interaction among users, through the development of an interactive system in which systematic reviews can be generated by users in collaboration with others and also evaluated under the guidance of a professional in the area, making their content more consistent and of better quality. The system has no access levels, that is, anyone can register and use its resources, whether in academia or in industry.
Resumo:
Environmental management in health establishments is a reality still little explored in the Brazilian health sector, especially concerning waste. The management of health service waste is established in current legislation through the National Council for the Environment and the Sanitary Surveillance Agency (resolutions 358/2005 and 304/2004, respectively). The present work is a descriptive study of environmental health in health services. The criterion used was to diagnose the environmental management of twelve health establishments spread across the three levels of complexity of the Unified Health System (Sistema Único de Saúde, SUS). Among the subcriteria used, waste management is the one of greatest concern, while water quality is considered good. The analysis of the data reveals that 66% of the establishments received a poor environmental ranking, 17% critical and 17% appropriate, showing that health establishments at all three levels of complexity of the SUS need urgent structural, environmental and educational interventions.
Resumo:
Reality points to global environmental sustainability as the only viable option for addressing the crisis at hand. The move towards sustainability calls for generating and evaluating systems in that direction, through the incorporation of environmental requirements in line with the National Policy on Solid Waste. This research therefore argues for the importance of a social and environmental vision, complementing the technical view, of the solid waste management system of East London, a municipality whose system's inadequacies amount to environmental and health risks. Through observation, the application of a model of sustainability indicators, and content analysis of interviews, the research investigates whether the principles of sustainability and social participation are present and what the perception of risk about the system's inadequacies is. The results confirmed the hypotheses of the study and paint a worrying picture: very unfavorable sustainability indicators, lack of channels for participation, investments uncommitted to the management system, devaluation of the waste collectors, and differing perceptions of risk that lead the actors to act in isolation. This worrying situation is eased by a series of elements that represent opportunities for integrating environmental principles into the system. Despite the managers' unavailability to take part in the research, the system still presents an opportunity to implement public policies in the area of solid waste, such as the preparation of the municipal waste plan, the institutionalization of selective collection, and the organization of cooperatives with the support of companies present in the city and of educational institutions such as the Federal Institute. The research thus contributes to building instruments for the quality of life of residents, for the socioeconomic conditions of the collectors, and for the move towards a sustainable society.
Resumo:
Helicobacter pylori is the main cause of gastritis, gastroduodenal ulcer disease and gastric cancer. The treatment most recommended for eradication of this bacterium often leads to side effects and poor patient compliance, which induce treatment failure. Magnetic drug targeting is a very efficient method that overcomes these drawbacks by associating the drug with a magnetic compound; such an approach may allow the system to be slowed down and retained in a specific target area by an external magnetic field. This work reports the synthesis and characterization of polymeric magnetic particles loaded with the antimicrobial agents currently used for the treatment of Helicobacter pylori infections, aiming at the production of a magnetic drug delivery system for the oral route. Optical microscopy, scanning electron microscopy, transmission electron microscopy, X-ray powder diffraction, nitrogen adsorption/desorption isotherms and vibrating sample magnetometry revealed that the magnetite particles, produced by the co-precipitation method, consisted of a large number of aggregated nanometer-size crystallites (about 6 nm), forming superparamagnetic micrometric particles with high magnetic susceptibility and an average diameter of 6.8 ± 0.2 μm. The polymeric magnetic particles produced by spray drying had a core-shell structure built from the magnetite microparticles, amoxicillin and clarithromycin, coated with Eudragit® S100; this system presented an average diameter of 14.2 ± 0.2 μm. The amount of magnetite present in the system may be tailored by suitably controlling the suspension used to feed the spray dryer; in the present work it was 2.9% (w/w). The magnetic system produced may prove very promising for the eradication of Helicobacter pylori infections.
Resumo:
In February 2011, the National Agency of Petroleum, Natural Gas and Biofuels (ANP) published a new Technical Regulation for Onshore Pipelines for the Handling of Petroleum, Natural Gas and Derivatives (RTDT). Among other things, the RTDT made monitoring and leak detection systems compulsory in all onshore pipelines in the country. This document presents a study of a leak detection method based on pressure transients. The study was conducted on an industrial pipeline, 16" in diameter and 9.8 km long, which is fully pressurized and carries a multiphase mixture of crude oil, water and natural gas. For the study, an infrastructure was built for data acquisition and for the validation of detection algorithms. The system was designed with a SCADA architecture. Piezoresistive sensors were installed at the ends of the pipeline, and Digital Signal Processors (DSPs) were used for the sampling, storage and processing of the data. The study was based on simulating leaks through valves and searching for patterns that characterize the occurrence of such phenomena.
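Transient-based leak detection looks for the characteristic sudden pressure drop (a negative pressure wave) that a leak produces at the sensors. A minimal sketch of one common formulation, a sliding-window rate-of-change threshold; the window length and threshold here are illustrative assumptions, not values from the study:

```python
# Illustrative sliding-window detector for a negative pressure transient.

def detect_transient(samples, fs_hz, window_s=0.5, drop_threshold=0.2):
    """Return the first index where the mean pressure falls between
    adjacent windows faster than drop_threshold (pressure units/second)."""
    n = max(1, int(window_s * fs_hz))
    for i in range(n, len(samples) - n):
        before = sum(samples[i - n:i]) / n
        after = sum(samples[i:i + n]) / n
        rate = (after - before) / window_s  # mean rate of change
        if rate < -drop_threshold:          # sharp drop -> leak candidate
            return i
    return None

# Example: steady 10 bar with a sudden drop to 9 bar halfway through.
fs = 100  # sampling rate in Hz
signal = [10.0] * 500 + [9.0] * 500
print(detect_transient(signal, fs))  # index shortly after the drop begins
```

A production detector would also correlate the arrival times at both ends of the pipeline to locate the leak, which is why sensors are installed at the extremities.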
Resumo:
The fundamental senses of the human body are vision, hearing, touch, taste and smell. These senses are the functions that establish our relationship with the environment. Vision serves as the sensory receptor responsible for obtaining information from the outside world that is sent to the brain. The gaze reflects a person's attention, intention and interest. Therefore, the estimation of gaze direction using computational tools provides a promising alternative to improve the capacity of human-computer interaction, mainly with respect to people who suffer from motor disabilities. Thus, the objective of this work is to present a non-intrusive system that basically uses a personal computer and a low-cost webcam, combined with digital image processing techniques, wavelet transforms and pattern recognition, such as artificial neural network models, resulting in a complete system that covers everything from image acquisition (including face detection and eye tracking) to the estimation of gaze direction. The obtained results show the feasibility of the proposed system, as well as several of its advantages.
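A minimal sketch of the wavelet feature-extraction step, assuming the PyWavelets package and substituting a simple nearest-centroid classifier for the thesis's neural network; the 16x16 patches and the three gaze classes are illustrative assumptions:

```python
import numpy as np
import pywt  # PyWavelets, assumed available

def eye_features(eye_img):
    """2-D Haar wavelet decomposition of an eye patch; approximation and
    detail coefficients are flattened into one feature vector."""
    cA, (cH, cV, cD) = pywt.dwt2(eye_img, "haar")
    return np.concatenate([c.ravel() for c in (cA, cH, cV, cD)])

def train_centroids(patches, labels):
    """Stand-in for the neural network: one centroid per gaze direction."""
    feats = np.array([eye_features(p) for p in patches])
    return {lbl: feats[np.array(labels) == lbl].mean(axis=0)
            for lbl in set(labels)}

def estimate_gaze(eye_img, centroids):
    f = eye_features(eye_img)
    return min(centroids, key=lambda lbl: np.linalg.norm(f - centroids[lbl]))

# Illustrative usage with random 16x16 "eye patches" and three gaze classes.
rng = np.random.default_rng(0)
patches = [rng.random((16, 16)) for _ in range(30)]
labels = ["left", "center", "right"] * 10
centroids = train_centroids(patches, labels)
print(estimate_gaze(patches[0], centroids))
```

In the real pipeline the patches would come from the face-detection and eye-tracking stages rather than random data, and the classifier would be the trained neural network.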
Resumo:
Separation methods have limited application as a result of operational costs, low throughput and the long time needed to separate the fluids. Still, these treatment methods are important because of the need to extract unwanted contaminants in oil production. The water content, and in particular the concentration of oil in the water, should be minimal (around 20 to 40 ppm) before it can be discharged to the sea. Given the need for primary treatment, the objective of this project is to study and implement algorithms for the identification of polynomial NARX (Nonlinear Auto-Regressive with Exogenous Input) models in closed loop, to implement structural identification, and to compare control strategies using PI control and predictive control with NARX models updated on-line, applied to a three-phase separator in series with three batteries of hydrocyclones. The main goals of this project are: to obtain an optimized phase separation process that keeps the system regulated even in the presence of oil surges; to show that it is possible to obtain optimized tunings for the controllers by analyzing the loop as a whole; and to evaluate and compare the PI and predictive control strategies applied to the process. To accomplish these goals, a simulator was used to represent the three-phase separator and the hydrocyclones. Algorithms were developed for system identification (NARX) using RLS (Recursive Least Squares), along with methods for model structure detection. Predictive control algorithms with the NARX model updated on-line were also implemented, as well as optimization algorithms using PSO (Particle Swarm Optimization). The project ends with a comparison of the results obtained with the PI and predictive controllers (both tuned by the particle swarm algorithm) on the simulated system, concluding that the optimizations performed make the system less sensitive to external perturbations and that, when optimized, the two controllers show similar results, with the predictive controller somewhat less sensitive to disturbances.
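On-line NARX identification with RLS updates the parameter estimate at every sample via the standard recursion K = P*phi / (lambda + phi' P phi), theta <- theta + K*(y - phi' theta), P <- (P - K phi' P) / lambda. A minimal sketch applied to an illustrative polynomial NARX regressor; the model structure and the true parameters below are assumptions for demonstration, not the structure identified in the thesis:

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One Recursive Least Squares step with forgetting factor lam."""
    phi = phi.reshape(-1, 1)
    K = P @ phi / (lam + phi.T @ P @ phi)      # gain vector
    theta = theta + (K * (y - phi.T @ theta)).ravel()
    P = (P - K @ phi.T @ P) / lam              # covariance update
    return theta, P

# Illustrative polynomial NARX structure: y(k) depends on y(k-1), u(k-1)
# and the nonlinear cross-term y(k-1)*u(k-1).
def regressor(y_prev, u_prev):
    return np.array([y_prev, u_prev, y_prev * u_prev])

true_theta = np.array([0.7, 0.3, -0.1])       # assumed "plant" parameters
theta = np.zeros(3)
P = np.eye(3) * 1000.0                        # large initial covariance

rng = np.random.default_rng(1)
y_prev = 0.0
for _ in range(500):
    u_prev = rng.uniform(-1, 1)               # excitation input
    phi = regressor(y_prev, u_prev)
    y = phi @ true_theta + rng.normal(scale=0.01)  # noisy measurement
    theta, P = rls_update(theta, P, phi, y)
    y_prev = y

print(np.round(theta, 3))  # converges close to true_theta
```

The forgetting factor below 1 is what lets the predictive controller track slow plant changes, since old samples are progressively discounted.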