Resumo:
Nowadays, telecommunications is one of the most dynamic and strategic areas in the world. Organizations are constantly seeking new management practices in an increasingly competitive environment where resources are becoming scarce. In this scenario, data obtained from business and corporate processes take on even greater importance, although these data are not yet adequately explored. Knowledge Discovery in Databases (KDD) then appears as an option for studying complex problems in different areas of management. This work proposes both a systematization of KDD activities, using concepts from different methodologies such as the CRISP-DM, SEMMA and Fayyad approaches, and a study of the viability of multivariate regression models to explain corporate telecommunications sales using performance indicators. Statistical methods were thus outlined to analyze the effects of such indicators on the behavior of business productivity. Based on business knowledge and standard statistical analysis, equations were defined and evaluated through their respective coefficients of determination. Hypothesis tests were also conducted on the parameters in order to validate the regression models. The results show that there is a relationship between these indicators and the amount of sales.
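As a hedged illustration of the kind of analysis described above, the sketch below fits a multivariate linear regression of sales on performance indicators and reports the coefficient of determination and parameter hypothesis tests. The indicator names and the synthetic data are invented for the example and are not taken from the study.

```python
# Minimal sketch: multivariate regression of sales on performance indicators.
# The indicator names and synthetic data are hypothetical, for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120
indicators = {
    "network_availability": rng.normal(99.0, 0.5, n),   # hypothetical KPI (%)
    "mean_repair_time": rng.normal(4.0, 1.0, n),         # hypothetical KPI (hours)
    "active_sales_force": rng.normal(50, 10, n),         # hypothetical KPI (people)
}
X = np.column_stack(list(indicators.values()))
# Synthetic sales generated from the indicators plus noise, so the fit is meaningful.
sales = 10 + 0.8 * X[:, 0] - 1.5 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 2, n)

X = sm.add_constant(X)                 # adds the intercept term
model = sm.OLS(sales, X).fit()         # ordinary least squares fit

print(model.rsquared)                  # coefficient of determination (R^2)
print(model.params)                    # fitted parameters
print(model.pvalues)                   # t-test p-values for each parameter
```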
Resumo:
Posture is one of the most worrying problems dentists face, because of the high incidence of low back pathologies related to the professional activity, despite the developments in the field of Dental Ergonomics. This work took place at the dental school clinic of a Federal University and was grounded in ergonomic principles for the workplace. Its main objective was to analyze the determinants of the inadequate postures adopted by students, inasmuch as the adoption of non-ergonomic methods at the school clinic may lead them to develop inadequate postures in their working environment. The analysis of the activity showed that it requires complex procedures in the patient's mouth. Thus, when the students carry out the activity, they start to adopt, although unconsciously, inadequate postures that ease visual accuracy and access to the operating field. When there are no internal warning mechanisms (body awareness) or external ones (the professor's or the partner's counseling) regarding posture or possible risks that would lead them to self-correction, the students become vulnerable to musculoskeletal disorders. Time pressure also contributes, because the students are expected to perform their task within a predetermined clinical time. The variability of each patient, as well as the stress caused by the expectation of getting the work done in time, makes the students rush, believing they will waste time if they help their partners or use indirect vision. We could also notice that there was no assistant to perform minor tasks, nor a professor who could actually connect the knowledge of Ergonomics to its working practice. The conclusions of this work highlight the need to widen the discussion, in the academic environment, regarding health professionals in places such as universities. Ergonomic principles applied to the workplace aim at a multidisciplinary analysis, based on the experience of students, professors, staff members and janitors, that can contribute to reflection on the issue and, consequently, to actions that bring positive changes to the working environment.
Resumo:
Telecommunications play a key role in contemporary society. However, as new technologies are put into the market, the demand for new products and services that depend on the offered infrastructure also grows, making the problems of planning telecommunications networks, despite the advances in technology, increasingly large and complex. Many of these problems, however, can be formulated as combinatorial optimization models, and the use of heuristic algorithms can help solve them in the planning phase. In this project, two pure metaheuristic implementations, a Genetic Algorithm (GA) and a Memetic Algorithm (MA), plus a third, hybrid implementation, a Memetic Algorithm with Vocabulary Building (MA+VB), were developed for a problem in telecommunications known in the literature as the SONET Ring Assignment Problem (SRAP). The SRAP arises during the planning stage of the physical network and consists in selecting the connections between a number of locations (customers) so as to meet a series of constraints at the lowest possible cost. This problem is NP-hard, so efficient exact algorithms (of polynomial complexity) are not known and may, indeed, not even exist.
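To make the genetic-algorithm side of this concrete, the sketch below evolves a simplified ring-assignment encoding: each gene assigns a location to a ring, and the fitness penalizes rings whose load exceeds a capacity. The demand matrix, capacity, load model and operators are all illustrative assumptions; this is not the GA/MA+VB implementation developed in the work.

```python
# Minimal genetic-algorithm sketch for a simplified ring-assignment encoding.
# Demands, capacity and operators are illustrative, not the thesis method.
import random

random.seed(1)
N_LOC, MAX_RINGS, CAPACITY = 12, 4, 30
# Hypothetical symmetric demand matrix between locations.
demand = [[0] * N_LOC for _ in range(N_LOC)]
for i in range(N_LOC):
    for j in range(i + 1, N_LOC):
        demand[i][j] = demand[j][i] = random.randint(0, 5)

def fitness(chrom):
    """Lower is better: number of rings used plus a penalty for capacity violations."""
    penalty = 0
    for r in set(chrom):
        members = [i for i, g in enumerate(chrom) if g == r]
        # Simplified load model: demand leaving the ring plus demand inside it.
        load = sum(demand[i][j] for i in members for j in range(N_LOC) if j not in members) \
             + sum(demand[i][j] for i in members for j in members if i < j)
        penalty += max(0, load - CAPACITY)
    return len(set(chrom)) + 10 * penalty

def crossover(a, b):
    cut = random.randrange(1, N_LOC)
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.1):
    return [random.randrange(MAX_RINGS) if random.random() < rate else g for g in chrom]

pop = [[random.randrange(MAX_RINGS) for _ in range(N_LOC)] for _ in range(40)]
for _ in range(200):
    pop.sort(key=fitness)
    elite = pop[:10]
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(30)]
print(fitness(min(pop, key=fitness)))
```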
Parallel evolutionary algorithm for the problem of assigning locations to rings in SONET/SDH networks
Resumo:
Telecommunications play a fundamental role in contemporary society, one of their main roles being to allow people to connect and integrate into the society in which they live and, thereby, to accelerate development through knowledge. However, as new technologies are introduced on the market, the demand for new products and services that depend on the offered infrastructure increases, making the problems of planning telecommunication networks increasingly large and complex. Many of these problems, however, can be formulated as combinatorial optimization models, and the use of heuristic algorithms can help solve them in the planning phase. This work proposes the development of a Parallel Evolutionary Algorithm to be applied to a telecommunications problem known in the literature as the SONET Ring Assignment Problem (SRAP). This problem is NP-hard; it arises during the physical planning of a telecommunication network and consists in determining the connections between locations (customers), satisfying a series of constraints at the lowest possible cost. Experimental results illustrate the effectiveness of the parallel Evolutionary Algorithm, compared with other methods, in obtaining solutions that are either optimal or very close to optimal.
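One common way to parallelize an evolutionary algorithm is to distribute fitness evaluation across worker processes, as in the generic sketch below. The toy fitness function and operators are placeholders standing in for the SRAP-specific evaluation; this is not the parallel algorithm of the thesis.

```python
# Generic sketch of parallel fitness evaluation for an evolutionary algorithm
# using a process pool; the toy fitness function is purely illustrative.
import random
from multiprocessing import Pool

def fitness(chromosome):
    # Placeholder objective: in SRAP this would compute ring loads and penalties.
    return sum(chromosome)

def evolve(generations=50, pop_size=64, length=20, workers=4):
    random.seed(0)
    pop = [[random.randint(0, 3) for _ in range(length)] for _ in range(pop_size)]
    with Pool(workers) as pool:
        for _ in range(generations):
            scores = pool.map(fitness, pop)          # fitness evaluated in parallel
            ranked = [c for _, c in sorted(zip(scores, pop))]
            elite = ranked[: pop_size // 4]
            pop = elite + [
                [g if random.random() > 0.05 else random.randint(0, 3)
                 for g in random.choice(elite)]
                for _ in range(pop_size - len(elite))
            ]
    return min(pop, key=fitness)

if __name__ == "__main__":
    print(evolve())
```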
Resumo:
This work deals with a mathematical foundation for digital signal processing from the point of view of interval mathematics. It intends to treat the open problem of precision and representation of data in digital systems through an interval version of signal representation. Signal processing is a rich and complex area; therefore, this work narrows its focus to linear time-invariant systems. A vast literature exists in the area, but some concepts of interval mathematics need to be redefined or elaborated for the construction of a solid theory of interval signal processing. We construct basic foundations for signal processing in the interval setting, such as the basic properties of linearity, stability and causality, and an interval version of linear systems and its properties. Interval versions of the convolution and of the Z-transform are presented. The convergence of systems is analyzed using the interval Z-transform, an essentially interval distance and interval complex numbers, with an application to an interval filter.
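A minimal sketch of the idea of interval-valued signals is given below: each sample is an interval enclosing the uncertainty of a value, interval addition and multiplication are defined on endpoints, and discrete convolution is carried out with those operations. This is a simplified illustration using plain tuples, not the formalism developed in the work.

```python
# Sketch of interval arithmetic and an interval version of discrete convolution,
# using plain (lo, hi) tuples; a simplified illustration, not the thesis formalism.
def iadd(a, b):
    return (a[0] + b[0], a[1] + b[1])

def imul(a, b):
    products = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(products), max(products))

def iconv(x, h):
    """Discrete convolution of two finite interval-valued sequences."""
    y = [(0.0, 0.0)] * (len(x) + len(h) - 1)
    for n in range(len(x)):
        for k in range(len(h)):
            y[n + k] = iadd(y[n + k], imul(x[n], h[k]))
    return y

# Each sample is an interval that encloses the uncertainty of the measured value.
x = [(0.9, 1.1), (1.9, 2.1), (-0.1, 0.1)]
h = [(0.5, 0.5), (0.25, 0.25)]
print(iconv(x, h))
```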
Resumo:
With the rapid growth of databases of various types (text, multimedia, etc.), there is a need for methods to order, access and retrieve data in a simple and fast way. Image databases, in addition to these needs, require a representation of the images in which their semantic content is considered. Accordingly, several proposals have been made, such as retrieval based on textual annotations. In the annotation approach, retrieval is based on comparing the textual description a user provides for an image with the descriptions of the images stored in the database. Among its drawbacks, the textual description is highly dependent on the observer, besides the computational effort required to describe all the images in the database. Another approach is content-based image retrieval (CBIR), in which each image is represented by low-level features such as color, shape and texture. Results in the area of CBIR have been very promising; however, representing image semantics by low-level features remains an open problem. New feature-extraction algorithms as well as new indexing methods have been proposed in the literature, but these algorithms are becoming increasingly complex. It is therefore natural to ask whether there is a relationship between the semantics and the low-level features extracted from an image; if so, which descriptors better represent the semantics; and, consequently, how to use descriptors to represent the content of the images. The work presented in this thesis proposes a method to analyze the relationship between low-level descriptors and semantics in an attempt to answer these questions. It was also observed that there are three ways to index images: using composite feature vectors, using parallel and independent index structures (one for each descriptor or set of descriptors), and using feature vectors sorted in a sequential order. The first two have been widely studied and applied in the literature, but there was no record of the third having been explored. This thesis therefore also proposes an index based on a sequential structure of descriptors, in which the order of the descriptors is based on the relationship between each descriptor and the semantics of the users. Finally, the index proposed in this thesis performed better than the traditional approaches, and it was shown experimentally that the order in this sequence is important and that there is a direct relationship between this order and the relationship of the low-level descriptors with the semantics of the users.
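The sketch below gives one possible, hedged reading of how descriptors could be ordered by their agreement with a semantic ground truth: for each descriptor, feature-space proximity is correlated with same-class membership, and descriptors are ranked by that score. The data, the descriptor names and the agreement measure are hypothetical stand-ins, not the analysis actually used in the thesis.

```python
# Sketch of ordering descriptors by how well each one agrees with a semantic
# ground truth (image class labels); data, descriptors and the agreement score
# are all hypothetical stand-ins for the analysis described above.
import numpy as np

rng = np.random.default_rng(0)
n_images = 60
labels = rng.integers(0, 3, n_images)                 # hypothetical semantic classes
descriptors = {                                        # hypothetical low-level features
    "color_hist": rng.random((n_images, 16)),
    "texture": rng.random((n_images, 8)),
    "shape": rng.random((n_images, 4)),
}

def semantic_agreement(feat, labels):
    """Correlation between feature-space proximity and same-class membership."""
    d = np.linalg.norm(feat[:, None, :] - feat[None, :, :], axis=-1)
    same = (labels[:, None] == labels[None, :]).astype(float)
    iu = np.triu_indices(len(labels), k=1)
    # Negative distance so that "closer" aligns with "same class".
    return np.corrcoef(-d[iu], same[iu])[0, 1]

ranking = sorted(descriptors, key=lambda k: semantic_agreement(descriptors[k], labels),
                 reverse=True)
print(ranking)   # order in which descriptors would be placed in the sequential index
```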
Resumo:
In this work we use interval mathematics to establish interval counterparts for the main tools used in digital signal processing. More specifically, the approach developed here addresses signals, systems, sampling, quantization, coding and Fourier transforms. A detailed study of some interval arithmetics that handle complex numbers is provided, namely complex (rectangular) interval arithmetic, circular complex arithmetic and interval arithmetic for polar sectors. This leads us to investigate some properties that are relevant to the development of a theory of interval digital signal processing. It is shown that the sets IR and R(C), endowed with any correct arithmetic, are not algebraic fields, meaning that those sets do not behave like the real and complex numbers. An alternative to the notion of interval complex width is also provided, and the Kulisch-Miranker order is used to write complex numbers in interval form, enabling operations on endpoints. The use of interval signals and systems is made possible by the representation of values in floating-point systems: if a number x in R is not representable in a floating-point system F, then it is mapped to an interval whose lower endpoint is the largest number in F smaller than x and whose upper endpoint is the smallest number in F greater than x. This interval representation is the starting point for definitions such as interval signals and systems taking real or complex values. It provides the extension of notions such as causality, stability, time invariance, homogeneity, additivity and linearity to interval systems. The process of quantization is extended to its interval counterpart, and interval versions of quantization levels, quantization error and encoded signal are then provided. It is shown that the interval quantization levels represent complex quantization levels and that the classical quantization error lies within the interval quantization error. An estimate for the interval quantization error and an interval version of the Z-transform (and hence of the Fourier transform) are provided. Finally, the results of a Matlab implementation are given.
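The enclosure described above can be illustrated with a few lines of code: an exact rational value that is not representable in double precision is bracketed by the two adjacent representable numbers. This is a small sketch of the idea only (it assumes Python 3.9+ for math.nextafter), not the thesis construction.

```python
# Sketch of the enclosure described above: an exact value that is not representable
# in the floating-point system is mapped to the pair of adjacent representable
# numbers that bracket it (requires Python 3.9+ for math.nextafter).
from fractions import Fraction
import math

def enclose(value: Fraction):
    """Bracket an exact rational value by adjacent representable doubles."""
    f = float(value)                     # nearest double to the exact value
    if Fraction(f) == value:             # exactly representable: degenerate interval
        return (f, f)
    if Fraction(f) < value:
        return (f, math.nextafter(f, math.inf))
    return (math.nextafter(f, -math.inf), f)

print(enclose(Fraction(1, 10)))          # 1/10 is not representable in binary
```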
Resumo:
Simulations based on cognitively rich agents can become a very computationally intensive task, especially when the simulated environment represents a complex system. This situation becomes worse when time constraints are present. Such simulations would benefit from a mechanism that improves the way agents perceive and react to changes in these environments; in other words, an approach to improve the efficiency (performance and accuracy) of the decision process of autonomous agents in a simulation would be useful. In complex environments full of variables, it is possible that not all information available to the agent is necessary for its decision-making process, depending on the task being performed. The agent would then need to filter the incoming perceptions in the same way we do with our focus of attention. By using a focus of attention, only the information that really matters to the agent's running context is perceived (cognitively processed), which can improve the decision-making process. The architecture proposed herein presents a structure for cognitive agents divided into two parts: 1) the main part contains the reasoning/planning process, the knowledge and the affective state of the agent, and 2) a set of behaviors that are triggered by planning in order to achieve the agent's goals. Each of these behaviors has a focus of attention that is dynamically adjustable at runtime, according to the variation of the agent's affective state. The focus of each behavior is divided into a qualitative focus, responsible for the quality of the perceived data, and a quantitative focus, responsible for the quantity of the perceived data. Thus, the behavior is able to filter the information sent by the agent's sensors and build a list of perceived elements containing only the information necessary to the agent, according to the context of the behavior currently running. Inspired by the human focus of attention, the agent is also endowed with an affective state. The agent's affective state is based on theories of human emotion, mood and personality. This model serves as a basis for the mechanism of continuous adjustment of the agent's attention focus, both the qualitative and the quantitative one. With this mechanism, the agent can adjust its focus of attention during the execution of a behavior in order to become more efficient in the face of environmental changes. The proposed architecture can be used in a very flexible way: the focus of attention can be kept fixed (neither the qualitative nor the quantitative focus changes), or different combinations of qualitative and quantitative focus variation can be used. The architecture was built on a platform for BDI agents, but its design allows it to be used with any other type of agent, since the implementation is made only in the perception layer of the agent. In order to evaluate the proposed contribution, an extensive series of experiments was conducted on an agent-based simulation of a fire-spreading scenario. In the simulations, agents using the architecture proposed in this work are compared with similar agents (with the same reasoning model) that process all the information sent by the environment. Intuitively, one would expect the omniscient agents to be more efficient, since they can consider every possible option before making a decision. However, the experiments showed that attention-focus-based agents can be as efficient as the omniscient ones, with the advantage of being able to solve the same problems in a significantly reduced time. Thus, the experiments indicate the efficiency of the proposed architecture.
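A simplified sketch of the two-part attention focus is shown below: a qualitative focus selects which perception types are relevant to the running behavior, and a quantitative focus caps how many percepts are processed, with a single scalar "arousal" value standing in for the affective state. All names and the modulation rule are illustrative assumptions, not the BDI-platform implementation described in the work.

```python
# Simplified sketch of a qualitative + quantitative attention focus for a behavior,
# modulated by a scalar affective value; names and rules are illustrative only.
from dataclasses import dataclass

@dataclass
class Percept:
    kind: str        # e.g. "fire", "smoke", "civilian"
    salience: float  # how strongly the sensor rates this percept

class FocusedBehavior:
    def __init__(self, relevant_kinds, base_capacity=5):
        self.relevant_kinds = set(relevant_kinds)   # qualitative focus
        self.base_capacity = base_capacity          # quantitative focus baseline

    def perceive(self, sensor_data, arousal):
        """Filter raw sensor data according to the current affective state."""
        # Higher arousal keeps only relevant kinds but allows more of them through.
        capacity = int(self.base_capacity * (1 + arousal))
        candidates = [p for p in sensor_data if p.kind in self.relevant_kinds]
        candidates.sort(key=lambda p: p.salience, reverse=True)
        return candidates[:capacity]

sensor_data = [Percept("fire", 0.9), Percept("smoke", 0.4),
               Percept("civilian", 0.8), Percept("tree", 0.1)]
extinguish = FocusedBehavior(relevant_kinds={"fire", "smoke"})
print(extinguish.perceive(sensor_data, arousal=0.2))
```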
Resumo:
Frequency selective surfaces (FSS) are structures consisting of periodic arrays of conductive elements, called patches, which are usually very thin and printed on dielectric layers, or of apertures perforated in very thin metallic surfaces, for applications in the microwave and millimeter-wave bands. These structures are often used in aircraft, missiles, satellites, radomes, reflector antennas, high-gain antennas and microwave ovens, for example. Their main purpose is to filter frequency bands, which can be transmitted or rejected depending on the required application. In turn, modern communication systems such as GSM (Global System for Mobile Communications), RFID (Radio Frequency Identification), Bluetooth, Wi-Fi and WiMAX, whose services are highly demanded by society, have required the development of antennas whose main features are low profile, low cost and reduced dimensions and weight. In this context, the microstrip antenna presents itself as an excellent choice for today's communication systems because, in addition to intrinsically meeting the requirements mentioned, planar structures are easy to manufacture and to integrate with other components in microwave circuits. Consequently, the analysis and synthesis of these devices, mainly due to the wide variety of possible shapes, sizes and operating frequencies of their elements, has been carried out with full-wave models such as the finite element method, the method of moments and the finite-difference time-domain method. These methods are accurate but demand great computational effort. In this context, computational intelligence (CI) has been used successfully, as a very appropriate auxiliary tool, in the design and optimization of planar microwave structures, given the complexity of the geometry of the antennas and FSS considered. Computational intelligence is inspired by natural phenomena such as learning, perception and decision-making, using techniques such as artificial neural networks, fuzzy logic, fractal geometry and evolutionary computation. This work studies the application of computational intelligence, using metaheuristics such as genetic algorithms and swarm intelligence, to the optimization of antennas and frequency selective surfaces. Genetic algorithms are computational search methods based on genetics and on the theory of natural selection proposed by Darwin, used to solve complex problems, e.g., problems whose search space grows with the size of the problem. Particle swarm optimization is characterized by the use of collective intelligence and has been applied to optimization problems in many research areas. The main objective of this work is the use of computational intelligence in the analysis and synthesis of antennas and FSS. The structures considered were a ring-type planar microstrip monopole and a cross-dipole FSS. Optimization algorithms were developed, and results were obtained for the optimized geometries of the antennas and FSS considered. To validate the results, several prototypes were designed, constructed and measured. The measured results showed excellent agreement with the simulated ones. Moreover, the results obtained in this study were compared with simulations performed with commercial software, and excellent agreement was also observed. Specifically, the efficiency of the CI techniques used was evidenced by simulated and measured results, both in optimizing the bandwidth of an antenna for wideband (UWB, Ultra Wideband) operation using a genetic algorithm, and in optimizing the bandwidth by adjusting the length of the air gap between two frequency selective surfaces using a particle swarm optimization algorithm.
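A minimal particle swarm optimization sketch for a single design variable (the air gap between two FSS layers) is given below. The objective is a made-up smooth surrogate standing in for the bandwidth that a full-wave simulation would return; the parameter values are generic assumptions, not those of the work.

```python
# Minimal PSO sketch for one design variable (air gap, in mm) against a
# hypothetical surrogate objective standing in for a simulated bandwidth.
import random

random.seed(0)

def objective(gap_mm):
    # Hypothetical surrogate: peak "bandwidth" around a 6 mm gap.
    return -((gap_mm - 6.0) ** 2) + 25.0

def pso(n_particles=15, iterations=60, lo=1.0, hi=12.0, w=0.7, c1=1.5, c2=1.5):
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]
    gbest = max(pos, key=objective)
    for _ in range(iterations):
        for i in range(n_particles):
            vel[i] = (w * vel[i]
                      + c1 * random.random() * (pbest[i] - pos[i])
                      + c2 * random.random() * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))
            if objective(pos[i]) > objective(pbest[i]):
                pbest[i] = pos[i]
            if objective(pos[i]) > objective(gbest):
                gbest = pos[i]
    return gbest

print(pso())   # converges near the surrogate optimum of 6 mm
```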
Resumo:
Currently, one of the biggest challenges for the field of data mining is to perform cluster analysis on complex data. Several techniques have been proposed but, in general, they only achieve good results in specific domains, and there is no consensus on what would be the best way to group this kind of data. In general, these techniques fail due to unrealistic assumptions about the true probability distribution of the data. Based on this, this thesis proposes a new measure, based on the Cross Information Potential, that uses representative points of the dataset and statistics extracted directly from the data to measure the interaction between groups. The proposed approach allows us to use all the advantages of this information-theoretic descriptor while overcoming the limitations imposed by its own nature. From this measure, two cost functions and three algorithms are proposed to perform cluster analysis. Since the use of Information Theory captures the relationship between different patterns regardless of assumptions about the nature of this relationship, the proposed approach achieved better performance than the main algorithms in the literature. These results hold both for synthetic data designed to test the algorithms in specific situations and for real data extracted from problems in different fields.
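For orientation, the sketch below estimates a cross information potential between two groups of points with Gaussian (Parzen) kernels, a standard information-theoretic descriptor of the interaction between clusters; the kernel width and data are illustrative, and the cost functions and algorithms built on top of it in the thesis are not reproduced here.

```python
# Sketch of the cross information potential between two groups of points,
# estimated with Gaussian (Parzen) kernels; kernel width and data are illustrative.
import numpy as np

def cross_information_potential(X, Y, sigma=1.0):
    """Mean pairwise Gaussian kernel between samples of two clusters."""
    diff = X[:, None, :] - Y[None, :, :]                  # all pairwise differences
    sq_dist = np.sum(diff ** 2, axis=-1)
    kernel = np.exp(-sq_dist / (2.0 * sigma ** 2))
    return kernel.mean()

rng = np.random.default_rng(0)
A = rng.normal(loc=0.0, scale=1.0, size=(100, 2))
B = rng.normal(loc=4.0, scale=1.0, size=(100, 2))
# Well-separated clusters interact weakly, so the cross potential is small.
print(cross_information_potential(A, A), cross_information_potential(A, B))
```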
Resumo:
This work presents a survey of the problems associated with the influence of observability and radial visualization on the design of monitoring systems for networks of great magnitude and complexity, and proposes solutions for part of these problems. Using the Theory of Complex Networks, two questions are addressed: (i) the location and number of nodes needed to guarantee data acquisition capable of effectively representing the state of the network, and (ii) the design of a model for visualizing network information capable of enhancing the ability to infer and understand its properties. The thesis establishes theoretical bounds for these questions and presents a study on the complexity of effective, efficient and scalable network monitoring.
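As a purely illustrative aside on the node-placement question, a common complex-network heuristic is to rank candidate monitoring points by a centrality measure, as sketched below on a synthetic scale-free topology. This is a generic heuristic and an assumption for the example, not the placement criterion developed in the thesis.

```python
# Illustrative heuristic: rank candidate monitoring points by betweenness
# centrality on a synthetic scale-free network (not the thesis criterion).
import networkx as nx

G = nx.barabasi_albert_graph(n=200, m=2, seed=42)   # synthetic scale-free network
centrality = nx.betweenness_centrality(G)
monitors = sorted(centrality, key=centrality.get, reverse=True)[:10]
print(monitors)   # candidate locations for data-acquisition nodes
```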
Resumo:
This paper describes the design, implementation and deployment of a system for industrial process control based on fuzzy logic, developed in Java with support for industrial communication through the OPC (OLE for Process Control) protocol. Apart from the Java framework, the software is completely independent of other platforms. It provides friendly and functional tools for modeling, building and editing complex fuzzy inference systems, and uses these systems to control a wide variety of industrial processes. The main requirements of the developed system are flexibility, robustness, reliability and ease of expansion.
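To make the fuzzy-inference idea concrete, the sketch below runs a minimal Mamdani-style inference with triangular membership functions for one input and one output. The linguistic variables, rules and defuzzification are invented for illustration and do not reproduce the Java/OPC system described above.

```python
# Minimal Mamdani-style fuzzy inference sketch (one input, one output);
# the linguistic variables and rule base are illustrative only.
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def infer(error):
    # Rule base: IF error is negative THEN valve low; zero -> medium; positive -> high.
    rules = [
        (tri(error, -10, -5, 0), 20.0),   # valve opening (%) for "low"
        (tri(error, -5, 0, 5), 50.0),     # "medium"
        (tri(error, 0, 5, 10), 80.0),     # "high"
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 50.0     # weighted-average defuzzification

for e in (-6, -1, 0, 3, 7):
    print(e, round(infer(e), 1))
```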
Resumo:
The development of wireless telecommunications in recent years has been remarkable, leading researchers to conceive new ideas and techniques whose aim is to increase the capacity and the quality of the systems' services. Ever smaller cells, ever higher frequencies and increasingly complex environments all demand more accurate propagation prediction models, with errors compatible with the next generations of communication systems. The objective of this work is to present the results of a propagation measurement campaign aimed at characterizing the coverage of mobile systems in the city of Natal (state of Rio Grande do Norte, Brazil). A mobile laboratory was set up using the infrastructure available to and frequently used by ANATEL. The measurements were taken in three different areas, characterized by tall buildings, high relief, and the presence of trees and towers of different heights: the city's central zone, a suburban/rural zone and a section of coast surrounded by sand dunes. It is important to highlight that the analysis took into consideration the current reality of cellular systems, whose coverage is provided by small cells with the intent of achieving greater frequency reuse and greater telephone traffic capacity. Most of the telephone traffic per cell in the city of Natal occurs within a range of less than 3 km from the base station. The frequency band used was 800 MHz, corresponding to the control channels of the respective sites, which adopt FSK modulation. This dissertation starts by presenting an overview of the models used for propagation prediction. It then describes the measurement methodology, in which the control channels of the cellular system were used. The results obtained were compared with several existing prediction models, and some adaptations were developed using regression techniques in order to obtain better-fitting solutions. Furthermore, according to regulations from the former Brazilian holding Telebrás, a minimum coverage of 90% of a previously determined area, 90% of the time, must be met when deploying cellular systems. For such a value to be reached, considerations and studies involving the specific environment being covered are important, and this work aims to contribute to this aspect.
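The kind of regression-based adaptation mentioned above can be sketched with a log-distance path-loss model whose exponent is fitted by least squares; the "measurements" below are synthetic and the parameter values are assumptions for illustration only.

```python
# Sketch of fitting the path-loss exponent of a log-distance model to measured data
# by linear regression; the "measurements" here are synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(1)
d = np.linspace(0.1, 3.0, 60)                    # distances from the base station (km)
true_n, pl_d0 = 3.2, 90.0                        # hypothetical exponent and reference loss (dB)
measured = pl_d0 + 10 * true_n * np.log10(d / 0.1) + rng.normal(0, 4, d.size)

x = 10 * np.log10(d / 0.1)                       # regressor of the log-distance model
slope, intercept = np.polyfit(x, measured, 1)    # least-squares fit
print(f"estimated exponent n = {slope:.2f}, PL(d0) = {intercept:.1f} dB")
```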
Resumo:
The advance of computer networks in recent decades is remarkable, whether in terms of transmission rates, the number of interconnected devices or the existing applications. In parallel, this progress is also visible in several sectors of automation, such as the industrial, commercial and residential sectors. In one of its branches we find hospital networks, which can make use of a range of services, from the simple registration of patients to surgery performed by a robot under the supervision of a physician. In the context of both worlds appear the applications of Telemedicine and Telehealth, which involve the real-time transfer of high-resolution images, sound, video and patient data. A problem then arises, since computer networks, originally developed for the transfer of less complex data, are now being used by services that involve high transfer rates and require quality of service (QoS) guarantees from the network. Thus, this work analyzes and compares the performance of a network subjected to this type of application in two different situations: the first without the use of QoS policies, and the second with the application of such policies, using as a test scenario the Metropolitan Health Network of the Federal University of Rio Grande do Norte (UFRN).
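As a small, hedged illustration of one QoS mechanism of the kind compared above, the sketch below marks a flow's packets with a DSCP value (Expedited Forwarding) so that routers configured with matching policies can prioritize them. The address and port are placeholder values, setting IP_TOS is supported mainly on Unix-like systems, and this is not the specific policy set evaluated in the work.

```python
# Illustrative sketch: mark outgoing UDP packets with DSCP 46 (Expedited Forwarding)
# so network policies can prioritize the flow. Address and port are placeholders;
# IP_TOS marking is supported mainly on Unix-like systems.
import socket

EF_TOS = 0xB8   # DSCP 46 shifted into the IP TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF_TOS)   # mark outgoing packets
sock.sendto(b"telemedicine sample payload", ("192.0.2.10", 5004))
sock.close()
```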