Resumo:
The search for ever smaller devices without loss of performance has been increasingly pursued by researchers in applied electromagnetics. Antennas using ceramic materials with a high dielectric constant, whether acting as the substrate of a radiating patch or as the radiating element itself, are prominent in current research due to the numerous advantages they offer: low profile, reduced dimensions compared to other devices, high radiation efficiency, suitability for the microwave and/or millimeter-wave range, low temperature coefficient and low cost. The reason for this high efficiency is that the dielectric losses of ceramics are very low compared to the materials commonly sold for printed circuit boards, such as fiberglass and phenolite. These characteristics make ceramic devices suitable for operation in the microwave band. By matching the design of patch antennas and/or dielectric resonator antennas (DRAs) to particular materials and powder synthesis methods, it is possible to choose a material with a dielectric constant appropriate for an antenna of the desired size. The main aim of this work is the design of patch and DRA antennas based on nanostructured ceramic powders (synthesized by combustion and by the polymeric precursor, or Pechini, method) for applications in the microwave band. The conventional mixed-oxide method was also used to obtain nanometric powders for the preparation of pellets and dielectric resonators. The high dielectric constant of the materials on which the devices were manufactured makes them good candidates for small size compared to other devices operating in the same frequency band. The structures analyzed are excited by three different techniques: i) microstrip line, ii) aperture coupling and iii) inductive coupling.
The efficiency of these techniques has been investigated experimentally and compared with simulations in Ansoft HFSS, which performs accurate analysis of the electromagnetic behavior of antennas via the finite element method (FEM). This thesis includes a literature review of microstrip antenna and DRA theory, as well as of the materials and synthesis methods for the ceramic powders used to manufacture the pellets and dielectric cylinders that make up the devices investigated. The dielectric media supporting the DRA and/or patch antennas are characterized through accurate finite-difference time-domain (FDTD) simulations based on the relative electric permittivity (εr) and loss tangent (tan δ) of these media. This work also presents a study of artificial neural networks, describing the network architecture used and its characteristics, as well as the training algorithms employed to model some parameters of the devices investigated.
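To make the FDTD characterization mentioned above concrete, the sketch below shows the core Yee update loop in one dimension; the thesis uses full simulations, so the grid size, soft Gaussian source and Courant factor here are illustrative assumptions, not the thesis's settings:

```python
import numpy as np

def fdtd_1d(num_cells=200, num_steps=400, eps_r=1.0):
    """Minimal 1-D FDTD (Yee) update loop with a Gaussian soft source.

    Illustrative only: eps_r models a homogeneous dielectric; the
    Courant number is fixed at 0.5, safely below the stability limit."""
    ez = np.zeros(num_cells)          # electric field on integer nodes
    hy = np.zeros(num_cells - 1)      # magnetic field on the staggered grid
    coeff = 0.5 / eps_r               # Courant factor folded with eps_r
    for t in range(num_steps):
        hy += 0.5 * (ez[1:] - ez[:-1])            # update H from curl of E
        ez[1:-1] += coeff * (hy[1:] - hy[:-1])    # update E from curl of H
        ez[num_cells // 4] += np.exp(-((t - 30) / 10.0) ** 2)  # soft source
    return ez

fields = fdtd_1d()
```

Each time step advances H half a step from the spatial differences of E, then E from H, which is what keeps the leapfrog scheme second-order accurate.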
Resumo:
This work presents a theoretical and numerical analysis using the transverse resonance technique (TRT) and a proposed MTRT applied to the dispersive characteristics of microstrip lines built on truncated isotropic and anisotropic dielectric substrates. The TRT uses the transmission-line model in the transverse section of the structure, allowing its dispersive behavior to be analyzed. The difference between TRT and MTRT lies basically in the resonance direction: while the TRT computes the resonance along the axis normal to the metallic strip, the MTRT considers the resonance in the plane parallel to the metallic strip. Although applying the MTRT results in a more complex equivalent circuit, it allows additional characterization, such as longitudinal-section electric (LSE) and longitudinal-section magnetic (LSM) modes, microstrips with truncated substrates, and structures with different dielectric regions. A computer program using the TRT and the MTRT proposed in this work is implemented for the characterization of microstrips on truncated isotropic and anisotropic substrates. In this analysis, both propagating and evanescent modes are considered, making it possible to characterize the dominant as well as the higher-order modes of the structure. Numerical results are presented for the effective permittivity, characteristic impedance and relative phase velocity of microstrip lines with different parameters and substrate dimensions. Agreement with results from the literature is shown, as well as with experimental results. In some cases, a convergence analysis is also performed by considering limiting conditions, such as the particular cases of isotropic materials or structures with dielectrics of infinite size found in the literature. The numerical convergence of the formulation is also analyzed. Finally, conclusions and suggestions for the continuation of this work are presented.
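As a quasi-static point of reference for the effective-permittivity results discussed above, the classic Hammerstad closed-form expression for a microstrip on an infinite (non-truncated) isotropic substrate can be sketched as follows; the TRT/MTRT of the thesis handles dispersion and truncation, which this zero-frequency formula does not, and the substrate values in the example are illustrative:

```python
import math

def microstrip_eps_eff(eps_r, w_over_h):
    """Quasi-static effective permittivity of a microstrip line
    (Hammerstad closed form, wide-strip branch W/h >= 1).

    eps_r    : relative permittivity of the (infinite) substrate
    w_over_h : strip width over substrate height
    """
    return (eps_r + 1) / 2 + (eps_r - 1) / 2 / math.sqrt(1 + 12 / w_over_h)

# Example: alumina-like substrate (eps_r = 9.8), W/h = 2
eps_eff = microstrip_eps_eff(9.8, 2.0)
```

The result always falls between 1 and εr, reflecting that the field lives partly in air and partly in the substrate.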
Resumo:
We propose a multi-resolution approach for surface reconstruction from clouds of unorganized points representing an object surface in 3D space. The proposed method uses a set of mesh operators and simple rules for selective mesh refinement, with a strategy based on Kohonen's self-organizing map. Basically, a self-adaptive scheme iteratively moves the vertices of an initially simple mesh toward the set of points, ideally the object boundary. Successive refinement and vertex motion are applied, leading to a more detailed surface in a multi-resolution, iterative scheme. Reconstruction was tested with several point sets, including different shapes and sizes. Results show generated meshes very close to the final object shapes. We include performance measurements and discuss robustness.
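The vertex-motion step of the scheme described above can be sketched as a Kohonen-style winner-take-all update; this is a simplification, since the actual method also moves topological neighbors of the winner and interleaves selective refinement:

```python
import numpy as np

def adapt_mesh(vertices, points, epochs=50, lr=0.5):
    """Kohonen-style adaptation: iteratively pull mesh vertices toward a
    point cloud. Each sample attracts its nearest vertex (the "winner");
    the learning rate decays over the epochs, as in a standard SOM."""
    v = vertices.copy()
    for epoch in range(epochs):
        rate = lr * (1 - epoch / epochs)          # decaying learning rate
        for p in points:
            winner = np.argmin(((v - p) ** 2).sum(axis=1))
            v[winner] += rate * (p - v[winner])   # move winner toward sample
    return v

rng = np.random.default_rng(0)
cloud = rng.normal(size=(100, 3))            # toy "scanned" point cloud
mesh = adapt_mesh(rng.normal(size=(8, 3)), cloud)
```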
Resumo:
Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that such problems cannot be solved in polynomial time. Initially, these solutions were centered on heuristics; currently, metaheuristics are preferred for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of a heuristic called "Operon" for constructing the information chains needed to implement transgenetic (evolutionary) algorithms, mainly using statistical methodology, namely Cluster Analysis and Principal Component Analysis; and the use of statistical analyses suitable for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains that promote an "intelligent" search of the solution space. The Traveling Salesman Problem (TSP) is the application target of a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by the adoption of a minimum threshold on the coefficient of variation of the individuals' fitness function, computed over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms and a Simulated Annealing algorithm. Three performance analyses of these algorithms are proposed. The first uses Logistic Regression, based on the probability of the tested algorithm finding an optimal solution for a TSP instance. The second uses Survival Analysis, based on the probability distribution of the execution time observed until an optimal solution is reached.
The third uses a non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with sizes of up to 1,655 cities. The first two experiments deal with tuning four parameters of the ProtoG algorithm in an attempt to improve its performance. The last four evaluate the performance of ProtoG against the three algorithms adopted. For these sixty-one instances, statistical tests provide evidence that ProtoG outperforms the three other algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which performance was evaluated through the PES, the average PES obtained with ProtoG was below 1% in almost half of the instances, reaching its largest value, 3.52%, for an instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare to find average PES values greater than 10% reported in the literature for instances of this size.
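The PES metric used in the last analysis reduces to a one-line computation; the tour lengths in the example are illustrative, not taken from the experiments:

```python
def percent_error(found_cost, best_known_cost):
    """Percent Error of the Solution (PES): percentage by which the tour
    found exceeds the best solution known in the literature."""
    return 100.0 * (found_cost - best_known_cost) / best_known_cost

# e.g. a tour of length 4200 against a best-known 4000 gives PES = 5%
pes = percent_error(4200, 4000)
```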
Resumo:
The increasing number of attacks on computer networks has been addressed by adding resources directly to the active routing equipment of these networks. In this context, firewalls have been consolidated as essential elements in controlling the inbound and outbound flow of packets in a network. With the advent of intrusion detection systems (IDS), efforts have been made to incorporate signature-based packet filtering into traditional firewalls. This integration brings IDS functions (such as signature-based filtering, until then a passive element) together with the functions already present in the firewall. Despite the efficiency of this incorporation in blocking attacks with known signatures, filtering at the application level introduces a natural delay in the analyzed packets and can degrade the machine's ability to filter the remaining packets, because of the resources this level of filtering demands. This work presents models that address this problem by re-routing packets for analysis by a sub-network with specific filtering. The proposed implementation of this model aims to reduce the performance problem and to open space for scenarios where other non-conventional filtering solutions (spam blocking, P2P traffic control/blocking, etc.) can be inserted into the filtering sub-network without overloading the main firewall of a corporate network.
Resumo:
This work introduces a new method for mapping environments with three-dimensional information obtained from visual data, aimed at accurate robotic navigation. Many 3D mapping approaches using occupancy grids typically require high computational effort both to build and to store the map. We introduce a 2.5-D occupancy-elevation grid map, a discrete mapping approach in which each cell stores the occupancy probability, the height of the terrain at that place in the environment, and the variance of this height. This 2.5-dimensional representation allows a mobile robot to know whether a place in the environment is occupied by an obstacle and how high that obstacle is, so it can decide whether the obstacle can be traversed. The sensory information needed to build the map is provided by a stereo vision system, modeled with a robust probabilistic approach that accounts for the noise present in stereo processing. The resulting maps support tasks such as decision making in autonomous navigation, exploration, localization and path planning. Experiments carried out with a real mobile robot demonstrate that the proposed approach yields useful maps for autonomous robot navigation.
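A minimal sketch of one grid cell and its height update, assuming a standard Gaussian-fusion rule; the thesis's probabilistic stereo model is more elaborate, and the field names and initial values here are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Cell:
    """One cell of a 2.5-D occupancy-elevation grid, as described above:
    occupancy probability plus terrain height and its variance."""
    p_occ: float = 0.5       # occupancy probability (0.5 = unknown)
    height: float = 0.0      # estimated terrain height (m)
    var: float = 1e6         # variance of the height estimate (large = unknown)

    def update_height(self, z, z_var):
        """Fuse a new noisy height measurement with the stored estimate
        using the product-of-Gaussians (Kalman-style) update."""
        k = self.var / (self.var + z_var)     # gain: trust vs. measurement
        self.height += k * (z - self.height)
        self.var = (1 - k) * self.var         # variance shrinks after fusion

c = Cell()
c.update_height(1.2, 0.04)   # first measurement dominates the flat prior
```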
Resumo:
Frequency selective surfaces (FSS) are structures consisting of periodic arrays of conductive elements, called patches, usually very thin and printed on dielectric layers, or of apertures perforated in very thin metallic surfaces, for applications in the microwave and millimeter-wave bands. These structures are often used in aircraft, missiles, satellites, radomes, reflector antennas, high-gain antennas and microwave ovens, for example. Their main purpose is to filter frequency bands, which can be transmitted or rejected depending on the requirements of the application. In turn, modern communication systems such as GSM (Global System for Mobile Communications), RFID (Radio Frequency Identification), Bluetooth, Wi-Fi and WiMAX, whose services are in high demand, have required the development of antennas whose main features are low cost, low profile, and reduced dimensions and weight. In this context, the microstrip antenna is an excellent choice for today's communication systems because, besides intrinsically meeting the requirements mentioned, planar structures are easy to manufacture and to integrate with other microwave circuit components. Consequently, the analysis and synthesis of these devices, mainly because of the wide range of possible shapes, sizes and operating frequencies of their elements, has been carried out with full-wave models, such as the finite element method, the method of moments and the finite-difference time-domain method. However, these methods, although accurate, demand great computational effort. In this context, computational intelligence (CI) has been used successfully in the design and optimization of planar microwave structures as a very appropriate auxiliary tool, given the complexity of the geometry of the antennas and FSS considered.
Computational intelligence is inspired by natural phenomena such as learning, perception and decision making, using techniques such as artificial neural networks, fuzzy logic, fractal geometry and evolutionary computation. This work studies the application of computational intelligence, using metaheuristics such as genetic algorithms and swarm intelligence, to the optimization of antennas and frequency selective surfaces. Genetic algorithms are computational search methods, based on Darwin's theory of natural selection and on genetics, used to solve complex problems, e.g., problems whose search space grows with the size of the problem. Particle swarm optimization is characterized by the use of collective intelligence and is applied to optimization problems in many research areas. The main objective of this work is the use of computational intelligence in the analysis and synthesis of antennas and FSS. The structures considered are a ring-type planar microstrip monopole and a cross-dipole FSS. Optimization algorithms were developed, and results were obtained for the optimized geometries of the antennas and FSS considered. To validate the results, several prototypes were designed, built and measured. The measured results showed excellent agreement with the simulated ones. Moreover, the results obtained in this study were compared with those simulated using commercial software, and excellent agreement was again observed. Specifically, the efficiency of the CI techniques used was evidenced by simulated and measured results in optimizing the bandwidth of an antenna for wideband or UWB (Ultra Wideband) operation, using a genetic algorithm, and in optimizing the bandwidth by adjusting the length of the air gap between two frequency selective surfaces, using a particle swarm optimization algorithm.
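A minimal particle swarm optimization loop of the kind applied above can be sketched as follows; the inertia and acceleration coefficients and the toy sphere objective are illustrative assumptions, not the thesis's settings:

```python
import numpy as np

def pso(objective, dim=2, n_particles=20, iters=100, seed=0):
    """Minimal particle swarm optimization sketch (not the thesis code):
    each particle is pulled toward its personal best and the swarm's
    global best, with standard inertia/cognitive/social weights."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))   # positions
    v = np.zeros_like(x)                         # velocities
    pbest = x.copy()
    pbest_f = np.apply_along_axis(objective, 1, x)
    g = pbest[np.argmin(pbest_f)].copy()         # global best
    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        f = np.apply_along_axis(objective, 1, x)
        better = f < pbest_f                     # update personal bests
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)].copy()     # update global best
    return g

best = pso(lambda p: np.sum(p ** 2))   # toy objective: sphere function
```

In an antenna or FSS design problem, the objective would instead call a full-wave simulation and return, e.g., a bandwidth penalty.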
Resumo:
This work presents a cooperative navigation system for a humanoid robot and a wheeled robot using visual information, aiming to navigate the non-instrumented humanoid using information obtained from the instrumented wheeled robot. Although the humanoid has no sensors for its own navigation, it can be remotely controlled by infrared signals. Thus, the wheeled robot can control the humanoid by positioning itself behind it and, through visual information, finding and steering it. The location of the wheeled robot is obtained by fusing odometry with landmark detection, using the Extended Kalman Filter. The landmarks are detected visually, and their features are extracted by image processing; the parameters obtained are used directly in the Extended Kalman Filter. Thus, while the wheeled robot localizes and navigates the humanoid, it also simultaneously computes its own location and maps the environment (SLAM). Navigation is performed by heuristic algorithms based on the errors between the actual and desired poses of each robot. The main contribution of this work is the implementation of a cooperative navigation system for two robots based on visual information, which can be extended to other robotic applications, such as controlling robots without interfering with their hardware or attaching communication devices to them.
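The fusion step underlying the EKF localization above can be illustrated with a scalar Kalman predict/update cycle; this is a 1-D toy with assumed noise variances, whereas the actual filter operates on the robot pose with nonlinear measurement models:

```python
def kalman_step(x, p, u, z, q=0.01, r=0.1):
    """One predict/update cycle of a scalar Kalman filter, the linear
    core of the EKF used above.

    x, p : state estimate and its variance
    u    : odometry increment (prediction input)
    z    : landmark-based measurement of the state
    q, r : process and measurement noise variances (assumed values)
    """
    x, p = x + u, p + q                  # predict with odometry
    k = p / (p + r)                      # Kalman gain
    return x + k * (z - x), (1 - k) * p  # correct with the measurement

x, p = 0.0, 1.0                          # uncertain initial estimate
x, p = kalman_step(x, p, u=0.5, z=0.6)   # move 0.5, then see landmark at 0.6
```

The corrected state lands between the odometry prediction and the measurement, and the variance shrinks, which is exactly the behavior exploited when fusing odometers with visual landmarks.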
Resumo:
The use of Geographic Information Systems (GIS) has become very important in fields that require detailed and precise study of features of the Earth's surface. Environmental protection is one example of an application that requires GIS tools for analysis and decision making by managers and by the communities involved with protected areas. In this field, a remaining challenge is to build a GIS that can be fed with data dynamically, allowing researchers and other agents to retrieve current, up-to-date information. In some cases, data are acquired in several ways and come from different sources. To address this problem, tools were implemented that include a model for spatial data handling on the Web. The research issues range from feeding and processing environmental monitoring data collected in loco, such as biotic and geological variables, to presenting all the information on the Web. For this dynamic processing, tools were developed that make MapServer more flexible and dynamic, allowing data to be uploaded by the appropriate users. Furthermore, a module was developed that uses interpolation for spatial data analysis. A complex application that validated this research feeds the system with data from coral reef regions located in the northeast of Brazil. The system was implemented following the interactivity concepts of the AJAX model and resulted in a substantial contribution to efficient access to information, an essential mechanism for tracking events in environmental monitoring.
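Since the text does not name the interpolation method used by the module, inverse distance weighting serves here only as a minimal illustrative example of spatial interpolation over scattered measurements:

```python
import math

def idw(samples, x, y, power=2):
    """Inverse distance weighting: estimate a value at (x, y) from
    scattered measurements, weighting closer samples more heavily.

    samples : list of (x, y, value) measurements
    power   : distance exponent (2 is the common default)
    """
    num = den = 0.0
    for sx, sy, v in samples:
        d = math.hypot(x - sx, y - sy)
        if d == 0:
            return v                 # query exactly on a sample point
        w = 1.0 / d ** power         # closer samples weigh more
        num += w * v
        den += w
    return num / den

pts = [(0, 0, 10.0), (1, 0, 20.0)]   # two hypothetical field measurements
mid = idw(pts, 0.5, 0.0)             # midpoint gets equal weights
```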
Resumo:
With technological progress, people increasingly look for convenience, comfort and safety in their homes. Residential automation is gaining market space not only because of the status and modernity it provides, but also because it allows better use of natural resources, reducing the cost of maintaining a residence. This work presents the development of a control and supervision system intended for residential automation. The developed software works together with a programmable logic controller (PLC), managing, controlling and supervising all connected devices, and offering the user a simple and practical environment for controlling the residence.
Resumo:
Metaheuristic techniques are known to solve optimization problems classified as NP-complete and are successful in obtaining good-quality solutions. They use non-deterministic approaches to generate solutions close to the optimum, without guaranteeing that the global optimum is found. Motivated by the difficulties in solving these problems, this work proposes the development of parallel hybrid methods combining reinforcement learning with the metaheuristics GRASP and Genetic Algorithms. With these techniques, we aim to improve the efficiency of obtaining good solutions. Instead of using the Q-learning reinforcement learning algorithm merely to generate the initial solutions of the metaheuristics, we use it in a cooperative and competitive approach with the Genetic Algorithm and GRASP, in a parallel implementation. In this context, the implementations in this study showed satisfactory results under both strategies, that is, cooperation and competition between algorithms as well as cooperation and competition between groups. In some instances the global optimum was found; in others, the implementations came close to it. A performance analysis of the proposed approach was carried out and showed good results on the metrics that demonstrate the efficiency and the speedup (gain in speed from parallel processing) of the implementations.
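The Q-learning rule combined with GRASP and the Genetic Algorithm above is, at its core, a single tabular update; the states, actions and parameters below are illustrative, not taken from the thesis:

```python
def q_update(q, s, a, reward, s_next, alpha=0.1, gamma=0.9):
    """Single tabular Q-learning update:
    Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)).

    q     : dict of dicts, q[state][action] -> value
    alpha : learning rate, gamma : discount factor (assumed values)
    """
    best_next = max(q[s_next].values()) if q[s_next] else 0.0
    q[s][a] += alpha * (reward + gamma * best_next - q[s][a])

# Toy table: from state 0 the action "go" earns reward 1 and leads to
# terminal state 1, so its value moves from 0 toward 1 by a factor alpha.
q = {0: {"go": 0.0}, 1: {}}
q_update(q, s=0, a="go", reward=1.0, s_next=1)
```

In the hybrid, the learned Q-values would guide which partial solutions are fed to GRASP and the Genetic Algorithm, rather than driving a toy environment like this one.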
Resumo:
The exponential growth of radio frequency (RF) applications is accompanied by great challenges, from more efficient use of the spectrum to the design of new architectures for multi-standard receivers or software-defined radio (SDR). The key challenge in designing an SDR architecture is the implementation of a wide-band, reconfigurable receiver with low cost, low power consumption, a high level of integration and flexibility. As a new SDR design solution, a direct demodulator architecture based on five-port technology, or multi-port demodulator, has been proposed. However, using the five-port junction as a direct-conversion receiver requires an I/Q calibration (or regeneration) procedure to generate the in-phase (I) and quadrature (Q) components of the transmitted baseband signal. In this work, we evaluate the performance of a blind calibration technique, requiring no training or pilot sequences in the transmitted signal, based on independent component analysis for I/Q regeneration in five-port downconversion, exploiting the statistical properties of the three output signals.
Resumo:
The main objective of this work is to present the particularities of the Three-phase Power Summation Method, used for load-flow calculation, with respect to the influence of magnetic coupling among the phases, as well as the losses of all transformers present in the feeder under analysis. In addition, its application to the study of short circuits that occur in the presence of high impedance values is detailed; such faults pose a problem because they are difficult to detect, and consequently to clear, with common protection devices. This happens because the short-circuit current is generally of the same order of magnitude as the load currents. Results of simulations performed for several situations are shown, aiming at a complete analysis of the behavior of the proposed method for several types of short circuit. The results obtained by the method are compared with results from other works to verify its effectiveness.
Resumo:
This master's dissertation presents a study of some aspects that determine the application of adaptive arrays in DS-CDMA cellular systems. Some basic concepts of cellular systems and their evolution over time are detailed, mainly the CDMA technique, especially its spreading codes and operating principles. From this, the mobile radio environment, with its own characteristics, and the basic concepts of adaptive arrays as powerful spatial filters are addressed. Some adaptive algorithms are also introduced; they are part of the signal processing chain and are responsible for the weight updates that directly influence the radiation pattern of the array. This study is based on a numerical analysis of the behavior of adaptive array systems as a function of the antenna and array geometry types used. All simulations were performed with the Mathematica 4.0 software. The analysis is based on results for weight convergence, mean square error, gain, array pattern and suppression capacity, using the RLS (supervised) and LSDRMTA (blind) algorithms.
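The supervised RLS weight update mentioned above can be sketched for a real-valued identification problem; array processing would use complex antenna snapshots instead of a tapped-delay line, and the channel and parameters here are illustrative assumptions:

```python
import numpy as np

def rls(x, d, order=4, lam=0.99, delta=100.0):
    """Recursive least squares (RLS) weight adaptation for an FIR filter.

    x     : input signal
    d     : desired (training) signal
    lam   : forgetting factor, delta : initial inverse-correlation scale
    """
    w = np.zeros(order)
    p_mat = delta * np.eye(order)              # inverse correlation estimate
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]       # regressor, most recent first
        k = p_mat @ u / (lam + u @ p_mat @ u)  # gain vector
        e = d[n] - w @ u                       # a priori error
        w += k * e                             # weight update
        p_mat = (p_mat - np.outer(k, u @ p_mat)) / lam
    return w

rng = np.random.default_rng(1)
x = rng.normal(size=500)
h = np.array([0.5, -0.3, 0.2, 0.1])            # unknown channel to identify
d = np.convolve(x, h)[: len(x)]                # noiseless training signal
w = rls(x, d)                                  # w converges toward h
```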
Resumo:
In this work, we propose methodologies and computational tools for inserting robots into cultural environments. The basic idea is to have a robot in a real context (a cultural space) that can represent a user connected to the system through the Internet (a visitor avatar in the real space), while the robot also has its own representation in a Mixed Reality space (a robot avatar in the virtual space). In this way, robot and avatar are not simply real and virtual objects: they play a more important role in the scene, interfering in the process and making decisions. To provide this service, we developed a module composed of a robot, communication tools and means of integrating them with the virtual environment. We also implemented a set of behaviors for controlling the robot in the real space. We studied the software and hardware tools available for the robotics platform used in the experiments and developed test routines to determine their capabilities. Finally, we studied the behavior-based control model, and planned and implemented all the behaviors necessary for integrating the robot into the real and virtual cultural spaces. Several experiments were conducted to validate the developed methodologies and tools.
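A behavior-based control loop of the kind described can be sketched as simple priority arbitration; the behavior names and percept fields below are hypothetical, not taken from the thesis:

```python
def arbitrate(behaviors, percept):
    """Priority-based behavior arbitration (subsumption style): the first
    behavior that claims the current percept wins; later (lower-priority)
    behaviors only run when higher ones abstain by returning None."""
    for behavior in behaviors:           # ordered highest priority first
        command = behavior(percept)
        if command is not None:
            return command
    return "stop"                        # safe default if nobody acts

# Hypothetical behaviors: obstacle avoidance outranks wandering.
avoid = lambda p: "turn_left" if p["obstacle"] else None
wander = lambda p: "forward"

cmd = arbitrate([avoid, wander], {"obstacle": True})
```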