17 results for new method
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
BARBOSA, André F.; SOUZA, Bryan C.; PEREIRA JUNIOR, Antônio; MEDEIROS, Adelardo A. D. de. Implementação de Classificador de Tarefas Mentais Baseado em EEG. In: CONGRESSO BRASILEIRO DE REDES NEURAIS, 9., 2009, Ouro Preto, MG. Anais... Ouro Preto, MG, 2009.
Abstract:
This work introduces a new method for building environment maps with three-dimensional information obtained from visual data, aimed at accurate robot navigation. Approaches to 3D mapping based on occupancy grids typically require high computational effort both to build and to store the map. We introduce 2.5-D occupancy-elevation grid mapping, a discrete mapping approach in which each cell stores the occupancy probability, the height of the terrain at the corresponding place in the environment, and the variance of this height. This 2.5-dimensional representation allows a mobile robot to know whether a place in the environment is occupied by an obstacle and how tall that obstacle is, so it can decide whether the obstacle can be traversed. The sensory information needed to construct the map is provided by a stereo vision system, modeled with a robust probabilistic approach that accounts for the noise present in the stereo processing. The resulting maps support tasks such as decision making in autonomous navigation, exploration, localization, and path planning. Experiments carried out with a real mobile robot demonstrate that the proposed approach yields useful maps for autonomous robot navigation.
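The abstract pins down what each cell holds (occupancy probability, height, height variance) but not the update rules. Below is a minimal sketch of such a grid, assuming a log-odds occupancy update and a scalar Kalman-style fusion for the height estimate; the class name, the traversability rule, and all parameters are hypothetical, not the thesis's implementation.

```python
import numpy as np

class OccupancyElevationGrid:
    """2.5-D grid: each cell keeps occupancy log-odds plus a height
    estimate and its variance (assumed update rules, see lead-in)."""

    def __init__(self, shape, cell_size=0.1):
        self.cell_size = cell_size
        self.log_odds = np.zeros(shape)        # occupancy in log-odds form
        self.height = np.zeros(shape)          # terrain/obstacle height (m)
        self.height_var = np.full(shape, 1e6)  # huge variance = unknown

    def update(self, i, j, p_occ, z, z_var):
        # Bayesian occupancy update in log-odds form
        self.log_odds[i, j] += np.log(p_occ / (1.0 - p_occ))
        # fuse height measurement z (variance z_var) with the stored
        # estimate, weighting each by its inverse variance
        k = self.height_var[i, j] / (self.height_var[i, j] + z_var)
        self.height[i, j] += k * (z - self.height[i, j])
        self.height_var[i, j] *= 1.0 - k

    def traversable(self, i, j, max_step=0.15):
        # an occupied cell may still be traversable if the obstacle is low
        return self.log_odds[i, j] <= 0.0 or self.height[i, j] < max_step
```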
Abstract:
In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained with traditional vector quantization techniques. Several approaches developed during the work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. This new method uses only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link the auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help in the choice of the correct threshold. Several analyses are made and the results are compared with those of traditional clustering methods. Different dissimilarity metrics are analyzed and a new one, based on the concept of negentropy, is proposed. Besides grouping the points of a set into classes, a method is proposed for statistically modeling the classes, aiming to obtain an expression for the probability that a point belongs to each class. Experiments with several values of Na and dt are carried out on test sets, and the results are analyzed in order to study the robustness of the method and to devise heuristics for the choice of the correct threshold. The work also explores aspects of information theory applied to the calculation of the divergences, specifically the different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. An appendix presents real applications of the proposed method.
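A minimal sketch of the link-then-merge idea, assuming k-means as the vector quantizer and plain centroid distance as the dissimilarity (the thesis also studies divergence-based measures such as negentropy, which this sketch does not implement); here `n_aux` plays the role of Na and `d_t` of dt:

```python
import numpy as np
from scipy.cluster.vq import kmeans2
from scipy.sparse.csgraph import connected_components
from scipy.spatial.distance import cdist

def link_clustering(X, n_aux=20, d_t=1.0, seed=0):
    """Cluster X by linking auxiliary VQ clusters closer than d_t."""
    # step 1: vector quantization yields n_aux auxiliary clusters
    centroids, labels = kmeans2(X, n_aux, minit="++", seed=seed)
    # step 2: link auxiliary clusters whose centroids are closer than d_t
    adj = cdist(centroids, centroids) < d_t
    # step 3: connected components of the link graph are the final classes
    n_classes, comp = connected_components(adj, directed=False)
    return n_classes, comp[labels]
```

The number of classes falls out of the threshold automatically, matching the abstract's claim: a small d_t keeps many components, a large d_t merges them.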
Abstract:
One of the objectives of this work is the analysis of planar structures using PBG (Photonic Bandgap) structures, a new method of controlling the propagation of electromagnetic waves in devices with dielectrics. The basic theory of these structures is presented, as well as applications and the determination of certain parameters. The analysis is performed for PBG structures, including the basic theory, applications in planar structures, and considerations related to the implementation of devices. The TTL (Transverse Transmission Line) method is employed, characterized by the simplicity of its treatment of the equations that govern the propagation of electromagnetic waves in the structure. In this method, the fields in x and z are expressed as functions of the fields in the transverse direction y in the FTD (Fourier Transform Domain). The method is useful for determining the complex propagation constant, with applications at high frequencies and in photonics. Structures at micrometric scale operating at frequencies in the terahertz range are studied, a first step toward operation in the visible spectrum. The mathematical basis for the determination of the electromagnetic fields in the structure is presented, based on the TTL method and taking into account the dimensions considered in this work. Calculations for the determination of the complex propagation constant are also carried out, and the computational implementation for high frequencies is presented. The analysis is first done for open microstrip lines with a semiconductor substrate. Finally, considerations are made regarding applications of these devices in the area of telecommunications, along with suggestions for future work.
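For reference, the core of the TTL decomposition can be sketched as follows. Writing α_n and β_k for the spectral variables of the Fourier transforms in x and z, γ for the propagation constant, and k for the wavenumber in the medium, one common form (sign conventions vary between formulations, so treat this as an illustration rather than the thesis's own equations) expresses the x components in terms of the y components:

```latex
\tilde{E}_x = \frac{1}{k^2+\gamma^2}\left(-j\alpha_n\,\frac{\partial \tilde{E}_y}{\partial y}
              + \omega\mu\,\beta_k\,\tilde{H}_y\right), \qquad
\tilde{H}_x = \frac{1}{k^2+\gamma^2}\left(-j\alpha_n\,\frac{\partial \tilde{H}_y}{\partial y}
              - \omega\varepsilon\,\beta_k\,\tilde{E}_y\right)
```

The z components follow analogously (with the roles of α_n and β_k exchanged), and imposing the boundary conditions at each dielectric interface then yields the complex propagation constant.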
Abstract:
A new method to perform TCP/IP fingerprinting is proposed. TCP/IP fingerprinting is the process of identifying a remote machine through a TCP/IP-based computer network. This process has many applications related to network security: both intrusion and defense procedures may use it to achieve their objectives. Many known methods perform this process under favorable conditions; nowadays, however, there are many adversities that reduce identification performance. This work aims at the creation of a new OS fingerprinting tool that bypasses these current problems. The proposed method is based on the use of attractor reconstruction and neural networks to characterize and classify pseudo-random number generators.
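Attractor reconstruction here presumably means delay-coordinate (Takens) embedding, as in classic work on TCP initial sequence number (ISN) analysis; the sketch below embeds ISN differences, an assumed choice of input since the abstract does not name the embedded quantity. The resulting point cloud is what a neural network would then classify.

```python
import numpy as np

def delay_embed(x, dim=3, lag=1):
    """Delay-coordinate embedding: each row is
    (x[t], x[t+lag], ..., x[t+(dim-1)*lag])."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

# stand-in for ISNs sampled from a remote host (hypothetical data)
isns = np.random.randint(0, 2**32, size=500).astype(float)
points = delay_embed(np.diff(isns), dim=3, lag=1)  # (N, 3) attractor cloud
# the geometry of `points` is the OS-dependent feature fed to a classifier
```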
Abstract:
Following the new tendency toward interdisciplinarity in modern science, a field called neuroengineering has come to light in the last decades. After 2000, scientific journals and conferences around the world have been created on this theme. The present work comprises three subareas related to neuroengineering and electrical engineering: neural stimulation; theoretical and computational neuroscience; and neuronal signal processing, as well as biomedical engineering. The research can be divided into three parts. (i) A new method of neuronal photostimulation was developed based on the use of caged compounds. Using the inhibitory neurotransmitter GABA caged by a ruthenium complex, it was possible to block neuronal population activity with a laser pulse. The results were evaluated by wavelet analysis and tested with non-parametric statistics. (ii) A mathematical method was created to identify neuronal assemblies. Neuronal assemblies, proposed by Donald Hebb as the basis of learning, remain the most accepted theory for the neuronal representation of external stimuli. Using the Marcenko-Pastur law of eigenvalue distribution, it was possible to detect neuronal assemblies and to compute their activity with high temporal resolution. The application of the method to real electrophysiological data revealed that neurons from the neocortex and hippocampus can be part of the same assembly, and that neurons can participate in multiple assemblies. (iii) A new method of automatic classification of heartbeats was developed, which does not rely on a database for training and is not specialized in specific pathologies. The method is based on wavelet decomposition and normality measures of random variables. Altogether, the results presented in these three fields of knowledge represent qualification in neural and biomedical engineering.
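Part (ii) admits a compact sketch. Binned spike counts are z-scored per neuron; eigenvalues of the resulting correlation matrix that exceed the Marcenko-Pastur upper bound for i.i.d. data indicate correlated groups, i.e. candidate assemblies. This is a minimal version of the eigenvalue test only; tracking assembly activity over time (projecting each bin onto the significant eigenvectors) is omitted.

```python
import numpy as np

def detect_assemblies(spike_counts):
    """spike_counts: (n_neurons, n_bins) array of binned spike counts.
    Returns eigenvalues above the Marcenko-Pastur bound and the
    corresponding eigenvectors (assembly weight patterns)."""
    n, b = spike_counts.shape
    z = (spike_counts - spike_counts.mean(1, keepdims=True)) \
        / spike_counts.std(1, keepdims=True)
    corr = z @ z.T / b                     # neuron-by-neuron correlations
    eigvals, eigvecs = np.linalg.eigh(corr)
    lam_max = (1 + np.sqrt(n / b)) ** 2    # MP upper bound (assumes n/b < 1)
    keep = eigvals > lam_max
    return eigvals[keep], eigvecs[:, keep]
```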
Abstract:
Modern wireless systems employ adaptive techniques to provide high throughput while observing desired coverage, Quality of Service (QoS), and capacity requirements. One alternative for further enhancing the data rate is to apply cognitive radio concepts, in which a system exploits unused spectrum on existing licensed bands by sensing the spectrum and opportunistically accessing unused portions. Techniques like Automatic Modulation Classification (AMC) can be helpful, or even vital, in such scenarios. Usually, AMC implementations rely on some form of signal pre-processing, which may introduce a high computational cost or make assumptions about the received signal that may not hold (e.g., Gaussianity of the noise). This work proposes a new method to perform AMC using a similarity measure from the Information Theoretic Learning (ITL) framework known as the correntropy coefficient. It extracts similarity measurements over a pair of random processes using higher-order statistics, yielding better similarity estimates than, for example, the correlation coefficient. Experiments carried out by means of computer simulation show that the proposed technique achieves a high success rate in the classification of digital modulations, even in the presence of additive white Gaussian noise (AWGN).
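A hedged sketch of the correntropy coefficient itself, using a Gaussian kernel; the kernel bandwidth sigma is a free parameter, and the way the thesis builds classification features from this coefficient is not reproduced here. Classification would then compare the received signal against candidate modulation templates by this similarity.

```python
import numpy as np

def gaussian_kernel(d, sigma):
    return np.exp(-d**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def correntropy_coefficient(x, y, sigma=1.0):
    """Centered correntropy coefficient eta = U(x,y)/sqrt(U(x,x)U(y,y)),
    where U subtracts the mean kernel value over all sample pairs so
    that eta behaves like a higher-order analogue of correlation."""
    def centered(u, v):
        joint = gaussian_kernel(u - v, sigma).mean()                    # E[k(u_i - v_i)]
        cross = gaussian_kernel(u[:, None] - v[None, :], sigma).mean()  # E_u E_v[k]
        return joint - cross
    return centered(x, y) / np.sqrt(centered(x, x) * centered(y, y))
```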
Abstract:
This work presents the procedure for evaluating the uncertainty related to the calibration of flow meters and of BS&W measurements. It concerns a new measurement method proposed in the conceptual project of the LAMP laboratory, at the Universidade Federal do Rio Grande do Norte, which aims to determine the conventional true value of the BS&W from the total height of the liquid column in the auditor tank, the hydrostatic pressure exerted by the liquid column, the local gravity, the specific mass of the water, and the specific mass of the oil, and to determine the flow from the total height of the liquid column and the transfer time. The calibration uses an automated system for monitoring and data acquisition of the quantities needed to determine the flow and the BS&W, allowing better reliability of the measurements.
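The BS&W relation described in the abstract can be made explicit under one simplifying assumption: the column is a water-oil mixture whose water volume fraction f is the BS&W. The hydrostatic pressure of a column of height h is then P = [f ρ_w + (1 − f) ρ_o] g h, which inverts to

```latex
f \;=\; \frac{\dfrac{P}{g\,h} - \rho_o}{\rho_w - \rho_o},
\qquad
Q \;=\; \frac{A\,h}{\Delta t}
```

where ρ_w and ρ_o are the specific masses of water and oil, g is the local gravity, and, for the flow Q, A is the tank cross-section (an assumed parameter) and Δt the transfer time. The uncertainty evaluation then propagates the uncertainties of h, P, g, ρ_w, and ρ_o through these expressions.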
Abstract:
This dissertation aims at the development of an experimental device to determine quantitatively the content of benzene, toluene, and xylenes (BTX) in the atmosphere. BTX are extremely volatile solvents and therefore play an important role in atmospheric chemistry, being precursors in tropospheric ozone formation. In this work, a new BTX standard gas in nitrogen was produced for stagnant systems. The aim of the dissertation is to develop a new, simple, and cheaper method to quantify and monitor BTX in air using solid-phase microextraction/gas chromatography/mass spectrometry (SPME/GC/MS). The features of the proposed calibration method are presented. SPME sampling was carried out under non-equilibrium conditions using a Carboxen/PDMS fiber exposed for 10 min to standard gas mixtures. The main parameters affecting the extraction process are found to be the sampling time and the concentration. The results for the BTX multicomponent system studied show a linear and a nonlinear range. In the nonlinear range, the effect of competition by selective adsorption is remarkable, with the following affinity order: p-xylene > toluene > benzene. This behavior represents a limitation of the method, although it is in accordance with the literature. Furthermore, it does not prevent the application of the technique outside the nonlinear region to quantify the BTX contents in the atmosphere.
Abstract:
Microalgae are microscopic photosynthetic organisms that grow rapidly and under different environmental conditions thanks to their simple cellular structure. The cultivation of microalgae is a biological system capable of storing solar energy through the production of organic compounds via photosynthesis, and these species grow faster than land plants, enabling higher biomass yields. The cultivation of these photosynthetic organisms is therefore a relevant proposal, since, compared with other oil-producing raw materials, they have significantly higher productivity, making them a raw material able to meet the current demand for biodiesel. The overall aim of the thesis was to obtain biofuel via the transesterification of bio-oil from the microalga Isochrysis galbana. The specific objectives were to evaluate the use of a laboratory-scale photobioreactor for microalgae growth experiments; to evaluate the characteristics of microalgal biodiesel produced by the in situ transesterification process; and to study a new route for the disinfection of microalgae cultivation through the use of the chemical agent sodium hypochlorite. The introduction of this new method allowed obtaining the growth kinetics in the photobioreactor, besides providing the biomass needed for the processing and analysis of the biodiesel experiments. The research showed acceptable results for the characteristics observed in the bio-oil obtained, which fell within the standards of ANP Resolution No. 14, dated 11.5.2012 - 18.5.2012. Furthermore, the photobioreactor designed met expectations regarding the study of culture growth and contributed largely to the development of the chosen microalga species. Thus, the microalga Isochrysis galbana proved to be a species with potential for biodiesel production.
Abstract:
The composition of petroleum may change from well to well, and the resulting characteristics significantly influence the refined products. Therefore, it is important to characterize the oil in order to know its properties and send it for adequate processing. Since petroleum is a multicomponent mixture, the use of synthetic mixtures representative of oil fractions provides a better understanding of the real mixture's behavior. Characterization is usually obtained through correlations of easily measured physico-chemical properties, such as density, specific gravity, viscosity, and refractive index. In this work, new measurements of density, specific gravity, viscosity, and refractive index were obtained for the following binary mixtures: n-heptane + hexadecane, cyclohexane + hexadecane, and benzene + hexadecane. These measurements were performed at low pressure and at temperatures in the range 288.15 K to 310.95 K, and the data were applied in the development of a new method of oil characterization. Furthermore, a series of density measurements at high pressure and temperature was performed for the binary mixture cyclohexane + n-hexadecane, over pressures from 6.895 to 62.053 MPa and temperatures from 318.15 to 413.15 K. Based on these experimental data for compressed liquid mixtures, a thermodynamic model was proposed using the Peng-Robinson equation of state (EOS), modified with a scaling of volume and employing a relatively small number of parameters. The results were satisfactory, demonstrating accuracy not only for density, but also for the isobaric thermal expansion and isothermal compressibility coefficients. This thesis aims to contribute scientifically to the technological problem of refining heavy oil fractions. The problem was treated in two steps: characterization, and the search for processes that can produce streams of economic interest, such as solvent extraction at high pressure and temperature. In order to determine phase equilibrium data under these conditions, conceptual projects of two new experimental apparatuses were developed, consisting of variable-volume cells together with an analytical static device. The thesis thus contributes to the characterization of hydrocarbon mixtures and to the development of equilibrium cells operating at high pressure and temperature, focused on the technological problem of refining heavy oil fractions.
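The abstract says the Peng-Robinson EOS was "modified with a scaling of volume" without giving the form; the standard Péneloux-type volume translation shown below is an assumption in the same spirit, not necessarily the thesis's modification:

```latex
P \;=\; \frac{RT}{v - b} \;-\; \frac{a\,\alpha(T)}{v(v+b) + b(v-b)},
\qquad
v_{\text{corrected}} \;=\; v - c
```

with a = 0.45724 R²T_c²/P_c and b = 0.07780 RT_c/P_c from the critical constants, and c a fitted translation parameter. A translation in v leaves vapor-liquid equilibrium unchanged while correcting liquid densities, which is why it suits compressed-liquid density data of this kind.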
Abstract:
In this work, a new method is presented for the determination of the orbital period (Porb) of eclipsing binary systems based on the wavelet technique. The method is applied to 18 eclipsing binary systems detected by the CoRoT (Convection, Rotation and planetary Transits) satellite. The periods obtained by the wavelet method were compared with those obtained by the conventional methods: box fitting (EEBLS) for detached and semi-detached eclipsing binaries, and polynomial methods (ANOVA) for contact binary systems. Comparing the phase diagrams obtained with the different techniques, the wavelet method determines Porb better than EEBLS. In the case of contact binary systems, the wavelet method shows better results than the ANOVA method most of the time, but when the number of data points per orbital cycle is small, ANOVA gives more accurate results. Thus, the wavelet technique seems to be a great tool for the analysis of data with the quality and precision provided by CoRoT and the incoming photometric missions.
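A minimal sketch of period search by wavelet power, in the spirit of the standard Torrence & Compo FFT implementation of the Morlet transform; it assumes an evenly sampled light curve and takes the period of maximum time-averaged power as the Porb candidate. The thesis's actual procedure (and its handling of CoRoT data gaps) is not reproduced here.

```python
import numpy as np

def global_wavelet_power(flux, dt, periods, w0=6.0):
    """Time-averaged Morlet wavelet power at each trial period."""
    n = len(flux)
    fhat = np.fft.fft(flux - flux.mean())
    omega = 2 * np.pi * np.fft.fftfreq(n, dt)
    power = np.empty(len(periods))
    for i, p in enumerate(periods):
        s = p * (w0 + np.sqrt(2 + w0**2)) / (4 * np.pi)  # scale for period p
        psi_hat = (np.sqrt(2 * np.pi * s / dt) * np.pi**-0.25
                   * np.exp(-0.5 * (s * omega - w0)**2) * (omega > 0))
        w = np.fft.ifft(fhat * psi_hat)                  # CWT at scale s
        power[i] = np.mean(np.abs(w)**2)
    return power

# usage: p_orb = periods[np.argmax(global_wavelet_power(flux, dt, periods))]
```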
Abstract:
Three studies were performed using kaolin tailings for the synthesis of zeolite A. In the first, the synthesis of zeolite A was studied using a kaolin waste generated by the beneficiation of kaolin for paper production. The kaolin waste was thermally activated in the temperature range 550-800 °C, and for comparison a standard synthesis of zeolite A (IZA procedure) was performed. The prepared materials were characterized by 27Al MAS NMR, X-ray diffraction, and scanning electron microscopy with X-ray microprobe. The pre-treatment proved to be appropriate, with suitable temperatures between 600 and 700 °C. The formation of zeolite A was observed in all materials, reaching 52% crystallinity, along with the presence of a sodalite phase and amorphous material. The second study was the use of a highly reactive metakaolin originating from the Jari region in the synthesis of zeolite A by a new hydrothermal synthesis method. The zeolite is obtained pure and highly crystalline when the Jari kaolin is calcined at 600 °C for 2 h, at which point the transformation to metakaolin occurs. The zeolite A phase is obtained at 4 h, and the best crystallization time was 24 h, affording a crystallinity of 67.9%. The third study evaluated the NaOH/metakaolin ratio and the crystallization time in the synthesis of zeolite NaA from a kaolin waste sample, named Kaolin Coverage. The experiments were performed using a statistical design (axial points) with replicates at the center point. The samples were characterized by X-ray diffraction (XRD), scanning electron microscopy, and chemical analysis using an EPMA microprobe. The results showed that a relationship exists between the amount of NaOH added and the crystallization time: the experiment performed with the lowest NaOH/metakaolin ratio (0.5) and the shortest time (4 h) produced an amorphous material, while increasing the NaOH/metakaolin ratio and the crystallization time leads to the formation of a more crystalline NaA phase, but with a sodalite phase present as an impurity.
Abstract:
Vascular segmentation is important in diagnosing vascular diseases such as stroke, and it is hampered by noise in the image and by very thin vessels that can pass unnoticed. One way to accomplish the segmentation is to extract the centerline of the vessel with height ridges, using intensity as the feature for segmentation. This process can take from seconds to minutes, depending on the technology employed. In order to accelerate the segmentation method proposed by Aylward [Aylward & Bullitt 2002], we adapted it to run in parallel on the CUDA architecture. The performance of the segmentation method running on the GPU is compared both to the same method running on the CPU and to Aylward's original method, also running on the CPU. The improvement of the new method over the original one is twofold: first, the starting point for the segmentation process is not a single point in the blood vessel but a volume, making it easier for the user to segment a region of interest; second, the method was 873 times faster running on the GPU, and 150 times faster running on the CPU, than Aylward's original CPU implementation.
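The key structural change, replacing a single seed point by a whole seed volume, is what makes the traversal embarrassingly parallel. Below is a CPU sketch in NumPy of that idea (not the authors' CUDA kernels): every voxel of a user-selected region is moved uphill on image intensity so the seeds converge toward the bright centerline ridge. The fixed-step gradient ascent and all names are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def ascend_to_ridge(volume, seeds, n_steps=50, step=0.5):
    """Move each seed uphill along the intensity gradient; seeds are
    independent of one another, which is what a GPU port exploits."""
    grads = np.gradient(volume.astype(float))   # one gradient array per axis
    pts = seeds.astype(float).copy()
    for _ in range(n_steps):
        # interpolate the gradient at each seed position
        g = np.stack([map_coordinates(gi, pts.T, order=1) for gi in grads],
                     axis=1)
        g /= np.linalg.norm(g, axis=1, keepdims=True) + 1e-9
        pts += step * g                          # unit step uphill
    return pts

# usage: seeds = an (N, 3) array of voxel coordinates in a user-drawn box
```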
Abstract:
The Scientific Algorithms are a new metaheuristic inspired by the scientific research process. The new method introduces the idea of a theme to search the solution space of hard problems. The inspiration for this class of algorithms comes from the act of researching, which comprises thinking, knowledge sharing, and disclosing new ideas. The ideas of the new method are illustrated on the Traveling Salesman Problem. A computational experiment applies the proposed approach to a new variant of the Traveling Salesman Problem, named the Car Renter Salesman Problem. The results are compared to those of state-of-the-art algorithms for the latter problem.