60 results for signal processing
at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
The occurrence of transients in electrocardiogram (ECG) signals indicates an electrical phenomenon outside the heart. The identification of transients has therefore been the most widely used methodology in medical analysis since the invention of the electrocardiograph (the device responsible for recording electrocardiogram signals). Few papers address this subject, which motivates the creation of an architecture to pre-process the signal in order to identify transients. This paper proposes a method based on the signal energy of the Hilbert transform of the electrocardiogram, as an alternative to methods based on the morphology of the signal. This information determines the creation of the frames of the MP-HA protocol, which is responsible for transmitting the ECG signals through an IEEE 802.3 network to a computing device. That device, in turn, may run a process to classify the signal automatically, or present it to a physician so that the classification can be done manually.
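The envelope-energy idea behind such a detector can be sketched as follows. This is an illustrative reconstruction, not the thesis's exact method: the window length `win`, the threshold factor `k`, and the synthetic ECG-like trace are all assumptions of this sketch.

```python
import numpy as np
from scipy.signal import hilbert

def envelope_energy(x, win=32):
    """Sliding-window energy of the Hilbert envelope of x."""
    env = np.abs(hilbert(x))          # instantaneous amplitude (analytic signal)
    return np.convolve(env ** 2, np.ones(win), mode="same")

def detect_transients(x, win=32, k=3.0):
    """Flag samples whose windowed envelope energy exceeds k times the
    median energy (illustrative threshold, an assumption of this sketch)."""
    e = envelope_energy(x, win)
    return np.flatnonzero(e > k * np.median(e))

# synthetic trace: low-amplitude baseline with one injected transient
fs = 500
t = np.arange(fs) / fs
x = 0.05 * np.sin(2 * np.pi * t)
x[250:260] += 1.0                     # injected transient
idx = detect_transients(x)            # indices clustered around the transient
```

Because the decision is made on envelope energy rather than on waveform shape, the same threshold works regardless of the transient's polarity or exact morphology, which is the advantage the abstract claims over morphology-based methods.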
Abstract:
Brain-Computer Interfaces (BCI) have as their main purpose to establish a communication path with the central nervous system (CNS) independent of the standard pathways (nerves and muscles), aiming to control a device. The main objective of the present research is to develop an off-line BCI that separates the different EEG patterns resulting from strictly mental tasks performed by an experimental subject, comparing the effectiveness of different signal pre-processing approaches. We also tested different classification approaches: all versus all, one versus one, and a hierarchic classification approach. No pre-processing technique was found able to improve the system's performance. Furthermore, the hierarchic approach proved capable of producing results above those expected from the literature.
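The hierarchic approach can be sketched as a chain of binary stages, each peeling off one class and passing the rest down the chain. The nearest-centroid base learner and the synthetic features below are stand-ins, since the thesis's actual per-stage classifiers and EEG features are not specified here.

```python
import numpy as np

def centroid_fit(X, y):
    # nearest-centroid "base learner" -- a stand-in for the thesis's
    # (unspecified) per-stage classifiers
    labels = np.unique(y)
    cents = np.array([X[y == c].mean(axis=0) for c in labels])
    return labels, cents

def centroid_predict(model, X):
    labels, cents = model
    d = np.linalg.norm(X[:, None, :] - cents[None, :, :], axis=2)
    return labels[d.argmin(axis=1)]

def hierarchic_fit(X, y, order):
    # stage i separates class order[i] from all classes after it
    stages = []
    for i, c in enumerate(order[:-1]):
        mask = np.isin(y, order[i:])
        yb = (y[mask] == c).astype(int)      # 1 = this class, 0 = the rest
        stages.append((c, centroid_fit(X[mask], yb)))
    return stages, order[-1]

def hierarchic_predict(stages, default, X):
    out = np.full(len(X), default)
    undecided = np.ones(len(X), dtype=bool)
    for c, model in stages:
        hit = centroid_predict(model, X[undecided]) == 1
        idx = np.flatnonzero(undecided)[hit]
        out[idx] = c                         # stage claims these samples
        undecided[idx] = False               # the rest go to the next stage
    return out
```

Each stage only has to solve a binary problem, which is why a hierarchy can outperform a single multi-class decision when some classes are much easier to isolate than others.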
Abstract:
Modern wireless systems employ adaptive techniques to provide high throughput while observing the desired coverage, Quality of Service (QoS), and capacity. An alternative to further enhance the data rate is to apply cognitive radio concepts, in which a system is able to exploit unused spectrum on existing licensed bands by sensing the spectrum and opportunistically accessing unused portions. Techniques like Automatic Modulation Classification (AMC) can be helpful, or even vital, in such scenarios. Usually, AMC implementations rely on some form of signal pre-processing, which may introduce a high computational cost or make assumptions about the received signal that may not hold (e.g., Gaussianity of the noise). This work proposes a new AMC method that uses a similarity measure from the Information Theoretic Learning (ITL) framework known as the correntropy coefficient. It is capable of extracting similarity measurements over a pair of random processes using higher-order statistics, yielding better similarity estimates than, e.g., the correlation coefficient. Experiments carried out by means of computer simulation show that the proposed technique achieves a high success rate in the classification of digital modulations, even in the presence of additive white Gaussian noise (AWGN).
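A minimal sketch of the correntropy coefficient follows, using the standard Gaussian-kernel sample estimator: the centered cross-correntropy U(x, y) = E[k(x - y)] - E_x E_y[k(x - y)], normalized by the autocorrentropies. The kernel width `sigma` is an assumption, and the thesis's full AMC pipeline (per-modulation templates and the classifier itself) is not reproduced here.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    return np.exp(-u ** 2 / (2 * sigma ** 2))

def centered_correntropy(x, y, sigma=1.0):
    """Sample estimate of the centered cross-correntropy:
    mean kernel over paired samples minus mean kernel over all pairs."""
    joint = gaussian_kernel(x - y, sigma).mean()
    marg = gaussian_kernel(x[:, None] - y[None, :], sigma).mean()
    return joint - marg

def correntropy_coefficient(x, y, sigma=1.0):
    """Centered correntropy normalized to [-1, 1]: a higher-order-statistics
    analogue of the correlation coefficient."""
    num = centered_correntropy(x, y, sigma)
    den = np.sqrt(centered_correntropy(x, x, sigma) *
                  centered_correntropy(y, y, sigma))
    return num / den
```

In an AMC setting, the received signal would be compared against candidate-modulation references with this coefficient; because the Gaussian kernel implicitly involves all even moments of the difference, the measure captures dependencies that the second-order correlation coefficient misses.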
Abstract:
In oil prospecting research, seismic data are usually irregularly and sparsely sampled along the spatial coordinates due to obstacles in the placement of geophones. Fourier methods provide a way to regularize seismic data and are efficient if the input data are sampled on a regular grid. However, when these methods are applied to a set of irregularly sampled data, the orthogonality among the Fourier components is broken and the energy of one Fourier component may "leak" into others, a phenomenon called "spectral leakage". The objective of this research is to study methods for the spectral representation of irregularly sampled data. In particular, we present the basic structure of the NDFT (nonuniform discrete Fourier transform), study its properties, and demonstrate its potential in the processing of seismic signals. Along the way we study the FFT (fast Fourier transform) and the NFFT (nonuniform fast Fourier transform), which rapidly calculate the DFT (discrete Fourier transform) and the NDFT, respectively. We compare the recovery of the signal using the FFT, DFT, and NFFT. We approach the interpolation of seismic traces using the ALFT (antileakage Fourier transform) to overcome the spectral leakage caused by uneven sampling. Applications to synthetic and real data showed that the ALFT method works well on seismic data from complex geology, suffers little from irregular spatial sampling and edge effects, and is robust and stable with noisy data. However, it is not as efficient as the FFT, and its reconstruction is not as good in the case of irregular coverage with large holes in the acquisition.
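The leakage phenomenon described above can be reproduced with a direct NDFT. This is an illustrative sketch (the direct O(N·M) sum, not the fast NFFT), and the grid size and test frequency are chosen arbitrarily.

```python
import numpy as np

def ndft(t, x, freqs):
    """Direct O(N*M) nonuniform DFT: evaluates the Fourier sum of samples
    x taken at arbitrary times t, at the requested frequencies."""
    return np.exp(-2j * np.pi * freqs[:, None] * t[None, :]) @ x

n = 64
k = np.arange(n)                      # integer frequencies (cycles per record)

# regular grid: the NDFT reduces to the DFT and matches the FFT
t_reg = np.arange(n) / n
x_reg = np.sin(2 * np.pi * 5 * t_reg)
X_reg = ndft(t_reg, x_reg, k)

# irregular sampling: orthogonality is broken, and the energy of the single
# 5-cycle component "leaks" into neighbouring frequency components
t_irr = np.sort(np.random.default_rng(0).uniform(0.0, 1.0, n))
X_irr = ndft(t_irr, np.sin(2 * np.pi * 5 * t_irr), k)
```

On the regular grid the spectrum has a single dominant component; on the irregular grid many bins carry spurious energy, which is exactly what the ALFT iterates to suppress (pick the strongest component, subtract its contribution at the irregular sample positions, repeat).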
Abstract:
Embedded systems are widespread nowadays. An example is the Digital Signal Processor (DSP), a device with high processing power. This work's contribution consists of presenting a DSP implementation of the system logic for detecting leaks in real time. Among the various leak-detection methods available today, this work uses a technique based on pipe pressure analysis, employing the Wavelet Transform and Neural Networks. In this context, the DSP, in addition to performing the digital processing of the pressure signal, also communicates with a Global Positioning System (GPS), which helps to locate the leak, and with a SCADA system, sharing information. To ensure robustness and reliability in the communication between the DSP and the SCADA system, the Modbus protocol is used. As this is a real-time application, special attention is given to the response time of each of the tasks performed by the DSP. Tests and leak simulations were performed using the infrastructure of the Laboratory of Evaluation of Measurement in Oil (LAMP) at the Federal University of Rio Grande do Norte (UFRN).
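A toy version of the pressure-analysis step can be sketched with a one-level Haar wavelet: the detail coefficients respond strongly to the abrupt pressure drop a leak produces. The thresholding below stands in for the thesis's trained neural network, and all parameters are assumptions of this sketch.

```python
import numpy as np

def haar_detail(x):
    """One level of Haar wavelet detail coefficients: differences of
    adjacent sample pairs, which spike at abrupt steps in the signal."""
    x = x[: len(x) // 2 * 2]              # drop a trailing odd sample
    return (x[0::2] - x[1::2]) / np.sqrt(2)

def leak_alarm(pressure, k=5.0):
    """Raise an alarm when any detail coefficient (computed at both pair
    alignments, so steps on either parity are caught) exceeds k times the
    median detail level. Illustrative stand-in for the neural network."""
    d = np.abs(np.concatenate([haar_detail(pressure),
                               haar_detail(pressure[1:])]))
    return bool(d.max() > k * (np.median(d) + 1e-12))
```

Because the Haar details are cheap pairwise differences, a per-sample decision like this fits comfortably in the response-time budget of a real-time DSP task.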
Abstract:
In the Fazenda Belém oil field (Potiguar Basin, Ceará State, Brazil), sinkholes and sudden terrain collapses associated with an unconsolidated sedimentary cap covering the Jandaíra karst occur frequently. This research was carried out in order to understand the mechanisms that generate these collapses. The main tool used was Ground Penetrating Radar (GPR). The work develops along two lines: one concerns methodological improvements in GPR data processing, while the other concerns the geological study of the Jandaíra karst. The second line was strongly supported both by the analysis of outcropping karst structures (in other regions of the Potiguar Basin) and by the interpretation of radargrams of the subsurface karst at Fazenda Belém. An adequate flow to process GPR data was designed and tested, adapted from a usual flow to process seismic data. The changes were introduced to take into account important differences between GPR and reflection seismic methods, in particular: poor coupling between source and ground, the mixed phase of the wavelet, low signal-to-noise ratio, single-channel acquisition, and the strong influence of wave-propagation effects, notably dispersion. High-frequency components of the GPR pulse suffer more pronounced attenuation than low-frequency components, resulting in resolution losses in the radargrams. At Fazenda Belém, a suitable GPR processing flow is all the more necessary because of both the very high level of aerial events and the complexity of the imaged subsurface karst structures. The key point of the processing flow was an improved correction of the attenuation effects on the GPR pulse, based on their influence on the amplitude and phase spectra of the GPR signals.
In dielectric media with low to moderate losses, the propagated signal suffers significant changes only in its amplitude spectrum; that is, the phase spectrum of the propagated signal remains practically unaltered over the usual travel-time ranges. Based on this fact, it is shown with real data that the judicious application of the well-known tools of time gain and spectral balancing can efficiently correct the attenuation effects. The proposed approach can be applied in heterogeneous media and does not require precise knowledge of the attenuation parameters of the media. As an additional benefit, the judicious application of spectral balancing promotes a partial deconvolution of the data without changing its phase. In other words, spectral balancing acts in a similar way to a zero-phase deconvolution. In GPR data, the resolution increase obtained with spectral balancing is greater than that obtained with spike and predictive deconvolutions. The evolution of the Jandaíra karst in the Potiguar Basin is associated with at least three events of subaerial exposure of the carbonate platform, during the Turonian, Santonian, and Campanian. In the Fazenda Belém region, during the mid-Miocene, the Jandaíra karst was covered by continental siliciclastic sediments. These sediments partially filled the void space associated with the dissolution structures and fractures. Therefore, the development of the karst in this region was attenuated in comparison with other places in the Potiguar Basin where the karst is exposed.
At Fazenda Belém, the generation of sinkholes and terrain collapses is controlled mainly by: (i) the presence of an unconsolidated sedimentary cap thick enough to cover the karst completely, but with a sediment volume lower than the space available in the dissolution structures of the karst; (ii) the existence of important SW-NE and NW-SE structural alignments, which promote a localized increase in hydraulic connectivity, allowing the channeling of groundwater and thus facilitating carbonate dissolution; and (iii) the existence of a hydraulic barrier to groundwater flow, associated with the Açu-4 Unit. The terrain-collapse mechanism at Fazenda Belém follows this temporal evolution: meteoric water infiltrates through the unconsolidated sedimentary cap and remobilizes it into the void space associated with the dissolution structures of the Jandaíra Formation. This remobilization starts at the base of the sedimentary cap, where abrasion increases due to a change from laminar to turbulent flow regime as the groundwater flow reaches the open karst structures. The remobilized sediments progressively fill the karst voids from bottom to top, so the void space continuously migrates upwards, ultimately reaching the surface and causing the sudden terrain collapses observed. This phenomenon is particularly active during the rainy season, when the water table, normally located in the karst, may temporarily rise into the unconsolidated sedimentary cap.
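The two attenuation-correction tools named in the abstract (time gain and spectral balancing) can be sketched as follows. The exponential gain law, the smoothing window, and the stabilization constant are illustrative assumptions, not the parameters used in the thesis.

```python
import numpy as np

def time_gain(trace, dt, alpha):
    """Exponential time gain: compensates an assumed e^(-alpha*t) amplitude
    decay along the trace (alpha is an assumption of this sketch)."""
    t = np.arange(len(trace)) * dt
    return trace * np.exp(alpha * t)

def spectral_balance(trace, smooth=9, eps=1e-2):
    """Divide the spectrum by a smoothed copy of its own amplitude spectrum:
    the amplitude spectrum is flattened while the phase spectrum is left
    untouched, i.e. the balancing acts like a zero-phase deconvolution."""
    X = np.fft.rfft(trace)
    amp_s = np.convolve(np.abs(X), np.ones(smooth) / smooth, mode="same")
    Xb = X / (amp_s + eps * amp_s.max())   # real, positive divisor
    return np.fft.irfft(Xb, n=len(trace))
```

Because the divisor is real and positive at every frequency, the phase spectrum is preserved exactly, which is the property the abstract relies on for low-loss dielectric media.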
Abstract:
This work aims to develop a methodology for image analysis using superposition, which assists in identifying microstructural features of titanium surfaces that may be associated with their biological response. To that end, titanium surfaces heat-treated in eight different ways were subjected to a cell-culture test. A relationship was sought between the grain size, texture, and grain shape of the (etched) titanium surface and the processes of cell proliferation and adhesion. Open-source software was used to count the cells adhered to the titanium surface. The juxtaposition of images taken before and after cell culture was obtained with the aid of micro-hardness indentations made on the surface of the samples. From the superposed images it is possible to study a possible relationship between cell growth and the microstructural characteristics of the titanium surface. This methodology proved effective in describing a set of procedures useful for the analysis of titanium surfaces subjected to cell culture.
Abstract:
This master's dissertation presents a contribution to the study of the sintering of 316L stainless steel, aiming to investigate its behavior in the milling process and the effect of the isotherm temperature on the microstructure and mechanical properties. 316L stainless steel is a widely used alloy thanks to its high corrosion resistance. However, its application is limited by its low wear resistance, a consequence of its low hardness. Previous work analyzed the effect of sintering additives such as NbC and TaC. This study aims to deepen the understanding of the sintering process by analyzing the effect of milling on particle size and microstructure, and the effect of the heating rate and soaking time on the sintered microstructure and its microhardness. 316L powders were milled with NbC for 1, 5, and 24 hours. The particulates were characterized by SEM and . Cylindrical samples with height and diameter of 5.0 mm were compacted at 700 MPa. The sintering conditions were: heating rates of 5, 10, and 15 °C/min; temperatures of 1000, 1100, 1200, 1290, and 1300 °C; and soaking times of 30 and 60 min. The cooling rate was maintained at 25 °C/min. All samples were sintered in a vacuum furnace. The sintered microstructures were characterized by optical and electron microscopy, as well as by density and microhardness measurements. It was observed that the milling process influences sintering, as does temperature. The major effect was caused by the firing temperature, followed by the milling and the heating rate. In this case, the highest rates corresponded to greater sintering.
Abstract:
This research presents an overview of the addition of steelwork dust to ceramic shingles, in order to contribute to the utilization of this residue. The outlook for the ceramic industry in the Brazilian state of Piauí is quite promising. Unlike other productive sectors, the ceramic industry uses essentially natural raw materials; its final products are, in short, the result of transforming clay compounds. These raw materials are composed primarily of oxides of aluminum, silicon, iron, sodium, magnesium, and calcium, among others. It was verified that steelwork dust is composed primarily of these same oxides, so its incorporation into structural ceramics is a very reasonable idea. Both the clay and the steelwork powder were characterized by AG, XRF, XRD, TGA, and DTA. In addition, samples containing steelwork dust (0%, 5%, 10%, 15%, 20%, and 25%) were extruded and fired at 800 °C, 850 °C, 900 °C, and 950 °C. Technological tests of linear shrinkage, water absorption, apparent porosity, apparent density, and flexural strength were then carried out. The results showed the possibility of using up to 15% steelwork powder in ceramic shingles, with a significant improvement in physical and mechanical properties. This behavior indicates the possibility of firing at temperatures lower than 850 °C, thus reducing the final cost of the product.
Abstract:
Currently, the search for new materials with properties suitable for specific applications has increased the number of studies that aim to address market needs. Poly(methyl methacrylate) (PMMA) is one of the most important polymers of the polyacrylate and polymethacrylate family, notable for its unique optical properties, weathering resistance, and exceptional hardness and gloss. The development of polymer composites by adding inorganic fillers to the PMMA matrix increases the potential use of this polymer in various fields of application. The most commonly used inorganic fillers are silica (SiO2) particles, modified clays, graphite, and carbon nanotubes. The main objective of this work is the development of PMMA/SiO2 composites at different SiO2 concentrations for new applications as engineering plastics. The composites were produced by extrusion of tubular film, obtained via solution for application onto commercial PMMA plates, and also by injection molding, in order to improve the abrasion and scratch resistance of PMMA without compromising transparency. The effects of adding silica particles on the properties of the polymer matrix were evaluated by maximum tensile strength, hardness, and abrasion and scratch resistance, in addition to preliminary characterization by torque rheometry and melt flow rate. The results indicated that silica particles can be used in a PMMA matrix; a higher silica concentration increased the abrasion and scratch resistance and the hardness, and reduced the tensile strength.
Abstract:
Polymer matrix composites offer advantages for many applications due to their combination of properties, which includes low density, high specific strength and modulus of elasticity, and corrosion resistance. However, non-destructive evaluation of these materials with magnetic sensors is not possible, since the materials are non-magnetizable. Ferrites are materials with excellent magnetic properties, chemical stability, and corrosion resistance. Because of these properties, they are promising for the development of polymer composites with magnetic properties. In this work, glass fiber/epoxy circular plates were produced with 10 wt% of cobalt or barium ferrite particles. The cobalt ferrite was synthesized by the Pechini method. The commercial barium ferrite was subjected to a milling process to study the effect of particle size on the magnetic properties of the material. The ferrites were characterized by X-ray diffraction (XRD), field emission gun scanning electron microscopy (FEG-SEM), and vibrating sample magnetometry (VSM). Circular notches of 1, 5, and 10 mm diameter were introduced into the composite plates with a drill bit for non-destructive evaluation by the magnetic flux leakage (MFL) technique. The results indicated that the magnetic signals measured on the plates with unmilled barium ferrite and with cobalt ferrite showed good correlation with the presence of the notches. Milling for 12 h and 20 h did not help to identify the smallest notches (1 mm). However, the smaller particle size produced smoother magnetic curves, with fewer discontinuities and an improved signal-to-noise ratio. In summary, the results suggest that the proposed approach has great potential for the detection of damage in polymer composite structures.
Abstract:
This work studied the immiscible blend of elastomeric poly(methyl methacrylate) (PMMA) with bottle-grade poly(ethylene terephthalate) (PET), with and without the compatibilizer agent poly(methyl methacrylate-co-glycidyl methacrylate-co-ethyl acrylate) (MGE). Torque rheometry, melt flow index (MFI) measurement, density and degree-of-crystallinity measurement by pycnometry, tensile testing, the essential work of fracture (EWF) method, scanning electron microscopy (SEM), and transmission electron microscopy (TEM) were performed on the pure polymers and on the PMMA/PET blends. The rheological results showed evidence of a chemical reaction between the epoxy group of MGE and the end groups of the PET chains, and also with the elastomeric phase of the PMMA. Increasing the PET concentration reduced the torque, and adding MGE increased the torque of the PMMA/PET blend. The MFI results also show that the elastomeric PMMA had a lower flow, and thus a higher viscosity, than the PET. The pycnometry results showed that increasing the percentage of PET increased the density and the degree of crystallinity of the PMMA/PET blends. The tensile tests showed that increasing the percentage of PET increased the ultimate strength and elastic modulus and decreased the elongation at break. However, at the phase inversion, where the blend showed evidence of a co-continuous morphology, and also with 30% PET as the dispersed phase compatibilized with 5% MGE, there were significant gains in elongation at break compared with the elastomeric PMMA. The essential work of fracture method proved applicable to most formulations, and it was observed that increasing the elastomeric PMMA content in the blend formulations improved the specific essential work of fracture (We) and decreased the specific non-essential work of fracture (βWp).
Abstract:
In geophysics there are several steps in the study of the Earth; one of them is the processing of seismic records. These records are obtained through observations made on the Earth's surface and provide information about the structure and composition of inaccessible regions at great depths. Most of the tools and techniques developed for such studies have been applied in academic projects. The big problem is that unwanted energy, recorded by receivers and carrying no information related to the reflectors, can mask the useful information and/or generate erroneous information about the subsurface. This energy is known as unwanted seismic noise. Reducing the noise and enhancing the signal that indicates a reflection, without losing desirable signals, is sometimes a problem of difficult solution. This project aims to remove the ground roll noise, which shows a pattern characterized by low frequency, low rate of decay, low velocity, and high amplitude. The Karhunen-Loève transform is a great tool for the identification of patterns based on eigenvalues and eigenvectors. Together with the Karhunen-Loève transform, we use the Singular Value Decomposition (SVD), since it is a powerful mathematical technique for manipulating data.
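A minimal sketch of SVD/Karhunen-Loève filtering on a gather follows. Treating the ground roll as the most trace-coherent energy and dropping the leading eigenimage is an illustrative simplification of the flow described above; in practice the gather is first aligned so that the ground roll maps onto the leading singular components.

```python
import numpy as np

def svd_filter(gather, drop=1):
    """Karhunen-Loeve / SVD filtering of a gather (traces x samples): the
    leading eigenimages carry the most trace-to-trace-coherent energy --
    here, the aligned ground roll -- so zeroing the first `drop` singular
    values and reconstructing keeps the less coherent reflections."""
    U, s, Vt = np.linalg.svd(gather, full_matrices=False)
    s = s.copy()
    s[:drop] = 0.0                 # discard the leading eigenimage(s)
    return (U * s) @ Vt            # reconstruct without them
```

The eigenvalue ranking does the pattern identification the abstract refers to: the singular values sort the eigenimages by how much trace-coherent energy each one explains, so a strong, laterally consistent ground roll concentrates in the first few.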
Abstract:
The acceleration of industrial growth in recent decades on all continents has aroused the interest of companies in countering the impacts produced on the environment, spurred primarily by major disasters in the petroleum industry. In this context, produced water is responsible for the largest volume of effluent from the production and extraction of oil and natural gas. This effluent contains critical components such as inorganic salts, heavy metals (Fe, Cu, Zn, Pb, Cd), oil, and chemicals added in the various production processes. In response to this impact, research has been triggered into alternative adsorbent materials for the treatment of water and produced water, with the aim of removing oils, acids, and heavy metals. Many studies of diatomaceous earth (diatomite) in Brazil involve its physico-chemical properties, mineral deposits, extraction, processing, and applications. The official estimate of the deposits is around 2.5 million tonnes, mainly located in the states of Bahia (44%) and Rio Grande do Norte (37.4%). Moreover, these two states are large offshore producers, which gives them a prominent role in research on adsorbents such as diatomite for the treatment of produced water. Its main applications are as a filtration agent, for the adsorption of oils and greases, as an industrial filler, and as a thermal insulator. The objective of this work was the processing and characterization of diatomaceous earth obtained from the municipality of Macaíba-RN (known locally as tabatinga) as a low-cost regenerative adsorbent for the removal of heavy metals in the treatment of produced water. A batch processing methodology was adopted, as practiced by small businesses located in the producing regions of Brazil. The characterization was made by X-ray diffraction (XRD), scanning electron microscopy (SEM), and specific surface area (BET).
The research showed that the beneficiation process used was effective for the small-volume production of concentrated diatomite. The diatomite obtained was treated by calcination at 900 °C for 2 hours, with and without the fluxing agent Na2CO3 (4%), following the optimal conditions reported in the literature. Column adsorption experiments were conducted by percolation through the natural, calcined, and flux-calcined diatomites. The effluent used was a saline solution containing ions of Cu, Zn, Na, Ca, and Mg, simulating the composition of the produced waters of the state of Rio Grande do Norte, Brazil. The breakthrough curves gave, for the simultaneous removal of copper and zinc ions, 84.3% for the calcined diatomite and 97.3% for the flux-calcined diatomite. The flux-calcined diatomite was more efficient both in permeability through the bed and in the removal of copper and zinc ions. The natural diatomite had permeability problems through the bed under the conditions tested, compared with the other diatomites obtained. The results are promising for application in the petroleum industry.