60 results for signal processing


Relevância: 20.00%

Resumo:

Launching centers are designed for scientific and commercial activities involving aerospace vehicles. Rocket Tracking Systems (RTS) are part of the infrastructure of these centers and are responsible for collecting and processing vehicle trajectory data. RTS generally use Parabolic Reflector Radars (PRRs); however, radars based on antenna arrays, or Phased Arrays (PAs), known as Phased Array Radars (PARs), can also be used. In a PAR, the excitation signal of each radiating element of the array can be adjusted to steer the radiation pattern electronically, improving the functionality and maintainability of the system. In implementation and reuse projects of PARs, however, the modeling step admits many combinations of excitation signals, producing a complex optimization problem with a very large solution space. In this case, offline optimization methods such as Genetic Algorithms (GAs) can be used to compute solutions, which are then stored for online applications. Hence, the Genetic Algorithm with Maximum-Minimum Crossover (GAMMC) optimization method was used to develop the GAMMC-P algorithm, which optimizes the modeling step of radiation pattern control for planar PAs. Unlike a conventional crossover GA, the GAMMC crosses the fittest individuals with the least fit individuals in order to enhance genetic diversity; this prevents premature convergence, increases population fitness and reduces processing time. GAMMC-P thus combines a reconfigurable multi-objective algorithm, a distinct coding scheme and the MMC genetic operator. Test results show that GAMMC-P met the proposed requirements under different operating conditions of a planar PAR.
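The pairing rule that distinguishes the MMC operator can be sketched in a few lines. The code below is an illustrative toy (a bit-string fitness with a simple half-split crossover), not the authors' GAMMC-P; the point is the crossover of the fittest individual with the least fit one, the second fittest with the second least fit, and so on.

```python
import random

def gammc_step(population, fitness, crossover, mutate):
    """One generation with Maximum-Minimum Crossover (MMC): rank the
    population and cross the i-th fittest with the i-th least fit,
    which mixes distant genotypes and preserves diversity."""
    ranked = sorted(population, key=fitness, reverse=True)
    children = []
    for i in range(len(ranked) // 2):
        a, b = ranked[i], ranked[-(i + 1)]        # max paired with min
        children.extend(crossover(a, b))
    children = [mutate(c) for c in children]
    # elitist replacement: keep the best of parents + children
    return sorted(population + children, key=fitness, reverse=True)[:len(population)]

# Toy problem (an assumption, not the radar model): maximize 1-bits.
def fitness(ind): return sum(ind)
def crossover(a, b):
    cut = len(a) // 2
    return [a[:cut] + b[cut:], b[:cut] + a[cut:]]
def mutate(ind, rate=0.05):
    return [bit ^ (random.random() < rate) for bit in ind]

random.seed(0)
pop = [[random.randint(0, 1) for _ in range(32)] for _ in range(20)]
for _ in range(60):
    pop = gammc_step(pop, fitness, crossover, mutate)
best = max(pop, key=fitness)
```

In the real problem the genome would encode the excitation amplitudes and phases of the array elements, and the fitness would score the resulting radiation pattern against the desired one.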

Relevância: 20.00%

Resumo:

NAVSTAR/GPS (NAVigation System with Timing And Ranging / Global Positioning System), better known as GPS, is a satellite-based navigation system developed by the United States Department of Defense in the mid-1970s. Initially created for military purposes, GPS was later adapted for civilian use. To compute a position, the receiver must first acquire the signals of the visible satellites. This acquisition stage is critical, since it is responsible for detecting the visible satellites and estimating their respective frequencies and initial phases. The process can demand considerable processing time and must be implemented efficiently. Several techniques are in use today, but most of them trade off conflicting design goals such as computational complexity, acquisition time and computational resources. To balance these goals, a method was developed that reduces the complexity of the acquisition process through a few strategies, namely reducing the Doppler effect, the number of samples and the signal length used, in addition to parallelism. The strategy is divided into two steps: a coarse search over the entire search space and a fine search restricted to the region identified by the first step. Because of the coarse search, the threshold of the conventional algorithm was no longer acceptable, so a new threshold was established based on the variance of the correlation peaks. First, a low-precision search compares the variance of the five largest correlation peaks found. If the variance exceeds a certain threshold, the region around the largest peak becomes a detection candidate. Finally, this region is refined to confirm the detection. The results show a significant reduction in complexity and execution time, without resorting to highly complex algorithms.
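The coarse-search decision rule described above, comparing the variance of the five largest correlation peaks against a threshold, can be sketched as follows. This is a simplified illustration using a random PRN-like code and circular correlation via FFT; the threshold value and signal model are assumptions, not the dissertation's parameters.

```python
import numpy as np

def coarse_detect(received, code, threshold):
    """Coarse acquisition step: correlate the received signal with a
    local code replica and decide, from the variance of the five largest
    correlation peaks, whether a candidate code phase exists."""
    corr = np.abs(np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code))))
    top5 = np.sort(corr)[-5:]             # five largest correlation values
    if np.var(top5) > threshold:          # one dominant peak stands out
        return int(np.argmax(corr))       # candidate code phase for the fine search
    return None

rng = np.random.default_rng(1)
code = rng.choice([-1.0, 1.0], size=1023)               # PRN-like binary code
received = np.roll(code, 217) + 0.5 * rng.standard_normal(1023)
phase = coarse_detect(received, code, threshold=100.0)  # assumed threshold
```

When a satellite is present, one correlation value towers over the rest, so the variance of the top peaks is large; with noise only, the top values are similar and the variance stays below the threshold.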

Relevância: 20.00%

Resumo:

As device technology and the ways of generating and using energy progress, power quality parameters increasingly influence all kinds of power consumers. Many types of power quality analyzers already exist; however, there is still a need for devices that perform measurements, calculate parameters, find faults, suggest changes and support the management of the installation, and such devices must remain affordable. To maintain this balance, a magnitude measurement method that does not require large processing or memory resources should be used. This work shows that the Goertzel algorithm, compared with the commonly used FFT, allows measurements to be made with far fewer hardware resources, leaving memory space available for management functions. The first part of the work surveys the disturbances that are most common for low-voltage consumers. A functional diagram is then proposed, indicating what will be measured and calculated, which problems will be detected and which solutions can be found. By simulating the Goertzel algorithm in Scilab, the frequency components of a distorted signal can be calculated with satisfactory results. Finally, the prototype is assembled and tested, adjusting the parameters needed to keep the device reliable without increasing its cost.
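A minimal Goertzel implementation (Python here rather than the work's Scilab) illustrates why it is attractive for this device: one recursion with a single real coefficient per sample yields the magnitude of one frequency bin, instead of the full N log N FFT. The 60 Hz / 1920 Hz figures are illustrative, not the prototype's.

```python
import math

def goertzel(samples, sample_rate, target_freq):
    """Goertzel algorithm: magnitude of a single DFT bin using one
    multiply-accumulate recursion per sample -- far cheaper than a full
    FFT when only a handful of harmonics is needed."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)      # nearest DFT bin index
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    power = s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2
    return math.sqrt(max(power, 0.0))             # guard tiny negative rounding

# Illustrative numbers: 60 Hz mains sampled at 1920 Hz over a window
# of exactly 6 cycles (192 samples).
fs, n = 1920, 192
sig = [math.sin(2 * math.pi * 60 * i / fs) for i in range(n)]
mag60 = goertzel(sig, fs, 60)    # unit sine at an exact bin -> n/2
mag180 = goertzel(sig, fs, 180)  # third harmonic absent -> ~0
```

Running the recursion once per harmonic of interest (fundamental, 3rd, 5th, ...) is enough for distortion analysis, which is what frees memory for the management functions mentioned above.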

Relevância: 20.00%

Resumo:

Digital images are used to solve day-to-day problems in many areas. In medicine, computer systems have improved diagnosis and medical interpretation. Dentistry is no different: procedures assisted by computers increasingly support dentists in their tasks. In this context, the area of dentistry known as public oral health is responsible for diagnosing and treating the oral health of a population. To this end, visual oral inspections are carried out to obtain information on the oral health status of a given population. From this collection of information, also known as an epidemiological survey, the dentist can plan and evaluate actions for the different problems identified. This procedure has limiting factors, such as the limited number of qualified professionals to perform these tasks and differing diagnostic interpretations, among others. This context motivated the idea of using intelligent system techniques to support these tasks. This work therefore proposes the development of an intelligent system able to segment, count and classify teeth in occlusal intraoral digital photographic images. The proposed system combines machine learning techniques and digital image processing. First, a color-based segmentation of the images into regions of interest, teeth and non-teeth, is carried out using a Support Vector Machine (SVM). After these regions are identified, techniques based on morphological operators, such as erosion and the watershed transform, are used to count the teeth and detect their boundaries, respectively. With the tooth borders detected, Fourier descriptors of their shape and position descriptors are computed. The teeth are then classified according to their types using the SVM with the one-against-all method for the multiclass problem. The multiclass classification problem was approached in two different ways: the first approach considers three class types (molar, premolar and non-tooth), while the second considers five (molar, premolar, canine, incisor and non-tooth). The system performed satisfactorily in segmenting, counting and classifying the teeth present in the images.
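The Fourier-descriptor step can be sketched as follows. This is the generic construction (treat the boundary as a complex sequence, drop the DC term for translation invariance, divide by the first harmonic for scale invariance, keep magnitudes for rotation and starting-point invariance) and not necessarily the exact normalization used in the work.

```python
import numpy as np

def fourier_descriptors(contour, k=8):
    """Shape descriptors from a closed contour given as (x, y) points:
    FFT of the complex boundary, normalized to be invariant to
    translation, scale, rotation and starting point."""
    z = contour[:, 0] + 1j * contour[:, 1]
    coeffs = np.fft.fft(z)
    coeffs = coeffs[1:k + 1]                   # drop DC -> translation invariant
    return np.abs(coeffs) / np.abs(coeffs[0])  # scale/rotation invariant

theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.stack([np.cos(theta), np.sin(theta)], axis=1)
shifted = circle * 3.0 + np.array([5.0, -2.0])   # scaled + translated copy
d1 = fourier_descriptors(circle)
d2 = fourier_descriptors(shifted)
```

Two teeth of the same type then yield similar descriptor vectors regardless of where they sit in the image, which is what makes these features suitable inputs for the one-against-all SVM.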

Relevância: 20.00%

Resumo:

This work develops a methodology for image analysis based on superposition, which assists in identifying microstructural features of titanium surfaces that may be associated with their biological response. Titanium surfaces heat treated in eight different ways were subjected to a cell culture test. A relationship among the grain size, texture and grain shape of the (etched) titanium surface was investigated, seeking a connection with the processes of cell proliferation and adhesion. Open source software was used to count the cells adhering to the titanium surface. The juxtaposition of images taken before and after cell culture was obtained with the aid of microhardness indentations made on the surface of the samples. From the superimposed image, it is possible to study a possible relationship between cell growth and the microstructural characteristics of the titanium surface. The methodology proved effective in describing a set of procedures useful in the analysis of titanium surfaces subjected to cell culture.

Relevância: 20.00%

Resumo:

This master's dissertation contributes to the study of 316L stainless steel sintering, examining its behavior in the milling process and the effect of the isotherm temperature on microstructure and mechanical properties. 316L stainless steel is a widely used alloy owing to its high corrosion resistance; however, its application is limited by low wear resistance, a consequence of its low hardness. Previous work analyzed the effect of sintering additives such as NbC and TaC. This study deepens the understanding of sintering by analyzing the effect of milling on particle size and microstructure, and the effect of heating rate and soaking time on the sintered microstructure and its microhardness. 316L powders with NbC were milled for 1, 5 and 24 hours. The particulates were characterized by SEM. Cylindrical samples 5.0 mm in height and diameter were compacted at 700 MPa. The sintering conditions were: heating rates of 5, 10 and 15 °C/min; temperatures of 1000, 1100, 1200, 1290 and 1300 °C; and soaking times of 30 and 60 min. The cooling rate was kept at 25 °C/min. All samples were sintered in a vacuum furnace. The sintered microstructures were characterized by optical and electron microscopy, as well as by density and microhardness measurements. Both the milling process and the temperature were observed to influence sintering. The largest effect was caused by the firing temperature, followed by milling and heating rate; in each case, the highest values corresponded to greater sintering.

Relevância: 20.00%

Resumo:

This research presents an overview of the addition of steelwork dust to ceramic shingles, contributing to the utilization of this residue. The outlook for the ceramic industry in the Brazilian state of Piauí is quite promising. Unlike other productive sectors, the ceramic industry uses essentially natural raw materials; its final products are, in short, the result of transforming clay compounds. These raw materials are composed primarily of oxides of aluminum, silicon, iron, sodium, magnesium and calcium, among others. Steelwork dust was verified to be composed primarily of these same oxides, so its incorporation into structural ceramics is a very reasonable idea. Both the clay and the steelwork powder were characterized by AG, XRF, XRD, TGA and DTA. Samples containing 0%, 5%, 10%, 15%, 20% and 25% steelwork dust were extruded and fired at 800 °C, 850 °C, 900 °C and 950 °C. Technological tests of linear shrinkage, water uptake, apparent porosity, apparent density and flexural strength were then carried out. The results showed that steelwork powder can be used in ceramic shingles at contents up to 15%, with significant improvement in physical and mechanical properties. This behavior indicates the possibility of firing at temperatures below 850 °C, thus reducing the final cost of the product.

Relevância: 20.00%

Resumo:

The search for new materials with properties suited to specific applications has increased the number of studies aiming to meet market needs. Poly(methyl methacrylate) (PMMA) is one of the most important polymers of the polyacrylate and polymethacrylate family, notable for its unique optical properties and weathering resistance, and for exceptional hardness and gloss. Developing polymer composites by adding inorganic fillers to the PMMA matrix broadens the potential use of this polymer in various fields of application. The most commonly used inorganic fillers are silica (SiO2) particles, modified clays, graphite and carbon nanotubes. The main objective of this work is the development of PMMA/SiO2 composites at different SiO2 concentrations for new applications as engineering plastics. The composites were produced by tubular film extrusion, obtained via solution for application onto commercial PMMA plates, and also by injection molding, in order to improve the abrasion and scratch resistance of PMMA without compromising transparency. The effects of adding silica particles to the polymer matrix were evaluated through maximum tensile strength, hardness, and abrasion and scratch resistance, in addition to preliminary characterization by torque rheometry and melt flow rate. The results indicated that silica particles can be used in a PMMA matrix: higher silica concentrations increased abrasion resistance, scratch resistance and hardness, while reducing tensile strength.

Relevância: 20.00%

Resumo:

Polymer matrix composites offer advantages for many applications due to their combination of properties, which includes low density, high specific strength and modulus of elasticity, and corrosion resistance. However, non-destructive techniques based on magnetic sensors cannot be applied to evaluate these materials, since they are non-magnetizable. Ferrites are materials with excellent magnetic properties, chemical stability and corrosion resistance; owing to these properties, they are promising for the development of polymer composites with magnetic properties. In this work, glass fiber/epoxy circular plates were produced with 10 wt% of cobalt or barium ferrite particles. The cobalt ferrite was synthesized by the Pechini method. The commercial barium ferrite was subjected to a milling process to study the effect of particle size on the magnetic properties of the material. The ferrites were characterized by X-ray diffraction (XRD), field emission gun scanning electron microscopy (FEG-SEM) and vibrating sample magnetometry (VSM). Circular notches of 1, 5 and 10 mm diameter were machined into the composite plates with a drill bit for non-destructive evaluation by the magnetic flux leakage (MFL) technique. The results indicated that the magnetic signals measured on the plates with unmilled barium ferrite and with cobalt ferrite correlated well with the presence of the notches. Milling for 12 h and 20 h did not improve the identification of the smallest notches (1 mm); however, the smaller particle size produced smoother magnetic curves, with fewer discontinuities and an improved signal-to-noise ratio. In summary, the results suggest that the proposed approach has great potential for damage detection in polymer composite structures.

Relevância: 20.00%

Resumo:

This work studied the immiscible blend of elastomeric poly(methyl methacrylate) (PMMA) with bottle-grade poly(ethylene terephthalate) (PET), with and without the compatibilizer poly(methyl methacrylate-co-glycidyl methacrylate-co-ethyl acrylate) (MGE). The pure polymers and the PMMA/PET blends were characterized by torque rheometry, melt flow index (MFI) measurement, density and degree of crystallinity by pycnometry, tensile testing, the essential work of fracture (EWF) method, scanning electron microscopy (SEM) and transmission electron microscopy (TEM). The rheological results showed evidence of chemical reaction between the epoxy group of MGE and the end groups of the PET chains, and also with the elastomeric phase of the PMMA. Increasing the PET concentration reduced the torque, while adding MGE increased the torque of the PMMA/PET blend. The MFI results also showed that the elastomeric PMMA had lower flow, and thus higher viscosity, than the PET. Pycnometry showed that increasing the PET fraction increased the density and degree of crystallinity of the PMMA/PET blends. Tensile testing showed that increasing the PET fraction increased the ultimate strength and elastic modulus and decreased the elongation at break. However, at the phase inversion, where the blend showed evidence of a co-continuous morphology, and also with 30% PET as dispersed phase compatibilized with 5% MGE, significant gains in elongation at break were observed relative to the elastomeric PMMA. The essential work of fracture method proved applicable to most formulations, and increasing the elastomeric PMMA content in the blend formulations improved the specific essential work of fracture (We) and decreased the specific non-essential work of fracture (βWp).

Relevância: 20.00%

Resumo:

In geophysics there are several steps in the study of the Earth, one of which is the processing of seismic records. These records are obtained through observations made at the Earth's surface and provide information about the structure and composition of inaccessible regions at great depths. Most of the tools and techniques developed for such studies have been applied in academic projects. The big problem in seismic processing is that unwanted energy, recorded by receivers but carrying no information related to the reflectors, can mask the subsurface information or generate erroneous results. This energy is known as unwanted seismic noise. Reducing the noise and enhancing the signal that indicates a reflection, without losing desirable signals, is often a problem of difficult solution. This project aims to remove ground roll noise, which shows a pattern characterized by low frequency, low decay rate, low velocity and high amplitudes. The Karhunen-Loève transform is a powerful tool for pattern identification based on eigenvalues and eigenvectors; together with it, we use the Singular Value Decomposition, an excellent mathematical technique for manipulating the data.
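The combination of the Karhunen-Loève transform with the SVD amounts to eigenimage filtering: decompose the seismic section, zero the dominant singular components, and reconstruct. A minimal numpy sketch, assuming the coherent event has already been flattened so it concentrates in the first eigenimage (the synthetic section below is an assumption for illustration):

```python
import numpy as np

def svd_filter(data, n_remove=1):
    """KL/SVD filtering of a seismic section (traces x samples): the
    first eigenimages capture the most laterally coherent energy
    (e.g. aligned ground roll), which is removed by reconstructing
    the section without them."""
    U, s, Vt = np.linalg.svd(data, full_matrices=False)
    s_filt = s.copy()
    s_filt[:n_remove] = 0.0                 # zero the dominant eigenimages
    return U @ np.diag(s_filt) @ Vt

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
# A perfectly flat (coherent) low-frequency event plus incoherent noise.
coherent = np.outer(np.ones(30), np.sin(2 * np.pi * 5 * t))
section = coherent + 0.1 * rng.standard_normal((30, 200))
filtered = svd_filter(section, n_remove=1)   # coherent energy removed
```

In practice the number of eigenimages to remove is a tuning choice: too few leave ground roll behind, too many start eating reflection energy.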

Relevância: 20.00%

Resumo:

The acceleration of industrial growth in recent decades on all continents has aroused companies' interest in countering the impacts produced on the environment, spurred primarily by major disasters in the petroleum industry. In this context, produced water accounts for the largest volume of effluent from the production and extraction of oil and natural gas. This effluent contains critical components such as inorganic salts, heavy metals (Fe, Cu, Zn, Pb, Cd), oil and chemicals added during the various production processes. In response, research has turned to alternative adsorbent materials for the treatment of water and produced water, aiming at the removal of oils, acids and heavy metals. Much of the research on diatomaceous earth (diatomite) in Brazil involves studies of its physico-chemical properties, mineral deposits, extraction, processing and applications. Official estimates put the deposits at around 2.5 million tonnes, the main ones located in the states of Bahia (44%) and Rio Grande do Norte (37.4%). Moreover, these two states are large offshore producers, giving them a prominent role in research on adsorbents such as diatomite for produced water treatment. Its main applications are as a filtration agent, an adsorbent of oils and greases, an industrial filler and a thermal insulator. The objective of this work was the processing and characterization of diatomite obtained from diatomaceous earth from the municipality of Macaíba-RN (known locally as tabatinga) as a low-cost regenerative adsorbent for the removal of heavy metals in produced water treatment. A batch processing methodology was adopted, as practiced by small businesses in the producing regions of Brazil. Characterization was performed by X-ray diffraction (XRD), scanning electron microscopy (SEM) and specific surface area (BET).
The research showed that the beneficiation process used was effective for small-volume production of concentrated diatomite. The diatomite obtained was calcined at 900 °C for 2 hours, with and without the flux Na2CO3 (4%), following optimal results from the literature. Column adsorption experiments were conducted by percolation through the natural, calcined and flux-calcined diatomites. The effluent was a saline solution containing Cu, Zn, Na, Ca and Mg ions, simulating the composition of produced waters in the state of Rio Grande do Norte, Brazil. The breakthrough curves showed simultaneous removal of copper and zinc ions of 84.3% for the calcined diatomite and 97.3% for the flux-calcined diatomite. The flux-calcined diatomite was more efficient in bed permeability and in the removal of copper and zinc ions. The natural diatomite showed permeability problems through the bed under the tested conditions, compared with the other diatomites obtained. The results are promising for application in the petroleum industry.

Relevância: 20.00%

Resumo:

Originally aimed at operational objectives, the continuous measurement of bottomhole pressure and temperature, recorded by permanent downhole gauges (PDG), finds vast applicability in reservoir management. It contributes to the monitoring of well performance and makes it possible to estimate reservoir parameters over the long term. Notwithstanding its unquestionable value, however, PDG data is characterized by a large noise content, and the presence of outliers among valid measurements is a major problem as well. In this work, the initial treatment of PDG signals is addressed, based on curve smoothing, self-organizing maps and the discrete wavelet transform. Additionally, a system coupling fuzzy clustering with feed-forward neural networks is proposed for transient detection. The results obtained were quite satisfactory for offshore wells and met practical requirements for use.
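As a sketch of what this initial treatment involves, the following combines rolling-median outlier rejection with moving-average smoothing. It is a deliberately simple stand-in for the SOM- and wavelet-based treatment actually used in the work; the window size, threshold and synthetic pressure series are assumptions.

```python
import numpy as np

def despike_and_smooth(signal, window=11, k=3.0):
    """Initial treatment of a noisy PDG pressure series: points deviating
    more than k robust standard deviations from the rolling median are
    treated as outliers and replaced by that median, then a moving
    average smooths the remaining noise."""
    pad = window // 2
    padded = np.pad(signal, pad, mode="edge")
    med = np.array([np.median(padded[i:i + window]) for i in range(len(signal))])
    resid = signal - med
    mad = np.median(np.abs(resid)) * 1.4826 + 1e-12   # robust sigma estimate
    clean = np.where(np.abs(resid) > k * mad, med, signal)
    kern = np.ones(window) / window
    return np.convolve(np.pad(clean, pad, mode="edge"), kern, mode="valid")

rng = np.random.default_rng(2)
p = 250.0 + 0.2 * rng.standard_normal(500)   # ~250 bar with sensor noise
p[100] = 400.0                                # a single outlier spike
out = despike_and_smooth(p)
```

The robust (median/MAD) statistics matter here: a plain mean-and-variance filter would let the outlier inflate its own rejection threshold.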

Relevância: 20.00%

Resumo:

With the worldwide growth of energy consumption, conventional reservoirs, those of so-called easy exploration and production, no longer meet global energy demand. This has led many researchers to develop projects addressing these needs, and companies in the oil sector have invested in techniques that help locate and drill wells. One of the techniques employed in oil exploration is Reverse Time Migration (RTM), a seismic imaging method that produces excellent images of the subsurface. Its algorithm is based on computing solutions of the wave equation, and RTM is considered one of the most advanced seismic imaging techniques. The economic value of the oil reserves that require RTM to be located is very high, which makes the development of these algorithms a competitive differentiator for seismic processing companies. However, RTM demands great computational power, which still hinders its practical success. The objective of this work is to explore the implementation of this algorithm on unconventional architectures, specifically GPUs using CUDA, analyzing the difficulties of its development as well as the performance of the sequential and parallel versions of the algorithm.
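The computational core of RTM is the repeated finite-difference time step of the wave equation, run forward for the source field and time-reversed for the receiver field. A minimal 2D second-order version is sketched below in numpy (the grid, velocity and source are illustrative); each interior grid point updates independently, which is exactly what maps onto one CUDA thread per point in the GPU version.

```python
import numpy as np

def wave_step(p_prev, p_curr, vel, dt, dx):
    """One second-order finite-difference time step of the 2D acoustic
    wave equation: p_next = 2*p - p_prev + (v*dt)^2 * laplacian(p).
    np.roll gives periodic boundaries, fine for a short demonstration."""
    lap = (np.roll(p_curr, 1, 0) + np.roll(p_curr, -1, 0) +
           np.roll(p_curr, 1, 1) + np.roll(p_curr, -1, 1) - 4 * p_curr) / dx**2
    return 2 * p_curr - p_prev + (vel * dt) ** 2 * lap

n, dx, dt = 128, 10.0, 1e-3
vel = np.full((n, n), 2000.0)        # homogeneous 2 km/s medium
# CFL number v*dt/dx = 0.2, within the 1/sqrt(2) stability limit in 2D.
p_prev = np.zeros((n, n))
p_curr = np.zeros((n, n))
p_curr[n // 2, n // 2] = 1.0         # impulsive source at the center
for _ in range(50):
    p_prev, p_curr = p_curr, wave_step(p_prev, p_curr, vel, dt, dx)
```

Because this stencil touches only a point and its four neighbors, the GPU implementation is memory-bandwidth bound, and most of the CUDA-specific difficulty discussed in the work (shared-memory tiling, halo exchange) is about feeding this update efficiently.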