Abstract:
Modern wireless systems employ adaptive techniques to provide high throughput while meeting coverage, Quality of Service (QoS) and capacity requirements. An alternative to further enhance the data rate is to apply cognitive radio concepts, where a system is able to exploit unused spectrum in existing licensed bands by sensing the spectrum and opportunistically accessing unused portions. Techniques like Automatic Modulation Classification (AMC) can be helpful, or even vital, in such scenarios. Usually, AMC implementations rely on some form of signal pre-processing, which may introduce a high computational cost or make assumptions about the received signal that may not hold (e.g., Gaussianity of the noise). This work proposes a new method to perform AMC using a similarity measure from the Information Theoretic Learning (ITL) framework, known as the correntropy coefficient. It extracts similarity measurements over a pair of random processes using higher-order statistics, yielding better similarity estimates than, e.g., the correlation coefficient. Experiments carried out by means of computer simulation show that the proposed technique achieves a high success rate in the classification of digital modulations, even in the presence of additive white Gaussian noise (AWGN).
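As an illustration, here is a minimal sketch of how a sample correntropy coefficient can be estimated with a Gaussian kernel; the kernel bandwidth sigma and the template-matching use suggested in the final comment are assumptions for illustration, not the thesis's exact classifier.

```python
import numpy as np

def gaussian_kernel(d, sigma):
    """Gaussian kernel evaluated on differences d."""
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))

def centered_correntropy(x, y, sigma):
    """Sample estimate of the centered correntropy U(X, Y)."""
    v = np.mean(gaussian_kernel(x - y, sigma))                       # E[k(X - Y)]
    cip = np.mean(gaussian_kernel(x[:, None] - y[None, :], sigma))   # E_X E_Y [k(X - Y)]
    return v - cip

def correntropy_coefficient(x, y, sigma=1.0):
    """Normalized similarity in [-1, 1], analogous to the correlation coefficient
    but sensitive to higher-order statistics through the kernel."""
    uxy = centered_correntropy(x, y, sigma)
    return uxy / np.sqrt(centered_correntropy(x, x, sigma) *
                         centered_correntropy(y, y, sigma))

# Example use for AMC-style pattern matching: compare the received signal against
# candidate modulation templates and choose the one with the highest coefficient.
```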
Abstract:
This project proposes the development and implementation of a controller for the cooling system of a thermal plasma inductive torch. The process is based on measuring the temperature with a sensor in the cooling system. The resulting signal is fed to an analog input of a PIC-family microcontroller, which uses fuzzy logic concepts to control the speed of a pump motor. The pump is responsible for decreasing or increasing the flow of water circulating through the coil, the torch body and the mounting flange, keeping them at the desired temperature. The pump speed is controlled by a frequency inverter, and the microcontroller also switches on a fan if the reference temperature is exceeded. The initial proposal was to control the temperature of the coil of an inductive plasma torch, but with some adaptations it was also possible to apply the controller to the torch body. This torch will be used in a plant for the treatment of industrial waste and petrochemical effluents. The proposed control aims to guarantee the physical conditions required by the plasma torch, keeping the water temperature at a level that allows cooling without compromising the efficiency of the system. The project uses an inductively coupled plasma torch (ICPT), which has the advantage of having no internal metallic electrodes to be eroded by the plasma jet, avoiding possible contamination, and also allows energy recovery through cogeneration. The development of plasma technology in the waste-treatment industry has been yielding good results, and applications of this technology have become increasingly important because, in many cases, they reduce waste production and energy consumption in several industrial processes.
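A minimal sketch of a Mamdani-style fuzzy mapping from temperature error to pump speed, of the kind the project describes; the membership breakpoints and the rule base are illustrative assumptions, not the actual controller tuning.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def pump_speed(temp_error):
    """Map temperature error (measured minus setpoint, in degrees C) to pump speed (%)."""
    e = float(np.clip(temp_error, -15.0, 15.0))   # keep the error inside the universe
    # Fuzzify the error into three hypothetical sets.
    cold, ok, hot = tri(e, -20, -10, 0), tri(e, -5, 0, 5), tri(e, 0, 10, 20)
    # Rule base: cold -> slow pump (20 %), ok -> medium (50 %), hot -> fast (90 %).
    w = np.array([cold, ok, hot])
    speeds = np.array([20.0, 50.0, 90.0])
    return float(w @ speeds / w.sum())            # weighted-average defuzzification

print(pump_speed(8.0))   # hotter than the setpoint -> pump speeds up (prints 90.0)
```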
Abstract:
Nowadays, optical fiber is one of the most widely used communication media, mainly because the data transmission rates of such systems exceed those of all other means of digital communication. Despite this great advantage, some problems prevent full utilization of the optical channel: as transmission speed and the distances involved increase, the data are subjected to nonlinear intersymbol interference caused by dispersion phenomena in the fiber. Adaptive equalizers can be used to solve this problem: they compensate for the non-ideal response of the channel in order to restore the transmitted signal. This work proposes an equalizer based on artificial neural networks and evaluates its performance in optical communication systems. The proposal is validated through a simulated optical channel and comparison with other adaptive equalization techniques.
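As a sketch of the idea, the following trains a one-hidden-layer neural network to map a sliding window of received samples to the transmitted symbol, by gradient descent on the squared error over a known training sequence. The toy channel, window length, layer size and learning rate are all illustrative assumptions, not the thesis's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
W, H, lr = 7, 16, 0.01               # window length, hidden units, learning rate

s = np.sign(rng.standard_normal(1000))            # known BPSK training symbols
# Toy dispersive nonlinear channel standing in for the fiber response.
lin = 0.8 * s + 0.4 * np.roll(s, 1)
x = lin - 0.2 * lin ** 3 + 0.05 * rng.standard_normal(s.size)

W1 = rng.standard_normal((H, W)) * 0.1            # input-to-hidden weights
b1 = np.zeros(H)
w2 = rng.standard_normal(H) * 0.1                 # hidden-to-output weights

for n in range(W, len(x)):
    u = x[n - W:n]                  # sliding window of received samples
    h = np.tanh(W1 @ u + b1)        # hidden layer
    y = w2 @ h                      # equalizer output
    e = y - s[n]                    # error against the known symbol
    g = e * w2 * (1 - h ** 2)       # backpropagate through the tanh layer
    w2 -= lr * e * h
    W1 -= lr * np.outer(g, u)
    b1 -= lr * g
```

After training, slicing `np.sign(w2 @ np.tanh(W1 @ u + b1))` over windows of new received samples gives the equalized symbol decisions.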
Abstract:
The increasing demand for high-performance wireless communication systems has shown the inefficiency of the current model of fixed allocation of the radio spectrum. In this context, cognitive radio appears as a more efficient alternative, providing opportunistic spectrum access with the maximum possible bandwidth. To meet these requirements, the transmitter must identify opportunities for transmission and the receiver must recognize the parameters defined for the communication signal. Techniques based on cyclostationary analysis can be applied to both spectrum sensing and modulation classification, even in low signal-to-noise ratio (SNR) environments. However, despite this robustness, one of the main disadvantages of cyclostationary analysis is the high computational cost of calculating its functions. This work proposes efficient architectures for obtaining cyclostationary features to be employed in both spectrum sensing and automatic modulation classification (AMC). In the context of spectrum sensing, a parallelized algorithm for extracting cyclostationary features of communication signals is presented, and the performance of this parallelized feature extractor is evaluated through speedup and parallel efficiency metrics. The spectrum sensing architecture is analyzed for several configurations of false-alarm probability, SNR levels and observation times for BPSK and QPSK modulations. In the context of AMC, the reduced alpha-profile is proposed as a cyclostationary signature calculated over a reduced set of cyclic frequencies. This signature is validated by a modulation classification architecture based on pattern matching. The AMC architecture is investigated in terms of correct classification rates for AM, BPSK, QPSK, MSK and FSK modulations, considering several scenarios of observation length and SNR levels. The numerical performance results obtained in this work show the efficiency of the proposed architectures.
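For concreteness, a minimal sketch of the kind of cyclostationary feature involved: the cyclic autocorrelation at a cyclic frequency alpha, with an alpha-profile taken as its peak magnitude over lags. The lag range and normalization are illustrative assumptions, not the thesis's exact reduced alpha-profile.

```python
import numpy as np

def cyclic_autocorrelation(x, alpha, tau):
    """R_x^alpha(tau) ~ (1/N) * sum_n x[n+tau] * conj(x[n]) * exp(-j*2*pi*alpha*n),
    with alpha in cycles/sample and tau a non-negative integer lag."""
    N = len(x) - tau
    n = np.arange(N)
    return np.mean(x[tau:tau + N] * np.conj(x[:N]) * np.exp(-2j * np.pi * alpha * n))

def alpha_profile(x, alphas, max_lag=16):
    """Peak cyclic-autocorrelation magnitude over lags, for each cyclic frequency."""
    return np.array([max(abs(cyclic_autocorrelation(x, a, t)) for t in range(max_lag))
                     for a in alphas])
```

For a BPSK signal, such a profile peaks at cyclic frequencies tied to the symbol rate and to twice the carrier frequency, which is what makes it usable as a signature for pattern-matching AMC.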
Abstract:
In recent years there has been significant growth in technologies that modify implant surfaces, reducing healing time and allowing their successful use in areas of low bone density. One of the most widely used techniques is plasma nitriding, applied with excellent results to titanium and its alloys, most frequently in the manufacture of hip, ankle and shoulder implants. However, its use in dental implants is very limited due to the high process temperatures (between 700 °C and 800 °C), which cause distortions in these geometrically complex and highly precise components. The aim of the present study is to assess the osseointegration and mechanical strength of grade II titanium samples nitrided in the hollow cathode discharge configuration. Moreover, new formulations are proposed to determine the optimum structural topology of the dental implant under study, in order to perfect its shape and make it efficient, competitive and highly defined. In the nitriding process, the samples were treated at a temperature of 450 °C and a pressure of 150 Pa for 1 hour. This condition was selected because it yielded the best wettability results in previous studies, in which different pressure, temperature and time conditions were systematically investigated. The samples were characterized by X-ray diffraction, scanning electron microscopy, and roughness, microhardness and wettability measurements. Biomechanical fatigue tests were then conducted. Finally, a formulation using the three-dimensional structural topology optimization method was proposed, in conjunction with an h-adaptive refinement process. The results showed that plasma nitriding using the hollow cathode discharge technique changed the surface texture of the test specimens, increasing surface roughness, wettability and microhardness compared to the untreated sample. In the biomechanical fatigue test, the treated implant showed no failures after five million cycles at a maximum fatigue load of 84.46 N. The topology optimization process produced well-defined optimized layouts of the dental implant, with a clear distribution of material and defined edges.
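For reference, the standard density-based (SIMP) statement of the structural topology optimization problem alluded to; this is the textbook compliance-minimization formulation, not necessarily the authors' new one:

```latex
\begin{aligned}
\min_{\boldsymbol{\rho}} \quad & c(\boldsymbol{\rho}) = \mathbf{u}^{\top}\mathbf{K}(\boldsymbol{\rho})\,\mathbf{u} \\
\text{s.t.} \quad & \mathbf{K}(\boldsymbol{\rho})\,\mathbf{u} = \mathbf{f}, \\
& \textstyle\sum_{e} \rho_e v_e \le V_{\max}, \qquad 0 < \rho_{\min} \le \rho_e \le 1, \\
& E_e(\rho_e) = \rho_e^{p}\,E_0 \quad (p > 1).
\end{aligned}
```

Here the rho_e are element densities, K is the stiffness matrix assembled from the penalized moduli E_e (the exponent p > 1 pushes densities toward 0 or 1), and V_max is the volume budget; the h-adaptive refinement mentioned in the abstract re-meshes where the layout boundary forms.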
Abstract:
The formation of paraffin deposits is common in the petroleum industry during the production, transport and treatment stages. It happens due to changes in thermodynamic variables that alter the solubility of the alkane fractions present in petroleum. Paraffin deposition can cause significant and growing production losses, even blocking the flow and halting production. This process is associated with the liquid-solid phase equilibrium and with the stages of nucleation, growth and agglomeration of the crystals, and it is a function of the intrinsic characteristics of the oil and of the temperature and pressure variations during production. Several preventive and corrective methods are used to control paraffin crystallization, such as chemical inhibitors, hot solvent injection, thermochemical reactions and mechanical removal; for offshore exploration, however, this expensive problem requires further investigation. Many studies have addressed the Wax Appearance Temperature (WAT) of paraffins, since the crystals formed are responsible for changes in the rheological properties of the oil, causing many operational problems. From the WAT of a system it is possible to state whether an oil tends to form organic deposits, making it possible to foresee and prevent wax crystallization problems. The solvent n-paraffin has been widely used as a drilling fluid, raising production costs when it is used to remove paraffin deposits, so an operational substitute is needed. This study aims to determine the WAT of paraffins and the effect of additives on its reduction, developing paraffin/solvent/surfactant systems that promote wax solubilization. In the first stage of the experiments, crystallization temperatures were established for various paraffin concentrations and different solvents. In the second stage, using a methodology based on the variation of a photoelectric signal, the crystallization temperatures of the systems were determined and the effect of additives on lowering the WAT was evaluated. The experimental results are expressed as a function of the variations of the photoelectric signal during controlled cooling, innovating and validating this new methodology for determining the WAT, which is relatively simple compared with other methods that involve specific, high-cost equipment. From the derivative curves of the results, the critical stages of crystal growth and agglomeration, which correspond to the saturation of the system, were also identified, indicating flow difficulties due to the increase in density.
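A minimal sketch of how a WAT could be read off a photoelectric signal recorded during controlled cooling: detect the temperature at which the signal first leaves its high-temperature baseline by more than a noise threshold. The baseline window and the threshold factor are illustrative assumptions, not the study's calibrated procedure.

```python
import numpy as np

def wat_from_photoelectric(temps, signal, baseline_n=50, k=5.0):
    """temps: temperatures in decreasing order during cooling; signal: detector output.
    Returns the first temperature where the signal leaves the baseline band."""
    base = signal[:baseline_n]                 # hottest samples: no crystals yet
    mu, sd = base.mean(), base.std()
    outside = np.abs(signal - mu) > k * sd     # deviation beyond k standard deviations
    return temps[np.argmax(outside)] if outside.any() else None
```

The derivative of the signal with respect to temperature can then be inspected for the later growth and agglomeration stages the abstract mentions.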
Abstract:
The WAT is the temperature at which wax crystals begin to appear: at this temperature the first crystals are formed as paraffin/solvent systems are cooled. Paraffins are mixtures of saturated hydrocarbons of high molecular weight. Removing them from wells and production lines imposes a surcharge on the produced oil, so solubilizing these deposits, which form due to thermodynamic changes, has been a constant challenge for oil exploration companies. This study combines paraffin solubilization by microemulsion systems, the determination of the WAT of paraffin/solvent systems and the performance of surfactants in reducing crystallization. Two methods were used, a rheological one and one based on a photoelectric signal; the latter, developed to improve the data obtained thanks to the sensitivity of the equipment used, was validated. Methods developed to describe wax precipitation often agree poorly with experimental data, tending to underestimate the amount of wax at temperatures below the cloud point. The Won method and the ideal solution method were applied to the WAT data obtained in solvent systems, which were best represented by the second iteration of the Won method using naphtha, hexane and LCO as solvents. The WAT values obtained by the photoelectric signal occur earlier than those obtained from viscosity, demonstrating the greater sensitivity of the method developed. The ionic surfactant reduced the viscosity of the solvent systems, acting by modifying the crystalline structure and, consequently, the pour point. The curves show that the experimental WAT data are, in general, closer to the modeling performed with the Won method than to that performed with the ideal solution method, because the latter underestimates the curve predicting the onset crystallization temperature of paraffinic hydrocarbons. This occurs because the temperature actually measured was the crystallization temperature, while the method is based on the melting temperature.
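For reference, the ideal solution model mentioned predicts wax solubility from the melting properties of each paraffin component; a standard textbook form, stated here as an illustration rather than the study's exact implementation:

```latex
\ln x_i = \frac{\Delta H_i^{f}}{R}\left(\frac{1}{T_i^{f}} - \frac{1}{T}\right)
```

where x_i is the liquid-phase mole fraction of component i, Delta H_i^f its enthalpy of fusion and T_i^f its melting temperature; the WAT is the highest temperature T at which a solid phase first satisfies the equilibrium. Because the model is anchored on melting temperatures, it tends to underestimate the measured crystallization onset, consistent with the discrepancy the abstract reports.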
Abstract:
The present investigation comprises a study of Leonhard Euler and the pentagonal numbers in his article Mirabilibus Proprietatibus Numerorum Pentagonalium (E524). After a brief review of Euler's life and work, we analyze the mathematical concepts covered in that article as well as its historical context. For this purpose, we explain the concept of figurate numbers, showing how they are generated as well as their geometric and algebraic representations. Then we present a brief history of Euler's search for the pentagonal number theorem, based on his correspondence on the subject with Daniel Bernoulli, Nikolaus Bernoulli, Christian Goldbach and Jean Le Rond d'Alembert. At first, Euler states the theorem but admits that he does not know how to prove it. Finally, in a letter to Goldbach in 1750, he presents a demonstration, which is published in E541, along with an alternative proof. The expansion of the concept of pentagonal number is then explained and justified by comparing the geometric and algebraic representations of the new pentagonal numbers with those of the traditional ones. We then explain the pentagonal number theorem, that is, the fact that the infinite product (1-x)(1-x^2)(1-x^3)(1-x^4)(1-x^5)(1-x^6)(1-x^7)... is equal to the infinite series 1 - x - x^2 + x^5 + x^7 - x^{12} - x^{15} + x^{22} + x^{26} - ..., where the exponents are the (expanded) pentagonal numbers g_k = k(3k-1)/2 for k = 1, -1, 2, -2, 3, -3, ..., and the sign of the term x^{g_k} is (-1)^k, so the sign pattern depends on whether the exponent is a traditional or an expanded pentagonal number. We also mention that Euler relates the pentagonal number theorem to other parts of mathematics, such as the concept of partitions, generating functions, the theory of infinite products and the sum of divisors. We end with an explanation of Euler's demonstration of the pentagonal number theorem.
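The identity is easy to check numerically; a small sketch that expands the product as a truncated power series and compares it with the series built from the generalized pentagonal numbers:

```python
import numpy as np

N = 60  # truncation order for the power series in x

# Expand prod_{k=1}^{N} (1 - x^k), discarding terms above x^N.
prod = np.zeros(N + 1, dtype=np.int64)
prod[0] = 1
for k in range(1, N + 1):
    nxt = prod.copy()
    nxt[k:] -= prod[:N + 1 - k]       # multiply current series by (1 - x^k)
    prod = nxt

# Series: 1 + sum over k != 0 of (-1)^k * x^{k(3k-1)/2}.
series = np.zeros(N + 1, dtype=np.int64)
series[0] = 1
for k in range(1, N + 1):
    for kk in (k, -k):                # traditional (k > 0) and expanded (k < 0)
        g = kk * (3 * kk - 1) // 2    # generalized pentagonal number
        if g <= N:
            series[g] += (-1) ** k
print(np.array_equal(prod, series))   # True: the expansions agree up to x^N
```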
Abstract:
The autonomic nervous system (ANS) plays an important role in regulating the physiological processes of the human organism, under both normal and pathological conditions. Among the techniques used to assess it, heart rate variability (HRV) has emerged as a simple, non-invasive measure of autonomic impulses, representing one of the most promising quantitative markers of autonomic balance. HRV describes the oscillations in the intervals between consecutive heartbeats (R-R intervals), as well as the oscillations between consecutive instantaneous heart rates. It is a measure that can be used to assess ANS modulation under physiological conditions, such as wakefulness and sleep, different body positions and physical training, and also under pathological conditions. Changes in HRV patterns provide a sensitive and early indicator of health impairments. High heart rate variability is a sign of good adaptation, characterizing a healthy individual with efficient autonomic mechanisms, whereas low variability is frequently an indicator of abnormal and insufficient ANS adaptation, implying physiological malfunction in the individual. Given its importance as a marker reflecting ANS activity on the sinus node and as a clinical tool to assess and identify health impairments, this article reviews conceptual aspects of HRV, measurement devices, filtering methods, indices used for HRV analysis, limitations of use and clinical applications of HRV.
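For illustration, a minimal sketch of the common time-domain HRV indices computed from a series of R-R intervals (SDNN, RMSSD and pNN50); the input here is a hypothetical array of intervals in milliseconds, not data from the review.

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Time-domain HRV indices from R-R intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)                               # successive differences
    return {
        "SDNN": rr.std(ddof=1),                       # overall variability
        "RMSSD": np.sqrt(np.mean(diffs ** 2)),        # short-term (vagal) variability
        "pNN50": 100.0 * np.mean(np.abs(diffs) > 50), # % successive diffs > 50 ms
    }

print(hrv_time_domain([812, 790, 845, 800, 830, 795, 860]))
```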
Abstract:
Losartan potassium is a non-peptide antihypertensive agent that acts by specifically blocking angiotensin II receptors. This work proposes the validation and application of analytical methods for the quality control of losartan potassium 50 mg in the capsule pharmaceutical form, using direct and first-order derivative spectrophotometry in the UV region. Based on the spectrophotometric characteristics of losartan potassium, a signal at 205 nm in the zero-order spectrum and a signal at 234 nm in the first-derivative spectrum were suitable for quantification, and the results were used to compare the two instrumental techniques. The correlation coefficient (r) between response and losartan potassium concentration was 0.9999 in both cases, over the ranges 3.0-7.0 mg L-1 and 6.0-14.0 mg L-1 for direct and first-order derivative spectrophotometry in aqueous solution, respectively. The methods were applied to the quantification of losartan potassium in capsules obtained from local compounding pharmacies and proved to be efficient, easy to apply and inexpensive. Moreover, they require no polluting reagents and use economically viable equipment.
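A minimal sketch of the data treatment such methods involve: take the first derivative of each absorbance spectrum numerically, read the responses at the chosen wavelengths and fit a linear calibration. The wavelength grid, spectra and concentrations are hypothetical placeholders, not the validated method's data.

```python
import numpy as np

wl = np.arange(190, 320, 1.0)   # wavelength grid in nm (hypothetical)

def calibrate(spectra, conc, wl, target_nm, derivative=False):
    """Linear calibration at a chosen wavelength, optionally on the 1st derivative.
    spectra: (n_standards, len(wl)) absorbances; conc: concentrations in mg/L."""
    resp = np.gradient(spectra, wl, axis=1) if derivative else spectra
    y = resp[:, np.argmin(np.abs(wl - target_nm))]   # response at target wavelength
    slope, intercept = np.polyfit(conc, y, 1)        # least-squares calibration line
    r = np.corrcoef(conc, y)[0, 1]                   # correlation coefficient
    return slope, intercept, r
```

Calling `calibrate(spectra, conc, wl, 205)` and `calibrate(spectra, conc, wl, 234, derivative=True)` mirrors the zero-order and first-derivative quantifications described.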
Abstract:
This dissertation presents an interpretation of the critical considerations of the German philosopher Friedrich Nietzsche on Modernity, especially the criticism of Modernity, of Christian mores and of democracy that he produced in Beyond Good and Evil. Nietzsche attentively analyses the details of Modernity, produces a diagnosis of modern man and discovers the sign of decay. We consider that Nietzsche's criticism of Modernity is directly linked to his criticism of classical metaphysics. We emphasize questions like: what in us aspires to truth? Christian mores: why and what for? What characterizes Modernity? Could it be the appeal to the democratic taste? Is it possible to reinvent Modernity? We stress the relation between the notion of truth, democracy and Christian mores, showing that these mores were also inherited from Socratic culture. We also intend to clarify Nietzsche's proposal of a new way of doing philosophy, one able to surpass the decay that rules in modern European culture. The research concludes by pointing to the "philosophers of the future" who are able, according to Nietzsche, to affirm life beyond the oppositions of metaphysics, beyond good and evil.
Abstract:
In the 20th century, acupuncture spread through the West as a complementary health care practice. This fact has motivated the international scientific community to invest in research seeking to understand why acupuncture works. In this work we statistically compare the voltage fluctuations of bioelectric signals captured on the skin at an acupuncture point (IG 4) and at a nearby non-acupuncture point. The acquisition of these signals was performed using an electronic interface with a computer, based on an instrumentation amplifier designed with specifications adequate for this purpose. From the signals collected from a sample of 30 volunteers, we calculated the main statistics and submitted them to a paired t-test with significance level α = 0.05. For the bioelectric signals we estimated the following parameters: standard deviation, skewness and kurtosis. Moreover, we calculated the autocorrelation function and, fitting it with an exponential curve, observed that the signal decays more rapidly at a non-acupoint than at an acupoint. This fact is indicative of the existence of information in the acupoint.
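A minimal sketch of the autocorrelation analysis described: estimate the normalized autocorrelation of a signal and fit an exponential decay exp(-lag/tau), so that the decay constants of the two skin sites can be compared. The log-linear fitting shortcut and the noise-floor cutoff are illustrative assumptions.

```python
import numpy as np

def autocorr(x, max_lag):
    """Normalized autocorrelation for lags 0..max_lag-1."""
    x = np.asarray(x, float) - np.mean(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(max_lag)]) / np.dot(x, x)

def decay_constant(x, max_lag=100):
    """Fit r(k) ~ exp(-k / tau) by linear regression on log r(k)."""
    r = autocorr(x, max_lag)
    k = np.arange(max_lag)
    mask = r > 0.05                  # keep lags above the noise floor
    slope, _ = np.polyfit(k[mask], np.log(r[mask]), 1)
    return -1.0 / slope              # larger tau means slower decay (acupoint-like)
```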
Abstract:
One of the main goals of the CoRoT Natal Team is the determination of rotation periods for thousands of stars, a fundamental parameter for the study of stellar evolutionary histories. In order to estimate stellar rotation periods and to understand the associated uncertainties resulting, for example, from discontinuities in the curves and/or low signal-to-noise ratio, we have compared three different methods for light curve treatment. These methods were applied to many light curves with different characteristics. First, a visual analysis was undertaken for each light curve, giving a general perspective on the different phenomena reflected in the curves. The results obtained by this method regarding the rotation period of the star, the presence of spots, or the nature of the star (binary system or other) were then compared with those obtained by two more precise methods: the CLEANest method, based on the DCDFT (Date Compensated Discrete Fourier Transform), and the wavelet method, based on the Wavelet Transform. Our results show that all three methods have similar levels of accuracy and can complement each other. Nevertheless, the wavelet method gives more information about the star through the wavelet map, which shows how the frequencies in the signal vary over time. Finally, we discuss the limitations of these methods, their efficiency in providing information about the star, and the development of tools to integrate the different methods into a single analysis.
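As a sketch of this kind of period analysis (the DCDFT behind CLEANest is designed for unevenly sampled light curves), the following estimates a rotation period with the closely related Lomb-Scargle periodogram from astropy, standing in for the thesis's actual implementations; the light curve is simulated.

```python
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 60, 400))     # unevenly sampled observation times (days)
p_true = 7.3                             # simulated rotation period (days)
y = 0.01 * np.sin(2 * np.pi * t / p_true) + 0.002 * rng.standard_normal(t.size)

freq, power = LombScargle(t, y).autopower()   # periodogram on an automatic grid
print(1.0 / freq[np.argmax(power)])           # best period, close to 7.3 days
```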
Abstract:
Oil prospecting is one of the most complex and important activities of the oil industry. Direct prospecting methods, like drilling well logs, are very expensive; in consequence, indirect methods are preferred. Among the indirect prospecting techniques, seismic imaging is particularly relevant. The seismic method is based on artificially generated seismic waves that travel through the geologic medium, undergoing diffraction and reflection, and return to the surface, where they are recorded and analyzed to construct seismograms. However, the seismogram contains not only actual geologic information but also noise, and one of the main components of this noise is the ground roll. Noise attenuation is essential for a good geologic interpretation of the seismogram. It is common to study seismograms using time-frequency transformations that map the seismic signal into a frequency space where it is easier to remove or attenuate noise; the data are then reconstructed in the original space in such a way that geologic structures are shown in more detail. The curvelet transform is a new and effective spectral transformation that has been used in the analysis of complex data. In this work, we employ the curvelet transform to represent geologic data using basis functions that are directional in space. This particular basis represents two-dimensional objects with contours and lines more effectively. The curvelet analysis maps real space into frequency scales and angular sectors in such a way that we can distinguish in detail the subspaces where the noise lies and remove the coefficients corresponding to the undesired data. In this work we develop and apply this denoising analysis to remove the ground roll from seismograms. We apply the technique to an artificial seismogram and to a real one; in both cases we obtain good noise attenuation.
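Curvelet implementations vary between libraries, so as a sketch of the shared principle (transform, threshold small coefficients, reconstruct) the following uses a 2-D wavelet transform from PyWavelets as a plainly named stand-in for the curvelet transform; the wavelet choice, decomposition level and threshold rule are illustrative assumptions.

```python
import numpy as np
import pywt

def denoise_transform_domain(section, wavelet="db4", level=3, k=3.0):
    """Attenuate noise in a 2-D seismic section by soft-thresholding small
    transform-domain coefficients, the same principle the curvelet workflow uses."""
    coeffs = pywt.wavedec2(section, wavelet, level=level)
    out = [coeffs[0]]                               # keep the coarse approximation
    for detail in coeffs[1:]:
        out.append(tuple(
            pywt.threshold(c, k * np.median(np.abs(c)) / 0.6745, mode="soft")
            for c in detail                          # per-subband MAD noise estimate
        ))
    return pywt.waverec2(out, wavelet)
```

In the curvelet setting the same thresholding is applied per scale and per angular sector, which is what lets the directional, low-velocity ground roll be separated from reflections.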