914 results for MatLab Simulink
Abstract:
The main objective of this work is to study the effect of antenna gain on the path loss of a link in the 2.4 GHz ISM (Industrial, Scientific and Medical) frequency band. To this end, 14 antennas of 7 different types were used (2 monopole, 2 collinear, 2 grid, 2 Yagi, 2 panel, 2 parabolic and 2 helical), with gains ranging from 1.3 to 23 dBi. First, the fundamental parameters of the antennas, namely the radiation pattern and the gain, were obtained. A system was developed that measures the radiation pattern automatically by means of a stepper motor; an application developed in the Matlab environment controls the motor and draws the corresponding pattern in real time. For the signal-propagation measurements, a system was developed that connects the receiving antenna to a laptop through a spectrum analyzer. As the measurements are taken, the signal attenuation as a function of distance is computed and displayed graphically on the computer, together with the expected free-space attenuation; for this, an application was implemented in the Matlab environment to process and display the data. With the antenna parameters known and the measurement system in place, signal-propagation measurements were carried out in different environments: an unobstructed environment, an urban environment, a forest environment characterized by trunks in the propagation path, and a forest environment characterized by foliage in the propagation path. In each environment, measurements were taken for 37 antenna combinations, with combined gains ranging from 2.6 to 46 dBi. The measurements were made with the transmitting antenna in a fixed position and the receiving antenna moved away in 5-metre steps, up to 150 metres, along the line of maximum radiation between the two antennas. In addition, for the unobstructed and forest environments, measurements were taken for two angles relative to that line, namely 30° and 330°. The results show an increase in attenuation as the environments become more obstructed: in the unobstructed environment the attenuation approaches the free-space curve; in the urban environment the attenuation increased slightly, though not significantly when compared with free-space attenuation; finally, a large increase in attenuation was observed in the forest environments, both at trunk and at foliage level, the latter being higher. Regarding the antennas, attenuation was found to increase with antenna gain, most visibly in the more obstructed (forest) environments. For the situation in which the antennas are offset by 30° and 330°, this increase proved more significant for the directive antennas (higher gains).
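As a rough illustration of the free-space reference curve mentioned in this abstract, a minimal Matlab sketch might look as follows (the 2.4 GHz frequency and the 5 m steps up to 150 m come from the abstract; everything else is an assumption):

```matlab
% Free-space path loss (Friis relation) at 2.4 GHz, as a reference curve.
f  = 2.4e9;                       % carrier frequency [Hz] (ISM band)
c  = 3e8;                         % speed of light [m/s]
d  = 5:5:150;                     % receiver distances [m], 5 m steps as in the measurements
Lfs = 20*log10(4*pi*d*f/c);       % free-space path loss [dB]
plot(d, Lfs); grid on;
xlabel('Distance [m]'); ylabel('Attenuation [dB]');
title('Expected free-space attenuation, 2.4 GHz');
```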
Abstract:
With the constant evolution of technology, the science of measurement, or Metrology, also requires more exact and reliable measurement processes, sometimes automated, so that more accurate information about a given physical quantity can be provided. Among this information, the measurement uncertainty stands out: it gives the user an estimate of the final value of the measured physical quantity, but with more complex measurement processes it becomes harder to obtain, sometimes requiring computational methods. With this in mind, this dissertation addresses the automation of measurement processes, as well as the computation, through computational methods, of measurement uncertainties that reflect the nature of the measured physical quantity. To automate a measurement process, specifically the calibration of pressure gauges, LabView was used to create a virtual instrument that allows the user to carry out the calibration procedure in a simple and intuitive way. Another virtual instrument was also built to acquire data simultaneously from two different devices. As for the measurement uncertainties, the Monte Carlo Method, implemented in MATLAB and Excel, was used to obtain their values for the calibration of pressure gauges, of a relative-humidity generating chamber and of a dew-point hygrometer; the last two have complex mathematical models, which make the analytical approach more complex and time-consuming. Given the results obtained, it can be stated that virtual instrumentation allows several measurement processes to be adapted in a simple way, making them more efficient while reducing operator error. It can also be observed that the use of computational methods, in this case the Monte Carlo Method, to study measurement uncertainties is an asset compared to the GUM, allowing a fast and reliable analysis of complex mathematical models.
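A minimal Matlab sketch of the Monte Carlo method for uncertainty evaluation, in the spirit described above, could be as follows (the measurement model and input distributions are purely illustrative assumptions, not the dissertation's calibration models):

```matlab
% Monte Carlo uncertainty propagation (GUM Supplement 1 style) for a
% generic model y = f(x1, x2). Model and inputs are illustrative.
M  = 2e5;                            % number of Monte Carlo trials
x1 = 100 + 0.05*randn(M,1);          % input 1: Gaussian, u(x1) = 0.05
x2 = 20  + 0.2*(rand(M,1) - 0.5);    % input 2: rectangular, half-width 0.1
y  = x1 .* (1 + 1e-4*(x2 - 20));     % measurement model (e.g. temperature correction)
ys   = sort(y);
yhat = mean(y);                      % estimate of the measurand
u_y  = std(y);                       % standard uncertainty
ci   = [ys(round(0.025*M)), ys(round(0.975*M))];  % 95 % coverage interval
fprintf('y = %.4f, u(y) = %.5f, 95%% CI = [%.4f, %.4f]\n', yhat, u_y, ci);
```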
Abstract:
The main objective of this project was to propose a data-acquisition system for a remote application. Several applications require the collection of remote information, for which a dedicated communication system must be established. This work sought the best solutions to the problem at hand by testing and evaluating a communication system. To test the system, a prototype for monitoring environmental parameters was developed, periodically measuring temperature, humidity, luminosity and atmospheric pressure. Communication between sensors used XBee radios with the Zigbee protocol. A coordination node was also developed, whose main purpose is to manage and maintain the whole data-acquisition system: this prototype receives, validates and stores on an SD card all the data coming from the sensor node, and periodically sends the data to a server with Internet access. Data acquisition in remote applications usually takes place in areas without mains electricity; therefore, considering the reduced capacity of energy-storage systems, solar-powered supply systems were developed with a focus on the lowest possible consumption. For long-distance communication, a radio-relay (microwave) link was implemented and tested. Propagation was studied using the licence-free 2.4 GHz band. A point-to-point link was designed and the coverage areas were validated, which requires estimating the signal at the points of interest; the interference zones and the zones where the signal is weak or at its limit were identified. The development of this communication system was supported by the analysis and evaluation of propagation models, together with software written on the Matlab platform. Finally, conclusions and some suggestions for future work were presented.
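For the 2.4 GHz link design described above, estimating the signal at points of interest reduces to a link budget; a hedged Matlab sketch might be (transmit power, antenna gains and receiver sensitivity are illustrative assumptions, not the project's values):

```matlab
% Point-to-point link budget at 2.4 GHz (values are illustrative assumptions).
Ptx  = 18;                          % transmit power [dBm] (XBee-PRO class)
Gtx  = 12; Grx = 12;                % antenna gains [dBi]
Sens = -100;                        % receiver sensitivity [dBm]
d    = logspace(1, 4, 200);         % distance [m], 10 m to 10 km
f    = 2.4e9; c = 3e8;
Lfs  = 20*log10(4*pi*d*f/c);        % free-space path loss [dB]
Prx  = Ptx + Gtx + Grx - Lfs;       % received power [dBm]
margin = Prx - Sens;                % fade margin [dB]
dmax = max(d(margin > 10));         % farthest distance keeping >10 dB margin
fprintf('Max range with 10 dB margin: %.0f m\n', dmax);
```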
Abstract:
The planet's energy consumption is currently based on fossil fuels, which are responsible for adverse effects on the environment. Renewables offer solutions for this scenario, but must face issues related to the capacity of the power supply. Offshore wind energy is emerging as a promising alternative: winds over the oceans are faster and more stable, but their variability may cause inconvenient fluctuations in electric power generation. To reduce this, a combination of geographically distributed wind farms has been proposed: the greater the distance between them, the lower the correlation between wind speeds, increasing the likelihood that together they yield a more stable power system with smaller fluctuations in generation. The efficient use of the production capacity of a wind park, however, depends on its distribution in the marine environment. The objective of this research was to analyze the optimal allocation of offshore wind farms on the east coast of the U.S. using Modern Portfolio Theory. The theory was applied so that the process of building offshore wind-energy portfolios accounts for the intermittency of the wind, through calculations of the return and risk of the wind farms' production. The research used 25,934 observations of energy produced by 11 hypothetical offshore wind farms, based on the simulated installation of one ocean turbine with a capacity of 5 MW. The data have hourly resolution and cover the period from January 1, 1998 to December 31, 2002. Using Matlab® software, six minimum-variance portfolios were calculated, each for a distinct period of time. Given the uneven variability of the wind over time, four rebalancing strategies were set up to evaluate the performance of the related portfolios, which made it possible to identify the one most beneficial to the stability of offshore wind-energy production. The results showed that the production of wind energy for 1998, 1999, 2000 and 2001 should be weighted by the portfolio weights calculated for the same periods, respectively; energy data for 2002 should use the weights derived from the portfolio calculated in the previous period; and, finally, the production of wind energy over 1998-2002 should also be weighted by 1/11. It follows that the portfolios found failed to show reduced levels of variability when compared to the individual production of the hypothetical offshore wind farms.
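A minimum-variance portfolio of the kind described can be obtained in closed form from the covariance of the farms' production returns; a Matlab sketch under illustrative data could be:

```matlab
% Minimum-variance portfolio over 11 offshore wind farms (Markowitz).
% P is an (hours x 11) matrix of hourly energy production; the data and
% the "returns" convention below are illustrative assumptions.
P = 0.5 + rand(8760, 11);           % placeholder for the hourly production data
R = diff(P) ./ P(1:end-1,:);        % simple returns of production
Sigma = cov(R);                     % covariance matrix of the returns
one = ones(11, 1);
w = (Sigma \ one) / (one' * (Sigma \ one));   % closed-form minimum-variance weights
fprintf('Portfolio variance: %.6f\n', w' * Sigma * w);
disp(w');                           % weights sum to 1 (short positions allowed here)
```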
Abstract:
The study of complex systems has become a prestigious, although relatively young, area of science. Its importance is demonstrated by the diversity of applications that such studies have already provided to fields such as biology, economics and climatology. In physics, the complex-systems approach is creating paradigms that markedly influence new methods, bringing to Statistical Physics macroscopic-level problems no longer restricted to classical studies such as those of thermodynamics. The present work compares and statistically verifies clusters of well-log profiles - sonic (DT), gamma ray (GR), induction (ILD), neutron (NPHI) and density (RHOB) - physical quantities measured during exploratory drilling that are of fundamental importance to locate, identify and characterize oil reservoirs. The software packages Statistica, Matlab R2006a, Origin 6.1 and Fortran were used for the comparison and verification of the log data of oil wells from the Namorado School Field, provided by the ANP (National Petroleum Agency). It was possible to demonstrate the importance of the DFA method, which proved quite satisfactory in this work, leading to the conclusion that the H (Hurst exponent) data produce spatially more clustered patterns; we therefore find that it is possible to identify spatial patterns using the Hurst coefficient. The profiles of 56 wells confirmed the existence of spatial patterns in the Hurst exponents, i.e., the parameter B. The profiles assessed do not directly catalog the geological lithology, but they reveal a non-random spatial distribution.
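A compact Matlab sketch of the DFA method cited above might look as follows (the input series is a placeholder, and the window sizes and linear detrending are assumptions):

```matlab
% Detrended Fluctuation Analysis (DFA) of a well-log series x (sketch).
x = cumsum(randn(4096,1));          % placeholder for a log profile (e.g. GR)
y = cumsum(x - mean(x));            % integrated (profile) series
scales = round(2.^(4:0.5:9));       % window sizes
F = zeros(size(scales));
for k = 1:numel(scales)
    s = scales(k);
    nwin = floor(numel(y)/s);
    rms2 = zeros(nwin,1);
    for w = 1:nwin
        seg = y((w-1)*s+1 : w*s);
        t   = (1:s)';
        p   = polyfit(t, seg, 1);          % local linear trend
        rms2(w) = mean((seg - polyval(p, t)).^2);
    end
    F(k) = sqrt(mean(rms2));               % fluctuation at scale s
end
alpha = polyfit(log(scales), log(F), 1);   % slope ~ scaling (Hurst-type) exponent
fprintf('DFA exponent: %.3f\n', alpha(1));
```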
Abstract:
The development of oil-well drilling requires additional care, mainly when drilling offshore in ultra-deep water with low overburden pressure gradients, which cause low fracture gradients and, consequently, hinder the drilling by reducing the operational window. To minimize, in the well-planning phase, the difficulties faced while drilling in those scenarios, indirect models are used to estimate the fracture gradient, predicting approximate values for leak-off tests. These models generate geopressure curves that allow detailed analysis of the pressure behavior along the whole well. Most of them are based on the Terzaghi equation, differing only in how the rock-stress coefficient is determined. This work proposes an alternative method for predicting the fracture pressure gradient based on a geometric correlation that relates the pressure gradients proportionally at a given depth and extrapolates the relation to the whole well depth, i.e., these parameters are assumed to vary in a fixed proportion. The model applies analytical proportion segments corresponding to the differential pressure related to the rock stress. The study shows that the proposed analytical proportion segments reach fracture-gradient values in good agreement with the leak-off tests available for the field area. The results were compared with twelve different indirect models for fracture-pressure-gradient prediction based on the compaction effect; for this, software was developed in the Matlab language. The comparison was also made varying the water depth from zero (onshore wells) to 1500 meters, and the leak-off tests were used to compare the different methods, including the one proposed in this work. The proposed method gives good results in the error analysis compared with the other methods and, due to its simplicity, justifies its possible application.
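A Terzaghi-type fracture-gradient calculation of the kind the indirect models use could be sketched in Matlab as follows (this is the classic Hubbert-and-Willis form with assumed gradient profiles, not the geometric correlation proposed in the work):

```matlab
% Terzaghi-type fracture-gradient prediction (Hubbert & Willis form, sketch).
% Gradients in lb/gal (ppg); depth profile and K are illustrative assumptions.
D   = 500:100:5000;                  % vertical depth [m]
Gov = 16 + 2*(1 - exp(-D/2000));     % overburden gradient [ppg] (compaction trend)
Gp  = 9.0 * ones(size(D));           % pore-pressure gradient [ppg] (normal pressure)
K   = 0.5;                           % rock (matrix) stress coefficient
Gf  = K .* (Gov - Gp) + Gp;          % fracture gradient [ppg]
plot(Gf, D); set(gca, 'YDir', 'reverse'); grid on;
xlabel('Fracture gradient [ppg]'); ylabel('Depth [m]');
```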
Abstract:
In this work we use Interval Mathematics to establish interval counterparts of the main tools used in digital signal processing. More specifically, the approach developed here covers signals, systems, sampling, quantization, coding and Fourier transforms. A detailed study is provided of some interval arithmetics that handle complex numbers: rectangular complex interval arithmetic, circular complex arithmetic, and interval arithmetic for polar sectors. This leads us to investigate properties that are relevant for the development of a theory of interval digital signal processing. It is shown that the sets IR and R(C), endowed with any correct arithmetic, are not algebraic fields, meaning that those sets do not behave like the real and complex numbers. An alternative to the notion of interval complex width is also provided, and the Kulisch-Miranker order is used to write complex numbers in interval form, enabling operations on endpoints. The use of interval signals and systems is made possible by the representation of real and complex values in floating-point systems: if a number $x \in \mathbb{R}$ is not representable in a floating-point system $F$, it is mapped to an interval $[\underline{x}, \overline{x}]$ such that $\underline{x}$ is the largest number in $F$ smaller than $x$ and $\overline{x}$ is the smallest number in $F$ greater than $x$. This interval representation is the starting point for definitions of interval signals and systems taking real or complex values, and it provides the extension of notions such as causality, stability, time invariance, homogeneity, additivity and linearity to interval systems. The quantization process is extended to its interval counterpart, and interval versions of quantization levels, quantization error and encoded signal are then provided. It is shown that the interval quantization levels represent complex quantization levels and that the classical quantization error ranges over the interval quantization error. An estimate for the interval quantization error and an interval version of the Z-transform (and hence of the Fourier transform) are provided. Finally, the results of a Matlab implementation are given.
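The floating-point enclosure described above can be emulated in Matlab with eps(), since Matlab exposes no directed-rounding mode; a sketch follows, where the one-ulp outward widening is an assumption that over-approximates correct rounding:

```matlab
% Enclosing a real that is not exactly representable in binary floating point:
x  = 1/10;                          % nearest double to the real number 0.1
X  = [x - eps(x), x + eps(x)];      % interval guaranteed to contain 0.1
y  = 2/10;
Y  = [y - eps(y), y + eps(y)];      % interval guaranteed to contain 0.2
% Interval addition with outward rounding emulated by one-ulp widening:
lo = X(1) + Y(1);  hi = X(2) + Y(2);
Z  = [lo - eps(lo), hi + eps(hi)];  % encloses the exact sum 0.3
fprintf('Z = [%.17g, %.17g]\n', Z);
```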
Abstract:
This work analyzes the behavior of the gas flow of plunger-lift wells producing to well-testing separators on offshore production platforms, aiming at a technical procedure to estimate the gas flow during the slug production period. The motivation for this work arose when PETROBRAS equipped some wells with the plunger-lift method in the Ubarana offshore field, located off the Rio Grande do Norte State coast, where the produced fluids are measured in well-testing separators at the platform. The artificial lift method called plunger lift is used when the available reservoir energy is not high enough to overcome all the load losses required to lift the oil from the bottom of the well to the surface continuously. The method consists, basically, of one free piston acting as a mechanical interface between the formation gas and the produced liquids, greatly increasing the well's lifting efficiency. A pneumatic control valve mounted on the flow line controls the cycles: when this valve opens, the plunger moves from the bottom to the surface of the well, lifting all the oil and gas above it until it reaches the well-test separator, where the fluids are measured. The well-test separator is used to measure all the volumes produced by the well during a period of time called a production test. In most cases, separators are designed to measure stabilized flow - that is, reasonably constant flow - by means of electronic level and pressure controllers (PLC) and under the assumption of a steady pressure inside the separator. With plunger-lift wells, the liquid and gas flows at the surface are cyclical and unstable, which causes slugs inside the separator, mainly in the gas phase, and introduces significant errors in the measurement system (e.g., overrange errors). The gas-flow analysis proposed in this work is based on two mathematical models used together: (i) a plunger-lift well model proposed by Baruzzi [1], with later modifications by Bolonhini [2], used to build a plunger-lift simulator; and (ii) a two-phase (gas + liquid) separator model derived from the three-phase (gas + oil + water) separator model proposed by Nunes [3]. Based on these models and on field data collected from the well-test separator of the PUB-02 platform (Ubarana field), it was possible to demonstrate that the output gas flow of the separator can be estimated, with reasonable precision, from the control signal of the pressure control valve (PCV). Several models from the MATLAB® System Identification Toolbox were analyzed to evaluate which one best fitted the field data; for model validation, the AIC criterion was used, as well as a variant of the cross-validation criterion. The ARX model fitted the data best, so a recursive algorithm (RARX) was also evaluated with real-time data. The results were quite promising, indicating the viability of estimating the output gas flow rate of a plunger-lift well producing to a well-test separator from the built-in information of the PCV control signal.
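An ARX identification of the separator gas flow from the PCV signal, along the lines described, could be sketched with the System Identification Toolbox as follows (the variable names, model orders and sample time are assumptions for illustration):

```matlab
% ARX identification of separator gas flow from the PCV control signal.
u = pcv_signal(:);                   % PCV control signal (hypothetical variable)
y = gas_flow(:);                     % separator output gas flow (hypothetical)
Ts = 1;                              % sample time [s] (assumed)
data = iddata(y, u, Ts);
N  = size(data, 1);
ze = data(1 : round(0.7*N));         % estimation set
zv = data(round(0.7*N)+1 : N);       % validation set (cross-validation)
model = arx(ze, [2 2 1]);            % na = 2, nb = 2, nk = 1 (assumed orders)
compare(zv, model);                  % fit on unseen data
fprintf('AIC = %.2f\n', aic(model)); % Akaike information criterion
```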
Abstract:
The predictive control technique has gained, in recent years, a growing number of adherents because of the ease of tuning its parameters, the extension of its concepts to multi-input/multi-output (MIMO) systems, the possibility of linearising nonlinear process models around an operating point so they can be used directly in the controller and, mainly, because it is the only methodology that can take into account, during controller design, the limitations of the control signals and of the process output. The time-varying weighting generalized predictive control (TGPC) studied in this work is one more alternative among the several existing predictive controllers. It is a modification of generalized predictive control (GPC) that uses a reference model, calculated according to design parameters previously established by the designer, and a new criterion function which, when minimized, yields the best parameters for the controller. Genetic algorithms are used to minimize the proposed criterion function, and the robustness of the TGPC is demonstrated through performance, stability and robustness criteria. To compare the results achieved by the TGPC controller, GPC and proportional-integral-derivative (PID) controllers are used, with all techniques applied to stable, unstable and non-minimum-phase plants. The simulated examples are carried out with the MATLAB tool. It is verified that the modifications implemented in the TGPC make the efficiency of this algorithm evident.
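As a stand-in for minimizing a predictive-control criterion with genetic algorithms, the sketch below tunes a simple controller by minimizing an integral-squared-error cost with Matlab's ga; the plant, the controller structure and the cost are illustrative assumptions, not the TGPC formulation itself:

```matlab
% Genetic-algorithm minimisation of a controller cost function (requires
% the Control System and Global Optimization Toolboxes).
G = tf(1, [1 0.8 0.15]);                 % example stable plant (assumed)
J = @(k) ise_cost(G, k);                 % k = [Kp Ki Kd]
k_best = ga(J, 3, [], [], [], [], [0 0 0], [10 10 10]);
fprintf('Kp = %.2f  Ki = %.2f  Kd = %.2f\n', k_best);

function c = ise_cost(G, k)
    % Closed-loop integral-squared-error, penalising unstable candidates.
    T = feedback(pid(k(1), k(2), k(3)) * G, 1);
    if any(real(pole(T)) >= 0), c = 1e6; return, end
    [y, t] = step(T, 30);
    c = trapz(t, (1 - y).^2);            % criterion to be minimised
end
```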
Abstract:
Microstrip antennas are constantly in evidence in current research because of the several advantages they present. Fractal geometry, combined with the good performance and convenience of planar structures, is an excellent combination for the design and analysis of ever smaller, multi-resonant and broadband structures. This geometry has been applied to microstrip patch antennas to reduce their size and highlight their multi-band behavior. Compared with conventional microstrip antennas, quasi-fractal patch antennas have lower resonance frequencies, enabling the manufacture of more compact antennas. The aim of this work is the design of quasi-fractal patch antennas using Koch and Minkowski fractal curves applied to the radiating and non-radiating edges of a conventional rectangular patch fed by an inset-fed microstrip line, initially designed for the frequency of 2.45 GHz. The inset-fed technique is investigated for the impedance matching of the fractal antennas, which are fed through microstrip lines; its efficiency is investigated experimentally and compared with simulations carried out with the commercial software Ansoft Designer, used for precise analysis of the electromagnetic behavior of antennas by the method of moments, and with the proposed neural model. This dissertation includes a literature review of microstrip antenna theory and of fractal geometry, with emphasis on its various forms, techniques for generating fractals and their applicability. It also presents a study of artificial neural networks, showing the network types/architectures used and their characteristics, as well as the training algorithms employed in their implementation; the parameter-update equations for the networks used in this study were derived from the gradient method. Research is also carried out with emphasis on the miniaturization of the proposed new structures, showing how an antenna designed with fractal contours can miniaturize a conventional rectangular patch antenna. The study also comprises the modeling, through artificial neural networks, of several electromagnetic parameters of the quasi-fractal antennas. The results demonstrate the excellent capability of neural modeling techniques for microstrip antennas; all the algorithms used to obtain the proposed models were implemented in the commercial simulation software Matlab 7. To validate the results, several antenna prototypes were built, measured on a vector network analyzer and simulated in software for comparison.
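A Koch contour of the kind applied to the patch edges can be generated iteratively; a minimal Matlab sketch follows (the segment length and iteration count are assumptions):

```matlab
% Koch fractal contour for a patch edge: each iteration replaces every
% segment by four segments with a 60-degree triangular bump.
p = [0 0; 30 0];                     % initial edge, e.g. a 30 mm patch side
for it = 1:3                         % three fractal iterations
    q = p(1,:);
    for s = 1:size(p,1)-1
        a = p(s,:);  b = p(s+1,:);  v = (b - a)/3;
        m1 = a + v;  m2 = a + 2*v;
        rot = [cosd(60) -sind(60); sind(60) cosd(60)];
        tip = m1 + (v * rot');       % apex of the triangular bump
        q = [q; m1; tip; m2; b];     %#ok<AGROW>
    end
    p = q;
end
plot(p(:,1), p(:,2)); axis equal;    % Koch edge after 3 iterations
```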
Abstract:
The objective of this study was to compare the goodness of fit of four non-linear growth models - Brody, Gompertz, Logistic and Von Bertalanffy - in West African Dwarf (WAD) sheep. A total of 5274 monthly weight records from birth up to 180 days of age, from 889 lambs, collected from 2001 to 2004 at the Betecoucou breeding farm in Benin, were used. In the preliminary analysis, the General Linear Model procedure of the Statistical Analysis System was applied to the dataset to identify the significant effects of sex of lamb (male and female), type of birth (single and twin), season of birth (rainy and dry), parity of dam (1, 2 and 3) and year of birth (2001, 2002, 2003 and 2004) on the observed birth weight and monthly weights up to 6 months of age. The model parameters (A, B and k), the coefficient of determination (R²) and the mean square error (MSE) were calculated using the technical computing package Matlab® 2006. The mean values of A, B and k were substituted into each model to calculate the corresponding Akaike's Information Criterion (AIC). Among the four growth functions, the Brody model was selected for its accuracy of fit, according to the higher R², lower MSE and lower AIC. Finally, the parameters A, B and k were adjusted in Matlab® 2006 for sex of lamb, year of birth, season of birth, birth type and parity of ewe, providing a specific slope of the Brody growth curve. The results of this study suggest that the Brody model can be useful for WAD sheep breeding under Betecoucou farm conditions through growth monitoring.
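Fitting the Brody model W(t) = A(1 - B e^(-kt)) in Matlab could be done as follows (lsqcurvefit from the Optimization Toolbox; the weight-age data below is illustrative, not the Betecoucou dataset):

```matlab
% Fitting the Brody growth model to weight-age data (illustrative values).
t = [0 30 60 90 120 150 180]';               % age [days]
W = [1.8 4.0 6.0 7.5 8.8 9.8 10.5]';         % body weight [kg]
brody = @(p, t) p(1) * (1 - p(2) * exp(-p(3) * t));   % p = [A B k]
p0 = [12, 0.9, 0.01];                        % initial guess
p  = lsqcurvefit(brody, p0, t, W);
res = W - brody(p, t);
R2  = 1 - sum(res.^2) / sum((W - mean(W)).^2);       % coefficient of determination
fprintf('A = %.2f  B = %.3f  k = %.4f  R^2 = %.4f\n', p, R2);
```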
Abstract:
Blind Source Separation (BSS) refers to the problem of estimating original signals from observed linear mixtures with no knowledge about the sources or the mixing process. Independent Component Analysis (ICA) is a technique mainly applied to the BSS problem and, among the algorithms that implement it, FastICA is a high-performance iterative algorithm of low computational cost that uses non-gaussianity measures based on higher-order statistics to estimate the original sources. The great number of applications where ICA has proved useful reflects the need for hardware implementations of this technique, and the natural parallelism of FastICA favors its implementation on digital hardware. This work proposes the implementation of FastICA on a reconfigurable hardware platform to assess the viability of its use in blind source separation problems - more specifically, in a hardware prototype embedded in a Field Programmable Gate Array (FPGA) board for the monitoring of beds in hospital environments. The implementations are carried out through Simulink models, and their synthesis is done with the DSP Builder software from Altera Corporation.
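A one-unit FastICA fixed-point iteration, of the kind targeted for hardware here, can be prototyped in a few Matlab lines (the two-source mixture below is an illustrative assumption):

```matlab
% One-unit FastICA with tanh nonlinearity (sketch on synthetic mixtures).
S = [sin(0.05*(1:2000)); sign(sin(0.3*(1:2000)))];   % two sources
X = [1 0.6; 0.4 1] * S;                              % linear mixtures
X = X - mean(X, 2);                                  % centre
[E, D] = eig(cov(X'));                               % whitening transform
Z = D^(-0.5) * E' * X;                               % whitened data
w = randn(2, 1);  w = w / norm(w);
for it = 1:100
    g  = tanh(w' * Z);                               % nonlinearity g(w'z)
    dg = 1 - g.^2;                                   % its derivative
    w_new = (Z * g') / size(Z, 2) - mean(dg) * w;    % fixed-point update
    w_new = w_new / norm(w_new);
    if abs(abs(w_new' * w) - 1) < 1e-8, w = w_new; break, end
    w = w_new;
end
s_est = w' * Z;                                      % one estimated source
```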
Abstract:
In a real process, all the resources used, whether physical or implemented in software, are subject to interruptions or operational compromises. In situations involving critical systems, however, any kind of problem may have serious consequences. With this in mind, this work aims to develop a system capable of detecting the presence, and indicating the type, of failures that may occur in a process. To implement and test the proposed methodology, a coupled-tank system was used as the case-study model. The system is designed to generate a set of signals that notify the process operator and that may be post-processed, enabling changes in the control strategy or in the control parameters. Because of the risk of damage to the sensors, actuators and amplifiers of the real plant, the fault dataset is generated computationally and the results are collected from numerical simulations of the process model. The system is composed of Artificial Neural Network structures, trained offline using Matlab®.
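A sketch of the neural fault classifier described above, using Matlab's pattern-recognition network, might be (the feature matrix, the fault classes and the network size are assumptions):

```matlab
% Training a feedforward network to classify simulated faults of the
% coupled-tank model (Deep Learning Toolbox; the dataset is synthetic).
X = rand(4, 1000);                    % features: levels, flows, residuals...
T = full(ind2vec(randi(3, 1, 1000))); % targets: normal / sensor fault / actuator fault
net = patternnet(10);                 % one hidden layer, 10 neurons
net.divideParam.trainRatio = 0.7;     % offline training with validation split
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
net = train(net, X, T);
y = net(X);                           % class scores for each sample
classes = vec2ind(y);                 % predicted fault class
```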
Abstract:
Water scarcity becomes a more serious problem every day and directly affects global society. Studies are directed at raising awareness of the rational use of this natural asset, which is essential to our survival. Only 0.007% of the water available in the world is easily accessible and drinkable by humans; it is found in rivers, lakes, etc. To make better use of the water consumed in homes and small businesses, reuse projects are often implemented, resulting in savings for the customers of water utilities. Reuse projects involve several areas of engineering, such as Environmental, Chemical, Electrical and Computer Engineering; the last two are responsible for the control of the process, which aims to make grey water (soapy water) and clear blue water (rainwater) suitable for consumption or for use in watering gardens, flushing toilets, among other applications. Water has several characteristics that should be taken into consideration when working on its reuse, including turbidity, temperature, electrical conductivity and pH. This document proposes the control of pH (hydrogen potential) with a microcontroller, using fuzzy logic as the control strategy. The controller was developed in the Fuzzy Logic Toolbox of Matlab®.
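A Mamdani fuzzy pH controller of the kind described can be assembled with the classic Fuzzy Logic Toolbox API; the membership functions, universes and rules below are illustrative assumptions:

```matlab
% Mamdani fuzzy pH controller (classic Fuzzy Logic Toolbox API, sketch).
fis = newfis('ph_control');
fis = addvar(fis, 'input', 'pH', [0 14]);
fis = addmf(fis, 'input', 1, 'acid',    'trimf', [0 0 7]);
fis = addmf(fis, 'input', 1, 'neutral', 'trimf', [6 7 8]);
fis = addmf(fis, 'input', 1, 'basic',   'trimf', [7 14 14]);
fis = addvar(fis, 'output', 'dose', [-1 1]);       % -1: add acid, +1: add base
fis = addmf(fis, 'output', 1, 'addBase', 'trimf', [0 1 1]);
fis = addmf(fis, 'output', 1, 'none',    'trimf', [-0.2 0 0.2]);
fis = addmf(fis, 'output', 1, 'addAcid', 'trimf', [-1 -1 0]);
rules = [1 1 1 1;    % if pH is acid    then dose is addBase
         2 2 1 1;    % if pH is neutral then dose is none
         3 3 1 1];   % if pH is basic   then dose is addAcid
fis = addrule(fis, rules);
u = evalfis(6.2, fis);               % control action for a measured pH of 6.2
```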