992 results for computational mathematics
Abstract:
The objectives of this study were to develop a computational routine for solving the Yalin equation and the Shields diagram, and to evaluate a simplified equation for modeling sediment transport capacity in a Latossolo Vermelho Distrófico (a dystrophic Red Latosol) that could be used in the Water Erosion Prediction Project (WEPP), as well as in other soil erosion prediction models. The sediment transport capacity of overland flow was represented as a power function of shear stress, which proved to be an approximation of the Yalin equation. This simplified equation could be applied to experimental results obtained from complex topography. The simplified equation was accurate with respect to the Yalin equation when calibrated using the mean shear stress. Validation tests with independent data showed that the simplified equation was efficient for estimating sediment transport capacity.
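A power function of shear stress of the kind described can be calibrated by least squares in log-log space. The sketch below is illustrative only: the coefficient names `a` and `b` and the synthetic calibration data are assumptions, not the study's values.

```python
import numpy as np

def fit_transport_capacity(tau, tc):
    """Fit Tc = a * tau**b by least squares in log-log space."""
    b, log_a = np.polyfit(np.log(tau), np.log(tc), 1)
    return np.exp(log_a), b

def transport_capacity(tau, a, b):
    """Simplified power-function transport capacity."""
    return a * tau**b

# synthetic calibration data (illustrative only)
tau = np.array([1.0, 2.0, 4.0, 8.0])
tc = 0.5 * tau**1.5
a, b = fit_transport_capacity(tau, tc)
```

Because the model is linear in log space, a single `polyfit` call recovers both parameters.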
Abstract:
For irrigated rice, few studies use leaf diagnosis methods developed for local climate, soil, or cultivar conditions. The objective of this study was to evaluate the Nutritional Composition Diagnosis and Mathematical Chance methods for defining nutritional standards for rice fields in the state of Rio Grande do Sul. Grain yield results and leaf concentrations of N, P, K, Ca, Mg, S, B, Cu, Fe, Mn, Zn, and Mo from 356 rice fields grown under flood irrigation were used to determine the sufficiency ranges calculated by the Mathematical Chance method. The sufficiency ranges were compared with critical values proposed in the literature and with the confidence interval of the mean nutrient concentration in fields considered nutritionally balanced, identified by the Nutritional Composition Diagnosis method. Little agreement was observed between the sufficiency ranges indicated by the Mathematical Chance and Nutritional Composition Diagnosis methods and the corresponding values reported in the literature. The ranges of adequate leaf concentrations, consistent with higher mean yield of the rice fields, were indicated to be 23 to 28 g kg-1 for N; 11 to 14 g kg-1 for K; 1.4 to 2.0 g kg-1 for S; 6 to 12 mg kg-1 for B; and 70 to 200 mg kg-1 for Fe. For the leaf concentrations of P, Ca, Mg, B, Cu, Mn, Zn, and Mo, none of the tested sufficiency ranges showed capacity to distinguish the rice fields by mean yield.
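As an illustration of how sufficiency ranges can be applied in practice, the sketch below encodes the ranges reported in the abstract and classifies a hypothetical leaf sample; the three-way classification scheme is my own simplification, not part of either diagnosis method.

```python
# Sufficiency ranges reported in the abstract (N, K, S in g/kg; B, Fe in mg/kg)
RANGES = {
    "N": (23, 28),
    "K": (11, 14),
    "S": (1.4, 2.0),
    "B": (6, 12),
    "Fe": (70, 200),
}

def diagnose(sample):
    """Classify each nutrient as 'deficient', 'adequate', or 'excessive'."""
    status = {}
    for nutrient, value in sample.items():
        lo, hi = RANGES[nutrient]
        if value < lo:
            status[nutrient] = "deficient"
        elif value > hi:
            status[nutrient] = "excessive"
        else:
            status[nutrient] = "adequate"
    return status

print(diagnose({"N": 25, "K": 10, "Fe": 250}))
```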
Abstract:
This study proposes a computing device capable of enabling tactile communication between individuals with visual impairment (blindness or low vision) over the Internet or a local area network (LAN). The work was developed within research projects currently carried out at the LAI (Laboratory of Integrated Accessibility) of the Federal University of Rio Grande do Norte. The research involved a prototype capable of presenting geometric shapes to be recognized by blind students from the Institute of Education and Rehabilitation of the Blind of Rio Grande do Norte (IERC-RN), located in the Alecrim neighborhood, Natal/RN. In addition, another prototype was developed to test communication over a local network and the Internet. Data were analyzed with a combined qualitative and quantitative approach, using simple statistical techniques, such as percentages and averages, to support subjective interpretations. The results offer an analysis of the extent to which the implementation can contribute to the socialization and learning of the visually impaired. Finally, recommendations are made for future research to improve the proposed mechanism.
Abstract:
Worldwide, the demand for transportation services for persons with disabilities, the elderly, and persons with reduced mobility has increased in recent years. The population is aging, governments need to adapt to this reality, and this fact may represent business opportunities for companies. Within this context is the Programa de Acessibilidade Especial porta a porta (PRAE), a door-to-door public transportation service in the city of Natal-RN, Brazil. The research presented in this dissertation seeks to develop a programming model to assist the decision-making process of the service's managers. To that end, an algorithm was created based on methods for generating approximate solutions known as heuristics. The purpose of the model is to increase the number of people served by the PRAE, given the available fleet, by generating optimized route schedules. The PRAE is a vehicle routing and scheduling problem of the dial-a-ride type (DARP), the most complex class among routing problems. The solution method was validated by comparing the results produced by the model with those of the current scheduling method. The model is expected to increase the current capacity to serve transport requests.
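To give a feel for the heuristic approach, here is a deliberately simplified sketch of a greedy insertion heuristic for a dial-a-ride-style problem: each request is a (pickup, dropoff) point pair, inserted into a single vehicle's route at the positions that increase total travel distance the least. This is my own illustration, not the dissertation's algorithm; real DARP models add time windows, ride-time limits, capacity constraints, and multiple vehicles.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_length(route):
    return sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))

def insert_request(route, pickup, dropoff):
    """Try every insertion position pair (pickup before dropoff, depot ends fixed)."""
    best = None
    for i in range(1, len(route)):
        cand_p = route[:i] + [pickup] + route[i:]
        for j in range(i + 1, len(route) + 1):
            cand = cand_p[:j] + [dropoff] + cand_p[j:]
            if best is None or route_length(cand) < route_length(best):
                best = cand
    return best

depot = (0.0, 0.0)
route = [depot, depot]  # route starts and ends at the depot
for pickup, dropoff in [((1, 0), (2, 0)), ((0, 1), (0, 3))]:
    route = insert_request(route, pickup, dropoff)
print(route_length(route))
```

Greedy insertion gives a fast, feasible schedule that metaheuristics can then improve.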
Abstract:
This work presents an interval approach to dealing with images that contain uncertainty, as well as treating this uncertainty through morphological operations. Two interval models are presented. For the first, a three-valued algebraic space is introduced, constructed on the basis of the three-valued logic of Łukasiewicz. With this algebraic structure, the theory of interval binary images is introduced, extending the classic binary model with the inclusion of uncertainty information. It can represent binary images with uncertainty in pixels, originated, for example, during image acquisition. The lattice structure of these images allows the definition of morphological operators in which the uncertainties are treated locally. The second model extends the classic model to gray-level images, where the functions representing these images map into a finite set of interval values. The algebraic structure belongs to the class of complete lattices, which also allows the definition of the elementary operators of mathematical morphology, dilation and erosion, for these images. Thus, an interval theory applied to mathematical morphology is established to deal with uncertainty problems in images.
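A minimal sketch of the second model's idea (my own illustration, not the thesis's formulation): each pixel holds an interval [lo, hi], and flat erosion/dilation over a 3x3 neighborhood is applied endpoint-wise, which is the natural extension on the product lattice of intervals.

```python
import numpy as np

def morph(lo, hi, op):
    """Flat 3x3 morphology applied endpoint-wise to an interval image.

    op=np.min gives erosion, op=np.max gives dilation; lo and hi hold the
    lower and upper endpoints of each pixel's interval."""
    h, w = lo.shape
    out_lo = np.empty_like(lo)
    out_hi = np.empty_like(hi)
    for y in range(h):
        for x in range(w):
            ys = slice(max(y - 1, 0), min(y + 2, h))
            xs = slice(max(x - 1, 0), min(x + 2, w))
            out_lo[y, x] = op(lo[ys, xs])
            out_hi[y, x] = op(hi[ys, xs])
    return out_lo, out_hi

lo = np.array([[0, 0, 0], [0, 4, 0], [0, 0, 0]], float)
hi = lo + 1.0  # one unit of acquisition uncertainty per pixel
er_lo, er_hi = morph(lo, hi, np.min)
di_lo, di_hi = morph(lo, hi, np.max)
```

Erosion shrinks both endpoints toward the local minimum, dilation grows them toward the local maximum, so the uncertainty is carried through the operation per pixel.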
Abstract:
This work develops a mathematical foundation for digital signal processing from the point of view of interval mathematics. It addresses the open problem of precision and representation of data in digital systems with an interval version of signal representation. Signal processing is a rich and complex area, so this work narrows its focus to linear time-invariant systems. A vast literature exists in the area, but some concepts of interval mathematics need to be redefined or elaborated for the construction of a solid theory of interval signal processing. We construct the basic foundations of signal processing in interval form, including the basic properties of linearity, stability, and causality, and an interval version of linear systems and its properties. Interval versions of the convolution and the Z-transform are presented. Convergence of systems is analyzed using the interval Z-transform, an essentially interval distance, and interval complex numbers, with an application to an interval filter.
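An interval convolution can be realized with standard interval arithmetic, as in the sketch below; this is an assumption of how such an operation can look, not the thesis's exact definition. Each sample is a pair (lo, hi), interval addition adds endpoints, and interval multiplication takes the min/max over the four endpoint products.

```python
def imul(a, b):
    """Interval multiplication: min/max of the four endpoint products."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def iadd(a, b):
    """Interval addition: endpoint-wise sum."""
    return (a[0] + b[0], a[1] + b[1])

def iconv(x, h):
    """Interval convolution of two finite interval-valued signals."""
    y = [(0.0, 0.0)] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] = iadd(y[i + j], imul(xi, hj))
    return y

x = [(1.0, 1.0), (2.0, 2.0)]  # degenerate intervals = exact samples
h = [(0.9, 1.1)]              # uncertain filter tap
print(iconv(x, h))
```

The output intervals enclose every result the exact convolution could take over the tap's uncertainty.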
Abstract:
Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that they cannot be solved in polynomial time. Initially, these solutions were focused on heuristics; currently, metaheuristics are more often used for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of a heuristic, called an "Operon," for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, mainly using statistical methodology (Cluster Analysis and Principal Component Analysis); and the use of statistical analyses adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains to promote an "intelligent" search in the solution space. The Traveling Salesman Problem (TSP) is used for applications based on a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by adopting a minimum limit on the coefficient of variation of the fitness function of the individuals, calculated over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms, and a Simulated Annealing algorithm. Three performance analyses of these algorithms are proposed. The first uses Logistic Regression, based on the probability of the algorithm under test finding an optimal solution for a TSP instance. The second uses Survival Analysis, based on the probability distribution of the execution time observed until an optimal solution is reached.
The third uses a non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with sizes of up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm in an attempt to improve its performance. The last four were undertaken to evaluate the performance of ProtoG in comparison with the three other algorithms. For these sixty-one instances, statistical tests provide evidence that ProtoG performs better than the other three algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three experiments, in which performance was evaluated through the PES, the average PES obtained with ProtoG was less than 1% in almost half of these instances, reaching its largest average, 3.52%, for an instance of 1,173 cities. ProtoG can therefore be considered a competitive algorithm for solving the TSP, since average PES values greater than 10% are not rare in the literature for instances of this size.
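The PES metric used in the third analysis is a simple relative gap, which can be written directly from its definition in the text:

```python
def percent_error(found, best):
    """Percent Error of the Solution (PES): percentage by which the tour
    length found exceeds the best solution known in the literature."""
    return 100.0 * (found - best) / best

# e.g. a tour of length 1035.2 against a best-known 1000.0 gives PES 3.52%
print(percent_error(1035.2, 1000.0))
```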
Abstract:
This work proposes a computational environment for teaching control systems, named ModSym. The software implements a graphical interface for modeling linear physical systems and shows, step by step, the processing required to obtain mathematical models for these systems. A physical system can be represented in the software in three different ways: by a graphical diagram built from elements of the electrical, translational mechanical, rotational mechanical, and hydraulic domains; by bond graphs; or by signal-flow diagrams. Once the system is represented, ModSym can compute the system's transfer functions in symbolic form using Mason's rule. The software also computes transfer functions in numerical form, as well as parametric sensitivity functions. The work also proposes an algorithm to obtain the signal-flow diagram of a physical system from its bond graph. This algorithm, together with the system analysis methodology known as the Network Method, allowed Mason's rule to be used to compute transfer functions of the systems modeled in the software.
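Mason's rule computes a transfer function from forward-path gains and loop gains of a signal-flow graph. The numeric sketch below (my own illustration, not ModSym's code) evaluates the graph determinant Delta from products of non-touching loop gains and applies it to the classic single-feedback-loop graph, where the result must be G / (1 + G*H).

```python
from itertools import combinations

def delta(loops, nontouching):
    """Graph determinant: 1 - sum(loop gains) + sum(products of pairs of
    non-touching loops) - ...; nontouching(i, j) says loops i and j share
    no node."""
    d = 1.0
    sign = -1.0
    for r in range(1, len(loops) + 1):
        for combo in combinations(range(len(loops)), r):
            if all(nontouching(i, j) for i, j in combinations(combo, 2)):
                prod = 1.0
                for i in combo:
                    prod *= loops[i]
                d += sign * prod
        sign = -sign
    return d

# Single negative-feedback loop: forward path gain G, loop gain -G*H,
# and the path touches the loop, so Delta_1 = 1.
G, H = 10.0, 0.5
T = G * 1.0 / delta([-G * H], lambda i, j: False)
print(T)  # equals G / (1 + G*H)
```

ModSym applies the same rule symbolically; the numeric version just shows the bookkeeping.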
Abstract:
Several lines of research show that sleep favors memory consolidation and learning. It has been proposed that the cognitive role of sleep derives from a global scaling of synaptic weights, able to homeostatically restore the ability to learn new things by erasing memories overnight. This phenomenon is typical of slow-wave sleep (SWS) and characterized by non-Hebbian mechanisms, i.e., mechanisms independent of synchronous neuronal activity. Another view holds that sleep also triggers the specific enhancement of synaptic connections, embossing certain mnemonic traces within a lattice of synaptic weights that is rescaled each night. Such embossing is understood as the combination of Hebbian and non-Hebbian mechanisms, capable of increasing and decreasing, respectively, the synaptic weights in complementary circuits, leading to selective memory improvement and a restructuring of the synaptic configuration (SC) that can be crucial for the generation of new behaviors ("insights"). Empirical findings indicate that Hebbian plasticity during sleep is initiated at the transition from SWS to the rapid eye movement (REM) stage, possibly due to the significant differences between the firing-rate regimes of the two stages and the up-regulation of factors involved in long-term synaptic plasticity. In this study, the homeostasis and embossing theories were compared using an artificial neural network (ANN) fed with action potentials recorded in the hippocampus of rats during the sleep-wake cycle. In the simulation in which the ANN did not apply long-term plasticity mechanisms during sleep (the SWS-REM transition), the synaptic weight distribution was inexorably rescaled toward a mean value proportional to the input firing rate, erasing the synaptic weight pattern that had been established initially.
In contrast, when long-term plasticity was modeled during the SWS-REM transition, an increase of synaptic weights was observed in the range of initial/low values, effectively redistributing the weights so as to reinforce a subset of synapses over time. The results suggest that a positive regulation coming from long-term plasticity can completely change the role of sleep: its absence leads to forgetting; its presence leads to a positive mnemonic change.
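The contrast between the two hypotheses can be shown with a toy weight-update model (my own illustration, not the study's ANN): pure homeostatic rescaling pulls every weight toward the same mean, erasing the initial pattern, while adding a Hebbian term that selectively potentiates a subset of synapses preserves and reinforces it.

```python
import numpy as np

rng = np.random.default_rng(0)
w_rescale = rng.uniform(0.0, 1.0, 100)  # homeostasis-only weights
w_emboss = w_rescale.copy()             # homeostasis + Hebbian weights
tagged = np.zeros(100, bool)
tagged[:10] = True  # synapses co-active during sleep (hypothetical subset)

for _ in range(200):
    # non-Hebbian homeostatic scaling toward a fixed mean (0.5, arbitrary)
    w_rescale += 0.1 * (0.5 - w_rescale)
    w_emboss += 0.1 * (0.5 - w_emboss)
    # Hebbian long-term potentiation of the tagged subset
    w_emboss[tagged] += 0.05

print(w_rescale.std())  # pattern erased: all weights collapse to the mean
print(w_emboss[tagged].mean() - w_emboss[~tagged].mean())  # pattern embossed
```

Without the Hebbian term the weight distribution loses all structure; with it, the tagged subset settles at a higher fixed point and stands out against the rescaled background.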
Abstract:
One of the objectives of this work is the analysis of planar structures using PBG (Photonic Bandgap) materials, a new method of controlling the propagation of electromagnetic waves in devices with dielectrics. The basic theory of these structures is presented, as well as applications and the determination of certain parameters. The analysis concerns PBG structures, including the basic theory and applications in planar structures, and considerations are made regarding the implementation of devices. The TTL (Transverse Transmission Line) method is employed, characterized by its simplicity in the treatment of the equations that govern the propagation of electromagnetic waves in the structure. In this method, the fields in the x and z directions are expressed as functions of the fields in the transverse direction y in the FTD (Fourier Transform Domain). The method is useful for determining the complex propagation constant, with applications in high frequency and photonics. Structures at micrometric scale operating at frequencies in the terahertz range are addressed, a first step toward operation in the visible spectrum. The mathematical basis for determining the electromagnetic fields in the structure is presented, based on the TTL method and taking into account the dimensions considered in this work. Calculations for the determination of the complex propagation constant are also carried out. The computational implementation is presented for high frequencies; the analysis is first done for open microstrip lines on a semiconductor substrate. Finally, considerations are made regarding applications of these devices in telecommunications, along with suggestions for future work.
Abstract:
The Electrical Submersible Pump (ESP) has been one of the most appropriate artificial lift solutions in onshore and offshore applications. Typical features of this application are adverse temperatures, viscous fluids, and gassy environments. The difficulty of equipment maintenance and setup contributes to the increasing cost of oil production in deep water; optimization through automation can therefore be an excellent approach to reducing costs and failures in subsurface equipment. This work describes a computer simulator for the ESP artificial lift method. The tool reproduces the dynamic behavior of an ESP installation, considering the electric energy source and transmission model for the motor, the electric motor model (including thermal calculation), tubing flow simulation, centrifugal pump behavior including the effects of the nature of the liquid, and reservoir requirements. In addition, there are three-dimensional animations for each ESP subsystem (transformer, motor, pump, seal, gas separator, command unit). The simulator is proposed as an improvement to oil well monitoring, aiming at the maximization of well production. Currently, proprietary simulators are tied to specific equipment manufacturers, making it impossible to simulate equipment from other manufacturers; the proposed approach supports equipment from diverse manufacturers.
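As a small, self-contained illustration of the kind of pump behavior such a simulator must capture (not code from the simulator itself), the standard centrifugal pump affinity laws relate flow, head, and power to shaft speed when an ESP is driven at variable frequency:

```python
def affinity(q, h, p, n_ref, n_new):
    """Centrifugal pump affinity laws: flow scales with speed, head with
    speed squared, shaft power with speed cubed."""
    r = n_new / n_ref
    return q * r, h * r**2, p * r**3

# Hypothetical pump point rescaled from 60 Hz to 48 Hz operation
q, h, p = affinity(1000.0, 800.0, 50.0, 60.0, 48.0)
```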
Abstract:
Among the results of the AutPoc Project (Well Automation), established between UFRN and Petrobras with the support of CNPq, FINEP, CTPETRO, and FUNPEC, a simulator was developed for oil wells equipped with the continuous gas-lift method. Gas lift is a lift method widely used in offshore production, and its basic concept is to inject gas at the bottom of the producing oil well to make the fluid less dense and thereby ease its displacement from the reservoir to the surface. Based on tables and equations that condense the largest amount of information on the characteristics of the reservoir, the well, and the gas injection valves, it is possible, through successive interpolations, to simulate curves representative of the physical behavior of the relevant variables. With a simulator that closely approximates the real physical conditions of an oil well, particular behaviors can be analyzed much faster, since the time constants of the system in question are high, and costs of field trials can be reduced. The simulator is highly versatile, notably in the analysis of the influence of parameters such as static pressure, gas-liquid ratio, wellhead pressure, and BSW (Basic Sediments and Water) on the well's bottomhole inflow curves, and in the generation of the well performance curve, on which control and optimization rules can be simulated. Regarding control rules, the simulator supports two modes of operation: control via software included in the simulator itself, and the use of external controllers. This means the simulator can be used as a tool for validating control algorithms. Given the capabilities cited above, another powerful application naturally emerges: the didactic use of the tool.
It can be used in training and refresher courses for engineers.
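The successive-interpolation idea can be sketched in a few lines: given tabulated points of a well performance curve, production at intermediate gas injection rates is obtained by interpolation. The table below is illustrative, not data from the project.

```python
import numpy as np

# Hypothetical gas-lift performance table: oil rate vs. gas injection rate
gas_injection = np.array([0.0, 0.5, 1.0, 1.5, 2.0])      # gas rate (arbitrary units)
oil_rate = np.array([200.0, 420.0, 520.0, 560.0, 550.0])  # oil rate (arbitrary units)

def production(q_gas):
    """Linearly interpolate the oil rate for an intermediate injection rate."""
    return np.interp(q_gas, gas_injection, oil_rate)

print(production(0.75))
```

Real simulators chain several such interpolations (reservoir inflow, tubing gradient, valve behavior) to close the well model.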
Abstract:
This work proposes a computational solution based on the development of software to construct species-specific primers, used to improve the diagnosis of plant viruses by PCR. Primers are indispensable to the PCR reaction and also provide the specificity of the diagnosis. A primer is a short, synthetic, single-stranded piece of DNA used as a starter in the PCR technique; it flanks the sequence to be amplified. Species-specific primers delimit the known start and end regions within which the polymerase enzyme will amplify on a certain species, i.e., they are specific to a single species. Thus, the main objective of this work is to automate the process of primer selection, optimizing the specificity of the primers relative to those chosen by the traditional method.
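A minimal sketch of primer candidate screening (my own illustration, not the thesis's algorithm): slide a window over the template sequence and keep candidates whose GC content falls in a typical range and that occur only once in the template, a crude proxy for specificity.

```python
def gc_content(seq):
    """Fraction of G and C bases in a DNA sequence."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def candidate_primers(template, length=20, gc_min=0.4, gc_max=0.6):
    """Return windows of the template that pass GC and uniqueness filters."""
    out = []
    for i in range(len(template) - length + 1):
        cand = template[i:i + length]
        if gc_min <= gc_content(cand) <= gc_max and template.count(cand) == 1:
            out.append(cand)
    return out
```

A real tool adds melting temperature, self-complementarity, and cross-species checks on top of filters like these.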
Abstract:
This work proposes a computer simulator for sucker-rod pumped vertical wells. The simulator represents the dynamic behavior of the system and computes several important parameters, allowing easy visualization of the relevant phenomena. It allows tests to be run at lower cost and in less time than experiments on real wells. The simulation uses a model based on the dynamic behavior of the rod string, represented by a second-order partial differential equation. Through this model, several common field situations can be verified. Moreover, the simulation includes 3D animations, facilitating physical understanding of the process through better visual interpretation of the phenomena. Another important characteristic is the emulation of the main sensors used in sucker-rod pumping automation. Sensor emulation is implemented through a microcontrolled interface between the simulator and industrial controllers; by means of this interface, the controllers see the simulator as a real well. A "fault module" was included in the simulator, incorporating the six most important faults found in sucker-rod pumping. The analysis and verification of these problems through the simulator thus allows the user to identify situations that could otherwise be observed only in the field. The simulation of these faults receives a different treatment due to the different boundary conditions imposed on the numerical solution of the problem. Possible applications of the simulator include the design and analysis of wells, training of technicians and engineers, testing of controllers and supervisory systems, and validation of control algorithms.
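Second-order PDE models of the rod string are commonly damped wave equations of the form u_tt = a^2 u_xx - c u_t, which can be advanced with an explicit finite-difference scheme. The sketch below is illustrative: the coefficients, boundary motion, and grid are assumptions, not the simulator's model.

```python
import numpy as np

nx, nt = 50, 400
dx, dt = 1.0, 0.01
a2, c = 2500.0, 1.0  # (wave speed)^2 and damping coefficient, hypothetical
u = np.zeros((nt, nx))  # rod displacement u[time, depth]

for n in range(1, nt - 1):
    u[n, 0] = np.sin(0.05 * n)  # prescribed surface (polished rod) motion
    lap = u[n, 2:] - 2 * u[n, 1:-1] + u[n, :-2]
    # explicit central-difference update of u_tt = a^2 u_xx - c u_t
    u[n + 1, 1:-1] = (2 * u[n, 1:-1] - u[n - 1, 1:-1]
                      + dt**2 * a2 * lap / dx**2
                      - c * dt * (u[n, 1:-1] - u[n - 1, 1:-1]))
    u[n + 1, -1] = u[n + 1, -2]  # zero-gradient condition at the bottom end
```

The scheme is stable here because the Courant number a*dt/dx = 0.5 is below 1; fault conditions would enter through different boundary conditions, as the abstract notes.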
Abstract:
The main objective of this work is to optimize the performance of frequency selective surfaces (FSS) composed of crossed-dipole conducting patches. The optimization is performed by determining proper values for the width of the crossed dipoles and for the FSS array periodicity, while the length of the crossed dipoles is kept constant. In particular, the objective is to determine values that provide wide bandwidth, using a bioinspired search algorithm with real-number representation. FSS structures composed of patch elements are typically used in band-rejection filtering applications; the FSS acts primarily as a filter, depending on the type of element chosen. The region of the electromagnetic spectrum chosen for this study goes from 7 GHz to 12 GHz, which includes most of the X band. This frequency band was chosen to allow the use of two X-band horn antennas in the FSS measurement setup. The design of the FSS using the developed genetic algorithm allowed the structure's bandwidth to be increased.
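A real-coded genetic algorithm of the kind described can be sketched as below, with two genes (dipole width and array periodicity). The fitness function here is a stand-in placeholder, NOT the FSS electromagnetic model, which in practice comes from full-wave simulation or measurement; the bounds are likewise hypothetical.

```python
import random

random.seed(1)
BOUNDS = [(0.1, 3.0), (5.0, 15.0)]  # (width, periodicity) in mm, hypothetical

def fitness(ind):
    """Placeholder objective with a peak at width 1.5 mm, periodicity 10 mm."""
    w, p = ind
    return -((w - 1.5) ** 2 + 0.1 * (p - 10.0) ** 2)

def clip(x, lo, hi):
    return max(lo, min(hi, x))

pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(20)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]  # elitist truncation selection
    children = []
    for _ in range(10):
        a, b = random.sample(parents, 2)
        alpha = random.random()  # arithmetic (blend) crossover on real genes
        child = [alpha * x + (1 - alpha) * y for x, y in zip(a, b)]
        child = [clip(g + random.gauss(0, 0.05), lo, hi)  # Gaussian mutation
                 for g, (lo, hi) in zip(child, BOUNDS)]
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
```

Replacing the placeholder fitness with the measured or simulated bandwidth turns this into the design loop the abstract describes.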