868 results for Computational Intelligence in Medicine


Relevance:

30.00%

Publisher:

Abstract:

Oil production and exploration techniques have evolved over recent decades to increase fluid flow rates and to optimize the use of the required equipment. The Electric Submersible Pumping (ESP) lift method is based on an electric downhole motor that drives a centrifugal pump, transporting the fluids to the surface. ESP has been gaining ground among artificial lift methods because of its ability to handle large liquid flow rates in both onshore and offshore environments. The performance of a well equipped with an ESP system is intrinsically related to the operation of the centrifugal pump, which converts motor power into head. In the present work, a computational model was developed to analyze the three-dimensional flow in a centrifugal pump used in ESP systems. Using the commercial package ANSYS® CFX®, with water initially as the working fluid, the geometry and simulation parameters were defined so as to approximate the flow inside the impeller and diffuser channels. Three geometry configurations were tested to determine the most suitable one for the problem. After the geometry was chosen, three mesh configurations were analyzed and the computed values were compared with the experimental head curve provided by the manufacturer. The results approached the experimental curve, and the simulation time and model convergence were satisfactory given that the problem involves numerical analysis. After the tests with water, oil was used in the simulations, and the results were compared with a viscosity-correction methodology used in the petroleum industry.
In general, for both water and oil, the single-phase results were consistent with the experimental curves, and the three-dimensional computational models serve as a preliminary step toward the analysis of two-phase flow inside the channels of centrifugal pumps used in ESP systems.
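The conversion between the pump's pressure rise and head, which underlies the comparison against the manufacturer's head curve, can be sketched as follows. This is a minimal illustration; the pressure rise and fluid densities are hypothetical values, not data from the study:

```python
def head_from_pressure_rise(delta_p_pa, rho_kg_m3, g=9.81):
    """Pump head (m) from the pressure rise across the pump: H = dp / (rho * g)."""
    return delta_p_pa / (rho_kg_m3 * g)

# The same pressure rise yields a larger head for a lighter fluid:
h_water = head_from_pressure_rise(500e3, 998.0)  # water
h_oil = head_from_pressure_rise(500e3, 870.0)    # a light oil (assumed density)
```

This is why head, rather than pressure, is the natural quantity for comparing pump performance across fluids.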

Relevance:

30.00%

Publisher:

Abstract:

Knowledge of the native prokaryotes in hazardous locations favors the application of biotechnology for bioremediation. Cultivation-independent strategies and metagenomics contribute further microbiological knowledge, enabling studies of non-cultivable organisms, of the native microbiological status, and of its potential role in bioremediation, for example, of polycyclic aromatic hydrocarbons (PAHs). Considering that the mangrove biome is a fragile and critical interface bordering the ocean, this study characterizes the potential of the native mangrove microbiota to biodegrade PAHs, using a molecular biomarker for detection and assessing bacterial diversity by PCR in areas under the influence of oil companies in the Potiguar Petroleum Basin (BPP). PcaF, a metabolic enzyme, was chosen as the molecular biomarker in a PCR-DGGE approach for detecting PAH-degrading prokaryotes. The PCR-DGGE fingerprints obtained from samples collected at Paracuru-CE, Fortim-CE, and Areia Branca-RN revealed fluctuations of the microbial communities according to the sampling period and in response to the impact of oil. In the analysis of oil-industry interference on the microbial communities, oil was observed to be a determinant of microbial diversity in Areia Branca-RN and Paracuru-CE, whereas Fortim-CE probably suffers no direct influence from oil activity. To better understand the transport and biodegradation of PAHs, in silico modeling and simulation studies were conducted, yielding 3-D models of proteins involved in the degradation of phenanthrene and in the transport of PAHs, as well as a 3-D model of the PcaF enzyme used as the molecular marker in this study. Docking studies with substrates and products were carried out for a better understanding of the mechanisms of PAH transport and catalysis.
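The community fluctuations seen in the PCR-DGGE fingerprints are typically quantified with band-matching similarity coefficients. A minimal sketch of the Dice coefficient on hypothetical band positions (not data from the study):

```python
def dice_similarity(bands_a, bands_b):
    """Dice coefficient between two DGGE lane fingerprints, each given
    as a set of band positions: 2|A ∩ B| / (|A| + |B|)."""
    a, b = set(bands_a), set(bands_b)
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical lanes from two sampling periods at one site:
lane_dry = {12, 27, 45, 63, 80}
lane_wet = {12, 27, 50, 63}
s = dice_similarity(lane_dry, lane_wet)  # shared bands {12, 27, 63}
```

Low similarity between periods, or between impacted and non-impacted sites, is the kind of signal that supports the diversity conclusions above.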

Relevance:

30.00%

Publisher:

Abstract:

Spiking neural networks, networks that encode information in the timing of spikes, are emerging as a new approach within the artificial neural network paradigm, originating in cognitive science. One of these new models is the pulsed neural network with radial basis function, a network able to store information in the axonal propagation delays of its neurons. Learning algorithms have been proposed for this model that seek to map input pulses into output pulses. Recently, a new method was proposed to encode constant data into a temporal sequence of spikes, stimulating deeper studies to establish the abilities and limits of this new approach. However, a well-known problem of this kind of network is the high number of free parameters, more than 15, that must be properly configured or tuned to allow network convergence. This work presents, for the first time, a new learning function for training this network that allows the automatic configuration of one of the key network parameters: the synaptic weight decreasing factor.
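One common scheme for encoding a constant value into spike times, of the kind mentioned above, uses overlapping Gaussian receptive fields: a strongly activated neuron fires early, a weakly activated one fires late. A sketch under assumed parameters (the centers, width, and time window are illustrative, not the paper's configuration):

```python
import math

def encode_with_receptive_fields(x, centers, width, t_max=10.0):
    """Encode a constant value x into spike times using Gaussian
    receptive fields: strong activation -> early spike, weak -> late."""
    times = []
    for c in centers:
        activation = math.exp(-((x - c) ** 2) / (2 * width ** 2))
        times.append(t_max * (1.0 - activation))  # spike time in [0, t_max]
    return times

centers = [0.0, 0.25, 0.5, 0.75, 1.0]
spike_times = encode_with_receptive_fields(0.5, centers, width=0.2)
# the neuron centered on 0.5 fires first, at time 0.0
```

The value is thus represented by the relative order and spacing of the spikes rather than by a firing rate.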

Relevance:

30.00%

Publisher:

Abstract:

The objectives of this study were to develop a computational routine for solving the Yalin equation and the Shields diagram, and to evaluate a simplified equation for modeling sediment transport capacity in a Dystrophic Red Latosol that could be used in the Water Erosion Prediction Project (WEPP), as well as in other soil erosion prediction models. The sediment transport capacity of overland flow was represented as a power function of the shear stress, which proved to be an approximation of the Yalin equation. This simplified equation could be applied to experimental results obtained from complex topography. The simplified equation was accurate with respect to the Yalin equation when calibrated using the mean shear stress. Validation tests with independent data showed that the simplified equation was efficient for estimating sediment transport capacity.
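Calibrating a power function of shear stress, Tc = a·τ^b, reduces to linear regression in log-log space. A minimal sketch with synthetic data (the coefficients are illustrative, not the study's calibration):

```python
import math

def fit_power_law(tau, tc):
    """Fit Tc = a * tau**b by least squares on the log-transformed data."""
    n = len(tau)
    lx = [math.log(t) for t in tau]
    ly = [math.log(c) for c in tc]
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / \
        sum((x - mx) ** 2 for x in lx)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic data generated from Tc = 0.05 * tau**1.5 (hypothetical values):
tau = [1.0, 2.0, 4.0, 8.0]
tc = [0.05 * t ** 1.5 for t in tau]
a, b = fit_power_law(tau, tc)  # recovers a ≈ 0.05, b ≈ 1.5
```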

Relevance:

30.00%

Publisher:

Abstract:

This study proposes a computing device mechanism capable of enabling tactile communication between individuals with visual impairment (blindness or low vision) over the Internet or a local area network (LAN). The work was developed within the research projects currently carried out at the LAI (Laboratory of Integrated Accessibility) of the Federal University of Rio Grande do Norte. The research involved a prototype for the recognition of geometric shapes by students considered blind at the Institute of Education and Rehabilitation of the Blind of Rio Grande do Norte (IERC-RN), located in the Alecrim neighborhood of Natal-RN. In addition, another prototype was developed to test communication over a local network and the Internet. To analyze the data, a qualitative and quantitative approach was used, applying simple statistical techniques, such as percentages and averages, to support subjective interpretations. The results offer an analysis of the extent to which the implementation can contribute to the socialization and learning of the visually impaired. Finally, some recommendations are made for future research to improve the proposed mechanism.

Relevance:

30.00%

Publisher:

Abstract:

Worldwide, the demand for transportation services for persons with disabilities, the elderly, and persons with reduced mobility has increased in recent years. The population is aging, governments need to adapt to this reality, and this fact may represent business opportunities for companies. Within this context lies the Programa de Acessibilidade Especial porta a porta (PRAE), a door-to-door public transportation service of the city of Natal-RN, Brazil. The research presented in this dissertation seeks to develop a programming model to assist the decision-making process of the service's managers. To that end, an algorithm was created based on methods for generating approximate solutions, known as heuristics. The purpose of the model is to increase the number of people served by the PRAE, given the available fleet, by generating optimized route schedules. The PRAE is a vehicle routing and scheduling problem of the dial-a-ride type (DARP), the most complex class among routing problems. The solution method was validated by comparing the results derived from the model with those of the current scheduling method. The model is expected to increase the current capacity for serving transport requests.
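Heuristics for dial-a-ride scheduling often start from a greedy insertion rule: requests are ordered by pickup time and each one is assigned to the first vehicle that can serve it without overlapping a ride already scheduled. A deliberately simplified sketch, which ignores travel times, capacities, and time windows that a real DARP model must handle:

```python
def greedy_schedule(requests, n_vehicles):
    """Assign (pickup, dropoff) time intervals to vehicles greedily.
    Returns per-vehicle schedules and the list of rejected requests."""
    vehicles = [[] for _ in range(n_vehicles)]
    rejected = []
    for pickup, dropoff in sorted(requests):
        for schedule in vehicles:
            # feasible if the request does not overlap the vehicle's last ride
            if not schedule or schedule[-1][1] <= pickup:
                schedule.append((pickup, dropoff))
                break
        else:
            rejected.append((pickup, dropoff))
    return vehicles, rejected

requests = [(8, 9), (8, 10), (9, 11), (10, 12)]
vehicles, rejected = greedy_schedule(requests, n_vehicles=2)
```

Maximizing the number of served requests, as the PRAE model aims to do, amounts to minimizing the `rejected` list under the real feasibility constraints.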

Relevance:

30.00%

Publisher:

Abstract:

Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that such problems cannot be solved in polynomial time. Initially, these solutions focused on heuristics; currently, metaheuristics are used more for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of what is called an "Operon" heuristic for building the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, relying mainly on statistical methodology, namely Cluster Analysis and Principal Component Analysis; and the use of statistical analyses adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains that promote an "intelligent" search in the solution space. The Traveling Salesman Problem (TSP) is the intended application for a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by adopting a minimum limit on the coefficient of variation of the fitness function of the individuals, calculated over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms, and a Simulated Annealing algorithm. Three performance analyses of these algorithms are proposed. The first uses Logistic Regression, based on the probability that the algorithm under test finds an optimal solution for a TSP instance. The second uses Survival Analysis, based on the probability distribution of the execution time observed until an optimal solution is reached.
The third uses non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with sizes of up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm in an attempt to improve its performance. The last four were undertaken to evaluate the performance of ProtoG in comparison with the three algorithms adopted. For these sixty-one instances, statistical tests provided evidence that ProtoG performs better than the other three algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which performance was evaluated through the PES, the average PES obtained with ProtoG was less than 1% in almost half of the instances, reaching its largest value, 3.52%, for one instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since average PES values greater than 10% are not rarely reported in the literature for instances of this size.
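The PES metric used in the third analysis follows directly from its definition; a sketch, with hypothetical tour costs:

```python
def percent_error_of_solution(found_cost, best_known_cost):
    """PES: percentage by which the found solution exceeds the best known one."""
    return 100.0 * (found_cost - best_known_cost) / best_known_cost

# Hypothetical costs: a tour of 20,700 against a best known tour of 20,000
pes = percent_error_of_solution(found_cost=20_700, best_known_cost=20_000)  # 3.5
```

An algorithm that matches the best known solution scores a PES of exactly zero, which is why average PES below 1% indicates near-optimal tours.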

Relevance:

30.00%

Publisher:

Abstract:

Several lines of research show that sleep favors memory consolidation and learning. It has been proposed that the cognitive role of sleep derives from a global scaling of synaptic weights, able to homeostatically restore the ability to learn new things by erasing memories overnight. This phenomenon is typical of slow-wave sleep (SWS) and is characterized by non-Hebbian mechanisms, i.e., mechanisms independent of synchronous neuronal activity. Another view holds that sleep also triggers the specific enhancement of synaptic connections, carrying out the embossing of certain mnemonic traces within a lattice of synaptic weights that is rescaled each night. Such embossing is understood as the combination of Hebbian and non-Hebbian mechanisms, capable of increasing and decreasing, respectively, the synaptic weights in complementary circuits, leading to selective memory improvement and to a restructuring of the synaptic configuration (SC) that can be crucial for the generation of new behaviors ("insights"). Empirical findings indicate that Hebbian plasticity during sleep is initiated at the transition from SWS to the rapid eye movement (REM) stage, possibly due to the significant differences between the firing-rate regimes of the two stages and to the up-regulation of factors involved in long-term synaptic plasticity. In this study, the homeostasis and embossing theories were compared using an artificial neural network (ANN) fed with action potentials recorded in the hippocampus of rats during the sleep-wake cycle. In the simulation in which the ANN did not apply long-term plasticity mechanisms during sleep (SWS-REM transition), the synaptic weight distribution was inexorably rescaled, its mean value becoming proportional to the input firing rate, erasing the pattern of synaptic weights that had been established initially.
In contrast, when long-term plasticity was modeled during the SWS-REM transition, an increase of synaptic weights was observed in the range of initially low values, effectively redistributing the weights so as to reinforce a subset of synapses over time. The results suggest that positive regulation arising from long-term plasticity can completely change the role of sleep: its absence leads to forgetting; its presence leads to a positive mnemonic change.
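The two competing mechanisms can be caricatured numerically: pure homeostatic scaling pulls every weight toward a common mean and preserves only the relative pattern, while embossing adds a Hebbian boost to a selected subset before the rescaling. A toy sketch with made-up weights, with no relation to the recorded hippocampal data:

```python
def homeostatic_rescale(weights, target_mean):
    """Non-Hebbian global scaling: multiply all weights by one common factor."""
    factor = target_mean * len(weights) / sum(weights)
    return [w * factor for w in weights]

def emboss(weights, selected, boost=1.5):
    """Hebbian embossing: selectively potentiate a subset of synapses."""
    return [w * boost if i in selected else w for i, w in enumerate(weights)]

w0 = [0.2, 0.4, 0.6, 0.8]
w_scaled = homeostatic_rescale(w0, 0.25)                 # pattern merely shrunk
w_embossed = homeostatic_rescale(emboss(w0, {0, 1}), 0.25)  # subset reinforced,
                                                            # mean still restored
```

Both outcomes share the same mean, but only the embossed version changes which synapses dominate, which is the behavioral difference at stake between the two theories.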

Relevance:

30.00%

Publisher:

Abstract:

Frequency selective surfaces (FSSs) are structures consisting of periodic arrays of conducting elements, called patches, which are usually very thin and printed on dielectric layers, or of apertures perforated in very thin metallic surfaces, for applications in the microwave and millimeter-wave bands. These structures are often used in aircraft, missiles, satellites, radomes, reflector antennas, high-gain antennas, and microwave ovens, for example. Their main purpose is to filter frequency bands, either transmitting or rejecting them, depending on the requirements of the application. In turn, modern communication systems such as GSM (Global System for Mobile Communications), RFID (Radio Frequency Identification), Bluetooth, Wi-Fi, and WiMAX, whose services are in high demand by society, have required the development of antennas having low cost, low profile, and reduced dimensions and weight as their main features. In this context, the microstrip antenna is an excellent choice for today's communication systems because, in addition to intrinsically meeting the requirements mentioned, planar structures are easy to manufacture and to integrate with other components in microwave circuits. Consequently, the analysis and synthesis of these devices, owing to the great variety of possible shapes, sizes, and operating frequencies of their elements, has been carried out with full-wave models, such as the finite element method, the method of moments, and the finite-difference time-domain method. These methods are accurate but demand great computational effort. In this context, computational intelligence (CI) has been used successfully, as a very appropriate auxiliary tool, in the design and optimization of planar microwave structures, given the complexity of the geometry of the antennas and FSSs considered.
Computational intelligence is inspired by natural phenomena such as learning, perception, and decision-making, using techniques such as artificial neural networks, fuzzy logic, fractal geometry, and evolutionary computation. This work studies the application of computational intelligence, using metaheuristics such as genetic algorithms and swarm intelligence, to the optimization of antennas and frequency selective surfaces. Genetic algorithms are computational search methods, based on Darwin's theory of natural selection and on genetics, used to solve complex problems, e.g., problems whose search space grows with the size of the problem. Particle swarm optimization has features that include the use of collective intelligence, and it is applied to optimization problems in many areas of research. The main objective of this work is the use of computational intelligence in the analysis and synthesis of antennas and FSSs. The structures considered are a ring-type planar microstrip monopole and a cross-dipole FSS. Optimization algorithms were developed, and results were obtained for the optimized geometries of the antennas and FSSs considered. To validate the results, several prototypes were designed, built, and measured, and the measured results showed excellent agreement with the simulated ones. Moreover, the results obtained in this study were compared with those simulated using commercial software, and excellent agreement was also observed. Specifically, the efficiency of the CI techniques used was evidenced by simulated and measured results, in the optimization of the bandwidth of an antenna for wideband or UWB (Ultra Wideband) operation using a genetic algorithm, and in the optimization of bandwidth, by specifying the length of the air gap between two frequency selective surfaces, using a particle swarm optimization algorithm.
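A particle swarm optimizer of the kind used for the air-gap/bandwidth problem can be illustrated on a stand-in cost function. Here the cost is a simple quadratic with a made-up optimal gap; in the actual design flow the cost would come from a full-wave simulation or a surrogate model:

```python
import random

def pso(cost, lo, hi, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal one-dimensional particle swarm optimization (minimization)."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n_particles)]  # positions
    v = [0.0] * n_particles                                # velocities
    pbest = x[:]                                           # personal bests
    gbest = min(x, key=cost)                               # global best
    for _ in range(iters):
        for i in range(n_particles):
            v[i] = (w * v[i]
                    + c1 * rng.random() * (pbest[i] - x[i])
                    + c2 * rng.random() * (gbest - x[i]))
            x[i] = min(hi, max(lo, x[i] + v[i]))
            if cost(x[i]) < cost(pbest[i]):
                pbest[i] = x[i]
        gbest = min(pbest, key=cost)
    return gbest

# Stand-in cost with optimum at gap = 3.0 (hypothetical units):
best_gap = pso(lambda g: (g - 3.0) ** 2, lo=0.0, hi=10.0)
```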

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes design methodologies for frequency selective surfaces (FSSs) composed of periodic arrays of pre-fractal metallic patches on single-layer dielectrics (FR4, RT/duroid). Shapes given by the Sierpinski island and T fractal geometries are exploited for the simple design of efficient band-stop spatial filters with applications in the microwave range. Initial results are discussed in terms of the electromagnetic effects resulting from the variation of parameters such as the fractal iteration number (or fractal level), the fractal iteration factor, and the periodicity of the FSS, depending on the pre-fractal element used (Sierpinski island or T fractal). The transmission properties of the proposed periodic arrays are investigated through simulations performed with the commercial software packages Ansoft Designer™ and Ansoft HFSS™, which run full-wave methods. To validate the methodology, FSS prototypes are selected for fabrication and measurement. The results obtained point to interesting features for FSS spatial filters: compactness, with high values of the frequency compression factor, as well as frequency responses that remain stable under oblique incidence of plane waves. As its main focus, this thesis also addresses the application of an alternative electromagnetic (EM) optimization technique for the analysis and synthesis of FSSs with fractal motifs. In application examples of this technique, Vicsek and Sierpinski pre-fractal elements are used in the optimal design of FSS structures. Based on computational intelligence tools, the proposed technique overcomes the high computational cost associated with full-wave parametric analyses. To this end, fast and accurate multilayer perceptron (MLP) neural network models are developed using different parameters as design input variables; these neural network models compute the cost function in the iterations of population-based search algorithms.
A continuous genetic algorithm (GA), particle swarm optimization (PSO), and the bees algorithm (BA) are used for the optimization of FSSs with specified resonant frequency and bandwidth. The performance of these algorithms is compared in terms of computational cost and numerical convergence. The consistency of the results is verified by the excellent agreement obtained between simulations and measurements of FSS prototypes built with a given fractal iteration.
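The surrogate-assisted search can be sketched with a tiny continuous GA whose cost function stands in for the MLP model. Everything below is illustrative: the cost formulas, targets (a 10 GHz resonance, a 1 GHz bandwidth), and variable names are assumptions, not the thesis's surrogate:

```python
import random

def continuous_ga(cost, bounds, pop_size=30, gens=80, mut=0.1, seed=2):
    """Minimal real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, and one-individual elitism."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        new_pop = [min(pop, key=cost)]  # elitism: carry the best over
        while len(new_pop) < pop_size:
            p1 = min(rng.sample(pop, 3), key=cost)  # tournament of 3
            p2 = min(rng.sample(pop, 3), key=cost)
            alpha = rng.random()
            child = [alpha * a + (1 - alpha) * b for a, b in zip(p1, p2)]
            child = [min(hi, max(lo, g + rng.gauss(0, mut)))
                     for g, (lo, hi) in zip(child, bounds)]
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=cost)

def surrogate_cost(design):
    """Stand-in for an MLP surrogate (hypothetical analytic formulas)."""
    size, gap = design
    f_res = 30.0 / size  # assumed: resonant frequency falls with patch size
    bw = 0.5 * gap       # assumed: bandwidth grows with the air gap
    return (f_res - 10.0) ** 2 + (bw - 1.0) ** 2

best = continuous_ga(surrogate_cost, bounds=[(1.0, 10.0), (0.1, 5.0)])
# optimum near size = 3.0, gap = 2.0 in these made-up units
```

The point of the surrogate is exactly this: `surrogate_cost` is called thousands of times per run, which is affordable for an MLP but not for a full-wave solver.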

Relevance:

30.00%

Publisher:

Abstract:

The Electrical Submersible Pump (ESP) has been one of the most appropriate artificial lift solutions in onshore and offshore applications. Typical conditions for this application are adverse temperatures, viscous fluids, and gassy environments. The difficulty of equipment maintenance and setup contributes to the increasing cost of oil production in deep water; therefore, optimization through automation can be an excellent approach to reducing costs and failures in subsurface equipment. This work describes a computer simulator of the ESP artificial lift method. The tool reproduces the dynamic behavior of an ESP system, considering models of the power source and of the electric energy transmission to the motor, of the electric motor (including thermal calculation), of the tubing flow, of the centrifugal pump behavior with fluid-nature effects, and of the reservoir requirements. In addition, there are three-dimensional animations for each ESP subsystem (transformer, motor, pump, seal, gas separator, command unit). The proposed simulator improves the monitoring of oil wells with a view to maximizing well production. Currently, proprietary simulators are tied to specific equipment manufacturers, so equipment from other manufacturers cannot be simulated; the proposed approach supports equipment from diverse manufacturers.
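The coupling between the pump and well-hydraulics models in such a simulator comes down to finding the operating point where the pump head curve meets the system demand curve. A minimal sketch with hypothetical quadratic curves, solved by bisection:

```python
def operating_point(pump_head, system_head, q_lo, q_hi, tol=1e-6):
    """Flow rate at which pump head equals the system's required head.
    Assumes pump_head - system_head changes sign once on [q_lo, q_hi]."""
    f = lambda q: pump_head(q) - system_head(q)
    while q_hi - q_lo > tol:
        mid = 0.5 * (q_lo + q_hi)
        if f(q_lo) * f(mid) <= 0:
            q_hi = mid
        else:
            q_lo = mid
    return 0.5 * (q_lo + q_hi)

# Hypothetical curves (head in m, flow in m3/h):
pump = lambda q: 80.0 - 0.02 * q ** 2    # drooping pump head curve
system = lambda q: 20.0 + 0.01 * q ** 2  # static lift plus friction losses
q_op = operating_point(pump, system, 0.0, 70.0)
```

In the full simulator each curve would itself depend on the motor, fluid, and reservoir submodels, but the fixed-point structure is the same.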

Relevance:

30.00%

Publisher:

Abstract:

Conventional methods for solving the nonlinear blind source separation problem generally rely on a series of restrictions to obtain the solution, often leading to imperfect separation of the original sources and to high computational cost. In this work, we propose an alternative measure of independence based on information theory and use artificial intelligence tools to solve linear and, later, nonlinear blind source separation problems. In the linear model, genetic algorithms and Rényi negentropy are applied as the measure of independence to find a separation matrix from linear mixtures of waveform, audio, and image signals; a comparison is made with two types of Independent Component Analysis algorithms widespread in the literature. Subsequently, the same measure of independence is used as the cost function in the genetic algorithm to recover source signals that were mixed by nonlinear functions produced by a radial-basis-function artificial neural network. Genetic algorithms are powerful tools for global search and are therefore well suited for use in blind source separation problems. Tests and analyses are carried out through computer simulations.
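For the two-source linear case, searching for the unmixing transform that maximizes an independence contrast can be shown with a grid search over rotation angles. To keep the sketch short, excess kurtosis stands in for the Rényi negentropy of the dissertation, and a grid replaces the genetic algorithm; the sources and mixing angle are made up:

```python
import math

def kurtosis(xs):
    """Excess kurtosis of a sequence."""
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / var ** 2 - 3.0

def unmix_by_rotation(x1, x2, steps=157):
    """Grid-search the rotation angle maximizing total non-Gaussianity."""
    best_theta, best_score = 0.0, -1.0
    for k in range(steps):
        th = k * (math.pi / 2) / steps
        y1 = [math.cos(th) * a + math.sin(th) * b for a, b in zip(x1, x2)]
        y2 = [-math.sin(th) * a + math.cos(th) * b for a, b in zip(x1, x2)]
        score = abs(kurtosis(y1)) + abs(kurtosis(y2))
        if score > best_score:
            best_theta, best_score = th, score
    return best_theta

# Two sub-Gaussian sources mixed by a rotation of 0.6 rad:
t = [i * 0.01 for i in range(5000)]
s1 = [1.0 if math.sin(7 * u) >= 0 else -1.0 for u in t]  # square wave
s2 = [math.sin(3 * u) for u in t]
x1 = [math.cos(0.6) * a - math.sin(0.6) * b for a, b in zip(s1, s2)]
x2 = [math.sin(0.6) * a + math.cos(0.6) * b for a, b in zip(s1, s2)]
theta = unmix_by_rotation(x1, x2)  # expected near the mixing angle, 0.6
```

A GA would replace the grid in higher dimensions, where exhaustive search over the unmixing parameters is no longer feasible.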

Relevance:

30.00%

Publisher:

Abstract:

Among the results of the AutPoc Project (Automation of Wells), established between UFRN and Petrobras with the support of CNPq, FINEP, CTPETRO, and FUNPEC, a simulator was developed for oil wells equipped with the continuous gas-lift method. Gas-lift is a lift method widely used in offshore production; its basic concept is to inject gas at the bottom of the producing well to make the oil less dense and thus facilitate its displacement from the reservoir to the surface. Based on tables and equations that condense a large amount of information on the characteristics of the reservoir, the well, and the gas injection valves, the simulator allows, through successive interpolations, the generation of curves representative of the physical behavior of the characteristic variables involved. With a simulator that brings the physical conditions of an oil well close to those of a real one, it is possible to analyze peculiar behaviors at much higher speeds, since the time constants of the system in question are high, and, moreover, to reduce the cost of field tests. The simulator is very versatile, notably in the analysis of the influence of parameters such as static pressure, gas-liquid ratio, wellhead pressure, and BSW (Basic Sediments and Water) on the bottomhole demand curves, and in obtaining the well performance curve, on which control and optimization rules can be simulated. Regarding control rules, the simulator supports two modes of simulation: control via software embedded in the simulator itself, and the use of external controllers. This means that the simulator can be used as a tool for validating control algorithms. The potential described above naturally suggests another powerful application: the didactic use of the tool.
It will be possible to use it in training and refresher courses for engineers.
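The table-plus-interpolation approach at the heart of the simulator can be shown in miniature: a tabulated well performance curve, queried by piecewise-linear interpolation. The table values are hypothetical, chosen only to reproduce the typical shape of a gas-lift performance curve (rising, then declining, oil rate):

```python
def interp(x, xs, ys):
    """Piecewise-linear interpolation in a lookup table (xs ascending),
    clamped to the table ends."""
    if x <= xs[0]:
        return ys[0]
    for i in range(1, len(xs)):
        if x <= xs[i]:
            frac = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + frac * (ys[i] - ys[i - 1])
    return ys[-1]

# Hypothetical table: gas injection rate (Mm3/d) vs. oil production (m3/d)
gas_rates = [0.0, 50.0, 100.0, 150.0, 200.0]
oil_rates = [40.0, 120.0, 150.0, 158.0, 155.0]
q_oil = interp(75.0, gas_rates, oil_rates)  # midway between 120 and 150
```

Chaining such interpolations over the reservoir, well, and valve tables is what lets the simulator trace the characteristic curves far faster than the real well's time constants would allow.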