34 results for Computational complexity
Abstract:
LINS, Filipe C. A. et al. Modelagem dinâmica e simulação computacional de poços de petróleo verticais e direcionais com elevação por bombeio mecânico. In: CONGRESSO BRASILEIRO DE PESQUISA E DESENVOLVIMENTO EM PETRÓLEO E GÁS, 5. 2009, Fortaleza, CE. Anais... Fortaleza: CBPDPetro, 2009.
Abstract:
This Master's dissertation presents a comparative study between internal air temperature data simulated with the thermal analysis application DesignBuilder 1.2 and data recorded in loco with HOBO® Temp Data Loggers in a Social Housing Prototype (HIS) located on the Central Campus of the Federal University of Rio Grande do Norte (UFRN). The prototype was designed and built following the thermal comfort strategies recommended for the local climate, using cellular concrete panels supplied by Construtora DoisA, a collaborator of the research project REPESC - Rede de Pesquisa em Eficiência Energética de Sistemas Construtivos (Research Network on Energy Efficiency of Construction Systems), part of the Habitare program. The methodology examined the problem, reviewed the bibliography, and analyzed the main aspects related to computer simulation of the thermal performance of buildings, such as the climatic characterization of the region under study and the users' thermal comfort requirements. DesignBuilder 1.2 was used as the simulation tool, and theoretical alterations to the prototype were simulated and compared with the thermal comfort parameters adopted from the area's current technical literature. The comparative analyses were carried out through graphical outputs for a better understanding of air temperature amplitudes and thermal comfort conditions. The data used to characterize the external air temperature were obtained from the Test Reference Year (TRY) defined for the study area (Natal-RN); the author also compared the TRY with data recorded in 2006, 2007, and 2008 at the Davis Precision weather station located at the Instituto Nacional de Pesquisas Espaciais (INPE-CRN, National Institute of Space Research), in an area neighboring UFRN's Central Campus. The conclusions drawn from the comparisons between the computer simulations and the records obtained in the studied prototype indicate that simulating naturally ventilated buildings is quite a complex task, mainly because of the application's limitations regarding the complexity of air flow phenomena, the influence of the comfort conditions of the surroundings, and the climate records. Finally, regarding the use of DesignBuilder 1.2 in the present study, one may conclude that it is a good tool for computer simulation, although it needs some adjustments to improve the reliability of its use. Continued research is needed, considering the occupancy of the prototype by users as well as the thermal loads of the equipment, in order to check sensitivity
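The comparison between simulated and measured series described above reduces to a few summary statistics. A minimal sketch follows, with hypothetical hourly values; the choice of metrics (mean bias error and RMSE) is an illustrative assumption, not necessarily the dissertation's stated method.

    # Minimal sketch (assumption): comparing simulated and measured internal air
    # temperatures. The hourly values below are hypothetical.
    import math

    simulated = [27.1, 26.8, 26.5, 26.9, 28.0, 29.4, 30.1, 30.6]   # DesignBuilder output (C)
    measured  = [26.5, 26.2, 26.0, 26.6, 27.8, 29.9, 30.9, 31.2]   # HOBO logger record (C)

    diffs = [s - m for s, m in zip(simulated, measured)]
    mbe   = sum(diffs) / len(diffs)                              # mean bias error
    rmse  = math.sqrt(sum(d * d for d in diffs) / len(diffs))    # root-mean-square error

    print(f"MBE = {mbe:+.2f} C, RMSE = {rmse:.2f} C")
    print(f"amplitude: simulated {max(simulated)-min(simulated):.1f} C, "
          f"measured {max(measured)-min(measured):.1f} C")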
Abstract:
Natural ventilation is the most important passive strategy for providing thermal comfort in hot and humid climates, and a significant low-energy strategy. However, a naturally ventilated building demands more attention in the architectural design than a conventional air-conditioned building, and the results are less reliable. This thesis therefore focuses on software and methods to predict natural ventilation performance from the point of view of the architect, who has limited resources and limited knowledge of fluid mechanics. A typical prefabricated building was modelled because of its simplified geometry, low cost, and frequent occurrence on the local campus. First, the study used computational fluid dynamics (CFD) software to simulate the air flow outside and inside the building; a series of approaches was developed to make the simulations feasible, at some cost to the fidelity of the results. Second, the results of the CFD simulations were used as input to an energy tool to simulate the thermal performance under different air renewal rates. Third, the resulting temperatures were assessed in terms of thermal comfort, and complementary simulations were carried out to refine the analyses. The results show the potential of these tools; however, the discussion of the simplifications adopted, the limitations of the tools, and the level of knowledge of the average architect is the major contribution of this study.
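As an illustration of the second step, coupling the CFD results to the energy tool ultimately amounts to supplying a ventilation rate. A minimal sketch, assuming a hypothetical opening flow rate and room volume; the conversion to air changes per hour is a common convention, not a detail confirmed by the abstract.

    # Minimal sketch (assumption): turning a CFD-derived volume flow rate through
    # an opening into air changes per hour (ACH), the kind of ventilation input an
    # energy tool accepts. Numbers are hypothetical.
    def air_changes_per_hour(flow_m3_s: float, room_volume_m3: float) -> float:
        """ACH = volumetric flow (m3/h) divided by room volume (m3)."""
        return flow_m3_s * 3600.0 / room_volume_m3

    ach = air_changes_per_hour(0.35, 60.0)   # 0.35 m3/s through the opening of a 60 m3 room
    print(f"{ach:.1f} air changes per hour")  # ~21 ACH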
Abstract:
This work focuses on the creation and application of dynamic simulation software to study the hard metal (WC-Co) structure. The hardware used to increase computing capacity was a GeForce 9600 GT GPU together with PhysX technology, originally created to make games more realistic. The software simulates the three-dimensional carbide structure within a cubic box, where tungsten carbide (WC) grains are modeled as triangular prisms and truncated triangular prisms. The program proved effective in verification tests, ranging from the calculation of parameter measures to its capacity to increase the number of dynamically simulated particles. It was possible to investigate both the mean values and the distributions of the stereological parameters used to characterize the carbide structure through cutting planes. Based on the cutting planes of the analyzed structures, we investigated the linear intercepts, the area intercepts, and the section perimeter of the intercepted grains, as well as the binder phase of the structure, by calculating the mean value and the distribution of the free path. Since the literature shows, almost consensually, that the distribution of linear intercepts is lognormal, suggesting that the grain size distribution is also lognormal, a routine was added to the program to allow a more detailed investigation of this issue. We observed that, under certain values of the parameters that define the shape and size of the prismatic grains, it is possible to obtain distributions of linear intercepts that approach the lognormal shape. Over a number of simulations, we observed that the distribution curves of the linear and area intercepts, as well as of the section perimeter, are consistent with studies based on static computer simulation of these parameters.
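The lognormal analysis mentioned above can be illustrated with a short sketch. The intercept lengths and distribution parameters below are hypothetical; the fit simply estimates the lognormal parameters from the logarithms of the intercepts and is not the dissertation's routine.

    # Minimal sketch (assumption): fitting a lognormal distribution to a set of
    # linear intercept lengths measured on cutting planes.
    import math
    import random

    random.seed(1)
    # Hypothetical intercept lengths (micrometres); in the dissertation these come
    # from the simulated WC-Co structure.
    intercepts = [random.lognormvariate(0.3, 0.4) for _ in range(5000)]

    logs = [math.log(x) for x in intercepts]
    mu_hat = sum(logs) / len(logs)
    sigma_hat = math.sqrt(sum((v - mu_hat) ** 2 for v in logs) / (len(logs) - 1))

    # Mean of a lognormal variable: exp(mu + sigma^2 / 2)
    mean_intercept = math.exp(mu_hat + sigma_hat ** 2 / 2)
    print(f"lognormal fit: mu={mu_hat:.3f}, sigma={sigma_hat:.3f}, mean={mean_intercept:.3f} um")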
Abstract:
This work proposes an environment for programming the programmable logic controllers (PLCs) applied to oil wells that use the BCP (progressive cavity pumping) artificial lift method. The environment provides an editor based on the Sequential Function Chart (SFC) for PLC programming; this language was chosen because it is high level and part of the international standard IEC 61131-3. The use of these control programs on a real PLC is made possible by an intermediate language level based on the PLCopen TC6 XML specification. For testing and validation of the control programs, an area is provided for viewing variables obtained through communication with a real PLC. The main contribution of this work is therefore a computational environment that allows modeling, testing, and validating controls represented in SFC and applied to oil wells that use the BCP artificial lift method.
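As a rough illustration of what an SFC-based editor manipulates, the sketch below holds steps and transitions in memory and runs one evaluation cycle. The step names, data layout, and use of eval are hypothetical simplifications, not the environment's actual design or the PLCopen TC6 schema.

    # Minimal sketch (assumption): an in-memory representation of an SFC program
    # before export to an XML interchange format. Names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Transition:
        condition: str          # boolean expression evaluated against PLC variables
        target: str             # step activated when the condition holds

    @dataclass
    class Step:
        name: str
        actions: list = field(default_factory=list)      # actions executed while active
        transitions: list = field(default_factory=list)

    def scan_cycle(active: str, steps: dict, variables: dict) -> str:
        """One SFC evaluation: fire the first enabled transition of the active step."""
        for t in steps[active].transitions:
            if eval(t.condition, {}, variables):          # sketch only; a real editor parses safely
                return t.target
        return active

    steps = {
        "Idle":    Step("Idle",    transitions=[Transition("start_cmd", "Pumping")]),
        "Pumping": Step("Pumping", actions=["run_motor"],
                        transitions=[Transition("stop_cmd or fault", "Idle")]),
    }
    state = scan_cycle("Idle", steps, {"start_cmd": True, "stop_cmd": False, "fault": False})
    print(state)   # Pumping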
Abstract:
Oil production and exploration techniques have evolved in recent decades in order to increase fluid flow rates and optimize the use of the required equipment. The basic principle of the Electric Submersible Pumping (ESP) lift method is to use an electric downhole motor to drive a centrifugal pump and transport the fluids to the surface. ESP has been gaining ground among artificial lift methods because of its ability to handle large liquid flow rates in onshore and offshore environments. The performance of a well equipped with an ESP system is intrinsically related to the operation of the centrifugal pump, the component that converts motor power into head. In the present work, a computer model was developed to analyze the three-dimensional flow in a centrifugal pump used in Electric Submersible Pumping. Using the commercial program ANSYS® CFX®, initially with water as the working fluid, the geometry and simulation parameters were defined in order to approximate the flow inside the channels of the pump impeller and diffuser. Three different geometry conditions were first tested to determine which was most suitable for the problem. After choosing the most appropriate geometry, three mesh conditions were analyzed and the resulting values were compared with the experimental head characteristic curve provided by the manufacturer. The results approached the experimental curve, and the simulation time and model convergence were satisfactory considering that the problem involves numerical analysis. After the tests with water, oil was used in the simulations, and the results were compared with a methodology used in the petroleum industry to correct for viscosity. In general, for the water and oil models, the single-phase results were consistent with the experimental curves, and the three-dimensional computer models provide a preliminary basis for analyzing the two-phase flow inside the channels of centrifugal pumps used in ESP systems.
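The head comparison mentioned above rests on a standard conversion from the pressure rise computed by the CFD model. A minimal sketch, with a hypothetical pressure rise and fluid density; the actual operating data come from the simulations and the manufacturer's curve.

    # Minimal sketch: converting a total pressure rise into pump head,
    # H = delta_p / (rho * g). Numbers are hypothetical.
    RHO_WATER = 997.0      # kg/m3
    G = 9.81               # m/s2

    def head_from_pressure_rise(delta_p_pa: float, rho: float = RHO_WATER) -> float:
        """Head in metres of fluid column for a given total pressure rise."""
        return delta_p_pa / (rho * G)

    delta_p = 55_000.0     # Pa, e.g. outlet minus inlet total pressure for one stage
    print(f"Head = {head_from_pressure_rise(delta_p):.2f} m")   # ~5.6 m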
Abstract:
Knowledge of the native prokaryotes in hazardous locations favors the application of biotechnology for bioremediation. Cultivation-independent strategies and metagenomics contribute to further microbiological knowledge, enabling studies of non-cultivable organisms regarding the native microbiological status and its potential role in the bioremediation of, for example, polycyclic aromatic hydrocarbons (PAHs). Considering that the mangrove biome is a fragile and critical interface bordering the ocean, this study characterizes the native mangrove microbiota with potential for PAH biodegradation, using a molecular biomarker for detection and assessing bacterial diversity by PCR in areas under the influence of oil companies in the Potiguar Petroleum Basin (BPP). PcaF, a metabolic enzyme, was chosen as the molecular biomarker in a PCR-DGGE approach to detect PAH-degrading prokaryotes. The PCR-DGGE fingerprints obtained from samples collected in Paracuru-CE, Fortim-CE, and Areia Branca-RN revealed fluctuations of the microbial communities according to the sampling period and in response to the impact of oil. In the analysis of oil industry interference on the microbial communities, oil was found to be a determinant of microbial diversity in Areia Branca-RN and Paracuru-CE, whereas Fortim-CE probably has no direct influence from oil activity. In order to obtain data for a better understanding of PAH transport and biodegradation, in silico modeling and simulation studies were conducted, yielding 3-D models of proteins involved in the degradation of phenanthrene and in the transport of PAHs, as well as a 3-D model of the PcaF enzyme used as the molecular marker in this study. Docking studies with substrates and products were carried out for a better understanding of the PAH transport and catalysis mechanisms.
Abstract:
This study proposes a computational mechanism capable of enabling tactile communication between individuals with visual impairment (blindness or low vision) over the Internet or a local area network (LAN). The work was developed within the research projects currently carried out at the LAI (Laboratory of Integrated Accessibility) of the Federal University of Rio Grande do Norte. The research involved a prototype for geometry recognition, tested with blind students from the Institute of Education and Rehabilitation of the Blind of Rio Grande do Norte (IERC-RN), located in the Alecrim neighborhood of Natal-RN. In addition, another prototype was developed to test communication over a local network and over the Internet. To analyze the data, a combined qualitative and quantitative approach was used, with simple statistical techniques such as percentages and averages supporting subjective interpretations. The results provide an analysis of the extent to which the implementation can contribute to the socialization and learning of the visually impaired. Finally, recommendations are made for future research to improve the proposed mechanism.
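For the networked part of the study, the essential idea is sending a recognized shape identifier from one device to another. The sketch below is a hypothetical TCP exchange on localhost; the port and message format are illustrative and are not taken from the dissertation's prototype.

    # Minimal sketch (assumption): exchanging a recognized geometry identifier
    # between two devices over a LAN using TCP sockets.
    import socket
    import threading

    HOST, PORT = "127.0.0.1", 5050
    ready = threading.Event()

    def receiver():
        # Listens for one message describing the shape recognized on the remote device.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind((HOST, PORT))
            srv.listen(1)
            ready.set()
            conn, _ = srv.accept()
            with conn:
                print("received:", conn.recv(64).decode())    # e.g. "GEOMETRY:TRIANGLE"

    t = threading.Thread(target=receiver)
    t.start()
    ready.wait()

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"GEOMETRY:TRIANGLE")   # identifier of the shape felt on the tactile device

    t.join()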
Abstract:
Worldwide, the demand for transportation services for persons with disabilities, the elderly, and persons with reduced mobility has increased in recent years. The population is aging, governments need to adapt to this reality, and this fact may represent business opportunities for companies. It is in this context that the Programa de Acessibilidade Especial porta a porta (PRAE), a door-to-door public transportation service in the city of Natal-RN, Brazil, operates. The research presented in this dissertation develops a scheduling model to assist the decision-making process of the service's managers. To that end, an algorithm was created based on methods for generating approximate solutions, known as heuristics. The purpose of the model is to increase the number of people served by the PRAE, given the available fleet, by generating optimized route schedules. The PRAE is a vehicle routing and scheduling dial-a-ride problem (DARP), the most complex class among routing problems. The solution method was validated by comparing the results produced by the model with those of the current scheduling method. The model is expected to increase the current capacity to serve transport requests.
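To make the heuristic idea concrete, the sketch below shows a greedy assignment of ride requests to vehicles. It is a deliberately simplified illustration, ignoring travel distances and the pairing of pickups and drop-offs, and is not the dissertation's algorithm; all request and fleet data are hypothetical.

    # Minimal sketch (assumption): greedy insertion in the spirit of dial-a-ride
    # scheduling, checking only time feasibility.
    from dataclasses import dataclass

    @dataclass
    class Request:
        rid: int
        pickup_time: float       # desired pickup, minutes after midnight
        ride_minutes: float      # estimated ride duration

    @dataclass
    class Vehicle:
        vid: int
        free_at: float = 0.0     # time at which the vehicle finishes its last ride
        served: list = None

    def schedule(requests, vehicles, max_wait=15.0):
        """Assign each request to the first vehicle able to pick it up within max_wait."""
        for v in vehicles:
            v.served = []
        unserved = []
        for req in sorted(requests, key=lambda r: r.pickup_time):
            candidates = [v for v in vehicles if v.free_at <= req.pickup_time + max_wait]
            if not candidates:
                unserved.append(req)
                continue
            best = min(candidates, key=lambda v: v.free_at)
            start = max(best.free_at, req.pickup_time)
            best.free_at = start + req.ride_minutes
            best.served.append(req.rid)
        return unserved

    requests = [Request(1, 480, 30), Request(2, 485, 20), Request(3, 490, 25)]
    vehicles = [Vehicle(1), Vehicle(2)]
    left_out = schedule(requests, vehicles)
    print([(v.vid, v.served) for v in vehicles], "unserved:", [r.rid for r in left_out])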
Abstract:
Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that they cannot be solved in polynomial time. Initially, these solutions were based on heuristics; currently, metaheuristics are more commonly used for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of an "Operon" heuristic for constructing the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, based mainly on statistical methodology, namely Cluster Analysis and Principal Component Analysis; and the use of statistical analyses appropriate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains that promote an "intelligent" search in the solution space. The approach is applied to the Traveling Salesman Problem (TSP) through a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by a minimum threshold on the coefficient of variation of the fitness function of the individuals, computed over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms, and a Simulated Annealing algorithm. Three performance analyses are proposed. The first uses Logistic Regression, based on the probability of the tested algorithm finding an optimal solution for a TSP instance. The second uses Survival Analysis, based on the probability of reaching an optimal solution within the observed execution time. The third uses non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one Euclidean TSP instances with up to 1,655 cities. The first two experiments address the adjustment of four parameters of the ProtoG algorithm in an attempt to improve its performance; the last four evaluate the performance of ProtoG in comparison with the three other algorithms. For these sixty-one instances, statistical tests provide evidence that ProtoG performs better than the other three algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which performance was evaluated through the PES, the average PES obtained with ProtoG was less than 1% in almost half of the instances, reaching its largest average, 3.52%, for an instance with 1,173 cities. ProtoG can therefore be considered a competitive algorithm for solving the TSP, since it is not rare to find average PES values greater than 10% reported in the literature for instances of this size.
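Two of the quantities mentioned above are simple to state in code: the percent error of the solution (PES) and the coefficient-of-variation trigger for renewing part of the population. A minimal sketch with hypothetical tour costs, fitness values, and threshold.

    # Minimal sketch: PES relative to the best known cost, and a coefficient-of-
    # variation test used to decide when to renew part of the population.
    import statistics

    def pes(found_cost: float, best_known: float) -> float:
        """Percent by which the found tour exceeds the best solution in the literature."""
        return 100.0 * (found_cost - best_known) / best_known

    def needs_renewal(fitness_values, cv_threshold=0.05) -> bool:
        """Renew part of the population when fitness diversity (CV) falls below a limit."""
        mean = statistics.mean(fitness_values)
        return statistics.stdev(fitness_values) / mean < cv_threshold

    print(f"PES = {pes(21500, 21282):.2f}%")          # hypothetical tour cost vs. a best-known value
    print(needs_renewal([980, 975, 978, 981, 979]))    # True: the population has converged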
Abstract:
This work proposes a computational environment for teaching control systems, called ModSym. The software implements a graphical interface for modeling linear physical systems and shows, step by step, the processing required to obtain mathematical models of these systems. A physical system can be represented in the software in three different ways: as a graphical diagram built from elements of the electrical, translational mechanical, rotational mechanical, and hydraulic domains; as a bond graph; or as a signal flow diagram. Once the system is represented, ModSym can compute its transfer functions in symbolic form using Mason's rule. The software also computes transfer functions in numerical form, as well as parametric sensitivity functions. The work also proposes an algorithm to obtain the signal flow graph of a physical system from its bond graph. This algorithm and the system analysis methodology known as the Network Method made it possible to use Mason's rule to compute transfer functions of the systems modeled in the software.
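Mason's rule, which ModSym applies symbolically, can be illustrated on the smallest possible case: a single forward path with one feedback loop. The sketch below uses sympy and illustrates the rule itself, not ModSym's implementation.

    # Minimal sketch: Mason's gain rule applied by hand to a single-loop
    # signal flow graph with forward gain G and feedback branch -H.
    import sympy as sp

    G, H = sp.symbols("G H")

    P1 = G                      # the only forward path
    loops = [-G * H]            # the only loop gain

    Delta = 1 - sum(loops)      # no non-touching loop pairs in this graph
    Delta1 = 1                  # the forward path touches the only loop

    T = sp.simplify(P1 * Delta1 / Delta)
    print(T)                    # G/(G*H + 1)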
Abstract:
Several research lines show that sleep favors memory consolidation and learning. It has been proposed that the cognitive role of sleep derives from a global scaling of synaptic weights, able to homeostatically restore the ability to learn new things by erasing memories overnight. This phenomenon is typical of slow-wave sleep (SWS) and characterized by non-Hebbian mechanisms, i.e., mechanisms independent of synchronous neuronal activity. Another view holds that sleep also triggers the specific enhancement of synaptic connections, embossing certain mnemonic traces onto a lattice of synaptic weights rescaled each night. Such embossing is understood as the combination of Hebbian and non-Hebbian mechanisms, capable of increasing and decreasing, respectively, the synaptic weights in complementary circuits, leading to selective memory improvement and a restructuring of the synaptic configuration (SC) that can be crucial for the generation of new behaviors ("insights"). Empirical findings indicate that Hebbian plasticity during sleep is initiated in the transition from SWS to the rapid eye movement (REM) stage, possibly due to the significant differences between the firing-rate regimes of the two stages and the up-regulation of factors involved in long-term synaptic plasticity. In this study the homeostasis and embossing theories were compared using an artificial neural network (ANN) fed with action potentials recorded in the hippocampus of rats during the sleep-wake cycle. In the simulation in which the ANN did not apply long-term plasticity mechanisms during sleep (the SWS-REM transition), the synaptic weight distribution was inexorably rescaled, its mean value becoming proportional to the input firing rate and erasing the synaptic weight pattern that had been established initially. In contrast, when long-term plasticity was modeled during the SWS-REM transition, an increase of synaptic weights was observed in the range of initial/low values, effectively redistributing the weights so as to reinforce a subset of synapses over time. The results suggest that positive regulation arising from long-term plasticity can completely change the role of sleep: its absence leads to forgetting; its presence leads to a positive mnemonic change.
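The two hypotheses compared above can be caricatured in a few lines: a purely homeostatic global rescaling versus an "embossing" that potentiates a chosen subset of synapses before the rescaling. The weight values, gains, and set point below are hypothetical; the sketch is not the ANN used in the study.

    # Minimal sketch (assumption): homeostatic scaling alone vs. Hebbian
    # potentiation followed by the same scaling, applied to a toy weight vector.
    import numpy as np

    rng = np.random.default_rng(0)
    weights = rng.uniform(0.0, 1.0, size=1000)

    def homeostatic_scaling(w, target_mean=0.3):
        """Non-Hebbian: multiply all weights so the mean returns to a set point."""
        return w * (target_mean / w.mean())

    def embossing(w, reinforced_idx, ltp_gain=1.5, target_mean=0.3):
        """Hebbian + non-Hebbian: potentiate chosen synapses, then rescale globally."""
        w = w.copy()
        w[reinforced_idx] *= ltp_gain
        return homeostatic_scaling(w, target_mean)

    trace = rng.choice(weights.size, size=50, replace=False)   # hypothetical memory trace
    only_scaled = homeostatic_scaling(weights)
    embossed = embossing(weights, trace)

    print(only_scaled[trace].mean() / only_scaled.mean())   # ~1: trace indistinguishable
    print(embossed[trace].mean() / embossed.mean())         # >1: trace stands out after "sleep"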
Abstract:
The Electrical Submersible Pump (ESP) has been one of the most appropriate lifting solutions for onshore and offshore applications. Typical conditions in these applications include adverse temperatures, viscous fluids, and gassy environments. The difficulty of maintaining and setting up the equipment contributes to the increasing cost of oil production in deep water; optimization through automation can therefore be an excellent approach to reducing costs and failures in subsurface equipment. This work describes a computer simulator for the ESP artificial lift method. The tool reproduces the dynamic behavior of an ESP installation, considering the electric energy source and transmission model for the motor, the electric motor model (including thermal calculation), tubing flow simulation, centrifugal pump behavior including the effects of the nature of the fluid, and reservoir requirements. In addition, there is a three-dimensional animation for each ESP subsystem (transformer, motor, pump, seal, gas separator, command unit). The simulator is proposed as an improvement in well monitoring aimed at maximizing production. Currently, proprietary simulators are tied to specific equipment manufacturers, so equipment from other manufacturers cannot be simulated; the proposed approach supports equipment from different manufacturers.
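A single quasi-static step of the kind of coupling such a simulator performs can be sketched as follows: reservoir inflow from a productivity index, a quadratic approximation of the pump head curve, and the head required by the tubing. All coefficients and the simplified head balance are hypothetical, not the simulator's actual models.

    # Minimal sketch (assumption): one quasi-static balance between reservoir
    # inflow, pump head, and required head. All numbers are hypothetical.
    G = 9.81
    RHO = 900.0                      # kg/m3, a medium oil

    def reservoir_inflow(p_res, p_wf, PI=1.0):
        """q (m3/d) = PI * (reservoir pressure - flowing bottomhole pressure), pressures in bar."""
        return max(PI * (p_res - p_wf), 0.0)

    def pump_head(q, stages=150):
        """Per-stage head approximated as H = a - b*q^2, multiplied by the number of stages."""
        return stages * max(6.0 - 5e-5 * q ** 2, 0.0)

    def required_head(depth=1200.0, p_wellhead_bar=10.0, p_wf_bar=60.0):
        """Simplified: vertical lift plus wellhead back-pressure minus bottomhole pressure (no friction)."""
        return depth + (p_wellhead_bar - p_wf_bar) * 1e5 / (RHO * G)

    q = reservoir_inflow(p_res=180.0, p_wf=60.0)      # 120 m3/d
    balance = pump_head(q) - required_head()          # >0: pump over-delivers, rate would rise
    print(f"q = {q:.0f} m3/d, head balance = {balance:+.0f} m")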
Abstract:
Among the results of the AutPoc Project (Automation of Wells), established between UFRN and Petrobras with the support of CNPq, FINEP, CTPETRO, and FUNPEC, a simulator was developed for oil wells equipped with the continuous gas-lift method. Gas-lift is a lift method widely used in offshore production; its basic concept is to inject gas at the bottom of the producing well to make the oil less dense and thereby ease its displacement from the reservoir to the surface. Based on tables and equations that condense the largest amount of information on the characteristics of the reservoir, the well, and the gas injection valves, the simulator uses successive interpolations to reproduce curves representative of the physical behavior of the characteristic variables. With a simulator that brings the physical conditions of an oil well close to those of a computer, peculiar behaviors can be analyzed at much higher speeds, since the time constants of the system are large, and the cost of field assays can be reduced. The simulator is very versatile, notably in analyzing the influence of parameters such as static pressure, gas-liquid ratio, wellhead pressure, and BSW (basic sediments and water) on the required bottomhole pressure curves, and in obtaining the well performance curve, on which control and optimization rules can be simulated. Regarding control rules, the simulator allows two simulation modes: control applied via software embedded in the simulator itself, or through external controllers. This means the simulator can be used as a tool for validating control algorithms. Given the capabilities described above, another powerful application naturally arises: the didactic use of the tool, which can be employed in training and refresher courses for engineers.
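The table-and-interpolation idea is easy to illustrate: given a hypothetical gas-lift performance table, interpolation yields a dense curve on which an operating point can be selected. The sketch below uses numpy and invented table values; it is not the simulator's data.

    # Minimal sketch (assumption): interpolating a gas-lift performance table
    # relating gas injection rate to liquid production.
    import numpy as np

    # Hypothetical table condensed from well/reservoir/valve characteristics.
    gas_injection = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])       # thousand m3/d
    liquid_rate   = np.array([55.0, 120.0, 160.0, 175.0, 172.0, 160.0])  # m3/d

    # Dense interpolation of the performance curve and its best operating point.
    qgi = np.linspace(0.0, 100.0, 501)
    qliq = np.interp(qgi, gas_injection, liquid_rate)
    best = qgi[np.argmax(qliq)]
    print(f"injection near {best:.0f} thousand m3/d maximizes production ({qliq.max():.0f} m3/d)")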
Abstract:
This work proposes a computational solution, developed as a software tool, to construct the species-specific primers used to improve PCR-based diagnosis of plant viruses. Primers are indispensable to the PCR reaction and provide the specificity of the diagnosis. A primer is a short, synthetic, single-stranded piece of DNA used as a starter in the PCR technique; it flanks the sequence to be amplified. Species-specific primers mark the known start and end of the region that the polymerase enzyme will amplify in a given species, i.e., they are specific to a single species. Thus, the main objective of this work is to automate the primer selection process, optimizing the specificity of the chosen primers relative to the traditional method.
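The core of automated primer selection is a filtered scan of the target sequence. The sketch below uses a fixed primer length, a GC-content window, and the Wallace rule for melting temperature; the thresholds and the sequence are hypothetical, and the dissertation's actual selection criteria may differ.

    # Minimal sketch (assumption): scanning a target sequence for primer
    # candidates, filtering by GC content and a simple melting-temperature estimate.
    def melting_temp(primer: str) -> float:
        """Wallace rule: Tm = 2*(A+T) + 4*(G+C), valid for short oligos."""
        at = primer.count("A") + primer.count("T")
        gc = primer.count("G") + primer.count("C")
        return 2 * at + 4 * gc

    def gc_content(primer: str) -> float:
        return (primer.count("G") + primer.count("C")) / len(primer)

    def candidate_primers(sequence: str, length=20, gc_range=(0.4, 0.6), tm_range=(52, 62)):
        """Return (position, primer) pairs that pass the GC and Tm filters."""
        out = []
        for i in range(len(sequence) - length + 1):
            p = sequence[i:i + length]
            if gc_range[0] <= gc_content(p) <= gc_range[1] and tm_range[0] <= melting_temp(p) <= tm_range[1]:
                out.append((i, p))
        return out

    seq = "ATGGCGTACCTTGACGATCGTTAGCGCTAAGGCTTACGATCGGATCCTAG"   # hypothetical viral fragment
    print(candidate_primers(seq)[:3])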