850 results for Computer vision. FPGA. Platform-oriented systems


Relevance:

30.00%

Publisher:

Abstract:

This Master's degree dissertation makes a comparative study between internal air temperature data simulated with the thermal simulation application DesignBuilder 1.2 and data recorded in loco with HOBO® Temp Data Loggers in a Social Housing Prototype (HIS), located at the Central Campus of the Federal University of Rio Grande do Norte (UFRN). The prototype was designed and built following the thermal comfort strategies recommended for the local climate, using cellular concrete panels supplied by Construtora DoisA, a collaborator in the research project REPESC - Rede de Pesquisa em Eficiência Energética de Sistemas Construtivos (Research Network on Energy Efficiency of Construction Systems), part of the Habitare program. The methodology examined the problem, reviewed the bibliography, and analyzed the major aspects related to computer simulation of the thermal performance of buildings, such as the climate characterization of the region under study and the users' thermal comfort demands. DesignBuilder 1.2 was used as the simulation tool; theoretical alterations were applied to the prototype model, and the results were compared with the thermal comfort parameters adopted from the area's current technical literature. The comparative studies were analyzed through graphical outputs for a better understanding of air temperature amplitudes and thermal comfort conditions. The data used for the characterization of external air temperature were obtained from the Test Reference Year (TRY) defined for the study area (Natal-RN). The author also performed comparative studies with data recorded in 2006, 2007 and 2008 at the Davis Precision Station weather station, located at the Instituto Nacional de Pesquisas Espaciais (INPE-CRN, National Institute of Space Research), in an area neighboring UFRN's Central Campus.
The conclusions drawn from the comparisons between the computer simulations and the records obtained in the studied prototype point out that simulating naturally ventilated buildings is quite a complex task, owing to the application's limitations, mainly the complexity of air flow phenomena, the influence of the surroundings on comfort conditions, and the climate records. Regarding the use of DesignBuilder 1.2 in the present study, one may conclude that it is a good tool for computer simulation, although it needs some adjustments to improve its reliability. Continued research is needed, considering the occupancy of the prototype as well as the thermal loads of equipment, in order to check sensitivity.
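As a rough illustration of the kind of simulated-versus-measured comparison the dissertation performs graphically, one could quantify the agreement between the two temperature series with a mean absolute error; the function below is a generic sketch, not the author's method.

```python
def mean_abs_error(simulated, measured):
    """Mean absolute difference between a simulated and a measured
    temperature series (same length, same time steps), one simple way
    to summarize the agreement studied graphically in the work."""
    assert len(simulated) == len(measured), "series must be aligned"
    return sum(abs(s - m) for s, m in zip(simulated, measured)) / len(simulated)
```

For example, simulated hourly values of 30 °C and 31 °C against measured values of 29 °C and 33 °C give a mean absolute error of 1.5 °C.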

Relevance:

30.00%

Publisher:

Abstract:

Natural ventilation is the most important passive strategy for providing thermal comfort in hot and humid climates and a significant low-energy strategy. However, a naturally ventilated building demands more attention in the architectural design than a conventional air-conditioned building, and the results are less reliable. Therefore, this thesis focuses on software tools and methods to predict natural ventilation performance from the point of view of the architect, with limited resources and limited knowledge of fluid mechanics. A typical prefabricated building was modelled due to its simplified geometry, low cost and frequency on the local campus. Firstly, the study emphasized the use of computational fluid dynamics (CFD) software to simulate the air flow outside and inside the building; a series of approaches were developed to make the simulations feasible, at some cost to result fidelity. Secondly, the results of the CFD simulations were used as input to an energy tool to simulate the thermal performance under different air renewal rates. Thirdly, the resulting temperatures were assessed in terms of thermal comfort, with complementary simulations carried out to refine the analyses. The results show the potential of these tools; however, the discussion of the simplifications adopted, the limitations of the tools and the level of knowledge of the average architect is the major contribution of this study.
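The coupling between air renewal rates and thermal performance can be illustrated with the textbook sensible-heat-removal relation Q = ρ·cp·(ACH·V/3600)·ΔT. The sketch below shows that generic relation only; it is not part of the thesis's CFD-to-energy workflow, and the constants are standard air properties.

```python
RHO_AIR = 1.2    # air density, kg/m^3 (standard-condition assumption)
CP_AIR = 1005.0  # specific heat of air, J/(kg*K)

def ventilation_heat_removal(ach, volume_m3, t_in, t_out):
    """Sensible heat removed by ventilation (W) at a given air-change
    rate: Q = rho * cp * (ACH * V / 3600) * (Tin - Tout)."""
    flow = ach * volume_m3 / 3600.0  # volumetric flow, m^3/s
    return RHO_AIR * CP_AIR * flow * (t_in - t_out)
```

For instance, 36 air changes per hour in a 100 m³ room that is 2 K warmer than outdoors removes about 2.4 kW of sensible heat.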

Relevance:

30.00%

Publisher:

Abstract:

In the operational context of industrial processes an alarm is, by definition, a warning to the operator that an action with limited time to respond is required, while an event is a change-of-state information item that does not require action by the operator and therefore should not be annunciated, only stored for maintenance analysis, incident investigation and signaling/monitoring (EEMUA, 2007). However, alarms and events are often confused and improperly configured in the same way by developers of automation systems. This practice results in a high number of pseudo-alarms during the operation of industrial processes. The excessive number of alarms is a major obstacle to improving operational efficiency, making it difficult to identify problems and increasing the time needed to respond to abnormalities. The main consequences of this scenario are increased risk to personal safety, facilities and the environment, and loss of production. The aim of this work is to present a philosophy for configuring a supervision and control system so as to reduce the number of pseudo-alarms and increase the reliability of the information the system provides. A real case study was conducted in the automation system of Petrobras's offshore hydrocarbon production in Rio Grande do Norte in order to validate the application of this new methodology. The work followed the premises of the tool presented in ISA SP18.2 (2009), called the "alarm life cycle". After the implementation of the methodology there was a significant reduction in the number of alarms.
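One common technique for suppressing pseudo-alarms of the kind this work targets is on-delay filtering, in which an alarm is annunciated only after the variable stays beyond its limit for a minimum time. The sketch below illustrates that generic idea; it is not the configuration philosophy proposed in the study.

```python
def on_delay_alarm(samples, limit, delay_samples):
    """Annunciate an alarm only when the measured variable exceeds its
    limit for `delay_samples` consecutive readings, filtering out the
    brief excursions that would otherwise raise pseudo-alarms."""
    alarms = []
    count = 0
    for value in samples:
        count = count + 1 if value > limit else 0  # reset on any dip below limit
        alarms.append(count >= delay_samples)
    return alarms
```

With a limit of 10 and a three-sample delay, a single spiky excursion raises no alarm, while a sustained one does.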

Relevance:

30.00%

Publisher:

Abstract:

Oil production and exploration techniques have evolved in recent decades in order to increase fluid flow rates and optimize how the required equipment is used. The Electric Submersible Pumping (ESP) lift method is based on an electric downhole motor driving a centrifugal pump that transports the fluids to the surface. ESP has been gaining ground among artificial lift methods due to its ability to handle large liquid flow rates in onshore and offshore environments. The performance of a well equipped with an ESP system is intrinsically related to the operation of the centrifugal pump, which has the function of converting motor power into head. In the present work, a computer model was developed to analyze the three-dimensional flow in a centrifugal pump used in Electric Submersible Pumping. Using the commercial program ANSYS® CFX®, initially with water as the working fluid, the geometry and simulation parameters were defined in order to approximate the flow inside the channels of the pump's impeller and diffuser. Three different geometry conditions were initially tested to determine which is most suitable for solving the problem. After choosing the most appropriate geometry, three mesh conditions were analyzed and the obtained values were compared with the experimental head curve provided by the manufacturer. The results approached the experimental curve, and the simulation time and model convergence were satisfactory considering the numerical nature of the problem. After the tests with water, oil was used in the simulations, and the results were compared with a methodology used in the petroleum industry for viscosity correction.
In general, for the models with water and oil, the single-phase results were coherent with the experimental curves; these three-dimensional computer models constitute a preliminary evaluation for the analysis of two-phase flow inside the channels of centrifugal pumps used in ESP systems.
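Comparing CFD results against the manufacturer's head curve relies on the standard conversion from the stage pressure rise to head, H = Δp/(ρg). The sketch below shows that conversion in generic form (water density assumed), not the study's CFX post-processing.

```python
G = 9.81  # gravitational acceleration, m/s^2

def pump_head(delta_p, rho=997.0):
    """Convert the static pressure rise across a pump stage (Pa) into
    head (meters of fluid column): H = dp / (rho * g). The default
    density is water at ~25 C; oil runs would use the oil density."""
    return delta_p / (rho * G)
```

A pressure rise of about 97.8 kPa in water corresponds to roughly 10 m of head.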

Relevance:

30.00%

Publisher:

Abstract:

This study proposes a computing device capable of enabling tactile communication between individuals with visual impairment (blindness or low vision) over the Internet or a local area network (LAN). The work was developed under the research projects currently carried out at the LAI (Laboratory of Integrated Accessibility) of the Federal University of Rio Grande do Norte. A prototype for geometry recognition was evaluated with blind students from the Institute of Education and Rehabilitation of the Blind of Rio Grande do Norte (IERC-RN), located in the Alecrim neighborhood, Natal/RN. In addition, another prototype was developed to test communication over a local network and the Internet. To analyze the data, a qualitative and quantitative approach was used, with simple statistical techniques such as percentages and averages supporting the subjective interpretations. The results offer an analysis of the extent to which the implementation can contribute to the socialization and learning of the visually impaired. Finally, some recommendations are made for future research to improve the proposed mechanism.
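The dissertation does not publish its communication protocol. Purely as an illustration, a tactile event such as a recognized geometric shape could be exchanged over a LAN as a UDP datagram, as sketched below; the host, port and message fields are hypothetical.

```python
import json
import socket

def send_tactile_event(shape, host="127.0.0.1", port=9999):
    """Send a tactile event (e.g. a recognized geometric shape) as a
    JSON datagram over the LAN. Field names, host and port are
    illustrative only; they do not reflect the prototype's protocol."""
    msg = json.dumps({"event": "tactile", "shape": shape}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(msg, (host, port))
    return msg
```

A receiving device would bind a matching UDP socket, decode the JSON and render the shape on its tactile display.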

Relevance:

30.00%

Publisher:

Abstract:

This work contributes to the documentary study of the Work Safety courses offered by the CEFETs in Brazil from the perspective of occupational health and safety management, using as references the specification OHSAS 18001 (BSI, 1999) and the guidelines provided by the ILO (2001). The theoretical research compares the technical and managerial competences in the course projects of the Work Safety programs at the CEFETs with the international references mentioned above. For the field research, questionnaires containing open and closed questions were answered by teachers and students, aiming to identify the importance of technical and managerial competences in the formation of Work Safety technicians and the minimum level of formal education that should be required to perform managerial activities in the area of Occupational Health and Safety Management Systems (SGSSO, in Portuguese). The results of the theoretical research point out differences between the projects of the Work Safety technical courses at the CEFETs from the SGSSO perspective. The field research shows that students' and teachers' opinions converge on most technical and managerial competences. Regarding academic formation, the research suggests divergences from the criterion stated by the norm ISO 19011 (ABNT, 2002).

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance:

30.00%

Publisher:

Abstract:

The goal of this work is to propose a SLAM (Simultaneous Localization and Mapping) solution based on the Extended Kalman Filter (EKF), enabling a robot to navigate through the environment using information from odometry and pre-existing lines on the floor. Initially, a segmentation step classifies each part of the image as floor or non-floor. The image processing then identifies floor lines, and the parameters of these lines are mapped to world coordinates using a homography matrix. Finally, the identified lines are used as landmarks in the SLAM in order to build a feature map. In parallel, using the corrected robot pose, its uncertainty and the non-floor part of the image, it is possible to build an occupancy grid map and generate a metric map with descriptions of the obstacles. Greater autonomy for the robot is attained by using the two obtained map types (the metric map and the feature map), making it possible to run path planning tasks in parallel with localization and mapping. Practical results are presented to validate the proposal.
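The EKF correction step that fuses a line-landmark observation into the state estimate can be sketched generically as follows; the measurement vector, Jacobian and variable names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def ekf_update(mu, Sigma, z, h, H, R):
    """One EKF correction step: fuse an observation z (e.g. the
    [rho, alpha] parameters of a detected floor line) into the state
    estimate (mu, Sigma). h is the predicted measurement at mu, H the
    measurement Jacobian, R the measurement noise covariance."""
    S = H @ Sigma @ H.T + R             # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)  # Kalman gain
    mu = mu + K @ (z - h)               # corrected mean
    Sigma = (np.eye(len(mu)) - K @ H) @ Sigma  # corrected covariance
    return mu, Sigma
```

With an identity measurement model and equal state and noise covariances, the update splits the innovation evenly, moving the mean halfway toward the observation and halving the covariance.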

Relevance:

30.00%

Publisher:

Abstract:

This dissertation describes the use of new technologies from the areas of telecommunications, networks and industrial automation to increase operational safety and obtain operational improvements on offshore oil platforms. The presented solution joins several modules from these areas, making it possible to supervise and control an offshore oil platform from an onshore station, in a manner similar to a remote control: cameras and microphones make it possible to see and hear the operational area, allowing the system operator to be virtually "present" on the platform. This reduces the need for embarked personnel, increasing operational safety. As a consequence, operational improvements are obtained through the use of a broadband, multi-service digital link, which simultaneously carries digital data (Ethernet network), telephony (VoIP), image and sound.

Relevance:

30.00%

Publisher:

Abstract:

This work analyzes the behavior of the gas flow of plunger lift wells producing to well-testing separators on offshore production platforms, with the goal of proposing a technical procedure to estimate the gas flow during the slug production period. The motivation for this work arose from wells equipped with the plunger lift method by PETROBRAS in the Ubarana offshore field, located off the coast of Rio Grande do Norte State, where the produced fluids are measured in well-testing separators at the platform. The artificial lift method called plunger lift is used when the available reservoir energy is not high enough to overcome all the load losses necessary to lift the oil from the bottom of the well to the surface continuously. The method consists, basically, of a free piston acting as a mechanical interface between the formation gas and the produced liquids, greatly increasing the well's lifting efficiency. A pneumatic control valve mounted on the flow line controls the cycles: when this valve opens, the plunger starts to move from the bottom of the well to the surface, lifting all the oil and gas above it until it reaches the well-test separator, where the fluids are measured. The well-test separator measures all the volumes produced by the well during a certain period of time called a production test. In most cases, separators are designed to measure stabilized flow, that is, reasonably constant flow, by means of electronic level and pressure controllers (PLC) and under the assumption of a steady pressure inside the separator. With plunger lift wells, the liquid and gas flows at the surface are cyclical and unstable, which causes slugs inside the separator, mainly in the gas phase, introducing significant errors into the measurement system (e.g., overrange errors).
The gas flow analysis proposed in this work is based on two mathematical models used together: i) a plunger lift well model proposed by Baruzzi [1], with later modifications by Bolonhini [2], to build a plunger lift simulator; ii) a two-phase (gas + liquid) separator model derived from the three-phase (gas + oil + water) separator model proposed by Nunes [3]. Based on these models and on field data collected from the well-test separator of the PUB-02 platform (Ubarana field), it was possible to demonstrate that the output gas flow of the separator can be estimated, with reasonable precision, from the control signal of the Pressure Control Valve (PCV). Several models from the MATLAB® System Identification Toolbox were analyzed to evaluate which one best fit the data collected in the field. For model validation, the AIC criterion was used, as well as a variant of the cross-validation criterion. The ARX model showed the best fit to the data, so a recursive algorithm (RARX) was also evaluated with real-time data. The results were quite promising, indicating the viability of estimating the output gas flow rate of a plunger lift well producing to a well-test separator from the built-in information of the PCV control signal.
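An ARX model of the kind selected in the study can be fitted by ordinary least squares over lagged inputs and outputs. The sketch below is a generic illustration, not the MATLAB® System Identification Toolbox routine actually used; here u would play the role of the PCV control signal and y the separator gas flow.

```python
import numpy as np

def fit_arx(u, y, na=2, nb=2):
    """Least-squares fit of an ARX model
    y[k] = a1*y[k-1] + ... + a_na*y[k-na] + b1*u[k-1] + ... + b_nb*u[k-nb].
    Returns the parameter vector [a1..a_na, b1..b_nb]."""
    n = max(na, nb)
    Phi, Y = [], []
    for k in range(n, len(y)):
        row = [y[k - i] for i in range(1, na + 1)]   # output lags
        row += [u[k - i] for i in range(1, nb + 1)]  # input lags
        Phi.append(row)
        Y.append(y[k])
    theta, *_ = np.linalg.lstsq(np.array(Phi), np.array(Y), rcond=None)
    return theta
```

On noise-free data generated by a known first-order ARX model, the fit recovers the true coefficients exactly.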

Relevance:

30.00%

Publisher:

Abstract:

Combinatorial optimization problems have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that such problems cannot be solved exactly in polynomial time. Initially, these solutions were focused on heuristics; currently, metaheuristics are more commonly used for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of an "Operon" heuristic for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, mainly using statistical methodology (Cluster Analysis and Principal Component Analysis); and the use of statistical analyses adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains to promote an "intelligent" search of the solution space. The Traveling Salesman Problem (TSP) is used as the application, with a transgenetic algorithm known as ProtoG. A strategy is also proposed for the renewal of part of the chromosome population, triggered by a minimum limit on the coefficient of variation of the fitness function of the individuals, computed over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms and a Simulated Annealing algorithm. Three performance analyses are proposed. The first uses Logistic Regression, based on the probability of the algorithm under test finding an optimal solution for a TSP instance. The second uses Survival Analysis, based on the probability distribution of the execution time observed until an optimal solution is achieved.
The third uses non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one Euclidean TSP instances with up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm in an attempt to improve its performance. The last four were undertaken to evaluate the performance of ProtoG in comparison with the three algorithms adopted. For these sixty-one instances, statistical tests provided evidence that ProtoG performs better than the three other algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which performance was evaluated through PES, the average PES obtained with ProtoG was less than 1% in almost half of these instances, reaching its greatest average, 3.52%, for an instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare in the literature to find reported average PES values greater than 10% for instances of this size.
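As defined above, the PES metric reduces to a one-line computation:

```python
def pes(found, best_known):
    """Percent Error of the Solution (PES): percentage by which the
    tour length found exceeds the best known solution in the literature."""
    return 100.0 * (found - best_known) / best_known
```

For example, a tour of length 1,035 on an instance whose best known solution has length 1,000 gives a PES of 3.5%.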

Relevance:

30.00%

Publisher:

Abstract:

This work develops a robustness analysis with respect to modeling errors, applied to indirect control strategies using Artificial Neural Networks (ANNs) belonging to the multilayer feedforward perceptron class with on-line training based on the gradient method (backpropagation). The presented schemes are called Indirect Hybrid Control and Indirect Neural Control. Two Robustness Theorems are presented, one for each proposed indirect control scheme, which allow the computation of the maximum steady-state control error that will occur due to the modeling error caused by the neural identifier, either for the closed-loop configuration with a conventional controller (Indirect Hybrid Control) or for the closed-loop configuration with a neural controller (Indirect Neural Control). Since the robustness analysis is restricted to the steady-state behavior of the plant, this work also includes a stability analysis, transcribed for the multilayer perceptron class of ANNs trained with the backpropagation algorithm, to assure the convergence and stability of the neural systems used. On the other hand, the boundedness of the initial transient behavior is assured by the assumption that the plant is BIBO (Bounded Input, Bounded Output) stable. The Robustness Theorems were tested on the proposed indirect control strategies, applied to regulation control of simulated examples using nonlinear plants, and the results are presented.
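A single on-line backpropagation step for the network class the analysis assumes (a multilayer feedforward perceptron, here with one tanh hidden layer and a linear output) can be sketched as below; the shapes, learning rate and names are illustrative, not the work's implementation.

```python
import numpy as np

def backprop_step(x, t, W1, W2, lr=0.1):
    """One on-line gradient (backpropagation) step minimizing the
    squared error between the network output and target t for a
    single sample x. W1: hidden weights, W2: output weights."""
    h = np.tanh(W1 @ x)            # hidden activations
    y = W2 @ h                     # linear output
    e = y - t                      # output error
    dW2 = np.outer(e, h)           # output-layer gradient
    dW1 = np.outer((W2.T @ e) * (1 - h**2), x)  # backpropagated hidden gradient
    return W1 - lr * dW1, W2 - lr * dW2
```

Iterating this step on a fixed input/target pair drives the output error toward zero, the convergence behavior the stability analysis is meant to guarantee.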

Relevance:

30.00%

Publisher:

Abstract:

Several lines of research show that sleep favors memory consolidation and learning. It has been proposed that the cognitive role of sleep derives from a global scaling of synaptic weights, able to homeostatically restore the ability to learn new things, erasing memories overnight. This phenomenon is typical of slow-wave sleep (SWS) and characterized by non-Hebbian mechanisms, i.e., mechanisms independent of synchronous neuronal activity. Another view holds that sleep also triggers the specific enhancement of synaptic connections, embossing certain mnemonic traces within a lattice of synaptic weights that is rescaled each night. Such embossing is understood as the combination of Hebbian and non-Hebbian mechanisms, capable of respectively increasing and decreasing the synaptic weights in complementary circuits, leading to selective memory improvement and to a restructuring of the synaptic configuration (SC) that can be crucial for the generation of new behaviors ("insights"). The empirical findings indicate that the initiation of Hebbian plasticity during sleep occurs in the transition from SWS to the rapid eye movement (REM) stage, possibly due to the significant differences between the firing-rate regimes of the two stages and the up-regulation of factors involved in long-term synaptic plasticity. In this study the homeostasis and embossing theories were compared using an artificial neural network (ANN) fed with action potentials recorded in the hippocampus of rats during the sleep-wake cycle. In the simulation in which the ANN did not apply long-term plasticity mechanisms during sleep (SWS-REM transition), the synaptic weight distribution was inexorably rescaled to a mean value proportional to the input firing rate, erasing the synaptic weight pattern that had been established initially.
In contrast, when long-term plasticity was modeled during the SWS-REM transition, an increase of the synaptic weights in the range of initial/low values was observed, effectively redistributing the weights so as to reinforce a subset of synapses over time. The results suggest that a positive regulation coming from long-term plasticity can completely change the role of sleep: its absence leads to forgetting; its presence leads to a positive mnemonic change.
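The non-Hebbian global rescaling that the homeostasis view describes can be caricatured in one line: every weight is multiplied by the same factor, so relative differences survive while the mean is driven to a homeostatic target. The sketch below is a minimal illustration of that idea, not the ANN model used in the study.

```python
import numpy as np

def homeostatic_rescale(w, target_mean):
    """Non-Hebbian global synaptic scaling: multiply all weights by a
    common factor so their mean matches the homeostatic target. The
    weight pattern (ratios between synapses) is preserved, but any
    information encoded in absolute values is lost."""
    return w * (target_mean / w.mean())
```

Applied to weights [1, 2, 3] with a target mean of 1, the result is [0.5, 1, 1.5]: the 1:2:3 pattern is kept while the overall strength is halved.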

Relevance:

30.00%

Publisher:

Abstract:

This work introduces a new method for environment mapping with three-dimensional information extracted from visual information, aiming at accurate robot navigation. Many approaches to 3D mapping using occupancy grids typically require high computational effort to both build and store the map. We introduce a 2.5-D occupancy-elevation grid mapping: a discrete mapping approach in which each cell stores the occupancy probability, the height of the terrain at that place in the environment and the variance of this height. This 2.5-dimensional representation allows a mobile robot to know whether a place in the environment is occupied by an obstacle and the height of that obstacle, so it can decide whether it is possible to traverse it. The sensory information necessary to construct the map is provided by a stereo vision system, modeled with a robust probabilistic approach that considers the noise present in the stereo processing. The resulting maps favor the execution of tasks such as decision making in autonomous navigation, exploration, localization and path planning. Experiments carried out with a real mobile robot demonstrate that the proposed approach yields useful maps for autonomous robot navigation.
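Since each cell stores a height and its variance, new stereo measurements can be fused per cell with a 1-D Kalman-style update. The formulation below is a generic sketch under that assumption, not necessarily the paper's exact update rule.

```python
def fuse_height(h, var, z, var_z):
    """Fuse a new height measurement z (variance var_z) into a cell's
    stored height estimate (h, var) with a 1-D Kalman update: the cell
    keeps a refined height and a reduced variance."""
    k = var / (var + var_z)   # Kalman gain: how much to trust z
    h_new = h + k * (z - h)   # updated height estimate
    var_new = (1 - k) * var   # uncertainty shrinks with each fusion
    return h_new, var_new
```

Fusing a 2.0 m measurement (variance 1.0) into a cell holding 0.0 m (variance 1.0) yields a height of 1.0 m with variance 0.5, i.e. equal trust in prior and measurement.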

Relevance:

30.00%

Publisher:

Abstract:

This work proposes a method to determine the depth of objects in a scene using a combination of stereo vision and self-calibration techniques. By determining the relative distance between visualized objects and a robot with a stereo head, it is possible to navigate in unknown environments. Stereo vision techniques supply a depth measure through the combination of two or more images of the same scene. To obtain depth estimates of the objects in the scene, a reconstruction of the scene geometry is necessary, which in turn requires the relationship between three-dimensional world coordinates and two-dimensional image coordinates. Obtaining the cameras' intrinsic parameters makes this relationship between coordinate systems possible. These parameters can be obtained through geometric camera calibration, which is generally done by correlating image features of a calibration pattern with known dimensions. Camera self-calibration allows the intrinsic parameters to be obtained without a known calibration pattern, making it possible to calculate and update them while the robot moves through an unknown environment. In this work a self-calibration method based on three-dimensional polar coordinates to represent image features is presented. This representation is determined by the relationship between image features and the cameras' horizontal and vertical opening angles. Using the polar coordinates it is possible to geometrically reconstruct the scene. Through the combination of the proposed techniques it is possible to compute a depth estimate of the scene objects, allowing robot navigation in an unknown environment.
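For a rectified stereo pair, the depth step reduces to the classic triangulation relation Z = fB/d. The sketch below shows that textbook relation only; it is not the thesis's polar-coordinate self-calibration method.

```python
def stereo_depth(disparity, focal_px, baseline_m):
    """Classic stereo triangulation for a rectified image pair:
    Z = f * B / d, where d is the disparity in pixels, f the focal
    length in pixels and B the baseline between the cameras in meters."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity
```

With a 500-pixel focal length and a 0.1 m baseline, a disparity of 50 pixels places the object 1 m from the stereo head; smaller disparities mean farther objects.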