930 results for "Calibração automática" (automatic calibration)


Relevance:

10.00%

Publisher:

Abstract:

This work proposes a kinematic control scheme with visual feedback for a robot arm with five degrees of freedom. Using computer vision techniques, a method was developed to determine the 3D Cartesian position and orientation (pose) of the robot arm from an image of the robot obtained through a camera. A colored triangular label is attached to the manipulator tool, and efficient heuristic rules are used to locate the vertices of that label in the image. The tool pose is then computed from those vertices through numerical methods. A color calibration scheme based on the K-means algorithm was implemented to guarantee the robustness of the vision system under lighting variations. The extrinsic camera parameters are computed from the image of four coplanar points whose 3D Cartesian coordinates, relative to a fixed frame, are known. Two distinct tool poses obtained from the image, initial and final, are interpolated to generate a desired trajectory in Cartesian space. The error signal in the proposed control scheme is the difference between the desired and the actual tool pose. Gains are applied to the error signal, and the resulting signal is mapped into joint increments using the pseudoinverse of the manipulator Jacobian matrix. These increments are applied to the manipulator joints, moving the tool to the desired pose.
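
As a rough illustration of the final mapping described above (pose error scaled by gains and mapped to joint increments through the Jacobian pseudoinverse), the sketch below uses a planar three-link arm with hypothetical link lengths and a finite-difference Jacobian; it is not the five-DOF robot, gains, or vision pipeline of the work.

    import numpy as np

    L = [0.3, 0.25, 0.15]  # hypothetical link lengths (m)

    def forward_kinematics(q):
        """Tool position (x, y) and orientation for a planar 3-link arm."""
        x = y = angle = 0.0
        for li, qi in zip(L, q):
            angle += qi
            x += li * np.cos(angle)
            y += li * np.sin(angle)
        return np.array([x, y, angle])

    def numerical_jacobian(q, eps=1e-6):
        """Finite-difference Jacobian of the forward kinematics."""
        J = np.zeros((3, len(q)))
        f0 = forward_kinematics(q)
        for i in range(len(q)):
            dq = np.zeros(len(q))
            dq[i] = eps
            J[:, i] = (forward_kinematics(q + dq) - f0) / eps
        return J

    q = np.array([0.1, 0.4, -0.2])            # current joint angles (rad)
    pose_desired = np.array([0.5, 0.3, 0.4])  # desired pose from the interpolated trajectory
    K = np.diag([0.5, 0.5, 0.5])              # proportional gains on the pose error

    for _ in range(200):
        error = pose_desired - forward_kinematics(q)              # pose error in task space
        dq = np.linalg.pinv(numerical_jacobian(q)) @ (K @ error)  # joint increments
        q = q + dq
    print("final pose:", forward_kinematics(q))

Iterating the increment update drives the tool pose toward the desired pose, which is the essence of the resolved-rate scheme described in the abstract.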

Relevance:

10.00%

Publisher:

Abstract:

A hierarchical fuzzy control scheme is applied to improve vibration suppression using an electromechanical system based on the lever principle. The hierarchical intelligent controller consists of a hierarchical fuzzy supervisor, one fuzzy controller, and one robust controller. The supervisor combines the controllers' output signals to generate the control signal applied to the plant. The objective is to improve the performance of the electromechanical system, since the supervisor can take advantage of controllers based on different techniques. The robust controller design is based on a linear mathematical model, while genetic algorithms are used to tune the fuzzy controller and the supervisor, which are based on a nonlinear mathematical model. Digital simulations were employed to demonstrate the efficiency of the hierarchical fuzzy control scheme. Comparisons between the optimized and the non-optimized hierarchical controllers are made to show the efficiency of the genetic algorithms and the advantages of their use.
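
The supervisory blending idea can be sketched as below, assuming the supervisor weights the outputs of a fuzzy-like controller and a robust (here a PD-like placeholder) controller through membership functions of the error magnitude; the control laws, membership breakpoints, and gains are illustrative, not the GA-tuned values of the work.

    import numpy as np

    def small(e):
        """Membership of |error| in the fuzzy set 'small'."""
        return float(np.clip(1.0 - e / 0.5, 0.0, 1.0))

    def large(e):
        """Membership of |error| in the fuzzy set 'large'."""
        return float(np.clip((e - 0.2) / 0.8, 0.0, 1.0))

    def fuzzy_controller(error, d_error):
        return 2.0 * error + 0.3 * d_error      # placeholder for the fuzzy control law

    def robust_controller(error, d_error):
        return 1.2 * error + 0.8 * d_error      # placeholder for the robust control law

    def supervisor(error, d_error):
        """Weighted (defuzzified) combination of the two control actions."""
        e = abs(error)
        w_f, w_r = small(e), large(e)
        u_f = fuzzy_controller(error, d_error)
        u_r = robust_controller(error, d_error)
        return (w_f * u_f + w_r * u_r) / (w_f + w_r + 1e-12)

    # Small errors lean on the fuzzy controller, large errors on the robust one
    print(supervisor(error=0.05, d_error=-0.01))
    print(supervisor(error=0.90, d_error=-0.01))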

Relevance:

10.00%

Publisher:

Abstract:

This work presents a procedure for evaluating the uncertainty associated with the calibration of flow meters and with BS&W (basic sediment and water) measurement. It concerns a new measurement method proposed in the conceptual design of the LAMP laboratory at the Universidade Federal do Rio Grande do Norte, which determines the conventional true value of BS&W from the total height of the liquid column in the auditor tank, the hydrostatic pressure exerted by the liquid column, the local gravity, and the specific masses of the water and of the oil, and determines the flow rate from the total height of the liquid column and the transfer time. The calibration uses an automated monitoring and data acquisition system for the quantities needed to determine flow and BS&W, providing better reliability of the measurements.
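
The determination described above can be sketched as follows, assuming a homogeneous oil-water column, a linear density mixing rule, a known tank cross-sectional area, and illustrative input values; this is not the laboratory's actual procedure or its uncertainty evaluation.

    # Illustrative inputs (not real LAMP data)
    h = 2.000          # total liquid column height (m)
    P = 16_900.0       # hydrostatic pressure at the tank bottom (Pa)
    g = 9.7805         # local gravity (m/s^2)
    rho_water = 998.2  # specific mass of water (kg/m^3)
    rho_oil = 850.0    # specific mass of oil (kg/m^3)
    A = 0.50           # assumed tank cross-sectional area (m^2)
    t = 120.0          # transfer time (s)

    # Mixture density from the hydrostatic relation P = rho_mix * g * h
    rho_mix = P / (g * h)

    # Water volume fraction (BS&W, neglecting sediments) from a linear mixing rule
    bsw = (rho_mix - rho_oil) / (rho_water - rho_oil)

    # Average volumetric flow from column height and transfer time
    flow = A * h / t

    print(f"BS&W = {100 * bsw:.1f} %  |  flow = {1000 * flow:.2f} L/s")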

Relevance:

10.00%

Publisher:

Abstract:

Due to advances in the manufacturing process of orthopedic prostheses, the need for better-quality shape-reading techniques (i.e., with less uncertainty) for the residual limb of amputees has become a challenge. Overcoming these problems means being able to obtain accurate geometric information about the limb and, consequently, better manufacturing processes for both transfemoral and transtibial prosthetic sockets. The key point is to customize these readings so as to be as faithful as possible to the real profile of each patient. Within this context, two prototype versions (α and β) of a 3D mechanical scanner for reading residual limb shapes, based on reverse engineering techniques, were first designed. Prototype β is an improved version of prototype α, although it still works in analog mode. Both prototypes are capable of producing a CAD representation of the limb via appropriate graphical sheets and were conceived to work by purely mechanical means. The first results were encouraging, as they achieved a large decrease in measurement uncertainty compared to traditional methods, which are very inaccurate and outdated; it is not unusual, for instance, to see such archaic methods in action using ordinary household measuring tapes to explore the limb's shape. Although prototype β improved the readings, it still required someone to input the plotted points (i.e., those marked on disk-shaped graphical sheets) into an academic CAD software called OrtoCAD. This task is performed by manual typing, which is time-consuming and of very limited reliability. Furthermore, the number of coordinates obtained from the purely mechanical system is limited by the subdivisions of the graphical sheet (it records a point every 10 degrees with a resolution of one millimeter). These drawbacks were overcome in the second release of prototype β, in which an electronic variation of the reading table components was developed, now capable of performing an automatic reading (i.e., no human intervention, in digital mode). An interface software (i.e., a driver) was built to facilitate data transfer. Much better results were obtained, meaning a lower degree of uncertainty (it records a point every 2 degrees with a resolution of 1/10 mm). Additionally, an algorithm was proposed to convert the CAD geometry used by OrtoCAD to an appropriate format, enabling the use of rapid prototyping equipment and aiming at future automation of the manufacturing process of prosthetic sockets.
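
A possible way to turn such angular readings into a Cartesian point cloud for CAD use is sketched below, assuming each cross-section of the limb is sampled as radius readings every 2 degrees at a known height; the OrtoCAD format and the actual conversion algorithm are not reproduced.

    import numpy as np

    def section_to_points(radii_mm, z_mm, step_deg=2.0):
        """Convert one cross-section of radial readings into 3D points.

        radii_mm: radii sampled every `step_deg` degrees (0.1 mm resolution readings).
        z_mm: height of this cross-section along the limb axis.
        """
        angles = np.deg2rad(np.arange(0.0, 360.0, step_deg))
        radii = np.asarray(radii_mm, dtype=float)
        x = radii * np.cos(angles)
        y = radii * np.sin(angles)
        z = np.full_like(x, z_mm)
        return np.column_stack([x, y, z])

    # Illustrative synthetic section: a slightly elliptical residual-limb slice
    angles = np.deg2rad(np.arange(0.0, 360.0, 2.0))
    radii = 60.0 + 5.0 * np.cos(2 * angles)          # mm
    cloud = section_to_points(radii, z_mm=120.0)

    # Export as a simple XYZ text file that many CAD / rapid-prototyping tools can import
    np.savetxt("section_120mm.xyz", cloud, fmt="%.1f")
    print(cloud.shape)   # (180, 3): one point every 2 degrees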

Relevance:

10.00%

Publisher:

Abstract:

This work proposes a computational methodology to solve structural design optimization problems. The application develops, implements, and integrates methods for structural analysis, geometric modeling, design sensitivity analysis, and optimization. The optimum design problem is particularized to the plane stress case, with the objective of minimizing the structural mass subject to a stress criterion. These constraints must be evaluated at a series of discrete points whose distribution should be dense enough to minimize the chance of any significant constraint violation between the specified points; therefore, the local stress constraints are transformed into a global stress measure, reducing the computational cost of deriving the optimal shape design. The problem is approximated by the finite element method using six-node Lagrangian triangular elements, with automatic mesh generation governed by a geometric element quality criterion. In the geometric modeling, the contour is defined by parametric B-spline curves, which have characteristics suitable for the shape optimization method, whose key points are used as design variables in the minimization problem. A reliable tool for design sensitivity analysis is a prerequisite for interactive structural design, synthesis, and optimization; general expressions for design sensitivity are therefore derived with respect to the key points of the B-splines, using the adjoint approach and the analytical method. The optimization problem is formulated with the augmented Lagrangian method, which converts a constrained optimization problem into an unconstrained one whose solution relies on the sensitivity analysis. The optimization problem thus reduces to the solution of a sequence of problems with bound constraints, which is solved by a memoryless quasi-Newton method. Several examples demonstrate that this new approach to analytical design sensitivity analysis, integrated with shape design optimization under a global stress criterion, is computationally efficient.
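
The augmented Lagrangian step can be sketched on a toy problem as below, minimizing a quadratic mass-like objective subject to a single inequality constraint, with scipy's bound-constrained L-BFGS-B standing in for the memoryless quasi-Newton inner solver; it is not the FEM/B-spline formulation of the work.

    import numpy as np
    from scipy.optimize import minimize

    def f(x):                      # toy "mass" objective
        return x[0] ** 2 + 2.0 * x[1] ** 2

    def g(x):                      # toy "stress" constraint, g(x) <= 0
        return 1.0 - x[0] - x[1]

    x = np.array([0.0, 0.0])
    lam, r = 0.0, 10.0             # multiplier estimate and penalty parameter

    for _ in range(10):
        def augmented(x):
            # Augmented Lagrangian for an inequality constraint (Rockafellar form)
            return f(x) + (0.5 / r) * (max(0.0, lam + r * g(x)) ** 2 - lam ** 2)

        # Bound-constrained inner solve (L-BFGS-B plays the quasi-Newton role here)
        res = minimize(augmented, x, method="L-BFGS-B",
                       bounds=[(-5.0, 5.0), (-5.0, 5.0)])
        x = res.x
        lam = max(0.0, lam + r * g(x))   # multiplier update

    print("solution:", x, "constraint value:", g(x))

For this toy problem the sequence of unconstrained solves converges to approximately (2/3, 1/3), the analytical optimum of the constrained problem.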

Relevance:

10.00%

Publisher:

Abstract:

In 1998 the first decorticator was developed in the Textile Engineering Laboratory and patented for the purpose of extracting fibres from pineapple leaves, with financial help from CNPq and BNB. The objective of the present work was to develop an automatic decorticator, different from the first one, with a semi-automatic decortication system featuring automatic feeding of the leaves and collection of the extracted fibres. The system is started through a command system that passes information to two motors, one for driving the beater cylinder and the other for feeding the leaves and extracting the decorticated fibres automatically. This in turn introduces the leaves between a knife and a beater cylinder with twenty blades (the previous one had only eight). The blades are supported by equidistant flanges on a central transmission shaft, which helps increase the number of beatings of the leaves. In the present system, the operator only has to place the leaves on the rotating endless feeding belt and collect the extracted fibres carried out by another endless belt. The pulp resulting from the extraction is collected in a tray through a collector. The feeding of the leaves as well as the extraction of the fibres is controlled automatically by varying the velocity of the cylinders. The semi-automatic decorticator is basically composed of a chassis made of iron bars (L profile), 200 cm long, 91 cm high, and 68 cm wide, and weighs around 300 kg. It was observed that increasing the number of blades in the beater cylinder from eight to twenty reduced the turbulence inside the decorticator, which improved the removal of the fibres without problems as well as the quality of the fibres. From the studies carried out, 2.8 to 4.5% of fibre can be extracted from each leaf, which gives around 4 to 5 tons of fibre per hectare, more than the cotton production per hectare. This quantity could undoubtedly generate jobs, not only in the production of the fibres but also in their application in different areas.
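
A small back-of-the-envelope check of the yield figures quoted above, assuming the 2.8-4.5% extraction fraction applies to the fresh leaf mass (purely illustrative arithmetic):

    # Fibre yield fractions and fibre production quoted in the abstract
    fractions = (0.028, 0.045)        # 2.8% to 4.5% fibre per unit of leaf mass (assumed)
    fibre_per_ha = (4.0, 5.0)         # tons of fibre per hectare

    # Implied fresh leaf biomass needed per hectare (tons)
    for frac, fibre in zip(fractions, fibre_per_ha):
        print(f"{fibre} t/ha of fibre at {100 * frac:.1f}% yield -> "
              f"{fibre / frac:.0f} t/ha of leaves")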

Relevance:

10.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

10.00%

Publisher:

Abstract:

This dissertation aims at the development of an experimental device to quantitatively determine the content of benzene, toluene, and xylenes (BTX) in the atmosphere. BTX are extremely volatile solvents and therefore play an important role in atmospheric chemistry, being precursors of tropospheric ozone formation. In this work, a new BTX standard gas in nitrogen was produced for stagnant systems. The aim is to develop a new, simple, and cheaper method to quantify and monitor BTX in air using solid-phase microextraction coupled with gas chromatography/mass spectrometry (SPME/GC/MS). The features of the proposed calibration method are presented. SPME sampling was carried out under non-equilibrium conditions using a Carboxen/PDMS fiber exposed for 10 min to standard gas mixtures. The main parameters affecting the extraction process were observed to be sampling time and concentration. The results for the multicomponent BTX system show a linear and a non-linear range. In the non-linear range, the effect of competition by selective adsorption is remarkable, with the affinity order p-xylene > toluene > benzene. This behavior represents a limitation of the method, although it is in accordance with the literature; moreover, it does not prevent application of the technique outside the non-linear region to quantify BTX contents in the atmosphere.
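
A calibration in the linear range can be sketched as below, assuming GC/MS peak areas measured for a set of standard-gas concentrations; all numbers are illustrative, not the dissertation's data.

    import numpy as np

    # Illustrative calibration data for one analyte (e.g., toluene)
    conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])         # standard concentration (ug/m^3)
    area = np.array([1.1e4, 2.3e4, 4.4e4, 9.0e4, 1.75e5])  # GC/MS peak area (a.u.)

    # Least-squares linear calibration: area = slope * conc + intercept
    slope, intercept = np.polyfit(conc, area, 1)
    r2 = np.corrcoef(conc, area)[0, 1] ** 2
    print(f"slope={slope:.1f}, intercept={intercept:.1f}, R^2={r2:.4f}")

    # Back-calculate an unknown air sample from its measured peak area
    area_sample = 6.2e4
    conc_sample = (area_sample - intercept) / slope
    print(f"estimated concentration: {conc_sample:.1f} ug/m^3")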

Relevance:

10.00%

Publisher:

Abstract:

The objective of this work was to establish the relationship between the photosynthetic pigments extracted in DMSO and the readings obtained with the portable ClorofiLOG® 1030 chlorophyll meter, generating mathematical models capable of predicting the chlorophyll and carotenoid contents in castor bean leaves. The work was conducted at the Brazilian Agricultural Research Corporation (EMBRAPA) Cotton unit, located in Campina Grande, State of Paraíba, in October 2010. For the indirect analysis, the portable instrument was used to take readings on leaf discs with different shades of green, and chlorophyll was determined on the same discs by the classical method. For chlorophyll extraction, 5 mL of dimethyl sulfoxide (DMSO) were used, kept in a water bath at 70 °C for 30 minutes, and a 3 mL aliquot was withdrawn for spectrophotometer readings at wavelengths of 470, 646, and 663 nm. The data were submitted to analysis of variance and polynomial regression, with the reading from the portable chlorophyll meter as the dependent variable and the photosynthetic pigments determined by the classical method as the independent variable. The results indicated that the portable ClorofiLOG® 1030 chlorophyll meter, combined with mathematical models, allowed the concentration of the photosynthetic pigments, except chlorophyll b, to be estimated with high precision, saving time and the reagents normally used in conventional procedures.
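
Fitting such a calibration model can be sketched as below, assuming paired observations of meter readings and DMSO-extracted chlorophyll (synthetic values); the regression order and coefficients of the study are not reproduced.

    import numpy as np

    # Synthetic paired data: extracted total chlorophyll (ug/cm^2) vs. meter reading
    chlorophyll = np.array([10.0, 18.0, 25.0, 33.0, 41.0, 50.0])
    meter_reading = np.array([22.5, 31.0, 37.8, 44.1, 49.0, 54.2])

    # Second-order polynomial calibration: reading = f(chlorophyll content)
    coeffs = np.polyfit(chlorophyll, meter_reading, 2)
    model = np.poly1d(coeffs)

    # Invert the model numerically to predict pigment content from a new reading
    reading_new = 40.0
    roots = (model - reading_new).roots
    estimate = [r.real for r in roots if abs(r.imag) < 1e-9 and 0 < r.real < 60]
    print("estimated chlorophyll content:", estimate)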

Relevance:

10.00%

Publisher:

Abstract:

The optimization and control of a chemical process are strongly correlated with the quantity of information that can be obtained from the system. In biotechnological processes, where the transforming agent is a cell, many variables can interfere with the process, leading to changes in the microorganism's metabolism and affecting the quantity and quality of the final product. Therefore, continuous monitoring of the variables that interfere with the bioprocess is crucial in order to act on certain variables of the system, keeping it under desirable operational conditions and under control. In general, during a fermentation process, the analysis of important parameters such as substrate, product, and cell concentrations is done off-line, requiring sampling, pretreatment, and analytical procedures; these steps demand significant run time and the use of high-purity chemical reagents. In order to implement a real-time monitoring system for a benchtop bioreactor, this study was conducted in two steps: (i) the development of software providing a communication interface between the bioreactor and a computer, based on data acquisition and recording of the process variables pH, temperature, dissolved oxygen, level, foam level, and agitation frequency, as well as the input of setpoints for the operational parameters of the bioreactor control unit; and (ii) the development of an analytical method using near-infrared spectroscopy (NIRS) to enable monitoring of substrate, product, and cell concentrations during a fermentation process for ethanol production using the yeast Saccharomyces cerevisiae. Three fermentation runs (F1, F2, and F3) were monitored by NIRS, with subsequent sampling for analytical characterization. The data obtained were used for calibration and validation, applying spectral pretreatments, combined or not with smoothing filters, to the spectra. The most satisfactory results were obtained when the calibration models were built from real culture-medium samples taken from fermentations F1, F2, and F3, showing that the NIRS-based analytical method can be used as a fast and effective way to quantify cell, substrate, and product concentrations, which enables in situ real-time monitoring of fermentation processes.
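
One common way to build such a calibration model is sketched below, assuming a matrix of NIR spectra and off-line ethanol reference values, with Savitzky-Golay smoothing/derivative as the pretreatment and partial least squares (PLS) regression; the study does not specify PLS, so the model type, pretreatment settings, and data here are assumptions.

    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-in data: 60 spectra of 200 wavelengths and ethanol references (g/L)
    X = rng.normal(size=(60, 200)).cumsum(axis=1)      # smooth-ish fake spectra
    y = X[:, 50] * 0.1 + X[:, 150] * 0.05 + rng.normal(scale=0.1, size=60)

    # Pretreatment: Savitzky-Golay smoothing with a first derivative along each spectrum
    X_pre = savgol_filter(X, window_length=11, polyorder=2, deriv=1, axis=1)

    X_cal, X_val, y_cal, y_val = train_test_split(X_pre, y, test_size=0.3, random_state=0)

    pls = PLSRegression(n_components=5)
    pls.fit(X_cal, y_cal)

    y_pred = pls.predict(X_val).ravel()
    rmsep = np.sqrt(np.mean((y_val - y_pred) ** 2))    # root mean square error of prediction
    print(f"RMSEP = {rmsep:.3f} g/L")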

Relevance:

10.00%

Publisher:

Abstract:

This work initially establishes the fundamental thermodynamic relationships that govern phase equilibrium and the models used to describe the non-ideal behavior of the liquid and vapor phases at low pressures. It addresses the determination of vapor-liquid equilibrium (VLE) data for a series of multicomponent mixtures of saturated aliphatic hydrocarbons, prepared synthetically from analytical-grade substances, and the development of a new dynamic cell with circulation of the vapor phase. The apparatus and experimental procedures developed are described and applied to the determination of VLE data. Isobaric VLE data were obtained with a Fischer ebulliometer with circulation of both phases for the systems pentane + dodecane, heptane + dodecane, and decane + dodecane. Using the two new, specially designed dynamic cells with vapor-phase circulation, of easy operation and low cost, data were measured for the systems heptane + decane + dodecane, acetone + water, Tween 20 + dodecane, and phenol + water, as well as distillation curves of a gasoline without additives. Compositions of the equilibrium phases were determined by densimetry, chromatography, and total organic carbon analysis. Calibration curves of density versus composition were prepared from synthetic mixtures, and the excess-volume behavior was evaluated. The experimentally obtained VLE data for the hydrocarbon and aqueous systems were submitted to thermodynamic consistency tests, as were literature data for other binary systems, mainly from the Dortmund Data Bank (DDB), where the Gibbs-Duhem equation is used, yielding a satisfactory database. The results of the thermodynamic consistency tests for the binary and ternary systems were evaluated in terms of deviations for applications such as model development. These tested and approved data sets were then used in the KijPoly program to determine the binary kij parameters of the original Peng-Robinson cubic equation of state and of its version with the expanded alpha function. The parameters obtained can be applied, through simulators, to the simulation of petroleum reservoir conditions and of the several distillation processes found in the petrochemical industry. The two dynamic cells, designed and built with national technology for the determination of VLE data, were successful, demonstrating efficiency and low cost. Multicomponent systems, mixtures of components of different molecular weights, and dilute solutions may all be studied in these VLE cells.
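
An area-type (Redlich-Kister/Herington) consistency check on isobaric binary T-x-y data can be sketched as below, assuming activity coefficients estimated from modified Raoult's law with Antoine vapor pressures; the data points and Antoine constants are placeholders, not the measured systems, and this is not necessarily the exact test applied in the work.

    import numpy as np

    # Illustrative isobaric binary VLE data (liquid x1, vapor y1, temperature in K) at P (kPa)
    P = 101.325
    x1 = np.array([0.10, 0.25, 0.40, 0.55, 0.70, 0.85])
    y1 = np.array([0.28, 0.50, 0.64, 0.74, 0.83, 0.92])
    T  = np.array([372.0, 366.5, 362.0, 358.5, 355.5, 353.0])

    def psat(T, A, B, C):
        """Antoine equation, log10(Psat/kPa) = A - B / (T + C); constants are placeholders."""
        return 10.0 ** (A - B / (T + C))

    psat1 = psat(T, 6.20, 1210.0, -43.0)
    psat2 = psat(T, 6.10, 1350.0, -53.0)

    # Activity coefficients from modified Raoult's law (ideal vapor phase assumed)
    g1 = y1 * P / (x1 * psat1)
    g2 = (1.0 - y1) * P / ((1.0 - x1) * psat2)

    def trapezoid(y, x):
        return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

    # Area test: integrate ln(g1/g2) over x1 and compare net area with total area
    f = np.log(g1 / g2)
    D = abs(trapezoid(f, x1)) / trapezoid(np.abs(f), x1) * 100.0
    print(f"area deviation D = {D:.1f} %  (small values suggest consistency)")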


Relevance:

10.00%

Publisher:

Abstract:

Anthropic disturbances in watersheds, such as inappropriate building development, disorderly land occupation, and unplanned land use, may increase sediment yield and the inflow into the estuary, leading to siltation, changes in channel conformation, and ecosystem/water quality problems. In this context, this study aims to assess the applicability of the SWAT model to estimate, even in a preliminary way, the sediment yield distribution along the Potengi River watershed, as well as its contribution to the estuary. In addition, an assessment of its erosion susceptibility was used for comparison. The susceptibility map was developed by overlaying rainfall erosivity, soil erodibility, terrain slope, and land cover; to overlay these maps, a multi-criteria analysis through the AHP method was applied. SWAT was run for a five-year period (1997-2001), considering three scenarios based on different types of human interference: (a) agriculture; (b) pasture; and (c) no interference (background). Results were analyzed in terms of surface runoff, sediment yield, and their propagation along each river section. The regions in the extreme west of the watershed and in the downstream portions returned the highest sediment yields, reaching 2.8 and 5.1 t/ha·year respectively, whereas the central areas, which are less susceptible, returned the lowest values, never more than 0.7 t/ha·year. It was also noticed that in the western sub-watersheds, where the headwaters are located, sediment yield is naturally driven by high declivity and weak soils. On the other hand, the results suggest that the eastern part would not contribute significantly to the sediment inflow into the estuary, and the larger part of the sediment yield there is due to anthropic activities. For the central region, the analysis of sediment propagation indicates a predominance of deposition over transport; thus, isolated rain storms occurring in the upstream river portions are not expected to supply significant sediment to the estuary. Because the model calibration process has not been done yet, it must be emphasized that the values presented here should not be used for practical purposes. Even so, this work warns about the risks of increasing alteration of the natural land cover, mainly in areas closer to the headwaters and in the downstream Potengi River.
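
The AHP weighting step used to combine the susceptibility criteria can be sketched as below, assuming a hypothetical 4x4 pairwise comparison matrix for erosivity, erodibility, slope, and land cover; the actual judgments of the study are not reproduced.

    import numpy as np

    criteria = ["erosivity", "erodibility", "slope", "land cover"]

    # Hypothetical pairwise comparison matrix (Saaty scale); A[i, j] = importance of i over j
    A = np.array([
        [1.0, 2.0, 1/3, 1/2],
        [1/2, 1.0, 1/4, 1/3],
        [3.0, 4.0, 1.0, 2.0],
        [2.0, 3.0, 1/2, 1.0],
    ])

    # Priority vector = normalized principal eigenvector of A
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    # Consistency ratio (RI = 0.90 is the standard random index for n = 4)
    lambda_max = eigvals.real[k]
    CI = (lambda_max - len(A)) / (len(A) - 1)
    CR = CI / 0.90

    for name, w in zip(criteria, weights):
        print(f"{name:12s} {w:.3f}")
    print(f"consistency ratio CR = {CR:.3f} (values below 0.10 are usually accepted)")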

Relevance:

10.00%

Publisher:

Abstract:

The semiarid rainfall regime in northeastern Brazil is highly variable. The climate processes associated with rainfall are complex, and their effects may represent extreme situations of drought or flood, which can have adverse effects on society and the environment. The regional economy has a significant agricultural component, which is strongly influenced by weather conditions. Maximum precipitation analysis is traditionally performed using the intensity-duration-frequency (IDF) probabilistic approach, whose results are typically used in engineering projects involving hydraulic structures such as drainage networks and road structures. On the other hand, precipitation data analysis may require the adoption of some event identification criterion, and the minimum inter-event duration (IMEE) is one of the most used. This study aims to analyze the effect of the IMEE on the obtained rain event properties. For this purpose, a nine-year precipitation time series (2002-2011) was used, obtained from an automatic rain gauge station installed in an environmentally protected area, the Seridó Ecological Station. The results showed that the adopted IMEE value has an important effect on the number of events, duration, event depth, mean rainfall rate, and mean inter-event duration. Furthermore, a higher occurrence of extreme events was observed for small IMEE values. Most events showed an average rainfall intensity higher than 2 mm/h regardless of the IMEE. The storm advance coefficient was, in most cases, within the first quartile of the event, regardless of the IMEE value. Time series analysis using partial duration series made it possible to adjust the IDF equations to local characteristics.
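
The event-separation criterion can be sketched as below, assuming a rainfall record of timestamped depths that is grouped into events whenever the dry gap reaches the chosen IMEE; the data are purely illustrative.

    from datetime import datetime, timedelta

    # Illustrative rainfall records: (timestamp, depth in mm), non-zero intervals only
    records = [
        (datetime(2010, 3, 1, 14, 0), 1.2),
        (datetime(2010, 3, 1, 14, 10), 3.4),
        (datetime(2010, 3, 1, 15, 40), 0.8),   # 90 min dry gap before this record
        (datetime(2010, 3, 2, 2, 0), 5.6),     # next day
    ]

    def split_events(records, imee_hours):
        """Group rainfall records into events separated by dry periods >= IMEE."""
        imee = timedelta(hours=imee_hours)
        events, current = [], [records[0]]
        for prev, rec in zip(records, records[1:]):
            if rec[0] - prev[0] >= imee:
                events.append(current)
                current = []
            current.append(rec)
        events.append(current)
        return events

    for imee_h in (1, 6):
        events = split_events(records, imee_h)
        depths = [sum(d for _, d in ev) for ev in events]
        print(f"IMEE = {imee_h} h -> {len(events)} events, depths (mm): {depths}")

Running the same record with different IMEE values shows directly how the number of events and their depths change, which is the effect analyzed in the study.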

Relevance:

10.00%

Publisher:

Abstract:

Urban stormwater can be considered both a potential water resource and a problem for the proper functioning of the manifold activities of a city, a problem that results from inappropriate land use and occupation, usually due to poor planning of the occupation of development areas with little care for the environmental aspects of surface runoff drainage. As a basic premise, mechanisms must be sought to preserve the natural flow regime at all stages of development of an urban area: preserving the soil infiltration capacity at the scale of the urban area, understanding the mechanisms of natural drainage, and preserving the naturally dynamic areas of the water courses, both in the main channel and in the secondary ones. These are challenges for sustainable urban development, in harmonious coexistence with modern development that is consistent with economic, environmental, and social quality. Integrated studies involving the quantity and quality of rainwater are absolutely necessary to achieve understanding and to obtain appropriate technologies, covering both the drainage problems and the use of the water when surface runoff is adequately managed, for example, by accumulation in detention reservoirs with the possibility of use for other purposes. This study aims to develop a computational model, adjusted to the prevailing conditions of an experimental urban watershed, in order to enable the implementation of water resources management practices through hydrological simulations of the quantity and, in a preliminary way, the quality of the stormwater that flows to a pond located at the downstream end of the basin. To this end, the distributed model SWMM was used in parallel with data collected in the basin at the highest possible resolution, to allow the simulation of diffuse loads and of the heterogeneous characteristics of the basin, both in terms of hydrological and hydraulic parameters and in terms of land use and occupation. This parallel work should improve the understanding of the simulated phenomena in the basin as well as the model calibration activity, supported by monitoring data acquired during the MAPLU project (Urban Stormwater Management), which belongs to the PROSAB network (Research Program in Basic Sanitation), in the years 2006 to 2008.
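
The kind of subcatchment runoff computation a distributed model such as SWMM performs can be sketched with a nonlinear-reservoir formulation driven by Manning's equation, as below; all catchment parameters and the rainfall series are illustrative assumptions, not data from the studied basin.

    import numpy as np

    # Hypothetical subcatchment parameters
    area = 2.0e4        # subcatchment area (m^2)
    width = 100.0       # characteristic overland-flow width (m)
    slope = 0.01        # average surface slope (m/m)
    n = 0.015           # Manning roughness of the impervious surface
    d_store = 0.002     # depression storage (m)

    dt = 60.0                                        # time step (s)
    rain = np.r_[np.full(30, 20.0), np.zeros(90)]    # rainfall intensity (mm/h) per step

    d = 0.0             # ponded depth on the subcatchment (m)
    outflow = []
    for i_mmh in rain:
        i = i_mmh / 1000.0 / 3600.0                  # rainfall rate (m/s)
        # Nonlinear reservoir: outflow per unit area from Manning's equation
        q = 0.0
        if d > d_store:
            q = (width / (area * n)) * (d - d_store) ** (5.0 / 3.0) * np.sqrt(slope)
        d = max(0.0, d + (i - q) * dt)               # mass balance on the ponded depth
        outflow.append(q * area * 1000.0)            # runoff (L/s) toward the downstream pond

    print(f"peak runoff: {max(outflow):.1f} L/s, "
          f"total volume: {sum(outflow) * dt / 1000.0:.1f} m^3")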