Abstract:
During petroleum well production, simultaneous oil and water production is common, in proportions that can vary from 0% up to values close to 100% water. Moreover, production flow rates can vary widely, depending on the characteristics of each reservoir. Therefore, the meters used in the field for flow and BSW (water in oil) measurement must perform well over wide operating ranges. To evaluate the operation of these meters under different operating conditions, a laboratory will be built at UFRN, whose objective is to evaluate, in an automatic way, petroleum flow and BSW measurement processes under different operating conditions. The good performance of these meters is fundamental to the accuracy of the measured volumes of liquid and crude petroleum production. For the measurement of this production, petroleum companies use meters that must indicate values with the greatest possible accuracy and respect a series of minimum conditions and requirements, established by the Joint Ordinance ANP/INMETRO 19106/2000. The Laboratory for Evaluation of Flow and BSW Measurement Processes to be built will basically comprise an oil tank, a water tank, a mixer, an auditor tank, a separation tank and a residue tank for fluid disposal, all fundamental for the evaluation of flow and BSW meters. The whole process will be automated through a Programmable Logic Controller (PLC) and a supervisory system. Besides allowing the evaluation of the flow and BSW meters used by petroleum companies, this laboratory will make possible the development of research related to automation. It will also contribute to the development of the Computer Engineering and Automation Department, fostering the growth of faculty and students and qualifying them for a continuously growing job market.
The present work describes the automation project of the laboratory to be built at UFRN. The system will be automated using a Programmable Logic Controller and a supervisory system. The PLC programming and the supervisory system screens were developed in this work.
Abstract:
The occurrence of transients in electrocardiogram (ECG) signals indicates an electrical phenomenon outside the heart. Thus, the identification of transients has been the most widely used methodology in medical analysis since the invention of the electrocardiograph (the device responsible for recording electrocardiogram signals). There are few papers related to this subject, which motivates the creation of an architecture to pre-process this signal in order to identify transients. This paper proposes a method based on the signal energy of the Hilbert transform of the electrocardiogram, as an alternative to methods based on signal morphology. This information determines the creation of frames of the MP-HA protocol, responsible for transmitting the ECG signals through an IEEE 802.3 network to a computing device. That device, in turn, may classify the signal automatically, or present it to a doctor so that the classification can be done manually.
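The energy-based detection idea described above can be sketched in a few lines. The following is a minimal illustration, assuming a uniformly sampled ECG trace: the analytic signal is obtained via an FFT-based Hilbert transform and its instantaneous energy is thresholded. The function names, the toy signal and the threshold value are hypothetical, not taken from the original work.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal x + j*H[x], with H the Hilbert transform, via the FFT."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def transient_mask(x, threshold):
    """Mark samples whose instantaneous energy |analytic|^2 exceeds a threshold."""
    energy = np.abs(analytic_signal(x)) ** 2
    return energy > threshold

# Toy trace: a quiet baseline with one injected burst playing the "transient".
fs = 250                      # Hz, a common ECG sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
x = 0.05 * np.sin(2 * np.pi * 1.0 * t)
x[200:220] += 1.0             # injected transient
mask = transient_mask(x, threshold=0.25)
```

Because the energy is computed from the signal envelope rather than from waveform shape, the detector does not depend on the morphology of the QRS complex, which is the point of the alternative proposed above.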
Abstract:
Digital signal processing (DSP) aims to extract specific information from digital signals. Digital signals are, by definition, physical quantities represented by a sequence of discrete values, and from these sequences it is possible to extract and analyze the desired information. Unevenly sampled data cannot be properly analyzed using standard digital signal processing techniques. This work aimed to adapt a DSP technique, multiresolution analysis, to analyze unevenly sampled data, in order to aid the studies of the CoRoT laboratory at UFRN. The process is based on re-indexing the wavelet transform so that it handles unevenly sampled data properly. The method was effective, presenting satisfactory results.
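The re-indexing idea can be illustrated by evaluating the wavelet at the actual, irregular time stamps instead of at implicit uniform sample indices. The sketch below uses a real Morlet-like wavelet and plain NumPy; it is a generic illustration of this approach under that assumption, not the thesis's exact algorithm.

```python
import numpy as np

def morlet(u, w0=6.0):
    """Real part of a Morlet-like wavelet (unnormalized)."""
    return np.cos(w0 * u) * np.exp(-u * u / 2.0)

def uneven_cwt(t, x, scales, centers):
    """Continuous wavelet coefficients for an unevenly sampled series (t, x).

    The wavelet is evaluated at the actual time stamps t (the
    "re-indexing" step), so no resampling onto a uniform grid is needed.
    """
    coeffs = np.empty((len(scales), len(centers)))
    for i, s in enumerate(scales):
        for j, c in enumerate(centers):
            w = morlet((t - c) / s)
            coeffs[i, j] = np.sum(x * w) / np.sqrt(s)
    return coeffs

# Irregular time base carrying a 0.5 Hz oscillation (e.g. a light curve).
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 20.0, 400))
x = np.sin(2 * np.pi * 0.5 * t)
scales = np.array([0.5, 1.0, 2.0, 4.0])
centers = np.linspace(2.0, 18.0, 9)
C = uneven_cwt(t, x, scales, centers)
```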
Abstract:
Nowadays, classifying proteins into structural classes, which concerns the inference of patterns in their 3D conformation, is one of the most important open problems in Molecular Biology. The main reason for this is that the function of a protein is intrinsically related to its spatial conformation. However, such conformations are very difficult to obtain experimentally in the laboratory. Thus, this problem has drawn the attention of many researchers in Bioinformatics. Considering the great difference between the number of protein sequences already known and the number of three-dimensional structures determined experimentally, the demand for automated techniques for the structural classification of proteins is very high. In this context, computational tools, especially Machine Learning (ML) techniques, have become essential to deal with this problem. In this work, ML techniques are used in the recognition of protein structural classes: Decision Trees, k-Nearest Neighbors, Naive Bayes, Support Vector Machines and Neural Networks. These methods were chosen because they represent different learning paradigms and have been widely used in the Bioinformatics literature. Aiming to improve the performance of these techniques (individual classifiers), homogeneous (Bagging and Boosting) and heterogeneous (Voting, Stacking and StackingC) multi-classification systems are used. Moreover, since the protein database used in this work presents the problem of imbalanced classes, artificial class balancing techniques (Random Undersampling, Tomek Links, CNN, NCL and OSS) are used to minimize this problem. In order to evaluate the ML methods, a cross-validation procedure is applied, in which the accuracy of the classifiers is measured as the mean classification error rate on independent test sets. These means are compared, two by two, by a hypothesis test, aiming to evaluate whether there is a statistically significant difference between them.
With respect to the results obtained with the individual classifiers, the Support Vector Machine presented the best accuracy. The multi-classification systems (homogeneous and heterogeneous) showed, in general, performance superior or similar to that achieved by the individual classifiers, especially Boosting with Decision Trees and StackingC with Linear Regression as meta-classifier. The Voting method, despite its simplicity, has shown to be adequate for the problem presented in this work. The class balancing techniques, on the other hand, did not produce a significant improvement in the global classification error. Nevertheless, their use did improve the classification error for the minority class. In this context, the NCL technique has shown to be the most appropriate.
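As one concrete illustration of the heterogeneous combination step, a majority-vote (Voting) combiner can be sketched in a few lines. The class labels and the per-classifier predictions below are hypothetical, standing in for the outputs of base classifiers such as SVM, k-NN and Decision Trees.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier label lists by majority vote.

    predictions: list of equal-length label sequences, one per base
    classifier. Ties are broken by the label seen first among the votes.
    """
    combined = []
    for votes in zip(*predictions):
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

# Hypothetical outputs of three base classifiers on five proteins;
# labels denote structural classes (e.g. "a" = all-alpha, "b" = all-beta).
svm_out = ["a", "b", "a", "b", "a"]
knn_out = ["a", "b", "b", "b", "a"]
tree_out = ["b", "b", "a", "a", "a"]
voted = majority_vote([svm_out, knn_out, tree_out])
```

The simplicity visible here is exactly why Voting is attractive: it needs no meta-training step, unlike Stacking, which fits a meta-classifier on the base classifiers' outputs.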
Abstract:
This paper presents an evaluative study of the effects of using a machine learning technique on the main features of a self-organizing, multiobjective genetic algorithm (GA). A typical GA can be seen as a search technique usually applied to problems of non-polynomial complexity. Originally, these algorithms were designed to provide methods that seek acceptable solutions to problems where the global optimum is inaccessible or difficult to obtain. At first, GAs considered only one evaluation function and single-objective optimization. Today, however, implementations that consider several optimization objectives simultaneously (multiobjective algorithms) are common, as are implementations that allow many components of the algorithm to change dynamically (self-organizing algorithms). Likewise, combinations of GAs with machine learning techniques, aimed at improving some of their performance and usability characteristics, are also common. In this work, a GA combined with a machine learning technique was analyzed and applied to antenna design. We used a variant of the bicubic interpolation technique, called 2D Spline, as the machine learning technique to estimate the behavior of a dynamic fitness function, based on knowledge obtained from a set of laboratory experiments. This fitness function, also called the evaluation function, is responsible for determining the fitness degree of a candidate solution (individual) relative to the others in the same population. The algorithm can be applied in many areas, including telecommunications, for example in the design of antennas and frequency selective surfaces. In this particular work, the presented algorithm was developed to optimize the design of a microstrip antenna, commonly used in wireless communication systems, for Ultra-Wideband (UWB) applications.
The algorithm optimized two variables of the antenna geometry, the length (Ls) and width (Ws) of a slit in the ground plane, with respect to three objectives: radiated signal bandwidth, return loss and central frequency deviation. These two dimensions (Ws and Ls) are used as variables in three different interpolation functions, one Spline per optimization objective, to compose an aggregate multiobjective fitness function. The final result proposed by the algorithm was compared with the result of a simulation program and with the measured result of a physical prototype of the antenna built in the laboratory. In the present study, the algorithm was analyzed with respect to its degree of success regarding four important characteristics of a self-organizing multiobjective GA: performance, flexibility, scalability and accuracy. At the end of the study, an increase in algorithm execution time was observed in comparison with a common GA, due to the time required by the machine learning process. On the plus side, we noticed a considerable gain in flexibility and accuracy of results, and a promising path indicating directions for extending the algorithm to optimization problems with n variables.
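The surrogate-fitness construction above can be sketched as follows. For brevity this sketch uses bilinear interpolation over a measured (Ws, Ls) grid rather than the thesis's bicubic 2D Spline, and the grid values, dimensions and aggregation weights are hypothetical, chosen only to illustrate the mechanism of one interpolated table per objective feeding an aggregate fitness.

```python
import numpy as np

def bilinear(xs, ys, table, x, y):
    """Interpolate table (indexed by grids xs, ys) at the point (x, y)."""
    i = int(np.clip(np.searchsorted(xs, x) - 1, 0, len(xs) - 2))
    j = int(np.clip(np.searchsorted(ys, y) - 1, 0, len(ys) - 2))
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    return ((1 - tx) * (1 - ty) * table[i, j]
            + tx * (1 - ty) * table[i + 1, j]
            + (1 - tx) * ty * table[i, j + 1]
            + tx * ty * table[i + 1, j + 1])

# Hypothetical laboratory grid: slit width Ws and length Ls in mm,
# with one measured table per objective (all values illustrative).
Ws = np.array([1.0, 2.0, 3.0])
Ls = np.array([5.0, 10.0, 15.0])
bandwidth = np.array([[1.0, 1.5, 1.2], [2.0, 3.1, 2.4], [1.8, 2.6, 2.0]])
return_loss = np.array([[-8.0, -12.0, -10.0], [-15.0, -22.0, -18.0], [-11.0, -16.0, -13.0]])
freq_dev = np.array([[0.4, 0.2, 0.3], [0.1, 0.05, 0.2], [0.3, 0.25, 0.35]])

def fitness(ws, ls, w=(1.0, -0.1, -2.0)):
    """Aggregate fitness: reward bandwidth, penalize return loss and deviation."""
    return (w[0] * bilinear(Ws, Ls, bandwidth, ws, ls)
            + w[1] * bilinear(Ws, Ls, return_loss, ws, ls)
            + w[2] * bilinear(Ws, Ls, freq_dev, ws, ls))

f = fitness(2.0, 10.0)
```

Evaluating this surrogate is essentially free compared with a full-wave antenna simulation, which is what buys back part of the extra time the learning step costs the GA.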
Abstract:
Vision is one of the five senses of the human body and, in children, is responsible for up to 80% of the perception of the surrounding world. Studies show that 50% of children with multiple disabilities have some visual impairment, and 4% of all children are diagnosed with strabismus. Strabismus is an eye disorder associated with the motor capacity of the eye, defined as any deviation from perfect ocular alignment. Besides the aesthetic aspect, the child may report blurred or double vision. Ophthalmological cases not correctly diagnosed are the reason for many school dropouts. The Ministry of Education of Brazil points to visual impairment as a challenge for educators of children, particularly in the literacy process. The traditional eye examination for the diagnosis of strabismus can be accomplished by inducing eye movements through the doctor's instructions to the patient. This procedure can be reproduced through computer-aided analysis of images captured on video. This paper presents a proposal for a distributed system to assist health professionals in the remote diagnosis of visual impairments associated with motor abilities of the eye, such as strabismus. Through this proposal we hope to contribute to improving school learning rates for children, by allowing better diagnosis and, consequently, better student accompaniment.
Abstract:
This work uses feature-based computer vision algorithms to identify medicine boxes for the visually impaired. The system is intended for people whose vision is compromised by disease, hindering the identification of the correct medicine to be taken. We use the camera available in several popular devices, such as computers, televisions and phones, to identify the correct medicine box from the image and to play audio presenting the relevant information about the medication, such as the dosage, indications and contraindications. We employ an object detection model, using algorithms to identify features on the medicine boxes and playing the audio at the moment those features are detected. Experiments carried out with 15 people show that 93% consider the system useful and very helpful in identifying medicines by their boxes. Thus, this technology can help many people with visual impairments take the right medicine, at the time previously indicated by their physician.
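The feature-identification step can be illustrated with a brute-force matcher over binary descriptors of the kind produced by detectors such as ORB. The sketch below is a simplified stand-in, not the system's actual implementation: the descriptors are random stand-ins and the box names are hypothetical, but the matching logic (nearest neighbour by Hamming distance, with a hit count per stored box) is the standard idea.

```python
import numpy as np

def hamming(a, b):
    """Hamming distance between two binary descriptors (uint8 arrays)."""
    return int(np.unpackbits(np.bitwise_xor(a, b)).sum())

def match_box(query, database, max_dist=10):
    """Return the stored box whose descriptor set best matches the query.

    query: (n, 32) uint8 array of ORB-style descriptors from a camera frame.
    database: dict mapping box name -> (m, 32) uint8 reference descriptors.
    A query descriptor "hits" a box when it has a close enough neighbour.
    """
    best_name, best_hits = None, 0
    for name, descs in database.items():
        hits = 0
        for q in query:
            if min(hamming(q, d) for d in descs) <= max_dist:
                hits += 1
        if hits > best_hits:
            best_name, best_hits = name, hits
    return best_name

# Hypothetical 32-byte descriptors for two medicine boxes.
rng = np.random.default_rng(1)
box_a = rng.integers(0, 256, size=(20, 32), dtype=np.uint8)
box_b = rng.integers(0, 256, size=(20, 32), dtype=np.uint8)
db = {"dipyrone_500mg": box_a, "ibuprofen_400mg": box_b}

# A camera view of box A: its descriptors with one bit flipped each.
noisy = box_a.copy()
noisy[:, 0] ^= 1
result = match_box(noisy, db)
```

Once the hit count for some stored box crosses a threshold, the system would trigger the corresponding audio with the medication's dosage and indications.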
Abstract:
The use of solar energy for electricity generation has attracted growing interest in recent years. Generally, the conversion of solar energy into electricity is done by PV modules installed on fixed structures, with a slope determined by the latitude of the installation site. In this context, the use of mobile structures with solar tracking has enabled an increase in the energy generated. However, the performance of these structures depends on the type of tracker and the position control used. In this work, a position control strategy is proposed for a solar tracker to be installed in the Laboratory of Power Electronics and Renewable Energy (LEPER), located at the Federal University of Rio Grande do Norte (UFRN). The tracker is of the polar type, with daily east-west positioning and manual adjustment of the north-south tilt angle in the seasonal periods.
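For a polar-type tracker with daily east-west motion, the reference angle for the position controller follows the solar hour angle, which advances 15 degrees per hour from solar noon. The sketch below illustrates this reference generation; the mechanical limit of ±60 degrees is a hypothetical value, not taken from the thesis.

```python
def polar_axis_angle(solar_time_h, limit_deg=60.0):
    """Target rotation of a polar-axis (east-west) tracker.

    The ideal angle equals the solar hour angle, 15 degrees per hour from
    solar noon (negative = morning, pointing east), clipped to the
    mechanical range of the structure (hypothetical +/- limit_deg).
    """
    angle = 15.0 * (solar_time_h - 12.0)
    return max(-limit_deg, min(limit_deg, angle))
```

The position controller then drives the actuator so that the measured axis angle follows this reference over the day, while the north-south tilt stays at its manually set seasonal value.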
Abstract:
This work presents the procedure for evaluating the uncertainty related to the calibration of flow meters and of BS&W. It concerns a new measurement method proposed by the conceptual project of the LAMP laboratory, at the Universidade Federal do Rio Grande do Norte, which intends to determine the conventional true value of the BS&W from the total height of the liquid column in the auditor tank, the hydrostatic pressure exerted by the liquid column, the local gravity, and the specific masses of the water and of the oil, and to determine the flow from the total height of the liquid column and the transfer time. The calibration uses an automated monitoring and data acquisition system for the quantities needed to determine flow and BS&W, allowing better reliability of the measurements.
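Under the usual assumption that the column is a homogeneous water-in-oil mixture, the quantities listed above determine the BS&W in closed form: the hydrostatic relation P = ρ·g·h gives the mean density of the column, and the water fraction follows from linear mixing of the two specific masses. The sketch below illustrates this relation; the numerical values are hypothetical, not from the thesis.

```python
def mixture_density(pressure_pa, gravity, height_m):
    """Mean density of the column from the hydrostatic relation P = rho*g*h."""
    return pressure_pa / (gravity * height_m)

def bsw_fraction(pressure_pa, gravity, height_m, rho_water, rho_oil):
    """Water volume fraction, assuming linear mixing of specific masses:
    rho = bsw*rho_water + (1 - bsw)*rho_oil
    =>  bsw = (rho - rho_oil) / (rho_water - rho_oil)
    """
    rho = mixture_density(pressure_pa, gravity, height_m)
    return (rho - rho_oil) / (rho_water - rho_oil)

# Hypothetical column: 2 m high, water 998 kg/m3, oil 850 kg/m3,
# with a measured pressure corresponding to a 30% water cut.
g = 9.78                                 # m/s2, illustrative local gravity
rho_w, rho_o = 998.0, 850.0
h = 2.0
rho_mix = 0.30 * rho_w + 0.70 * rho_o    # 894.4 kg/m3
P = rho_mix * g * h                      # Pa
bsw = bsw_fraction(P, g, h, rho_w, rho_o)
```

Since every input (height, pressure, gravity, the two densities) enters this closed-form expression, the calibration uncertainty of the BS&W can be propagated directly from the uncertainties of those measured quantities.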
Abstract:
We propose in this work a software architecture for robotic boats intended to operate in diverse aquatic environments, fully autonomously, performing telemetry to a base station and thereby accomplishing their mission. This proposal is meant to be applied within the N-Boat project of the NatalNet laboratory at DCA, which aims to enable a sailboat to navigate autonomously. The constituent components of this architecture are the memory, strategy, communication, sensing, actuation, energy, security and surveillance modules, which make up the boat and base-station systems. For validation, a simulator was developed in the C language and implemented using the resources of the OpenGL graphics API; the main results were obtained in the implementation of the memory, actuation and strategy modules, more specifically data sharing, control of sails and rudder, and short-route planning based on a navigation algorithm, respectively. The experimental results shown in this study indicate the feasibility of the actual use of the developed software architecture and its application in the area of autonomous mobile robotics.
Abstract:
Using the concepts of tribology, mechanical contact and damage, this work analyzes the suggestion of implementing a backup system for the traction and passage of a Pipeline Inspection Gauge (PIG) through the inside of pipelines. In order to verify the integrity of the pipelines, the possibility of displacing such equipment by pulling it with steel wires is considered. The physical and mechanical characteristics of this method were verified by accelerated laboratory tests on a tribological pair: wire versus a 90-degree curve. The main wear mechanisms of a sliding system were also considered, with and without lubricant, in the absence and presence of contaminants. To this end, a test bench able to reproduce a sliding system operating in back-and-forth ("reciprocating") mode was constructed. Two kinds of wire were used, one of galvanized steel and the other of stainless steel, and the results achieved with the two kinds of steel cable were compared. For comparison purposes, steel cables with and without a Poly(Vinyl Chloride) (PVC) coating were used. The wires and the curves were characterized using metallographic analysis, Vickers microhardness tests, X-ray diffraction (XRD), X-ray fluorescence (XRF) and tensile tests. After the experiments, the measurable parameters were analyzed, demonstrating the impracticality of the proposed method, since the friction force and the alternating loading at the contact between the wire strands and the inner curves of the ducts caused severe wear. These types of wear are likely to cause failures and fluid leaks in future products.
Abstract:
In 1998, the first decorticator was developed in the Textile Engineering Laboratory and patented for the purpose of extracting fibres from pineapple leaves, with financial help from CNPq and BNB. The objective of the present work was to develop a decorticator different from the first one, with a semi-automatic decortication system featuring automatic feeding of the leaves and collection of the extracted fibres. The system is started through a command system that passes information to two motors, one for driving the beater cylinder and the other for the feeding of the leaves as well as the automatic extraction of the decorticated fibres. This in turn introduces the leaves between a knife and a beater cylinder with twenty blades (the previous one had only eight). These blades are supported by equidistant flanges on a central transmission axis, which helps increase the number of beatings of the leaves. In the present system the operator has to place the leaves on the rotating endless feeding belt and collect the extracted fibres carried out on another endless belt. The pulp resulting from the extraction is collected in a tray through a collector. The feeding of the leaves as well as the extraction of the fibres is controlled automatically by varying the velocity of the cylinders. The semi-automatic decorticator is basically composed of a chassis made of iron bars (L profile), 200 cm long, 91 cm high and 68 cm wide. The decorticator weighs around 300 kg. It was observed that increasing the number of blades in the beater cylinder from eight to twenty reduced the turbulence inside the decorticator, which helped improve the removal of the fibres without problems, as well as the quality of the fibres. From the studies carried out, 2.8 to 4.5% of fibres can be extracted from each leaf. This gives around 4 to 5 tons of fibres per hectare, which is more than the cotton production per hectare.
This quantity could no doubt generate jobs, not only in the production of the fibres but also in their application in different areas.
Abstract:
The wear mechanisms and thermal history of two non-conforming sliding surfaces were investigated in the laboratory. A micro-abrasion testing setup was used, but the traditional rotating-sphere method was replaced by a cylindrical surface of revolution including seven sharp angles varying between 15° and 180°. The micro-abrasion tests enabled the investigation of the polyurethane response at different contact pressures, for these turned counterfaces with and without heat treatment. Normal load and sliding speed were varied, and the sliding distance was fixed at 5 km in each test. The room and contact temperatures were measured during the tests. The polyurethane was characterized using tensile testing, Shore A hardness measurement, Thermogravimetric Analysis (TGA), Differential Scanning Calorimetry (DSC) and Thermomechanical Analysis (TMA). The Vickers microhardness of the steel was measured before and after the heat treatment, and metallographic characterization was also carried out. The worn polyurethane surface was analysed using Scanning Electron Microscopy (SEM) and Energy-Dispersive Spectroscopy (EDS) microanalysis. Single-pass scratch testing of the polyurethane using indenters with different contact angles was also carried out. The wear scar morphology, the wear mechanisms and the thermal response were analyzed in order to correlate the conditions imposed by the pressure-velocity pair with the materials in contact. Eight different wear mechanisms were identified on the polyurethane surface. A correlation was found between the temperature variation and the wear scar morphology.
Abstract:
Cotton is a hydrophilic textile fibre and, for this reason, it changes its properties according to changes in the environment. Moisture and temperature are the two most important factors affecting a cotton spinning sector and influencing its quality; these two properties can change the entire spinning process. Accordingly, moisture and temperature must be kept under control during the spinning process: once the environment becomes hot and dry, the cotton yarns lose moisture and lose their minimum consistency. Based on this, this work tested four types of cotton yarn, one from Brazil and the others from Egypt. The yarns were exposed to different temperatures and moisture levels in five different tests, with six samples in each test, which were examined through physical and mechanical tests: resistance, strength, tenacity, yarn hairiness, yarn evenness and yarn twist. All the analyses were carried out at the Laboratório de Mecânica dos Fluídos and at COATS Corrente S.A., where it was possible to use the equipment fundamental to this work, such as the STATIMAT ME, which measures strength and tenacity, the Zweigler G566, which measures yarn hairiness, a skein machine and a twisting machine. The analyses revealed direct alterations in the yarn characteristics: for example, as moisture and temperature were increased, the yarn strength, tenacity and hairiness increased as well. From the results of all the analyses, it is possible to say that at a relatively low temperature and high humidity, cotton yarns have the best performance.
Abstract:
Shrimp farming is a shrimp-raising activity that has been growing rapidly in the country, occupying a meaningful space in Brazilian exports. In 2003, this activity presented a volume of 60 million tons and 220 million dollars, being the main generator of employment and income in the primary sector of the northeast economy. However, it is a new activity, with circa five years in the state of Rio Grande do Norte, and therefore needs investment in the technological area. Among the wastewaters of this activity, the sulphite solution is usually applied in the harvesting process, i.e. the retrieval of the shrimp from the farm. The aim of this work is to present experimental results on the oxidation of this sulphite and to determine the most efficient method through laboratory experiments. The measurements were carried out in a mixing reactor, by injecting air and by adding hydrogen peroxide, with and without UV light. The solutions were prepared synthetically, with the concentrations found in harvesting wastewater, and were also collected in loco. The oxidation process using air was monitored by iodometric analysis of the sulphite, and the oxidation using hydrogen peroxide was evaluated by turbidimetric analysis of the sulphate, using a spectrophotometer. The sulphite was totally oxidized in both processes. The experimental results allow us to conclude that oxidation by hydrogen peroxide is more efficient, and they allowed the determination of the optimum operational conditions in terms of concentration and treatment time.