18 results for "curva de calibração"

at the Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

60.00%

Publisher:

Abstract:

Investigations in the field of pharmaceutical analysis and quality control of medicines require analytical procedures with good performance characteristics. Calibration is one of the most important steps in chemical analysis and is directly related to parameters such as linearity. This work consisted of developing a new methodology for obtaining calibration curves for drug analysis: the stationary cuvette method. It was compared with the currently used methodology, and possible sources of variation between them were evaluated. The results demonstrated that the proposed technique has reproducibility similar to that of the traditional methodology. In addition, advantages such as user-friendliness, cost-effectiveness, accuracy, precision and robustness were observed. Therefore, the stationary cuvette methodology may be considered the best choice for obtaining calibration curves for drug analysis by spectrophotometry.
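
As a hedged illustration of the kind of calibration-curve workflow described above (not taken from the study itself), the sketch below fits a spectrophotometric calibration line by ordinary least squares, reports r² as a linearity indicator, and inverts the line to quantify an unknown; all concentration and absorbance values are invented for the example.

```python
# Minimal sketch (illustrative data, not from the study): fitting a
# spectrophotometric calibration curve by least squares and checking linearity.
import numpy as np

# Hypothetical standards: concentration (mg/L) vs. measured absorbance.
conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
absorbance = np.array([0.101, 0.198, 0.305, 0.397, 0.502])

# Least-squares fit of absorbance = slope * conc + intercept.
slope, intercept = np.polyfit(conc, absorbance, 1)

# Coefficient of determination (r^2) as a simple linearity indicator.
predicted = slope * conc + intercept
ss_res = np.sum((absorbance - predicted) ** 2)
ss_tot = np.sum((absorbance - absorbance.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"A = {slope:.4f} * C + {intercept:.4f}, r^2 = {r_squared:.4f}")

# Inverting the curve to quantify an unknown sample from its absorbance.
unknown_abs = 0.350
unknown_conc = (unknown_abs - intercept) / slope
print(f"Estimated concentration: {unknown_conc:.2f} mg/L")
```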

Relevance:

60.00%

Publisher:

Abstract:

Ethanol is the most widely abused psychoactive drug in the world, which makes it one of the main substances requested in toxicological examinations nowadays. The development of an analytical method, or the adaptation or implementation of a known method, involves a validation process that assesses its efficiency in the laboratory routine and the credibility of the method. Stability is defined as the ability of the sample material to keep the initial value of a quantitative measure within specified limits, for a defined period, when stored under defined conditions. This study aimed to evaluate a gas chromatography method and to study the stability of ethanol in blood samples, considering storage time and temperature and the presence of preservative, in order to check whether the conservation and storage conditions used in this study maintain the quality of the sample and preserve the amount of analyte originally present. Blood samples were collected from 10 volunteers to evaluate the method and to study the stability of ethanol. For the evaluation of the method, part of the samples was spiked with known concentrations of ethanol. For the stability study, the remaining blood pool was placed in two containers, one containing the preservative sodium fluoride 1% and the anticoagulant heparin and the other only heparin; ethanol was added at a concentration of 0.6 g/L, the pool was split into two bottles, one stored at 4 ºC (refrigerator) and the other at -20 ºC (freezer), and the tests were performed on the same day (time zero) and after 1, 3, 7, 14, 30 and 60 days of storage. The assessment considered the difference in the results during storage relative to time zero. The headspace technique was used in combination with gas chromatography with flame ionization detection (FID) and a capillary column with a polyethylene stationary phase. The best chromatographic conditions were: 50 ºC (column), 150 ºC (injector) and 250 ºC (detector), with retention times of 9.107 ± 0.026 minutes for ethanol and 8.170 ± 0.081 minutes for tert-butanol (internal standard), ethanol being properly separated from acetaldehyde, acetone, methanol and 2-propanol, which are potential interferents in the determination of ethanol. The technique showed linearity in the concentration range of 0.01 to 3.2 g/L (y = 0.8051x + 0.6196; r² = 0.999). The calibration curve showed the following line equation: y = 0.7542x + 0.6545, with a linear correlation coefficient of 0.996. The average recovery was 100.2%, the intra- and inter-assay coefficients of variation were at most 7.3%, and the limits of detection and quantification were 0.01 g/L, with coefficients of variation within the allowed range. The analytical method evaluated in this study proved to be fast, efficient and practical, meeting the objective of this work satisfactorily. The stability study showed less than 20% difference between the responses obtained under the storage conditions and stipulated periods and the response obtained at time zero, and at the 5% significance level no statistical difference in ethanol concentration was observed between analyses. The results reinforce the reliability of the gas chromatography method for blood samples in the determination of ethanol, whether in the toxicological, forensic, social or clinical context.
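
The sketch below shows how the reported calibration line could be used to quantify ethanol; it assumes that y is the ethanol/internal-standard peak-area ratio and x the concentration in g/L, which the abstract does not state explicitly, and the numeric inputs are illustrative only.

```python
# Minimal sketch (assumptions noted): using the reported calibration line
# y = 0.7542x + 0.6545 to quantify ethanol from a headspace GC-FID run.
# Assumes y is the ethanol/internal-standard peak-area ratio and x the
# ethanol concentration in g/L; the abstract does not spell this out.

def ethanol_concentration(area_ratio, slope=0.7542, intercept=0.6545):
    """Invert the calibration line to estimate concentration (g/L)."""
    return (area_ratio - intercept) / slope

def recovery_percent(measured, spiked):
    """Recovery of a spiked sample, in percent."""
    return 100.0 * measured / spiked

ratio = 1.11                      # hypothetical peak-area ratio
conc = ethanol_concentration(ratio)
print(f"Ethanol: {conc:.2f} g/L")
print(f"Recovery for a 0.6 g/L spike: {recovery_percent(conc, 0.6):.1f}%")
```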

Relevance:

60.00%

Publisher:

Abstract:

This work presents a set of intelligent algorithms aimed at correcting calibration errors in sensors and reducing the periodicity of their calibrations. These algorithms were designed using Artificial Neural Networks because of their great capacity for learning, adaptation and function approximation. Two approaches are shown. The first one uses Multilayer Perceptron networks to approximate the various shapes of the calibration curve of a sensor that drifts out of calibration at different points in time. This approach requires knowledge of the sensor's operating time, but this information is not always available. To overcome this limitation, another approach using Recurrent Neural Networks was proposed. Recurrent Neural Networks have a great capacity for learning the dynamics of the system on which they were trained, so they can learn the dynamics of a sensor's calibration drift. Knowing the sensor's operating time or its drift dynamics, it is possible to determine how far out of calibration a sensor is and correct its measured value, thus providing a more accurate measurement. The algorithms proposed in this work can be implemented in a Foundation Fieldbus industrial network environment, which offers good device-programming capabilities through its function blocks, making it possible to apply them to the measurement process.
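
A minimal sketch of the first approach is given below, using scikit-learn rather than the Foundation Fieldbus function blocks the authors target; the drift model, network size and all data are invented for illustration.

```python
# Minimal sketch (illustrative, not the authors' implementation): a small
# MLP that learns a sensor's drifting calibration as a function of the
# measured value and the sensor's operating time, then corrects readings.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic data: the reported value drifts with operating time t (hours).
true_value = rng.uniform(0.0, 10.0, 2000)
op_time = rng.uniform(0.0, 1000.0, 2000)
drift = 0.002 * op_time                             # hypothetical gain drift
measured = (1.0 + drift) * true_value + 0.05 * rng.standard_normal(2000)

# Map (measured value, scaled operating time) -> true value.
X = np.column_stack([measured, op_time / 1000.0])   # crude feature scaling
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000,
                     random_state=0).fit(X, true_value)

# Correct a new reading taken after 800 h of operation.
corrected = model.predict([[8.5, 0.8]])
print(f"Corrected value: {corrected[0]:.2f}")
```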

Relevance:

20.00%

Publisher:

Abstract:

With the new discoveries of oil and gas, the exploration of fields in various geological basins, the import of other oils and the development of alternative fuels, more and more research laboratories have been evaluating and characterizing new types of petroleum and derivatives. Investment in new techniques and equipment for analyzing samples to determine their physical and chemical properties, composition, possible contaminants and product specifications, among others, has therefore multiplied in recent years, and the development of techniques for rapid and efficient characterization is extremely important for better economic recovery of oil. In this context, this work has two main objectives. The first is to characterize oils by thermogravimetry coupled with mass spectrometry (TG-MS) and to correlate these results with data from other types of characterization previously reported. The second is to use the technique to develop a methodology for obtaining the evolution curve of hydrogen sulfide gas in oil. Thus, four samples were analyzed by TG-MS and X-ray fluorescence spectrometry (XRF). The TG results can be used to indicate the nature of the oil, its tendency toward coke formation, its distillation and cracking temperatures, and other features. The MS evaluations showed the behavior of the main oil compounds with temperature, the points at which certain fractions volatilized, and the hydrogen sulfide gas evolution analysis, which was compared with the evolution curve obtained by Petrobras with another methodology.
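
As an illustration of how an H2S evolution curve might be extracted from coupled TG-MS data (assuming the m/z 34 ion current is exported together with the furnace temperature, which is not detailed in the abstract), a minimal sketch follows; all values are synthetic.

```python
# Minimal sketch (synthetic data, stated assumption: the m/z 34 ion current
# is available versus furnace temperature): building a hydrogen sulfide
# evolution curve from TG-MS data.
import numpy as np

# Illustrative ramp (degrees C) and m/z 34 ion current (A).
temperature = np.linspace(30, 900, 300)
ion_current = 1e-11 + 4e-10 * np.exp(-((temperature - 420.0) / 60.0) ** 2)

# Subtract the baseline and accumulate to show how much H2S has evolved
# up to each temperature (normalized 0..1).
signal = ion_current - ion_current.min()
evolved = np.cumsum(signal) / np.sum(signal)

peak_T = temperature[np.argmax(signal)]
print(f"Maximum H2S release near {peak_T:.0f} C")
print(f"Fraction evolved by 500 C: {evolved[temperature <= 500][-1]:.2f}")
```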

Relevance:

20.00%

Publisher:

Abstract:

Petroleum evaluation consists of analyzing the oil using different methodologies, following international standards, in order to know its chemical and physicochemical properties, contaminant levels, composition and, especially, its ability to generate derivatives. Many of these analyses consume a great deal of time, large amounts of sample and supplies, and require organized transportation logistics, scheduling and skilled professionals. Looking for alternatives that optimize the evaluation and enable the use of new technologies, seven samples of different centrifuged Brazilian oils, previously characterized by Petrobras, were analyzed by thermogravimetry in the 25-900 °C range using heating rates of 5, 10 and 20 °C per minute. With the experimental data obtained, correlations with the existing characterizations were established, providing: generation of simulated true boiling point (TBP) curves; comparison of the generated fractions with standard cuts in the appropriate temperature ranges; an approach for obtaining the Watson characterization factor; and comparison of the micro carbon residue formed. The results showed a good possibility of reproducing the simulated TBP curve from thermogravimetry, taking into account the composition, density and other oil properties. The proposed correlations for the experimental characterization factor and carbon residue agreed with the Petrobras characterizations, showing that thermogravimetry can be used as a tool in oil evaluation because of its quick analysis and accuracy, and because it requires a minimal number of samples and consumables.
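
For reference, a minimal sketch of the classical Watson (UOP) characterization factor mentioned above is given below; the boiling point and specific gravity are illustrative, not values from the thesis.

```python
# Minimal sketch of the classical Watson characterization factor,
# K_w = (T_b [R])**(1/3) / SG, where T_b is the mean average boiling
# point in degrees Rankine and SG the specific gravity at 60/60 F.
# Values below are illustrative only.

def watson_k(mean_boiling_point_c: float, specific_gravity: float) -> float:
    """Watson (UOP) characterization factor from T_b in Celsius and SG."""
    t_rankine = (mean_boiling_point_c + 273.15) * 1.8   # K -> degrees Rankine
    return t_rankine ** (1.0 / 3.0) / specific_gravity

# Example: a crude with mean average boiling point 350 C and SG 0.87.
print(f"K_w = {watson_k(350.0, 0.87):.2f}")
```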

Relevance:

20.00%

Publisher:

Abstract:

The exponential growth in radio frequency (RF) applications is accompanied by great challenges, such as more efficient use of the spectrum and the design of new architectures for multi-standard receivers or software defined radio (SDR). The key challenge in designing a software defined radio architecture is the implementation of a wide-band, reconfigurable receiver with low cost, low power consumption, a higher level of integration and flexibility. As a new solution for SDR design, a direct demodulator architecture based on five-port technology, or multi-port demodulator, has been proposed. However, the use of the five-port as a direct-conversion receiver requires an I/Q calibration (or regeneration) procedure in order to generate the in-phase (I) and quadrature (Q) components of the transmitted baseband signal. In this work, we propose to evaluate the performance of a blind calibration technique, requiring no knowledge of training or pilot sequences of the transmitted signal, based on independent component analysis for the regeneration of the I/Q components in five-port downconversion, by exploiting the statistical properties of the three output signals.
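
A hedged sketch of the general idea, using scikit-learn's FastICA on a simulated linear mixture rather than the authors' specific five-port model, is shown below; the mixing matrix, noise and signals are all invented for illustration.

```python
# Minimal sketch (illustrative): blind I/Q regeneration by independent
# component analysis on three simulated five-port output signals.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
n = 5000

# Simulated baseband I and Q components (a QPSK-like signal).
i_comp = rng.choice([-1.0, 1.0], n)
q_comp = rng.choice([-1.0, 1.0], n)

# The three power-detector outputs are treated here as unknown linear
# mixtures of I and Q plus a DC term and noise (a hypothetical model).
mixing = np.array([[0.9, 0.2],
                   [0.4, 0.8],
                   [0.6, 0.6]])
outputs = np.column_stack([i_comp, q_comp]) @ mixing.T
outputs += 0.05 * rng.standard_normal(outputs.shape) + 1.0

# Recover two independent components (I and Q up to scale and permutation).
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(outputs)
print("Recovered component matrix shape:", recovered.shape)
```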

Relevance:

20.00%

Publisher:

Abstract:

In this work a pyranometer based on the classic Kimball-Hobbs model was developed, tested and calibrated. The solar radiation is measured through the temperature difference between sensing elements covered with pigments that absorb (black) and reflect (white) the incoming radiation. The photoacoustic technique was used to optimize the choice of the pigments. Methodologies associated with linearity, thermal variation, sensitivity, response time and distance are also presented. To properly classify the results, the international standard ISO 9060 as well as indicative parameters of the World Meteorological Organization (WMO) were used. In addition, a two-channel, 12-bit data acquisition system, built during this work, was used to measure the global solar radiation at ground level with the developed instrument and also with a certified Kipp & Zonen pyranometer. The results show statistically, through the hypothesis test presented here, that both instruments estimate the same population mean at the 95% confidence level.
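
As a sketch of the statistical comparison described (a hypothesis test on the means at the 5% significance level, here implemented as a Welch t-test, which may differ from the exact test used), the example below uses invented irradiance readings.

```python
# Minimal sketch (illustrative data): two-sample t-test at the 5% level to
# check whether the prototype and a certified reference pyranometer
# estimate the same mean global irradiance.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical simultaneous irradiance readings (W/m^2).
prototype = 850.0 + 12.0 * rng.standard_normal(60)
certified = 852.0 + 10.0 * rng.standard_normal(60)

t_stat, p_value = stats.ttest_ind(prototype, certified, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value > 0.05:
    print("No significant difference at the 95% confidence level.")
else:
    print("Means differ at the 95% confidence level.")
```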

Relevance:

20.00%

Publisher:

Abstract:

The continuous development of instruments and equipment used as torque tools or for torque measurement in industry demands more accurate techniques in the use of this kind of instrumentation, including the development of metrological characteristics in torque measurement. The same happens with the needs in calibration services. There is a diversity of hand torque tools on the market, with different measuring ranges, but not complying with technical standards in terms of quality and reliability requirements. At present, however, there is no torque measuring standard that fulfils, at low cost, the needs for the calibration of hand torque tools over a large number of ranges. The objective of this thesis is to present the development and evaluation of a torque measuring standard device conceived to allow the calibration of hand torque tools at three torque levels with a single instrument, reducing calibration cost and time while offering reliability in the evaluation of torque measuring instruments. To meet the demand for the calibration of hand torque tools, calibration laboratories would otherwise need a large collection of torque measuring standards to fulfill customers' needs, which is very costly. The development of this type of torque measuring standard proved technically and economically viable, making possible the calibration of hand torque tools over different nominal ranges through a single measurement system that is versatile, efficient and easy to operate.

Relevance:

20.00%

Publisher:

Abstract:

In the present work, the fundamental thermodynamic relationships that govern phase equilibrium are initially established, along with the models used to describe the non-ideal behavior of the liquid and vapor phases at low pressures. This work addresses the determination of vapor-liquid equilibrium (VLE) data for a series of multicomponent mixtures of saturated aliphatic hydrocarbons, prepared synthetically from analytical-grade substances, and the development of a new dynamic cell with circulation of the vapor phase. The apparatus and experimental procedures developed are described and applied to the determination of VLE data. Isobaric VLE data were obtained with a Fischer ebulliometer with circulation of both phases for the systems pentane + dodecane, heptane + dodecane and decane + dodecane. Using the two new dynamic cells, specially designed, easy to operate and of low cost, with circulation of the vapor phase, data for the systems heptane + decane + dodecane, acetone + water, tween 20 + dodecane and phenol + water, as well as distillation curves of a gasoline without additives, were measured. The compositions of the equilibrium phases were determined by densimetry, chromatography and a total organic carbon analyzer. Calibration curves of density versus composition were prepared from synthetic mixtures, and the behavior of the excess volumes was evaluated. The VLE data obtained experimentally for the hydrocarbon and aqueous systems were submitted to the thermodynamic consistency test, as were data obtained from the literature for other binary systems, mainly from the DDB (Dortmund Data Bank); the Gibbs-Duhem equation is used, yielding a satisfactory data base. The results of the thermodynamic consistency tests for the binary and ternary systems were evaluated in terms of deviations for applications such as model development. Later, those data sets (tested and approved) were used in the KijPoly program for the determination of the binary kij parameters of the original Peng-Robinson cubic equation of state and of the version with the expanded alpha function. The parameters obtained can be applied to the simulation of petroleum reservoir conditions and of the several distillation processes found in the petrochemical industry through simulators. The two dynamic cells designed, built with national technology for the determination of VLE data, performed well, demonstrating efficiency and low cost. Multicomponent systems, mixtures of components of different molecular weights and also dilute solutions may be studied in these VLE cells.
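
As one concrete form of the Gibbs-Duhem-based consistency check mentioned above, the sketch below applies the Redlich-Kister area test to activity coefficients generated from a two-suffix Margules model (consistent by construction); it is an illustration of the test, not the thesis data or code.

```python
# Minimal sketch (synthetic activity coefficients): the Redlich-Kister area
# test, which follows from the Gibbs-Duhem equation -- the net area under
# ln(gamma1/gamma2) versus x1 should be close to zero for consistent data.
import numpy as np

x1 = np.linspace(0.001, 0.999, 200)
x2 = 1.0 - x1

# Two-suffix Margules activity coefficients (hypothetical parameter A).
A = 0.8
ln_g1 = A * x2 ** 2
ln_g2 = A * x1 ** 2
integrand = ln_g1 - ln_g2

# Trapezoidal net and gross areas under ln(gamma1/gamma2) vs x1.
dx = np.diff(x1)
mid = 0.5 * (integrand[1:] + integrand[:-1])
net_area = np.sum(mid * dx)
gross_area = np.sum(np.abs(mid) * dx)

print(f"Net area           = {net_area:+.4f}")
print(f"|net| / gross area = {abs(net_area) / gross_area:.3f}  (small -> consistent)")
```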

Relevance:

20.00%

Publisher:

Abstract:

In the present work, the fundamental thermodynamic relationships that govern phase equilibrium are initially established, along with the models used to describe the non-ideal behavior of the liquid and vapor phases at low pressures. This work addresses the determination of vapor-liquid equilibrium (VLE) data for a series of multicomponent mixtures of saturated aliphatic hydrocarbons, prepared synthetically from analytical-grade substances, and the development of a new dynamic cell with circulation of the vapor phase. The apparatus and experimental procedures developed are described and applied to the determination of VLE data. Isobaric VLE data were obtained with a Fischer ebulliometer with circulation of both phases for the systems pentane + dodecane, heptane + dodecane and decane + dodecane. Using the two new dynamic cells, specially designed, easy to operate and of low cost, with circulation of the vapor phase, data for the systems heptane + decane + dodecane, acetone + water, tween 20 + dodecane and phenol + water, as well as distillation curves of a gasoline without additives, were measured. The compositions of the equilibrium phases were determined by densimetry, chromatography and a total organic carbon analyzer. Calibration curves of density versus composition were prepared from synthetic mixtures, and the behavior of the excess volumes was evaluated. The VLE data obtained experimentally for the hydrocarbon and aqueous systems were submitted to the thermodynamic consistency test, as were data obtained from the literature for other binary systems, mainly from the DDB (Dortmund Data Bank); the Gibbs-Duhem equation is used, yielding a satisfactory data base. The results of the thermodynamic consistency tests for the binary and ternary systems were evaluated in terms of deviations for applications such as model development. Later, those data sets (tested and approved) were used in the KijPoly program for the determination of the binary kij parameters of the original Peng-Robinson cubic equation of state and of the version with the expanded alpha function. The parameters obtained can be applied to the simulation of petroleum reservoir conditions and of the several distillation processes found in the petrochemical industry through simulators. The two dynamic cells designed, built with national technology for the determination of VLE data, performed well, demonstrating efficiency and low cost. Multicomponent systems, mixtures of components of different molecular weights and also dilute solutions may be studied in these VLE cells.

Relevance:

20.00%

Publisher:

Abstract:

This work combines the potential of near infrared (NIR) spectroscopy with chemometrics in order to determine the content of diclofenac in tablets, without destruction of the sample, using ultraviolet spectroscopy, one of the official methods, as the reference method. In the construction of the multivariate calibration models, several types of pre-processing of the NIR spectral data were studied, such as scatter correction and first derivative. The regression method used in the construction of the calibration models was PLS (partial least squares), applied to the NIR spectra of a set of 90 tablets divided into two sets (calibration and prediction). 54 samples were used for calibration and 36 for prediction, since the calibration approach used was full cross-validation, which eliminates the need for a separate validation set. The models were evaluated by observing the correlation coefficient R² and the root mean square errors of calibration (RMSEC) and prediction (RMSEP). The values predicted for the remaining 36 samples were consistent with the values obtained by UV spectroscopy.
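
A minimal sketch of the PLS/full cross-validation workflow described above is given below, built on synthetic spectra with scikit-learn; the data, number of latent variables and error figures are illustrative only.

```python
# Minimal sketch (synthetic spectra, not the thesis data): PLS regression
# with full (leave-one-out) cross-validation to predict an API content
# from NIR spectra, reporting RMSEC and the cross-validated RMSE.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict, LeaveOneOut

rng = np.random.default_rng(3)

# 54 synthetic "spectra" (200 wavelengths) whose signal depends linearly
# on the content y plus noise.
n_samples, n_wavelengths = 54, 200
y = rng.uniform(90.0, 110.0, n_samples)             # % of label claim
loadings = rng.standard_normal(n_wavelengths)
X = np.outer(y, loadings) + 5.0 * rng.standard_normal((n_samples, n_wavelengths))

pls = PLSRegression(n_components=3)
pls.fit(X, y)

rmsec = np.sqrt(np.mean((pls.predict(X).ravel() - y) ** 2))
y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()
rmsecv = np.sqrt(np.mean((y_cv - y) ** 2))

print(f"RMSEC  = {rmsec:.2f}")
print(f"RMSECV = {rmsecv:.2f}")
```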

Relevance:

20.00%

Publisher:

Abstract:

In this work, the quantitative analysis of glucose, triglycerides and cholesterol (total and HDL) in both rat and human blood plasma was performed without any kind of sample pretreatment, using near infrared (NIR) spectroscopy combined with multivariate methods. For this purpose, different techniques and algorithms used to pre-process data, select variables and build multivariate regression models were compared, such as partial least squares regression (PLS), non-linear regression by artificial neural networks (ANN), interval partial least squares regression (iPLS), genetic algorithm (GA) and the successive projections algorithm (SPA), among others. For the determinations in rat blood plasma samples, the variable selection algorithms showed satisfactory results both for the correlation coefficients (R²) and for the root mean square error of prediction (RMSEP) for the three analytes, especially for triglycerides and HDL cholesterol. The RMSEP values for glucose, triglycerides and HDL cholesterol obtained with the best PLS model were 6.08, 16.07 and 2.03 mg dL-1, respectively. In contrast, for the determinations in human blood plasma, the predictions obtained by the PLS models gave unsatisfactory results, with a non-linear tendency and the presence of bias. ANN regression was then applied as an alternative to PLS, considering its ability to model data from non-linear systems. The root mean square errors of monitoring (RMSEM) for glucose, triglycerides and total cholesterol with the best ANN models were 13.20, 10.31 and 12.35 mg dL-1, respectively. Statistical tests (F and t) suggest that NIR spectroscopy combined with multivariate regression methods (PLS and ANN) is capable of quantifying these analytes (glucose, triglycerides and cholesterol) even when they are present in highly complex biological fluids such as blood plasma.
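
As an illustration of the interval-based variable selection idea cited above (iPLS), the sketch below splits a synthetic spectrum into fixed intervals and keeps the one with the lowest cross-validated RMSE; it is a simplified stand-in, not the algorithms or data used in the work.

```python
# Minimal sketch (synthetic data): an iPLS-style search in which the
# spectrum is split into intervals and the interval with the lowest
# cross-validated RMSE is kept.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict, KFold

rng = np.random.default_rng(5)
n, p = 60, 120
y = rng.uniform(60.0, 200.0, n)                     # e.g., mg/dL of an analyte

# Only wavelengths 40-59 carry information in this synthetic example.
X = rng.standard_normal((n, p))
X[:, 40:60] += np.outer(y - y.mean(), np.ones(20)) / 50.0

def cv_rmse(X_subset):
    """Cross-validated RMSE of a 2-component PLS model on one interval."""
    pls = PLSRegression(n_components=2)
    cv = KFold(5, shuffle=True, random_state=0)
    pred = cross_val_predict(pls, X_subset, y, cv=cv).ravel()
    return np.sqrt(np.mean((pred - y) ** 2))

intervals = [(i, i + 20) for i in range(0, p, 20)]
scores = {iv: cv_rmse(X[:, iv[0]:iv[1]]) for iv in intervals}
best = min(scores, key=scores.get)
print(f"Best interval: variables {best[0]}-{best[1]}, RMSECV = {scores[best]:.1f}")
```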

Relevance:

20.00%

Publisher:

Abstract:

The calculation of tooth mass discrepancy, essential for good planning and proper orthodontic finishing, is laborious and time-consuming when performed manually. The aim of this study was to develop and test Bolton Freeware, a software for Bolton tooth mass discrepancy analysis, designed to reduce the time consumed in a less onerous way. The digital analysis performed by the software was based on two-dimensional scanning of plaster study models and compared with manual evaluation (the gold standard), using 75 pairs of stone plaster study models divided into two groups according to the magnitude of the curve of Spee (group I, 0 to 2 mm; group II, greater than 2 up to 3 mm). All the models had permanent dentition and were in perfect condition. The manual evaluation was performed with a digital caliper and a calculator, and the time required to perform the analysis with each method was recorded and compared. In addition, the software was evaluated by orthodontists regarding its use, by means of questionnaires developed specifically for this purpose. Calibration was performed prior to the manual analysis, and excellent levels of inter-rater agreement were achieved, with ICC > 0.75 and r > 0.9 for the total and anterior ratios. In the evaluation of the error of the digital method, some teeth showed a significant systematic error, the highest being 0.08 mm. The total tooth mass discrepancy analysis performed by Bolton Freeware, for cases in which the curve of Spee is mild or moderate, differs from the manual analysis by, on average, 0.09 mm and 0.07 mm respectively for each tooth evaluated, with r > 0.8 for the total and anterior ratios. According to the specificity and sensitivity test, Bolton Freeware has a better ability to detect true negatives, i.e. the presence of discrepancy. The digitally performed Bolton analysis was faster, with an average difference in the time consumed between the two methods of approximately 6 minutes. Most of the experts interviewed (93%) approved the usability of the software.
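
For context, the sketch below computes the textbook Bolton overall and anterior ratios from hypothetical mesiodistal widths; it illustrates the quantity Bolton Freeware automates, not the software's own code.

```python
# Minimal sketch (textbook Bolton formulas, illustrative widths): overall
# and anterior tooth-mass ratios of the kind Bolton Freeware computes.
# Bolton's reference means are about 91.3% (overall) and 77.2% (anterior).

def bolton_ratio(mandibular_widths, maxillary_widths):
    """Sum of mandibular mesiodistal widths over maxillary, in percent."""
    return 100.0 * sum(mandibular_widths) / sum(maxillary_widths)

# Hypothetical mesiodistal widths (mm), first molar to first molar (12 teeth).
mandibular_12 = [10.5, 7.2, 7.0, 7.0, 5.9, 5.4, 5.4, 5.9, 7.0, 7.0, 7.2, 10.5]
maxillary_12  = [10.0, 6.8, 7.0, 8.0, 6.5, 8.5, 8.5, 6.5, 8.0, 7.0, 6.8, 10.0]

overall = bolton_ratio(mandibular_12, maxillary_12)
anterior = bolton_ratio(mandibular_12[3:9], maxillary_12[3:9])  # canine to canine
print(f"Overall ratio:  {overall:.1f}% (reference ~91.3%)")
print(f"Anterior ratio: {anterior:.1f}% (reference ~77.2%)")
```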

Relevance:

20.00%

Publisher:

Abstract:

Instrumentation is a tool of fundamental importance for research in several areas of human knowledge. Research projects are often unfeasible when data cannot be obtained due to lack of instruments, especially because of importing difficulties and the high costs associated with them. Thus, in order to contribute to the enhancement of national technology, a multiband hand-held sun photometer (FSM-4) was developed to operate in the 500 nm, 670 nm, 870 nm and 940 nm bands. In the 500 nm, 670 nm and 870 nm bands, aerosols are monitored for evaluation of the AOD (Aerosol Optical Depth), and the PWC (Precipitable Water Column) is evaluated in the 940 nm band. For the development of the mechanical and electronic parts of the FSM-4, the materials and components had to combine low cost and quality of the data collected. The calibration process used the Langley method (ML) and the Modified Langley Method (MLM). These methods are usually applied at high altitudes in order to ensure atmospheric optical stability. This condition, however, can also be found at low-altitude sites, as shown in the research by Liu et al. (2010). Thus, for the calibration of the FSM-4, we investigated the atmospheric optical stability using the ML and MLM at a site in the city of Caicó/RN, located in the semiarid region of northeastern Brazil. This site lies in a region far away from large urban centers and from activities generating anthropogenic atmospheric pollution. Data for the calibration of the prototype were collected with the FSM-4 in two separate campaigns during the dry season, one in December 2012 and another in September 2013. The methodologies revealed optical atmospheric instability in the studied region through the dispersion of the values obtained for the calibration constant. This dispersion is affected by the variability of AOD and PWC during the application of the above-mentioned methods. As an alternative to the described sun photometer calibration, a short study was performed using the worldwide sun photometer network AERONET/NASA (AErosol RObotic NETwork, of the US space agency), installed in Petrolina/PE, Brazil. Data were collected for three days using the AERONET instruments and the FSM-4, operating simultaneously at the same site. By means of the ML and MLM techniques, convergent values were obtained for the calibration constants, despite the small amount of data collected. This calibration transfer methodology proved to be a viable alternative for the FSM-4 calibration.
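
A minimal sketch of the Langley regression underlying the ML calibration is shown below, using synthetic clear-sky data; the extraterrestrial constant and optical depth are invented for illustration.

```python
# Minimal sketch (synthetic clear-sky data): the Langley method used to
# calibrate a sun photometer channel. From Beer-Lambert, V = V0*exp(-m*tau),
# so regressing ln(V) on the air mass m gives ln(V0) as the intercept.
import numpy as np

rng = np.random.default_rng(4)

true_v0 = 1.50          # hypothetical extraterrestrial signal (V)
tau = 0.12              # hypothetical total optical depth at 500 nm
air_mass = np.linspace(2.0, 6.0, 40)
signal = true_v0 * np.exp(-air_mass * tau) * (1 + 0.01 * rng.standard_normal(40))

# Linear fit of ln(V) versus air mass: slope = -tau, intercept = ln(V0).
slope, ln_v0 = np.polyfit(air_mass, np.log(signal), 1)
print(f"Retrieved V0  = {np.exp(ln_v0):.3f} V (true {true_v0} V)")
print(f"Retrieved tau = {-slope:.3f}   (true {tau})")
```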

Relevance:

20.00%

Publisher:

Abstract:

Objective: To search the literature for scientific evidence supporting the benefits of individualizing the compensating curve in complete dentures by Paterson's grinding technique, compared with a non-individualized compensating curve. Methods: In December 2009, a systematic review of the literature was conducted in the MEDLINE, LILACS and BBO databases using the terms "prótese total" (complete denture) and "oclusão" (occlusion). To cover a larger amount of data, a manual search was also performed through the references of the initially selected articles. Results: 1273 references were retrieved from MEDLINE, 64 from LILACS and 103 from BBO, totaling 1440 references. Of these, only 24 addressed the subject of the compensating curve, to which 13 manually selected articles were added. Conclusion: From the results obtained, it is concluded that there are not enough data to clinically prove the benefits of individualizing the compensating curve compared with complete dentures with a non-individualized compensating curve. Randomized controlled clinical trials are needed to determine the most appropriate procedure.