20 results for Off-line TMAH-GC-MS
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
The Telehealth Brazil Networks Program, created in 2007 with the aim of strengthening primary care and the unified health system (SUS - Sistema Único de Saúde), uses information and communication technologies for distance-learning activities related to health. The technology enables interaction among health professionals and between professionals and their patients, furthering the capabilities of Family Health Teams (FHT). The program is grounded in legislation that defines the technologies, protocols and processes guiding the work of the Telehealth nuclei in providing services to the population. Among these services is teleconsulting: a registered consultation held between workers, professionals and managers of healthcare through bidirectional telecommunication instruments, intended to answer questions about clinical procedures, health actions and work-related matters. With the expansion of the program in 2011, it became possible to identify problems and challenges that affect virtually all nuclei, at different scales in each region. These problems include the heterogeneity of platforms, especially for teleconsulting, and low internet coverage in the municipalities, mainly in the interior cities of Brazil. From this perspective, this paper proposes a distributed architecture that uses mobile computing to enable the sending of teleconsultations. The architecture works offline; when an internet connection is available, the data are synchronized with the server. The data travel in compressed form to reduce the need for high transmission rates. Any Telehealth nucleus can use this architecture through an external service, coupled via a communication interface.
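As a sketch of the offline-first flow described above (store locally, compress, synchronize when a connection appears), the following minimal Python example may help; the endpoint URL, record layout and transport details are illustrative assumptions, not the program's actual implementation.

import json
import zlib
import urllib.request

SYNC_URL = "https://example.org/telehealth/sync"  # hypothetical endpoint

class OfflineTeleconsultQueue:
    """Stores teleconsultations locally and pushes them when a connection exists."""

    def __init__(self):
        self.pending = []  # compressed records awaiting synchronization

    def enqueue(self, teleconsult: dict) -> None:
        # Compress on entry, so both local storage and later transmission
        # benefit from the reduced size (low-bandwidth links).
        raw = json.dumps(teleconsult).encode("utf-8")
        self.pending.append(zlib.compress(raw, level=9))

    def sync(self) -> int:
        """Try to send every pending record; returns how many were delivered."""
        sent = 0
        while self.pending:
            req = urllib.request.Request(
                SYNC_URL,
                data=self.pending[0],
                headers={"Content-Encoding": "deflate",
                         "Content-Type": "application/json"},
            )
            try:
                with urllib.request.urlopen(req, timeout=10) as resp:
                    if resp.status != 200:
                        break
            except OSError:
                break  # still offline: keep records queued for the next attempt
            self.pending.pop(0)
            sent += 1
        return sent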
Abstract:
In recent years, the area of advanced materials has developed considerably, especially materials for industrial use, as is the case of catalysts with structured porosity suitable for catalytic processes. The use of catalysts combined with the fast pyrolysis process is an alternative for producing oxygenates of high added value because, in addition to increasing the yield and quality of the products, it makes it possible to manipulate the selectivity toward a product of interest, and therefore allows greater control over the characteristics of the final product. Based on these arguments, in this work titanium catalysts supported on MCM-41 were prepared for use in the catalytic pyrolysis of a biomass, elephant grass. The pyrolysis reactions of the biomass were performed in a micro-pyrolyzer, Py-5200, coupled to GC/MS, from CDS Corporation, headquartered in the United States. The Ti-MCM-41 catalysts with different molar ratios were characterized by XRD, TG/DTG, FT-IR, SEM, XRF, UV-visible spectroscopy, nitrogen adsorption, particle diameter distribution and specific surface area measurement by the BET method. The catalytic tests showed that the synthesized catalysts gave good results for the pyrolysis reaction. The main products were aldehydes, ketones and furans, obtained in higher yields. The best reactivity was observed to be a direct function of the Si/Ti ratio and of the nature and concentration of the active species on the mesoporous supports. Between the Ti-MCM-41 catalysts (molar ratios Si/Ti = 25 and 50), the ratio Si/Ti = 25 (400 °C and 600 °C) favored the cracking of oxygenates such as acids, aldehydes, ketones, furans and esters, while the sample with Si/Ti = 50 had the highest yield of aromatic oxygenates.
Abstract:
Estuaries are environments prone to the input of chemical pollutants of various kinds and origins, including polycyclic aromatic hydrocarbons (PAHs). Anthropogenic PAHs may have two possible sources: pyrolytic (four or more aromatic rings and a low degree of alkylation) and petrogenic (two or three aromatic rings and a high degree of alkylation). This study aimed to evaluate the levels, distribution and possible sources of polycyclic aromatic hydrocarbons in the estuary of the Potengi river, Natal, Brazil. Samples of bottom sediments were collected along the final 12 km of the estuary, up to its mouth to the sea, where the urbanization of Greater Natal is most concentrated. Sampling was performed on 12 cross sections, with three stations each, totaling 36 samples, identified as T1 to T36. The non-alkylated and alkylated PAHs were analyzed by gas chromatography coupled to mass spectrometry (GC/MS). PAHs were detected at all 36 stations, with total concentrations varying from 174 to 109,407 ng g-1. These values are comparable to those of several estuarine regions worldwide with high anthropogenic influence, suggesting that diffuse contamination is established in the estuary. PAH profiles were similar for most stations. In 32 of the 36 stations, low molecular weight PAHs (with 2 and 3 rings: naphthalene, phenanthrene and their alkylated homologues) prevailed, ranging from 54% to 100% of the total PAHs, indicating that leaks, spills and fuel combustion are the dominant sources of PAH pollution in the estuary. The level of contamination by PAHs at most stations suggests a potential risk of occasional adverse biological effects, but at some stations adverse impacts on the biota may occur frequently. Diagnostic ratios could differentiate the sources of PAHs in the sediments of the estuary, which were divided into three groups: petrogenic, pyrolytic and mixed sources. The urban concentration of Greater Natal and the various industrial activities associated with it can be blamed as potential sources of PAHs in the bottom sediments of the estuary studied. The data presented highlight the need to control the causes of the existing pollution in the estuary.
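The diagnostic-ratio classification mentioned above can be illustrated with two ratios widely used in the PAH source-apportionment literature; the thresholds below are commonly cited literature values, and the abstract does not state which ratios or cut-offs the thesis actually used.

def classify_pah_source(ant: float, phe: float, fla: float, pyr: float) -> str:
    """Classify PAH origin from two common diagnostic ratios (illustrative)."""
    r1 = ant / (ant + phe)   # anthracene / (anthracene + phenanthrene)
    r2 = fla / (fla + pyr)   # fluoranthene / (fluoranthene + pyrene)
    if r1 < 0.10 and r2 < 0.40:
        return "petrogenic"
    if r1 >= 0.10 and r2 > 0.50:
        return "pyrolytic"
    return "mixed sources"

# Example: a station dominated by low-ring, highly alkylated PAHs
print(classify_pah_source(ant=5.0, phe=120.0, fla=30.0, pyr=90.0))  # petrogenic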
Abstract:
The fast pyrolysis of lignocellulosic biomass is a thermochemical conversion process for energy production that has become very attractive due to the energetic use of its products: gas (CO, CO2, H2, CH4, etc.), liquid (bio-oil) and charcoal. Bio-oil is the main product of fast pyrolysis, and its final composition and characteristics are intrinsically related to the quality of the biomass (ash content, moisture, content of cellulose, hemicellulose and lignin) and to the efficiency of removal of oxygenated compounds, which cause undesirable features such as increased viscosity, instability, corrosiveness and low calorific value. The oxygenates originate in the conventional biomass pyrolysis process, and the use of solid catalysts allows these products to be minimized, improving bio-oil quality. The present study aims to evaluate the products of the catalytic pyrolysis of elephant grass (Pennisetum purpureum Schum) using solid catalysts, namely tungsten oxides, supported or not on mesoporous materials such as MCM-41 and silica derived from rice husk ash, with the goal of reducing the oxygenates produced in pyrolysis. The treatment of the biomass by washing with heated water (CEL) or with an acid solution (CELix), and the application of tungsten catalysts to the vapors from the pyrolysis process, were designed to improve the quality of the pyrolysis products. Conventional and catalytic pyrolysis of the biomass was performed in a micro-pyrolyzer, Py-5200, coupled to GC/MS. The synthesized catalysts were characterized by X-ray diffraction, infrared spectroscopy, X-ray fluorescence, temperature-programmed reduction and thermogravimetric analysis. Kinetic studies applying the Flynn and Wall model were performed in order to evaluate the apparent activation energy of holocellulose thermal decomposition in the elephant grass samples (CE, CEL and CELix). The results show the effectiveness of the treatment process, reducing the ash content; a decrease in the apparent activation energy of the treated samples was also observed. The catalytic pyrolysis process converted most of the oxygenated compounds into aromatics such as benzene, toluene, ethylbenzene, etc.
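For reference, the Flynn and Wall method cited above is an isoconversional technique; in its usual form (not reproduced in the abstract) it relies on Doyle's approximation of the temperature integral:

\log \beta = \log\!\left(\frac{A\,E_a}{R\,g(\alpha)}\right) - 2.315 - 0.4567\,\frac{E_a}{R\,T}

so that, at a fixed conversion \alpha, the slope of \log\beta versus 1/T across several heating rates \beta yields the apparent activation energy

E_a = -\frac{R}{0.4567}\,\frac{d(\log\beta)}{d(1/T)}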
Abstract:
Oily sludge is a complex mixture of hydrocarbons, organic and inorganic impurities, and water. One of the major problems currently found in the petroleum industry is the management (packaging, storage, transport and fate) of this waste. Mesoporous and microporous nanomaterials (catalysts) are considered promising for refining processes and as adsorbents for environmental protection. The aim of this work was to study oily sludge from primary processing (raw and treated) and vacuum residue, applying thermal analysis techniques (pyrolysis) as well as thermal and catalytic pyrolysis with nanomaterials, aiming at the production of petroleum derivatives. The sludge and vacuum residue were analyzed using a Soxhlet extraction system, elemental analysis, thin layer chromatography, thermogravimetry and pyrolysis coupled to gas chromatography/mass spectrometry (Py-GC-MS). The catalysts AlMCM-41, AlSBA-15.1 and AlSBA-15.2 were synthesized with a silicon-to-aluminum molar ratio of 50 (Si/Al = 50), using tetraethylorthosilicate as the silicon source and pseudoboehmite (AlOOH) as the aluminum source. The analyses of the catalysts indicate that the materials showed hexagonal structure, with surface areas of 783.6 m2/g for AlMCM-41, 600 m2/g for AlSBA-15.1 and 377 m2/g for AlSBA-15.2. The extracted oily sludge contained 65 to 95% organic components (oil) and 5 to 35% inorganic components (salts and oxides), with different compositions of derivatives. The AlSBA-15 catalysts showed the best performance in the analyses for the production of petroleum derivatives, with a 20% increase in the production of kerosene and light gas oil. The energy potential of the sludge was high, and it can be used as fuel in other streams processed in the refinery.
Abstract:
Natural oils have shown scientific importance due to their pharmacological activity and renewable character. Copaiba (Copaifera langsdorffii) and bullfrog (Rana catesbeiana Shaw) oils are used in folk medicine, particularly because of their anti-inflammatory and antimicrobial activities. Emulsions could be suitable systems to improve the palatability and fragrance, enhance the pharmacological activities and reduce the toxicological effects of these oils. The aim of this work was to investigate the antimicrobial activity of emulsions based on copaiba (oil-resin and essential oil) and bullfrog oils against fungi and bacteria which cause skin diseases. Firstly, the essential oil was extracted from the copaiba oil-resin and the oils were characterized by gas chromatography coupled to mass spectrometry (GC-MS). Secondly, the emulsion systems were produced. A microbiological screening test with all products was performed, followed by determination of the minimum inhibitory concentration, the bioautography method and the antibiofilm assay. Staphylococcus aureus, S. epidermidis, Pseudomonas aeruginosa, Candida albicans, C. parapsilosis, C. glabrata, C. krusei and C. tropicalis from the American Type Culture Collection (ATCC), as well as clinical samples, were used. The emulsions based on copaiba oil-resin and essential oil improved the antimicrobial activity of the pure oils, especially against Staphylococcus and Candida strains resistant to azoles. The bullfrog oil emulsion and the pure bullfrog oil showed a weaker effect on the microorganisms when compared to the copaiba samples. All the emulsions showed significant antibiofilm activity by inhibiting cell adhesion. Thus, it may be concluded that emulsions based on copaiba and bullfrog oils are promising candidates for the treatment of fungal and bacterial skin infections.
Abstract:
Most algorithms for state estimation based on the classical model are adequate only for use in transmission networks. Few algorithms have been developed specifically for distribution systems, probably because of the small amount of data available in real time. Most overhead feeders have only current and voltage measurements at the medium-voltage bus-bar of the substation. Thus, classical algorithms are difficult to implement, even considering off-line acquired data as pseudo-measurements. However, the need to automate the operation of distribution networks, mainly with regard to the selectivity of protection systems and to the possibility of load transfer maneuvers, is changing network planning policy. Accordingly, equipment incorporating telemetry and command modules has been installed to improve operational features, increasing the amount of measurement data available in real time at the System Operation Center (SOC). This encourages the development of a state estimator model involving real-time information and pseudo-measurements of loads, built from typical power factors and utilization factors (demand factors) of distribution transformers. This work reports the development of a new state estimation method, specific for radial distribution systems. The main algorithm of the method is based on the power summation load flow. The estimation is carried out piecewise, section by section of the feeder, going from the substation to the terminal nodes. For each section, a measurement model is built, resulting in a nonlinear overdetermined set of equations whose solution is achieved by the Gaussian normal equations. The estimated variables of a section are used as pseudo-measurements for the next section. In general, the measurement set for a generic section consists of pseudo-measurements of power flows and nodal voltages obtained from the previous section (or real-time measurements, where they exist), besides pseudo-measurements of injected powers for the power summations, whose functions are the load flow equations, assuming that the network can be represented by its single-phase equivalent. The great advantage of the algorithm is its simplicity and low computational effort. Moreover, the algorithm is very efficient with regard to the accuracy of the estimated values. Besides the power summation state estimator, this work shows how other algorithms could be adapted to provide state estimation for medium-voltage substations and networks, namely Schweppe's method and an algorithm based on current proportionality that is usually adopted for network planning tasks. Both estimators were implemented not only as alternatives to the proposed method, but also to produce results that support its validation. Since in most cases no power measurement is performed at the beginning of the feeder, and this is required by the power summation estimation method, a new algorithm for estimating the network variables at the medium-voltage bus-bar was also developed.
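The per-section estimation step described above, a nonlinear overdetermined measurement model solved through the normal equations, can be sketched as a Gauss-Newton iteration. This is a generic illustration consistent with the abstract; the actual load-flow functions h, Jacobians H and weights are specific to the thesis.

import numpy as np

def gauss_newton_section(z, h, H, x0, W=None, tol=1e-6, max_iter=20):
    """Estimate one feeder section's state from the overdetermined
    nonlinear model z = h(x) + e via the (weighted) normal equations.

    z  : real-time measurements plus pseudo-measurements
    h  : x -> predicted measurements (load-flow equations)
    H  : x -> Jacobian of h at x
    x0 : initial guess (e.g., values carried from the previous section)
    W  : weight matrix (inverse measurement error covariance)
    """
    x = np.asarray(x0, dtype=float)
    W = np.eye(len(z)) if W is None else W
    for _ in range(max_iter):
        r = z - h(x)                                    # residuals
        J = H(x)
        dx = np.linalg.solve(J.T @ W @ J, J.T @ W @ r)  # normal equations
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x  # estimated state, reused as pseudo-measurements downstream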
Abstract:
This work deals with the development of a quadrotor helicopter prototype for monitoring applications in oil facilities. Anomaly detection problems can be solved through monitoring missions performed by a suitably instrumented quadrotor, i.e. one with embedded infrared thermosensors. The proposed monitoring system aims to reduce accidents and to make possible the use of non-destructive techniques for the detection and location of leaks caused by corrosion. To this end, the implementation of a prototype, its stabilization and a navigation strategy are proposed. The control strategy is based on dividing the problem into two hierarchical control levels: the lower level stabilizes the angles and the altitude of the vehicle at the desired values, while the higher level provides appropriate reference signals to the lower level so that the quadrotor performs the desired movements. The navigation strategy for the quadrotor helicopter uses information provided by an image acquisition system (monocular camera) embedded in the helicopter. Considering the low-level control solved, the proposed vision-based navigation technique treats the problem through high-level control strategies, such as relative position control, trajectory generation and trajectory tracking. For position control, a visual servoing technique based on image features is used. Trajectory generation is done in an offline step, producing a visual trajectory composed of a sequence of images. For the trajectory tracking problem, a continuous visual servoing control strategy is proposed, thus enabling navigation without metric maps. Simulation and experimental results are presented to validate the proposal.
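The two-level hierarchy described above can be summarized in a short sketch: the outer loop turns an image-feature error into attitude/altitude references, and the inner loop stabilizes the vehicle toward them. Gains, state layout and the feature error below are illustrative assumptions, not the thesis's actual controller.

import numpy as np

KP_ATT, KD_ATT = 4.0, 1.2   # inner-loop attitude/altitude gains (assumed)
KP_POS = 0.5                # outer-loop visual-servoing gain (assumed)

def high_level(feature_error):
    """Outer loop: map an image-feature error from the monocular camera
    to roll/pitch/yaw/altitude references for the inner loop."""
    refs = KP_POS * feature_error
    return np.clip(refs, -0.3, 0.3)   # keep the references small and safe

def low_level(refs, angles, rates):
    """Inner loop: PD stabilization of angles and altitude toward refs."""
    return KP_ATT * (refs - angles) - KD_ATT * rates

# One control step: camera error -> references -> actuator commands
feature_error = np.array([0.10, -0.05, 0.00, 0.20])  # u, v, yaw, altitude
u = low_level(high_level(feature_error), np.zeros(4), np.zeros(4))
print(u)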
Abstract:
The exponential growth in radio frequency (RF) applications is accompanied by great challenges, such as more efficient use of the spectrum and the design of new architectures for multi-standard receivers or software defined radio (SDR). The key challenge in designing a software defined radio architecture is the implementation of a wide-band receiver that is reconfigurable, low cost, low power, highly integrated and flexible. As a new solution for SDR design, a direct demodulator architecture based on five-port technology, or multi-port demodulator, has been proposed. However, the use of the five-port circuit as a direct-conversion receiver requires an I/Q calibration (or regeneration) procedure in order to generate the in-phase (I) and quadrature (Q) components of the transmitted baseband signal. In this work, we propose to evaluate the performance of a blind calibration technique based on independent component analysis for the I/Q regeneration of the five-port downconverter, which requires no knowledge of training or pilot sequences of the transmitted signal, exploiting instead the statistical properties of the three output signals.
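The blind I/Q regeneration idea can be demonstrated with a toy model: treat the three five-port outputs as unknown linear combinations of independent I and Q components and let ICA recover them. The mixing matrix, offsets and QPSK-like source below are illustrative assumptions, not the thesis's RF front-end equations.

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 5000

# Independent in-phase and quadrature components (QPSK-like, +/-1)
i_sig = rng.choice([-1.0, 1.0], size=n)
q_sig = rng.choice([-1.0, 1.0], size=n)
s = np.column_stack([i_sig, q_sig])

# Three observed outputs: unknown mixing of I and Q plus DC offsets
A = np.array([[1.0, 0.2],
              [0.3, 0.9],
              [0.7, 0.6]])
x = s @ A.T + np.array([0.5, -0.2, 0.1])

# Blind regeneration: recover two independent components from the three
# outputs (up to scaling, sign and permutation ambiguities).
ica = FastICA(n_components=2, fun="cube", random_state=0)  # "cube" suits sub-Gaussian sources
s_hat = ica.fit_transform(x)

best = max(abs(np.corrcoef(s_hat[:, k], i_sig)[0, 1]) for k in range(2))
print(f"best |correlation| with the true I component: {best:.3f}")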
Abstract:
This work proposes a hardware architecture, described in VHDL, developed to embed an Artificial Neural Network (ANN) of the Multilayer Perceptron (MLP) type. The idea is that, with this architecture, applications could easily embed several different MLP network topologies in the industrial field. The MLP topology in which the architecture is configured is defined by a simple, specific data input (instructions) that determines the number of layers and of perceptrons in the network. In order to support several MLP topologies, datapath components and a controller were developed to execute these instructions. Thus, a user defines a group of previously known instructions which determine the ANN characteristics. The system guarantees the MLP execution through the neural processors (perceptrons), the datapath components and the controller that were developed. On the other hand, the biases and weights must be static: the ANN to be embedded must have been trained previously, off-line. No knowledge of the system's internal characteristics or of the VHDL language is required from the user. A reconfigurable FPGA device was used to implement, simulate and test the whole system, allowing its application to several real-world problems.
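Functionally, the embedded datapath computes feed-forward MLP inference with static, previously trained weights. A brief software model of that computation follows; the topology encoding shown is an illustrative assumption, not the thesis's actual instruction format.

import numpy as np

def mlp_forward(x, layers):
    """layers: list of (W, b) pairs, one per layer, trained off-line."""
    for W, b in layers:
        x = 1.0 / (1.0 + np.exp(-(W @ x + b)))  # sigmoid perceptron layer
    return x

# Topology "instructions": 3 inputs -> 4 hidden -> 2 outputs
rng = np.random.default_rng(1)
layers = [(rng.normal(size=(4, 3)), np.zeros(4)),
          (rng.normal(size=(2, 4)), np.zeros(2))]

print(mlp_forward(np.array([0.5, -0.2, 0.8]), layers))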
Abstract:
Brain-Computer Interfaces (BCI) have as their main purpose establishing a communication path with the central nervous system (CNS) independent of the standard pathway (nerves, muscles), aiming to control a device. The main objective of the current research is to develop an off-line BCI that separates the different EEG patterns resulting from strictly mental tasks performed by an experimental subject, comparing the effectiveness of different signal-preprocessing approaches. We also tested different classification approaches: all versus all, one versus one, and a hierarchic classification approach. No preprocessing technique was found able to improve the system performance. Furthermore, the hierarchic approach proved capable of producing results above those expected from the literature.
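The classification schemes compared above can be sketched on synthetic data; the features, the SVM classifier and the particular hierarchy split below are assumptions for illustration, since the abstract does not specify them.

import numpy as np
from sklearn.svm import SVC
from sklearn.multiclass import OneVsOneClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))      # 300 trials, 8 EEG-derived features
y = rng.integers(0, 3, size=300)   # 3 mental tasks

ovo = OneVsOneClassifier(SVC()).fit(X, y)   # one-versus-one scheme

# A two-stage hierarchic scheme: first separate task 0 from the rest,
# then discriminate task 1 from task 2 within the remainder.
stage1 = SVC().fit(X, (y == 0).astype(int))
mask = y != 0
stage2 = SVC().fit(X[mask], y[mask])

def hierarchic_predict(x):
    x = x.reshape(1, -1)
    return 0 if stage1.predict(x)[0] == 1 else int(stage2.predict(x)[0])

print(ovo.predict(X[:5]), [hierarchic_predict(x) for x in X[:5]])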
Abstract:
Extraction with pressurized fluids has become an attractive process for the extraction of essential oils, mainly due to the specific characteristics of fluids near the critical region. This work presents results of the extraction of the essential oil of Cymbopogon winterianus J. with CO2 at high pressures. The effect of the following variables was evaluated: solvent flow rate (0.37 to 1.5 g CO2/min), pressure (66.7 and 75 bar) and temperature (8, 10, 15, 20 and 25 °C) on the extraction kinetics and total yield of the process, as well as on the solubility and composition of the C. winterianus essential oil. The experimental apparatus consisted of a fixed-bed extractor, and the dynamic method was adopted for the calculation of the oil solubility. Extractions were also carried out by conventional techniques (steam extraction and organic solvent extraction). The determination and identification of the extract composition were done by gas chromatography coupled with mass spectrometry (GC-MS). The extract composition varied as a function of the operating conditions studied and of the extraction method used. The main components obtained in the CO2 extraction were elemol, geraniol, citronellol and citronellal; for the steam extraction they were citronellal, citronellol and geraniol, and for the organic solvent extraction, azulene and hexadecane. The highest yield (2.76%) and oil solubility (2.49x10-2 g oil/g CO2) were obtained by CO2 extraction under the operating conditions of T = 10 °C, P = 66.7 bar and a solvent flow rate of 0.85 g CO2/min.
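In the dynamic method mentioned above, solubility is taken from the initial, solubility-controlled part of the extraction curve: the slope of accumulated oil mass versus mass of CO2 fed gives g oil per g CO2. The data points below are assumed for illustration only.

import numpy as np

m_co2 = np.array([0.0, 10.0, 20.0, 30.0, 40.0])  # g of CO2 passed (assumed)
m_oil = np.array([0.0, 0.25, 0.50, 0.74, 1.00])  # g of oil collected (assumed)

slope, _ = np.polyfit(m_co2, m_oil, 1)  # linear fit of the initial region
print(f"estimated solubility: {slope:.2e} g oil / g CO2")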
Abstract:
The decontamination of materials has been the subject of several studies. One of the factors that increase pollution is irresponsibility in the disposal of toxic waste, for example the presence of PCBs (polychlorinated biphenyls) in the environment. Under Brazilian regulations, material contaminated with PCBs at concentrations higher than 50 ppm must be stored in special places or destroyed, usually by incineration in a dual-step plasma furnace. Due to the high cost of this procedure, new methodologies for PCB removal have been studied. The objective of this study was to develop an experimental and analytical methodology for quantifying the removal of PCBs by supercritical fluid extraction and by the Soxhlet method, and to assess the technical efficiency of the two extraction processes in the treatment of PCB-contaminated materials. The materials studied were soil and wood, both with simulated contamination at concentrations of 6,000, 33,000 and 60,000 mg of PCB/kg of material. Soxhlet extractions were performed using 100 ml of hexane at a temperature of 180 °C. Supercritical fluid extractions were performed at 200 bar and 70 °C, with a supercritical CO2 flow rate of 3 g/min, for 1-3 hours. The extracts obtained were quantified by gas chromatography-mass spectrometry (GC/MS). The conventional extractions followed a 2² factorial experimental design, with the aim of studying the influence of two variables of the Soxhlet extraction process, contaminant concentration and extraction time, on the maximum removal of PCBs from the materials. The Soxhlet extractions were efficient for the extraction of PCBs from soil and wood with both solvents studied (hexane and ethanol). In the soil extractions, the best PCB removal efficiency using ethanol as solvent was 81.3%, against 95% for the extraction using hexane, for the same extraction time. The results for wood showed statistically that there is no difference between the extractions with the two solvents studied. Supercritical fluid extraction under the conditions studied showed better efficiency for the extraction of PCBs from the wood matrix than from soil: for two-hour extractions, 43.9 ± 0.5% of the total PCBs were extracted from the soil, against 95.1 ± 0.5% from the wood. The results demonstrate that the extractions were satisfactory for both techniques studied.
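A 2² factorial design like the one cited above can be analyzed by fitting main effects and the interaction from the coded design matrix. The coded levels and response values below are assumptions for illustration; the thesis's measured removals are not given in the abstract.

import numpy as np
from itertools import product

design = np.array(list(product([-1, +1], repeat=2)))  # 4 runs: (conc, time)
y = np.array([62.0, 78.0, 70.0, 95.0])                # assumed removal (%)

# Model matrix: intercept, two main effects, one interaction
X = np.column_stack([np.ones(4), design, design[:, 0] * design[:, 1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["mean", "conc", "time", "conc x time"], coef.round(2))))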
Abstract:
The optimization and control of a chemical process are strongly correlated with the quantity of information that can be obtained from the system. In biotechnological processes, where the transforming agent is a cell, many variables can interfere in the process, leading to changes in the microorganism's metabolism and affecting the quantity and quality of the final product. Therefore, continuous monitoring of the variables that interfere in the bioprocess is crucial for acting on certain variables of the system, keeping it under desirable operating conditions and under control. In general, during a fermentation process, the analysis of important parameters such as substrate, product and cell concentrations is done off-line, requiring sampling, pretreatment and analytical procedures. These steps demand significant run time and the use of high-purity chemical reagents. In order to implement a real-time monitoring system for a benchtop bioreactor, this study was conducted in two steps: (i) the development of software that provides a communication interface between bioreactor and computer, based on the acquisition and recording of the process variables pH, temperature, dissolved oxygen, level, foam level and agitation frequency, and on the input of the setpoints of the operational parameters of the bioreactor control unit; (ii) the development of an analytical method using near-infrared spectroscopy (NIRS) to enable monitoring of substrate, product and cell concentrations during a fermentation process for ethanol production using the yeast Saccharomyces cerevisiae. Three fermentation runs were conducted (F1, F2 and F3), monitored by NIRS with subsequent sampling for analytical characterization. The data obtained were used for calibration and validation, with pre-treatments, combined or not with smoothing filters, applied to the spectral data. The most satisfactory results were obtained when the calibration models were constructed from real samples of culture medium taken from the fermentation assays F1, F2 and F3, showing that the analytical method based on NIRS can be used as a fast and effective method to quantify cell, substrate and product concentrations, which enables the implementation of in-situ real-time monitoring of fermentation processes.
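The calibration step described above (spectral pre-treatment plus a multivariate model from spectra to concentrations) can be sketched with Savitzky-Golay smoothing and PLS regression, a common chemometric pairing; the abstract does not name the regression method, and the synthetic spectra below merely stand in for the F1-F3 data.

import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 200
spectra = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)
ethanol = spectra[:, 80] * 0.05 + rng.normal(scale=0.1, size=n_samples)

# Pre-treatment: Savitzky-Golay smoothing along the wavelength axis
smoothed = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)

X_cal, X_val, y_cal, y_val = train_test_split(
    smoothed, ethanol, test_size=0.25, random_state=0)

model = PLSRegression(n_components=5).fit(X_cal, y_cal)
print(f"validation R^2: {model.score(X_val, y_val):.2f}")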
Abstract:
Currently, several psychological and non-psychological tests can be found published without standardized procedures in different psychological areas, such as intelligence, emotional states, attitudes, social skills, vocation and preferences. Computerized psychological testing is an extension of traditional psychological testing practices. However, it has its own psychometric qualities, whether from its adaptation to a computerized environment or from the extensions that can be developed within it. The current research, arising from the need to study validity and reliability processes in a computerized test, designed a methodological structure to provide parallel applications in numerous kinds of operational groups, evaluating the influence of time and approach on the computerization process. This validation refers to group normative values, reproducibility of the computerized application process, and data processing. Not every psychological test can be computerized. Therefore, the need for a good test, with quality and properties suitable for transformation into a computerized application, led us to use the Millon Personality Inventory, created by Theodore Millon. This inventory assesses personality according to 12 bipolarities distributed over 24 factors, in the categories motivational styles, cognitive targets and interpersonal relations. The instrument does not diagnose pathological features, but tests normal and non-adaptive aspects of human personality, in accordance with Theodore Millon's theory of personality. To ground this research in the Brazilian context of psychological testing, we discuss the theme, evaluating the advantages and disadvantages of such practices, as well as the current forms of computerization of psychological testing and the main specific criteria in this specialized area of psychometric knowledge. The test was available on-line, hosted at the site http://www.planetapsi.com, during 2007 and 2008, with a questionnaire describing social characteristics made available before the test. A report was generated from the data entered by each user. The test was applied linearly with national coverage in all Brazilian regions, obtaining 1,508 applications. Nine groups were organized, reaching 180 test-retest applications, in which three time periods and three retest formats were separated for the on-line test studies. In parallel, we organized a multi-application-session off-line group of 20 subjects who received the tests by e-mail. The subjects of this study were distributed across the five Brazilian regions and were notified about the test via the Internet. The performance of the traditionally and on-line tested groups allows us to conclude that on-line application provides significant consistency in all validity criteria studied and justifies its use. The on-line test results were not only related among themselves but were also similar to the data of tests done with pencil and paper (0.82). The retest results demonstrated correlations between 0.92 and 1, while the multi-session group showed good correlation in these comparisons. Moreover, the adequacy of the operational criteria used was assessed: security, user performance, environmental characteristics, organization of the database, operational costs and the limitations of this on-line inventory. In all these items there were excellent performances, leading also to the conclusion that a self-applied psychometric test is possible.
The results of this work serve as a guide for questioning and establishing methodological studies for computerized psychological testing software in the country.