35 results for "digital extraction model"
Abstract:
This study investigates teacher training and the cognitive practice of teachers in a Basic Education school that adopted the One Computer per Student (UCA) Project in its school routine. Its relevance lies in providing directions for the continuation of training activities within the Project and in guiding teachers in their pedagogical practices using the one-to-one laptop model. The thesis defended is that teacher education for the social use of digital media (especially the laptops of the UCA Project) opens space for new sociotechnical relationships, new social and professional practices, new identity components, and a process of reflexivity and reconstruction of teaching knowledge. We reaffirm the importance of reflexivity and of the appropriation of digital culture for the development of teaching practice with Information and Communication Technologies (ICTs), focusing on the social and professional uses of technology. The study is qualitative and consists of procedural tracking based on principles of ethnographic research. The procedures and methodological tools used were intensive observation of school environments, documentary analysis, a focus group, semi-structured questionnaires, and semi-structured individual interviews. The research was conducted in a public school in the city of Parnamirim - RN. The sample comprises 17 teachers from Elementary School I and II, Youth and Adult Education, and High School, who went through the UCA training process and introduced the laptops into their teaching. The research corpus is structured from the messages produced during data collection and is analyzed according to the principles of Content Analysis as specified by Laurence Bardin (2011). The theoretical framework draws on Tardif (2000; 2011), Pimenta (2009), Gorz (2004; 2005), Giddens (1991), Dewey (1916), Bourdieu (1994; 1999), and Freire (1996; 2005), among others.
The analysis indicates a process of reconstruction and revision of the knowledge required to teach and work in digital culture, this knowledge being guided by the experience of the subjects investigated. The reconstructed knowledge is revealed through a categorization process: the groups "technical knowledge", "didactic-methodological knowledge", and "professionalization knowledge" were built on the assumption of appropriation of digital culture in the educational context. The analysis confirms the emergence of new forms of sociability as teachers acquire other ways of acting on and thinking about ICTs, despite an environment adverse to shared reflexivity. Based on the concept of appropriation present in the data analysis, it also reveals the construction of meanings of belonging and the transformation of individuals along social routes through the interweaving of teaching practice with digital culture. Finally, it emphasizes the importance of training for the use of ICTs that goes beyond instrumentation, that is, beyond what we call "technical knowledge", and that takes as its structural basis shared reflection, openness to resignification, and the reconstruction of new knowledge and practices, truly allowing the teacher to live an experience capable of producing sociotechnical transformations in their relationships.
Abstract:
This Master's dissertation investigated the performance and quality of websites. The aim of the research was to propose an integrated model for evaluating digital information services on educational websites. The research universe consisted of eighteen Brazilian universities offering graduate programs at the master's and doctoral levels in Production Engineering. The methodology adopted was descriptive and exploratory, using systematic observation and a focus group for data collection, with dependent and independent variables, through the application of two research instruments. An analysis protocol was the instrument adopted to obtain the qualitative results, and an analysis grid was applied to obtain the quantitative results. The qualitative results identified a lack of standardization across the websites with respect to content, information hierarchy, and the design of colors and typefaces. The absence of accessibility for users with hearing and visual impairments was observed, as well as a lack of convergence of media and assistive technologies. The language of the sites was also evaluated, and all of them were available in Portuguese only. The overall result is presented in charts and tables ranking the universities, with the rating "Good" predominating. For the quantitative results, statistical analysis was used to obtain descriptive and inferential results relating the dependent and independent variables. For the category analysis of the services offered by the evaluated sites, scores and a general weighted index were computed. These results served as the basis for ranking the universities according to the presence or absence of service information on their websites.
In the inferential analysis, correlation or association tests were run between the independent variables (program level, CAPES concept, and period of existence of the program) and the service categories. The statistical methods used were Spearman's coefficient and Fisher's exact test. Only the category "disciplines of the Master's program" showed a significant association with the independent variable CAPES concept. The main conclusion of this study is the absence of standardization regarding the subjective aspects (design, information hierarchy, navigability, and content accuracy) and the lack of accessibility and convergence. Regarding the quantitative aspects, the information services offered by the websites of the evaluated universities still do not present satisfactory and comprehensive quality. There is a perceived absence of strategies, web tools, institutional marketing techniques, and services that would make the sites more interactive, navigable, and value-adding.
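The inferential step described above can be sketched numerically. The snippet below computes Spearman's rank correlation from scratch (rho is the Pearson correlation of the ranks); the CAPES concepts and weighted service indices are invented for illustration and are not data from the dissertation.

```python
# Spearman's rank correlation, as used to associate CAPES concepts with
# website service categories. Data below are invented for illustration.

def ranks(values):
    """Assign average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented example: CAPES concepts vs. weighted service index of 6 programs
capes = [3, 4, 4, 5, 6, 7]
index = [0.41, 0.52, 0.48, 0.60, 0.75, 0.83]
rho = spearman(capes, index)  # close to +1: strong monotone association
```

A rho near +1 would indicate that programs with higher CAPES concepts tend to offer richer service information on their sites, which is the kind of association the dissertation tested.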
Abstract:
In this work we propose a new approach to Interactive Digital Television (IDTV) aimed at exploring the concept of immersion. Several architectures have been proposed for IDTV, but they have not coherently addressed questions related to immersion. The goal of this thesis is to define formally what immersion and interactivity mean for digital TV and how they may be used to improve the user experience in this new television model. The approach raises questions such as the appropriate choice of equipment to support the sense of immersion; which forms of interaction between users can be exploited in the interaction-immersion context; whether the environment in which an immersive and interactive application is used can influence the user experience; and which new forms of interactivity among users, and between users and interactive applications, can be explored through immersion. As one of the goals of this proposal, we point out new solutions to these issues that require further study. We formalize the concepts surrounding interactivity in the Brazilian digital TV system; in an initial study, this definition is organized into categories, or levels, of interactivity. From this point, analyses and specifications are made to achieve immersion using DTV. We intend to conduct case studies of immersive interactive applications for digital television in order to validate the proposed architecture. We also address the use of remote devices and propose a middleware architecture that allows their use in conjunction with immersive interactive applications.
Abstract:
The petroleum industry, as a consequence of intense exploration and production activity, is responsible for a large share of residue generation, these residues being considered toxic and polluting to the environment. Among them is oil sludge, produced during the production, transportation, and refining phases. The purpose of this work was to develop a process to recover the oil present in oil sludge, so that the recovered oil can be used as fuel or returned to the refining plant. From preliminary tests, the most important independent variables were identified: temperature, contact time, and solvent and acid volumes. Initially, a series of parameters was determined to characterize the oil sludge. A special extractor was designed to work with oily waste. Two experimental designs were applied: fractional factorial and Doehlert. The tests were carried out in batch mode under the conditions of the experimental designs. The efficiency obtained in the oil extraction process was 70% on average. The oil sludge is composed of 36.2% oil, 16.8% ash, 40% water, and 7% volatile constituents. However, the statistical analysis showed that the quadratic model did not fit the process well, with a relatively low coefficient of determination (60.6%), a consequence of the complexity of the oil sludge. To obtain a model able to represent the experiments, an artificial neural network (ANN) was used, trained initially with 2, 4, 5, 6, 7, and 8 neurons in the hidden layer, 64 experimental results, and 10,000 presentations (iterations). The smallest dispersion between experimental and calculated values was obtained with 4 neurons, considering the ratio of experimental points to estimated parameters. The analysis of the average test deviations divided by the respective training deviations showed that 2,150 presentations gave the best parameter values.
For the new model, the coefficient of determination was 87.5%, which is quite satisfactory for the studied system.
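The network described above (one hidden layer, 4 neurons) can be sketched as follows. This is a minimal from-scratch illustration, not the thesis code: the architecture matches the abstract's description, but the training data below are synthetic and the input scaling, learning rate, and epoch count are assumptions.

```python
# Feed-forward network with one hidden layer of 4 tanh neurons, trained by
# stochastic gradient descent to map scaled process variables (temperature,
# contact time, solvent volume) to extraction efficiency. Synthetic data.
import math, random

random.seed(0)

def mlp_train(X, y, hidden=4, epochs=2000, lr=0.1):
    n_in = len(X[0])
    W1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    W2 = [random.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, t in zip(X, y):
            h = [math.tanh(sum(w * xi for w, xi in zip(W1[j], x)) + b1[j])
                 for j in range(hidden)]
            out = sum(w * hj for w, hj in zip(W2, h)) + b2
            err = out - t
            # backpropagation of the squared error
            for j in range(hidden):
                gW2 = err * h[j]
                gh = err * W2[j] * (1 - h[j] ** 2)
                W2[j] -= lr * gW2
                for i in range(n_in):
                    W1[j][i] -= lr * gh * x[i]
                b1[j] -= lr * gh
            b2 -= lr * err
    def predict(x):
        h = [math.tanh(sum(w * xi for w, xi in zip(W1[j], x)) + b1[j])
             for j in range(hidden)]
        return sum(w * hj for w, hj in zip(W2, h)) + b2
    return predict

# Synthetic scaled data: (temperature, time, solvent volume) -> efficiency
X = [[0.1, 0.2, 0.3], [0.4, 0.1, 0.5], [0.8, 0.6, 0.2],
     [0.3, 0.9, 0.7], [0.6, 0.4, 0.9], [0.9, 0.8, 0.8]]
y = [0.35, 0.50, 0.62, 0.55, 0.71, 0.78]
f = mlp_train(X, y)
mse = sum((f(x) - t) ** 2 for x, t in zip(X, y)) / len(y)
```

The thesis's observation that 4 hidden neurons gave the best trade-off reflects the ratio between the number of fitted parameters and the number of experimental points: more neurons interpolate the training set but generalize worse.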
Abstract:
Environmental sustainability has become one of the topics of greatest interest in industry, mainly because of effluent generation. Phenols are found in the effluents of many industries, such as refineries, coal processing, pharmaceutical, plastics, paint, and pulp and paper plants. Because phenolic compounds are toxic to humans and to aquatic organisms, Federal Resolution CONAMA No. 430 of May 13, 2011 limits the maximum phenol content for discharge into freshwater bodies to 0.5 mg/L. In effluent treatment, liquid-liquid extraction is the most economical process for phenol recovery because it consumes little energy; in most cases, however, it employs an organic solvent, whose high toxicity can itself cause environmental problems. Hence the need for new methodologies that replace these solvents with biodegradable ones. Studies in the literature demonstrate the feasibility of removing phenolic compounds from aqueous effluents with biodegradable solvents. In this kind of extraction, called cloud point extraction, a nonionic surfactant is used as the extracting agent for the phenolic compounds. In order to optimize the phenol extraction process, this work studies the mathematical modeling and optimization of the extraction parameters and investigates the effect of the independent variables on the process. A 3² full factorial design was carried out with operating temperature and surfactant concentration as independent variables, and with the extraction parameters (volumetric fraction of the coacervate phase, residual concentrations of surfactant and phenol in the dilute phase after phase separation, and phenol extraction efficiency) as dependent variables.
To achieve these objectives, the work was carried out in five steps: (i) selection of literature data; (ii) use of the Box-Behnken model to find mathematical models describing the phenol extraction process; (iii) data analysis with STATISTICA 7.0, using analysis of variance to assess model significance and predictive ability; (iv) model optimization using the response surface method; and (v) validation of the mathematical models with additional measurements, from samples different from those used to build the models. The results showed that the mathematical models found are able to calculate the effect of surfactant concentration and operating temperature on each extraction parameter studied, within the boundaries used. The model optimization yielded consistent and applicable results in a simple and quick way, leading to high efficiency in process operation.
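The response-surface step can be illustrated with a small sketch. For a 3² design in coded variables, the quadratic model y = b0 + b1·T + b2·C + b11·T² + b22·C² + b12·T·C is fitted by least squares; here the normal equations are solved with plain Gaussian elimination, and the "efficiencies" are synthetic values generated from a known polynomial (not the thesis data), so the fit should recover the generating coefficients.

```python
# Least-squares fit of a quadratic response surface over a 3^2 factorial
# design in coded levels (-1, 0, +1). Synthetic responses, for illustration.

def solve(A, b):
    """Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_quadratic(points, y):
    rows = [[1, T, C, T * T, C * C, T * C] for T, C in points]
    # normal equations: (X^T X) b = X^T y
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(6)] for i in range(6)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(6)]
    return solve(XtX, Xty)

design = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 0),
          (0, 1), (1, -1), (1, 0), (1, 1)]
# synthetic efficiencies generated from 80 + 5*T + 3*C - 2*T^2 (no noise)
y = [80 + 5 * T + 3 * C - 2 * T * T for T, C in design]
b = fit_quadratic(design, y)
```

Optimizing such a fitted surface (step iv) then amounts to finding the stationary point of the quadratic within the coded region, which is what the response surface method does.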
Abstract:
Studies show the great influence of free radicals and other oxidants as agents of aging and degenerative diseases. Natural phenolic compounds, on the other hand, have shown great potential as antioxidants, inhibiting lipid peroxidation and lipoxygenase in vitro. Among them, trans-resveratrol (3,5,4'-trihydroxystilbene) stands out, a phenolic compound of the stilbene class of polyphenols. The vegetable popularly known as "azedinha" (Rumex acetosa) contains trans-resveratrol in its composition; for this reason, the present work studied supercritical extraction and conventional extraction (Soxhlet and sequential) from the roots of Rumex acetosa, evaluating the efficiency of the extraction processes, the antioxidant activity, the total phenolic content, and the amount of trans-resveratrol in the extracts. Extractions using supercritical CO2 as solvent, with ethanol added as co-solvent, were conducted by the dynamic method in a fixed-bed extractor. The trials followed a 2³ factorial design with three replications at the central point, with process yield and trans-resveratrol concentration as response variables and pressure, temperature, and co-solvent concentration (% v/v) as independent variables. Yields (mass of dry extract / mass of raw material used) obtained by supercritical extraction ranged from 0.8 to 7.63%, the best result being obtained at 250 bar and 90 °C with 15% ethanol (% v/v) as co-solvent. At a CO2 flow rate of 1.0 ± 0.17 g/min, the calculated YCER was 0.0469 (g solute / g solvent). The mass yields of the conventional extractions varied between 0.78% (hexane) and 9.97% (ethanol). The statistical model generated from the trans-resveratrol concentration data proved significant and predictive at the 95% confidence level.
By HPLC (High Performance Liquid Chromatography) analysis, trans-resveratrol was quantified in all extracts; concentrations ranged between 0.0033 and 0.42 mg/g extract for the supercritical extracts and between 0.449 and 17.046 mg/g extract for the conventional extractions, Soxhlet extraction with ethanol therefore being more selective for trans-resveratrol than the supercritical fluid. Evaluation of the antioxidant activity of the supercritical extracts (by the 2,2-diphenyl-1-picrylhydrazyl, DPPH, radical-scavenging method) gave EC50 values (the concentration effective to neutralize 50% of the free radicals) between 7.89 and 18.43 mg/mL, while the Soxhlet extractions gave EC50 values in the range of 6.05 to 7.39 mg/mL. As for the quantification of phenolic compounds (Folin-Ciocalteau spectrophotometric method), the supercritical extracts gave values between 85.3 and 194.79 mg GAE/g extract, whereas the Soxhlet extracts gave values between 178.5 and 237.8 mg GAE/g extract. The high antioxidant activity cannot be attributed solely to the presence of phenolic compounds, but also to other antioxidants present in Rumex acetosa.
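The 2³ factorial design mentioned above can be summarized in a short sketch. A main effect is the average response at a factor's high level minus the average at its low level; the yields used here are invented for illustration and are not the thesis measurements.

```python
# Main-effect estimation for a 2^3 factorial design (pressure P,
# temperature T, co-solvent fraction E, coded as -1/+1). Invented yields.

# runs in standard order: (P, T, E)
runs = [(-1, -1, -1), (1, -1, -1), (-1, 1, -1), (1, 1, -1),
        (-1, -1, 1), (1, -1, 1), (-1, 1, 1), (1, 1, 1)]
yields = [0.8, 2.1, 1.5, 3.0, 3.9, 5.4, 5.1, 7.6]  # invented %

def main_effect(factor):
    """Average response at the high level minus at the low level."""
    hi = [y for r, y in zip(runs, yields) if r[factor] == 1]
    lo = [y for r, y in zip(runs, yields) if r[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {name: round(main_effect(i), 3)
           for i, name in enumerate(["pressure", "temperature", "ethanol"])}
```

With the three center-point replications of the actual design, the pure error of such effects can also be estimated, which is how significance at the 95% level is judged.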
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Abstract:
Venous wounds cause physical, psychological, and financial problems that impact patients' quality of life. Treatment alternatives are being investigated in order to reduce healthcare costs and improve the quality of life of people affected by this problem. Physical resources, such as therapeutic ultrasound (US), are being considered in the treatment of ulcers as potential healing agents. This study aimed to investigate the application of US as a treatment for venous ulcers. Subjects were divided into two groups: the US group, whose treatment consisted of 5 sessions of pulsed US (3 MHz, 1 W/cm²) combined with compression and kinesiotherapy; and the sham group, whose individuals went through the same procedures but with sham US therapy. Subjects were evaluated for wound size by planimetry and digital photography, for pain by a visual analogue scale, for quality of life by the SF-36 and VEINES-QoL/Sym questionnaires, and for the enzymatic activity of metalloproteinases 2 and 9 by zymography. A mean reduction in wound area of 41.58 ± 53.8% was observed in the US group and of 63.47 ± 37.2% in the sham group, with maintenance of quality-of-life scores in the US group and significant improvement (p < 0.05) in the sham group by the VEINES questionnaire. Decreased pain perception was observed in the sham group. The feasibility of samples collected by the swab method for analysis of the protein activity of metalloproteinases 2 and 9 by zymography was also confirmed. Our data did not provide evidence to support the theory that US accelerates the healing of venous ulcers in a short-term analysis. However, we observed that standard care combined with compression therapy and kinesiotherapy significantly shortened the progression of chronic venous ulcers.
Abstract:
The objective of this work is to identify, map, and explain the evolution of soil occupation and the environmental vulnerability of the Canto do Amaro and Alto da Pedra areas, in the municipality of Mossoró-RN, based on multitemporal analysis of orbital remote sensing images and on extensive field work integrated into a Geographic Information System (GIS). The use of spatial analysis techniques within the GIS, combined with the interpretation and analysis of Remote Sensing (RS) products, made it possible to achieve significant results toward the objectives of this work. Supporting the management of the information, the data set obtained from the most varied sources and stored in a digital environment constitutes the geographic database of this research. Prior knowledge of the spectral behavior of natural or artificial targets, together with the use of Digital Image Processing (DIP) algorithms, greatly facilitates the interpretation task and the search for new information at the spectral level. From these data, a varied thematic cartography was generated: maps of geology, geomorphological units, soils, vegetation, and soil use and occupation. Crossing these maps in the GIS environment generated the natural and environmental vulnerability maps of the Canto do Amaro and Alto da Pedra oil fields (RN), within a framework centered on the management of water and solid residues, as well as on spatial data analysis, thus enabling a more complex analysis of the studied area.
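The "crossing" of thematic maps described above can be viewed as map algebra on co-registered rasters. The toy sketch below is not the thesis workflow: each 3x3 grid holds vulnerability scores (1 = low, 3 = high) reclassified from a hypothetical thematic map, and the weights of the overlay are invented.

```python
# Weighted raster overlay (map algebra) of three thematic vulnerability
# layers on a common 3x3 grid. Scores and weights are invented.

geology =    [[1, 2, 2], [1, 3, 2], [2, 3, 3]]
soils =      [[2, 2, 1], [3, 3, 1], [1, 2, 3]]
vegetation = [[3, 1, 2], [2, 1, 1], [3, 3, 2]]

layers = [(geology, 0.4), (soils, 0.35), (vegetation, 0.25)]

rows, cols = 3, 3
vulnerability = [[sum(layer[i][j] * w for layer, w in layers)
                  for j in range(cols)] for i in range(rows)]
```

In a real GIS, the same cell-by-cell combination is performed over the full study area after all layers have been reprojected onto one grid, producing the final vulnerability map.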
Abstract:
Separation methods have limited application as a result of operational costs, low throughput, and the long time needed to separate the fluids. Nevertheless, these treatment methods are important because of the need to remove unwanted contaminants from the produced oil. The water, and the concentration of oil in that water, should be minimal (around 40 to 20 ppm) before it is discharged to the sea. Given the need for primary treatment, the objective of this project is to study and implement algorithms for closed-loop identification of polynomial NARX (Nonlinear Auto-Regressive with eXogenous input) models, to implement structural identification, and to compare control strategies, using PI control and on-line-updated NARX predictive models, on a three-phase separator in series with three hydrocyclone batteries. The main goals of this project are: to obtain an optimized phase-separation process that regulates the system even in the presence of oil slugs; to show that it is possible to obtain optimized controller tunings by analyzing the loop as a whole; and to evaluate and compare the PI and predictive control strategies applied to the process. To accomplish these goals, a simulator representing the three-phase separator and the hydrocyclones was used. Algorithms were developed for system identification (NARX) using RLS (Recursive Least Squares), along with methods for model structure detection. Predictive control algorithms were also implemented with the NARX model updated on-line, together with optimization algorithms using PSO (Particle Swarm Optimization). The project ends with a comparison of the results obtained with the PI and predictive controllers (both tuned through the particle swarm algorithm) on the simulated system.
We conclude that the optimizations performed make the system less sensitive to external perturbations and that, once optimized, the two controllers show similar results, with the predictive controller somewhat less sensitive to disturbances.
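The identification step above can be sketched in a few lines. This is a minimal illustration, not the thesis algorithms: RLS estimates the parameters of a small polynomial NARX model on a noiseless invented system, with a single cross-term as the only nonlinear regressor.

```python
# Recursive least squares (RLS) identifying the polynomial NARX model
#   y(k) = a*y(k-1) + b*u(k-1) + c*y(k-1)*u(k-1)
# from input-output data of an invented system (true a, b, c below).
import random

random.seed(1)
a_true, b_true, c_true = 0.7, 0.5, -0.3

theta = [0.0, 0.0, 0.0]                     # parameter estimates
P = [[1000.0 if i == j else 0.0 for j in range(3)] for i in range(3)]
lam = 1.0                                   # forgetting factor (1 = ordinary RLS)

y_prev, u_prev = 0.0, 0.0
for k in range(500):
    u = random.uniform(-1, 1)               # persistently exciting input
    y = a_true * y_prev + b_true * u_prev + c_true * y_prev * u_prev
    phi = [y_prev, u_prev, y_prev * u_prev]  # regressor vector
    # gain K = P*phi / (lam + phi' P phi)
    Pphi = [sum(P[i][j] * phi[j] for j in range(3)) for i in range(3)]
    denom = lam + sum(phi[i] * Pphi[i] for i in range(3))
    K = [v / denom for v in Pphi]
    err = y - sum(t * p for t, p in zip(theta, phi))  # prediction error
    theta = [t + K[i] * err for i, t in enumerate(theta)]
    # covariance update: P = (P - K phi' P) / lam
    P = [[(P[i][j] - K[i] * Pphi[j]) / lam for j in range(3)] for i in range(3)]
    y_prev, u_prev = y, u
```

Running the same update at every sampling instant is what "NARX model updated on-line" means in the closed-loop predictive control scheme; a forgetting factor below 1 lets the estimates track slow plant changes.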
Abstract:
This work presents the results of a study on the digital mapping of analogs of the fluvial oil reservoirs of the Açu Formation. After regional reconnaissance in the southern portion of the Potiguar Basin, an area of 150 km² west of the city of Assu was selected. In this area, outcrops were chosen for digital mapping, and from the field data and remote sensing the depositional architecture of the fluvial deposits was established; the system was named a coarse meandering fluvial system. Within the deposits, three fluvial cycles were individualized, separated by fifth-order bounding surfaces. These cycles are predominantly sandy, with fining-upward sequences ending in floodplain deposits. Within the sandy levels of the channel fills, smaller cycles were characterized, normally incomplete, constituted by braided sandy bodies and fourth-order bounding surfaces. In the mapped area, an outcrop with excellent exposure was chosen, where typical channel-fill deposits could be observed, and it was on this outcrop that the digital mapping was done. Diverse techniques and tools were used there, integrating sedimentological, altimetric (GPS, total station), LIDAR (Light Detection and Ranging), high-resolution digital photomosaic, and internal-geometry (Ground Penetrating Radar) data sets. GoCAD® software was used for data integration, interpretation, and visualization. The final product of the digital outcrop mapping was a photorealistic model of part of the cliff, since the reflectors expected in the radargrams were absent. A part of an oblique-accretion bar was modeled from a 200 x 200 m GPR grid in the alluvial plain of the Assu river, a probable recent analog. With the internal-geometry data, the three-dimensional sedimentary architecture was developed, in which it was possible to characterize sand-sheet deposits and several hierarchies of braided channels.
Finally, simulations of the sedimentary geometries and architectures of the Potiguar Basin fluvial reservoirs were run with the PetBool software, in order to assess the capacity of this program in simulations with large numbers of conditioning wells. In total, 45 simulations were acquired, in which the run time and the number of channels increase with the number of conditioning wells. Deformation of the meanders was detected when the dimensions of the simulated domain were changed, a problem caused by the relationship between the simulated domain and the width of the meanders.
Abstract:
This work presents the methodological procedures involved in the mesoscale digital imaging of a block of travertine rock of Quaternary age from the region of Acquasanta, in the Apennines, Italy. This rocky block, called the T-Block, was stored in the courtyard of the Laboratório Experimental de Petróleo "Kelsen Valente" (LabPetro) of the Universidade Estadual de Campinas (UNICAMP), so that scientific studies could be performed on it, mainly by university research groups and research centers working in the areas of reservoir characterization and 3D digital imaging in Brazil. The purpose of this work is the development of a Digital Solid Model through non-invasive techniques of 3D digital imaging of the internal and external surfaces of the T-Block. For the imaging of the external surfaces, LIDAR (Light Detection and Ranging) technology was used, and the imaging of the interior was done using Ground Penetrating Radar (GPR); in addition, profiles were obtained with a portable gamma-ray spectrometer. The goal of the 3D digital imaging was the identification and parameterization of geological surfaces and sedimentary facies that could represent depositional heterogeneities at mesoscale, based on the study of a rocky block with dimensions of approximately 1.60 m x 1.60 m x 2.70 m. The data acquired by the terrestrial laser scanner provided georeferenced spatial information of the block surface (X, Y, Z), the intensity values of the returned laser beam, and high-resolution RGB data (3 mm x 3 mm), totaling 28,505,106 acquired points. This information was used as an aid in the interpretation of the radargrams and is ready to be displayed in virtual reality rooms. With the GPR, 15 profiles of 2.3 m and two 3D grids were obtained, each grid with 24 horizontal sections of 1.3 m and 14 vertical sections of 2.3 m, using both the 900 MHz and the 2600 MHz antennas.
Finally, the use of GPR associated with the laser scanner enabled the identification and 3D mapping of 3 different radarfacies, which were correlated with the three sedimentary facies defined at the outset. The 6 gamma-ray profiles showed low-amplitude variation in the radioactivity values. This is likely due to the fact that the profiled sedimentary layers have the same mineralogical composition, being composed of carbonate sediments, with no clay in pelitic siliciclastic layers or other minerals carrying radioactive elements.
Abstract:
This document presents an intervention project that proposes changes in the organizational and operational structure of the Board of Trade of Rio Grande do Norte. It analyzes the routine flow of business registration activities and the organizational structure, and compares the current model with the proposed one. The work is divided into six chapters, ranging from the description of the institution and the object of the work to the proposed new organizational and operational model. The methodology used is literature review and observation of practice.
Abstract:
This work proposes the use of a behavioral model of the hysteresis loop of the ferroelectric capacitor as an alternative to the usually costly techniques for computing nonlinear functions in artificial neurons implemented on a reconfigurable hardware platform, in this case an FPGA device. Initially the proposal was validated by implementing Boolean logic through digital models of two artificial neurons, the Perceptron and a variation of the Integrate-and-Fire Spiking Neuron, both using the digital model of the ferroelectric capacitor's hysteresis loop as the basic nonlinear unit for calculating the neuron outputs. Finally, the analog model of the ferroelectric capacitor was used with the goal of verifying its effectiveness and the possible reduction of the number of logic elements needed when implementing the artificial neurons on an integrated circuit. The implementations were carried out as Simulink models, and synthesis was done with the DSP Builder software from Altera Corporation.
Abstract:
Information extraction is a frequent and relevant problem in digital signal processing. In the past few years, different methods have been used for the parameterization of signals and for obtaining efficient descriptors. When the signals possess cyclostationary statistical properties, the Cyclic Autocorrelation Function (CAF) and the Spectral Cyclic Density (SCD) can be used to extract second-order cyclostationary information. However, second-order cyclostationary information is poor for non-Gaussian signals, as the cyclostationary analysis in this case should comprise higher-order statistical information. This work proposes a new mathematical tool for higher-order cyclostationary analysis based on the correntropy function. Specifically, cyclostationary analysis is revisited from an information-theoretic point of view, and the Cyclic Correntropy Function (CCF) and the Cyclic Correntropy Spectral Density (CCSD) are defined. It is also proven analytically that the CCF contains information on second- and higher-order cyclostationary moments, being a generalization of the CAF. The performance of these new functions in the extraction of higher-order cyclostationary features is analyzed in a wireless communication system subject to non-Gaussian noise.
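The second-order baseline that the CCF generalizes can be sketched directly. The snippet estimates the CAF of a toy cosine, which is cyclostationary with a spectral line at the cyclic frequency alpha = 2·f0; the signal and frequencies are from this illustration, not from the paper's wireless experiments.

```python
# Estimate of the Cyclic Autocorrelation Function (CAF),
#   R_x^alpha(tau) = (1/N) * sum_t x(t) * x(t+tau) * exp(-j*2*pi*alpha*t),
# for a cosine of frequency f0, whose cyclic frequency is alpha = 2*f0.
import cmath, math

def caf(x, alpha, tau):
    N = len(x) - tau
    acc = sum(x[t] * x[t + tau] * cmath.exp(-2j * math.pi * alpha * t)
              for t in range(N))
    return acc / N

f0, N = 0.05, 1000
x = [math.cos(2 * math.pi * f0 * t) for t in range(N)]

peak = abs(caf(x, 2 * f0, 0))   # at the true cyclic frequency: magnitude 0.25
off = abs(caf(x, 0.075, 0))     # away from any cyclic frequency: near zero
```

The CCF proposed in the work replaces the product x(t)·x(t+tau) with a kernel-based correntropy term, so that the same alpha-scan also reveals higher-order cyclostationary structure that survives non-Gaussian noise.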