Abstract:
Emotion-based analysis has attracted considerable interest, particularly in areas such as forensics, medicine, music, psychology, and human-machine interfaces. Following this trend, facial analysis (either automatic or human-based) is the most commonly investigated subject, since this type of data can easily be collected and is well accepted in the literature as a metric for inferring emotional states. Despite this popularity, and owing to several constraints found in real-world scenarios (e.g. lighting, complex backgrounds, facial hair and so on), accurately obtaining affective information from the face in an automatic way is very challenging. This work presents a framework that aims to analyse emotional experiences through naturally generated facial expressions. Our main contribution is a new 4-dimensional model that describes emotional experiences in terms of appraisal, facial expressions, mood, and subjective experiences. In addition, we present an experiment using a new protocol proposed to obtain spontaneous emotional reactions. The results suggest that the initial emotional state described by the participants of the experiment was different from that described after exposure to the eliciting stimulus, showing that the stimuli used were capable of inducing the expected emotional states in most individuals. Moreover, our results point out that spontaneous facial reactions to emotions are very different from prototypic expressions, owing to the lack of expressiveness in the latter.
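Purely as an illustrative sketch (not the authors' implementation), the 4-dimensional description of an emotional experience could be represented as a simple record type; the field names, value types and thresholds below are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class EmotionalExperience:
    """Hypothetical record for the 4-dimensional model described above.

    Field names and value ranges are illustrative assumptions, not the
    authors' actual encoding.
    """
    appraisal: Dict[str, float]      # e.g. {"novelty": 0.7, "pleasantness": 0.2}
    facial_expressions: List[str]    # e.g. detected action units or expression labels
    mood: float                      # e.g. -1.0 (negative) .. +1.0 (positive)
    subjective_experience: str       # participant's self-reported description


# Example: comparing the self-reported state before and after the eliciting stimulus.
before = EmotionalExperience({"pleasantness": 0.6}, [], 0.4, "calm")
after = EmotionalExperience({"pleasantness": 0.1}, ["AU4", "AU15"], -0.3, "sad")
print(after.mood - before.mood)  # change in mood induced by the stimulus
```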
Abstract:
Digital games have been used as an aid for the transmission of knowledge, allowing faster dissemination of content. Using this strategy to foster the development of logical reasoning in basic education schoolchildren can be the motivating gear that helps the learning process in any area. In this context, many games can be created and made available for use by teachers and students. However, the complexity of building these games becomes an obstacle which can often prevent their construction. Thus, this paper presents a framework for creating games that teach programming logic, covering everything from their conception to their integration with a visual programming environment (Blockly) and scenarios created in HTML5.
Theoretical study of tetrahedral intermediates, acidity/basicity and enzymatic stereoselectivity
Abstract:
The present work aimed, first, at the theoretical study of the stability of tetrahedral intermediates formed in carbonyl addition reactions, using second- (MP2) and third-order (MP3) Møller–Plesset perturbation theory. Linear correlations between the electronic energy differences of the reactions and the Wiberg indexes and C-O bond lengths were obtained, and it was observed that the stability of the adducts formed depends directly on the electronic density shared between these atoms. Knowledge of the electronic parameters of these structures plays an important role, given the large number of reactions whose course involves such tetrahedral intermediates. Employing the ONIOM (B3LYP:AMBER) methodology, the stereoselectivity of an enzymatic reaction between the CAL B enzyme and a long-chain ester was evaluated. In this study, the electronic energies of the ground state and of the intermediate state of the rate-determining step of the transesterification were obtained for the two possible prochiral faces, Re and Si. The objective was to study the enantioselectivity of CAL B and rationalize it using the quantum theory of atoms in molecules (QTAIM). A theoretical study employing inorganic compounds was also performed using the ab initio CBS-QB3 method, aiming to find a link between thermodynamics and equilibria involving acids and bases. The results showed an excellent relationship between the difference in Gibbs free energy, ΔG, of the acid dissociation reaction and the ΔG of the hydrolysis reaction of the corresponding conjugate base. A relationship was also observed between the ΔG of the hydrolysis reaction of conjugate acids and their corresponding atomic radii, showing that stability plays an important role in hydrolysis reactions. The importance of solvation for acid/base behavior was also evaluated by comparing theoretical and experimental ΔG values.
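As a minimal illustration of how such a linear correlation can be quantified (not the authors' actual data or workflow), the sketch below fits electronic energy differences against Wiberg bond indexes with an ordinary least-squares line; the numeric arrays are placeholders.

```python
# Hypothetical sketch: correlating reaction electronic-energy differences
# with Wiberg C-O bond indexes via ordinary least squares.
# The numbers below are placeholders, not data from the thesis.
import numpy as np

wiberg_index = np.array([0.86, 0.91, 0.95, 1.02, 1.08])   # Wiberg C-O bond index
delta_e = np.array([-12.4, -15.1, -17.8, -21.0, -24.3])    # ΔE_elec (kcal/mol), placeholder

# Least-squares fit: ΔE ≈ a * index + b
a, b = np.polyfit(wiberg_index, delta_e, 1)
r = np.corrcoef(wiberg_index, delta_e)[0, 1]
print(f"slope = {a:.2f}, intercept = {b:.2f}, R = {r:.3f}")
```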
Abstract:
The sustainable use of waste from agribusiness is currently a focus of research, especially sugar cane bagasse (BCA), the lignocellulosic waste produced in the greatest volume by Brazilian agribusiness, whose residual biomass has been applied in the production of energy and bioproducts. In this work, high-purity pulp was produced from BCA by soda/anthraquinone pulping and subsequently converted to cellulose acetate. Commercial Avicel cellulose was used for comparison. Cellulose acetate was obtained by a homogeneous acetylation reaction, varying the reaction time in hours (8, 12, 16, 20 and 24) and the temperature in °C (25 and 50). FTIR spectra showed characteristic bands identical to those of cellulosic materials, demonstrating the efficiency of the separation by pulping. The cellulose acetate obtained was characterized by infrared spectroscopy (FTIR), X-ray diffraction (XRD), thermogravimetric analysis (TG/DTG/DSC), scanning electron microscopy (SEM) and determination of the degree of substitution (DS) to confirm the acetylation. The optimal reaction times for obtaining diacetates and triacetates, at both temperatures, were 20 and 24 h. The cellulose acetate produced from BCA presented DS values between 2.57 and 2.7 at 25 °C, and at 50 °C the DS values obtained were 2.66 and 2.84, indicating the effective conversion of BCA cellulose into di- and triacetates. For comparison, commercial Avicel cellulose showed DS values of 2.78 and 2.76 at 25 °C and of 2.77 and 2.75 at 50 °C, for reaction times of 20 h and 24 h, respectively. The best result was the cellulose acetate synthesized from BCA with DS 2.84 at 50 °C and 24 hours, classified as cellulose triacetate, a result superior to that obtained with commercial Avicel cellulose, demonstrating the conversion potential of cellulose derived from a low-cost lignocellulosic residue (BCA) and the prospects for commercial use of this cellulose acetate.
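As a hedged aside, the degree of substitution of cellulose acetate is commonly estimated from the acetyl content via a mass balance on the anhydroglucose unit; the small sketch below shows that standard relation (the acetyl percentage used is a placeholder, not a value from this work).

```python
def degree_of_substitution(acetyl_percent: float) -> float:
    """Estimate cellulose acetate DS from acetyl content (% m/m).

    Derived from the anhydroglucose-unit mass balance
    (162 g/mol per unit, +42 g/mol per acetyl substitution),
    which reduces to the usual DS = 3.86*A / (102.4 - A) form.
    """
    return 162.0 * acetyl_percent / (4300.0 - 42.0 * acetyl_percent)


# Placeholder value for illustration only (not data from this thesis):
print(round(degree_of_substitution(43.8), 2))  # ≈ 2.88, close to a triacetate
```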
Abstract:
The present work aims to show a possible relationship between the use of the History of Mathematics and of Information and Communication Technologies (ICT) in teaching Mathematics, through activities that use geometric constructions from the "Geometry of the Compass" (1797) by Lorenzo Mascheroni (1750-1800). To this end, a qualitative study was carried out, characterized by a historical exploration of bibliographic character followed by an empirical intervention based on the use of the History of Mathematics combined with ICT through Mathematical Investigation. Thus, papers dealing with the topic were studied, and a survey was carried out to highlight problems and/or episodes from the history of mathematics that can be solved with the help of ICT, allowing the production of a notebook of activities addressing the resolution of historical problems in a computational environment. In this search, we came across the geometry problems presented by Mascheroni in the aforementioned work, for which we propose solutions and investigations using the GeoGebra software. The research resulted in the elaboration of an educational product, a notebook of activities, structured so that, during its implementation, students can conduct historical and/or mathematical investigations; accordingly, we present the procedures for carrying out each construction, followed at some points by the original solution from the work. At the same time, we encourage students to investigate and reflect on their constructions (in GeoGebra), in addition to making comparisons with Mascheroni's solution. This notebook was applied to two classes of the Didactics of Mathematics I course (MAT0367) of the Mathematics degree program at UFRN in 2014. Aware of some unfavorable arguments regarding the use of the history of mathematics, such as loss of time, it was found that this factor can be mitigated with the aid of computational resources, because checks can be made using the dynamism of the software, without repeating the construction. It is noteworthy that the reduced time does not mean loss of reflection or maturation of ideas when the process of historical and/or mathematical investigation is adopted.
Abstract:
This thesis presents a certification method for semantic web service compositions which aims to statically ensure their functional correctness. The certification method encompasses two dimensions of verification, termed the base and functional dimensions. The base dimension concerns the verification of the correct application of the semantic web services in the composition, i.e., it ensures that each service invocation in the composition complies with its respective service definition. The certification of this dimension exploits the semantic compatibility between the invocation arguments and the formal parameters of the semantic web service. The functional dimension aims to ensure that the composition satisfies a given specification expressed in the form of preconditions and postconditions. This dimension is formalized by a calculus based on Hoare logic. Partial correctness specifications involving compositions of semantic web services can be derived from the proposed deductive system. Our work is also characterized by the use of a fragment of description logic, namely ALC, to express the partial correctness specifications. In order to operationalize the proposed certification method, we developed a supporting environment for defining semantic web service compositions as well as for conducting the certification process. The certification method was experimentally evaluated by applying it to three different proofs of concept, which enabled a broad evaluation of the method.
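For readers less familiar with Hoare-style calculi, a generic sequential-composition rule of the kind such a calculus builds on is shown below; it is a textbook illustration, not the thesis's actual rule. Here P, Q, R stand for assertions (in this setting, expressible in ALC) and S1, S2 for service invocations.

```latex
% Generic Hoare-logic sequence rule (textbook form, not the thesis's calculus):
% if {P} S1 {Q} and {Q} S2 {R} hold, then the composition S1; S2 satisfies {P} ... {R}.
\[
\frac{\{P\}\, S_1\, \{Q\} \qquad \{Q\}\, S_2\, \{R\}}
     {\{P\}\, S_1;\, S_2\, \{R\}}
\]
```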
Abstract:
Correct distance perception is important for executing various interactive tasks such as navigation, selection and manipulation. It is known, however, that, in general, there is a significant compression of distance perception in virtual environments, mainly when using Head-Mounted Displays (HMDs). This perceived distance compression may cause various problems for applications and even negatively affect the usefulness of those applications that depend on the correct judgment of distances. The scientific community, so far, has not been able to determine the causes of distance perception compression in virtual environments. For this reason, the objective of this work was to investigate, through experiments with users, the influence of both the field of view (FoV) and the distance estimation method on this perceived compression. To that end, an experimental comparison between the my3D device and an HMD was carried out with 32 participants, seeking information on the causes of the compressed perception. The results showed that the my3D has inferior capabilities when compared to the HMD, resulting in worse estimations, on average, with both of the tested estimation methods. The causes are believed to be the incorrect stimulation of the user's peripheral vision, the smaller FoV and the weaker sense of immersion, as described by the participants of the experiment.
Abstract:
Data visualization is widely used to facilitate the comprehension of information and to find relationships between data. One of the most widely used techniques for visualizing multivariate data (4 or more variables) is the 2D scatterplot. This technique associates each data item with a visual mark in the following way: two variables are mapped to Cartesian coordinates, so that a visual mark can be placed on the Cartesian plane; the other variables are mapped gradually to visual properties of the mark, such as size, color and shape, among others. As the number of variables to be visualized increases, the number of visual properties associated with the mark increases as well, and so does the complexity of the final visualization. However, increasing the complexity of the visualization does not necessarily imply a better visualization; sometimes it has the opposite effect, producing a visually polluted and confusing visualization, a problem known as visual property overload. This work investigates whether it is possible to work around the overload of the visual channel and improve insight into multivariate data through a modification of the 2D scatterplot technique. In this modification, we map the variables of the data items to multisensory marks, composed not only of visual properties but also of haptic properties, such as vibration, viscosity and elastic resistance. We believed that this approach could ease the insight process by transposing properties from the visual channel to the haptic channel. The hypothesis was verified through experiments in which we analyzed (a) the accuracy of the answers, (b) the response time, and (c) the degree of personal satisfaction with the proposed approach. However, the hypothesis was not validated. The results suggest that there is an equivalence between the investigated visual and haptic properties in all analyzed aspects, though in strictly numeric terms the multisensory visualization achieved better results in response time and personal satisfaction.
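As a minimal sketch of the baseline technique described above (not the authors' tool), a 2D scatterplot can encode four variables by mapping two of them to Cartesian coordinates and the other two to the color and size of the marks; the data below are random placeholders.

```python
# Minimal 2D scatterplot encoding four variables (sketch with random placeholder data).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
n = 100
x, y = rng.random(n), rng.random(n)   # variables 1 and 2 -> Cartesian position
v3 = rng.random(n)                    # variable 3 -> color of the mark
v4 = rng.random(n)                    # variable 4 -> size of the mark

plt.scatter(x, y, c=v3, s=20 + 180 * v4, cmap="viridis", alpha=0.8)
plt.colorbar(label="variable 3")
plt.xlabel("variable 1")
plt.ylabel("variable 2")
plt.title("2D scatterplot: position, color and size encode four variables")
plt.show()
```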
Abstract:
Through the numerous technological advances of recent years, along with the popularization of computing devices, society is moving towards an "always connected" paradigm. Computer networks are everywhere, and the advent of IPv6 paves the way for the explosion of the Internet of Things. This concept enables the sharing of data between computing machines and everyday objects. One of the areas covered by the Internet of Things is that of vehicular networks. However, the information generated individually by a vehicle is not large in volume and, in isolation, does not contribute to an improvement in traffic. This proposal presents the Infostructure, a system intended to facilitate and reduce the cost of developing high-level semantic, context-aware applications for the Internet of Things scenario, by allowing data to be managed, stored and combined in order to generate broader context. To this end, we present a reference architecture, which aims to show the major components of the Infostructure. Next, a prototype is presented, used to validate that our work reaches the desired level of high-level semantic contextualization, as well as a performance evaluation, which aims to assess the behavior of the subsystem responsible for managing contextual information under a large amount of data. A statistical analysis is then performed on the results obtained in the evaluation. Finally, we present the conclusions of the work, some open problems, such as the lack of assurance regarding the integrity of the sensory data arriving at the Infostructure, and future work that considers the implementation of other modules so that tests can be conducted in real environments.
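Purely as an illustrative assumption (the Infostructure's actual interfaces are not described here), "combining data to generate broader context" could look like aggregating individual vehicle readings into a higher-level statement about a road segment, as in the sketch below.

```python
# Hypothetical sketch: deriving broader context from individual vehicle readings.
# Names, fields and thresholds are illustrative assumptions, not the Infostructure API.
from statistics import mean

# Raw, low-level readings reported by individual vehicles on the same road segment.
readings = [
    {"vehicle_id": "v1", "segment": "BR-101-km42", "speed_kmh": 18},
    {"vehicle_id": "v2", "segment": "BR-101-km42", "speed_kmh": 22},
    {"vehicle_id": "v3", "segment": "BR-101-km42", "speed_kmh": 15},
]

avg_speed = mean(r["speed_kmh"] for r in readings)

# Higher-level semantic context derived from the combined readings.
context = {
    "segment": "BR-101-km42",
    "average_speed_kmh": avg_speed,
    "congested": avg_speed < 30,  # illustrative threshold
}
print(context)
```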
Abstract:
Cloud computing is a paradigm that enables access, in a simple and pervasive way, through the network, to shared and configurable computing resources. Such resources can be offered on demand to users in a pay-per-use model. With the advance of this paradigm, a single service offered by a cloud platform might not be enough to meet all the requirements of clients; hence, it becomes necessary to compose services provided by different cloud platforms. However, current cloud platforms are not implemented using common standards; each one has its own APIs and development tools, which is a barrier to composing different services. In this context, Cloud Integrator, a service-oriented middleware platform, provides an environment to facilitate the development and execution of multi-cloud applications. The applications are compositions of services from different cloud platforms, represented by abstract workflows. However, Cloud Integrator has some limitations: (i) applications are executed locally; (ii) users cannot specify the application in terms of its inputs and outputs; and (iii) experienced users cannot directly determine the concrete Web services that will perform the workflow. In order to deal with such limitations, this work proposes Cloud Stratus, a middleware platform that extends Cloud Integrator and offers different ways to specify an application: as an abstract workflow or as a complete/partial execution flow. The platform enables application deployment on cloud virtual machines, so that several users can access it through the Internet. It also supports the access and management of virtual machines on different cloud platforms and provides mechanisms for service monitoring and the assessment of QoS parameters. Cloud Stratus was validated through a case study consisting of an application that uses different services provided by different cloud platforms, and it was also evaluated through computational experiments that analyze the performance of its processes.
Abstract:
Digital image segmentation is the process of assigning distinct labels to different objects in a digital image, and the fuzzy segmentation algorithm has been used successfully to segment images from several modalities. However, the traditional fuzzy segmentation algorithm fails to segment objects that are characterized by textures whose patterns cannot be successfully described by simple statistics computed over a very restricted area. In this work we present an extension of the fuzzy segmentation algorithm that achieves the segmentation of textures by employing adaptive affinity functions, and we also extend the algorithm to three-dimensional images. The adaptive affinity functions change the size of the area over which they compute the texture descriptors according to the characteristics of the texture being processed, while the three-dimensional images can be described as a finite set of two-dimensional images. The algorithm then segments the volume image with an appropriate calculation area for each texture, making it possible to produce good estimates of the actual volumes of the target structures of the segmentation process. We will perform experiments with synthetic and real data in applications such as the segmentation of medical images obtained from magnetic resonance.
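As a heavily simplified, hypothetical sketch (the thesis's actual affinity functions and texture descriptors are not reproduced here), an "adaptive" affinity could grow its local window until the texture statistics it computes stabilize, and then score how similar two pixel neighborhoods are:

```python
# Hypothetical sketch of an adaptive affinity: the window around a pixel grows
# until the local texture statistic (here, the standard deviation) stabilizes.
# This illustrates the idea only; it is not the algorithm used in the thesis.
import numpy as np


def local_window(image: np.ndarray, y: int, x: int, radius: int) -> np.ndarray:
    return image[max(0, y - radius): y + radius + 1,
                 max(0, x - radius): x + radius + 1]


def adaptive_descriptor(image: np.ndarray, y: int, x: int,
                        max_radius: int = 8, tol: float = 1e-2) -> tuple:
    """Grow the window until the mean/std descriptor stops changing significantly."""
    prev = None
    for radius in range(1, max_radius + 1):
        w = local_window(image, y, x, radius)
        desc = (float(w.mean()), float(w.std()))
        if prev is not None and abs(desc[1] - prev[1]) < tol:
            return desc
        prev = desc
    return prev


def affinity(image: np.ndarray, p: tuple, q: tuple) -> float:
    """Affinity in [0, 1]: high when the two local textures look alike."""
    dp, dq = adaptive_descriptor(image, *p), adaptive_descriptor(image, *q)
    dist = np.hypot(dp[0] - dq[0], dp[1] - dq[1])
    return float(np.exp(-dist))


# Tiny usage example on a random placeholder image.
img = np.random.default_rng(0).random((64, 64))
print(affinity(img, (10, 10), (40, 40)))
```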
Abstract:
Compatibility testing between a drilling fluid and a cement slurry is one of the steps carried out before an oil well cementing operation. This test allows us to evaluate the main effects that the contamination between these two fluids may cause on the technological properties of a cement paste. The interactions between the cement paste and the drilling fluid, because of their different chemical compositions, may affect the cement hydration reactions, damaging the cementing operation. Thus, we carried out a study of the compatibility between a non-aqueous drilling fluid and a cement slurry containing additives. The preparation procedures for the non-aqueous drilling fluid and the cement paste, and the execution of the compatibility test, were performed as set out in the oil industry standards. In the compatibility test, the rheological properties, thickening time, stability and compressive strength of the cement pastes are evaluated. We also carried out scanning electron microscopy and X-ray diffraction analyses of the mixtures obtained in the compatibility test to determine the microstructural changes in the cement pastes. The compatibility test showed no visual changes in the properties of the cement paste, such as phase separation. However, after the addition of the non-aqueous drilling fluid to the cement slurry, there was an increase in plastic viscosity, yield point and gel strength. The major causative factors include the chemical reaction of components present in the non-aqueous drilling fluid, such as the primary emulsifier, the wetting agent and the paraffin oil, with the chemical constituents of the cement. There was a reduction in the compressive strength of the cement paste after mixing with this drilling fluid. The thickening test showed that the oil-wetting agent and the high salinity of the non-aqueous fluid have an accelerating effect on the thickening time of the cement paste. The stability of the cement paste is impaired as the contamination of the cement slurry with the non-aqueous fluid increases. X-ray diffraction identified the formation of portlandite and calcium silicate in the contaminated samples. Scanning electron microscopy confirmed the development of the structures identified by X-ray diffraction and also revealed the presence of voids in the cured cement paste. These voids, formed by the emulsion of the drilling fluid stabilized within the cement paste, help explain the reduction in mechanical strength. The oil-wetting agent, a component of the non-aqueous drilling fluid, modified the cement hydration processes, mainly affecting the setting time.
Abstract:
Produced water is a major problem associated with crude oil extraction. The monitoring of metal levels in this waste is constant and requires the use of sensitive analytical techniques; however, the determination of trace elements often requires a preconcentration step. The objective of this study was to develop a simple and rapid analytical method for extraction and preconcentration, based on the cloud point extraction phenomenon, for the determination of Cd, Pb and Tl in produced water samples by high-resolution continuum source graphite furnace atomic absorption spectrometry. A Box-Behnken design was used to obtain the optimal extraction condition for the analytes. The factors evaluated were: concentration of the complexing agent (ammonium O,O-diethyldithiophosphate, DDTP), concentration of hydrochloric acid and concentration of the surfactant (Triton X-114). The optimal extraction conditions obtained were: 0.6% m v-1 DDTP, 0.3 mol L-1 HCl and 0.2% m v-1 Triton X-114 for Pb; and 0.7% m v-1 DDTP, 0.8 mol L-1 HCl and 0.2% m v-1 Triton X-114 for Cd. For Tl, the best extraction condition occurred without DDTP, the conditions being 1.0 mol L-1 HCl and 1.0% m v-1 Triton X-114. The limits of detection of the proposed method were 0.005 µg L-1, 0.03 µg L-1 and 0.09 µg L-1 for Cd, Pb and Tl, respectively, and the enrichment factors were greater than 10. The method was applied to produced water from the Potiguar basin; addition and recovery tests were performed, with values between 81% and 120%. Precision, expressed as relative standard deviation (RSD), was below 5%.
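For illustration of the experimental design mentioned above (not the thesis's actual run order or factor levels), a three-factor Box-Behnken design combines the ±1 levels of each pair of factors while the third factor is held at its center point, plus center runs; the sketch below builds that coded matrix.

```python
# Sketch: coded design matrix of a 3-factor Box-Behnken design
# (DDTP, HCl and Triton X-114 concentrations as the three factors).
# Coded levels only; the real concentration levels are not reproduced here.
from itertools import combinations, product

factors = ["DDTP", "HCl", "Triton X-114"]
n = len(factors)
runs = []

# For each pair of factors, take all (+/-1) combinations with the third factor at 0.
for i, j in combinations(range(n), 2):
    for a, b in product((-1, 1), repeat=2):
        row = [0] * n
        row[i], row[j] = a, b
        runs.append(row)

runs.extend([[0] * n] * 3)  # three center points

print(factors)
for row in runs:
    print(row)  # 12 edge runs + 3 center points = 15 runs
```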
Abstract:
In this study we evaluated the PAH removal capacity, from an oily solution, of bentonite hydrophobized with linseed oil or with paraffin, compared with natural bentonite. The natural and hydrophobized bentonites were analyzed by the following characterization techniques: (1) thermogravimetric analysis (TGA), which aimed to evaluate the thermal events associated with mass loss, both due to the release of moisture and decomposition of the clay and due to the loss of the hydrophobizing agent; (2) X-ray diffraction (XRD), in order to determine the mineralogical phases that make up the structure of the clay; and (3) infrared spectroscopy, used to characterize the functional groups of both the mineral matrix (bentonite) and the hydrophobizing agents (linseed oil and paraffin). We used a 2⁴ factorial design with the following factors: hydrophobizing agent, percentage of hydrophobizing agent, adsorption time and volume of the oily solution. Analyzing the 2⁴ factorial design, it was seen that none of the factors was apparently more important than the others and, as all responses showed significant values for the oil removal capacity, it was not possible to identify a difference in efficiency between the two hydrophobizing agents. A new study therefore compared the efficiency of the modified clay, with each hydrophobizing agent separately, against the clay in its natural form. To this end, four new 2³ factorial designs were set up using natural bentonite as a differentiating factor. The factors used were bentonite (with and without hydrophobization), exposure time of the adsorbent material to the oily solution and volume of the oily solution, in an attempt to interpret how these factors could influence the process of purifying water contaminated with PAHs. Fluorescence spectroscopy was employed to obtain the responses, since it is known from the literature that PAHs, owing to their condensed aromatic rings, fluoresce in a very similar way when excited in the ultraviolet region; gas chromatography/mass spectrometry (GC-MS) was used as an auxiliary technique for the analysis of the PAHs, complementing the fluorescence study, since the spectroscopic method only gives an idea of the total amount of fluorescent species contained in the oily solution. The results show excellent adsorption of PAHs and other fluorescent species, attributed mainly to the hydrophobization factor: 93% in the sixth run (+-+) of the first 2³ design (BNTL 5%); 94.5% in the fourth run (++-) of the second 2³ design (BNTL 10%); 91% in the second run (+--) of the third 2³ design (BNTP 5%); and 88% in the last run (+++) of the fourth and final 2³ design (BNTP 10%), all compared with adsorption on bentonite in its natural form. This work also shows the maximum adsorption for each hydrophobizing agent.
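As an illustrative sketch of how main effects are read off a two-level factorial design like the 2³ designs above (the response values used are placeholders, not the thesis data), each factor's main effect is the difference between the mean response at its high (+) and low (-) levels:

```python
# Sketch: main effects from a two-level (2^3) factorial design.
# Coded runs; the response values are placeholders only, not data from this work.
from itertools import product

factors = ["bentonite (hydrophobized?)", "exposure time", "solution volume"]
runs = list(product((-1, 1), repeat=3))       # 8 coded runs: (-1,-1,-1) ... (1,1,1)
responses = [35, 60, 40, 66, 38, 93, 45, 88]  # placeholder % removal values

for k, name in enumerate(factors):
    high = [r for run, r in zip(runs, responses) if run[k] == 1]
    low = [r for run, r in zip(runs, responses) if run[k] == -1]
    effect = sum(high) / len(high) - sum(low) / len(low)
    print(f"main effect of {name}: {effect:+.1f} percentage points")
```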
Abstract:
The preparation of nanostructured materials using natural clays as supports has been studied in the literature, since these clays are found in nature and, consequently, have a low price. Generally, clays serve as supports for metal oxides, increasing the number of active sites present on the surface, and can be applied for various purposes such as adsorption, catalysis and photocatalysis. Among the materials currently in evidence are niobium compounds, in particular its oxides, owing to characteristics such as high acidity, rigidity, water insolubility, and oxidative and photocatalytic properties. In this scenario, this study aimed to prepare a composite material of niobium oxyhydroxide (NbO2OH) and sodium vermiculite clay and to evaluate its effectiveness with respect to the natural clay (V0) and to NbO2OH. The composite was prepared by the precipitation-deposition method and then characterized by X-ray diffraction (XRD), infrared spectroscopy (FTIR), energy-dispersive X-ray spectroscopy (EDS), thermal analysis (TG/DTG), scanning electron microscopy (SEM), N2 adsorption-desorption and investigation of the charge distribution. The application of the NbO2OH/V0 material was divided into two steps: first, oxidation and adsorption methods, and second, photocatalytic activity under solar irradiation. The adsorption, oxidation and photocatalytic oxidation studies monitored the percentage of color removal of the methylene blue dye (MB) by UV-Vis spectroscopy. XRD showed a decrease in the d(001) reflection of the clay after modification; FTIR indicated the presence of both the clay and the niobium oxyhydroxide, with bands at 1003 cm-1 related to Si-O stretching and at 800 cm-1 related to Nb-O stretching. The presence of niobium was also confirmed by EDS, which indicated about 17% of the metal by mass. Thermal analysis showed thermal stability of the composite at 217 °C, and the micrographs showed a decrease in particle size. The investigation of the surface charge of NbO2OH/V0 showed that the material exhibits a heterogeneous surface with low to high negative charges. Adsorption tests showed that the NbO2OH/V0 composite had the higher adsorption capacity, removing 56% of MB, while V0 removed only 13% and NbO2OH showed no adsorptive capacity, due to the formation of H-aggregates. The percentage of dye color removal in the oxidation tests differed little from that of adsorption, with 18% and 66% removal for V0 and NbO2OH/V0, respectively. The NbO2OH/V0 material shows excellent photocatalytic activity, removing 95.5% of the MB color in 180 minutes, compared with 41.4% for V0 and 82.2% for NbO2OH, demonstrating the formation of a new composite with properties distinct from those of its precursors.
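As a small illustrative sketch (the absorbance values below are placeholders, not measurements from this work), the percentage of color removal monitored by UV-Vis is typically computed from the initial and final absorbance of the dye solution at its absorption maximum:

```python
def color_removal_percent(abs_initial: float, abs_final: float) -> float:
    """Percent color removal from UV-Vis absorbance readings
    (e.g. methylene blue monitored near its absorption maximum, ~664 nm)."""
    return 100.0 * (abs_initial - abs_final) / abs_initial


# Placeholder absorbance values for illustration only:
print(round(color_removal_percent(1.20, 0.054), 1))  # 95.5 % removal
```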