1000 results for Exact and Earth Sciences
Abstract:
The sustainable use of agro-industrial waste is currently a focus of research, especially sugarcane bagasse (BCA), the lignocellulosic residue produced in the largest volume by Brazilian agribusiness; this residual biomass has been applied to the production of energy and bioproducts. In this work, high-purity cellulose pulp was produced from BCA by soda/anthraquinone pulping and subsequently converted into cellulose acetate. Commercial Avicel cellulose was used for comparison. Cellulose acetate was obtained by homogeneous acetylation, varying the reaction time (8, 12, 16, 20 and 24 h) and the temperature (25 and 50 °C). FTIR spectra showed bands characteristic of cellulosic materials, demonstrating the efficiency of the separation achieved by pulping. The cellulose acetate obtained was characterized by infrared spectroscopy (FTIR), X-ray diffraction (XRD), thermal analysis (TG/DTG/DSC), scanning electron microscopy (SEM) and determination of the degree of substitution (DS) to confirm the acetylation. The optimal reaction times for obtaining di- and triacetates, at both temperatures, were 20 and 24 h. Cellulose acetate produced from BCA presented DS between 2.57 and 2.70 at 25 °C, and DS of 2.66 and 2.84 at 50 °C, indicating the effective conversion of BCA cellulose into di- and triacetates. For comparison, commercial Avicel cellulose showed DS of 2.78 and 2.76 at 25 °C and 2.77 and 2.75 at 50 °C, for reaction times of 20 h and 24 h, respectively. The best result was the cellulose acetate synthesized from BCA with DS 2.84 at 50 °C and 24 h, classified as cellulose triacetate, which surpassed the result obtained with commercial Avicel cellulose, demonstrating the potential of converting cellulose derived from a low-cost lignocellulosic residue (BCA) into cellulose acetate with prospects of commercial use.
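The abstract does not describe how the DS was determined; if it was obtained from the acetyl content of the product (a common route), the usual relation is the one below, quoted here as general background rather than from the thesis:

```latex
% Common DS / acetyl-content relation (assumed, not quoted from the thesis):
% 43 g/mol is the acetyl group, 162 g/mol the anhydroglucose unit.
\[
\%\,\mathrm{acetyl} \;=\; \frac{43\,DS}{162 + 42\,DS}\times 100
\qquad\Longleftrightarrow\qquad
DS \;=\; \frac{162 \times \%\,\mathrm{acetyl}}{4300 - 42 \times \%\,\mathrm{acetyl}}
\]
```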
Abstract:
The present work aims to show a possible relationship between the use of the History of Mathematics and Information and Communication Technologies (ICT) in the teaching of mathematics, through activities based on geometric constructions from the "Geometry of the Compass" (1797) by Lorenzo Mascheroni (1750-1800). To this end, a qualitative study was carried out, characterized by a historical, bibliographical exploration followed by an empirical intervention based on the use of the History of Mathematics combined with ICT through Mathematical Investigation. Studies of papers dealing with the topic were performed, as well as a survey to identify problems and/or episodes from the history of mathematics that can be solved with the help of ICT, allowing the production of a notebook of activities addressing the resolution of historical problems in a computational environment. In this search we selected geometry problems stated by Mascheroni in the aforementioned work, for which we propose solutions and investigations using the GeoGebra software. The research resulted in the elaboration of an educational product, a notebook of activities, structured so that during its implementation students can conduct historical and/or mathematical investigations; we therefore present the procedures for carrying out each construction, followed at some points by the original solution from the work. At the same time, students are encouraged to investigate and reflect on each construction (in GeoGebra), in addition to comparing it with Mascheroni's solution. This notebook was applied to two classes of the course Didactics of Mathematics I (MAT0367) of the Mathematics degree program at UFRN in 2014. Given the existence of some unfavorable arguments regarding the use of the history of mathematics, such as loss of time, it was found that this factor can be mitigated with the aid of computational resources, since verifications can be made using only the dynamism of the software, without repeating the construction. It is noteworthy that the reduced time does not imply loss of reflection or maturation of ideas when the process of historical and/or mathematical investigation is adopted.
Abstract:
Microseisms are continuous vibrations pervasively recorded in the millihertz to 1 Hz frequency range. These vibrations are mostly composed of Rayleigh waves and are strongest in the 0.04 to 1 Hz band. Their precise source mechanisms are still a matter of debate, but it is agreed that they are related to atmospheric perturbations and ocean gravity waves. The Saint Peter and Saint Paul Archipelago (SPSPA) is located in the equatorial region of the Atlantic Ocean, about 1,100 km from the Brazilian northeastern coast. The SPSPA consists of a set of small rocky formations with a total area of approximately 17,000 m². Owing to its distance from the continent and the absence of cultural noise, it is a unique location for measuring microseismic noise and investigating its relation to climatic and oceanographic variables. At the SPSPA we recorded both primary microseisms (PM), at 0.04 – 0.12 Hz, and secondary microseisms (SM), at 0.12 – 0.4 Hz, during 10 months in 2012 and 2013. Our analysis indicates a good correlation between the microseismic noise in the region and the seasons, in particular the northern hemisphere winter. We have also shown that most of the PM energy is generated at the SPSPA itself, while the SM source location depends on the seasonal climatic and oceanographic variables of the northern hemisphere.
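As a rough illustration of how band-limited PM/SM noise levels can be extracted from a continuous vertical-component record (not the processing chain used in the thesis; the file name, sampling rate and window length are assumptions, only the band limits come from the abstract):

```python
# Sketch: band-limited microseism power from a vertical-component record.
import numpy as np
from scipy.signal import welch

fs = 20.0                                  # assumed sampling rate (Hz)
trace = np.loadtxt("spspa_vertical.txt")   # hypothetical ground-velocity series

# Power spectral density via Welch's method (long windows resolve 0.04 Hz).
freqs, psd = welch(trace, fs=fs, nperseg=int(4096 * fs))

def band_power(f, p, fmin, fmax):
    """Integrate the PSD between fmin and fmax (trapezoidal rule)."""
    mask = (f >= fmin) & (f <= fmax)
    return np.trapz(p[mask], f[mask])

pm_power = band_power(freqs, psd, 0.04, 0.12)   # primary microseisms
sm_power = band_power(freqs, psd, 0.12, 0.40)   # secondary microseisms
print(f"PM band power: {pm_power:.3e}, SM band power: {sm_power:.3e}")
```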
Abstract:
One of the main problems related to the use of diesel as a fuel is the presence of sulfur (S), which causes environmental pollution and engine corrosion. In order to minimize the consequences of the release of this pollutant, Brazilian law established maximum sulfur contents for diesel fuel. To meet these requirements, diesel with a maximum sulfur concentration of 10 mg/kg (S10) has been widely marketed in the country. However, the reduction of sulfur can lead to changes in the physicochemical properties of the fuel, which are essential for the performance of road vehicles. This work aims to identify the main changes in the physicochemical properties of diesel fuel and how they are related to the reduction of the sulfur content. Samples of diesel types S10, S500 and S1800 were tested according to the methods of the American Society for Testing and Materials (ASTM). The fuels were also characterized by thermogravimetric analysis (TG) and subjected to physical distillation (ASTM D86) and simulated distillation by gas chromatography (ASTM D2887). The results showed that the reduction of sulfur made the fuel lighter and more fluid, better suited to low-temperature environments and safer for transportation and storage. The simulated distillation data showed that decreasing the sulfur content resulted in higher initial boiling points and lower boiling temperatures for the medium and heavy fractions. Thermogravimetric analysis showed a mass-loss event attributed to the volatilization or distillation of light and medium hydrocarbons. Based on these data, the kinetic behavior of the samples was investigated, and it was observed that the activation energies (Ea) did not change significantly over the conversion range. Considering the average of these energies, S1800 had the highest Ea throughout the conversion and S10 the lowest values.
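The abstract does not name the kinetic method used to obtain Ea from the TG data; a minimal model-free (Ozawa-Flynn-Wall) sketch, assuming TG runs at several heating rates and hypothetical file names, could look like this:

```python
# Sketch: model-free (Ozawa-Flynn-Wall) estimate of the activation energy Ea
# from TG curves recorded at several heating rates. Illustrative assumptions only;
# conversion is assumed to increase monotonically with temperature in each run.
import numpy as np

R = 8.314  # J/(mol K)

def temperature_at_conversion(temps, masses, alpha):
    """Temperature at which conversion alpha is reached in one TG run."""
    conv = (masses[0] - masses) / (masses[0] - masses[-1])
    return np.interp(alpha, conv, temps)

# Hypothetical runs: (heating rate in K/min, temperature array in K, mass array in mg)
runs = [(5.0, *np.loadtxt("tg_5K.txt", unpack=True)),
        (10.0, *np.loadtxt("tg_10K.txt", unpack=True)),
        (20.0, *np.loadtxt("tg_20K.txt", unpack=True))]

for alpha in (0.2, 0.5, 0.8):
    betas = np.array([beta for beta, _, _ in runs])
    T_alpha = np.array([temperature_at_conversion(T, m, alpha) for _, T, m in runs])
    # OFW: ln(beta) = const - 1.052 * Ea / (R * T); slope of ln(beta) vs 1/T gives Ea.
    slope, _ = np.polyfit(1.0 / T_alpha, np.log(betas), 1)
    Ea = -slope * R / 1.052
    print(f"alpha = {alpha:.1f}: Ea = {Ea / 1000:.1f} kJ/mol")
```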
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico - CNPq
Abstract:
Instrumentation is a tool of fundamental importance for research in several areas of human knowledge. Research projects often become unfeasible when data cannot be obtained due to the lack of instruments, especially because of importing difficulties and the high costs involved. Thus, in order to contribute to the development of national technology, a multiband hand-held sun photometer (FSM-4) was developed to operate in the 500 nm, 670 nm, 870 nm and 940 nm bands. The 500 nm, 670 nm and 870 nm bands are used to monitor aerosols and evaluate the AOD (Aerosol Optical Depth), while the PWC (Precipitable Water Column) is evaluated in the 940 nm band. In developing the mechanical and electronic parts of the FSM-4, the materials and components had to combine low cost with quality of the collected data. The calibration process used the Langley method (LM) and the Modified Langley Method (MLM). These methods are usually applied at high altitudes, which provide atmospheric optical stability; this condition, however, can also be found at low-altitude sites, as shown by Liu et al. (2010). Thus, for the calibration of the FSM-4, we investigated the atmospheric optical stability using the LM and MLM at a site in the city of Caicó/RN, located in the semiarid region of northeastern Brazil. This site lies in a region far away from large urban centers and from activities generating anthropogenic atmospheric pollution. Data for the calibration of the prototype were collected with the FSM-4 in two separate campaigns during the dry season, one in December 2012 and another in September 2013. The methodology revealed atmospheric optical instability in the studied region, shown by the dispersion of the values obtained for the calibration constant. This dispersion is affected by the variability of AOD and PWC during the application of the above-mentioned methods. As an alternative to the described sun photometer calibration, a short study was performed using the worldwide sun photometer network AERONET/NASA (AERosol RObotic NETwork, US space agency), with an instrument installed in Petrolina/PE, Brazil. Data were collected for three days using the AERONET instrument and the FSM-4 operating simultaneously at the same site. With the LM and MLM techniques, convergent values were obtained for the calibration constants, despite the small amount of data collected. This calibration transfer methodology proved to be a viable alternative for the FSM-4 calibration.
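As a rough sketch of the Langley-type calibration mentioned above (illustrative values, air-mass formula and file name; not the FSM-4 acquisition or processing code):

```python
# Sketch of a Langley-method calibration: extrapolating the instrument signal
# to zero air mass to obtain the calibration constant V0.
import numpy as np

# Hypothetical morning data: solar zenith angle (degrees) and raw signal (counts)
zenith_deg, signal = np.loadtxt("langley_caico.txt", unpack=True)

# Simple plane-parallel relative air mass; real work would use a refined formula
# and restrict the fit to an optically stable period.
airmass = 1.0 / np.cos(np.radians(zenith_deg))

# Beer-Lambert: V = V0 * exp(-m * tau)  =>  ln V = ln V0 - tau * m
slope, intercept = np.polyfit(airmass, np.log(signal), 1)

V0 = np.exp(intercept)   # calibration constant (signal at the top of the atmosphere)
tau = -slope             # mean total optical depth during the Langley run
print(f"V0 = {V0:.1f} counts, total optical depth = {tau:.3f}")
```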
Abstract:
The soil heat flux and the soil thermal diffusivity are important components of the surface energy balance, especially in arid and semi-arid regions. The objective of this work was to estimate the soil heat flux from the soil temperature measured at a single depth, based on the half-order time-derivative method proposed by Wang and Bras (1999), and to establish a method, also based on the half-order derivative, capable of estimating the soil thermal diffusivity from time series of soil temperature at two depths. The estimated soil heat flux was compared with values measured with flux plates, and the estimated thermal diffusivity was compared with in situ measurements. The results showed excellent agreement between the estimated and measured soil heat flux, with correlation coefficient (r), coefficient of determination (R²) and standard error of: r = 0.99093, R² = 0.98194 and error = 2.56 W/m² for an estimation period of 10 days; r = 0.99069, R² = 0.98147 and error = 2.59 W/m² for 30 days; and r = 0.98974, R² = 0.97958 and error = 2.77 W/m² for 120 days. The thermal diffusivity values estimated by the proposed method proved coherent and consistent with the in situ measurements and with values found in the literature using conventional methods.
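The abstract does not reproduce the estimator itself; a minimal sketch of the half-order-derivative relation it refers to, for a homogeneous semi-infinite soil, could look like the code below (the thermal properties and input file are illustrative assumptions, not the thesis values):

```python
# Sketch of a half-order time-derivative estimate of soil heat flux
# (in the spirit of Wang and Bras, 1999) from a surface soil-temperature series.
import numpy as np

lam = 1.0      # soil thermal conductivity, W/(m K)  (assumed)
C   = 2.0e6    # volumetric heat capacity, J/(m3 K)  (assumed)

t, T = np.loadtxt("soil_temperature.txt", unpack=True)   # time (s), temperature (K)

def heat_flux(t, T, lam, C):
    """G(t_n) = 2*sqrt(lam*C/pi) * sum_i (T_i - T_{i-1}) / (sqrt(t_n-t_{i-1}) + sqrt(t_n-t_i))."""
    G = np.zeros_like(T)
    coef = 2.0 * np.sqrt(lam * C / np.pi)
    for n in range(1, len(t)):
        dT = np.diff(T[: n + 1])
        denom = np.sqrt(t[n] - t[:n]) + np.sqrt(t[n] - t[1 : n + 1])
        G[n] = coef * np.sum(dT / denom)
    return G

G = heat_flux(t, T, lam, C)
print("estimated soil heat flux (W/m2), last value:", G[-1])
```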
Abstract:
In this thesis, four different methods were used to diagnose precipitation extremes over Northeastern Brazil (NEB): generalized linear models via logistic and Poisson regression, extreme value theory via the generalized extreme value (GEV) and generalized Pareto (GPD) distributions, and vectorial generalized linear models based on the GEV (VGLM-GEV). The logistic and Poisson regression models were used to identify interactions between the precipitation extremes and other variables by means of odds ratios and relative risks. Outgoing longwave radiation was found to be the indicator variable for the occurrence of extreme precipitation over eastern, northern and semi-arid NEB, while relative humidity played this role over southern NEB. The GEV and GPD distributions (based on the 95th percentile) showed that the location and scale parameters reach their maxima along the eastern and northern coasts of NEB; the GEV also identified a maximum core over western Pernambuco, influenced by weather systems and topography. Regarding the shape parameter of the GEV and GPD, in most regions the data were fitted by negative Weibull and Beta distributions (ξ < 0), respectively. The GEV (GPD) return levels and periods indicate that northern Maranhão (central Bahia) may experience at least one extreme precipitation event exceeding 160.9 mm/day (192.3 mm/day) within the next 30 years. The VGLM-GEV model found that the zonal and meridional wind components, evaporation, and Atlantic and Pacific sea surface temperatures boost the precipitation extremes. Its GEV parameters show the following results: a) location (μ), with the highest value of 88.26 ± 6.42 mm over northern Maranhão; b) scale (σ), positive in most regions except southern Maranhão; and c) shape (ξ), with most of the selected regions fitted by the negative Weibull distribution (ξ < 0). Southern Maranhão and southern Bahia have the greatest accuracy. Regarding the return level, it was estimated that central Bahia may experience at least one extreme precipitation event equal to or exceeding 571.2 mm/day within the next 30 years.
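As a minimal illustration of the return-level computation mentioned above (synthetic data only; note that scipy's shape parameter c is the negative of the ξ used in the abstract):

```python
# Sketch: fitting a GEV to annual maxima of daily precipitation and computing a
# 30-year return level. Illustrative only; the thesis fits real series and also
# uses GPD and VGLM approaches not shown here.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
annual_max = rng.gumbel(loc=80.0, scale=25.0, size=40)   # fake annual maxima (mm/day)

# Maximum-likelihood fit; scipy returns (c, loc, scale) with c = -xi.
c, loc, scale = genextreme.fit(annual_max)

return_period = 30.0   # years
level = genextreme.ppf(1.0 - 1.0 / return_period, c, loc=loc, scale=scale)
print(f"estimated {return_period:.0f}-year return level: {level:.1f} mm/day")
```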
Abstract:
The demand for materials of high consistency obtained at relatively low temperatures has been driving the search for chemical processes to replace the conventional ceramic method. This work aims to obtain nanosized encapsulated (core-shell) pigments based on TiO2 doped with transition metals (Fe, Co, Ni, Al) through three synthesis methods: polymeric precursors (Pechini), microwave-assisted hydrothermal synthesis, and co-precipitation associated with sol-gel chemistry. The study was motivated by the simplicity, speed and low power consumption characteristic of these methods. Their costs are affordable and they allow good control of the microstructure, combined with high purity, controlled stoichiometry of the phases and particles of nanometric size. The physical, chemical, morphological, structural and optical properties of the materials obtained were analyzed using different materials characterization techniques. The powder pigments were tested for the discoloration and degradation of a solution of Remazol gold yellow dye (NNI) in a photoreactor, followed by filtration to separate the solution from the pigments, with the filtrate available for subsequent UV-Vis measurements. After the powders were obtained by the different methods, two calcination temperatures were used: 400 °C and 1000 °C. A fixed dopant concentration of 10% (Fe, Al, Ni, Co) by mass relative to the mass of titanium was used, making the study technologically and economically viable. Transmission electron microscopy (TEM) made it possible to confirm the formation of nanosized encapsulated pigment particles, with TiO2 diameters from 20 nm to 100 nm and Fe, Ni and Co coating layers between 2 nm and 10 nm thick. Among the methods studied, co-precipitation associated with sol-gel chemistry was the most efficient, achieving the best results without requiring calcination of the powders.
Abstract:
This thesis presents a certification method for semantic web service compositions which aims to statically ensure their functional correctness. The certification method encompasses two dimensions of verification, termed the base and functional dimensions. The base dimension concerns the verification of the correct application of the semantic web services in the composition, i.e., it ensures that each service invocation in the composition complies with the respective service definition. The certification of this dimension exploits the semantic compatibility between the invocation arguments and the formal parameters of the semantic web service. The functional dimension aims to ensure that the composition satisfies a given specification expressed in the form of preconditions and postconditions. This dimension is formalized by a calculus based on Hoare logic, from whose deductive system partial correctness specifications involving compositions of semantic web services can be derived. Our work is also characterized by the use of a fragment of description logic, namely ALC, to express the partial correctness specifications. In order to operationalize the proposed certification method, we developed a supporting environment for defining semantic web service compositions and for conducting the certification process. The certification method was experimentally evaluated by applying it to three different proofs of concept, which enabled a broad evaluation of the method.
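As a rough illustration of the kind of specification the functional dimension targets (the syntax and service names below are hypothetical, not taken from the thesis), a partial correctness assertion over a composition S can pair ALC concept assertions as pre- and postconditions, in the style of a Hoare triple:

```latex
% Hypothetical example: a two-step composition S = bookFlight; issueTicket
% specified with ALC concept assertions (all names are illustrative).
\[
\{\, \mathit{req} : \mathit{Request} \sqcap \exists \mathit{hasPayment}.\mathit{ValidCard} \,\}\;
S\;
\{\, \mathit{out} : \mathit{Ticket} \sqcap \forall \mathit{issuedFor}.\mathit{ConfirmedFlight} \,\}
\]
```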
Abstract:
This research aims to determine whether it is possible to build spatial patterns over oil fields using DFA (Detrended Fluctuation Analysis) of the following well logs: sonic, density, porosity, resistivity and gamma ray. The analysis employed a set of 54 well logs from the Campos dos Namorados oil field, RJ, Brazil. To check for spatial correlation, the Mantel test was applied between the matrix of geographic distances and the matrix of differences between the DFA exponents of the well logs. The null hypothesis assumes the absence of spatial structure, that is, no correlation between the matrix of Euclidean distances and the matrix of DFA differences. Our analysis indicates that the sonic (p=0.18) and density (p=0.26) logs were the profiles showing a tendency toward correlation, i.e., weak correlation. A complementary analysis using contour plots also suggested that the sonic and density logs are the geophysical quantities most suitable for building spatial structures, corroborating the results of the Mantel test.
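A minimal sketch of how a DFA exponent can be estimated for a single log follows (illustrative only; the thesis pipeline over 54 logs and the Mantel test are not reproduced, and the file name and window sizes are assumptions):

```python
# Sketch of a first-order DFA exponent estimate for a single well-log series.
import numpy as np

def dfa_exponent(x, scales):
    """Return the DFA-1 scaling exponent of series x over the given window sizes."""
    y = np.cumsum(x - np.mean(x))          # integrated (profile) series
    flucts = []
    for s in scales:
        n_win = len(y) // s
        rms = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    # Slope of log F(s) vs log s is the DFA exponent alpha.
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

log_values = np.loadtxt("sonic_log.txt")                  # hypothetical sonic log samples
scales = np.array([16, 32, 64, 128, 256])
print("DFA exponent:", dfa_exponent(log_values, scales))
```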
Abstract:
Plasma nitriding has been widely used in industrial and technological applications because it improves mechanical and tribological properties, among others. In order to solve problems that arise in conventional nitriding, such as the ring restriction (edge effect), techniques employing different cathode configurations have been developed. In this work we studied surfaces of commercially pure titanium (Grade II) modified by plasma nitriding with different cathode configurations (hollow cathode, cathodic cage with one cage, and cathodic cage with two cages), varying the temperature over 350, 400 and 430 °C, with the goal of optimizing the surface for technological applications and evaluating which treatment gave the best overall results on the substrate. The samples were characterized by atomic force microscopy (AFM), Raman spectroscopy, microhardness testing, X-ray diffraction (XRD) and macroscopic analysis. Thus, we were able to evaluate properties such as roughness, topography, the presence of interstitial elements, hardness, and the homogeneity, uniformity and thickness of the nitrided layer. It was observed that all nitrided samples were modified relative to the control (untreated) sample, showing increased surface hardness, the presence of TiN (observed by both XRD and Raman spectroscopy) and a significant change in the roughness of the treated samples. The hollow cathode treatment, despite yielding the lowest microhardness among the treated samples, produced the lowest surface roughness, even though the samples in this configuration undergo a physically more aggressive treatment.
Abstract:
Correct distance perception is important for executing various interactive tasks such as navigation, selection and manipulation. It is known, however, that there is, in general, a significant compression of perceived distance in virtual environments, mainly when using Head-Mounted Displays (HMDs). This perceived distance compression may cause various problems for applications and even negatively affect the usefulness of those that depend on correct distance judgments. The scientific community has not yet been able to determine the causes of the distance perception compression in virtual environments. For this reason, the objective of this work was to investigate, through experiments with users, the influence of both the field of view (FoV) and the distance estimation method on the perceived compression. To that end, an experimental comparison between the my3D device and an HMD was carried out with 32 participants, seeking information on the causes of the compressed perception. The results showed that the my3D has inferior capabilities compared to the HMD, resulting in worse estimates, on average, for both estimation methods tested. The causes are believed to be the incorrect stimulation of the user's peripheral vision, the smaller FoV and the weaker sense of immersion, as described by the participants of the experiment.
Abstract:
Data visualization is widely used to facilitate the comprehension of information and to find relationships between data. One of the most widely used techniques for visualizing multivariate data (4 or more variables) is the 2D scatterplot. This technique associates each data item with a visual mark in the following way: two variables are mapped to Cartesian coordinates, so that the mark can be placed on the Cartesian plane; the other variables are mapped progressively to visual properties of the mark, such as size, color and shape. As the number of variables to be visualized increases, so does the number of visual properties associated with the mark, and the complexity of the final visualization grows. However, increasing the complexity of the visualization does not necessarily imply a better visualization; sometimes the opposite happens, producing a visually polluted and confusing result, a problem known as visual properties overload. This work investigates whether it is possible to work around the overload of the visual channel and improve insight into multivariate data through a modification of the 2D scatterplot technique. In this modification, the variables of each data item are mapped to multisensory marks, composed not only of visual properties but also of haptic properties such as vibration, viscosity and elastic resistance. We believed that this approach could ease the insight process by transposing properties from the visual channel to the haptic channel. The hypothesis was verified through experiments in which we analyzed (a) the accuracy of the answers, (b) the response time and (c) the degree of personal satisfaction with the proposed approach. The hypothesis, however, was not validated: the results suggest an equivalence between the investigated visual and haptic properties in all analyzed aspects, although in strictly numerical terms the multisensory visualization achieved better results in response time and personal satisfaction.
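A minimal sketch of the baseline 2D scatterplot mapping described above, with two extra variables encoded as mark size and color (synthetic data; the haptic extension proposed in the thesis requires dedicated hardware and is not shown):

```python
# Sketch of a multivariate 2D scatterplot: two variables go to the Cartesian axes,
# a third to mark size and a fourth to mark color.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
x, y, v3, v4 = rng.random((4, 100))        # four variables per data item

plt.scatter(x, y,
            s=50 + 250 * v3,               # third variable -> mark size
            c=v4, cmap="viridis",          # fourth variable -> mark color
            alpha=0.7, edgecolors="k")
plt.xlabel("variable 1")
plt.ylabel("variable 2")
plt.colorbar(label="variable 4")
plt.title("2D scatterplot with size/color encoding")
plt.show()
```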
Abstract:
Through the numerous technological advances of recent years, along with the popularization of computing devices, society is moving towards an "always connected" paradigm. Computer networks are everywhere, and the advent of IPv6 paves the way for the explosion of the Internet of Things. This concept enables the sharing of data between computing machines and everyday objects. One of the areas encompassed by the Internet of Things is vehicular networks. However, the information generated individually by a vehicle is not large in volume and, in isolation, does not contribute to improving traffic. This proposal presents the Infostructure, a system intended to facilitate the effort and reduce the cost of developing context-aware applications with high-level semantics for the Internet of Things scenario, allowing data to be managed, stored and combined in order to generate broader context. To this end, we present a reference architecture that shows the major components of the Infostructure. A prototype is then presented and used to validate that our work reaches the desired level of high-level semantic contextualization, together with a performance evaluation of the subsystem responsible for managing contextual information over a large amount of data. A statistical analysis of the results obtained in the evaluation is then performed. Finally, we present the conclusions of the work, some open problems, such as the lack of assurance regarding the integrity of the sensor data arriving at the Infostructure, and future work that considers the implementation of other modules so that tests can be conducted in real environments.