Abstract:
Google Docs (GD) is an online word processor in which multiple authors can work on the same document, synchronously or asynchronously, which can help develop the ability to write in English (WEISSHEIMER; SOARES, 2012). As they write collaboratively, learners find more opportunities to notice gaps in their written production, since they are exposed to more input from their co-authors (WEISSHEIMER; BERGSLEITHNER; LEANDRO, 2012), and they prioritize the process of text (re)construction over concern with the final product, i.e., the final version of the text (LEANDRO; WEISSHEIMER; COOPER, 2013). Moreover, when it comes to second language (L2) learning, producing language enables the consolidation of existing knowledge as well as the internalization of new knowledge (SWAIN, 1985; 1993). Taking this into consideration, this mixed-methods (DÖRNYEI, 2007) quasi-experimental (NUNAN, 1999) study investigates the impact of collaborative writing through GD on the development of the writing skill in English and on the noticing of syntactic structures (SCHMIDT, 1990). Thirty-four university students of English made up the study cohort: twenty-five were assigned to the experimental group and nine to the control group. All learners took a pre-test and a post-test so that we could measure their noticing of syntactic structures. Learners in the experimental group were exposed to a blended learning experience, in which they took reading and writing classes at the university and, over eleven weeks, collaboratively wrote three pieces of flash fiction (a complete story told in a hundred words) outside the classroom, online through GD. Learners in the control group took reading and writing classes at the university but did not practice collaborative writing.
The first and last stories produced by the learners in the experimental group were analysed in terms of grammatical accuracy, operationalized as the number of grammar errors per hundred words (SOUSA, 2014), and lexical density, which refers to the ratio between the number of words with lexical properties and the number of words with grammatical properties (WEISSHEIMER, 2007; MEHNERT, 1998). Additionally, learners in the experimental group answered an online questionnaire on the blended learning experience. The quantitative results showed that the collaborative task led to the production of more lexically dense texts over the eleven weeks. The noticing and grammatical accuracy results differed from what we expected; however, they provide insights into measurement issues, in the case of noticing, and into the participants' positive attitude towards collaborative writing with flash fiction. The qualitative results also shed light on the usefulness of computer-mediated collaborative writing in L2 learning.
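The two quantitative measures above are simple ratios; a minimal sketch of how they might be computed (the function-word list and the sample sentence are illustrative assumptions, not the study's actual coding scheme):

```python
def grammatical_accuracy(n_errors, n_words):
    """Grammar errors per hundred words (lower means more accurate)."""
    return 100.0 * n_errors / n_words

# A minimal stand-in for a list of grammatical (function) words; the study's
# actual lexical-vs-grammatical classification is not specified here.
FUNCTION_WORDS = {"the", "a", "an", "of", "to", "in", "and", "is", "was"}

def lexical_density(tokens):
    """Share of tokens carrying lexical (content) properties."""
    lexical = [t for t in tokens if t.lower() not in FUNCTION_WORDS]
    return len(lexical) / len(tokens)

text = "the storm was sudden and the village was silent".split()
density = lexical_density(text)   # 4 content words out of 9 tokens
```

A text with 3 errors in 150 words, for instance, scores 2.0 errors per hundred words under this operationalization.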
Abstract:
In line with the grammar competition model (Kroch, 1989; 2001), according to which change in syntactic domains is a process that develops via competition between different grammars, we describe and analyze surface V2/V3 constructions in matrix/root sentences of Brazilian personal letters from the 19th and 20th centuries. The corpus, composed of 154 personal letters from Rio de Janeiro and Rio Grande do Norte, is divided into three half-centuries: (i) the latter half of the 19th century; (ii) the first half of the 20th century; and (iii) the latter half of the 20th century. Our focus was the nature of the preverbal constituents in surface V2 (verb in second position in the sentence) and V3 (verb in third position) constructions, with special attention to the position of the subject. Based on the various diachronic studies of Portuguese word-order patterns (Ambar (1992); Ribeiro (1995, 2001); Paixão de Sousa (2004); Paiva (2011); Coelho and Martins (2009, 2012)), our study sought to identify the empirical ordering patterns that involve surface V2/V3 constructions and how these patterns are structured syntactically within a formal theoretical perspective (Chomsky, 1981; 1986), more specifically in accordance with the studies of Antonelli (2011) and Costa & Galves (2002). The survey results show that the data from the latter half of the 19th century, unlike the data from the first and second halves of the 20th century, display a greater balance with respect to the syntactic nature of the preverbal constituent (subject or non-subject), so that, in this period, orders with the subject in preverbal position reach at most 52% (231/444 tokens), while in the remaining 48% (213/444 tokens) the preverbal constituent is a non-subject, almost always an adverbial adjunct.
Given these results, we argue that the Brazilian personal letters of the 19th century display ordering patterns associated with both a V2 system and an SV system, thus configuring a possible process of competition between different grammars instantiating either a V2 system or an SV system. In other words, the Brazilian letters of the 19th century instantiate a competition between the grammar of Classical Portuguese (a V2 system) and the grammars of Brazilian Portuguese and European Portuguese (an SV system). That period is therefore subject to two distinct parametric settings: (i) verb movement to the Fin head (the grammar of Classical Portuguese) and (ii) verb movement to the T head (the grammars of Brazilian and European Portuguese). In the personal letters of the 20th century (first and second halves), on the other hand, there is a clear increase in ordering patterns associated with the SV system, which proves more stable.
Abstract:
The textile industry has been a source of environmental pollution, mainly due to the generation of large volumes of waste with high organic loading and intense color. In this context, this study evaluated the electrochemical degradation of synthetic textile effluents containing Methylene Blue (AM) dye, using Ti/IrO2-Ta2O5 and Ti/Pt anodes, by direct and indirect (active chlorine) electrooxidation. We evaluated the influence of the applied current density (20, 40 and 60 mA/cm2) and of different electrolyte concentrations (NaCl and Na2SO4), as well as neutral and alkaline pH media. The electrochemical treatment was conducted in a continuous-flow reactor, in which 100 ppm AM was electrolyzed for 6 hours. The performance of the electrochemical process was evaluated by UV-vis spectrophotometry, chemical oxygen demand (COD) and total organic carbon (TOC). The results showed that, with increasing current density, 100% color removal could be obtained at both Ti/IrO2-Ta2O5 and Ti/Pt electrodes. Regarding color removal efficiency, increasing the electrolyte concentration promotes a higher removal percentage, using 0.02 M Na2SO4 and 0.017 M NaCl. Concerning the aqueous medium, the best color removal results were obtained in alkaline medium using Ti/Pt. In terms of organic matter, 86% removal was achieved in neutral medium for Ti/Pt, against 30% in alkaline medium. To understand the electrochemical behavior with respect to the oxygen evolution reaction, polarization curves were recorded, showing that the presence of NaCl in the solution favored the production of active chlorine species. The best results in energy consumption and cost were obtained by applying the lowest current density (20 mA/cm2) over 6 hours of electrolysis.
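The reported energy consumption figures follow from the basic relation E = U·I·t divided by the treated volume; a hedged sketch (the cell voltage, electrode area and treated volume below are hypothetical values, not those of the study):

```python
def energy_consumption_kwh_per_m3(cell_voltage_v, current_density_ma_cm2,
                                  electrode_area_cm2, time_h, volume_l):
    """Specific energy consumption of an electrolysis run, in kWh per m^3,
    from E = U * I * t over the treated volume."""
    current_a = current_density_ma_cm2 * electrode_area_cm2 / 1000.0
    energy_kwh = cell_voltage_v * current_a * time_h / 1000.0
    return energy_kwh / (volume_l / 1000.0)

# Hypothetical example: 20 mA/cm2 over a 25 cm2 anode at 5 V for 6 h, 1 L treated.
ec = energy_consumption_kwh_per_m3(5.0, 20.0, 25.0, 6.0, 1.0)
```

At fixed voltage and time, the consumption scales linearly with current density, which is why the lowest density (20 mA/cm2) gives the best energy figures.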
Abstract:
The synthesis of heterocyclic compounds such as quinoxaline derivatives has been shown to be relevant and promising due to significant applications in biological and technological areas. This work was dedicated to the synthesis, characterization and reactivity of quinoxaline derivatives in order to obtain new chemosensors. (L)-Ascorbic acid (1) and 2,3-dichloro-6,7-dinitroquinoxaline (2) were explored as synthetic precursors. Starting from the synthesis of 1 and the characterization of compounds derived from (L)-ascorbic acid, studies were performed investigating the application of the products as chemosensors, in which compound 36 demonstrated selective affinity for Cu2+ ions in methanolic solution, by naked-eye (colorimetric) and UV-visible analyses. Furthermore, initial analysis suggests that 39, a Schiff base derived from 36, also presents this feature. Five quinoxaline derivatives were synthesized from building block 2 through nucleophilic aromatic substitution by aliphatic amines, in which controlling the experimental conditions allows both mono- and di-substituted derivatives to be obtained. Reactivity studies were carried out with two purposes: i) to investigate whether compound 47 could act as an anion chemosensor, based on its interaction with sodium hydroxide in DMSO, using image analysis and UV-visible spectroscopy; ii) to kinetically characterize the conversion of compound 44 into 46 based on RGB and multivariate image analysis of TLC data, as a simple and inexpensive qualitative and quantitative tool.
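Kinetic characterization from image intensities typically reduces to fitting a rate law; a minimal sketch assuming first-order behavior (the time points and intensities below are synthetic, not the study's TLC data):

```python
import math

def first_order_k(times, intensities):
    """Least-squares slope of ln(intensity) versus time; for a first-order
    process I(t) = I0 * exp(-k t), the negative slope is the rate constant k."""
    n = len(times)
    ys = [math.log(i) for i in intensities]
    tbar = sum(times) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times, ys))
             / sum((t - tbar) ** 2 for t in times))
    return -slope

# Synthetic decay with k = 0.2 (arbitrary time units)
ts = [0, 1, 2, 3, 4, 5]
sig = [100.0 * math.exp(-0.2 * t) for t in ts]
k = first_order_k(ts, sig)
```

The same fit applies whether the signal is a UV-visible absorbance or an RGB channel intensity extracted from TLC images, which is what makes the image-based approach inexpensive.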
Abstract:
In oil prospecting, seismic data are usually irregularly and sparsely sampled along the spatial coordinates due to obstacles in the placement of geophones. Fourier methods provide a way to regularize seismic data and are efficient if the input data are sampled on a regular grid. However, when these methods are applied to irregularly sampled data, the orthogonality among the Fourier components is broken and the energy of one Fourier component may "leak" into others, a phenomenon called "spectral leakage". The objective of this research is to study the spectral representation of irregularly sampled data. In particular, we present the basic structure of the NDFT (nonuniform discrete Fourier transform), study its properties and demonstrate its potential in seismic signal processing. Accordingly, we study the FFT (fast Fourier transform) and the NFFT (nonuniform fast Fourier transform), which rapidly calculate the DFT (discrete Fourier transform) and the NDFT, respectively. We compare the recovery of the signal using the FFT, DFT and NFFT. We approach the interpolation of seismic traces using the ALFT (antileakage Fourier transform) to overcome the problem of spectral leakage caused by uneven sampling. Applications to synthetic and real data showed that the ALFT method works well on seismic data from complex geology and suffers little from irregular spatial sampling and edge effects; in addition, it is robust and stable with noisy data. However, it is not as efficient as the FFT, and its reconstruction is not as good in the case of irregular sampling with large gaps in the acquisition.
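The NDFT discussed above evaluates the Fourier sum directly at arbitrary sample positions; a small sketch (pure NumPy, O(N²) rather than the fast NFFT) showing that on a regular grid it reduces to the ordinary DFT, with a single Fourier component carrying all the energy and no leakage:

```python
import numpy as np

def ndft(x, t, freqs):
    """Direct nonuniform DFT: sum_n x[n] * exp(-2*pi*i*f*t[n]) at each f."""
    return np.exp(-2j * np.pi * np.outer(freqs, t)) @ x

N = 16
t = np.arange(N) / N            # regular sample positions in [0, 1)
x = np.sin(2 * np.pi * 3 * t)   # a single Fourier component (3 cycles)
freqs = np.arange(N)            # integer frequencies

X = ndft(x, t, freqs)
# On this regular grid the NDFT coincides with np.fft.fft(x), and the
# energy concentrates at frequency 3; perturbing t breaks orthogonality
# and spreads ("leaks") that energy over neighboring frequencies.
```

The ALFT attacks exactly that leakage: it iteratively picks the strongest component, subtracts its contribution at the irregular positions, and re-estimates the residual spectrum.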
Abstract:
The key aspect limiting resolution in crosswell traveltime tomography is illumination, a well-known but not as well exemplified result. Resolution in the 2D case is revisited using a simple geometric approach based on the angular aperture distribution and the properties of the Radon Transform. Analytically, it is shown that if an interface has dips contained within the angular aperture limits at all points, it is correctly imaged in the tomogram. Inversion of synthetic data confirms this result and also shows that isolated artifacts may be present when the dip is near the illumination limit. In the inverse sense, however, if an interface is interpretable from a tomogram, even an approximately horizontal interface, there is no guarantee that it corresponds to a true interface. Similarly, if a body is present in the interwell region it is diffusely imaged in the tomogram, but its interfaces, particularly vertical edges, cannot be resolved, and additional artifacts may be present. Again, in the inverse sense, there is no guarantee that an isolated anomaly corresponds to a true anomalous body, because the anomaly can also be an artifact. Jointly, these results state the dilemma of ill-posed inverse problems: the absence of any guarantee of correspondence to the true distribution. The limitations due to illumination cannot be overcome by the use of mathematical constraints. It is shown that crosswell tomograms derived using sparsity constraints, with both Discrete Cosine Transform and Daubechies bases, basically reproduce the same features seen in tomograms obtained with the classic smoothness constraint. Interpretation must always take into consideration the a priori information and the particular limitations due to illumination. An example of interpreting a real data survey in this context is also presented.
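The illumination criterion above can be stated geometrically: a dip is resolvable only if it lies inside the aperture spanned by the source-receiver ray angles. A straight-ray sketch (the well geometry below is hypothetical, and real tomography uses bent rays):

```python
import math

def ray_angle_deg(src_depth, rec_depth, well_separation):
    """Angle (from horizontal) of the straight ray joining a source in one
    well to a receiver in the other."""
    return math.degrees(math.atan2(rec_depth - src_depth, well_separation))

def is_dip_illuminated(dip_deg, src_depths, rec_depths, well_separation):
    """A planar interface with the given dip is resolvable only if its dip
    lies inside the angular aperture spanned by all source-receiver rays."""
    angles = [ray_angle_deg(s, r, well_separation)
              for s in src_depths for r in rec_depths]
    return min(angles) <= dip_deg <= max(angles)

# Hypothetical survey: wells 100 m apart, sources/receivers from 0 to 100 m depth,
# giving an aperture of roughly -45 to +45 degrees.
survey = dict(src_depths=[0, 50, 100], rec_depths=[0, 50, 100],
              well_separation=100.0)
```

Under this geometry a 30° dip is inside the aperture, while a 60° dip or a vertical edge (90°) is not, matching the observation above that vertical edges cannot be resolved.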
Abstract:
This study aimed to evaluate the potential of oxidative electrochemical treatment coupled with an adsorption process, using expanded perlite as adsorbent, in the removal of the textile dyes Remazol Red and Novacron Blue from synthetic effluent. The dyes and perlite were characterized by thermogravimetry (TG), differential scanning calorimetry (DSC), infrared spectroscopy (IR), scanning electron microscopy (SEM), X-ray diffraction (XRD) and X-ray fluorescence (XRF). The electrochemical treatments used Ti/Pt and Pb/PbO2 anodes under different conditions: 60 minutes of treatment; current densities of 20, 40 and 60 mA/cm2; pH 1, 4.5 and 8; and temperatures of 20, 40 and 60 ºC. For the adsorption tests, contact times of 30 minutes for Remazol Red and 20 minutes for Novacron Blue were established, while pH 1, 4.5 and 8, 500 mg of adsorbent and temperatures of 20, 40 and 60 ºC were used for both treatments. The results indicated that both sequences, electrooxidation/adsorption and adsorption/electrooxidation, were effective in removing color from the synthetic solutions. The electricity consumption allowed us to evaluate the applicability of the electrochemical process, providing very acceptable values, from which the cost could be estimated. Total organic carbon (TOC) and gas chromatography-mass spectrometry (GC-MS) analyses were performed, showing that the best combination for removing organic matter is Pb/PbO2 followed by perlite. GC-MS indicated that the by-products formed when Remazol Red was degraded are benzoic acid, phthalic acid, thiocarbamic acid, benzene, chlorobenzene, 2-ethylphenol and naphthalene. Conversely, aniline, phthalic acid, 1,6-dimethylnaphthalene, naphthalene and the hydroxybenzenesulfonate ion were detected when Novacron Blue was studied.
Analyses by atomic absorption spectrometry showed that lead was released during the electrochemical oxidation runs performed with the Pb/PbO2 anode, but these values were reduced by subjecting the effluent to adsorption. According to these results, the sequential techniques electrooxidation/adsorption and adsorption/electrooxidation are suitable for treating solutions containing dyes.
Abstract:
The advance of drilling into deeper wells has required more thermostable materials. Synthetic fluids, which usually have good chemical stability, face environmental constraints; moreover, they usually generate more discharge and require a costly disposal treatment of the drilled cuttings, which is often not efficient and requires mechanical components that hinder the operation. The adoption of aqueous fluids generally involves the use of chrome lignosulfonate as a dispersant, which stabilizes the rheological properties and fluid loss under high temperatures and pressures (HTHP). However, due to the environmental impact associated with chrome compounds, the drilling industry needs alternatives that maintain the integrity of these properties and ensure the success of the operation, given the strong influence of temperature on the viscosity of aqueous fluids and of the polymers used in such fluids, often polysaccharides, which are prone to hydrolysis and biological degradation. Therefore, vinyl polymers were selected for this study because they have a predominantly carbon backbone: in particular, polyvinylpyrrolidone (PVP), for resisting higher temperatures, and partially hydrolyzed polyacrylamide (PHPA) and clay, for increasing the system's viscosity. Moreover, the absence of acetal bonds reduces their susceptibility to bacterial attack. In order to develop an aqueous drilling fluid system for HTHP applications using PVP, PHPA and clay as the main constituents, fluid formulations were prepared and their rheological properties determined using a Fann rotary viscometer, with the filtrate volume obtained by HTHP filtration following the API 13B-2 standard. The new fluid system using high-molar-weight polyvinylpyrrolidone (PVP) showed higher viscosities, gel strengths and yield points, due to its clay-flocculating effect.
On the other hand, the low-molecular-weight PVP contributed to the formation of dispersed systems with lower values of the rheological properties and fluid loss. Both systems are characterized by thermal stability up to around 120 °C, maintaining stable rheological parameters. The results were further corroborated by linear clay swelling tests.
Abstract:
This thesis presents and discusses the results of ambient seismic noise correlation in two different environments: an intraplate setting and the Mid-Atlantic Ridge. The coda wave interferometry method has also been tested on the intraplate data. Ambient noise correlation is a method that makes it possible to retrieve the structural response between two receivers from ambient noise records, as if one of the stations were a virtual source. It has been widely used in seismology to image the subsurface and to monitor structural changes, mostly associated with volcanic eruptions and large earthquakes. In the intraplate study, we were able to detect localized structural changes related to a small earthquake swarm, whose main event was mR 3.7, in Northeast Brazil. We also showed that 1-bit normalization and spectral whitening result in the loss of waveform details, and that the phase auto-correlation, which is amplitude-unbiased, seems to be more sensitive and robust for our analysis of a small earthquake swarm. The analysis of 6 months of data using cross-correlations detects clear medium changes soon after the main event, while the auto-correlations detect changes essentially after 1 month. This could be explained by fluid pressure redistribution, initiated by hydromechanical changes, with pathways opened to shallower depth levels by later earthquakes. In the Mid-Atlantic Ridge study, we investigate structural changes associated with an mb 4.9 earthquake in the region of the Saint Paul transform fault. The data were recorded by a single broadband seismic station located less than 200 km from the Mid-Atlantic Ridge. The results of the phase auto-correlation over a 5-month period show a strong co-seismic medium change followed by a relatively fast post-seismic recovery. This medium change is likely related to the damage caused by the earthquake's ground shaking.
The healing process (filling of the new cracks), which lasted 60 days, can be decomposed into two phases: a fast recovery (70% in ~30 days) in the early post-seismic stage and a relatively slow recovery later (30% in ~30 days). In the coda wave interferometry study, we monitor temporal changes of the subsurface caused by the small intraplate earthquake swarm mentioned previously. The method was first validated with synthetic data: we were able to detect a change of 2.5% in the source position and a 15% decrease in the number of scatterers. Then, from the real data, we observed a rapid decorrelation of the seismic coda after the mR 3.7 event, indicating a rapid earthquake-induced change of the subsurface in the fault region.
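The virtual-source idea behind ambient noise correlation can be illustrated with two synthetic records of the same noise field, one delayed: the cross-correlation peak recovers the inter-station delay. This is a toy sketch, not the thesis's processing chain (which involves long time series, stacking, and phase correlation):

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.standard_normal(2048)   # a common ambient-noise wavefield
delay = 15                          # propagation delay between stations, in samples

rec_a = noise                       # record at station A
rec_b = np.roll(noise, delay)      # station B sees the same field, delayed

# Cross-correlate the two records: the lag of the maximum recovers the
# inter-station travel time, as if station A were a virtual source.
corr = np.correlate(rec_b, rec_a, mode="full")
lags = np.arange(-len(noise) + 1, len(noise))
best_lag = lags[np.argmax(corr)]
```

Repeating this over successive time windows and tracking small shifts of the correlation waveform is what allows the co-seismic medium changes and their recovery to be monitored.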
Abstract:
The prevention of lipid oxidation, one of the main causes of degradation of edible oils, can be achieved by adding antioxidants. Concern about consuming healthier foods free of synthetic additives has contributed to the growing demand for natural, plant-derived antioxidants that can replace synthetic antioxidants in edible oils and fats. The present study aimed to evaluate the effect of adding carqueja (Pterospartum tridentatum) extract, at different concentrations (500 mg/kg and 1000 mg/kg), on the physico-chemical stability of edible oil subjected to three heating cycles (9 hours each) at 180 ºC and of oil stored at room temperature for 30 days. The carqueja extract was obtained by ethanol extraction, with a yield of 60.2% ± 0.225. The antioxidant capacity of the extract was determined by evaluating its ability to reduce the 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical and by the Folin-Ciocalteu method. The results showed that the extract has an antioxidant activity of 61.7% ± 0.38 and a phenolic compound content of 22.4 mg of gallic acid equivalents per g of extract. The physico-chemical stability of the edible oil was studied by analyzing several parameters, namely acidity, peroxide value, p-anisidine value, UV absorbance, refractive index, color and density. The results showed an increase in all parameters except color, both during heating of the oil and during its storage at room temperature. This increase was more pronounced in the heated oil, since oxidation reactions occur faster at 180 ºC than at room temperature.
More pronounced changes in the physico-chemical parameters were observed in the oil without carqueja extract than in the oil supplemented with 500 mg/kg and 1000 mg/kg of extract, confirming the effect of carqueja extract in reducing oxidation and, consequently, in increasing the physico-chemical stability of the edible oil. The oil supplemented with 1000 mg/kg of carqueja extract proved the most stable to oxidation.
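The DPPH antioxidant activity reported above follows the standard radical-scavenging formula; a minimal sketch (the absorbance values in the example are hypothetical, chosen only to reproduce a 61.7% figure):

```python
def dpph_inhibition_percent(abs_control, abs_sample):
    """Standard DPPH radical-scavenging calculation:
    %AA = (A_control - A_sample) / A_control * 100."""
    return (abs_control - abs_sample) / abs_control * 100.0

# Hypothetical absorbances of the DPPH solution without and with extract
aa = dpph_inhibition_percent(1.000, 0.383)
```

The deeper the discoloration of the purple DPPH solution (lower sample absorbance), the higher the computed scavenging activity.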