889 results for synthetic dyes
Abstract:
In this work, synthetic and theoretical studies were developed for clerodane-type diterpenes obtained from Croton cajucara Benth, one of the most important medicinal plants of the Brazilian Amazon region. Specifically, the major 19-nor-clerodane biocompound trans-dehydrocrotonin (t-DCTN), isolated from the bark of this Croton, was used as the target molecule. Semi-synthetic derivatives were obtained from t-DCTN using the following synthetic procedures: 1) catalytic reduction with H2, 2) reduction with NaBH4 and 3) reduction with NaBH4/CeCl3. The semi-synthetic 19-nor-furan-clerodane alcohol-type derivatives were denominated t-CTN, tCTN-OL, t-CTN-OL, t-DCTN-OL and t-DCTN-OL, all of them characterized by NMR. The furan-clerodane alcohol derivatives t-CTN-OL and tCTN-OL were obtained from the semi-synthetic t-CTN, which can also be isolated from the bark of C. cajucara. A theoretical protocol (DFT/B3LYP) involving the prediction of geometric and magnetic properties, such as bond lengths and angles as well as chemical shifts and coupling constants, was developed for the target t-DCTN, in which theoretical NMR data were correlated with structural data, showing satisfactory agreement with experimental NMR data (correlation coefficients ranging from 0.97 to 0.99) and with X-ray diffraction data. This theoretical methodology was also validated for all semi-synthetic derivatives described in this work. In addition, topological data from the Quantum Theory of Atoms in Molecules (QTAIM) showed the presence of stabilizing H-H and (C)O--H(C) intramolecular interactions for t-DCTN and t-CTN, contributing to the understanding of the different reactivity of these clerodanes in the presence of NaBH4.
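The theory-experiment agreement quoted above (coefficients of 0.97 to 0.99) is a linear correlation between computed and measured NMR parameters. A minimal sketch of how such a coefficient is obtained, using hypothetical chemical-shift values (not the thesis data):

```python
import numpy as np

# Hypothetical 13C chemical shifts in ppm, for illustration only;
# these are NOT the t-DCTN values from the thesis.
delta_exp = np.array([170.2, 145.1, 138.4, 110.5, 72.3, 40.8, 21.5])
delta_calc = np.array([172.9, 147.6, 140.1, 112.8, 70.9, 42.2, 20.7])

# Pearson correlation coefficient between theory and experiment
r = np.corrcoef(delta_exp, delta_calc)[0, 1]

# Linear fit delta_exp ~ a * delta_calc + b, often used to scale DFT shifts
a, b = np.polyfit(delta_calc, delta_exp, 1)
print(f"r = {r:.4f}, slope = {a:.3f}, intercept = {b:.2f}")
```

With real data, a coefficient in the 0.97-0.99 range indicates that the DFT protocol reproduces the experimental trend across all nuclei.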
Abstract:
The reverse time migration (RTM) algorithm has been widely used in the seismic industry to generate images of the subsurface and thus reduce the risk of oil and gas exploration. Its widespread use is due to the high quality of its subsurface imaging. RTM is also known for its high computational cost; therefore, parallel computing techniques have been used in its implementations. In general, parallel approaches for RTM use coarse granularity, distributing the processing of a subset of seismic shots among the nodes of a distributed system. Coarse-grained parallel approaches for RTM have been shown to be very efficient, since the processing of each seismic shot can be performed independently. Even so, RTM performance can be considerably improved by using a parallel approach with finer granularity for the processing assigned to each node. This work presents an efficient parallel algorithm for 3D reverse time migration with fine granularity using OpenMP. The 3D acoustic wave propagation algorithm makes up much of the RTM. Different load-balancing strategies were analyzed in order to minimize possible parallel performance losses at this stage. The results served as a basis for the implementation of the other RTM phases: backpropagation and the imaging condition. The proposed algorithm was tested with synthetic data representing some possible subsurface structures. Metrics such as speedup and efficiency were used to analyze its parallel performance. The migrated sections show that the algorithm performed satisfactorily in identifying subsurface structures. As for parallel performance, the analysis clearly demonstrates the scalability of the algorithm, which achieved a speedup of 22.46 for the wave propagation and 16.95 for the full RTM, both with 24 threads.
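Speedup and parallel efficiency, the two metrics cited above, are conventionally defined as S = T1/Tp and E = S/p for p threads. A quick sketch applying these definitions to the reported figures:

```python
def speedup(t_serial, t_parallel):
    """Speedup S = T1 / Tp (serial time over parallel time)."""
    return t_serial / t_parallel

def efficiency(s, threads):
    """Parallel efficiency E = S / p."""
    return s / threads

# Efficiencies implied by the speedups reported in the abstract (24 threads)
eff_prop = efficiency(22.46, 24)   # wave propagation stage
eff_rtm = efficiency(16.95, 24)    # full RTM
print(f"propagation: {eff_prop:.1%}, RTM: {eff_rtm:.1%}")
```

By these definitions, the reported speedups correspond to roughly 94% efficiency for the propagation kernel and roughly 71% for the complete migration, which is what "scalable" refers to here.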
Abstract:
Google Docs (GD) is an online word processor with which multiple authors can work on the same document, synchronously or asynchronously, which can help develop the ability to write in English (WEISSHEIMER; SOARES, 2012). As they write collaboratively, learners find more opportunities to notice the gaps in their written production, since they are exposed to more input from their fellow co-authors (WEISSHEIMER; BERGSLEITHNER; LEANDRO, 2012) and prioritize the process of text (re)construction over concern with the final product, i.e., the final version of the text (LEANDRO; WEISSHEIMER; COOPER, 2013). Moreover, when it comes to second language (L2) learning, producing language enables the consolidation of existing knowledge as well as the internalization of new knowledge (SWAIN, 1985; 1993). Taking this into consideration, this mixed-methods (DÖRNYEI, 2007) quasi-experimental (NUNAN, 1999) study investigates the impact of collaborative writing through GD on the development of the writing skill in English and on the noticing of syntactic structures (SCHMIDT, 1990). Thirty-four university students of English made up the cohort of the study: twenty-five were assigned to the experimental group and nine to the control group. All learners took a pre-test and a post-test so that their noticing of syntactic structures could be measured. Learners in the experimental group were exposed to a blended learning experience, in which they took reading and writing classes at the university and, over eleven weeks, collaboratively wrote three pieces of flash fiction (a complete story told in a hundred words) outside the classroom, online through GD. Learners in the control group took reading and writing classes at the university but did not practice collaborative writing.
The first and last stories produced by the learners in the experimental group were analysed in terms of grammatical accuracy, operationalized as the number of grammar errors per hundred words (SOUSA, 2014), and lexical density, which refers to the ratio between the number of words with lexical properties and the number of words with grammatical properties (WEISSHEIMER, 2007; MEHNERT, 1998). Additionally, learners in the experimental group answered an online questionnaire on the blended learning experience to which they were exposed. The quantitative results showed that the collaborative task led to the production of more lexically dense texts over the 11 weeks. The noticing and grammatical accuracy results differed from what we expected; however, they provide us with insights on measurement issues, in the case of noticing, and on the participants' positive attitude towards collaborative writing with flash fiction. The qualitative results also shed light on the usefulness of computer-mediated collaborative writing in L2 learning.
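The two text measures above reduce to simple ratios once each word has been coded; a minimal sketch, assuming word counts are already available (the thesis presumably coded the texts manually or with more elaborate criteria):

```python
def errors_per_hundred_words(n_errors, n_words):
    """Grammatical accuracy operationalized as grammar errors per 100 words
    (following Sousa, 2014)."""
    return 100.0 * n_errors / n_words

def lexical_density(n_lexical, n_grammatical):
    """Ratio of words with lexical properties (nouns, verbs, adjectives...)
    to words with grammatical properties (articles, prepositions...)."""
    return n_lexical / n_grammatical

# Hypothetical 100-word flash-fiction story with 4 errors,
# 55 lexical words and 45 grammatical words (illustrative numbers only)
acc = errors_per_hundred_words(4, 100)
dens = lexical_density(55, 45)
print(f"errors/100 words: {acc}, lexical density: {dens:.2f}")
```

Under this operationalization, "more lexically dense texts" means the lexical-to-grammatical ratio grew between the first and last stories.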
Abstract:
In line with the grammar competition model (Kroch, 1989; 2001), according to which change in syntactic domains is a process that develops via competition between different grammars, we describe and analyze surface V2/V3 constructions in matrix/root sentences of Brazilian personal letters from the 19th and 20th centuries. The corpus, composed of 154 personal letters from Rio de Janeiro and Rio Grande do Norte, is divided into three half-century periods: (i) the latter half of the 19th century; (ii) the first half of the 20th century; and (iii) the latter half of the 20th century. Our focus was the nature of the preverbal constituents in surface V2 (verb in second position in the sentence) and V3 (verb in third position in the sentence) constructions, with special attention to the position of the subject. Based on various diachronic studies of Portuguese word-order patterns (Ambar (1992); Ribeiro (1995, 2001); Paixão de Sousa (2004); Paiva (2011); Coelho and Martins (2009, 2012)), our study sought to identify the empirical word-order patterns that involve surface V2/V3 constructions and how these patterns are structured syntactically within a formal theoretical perspective (Chomsky, 1981; 1986), more specifically in accordance with the studies of Antonelli (2011) and Costa & Galves (2002). The survey results show that the data from the latter half of the 19th century, unlike the data from the first and second halves of the 20th century, display a greater balance with respect to the syntactic nature of the preverbal constituent (contiguous or not): in this period, orders with the subject in preverbal position reach at most 52% (231/444 tokens), while in the remaining 48% (213/444 tokens) the preverbal constituent is a non-subject constituent, almost always an adverbial adjunct.
Given these results, we argue that the Brazilian personal letters of the 19th century display word-order patterns associated with both a V2 system and an SV system, configuring a possible process of competition between different grammars that instantiate either a V2 system or an SV system. In other words, the Brazilian letters of the 19th century instantiate a competition between the grammar of Classical Portuguese (a V2 system) and the grammars of Brazilian Portuguese and European Portuguese (an SV system). That period therefore admits two distinct parametric settings: (i) verb movement to the Fin head (the grammar of Classical Portuguese) and (ii) verb movement to the T head (the grammars of Brazilian and European Portuguese). In the personal letters of the 20th century (first and second halves), on the other hand, there is a clear increase in the word-order patterns associated with the SV system, which proves to be more stable.
Abstract:
In this study, we investigated the removal of Eriochrome Black T by adsorption onto expanded perlite modified with orthophenanthroline. The adsorption process was studied by investigating the effect of the initial dye concentration, the contact time and the pH of the solution (acidic and alkaline), for a so-called synthetic effluent (an aqueous solution of Eriochrome Black T) and a real effluent (generated by the water-hardness determination test, by complexometric titration). The materials were characterized by thermogravimetry/differential thermal analysis (TG/DTA), infrared absorption spectroscopy (IR), X-ray diffraction (XRD) and scanning electron microscopy (SEM). XRD analysis indicated the incorporation of orthophenanthroline into the modified expanded perlite, and IR analysis showed an increase in intensity and a broadening of the absorption band related to the axial deformation of the O-H bond of the silanol groups of perlite (Si-OH). In the equilibration study, within the evaluated time range (5-230 min) it was not possible to observe an equilibrium time, probably owing to the type of interaction between Eriochrome Black T and the orthophenanthroline-modified expanded perlite, an interaction of surface origin. In the study of the effect of the initial adsorbate concentration, the concentration of 2.0x10-4 mol/L at natural pH (pH 5) gave the highest color removal of Eriochrome Black T, with 63.74% removal in 20 minutes of contact. In evaluating the effect of varying the pH of the Eriochrome Black T solution on the adsorption process, it was found that the more acidic the medium, the greater the percentage of stain removal, as a result of the acid-base interaction between the adsorbate and the adsorbent. In the study of Eriochrome Black T removal from the real effluent, we used the conditions optimized in the study with the synthetic effluent.
Dye removal at pH 10, the natural pH of the effluent, was not significant, reaching a maximum color removal of 8.12%, whereas at pH 3 the color removal reached 100.00%, once more showing that Eriochrome Black T interacts more effectively with the adsorbent at acidic pH values (pH 5 or 3), and most efficiently at pH 3. Thus, expanded perlite (a naturally acidic amorphous aluminosilicate) modified with orthophenanthroline (a Brønsted base) constitutes an effective remover of this coloring material from acidic aqueous solution and, under the conditions expressed in this study, can also be applied as an adsorbent of this dye in real effluents.
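The removal percentages quoted throughout follow from the initial and final dye concentrations (or, equivalently, absorbance readings); a minimal sketch with hypothetical values:

```python
def removal_percent(c_initial, c_final):
    """Dye/color removal as a percentage of the initial concentration:
    100 * (C0 - C) / C0."""
    return 100.0 * (c_initial - c_final) / c_initial

# Hypothetical illustration: starting from 2.0e-4 mol/L, the reported
# 63.74% removal at pH 5 would correspond to a final concentration of
c0 = 2.0e-4
c_final = c0 * (1 - 0.6374)
print(removal_percent(c0, c_final))  # ~ 63.74
```

The same formula applied to absorbance gives the "color removal" figures (8.12% at pH 10, 100.00% at pH 3).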
Abstract:
The synthesis of heterocyclic compounds such as quinoxaline derivatives has been shown to be relevant and promising owing to their notable applications in biological and technological areas. This work was dedicated to the synthesis, characterization and reactivity of quinoxaline derivatives in order to obtain new chemosensors. (L)-Ascorbic acid (1) and 2,3-dichloro-6,7-dinitroquinoxaline (2) were explored as synthetic precursors. Starting from the synthesis of 1 and the characterization of compounds derived from (L)-ascorbic acid, studies were performed investigating the application of the products as chemosensors, in which compound 36 demonstrated selective affinity for Cu2+ ions in methanolic solution, by naked-eye (colorimetric) and UV-visible analyses. Furthermore, initial analysis suggests that 39, a Schiff base derived from 36, also presents this feature. Five quinoxaline derivatives were synthesized from building block 2 through nucleophilic aromatic substitution by aliphatic amines, in which controlling the experimental conditions allows both mono- and di-substituted derivatives to be obtained. Reactivity studies were carried out with two purposes: i) to investigate the possibility of compound 47 being an anion chemosensor, based on its interaction with sodium hydroxide in DMSO, using image analysis and UV-visible spectroscopy; ii) to characterize kinetically the conversion of compound 44 into 46 based on RGB and multivariate image analysis of TLC data, as a simple and inexpensive qualitative and quantitative tool.
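Characterizing kinetics from image data, as in point ii), typically amounts to tracking an RGB-derived intensity over time and fitting a rate law. A minimal sketch under assumed first-order kinetics, with hypothetical intensities (not the thesis measurements):

```python
import numpy as np

# Hypothetical RGB-derived TLC spot intensities following I(t) = I0*exp(-k*t);
# the rate constant and values below are assumptions for illustration.
t = np.array([0.0, 5.0, 10.0, 15.0, 20.0])   # time, minutes
k_true = 0.12                                 # assumed rate constant, min^-1
intensity = 200.0 * np.exp(-k_true * t)

# First-order fit: ln I = ln I0 - k t, so the negative slope estimates k
slope, ln_i0 = np.polyfit(t, np.log(intensity), 1)
k_est = -slope
print(f"estimated k = {k_est:.3f} min^-1")
```

With real TLC images, the intensity series would come from averaging a color channel over each spot, followed by the same linearized fit.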
Abstract:
In oil exploration research, seismic data are usually irregularly and sparsely sampled along the spatial coordinates due to obstacles in the placement of geophones. Fourier methods provide a way to regularize seismic data that is efficient when the input data are sampled on a regular grid. However, when these methods are applied to a set of irregularly sampled data, the orthogonality among the Fourier components is broken and the energy of one Fourier component may "leak" into others, a phenomenon called "spectral leakage". The objective of this research is to study the spectral representation of irregularly sampled data. In particular, the basic structure of the NDFT (nonuniform discrete Fourier transform) is presented, its properties are studied, and its potential in seismic signal processing is demonstrated. Along the way, we study the FFT (fast Fourier transform) and the NFFT (nonuniform fast Fourier transform), which rapidly compute the DFT (discrete Fourier transform) and the NDFT, respectively. We compare the recovery of the signal using the FFT, DFT and NFFT. We then approach the interpolation of seismic traces using the ALFT (antileakage Fourier transform) to overcome the problem of spectral leakage caused by uneven sampling. Applications to synthetic and real data showed that the ALFT method works well on seismic data from complex geology, suffers little from irregular spatial sampling and edge effects, and is robust and stable with noisy data. However, it is not as efficient as the FFT, and its reconstruction is not as good in the case of irregular coverage with large holes in the acquisition.
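The NDFT discussed above generalizes the DFT sum to arbitrary sample positions. A minimal sketch (the direct O(N*K) evaluation, not the fast NFFT) which reduces to the ordinary FFT when the samples are uniform:

```python
import numpy as np

def ndft(x, t, n_freq):
    """Direct nonuniform DFT: X_k = sum_j x_j * exp(-2*pi*i*k*t_j),
    with sample positions t_j in [0, 1)."""
    k = np.arange(n_freq)
    return np.exp(-2j * np.pi * np.outer(k, t)) @ x

rng = np.random.default_rng(0)
n = 64
x = rng.standard_normal(n)

# On a uniform grid t_j = j/n, the NDFT coincides with the FFT
t_uniform = np.arange(n) / n
assert np.allclose(ndft(x, t_uniform, n), np.fft.fft(x))

# Irregular sampling (e.g. missing geophones) breaks the orthogonality of
# the complex exponentials, producing the spectral leakage that ALFT
# iteratively removes by subtracting the strongest components one at a time.
t_irregular = np.sort(rng.uniform(0.0, 1.0, n))
spectrum = ndft(x, t_irregular, n)
```

The direct sum costs O(N*K); the NFFT reaches near-FFT speed by combining an oversampled FFT with a window-function approximation.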
Abstract:
The key factor limiting resolution in crosswell traveltime tomography is illumination, a well-known but not so well exemplified result. Resolution in the 2D case is revisited using a simple geometric approach based on the angular aperture distribution and the properties of the Radon transform. Analytically, it is shown that if an interface has dips contained within the angular aperture limits at all points, it is correctly imaged in the tomogram. This result is confirmed by inversion of synthetic data, which also shows that isolated artifacts may be present when the dip is near the illumination limit. In the inverse sense, however, if an interface is interpretable from a tomogram, even an approximately horizontal one, there is no guarantee that it corresponds to a true interface. Similarly, if a body is present in the interwell region, it is diffusely imaged in the tomogram, but its interfaces, particularly vertical edges, cannot be resolved, and additional artifacts may be present. Again, in the inverse sense, there is no guarantee that an isolated anomaly corresponds to a true anomalous body, because the anomaly can also be an artifact. Together, these results state the dilemma of ill-posed inverse problems: the absence of any guarantee of correspondence to the true distribution. The limitations due to illumination may not be overcome by the use of mathematical constraints. It is shown that crosswell tomograms derived using sparsity constraints, with both Discrete Cosine Transform and Daubechies bases, basically reproduce the same features seen in tomograms obtained with the classic smoothness constraint. Interpretation must always take into consideration the a priori information and the particular limitations due to illumination. An example of interpreting a real data survey in this context is also presented.
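The angular aperture argument can be illustrated with straight rays: for wells a distance L apart, a source at depth zs in one well and a receiver at depth zr in the other define a ray at angle atan((zr - zs)/L) from the horizontal, and the set of all source-receiver pairs fixes the illumination limits. A minimal sketch with a hypothetical geometry (straight rays are a simplification; real rays bend):

```python
import math

def ray_angle_deg(z_src, z_rec, well_separation):
    """Dip angle (degrees from horizontal) of a straight ray between
    a source and a receiver placed in opposite wells."""
    return math.degrees(math.atan((z_rec - z_src) / well_separation))

# Hypothetical crosswell geometry: wells 100 m apart,
# sources and receivers spanning 0 to 100 m depth
L = 100.0
depths = [0.0, 25.0, 50.0, 75.0, 100.0]
angles = [ray_angle_deg(zs, zr, L) for zs in depths for zr in depths]
aperture = (min(angles), max(angles))
print(aperture)  # (-45.0, 45.0)
```

In this geometry, interfaces dipping more steeply than 45 degrees fall outside the aperture, which is the geometric reason vertical edges cannot be resolved.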
Abstract:
The advance of drilling into deeper wells has required more thermostable materials. Synthetic fluids, which usually have good chemical stability, face environmental constraints; besides, they usually generate more discharge and require a costly treatment for the disposal of drilled cuttings, treatments that are often inefficient and require mechanical components that hinder the operation. The adoption of aqueous fluids generally involves the use of chrome lignosulfonate as a dispersant, which provides stability of the rheological properties and fluid loss under high temperatures and pressures (HTHP). However, due to the environmental impact associated with chrome compounds, the drilling industry needs alternatives that maintain the integrity of these properties and ensure the success of the operation, in view of the strong influence of temperature on the viscosity of aqueous fluids and of the polymers used in fluids of this type, often polysaccharides, which are susceptible to hydrolysis and biological degradation. Therefore, vinyl polymers were selected for this study because they have predominantly carbon backbones: in particular, polyvinylpyrrolidone (PVP), for resisting higher temperatures, and partially hydrolyzed polyacrylamide (PHPA) and clay, for increasing the system's viscosity. Moreover, the absence of acetal bonds reduces their sensitivity to bacterial attack. In order to develop an aqueous drilling fluid system for HTHP applications using PVP, PHPA and clay as the main constituents, fluid formulations were prepared and their rheological properties determined using a Fann rotary viscometer, with the filtrate volume obtained by HTHP filtration following the API 13B-2 standard. The new fluid system using high-molar-mass polyvinylpyrrolidone (PVP) showed higher viscosities, gels and yield point, due to the clay-flocculating effect.
On the other hand, the low-molar-mass PVP contributed to the formation of dispersed systems with lower values of the rheological properties and fluid loss. Both systems are characterized by a gain in thermal stability up to around 120 °C, keeping the rheological parameters stable. The results were further corroborated by linear clay swelling tests.
Abstract:
This thesis presents and discusses the results of ambient seismic noise correlation for two different environments: an intraplate setting and the Mid-Atlantic Ridge. The coda wave interferometry method has also been tested on the intraplate data. Ambient noise correlation is a method that allows the structural response between two receivers to be retrieved from ambient noise records, as if one of the stations were a virtual source. It has been widely used in seismology to image the subsurface and to monitor structural changes, associated mostly with volcanic eruptions and large earthquakes. In the intraplate study, we were able to detect localized structural changes related to a small earthquake swarm, whose main event is mR 3.7, in Northeast Brazil. We also showed that 1-bit normalization and spectral whitening result in the loss of waveform details, and that the phase auto-correlation, which is amplitude-unbiased, seems to be more sensitive and robust for our analysis of a small earthquake swarm. The analysis of 6 months of data using cross-correlations detects clear medium changes soon after the main event, while the auto-correlations detect changes essentially after 1 month. This could be explained by fluid pressure redistribution, which can be initiated by hydromechanical changes and by pathways opened to shallower depth levels by later earthquakes. In the Mid-Atlantic Ridge study, we investigate structural changes associated with an mb 4.9 earthquake in the region of the Saint Paul transform fault. The data were recorded by a single broadband seismic station located less than 200 km from the Mid-Atlantic Ridge. The results of the phase auto-correlation for a 5-month period show a strong co-seismic medium change followed by a relatively fast post-seismic recovery. This medium change is likely related to the damage caused by the earthquake's ground shaking.
The healing process (filling of the new cracks), which lasted 60 days, can be decomposed into two phases: a fast recovery (70% in ~30 days) in the early post-seismic stage and a relatively slow recovery later (30% in ~30 days). In the coda wave interferometry study, we monitor temporal changes of the subsurface caused by the small intraplate earthquake swarm mentioned previously. The method was first validated with synthetic data, with which we were able to detect a change of 2.5% in the source position and a 15% decrease in the number of scatterers. Then, in the real data, we observed a rapid decorrelation of the seismic coda after the mR 3.7 seismic event, indicating a rapid change of the subsurface in the fault region induced by the earthquake.
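The coda decorrelation described above can be sketched as a normalized correlation between a reference coda window and a later one; a drop in the coefficient signals a medium change. A minimal illustration with synthetic traces, using a simple additive-perturbation model (an assumption for illustration, not the thesis processing chain):

```python
import numpy as np

def correlation_coefficient(a, b):
    """Zero-lag normalized correlation between two coda windows."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

rng = np.random.default_rng(42)
coda_ref = rng.standard_normal(2000)   # reference coda window

# Unchanged medium: the same waveform is recovered, so CC = 1
assert np.isclose(correlation_coefficient(coda_ref, coda_ref), 1.0)

# Perturbed medium modeled as added scattered energy: CC drops below 1,
# i.e. the coda decorrelates, analogous to what is observed after
# the mR 3.7 event
coda_after = coda_ref + 0.5 * rng.standard_normal(2000)
cc = correlation_coefficient(coda_ref, coda_after)
print(f"CC after perturbation: {cc:.3f}")
```

Tracking this coefficient window by window over time yields the decorrelation curves used to date the medium change and its recovery.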
Abstract:
This study involved the synthesis of photocatalysts based on titanium dioxide (TiO2). The photocatalysts were synthesized by the sol-gel method using three different proportions of acetone (25%, 50% and 75% v/v) in water/acetone mixtures, in order to control the hydrolysis of the titanium precursor (titanium tetraisopropoxide). To investigate the structural, morphological and electronic changes provoked by the use of the solvent mixtures, different methodologies were used to characterize the oxides, such as X-ray diffraction (XRD), Raman spectroscopy, UV-Vis diffuse reflectance spectroscopy, and specific surface area measurements (BET). XRD combined with Raman analyses revealed that the products are highly crystalline two-phase oxides, with anatase as the main phase and brookite as the secondary phase. Furthermore, Rietveld refinement of the XRD data demonstrated that the presence of acetone during the synthesis influenced the composition of the crystalline phases, increasing the proportion of the brookite phase to between 13% and 22%. The band gap energy of these oxides remained practically unchanged as a function of the synthesis conditions. As shown by the isotherms, these photocatalysts are mesoporous materials with a mean pore diameter of 7 nm and approximately 20% porosity. The surface area of the oxides prepared by hydrolysis in the presence of acetone was 12% higher than that of the bare oxide. After characterization, the photocatalytic activities of these oxides were evaluated by photodegradation of the azo dyes Ponceau 4R (P4R), Tartrazine (TTZ) and Reactive Red 120 (RR120), and also by their ability to mediate the photocatalytic production of hydrogen. Using the most efficient photocatalyst, the mineralization achieved for the dyes P4R, RR120 and TTZ was 83%, 79% and 56%, respectively, in 120 minutes of reaction, while the discoloration reached 100% for P4R and RR120 and 94% for TTZ.
In addition, the same photocatalyst, in the presence of 0.5% w/w of platinum and suspended in a 5:1 v/v water/methanol mixture, produced 56 mmol of gaseous hydrogen in five hours of experiment, corresponding to a specific hydrogen production rate of 139.5 mmol h-1 g-1.
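The specific rate can be cross-checked from the figures in the abstract: a specific rate is the amount of H2 divided by time and catalyst mass, so 56 mmol over five hours at 139.5 mmol h-1 g-1 implies a catalyst mass of roughly 80 mg (the mass itself is not stated in the abstract; this is a back-calculation, not a reported value). A minimal sketch of the arithmetic:

```python
# Values reported in the abstract
n_h2_mmol = 56.0        # hydrogen produced (mmol)
t_hours = 5.0           # experiment duration (h)
specific_rate = 139.5   # specific rate (mmol h^-1 g^-1)

# specific_rate = n / (t * m)  =>  implied catalyst mass m = n / (t * rate)
mass_g = n_h2_mmol / (t_hours * specific_rate)
print(f"implied photocatalyst mass: {mass_g * 1000:.1f} mg")
```

This kind of consistency check is useful when reading activity figures, since specific rates are only comparable across studies when the normalizing mass is known.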