384 results for "Imagens sintéticas"
Abstract:
The use of plants to treat different pathologies is as ancient as humanity itself, and plants remain an outstanding source of therapeutic drugs. It is currently estimated that 25% of all drugs used in the clinic correspond to active principles isolated from higher plants and to semi-synthetic drugs obtained from these natural precursors. However, despite this significant number of pharmacological entities, to which an abundant number of synthetic molecules must be added, there are still not enough drugs that simultaneously satisfy the current therapeutic demands of effectiveness, selectivity and minimal impact on the development of resistance. This shortage becomes critical in pathologies such as cancer and bacterial infections, in which resistance to drug action is frequent and constitutes the main cause of treatment failure. This has led researchers to return to the study of the vast number of plant metabolites that remain to be evaluated, many of which may exhibit unknown structures or novel mechanisms of action. In this context, the general objective of the project is to study the pharmacological mechanisms underlying the antitumor or antibacterial activity of metabolites obtained in our laboratory from plants belonging to the native, adventitious and naturalized flora of central Argentina, in order to gather the information needed to position them as drugs. In particular, the work will aim to determine the regulatory effect on constitutive molecules of tumor cells through which two compounds previously identified in our laboratory, one as cytotoxic and the other as an inhibitor of chemotherapeutic efflux mediated by the multidrug resistance (MDR) pump P-glycoprotein (P-gp), exert their action. In addition, we propose to obtain new substances with antibacterial properties, with special attention to the molecules and processes involved in this action. It is important to stress that the substances found may emerge in the future as alternative drugs per se or as leads for the synthesis or semi-synthesis of analogues for use in clinical or veterinary treatments. These topics are a high priority in this research field, given the urgent need for new selective drugs directed against cancer cells or pathogenic bacteria.
Abstract:
In Argentina, in line with the rest of the world, Nanotechnology is considered a strategic area. Nevertheless, research in Nanobiotechnology still constitutes a vacant field. The use of nanomaterials to develop bioanalytical platforms for the construction of biosensors offers multiple advantages and promising prospects for application in diverse areas. Today, clinical analysis laboratories, the pharmaceutical and food industries, and bromatological and environmental control laboratories require analytical methodologies that provide accurate, reproducible, fast, sensitive and selective results using small sample volumes, minimal reagent consumption and little, clean waste. Research on nanobiosensors is directed toward these goals. One of the great challenges is to achieve miniaturized biosensors with potential for the development of decentralized, point-of-care measurement devices and the simultaneous detection of multiple analytes. Even though countless developments have been made over the nearly 50 years since biosensors first appeared, many questions remain to be answered. Modification with nanomaterials plays a leading role in both electrochemical and plasmonic transducers. The use of thin Au films for SPR modified with graphene or graphene oxide is a field of enormous potential that nevertheless remains largely unexplored, and is therefore of great importance. Regarding the biorecognition layer, we will work with molecules capable of establishing bioaffinity interactions, such as antibodies, as well as molecules rarely used in our country and in Latin America, such as DNA, aptamers, PNA and lectins. ABSTRACT: The general objective of this project is to develop new bioanalytical platforms for the detection of different bioaffinity events by integrating electrochemical (EC) and plasmonic transducers with nanostructured materials (carbon nanotubes, graphene nanosheets, metallic nanowires); biomolecules (DNA, peptide nucleic acid (PNA), aptamers, antibodies, lectins); and polymers functionalized with bioactive molecules. The resulting supramolecular architectures will be directed toward the development of EC and plasmonic biosensors for the quantification of biomarkers of clinical and environmental relevance. CNTs, graphene, graphene oxide and metallic nanowires will be functionalized with homopeptides and proteins with high affinity for metal cations, and integrated with carbon and gold transducers and recognition biomolecules capable of forming affinity complexes (antigen-antibody, aptamer-target molecule, DNA-DNA, PNA-DNA, lectin-carbohydrate, ligand-metal cation and avidin-biotin). New monomers and polymers functionalized with bioactive molecules and/or redox groups will be synthesized and characterized using different synthetic routes. Genosensors will be developed for the detection of hybridization events involving sequences of medical interest (colon and breast cancer, tuberculosis); aptasensors for the detection of protein markers of T. cruzi, cardiovascular diseases and cationic contaminants; immunosensors for the detection of protein biomarkers related to cardiovascular diseases and cancer; and lectin-based affinity biosensors for the detection of carbohydrates. The platforms will be characterized and the analytical signals obtained using the following techniques: cyclic, differential pulse and square wave voltammetry; stripping; surface plasmon resonance; electrochemical impedance spectroscopy; scanning electrochemical microscopy; SEM, TEM, AFM and SNOM; UV-vis, FTIR, Raman and NMR spectroscopies; TGA and DSC.
Abstract:
The cantigas of the Galician-Portuguese lyric are the work of a diverse set of authors and constitute a rich literary and cultural heritage of the Middle Ages, produced between the 12th and 14th centuries. Over time, interest in them has led to the study of aspects of the transmission of the texts, the biography of the troubadours and the influences received from territories beyond the Peninsula, as well as to the completion of several critical editions. The aim of this thesis is a critical edition of the cantigas of one of the troubadours of the Galician-Portuguese lyric, Airas Engeitado. This author was last edited in 1932, by José Joaquim Nunes, together with the cantigas de amor that Carolina Michaëlis considered excluded from the Cancioneiro da Ajuda. To the best of our knowledge, that edition has not been revised by any editor to date. Nunes's edition, about which Nunes himself expressed doubts, presents Engeitado's texts in a rather corrupted form, which is why a new critical edition is undertaken here, with more demanding editorial criteria than those of Nunes's edition and different transcription norms. The thesis also frames and explains the lyric of an author whose characteristics can be considered singular within the context of the Galician-Portuguese lyric. The four cantigas de amor that I consider to be by Airas Engeitado have come down to us through the Cancioneiro da Biblioteca Nacional (B) and the Cancioneiro da Biblioteca Vaticana (V), and are mentioned in the Tavola Colocciana, the index of B. In addition to the critical establishment of Airas Engeitado's cantigas, various notes are provided on palaeographic questions, on the divergences between my readings of the witnesses and those of the previous editor, and on lexical, syntactic or versification peculiarities of the cantigas. A brief chapter summarizes the little that is known about Airas Engeitado's biography and situates the edited cantigas within the manuscript tradition. A question of the utmost relevance is the double attribution of the cantiga A gran direito lazerei, which I also consider and discuss in the chapter on the manuscript tradition. It is on this reflection that I base my decision to include the cantiga in the present edition, even though it has, to date, been unanimously attributed to Afonso Eanes do Coton.
Abstract:
Dissertation submitted to obtain the Master's degree at the Instituto Superior de Ciências da Saúde Egas Moniz
Abstract:
This research studies the application of syntagmatic analysis of written texts in Brazilian Portuguese as a methodology for the automatic creation of extractive summaries. Summarization automation, a branch of natural language processing (NLP), studies ways in which a computer can autonomously construct summaries of texts. We start from the premise that supplying the computer with the way a language is structured, in our case Brazilian Portuguese, helps it discover the most relevant sentences and, consequently, build extractive summaries with higher informativeness. In this study, we propose a summarization method that automatically performs the syntagmatic analysis of texts and, from it, builds an automatic summary. The phrases that make up the syntactic structures are used to analyze the sentences of the text, and the count of these elements determines whether or not a sentence will be included in the generated summary.
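A minimal sketch of the general idea of ranking sentences by a phrase count is shown below. It is not the author's implementation: the "phrase detector" is a naive placeholder based on runs of content words and a tiny stopword list, whereas the thesis relies on a proper syntagmatic analysis of Brazilian Portuguese.

```python
# Illustrative phrase-count-based extractive summarizer (assumed heuristic,
# not the method described in the thesis).
import re

STOPWORDS = {"de", "da", "do", "a", "o", "e", "que", "em", "para", "com"}  # tiny illustrative list

def phrase_count(sentence: str) -> int:
    """Count runs of two or more consecutive content words as a crude proxy for phrases."""
    words = [w.lower() for w in re.findall(r"\w+", sentence, flags=re.UNICODE)]
    runs, current = 0, 0
    for w in words:
        if w in STOPWORDS:
            runs += 1 if current >= 2 else 0
            current = 0
        else:
            current += 1
    return runs + (1 if current >= 2 else 0)

def summarize(text: str, n_sentences: int = 2) -> str:
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    selected = set(sorted(sentences, key=phrase_count, reverse=True)[:n_sentences])
    # keep the original order of the selected sentences
    return " ".join(s for s in sentences if s in selected)

if __name__ == "__main__":
    sample = ("O processamento de linguagem natural estuda métodos computacionais. "
              "A análise sintagmática identifica os sintagmas de cada sentença. "
              "Sentenças com mais sintagmas tendem a ser mais informativas.")
    print(summarize(sample, n_sentences=2))
```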
Abstract:
The improved performance of hydraulic binders, the basis of Portland cement, relies on the careful selection and application of materials that promote greater durability and reduced maintenance costs. There is a wide variety of chemical additives used in Portland cement slurries for cementing oil wells. These are designed to work at temperatures from below 0 °C (frozen regions) up to 300 °C (thermal recovery and geothermal wells) and at pressures from near ambient (shallow wells) to above 200 MPa (deep wells). Additives thus make it possible to adapt cement slurries for application under various conditions. Among the materials used in Portland cement slurries for oil wells, nanometer-scale materials have been applied with good results. Nanosilica, a dispersion of SiO2 particles at the nanometer scale, improves the plastic characteristics and the mechanical properties of the hardened material when used in cement systems. This dispersion is used commercially as a filler material, as a rheology modifier and/or in construction repair processes, and it also appears in many product formulations such as paints, plastics, synthetic rubbers, adhesives, sealants and insulating materials. Based on the above, this study aims to evaluate the performance of nanosilica as an extender additive and performance improver for cement slurries subjected to low temperatures (5 °C ± 3 °C), for application in the early stages of offshore oil wells. Cement slurries were formulated with densities of 11.0, 12.0 and 13.0 ppg and nanosilica concentrations of 0, 0.5, 1.0 and 1.5%. The slurries were subjected to low temperatures (5 °C ± 3 °C) and evaluated by rheological, stability, free-water and compressive-strength tests in accordance with the procedures established by API SPEC 10A. Thermal (TG/DTA) and crystallographic (XRD) characterization tests were also performed. The use of nanosilica reduced the free-water volume by 30% and increased the compressive strength by 54.2% with respect to the reference slurry. Nanosilica therefore appears to be a promising material for use in cement slurries applied in the early stages of low-temperature oil wells.
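For readers unfamiliar with oilfield units, the slurry densities quoted in ppg (pounds per US gallon) can be converted to SI units as follows; this is an illustrative conversion added here, not part of the original abstract.

```latex
% 1 ppg = 119.83 kg m^{-3}
\rho_{11.0\,\mathrm{ppg}} \approx 1318\ \mathrm{kg\,m^{-3}}, \quad
\rho_{12.0\,\mathrm{ppg}} \approx 1438\ \mathrm{kg\,m^{-3}}, \quad
\rho_{13.0\,\mathrm{ppg}} \approx 1558\ \mathrm{kg\,m^{-3}}
```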
Abstract:
Heavy metals are present in industrial waste. These metals can generate a large environmental impact, contaminating water, soil and plants, and their chemical action has attracted environmental interest. In this context, this study aimed to test the performance of electrochemical technologies for removing and quantifying heavy metals. First, the electroanalytical technique of stripping voltammetry with a glassy carbon (GC) electrode was standardized so that this method could be used to quantify the metals during their removal by an electrocoagulation (EC) process. Analytical curves were evaluated to ensure reliable determination and quantification of Cd2+ and Pb2+, separately or in a mixture. Meanwhile, the EC process was developed using an electrochemical cell in continuous flow (EFC) for removing Pb2+ and Cd2+. These experiments were performed using parallel Al plates 10 cm in diameter (63.5 cm2). The conditions for removing Pb2+ and Cd2+, dissolved in 2 L of solution circulated at 151 L h-1, were optimized by applying different current values for 30 min. Cd2+ and Pb2+ concentrations were monitored during electrolysis by stripping voltammetry. The results showed that the removal of Pb2+ was effective when the EC process was used, with removals of 98% in 30 min. This behavior depends on the applied current, which implies an increase in power consumption. The results also showed that stripping voltammetry is quite reliable for determining the Pb2+ concentration when compared with measurements obtained by atomic absorption spectrometry (AA). In view of this, the second objective of this study was to evaluate the removal of Cd2+ and Pb2+ (in a mixed solution) by EC. The increase in removal efficiency with current was confirmed: 93% of the Cd2+ and 100% of the Pb2+ were removed after 30 min. Increasing the current promotes the oxidation of the sacrificial electrodes and consequently increases the amount of coagulant, which influences the removal of heavy metals from the solution. Adsorptive voltammetry is a fast, reliable, economical and simple way to determine Cd2+ and Pb2+ during their removal; it is more economical than the methods normally used, which require toxic and expensive reagents. Our results demonstrate the potential of electroanalytical techniques to monitor the course of environmental interventions. Thus, the combined application of the two techniques can be a reliable way to monitor environmental impacts due to the pollution of aquatic ecosystems by heavy metals.
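The sketch below illustrates the two calculations implied by this abstract: a linear calibration curve for stripping-voltammetry peak currents and the removal efficiency during electrocoagulation. All concentrations and currents are invented for illustration; they are not the thesis data.

```python
import numpy as np

# Hypothetical calibration standards: concentration (mg/L) vs peak current (uA)
conc_std = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
i_peak   = np.array([0.9, 1.8, 3.7, 7.4, 14.9])

slope, intercept = np.polyfit(conc_std, i_peak, 1)   # least-squares calibration line

def concentration_from_current(i_sample: float) -> float:
    """Invert the calibration line to estimate concentration (mg/L)."""
    return (i_sample - intercept) / slope

def removal_efficiency(c0: float, ct: float) -> float:
    """Percent removal after electrolysis: 100 * (C0 - Ct) / C0."""
    return 100.0 * (c0 - ct) / c0

if __name__ == "__main__":
    c0 = concentration_from_current(9.2)   # before electrocoagulation
    ct = concentration_from_current(0.2)   # after 30 min
    print(f"C0 = {c0:.2f} mg/L, Ct = {ct:.2f} mg/L, "
          f"removal = {removal_efficiency(c0, ct):.1f}%")
```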
Abstract:
Drilling fluids are of fundamental importance in petroleum activities, since they are responsible for removing cuttings, maintaining pressure and well stability (preventing collapse and the inflow of fluid from the rock formation), and lubricating and cooling the drill bit. There are basically three types of drilling fluid: water-based, non-aqueous and aerated. Water-based drilling fluids are widely used because they are less aggressive to the environment and, when formulated as inhibitive fluids, provide excellent stability and inhibition, among other qualities. Produced water is generated together with oil during production and has high concentrations of metals and contaminants, so it must be treated before disposal. The produced water from the Urucu-AM and Riacho da Forquilha-RN fields has high concentrations of contaminants, metals and salts such as calcium and magnesium, complicating its treatment and disposal. Thus, the objective was to evaluate the use of synthetic produced water, with characteristics similar to the produced water from Urucu-AM and Riacho da Forquilha-RN, to formulate a water-based drilling mud, observing the influence of varying the calcium and magnesium concentrations on filtrate and rheology tests. A simple 3² factorial experimental design was used for the statistical modeling of the data. The results showed that varying the calcium and magnesium concentrations did not influence the rheology of the fluid: the plastic viscosity, apparent viscosity and initial and final gels did not vary significantly. For the filtrate tests, the calcium concentration had a linear influence on the chloride concentration: the higher the calcium concentration, the higher the chloride concentration in the filtrate. For the fluids based on Urucu produced water, the calcium concentration influenced the filtrate volume quadratically, meaning that high calcium concentrations interfere with the effectiveness of the filtrate-control inhibitors used in the formulation. For the fluid based on Riacho produced water, the influence of calcium on filtrate volume was linear. The magnesium concentration was significant only for the chloride concentration, in a quadratic way, and only for the fluids based on Urucu produced water. The mud with the maximum magnesium concentration (9.411 g/L) but the minimum calcium concentration (0.733 g/L) showed good results. Therefore, produced water with a magnesium concentration of up to 9.411 g/L and a calcium concentration of up to 0.733 g/L can be used to formulate water-based drilling fluids with appropriate properties for this kind of fluid.
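As an illustration of how a 3² factorial design with calcium and magnesium as factors could be modeled with linear and quadratic terms, a short sketch follows. The nine filtrate-volume values are invented; they are not the thesis measurements.

```python
import numpy as np

# Coded factor levels (-1, 0, +1) for calcium and magnesium, full 3^2 design
levels = [-1, 0, 1]
ca, mg = np.meshgrid(levels, levels)
ca, mg = ca.ravel(), mg.ravel()

# Hypothetical filtrate volumes (mL) for the nine runs
filtrate = np.array([6.0, 6.4, 7.9, 6.1, 6.5, 8.1, 6.2, 6.6, 8.4])

# Design matrix for y = b0 + b1*Ca + b2*Mg + b11*Ca^2 + b22*Mg^2 + b12*Ca*Mg
X = np.column_stack([np.ones_like(ca), ca, mg, ca**2, mg**2, ca * mg])
coef, *_ = np.linalg.lstsq(X, filtrate, rcond=None)

for name, b in zip(["b0", "Ca", "Mg", "Ca^2", "Mg^2", "Ca*Mg"], coef):
    print(f"{name:>5}: {b:+.3f}")
```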
Abstract:
In this work, the treatment of textile-industry wastewater containing the dyes Yellow Novacron (YN), Red Remazol BR (RRB) and Blue Novacron CD (NB), as well as the treatment of petrochemical-industry wastewater (produced water), was investigated by anodic oxidation (AO) with platinum anodes supported on titanium (Ti/Pt) and boron-doped diamond (BDD) anodes. One of the main parameters of this kind of treatment is the electrocatalytic material used, since the mechanisms and products of some anodic reactions depend on it. The AO of synthetic effluents containing RRB, NB and YN was investigated in order to find the best conditions for removing the color and the organic content of the dyes. According to the experimental results, AO is suitable for the decolorization of wastewaters containing these textile dyes thanks to the electrocatalytic properties of the BDD and Pt anodes. Removal of the organic load was more efficient at BDD in all cases, and the dyes were degraded to aliphatic carboxylic acids by the end of the electrolysis. The energy required for color removal during AO of RRB, NB and YN solutions depends mainly on the operating conditions; for RRB, for example, it goes from 3.30 kWh m-3 at 20 mA cm-2 to 4.28 kWh m-3 at 60 mA cm-2 (pH 1); from 15.23 kWh m-3 at 20 mA cm-2 to 24.75 kWh m-3 at 60 mA cm-2 (pH 4.5); and from 10.80 kWh m-3 at 20 mA cm-2 to 31.5 kWh m-3 at 60 mA cm-2 (pH 8) (values estimated per volume of treated effluent). On the other hand, in the study of the AO of produced water generated by the petrochemical industry, galvanostatic electrolysis using BDD led to nearly complete removal of COD (98%), due to the large amounts of hydroxyl radicals and peroxodisulfates generated from the oxidation of water and of the sulfates in solution, respectively. The rate of COD removal increased with increasing applied current density (15-60 mA cm-2). At the Pt electrode, approximately 50% removal of the organic load was achieved by applying 15 to 30 mA cm-2, while 80% COD removal was achieved at 60 mA cm-2. Thus, the results obtained with this technology were satisfactory, depending on the electrocatalytic material and operating conditions used, for the removal of organic load (petrochemical and textile effluents) as well as for color removal (textile effluents). Therefore, electrochemical treatment can be considered a new alternative for the pretreatment or treatment of effluents from the textile and petrochemical industries.
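The energy figures quoted above (kWh m-3) follow the standard expression for specific energy consumption per volume of treated effluent, reproduced below for clarity; the symbols are ours, not taken from the thesis.

```latex
% U_cell: average cell voltage (V); I: applied current (A);
% t: electrolysis time (h); V: treated volume (m^3)
E_{\mathrm{spec}} = \frac{U_{\mathrm{cell}}\, I\, t}{1000\, V}
\quad \left[\mathrm{kWh\,m^{-3}}\right]
```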
Abstract:
Google Docs (GD) is an online word processor with which multiple authors can work on the same document, synchronously or asynchronously, which can help develop the ability to write in English (WEISSHEIMER; SOARES, 2012). As they write collaboratively, learners find more opportunities to notice the gaps in their written production, since they are exposed to more input from their co-authors (WEISSHEIMER; BERGSLEITHNER; LEANDRO, 2012), and they prioritize the process of text (re)construction rather than concern with the final product, i.e., the final version of the text (LEANDRO; WEISSHEIMER; COOPER, 2013). Moreover, when it comes to second language (L2) learning, producing language enables the consolidation of existing knowledge as well as the internalization of new knowledge (SWAIN, 1985; 1993). Taking this into consideration, this mixed-methods (DÖRNYEI, 2007), quasi-experimental (NUNAN, 1999) study investigates the impact of collaborative writing through GD on the development of the writing skill in English and on the noticing of syntactic structures (SCHMIDT, 1990). Thirty-four university students of English made up the cohort of the study: twenty-five were assigned to the experimental group and nine to the control group. All learners took a pre-test and a post-test so that we could measure their noticing of syntactic structures. Learners in the experimental group were exposed to a blended learning experience, in which they took reading and writing classes at the university and, over eleven weeks, collaboratively wrote three pieces of flash fiction (a complete story told in a hundred words) outside the classroom, online through GD. Learners in the control group took reading and writing classes at the university but did not practice collaborative writing. The first and last stories produced by the learners in the experimental group were analysed in terms of grammatical accuracy, operationalized as the number of grammar errors per hundred words (SOUSA, 2014), and lexical density, which refers to the relationship between the number of words with lexical properties and the number of words with grammatical properties (WEISSHEIMER, 2007; MEHNERT, 1998). Additionally, learners in the experimental group answered an online questionnaire about the blended learning experience. The quantitative results showed that the collaborative task led to the production of more lexically dense texts over the 11 weeks. The noticing and grammatical accuracy results differed from what we expected; however, they provide insights into measurement issues, in the case of noticing, and into the participants' positive attitude towards collaborative writing with flash fiction. The qualitative results also shed light on the usefulness of computer-mediated collaborative writing in L2 learning.
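The two text measures named in this abstract can be sketched as follows: grammatical accuracy as errors per hundred words and lexical density as the ratio of lexical to grammatical words. The tiny stopword list and the sample sentence are illustrative only; in the study the error counts come from annotation of the learners' texts.

```python
import re

GRAMMATICAL_WORDS = {  # closed-class items; illustrative, not exhaustive
    "the", "a", "an", "of", "in", "on", "to", "and", "but", "or",
    "is", "are", "was", "were", "he", "she", "it", "they", "that",
}

def errors_per_hundred_words(n_errors: int, text: str) -> float:
    """Grammatical accuracy measure: grammar errors per 100 words."""
    n_words = len(re.findall(r"\w+", text))
    return 100.0 * n_errors / n_words

def lexical_density(text: str) -> float:
    """Ratio of lexical (open-class) words to grammatical (closed-class) words."""
    words = [w.lower() for w in re.findall(r"\w+", text)]
    lexical = [w for w in words if w not in GRAMMATICAL_WORDS]
    grammatical = [w for w in words if w in GRAMMATICAL_WORDS]
    return len(lexical) / max(len(grammatical), 1)

if __name__ == "__main__":
    story = "She opened the old door and the silent house remembered every step."
    print(f"accuracy: {errors_per_hundred_words(2, story):.1f} errors/100 words")
    print(f"lexical density: {lexical_density(story):.2f}")
```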
Abstract:
Produced water is considered the main effluent of the oil industry, due to its increased volume in mature fields and its varied composition. The oil and grease content (TOG) is the main parameter for the final disposal of produced water. In this context, it is of great significance to develop an alternative method, based on guar gum gel, for the treatment of synthetic produced water, using as a differential a polymer with high hydrophilicity to clarify waters contaminated with oil. Thus, this study aims to evaluate the efficiency of guar gum gels in removing oil from produced water. Guar gum is a natural polymer that, under specific conditions, forms three-dimensional structures with important physical and chemical properties. When the polymer chains are crosslinked by borate ions in the presence of salts, a salting-out effect occurs, reducing the solubility of the polymer gel in water. As a result, phase separation takes place, with the oil trapped in the collapsed gel. The TOG was quantified by spectroscopy in the ultraviolet and visible region. The system proved to be highly efficient in the removal of dispersed oil from synthetic produced water, reaching removal percentages above 90%.
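For reference, the UV-vis quantification of TOG relies on the Beer-Lambert relation, and the removal percentage quoted above follows the usual definition; the symbols below are ours, added for clarity.

```latex
% A: absorbance; \varepsilon: molar absorptivity; \ell: optical path; c: concentration
A = \varepsilon\, \ell\, c
\qquad
\mathrm{Removal}(\%) = 100 \times \frac{\mathrm{TOG}_0 - \mathrm{TOG}_f}{\mathrm{TOG}_0}
```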
Abstract:
Intense precipitation events (IPE) have been causing great social and economic losses in the affected regions. In the Amazon, these events can have serious impacts, primarily for populations living on the margins of its countless rivers, because when water levels are elevated, floods and/or inundations are generally observed. Thus, the main objective of this research is to study IPE through Extreme Value Theory (EVT), in order to estimate the return periods of these events and identify the regions of the Brazilian Amazon where IPE have the largest values. The study used daily rainfall data from the hydrometeorological network managed by the National Water Agency (Agência Nacional de Águas) and from the Meteorological Data Bank for Education and Research (Banco de Dados Meteorológicos para Ensino e Pesquisa) of the National Institute of Meteorology (Instituto Nacional de Meteorologia), covering the period 1983-2012. First, homogeneous rainfall regions were determined through cluster analysis, using the hierarchical agglomerative Ward method. Then, synthetic series representing the homogeneous regions were created. Next, EVT was applied to these series through the Generalized Extreme Value (GEV) distribution and the Generalized Pareto Distribution (GPD). The goodness of fit of these distributions was evaluated with the Kolmogorov-Smirnov test, which compares the empirical cumulative distributions with the theoretical ones. Finally, the composition technique was used to characterize the prevailing atmospheric patterns associated with the occurrence of IPE. The results suggest that the Brazilian Amazon has six homogeneous rainfall regions. More severe IPE are expected in the south and on the Amazon coast. More intense rainfall events are expected during the rainy or transition seasons of each sub-region, with total daily precipitation of 146.1, 143.1 and 109.4 mm (GEV) and 201.6, 209.5 and 152.4 mm (GPD), at least once a year, in the south, on the coast and in the northwest of the Brazilian Amazon, respectively. For southern Amazonia, the composition analysis revealed that IPE are associated with the configuration and formation of the South Atlantic Convergence Zone. Along the coast, intense precipitation events are associated with mesoscale systems such as squall lines. In northwestern Amazonia, IPE are apparently associated with the Intertropical Convergence Zone and/or local convection.
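A minimal sketch of the GEV part of this workflow is given below: fit a GEV distribution to annual maxima of daily rainfall, check the fit with a Kolmogorov-Smirnov test and estimate a T-year return level. The rainfall series is synthetic; it is not the thesis data, and the GPD/threshold-exceedance branch is omitted.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic "annual maximum daily rainfall" series (mm), 30 years
annual_max = rng.gumbel(loc=90.0, scale=25.0, size=30)

# Fit GEV (scipy's genextreme parameterization: shape c, location, scale)
c, loc, scale = stats.genextreme.fit(annual_max)

# Goodness of fit: Kolmogorov-Smirnov test against the fitted GEV
ks_stat, p_value = stats.kstest(annual_max, "genextreme", args=(c, loc, scale))

def return_level(T_years: float) -> float:
    """Rainfall amount expected to be exceeded on average once every T years."""
    return stats.genextreme.ppf(1.0 - 1.0 / T_years, c, loc, scale)

print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
for T in (2, 10, 50):
    print(f"{T:>3}-year return level: {return_level(T):.1f} mm")
```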
Abstract:
Currently, there is a great demand for materials derived from renewable sources. Vegetable fibers have been used as reinforcement for polymer matrices as an alternative to synthetic fibers, being biodegradable and low cost. The present work aims to develop a composite material of epoxy resin reinforced with curauá fiber, with the addition of alumina trihydrate (aluminum hydroxide, Al(OH)3) as a flame retardant, used in proportions of 10%, 20% and 30% of the total volume of the composite. The curauá fibers went through a cleaning process with an alkaline bath of sodium hydroxide (NaOH), were parallelized by hand carding and cut to the standard length; composites were molded with 30 cm fibers. The composites were molded in a lost mold with unidirectional fibers in the proportion of 20% of the total volume of the composite, and were prepared in the Chemical Processing Laboratory of the Textile Engineering Department at UFRN. To measure the performance of the material, tensile and flexural strength tests were carried out, and the samples were later analyzed by scanning electron microscopy (SEM). The composites showed good mechanical properties with the addition of the flame retardant, although in some cases it left the composite more vulnerable to breakage. These mechanical results were analyzed with a chi-square statistical test at the 5% significance level to check for possible differences between the composite groups. Flammability testing was conducted based on the Underwriters Laboratories 94 standard, and the material showed a satisfactory result, with the average burn rate (mm/min) decreasing as the addition of flame retardant to the composite increased.
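The chi-square comparison at the 5% significance level mentioned above could be carried out as in the sketch below. The contingency table (specimens that passed or failed per retardant proportion) is invented for illustration; it is not the thesis data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# rows: retardant content 10%, 20%, 30%; columns: passed, failed
table = np.array([
    [8, 2],
    [7, 3],
    [5, 5],
])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
print("significant at 5%" if p_value < 0.05 else "not significant at 5%")
```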
Abstract:
In this paper, we analyze corporate slogans, understanding them as a discursive construction, that is, as a pairing of form and function able to unite the notions of textual type and discursive genre. We developed a qualitative and quantitative analysis aimed specifically at the formal properties (phonetic, morphological and syntactic) and the functional properties (semantic, pragmatic and discursive) of slogans. Furthermore, we attempted to verify and quantify recurring aspects involved in their construction, in order to capture the configurational patterns underlying their formation. The data come from slogans collected from products and/or service stores in the metropolitan area of Natal, Rio Grande do Norte. This research is based on Cognitive-Functional Linguistics, which combines the North American functionalist tradition, represented by researchers such as Talmy Givón, Paul Hopper, Joan Bybee and Elizabeth Closs-Traugott, with Cognitive Linguistics, in particular the strand linked to Construction Grammar, as postulated by Adele Goldberg, William Croft and Jan-Olla Östman, among others. The results confirmed the importance of the interface between formal and functional aspects in the analysis of linguistic uses. They point to the idea of the slogan as a pairing of form and function at the textual/discursive level, in other words, as a discursive construction, constituting the cognitive storage of a scheme/model of textual formation with a specific discursive-pragmatic function.
Abstract:
This study aimed to evaluate the potential of oxidative electrochemical treatment coupled with an adsorption process, using expanded perlite as the adsorbent, for the removal of the textile dyes Remazol Red and Novacron Blue from a synthetic effluent. The dyes and the perlite were characterized by thermogravimetry (TG), differential scanning calorimetry (DSC), infrared spectroscopy (IR), scanning electron microscopy (SEM), X-ray diffraction (XRD) and X-ray fluorescence (XRF). The electrochemical treatments used Ti/Pt and Pb/PbO2 anodes under different conditions: 60 minutes of electrolysis, current densities of 20, 40 and 60 mA cm-2, pH 1, 4.5 and 8, and temperatures of 20, 40 and 60 ºC. For the adsorption tests, contact times of 30 minutes for the Remazol Red dye and 20 minutes for Novacron Blue were established, while pH 1, 4.5 and 8, 500 mg of adsorbent and temperatures of 20, 40 and 60 ºC were used for both treatments. The results indicated that both sequences, electro-oxidation/adsorption and adsorption/electro-oxidation, were effective for removing color from the synthetic solutions. The electricity consumption made it possible to evaluate the applicability of the electrochemical process, providing very acceptable values, from which the cost could be estimated. Total organic carbon (TOC) and gas chromatography-mass spectrometry (GC-MS) analyses were performed, showing that the best combination for removing organic matter is Pb/PbO2 followed by perlite. GC-MS indicated that the by-products formed when Remazol Red was degraded are benzoic acid, phthalic acid, thiocarbamic acid, benzene, chlorobenzene, 2-ethylphenol and naphthalene. Conversely, aniline, phthalic acid, 1,6-dimethylnaphthalene, naphthalene and the hydroxybenzenesulfonate ion were detected when Novacron Blue was studied. Atomic absorption spectrometry showed that lead was released during the electrochemical oxidation runs performed with the Pb/PbO2 anode, but these values were reduced by subjecting the effluent to the adsorption step. According to these results, the sequential techniques electro-oxidation/adsorption and adsorption/electro-oxidation are effective for treating solutions containing dyes.
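The cost estimate mentioned in this abstract follows from the electrical energy consumed per volume of treated solution. The sketch below shows one way such an estimate could be computed for a galvanostatic run; every number in it (current density, anode area, cell voltage, time, volume, tariff) is assumed for illustration and is not taken from the thesis.

```python
def energy_per_volume(j_mA_cm2: float, area_cm2: float, u_cell_V: float,
                      t_h: float, volume_L: float) -> float:
    """Specific energy consumption in kWh per m^3 of treated solution."""
    current_A = j_mA_cm2 * area_cm2 / 1000.0          # mA -> A
    energy_kWh = u_cell_V * current_A * t_h / 1000.0  # W h -> kWh
    return energy_kWh / (volume_L / 1000.0)           # per m^3

if __name__ == "__main__":
    # hypothetical run: 20 mA cm^-2, 10 cm^2 anode, 5 V cell, 1 h, 0.3 L
    e_spec = energy_per_volume(20.0, 10.0, 5.0, 1.0, 0.3)
    price_per_kWh = 0.12                               # assumed electricity tariff
    print(f"E = {e_spec:.2f} kWh m^-3, cost ~ {e_spec * price_per_kWh:.2f} per m^3")
```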