851 results for synthetic resins


Relevance:

10.00%

Publisher:

Abstract:

Abstract of the communication presented at the XIII Congreso Nacional de Ingeniería Química, Madrid, 18-20 November 2010.

Relevance:

10.00%

Publisher:

Abstract:

This work studies the use of 3D point clouds, that is, sets of points in a Cartesian reference system in R3, for identifying and characterizing the discontinuities exposed on a rock mass and their application to the field of rock mechanics. The point clouds used were acquired by three techniques: synthetic generation, 3D laser scanning, and the digital photogrammetry technique Structure from Motion (SfM). The approach is oriented toward the extraction and characterization of discontinuity sets and their application to assessing the quality of a rock slope with the Slope Mass Rating (SMR) geomechanical classification. The work is divided into three parts: (1) a methodology for extracting discontinuities and classifying the 3D point cloud; (2) the analysis of normal spacings in 3D point clouds; and (3) the analysis of the geomechanical quality assessment of rock slopes using the SMR classification from 3D point clouds. The first research line studies 3D point clouds in order to extract and characterize the planar discontinuities present on the surface of a rock mass. First, information on existing methodologies and on the availability of software for this purpose was compiled. This motivated the decision to investigate and design a novel classification process that documents every step needed to program it and even offers the resulting code to the scientific community under a GNU GPL license. A new methodology was thus designed and software was programmed that analyzes 3D point clouds semi-automatically, allowing the user to interact with the classification process. This software is called Discontinuity Set Extractor (DSE). The method was validated using synthetic point clouds and point clouds acquired with a 3D laser scanner. First, the code analyzes the point cloud by performing a coplanarity test for each point and its nearest neighbours and then computes the normal vector of the surface at that point. Second, the poles of the normal vectors computed in the previous step are plotted on a stereographic net; the pole density is then calculated and the highest-density poles, or principal poles, are identified. These indicate the most represented surface orientations and therefore the discontinuity sets. Third, each point is assigned to a set depending on the angle between the point's normal vector and that of the set. At this stage the user can visualize the point cloud classified into the determined discontinuity sets and validate the intermediate result. Fourth, a cluster analysis groups the points of each set into individual planes (clusters); clusters without a sufficient number of points are filtered out, and the equation of each plane is determined. Finally, the classification results are exported to a text file for analysis and representation in other programs. The second research line studies the spacing between planar discontinuities exposed in rock masses from 3D point clouds. A methodology was developed to compute spacings from previously classified 3D point clouds, determining the spatial relationships between the planes of each set and calculating the normal set spacing. The novel foundation of the proposed method is that the normal set spacing is determined following the same principles used in the field, but without the spatial restrictions, unsafe conditions, and difficulties inherent to that process. Two aspects of the discontinuities were considered, finite and infinite persistence, the former being the most novel aspect of this work. Developing and applying the method to several case studies made it possible to establish its scope of application. Validation was carried out with synthetic point clouds and point clouds acquired with a 3D laser scanner. The third research line analyzes how the information obtained from 3D point clouds can be applied to assessing the quality of a rock slope with the SMR geomechanical classification. The analysis focused on how orientations determined from different information sources (field data and remote acquisition techniques) influence the adjustment factors and the value of the SMR index. The results show that using widely accepted information sources and techniques can change the assessed quality of the rock slope by up to one geomechanical class (that is, 20 points). The analyses also confirmed the validity of the SMR index for mapping unstable zones of a slope. The methods and software developed represent a significant scientific advance in the use of 3D point clouds for (1) the study and characterization of rock mass discontinuities and (2) their application to the assessment of rock slope quality using geomechanical classifications. Likewise, the conclusions obtained and the means and methods employed in this doctoral thesis can be verified and reused by other researchers, as they are available on the author's website under a GNU GPL license.
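The per-point normal estimation and set assignment described above can be illustrated with a minimal sketch. This is not the DSE code itself: the neighbourhood size, the angular threshold, and the principal poles (which DSE derives from the pole-density analysis on the stereonet) are assumptions supplied for the example.

```python
# Minimal sketch of per-point normal estimation and discontinuity-set assignment.
# Not the DSE implementation; k, max_angle_deg and principal_poles are illustrative.
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=30):
    """Estimate a unit normal per point from the local PCA of its k nearest neighbours."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points, dtype=float)
    for i, neigh in enumerate(idx):
        local = points[neigh] - points[neigh].mean(axis=0)
        # The singular vector with the smallest singular value of the centred
        # neighbourhood approximates the surface normal (coplanarity direction).
        _, _, vt = np.linalg.svd(local, full_matrices=False)
        normals[i] = vt[-1]
    return normals

def assign_sets(normals, principal_poles, max_angle_deg=30.0):
    """Assign each point to the discontinuity set whose pole is closest in angle."""
    poles = principal_poles / np.linalg.norm(principal_poles, axis=1, keepdims=True)
    cosines = np.abs(normals @ poles.T)          # |cos| ignores the sign of the pole
    best = cosines.argmax(axis=1)
    angles = np.degrees(np.arccos(np.clip(cosines.max(axis=1), -1.0, 1.0)))
    best[angles > max_angle_deg] = -1            # -1 marks points left unassigned
    return best
```

In the workflow described above, the principal poles would come from the density analysis of the plotted normals; here they are simply treated as given inputs to keep the sketch short.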

Relevance:

10.00%

Publisher:

Abstract:

The use of plants to treat different pathologies is as ancient as humanity itself, and plants remain an outstanding source of therapeutic drugs. It is currently estimated that about 25% of all drugs in clinical use correspond to active principles isolated from higher plants and to semi-synthetic drugs obtained from these natural precursors. However, despite this substantial number of pharmacological entities, to which an abundant number of synthetic molecules must be added, there are still not enough drugs that simultaneously satisfy current therapeutic demands for effectiveness, selectivity, and minimal impact on the development of resistance. This shortage becomes critical in pathologies such as cancer or bacterial infections, in which resistance to drug action is frequent and constitutes the main cause of treatment failure. This has led researchers to return to the study of the vast number of plant metabolites that remain to be evaluated, many of which may exhibit unknown structures or novel mechanisms of action. In this context, the general objective of the project is to study the pharmacological mechanisms underlying the antitumor or antibacterial activity of metabolites obtained in our laboratory from plants belonging to the native, adventitious, and naturalized flora of central Argentina, in order to gather the information needed to position them as drugs. In particular, the work will aim to determine the regulatory effect on constitutive molecules of tumor cells through which two compounds previously identified in our laboratory, one as cytotoxic and the other as an inhibitor of chemotherapeutic efflux mediated by the multidrug-resistance (MDR) pump P-glycoprotein (P-gp), exert their action. We also propose to obtain new substances with antibacterial properties, paying special attention to the molecules and processes involved in that action. It is worth emphasizing that the substances found may emerge in the future as alternative drugs per se or as leads for the synthesis or semi-synthesis of analogues for use in clinical or veterinary treatments. These topics are a high priority in research, given the urgent need for new selective drugs directed against cancer cells and pathogenic bacteria.

Relevance:

10.00%

Publisher:

Abstract:

In Argentina, in line with the rest of the world, nanotechnology is considered a strategic area. Nevertheless, research in nanobiotechnology still constitutes a vacant field. The use of nanomaterials to develop bioanalytical platforms for the construction of biosensors offers multiple advantages and promising prospects for application in several areas. Today, clinical analysis laboratories, the pharmaceutical and food industries, and bromatological and environmental control laboratories require analytical methodologies that provide accurate, reproducible, fast, sensitive, and selective results using small sample volumes, minimal reagent consumption, and little, clean waste. Research on nanobiosensors is directed toward these goals. One of the great challenges is to achieve miniaturized biosensors with potential for point-of-care devices and for the simultaneous detection of multiple analytes. Even though countless developments have been made in the almost 50 years since biosensors first appeared, many questions remain open. Modification with nanomaterials plays a leading role in both electrochemical and plasmonic transducers. The use of thin Au films for SPR modified with graphene or graphene oxide is a field of enormous potential that remains barely exploited, and it is therefore of great importance. Regarding the biorecognition layer, the work will use molecules capable of establishing bioaffinity interactions, such as antibodies, as well as molecules that are rarely used in our country and in Latin America, such as DNA, aptamers, PNA, and lectins. ABSTRACT: The general objective of this project is to develop new bioanalytical platforms for detecting different bioaffinity events by integrating electrochemical (EC) and plasmonic transducers with nanostructured materials (carbon nanotubes, graphene nanosheets, metallic nanowires); biomolecules (DNA, peptide nucleic acid (PNA), aptamers, antibodies, lectins); and polymers functionalized with bioactive molecules. The resulting supramolecular architectures will be directed toward EC and plasmonic biosensors for quantifying biomarkers of clinical and environmental relevance. CNTs, graphene, graphene oxide, and metallic nanowires will be functionalized with homopeptides and proteins with high affinity for metal cations, and these will be integrated with carbon and gold transducers and with recognition biomolecules capable of forming affinity complexes (antigen-antibody, aptamer-target molecule, DNA-DNA, PNA-DNA, lectin-carbohydrate, ligand-metal cation, and avidin-biotin). New monomers and polymers functionalized with bioactive molecules and/or redox groups will be synthesized and characterized using different synthetic routes. Genosensors will be developed for detecting the hybridization of sequences of medical interest (colon and breast cancer, tuberculosis); aptasensors for detecting protein markers of T. cruzi, cardiovascular diseases, and cationic contaminants; immunosensors for detecting protein biomarkers related to cardiovascular diseases and cancer; and lectin-based affinity biosensors for detecting carbohydrates. The platforms and the analytical signals will be characterized using the following techniques: cyclic, differential pulse, and square wave voltammetry; stripping; surface plasmon resonance; electrochemical impedance spectroscopy; scanning electrochemical microscopy; SEM, TEM, AFM, and SNOM; UV-vis, FTIR, Raman, and NMR spectroscopies; TGA; and DSC.

Relevance:

10.00%

Publisher:

Abstract:

The cantigas of the Galician-Portuguese lyric are the work of a diverse group of authors and constitute a rich literary and cultural heritage of the Middle Ages, produced between the twelfth and fourteenth centuries. Over time, interest in them has led to the study of aspects of the transmission of the texts, the biography of the troubadours, and the influences received from territories beyond the Peninsula, as well as to the preparation of several critical editions. The aim of this thesis is a critical edition of the cantigas of one of the troubadours of the Galician-Portuguese lyric, Airas Engeitado. This author was last edited in 1932, by José Joaquim Nunes, together with the cantigas de amor that Carolina Michaëlis considered excluded from the Cancioneiro da Ajuda. As far as we know, that edition has not been revised by any editor to date. Nunes's edition, about which Nunes himself expressed doubts, presents Engeitado's texts in a rather corrupted form, so a new critical edition is undertaken here, with more demanding editorial criteria than those Nunes applied and with different transcription norms. The thesis also frames and explains an authorial lyric whose characteristics may be considered singular within the context of the Galician-Portuguese lyric. The four cantigas de amor that I consider to be by Airas Engeitado have come down to us through the Cancioneiro da Biblioteca Nacional (B) and the Cancioneiro da Biblioteca Vaticana (V), and are mentioned in the Tavola Colocciana, the index of B. In addition to the critical establishment of Airas Engeitado's cantigas, the thesis includes notes on palaeographic questions, notes addressing the divergences between my readings of the witnesses and those of the previous editor, and notes on lexical and syntactic peculiarities and on the versification schemes of the cantigas. A brief chapter summarizes the little that is known about the biography of Airas Engeitado and places the edited cantigas within the manuscript tradition. A question of the utmost relevance is the double attribution of the cantiga A gran direito lazerei, which I also consider and discuss in the chapter on the manuscript tradition. That discussion grounds my decision to include the cantiga in the present edition, even though it has so far been unanimously attributed to Afonso Eanes do Coton.

Relevance:

10.00%

Publisher:

Abstract:

Dissertation for obtaining the Master's degree at the Instituto Superior de Ciências da Saúde Egas Moniz.

Relevance:

10.00%

Publisher:

Abstract:

Dissertation for obtaining the Master's degree at the Instituto Superior de Ciências da Saúde Egas Moniz.

Relevance:

10.00%

Publisher:

Abstract:

This study evaluated the bonding of lingual brackets to the lingual surfaces of maxillary premolars using PADs of different thicknesses, made with the light-cured resin Transbond XT at thicknesses of 1.0 mm and 2.0 mm and compared with the smallest possible thickness. The Sondhi Rapid Set adhesive was used for indirect bonding. Shear bond strength was evaluated five minutes after bonding in a Kratos mechanical testing machine at a crosshead speed of 1.0 mm/min. The mean shear bond strength was 9.69 MPa (SD 4.02 MPa) for Group I, 6.15 MPa (SD 2.69 MPa) for Group II, and 5.73 MPa (SD 1.62 MPa) for Group III. Group I, with the thinnest PAD, showed a significantly higher shear bond strength than Groups II and III (PADs of 1.0 mm and 2.0 mm, respectively), which in turn did not differ significantly from each other at p < 0.05. Adhesive Remnant Index score 1 predominated in Groups I and II, characterizing a larger number of adhesive failures; in Group III, score 2 predominated, with cohesive failures.
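The MPa values above come from dividing the peak force recorded by the testing machine by the bonded area. A minimal illustration follows; the bracket base area and force are assumed values for the example, not figures reported in the study.

```python
# Convert a peak shear force (N) to bond strength in MPa (N/mm^2).
# The force and base area below are illustrative assumptions, not study data.
def shear_bond_strength_mpa(peak_force_n: float, base_area_mm2: float) -> float:
    return peak_force_n / base_area_mm2

print(shear_bond_strength_mpa(peak_force_n=97.0, base_area_mm2=10.0))  # ~9.7 MPa
```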

Relevance:

10.00%

Publisher:

Abstract:

This research studies the application of syntagmatic analysis of written texts in Brazilian Portuguese as a methodology for the automatic creation of extractive summaries. Automatic summarization, a task within natural language processing (NLP), studies ways in which a computer can autonomously construct summaries of texts. We start from the assumption that teaching the computer how a language is structured, in our case Brazilian Portuguese, helps it discover the most relevant sentences and consequently build extractive summaries with higher informativeness. In this study we propose a summarization method that automatically performs the syntagmatic analysis of texts and, from it, builds an automatic summary. The phrases that make up the syntactic structures are used to analyze the sentences of the text, and the count of these elements determines whether or not a sentence is included in the generated summary.
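A minimal sketch of this kind of extractive selection follows. It is not the thesis's method: the scoring rule (keeping the sentences with the most phrase constituents) is our reading of the description above, and the phrase counter is a placeholder that would in practice come from a syntactic analyzer for Brazilian Portuguese.

```python
# Sketch: score sentences by the number of syntagmatic (phrase) constituents
# they contain and keep the highest-scoring ones, preserving original order.
from typing import Callable, List

def summarize(sentences: List[str],
              count_phrases: Callable[[str], int],
              n_sentences: int = 3) -> List[str]:
    """Keep the n_sentences with the highest phrase counts, in original order."""
    ranked = sorted(range(len(sentences)),
                    key=lambda i: count_phrases(sentences[i]),
                    reverse=True)
    selected = sorted(ranked[:n_sentences])
    return [sentences[i] for i in selected]

# Toy usage: a crude stand-in counter that treats whitespace tokens as "phrases".
toy_counter = lambda s: len(s.split())
print(summarize(["A short one.",
                 "A much longer and more informative sentence.",
                 "Mid-sized sentence here."], toy_counter, n_sentences=2))
```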

Relevance:

10.00%

Publisher:

Abstract:

Improving the performance of hydraulic binders, the basis of Portland cement, depends on the careful selection and application of materials that promote greater durability and reduce maintenance costs. A wide variety of chemical additives is used in Portland cement slurries for cementing oil wells. These are designed to work at temperatures from below 0 °C (permafrost regions) up to 300 °C (thermal recovery and geothermal wells) and at pressures ranging from near ambient (in shallow wells) to above 200 MPa (in deep wells). Additives therefore make it possible to adapt cement slurries for application under various conditions. Among the materials used in Portland cement slurries for oil wells, nanometer-scale materials have been applied with good results. Nanosilica, a dispersion of SiO2 particles at the nanometer scale, improves the plastic characteristics and mechanical properties of the hardened material when used in cement systems. This dispersion is used commercially as a filler, as a rheology modifier, and/or in construction repair processes, and it also appears in many product formulations such as paints, plastics, synthetic rubbers, adhesives, sealants, and insulating materials. On this basis, this study evaluates the performance of nanosilica as an extender and performance-improving additive for cement slurries subjected to low temperatures (5 °C ± 3 °C), for application in the early stages of offshore oil wells. Cement slurries were formulated with densities of 11.0, 12.0, and 13.0 ppg and nanosilica concentrations of 0, 0.5, 1.0, and 1.5%. The slurries were subjected to low temperatures (5 °C ± 3 °C) and evaluated by rheology, stability, free water, and compressive strength tests in accordance with the procedures established by API SPEC 10A. Thermal (TG/DTA) and crystallographic (XRD) characterization was also performed. The use of nanosilica reduced the free water volume by 30% and increased the compressive strength by 54.2% with respect to the reference slurry. Nanosilica therefore appears to be a promising material for cement slurries used in the early, low-temperature stages of oil wells.

Relevance:

10.00%

Publisher:

Abstract:

Heavy metals are present in industrial waste and can generate a large environmental impact by contaminating water, soil, and plants; their chemical action has attracted environmental interest. In this context, this study tested the performance of electrochemical technologies for removing and quantifying heavy metals. First, the electroanalytical technique of stripping voltammetry with a glassy carbon (GC) electrode was standardized so that it could be used to quantify the metals during their removal by an electrocoagulation (EC) process. Analytical curves were evaluated to ensure reliable determination and quantification of Cd2+ and Pb2+, separately or in a mixture. In parallel, the EC process was carried out in a continuous-flow electrochemical cell (EFC) for removing Pb2+ and Cd2+. The experiments used parallel Al plates 10 cm in diameter (≈63.5 cm2). The conditions for removing Pb2+ and Cd2+, dissolved in 2 L of solution circulated at 151 L h-1, were optimized by applying different current values for 30 min, and the Cd2+ and Pb2+ concentrations were monitored during electrolysis by stripping voltammetry. The results showed that Pb2+ removal by the EC process was effective, reaching 98% removal in 30 min. This behavior depends on the applied current, which implies higher energy consumption. The results also showed that stripping voltammetry is quite reliable for determining the Pb2+ concentration when compared with measurements obtained by atomic absorption spectrometry (AA). In view of this, the second objective of the study was to evaluate the removal of Cd2+ and Pb2+ from a mixed solution by EC. The increase in removal efficiency with current was confirmed: 93% of the Cd2+ and 100% of the Pb2+ were removed after 30 min. Increasing the current promotes the oxidation of the sacrificial electrodes and consequently increases the amount of coagulant, which favors the removal of heavy metals from solution. Adsorptive stripping voltammetry is a fast, reliable, economical, and simple way to determine Cd2+ and Pb2+ during their removal; it is more economical than the methods normally used, which require toxic and expensive reagents. Our results demonstrate the potential of electroanalytical techniques to monitor the course of environmental interventions. Thus, the two techniques applied together can be a reliable way to monitor environmental impacts caused by heavy-metal pollution of aquatic ecosystems.
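The removal percentages quoted above follow from comparing the monitored concentration at each sampling time with the initial concentration. A minimal sketch, with illustrative concentrations rather than the study's data, is shown below.

```python
# Removal efficiency during electrolysis: E(t) = (C0 - C(t)) / C0 * 100.
# The concentration values are illustrative placeholders, not data from the study.
def removal_efficiency(c0: float, c_t: float) -> float:
    return (c0 - c_t) / c0 * 100.0

times_min = [0, 10, 20, 30]
pb_mg_l = [50.0, 20.0, 5.0, 1.0]   # hypothetical Pb2+ readings from stripping voltammetry
for t, c in zip(times_min, pb_mg_l):
    print(f"t = {t:2d} min: removal = {removal_efficiency(pb_mg_l[0], c):5.1f} %")
```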

Relevance:

10.00%

Publisher:

Abstract:

Drilling fluids are of fundamental importance in petroleum activities, since they remove the cuttings, maintain pressure and well stability, prevent collapse and the inflow of formation fluid into the well, and lubricate and cool the drill bit. There are basically three types of drilling fluids: water-based, non-aqueous, and aerated. Water-based drilling fluids are widely used because they are less aggressive to the environment and provide excellent stability and inhibition (when formulated as inhibitive fluids), among other qualities. Produced water is generated together with the oil during production and has high concentrations of metals and contaminants, so it must be treated before disposal. The produced water from the Urucu-AM and Riacho da Forquilha-RN fields has high concentrations of contaminants, metals, and salts such as calcium and magnesium, which complicates its treatment and disposal. The objective of this work was therefore to evaluate the use of synthetic produced water, with characteristics similar to the produced water from Urucu-AM and Riacho da Forquilha-RN, to formulate a water-based drilling fluid, observing the influence of varying calcium and magnesium concentrations on filtrate and rheology tests. A simple 3² factorial experimental design was used for the statistical modelling of the data. The results showed that varying the calcium and magnesium concentrations did not influence the rheology of the fluid: plastic viscosity, apparent viscosity, and initial and final gels did not vary significantly. For the filtrate tests, the calcium concentration had a linear influence on the chloride concentration: the higher the calcium concentration, the higher the chloride concentration in the filtrate. For the fluids based on Urucu produced water, the calcium concentration had a quadratic influence on the filtrate volume, meaning that high calcium concentrations interfere with the effectiveness of the inhibitors used in the fluid formulation. For the fluids based on Riacho da Forquilha produced water, the influence of calcium on the filtrate volume was linear. The magnesium concentration was significant only for the chloride concentration, in a quadratic way, and only for the fluids based on Urucu produced water. The fluid with the maximum magnesium concentration (9.411 g/L) and the minimum calcium concentration (0.733 g/L) showed good results. Therefore, produced water with a magnesium concentration of up to 9.411 g/L and a calcium concentration of up to 0.733 g/L can be used to formulate water-based drilling fluids with suitable properties for this kind of fluid.
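A 3² factorial design of this kind crosses three levels of each factor (nine runs) and fits linear, quadratic, and interaction terms to each response. The sketch below illustrates the idea; the concentration levels, the response values, and the response chosen (filtrate volume) are assumptions for the example, not the study's data.

```python
# Sketch of a 3x3 (3^2) factorial design for Ca and Mg and a quadratic
# response-surface fit. Levels and responses are illustrative placeholders.
import itertools
import numpy as np

ca_levels = [0.1, 0.4, 0.733]   # g/L, hypothetical low/mid/high levels
mg_levels = [1.0, 5.0, 9.411]   # g/L, hypothetical low/mid/high levels

design = list(itertools.product(ca_levels, mg_levels))   # 9 runs

# Model: y = b0 + b1*Ca + b2*Mg + b3*Ca^2 + b4*Mg^2 + b5*Ca*Mg
X = np.array([[1, ca, mg, ca**2, mg**2, ca * mg] for ca, mg in design])
y = np.array([8.2, 8.0, 7.9, 8.9, 8.7, 8.5, 10.4, 10.1, 9.8])  # e.g. filtrate volume, mL

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["b0", "Ca", "Mg", "Ca^2", "Mg^2", "Ca*Mg"], coef.round(3))))
```

The fitted coefficients then indicate whether each factor acts linearly or quadratically on the response, which is how statements such as "calcium influences the filtrate volume quadratically" are typically supported.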

Relevance:

10.00%

Publisher:

Abstract:

In this work, the treatment of wastewater from the textile industry containing the dyes Yellow Novacron (YN), Red Remazol BR (RRB), and Blue Novacron CD (NB), as well as the treatment of wastewater from the petrochemical industry (produced water), was investigated by anodic oxidation (AO) with platinum anodes supported on titanium (Ti/Pt) and with boron-doped diamond (BDD) anodes. One of the main parameters of this kind of treatment is the electrocatalytic material used, since the mechanisms and products of some anodic reactions depend on it. The AO of synthetic effluents containing RRB, NB, and YN was investigated in order to find the best conditions for removing the colour and the organic content of the dyes. According to the experimental results, AO is suitable for the decolourization of wastewaters containing these textile dyes thanks to the electrocatalytic properties of the BDD and Pt anodes. Removal of the organic load was more efficient at BDD in all cases, and the dyes were degraded to aliphatic carboxylic acids by the end of the electrolysis. The energy required for colour removal during AO of RRB, NB, and YN solutions depends mainly on the operating conditions: for RRB, for example, it ranges from 3.30 kWh m-3 at 20 mA cm-2 to 4.28 kWh m-3 at 60 mA cm-2 (pH 1); from 15.23 kWh m-3 at 20 mA cm-2 to 24.75 kWh m-3 at 60 mA cm-2 (pH 4.5); and from 10.80 kWh m-3 at 20 mA cm-2 to 31.5 kWh m-3 at 60 mA cm-2 (pH 8) (values estimated per volume of treated effluent). On the other hand, in the AO of produced water generated by the petrochemical industry, galvanostatic electrolysis with BDD led to nearly complete removal of COD (98%), owing to the large amounts of hydroxyl radicals and peroxodisulphates generated from the oxidation of water and of the sulphates in solution, respectively; the rate of COD removal increased with the applied current density (15-60 mA cm-2). At the Pt electrode, approximately 50% of the organic load was removed at 15 to 30 mA cm-2, while 80% COD removal was achieved at 60 mA cm-2. The results obtained with this technology were therefore satisfactory, depending on the electrocatalytic material and operating conditions, both for the removal of organic load (petrochemical and textile effluents) and for the removal of colour (textile effluents). Electrochemical treatment can thus be considered a new alternative for the pretreatment or treatment of effluents from the textile and petrochemical industries.
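Energy-per-volume figures of this kind are commonly estimated from the cell voltage, the applied current, the electrolysis time, and the treated volume. The sketch below shows that standard estimate; it is an assumption that the reported values were obtained this way, and the numbers used are illustrative rather than the study's operating data.

```python
# Specific energy consumption of a galvanostatic electrolysis, in kWh per m^3
# of treated effluent: E = U * I * t / (1000 * V).
# The values below are illustrative, not operating data from the study.
def energy_kwh_per_m3(cell_voltage_v: float, current_a: float,
                      time_h: float, volume_m3: float) -> float:
    return cell_voltage_v * current_a * time_h / (1000.0 * volume_m3)

# e.g. 5 V cell voltage, 0.5 A, 2 h of electrolysis, 1 L of effluent
print(round(energy_kwh_per_m3(5.0, 0.5, 2.0, 0.001), 2), "kWh m-3")
```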

Relevance:

10.00%

Publisher:

Abstract:

Google Docs (GD) is an online word processor with which multiple authors can work on the same document, synchronously or asynchronously, which can help develop the ability to write in English (WEISSHEIMER; SOARES, 2012). As they write collaboratively, learners find more opportunities to notice the gaps in their written production, since they are exposed to more input from their fellow co-authors (WEISSHEIMER; BERGSLEITHNER; LEANDRO, 2012), and they prioritize the process of text (re)construction rather than the concern with the final product, i.e., the final version of the text (LEANDRO; WEISSHEIMER; COOPER, 2013). Moreover, when it comes to second language (L2) learning, producing language enables the consolidation of existing knowledge as well as the internalization of new knowledge (SWAIN, 1985; 1993). Taking this into consideration, this mixed-method (DÖRNYEI, 2007) quasi-experimental (NUNAN, 1999) study investigates the impact of collaborative writing through GD on the development of the writing skill in English and on the noticing of syntactic structures (SCHMIDT, 1990). Thirty-four university students of English made up the cohort of the study: twenty-five were assigned to the experimental group and nine to the control group. All learners took a pre-test and a post-test so that their noticing of syntactic structures could be measured. Learners in the experimental group were exposed to a blended learning experience in which they took reading and writing classes at the university and, outside the classroom, collaboratively wrote three pieces of flash fiction (a complete story told in a hundred words) online through GD over eleven weeks. Learners in the control group took the reading and writing classes at the university but did not practise collaborative writing. The first and last stories produced by the learners in the experimental group were analysed in terms of grammatical accuracy, operationalized as the number of grammar errors per hundred words (SOUSA, 2014), and lexical density, which refers to the relationship between the number of words with lexical properties and the number of words with grammatical properties (WEISSHEIMER, 2007; MEHNERT, 1998). Additionally, learners in the experimental group answered an online questionnaire about the blended learning experience. The quantitative results showed that the collaborative task led to the production of more lexically dense texts over the eleven weeks. The noticing and grammatical accuracy results were different from what we expected; however, they provide insights into measurement issues, in the case of noticing, and into the participants' positive attitude towards collaborative writing with flash fiction. The qualitative results also shed light on the usefulness of computer-mediated collaborative writing in L2 learning.
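The two text measures operationalized above can be computed roughly as follows. This is a minimal sketch: the error count and the classification of words as lexical or grammatical are supplied by the analyst, and the particular ratio used for lexical density is an assumption based on the description in the abstract.

```python
# Rough sketch of the two measures described above. Inputs (error counts,
# lexical vs. grammatical word counts) are supplied by the analyst.
def grammatical_accuracy(error_count: int, total_words: int) -> float:
    """Grammar errors per hundred words."""
    return error_count / total_words * 100.0

def lexical_density(lexical_words: int, grammatical_words: int) -> float:
    """Ratio of lexical (content) words to grammatical (function) words."""
    return lexical_words / grammatical_words

print(grammatical_accuracy(error_count=7, total_words=100))                # 7.0 errors/100 words
print(round(lexical_density(lexical_words=55, grammatical_words=45), 2))   # 1.22
```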

Relevance:

10.00%

Publisher:

Abstract:

Produced water is considered the main effluent of the oil industry, owing to its increasing volume in mature fields and its varied composition. The oil and grease content (TOG) is the main parameter governing the final disposal of produced water. In this context, it is of great significance to develop an alternative method, based on a guar gum gel, for treating synthetic produced water, its distinguishing feature being the use of a highly hydrophilic polymer to clarify oil-contaminated water. This study therefore evaluates the efficiency of guar gum gels in removing oil from produced water. Guar gum is a natural polymer that, under specific conditions, forms three-dimensional structures with important physical and chemical properties. When the polymer chains are crosslinked by borate ions in the presence of salts, a salting-out effect occurs, reducing the solubility of the polymer gel in water. As a result, phase separation takes place, with the oil trapped in the collapsed gel. The TOG was quantified by spectroscopy in the ultraviolet and visible region. The system proved highly efficient in removing dispersed oil from synthetic produced water, reaching removal percentages above 90%.
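TOG by UV-vis is typically quantified against a calibration curve of absorbance versus known oil concentration, and the removal percentage then follows from the TOG before and after treatment. The sketch below illustrates that workflow; all concentrations and absorbances are illustrative assumptions, not the study's data.

```python
# Sketch: linear calibration of absorbance vs. known oil concentration,
# then removal efficiency from TOG before and after gel treatment.
# All numbers are illustrative placeholders, not data from the study.
import numpy as np

cal_conc_mg_l = np.array([0.0, 20.0, 40.0, 80.0])     # standards, hypothetical
cal_absorbance = np.array([0.00, 0.11, 0.22, 0.44])   # hypothetical readings

slope, intercept = np.polyfit(cal_absorbance, cal_conc_mg_l, 1)

def tog_from_absorbance(a: float) -> float:
    return slope * a + intercept

tog_in = tog_from_absorbance(0.40)    # before treatment
tog_out = tog_from_absorbance(0.03)   # after gel treatment
removal = (tog_in - tog_out) / tog_in * 100.0
print(f"TOG in: {tog_in:.1f} mg/L, out: {tog_out:.1f} mg/L, removal: {removal:.0f} %")
```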