84 results for CSV


Relevance: 10.00%

Abstract:

The Ocean Sampling Day (OSD) is a simultaneous sampling campaign of the world's oceans, which took place for the first time on the summer solstice (June 21st) of 2014. These cumulative samples, related in time, space and environmental parameters, provide insights into fundamental rules describing microbial diversity and function, and contribute to the blue economy through the identification of novel, ocean-derived biotechnologies. We see the OSD data as a reference data set for generations of experiments to follow in the coming decade. The present data set includes a description of each sample collected during Ocean Sampling Day 2014 and provides the contextual environmental data measured concurrently with the collection of water samples for genomic analyses.

Relevance: 10.00%

Abstract:

This final degree project consists of the design and implementation of a tool for managing and administering the training of individual-sport athletes. Until now, athletes had to manage their training through spreadsheets, spending time learning tools such as Microsoft Excel or OpenOffice Calc to customise templates and store data, using further tools such as Google Calendar to get a calendar view of completed sessions, or relying on programs built for a specific sport or even a single athlete. The main objective was to develop a tool that unifies all of these tasks, offering the athlete template configuration, data recording, graph generation from the recorded data and a training-calendar view in an agile, simple and intuitive way, adaptable to the needs of any sport or athlete. To reach this objective we surveyed athletes from a wide range of individual sports, identifying the particularities of each sport and analysing the data they provided, in order to design a versatile tool that can be used regardless of which parameters are to be recorded for each training session. The resulting tool is written in Java, which makes it portable to any operating system with Java support, with no prior installation required. It is a plug-and-play application that needs only its executable file to run; the athlete can therefore keep all the information in very little space, approximately 6 megabytes, and carry it anywhere on a pen drive or in cloud storage. In addition, the data are stored in CSV (comma-separated values) files with a standardised format that allows export to other tools. In conclusion, the athlete saves time and effort on tasks unrelated to practising the sport, and gains a tool for analysing each recorded parameter in different ways, tracking progress and helping to improve weak areas.
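As a rough illustration of the kind of standardised, exportable CSV log the abstract describes, the following Python sketch appends training sessions to such a file; the column names are invented for the example, since the abstract does not publish the tool's exact schema:

```python
import csv
import os
from datetime import date

# Hypothetical standardised layout; the thesis does not publish its exact
# column set, so these field names are illustrative only.
FIELDS = ["date", "sport", "duration_min", "distance_km", "notes"]

def append_session(path, session):
    """Append one training session, writing the header on first use so the
    file stays a portable, self-describing CSV."""
    new_file = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(session)

append_session("training_log.csv", {
    "date": date(2014, 6, 21), "sport": "running",
    "duration_min": 45, "distance_km": 9.5, "notes": "easy pace",
})
```

Because the file keeps a plain header row and comma-separated values, it imports directly into Excel, Calc or any plotting tool, which is the portability the abstract emphasises.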

Relevance: 10.00%

Abstract:

The aim of this project is to implement a system capable of analysing body movement from a set of kinematic points. These points are obtained with a previously developed program and captured with the Kinect camera. The first step is a study of the existing techniques and knowledge related to human movement. Rudolf Laban was one of the greatest exponents of this field; thanks to his observations, a relationship was established between an individual's personality, mood and way of moving. Laban coined the term effort, which refers to the way the energy that generates a movement is managed and how it is modulated through the movement's sequences; it is a way of describing the intention behind inner expressions. Effort is divided into four categories: weight, space, time and flow, and each category has two polarities called effort elements. These eight effort elements characterise a movement. To quantify them, movements representing each element are recorded with the Kinect camera and their values are saved in a CSV file. To process these data, a neural network was chosen as the most suitable system because of its flexibility and its capability to process non-linear inputs. Its implementation required a broad study covering topologies, activation functions, learning types and training algorithms, among other aspects. The network was given two hidden layers for better data processing; it is static, follows a feedforward computation, and is trained with the backpropagation algorithm. In a static network the inputs must be fixed values, i.e. they cannot vary over time, so an intermediate program computes the arithmetic mean of the recorded values. A second test with the same network checks whether it can recognise movements characterised by more than one effort element; for this, the movements are recorded again, this time in pairs, and the rest of the process remains the same.
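The preprocessing step described above (collapsing a time-varying recording into fixed inputs for a static network) can be sketched as follows; the CSV column layout, the layer sizes and the sigmoid activation are assumptions of this sketch, not choices documented in the abstract:

```python
import csv
import numpy as np

def averaged_input(csv_path):
    """Collapse a time-varying Kinect recording into one fixed-size vector by
    taking the arithmetic mean of each kinematic column, as a static network
    requires (assumed layout: one joint coordinate per column, one frame per row)."""
    with open(csv_path, newline="") as f:
        rows = [[float(v) for v in row] for row in csv.reader(f)]
    return np.array(rows).mean(axis=0)

def forward(x, weights, biases):
    """Feedforward pass through a two-hidden-layer network; sigmoid
    activations are an assumption, not documented in the abstract."""
    a = x
    for W, b in zip(weights, biases):
        a = 1.0 / (1.0 + np.exp(-(W @ a + b)))
    return a  # 8 outputs, one per effort element

# Tiny demo with random weights: 30 inputs -> 16 -> 16 -> 8 effort elements.
rng = np.random.default_rng(0)
sizes = [(16, 30), (16, 16), (8, 16)]
Ws = [rng.normal(size=s) for s in sizes]
bs = [np.zeros(s[0]) for s in sizes]
print(forward(rng.normal(size=30), Ws, bs))
```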

Relevance: 10.00%

Abstract:

This final-year project describes the development of a system for estimating dense depth maps from real 3D video sequences. It is motivated by the need to use the depth information of a stereo video to compute occlusions in the interactive synthetic-object insertion module developed in the ImmersiveTV project. In the 3DTV receiver, the system must process, in real time, high-resolution stereo sequences of real scenes in Side-by-Side format. The characteristics of the content are analysed to understand the problems to be faced. Obtaining a dense depth map through stereo matching makes it possible to compute the occlusions between the synthetic object and the scene. The disparity value assigned to each pixel need not be precise; it is enough to distinguish the different depth planes, since the system works with relative distances. Stereo matching requires the two input views to be aligned, so the system first checks whether they must be rectified; a theoretical review of calibration and rectification is given, summarising some methods to consider for this problem. To estimate depth, common dense stereo-matching techniques are reviewed and a set of implementations is selected in order to assess which are suitable, including local, global and semi-global techniques, some on CPU and others on GPU, modifying some of them to support negative disparity values. The lack of ground-truth disparity maps for the real content is a challenge that forces the use of indirect methods for comparing results. For an objective evaluation, related work on the comparison of matching techniques and on existing evaluation frameworks was reviewed. The disparity map is treated as a prediction error between shifted views: from the right view and the disparity of each pixel, the left view can be reconstructed, and by comparing the reconstructed image with the original, error statistics and the rates of pixels with invalid and erroneous disparity are computed. The efficiency of the algorithms is also taken into account by measuring the frame rate they can process. From the results, applying the criteria of maximising PSNR and minimising the rate of incorrect pixels, the best-behaved algorithm can be chosen. As a result, a tool has been implemented that integrates the disparity-map estimation system and the results-evaluation utility. It works on a single image, a sequence or a stereoscopic video. For matching, it allows choosing among a set of algorithms that have been adapted or modified to support negative disparity values. For evaluation, it implements the reconstruction of the reference view and its comparison with the original by computing RMS and PSNR as error measures, together with the rates of invalid and incorrect pixels and the efficiency in frames per second. Finally, the generated images (or videos) can be saved, together with a CSV text file containing the statistics for later comparison.
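The evaluation described above can be sketched in a few lines: rebuild the left view from the right view and the per-pixel disparity, then score the reconstruction with RMS and PSNR over the validly reconstructed pixels. The disparity sign convention and the invalid-disparity marker below are assumptions of this sketch:

```python
import numpy as np

def reconstruct_left(right, disparity, invalid=-1):
    """Rebuild the left view by sampling the right view at x - d(y, x).
    The sign convention and the `invalid` marker are assumptions."""
    h, w = right.shape
    rec = np.zeros_like(right)
    ok = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            d = int(disparity[y, x])
            if d == invalid or not (0 <= x - d < w):
                continue  # counts toward the invalid-disparity rate: 1 - ok.mean()
            rec[y, x] = right[y, x - d]
            ok[y, x] = True
    return rec, ok

def rms_psnr(original, reconstructed, ok, max_val=255.0):
    """Error statistics over pixels with a valid reconstruction."""
    diff = original.astype(float)[ok] - reconstructed.astype(float)[ok]
    rms = np.sqrt(np.mean(diff ** 2))
    return rms, 20.0 * np.log10(max_val / rms)
```

Running this per frame, together with a frames-per-second counter, yields exactly the kind of per-algorithm statistics table the tool exports to CSV.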

Relevance: 10.00%

Abstract:

In recent years there has been a large increase in the number of biomedical data sources. The emergence of new techniques for extracting genomic data, and of databases containing this information, has created the need to store it so that it can be accessed and worked with. The information produced by biomedical research is stored in databases, because databases allow data to be stored and handled in a simple and fast way. Databases come in a wide variety of formats, such as Excel, CSV or RDF, among others. Current research in this field is based on data analysis, searching for correlations that allow inferring, for example, new treatments or more effective therapies for a given disease or ailment. The volume of data handled is very large and disparate, which makes it necessary to develop automatic methods for integrating and homogenising the heterogeneous data. The European project p-medicine (FP7-ICT-2009-270089) aims to assist medical researchers, in this case in cancer research, by providing them with new tools for data management and for generating new knowledge from the analysis of the managed data. The ingestion of data into the p-medicine platform, and its processing with the provided methods, is intended to generate new models for clinical decision support. Within this project there are several tools for the integration of heterogeneous data, the design and management of clinical trials, the simulation and visualisation of tumours, and statistical data analysis. Precisely in the field of heterogeneous data integration arises the need to add external information from public databases to the system and to relate it to the existing data through semantic integration techniques. To meet this need, a tool called Term Searcher has been created that performs this process semi-automatically. This work describes its development and the algorithms created for its correct operation. The tool offers functionality that did not previously exist in the project for adding new data from public sources and semantically integrating it with private data.
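The abstract does not disclose Term Searcher's actual algorithms, so the following Python fragment only illustrates the general idea of semi-automatic term matching: string similarity proposes candidate links between a public vocabulary and local terms, and a human curator confirms or rejects each suggestion. All terms below are toy data:

```python
import difflib

# Toy public vocabulary and local trial terms (illustrative only).
public_terms = ["neoplasm of breast", "nephroblastoma",
                "acute lymphoblastic leukaemia"]
local_terms = ["breast neoplasm", "Wilms tumor (nephroblastoma)",
               "ALL leukaemia"]

for term in local_terms:
    # Propose the closest public term above a loose similarity cutoff.
    candidates = difflib.get_close_matches(term, public_terms, n=1, cutoff=0.4)
    # In a semi-automatic workflow, a curator would confirm this suggestion.
    print(f"{term!r} -> {candidates[0] if candidates else 'no match'}")
```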

Relevance: 10.00%

Abstract:

Supporting data are included in PDF and CSV files; any additional data may be obtained from the corresponding author (e-mail: j.vinogradov@imperial.ac.uk). TOTAL is thanked for partial support of Jackson's Chair in Geological Fluid Mechanics and for supporting the activities of the TOTAL Laboratory for Reservoir Physics at Imperial College London where these experiments were conducted. The Editor thanks Andre Revil and Paul Glover for their assistance in evaluating this paper.

Relevance: 10.00%

Abstract:

Our purpose is to report alterations in the contrast sensitivity function (CSF) and in the magno-, parvo- and koniocellular visual pathways, measured with a multichannel perimeter, in a case of essential tremor (ET). A complete evaluation of visual function was performed in a 69-year-old patient, including analysis of chromatic discrimination with the Farnsworth–Munsell 100 hue test, measurement of the CSF with the CSV-1000E test, and detection of potential alteration patterns in the magno-, parvo- and koniocellular visual pathways by means of a multichannel perimeter. Visual acuity and intraocular pressure (IOP) were within normal ranges in both eyes. No abnormalities were detected in the fundoscopic examination or in the optical coherence tomography (OCT) exam. The results of the color vision examination were also within normal ranges. A significant decrease in the achromatic CSFs of the right eye (RE) and left eye (LE) was detected at all spatial frequencies. The global statistical values provided by the multichannel perimeter confirmed significant absolute sensitivity losses, compared with the normal pattern, in the RE. In the LE, a statistically significant decrease in sensitivity was detected only for the blue-yellow (BY) channel. The pattern standard deviation (PSD) values obtained in our patient indicated significant localized losses, compared with the normality pattern, in the achromatic channel of the RE and in the red-green (RG) channel of the LE. Some color vision alterations may thus be present in ET that cannot be detected with conventional color vision tests such as the FM 100 hue.

Relevance: 10.00%

Abstract:

This data set describes the distribution of a total of 90 plant species growing on the field margins of an agricultural landscape in the Haean-myun catchment in South Korea. We conducted our survey between July and August 2011 in 100 sampling plots covering the whole catchment. In each plot we measured three environmental variables: slope, width of the field margin, and management type ("managed" for field margins showing signs of management activity from the ongoing season, such as cutting or herbicide spraying, and "unmanaged" for field margins left untouched during the season). For the botanical survey, each plot was sampled using three subplots of one square meter each, spaced 4 m apart. In each subplot we estimated three vegetation characteristics: vegetation cover (the percentage of ground covered by vegetation), species richness (the number of observed species) and species abundance (the number of observed individuals per species). We calculated the percentage of non-farmed habitats within buffer zones of 100, 200, 300, 400 and 500 m radius around each plot, using data provided by Seo et al. (2014). Non-farmed habitats included field margins, fallows, forest, riparian areas, pasture and grassland.
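A minimal sketch of the buffer-zone computation, assuming the land-cover data have been rasterised (here a synthetic raster at an assumed 10 m resolution); it reports the percentage of non-farmed cells within each radius around a plot:

```python
import numpy as np

def non_farmed_percentage(habitat, plot_rc, radius_cells):
    """Percentage of non-farmed cells within a circular buffer around a plot.
    `habitat` is a boolean raster (True = non-farmed); the grid resolution
    and the metres-to-cells conversion are assumptions of this sketch."""
    rows, cols = np.indices(habitat.shape)
    r0, c0 = plot_rc
    mask = (rows - r0) ** 2 + (cols - c0) ** 2 <= radius_cells ** 2
    return 100.0 * habitat[mask].mean()

# Synthetic 10 m resolution raster; radii of 100-500 m become 10-50 cells.
rng = np.random.default_rng(1)
habitat = rng.random((200, 200)) < 0.3
for radius_m in (100, 200, 300, 400, 500):
    pct = non_farmed_percentage(habitat, (100, 100), radius_m // 10)
    print(f"{radius_m} m buffer: {pct:.1f} % non-farmed")
```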

Relevance: 10.00%

Abstract:

BACKGROUND: Contrast detection is an important aspect of the assessment of visual function; however, clinical tests evaluate a limited set of spatial frequencies and contrasts. This study validates the accuracy and inter-test repeatability of the Aston swept-frequency near and distance mobile app contrast sensitivity tests, which overcome this limitation of traditional charts. METHOD: Twenty subjects wearing their full refractive correction underwent contrast sensitivity testing on the new near application (near app), the distance app, the CSV-1000 and Pelli-Robson charts, with full correction and with vision degraded by 0.8 and 0.2 Bangerter degradation foils. In addition, repeated measures were taken using the 0.8 occluding foil. RESULTS: The mobile apps (near more than distance, p = 0.005) recorded a higher contrast sensitivity than the printed tests (p < 0.001); however, all charts showed a reduction in measured contrast sensitivity with degradation (p < 0.001) and a similar decrease with increasing spatial frequency (interaction p > 0.05). Although the coefficient of repeatability was lowest for the Pelli-Robson charts (0.14 log units), the mobile app charts measured more spatial frequencies, took less time and were more repeatable (near: 0.26 to 0.37 log units; distance: 0.34 to 0.39 log units) than the CSV-1000 (0.30 to 0.93 log units). The duration to complete the CSV-1000 was 124 ± 37 seconds, the Pelli-Robson 78 ± 27 seconds, the near app 53 ± 15 seconds and the distance app 107 ± 36 seconds. CONCLUSIONS: While there were differences between charts in the contrast levels measured, the new Aston near and distance apps are a valid, repeatable and time-efficient method of assessing contrast sensitivity at multiple spatial frequencies.
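For reference, a coefficient of repeatability such as the ones reported above is commonly computed, in the Bland-Altman sense, as 1.96 times the standard deviation of the test-retest differences; a minimal sketch with hypothetical data:

```python
import numpy as np

def coefficient_of_repeatability(test, retest):
    """Bland-Altman coefficient of repeatability: 1.96 x the standard
    deviation of the test-retest differences. 95% of repeated measurements
    are expected to differ by less than this value."""
    d = np.asarray(test, float) - np.asarray(retest, float)
    return 1.96 * np.std(d, ddof=1)

# Hypothetical log contrast sensitivity values for 20 subjects measured twice.
rng = np.random.default_rng(0)
t1 = rng.normal(1.6, 0.15, 20)
t2 = t1 + rng.normal(0.0, 0.1, 20)
print(f"CoR = {coefficient_of_repeatability(t1, t2):.2f} log units")
```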

Relevance: 10.00%

Abstract:

Purpose: To examine visual outcomes following bilateral implantation of the FineVision trifocal intraocular lens (IOL; PhysIOL, Liège, Belgium). Methods: Twenty-six patients undergoing routine cataract surgery were implanted bilaterally with the FineVision trifocal IOL and followed up post-operatively for 3 months. The FineVision optic features a combination of two diffractive structures, providing distance, intermediate (+1.75 D add) and near (+3.50 D add) vision zones. Apodization of the optic surface increases far-vision dominance with pupil aperture. Data collected at the 3-month visit included uncorrected and corrected distance (CDVA) and near visual acuity; subjective refraction; defocus curve testing (photopic and mesopic); contrast sensitivity (CSV-1000); halometry glare testing; and a questionnaire (NAVQ) to gauge near-vision function and patient satisfaction. Results: The cohort comprised 15 males and 11 females, aged 52.5–82.4 years (mean 70.6 ± 8.2 years). Mean post-operative UDVA was 0.22 ± 0.14 logMAR, with a mean spherical equivalent refraction of +0.02 ± 0.35 D. Mean CDVA was 0.13 ± 0.10 logMAR monocularly and 0.09 ± 0.07 logMAR binocularly. Defocus curve testing showed an extensive range of clear vision in both photopic and mesopic conditions. Patients reported high levels of satisfaction with their near vision (mean 0.9 ± 0.6, where 0 = completely satisfied and 4 = completely unsatisfied) and demonstrated good spectacle independence. Conclusion: The FineVision IOL can be considered for patients seeking spectacle independence following cataract surgery, and provides good patient satisfaction with uncorrected vision.

Relevance: 10.00%

Abstract:

Biological rhythms are part of life, from the simplest to the most complex living beings. In humans, one of the most important biological rhythms is the sleep-wake cycle (SWC), a behavior indispensable for health, since sleep deprivation can lead to deficits in attention and memory, mood alterations and daytime sleepiness, all of which may affect school performance. Nevertheless, the SWC is a topic rarely discussed in schools. Thus, the aim of this research was to address the sleep-wake cycle, related to the content of Health, in order to encourage healthy sleep habits. This study was conducted in a public school with 33 students of the 3rd year of high school and was divided into four stages: 1st) study and analysis of the content of the textbook adopted by the school, to support the activities covered in the teaching unit (TU), and contact with the class's biology teacher to evaluate the feasibility of the schedule for developing the TU; 2nd) survey of the students' prior knowledge, through a questionnaire, to guide the development of the TU; 3rd) development and implementation of a TU based on meaningful learning, and characterization of the students' sleep habits; 4th) evaluation of the TU as a viable proposal for teaching the concepts of biological rhythms. The students' prior knowledge about the SWC was scarce, and this content is not covered in the books adopted by the school. Alternative conceptions were observed, particularly regarding individual differences in sleep, which may contribute to inadequate sleep habits, as reported by the adolescents in this study. The activities developed during the TU were well received by the students, who were participative and motivated and gave positive evaluations of the procedures used by the researcher. After the TU, the students' knowledge about the concept of biological rhythms had increased, and they began to recognize that the SWC changes throughout life and is shaped by both biological and socio-cultural factors. Thus, the TU elaborated in this study represents a viable proposal for teaching the concepts of biological rhythms contextualized with the content of Health in high school.

Relevance: 10.00%

Abstract:

Academic demands, a new social context, new routines and decreased parental control are factors that may influence the sleep pattern of freshman university students. Medical students at the Federal University of Rio Grande do Norte (UFRN) attend a full-time course with high-level content and, in the first semester, classes begin at 7 a.m. This group is composed of young adults who still experience the delayed sleep phase common in adolescence, which suggests that this class schedule may be inappropriate for their age. The reduction of nocturnal sleep on school days, and the attempt to recover sleep on free days (social jet lag, SJL), suggests that in the first semester students suffer from high sleep pressure, which may affect cognitive tasks and performance. Therefore, the aim of this study was to investigate the relationship between sleep pressure and the academic profile of first-semester medical students at UFRN, characterizing this population socio-demographically and investigating possible impacts on the rest-activity rhythm and on academic performance. A sample of 88 healthy men and women answered the following questionnaires: Pittsburgh Sleep Quality Index (PSQI), Epworth Sleepiness Scale (ESS), Horne & Ostberg chronotype (HO), Munich Chronotype Questionnaire (MCTQ) and an adapted "Health and Sleep" questionnaire. Actigraphy was used for 14 days to produce actograms and obtain non-parametric variables of the rest-activity rhythm, and the grades of the morning classes were used as the measure of academic performance. SJL was used as the measure of sleep pressure, and the statistical significance level was 95%. The population was sociodemographically homogeneous. Most students have a healthy lifestyle, practice physical activity, drive to the university and take between 15 and 30 minutes for this commute. Regarding the sleep-wake cycle, most were classified as intermediate (38.6%) or evening (32%) chronotypes, need to nap during the week, suffer daytime sleepiness and have poor sleep quality. As 83% of the sample had at least 1 h of SJL, we divided it into two groups: SJL < 2 h (N = 44) and SJL ≥ 2 h (N = 44). The groups differed only in chronotype, showing that evening-type individuals have more SJL; no differences were found in sociodemographic variables, rest-activity rhythm or academic performance. The homogeneity of the sample limited the comparison between groups; however, it is alarming that students already present SJL, poor sleep quality and excessive daytime sleepiness in the first semester, problems which may be accentuated over the university years with the arrival of night shifts and increased academic demands. Interventions addressing the importance of good sleep habits, and a change of the class start time, are strategies aimed at improving students' health.
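Social jet lag, as used above, is conventionally derived from MCTQ data as the absolute difference between mid-sleep on free days and mid-sleep on work days; a minimal sketch of that computation (the example times are invented):

```python
def midsleep(onset_h, duration_h):
    """Mid-sleep point in hours after midnight (onset may pass midnight)."""
    return (onset_h + duration_h / 2.0) % 24.0

def social_jetlag(onset_work, dur_work, onset_free, dur_free):
    """Absolute difference between mid-sleep on free days (MSF) and on
    work days (MSW), the usual MCTQ definition of social jet lag."""
    msw = midsleep(onset_work, dur_work)
    msf = midsleep(onset_free, dur_free)
    diff = abs(msf - msw)
    return min(diff, 24.0 - diff)  # shortest distance around the clock

# e.g. sleep 00:30-07:00 on class days and 02:30-11:00 on free days:
print(social_jetlag(0.5, 6.5, 2.5, 8.5))  # -> 3.0 h of social jet lag
```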

Relevance: 10.00%

Abstract:

Hexavalent chromium is a heavy metal present in various industrial effluents and, depending on its concentration, may cause irreparable damage to the environment and to humans. In this context, this study aimed at applying electrochemical methods to determine and remove hexavalent chromium (Cr6+) from simulated wastewater. For determination, cathodic stripping voltammetry (CSV) was applied using an ultra-trace graphite working electrode, an Ag/AgCl reference electrode and a platinum counter electrode; the samples were complexed with 1,5-diphenylcarbazide and then analysed. For the removal of Cr6+, the electrocoagulation (EC) process was applied using Fe and Al electrodes. The variables of the 2⁴ factorial design applied to optimize the EC process were: current density (5 and 10 mA cm⁻²), temperature (25 and 60 °C), concentration (50 and 100 ppm) and agitation rate (400 and 600 rpm). Preliminary tests confirmed that CSV is adequate for determining the Cr6+ removed during the EC process. Both Fe and Al sacrificial anodes gave satisfactory results in the EC process; however, Fe achieved complete removal in 30 min, whereas with Al it took 240 min. Applying the 2⁴ factorial design and response surface methodology made it possible to optimize the EC process for the removal of Cr6+ in H2SO4 solution (0.5 mol L⁻¹), in which temperature, with a positive effect, was the variable with the highest statistical significance compared with the other variables and interactions; in the optimization of the EC process for the removal of Cr6+ in NaCl solution (0.1 mol L⁻¹), current density, with a positive effect, and concentration, with a negative effect, were the variables with the greatest statistical significance. The supporting electrolytes NaCl and Na2SO4 showed no significant differences; however, NaCl produced faster Cr6+ removal kinetics, and increasing the NaCl concentration raised the conductivity of the solution, resulting in lower energy consumption. Evaluation of electrode wear throughout the EC processes showed that Al in H2SO4 solution (0.5 mol L⁻¹) undergoes anodization during the EC process, so its experimental mass loss is lower than the theoretical mass loss, whereas Fe in the same medium showed an experimental mass loss greater than the theoretical estimate, owing to a spontaneous reaction of Fe with H2SO4; when the reaction medium was NaCl or Na2SO4, the experimental mass loss approached the theoretical one. Furthermore, the energy consumption of all the processes involved in this study corresponded to a low operating cost, making the EC process viable for treating industrial effluents. The results were satisfactory: complete removal of Cr6+ was achieved in all the processes used in this study.
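A 2⁴ factorial design such as the one above pairs 16 coded runs with their responses and estimates each factor's main effect as the difference between its mean response at the high and low levels; a sketch with placeholder responses (not the thesis data):

```python
import itertools
import numpy as np

# Coded levels (-1/+1) for the four factors of the 2^4 design:
# current density (5/10 mA cm^-2), temperature (25/60 C),
# concentration (50/100 ppm), agitation (400/600 rpm).
design = np.array(list(itertools.product([-1, 1], repeat=4)))  # 16 runs

# Hypothetical removal efficiencies (%) for the 16 runs (placeholders).
y = np.array([62, 70, 65, 74, 80, 88, 83, 92,
              64, 73, 66, 76, 82, 90, 85, 95], dtype=float)

# Main effect of each factor: mean response at +1 minus mean at -1,
# which for a balanced design equals 2 * (column . y) / n.
effects = 2.0 * design.T @ y / len(y)
for name, e in zip(["current density", "temperature",
                    "concentration", "agitation"], effects):
    print(f"{name:>15s}: {e:+.2f}")
```

Ranking the absolute effects (and, with replicates, testing them against the error variance) is what identifies temperature or current density as the statistically dominant variable, as reported above.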

Relevance: 10.00%

Abstract:

A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span the years 1997 to 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data came either from multi-project archives accessed via open internet services or from individual projects, acquired directly from the data providers. Methodologies were implemented for the homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging observations that were close in time and space, eliminating some points after quality control, and converting to a standard format. The final result is a merged table designed for the validation of satellite-derived ocean-colour products, available in text format. The metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and are available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes; making the metadata available also allows each set of data to be analysed separately.
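The "averaging of observations that were close in time and space" can be illustrated with a small pandas sketch; the column names and the binning thresholds below are assumptions for the example, not the OC-CCI schema:

```python
import pandas as pd

# Illustrative in situ table; column names are assumptions, not OC-CCI's.
obs = pd.DataFrame({
    "time":  pd.to_datetime(["2005-06-01 10:00", "2005-06-01 10:20",
                             "2005-06-01 10:25", "2005-06-02 09:00"]),
    "lat":   [43.37, 43.37, 43.38, 43.37],
    "lon":   [7.90, 7.90, 7.90, 7.90],
    "chl_a": [0.21, 0.23, 0.22, 0.35],
})

# Average observations that fall in the same spatio-temporal bin
# (here: 1-hour windows and ~0.05 degree cells; thresholds assumed).
obs["t_bin"] = obs["time"].dt.floor("1h")
obs["lat_bin"] = (obs["lat"] / 0.05).round()
obs["lon_bin"] = (obs["lon"] / 0.05).round()
merged = (obs.groupby(["t_bin", "lat_bin", "lon_bin"], as_index=False)
             .agg(time=("time", "first"), lat=("lat", "mean"),
                  lon=("lon", "mean"), chl_a=("chl_a", "mean")))
print(merged)
```

In the real compilation, per-row metadata (source, cruise, principal investigator) would be carried through the merge rather than dropped, so each averaged record stays traceable to its origin.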