6 results for Color in textile crafts
at Universidad de Alicante
Abstract:
In this work, different polymeric substrates are impregnated with biocidal agents and with a textile dye, both commonly used in textile finishing processes. The dye Disperse Red 167 (DR167) was selected by comparing the solubility of several disperse dyes in supercritical CO2 (scCO2). The selected biocidal agents were clove essence (eugenol) and oregano essential oil. Different polymeric substrates, polyester (PES), polypropylene (PP) and cotton (CO), were impregnated under different conditions; in total, impregnations were carried out using ten relative concentrations of DR167. The main objective is to determine the optimum processing conditions for each substrate. To determine the dyeing yield in scCO2, chromaticity diagrams of the samples dyed under the different conditions were plotted. The PES samples show the highest color yield, consistent with PES being the only fibre with affinity for DR167. To determine the inhibition of bacterial growth, antimicrobial and fungicidal activity assays were carried out. Some inhibitory activity was observed against certain microorganisms, such as Staphylococcus aureus, whereas no significant inhibitory activity was observed against others, such as Pseudomonas aeruginosa.
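The abstract does not state the color space or measurement conditions used for the chromaticity diagrams; as a minimal sketch, assuming spectrophotometric tristimulus values XYZ under the D65/10-degree conditions, the CIELAB coordinates and chroma (a common proxy for color yield in dyed samples) could be obtained as follows (all numbers are illustrative placeholders):

    import numpy as np

    # D65/10-degree white point (assumption; the study's illuminant/observer is not stated).
    XN, YN, ZN = 94.811, 100.0, 107.304

    def f(t):
        # CIE 1976 companding function used in the XYZ -> L*a*b* conversion
        d = 6.0 / 29.0
        return np.where(t > d ** 3, np.cbrt(t), t / (3 * d ** 2) + 4.0 / 29.0)

    def xyz_to_lab(X, Y, Z):
        fx, fy, fz = f(X / XN), f(Y / YN), f(Z / ZN)
        return 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)

    # Hypothetical tristimulus values of a dyed PES sample
    L, a, b = xyz_to_lab(30.0, 20.0, 15.0)
    C_ab = np.hypot(a, b)  # chroma: distance from the neutral axis in the a*b* diagram
    print(L, a, b, C_ab)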
Abstract:
During grasping and intelligent robotic manipulation tasks, the camera position relative to the scene changes dramatically because the robot moves to adapt its path and grasp objects correctly; this is because the camera is mounted on the robot's end effector. For this reason, in this type of environment a visual recognition system must be implemented that recognizes objects and obtains their positions in the scene automatically and autonomously. Furthermore, in industrial environments, all the objects manipulated by robots are made of the same material and cannot be differentiated by features such as texture or color. In this work, first, a study and analysis of 3D recognition descriptors has been completed for application in these environments. Second, a visual recognition system built on a specific distributed client-server architecture has been proposed for the recognition of industrial objects that lack these appearance features. Our system has been implemented to overcome recognition problems that arise when objects can only be recognized by their geometric shape and the simplicity of those shapes could create ambiguity. Finally, real tests are performed and illustrated to verify the satisfactory performance of the proposed system.
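The specific 3D descriptors evaluated are not named in this abstract; purely as an illustration of a geometry-only descriptor suitable for texture-less, single-color industrial parts, the following sketch computes FPFH features with the Open3D library (the file path and search radii are placeholder assumptions):

    import open3d as o3d

    # Load a point cloud of an industrial part (placeholder path).
    pcd = o3d.io.read_point_cloud("part.ply")

    # Surface normals are required before computing FPFH descriptors.
    pcd.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.01, max_nn=30))

    # Fast Point Feature Histograms: a purely geometric local descriptor,
    # usable when texture and color carry no discriminative information.
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        pcd, o3d.geometry.KDTreeSearchParamHybrid(radius=0.025, max_nn=100))

    print(fpfh.data.shape)  # 33 x N matrix of per-point histograms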
Abstract:
Invited lecture given in March 2011 in the course "Colour in Industry" of the Erasmus Mundus master's programme "Color in Informatics and Media Technology" at the Universidad de Granada.
Abstract:
In the chemical textile domain, experts have to analyse chemical components and substances that might be harmful when used in clothing and textiles. Part of this analysis is performed by searching the opinions and reports people have expressed about these products on the Social Web. However, this type of information is not as frequent on the Internet for this domain as for others, so its detection and classification is difficult and time-consuming. Consequently, problems associated with the use of chemical substances in textiles may not be detected early enough and could lead to health problems such as allergies or burns. In this paper, we propose a framework able to detect, retrieve, and classify subjective sentences related to the chemical textile domain, which could be integrated into a wider health surveillance system. We also describe the creation of several datasets with opinions from this domain, the experiments performed using machine learning techniques and different lexical resources such as WordNet, and the evaluation focusing on sentiment classification and complaint detection (i.e., negativity). Despite the challenges involved in this domain, our approach obtains promising results, with an F-score of 65% for polarity classification and 82% for complaint detection.
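The exact classifiers and lexical features used in the paper are not reproduced here; as a minimal sketch of a supervised polarity/complaint classifier evaluated with an F-score, assuming a small set of labelled sentences (placeholders), scikit-learn could be used as follows:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.svm import LinearSVC
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import f1_score

    # Toy labelled sentences (placeholders); 1 = complaint/negative, 0 = non-negative.
    sentences = [
        "This shirt gave me a rash after one day",
        "The fabric feels great and the color does not fade",
        "The dye caused an allergic reaction on my skin",
        "Very comfortable sportswear, no problems at all",
    ]
    labels = [1, 0, 1, 0]

    X_train, X_test, y_train, y_test = train_test_split(
        sentences, labels, test_size=0.5, random_state=0, stratify=labels)

    # Bag-of-ngrams TF-IDF features feeding a linear SVM.
    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
    clf.fit(X_train, y_train)

    print("F-score:", f1_score(y_test, clf.predict(X_test)))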
Abstract:
From a set of gonioapparent automotive samples from different manufacturers, we selected 28 low-chroma color pairs with relatively small color differences, predominantly in lightness. These color pairs were visually assessed with a gray scale at six different viewing angles by a panel of 10 observers. Using the Standardized Residual Sum of Squares (STRESS) index, the results of our visual experiment were tested against the predictions made by 12 modern color-difference formulas. From a weighted STRESS index accounting for the uncertainty in visual assessments, the best predictions for our whole experiment were achieved by the AUDI2000, CAM02-SCD, CAM02-UCS and OSA-GP-Euclidean color-difference formulas, which were not statistically significantly different from one another. A two-step optimization of the original AUDI2000 color-difference formula resulted in a modified AUDI2000 formula which performed both significantly better than the original formula and below the experimental inter-observer variability. Nevertheless, the proposal of a new revised AUDI2000 color-difference formula requires additional experimental data.
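The STRESS index mentioned above has a standard definition: for computed color differences dE_i and visual differences dV_i, STRESS = 100 * sqrt( sum_i (dE_i - F*dV_i)^2 / sum_i (F^2 * dV_i^2) ), with F = sum_i dE_i^2 / sum_i (dE_i * dV_i); lower values indicate better agreement between a formula and the visual data. A minimal sketch (the numbers are illustrative, not the data of this study):

    import numpy as np

    def stress(dE, dV):
        # Standardized Residual Sum of Squares, in percent.
        dE, dV = np.asarray(dE, float), np.asarray(dV, float)
        F = np.sum(dE ** 2) / np.sum(dE * dV)  # optimal scaling factor
        return 100.0 * np.sqrt(np.sum((dE - F * dV) ** 2) / np.sum(F ** 2 * dV ** 2))

    # Placeholder computed and visual color differences for three sample pairs.
    print(stress([0.8, 1.2, 0.5], [0.9, 1.1, 0.6]))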
Abstract:
This paper illustrates how to design a visual experiment to measure color differences in gonioapparent materials and how to assess the merits of different advanced color-difference formulas in predicting the results of such an experiment. Successful color-difference formulas are necessary for industrial quality control and artificial color-vision applications. A color-difference formula must be accurate under a wide variety of experimental conditions, including the use of challenging materials such as gonioapparent samples. Improving the experimental design of a previous paper [Melgosa et al., Optics Express 22, 3458-3467 (2014)], we have tested 11 advanced color-difference formulas against visual assessments performed by a panel of 11 observers with normal color vision using a set of 56 nearly achromatic color pairs of automotive gonioapparent samples. The best predictions of our experimental results were found for the AUDI2000 color-difference formula, followed by color-difference formulas based on the CIECAM02 color appearance model. Optimizing the parameters of the original lightness weighting function in the AUDI2000 formula yielded small improvements. However, applying a power function to the results provided by the AUDI2000 formula considerably improved them, producing values close to the inter-observer variability in our visual experiment. Additional research is required to obtain a modified AUDI2000 color-difference formula significantly better than the current one.
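The abstract does not give the exact form of the power correction; a minimal sketch, assuming a correction of the form dE' = dE**gamma applied to the AUDI2000 color differences, with gamma chosen to minimize the STRESS index against the visual results (all numbers are placeholders):

    import numpy as np
    from scipy.optimize import minimize_scalar

    def stress(dE, dV):
        # STRESS index in percent (same definition as in the previous sketch).
        F = np.sum(dE ** 2) / np.sum(dE * dV)
        return 100.0 * np.sqrt(np.sum((dE - F * dV) ** 2) / np.sum(F ** 2 * dV ** 2))

    # Placeholder AUDI2000 differences and corresponding visual differences.
    dE_audi = np.array([0.4, 0.9, 1.6, 2.2])
    dV = np.array([0.6, 0.9, 1.3, 1.6])

    # Fit the exponent gamma of dE' = dE**gamma by minimizing STRESS.
    res = minimize_scalar(lambda g: stress(dE_audi ** g, dV),
                          bounds=(0.1, 2.0), method="bounded")
    print("optimal exponent:", res.x, "STRESS:", res.fun)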