13 results for Software visualisation
at Universidad Politécnica de Madrid
Abstract:
Visualisation of program executions has been used in applications which include education and debugging. However, traditional visualisation techniques often fall short of expectations or are altogether inadequate for new programming paradigms, such as Constraint Logic Programming (CLP), whose declarative and operational semantics differ in some crucial ways from those of other paradigms. In particular, traditional ideas regarding the behaviour of data often cannot be lifted in a straightforward way to (C)LP from other families of programming languages. In this chapter we discuss techniques for visualising data evolution in CLP. We briefly review some previously proposed visualisation paradigms, and also propose a number of (to our knowledge) novel ones. The graphical representations have been chosen based on the perceived needs of a programmer trying to analyse the behaviour and characteristics of an execution. In particular, we concentrate on the representation of the run-time values of the variables, and the constraints among them. Given our interest in visualising large executions, we also pay attention to abstraction techniques, i.e., techniques which are intended to help in reducing the complexity of the visual information.
Abstract:
Mobile games are a prime example both of a successful mobile application and of the growing number of platforms for the media and entertainment industries. Exploring this convergence, the article analyses the main characteristics of the mobile games market and of its industrial ecosystem, its main activities and players. The article focuses on the role of the different software platforms and on the future challenges and opportunities for mobile game developers in a new scenario dominated by mobile platforms.
Abstract:
Since the mid-1990s, owing to the potential of the World Wide Web, cartography freed itself from its dependence on a physical medium, enabling access to and visualisation of millions of maps stored in graphical formats over the Internet. In this context, the role of Geographic Information (GI) in daily life became relevant as access to it grew ever easier thanks to the many tools and applications that distribute maps in different formats and bring them closer to society at large. Yet, since that information soon became outdated, a demand for up-to-date information arose from specific fields (security, environment, transport, services, etc.) and from the general public. In response to this demand, the so-called Spatial Data Infrastructure (SDI) initiatives emerged which, through the coordinated action of a set of technologies, standards, and policies, enable users to access over the Internet updated GI produced by official institutions and bodies, within a cooperative framework supported by an organisational structure. The educational world has not remained aloof from this context, as it represents one of the most propitious settings for disseminating the potential and uses of SDI. This thesis proposes the use of SDI in the educational context, specifically in Spanish Compulsory Secondary Education (Educación Secundaria Obligatoria, ESO). Such use implies assigning SDI a role in the teaching-learning process; the theoretical foundations presented here allow us to assert that SDI are an educational resource matching the characteristics of the Information and Communication Technologies (ICT). This is made explicit through a broader concept we have called an "ICT educational resource".
The possibilities offered by SDI for reaching the learning objectives of ESO subjects related to GI are analysed, and contents that can be addressed using them are identified. In addition, following the competence-based model of learning, the possibilities and potential SDI offer for developing digital competence are set out. Once the theoretical framework had been established, two strategies for SDI training and dissemination aimed at ESO teachers were developed. First, using the ADDIE Instructional Design Model, three e-learning courses were designed, developed, implemented and evaluated for ESO teachers of the subjects Social Sciences, Natural Sciences and Technology. Second, to complement the results obtained from the e-learning courses, an activity aimed at disseminating SDI was carried out in two secondary schools. Putting these strategies into practice gave the teaching staff the necessary information about what SDI are and provided concrete examples of their use in each subject, equipping them with the knowledge and information needed to assess the possibilities SDI offer as an ICT educational resource.
Abstract:
Cultural content on the Web is available in various domains (cultural objects, datasets, geospatial data, moving images, scholarly texts and visual resources), concerns various topics, is written in different languages, is targeted at both laymen and experts, and is provided by different communities (libraries, archives, museums and the information industry) and individuals (Figure 1). The integration of information technologies and cultural heritage content on the Web is expected to have an impact on everyday life from the point of view of institutions, communities and individuals. In particular, collaborative environments can recreate 3D navigable worlds that offer new insights into our cultural heritage (Chan 2007). However, the main barrier for end-users of cultural contents, as well as for the organisations and communities managing and producing them, is finding and relating cultural heritage information. In this paper, we explore several visualisation techniques for supporting cultural interfaces, where metadata play an essential role in supporting search and communication among end-users (Figure 2). A conceptual framework was developed to integrate the data, purpose, technology, impact, and form components of a collaborative environment. Our preliminary results show that collaborative environments can help with cultural heritage information sharing and communication tasks because of the way in which they provide a visual context to end-users. They can be regarded as distributed virtual reality systems that offer graphically realised, potentially infinite, digital information landscapes. Moreover, collaborative environments also provide a new way of interaction between an end-user and a cultural heritage data set. Finally, the visualisation of a dataset's metadata plays an important role in helping end-users search for heritage contents on the Web.
Abstract:
The solaR package includes a set of functions to calculate the solar radiation incident on a photovoltaic generator and to simulate the performance of several applications of photovoltaic energy. The package performs the whole calculation procedure, from daily or intradaily global horizontal irradiation to the final productivity of grid-connected PV systems and water-pumping PV systems. The package stands on a set of S4 classes. The core of each class is a group of slots with yearly, monthly, daily and intradaily multivariate time series (built with the zoo package). The classes share a variety of methods to access the information (for example, as.zooD provides a zoo object with the daily multivariate time series of the corresponding object) and several visualisation methods based on the lattice and latticeExtra packages.
Abstract:
Some floating-liquid-zone experiments performed under reduced-gravity conditions are reviewed. Several types of instabilities are discussed, together with the relevant parameters controlling them. It is shown that the bounding values of these parameters could be increased, by orders of magnitude in several instances, by selecting appropriate liquids. Two of the many problems that a Fluid-Physics Module, devised to perform experiments on floating zones in a space laboratory, would involve are discussed: namely (i) procedures for disturbing the zone under controlled conditions, and (ii) visualisation of the inner flow pattern. Several topics connected with the non-isothermal nature and the phase changes of floating zones are presented. In particular, a mode of propagation through the liquid zone is suggested for disturbances that could appear at the melting solid/liquid interface. Although most research on floating liquid zones is aimed at improving the crystal-growth process, some additional applications are suggested.
Abstract:
This introduction gives a general perspective of the debugging methodology and the tools developed in the ESPRIT IV project DiSCiPl, Debugging Systems for Constraint Programming. It has been prepared by the editors of this volume by substantially rewriting the DiSCiPl deliverable CP Debugging Tools [1]. This introduction is organised as follows. Section 1 outlines the DiSCiPl view of debugging and its associated debugging methodology, and motivates the kinds of tools proposed: the assertion-based tools, the declarative diagnoser and the visualisation tools. Sections 2 through 4 provide a short presentation of the tools of each kind. Finally, Section 5 presents a summary of the tools developed in the project. This introduction gives only a general view of the DiSCiPl debugging methodology and tools. For details and for specific bibliographic references the reader is referred to the subsequent chapters.
Abstract:
The arrangement of atoms at the surface of a solid accounts for many of its properties: hardness, chemical activity, corrosion, etc. are dictated by the precise surface structure. Hence, finding it has a broad range of technical and industrial applications. The ability to solve this problem opens the possibility of designing, by computer, materials with properties tailored to specific applications. Since the search space grows exponentially with the number of atoms, a solution cannot be achieved for arbitrarily large structures. At present, a trial-and-error procedure is used: an expert proposes a structure as a candidate solution and runs a local optimisation procedure on it. The solution relaxes to the local minimum in the attractor basin corresponding to the initial point, which may or may not be the global minimum. This procedure is very time-consuming and, for reasonably sized surfaces, can take many iterations and much effort from the expert. Here we report on a visualisation environment designed to steer this process in an attempt to solve bigger structures and reduce the time needed. The idea is to use an immersive environment to interact with the computation. It gives immediate feedback on the quality of the proposed structure so that the expert can explore the space of candidate solutions. The visualisation environment is also able to communicate with the de facto standard local solver used for this problem. The user can then send trial structures to the local minimiser and track their progress as they approach the minimum. This allows simultaneous testing of candidate structures. The system has also proved very useful as an educational tool for the field.
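The dependence of the relaxed structure on the starting point (each initial guess falls into the attractor basin of one local minimum) can be sketched with a toy one-dimensional energy landscape; the double-well function and the plain gradient descent below are illustrative only, not the actual surface-structure solver:

```python
def relax(x0, step=0.05, iters=500):
    """Gradient-descent relaxation on the double-well energy E(x) = (x^2 - 1)^2.

    The two local minima sit at x = -1 and x = +1; which one is reached
    depends only on the attractor basin containing the starting point x0.
    """
    x = x0
    for _ in range(iters):
        grad = 4 * x * (x * x - 1)  # dE/dx
        x -= step * grad
    return x

# Starting points in different basins relax to different minima:
left, right = relax(-0.5), relax(0.5)
```

In this caricature, the expert's role is exactly choosing x0; the steering environment described above lets that choice be made interactively while the minimiser runs.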
Abstract:
The main objective of this work is to help answer the question: to what extent can wind tunnel testing determine the flow characteristics that affect the dynamic response of wind turbines operating in highly complex terrain? This question is not new; indeed, the debate in the scientific community opened in the first third of the past century and is still intensely alive. The accepted approach to this problem is to analyse a given case study in which full-scale tests, computational modelling and wind tunnel testing are applied to the same topography. This is neither easy nor cheap, which is why, after the Askervein experiment in 1988, the atmospheric flow modelling community had to wait until 2007 for the Bolund experiment to be set up with an equivalent deployment of technical means (considering the evolution of sensor and computing technologies). The problem is so manifold that both campaigns were restricted to neutral atmospheric conditions with negligible Coriolis effects in order to reduce complexity. This is the framework in which this PhD has been carried out.
The flow topology over the Bolund island has been studied by replicating the Bolund experiment in the IDR A9 and ACLA16 wind tunnels. Two mock-ups of the island were manufactured at scales of 1:230 and 1:115. The inflow in the empty wind tunnel, simulating the incoming atmospheric boundary layer, was in the transitionally rough regime and was used as the reference case. The 1:230 model was tested in the A9 wind tunnel to measure surface pressure. The mapping of the pressure coefficient across the island gave a visualisation and an estimate of a detachment region on top of the escarpment at the front of the island. Time-resolved instantaneous pressure measurements revealed the unsteadiness of the detachment region. The 1:115 model was tested using three-component hot-wire anemometry (3C HW) and two-component Particle Image Velocimetry (2C PIV). Measurements at met masts M3, M6, M7 and M8 and along Line 270° were taken to replicate the results of the Bolund experiment. The flow was characterised by the speed-up ratio, the normalised increment of turbulent kinetic energy, the inclination angle and the turning angle. Results along Line 270° at heights of 2 m and 5 m compared very well with the full-scale results of the Bolund experiment, and vertical profiles at the met masts showed significant agreement with the full-scale results. The analysis of the Reynolds stresses and the spectral analysis at the met mast locations showed varied levels of agreement at some locations and clear mismatches at others.
The horizontal mapping of the flow field for a 270° wind direction made it possible to characterise the behaviour of the intermittent recirculation bubble on top of the front escarpment, followed by a relaxation region and a shear layer on the lee side of the island. Further detailed velocity measurements were taken on cross-flow planes over the island, with high spatial resolution, to study its flow structures. A longitudinal vortex-like structure with high mean velocity gradients and high turbulent kinetic energy was characterised on the escarpment and traced as it evolved downstream. This flow structure is a challenge for numerical models and poses a threat to wind farm designers when siting wind turbines. Spatial distributions of Reynolds stresses obtained from the 3C HW and PIV measurements are presented. Such quantities are not common outputs of wind tunnel tests over topographies, and they are very useful to modellers using large eddy simulation (LES). An interpretation of the wind tunnel results in terms of their usefulness to wind farm designers is given. The evolution and variation of the flow parameters along measurement lines, planes and surfaces indicate how these flow properties could affect wind turbine siting and the classification of sites. The results presented suggest, under certain conditions, the robustness of wind tunnel testing for studying flow topology over complex terrain and its comparability with other simulation techniques, especially given the level of agreement between the data sets presented and the full-scale results. Additionally, some of the flow parameters obtained from the wind tunnel measurements would be quite difficult to measure at full scale or to obtain by computational means, considering the state of the art.
This work was carried out as part of the activities supported by the European Commission within the FP7-PEOPLE-ITN-2008 WAUDIT project (Wind Resource Assessment Audit and Standardization) of the FP7 Marie Curie Initial Training Network, and by the Spanish Ministerio de Economía y Competitividad within the framework of the ENE2012-36473 TURCO project (Determination of the Spatial Distribution of Statistic Parameters of Flow Turbulence over Complex Topographies in Wind Tunnel) of the Spanish National Programme of Research (Subprograma de investigación fundamental no orientada 2012). The report is organised in seven chapters and a collection of annexes. Chapter one introduces the problem. Chapter two describes the experimental setup. Chapter three analyses in detail the inflow conditions of the main wind tunnel used in this research. Chapter four presents surface pressure test results on a model of the island. The main results of the Bolund experiment are replicated in chapter five. Chapter six identifies specific flow structures over the island and, finally, chapter seven gathers the conclusions and proposes lines of future work.
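The flow descriptors named in the abstract (speed-up ratio, normalised TKE increment, inclination angle) have simple definitional forms. As a minimal sketch, assuming `u_site` and `u_ref` are mean speeds at the same height above ground and the variances are those of the three velocity components; note that conventions vary between publications (a fractional speed-up (u_site − u_ref)/u_ref is also common):

```python
import math

def tke(var_u, var_v, var_w):
    """Turbulent kinetic energy from the three velocity-component variances."""
    return 0.5 * (var_u + var_v + var_w)

def speed_up(u_site, u_ref):
    """Speed-up ratio: mean speed over the island divided by the
    undisturbed inflow speed at the same height above ground."""
    return u_site / u_ref

def tke_increment(k_site, k_ref, u_ref):
    """Increment of turbulent kinetic energy, normalised by the
    reference mean speed squared."""
    return (k_site - k_ref) / u_ref ** 2

def inclination_angle(w_mean, horiz_mean):
    """Flow inclination (degrees) from mean vertical and horizontal speeds."""
    return math.degrees(math.atan2(w_mean, horiz_mean))
```

These are the quantities plotted along Line 270° and at the met mast locations when comparing tunnel and full-scale data.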
Abstract:
Recent advances in non-destructive imaging techniques, such as X-ray computed tomography (CT), make it possible to analyse pore space features from the direct visualisation of soil structures. A quantitative characterisation of the three-dimensional solid-pore architecture is important for understanding soil mechanics, as it relates to the control of biological, chemical, and physical processes across scales. This analysis technique therefore offers an opportunity to better interpret soil strata, as new and relevant information can be obtained. In this work, we propose an approach to automatically identify the pore structure of a set of 200 2D images that represent slices of an original 3D CT image of a soil sample, accomplished through non-linear enhancement of the pixel grey levels and an image segmentation based on the PFCM (Possibilistic Fuzzy C-Means) algorithm. Once the solids and pore spaces have been identified, the set of 200 2D images is used to reconstruct an approximation of the soil sample by projecting only the pore spaces. This reconstruction shows the structure of the soil and its pores, which become more bounded, less bounded, or unbounded with changes in depth. If the soil sample image quality is sufficiently favourable in terms of contrast, noise and sharpness, pore identification is less complicated and the PFCM clustering algorithm can be used without additional processing; otherwise, the images require pre-processing before using this algorithm. Promising results were obtained with four soil samples: the first was used to show the validity of the algorithm and the other three to demonstrate the robustness of our proposal. The methodology we present here can better detect the solid soil and pore spaces in CT images, enabling the generation of better 2D/3D representations of pore structures from segmented 2D images.
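As an illustration of the clustering step, here is a minimal grey-level segmentation with standard fuzzy c-means; it is a simpler stand-in for the possibilistic variant (PFCM) actually used in the paper, and the tiny synthetic "slice" is hypothetical:

```python
import numpy as np

def fcm_segment(grey, c=2, m=2.0, iters=50, seed=0):
    """Cluster pixel grey levels into c fuzzy classes (standard FCM, 1-D feature)."""
    rng = np.random.default_rng(seed)
    x = grey.ravel().astype(float)
    u = rng.random((x.size, c))            # membership matrix, rows sum to 1
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(iters):
        um = u ** m
        centres = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
        d = np.abs(x[:, None] - centres[None, :]) + 1e-12   # avoid div by zero
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)            # standard FCM update
    return u.argmax(axis=1).reshape(grey.shape), np.sort(centres)

# Toy "slice": dark pore pixels vs bright solid pixels.
slice_ = np.array([[ 10,  20, 200],
                   [205,  15, 210],
                   [ 12, 198,  18]])
labels, centres = fcm_segment(slice_)
```

Running this on every slice of the stack, and keeping only the pore-class pixels, gives the binary volume from which the 3D pore reconstruction is projected.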
Abstract:
In current industrial environments there is an increasing need for practical and inexpensive quality control systems to detect foreign materials in powder food processing lines. This demand is especially important for the detection of product adulteration with traces of highly allergenic products, such as peanuts and tree nuts. Manufacturing industries dealing with the processing of multiple powder food products present a substantial risk of contamination of powder foods with traces of tree nuts and other adulterants, which might result in unintentional ingestion of nuts by the sensitised population. Hence, an in-line system to detect nut traces at the early stages of food manufacturing is of crucial importance. In the present work, a feasibility study of a spectral index for revealing adulteration of wheat flour samples with tree nut and peanut traces using hyperspectral images is reported. The main nuts responsible for allergenic reactions considered in this work were peanut, hazelnut and walnut. Enhanced contrast between nuts and wheat flour was obtained after the application of the index. Furthermore, segmenting these images by selecting different thresholds for different nut and flour mixtures allowed the identification of nut traces in the samples. Pixels identified as nuts were counted and compared with the actual percentage of peanut adulteration. As a result, the multispectral system was able to detect and provide good visualisation of tree nut and peanut trace levels down to 0.01% by weight. In this context, multispectral imaging could operate in conjunction with chemical procedures, such as Real-Time Polymerase Chain Reaction and Enzyme-Linked Immunosorbent Assay, to save time, money and skilled labour in product quality control. This approach could enable not only a few selected samples to be assessed, but also quality control surveillance to be incorporated extensively on product processing lines.
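The index-then-threshold procedure described above can be sketched as follows; the normalised-difference form, the band numbers and the toy scene are assumptions for illustration (the paper's actual index and wavelengths are not reproduced here):

```python
import numpy as np

def band_index(cube, band_a, band_b):
    """Normalised-difference index between two bands of a hyperspectral
    cube shaped (height, width, bands), to enhance nut/flour contrast."""
    a = cube[..., band_a].astype(float)
    b = cube[..., band_b].astype(float)
    return (a - b) / (a + b + 1e-9)

def nut_pixel_fraction(cube, band_a, band_b, threshold):
    """Threshold the index image; return the fraction of flagged pixels
    and the binary detection map."""
    mask = band_index(cube, band_a, band_b) > threshold
    return mask.mean(), mask

# Toy 4x4 scene with two "nut" pixels whose band-0 reflectance is elevated.
cube = np.ones((4, 4, 2))
cube[0, 0, 0] = 3.0
cube[2, 3, 0] = 3.0
fraction, mask = nut_pixel_fraction(cube, 0, 1, threshold=0.25)
```

Counting the flagged pixels, as in `fraction`, is what is compared against the known adulteration percentage of each mixture.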
Abstract:
In current industrial environments there is an increasing need for practical and inexpensive quality control systems to detect foreign materials in powder food processing lines. This demand is especially important for the detection of product adulteration with traces of highly allergenic products, such as peanuts and tree nuts. Manufacturing industries dealing with the processing of multiple powder food products present a substantial risk of contamination of powder foods with traces of tree nuts and other adulterants, which might result in unintentional ingestion of nuts by the sensitised population. Hence, an in-line system to detect nut traces at the early stages of food manufacturing is of crucial importance. In the present work, a feasibility study of a spectral index for revealing adulteration of wheat flour samples with tree nut and peanut traces using hyperspectral images is reported. The main nuts responsible for allergenic reactions considered in this work were peanut, hazelnut and walnut. Enhanced contrast between nuts and wheat flour was obtained after the application of the index. Furthermore, segmenting these images by selecting different thresholds for different nut and flour mixtures allowed the identification of nut traces in the samples. Pixels identified as nuts were counted and compared with the actual percentage of peanut adulteration. As a result, the multispectral system was able to detect and provide good visualisation of tree nut and peanut trace levels down to 0.01% by weight. In this context, multispectral imaging could operate in conjunction with chemical procedures, such as Real-Time Polymerase Chain Reaction and Enzyme-Linked Immunosorbent Assay, to save time, money and skilled labour in product quality control. This approach could enable not only a few selected samples to be assessed, but also quality control surveillance to be incorporated extensively on product processing lines.
Abstract:
This thesis is the result of a project whose objective has been to develop and deploy a dashboard for sentiment analysis of football on Twitter, based on web components and D3.js. To this end, a visualisation server has been developed to present the data obtained from Twitter and analysed with Senpy. This visualisation server has been built with Polymer web components and D3.js. Data mining has been done with a pipeline between Twitter, Senpy and ElasticSearch. Luigi has been used in this process because it helps build complex pipelines of batch jobs; it has analysed all the tweets and stored them in ElasticSearch. Next, D3.js has been used to create interactive widgets that make the data easily accessible; these widgets allow users to interact with them and filter the data most interesting to them. Polymer web components have been used to build the dashboard according to Google's Material Design and to show dynamic data in the widgets. As a result, this project allows an extensive analysis of the social network, pointing out the influence of players and teams and the emotions and sentiments that emerge over a period of time.
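The fetch → analyse → index pipeline described above can be sketched, independently of any framework, as three chained stages; all three stage bodies are stand-ins (real code would call the Twitter API, Senpy and ElasticSearch), and the toy word-list sentiment rule is purely illustrative:

```python
def fetch_tweets(query):
    # Stand-in for the Twitter API client: return raw tweet texts.
    return ["great match!", "terrible defending", "what a goal"]

def analyse(tweets):
    # Stand-in for Senpy: attach a naive polarity label to each tweet.
    positive_words = {"great", "goal"}
    return [{"text": t,
             "sentiment": "positive" if positive_words & set(t.strip("!").split())
                          else "negative"}
            for t in tweets]

def index(docs, store):
    # Stand-in for ElasticSearch bulk indexing: store documents by id.
    for i, doc in enumerate(docs):
        store[i] = doc
    return store

store = index(analyse(fetch_tweets("football")), {})
```

In the actual project each stage is wrapped as a Luigi task, so the scheduler can rerun only the stages whose outputs are missing; the dashboard widgets then query the indexed documents.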