29 results for On-line instruments
Abstract:
In this project, images of rock fragments of different grain sizes, corresponding to a series of blasts from the same quarry, have been analyzed. Each blast is documented by 20 images, which were processed with the Split Desktop software, version 3.1. The rock fragments in each image were delineated with the software, yielding the grading curve of that image. Once the curves for all the images of a blast are computed, their mean curve is calculated and taken as the representative grading curve of the blast. The different modes offered by the software, manual, online and automatic, were used to analyze the images, and their results were then compared. The results are presented through a series of graphs and tables that are explained in detail to aid the understanding of the study. From these results it can be concluded that the online and automatic image processing modes of Split lead to the same result, since a statistical study found no significant difference between them. The manual mode, by contrast, differs from the other two, and it is not possible to assert which of the two is better: the manual mode depends on the skill and dedication of the operator who works on the images, while the online mode depends on the software settings, so both carry uncertainties that are difficult to resolve.
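The averaging step described above can be sketched in a few lines. This is an illustrative Python fragment, not Split Desktop's actual processing: the sieve sizes and percent-passing values are made up, and the X50 interpolation is a generic linear one.

```python
# Hypothetical sketch of averaging per-image grading curves and interpolating
# the median fragment size X50. Sizes and percentages are illustrative only.

def mean_curve(curves):
    """Point-wise mean of several grading curves (lists of % passing)."""
    n = len(curves)
    return [sum(c[i] for c in curves) / n for i in range(len(curves[0]))]

def x50(sizes_mm, passing):
    """Linearly interpolate the size at which 50% of the material passes."""
    for i in range(1, len(passing)):
        if passing[i] >= 50.0:
            p0, p1 = passing[i - 1], passing[i]
            s0, s1 = sizes_mm[i - 1], sizes_mm[i]
            return s0 + (50.0 - p0) * (s1 - s0) / (p1 - p0)
    return sizes_mm[-1]

sizes = [10, 50, 100, 200, 400]        # sieve sizes, mm (made up)
img1  = [5, 30, 55, 80, 100]           # % passing, image 1
img2  = [15, 40, 65, 90, 100]          # % passing, image 2
blast = mean_curve([img1, img2])       # mean curve of the blast
print(blast)                           # [10.0, 35.0, 60.0, 85.0, 100.0]
print(x50(sizes, blast))               # 80.0
```

In the real study each blast contributes 20 such curves, and the statistical comparison between modes is made on the resulting mean curves.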
Abstract:
This article presents a new automatic evaluation system for on-line graphics, its application, and the numerous advantages achieved by applying this correction method. The software application, developed by the Innovation in Education Group “E4” of the Technical University of Madrid, is oriented toward the online self-assessment of the graphic drawings that students carry out as continuous training. The adaptation to the European Higher Education Area is an important opportunity to research the possibilities of on-line assessment. To this end, a new software tool has been developed for continuous self-testing by undergraduates, with which the graphical answers of the students can be evaluated. The drawings made on-line by students are thus automatically corrected according to their geometry (straight lines, sloping lines or second-order curves) and their sizes (the specific values which define the graphics).
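One plausible way to correct a drawn answer "by geometry and by sizes", as a hedged sketch rather than the E4 group's actual algorithm, is to fit a straight line to the sketched points and compare the fitted coefficients with the expected ones within a tolerance:

```python
# Illustrative sketch only: grade a drawn answer by least-squares fitting a
# straight line y = a + b*x and checking slope/intercept against the expected
# values. Tolerances and point data are hypothetical.

def fit_line(pts):
    n = len(pts)
    sx = sum(x for x, _ in pts); sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts); sxy = sum(x * y for x, y in pts)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
    a = (sy - b * sx) / n                           # intercept
    return a, b

def is_correct(pts, a_exp, b_exp, tol=0.1):
    """True if the drawn line matches the expected geometry and size."""
    a, b = fit_line(pts)
    return abs(a - a_exp) <= tol and abs(b - b_exp) <= tol

answer = [(0, 1.02), (1, 2.98), (2, 5.01), (3, 7.0)]   # student's strokes
print(is_correct(answer, a_exp=1.0, b_exp=2.0))        # True: y = 1 + 2x
```

Second-order curves would be handled analogously, fitting a quadratic and comparing its coefficients.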
Abstract:
This paper presents the objectives and the activities carried out in the on-line continuing education area of the Sociedad Española de Microbiología (SEM).
New On-Line Excitation-System Ground Fault Location Method Tested in a 106 MVA Synchronous Generator
Abstract:
In this paper, a novel excitation-system ground-fault location method is described and tested on a 106 MVA synchronous machine. In this unit, numerous rotor ground-fault trips always took place about an hour after synchronization to the network; however, when the field-winding insulation was checked after the trips, no failure was found. The data suggested that the faults in the rotor were caused by centrifugal forces and temperature. Unexpectedly, by applying this new method, the failure was located in a cable between the excitation transformer and the automatic voltage regulator. In addition, several intentional ground faults with different fault-resistance values were applied along the field winding in order to test the accuracy of the method in locating defects in the rotor windings of large generators. The new on-line rotor ground-fault detection algorithm has thus been tested in high-power synchronous generators with satisfactory results.
Abstract:
The banking industry is watching new competitors threaten its long-established business model by targeting unbanked people, offering new financial services to its customer base, and even enabling new channels for existing services and customers. Knowledge about users, their behaviour and their expectations becomes a key asset in this new context. Well aware of this situation, the Center for Open Middleware, a joint technology center created by Santander Bank and Universidad Politécnica de Madrid, has launched a set of initiatives to allow the experimental analysis and management of socio-economic information. One of them is the PosdataP2P service, which seeks to model the economic ties between the holders of university smart cards, leveraging the social networks the holders are subscribed to. In this paper we describe the design principles guiding the development of the system, its architecture, and some implementation details.
Abstract:
Previous studies carried out at the Cofrentes Nuclear Power Plant (Valencia) have shown that the microorganisms present in the radioactive water of the spent nuclear fuel storage pools are able to colonize the metallic surfaces of the walls and pipes and to form biofilms on them. These biofilms retain the radionuclides in the water, thus contributing to its decontamination. In this project, a pilot plant for the biodecontamination of the radioactive water has been designed. At present, the radioactive water from the fuel pools is passed through ion-exchange resins that must later be managed as radioactive waste. In this project, the water is instead passed through a bioreactor containing stainless steel wool balls that can be colonized by the microorganisms present in the water. As the water passes through the bioreactor, it comes into contact with the material of the balls and a biofilm forms that retains the radioisotopes present in the water. The biofilm is easily removed by any conventional radiochemical decontamination procedure, and the radionuclides can be concentrated in a small volume of eluent for recovery, final disposal or containment. The bioreactor material can then be managed as non-radioactive material.
Abstract:
The existence of microorganisms in the spent nuclear fuel storage pools of nuclear power plants has recently been demonstrated using conventional laboratory culture techniques. Subsequent studies have revealed that these microorganisms are able to colonize the stainless steel walls of the pools, forming biofilms. Additionally, the ability of these biofilms to retain radionuclides has been observed, which suggests the possibility of using them to decontaminate the radioactive water of the pools. This thesis seeks a deeper knowledge of the microbial biodiversity of the biofilms by using molecular biology techniques such as cloning, and develops a decontamination system at pilot scale in order to assess whether the process could be scaled up to an industrial level. To this end, two stainless steel bioreactors were designed and manufactured, both compatible with the specific seismic-safety and radiation-protection requirements of the controlled zone of a nuclear power plant. The bioreactors were installed in the Cofrentes Nuclear Power Plant (Valencia), next to the spent nuclear fuel storage pools and upstream of the ion-exchange resins, so that they receive water directly from the pools and allow in situ analysis of the radiation removed from it. An ultraviolet lamp was connected to one of the bioreactors in order to compare biofilm development and radioactivity retention under both conditions. Stainless steel and titanium balls were introduced into the bioreactors and removed after different time periods, up to 635 days for the stainless steel balls and up to 309 days for the titanium ones. The biofilms developed on the balls were analyzed by scanning electron microscopy and epifluorescence microscopy. DNA was extracted from the biofilms and, after cloning, the microorganisms were identified by culture-independent techniques. The ability of the biofilms to retain radionuclides was also determined by gamma spectrometry. The radioresistant microorganisms identified belong to the phylogenetic groups Alpha-proteobacteria, Gamma-proteobacteria, Actinobacteria, Deinococcus-Thermus and Bacteroidetes. The sequences of these microorganisms have been deposited in GenBank under accession numbers KR817260-KR817405. A slightly different percentage distribution of microorganisms was observed depending on the type of bioreactor. The biofilms have essentially retained activation radionuclides; the sum of Co-60 and Mn-54 has at times reached 97%. Cr-51, Co-58, Fe-59, Zn-65 and Zr-95 have also been retained. A mechanism for the radionuclide retention process, related to the formation and disappearance times of the biofilms, is suggested. It has been assessed that the scaled-up process can be economically profitable.
Abstract:
The engineering design of fission chambers as on-line radiation detectors for IFMIF is being performed in the framework of the IFMIF-EVEDA work. In this paper, the results of the experiments performed in the BR2 reactor during phase 2 of the foreseen validation activities are addressed. Two detectors have been tested in a mixed neutron-gamma field with high neutron fluence and gamma absorbed-dose rates, comparable to the values expected in the HFTM in IFMIF. Since the neutron spectra in all BR2 channels are dominated by the thermal neutron component, the detectors were surrounded by a cylindrical gadolinium screen to cut the thermal component, in order to obtain a more representative test of IFMIF conditions. The integrated gamma absorbed dose was about 4 × 10^10 Gy and the fast neutron fluence (E > 0.1 MeV) about 4 × 10^20 n/cm². The fission chambers were calibrated in three BR2 channels with different neutron-to-gamma ratios, and the long-term evolution of the signals was studied and compared with theoretical calculations.
Abstract:
Laboratory practice is a very important part of training in all educational programs. Despite this importance, setting up a laboratory is not an easy task, since equipping it can involve a large expense, both initially and afterwards. Distance education arises as a solution, and in particular virtual laboratories, that is, simulations of a real laboratory using mathematical models. Because of their features and flexibility, virtual laboratories have been developed for teaching, but not all areas have as many possibilities or facilities as electronics. Most of the laboratories currently accessible from the Internet within distance or on-line education are virtual. The main advantage of the laboratory developed here is that practical exercises are carried out by remotely controlling real instruments and circuits. The project consists of building a software system that implements a remote laboratory in the area of analog electronics, which can be used as a complement to the training activities carried out in the laboratories of educational centers. The complete system also comprises hardware controlled through standard communication buses, which allows different analog circuits to be implemented so that the exercises operate on real physical circuits. To make the laboratory as realistic as possible, the application handled by the student is a 3D viewer, intended to increase the sense of reality when carrying out laboratory exercises remotely. The developed system uses a communication scheme based on a client-server model: • Server: processes the actions performed by the client, and controls and monitors the instruments and devices of the hardware system. • Client: the end user who, through the 3D viewer, sends the actions to be performed to the server so that it can process them.
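The client-server exchange described above can be sketched with plain sockets. This is a minimal, self-contained illustration, not the project's real protocol: the command name "SET_VOLTAGE" and the echo-style reply are invented for the example.

```python
# Minimal client-server sketch of the remote-lab communication model.
# A real server would translate the command into instrument-bus operations;
# here it simply acknowledges. Command names are hypothetical.
import socket
import threading

def server(sock):
    conn, _ = sock.accept()
    with conn:
        cmd = conn.recv(1024).decode()        # action sent by the client
        conn.sendall(f"OK {cmd}".encode())    # acknowledge the action

srv = socket.socket()
srv.bind(("127.0.0.1", 0))                    # any free local port
srv.listen(1)
threading.Thread(target=server, args=(srv,), daemon=True).start()

cli = socket.create_connection(srv.getsockname())
cli.sendall(b"SET_VOLTAGE 5.0")               # what the 3D viewer would send
reply = cli.recv(1024).decode()
cli.close(); srv.close()
print(reply)                                  # OK SET_VOLTAGE 5.0
```

In the actual system the server side additionally drives the standard communication buses that configure the analog circuits under test.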
Abstract:
The objective of this paper is to design a path-following control system for a car-like mobile robot using classical linear control techniques, so that it adapts on-line to varying conditions during the trajectory-following task. The main advantage of the proposed control structure is that well-known linear control theory can be applied to calculate the PID controllers that fulfil the control requirements, while at the same time it is flexible enough to be applied under the non-linear, changing conditions of the path-following task. For this purpose, the Frenet-frame kinematic model of the robot is linearised at a varying working point that is calculated as a function of the actual velocity, the path curvature and the kinematic parameters of the robot, yielding a transfer function that varies along the trajectory. The proposed controller is formed by a combination of an adaptive PID and a feed-forward controller, which varies according to the working conditions and compensates the non-linearity of the system. The good features and flexibility of the proposed control structure have been demonstrated through realistic simulations that include both the kinematics and the dynamics of the car-like robot.
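The gain-scheduled PID plus feed-forward idea can be sketched as follows. The scheduling formulas and gains below are illustrative placeholders, not the paper's actual tuning, which is derived from the linearised Frenet-frame transfer function:

```python
# Hedged sketch of an adaptive (gain-scheduled) PID with a curvature
# feed-forward term. Gains are recomputed each step from the working point
# (velocity v, path curvature kappa); the formulas are made up for clarity.

class AdaptivePID:
    def __init__(self, dt):
        self.dt, self.integ, self.prev = dt, 0.0, 0.0

    def gains(self, v, kappa):
        kp = 2.0 / max(v, 0.1)         # illustrative: softer at high speed
        ki, kd = 0.1 * kp, 0.05 * kp   # tied to kp in this sketch
        return kp, ki, kd

    def step(self, err, v, kappa):
        kp, ki, kd = self.gains(v, kappa)
        self.integ += err * self.dt
        deriv = (err - self.prev) / self.dt
        self.prev = err
        feedforward = kappa * v        # anticipate the known path curvature
        return kp * err + ki * self.integ + kd * deriv + feedforward

pid = AdaptivePID(dt=0.01)
u = pid.step(err=0.2, v=1.0, kappa=0.0)   # steering command for this step
```

The feed-forward term plays the compensating role described in the abstract: it supplies the steering needed to follow the nominal curvature, leaving the PID to correct only the residual lateral error.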
Abstract:
Different parameters are used to quantify the maturity of fruits at or near harvest (shape, color, flesh texture and internal composition). Flesh firmness is a critical handling parameter for fruits such as peach, pear and apple. Results of previous studies conducted by different researchers have shown that impact techniques can be used to evaluate the firmness of fruits. A prototype impact system for firmness sorting of fruits was developed by Chen and Ruiz-Altisent (Chen et al., 1996). This sensor was mounted and tested successfully on a 3 m section of a commercial conveyor belt (Chen et al., 1998). This work is a further development of the on-line impact system for firmness sorting of fruits: the design of the sensor has been improved and it has been mounted on an experimental fruit packing line (Ortiz-Cañavate et al., 1999).
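The sorting principle, that a firmer fruit produces a higher impact-acceleration peak, can be illustrated with a toy classifier. The thresholds and signal values below are invented for the example and are not the prototype's calibration:

```python
# Illustrative sketch of impact-based firmness sorting: threshold the peak
# acceleration of the impact-sensor signal. Limits (in g) are hypothetical.

def firmness_class(accel_signal, soft_limit=40.0, firm_limit=80.0):
    peak = max(accel_signal)       # peak acceleration of the impact
    if peak >= firm_limit:
        return "firm"
    if peak >= soft_limit:
        return "medium"
    return "soft"

print(firmness_class([0.0, 12.5, 55.3, 91.0, 30.2]))   # firm
print(firmness_class([0.0, 10.1, 33.0, 21.7]))         # soft
```

On a packing line, each fruit's signal would be captured as it passes the sensor and the class used to route it to the corresponding outlet.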
Abstract:
In recent years, the increasing sophistication of embedded multimedia systems and wireless communication technologies has promoted the widespread use of video streaming applications. It was reported in 2013 that youngsters aged between 13 and 24 spend around 16.7 hours a week watching online video through social media, business websites and video streaming sites; video applications have already been blended into people's daily lives. Traditionally, video streaming research has focused on performance improvement, namely throughput increase and response-time reduction. However, most mobile devices are battery-powered, and battery technology improves at a much slower pace than either multimedia or hardware developments. Since battery developments cannot satisfy the expanding power demands of mobile devices, research on video application technology has turned its attention to energy-efficient designs, and how to efficiently use the limited battery energy budget has become a major research challenge. In addition, next-generation video standards push toward diversification and personalization, so it is desirable to have mechanisms that implement energy optimizations with greater flexibility and scalability. In this context, the main goal of this dissertation is to find an energy management and optimization mechanism that reduces the energy consumption of video decoders, based on the idea of functional-oriented reconfiguration. System battery life is prolonged as the result of a trade-off between energy consumption and video quality. Functional-oriented reconfiguration takes advantage of the similarities among standards to build video decoders by reconnecting existing functional units. If a feedback channel from the decoder to the encoder is available, the former can signal to the latter changes in either the encoding parameters or the encoding algorithms for energy-saving adaptation.
The proposed energy optimization and management mechanism operates at the decoder end. It consists of an energy-aware manager, implemented as an additional block of the reconfiguration engine; an energy estimator, integrated into the decoder; and, if available, a feedback channel connected to the encoder end. The energy-aware manager checks the battery level, selects the new decoder description and signals the reconfiguration engine to build a new decoder. It is worth noting that the analysis of energy consumption is fundamental to the success of the mechanism. In this thesis, an energy estimation method driven by platform event monitoring is proposed. In addition, an event filter is suggested to automate the selection of the events that most affect the energy consumption. Finally, a detailed study of the influence of the training data on the model accuracy is presented. The modeling methodology of the energy estimator has been evaluated on different underlying platforms, single-core and multi-core, with different workload characteristics. All the results show good accuracy and low on-line computation overhead. The modifications required in the reconfiguration engine to implement the energy-aware manager have been assessed under different scenarios, and the results indicate that the battery lifetime of the system can be lengthened in two different use cases.
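An event-driven energy estimator of the kind described above is typically a linear model over hardware-event counts, with weights fitted off-line on training runs. The sketch below illustrates only the on-line estimation step; the event names and weights are invented placeholders, not the thesis's fitted model:

```python
# Hedged sketch of a linear, event-driven energy model: E = sum_i w_i * n_i,
# where n_i are monitored event counts and w_i are weights fitted off-line.
# Events and weights below are illustrative placeholders.

WEIGHTS_NJ = {                    # nanojoules attributed to each event
    "instructions": 0.25,
    "cache_misses": 6.0,
    "dram_accesses": 20.0,
}

def estimate_energy_nj(counters):
    """Estimate the energy of a decoding interval from its event counts."""
    return sum(WEIGHTS_NJ[e] * n for e, n in counters.items())

frame_counters = {"instructions": 1_000_000,
                  "cache_misses": 5_000,
                  "dram_accesses": 1_200}
print(estimate_energy_nj(frame_counters))   # 304000.0
```

The cheap per-interval evaluation is what keeps the on-line computation overhead low; the event filter mentioned in the abstract decides which counters enter the model in the first place.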
Abstract:
This is the final report on reproducibility@xsede, a one-day workshop held in conjunction with XSEDE14, the annual conference of the Extreme Science and Engineering Discovery Environment (XSEDE). The workshop's discussion-oriented agenda focused on reproducibility in large-scale computational research. Two important themes capture the spirit of the workshop submissions and discussions: (1) organizational stakeholders, especially supercomputer centers, are in a unique position to promote, enable, and support reproducible research; and (2) individual researchers should conduct each experiment as though someone will replicate that experiment. Participants documented numerous issues, questions, technologies, practices, and potentially promising initiatives emerging from the discussion, but also highlighted four areas of particular interest to XSEDE: (1) documentation and training that promotes reproducible research; (2) system-level tools that provide build- and run-time information at the level of the individual job; (3) the need to model best practices in research collaborations involving XSEDE staff; and (4) continued work on gateways and related technologies. In addition, an intriguing question emerged from the day's interactions: would there be value in establishing an annual award for excellence in reproducible research?
Abstract:
A feature of participatory mapping and participatory GIS (PGIS) is the inclusion of civil society in their methods, so as to contribute qualitative information about their territories. However, the focus is not only on the data, but also on the effects that such practices may have on the territory and its society. Access to this information remains limited, in contrast to the growing amount of information disseminated through visualization services, geoinformation and online cartography. It is therefore necessary to analyze the real scope of participatory methodologies in the use of Geographic Information (GI) and to compare different geographical contexts. It is also important to know the benefits and disadvantages of access to the information needed for planning, ranging from the visibility of many unnoticed villages in rural and peripheral areas, to the influence of local spatial knowledge on government programs for land management. The analysis focused on the participation levels of civil society and the degrees of accessibility of the information (access and use) within the study of PGIS and participatory mapping. In addition, this work studies GIT (Geographic Information Technologies), online cartography (Geoweb) and spatial geovisualization platforms as resources of Neocartography. A participatory mapping fieldwork was carried out in Bolivia, several PGIS projects in Northern and Southern countries were evaluated (comparing contexts in developing countries), and the results of crossing the different variables (validation, accessibility, data verification, planning value and identity) were analyzed. The thesis considers that both factors (participation levels and degree of accessibility) affect (i) the validation, verification and quality of the data, (ii) the analytical value for planning, and (iii) the identity model of a place, and that, when treated in an integrated way, they constitute the added value that PGIS can contribute to achieve effective planning. It is also confirmed that the participatory dimension of PGIS varies according to the context, the centralization of its actors and sectoral interests. The information resulting from PGIS practices tends to be restricted by the lack of legislation and by the absence of standard formats, which limit the diffusion and exchange of the information. All of this has repercussions on the effectiveness of strategic planning and on the viability of implementing any project on the territory, and consequently on the development levels of a country. The hypothesis is confirmed that all the elements described in PGIS and participatory mapping will act as valid tools for strengthening and improving the effectiveness of planning only if they are interconnected and linked among themselves. A methodological proposal is put forward as an alternative to conventional forms of planning: a new planning route, which includes the exchange of resources and local participatory determination before implementation is established. With this, the benefits of participatory methodologies in the management of GI and GIS (Geographic Information Systems) are incorporated as strategic instruments for the development of local identity and the optimization of planning processes and territorial studies. Finally, future lines of work are encouraged in which PGIS maps and participatory cartography may become representative visual instruments that transfer the identity values of the territory and its society, thereby helping to achieve greater knowledge, recognition and appreciation of the territories by their inhabitants and planners.