955 results for Open source information retrieval
Abstract:
Public participation is a fundamental pillar of our society and is integrated into the drafting phases of territorial planning and hydrological planning. Public administrations face the challenge of harnessing new communication possibilities to improve their management and to make effective citizens' right of access to information and decision-making, for example in environmental matters, as established by Spanish Law 27/2006 among other regulations. For this participation to be satisfactory, mechanisms for collaborative creation and transmission of knowledge about the territory must be put in place. This is where the dynamics of free/libre and open source software (FLOSS) and open-content communities have proven tremendously effective, and they appear to be of great interest to governments as a complement to spatial data infrastructure (SDI) services. The Xunta de Galicia is aware of the great potential of citizens to contribute data that improve plans and actions on the territory. For this reason, the plans for the creation of the Corporate GIS of Galicia include free geomatics components that improve participation processes. This article presents the spatially enabled platform of the "Proxecto Ríos", created in 2011 by the Xunta, which facilitates the coordination of more than 200 volunteer groups that collect data on Galician rivers collaboratively. This tool makes use of projects such as OpenLayers, GeoExt and PostGIS. Following the FLOSS lines of action defined by the Secretaría General de Modernización e Innovación Tecnológica of the Xunta de Galicia, it is planned to release these developments for use by the rest of the Red de Proyectos Ríos across the Iberian Peninsula and to contribute them to the Galician government's free software forge.
Abstract:
The prehistoric monumental structures built as continuous ditches dug into the soil of the Brazilian Amazon are archaeological sites comprising various geometric shapes of diverse sizes. Currently, 291 archaeological sites are known, most of which have been mapped by means of satellite imagery. These prehistoric constructions were located through a combination of survey strategies including satellite imagery, overflights and other technologies that made their identification possible. In a quantitative and morphological characterization analysis carried out with statistical methods, our main results were that the size data show great dispersion, with less variation in depth; the mean area of the enclosures with perimeter ditches is 17,490.6 m², although 40.3% measure less than one hectare (10,000 m²); and the mean altitude at which they are found is 194.4 metres, this being the variable that correlates best with geographic position. With the help of gvSIG, Sextante, GRASS and R, we have sought to characterize the location of the sites according to different variables, notably relative altitude, aspect, distance to the nearest watercourse, slope and relative position in the territory. The aim is to predict in which areas, today covered by forest, structures similar to those located in open areas may be found. This predictive character of our observations would be of vital importance for defining future surveys in the forested areas of the Amazon. In the future, we do not rule out using LiDAR data to test whether the areas designated as potentially containing geoglyphs actually do contain them.
Abstract:
Coordinating resources in emergency situations requires procedures and tools that give the different operational units access to the necessary information in the shortest possible time, so that better decisions can be made. Open source geographic information systems and the standards for disseminating geographic information have reached a level of maturity that allows them to respond to this scenario, facilitating the construction of technological solutions suited to the prevention, management and monitoring of this type of emergency. This article shares the experience of developing a geographic information system to support emergency management, integrated into a vertical solution in widespread use at the national level. To this end, the technological and functional keys of the GIS are presented, with special emphasis on the advantages that the use of open standards provides for exchanging and exploiting information in such a demanding context.
Abstract:
Geomarketing is one of the fastest-growing analysis possibilities offered by geographic information technologies today. The ideal location for a business or an advertising campaign is determined by spatial variables such as the location of competitors and of the target audience, or by finding the areas with the best accessibility, which ultimately allows us to identify the places that would potentially be optimal for our activity. In addition, new location technologies based on mobile devices come into play. The possibility of disseminating information via wireless communications, whether or not restricted to specific areas, and even the ability to share one's location and status through social networks, allows analysts to carry out market studies that until now were unfeasible. All of this information can subsequently be displayed on cartographic platforms. The end user can position themselves in an area of interest and obtain, through the alphanumeric information associated with the map elements, the data they require.
Abstract:
The geographic information system of the Municipality of Bétera arises from the need for a single cartographic base that is accessible to technicians located in different departments or buildings, as well as to citizens, companies and public administrations.
Abstract:
A quasi-optical interferometric technique capable of measuring antenna phase patterns without the need for a heterodyne receiver is presented. It is particularly suited to the characterization of terahertz antennas feeding power detectors or mixers employing quasi-optical local oscillator injection. Examples of recorded antenna phase patterns at frequencies of 1.4 and 2.5 THz using homodyne detectors are presented. To our knowledge, these are the highest-frequency antenna phase patterns ever recovered. Knowledge of both the amplitude and phase patterns in the far field enables a Gauss-Hermite or Gauss-Laguerre beam-mode analysis to be carried out for the antenna, which is of importance in performance optimization calculations, such as antenna gain and beam efficiency parameters at the design and prototype stage of antenna development. A full description of the beam would also be required if the antenna is to be used to feed a quasi-optical system in the near-field to far-field transition region. This situation can often arise when the device is fitted directly at the back of telescopes in flying observatories. A further benefit of the proposed technique is its simplicity for characterizing systems in situ, an advantage of considerable importance, as in many situations the components may not be removable for further characterization once assembled. The proposed methodology is generic and should be useful across the wider sensing community, e.g., in single-detector acoustic imaging or in adaptive imaging array applications. Furthermore, it is applicable across other frequencies of the EM spectrum, provided adequate spatial and temporal phase stability of the source can be maintained throughout the measurement process. Phase information retrieval is also of importance to emergent research areas such as band-gap structure characterization, meta-materials research, electromagnetic cloaking, slow light, super-lens design, as well as near-field and virtual imaging applications.
Abstract:
The ability to create accurate geometric models of neuronal morphology is important for understanding the role of shape in information processing. Despite a significant amount of research on automating neuron reconstructions from image stacks obtained via microscopy, in practice most data are still collected manually. This paper describes Neuromantic, an open source system for three-dimensional digital tracing of neurites. Neuromantic reconstructions are comparable in quality to those of existing commercial and freeware systems, while balancing the speed and accuracy of manual reconstruction. The combination of semi-automatic tracing, intuitive editing, and the ability to visualize large image stacks on standard computing platforms provides a versatile tool that can help address the reconstruction availability bottleneck. Practical considerations for reducing the computational time and space requirements of the extended algorithm are also discussed.
Abstract:
The large scale urban consumption of energy (LUCY) model simulates all components of anthropogenic heat flux (QF) from the global to the individual city scale at 2.5 × 2.5 arc-minute resolution. This includes a database of different working patterns and public holidays, vehicle use and energy consumption in each country. The databases can be edited to include specific diurnal and seasonal vehicle and energy consumption patterns, local holidays and flows of people within a city. If better information about individual cities becomes available within this (open-source) database, the accuracy of the model can only improve, providing the community with data from the global scale of climate modelling down to the individual city scale in the future. The results show that QF varied widely through the year, through the day, and between countries and urban areas. An assessment of the estimated heat emissions revealed that they are reasonably close to those produced by a global model and a number of small-scale city models, so results from LUCY can be used with a degree of confidence. From LUCY, the global mean urban QF has a diurnal range of 0.7–3.6 W m−2 and is greater on weekdays than at weekends. Heat release from buildings is the largest contributor (89–96%) to heat emissions globally. Differences between months are greatest in the middle of the day (up to 1 W m−2 at 1 pm). December to February, the coldest months in the Northern Hemisphere, have the highest heat emissions; July and August are at the higher end. The least QF is emitted in May. The highest individual grid-cell heat fluxes in urban areas (in W m−2) were located in New York (577), Paris (261.5), Tokyo (178), San Francisco (173.6), Vancouver (119) and London (106.7). Copyright © 2010 Royal Meteorological Society
Abstract:
The CHARMe project enables the annotation of climate data with key pieces of supporting information that we term “commentary”. Commentary reflects the experience that has built up in the user community, and can help new or less-expert users (such as consultants, SMEs, experts in other fields) to understand and interpret complex data. In the context of global climate services, the CHARMe system will record, retain and disseminate this commentary on climate datasets, and provide a means for feeding back this experience to the data providers. Based on novel linked data techniques and standards, the project has developed a core system, data model and suite of open-source tools to enable this information to be shared, discovered and exploited by the community.
Abstract:
For users of climate services, the ability to quickly determine the datasets that best fit one's needs would be invaluable. The volume, variety and complexity of climate data makes this judgment difficult. The ambition of CHARMe ("Characterization of metadata to enable high-quality climate services") is to give a wider interdisciplinary community access to a range of supporting information, such as journal articles, technical reports or feedback on previous applications of the data. The capture and discovery of this "commentary" information, often created by data users rather than data providers, and currently not linked to the data themselves, has not been significantly addressed previously. CHARMe applies the principles of Linked Data and open web standards to associate, record, search and publish user-derived annotations in a way that can be read both by users and automated systems. Tools have been developed within the CHARMe project that enable annotation capability for data delivery systems already in wide use for discovering climate data. In addition, the project has developed advanced tools for exploring data and commentary in innovative ways, including an interactive data explorer and comparator ("CHARMe Maps") and a tool for correlating climate time series with external "significant events" (e.g. instrument failures or large volcanic eruptions) that affect the data quality. Although the project focuses on climate science, the concepts are general and could be applied to other fields. All CHARMe system software is open-source, released under a liberal licence, permitting future projects to re-use the source code as they wish.
Abstract:
Geospatial information of many kinds, from topographic maps to scientific data, is increasingly being made available through web mapping services. These allow georeferenced map images to be served from data stores and displayed in websites and geographic information systems, where they can be integrated with other geographic information. The Open Geospatial Consortium’s Web Map Service (WMS) standard has been widely adopted in diverse communities for sharing data in this way. However, current services typically provide little or no information about the quality or accuracy of the data they serve. In this paper we will describe the design and implementation of a new “quality-enabled” profile of WMS, which we call “WMS-Q”. This describes how information about data quality can be transmitted to the user through WMS. Such information can exist at many levels, from entire datasets to individual measurements, and includes the many different ways in which data uncertainty can be expressed. We also describe proposed extensions to the Symbology Encoding specification, which include provision for visualizing uncertainty in raster data in a number of different ways, including contours, shading and bivariate colour maps. We shall also describe new open-source implementations of the new specifications, which include both clients and servers.
Abstract:
Climate change poses new challenges to cities and new flexible forms of governance are required that are able to take into account the uncertainty and abruptness of changes. The purpose of this paper is to discuss adaptive climate change governance for urban resilience. This paper identifies and reviews three traditions of literature on the idea of transitions and transformations, and assesses to what extent the transitions encompass elements of adaptive governance. This paper uses the open source Urban Transitions Project database to assess how urban experiments take into account principles of adaptive governance. The results show that: the experiments give no explicit information of ecological knowledge; the leadership of cities is primarily from local authorities; and evidence of partnerships and anticipatory or planned adaptation is limited or absent. The analysis shows that neither technological, political nor ecological solutions alone are sufficient to further our understanding of the analytical aspects of transition thinking in urban climate governance. In conclusion, the paper argues that the future research agenda for urban climate governance needs to explore further the links between the three traditions in order to better identify contradictions, complementarities or compatibilities, and what this means in practice for creating and assessing urban experiments.
Abstract:
This paper presents an open-source canopy height profile (CHP) toolkit designed for processing small-footprint full-waveform LiDAR data to obtain estimates of effective leaf area index (LAIe) and CHPs. The use of the toolkit is presented with a case study of LAIe estimation in discontinuous-canopy fruit plantations. The experiments are carried out in two study areas, namely orange and almond plantations, with different percentages of canopy cover (48% and 40%, respectively). For comparison, two commonly used discrete-point LAIe estimation methods are also tested. The LiDAR LAIe values are first computed for each of the sites and each method as a whole, providing “apparent” site-level LAIe, which disregards the discontinuity of the plantations’ canopies. Since the toolkit allows for the calculation of the study area LAIe at different spatial scales, between-tree-level clumping can easily be accounted for, and this is then used to illustrate the impact of the discontinuity of canopy cover on LAIe retrieval. The LiDAR LAIe estimates are therefore computed at smaller scales as a mean of LAIe in various grid-cell sizes, providing estimates of “actual” site-level LAIe. Subsequently, the LiDAR LAIe results are compared with theoretical models of “apparent” versus “actual” LAIe, based on known percent canopy cover in each site. The comparison of those models with LiDAR LAIe derived from the smallest grid-cell sizes against the estimates of LAIe for the whole site shows that the LAIe estimates obtained from the CHP toolkit provided the values closest to those of the theoretical models.
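The toolkit's internals are not reproduced here, but the “apparent” versus “actual” LAIe distinction follows directly from the convexity of the gap-fraction (Beer-Lambert) relation. A minimal sketch, assuming numpy, a simple extinction coefficient k = 0.5, and invented per-cell return counts (the toolkit's actual model and parameters may differ):

```python
import numpy as np

def laie_from_returns(ground, total, k=0.5):
    """Effective LAI via a Beer-Lambert gap-fraction model:
    LAIe = -ln(P_gap) / k, where P_gap = ground / total returns."""
    return -np.log(ground / total) / k

# Illustrative return counts per grid cell (hypothetical numbers).
ground = np.array([[90.0, 10.0], [80.0, 20.0]])  # ground hits per cell
total = np.full((2, 2), 100.0)                   # total pulses per cell

# "Apparent" site-level LAIe pools all returns into one gap fraction.
apparent = laie_from_returns(ground.sum(), total.sum())
# "Actual" LAIe averages per-cell estimates, so clumping
# (cells that are mostly gap vs. mostly canopy) is accounted for.
actual = laie_from_returns(ground, total).mean()
# Because -ln is convex, actual >= apparent whenever cover is clumped.
```

With these numbers, apparent ≈ 1.39 while actual ≈ 2.12, illustrating how pooling a discontinuous canopy into a single site-level gap fraction underestimates LAIe.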
Abstract:
Successful classification, information retrieval and image analysis tools are intimately related to the quality of the features employed in the process. Pixel intensities, color, texture and shape are generally the basis from which most features are computed and used in these fields. This paper presents a novel shape-based feature extraction approach in which an image is decomposed into multiple contours, which are further characterized by Fourier descriptors. Unlike traditional approaches, we make use of topological knowledge to generate well-defined closed contours, which are efficient signatures for image retrieval. The method has been evaluated in the CBIR context and in image analysis. The results show that the multi-contour decomposition, as opposed to single-shape information, introduced a significant improvement in discrimination power. (c) 2008 Elsevier B.V. All rights reserved.
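The general technique the abstract names can be illustrated generically. A minimal sketch of Fourier descriptors for a single closed contour, assuming numpy (the paper's multi-contour decomposition and its exact normalization are not reproduced):

```python
import numpy as np

def fourier_descriptors(contour, n_coeffs=16):
    """Fourier-descriptor signature of a closed 2-D contour.

    contour: (N, 2) array of ordered (x, y) boundary points.
    Returns n_coeffs coefficient magnitudes, normalized so the
    signature is invariant to translation, scale and rotation.
    """
    # Encode boundary points as complex numbers x + iy.
    z = contour[:, 0] + 1j * contour[:, 1]
    coeffs = np.fft.fft(z)
    # Drop the DC term (translation), take magnitudes (rotation /
    # start point), and divide by the first harmonic (scale).
    mags = np.abs(coeffs[1:n_coeffs + 1])
    return mags / mags[0]

# A circle and a scaled, translated copy give matching signatures.
t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
shifted = 3.0 * circle + np.array([10.0, -4.0])
d1 = fourier_descriptors(circle)
d2 = fourier_descriptors(shifted)
```

For multi-contour retrieval in the spirit of the paper, the same signature would be computed per contour and the resulting sets compared, e.g. by nearest-neighbour distance.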
Abstract:
The mobile operating system Android is today a fairly dominant operating system on the mobile market, partly because of its openness, but also because of its wide availability, with both cheap and expensive phones on offer. However, Android has no predefined design pattern, which means that each developer must decide for themselves what to use; this can sometimes lead to unnecessarily complex application code that is then hard to test and maintain. This work aims to compare two design patterns, Passive Model View Controller (PMVC) and Model View View-Model (MVVM), to see which pattern yields the least complex code, by computing metrics based on the Cyclomatic Complexity Number (CCN). The study follows the Design & Creation approach and aims to contribute knowledge about which pattern to choose, and about whether CCN can point out which parts of an application will take more or less time to test. During the study we also examined the differences between applying the so-called Single Responsibility Principle (SRP) or not, to see whether separated views make any difference to application complexity. In the end, the study shows that the complexity of small applications is very similar, but that even in small applications differences in code complexity can be observed, and that code complexity at the method level can provide guidelines for test cases.
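The CCN metric the study relies on can be computed mechanically. As a minimal sketch in Python rather than the study's Java/Android setting, the standard `ast` module can count McCabe decision points per function (a simplified count; real tools such as lizard or radon handle more constructs and nested functions more carefully):

```python
import ast

# Node types that open an extra independent path (simplified McCabe).
_DECISIONS = (ast.If, ast.For, ast.While, ast.ExceptHandler,
              ast.IfExp, ast.BoolOp)

def cyclomatic_complexity(source):
    """Return {function_name: CCN} with CCN = decision points + 1."""
    result = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            decisions = sum(isinstance(n, _DECISIONS)
                            for n in ast.walk(node))
            result[node.name] = decisions + 1
    return result

src = """
def classify(x):
    if x < 0:
        return "neg"
    for i in range(x):
        if i % 2:
            return "odd"
    return "even"
"""
ccn = cyclomatic_complexity(src)  # two ifs + one for loop -> CCN of 4
```

A method-level table of such values is exactly the kind of data the study uses to compare PMVC and MVVM variants and to flag the methods likely to need the most test cases.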