43 results for postgreSQL
Abstract:
The northern portion of Rio Grande do Norte State is characterized by intense coastal dynamics affecting areas with ecosystems of moderate to high environmental sensitivity. This region hosts the main socioeconomic activities of the state: the salt industry, shrimp farming, fruit production and the oil industry. The oil industry suffers the effects of coastal dynamics, which cause problems such as erosion and the exposure of wells and pipelines along the shore. This motivated the monitoring of such changes, in order to understand the processes that cause environmental impacts and to detect and assess the areas most vulnerable to them. Coastal areas under the influence of the oil industry are highly vulnerable and sensitive to accidents involving oil spills in their vicinity. Therefore, geoenvironmental monitoring of the region was established to evaluate the evolution of the entire coastal area and to assess the sensitivity of each site to the presence of oil. The goal of this work was to implement a computer system that supports both the insertion and the visualization of thematic maps for the generation of Environmental Vulnerability maps, using Business Intelligence (BI) techniques applied to vector data previously stored in the database. The central design goal was a scalable system that serves diverse fields of study and generates vulnerability maps online, automating the methodology so as to simplify data manipulation and deliver fast results for real-time operational decision-making. To develop the geographic database, a conceptual model of the selected data was produced, and the Web system was built with the PostgreSQL database system, its spatial extension PostGIS, the GlassFish web server and GeoServer to display the maps in the Web environment.
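A minimal sketch of how such vulnerability layers might be read from PostGIS and serialized for a web map client; the table and column names (vulnerability_zones, geom, vuln_index) are assumptions made for illustration, not the schema used in the work.

    # Illustrative sketch only: read hypothetical vulnerability polygons from PostGIS
    # and serialize them as GeoJSON features for a web map client (e.g. GeoServer/OpenLayers).
    import json
    import psycopg2  # PostgreSQL driver

    conn = psycopg2.connect(dbname="geodb", user="gis", password="secret", host="localhost")

    SQL = """
        SELECT id, vuln_index, ST_AsGeoJSON(geom) AS geometry
        FROM vulnerability_zones          -- hypothetical table name
        WHERE vuln_index >= %s            -- keep only the most vulnerable areas
    """

    with conn, conn.cursor() as cur:
        cur.execute(SQL, (0.7,))
        features = [
            {
                "type": "Feature",
                "properties": {"id": row[0], "vuln_index": row[1]},
                "geometry": json.loads(row[2]),
            }
            for row in cur.fetchall()
        ]

    print(json.dumps({"type": "FeatureCollection", "features": features}, indent=2))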
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Pós-graduação em Ciências Cartográficas - FCT
Abstract:
Pós-graduação em Engenharia Elétrica - FEIS
Abstract:
This project aims to develop methods for data classification in a Data Warehouse for decision-making purposes. A further goal is the reduction of the attribute set of a Data Warehouse to a smaller set that preserves the properties of the original one. With a reduced set we obtain a lower processing cost, we can identify attributes that are not relevant to certain kinds of situations, and we can recognize patterns in the database that support decision-making. To achieve these objectives, the Rough Sets algorithm will be implemented. We chose PostgreSQL as the database management system because it is efficient, well established and open source (freely distributed).
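A minimal sketch of the rough-set idea of attribute reduction, assuming a small in-memory decision table; a greedy search keeps only the condition attributes needed to preserve the dependency of the decision attribute. The abstract does not specify the exact variant of the algorithm, so this is only an illustration of the principle.

    # Illustrative rough-set attribute reduction: greedily drop condition attributes
    # whose removal does not change the dependency degree of the decision attribute.

    # Hypothetical decision table: condition attributes plus a decision attribute.
    rows = [
        {"age": "young", "income": "low",  "urban": "yes", "buys": "no"},
        {"age": "young", "income": "high", "urban": "yes", "buys": "yes"},
        {"age": "old",   "income": "high", "urban": "no",  "buys": "yes"},
        {"age": "old",   "income": "low",  "urban": "no",  "buys": "no"},
    ]
    conditions, decision = ["age", "income", "urban"], "buys"

    def dependency(attrs):
        """Fraction of rows whose decision is fully determined by `attrs` (positive region)."""
        blocks = {}
        for r in rows:
            blocks.setdefault(tuple(r[a] for a in attrs), set()).add(r[decision])
        consistent = sum(1 for r in rows if len(blocks[tuple(r[a] for a in attrs)]) == 1)
        return consistent / len(rows)

    # Greedy reduct: try removing each attribute; keep the removal if dependency is preserved.
    reduct = list(conditions)
    full = dependency(conditions)
    for a in conditions:
        trial = [x for x in reduct if x != a]
        if trial and dependency(trial) == full:
            reduct = trial

    print("Reduct:", reduct)  # e.g. ['income'] if income alone determines the decision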
Abstract:
The goal of this Final Degree Project (TFG) is the creation of a prototype web application for the management of geospatial resources. The proposal arose from the need for a tool that does not have to be installed on a device but is served by a web server, so that it can be accessed from anywhere and from any device. The result was the Web Manager of Geospatial Resources with OpenLayers Technology, an application that combines several tools (OpenLayers, GeoServer, PostgreSQL, jQuery…), all of them based on Free Software, to provide functionality such as creating vector primitives on a map, managing and visualizing the associated information, editing styles, modifying coordinates, etc. All of these are characteristic features of a Geographic Information System (GIS), offered through a convenient and efficient interface that hides internal and complex details from the user. The developed software has the potential to become a solution to the geospatial information management needs of the ULPGC, especially on the Tafira campus, on which its use has been exemplified. Moreover, unlike the tools offered by companies such as Google or Microsoft, this application is entirely licensed under the GNU GPL v3, which allows any interested person to inspect its code, improve it and add functionality.
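As an illustration of how such an OpenLayers/GeoServer stack is typically wired together, the sketch below requests vector features from a GeoServer WFS endpoint as GeoJSON, the same data an OpenLayers client would draw; the server URL and layer name (campus:buildings) are assumptions, not the project's actual configuration, and Python is used here for brevity instead of the JavaScript client code.

    # Illustrative only: fetch vector features published by GeoServer (backed by PostGIS)
    # through a standard WFS GetFeature request, as an OpenLayers client would.
    import requests

    WFS_URL = "http://localhost:8080/geoserver/wfs"    # assumed local GeoServer instance
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": "campus:buildings",               # hypothetical layer name
        "outputFormat": "application/json",            # GeoJSON output supported by GeoServer
        "count": 10,
    }

    resp = requests.get(WFS_URL, params=params, timeout=10)
    resp.raise_for_status()
    for feature in resp.json()["features"]:
        print(feature["id"], feature["properties"])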
Abstract:
The spectacular recent advances of computer science applied to geographic information systems (GIS) have favored the emergence of several technological solutions, which in turn have opened up enormous opportunities for digital management of the territory. Among these solutions, the best known, Google Maps, offers free, dynamic and comprehensive online mapping. To meet the enormous need for geotagged urban indicators, we carried out the project "Integration of an urban observatory on Google Maps." The problem of geolocation in the urban observatory is particularly relevant because there is currently no reliable data (descriptive or geographic) on the urban sector; one has to extrapolate from old and obsolete data. This limits the effectiveness of urban management, makes investment programming difficult and prevents the acquisition of the knowledge needed to make cities engines of growth. A geolocation tool coupled to the data would allow better monitoring of the indicators. The objective of our project is to develop an interactive map server (web mapping) whose base layer comes from the Google Maps servers and is matched with field data to produce, on the client's request, maps of the urban equipment and infrastructure of a city. To achieve this goal, we carried out a GPS survey of the strategic sites of our core sector (health facilities) and, using this field information, we built a PostgreSQL database that links the field data to the Google Maps layer through appropriate KML scripts and PHP code. We limit our work to the city of Douala, Cameroon, and to the health facilities sector, with the possibility of extension to other sectors and other cities. Keywords: Geographic Information System (GIS), Thematic Mapping, Web Mapping, data mining, Google API.
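A minimal sketch of the kind of glue described above, assuming a hypothetical health_facilities table with name/longitude/latitude columns; it reads points from PostgreSQL and emits a KML document that a Google Maps client can overlay. The original project uses PHP scripts; Python is used here only for illustration.

    # Illustrative only: export hypothetical health-facility points from PostgreSQL as KML
    # so they can be overlaid on a Google Maps base layer.
    from xml.sax.saxutils import escape
    import psycopg2

    conn = psycopg2.connect(dbname="observatory", user="gis", password="secret")

    with conn, conn.cursor() as cur:
        cur.execute("SELECT name, lon, lat FROM health_facilities")  # assumed schema
        placemarks = "\n".join(
            f"  <Placemark><name>{escape(name)}</name>"
            f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
            for name, lon, lat in cur.fetchall()
        )

    kml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
        f"{placemarks}\n</Document>\n</kml>"
    )

    with open("health_facilities.kml", "w", encoding="utf-8") as f:
        f.write(kml)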
Abstract:
The Protein pKa Database (PPD) v1.0 provides a compendium of protein residue-specific ionization equilibria (pKa values), as collated from the primary literature, in the form of a web-accessible PostgreSQL relational database. Ionizable residues play key roles in the molecular mechanisms that underlie many biological phenomena, including protein folding and enzyme catalysis. The PPD serves as a general protein pKa archive and as a source of data that allows for the development and improvement of pKa prediction systems. The database is accessed through an HTML interface, which offers two fast, efficient search methods: an amino acid-based query and a Basic Local Alignment Search Tool search. Entries also give details of experimental techniques and links to other key databases, such as the National Center for Biotechnology Information and the Protein Data Bank, providing the user with considerable background information.
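A minimal sketch of the kind of residue-level lookup such a relational pKa archive supports, with an assumed table layout (pka_values with protein, residue, resnum, pka, method and reference columns); the PPD's actual schema and query interface are not described in this abstract.

    # Illustrative only: query an assumed residue-level pKa table in PostgreSQL,
    # e.g. all collated pKa values for histidine residues of a given protein.
    import psycopg2

    conn = psycopg2.connect(dbname="ppd", user="reader", password="secret")

    with conn, conn.cursor() as cur:
        cur.execute(
            """
            SELECT residue, resnum, pka, method, reference
            FROM pka_values                -- hypothetical table
            WHERE protein = %s AND residue = %s
            ORDER BY resnum
            """,
            ("lysozyme", "HIS"),
        )
        for residue, resnum, pka, method, ref in cur.fetchall():
            print(f"{residue}{resnum}: pKa = {pka} ({method}, {ref})")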
Abstract:
JenPep is a relational database containing a compendium of thermodynamic binding data for the interaction of peptides with a range of important immunological molecules: the major histocompatibility complex, TAP transporter, and T cell receptor. The database also includes annotated lists of B cell and T cell epitopes. Version 2.0 of the database is implemented in a bespoke PostgreSQL database system and is fully searchable online via a Perl/HTML interface (URL: http://www.jenner.ac.uk/JenPep).
Abstract:
In this work, a system of statistical functions was developed using the R-Statistics language and other Open Source tools (Symfony, PostgreSQL, PL/R and Debian) for the analysis of the data in the data warehouse of the Ministry of Health, integrating these functions as a module of the integrated system of management indicators. The system will support decision-making; among the statistical results it provides are measures of position, central tendency and dispersion, as well as charts such as the dendrogram, the correlation circle and the box-and-whisker plot, among others, which will allow a better appreciation of the current and future situation in order to define strategies and carry out planning.
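A minimal illustration of the kind of descriptive statistics such a module exposes, computed here with pandas over a PostgreSQL query; the table name (indicator_values) is an assumption, and the actual system implements these functions in R through PL/R rather than in Python.

    # Illustrative only: pull an indicator series from PostgreSQL and compute the
    # descriptive statistics mentioned in the abstract (position, central tendency, dispersion).
    import pandas as pd
    import psycopg2

    conn = psycopg2.connect(dbname="health_dw", user="analyst", password="secret")
    df = pd.read_sql("SELECT region, value FROM indicator_values", conn)  # assumed table

    summary = df.groupby("region")["value"].agg(
        mean="mean",                      # central tendency
        median="median",
        std="std",                        # dispersion
        q1=lambda s: s.quantile(0.25),    # position (quartiles, as used by box plots)
        q3=lambda s: s.quantile(0.75),
    )
    print(summary)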
Abstract:
The goal of this Final Degree Project (TFG) is to carry out and document the upgrade process of a real, business-grade software system belonging to Foreign Exchange Solutions SL, a company dedicated to currency-exchange transactions. The system is implemented in Python 2.7 using Django, the rapid web application development framework; starting from version 1.3.1, it will reach version 1.4.10 by the end of the project, which requires upgrading all the related libraries, improving code quality and even changing the project structure, while paying particular attention to unit and regression tests that verify the correct behaviour of the system throughout the development. All of this is done in order to obtain the new functionality and features offered by a newer version, to improve the quality of the application (increasing code reuse and reducing future errors thanks to simpler, more readable code), to increase performance and to achieve good test coverage. We will also use the agile methodology Scrum, the PostgreSQL DBMS, and other tools such as Solr, ElasticSearch, Redis and Celery, plus Mercurial for version control.
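A minimal sketch of the kind of regression test that pins down existing behaviour across such a framework upgrade; the Rate model and convert helper are hypothetical and not taken from the company's code base.

    # Illustrative only: a regression test meant to pass identically on Django 1.3
    # and 1.4, documenting existing behaviour before and after the upgrade.
    from decimal import Decimal
    from django.test import TestCase

    from rates.models import Rate          # hypothetical app and model
    from rates.services import convert     # hypothetical conversion helper


    class ConversionRegressionTest(TestCase):
        def setUp(self):
            Rate.objects.create(base="EUR", quote="USD", value=Decimal("1.3500"))

        def test_eur_to_usd_conversion_is_unchanged(self):
            # The expected value documents behaviour prior to the upgrade.
            self.assertEqual(convert("EUR", "USD", Decimal("100")), Decimal("135.00"))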
Abstract:
The project consists of a web vulnerability search portal, called Krashr, whose goal is to check whether a web page entered by a user contains any kind of exploitable vulnerability, and to help that user fix the vulnerabilities found. It has a back end written in Python with a PostgreSQL database, a web front end written in AngularJS, and an API based on Node.js and Express that connects the two sides.
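As a purely illustrative example of the simplest kind of check such a scanner might run (the abstract does not describe Krashr's actual tests), the sketch below flags missing HTTP security headers on a user-supplied page.

    # Illustrative only: flag missing HTTP security headers on a target URL.
    # This is a trivial stand-in for the checks a real scanner would perform.
    import requests

    SECURITY_HEADERS = [
        "Content-Security-Policy",
        "Strict-Transport-Security",
        "X-Content-Type-Options",
        "X-Frame-Options",
    ]

    def check_headers(url):
        """Return the security headers missing from the target's response."""
        resp = requests.get(url, timeout=10)
        return [h for h in SECURITY_HEADERS if h not in resp.headers]

    if __name__ == "__main__":
        missing = check_headers("https://example.com")
        print("Missing security headers:", missing or "none")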
Abstract:
Master's dissertation, Universidade de Brasília, Instituto de Geociências, 2016.