72 results for Spatial Database Systems
in Repositório Institucional UNESP - Universidade Estadual Paulista "Julio de Mesquita Filho"
Abstract:
Non-conventional database management systems are used to achieve better performance when dealing with complex data. One fundamental concept of these systems is the object identifier (OID): each object in the database has a unique identifier that is used to access it and to reference it in relationships with other objects. Two approaches can be used to implement OIDs: physical or logical OIDs. To manage complex data, the Multimedia Data Manager Kernel (NuGeM) was proposed, which uses a logical technique named Indirect Mapping. This paper proposes an improvement to the technique used by NuGeM; its original contribution is the management of OIDs with fewer disk accesses and less processing, thus reducing page management time and eliminating the problem of OID exhaustion. The technique presented here can also be applied to other OODBMSs. © 2011 IEEE.
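To make the indirect mapping idea concrete, the minimal Python sketch below keeps a table from logical OIDs to physical (page, slot) locations, so relocating an object updates a single entry and freed identifiers can be recycled; all class and method names are illustrative and do not reflect NuGeM's actual interfaces.

# Minimal sketch of logical (indirect) OID mapping: a logical OID indexes a
# mapping table whose entry holds the current physical location (page, slot),
# so moving an object only updates one entry and references stay valid.
# Names are illustrative and do not reflect NuGeM's real interfaces.
class IndirectOIDTable:
    def __init__(self):
        self._next_oid = 0
        self._map = {}          # logical OID -> (page, slot)
        self._free = []         # recycled OIDs, mitigating OID exhaustion

    def allocate(self, page, slot):
        oid = self._free.pop() if self._free else self._alloc_new()
        self._map[oid] = (page, slot)
        return oid

    def _alloc_new(self):
        oid = self._next_oid
        self._next_oid += 1
        return oid

    def resolve(self, oid):
        return self._map[oid]   # one extra lookup instead of a physical pointer

    def relocate(self, oid, new_page, new_slot):
        self._map[oid] = (new_page, new_slot)

    def free(self, oid):
        del self._map[oid]
        self._free.append(oid)  # identifier can be reused later

table = IndirectOIDTable()
oid = table.allocate(page=3, slot=7)
table.relocate(oid, new_page=5, new_slot=1)
print(table.resolve(oid))       # (5, 1)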
Abstract:
The increase in the amount of spatial data collected has motivated the development of geovisualisation techniques, which aim to provide an important resource to support knowledge extraction and decision making. One of these techniques is the 3D graph, which provides a dynamic and flexible enhancement of the analysis of the results obtained by spatial data mining algorithms, especially when several georeferenced objects occur at the same location. The original contribution of this work is the strengthening of visual resources in a computational spatial data mining environment; the efficiency of these techniques is then demonstrated on a real database. The application proved very useful for interpreting the results obtained, such as patterns occurring at the same locality, and for supporting activities that can be carried out based on the visualisation of the results. © 2013 Springer-Verlag.
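As a rough illustration of the 3D-graph idea (not the environment described in the paper), the sketch below counts georeferenced objects that fall on the same coordinates and draws each incidence count as the height of a 3D bar with matplotlib; the coordinates are invented.

# Aggregate objects sharing the same (lon, lat) and plot incidence as bar
# height, so coincident objects are no longer hidden as a single 2D point.
from collections import Counter
import matplotlib.pyplot as plt

points = [(-48.5, -22.3), (-48.5, -22.3), (-48.6, -22.4), (-48.5, -22.3)]
counts = Counter(points)                    # (lon, lat) -> incidence count

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
xs = [p[0] for p in counts]
ys = [p[1] for p in counts]
zs = [0] * len(counts)
heights = list(counts.values())
ax.bar3d(xs, ys, zs, dx=0.02, dy=0.02, dz=heights)
ax.set_xlabel("longitude")
ax.set_ylabel("latitude")
ax.set_zlabel("incidences")
plt.show()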
Abstract:
A method for spatial electric load forecasting using elements from evolutionary algorithms is presented. The method uses concepts from knowledge extraction algorithms and linguistic rule representation to characterize land-use preferences in a spatial database. The future land-use preferences in undeveloped zones of the electric utility service area are determined using an evolutionary heuristic, which introduces stochastic behavior by crossing over similar rules. The method considers the development of new zones as well as the redevelopment of existing ones. The results are presented as future preference maps. Tests on a real system from a midsized city show a high success rate when the results are compared with information gathered from the utility planning department. The most important features of this method are the small amount of data required and the simplicity of the algorithm, which allows for future scalability.
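A minimal sketch of the rule-crossover idea follows, with invented attributes and terms: linguistic land-use rules are encoded as attribute-value pairs, and two sufficiently similar rules are recombined stochastically, mimicking the behavior the abstract describes.

# Illustrative only: linguistic rules as attribute terms, crossed over when
# the two parent rules are similar enough. Attributes and terms are invented.
import random

Rule = dict  # e.g. {"slope": "low", "distance_to_road": "near", "use": "residential"}

def similarity(r1: Rule, r2: Rule) -> float:
    keys = set(r1) & set(r2)
    return sum(r1[k] == r2[k] for k in keys) / max(len(keys), 1)

def crossover(r1: Rule, r2: Rule) -> Rule:
    # Each attribute term is inherited at random from one of the parents.
    return {k: random.choice([r1[k], r2[k]]) for k in r1}

rules = [
    {"slope": "low", "distance_to_road": "near", "use": "residential"},
    {"slope": "low", "distance_to_road": "far", "use": "residential"},
    {"slope": "high", "distance_to_road": "far", "use": "industrial"},
]

r1, r2 = rules[0], rules[1]
if similarity(r1, r2) >= 0.5:       # only cross over similar rules
    print(crossover(r1, r2))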
Abstract:
This paper presents a technique for sharing the data stored in an object-oriented database intended for design environments. The technique shares data between two related databases, called the Original and Product databases, and comprises three processes: data separation, evolution, and integration. Whenever a block of data needs to be shared, it is spread across both databases, resulting in one block in the Original database and another in the Product database, with special links between them controlled by the Object Manager. These blocks do not need to be kept identical during the evolution phase of the sharing process. Six types of links were defined, and by choosing one the designer controls the evolution and reintegration of the block in both databases. The process uses the composite object concept as its unit of control. The concepts presented can be applied to any data model that supports composite objects.
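The sketch below illustrates, under assumed names, how a typed link between the two copies of a shared block could govern its evolution; the link types shown are placeholders, since the paper defines its own set of six.

# Conceptual sketch only: a block shared between the Original and Product
# databases is connected by a typed link that dictates how its copies evolve
# and reintegrate. The link types below are invented placeholders.
from enum import Enum, auto

class LinkType(Enum):
    FROZEN = auto()           # neither copy may change
    ORIGINAL_MASTER = auto()  # changes propagate from Original to Product
    PRODUCT_MASTER = auto()   # changes propagate from Product to Original
    INDEPENDENT = auto()      # copies evolve freely and are merged later

class SharedBlock:
    def __init__(self, composite_object_id, link):
        self.composite_object_id = composite_object_id  # composite object as unit of control
        self.link = link
        self.original_version = 1
        self.product_version = 1

    def evolve_product(self):
        # Whether the Product copy may diverge depends on the chosen link type.
        if self.link in (LinkType.PRODUCT_MASTER, LinkType.INDEPENDENT):
            self.product_version += 1
        else:
            raise PermissionError("link type forbids changing the Product copy")

block = SharedBlock("assembly-42", LinkType.INDEPENDENT)
block.evolve_product()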
Abstract:
This paper presents the prototype of a low-cost terrestrial mobile mapping system (MMS) composed of a van, two digital video cameras, two GPS receivers, a notebook computer, and a sound-frame synchronisation system. The imaging sensors are mounted as a stereo video camera on top of the vehicle together with the GPS antennae. The GPS receivers and the notebook computer are configured to record data on the vehicle position at a planned time interval; this position is subsequently transferred to the road images. This set of equipment and methods makes it possible to merge distinct techniques to produce topographic maps and to build georeferenced road image databases. Both vector maps and raster image databases, when integrated appropriately, give spatial researchers and engineers a new technique whose application may lead to better planning and analysis of the road environment. The experimental results showed that the MMS developed at São Paulo State University is an effective approach to inspecting road pavements and to mapping road marks, traffic signs, electric power poles, telephone booths, drain pipes, and many other features important to people's safety and welfare. A small number of road images have already been captured by the prototype as a consequence of its application in distinct projects. An efficient organisation of those images and prompt access to them justify the need to build a georeferenced image database. By expanding it, at both the hardware and software levels, engineers can analyse the entire road environment on their office computers.
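One way to picture the transfer of vehicle positions to road images is the sketch below, which pairs each video frame with the GPS fix nearest in time; the timestamps, coordinates, and data layout are assumptions made for illustration.

# Simplified sketch: associate each video frame with the nearest GPS fix in
# time, the essence of transferring the vehicle position to the road images.
from bisect import bisect_left

gps_fixes = [            # (unix_time, lat, lon) recorded at a planned interval
    (100.0, -22.3301, -49.0701),
    (101.0, -22.3302, -49.0703),
    (102.0, -22.3304, -49.0705),
]

def position_for_frame(frame_time, fixes):
    times = [t for t, _, _ in fixes]
    i = bisect_left(times, frame_time)
    candidates = fixes[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda f: abs(f[0] - frame_time))

print(position_for_frame(101.4, gps_fixes))   # nearest fix: (101.0, ...)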
Abstract:
Cancer is the second main cause of death in Brazil and, according to statistics released by INCA - the National Cancer Institute, 466,730 new cases of the disease were forecast for 2008. The storage and analysis of tumour tissues of various types, together with patients' clinical data, genetic profiles, disease characteristics, and epidemiological data, may lead to more precise diagnoses and more effective treatments, with higher chances of curing cancer. In this paper we present a Web system with a client-server architecture that manages a relational database containing all information related to the tumour tissues and their location in freezers, as well as patients, medical forms, physicians, users, and other entities. The software engineering approach used to develop the system is also discussed.
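A hypothetical fragment of such a relational model is sketched below with SQLite: tissue samples reference a patient and a freezer position. Table and column names are invented and do not reflect the system's actual schema.

# Hypothetical schema sketch for locating tumour tissues in storage.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    diagnosis TEXT
);
CREATE TABLE freezer (
    id INTEGER PRIMARY KEY,
    label TEXT NOT NULL,
    shelf INTEGER,
    box INTEGER
);
CREATE TABLE tissue_sample (
    id INTEGER PRIMARY KEY,
    patient_id INTEGER REFERENCES patient(id),
    freezer_id INTEGER REFERENCES freezer(id),
    tumour_type TEXT,
    collected_at TEXT
);
""")
conn.execute("INSERT INTO patient (name, diagnosis) VALUES (?, ?)",
             ("Jane Doe", "breast carcinoma"))
conn.commit()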
Abstract:
The development of new technologies that use peer-to-peer networks grows every day, driven by the need to share information, resources, and database services around the world. Among them are peer-to-peer databases, which take advantage of peer-to-peer networks to manage distributed knowledge bases, allowing the sharing of information that is semantically related but syntactically heterogeneous. However, given the structural characteristics of these networks, it is a challenge to ensure efficient search for information without compromising the autonomy of each node or the flexibility of the network. On the other hand, some studies propose the use of ontology semantics to assign a standardized categorization to the information. The main original contribution of this work is an approach to this problem based on the optimization of queries supported by the Ant Colony algorithm and on classification through ontologies. The results show that this strategy provides semantic support to searches in peer-to-peer databases, expanding the results without compromising network performance. © 2011 IEEE.
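The sketch below illustrates, with invented names, the general idea of combining ontology categories with ant-colony routing: each peer keeps pheromone levels per (category, neighbour) pair, forwards queries along reinforced trails, and rewards successful lookups. It is not the algorithm proposed in the paper.

# Illustrative ant-colony-style query routing in a P2P overlay.
import random

class Peer:
    def __init__(self, name, neighbours):
        self.name = name
        self.neighbours = neighbours
        self.pheromone = {}   # (ontology category, neighbour) -> trail strength

    def choose_neighbour(self, category):
        weights = [self.pheromone.get((category, n), 1.0) for n in self.neighbours]
        return random.choices(self.neighbours, weights=weights, k=1)[0]

    def reinforce(self, category, neighbour, reward=1.0, evaporation=0.1):
        for key in list(self.pheromone):
            self.pheromone[key] *= (1.0 - evaporation)   # evaporation step
        key = (category, neighbour)
        self.pheromone[key] = self.pheromone.get(key, 1.0) + reward

peer = Peer("p1", ["p2", "p3"])
target = peer.choose_neighbour("genomics")
peer.reinforce("genomics", target)    # reward a successful lookup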
Abstract:
The acquisition and updating of Geographic Information System (GIS) data are typically carried out using aerial or satellite imagery. Since new roads are usually connected to the georeferenced pre-existing road network, the extraction of pre-existing road segments may provide good hypotheses for the updating process. This paper addresses the problem of extracting georeferenced roads from images and formulating hypotheses for the presence of new road segments. Our approach proceeds in three steps. First, salient points are identified and measured along roads from a map or GIS database, either by an operator or by an automatic tool. These salient points are then projected onto image space, and the errors inherent in this process are calculated. In the second step, the georeferenced roads are extracted from the image using a dynamic programming (DP) algorithm, with the projected salient points and corresponding error estimates as input. Finally, the road center axes extracted in the previous step are analyzed to identify potential new segments attached to the extracted, pre-existing ones. This analysis is performed using a combination of edge-based and correlation-based algorithms. In this paper we present our approach and early implementation results.
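As a didactic sketch of the dynamic-programming step (not the authors' algorithm), the code below selects, from candidate positions around each projected salient point, the sequence that maximises an image merit minus a smoothness penalty between consecutive choices.

# Didactic DP over candidate positions; merit() stands in for an image-based
# score such as edge strength, and the penalty enforces a smooth road axis.
def extract_path(candidates, merit, smooth_weight=1.0):
    # candidates: list of lists of positions; merit(pos) -> float
    n = len(candidates)
    best = [[merit(p) for p in candidates[0]]]
    back = []
    for i in range(1, n):
        row, ptr = [], []
        for p in candidates[i]:
            scores = [best[i - 1][j] - smooth_weight * abs(p - q)
                      for j, q in enumerate(candidates[i - 1])]
            j = max(range(len(scores)), key=scores.__getitem__)
            row.append(scores[j] + merit(p))
            ptr.append(j)
        best.append(row)
        back.append(ptr)
    # backtrack the optimal sequence of candidate positions
    j = max(range(len(best[-1])), key=best[-1].__getitem__)
    path = [candidates[-1][j]]
    for i in range(n - 2, -1, -1):
        j = back[i][j]
        path.append(candidates[i][j])
    return list(reversed(path))

# toy example: scalar positions, merit favouring the value 5
print(extract_path([[4, 5, 6], [5, 7], [5, 6]], merit=lambda p: -abs(p - 5)))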
Abstract:
The present work begins with a review of the literature on bit selection methods for oil-well drilling. A proposal for the structure and organization of a drilling database and a knowledge base is then described. Previous studies provided the principal elements of the process of selecting bits for proposed wells. The procedure was implemented as a computer system for the selection of tricone bits. A drill-bit database covering several wells drilled in three different Brazilian sedimentary basins was assembled, and knowledge was collected from drilling engineers from different fields, both electronically and through interviews. The tests carried out indicate that the selection process produces good results.
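A highly simplified, hypothetical sketch of how IF-THEN knowledge could drive bit selection is shown below; the rule fields and recommendations are invented and much coarser than the system described.

# Toy knowledge base: rules map formation properties to a bit recommendation.
rules = [
    {"if": {"hardness": "soft"},   "then": "tricone bit for soft formations"},
    {"if": {"hardness": "medium"}, "then": "tricone bit for medium formations"},
    {"if": {"hardness": "hard"},   "then": "tricone bit for hard formations"},
]

def recommend_bit(formation):
    for rule in rules:
        if all(formation.get(k) == v for k, v in rule["if"].items()):
            return rule["then"]
    return "no recommendation; consult a drilling engineer"

print(recommend_bit({"basin": "example", "hardness": "medium"}))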
Abstract:
This paper describes a data mining environment for knowledge discovery in bioinformatics applications. The system has a generic kernel that implements the mining functions to be applied to input primary databases of biomedical information with a warehouse architecture. Both supervised and unsupervised classification can be implemented within the kernel and applied to data extracted from the primary database, with the results stored in a complex object database for knowledge discovery. The kernel also includes a specific high-performance library that allows the mining functions to be designed and run on parallel machines. The experimental results obtained by applying the kernel functions are reported. © 2003 Elsevier Ltd. All rights reserved.
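The generic-kernel idea can be pictured with the sketch below, in which mining functions, supervised or unsupervised, are registered with a kernel, run over extracted records, and their results stored; all interfaces here are assumptions, not the system's API.

# Conceptual sketch of a pluggable mining kernel with a results store that
# stands in for the complex object database mentioned in the abstract.
from typing import Any, Callable, Dict, List

class MiningKernel:
    def __init__(self):
        self.functions: Dict[str, Callable[[List[Any]], Any]] = {}
        self.results_store: Dict[str, Any] = {}

    def register(self, name: str, fn: Callable[[List[Any]], Any]):
        self.functions[name] = fn

    def run(self, name: str, records: List[Any]):
        self.results_store[name] = self.functions[name](records)
        return self.results_store[name]

kernel = MiningKernel()
# unsupervised toy example: "clustering" records by value parity
kernel.register("parity_clusters",
                lambda xs: {"even": [x for x in xs if x % 2 == 0],
                            "odd":  [x for x in xs if x % 2 == 1]})
print(kernel.run("parity_clusters", [1, 2, 3, 4, 5]))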
Abstract:
This paper presents the overall methodology that has been used to encode both the Brazilian Portuguese WordNet (WordNet.Br) standard language-independent conceptual-semantic relations (hyponymy, co-hyponymy, meronymy, cause, and entailment) and the so-called cross-lingual conceptual-semantic relations between different wordnets. Accordingly, after contextualizing the project and outlining the current lexical database structure and statistics, it describes the WordNet.Br editing GUI, which was designed to aid the linguist in building synsets, selecting sample sentences from corpora, writing synset concept glosses, and encoding both the language-independent conceptual-semantic relations and the cross-lingual conceptual-semantic relations between WordNet.Br and Princeton WordNet. © Springer-Verlag Berlin Heidelberg 2006.
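A minimal data-structure sketch of synsets with language-independent and cross-lingual relations is given below; the field names, relation labels, and the Princeton WordNet identifier are illustrative only and do not reflect the WordNet.Br editing tool.

# Synsets carry words, a gloss, and sample sentences; relations link synsets
# within WordNet.Br, while cross-lingual links point to Princeton WordNet.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Synset:
    words: List[str]
    gloss: str = ""
    samples: List[str] = field(default_factory=list)
    relations: Dict[str, List["Synset"]] = field(default_factory=dict)   # e.g. "hypernymy"
    cross_lingual: Dict[str, str] = field(default_factory=dict)          # e.g. {"eq_synonym": ...}

animal = Synset(words=["animal"], gloss="a living organism that feeds on organic matter")
cachorro = Synset(words=["cachorro", "cão"], gloss="domestic mammal of the canid family")
cachorro.relations.setdefault("hypernymy", []).append(animal)
cachorro.cross_lingual["eq_synonym"] = "dog.n.01"   # hypothetical PWN identifier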
Abstract:
ArcTech is software being developed, applied, and improved with the aim of becoming an efficient sensitization tool to support the teaching-learning process in Architecture courses. The application initially deals with the thermal comfort of buildings. The output generated by the software shows whether a student is able to produce a pleasant environment, in terms of thermal sensation, over a 24-hour period. Although the very same features can be found in fully developed commercial software, the reasons for creating ArcTech are the flexibility of the system, which can be adapted by the instructor, and the need for simple tools to evaluate specific topics along the courses. The first part of ArcTech, dedicated to data management, was developed using the visual programming language Delphi 7 with Firebird as the database management system. The second part contains the parameters that can be changed by the system administrator and those related to project visualization. The system interface, through which the student learns to implement and evaluate project alternatives, was built using Macromedia Flash. The software was used with undergraduate students, revealing an interface that is easy to learn and easy to teach with.
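As a toy illustration of a 24-hour thermal-sensation check (not ArcTech's model), the sketch below tests hourly indoor temperatures against an assumed comfort band and reports the fraction of comfortable hours.

# Assumed comfort band; ArcTech's actual thermal model is not reproduced here.
COMFORT_RANGE = (20.0, 26.0)    # degrees Celsius, illustrative only

def comfort_report(hourly_temps):
    ok = [COMFORT_RANGE[0] <= t <= COMFORT_RANGE[1] for t in hourly_temps]
    return sum(ok) / len(ok)

temps = [18 + 8 * (1 - abs(12 - h) / 12) for h in range(24)]   # crude daily cycle
print(f"comfortable hours: {comfort_report(temps):.0%}")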