977 results for Distributed Database Integration


Relevance:

80.00%

Publisher:

Abstract:

Large-scale wireless ad hoc networks of computers, sensors, PDAs, etc. (i.e. nodes) are revolutionizing connectivity and leading to a paradigm shift from centralized systems to highly distributed and dynamic environments. An example of ad hoc networks are sensor networks, which are usually composed of small units able to sense and transmit elementary data to a sink, where the data are subsequently processed by an external machine. Recent improvements in the memory and computational power of sensors, together with the reduction of energy consumption, are rapidly changing the potential of such systems, moving the attention towards data-centric sensor networks. A plethora of routing and data management algorithms have been proposed for network path discovery, ranging from broadcasting/flooding-based approaches to those using global positioning systems (GPS). We studied WGrid, a novel decentralized infrastructure that organizes wireless devices in an ad hoc manner, where each node has one or more virtual coordinates through which both message routing and data management occur without reliance on either flooding/broadcasting operations or GPS. The resulting ad hoc network does not suffer from the dead-end problem, which occurs in geographic routing when a node is unable to locate a neighbor closer to the destination than itself. WGrid allows multidimensional data management, since the nodes' virtual coordinates can act as a distributed database without requiring any special implementation or reorganization. Any kind of data (both one-dimensional and multidimensional) can be distributed, stored and managed. We show how a location service can be easily implemented so that any search is reduced to a simple query, as for any other data type. WGrid has then been extended by adopting a replication methodology; we call the resulting algorithm WRGrid. Just like WGrid, WRGrid acts as a distributed database without requiring any special implementation or reorganization, and any kind of data can be distributed, stored and managed. We have evaluated the benefits of replication on data management, finding from experimental results that it can halve the average number of hops in the network. The direct consequences are a significant improvement in energy consumption and a better workload balance among sensors (number of messages routed by each node). Finally, thanks to the replicas, whose number can be chosen arbitrarily, the resulting sensor network can cope with sensor disconnections/connections caused by sensor failures without data loss. Another extension of WGrid is W*Grid, which strongly improves network recovery performance after link and/or device failures that may happen due to device crashes, battery exhaustion or temporary obstacles. W*Grid guarantees, by construction, at least two disjoint paths between each pair of nodes. This implies that recovery in W*Grid occurs without broadcast transmissions, guaranteeing robustness while drastically reducing energy consumption. An extensive set of simulations shows the efficiency, robustness and traffic load of the resulting networks under several scenarios of device density and number of coordinates. The performance analysis has been compared with existing algorithms in order to validate the results.
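
A minimal sketch of the routing idea over virtual coordinates follows, under the assumption that coordinates behave like binary strings in a coordinate tree (as in tree-based schemes of the WGrid family) and that a node greedily forwards to the neighbor whose coordinate is closest to the destination's; the actual WGrid coordinate construction is richer than shown here.

```python
# Greedy forwarding over tree-shaped virtual coordinates (illustrative sketch only).
# Assumption: each coordinate is a binary string; distance is distance in the tree.

def tree_distance(a, b):
    """Hops between two coordinate-tree nodes labelled by binary strings."""
    common = 0
    for ca, cb in zip(a, b):
        if ca != cb:
            break
        common += 1
    return (len(a) - common) + (len(b) - common)

def next_hop(current, neighbors, destination):
    """Pick the neighbor whose virtual coordinate is closest to the destination."""
    best = min(neighbors, key=lambda n: tree_distance(n, destination))
    # Forward only if the best neighbor actually makes progress (no dead end here,
    # since in WGrid the coordinate structure always offers a path).
    return best if tree_distance(best, destination) < tree_distance(current, destination) else None

print(next_hop("010", ["00", "0110", "10"], destination="0111"))  # -> '0110'
```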

Relevance:

80.00%

Publisher:

Abstract:

Over the last years, and particularly in the context of the COMBIOMED network, our biomedical informatics (BMI) group at the Universidad Politecnica de Madrid has carried out several approaches to address a fundamental issue: facilitating open access to, and retrieval of, BMI resources, including software, databases and services. In this regard, we have followed various directions: a) a text mining-based approach to automatically build a "resourceome", an inventory of open resources; b) methods for heterogeneous database integration, including clinical, -omics and nanoinformatics sources; c) creating various services to provide access to different resources for African users and professionals; and d) an approach to facilitate access to open resources from research projects.

Relevance:

80.00%

Publisher:

Abstract:

Current technology permits connecting local networks via high-bandwidth telephone lines. Central coordinator nodes may use Intelligent Networks to manage data flow over dialed data lines, e.g. ISDN, and to establish connections between LANs. This dissertation focuses on cost minimization and on establishing operational policies for query distribution over heterogeneous, geographically distributed databases. Based on our study of query distribution strategies, public network tariff policies, and database interface standards, we propose methods for communication cost estimation, strategies for the reduction of bandwidth allocation, and guidelines for central-to-node communication protocols. Our conclusion is that dialed data lines offer a cost-effective alternative for the implementation of distributed database query systems, and that existing commercial software may be adapted to support query processing in heterogeneous distributed database systems.
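
The sketch below illustrates the kind of communication cost estimate involved, assuming a simple tariff of a per-call setup fee plus a per-minute rate and that transfer time is data volume over the effective line bandwidth; the figures and the function are illustrative, not the dissertation's actual tariff model.

```python
import math

def call_cost(data_kbytes, bandwidth_kbps=64.0, setup_s=3.0,
              setup_fee=0.10, rate_per_min=0.04):
    """Estimated cost of shipping a query result over one dialed data line."""
    transfer_s = (data_kbytes * 8.0) / bandwidth_kbps      # time on the line
    billed_min = math.ceil((transfer_s + setup_s) / 60.0)  # tariffs bill whole minutes
    return setup_fee + billed_min * rate_per_min

# Cost of shipping a 500 kB intermediate result over one ISDN B-channel (assumed rates).
print(round(call_cost(500), 2))  # -> 0.18
```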

Relevance:

40.00%

Publisher:

Abstract:

The process of resource systems selection plays an important part in the integration of Distributed/Agile/Virtual Enterprises (D/A/V Es). However, resource systems selection is still a difficult problem to solve in a D/A/V E, as this paper points out. Globally, the selection problem has been approached from different perspectives, giving rise to different kinds of models and algorithms to solve it. In order to assist the development of an intelligent and flexible web prototype tool (broker tool) that integrates all the selection model activities and tools and can adapt to each D/A/V E project or instance (the major goal of our final project), this paper presents a formulation of one kind of resource selection problem and the limitations of the algorithms proposed to solve it. We formulate a particular case of the problem as an integer program, which is solved using the simplex and branch-and-bound algorithms, and identify their performance limitations (in terms of processing time) based on simulation results. These limitations depend on the number of processing tasks and on the number of pre-selected resources per processing task, thereby defining the domain of applicability of the algorithms for the problem studied. The limitations detected show the need to apply other kinds of algorithms (approximate solution algorithms) outside the domain of applicability found for the simulated algorithms. For a broker tool, however, knowledge of the algorithms' limitations is very important, so that, based on the problem features, the most suitable algorithm can be developed and selected to guarantee good performance.
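
A minimal sketch of this kind of selection problem follows, assuming each processing task has a set of pre-selected candidate resources with a cost, exactly one resource is chosen per task, and a resource serves at most one task. The enumerative branch-and-bound below is illustrative only; it is not the paper's simplex or branch-and-bound implementation, and its running time grows with the number of tasks and candidates, which is exactly the limitation discussed above.

```python
def select_resources(candidates):
    """candidates[t] = list of (resource_id, cost) pairs pre-selected for task t."""
    best = {"cost": float("inf"), "plan": None}

    def branch(task, used, plan, cost):
        if cost >= best["cost"]:          # bound: prune partial plans that cannot improve
            return
        if task == len(candidates):       # all tasks assigned
            best["cost"], best["plan"] = cost, list(plan)
            return
        for resource, c in sorted(candidates[task], key=lambda rc: rc[1]):
            if resource in used:          # a resource may serve at most one task (assumption)
                continue
            used.add(resource)
            plan.append(resource)
            branch(task + 1, used, plan, cost + c)
            plan.pop()
            used.remove(resource)

    branch(0, set(), [], 0.0)
    return best["plan"], best["cost"]

# Three tasks with hypothetical pre-selected resources and costs.
tasks = [
    [("R1", 4.0), ("R2", 6.0)],
    [("R2", 3.0), ("R3", 5.0)],
    [("R1", 7.0), ("R4", 2.0)],
]
print(select_resources(tasks))  # -> (['R1', 'R2', 'R4'], 9.0)
```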

Relevance:

40.00%

Publisher:

Abstract:

Master's Dissertation in Informatics Engineering

Relevance:

40.00%

Publisher:

Abstract:

The European Commission's 2006 Green Paper "A European Strategy for Sustainable, Competitive and Secure Energy" underlines that Europe has entered a new energy era. The overriding objectives of European energy policy must be sustainability, competitiveness and security of supply, and a coherent and consistent set of policies and measures is needed to achieve them. Europe's electricity markets and interconnected grids form the core of our energy system and must evolve to meet the new requirements. For many decades the European electricity networks have secured the vital links between electricity producers and consumers with great success. The basic structure of these networks was developed to meet the needs of large, predominantly coal-based generation technologies located far from the consumption centres. The energy problems Europe now faces are changing the electricity generation landscape in two respects: the need for clean power-plant technologies, combined with substantially improved efficiencies on the consumer side, will allow customers to interact much more closely with the networks; on the other hand, the future pan-European electricity networks must provide all consumers with a highly reliable, cost-effective supply of energy, exploiting both large centralised power plants and smaller local energy sources throughout Europe. In this context it should be noted that the information presented in this work is based on current issues that strongly influence the present technical and economic-policy debates. Over recent years the author has discussed many of the conclusions and recommendations presented here with representatives of the power-plant industry, network operators and utilities, research bodies and the regulatory authorities. The following paragraphs summarise the main results: This work defines the new concept, based on more consumer-oriented networks, and examines the necessities as well as the benefits and the obstacles of the transition towards a possible new model for Europe: smart electricity grids based on the strong integration of renewable sources and small local power plants. The new model is presented as a fundamental change that will significantly affect network design and control. It calls for a European electricity network with the following characteristics: – Flexible: it meets customers' needs by being able to respond to changes and new demands – Accessible: it grants connection access to all network users, in particular renewable energy sources and local high-efficiency generation with zero or low carbon-dioxide emissions – Reliable: it improves and guarantees security and quality of supply in line with the demands of the digital age, with the ability to respond to hazards and uncertainties – Economic: it provides best value through innovation and efficient energy management, and delivers a "level playing field" for competition and regulation.
It incorporates the latest technologies to ensure success while retaining the flexibility to adapt to further developments, and therefore calls for an ambitious programme of research, development and demonstration that charts a course towards an electricity supply network which meets the needs of Europe's future: – Network technologies that improve power transmission and reduce energy losses will increase the efficiency of supply, while new power electronics will improve supply quality. A toolbox of proven technical solutions will be created that can be deployed quickly and economically, so that existing networks can accept power injections from all energy resources. – Advances in simulation tools will strongly support the introduction of innovative technologies into practical use, to the benefit of both customers and utilities. They will ensure the successful matching of new and old designs of network components and guarantee the operation of automation and control arrangements. – Harmonisation of the regulatory and commercial frameworks in Europe will facilitate cross-border trading of both energy and grid services, accommodating a wide variety of operating situations. Common technical standards and protocols must be introduced to ensure open access and enable the deployment of equipment from any manufacturer. – Developments in communications, metering and trading systems will open up new opportunities at all levels to improve technical and commercial efficiency at an early stage on the basis of market signals. They will enable companies to use innovative service arrangements to improve their efficiency and enlarge their offerings to customers. Finally, it must be emphasised that for a successful transition to a future sustainable energy system, all relevant stakeholders must be involved.

Relevance:

40.00%

Publisher:

Abstract:

With the constant growth of enterprises, the need to share information across departments and business areas becomes more critical, and companies are turning to integration to provide a method for interconnecting heterogeneous, distributed and autonomous systems. Whether the sales application needs to interface with the inventory application or the procurement application needs to connect to an auction site, it seems that any application can be made better by integrating it with other applications. Integration between applications can face several troubles due to the fact that applications may not have been designed and implemented with integration in mind. Regarding integration issues, two-tier software systems, composed of the database tier and the "front-end" tier (interface), have shown some limitations. As a solution to overcome the two-tier limitations, three-tier systems were proposed in the literature. By adding a middle tier (referred to as middleware) between the database tier and the "front-end" tier (or simply the application), three main benefits emerge. The first benefit is that the division of software systems into three tiers enables increased integration capabilities with other systems. The second benefit is that modifications to the individual tiers may be carried out without necessarily affecting the other tiers and integrated systems, and the third benefit, a consequence of the others, is that fewer maintenance tasks are needed in the software system and in all integrated systems. Concerning software development in three tiers, this dissertation focuses on two emerging technologies, the Semantic Web and Service Oriented Architecture, combined with middleware. Blending these two technologies with middleware resulted in the development of the Swoat framework (Service and Semantic Web Oriented ArchiTecture) and leads to the following four synergistic advantages: (1) it allows the creation of loosely coupled systems, decoupling the database from the "front-end" tiers and therefore reducing maintenance; (2) the database schema is transparent to the "front-end" tiers, which are aware only of the information model (or domain model) that describes what data is accessible; (3) integration with other heterogeneous systems is enabled through services provided by the middleware; (4) the service requests issued by the "front-end" tier focus on 'what' data is needed rather than on 'where' and 'how' it is obtained, thereby reducing application development time.
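
A minimal sketch of the decoupling idea follows: the front-end asks a middleware layer for a domain-model concept ("what" data), and only the middleware knows "where" and "how" that concept maps onto the physical schema. The domain model, table and column names are invented for illustration; this is not Swoat's actual service interface.

```python
import sqlite3

DOMAIN_MODEL = {
    # domain concept -> physical mapping hidden from the front-end tier (assumed names)
    "Customer": {"table": "tbl_cust_v2", "fields": {"name": "cst_nm", "city": "cst_city"}},
}

def get_entities(conn, concept, wanted_fields):
    """Middleware service: resolve a domain concept into a query on the database tier."""
    mapping = DOMAIN_MODEL[concept]
    cols = [mapping["fields"][f] for f in wanted_fields]
    rows = conn.execute(f"SELECT {', '.join(cols)} FROM {mapping['table']}").fetchall()
    return [dict(zip(wanted_fields, row)) for row in rows]

# Tiny in-memory database standing in for the database tier.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tbl_cust_v2 (cst_nm TEXT, cst_city TEXT)")
conn.execute("INSERT INTO tbl_cust_v2 VALUES ('Alice', 'Lisbon')")

# The front-end tier requests 'what' it needs, unaware of table or column names.
print(get_entities(conn, "Customer", ["name", "city"]))  # [{'name': 'Alice', 'city': 'Lisbon'}]
```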

Relevance:

40.00%

Publisher:

Abstract:

Energy policies and technological progress in the development of wind turbines have made wind power the fastest growing renewable power source worldwide. The inherent variability of this resource requires special attention when analyzing the impacts of high penetration on the distribution network. A time-series steady-state analysis is proposed that assesses technical issues such as energy export, losses, and short-circuit levels. A multiobjective programming approach based on the nondominated sorting genetic algorithm (NSGA) is applied in order to find configurations that maximize the integration of distributed wind power generation (DWPG) while satisfying voltage and thermal limits. The approach has been applied to a medium-voltage distribution network considering hourly demand and wind profiles for part of the U.K. The Pareto optimal solutions obtained highlight the drawbacks of using a single demand and generation scenario, and indicate the importance of appropriate substation voltage settings for maximizing the connection of DWPG.
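
The sketch below shows the nondominated (Pareto) sorting step at the heart of NSGA, assuming each candidate configuration is scored by two objectives, exported wind energy (maximize) and losses (minimize); the candidate values are invented and the paper's actual evaluation of network configurations is far richer.

```python
def dominates(a, b):
    """a dominates b if it is no worse in both objectives and strictly better in at least one."""
    not_worse = a["export"] >= b["export"] and a["losses"] <= b["losses"]
    better = a["export"] > b["export"] or a["losses"] < b["losses"]
    return not_worse and better

def pareto_front(candidates):
    """Keep only the nondominated configurations (the first NSGA front)."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other is not c)]

configs = [
    {"name": "A", "export": 12.0, "losses": 1.4},
    {"name": "B", "export": 15.0, "losses": 2.1},
    {"name": "C", "export": 11.0, "losses": 1.9},  # dominated by A
    {"name": "D", "export": 18.0, "losses": 3.0},
]
print([c["name"] for c in pareto_front(configs)])  # -> ['A', 'B', 'D']
```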

Relevance:

40.00%

Publisher:

Abstract:

Background: The use of the knowledge produced by the sciences to promote human health is the main goal of translational medicine. To make it feasible, we need computational methods to handle the large amount of information that arises from bench to bedside and to deal with its heterogeneity. A computational challenge that must be faced is to promote the integration of clinical, socio-demographic and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular, ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform to store biological data; however, it lacks support for representing clinical and socio-demographic information. Results: We have implemented an extension of Chado, the Clinical Module, to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: the data level, to store the data; the semantic level, to integrate and standardize the data through the use of ontologies; the application level, to manage clinical databases, ontologies and the data integration process; and the web interface level, to allow interaction between the user and the system. The Clinical Module was built based on the Entity-Attribute-Value (EAV) model. We also proposed a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system. The Clinical Module was implemented and the framework was loaded using data from a factual clinical research database. Clinical and demographic data, as well as biomaterial data, were obtained from patients with tumors of the head and neck. We implemented the IPTrans tool, a complete environment for data migration, which comprises: the construction of a model, based on an ontology, to describe the legacy clinical data; the Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool to Chado, as well as other applications. Conclusions: Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information and are not integrated with existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information coming from different "omics" technologies with patients' clinical and socio-demographic data. This framework should present some features: flexibility, compression and robustness. The experiments carried out on a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.
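
A minimal sketch of the Entity-Attribute-Value layout the Clinical Module is based on follows: instead of one column per clinical variable, each observation is stored as a (patient, attribute, value) row, so new attributes require no schema change. Table and attribute names are invented here; this is not Chado's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE clinical_eav (
    patient_id  INTEGER,
    attribute   TEXT,   -- ideally a term from a common reference ontology
    value       TEXT)""")

rows = [
    (1, "tumor_site", "larynx"),
    (1, "smoking_status", "former"),
    (2, "tumor_site", "oral cavity"),
]
conn.executemany("INSERT INTO clinical_eav VALUES (?, ?, ?)", rows)

# Reassemble one patient's record from its attribute-value pairs.
record = dict(conn.execute(
    "SELECT attribute, value FROM clinical_eav WHERE patient_id = 1"))
print(record)  # {'tumor_site': 'larynx', 'smoking_status': 'former'}
```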

Relevance:

40.00%

Publisher:

Abstract:

This doctoral thesis spans two different but complementary research areas: Publish/Subscribe networks and mobile distributed services. Over the years, Publish/Subscribe networks have offered the lightweight, decoupled communication characteristics that improve information distribution in several application domains, such as the execution of distributed services in fixed environments. Mobile distributed services are meant to be deployed in wireless environments where mobile devices must rely on the same features that Publish/Subscribe networks have been offering to their fixed counterparts. In this context, one of the pending research directions consists of how to take advantage of these features and advance towards new, as yet non-existing solutions that enhance the integration between fixed and mobile devices and the execution of mobile distributed services. This dissertation seeks to advance the integration and coordination mechanisms of mobile distributed services in the context of Publish/Subscribe networks. Its specific objectives are to achieve the integration of fixed and mobile Publish/Subscribe systems and to provide a specific, uniform version of a Publish/Subscribe network with coordination mechanisms that improve the execution of mobile distributed services. The results of this dissertation are embodied in a specific version of a Publish/Subscribe network that integrates fixed and mobile brokers and enables a fully decoupled and improved coordination among mobile devices executing service fragments. The specific contributions are: a new mobile broker architecture, called the Rendezvous Mobile Broker; an abstract model for coordinating mobile distributed services over a Publish/Subscribe network; improved epidemic (gossip) routing mechanisms to disseminate control events produced by service fragments; a solution to support highly fragmented and geographically dispersed services; and finally an internetworking solution between two Publish/Subscribe network domains, one based on the PubSubHubbub protocol and the other on the Publish/Subscribe Internet Routing Paradigm (PSIRP). The experiments carried out confirm that the proposed version of the Publish/Subscribe network improves network performance in terms of end-to-end delay, enables a coordination of mobile distributed services that is more resilient to interruptions and makes better use of the existing network resources, and finally achieves, with only minor variations in communication performance, the interconnection between these different Publish/Subscribe domains.
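
A minimal sketch of the decoupled publish/subscribe interaction this work builds on is given below: publishers and subscribers never reference each other, only topics. The toy broker and topic names are invented; this is not the Rendezvous Mobile Broker proposed in the dissertation.

```python
from collections import defaultdict

class Broker:
    """Tiny topic-based broker: routes each published event to topic subscribers."""
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, event):
        for callback in self.subscribers[topic]:
            callback(event)

broker = Broker()
broker.subscribe("service/fragment-status", lambda e: print("coordinator got:", e))
# A mobile device executing a service fragment publishes a control event.
broker.publish("service/fragment-status", {"fragment": "f1", "state": "done"})
```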

Relevance:

40.00%

Publisher:

Abstract:

The Mouse Genome Database (MGD) is the community database resource for the laboratory mouse, a key model organism for interpreting the human genome and for understanding human biology and disease (http://www.informatics.jax.org). MGD provides standard nomenclature and consensus map positions for mouse genes and genetic markers; it provides a curated set of mammalian homology records, user-defined chromosomal maps, experimental data sets and the definitive mouse ‘gene to sequence’ reference set for the research community. The integration and standardization of these data sets facilitates the transition between mouse DNA sequence, gene and phenotype annotations. A recent focus on allele and phenotype representations enhances the ability of MGD to organize and present data for exploring the relationship between genotype and phenotype. This link between the genome and the biology of the mouse is especially important as phenotype information grows from large mutagenesis projects and genotype information grows from large-scale sequencing projects.

Relevance:

40.00%

Publisher:

Abstract:

The purpose of this work is the development of the database of a distributed information measurement and control system that implements methods of optical spectroscopy for research on plasma physics and atomic collisions and provides remote access to information and hardware resources within Intranet/Internet networks. The database is built on the Oracle9i database management system. The client software was implemented in Java. The software was developed using the Model-View-Controller architecture, which separates application data from graphical presentation components and input-processing logic. The following graphical presentations were implemented: measurement of radiation spectra of beam and plasma objects, excitation functions for inelastic collisions of heavy particles, and analysis of data acquired in preceding experiments. The graphical clients provide the following functionality for interacting with the database: browsing information on experiments of a given type, searching for data by various criteria, and inserting information about preceding experiments.
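
A minimal sketch of the Model-View-Controller separation mentioned above follows, with invented class and field names; the real system uses Java clients against Oracle9i rather than the stand-in data shown here.

```python
class SpectrumModel:                     # application data and its access logic
    def __init__(self):
        self._spectra = {"exp42": [0.1, 0.4, 0.9, 0.3]}   # stand-in for the database

    def spectrum(self, experiment_id):
        return self._spectra.get(experiment_id, [])

class SpectrumView:                      # graphical presentation component
    def render(self, experiment_id, points):
        print(f"{experiment_id}: " + " ".join(f"{p:.2f}" for p in points))

class SpectrumController:                # input-processing logic
    def __init__(self, model, view):
        self.model, self.view = model, view

    def show(self, experiment_id):
        self.view.render(experiment_id, self.model.spectrum(experiment_id))

SpectrumController(SpectrumModel(), SpectrumView()).show("exp42")
```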

Relevance:

40.00%

Publisher:

Abstract:

Methods and software for the integration of databases (DBs) on the properties of inorganic materials and substances have been developed. The integration of the information systems is based on a combination of known approaches: EII (Enterprise Information Integration) and EAI (Enterprise Application Integration). The kernel of the integrated system is the metabase, a special database that stores data on the contents of the integrated DBs. The proposed methods have been applied to create an integrated system of DBs in the field of inorganic chemistry and materials science. An important feature of the developed integrated system is its ability to include DBs created with different DBMSs on essentially different computer platforms, Sun (the "Diagram" DB) and Intel (the other DBs), and under different operating systems, Sun Solaris (the "Diagram" DB) and Microsoft Windows Server (the other DBs).
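
A minimal sketch of the metabase idea follows: a small registry that records which integrated database holds which kind of content, so a query can be routed to the right source. Apart from "Diagram", the database names, platforms and content labels below are invented for illustration.

```python
METABASE = [
    {"db": "Diagram", "platform": "Sun Solaris",              "contents": {"phase diagrams"}},
    {"db": "BandGap", "platform": "Microsoft Windows Server", "contents": {"band gaps", "crystal structures"}},
]

def sources_for(content_kind):
    """Return the registered databases that store the requested kind of data."""
    return [entry["db"] for entry in METABASE if content_kind in entry["contents"]]

print(sources_for("band gaps"))        # -> ['BandGap']
print(sources_for("phase diagrams"))   # -> ['Diagram']
```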

Relevance:

40.00%

Publisher:

Abstract:

This research presents several components encompassing the scope of the objective of data partitioning and replication management in a distributed GIS database. Modern Geographic Information Systems (GIS) databases are often large and complicated, so data partitioning and replication management problems need to be addressed in the development of an efficient and scalable solution. Part of the research is to study the patterns of geographical raster data processing and to propose algorithms that improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects as well as data partitioning in geographic databases in order to achieve high data availability and Quality of Service (QoS), considering distributed data delivery and processing. To achieve this goal, a dynamic, real-time approach for mosaicking digital images of different temporal and spatial characteristics into tiles is proposed. This dynamic approach reuses digital images upon demand and generates mosaicked tiles only for the required region, according to the user's requirements such as resolution, temporal range, and target bands, in order to reduce redundancy in storage and to utilize the available computing and storage resources more efficiently. Another part of the research pursued methods for efficiently acquiring GIS data from external heterogeneous databases and Web services, as well as end-user GIS data delivery enhancements, automation and 3D virtual reality presentation. Vast numbers of computing, network, and storage resources available on the Internet are idle or not fully utilized. The proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network, and storage resources to be used in a GIS database context. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database. The approach developed in this dissertation resulted in the creation of the TerraFly GIS database, which is used by the US government, researchers, and the general public to facilitate Web access to remotely sensed imagery and GIS vector information.
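
A minimal sketch of the on-demand mosaicking idea follows: tiles are generated only for the requested region and reused from a cache when the same (tile, resolution) has already been produced. The tiling scheme, tile size and cache are invented for illustration; they are not TerraFly's actual implementation.

```python
import math

TILE_SIZE = 1.0          # tile edge length in degrees (assumed)
_tile_cache = {}         # (x_index, y_index, resolution) -> mosaicked tile

def tiles_for_region(min_lon, min_lat, max_lon, max_lat, resolution):
    """Return the tiles covering the requested bounding box, reusing cached ones."""
    tiles = []
    for xi in range(math.floor(min_lon / TILE_SIZE), math.ceil(max_lon / TILE_SIZE)):
        for yi in range(math.floor(min_lat / TILE_SIZE), math.ceil(max_lat / TILE_SIZE)):
            key = (xi, yi, resolution)
            if key not in _tile_cache:            # mosaic only what is actually needed
                _tile_cache[key] = f"tile({xi},{yi})@{resolution}m"
            tiles.append(_tile_cache[key])
    return tiles

# Tiles covering a small bounding box at an assumed 30 m resolution.
print(tiles_for_region(-80.4, 25.7, -79.9, 26.2, resolution=30))
```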