479 results for Interoperability
Abstract:
The application of semantic technologies to the integration of biological data and the interoperability of bioinformatics analysis and visualization tools has been the common theme of a series of annual BioHackathons hosted in Japan for the past five years. Here we review the activities and outcomes of the BioHackathons held in 2011 in Kyoto and 2012 in Toyama. In order to implement semantic technologies in the life sciences efficiently, participants formed various sub-groups and worked on the following topics: Resource Description Framework (RDF) models for specific domains, text mining of the literature, ontology development, essential metadata for biological databases, platforms to enable efficient Semantic Web technology development and interoperability, and the development of applications for Semantic Web data. In this review, we briefly introduce the themes covered by these sub-groups and discuss the observations made, the conclusions drawn, and the software development projects that emerged from these activities.
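As a flavour of the RDF modelling work mentioned above, the following minimal sketch builds and queries a small life-science graph with the rdflib Python library. The namespace, identifiers and predicates are illustrative assumptions, not actual BioHackathon vocabulary.

```python
# Minimal sketch of building and querying a life-science RDF graph with rdflib.
# The namespace, gene identifier and predicate names are illustrative
# assumptions, not actual BioHackathon vocabulary.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/bio/")  # hypothetical namespace

g = Graph()
gene = EX["gene/BRCA1"]
g.add((gene, RDF.type, EX.Gene))
g.add((gene, EX.symbol, Literal("BRCA1")))
g.add((gene, EX.locatedOn, EX["chromosome/17"]))

# SPARQL works over the in-memory graph exactly as over a remote triple store.
results = g.query("""
    PREFIX ex: <http://example.org/bio/>
    SELECT ?gene ?symbol WHERE {
        ?gene a ex:Gene ;
              ex:symbol ?symbol .
    }
""")
for row in results:
    print(row.gene, row.symbol)
```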
Abstract:
The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included "Data Science", "Standards and Interoperability", "Open Science and Reproducibility", "Translational Bioinformatics", "Visualization", and "Bioinformatics Open Source Project Updates". In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled "Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community," that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule.
Abstract:
This paper presents innovative work in the development of policy-based autonomic computing. The core of the work is AGILE, a powerful and flexible policy-expression language that facilitates run-time adaptable policy configuration of autonomic systems. AGILE also serves as an integrating platform for other self-management technologies, including signal processing, automated trend analysis and utility functions. Each of these technologies has specific advantages and applicability to different types of dynamic adaptation. The AGILE platform enables the different technologies to interoperate seamlessly, each performing different aspects of self-management within a single application. The various technologies are implemented as object components. Self-management behaviour is specified using the policy language semantics to bind the various components together as required. Since the policy semantics support run-time re-configuration, the self-management architecture is dynamically composable. Additional benefits include the standardisation of the application programmer interface, terminology and semantics, and the need for only a single point of embedding.
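The abstract does not show AGILE's concrete syntax, so the following is a hedged Python sketch of the general pattern it describes: a policy binds pluggable self-management components, and the binding can be re-configured at run time. All names are invented for illustration.

```python
# Illustrative sketch only: AGILE's actual syntax and API are not shown in the
# abstract. This models the general pattern of a policy binding pluggable
# self-management components that can be re-configured at run time.
from typing import Callable, Dict, List

class PolicyEngine:
    def __init__(self) -> None:
        self.components: Dict[str, Callable[[dict], dict]] = {}
        self.pipeline: List[str] = []          # current policy configuration

    def register(self, name: str, component: Callable[[dict], dict]) -> None:
        self.components[name] = component

    def configure(self, pipeline: List[str]) -> None:
        # Run-time re-configuration: swap the active composition of components.
        self.pipeline = pipeline

    def evaluate(self, state: dict) -> dict:
        for name in self.pipeline:
            state = self.components[name](state)
        return state

engine = PolicyEngine()
engine.register("trend", lambda s: {**s, "rising": s["load"] > s.get("prev", 0)})
engine.register("utility", lambda s: {**s, "scale_up": s["rising"] and s["load"] > 0.8})
engine.configure(["trend", "utility"])
print(engine.evaluate({"load": 0.9, "prev": 0.5}))   # scale_up becomes True
```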
Abstract:
This paper describes an autonomics development tool that serves as both a powerful and flexible policy-expression language and a policy-based framework supporting the integration and dynamic composition of several autonomic computing techniques, including signal processing, automated trend analysis and utility functions. Each of these techniques has specific advantages and applicability to different types of dynamic adaptation. The AGILE platform enables the different techniques to interoperate seamlessly, each performing different aspects of self-management within a single application. Self-management behaviour is specified using the policy language semantics to bind the various techniques together as required. Since the policy semantics support run-time re-configuration, the self-management architecture is dynamically composable. The policy language and implementation library have integrated support for self-stabilising behaviour, enabling oscillation and other forms of instability to be handled at the policy level with very little effort on the part of the application developer. Example applications are presented to illustrate the integration of different autonomics techniques and the achievement of dynamic composition.
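As a hedged illustration of the self-stabilising behaviour described above, the sketch below uses hysteresis (distinct up/down thresholds), one common way to suppress oscillation at the policy level; the thresholds are invented, and this is not AGILE's actual mechanism.

```python
# Hedged sketch: one common way to suppress the oscillation the abstract refers
# to is hysteresis with distinct up/down thresholds. The threshold values are
# invented for illustration, not taken from AGILE itself.
class StabilisedPolicy:
    def __init__(self, up: float = 0.8, down: float = 0.4) -> None:
        assert down < up, "distinct thresholds create the dead band"
        self.up, self.down = up, down
        self.active = False      # current actuation state

    def decide(self, metric: float) -> bool:
        # Only flip state when the metric crosses the *far* threshold,
        # so values wandering inside the dead band cause no flapping.
        if not self.active and metric > self.up:
            self.active = True
        elif self.active and metric < self.down:
            self.active = False
        return self.active

policy = StabilisedPolicy()
for load in (0.5, 0.85, 0.7, 0.6, 0.35):
    print(load, policy.decide(load))   # stays on until load drops below 0.4
```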
Abstract:
Although many scholars recognise the great potential of games for teaching and learning, the EU-based industry for such “serious games” is highly fragmented and its growth figures remain well behind those of the leisure game market. Serious gaming has been designated a priority area by the European Commission in its Horizon 2020 Framework Programme for Research and Innovation. The RAGE project, funded as part of the Horizon 2020 Programme, is a technology-driven research and innovation project that will make available a series of self-contained gaming software modules that support game studios in the development of serious games. As game studios are a critical factor in the uptake of serious games, the RAGE project will base its work on their views and needs so as to achieve maximum impact. This paper presents the results of a survey among European game studios about their development-related needs and expectations. The survey aims to establish a baseline reference for successfully supporting game studios with advanced ICTs for serious games.
Abstract:
This paper is concerned with several of the most important aspects of Competence-Based Learning (CBL): course authoring, assignments, and the categorization of learning content. The latter is part of the so-called Bologna Process (BP) and can be effectively supported by integrating knowledge resources such as standardized skill and competence taxonomies into the target implementation approach, aiming to make effective use of an open integration architecture while fostering the interoperability of hybrid knowledge-based e-learning solutions. Modern scenarios call for interoperable software solutions that seamlessly integrate existing e-learning infrastructures and legacy tools with innovative technologies while remaining cognitively efficient to handle, so that prospective users can work with them without a learning overhead. At the same time, methods of Learning Design (LD) in combination with CBL are becoming increasingly important for the production and maintenance of easy-to-use solutions. We present our approach to developing competence-based course-authoring and assignment support software. It bridges the gaps between contemporary Learning Management Systems (LMS) and established legacy learning infrastructures by embedding existing resources via Learning Tools Interoperability (LTI). Furthermore, the underlying conceptual architecture for this integration approach is explained. In addition, a competence management structure based on knowledge technologies supporting standardized skill and competence taxonomies is introduced. The overall goal is to develop a software solution that will not only merge flawlessly into a legacy platform and several other learning environments, but also remain intuitively usable. As a proof of concept, the platform-independent conceptual architecture model is validated against a concrete use case scenario.
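Since the integration above rests on LTI, the following hedged sketch shows how an LTI 1.1 launch is signed: an OAuth 1.0 HMAC-SHA1 signature over the launch parameters, computed here with only the Python standard library. The launch URL, consumer key and secret are placeholders, and a real launch would carry further LTI-defined user and context parameters.

```python
# Hedged sketch of an LTI 1.1 launch signature (OAuth 1.0 HMAC-SHA1) using only
# the standard library. The URL, key and secret are placeholders; a real tool
# consumer would also send user/context parameters defined by the LTI spec.
import base64, hashlib, hmac, time, uuid
from urllib.parse import quote

def oauth_hmac_sha1(method: str, url: str, params: dict, secret: str) -> str:
    enc = lambda s: quote(str(s), safe="")               # RFC 3986 encoding
    pairs = "&".join(f"{enc(k)}={enc(v)}" for k, v in sorted(params.items()))
    base = "&".join([method.upper(), enc(url), enc(pairs)])
    key = enc(secret) + "&"          # empty token secret for LTI launches
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

launch_url = "https://tool.example.org/lti/launch"       # hypothetical endpoint
params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "course42-item7",
    "oauth_consumer_key": "demo-key",
    "oauth_nonce": uuid.uuid4().hex,
    "oauth_timestamp": str(int(time.time())),
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}
params["oauth_signature"] = oauth_hmac_sha1("POST", launch_url, params, "demo-secret")
print(params["oauth_signature"])     # value the LMS POSTs with the launch form
```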
Abstract:
To seize the potential of serious games, the RAGE project, funded by the Horizon 2020 Programme of the European Commission, will make available an interoperable set of advanced technology components (software assets) that support game studios in serious game development. This paper describes the overall software architecture and design conditions needed for the easy integration and reuse of such software assets in existing game platforms. Based on the component-based software engineering paradigm, the RAGE architecture takes into account the portability of assets to different operating systems, different programming languages and different game engines. It avoids dependencies on external software frameworks and minimizes code that may hinder integration with game engine code. Furthermore, it relies on a limited set of standard software patterns and well-established coding practices. The RAGE architecture has been successfully validated by implementing and testing basic software assets in four major programming languages (C#, C++, Java and TypeScript/JavaScript). A demonstrator implementation of asset integration with an existing game engine was created and validated. The presented RAGE architecture paves the way for the large-scale development and application of cross-engine reusable software assets that enhance the quality and diversity of serious gaming.
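As a hedged illustration of the component-based approach described above, the sketch below keeps a reusable asset free of engine dependencies behind a thin interface that the host game engine implements; the class and method names are invented, not taken from the RAGE architecture.

```python
# Illustrative sketch (names are invented): a reusable "asset" keeps no direct
# dependency on any game engine; the host engine supplies a small bridge that
# the asset calls for platform services such as logging or file access.
from abc import ABC, abstractmethod

class IBridge(ABC):
    """Thin contract the host game engine implements for the asset."""
    @abstractmethod
    def log(self, msg: str) -> None: ...
    @abstractmethod
    def load_text(self, path: str) -> str: ...

class DifficultyAdaptationAsset:
    """Engine-agnostic component: depends only on IBridge, never on the engine."""
    def __init__(self, bridge: IBridge) -> None:
        self.bridge = bridge

    def adjust(self, successes: int, attempts: int) -> float:
        rate = successes / max(attempts, 1)
        difficulty = min(1.0, max(0.0, rate))   # toy adaptation rule
        self.bridge.log(f"difficulty set to {difficulty:.2f}")
        return difficulty

class ConsoleBridge(IBridge):        # what a particular engine would provide
    def log(self, msg: str) -> None:
        print("[engine]", msg)
    def load_text(self, path: str) -> str:
        return open(path, encoding="utf-8").read()

asset = DifficultyAdaptationAsset(ConsoleBridge())
asset.adjust(successes=7, attempts=10)
```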
Abstract:
The report provides a review of the current level of exchange of marine life data and its management in the UK, taking into account the structures currently in place between data providers, custodians and managers. In addition, the report makes recommendations on how data flow can be improved over the next few years to achieve greater exchange and interoperability within the marine sector.
Abstract:
The Open Service Network for Marine Environmental Data (NETMAR) project uses semantic web technologies in its pilot system, which aims to allow users to search, download and integrate satellite, in situ and model data from open ocean and coastal areas. The semantic web is an extension of the fundamental ideas of the World Wide Web, building a web of data through the annotation of metadata and data with hyperlinked resources. Within the framework of the NETMAR project, an interconnected semantic web resource was developed to aid data and web service discovery and to validate Open Geospatial Consortium Web Processing Service orchestration. A second semantic resource was developed to support the interoperability of coastal web atlases across jurisdictional boundaries. This paper outlines the approach taken to producing the resource registry used within the NETMAR project and demonstrates the use of these semantic resources to support user interactions with systems. Such interconnected semantic resources improve the ability to share and disseminate data by facilitating interoperability between data providers. The formal representation of geospatial knowledge to advance geospatial interoperability is a growing research area, and tools and methods such as those outlined in this paper have the potential to support these efforts.
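To illustrate how a client might interact with such a semantic resource, the following hedged sketch queries a SPARQL endpoint for SKOS concept labels using the Python SPARQLWrapper library; the endpoint URL and the query are illustrative assumptions, not the actual NETMAR registry interface.

```python
# Hedged sketch: how a client might query one of the project's semantic
# resources over SPARQL. The endpoint URL and the SKOS-based query are
# illustrative assumptions, not the actual NETMAR registry interface.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://vocab.example.org/sparql")  # placeholder URL
endpoint.setQuery("""
    PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
    SELECT ?concept ?label WHERE {
        ?concept skos:prefLabel ?label .
        FILTER(CONTAINS(LCASE(STR(?label)), "sea surface temperature"))
    } LIMIT 10
""")
endpoint.setReturnFormat(JSON)

for binding in endpoint.query().convert()["results"]["bindings"]:
    print(binding["concept"]["value"], "->", binding["label"]["value"])
```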
Abstract:
WebCom-G is a fledgling Grid Operating System, designed to provide independent service access through interoperability with existing middlewares. It offers an expressive programming model that automatically handles task synchronisation; load balancing, fault tolerance, and task allocation are handled at the WebCom-G system level without burdening the application writer. These characteristics, together with the ability of its computing model to mix evaluation strategies to match the characteristics of the geographically dispersed facilities and the overall problem-solving environment, make WebCom-G a promising grid middleware candidate.
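As a hedged illustration of the programming model described above (and not WebCom-G's actual API), the sketch below shows the dataflow idea in which a task fires once all of its inputs are available, making synchronisation implicit.

```python
# Illustrative sketch only (not WebCom-G's API): in a dataflow-style model,
# a task fires once all of its inputs are available, so the application
# writer never codes explicit synchronisation.
from typing import Callable, Dict, List, Tuple

Task = Tuple[Callable, List[str], str]   # (function, input names, output name)

def run_graph(tasks: List[Task], inputs: Dict[str, object]) -> Dict[str, object]:
    values = dict(inputs)
    pending = list(tasks)
    while pending:
        for task in pending:
            fn, needs, out = task
            if all(n in values for n in needs):      # firing rule
                values[out] = fn(*(values[n] for n in needs))
                pending.remove(task)
                break
        else:
            raise RuntimeError("deadlock: unsatisfiable inputs")
    return values

graph = [
    (lambda a, b: a + b, ["x", "y"], "sum"),
    (lambda s: s * s,    ["sum"],    "square"),
]
print(run_graph(graph, {"x": 3, "y": 4})["square"])   # 49
```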
Abstract:
Model Driven Architecture supports the transformation from reusable models to executable software. Business representations, however, cannot be fully and explicitly represented in such models for direct transformation into running systems. Thus, once business needs change, the language abstractions used by MDA (e.g. Object Constraint Language / Action Semantics), being low level, have to be edited directly. We therefore describe an Agent-oriented Model Driven Architecture (AMDA) that uses a set of business models which are continuously maintained by business people, reflect current business needs, and are associated with adaptive agents that interpret the captured knowledge to behave dynamically. Three contributions of the AMDA approach are identified: 1) to Agent-oriented Software Engineering, a method of building adaptive Multi-Agent Systems; 2) to MDA, a means of abstracting high-level business-oriented models to align executable systems with their requirements at runtime; 3) to distributed systems, the interoperability of disparate components and services via the agent abstraction.
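As a hedged sketch of the AMDA idea that agents interpret business models at run time, the following uses an invented declarative rule format: editing the model changes the agent's behaviour without regenerating code.

```python
# Hedged sketch (the rule format is invented, not AMDA's notation): an agent
# interprets a declarative business model at run time, so editing the model
# changes behaviour without regenerating or redeploying code.
business_model = [
    {"if": {"order_total_gte": 1000}, "then": "require_manager_approval"},
    {"if": {"order_total_gte": 0},    "then": "auto_approve"},
]

class Agent:
    def __init__(self, model: list) -> None:
        self.model = model                    # kept editable by business people

    def decide(self, order_total: float) -> str:
        for rule in self.model:               # first matching rule wins
            if order_total >= rule["if"]["order_total_gte"]:
                return rule["then"]
        return "reject"

agent = Agent(business_model)
print(agent.decide(1500))                     # require_manager_approval
business_model[0]["if"]["order_total_gte"] = 2000   # run-time model change
print(agent.decide(1500))                     # auto_approve
```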
Abstract:
The development of wideband network services and the new network infrastructures to support them have placed far greater demands on current network management systems. Issues such as scalability, integrity and interoperability have become more important. Existing management systems are not flexible enough to support the provision of Quality of Service (QoS) in these dynamic environments. The concept of Programmable Networks has been proposed to address these requirements. Within this framework, CORBA is regarded as a middleware technology that can enable interoperation among the distributed entities found in Programmable Networks. By using the basic CORBA environment in a heterogeneous network environment, a network manager is able to control remote Network Elements (NEs) in the same way it controls its local resources. Using this approach, both the flexibility and the intelligence of overall network management can be improved. This paper proposes the use of two advanced features of CORBA to enhance QoS management in a Programmable Network environment: the Transaction Service can be used to manage a set of tasks whenever the management of elements in a network is correlated, and the Concurrency Service can be used to coordinate multiple accesses to the same network resources. It is also shown that proper use of CORBA can greatly reduce the development and administration effort of network management applications.
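The sketch below illustrates the coordination pattern that the Transaction Service supplies, namely two-phase commit over correlated network elements, so that a QoS change applies either everywhere or nowhere; the NetworkElement interface is invented Python, not actual CORBA IDL.

```python
# Illustrative sketch of the pattern the CORBA Transaction Service provides:
# two-phase commit over correlated network elements, so a QoS change either
# applies on every element or on none. The NE interface here is invented,
# not actual CORBA IDL.
class NetworkElement:
    def __init__(self, name: str, will_accept: bool = True) -> None:
        self.name, self.will_accept = name, will_accept
        self.staged = None

    def prepare(self, qos: dict) -> bool:     # phase 1: vote
        self.staged = qos if self.will_accept else None
        return self.will_accept

    def commit(self) -> None:                 # phase 2: apply
        print(f"{self.name}: applied {self.staged}")

    def rollback(self) -> None:
        self.staged = None
        print(f"{self.name}: rolled back")

def configure_atomically(elements, qos: dict) -> bool:
    if all(ne.prepare(qos) for ne in elements):
        for ne in elements:
            ne.commit()
        return True
    for ne in elements:
        ne.rollback()
    return False

nes = [NetworkElement("edge-1"), NetworkElement("core-1", will_accept=False)]
configure_atomically(nes, {"bandwidth_mbps": 100})   # rolls back everywhere
```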
Abstract:
To date, the processing of wildlife location data has relied on a diversity of software and file formats. Data management and the subsequent spatial and statistical analyses were undertaken in multiple steps, involving many time-consuming importing/exporting phases. Recent technological advancements in tracking systems have made large, continuous, high-frequency datasets of wildlife behavioural data available, such as those derived from the Global Positioning System (GPS) and other animal-attached sensor devices. These data can be further complemented by a wide range of other information about the animals’ environment. Management of these large and diverse datasets for modelling animal behaviour and ecology can prove challenging, slowing down analysis and increasing the probability of mistakes in data handling. We address these issues by critically evaluating the requirements for good management of GPS data for wildlife biology. We highlight that dedicated data management tools and expertise are needed, and we explore current research in wildlife data management. We suggest a general direction of development, based on a modular software architecture with a spatial database at its core, where interoperability, data model design and integration with remote-sensing data sources play an important role in successful GPS data handling.
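As a hedged sketch of the "spatial database at its core" design, the following stores GPS fixes in a PostGIS-enabled PostgreSQL database via psycopg2; the connection string, table and column names are placeholders, and the geometry column assumes the PostGIS extension is installed.

```python
# Hedged sketch of the "spatial database at the core" idea: store GPS fixes in
# PostGIS so spatial queries run next to the data. The connection string, table
# and column names are placeholders; the geometry type assumes the PostGIS
# extension is installed in the target database.
import psycopg2

conn = psycopg2.connect("dbname=wildlife user=tracker")   # placeholder DSN
cur = conn.cursor()

cur.execute("""
    CREATE TABLE IF NOT EXISTS gps_fix (
        id        serial PRIMARY KEY,
        animal_id text NOT NULL,
        fix_time  timestamptz NOT NULL,
        geom      geometry(Point, 4326) NOT NULL
    )""")

# Parameterised insert: longitude/latitude become a WGS84 point geometry.
cur.execute(
    "INSERT INTO gps_fix (animal_id, fix_time, geom) "
    "VALUES (%s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326))",
    ("roe-deer-07", "2011-06-01T04:30:00Z", 11.05, 46.07),
)
conn.commit()
```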
Abstract:
Cloud services are exploding, and organizations are converging their data centers in order to take advantage of the predictability, continuity, and quality of service delivered by virtualization technologies. In parallel, energy-efficient and high-security networking is of increasing importance. Network operators, and service and product providers, require a new network solution to efficiently tackle the increasing demands of this changing network landscape. Software-defined networking has emerged as an efficient network technology capable of supporting the dynamic nature of future network functions and intelligent applications while lowering operating costs through simplified hardware, software, and management. This article raises the question of how to achieve a successful carrier-grade network with software-defined networking, focusing on the challenges of network performance, scalability, security, and interoperability, and proposing potential solution directions.
Abstract:
The ability of building information modeling (BIM) to positively impact projects in the AEC sector through greater collaboration and integration is widely acknowledged. This paper examines the development of BIM and how it can contribute to the cold-formed steel (CFS) building industry. This is achieved through a qualitative methodology encompassing a literature review and exploratory interviews with industry experts, culminating in the development of e-learning material for the sector. In doing so, the research team collaborated with one of the United Kingdom’s largest cold-formed steel designer/fabricators. By demonstrating the capabilities of BIM software and providing technical and informative videos on its use, this project has delivered two key outcomes: first, invaluable assistance in the transition from traditional processes to fully collaborative 3D BIM, as required of all public sector projects by 2016 under the UK Government’s “Government Construction Strategy”; and second, a demonstration of BIM’s potential not only within CFS companies but also within the AEC sector as a whole. Given the flexibility, adaptability and interoperability of BIM software, the results indicate that BIM and its underlying ethos are a key tool in the development of the industry as a whole.