479 results for Interoperability
Abstract:
Extensible Markup Language (XML) has emerged as a medium for interoperability over the Internet. As the number of documents published in the form of XML is increasing, there is a need for selective dissemination of XML documents based on user interests. In the proposed technique, a combination of Adaptive Genetic Algorithms and a multi-class Support Vector Machine (SVM) is used to learn a user model. Based on the feedback from the users, the system automatically adapts to the user's preferences and interests. The user model and a similarity metric are used for selective dissemination of a continuous stream of XML documents. Experimental evaluations performed over a wide range of XML documents indicate that the proposed approach significantly improves the performance of the selective dissemination task with respect to accuracy and efficiency.
Abstract:
XML has emerged as a medium for interoperability over the Internet. As the number of documents published in the form of XML is increasing, there is a need for selective dissemination of XML documents based on user interests. In the proposed technique, a combination of the Self Adaptive Migration Model Genetic Algorithm (SAMGA) [5] and a multi-class Support Vector Machine (SVM) is used to learn a user model. Based on the feedback from the users, the system automatically adapts to the user's preferences and interests. The user model and a similarity metric are used for selective dissemination of a continuous stream of XML documents. Experimental evaluations performed over a wide range of XML documents indicate that the proposed approach significantly improves the performance of the selective dissemination task with respect to accuracy and efficiency.
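Neither abstract spells out the learning pipeline, but the core idea, a multi-class SVM user model combined with a similarity check that gates dissemination, can be illustrated with a minimal sketch. The bag-of-tags features, the interest-class labels, the scikit-learn components, and the 0.5 threshold below are assumptions for illustration, not details taken from the papers.

```python
# Minimal sketch (assumptions: bag-of-tags features, cosine similarity,
# scikit-learn SVC as the multi-class SVM; none of this is from the papers).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.svm import SVC

# Toy training set: XML documents reduced to their tag names, labelled with
# the user-interest class obtained from relevance feedback.
train_docs = [
    "article title author journal volume",
    "article title author conference pages",
    "product price currency stock supplier",
    "product price shipping warehouse supplier",
]
train_labels = ["publications", "publications", "catalogue", "catalogue"]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(train_docs)
user_model = SVC(kernel="linear")
user_model.fit(X, train_labels)

def disseminate(incoming_doc_tags, threshold=0.5):
    """Decide whether an incoming XML document should be pushed to the user.

    The class predicted by the SVM selects the user's interest profile, and a
    cosine-similarity check against the training documents of that class acts
    as the similarity metric gating dissemination (threshold is illustrative).
    """
    x = vectorizer.transform([incoming_doc_tags])
    predicted = user_model.predict(x)[0]
    same_class = [d for d, y in zip(train_docs, train_labels) if y == predicted]
    sims = cosine_similarity(x, vectorizer.transform(same_class))
    return predicted, bool(sims.max() >= threshold)

print(disseminate("article title author journal issue"))
```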
Abstract:
The Industry Foundation Classes (IFC) file format is one of the most complex and ambitious IT standardization projects currently being undertaken in any industry, focusing on the development of an open and neutral standard for exchanging building model data. Scientific literature related to the IFC standard has so far been predominantly technical; research looking at the IFC standard from an industry standardization perspective could offer valuable new knowledge for both theory and practice. This paper proposes the use of IT standardization and IT adoption theories, supported by studies done within construction IT, to lay a theoretical foundation for further empirical analysis of the standardization process of the IFC file format.
Abstract:
This report addresses five key topics: policy development and implementation; skills and capability; infrastructure and interoperability; incentives for researchers and support stakeholders; and business case and sustainability.
Abstract:
This study was undertaken by UKOLN on behalf of the Joint Information Systems Committee (JISC) in the period April to September 2008. Application profiles are metadata schemata which consist of data elements drawn from one or more namespaces, optimized for a particular local application. They offer a way for particular communities to base the interoperability specifications they create and use for their digital material on established open standards. This offers the potential for digital materials to be accessed, used and curated effectively both within and beyond the communities in which they were created. The JISC recognized the need to undertake a scoping study to investigate metadata application profile requirements for scientific data in relation to digital repositories, specifically concerning descriptive metadata to support resource discovery and other functions such as preservation. This followed on from the development of the Scholarly Works Application Profile (SWAP) undertaken within the JISC Digital Repositories Programme and led by Andy Powell (Eduserv Foundation) and Julie Allinson (RRT UKOLN) on behalf of the JISC. Aims and objectives: 1. To assess whether a single metadata AP for research data, or a small number thereof, would improve resource discovery or discovery-to-delivery in any useful or significant way. 2. If so, then to: (a) assess whether the development of such AP(s) is practical and, if so, how much effort it would take; (b) scope a community uptake strategy that is likely to be successful, identifying the main barriers and key stakeholders. 3. Otherwise, to investigate how best to improve cross-discipline, cross-community discovery-to-delivery for research data, and make recommendations to the JISC and others as appropriate. Approach: The Study used a broad conception of what constitutes scientific data, namely data gathered, collated, structured and analysed using a recognizably scientific method, with a bias towards quantitative methods. The approach taken was to map out the landscape of existing data centres, repositories and associated projects, and conduct a survey of the discovery-to-delivery metadata they use or have defined, alongside any insights they have gained from working with this metadata. This was followed up by a series of unstructured interviews, discussing use cases for a Scientific Data Application Profile and how widely a single profile might be applied. On the latter point, matters of granularity, the experimental/measurement contrast, the quantitative/qualitative contrast, the raw/derived data contrast, and the homogeneous/heterogeneous data collection contrast were discussed. The Study report was loosely structured according to the Singapore Framework for Dublin Core Application Profiles, and in turn considered: the possible use cases for a Scientific Data Application Profile; existing domain models that could either be used or adapted for use within such a profile; and a comparison of existing metadata profiles and standards to identify candidate elements for inclusion in the description set profile for scientific data. The report also considered how the application profile might be implemented, its relationship to other application profiles, the alternatives to constructing a Scientific Data Application Profile, the development effort required, and what could be done to encourage uptake in the community. The conclusions of the Study were validated through a reference group of stakeholders.
Abstract:
Scientific research revolves around the production, analysis, storage, management, and re-use of data. Data sharing offers important benefits for scientific progress and the advancement of knowledge. However, several limitations and barriers to the general adoption of data sharing are still in place. Probably the most important challenge is that data sharing is not yet very common among scholars and is not yet seen as a regular activity among scientists, although important efforts are being invested in promoting it. In addition, there is a relatively low commitment among scholars to cite data. The most important problems and challenges regarding data metrics are closely tied to the more general problems related to data sharing. The development of data metrics depends on the growth of data sharing practices; after all, it is nothing more than the registration of researchers' behaviour. At the same time, the availability of proper metrics can help researchers make their data work more visible. This may subsequently act as an incentive for more data sharing, and in this way a virtuous circle may be set in motion. This report seeks to further explore the possibilities of metrics for datasets (i.e. the creation of reliable data metrics) and an effective reward system that aligns the interests of the main stakeholders involved in the process. The report reviews the current literature on data sharing and data metrics. It presents interviews with the main stakeholders on data sharing and data metrics. It also analyses the existing repositories and tools in the field of data sharing that have special relevance for the promotion and development of data metrics. On the basis of these three pillars, the report presents a number of solutions and necessary developments, as well as a set of recommendations regarding data metrics. The most important recommendations include the general adoption of data sharing and data publication among scholars; the development of a reward system for scientists that includes data metrics; reducing the costs of data publication; reducing existing negative cultural perceptions of researchers regarding data publication; developing standards for preservation, publication, identification and citation of datasets; more coordination of data repository initiatives; and further development of interoperability protocols across different actors.
Abstract:
Executive Summary: The EcoGIS project was launched in September 2004 to investigate how Geographic Information Systems (GIS), marine data, and custom analysis tools can better enable fisheries scientists and managers to adopt Ecosystem Approaches to Fisheries Management (EAFM). EcoGIS is a collaborative effort between NOAA's National Ocean Service (NOS) and National Marine Fisheries Service (NMFS), and four regional Fishery Management Councils. The project has focused on four priority areas: Fishing Catch and Effort Analysis, Area Characterization, Bycatch Analysis, and Habitat Interactions. Of these four functional areas, the project team first focused on developing a working prototype for catch and effort analysis: the Fishery Mapper Tool. This ArcGIS extension creates time- and area-summarized maps of fishing catch and effort from logbook, observer, or fishery-independent survey data sets. Source data may come from Oracle, Microsoft Access, or other file formats. Feedback from beta-testers of the Fishery Mapper was used to debug the prototype, enhance performance, and add features. This report describes the four priority functional areas, the development of the Fishery Mapper tool, and several themes that emerged through the parallel evolution of the EcoGIS project, the concept and implementation of the broader field of Ecosystem Approaches to Management (EAM), data management practices, and other EAM toolsets. In addition, a set of six succinct recommendations is proposed on page 29. One major conclusion from this work is that there is no single "super-tool" to enable Ecosystem Approaches to Management; as such, tools should be developed for specific purposes with attention given to interoperability and automation. Future work should be coordinated with other GIS development projects in order to provide "value added" and minimize duplication of effort. In addition to custom tools, the development of cross-cutting Regional Ecosystem Spatial Databases will enable access to quality data to support the analyses required by EAM. GIS tools will be useful in developing Integrated Ecosystem Assessments (IEAs) and providing pre- and post-processing capabilities for spatially explicit ecosystem models. Continued funding will enable the EcoGIS project to develop GIS tools that are immediately applicable to today's needs. These tools will enable simplified and efficient data query, the ability to visualize data over time, and ways to synthesize multidimensional data from diverse sources. These capabilities will provide new information for analyzing issues from an ecosystem perspective, which will ultimately result in better understanding of fisheries and better support for decision-making.
Abstract:
The use of self-contained, low-maintenance sensor systems installed on commercial vessels is becoming an important monitoring and scientific tool in many regions around the world. These systems integrate data from meteorological and water quality sensors with GPS data into a data stream that is automatically transferred from ship to shore. To begin linking some of this developing expertise, the Alliance for Coastal Technologies (ACT) and the European Coastal and Ocean Observing Technology (ECOOT) organized a workshop on this topic in Southampton, United Kingdom, October 10-12, 2006. The participants included technology users, technology developers, and shipping representatives. They collaborated to identify sensors currently employed on integrated systems, users of this data, limitations associated with these systems, and ways to overcome these limitations. The group also identified additional technologies that could be employed on future systems and examined whether standard architectures and data protocols for integrated systems should be established. Participants at the workshop defined 17 different parameters currently being measured by integrated systems. They identified that diverse user groups, ranging from resource management agencies such as the Environmental Protection Agency (EPA) to local tourism groups and educational organizations, utilize information from these systems. Among the limitations identified were instrument compatibility and interoperability, data quality control and quality assurance, and sensor calibration and/or maintenance frequency. Standardization of these integrated systems was viewed to be both advantageous and disadvantageous; while participants believed that standardization could be beneficial on many levels, they also felt that users may be hesitant to purchase a suite of instruments from a single manufacturer, and that a "plug and play" system including sensors from multiple manufacturers may be difficult to achieve. A priority recommendation and conclusion for the general integrated sensor system community was to provide vessel operators with real-time access to relevant data (e.g., ambient temperature and salinity to increase the efficiency of water treatment systems, and meteorological data for increased vessel safety and operating efficiency) for broader system value. Simplified data displays are also required for education and public outreach/awareness. Other key recommendations were to encourage the use of integrated sensor packages within observing systems such as IOOS and EuroGOOS, identify additional customers of sensor system data, and publish results of previous work in peer-reviewed journals to increase agency and scientific awareness and confidence in the technology. Priority recommendations and conclusions for ACT entailed highlighting the value of integrated sensor systems for vessels of opportunity through articles in the popular press and marine science publications.
Abstract:
This work presents a general architecture for the evolution of analog electronic circuits based on genetic algorithms. Its logical organization favours the interoperability of its main components, including the possibility of replacing them or improving their internal functionality. The implemented platform uses extrinsic evolution, that is, evolution based on circuit simulation, and aims at ease and flexibility of experimentation. It allows diverse components to be interconnected to the nodes of an electronic circuit that is to be synthesized or adapted. The Genetic Algorithm technique is used to search for the best way of interconnecting the components to implement the desired function. This version of the platform uses the MATLAB environment with a Genetic Algorithm toolbox and PSpice as the circuit simulator. The case studies carried out produced results that demonstrate the platform's potential for the development of adaptive electronic circuits.
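The abstract describes extrinsic evolution: candidate interconnections are encoded as chromosomes, scored by a circuit simulator, and refined by a genetic algorithm. The sketch below, written in Python rather than the MATLAB/PSpice stack the platform actually uses, shows that loop with a stubbed fitness function standing in for the simulator; the netlist encoding and all numeric parameters are illustrative assumptions.

```python
# Illustrative GA loop (the real platform uses the MATLAB GA toolbox and
# evaluates fitness by simulating the candidate circuit in PSpice; here the
# simulator is replaced by a stub so the sketch is self-contained).
import random

NUM_COMPONENTS = 6   # components available to wire into the circuit (assumed)
NUM_NODES = 4        # circuit nodes the components can attach to (assumed)

def random_chromosome():
    # One gene per component: the pair of nodes its terminals connect to.
    return [(random.randrange(NUM_NODES), random.randrange(NUM_NODES))
            for _ in range(NUM_COMPONENTS)]

def fitness(chromosome):
    # Stand-in for a simulator run: reward circuits whose components are not
    # short-circuited (both terminals on the same node).  A real fitness
    # would compare the simulated response against the desired function.
    return sum(1.0 for a, b in chromosome if a != b)

def crossover(p1, p2):
    cut = random.randrange(1, NUM_COMPONENTS)
    return p1[:cut] + p2[cut:]

def mutate(chromosome, rate=0.1):
    return [((random.randrange(NUM_NODES), random.randrange(NUM_NODES))
             if random.random() < rate else gene) for gene in chromosome]

population = [random_chromosome() for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                      # simple truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    population = parents + children

print("best fitness:", fitness(max(population, key=fitness)))
```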
Abstract:
[EU] Today, the unified European Rail Traffic Management System (ERTMS) signalling system is being deployed across Europe to promote interoperability between different railway networks. The goal of this project is to deploy the ETCS protocol, part of the ERTMS system, in a hybrid simulation environment, producing demonstrators that will accelerate the roll-out of ERTMS. To that end, the System-in-the-loop tool of the OPNET simulator was used. Using this tool, a library of functions was written to integrate real ETCS protocol packets into the simulated environment. Finally, using that library, the performance of the ETCS protocol in the presence of network problems was analysed, and the performance of the new library when translating real packets into simulated ones (and vice versa) was examined.
Abstract:
[ES] Interoperability between different European railway networks is very limited. To address this problem, the European Union created the European Rail Traffic Management System (ERTMS), charged with defining a single standard for the entire European network. The goal of this project is the implementation of the ETCS (European Train Control System) in a client-server environment. The implementation includes both the on-board train system and the control centre (RBC). It has been implemented so that it can operate over two transport protocols, making it compatible with both connection-oriented (TCP) and connectionless (UDP) networks.
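The design point highlighted in the abstract, the same train/RBC message exchange running over either TCP or UDP, can be illustrated with a minimal transport abstraction. The class names, the placeholder message encoding, and the port number below are assumptions for illustration; real ETCS messages follow the UNISIG SUBSET-026 format, which is not reproduced here.

```python
# Minimal sketch of a transport-agnostic sender (assumed names and port; the
# real project implements the full ETCS train and RBC roles, not shown here).
import socket

class TcpTransport:
    """Connection-oriented transport: one connection per session."""
    def __init__(self, host, port):
        self.sock = socket.create_connection((host, port))
    def send(self, payload: bytes):
        self.sock.sendall(payload)

class UdpTransport:
    """Connectionless transport: each message is an independent datagram."""
    def __init__(self, host, port):
        self.addr = (host, port)
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    def send(self, payload: bytes):
        self.sock.sendto(payload, self.addr)

def send_position_report(transport, train_id: int, position_m: int):
    # Placeholder encoding for illustration only.
    transport.send(f"POS,{train_id},{position_m}".encode())

# Usage (either transport can back the same application code):
# rbc = TcpTransport("rbc.example.net", 30998)   # or UdpTransport(...)
# send_position_report(rbc, train_id=101, position_m=15230)
```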
Abstract:
In 2005, the Agência Nacional de Saúde Suplementar (ANS), Brazil's regulatory agency for private health insurance, established the TISS standard (Troca de Informação na Saúde Suplementar), a mandatory electronic interchange between health plan operators (about 1,500 registered with the ANS) and service providers (about 200,000) covering the care events delivered to beneficiaries. The TISS standard was developed following the structure of the ISO/TC215 committee on health informatics standards and is divided into four parts: content and structure, which covers the structure of the paper forms; representation of health concepts, which refers to domain tables and health vocabularies; communication, which covers the electronic messages; and security and privacy, following the recommendation of the Conselho Federal de Medicina (CFM). To improve its evolution methodology, this thesis analysed the degree of interoperability of the TISS standard according to ISO 20514 (ISO 20514, 2005) and in light of the dual model of the openEHR Foundation, which proposes open standards for the architecture and structure of the Electronic Health Record (EHR). The openEHR Foundation dual model consists, at the first level, of a generic reference model and, at the second, of an archetype model in which concepts and attributes are detailed. Two studies were carried out: the first concerns a set of demographic archetypes developed as a proposal for representing demographic health information, based on the openEHR Foundation reference model. The second study proposes a generic reference model, as a refinement of the openEHR Foundation specifications, to represent the concept of authorization and claim submission in health care, together with a set of archetypes. Finally, a new architecture for building the TISS standard is proposed, based on the openEHR Foundation dual model and aimed at evolving towards a patient-centred EHR.
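The dual model the thesis builds on separates a small, stable reference model from archetypes that constrain it for specific clinical or administrative concepts. The sketch below is a miniature illustration of that two-level idea only; the class names, the archetype identifier, and the constraints are invented for the example and are not openEHR or TISS definitions.

```python
# Two-level modelling in miniature: a generic reference model (Element/Entry)
# plus an "archetype" expressed as constraints that instances are validated
# against.  Names and constraints are illustrative, not openEHR definitions.
from dataclasses import dataclass
from typing import Any

@dataclass
class Element:            # reference model: a named value, nothing domain-specific
    name: str
    value: Any

@dataclass
class Entry:              # reference model: a container of elements
    archetype_id: str
    elements: list

# Archetype: domain knowledge expressed as constraints over the reference model.
AUTHORIZATION_ARCHETYPE = {
    "archetype_id": "example-ENTRY.authorization_request.v1",
    "required": {"procedure_code": str, "requested_quantity": int},
}

def validate(entry: Entry, archetype: dict) -> list:
    """Return a list of constraint violations (empty means the entry conforms)."""
    present = {e.name: e.value for e in entry.elements}
    errors = []
    for name, expected_type in archetype["required"].items():
        if name not in present:
            errors.append(f"missing element: {name}")
        elif not isinstance(present[name], expected_type):
            errors.append(f"{name} should be {expected_type.__name__}")
    return errors

request = Entry("example-ENTRY.authorization_request.v1",
                [Element("procedure_code", "10101012"),
                 Element("requested_quantity", 1)])
print(validate(request, AUTHORIZATION_ARCHETYPE))   # -> []
```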
Abstract:
Brazil currently has an immense volume of data about its national territory. However, much of the existing data is dispersed, fragmented, lacking cartographic compatibility and, in some cases, duplicated in several places. The great challenge is to share geographically dispersed data and to communicate important concepts between departments within an organization, or between different organizations, using information technologies. The general objective of this work is therefore to contribute to the development of a geographic information infrastructure that can be widely disseminated over the Internet through Web Services and that meets interoperability requirements, so that many different users can make use of the available data, integrating it when necessary. The work initially focuses on the geographic information needs of the Ecological and Economic Zoning of Brazil. However, since it is a spatial data infrastructure system, it can aggregate data for any work involving spatial information.
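The abstract does not name a specific Web Service interface, but OGC services such as WFS are the usual interoperable building blocks of this kind of spatial data infrastructure. The sketch below assembles a standard WFS 2.0.0 GetFeature key-value request; the endpoint URL and layer name are hypothetical and only stand in for whatever the infrastructure would actually publish.

```python
# Building a WFS 2.0.0 GetFeature request (endpoint and typeNames are
# hypothetical; any OGC-compliant server accepts the same key-value pairs).
from urllib.parse import urlencode

def build_getfeature_url(endpoint: str, type_names: str, bbox=None, count=100):
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_names,
        "count": count,
    }
    if bbox:  # (min_lon, min_lat, max_lon, max_lat) in EPSG:4326
        params["bbox"] = ",".join(map(str, bbox)) + ",EPSG:4326"
    return f"{endpoint}?{urlencode(params)}"

url = build_getfeature_url(
    "https://sdi.example.gov.br/geoserver/wfs",        # hypothetical endpoint
    "zee:vegetation_cover",                            # hypothetical layer
    bbox=(-74.0, -34.0, -34.0, 5.5),                   # roughly Brazil's extent
)
print(url)
# The returned features can then be fetched with any HTTP client and
# integrated with data served by other organizations' endpoints.
```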
Abstract:
Service-Oriented Architecture (SOA) and Web Services (WS) offer advanced flexibility and interoperability capabilities. However, they imply significant performance overheads that need to be carefully considered. Supply Chain Management (SCM) and traceability systems are an interesting domain for the use of WS technologies, which are usually deemed too complex and unnecessary in practical applications, especially regarding security. This paper presents an externalized security architecture that uses the eXtensible Access Control Markup Language (XACML) authorization standard to enforce visibility restrictions on traceability data in a supply chain where multiple companies collaborate; the performance overheads are assessed by comparing 'raw' authorization implementations (Access Control Lists, Tokens, and RDF Assertions) with their XACML equivalents.
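The comparison in the abstract is between 'raw' authorization checks embedded in the application and an externalized XACML-style decision point. The sketch below contrasts an ACL lookup with a stubbed policy-decision call; the trace-event fields, the ACL contents, and the stub policy are illustrative assumptions, not the paper's implementation or an actual XACML engine.

```python
# Contrast between a 'raw' ACL check embedded in the application and an
# externalized decision point in the XACML style (the decision function here
# is a stub; a real deployment would send a XACML request to a policy engine).
TRACE_EVENTS = [
    {"id": 1, "owner": "supplierA", "step": "shipment", "lot": "L-042"},
    {"id": 2, "owner": "manufacturerB", "step": "assembly", "lot": "L-042"},
]

# Raw baseline: an access-control list mapping requesters to visible owners.
ACL = {"retailerC": {"supplierA"}, "manufacturerB": {"manufacturerB", "supplierA"}}

def acl_visible(requester, event):
    return event["owner"] in ACL.get(requester, set())

# Externalized check: the application only builds an attribute-based request
# and delegates the decision; the policy logic lives outside the application.
def pdp_decide(request_attributes):
    # Stub policy: partners may read events for lots they participate in.
    return (request_attributes["action"] == "read"
            and request_attributes["lot"] in {"L-042"})

def externally_visible(requester, event):
    return pdp_decide({"subject": requester, "action": "read",
                       "resource": event["id"], "lot": event["lot"]})

for event in TRACE_EVENTS:
    print(event["id"], acl_visible("retailerC", event),
          externally_visible("retailerC", event))
```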