570 results for Interoperability


Relevance:

10.00%

Publisher:

Abstract:

Integration is currently a key factor in intelligent transportation systems (ITS), especially because of the ever-increasing service demands originating from the ITS industry and ITS users. The current ITS landscape is made up of multiple tightly coupled technologies, and their interoperability is extremely low, which limits the generation of ITS services. Given this fact, novel information technologies (IT) based on the service-oriented architecture (SOA) paradigm have begun to introduce new ways to address this problem. The SOA paradigm allows the construction of loosely coupled distributed systems that can help to integrate the heterogeneous systems that make up ITS. In this paper, we focus on developing an SOA-based model for integrating IT into ITS to achieve ITS service delivery. To develop our model, the ITS technologies and services involved were identified, catalogued, and decoupled. We then applied our SOA-based model to integrate all of the ITS technologies and services, ranging from the lowest-level technical components, such as roadside unit as a service (RSUaaS), to the most abstract ITS services offered to ITS users (value-added services). To validate our model, a functionality case study covering all of the components of our model was designed.
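The loose coupling the abstract describes can be sketched as follows: consumers depend on a service contract rather than on a concrete roadside technology, so providers can be swapped without touching the value-added services built on top. All class and method names here are illustrative, not the paper's actual model.

```python
from abc import ABC, abstractmethod

# Hypothetical service contract: each low-level ITS capability is exposed
# behind an interface, so consumers depend on the contract, not the device.
class RoadsideUnitService(ABC):
    @abstractmethod
    def vehicle_count(self, road_segment: str) -> int: ...

class LoopDetectorRSU(RoadsideUnitService):
    """One concrete provider; a camera-based RSU could replace it unchanged."""
    def __init__(self, counts):
        self._counts = counts
    def vehicle_count(self, road_segment: str) -> int:
        return self._counts.get(road_segment, 0)

# A value-added service composed from the service contract only.
def congestion_level(rsu: RoadsideUnitService, segment: str, capacity: int) -> float:
    return rsu.vehicle_count(segment) / capacity

rsu = LoopDetectorRSU({"A1-km12": 90})
print(congestion_level(rsu, "A1-km12", capacity=120))  # 0.75
```

Because `congestion_level` only sees the abstract interface, decoupling the catalogue of ITS technologies from the services built on them reduces to choosing which provider to register.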

Relevance:

10.00%

Publisher:

Abstract:

The construction industry has long been considered a highly fragmented and non-collaborative industry. This fragmentation sprouted from complex and unstructured traditional coordination processes and information exchanges amongst all parties involved in a construction project. This nature, coupled with risk and uncertainty, has pushed clients and their supply chains to search for new ways of improving their business processes to deliver a better-quality, higher-performing product. This research will closely investigate the need to implement a Digital Nervous System (DNS), analogous to a biological nervous system, for the flow and management of digital information across the project lifecycle. This will be done through direct examination of the key processes and information produced in a construction project and of how a DNS can provide a well-integrated flow of digital information throughout the project lifecycle. This research will also investigate how a DNS can create a tight digital feedback loop that enables the organisation to sense, react and adapt to changing project conditions. A Digital Nervous System is a digital infrastructure that provides a well-integrated flow of digital information to the right part of the organisation at the right time. It provides the organisation with the relevant and up-to-date information it needs, on critical project issues, to aid near real-time decision-making. A literature review and survey questionnaires were used in this research to collect and analyse data about the industry's information management problems, e.g. disruption and discontinuity of digital information flow due to interoperability issues, disintegration/fragmentation of the adopted digital solutions, and paper-based transactions. Analysis of the results revealed that efficient and effective information management requires the creation and implementation of a DNS.
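The sense-react loop described above can be sketched as a simple publish/subscribe router: project events are pushed to whichever parties registered an interest in them, closing the loop between sensing a condition and reacting to it. The topics and handlers below are invented for illustration; the abstract does not prescribe an implementation.

```python
from collections import defaultdict

class DigitalNervousSystem:
    """Minimal sketch: route each project event to its subscribers."""
    def __init__(self):
        self._subscribers = defaultdict(list)   # topic -> [handler]

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def sense(self, topic, event):
        # Deliver the event to every part of the organisation that needs it.
        for handler in self._subscribers[topic]:
            handler(event)

dns = DigitalNervousSystem()
alerts = []
dns.subscribe("design-change", lambda e: alerts.append(f"re-plan: {e}"))
dns.sense("design-change", "revised slab thickness")
print(alerts)  # ['re-plan: revised slab thickness']
```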

Relevance:

10.00%

Publisher:

Abstract:

Over the past few months, four Central European states have made decisions which will determine the shape of their air forces over the next decade. On 11 October, Romania signed a contract under which it will buy twelve used US F-16A/B multi-role fighter aircraft from Portugal. In August, Slovakia signed contracts with Russia’s MiG for repairs and the limited modernisation of its twelve MiG-29 fighter aircraft currently in service. The Czech Republic entered into a preliminary agreement in July with Sweden on extending the lease of fourteen JAS-39 Gripen multi-role fighter aircraft (the new Czech government will hammer out the details following the parliamentary election). Bulgaria, which has been facing financial problems and political instability, in June postponed the purchase of new (non-Soviet) combat aircraft at least until the end of this year. If Sofia decides to buy any within the next few years, these will be no more than twelve relatively old and worn-out machines (most likely surplus F-16A/B aircraft from Portugal or the Netherlands). Given the fact that Hungary in 2012 made the same decision regarding its fourteen Gripen aircraft as the Czech Republic, there are good grounds to claim that the capabilities of Central European NATO member states to take action in airspace will remain limited for the long term. The region’s density of combat aircraft is the lowest on the entire continent (with the exception of the Baltic states). Furthermore, the machines to be used in the coming decade will be the oldest and the least technologically advanced (all of them belong to the so-called “fourth generation”, the roots of which date back to the 1970s). The problem of gaining full interoperability within NATO has not been resolved in its Central European member states. By modernising its MiG-29 aircraft, Slovakia is, to say the least, postponing the achievement of interoperability once again. Bulgaria will gain interoperability by buying any Western combat aircraft. However, it is very unlikely to introduce new machines into service earlier than the end of the present decade. Since the introduction of new fifth-generation multi-role combat aircraft or transitional 4+ generation machines in the region’s air forces is unrealistic, the defence of the airspace of NATO member states in Central Europe can be termed an ever more porous sky.

Relevance:

10.00%

Publisher:

Abstract:

The EU began railway reform in earnest around the turn of the century. Two ‘railway packages’ have meanwhile been adopted, amounting to a series of directives, and a third package has been proposed. A range of complementary initiatives has been undertaken or is underway. This BEEP Briefing inspects the main economic aspects of EU rail reform. After highlighting the dramatic loss of market share of rail since the 1960s, the case for reform is shown to rest on three arguments: the need for greater competitiveness of rail, promoting the (market-driven) diversion of road haulage to rail as a step towards sustainable mobility in Europe, and an end to the disproportionate claims on the public budgets of Member States. The core of the paper deals respectively with market failures in rail and in the internal market for rail services; the complex economic issues underlying vertical separation (unbundling) and pricing options; and the methods, potential and problems of introducing competition in rail freight and in passenger services. Market failures in the rail sector are several (natural monopoly, economies of density, safety and asymmetries of information), exacerbated by no fewer than seven technical and legal barriers precluding the practical operation of an internal rail market. The EU choice to opt for vertical unbundling (with benefits similar in nature to those in other network industries, e.g. preventing opaque cross-subsidisation and greater cost revelation) risks the emergence of considerable coordination costs. The adoption of marginal-cost pricing is problematic on economic grounds (drawbacks include arbitrary cost-allocation rules in the presence of large economies of scope and relatively large common costs; a non-optimal incentive system, holding back the growth of freight services; and the possibly anti-competitive effects of two-part tariffs). Without further detailed harmonisation, it may also lead to many different systems in Member States, causing even greater distortions.
Insofar as freight could develop into a competitive market, a combination of Ramsey pricing (given the incentive for service providers to keep market share) and price ceilings based on stand-alone costs might be superior in terms of competition, market growth and regulatory oversight. The incipient cooperative approach to path coordination and allocation is welcome but likely to be seriously insufficient. The arguments for introducing competition, notably in freight, are many and valuable, e.g. optimal cross-border services, quality differentiation as well as general quality improvement, larger scale for cost recovery, and a decrease in rent seeking. Nevertheless, it is not correct to argue for the introduction of competition in rail tout court. It depends on the size of the market and on removing a host of barriers; it requires careful definition and costing of public service obligations (PSOs); and coordination failures ought to be pre-empted. On the other hand, reform and competition cannot and should not be assessed from a static perspective. Conduct and cost structures will change with reform. Infrastructure and investment in technology are known to generate enormous potential for cost savings, especially when coupled with the EU interoperability programme. All this dynamism may well help to induce entry and further enlarge the (net) welfare gains from EU railway reform. The paper ends with a few pointers for the way forward in EU rail reform.
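The Ramsey pricing mentioned above follows the standard inverse-elasticity rule: the mark-up over marginal cost is inversely proportional to the elasticity of demand, (p - mc) / p = k / eps, so p = mc / (1 - k / eps). The sketch below uses invented figures purely to illustrate why price-sensitive freight would carry a smaller mark-up than captive traffic.

```python
def ramsey_price(marginal_cost: float, elasticity: float, k: float) -> float:
    """Inverse-elasticity rule: (p - mc)/p = k/eps  =>  p = mc / (1 - k/eps)."""
    assert elasticity > k > 0, "rule is only valid for k below the elasticity"
    return marginal_cost / (1 - k / elasticity)

# Elastic freight demand gets a small mark-up; captive traffic a larger one.
print(round(ramsey_price(10.0, elasticity=4.0, k=0.5), 2))  # 11.43
print(round(ramsey_price(10.0, elasticity=2.0, k=0.5), 2))  # 13.33
```

The parameter `k` is the budget-constraint multiplier that would be tuned so total revenue covers total cost; its value here is arbitrary.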

Relevance:

10.00%

Publisher:

Abstract:

In the overall negotiations on the Transatlantic Trade and Investment Partnership (TTIP), the digital chapter appears to be growing in importance. This is due to several factors, including the recent Datagate scandal, which undermined trust between the negotiating parties and led to calls to suspend the US-EU Safe Harbour agreement, as well as the heated debate currently ongoing in both legal systems on key issues such as policies to encourage broadband infrastructure deployment, network neutrality policies and the application of competition policy in cyberspace. This paper explores the current divergences between the two legal systems on these key issues and discusses possible scenarios for the ultimate agreement to be reached in the TTIP: from a basic, minimal agreement (which would essentially include e-labelling and e-accessibility measures) to more ambitious scenarios covering network neutrality, competition rules, privacy and interoperability measures.

Relevance:

10.00%

Publisher:

Abstract:

This thesis studies and analyses the information and communication technologies (ICT) available in Brazilian hospitals affiliated with ANAHP (Associação Nacional de Hospitais Privados), which sponsored the data collection and supported the research around two central themes: ICT use and cost, and ICT integration and interoperability. The Organisation for Economic Co-operation and Development (OECD), in a study on ICT in healthcare, stated that, if implemented efficiently, ICT can improve the quality of healthcare delivery, increase patient safety and lower costs. To allow comparison of results and the use of reliable research models, this study adapted the IT Use Research Model employed by GVcia (Centro de Tecnologia da Informação Aplicada at FGV-EAESP) and complemented the research instruments to collect data specific to ICT integration in hospitals, using a survey applied to US hospitals as a reference. Data were collected for the period 2009 to 2015 in two stages. In the first, an electronic questionnaire, validated by hospital executives and presented in Appendix A, was sent out. In the second stage, the submitted data were validated and integration aspects were clarified through interviews. The study of ICT cost and use yielded several indicators. ICT spending and investment in private hospitals reached 3.5% of the annual revenue of the sampled hospitals, while the healthcare sector as a whole spent and invested 6.4% of annual revenue on the ICT it implemented. Another novel indicator, important for future studies in the sector, is the annual ICT cost per hospital bed, which remained stable at around US$ 39,000 between 2010 and 2014 and fell to US$ 36,000 in 2015, equivalent to R$ 120,000 as of December 2015.
Other statistics and indicators presented in the study help to explain the evolution and performance of ICT in healthcare delivery, and may be useful in deciding whether ICT will help improve patient safety and care, support health professionals' access to patient data, and provide the necessary alignment with the Digital Health (e-Health) model at adequate cost. The results of the present study are comparable to similar US findings and suggest that the technology installed in Brazilian private hospitals can leverage the integration of existing systems, enabling alignment with Digital Health models and improving the performance of the extended healthcare chain so that citizens can be assisted wherever they are, at any time, safely. Management action and investment are the central topics surrounding integration.
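The cost-per-bed indicator above is a simple ratio; the sketch below reproduces the arithmetic with an invented hospital, using only the 2015 figures the abstract reports (US$ 36,000 per bed, equivalent to R$ 120,000, i.e. an implied rate of about 3.33 R$/US$).

```python
def ict_cost_per_bed(annual_ict_spend_usd: float, beds: int) -> float:
    """Annual ICT cost per hospital bed: total ICT spend divided by beds."""
    return annual_ict_spend_usd / beds

spend_usd, beds = 7_200_000, 200          # hypothetical hospital, not from the study
per_bed_usd = ict_cost_per_bed(spend_usd, beds)
print(per_bed_usd)                        # 36000.0 — matches the 2015 indicator
print(round(per_bed_usd * 120_000 / 36_000))  # ≈ 120000, in R$ of Dec/2015
```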

Relevance:

10.00%

Publisher:

Abstract:

This article is part of the report Cybersecurity: Are We Ready in Latin America and the Caribbean?

Relevance:

10.00%

Publisher:

Abstract:

Pervasive computing applications must be sufficiently autonomous to adapt their behaviour to changes in computing resources and user requirements. This capability is known as context-awareness. In some cases, context-aware applications must be implemented as autonomic systems capable of dynamically discovering and replacing context sources (sensors) at run time. Unlike other types of application autonomy, this kind of dynamic reconfiguration has not yet been sufficiently investigated by the research community. However, application-level context models are becoming common as a way to ease the programming of context-aware applications and to support evolution by decoupling applications from context sources. We can leverage these context models to develop general (i.e., application-independent) solutions for dynamic, run-time discovery of context sources (i.e., context management). This paper presents a model and architecture for a reconfigurable context management system that supports interoperability by building on emerging standards for sensor description and classification.
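The decoupling described above can be sketched as a small registry: applications query by context *type*, and the manager resolves the query to whatever source is currently registered, so sensors can be discovered and replaced at run time without the application noticing. This is an illustration, not the paper's actual architecture.

```python
class ContextManager:
    """Application-independent sketch: map context types to live sources."""
    def __init__(self):
        self._sources = {}                  # context type -> provider callable

    def register(self, ctx_type, provider):
        self._sources[ctx_type] = provider  # replaces any earlier source

    def query(self, ctx_type):
        if ctx_type not in self._sources:
            raise LookupError(f"no source for {ctx_type!r}")
        return self._sources[ctx_type]()

cm = ContextManager()
cm.register("location", lambda: "room-101")   # e.g. a Wi-Fi positioning sensor
print(cm.query("location"))                   # room-101
cm.register("location", lambda: "room-202")   # source replaced at run time
print(cm.query("location"))                   # room-202
```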

Relevance:

10.00%

Publisher:

Abstract:

While developments in distributed object computing environments, such as the Common Object Request Broker Architecture (CORBA) [17] and the Telecommunication Intelligent Network Architecture (TINA) [16], have enabled interoperability between domains in large open distributed systems, managing the resources within such systems has become an increasingly complex task. This challenge has been considered for several years within the distributed systems management research community, and policy-based management has recently emerged as a promising solution. Large evolving enterprises present a significant challenge for policy-based management, partly due to the requirement to support both mutual transparency and individual autonomy between domains [2], but also because the fluidity and complexity of interactions occurring within such environments require an ability to cope with the coexistence of multiple, potentially inconsistent policies. This paper discusses the need to provide both dynamic (run-time) and static (compile-time) conflict detection and resolution for policies in such systems, and builds on our earlier conflict detection work [7, 8] to introduce methods for conflict resolution in large open distributed systems.
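The static (compile-time) detection mentioned above can be illustrated with a minimal modality check: two policies conflict when they govern the same subject, action and target but with opposite modalities. The representation below is invented for illustration and is far simpler than the cited detection framework.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Policy:
    subject: str
    action: str
    target: str
    modality: str   # "permit" or "deny"

def detect_conflicts(policies):
    """Return pairs of policies with the same scope but opposite modalities."""
    conflicts = []
    for i, a in enumerate(policies):
        for b in policies[i + 1:]:
            same_scope = (a.subject, a.action, a.target) == (b.subject, b.action, b.target)
            if same_scope and a.modality != b.modality:
                conflicts.append((a, b))
    return conflicts

policies = [Policy("admin", "reboot", "router-1", "permit"),
            Policy("admin", "reboot", "router-1", "deny"),
            Policy("admin", "reboot", "router-2", "deny")]
print(len(detect_conflicts(policies)))   # 1 — only the router-1 pair clashes
```

Run-time detection would apply the same test, but only to the policies activated by a concrete event, which is why both modes are needed.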

Relevance:

10.00%

Publisher:

Abstract:

This dissertation presents an exploratory study of the implementation of a digital security system. The locus of the study was the Guarda Civil Metropolitana of São Paulo, and the general objective of the research is to analyse the use of information and communication technologies in the organisation's operational field. First, a bibliographic survey was conducted to track related work, which revealed the scarcity of studies on information and communication technologies in public security in Brazil. Among the few works found, the one that inspired this research was an experiment carried out in Brasília, in the area of knowledge management and information technology, which investigated the integration of information systems in the public security of the Federal District. For this case study, a qualitative methodology was adopted, combining bibliographic, documentary and field research (interviews and direct observation). The results demonstrate the need for investment in information and communication technologies aimed at the integration and interoperability of public security organisations. The results also confirm findings from research on urban violence indicating that investment in technical infrastructure, personnel and security management alone, the so-called policy of combating violence and crime, is not enough. Integrated public policies are urgently needed to address the root causes of violence and crime: growing poverty, unemployment, the lack of efficient public services, especially in health and education, and the absence of social policies, all of which are themselves understood as forms of violence.

Relevance:

10.00%

Publisher:

Abstract:

The Internet of Things (IoT) consists of a worldwide “network of networks,” composed of billions of interconnected heterogeneous devices denoted as things or “Smart Objects” (SOs). Significant research efforts have been dedicated to porting the experience gained in the design of the Internet to the IoT, with the goal of maximizing interoperability, using the Internet Protocol (IP) and designing specific protocols like the Constrained Application Protocol (CoAP), which have been widely accepted as drivers for the effective evolution of the IoT. This first wave of standardization can be considered successfully concluded, and we can assume that communication with and between SOs is no longer an issue. At this point, to favor the widespread adoption of the IoT, it is crucial to provide mechanisms that facilitate IoT data management and the development of services enabling real interaction with things. Several reference IoT scenarios have real-time or predictable-latency requirements and deal with billions of devices collecting and sending enormous quantities of data. These features create a need for architectures specifically designed to handle this scenario, here denoted as “Big Stream”. In this thesis a new Big Stream Listener-based Graph architecture is proposed. Another important step is to build more applications around the Web model, bringing about the Web of Things (WoT). Since several IoT testbeds have focused on evaluating lower-layer communication aspects, this thesis proposes a new WoT testbed that allows developers to work at a high level of abstraction, without worrying about low-level details. Finally, an innovative SO-driven User Interface (UI) generation paradigm for mobile applications in heterogeneous IoT networks is proposed, to simplify interactions between users and things.
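The listener-based idea behind the Big Stream architecture can be sketched as a push-based graph: processing nodes declare interest in topics, and incoming data flows only to the listeners registered for it, rather than every consumer polling every source. Topic names and the rolling-average consumer are invented for illustration; the thesis's actual architecture is richer than this.

```python
from collections import defaultdict

class StreamGraph:
    """Minimal listener-based graph: data is pushed to interested nodes."""
    def __init__(self):
        self._listeners = defaultdict(list)   # topic -> [callback]

    def listen(self, topic, callback):
        self._listeners[topic].append(callback)

    def publish(self, topic, datum):
        for cb in self._listeners[topic]:
            cb(datum)

graph = StreamGraph()
window, averages = [], []

def rolling_avg(x):
    # A downstream processing node: keeps a running mean of the stream.
    window.append(x)
    averages.append(sum(window) / len(window))

graph.listen("temperature", rolling_avg)
for t in (20.0, 22.0, 24.0):
    graph.publish("temperature", t)
print(averages)   # [20.0, 21.0, 22.0]
```

Pushing data to listeners, instead of having consumers poll, is what keeps latency predictable as the number of sources grows.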

Relevance:

10.00%

Publisher:

Abstract:

Traditionally, geostatistical algorithms are contained within specialist GIS and spatial statistics software. Such packages are often expensive, with relatively complex user interfaces and steep learning curves, and cannot be easily integrated into more complex process chains. In contrast, Service Oriented Architectures (SOAs) promote interoperability and loose coupling within distributed systems, typically using XML (eXtensible Markup Language) and Web services. Web services provide a mechanism for a user to discover and consume a particular process, often as part of a larger process chain, with minimal knowledge of how it works. Wrapping current geostatistical algorithms with a Web service layer would thus increase their accessibility, but raises several complex issues. This paper discusses a solution to providing interoperable, automatic geostatistical processing through the use of Web services, developed in the INTAMAP project (INTeroperability and Automated MAPping). The project builds upon Open Geospatial Consortium standards for describing observations, typically used within sensor webs, and employs Geography Markup Language (GML) to describe the spatial aspect of the problem domain. Thus the interpolation service is extremely flexible, being able to support a range of observation types, and can cope with issues such as change of support and differing error characteristics of sensors (by utilising descriptions of the observation process provided by SensorML). XML is accepted as the de facto standard for describing Web services, due to its expressive capabilities which allow automatic discovery and consumption by ‘naive’ users. Any XML schema employed must therefore be capable of describing every aspect of a service and its processes. However, no schema currently exists that can define the complex uncertainties and modelling choices that are often present within geostatistical analysis. 
We show a solution to this problem, developing a family of XML schemata to enable the description of a full range of uncertainty types. These types will range from simple statistics, such as the kriging mean and variances, through to a range of probability distributions and non-parametric models, such as realisations from a conditional simulation. By employing these schemata within a Web Processing Service (WPS) we show a prototype moving towards a truly interoperable geostatistical software architecture.
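The service contract described above hides the interpolation behind a Web interface: observations in, predicted values out. As a stand-in for the kriging actually wrapped by INTAMAP, the sketch below uses inverse-distance weighting, which is enough to show the shape of that contract; it is not the project's algorithm.

```python
def idw(observations, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y).

    observations: list of (x, y, value) tuples.
    """
    num = den = 0.0
    for ox, oy, v in observations:
        d2 = (x - ox) ** 2 + (y - oy) ** 2
        if d2 == 0:
            return v                         # exact hit on an observation
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

obs = [(0.0, 0.0, 10.0), (2.0, 0.0, 30.0)]
print(idw(obs, 1.0, 0.0))   # 20.0 — midpoint of two equally weighted points
```

In the Web-service setting, `observations` would arrive as an OGC observations document and the result would be returned with an uncertainty description, which is exactly where the schemata discussed above come in.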

Relevance:

10.00%

Publisher:

Abstract:

This thesis provides an interoperable language for quantifying uncertainty using probability theory. A general introduction to interoperability and uncertainty is given, with particular emphasis on the geospatial domain. Existing interoperable standards used within the geospatial sciences are reviewed, including Geography Markup Language (GML), Observations and Measurements (O&M) and the Web Processing Service (WPS) specifications. The importance of uncertainty in geospatial data is identified and probability theory is examined as a mechanism for quantifying these uncertainties. The Uncertainty Markup Language (UncertML) is presented as a solution to the lack of an interoperable standard for quantifying uncertainty. UncertML is capable of describing uncertainty using statistics, probability distributions or a series of realisations. The capabilities of UncertML are demonstrated through a series of XML examples. This thesis then provides a series of example use cases where UncertML is integrated with existing standards in a variety of applications. The Sensor Observation Service - a service for querying and retrieving sensor-observed data - is extended to provide a standardised method for quantifying the inherent uncertainties in sensor observations. The INTAMAP project demonstrates how UncertML can be used to aid uncertainty propagation using a WPS by allowing UncertML as input and output data. The flexibility of UncertML is demonstrated with an extension to the GML geometry schemas to allow positional uncertainty to be quantified. Further applications and developments of UncertML are discussed.
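The core idea, that a statistic, a parametric distribution or raw realisations can all be carried in one XML vocabulary, can be sketched as below. The element names are invented for illustration and are NOT the real UncertML schema.

```python
import xml.etree.ElementTree as ET

def gaussian_xml(mean: float, variance: float) -> str:
    """Serialise a Gaussian as a small XML fragment (illustrative vocabulary)."""
    dist = ET.Element("GaussianDistribution")
    ET.SubElement(dist, "mean").text = str(mean)
    ET.SubElement(dist, "variance").text = str(variance)
    return ET.tostring(dist, encoding="unicode")

def realisations_xml(samples) -> str:
    """Serialise raw samples, e.g. from a conditional simulation."""
    r = ET.Element("Realisations")
    r.text = " ".join(str(s) for s in samples)
    return ET.tostring(r, encoding="unicode")

# A kriging prediction could travel as a distribution...
print(gaussian_xml(4.2, 0.81))
# ...or as realisations, using the same vocabulary.
print(realisations_xml([3.9, 4.4, 4.1]))
```

A consumer that understands the shared vocabulary can then propagate whichever uncertainty representation it receives, which is the interoperability property the thesis targets.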

Relevance:

10.00%

Publisher:

Abstract:

The CONNECT European project, which started in February 2009, aims to remove the interoperability barrier faced by today’s distributed systems. It does so by adopting a revolutionary approach to the seamless networking of digital systems: synthesizing on the fly the connectors via which networked systems communicate.
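The connector idea can be sketched in miniature: two systems with mismatched interfaces are bridged by a connector built at run time from a declared message mapping, rather than hand-written in advance. The mapping format and the legacy system below are entirely hypothetical, a toy stand-in for CONNECT's synthesis of protocol mediators.

```python
def synthesise_connector(mapping):
    """mapping: caller message name -> (target method name, argument reorder)."""
    def connector(target, message, *args):
        method_name, order = mapping[message]
        # Translate the caller's message into the target's native call.
        return getattr(target, method_name)(*(args[i] for i in order))
    return connector

class LegacyPrinter:                      # the networked system we adapt to
    def emit(self, text, copies):
        return [text] * copies

# Caller speaks ("print", copies, text); the connector reorders the arguments
# and renames the operation so LegacyPrinter.emit(text, copies) is invoked.
connect = synthesise_connector({"print": ("emit", (1, 0))})
print(connect(LegacyPrinter(), "print", 2, "hi"))   # ['hi', 'hi']
```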