57 results for Computing Services


Relevance: 20.00%

Abstract:

With not only the emergence but also the growth of the electronic market, that is, the growth of online suppliers of services and products and of Internet users (potential consumers), the necessary conditions are created for the affirmation of agile/virtual enterprises (A/VE) as a present and future enterprise organizational model. In this context, it is our understanding that the broker may have an important role in their development, namely if the broker performs its functions for the A/VE with greater efficacy and efficiency. In this article we first present a structured review of broker models. We then present a taxonomy of possible broker functions for the broker's work with the A/VE, followed by a classification of the broker models found in the literature. This classification permits the analysis of a broker model and establishes a framework for our own broker model according to the BM_Virtual Enterprise Architecture Reference Model (BM_VEARM).

Relevance: 20.00%

Abstract:

Dynamically reconfigurable SRAM-based field-programmable gate arrays (FPGAs) enable the implementation of reconfigurable computing systems where several applications may be run simultaneously, sharing the available resources according to their own immediate functional requirements. To exclude malfunctioning due to faulty elements, the reliability of all FPGA resources must be guaranteed. Since resource allocation takes place asynchronously, an online structural test scheme is the only way of ensuring reliable system operation. On the other hand, this test scheme should not disturb the operation of the circuit, otherwise availability would be compromised. System performance is also influenced by the efficiency of the management strategies that must be able to dynamically allocate enough resources when requested by each application. As those resources are allocated and later released, many small free resource blocks are created, which are left unused due to performance and routing restrictions. To avoid wasting logic resources, the FPGA logic space must be defragmented regularly. This paper presents a non-intrusive active replication procedure that supports the proposed test methodology and the implementation of defragmentation strategies, assuring both the availability of resources and their perfect working condition, without disturbing system operation.

Relevance: 20.00%

Abstract:

Corporate organizational structures are increasingly required to guarantee high standards of service quality while at the same time ensuring the sustainability of those structures and the alignment of the investments made with business strategies. Their development means that, in the area of information and communication technologies, current strategies need to be rethought, seeking new models that are more agile and better able to fit these new demands. In this context, digital identity platforms can be expected to play a decisive role in the development of these new models, since they are a unique instrument for implementing heterogeneous, interoperable platforms with high levels of security and guaranteed control over access to information. The work presented here aims to investigate and develop a digital identity platform and a test platform that allow the Politécnico do Porto to acquire an information and communication technology infrastructure that becomes a fundamental instrument for the continuous development, quality assurance and sustainability of all the services provided to its community.

Relevance: 20.00%

Abstract:

The ultimate goal of this research plan is to improve the learning experience of students through the combination of pedagogical eLearning services. Service-oriented architectures are already being used in eLearning, but in this work the focus is on services of pedagogical value rather than on generic services adapted from other business systems. This approach to the architecture of eLearning platforms raises challenges addressed by this work, namely: conceptual modeling of the pedagogical eLearning services domain; interoperability and coordination of pedagogical eLearning services; conversion of existing eLearning systems to pedagogical services; and adaptation of eLearning services to individual learners. An improved eLearning platform will incorporate learning tools adequate to the domains it covers and will focus on the individual learner that uses it. With this approach we expect to raise the pedagogical value of eLearning platforms.

Relevance: 20.00%

Abstract:

Learning systems are evolving from component based and centralized architectures towards service oriented and decentralized architectures. The standardization of e-learning content and interoperability is a powerful force in this evolution. In this chapter we put in perspective the evolution of e-learning systems and standards, and argue that specialized services will play an important role in future learning systems, especially in those targeted for competitive learning.

Relevance: 20.00%

Abstract:

For many, the act of teaching was, and remains, an "art", in which the most effective teachers and great masters are those with the ability and the art to convey their messages and knowledge in a simple and appealing way, regardless of the field of study. Class-related information is increasingly digital, so it is important for teachers to master technologies for creating, organizing and making content available. This sharing was initially made possible through Web pages and later through LMS (Learning Management System) platforms. Creating a website was a complicated task, both in terms of cost and of mastering Web technology, and it was sometimes necessary to hire professionals for that purpose. CMS (Content Management System) then emerged: open-source technologies that allow content to be managed. In this context, a study was carried out to assess teachers' competences in the area of sharing and managing digital content. This study allowed conclusions to be drawn about the potential and applicability of CMS in education. Its main objective focused on the potential for distributing and sharing Digital Educational Resources organized, from a pedagogical point of view, for students. The role of Cloud Computing in the process of collaborative document sharing was also analysed and studied. As support for this research, a model course was designed, implemented in the three main CMS currently available, and the potential of each one in this context was evaluated. Finally, the conclusions drawn from the study were presented.

Relevance: 20.00%

Abstract:

Empowered by virtualisation technology, cloud infrastructures enable the construction of flexible and elastic computing environments, providing an opportunity for energy and resource cost optimisation while enhancing system availability and achieving high performance. A crucial requirement for effective consolidation is the ability to efficiently utilise system resources for high-availability computing and energy-efficiency optimisation to reduce operational costs and carbon footprints in the environment. Additionally, failures in highly networked computing systems can negatively impact system performance substantially, prohibiting the system from achieving its initial objectives. In this paper, we propose algorithms to dynamically construct and readjust virtual clusters to enable the execution of users' jobs. Allied with an energy optimising mechanism to detect and mitigate energy inefficiencies, our decision-making algorithms leverage virtualisation tools to provide proactive fault-tolerance and energy-efficiency to virtual clusters. We conducted simulations by injecting random synthetic jobs and jobs using the latest version of the Google cloud tracelogs. The results indicate that our strategy improves the work per Joule ratio by approximately 12.9% and the working efficiency by almost 15.9% compared with other state-of-the-art algorithms.
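The headline metric above is the work per Joule ratio. The minimal Python sketch below shows how such a ratio and a relative improvement could be computed from simulation logs; the record format and numbers are illustrative assumptions, not the authors' simulator or data.

    # Sketch: work-per-Joule ratio from simulated job logs (values are invented).
    def work_per_joule(jobs):
        """jobs: list of dicts with completed 'work' units and 'energy' in Joules."""
        total_work = sum(j["work"] for j in jobs)
        total_energy = sum(j["energy"] for j in jobs)
        return total_work / total_energy if total_energy else 0.0

    baseline = [{"work": 120, "energy": 950}, {"work": 80, "energy": 700}]
    proposed = [{"work": 120, "energy": 840}, {"work": 80, "energy": 620}]

    improvement = (work_per_joule(proposed) / work_per_joule(baseline) - 1) * 100
    print(f"work/Joule improvement: {improvement:.1f}%")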

Relevance: 20.00%

Abstract:

Extracting the semantic relatedness of terms is an important topic in several areas, including data mining, information retrieval and web recommendation. This paper presents an approach for computing the semantic relatedness of terms using the knowledge base of DBpedia, a community effort to extract structured information from Wikipedia. Several approaches to extracting semantic relatedness from Wikipedia using bag-of-words vector models are already available in the literature. The research presented in this paper explores a novel approach using paths on an ontological graph extracted from DBpedia. It is based on an algorithm for finding and weighting a collection of paths connecting concept nodes. This algorithm was implemented in a tool called Shakti that extracts relevant ontological data for a given domain from DBpedia using its SPARQL endpoint. To validate the proposed approach, Shakti was used to recommend web pages on a Portuguese social site dedicated to alternative music, and the results of that experiment are reported in this paper.
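As a rough illustration of the path-based idea (not the actual Shakti implementation), the sketch below enumerates short simple paths between two concept nodes in a tiny ontological graph and scores relatedness by summing length-discounted path weights. The graph content, node names and decay factor are assumptions made only for this example.

    # Sketch: semantic relatedness as a sum of length-discounted paths.
    from collections import deque

    graph = {  # tiny undirected ontological graph (e.g., extracted from DBpedia)
        "Punk_rock": {"Rock_music", "Ramones"},
        "Rock_music": {"Punk_rock", "Indie_rock"},
        "Indie_rock": {"Rock_music", "Alternative_music"},
        "Alternative_music": {"Indie_rock"},
        "Ramones": {"Punk_rock"},
    }

    def relatedness(src, dst, max_len=4, decay=0.5):
        """Sum decay**edges over all simple paths from src to dst with at most max_len edges."""
        score, queue = 0.0, deque([(src, (src,))])
        while queue:
            node, path = queue.popleft()
            for nxt in graph.get(node, ()):
                if nxt in path:
                    continue
                if nxt == dst:
                    score += decay ** len(path)   # the full path to dst has len(path) edges
                elif len(path) < max_len:
                    queue.append((nxt, path + (nxt,)))
        return score

    print(relatedness("Punk_rock", "Alternative_music"))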

Relevance: 20.00%

Abstract:

Patient satisfaction with communication with health professionals is an indicator of the quality of services and institutions. In the literature we did not find standardized, validated instruments that assess patient satisfaction with communication with health professionals. The present study aims to build and validate an instrument to assess patient satisfaction with communication with health professionals. We developed this study in three cycles. The first, a literature review, identified dimensions and items of interpersonal communication in healthcare. In the second cycle, we conducted a modified Delphi method in three rounds, using the Survey Monkey online questionnaire platform, with a panel of 25 experts; we set as the minimum criterion for retention in the following round that an item receive 70% consensus from the panel. After the three rounds, we obtained an instrument with six communication dimensions (verbal communication, non-verbal communication, empathy, respect, problem solving and support material), twenty-five specific items, plus six generic dimensions that assess each of the dimensions. In the third cycle we evaluated the psychometric characteristics, in terms of sensitivity, construct validity and reliability, in a sample of 348 participants. The results show that all response categories were represented in all items. Construct validity: factor analysis identified a six-component solution that explains 71% of the total variance. Reliability: item-total correlation values range between 0.387 and 0.722, indicating a moderate to strong positive correlation. Cronbach's alpha (α = 0.928) indicates excellent internal consistency. The instrument built presents good psychometric properties. A new tool is thus available to support the management and planning processes needed to increase quality in health services and institutions.
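For reference, the internal-consistency figure quoted above (Cronbach's alpha) can be computed from an item-score matrix as in the short Python sketch below; the response matrix is invented purely to show the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of totals), and is not the study's data.

    # Sketch: Cronbach's alpha for a respondents-by-items score matrix.
    import statistics as st

    responses = [  # rows: respondents, columns: items (e.g., 1-5 Likert scores)
        [4, 5, 4, 3],
        [3, 4, 4, 4],
        [5, 5, 4, 5],
        [2, 3, 3, 2],
    ]

    def cronbach_alpha(rows):
        k = len(rows[0])                                   # number of items
        item_vars = [st.pvariance(col) for col in zip(*rows)]
        total_var = st.pvariance([sum(r) for r in rows])   # variance of total scores
        return k / (k - 1) * (1 - sum(item_vars) / total_var)

    print(f"alpha = {cronbach_alpha(responses):.3f}")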

Relevance: 20.00%

Abstract:

This paper reports the development of a B2B platform for the personalization of the advertising transmitted during program intervals. The platform as a whole must ensure that the intervals are filled with ads compatible with the profile, context and expressed interests of the viewers. The platform acts as an electronic marketplace for advertising agencies (content producer companies) and multimedia content providers (content distribution companies). Once registered at the platform, the companies are represented by agents that automatically negotiate the price of the interval timeslots according to the specified price range and adaptation behaviour. The candidate ads for a given viewer interval are selected through a matching mechanism between the ad, viewer and current context (program being watched) profiles. The overall architecture of the platform consists of a multiagent system organized into three layers: (i) interface agents that interact with the companies; (ii) enterprise agents that model the companies; and (iii) delegate agents that negotiate a specific ad or interval. The negotiation follows a variant of the Iterated Contract Net Interaction Protocol (ICNIP) and is based on the price(s) offered by the advertising agencies to occupy the viewer's interval.
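A minimal sketch of the kind of matching mechanism described above is given below. The profile attributes, weights and scoring rule are assumptions for illustration, not the platform's actual algorithm.

    # Sketch: rank candidate ads for an interval by the overlap between the ad's
    # target profile, the viewer profile and the current program context.
    def match_score(ad, viewer, context, w_viewer=0.7, w_context=0.3):
        viewer_overlap = len(ad["target_interests"] & viewer["interests"]) / max(len(ad["target_interests"]), 1)
        context_overlap = 1.0 if context["genre"] in ad["target_genres"] else 0.0
        return w_viewer * viewer_overlap + w_context * context_overlap

    ads = [
        {"id": "ad1", "target_interests": {"sports", "cars"}, "target_genres": {"sports"}},
        {"id": "ad2", "target_interests": {"cooking"}, "target_genres": {"lifestyle"}},
    ]
    viewer = {"interests": {"sports", "travel"}}
    context = {"genre": "sports"}  # program currently being watched

    ranked = sorted(ads, key=lambda ad: match_score(ad, viewer, context), reverse=True)
    print([ad["id"] for ad in ranked])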

Relevance: 20.00%

Abstract:

The current ubiquitous network access and increase in network bandwidth are driving the sales of mobile location-aware user devices and, consequently, the development of context-aware applications, namely location-based services. The goal of this project is to provide consumers of location-based services with a richer end-user experience by means of service composition, personalization, device adaptation and continuity of service. Our approach relies on a multi-agent system composed of proxy agents that act as mediators and providers of personalization meta-services, device adaptation and continuity of service for consumers of pre-existing location-based services. These proxy agents, which have Web services interfaces to ensure a high level of interoperability, perform service composition and take into consideration the preferences of the users and the limitations of the user devices, making the use of different types of devices seamless for the end-user. To validate and evaluate the performance of this approach, use cases were defined, tests were conducted and the results gathered demonstrate that the initial goals were successfully fulfilled.
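As an illustration only, a proxy's device-adaptation step for location-based results might look like the Python sketch below; the field names, limits and data are assumptions and not part of the project's actual Web services interfaces.

    # Sketch: adapt a location-based service response to the capabilities of the
    # requesting device (trim results, drop heavy fields on low bandwidth).
    def adapt_to_device(results, device):
        limit = 3 if device.get("screen") == "small" else 10
        adapted = []
        for item in results[:limit]:
            entry = {"name": item["name"], "distance_m": item["distance_m"]}
            if device.get("bandwidth") != "low":
                entry["image_url"] = item.get("image_url")
            adapted.append(entry)
        return adapted

    results = [
        {"name": "Cafe A", "distance_m": 120, "image_url": "http://example/a.jpg"},
        {"name": "Cafe B", "distance_m": 310, "image_url": "http://example/b.jpg"},
    ]
    print(adapt_to_device(results, {"screen": "small", "bandwidth": "low"}))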

Relevance: 20.00%

Abstract:

The goal of the work presented in this paper is to provide mobile platforms within our campus with a GPS based data service capable of supporting precise outdoor navigation. This can be achieved by providing campus-wide access to real time Differential GPS (DGPS) data. As a result, we designed and implemented a three-tier distributed system that provides Internet data links between remote DGPS sources and the campus and a campus-wide DGPS data dissemination service. The Internet data link service is a two-tier client/server where the server-side is connected to the DGPS station and the client-side is located at the campus. The campus-wide DGPS data provider disseminates the DGPS data received at the campus via the campus Intranet and via a wireless data link. The wireless broadcast is intended for portable receivers equipped with a DGPS wireless interface and the Intranet link is provided for receivers with a DGPS serial interface. The application is expected to provide adequate support for accurate outdoor campus navigation tasks.
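A very small sketch of the dissemination idea follows: a relay reads DGPS correction data from a remote source over the Internet link and rebroadcasts it on the local network. The host name, ports and use of UDP broadcast are assumptions for illustration; the actual system disseminates the data via the campus Intranet and a wireless data link.

    # Sketch: relay DGPS correction bytes from a remote TCP source to the campus
    # network via UDP broadcast. Addresses and ports are assumptions.
    import socket

    DGPS_SOURCE = ("dgps.example.org", 2101)   # remote DGPS data server (assumed)
    BROADCAST = ("255.255.255.255", 5005)      # campus-wide UDP broadcast (assumed)

    def relay():
        src = socket.create_connection(DGPS_SOURCE)
        out = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        out.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        while True:
            chunk = src.recv(512)              # raw correction bytes from the source
            if not chunk:
                break
            out.sendto(chunk, BROADCAST)

    if __name__ == "__main__":
        relay()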

Relevance: 20.00%

Abstract:

The expansion of Digital Television and the convergence between conventional broadcasting and television over IP have contributed to the gradual increase in the number of available channels and of on-demand video content. Moreover, the widespread use of mobile devices such as laptops, smartphones and tablets in everyday activities has shifted the traditional television viewing paradigm from the couch to anywhere, anytime, from any device. Although this new scenario greatly improves the viewing experience, it also brings new challenges given the overload of information that the viewer faces. Recommendation systems stand out as a possible solution to help a viewer select the content that best fits his/her preferences. This paper describes a web-based system that helps the user navigate broadcast and online television content by providing recommendations based on collaborative and content-based filtering. The algorithms developed estimate the similarity between items and users and predict the rating that a user would assign to a particular item (television program, movie, etc.). To enable interoperability between different systems, program characteristics (title, genre, actors, etc.) are stored according to the TV-Anytime standard. The recommendations produced are presented through a Web application that allows the user to interact with the system based on the obtained recommendations.
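A compact sketch of the collaborative-filtering step described above is shown below. The ratings, and the specific choice of user-based cosine similarity with a similarity-weighted average prediction, are illustrative assumptions rather than the exact algorithms developed; real items would additionally carry TV-Anytime metadata.

    # Sketch: user-based collaborative filtering with cosine similarity.
    from math import sqrt

    ratings = {  # user -> {program: rating}
        "ana":  {"news": 4, "football": 5, "cooking": 1},
        "rui":  {"news": 5, "football": 4, "movies": 3},
        "joao": {"football": 5, "movies": 4, "cooking": 2},
    }

    def cosine(u, v):
        common = set(u) & set(v)
        if not common:
            return 0.0
        dot = sum(u[i] * v[i] for i in common)
        return dot / (sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values())))

    def predict(user, item):
        """Similarity-weighted average of other users' ratings for the item."""
        pairs = [(cosine(ratings[user], r), r[item])
                 for other, r in ratings.items() if other != user and item in r]
        total = sum(s for s, _ in pairs)
        return sum(s * rating for s, rating in pairs) / total if total else None

    print(predict("ana", "movies"))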

Relevance: 20.00%

Abstract:

Wireless Body Area Networks (WBANs) have emerged as a promising technology for medical and non-medical applications. WBANs consist of a number of miniaturized, portable and autonomous sensor nodes used for long-term health monitoring of patients. These sensor nodes continuously collect patient information, which is used for ubiquitous health monitoring. In addition, WBANs may be used for managing catastrophic events and increasing the effectiveness and performance of rescue forces. The huge amount of data collected by WBAN nodes demands a scalable, on-demand, powerful and secure storage and processing infrastructure. Cloud computing is expected to play a significant role in achieving these objectives. The cloud computing environment links different devices, ranging from miniaturized sensor nodes to high-performance supercomputers, to deliver people-centric and context-centric services to individuals and industries. The possible integration of WBANs with cloud computing (WBAN-cloud) will introduce a viable hybrid platform that must be able to process the huge amount of data collected from multiple WBANs. This WBAN-cloud will enable users (including physicians and nurses) to globally access the processing and storage infrastructure at competitive costs. Because WBANs forward useful and life-critical information to the cloud, which may operate in distributed and hostile environments, novel security mechanisms are required to prevent malicious interactions with the storage infrastructure. Both the cloud providers and the users must take strong security measures to protect the storage infrastructure.