911 results for Production-on-demand


Relevance: 80.00%

Abstract:

Introduction: Vocational training (VT) is a mandatory requirement for all UK dental graduates prior to entering NHS practice. The VT period provides structured, supervised experience supported by study days and interaction with peers. It is not compulsory for Irish dental graduates working in either Ireland or the UK to undertake VT, yet a proportion voluntarily do so each year.

Objectives: This study was designed to explore the choices made by Irish dental graduates. It aimed to record any benefits of VT and its impact upon future career choices.

Method: A self-completion questionnaire was developed and piloted before being circulated electronically to recent dental graduates from University College Cork. After collecting demographic information, respondents were asked to indicate whether they pursued vocational training on graduation, give their perception of their post-graduation experience, describe their current work profile and detail any formal postgraduate studies.

Results: 35% of respondents opted to undertake VT and 79% did so in the UK. Those who completed VT regarded it as a very positive experience with benefits including: working in a positive learning environment, help on demand and interaction with peers. Of those who chose VT, 49% have pursued some form of further formal postgraduate study as compared to 40% of those who did not. All of the respondents who completed VT indicated they would recommend it to current Irish graduates. The majority of those who took up an associate position immediately after graduation reported that this was beneficial, but up to three quarters would recommend that current graduates undertake VT and 45% would now choose to do so themselves.

Conclusions: Increasing numbers of Irish graduates are moving to the UK to undertake VT and they find it a beneficial experience. In addition, those who undertook VT were more likely to undertake formal postgraduate study.

Relevance: 80.00%

Abstract:

In-Memory Databases (IMDBs), such as SAP HANA, enable new levels of database performance by removing the disk bottleneck and by compressing data in memory. This improved performance means that reports and analytic queries can now be processed on demand. The goal, therefore, is to provide near real-time responses to compute- and data-intensive analytic queries. To facilitate this, much work has investigated the use of acceleration technologies within the database context. While current research into the application of these technologies has yielded positive results, it has tended to focus on single database tasks or on isolated single-user requests. This paper uses SHEPARD, a framework for managing accelerated tasks across shared heterogeneous resources, to introduce acceleration into an IMDB. Results show how, using SHEPARD, multiple simultaneous user queries all receive speed-up by using a shared pool of accelerators. Results also show that offloading analytic tasks onto accelerators can have indirect benefits for other database workloads by reducing contention for CPU resources.
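A minimal sketch of the shared-accelerator idea described in this abstract, assuming a hypothetical `AcceleratorPool` API (SHEPARD's actual interface is not given here): tasks from multiple concurrent queries are queued and served by a fixed pool of devices, so no single query monopolizes an accelerator.

```python
import queue
import threading
from concurrent.futures import Future

# Illustrative sketch only: the class and method names below are hypothetical,
# not SHEPARD's real API.
class AcceleratorPool:
    """Serves queued tasks from many concurrent queries on a fixed set of devices."""

    def __init__(self, devices):
        self.tasks = queue.Queue()
        for device in devices:
            threading.Thread(target=self._run, args=(device,), daemon=True).start()

    def _run(self, device):
        # Each worker thread stands in for one accelerator consuming tasks.
        while True:
            fn, args, fut = self.tasks.get()
            fut.set_result(fn(device, *args))  # "offload" the task onto this device
            self.tasks.task_done()

    def submit(self, fn, *args):
        fut = Future()
        self.tasks.put((fn, args, fut))
        return fut

# Two simulated accelerators shared by four concurrent "query" tasks.
pool = AcceleratorPool(["fpga0", "gpu0"])
futures = [pool.submit(lambda device, x: x * x, i) for i in range(4)]
results = sorted(f.result() for f in futures)
```

The queue is what gives every query fair access to the shared pool; a real framework would add device-specific kernels and scheduling policy on top.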

Relevance: 80.00%

Abstract:

In modern society, new devices, applications and technologies with sophisticated capabilities are converging on the same network infrastructure. Users are also increasingly demanding in their personal preferences and expectations, desiring Internet connectivity anytime and everywhere. These aspects have triggered many research efforts, since the current Internet is reaching a breaking point in trying to provide enough flexibility for users and profits for operators while dealing with the complex requirements raised by this recent evolution. Fully aligned with future Internet research, many solutions have been proposed to enhance current Internet-based architectures and protocols so that they become context-aware, that is, dynamically adapted to changes in the information characterizing any network entity. In this sense, this Thesis proposes a new architecture that allows the creation of several networks with different characteristics according to their context, on top of a single Wireless Mesh Network (WMN), whose infrastructure and protocols are flexible and self-adaptable. More specifically, this Thesis models the context of users, which can span their security, cost and mobility preferences, their devices' capabilities and their services' quality requirements, in order to turn a WMN into a set of logical networks. Each logical network is configured to meet a set of user context needs (for instance, support for high mobility and low security). To implement this user-centric architecture, this Thesis uses network virtualization, which has often been advocated as a means to deploy independent network architectures and services towards the future Internet while allowing dynamic resource management. In this way, network virtualization allows a flexible and programmable configuration of a WMN, so that it can be shared by multiple logical networks (or virtual networks, VNs).
Moreover, the high level of isolation introduced by network virtualization can be used to differentiate the protocols and mechanisms of each context-aware VN. This architecture raises several challenges in controlling and managing the VNs on demand, in response to user and WMN dynamics. In this context, we target mechanisms to: (i) discover and select the VN to assign to a user; and (ii) create, adapt and remove VN topologies and routes. We also explore how the rate of variation of user context requirements can be considered to improve the performance and reduce the complexity of VN control and management. Finally, due to the scalability limitations of centralized control solutions, we propose a mechanism to distribute the control functionalities among the architectural entities, which can cooperate to control and manage the VNs in a distributed way.
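The VN discovery-and-selection step (i) can be pictured as matching a user's context against the profiles the logical networks were configured for. A toy sketch, with hypothetical VN names and context keys (the thesis's actual matching mechanism is not reproduced here):

```python
# Hypothetical VN profiles: each virtual network advertises the context it serves.
VN_PROFILES = {
    "vn-mobility": {"mobility": "high", "security": "low",  "cost": "low"},
    "vn-secure":   {"mobility": "low",  "security": "high", "cost": "high"},
}

def select_vn(user_context, profiles=VN_PROFILES):
    """Return the VN whose profile satisfies the most user context requirements."""
    def score(profile):
        # Count context keys the profile matches exactly.
        return sum(profile.get(key) == value for key, value in user_context.items())
    return max(profiles, key=lambda name: score(profiles[name]))

# A highly mobile user with weak security needs is steered to the mobility VN.
chosen = select_vn({"mobility": "high", "security": "low"})
```

In the architecture described above, this selection would be one of the distributed control functionalities, re-evaluated as the user's context changes.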

Relevance: 80.00%

Abstract:

The rapid evolution and proliferation of a world-wide computerized network, the Internet, has resulted in an overwhelming and constantly growing amount of publicly available data and information, a trend also observed in biomedicine. However, the lack of structure of textual data inhibits its direct processing by computational solutions. Information extraction is the text-mining task that aims to automatically collect information from unstructured text data sources. The goal of the work described in this thesis was to build innovative solutions for biomedical information extraction from scientific literature, through the development of simple software artifacts for developers and biocurators, delivering more accurate, usable and faster results. We started by tackling named entity recognition, a crucial initial task, with the development of Gimli, a machine-learning-based solution that follows an incremental approach to optimize extracted linguistic characteristics for each concept type. Afterwards, Totum was built to harmonize concept names provided by heterogeneous systems, delivering a robust solution with improved performance results. This approach takes advantage of heterogeneous corpora to deliver cross-corpus harmonization that is not constrained to specific characteristics. Since previous solutions do not provide links to knowledge bases, Neji was built to streamline the development of complex and custom solutions for biomedical concept name recognition and normalization. This was achieved through a modular and flexible framework focused on speed and performance, integrating a large number of processing modules optimized for the biomedical domain. To offer on-demand heterogeneous biomedical concept identification, we developed BeCAS, a web application, service and widget.
We also tackled relation mining by developing TrigNER, a machine-learning-based solution for biomedical event trigger recognition, which applies an automatic algorithm to obtain the best linguistic features and model parameters for each event type. Finally, in order to assist biocurators, Egas was developed to support rapid, interactive and real-time collaborative curation of biomedical documents, through manual and automatic in-line annotation of concepts and relations. Overall, the research work presented in this thesis contributed to a more accurate update of current biomedical knowledge bases, towards improved hypothesis generation and knowledge discovery.
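To make the recognition-plus-normalization step concrete, here is a deliberately simplified sketch: a dictionary lookup that tags concept mentions and maps them to knowledge-base identifiers. The tools above (Gimli, Neji) use machine learning and far richer linguistic features; the lexicon entries and identifiers below are illustrative only.

```python
import re

# Toy lexicon: concept name -> knowledge-base identifier (illustrative values).
LEXICON = {
    "brca1": "UNIPROT:P38398",
    "tp53":  "UNIPROT:P04637",
}

def annotate(text, lexicon=LEXICON):
    """Return (surface form, character offset, normalized id) per recognized concept."""
    hits = []
    for match in re.finditer(r"\w+", text):
        token = match.group().lower()          # case-insensitive lookup
        if token in lexicon:
            hits.append((match.group(), match.start(), lexicon[token]))
    return hits

annotations = annotate("Mutations in BRCA1 and TP53 are well studied.")
```

Character offsets are kept because downstream tasks such as in-line annotation in a curation tool like Egas need to anchor each concept back into the original document.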

Relevance: 80.00%

Abstract:

Network virtualisation is seen as a promising approach to overcome the so-called “Internet impasse” and bring innovation back into the Internet, by allowing easier migration towards novel networking approaches as well as the coexistence of complementary network architectures on a shared infrastructure in a commercial context. Recently, interest from operators and mainstream industry in network virtualisation has grown significantly, as the potential benefits of virtualisation have become clearer, from both an economic and an operational point of view. In the beginning, the concept was mainly a research topic and was materialized in small-scale testbeds and research network environments. This PhD Thesis aims to provide the network operator with a set of mechanisms and algorithms capable of managing and controlling virtual networks. To this end, we propose a framework that allocates, monitors and controls virtual resources in a centralized and efficient manner. In order to analyse the performance of the framework, we implemented and evaluated it on a small-scale testbed. To enable the operator to make an efficient allocation of virtual networks onto the substrate network, in real time and on demand, a heuristic algorithm is proposed to perform the virtual network mapping. For the network operator to obtain the highest profit from the physical network, a mathematical formulation is also proposed that aims to maximize the number of virtual networks allocated onto the physical network. Since the power consumption of the physical network is very significant in the operating costs, it is important to allocate virtual networks onto fewer physical resources, and onto physical resources that are already active. To address this challenge, we propose a mathematical formulation that aims to minimize the energy consumption of the physical network without affecting the efficiency of the allocation of virtual networks.
To minimize fragmentation of the physical network while increasing the revenue of the operator, the initial formulation is extended to contemplate the re-optimization of previously mapped virtual networks, so that the operator makes better use of its physical infrastructure. It is also necessary to address the migration of virtual networks, whether for load balancing or because of imminent failure of physical resources, without affecting the proper functioning of the virtual network. To this end, we propose a method based on cloning techniques to migrate virtual networks across the physical infrastructure transparently and without affecting the virtual network. In order to assess the resilience of virtual networks to physical network failures, while obtaining the optimal solution for the migration of virtual networks in case of imminent failure of physical resources, the mathematical formulation is extended to minimize the number of migrated nodes and the relocation of virtual links. In comparison with our optimization proposals, we found that existing heuristics for mapping virtual networks perform poorly. We also found that it is possible to minimize energy consumption without penalizing allocation efficiency. By applying re-optimization to the virtual networks, it was shown that it is possible to obtain more free resources and to keep the physical resources better balanced. Finally, it was shown that virtual networks are quite resilient to failures in the physical network.
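A hedged sketch of a greedy virtual-network node-mapping heuristic in the spirit described above (the thesis's actual algorithm and formulations are not reproduced here): each virtual node is placed on a physical node with enough spare CPU, preferring nodes that are already active so that fewer machines need to be powered on.

```python
# Hypothetical data model: virtual_nodes maps vnode -> CPU demand;
# physical maps pnode -> {"free": spare capacity, "active": powered on}.
def map_virtual_network(virtual_nodes, physical):
    """Greedily embed virtual nodes, or return None if the request must be rejected."""
    mapping = {}
    # Place the largest demands first, a common packing heuristic.
    for vnode, demand in sorted(virtual_nodes.items(), key=lambda kv: -kv[1]):
        candidates = [p for p, st in physical.items() if st["free"] >= demand]
        if not candidates:
            return None  # no physical node can host this demand
        # Prefer already-active nodes (energy), then tightest fit (fragmentation).
        best = min(candidates, key=lambda p: (not physical[p]["active"],
                                              physical[p]["free"] - demand))
        physical[best]["free"] -= demand
        physical[best]["active"] = True
        mapping[vnode] = best
    return mapping

substrate = {"p1": {"free": 10, "active": True},
             "p2": {"free": 10, "active": False}}
result = map_virtual_network({"a": 6, "b": 5}, substrate)
```

The two-key preference (active first, tightest fit second) is a toy stand-in for the energy and fragmentation objectives that the thesis formulates as optimization problems.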

Relevance: 80.00%

Abstract:

The law regulating the availability of abortion is problematic both legally and morally. It is dogmatic in its requirements of women and doctors and ignores would-be fathers. In practice, its application is liberal: with s1(1)(a) of the Abortion Act 1967 treated as a ‘catch-all’ ground, it allows abortion on demand. Yet this is not reflected in the ‘law’. Against this outdated legislation I propose a model of autonomy which seeks to tether our moral concerns to a new legal approach to abortion. I do so by maintaining that a legal conception of autonomy is derivable from the categorical imperative resulting from Gewirth’s argument to the Principle of Generic Consistency: act in accordance with the generic rights of your recipients as well as of yourself. This model of Gewirthian Rational Autonomy, I suggest, provides a guide for both public and private notions of autonomy and for how our autonomous interests can be balanced across social structures in order to legitimately empower choice. I claim, ultimately, that the relevant rights in the context of abortion are derivable from this model.

Relevance: 80.00%

Abstract:

This piece is a short rejoinder to César Bolaño’s paper “The Political Economy of the Internet” and related articles (e.g., Comor, Foley, Huws, Reveley, Rigi and Prey, Robinson) that center on the relevance of Marx’s labor theory of value for understanding social media. I argue that Dallas Smythe’s assessment of advertising was made to distinguish his approach from that of Baran and Sweezy. Smythe developed the idea of capital’s exploitation of the audience at a time when both feminist and anti-imperialist Marxists challenged the orthodox idea that only white factory workers are exploited. The crucial question is how to conceptualize productive labor. This is a theoretical, normative, and political question. A mathematical example shows the importance of the “crowdsourcing” of value production on Facebook. I also point out parallels between the contemporary debate and the Soviet question of who is a productive or unproductive worker in the Material Product System.

Relevance: 80.00%

Abstract:

Dissertation presented to the Instituto Superior de Contabilidade for the degree of Master in Auditing. Supervisor: Doutora Alcina Dias.

Relevance: 80.00%

Abstract:

Project for obtaining the degree of Master in Informatics and Computer Engineering.

Relevance: 80.00%

Abstract:

This paper describes the hardware implementation of a high-rate MIMO receiver in an FPGA for three modulations, namely BPSK, QPSK and 16-QAM, based on the Alamouti scheme. The implementation with 16-QAM achieves more than 1.6 Gbps using 66% of the resources of a medium-sized Virtex-4 FPGA. These results indicate that the Alamouti scheme is a good design option for the hardware implementation of a high-rate MIMO receiver. Also, using an FPGA, the modulation can be dynamically changed on demand.
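The 2x1 Alamouti combining that such a receiver implements in hardware can be sketched numerically. The channel gains and symbols below are arbitrary examples (not taken from the paper); with noise omitted, the combiner recovers both transmitted symbols exactly.

```python
# Standard 2x1 Alamouti scheme: two transmit antennas, one receive antenna,
# two symbol periods. This is a numeric sketch of the combining step only.
def alamouti_decode(r1, r2, h1, h2):
    """r1, r2: samples received over two symbol periods; h1, h2: channel gains."""
    s1_hat = h1.conjugate() * r1 + h2 * r2.conjugate()
    s2_hat = h2.conjugate() * r1 - h1 * r2.conjugate()
    gain = abs(h1) ** 2 + abs(h2) ** 2   # combining gain (|h1|^2 + |h2|^2)
    return s1_hat / gain, s2_hat / gain

h1, h2 = 0.8 + 0.3j, -0.2 + 0.9j         # example flat-fading channel gains
s1, s2 = 1 + 1j, -1 + 1j                 # two transmitted symbols (e.g. QPSK points)

# Alamouti transmission: period 1 sends (s1, s2), period 2 sends (-s2*, s1*).
r1 = h1 * s1 + h2 * s2
r2 = -h1 * s2.conjugate() + h2 * s1.conjugate()

d1, d2 = alamouti_decode(r1, r2, h1, h2)
```

Because the combiner reduces to a handful of complex multiply-accumulates per symbol pair, it maps naturally onto FPGA DSP resources, which is consistent with the high throughput reported above.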

Relevance: 80.00%

Abstract:

The need for better adaptation of networks to transported flows has led to research on new approaches such as content-aware networks and network-aware applications. In parallel, recent developments in multimedia and content-oriented services and applications such as IPTV, video streaming, video on demand, and Internet TV have reinforced interest in multicast technologies. IP multicast has not been widely deployed due to inter-domain and QoS support problems; therefore, alternative solutions have been investigated. This article proposes a management-driven hybrid multicast solution that is multi-domain and media-oriented, and that combines overlay multicast, IP multicast, and P2P. The architecture is developed in a content-aware network and network-aware application environment, based on light network virtualization. The multicast trees can be seen as parallel virtual content-aware networks, spanning one or multiple IP domains, customized to the type of content to be transported while fulfilling the quality of service requirements of the service provider.

Relevance: 80.00%

Abstract:

This work considered the possibility of incorporating remote services, normally associated with web services and cloud computing, into a local solution that centralizes the various services in a single system and allows its users to consume and configure them, both from the local network and remotely over the Internet. This would make it possible to combine access from any location with an Internet connection, characteristic of clouds, with the simplicity of concentrating in a single system several services that are normally offered by distinct entities, while still giving users control over their configuration. To validate that this concept is viable, practical and functional, two components were implemented: a client that runs on users’ devices and provides the interface for consuming the available services, and a server that hosts and delivers those services to the clients. These services include contact lists, instant messaging, chat rooms, file transfer, voice and video calls and conferences, remote folders, synchronized folders, backups, shared folders, VoD (Video-on-Demand) and AoD (Audio-on-Demand). The client and the server were developed with the Qt framework, which uses the C++ programming language and its set of libraries for building cross-platform applications. For communication between clients and the server, the XMPP (Extensible Messaging and Presence Protocol) protocol was used, through the qxmpp library and the ejabberd XMPP server. Because XMPP has hundreds of currently active extensions providing functionality such as chat rooms, file transfers and even multimedia session establishment, its flexibility also allowed the creation of the custom extensions needed for some of the intended features.
The ffmpeg framework was also used on the server to support some multimedia features. After implementing the client for Windows and Linux and the server on Linux, a set of functional tests was carried out to verify that the features and their mechanisms work correctly. Where the analysis of performance and resource consumption was important, performance and load tests were also performed.

Relevance: 80.00%

Abstract:

This research is situated in the field of the epistemology of literary studies, calling for an interdisciplinary approach that points to the need for an African and, at the same time, worldwide literary comparatism. The key concept we work with is that of disciplinarization, given its explanatory potential for understanding the process that leads to the integration of African Literary Studies into the current disciplinary system. By disciplinarization we mean the process of definition that consists in the demarcation of a given discipline, driven by endogenous and exogenous dynamics, during which it moves from a pre-disciplinary to a disciplinary phase, admitting a compatibility between the epistemological and methodological foundations of the production and transmission of knowledge and, on the other hand, the consecration of the discipline's institutional status as an object of study. Disciplinary professionalization is a consequence of this process and of the formation of communities of epistemic agents who, knowing in depth the history and the reference universes of the discipline, are able to apply the most suitable methodologies in research and teaching. To understand the epistemological foundations of African Literary Studies, it is important to reflect on the moment from which they were constituted as a disciplinary field in the history of the production of knowledge about the African continent. This work also aims to assess the disciplinary status of Angolan Literature, in an exercise that seeks to justify the determinations of disciplinary epistemology, operationalizing the senses in which the concept of discipline can be analysed.
The attribution of this status thus presupposes the command of a theoretical apparatus that involves describing the types of knowledge conveyed through the transmission processes that characterize school and academic disciplines.

Relevance: 80.00%

Abstract:

Studies of politics and of its impact and circulation within early modern society have usually confined its reach to a small number of people in the circles closest to the great courtly actors, set against the traditional “indifference” of the common people. However, thanks to the renewal of the historiography of the political and to its interest in cultural and social areas outside its traditional scope, recent decades have uncovered an interesting field of political experience that can serve as a vantage point for understanding how information about political events also spread among “ordinary people”. In our view, this is a suitable moment to evaluate the development of a historiographical phenomenon that has lacked a certain systematicity, which is why we propose this critical and analytical assessment of Iberian society under the Ancien Régime.