942 results for Optimistic replication
Abstract:
Besnoitia besnoiti and Toxoplasma gondii are two closely related parasites that interact with the host cell microtubule cytoskeleton during host cell invasion. Here we studied the relationship between the ability of these parasites to invade and their recruitment of the host cell centrosome and Golgi apparatus. We observed that T. gondii recruits the host cell centrosome towards the parasitophorous vacuole (PV), whereas B. besnoiti does not. Notably, both parasites recruit the host Golgi apparatus to the PV, but its organization is affected in different ways. We also investigated the impact of depleting and over-expressing the host centrosomal protein TBCCD1, involved in centrosome positioning and Golgi apparatus integrity, on the ability of these parasites to invade and replicate. The replication rate of T. gondii decreases in cells over-expressing TBCCD1 but not in TBCCD1-depleted cells, while for B. besnoiti no differences were found. However, B. besnoiti promotes a reorganization of the Golgi ribbon previously fragmented by TBCCD1 depletion. These results suggest that successful establishment of PVs in the host cell requires modulation of the Golgi apparatus, probably involving modifications in microtubule cytoskeleton organization and dynamics. These differences in how T. gondii and B. besnoiti interact with their host cells may indicate different evolutionary paths.
Abstract:
In this paper, we present some of the fault tolerance management mechanisms being implemented in the Multi-μ architecture, namely its support for replica non-determinism. In this architecture, fault tolerance is achieved by node active replication, with software-based replica management and transparent fault tolerance algorithms. A software layer implemented between the application and the real-time kernel, the Fault Tolerance Manager (FTManager), is responsible for the transparent incorporation of the fault tolerance mechanisms. The active replication model can be implemented either by imposing replica determinism or by keeping replica consistency at critical points, by means of interactive agreement mechanisms. One of the goals of the Multi-μ architecture is to identify such critical points, relieving the underlying system from performing the interactive agreement at every Ada dispatching point.
Abstract:
This paper presents an architecture (Multi-μ) being implemented to study and develop software-based fault tolerance mechanisms for real-time systems, using the Ada language (Ada 95) and Commercial Off-The-Shelf (COTS) components. Several issues regarding fault tolerance are presented, and mechanisms to achieve fault tolerance through software active replication in Ada 95 are discussed. The Multi-μ architecture, based on a specifically proposed Fault Tolerance Manager (FTManager), is then described. Finally, some considerations are made about the work being done and essential future developments.
Abstract:
In the past few years, tabling has emerged as a powerful logic programming model. The integration of concurrency features into the implementation of tabling systems is demanded by the need to use recently developed tabling applications within distributed systems, where a process has to respond concurrently to several requests. Support for sharing tables among the concurrent threads of a tabling process is a desirable feature, allowing one of tabling's virtues, the re-use of computations, to extend to other threads, and allowing efficient usage of the available memory. However, the incremental completion of tables that are evaluated concurrently is not a trivial problem. In this dissertation we describe the integration of concurrency mechanisms, by way of multi-threading, in a state-of-the-art tabling and Prolog system, XSB. We begin by reviewing the main concepts of a formal description of tabled computations, called SLG resolution, and of the implementation of tabling under the SLG-WAM, the abstract machine supported by XSB. We describe the different scheduling strategies provided by XSB and introduce some new properties of local scheduling, a scheduling strategy for SLG resolution. We proceed to describe our implementation work, first presenting the process of integrating multi-threading in a Prolog system supporting tabling, without addressing the problem of shared tables, and discussing the trade-offs and implementation decisions involved. We then describe an optimistic algorithm for the concurrent sharing of completed tables, Shared Completed Tables, which allows tables to be shared without incurring deadlocks under local scheduling. This method relies on the execution properties of local scheduling and includes full support for negation. We provide a theoretical framework and discuss the implementation's correctness and complexity.
After that, we describe a method for the sharing of tables among threads that allows parallelism in the computation of inter-dependent subgoals, which we name Concurrent Completion, and we informally argue for its correctness. We give detailed performance measurements of the multi-threaded XSB system over a variety of machines and operating systems, for both the Shared Completed Tables and the Concurrent Completion implementations, focusing on the overhead over the sequential engine and on the scalability of the system. We finish with a comparison of XSB with other multi-threaded Prolog systems, compare our approach to concurrent tabling with parallel and distributed methods for the evaluation of tabling, and identify future research directions.
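The optimistic idea behind Shared Completed Tables — a thread reuses a table only once it is complete, otherwise computes privately and publishes on completion, so no thread ever blocks on another's in-progress table — can be illustrated with a minimal sketch. Everything here (names, the toy tabled predicate, the publishing API) is illustrative, not XSB's actual machinery:

```python
import threading

completed_tables = {}            # subgoal -> answer table, visible to all threads
publish_lock = threading.Lock()  # held only while publishing, never while computing

def tabled_eval(subgoal, compute):
    """Optimistically evaluate `subgoal`: reuse a completed table if one
    exists, otherwise compute privately and publish; first writer wins."""
    table = completed_tables.get(subgoal)   # optimistic read, no lock taken
    if table is not None:
        return table
    answers = compute(subgoal)              # private computation: deadlock-free
    with publish_lock:
        # Another thread may have completed first; keep its table and
        # discard ours, so all callers see one canonical table.
        return completed_tables.setdefault(subgoal, answers)

def path_lengths(n):
    # Toy stand-in for a tabled predicate: reachable distances 0..n.
    return frozenset(range(n + 1))

results = []
threads = [threading.Thread(target=lambda: results.append(tabled_eval(5, path_lengths)))
           for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(results[0])
```

Duplicated work is possible when two threads race on the same subgoal, but, as in the optimistic scheme described above, correctness does not depend on preventing the race, only on the atomic first-writer-wins publication.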
Abstract:
The National Cancer Institute (NCI) method allows the distributions of usual intake of nutrients and foods to be estimated, and it can be used in complex surveys. However, the user must perform additional calculations, such as balanced repeated replication (BRR), in order to obtain standard errors and confidence intervals for the percentiles and the mean of the usual intake distribution. The objective is to highlight adaptations of the NCI method using data from the National Dietary Survey. The application of the NCI method is exemplified by analyzing total energy (kcal) and fruit (g) intake, comparing estimates of the mean and standard deviation based on the complex design of the Brazilian survey with those assuming a simple random sample. Although point estimates of the means were similar, standard error estimates that accounted for the complex design were up to 60% larger than those assuming a simple random sample. Thus, for valid estimates of food and energy intake for the population, all of the sampling characteristics of the survey should be taken into account: when these characteristics are neglected, statistical analysis may produce underestimated standard errors that compromise the results and the conclusions of the survey.
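The BRR calculation mentioned above re-estimates the statistic on balanced half-samples defined by a Hadamard matrix and measures the spread of those replicate estimates. A minimal sketch follows; the data, the 3-strata/2-PSU layout, and the 4-replicate design are toy illustrations, not the actual survey design:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy survey: 3 strata with 2 primary sampling units (PSUs) each — the
# classic BRR setting — and one mean energy intake (kcal) per PSU.
intake = rng.normal(2000, 300, size=(3, 2))

# Rows of a 4x4 Hadamard matrix define 4 balanced half-samples;
# column s+1 gives the sign pattern for stratum s.
H = np.array([[1,  1,  1,  1],
              [1, -1,  1, -1],
              [1,  1, -1, -1],
              [1, -1, -1,  1]])
signs = H[:, 1:]                 # shape (4 replicates, 3 strata)

full_mean = intake.mean()
replicate_means = np.array([
    # Keep PSU 0 where the sign is +1, PSU 1 otherwise (its weight doubles).
    np.where(signs[r] == 1, intake[:, 0], intake[:, 1]).mean()
    for r in range(4)
])

# BRR variance: mean squared deviation of the half-sample estimates
# around the full-sample estimate.
brr_se = np.sqrt(np.mean((replicate_means - full_mean) ** 2))
print(round(full_mean, 1), round(brr_se, 1))
```

Because the design is balanced, the replicate estimates average back to the full-sample estimate; the same replicate machinery applies to percentiles of the usual intake distribution, which is where the NCI method needs it.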
Abstract:
This paper evaluates the capacity to produce concrete with a pre-established performance (in terms of mechanical strength) incorporating recycled concrete aggregates (RCA) from different sources. To this purpose, rejected products from the precasting industry and concrete produced in the laboratory were used. The replication capacity was appraised for three strength ranges: 15-25 MPa, 35-45 MPa and 65-75 MPa. The mixes produced attempted to replicate the strength of the source concrete (SC) of the RCA. Only total (100%) replacement of coarse natural aggregates (CNA) by coarse recycled concrete aggregates (CRCA) was tested. The results show that, in both mechanical and durability terms, there were no significant differences between aggregates from controlled sources and those from precast rejects for the highest levels of target strength. Furthermore, the performance losses resulting from the incorporation of RCA are substantially reduced when medium- or high-strength SCs are used. (C) 2014 Elsevier Ltd. All rights reserved.
Abstract:
Most courses in science and technology areas, where lab work is a fundamental part of the learning process, were until recently unavailable for distance teaching. This reality is changing with the dissemination of remote laboratories. Supported by resources based on new information and communication technologies, it is now possible to remotely control a wide variety of real laboratories. However, most of them are designed specifically for this purpose, are inflexible, and resemble the real ones only in functionality. In this paper, an alternative remote lab infrastructure devoted to the study of electronics is presented. Its main characteristics are, from a teacher's perspective, reusability and simplicity of use and, from the students' point of view, an exact replication of the real lab, enabling them to complement or finish at home the work started in class. The remote laboratory is integrated into the Learning Management System in use at the school and may therefore be combined with other web experiments and e-learning strategies, while safeguarding access security issues.
Abstract:
OBJECTIVE: To estimate the budget impact of incorporating positron emission tomography (PET) in the mediastinal and distant staging of non-small cell lung cancer. METHODS: The estimates were calculated by the epidemiological method for the years 2014 to 2018. Nationwide incidence data were used; data on the distribution of the disease's prevalence and on the accuracy of the technologies came from the literature; data on the costs involved were taken from a micro-costing study and from the Brazilian Unified Health System (SUS) database. Two strategies for using PET were analyzed: offering it to all newly diagnosed patients, and restricting it to patients with negative results in previous computed tomography (CT) exams. Univariate and extreme-scenario sensitivity analyses were conducted to evaluate the influence of sources of uncertainty in the parameters used. RESULTS: The incorporation of PET-CT in SUS would require additional resources of 158.1 BRL (98.2 USD) million for the restricted offer and 202.7 BRL (125.9 USD) million for the inclusive offer over five years, a difference of 44.6 BRL (27.7 USD) million between the two strategies in that period. In absolute terms, the total budget impact of its incorporation in SUS over five years would be 555 BRL (345 USD) million and 600 BRL (372.8 USD) million, respectively. The cost of the PET-CT procedure was the most influential parameter in the results. In the most optimistic scenario, the additional budget impact would be reduced to 86.9 BRL (54 USD) million and 103.8 BRL (64.5 USD) million, considering PET-CT for negative CT and PET-CT for all, respectively. CONCLUSIONS: The incorporation of PET in the clinical staging of non-small cell lung cancer seems financially feasible considering the high budget of the Brazilian Ministry of Health. The potential reduction in the number of unnecessary surgeries may allow the available resources to be allocated more efficiently.
Abstract:
Dynamically reconfigurable SRAM-based field-programmable gate arrays (FPGAs) enable the implementation of reconfigurable computing systems where several applications may run simultaneously, sharing the available resources according to their own immediate functional requirements. To exclude malfunctioning due to faulty elements, the reliability of all FPGA resources must be guaranteed. Since resource allocation takes place asynchronously, an online structural test scheme is the only way of ensuring reliable system operation. On the other hand, this test scheme should not disturb the operation of the circuit; otherwise, availability would be compromised. System performance is also influenced by the efficiency of the management strategies, which must be able to dynamically allocate enough resources when requested by each application. As those resources are allocated and later released, many small free resource blocks are created and left unused due to performance and routing restrictions. To avoid wasting logic resources, the FPGA logic space must be defragmented regularly. This paper presents a non-intrusive active replication procedure that supports the proposed test methodology and the implementation of defragmentation strategies, assuring both the availability of resources and their perfect working condition, without disturbing system operation.
Abstract:
The dynamics of catalytic networks have been widely studied over the last decades because of their implications in several fields, such as prebiotic evolution, virology, neural networks, immunology and ecology. One of the most studied mathematical frameworks for catalytic networks was initially formulated in the context of prebiotic evolution, by means of the hypercycle theory. The hypercycle is a set of self-replicating species able to catalyze other replicator species within a cyclic architecture. Hypercyclic organization might arise from a quasispecies as a way to increase the information content beyond the so-called error threshold. The catalytic coupling between replicators makes all the species behave like a single, coherent evolutionary multimolecular unit. The inherent nonlinearities of catalytic interactions are responsible for the emergence of several types of dynamics, among them chaos. In this article we begin with a brief review of the hypercycle theory, focusing on its evolutionary implications as well as on the different dynamics associated with different types of small catalytic networks. Then we study the properties of chaotic hypercycles with error-prone replication using symbolic dynamics, characterizing, by means of the theory of topological Markov chains, the topological entropy and the periods of the orbits of unimodal-like iterated maps obtained from the strange attractor. We focus our study on some key parameters responsible for the structure of the catalytic network: mutation rates, and autocatalytic and cross-catalytic interactions.
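The symbolic-dynamics entropy characterization can be illustrated numerically: symbolize an orbit of a unimodal map and estimate topological entropy from the growth rate of distinct symbolic words. Here the logistic map stands in for the unimodal return maps obtained from the attractor (purely an assumption for illustration):

```python
import math

def logistic(x, r=4.0):
    # Unimodal map used as a stand-in for the 1-D return maps of the model.
    return r * x * (1.0 - x)

def symbolic_orbit(x0, steps, r=4.0):
    """Itinerary of x0: 'L' left of the critical point 0.5, 'R' right of it."""
    seq, x = [], x0
    for _ in range(steps):
        seq.append('L' if x < 0.5 else 'R')
        x = logistic(x, r)
    return ''.join(seq)

def entropy_estimate(seq, n):
    """Topological entropy estimate: log of the number of distinct
    admissible length-n words in the itinerary, divided by n."""
    words = {seq[i:i + n] for i in range(len(seq) - n + 1)}
    return math.log(len(words)) / n

seq = symbolic_orbit(0.1234, 200000)
for n in (4, 8, 12):
    print(n, round(entropy_estimate(seq, n), 3))
# For r = 4 the map is conjugate to the full 2-shift, whose topological
# entropy is log 2 ≈ 0.693; at other parameters fewer words are admissible.
```

A Markov-chain refinement of this estimate would build the transition matrix of the admissible words and take the log of its spectral radius, which is the quantity computed exactly by the topological Markov chain theory cited above.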
Abstract:
The field of computer simulation has grown rapidly since its appearance and is currently one of the most widely used management science and operational research techniques. Its principle is based on replicating the operation of processes or systems over periods of time, making it an indispensable methodology for solving a wide variety of real-world problems, regardless of their complexity. Among its countless areas of application, in the most diverse fields, the most prominent is its use in production systems, where the range of available applications is very broad. It has been applied to solve problems in production systems because it allows companies to adjust and plan their operations and systems quickly, effectively and deliberately, enabling rapid adaptation to the constantly changing needs of the global economy. Simulation applications and packages have followed technological trends, with object-oriented technologies notably being used in their development. This study was based, in a first phase, on gathering information supporting the concepts of modelling and simulation, as well as their application to real-time production systems. It then focused on the development of a prototype simulation application for real-time manufacturing environments. The tool was developed with possible pedagogical purposes and academic use in mind, being capable of simulating a model of a production system and also featuring animation. Without excluding the possibility of integrating other modules, or even other platforms, particular care was taken to implement it using object-oriented development methodologies.
Abstract:
Dynamically reconfigurable systems have benefited from a new class of FPGAs recently introduced into the market, which allow partial and dynamic reconfiguration at run-time, enabling multiple independent functions from different applications to share the same device, swapping resources as needed. When the sequence of tasks to be performed is not predictable, resource allocation decisions have to be made on-line, fragmenting the FPGA logic space. A rearrangement may be necessary to get enough contiguous space to efficiently implement incoming functions, to avoid spreading their components and, as a result, degrading their performance. This paper presents a novel active replication mechanism for configurable logic blocks (CLBs), able to implement on-line rearrangements, defragmenting the available FPGA resources without disturbing those functions that are currently running.
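The rearrangement idea — replicate a running function's CLBs at a new position, switch over, then release the old area so that free fragments merge — can be sketched at the level of a one-dimensional allocation map. The data structures below are a deliberate simplification for illustration, not an actual FPGA toolflow:

```python
# Toy 1-D model of an FPGA logic space: each function occupies a
# contiguous range of columns; defragmentation slides functions left
# ("replicate at the new position, then release the old one") so that
# all free columns merge into one contiguous region at the right end.

FPGA_WIDTH = 32

def defragment(allocations):
    """allocations: dict name -> (start, size). Returns the compacted
    layout, the relocation moves (name, old_start, new_start) to perform,
    and the size of the resulting contiguous free region."""
    moves, cursor, new_layout = [], 0, {}
    for name, (start, size) in sorted(allocations.items(), key=lambda kv: kv[1][0]):
        if start != cursor:
            moves.append((name, start, cursor))  # replicate block, release old copy
        new_layout[name] = (cursor, size)
        cursor += size
    return new_layout, moves, FPGA_WIDTH - cursor

# Fragmented layout: gaps between functions A, B and C.
compacted, moves, free = defragment({"A": (0, 6), "B": (10, 4), "C": (20, 8)})
print(compacted)   # {'A': (0, 6), 'B': (6, 4), 'C': (10, 8)}
print(free)        # 14 contiguous free columns
```

In the real mechanism each move is an online replicate-then-release of configurable logic blocks, performed without halting the relocated function; the sketch only captures the bookkeeping that decides which blocks move where.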
Abstract:
Germfree (GF) and conventional (CV) mice were infected intraperitoneally with GF cercariae of Schistosoma mansoni and kept for six weeks. Twenty-four hours before sacrifice, they were injected with [³H]-thymidine. Schistosoma worms, harvested after perfusion of the portal system, were counted, as were eggs from the liver and intestines. The liver was also used for DNA, protein and collagen determinations. [³H]-thymidine incorporation and collagen determinations were used to establish indices given by the difference between their contents in infected and control animals, expressed per thousand eggs in the liver. The recovery of worms in GF mice was around twice that in CV mice, and the total number of eggs was higher in the liver of GF animals. No hypertrophy of liver cells was observed from the protein/DNA ratio, but [³H]-thymidine incorporation into DNA was higher than in controls in both GF and CV infected animals. The [³H]-thymidine and collagen indices were lower in GF animals, indicating more discrete cellular replication and a smaller collagen content relative to the number of eggs present in the livers of these mice. It was concluded that the disease seems to be less severe in GF animals.
Abstract:
New technologies, and the Internet in particular, have created new ways of transmitting information to the public and changed the way people communicate. This opened the door to new forms of advertising and to the appearance of a new genre of games, advergames, taking advantage of the fact that online games already have millions of players worldwide, a number in constant growth. The concept is relatively recent but shows very positive results, with many specialists arguing that advergames are the future of advertising, largely because of their lower costs and longer product exposure time compared with more traditional advertising methods. Serious games and, in particular, advergames are the main topic of this thesis, with a detailed analysis of their advantages and disadvantages, origins and future development opportunities. Some cases of successful advergames are also analyzed. The practical component of the thesis aims to create an advergame whose main purpose is to help new ISEP students in their integration process. The game is a two-dimensional maze in which the objectives are to capture certain objects and deliver them to pre-defined destination points, always within a time limit and while avoiding other dangers and obstacles. The results obtained with this game show that the transmission of information is quite effective with its target audience, due in part to the more dynamic and interactive approach an advergame offers its users. The simplicity of the interface and the ease of use provided by the game allow broad exposure of the intended message, increasing the player's motivation to stay in contact with it. This presents very optimistic prospects for the future use of advergames in the university environment.
Abstract:
Enterprise and Work Innovation Studies, 5