912 results for Real Time Recognition Systems
Abstract:
Recent embedded processor architectures containing multiple heterogeneous cores and non-coherent caches have renewed attention to the use of Software Transactional Memory (STM) as a building block for developing parallel applications. STM promises to ease concurrent and parallel software development, but relies on the possibility of aborting conflicting transactions to maintain data consistency, which in turn affects the execution time of tasks carrying transactions. Because of this, the timing behaviour of the task set may not be predictable, so it is crucial to limit the execution-time overheads resulting from aborts. In this paper we formalise a FIFO-based algorithm to order the sequence of commits of concurrent transactions. Then, we propose and evaluate two non-preemptive scheduling strategies and one SRP-based fully-preemptive scheduling strategy, in order to avoid transaction starvation.
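To illustrate the kind of mechanism this abstract describes, the following is a minimal Python sketch of FIFO-ordered commits; the names and structure are illustrative only and are not the paper's formalised algorithm. Each transaction registers on start, and a transaction may only commit when it reaches the head of the queue, so commits occur in arrival order and no transaction starves.

    import threading
    from collections import deque

    class FifoCommitOrder:
        # Illustrative FIFO ordering of transaction commits; not the paper's algorithm.
        def __init__(self):
            self._cond = threading.Condition()
            self._queue = deque()              # transaction ids in arrival (FIFO) order

        def begin(self, tx_id):
            with self._cond:
                self._queue.append(tx_id)      # record arrival order at transaction start

        def commit(self, tx_id, apply_writes):
            with self._cond:
                # Wait until this transaction reaches the head of the queue, so commits
                # are serialised in FIFO order and earlier transactions cannot starve.
                while self._queue[0] != tx_id:
                    self._cond.wait()
                apply_writes()                 # make the transaction's write set visible
                self._queue.popleft()
                self._cond.notify_all()        # wake transactions waiting for their turn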
Abstract:
Presented at the Work-in-Progress Session, IEEE Real-Time Systems Symposium (RTSS 2015), 1-3 December 2015, San Antonio, U.S.A.
Abstract:
Nowadays, real-time systems are growing in importance and complexity. With the move from uniprocessor to multiprocessor environments, the work done for the former is not fully applicable to the latter, since the level of complexity differs, mainly because of the existence of multiple processors in the system. It was realised early on that the complexity of the problem does not grow linearly with the addition of processors. In fact, this complexity stands as a barrier to scientific progress in an area that, for now, remains largely unexplored, and this is seen above all in task scheduling. The move to this new environment, whether for real-time systems or not, promises the opportunity to perform work that would never be possible in the uniprocessor case, thereby offering new performance guarantees, lower monetary costs and lower energy consumption. Energy was recognised early on as perhaps the greatest barrier to the development of new uniprocessor chips: as new processors reached the market offering higher performance, they also revealed a heat-generation limit that forced the emergence of the multiprocessor domain. In the future, the number of processors on a chip is expected to increase and, naturally, new techniques must be developed to exploit their inherent advantages; scheduling algorithms are no exception. Over the years, different categories of multiprocessor scheduling algorithms have been developed to address this problem, most notably global, partitioned and semi-partitioned algorithms. The global approach assumes a global queue that is accessible by all available processors. This makes task migration possible, that is, the execution of a task can be stopped and resumed on a different processor. At any given instant, the m highest-priority tasks in the set are selected for execution. This class promises high utilisation bounds, at the cost of a high number of task preemptions/migrations. In contrast, partitioned algorithms place tasks into partitions and assign each partition to one of the available processors, that is, each processor receives one partition. For this reason task migration is not possible, so the utilisation bound is not as high as in the previous case, but the number of task preemptions decreases significantly. The semi-partitioned scheme is a hybrid of the previous two: some tasks are split so as to be executed by a group of processors, while others are assigned to a single processor. This yields a solution that distributes the work to be performed in a more efficient and balanced way. Unfortunately, in all these cases there is a gap between theory and practice, because assumptions end up being made that do not hold in real life. To address this problem, it is necessary to implement these scheduling algorithms in real operating systems and verify their applicability so that, where they do not hold, the necessary changes can be made, both at the theoretical and at the practical level.
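As a hedged illustration of the global approach summarised above (plain Python, not tied to any particular algorithm from the thesis): at each scheduling point, the m highest-priority ready jobs are drawn from a single shared queue and dispatched to the m processors, and any job may later resume on a different processor.

    import heapq

    def global_pick(ready_jobs, m):
        # ready_jobs: list of (priority, job_id), where a smaller number means higher priority.
        # Returns the job ids chosen to run now; the remaining jobs stay in the single
        # global queue and may later migrate to any processor.
        return [job_id for _, job_id in heapq.nsmallest(m, ready_jobs)]

    # Example: 3 processors and 5 ready jobs -> the 3 highest-priority jobs execute.
    print(global_pick([(5, "t1"), (1, "t2"), (3, "t3"), (2, "t4"), (9, "t5")], m=3))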
Abstract:
Real-time collaborative editing systems are common nowadays, and their advantages are widely recognized. Examples of such systems include Google Docs and ShareLaTeX, among others. This thesis aims to adopt this paradigm in a software development environment. The OutSystems visual language lends itself very well to this kind of collaboration, since the visual code enables a natural flow of knowledge between developers regarding the developed code. Furthermore, communication and coordination are simplified. This proposal explores the field of collaboration on a very structured and rigid model, where collaboration is done through the copy-modify-merge paradigm, in which a developer obtains a private copy of the shared repository, modifies it in isolation and later uploads the changes to be merged with modifications concurrently produced by other developers. To this end, we designed and implemented an extension to the OutSystems Platform in order to enable real-time collaborative editing. The solution guarantees consistency among the artefacts distributed across several developers working on the same project. We believe that it is possible to achieve much more intense collaboration over the same models with a low negative impact on the individual productivity of each developer.
Abstract:
The observational method in tunnel engineering allows the actual conditions of the ground to be evaluated in real time and measures to be taken if its behavior deviates considerably from predictions. However, it lacks a consistent and structured methodology for using the monitoring data to adapt the support system in real time. Limit criteria above which adaptation is required are not defined, and complex inverse analysis procedures (Rechea et al. 2008, Levasseur et al. 2010, Zentar et al. 2001, Lecampion et al. 2002, Finno and Calvello 2005, Goh 1999, Cui and Pan 2012, Deng et al. 2010, Mathew and Lehane 2013, Sharifzadeh et al. 2012, 2013) may be needed to consistently analyze the problem. In this paper a methodology for the real-time adaptation of the support systems during tunneling is presented. In a first step, limit criteria for displacements and stresses are proposed. The methodology uses charts constructed during the project stage, based on parametric calculations, to assist in the process; when these charts are not available, since it is not possible to predict every possible scenario, inverse analysis calculations are carried out. The methodology is applied to the “Bois de Peu” tunnel, which is composed of two tubes over 500 m long. High uncertainty levels existed concerning the heterogeneity of the soil and, consequently, the geomechanical design parameters. The methodology was applied in four sections and the results focus on two of them. It is shown that the methodology has the potential to be applied in real cases, contributing to a consistent approach to real-time adaptation of the support system, and it highlights the importance of good-quality, specific monitoring data to improve the inverse analysis procedure.
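As a rough sketch of how such limit criteria could be checked against monitoring data, the following Python snippet uses entirely hypothetical thresholds and action labels; these are not the values or charts proposed in the paper.

    def check_section(measured_displacement_mm, alert_limit_mm, alarm_limit_mm):
        # Hypothetical limit-criteria check for one monitored tunnel section.
        if measured_displacement_mm >= alarm_limit_mm:
            # No parametric chart may cover this scenario, so an inverse analysis
            # would be run before adapting the support system.
            return "adapt support (inverse analysis if no parametric chart applies)"
        if measured_displacement_mm >= alert_limit_mm:
            return "increase monitoring frequency"
        return "behavior within predictions"

    print(check_section(measured_displacement_mm=32.0, alert_limit_mm=20.0, alarm_limit_mm=30.0))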
Abstract:
Hospitals have multiple data sources, such as embedded systems, monitors and sensors. The amount of available data is increasing, and the information is used not only to care for the patient but also to assist decision processes. Intelligent environments have been adopted in health care institutions due to their ability to provide useful information to health professionals, whether to help identify a prognosis or to understand the patient's condition. From this concept arises this intelligent system to track the patient's condition (e.g. critical events) in health care. The system has the great advantage of being adaptable to the environment and to user needs. It focuses on identifying critical events from streaming data (e.g. vital signs and ventilation), which is particularly valuable for understanding the patient's condition. This work aims to demonstrate the process of creating an intelligent system capable of operating in a real environment using streaming data provided by ventilators and vital-sign monitors. Its development is important to the physician because it becomes possible to cross multiple variables in real time, analyzing whether a value is critical and whether its variation has clinical importance.
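A minimal sketch of the kind of streaming check described above (Python); the variable, thresholds and window size are hypothetical stand-ins for clinician-defined limits, not the system's actual rules.

    from collections import deque

    class CriticalEventDetector:
        # Hypothetical detector for critical events in one vital-sign stream.
        def __init__(self, low, high, window=10, min_jump=15.0):
            self.low, self.high = low, high
            self.min_jump = min_jump
            self.recent = deque(maxlen=window)        # recent readings for trend analysis

        def push(self, value):
            events = []
            if not (self.low <= value <= self.high):
                events.append("out-of-range value")   # the value itself is critical
            if self.recent and abs(value - self.recent[-1]) >= self.min_jump:
                events.append("abrupt variation")     # the change may be clinically relevant
            self.recent.append(value)
            return events

    detector = CriticalEventDetector(low=60, high=100)  # e.g. heart rate in bpm
    for reading in [72, 75, 74, 120, 118]:
        print(reading, detector.push(reading))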
Abstract:
In type I diabetes mellitus, islet transplantation provides a moment-to-moment fine regulation of insulin. Success rates vary widely, however, necessitating suitable methods to monitor islet delivery, engraftment and survival. Here magnetic resonance-trackable magnetocapsules have been used simultaneously to immunoprotect pancreatic beta-cells and to monitor, non-invasively in real-time, hepatic delivery and engraftment by magnetic resonance imaging (MRI). Magnetocapsules were detected as single capsules with an altered magnetic resonance appearance on capsule rupture. Magnetocapsules were functional in vivo because mouse beta-cells restored normal glycemia in streptozotocin-induced diabetic mice and human islets induced sustained C-peptide levels in swine. In this large-animal model, magnetocapsules could be precisely targeted for infusion by using magnetic resonance fluoroscopy, whereas MRI facilitated monitoring of liver engraftment over time. These findings are directly applicable to ongoing improvements in islet cell transplantation for human diabetes, particularly because our magnetocapsules comprise clinically applicable materials.
Abstract:
Reinforcement learning (RL) is a very suitable technique for robot learning, as it can learn in unknown environments and with real-time computation. The main difficulties in adapting classic RL algorithms to robotic systems are the generalization problem and the correct observation of the Markovian state. This paper attempts to solve the generalization problem by proposing the semi-online neural-Q_learning algorithm (SONQL). The algorithm uses the classic Q_learning technique with two modifications. First, a neural network (NN) approximates the Q_function, allowing the use of continuous states and actions. Second, a database of the most representative learning samples accelerates and stabilizes the convergence. The term semi-online refers to the fact that the algorithm uses the current as well as past learning samples. Nevertheless, the algorithm is able to learn in real time while the robot is interacting with the environment. The paper shows simulated results with the "mountain-car" benchmark and also real results with an underwater robot in a target-following behavior.
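In the spirit of the described algorithm, here is a minimal Python/NumPy sketch of Q-learning with a function approximator and a database of past learning samples; a simple linear approximator stands in for the neural network, and none of the names or hyperparameters come from the paper.

    import random
    import numpy as np

    class SemiOnlineQLearner:
        # Sketch of Q-learning with a function approximator plus a sample database,
        # in the spirit of SONQL; a linear model stands in for the neural network.
        def __init__(self, n_features, n_actions, alpha=0.01, gamma=0.95, db_size=1000):
            self.w = np.zeros((n_actions, n_features))   # one weight vector per action
            self.alpha, self.gamma = alpha, gamma
            self.db = []                                 # current and past learning samples
            self.db_size = db_size

        def q(self, state):
            return self.w @ state                        # Q-values for every action

        def store(self, state, action, reward, next_state, done):
            self.db.append((state, action, reward, next_state, done))
            if len(self.db) > self.db_size:
                self.db.pop(0)                           # keep only the most recent samples

        def learn(self, batch=32):
            # Replaying stored samples accelerates and stabilizes convergence while
            # the robot keeps interacting with the environment.
            for s, a, r, s_next, done in random.sample(self.db, min(batch, len(self.db))):
                target = r if done else r + self.gamma * np.max(self.q(s_next))
                td_error = target - self.q(s)[a]
                self.w[a] += self.alpha * td_error * s   # gradient step toward the target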
Abstract:
This study aimed to quantify Toxoplasma gondii in tissue samples of serologically positive chickens using real-time polymerase chain reaction (PCR). Of 65 chickens evaluated, 28 were positive for T. gondii antibodies. Brain and heart samples were collected from 26 seropositive chickens and DNA was extracted using Trizol® and amplified using real-time PCR with SYBR® Green. Parasite DNA was detected in 24 of the 26 samples analyzed; the number of positive tissue samples and the parasite quantity did not differ between tissue types. The results confirmed the analytical sensitivity of parasite detection in chicken tissue samples and demonstrated the possibility of using other molecular systems for genotypic analysis.
Abstract:
Real-time PCR is a widely used tool for the diagnosis of many infectious diseases. However, little information exists about the influences of the different factors involved in PCR on the amplification efficiency. The aim of this study was to analyze the effect of boiling as the DNA preparation method on the efficiency of the amplification process of real-time PCR for the diagnosis of human brucellosis with serum samples. Serum samples from 10 brucellosis patients were analyzed by a SYBR green I LightCycler-based real-time PCR and by using boiling to obtain the DNA. DNA prepared by boiling lysis of the bacteria isolated from serum did not prevent the presence of inhibitors, such as immunoglobulin G (IgG), which were extracted with the template DNA. To identify and confirm the presence of IgG, serum was precipitated to separate and concentrate the IgG and was analyzed by sodium dodecyl sulfate-polyacrylamide gel electrophoresis and Western blotting. The use of serum volumes above 0.6 ml completely inhibited the amplification process. The inhibitory effect of IgG in serum samples was not concentration dependent, and it could be eliminated by diluting the samples 1/10 and 1/20 in water. Despite the lack of the complete elimination of the IgG from the template DNA, boiling does not require any special equipment and it provides a rapid, reproducible, and cost-effective method for the preparation of DNA from serum samples for the diagnosis of brucellosis.
Abstract:
This paper focuses on one of the methods for bandwidth allocation in an ATM network: the convolution approach. The convolution approach permits an accurate study of the system load in statistical terms by accumulated calculations, since probabilistic results of the bandwidth allocation can be obtained. Nevertheless, the convolution approach has a high cost in terms of calculation and storage requirements. This aspect makes real-time calculations difficult, so many authors do not consider this approach. With the aim of reducing this cost, we propose to use the multinomial distribution function: the enhanced convolution approach (ECA). This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements and makes a simple deconvolution process possible. The ECA is used in connection acceptance control, and some results are presented.
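As background for the approach whose cost the ECA aims to reduce, here is a small Python sketch of the plain convolution approach for on/off traffic classes; all parameters are illustrative, and the paper's multinomial-based ECA computes these probabilities more directly than this brute-force convolution.

    from math import comb

    def class_distribution(n_sources, p_on, peak_rate):
        # Bandwidth distribution of one class of n on/off sources: binomial on the
        # number of simultaneously active sources.
        return {k * peak_rate: comb(n_sources, k) * p_on**k * (1 - p_on)**(n_sources - k)
                for k in range(n_sources + 1)}

    def convolve(dist_a, dist_b):
        # Convolution of two independent bandwidth distributions.
        out = {}
        for bw_a, pr_a in dist_a.items():
            for bw_b, pr_b in dist_b.items():
                out[bw_a + bw_b] = out.get(bw_a + bw_b, 0.0) + pr_a * pr_b
        return out

    def overflow_probability(classes, capacity):
        # P(instantaneous aggregate bandwidth > link capacity), usable for
        # connection acceptance control decisions.
        total = {0.0: 1.0}
        for n_sources, p_on, peak_rate in classes:
            total = convolve(total, class_distribution(n_sources, p_on, peak_rate))
        return sum(pr for bw, pr in total.items() if bw > capacity)

    # Example: two illustrative traffic classes on a 100 Mbit/s link.
    print(overflow_probability([(30, 0.4, 2.0), (10, 0.2, 10.0)], capacity=100.0))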
Abstract:
Immersive virtual reality (IVR) typically generates the illusion in participants that they are in the displayed virtual scene where they can experience and interact in events as if they were really happening. Teleoperator (TO) systems place people at a remote physical destination embodied as a robotic device, and where typically participants have the sensation of being at the destination, with the ability to interact with entities there. In this paper, we show how to combine IVR and TO to allow a new class of application. The participant in the IVR is represented in the destination by a physical robot (TO) and simultaneously the remote place and entities within it are represented to the participant in the IVR. Hence, the IVR participant has a normal virtual reality experience, but where his or her actions and behaviour control the remote robot and can therefore have physical consequences. Here, we show how such a system can be deployed to allow a human and a rat to operate together, but the human interacting with the rat on a human scale, and the rat interacting with the human on the rat scale. The human is represented in a rat arena by a small robot that is slaved to the human's movements, whereas the tracked rat is represented to the human in the virtual reality by a humanoid avatar. We describe the system and also a study that was designed to test whether humans can successfully play a game with the rat. The results show that the system functioned well and that the humans were able to interact with the rat to fulfil the tasks of the game. This system opens up the possibility of new applications in the life sciences involving participant observation of and interaction with animals but at human scale.
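A toy Python sketch of the scale coupling involved; the arena and room dimensions are made up and are not those of the study.

    def map_position(pos_xy, src_size, dst_size):
        # Linearly map a tracked 2D position from one workspace to another, e.g. the
        # rat arena onto the human-scale virtual room (and inversely for the robot).
        (x, y), (sx, sy), (dx, dy) = pos_xy, src_size, dst_size
        return (x / sx * dx, y / sy * dy)

    # A rat tracked at (1.2, 0.8) in a 2 m x 2 m arena places the humanoid avatar
    # at (6.0, 4.0) in a 10 m x 10 m virtual room; the inverse mapping drives the
    # small robot that follows the human participant's movements.
    print(map_position((1.2, 0.8), src_size=(2.0, 2.0), dst_size=(10.0, 10.0)))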
Abstract:
Molecular diagnosis using real-time polymerase chain reaction (PCR) may allow earlier diagnosis of rickettsiosis. We developed a duplex real-time PCR that amplifies (1) DNA of any rickettsial species and (2) DNA of both typhus group rickettsia, that is, Rickettsia prowazekii and Rickettsia typhi. Primers and probes were selected to amplify a segment of the 16S rRNA gene of Rickettsia spp. for the pan-rickettsial PCR and the citrate synthase gene (gltA) for the typhus group rickettsia PCR. Analytical sensitivity was 10 copies of control plasmid DNA per reaction. No cross-amplification was observed when testing human DNA and 22 pathogens or skin commensals. Real-time PCR was applied to 16 clinical samples. Rickettsial DNA was detected in the skin biopsies of three patients. In one patient with severe murine typhus, the typhus group PCR was positive in a skin biopsy from a petechial lesion and seroconversion was later documented. The two other patients with negative typhus group PCR suffered from Mediterranean and African spotted fever, respectively; in both cases, skin biopsy was performed on the eschar. Our duplex real-time PCR showed a good analytical sensitivity and specificity, allowing early diagnosis of rickettsiosis among three patients, and recognition of typhus in one of them.