968 results for Computation laboratories
Abstract:
More than 70 species of mycobacteria have been defined, and some can cause disease in humans, especially in immunocompromised patients. Species identification in most clinical laboratories is based on phenotypic characteristics and biochemical tests and final results are obtained only after two to four weeks. Quick identification methods, by reducing time for diagnosis, could expedite institution of specific treatment, increasing chances of success. PCR restriction-enzyme analysis (PRA) of the hsp65 gene was used as a rapid method for identification of 103 clinical isolates. Band patterns were interpreted by comparison with published tables and patterns available at an Internet site (http://www.hospvd.ch:8005). Concordant results of PRA and biochemical identification were obtained in 76 out of 83 isolates (91.5%). Results from 20 isolates could not be compared due to inconclusive PRA or biochemical identification. The results of this work showed that PRA could improve identification of mycobacteria in a routine setting because it is accurate, fast, and cheaper than conventional phenotypic identification.
Abstract:
Dissertation submitted to obtain the Master's degree in Informatics Engineering
Abstract:
A comparison of the Etest and the reference broth macrodilution susceptibility test for fluconazole, ketoconazole, itraconazole and amphotericin B was performed with 59 isolates of Candida species from the oral cavities of AIDS patients. The Etest method was performed according to the manufacturer's instructions, and the reference method was performed according to National Committee for Clinical Laboratory Standards document M27-A guidelines. Our data showed that there was a good correlation between the MICs obtained by the Etest and broth dilution methods. When only the MIC results within ± 2 dilutions for both methods were considered, the agreement rates were 90.4% for itraconazole, ketoconazole and amphotericin B and 84.6% for fluconazole for the C. albicans isolates tested. In contrast to the reference method, the Etest classified three fluconazole-resistant isolates and one itraconazole-resistant isolate as susceptible, representing four very major errors. These results indicate that the Etest could be considered useful for the antifungal susceptibility evaluation of yeasts in clinical laboratories.
Abstract:
Scheduling plays an important role in production systems. Scheduling systems aim to generate a schedule that efficiently manages a set of tasks which must be executed in the same time period by the same resources. However, dynamic adaptation and optimisation are critical needs in scheduling systems, since production organisations are dynamic by nature. In these organisations, disturbances to working conditions and requirements occur regularly and unexpectedly. Examples of such disturbances include the arrival of a new task, the cancellation of a task, or a change in a due date, among others. These dynamic events must be taken into account, since they can affect the created schedule and render it inefficient. Production environments therefore need an immediate response to these events, using a real-time rescheduling method to minimise the effect of dynamic events on the production system. Scheduling systems must thus be able, automatically and intelligently, to adapt the schedule the organisation is following to unexpected events in real time. This dissertation addresses the problem of incorporating new tasks into an existing schedule. To this end, an optimisation approach, a Constructive Selection Hyper-heuristic for Dynamic Scheduling, is proposed to handle dynamic events that may occur in a production environment, in order to keep the schedule as robust as possible. The approach is inspired by evolutionary computation and hyper-heuristics. The computational study carried out showed that the use of a constructive selection hyper-heuristic can be advantageous in solving dynamic adaptation optimisation problems.
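The central dynamic event addressed by the dissertation, the arrival of a new task against an existing schedule, can be illustrated with a simple greedy insertion heuristic. This is a generic sketch under assumed single-machine data, not the proposed constructive selection hyper-heuristic:

```python
# Greedy insertion of a new task into an existing single-machine schedule:
# a generic illustration of reacting to a "new task" dynamic event, not the
# constructive selection hyper-heuristic proposed in the dissertation.

def total_completion(seq):
    """Sum of completion times (total flow time) of a task sequence."""
    t, total = 0, 0
    for _, duration in seq:
        t += duration
        total += t
    return total

def insert_task(schedule, new_task):
    """Try every insertion position and keep the cheapest resulting plan."""
    best = None
    for pos in range(len(schedule) + 1):
        candidate = schedule[:pos] + [new_task] + schedule[pos:]
        cost = total_completion(candidate)
        if best is None or cost < best[0]:
            best = (cost, candidate)
    return best[1]

plan = [("A", 3), ("B", 5)]          # existing plan: (name, duration) pairs
plan = insert_task(plan, ("C", 1))   # dynamic event: task C arrives
print([name for name, _ in plan])    # the short new task is scheduled first
```

Re-evaluating every insertion position keeps the rest of the plan intact, which is the "minimal disruption" idea behind repair-based rescheduling.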
Abstract:
Near real time media content personalisation is nowadays a major challenge involving media content sources, distributors and viewers. This paper describes an approach to seamless recommendation, negotiation and transaction of personalised media content. It adopts an integrated view of the problem by proposing, on the business-to-business (B2B) side, a brokerage platform to negotiate the media items on behalf of the media content distributors and sources, providing viewers, on the business-to-consumer (B2C) side, with a personalised electronic programme guide (EPG) containing the set of recommended items after negotiation. In this setup, when a viewer connects, the distributor looks up and invites sources to negotiate the contents of the viewer personal EPG. The proposed multi-agent brokerage platform is structured in four layers, modelling the registration, service agreement, partner lookup, invitation as well as item recommendation, negotiation and transaction stages of the B2B processes. The recommendation service is a rule-based switch hybrid filter, including six collaborative and two content-based filters. The rule-based system selects, at runtime, the filter(s) to apply as well as the final set of recommendations to present. The filter selection is based on the data available, ranging from the history of items watched to the ratings and/or tags assigned to the items by the viewer. Additionally, this module implements (i) a novel item stereotype to represent newly arrived items, (ii) a standard user stereotype for new users, (iii) a novel passive user tag cloud stereotype for socially passive users, and (iv) a new content-based filter named the collinearity and proximity similarity (CPS). At the end of the paper, we present off-line results and a case study describing how the recommendation service works. The proposed system provides, to our knowledge, an excellent holistic solution to the problem of recommending multimedia contents.
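The runtime filter selection described above can be pictured as a small rule-based switch. This is a hedged sketch: the rule order and viewer-profile field names are assumptions, and the platform's actual six collaborative and two content-based filters are only named abstractly:

```python
# A minimal rule-based switch that picks recommendation filters from the
# data available for a viewer. Rule order and profile field names are
# illustrative assumptions, not the platform's actual decision logic.

def select_filters(viewer):
    if not viewer.get("history"):    # new viewer: fall back to a stereotype
        return ["user-stereotype"]
    filters = []
    if viewer.get("ratings"):
        filters.append("collaborative-ratings")
    if viewer.get("tags"):
        filters.append("content-based-tags")
    if not filters:                  # watches but never rates or tags
        filters.append("passive-user-tag-cloud")
    return filters

# A socially passive viewer: watch history only, no ratings or tags.
print(select_filters({"history": ["item42"]}))  # ['passive-user-tag-cloud']
```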
Abstract:
In recent years, vehicular cloud computing (VCC) has emerged as a new technology used in a wide range of multimedia-based healthcare applications. In VCC, vehicles act as intelligent machines that collect and transfer healthcare data to local or global sites for storage and computation, since vehicles themselves have comparatively limited storage and computation power for handling multimedia files. However, due to dynamic changes in topology and the lack of centralised monitoring points, this information can be altered or misused. Such security breaches can result in disastrous consequences such as loss of life or financial fraud. To address these issues, a clustering-based, learning-automata-assisted distributed intrusion detection system is designed. Although the proposed scheme can be applied in a number of settings, a multimedia-based healthcare application is taken for illustration. In the proposed scheme, learning automata (LA) are assumed to be stationed on the vehicles; they take clustering decisions intelligently and select one of the members of the group as a cluster-head. The cluster-heads then assist in the efficient storage and dissemination of information through a cloud-based infrastructure. To secure the proposed scheme against malicious activities, a standard cryptographic technique is used in which the automaton learns from the environment and takes adaptive decisions to identify any malicious activity in the network. A reward or penalty is given by the stochastic environment in which an automaton performs its actions, so that the automaton updates its action probability vector after receiving the reinforcement signal from the environment. The proposed scheme was evaluated using extensive simulations in ns-2 with SUMO. The results obtained indicate that the proposed scheme yields an improvement of 10% in the detection rate of malicious nodes when compared with existing schemes.
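The action-probability update described above can be illustrated with the textbook linear reward-penalty (L_RP) scheme. This is a generic sketch; the learning rates and two-action setup are assumptions, not parameters from the paper:

```python
# Linear reward-penalty (L_RP) update of a learning automaton's action
# probability vector. Learning rates a and b are assumed values; the paper
# does not specify its update parameters here.

def update(p, chosen, rewarded, a=0.1, b=0.1):
    n = len(p)
    q = [0.0] * n
    for i in range(n):
        if rewarded:   # shift probability mass toward the rewarded action
            q[i] = p[i] + a * (1 - p[i]) if i == chosen else p[i] * (1 - a)
        else:          # shift probability mass away from the penalised action
            q[i] = p[i] * (1 - b) if i == chosen else b / (n - 1) + p[i] * (1 - b)
    return q

p = [0.5, 0.5]                          # two actions, initially equiprobable
p = update(p, chosen=0, rewarded=True)
print(p)                                # mass shifts toward action 0
```

Both branches preserve a valid probability vector (the entries always sum to 1), which is what lets the automaton keep updating indefinitely as reinforcement signals arrive.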
Abstract:
Dissertation presented to obtain the Ph.D. degree in Chemistry
Abstract:
1st ASPIC International Congress
Abstract:
3rd Workshop on High-performance and Real-time Embedded Systems (HIRES 2015), 21 January 2015, Amsterdam, Netherlands.
Abstract:
Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to verify theoretical results or to test hardware or a product under development. Although this is an attractive solution, being a low-cost, easy and fast way to carry out coursework, it has major disadvantages. As everything is currently done with, and in, a computer, students are losing the "feel" for the real values of physical magnitudes. In engineering studies, for instance, and mainly in the first years, students need to learn electronics, algorithms, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, although real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Likewise, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that can be used by different courses in which computers are part of the teaching/learning process, giving students a more realistic feeling by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light and wind speed, which are either connected to a central server that students access over Ethernet or connected directly to the student's computer/laptop. These sensors use the communication ports available, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB).
Since a central server is used, students are encouraged to use the sensor values in their different courses and consequently in different types of software, such as numerical analysis tools, spreadsheets or simply inside any programming language when a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using different types of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools. This allows sensors that are not available at a given school to be used by getting the values from other places that share them. Moreover, students in more advanced years, with (theoretically) more know-how, can use courses related to electronics development to build new sensor modules and expand the framework further. The final solution is very attractive: low cost, simple to develop, and flexible, since the same materials can be reused in several courses, bringing real-world data into students' computer work.
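On the software side, a student program pulling one reading from the central sensor server could look like the sketch below. The host name, port and plain-text "name=value" line protocol are illustrative assumptions, not the framework's actual interface:

```python
# Pull one reading from the central sensor server over TCP and parse it.
# The host, port and "name=value" line protocol are assumptions for
# illustration; real deployments may use serial, parallel or USB ports.
import socket

def parse_reading(line):
    """Turn a line such as 'temperature=23.5' into ('temperature', 23.5)."""
    name, _, value = line.strip().partition("=")
    return name, float(value)

def fetch_reading(host="sensors.example.edu", port=5000):
    with socket.create_connection((host, port), timeout=5) as s:
        line = s.makefile().readline()   # server sends one reading per line
    return parse_reading(line)

# Parsing a sample line exactly as it would arrive from the server:
print(parse_reading("temperature=23.5\n"))  # ('temperature', 23.5)
```

Because the parsed tuple is plain data, the same values can be fed to a spreadsheet export, a numerical analysis tool, or any course assignment that needs a dataset.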
Abstract:
The aim of the present study was to standardize and evaluate the dot enzyme-linked immunosorbent assay (Dot-ELISA), a simple and rapid test for the detection of cysticercus antibodies in serum for the diagnosis of neurocysticercosis (NCC). The antigen used in the study was a complete homogenate of Cysticercus cellulosae cysts obtained from infected pigs and dotted onto a nitrocellulose membrane. Test sera were collected from patients with NCC, and control sera from patients with other diseases and from healthy students and blood donors of the Jawaharlal Institute of Postgraduate Medical Education and Research (JIPMER) Hospital, Pondicherry, during a study period from 2001 to 2003. Dot-ELISA detected antibodies in 14 of 25 (56%) clinically suspected cases of NCC, in 13 of 23 (56.5%) CT/MRI-proven cases of NCC, and in 2 of 25 (8%) each of the non-cysticercal CNS infection controls and the healthy controls. The test showed a sensitivity of 56.25%, a specificity of 92%, a positive predictive value of 87.09%, and a negative predictive value of 70.76%. The results of the present study show that the Dot-ELISA, as a simple test, can be used in the field or in poorly equipped laboratories for the diagnosis of NCC.
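The reported test characteristics follow from the standard 2x2 confusion-matrix formulas. The sketch below recomputes sensitivity and specificity from the counts implied by the abstract (27 true positives, 21 false negatives, 46 true negatives and 4 false positives, which is our reading of the numbers and should be taken as an assumption):

```python
# Standard diagnostic-accuracy metrics from a 2x2 confusion matrix.
# The counts used below are our reading of the abstract's figures and are
# an assumption: 27 TP, 21 FN, 46 TN, 4 FP.

def metrics(tp, fn, tn, fp):
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

m = metrics(tp=27, fn=21, tn=46, fp=4)
print(round(m["sensitivity"] * 100, 2))  # 56.25, as reported
print(round(m["specificity"] * 100, 2))  # 92.0, as reported
```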
Abstract:
Presented at the 21st IEEE International Conference on Embedded and Real-Time Computing Systems and Applications (RTCSA 2015), 19-21 August 2015, Hong Kong, China, pp. 122-131.
Abstract:
4th International Conference, SIMPAR 2014, Bergamo, Italy, October 20-23, 2014
Abstract:
DNA amplification techniques are being used increasingly in clinical laboratories to confirm the identity of medically important bacteria. A PCR-based identification method has been in use in our centre for 10 years for Burkholderia pseudomallei and was used to confirm the identity of bacteria isolated from cases of melioidosis in Ceará since 2003. This particular method has been used as a reference standard for less discriminatory methods. In this study we evaluated three PCR-based methods of B. pseudomallei identification and used DNA sequencing to resolve discrepancies between PCR-based results and phenotypic identification methods. The established semi-nested PCR protocol for the B. pseudomallei 16S-23S spacer region produced a consistent negative result for one of our 100 test isolates (BCC #99), but correctly identified all 71 other B. pseudomallei isolates tested. Anomalous sequence variation was detected at the inner, reverse primer binding site for this method. PCR methods were developed for the detection of two other B. pseudomallei metabolic genes. The conventional lpxO PCR protocol had a sensitivity of 0.89 and a specificity of 1.00, while a real-time lpxO protocol performed even better, with a sensitivity and specificity of 1.00 each. This method identified all B. pseudomallei isolates, including the PCR-negative discrepant isolate. The phaC PCR protocol detected the gene in all B. pseudomallei and in all but three B. cepacia isolates, making it unsuitable for PCR-based identification of B. pseudomallei. This experience with PCR-based B. pseudomallei identification methods indicates that single PCR targets should be used with caution for the identification of these bacteria, and results need to be interpreted alongside phenotypic and alternative molecular methods such as gene sequencing.
Abstract:
Evolutionary Computation falls within the field of Artificial Intelligence and is a branch of computer science that has been applied to problem solving in several areas of engineering. This work presents the state of the art of Evolutionary Computation, as well as some of its applications in electronics, a field known as Evolutionary Electronics (or Evolvable Hardware), with emphasis on the synthesis of combinational digital circuits. First, Artificial Intelligence is introduced, followed by Evolutionary Computation in its main strands: Evolutionary Algorithms, based on Charles Darwin's process of the evolution of species, and Swarm Intelligence, based on the collective behaviour of certain animals. Regarding Evolutionary Algorithms, evolution strategies, genetic programming, evolutionary programming and, with greater emphasis, Genetic Algorithms are described. Regarding Swarm Intelligence, ant colony optimisation and particle swarm optimisation are described. In parallel, a study of Evolutionary Electronics was also carried out, briefly covering some of its application areas, among them robotics, FPGAs, printed circuit board routing, the synthesis of digital and analogue circuits, telecommunications, and controllers. To make the study concrete, a case study of the application of genetic algorithms to the synthesis of combinational digital circuits is presented, based on the analysis and comparison of three references by different authors. This study made it possible to compare not only the results obtained by each of the authors, but also the way the genetic algorithms were implemented, namely with respect to the parameters, the genetic operators used, the fitness function, the hardware implementation, and the type of circuit encoding.
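To give a concrete flavour of the case study's technique, the toy sketch below uses a genetic algorithm to evolve a bitstring toward a target truth-table output column. The population size, mutation rate, selection scheme and the target itself are illustrative assumptions, not any of the compared authors' implementations:

```python
# Toy genetic algorithm: evolve a bitstring to match a target truth-table
# output column. All parameters (population size, mutation rate, one-point
# crossover, truncation selection) are assumed for illustration.
import random

random.seed(42)
TARGET = [0, 1, 1, 1, 0, 1, 1, 1]   # e.g. an 8-row output column

def fitness(bits):
    """Number of truth-table rows where the candidate matches the target."""
    return sum(b == t for b, t in zip(bits, TARGET))

def evolve(pop_size=20, generations=200, mut=0.05):
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TARGET):   # perfect circuit found
            break
        parents = pop[: pop_size // 2]       # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(TARGET))   # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < mut) for bit in child]  # mutate
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = evolve()
```

A real circuit synthesiser would encode gates and wiring rather than a raw output column, but the loop structure (evaluate, select, cross over, mutate) is the same one compared across the three referenced implementations.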