18 results for Detection and fault location
at Instituto Politécnico do Porto, Portugal
Abstract:
The deterioration of water quality by cyanobacteria causes outbreaks and epidemics associated with harmful diseases in humans and animals because of the toxins that they release. Microcystin-LR is one of the most widely studied hepatotoxins, and the World Health Organization recommends a maximum value of 1 µg/L in drinking water. Highly specific recognition molecules, such as molecularly imprinted polymers, have been developed to quantify microcystins in waters for human use and have shown great potential in the analysis of these kinds of samples. The results obtained were auspicious, with the detection limit found, 1.5 µg/L, being of the same order of magnitude as the guideline limit recommended by the WHO. This technology is very promising because the sensors are stable and specific, and the technology is inexpensive and allows for rapid on-site monitoring.
Abstract:
To increase the amount of logic available to users in SRAM-based FPGAs, manufacturers are using nanometric technologies to boost logic density and reduce costs, making their use more attractive. However, these technological improvements also make FPGAs particularly vulnerable to configuration memory bit-flips caused by power fluctuations, strong electromagnetic fields, and radiation. This issue is particularly sensitive because of the increasing number of configuration memory cells needed to define device functionality. A short survey of the most recent publications is presented to support the options assumed during the definition of a framework for implementing circuits immune to bit-flip induction mechanisms in memory cells, based on a customized redundant infrastructure and on a detection-and-fix controller.
Abstract:
When anomalies occur in electricity distribution networks, determining the location of faults is often an arduous and time-consuming task, frequently due to the limited amount of information available. Consequently, electric utilities must resort to systems that, by reducing the time spent locating faults, ensure a reduction in the duration and frequency of supply outages. This dissertation studies the various existing fault detection systems, with emphasis on the use of Fault Passage Indicators, and analyses the contribution of these systems to the improvement of Quality of Service indices. It addresses the difficulties involved in implementing such systems, namely those arising from the specific characteristics of distribution networks. It also develops a methodology, and the corresponding tool, for fault detection based on the use of communicating Fault Passage Indicators on a medium-voltage distribution feeder belonging to EDP, and analyses, both technically and economically, the benefits to be obtained from implementing the developed methodology. Beyond meeting the objectives stated above, this dissertation aims to provide a useful tool for electric utilities considering the adoption of fault detection systems, with the main goal of reducing supply unavailability times, which are closely associated with the pursuit of better Quality of Service indices.
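To make the core idea behind fault location with Fault Passage Indicators concrete, here is a minimal, hypothetical sketch (not the dissertation's actual tool): on a radial feeder, the faulted section lies immediately downstream of the last indicator that reported the passage of fault current.

```python
# Hypothetical sketch: locating the faulted section of a radial MV feeder
# from the status of communicating Fault Passage Indicators (FPIs).
# FPIs are listed in order from the substation towards the end of the feeder;
# True means the indicator detected the passage of fault current.

def faulted_section(fpi_states):
    """Return the index of the feeder section containing the fault.

    Section i is the segment between FPI i-1 and FPI i
    (section 0 lies between the substation and the first FPI).
    """
    last_tripped = -1
    for i, tripped in enumerate(fpi_states):
        if tripped:
            last_tripped = i
    # The fault is downstream of the last FPI that saw fault current.
    return last_tripped + 1

# Example: the first two FPIs saw fault current, the rest did not,
# so the fault lies in the section just after the second FPI.
print(faulted_section([True, True, False, False]))  # -> 2
```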
Abstract:
β-lactamases are hydrolytic enzymes that inactivate the β-lactam ring of antibiotics such as penicillins and cephalosporins. Most studies carried out to date have focused mainly on the characterization of β-lactamases recovered from clinical isolates of Gram-positive staphylococci and Gram-negative enterobacteria, among others. However, only a few studies address the detection of β-lactamase carriers in healthy humans, sick animals, or strains isolated from environmental sources such as food, water, or soil. Considering this, we proposed a 10-week laboratory programme for the Biochemistry and Molecular Biology laboratory for majors in the health, environmental, and agronomical sciences. During those weeks, students would work with basic techniques such as DNA extraction, bacterial transformation, polymerase chain reaction (PCR), gel electrophoresis, and several bioinformatics tools. These laboratory exercises would be conducted as a mini research project in which each class builds on the previous ones. This curriculum was compared in an experiment involving two groups of students from two different majors: the new curriculum, with classes linked together as a mini research project, was taught to Pharmacy majors, while the old curriculum was taught to Environmental Health students. The results showed that students enrolled in the new curriculum obtained better results in the final exam than students enrolled in the former curriculum. Likewise, these students were found to be more enthusiastic during the laboratory classes than those from the former curriculum.
Abstract:
A multi-residue methodology based on solid-phase extraction followed by gas chromatography–tandem mass spectrometry was developed for trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine-disrupting properties. Matrix standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample to compensate for the matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was performed mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticide analysis and/or GC–MS methodology. As the assumption of homoscedasticity was not met for the analytical data, a weighted least squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the higher concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limits of detection and quantification, range, precision, accuracy, extraction efficiency, stability, and robustness.
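As context for the weighted least squares step, the sketch below illustrates the general technique (it is not the paper's validated procedure, and the numbers and the 1/x² weighting choice are assumptions): when calibration data are heteroscedastic, weighting each point by an estimate of its inverse variance keeps the high-concentration points from dominating the fit.

```python
# Illustrative sketch of weighted least squares (WLS) calibration for
# heteroscedastic data, where variance grows with concentration.
# Weights of 1/x^2 are one common empirical choice; the paper's exact
# weighting scheme may differ. All numbers below are invented.
import numpy as np

x = np.array([0.5, 1.0, 5.0, 10.0, 50.0, 100.0])    # concentration
y = np.array([0.9, 2.1, 10.4, 19.8, 103.0, 197.0])  # instrument response

w = 1.0 / x**2  # down-weight the high-concentration points

# Solve the weighted normal equations for slope and intercept.
W = np.diag(w)
X = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

print(f"calibration line: y = {slope:.3f} x + {intercept:.3f}")
```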
Abstract:
Localization is a fundamental task in Cyber-Physical Systems (CPS), where data is tightly coupled with the environment and the location where it is generated. The research literature on localization has reached a critical mass, and several surveys have also emerged. This review paper contributes to the state of the art by proposing a new and holistic taxonomy of the fundamental concepts of localization in CPS, based on a comprehensive analysis of previous research works and surveys. The main objective is to pave the way towards a deep understanding of the main localization techniques and to unify their descriptions. Furthermore, this review provides a complete overview of the most relevant localization and geolocation techniques. We also present the most important metrics for measuring the accuracy of localization approaches, understood as the gap between the real location and its estimate. Finally, we present open issues and research challenges pertaining to localization. We believe that this review will serve as an important and complete reference on localization techniques in CPS for researchers and practitioners, offering added value as compared to previous surveys.
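To make the accuracy metric concrete, here is a minimal illustration (not taken from the survey; the coordinates are invented): a common way to score a localization technique is the Euclidean distance between true and estimated positions, aggregated as a root-mean-square error.

```python
# Minimal illustration of a standard localization accuracy metric:
# the Euclidean distance between true and estimated positions,
# aggregated over a set of test points as the root-mean-square error (RMSE).
import numpy as np

true_pos = np.array([[0.0, 0.0], [5.0, 2.0], [10.0, 7.5]])  # ground truth (m)
est_pos = np.array([[0.4, -0.3], [5.5, 2.2], [9.1, 8.0]])   # estimates (m)

errors = np.linalg.norm(true_pos - est_pos, axis=1)  # per-point error
rmse = np.sqrt(np.mean(errors**2))

print(f"per-point errors (m): {errors.round(2)}, RMSE: {rmse:.2f} m")
```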
Abstract:
Dependability is a critical factor in computer systems, requiring high-quality validation & verification procedures in the development stage. At the same time, digital devices are getting smaller, and access to their internal signals and registers is increasingly complex, requiring innovative debugging methodologies. To address this issue, most recent microprocessors include an on-chip debug (OCD) infrastructure to facilitate common debugging operations. This paper proposes an enhanced OCD infrastructure with the objective of supporting the verification of fault-tolerant mechanisms through fault injection campaigns. This upgraded on-chip debug and fault injection (OCD-FI) infrastructure provides an efficient fault injection mechanism with improved capabilities and dynamic behavior. Preliminary results show that this solution provides flexibility in terms of fault triggering and allows high-speed real-time fault injection in memory elements.
Abstract:
Fault injection is frequently used for the verification and validation of dependable systems. When targeting real-time microprocessor-based systems, the process becomes significantly more complex. This paper proposes two complementary solutions to improve the execution of real-time fault injection campaigns, both in terms of performance and capabilities. The methodology is based on the use of the on-chip debug mechanisms present in modern electronic devices. The main objective is the injection of faults in microprocessor memory elements with minimum delay and intrusiveness. Different configurations were implemented and compared in terms of performance gain and logic overhead.
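To illustrate the fault model involved in such campaigns, here is a hypothetical software sketch (the papers above inject faults through the processor's on-chip debug port, not through application code): a single-bit fault in a memory element is typically emulated by XOR-ing the stored word with a one-bit mask.

```python
# Hypothetical sketch of the single-bit-flip fault model used in fault
# injection campaigns. The surveyed approach injects faults through the
# on-chip debug (OCD) port; here a plain Python list stands in for memory.
import random

def inject_bit_flip(memory, address, bit=None):
    """Flip one bit of the 32-bit word at `address`; return (bit, old, new)."""
    if bit is None:
        bit = random.randrange(32)      # random bit position, as in random campaigns
    old = memory[address]
    memory[address] = old ^ (1 << bit)  # XOR with a one-bit mask emulates a soft error
    return bit, old, memory[address]

memory = [0x0000_00FF, 0xDEAD_BEEF, 0x1234_5678]
bit, old, new = inject_bit_flip(memory, address=1, bit=4)
print(f"flipped bit {bit}: {old:#010x} -> {new:#010x}")
```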
Abstract:
As a natural form of human-machine interaction, gesture recognition entails a strong research component in areas such as computer vision and machine learning. Gesture recognition is an area with very diverse applications, providing users with a more natural and simpler way to communicate with computer-based systems without the need for extra devices. Thus, the main objective of gesture recognition research applied to human-machine interaction is to create systems that can identify specific gestures and use them to convey information or to control devices. For this, vision-based interfaces for gesture recognition need to detect the hand quickly and robustly and must be able to perform gesture recognition in real time. Today, vision-based gesture recognition systems work as specific solutions, built to solve one particular problem and configured to work in one particular way. This research project studied and implemented sufficiently generic solutions, using machine learning algorithms, allowing their application to a wide range of human-machine interface systems for real-time gesture recognition. The proposed solution, the Gesture Learning Module Architecture (GeLMA), makes it possible to define, in a simple way, a set of commands based on static and dynamic gestures that can easily be integrated and configured for use in a variety of applications. It is a low-cost system, easy to train and use, since it is built entirely from code libraries. The experiments carried out showed that the system achieved an accuracy of 99.2% in the recognition of static gestures and an average accuracy of 93.7% in the recognition of dynamic gestures. To validate the proposed solution, two complete systems were implemented. The first is a real-time system capable of helping a referee officiate a robotic soccer game. The proposed solution combines a vision-based gesture recognition system with the definition of a formal language, CommLang Referee, which we named the Referee Command Language Interface System (ReCLIS). The system identifies commands based on a set of static and dynamic gestures performed by the referee, which are then sent to a computer interface that transmits the corresponding information to the robots. The second is a real-time system capable of interpreting a subset of Portuguese Sign Language. The experiments showed that the system was able to recognize the vowels reliably in real time. Although the implemented solution was only trained to recognize the five vowels, the system is easily extensible to the rest of the alphabet. The experiments also showed that the basis of vision-based interaction systems can be the same for all applications, thereby facilitating their implementation. The proposed solution has the further advantage of being sufficiently generic and a solid foundation for the development of gesture-recognition-based systems that can easily be integrated with any human-machine interface application.
The formal interface definition language can be redefined, and the system can easily be configured and trained with a different set of gestures for integration into the final solution.
Abstract:
The new generations of SRAM-based FPGA (field programmable gate array) devices are the preferred choice for the implementation of reconfigurable computing platforms intended to accelerate processing in real-time systems. However, the vulnerability of FPGAs to hard and soft errors is a major weakness in robust configurable system design. In this paper, a novel built-in self-healing (BISH) methodology, based on run-time self-reconfiguration, is proposed. A soft microprocessor core implemented in the FPGA is responsible for the management and execution of all the BISH procedures. Fault detection and diagnosis is followed by repair actions, taking advantage of the dynamic reconfiguration features offered by new FPGA families. Meanwhile, modular redundancy ensures that the system continues to work correctly.
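As a concrete picture of the modular redundancy mentioned above, here is an illustrative sketch (not the paper's FPGA implementation): in triple modular redundancy, three replicas compute the same function and a majority voter masks a fault in any single replica.

```python
# Illustrative sketch of triple modular redundancy (TMR), the classic form
# of modular redundancy. Three replicas compute the same function; a bitwise
# majority voter masks a fault in any one of them.

def majority_vote(a, b, c):
    """Bitwise 2-out-of-3 majority of three integer replica outputs."""
    return (a & b) | (a & c) | (b & c)

# Replica outputs: the third replica suffered a bit-flip in bit 0.
ok, also_ok, faulty = 0b1010, 0b1010, 0b1011
print(bin(majority_vote(ok, also_ok, faulty)))  # -> 0b1010, fault masked
```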
Abstract:
This study aims to optimize the water quality monitoring of a polluted watercourse (Leça River, Portugal) through principal component analysis (PCA) and cluster analysis (CA). These statistical methodologies were applied to physicochemical, bacteriological, and ecotoxicological data (with the marine bacterium Vibrio fischeri and the green alga Chlorella vulgaris) obtained from the analysis of water samples collected monthly at seven monitoring sites during five campaigns (February, May, June, August, and September 2006). The results of some variables were assigned to water quality classes according to national guidelines. Chemical and bacteriological quality data led to the classification of Leça River water quality as "bad" or "very bad". PCA and CA identified monitoring sites with similar pollution patterns, distinguishing site 1 (located in the upstream stretch of the river) from all other sampling sites downstream. Ecotoxicity results corroborated this classification, revealing differences in space and time. The present study includes not only physical, chemical, and bacteriological but also ecotoxicological parameters, which opens new perspectives in river water characterization. Moreover, the application of PCA and CA is very useful for optimizing water quality monitoring networks, defining the minimum number of sites and their location. Thus, these tools can support appropriate management decisions.
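For readers unfamiliar with the statistical step, the generic sketch below (not the study's actual analysis; the data are random placeholders) shows the shape of the workflow: PCA projects the multivariate water-quality measurements onto a few uncorrelated components, and clustering on those components groups monitoring sites with similar pollution patterns.

```python
# Generic sketch of a PCA + cluster analysis workflow applied to a
# sites-by-variables water quality matrix (values below are made up).
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows: 7 monitoring sites; columns: physicochemical/bacteriological variables.
rng = np.random.default_rng(0)
data = rng.normal(size=(7, 5))

# Standardize, then project onto the first two principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(data))

# Hierarchical clustering on the PCA scores groups sites with similar profiles.
labels = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
print(labels)  # cluster label per monitoring site
```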
Abstract:
Auditory event-related potentials (AERPs) are widely used in diverse fields of today's neuroscience, concerning auditory processing, speech perception, language acquisition, neurodevelopment, attention, and cognition in normal aging, gender, developmental, neurologic, and psychiatric disorders. However, their transposition to clinical practice has remained minimal, mainly due to the scarce literature on normative data across age, the wide spectrum of results, the variety of auditory stimuli used, and the different neuropsychological meanings attributed to AERP components by different authors. One of the most prominent AERP components studied in recent decades is N1, which reflects auditory detection and discrimination. Subsequently, N2 indicates attention allocation and phonological analysis. The simultaneous analysis of N1 and N2 elicited by feasible novelty experimental paradigms, such as the auditory oddball, seems an objective method to assess central auditory processing. The aim of this systematic review was to bring forward normative values for auditory oddball N1 and N2 components across age. EBSCO, PubMed, Web of Knowledge, and Google Scholar were systematically searched for studies that elicited N1 and/or N2 by the auditory oddball paradigm. A total of 2,764 papers were initially identified in the databases, of which 19 resulted from hand searches and additional references, covering 1988 to 2013, the last 25 years. A final total of 68 studies met the eligibility criteria, with a total of 2,406 participants from control groups for N1 (age range 6.6–85 years; mean 34.42) and 1,507 for N2 (age range 9–85 years; mean 36.13). Polynomial regression analysis revealed that N1 latency decreases with aging at Fz and Cz; N1 amplitude at Cz decreases from childhood to adolescence and stabilizes after 30–40 years, while at Fz the decrement ends by age 60 and amplitude increases markedly after this age. Regarding N2, latency did not covary with age, but amplitude showed a significant decrement at both Cz and Fz. The results suggest reliable normative values for the Cz and Fz electrode locations; however, changes in brain development and component topography over age should be considered in clinical practice.
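As an illustration of the analysis used, the sketch below shows a generic polynomial regression of an AERP measure on age (the data points are invented, not the review's): the fitted low-order polynomial is what age-dependent normative curves are read from.

```python
# Generic sketch of polynomial regression for deriving normative curves of
# AERP measures across age. The data points below are invented.
import numpy as np

age = np.array([8, 15, 25, 35, 45, 60, 75])             # years
n1_latency = np.array([118, 110, 102, 99, 97, 96, 95])  # ms, made up

# Fit a 2nd-order polynomial: latency ~ c2*age^2 + c1*age + c0.
coeffs = np.polyfit(age, n1_latency, deg=2)
predict = np.poly1d(coeffs)

print(f"predicted N1 latency at age 30: {predict(30):.1f} ms")
```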
Abstract:
The intensification of agricultural productivity is an important challenge worldwide. However, environmental stressors can pose challenges to this intensification. The progressive occurrence of the cyanotoxins cylindrospermopsin (CYN) and microcystin-LR (MC-LR) as a potential consequence of eutrophication and climate change is of increasing concern in the agricultural sector, because these cyanotoxins have been reported to exert harmful effects in crop plants. A proteomic-based approach has been shown to be a suitable tool for the detection and identification of the primary responses of organisms exposed to cyanotoxins. The aim of this study was to compare the leaf-proteome profiles of lettuce plants exposed to environmentally relevant concentrations of CYN and a MC-LR/CYN mixture. Lettuce plants were exposed to 1, 10, and 100 µg/L CYN and a MC-LR/CYN mixture for five days. The proteins of lettuce leaves were separated by two-dimensional electrophoresis (2-DE), and those that were differentially abundant were then identified by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF/TOF MS). The biological functions most represented among the identified proteins in both experiments were photosynthesis and carbon metabolism and stress/defense response. Proteins involved in protein synthesis and signal transduction were also highly represented in the MC-LR/CYN experiment. Although distinct protein abundance patterns were observed in the two experiments, the effects appear to be concentration-dependent, and the effects of the mixture were clearly stronger than those of CYN alone. The results obtained highlight the putative tolerance of lettuce to CYN at concentrations up to 100 µg/L. Furthermore, the combination of CYN with MC-LR at low concentrations (1 µg/L) stimulated a significant increase in the fresh weight (fr. wt) of lettuce leaves and, at the proteomic level, resulted in an increase in the abundance of a large number of proteins. In contrast, many proteins exhibited a decrease in abundance or were absent in the gels from the simultaneous exposure to 10 and 100 µg/L MC-LR/CYN; in the latter case, a significant decrease in the fr. wt of lettuce leaves was also observed. These findings provide important insights into the molecular mechanisms of the lettuce response to CYN and MC-LR/CYN and may contribute to the identification of potential protein markers of exposure and of proteins that may confer tolerance to CYN and MC-LR/CYN. Furthermore, because lettuce is an important crop worldwide, this study may improve our understanding of the potential impact of these cyanotoxins on its quality traits (e.g., presence of allergenic proteins).
Abstract:
National companies face the need to respond to the market with a wide variety of products, small series, and short delivery times. The competitiveness of companies in a global market thus depends on their efficiency, their flexibility, the quality of their products, and low costs. Achieving these goals requires developing strategies and action plans involving the production equipment, including: the creation of new, complex, and more reliable equipment; the modification of existing equipment, modernizing it to meet current needs and to increase its availability and productivity; and the implementation of more assertive maintenance policies focused on the goal of "zero breakdowns", such as predictive maintenance. In this context, the main objective of this work is to predict the optimal time for the maintenance of a piece of industrial equipment: a refiner at the Mangualde plant of Sonae Industria, which operates continuously 24 hours a day, 365 days a year. For this purpose, measurements from sensors that continuously monitor the state of the refiner are used. The main maintenance operation on this equipment is the replacement of the two metal discs of its main component, the defibrator. Consequently, the refiner sensor analyzed in greatest detail is the one that measures the distance between the two defibrator discs. ARIMA models are an advanced statistical approach to time series forecasting. Based on the description of the autocorrelation of the data, these models describe a time series as a function of its past values. In this work, the ARIMA methodology is used to determine a model that forecasts the future values of the sensor measuring the distance between the two defibrator discs, thereby determining the optimal moment for their replacement and avoiding forced production stops caused by disc-wear failures. The results obtained in this work are an important scientific contribution to the field of predictive maintenance and fault detection in industrial equipment.
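As a sketch of how such a forecast can drive a maintenance decision (illustrative only; the series, the model order, and the wear threshold below are invented, not the thesis's data): an ARIMA model fitted to the disc-distance series is extrapolated forward, and replacement is scheduled for when the forecast crosses a minimum admissible value.

```python
# Illustrative ARIMA forecasting sketch for predictive maintenance.
# The series, model order (1,1,1), and threshold are all hypothetical.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic disc-distance measurements drifting downward as the discs wear.
rng = np.random.default_rng(1)
distance = 10.0 - 0.01 * np.arange(200) + rng.normal(0, 0.05, 200)

model = ARIMA(distance, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=50)

THRESHOLD = 7.8  # hypothetical minimum admissible disc distance
below = np.nonzero(forecast < THRESHOLD)[0]
if below.size:
    print(f"schedule disc replacement about {below[0] + 1} steps ahead")
else:
    print("no threshold crossing within the forecast horizon")
```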
Abstract:
This work presents the integration of obstacle detection and analysis capabilities in a coherent and advanced C&C framework allowing mixed-mode control in unmanned surface systems. The collision avoidance work has been successfully integrated in an operational autonomous surface vehicle and demonstrated in real operational conditions. We present the collision avoidance system, the ROAZ autonomous surface vehicle, and the results obtained in sea tests. Limitations of current COTS radar systems are also discussed, and further research directions are proposed towards the development and integration of advanced collision avoidance systems, taking into account the different requirements of unmanned surface vehicles.