976 results for "Automatic detection"
Abstract:
This paper presents a study of the photo-electronic properties of multilayer a-Si:H/a-SiC:H p-i-n-i-p structures. The study aims to give insight into the internal electrical characteristics of such a structure in thermal equilibrium, under applied bias, and under different illumination conditions. Taking advantage of this insight, it is possible to establish a relation among the electrical behavior of the structure, its geometry (i.e., the thickness of the light-absorbing intrinsic layers and of the internal n-layer), and the composition of the layers (i.e., the optical bandgap, controlled through the percentage of carbon dilution in the a-Si1-xCx:H layers). Showing an optical gain at low incident light power that is controllable through the externally applied bias or the structure composition, these structures are quite attractive for photo-sensing device applications such as color sensors and large-area color image detectors. An analysis based on numerical ASCA simulations is presented to describe the behavior of different configurations of the device and is compared with experimental measurements (spectral response and current-voltage characteristics). (c) 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Abstract:
Liver steatosis is a common disease usually associated with social and genetic factors. Early detection and quantification are important since steatosis can evolve to cirrhosis. In this paper, a new computer-aided diagnosis (CAD) system for steatosis classification, on a local and global basis, is presented. A Bayes factor is computed from objective ultrasound textural features extracted from the liver parenchyma. The goal is to develop a CAD screening tool to help in steatosis detection. Results showed an accuracy of 93.33%, with a sensitivity of 94.59% and a specificity of 92.11%, using the Bayes classifier. The proposed CAD system provides a suitable graphical display for steatosis classification.
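The accuracy, sensitivity, and specificity figures quoted above follow directly from a confusion matrix. A minimal sketch of that calculation, using hypothetical label arrays rather than the paper's data, is shown below.

```python
import numpy as np

def binary_metrics(y_true, y_pred):
    """Accuracy, sensitivity and specificity from binary labels (1 = steatosis)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # true-positive rate
    specificity = tn / (tn + fp)   # true-negative rate
    return accuracy, sensitivity, specificity

# Hypothetical example labels (not the paper's data)
acc, sens, spec = binary_metrics([1, 1, 0, 0, 1, 0], [1, 1, 0, 1, 1, 0])
print(f"accuracy={acc:.2%}, sensitivity={sens:.2%}, specificity={spec:.2%}")
```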
Abstract:
Liver steatosis is a common disease usually associated with social and genetic factors. Early detection and quantification are important since steatosis can evolve to cirrhosis. Steatosis is usually a diffuse liver disease, affecting the organ globally; however, it can also be focal, affecting only some foci that are difficult to discriminate. In both cases, steatosis is detected by laboratory analysis and by visual inspection of ultrasound images of the hepatic parenchyma. Liver biopsy is the most accurate diagnostic method, but its invasive nature suggests the use of other, non-invasive methods, while visual inspection of the ultrasound images is subjective and prone to error. In this paper a new computer-aided diagnosis (CAD) system for steatosis classification and analysis is presented, in which the Bayes factor, obtained from objective intensity and textural features extracted from ultrasound images of the liver, is computed on a local or global basis. The main goal is to provide the physician with an application that makes the diagnosis and quantification of steatosis faster and more accurate, namely in a screening approach. The results showed an overall accuracy of 93.54%, with sensitivities of 95.83% and 85.71% for the normal and steatosis classes, respectively. The proposed CAD system proved suitable as a graphical display for steatosis classification, and a comparison with some of the most recent works in the literature is also presented.
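The abstract describes classification by a Bayes factor computed from intensity and textural features of the liver parenchyma. The sketch below is a minimal illustration under Gaussian class-conditional assumptions (my assumption, not stated in the abstract), with hypothetical feature vectors standing in for the real ultrasound features.

```python
import numpy as np
from scipy.stats import multivariate_normal

def bayes_factor(x, normal_feats, steatosis_feats):
    """Ratio of class-conditional likelihoods p(x|steatosis) / p(x|normal),
    with each class modelled as a multivariate Gaussian fitted to training features."""
    mu_n, cov_n = normal_feats.mean(0), np.cov(normal_feats, rowvar=False)
    mu_s, cov_s = steatosis_feats.mean(0), np.cov(steatosis_feats, rowvar=False)
    p_n = multivariate_normal.pdf(x, mean=mu_n, cov=cov_n)
    p_s = multivariate_normal.pdf(x, mean=mu_s, cov=cov_s)
    return p_s / p_n

# Hypothetical texture features (e.g. mean intensity, contrast, homogeneity) per ROI
rng = np.random.default_rng(0)
normal = rng.normal([0.4, 0.2, 0.7], 0.05, size=(30, 3))
steato = rng.normal([0.6, 0.3, 0.5], 0.05, size=(30, 3))
roi = np.array([0.58, 0.29, 0.52])
bf = bayes_factor(roi, normal, steato)
print("classified as steatosis" if bf > 1 else "classified as normal", f"(BF = {bf:.2f})")
```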
Abstract:
Final Master's project submitted to obtain the degree of Master in Mechanical Engineering.
Abstract:
Objectives - To review available guidance for quality assurance (QA) in mammography and discuss its contribution to harmonising practices worldwide. Methods - A literature search was performed on different sources to identify guidance documents for QA in mammography available worldwide from international bodies, healthcare providers, and professional/scientific associations. The guidance documents identified were reviewed, and a selection was compared for type of guidance (clinical/technical), technology, and proposed QA methodologies, focusing on dose and image quality (IQ) performance assessment. Results - Fourteen protocols (targeted at conventional and digital mammography) were reviewed. All included recommendations for testing the acquisition, processing, and display systems associated with mammographic equipment. All guidance reviewed highlighted the importance of dose assessment and of testing the Automatic Exposure Control (AEC) system. Recommended tests for assessment of IQ showed variations in the proposed methodologies. Recommended testing focused on assessment of low-contrast detection, spatial resolution, and noise. QC of image display is recommended following the American Association of Physicists in Medicine guidelines. Conclusions - The existing QA guidance for mammography is derived from key documents (American College of Radiology and European Union guidelines) and proposes similar tests despite variations in detail and methodology. Studies reporting QA data should provide detail on the experimental technique to allow robust data comparison. Countries aiming to implement a mammography QA program may select/prioritise the tests depending on available technology and resources.
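Among the image-quality measures mentioned (low-contrast detection and noise), a common quantitative surrogate is the contrast-to-noise ratio measured on phantom regions of interest. The sketch below illustrates that single calculation with hypothetical ROI arrays; it is not tied to any specific protocol in the review.

```python
import numpy as np

def contrast_to_noise_ratio(roi_signal, roi_background):
    """CNR = |mean(signal) - mean(background)| / std(background),
    computed on pixel-value ROIs from a phantom image."""
    return abs(roi_signal.mean() - roi_background.mean()) / roi_background.std(ddof=1)

# Hypothetical phantom ROIs (pixel values)
rng = np.random.default_rng(1)
signal_roi = rng.normal(520, 12, size=(50, 50))
background_roi = rng.normal(500, 12, size=(50, 50))
print(f"CNR = {contrast_to_noise_ratio(signal_roi, background_roi):.2f}")
```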
Abstract:
Video-surveillance systems are installed, indoors or outdoors, in places such as airports, shopping centres, offices, government buildings, military bases, and private homes to assist in monitoring the site against possible intruders. With these systems it is possible to detect and track the people present in the local environment, making monitoring more efficient. In this context, conventional images (natural and infrared images) are used to extract information about the detected objects that will then be tracked. However, conventional images are affected by adverse environmental conditions such as the local illumination level (very strong light or total darkness) and the presence of rain, fog, or smoke, which hinder the task of monitoring people. It has therefore become necessary to study and propose solutions that increase the effectiveness of video-surveillance systems under adverse environmental conditions, i.e., in uncontrolled environments; one such solution is the use of thermographic images in video-surveillance systems. This document presents some of the characteristics of thermographic cameras and images, as well as a characterisation of surveillance scenarios. Results are then presented from an algorithm that performs person segmentation in thermographic images. The main focus of this dissertation was the analysis of description models (colour histogram, HOG, SIFT, SURF) to determine their performance in three cases: distinguishing between a person and a car; distinguishing between two different people; and determining that it is the same person throughout a sequence. In short, this study aims to contribute to the improvement of object detection and tracking algorithms in thermographic video sequences. Finally, through an analysis of the results obtained with the description models, conclusions are drawn that indicate which model best discriminates between objects in thermographic images.
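The descriptor comparison described above (colour histogram, HOG, SIFT, SURF) can be illustrated with OpenCV. The sketch below compares two image crops using a colour histogram and a HOG descriptor with cosine similarity; the file names are hypothetical, and the thesis's actual matching criteria are not reproduced here.

```python
import cv2
import numpy as np

def colour_hist(img, bins=32):
    """Per-channel histogram, concatenated and L1-normalised."""
    hists = [cv2.calcHist([img], [c], None, [bins], [0, 256]) for c in range(3)]
    h = np.concatenate(hists).ravel()
    return h / (h.sum() + 1e-12)

def hog_descriptor(img):
    """HOG descriptor on a 64x128 crop (OpenCV's default detection window)."""
    hog = cv2.HOGDescriptor()
    return hog.compute(cv2.resize(img, (64, 128))).ravel()

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Hypothetical file names for two detected objects in a thermographic sequence
crop_a = cv2.imread("person_frame10.png")
crop_b = cv2.imread("object_frame11.png")
print("colour-hist similarity:", cosine(colour_hist(crop_a), colour_hist(crop_b)))
print("HOG similarity:        ", cosine(hog_descriptor(crop_a), hog_descriptor(crop_b)))
```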
Abstract:
Introduction - Electronic messages are nowadays considered an important means of communication. Electronic messages - commonly known as emails - are used easily and frequently to send and receive the most varied kinds of information. They serve many purposes, generating a large number of messages every day and, consequently, an enormous volume of information. This large volume of information requires constant manipulation of the messages in order to keep the collection organised. Typically this manipulation consists of organising the messages into a taxonomy, one that reflects the particular interests and preferences of the user. Motivation - Manually organising emails is a tedious and time-consuming activity. Optimising this process through an automatic method tends to improve user satisfaction. There is a growing need for new solutions for handling digital content that save the user effort and cost; this need, specifically in the context of email handling, motivated this work. Hypothesis - The main objective of this project is to allow the ad-hoc organisation of emails with reduced effort from the user. The proposed methodology organises the emails into a set of disjoint categories that reflect the user's preferences. The main purpose of this process is to produce an organisation in which messages are assigned to appropriate classes while requiring as little effort as possible from the user. To achieve these objectives, the project relies on text-mining techniques, in particular automatic text categorisation, and on active learning. To reduce the need to query the user - to label examples according to the desired categories - the d-confidence algorithm was used. Automatic email organisation process - The process of automatically organising emails comprises three distinct phases: indexing, classification, and evaluation. In the first phase, the indexing phase, the emails go through a cleaning and transformation process whose main purpose is to generate a representation of the emails suitable for automatic processing. The second phase is the classification phase, which uses the data set resulting from the previous phase to produce a classification model and then applies it to new emails. Starting from a matrix representing emails, terms, and their respective weights, together with a set of manually classified examples, a classifier is generated through a learning process. The resulting classifier is then applied to the email collection and a classification of all emails is obtained. Classification is based on a support vector machine classifier combined with the d-confidence active-learning algorithm. The d-confidence algorithm proposes to the user the most informative examples for labelling; by identifying the emails carrying the most relevant information for the learning process, it reduces the number of iterations and, consequently, the effort required from the user. The third and final phase is the evaluation phase, in which the performance of the classification process and the efficiency of the d-confidence algorithm are assessed. The evaluation method adopted is 10-fold cross-validation.
Conclusions - The automatic email organisation process was developed successfully, and the performance of the resulting classifier and of the d-confidence algorithm was reasonably good. On average, the categories showed relatively low error rates, with the exception of the most generic classes. The effort required from the user was reduced, since with the d-confidence algorithm an error rate close to the final value was obtained even with a number of labelled cases below that required by a fully supervised method. It is also worth noting that, beyond the automatic email organisation process itself, this project was an excellent opportunity to acquire solid knowledge about text mining, automatic text classification, and information retrieval. The study of such interesting areas raised new interests that constitute real challenges for future work.
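A minimal sketch of the pipeline described above (TF-IDF indexing, an SVM classifier, and an active-learning loop) is given below using scikit-learn. The messages and labels are hypothetical, and a simple least-confidence query rule is used as a stand-in for d-confidence, which is not reimplemented here.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# Hypothetical messages and folder labels (the real corpus is the user's own mailbox)
emails = ["meeting agenda for monday", "cheap offer buy now", "project report draft",
          "family dinner on sunday", "quarterly budget review", "discount sale today only"]
labels = np.array(["work", "spam", "work", "personal", "work", "spam"])

vec = TfidfVectorizer()
X = vec.fit_transform(emails)                       # indexing phase

labelled = [0, 1]                                   # two examples labelled by the user
unlabelled = [i for i in range(len(emails)) if i not in labelled]

while unlabelled:
    clf = LinearSVC().fit(X[labelled], labels[labelled])
    scores = clf.decision_function(X[unlabelled])
    # Least-confident query selection (a simple stand-in for d-confidence).
    confidence = np.abs(scores) if scores.ndim == 1 else scores.max(axis=1)
    query = unlabelled[int(np.argmin(confidence))]
    labelled.append(query)                          # simulate the user labelling it
    unlabelled.remove(query)

clf = LinearSVC().fit(X[labelled], labels[labelled])
print(clf.predict(vec.transform(["big sale cheap offer"])))
```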
Abstract:
Signal subspace identification is a crucial first step in many hyperspectral processing algorithms such as target detection, change detection, classification, and unmixing. The identification of this subspace enables a correct dimensionality reduction, yielding gains in algorithm performance and complexity and in data storage. This paper introduces a new minimum mean square error-based approach to infer the signal subspace in hyperspectral imagery. The method, which is termed hyperspectral signal identification by minimum error, is eigendecomposition-based, unsupervised, and fully automatic (i.e., it does not depend on any tuning parameters). It first estimates the signal and noise correlation matrices and then selects the subset of eigenvalues that best represents the signal subspace in the least squared error sense. State-of-the-art performance of the proposed method is illustrated using simulated and real hyperspectral images.
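A simplified numpy sketch of the eigendecomposition-based idea follows: estimate the signal and noise correlation matrices, then keep the eigendirections that are worth retaining. This is an illustrative reduction, not the published method's code; the selection rule used here (keep a direction when its estimated signal power exceeds its noise power) and the noise estimate supplied in the example are my simplifying assumptions.

```python
import numpy as np

def simple_subspace_id(Y, W):
    """Y: (n_pixels, n_bands) observed spectra; W: (n_pixels, n_bands) noise estimate.
    Simplified rule: keep an eigendirection of the signal correlation matrix
    when the signal power along it exceeds the noise power along it."""
    n = Y.shape[0]
    Ry = Y.T @ Y / n                     # observed-data correlation
    Rn = W.T @ W / n                     # noise correlation estimate
    Rs = Ry - Rn                         # signal correlation estimate
    vals, vecs = np.linalg.eigh(Rs)      # eigendecomposition (ascending eigenvalues)
    sig_power = np.einsum("ij,jk,ki->i", vecs.T, Rs, vecs)
    noise_power = np.einsum("ij,jk,ki->i", vecs.T, Rn, vecs)
    keep = sig_power > noise_power
    return int(keep.sum()), vecs[:, keep]

# Hypothetical data: 3 endmembers mixed over 1000 pixels in 50 bands, plus noise.
# In practice the noise would be estimated from the data itself (e.g. by regression).
rng = np.random.default_rng(0)
E, A = rng.random((50, 3)), rng.dirichlet(np.ones(3), size=1000)
Y = A @ E.T + 0.01 * rng.standard_normal((1000, 50))
k, basis = simple_subspace_id(Y, 0.01 * rng.standard_normal((1000, 50)))
print("estimated subspace dimension:", k)
```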
Abstract:
The development of a FIA system for the determination of total choline content in several types of milk is described. The samples were submitted to hydrochloric acid digestion before injection into the system and passed through an enzymatic reactor containing choline oxidase immobilised on glass beads. This enzymatic reaction releases hydrogen peroxide which then reacts with a solution of iodide. The decrease in the concentration of iodide ion is quantified using an iodide ion selective tubular electrode based on a homogeneous crystalline membrane. Validation of the results obtained with this system was performed by comparison with results from a method described in the literature and applied to the determination of total choline in milks. The relative deviation was always < 5%. The repeatability of the method developed was assessed by calculation of the relative standard deviation (RSD) for 12 consecutive injections of one sample. The RSD obtained was < 0.6%.
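The repeatability figure quoted above (RSD over 12 consecutive injections) is a simple ratio of standard deviation to mean. A minimal sketch with hypothetical peak readings:

```python
import numpy as np

def relative_standard_deviation(values):
    """RSD (%) = sample standard deviation / mean * 100."""
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean() * 100

# Hypothetical signals from 12 consecutive injections of the same milk sample
injections = [98.2, 98.5, 98.1, 98.4, 98.3, 98.2, 98.6, 98.3, 98.1, 98.4, 98.5, 98.2]
print(f"RSD = {relative_standard_deviation(injections):.2f}%")
```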
Abstract:
A classical application of biosignal analysis has been the psychophysiological detection of deception, also known as the polygraph test, which is currently a part of standard practices of law enforcement agencies and several other institutions worldwide. Although its validity is far from gathering consensus, the underlying psychophysiological principles are still an interesting add-on for more informal applications. In this paper we present an experimental off-the-person hardware setup, propose a set of feature extraction criteria and provide a comparison of two classification approaches, targeting the detection of deception in the context of a role-playing interactive multimedia environment. Our work is primarily targeted at recreational use in the context of a science exhibition, where the main goal is to present basic concepts related with knowledge discovery, biosignal analysis and psychophysiology in an educational way, using techniques that are simple enough to be understood by children of different ages. Nonetheless, this setting will also allow us to build a significant data corpus, annotated with ground-truth information, and collected with non-intrusive sensors, enabling more advanced research on the topic. Experimental results have shown interesting findings and provided useful guidelines for future work.
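The abstract mentions extracting features from biosignals and comparing two classification approaches; the actual features and classifiers are not reproduced here. The sketch below shows a generic comparison of two classifiers (k-NN vs. SVM, my choice) on hypothetical electrodermal-activity feature vectors using scikit-learn cross-validation.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Hypothetical per-question feature vectors (e.g. EDA peak amplitude, rise time,
# mean heart rate) with truth/deception annotations
rng = np.random.default_rng(0)
X_truth = rng.normal([0.3, 1.2, 72], [0.1, 0.3, 5], size=(40, 3))
X_decep = rng.normal([0.5, 0.9, 78], [0.1, 0.3, 5], size=(40, 3))
X = np.vstack([X_truth, X_decep])
y = np.array([0] * 40 + [1] * 40)

for name, clf in [("k-NN", KNeighborsClassifier(5)), ("SVM", SVC(kernel="rbf"))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2%} (+/- {scores.std():.2%})")
```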
Abstract:
The deterioration of water quality by cyanobacteria causes outbreaks and epidemics associated with harmful diseases in humans and animals because of the toxins they release. Microcystin-LR is one of the most widely studied hepatotoxins, and the World Health Organization recommends a maximum value of 1 μg L⁻¹ in drinking water. Highly specific recognition molecules, such as molecularly imprinted polymers, have been developed to quantify microcystins in waters for human use and have shown great potential in the analysis of these kinds of samples. The results obtained were auspicious, with the detection limit found, 1.5 μg L⁻¹, being of the same order of magnitude as the guideline limit recommended by the WHO. This technology is very promising because the sensors are stable and specific, and the technology is inexpensive and allows rapid on-site monitoring.
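A detection limit like the one quoted can be illustrated with the common 3.3·σ/slope calibration rule; this specific formula is my assumption, since the abstract does not state how the limit was obtained, and the calibration data below are hypothetical.

```python
import numpy as np

def limit_of_detection(conc, signal):
    """LOD = 3.3 * (std of regression residuals) / slope of the calibration line."""
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = np.asarray(signal) - (slope * np.asarray(conc) + intercept)
    return 3.3 * residuals.std(ddof=2) / slope

# Hypothetical calibration of a microcystin-LR sensor (concentrations in ug/L)
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
signal = np.array([0.02, 0.48, 0.95, 2.40, 4.85])
print(f"estimated LOD ~ {limit_of_detection(conc, signal):.2f} ug/L")
```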
Abstract:
A multiresidue approach using microwave-assisted extraction and liquid chromatography with photodiode array detection was investigated for the determination of butylate, carbaryl, carbofuran, chlorpropham, ethiofencarb, linuron, metobromuron, and monolinuron in soils. The critical parameters of the developed methodology were studied. Method validation was performed by analyzing freshly spiked and aged spiked soil samples. The recoveries and relative standard deviations reached using the optimized conditions were between 77.0 ± 0.46% and 120 ± 2.9%, except for ethiofencarb (46.4 ± 4.4% to 105 ± 1.6%) and butylate (22.1 ± 7.6% to 49.2 ± 11%). Soil samples from five locations in Portugal were analysed.
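Recovery figures like those reported above are the ratio of measured to spiked amount across replicates; a minimal sketch with hypothetical spike data:

```python
import numpy as np

def recovery_stats(spiked_amount, measured_amounts):
    """Mean recovery (%) and RSD (%) for replicate analyses of a spiked soil sample."""
    rec = 100 * np.asarray(measured_amounts) / spiked_amount
    return rec.mean(), rec.std(ddof=1) / rec.mean() * 100

# Hypothetical replicates for a carbaryl spike of 0.50 mg/kg
mean_rec, rsd = recovery_stats(0.50, [0.46, 0.48, 0.47])
print(f"recovery = {mean_rec:.1f}% (RSD {rsd:.1f}%)")
```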
Abstract:
Electroanalytical methods based on square-wave adsorptive-stripping voltammetry (SWAdSV) and flow-injection analysis with square-wave adsorptive-stripping voltammetric detection (FIA-SWAdSV) were developed for the determination of fluoxetine (FXT). The methods were based on the reduction of FXT at a mercury drop electrode at -1.2 V versus Ag/AgCl, in a phosphate buffer of pH 12.0, and on the possibility of accumulating the compound at the electrode surface. The SWAdSV method was successfully applied in the quantification of FXT in pharmaceutical products, human serum samples, and in drug dissolution studies. Because the presence of dissolved oxygen did not interfere significantly with the analysis, it was possible to quantify FXT in several pharmaceutical products using FIA-SWAdSV. This method enables analysis of up to 120 samples per hour at reduced costs.
Abstract:
Conference: 39th Annual Conference of the IEEE Industrial Electronics Society (IECON), Vienna, Austria, Nov 10-14, 2013
Abstract:
On the basis of its electrochemical behaviour, a new flow-injection analysis (FIA) method with amperometric detection has been developed for quantification of the herbicide bentazone (BTZ) in estuarine waters. Standard solutions and samples (200 µL) were injected into a water carrier stream, and both pH and ionic strength were automatically adjusted inside the manifold. Optimization of critical FIA conditions indicated that the best analytical results were obtained at an oxidation potential of 1.10 V, pH 4.5, and an overall flow rate of 2.4 mL min⁻¹. Analysis of real samples was performed by means of calibration curves over the concentration range 2.5×10⁻⁶ to 5.0×10⁻⁵ mol L⁻¹, and results were compared with those obtained by use of an independent method (HPLC). The accuracy of the amperometric determinations was ascertained; errors relative to the comparison method were below 4%, and sampling rates were approximately 100 samples h⁻¹. The repeatability of the proposed method was calculated by assessing the relative standard deviation (%) of ten consecutive determinations of one sample; the value obtained was 2.1%.
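Quantification against a calibration curve and comparison with an independent method, as described above, reduces to a linear fit followed by a relative-deviation check. The sketch below uses hypothetical currents and an assumed HPLC reference value, not the paper's data.

```python
import numpy as np

# Hypothetical calibration: peak current (uA) vs. bentazone concentration (mol/L)
conc = np.array([2.5e-6, 5.0e-6, 1.0e-5, 2.5e-5, 5.0e-5])
current = np.array([0.21, 0.43, 0.85, 2.10, 4.22])
slope, intercept = np.polyfit(conc, current, 1)

sample_current = 1.30                       # hypothetical reading for an estuarine sample
c_fia = (sample_current - intercept) / slope
c_hplc = 1.58e-5                            # hypothetical value from the comparison method
relative_error = abs(c_fia - c_hplc) / c_hplc * 100
print(f"FIA result: {c_fia:.2e} mol/L, relative error vs HPLC: {relative_error:.1f}%")
```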