232 results for Preprocessing


Relevance: 10.00%

Publisher:

Abstract:

Introduction: Depression is a major issue worldwide and is seen as a significant health problem. Stigma and patient denial, clinical experience, time limitations, and the reliability of psychometrics are barriers to the clinical diagnosis of depression. Thus, an automated system that could detect such abnormalities would assist medical experts in their decision-making process. This paper reviews existing methods for the automated detection of depression from structural magnetic resonance images (sMRI) of the brain. Methods: Relevant sources were identified from various databases and online sites using a combination of keywords and terms including depression, major depressive disorder, detection, classification, and MRI databases. Reference lists of chosen articles were further reviewed for associated publications. Results: The paper introduces a generic structure for representing and describing the methods developed for the detection of depression from sMRI of the brain. It consists of a number of components, including acquisition and preprocessing, feature extraction, feature selection, and classification. Conclusion: Automated sMRI-based detection methods have the potential to provide an objective measure of depression, hence improving the confidence level in the diagnosis and prognosis of depression.
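
As a rough illustration of this generic structure, the sketch below chains feature scaling, feature selection, and classification on a placeholder feature matrix; the library, feature counts, and classifier choice are illustrative assumptions, not those of any reviewed study.

    # Minimal sketch of the generic detection structure: (already preprocessed)
    # features -> feature selection -> classification, evaluated by cross-validation.
    import numpy as np
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 500))       # placeholder features from preprocessed sMRI volumes
    y = rng.integers(0, 2, size=60)      # placeholder depressed/control labels

    pipeline = Pipeline([
        ("scale", StandardScaler()),               # normalise each feature
        ("select", SelectKBest(f_classif, k=50)),  # keep the 50 most discriminative features
        ("classify", SVC(kernel="linear")),        # binary classifier
    ])
    print("cross-validated accuracy:", cross_val_score(pipeline, X, y, cv=5).mean())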

Relevance: 10.00%

Publisher:

Abstract:

The objective of this work is to recognize faces using sets of images in the visual and thermal spectra. This is challenging because the former is greatly affected by illumination changes, while the latter frequently contains occlusions due to eye-wear and is inherently less discriminative. Our method is based on a fusion of the two modalities. Specifically, we (i) examine the effects of preprocessing the data in each domain, (ii) examine the fusion of holistic and local facial appearance, and (iii) propose an algorithm for combining the similarity scores in the visual and thermal spectra in the presence of prescription glasses and significant pose variations, using a small number of training images (5-7). Our system achieved a high correct identification rate of 97% on a freely available test set of 29 individuals with extreme illumination changes.
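
A minimal sketch of combining per-gallery similarity scores from the two modalities by normalised weighted averaging; the normalisation and weighting are illustrative assumptions, not the algorithm proposed in the paper.

    # Score-level fusion of visual and thermal similarity scores.
    import numpy as np

    def _minmax(x):
        """Scale a score vector to [0, 1] so the two modalities are comparable."""
        return (x - x.min()) / (x.max() - x.min() + 1e-12)

    def fuse_scores(sim_visual, sim_thermal, w_visual=0.5):
        """sim_visual, sim_thermal: similarities between one probe and each
        enrolled person (same ordering); returns the fused scores."""
        return w_visual * _minmax(sim_visual) + (1.0 - w_visual) * _minmax(sim_thermal)

    # Identification: the gallery entry with the best fused score is the match.
    sim_v = np.array([0.2, 0.9, 0.4])
    sim_t = np.array([0.3, 0.7, 0.6])
    print("predicted identity:", int(np.argmax(fuse_scores(sim_v, sim_t))))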

Relevance: 10.00%

Publisher:

Abstract:

Recognition algorithms that use data obtained by imaging faces in the thermal spectrum are promising for achieving invariance to the extreme illumination changes that are often present in practice. In this paper we analyze the performance of a recently proposed face recognition algorithm that combines visual and thermal modalities by decision-level fusion. We examine (i) the effects of the proposed data preprocessing in each domain, (ii) the contribution of different types of features to improved recognition, and (iii) the importance of prescription glasses detection, in the context of both 1-to-N and 1-to-1 matching (recognition vs. verification performance). Finally, we discuss the significance of our results and, in particular, identify a number of limitations of the current state of the art and propose promising directions for future research.
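
A minimal sketch of how the two evaluation settings mentioned above differ, given a probe-by-gallery similarity matrix; the metrics shown (rank-1 rate, true/false accept rates at a threshold) are standard stand-ins, not the paper's exact protocol.

    # 1-to-N identification vs. 1-to-1 verification from a similarity matrix.
    import numpy as np

    def rank1_rate(scores, probe_labels, gallery_labels):
        """scores: (n_probes, n_gallery) similarities; labels are numpy arrays."""
        best = np.argmax(scores, axis=1)                     # best gallery match per probe
        return float(np.mean(gallery_labels[best] == probe_labels))

    def verification_rates(scores, probe_labels, gallery_labels, threshold):
        genuine = probe_labels[:, None] == gallery_labels[None, :]
        accept = scores >= threshold
        return accept[genuine].mean(), accept[~genuine].mean()  # (TAR, FAR)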

Relevance: 10.00%

Publisher:

Abstract:

Nonnegative matrix factorization (NMF) is a widely used method for blind spectral unmixing (SU), which aims at obtaining the endmembers and corresponding fractional abundances, knowing only the collected mixing spectral data. It is noted that the abundances may be sparse (i.e., the endmembers may have sparse distributions) and that sparse NMF tends to lead to a unique result, so it is intuitive and meaningful to constrain NMF with sparseness when solving SU. However, due to the abundance sum-to-one constraint in SU, the traditional sparseness measured by the L0/L1-norm is no longer an effective constraint. A novel measure of sparseness (termed the S-measure), based on higher-order norms of the signal vector, is proposed in this paper; it has a clear physical significance. Using the S-measure constraint (SMC), a gradient-based sparse NMF algorithm (termed NMF-SMC) is proposed for solving the SU problem, in which the learning rate is adaptively selected and the endmembers and abundances are estimated simultaneously. The proposed NMF-SMC requires no pure index assumption and no prior knowledge of the exact sparseness degree of the abundances. Moreover, it does not require a dimension-reduction preprocessing step, in which some useful information may be lost. Experiments based on synthetic mixtures and real-world images collected by the AVIRIS and HYDICE sensors are performed to evaluate the validity of the proposed method.
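
For orientation, the sketch below shows a generic multiplicative-update NMF for unmixing with a crude abundance sum-to-one renormalisation after each update; it is not the paper's NMF-SMC (the S-measure constraint and adaptive learning rate are omitted), only the baseline it builds on.

    # Generic NMF unmixing: X (bands x pixels) ~= W (endmembers) @ H (abundances).
    import numpy as np

    def unmix_nmf(X, p, n_iter=500, eps=1e-9):
        rng = np.random.default_rng(0)
        W = rng.random((X.shape[0], p))                    # endmember spectra
        H = rng.random((p, X.shape[1]))                    # fractional abundances
        for _ in range(n_iter):
            H *= (W.T @ X) / (W.T @ W @ H + eps)           # Lee-Seung multiplicative updates
            W *= (X @ H.T) / (W @ H @ H.T + eps)
            H /= H.sum(axis=0, keepdims=True) + eps        # crude sum-to-one projection
        return W, H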

Relevance: 10.00%

Publisher:

Abstract:

An automated sMRI-based depression detection system is developed whose components include acquisition and preprocessing, feature extraction, feature selection, and classification. The core focus of the research is the establishment of a new feature selection algorithm that quantifies the most relevant brain volumetric feature for depression detection at the individual level.
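
As a point of reference only, a standard univariate ranking of volumetric features is sketched below; it is not the feature selection algorithm proposed in this work, merely the kind of baseline such an algorithm would be compared against.

    # Rank regional brain volumes by group separation (Welch t-test).
    import numpy as np
    from scipy import stats

    def rank_volumetric_features(X, y, top_k=10):
        """X: (subjects, regions) volume matrix; y: 0/1 labels.
        Returns indices of the top_k regions with the largest |t| statistic."""
        t, _ = stats.ttest_ind(X[y == 1], X[y == 0], axis=0, equal_var=False)
        return np.argsort(-np.abs(t))[:top_k]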

Relevance: 10.00%

Publisher:

Abstract:

Lung segmentation in thoracic computed tomography (CT) scans is an important preprocessing step for computer-aided diagnosis (CAD) of lung diseases. This paper focuses on the segmentation of the lung field in thoracic CT images. Traditional lung segmentation is based on gray-level thresholding techniques, which often require setting a threshold and are sensitive to image contrast. In this paper, we present a fully automated method for robust and accurate lung segmentation, which includes an enhanced thresholding algorithm and a refinement scheme based on a texture-aware active contour model. In our thresholding algorithm, a histogram-based image stretch technique is performed in advance to uniformly increase the contrast between areas with low Hounsfield unit (HU) values and areas with high HU values in all CT images. This stretch step enables the following threshold-free segmentation, namely the Otsu algorithm with contour analysis. However, as a threshold-based segmentation, it has common issues such as holes, noise, and inaccurate segmentation boundaries that would cause problems in subsequent CAD for lung disease detection. To solve these problems, a refinement technique is proposed that captures vessel structures and lung boundaries and then smooths variations via a texture-aware active contour model. Experiments on 2,342 diagnostic CT images demonstrate the effectiveness of the proposed method. Performance comparison with existing methods shows the advantages of our method.
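
A minimal sketch of the initial, threshold-based stage described above (histogram stretch followed by Otsu thresholding and simple hole/noise cleanup); the texture-aware active contour refinement is not shown, and the percentile stretch and size parameters are illustrative.

    # Initial lung candidate mask for one CT slice given in Hounsfield units.
    import numpy as np
    from skimage import exposure, filters, morphology, measure

    def initial_lung_mask(ct_slice_hu):
        p2, p98 = np.percentile(ct_slice_hu, (2, 98))
        stretched = exposure.rescale_intensity(ct_slice_hu, in_range=(p2, p98))  # histogram stretch
        mask = stretched < filters.threshold_otsu(stretched)        # air/lung voxels are dark
        mask = morphology.remove_small_objects(mask, min_size=500)  # suppress noise
        mask = morphology.remove_small_holes(mask, area_threshold=500)
        return measure.label(mask)                                   # connected lung candidates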

Relevance: 10.00%

Publisher:

Abstract:

The motivation for this work was the need for a software architecture that supports the development of a SCADA supervisory system for monitoring simulated industrial processes, with the flexibility of adding intelligent modules and devices such as programmable logic controllers (PLCs) according to the specifications of the problem. In the present study, we developed an intelligent supervisory system on top of a simulation of a distillation column modeled in Unisim. OLE Automation was used for communication between the supervisory and simulation software, which, together with the use of a database, resulted in an architecture that is both scalable and easy to maintain. Moreover, intelligent modules were developed for preprocessing, extraction of data characteristics, and variable inference. These modules were fundamentally based on the Encog software.

Relevance: 10.00%

Publisher:

Abstract:

Brain-Computer Interfaces (BCIs) have as their main purpose to establish a communication path with the central nervous system (CNS) that is independent of the standard pathways (nerves, muscles), with the aim of controlling a device. The main objective of the current research is to develop an off-line BCI that separates the different EEG patterns resulting from strictly mental tasks performed by an experimental subject, comparing the effectiveness of different signal-preprocessing approaches. We also tested different classification approaches: all versus all, one versus one, and a hierarchic classification approach. No preprocessing technique was found to improve the system performance. Furthermore, the hierarchic approach proved capable of producing results above those expected from the literature.
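
A minimal sketch of standard multi-class decomposition schemes of the kind compared above (one-versus-one, often called all-versus-all, with one-versus-rest shown for contrast); the EEG features are synthetic placeholders, and the study's actual preprocessing, classifiers, and hierarchic scheme are not reproduced.

    # Compare multi-class decomposition schemes on placeholder EEG feature vectors.
    import numpy as np
    from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 32))        # placeholder feature vectors (e.g., band powers)
    y = rng.integers(0, 4, size=200)      # four mental-task classes

    for scheme in (OneVsOneClassifier(LinearSVC()), OneVsRestClassifier(LinearSVC())):
        acc = cross_val_score(scheme, X, y, cv=5).mean()
        print(type(scheme).__name__, "accuracy:", round(float(acc), 3))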

Relevance: 10.00%

Publisher:

Abstract:

To prevent large errors in GPS positioning, cycle slips should be detected and corrected. Such a procedure is not trivial, mainly for single-frequency receivers, but it normally goes unnoticed by users. Thus, some practical and widely used methods for cycle slip detection and correction using only GPS single-frequency observations are discussed. For detection, triple differences (TD) and tetra differences were used. Regarding correction, in general each slip is corrected during preprocessing; otherwise, other strategies must be adopted during the processing. In this paper the second option was chosen, and two strategies were tested. In one of them, the elements of the covariance matrix of the involved ambiguities are modified and a new ambiguity estimation is started. In the other, a new ambiguity is introduced as an additional unknown when a cycle slip is detected. These possibilities are discussed and compared in this paper, as well as aspects related to the practicality, implementation, and viability of each one. Experiments were carried out using simulated data with cycle slips in different satellites and epochs. This allowed the detection and correction results to be assessed and compared for different occurrences of cycle slips under several conditions.
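
A toy illustration of detecting a cycle slip in a single-frequency carrier-phase series by thresholding high-order epoch-to-epoch differences, in the spirit of the triple/tetra differences mentioned above; the difference order, threshold, and simulated data are arbitrary assumptions.

    # Flag epochs whose high-order time differences exceed a threshold (in cycles).
    import numpy as np

    def detect_cycle_slips(phase_cycles, order=3, threshold=0.5):
        diffs = np.diff(phase_cycles, n=order)      # differencing damps the smooth geometry term
        return np.where(np.abs(diffs) > threshold)[0] + order

    phase = np.cumsum(np.full(100, 0.05))           # smooth simulated carrier phase
    phase[60:] += 3.0                               # inject a 3-cycle slip at epoch 60
    print("epochs flagged near the slip:", detect_cycle_slips(phase))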

Relevance: 10.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 10.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 10.00%

Publisher:

Abstract:

INTRODUCTION: The comet assay, or single-cell gel electrophoresis technique, is widely used to evaluate DNA damage and repair in individual cells. The material can be stained with fluorescence techniques or with silver salt. The latter offers technical advantages, such as the type of microscope required and the possibility of storing the slides. Comet analysis can be performed visually, but with the disadvantage of subjective results, which can be minimized by automated digital analysis. OBJECTIVES: To develop and validate a method for the digital analysis of silver-stained comets. METHODS: Fifty comets were photographed in a standardized manner and printed on paper. Besides being measured manually, these images were classified into five categories by three evaluators, before and after automatic preprocessing with the ImageJ 1.38x software. The estimates produced by the evaluators were compared in terms of correlation and reproducibility. Next, algorithms for the digital analysis of the measurements were developed, based on median and minimum statistical filters. The values obtained were compared with those estimated manually and visually after preprocessing. RESULTS: Manual measurements of the preprocessed images showed higher intraclass correlation than those of the original images. The automated parameters showed high correlation with the manual measurements of the preprocessed images, suggesting that this system increases the objectivity of the analysis and can be used to estimate comet parameters. CONCLUSION: The proposed digital analysis for the silver-stained comet assay proved feasible and more reproducible than visual analysis.
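
A minimal sketch of measuring a comet image after median/minimum filtering, using generic scipy filters; this is an illustrative stand-in, not the ImageJ-based algorithms actually developed and validated in the study, and the threshold rule is an assumption.

    # Rough comet length estimate from a grayscale photograph of one comet.
    import numpy as np
    from scipy import ndimage

    def comet_extent(gray_image, median_size=5):
        filtered = ndimage.median_filter(gray_image, size=median_size)   # suppress speckle noise
        background = ndimage.minimum_filter(filtered, size=15)           # local background estimate
        mask = (filtered - background) > filtered.std()                  # crude foreground threshold
        cols = np.where(mask.any(axis=0))[0]                             # columns containing comet pixels
        return 0 if cols.size == 0 else int(cols[-1] - cols[0] + 1)      # extent along migration axis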

Relevance: 10.00%

Publisher:

Abstract:

This paper describes an interactive environment built entirely upon public domain or free software, intended to be used as the preprocessor of a finite element package for the simulation of three-dimensional electromagnetic problems.

Relevance: 10.00%

Publisher:

Abstract:

The edge detection model based on nonlinear anisotropic diffusion is a mathematical smoothing model based on a Partial Differential Equation (PDE), an alternative to conventional low-pass filters. The smoothing is a selective process in which homogeneous areas of the image are smoothed intensely, in agreement with the temporal evolution applied to the model. The level of smoothing is related to the amount of undesired information contained in the image, i.e., the model is directly related to the optimal level of smoothing, eliminating the undesired information while selectively keeping the features of interest for the cartography area. The model is essential for cartographic applications: its function is to carry out the image preprocessing without losing edges and other important details in the image, mainly airport runways and paved roads. Experiments carried out with digital images showed that the methodology allows such features, e.g. airport runways, to be extracted efficiently.
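
A minimal sketch of Perona-Malik-style nonlinear anisotropic diffusion, the family of PDE-based smoothing the abstract refers to; the number of iterations, time step, and edge-stopping constant kappa are illustrative choices, not the parameters used in the experiments.

    # Edge-preserving smoothing: diffusion is damped where gradients are strong.
    import numpy as np

    def anisotropic_diffusion(img, n_iter=20, kappa=30.0, dt=0.15):
        u = img.astype(float).copy()
        g = lambda d: np.exp(-(d / kappa) ** 2)     # diffusion coefficient, ~0 across edges
        for _ in range(n_iter):
            dn = np.roll(u, -1, axis=0) - u         # differences toward the four neighbours
            ds = np.roll(u, 1, axis=0) - u
            de = np.roll(u, -1, axis=1) - u
            dw = np.roll(u, 1, axis=1) - u
            u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
        return u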