Abstract:
In this paper, we propose two active learning algorithms for the semiautomatic definition of training samples in remote sensing image classification. Based on predefined heuristics, the classifier ranks the unlabeled pixels and automatically chooses those considered most valuable for its improvement. Once the pixels have been selected, the analyst labels them manually and the process is iterated. Starting with a small and nonoptimal training set, the model itself builds the optimal set of samples, which minimizes the classification error. We have applied the proposed algorithms to a variety of remote sensing data, including very high resolution and hyperspectral images, using support vector machines. Experimental results confirm the consistency of the methods. With the proposed methods, the required number of training samples can be reduced to 10% while reaching the same level of accuracy as larger data sets. A comparison with a state-of-the-art active learning method, margin sampling, is provided, highlighting the advantages of the proposed methods. The effect of spatial resolution and class separability on the quality of the pixel selection is also discussed.
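A minimal sketch of the margin-sampling style of pixel ranking described above, assuming a scikit-learn SVM and NumPy arrays of labeled and pooled pixel spectra (function and variable names are illustrative, not the authors' code):

    import numpy as np
    from sklearn.svm import SVC

    def select_pixels_to_label(X_train, y_train, X_pool, n_select=10):
        """Rank unlabeled pixels by closeness to the SVM decision boundary."""
        clf = SVC(kernel="rbf", gamma="scale", C=10.0)
        clf.fit(X_train, y_train)
        margins = np.abs(clf.decision_function(X_pool))
        if margins.ndim > 1:                   # multiclass: keep the smallest per-class margin
            margins = margins.min(axis=1)
        return np.argsort(margins)[:n_select]  # indices of the most informative pixels

At each iteration the returned pixels are labeled by the analyst, moved into the training set, and the SVM is retrained.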
Abstract:
We present an open-source ITK implementation of a direct Fourier method for tomographic reconstruction, applicable to parallel-beam x-ray images. Direct Fourier reconstruction makes use of the central-slice theorem to build a polar 2D Fourier space from the 1D transformed projections of the scanned object, which is then resampled onto a Cartesian grid. An inverse 2D Fourier transform eventually yields the reconstructed image. Additionally, we provide a complex wrapper to the BSplineInterpolateImageFunction to overcome ITK's current lack of image interpolators dealing with complex data types. A sample application is presented and extensively illustrated on the Shepp-Logan head phantom. We show that appropriate input zero-padding and 2D-DFT oversampling rates, together with radial cubic b-spline interpolation, improve 2D-DFT interpolation quality and are efficient remedies to reduce reconstruction artifacts.
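A rough, self-contained sketch of the same reconstruction pipeline in NumPy/SciPy rather than ITK (simple linear resampling stands in for the paper's radial cubic b-spline interpolation; all names are illustrative):

    import numpy as np
    from scipy.interpolate import griddata

    def direct_fourier_reconstruct(sinogram, angles_deg, pad=4):
        """Toy direct Fourier reconstruction of a parallel-beam sinogram
        of shape (n_angles, n_detectors)."""
        n_ang, n_det = sinogram.shape
        n_pad = pad * n_det                        # zero-padding / oversampling
        # 1D FFT of each padded projection = one radial line of the 2D spectrum
        proj_ft = np.fft.fftshift(np.fft.fft(sinogram, n=n_pad, axis=1), axes=1)
        freqs = np.fft.fftshift(np.fft.fftfreq(n_pad))
        theta = np.deg2rad(angles_deg)[:, None]
        u, v = freqs * np.cos(theta), freqs * np.sin(theta)   # polar sample positions
        grid = np.linspace(freqs.min(), freqs.max(), n_det)
        U, V = np.meshgrid(grid, grid)
        # Resample real and imaginary parts separately onto the Cartesian grid
        # (the complex-interpolation gap the paper's wrapper addresses)
        re = griddata((u.ravel(), v.ravel()), proj_ft.real.ravel(), (U, V), method="linear", fill_value=0.0)
        im = griddata((u.ravel(), v.ravel()), proj_ft.imag.ravel(), (U, V), method="linear", fill_value=0.0)
        # Inverse 2D FFT of the resampled spectrum yields the reconstructed slice
        return np.real(np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(re + 1j * im))))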
Abstract:
We investigated the relationship between being bullied and measured body weight and perceived body weight among adolescents of a middle-income sub-Saharan African country. Our data originated from the Global School-based Health Survey, which targets adolescents aged 13-15 years. Student weights and heights were measured before administering the questionnaire, which included questions about personal data, health behaviors and being bullied. Standard criteria were used to assess thinness, overweight and obesity. Among 1,006 participants who had complete data, 16.5% (95% CI 13.3-20.2) reported being bullied ≥ 3 days during the past 30 days; 13.4% were thin, 16.8% were overweight and 7.6% were obese. Categories of actual weight and of perceived weight correlated only moderately (Spearman correlation coefficient 0.37 for boys and 0.57 for girls; p < 0.001). In univariate analysis, both actual obesity (OR 1.76; p = 0.051) and perception of high weight (OR 1.63 for "slightly overweight"; OR 2.74 for "very overweight", both p < 0.05) were associated with being bullied. In multivariate analysis, ORs for the categories of perceived overweight were virtually unchanged, while ORs for actual overweight and obesity were substantially attenuated, suggesting a substantial role of perceived weight in the association with being bullied. Actual underweight and perceived thinness also tended to be associated with being bullied, although not significantly. Our findings suggest that more research attention be given to disentangling the significant association between body image, overweight and bullying among adolescents. Further studies in diverse populations are warranted.
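The univariate and multivariate odds ratios reported above correspond to logistic regression with and without mutual adjustment; a minimal sketch with statsmodels, using illustrative column names rather than the survey's actual coding:

    import numpy as np
    import statsmodels.formula.api as smf

    def odds_ratios(df, formula):
        """Fit a logistic regression and return exponentiated coefficients (ORs)."""
        fit = smf.logit(formula, data=df).fit(disp=False)
        return np.exp(fit.params)

    # Univariate association, one predictor at a time:
    #   odds_ratios(df, "bullied ~ obese")
    # Mutually adjusted ("multivariate") model:
    #   odds_ratios(df, "bullied ~ obese + perceived_overweight + sex + age")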
Abstract:
Three-dimensional imaging for the quantification of myocardial motion is a key step in the evaluation of cardiac disease. A tagged magnetic resonance imaging method that automatically tracks myocardial displacement in three dimensions is presented. Unlike other techniques, this method tracks both in-plane and through-plane motion from a single image plane without affecting the duration of image acquisition. A small z-encoding gradient is subsequently added to the refocusing lobe of the slice-selection gradient pulse in a slice-following CSPAMM acquisition. A z-encoding gradient of opposite polarity is added for the orthogonal tag direction. The additional z-gradients encode the instantaneous through-plane position of the slice. The vertical and horizontal tags are used to resolve in-plane motion, while the added z-gradients are used to resolve through-plane motion. Postprocessing automatically decodes the acquired data and tracks the three-dimensional displacement of every material point within the image plane for each cine frame. Experiments include both phantom and in vivo human validation. These studies demonstrate that the simultaneous extraction of both in-plane and through-plane displacements and pathlines from tagged images is achievable. This capability should open up new avenues for the automatic quantification of cardiac motion and strain for scientific and clinical purposes.
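A deliberately simplified numerical illustration of the z-encoding principle, assuming the in-plane tag phase has already been separated out so that only the opposite-polarity z-encoding terms remain (kappa and the positions are made-up values, not the paper's reconstruction):

    import numpy as np

    kappa = 2 * np.pi / 10.0                 # assumed z-encoding frequency [rad/mm]
    z_true = np.array([1.2, -0.4, 3.0])      # through-plane positions of three material points [mm]

    # Opposite-polarity encodings add +kappa*z and -kappa*z to the two tag-image phases
    rng = np.random.default_rng(0)
    phi_plus = +kappa * z_true + 0.01 * rng.standard_normal(3)
    phi_minus = -kappa * z_true + 0.01 * rng.standard_normal(3)

    # Their half-difference recovers the instantaneous through-plane position
    z_est = (phi_plus - phi_minus) / (2 * kappa)
    print(np.round(z_est, 2))                # approximately recovers z_true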
Abstract:
We present a novel approach for analyzing single-trial electroencephalography (EEG) data using topographic information. The method allows event-related potentials to be visualized using all the recording electrodes, overcoming the limitation of previous approaches that required electrode selection and waveform filtering. We apply this method to EEG data from an auditory object recognition experiment that we have previously analyzed at an ERP level. Temporally structured periods in which a given topography predominated were statistically identified, without any prior information about the temporal behavior. In addition to providing novel methods for EEG analysis, the data indicate that ERPs are reliably observable at the single-trial level when examined topographically.
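One common way to implement this kind of topographic analysis is to cluster the scalp maps and then fit the resulting template topographies back to every single-trial time point by spatial correlation; the sketch below follows that idea (it is not the authors' exact pipeline, and all parameters are illustrative):

    import numpy as np
    from sklearn.cluster import KMeans

    def dominant_topographies(eeg, n_maps=4):
        """eeg: (n_times, n_electrodes). Returns template maps and, per time
        point, the index of the best-matching template."""
        # Average-reference and normalise each map so clustering is driven by
        # the spatial pattern rather than by amplitude
        maps = eeg - eeg.mean(axis=1, keepdims=True)
        maps = maps / (np.linalg.norm(maps, axis=1, keepdims=True) + 1e-12)
        templates = KMeans(n_clusters=n_maps, n_init=10, random_state=0).fit(maps).cluster_centers_
        # Spatial correlation between every time point and every template map
        corr = (maps @ templates.T) / (np.linalg.norm(templates, axis=1) + 1e-12)
        labels = np.argmax(np.abs(corr), axis=1)
        return templates, labels

Runs of time points carrying the same label mark the periods during which one topography predominates.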
Abstract:
A semisupervised support vector machine is presented for the classification of remote sensing images. The method exploits the wealth of unlabeled samples to regularize the training kernel representation locally by means of cluster kernels. The method learns a suitable kernel directly from the image and thus avoids the a priori signal relations assumed by a predefined kernel structure. Good results are obtained in image classification examples when few labeled samples are available. The method scales almost linearly with the number of unlabeled samples and provides out-of-sample predictions.
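A minimal sketch of one common cluster-kernel construction in this spirit: a convex combination of an RBF kernel and a same-cluster indicator derived from k-means run on labeled plus unlabeled pixels (parameters and names are illustrative, not the paper's exact formulation):

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics.pairwise import rbf_kernel
    from sklearn.svm import SVC

    def fit_clusters(X_all, n_clusters=20):
        """Cluster all available pixels (labeled + unlabeled) once; the clustering
        captures the image structure used to regularize the kernel."""
        return KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X_all)

    def cluster_kernel(X_a, X_b, km, gamma=0.1, weight=0.5):
        """Composite kernel: RBF spectral similarity + same-cluster indicator."""
        same = (km.predict(X_a)[:, None] == km.predict(X_b)[None, :]).astype(float)
        return (1 - weight) * rbf_kernel(X_a, X_b, gamma=gamma) + weight * same

    # Usage with illustrative arrays X_lab, y_lab, X_unlab, X_test:
    #   km = fit_clusters(np.vstack([X_lab, X_unlab]))
    #   svm = SVC(kernel="precomputed").fit(cluster_kernel(X_lab, X_lab, km), y_lab)
    #   y_pred = svm.predict(cluster_kernel(X_test, X_lab, km))

Because both terms are positive semidefinite kernels, their convex combination can be passed to the SVM as a precomputed kernel.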
Abstract:
Visualization techniques are used in judicial investigations to facilitate the handling of large cases. The relevant elements of the investigation are represented by diagrams describing the relationships between the events and entities of interest. Classic applications of these techniques, which amount to building graphs, include the representation of criminal networks, trafficking of goods, and timelines of events, as well as the visualization of telephone and financial relationships. In this context, visualization supports a large number of objectives, such as analyzing the traces and information collected, evaluating an investigation after the fact, helping to qualify offenses, facilitating the comprehension of a case and decision-making during an investigation, and even supporting an argument at trial. Practice relies on simple software tools that produce elegant and often striking graphics. This research tends to show that there are surprising disparities in how these techniques are applied. Reasoning and perception biases can be induced, going as far as causing decisions with sometimes disastrous consequences. To highlight these difficulties, evaluations were carried out with practitioners and students. They made it possible to establish an empirical picture of the extent of the variation in how the representations are designed and interpreted, as well as of their impact on decision-making. The nature and diversity of the concepts to be represented, the lack of consensus on how to represent the data, the diversity of possible visual solutions, the constraints imposed by the tools used, and the absence of a clear formalization of the language are all presumed causes of these difficulties. This finding reveals the need to consolidate the methods in use.