994 results for Segmentation techniques


Relevance: 20.00%

Abstract:

As the morphological determination of living individuals of the two sibling species S. araneus and S. coronatus is not possible, we have tested two biochemical methods to determine these shrews in ecological studies. After raising specific antibodies in rabbits, we performed an immunological test on 25 individuals. With this first method, a correct determination was achieved in only 76% of the cases. The second method proved very successful: polyacrylamide gel electrophoresis showed a systematic difference in albumin (73 individuals analyzed). In our experience, the necessary blood sampling (10-20 μl) seems harmless to the shrews.

Relevance: 20.00%

Abstract:

In 2000 the European Statistical Office published the guidelines for developing the Harmonized European Time Use Surveys system. Under such a unified framework, the first Time Use Survey of national scope was conducted in Spain during 2002-03. The aim of these surveys is to understand human behavior and the lifestyle of people. Time allocation data are of compositional nature in origin, that is, they are subject to non-negativity and constant-sum constraints. Thus, standard multivariate techniques cannot be directly applied to analyze them. The goal of this work is to identify homogeneous Spanish Autonomous Communities with regard to the typical activity pattern of their respective populations. To this end, a fuzzy clustering approach is followed. Rather than the hard partitioning of classical clustering, where objects are allocated to only a single group, fuzzy methods identify overlapping groups of objects by allowing them to belong to more than one group. Concretely, the probabilistic fuzzy c-means algorithm is conveniently adapted to deal with the Spanish Time Use Survey microdata. As a result, a map distinguishing Autonomous Communities with similar activity patterns is drawn.

Key words: time use data; fuzzy clustering; FCM; simplex space; Aitchison distance
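
As a minimal illustration of the kind of analysis described above (and not the authors' adapted algorithm or data), the sketch below runs a standard probabilistic fuzzy c-means on clr-transformed compositions, so that Euclidean distances in the transformed space correspond to Aitchison distances on the simplex. The number of clusters, the fuzzifier m and the toy data are all assumptions.

```python
# Hedged sketch: standard fuzzy c-means on clr-transformed compositions.
import numpy as np

def clr(x):
    """Centered log-ratio transform of compositions (rows sum to a constant)."""
    logx = np.log(x)
    return logx - logx.mean(axis=1, keepdims=True)

def fuzzy_cmeans(Z, c=3, m=2.0, n_iter=100, seed=0):
    """Probabilistic fuzzy c-means; returns the membership matrix U and centers."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=Z.shape[0])          # n x c soft memberships
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ Z) / W.sum(axis=0)[:, None]
        d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(-1) + 1e-12
        inv = 1.0 / d2 ** (1.0 / (m - 1))
        U = inv / inv.sum(axis=1, keepdims=True)             # standard FCM update
    return U, centers

# Toy usage: 100 daily time-allocation vectors over 5 activities (proportions).
rng = np.random.default_rng(1)
comps = rng.dirichlet(np.ones(5), size=100)
U, centers = fuzzy_cmeans(clr(comps), c=3)
print(U[:3].round(2))      # soft memberships of the first three observations
```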

Relevance: 20.00%

Abstract:

A novel technique for estimating the rank of the trajectory matrix in the local subspace affinity (LSA) motion segmentation framework is presented. This new rank estimation is based on the relationship between the estimated rank of the trajectory matrix and the affinity matrix built with LSA. The result is an enhanced model selection technique for trajectory matrix rank estimation, by which it is possible to automate LSA, without requiring any a priori knowledge, and to improve the final segmentation.

Relevance: 20.00%

Abstract:

In this paper, a novel rank estimation technique for trajectory-based motion segmentation within the Local Subspace Affinity (LSA) framework is presented. This technique, called Enhanced Model Selection (EMS), is based on the relationship between the estimated rank of the trajectory matrix and the affinity matrix built by LSA. Results on synthetic and real data show that, without any a priori knowledge, EMS automatically provides an accurate and robust rank estimation, improving the accuracy of the final motion segmentation.
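
The abstract does not give the details of EMS itself, but the general idea of estimating the rank of a trajectory matrix via model selection can be sketched as follows. This is a generic SVD-based criterion commonly used in LSA-style pipelines, not the EMS method of the paper; the penalty weight kappa and the toy data are assumptions.

```python
# Hedged sketch: generic SVD-based rank estimation for a trajectory matrix.
import numpy as np

def estimate_rank(W, kappa=1e-3):
    """Estimate the rank of a 2F x P trajectory matrix W.

    Chooses r minimizing  s_{r+1}^2 / sum_{k<=r} s_k^2  +  kappa * r,
    where s_k are the singular values of W (a standard model-selection rule;
    kappa trades data fidelity against model complexity).
    """
    s = np.linalg.svd(W, compute_uv=False)
    s2 = s ** 2
    costs = []
    for r in range(1, len(s)):
        residual = s2[r] / s2[:r].sum()
        costs.append(residual + kappa * r)
    return int(np.argmin(costs)) + 1

# Toy usage: 10 frames, 30 points lying (up to noise) in a rank-4 subspace.
rng = np.random.default_rng(0)
W = rng.normal(size=(20, 4)) @ rng.normal(size=(4, 30)) + 0.01 * rng.normal(size=(20, 30))
print(estimate_rank(W))  # expected: 4
```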

Relevance: 20.00%

Abstract:

Our essay aims at studying suitable statistical methods for the clustering of compositional data in situations where observations are constituted by trajectories of compositional data, that is, by sequences of composition measurements along a domain. Observed trajectories are known as "functional data" and several methods have been proposed for their analysis. In particular, methods for clustering functional data, known as Functional Cluster Analysis (FCA), have been applied by practitioners and scientists in many fields. To our knowledge, FCA techniques have not been extended to cope with the problem of clustering compositional data trajectories. In order to extend FCA techniques to the analysis of compositional data, FCA clustering techniques have to be adapted by using a suitable compositional algebra.

The present work centres on the following question: given a sample of compositional data trajectories, how can we formulate a segmentation procedure giving homogeneous classes? To address this problem we follow the steps described below.

First of all, we adapt the well-known spline smoothing techniques in order to cope with the smoothing of compositional data trajectories. In fact, an observed curve can be thought of as the sum of a smooth part plus some noise due to measurement errors. Spline smoothing techniques are used to isolate the smooth part of the trajectory; clustering algorithms are then applied to these smooth curves.

The second step consists in building suitable metrics for measuring the dissimilarity between trajectories: we propose a metric that accounts for differences in both shape and level, and a metric accounting for differences in shape only.

A simulation study is performed in order to evaluate the proposed methodologies, using both hierarchical and partitional clustering algorithms. The quality of the obtained results is assessed by means of several indices.
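
A rough sketch of the two ingredients described above, spline smoothing of compositional trajectories and shape/level dissimilarities, could look like the following. It works in clr coordinates and uses generic smoothing and distance choices; the smoothing parameter, grid size and toy data are assumptions, not the authors' settings.

```python
# Hedged sketch: smooth compositional trajectories in clr coordinates with
# splines, then compare them with "shape + level" and "shape only" distances.
import numpy as np
from scipy.interpolate import UnivariateSpline

def clr(x):
    """Centered log-ratio transform of compositions (rows sum to a constant)."""
    logx = np.log(x)
    return logx - logx.mean(axis=-1, keepdims=True)

def smooth_trajectory(t, comps, s=0.1, grid=None):
    """Fit one smoothing spline per clr coordinate; return the curve on a grid."""
    z = clr(comps)                      # shape (n_times, n_parts)
    grid = np.linspace(t.min(), t.max(), 100) if grid is None else grid
    return np.column_stack([UnivariateSpline(t, z[:, j], s=s)(grid)
                            for j in range(z.shape[1])])

def dissimilarity(c1, c2, shape_only=False):
    """L2 distance between smoothed clr curves; optionally remove the level."""
    if shape_only:
        c1 = c1 - c1.mean(axis=0)
        c2 = c2 - c2.mean(axis=0)
    return np.sqrt(((c1 - c2) ** 2).sum())

# Toy usage: two 3-part compositional trajectories observed at 20 time points.
t = np.linspace(0, 1, 20)
a = np.random.rand(20, 3) + 0.1; a /= a.sum(1, keepdims=True)
b = np.random.rand(20, 3) + 0.1; b /= b.sum(1, keepdims=True)
ca, cb = smooth_trajectory(t, a), smooth_trajectory(t, b)
print(dissimilarity(ca, cb), dissimilarity(ca, cb, shape_only=True))
```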

Relevance: 20.00%

Abstract:

Chest physiotherapy (CP) using passive expiratory manoeuvres is widely used in Western Europe for the treatment of bronchiolitis, despite lacking evidence for its efficacy. We undertook an open randomised trial to evaluate the effectiveness of CP in infants hospitalised for bronchiolitis by comparing the time to clinical stability, the daily improvement of a severity score and the occurrence of complications between patients with and without CP. Children <1 year admitted for bronchiolitis to a tertiary hospital during two consecutive respiratory syncytial virus seasons were randomised to group 1 with CP (prolonged slow expiratory technique, slow accelerated expiratory flow, rarely induced cough) or group 2 without CP. All children received standard care (rhinopharyngeal suctioning, minimal handling, oxygen to maintain saturation ≥92%, fractionated meals). Ninety-nine eligible children (mean age 3.9 months) were included, 50 in group 1 and 49 in group 2, with similar baseline variables and clinical severity at admission. Time to clinical stability, assessed as the primary outcome, was similar for both groups (2.9 ± 2.1 vs. 3.2 ± 2.8 days, P = 0.45). Analysis of the rate of improvement of a clinical and a respiratory score, defined as secondary outcomes, showed only a slightly faster improvement of the respiratory score in the intervention group when stethoacoustic properties were included (P = 0.044). Complications were rare but occurred more frequently, although not significantly (P = 0.21), in the control arm. In conclusion, this study shows the absence of effectiveness of CP using passive expiratory techniques in infants hospitalised for bronchiolitis. It seems justified to recommend against the routine use of CP in these patients.

Relevance: 20.00%

Abstract:

Objectives: We are interested in the numerical simulation of the anastomotic region comprised between the outflow cannula of a left ventricular assist device (LVAD) and the aorta. Segmentation, geometry reconstruction and grid generation from patient-specific data remain an issue because of the variable quality of DICOM images, in particular CT scans (e.g. metallic noise of the device, non-aortic contrast phase). We propose a general framework to overcome this problem and create suitable grids for numerical simulations.

Methods: Preliminary treatment of the images is performed by reducing the level window and enhancing the contrast of the greyscale image using contrast-limited adaptive histogram equalization. A gradient anisotropic diffusion filter is applied to reduce the noise. Then, watershed segmentation algorithms and mathematical morphology filters allow reconstructing the patient geometry. This is done using the InsightToolKit library (www.itk.org). Finally, the Vascular Modeling ToolKit (www.vmtk.org) and gmsh (www.geuz.org/gmsh) are used to create the meshes for the fluid (blood) and the structure (arterial wall, outflow cannula) and to identify a priori the boundary layers. The method is tested on five patients with left ventricular assistance who underwent a CT-scan exam.

Results: This method produced good results in four patients. The anastomosis area is recovered and the generated grids are suitable for numerical simulations. In one patient the method failed to produce a good segmentation because of the small dimension of the aortic arch with respect to the image resolution.

Conclusions: The described framework allows the use of data that could not otherwise be segmented by standard automatic segmentation tools. In particular, the computational grids that have been generated are suitable for simulations that take into account fluid-structure interactions. Finally, the presented method features good reproducibility and fast application.
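
For illustration only, a comparable preprocessing and segmentation chain can be scripted with SimpleITK (a Python wrapping of the InsightToolKit used by the authors). The file names, window limits, filter parameters and the choice of watershed label below are assumptions, and the VMTK/gmsh meshing steps are not shown.

```python
# Hedged sketch of the image-processing chain described above, using SimpleITK.
import SimpleITK as sitk

# Load and window the CT volume (file name and window limits are assumptions).
img = sitk.Cast(sitk.ReadImage("patient_ct.mha"), sitk.sitkFloat32)
img = sitk.IntensityWindowing(img, -200.0, 600.0, 0.0, 255.0)

# Contrast-limited adaptive histogram equalization.
clahe = sitk.AdaptiveHistogramEqualizationImageFilter()
clahe.SetAlpha(0.5)
clahe.SetBeta(0.5)
img = clahe.Execute(img)

# Gradient anisotropic diffusion to reduce noise while preserving edges.
diff = sitk.GradientAnisotropicDiffusionImageFilter()
diff.SetNumberOfIterations(5)
diff.SetTimeStep(0.0625)
diff.SetConductanceParameter(2.0)
img = diff.Execute(img)

# Watershed segmentation on the gradient magnitude, then a morphological
# closing of the label assumed to contain the aorta (label 1 is a guess).
grad = sitk.GradientMagnitude(img)
labels = sitk.MorphologicalWatershed(grad, level=10.0, markWatershedLine=False)
mask = sitk.BinaryThreshold(labels, lowerThreshold=1, upperThreshold=1,
                            insideValue=1, outsideValue=0)
closing = sitk.BinaryMorphologicalClosingImageFilter()
closing.SetKernelRadius([2, 2, 2])
mask = closing.Execute(mask)

sitk.WriteImage(mask, "aorta_mask.mha")
```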

Relevance: 20.00%

Abstract:

Evaluation of segmentation methods is a crucial aspect of image processing, especially in the medical imaging field, where small differences between segmented regions in the anatomy can be of paramount importance. Usually, segmentation evaluation is based on a measure that depends on the number of segmented voxels inside and outside of some reference regions called gold standards. Although other measures have also been used, in this work we propose a set of new similarity measures based on different features, such as the location and intensity values of the misclassified voxels, and the connectivity and boundaries of the segmented data. Using the multidimensional information provided by these measures, we propose a new evaluation method whose results are visualized by applying a Principal Component Analysis of the data, yielding a simplified graphical method to compare different segmentation results. We have carried out an intensive study using several classic segmentation methods applied to a set of simulated MRI data of the brain with several noise and RF inhomogeneity levels, and also to real data, showing that the new measures proposed here, and the results obtained from the multidimensional evaluation, improve the robustness of the evaluation and provide a better understanding of the differences between segmentation methods.
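
A hedged sketch of this kind of multidimensional evaluation is given below: a few common overlap and boundary measures are computed per segmentation and the resulting feature vectors are projected with PCA to obtain a simple graphical comparison. The specific measures and the toy volumes are illustrative assumptions, not the ones used in the paper.

```python
# Hedged sketch: stack several evaluation measures per segmentation, then
# project the measure vectors with PCA for a 2-D comparison of methods.
import numpy as np
from scipy.ndimage import binary_erosion
from sklearn.decomposition import PCA

def measures(seg, gold):
    seg, gold = seg.astype(bool), gold.astype(bool)
    inter = np.logical_and(seg, gold).sum()
    union = np.logical_or(seg, gold).sum()
    dice = 2 * inter / (seg.sum() + gold.sum())
    jaccard = inter / union
    # crude boundary disagreement: voxels on one boundary but not the other
    b_seg = seg & ~binary_erosion(seg)
    b_gold = gold & ~binary_erosion(gold)
    boundary_diff = np.logical_xor(b_seg, b_gold).mean()
    return [dice, jaccard, boundary_diff]

# Toy usage: compare three hypothetical segmentations against one gold standard.
rng = np.random.default_rng(1)
gold = rng.random((32, 32, 32)) > 0.5
segs = [gold ^ (rng.random(gold.shape) > p) for p in (0.99, 0.95, 0.90)]
X = np.array([measures(s, gold) for s in segs])
coords = PCA(n_components=2).fit_transform(X)   # 2-D map of the methods
print(coords)
```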

Relevance: 20.00%

Abstract:

We present a segmentation method for fetal brain tissues in T2-weighted MR images, based on the well-known Expectation-Maximization Markov Random Field (EM-MRF) scheme. Our main contribution is an intensity model composed of 7 Gaussian distributions designed to deal with the large intensity variability of fetal brain tissues. The second main contribution is a 3-step MRF model that introduces both local spatial and anatomical priors given by a cortical distance map. Preliminary results on 4 subjects are presented and evaluated in comparison to manual segmentations, showing that our methodology can successfully be applied to such data, dealing with the large intensity variability within brain tissues and partial volume (PV).
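
Only the intensity part of such a model is easy to sketch without the MRF machinery: the snippet below fits a 7-component Gaussian mixture to masked voxel intensities with EM, as a stand-in for the intensity model described above. The spatial and anatomical priors (the 3-step MRF and the cortical distance map) are not modelled, and the synthetic volume is purely illustrative.

```python
# Hedged sketch: 7-component Gaussian mixture fitted by EM to voxel intensities.
import numpy as np
from sklearn.mixture import GaussianMixture

def intensity_segmentation(volume, mask, n_classes=7):
    """Fit a Gaussian mixture to masked voxel intensities and label them."""
    intensities = volume[mask].reshape(-1, 1)
    gmm = GaussianMixture(n_components=n_classes, covariance_type="full",
                          random_state=0).fit(intensities)
    labels = np.zeros(volume.shape, dtype=np.int16)
    labels[mask] = gmm.predict(intensities) + 1   # 0 is reserved for background
    return labels

# Toy usage with a synthetic volume and an all-true mask (assumed inputs).
rng = np.random.default_rng(0)
vol = rng.normal(loc=rng.integers(1, 8, (64, 64, 32)) * 100, scale=20)
msk = np.ones(vol.shape, dtype=bool)
seg = intensity_segmentation(vol, msk)
print(np.unique(seg))
```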

Relevance: 20.00%

Abstract:

Echocardiography is the preferred initial test to assess cardiac morphology and ventricular function. Cardiac MRI enables optimal visualisation of the heart muscle without contrast injection, and precise measurement of ventricular volumes and systolic function. It is therefore an ideal test for patients with poor echocardiographic windows or for the specific evaluation of the right heart chambers. Cardiac CT also images the heart muscle remarkably well and precisely measures ventricular systolic function after intravenous injection of iodinated contrast. Coronary CT may also, in selected cases, avoid the need for diagnostic coronary angiography. Although very accurate, these imaging modalities are expensive and may be contraindicated for a particular patient. Their use in clinical practice has to follow the accepted guidelines.

Relevance: 20.00%

Abstract:

This paper presents a pattern recognition method focused on images of paintings. The purpose is to construct a system able to recognize authors or art styles based on common elements of their work (here called patterns). The method is based on comparing images that contain the same or similar patterns. It uses different computer vision techniques, such as SIFT and SURF, to describe the patterns as descriptors, K-Means to cluster and simplify these descriptors, and RANSAC to detect and validate good matches. The method is good at finding patterns from known images, but less reliable when the images are unknown.
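
A minimal sketch of the matching stage described above, using OpenCV's SIFT implementation, Lowe's ratio test and a RANSAC-estimated homography to keep geometrically consistent matches, is shown below; the image files are placeholders and the K-Means vocabulary step is omitted.

```python
# Hedged sketch: SIFT descriptors + ratio test + RANSAC homography with OpenCV.
import cv2
import numpy as np

query = cv2.imread("pattern.jpg", cv2.IMREAD_GRAYSCALE)      # placeholder files
painting = cv2.imread("painting.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(query, None)
kp2, des2 = sift.detectAndCompute(painting, None)

# Lowe's ratio test to discard ambiguous descriptor matches.
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2) if m.distance < 0.75 * n.distance]

# RANSAC: estimate a homography and count inliers to decide whether the
# pattern is actually present in the painting.
if len(good) >= 4:
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    print(f"{int(inliers.sum())} inlier matches out of {len(good)}")
else:
    print("Not enough matches to test for the pattern.")
```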

Relevance: 20.00%

Abstract:

In European countries and North America, people spend 80 to 90% of their time inside buildings and thus breathe indoor air. In Switzerland, special attention has been devoted to the 16 stations of the national network for observation of atmospheric pollutants (NABEL). The results indicate a reduction in outdoor pollution over the last ten years. With such a decrease in pollution over these ten years, the question becomes: how can we explain an increase in diseases? Indoor pollution can be the cause. Indoor contaminants that may create indoor air quality (IAQ) problems come from a variety of sources. These can include inadequate ventilation, temperature and humidity dysfunction, and volatile organic compounds (VOCs). The health effects of these contaminants are varied and can range from discomfort, irritation and respiratory diseases to cancer. Among such contaminants, environmental tobacco smoke (ETS) could be considered the most important in terms of both health effects and engineering controls of ventilation. To perform indoor pollution monitoring, several selected ETS tracers can be used, including carbon monoxide (CO), carbon dioxide (CO2), respirable particles (RSP), condensate, nicotine, polycyclic aromatic hydrocarbons (PAHs), nitrosamines, etc. In this paper, some examples are presented of IAQ problems that have occurred following the renewal of buildings and energy-saving concerns. Using industrial hygiene sampling techniques and focussing on selected priority pollutants used as tracers, various problems have been identified and solutions proposed. [Author]

Relevance: 20.00%

Abstract:

It has been shown that the accuracy of mammographic abnormality detection methods is strongly dependent on breast tissue characteristics, where a dense breast drastically reduces detection sensitivity. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. Here, we describe the development of an automatic breast tissue classification methodology, which can be summarized in a number of distinct steps: 1) the segmentation of the breast area into fatty versus dense mammographic tissue; 2) the extraction of morphological and texture features from the segmented breast areas; and 3) the use of a Bayesian combination of a number of classifiers. The evaluation, based on a large number of cases from two different mammographic data sets, shows a strong correlation ( and 0.67 for the two data sets) between the automatic and expert-based Breast Imaging Reporting and Data System (BI-RADS) mammographic density assessment.
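
Step 3, the Bayesian combination of classifiers, can be sketched as a simple product-rule fusion of per-classifier posteriors; the classifiers, features and four-class labels below are placeholders, and the paper's actual feature extraction and combination scheme may differ.

```python
# Hedged sketch: product-rule (naive Bayes style) combination of the posterior
# probabilities of several classifiers trained on the same feature set.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

def bayesian_combination(classifiers, X):
    """Multiply the per-classifier posteriors and renormalize per sample."""
    probas = [clf.predict_proba(X) for clf in classifiers]
    combined = np.prod(probas, axis=0)
    return combined / combined.sum(axis=1, keepdims=True)

# Toy usage with random "texture/morphology" features and 4 density classes.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 10)), rng.integers(0, 4, 200)
clfs = [RandomForestClassifier(random_state=0).fit(X, y),
        KNeighborsClassifier().fit(X, y),
        GaussianNB().fit(X, y)]
posterior = bayesian_combination(clfs, X[:5])
print(posterior.argmax(axis=1))   # combined class decision for 5 samples
```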