991 results for Segmentation methods
Abstract:
Functional Data Analysis (FDA) deals with samples where a whole function is observed for each individual. A particular case of FDA is when the observed functions are density functions, which are also an example of infinite-dimensional compositional data. In this work we compare several methods for dimensionality reduction for this particular type of data: functional principal components analysis (PCA), with or without a previous data transformation, and multidimensional scaling (MDS) for different inter-density distances, one of them taking into account the compositional nature of density functions. The different methods are applied to both artificial and real data (household income distributions).
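As a rough illustration of the kind of comparison described in this abstract (not the authors' code), the sketch below discretizes a few toy density functions on a common grid, applies a centred log-ratio (clr) transform to respect their compositional nature, and computes both a PCA and an MDS representation; the grid, the simulated densities and the choice of inter-density distance are all assumptions made for the example.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS

rng = np.random.default_rng(0)

# Toy sample: 30 Gaussian densities evaluated on a common grid (hypothetical data).
grid = np.linspace(-4, 4, 100)
means = rng.normal(0, 1, size=30)
densities = np.exp(-0.5 * (grid - means[:, None]) ** 2)
densities /= densities.sum(axis=1, keepdims=True)          # renormalize on the grid

# Centred log-ratio (clr) transform: one way to account for the compositional
# nature of density functions before applying standard multivariate methods.
log_d = np.log(densities)
clr = log_d - log_d.mean(axis=1, keepdims=True)

# Functional PCA approximated by ordinary PCA on the discretized, transformed curves.
pca_scores = PCA(n_components=2).fit_transform(clr)

# MDS on a matrix of inter-density distances (here simply L2 between clr curves).
dist = np.linalg.norm(clr[:, None, :] - clr[None, :, :], axis=2)
mds_scores = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(dist)

print(pca_scores.shape, mds_scores.shape)
```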
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
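One concrete instance of such a parametrization is the power transformation that links PCA of raw data to PCA of log-transformed data. The sketch below (a toy illustration on simulated data, not the presenter's actual "movies") recomputes the map frame by frame as the linking parameter varies.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.lognormal(mean=0.0, sigma=0.5, size=(50, 6))   # strictly positive toy data

def power_transform(X, alpha):
    # Box-Cox-style power transform; tends to log(X) as alpha -> 0.
    return np.log(X) if alpha == 0 else (X ** alpha - 1.0) / alpha

# "Frame by frame": recompute the 2-D PCA map as the linking parameter alpha
# moves in small steps from 1 (raw data) towards 0 (log-transformed data).
for alpha in np.linspace(1.0, 0.0, 6):
    scores = PCA(n_components=2).fit_transform(power_transform(X, alpha))
    print(f"alpha={alpha:.1f}  variance of first axis: {scores[:, 0].var():.4f}")
```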
Abstract:
Influenza surveillance networks must detect, at an early stage, the viruses that will cause the forthcoming annual epidemics and isolate the strains for further characterization. We obtained the highest sensitivity (95.4%) with a diagnostic tool that combined a shell-vial assay and reverse transcription-PCR on cell culture supernatants at 48 h, and indeed recovered the strain.
Abstract:
Autonomous underwater vehicles (AUVs) represent a challenging control problem with complex, noisy dynamics. Nowadays, not only the continuous scientific advances in underwater robotics but also the increasing number of subsea missions and their complexity call for the automation of submarine processes. This paper proposes a high-level control system for solving the action selection problem of an autonomous robot. The system is characterized by the use of reinforcement learning direct policy search methods (RLDPS) for learning the internal state/action mapping of some behaviors. We demonstrate its feasibility with simulated experiments using the model of our underwater robot URIS in a target-following task.
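The abstract gives no implementation details, but a direct policy search of the REINFORCE family on a toy one-dimensional target-following task might look like the sketch below; the simplified dynamics, features and learning rate are all invented for illustration and are not the controller used with URIS.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "target following": state = position error, action = thrust, reward = -error^2.
# Gaussian policy with a linear mean; REINFORCE-style direct policy search.
theta = np.zeros(2)      # [gain, bias] of the policy mean
sigma = 0.3              # fixed exploration noise
alpha = 0.01             # learning rate

def run_episode(theta, steps=50):
    err, grads, rewards = rng.normal(0, 1), [], []
    for _ in range(steps):
        feats = np.array([err, 1.0])
        mean = theta @ feats
        action = mean + sigma * rng.normal()
        grads.append((action - mean) / sigma**2 * feats)        # grad of log pi w.r.t. theta
        err = 0.9 * err + 0.1 * action + 0.01 * rng.normal()    # simplified vehicle dynamics
        rewards.append(-err**2)
    return np.array(grads), np.array(rewards)

for episode in range(200):
    grads, rewards = run_episode(theta)
    returns = np.cumsum(rewards[::-1])[::-1]                    # reward-to-go
    theta += alpha * (grads * returns[:, None]).mean(axis=0)

print("learned policy parameters:", theta)
```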
Abstract:
Interpretability and power of genome-wide association studies can be increased by imputing unobserved genotypes, using a reference panel of individuals genotyped at higher marker density. For many markers, genotypes cannot be imputed with complete certainty, and the uncertainty needs to be taken into account when testing for association with a given phenotype. In this paper, we compare currently available methods for testing association between uncertain genotypes and quantitative traits. We show that some previously described methods offer poor control of the false-positive rate (FPR), and that satisfactory performance of these methods is obtained only by using ad hoc filtering rules or by using a harsh transformation of the trait under study. We propose new methods that are based on exact maximum likelihood estimation and use a mixture model to accommodate nonnormal trait distributions when necessary. The new methods adequately control the FPR and also have equal or better power compared to all previously described methods. We provide a fast software implementation of all the methods studied here; our new method requires computation time of less than one computer-day for a typical genome-wide scan, with 2.5 M single nucleotide polymorphisms and 5000 individuals.
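A minimal sketch of the general problem, testing a quantitative trait against expected genotype dosages derived from imputation probabilities, is shown below. This is the simple dosage approach commonly used as a baseline, not the exact maximum-likelihood mixture method proposed in the paper, and all data are simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy data: imputed genotype posterior probabilities for genotypes {0, 1, 2} at one SNP.
n = 500
probs = rng.dirichlet([8, 3, 1], size=n)          # uncertain genotype calls
dosage = probs @ np.array([0.0, 1.0, 2.0])        # expected allele count per individual
trait = 0.2 * dosage + rng.normal(size=n)         # simulated quantitative trait

# Dosage-based test: regress the trait on the expected genotype, so that the
# imputation uncertainty enters through the expectation rather than a hard call.
slope, intercept, r, p_value, se = stats.linregress(dosage, trait)
print(f"beta={slope:.3f}  p={p_value:.2e}")
```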
Abstract:
Photo-mosaicing techniques have become popular for seafloor mapping in various marine science applications. However, the common methods cannot accurately map regions with high relief and topographical variations. Ortho-mosaicing borrowed from photogrammetry is an alternative technique that enables taking into account the 3-D shape of the terrain. A serious bottleneck is the volume of elevation information that needs to be estimated from the video data, fused, and processed for the generation of a composite ortho-photo that covers a relatively large seafloor area. We present a framework that combines the advantages of dense depth-map and 3-D feature estimation techniques based on visual motion cues. The main goal is to identify and reconstruct certain key terrain feature points that adequately represent the surface with minimal complexity in the form of piecewise planar patches. The proposed implementation utilizes local depth maps for feature selection, while tracking over several views enables 3-D reconstruction by bundle adjustment. Experimental results with synthetic and real data validate the effectiveness of the proposed approach.
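As a rough illustration of the piecewise-planar idea, the sketch below fits a single planar patch to a cloud of 3-D feature points by least squares; the points are synthetic and this is not the paper's reconstruction pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-D terrain feature points near the plane z = 0.2x - 0.1y + 1 (hypothetical seafloor patch).
xy = rng.uniform(-1, 1, size=(50, 2))
z = 0.2 * xy[:, 0] - 0.1 * xy[:, 1] + 1.0 + rng.normal(0, 0.01, 50)

# Fit a planar patch z = a*x + b*y + c by least squares, the kind of low-complexity
# piecewise-planar approximation used to represent the terrain surface.
A = np.column_stack([xy, np.ones(len(xy))])
(a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
print(f"fitted plane: z = {a:.2f}x + {b:.2f}y + {c:.2f}")
```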
Abstract:
In image segmentation, clustering algorithms are very popular because they are intuitive and, some of them, easy to implement. For instance, the k-means is one of the most used in the literature, and many authors successfully compare their new proposal with the results achieved by the k-means. However, it is well known that clustering-based image segmentation has many problems. For instance, the number of regions of the image has to be known a priori, and different initial seed placements (initial clusters) can produce different segmentation results. Most of these algorithms can be slightly improved by considering the coordinates of the image as features in the clustering process (to take spatial region information into account). In this paper we propose a significant improvement of clustering algorithms for image segmentation. The method is qualitatively and quantitatively evaluated over a set of synthetic and real images, and compared with classical clustering approaches. Results demonstrate the validity of this new approach.
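A minimal sketch of the baseline idea mentioned above, adding pixel coordinates as extra features so that k-means takes spatial information into account, is given below; the toy image and the spatial weight are assumptions, and this is the simple coordinate-feature variant rather than the improvement proposed in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy grayscale image with two noisy regions (stands in for a real image).
img = rng.normal(0, 0.05, (60, 60))
img[20:40, 20:40] += 0.8

# One feature vector per pixel: intensity plus (scaled) pixel coordinates,
# so that spatial proximity also influences the clustering.
yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
spatial_weight = 0.01   # how strongly coordinates count relative to intensity (tuning choice)
features = np.column_stack([
    img.ravel(),
    spatial_weight * xx.ravel(),
    spatial_weight * yy.ravel(),
])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
segmentation = labels.reshape(img.shape)
print(np.bincount(labels))
```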
Abstract:
In the accounting literature, interaction or moderating effects are usually assessed by means of OLS regression, and summated rating scales are constructed to reduce measurement error bias. Structural equation models and two-stage least squares regression could be used to completely eliminate this bias, but large samples are needed. Partial Least Squares is appropriate for small samples but does not correct measurement error bias. In this article, disattenuated regression is discussed as a small-sample alternative and is illustrated on data from Bisbe and Otley (in press), who examine the interaction effect of innovation and style of use of budgets on performance. Sizeable differences emerge between OLS and disattenuated regression.
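To make the contrast concrete, a small simulated sketch of attenuation and its correction (dividing the OLS slope by the predictor's reliability) is given below; in real applications the reliability would be estimated, for instance from Cronbach's alpha of a summated scale, rather than taken from the simulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated latent predictor and outcome; the predictor is observed with measurement error.
n = 200
x_true = rng.normal(size=n)
y = 0.5 * x_true + rng.normal(scale=0.5, size=n)
x_obs = x_true + rng.normal(scale=0.7, size=n)                    # noisy measurement

# Reliability of the observed predictor (known here; estimated in practice).
reliability_x = np.var(x_true, ddof=1) / np.var(x_obs, ddof=1)

# Ordinary (attenuated) OLS slope versus the disattenuated slope.
b_ols = np.cov(x_obs, y)[0, 1] / np.var(x_obs, ddof=1)
b_disattenuated = b_ols / reliability_x
print(f"OLS slope: {b_ols:.3f}   disattenuated: {b_disattenuated:.3f}   true slope: 0.5")
```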
Abstract:
Objective: observing variations in the volumes of grey matter (GM), white matter (WM) and cerebrospinal fluid (CSF) is particularly useful in the study of many physiopathological processes, and the quantitative in vivo measurement of these volumes is of considerable interest both in research and in clinical practice. This study presents and validates a method for automatic brain segmentation with measurement of GM and WM volumes on magnetic resonance images. Material and Methods: we use an automatic genetic algorithm to segment the brain into GM, WM and CSF from three-dimensional T1-weighted magnetic resonance images. A morphometric study was conducted on 136 male and female subjects aged 15 to 74 years. The algorithm was then validated by five different approaches: 1. Comparison of volume measurements on a cadaver brain obtained by the automatic method and by water displacement according to Archimedes' method. 2. Comparison of surface measurements on two-dimensional images segmented either by manual tracing or by the automatic method. 3. Evaluation of the reliability of the segmentation by repeated acquisitions and segmentations of the same brain. 4. Use of the GM, WM and CSF volumes for a study of normal aging in the population. 5. Comparison with existing data in the literature. Results: we observed an additional variation of 4.17% between the volume of a cadaver brain measured by Archimedes' method and the automatically measured volume, largely due to tissue persisting after dissection. The comparison of manual surface counting with the automatic method showed no significant variation. The repositioning test, in which the same subject was scanned repeatedly, showed very good reliability, with a standard deviation of 0.46% for GM, 1.02% for WM and 3.59% for CSF, i.e. 0.19% for the total intracranial volume (TICV). The morphometric study corroborates the results of existing anatomical and radiological studies. Conclusion: brain segmentation by a genetic algorithm allows a 100% automatic, reliable and fast measurement of cerebral volumes in vivo in normal individuals.
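By way of illustration only, a genetic algorithm can be used to search for intensity thresholds separating three tissue-like classes. The sketch below, on simulated one-dimensional intensities, shows the general mechanism (population, selection, mutation) rather than the validated segmentation method described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D intensity data with three tissue-like classes (stands in for CSF / GM / WM intensities).
data = np.concatenate([rng.normal(30, 5, 300), rng.normal(80, 6, 500), rng.normal(130, 5, 400)])

def fitness(thresholds):
    # Lower total within-class variance = better two-threshold, three-class segmentation.
    t1, t2 = np.sort(thresholds)
    classes = [data[data < t1], data[(data >= t1) & (data < t2)], data[data >= t2]]
    return -sum(c.size * c.var() for c in classes if c.size > 0)

# Minimal genetic algorithm: population of threshold pairs, elitist selection, Gaussian mutation.
pop = rng.uniform(data.min(), data.max(), size=(40, 2))
for generation in range(60):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]                             # keep the 10 best
    children = parents[rng.integers(0, 10, size=30)] + rng.normal(0, 3, (30, 2))
    pop = np.vstack([parents, children])

best = np.sort(pop[np.argmax([fitness(ind) for ind in pop])])
print("selected thresholds:", best.round(1))
```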
Abstract:
A total of 138 isolates, comprising 118 methicillin-resistant Staphylococcus aureus (MRSA) isolates (staphylococcal cassette chromosome type II, 20 isolates; type III, 39 isolates; type IV, 59 isolates) and 20 methicillin-sensitive S. aureus isolates, were evaluated by phenotypic methods: cefoxitin and oxacillin disk diffusion (DD), agar dilution (AD), latex agglutination (LA), oxacillin agar screening (OAS) and chromogenic agar detection. All methods showed 100% specificity, but only the DD tests presented 100% sensitivity. The sensitivity of the other tests ranged from 82.2% (OAS) to 98.3% (AD). The LA test showed the second lowest sensitivity (86.4%). The DD test showed high accuracy in the detection of MRSA isolates, but there was low precision in the detection of type IV isolates by the other tests, indicating that the genotypic characteristics of the isolates should be considered.
Abstract:
The generation of an antigen-specific T-lymphocyte response is a complex multi-step process. Upon T-cell receptor-mediated recognition of antigen presented by activated dendritic cells, naive T-lymphocytes enter a program of proliferation and differentiation, during the course of which they acquire effector functions and may ultimately become memory T-cells. A major goal of modern immunology is to precisely identify and characterize effector and memory T-cell subpopulations that may be most efficient in disease protection. Sensitive methods are required to address these questions in exceedingly low numbers of antigen-specific lymphocytes recovered from clinical samples, and not manipulated in vitro. We have developed new techniques to dissect immune responses against viral or tumor antigens. These allow the isolation of various subsets of antigen-specific T-cells (with major histocompatibility complex [MHC]-peptide multimers and five-color FACS sorting) and the monitoring of gene expression in individual cells (by five-cell reverse transcription-polymerase chain reaction [RT-PCR]). We can also follow their proliferative life history by flow-fluorescence in situ hybridization (FISH) analysis of average telomere length. Recently, using these tools, we have identified subpopulations of CD8+ T-lymphocytes with distinct proliferative history and partial effector-like properties. Our data suggest that these subsets descend from recently activated T-cells and are committed to become differentiated effector T-lymphocytes.
Abstract:
This study compares the diagnostic accuracy of the TF-Test® (TFT) for human parasitosis with results obtained using the traditional Kato-Katz (KK), Hoffman-Pons-Janer (HPJ), Willis and Baermann-Moraes (BM) techniques. Overall, four stool samples were taken from each individual: three alternate-day TFT stool samples and another sample collected in a universal container. Stool samples were taken from 331 inhabitants of the community of Quilombola Santa Cruz. The gold standard (GS) for protozoa detection was defined as the combined results for the TFT, HPJ and Willis coproscopic techniques; for helminth detection, the GS was defined as the combined results for all five coproscopic techniques (TFT, KK, HPJ, Willis and BM). The positivity rate of each method was compared using the McNemar test. While the TFT exhibited similar positivity rates to the GS for Entamoeba histolytica/dispar (82.4%) and Giardia duodenalis (90%), the HPJ and Willis techniques exhibited significantly lower positivity rates for these protozoa. All tests exhibited significantly lower positivity rates compared with the GS for the diagnosis of helminths. The KK technique had the highest positivity rate for diagnosing Schistosoma mansoni (74.6%), while the TFT had the highest positivity rates for Ascaris lumbricoides (58.1%) and hookworm (75%); the HPJ technique had the highest positivity rate for Strongyloides stercoralis (50%). Although a combination of tests is the most accurate method for the diagnosis of enteral parasites, the TFT reliably estimates the prevalence of protozoa and selected helminths, such as A. lumbricoides and hookworm. Further studies are needed to evaluate the detection accuracy of the TFT in samples with varying numbers of parasites.
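For reference, the kind of paired comparison used here (McNemar's test on discordant results between two methods applied to the same individuals) can be computed as in the sketch below; the 2x2 table is hypothetical and does not come from the study.

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical paired results for two coproscopic tests on the same individuals:
# rows = method A (positive / negative), columns = method B (positive / negative).
table = np.array([[45, 12],
                  [ 3, 271]])

# McNemar's test uses only the discordant pairs (12 vs. 3) to decide whether
# the two methods have different positivity rates on paired samples.
result = mcnemar(table, exact=True)
print(f"statistic = {result.statistic}, p-value = {result.pvalue:.4f}")
```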
Abstract:
In this study we compared two polymerase chain reaction (PCR) methods using either 16S ribosomal RNA (rRNA) or 23S rRNA gene primers for the detection of different Leptospira interrogans serovars. The performance of these two methods was assessed using DNA extracted from bovine tissues previously inoculated with several bacterial suspensions. PCR was performed on the same tissues before and after the formalin-fixed, paraffin-embedding procedure (FFPE tissues). The 23S rDNA PCR detected all fresh and FFPE positive tissues while the 16S rDNA-based protocol detected primarily the positive fresh tissues. Both methods are specific for pathogenic L. interrogans. The 23S-based PCR method successfully detected Leptospira in four dubious cases of human leptospirosis from archival tissue specimens and one leptospirosis-positive canine specimen. A sensitive method for leptospirosis identification in FFPE tissues would be a useful tool to screen histological specimen archives and gain a better assessment of human leptospirosis prevalence, especially in tropical countries, where large outbreaks can occur following the rainy season.
Abstract:
The diagnosis of schistosomiasis is problematic in low-intensity transmission areas because parasitological methods lack sensitivity and molecular methods are neither widely available nor extensively validated. Helmintex is a method for isolating eggs from large faecal samples. We report preliminary results of a comparative evaluation of the Helmintex and Kato-Katz (KK) methods for the diagnosis of schistosomiasis in a low-intensity transmission area in Bandeirantes, Paraná, southern Brazil. Eggs were detected by both methods in seven patients, whereas only Helmintex yielded positive results in four individuals. The results confirm the previously demonstrated higher sensitivity of the Helmintex method compared with the KK method.