925 results for Label propagation


Relevance:

70.00%

Publisher:

Abstract:

Semi-supervised learning is applied to classification problems where only a small portion of the data items is labeled. In these cases, the reliability of the labels is a crucial factor, because mislabeled items may propagate wrong labels to a large portion of, or even the entire, data set. This paper addresses this problem by presenting a graph-based (network-based) semi-supervised learning method specifically designed to handle data sets with mislabeled samples. The method uses teams of walking particles, with competitive and cooperative behavior, for label propagation in the network constructed from the input data set. The proposed model is nature-inspired and incorporates features that make it robust to a considerable amount of mislabeled data items. Computer simulations show the performance of the method in the presence of different percentages of mislabeled data, in networks of different sizes and average node degrees. Importantly, these simulations reveal the existence of critical points in the mislabeled subset size, below which the network is free of wrong-label contamination, but above which the mislabeled samples start to propagate their labels to the rest of the network. Moreover, numerical comparisons have been made between the proposed method and other representative graph-based semi-supervised learning methods using both artificial and real-world data sets. Interestingly, the proposed method increasingly outperforms the others as the percentage of mislabeled samples grows. © 2012 IEEE.
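As an illustration of the label-propagation idea in this abstract, the following is a minimal, hypothetical sketch (not the authors' competitive-cooperative model): one particle per labeled node performs a random walk with occasional restarts to its home node, and each unlabeled node takes the class whose particles visit it most. The graph, parameters, and function name are all illustrative.

```python
import random

def particle_labels(adj, labels, steps=20000, restart=0.2, seed=0):
    """One particle per labeled node walks the graph, occasionally jumping
    back to its home node, and stamps its class on every node it visits;
    each unlabeled node ends up with the class that visited it most.

    adj    : dict node -> list of neighbor nodes
    labels : dict node -> class label (the labeled subset)
    """
    rng = random.Random(seed)
    classes = sorted(set(labels.values()))
    visits = {v: {c: 0 for c in classes} for v in adj}
    # Each particle remembers its home (labeled) node, position, and class.
    particles = [[home, home, c] for home, c in labels.items()]
    for _ in range(steps):
        p = rng.choice(particles)
        home, pos, c = p
        # The restart keeps the particle concentrated around its own seed.
        pos = home if rng.random() < restart else rng.choice(adj[pos])
        visits[pos][c] += 1
        p[1] = pos
    pred = {}
    for v in adj:
        if v in labels:
            pred[v] = labels[v]          # labeled nodes keep their label
        else:
            pred[v] = max(classes, key=lambda c: visits[v][c])
    return pred
```

The restart probability is what lets one team of particles resist label invasion from another: each particle's visit distribution stays localized around its labeled seed.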

Relevance:

60.00%

Publisher:

Abstract:

Normal and abnormal brains can be segmented by registering the target image with an atlas. Here, an atlas is defined as the combination of an intensity image (the template) and its segmented image (the atlas labels). After registering the atlas template and the target image, the atlas labels are propagated to the target image. We define this process as atlas-based segmentation. In recent years, researchers have investigated registration algorithms to match atlases to query subjects as well as strategies for atlas construction. In this paper, we present a review of the automated approaches for atlas-based segmentation of magnetic resonance brain images. We aim to point out the strengths and weaknesses of atlas-based methods and to suggest new research directions. We use two different criteria to present the methods. First, we refer to the algorithms according to their atlas-based strategy: label propagation, multi-atlas methods, and probabilistic techniques. Subsequently, we classify the methods according to their medical target: the brain and its internal structures, tissue segmentation in healthy subjects, tissue segmentation in fetuses, neonates, and elderly subjects, and segmentation of damaged brains. A quantitative comparison of the results reported in the literature is also presented.

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

The identification and classification of overlapping nodes in networks are important topics in data mining. In this paper, a network-based (graph-based) semi-supervised learning method is proposed. It is based on competition and cooperation among walking particles in a network, and it uncovers overlapping nodes by generating continuous-valued outputs (soft labels) corresponding to the membership levels of the nodes in each of the communities. Moreover, the proposed method can be applied to detect overlapping data items in a data set of general form, such as a vector-based data set, once it is transformed into a network. Usually, label propagation involves a risk of error amplification. To avoid this problem, the proposed method offers a mechanism to identify outliers among the labeled data items and consequently prevents error propagation from such outliers. Computer simulations carried out on synthetic and real-world data sets provide a numeric quantification of the method's performance. © 2012 Springer-Verlag.
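The continuous membership levels (soft labels) described above can be illustrated with a deterministic stand-in: a per-class random walk with restart, iterated to its stationary distribution and normalized across classes at each node. This is only a sketch under that substitution, not the particle-competition model itself; the graph, parameters, and names are hypothetical.

```python
def soft_labels(adj, labels, restart=0.2, iters=200):
    """Continuous membership levels per node: for each class, run a random
    walk with restart at that class's labeled nodes (power iteration to the
    stationary distribution), then normalize the per-class scores at each
    node so they sum to one. Overlapping nodes show mixed memberships.

    adj    : dict node -> list of neighbor nodes
    labels : dict node -> class label (the labeled subset)
    """
    nodes = sorted(adj)
    classes = sorted(set(labels.values()))
    score = {}
    for c in classes:
        seeds = [v for v, lab in labels.items() if lab == c]
        p = {v: (1.0 / len(seeds) if v in seeds else 0.0) for v in nodes}
        for _ in range(iters):
            q = {v: 0.0 for v in nodes}
            for v in nodes:
                # Spread (1 - restart) of the mass uniformly to neighbors.
                share = (1.0 - restart) * p[v] / len(adj[v])
                for u in adj[v]:
                    q[u] += share
            for s in seeds:                    # restart mass at the seeds
                q[s] += restart / len(seeds)
            p = q
        score[c] = p
    membership = {}
    for v in nodes:
        total = sum(score[c][v] for c in classes)
        membership[v] = {c: score[c][v] / total for c in classes}
    return membership
```

Nodes deep inside one community get memberships close to one for that class, while bridge nodes between communities receive mixed memberships.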

Relevance:

60.00%

Publisher:

Abstract:

Semi-supervised learning is one of the important topics in machine learning, concerned with pattern classification where only a small subset of the data is labeled. In this paper, a new network-based (or graph-based) semi-supervised classification model is proposed. It employs a combined random-greedy walk of particles, with competition and cooperation mechanisms, to propagate class labels to the whole network. Due to the competition mechanism, the proposed model spreads labels in a local fashion, i.e., each particle only visits a portion of the nodes potentially belonging to it, and is not allowed to visit nodes definitely occupied by particles of other classes. In this way, a "divide-and-conquer" effect is naturally embedded in the model. As a result, the proposed model achieves a good classification rate while exhibiting a low computational complexity order in comparison with other network-based semi-supervised algorithms. Computer simulations carried out on synthetic and real-world data sets provide a numeric quantification of the method's performance.

Relevance:

60.00%

Publisher:

Abstract:

Semi-supervised learning is a classification paradigm in which just a few labeled instances are available for the training process. To overcome this small amount of initial label information, the information provided by the unlabeled instances is also considered. In this paper, we propose a nature-inspired semi-supervised learning technique based on attraction forces. Instances are represented as points in a k-dimensional space, and the movement of data points is modeled as a dynamical system. As the system runs, data items with the same label cooperate with each other, while data items with different labels compete with one another to attract unlabeled points by applying a specific force function. In this way, all unlabeled data items can be classified when the system reaches its stable state. A stability analysis of the proposed dynamical system is performed, and some heuristics are proposed for parameter setting. Simulation results show that the proposed technique achieves good classification results on artificial data sets and is comparable to well-known semi-supervised techniques on benchmark data sets.
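A minimal sketch of the attraction-force idea, assuming a softened inverse-square force toward every labeled point and classification by the nearest labeled point once the dynamics settle; the force law, parameters, and names here are hypothetical, not the paper's specific force function.

```python
def classify_by_attraction(labeled, unlabeled, steps=2000, lr=0.05, soft=0.5):
    """Move each unlabeled point under a softened inverse-square attraction
    toward every labeled point, then assign it the label of the nearest
    labeled point once the dynamics have run.

    labeled   : list of (point, label) pairs, points as coordinate tuples
    unlabeled : list of coordinate tuples
    """
    pts = [list(p) for p in unlabeled]
    for _ in range(steps):
        for x in pts:
            force = [0.0] * len(x)
            for point, _label in labeled:
                # Softened gravity: magnitude ~ 1/d^2, but finite at d = 0.
                d2 = sum((a - b) ** 2 for a, b in zip(point, x)) + soft
                inv = d2 ** -1.5
                for i, a in enumerate(point):
                    force[i] += (a - x[i]) * inv
            for i in range(len(x)):
                x[i] += lr * force[i]
    labels_out = []
    for x in pts:
        nearest = min(labeled,
                      key=lambda pl: sum((a - b) ** 2 for a, b in zip(pl[0], x)))
        labels_out.append(nearest[1])
    return labels_out
```

Each unlabeled point drifts toward the region where the pull of one class dominates, so its final position encodes its class.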

Relevance:

40.00%

Publisher:

Abstract:

In general, pattern recognition techniques impose a high computational burden when learning the discriminating functions responsible for separating samples from distinct classes. As such, several studies have sought to employ machine learning algorithms in the context of big data classification problems. Research in this area ranges from Graphics Processing Unit-based implementations to mathematical optimizations, the main drawback of the former approaches being their dependence on the graphics card. Here, we propose an architecture-independent optimization approach for the optimum-path forest (OPF) classifier, designed using a theoretical formulation that relates the minimum spanning tree to the minimum spanning forest generated by the OPF over the training dataset. The experiments have shown that the proposed approach can be faster than the traditional one on five public datasets, while being as accurate as the original OPF. (C) 2014 Elsevier B. V. All rights reserved.
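The relation between the minimum spanning tree and a class-wise spanning forest mentioned above can be illustrated with a small sketch: Prim's algorithm builds the MST of a labeled training set, and cutting the edges that join different classes yields the corresponding spanning forest. This is a generic illustration of that graph-theoretic relation, not the authors' OPF optimization; the data and names are made up.

```python
import math

def prim_mst(points):
    """Prim's algorithm on a complete Euclidean graph; returns the MST as a
    list of (i, j) index pairs."""
    n = len(points)
    in_tree = [False] * n
    in_tree[0] = True
    edges = []
    for _ in range(n - 1):
        # Cheapest edge from the tree to a node outside it.
        _, i, j = min((math.dist(points[i], points[j]), i, j)
                      for i in range(n) if in_tree[i]
                      for j in range(n) if not in_tree[j])
        in_tree[j] = True
        edges.append((i, j))
    return edges

def spanning_forest(points, labels):
    """Cut the MST edges joining different classes: what remains is a
    spanning forest whose trees lie in single-class regions."""
    mst = prim_mst(points)
    kept = [(i, j) for i, j in mst if labels[i] == labels[j]]
    cut = [(i, j) for i, j in mst if labels[i] != labels[j]]
    return kept, cut
```

For two well-separated clusters the MST contains exactly one inter-class edge, so cutting it yields one tree per class.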

Relevance:

30.00%

Publisher:

Abstract:

We have recently demonstrated a biosensor based on a lattice of SU8 pillars on a 1 μm SiO2/Si wafer, interrogated by measuring vertical reflectivity as a function of wavelength. Biodetection has been proven with the combination of Bovine Serum Albumin (BSA) protein and its antibody (antiBSA). A BSA layer is attached to the pillars; the biorecognition of antiBSA involves a shift in the reflectivity curve, related to the concentration of antiBSA. A detection limit on the order of 2 ng/ml is achieved for a rhombic lattice of pillars with a lattice parameter (a) of 800 nm, a height (h) of 420 nm, and a diameter (d) of 200 nm. These results correlate with calculations using the 3D finite-difference time-domain (FDTD) method. A simplified 2D model is proposed, consisting of a multilayer model in which the pillars are replaced by a 420 nm layer with an effective refractive index obtained using the Beam Propagation Method (BPM) algorithm. Results provided by this model are in good agreement with experimental data, reducing the computation time from one day to 15 minutes and giving a fast but accurate tool to optimize the design, maximize sensitivity, and analyze the influence of different variables (diameter, height, and lattice parameter). Sensitivity is obtained for a variety of configurations, reaching a limit of detection under 1 ng/ml. The optimum design is chosen not only for its sensitivity but also for its feasibility, from both the fabrication (limited by the aspect ratio and proximity of the pillars) and the fluidic points of view. (© 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
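The simplified multilayer model can be illustrated with a standard normal-incidence characteristic-matrix (transfer-matrix) calculation; the abstract obtains the effective index with BPM, whereas here any index value is simply taken as given. This is a textbook stand-in, not the authors' code.

```python
import cmath

def reflectivity(ns, ds, n_in, n_sub, wavelength):
    """Normal-incidence reflectance of a thin-film stack, computed with the
    standard characteristic-matrix method.

    ns, ds      : refractive index and thickness of each layer, listed from
                  the input medium toward the substrate (thickness in the
                  same units as wavelength)
    n_in, n_sub : indices of the input medium and substrate
    """
    m11, m12, m21, m22 = 1.0, 0.0, 0.0, 1.0
    for n, d in zip(ns, ds):
        delta = 2.0 * cmath.pi * n * d / wavelength   # phase thickness
        c, s = cmath.cos(delta), cmath.sin(delta)
        a11, a12, a21, a22 = c, 1j * s / n, 1j * n * s, c
        m11, m12, m21, m22 = (m11 * a11 + m12 * a21, m11 * a12 + m12 * a22,
                              m21 * a11 + m22 * a21, m21 * a12 + m22 * a22)
    num = n_in * (m11 + m12 * n_sub) - (m21 + m22 * n_sub)
    den = n_in * (m11 + m12 * n_sub) + (m21 + m22 * n_sub)
    return abs(num / den) ** 2
```

With no layers this reduces to the Fresnel reflectance of the bare substrate, and a quarter-wave layer of index sqrt(n_sub) drives the reflectance to zero, which makes a convenient sanity check.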

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, digital computer systems and networks are the main engineering tools, being used in the planning, design, operation, and control of all sizes of building, transportation, machinery, business, and life-maintaining devices. Consequently, computer viruses have become one of the most important sources of uncertainty, contributing to a decrease in the reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on previous knowledge of the virus code. In spite of their good adaptation capability, these programs work just as vaccines against diseases and are not able to prevent new infections based on the network state. Here, an attempt to model the propagation dynamics of computer viruses relates them to other notable events occurring in the network, permitting preventive policies to be established in network management. Data from three different viruses were collected on the Internet, and two different identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using the data collected from other viruses that formerly infected the network. Copyright (c) 2008 J. R. C. Piqueira and F. B. Cesar. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
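The autoregressive identification idea can be sketched with a toy AR(1) fit by least squares; the paper's models are richer (and also use Fourier analysis), so this is only a hypothetical illustration of fitting propagation data from one series and iterating the fitted model forward.

```python
def fit_ar1(series):
    """Least-squares estimate of phi in the zero-mean AR(1) model
    x[t] = phi * x[t-1]."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

def forecast_ar1(last_value, phi, horizon):
    """Iterate the fitted model forward to forecast future values."""
    out = []
    for _ in range(horizon):
        last_value = phi * last_value
        out.append(last_value)
    return out
```

A coefficient fitted to one virus's infection counts could, in this spirit, be used to extrapolate the early dynamics of a new one.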

Relevance:

20.00%

Publisher:

Abstract:

Compartmental epidemiological models have been developed since the 1920s and successfully applied to study the propagation of infectious diseases. Moreover, owing to their structure, in the 1960s an interesting version of these models was developed to clarify some aspects of rumor propagation, considering that spreading an infectious disease and disseminating information are analogous phenomena. Here, in an analogy with the SIR (Susceptible-Infected-Removed) epidemiological model, the ISS (Ignorant-Spreader-Stifler) rumor spreading model is studied. Using concepts from Dynamical Systems Theory, the stability of equilibrium points is established according to the propagation parameters and initial conditions. Some numerical experiments are conducted in order to validate the model.
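A minimal sketch of the ISS model as a dynamical system, integrated with Euler's method. The transition rates follow the SIR analogy described above (ignorants become spreaders on contact with spreaders; spreaders become stiflers on contact with spreaders or stiflers), and the parameter values are illustrative, not taken from the paper.

```python
def iss_step(i, s, r, b=0.5, a=0.2, dt=0.01):
    """One Euler step of the ISS (Ignorant-Spreader-Stifler) model:
    ignorants become spreaders at rate b on contact with spreaders;
    spreaders become stiflers at rate a on contact with spreaders or
    stiflers (the rumor analogue of removal in SIR)."""
    di = -b * i * s
    ds = b * i * s - a * s * (s + r)
    dr = a * s * (s + r)
    return i + dt * di, s + dt * ds, r + dt * dr

def simulate_iss(i0=0.99, s0=0.01, r0=0.0, steps=5000):
    """Integrate the model; the three fractions always sum to one."""
    i, s, r = i0, s0, r0
    for _ in range(steps):
        i, s, r = iss_step(i, s, r)
    return i, s, r
```

As in SIR, the rumor eventually dies out (the spreader fraction vanishes) while a residual fraction of ignorants never hears it.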

Relevance:

20.00%

Publisher:

Abstract:

Introduction and Purpose: Bimatoprost and the fixed combination of latanoprost with timolol maleate are 2 medications widely used to treat glaucoma and ocular hypertension (OHT). The aim of the study is to compare the efficacy of these 2 drugs in reducing intraocular pressure (IOP) after 8 weeks of treatment in patients with primary open angle glaucoma (POAG) or OHT. Methods: In this randomized, open-label trial, 44 patients with POAG or OHT were allocated to receive either bimatoprost (1 drop QD) or latanoprost/timolol (1 drop QD). The primary outcome was the mean diurnal IOP measurement at the 8th week, calculated as the mean of IOP measurements taken at 8:00 AM, 10:00 AM, and 12:00 PM. Secondary outcomes included the change from baseline in IOP measured 3 times a day, the IOP after the water-drinking test (performed after the last IOP measurement), and the assessment of side effects of each therapy. Results: The mean IOP level of latanoprost/timolol (13.83, SD = 2.54) was significantly lower than that of bimatoprost (16.16, SD = 3.28; P < 0.0001) at week 8. Also, the change in mean IOP values was significantly higher in the latanoprost/timolol group at 10:00 AM (P = 0.013) and 12:00 PM (P = 0.01), but not at 8:00 AM (P = ns). During the water-drinking test, there was no significant difference in IOP increase (absolute and percentage) between groups; however, there was a significant decrease in mean heart rate in the latanoprost/timolol group. Finally, no significant changes in blood pressure and lung spirometry were observed in either group. Conclusions: The fixed combination of latanoprost/timolol was significantly superior to bimatoprost alone in reducing IOP in patients with POAG or OHT. Further studies with large sample sizes should be conducted to support the superior efficacy of latanoprost/timolol, as well as to better assess its profile of side effects.

Relevance:

20.00%

Publisher:

Abstract:

In the last decade the Sznajd model has been successfully employed in modeling some properties and scale features of both proportional and majority elections. We propose a version of the Sznajd model with a generalized bounded confidence rule: a rule that limits the convincing capability of agents and that is essential to allow the coexistence of opinions in the stationary state. With an appropriate choice of parameters it can be reduced to previous models. We solved this model both in a mean-field approach (for an arbitrary number of opinions) and numerically in a Barabási-Albert network (for three and four opinions), studying the transient and the possible stationary states. We built the phase portrait for the special cases of three and four opinions, defining the attractors and their basins of attraction. Through this analysis, we were able to understand and explain discrepancies between mean-field and simulation results obtained in previous works for the usual Sznajd model with bounded confidence and three opinions. Both the dynamical systems approach and our generalized bounded confidence rule are quite general, and we think they can be useful to the understanding of other similar models.
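A toy simulation of Sznajd dynamics on a ring with a bounded-confidence restriction, assuming discrete, ordered opinions; the update rule and parameters are illustrative, not the paper's generalized rule or its network topology.

```python
import random

def sznajd_bc(opinions, eps=1, sweeps=200, seed=1):
    """Sznajd dynamics on a ring with a bounded-confidence rule: a pair of
    neighbors sharing an opinion convinces its two outer neighbors, but
    only those whose opinion differs from the pair's by at most eps."""
    rng = random.Random(seed)
    op = list(opinions)
    n = len(op)
    for _ in range(sweeps * n):
        i = rng.randrange(n)
        j = (i + 1) % n
        if op[i] == op[j]:                        # agreeing pair
            for k in ((i - 1) % n, (j + 1) % n):  # its outer neighbors
                if abs(op[k] - op[i]) <= eps:     # bounded confidence
                    op[k] = op[i]
    return op
```

Since opinions are only copied, never created, the set of surviving opinions can only shrink over time; whether distinct opinions coexist in the stationary state depends on eps.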

Relevance:

20.00%

Publisher:

Abstract:

Fatigue and crack propagation are phenomena affected by high uncertainties, for which deterministic methods fail to predict structural life accurately. The present work aims at coupling reliability analysis with the boundary element method (BEM). The latter has been recognized as an accurate and efficient numerical technique for dealing with mixed-mode propagation, which is very interesting for reliability analysis. The coupled procedure allows us to consider uncertainties during the crack growth process. In addition, it computes the probability of fatigue failure for complex structural geometries and loadings. Two coupling procedures are considered: direct coupling of the reliability and mechanical solvers, and indirect coupling through the response surface method. Numerical applications show the performance of the proposed models in lifetime assessment under uncertainties, where the direct method has shown faster convergence than the response surface method. (C) 2010 Elsevier Ltd. All rights reserved.
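The reliability side of such a coupling can be sketched with plain Monte Carlo on a closed-form limit state standing in for the mechanical solver; in the paper, the response would come from the BEM crack-growth simulation, so the limit state and its parameters below are purely illustrative.

```python
import random

def mc_failure_probability(mu_r, sig_r, mu_s, sig_s, n=50000, seed=0):
    """Direct Monte Carlo estimate of the probability of failure
    P[g < 0] for the limit state g = R - S, with independent Gaussian
    resistance R and load effect S."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        g = rng.gauss(mu_r, sig_r) - rng.gauss(mu_s, sig_s)
        if g < 0.0:
            failures += 1
    return failures / n
```

For this Gaussian case the exact answer is Phi(-beta) with beta = (mu_r - mu_s) / sqrt(sig_r**2 + sig_s**2), which is useful for checking the estimator; a response surface method would replace the expensive g with a cheap fitted surrogate before sampling.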

Relevance:

20.00%

Publisher:

Abstract:

This work deals with the analysis of cracked structures using the BEM. Two formulations to analyse the crack growth process in quasi-brittle materials are discussed. They are based on the dual formulation of the BEM, in which two different integral equations are employed along the opposite sides of the crack surface. The first formulation uses the concept of a constant operator, in which the corrections of the nonlinear process are made only by applying appropriate tractions along the crack surfaces. The second BEM formulation for analysing crack growth problems is an implicit technique based on the use of a consistent tangent operator. This formulation is accurate, stable, and always requires far fewer iterations to reach equilibrium within a given load increment in comparison with the classical approach. Comparative examples of classical crack growth problems are shown to illustrate the performance of the two formulations. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a boundary element method (BEM) model for the analysis of multiple random crack growth, considering linear elastic fracture mechanics problems and structures subjected to fatigue. The formulation presented in this paper is based on the dual boundary element method, in which singular and hyper-singular integral equations are used. This technique avoids singularities in the resulting algebraic system of equations, despite the fact that the collocation points coincide for the two opposite crack faces. In the fracture mechanics analyses, the displacement correlation technique is applied to evaluate stress intensity factors. The maximum circumferential stress theory is used to evaluate the propagation angle and the effective stress intensity factor. The fatigue model uses Paris' law to predict structural life. Examples of simple and multi-fractured structures loaded until rupture are considered. These analyses demonstrate the robustness of the proposed model. In addition, the results indicate that the formulation is accurate and can model localisation and coalescence phenomena. (C) 2010 Elsevier Ltd. All rights reserved.
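Paris'-law life prediction, as used in the fatigue model above, can be sketched by numerically integrating da/dN; the geometry factor and material constants below are illustrative, and a real analysis would take the stress intensity range from the BEM solution rather than the simple sqrt(pi*a) formula assumed here.

```python
import math

def paris_life(a0, af, dsigma, c, m, y=1.0, steps=2000):
    """Fatigue life in cycles by trapezoidal integration of Paris' law,
    da/dN = c * dK**m, with stress intensity range dK = y * dsigma *
    sqrt(pi * a), from initial crack size a0 to final crack size af."""
    def rate(a):
        dk = y * dsigma * math.sqrt(math.pi * a)
        return c * dk ** m

    h = (af - a0) / steps
    total = 0.5 * (1.0 / rate(a0) + 1.0 / rate(af))
    for k in range(1, steps):
        total += 1.0 / rate(a0 + k * h)
    return total * h
```

For constant y and m != 2 this integral has a closed form, which makes the routine easy to validate before swapping in a numerically computed dK.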