999 results for Universal Decimal Classification


Relevance: 20.00%

Abstract:

This work proposes a new approach using a committee machine of artificial neural networks to classify masses found in mammograms as benign or malignant. Three shape factors, three edge-sharpness measures, and 14 texture measures are used for the classification of 20 regions of interest (ROIs) related to malignant tumors and 37 ROIs related to benign masses. A group of multilayer perceptrons (MLPs) is employed as a committee machine of neural network classifiers. The classification results are reached by combining the responses of the individual classifiers. Experiments involving changes in the learning algorithm of the committee machine are conducted. The classification accuracy is evaluated using the area A_z under the receiver operating characteristic (ROC) curve. The A_z result for the committee machine is compared with the A_z results obtained using MLPs and single-layer perceptrons (SLPs), as well as a linear discriminant analysis (LDA) classifier. Tests are carried out using Student's t-distribution. The committee machine classifier outperforms the MLP, SLP, and LDA classifiers in the following cases: with the shape measure of spiculation index, the A_z values of the four methods are, in order, 0.93, 0.84, 0.75, and 0.76; and with the edge-sharpness measure of acutance, the values are 0.79, 0.70, 0.69, and 0.74. Although the features with which improvement is obtained with the committee machines are not the same as those that provided the maximal value of A_z (A_z = 0.99 with some shape features, with or without the committee machine), they correspond to features that are not critically dependent on the accuracy of the boundaries of the masses, which is an important result. (c) 2008 SPIE and IS&T.
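
As a rough illustration of the committee-machine idea described above (not the authors' implementation), the sketch below trains several MLPs on placeholder feature vectors, averages their predicted probabilities as one possible combination rule, and scores the result with the area under the ROC curve; the data, network sizes, and averaging rule are all assumptions.

```python
# Minimal sketch of a committee machine of MLP classifiers evaluated by ROC AUC (A_z).
# The feature matrix X (shape, edge-sharpness, texture measures) and labels y are
# placeholders; averaging member probabilities is only one possible combination rule.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(57, 20))          # 57 ROIs x 20 features (hypothetical data)
y = np.r_[np.ones(20), np.zeros(37)]   # 20 malignant, 37 benign

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=1)

# Committee: several MLPs trained from different random initializations.
committee = [MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=k).fit(X_tr, y_tr)
             for k in range(5)]

# Combine the members' responses by averaging their predicted probabilities.
p = np.mean([m.predict_proba(X_te)[:, 1] for m in committee], axis=0)
print("A_z (ROC AUC):", roc_auc_score(y_te, p))
```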

Relevance: 20.00%

Abstract:

Background: Since establishing universal free access to antiretroviral therapy in 1996, the Brazilian Health System has increased the number of centers providing HIV/AIDS outpatient care from 33 to 540. There had been no formal monitoring of the quality of these services until a survey of 336 AIDS health centers across 7 Brazilian states was undertaken in 2002. Managers of the services were asked to assess their clinics according to parameters of service inputs and service delivery processes. This report analyzes the survey results and identifies predictors of the overall quality of service delivery. Methods: The survey involved completion of a multiple-choice questionnaire comprising 107 parameters of service inputs and processes of delivering care, with responses assessed according to their likely impact on service quality using a 3-point scale. K-means clustering was used to group these services according to their scored responses. Logistic regression analysis was performed to identify predictors of high service quality. Results: The questionnaire was completed by 95.8% (322) of the managers of the sites surveyed. Most sites scored about 50% of the benchmark expectation. K-means clustering analysis identified four quality levels within which services could be grouped: 76 services (24%) were classed as level 1 (best), 53 (16%) as level 2 (medium), 113 (35%) as level 3 (poor), and 80 (25%) as level 4 (very poor). Parameters of service delivery processes were more important than those relating to service inputs for determining the quality classification. Predictors of quality services included larger care sites, specialization for HIV/AIDS, and location within large municipalities. Conclusion: The survey demonstrated highly variable levels of HIV/AIDS service quality across the sites. Many sites were found to have deficiencies in the processes of service delivery that could benefit from quality improvement initiatives. These findings could have implications for how HIV/AIDS services are planned in Brazil to achieve quality standards, such as where service sites should be located, their size, and their staffing requirements. A set of service delivery indicators has been identified that could be used for routine monitoring of HIV/AIDS service delivery in Brazil (and potentially in other similar settings).
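
A minimal sketch of the two statistical steps mentioned (K-means grouping of the scored questionnaire responses, then logistic regression for predictors of high quality), using random placeholder data rather than the survey responses; the cluster treated as "best" and the three site-level predictors are illustrative assumptions.

```python
# Sketch: group services by K-means on scored questionnaire responses, then use
# logistic regression to look for predictors of belonging to the best-quality group.
# The data below are random placeholders standing in for the 107 scored parameters.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
scores = rng.integers(0, 3, size=(322, 107))      # 322 sites x 107 items, 3-point scale
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

# Hypothetical site-level predictors: size, HIV/AIDS specialization, large municipality.
predictors = rng.integers(0, 2, size=(322, 3))
is_best = (clusters == 0).astype(int)             # placeholder choice of the "best" cluster
model = LogisticRegression().fit(predictors, is_best)
print("odds ratios:", np.exp(model.coef_))
```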

Relevance: 20.00%

Abstract:

Aims. In this work, we describe the pipeline for the fast supervised classification of light curves observed by the CoRoT exoplanet CCDs. We present the classification results obtained for the first four measured fields, which represent one year of in-orbit operation. Methods. The basis of the adopted supervised classification methodology has been described in detail in a previous paper, as has its application to the OGLE database. Here, we present the modifications of the algorithms and of the training set to optimize the performance when applied to the CoRoT data. Results. Classification results are presented for the observed fields IRa01, SRc01, LRc01, and LRa01 of the CoRoT mission. Statistics on the number of variables and the number of objects per class are given and typical light curves of high-probability candidates are shown. We also report on new stellar variability types discovered in the CoRoT data. The full classification results are publicly available.
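
The abstract does not detail the classifier itself, so the following is only a generic stand-in for a supervised light-curve classification step: simple periodogram features are extracted from each curve and fed to an off-the-shelf classifier. The toy light curves, the chosen features, and the random-forest classifier are assumptions, not the CoRoT pipeline.

```python
# Illustrative stand-in for supervised light-curve classification: extract a few
# Lomb-Scargle periodogram features per curve and train a classifier on labeled examples.
import numpy as np
from scipy.signal import lombscargle
from sklearn.ensemble import RandomForestClassifier

def features(t, flux, freqs):
    """Dominant angular frequency, its normalized power, and the flux dispersion."""
    power = lombscargle(t, flux - flux.mean(), freqs, normalize=True)
    k = np.argmax(power)
    return [freqs[k], power[k], flux.std()]

rng = np.random.default_rng(0)
freqs = np.linspace(0.1, 10.0, 2000)          # angular frequencies to scan
t = np.sort(rng.uniform(0, 30, 500))          # observation times (days)

# Two toy classes: sinusoidal "pulsators" vs. noise-dominated curves.
X, y = [], []
for label in (0, 1):
    for _ in range(30):
        flux = np.sin(3.0 * t) * label + rng.normal(0, 1.0, t.size)
        X.append(features(t, flux, freqs))
        y.append(label)

clf = RandomForestClassifier(random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```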

Relevance: 20.00%

Abstract:

In Natural Language Processing (NLP) symbolic systems, several linguistic phenomena, for instance the thematic role relationships between sentence constituents, such as AGENT, PATIENT, and LOCATION, can be accounted for by the employment of a rule-based grammar. Another approach to NLP concerns the use of the connectionist model, which has the benefits of learning, generalization, and fault tolerance, among others. A third option merges the two previous approaches into a hybrid one: a symbolic thematic theory is used to supply the connectionist network with initial knowledge. Inspired by neuroscience, a symbolic-connectionist hybrid system called BIOθPRED (BIOlogically plausible thematic (θ) symbolic-connectionist PREDictor) is proposed, designed to reveal the thematic grid assigned to a sentence. Its connectionist architecture comprises, as input, a featural representation of the words (based on the verb/noun WordNet classification and on the classical semantic microfeature representation) and, as output, the thematic grid assigned to the sentence. BIOθPRED is designed to "predict" thematic (semantic) roles assigned to words in a sentence context, employing a biologically inspired training algorithm and architecture and adopting a psycholinguistic view of thematic theory.
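
To make the input/output mapping concrete, here is a minimal sketch (not the BIOθPRED architecture or its biologically inspired training algorithm) in which a binary semantic-microfeature vector is mapped to a thematic-role label by a small feed-forward network; the feature set, role inventory, and data are placeholders.

```python
# Minimal illustration of the mapping described above: a word (or verb-argument pair)
# represented by a binary semantic-microfeature vector is mapped to a thematic-role
# label (AGENT, PATIENT, LOCATION, ...). Features, labels, and network are placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier

ROLES = ["AGENT", "PATIENT", "LOCATION"]
rng = np.random.default_rng(0)

# Hypothetical microfeature vectors (e.g. animate, concrete, motion, place, ...).
X = rng.integers(0, 2, size=(90, 12))
y = rng.integers(0, len(ROLES), size=90)

net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=3000, random_state=0).fit(X, y)
sample = rng.integers(0, 2, size=(1, 12))
print("predicted role:", ROLES[net.predict(sample)[0]])
```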

Relevance: 20.00%

Abstract:

We report on some unusual behavior of the measured current-voltage characteristics (CVC) in an artificially prepared two-dimensional unshunted array of overdamped Nb-AlO_x-Nb Josephson junctions. The obtained nonlinear CVC are found to exhibit a pronounced (and practically temperature-independent) crossover at some current I_cr = (β_C/2 - 1)I_C from a resistance-dominated state with V_R = R√(I² - I_C²) below I_cr to a capacitance-dominated state with V_C = √(ħ/4eC)·√(I - I_C) above I_cr. The origin of the observed behavior is discussed within a single-plaquette approximation assuming the conventional resistively shunted junction model with a finite capacitance and the Ambegaokar-Baratoff relation for the critical current of a single junction. (C) 2010 American Institute of Physics. [doi: 10.1063/1.3407566]
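
The piecewise current-voltage behavior quoted above can be transcribed directly, as in the short sketch below; the parameter values (R, I_C, β_C, ħ/4eC) are arbitrary and only illustrate the shape of the crossover, and the reading of I_cr = (β_C/2 - 1)I_C follows the expression quoted in the abstract.

```python
# Piecewise current-voltage curve built directly from the two expressions quoted in the
# abstract; parameter values are arbitrary and only illustrate the crossover shape.
import numpy as np

R = 1.0            # junction resistance (arbitrary units)
I_C = 1.0          # critical current
beta_C = 4.0       # Stewart-McCumber parameter (hypothetical value)
hbar_4eC = 0.5     # stands in for hbar / (4 e C), arbitrary units
I_cr = (0.5 * beta_C - 1.0) * I_C

def V(I):
    """Resistance-dominated branch below I_cr, capacitance-dominated branch above."""
    if I <= I_cr:
        return R * np.sqrt(max(I**2 - I_C**2, 0.0))
    return np.sqrt(hbar_4eC) * np.sqrt(I - I_C)

for I in np.linspace(1.0, 3.0, 5):
    print(f"I = {I:.2f}, V = {V(I):.3f}")
```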

Relevance: 20.00%

Abstract:

We calculate the entanglement entropy of blocks of size x embedded in a larger system of size L, by means of a combination of analytical and numerical techniques. The complete entanglement entropy in this case is a sum of three terms. One is a universal x- and L-dependent term, first predicted by Calabrese and Cardy; the second is a nonuniversal term arising from the thermodynamic limit; and the third is a finite-size correction. We give an explicit expression for the second, nonuniversal, term for the one-dimensional Hubbard model, and numerically assess the importance of all three contributions by comparing to the entropy obtained from fully numerical diagonalization of the many-body Hamiltonian. We find that finite-size corrections are very small. The universal Calabrese-Cardy term is equally small for small blocks, but becomes larger for x > 1. In all investigated situations, however, the by far dominating contribution is the nonuniversal term stemming from the thermodynamic limit.
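
For context, the universal term referred to is usually written in the standard Calabrese-Cardy form below (periodic boundary conditions, central charge c, lattice spacing set to one); the precise convention used in the paper may differ.

```latex
% Standard Calabrese--Cardy form of the universal block-size dependence of the
% entanglement entropy (periodic boundary conditions, central charge c).
\begin{equation}
  S_{\mathrm{CC}}(x, L) \;=\; \frac{c}{3}\,
  \ln\!\left[\frac{L}{\pi}\sin\!\left(\frac{\pi x}{L}\right)\right] + \text{const.}
\end{equation}
```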

Relevance: 20.00%

Abstract:

A numerical renormalization-group study of the conductance through a quantum wire containing noninteracting electrons side-coupled to a quantum dot is reported. The temperature and the dot-energy dependence of the conductance are examined in the light of a recently derived linear mapping between the temperature-dependent conductance and the universal function describing the conductance for the symmetric Anderson model of a quantum wire with an embedded quantum dot. Two conduction paths, one traversing the wire, the other a bypass through the quantum dot, are identified. A gate potential applied to the quantum wire is shown to control the current through the bypass. When the potential favors transport through the wire, the conductance in the Kondo regime rises from nearly zero at low temperatures to nearly ballistic at high temperatures. When it favors the dot, the pattern is reversed: the conductance decays from nearly ballistic to nearly zero. When comparable currents flow through the two channels, the conductance is nearly temperature independent in the Kondo regime, and Fano antiresonances in the fixed-temperature plots of the conductance as a function of the dot energy signal interference between the two channels. Throughout the Kondo regime and, at low temperatures, even in the mixed-valence regime, the numerical data are in excellent agreement with the universal mapping.
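
The Fano antiresonances mentioned are commonly described by the generic Fano lineshape below; this expression is standard textbook background rather than a result of the paper, and the symbols (q, Γ, E_res) are the usual notation, not the paper's.

```latex
% Generic Fano lineshape commonly invoked for interference between a resonant (dot)
% path and a direct (wire) path; standard form, not taken from the paper.
\begin{equation}
  G(\varepsilon) \;\propto\; \frac{(\varepsilon + q)^2}{\varepsilon^2 + 1},
  \qquad \varepsilon = \frac{E_d - E_{\mathrm{res}}}{\Gamma/2}
\end{equation}
% q is the Fano asymmetry parameter; q -> 0 gives a symmetric antiresonance (dip).
```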

Relevance: 20.00%

Abstract:

The thermal dependence of the zero-bias conductance for the single electron transistor is the target of two independent renormalization-group approaches, both based on the spin-degenerate Anderson impurity model. The first approach, an analytical derivation, maps the Kondo-regime conductance onto the universal conductance function for the particle-hole symmetric model. The mapping is linear and is parametrized by the Kondo temperature and the charge in the Kondo cloud. The second approach, a numerical renormalization-group computation of the conductance as a function of the temperature and applied gate voltages, offers a comprehensive view of zero-bias charge transport through the device. The first approach is exact in the Kondo regime; the second, essentially exact throughout the parametric space of the model. For illustrative purposes, conductance curves resulting from the two approaches are compared.
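
Schematically, the linear mapping described can be written as below, with placeholder coefficients a and b standing for the combination fixed by the Kondo temperature and the charge in the Kondo cloud; this is an illustrative form, not the paper's notation.

```latex
% Schematic affine (linear) mapping between the measured conductance and the universal
% curve G_u(T/T_K) of the particle-hole symmetric model; a and b are placeholders.
\begin{equation}
  G(T) \;\approx\; a + b\, G_{u}\!\left(T/T_K\right) \qquad \text{(Kondo regime)}
\end{equation}
```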

Relevance: 20.00%

Abstract:

Efficient automatic protein classification is of central importance in genomic annotation. As an independent way to check the reliability of the classification, we propose a statistical approach to test if two sets of protein domain sequences coming from two families of the Pfam database are significantly different. We model protein sequences as realizations of Variable Length Markov Chains (VLMC) and we use the context trees as a signature of each protein family. Our approach is based on a Kolmogorov-Smirnov-type goodness-of-fit test proposed by Balding et al. [Limit theorems for sequences of random trees (2008), DOI: 10.1007/s11749-008-0092-z]. The test statistic is a supremum over the space of trees of a function of the two samples; its computation grows, in principle, exponentially fast with the maximal number of nodes of the potential trees. We show how to transform this problem into a max-flow over a related graph, which can be solved using a Ford-Fulkerson algorithm in time polynomial in that number. We apply the test to 10 randomly chosen protein domain families from the seed of the Pfam-A database (high quality, manually curated families). The test shows that the distributions of context trees coming from different families are significantly different. We emphasize that this is a novel mathematical approach to validate the automatic clustering of sequences in any context. We also study the performance of the test via simulations on Galton-Watson related processes.
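
As a toy illustration of the final computational step (solving a maximum-flow problem with a Ford-Fulkerson-style algorithm), the sketch below runs an augmenting-path max-flow on an arbitrary small network using networkx; the graph is not the construction derived from the context trees in the paper.

```python
# Toy illustration: once the supremum over context trees is recast as a maximum-flow
# problem, it can be solved with a Ford-Fulkerson-style (augmenting-path) algorithm.
# The graph below is arbitrary and is not the construction used in the paper.
import networkx as nx
from networkx.algorithms.flow import edmonds_karp

G = nx.DiGraph()
G.add_edge("s", "a", capacity=3.0)
G.add_edge("s", "b", capacity=2.0)
G.add_edge("a", "t", capacity=2.0)
G.add_edge("a", "b", capacity=1.0)
G.add_edge("b", "t", capacity=3.0)

flow_value, flow_dict = nx.maximum_flow(G, "s", "t", flow_func=edmonds_karp)
print("max flow:", flow_value)      # 5.0 for this toy network
```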

Relevance: 20.00%

Abstract:

The problem of semialgebraic Lipschitz classification of quasihomogeneous polynomials on a Hölder triangle is studied. For this problem, the "moduli" are described completely in certain combinatorial terms.

Relevance: 20.00%

Abstract:

The purpose of this paper is to explicitly describe, in terms of generators and relations, the universal central extension of the infinite-dimensional Lie algebra g ⊗ C[t, t⁻¹, u | u² = (t² - b²)(t² - c²)], appearing in the work of Date, Jimbo, Kashiwara and Miwa in their study of integrable systems arising from the Landau-Lifshitz differential equation.

Relevance: 20.00%

Abstract:

Quality control of toys to avoid children's exposure to potentially toxic elements is of utmost relevance, and it is a common requirement in national and/or international norms for health and safety reasons. Laser-induced breakdown spectroscopy (LIBS) was recently evaluated at the authors' laboratory for the direct analysis of plastic toys, and one of the main difficulties for the determination of Cd, Cr and Pb was the variety of mixtures and types of polymers. As most norms rely on migration (lixiviation) protocols, chemometric classification models built from LIBS spectra were tested for sampling toys that present a potential risk of Cd, Cr and Pb contamination. The classification models were generated from the emission spectra of 51 polymeric toys using Partial Least Squares Discriminant Analysis (PLS-DA), Soft Independent Modeling of Class Analogy (SIMCA) and K-Nearest Neighbor (KNN). The classification models were built with 40 samples and validated with 11 test samples. Best results were obtained with KNN, with correct predictions ranging from 95% for Cd to 100% for Cr and Pb. (C) 2011 Elsevier B.V. All rights reserved.
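
A minimal sketch of the KNN step, assuming each toy is represented by a vector of LIBS emission intensities; the spectra, class labels, and k value are placeholders, and the 40/11 split merely mirrors the sample counts given above.

```python
# Sketch of the KNN classification step: each toy is represented by its LIBS emission
# spectrum (here, random vectors standing in for intensities at selected wavelengths)
# and assigned to a risk class. Data, class labels, and k are placeholders.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
spectra = rng.random((51, 300))                 # 51 toys x 300 spectral variables
labels = rng.integers(0, 2, size=51)            # e.g. 1 = potential Cd/Cr/Pb risk

X_tr, X_te, y_tr, y_te = train_test_split(spectra, labels, test_size=11, random_state=1)
knn = KNeighborsClassifier(n_neighbors=3).fit(X_tr, y_tr)
print("fraction of correct predictions on the test set:", (knn.predict(X_te) == y_te).mean())
```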

Relevance: 20.00%

Abstract:

Objective: We carry out a systematic assessment of a suite of kernel-based learning machines while coping with the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of the criteria of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of the wavelet basis on the quality of the features extracted. Four wavelet basis functions were considered in this study. Then, we provide the average accuracy (i.e., cross-validation error) values delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, whereby one can visually inspect their levels of sensitivity to the type of feature and to the kernel function/parameter value. Conclusions: Overall, the results indicate that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing more consistently. Moreover, the choice of the kernel function and parameter value as well as the choice of the feature extractor are critical decisions to be taken, although the choice of the wavelet family seems not to be particularly relevant. Also, the statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile has emerged among all types of machines, involving some regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality). (C) 2011 Elsevier B.V. All rights reserved.
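
One configuration from the study design, sketched with placeholder signals: statistical values of a discrete wavelet decomposition are used as features for a standard SVM with a Gaussian (RBF) kernel. The wavelet choice ('db4'), kernel parameters, and toy EEG segments are assumptions, not the settings used in the paper.

```python
# Sketch of one kernel-machine configuration: statistical features from a discrete
# wavelet decomposition of each EEG segment, classified with a standard RBF-kernel SVM.
import numpy as np
import pywt
from sklearn.svm import SVC

def dwt_features(signal, wavelet="db4", level=4):
    """Mean absolute value and standard deviation of each wavelet sub-band."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.concatenate([[np.mean(np.abs(c)), np.std(c)] for c in coeffs])

rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):                          # 0 = normal, 1 = epileptic (toy labels)
    for _ in range(50):
        sig = rng.normal(0, 1 + label, 1024)  # toy EEG segments
        X.append(dwt_features(sig))
        y.append(label)

clf = SVC(kernel="rbf", gamma=0.1, C=1.0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```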

Relevance: 20.00%

Abstract:

Traditionally, chronotype classification is based on the Morningness-Eveningness Questionnaire (MEQ). It is implicit in the classification that intermediate individuals give intermediate scores to most of the MEQ questions. However, a small group of individuals has a different pattern of answers: in some questions they answer as "morning-types" and in others as "evening-types," resulting in an intermediate total score. "Evening-type" and "morning-type" answers were set as A1 and A4, respectively; intermediate answers were set as A2 and A3. The following algorithm was applied: Bimodality Index = (ΣA1 × ΣA4)² - (ΣA2 × ΣA3)². Neither-types with positive bimodality scores were classified as bimodal. If our hypothesis is validated by objective data, an update of chronotype classification will be required. (Author correspondence: brunojm@ymail.com)
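
A direct, minimal transcription of the rule described, assuming ΣA_i is read as the number of answers of type i across the questionnaire; the example answer vector is made up.

```python
# Bimodality Index = (sum(A1) * sum(A4))**2 - (sum(A2) * sum(A3))**2, with A1 = "evening-type",
# A4 = "morning-type", and A2/A3 = intermediate answers. Here each sum is interpreted as the
# count of answers of that type; the answer vector below is a hypothetical example.
def bimodality_index(answers):
    """answers: list of per-question categories, each one of 1, 2, 3, or 4."""
    counts = {k: answers.count(k) for k in (1, 2, 3, 4)}
    return (counts[1] * counts[4]) ** 2 - (counts[2] * counts[3]) ** 2

answers = [1, 4, 1, 4, 2, 3, 1, 4, 4, 1]       # hypothetical neither-type respondent
idx = bimodality_index(answers)
print("bimodal" if idx > 0 else "not bimodal", idx)
```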

Relevance: 20.00%

Abstract:

The objective of this article is to examine the presence of cinema in World Fairs between the 1893 World's Columbian Exposition in Chicago and 1939 (the New York World's Fair). As an integral part of a visual culture constructed by these spaces to celebrate capitalism, the trajectory of cinema is identified with these world fairs due to its ability to entertain and, at the same time, to educate. Cinema was established as a means of mass communication during the First World War and afterwards would participate more actively in the symbolic disputes of a world about to enter the second global conflict. It would reach a broader public, becoming the main 'showcase' in which nations projected virtues to be celebrated. The new striking visual spectacle assumed, within this context, greater emphasis through films idealized as true cinematographic monuments.