821 results for Classification Criteria of SLE
Abstract:
This report explains the objectives, datasets and evaluation criteria of both the clustering and classification tasks set in the INEX 2009 XML Mining track. The report also describes the approaches and results obtained by the different participants.
Abstract:
Most learning paradigms impose a particular syntax on the class of concepts to be learned; the chosen syntax can dramatically affect whether the class is learnable or not. For classification paradigms, where the task is to determine whether the underlying world does or does not have a particular property, how that property is represented has no bearing on the power of a classifier that just outputs 1’s or 0’s. But is it possible to give a canonical syntactic representation of the class of concepts that are classifiable according to the particular criteria of a given paradigm? We provide a positive answer to this question for classification in the limit paradigms in a logical setting, with ordinal mind change bounds as a measure of complexity. The syntactic characterization that emerges makes it possible to derive that if a possibly noncomputable classifier can perform the task assigned to it by the paradigm, then a computable classifier can also perform the same task. The syntactic characterization is strongly related to the difference hierarchy over the class of open sets of some topological space; this space is naturally defined from the class of possible worlds and possible data of the learning paradigm.
Abstract:
The present study deals with the application of cluster analysis, Fuzzy Cluster Analysis (FCA) and Kohonen Artificial Neural Networks (KANN) methods for classification of 159 meteorological stations in India into meteorologically homogeneous groups. Eight parameters, namely latitude, longitude, elevation, average temperature, humidity, wind speed, sunshine hours and solar radiation, are considered as the classification criteria for grouping. The optimal number of groups is determined as 14 based on the Davies-Bouldin index approach. It is observed that the FCA approach performed better than the other two methodologies for the present study.
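As a rough illustration of how the number of groups can be selected with the Davies-Bouldin index, the sketch below scans candidate cluster counts and keeps the one with the lowest index. It uses k-means as a stand-in clusterer and synthetic data in place of the 159 stations and eight parameters; all names and values are illustrative assumptions, not the study's code.

```python
# Hypothetical sketch: choosing the number of clusters with the
# Davies-Bouldin index, in the spirit of the study's group selection.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Stand-in for the 159 stations x 8 parameters (latitude, longitude,
# elevation, temperature, humidity, wind speed, sunshine, radiation).
X = StandardScaler().fit_transform(rng.normal(size=(159, 8)))

scores = {}
for k in range(2, 21):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = davies_bouldin_score(X, labels)  # lower is better

best_k = min(scores, key=scores.get)
print("Number of groups selected by the Davies-Bouldin index:", best_k)
```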
Abstract:
Clock synchronization in wireless sensor networks (WSNs) assures that sensor nodes have the same reference clock time. This is necessary not only for various WSN applications but also for many system-level protocols for WSNs such as MAC protocols and protocols for sleep scheduling of sensor nodes. The clock value of a node at a particular instant of time depends on its initial value and the frequency of the crystal oscillator used in the sensor node. The frequency of the crystal oscillator varies from node to node, and may also change over time depending upon many factors such as temperature and humidity. As a result, the clock values of different sensor nodes diverge from each other and also from the real-time clock, and hence there is a requirement for clock synchronization in WSNs. Consequently, many clock synchronization protocols for WSNs have been proposed in the recent past. These protocols differ from each other considerably, and so there is a need to understand them using a common platform. Towards this goal, this survey paper categorizes the features of clock synchronization protocols for WSNs into three types, viz., structural features, technical features, and global objective features. Each of these categories has different options to further segregate the features for better understanding. The features of clock synchronization protocols used in this survey include all the features that have been used in existing surveys, as well as new features such as how the clock value is propagated, when the clock value is propagated, and when the physical clock is updated, which are required for a better, systematic understanding of clock synchronization protocols in WSNs. This paper also gives a brief description of a few basic clock synchronization protocols for WSNs and shows how these protocols fit into the above classification criteria. In addition, the recent clock synchronization protocols for WSNs, which are based on the above basic clock synchronization protocols, are presented alongside the corresponding basic protocols. Indeed, the proposed model for characterizing clock synchronization protocols in WSNs can be used not only for analyzing existing protocols but also for designing new clock synchronization protocols.
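A minimal sketch of the hardware clock model the survey describes, assuming the usual offset-plus-skew form C(t) = offset + skew * t; the offsets and skews below are made-up values chosen only to show how the clocks of two nodes diverge over time.

```python
# Toy model: a node's clock reading depends on its initial offset and
# the skew of its crystal oscillator, so unsynchronized clocks drift apart.
def clock_value(real_time, offset, skew):
    """Local clock reading C(t) = offset + skew * t (all in seconds)."""
    return offset + skew * real_time

# Two hypothetical nodes with slightly different crystal frequencies.
node_a = dict(offset=0.002, skew=1.00004)   # roughly 40 ppm fast
node_b = dict(offset=-0.001, skew=0.99996)  # roughly 40 ppm slow

for t in (0, 60, 3600):                     # seconds of real time
    diff = clock_value(t, **node_a) - clock_value(t, **node_b)
    print(f"after {t:5d} s the clocks differ by {diff * 1000:.3f} ms")
```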
Abstract:
Systemic lupus erythematosus is a chronic autoimmune disease with multifactorial etiopathogenesis. The complement system is involved in both the early and late stages of disease development and organ damage. To better understand autoantibody-mediated complement consumption, we examined ex vivo immune complex formation on autoantigen arrays. We recruited patients with SLE (n = 211), patients with other systemic autoimmune diseases (n = 65) and non-autoimmune control subjects (n = 149). Standard clinical and laboratory data were collected and serum complement levels were determined. The genotype of SNP rs1143679 in the ITGAM gene was also determined. Ex vivo formation of immune complexes, with respect to IgM, IgG, complement C4 and C3 binding, was examined using a functional immunoassay on an autoantigen microarray comprising nucleic acids, proteins and lipids. Complement consumption on nucleic acids increased upon binding of IgM and IgG even when serum complement levels were decreased due to consumption in SLE patients. A negative correlation between serum complement levels and ex vivo complement deposition on nucleic acid autoantigens is demonstrated. In contrast, complement deposition on the tested protein and lipid autoantigens showed a positive correlation with C4 levels. Genetic analysis revealed that the non-synonymous variant rs1143679 in complement receptor type 3 is associated with increased production of anti-dsDNA IgG antibodies. Nevertheless, homozygous carriers of the previously reported susceptibility allele (AA) had lower levels of dsDNA-specific IgM among SLE patients. Both the non-synonymous variant rs1143679 and a high ratio of nucleic-acid-specific IgG/IgM were associated with multiple organ involvement. In summary, secondary complement deficiency in SLE does not impair opsonization of nucleic-acid-containing autoantigens but does affect other antigens and potentially other complement-dependent processes. Dysfunction of the receptor recognizing complement-opsonized immune complexes promotes the development of class-switched autoantibodies targeting nucleic acids.
Abstract:
Item Response Theory (IRT) is a valuable methodology for analyzing the quality of the instruments used in the assessment of academic achievement. This article presents an application of this theory, particularly of the Rasch model, to calibrate the items and the instrument used in the classification test for the Basic Mathematics subject at Universidad Jorge Tadeo Lozano. The response strings of 509 students, obtained in the June 2011 administration of a 45-item test, were analyzed through eight case studies showing progressive steps of the calibration. Validity criteria for the items and for the whole instrument were defined and applied to select the groups of response strings and items that were finally used to estimate the parameters, which then allowed the classification of the assessed students by the test.
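For readers unfamiliar with the Rasch model used for the calibration, the following sketch shows its item response function; the ability and difficulty values are hypothetical and unrelated to the study's estimates.

```python
# Illustrative sketch of the Rasch (one-parameter logistic) model.
import numpy as np

def rasch_prob(theta, b):
    """Probability that a student with ability `theta` answers an item
    of difficulty `b` correctly under the Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Example: a student slightly above average facing an easy and a hard item.
print(rasch_prob(0.5, b=-1.0))  # ~0.82
print(rasch_prob(0.5, b=2.0))   # ~0.18
```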
Abstract:
A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross validation is often used to estimate generalization errors when choosing amongst different network architectures (M. Stone, "Cross validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon the minimization of LOO criteria, namely the mean square of the LOO errors for regression and the LOO misclassification rate for classification, we present two backward elimination algorithms as model post-processing procedures for regression and classification problems. The proposed backward elimination procedures exploit an orthogonalization procedure to enforce orthogonality between the subspace spanned by the pruned model and the deleted regressor. Subsequently, it is shown that the LOO criteria used in both algorithms can be calculated via an analytic recursive formula, derived in this contribution, without actually splitting the estimation data set, thereby reducing computational expense. Compared to most other model construction methods, the proposed algorithms are advantageous in several aspects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic without an additional stopping criterion; and (iii) the model structure selection is directly based on model generalization performance. Illustrative examples on regression and classification demonstrate that the proposed algorithms are viable post-processing methods to prune a model to gain extra sparsity and improved generalization.
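The following sketch conveys the idea of backward elimination driven by a LOO criterion, assuming a plain linear model and a brute-force LOO computation via scikit-learn; the paper's analytic recursive formulas and orthogonalization procedure are not reproduced here.

```python
# Brute-force sketch of backward elimination guided by leave-one-out error.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

def loo_mse(X, y):
    """Mean squared leave-one-out error of a linear model on columns X."""
    scores = cross_val_score(LinearRegression(), X, y,
                             cv=LeaveOneOut(),
                             scoring="neg_mean_squared_error")
    return -scores.mean()

def backward_eliminate(X, y):
    keep = list(range(X.shape[1]))
    best = loo_mse(X[:, keep], y)
    improved = True
    while improved and len(keep) > 1:
        improved = False
        for j in list(keep):
            trial = [k for k in keep if k != j]
            err = loo_mse(X[:, trial], y)
            if err <= best:          # drop a regressor if LOO error does not worsen
                best, keep, improved = err, trial, True
                break
    return keep, best

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 6))
y = 2 * X[:, 0] - X[:, 2] + 0.1 * rng.normal(size=60)  # only two useful regressors
print(backward_eliminate(X, y))
```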
Abstract:
We extend extreme learning machine (ELM) classifiers to complex Reproducing Kernel Hilbert Spaces (RKHS) where the input/output variables as well as the optimization variables are complex-valued. A new family of classifiers, called complex-valued ELM (CELM), suitable for complex-valued multiple-input–multiple-output processing is introduced. In the proposed method, the associated Lagrangian is computed using induced RKHS kernels, adopting a Wirtinger calculus approach formulated as a constrained optimization problem similarly to the conventional ELM classifier formulation. When training the CELM, the Karush–Kuhn–Tucker (KKT) theorem is used to solve the dual optimization problem, which consists of simultaneously satisfying the smallest-training-error and smallest-output-weight-norm criteria. The proposed formulation also addresses aspects of quaternary classification within a Clifford algebra context. For 2D complex-valued inputs, user-defined complex-coupled hyperplanes divide the classifier input space into four partitions. For 3D complex-valued inputs, the formulation generates three pairs of complex-coupled hyperplanes through orthogonal projections. The six hyperplanes then divide the 3D space into eight partitions. It is shown that the CELM problem formulation is equivalent to solving six real-valued ELM tasks, which are induced by projecting the chosen complex kernel across the different user-defined coordinate planes. A classification example of powdered samples on the basis of their terahertz spectral signatures is used to demonstrate the advantages of the CELM classifiers compared to their SVM counterparts. The proposed classifiers retain the advantages of their ELM counterparts, in that they can perform multiclass classification with lower computational complexity than SVM classifiers. Furthermore, because of their ability to perform classification tasks fast, the proposed formulations are of interest to real-time applications.
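As background for readers unfamiliar with the baseline being extended, here is a toy real-valued ELM classifier (random hidden layer plus regularized least-squares output weights); it is an assumption-laden sketch and does not implement the complex-valued CELM formulation itself.

```python
# Toy real-valued ELM: random hidden layer, analytic output weights.
import numpy as np

def elm_train(X, y, n_hidden=50, reg=1e-3, rng=np.random.default_rng(0)):
    W = rng.normal(size=(X.shape[1], n_hidden))      # random input weights
    b = rng.normal(size=n_hidden)                    # random biases
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    # Ridge-regularized least squares for the output weights (small training
    # error together with a small output-weight norm).
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

X = np.random.default_rng(1).normal(size=(100, 4))
y = np.sign(X[:, 0] * X[:, 1])                       # toy two-class labels
W, b, beta = elm_train(X, y)
print(np.mean(np.sign(elm_predict(X, W, b, beta)) == y))
```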
Abstract:
INTRODUCTION: In periapical surgery, the absence of standardization between different studies makes it difficult to compare outcomes. OBJECTIVE: To compare the healing classifications of different authors and evaluate the prognostic criteria of periapical surgery at 12 months. MATERIAL AND METHODS: 278 patients (101 men and 177 women) with a mean age of 38.1 years (range 11 to 77), treated with periapical surgery using the ultrasound technique, a 2.6x magnification loupe and silver amalgam as root-end filling material, were included in the study. Outcome was analyzed using the clinical criteria of Mikkonen et al., 1983; the radiographic criteria of Rud et al., 1972; the overall combined clinical and radiographic criteria of von Arx and Kurt, 1999; and the Friedman (2005) concept of the functional tooth at 12 months after surgery. RESULTS: After 12 months, 87.2% clinical success was obtained according to the Mikkonen et al., 1983 criteria; 73.9% complete radiographic healing using the Rud et al. criteria; 62.1% overall success following the clinical and radiographic parameters of von Arx and Kurt; and 91.9% of teeth were functional. The von Arx and Kurt criteria were found to be the most reliable. CONCLUSION: Overall outcome according to von Arx and Kurt agreed most closely with the other scales.
Abstract:
Objective: To evaluate the impact of the revised diagnostic criteria for diabetes mellitus adopted by the American Diabetes Association on the prevalence of diabetes and on the classification of patients. For epidemiological purposes the American criteria use a fasting plasma glucose concentration ⩾7.0 mmol/l, in contrast with the current World Health Organisation criterion of a 2-hour glucose concentration ⩾11.1 mmol/l.
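A toy illustration of how the two definitions can classify the same patient differently; the patient values are hypothetical, and only the two thresholds quoted in the abstract are taken from the source.

```python
# ADA criterion: fasting plasma glucose >= 7.0 mmol/l.
# WHO criterion: 2-hour plasma glucose >= 11.1 mmol/l.
def ada_diabetic(fasting_glucose_mmol_l):
    return fasting_glucose_mmol_l >= 7.0

def who_diabetic(two_hour_glucose_mmol_l):
    return two_hour_glucose_mmol_l >= 11.1

# Hypothetical patient with a borderline fasting value but a high post-load value.
patient = {"fasting": 6.4, "two_hour": 12.3}
print("ADA:", ada_diabetic(patient["fasting"]))     # False
print("WHO:", who_diabetic(patient["two_hour"]))    # True
```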
Abstract:
The XML Document Mining track was launched for exploring two main ideas: (1) identifying key problems and new challenges of the emerging field of mining semi-structured documents, and (2) studying and assessing the potential of Machine Learning (ML) techniques for dealing with generic ML tasks in the structured domain, i.e., classification and clustering of semi-structured documents. The track ran for six editions, at INEX 2005, 2006, 2007, 2008, 2009 and 2010. The first five editions have been summarized in earlier reports, and we focus here on the 2010 edition. INEX 2010 included two tasks in the XML Mining track: (1) an unsupervised clustering task and (2) a semi-supervised classification task where documents are organized in a graph. The clustering task requires the participants to group the documents into clusters, without any knowledge of category labels, using an unsupervised learning algorithm. The classification task, in turn, requires the participants to label the documents in the dataset with known categories using a supervised learning algorithm and a training set. This report gives the details of the clustering and classification tasks.
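To make the distinction between the two tasks concrete, the sketch below contrasts unsupervised clustering (no labels) with supervised classification (a labeled training set) on a made-up toy corpus; the documents, labels and models are illustrative assumptions and have nothing to do with the actual INEX 2010 collection.

```python
# Toy contrast between the clustering and classification task setups.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

docs = ["xml schema for wikipedia articles",
        "graph structure of linked documents",
        "support vector machines for text",
        "clustering semi structured documents"]
labels = ["markup", "graph", "learning", "learning"]

X = TfidfVectorizer().fit_transform(docs)

# Clustering task: group documents without using any labels.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Classification task: a supervised model trained on the labeled set.
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print(clusters, clf.predict(X))
```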
Abstract:
The window of opportunity is a concept critical to rheumatoid arthritis treatment. Early treatment changes the outcome of rheumatoid arthritis treatment, in that response rates are higher with earlier disease-modifying anti-rheumatic drug treatment and damage is substantially reduced. Axial spondyloarthritis is an inflammatory axial disease encompassing both nonradiographic axial spondyloarthritis and established ankylosing spondylitis. In axial spondyloarthritis, studies of magnetic resonance imaging as well as tumor necrosis factor inhibitor treatment and withdrawal studies all suggest that early effective suppression of inflammation has the potential to reduce radiographic damage. This potential would suggest that the concept of a window of opportunity is relevant not only to rheumatoid arthritis but also to axial spondyloarthritis. The challenge now remains to identify high-risk patients early and to commence treatment without delay. Developments in risk stratification include new classification criteria, identification of clinical risk factors, biomarkers, genetic associations, potential antibody associations and an ankylosing spondylitis-specific microbiome signature. Further research needs to focus on the evidence for early intervention and the early identification of high-risk individuals.
Abstract:
Classification criteria should facilitate selection of similar patients for clinical and epidemiologic studies, therapeutic trials, and research on etiopathogenesis to enable comparison of results across studies from different centers. We critically appraise the validity and performance of the Assessment of SpondyloArthritis international Society (ASAS) classification criteria for axial spondyloarthritis (axSpA). It is still debatable whether all patients fulfilling these criteria should be considered as having true axSpA. Patients with radiographically evident disease by the ASAS criteria are not necessarily identical with ankylosing spondylitis (AS) as classified by the modified New York criteria. The complex multi-arm selection design of the ASAS criteria induces considerable heterogeneity among patients so classified, and applying them in settings with a low prevalence of axial spondyloarthritis (SpA) greatly increases the proportion of subjects falsely classified as suffering from axial SpA. One of the unmet needs in non-radiographic form of axial SpA is to have reliable markers that can identify individuals at risk for progression to AS and thereby facilitate early intervention trials designed to prevent such progression. We suggest needed improvements of the ASAS criteria for axSpA, as all criteria sets should be regarded as dynamic concepts open to modifications or updates as our knowledge advances.
Abstract:
Within online learning communities, receiving timely and meaningful insights into the quality of learning activities is an important part of an effective educational experience. Commonly adopted methods – such as the Community of Inquiry framework – rely on manual coding of online discussion transcripts, which is a costly and time-consuming process. There are several efforts underway to enable the automated classification of online discussion messages using supervised machine learning, which would enable the real-time analysis of interactions occurring within online learning communities. This paper investigates the importance of incorporating features that utilise the structure of online discussions for the classification of "cognitive presence" – the central dimension of the Community of Inquiry framework, focusing on the quality of students' critical thinking within online learning communities. We implemented a Conditional Random Field classification solution, which incorporates structural features that may be useful in increasing classification performance over other implementations. Our approach leads to an improvement in classification accuracy of 5.8% over existing techniques when tested on the same dataset, with a precision and recall of 0.630 and 0.504, respectively.
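The sketch below shows, in a hedged way, what a linear-chain CRF over the messages of a discussion thread might look like with simple structural features (thread position, reply depth); it relies on the third-party sklearn-crfsuite package, and the features, labels and toy thread are assumptions rather than the paper's implementation.

```python
# Illustrative linear-chain CRF over the messages of a single thread.
# Requires: pip install sklearn-crfsuite
import sklearn_crfsuite

def message_features(thread, i):
    msg = thread[i]
    return {
        "position": i,                         # where the message sits in the thread
        "reply_depth": msg["depth"],           # structural feature
        "word_count": len(msg["text"].split()),
        "is_first": i == 0,
        "is_last": i == len(thread) - 1,
    }

def thread_to_features(thread):
    return [message_features(thread, i) for i in range(len(thread))]

# Toy thread: each message carries its text, reply depth, and a
# cognitive-presence phase label.
thread = [
    {"text": "Why does the model overfit here?", "depth": 0, "label": "triggering"},
    {"text": "Maybe the training set is too small.", "depth": 1, "label": "exploration"},
    {"text": "Combining both points, regularization should help.", "depth": 2, "label": "integration"},
]

X = [thread_to_features(thread)]
y = [[m["label"] for m in thread]]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=100)
crf.fit(X, y)
print(crf.predict(X))
```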