935 results for Objected-oriented classification
Abstract:
When dealing with multi-angular image sequences, problems of reflectance changes naturally arise, due either to illumination and acquisition geometry or to interactions with the atmosphere. These phenomena interact with the scene and modify the measured radiance: for example, depending on the angle of acquisition, tall objects may be seen from the top or from the side, and different light scatterings may affect the surfaces. This results in shifts in the acquired radiance, which make the problem of multi-angular classification harder and can lead to catastrophic results, since surfaces with the same reflectance return significantly different signals. In this paper, rather than performing atmospheric or bidirectional reflectance distribution function (BRDF) correction, a non-linear manifold learning approach is used to align the data structures. This method maximizes the similarity between the different acquisitions by deforming their manifolds, thus enhancing the transferability of classification models among the images of the sequence.
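The abstract does not specify the paper's non-linear alignment method. As an illustration of the general idea only (deforming one acquisition so that its samples coincide with a reference acquisition), the sketch below uses a simple linear stand-in, orthogonal Procrustes alignment, on synthetic data; all names and values are invented.

```python
import numpy as np

def procrustes_align(source, target):
    """Align `source` onto `target` with the best rotation (orthogonal
    Procrustes) after centering -- a linear stand-in for the non-linear
    manifold alignment described in the abstract."""
    mu_s, mu_t = source.mean(0), target.mean(0)
    s, t = source - mu_s, target - mu_t
    # SVD of the cross-covariance gives the optimal rotation.
    u, _, vt = np.linalg.svd(t.T @ s)
    rotation = u @ vt
    return (source - mu_s) @ rotation.T + mu_t

rng = np.random.default_rng(0)
view_a = rng.normal(size=(100, 3))           # reference acquisition
angle = np.pi / 6                            # simulated geometric shift
rot = np.array([[np.cos(angle), -np.sin(angle), 0],
                [np.sin(angle),  np.cos(angle), 0],
                [0, 0, 1]])
view_b = view_a @ rot.T + 0.5                # rotated + shifted radiances
aligned = procrustes_align(view_b, view_a)
print(np.allclose(aligned, view_a, atol=1e-8))  # True
```

After alignment, a classifier trained on `view_a` can be applied to the aligned `view_b` samples, which is the transferability the paper targets.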
Abstract:
For several years, the lack of consensus on the definition, nomenclature, natural history, and biology of serrated polyps (SPs) of the colon has created considerable confusion among pathologists. According to the latest WHO classification, the family of SPs comprises hyperplastic polyps (HPs), sessile serrated adenomas/polyps (SSA/Ps), and traditional serrated adenomas (TSAs). The term SSA/P with dysplasia has replaced the category of mixed hyperplastic/adenomatous polyps (MPs). The present study aimed to evaluate the reproducibility of the diagnosis of SPs based on currently available diagnostic criteria and interactive consensus development. In an initial round, H&E slides of 70 cases of SPs were circulated among participating pathologists across Europe. This round was followed by a consensus discussion on diagnostic criteria. A second round was performed on the same 70 cases using the revised criteria and definitions according to the recent WHO classification. Data were evaluated for inter-observer agreement using kappa statistics. In the initial round, for the total of 70 cases, a fair overall kappa value of 0.318 was reached, while in the second round the overall kappa value improved to moderate (kappa = 0.557; p < 0.001). Overall kappa values for each diagnostic category also significantly improved in the final round, reaching 0.977 for HP, 0.912 for SSA/P, and 0.845 for TSA (p < 0.001). The diagnostic reproducibility of SPs improves when strictly defined, standardized diagnostic criteria adopted by consensus are applied.
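The agreement figures above are kappa statistics. The study pools many raters, but the principle is easiest to see in the two-rater case (Cohen's kappa): observed agreement is corrected for the agreement expected by chance. A minimal sketch with invented diagnoses, not the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of cases where the raters concur.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if the two raters labeled independently.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical diagnoses of 10 polyps by two pathologists.
a = ["HP", "HP", "SSA/P", "TSA", "HP", "SSA/P", "SSA/P", "TSA", "HP", "HP"]
b = ["HP", "SSA/P", "SSA/P", "TSA", "HP", "SSA/P", "HP", "TSA", "HP", "HP"]
print(round(cohens_kappa(a, b), 3))  # 0.677
```

On the usual verbal scale, values near 0.3 are "fair" and values near 0.6 "moderate", which is how the abstract describes the two rounds.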
Abstract:
Radioactive soil-contamination mapping and risk assessment is a vital issue for decision makers. Traditional approaches for mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction realization accompanied (in some cases) by estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern that allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping, based on machine learning and stochastic simulations in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models for prediction and classification problems. This fallout is a unique case study that provides the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
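The contrast the abstract draws between single-value regression predictions and simulation-based probabilistic mapping can be shown with a toy Monte Carlo: many equally probable realizations of a contamination map are drawn, and a per-cell exceedance probability is computed. The cell values, units, threshold, and lognormal noise model below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# A regression model would output one value per cell; stochastic
# simulation instead draws many equally probable realizations.
n_cells, n_realizations = 5, 10_000
mean_map = np.array([120.0, 80.0, 200.0, 40.0, 150.0])  # kBq/m^2, invented
sd_map = np.array([30.0, 20.0, 60.0, 10.0, 45.0])       # local uncertainty

# Each realization perturbs the mean map with its local uncertainty
# (lognormal noise keeps simulated activities positive).
realizations = rng.lognormal(np.log(mean_map), sd_map / mean_map,
                             size=(n_realizations, n_cells))

threshold = 185.0  # hypothetical regulatory limit, kBq/m^2
prob_exceed = (realizations > threshold).mean(axis=0)
print(np.round(prob_exceed, 2))
```

The resulting map of exceedance probabilities, rather than a single predicted value per cell, is what supports the risk quantification described above.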
Abstract:
This document, Classifications and Pay Plans, is produced by the State of Iowa Executive Branch, Department of Administrative Services. It is an informational document about the pay plan codes and classification codes and how to use them.
Abstract:
An exhaustive classification of the matrix effects occurring when sample preparation is performed prior to liquid chromatography coupled to mass spectrometry (LC-MS) analysis is proposed. A total of eight different situations were identified, allowing the matrix effect typology to be recognized via the calculation of four recovery values. A set of 198 compounds was used to evaluate matrix effects after solid-phase extraction (SPE) from plasma or urine samples prior to LC-ESI-MS analysis. Matrix effect identification was achieved for all compounds and classified through an organization chart. Only 17% of the tested compounds did not present significant matrix effects.
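The abstract does not list the four recovery values the authors use. A widely used scheme of this kind (Matuszewski-style post-extraction addition) compares peak areas of a neat standard, a post-extraction spiked extract, and a pre-extraction spiked sample; the sketch below uses that scheme with invented peak areas, not the paper's definitions or data.

```python
def matrix_effect_profile(neat, post_spiked, pre_spiked):
    """Matuszewski-style figures of merit from peak areas:
    ME - matrix effect (suppression < 100 % < enhancement),
    RE - recovery of the extraction step,
    PE - overall process efficiency."""
    me = 100.0 * post_spiked / neat          # matrix effect
    re = 100.0 * pre_spiked / post_spiked    # extraction recovery
    pe = 100.0 * pre_spiked / neat           # process efficiency
    return {"ME%": me, "RE%": re, "PE%": pe}

# Hypothetical peak areas for one analyte in plasma.
profile = matrix_effect_profile(neat=1.00e6, post_spiked=7.5e5, pre_spiked=6.0e5)
print(profile)  # ME% = 75 -> 25 % ion suppression
```

Combinations of such values (suppression vs. enhancement, with high or low recovery) are what yield a typology of distinct matrix-effect situations like the eight-way chart the paper proposes.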
Abstract:
The main goal of this observational and descriptive study is to evaluate whether the diagnosis axis of a nursing interface terminology meets the content-validity criterion of being oriented to nursing phenomena. Nursing diagnosis concepts were analyzed in terms of their presence in the nursing literature, the types of articles published, and the areas of disciplinary interest. The search strategy was conducted in three databases, with limits on period and languages. The final analysis included 287 nursing diagnosis concepts. The results showed that most of the concepts were identified in the scientific literature, with a homogeneous distribution of types of designs. Most of these concepts (87.7%) were studied from two or more areas of disciplinary interest. Validity studies of disciplinary controlled vocabularies may help demonstrate the influence of nursing on patients' outcomes.
Abstract:
Focused initially on formalism and methods, this thesis is built on three formalized concepts: a contingency table, a matrix of Euclidean dissimilarities, and an exchange matrix. From these, several data-analysis and machine-learning methods are expressed and developed: correspondence analysis (CA), viewed as a special case of multidimensional scaling; classification, supervised or not, combined with Schoenberg transformations; and autocorrelation and cross-autocorrelation indices, adapted to multivariate analyses and allowing various families of neighborhoods to be considered. These methods then lead to an exploratory analysis of various textual and musical data. For the textual data, we are interested in the automatic classification of uttered clauses into discourse types, based on the morphosyntactic categories (CMS) they contain. Although the statistical link between CMS and discourse types is confirmed, the classification results obtained with the K-means method combined with a Schoenberg transformation, as well as with a fuzzy variant of the K-means algorithm, are harder to interpret. We also address the supervised multi-label classification of speech turns into dialogue acts, based again on the CMS they contain, but also on the lemmas and meanings of the verbs. The results obtained via discriminant analysis combined with a Schoenberg transformation are promising. Finally, we examine textual autocorrelation, from the angle of similarities between the various positions of a text, conceived as a sequence of units. In particular, the phenomenon of alternation of word lengths in a text is observed for neighborhoods of variable span.
We also study similarities as a function of the presence or absence of certain parts of speech, as well as the semantic similarities between the various positions of a text. Concerning the musical data, we propose representing a musical score as a contingency table. We first use CA and the autocorrelation index to uncover the structures present in each score. We then apply the same type of approach to the different voices of a score, using a fuzzy variant of multiple correspondence analysis and the cross-autocorrelation index. Whether for the complete score or for the different voices it contains, repeated structures are indeed detected, provided they are not transposed. Finally, we propose to automatically classify twenty scores by four different composers, each represented by a contingency table, using an index measuring the similarity between two configurations. The results obtained make it possible to successfully group most of the works by composer.
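The CA-based comparisons of contingency tables described in this thesis rest on the chi-square metric between row profiles. A minimal sketch of that metric, with an invented pitch-class-by-bar table standing in for a score (the thesis's actual encoding is not given in the abstract):

```python
import numpy as np

def chi2_row_distances(table):
    """Chi-square distances between the row profiles of a contingency
    table -- the metric underlying correspondence analysis (CA)."""
    table = np.asarray(table, dtype=float)
    total = table.sum()
    row_profiles = table / table.sum(axis=1, keepdims=True)
    col_masses = table.sum(axis=0) / total
    n = len(table)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            diff = row_profiles[i] - row_profiles[j]
            # Rare columns (small mass) are up-weighted.
            dist[i, j] = np.sqrt(np.sum(diff**2 / col_masses))
    return dist

# Hypothetical table: pitch-class counts (columns) per bar (rows).
score = [[4, 0, 2],
         [4, 0, 2],
         [0, 5, 1]]
d = chi2_row_distances(score)
print(np.round(d, 3))  # bars 1 and 2 have identical profiles -> distance 0
```

Because the distance depends only on row profiles, bars with the same pitch distribution coincide regardless of how many notes they contain, which is what lets repeated (untransposed) structures be detected.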
Abstract:
Several primary techniques have been developed through which soil-aggregate road material properties may be improved. Such techniques basically involve a mechanism of creating a continuous matrix system of soil and/or aggregate particles, interlocked through the use of some additive such as portland cement, lime, or bituminous products. Details by which soils are stabilized vary greatly, depending on the type of stabilizing agent and the nature of the soil, though the overall approach to stabilization has the common feature that improvement is achieved by some mechanism(s) forcing individual particles to adhere to one another. This process creates a more rigid material, most often capable of resisting the influx of water during freezing, loss of strength due to high moisture content and particle dispersion during thawing, and loss of strength due to migration of fines and/or water by capillarity and pumping. The study reported herein took a new and relatively different approach to the strengthening of soils, i.e., the improvement of roadway soils and/or soil-aggregate materials by structural reinforcement with randomly oriented fibers. The purpose of the study was to conduct a laboratory and field investigation into the potential of improving (a) soil-aggregate surfaced and subgrade materials, including those that are frost-prone and/or highly moisture-susceptible, and (b) localized base-course materials, by uniting such materials through fibrous reinforcement. The envisioned objective of the project was the development of a simple construction technique(s) that could be (a) applied on a selective basis to specific areas having a history of poor performance, or (b) used for improvement of potential base materials prior to surfacing. Little background information on this purpose and objective was available.
Though the envisioned process had similarities to fiber-reinforced concrete and to fiber-reinforced resin composites, it was devoid of a cementitious binder matrix and thus highly dependent on the cohesive and frictional interlocking of the soil and/or aggregate with the fibrous reinforcement; a condition not unlike introducing reinforcing bars into a concrete sand/aggregate mixture without the benefit of portland cement. Thus the study was also directed at answering some fundamental questions: (1) would the technique work; (2) what type or types of fibers are effective; (3) are workable fibers commercially available; and (4) can such fibers be effectively incorporated with conventional construction equipment and employed in practical field applications? The approach to obtaining answers to these questions was guided by the philosophy that an understanding of basic fundamentals was essential to developing a body of engineering knowledge that would serve as the basis for the eventual development of design procedures with fibrous products for the applications previously noted.
Abstract:
In routine forensic pathology, fatal cases of contrast agent exposure are occasionally encountered. In such situations, beyond the difficulties inherent in establishing the cause of death due to nonspecific or absent autopsy and histology findings as well as limited laboratory investigations, pathologists may face further problems in formulating exhaustive, complete reports and conclusions that are scientifically accurate. Indeed, the terminology concerning adverse drug reactions and allergy nomenclature is confusing. Some terms still utilized in forensic and radiological reports are outdated and should be avoided. Additionally, not all forensic pathologists have mastered contrast material classification and the pathogenesis of contrast agent reactions. We present a review of the literature covering allergic reactions to contrast material exposure in order to update the terminology used, explain the pathophysiology, and list the currently available laboratory investigations for diagnosis in the forensic setting.