984 results for Natural Classification


Relevance: 30.00%

Abstract:

PURPOSE. To describe and classify patterns of abnormal fundus autofluorescence (FAF) in eyes with early nonexudative age-related macular disease (AMD). METHODS. FAF images were recorded in eyes with early AMD by confocal scanning laser ophthalmoscopy (cSLO) with excitation at 488 nm (argon or OPSL laser) and emission above 500 or 521 nm (barrier filter). A standardized protocol for image acquisition and generation of mean images after automated alignment was applied, and routine fundus photographs were obtained. FAF images were classified by two independent observers. The κ statistic was applied to assess intra- and interobserver variability. RESULTS. Alterations in FAF were classified into eight phenotypic patterns: normal, minimal change, focal increased, patchy, linear, lacelike, reticular, and speckled. Areas with abnormally increased or decreased FAF signal may or may not have corresponded to funduscopically visible alterations. For intraobserver variability, κ was 0.80 for observer I (95% confidence interval [CI], 0.71-0.89) and 0.74 for observer II (95% CI, 0.64-0.84). For interobserver variability, κ was 0.77 (95% CI, 0.67-0.87). CONCLUSIONS. Various phenotypic patterns of abnormal FAF can be identified with cSLO imaging. Distinct patterns may reflect heterogeneity at a cellular and molecular level, in contrast to a nonspecific aging process. The results indicate that the classification system yields a relatively high degree of intra- and interobserver agreement. It may be applicable for the determination of novel prognostic determinants in longitudinal natural history studies, for the identification of genetic risk factors, and for monitoring future therapeutic interventions to slow the progression of early AMD. Copyright © Association for Research in Vision and Ophthalmology.
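The agreement figures above are Cohen's κ values with 95% confidence intervals. As a minimal sketch (not the authors' software; a bootstrap interval stands in here for whatever CI method they used), κ for two observers' pattern labels can be computed as follows:

```python
import numpy as np

def cohen_kappa(a, b):
    """Cohen's kappa for two raters' category labels."""
    a, b = np.asarray(a), np.asarray(b)
    cats = np.unique(np.concatenate([a, b]))
    po = np.mean(a == b)                                        # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)   # chance agreement
    return (po - pe) / (1.0 - pe)

def kappa_ci(a, b, n_boot=2000, alpha=0.05, seed=0):
    """Bootstrap percentile CI for kappa (an assumption, not the paper's method)."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(a), np.asarray(b)
    stats = [cohen_kappa(a[idx], b[idx])
             for idx in (rng.integers(0, len(a), len(a)) for _ in range(n_boot))]
    return tuple(np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)]))

# Hypothetical labels for 100 eyes, with the eight patterns coded 0-7.
rng = np.random.default_rng(1)
obs1 = rng.integers(0, 8, 100)
obs2 = np.where(rng.random(100) < 0.8, obs1, rng.integers(0, 8, 100))
print(cohen_kappa(obs1, obs2), kappa_ci(obs1, obs2))
```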

Relevance: 30.00%

Abstract:

Background: Oncology profits tremendously from the genomic data generated by high-throughput technologies, including next-generation sequencing. However, exploiting, integrating, visualizing, and interpreting such high-dimensional data efficiently requires non-trivial computational and statistical analysis methods that need to be developed in a problem-directed manner.

Discussion: Computational cancer biology aims to fill this gap. Unfortunately, it is not yet fully recognized as a coequal field in oncology, which delays its maturation and, as an immediate consequence, leaves high-throughput data under-explored for translational research.

Summary: Here we argue that this imbalance, favoring 'wet lab-based activities', will be naturally rectified over time if the next generation of scientists receives an academic education that provides a fair and competent introduction to computational biology and its manifold capabilities. Furthermore, we discuss a number of local educational provisions that can be implemented at the university level to help facilitate this harmonization.

Relevance: 30.00%

Abstract:

In recent years, Deep Learning (DL) techniques have gained much attention from the Artificial Intelligence (AI) and Natural Language Processing (NLP) research communities because these approaches can often learn features from data without the need for human design or engineering interventions. In addition, DL approaches have achieved some remarkable results. In this paper, we survey major recent contributions that use DL techniques for NLP tasks. The reviewed topics are limited to contributions to text understanding, such as sentence modelling, sentiment classification, semantic role labelling, question answering, etc. We provide an overview of deep learning architectures based on Artificial Neural Networks (ANNs), Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM), and Recursive Neural Networks (RNNs).
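As a hedged illustration of one architecture family the survey covers (not an example drawn from the survey itself), here is a minimal LSTM sentence classifier for sentiment, sketched in PyTorch; the vocabulary size, dimensions, and binary output are assumptions:

```python
import torch
import torch.nn as nn

class LSTMSentimentClassifier(nn.Module):
    """Embed token ids, run an LSTM, classify from the final hidden state."""
    def __init__(self, vocab_size=10_000, embed_dim=128, hidden_dim=256, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, n_classes)

    def forward(self, token_ids):          # token_ids: (batch, seq_len)
        x = self.embed(token_ids)          # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.lstm(x)         # h_n: (1, batch, hidden_dim)
        return self.head(h_n[-1])          # logits: (batch, n_classes)

# Smoke test on a batch of 4 random 20-token "sentences".
logits = LSTMSentimentClassifier()(torch.randint(0, 10_000, (4, 20)))
```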

Relevance: 30.00%

Abstract:

Zero-inflated models, discrete and continuous, have a wide range of applications and their properties are well known. Although work exists on zero-deflated and zero-modified discrete models, the usual formulation of zero-inflated continuous models -- a mixture of a continuous density and a Dirac point mass -- prevents generalizing them to cover zero deflation. An alternative formulation of zero-inflated continuous models, which can easily be generalized to the deflated case, is presented here. Estimation is first addressed under the classical paradigm, and several methods for obtaining maximum likelihood estimators are proposed. The point estimation problem is also considered from a Bayesian perspective. Classical and Bayesian hypothesis tests for determining whether data are zero-inflated or zero-deflated are presented. The estimation and testing methods are evaluated through simulation studies and applied to aggregated precipitation data. The various methods agree that the data are zero-deflated, demonstrating the relevance of the proposed model. We then consider the clustering of zero-deflated data samples. Since such data are strongly non-normal, one may suspect that common methods for determining the number of clusters perform poorly. We argue that Bayesian clustering, based on the marginal distribution of the observations, accounts for the particularities of the model, which should translate into better performance. Several clustering methods are compared in a simulation study, and the proposed method is applied to aggregated precipitation data from 28 measuring stations in British Columbia.
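For context, the usual formulation the abstract critiques can be written as follows (a standard textbook form; the thesis's alternative formulation is not reproduced here):

```latex
% Usual zero-inflated continuous model: mixture of a Dirac mass at zero
% and a continuous density g on (0, infinity).
f(y) = \pi\,\delta_0(y) + (1 - \pi)\,g(y), \qquad 0 \le \pi \le 1
```

Because the continuous component g places no probability mass at y = 0, there is no baseline mass at zero to reduce, so taking π < 0 to model deflation does not yield a valid density; this is the obstruction that blocks a direct extension of the usual formulation to zero deflation.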

Relevance: 30.00%

Abstract:

In computational neuroscience, it has been hypothesized that the visual system, from the retina up to at least the primary visual cortex, continually fits a probabilistic model with latent variables to its stream of perceptions. Neither the exact model nor the exact fitting method is known, but existing algorithms for fitting such models require conditional estimation of the latent variables. This can help us understand why the visual system might fit such a model: if the model is appropriate, these conditional estimates can also form an excellent representation for analyzing the semantic content of perceived images. The work presented here uses image classification performance (discrimination between common object types) as a basis for comparing models of the visual system, and algorithms for fitting those models (viewed as probability densities) to images. This thesis (a) shows that models based on the complex cells of visual area V1 generalize better from labelled training examples than conventional neural networks whose hidden units more closely resemble V1's simple cells; (b) presents a new interpretation of complex-cell-based models of the visual system as probability distributions, along with new algorithms for fitting them to data; and (c) shows that these models form representations that are better for image classification after having been trained as probability models. Two additional technical innovations that made this work possible are also described: a random search algorithm for selecting hyper-parameters, and a compiler for matrix-valued mathematical expressions that can optimize those expressions for central (CPU) and graphics (GPU) processors.
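Of the two supporting innovations, the random search over hyper-parameters is simple to sketch. A minimal version, assuming an illustrative search space and a stand-in scorer (neither is the thesis's actual configuration):

```python
import random

def random_search(train_and_score, space, n_trials=50, seed=0):
    """Sample configurations uniformly from `space`; keep the best scorer."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        cfg = {name: sample(rng) for name, sample in space.items()}
        score = train_and_score(cfg)          # e.g. validation accuracy
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# Hypothetical search space: log-uniform learning rate, categorical width.
space = {
    "learning_rate": lambda rng: 10 ** rng.uniform(-5, -1),
    "hidden_units": lambda rng: rng.choice([64, 128, 256, 512]),
}
# Stand-in scorer for demonstration; a real one would train and validate a model.
best_cfg, best_score = random_search(lambda c: -abs(c["learning_rate"] - 1e-3), space)
```

The appeal of random over grid search is that when only a few hyper-parameters matter, random sampling explores many more distinct values of each one for the same trial budget.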

Relevance: 30.00%

Abstract:

Over recent decades, the occurrence of natural disasters has risen sharply: they have become increasingly frequent, have struck widely dispersed regions of different countries, and will very likely remain real threats worldwide. Since no country is immune to natural disasters, it is useful to study the determinants of their occurrence, especially as their return periods shorten and their likelihood of occurrence grows. It therefore seemed timely to test the factors underlying the occurrence of natural disasters. Our work applies a multilayer perceptron neural network to predict the number of natural disasters from the variables best established in the theory. We then use this neural model to carry out a sensitivity analysis, which ranks the explanatory variables by the importance of their contribution to determining the number of natural disasters recorded over the study period. The results show that the retained network can predict the number of natural disasters. Likewise, the different variables all have a considerable effect on the network's output, though in different orders of importance. All these variables thus contribute to explaining a problem as complex as the occurrence of natural disasters.
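The abstract does not specify how the sensitivity analysis is computed. As a hedged sketch, permutation importance (a substitute technique, not necessarily the one used) can rank a multilayer perceptron's inputs; the data, feature count, and layer size below are illustrative assumptions:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                              # 4 hypothetical predictors
y = 2 * X[:, 0] - X[:, 2] + rng.normal(scale=0.1, size=200)  # synthetic target

mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                   random_state=0).fit(X, y)

# Sensitivity proxy: drop in score when each input is shuffled.
imp = permutation_importance(mlp, X, y, n_repeats=30, random_state=0)
ranking = np.argsort(imp.importances_mean)[::-1]
print("inputs ranked by contribution:", ranking)
```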

Relevance: 30.00%

Abstract:

Underwater target localization and tracking attract tremendous research interest due to the various impediments to the estimation task caused by the noisy ocean environment. This thesis envisages the implementation of a prototype automated system for underwater target localization, tracking, and classification using passive listening buoy systems and target identification techniques. An autonomous three-buoy system has been developed and field trials have been conducted successfully. Inaccuracies in the localization results, due to changes in environmental parameters, measurement errors, and theoretical approximations, are refined using the Kalman filter approach. Simulation studies have been conducted for the tracking of targets under different scenarios, including maneuvering situations. The system can also be used to classify unknown targets by extracting features of the noise emanations from the targets.
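As a hedged illustration of the refinement step described (not the thesis's implementation), one predict/update cycle of a linear Kalman filter for a constant-velocity 2-D target might look like this; the dynamics, noise covariances, and position-only measurement model are assumptions:

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],       # state transition; state = [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], float)
H = np.array([[1, 0, 0, 0],        # buoys yield position fixes only
              [0, 1, 0, 0]], float)
Q = 0.01 * np.eye(4)               # process noise covariance
R = 0.5 * np.eye(2)                # measurement noise covariance

def kalman_step(x, P, z):
    # Predict state and covariance forward one time step.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new localization fix z.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

x, P = np.zeros(4), np.eye(4)
x, P = kalman_step(x, P, np.array([10.0, -3.0]))
```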

Relevance: 30.00%

Abstract:

As a result of the drive towards a waste-poor world and the conservation of non-renewable materials, recycling construction and demolition materials has become essential. Reuse of recycled concrete aggregate coarser than 4 mm in producing new concrete is now permitted, but only with natural sand as the fine aggregate, while the sand fraction, which represents about 30% to 60% of the crushed demolition materials, is disposed of. For this research, recycled concrete sand was produced in the laboratory, while nine recycled sands produced from construction and demolition materials and two sands from natural crushed limestone were delivered from three plants. Ten concrete mix designs representing the concrete exposure classes XC1, XC2, XF3 and XF4 according to the European standard EN 206 were produced with partial and full replacement of natural sand by the different recycled sands. Bituminous mixtures meeting the requirements for base courses according to German standards, and for both base and binder courses according to Egyptian standards, were produced with the recycled sands as a substitute for the natural sands. The mechanical properties and durability of concrete produced with the different recycled sands were investigated and analyzed. Volumetric analysis and the Marshall test were also performed on hot bituminous mixtures produced with the recycled sands. Based on the effect of replacing natural sand with the different recycled sands on concrete compressive strength and durability, the recycled sands were classified into three groups, and the maximum allowable recycled sand content was determined for each group in each concrete exposure class. For the asphalt concrete mixes, all the investigated recycled sands can be used in mixes for base and binder courses at up to 21% of the total aggregate mass.

Relevance: 30.00%

Abstract:

This project aims to identify which concepts of health, illness, epidemiology, and risk are applicable to companies in the oil and natural gas extraction sector in Colombia. Given the low predictive power of traditional financial analyses and their insufficiency for long-term investment and decision-making, as well as their failure to consider variables such as risk and future expectations, there is a need to draw on different perspectives and integrative models. This consideration is pertinent to the oil and natural gas extraction sector because of the growing foreign investment it has attracted: US$2,862 million in 2010, more than ten times its 2003 value. Multidimensional models could thus be developed based on concepts of financial health, epidemiology, and statistics. The term 'health' and its adoption in the business sector proves useful and conceptually coherent, revealing the presence of different interacting and interconnected subsystems or factors. It should also be mentioned that a multidimensional (multi-stage) model must take risk into account, and epidemiological analysis has proven useful for determining risk and integrating it into the system alongside related concepts such as the hazard ratio and relative risk. This will be analyzed through a theoretical-conceptual study, complementing a previous study, as a contribution to the corporate finance project of the research line in Management.
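For reference, the two risk measures named above have standard epidemiological definitions for an exposure E (general definitions, not quantities computed in the project):

```latex
% Relative risk: ratio of event probabilities between exposed and unexposed.
% Hazard ratio: ratio of instantaneous event rates (hazards) at time t.
\mathrm{RR} = \frac{P(\text{event} \mid E)}{P(\text{event} \mid \bar{E})},
\qquad
\mathrm{HR}(t) = \frac{\lambda_{E}(t)}{\lambda_{\bar{E}}(t)}
```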

Relevance: 30.00%

Abstract:

In the Loess Plateau, China, arable cultivation of slope lands is common and associated with serious soil erosion. Planting trees or grass may control erosion, but planted species may consume more soil water and can threaten long-term ecosystem sustainability. Natural vegetation succession is an alternative ecological solution to restore degraded land, but there is a time cost, given that the establishment of natural vegetation adequate to prevent soil erosion is a longer process than planting. The aims of this study were to identify the environmental factors controlling the type of vegetation established on abandoned cropland and to identify candidate species that might be sown soon after abandonment to accelerate vegetation succession and the establishment of natural vegetation to prevent soil erosion. A field survey of thirty-three 2 × 2-m plots was carried out in July 2003, recording age since abandonment, vegetation cover, and frequency of species together with major environmental and soil variables. Data were analyzed using correspondence analysis, classification tree analysis, and species response curves. Four vegetation types were identified, and the data analysis confirmed the importance of time since abandonment, total phosphorus (P), and soil water in controlling the type of vegetation established. Among the dominant species in the three late-successional vegetation types, the most appropriate candidates for accelerating and directing vegetation succession were King Ranch bluestem (Bothriochloa ischaemum) and Lespedeza davurica (Leguminosae). These species possess combinations of the following characteristics: tolerance of low water and nutrient availability, a fibrous root system and strong lateral vegetative spread, and a persistent seed bank.
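As an illustration only (synthetic data, not the study's plots), a classification tree relating the three controlling variables identified above to vegetation type could be fit as follows; the feature names, ranges, and labels are assumptions:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
# Columns: years since abandonment, total P, soil water (illustrative ranges).
X = np.column_stack([rng.integers(1, 30, 33),
                     rng.uniform(0.2, 0.8, 33),
                     rng.uniform(5, 25, 33)])
y = rng.integers(0, 4, 33)        # four vegetation types, coded 0-3

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(tree.predict([[10, 0.5, 15.0]]))   # predicted type for a new plot
```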

Relevance: 30.00%

Abstract:

BCI systems require correct classification of signals interpreted from the brain for useful operation. To this end, this paper investigates a method proposed in [1] to correctly classify a series of images presented to a group of subjects in [2]. We show that it is possible to use the proposed methods to correctly recognise the original stimuli presented to a subject from analysis of their EEG. Additionally, we use a verification set to show that the trained classification method can be applied to a different set of data. We go on to investigate the issue of invariance in EEG signals: that is, whether the brain's representation of similar stimuli is recognisable across different subjects. Finally, we consider the usefulness of the investigated methods for an improved BCI system and discuss how they could potentially lead to great improvements in ease of use for the end user by offering an alternative, more intuitive, control-based mode of operation.

Relevance: 30.00%

Abstract:

We introduce a classification-based approach to finding occluding texture boundaries. The classifier is composed of a set of weak learners, which operate on discriminative image-intensity features that are defined on small patches and are fast to compute. A database designed to simulate digitized occluding contours of textured objects in natural images is used to train the weak learners. The trained classifier score is then used to obtain a probabilistic model for the presence of texture transitions, which can readily be used for line-search texture boundary detection in the direction normal to an initial boundary estimate. This method is fast and therefore suitable for real-time and interactive applications. It works as a robust estimator, requires only a ribbon-like search region, and can handle complex texture structures without requiring a large number of observations. We demonstrate results both in the context of interactive 2D delineation and of fast 3D tracking, and compare the method's performance with other existing methods for line-search boundary detection.
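As a hedged sketch of the general technique named (a boosted set of weak learners scoring patch features as boundary / not-boundary), not the paper's actual features or training database:

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                   # 8 fast per-patch intensity features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # synthetic boundary labels

# Depth-1 trees (stumps) play the role of the weak learners.
clf = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                         n_estimators=50, random_state=0).fit(X, y)

# Probabilistic score of a texture transition, of the kind used for
# line-search detection along the normal to an initial boundary estimate.
scores = clf.predict_proba(X[:5])[:, 1]
```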

Relevance: 30.00%

Abstract:

The fast increase in the size and number of databases demands data mining approaches that are scalable to large amounts of data. This has led to the exploration of parallel computing technologies in order to perform data mining tasks concurrently using several processors. Parallelization seems to be a natural and cost-effective way to scale up data mining technologies. One of the most important of these data mining technologies is the classification of newly recorded data. This paper surveys advances in parallelization in the field of classification rule induction.
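As a hedged illustration of the data-parallel pattern such systems commonly exploit (not any specific surveyed algorithm), the coverage count of a candidate rule can be evaluated over data partitions concurrently; the rule and data here are assumptions:

```python
from multiprocessing import Pool
import numpy as np

def count_covered(chunk):
    """Count rows matching a fixed candidate rule: f0 > 5 AND f1 <= 2."""
    return int(np.sum((chunk[:, 0] > 5) & (chunk[:, 1] <= 2)))

if __name__ == "__main__":
    data = np.random.default_rng(0).uniform(0, 10, size=(100_000, 2))
    chunks = np.array_split(data, 4)          # one partition per worker
    with Pool(4) as pool:
        covered = sum(pool.map(count_covered, chunks))
    print("rule coverage:", covered)
```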

Relevance: 30.00%

Abstract:

Pollination services are an economically important component of agricultural biodiversity that enhances the yield and quality of many crops. An understanding of the suitability of extant habitats for pollinating species is crucial for planning management actions to protect and manage these service providers. In a highly modified agricultural ecosystem, we tested the effect of different pollination treatments (open, autonomous self-, and wind-pollination) on pod set, seed set, and seed weight in field beans (Vicia faba). We also investigated the effect of semi-natural habitats and flower abundance on pollinators of field beans. Pollinator sampling was undertaken in ten field bean fields along a gradient of habitat complexity; the CORINE land cover classification was used to analyse land use patterns between 500 and 3000 m around the sites. Total yield from open-pollination increased by 185% compared to autonomous self-pollination. There was a positive interactive effect of local flower abundance and cover of semi-natural habitats on the overall abundance of pollinators at 1500 and 2000 m, and on the abundance of bumblebees (Bombus spp.) at 1000-2000 m. In contrast, species richness of pollinators was correlated only with flower abundance and not with semi-natural habitats. We did not find a link between pod set from open-pollination and pollinator abundance, possibly due to variations in growing conditions and pollinator communities between sites. We conclude that insect pollination is essential for optimal bean yields, and that the maintenance of semi-natural habitats in agriculture-dominated landscapes should therefore ensure stable and more efficient pollination services in field beans.