989 results for Belief networks


Relevance:

60.00%

Publisher:

Abstract:

Deep belief networks are a powerful way to model complex probability distributions. However, learning the structure of a belief network, particularly one with hidden units, is difficult. The Indian buffet process has been used as a nonparametric Bayesian prior on the directed structure of a belief network with a single infinitely wide hidden layer. In this paper, we introduce the cascading Indian buffet process (CIBP), which provides a nonparametric prior on the structure of a layered, directed belief network that is unbounded in both depth and width, yet allows tractable inference. We use the CIBP prior with the nonlinear Gaussian belief network so each unit can additionally vary its behavior between discrete and continuous representations. We provide Markov chain Monte Carlo algorithms for inference in these belief networks and explore the structures learned on several image data sets.
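For intuition, the plain single-layer Indian buffet process that the CIBP cascades can be simulated directly. A minimal sketch of the standard "restaurant" construction (function names are mine, not from the paper):

```python
import math
import random

def poisson_draw(lam, rng):
    """Knuth's inversion sampler for Poisson(lam) (stdlib has none)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def sample_ibp(num_customers, alpha, rng=None):
    """Draw a binary matrix from the Indian buffet process prior.

    Row i is "customer" i (e.g. a visible unit), column j a "dish"
    (e.g. a hidden unit). Customer i takes existing dish j with
    probability m_j / i, where m_j counts earlier takers, then samples
    Poisson(alpha / i) brand-new dishes.
    """
    rng = rng or random
    dish_counts = []   # m_j for every dish created so far
    rows = []
    for i in range(1, num_customers + 1):
        row = []
        for j, m in enumerate(dish_counts):
            take = 1 if rng.random() < m / i else 0
            row.append(take)
            dish_counts[j] += take
        new = poisson_draw(alpha / i, rng)
        row.extend([1] * new)
        dish_counts.extend([1] * new)
        rows.append(row)
    width = len(dish_counts)   # pad earlier rows with zeros
    return [r + [0] * (width - len(r)) for r in rows]
```

The cascading variant of the paper applies this construction recursively, so that each hidden layer's units become the "customers" of a further layer above it.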


During recent decades anthropogenic activities have dramatically impacted the Black Sea ecosystem. High levels of riverine nutrient input during the 1970s and 1980s caused eutrophic conditions including intense algal blooms resulting in hypoxia and the subsequent collapse of benthic habitats on the northwestern shelf. Intense fishing pressure also depleted stocks of many apex predators, contributing to an increase in planktivorous fish that are now the focus of fishing efforts. Additionally, the Black Sea's ecosystem changed even further with the introduction of exotic species. Economic collapse of the surrounding socialist republics in the early 1990s resulted in decreased nutrient loading, which has allowed the Black Sea ecosystem to start to recover, but under rapidly changing economic and political conditions, future recovery is uncertain. In this study we use a multidisciplinary approach to integrate information from socio-economic and ecological systems to model the effects of future development scenarios on the marine environment of the northwestern Black Sea shelf. The Driver-Pressure-State-Impact-Response framework was used to construct conceptual models, explicitly mapping impacts of socio-economic Drivers on the marine ecosystem. Bayesian belief networks (BBNs), a stochastic modelling technique, were used to quantify these causal relationships, operationalise models and assess the effects of alternative development paths on the Black Sea ecosystem. BBNs use probabilistic dependencies as a common metric, allowing the integration of quantitative and qualitative information. Under the Baseline Scenario, recovery of the Black Sea appears tenuous as the exploitation of environmental resources (agriculture, fishing and shipping) increases with continued economic development of post-Soviet countries. This results in the loss of wetlands through drainage and reclamation. Water transparency decreases as phytoplankton bloom and this deterioration in water quality leads to the degradation of coastal plant communities (Cystoseira, seagrass) and also Phyllophora habitat on the shelf. Decomposition of benthic plants results in hypoxia killing flora and fauna associated with these habitats. Ecological pressure from these factors along with constant levels of fishing activity results in target stocks remaining depleted. Of the four Alternative Scenarios, two show improvements on the Baseline ecosystem condition, with improved waste water treatment and reduced fishing pressure, while the other two show a worsening, due to increased natural resource exploitation leading to rapid reversal of any recent ecosystem recovery. From this we conclude that variations in economic policy have significant consequences for the health of the Black Sea, and ecosystem recovery is directly linked to socio-economic choices.
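A BBN of this kind encodes each causal link as a conditional probability table, and scenario queries are answered by marginalization. A toy sketch of one Driver-to-State chain (node names and probabilities are invented for illustration, not taken from the study):

```python
# Toy three-node chain: EconomicGrowth -> NutrientLoad -> EcosystemState.
# All structure and numbers below are illustrative only.
P_growth = {"high": 0.6, "low": 0.4}
P_load_given_growth = {
    "high": {"high": 0.7, "low": 0.3},
    "low":  {"high": 0.2, "low": 0.8},
}
P_state_given_load = {
    "high": {"degraded": 0.8, "recovering": 0.2},
    "low":  {"degraded": 0.3, "recovering": 0.7},
}

def p_state(state, growth=None):
    """Marginal P(EcosystemState=state), optionally fixing a scenario."""
    growths = [growth] if growth else P_growth
    total = 0.0
    for g in growths:
        pg = 1.0 if growth else P_growth[g]
        for load, pl in P_load_given_growth[g].items():
            total += pg * pl * P_state_given_load[load][state]
    return total

print(p_state("recovering"))                # baseline marginal
print(p_state("recovering", growth="low"))  # scenario query
```

Fixing a Driver node, as in the second call, is exactly the "alternative development path" style of query the study describes, scaled down to three nodes.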


This paper is concerned with handling uncertainty as part of the analysis of data from a medical study. The study is investigating connections between the birth weight of babies and the dietary intake of their mothers. Bayesian belief networks were used in the analysis. Their perceived benefits include (i) an ability to represent the evidence emerging from the evolving study, dealing effectively with the inherent uncertainty involved; (ii) providing a way of representing evidence graphically to facilitate analysis and communication with clinicians; (iii) helping in the exploration of the data to reveal undiscovered knowledge; and (iv) providing a means of developing an expert system application.


Dissertation presented at the Faculty of Sciences and Technology of the New University of Lisbon to obtain the degree of Doctor in Electrical Engineering, specialty of Robotics and Integrated Manufacturing


Artificial vision tasks such as object recognition remain unsolved to this day. Learning algorithms such as Artificial Neural Networks (ANNs) are a promising approach for learning features useful for these tasks, but the optimization process involved is difficult. Deep networks based on Restricted Boltzmann Machines (RBMs) were recently proposed to guide the extraction of intermediate representations through an unsupervised learning algorithm. This thesis presents, through three articles, contributions to this field of research. The first article concerns the convolutional RBM. The use of local receptive fields, together with the grouping of hidden units into layers sharing the same parameters, considerably reduces the number of parameters to learn and yields local, translation-equivariant feature detectors. This leads to models with better likelihood than RBMs trained on image patches. The second article is motivated by recent findings in neuroscience. It analyzes the impact of quadratic units on visual classification tasks, as well as that of a new activation function. We observe that ANNs built from quadratic units using the softsign function generalize better. The last article offers a critical view of popular RBM training algorithms. We show that Contrastive Divergence (CD) and Persistent CD are not robust: both require a relatively flat energy surface for their negative chain to mix. Fast-weights PCD works around this problem by slightly perturbing the model, but this produces noisy samples. Using tempered chains in the negative phase is a robust way to address these problems and leads to better generative models.
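For orientation, the CD-1 update that the last article critiques can be sketched in a few lines. This is a minimal pure-Python sketch with biases omitted for brevity; it is not the thesis code:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def cd1_step(W, visible, lr, rng=None):
    """One Contrastive Divergence (CD-1) update on a tiny binary RBM.

    W[i][j] couples visible unit i to hidden unit j (biases omitted).
    W is updated in place with the gradient approximation
    <v h>_data - <v h>_reconstruction.
    """
    rng = rng or random
    n_v, n_h = len(W), len(W[0])
    # Positive phase: sample hidden units given the data vector.
    h = [1 if rng.random() < sigmoid(sum(visible[i] * W[i][j]
         for i in range(n_v))) else 0 for j in range(n_h)]
    # Negative phase: one Gibbs step back to a "reconstruction".
    v_recon = [1 if rng.random() < sigmoid(sum(h[j] * W[i][j]
               for j in range(n_h))) else 0 for i in range(n_v)]
    h_recon = [sigmoid(sum(v_recon[i] * W[i][j] for i in range(n_v)))
               for j in range(n_h)]
    # Approximate gradient ascent on the log-likelihood.
    for i in range(n_v):
        for j in range(n_h):
            W[i][j] += lr * (visible[i] * h[j] - v_recon[i] * h_recon[j])
```

The fragility the article points out lives in the negative phase: the single Gibbs step above only explores the energy surface well when that surface is relatively flat, which is what motivates tempered chains.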


Over the last ten years, the cost of maintaining object-oriented systems has grown to account for more than 70% of their total cost. Several factors contribute to this situation, the most important being imprecise user specifications, a rapidly changing execution environment, and the poor internal quality of the systems. Of these factors, the only one over which we have real control is the internal quality of the systems. Many quality models have been proposed in the literature to help control quality. However, most of these models measure the internal attributes of systems using class metrics (for example, the number of methods in a class) or metrics on relationships between classes (for example, coupling between two classes). Yet the quality of object-oriented systems depends not only on the structure of their classes, which is what these metrics measure, but also on how the classes are organized, that is, on their design, which typically manifests itself through design patterns and anti-patterns. In this thesis we propose the DEQUALITE method, which systematically builds quality models that take into account not only the internal attributes of systems (via metrics) but also their design (via design patterns and anti-patterns). The method uses a learning approach based on Bayesian networks and builds on the results of a series of experiments evaluating the impact of design patterns and anti-patterns on system quality.
These experiments, conducted on 9 large open-source object-oriented systems, lead us to the following conclusions:
• Contrary to intuition, design patterns do not always improve system quality; tightly coupled implementations of design patterns, for example, affect class structure and have a negative impact on change- and fault-proneness.
• Classes participating in anti-patterns are much more likely to change and to be involved in fault fixes than other classes in a system.
• A non-negligible percentage of classes participate simultaneously in design patterns and anti-patterns; design patterns have a positive effect in that they mitigate the anti-patterns.
We apply and validate our method on three open-source object-oriented systems to demonstrate the contribution of system design to quality assessment.


When triangulating a belief network we aim to obtain a junction tree of minimum state space. Searching for the optimal triangulation can be cast as a search over all permutations of the network's variables. Our approach is to embed the discrete set of permutations in a convex continuous domain D. By suitably extending the cost function over D and solving the continuous nonlinear optimization task we hope to obtain a good triangulation with respect to the aforementioned cost. In this paper we introduce an upper bound on the total junction tree weight as the cost function. The appropriateness of this choice is discussed and explored by simulations. We then present two ways of embedding the new objective function into continuous domains and show that they perform well compared to the best known heuristic.
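The discrete search being relaxed here can be stated concretely. A standard greedy baseline over elimination orderings, of the kind such continuous methods are compared against (a sketch, not the paper's algorithm):

```python
import itertools

def greedy_elimination(adj, weights):
    """Greedy min-weight elimination ordering for triangulating a graph.

    adj: dict node -> set of neighbours; weights: node -> state-space
    size. At each step, eliminate the node whose clique (the node plus
    its remaining neighbours) has the smallest product of state sizes,
    adding fill-in edges between its neighbours. Returns the ordering
    and the total clique weight, an upper bound proxy for junction
    tree size.
    """
    adj = {v: set(ns) for v, ns in adj.items()}  # defensive copy
    order, total = [], 0

    def clique_weight(v):
        w = weights[v]
        for u in adj[v]:
            w *= weights[u]
        return w

    while adj:
        v = min(adj, key=clique_weight)
        total += clique_weight(v)
        for a, b in itertools.combinations(adj[v], 2):
            adj[a].add(b)
            adj[b].add(a)
        for u in adj[v]:
            adj[u].discard(v)
        del adj[v]
        order.append(v)
    return order, total
```

Every elimination ordering induces a triangulation; the paper's contribution is to search the space of such orderings through a continuous relaxation rather than greedily.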


Consider the statement "this project should cost X and has risk of Y". Such statements are used daily in industry as the basis for making decisions. The work reported here is part of a study aimed at providing a rational and pragmatic basis for such statements. Of particular interest are predictions made in the requirements and early phases of projects. A preliminary model has been constructed using Bayesian Belief Networks and, in support of this, a programme to collect and study data during the execution of various software development projects commenced in May 2002. The data collection programme is undertaken under the constraints of a commercial industrial regime of multiple concurrent small- to medium-scale software development projects. Guided by pragmatism, the work is predicated on the use of data that can be collected readily by project managers, including expert judgements, effort, elapsed times and metrics collected within each project.


Bayesian networks are powerful tools, as they represent probability distributions as graphs and can work with the uncertainties of real systems. Over the last decade there has been special interest in learning network structures from data. However, learning the best network structure is an NP-hard problem, so many heuristic algorithms that generate network structures from data have been created. Many of these algorithms use score metrics to generate the network model. This thesis compares three of the most used score metrics. The K2 algorithm and two benchmark networks, ASIA and ALARM, were used to carry out the comparison. Results show that, for both the Heckerman-Geiger and modified MDL metrics, hyperparameters that strengthen the tendency to select simpler network structures perform better than hyperparameters with a weaker such tendency. The Heckerman-Geiger Bayesian score metric works better than MDL on large datasets, and MDL works better than Heckerman-Geiger on small datasets. With its stronger tendency to select simpler network structures, the modified MDL gives results similar to Heckerman-Geiger on large datasets and close to MDL on small datasets.
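The Cooper-Herskovits score that the K2 algorithm maximizes has a closed form that is easy to evaluate in log space. A minimal sketch for a single node (the function name is mine):

```python
from math import lgamma

def k2_log_score(counts, r):
    """Log Cooper-Herskovits (K2) score contribution of one node.

    counts: one count vector N_jk per parent configuration j, each of
    length r (the node's number of states). Implements the closed form
        prod_j (r - 1)! / (N_j + r - 1)! * prod_k N_jk!
    in log space, where N_j = sum_k N_jk.
    """
    score = 0.0
    for njk in counts:
        nj = sum(njk)
        score += lgamma(r) - lgamma(nj + r)   # (r-1)! / (N_j + r - 1)!
        for n in njk:
            score += lgamma(n + 1)            # N_jk!
    return score
```

The full network score is the sum of this quantity over all nodes, which is what lets K2 evaluate parent sets for each node independently.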


This thesis proposes a planning strategy that combines load characterization of a typical Digital TV application, weight-vector extraction by means of belief networks, and multi-criteria decision making through analytical methods (TOPSIS and ELECTRE III), in order to support service providers in choosing a return-channel technology (ADSL2+, PLC, WiMAX or 3G) given the typical load of an interactive Digital TV scenario under the ISDB-T standard. The proposed strategy has five steps: defining the return channels and the performance metrics; measuring the access technologies in real scenarios; simulating the data in simulated environments; applying data-correlation techniques to generate the weight vector; and applying analytical decision-making methods to choose the best technology to deploy in a given scenario. The main result is a generic and flexible model, validated through a case study that ranked the evaluated technologies by preference.
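Of the two analytical methods mentioned, TOPSIS is the simpler to state: normalize and weight the decision matrix, then rank alternatives by closeness to an ideal point. A minimal sketch (the weight vector would come from the belief-network step; the values here are placeholders):

```python
from math import sqrt

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix[i][j]: score of alternative i on criterion j;
    weights[j]: criterion weight; benefit[j]: True if larger is better.
    Returns closeness coefficients (higher = closer to the ideal).
    """
    n_alt, n_crit = len(matrix), len(weights)
    # Vector-normalize each column, then apply the weights.
    norms = [sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    # Ideal and anti-ideal points, per criterion direction.
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col)
             for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = sqrt(sum((x - a) ** 2 for x, a in zip(row, ideal)))
        d_neg = sqrt(sum((x - a) ** 2 for x, a in zip(row, worst)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores
```

Each row would be one return-channel technology and each column one measured performance metric; the alternative with the highest closeness coefficient is ranked first.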


Existing models estimating oil spill costs at sea are based on data from the past, and they usually lack a systematic approach. This makes them passive and limits their ability to forecast the effect of changes in the oil-combating fleet, or in the location of a spill, on the oil spill costs. In this paper we make an attempt towards the development of a probabilistic and systematic model estimating the costs of clean-up operations for the Gulf of Finland. For this purpose we utilize expert knowledge along with the available data and information from the literature. The obtained information is then combined into a framework with the use of a Bayesian Belief Network. Due to lack of data, we validate the model by comparing its results with existing models, finding good agreement. We anticipate that the presented model can contribute to cost-effective oil-combating fleet optimization for the Gulf of Finland. It can also facilitate the estimation of accident consequences in the framework of formal safety assessment (FSA).
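The kind of probabilistic cost estimate such a network produces can be illustrated in miniature: the expected clean-up cost is the cost marginalized over the uncertain spill scenarios. The categories and figures below are invented for illustration, not taken from the model:

```python
# Toy expected clean-up cost: marginalize cost over spill-size
# categories. Probabilities and per-category costs (in million EUR)
# are illustrative placeholders only.
p_size = {"small": 0.7, "medium": 0.25, "large": 0.05}
cost_given_size = {"small": 0.5, "medium": 4.0, "large": 30.0}

expected_cost = sum(p * cost_given_size[s] for s, p in p_size.items())
print(f"expected clean-up cost: {expected_cost:.2f} MEUR")
```

In the full model the spill-size distribution would itself be conditioned on upstream nodes such as spill location and fleet response, which is what makes the estimate responsive to fleet changes rather than purely historical.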