100 results for Safety prognosis, Dynamic Bayesian networks, Ant colony algorithm, Fault propagation path, Risk evaluation, Proactive maintenance


Relevance: 100.00%

Abstract:

The relationships among organisms and their surroundings can be of immense complexity. To describe and understand an ecosystem as a tangled bank, multiple ways of interaction and their effects have to be considered, such as predation, competition, mutualism and facilitation. Understanding the resulting interaction networks is a challenge in changing environments, e.g. to predict knock-on effects of invasive species and to understand how climate change impacts biodiversity. The elucidation of complex ecological systems with their interactions will benefit enormously from the development of new machine learning tools that aim to infer the structure of interaction networks from field data. In the present study, we propose a novel Bayesian regression and multiple changepoint model (BRAM) for reconstructing species interaction networks from observed species distributions. The model has been devised to allow robust inference in the presence of spatial autocorrelation and distributional heterogeneity. We have evaluated the model on simulated data that combines a trophic niche model with a stochastic population model on a two-dimensional lattice, and we have compared the performance of our model with L1-penalized sparse regression (LASSO) and non-linear Bayesian networks with the BDe scoring scheme. In addition, we have applied our method to plant ground coverage data from the western shore of the Outer Hebrides with the objective of inferring the ecological interactions. (C) 2012 Elsevier B.V. All rights reserved.
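
As a rough illustration of the sparse-regression baseline mentioned above (not the BRAM model itself), the sketch below infers a signed, sparse interaction matrix by regressing each species' abundance on the others with an L1 penalty; the lattice data, species count and penalty strength are invented placeholders.

```python
# Illustrative sketch (not the paper's BRAM model): infer a sparse species
# interaction network from abundance data with the LASSO baseline mentioned
# in the abstract. Data, species count and penalty strength are hypothetical.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_sites, n_species = 400, 6                                      # grid cells flattened into rows
X = rng.poisson(3.0, size=(n_sites, n_species)).astype(float)    # abundances

interactions = np.zeros((n_species, n_species))
for target in range(n_species):
    predictors = np.delete(np.arange(n_species), target)
    model = Lasso(alpha=0.1).fit(X[:, predictors], X[:, target])
    interactions[target, predictors] = model.coef_                # signed, sparse effects

# Non-zero entries are candidate interactions (the sign suggests facilitation
# vs. competition); BRAM additionally handles spatial autocorrelation and
# heterogeneity via changepoints, which this sketch ignores.
print(np.round(interactions, 2))
```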

Relevance: 100.00%

Abstract:

Semi-qualitative probabilistic networks (SQPNs) merge two important graphical model formalisms: Bayesian networks and qualitative probabilistic networks. They provide a very general modeling framework by allowing the combination of numeric and qualitative assessments over a discrete domain, and can be compactly encoded by exploiting the same factorization of joint probability distributions that underlies Bayesian networks. This paper explores the computational complexity of semi-qualitative probabilistic networks, taking polytree-shaped networks as its main target. We show that the inference problem is coNP-complete for binary polytrees with multiple observed nodes. We also show that inferences can be performed in linear time if there is a single observed node, which is a relevant practical case. Because our proof is constructive, we obtain an efficient linear-time algorithm for SQPNs under such assumptions. To the best of our knowledge, this is the first exact polynomial-time algorithm for SQPNs. Together these results provide a clear picture of the inferential complexity in polytree-shaped SQPNs.

Relevance: 100.00%

Abstract:

Credal networks relax the precise probability requirement of Bayesian networks, enabling a richer representation of uncertainty in the form of closed convex sets of probability measures. The increase in expressiveness comes at the expense of higher computational costs. In this paper, we present a new variable elimination algorithm for exactly computing posterior inferences in extensively specified credal networks, which is empirically shown to outperform a state-of-the-art algorithm. The algorithm is then turned into a provably good approximation scheme, that is, a procedure that, for any input, is guaranteed to return a solution within a given factor of the optimum. Remarkably, we show that when the networks have bounded treewidth and a bounded number of states per variable, the approximation algorithm runs in time polynomial in the input size and in the inverse of the error factor, thus being the first known fully polynomial-time approximation scheme for inference in credal networks.
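
The toy sketch below shows what a posterior inference in a credal network amounts to, by brute-force enumeration of the extreme points of the local credal sets in a two-node network. It is not the paper's variable elimination algorithm or its approximation scheme, and all numbers are made up.

```python
# Illustrative sketch only: brute-force posterior bounds in a tiny credal
# network A -> B with binary variables, enumerating the extreme points of
# each local credal set. The paper's variable-elimination and approximation
# algorithms avoid this exponential enumeration; the numbers are invented.
from itertools import product

pA1_ext = [0.2, 0.4]                     # credal set for P(A=1)
pB1_given_A_ext = {0: [0.1, 0.3],        # credal set for P(B=1 | A=0)
                   1: [0.6, 0.9]}        # credal set for P(B=1 | A=1)

posteriors = []
for pA1, pB1_0, pB1_1 in product(pA1_ext, *pB1_given_A_ext.values()):
    joint_b1 = (1 - pA1) * pB1_0 + pA1 * pB1_1      # P(B=1)
    posteriors.append(pA1 * pB1_1 / joint_b1)        # P(A=1 | B=1)

print("lower P(A=1 | B=1) =", min(posteriors))
print("upper P(A=1 | B=1) =", max(posteriors))
```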

Relevance: 100.00%

Abstract:

This paper presents new results on the complexity of graph-theoretical models that represent probabilities (Bayesian networks) and that represent interval- and set-valued probabilities (credal networks). We define a new class of networks with bounded width, and introduce a new decision problem for Bayesian networks, the maximin a posteriori. We present new links between Bayesian and credal networks, and present new results both for Bayesian networks (most probable explanation with observations, maximin a posteriori) and for credal networks (bounds on probabilities a posteriori, most probable explanation with and without observations, maximum a posteriori).

Relevance: 100.00%

Abstract:

Highway structures such as bridges are subject to continuous degradation, primarily due to ageing and environmental factors. A rational transport policy requires the monitoring of this transport infrastructure to provide adequate maintenance and guarantee the required levels of transport service and safety. In Europe, this is now a legal requirement: a European Directive requires all member states of the European Union to implement a Bridge Management System. However, the process is expensive, requiring the installation of sensing equipment and data acquisition electronics on the bridge. This paper investigates the use of an instrumented vehicle fitted with accelerometers on its axles to monitor the dynamic behaviour of bridges as an indicator of their structural condition. This approach eliminates the need for any on-site installation of measurement equipment. A simplified half-car vehicle-bridge interaction model is used in theoretical simulations to test the possibility of extracting the dynamic parameters of the bridge from the spectra of the vehicle accelerations. The effects of vehicle speed, vehicle mass and bridge span length on the detection of the bridge dynamic parameters are investigated. The algorithm is highly sensitive to the condition of the road profile, and simulations are carried out for both smooth and rough profiles.
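
A minimal sketch of the spectral step described above: locating a peak in the spectrum of an axle acceleration record near the bridge's first natural frequency. The signal below is synthetic (the bridge and vehicle frequencies and the noise level are assumed placeholders), not the output of the half-car interaction model.

```python
# Minimal sketch: estimate a bridge's first natural frequency from an axle
# acceleration record. The signal here is synthetic (bridge frequency,
# vehicle bounce frequency and noise levels are made-up placeholders),
# not the paper's half-car vehicle-bridge interaction model.
import numpy as np

fs = 200.0                                  # sampling rate [Hz]
t = np.arange(0.0, 10.0, 1.0 / fs)          # 10 s crossing record
bridge_f, vehicle_f = 4.2, 1.5              # assumed frequencies [Hz]
accel = (0.3 * np.sin(2 * np.pi * bridge_f * t)       # bridge contribution
         + 1.0 * np.sin(2 * np.pi * vehicle_f * t)    # vehicle bounce
         + 0.2 * np.random.default_rng(1).standard_normal(t.size))  # road/noise

spectrum = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(accel.size, d=1.0 / fs)

mask = freqs > 3.0                          # look above the vehicle frequency
peak = freqs[mask][np.argmax(spectrum[mask])]
print(f"estimated bridge frequency: {peak:.2f} Hz")
```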

Relevance: 100.00%

Abstract:

The paper introduces a new modeling approach that represents the waiting times in an Accident and Emergency (A&E) Department in a UK-based National Health Service (NHS) hospital. The technique uses Bayesian networks to capture the heterogeneity of arriving patients by representing how patient covariates interact to influence their waiting times in the department. Such waiting times have been reviewed by the NHS as a means of investigating the efficiency of A&E departments (emergency rooms) and how they operate. As a result, activity targets are now established based on patients' total waiting times, with much emphasis on trolley waits.
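
As a toy illustration of the modeling idea (not the paper's network), the sketch below builds a small discrete Bayesian network in which two invented patient covariates influence a binary waiting-time band, using the pgmpy library as in recent releases; all variables, states and probabilities are hypothetical.

```python
# Toy sketch of the idea above: a small discrete Bayesian network in which
# patient covariates influence a waiting-time band. Variables, states and
# all probabilities are invented for illustration; not the paper's model.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("Triage", "Wait"), ("Admitted", "Wait")])

cpd_triage = TabularCPD("Triage", 2, [[0.7], [0.3]])            # minor/major
cpd_admit = TabularCPD("Admitted", 2, [[0.8], [0.2]])           # no/yes
cpd_wait = TabularCPD(                                          # <4h / >4h
    "Wait", 2,
    [[0.95, 0.6, 0.8, 0.3],    # P(Wait = <4h | Triage, Admitted)
     [0.05, 0.4, 0.2, 0.7]],
    evidence=["Triage", "Admitted"], evidence_card=[2, 2])

model.add_cpds(cpd_triage, cpd_admit, cpd_wait)
assert model.check_model()

# P(Wait) for a major-triage patient who is admitted
query = VariableElimination(model).query(["Wait"],
                                         evidence={"Triage": 1, "Admitted": 1})
print(query)
```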

Relevance: 100.00%

Abstract:

We present a fully-distributed self-healing algorithm, DEX, that maintains a constant-degree expander network in a dynamic setting. To the best of our knowledge, our algorithm provides the first efficient distributed construction of expanders - whose expansion properties hold deterministically - that works even under an all-powerful adaptive adversary that controls the dynamic changes to the network (the adversary has unlimited computational power and knowledge of the entire network state, can decide which nodes join and leave and at what time, and knows the past random choices made by the algorithm). Previous distributed expander constructions typically provide only probabilistic guarantees on the network expansion, which rapidly degrade in a dynamic setting; in particular, the expansion properties can degrade even more rapidly under adversarial insertions and deletions. Our algorithm provides efficient maintenance and incurs a low overhead per insertion/deletion by an adaptive adversary: only O(log n) rounds and O(log n) messages are needed with high probability (n is the number of nodes currently in the network). The algorithm requires only a constant number of topology changes. Moreover, our algorithm allows for an efficient implementation and maintenance of a distributed hash table (DHT) on top of DEX, with only a constant additional overhead. Our results are a step towards implementing efficient self-healing networks that have guaranteed properties (constant bounded degree and expansion) despite dynamic changes.

Relevance: 100.00%

Abstract:

We present a Bayesian-odds-ratio-based algorithm for detecting stellar flares in light-curve data. We assume flares are described by a model in which there is a rapid rise with a half-Gaussian profile, followed by an exponential decay. Our signal model also contains a polynomial background model required to fit underlying light-curve variations in the data, which could otherwise partially mimic a flare. We characterize the false alarm probability and efficiency of this method under the assumption that any unmodelled noise in the data is Gaussian, and compare it with a simpler thresholding method based on that used in Walkowicz et al. We find our method has a significant increase in detection efficiency for low signal-to-noise ratio (S/N) flares. For a conservative false alarm probability, our method can detect 95 per cent of flares with S/N less than 20, as compared to an S/N of 25 for the simpler method. We also test how well the assumption of Gaussian noise holds by applying the method to a selection of 'quiet' Kepler stars. As an example, we have applied our method to a selection of stars in Kepler Quarter 1 data. The method finds 687 flaring stars with a total of 1873 flares after vetoes have been applied. For these flares we have made preliminary characterizations of their durations and S/N.
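
A minimal sketch of the flare shape described above (half-Gaussian rise followed by an exponential decay) and of a crude likelihood comparison against a noise-only model under Gaussian noise. The full method also marginalises over a polynomial background and computes a Bayesian odds ratio, which this sketch does not attempt; all parameters and data below are synthetic.

```python
# Sketch of the flare template (half-Gaussian rise, exponential decay) and a
# crude flare-vs-noise log-likelihood comparison under Gaussian noise. This is
# a stand-in for the paper's full Bayesian odds ratio; numbers are synthetic.
import numpy as np

def flare_template(t, t0, amp, tau_rise, tau_decay):
    """Flare of amplitude `amp` peaking at time t0."""
    rise = amp * np.exp(-0.5 * ((t - t0) / tau_rise) ** 2)
    decay = amp * np.exp(-(t - t0) / tau_decay)
    return np.where(t < t0, rise, decay)

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 500)
sigma = 0.1
data = flare_template(t, 4.0, 0.5, 0.2, 1.0) + rng.normal(0.0, sigma, t.size)

def log_like(residual):
    return -0.5 * np.sum((residual / sigma) ** 2)

model = flare_template(t, 4.0, 0.5, 0.2, 1.0)        # assumed known parameters
log_ratio = log_like(data - model) - log_like(data)  # flare vs. noise-only
print(f"log likelihood ratio (flare vs. none): {log_ratio:.1f}")
```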

Relevance: 100.00%

Abstract:

Credal networks are graph-based statistical models whose parameters take values in a set, instead of being sharply specified as in traditional statistical models (e.g., Bayesian networks). The computational complexity of inferences on such models depends on the irrelevance/independence concept adopted. In this paper, we study inferential complexity under the concepts of epistemic irrelevance and strong independence. We show that inferences under strong independence are NP-hard even in trees whose variables are all binary except for a single ternary one. We prove that under epistemic irrelevance the polynomial-time complexity of inferences in credal trees is not likely to extend to more general models (e.g., singly connected topologies). These results clearly distinguish networks that admit efficient inferences from those where inferences are most likely hard, and settle several open questions regarding their computational complexity. We show that these results remain valid even if we disallow the use of zero probabilities. We also show that the computation of bounds on the probability of the future state in a hidden Markov model is the same whether we assume epistemic irrelevance or strong independence, and we prove an analogous result for inference in naive Bayes structures. These inferential equivalences are important for practitioners, as hidden Markov models and naive Bayes networks are used in real applications of imprecise probability.
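
To make the "bounds on the probability of the future state" concrete, the toy sketch below computes lower and upper one-step prediction probabilities in a two-state chain whose transition probabilities are only known up to intervals, by enumerating the extreme rows. The intervals and the current belief are invented numbers, not taken from the paper.

```python
# Illustrative sketch: lower/upper probability of the next hidden state in a
# two-state chain whose transition probabilities are only known up to
# intervals. Bounds are found by enumerating the extreme rows; the intervals
# and the current belief are made-up numbers.
from itertools import product

belief = [0.7, 0.3]                        # current P(X_t = 0), P(X_t = 1)
stay_intervals = [(0.6, 0.8), (0.5, 0.9)]  # bounds on P(X_{t+1} = i | X_t = i)

values = []
for stay0, stay1 in product(*stay_intervals):
    # each row of the transition matrix sits at an endpoint of its interval
    values.append(belief[0] * stay0 + belief[1] * (1 - stay1))  # P(X_{t+1} = 0)

print("lower P(X_{t+1} = 0) =", min(values))
print("upper P(X_{t+1} = 0) =", max(values))
```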

Relevance: 100.00%

Abstract:

Credal networks are graph-based statistical models whose parameters take values in a set, instead of being sharply specified as in traditional statistical models (e.g., Bayesian networks). The result of inferences with such models depends on the irrelevance/independence concept adopted. In this paper, we study the computational complexity of inferences under the concepts of epistemic irrelevance and strong independence. We strengthen existing complexity results by showing that inferences with strong independence are NP-hard even in credal trees with ternary variables, which indicates that tractable algorithms, including the existing one for epistemic trees, cannot be used for strong independence. We prove that the polynomial-time complexity of inferences in credal trees under epistemic irrelevance is not likely to extend to more general models, because the problem becomes NP-hard even in simple polytrees. These results draw a definite line between networks with efficient inferences and those where inferences are hard, and close several open questions regarding the computational complexity of such models.

Relevance: 100.00%

Abstract:

A credal network is a graphical tool for representation and manipulation of uncertainty, where probability values may be imprecise or indeterminate. A credal network associates a directed acyclic graph with a collection of sets of probability measures; in this context, inference is the computation of tight lower and upper bounds for conditional probabilities. In this paper we present new algorithms for inference in credal networks based on multilinear programming techniques. Experiments indicate that these new algorithms have better performance than existing ones, in the sense that they can produce more accurate results in larger networks.

Relevance: 100.00%

Abstract:

We present a fully-distributed self-healing algorithm DEX that maintains a constant degree expander network in a dynamic setting. To the best of our knowledge, our algorithm provides the first efficient distributed construction of expanders - whose expansion properties hold deterministically - that works even under an all-powerful adaptive adversary that controls the dynamic changes to the network (the adversary has unlimited computational power and knowledge of the entire network state, can decide which nodes join and leave and at what time, and knows the past random choices made by the algorithm). Previous distributed expander constructions typically provide only probabilistic guarantees on the network expansion which rapidly degrade in a dynamic setting; in particular, the expansion properties can degrade even more rapidly under adversarial insertions and deletions. Our algorithm provides efficient maintenance and incurs a low overhead per insertion/deletion by an adaptive adversary: only O(log n) rounds and O(log n) messages are needed with high probability (n is the number of nodes currently in the network). The algorithm requires only a constant number of topology changes. Moreover, our algorithm allows for an efficient implementation and maintenance of a distributed hash table on top of DEX, with only a constant additional overhead. Our results are a step towards implementing efficient self-healing networks that have guaranteed properties (constant bounded degree and expansion) despite dynamic changes.

Gopal Pandurangan has been supported in part by Nanyang Technological University Grant M58110000, Singapore Ministry of Education (MOE) Academic Research Fund (AcRF) Tier 2 Grant MOE2010-T2-2-082, MOE AcRF Tier 1 Grant MOE2012-T1-001-094, and the United States-Israel Binational Science Foundation (BSF) Grant 2008348. Peter Robinson has been supported by Grant MOE2011-T2-2-042 “Fault-tolerant Communication Complexity in Wireless Networks” from the Singapore MoE AcRF-2. Work done in part while the author was at the Nanyang Technological University and at the National University of Singapore. Amitabh Trehan has been supported by the Israeli Centers of Research Excellence (I-CORE) program (Center No. 4/11). Work done in part while the author was at Hebrew University of Jerusalem and at the Technion and supported by a Technion fellowship.

Relevance: 100.00%

Abstract:

This work proposes ETAN, an extended version of the well-known tree-augmented naive Bayes (TAN) classifier in which the structure learning step is performed without requiring features to be connected to the class. Based on a modification of Edmonds' algorithm, our structure learning procedure explores a superset of the structures that are considered by TAN, yet achieves global optimality of the learning score function in a very efficient way (quadratic in the number of features, the same complexity as learning TANs). We enhance our procedure with a new score function that only takes into account arcs that are relevant to predicting the class, as well as an optimization over the equivalent sample size during learning. These ideas may be useful for structure learning of Bayesian networks in general. A range of experiments shows that we obtain models with better prediction accuracy than naive Bayes and TAN, and comparable to the accuracy of the state-of-the-art classifier averaged one-dependence estimator (AODE). We release our implementation of ETAN so that it can be easily installed and run within Weka.
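
The sketch below shows the classic TAN structure-learning step that ETAN generalises: weight every feature pair by its class-conditional mutual information and keep a maximum-weight spanning tree over the features. It is not the authors' Edmonds-based ETAN procedure or their Weka implementation, and the data is synthetic.

```python
# Sketch of the classic TAN structure-learning step that ETAN generalises:
# weight feature pairs by class-conditional mutual information, then keep a
# maximum spanning tree over the features (Chow-Liu style). Not the authors'
# Edmonds-based ETAN procedure or their Weka code; the data is synthetic.
import numpy as np
import networkx as nx

def cond_mutual_info(xi, xj, c):
    """Empirical I(Xi; Xj | C) for small discrete arrays."""
    mi = 0.0
    for cv in np.unique(c):
        mask = c == cv
        w = mask.mean()                              # P(C = cv)
        for a in np.unique(xi):
            for b in np.unique(xj):
                p_ab = np.mean(mask & (xi == a) & (xj == b)) / w
                p_a = np.mean(xi[mask] == a)
                p_b = np.mean(xj[mask] == b)
                if p_ab > 0:
                    mi += w * p_ab * np.log(p_ab / (p_a * p_b))
    return mi

rng = np.random.default_rng(3)
C = rng.integers(0, 2, 300)                          # binary class
X = np.column_stack([(C + rng.integers(0, 2, 300)) % 2 for _ in range(4)])

G = nx.Graph()
for i in range(X.shape[1]):
    for j in range(i + 1, X.shape[1]):
        G.add_edge(i, j, weight=cond_mutual_info(X[:, i], X[:, j], C))

tree = nx.maximum_spanning_tree(G)                   # feature-to-feature arcs
print(sorted(tree.edges()))
```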

Relevance: 100.00%

Abstract:

The problem of topology control is to assign per-node transmission power such that the resulting topology is energy-efficient and satisfies certain global properties such as connectivity. The conventional approach to achieving these objectives is based on the fundamental assumption that nodes are socially responsible. We examine the following question: if nodes behave in a selfish manner, how does this impact the overall connectivity and energy consumption in the resulting topologies? We pose the above problem as a noncooperative game and use game-theoretic analysis to address it. We study Nash equilibrium properties of the topology control game and evaluate the efficiency of the induced topology when nodes employ a greedy best-response algorithm. We show that even when the nodes have complete information about the network, the steady-state topologies are suboptimal. We propose a modified algorithm based on a better-response dynamic and show that this algorithm is guaranteed to converge to energy-efficient and connected topologies. Moreover, the node transmit power levels are more evenly distributed, and the network performance is comparable to that obtained from centralized algorithms.
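
As a toy illustration of a greedy best-response dynamic for topology control (not the algorithms analysed in the paper), the sketch below lets each node repeatedly pick the smallest transmission radius that still allows it to reach every other node through the resulting directed graph; node positions, the radius-based cost and the number of sweeps are invented simplifications.

```python
# Toy sketch of a greedy best-response dynamic for topology control: in each
# sweep, every node picks the smallest transmission radius that still lets it
# reach all other nodes through the resulting directed graph. Node positions,
# the radius-based cost and the sweep count are invented simplifications.
import itertools
import numpy as np
import networkx as nx

rng = np.random.default_rng(4)
pos = rng.uniform(0.0, 1.0, size=(8, 2))             # 8 nodes in the unit square
dist = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
power = dist.max(axis=1).copy()                       # start from maximum power

def reaches_all(i, power):
    """Can node i reach every other node under the current power assignment?"""
    g = nx.DiGraph()
    g.add_nodes_from(range(len(power)))
    g.add_edges_from((u, v)
                     for u, v in itertools.permutations(range(len(power)), 2)
                     if dist[u, v] <= power[u])
    return len(nx.descendants(g, i)) == len(power) - 1

for _ in range(20):                                   # best-response sweeps
    for i in range(len(power)):
        for p in sorted(set(dist[i])):                # candidate radii, smallest first
            trial = power.copy()
            trial[i] = p
            if reaches_all(i, trial):
                power[i] = p
                break

print("per-node transmission radii:", np.round(power, 3))
```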