72 results for Uncertainty in governance
Abstract:
In this paper, a strategy for min-max Moving Horizon Estimation (MHE) of a class of uncertain hybrid systems is proposed. The class of hybrid systems considered is that of Piecewise Affine (PWA) systems with both continuous-valued and logic components. Furthermore, we consider the case where there is a (possibly structured) norm-bounded uncertainty in each subsystem. Sufficient conditions on the time horizon and on the penalties on the state at the beginning of the estimation horizon are provided to guarantee convergence of the MHE scheme. The MHE scheme is implemented as a mixed-integer semidefinite optimisation problem, for which an efficient algorithm was recently introduced.
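As a structural sketch only (a nominal linear system solved by least squares, not the paper's min-max formulation for uncertain PWA systems; the matrices, horizon length, and noise level below are invented), a moving-horizon estimate over a window of N measurements can be written as:

```python
import numpy as np

# Hypothetical double-integrator-like system observed through position only.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
N = 10                                       # estimation horizon length
rng = np.random.default_rng(0)

# Simulate a trajectory with measurement noise.
x0_true = np.array([0.0, 1.0])
x = x0_true.copy()
ys = []
for _ in range(N):
    ys.append(C @ x + 0.01 * rng.standard_normal(1))
    x = A @ x

# MHE over the window: stack the observation maps y_k = C A^k x_0 and
# solve least squares for the state at the start of the horizon.
obs = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(N)])
y_stack = np.concatenate(ys)
x0_hat, *_ = np.linalg.lstsq(obs, y_stack, rcond=None)
```

In the min-max setting of the paper, the least-squares cost would be replaced by a worst-case cost over the admissible uncertainty, with an arrival-cost penalty on `x0_hat`.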
Abstract:
Brittleness is the unintended but inevitable consequence of producing a transparent ceramic for architectural applications, such as soda-lime glass. Its tensile strength is particularly sensitive to surface imperfections, such as those caused by natural weathering and malicious damage. Although a significant amount of testing has been carried out on new glass, there has been surprisingly little testing of weathered glass. Given the variable nature of the causes of surface damage, the lack of data on weathered glass leads to considerable uncertainty in the long-term strength of exposed glass. This paper presents the results of recent tests on annealed glass exposed to natural weathering for more than 20 years. The tests include experimental investigations using a coaxial ring setup as well as optical and atomic force microscopy of the glass surfaces. The experimental data from these tests are subsequently used to extend existing fracture-mechanics-based models to predict the strength of weathered glass. It is shown that an automated approach based directly on finite element analysis results can give an increase in effective design strength on the order of 70 to 100% compared to maximum stress methods. It is also shown that by combining microscopy and strength test results, it is possible to quantitatively characterise the damage on glass surfaces.
Abstract:
At an early stage of learning novel dynamics, changes in muscle activity are mainly due to corrective feedback responses. These feedback contributions to the overall motor command are gradually reduced as feedforward control is learned. The temporarily increased use of feedback could arise simply from the large errors in early learning with either unaltered gains or even slightly downregulated gains, or from an upregulation of the feedback gains when feedforward prediction is insufficient. We therefore investigated whether the sensorimotor control system alters feedback gains during adaptation to a novel force field generated by a robotic manipulandum. To probe the feedback gains throughout learning, we measured the magnitude of involuntary rapid visuomotor responses to rapid shifts in the visual location of the hand during reaching movements. We found large increases in the magnitude of the rapid visuomotor response whenever the dynamics changed: both when the force field was first presented, and when it was removed. We confirmed that these changes in feedback gain are not simply a byproduct of the change in background load, by demonstrating that this rapid visuomotor response is not load sensitive. Our results suggest that when the sensorimotor control system experiences errors, it increases the gain of the visuomotor feedback pathways to deal with the unexpected disturbances until the feedforward controller learns the appropriate dynamics. We suggest that these feedback gains are upregulated with increased uncertainty in the knowledge of the dynamics to counteract any errors or disturbances and ensure accurate and skillful movements.
Abstract:
Modelling is fundamental to many fields of science and engineering. A model can be thought of as a representation of possible data one could predict from a system. The probabilistic approach to modelling uses probability theory to express all aspects of uncertainty in the model. The probabilistic approach is synonymous with Bayesian modelling, which simply uses the rules of probability theory in order to make predictions, compare alternative models, and learn model parameters and structure from data. This simple and elegant framework is most powerful when coupled with flexible probabilistic models. Flexibility is achieved through the use of Bayesian non-parametrics. This article provides an overview of probabilistic modelling and an accessible survey of some of the main tools in Bayesian non-parametrics. The survey covers the use of Bayesian non-parametrics for modelling unknown functions, density estimation, clustering, time-series modelling, and representing sparsity, hierarchies, and covariance structure. More specifically, it gives brief non-technical overviews of Gaussian processes, Dirichlet processes, infinite hidden Markov models, Indian buffet processes, Kingman's coalescent, Dirichlet diffusion trees and Wishart processes.
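Of the tools surveyed, the Gaussian process is the simplest to illustrate in code. The sketch below assumes a squared-exponential kernel and invented one-dimensional data; it computes the closed-form posterior mean and pointwise uncertainty:

```python
import numpy as np

def sq_exp_kernel(xa, xb, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = xa[:, None] - xb[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 5.0, 8)
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(8)
x_test = np.linspace(0.0, 5.0, 50)

noise = 0.1 ** 2
K = sq_exp_kernel(x_train, x_train) + noise * np.eye(8)
K_s = sq_exp_kernel(x_test, x_train)
K_ss = sq_exp_kernel(x_test, x_test)

# Standard GP regression equations: posterior mean and covariance.
alpha = np.linalg.solve(K, y_train)
post_mean = K_s @ alpha
post_cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
post_std = np.sqrt(np.clip(np.diag(post_cov), 0.0, None))
```

The posterior standard deviation shrinks near the observed inputs and grows between them, which is exactly the "uncertainty in the model" the probabilistic approach keeps track of.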
Abstract:
Ground vibration due to underground railways is a significant source of disturbance for people living or working near subways. Numerical models are commonly used to predict vibration levels; however, the uncertainty inherent in these simulations must be understood to give confidence in the predictions. A semi-analytical approach is developed herein to investigate the effect of uncertainty in soil material properties on the surface vibration of layered half-spaces excited by an underground railway. The half-space is simulated using the thin-layer method coupled with the pipe-in-pipe (PiP) method for determining the load on the buried tunnel. The Karhunen-Loève (K-L) expansion method is employed to smoothly vary the material properties throughout the soil by up to 10%. The simulation predicts a surface RMS velocity variation of 5 to 10 dB compared to a homogeneous, layered half-space. These results suggest it may be prudent to include a 5 dB error band on predicted vibration levels when simulating areas of varied material properties.
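A discrete Karhunen-Loève sketch under assumed values (exponential correlation model, 5 m correlation length, invented nominal modulus; none of these are the paper's values): eigendecompose the covariance of a soil property over a depth grid, then synthesise smooth perturbations capped at 10%:

```python
import numpy as np

depth = np.linspace(0.0, 20.0, 64)             # depth grid, m (assumed)
corr_len = 5.0                                  # correlation length, m (assumed)
cov = np.exp(-np.abs(depth[:, None] - depth[None, :]) / corr_len)

# K-L modes are eigenvectors of the covariance, ordered by eigenvalue.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_modes = 10                                    # truncate to dominant modes
rng = np.random.default_rng(1)
xi = rng.standard_normal(n_modes)
field = eigvecs[:, :n_modes] @ (np.sqrt(eigvals[:n_modes]) * xi)

nominal = 80e6                                  # nominal modulus, Pa (assumed)
perturbation = 0.10 * field / np.max(np.abs(field))   # cap at +/- 10%
modulus = nominal * (1.0 + perturbation)
```

Each realisation of `modulus` is a smooth random profile, which is what allows the material properties to "vary smoothly throughout the soil" rather than jump independently at each depth.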
Abstract:
Amplitude demodulation is an ill-posed problem, and so it is natural to treat it from a Bayesian viewpoint, inferring the most likely carrier and envelope under probabilistic constraints. One such treatment is Probabilistic Amplitude Demodulation (PAD), which, whilst computationally more intensive than traditional approaches, offers several advantages. Here we provide methods for estimating the uncertainty in the PAD-derived envelopes and carriers, and for learning free parameters such as the time scale of the envelope. We show how the probabilistic approach can naturally handle noisy and missing data. Finally, we indicate how to extend the model to signals which contain multiple modulators and carriers.
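For contrast with PAD (which is not implemented here), the classical non-probabilistic baseline extracts the envelope from the analytic signal; the test signal below is invented:

```python
import numpy as np

def hilbert_envelope(x):
    """Amplitude envelope via the analytic signal, built with the FFT."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)                 # frequency-domain analytic-signal filter
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0         # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0             # keep the Nyquist bin as-is
    return np.abs(np.fft.ifft(X * h))

t = np.linspace(0.0, 1.0, 2000, endpoint=False)
true_env = 1.0 + 0.5 * np.sin(2 * np.pi * 3 * t)   # slow modulator
carrier = np.cos(2 * np.pi * 100 * t)              # fast carrier
signal = true_env * carrier

env = hilbert_envelope(signal)
err = np.mean(np.abs(env - true_env))
```

This works well here because the modulator and carrier occupy disjoint frequency bands; PAD's advantage is precisely the noisy, missing-data, and overlapping-band cases where this deterministic recipe breaks down.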
Abstract:
Variational methods are a key component of the approximate inference and learning toolbox. These methods fill an important middle ground: they retain distributional information about uncertainty in latent variables, unlike maximum a posteriori (MAP) methods, yet generally require less computational time than Markov chain Monte Carlo (MCMC) methods. In particular, the variational Expectation Maximisation (vEM) and variational Bayes algorithms, both involving variational optimisation of a free energy, are widely used in time-series modelling. Here, we investigate the success of vEM in simple probabilistic time-series models. First we consider the inference step of vEM, and show that a consequence of the well-known compactness property of variational inference is a failure to propagate uncertainty in time, thus limiting the usefulness of the retained distributional information. In particular, the uncertainty may appear to be smallest precisely when the approximation is poorest. Second, we consider parameter learning and analytically reveal systematic biases in the parameters found by vEM. Surprisingly, simpler variational approximations (such as mean-field) can lead to less bias than more complicated structured approximations.
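The compactness property can be seen in a toy case: for a correlated bivariate Gaussian, the mean-field Gaussian that minimises KL(q || p) matches the diagonal of the precision matrix, so its per-coordinate variance is 1/Λ_ii, which understates the true marginal variance Σ_ii (the values below are an invented example):

```python
import numpy as np

rho = 0.9
Sigma = np.array([[1.0, rho],
                  [rho, 1.0]])       # true covariance
Lambda = np.linalg.inv(Sigma)        # precision matrix

true_marginal_var = np.diag(Sigma)            # each equals 1.0
mean_field_var = 1.0 / np.diag(Lambda)        # equals 1 - rho**2 = 0.19

shrinkage = mean_field_var / true_marginal_var  # fraction of variance kept
```

The stronger the correlation the approximation ignores, the smaller `shrinkage` becomes: the reported uncertainty is smallest exactly where the factorised approximation is worst, mirroring the failure mode described above.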
Abstract:
An existing hybrid finite element (FE)/statistical energy analysis (SEA) approach to the analysis of the mid- and high-frequency vibrations of a complex built-up system is extended here to a wider class of uncertainty modeling. In the original approach, the constituent parts of the system are considered to be either deterministic, and modeled using FE, or highly random, and modeled using SEA. A non-parametric model of randomness is employed in the SEA components, based on diffuse wave theory and the Gaussian Orthogonal Ensemble (GOE), and this enables the mean and variance of second order quantities such as vibrational energy and response cross-spectra to be predicted. In the present work the assumption that the FE components are deterministic is relaxed by the introduction of a parametric model of uncertainty in these components. The parametric uncertainty may be modeled either probabilistically, or by using a non-probabilistic approach such as interval analysis, and it is shown how these descriptions can be combined with the non-parametric uncertainty in the SEA subsystems to yield an overall assessment of the performance of the system. The method is illustrated by application to an example built-up plate system which has random properties, and benchmark comparisons are made with full Monte Carlo simulations.
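As a minimal illustration of the non-probabilistic interval description (a one-degree-of-freedom toy, not the paper's built-up plate system; all values invented), an interval on a stiffness maps monotonically to an interval on the natural frequency f = sqrt(k/m) / (2*pi):

```python
import math

m = 2.0                         # mass, kg (assumed)
k_lo, k_hi = 900.0, 1100.0      # stiffness interval, N/m (assumed)

# Because f is monotone in k, the response interval comes directly
# from the endpoints -- no sampling or distribution is needed.
f_lo = math.sqrt(k_lo / m) / (2.0 * math.pi)
f_hi = math.sqrt(k_hi / m) / (2.0 * math.pi)
```

This is the essential contrast with the probabilistic description: interval analysis returns guaranteed bounds without assuming any distribution over the parameter, at the price of saying nothing about likelihood within the bounds.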
Abstract:
Numerical integration is a key component of many problems in scientific computing, statistical modelling, and machine learning. Bayesian Quadrature is a model-based method for numerical integration which, relative to standard Monte Carlo methods, offers increased sample efficiency and a more robust estimate of the uncertainty in the estimated integral. We propose a novel Bayesian Quadrature approach for numerical integration when the integrand is non-negative, as in the case of computing the marginal likelihood, predictive distribution, or normalising constant of a probabilistic model. Our approach approximately marginalises the quadrature model's hyperparameters in closed form, and introduces an active learning scheme to optimally select function evaluations, as opposed to using Monte Carlo samples. We demonstrate our method on both a number of synthetic benchmarks and a real scientific problem from astronomy.
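A vanilla Bayesian-quadrature sketch (not the non-negative variant proposed in the abstract): place a GP prior on the integrand, then the posterior over the integral is Gaussian with mean z K^{-1} y. The integrand, kernel lengthscale, and evaluation points are invented, and the kernel-mean integrals are computed numerically here for simplicity:

```python
import numpy as np

def kernel(a, b, ell=0.3):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

f = lambda x: np.sin(3.0 * x) + 1.5           # toy integrand on [0, 1]
x_obs = np.linspace(0.0, 1.0, 7)
y_obs = f(x_obs)

grid = np.linspace(0.0, 1.0, 1001)            # for kernel-mean integrals
dx = grid[1] - grid[0]
w = np.full(grid.size, dx)                    # trapezoid weights
w[0] = w[-1] = dx / 2.0

K = kernel(x_obs, x_obs) + 1e-10 * np.eye(7)
z = w @ kernel(grid, x_obs)                   # z_i = integral of k(x, x_i)

bq_mean = z @ np.linalg.solve(K, y_obs)       # posterior mean of integral
bq_var = w @ kernel(grid, grid) @ w - z @ np.linalg.solve(K, z)

true_val = (1.0 - np.cos(3.0)) / 3.0 + 1.5    # exact value for comparison
```

Unlike a Monte Carlo estimate, `bq_var` quantifies the remaining uncertainty in the integral itself, which is what the active learning scheme in the abstract exploits when choosing the next evaluation point.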
Abstract:
We study the design of a tracking controller for the popular bouncing-ball model: the continuous-time actuation of a table is used to control the impacts of the table with a bouncing ball. The proposed control law uses the impact times as the sole feedback information. We show that the acceleration of the table at impact plays no role in the stability analysis but is an important parameter for the robustness of the feedback system to model uncertainty, in particular to uncertainty in the coefficient of restitution.
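The role of the coefficient of restitution can be illustrated with the uncontrolled, stationary-table impact map (a toy, not the paper's actuated setting; parameter values assumed): each impact scales the rebound speed by e, so flight times form a geometric series and the impact times accumulate at a finite "Zeno" time:

```python
g = 9.81                       # gravitational acceleration, m/s^2
e = 0.8                        # coefficient of restitution (assumed)
v0 = 4.0                       # upward speed after the first impact, m/s

v = v0
t = 0.0
impact_times = [t]
for _ in range(20):
    flight = 2.0 * v / g       # ballistic flight time back to the table
    t += flight
    impact_times.append(t)
    v *= e                     # impact law: v_plus = e * v_minus

# Geometric series: impacts accumulate at a finite Zeno time.
zeno_time = (2.0 * v0 / g) / (1.0 - e)
```

Since the closed-loop behaviour is driven entirely by this restitution law, an error in `e` shifts every subsequent impact time, which is why robustness to that coefficient matters for an impact-time feedback controller.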
Abstract:
The growing interest in innovative reactors and advanced fuel cycle designs requires more accurate prediction of various transuranic actinide concentrations during irradiation or following discharge because of their effect on reactivity or spent-fuel emissions, such as gamma and neutron activity and decay heat. In this respect, many of the important actinides originate from the 241Am(n,γ) reaction, which leads to either the ground or the metastable state of 242Am. The branching ratio for this reaction depends on the incident neutron energy and has very large uncertainty in the current evaluated nuclear data files. This study examines the effect of accounting for the energy dependence of the 241Am(n,γ) reaction branching ratio calculated from different evaluated data files for different reactor and fuel types on the reactivity and concentrations of some important actinides. The results of the study confirm that the uncertainty in knowing the 241Am(n,γ) reaction branching ratio has a negligible effect on the characteristics of conventional light water reactor fuel. However, in advanced reactors with large loadings of actinides in general, and 241Am in particular, the branching ratio data calculated from the different data files may lead to significant differences in the prediction of the fuel criticality and isotopic composition. Moreover, it was found that neutron energy spectrum weighting of the branching ratio in each analyzed case is particularly important and may result in up to a factor of 2 difference in the branching ratio value. Currently, most of the neutronic codes have a single branching ratio value in their data libraries, which is sometimes difficult or impossible to update in accordance with the neutron spectrum shape for the analyzed system.
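Spectrum weighting of an energy-dependent branching ratio can be sketched as follows. The branching-ratio curve and the two flux shapes below are invented for illustration (they are not evaluated nuclear data), and a full reaction-rate weighting with the (n,γ) cross-section is omitted for brevity:

```python
import numpy as np

energy = np.logspace(-5, 7, 400)              # neutron energy grid, eV

# Hypothetical energy-dependent branching ratio to the 242Am ground state.
u = np.log10(energy)
br = 0.85 - 0.25 / (1.0 + np.exp(-(u - 3.0)))

# Two stylised flux spectra (arbitrary log-normal shapes):
thermal_flux = np.exp(-0.5 * ((u + 1.0) / 1.0) ** 2)   # thermal-peaked
fast_flux = np.exp(-0.5 * ((u - 5.5) / 1.0) ** 2)      # fast-peaked

def weighted_br(flux):
    """One-group branching ratio collapsed with the given spectrum."""
    return np.sum(br * flux) / np.sum(flux)

br_thermal = weighted_br(thermal_flux)
br_fast = weighted_br(fast_flux)
```

Even with this crude weighting, the collapsed one-group value differs markedly between the two spectra, which is the mechanism behind the factor-of-2 sensitivity reported above and the reason a single library-wide branching ratio can be inadequate.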
Abstract:
We present a novel mixture of trees (MoT) graphical model for video segmentation. Each component in this mixture represents a tree-structured temporal linkage between super-pixels from the first to the last frame of a video sequence. Our time-series model explicitly captures the uncertainty in temporal linkage between adjacent frames, which improves segmentation accuracy. We provide a variational inference scheme for this model to estimate super-pixel labels and their confidences in near real time. The efficacy of our approach is demonstrated via quantitative comparisons on the challenging SegTrack joint segmentation and tracking dataset [23].