166 results for Probabilistic Error Correction
Abstract:
Bayesian-formulated neural networks are implemented using the hybrid Monte Carlo method for probabilistic fault identification in cylindrical shells. Each of the 20 nominally identical cylindrical shells is divided into three substructures. Holes of (12±2) mm in diameter are introduced in each of the substructures and vibration data are measured. Modal properties and the Coordinate Modal Assurance Criterion (COMAC) are used to train the two modal-property neural networks. The COMAC values are calculated by treating the natural-frequency vector as an additional mode. Modal energies are calculated by integrating the real and imaginary components of the frequency response functions over bandwidths of 12% of the natural frequencies. The modal energies and the Coordinate Modal Energy Assurance Criterion (COMEAC) are used to train the two frequency-response-function neural networks. The averages of the two sets of trained networks (COMAC and COMEAC, as well as modal properties and modal energies) form two committees of networks. The COMEAC and the COMAC prove to be better identification data than the modal properties and modal energies used directly. The committee approach is observed to give lower standard deviations than the individual methods. The main advantage of the Bayesian formulation is that it gives both the identities of damage and their respective confidence intervals.
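The COMAC indicator used above has a standard per-coordinate definition. A minimal sketch of its computation follows; the matrix shapes and variable names are our assumptions for illustration, not the authors' code, and the appended natural-frequency "mode" mentioned in the abstract is noted in a comment:

```python
import numpy as np

def comac(phi_a, phi_b):
    """Coordinate Modal Assurance Criterion, computed per DOF.

    phi_a, phi_b: (n_dof, n_modes) mode-shape matrices for two states
    (e.g. undamaged vs. damaged). In the study above, the vector of
    natural frequencies is appended as one additional "mode" column
    before this computation. Returns an (n_dof,) vector in [0, 1].
    """
    num = np.sum(np.abs(phi_a * phi_b), axis=1) ** 2
    den = np.sum(phi_a ** 2, axis=1) * np.sum(phi_b ** 2, axis=1)
    return num / den
```

A coordinate whose COMAC drops well below 1 between the undamaged and damaged states is a candidate damage location, which is what makes the indicator useful training data for the networks.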
Abstract:
This paper explores the current state of the art in performance indicators and probabilistic approaches used in climate change impact studies. It presents a critical review of recent publications in this field, focussing on (1) metrics for heating and cooling energy use, emissions, overheating and high-level performance aspects, and (2) the uptake of uncertainty and risk analysis. This is followed by a case study, which is used to explore some of the contextual issues around the broader uptake of climate change impact studies in practice. The work concludes that probabilistic predictions of the impact of climate change are feasible, but only under strict and explicitly stated assumptions. © 2011 Elsevier B.V. All rights reserved.
Abstract:
This article investigates how to use the UK probabilistic climate-change projections (UKCP09) in rigorous building energy analysis. Two office buildings (deep plan and shallow plan) are used as case studies to demonstrate the application of UKCP09. Three different methods for reducing the computational demands are explored: statistical reduction (Finkelstein-Schafer [F-S] statistics), simplification using degree-day theory, and the use of metamodels. The first method, which is based on an established technique, can serve as the reference because it provides the most accurate information. However, the selection of weather files based on the F-S statistic must be automated programmatically, because thousands of weather files created by the UKCP09 weather generator need to be processed. A combination of the second (degree-day theory) and third (metamodel) methods requires only a relatively small number of simulation runs, yet still provides valuable information for subsequent uncertainty and sensitivity analyses. The article also demonstrates how grid computing can be used to speed up the calculation of many independent EnergyPlus models by harnessing the processing power of idle desktop computers. © 2011 International Building Performance Simulation Association (IBPSA).
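The Finkelstein-Schafer statistic behind the first method compares the empirical CDF of a candidate weather series with that of the long-term record. A minimal sketch, with array shapes and names that are our assumptions rather than the article's code:

```python
import numpy as np

def fs_statistic(candidate, long_term):
    """Finkelstein-Schafer statistic between a candidate sample
    (e.g. one year of daily temperatures) and a long-term series.

    Mean absolute difference between the two empirical CDFs,
    evaluated at the candidate's data points. Smaller values mean
    the candidate is more representative of the long-term record.
    """
    candidate = np.sort(np.asarray(candidate, dtype=float))
    long_term = np.sort(np.asarray(long_term, dtype=float))
    n = len(candidate)
    # empirical CDF of the candidate at its own sorted points
    cdf_cand = np.arange(1, n + 1) / n
    # empirical CDF of the long-term record at those same points
    cdf_long = np.searchsorted(long_term, candidate, side="right") / len(long_term)
    return np.mean(np.abs(cdf_cand - cdf_long))
```

Automating the selection then amounts to computing this statistic for each of the thousands of generated weather files and keeping the file with the smallest value.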
Abstract:
This study investigated the neuromuscular mechanisms underlying the initial stage of adaptation to novel dynamics. A destabilizing velocity-dependent force field (VF) was introduced for sets of three consecutive trials. Between sets, a random number (4-8) of null-field trials was interposed, in which the VF was inactivated. This prevented subjects from learning the novel dynamics, making it possible to repeatedly recreate the initial adaptive response. We were able to investigate detailed changes in neural control between the first, second and third VF trials. We identified two feedforward control mechanisms, initiated on the second VF trial, which resulted in a 50% reduction in hand-path error. Responses to disturbances encountered on the first VF trial were feedback in nature, i.e. reflexes and voluntary corrections of errors. On the second VF trial, however, muscle activation patterns were modified in anticipation of the effects of the force field. Feedforward cocontraction of all muscles was used to increase the viscoelastic impedance of the arm. While stiffening the arm, subjects also exerted a lateral force to counteract the perturbing effect of the force field. These anticipatory actions indicate that the central nervous system responds rapidly to counteract hitherto unfamiliar disturbances through a combination of increased viscoelastic impedance and the formation of a crude internal dynamics model.
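A velocity-dependent force field of the kind described is typically implemented as a linear map from hand velocity to force. The sketch below uses a skew-symmetric ("curl") gain matrix; the specific gain value is illustrative and is not taken from this study:

```python
import numpy as np

# Illustrative velocity-dependent force field (VF): the robot applies
# a force that is a linear function of hand velocity, F = B @ v.
# A skew-symmetric B yields a destabilizing "curl" field. The gain of
# 13 N*s/m is a commonly used experimental value, assumed here.
B = np.array([[0.0, 13.0],
              [-13.0, 0.0]])

def field_force(velocity):
    """Force (N) applied to the hand for a given velocity (m/s)."""
    return B @ np.asarray(velocity, dtype=float)
```

Because the gain matrix is skew-symmetric, the force is always perpendicular to the instantaneous velocity, which pushes the hand off its path rather than simply resisting the movement.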
Abstract:
The paper is based on qualitative properties of the solution of the Navier-Stokes equations for incompressible fluids, and on properties of their finite element solution. In problems with corner-like singularities (e.g. on the well-known L-shaped domain), some adaptive strategy is usually used. In this paper we present an alternative approach. For flow problems on domains with corner singularities, we use the a priori error estimates and the asymptotic expansion of the solution to derive an algorithm for refining the mesh near the corner. This yields a very precise solution at low computational cost. We present some numerical results.
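A priori mesh refinement toward a corner is commonly realized by algebraic grading of the node positions. A one-dimensional sketch, in which the grading exponent is our illustrative choice rather than a value derived from the paper's error estimates:

```python
import numpy as np

def graded_mesh(n, beta, length=1.0):
    """1-D graded mesh of n elements on [0, length], refined toward 0.

    Node i is placed at length * (i/n)**beta. beta = 1 gives a uniform
    mesh; beta > 1 clusters nodes near x = 0, the standard a priori
    remedy when the solution behaves like r**lam with lam < 1 near a
    re-entrant corner.
    """
    i = np.arange(n + 1)
    return length * (i / n) ** beta
```

For a singular solution of this type, a sufficiently strong grading restores the optimal convergence rate that a uniform mesh would lose, without the bookkeeping of an adaptive strategy.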
Abstract:
The book begins by detailing the fundamentals of advanced coding techniques: coding, decoding, design, and optimization.
Abstract:
We propose a novel model for the spatio-temporal clustering of trajectories based on motion, which applies to challenging street-view video sequences of pedestrians captured by a mobile camera. A key contribution of our work is the introduction of novel probabilistic region trajectories, motivated by the non-repeatability of the segmentation of frames in a video sequence. Hierarchical image segments are obtained using a state-of-the-art hierarchical segmentation algorithm and connected across adjacent frames in a directed acyclic graph. The region trajectories and their measures of confidence are extracted from this graph using a dynamic-programming-based optimisation. Our second main contribution is a Bayesian framework with a twofold goal: to learn the Random Forests classifier of motion patterns, optimal in a maximum-likelihood sense, based on video features, and to construct a unique graph from region trajectories of different frames, lengths and hierarchical levels. Finally, we demonstrate the use of Isomap for effective spatio-temporal clustering of the region trajectories of pedestrians. We support our claims with experimental results on new and existing challenging video sequences. © 2011 IEEE.
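The final Isomap step can be sketched with scikit-learn on synthetic trajectory descriptors; the feature vectors, neighbourhood size and cluster count below are our assumptions, not the paper's settings:

```python
import numpy as np
from sklearn.manifold import Isomap
from sklearn.cluster import KMeans

# Synthetic stand-in for per-trajectory motion descriptors: in the
# paper these would be features of the probabilistic region
# trajectories; here two motion patterns are fabricated for illustration.
rng = np.random.default_rng(0)
pattern_a = rng.normal(loc=[1.0, 0.0, 0.5], scale=0.05, size=(30, 3))
pattern_b = rng.normal(loc=[0.1, 0.8, 0.0], scale=0.05, size=(30, 3))
X = np.vstack([pattern_a, pattern_b])

# Embed the descriptors into a low-dimensional space that preserves
# geodesic (manifold) distances, then cluster in that space.
# n_neighbors is kept large enough to keep the neighbourhood graph
# connected across both groups.
embedding = Isomap(n_neighbors=35, n_components=2).fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embedding)
```

Clustering in the embedded space rather than on the raw descriptors is what lets the method respect the manifold structure of the trajectory features.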
Abstract:
A case study of an aircraft engine manufacturer is used to analyze the effects of management levers on the lead time and design errors generated in an iteration-intensive concurrent engineering process. The levers considered are the amount of design-space exploration iteration, the degree of process concurrency, and the timing of design reviews. Simulation is used to show how the ideal combination of these levers can vary with changes in design-problem complexity, which can increase, for instance, when novel technology is incorporated in a design. Results confirm that it is important to consider multiple iteration-influencing factors and their interdependencies to understand concurrent processes, because the factors can interact with confounding effects. The article also demonstrates a new approach to deriving a system dynamics model from a process task network. The new approach could be applied to analyze other concurrent engineering scenarios. © The Author(s) 2012.
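The way iteration inflates lead time can be illustrated with a toy rework model; this is our simplification for intuition only, not the authors' system dynamics model:

```python
def expected_lead_time(task_duration, rework_prob):
    """Expected lead time of a task that fails review and must be
    reworked with probability rework_prob after each attempt.

    The number of attempts is geometric, so the expectation is
    task_duration * (1 + p + p**2 + ...) = task_duration / (1 - p).
    """
    if not 0.0 <= rework_prob < 1.0:
        raise ValueError("rework_prob must be in [0, 1)")
    return task_duration / (1.0 - rework_prob)
```

Even this crude model hints at the confounding effects the article reports: a lever such as higher concurrency may shorten each attempt while raising the rework probability, so its net effect on lead time depends on the other levers.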