30 results for 4-component gaussian basis sets


Relevance:

30.00%

Publisher:

Abstract:

The measured toughnesses J_C of adipose and dermal porcine tissue, obtained via a trouser tear test, are 4.1 and 17 kJ m⁻², respectively. An assessment is made of the contribution of the microstructural elements to the overall toughness. The analysis suggests that the toughness of adipose tissue is determined by the collagen network that surrounds the adipocytes. The volume fraction of the interlobular septa is sufficiently low for them to make a negligible contribution to the macroscopic toughness.

Relevance:

30.00%

Publisher:

Abstract:

We propose a computational method for the coupled simulation of a compressible flow interacting with a thin-shell structure undergoing large deformations. An Eulerian finite volume formulation is adopted for the fluid, and a Lagrangian formulation based on subdivision finite elements is adopted for the shell response. The coupling between the fluid and the solid response is achieved via a novel approach based on level sets. The basic approach furnishes a general algorithm for coupling Lagrangian shell solvers with Cartesian-grid-based Eulerian fluid solvers. The efficiency and robustness of the proposed approach are demonstrated with an airbag deployment simulation. It bears emphasis that, in the proposed approach, the solid and fluid components as well as their coupled interaction are considered in full detail and modeled with an equivalent level of fidelity, without any oversimplifying assumptions or bias towards a particular physical aspect of the problem.
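
As an illustration of the level-set idea behind the coupling described above, the following minimal Python sketch builds a signed distance field for an immersed surface on a Cartesian fluid grid and tags fluid and interface cells. The analytic sphere stands in for the deforming Lagrangian shell surface; this is an assumption for illustration, not the authors' implementation.

```python
import numpy as np

# Minimal sketch: represent an immersed shell on a Cartesian fluid grid
# via a level set (signed distance). The "shell" here is an analytic
# sphere standing in for the Lagrangian shell surface; in the actual
# method the distance would be computed to the deforming FE mesh.

def signed_distance_sphere(X, Y, Z, center, radius):
    """Signed distance to a sphere: negative inside, positive outside."""
    return np.sqrt((X - center[0])**2 + (Y - center[1])**2
                   + (Z - center[2])**2) - radius

# Cartesian fluid grid
n = 32
x = np.linspace(-1.0, 1.0, n)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")

phi = signed_distance_sphere(X, Y, Z, center=(0.0, 0.0, 0.0), radius=0.5)

# Fluid cells lie outside the shell (phi > 0); cells in a narrow band
# around the zero level set receive the fluid-solid coupling conditions.
fluid_mask = phi > 0.0
interface_band = np.abs(phi) < (x[1] - x[0])   # within one grid spacing

print(f"fluid cells: {fluid_mask.sum()}, interface-band cells: {interface_band.sum()}")
```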

Relevance:

30.00%

Publisher:

Abstract:

We combine Bayesian online change point detection with Gaussian processes to create a nonparametric time series model which can handle change points. The model can be used to locate change points in an online manner; and, unlike other Bayesian online change point detection algorithms, is applicable when temporal correlations in a regime are expected. We show three variations on how to apply Gaussian processes in the change point context, each with their own advantages. We present methods to reduce the computational burden of these models and demonstrate it on several real world data sets. Copyright 2010 by the author(s)/owner(s).
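
The run-length recursion underlying Bayesian online change point detection can be sketched compactly. The minimal Python example below uses a constant hazard and a conjugate Gaussian predictive per regime; the paper's contribution, replacing this predictive with a Gaussian process to capture within-regime temporal correlation, is omitted here for brevity, and all hyperparameters are illustrative.

```python
import numpy as np
from scipy import stats

# Adams-MacKay-style online change point detection: maintain a posterior
# over the current run length, with a Student-t predictive arising from a
# Normal-Inverse-Gamma prior on each regime's Gaussian parameters.

def bocpd(x, hazard=1/50.0, mu0=0.0, kappa0=1.0, alpha0=1.0, beta0=1.0):
    T = len(x)
    R = np.zeros((T + 1, T + 1))        # run-length posterior over time
    R[0, 0] = 1.0
    # Normal-Inverse-Gamma sufficient statistics, one entry per run length
    mu, kappa = np.array([mu0]), np.array([kappa0])
    alpha, beta = np.array([alpha0]), np.array([beta0])
    for t, xt in enumerate(x):
        # Student-t predictive for each current run length
        pred = stats.t.pdf(xt, 2 * alpha, loc=mu,
                           scale=np.sqrt(beta * (kappa + 1) / (alpha * kappa)))
        growth = R[t, : t + 1] * pred * (1 - hazard)   # run continues
        cp = (R[t, : t + 1] * pred * hazard).sum()     # change point occurs
        R[t + 1, 1 : t + 2] = growth
        R[t + 1, 0] = cp
        R[t + 1] /= R[t + 1].sum()
        # conjugate updates; run length 0 restarts from the prior
        mu_new = (kappa * mu + xt) / (kappa + 1)
        beta_new = beta + kappa * (xt - mu) ** 2 / (2 * (kappa + 1))
        mu = np.concatenate([[mu0], mu_new])
        kappa = np.concatenate([[kappa0], kappa + 1])
        alpha = np.concatenate([[alpha0], alpha + 0.5])
        beta = np.concatenate([[beta0], beta_new])
    return R

# Toy data with a mean shift halfway through
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(4, 1, 100)])
R = bocpd(x)
print("most probable run length after 150 points:", R[150].argmax())
```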

Relevance:

30.00%

Publisher:

Abstract:

Density modeling is notoriously difficult for high-dimensional data. One approach to the problem is to search for a lower-dimensional manifold which captures the main characteristics of the data. Recently, the Gaussian Process Latent Variable Model (GPLVM) has been used successfully to find low-dimensional manifolds in a variety of complex data. The GPLVM consists of a set of points in a low-dimensional latent space and a stochastic map to the observed space. We show how it can be interpreted as a density model in the observed space. However, the GPLVM is not trained as a density model and therefore yields poor density estimates. We propose a new training strategy and obtain improved generalisation performance and better density estimates in comparative evaluations on several benchmark data sets. © 2010 Springer-Verlag.
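
To make the density interpretation concrete, here is a minimal Python sketch that reads a GPLVM as a mixture of GP predictive Gaussians centred at the latent points. The latent coordinates come from a PCA initialisation as a stand-in for actual GPLVM training, so this illustrates the construction only and is not the authors' method.

```python
import numpy as np

# Reading a (pre-trained) GPLVM as an observed-space density: each latent
# point x_n maps through the GP to a predictive N(m(x_n), s2(x_n) I), and
# p(y) is approximated by the uniform mixture over those predictives.

def rbf(A, B, ell=1.0, sf2=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sf2 * np.exp(-0.5 * d2 / ell**2)

rng = np.random.default_rng(1)
Y = rng.normal(size=(50, 5))                         # observed data, N x D
Y = Y - Y.mean(0)
X = np.linalg.svd(Y, full_matrices=False)[0][:, :2]  # PCA stand-in latents

noise = 0.1
K = rbf(X, X) + noise * np.eye(len(X))
Kinv = np.linalg.inv(K)

def log_density(y_star):
    """log p(y*) under the mixture of GP predictives at the latent points."""
    k = rbf(X, X)                                    # cross-covariances
    M = k @ Kinv @ Y                                 # predictive means, N x D
    s2 = 1.0 + noise - np.einsum('ij,jk,ik->i', k, Kinv, k)  # predictive vars
    D = Y.shape[1]
    logp = (-0.5 * ((y_star - M) ** 2).sum(1) / s2
            - 0.5 * D * np.log(2 * np.pi * s2))
    return np.logaddexp.reduce(logp) - np.log(len(X))

print("log-density of the first training point:", log_density(Y[0]))
```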

Relevance:

30.00%

Publisher:

Abstract:

Structured precision modelling is an important approach to improving the intra-frame correlation modelling of the standard HMM, in which Gaussian mixture models with diagonal covariances are used. Previous work has focused on direct structured representations of the precision matrices. In this paper, a new framework is proposed in which the structure of the Cholesky square root of the precision matrix is investigated, referred to as Cholesky Basis Superposition (CBS). The Cholesky matrix associated with each Gaussian distribution is represented as a linear combination of a set of Gaussian-independent basis upper-triangular matrices. Efficient optimization methods are derived for both the combination weights and the basis matrices. Experiments on a Chinese dictation task showed that the proposed approach can significantly outperform direct structured precision modelling with a similar number of parameters, as well as full covariance modelling. © 2011 IEEE.
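
The superposition itself is simple to state in code. The following Python sketch forms each Gaussian's upper-triangular Cholesky factor as a weighted sum of shared basis matrices and assembles the precision matrix; the random bases and weights are placeholders for quantities the paper optimises on data.

```python
import numpy as np

# Cholesky Basis Superposition sketch: for Gaussian m, the precision is
# P_m = C_m^T C_m with C_m = sum_k w_{mk} B_k, where the upper-triangular
# bases B_k are shared across Gaussians and only the weights w_{mk} are
# Gaussian-specific. Bases and weights here are random placeholders.

rng = np.random.default_rng(0)
d, n_basis, n_gauss = 6, 4, 3

bases = [np.triu(rng.normal(size=(d, d))) for _ in range(n_basis)]
bases[0] = np.eye(d)                       # identity basis anchors the diagonal

weights = rng.normal(size=(n_gauss, n_basis))
weights[:, 0] = np.abs(weights[:, 0]) + 1.0   # push diagonals away from zero

for m in range(n_gauss):
    C = sum(w * B for w, B in zip(weights[m], bases))   # upper-triangular factor
    P = C.T @ C                                         # precision matrix
    # P is symmetric positive definite whenever diag(C) has no zeros
    sign, logdet = np.linalg.slogdet(P)
    print(f"Gaussian {m}: log|P| = {logdet:.3f} (sign {sign:+.0f})")
```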

Relevance:

30.00%

Publisher:

Abstract:

Preferential species diffusion is known to have important effects on local flame structure in turbulent premixed flames, and differential diffusion of heat and mass can have significant effects on both local flame structure and global flame parameters, such as turbulent flame speed. However, models for turbulent premixed combustion normally assume that atomic mass fractions are conserved from reactants to fully burnt products. Experiments reported here indicate that this basic assumption may be incorrect for an important class of turbulent flames. Measurements of major species and temperature in the near field of turbulent, bluff-body stabilized, lean premixed methane-air flames (Le = 0.98) reveal significant departures from the expected conditional mean compositional structure in the combustion products as well as within the flame. Net increases exceeding 10% in the equivalence ratio and the carbon-to-hydrogen atom ratio are observed across the turbulent flame brush. Corresponding measurements across an unstrained laminar flame at a similar equivalence ratio are in close agreement with calculations performed using Chemkin with the GRI 3.0 mechanism and multi-component transport, confirming the accuracy of the experimental techniques. The results suggest that the large effects observed in the turbulent bluff-body burner are caused by preferential transport of H2 and H2O through the preheat zone ahead of CO2 and CO, followed by convective transport downstream and away from the local flame brush. This preferential transport effect increases with increasing velocity of the reactants past the bluff body and is apparently amplified by the presence of a strong recirculation zone in which excess CO2 accumulates. © 2011 The Combustion Institute.

Relevance:

30.00%

Publisher:

Abstract:

In this article, we develop a new Rao-Blackwellized Monte Carlo smoothing algorithm for conditionally linear Gaussian models. The algorithm is based on the forward-filtering backward-simulation Monte Carlo smoother concept and performs the backward simulation directly in the marginal space of the non-Gaussian state component, while treating the linear part analytically. Unlike previously proposed backward-simulation-based Rao-Blackwellized smoothing approaches, it does not require sampling of the Gaussian state component and is also able to overcome certain normalization problems of two-filter-smoother-based approaches. The performance of the algorithm is illustrated in a simulated application. © 2012 IFAC.
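
For orientation, the forward-filtering backward-simulation concept the smoother builds on can be sketched for a scalar nonlinear model, as below. This plain FFBSi deliberately omits the Rao-Blackwellization of the conditionally linear Gaussian substate that is the article's actual contribution; the toy model and all settings are illustrative.

```python
import numpy as np

# Forward-filtering backward-simulation (FFBSi) sketch on the classic
# scalar benchmark model; the Rao-Blackwellized variant would replace
# part of the state with per-particle Kalman filters.

rng = np.random.default_rng(0)
T, N, q, r = 100, 500, 1.0, 1.0

def f_mean(x, t):                     # transition mean of the toy model
    return 0.5 * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * t)

# simulate data
x, ys = 0.0, []
for t in range(T):
    x = f_mean(x, t) + np.sqrt(q) * rng.normal()
    ys.append(x**2 / 20 + np.sqrt(r) * rng.normal())

# forward pass: bootstrap particle filter, storing particles and weights
P = np.zeros((T, N))
W = np.zeros((T, N))
xp = rng.normal(size=N)
for t, y in enumerate(ys):
    xp = f_mean(xp, t) + np.sqrt(q) * rng.normal(size=N)
    logw = -0.5 * (y - xp**2 / 20) ** 2 / r
    w = np.exp(logw - logw.max())
    w /= w.sum()
    P[t], W[t] = xp, w
    xp = xp[rng.choice(N, N, p=w)]    # multinomial resampling

# backward pass: sample one smoothed trajectory
traj = np.zeros(T)
traj[-1] = P[-1, rng.choice(N, p=W[-1])]
for t in range(T - 2, -1, -1):
    # reweight time-t particles by the transition density to traj[t+1]
    logb = np.log(W[t]) - 0.5 * (traj[t + 1] - f_mean(P[t], t + 1)) ** 2 / q
    b = np.exp(logb - logb.max())
    b /= b.sum()
    traj[t] = P[t, rng.choice(N, p=b)]

print("smoothed trajectory, first five states:", np.round(traj[:5], 2))
```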

Relevance:

30.00%

Publisher:

Abstract:

Dynamism and uncertainty are real challenges for present-day manufacturing enterprises (MEs). Reasons include an increasing demand for customisation, reduced time to market, shortened product life cycles and globalisation. MEs can reduce competitive pressure by becoming reconfigurable and change-capable; however, modern manufacturing philosophies, including agile and lean, must complement the application of reconfigurable manufacturing paradigms. Choosing and applying the best philosophies and techniques is very difficult, as most MEs deploy complex and unique configurations of processes and resource systems and seek economies of scope and scale in respect of changing and distinctive product flows. It follows that systematic methods of achieving model-driven reconfiguration and interoperation of component-based manufacturing systems are required to design, engineer and change future MEs.

This thesis, titled 'Enhanced Integrated Modelling Approach to Reconfiguring Manufacturing Enterprises', introduces the development and prototyping of a model-driven environment for the design, engineering, optimisation and control of the reconfiguration of MEs, with an embedded capability to handle various types of change. The thesis describes a novel systematic approach, namely the enhanced integrated modelling approach (EIMA), in which coherent sets of integrated models are created that facilitate the engineering of MEs, especially their production planning and control (PPC) systems. The developed environment supports the engineering of common types of strategic, tactical and operational processes found in many MEs. The EIMA is centred on the ISO-standardised CIMOSA process modelling approach. Early study led to the development of simulation models, during which various CIMOSA shortcomings were observed, especially in its support for aspects of ME dynamism; this raised the need to structure and create semantically enriched models, hence forming an enhanced integrated modelling environment.

The thesis also presents three industrial case examples: (1) Ford Motor Company; (2) Bradgate Furniture Manufacturing Company; and (3) ACM Bearings Company. In order to understand the system prior to the realisation of any PPC strategy, multiple process segments of each target organisation need to be modelled. Coherent multi-perspective case study models are presented that have facilitated process re-engineering and associated resource system configuration. Such models have the capability to enable PPC decision-making processes in support of the reconfiguration of MEs. During these case studies, the capabilities of a number of software tools were exploited, such as Arena®, Simul8®, Plant Simulation®, MS Visio® and MS Excel®. Case study results demonstrated the effectiveness of the concepts related to the EIMA.

The research has resulted in new contributions to knowledge, in terms of new understandings, concepts and methods, in the following ways: (1) a structured, model-driven, integrated approach to the design, optimisation and control of the future reconfiguration of MEs, where the EIMA is an enriched and generic process modelling approach with the capability to represent both static and dynamic aspects of an ME; (2) example application cases showing benefits in terms of reduced lead time, cost and resource load, and improved responsiveness of processes and resource systems, with a special focus on PPC; (3) the identification and industrial application of a new key performance indicator (KPI), known as P3C, whose measurement and monitoring can aid in enhancing the reconfigurability and responsiveness of MEs; and (4) an enriched modelling concept framework (E-MUNE) capturing the requirements of static and dynamic aspects of MEs, where the conceptual framework can be extended and modified according to requirements. The thesis concludes by outlining key areas needing future research into integrated modelling approaches, and into interoperation and updating mechanisms of partial models, in support of the reconfiguration of MEs.

Relevance:

30.00%

Publisher:

Abstract:

We demonstrate how a prior assumption of smoothness can be used to enhance the reconstruction of free energy profiles from multiple umbrella sampling simulations using the Bayesian Gaussian process regression approach. The method we derive allows the concurrent use of histograms and free energy gradients and can easily be extended to include further data. In Part I we review the necessary theory and test the method for one collective variable. We demonstrate improved performance with respect to the weighted histogram analysis method and obtain meaningful error bars without any significant additional computation. In Part II we consider the case of multiple collective variables and compare to a reconstruction using least squares fitting of radial basis functions. We find substantial improvements in the regimes of spatially sparse data or short sampling trajectories. A software implementation is made available on www.libatoms.org.
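
The core ingredient, Gaussian process regression on free energy gradients, can be illustrated in a few lines. The Python sketch below conditions an RBF-kernel GP on noisy gradient observations of a toy double-well profile and reconstructs the profile up to an additive constant; the concurrent use of histogram data and the actual implementation on www.libatoms.org are not reproduced here, and all settings are illustrative.

```python
import numpy as np

# GP regression on *gradient* observations of a toy free energy profile
# F(x) = x^4 - x^2. Derivative data enter through derivatives of the RBF
# kernel; the profile is recovered up to an additive constant.

ell, sf2, noise = 0.5, 1.0, 0.01

def k(a, b):        # RBF covariance between f(a) and f(b)
    return sf2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def k_dy(a, b):     # d k / d b: covariance between f(a) and f'(b)
    return k(a, b) * (a[:, None] - b[None, :]) / ell**2

def k_dxdy(a, b):   # d^2 k / (da db): covariance between f'(a) and f'(b)
    d = a[:, None] - b[None, :]
    return k(a, b) * (1.0 / ell**2 - d**2 / ell**4)

rng = np.random.default_rng(0)
xg = rng.uniform(-1.2, 1.2, 40)                 # gradient sample locations
g = 4 * xg**3 - 2 * xg + np.sqrt(noise) * rng.normal(size=xg.size)

K = k_dxdy(xg, xg) + noise * np.eye(xg.size)    # Gram matrix of the f' data
xs = np.linspace(-1.2, 1.2, 200)
F_mean = k_dy(xs, xg) @ np.linalg.solve(K, g)   # posterior mean of f

# the additive constant is only weakly pinned by the zero-mean prior, so
# judge quality by the spread of the error rather than its level
err = F_mean - (xs**4 - xs**2)
print("error spread (peak-to-peak):", float(np.ptp(err)))
```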

Relevance:

30.00%

Publisher:

Abstract:

Classical methods such as Principal Component Analysis (PCA) and Canonical Correlation Analysis (CCA) are ubiquitous in statistics. However, these techniques are only able to reveal linear relationships in data. Although nonlinear variants of PCA and CCA have been proposed, these are computationally prohibitive at large scale. In a separate strand of recent research, randomized methods have been proposed to construct features that help reveal nonlinear patterns in data. For basic tasks such as regression or classification, random features exhibit little or no loss in performance, while achieving drastic savings in computational requirements. In this paper we leverage randomness to design scalable new variants of nonlinear PCA and CCA; our ideas extend to key multivariate analysis tools such as spectral clustering or LDA. We demonstrate our algorithms through experiments on real-world data, on which we compare against the state of the art. A simple R implementation of the presented algorithms is provided. Copyright © 2014 by the International Machine Learning Society (IMLS). All rights reserved.
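
A minimal sketch of the random-feature route to nonlinear PCA (in Python, not the paper's R implementation): map the data through random Fourier features approximating an RBF kernel, then run ordinary linear PCA in the feature space. All hyperparameters below are illustrative.

```python
import numpy as np

# Randomized nonlinear PCA sketch: random Fourier features z(x) =
# sqrt(2/D) cos(W^T x + b) with W ~ N(0, 2*gamma*I) approximate the RBF
# kernel exp(-gamma * ||x - y||^2); linear PCA on z(x) then acts as an
# approximate kernel PCA at a fraction of the cost.

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))                 # data, n x d

d_feat, gamma = 256, 0.5                        # feature count, kernel width
W = rng.normal(scale=np.sqrt(2 * gamma), size=(X.shape[1], d_feat))
b = rng.uniform(0, 2 * np.pi, d_feat)
Z = np.sqrt(2.0 / d_feat) * np.cos(X @ W + b)   # random Fourier features

Zc = Z - Z.mean(0)                              # centre in feature space
U, S, Vt = np.linalg.svd(Zc, full_matrices=False)
components, scores = Vt[:5], Zc @ Vt[:5].T      # top 5 nonlinear PCs

print("explained variance of top 5 nonlinear PCs:",
      np.round(S[:5] ** 2 / (len(X) - 1), 3))
```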

Relevance:

30.00%

Publisher:

Abstract:

We present novel batch and online (sequential) versions of the expectation-maximisation (EM) algorithm for inferring the static parameters of a multiple target tracking (MTT) model. Online EM is of particular interest as it is a more practical method for long data sets, since batch EM, or a full Bayesian approach, requires a complete pass over the data between successive parameter updates. Online EM is also suited to MTT applications that demand real-time processing of the data. Performance is assessed in numerical examples using simulated data for various scenarios. For batch estimation our method significantly outperforms an existing gradient-based maximum likelihood technique, which we show to be significantly biased. © 2014 Springer Science+Business Media New York.
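
The flavour of the online E-step can be conveyed on a toy model. The Python sketch below runs the standard online EM recursion (stochastic approximation of the sufficient statistics followed by a per-observation M-step) on a two-component Gaussian mixture with unit variances, standing in for the far more involved MTT model; the model and all settings are illustrative.

```python
import numpy as np

# Online EM sketch: after each observation, blend the expected
# sufficient statistics into running averages with a decaying step
# size, then re-maximise the parameters from those averages.

rng = np.random.default_rng(0)
true_means, true_w = np.array([-2.0, 3.0]), np.array([0.4, 0.6])

mu = np.array([-1.0, 1.0])          # current mean estimates
w = np.array([0.5, 0.5])            # current weight estimates
s0 = w.copy()                       # running E[responsibility]
s1 = w * mu                         # running E[responsibility * y]

for t in range(1, 20001):
    comp = rng.choice(2, p=true_w)
    y = true_means[comp] + rng.normal()
    # E-step: responsibilities under current parameters (unit variances,
    # so the common normalising constant cancels)
    r = w * np.exp(-0.5 * (y - mu) ** 2)
    r /= r.sum()
    # stochastic approximation of the sufficient statistics
    gamma = t ** -0.7               # step size with exponent in (0.5, 1]
    s0 = (1 - gamma) * s0 + gamma * r
    s1 = (1 - gamma) * s1 + gamma * r * y
    # M-step: re-maximise given the running statistics
    w, mu = s0 / s0.sum(), s1 / s0

print("estimated weights:", np.round(w, 2), "means:", np.round(mu, 2))
```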

Relevance:

30.00%

Publisher:

Abstract:

This thesis focuses on the modelling of settlement-induced damage to masonry buildings. In densely populated areas, the need for new space is producing a rapid increase in underground excavation, and tunnelling activity in urban areas is growing owing to the construction of new metro lines. One consequence is greater attention to the risk of damage to existing structures; the assessment of potential damage to surface buildings has therefore become an essential stage of excavation projects in urban areas (Chapter 1). The current damage risk assessment procedure is based on strong simplifications, which do not always lead to conservative results. The object of this thesis is the development of an improved damage classification system that takes into account the parameters influencing the structural response to settlement, such as the non-linear behaviour of masonry and the soil-structure interaction.

The methodology used in this research is based on experimental and numerical modelling. The design and execution of an experimental benchmark test representative of the problem allow the principal factors and mechanisms involved to be identified, and the numerical simulations make it possible to generalize the results to a broader range of physical scenarios. This methodological choice is based on a critical review of the currently available procedures for the assessment of settlement-induced building damage (Chapter 2). A new experimental test on a 1/10th-scale masonry façade with a rubber base interface is specifically designed to investigate the effect of soil-structure interaction on tunnelling-induced damage (Chapter 3). The experimental results are used to validate a 2D semi-coupled finite element model for the simulation of the structural response (Chapter 4). The numerical approach, which includes a continuum cracking model for the masonry and a non-linear interface to simulate the soil-structure interaction, is then used to perform a sensitivity study on the effect of openings, material properties, initial damage, initial conditions, normal and shear behaviour of the base interface, and the applied settlement profile (Chapter 5). The results quantify the major role played by the normal stiffness of the soil-structure interaction and by the material parameters defining the quasi-brittle masonry behaviour.

The limitations of the 2D modelling approach in simulating the progressive 3D displacement field induced by the excavation, and the consequent torsional response of the building, are overcome by the development of a 3D coupled model of building, foundation, soil and tunnel (Chapter 6). Following the same method applied to the 2D semi-coupled approach, the 3D model is validated through comparison with the monitoring data of a literature case study. The model is then used to carry out a series of parametric analyses of geometrical factors: the aspect ratio of the horizontal building dimensions with respect to the tunnel axis direction, the presence of adjacent structures, and the position and alignment of the building with respect to the excavation (Chapter 7). The results show the governing effect of the 3D building response, proving the relevance of 3D modelling. Finally, the results from the 2D and 3D parametric analyses are used to set the framework of an overall damage model which correlates the analysed structural features with the risk of the building being damaged by a given settlement (Chapter 8). This research therefore provides an improved experimental and numerical understanding of the building response to excavation-induced settlements, and sets the basis for an operational tool for the risk assessment of structural damage (Chapter 9).