57 results for Microscopic simulation models


Relevance: 40.00%

Publisher:

Abstract:

In this article, we develop a new Rao-Blackwellized Monte Carlo smoothing algorithm for conditionally linear Gaussian models. The algorithm is based on the forward-filtering backward-simulation Monte Carlo smoother concept and performs the backward simulation directly in the marginal space of the non-Gaussian state component while treating the linear part analytically. Unlike previously proposed backward-simulation-based Rao-Blackwellized smoothing approaches, it does not require sampling of the Gaussian state component and is also able to overcome certain normalization problems of two-filter-smoother-based approaches. The performance of the algorithm is illustrated in a simulated application. © 2012 IFAC.
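The marginal, Rao-Blackwellized backward simulation itself is not reproduced here, but the forward-filtering backward-simulation (FFBSi) idea it builds on can be sketched for a toy scalar state-space model. The model, noise levels and particle counts below are illustrative assumptions only, not taken from the article.

```python
# Minimal FFBSi particle smoother sketch for a toy scalar model
# x_t = 0.9*x_{t-1} + v_t,  y_t = x_t + e_t  (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)
T, N = 50, 200                       # time steps, particles
q, r = 0.5, 1.0                      # process / measurement noise variances

# Simulate data from the toy model
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t-1] + rng.normal(0, np.sqrt(q))
y = x_true + rng.normal(0, np.sqrt(r), T)

def transition_logpdf(x_next, x):
    # log N(x_next; 0.9*x, q) up to an additive constant
    return -0.5 * (x_next - 0.9 * x) ** 2 / q

# Forward pass: bootstrap particle filter, storing particles and weights
X = np.zeros((T, N))
W = np.zeros((T, N))
X[0] = rng.normal(0, 1, N)
W[0] = np.exp(-0.5 * (y[0] - X[0]) ** 2 / r)
W[0] /= W[0].sum()
for t in range(1, T):
    idx = rng.choice(N, N, p=W[t-1])                 # multinomial resampling
    X[t] = 0.9 * X[t-1, idx] + rng.normal(0, np.sqrt(q), N)
    W[t] = np.exp(-0.5 * (y[t] - X[t]) ** 2 / r)     # reweight by the likelihood
    W[t] /= W[t].sum()

# Backward pass: simulate smoothed trajectories
M = 100
traj = np.zeros((M, T))
traj[:, -1] = X[-1, rng.choice(N, M, p=W[-1])]
for t in range(T - 2, -1, -1):
    for m in range(M):
        logw = np.log(W[t]) + transition_logpdf(traj[m, t+1], X[t])
        w = np.exp(logw - logw.max())
        traj[m, t] = X[t, rng.choice(N, p=w / w.sum())]

print("smoothed posterior mean at t = 25:", round(traj[:, 25].mean(), 3))
```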

Relevance: 40.00%

Publisher:

Abstract:

The performance of algebraic flame surface density (FSD) models has been assessed for flames with nonunity Lewis number (Le) in the thin reaction zones regime, using a direct numerical simulation (DNS) database of freely propagating turbulent premixed flames with Le ranging from 0.34 to 1.2. The focus is on algebraic FSD models based on a power-law approach, and the effects of Lewis number on the fractal dimension D and inner cut-off scale η_i have been studied in detail. It has been found that D is strongly affected by Lewis number and increases significantly with decreasing Le. By contrast, η_i remains close to the laminar flame thermal thickness for all values of Le considered here. A parameterisation of D is proposed such that the effects of Lewis number are explicitly accounted for. The new parameterisation is used to propose a new algebraic model for FSD. The performance of the new model is assessed with respect to results for the generalised FSD obtained from explicitly LES-filtered DNS data. It has been found that the performance of most existing models deteriorates with decreasing Lewis number, while the newly proposed model is found to perform as well as, or better than, most existing algebraic models for FSD. © 2012 Mohit Katragadda et al.
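For reference, algebraic power-law FSD models of the kind assessed here generally take the form below; the specific Le-dependent parameterisation of D proposed in the article is not reproduced.

```latex
% Generic power-law FSD closure (form only; the article's D(Le)
% parameterisation is not reproduced here):
\Sigma_{\mathrm{gen}} \approx
  \left(\frac{\Delta}{\eta_i}\right)^{D-2} \left|\nabla \bar{c}\right|
% \Delta: LES filter width, \eta_i: inner cut-off scale,
% D: fractal dimension, \bar{c}: filtered reaction progress variable.
```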

Relevance: 40.00%

Publisher:

Abstract:

A direct numerical simulation (DNS) database of freely propagating statistically planar turbulent premixed flames with a range of different turbulent Reynolds numbers has been used to assess the performance of algebraic flame surface density (FSD) models based on a fractal representation of the flame wrinkling factor. The turbulent Reynolds number Ret has been varied by modifying the Karlovitz number Ka and the Damköhler number Da independently of each other in such a way that the flames remain within the thin reaction zones regime. It has been found that the turbulent Reynolds number and the Karlovitz number both have a significant influence on the fractal dimension, which is found to increase with increasing Ret and Ka before reaching an asymptotic value for large values of Ret and Ka. A parameterisation of the fractal dimension is presented in which the effects of the Reynolds and the Karlovitz numbers are explicitly taken into account. By contrast, the inner cut-off scale normalised by the Zel'dovich flame thickness, η_i/δ_z, does not exhibit any significant dependence on Ret for the cases considered here. The performance of several algebraic FSD models has been assessed based on various criteria. Most of the algebraic models show a deterioration in performance with increasing LES filter width. © 2012 Mohit Katragadda et al.
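As a rough illustration of why the fractal dimension and the filter width matter, the sketch below evaluates the power-law wrinkling factor for a few assumed values of D; the values of D and η_i are placeholders, not the Ret/Ka-dependent parameterisation or cut-off scales obtained from the DNS database.

```python
# Sensitivity of the power-law wrinkling factor Xi = (Delta/eta_i)^(D-2)
# to the fractal dimension D and the LES filter width Delta
# (all values assumed for illustration only).
import numpy as np

delta_z = 1.0                          # Zel'dovich flame thickness (normalising scale)
eta_i = 1.0 * delta_z                  # inner cut-off scale, taken close to delta_z
filter_widths = np.array([2.0, 4.0, 8.0, 16.0]) * delta_z

for D in (2.2, 2.3, 2.4):              # assumed fractal dimensions
    xi = (filter_widths / eta_i) ** (D - 2.0)
    print(f"D = {D}: wrinkling factor Xi =", np.round(xi, 3))
```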

Relevance: 40.00%

Publisher:

Abstract:

Modeling work in neuroscience can be classified using two different criteria. The first one is the complexity of the model, ranging from simplified conceptual models that are amenable to mathematical analysis to detailed models that require simulations in order to understand their properties. The second criterion is that of direction of workflow, which can be from microscopic to macroscopic scales (bottom-up) or from behavioral target functions to properties of components (top-down). We review the interaction of theory and simulation using examples of top-down and bottom-up studies and point to some current developments in the fields of computational and theoretical neuroscience.

Relevance: 30.00%

Publisher:

Abstract:

The magnetisation of bulk high temperature superconductors (HTS), such as RE-Ba-Cu-O [(RE)BCO, where RE is a rare earth element or Y], by a practical technique is essential for their application in high field, permanent magnet-like devices. Research to date into the pulsed field magnetisation (PFM) of these materials, however, has generally been limited to experimental techniques, with relatively little progress in the development of theoretical models. This is because a multi-physics approach is needed to take account of the heating of the samples, and because the high electric fields generated are well above the regime for which reliable experimental results exist. This paper describes a framework of theoretical simulation using the finite element method (FEM) that is applicable to both single- and multi-pulse magnetisation processes of (RE)BCO bulk superconductors. The model incorporates the heat equation and provides a convenient way of determining the distribution of trapped field, current density and temperature change within a bulk superconductor at each stage of the magnetisation process. An example of the single-pulse magnetisation of a (RE)BCO bulk is described. Potentially, the model may serve as a cost-effective tool for the optimisation of the bulk geometry and the magnetisation profile in multi-pulse magnetisation processes. © 2010 IOP Publishing Ltd.
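The abstract does not state the constitutive law used in the FEM model; the sketch below shows an E-J power law with a linear Jc(T), a form commonly coupled to the heat equation in HTS models of this kind. All parameter values and the linear Jc(T) dependence are assumptions made for illustration.

```python
# E-J power-law constitutive relation with a linear Jc(T): a typical
# ingredient coupled to the heat equation in FEM models of pulsed field
# magnetisation (parameters assumed for illustration, not from the paper).
import numpy as np

E_c = 1e-4          # V/m, electric-field criterion
n = 21              # power-law index
T_c = 92.0          # K, critical temperature
T_0 = 77.0          # K, reference operating temperature
J_c0 = 3e8          # A/m^2, critical current density at T_0

def J_c(T):
    """Assumed linear temperature dependence of the critical current density."""
    return np.maximum(J_c0 * (T_c - T) / (T_c - T_0), 0.0)

def E_field(J, T):
    """E-J power law; the local heat source density is E*J."""
    return E_c * (np.abs(J) / J_c(T)) ** n * np.sign(J)

# Example: dissipation at 80 K when J slightly exceeds Jc
J = 1.05 * J_c(80.0)
print("E =", E_field(J, 80.0), "V/m, q =", E_field(J, 80.0) * J, "W/m^3")
```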

Relevance: 30.00%

Publisher:

Abstract:

Approximate Bayesian computation (ABC) is a popular technique for analysing data for complex models where the likelihood function is intractable. It involves using simulation from the model to approximate the likelihood, with this approximate likelihood then being used to construct an approximate posterior. In this paper, we consider methods that estimate the parameters by maximizing the approximate likelihood used in ABC. We give a theoretical analysis of the asymptotic properties of the resulting estimator. In particular, we derive results analogous to those of consistency and asymptotic normality for standard maximum likelihood estimation. We also discuss how sequential Monte Carlo methods provide a natural method for implementing our likelihood-based ABC procedures.
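A minimal sketch of the idea follows, assuming a toy model with a Gaussian summary statistic and a simple grid search in place of the paper's sequential Monte Carlo implementation; the model, bandwidth and simulation counts are illustrative assumptions.

```python
# Maximising a kernel-smoothed ABC likelihood on a parameter grid for a
# toy model whose likelihood is replaced by simulation (illustration only).
import numpy as np

rng = np.random.default_rng(1)
y_obs = rng.normal(2.0, 1.0, 100)          # observed data (true mean = 2)
s_obs = y_obs.mean()                       # summary statistic
eps = 0.05                                 # ABC kernel bandwidth

def abc_loglik(theta, n_sim=2000):
    """Monte Carlo estimate of the log ABC likelihood at theta."""
    s_sim = np.array([rng.normal(theta, 1.0, 100).mean() for _ in range(n_sim)])
    # Gaussian kernel on the distance between simulated and observed summaries
    k = np.exp(-0.5 * ((s_sim - s_obs) / eps) ** 2)
    return np.log(k.mean() + 1e-300)

grid = np.linspace(1.0, 3.0, 41)
loglik = [abc_loglik(th) for th in grid]
print("approximate MLE:", grid[int(np.argmax(loglik))])
```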

Relevance: 30.00%

Publisher:

Abstract:

In this paper we shall discuss the use of the TSIM simulation software for modelling large-scale industrial processes. The discussion draws on our recent experience of modelling a large plant in the food-processing industry. We shall focus on those features of software organization and software engineering which proved to be particularly necessary for the execution of this project, and illustrate the extent to which the use of TSIM facilitated the implementation of these features. We shall also make some general remarks about the 'life-cycle' of models resulting from projects of this kind.

Relevance: 30.00%

Publisher:

Abstract:

Over the past decade, a variety of user models have been proposed for user-simulation-based reinforcement learning of dialogue strategies. However, the strategies learned with these models are rarely evaluated in actual user trials, and it remains unclear how the choice of user model affects the quality of the learned strategy. In particular, the degree to which strategies learned with a user model generalise to real user populations has not been investigated. This paper presents a series of experiments that qualitatively and quantitatively examine the effect of the user model on the learned strategy. Our results show that the performance and characteristics of the strategy are in fact highly dependent on the user model. Furthermore, a policy trained with a poor user model may appear to perform well when tested with the same model, but fail when tested with a more sophisticated user model. This raises significant doubts about the current practice of learning and evaluating strategies with the same user model. The paper further investigates a new technique for testing and comparing strategies directly on real human-machine dialogues, thereby avoiding any evaluation bias introduced by the user model. © 2005 IEEE.
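The evaluation-bias issue can be illustrated with a deliberately small, invented slot-filling task: a tabular Q-learning policy is trained against one simulated user and then evaluated against a different user model. The task, user models and reward scheme below are assumptions made purely for illustration and are unrelated to the systems studied in the paper.

```python
# Toy illustration: train a dialogue policy against one user simulator,
# then test it against a different one (everything here is invented).
import numpy as np

rng = np.random.default_rng(2)
N_SLOTS, ASK, END = 2, 0, 1
MAX_TURNS = 10

def run_episode(Q, p_answer, eps=0.0, alpha=0.0, gamma=0.95):
    """One dialogue; updates Q in place when alpha > 0. Returns the episode return."""
    filled, total, disc = 0, 0.0, 1.0
    for _ in range(MAX_TURNS):
        a = rng.integers(2) if rng.random() < eps else int(np.argmax(Q[filled]))
        if a == END:
            r = 10.0 if filled == N_SLOTS else -10.0
            if alpha:
                Q[filled, a] += alpha * (r - Q[filled, a])
            return total + disc * r
        r = -1.0                                       # per-turn cost of asking
        # simulated user fills the requested slot with probability p_answer
        nxt = min(filled + 1, N_SLOTS) if rng.random() < p_answer else filled
        if alpha:
            Q[filled, a] += alpha * (r + gamma * Q[nxt].max() - Q[filled, a])
        total += disc * r
        disc *= gamma
        filled = nxt
    return total

Q = np.zeros((N_SLOTS + 1, 2))
for _ in range(5000):                                  # train vs. a cooperative user
    run_episode(Q, p_answer=1.0, eps=0.1, alpha=0.1)

for p in (1.0, 0.6):                                   # test vs. both user models
    rets = [run_episode(Q, p_answer=p) for _ in range(2000)]
    print(f"user model p_answer={p}: mean return {np.mean(rets):.2f}")
```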

Relevance: 30.00%

Publisher:

Abstract:

Models for simulating Scanning Probe Microscopy (SPM) may serve as a reference point for validating experimental data and practice. Generally, simulations use a microscopic model of the sample-probe interaction based on a first-principles approach, or a geometric model of macroscopic distortions due to the probe geometry. Examples of the latter include use of neural networks, the Legendre Transform, and dilation/erosion transforms from mathematical morphology. Dilation and the Legendre Transform fall within a general family of functional transforms, which distort a function by imposing a convex solution. In earlier work, the authors proposed a generalized approach to modeling SPM using a hidden Markov model, wherein both the sample-probe interaction and probe geometry may be taken into account. We present a discussion of the hidden Markov model and its relationship to these convex functional transforms for simulating and restoring SPM images. © 2009 SPIE.
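A minimal sketch of the geometric dilation picture mentioned above, using a 1-D surface and a parabolic probe; the hidden Markov model itself is not shown, and the shapes and sizes are illustrative assumptions.

```python
# Morphological-dilation model of probe-sample convolution in SPM: the
# recorded image is approximately the grey-scale dilation of the true
# surface by the (reflected) probe shape (1-D illustration only).
import numpy as np
from scipy.ndimage import grey_dilation

# "True" surface: a narrow step on a flat background
surface = np.zeros(200)
surface[90:110] = 5.0

# Parabolic probe apex used as the structuring function (tip at 0)
x = np.arange(-10, 11, dtype=float)
probe = -(x ** 2) / 20.0

# Simulated image: dilation broadens the step by roughly the probe radius
image = grey_dilation(surface, structure=probe)
print("true step width:", int((surface > 2.5).sum()),
      "apparent width:", int((image > 2.5).sum()))
```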

Relevance: 30.00%

Publisher:

Abstract:

Observation shows that the watershed-scale models in common use in the United States (US) differ from those used in the European Union (EU). The question arises whether the difference in model use is due to familiarity or necessity. Do conditions in each continent require the use of unique watershed-scale models, or are models sufficiently customizable that independent development of models that serve the same purpose (e.g., continuous/event-based, lumped/distributed, field-/watershed-scale) is unnecessary? This paper explores this question through the application of two continuous, semi-distributed, watershed-scale models (HSPF and HBV-INCA) to a rural catchment in southern England. The Hydrological Simulation Program-Fortran (HSPF) model is in wide use in the United States. The Integrated Catchments (INCA) model has been used extensively in Europe, and particularly in England. The results of simulation from both models are presented herein. Both models performed adequately according to the criteria set for them. This suggests that there was not a necessity to have alternative, yet similar, models. This partially supports a general conclusion that resources should be devoted towards training in the use of existing models rather than development of new models that serve a similar purpose to existing models. A further comparison of water quality predictions from both models may alter this conclusion.
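The abstract does not list the performance criteria used; the Nash-Sutcliffe efficiency below is shown only as an example of the kind of metric typically applied when judging watershed-scale flow simulations, with invented flow values.

```python
# Nash-Sutcliffe efficiency (NSE), a standard criterion for comparing
# simulated and observed streamflow (values below are invented).
import numpy as np

def nse(observed, simulated):
    """NSE: 1 is a perfect fit; 0 means no better than the mean of the observations."""
    observed = np.asarray(observed, float)
    simulated = np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Hypothetical daily flows (m^3/s) for two model runs against the same observations
obs = np.array([1.2, 1.5, 3.8, 2.9, 2.0, 1.7, 1.4])
print("model A NSE:", round(nse(obs, [1.1, 1.6, 3.5, 3.1, 2.1, 1.6, 1.5]), 3))
print("model B NSE:", round(nse(obs, [1.4, 1.4, 2.9, 2.5, 2.3, 1.9, 1.6]), 3))
```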

Relevance: 30.00%

Publisher:

Abstract:

This paper presents the application of advanced compact models of the IGBT and PIN diode to the full electrothermal system simulation of a hybrid electric vehicle converter using a look-up table of device losses. The Fourier-based solution model is used, which takes account of features such as local lifetime control and field-stop technology. Device and circuit parameters are extracted from experimental waveforms and device structural data. Matching of the switching waveforms and the resulting generation of the look-up table is presented. An example of the use of the look-up tables in simulation of inverter device temperatures is also given, for a hypothetical electric vehicle subjected to an urban driving cycle. © 2006 IEEE.
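A minimal sketch of how a device-loss look-up table can drive an electrothermal simulation of this kind follows; the table entries, thermal parameters and current profile are invented for illustration and are not the paper's extracted device data.

```python
# Loss look-up table interpolated over (current, junction temperature) and
# fed to a single-pole thermal network (all values invented for illustration).
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Averaged IGBT loss table [W] indexed by (load current A, junction temp degC)
currents = np.array([50.0, 100.0, 200.0, 300.0])
temps = np.array([25.0, 75.0, 125.0])
loss_table = np.array([[ 40.,  45.,  52.],
                       [ 90., 100., 115.],
                       [210., 235., 270.],
                       [360., 400., 460.]])
loss = RegularGridInterpolator((currents, temps), loss_table)

# Single-pole thermal network: Rth = 0.12 K/W, Cth = 5 J/K, 1 ms time step
Rth, Cth, dt = 0.12, 5.0, 1e-3
T_amb, Tj = 40.0, 40.0

# Toy drive-cycle current profile, 10 s
for t in np.arange(0.0, 10.0, dt):
    i_load = 150.0 + 100.0 * np.sin(2 * np.pi * 0.2 * t)
    Tj_q = min(max(Tj, 25.0), 125.0)               # clamp to the table range
    P = loss([[i_load, Tj_q]]).item()              # interpolated device loss [W]
    Tj += dt * (P - (Tj - T_amb) / Rth) / Cth      # dTj/dt = (P - (Tj - Tamb)/Rth)/Cth

print("final junction temperature:", round(Tj, 1), "degC")
```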