932 results for Parameter Estimation, Fokker–Planck Equation, Finite Elements


Relevance: 100.00%

Abstract:

Commodity price modeling is normally approached in terms of structural time-series models, in which the different components (states) have a financial interpretation. The parameters of these models can be estimated using maximum likelihood. This approach results in a non-linear parameter estimation problem, so a key issue is how to obtain reliable initial estimates. In this paper, we focus on the initial parameter estimation problem for the Schwartz-Smith two-factor model commonly used in asset valuation. We propose a two-step method. The first step considers a univariate model based only on the spot price and uses a transfer function model to obtain initial estimates of the fundamental parameters. The second step uses these estimates to initialize a re-parameterized state-space innovations-based estimator that incorporates information from futures prices. The second step refines the estimates obtained in the first step and also estimates the remaining parameters of the model. The paper is partly tutorial in nature, giving an introduction to aspects of commodity price modeling and the associated parameter estimation problem.
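
As a point of reference for the model being estimated, here is a minimal simulation sketch of the Schwartz-Smith two-factor dynamics, in which the log spot price is the sum of a short-term mean-reverting deviation and a long-term Brownian trend. All parameter values are hypothetical, chosen only for illustration; the paper's two-step estimator itself is not reproduced here.

import numpy as np

rng = np.random.default_rng(0)
kappa, sigma_chi = 1.5, 0.3    # short-term mean-reversion rate and volatility
mu_xi, sigma_xi = 0.02, 0.15   # long-term drift and volatility
dt, n = 1 / 252, 1000          # daily steps, roughly four years of prices

chi, xi = np.empty(n), np.empty(n)
chi[0], xi[0] = 0.0, np.log(50.0)
for t in range(1, n):
    # exact discretisation of the Ornstein-Uhlenbeck short-term factor
    chi[t] = chi[t-1] * np.exp(-kappa * dt) + sigma_chi * np.sqrt(
        (1 - np.exp(-2 * kappa * dt)) / (2 * kappa)) * rng.standard_normal()
    # arithmetic Brownian motion for the long-term equilibrium factor
    xi[t] = xi[t-1] + mu_xi * dt + sigma_xi * np.sqrt(dt) * rng.standard_normal()

spot = np.exp(chi + xi)  # log spot price = chi + xi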

Relevance: 100.00%

Abstract:

In this paper, we propose a risk-sensitive approach to parameter estimation for hidden Markov models (HMMs). The approach exploits estimation of various functions of the state, based on model estimates. We propose practical suboptimal risk-sensitive filters to estimate these functions of the state during transients, rather than the optimal risk-neutral filters used in earlier studies. The estimates are asymptotically optimal if asymptotically risk-neutral, and can give significantly improved transient performance, a very desirable property in certain engineering applications. To demonstrate the improvement, simulation studies are presented that compare parameter estimation based on risk-sensitive filters with estimation based on risk-neutral filters.
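
A simplified sketch of the kind of filter being compared, assuming a discrete-state HMM with transition matrix A, emission matrix B, and initial distribution pi0. Setting theta = 0 gives a standard risk-neutral forward filter; theta > 0 applies an exponential weighting by an assumed per-state cost vector, a crude stand-in for the paper's risk-sensitive construction rather than its actual filters.

import numpy as np

def forward_filter(A, B, pi0, obs, theta=0.0, cost=None):
    # theta = 0: standard risk-neutral HMM forward filter.
    # theta > 0: exponential weighting by a per-state cost (hypothetical
    # stand-in for a risk-sensitive filter, not the paper's exact recursion).
    cost = np.zeros(len(pi0)) if cost is None else cost
    sigma, estimates = pi0.copy(), []
    for y in obs:
        sigma = sigma * B[:, y] * np.exp(theta * cost)  # measurement update
        sigma /= sigma.sum()                            # normalise
        estimates.append(int(np.argmax(sigma)))         # MAP state estimate
        sigma = A.T @ sigma                             # time update
    return estimates

A = np.array([[0.95, 0.05], [0.10, 0.90]])   # hypothetical transition matrix
B = np.array([[0.8, 0.2], [0.3, 0.7]])       # hypothetical emission matrix
est = forward_filter(A, B, np.array([0.5, 0.5]), [0, 0, 1, 1, 0],
                     theta=0.5, cost=np.array([0.0, 1.0]))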

Relevance: 100.00%

Abstract:

Rapid recursive estimation of hidden Markov model (HMM) parameters is important in applications that place an emphasis on the early availability of reasonable estimates (e.g. for change detection) rather than on longer-term asymptotic properties (such as convergence, convergence rate, and consistency). In the context of vision-based aircraft (image-plane) heading estimation, this paper suggests and evaluates the short-data estimation properties of three recursive HMM parameter estimation techniques (a recursive maximum likelihood estimator, an online EM HMM estimator, and a relative entropy based estimator). On both simulated and real data, our studies illustrate the feasibility of rapid recursive heading estimation, but also demonstrate the need for careful step-size design of recursive HMM estimation techniques when they are intended for applications where short-data behaviour is paramount.
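
To make the step-size issue concrete, here is a toy recursive maximum-likelihood sketch for a single self-transition parameter of a two-state HMM with known Gaussian emissions. The finite-difference score ignores the filter's dependence on past parameter values (a simplification relative to a full tangent-filter RML scheme), and the decaying step-size exponent is exactly the kind of design choice the paper warns about; all numerical values are hypothetical.

import numpy as np

rng = np.random.default_rng(1)
a_true = 0.9                   # true self-transition probability
means = np.array([0.0, 3.0])   # known unit-variance Gaussian emission means

def loglik_step(a, post_prev, y):
    """Time update with parameter a, then measurement update on y."""
    A = np.array([[a, 1 - a], [1 - a, a]])
    pred = A.T @ post_prev                   # one-step-ahead state prediction
    like = np.exp(-0.5 * (y - means) ** 2)   # Gaussian emission likelihoods
    lik = pred @ like                        # predictive likelihood of y
    return np.log(lik), pred * like / lik    # log-lik, filtered distribution

# simulate observations from the true chain
n, x, ys = 2000, 0, []
for _ in range(n):
    ys.append(means[x] + rng.standard_normal())
    x = x if rng.random() < a_true else 1 - x

a_hat, post = 0.5, np.array([0.5, 0.5])
for k, y in enumerate(ys, start=1):
    eps = 1e-4
    lp, _ = loglik_step(min(a_hat + eps, 0.999), post, y)
    lm, _ = loglik_step(max(a_hat - eps, 0.001), post, y)
    grad = (lp - lm) / (2 * eps)   # finite-difference score
    step = 1.0 / k ** 0.7          # decaying step size: the key design choice
    a_hat = float(np.clip(a_hat + step * grad, 0.001, 0.999))
    _, post = loglik_step(a_hat, post, y)

print(f"estimated self-transition probability: {a_hat:.3f}")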

Relevance: 100.00%

Abstract:

In this paper, we propose a novel online hidden Markov model (HMM) parameter estimator based on the new information-theoretic concept of one-step Kerridge inaccuracy (OKI). Under several regularity conditions, we establish a convergence result (and some limited strong consistency results) for our proposed online OKI-based parameter estimator. In simulation studies, we illustrate the global convergence behaviour of our proposed estimator and provide a counter-example illustrating the merely local convergence of other popular HMM parameter estimators.
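
For context, the Kerridge inaccuracy between a reference distribution p and a model distribution q is the cross entropy

\[
  K(p, q) \;=\; -\sum_{x} p(x)\,\log q(x) \;=\; H(p) + D_{\mathrm{KL}}(p \,\|\, q),
\]

so, with p fixed, minimising inaccuracy over q is equivalent to minimising the Kullback-Leibler divergence. Reading "one-step" as applying this to one-step-ahead predictive distributions is our gloss on the abstract, not a definition taken from the paper.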

Relevance: 100.00%

Abstract:

The total entropy utility function is considered for the dual purpose of Bayesian design for model discrimination and parameter estimation. A sequential design setting is proposed in which it is shown how to efficiently estimate the total entropy utility for a wide variety of data types. Utility estimation relies on forming particle approximations to a number of intractable integrals, afforded by the use of sequential Monte Carlo for Bayesian inference. A number of motivating examples are considered to demonstrate the performance of total entropy in comparison with utilities targeting model discrimination or parameter estimation alone. The results suggest that the total entropy utility selects designs that are efficient under both experimental goals with little compromise in achieving either. As such, the total entropy utility is advocated as a general utility for Bayesian design in the presence of model uncertainty.
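
One standard way to see why a single utility can serve both goals is the decomposition of the total entropy utility for a design d into a model-discrimination term and a parameter-estimation term, for model indicator m, within-model parameters theta_m, and data y (our notation; the paper's exact formulation may differ):

\[
U_T(d) \;=\;
\underbrace{\mathbb{E}_{y \mid d}\, D_{\mathrm{KL}}\!\big(p(m \mid y, d) \,\big\|\, p(m)\big)}_{\text{model discrimination}}
\;+\;
\underbrace{\mathbb{E}_{m, y \mid d}\, D_{\mathrm{KL}}\!\big(p(\theta_m \mid y, m, d) \,\big\|\, p(\theta_m \mid m)\big)}_{\text{parameter estimation}} .
\]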

Relevance: 100.00%

Abstract:

Stochastic (or random) processes are inherent to numerous fields of human endeavour, including engineering, science, business, and finance. This thesis presents several novel methods for quickly detecting and estimating uncertainties in a number of important classes of stochastic processes. The significance of these methods is demonstrated by employing them to detect aircraft manoeuvres in video signals for the important application of autonomous mid-air collision avoidance.

Relevance: 100.00%

Abstract:

In this work, we consider subordinated processes controlled by a family of subordinators consisting of a power function of the time variable and a negative power function of an α-stable random variable. The effect of the subordinator parameters on the subordinated process is discussed. By suitable variable substitutions and the Laplace transform technique, the corresponding fractional Fokker–Planck-type equations are derived. We also compute the mean square displacements in a force-free field. For suitable parameter ranges, the resulting subordinated processes may exhibit subdiffusion, normal diffusion, or superdiffusion.
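
For reference, the diffusive regimes named at the end are conventionally classified by the growth exponent of the mean square displacement, an exponent governed in the paper's setting by the subordinator parameters:

\[
\langle x^2(t) \rangle \;\sim\; t^{\gamma}, \qquad
\begin{cases}
0 < \gamma < 1, & \text{subdiffusion},\\
\gamma = 1, & \text{normal diffusion},\\
\gamma > 1, & \text{superdiffusion}.
\end{cases}
\]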

Relevance: 100.00%

Abstract:

The LISA Parameter Estimation Taskforce was formed in September 2007 to provide the LISA Project with vetted codes, source distribution models and results related to parameter estimation. The Taskforce's goal is to be able to quickly calculate the impact of any mission design changes on LISA's science capabilities, based on reasonable estimates of the distribution of astrophysical sources in the universe. This paper describes our Taskforce's work on massive black-hole binaries (MBHBs). Given present uncertainties in the formation history of MBHBs, we adopt four different population models, based on (i) whether the initial black-hole seeds are small or large and (ii) whether accretion is efficient or inefficient at spinning up the holes. We compare four largely independent codes for calculating LISA's parameter-estimation capabilities. All codes are based on the Fisher-matrix approximation, but in the past they used somewhat different signal models, source parametrizations and noise curves. We show that once these differences are removed, the four codes give results in extremely close agreement with each other. Using a code that includes both spin precession and higher harmonics in the gravitational-wave signal, we carry out Monte Carlo simulations and determine the number of events that can be detected and accurately localized in our four population models.
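
The Fisher-matrix approximation referred to here is the standard one in gravitational-wave parameter estimation: for a waveform h(λ) and the noise-weighted inner product, the parameter-error covariance is approximated by the inverse Fisher matrix, which is valid at high signal-to-noise ratio:

\[
\Gamma_{ij} \;=\; \left( \frac{\partial h}{\partial \lambda^i} \,\middle|\, \frac{\partial h}{\partial \lambda^j} \right),
\qquad
(a \mid b) \;=\; 4\,\mathrm{Re} \int_0^{\infty} \frac{\tilde a(f)\, \tilde b^{*}(f)}{S_n(f)}\, \mathrm{d}f,
\qquad
\Delta\lambda^i \;\approx\; \sqrt{\big(\Gamma^{-1}\big)_{ii}},
\]

where S_n(f) is the detector noise power spectral density.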

Relevance: 100.00%

Abstract:

A finite element analysis of laminated shells reinforced with laminated stiffeners is described in this paper. A rectangular laminated anisotropic shallow thin-shell finite element with 48 d.o.f. is used in conjunction with a laminated anisotropic curved beam and shell stiffening finite element with 16 d.o.f. Compatibility between the shell and the stiffener is maintained all along their junction line. Some problems of symmetrically stiffened isotropic plates and shells have been solved to evaluate the performance of the present method. The behaviour of an eccentrically stiffened laminated cantilever cylindrical shell has been predicted to demonstrate the capability of the present program. General shells amenable to rectangular meshes can be solved in a similar manner.
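
Schematically, compatibility along the junction line amounts to the stiffener element writing into the same global degrees of freedom as the shell element during assembly, as in this sketch; the element matrices below are random symmetric placeholders, not the 48- and 16-d.o.f. laminated elements of the paper.

import numpy as np

rng = np.random.default_rng(2)

def assemble(K, k_e, dofs):
    """Scatter an element stiffness matrix into the global matrix."""
    for i, gi in enumerate(dofs):
        for j, gj in enumerate(dofs):
            K[gi, gj] += k_e[i, j]

n_dof = 8
K = np.zeros((n_dof, n_dof))

k_shell = rng.random((8, 8)); k_shell += k_shell.T   # placeholder shell element
k_stiff = rng.random((4, 4)); k_stiff += k_stiff.T   # placeholder stiffener

shell_dofs = [0, 1, 2, 3, 4, 5, 6, 7]
stiff_dofs = [4, 5, 6, 7]   # stiffener reuses the shell's junction d.o.f.,
                            # which is what enforces compatibility there
assemble(K, k_shell, shell_dofs)
assemble(K, k_stiff, stiff_dofs)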

Relevance: 100.00%

Abstract:

The Davis Growth Model (a dynamic steer growth model encompassing four fat deposition models) is currently being used by the phenotypic prediction program of the Cooperative Research Centre (CRC) for Beef Genetic Technologies to predict P8 fat (mm) in beef cattle, to help beef producers meet market specifications. The concepts of cellular hyperplasia and hypertrophy are integral components of the Davis Growth Model. The net synthesis of total body fat (kg) is calculated from the net energy available after accounting for the energy needs of maintenance and protein synthesis. Total body fat (kg) is then partitioned into four fat depots (intermuscular, intramuscular, subcutaneous, and visceral). This paper reports on the parameter estimation and sensitivity analysis of the DNA (deoxyribonucleic acid) logistic growth equations and the fat deposition first-order differential equations in the Davis Growth Model using acslXtreme (Xcellon, Huntsville, AL, USA). The DNA and fat deposition parameter coefficients were found to be important determinants of model behaviour: the DNA parameter coefficients for days on feed >100, and the fat deposition parameter coefficients for all days on feed. The generalized NL2SOL optimization algorithm had the fastest processing time and the minimum number of objective function evaluations when estimating the four fat deposition parameter coefficients from two observed values (initial and final fat). The subcutaneous fat parameter coefficient did indicate a metabolic difference between frame sizes. The results look promising, and the prototype Davis Growth Model has the potential to assist the beef industry in meeting market specifications.
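
A minimal sketch of the two equation families named above, logistic growth for DNA and first-order kinetics for a single fat depot; the functional forms follow the description, but every coefficient value and variable name below is hypothetical.

import numpy as np

def simulate(days=200, dt=1.0):
    k_dna, dna_max = 0.05, 1.0    # logistic rate and asymptote (assumed)
    k_fat, fat_max = 0.02, 60.0   # first-order rate and depot capacity, kg (assumed)
    dna, fat = 0.1, 5.0
    out = []
    for d in range(int(days / dt)):
        dna += dt * k_dna * dna * (1 - dna / dna_max)   # logistic DNA growth
        fat += dt * k_fat * (fat_max - fat)             # first-order fat deposition
        out.append((d * dt, dna, fat))
    return np.array(out)

traj = simulate()   # columns: time (days on feed), DNA, depot fat (kg)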

Relevance: 100.00%

Abstract:

This work deals with the formulation and implementation of finite deformation viscoplasticity within the framework of stress-based hybrid finite element methods. Hybrid elements, which are based on a two-field variational formulation, are much less susceptible to locking than conventional displacement-based elements. The conventional return-mapping scheme cannot be used in the context of hybrid stress methods, since the stress is known and the strain and internal plastic variables must be recovered from this known stress field. We discuss the formulation and implementation of the consistent tangent tensor and the return-mapping algorithm within the context of the hybrid method, and demonstrate the efficacy of the algorithm on a wide range of problems.
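
A one-dimensional sketch of the "inverse" update being described: because the stress at the new step is known in a hybrid stress method, the plastic strain is recovered from it rather than via the conventional strain-driven return map. Perzyna viscoplasticity with linear hardening is used here purely as a stand-in constitutive model, and all material values are invented (tension case only).

import numpy as np

sig_y0, H = 200.0, 1000.0     # initial yield stress and hardening modulus (MPa)
eta, m, dt = 50.0, 1.0, 0.01  # viscosity, rate exponent, time step (assumed)

def recover_plastic_strain(sigma, ep_old, tol=1e-10):
    """Given the (known) stress, solve the implicit flow rule for the
    new plastic strain via Newton iteration on the backward-Euler residual."""
    ep = ep_old
    for _ in range(50):
        f = abs(sigma) - (sig_y0 + H * ep)   # yield function
        if f <= 0.0:
            return ep_old                    # elastic: no viscoplastic flow
        r = ep - ep_old - dt / eta * (f / sig_y0) ** m * np.sign(sigma)
        dr = 1.0 + dt / eta * m * (f / sig_y0) ** (m - 1) * H / sig_y0
        ep -= r / dr
        if abs(r) < tol:
            break
    return ep

ep = recover_plastic_strain(sigma=250.0, ep_old=0.0)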

Relevance: 100.00%

Abstract:

Swarm intelligence techniques such as particle swarm optimization (PSO) have been shown to fall short of accurately estimating global solutions in several engineering applications. The problem is more severe for inverse optimization problems, where fitness calculations are computationally expensive. In this work, a novel strategy is introduced to alleviate this problem. The proposed inverse model, based on a modified particle swarm optimization algorithm, is applied to a contaminant transport inverse problem. Inverse models based on the standard PSO and the proposed PSO are validated to assess their accuracy. The proposed model is shown to outperform the standard one in terms of parameter estimation accuracy. Preliminary results obtained using the proposed model are presented in this work.
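
For orientation, here is a minimal standard PSO loop of the kind being modified; the sphere objective, swarm size, and coefficient values are illustrative and not taken from the paper.

import numpy as np

rng = np.random.default_rng(3)

def objective(x):                      # toy fitness: sphere function
    return np.sum(x ** 2, axis=1)

n, dim, iters = 30, 2, 200
w, c1, c2 = 0.72, 1.49, 1.49           # inertia and acceleration coefficients
x = rng.uniform(-5, 5, (n, dim))       # particle positions
v = np.zeros((n, dim))                 # particle velocities
pbest, pbest_f = x.copy(), objective(x)
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    f = objective(x)
    improved = f < pbest_f             # update personal bests
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()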

Relevance: 100.00%

Abstract:

Conventional three-dimensional isoparametric elements are susceptible to locking when used to model plate/shell geometries or when the meshes are distorted. Hybrid elements, which are based on a two-field variational formulation, are immune to most of these problems and can therefore be used to efficiently model both "chunky" three-dimensional and plate/shell type structures. Thus a single element type can be used to model all types of structures, which also allows a standard dual algorithm to be used for the topology optimization of the structure. We also address the issue of manufacturability of the designs.