8 results for "Estimation process"

in University of Queensland eSpace - Australia


Relevance: 70.00%

Abstract:

Spatial characterization of non-Gaussian attributes in earth sciences and engineering commonly requires the estimation of their conditional distribution. The indicator and probability kriging approaches of current nonparametric geostatistics provide approximations for estimating conditional distributions. They do not, however, provide results similar to those of the cumbersome implementation of simultaneous cokriging of indicators. This paper presents a new formulation, termed successive cokriging of indicators, that avoids the classic simultaneous solution and its related computational problems while obtaining results equivalent to those of the impractical simultaneous solution. A successive minimization of the estimation variance of probability estimates is performed as additional data are successively included in the estimation process. In addition, the approach leads to an efficient nonparametric simulation algorithm for non-Gaussian random functions based on residual probabilities.
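The core behaviour, estimation variance shrinking as data are successively included, can be illustrated with a toy sketch. This is not the paper's successive cokriging formulation: it is plain simple kriging in one dimension under an assumed exponential covariance, with invented data locations.

```python
import math

def solve(A, b):
    # Gaussian elimination with partial pivoting, for small dense systems
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def cov(h, rng=10.0):
    # assumed exponential covariance with unit sill and range 10
    return math.exp(-abs(h) / rng)

def sk_variance(xs, x0):
    # simple-kriging estimation variance at x0 given data locations xs:
    # lambda = C^-1 c0, variance = C(0) - c0 . lambda
    C = [[cov(xi - xj) for xj in xs] for xi in xs]
    c0 = [cov(xi - x0) for xi in xs]
    lam = solve(C, c0)
    return 1.0 - sum(l * c for l, c in zip(lam, c0))

# successively include data and watch the variance at x0 = 5.0 fall
locs = [2.0, 15.0, 7.0, 4.0]
active = []
for x in locs:
    active.append(x)
    print(len(active), round(sk_variance(active, 5.0), 4))
```

Each additional datum can only reduce (never increase) the simple-kriging variance, which is the property the successive formulation exploits.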

Relevance: 60.00%

Abstract:

A calibration methodology based on an efficient and stable mathematical regularization scheme is described. This scheme is a variant of so-called Tikhonov regularization, in which the parameter estimation process is formulated as a constrained minimization problem. Use of the methodology eliminates the need for a modeler to formulate a parsimonious inverse problem in which a handful of parameters are designated for estimation prior to initiating the calibration process. Instead, the level of parameter parsimony required to achieve a stable solution to the inverse problem is determined by the inversion algorithm itself. Where parameters, or combinations of parameters, cannot be uniquely estimated, they are provided with values, or assigned relationships with other parameters, that are decreed to be realistic by the modeler. Conversely, where the information content of a calibration dataset is sufficient to allow estimates to be made of the values of many parameters, the making of such estimates is not precluded by preemptive parsimonizing ahead of the calibration process. While Tikhonov schemes are very attractive and hence widely used, problems with numerical stability can sometimes arise because the strength with which regularization constraints are applied throughout the regularized inversion process cannot be guaranteed to exactly complement inadequacies in the information content of a given calibration dataset. A new technique overcomes this problem by allowing relative regularization weights to be estimated as parameters through the calibration process itself. The technique is applied to the simultaneous calibration of five subwatershed models, and it is demonstrated that the new scheme results in a more efficient inversion, and better enforcement of regularization constraints, than traditional Tikhonov regularization methodologies.
Moreover, it is argued that a joint calibration exercise of this type results in a more meaningful set of parameters than can be achieved by individual subwatershed model calibration. (c) 2005 Elsevier B.V. All rights reserved.
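The essential mechanism, Tikhonov regularization pulling non-unique parameters toward modeler-preferred values, can be sketched on a tiny invented two-parameter problem (the system, data, and weight below are not from the paper):

```python
def tikhonov_fit(A, b, mu, x_pref):
    # Solve (A^T A + mu I) x = A^T b + mu x_pref for a 2-parameter model,
    # i.e. minimize ||Ax - b||^2 + mu ||x - x_pref||^2 (Cramer's rule)
    ata = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(2)]
           for i in range(2)]
    atb = [sum(A[k][i] * b[k] for k in range(len(A))) for i in range(2)]
    m = [[ata[0][0] + mu, ata[0][1]], [ata[1][0], ata[1][1] + mu]]
    rhs = [atb[0] + mu * x_pref[0], atb[1] + mu * x_pref[1]]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [(rhs[0] * m[1][1] - m[0][1] * rhs[1]) / det,
            (m[0][0] * rhs[1] - rhs[0] * m[1][0]) / det]

# nearly collinear columns: the unregularized inverse problem is ill-posed
A = [[1.0, 1.001], [1.0, 0.999], [1.0, 1.0]]
b = [2.01, 1.99, 2.0]
print(tikhonov_fit(A, b, 0.0, [0.0, 0.0]))   # wildly non-unique solution
print(tikhonov_fit(A, b, 0.1, [1.0, 1.0]))   # pulled toward preferred values
```

With mu = 0 the near-collinearity produces an extreme, meaningless solution; a small regularization weight draws both parameters toward the preferred value of 1, which the data cannot contradict. Estimating the weight itself, as the paper proposes, replaces the fixed mu here.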

Relevance: 60.00%

Abstract:

The Gauss-Marquardt-Levenberg (GML) method of computer-based parameter estimation, in common with other gradient-based approaches, suffers from the drawback that it may become trapped in local objective function minima, and thus report optimized parameter values that are not, in fact, optimized at all. This can seriously degrade its utility in the calibration of watershed models, where local optima abound. Nevertheless, the method also has advantages, chief among these being its model-run efficiency and its ability to report useful information on parameter sensitivities and covariances as a by-product of its use. It is also easily adapted to maintain this efficiency in the face of potential numerical problems (which adversely affect all parameter estimation methodologies) caused by parameter insensitivity and/or parameter correlation. The present paper presents two algorithmic enhancements to the GML method that retain its strengths but overcome its weaknesses in the face of local optima. In the first, an intelligent search for better parameter sets is conducted in parameter subspaces of decreasing dimensionality whenever progress of the parameter estimation process is slowed, either by numerical instability incurred through problem ill-posedness or by the encounter of a local objective function minimum. The second methodology minimizes the chance of successive GML parameter estimation runs finding the same objective function minimum by starting successive runs at points that are maximally removed from previous parameter trajectories. As well as enhancing the ability of a GML-based method to find the global objective function minimum, the latter technique can also be used to find the locations of many non-global optima (should they exist) in parameter space.
This can provide a useful means of inquiring into the well-posedness of a parameter estimation problem, and for detecting the presence of bimodal parameter and predictive probability distributions. The new methodologies are demonstrated by calibrating a Hydrological Simulation Program-FORTRAN (HSPF) model against a time series of daily flows. Comparison with the SCE-UA method in this calibration context demonstrates a high level of comparative model run efficiency for the new method. (c) 2006 Elsevier B.V. All rights reserved.
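The second enhancement, restarting at points maximally removed from previous trajectories, can be caricatured in a few lines. A crude finite-difference descent stands in for GML, and the bimodal objective, bounds, and seed are all invented:

```python
import random

def f(x):
    # synthetic bimodal objective with minima near x = -1.02 and x = 0.97
    return (x * x - 1.0) ** 2 + 0.2 * x

def local_descent(x, step=0.01, iters=2000):
    # finite-difference gradient descent standing in for one GML run
    for _ in range(iters):
        g = (f(x + 1e-5) - f(x - 1e-5)) / 2e-5
        x -= step * g
    return x

random.seed(0)
visited, minima = [], []
for run in range(3):
    cands = [random.uniform(-2.0, 2.0) for _ in range(200)]
    if visited:
        # restart maximally removed from all previously visited points
        start = max(cands, key=lambda c: min(abs(c - v) for v in visited))
    else:
        start = cands[0]
    x = local_descent(start)
    visited += [start, x]
    minima.append(round(x, 2))
print(sorted(set(minima)))
```

The max-min-distance restart drives the second run far from the first basin of attraction, so both optima are located instead of the same one being found repeatedly.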

Relevance: 40.00%

Abstract:

The estimation of a concentration-dependent diffusion coefficient in a drying process is known as an inverse coefficient problem. The solution is sought for the case in which the space-averaged concentration is known as a function of time (mass-loss monitoring). The problem is stated as the minimization of a functional, and gradient-based algorithms are used to solve it. Many numerical and experimental examples that demonstrate the effectiveness of the proposed approach are presented. Thin-slab drying was carried out in an isothermal drying chamber built in our laboratory. The diffusion coefficients of fructose obtained with the present method are compared with existing literature results.
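A sketch of the inverse-coefficient idea, simplified to a constant (rather than concentration-dependent) diffusion coefficient and a crude scan instead of a gradient algorithm: synthetic space-averaged concentration data play the role of mass-loss monitoring, and all numbers are invented.

```python
def avg_conc(D, nx=21, dt=0.0005, nt=400):
    # explicit FTCS scheme for dc/dt = D d2c/dx2 on a unit slab,
    # c = 1 initially, c = 0 at both faces; returns space-averaged c samples
    dx = 1.0 / (nx - 1)
    r = D * dt / dx ** 2          # must stay <= 0.5 for stability
    c = [0.0] + [1.0] * (nx - 2) + [0.0]
    samples = []
    for step in range(1, nt + 1):
        c = [0.0] + [c[i] + r * (c[i + 1] - 2 * c[i] + c[i - 1])
                     for i in range(1, nx - 1)] + [0.0]
        if step % 50 == 0:
            samples.append(sum(c) / nx)
    return samples

D_true = 0.8
data = avg_conc(D_true)           # stands in for measured mass-loss data

def misfit(D):
    # the functional: squared misfit between model and "measured" averages
    return sum((m - d) ** 2 for m, d in zip(avg_conc(D), data))

# crude scan over candidate coefficients minimizing the functional
D_hat = min((0.1 + 0.05 * k for k in range(40)), key=misfit)
print(D_hat)
```

The recovered coefficient matches the value used to generate the data; with real, noisy mass-loss records the same functional would be minimized by a gradient-based method, as in the paper.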

Relevance: 30.00%

Abstract:

Adaptive phase estimation is the process of estimating the phase of an electromagnetic field via a continually changing measurement. The measurement is varied in an attempt to optimize it at each moment. In this paper, we show that adaptive phase estimation is more accurate than nonadaptive phase estimation for continuous beams of light even when small time delays in the feedback are present. (c) 2005 Pleiades Publishing Inc.
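The advantage of adaptivity can be caricatured with a discrete-shot toy model (the paper concerns continuous beams with feedback delay, which this sketch does not reproduce): binary outcomes occur with probability (1 + cos(phi - theta))/2, and the adaptive scheme steers the controllable measurement phase theta toward the steepest point of the fringe, resolving the sign ambiguity that a fixed theta cannot.

```python
import math, random

def estimate_phase(true_phi, shots=60, adaptive=True, ngrid=180):
    # grid-based Bayesian phase estimation; outcome "1" has
    # probability (1 + cos(phi - theta)) / 2
    grid = [2 * math.pi * k / ngrid for k in range(ngrid)]
    post = [1.0 / ngrid] * ngrid
    theta = 0.0
    for _ in range(shots):
        if adaptive:
            # feedback: point the measurement at the fringe's steepest slope
            mx = math.fsum(p * math.cos(g) for p, g in zip(post, grid))
            my = math.fsum(p * math.sin(g) for p, g in zip(post, grid))
            theta = math.atan2(my, mx) + math.pi / 2
        out = 1 if random.random() < (1 + math.cos(true_phi - theta)) / 2 else 0
        like = [(1 + math.cos(g - theta)) / 2 if out
                else (1 - math.cos(g - theta)) / 2 for g in grid]
        post = [p * l for p, l in zip(post, like)]
        s = sum(post)
        post = [p / s for p in post]
    mx = math.fsum(p * math.cos(g) for p, g in zip(post, grid))
    my = math.fsum(p * math.sin(g) for p, g in zip(post, grid))
    return math.atan2(my, mx) % (2 * math.pi)   # circular posterior mean

random.seed(1)
phi = 2.0

def err(est):
    d = abs(est - phi) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

adap = [err(estimate_phase(phi, adaptive=True)) for _ in range(30)]
fixed = [err(estimate_phase(phi, adaptive=False)) for _ in range(30)]
print(sum(adap) / 30 < sum(fixed) / 30)
```

With theta fixed at 0 the likelihood cannot distinguish phi from -phi, so the nonadaptive estimate straddles the two candidates; the adaptive scheme breaks this symmetry and its mean error is far smaller.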

Relevance: 30.00%

Abstract:

The process of adsorption of two dissociating and two non-dissociating aromatic compounds from dilute aqueous solutions onto an untreated, commercially available activated carbon (B.D.H.) was investigated systematically. All adsorption experiments were carried out in pH-controlled aqueous solutions. The experimental isotherms were fitted to four different models (Langmuir homogeneous, Langmuir binary, Langmuir-Freundlich single, and Langmuir-Freundlich double). Variation of the model parameters with the solution pH was studied and used to gain further insight into the adsorption process. The relationship between the model parameters and the solution pH and pKa was used to predict the adsorption capacity for the molecular and ionic forms of the solutes in other solutions. A relationship was sought to predict the effect of pH on the adsorption systems and to estimate the maximum adsorption capacity of the carbon at any pH at which the solute is reasonably well ionized. N2 and CO2 adsorption were used to characterize the carbon, and X-ray Photoelectron Spectroscopy (XPS) measurement was used for surface elemental analysis of the activated carbon.
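As a generic illustration of isotherm fitting (a single Langmuir fit via the standard linearization, not the paper's four-model, pH-dependent analysis; the equilibrium data below are synthetic):

```python
def langmuir(c, qmax, b):
    # Langmuir isotherm: q = qmax * b * C / (1 + b * C)
    return qmax * b * c / (1 + b * c)

# synthetic equilibrium data (hypothetical qmax = 2.5 mmol/g, b = 0.8 L/mmol)
C = [0.1, 0.5, 1.0, 2.0, 5.0, 10.0]
Q = [langmuir(c, 2.5, 0.8) for c in C]

# linearized form: C/q = 1/(qmax*b) + C/qmax  ->  ordinary least squares
x, y = C, [c / q for c, q in zip(C, Q)]
n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(v * v for v in x)
sxy = sum(u * v for u, v in zip(x, y))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
qmax_hat = 1 / slope               # maximum adsorption capacity
b_hat = 1 / (intercept * qmax_hat) # affinity constant
print(round(qmax_hat, 3), round(b_hat, 3))
```

Repeating such a fit at several pH values and tracking how qmax and b vary is the kind of analysis the abstract describes, there extended to binary and Langmuir-Freundlich forms.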

Relevance: 30.00%

Abstract:

Consider a network of unreliable links, modelling, for example, a communication network. Estimating the reliability of the network (expressed as the probability that certain nodes in the network are connected) is a computationally difficult task. In this paper we study how the Cross-Entropy method can be used to obtain more efficient network reliability estimation procedures. Three estimation techniques are considered: crude Monte Carlo and the more sophisticated Permutation Monte Carlo and Merge Process. We show that the Cross-Entropy method yields a speed-up over all three techniques.
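The crude Monte Carlo baseline can be sketched directly (the five-edge bridge network and edge reliability below are invented examples, and no Cross-Entropy importance sampling is attempted here):

```python
import random

# edges of a small "bridge" network; each edge works independently with prob p
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
p = 0.9

def connected(up_edges, src=0, dst=3):
    # depth-first search over surviving edges: are src and dst connected?
    adj = {}
    for a, b in up_edges:
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    seen, stack = {src}, [src]
    while stack:
        for nb in adj.get(stack.pop(), []):
            if nb not in seen:
                seen.add(nb)
                stack.append(nb)
    return dst in seen

def crude_mc(n=20000):
    # sample random edge states and count terminal-connected realizations
    random.seed(42)
    hits = sum(connected([e for e in edges if random.random() < p])
               for _ in range(n))
    return hits / n

print(round(crude_mc(), 3))
```

For highly reliable networks the interesting event (disconnection) is rare, which is exactly why variance-reduction schemes such as Permutation Monte Carlo, the Merge Process, and Cross-Entropy tilting outperform this baseline.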

Relevance: 30.00%

Abstract:

The water retention curve (WRC) is a hydraulic characteristic of concrete required for advanced modeling of water (and thus solute) transport in variably saturated, heterogeneous concrete. Unfortunately, determination by a direct experimental method (for example, measuring equilibrium moisture levels of samples stored in constant-humidity cells) is a lengthy process, taking over 2 years for large samples. A surrogate approach is presented in which the WRC is conveniently estimated from mercury intrusion porosimetry (MIP) and validated by water sorption isotherms: the well-known Barrett, Joyner and Halenda (BJH) method of estimating the pore size distribution (PSD) from the water sorption isotherm is shown to complement the PSD derived from conventional MIP. This provides a basis for predicting the complete WRC from MIP data alone. The van Genuchten equation is used to model the combined water sorption and MIP results. It is a convenient tool for describing water retention characteristics over the full moisture content range. The van Genuchten parameter estimation based solely on MIP is shown to give a satisfactory approximation to the WRC, with a simple restriction on one of the parameters.
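A sketch of fitting the van Genuchten equation, Se(h) = [1 + (alpha*h)^n]^(-(1 - 1/n)), to retention points: the data below are synthetic stand-ins for MIP-derived points, and a coarse grid search stands in for a proper nonlinear parameter estimator.

```python
def van_genuchten(h, alpha, n):
    # effective saturation Se(h) = [1 + (alpha*h)^n]^-(1 - 1/n)
    m = 1 - 1 / n
    return (1 + (alpha * h) ** n) ** (-m)

# hypothetical retention points (suction h, effective saturation Se),
# standing in for points derived from MIP pore-size data
H = [1.0, 5.0, 10.0, 50.0, 100.0, 500.0]
Se = [van_genuchten(h, 0.05, 1.6) for h in H]

def sse(alpha, n):
    # sum of squared errors between the model and the retention points
    return sum((van_genuchten(h, alpha, n) - s) ** 2 for h, s in zip(H, Se))

# coarse grid search over (alpha, n) minimizing the misfit
best = min(((a / 1000, 1.1 + k / 20)
            for a in range(10, 101, 5) for k in range(1, 21)),
           key=lambda p: sse(*p))
print(best)
```

The fitted pair reproduces the generating parameters; with real MIP-derived points the same misfit would be minimized subject to the restriction on one parameter that the abstract mentions.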