947 results for "Estimated parameter"


Relevance: 100.00%

Abstract:

International audience

Relevance: 70.00%

Abstract:

The VSS X̄ chart, dedicated to the detection of small to moderate mean shifts in the process, has been investigated by several researchers under the assumption of known process parameters. In practice, the process parameters are rarely known and are usually estimated from an in-control Phase I data set. In this paper, we evaluate the (run length) performances of the VSS X̄ chart when the process parameters are estimated, compare them with the case where the process parameters are assumed known, and propose optimal control chart parameters that take the number of Phase I samples into account.
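To illustrate the kind of run-length comparison described above, the sketch below uses a simplified fixed-sample-size X̄ chart (not the VSS chart itself): control limits are built either from known parameters or from m in-control Phase I subgroups of size n, and the in-control average run length (ARL) is obtained by sampling the geometric run-length distribution conditional on the estimated limits. All settings (n = 5, L = 3, standard normal data) are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def arl0(m, n=5, L=3.0, reps=20_000, known=False):
    """In-control ARL of an X-bar chart with known or Phase-I-estimated limits."""
    if known:
        mu_hat, sigma_hat = np.zeros(reps), np.ones(reps)
    else:
        phase1 = rng.normal(0.0, 1.0, size=(reps, m, n))
        mu_hat = phase1.mean(axis=(1, 2))                              # estimate of mu0
        sigma_hat = np.sqrt(phase1.var(axis=2, ddof=1).mean(axis=1))   # pooled sigma0
    se = 1.0 / np.sqrt(n)                        # true std. deviation of the subgroup mean
    ucl = mu_hat + L * sigma_hat / np.sqrt(n)
    lcl = mu_hat - L * sigma_hat / np.sqrt(n)
    # Conditional on the limits, the run length is geometric with this signal probability
    p_signal = norm.sf(ucl, scale=se) + norm.cdf(lcl, scale=se)
    return rng.geometric(p_signal).mean()

for label, value in [("known parameters", arl0(m=25, known=True)),
                     ("estimated, m = 25", arl0(m=25)),
                     ("estimated, m = 100", arl0(m=100))]:
    print(f"in-control ARL ({label}): {value:.0f}")
```

The gap between the known-parameter ARL and the estimated-parameter ARL shrinks as the number of Phase I samples m grows, which is the effect the optimal chart parameters above are designed to account for.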

Relevance: 60.00%

Abstract:

In radionuclide metrology, Monte Carlo (MC) simulation is widely used to compute parameters associated with primary measurements or calibration factors. Although MC methods are used to estimate uncertainties, the uncertainty associated with radiation transport in MC calculations is usually difficult to estimate. Counting statistics is the most obvious component of MC uncertainty and has to be checked carefully, particularly when variance reduction is used. However, in most cases fluctuations associated with counting statistics can be reduced using sufficient computing power. Cross-section data have intrinsic uncertainties that induce correlations when apparently independent codes are compared. Their effect on the uncertainty of the estimated parameter is difficult to determine and varies widely from case to case. Finally, the most significant uncertainty component for radionuclide applications is usually that associated with the detector geometry. Recent 2D and 3D x-ray imaging tools may be utilized, but comparison with experimental data as well as adjustments of parameters are usually inevitable.
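As a small illustration of the counting-statistics (type A) component mentioned above, the sketch below treats an MC efficiency calculation as a binomial experiment: the relative statistical uncertainty shrinks roughly as 1/sqrt(N) with the number of simulated histories, which is why this component can usually be driven down with computing power. The detection probability and history counts are made-up values, not output from any actual radiation transport code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative numbers only: a "detector" registers each simulated history
# with probability p_true; the MC efficiency estimate and its counting-
# statistics (type A) uncertainty then follow binomial statistics.
p_true = 0.05                       # assumed true detection efficiency
for n_histories in (10_000, 1_000_000):
    hits = rng.binomial(n_histories, p_true)
    eff = hits / n_histories
    u_stat = np.sqrt(eff * (1.0 - eff) / n_histories)     # standard uncertainty
    print(f"N = {n_histories:>9,d}  efficiency = {eff:.5f}  "
          f"relative stat. uncertainty = {100 * u_stat / eff:.2f} %")
```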

Relevance: 60.00%

Abstract:

In this work the G_A^0 distribution is assumed as the universal model for amplitude Synthetic Aperture Radar (SAR) image data under the Multiplicative Model. The observed data are therefore assumed to obey a G_A^0(alpha, gamma, n) law, where the parameter n is related to the speckle noise and (alpha, gamma) are related to the ground truth, giving information about the background. Maps generated by estimating (alpha, gamma) at each coordinate can therefore be used as the input for classification methods. Maximum likelihood estimators are derived and used to form estimated parameter maps. This estimation can be hampered by the presence of corner reflectors, man-made objects used to calibrate SAR images that produce large return values. In order to alleviate this contamination, robust (M) estimators are also derived for the universal model. Gaussian maximum likelihood classification is used to obtain maps from hard-to-deal-with simulated data, and the superiority of robust estimation is quantitatively assessed.
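A minimal sketch of the maximum likelihood step for the amplitude model, assuming the G_A^0 density with a known equivalent number of looks n: synthetic G_A^0 samples are drawn through the multiplicative model (inverse-gamma backscatter times gamma-distributed speckle) and (alpha, gamma) are recovered by numerically maximising the log-likelihood. The robust M-estimators and the corner-reflector contamination discussed above are not reproduced here; all parameter values are illustrative.

```python
import numpy as np
from scipy import optimize, special, stats

rng = np.random.default_rng(2)

# Synthetic G_A^0(alpha, gamma, n) amplitudes under the multiplicative model:
# Z_A = sqrt(X * Y), X ~ inverse gamma (backscatter), Y ~ Gamma(n, n) (speckle).
alpha_true, gamma_true, n_looks = -3.0, 2.0, 4
x = stats.invgamma.rvs(-alpha_true, scale=gamma_true, size=5000, random_state=rng)
y = stats.gamma.rvs(n_looks, scale=1.0 / n_looks, size=5000, random_state=rng)
z = np.sqrt(x * y)

def neg_loglik(theta, z, n):
    """Negative log-likelihood of the G_A^0(alpha, gamma, n) amplitude law."""
    alpha, gamma = theta
    if alpha >= -0.01 or gamma <= 0.0:           # enforce alpha < 0, gamma > 0
        return np.inf
    ll = (np.log(2.0) + n * np.log(n) + special.gammaln(n - alpha)
          - alpha * np.log(gamma) - special.gammaln(n) - special.gammaln(-alpha)
          + (2 * n - 1) * np.log(z) - (n - alpha) * np.log(gamma + n * z**2))
    return -np.sum(ll)

res = optimize.minimize(neg_loglik, x0=[-1.5, 1.0], args=(z, n_looks),
                        method="Nelder-Mead")
print("ML estimates (alpha, gamma):", res.x)
```

Applying the same fit window by window over an image would give the (alpha, gamma) parameter maps used as classification input.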

Relevance: 60.00%

Abstract:

Although estimation of turbulent transport parameters using inverse methods is not new, there is little evaluation of the method in the literature. Here, it is shown that extended observation of the broad-scale hydrography by Argo provides a path to improved estimates of regional turbulent transport rates. Results from a 20-year ocean state estimate produced with the ECCO v4 non-linear inverse modeling framework provide supporting evidence. Turbulent transport parameter maps are estimated under the constraints of fitting the extensive collection of Argo profiles collected through 2011. The adjusted parameters dramatically reduce misfits to in situ profiles as compared with earlier ECCO solutions. They also yield a clear reduction in the model drift away from observations over multi-century-long simulations, both for assimilated variables (temperature and salinity) and independent variables (biogeochemical tracers). Despite the minimal constraints imposed specifically on the estimated parameters, their geography is physically plausible and exhibits close connections with the upper-ocean stratification as observed by Argo. The estimated parameter adjustments furthermore have first-order impacts on upper-ocean stratification and mixed layer depths over 20 years. These results identify the constraint of fitting Argo profiles as an effective observational basis for estimating regional turbulent transport rates. Uncertainties and further improvements of the method are discussed.
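The estimation principle, stripped of the ECCO v4 adjoint machinery and the Argo data, can be caricatured in one dimension: choose the turbulent diffusivity that minimises the misfit between modelled and "observed" vertical profiles. The grid, time step, profile shape and diffusivity values below are arbitrary illustrative choices, not ocean-state-estimate quantities.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)

# Toy 1-D analogue of fitting a turbulent diffusivity to observed profiles.
nz, dz, dt, nsteps = 50, 10.0, 3600.0, 240        # 500 m column, 10-day run
z = np.arange(nz) * dz
u0 = 20.0 - 8.0 * np.tanh((z - 250.0) / 50.0)     # thermocline-like initial profile

def run_model(kappa):
    """Explicit vertical diffusion of the initial profile with diffusivity kappa."""
    u = u0.copy()
    for _ in range(nsteps):
        u[1:-1] += kappa * dt / dz**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u

kappa_true = 1e-3                                  # "unknown" diffusivity (m^2/s)
obs = run_model(kappa_true) + rng.normal(0.0, 0.01, nz)   # noisy "observed" profile

cost = lambda k: np.sum((run_model(k) - obs) ** 2)
fit = minimize_scalar(cost, bounds=(1e-5, 1e-2), method="bounded")
print("estimated diffusivity (m^2/s):", fit.x)
```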

Relevance: 60.00%

Abstract:

Objective: Levodopa in the presence of decarboxylase inhibitors follows two-compartment kinetics and its effect is typically modelled using sigmoid Emax models. Pharmacokinetic modelling of the absorption phase of oral administration is problematic because of irregular gastric emptying. The purpose of this work was to identify and estimate a population pharmacokinetic-pharmacodynamic model for duodenal infusion of levodopa/carbidopa (Duodopa®) that can be used for in numero simulation of treatment strategies. Methods: The modelling involved pooling data from two studies and fixing some parameters to values found in the literature (Chan et al. J Pharmacokinet Pharmacodyn. 2005 Aug;32(3-4):307-31). The first study involved 12 patients on 3 occasions and is described in Nyholm et al. Clinical Neuropharmacology 2003;26:156-63. The second study, PEDAL, involved 3 patients on 2 occasions. A bolus dose (the normal morning dose plus 50%) was given after an overnight washout. Plasma samples and motor ratings (clinical assessment of motor function from video recordings on a treatment response scale between -3 and 3, where -3 represents severe parkinsonism and 3 represents severe dyskinesia) were collected repeatedly until the clinical effect was back at baseline. At this point, the usual infusion rate was started and sampling continued for another two hours. Different structural absorption models and effect models were evaluated using the value of the objective function in the NONMEM package. Population mean parameter values, standard errors of the estimates (SE) and, where possible, interindividual/interoccasion variability (IIV/IOV) were estimated. Results: Our results indicate that Duodopa absorption can be modelled with an absorption compartment with an added bioavailability fraction and a lag time. The most successful effect model was of sigmoid Emax type with a steep Hill coefficient and an effect compartment delay. Estimated parameter values are presented in the table. Conclusions: The absorption and effect models were reasonably successful in fitting observed data and can be used in simulation experiments.
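For readers unfamiliar with the model structure, the sketch below encodes the qualitative pieces named above: first-order absorption with a bioavailability fraction and lag time, one-compartment disposition (a simplification of the two-compartment kinetics), an effect compartment delay, and a steep sigmoid Emax effect mapped onto the -3 to +3 motor scale. Every numerical value is a placeholder, not one of the study's estimates.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder parameter values only (illustrative, not the estimated ones).
F, tlag, ka = 0.85, 10.0, 0.05        # bioavailability, lag time (min), absorption rate (1/min)
V, CL, ke0 = 25.0, 0.5, 0.1           # volume (L), clearance (L/min), effect-compartment rate
E0, Emax, EC50, hill = -2.0, 5.0, 1.5, 8.0   # sigmoid Emax effect model (EC50 in mg/L)
dose = 100.0                          # levodopa bolus dose (mg)

def rhs(t, y):
    a_gut, c_plasma, c_effect = y
    absorption = ka * a_gut if t > tlag else 0.0          # lag time before absorption
    return [-absorption,                                  # drug remaining at absorption site
            (absorption - CL * c_plasma) / V,             # central (plasma) concentration
            ke0 * (c_plasma - c_effect)]                  # delayed effect-site concentration

sol = solve_ivp(rhs, (0.0, 300.0), [F * dose, 0.0, 0.0], max_step=1.0)
effect = E0 + Emax * sol.y[2] ** hill / (EC50 ** hill + sol.y[2] ** hill)
print("peak motor rating on the -3..+3 scale:", round(float(effect.max()), 2))
```

The steep Hill coefficient makes the motor rating switch rapidly between the "off" and dyskinetic ends of the scale once the effect-site concentration crosses EC50, which is the behaviour the effect model above is meant to capture.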

Relevance: 60.00%

Abstract:

Applying microeconomic theory, we develop a forecasting model for firm entry into local markets and test this model using data from the Swedish wholesale industry. The empirical analysis is based on directly estimating the profit function of wholesale firms. As in previous entry studies, profits are assumed to depend on firm- and location-specific factors, and the profit equation is estimated using panel data econometric techniques. Using the residuals from the profit equation estimations, we identify local markets in Sweden where firm profits are abnormally high given the level of all independent variables included in the profit function. From microeconomic theory, we then know that these local markets should have higher net entry than other markets, all else being equal, and we investigate this in a second step, also using a panel data econometric model. The results of estimating the net-entry equation indicate that four of the five estimated models show more net entry in high-return municipalities, but the estimated parameter is only statistically significant at conventional levels in one of our estimated models.
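A two-step sketch of the approach described above, on synthetic data: (1) estimate a profit equation from firm- and location-specific factors and flag local markets whose average residual profit is abnormally high; (2) regress net entry on that flag. The variable names and data-generating process are invented for illustration and are not the Swedish wholesale data or the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(4)

n_markets, n_firms = 60, 20
market = np.repeat(np.arange(n_markets), n_firms)
mkt_size = rng.normal(0.0, 1.0, n_markets)             # location-specific factor
firm_cap = rng.normal(0.0, 1.0, n_markets * n_firms)   # firm-specific factor
advantage = rng.normal(0.0, 1.0, n_markets)            # unobserved local advantage
profit = (1.0 + 0.5 * mkt_size[market] + 0.8 * firm_cap
          + advantage[market] + rng.normal(0.0, 1.0, market.size))

# Step 1: profit equation, then market-level average residuals
X = np.column_stack([np.ones(market.size), mkt_size[market], firm_cap])
beta, *_ = np.linalg.lstsq(X, profit, rcond=None)
avg_resid = np.bincount(market, weights=profit - X @ beta) / n_firms
high_return = (avg_resid > np.quantile(avg_resid, 0.8)).astype(float)

# Step 2: net-entry equation (entry is assumed to respond to the advantage)
net_entry = 0.3 + 1.2 * advantage + rng.normal(0.0, 0.5, n_markets)
Z = np.column_stack([np.ones(n_markets), high_return])
gamma, *_ = np.linalg.lstsq(Z, net_entry, rcond=None)
print("estimated extra net entry in high-return markets:", round(float(gamma[1]), 2))
```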

Relevance: 60.00%

Abstract:

The VSS X̄ chart is known to perform better than the traditional X̄ control chart in detecting small to moderate mean shifts in the process. Many researchers have used this chart to detect a process mean shift under the assumption of known parameters. However, in practice, the process parameters are rarely known and are usually estimated from an in-control Phase I data set. In this paper, we evaluate the (run length) performances of the VSS X̄ control chart when the process parameters are estimated and compare them with the case where the process parameters are assumed known. We draw the conclusion that these performances are quite different when the shift and the number of samples used during Phase I are small. ©2010 IEEE.
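The conclusion that performance differs most when few Phase I samples are available can be visualised through the spread of the conditional in-control ARL across Phase I data sets. The sketch below again uses a simplified fixed-sample-size X̄ chart rather than the VSS chart, with illustrative settings.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

def conditional_arl0(m, n=5, L=3.0, reps=2000):
    """Conditional in-control ARL for each of `reps` simulated Phase I data sets."""
    phase1 = rng.normal(0.0, 1.0, size=(reps, m, n))
    mu_hat = phase1.mean(axis=(1, 2))
    sigma_hat = np.sqrt(phase1.var(axis=2, ddof=1).mean(axis=1))
    se = 1.0 / np.sqrt(n)                                 # true sd of the subgroup mean
    ucl = mu_hat + L * sigma_hat / np.sqrt(n)
    lcl = mu_hat - L * sigma_hat / np.sqrt(n)
    p_signal = norm.sf(ucl, scale=se) + norm.cdf(lcl, scale=se)
    return 1.0 / p_signal                                 # one conditional ARL per data set

for m in (25, 100, 400):
    arl = conditional_arl0(m)
    print(f"m = {m:3d}: median ARL0 = {np.median(arl):6.0f}, "
          f"5%-95% range = {np.percentile(arl, 5):5.0f} to {np.percentile(arl, 95):6.0f}")
```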

Relevance: 60.00%

Abstract:

True-amplitude migration of seismic reflection data, in depth or in time, makes it possible to obtain a measure of the reflection coefficients of the so-called primary reflection events. These events consist, for example, of P-P longitudinal wave reflections at smooth reflectors of arbitrary curvature. One of the best-known methods is Kirchhoff migration, in which the seismic image is produced by integrating the seismic wavefield along diffraction surfaces, known as Huygens surfaces. In order to obtain an estimate of the reflection coefficients during migration, that is, to correct for the effect of geometrical spreading, a weight function is used in the migration integral operator. This weight function is obtained from the asymptotic solution of the integral at stationary points. Both the computation of traveltimes and the determination of the weight function require ray tracing, which makes migration a computationally expensive process when the physical property is strongly heterogeneous. This work presents a true-amplitude depth migration algorithm for the case of a point seismic source, with the subsurface velocity model represented by a function that varies in two dimensions and is constant in the third dimension. This situation, known as the two-and-a-half-dimensional (2.5-D) model, has characteristics typical of many situations of interest in petroleum exploration, such as the acquisition of 2-D seismic data with receivers along a seismic line and a 3-D seismic source. In particular, emphasis is given to the case in which the seismic wave propagation velocity varies linearly with depth. Another topic of great importance addressed in this work concerns the seismic inversion method known as double diffraction stacking. Through the quotient of two stacks with appropriate weights, physical properties and geometric parameters related to the trajectory of the reflected ray can be determined, which can later be used in seismic data processing, for example for amplitude analysis.
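As a toy counterpart to the Kirchhoff (diffraction-stack) migration described above, the sketch below migrates a constant-velocity, zero-offset synthetic section by summing each trace along the diffraction traveltime curve of every image point (its Huygens surface). The true-amplitude weight function, the point-source 2.5-D geometry and the v(z) ray tracing of the abstract are deliberately omitted; velocity, sampling and the scatterer position are made-up values.

```python
import numpy as np

v = 2000.0                      # constant velocity (m/s), illustrative
dt, dx = 0.004, 25.0            # time and receiver sampling of the zero-offset section
nt, nx = 500, 80
xs = np.arange(nx) * dx

# Synthetic zero-offset section: the diffraction hyperbola of one point scatterer
xd, zd = 40 * dx, 500.0         # scatterer position (m)
data = np.zeros((nt, nx))
for ix in range(nx):
    it = int(round(2.0 * np.sqrt(zd**2 + (xs[ix] - xd) ** 2) / v / dt))
    if it < nt:
        data[it, ix] = 1.0

# Unweighted diffraction-stack (Kirchhoff) migration
image = np.zeros((nt, nx))
cols = np.arange(nx)
for iz in range(nt):
    z = 0.5 * v * iz * dt       # depth of this row of image points
    for ix in range(nx):
        t_diff = 2.0 * np.sqrt(z**2 + (xs - xs[ix]) ** 2) / v   # Huygens surface
        it = np.rint(t_diff / dt).astype(int)
        ok = it < nt
        image[iz, ix] = data[it[ok], cols[ok]].sum()

iz_max, ix_max = np.unravel_index(image.argmax(), image.shape)
print("image focuses at x =", ix_max * dx, "m, z =", 0.5 * v * iz_max * dt, "m")
```

A true-amplitude version would multiply each summed sample by the weight function discussed above so that the stacked value approximates the reflection coefficient rather than an unscaled focus.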

Relevance: 60.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 60.00%

Abstract:

The relationship between the genetic polymorphism of populations and environmental variability: an application of fitness-set theory. The Quantitative Fitness-Set Model (QFM) is an extension of fitness-set theory. The QFM can represent gradations between coarse-grained and fine-grained regular fluctuations of two environments. Environment- and species-specific parameters, as well as the resulting environmental grain, can be quantified. Experimental data can be analysed, and the QFM proves to be very accurate in large populations, which is supported by the discrete parameter space. Small populations and/or high genetic diversity lead to estimation inaccuracies, which are also to be expected in natural populations. A population-size-dependent uncertainty value extends the point estimate of a parameter set to an interval estimate. In finite populations these intervals act as fitness bands. This leads to the hypothesis that generalists evolve in species living in dense, continuous fitness bands, and specialists in discrete fitness bands. Asynchronous reproductive strategies lead to the preservation of genetic diversity. A change from coarse-grained to fine-grained environmental variation favours the specialised genotypes. From this starting point for disruptive selection, the hypothesis of speciation in transition scenarios from coarse-grained to fine-grained environmental variation can be formulated. In the reverse case, loss of diversity and stabilising selection are to be expected. This thus provides a process-oriented explanation for the species richness of the (fine-grained) tropics compared with the species-poorer (coarse-grained) temperate zones, which are subject to seasonal fluctuations.
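The coarse- versus fine-grained distinction above can be illustrated with the classical fitness-set arithmetic (a stand-in for intuition, not the QFM itself): under fine-grained variation a genotype's effective fitness is the arithmetic mean over the two environments, whereas under coarse-grained variation whole generations experience one environment and the geometric mean applies, so the same pair of genotypes can rank differently in the two regimes. The fitness values are invented for illustration.

```python
import numpy as np

# Two environments occur with equal frequency. A specialist does well in one
# and poorly in the other; a generalist is intermediate in both.
genotypes = {
    "specialist": np.array([1.6, 0.4]),   # fitness in environment 1, environment 2
    "generalist": np.array([1.0, 0.9]),
}

for name, w in genotypes.items():
    fine = w.mean()              # fine-grained: arithmetic mean within each generation
    coarse = np.sqrt(w.prod())   # coarse-grained: geometric mean across generations
    print(f"{name}: fine-grained fitness = {fine:.2f}, coarse-grained fitness = {coarse:.2f}")
```

With these numbers the specialist wins under fine-grained variation and the generalist under coarse-grained variation, matching the direction of the hypothesis stated above.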

Relevance: 60.00%

Abstract:

When estimating the effect of treatment on HIV using data from observational studies, standard methods may produce biased estimates due to the presence of time-dependent confounders. Such confounding can be present when a covariate, affected by past exposure, is both a predictor of the future exposure and of the outcome. One example is the CD4 cell count, which is a marker of disease progression in HIV patients but also a marker for treatment initiation, and is itself influenced by treatment. Fitting a marginal structural model (MSM) using inverse probability weights is one way to give appropriate adjustment for this type of confounding. In this paper we study a simple and intuitive approach to estimate similar treatment effects, using observational data to mimic several randomized controlled trials. Each 'trial' is constructed from the individuals starting treatment in a certain time interval. An overall effect estimate for all such trials is found using composite likelihood inference. The method offers an alternative to the use of inverse probability of treatment weights, which is unstable in certain situations. The estimated parameter is not identical to that of an MSM; it is conditioned on covariate values at the start of each mimicked trial. This allows the study of questions that are not so easily addressed by fitting an MSM. The analysis can be performed as a stratified weighted Cox analysis on the joint data set of all the constructed trials, where each trial is one stratum. The model is applied to data from the Swiss HIV cohort study.
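A sketch of how the mimicked trials might be assembled from longitudinal data, using synthetic values and invented column names (the Swiss HIV cohort variables are not reproduced here): each calendar interval defines one trial whose treated arm contains the individuals starting treatment in that interval and whose comparison arm contains those still untreated, with baseline covariates taken at the trial start. The stacked data set is what the stratified, weighted Cox analysis described above would then be fitted to, one stratum per trial.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)

n = 200
start = rng.integers(1, 13, size=n).astype(float)      # month in which treatment starts
start[rng.random(n) < 0.3] = np.inf                    # never starts treatment
event_time = rng.exponential(24.0, size=n)             # months to the event
cens_time = 60.0                                       # administrative censoring (months)

trials = []
for k in range(1, 13):                                 # one mimicked trial per month
    at_risk = event_time > k                           # event-free at trial start
    treated_now = (start >= k) & (start < k + 1)       # start treatment in this interval
    untreated = start > k + 1                          # not yet treated at trial start
    eligible = at_risk & (treated_now | untreated)
    cd4_baseline = rng.normal(350 - 5 * k, 50, size=n) # covariate measured at trial start
    trials.append(pd.DataFrame({
        "trial": k,
        "treated": treated_now[eligible].astype(int),
        "cd4_baseline": cd4_baseline[eligible],
        "time": np.minimum(event_time[eligible], cens_time) - k,
        "event": (event_time[eligible] <= cens_time).astype(int),
    }))

stacked = pd.concat(trials, ignore_index=True)
print(stacked.groupby("trial")["treated"].agg(["size", "mean"]))
# A stratified, weighted Cox regression on `stacked` (one stratum per trial)
# would then give the overall composite-likelihood treatment-effect estimate.
```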

Relevance: 60.00%

Abstract:

To estimate a parameter in an elliptic boundary value problem, the method of equation error chooses the value that minimizes the error in the PDE and boundary condition (the solution of the BVP having been replaced by a measurement). The estimated parameter converges to the exact value as the measured data converge to the exact value, provided Tikhonov regularization is used to control the instability inherent in the problem. The error in the estimated solution can be bounded in an appropriate quotient norm; estimates can be derived for both the underlying (infinite-dimensional) problem and a finite-element discretization that can be implemented in a practical algorithm. Numerical experiments demonstrate the efficacy and limitations of the method.
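A one-dimensional sketch of the equation-error idea under Tikhonov regularisation, with an invented test problem: for -(a(x) u'(x))' = f(x) on (0, 1), the measured u is inserted into a finite-difference form of the PDE, which makes the residual linear in the unknown coefficient a, and a is recovered from a regularised linear least-squares problem. The choices a(x) = 1 + x and u(x) = sin(pi x), the noise level and the regularisation weight are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

N = 100
h = 1.0 / N
x = np.linspace(0.0, 1.0, N + 1)
xm = 0.5 * (x[:-1] + x[1:])                        # midpoints, where a(x) is discretised
u_true = np.sin(np.pi * x)
# f = -(a u')' for a(x) = 1 + x and u(x) = sin(pi x)
f = -np.pi * np.cos(np.pi * x) + np.pi**2 * (1.0 + x) * np.sin(np.pi * x)
u_meas = u_true + rng.normal(0.0, 1e-4, u_true.size)    # noisy "measurement" of u

# Equation error is linear in a: at interior node i the residual couples a at
# the two neighbouring midpoints through the measured slopes of u.
du = np.diff(u_meas) / h                            # u'(x) at midpoints
A = np.zeros((N - 1, N))
for i in range(1, N):
    A[i - 1, i] = -du[i] / h                        # coefficient of a at midpoint i+1/2
    A[i - 1, i - 1] = du[i - 1] / h                 # coefficient of a at midpoint i-1/2

# Tikhonov regularisation penalising the first differences of a
lam = 1e-3
L = (np.eye(N, k=1) - np.eye(N))[:-1] / h
A_aug = np.vstack([A, np.sqrt(lam) * L])
b_aug = np.concatenate([f[1:-1], np.zeros(N - 1)])
a_hat = np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]
print("max error in estimated a(x):", float(np.abs(a_hat - (1.0 + xm)).max()))
```

The regularisation term is what controls the instability noted above: without it, noise in the measured slopes of u is amplified wherever u'(x) is small.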

Relevance: 60.00%

Abstract:

In order to add value to soybean crops, and hence to their marketing, medium and large producers have been using precision agriculture (PA) techniques, such as remote sensing, geographic information systems (GIS) and satellite positioning, to assist crop management. Thus, given the economic relevance of this crop to the southwest of Paraná State and to Brazil, scientific studies to increase its productivity and profitability are of major importance. The objective of this study was to evaluate the correlation between soil chemical properties and soybean yield for each estimated semivariogram parameter (range, nugget and sill), and the decomposition of these correlations into direct and indirect effects, aiming to improve the mapping of the spatial variability of soil chemical properties for use in PA. The hypothesis is that not all soil attributes used to estimate the semivariogram parameters have a direct effect on productivity, and that even in groups of plants within a larger area it is possible to estimate the semivariogram parameters. The experiment was conducted in a commercial area of 19.7 ha, located in the municipality of Pato Branco - PR, with central geographic coordinates 26° 11' 35" South, 52° 43' 05" West, and an average altitude of 780 m. The area has been planted with soybeans for over 30 years; the cultivar currently grown is Brasmax Target RR - Don Mario 5.9i, with a row spacing of 0.50 m and 13 plants m-1, totalling 260,000 plants ha-1. A pair of ProMark™3 topographic receivers was used to georeference the study area and the sampling points, with relative positioning used to obtain the georeferenced coordinates. For data collection (soil chemical analyses and crop yield), 10 blocks were sampled in the experimental area, each with an area of 20 m2 (20 m long x 1 m wide) containing two adjacent rows spaced 0.5 m apart. Each block was divided into 20 plots of 1 m2, and from each plot four subsamples were collected at a distance of 0.5 m from the block lines, forming one sample for the 0-10 cm depth and one for the 10-20 cm depth per plot, totalling 200 samples for each depth. The soybean harvest was carried out on the blocks according to maturity, and in each block one bundle per metre was considered. In the data analysis, a multicollinearity diagnosis was performed, followed by a path analysis of the main variables as a function of the explanatory variables (range of the chemical attributes: pH, K, P, Ca, etc.). The results obtained by the path analysis of the semivariogram parameters of the soil chemical properties indicated that only Fe, Mg, Mn, organic matter (OM), P and base saturation (SB) exerted direct and indirect effects on soybean productivity, although they did not present spatial variability, indicating that the distribution of blocks in the area was unable to identify the spatial dependence of these elements, making it impossible to draw up maps of the chemical attributes for use in PA.
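For the semivariogram parameters referred to above (nugget, sill and range), the sketch below computes an empirical semivariogram along a 1 m spaced transect and fits a spherical model to it. The synthetic attribute values merely stand in for a soil chemical property; none of the numbers come from the Pato Branco data set.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(8)

# Synthetic 1 m spaced transect mimicking a spatially structured soil attribute
coords = np.arange(0.0, 200.0, 1.0)
values = 5.5 + np.sin(coords / 15.0) + rng.normal(0.0, 0.3, coords.size)

# Empirical semivariogram: gamma(h) = 0.5 * mean squared difference at lag h
dist = np.abs(coords[:, None] - coords[None, :])
diff2 = (values[:, None] - values[None, :]) ** 2
lags = np.arange(1.0, 41.0)
gamma_hat = np.array([0.5 * diff2[np.isclose(dist, h)].mean() for h in lags])

def spherical(h, nugget, sill, a):
    """Spherical semivariogram model with range parameter a."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, sill)

(nugget, sill, a), _ = curve_fit(spherical, lags, gamma_hat, p0=[0.1, 1.0, 20.0],
                                 bounds=([0.0, 0.0, 1.0], [5.0, 5.0, 200.0]))
print(f"nugget = {nugget:.3f}, sill = {sill:.3f}, range = {a:.1f} m")
```

The fitted range, nugget and sill per attribute are exactly the quantities that enter the path analysis described above.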

Relevance: 60.00%

Abstract:

Hydrometallurgical process modeling is the main objective of this Master's thesis work. Three different leaching processes, namely high-pressure pyrite oxidation, direct oxidation zinc concentrate (sphalerite) leaching, and gold chloride leaching using a rotating disc electrode (RDE), are modeled and simulated using the gPROMS process simulation program in order to evaluate its model-building capabilities. The leaching mechanism in each case is described in terms of a shrinking core model. The mathematical modeling carried out included process model development based on the available literature, estimation of the reaction kinetic parameters, and assessment of model reliability by checking the goodness of fit and the cross-correlation between the estimated parameters through the use of correlation matrices. The estimated parameter values in each case were compared with those obtained using the Modest simulation program. Further, based on the estimated reaction kinetic parameters, reactor simulation and modeling for direct oxidation zinc concentrate (sphalerite) leaching is carried out in Aspen Plus V8.6. The zinc leaching autoclave is based on the Cominco reactor configuration and is modeled as a series of continuous stirred-tank reactors (CSTRs). The sphalerite conversion is calculated and a sensitivity analysis is carried out to determine the optimum reactor operating temperature and the optimum oxygen mass flow rate. In this way, the implementation of reaction kinetic models into the process flowsheet simulation environment has been demonstrated.
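To illustrate the kind of kinetic parameter estimation described above, the sketch below fits a surface-reaction-controlled shrinking core model, 1 - (1 - X)^(1/3) = t / tau, to synthetic conversion-time data and reports a simple goodness-of-fit measure. The data and the time constant tau are illustrative, not values from the leaching experiments modeled in the thesis.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(9)

def conversion(t, tau):
    """Shrinking core model (surface reaction control): X = 1 - (1 - t/tau)^3."""
    return 1.0 - (1.0 - np.minimum(t / tau, 1.0)) ** 3

t_data = np.linspace(0.0, 120.0, 13)                          # time (min)
X_data = conversion(t_data, tau=150.0) + rng.normal(0.0, 0.02, t_data.size)

(tau_hat,), cov = curve_fit(conversion, t_data, X_data, p0=[100.0])
residuals = X_data - conversion(t_data, tau_hat)
r2 = 1.0 - residuals.var() / X_data.var()
print(f"estimated tau = {tau_hat:.1f} min (s.e. {np.sqrt(cov[0, 0]):.1f}), R^2 = {r2:.3f}")
```

The same fitted kinetics, expressed as a rate law, are what would then be carried into a flowsheet simulator and applied to each CSTR in the autoclave train.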