990 results for Estimation errors
Abstract:
The aim of this article is to discuss the estimation of systematic risk in capital asset pricing models with heavy-tailed error distributions for the asset returns. Diagnostic methods for assessing departures from the model assumptions, as well as the influence of observations on the parameter estimates, are also presented. It can be shown that outlying observations are downweighted in the maximum likelihood equations of linear models with heavy-tailed error distributions, such as the Student-t, power exponential and logistic-II distributions. This robustness also extends to influential observations. An application in which the systematic risk estimate of Microsoft is compared under normal and heavy-tailed errors is presented for illustration.
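As a rough illustration of how heavy-tailed errors downweight outliers relative to normal errors, the sketch below fits a market-model regression by maximum likelihood under Student-t errors and compares the slope (systematic risk) with the ordinary least squares estimate. The data, degrees of freedom and starting values are arbitrary placeholders, not the article's Microsoft application.

```python
# Rough sketch: beta estimation under normal (OLS) vs Student-t errors.
# All data and tuning values below are hypothetical.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
x = rng.normal(0.0, 0.02, size=250)                 # placeholder market returns
y = 0.001 + 1.2 * x + stats.t.rvs(df=3, scale=0.01, size=250, random_state=rng)

# Maximum likelihood under normal errors is ordinary least squares
X = np.column_stack([np.ones_like(x), x])
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Maximum likelihood under Student-t errors (fixed degrees of freedom)
def negloglik(theta, nu=4.0):
    a, b, log_s = theta
    resid = y - a - b * x
    return -np.sum(stats.t.logpdf(resid, df=nu, scale=np.exp(log_s)))

fit = optimize.minimize(negloglik, x0=[0.0, 1.0, np.log(0.01)], method="Nelder-Mead")
print("OLS beta:", beta_ols[1], "Student-t ML beta:", fit.x[1])
```

Large residuals pull the OLS slope around, while the t-likelihood gives them progressively less weight.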
Abstract:
In many epidemiological studies it is common to resort to regression models relating the incidence of a disease to its risk factors. The main goal of this paper is to consider inference for such models with error-prone observations and measurement error variances that change across observations. We suppose that the observations follow a bivariate normal distribution and that the measurement errors are normally distributed. Aggregate data allow the estimation of the error variances. Maximum likelihood estimates are computed numerically via the EM algorithm. Consistent estimation of the asymptotic variance of the maximum likelihood estimators is also discussed. Test statistics are proposed for testing hypotheses of interest. Further, we implement a simple graphical device that enables an assessment of the model's goodness of fit. Results of simulations concerning the properties of the test statistics are reported. The approach is illustrated with data from the WHO MONICA Project on cardiovascular disease. Copyright (C) 2008 John Wiley & Sons, Ltd.
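For orientation only, the sketch below applies a moment-based attenuation correction to a straight-line regression whose covariate carries heteroscedastic measurement error with known variances. It is a simpler device than the EM-based maximum likelihood fit described in the abstract, and all data and variance ranges are synthetic placeholders.

```python
# Moment-based attenuation correction when the covariate is observed with
# heteroscedastic measurement error of known variance sigma2_u[i].
# Synthetic data; illustration only, not the paper's EM algorithm.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x_true = rng.normal(50, 10, n)                  # hypothetical true exposure
sigma2_u = rng.uniform(4, 25, n)                # known error variances per unit
X = x_true + rng.normal(0, np.sqrt(sigma2_u))   # error-prone observation
Y = 2.0 + 0.05 * x_true + rng.normal(0, 0.5, n)

sxx = np.sum((X - X.mean()) ** 2)
sxy = np.sum((X - X.mean()) * (Y - Y.mean()))

beta_naive = sxy / sxx                          # attenuated towards zero
beta_corr = sxy / (sxx - np.sum(sigma2_u))      # approximate attenuation correction
print(beta_naive, beta_corr)
```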
Abstract:
We analyse the finite-sample behaviour of two second-order bias-corrected alternatives to the maximum likelihood estimator of the parameters in a multivariate normal regression model with general parametrization proposed by Patriota and Lemonte [A. G. Patriota and A. J. Lemonte, Bias correction in a multivariate regression model with general parameterization, Stat. Prob. Lett. 79 (2009), pp. 1655-1662]. The two finite-sample corrections we consider are the conventional second-order bias-corrected estimator and the bootstrap bias correction. We present numerical results comparing the performance of these estimators. Our results reveal that the analytical bias correction outperforms the numerical bias corrections obtained from bootstrapping schemes.
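The bootstrap bias correction compared in the study follows the generic recipe sketched below, illustrated here on a deliberately biased variance estimator rather than on the authors' multivariate normal regression model.

```python
# Generic bootstrap bias correction: theta_bc = 2*theta_hat - mean of the
# bootstrap replicates. Toy example with the biased ML variance estimator.
import numpy as np

rng = np.random.default_rng(2)
sample = rng.normal(0.0, 2.0, size=30)

def theta_hat(x):
    return np.var(x)            # ML variance estimator, biased by a factor (n-1)/n

est = theta_hat(sample)
boot = np.array([theta_hat(rng.choice(sample, size=sample.size, replace=True))
                 for _ in range(2000)])
bias_boot = boot.mean() - est
theta_bc = est - bias_boot      # equivalently 2*est - boot.mean()
print(est, theta_bc)
```

The analytical alternative instead subtracts a closed-form O(1/n) bias term derived from the model rather than estimating it by resampling.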
Abstract:
This paper presents a two-step pseudo-likelihood estimation technique for generalized linear mixed models with random effects that are correlated between groups. The core idea is to handle the intractable integrals in the likelihood function by a multivariate Taylor approximation. The accuracy of the estimation technique is assessed in a Monte Carlo study. An application with a binary response variable is presented using a real data set on credit defaults from two Swedish banks. Thanks to the two-step estimation technique, the proposed algorithm outperforms conventional pseudo-likelihood algorithms in terms of computational time.
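The intractable integrals arise because the random effects must be integrated out of the likelihood. The toy sketch below approximates such an integral for a single random-intercept logistic cluster with a second-order Taylor (Laplace) expansion around the mode. It only illustrates the general idea of Taylor-based approximation, not the authors' two-step pseudo-likelihood algorithm, and every value in it is a made-up placeholder.

```python
# Laplace (second-order Taylor) approximation of a random-intercept integral
# for one cluster of binary responses. Placeholder values throughout.
import numpy as np
from scipy import optimize
from scipy.stats import norm

y = np.array([1, 0, 1, 1, 0])          # hypothetical binary responses in one group
eta_fixed, tau = 0.2, 1.0              # fixed-effect linear predictor and RE std. dev.

def log_joint(b):                      # log f(y | b) + log phi(b; 0, tau^2)
    p = 1.0 / (1.0 + np.exp(-(eta_fixed + b)))
    return np.sum(y * np.log(p) + (1 - y) * np.log1p(-p)) + norm.logpdf(b, 0, tau)

b_hat = optimize.minimize_scalar(lambda b: -log_joint(b)).x
h = 1e-5                               # numerical second derivative at the mode
d2 = (log_joint(b_hat + h) - 2 * log_joint(b_hat) + log_joint(b_hat - h)) / h**2
laplace = np.exp(log_joint(b_hat)) * np.sqrt(2 * np.pi / -d2)

# brute-force reference value for comparison
grid = np.linspace(-6, 6, 4001)
brute = np.sum(np.exp([log_joint(b) for b in grid])) * (grid[1] - grid[0])
print(laplace, brute)
```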
Abstract:
This paper considers the general problem of Feasible Generalized Least Squares Instrumental Variables (FGLS IV) estimation using optimal instruments. First we summarize the sufficient conditions for the FGLS IV estimator to be asymptotically equivalent to an optimal GLS IV estimator. Then we specialize to stationary dynamic systems with stationary VAR errors, and use the sufficient conditions to derive new moment conditions for these models. These moment conditions produce useful IVs from the lagged endogenous variables, despite the correlation between errors and endogenous variables. This use of the information contained in the lagged endogenous variables expands the class of IV estimators under consideration and thereby potentially improves both the asymptotic and small-sample efficiency of the optimal IV estimator in the class. Some Monte Carlo experiments compare the new methods with those of Hatanaka [1976]. For the DGP used in the Monte Carlo experiments, asymptotic efficiency is strictly improved by the new IVs, and experimental small-sample efficiency is improved as well.
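As a minimal reminder of the IV mechanics involved, the sketch below runs plain two-stage least squares with a lagged endogenous variable included in the instrument set for a toy dynamic model with i.i.d. errors. It is not the FGLS IV estimator with optimal instruments developed in the paper, where the VAR error structure is what makes the construction delicate; the data-generating values are invented.

```python
# Toy two-stage least squares with a lagged endogenous variable as an extra
# instrument. Simplified illustration only (i.i.d. errors, scalar system).
import numpy as np

rng = np.random.default_rng(3)
T = 500
z = rng.normal(size=T)                         # exogenous driver
u = rng.normal(size=T)                         # structural error
e = rng.normal(size=T)
x = np.zeros(T)
for t in range(1, T):                          # endogenous, autocorrelated regressor
    x[t] = 0.7 * x[t - 1] + z[t] + 0.6 * u[t] + e[t]
y = 1.0 + 0.5 * x + u

# Instruments: constant, exogenous variable, lagged endogenous regressor
Z = np.column_stack([np.ones(T - 1), z[1:], x[:-1]])
X = np.column_stack([np.ones(T - 1), x[1:]])
Y = y[1:]

X_hat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)  # first-stage fitted values
beta_2sls = np.linalg.solve(X_hat.T @ X, X_hat.T @ Y)
print(beta_2sls)                               # roughly [1.0, 0.5]
```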
Abstract:
A novel approach for solving robust parameter estimation problems is presented for processes with unknown-but-bounded errors and uncertainties. An artificial neural network is developed to calculate a membership set for the model parameters. Techniques of fuzzy logic control lead the network to its equilibrium points. Simulated examples are presented to illustrate the proposed technique. The results represent a significant improvement over previously proposed methods. (C) 1999 IMACS/Elsevier B.V. All rights reserved.
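For context, the membership set such methods target for a linear model is {beta : |y_i - x_i'beta| <= eps for all i}. The sketch below bounds each coordinate of that set by linear programming; it is a standard baseline, not the neural-network/fuzzy-control technique proposed in the paper, and the data and error bound eps are invented.

```python
# Baseline set-membership (unknown-but-bounded error) estimation for a linear
# model: bound each parameter of the feasible set by linear programming.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)
n, eps = 60, 0.3
X = np.column_stack([np.ones(n), rng.uniform(-1, 1, n)])
y = X @ np.array([1.0, 2.0]) + rng.uniform(-eps, eps, n)   # bounded noise

# |y - X beta| <= eps  <=>  X beta <= y + eps  and  -X beta <= eps - y
A_ub = np.vstack([X, -X])
b_ub = np.concatenate([y + eps, eps - y])

intervals = []
for j in range(X.shape[1]):
    c = np.zeros(X.shape[1])
    c[j] = 1.0
    lo = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * X.shape[1]).fun
    hi = -linprog(-c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * X.shape[1]).fun
    intervals.append((lo, hi))
print(intervals)   # interval hull of the membership set
```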
Abstract:
This paper presents an intelligent search strategy, based on the tabu search metaheuristic, for identifying conforming bad data errors in generalized power system state estimation. The main objective is to detect critical errors involving both analog and topology errors. These are conforming errors, which contaminate measurements that do not themselves contain bad data and which defeat conventional bad data identification strategies based on normalized residual methods. ©2005 IEEE.
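A generic tabu search skeleton over candidate suspect sets is sketched below to fix ideas. The objective function is a placeholder rather than a power-system residual criterion, so this shows only the metaheuristic's move / tabu-list / aspiration structure, not the paper's identification strategy.

```python
# Generic tabu search over binary "suspect measurement" vectors.
# The cost function below is a toy placeholder, not a state-estimation residual.
import numpy as np

rng = np.random.default_rng(9)
n_meas = 12
true_bad = {2, 7}                                    # hypothetical conforming bad data

def cost(suspect):                                   # placeholder objective:
    s = set(np.flatnonzero(suspect))                 # penalize missed bad data heavily,
    return 10 * len(true_bad - s) + len(s - true_bad)  # extra suspicions lightly

def neighbours(x):
    for i in range(len(x)):                          # flip one measurement at a time
        y = x.copy()
        y[i] ^= 1
        yield i, y

current = np.zeros(n_meas, dtype=int)
best, best_cost = current.copy(), cost(current)
tabu, tenure = {}, 4
for it in range(50):
    candidates = [(cost(y), i, y) for i, y in neighbours(current)
                  if tabu.get(i, -1) < it or cost(y) < best_cost]   # aspiration rule
    c, i, current = min(candidates, key=lambda t: t[0])
    tabu[i] = it + tenure                            # forbid re-flipping move i for a while
    if c < best_cost:
        best, best_cost = current.copy(), c
print(np.flatnonzero(best), best_cost)
```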
Abstract:
Among the positioning systems that compose GNSS (Global Navigation Satellite System), GPS is capable of providing low-, medium- and high-precision positioning data. However, GPS observables may be subject to many different types of errors. These systematic errors can degrade the accuracy of the positioning provided by GPS and are mainly related to GPS satellite orbits, multipath, and atmospheric effects. In order to mitigate these errors, a semiparametric model and the penalized least squares technique were employed in this study. This is similar to changing the stochastic model, in which error functions are incorporated, and the results are similar to those obtained when the functional model is changed instead. Using this method, it was shown that ambiguity resolution and the estimation of station coordinates were more reliable and accurate than when a conventional least squares methodology was employed.
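In generic form, penalized least squares augments the parametric unknowns with a discretized error function g and minimizes ||y - X*beta - g||^2 + lambda*||D g||^2, where D is a roughness (second-difference) operator. The sketch below shows that generic form on synthetic data; it is not the GPS functional/stochastic model of the study, and the regressor, smoothing parameter and error function are placeholders.

```python
# Generic penalized least squares: parametric part plus a smooth systematic-error
# function penalized by its second differences. Hypothetical data throughout.
import numpy as np

rng = np.random.default_rng(5)
n = 120
t = np.linspace(0.0, 1.0, n)
x = rng.normal(size=(n, 1))                      # placeholder parametric regressor
g_true = 0.5 * np.sin(4 * np.pi * t)             # smooth systematic error
y = (x @ np.array([1.5])) + g_true + rng.normal(0, 0.1, n)

A = np.hstack([x, np.eye(n)])                    # unknowns: (beta, g_1..g_n)
D = np.diff(np.eye(n), n=2, axis=0)              # second-difference operator
lam = 50.0
P = np.zeros((n + 1, n + 1))
P[1:, 1:] = D.T @ D                              # penalize roughness of g only
theta = np.linalg.solve(A.T @ A + lam * P, A.T @ y)
beta_hat, g_hat = theta[0], theta[1:]
print(beta_hat)                                  # close to 1.5; g_hat tracks g_true
```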
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Ceramic parts are increasingly replacing metal parts due to their excellent physical, chemical and mechanical properties; however, these same properties make them difficult to manufacture by traditional machining methods. The developments carried out in this work are used to estimate tool wear during the grinding of advanced ceramics. The learning process was fed with data collected from a surface grinding machine with a tangential diamond wheel and alumina ceramic test specimens, in three cutting configurations: depths of cut of 120 μm, 70 μm and 20 μm. The grinding wheel speed was 35 m/s and the table speed 2.3 m/s. Four neural models were evaluated, namely: Multilayer Perceptron, Radial Basis Function, Generalized Regression Neural Networks and the Adaptive Neuro-Fuzzy Inference System. The models' performance evaluation routines were executed automatically, testing all possible combinations of inputs, number of neurons, number of layers, and spread. The computational results reveal that the neural models were highly successful in estimating tool wear, since the errors were lower than 4%.
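Purely as an illustration of fitting one of the evaluated model families, the sketch below trains a small multilayer perceptron regressor on synthetic stand-in features. The feature names, data and network sizes are invented and do not correspond to the study's grinding signals or its automated model-selection routine.

```python
# Illustrative MLP regressor for a wear-estimation style task.
# Features and targets are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n = 300
acoustic_rms = rng.uniform(0.1, 1.0, n)         # hypothetical sensor feature
power = rng.uniform(0.5, 3.0, n)                # hypothetical spindle power
depth_of_cut = rng.choice([20e-6, 70e-6, 120e-6], n)
wear = 5.0 * acoustic_rms + 2.0 * power + 1e4 * depth_of_cut + rng.normal(0, 0.2, n)

Xf = np.column_stack([acoustic_rms, power, depth_of_cut])
X_tr, X_te, y_tr, y_te = train_test_split(Xf, wear, test_size=0.3, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000,
                                   random_state=0))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
mape = np.mean(np.abs((pred - y_te) / y_te)) * 100
print(f"MAPE: {mape:.2f}%")                     # percentage error on held-out data
```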
Abstract:
The objective of this study was the estimation of the lower flammability limits of C-H compounds at 25 °C and 1 atm, at moderate temperatures, and in the presence of diluents. A set of 120 C-H compounds was divided into a correlation set and a prediction set of 60 compounds each. The absolute average relative error for the total set was 7.89%; for the correlation set it was 6.09%; and for the prediction set it was 9.68%. However, it was shown that by considering different sources of experimental data these values were reduced to 6.5% for the prediction set and to 6.29% for the total set. The method showed consistency with Le Chatelier's law for binary mixtures of C-H compounds. When tested over a temperature range from 5 °C to 100 °C, the absolute average relative errors were 2.41% for methane, 4.78% for propane, 0.29% for iso-butane and 3.86% for propylene. When nitrogen was added, the absolute average relative errors were 2.48% for methane, 5.13% for propane, 0.11% for iso-butane and 0.15% for propylene. When carbon dioxide was added, the absolute relative errors were 1.80% for methane, 5.38% for propane, 0.86% for iso-butane and 1.06% for propylene. (C) 2014 Elsevier B.V. All rights reserved.
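A worked illustration of the Le Chatelier rule mentioned above for a binary fuel mixture is LFL_mix = 1 / sum(y_i / LFL_i), with y_i the fuel mole fractions on a diluent-free basis. The pure-component limits used below are approximate literature values chosen for illustration, not the paper's estimates.

```python
# Le Chatelier mixing rule for the lower flammability limit of a fuel mixture.
def le_chatelier_lfl(fractions, lfls):
    assert abs(sum(fractions) - 1.0) < 1e-9          # mole fractions must sum to 1
    return 1.0 / sum(y / lfl for y, lfl in zip(fractions, lfls))

# 60% methane (LFL ~ 5.0 vol%) + 40% propane (LFL ~ 2.1 vol%)
print(le_chatelier_lfl([0.6, 0.4], [5.0, 2.1]))      # ~ 3.2 vol%
```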
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Regression coefficients specify the partial effect of a regressor on the dependent variable. Sometimes the bivariate or limited multivariate relationship of that regressor with the dependent variable is known from population-level data. We show here that such population-level data can be used to reduce the variance and bias of estimates of those regression coefficients obtained from sample survey data. The method of constrained MLE is used to achieve these improvements, and its statistical properties are first described. The method constrains the weighted sum of all the covariate-specific associations (partial effects) of the regressors on the dependent variable to equal the overall association of one or more regressors, where the latter is known exactly from the population data. We refer to regressors whose bivariate or limited multivariate relationships with the dependent variable are constrained by population data as "directly constrained." Our study investigates the improvements in the estimation of directly constrained variables as well as the improvements in the estimation of other regressor variables that may be correlated with the directly constrained variables, and thus "indirectly constrained" by the population data. The example application is to the marital fertility of black versus white women. The difference between white and black women's rates of marital fertility, available from population-level data, gives the overall association of race with fertility. We show that the constrained MLE technique both provides a far more powerful statistical test of the partial effect of being black and purges the test of a bias that would otherwise distort the estimated magnitude of this effect. We find only trivial reductions, however, in the standard errors of the parameters for indirectly constrained regressors.
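A generic sketch of constrained maximum likelihood is given below: a Gaussian regression is fitted with one coefficient pinned by an equality constraint standing in for the externally known population-level association. This is a simplification (the paper constrains a weighted sum of covariate-specific partial effects, and its application is to fertility data); the data, the constraint value 0.8 and the variable names are invented.

```python
# Constrained maximum likelihood for a Gaussian regression, with an equality
# constraint representing externally known population-level information.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n = 500
x1 = rng.binomial(1, 0.5, n).astype(float)       # e.g., a group indicator
x2 = rng.normal(size=n)
y = 0.3 + 0.8 * x1 + 0.5 * x2 + rng.normal(0, 1.0, n)
X = np.column_stack([np.ones(n), x1, x2])

def negloglik(theta):
    beta, log_sigma = theta[:3], theta[3]
    resid = y - X @ beta
    return 0.5 * np.sum(resid ** 2) / np.exp(2 * log_sigma) + n * log_sigma

# Constraint: the coefficient of x1 equals the externally known value 0.8
constraint = {"type": "eq", "fun": lambda th: th[1] - 0.8}
fit = minimize(negloglik, x0=np.zeros(4), constraints=[constraint], method="SLSQP")
print(fit.x[:3])
```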
Abstract:
This paper introduces a skewed log-Birnbaum-Saunders regression model based on the skewed sinh-normal distribution proposed by Leiva et al. [A skewed sinh-normal distribution and its properties and application to air pollution, Comm. Statist. Theory Methods 39 (2010), pp. 426-443]. Some influence methods, such as local influence and generalized leverage, are presented. Additionally, we derive the normal curvatures of local influence under some perturbation schemes. An empirical application to a real data set is presented in order to illustrate the usefulness of the proposed model.
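For orientation, the sketch below writes down the log-likelihood of the symmetric (non-skewed) sinh-normal regression that underlies the log-Birnbaum-Saunders model, with the scale fixed at 2, and maximizes it numerically. The skewed extension of Leiva et al. adds a skewness parameter on top of this and is not reproduced here; all data and starting values are synthetic.

```python
# Base (non-skewed) sinh-normal / log-Birnbaum-Saunders regression by maximum
# likelihood, with sigma fixed at 2. Synthetic data; illustration only.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(10)
n = 300
x = rng.uniform(0, 1, n)
X = np.column_stack([np.ones(n), x])
alpha_true, beta_true = 0.8, np.array([1.0, 2.0])
# If Z ~ N(0,1), then y = mu + 2*arcsinh(alpha*Z/2) follows the sinh-normal law
y = X @ beta_true + 2.0 * np.arcsinh(alpha_true * rng.normal(size=n) / 2.0)

def negloglik(theta):
    b0, b1, log_alpha = theta
    alpha = np.exp(log_alpha)
    mu = X @ np.array([b0, b1])
    z = (2.0 / alpha) * np.sinh((y - mu) / 2.0)      # standard normal under the model
    jac = (1.0 / alpha) * np.cosh((y - mu) / 2.0)    # |dz/dy|
    return -np.sum(stats.norm.logpdf(z) + np.log(jac))

fit = optimize.minimize(negloglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
print(fit.x[:2], np.exp(fit.x[2]))                   # ~ [1.0, 2.0] and ~0.8
```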
Abstract:
Objective: To evaluate and compare the intraobserver and interobserver reliability and agreement for the biparietal diameter (BPD), abdominal circumference (AC), femur length (FL) and estimated fetal weight (EFW) obtained by two-dimensional ultrasound (2D-US) and three-dimensional ultrasound (3D-US). Methods: Women with singleton pregnancies between 24 and 40 weeks of gestation were invited to participate in this study. They were examined using 2D-US in a blinded manner, twice by one observer, with a scan by a second observer in between, to determine BPD, AC and FL. In each of the three examinations, three 3D-US datasets (head, abdomen and thigh) were acquired for measurements of the same parameters. We determined EFW using Hadlock's formula. Systematic errors between 3D-US and 2D-US were examined using the paired t-test. Reliability and agreement were assessed by intraclass correlation coefficients (ICCs), limits of agreement (LoA), the SD of the differences and the proportion of differences below arbitrary cut-off points. Results: We evaluated 102 singleton pregnancies. No significant systematic error between 2D-US and 3D-US was observed. The ICC values were higher for 3D-US in both intra- and interobserver evaluations; however, only for FL was there no overlap in the 95% CIs. The LoA were wider for 2D-US, suggesting that random errors were smaller when using 3D-US. Additionally, the SDs of the differences for 3D-US were smaller than those for 2D-US. Higher proportions of differences were below the arbitrarily defined cut-off points when using 3D-US. Conclusion: 3D-US improved the reliability and agreement of fetal measurements and EFW compared with 2D-US.
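A minimal sketch of the agreement statistics named in this abstract (one-way ICC, 95% limits of agreement, SD of differences, and the proportion of differences below a cut-off) is given below for paired repeated measurements. The data, the smaller noise assumed for the "3D" scans and the 2 mm cut-off are synthetic assumptions, not the study's results.

```python
# Agreement statistics for paired repeated measurements. Synthetic data only.
import numpy as np

rng = np.random.default_rng(8)
n = 102
true_fl = rng.uniform(45, 75, n)                       # placeholder femur lengths (mm)
fl_2d = np.column_stack([true_fl + rng.normal(0, 1.2, n) for _ in range(2)])
fl_3d = np.column_stack([true_fl + rng.normal(0, 0.7, n) for _ in range(2)])

def icc_oneway(m):                                     # one-way random-effects ICC(1,1)
    n_subj, k = m.shape
    msb = k * np.sum((m.mean(axis=1) - m.mean()) ** 2) / (n_subj - 1)
    msw = np.sum((m - m.mean(axis=1, keepdims=True)) ** 2) / (n_subj * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

def agreement(m, cutoff=2.0):
    d = m[:, 0] - m[:, 1]
    bias, sd = d.mean(), d.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)         # 95% limits of agreement
    return icc_oneway(m), sd, loa, np.mean(np.abs(d) < cutoff)

print("2D:", agreement(fl_2d))
print("3D:", agreement(fl_3d))                         # higher ICC, narrower LoA
```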