601 results for Estimators


Relevance: 10.00%

Publisher:

Abstract:

We consider robust parametric procedures for univariate discrete distributions, focusing on the negative binomial model. The procedures are based on three steps: first, a very robust, but possibly inefficient, estimate of the model parameters is computed; second, this initial model is used to identify outliers, which are then removed from the sample; third, a corrected maximum likelihood estimator is computed with the remaining observations. The final estimate inherits the breakdown point (bdp) of the initial one, and its efficiency can be significantly higher. Analogous procedures were proposed in [1], [2], [5] for the continuous case. A comparison of the asymptotic bias of various estimates under point contamination points to the minimum Neyman's chi-squared disparity estimate as a good choice for the initial step. Various minimum disparity estimators were explored by Lindsay [4], who showed that the minimum Neyman's chi-squared estimate has a 50% bdp under point contamination; in addition, it is asymptotically fully efficient at the model. However, the finite sample efficiency of this estimate under the uncontaminated negative binomial model is usually much lower than 100% and the bias can be strong. We show that its performance can then be greatly improved using the three-step procedure outlined above. In addition, we compare the final estimate with the procedure described in
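
The following is a minimal Python sketch of the three-step idea described above, assuming a particular form of the Neyman chi-squared disparity, a simple upper-tail cutoff for the outlier step, and a plain (rather than corrected) maximum likelihood refit; none of these choices is taken from the paper itself.

```python
# Hedged sketch of the three-step robust procedure for the negative binomial
# model.  The disparity form, the outlier cutoff and all tuning constants are
# illustrative assumptions, not the paper's exact choices.
import numpy as np
from scipy import optimize, stats


def neyman_chi2(params, data):
    """One common form of Neyman's chi-squared disparity between the
    empirical frequencies and the negative binomial pmf."""
    r, p = params
    if r <= 0 or not (0 < p < 1):
        return np.inf
    values, freq = np.unique(data, return_counts=True)
    d = freq / data.size                       # empirical proportions
    m = stats.nbinom.pmf(values, r, p)         # model probabilities
    return np.sum((d - m) ** 2 / d)


def nb_negloglik(params, data):
    r, p = params
    if r <= 0 or not (0 < p < 1):
        return np.inf
    return -np.sum(stats.nbinom.logpmf(data, r, p))


def three_step_nb(data, alpha=0.01):
    # Step 1: very robust (possibly inefficient) initial estimate.
    init = optimize.minimize(neyman_chi2, x0=[1.0, 0.5], args=(data,),
                             method="Nelder-Mead").x
    # Step 2: flag outliers under the initial model (simple upper-tail rule).
    cutoff = stats.nbinom.ppf(1 - alpha, *init)
    kept = data[data <= cutoff]
    # Step 3: maximum likelihood on the retained observations (the paper uses
    # a corrected MLE accounting for the trimming; plain MLE is used here).
    final = optimize.minimize(nb_negloglik, x0=init, args=(kept,),
                              method="Nelder-Mead").x
    return init, final


rng = np.random.default_rng(0)
sample = np.concatenate([stats.nbinom.rvs(3, 0.4, size=200, random_state=rng),
                         np.full(10, 60)])     # point contamination
print(three_step_nb(sample))
```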

Relevance: 10.00%

Publisher:

Abstract:

We investigate the importance of the labour mobility of inventors, as well as the scale, extent and density of their collaborative research networks, for regional innovation outcomes. To do so, we apply a knowledge production function framework at the regional level and include inventors' networks and their labour mobility as regressors. Our empirical approach takes full account of spatial interactions by estimating a spatial lag model and, where necessary, a spatial error model. In addition, standard errors are calculated using spatial heteroskedasticity and autocorrelation consistent estimators to ensure their robustness in the presence of spatial error autocorrelation and heteroskedasticity of unknown form. Our results point to a robust positive correlation between intraregional labour mobility and regional innovation, whilst the relationship with networks is less clear. However, networking across regions correlates positively with a region's innovation intensity.
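
As a rough illustration of what a spatial lag specification involves, the sketch below simulates y = ρWy + Xβ + ε on a synthetic weight matrix and estimates it by spatial two-stage least squares, instrumenting Wy with [X, WX, W²X]. This is a different estimator from the one used in the paper and does not reproduce its spatially robust standard errors; the weight matrix, data and instrument set are invented for the example.

```python
# Illustrative spatial lag model and spatial 2SLS on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 2
W = rng.random((n, n)) * (rng.random((n, n)) < 0.05)     # sparse random weights
np.fill_diagonal(W, 0.0)
W /= W.sum(axis=1, keepdims=True) + 1e-12                # row-standardise

X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta, rho = np.array([1.0, 0.5, -0.3]), 0.4
y = np.linalg.solve(np.eye(n) - rho * W, X @ beta + rng.normal(size=n))

Z = np.column_stack([X, W @ y])                          # regressors incl. spatial lag
H = np.column_stack([X, W @ X[:, 1:], W @ W @ X[:, 1:]])  # instruments
P = H @ np.linalg.solve(H.T @ H, H.T)                    # projection on instruments
Zhat = P @ Z
theta = np.linalg.solve(Zhat.T @ Z, Zhat.T @ y)          # [beta0, beta1, beta2, rho]
print(theta)
```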

Relevance: 10.00%

Publisher:

Abstract:

This chapter presents possible uses and examples of Monte Carlo methods for the evaluation of uncertainties in the field of radionuclide metrology. The method is already well documented in GUM supplement 1, but here we present a more restrictive approach, where the quantities of interest calculated by the Monte Carlo method are estimators of the expectation and standard deviation of the measurand, and the Monte Carlo method is used to propagate the uncertainties of the input parameters through the measurement model. This approach is illustrated by an example of the activity calibration of a 103Pd source by liquid scintillation counting and the calculation of a linear regression on experimental data points. An electronic supplement presents some algorithms which may be used to generate random numbers with various statistical distributions, for the implementation of this Monte Carlo calculation method.
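
A minimal sketch of the restricted Monte Carlo approach described above, with a made-up measurement model (activity = count rate / efficiency) and made-up input distributions rather than the chapter's ¹⁰³Pd liquid scintillation example:

```python
# Monte Carlo propagation of input uncertainties through a measurement model;
# the model and the input distributions are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Input quantities with assumed distributions (e.g. from calibration data):
count_rate = rng.normal(loc=1250.0, scale=15.0, size=N)   # s^-1
efficiency = rng.uniform(low=0.93, high=0.97, size=N)      # dimensionless

# Measurement model: activity = count rate / efficiency.
activity = count_rate / efficiency

# Monte Carlo estimators of the expectation and standard deviation of the
# measurand, plus a 95 % coverage interval.
print("estimate:", activity.mean())
print("standard uncertainty:", activity.std(ddof=1))
print("95 % interval:", np.percentile(activity, [2.5, 97.5]))
```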

Relevance: 10.00%

Publisher:

Abstract:

In this paper we build a two-region model in which both innovation and imitation are performed. In particular, imitation takes the form of technological spillovers that lagging regions may exploit given certain human capital conditions. We show how the high-skill content of each region's workforce (rather than the average human capital stock) is crucial for determining convergence towards the income level of the leader region and for exploiting the technological spillovers coming from the frontier. The same applies to bureaucratic/institutional quality, which is conducive to higher growth in the long run. We successfully test our theoretical results on Spanish regions for the period between 1960 and 1997, exploiting system GMM estimators, which allow us to deal correctly with endogeneity problems and small sample bias.

Relevance: 10.00%

Publisher:

Abstract:

The author studies random walk estimators for radiosity with generalized absorption probabilities; that is, a path either dies or survives on a patch according to an arbitrary probability. The estimators studied so far, the infinite path length estimator and the finite path length one, can be considered as particular cases. Practical applications of the random walks with generalized probabilities are given. A necessary and sufficient condition for the existence of the variance is given, together with heuristics to be used in practical cases. The optimal probabilities are also found for the case when one is interested in the whole scene, and they are equal to the reflectivities.
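
The sketch below illustrates a gathering random walk estimator with generalized survival probabilities on a synthetic three-patch scene. The weight factor rho/q keeps the estimator unbiased for any survival probability q, and setting q equal to the reflectivities (the optimal choice reported above) makes that factor one. The scene data are invented for illustration and the code is not the estimator studied in the paper in every detail.

```python
# Gathering random walk for radiosity B = E + diag(rho) F B with an arbitrary
# per-patch survival probability q; synthetic 3-patch scene.
import numpy as np

rng = np.random.default_rng(7)

F = np.array([[0.0, 0.6, 0.4],      # form factors, rows sum to 1
              [0.5, 0.0, 0.5],
              [0.4, 0.6, 0.0]])
rho = np.array([0.7, 0.5, 0.3])     # reflectivities
E = np.array([0.0, 1.0, 0.0])       # self-emitted radiosity


def walk_estimate(i, q, n_walks=20000):
    """Average score of gathering walks started at patch i, with survival
    probability q[j] on patch j."""
    total = 0.0
    for _ in range(n_walks):
        score, weight, cur = E[i], 1.0, i
        while rng.random() < q[cur]:          # survive or be absorbed
            weight *= rho[cur] / q[cur]       # correct for q != rho
            cur = rng.choice(3, p=F[cur])     # move to the next patch
            score += weight * E[cur]
        total += score
    return total / n_walks


B_exact = np.linalg.solve(np.eye(3) - np.diag(rho) @ F, E)
print([walk_estimate(i, q=rho) for i in range(3)], B_exact)
```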

Relevance: 10.00%

Publisher:

Abstract:

The directional consistency and skew-symmetry statistics have been proposed as global measurements of social reciprocity. Although both measures can be useful for quantifying social reciprocity, researchers need to know whether these estimators are biased in order to assess descriptive results properly. That is, if estimators are biased, researchers should compare actual values with expected values under the specified null hypothesis. Furthermore, standard errors are needed to enable suitable assessment of discrepancies between actual and expected values. This paper aims to derive some exact and approximate expressions in order to obtain bias and standard error values for both estimators for round-robin designs, although the results can also be extended to other reciprocal designs.

Relevance: 10.00%

Publisher:

Abstract:

Standard Indirect Inference (II) estimators take a given finite-dimensional statistic, Z_n, and estimate the parameters by matching the sample statistic with the model-implied population moment. We propose a novel estimation method that utilizes all available information contained in the distribution of Z_n, not just its first moment. This is done by computing the likelihood of Z_n and then estimating the parameters either by maximizing the likelihood or by computing the posterior mean for a given prior on the parameters. These are referred to as the maximum indirect likelihood (MIL) and Bayesian indirect likelihood (BIL) estimators, respectively. We show that the IL estimators are first-order equivalent to the corresponding moment-based II estimator that employs the optimal weighting matrix. However, due to higher-order features of Z_n, the IL estimators are higher-order efficient relative to the standard II estimator. The likelihood of Z_n will in general be unknown, and so simulated versions of the IL estimators are developed. Monte Carlo results for a structural auction model and a DSGE model show that the proposed estimators indeed have attractive finite sample properties.
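
To make the simulated indirect likelihood idea concrete, the sketch below approximates the density of an auxiliary statistic Z_n by a kernel density estimate over statistics computed from simulated samples and maximizes it over a parameter grid. The gamma model, the choice of statistic and the grid search are illustrative assumptions, not the auction or DSGE applications of the paper.

```python
# Simulated maximum indirect likelihood (MIL) sketch: approximate the density
# of Z_n given theta by a KDE of simulated statistics, then maximize in theta.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
n = 200
data = rng.gamma(shape=2.5, scale=1.0, size=n)           # observed sample
Z_obs = np.array([data.mean(), data.std(ddof=1)])        # auxiliary statistic


def simulated_indirect_loglik(theta, n_sim=500):
    """Log of the simulated density of Z_n evaluated at the observed statistic."""
    sims = rng.gamma(shape=theta, scale=1.0, size=(n_sim, n))
    Z_sim = np.column_stack([sims.mean(axis=1), sims.std(axis=1, ddof=1)])
    kde = gaussian_kde(Z_sim.T)                          # density of Z_n | theta
    return np.log(kde(Z_obs)[0] + 1e-300)


grid = np.linspace(1.5, 3.5, 41)
mil = grid[np.argmax([simulated_indirect_loglik(t) for t in grid])]
print("MIL estimate of the shape parameter:", mil)
```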

Relevance: 10.00%

Publisher:

Abstract:

The topic of this thesis is the simulation of a combination of several control and data assimilation methods, meant to be used for controlling the quality of paper in a paper machine. Paper making is a very complex process and the information obtained from the web is sparse: a paper web scanner can only measure a zigzag path on the web. An assimilation method is needed to produce estimates of the Machine Direction (MD) and Cross Direction (CD) profiles of the web, and quality control is based on these measurements. There is an increasing need for intelligent methods to assist in data assimilation, and the target of this thesis is to study how such intelligent assimilation methods affect paper web quality. This work is based on a paper web simulator developed in the TEKES-funded MASI NoTes project; the simulator is a valuable tool for comparing different assimilation methods. The thesis contains a comparison of four data assimilation methods: a first-order Bayesian model estimator, a higher-order Bayesian estimator based on an ARMA model, a Fourier-transform-based Kalman filter estimator, and a simple block estimator. The last one can be considered close to current operational methods. Of these methods, the Bayesian, ARMA and Kalman estimators all seem to have advantages over the commercial one, and the Kalman and ARMA estimators seem to be the best in overall performance.
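
As an illustration of one of the methods compared (a Kalman filter estimator), the sketch below tracks a cross-direction profile from single-bin zigzag scanner measurements under a random-walk state model. The profile shape, noise levels and scan pattern are invented and do not come from the MASI NoTes simulator.

```python
# Kalman filter estimating a full CD profile when each scan instant observes
# only one CD bin; all model parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)
n_bins, n_steps = 30, 600
true_profile = 5.0 + np.sin(np.linspace(0, 2 * np.pi, n_bins))

# Random-walk state model: x_k = x_{k-1} + w_k,  measurement: z_k = x_k[j_k] + v_k
Q, R = 1e-4 * np.eye(n_bins), 0.05
x_hat = np.full(n_bins, 5.0)             # initial profile estimate
P = np.eye(n_bins)                       # initial estimate covariance

for k in range(n_steps):
    j = k % (2 * n_bins - 2)             # zigzag scan position
    j = j if j < n_bins else 2 * n_bins - 2 - j
    z = true_profile[j] + rng.normal(scale=np.sqrt(R))

    # Predict (state transition is the identity for a random walk).
    P = P + Q
    # Update with the single observed bin.
    H = np.zeros((1, n_bins))
    H[0, j] = 1.0
    S = H @ P @ H.T + R
    K = P @ H.T / S
    x_hat = x_hat + (K * (z - x_hat[j])).ravel()
    P = (np.eye(n_bins) - K @ H) @ P

print("max CD profile error:", np.abs(x_hat - true_profile).max())
```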

Relevance: 10.00%

Publisher:

Abstract:

One of the targets of the climate and energy package of the European Union is to increase energy efficiency in order to achieve a 20 percent reduction in primary energy use compared with the projected level by 2020. Energy efficiency can be improved, for example, by increasing the rotational speed of large electrical drives, because this enables the elimination of gearboxes, leading to a compact design with lower losses. The rotational speeds of traditional bearings, such as roller bearings, are limited by mechanical friction. Active magnetic bearings (AMBs), on the other hand, allow very high rotational speeds. Consequently, their use in large medium- and high-speed machines has rapidly increased. An active magnetic bearing rotor system is an inherently unstable, nonlinear multiple-input, multiple-output system. Model-based controller design for AMBs requires an accurate system model. Finite element modeling (FEM) together with experimental modal analysis provides a very accurate model for the rotor, and a linearized model of the magnetic actuators has proven to work well in normal conditions. However, the overall system may suffer from unmodeled dynamics, such as the dynamics of the foundation or of shrink fits. Such dynamics can be modeled by system identification, which can also be used for on-line diagnostics. In this study, broadband excitation signals are applied to the identification of an active magnetic bearing rotor system. Broadband excitation enables faster frequency response function measurements compared with the widely used stepped sine and swept sine excitations. Different broadband excitations are reviewed, and the random phase multisine excitation is chosen for further study. The measurement times using the multisine excitation and the stepped sine excitation are compared. An excitation signal design with an analysis of the harmonics produced by the nonlinear system is presented. The suitability of different frequency response function estimators for an AMB rotor system is also compared. Additionally, analytical modeling of an AMB rotor system, obtaining a parametric model from the nonparametric frequency response functions, and model updating are discussed in brief, as they are key elements in modeling for control design. The theoretical methods are tested with a laboratory test rig. The results show that an appropriately designed random phase multisine excitation is suitable for the identification of AMB rotor systems.
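
A hedged sketch of two of the ingredients mentioned above, a random phase multisine excitation and an H1 frequency response function estimator, applied to an invented second-order system rather than the thesis's AMB rotor test rig:

```python
# Random phase multisine generation plus an H1 FRF estimate (cross-spectrum
# over input auto-spectrum); the test system and parameters are assumptions.
import numpy as np
from scipy import signal

rng = np.random.default_rng(11)
fs, N = 1000.0, 4096
lines = np.arange(1, N // 2)                    # excited frequency lines
phases = rng.uniform(0, 2 * np.pi, size=lines.size)

# Build the multisine from its spectrum: unit amplitudes, random phases.
U = np.zeros(N, dtype=complex)
U[lines] = np.exp(1j * phases)
u = N * np.real(np.fft.ifft(U))
t = np.arange(N) / fs

# A lightly damped second-order system standing in for one measured channel.
sys = signal.TransferFunction([4.0e4], [1.0, 20.0, 4.0e4])
_, y, _ = signal.lsim(sys, U=u, T=t)
y += 0.01 * rng.normal(size=N)                  # measurement noise

# H1 frequency response function estimator: Syu / Suu.
f, Suu = signal.welch(u, fs=fs, nperseg=1024)
_, Syu = signal.csd(u, y, fs=fs, nperseg=1024)
H1 = Syu / Suu
print("resonance estimate:", f[np.argmax(np.abs(H1))], "Hz")
```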

Relevance: 10.00%

Publisher:

Abstract:

Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility to use combined longitudinal survey-register data: the Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey-register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data. Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms, nor the classical assumptions about measurement errors, turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to cause bias in estimates from event history models. Low measurement accuracy affected the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest and weighted by the last-wave weights displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to the design weights reduces bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, researchers using surveys for event history analysis, and researchers who develop methods to correct for non-sampling biases in event history data.
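
The sketch below illustrates the IPCW idea in a simplified form: estimate the censoring (attrition) distribution with a Kaplan-Meier fit, invert it to form weights, and pass the weights to a weighted Kaplan-Meier estimator. The simulated spells, the evaluation of the censoring survival at the observed time, and the use of the lifelines package are assumptions made for illustration, not the study's actual data or code.

```python
# Simplified IPCW-weighted Kaplan-Meier on simulated unemployment spells.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(8)
n = 1000
true_spell = rng.exponential(scale=10.0, size=n)      # spell lengths (months)
dropout = rng.exponential(scale=15.0, size=n)         # attrition/censoring times
T = np.minimum(true_spell, dropout)                   # observed durations
E = (true_spell <= dropout).astype(int)               # 1 = spell end observed

# Kaplan-Meier estimate of the censoring distribution G (event = attrition).
G_fit = KaplanMeierFitter().fit(T, 1 - E)
G_at_T = G_fit.survival_function_at_times(T).to_numpy()
ipcw = 1.0 / np.clip(G_at_T, 0.05, None)              # truncate extreme weights

# Weighted Kaplan-Meier for the spell distribution using the IPCW weights.
km_ipcw = KaplanMeierFitter().fit(T, E, weights=ipcw)
print(km_ipcw.median_survival_time_)
```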

Relevance: 10.00%

Publisher:

Abstract:

Even though frequency analysis of body sway is widely applied in clinical studies, the lack of standardized procedures concerning power spectrum estimation may yield unreliable descriptors. Stabilometric tests were applied to 35 subjects (20-51 years, 54-95 kg, 1.6-1.9 m) and the power spectral density function was estimated for the anterior-posterior center of pressure time series. The median frequency was compared between power spectra estimated according to signal partitioning, sampling rate, test duration, and detrending method. The reliability of the median frequency for different test durations was assessed using the intraclass correlation coefficient. When increasing the number of segments, shortening the test duration or applying linear detrending, the median frequency values increased significantly, by up to 137%. Even the shortest test duration provided reliable estimates, as indicated by the intraclass correlation coefficient (0.74-0.89 confidence interval for a single 20-s test). Clinical assessment of balance may benefit from a standardized protocol for center of pressure spectral analysis that provides an adequate trade-off between resolution and variance. An algorithm to estimate the center of pressure power spectral density is also proposed.
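
As an example of the kind of standardized computation discussed above, the sketch below estimates the power spectral density of a synthetic centre of pressure signal with Welch's method after linear detrending and computes the median frequency. The signal, sampling rate and 20-s segment length are assumptions, not the protocol proposed in the paper.

```python
# Welch PSD and median frequency of a synthetic anterior-posterior COP signal.
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
fs, duration = 100.0, 60.0                       # Hz, seconds
t = np.arange(0, duration, 1 / fs)
# Synthetic COP: slow drift + low-frequency sway + noise.
cop = 0.02 * t + 0.5 * np.sin(2 * np.pi * 0.3 * t) + 0.1 * rng.normal(size=t.size)

cop = signal.detrend(cop, type="linear")         # detrending choice matters (see text)
f, pxx = signal.welch(cop, fs=fs, nperseg=int(20 * fs))   # 20-s segments

cum = np.cumsum(pxx)
median_freq = f[np.searchsorted(cum, 0.5 * cum[-1])]
print("median frequency:", median_freq, "Hz")
```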

Relevance: 10.00%

Publisher:

Abstract:

The infinitesimal differential quantum Monte Carlo (QMC) technique is used to estimate electrostatic polarizabilities of the H and He atoms up to the sixth order in the electric field perturbation. All 542 different QMC estimators of the nonzero atomic polarizabilities are derived and used in order to decrease the statistical error and to obtain the maximum efficiency of the simulations. We are confident that the estimates are "exact" (free of systematic error): the two atoms are nodeless systems, hence no fixed-node error is introduced. Furthermore, we develop and use techniques which eliminate the systematic error inherent in extrapolating our results to zero time-step and large stack-size. The QMC results are consistent with published accurate values obtained using perturbation methods. The precision is found to be related to the number of perturbations, varying from 2 to 4 significant digits.

Relevance: 10.00%

Publisher:

Abstract:

This paper studies seemingly unrelated linear models with integrated regressors and stationary errors. By adding leads and lags of the first differences of the regressors and estimating this augmented dynamic regression model by feasible generalized least squares using the long-run covariance matrix, we obtain an efficient estimator of the cointegrating vector that has a limiting mixed normal distribution. Simulation results suggest that this new estimator compares favorably with others already proposed in the literature. We apply these new estimators to the testing of purchasing power parity (PPP) among the G-7 countries. The test based on the efficient estimates rejects the PPP hypothesis for most countries.
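
To illustrate the leads-and-lags augmentation described above in its simplest single-equation form, the sketch below regresses y_t on x_t plus leads and lags of Δx_t on synthetic data; plain OLS stands in for the paper's feasible GLS with a long-run covariance matrix, and the data-generating process is invented.

```python
# Leads-and-lags (dynamic OLS style) estimation of a cointegrating coefficient.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
T, p = 400, 2                                   # sample size, leads/lags
x = np.cumsum(rng.normal(size=T))               # integrated regressor
u = rng.normal(size=T) + 0.5 * np.r_[0.0, np.diff(x)]   # errors correlated with dx
y = 1.0 + 2.0 * x + u                           # cointegrating relation, beta = 2

dx = np.diff(x, prepend=x[0])
lead_lag = np.column_stack([np.roll(dx, -k) for k in range(-p, p + 1)])
keep = slice(p, T - p)                          # drop ends lost to leads/lags

X = sm.add_constant(np.column_stack([x[keep], lead_lag[keep]]))
beta_hat = sm.OLS(y[keep], X).fit().params[1]   # coefficient on x_t
print("estimated cointegrating coefficient:", beta_hat)
```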

Relevance: 10.00%

Publisher:

Abstract:

Several authors have recently discussed the limited dependent variable regression model with serial correlation between residuals. The pseudo-maximum likelihood estimators obtained by ignoring serial correlation altogether have been shown to be consistent. We present alternative pseudo-maximum likelihood estimators which are obtained by ignoring serial correlation only selectively. Monte Carlo experiments on a model with first-order serial correlation suggest that our alternative estimators have substantially lower mean squared errors in medium-sized and small samples, especially when the serial correlation coefficient is high. The same experiments also suggest that the true level of the confidence intervals established with our estimators by assuming asymptotic normality is somewhat lower than the intended level. Although the paper focuses on models with only first-order serial correlation, the generalization of the proposed approach to serial correlation of higher order is also discussed briefly.

Relevance: 10.00%

Publisher:

Abstract:

In a recent paper, Bai and Perron (1998) considered theoretical issues related to the limiting distribution of estimators and test statistics in the linear model with multiple structural changes. In this companion paper, we consider practical issues arising in empirical applications of the procedures. We first address the problem of estimating the break dates and present an efficient algorithm to obtain global minimizers of the sum of squared residuals. This algorithm is based on the principle of dynamic programming and requires at most O(T²) least-squares operations for any number of breaks. Our method can be applied to both pure and partial structural-change models. Second, we consider the problem of forming confidence intervals for the break dates under various hypotheses about the structure of the data and the errors across segments. Third, we address the issue of testing for structural changes under very general conditions on the data and the errors. Fourth, we address the issue of estimating the number of breaks. We present simulation results pertaining to the behavior of the estimators and tests in finite samples. Finally, a few empirical applications are presented to illustrate the usefulness of the procedures. All methods discussed are implemented in a GAUSS program available upon request for non-profit academic use.
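
The following is a small sketch of the dynamic-programming principle behind the break-date algorithm, applied to a pure mean-shift model on synthetic data: segment sums of squared residuals are precomputed and combined with a Bellman recursion to find the partition minimizing the total SSR. It illustrates the principle only and is not the authors' GAUSS implementation.

```python
# Dynamic programming for break-date estimation in a mean-shift model.
import numpy as np

rng = np.random.default_rng(6)
y = np.concatenate([rng.normal(0.0, 1, 80),
                    rng.normal(2.0, 1, 70),
                    rng.normal(-1.0, 1, 90)])
T, m, h = y.size, 2, 15                       # observations, breaks, min segment

# SSR of fitting a constant on segment y[i..j] (inclusive), for all i <= j.
ssr = np.full((T, T), np.inf)
for i in range(T):
    seg = y[i:]
    cum, cum2 = np.cumsum(seg), np.cumsum(seg ** 2)
    n = np.arange(1, seg.size + 1)
    ssr[i, i:] = cum2 - cum ** 2 / n          # sum of (y - segment mean)^2

# Bellman recursion: cost[k, t] = min SSR of splitting y[0..t] into k+1 segments.
cost = np.full((m + 1, T), np.inf)
last_break = np.zeros((m + 1, T), dtype=int)
cost[0] = ssr[0]
for k in range(1, m + 1):
    for t in range((k + 1) * h - 1, T):
        cand = cost[k - 1, k * h - 1:t - h + 1] + ssr[k * h:t - h + 2, t]
        j = np.argmin(cand)
        cost[k, t] = cand[j]
        last_break[k, t] = k * h - 1 + j      # index of the last break point

# Back out the estimated break dates.
breaks, t = [], T - 1
for k in range(m, 0, -1):
    t = last_break[k, t]
    breaks.append(t)
print("estimated break dates:", sorted(breaks))
```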