80 results for grouping estimators

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance: 60.00%

Abstract:

Reductions in firing costs are often advocated as a way of increasing the dynamism of labour markets in both developed and less developed countries. Evidence from Europe and the U.S. on the impact of firing costs has, however, been mixed. Moreover, legislative changes both in Europe and the U.S. have been limited. This paper, instead, examines the impact of the Colombian Labour Market Reform of 1990, which substantially reduced dismissal costs. I estimate the incidence of a reduction in firing costs on worker turnover by exploiting the temporal change in the Colombian labour legislation as well as the variability in coverage between formal and informal sector workers. Using a grouping estimator to control for common aggregate shocks and selection, I find that the exit hazard rates into and out of unemployment increased after the reform by over 1% for formal workers (covered by the legislation) relative to informal workers (uncovered). The increase of the hazards implies a net decrease in unemployment of a third of a percentage point, which accounts for about one quarter of the fall in unemployment during the period of study.
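The grouping estimator described here is, in essence, a difference-in-differences comparison of exit hazards between covered (formal) and uncovered (informal) workers before and after the reform. A minimal sketch of that comparison follows; the two-group, two-period setup and the column names are illustrative assumptions, not the paper's actual specification.

```python
import pandas as pd

# Illustrative difference-in-differences on exit hazards, assuming a
# person-period DataFrame with hypothetical columns:
#   'formal' - 1 if the worker is in the formal (covered) sector
#   'post'   - 1 if the observation is after the 1990 reform
#   'exit'   - 1 if the worker exits the current state this period
def did_hazard(df: pd.DataFrame) -> float:
    """Difference-in-differences of empirical exit hazards."""
    h = df.groupby(['formal', 'post'])['exit'].mean()
    return (h.loc[(1, 1)] - h.loc[(1, 0)]) - (h.loc[(0, 1)] - h.loc[(0, 0)])
```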

Relevance: 20.00%

Abstract:

The Hausman (1978) test is based on the vector of differences of two estimators. It is usually assumed that one of the estimators is fully efficient, since this simplifies calculation of the test statistic. However, this assumption limits the applicability of the test, since widely used estimators such as the generalized method of moments (GMM) or quasi maximum likelihood (QML) are often not fully efficient. This paper shows that the test may easily be implemented, using well-known methods, when neither estimator is efficient. To illustrate, we present both simulation results and empirical results on the utilization of health care services.
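When neither estimator is efficient, the textbook simplification Var(θ̂₁ − θ̂₂) = V₂ − V₁ no longer holds, and the full covariance between the two estimators must be estimated. A minimal sketch of the resulting statistic, assuming that covariance is supplied from elsewhere (e.g., a paired bootstrap); this is an illustration of the generalized form, not the paper's specific procedure:

```python
import numpy as np
from scipy import stats

def hausman(theta1, theta2, V1, V2, C12):
    """Generalized Hausman statistic when neither estimator is efficient.

    theta1, theta2 : parameter vectors from the two estimators
    V1, V2         : their covariance matrices
    C12            : Cov(theta1, theta2), which must be estimated
                     (e.g., by a paired bootstrap) -- the key departure
                     from the textbook version.
    """
    d = np.asarray(theta1) - np.asarray(theta2)
    Vd = V1 + V2 - C12 - C12.T          # full variance of the difference
    stat = d @ np.linalg.solve(Vd, d)   # quadratic form d' Vd^{-1} d
    pval = stats.chi2.sf(stat, df=len(d))
    return stat, pval
```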

Relevance: 20.00%

Abstract:

We introduce simple nonparametric density estimators that generalize the classical histogram and frequency polygon. The new estimators are expressed as linear combinations of density functions that are piecewise polynomials, where the coefficients are optimally chosen to minimize the integrated squared error of the estimator. We establish the asymptotic behaviour of the proposed estimators and assess their performance in a simulation study.
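For reference, the two classical special cases being generalized are the piecewise-constant histogram and the piecewise-linear frequency polygon. A hedged sketch of the general form, in notation of my own choosing rather than the paper's:

$$\hat f(x) \;=\; \sum_{j=1}^{k} c_j\,\phi_j(x), \qquad c \;=\; \arg\min_{c}\int \bigl(\hat f(x) - f(x)\bigr)^{2}\,dx,$$

where each $\phi_j$ is a piecewise-polynomial density and the cross term in the integrated squared error, which involves the unknown $f$, is estimated from the sample (as in least-squares cross-validation).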

Relevance: 20.00%

Abstract:

We compare a set of empirical Bayes and composite estimators of the population means of the districts (small areas) of a country, and show that the natural modelling strategy of searching for a well-fitting empirical Bayes model and then using it to estimate the area-level means can be inefficient.

Relevance: 20.00%

Abstract:

In this article we propose using small area estimators to improve the estimates of both the small and large area parameters. When the objective is to estimate parameters at both levels accurately, optimality is achieved by a mixed sample design of fixed and proportional allocations. In the mixed sample design, once a sample size has been determined, one fraction of it is distributed proportionally among the different small areas while the rest is evenly distributed among them. We use Monte Carlo simulations to assess the performance of the direct estimator and two composite covariate-free small area estimators, for different sample sizes and different sample distributions. Performance is measured in terms of the Mean Squared Error (MSE) of both small and large area parameters. We find that adopting small area composite estimators opens the possibility of (1) reducing the sample size for a given precision, or (2) improving precision for a given sample size.
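A small worked example of the mixed allocation rule may help: a fraction of the total sample size is allocated proportionally to area population sizes and the remainder in equal shares. The function below is an illustrative sketch with parameter names of my own choosing:

```python
import numpy as np

def mixed_allocation(n, pop_sizes, lam):
    """Split sample size n among areas: a fraction `lam` proportionally
    to population sizes, the remaining (1 - lam) in equal shares."""
    pop = np.asarray(pop_sizes, dtype=float)
    prop = lam * n * pop / pop.sum()        # proportional component
    even = (1 - lam) * n / len(pop)         # equal component
    return prop + even

# Example: n = 1000 over areas of sizes 5000, 3000, 2000 with lam = 0.5
# -> proportional part (250, 150, 100) plus an equal part of 500/3 each.
```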

Relevance: 20.00%

Abstract:

Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator, with weights that usually depend on unknown parameters which need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is at face value more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population based on a labor force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: (a) those whose weights involve area-specific estimates of bias and variance; and (b) those whose weights involve a common variance and a common squared-bias estimate for all the areas. We assess their precision and discuss alternatives for optimizing composite estimation in applications.
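In the notation of this description (symbols mine), a composite estimator for area $i$ takes the form

$$\hat\theta_i^{\,C} \;=\; w_i\,\hat\theta_i^{\,\mathrm{dir}} + (1 - w_i)\,\hat\theta_i^{\,\mathrm{ind}},$$

and, under the common simplification that the direct estimator is unbiased with variance $V_i$ while the indirect estimator has bias $B_i$ and negligible variance, the MSE-minimizing weight is

$$w_i \;=\; \frac{B_i^{2}}{B_i^{2} + V_i}.$$

Estimators of type (a) plug in area-specific estimates $\hat B_i^{2}$ and $\hat V_i$; those of type (b) replace them with a common $\hat B^{2}$ and $\hat V$ for all areas.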

Relevance: 20.00%

Abstract:

This paper investigates the comparative performance of five small area estimators. We use Monte Carlo simulation in the context of both theoretical and empirical populations. In addition to the direct and indirect estimators, we consider the optimal composite estimator with population weights, and two composite estimators with estimated weights: one that assumes homogeneity of the within-area variance and squared bias, and another that uses area-specific estimates of variance and squared bias. We find that, among the feasible estimators, the best choice is the one that uses area-specific estimates of variance and squared bias.
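A minimal sketch of the feasible estimator favoured here, combining direct and indirect estimates with area-specific plug-in weights (all names are illustrative, and the weight formula is the standard MSE-minimizing one under the simplifications noted above):

```python
import numpy as np

def composite(direct, indirect, var_hat, bias2_hat):
    """Composite small-area estimates with area-specific weights.

    Weight on the direct estimator: w_i = B_i^2 / (B_i^2 + V_i),
    obtained by minimizing the MSE of the combination when the direct
    estimator is unbiased and the indirect one has squared bias B_i^2.
    """
    w = np.asarray(bias2_hat) / (np.asarray(bias2_hat) + np.asarray(var_hat))
    return w * np.asarray(direct) + (1 - w) * np.asarray(indirect)
```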

Relevance: 20.00%

Abstract:

We conduct a large-scale comparative study on linearly combining superparent-one-dependence estimators (SPODEs), a popular family of semi-naive Bayesian classifiers. Altogether, 16 model selection and weighing schemes, 58 benchmark data sets, and various statistical tests are employed. This paper's main contributions are threefold. First, it formally presents each scheme's definition, rationale, and time complexity, and hence can serve as a comprehensive reference for researchers interested in ensemble learning. Second, it offers a bias-variance analysis of each scheme's classification error performance. Third, it identifies effective schemes that meet various needs in practice, leading to accurate and fast classification algorithms with an immediate and significant impact on real-world applications. Another important feature of our study is the use of a variety of statistical tests to evaluate multiple learning methods across multiple data sets.
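A SPODE conditions every other attribute on the class and a single shared "superparent" attribute, and an ensemble linearly combines the resulting joint estimates. Below is a compact illustrative sketch of the simplest scheme, a uniform average over superparents, for discrete attributes; it is not any of the paper's 16 tuned schemes, and the crude add-alpha smoothing is my own shortcut.

```python
import numpy as np
from collections import defaultdict

class SPODEEnsemble:
    """Uniformly averaged SPODEs over discrete attributes (sketch).

    Each SPODE with superparent p models
        P(y, x) = P(y, x_p) * prod_{j != p} P(x_j | y, x_p),
    and the ensemble averages these joint probabilities over p.
    """

    def fit(self, X, y, alpha=1.0):
        X, y = np.asarray(X), np.asarray(y)
        self.classes_, self.n_feats_, self.n_ = np.unique(y), X.shape[1], len(y)
        self.alpha_ = alpha
        self.joint_ = defaultdict(float)   # counts of (p, y, x_p)
        self.cond_ = defaultdict(float)    # counts of (p, y, x_p, j, x_j)
        for xi, yi in zip(X, y):
            for p in range(self.n_feats_):
                self.joint_[(p, yi, xi[p])] += 1
                for j in range(self.n_feats_):
                    if j != p:
                        self.cond_[(p, yi, xi[p], j, xi[j])] += 1
        return self

    def predict_one(self, x):
        scores = []
        for c in self.classes_:
            s = 0.0
            for p in range(self.n_feats_):
                # crude add-alpha smoothing (ignores category counts, for brevity)
                base = self.joint_[(p, c, x[p])] + self.alpha_
                prob = base / (self.n_ + self.alpha_)
                for j in range(self.n_feats_):
                    if j != p:
                        prob *= (self.cond_[(p, c, x[p], j, x[j])] + self.alpha_) / base
                s += prob  # uniform linear combination over superparents p
            scores.append(s)
        return self.classes_[int(np.argmax(scores))]
```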

Relevance: 20.00%

Abstract:

This paper is concerned with the derivation of new estimators and performance bounds for the problem of timing estimation of (linearly) digitally modulated signals. The conditional maximum likelihood (CML) method is adopted, in contrast to the classical low-SNR unconditional ML (UML) formulation that is systematically applied in the literature for the derivation of non-data-aided (NDA) timing-error detectors (TEDs). A new CML TED is derived and proved to be self-noise free, in contrast to the conventional low-SNR-UML TED. In addition, the paper provides a derivation of the conditional Cramér–Rao bound (CRB), which is higher (less optimistic) than the modified CRB (MCRB) [which is only reached by decision-directed (DD) methods]. It is shown that the conditional CRB is a lower bound on the asymptotic statistical accuracy of the set of consistent estimators that are quadratic with respect to the received signal. Although the obtained bound is not general, it applies to most NDA synchronizers proposed in the literature. A closed-form expression of the conditional CRB is obtained, and numerical results confirm that the CML TED attains the new bound for moderate to high Eg/N0.
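For readers unfamiliar with the conditional (deterministic-symbol) formulation: writing the received vector as $\mathbf{r} = \mathbf{A}(\tau)\,\mathbf{d} + \mathbf{n}$, with $\mathbf{d}$ the unknown symbol vector and $\mathbf{A}(\tau)$ the timing-dependent modulation matrix (notation mine, not the paper's), concentrating the Gaussian likelihood over $\mathbf{d}$ yields the standard CML timing criterion

$$\hat\tau_{\mathrm{CML}} \;=\; \arg\max_{\tau}\; \mathbf{r}^{H}\mathbf{A}(\tau)\bigl(\mathbf{A}^{H}(\tau)\mathbf{A}(\tau)\bigr)^{-1}\mathbf{A}^{H}(\tau)\,\mathbf{r},$$

i.e., the timing value that maximizes the energy of the projection of $\mathbf{r}$ onto the signal subspace; a TED can then be built from the gradient of such a criterion.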

Relevance: 20.00%

Abstract:

In this work we study whether the intrinsic value of Tubacex over 1994-2013 coincides with its long-term stock market trend, drawing on part of the theory advanced by Shiller. We also examine the possible undervaluation of the Tubacex share as of 31/12/13. In the first part we explain the main company valuation methods, and in the second part we analyse the sector in which Tubacex operates (stainless steel) and calculate the value of the Tubacex share using three valuation methods (Free Cash Flow, Cash Flow, and Book Value). We apply these three valuation methods to verify whether at least one of them coincides with the long-term stock market trend.
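Of the three methods named, the discounted Free Cash Flow approach is the most involved; a minimal sketch of the standard present-value formula it rests on (function and parameter names are mine, and the figures are not those used for Tubacex):

```python
def dcf_value(fcf_forecasts, wacc, terminal_growth):
    """Enterprise value as the present value of forecast free cash flows
    plus a Gordon-growth terminal value (requires wacc > terminal_growth)."""
    pv = sum(fcf / (1 + wacc) ** t
             for t, fcf in enumerate(fcf_forecasts, start=1))
    terminal = fcf_forecasts[-1] * (1 + terminal_growth) / (wacc - terminal_growth)
    return pv + terminal / (1 + wacc) ** len(fcf_forecasts)
```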

Relevance: 20.00%

Abstract:

This paper studies the incidence and consequences of the mismatch between formal education and the educational requirements of jobs in Estonia during the years 1997-2003. We find large wage penalties associated with educational mismatch. Moreover, the incidence and wage penalty of mismatches increase with age. This suggests that structural educational mismatches can occur after fast transition periods. Our results are robust across various methodologies and, more importantly, to departures from the exogeneity assumptions inherent in the matching estimators used in our analysis.
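The abstract does not spell out the matching procedure; purely for orientation, a minimal one-to-one nearest-neighbour matching estimator of the average effect on the "treated" (here, mismatched) group might look as follows, with all variable names assumed:

```python
import numpy as np

def nn_match_att(y, treated, X):
    """Average treated-minus-matched-control outcome difference,
    matching each treated unit to its nearest control in covariate space."""
    y, treated, X = map(np.asarray, (y, treated, X))
    Xt, Xc = X[treated == 1], X[treated == 0]
    yt, yc = y[treated == 1], y[treated == 0]
    effects = []
    for xi, yi in zip(Xt, yt):
        j = np.argmin(((Xc - xi) ** 2).sum(axis=1))  # nearest control
        effects.append(yi - yc[j])
    return float(np.mean(effects))
```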

Relevance: 10.00%

Abstract:

The primary purpose of this exploratory empirical study is to examine the structural stability of a limited number of alternative explanatory factors of strategic change. On the basis of theoretical arguments and prior empirical evidence from two traditional perspectives, we propose an original empirical framework to analyse whether these potential explanatory factors have remained stable over time in a highly turbulent environment. This question is explored in a particular setting: the population of Spanish private banks. The firms in this industry have experienced a high level of strategic mobility as a consequence of fundamental changes in their environmental conditions over the last two decades (mainly changes related to the new banking and financial regulation process). Our results consistently support the view that the effect of most of the explanatory factors of strategic mobility considered did not remain stable over the whole period of analysis. From this point of view, the study sheds new light on major debates and dilemmas in the field of strategy regarding why firms change their competitive patterns over time and, hence, to what extent the "context-dependency" of alternative views of strategic change, as well as their relative validity, can vary over time for a given population. Methodologically, this research makes two major contributions to the study of potential determinants of strategic change. First, it defines and measures strategic change using a new grouping method, the Model-based Cluster Method (MCLUST). Second, in order to assess the possible effect of determinants of strategic mobility, we control for non-observable heterogeneity using logistic regression models for panel data.
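MCLUST fits Gaussian mixture models and selects among them by BIC. A rough Python analogue of that selection step, using scikit-learn (illustrative only, not the authors' R code, and varying only the number of components for brevity):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def mclust_like(X, max_components=9, random_state=0):
    """Fit Gaussian mixtures with 1..max_components components and
    keep the one with the lowest BIC, as model-based clustering does."""
    X = np.asarray(X)
    best = min(
        (GaussianMixture(n_components=k, covariance_type='full',
                         random_state=random_state).fit(X)
         for k in range(1, max_components + 1)),
        key=lambda gm: gm.bic(X),
    )
    return best, best.predict(X)  # fitted model and cluster labels
```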

Relevance: 10.00%

Abstract:

Ever since the appearance of the ARCH model [Engle (1982a)], an impressive array of variance specifications belonging to the same class of models has emerged [e.g. Bollerslev's (1986) GARCH; Nelson's (1990) EGARCH]. This line of research has developed very successfully. Nevertheless, several empirical studies seem to show that the performance of such models is not always appropriate [Boulier (1992)]. In this paper we propose a new specification: the Quadratic Moving Average Conditional Heteroskedasticity (QMACH) model. Its statistical properties, such as kurtosis and symmetry, as well as two estimators (Method of Moments and Maximum Likelihood), are studied. Two statistical tests are presented: the first tests for homoskedasticity, and the second discriminates between the ARCH and QMACH specifications. A Monte Carlo study is presented in order to illustrate some of the theoretical results. An empirical study is undertaken for the DM-US exchange rate.
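For context, the cited benchmark specifications (the QMACH specification itself is not reproduced here): with errors $\varepsilon_t = z_t\sqrt{h_t}$ and $z_t$ i.i.d. with zero mean and unit variance, Engle's ARCH(q) and Bollerslev's GARCH(p, q) conditional variances are

$$h_t = \omega + \sum_{i=1}^{q}\alpha_i\,\varepsilon_{t-i}^{2} \quad\text{(ARCH)}, \qquad h_t = \omega + \sum_{i=1}^{q}\alpha_i\,\varepsilon_{t-i}^{2} + \sum_{j=1}^{p}\beta_j\,h_{t-j} \quad\text{(GARCH)}.$$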

Relevance: 10.00%

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
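A minimal sketch of the key step, estimating a conditional moment at the observed conditioning points by Nadaraya-Watson kernel regression on one long simulated path (variable names are mine):

```python
import numpy as np

def kernel_conditional_mean(x_obs, x_sim, y_sim, bandwidth):
    """Nadaraya-Watson estimate of E[y | x = x_obs] from simulated data.

    x_sim, y_sim come from a long simulation of the model at a trial
    parameter value; x_obs are the conditioning points in the real data."""
    x_obs = np.atleast_1d(x_obs)[:, None]            # shape (n_obs, 1)
    u = (x_obs - np.atleast_1d(x_sim)[None, :]) / bandwidth
    w = np.exp(-0.5 * u ** 2)                        # Gaussian kernel weights
    return (w * y_sim).sum(axis=1) / w.sum(axis=1)
```

Moment conditions are then formed as differences between the data's conditional moments and these simulated counterparts, and plugged into a standard method-of-moments objective at each trial parameter value.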