953 results for quasi-likelihood
Abstract:
Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that, if the data follow the model, asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as that of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performance of the WML based on each of them is studied. In a wide variety of situations, the WML substantially improves on the initial estimator, both in terms of finite-sample mean squared error and in terms of bias under contamination. Moreover, the performance of the WML remains rather stable when the MDE is changed, even though the MDEs themselves behave very differently. Two applications of the WML to real data are considered. In both, the need for a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. The procedure is particularly natural in the discrete-distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
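The two-phase procedure lends itself to a compact illustration. The following is a minimal Python sketch, assuming a Poisson model, a minimum Hellinger distance fit standing in for the initial MDE, and a hard-rejection weight rule with an illustrative cutoff; it does not reproduce the paper's adaptive weighting scheme.

```python
# Minimal sketch of the two-phase robust estimator for a Poisson model.
# The disparity, the weight rule, and the cutoff are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

def hellinger_estimate(x):
    """Phase 1: minimum Hellinger distance estimate (one choice of MDE)."""
    values, counts = np.unique(x, return_counts=True)
    emp = counts / counts.sum()
    def hd(lam):
        # squared Hellinger distance, restricted to the observed support
        # (a simplification; the full disparity sums over all support points)
        model = poisson.pmf(values, lam)
        return np.sum((np.sqrt(emp) - np.sqrt(model)) ** 2)
    return minimize_scalar(hd, bounds=(1e-6, x.max() + 1), method="bounded").x

def weighted_mle(x, c=2.5):
    """Phase 2: downweight points flagged by the initial fit, then refit.

    For the Poisson family the weighted MLE has the closed form
    sum(w * x) / sum(w); the Pearson-residual weight rule and the cutoff c
    are assumptions made for this sketch.
    """
    lam0 = hellinger_estimate(x)
    resid = (x - lam0) / np.sqrt(lam0)          # Pearson residuals at phase-1 fit
    w = np.where(np.abs(resid) <= c, 1.0, 0.0)  # hard-rejection weights
    return np.sum(w * x) / np.sum(w)

rng = np.random.default_rng(0)
x = np.concatenate([rng.poisson(3.0, 95), np.full(5, 25)])  # 5 gross outliers
print(x.mean(), weighted_mle(x))  # plain MLE is pulled upward; WML stays near 3
```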
Abstract:
Using a panel of 48 provinces over four years, we empirically analyze a series of temporary policies aimed at curbing fuel consumption that were implemented in Spain between March and June 2011. The first policy was a reduction in the speed limit on highways. The second was an increase in the biofuel content of fuels used in the transport sector. The third was a 5% reduction in commuting and regional train fares, which led two major metropolitan areas to reduce their overall public transit fares. The results indicate that the speed limit reduction on highways reduced gasoline consumption by between 2% and 3%, while the increase in the biofuel content of gasoline increased consumption. This last result is consistent with experimental evidence indicating that mileage per liter falls as the biofuel content of gasoline rises. As for the reduction in transit fares, we do not find a significant effect. However, in specifications that include the urban transit fare for the major cities in each province, the estimated cross-price elasticity of the demand for gasoline (used as a proxy for car use) with respect to the price of transit is within the range reported in the literature. This is important because one of the main efficiency justifications for subsidizing public transit rests on the positive value of this parameter, and most estimates reported in the literature are quite dated.
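As a rough illustration of this kind of panel estimation, the sketch below runs a two-way fixed-effects regression on simulated province-year data; all column names, the exposure variable, and the coefficients are hypothetical, not the paper's specification or estimates.

```python
# Two-way fixed-effects panel sketch on simulated data. Column names and
# the treatment-exposure construction are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "province": np.repeat(np.arange(48), 4),
    "year": np.tile([2008, 2009, 2010, 2011], 48),
})
# Exposure-weighted policy variable: varying exposure across provinces keeps
# the treatment from being collinear with the year fixed effects.
highway_share = rng.uniform(0.2, 0.8, 48)
df["speed_limit_reduced"] = (df["year"] == 2011).astype(float) * np.repeat(highway_share, 4)
df["biofuel_share"] = rng.uniform(0.03, 0.07, len(df))
df["log_transit_fare"] = rng.normal(0.0, 0.1, len(df))
df["log_gasoline"] = (-0.025 * df["speed_limit_reduced"]
                      + 0.5 * df["biofuel_share"]
                      + 0.1 * df["log_transit_fare"]
                      + rng.normal(0, 0.05, len(df)))

# Province and year dummies absorb the two-way fixed effects; the coefficient
# on log_transit_fare plays the role of the cross-price elasticity.
fit = smf.ols("log_gasoline ~ speed_limit_reduced + biofuel_share"
              " + log_transit_fare + C(province) + C(year)", data=df).fit()
print(fit.params[["speed_limit_reduced", "biofuel_share", "log_transit_fare"]])
```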
Abstract:
We characterize divergence times, intraspecific diversity, and distributions for recently recognized lineages within the Hyla arborea species group, based on mitochondrial and nuclear sequences from 160 localities spanning its whole distribution. The lineages of H. arborea, H. orientalis, and H. molleri are at least of Pliocene age, supporting species-level divergence. The genetically uniform Iberian H. molleri, although largely isolated by the Pyrenees, is parapatric to H. arborea, with evidence of successful hybridization in a small Aquitanian corridor (southwestern France), where its distribution also overlaps with that of H. meridionalis. The genetically uniform H. arborea, spread from Crete to Brittany, exhibits molecular signatures of a postglacial range expansion. It meets different mtDNA clades of H. orientalis in northeastern Greece, along the Carpathians, and in Poland along the Vistula River (there including hybridization). The eastern European H. orientalis is strongly structured genetically: five geographic mitochondrial clades are recognized, with a molecular signature of postglacial range expansion for the clade that reached the most northern latitudes. Hybridization with H. savignyi is suggested in southwestern Turkey. Thus, cryptic diversity in these Pliocene Hyla lineages covers three extremes: a genetically poor, quasi-Iberian endemic (H. molleri); a more uniform species distributed from the Balkans to western Europe (H. arborea); and a well-structured Asia Minor/eastern European species (H. orientalis).
Abstract:
We focus on full-rate, fast-decodable space–time block codes (STBCs) for 2 x 2 and 4 x 2 multiple-input multiple-output (MIMO) transmission. We first derive conditions and design criteria for reduced-complexity maximum-likelihood (ML) decodable 2 x 2 STBCs, and we apply them to two families of codes that were recently discovered. Next, we derive a novel reduced-complexity 4 x 2 STBC, and show that it outperforms all previously known codes with certain constellations.
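The codes derived in the paper are not reproduced here, but the classical orthogonal Alamouti scheme illustrates, in miniature, how code structure can reduce ML detection from a joint search to independent per-symbol decisions; the sketch below is that textbook example, not the paper's fast-decodable constructions.

```python
# Symbol-by-symbol ML decoding of the orthogonal Alamouti scheme (2 transmit
# antennas, 1 receive antenna): an illustration of how code structure reduces
# ML decoding complexity, not the codes derived in the paper.
import numpy as np

rng = np.random.default_rng(2)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

s = rng.choice(qpsk, 2)                                          # two QPSK symbols
h = (rng.normal(size=2) + 1j * rng.normal(size=2)) / np.sqrt(2)  # flat channel
n = 0.05 * (rng.normal(size=2) + 1j * rng.normal(size=2))        # noise

# Alamouti transmission over two symbol periods:
r1 = h[0] * s[0] + h[1] * s[1] + n[0]
r2 = -h[0] * np.conj(s[1]) + h[1] * np.conj(s[0]) + n[1]

# Linear combining decouples the symbols, so ML detection is per-symbol
# (a search over 4 points per symbol instead of 16 joint hypotheses):
y1 = np.conj(h[0]) * r1 + h[1] * np.conj(r2)
y2 = np.conj(h[1]) * r1 - h[0] * np.conj(r2)
gain = (np.abs(h) ** 2).sum()
s_hat = [qpsk[np.argmin(np.abs(y - gain * qpsk))] for y in (y1, y2)]
print(np.allclose(s_hat, s))  # True at this noise level
```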
Abstract:
This letter to the Editor comments on the article "When 'neutral' evidence still has probative value (with implications from the Barry George Case)" by N. Fenton et al. [1] (2014).
Abstract:
This paper builds on previous work by the authors on the relationship between non-quasi-competitiveness (an increase in price caused by an increase in the number of oligopolists) and the stability of equilibrium in the classical Cournot oligopoly model. Although it has been widely accepted in the literature that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the model, the authors' previous work put forward a model in which a monopoly became a duopoly, losing quasi-competitiveness while preserving the stability of the equilibrium. That model could not, at the time, be extended to an arbitrary number of oligopolists. The present paper exhibits such an extension: an oligopoly model in which the loss of quasi-competitiveness persists with as many firms in the market as one wishes, and in which the successive Cournot equilibrium points are unique and asymptotically stable. In this way, the conjecture that non-quasi-competitiveness and instability are equivalent in the long run is, for the first time, proved false.
Abstract:
Whether providing additional resources to local communities leads to improved public services and better outcomes more generally, given existing management capacity and incentive and accountability structures, is an unresolved yet important question for public policy. This paper uses a regression-discontinuity design to evaluate the effect of unrestricted fiscal transfers on local spending (including on education), schooling, and learning in Brazil. Results show that transfers increase local public spending almost one-for-one, with no evidence of crowding out own revenue or other revenue sources. Extra per capita transfers of 1,000 Reais lead to about 0.42 additional years of elementary schooling, and student literacy rates increase by about 5.6 percentage points on average. Part of this effect arises through higher teacher-student ratios in municipal elementary school systems. Results also suggest that additional resources have stronger effects in more rural and less developed parts of Brazil.
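For readers unfamiliar with the method, the sketch below shows a minimal local-linear regression-discontinuity estimate on simulated data; the cutoff, bandwidth, and outcome variable are illustrative and do not correspond to the paper's design or data.

```python
# Minimal local-linear regression-discontinuity sketch on simulated data.
# Cutoff, bandwidth, and the outcome are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
pop = rng.uniform(8_000, 12_000, 2000)        # running variable: population
cutoff = 10_000
treated = (pop >= cutoff).astype(float)       # extra transfers above the cutoff
schooling = (3.0 + 0.3 * treated              # true jump of 0.3 at the cutoff
             + 1e-4 * (pop - cutoff) + rng.normal(0, 0.5, 2000))

h = 1000                                      # bandwidth around the cutoff
m = np.abs(pop - cutoff) <= h
X = sm.add_constant(np.column_stack([
    treated[m],                               # jump at the cutoff = RD effect
    pop[m] - cutoff,                          # linear trend in running variable
    treated[m] * (pop[m] - cutoff),           # slope allowed to differ by side
]))
print(sm.OLS(schooling[m], X).fit().params[1])  # estimate of the 0.3 jump
```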
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified within the algebraic structure of A2(P), together with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, rather elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information, such as Bayesian updating, the combination of likelihoods, and robust M-estimation functions, are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P): regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to the finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence, and a scale-free understanding of unbiased reasoning.
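In symbols, the additive structure referred to above can be rendered along the following standard lines; this is a sketch in the usual Bayes-Hilbert-space notation, with densities identified up to a positive multiplicative constant, and the paper's own definitions may differ in detail.

```latex
% Bayes updating as addition (perturbation) in the log-ratio geometry:
\[
  p \oplus q \;\propto\; p\,q, \qquad
  \text{posterior} \;=\; \text{prior} \oplus \text{likelihood},
\]
% with the inner product built from centered log-ratios,
\[
  \langle p, q \rangle_{A^2(P)}
  \;=\; \int \Big(\log p - \textstyle\int \log p \, dP\Big)
            \Big(\log q - \textstyle\int \log q \, dP\Big)\, dP,
\]
% so the centered log-ratio transform of compositional data analysis
% reappears as $\operatorname{clr}(p) = \log p - \int \log p \, dP$, and
% $\|p\|_{A^2(P)}^2 = \langle p, p\rangle_{A^2(P)}$ is the Aitchison norm
% that the abstract identifies with mean Fisher information.
```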
Abstract:
This work studies the organization of less-than-truckload trucking from a contractual point of view. We show that the huge number of owner-operators working in the industry hides a much less fragmented reality: most of those owner-operators are quasi-integrated into larger organizational structures. This hybrid form is generally more efficient than vertical integration because, in the Spanish institutional environment, it lessens serious moral hazard problems, related mainly to the use of the vehicles, while making it possible to reach economies of scale and density. Empirical evidence suggests that what leads organizations to vertically integrate is not the presence of such economies but hold-up problems related to the existence of specific assets. Finally, an international comparison hints that institutional constraints can explain differences in the evolution of vertical integration across countries.
Abstract:
Does additional government spending improve the electoral chances of incumbent political parties? This paper provides the first quasi-experimental evidence on this question. Our research design exploits discontinuities in federal funding to local governments in Brazil around several population cutoffs over the period 1982-1985. We show that extra fiscal transfers resulted in a 20% increase in local government spending per capita, and an increase of about 10 percentage points in the re-election probability of local incumbent parties. In the context of an agency model of electoral accountability, as well as existing results indicating that the revenue jumps studied here had positive impacts on education outcomes and earnings, these results suggest that expected electoral rewards encouraged incumbents to spend additional funds in ways that were valued by voters.
Abstract:
This paper uses a regression discontinuity design to estimate the impact of additional unrestricted grant financing on local public spending, public service provision, schooling, literacy, and income at the community (municipio) level in Brazil. Additional transfers increased local public spending per capita by about 20%, with no evidence of crowding out own revenue or other revenue sources. The additional local spending increased schooling per capita by about 7% and literacy rates by about 4 percentage points. The implied marginal cost of schooling (accounting for corruption and other leakages) amounts to about US$ 126, which turns out to be similar to the average cost of schooling in Brazil in the early 1980s. In line with the effect on human capital, the poverty rate was reduced by about 4 percentage points, while income per capita gains were positive but not statistically significant. Results also suggest that additional public spending had stronger effects on schooling and literacy in less developed parts of Brazil, while poverty reduction was evenly spread across the country.
Abstract:
This paper presents a classical Cournot oligopoly model with some peculiar features: it is non-quasi-competitive, as price under N-poly is greater than monopoly price; Cournot equilibrium exists and is unique with each new entry; and the successive equilibria after new entries are stable under the adjustment mechanism that assumes that the actual output of each seller is adjusted proportionally to the difference between actual output and profit-maximizing output. Moreover, the model tends to perfect competition as N goes to infinity, reaching the monopoly price again.
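The adjustment mechanism is easy to simulate. The sketch below applies it to a standard linear Cournot model rather than the paper's non-quasi-competitive one, so here price falls toward marginal cost as firms enter; the demand and cost parameters are illustrative.

```python
# Proportional quantity-adjustment dynamics on a standard linear Cournot
# model (not the paper's model): each seller moves its output a fraction k
# toward its profit-maximizing (best-response) output.
import numpy as np

a, b, c = 10.0, 1.0, 1.0   # inverse demand P = a - b*Q, marginal cost c
k = 0.1                    # adjustment speed; this linear model is stable
                           # under these dynamics when k < 4/(n+1)

def best_response(q, i):
    """Profit-maximizing output of firm i, given the others' outputs."""
    return max(0.0, (a - c - b * (q.sum() - q[i])) / (2 * b))

def adjust(q, steps=500):
    for _ in range(steps):
        br = np.array([best_response(q, i) for i in range(len(q))])
        q = q + k * (br - q)   # proportional adjustment toward best response
    return q

for n in (1, 2, 5, 20):
    q = adjust(np.full(n, 0.1))
    print(n, "firms: price =", a - b * q.sum())  # falls toward c as n grows
```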
Abstract:
The paper explores an efficiency hypothesis regarding the contractual process between large retailers, such as Wal-Mart and Carrefour, and their suppliers. The empirical evidence presented supports the idea that large retailers play a quasi-judicial role, acting as "courts of first instance" in their relationships with suppliers. In this role, large retailers adjust the terms of trade to ongoing changes and sanction performance failures, sometimes by delaying payments. Potential abuse of this position is limited by the need for re-contracting and for preserving their reputations. Suppliers renew their confidence in their retailers on a yearly basis by writing new contracts. This renewal contradicts the alternative hypothesis that suppliers are expropriated by large retailers as a consequence of specific investments.
Abstract:
Many authors have discussed a decline in internal labor markets and an apparent shift to a new employment contract, characterized by less commitment between employer and employee and more portable skills. These discussions occur without much evidence on what employment contract employees currently feel is fair. We performed quasi-experimental surveys to study when employees in the U.S. and Canada feel that layoffs are fair. Layoffs were perceived as more fair if they were due to lower product demand than if they resulted from employee suggestions. This result appears to be due solely to norms of reciprocity (companies should not punish employees for their efforts) rather than norms of sharing rents, as new technology was also considered a justification for layoffs. Consistent with theories of distributive and procedural equity, layoffs were perceived as more fair if the CEO voluntarily shared the pain. CEO bonuses due to layoffs lowered their reported fairness only slightly. Respondents in Silicon Valley were, on average, not more accepting of layoffs than were those in Canada, although the justifications considered valid differed slightly.