988 results for Gaussian assumption
Abstract:
Standard methods for the analysis of linear latent variable models often rely on the assumption that the vector of observed variables is normally distributed. This normality assumption (NA) plays a crucial role in assessing optimality of estimates, in computing standard errors, and in designing an asymptotic chi-square goodness-of-fit test. The asymptotic validity of NA inferences when the data deviate from normality has been called asymptotic robustness. In the present paper we extend previous work on asymptotic robustness to a general context of multi-sample analysis of linear latent variable models, with a latent component of the model allowed to be fixed across (hypothetical) sample replications, and with the asymptotic covariance matrix of the sample moments not necessarily finite. We will show that, under certain conditions, the matrix $\Gamma$ of asymptotic variances of the analyzed sample moments can be substituted by a matrix $\Omega$ that is a function only of the cross-product moments of the observed variables. The main advantage of this is that inferences based on $\Omega$ are readily available in standard software for covariance structure analysis and do not require computing sample fourth-order moments. An illustration with simulated data in the context of regression with errors in variables will be presented.
Abstract:
The standard New Keynesian model with staggered wage setting is shown to imply a simple dynamic relation between wage inflation and unemployment. Under some assumptions, that relation takes a form similar to that found in empirical wage equations, starting from Phillips' (1958) original work, and may thus be viewed as providing some theoretical foundations to the latter. The structural wage equation derived here is shown to account reasonably well for the comovement of wage inflation and the unemployment rate in the U.S. economy, even under the strong assumption of a constant natural rate of unemployment.
Abstract:
We evaluate conditional predictive densities for U.S. output growth and inflation using a number of commonly used forecasting models that rely on a large number of macroeconomic predictors. More specifically, we evaluate how well conditional predictive densities based on the commonly used normality assumption fit actual realizations out-of-sample. Our focus on predictive densities acknowledges the possibility that, although some predictors can improve or deteriorate point forecasts, they might have the opposite effect on higher moments. We find that normality is rejected for most models in some dimension according to at least one of the tests we use. Interestingly, however, combinations of predictive densities appear to be correctly approximated by a normal density: the simple, equal average when predicting output growth and the Bayesian model average when predicting inflation.
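One common way to check how well a normal predictive density fits realized outcomes, in the spirit of the evaluation described above, is through probability integral transforms (PITs): under a correctly specified density forecast the PITs are uniform on [0, 1]. A minimal sketch with simulated placeholder data (the forecasts, the data, and the choice of a Kolmogorov-Smirnov uniformity test are illustrative assumptions, not the paper's models or test battery):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical one-step-ahead Gaussian forecasts (mean 0, sd 1) and realized
# values; the realizations are deliberately more dispersed than the forecasts,
# so the normality assumption is misspecified here.
means, sds = np.zeros(500), np.ones(500)
realized = rng.normal(loc=0.0, scale=2.0, size=500)

# Probability integral transforms: each forecast CDF evaluated at the outcome.
pits = stats.norm.cdf(realized, loc=means, scale=sds)

# Test uniformity of the PITs (one possible test among many).
ks_stat, ks_pval = stats.kstest(pits, "uniform")
print(f"KS statistic = {ks_stat:.3f}, p-value = {ks_pval:.2g}")
```

With correctly specified forecasts the p-value would tend to be large; here the excess dispersion of the realizations shows up as PITs piling near 0 and 1, and uniformity is rejected.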
Abstract:
This paper presents 3-D brain tissue classification schemes using three recent promising energy minimization methods for Markov random fields: graph cuts, loopy belief propagation, and tree-reweighted message passing. The classification is performed using the well-known finite Gaussian mixture Markov random field model. Results from the above methods are compared with the widely used iterative conditional modes algorithm. The evaluation is performed on a dataset containing simulated T1-weighted MR brain volumes with varying noise and intensity non-uniformities. The comparisons are performed in terms of energies as well as against ground truth segmentations, using various quantitative metrics.
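For reference, the iterative conditional modes baseline mentioned above can be sketched for a finite Gaussian mixture MRF in a few lines. The toy image, class parameters, and 4-neighbourhood Potts smoothness term below are illustrative assumptions; the graph-cut, belief-propagation, and tree-reweighted solvers compared in the paper are not reproduced here:

```python
import numpy as np

def icm_segment(image, means, sds, beta=1.0, n_iter=5):
    """Iterative conditional modes for a finite Gaussian mixture MRF:
    each pixel label greedily minimizes a Gaussian data term plus a
    Potts smoothness penalty over its 4-neighbourhood."""
    # Initialize with the maximum-likelihood label per pixel.
    labels = np.argmin(((image[..., None] - means) / sds) ** 2, axis=-1)
    H, W = image.shape
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                best, best_e = labels[i, j], np.inf
                for k in range(len(means)):
                    # Data term: negative Gaussian log-likelihood (up to constants).
                    e = 0.5 * ((image[i, j] - means[k]) / sds[k]) ** 2 + np.log(sds[k])
                    # Smoothness term: penalize disagreeing 4-neighbours.
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < H and 0 <= nj < W and labels[ni, nj] != k:
                            e += beta
                    if e < best_e:
                        best, best_e = k, e
                labels[i, j] = best
    return labels

# Toy two-class "image": dark left half, bright right half, plus noise.
rng = np.random.default_rng(2)
img = np.where(np.arange(64)[None, :] < 32, 0.0, 1.0) + rng.normal(0.0, 0.3, (64, 64))
seg = icm_segment(img, means=np.array([0.0, 1.0]), sds=np.array([0.3, 0.3]))
print(seg.shape)
```

ICM only guarantees a local minimum of the energy, which is exactly why the paper compares it against the stronger global or near-global minimization methods.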
Abstract:
This article analyses the impact of the reference price system on the price-setting strategies of the pharmaceutical firms and on the level of generic usage. This model is the first to take explicitly into account the impact of the reference price mechanism on the level of competition between brand-name and generic drugs and national pharmaceutical spending. We consider a duopolistic model with one firm producing the brand-name drug, whose patent has already expired, and the other producing the corresponding generic version. We work in a partial equilibrium framework where firms set prices sequentially and consumers face heterogeneous switching costs. We show that brand producers compensate the decline of profits by selling greater quantities instead of charging higher prices, thus fostering price competition in the pharmaceutical market. This result is a consequence of both the assumption of a vertically differentiated model and the introduction of the reference price system.
Abstract:
Arbuscular mycorrhizal fungi (AMF) are important symbionts of most land plants. AMF influence plant growth and biodiversity. Very few extant species are described.
AMF are thought to have evolved asexually for at least 400 million years, and no major morphological diversification has occurred. For these reasons, they were termed 'ancient asexuals'. Fungal individuals harbour genetically different nuclei in a continuous cytoplasm. The variability, maintenance, and evolutionary significance of multiple genomes within individuals are unknown. This work showed that a population of the AMF Glomus intraradices harbours very high genetic diversity. We concluded that host plants rather than geographic differentiation were responsible for this diversity. Furthermore, we investigated whether recombination occurred among genotypes of a G. intraradices population. The identification of a core group of recombining genotypes in the population refutes the assumption of ancient asexuality in AMF. We found that genetically different isolates can form hyphal fusions and exchange nuclei. The hybrid progeny produced by the exchange was viable and phenotypically distinct from the parental isolates. Taken together, this work provided evidence for key events in the AMF life cycle that influence the evolution of multiple genomes. Studying the consequences of these events on the interaction with host plants may significantly further the understanding of the AMF-plant symbiosis.
Abstract:
We analyze risk-sensitive incentive-compatible deposit insurance in the presence of private information when the market value of deposit insurance can be determined using Merton's (1997) formula. We show that, under the assumption that transferring funds from taxpayers to financial institutions has a social cost, the optimal regulation combines different levels of capital requirements with decreasing premia on deposit insurance. On the other hand, it is never efficient to require the banks to hold riskless assets, so that narrow banking is not efficient. Finally, chartering banks is necessary in order to decrease the cost of asymmetric information.
Abstract:
We analyze a standard environment of adverse selection in credit markets. In our environment, entrepreneurs who are privately informed about the quality of their projects need to borrow from banks. Conventional wisdom says that, in this class of economies, the competitive equilibrium is typically inefficient. We show that this conventional wisdom rests on one implicit assumption: entrepreneurs can only borrow from banks. If an additional market is added to provide entrepreneurs with additional funds, efficiency can be attained in equilibrium. An important characteristic of this additional market is that it must be non-exclusive, in the sense that entrepreneurs must be able to simultaneously borrow from many different lenders operating in it. This makes it possible to attain efficiency by pooling all entrepreneurs in the new market while separating them in the market for bank loans.
Abstract:
We propose a new econometric estimation method for analyzing the probability of leaving unemployment using uncompleted spells from repeated cross-section data, which can be especially useful when panel data are not available. The proposed method-of-moments-based estimator has two important features: (1) it estimates the exit probability at the individual level, and (2) it does not rely on the stationarity assumption of the inflow composition. We illustrate and gauge the performance of the proposed estimator using the Spanish Labor Force Survey data, and analyze the changes in the distribution of unemployment between the 1980s and 1990s during a period of labor market reform. We find that the relative probability of leaving unemployment of the short-term unemployed versus the long-term unemployed becomes significantly higher in the 1990s.
Abstract:
We use aggregate GDP data and within-country income shares for the period 1970-1998 to assign a level of income to each person in the world. We then estimate the Gaussian kernel density function for the worldwide distribution of income. We compute world poverty rates by integrating the density function below the poverty lines. The $1/day poverty rate has fallen from 20% to 5% over the last twenty-five years. The $2/day rate has fallen from 44% to 18%. There are between 300 and 500 million fewer poor people in 1998 than there were in the 1970s. We estimate global income inequality using seven different popular indexes: the Gini coefficient, the variance of log-income, two of Atkinson's indexes, the Mean Logarithmic Deviation, the Theil index, and the coefficient of variation. All indexes show a reduction in global income inequality between 1980 and 1998. We also find that most global disparities can be accounted for by across-country, not within-country, inequalities. Within-country disparities have increased slightly during the sample period, but not nearly enough to offset the substantial reduction in across-country disparities. The across-country reductions in inequality are driven mainly, but not fully, by the large growth rate of the incomes of the 1.2 billion Chinese citizens. Unless Africa starts growing in the near future, we project that income inequalities will start rising again: if Africa does not start growing, then China, India, the OECD, and the rest of the middle-income and rich countries will diverge away from it, and global inequality will rise. Thus, the aggregate GDP growth of the African continent should be the priority of anyone concerned with rising global income inequality.
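The density-based poverty calculation described above can be sketched as follows. The simulated log-income sample, the poverty-line arithmetic ($1/day taken as roughly $365/year), and the Gini computation are illustrative placeholders, not the paper's data:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Placeholder sample of log annual incomes (a lognormal stand-in for the
# worldwide income distribution assembled in the paper).
rng = np.random.default_rng(0)
log_income = rng.normal(loc=8.0, scale=1.2, size=10_000)

# Gaussian kernel density of log income.
kde = gaussian_kde(log_income)

# Poverty rate: density mass below the poverty line ($1/day ~ $365/year).
poverty_line = np.log(365)
poverty_rate = kde.integrate_box_1d(-np.inf, poverty_line)

# Gini coefficient from the raw sample, one of the seven indexes mentioned.
x = np.sort(np.exp(log_income))
n = len(x)
gini = 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

print(f"poverty rate = {poverty_rate:.3f}, Gini = {gini:.3f}")
```

The $2/day rate would be obtained the same way with `np.log(730)`; the other inequality indexes are likewise simple functionals of the same sample or density.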
Abstract:
New Keynesian (NK) models can only account for the dynamic effects of monetary policy shocks if it is assumed that aggregate capital accumulation is much smoother than would be the case under frictionless firm-level investment, as discussed in Woodford (2003, Ch. 5). We find that lumpy investment, when combined with price stickiness and market power of firms, can rationalize this assumption. Our main result is in stark contrast with the conclusions obtained by Thomas (2002) in the context of a real business cycle (RBC) model. We use our model to explain the economic mechanism behind this difference in the predictions of RBC and NK theory.
Abstract:
Traditional economic wisdom says that free entry in a market will drive profits down to zero. This conclusion is usually drawn under the assumption of perfect information. We assume that a priori there exists imperfect information about the profitability of the market, but that potential entrants may learn the demand curve perfectly at negligible cost by engaging in market research. Even if in equilibrium firms learn the demand perfectly, profits may be strictly positive because of insufficient entry. The mere fact that it will not become common knowledge that every entrant has perfect information about demand causes this surprising result. "Belief means doubt. Knowing means certainty." (Introduction to the Kabalah.)
Abstract:
Two-stage game models of information acquisition in stochastic oligopolies require the unrealistic assumption that firms observe the precision of information chosen by their competitors before determining quantities. This paper analyzes secret information acquisition as a one-stage game. Relative to the two-stage game, firms are shown to acquire less information. Policy implications based on the two-stage game therefore yield too high taxes or too low subsidies for research activities. For the case of heterogeneous duopoly, it is shown that comparative statics results partly depend on the observability assumption.
Abstract:
Consider the problem of testing k hypotheses simultaneously. In this paper, we discuss finite and large sample theory of stepdown methods that provide control of the familywise error rate (FWE). In order to improve upon the Bonferroni method or Holm's (1979) stepdown method, Westfall and Young (1993) make effective use of resampling to construct stepdown methods that implicitly estimate the dependence structure of the test statistics. However, their methods depend on an assumption called subset pivotality. The goal of this paper is to construct general stepdown methods that do not require such an assumption. In order to accomplish this, we take a close look at what makes stepdown procedures work, and a key component is a monotonicity requirement of critical values. By imposing such monotonicity on estimated critical values (which is not an assumption on the model but an assumption on the method), it is demonstrated that the problem of constructing a valid multiple test procedure which controls the FWE can be reduced to the problem of constructing a single test which controls the usual probability of a Type 1 error. This reduction allows us to draw upon an enormous resampling literature as a general means of test construction.
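As a point of reference for the stepdown structure discussed above, Holm's (1979) method (the non-resampling baseline the paper improves upon) already exhibits monotone critical values alpha/k <= alpha/(k-1) <= ... <= alpha. A minimal sketch; the function name and example p-values are illustrative:

```python
import numpy as np

def holm_stepdown(pvalues, alpha=0.05):
    """Holm's (1979) stepdown procedure: controls the familywise error
    rate for k hypotheses without any assumption on their dependence.
    Returns a boolean array marking which hypotheses are rejected."""
    p = np.asarray(pvalues)
    k = len(p)
    reject = np.zeros(k, dtype=bool)
    # Step down from the smallest p-value with monotone critical values
    # alpha/k, alpha/(k-1), ..., alpha.
    for step, idx in enumerate(np.argsort(p)):
        if p[idx] <= alpha / (k - step):
            reject[idx] = True
        else:
            break  # stop at the first non-rejection
    return reject

print(holm_stepdown([0.001, 0.02, 0.04, 0.30]))
```

Resampling-based stepdown methods replace the alpha/(k - step) thresholds with estimated quantiles of the joint null distribution; the monotonicity of those estimated critical values is the key requirement the paper isolates.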
Abstract:
This paper presents and estimates a dynamic choice model in the attribute space considering rational consumers. In light of the evidence of several state-dependence patterns, the standard attribute-based model is extended by considering a general utility function where pure inertia and pure variety-seeking behaviors can be explained in the model as particular linear cases. The dynamics of the model are fully characterized by standard dynamic programming techniques. The model presents a stationary consumption pattern that can be inertial, where the consumer only buys one product, or a variety-seeking one, where the consumer shifts among varied products. We run some simulations to analyze the consumption paths out of the steady state. Under the hybrid utility assumption, the consumer behaves inertially among the unfamiliar brands for several periods, eventually switching to a variety-seeking behavior when the stationary levels are approached. An empirical analysis is run using scanner databases for three different product categories: fabric softener, saltine cracker, and catsup. Non-linear specifications provide the best fit of the data, as hybrid functional forms are found in all the product categories for most attributes and segments. These results reveal the statistical superiority of the non-linear structure and confirm the gradual trend to seek variety as the level of familiarity with the purchased items increases.