104 results for Bayesian hypothesis testing
Abstract:
Does financial development result in capital being reallocated more rapidly to industries where it is most productive? We argue that if this were the case, financially developed countries should see faster growth in industries with investment opportunities due to global demand and productivity shifts. Testing this cross-industry cross-country growth implication requires proxies for (latent) global industry investment opportunities. We show that tests relying only on data from specific (benchmark) countries may yield spurious evidence for or against the hypothesis. We therefore develop an alternative approach that combines benchmark-country proxies with a proxy that does not reflect opportunities specific to a country or level of financial development. Our empirical results yield clear support for the capital reallocation hypothesis.
Abstract:
The hypotheses proposed to explain the decision to migrate for work to a developed country are diverse and, at the same time, difficult to test because of the multiplicity of factors involved. This paper attempts to disentangle the socio-economic factors that explain the differences in immigrant numbers across OECD countries. We present empirical evidence on the determinants of migratory flows to 17 OECD countries from 65 origin countries over the 1980-2000 period. Our results reveal the importance of differentiating the composition of inflows at least by income level in the origin countries. For inflows from non-high-income countries, the results suggest a pull effect from monetary rather than real income, so the welfare-magnets hypothesis should be rejected. This group also reacts more strongly to migration policy than inflows from high-income countries do, although the policies designed to slow down inflows have not, in the aggregate, managed to reduce them.
Abstract:
The paper explores an efficiency hypothesis regarding the contractual process between large retailers, such as Wal-Mart and Carrefour, and their suppliers. The empirical evidence presented supports the idea that large retailers play a quasi-judicial role, acting as "courts of first instance" in their relationships with suppliers. In this role, large retailers adjust the terms of trade to on-going changes and sanction performance failures, sometimes delaying payments. A potential abuse of their position is limited by the need for re-contracting and preserving their reputations. Suppliers renew their confidence in their retailers on a yearly basis by writing new contracts. This renewal contradicts the alternative hypothesis that suppliers are expropriated by large retailers as a consequence of specific investments.
Abstract:
This paper proposes a common and tractable framework for analyzing different definitions of fixed and random effects in a constant-slope variable-intercept model. It is shown that, regardless of whether effects (i) are treated as parameters or as an error term, (ii) are estimated in different stages of a hierarchical model, or whether (iii) correlation between effects and regressors is allowed, when the same information on effects is introduced into all estimation methods, the resulting slope estimator is also the same across methods. If different methods produce different results, it is ultimately because different information is being used by each method.
Abstract:
When the behaviour of a specific hypothesis test statistic is studied by a Monte Carlo experiment, the usual way to describe its quality is by giving the empirical level of the test. As an alternative to this procedure, we use the empirical distribution of the obtained p-values and exploit its information both graphically and numerically.
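The idea in this abstract can be sketched in a few lines: simulate a test under its null, keep every p-value rather than only the rejection rate, and summarize the whole distribution (which should be Uniform(0,1) under the null). The choice of a one-sample t-test, the sample sizes, and the uniformity check are illustrative assumptions, not the paper's exact design.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Monte Carlo study of a one-sample t-test under H0 (mean = 0).
n_reps, n_obs = 5000, 30
pvals = np.empty(n_reps)
for i in range(n_reps):
    x = rng.standard_normal(n_obs)          # data generated under H0
    _, pvals[i] = stats.ttest_1samp(x, 0.0)

# The usual summary: the empirical level at the nominal 5% point ...
level = np.mean(pvals < 0.05)
# ... versus a numerical summary of the whole p-value distribution:
# a Kolmogorov-Smirnov distance from Uniform(0, 1).
ks_stat, ks_p = stats.kstest(pvals, "uniform")
print(f"empirical level at 5%: {level:.3f}")
print(f"KS distance from U(0,1): {ks_stat:.3f}")
```

A histogram or P-P plot of `pvals` would give the graphical counterpart the abstract mentions.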
Abstract:
This poster shows how to efficiently observe high-frequency figures of merit in RF circuits by measuring DC temperature with CMOS-compatible built-in sensors.
Abstract:
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
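The MLE iteration that serves as this abstract's point of comparison can be sketched with the classical EM update for Poisson emission data; the toy system matrix `A`, image size, and count scale below are illustrative assumptions, not the paper's setup, and the FMAPE entropy prior is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_det = 16, 32
A = rng.random((n_det, n_pix))       # toy detection-probability matrix
A /= A.sum(axis=0, keepdims=True)    # normalise each pixel's probabilities

true_img = rng.random(n_pix)
y = rng.poisson(100 * A @ true_img)  # Poisson counts, as in the abstract

# As the abstract notes, absent prior knowledge the only correct
# initial image for the iteration is a uniform field.
x = np.full(n_pix, y.sum() / n_pix)
sens = A.sum(axis=0)                 # sensitivity A^T 1 (here all ones)
for _ in range(50):
    proj = A @ x                                          # forward projection
    x *= (A.T @ (y / np.maximum(proj, 1e-12))) / sens     # EM update
```

Each EM update preserves total counts and non-negativity, which is what makes the resulting images "feasible" in the abstract's sense of being consistent with the Poisson data.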
Abstract:
In this paper we present a Bayesian image reconstruction algorithm with entropy prior (FMAPE) that uses a space-variant hyperparameter. The spatial variation of the hyperparameter allows different degrees of resolution in areas of different statistical characteristics, thus avoiding the large residuals resulting from algorithms that use a constant hyperparameter. In the first implementation of the algorithm, we begin by segmenting a Maximum Likelihood Estimator (MLE) reconstruction. The segmentation method is based on a wavelet decomposition and a self-organizing neural network. The result is a predetermined number of extended regions plus a small region for each star or bright object. To assign a different value of the hyperparameter to each extended region and star, we use either feasibility tests or cross-validation methods. Once the set of hyperparameters is obtained, we carry out the final Bayesian reconstruction, leading to a reconstruction with decreased bias and excellent visual characteristics. The method has been applied to data from the non-refurbished Hubble Space Telescope. The method can also be applied to ground-based images.
Abstract:
When preparing an article on image restoration in astronomy, it is obvious that some topics have to be dropped to keep the work at reasonable length. We have decided to concentrate on image and noise models and on the algorithms to find the restoration. Topics like parameter estimation and stopping rules are also commented on. We start by describing the Bayesian paradigm and then proceed to study the noise and blur models used by the astronomical community. Then the prior models used to restore astronomical images are examined. We describe the algorithms used to find the restoration for the most common combinations of degradation and image models. Then we comment on important issues such as acceleration of algorithms, stopping rules, and parameter estimation. We also comment on the huge amount of information available to, and made available by, the astronomical community.
Abstract:
The paper addresses the concept of multicointegration in a panel data framework. The proposal builds upon the panel data cointegration procedures developed in Pedroni (2004), for which we compute the moments of the parametric statistics. When individuals are either cross-section independent, or cross-section dependence can be removed by cross-section demeaning, our approach can be applied to the wider framework of mixed I(2) and I(1) stochastic processes analysis. The paper also deals with the issue of cross-section dependence using approximate common factor models. Finite sample performance is investigated through Monte Carlo simulations. Finally, we illustrate the use of the procedure by investigating the relationship between inventories, sales and production for a panel of US industries.
Abstract:
The aim of this paper is twofold. First, we study the determinants of economic growth among a wide set of potential variables for the Spanish provinces (NUTS3). Among others, we include various types of private, public and human capital in the group of growth factors. Also, we analyse whether Spanish provinces have converged in economic terms in recent decades. The second objective is to obtain cross-section and panel data parameter estimates that are robust to model specification. For this purpose, we use a Bayesian Model Averaging (BMA) approach. Bayesian methodology constructs parameter estimates as a weighted average of linear regression estimates for every possible combination of included variables. The weight of each regression estimate is given by the posterior probability of each model.
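The averaging scheme this abstract describes — slope estimates weighted by posterior model probabilities over all regressor subsets — can be sketched as follows. The simulated data, the variable names, and the BIC approximation to the posterior model probabilities are illustrative assumptions, not the paper's exact prior or dataset.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
n, names = 200, ["invest", "school", "pubcap"]   # hypothetical growth factors
X = rng.standard_normal((n, 3))
y = 0.5 * X[:, 0] + 0.2 * X[:, 1] + rng.standard_normal(n)

bics, betas = [], []
for k in range(len(names) + 1):
    for subset in combinations(range(len(names)), k):
        # OLS on this combination of included variables (plus intercept).
        Z = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
        b, *_ = np.linalg.lstsq(Z, y, rcond=None)
        rss = np.sum((y - Z @ b) ** 2)
        bics.append(n * np.log(rss / n) + Z.shape[1] * np.log(n))
        full = np.zeros(len(names))              # excluded slopes count as 0
        full[list(subset)] = b[1:]
        betas.append(full)

# Posterior model weights approximated by exp(-BIC/2), normalised.
w = np.exp(-0.5 * (np.array(bics) - min(bics)))
w /= w.sum()
bma_beta = w @ np.array(betas)   # weighted-average slope estimates
```

With three candidate regressors there are 2^3 = 8 models; each slope in `bma_beta` blends its OLS estimates across the models that include it, shrunk by the probability mass on models that exclude it.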
Abstract:
In this paper we deal with the identification of dependencies between time series of equity returns. Marginal distribution functions are assumed to be known, and a bivariate chi-square test of fit is applied in a fully parametric copula approach. Several families of copulas are fitted and compared with Spanish stock market data. The results show that the t-copula generally outperforms other dependence structures, and highlight the difficulty of fitting a significant number of the bivariate data series.
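The first step of any such copula analysis — pushing each series through its marginal to the unit square and then estimating the dependence parameter — can be sketched on simulated data. Here the marginals are taken as empirical (rank-based) and a Gaussian copula correlation is fitted; the t-copula estimation and the bivariate chi-square test of fit used in the paper are not reproduced, and the "return" series are toy assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 1000
# Two dependent toy "return" series: a bivariate normal pair, with the
# second passed through a monotone distortion of its marginal.
z = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)
r1 = z[:, 0]
r2 = np.sign(z[:, 1]) * np.abs(z[:, 1]) ** 1.2

# Probability-integral transform through the (empirical) marginals ...
u1 = stats.rankdata(r1) / (n + 1)
u2 = stats.rankdata(r2) / (n + 1)
# ... then map to normal scores and estimate the copula correlation.
rho = np.corrcoef(stats.norm.ppf(u1), stats.norm.ppf(u2))[0, 1]
```

Because the copula depends only on ranks, the monotone distortion of the second series leaves the estimated dependence essentially unchanged, which is precisely the separation of marginals from dependence that the copula approach exploits.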
Abstract:
Purpose. The aim of this study was to identify new surfactants with low skin irritant properties for use in pharmaceutical and cosmetic formulations, employing cell culture as an alternative method to in vivo testing. In addition, we sought to establish whether potential cytotoxic properties were related to the size of the counterions bound to the surfactants. Methods. Cytotoxicity was assessed in the mouse fibroblast cell line 3T6 and the human keratinocyte cell line NCTC 2544, using the MTT assay and uptake of the vital dye neutral red 24 h after dosing (NRU). Results. Lysine-derivative surfactants showed higher IC50s than did commercial anionic irritant compounds such as sodium dodecyl sulphate, proving to be no more harmful than amphoteric betaines. The aggressiveness of the surfactants depended upon the size of their constituent counterions: surfactants associated with lighter counterions showed proportionally higher aggressiveness than those with heavier ones. Conclusions. Synthetic lysine-derivative anionic surfactants are less irritant than commercial surfactants such as sodium dodecyl sulphate and hexadecyltrimethylammonium bromide, and are similar to betaines. These surfactants may offer promising applications in pharmaceutical and cosmetic preparations, representing a potential alternative to commercial anionic surfactants as a result of their low irritancy potential.