916 results for Bayes Estimator
Abstract:
In cognition, common factors play a crucial role. For example, different types of intelligence are highly correlated, pointing to a common factor, which is often called g. One might expect that a similar common factor would also exist for vision. Surprisingly, no one in the field has addressed this issue. Here, we provide the first evidence that there is no common factor for vision. We tested 40 healthy students' performance in six basic visual paradigms: visual acuity, vernier discrimination, two visual backward masking paradigms, Gabor detection, and bisection discrimination. One might expect that performance levels on these tasks would be highly correlated because some individuals generally have better vision than others due to superior optics, better retinal or cortical processing, or enriched visual experience. However, only four out of 15 correlations were significant, two of which were nontrivial. These results cannot be explained by high intraobserver variability or ceiling effects because test-retest reliability was high and the variance in our student population is commensurate with that from other studies with well-sighted populations. Using a variety of tests (e.g., principal components analysis, Bayes' theorem, test-retest reliability), we show the robustness of our null results. We suggest that neuroplasticity operates during everyday experience to generate marked individual differences. Our results apply only to the normally sighted population (i.e., restricted range sampling). For the entire population, including those with degenerate vision, we expect different results.
Abstract:
Real-world objects are often endowed with features that violate Gestalt principles. In our experiment, we examined the neural correlates of binding under conflict conditions in terms of the binding-by-synchronization hypothesis. We presented an ambiguous stimulus ("diamond illusion") to 12 observers. The display consisted of four oblique gratings drifting within circular apertures. Its interpretation fluctuates between bound ("diamond") and unbound (component gratings) percepts. To model a situation in which Gestalt-driven analysis contradicts the perceptually explicit bound interpretation, we modified the original diamond (OD) stimulus by speeding up one grating. Using OD and modified diamond (MD) stimuli, we managed to dissociate the neural correlates of Gestalt-related (OD vs. MD) and perception-related (bound vs. unbound) factors. Their interaction was expected to reveal the neural networks synchronized specifically in the conflict situation. The synchronization topography of EEG was analyzed with the multivariate S-estimator technique. We found that good Gestalt (OD vs. MD) was associated with a higher posterior synchronization in the beta-gamma band. The effect of perception manifested itself as reciprocal modulations over the posterior and anterior regions (theta/beta-gamma bands). Specifically, higher posterior and lower anterior synchronization supported the bound percept, and the opposite was true for the unbound percept. The interaction showed that binding under challenging perceptual conditions is sustained by enhanced parietal synchronization. We argue that this distributed pattern of synchronization relates to the processes of multistage integration ranging from early grouping operations in the visual areas to maintaining representations in the frontal networks of sensory memory.
Abstract:
The Athlete Biological Passport (ABP) is an individual electronic document that collects data regarding a specific athlete that is useful in differentiating between natural physiologic variations of selected biomarkers and deviations caused by artificial manipulations. A subsidiary of the endocrine module of the ABP, here called the Athlete Steroidal Passport (ASP), collects data on markers of an altered metabolism of endogenous steroidal hormones measured in urine samples. The ASP aims to identify not only doping with anabolic-androgenic steroids, but also most indirect steroid doping strategies such as doping with estrogen receptor antagonists and aromatase inhibitors. The development of specific markers of steroid doping, the use of the athlete's previous measurements to define individual limits (with the athlete becoming his or her own reference), the inclusion of heterogeneous factors such as the UDP-glucuronosyltransferase 2B17 (UGT2B17) genotype of the athlete, the knowledge of potentially confounding effects such as heavy alcohol consumption, the development of an external quality control system to control analytical uncertainty, and finally the use of Bayesian inferential methods to evaluate the value of indirect evidence have made the ASP a valuable alternative to deter steroid doping in elite sports. The ASP can be used to target athletes for gas chromatography/combustion/isotope ratio mass spectrometry (GC/C/IRMS) testing, to temporarily withdraw the athlete from competition when an abnormality has been detected, and ultimately to lead to an antidoping infraction if that abnormality cannot be explained by a medical condition. Although the ASP has been developed primarily to ensure fairness in elite sports, its application in endocrinology for clinical purposes is straightforward in an evidence-based medicine paradigm.
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
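The kernel-smoothing step described here can be illustrated with a small sketch. Everything below is an assumption for illustration only (an AR(1) toy model, a Gaussian Nadaraya-Watson smoother, identity weighting), not the paper's actual model or software:

```python
# Sketch: kernel-smoothed conditional moments from a long simulation, then a simple
# method-of-moments objective. The AR(1) model and all names are illustrative only.
import numpy as np

def simulate_ar1(theta, n, rng):
    """Simulate a long path from y_t = rho * y_{t-1} + sigma * e_t."""
    rho, sigma = theta
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + sigma * rng.standard_normal()
    return y

def kernel_conditional_mean(x_sim, y_sim, x_eval, h):
    """Nadaraya-Watson estimate of E[y | x] at the points x_eval (Gaussian kernel)."""
    u = (x_eval[:, None] - x_sim[None, :]) / h
    w = np.exp(-0.5 * u ** 2)
    return (w * y_sim[None, :]).sum(axis=1) / w.sum(axis=1)

def mm_objective(theta, data, n_sim=200_000, h=0.2, seed=0):
    """Distance between observed outcomes and the simulated, kernel-smoothed conditional mean."""
    rng = np.random.default_rng(seed)
    sim = simulate_ar1(theta, n_sim, rng)
    x_sim, y_sim = sim[:-1], sim[1:]
    x_obs, y_obs = np.asarray(data)[:-1], np.asarray(data)[1:]
    m_theta = kernel_conditional_mean(x_sim, y_sim, x_obs, h)
    resid = y_obs - m_theta                      # moment conditions E[y_t - m(y_{t-1}; theta)] = 0
    g = np.array([resid.mean(), (resid * x_obs).mean()])
    return g @ g                                  # identity weighting, for simplicity
```

Minimizing mm_objective over theta would then play the role of the method-of-moments step; in practice the bandwidth, the weighting matrix and the number of simulations all matter, as the abstract's standard-error adjustment suggests.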
Abstract:
This is a guide that explains how to use software that implements the simulated nonparametric moments (SNM) estimator proposed by Creel and Kristensen (2009). The guide shows how results of that paper may easily be replicated, and explains how to install and use the software for estimation of simulable econometric models.
Abstract:
Forensic scientists working in 12 state or private laboratories participated in collaborative tests to improve the reliability of the presentation of DNA data at trial. These tests were motivated by the growing criticism of the power of DNA evidence. The experts' conclusions in the tests are presented and discussed in the context of the Bayesian approach to interpretation. The use of a Bayesian approach and subjective probabilities in trace evaluation allows any revision of the measure of uncertainty in the light of new information to be integrated into the decision procedure in an easy and intuitive manner. Such an integration is especially useful with forensic evidence. Furthermore, we believe that this probabilistic model is a useful tool (a) to assist scientists in the assessment of the value of scientific evidence, (b) to help jurists in the interpretation of judicial facts and (c) to clarify the respective roles of scientists and of members of the court. Respondents to the survey were reluctant to apply this methodology in the assessment of DNA evidence.
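The Bayesian updating referred to here is conventionally written in odds form, with the scientist reporting the likelihood ratio and the court supplying the prior odds; this is the standard textbook formulation, not a quotation from the paper:

```latex
\underbrace{\frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)}}_{\text{posterior odds}}
  = \underbrace{\frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}}_{\text{likelihood ratio (scientist)}}
    \times
    \underbrace{\frac{\Pr(H_p)}{\Pr(H_d)}}_{\text{prior odds (court)}}
```

Here E is the evidence and H_p, H_d are the prosecution and defence propositions; each new piece of information simply multiplies the current odds by its own likelihood ratio.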
Abstract:
In this paper we included a very broad representation of grass family diversity (84% of tribes and 42% of genera). Phylogenetic inference was based on three plastid DNA regions (rbcL, matK and trnL-F), using maximum parsimony and Bayesian methods. Our results resolved most of the subfamily relationships within the major clades (BEP and PACCMAD) that had previously been unclear, such as, among others: (i) the BEP and PACCMAD sister relationship, (ii) the composition of clades and the sister relationship of Ehrhartoideae and Bambusoideae + Pooideae, (iii) the paraphyly of tribe Bambuseae, (iv) the position of Gynerium as sister to Panicoideae, and (v) the phylogenetic position of Micrairoideae. By allowing a relatively large amount of missing data, we were able to increase taxon sampling in our analyses substantially, from 107 to 295 taxa. However, bootstrap support and, to a lesser extent, Bayesian posterior probabilities were generally lower in analyses involving missing data than in those not including them. We produced a fully resolved phylogenetic summary tree for the grass family at the subfamily level and indicated the most likely relationships of all tribes included in our analysis.
Abstract:
This study presents a classification criterion for two-class Cannabis seedlings. As the cultivation of drug type cannabis is forbidden in Switzerland, law enforcement authorities regularly ask laboratories to determine the chemotype of cannabis plants from seized material in order to ascertain whether the plantation is legal or not. In this study, the classification analysis is based on data obtained from the relative proportions of three major leaf compounds measured by gas chromatography interfaced with mass spectrometry (GC-MS). The aim is to discriminate between drug type (illegal) and fiber type (legal) cannabis at an early stage of growth. A Bayesian procedure is proposed: a Bayes factor is computed and classification is performed on the basis of the decision maker's specifications (i.e. prior probability distributions on cannabis type and the consequences of classification, measured by losses). Classification rates are computed with two statistical models and the results are compared. A sensitivity analysis is then performed to assess the robustness of the classification criterion.
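A minimal sketch of the kind of Bayes-factor decision rule described here, assuming Gaussian class-conditional models for the three compound proportions and placeholder priors and losses (none of these choices are taken from the paper):

```python
# Sketch: two-class decision from a Bayes factor, prior odds and misclassification losses.
# Class-conditional densities, priors and losses are placeholders, not the paper's models.
import numpy as np
from scipy.stats import multivariate_normal

def bayes_factor(x, mean_drug, cov_drug, mean_fiber, cov_fiber):
    """BF = p(x | drug type) / p(x | fiber type) under Gaussian class models."""
    f_drug = multivariate_normal.pdf(x, mean=mean_drug, cov=cov_drug)
    f_fiber = multivariate_normal.pdf(x, mean=mean_fiber, cov=cov_fiber)
    return f_drug / f_fiber

def classify(x, prior_drug, loss_call_drug_when_fiber, loss_call_fiber_when_drug, **params):
    """Decide 'drug type' when the posterior odds exceed the ratio of losses."""
    posterior_odds = bayes_factor(x, **params) * prior_drug / (1.0 - prior_drug)
    threshold = loss_call_drug_when_fiber / loss_call_fiber_when_drug
    return "drug type" if posterior_odds > threshold else "fiber type"

# Hypothetical usage with made-up numbers (three relative leaf-compound proportions)
decision = classify(
    np.array([0.6, 0.3, 0.1]),
    prior_drug=0.5,
    loss_call_drug_when_fiber=1.0,   # cost of wrongly declaring a legal plantation illegal
    loss_call_fiber_when_drug=1.0,   # cost of missing a drug type plantation
    mean_drug=np.array([0.7, 0.2, 0.1]), cov_drug=0.01 * np.eye(3),
    mean_fiber=np.array([0.2, 0.3, 0.5]), cov_fiber=0.01 * np.eye(3),
)
```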
Abstract:
Nandrolone (19-nortestosterone) is a widely used anabolic steroid in sports where strength plays an essential role. Once nandrolone has been metabolised, two major metabolites are excreted in urine, 19-norandrosterone (NA) and 19-noretiocholanolone (NE). In 1997, in France, quite a few sportsmen had concentrations of 19-norandrosterone very close to the IOC cut-off limit (2 ng/ml). At that time, a debate took place about the capability of the human male body to produce these metabolites by itself, without any intake of nandrolone or related compounds. The International Football Federation (FIFA) was very concerned by this problem, especially because the World Cup was about to start in France. In this respect, a statistical study was conducted with all football players from the first and second divisions of the Swiss Football National League. All players gave a urine sample after effort and around 6% of them showed traces of 19-norandrosterone. These results were compared with those of amateur football players (control group): around 6% of them had very small amounts of 19-norandrosterone and/or 19-noretiocholanolone in urine after effort, whereas none of them had detectable traces of either metabolite before effort. The origin of these compounds in urine after strenuous physical activity is still unknown, but three hypotheses can be put forward. First, an endogenous production of nandrolone metabolites takes place. Second, nandrolone metabolites are released from the fatty tissues after an intake of nandrolone, some related compounds or some contaminated nutritive supplements. Finally, the sportsmen may have taken something during or just before the football game.
Abstract:
We use a difference-in-difference estimator to examine the effects of a merger involving three airlines. The novelty lies in the examination of this operation in two distinct scenarios: (1) routes where two of the low-cost carriers had previously been competing, and (2) routes where a network carrier and one of the low-cost airlines had previously been competing. We report a reduction in frequencies but no substantial effect on prices in the first scenario, while in the second we report an increase in prices but no substantial effect on frequencies. These results may be attributed to differences in the types of passengers flying on these routes.
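A minimal sketch of the kind of two-way fixed-effects difference-in-difference regression implied here, with invented column names and a single outcome; the paper's actual data, controls and specification may differ:

```python
# Sketch: difference-in-differences on a route-level panel with route and period fixed
# effects and clustered standard errors. Column names are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

def did_estimate(df: pd.DataFrame, outcome: str = "log_price"):
    """df needs numeric 0/1 columns 'treated' and 'post', plus 'route_id' and 'period'."""
    model = smf.ols(f"{outcome} ~ treated:post + C(route_id) + C(period)", data=df)
    res = model.fit(cov_type="cluster", cov_kwds={"groups": df["route_id"]})
    # The coefficient on treated:post is the difference-in-difference effect
    return res.params["treated:post"], res.bse["treated:post"]
```

Running the same regression with a frequency outcome in place of log_price would mirror the price/frequency contrast between the two scenarios.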
Abstract:
We examine the evolution of monetary policy rules in a group of inflation targeting countries (Australia, Canada, New Zealand, Sweden and the United Kingdom), applying a moment-based estimator to a time-varying parameter model with endogenous regressors. Using this novel flexible framework, our main findings are threefold. First, monetary policy rules change gradually, pointing to the importance of applying a time-varying estimation framework. Second, the interest rate smoothing parameter is much lower than what previous time-invariant estimates of policy rules typically report. External factors matter for all countries, although the importance of the exchange rate diminishes after the adoption of inflation targeting. Third, the response of interest rates to inflation is particularly strong during periods when central bankers want to break a record of high inflation, such as in the U.K. or in Australia at the beginning of the 1980s. Contrary to common wisdom, the response becomes less aggressive after the adoption of inflation targeting, suggesting a positive effect of this regime on anchoring inflation expectations. This result is supported by our finding that inflation persistence as well as the policy neutral rate typically decreased after the adoption of inflation targeting.
Abstract:
Given a sample from a fully specified parametric model, let Zn be a given finite-dimensional statistic - for example, an initial estimator or a set of sample moments. We propose to (re-)estimate the parameters of the model by maximizing the likelihood of Zn. We call this the maximum indirect likelihood (MIL) estimator. We also propose a computationally tractable Bayesian version of the estimator which we refer to as a Bayesian Indirect Likelihood (BIL) estimator. In most cases, the density of the statistic will be of unknown form, and we develop simulated versions of the MIL and BIL estimators. We show that the indirect likelihood estimators are consistent and asymptotically normally distributed, with the same asymptotic variance as that of the corresponding efficient two-step GMM estimator based on the same statistic. However, our likelihood-based estimators, by taking into account the full finite-sample distribution of the statistic, are higher order efficient relative to GMM-type estimators. Furthermore, in many cases they enjoy a bias reduction property similar to that of the indirect inference estimator. Monte Carlo results for a number of applications including dynamic and nonlinear panel data models, a structural auction model and two DSGE models show that the proposed estimators indeed have attractive finite sample properties.
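A sketch of the simulated version of the indirect likelihood idea, assuming a generic simulable model and a kernel density estimate of the statistic's distribution; the function names and the product-Gaussian kernel are illustrative, not the authors' implementation:

```python
# Sketch: simulated maximum indirect likelihood. The density of the statistic Z_n at its
# observed value is approximated by simulating the model repeatedly at theta and
# kernel-smoothing the simulated statistics. Model and statistic are placeholders.
import numpy as np

def simulated_indirect_loglik(theta, z_obs, simulate_model, statistic,
                              n_rep=500, h=0.1, seed=0):
    rng = np.random.default_rng(seed)
    z_obs = np.atleast_1d(np.asarray(z_obs, dtype=float))
    # Simulate the statistic n_rep times under the trial parameter value
    z_sim = np.array([np.atleast_1d(statistic(simulate_model(theta, rng)))
                      for _ in range(n_rep)])
    # Product-Gaussian kernel density estimate of p(Z_n = z_obs | theta)
    u = (z_sim - z_obs[None, :]) / h
    dens = np.mean(np.exp(-0.5 * (u ** 2).sum(axis=1)))
    dens /= (np.sqrt(2.0 * np.pi) * h) ** z_obs.size
    return np.log(dens + 1e-300)
```

Maximizing this criterion over theta gives a simulated MIL-type estimator; reusing the same log-density inside an MCMC sampler with a prior on theta would give a simulated BIL-type estimator.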
Abstract:
The location and timing of domestication of the olive tree, a key crop in Early Mediterranean societies, remain hotly debated. Here, we unravel the history of wild olives (oleasters), and then infer the primary origins of the domesticated olive. Phylogeography and Bayesian molecular dating analyses based on plastid genome profiling of 1263 oleasters and 534 cultivated genotypes reveal three main lineages of pre-Quaternary origin. Regional hotspots of plastid diversity, species distribution modelling and macrofossils support the existence of three long-term refugia; namely the Near East (including Cyprus), the Aegean area and the Strait of Gibraltar. These ancestral wild gene pools have provided the essential foundations for cultivated olive breeding. Comparison of the geographical pattern of plastid diversity between wild and cultivated olives indicates the cradle of first domestication in the northern Levant followed by dispersals across the Mediterranean basin in parallel with the expansion of civilizations and human exchanges in this part of the world.
Abstract:
The paper follows on from earlier work [Taroni F and Aitken CGG. Probabilistic reasoning in the law, Part 1: assessment of probabilities and explanation of the value of DNA evidence. Science & Justice 1998; 38: 165-177]. Different explanations of the value of DNA evidence were presented to students from two schools of forensic science and to members of fifteen laboratories all around the world. The responses were divided into two groups: those which came from a school or laboratory identified as Bayesian and those which came from a school or laboratory identified as non-Bayesian. The paper analyses these responses using a likelihood approach. This approach is more consistent with a Bayesian analysis than the frequentist-based one reported by Taroni and Aitken in the Science & Justice 1998 paper cited above.
Abstract:
This paper examines the determinants of young innovative companies’ (YICs) R&D activities taking into account the autoregressive nature of innovation. Using a large longitudinal dataset comprising Spanish manufacturing firms over the period 1990-2008, we find that previous R&D experience is a fundamental determinant for mature and young firms, albeit to a lesser extent in the case of the YICs, suggesting that their innovation behaviour is less persistent and more erratic. Moreover, our results suggest that firm and market characteristics play a distinct role in boosting the innovation activity of firms of different ages. In particular, while market concentration and the degree of product diversification are found to be important in boosting R&D activities in the sub-sample of mature firms only, YICs’ spending on R&D appears to be more sensitive to demand-pull variables, suggesting the presence of credit constraints. These results have been obtained using a recently proposed dynamic type-2 tobit estimator, which accounts for individual effects and efficiently handles the initial conditions problem.