960 results for HYPOTHESIS TESTS


Relevance:

60.00%

Publisher:

Abstract:

Exchange rate economics has achieved substantial development in the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studied three puzzling issues aiming to improve our understanding of exchange rate behavior. Chapter Two used advanced econometric techniques to model and forecast exchange rate dynamics. Chapters Three and Four studied issues related to exchange rates using the theory of New Open Economy Macroeconomics. Chapter Two empirically examined the short-run forecastability of nominal exchange rates. It analyzed important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH model with a skewed Student-t error distribution was identified. The forecasting performance of the model was compared with that of a random walk model. Results supported the contention that nominal exchange rates seem to be unpredictable over the short run, in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements. Chapter Three assessed the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It developed a tractable two-country model where agents face a cash-in-advance constraint and set prices to the local market; the exogenous money supply process exhibits time-varying volatility. The model yielded approximate closed-form solutions for risk premia and real exchange rates. Numerical results provided quantitative evidence that volatile risk premia can endogenously arise in a new open economy macroeconomic model. Thus, the model had the potential to rationalize the Uncovered Interest Parity Puzzle. Chapter Four sought to resolve the consumption-real exchange rate anomaly, which refers to the inability of most international macro models to generate the negative cross-correlations between real exchange rates and relative consumption across two countries observed in the data. While maintaining the assumption of complete asset markets, this chapter introduced endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results showed that such a model could replicate the stylized fact that real exchange rates tend to move in the opposite direction to relative consumption.
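
As a hedged illustration of the forecast-evaluation step described above, the sketch below compares one-step forecasts from a stand-in model against random-walk (no-change) forecasts with a Diebold-Mariano test. Everything here is a simulated placeholder: the moving-average "model" merely stands in for the dissertation's fractionally integrated GARCH with skewed Student-t errors, which the sketch does not fit.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical daily log exchange rate: a pure random walk.
T = 1000
log_rate = np.cumsum(rng.normal(0.0, 0.006, T))

# Stand-in "model": a 5-day moving average forecasting the next day; the
# dissertation's actual model is a FIGARCH with skewed Student-t errors.
window = 5
model_forecast = np.convolve(log_rate, np.ones(window) / window, mode="valid")[:-1]
actual = log_rate[window:]
rw_forecast = log_rate[window - 1:-1]   # random walk: forecast = today's rate

# Diebold-Mariano test on the differential of squared forecast errors.
d = (actual - model_forecast) ** 2 - (actual - rw_forecast) ** 2
dm = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
p_value = 2 * stats.norm.sf(abs(dm))
print(f"DM statistic {dm:.2f}, p-value {p_value:.3f}")  # H0: equal accuracy
```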


Relevance:

60.00%

Publisher:

Abstract:

In longitudinal data analysis, our primary interest is in the regression parameters for the marginal expectations of the longitudinal responses; the longitudinal correlation parameters are of secondary interest. The joint likelihood function for longitudinal data is challenging, particularly for correlated discrete outcome data. Marginal modeling approaches such as generalized estimating equations (GEEs) have received much attention in the context of longitudinal regression. These methods are based on estimates of the first two moments of the data and a working correlation structure; confidence regions and hypothesis tests are based on asymptotic normality. The methods are sensitive to misspecification of the variance function and the working correlation structure: because of such misspecifications, the estimates can be inefficient and inconsistent, and inference may give incorrect results. To overcome this problem, we propose an empirical likelihood (EL) procedure based on a set of estimating equations for the parameter of interest and discuss its characteristics and asymptotic properties. We also provide an algorithm based on EL principles for the estimation of the regression parameters and the construction of a confidence region for the parameter of interest. We extend our approach to variable selection for high-dimensional longitudinal data with many covariates, where it is necessary to identify a submodel that adequately represents the data; including redundant variables may impact the model's accuracy and efficiency for inference. We propose a penalized empirical likelihood (PEL) variable selection based on GEEs, in which variable selection and estimation of the coefficients are carried out simultaneously. We discuss its characteristics and asymptotic properties, and present an algorithm for optimizing PEL. Simulation studies show that when the model assumptions are correct, our method performs as well as existing methods, and when the model is misspecified, it has clear advantages. We have applied the method to two case examples.
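
A minimal sketch of the empirical likelihood machinery, reduced to the simplest case of a single mean parameter with i.i.d. data (Owen's construction); the paper's procedure instead builds the estimating equations from GEEs for correlated longitudinal outcomes. The Poisson data and the tested mean below are illustrative assumptions.

```python
import numpy as np
from scipy import optimize, stats

def el_ratio_stat(x, mu):
    """Empirical likelihood ratio statistic for H0: E[X] = mu."""
    z = x - mu
    if z.min() >= 0 or z.max() <= 0:
        return np.inf              # mu outside the convex hull of the data
    eps = 1e-10
    lo, hi = -1.0 / z.max() + eps, -1.0 / z.min() - eps
    # The Lagrange multiplier solves sum z_i / (1 + lam * z_i) = 0.
    lam = optimize.brentq(lambda l: np.sum(z / (1.0 + l * z)), lo, hi)
    return 2.0 * np.sum(np.log1p(lam * z))   # ~ chi2(1) under H0

rng = np.random.default_rng(1)
# Discrete outcomes, i.i.d. here for brevity (the paper handles correlation).
x = rng.poisson(3.0, size=200).astype(float)
stat = el_ratio_stat(x, mu=3.0)
print("EL ratio:", stat, "p-value:", stats.chi2.sf(stat, df=1))
```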

Relevance:

40.00%

Publisher:

Abstract:

In this paper, we test the Prebisch–Singer (PS) hypothesis, which states that real commodity prices decline in the long run, using two recent powerful panel data stationarity tests that account for cross-sectional dependence and a structural break. We find that the hypothesis cannot be rejected for most commodities other than oil.
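
For intuition, a single-series analogue of the trend question: an augmented Dickey-Fuller test with a deterministic trend applied to a simulated declining commodity price. Note the inversion of roles relative to the paper: the ADF null is a unit root, whereas the panel stationarity tests used above take stationarity as the null; the series and parameters below are invented.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(2)

# Hypothetical real commodity price (log): downward trend + stationary AR(1).
T = 150
noise = np.zeros(T)
for t in range(1, T):
    noise[t] = 0.6 * noise[t - 1] + rng.normal(0, 0.05)
log_price = 2.0 - 0.004 * np.arange(T) + noise

# Univariate ADF test allowing a deterministic trend ('ct'); the paper instead
# uses panel tests robust to cross-sectional dependence and a break.
stat, pvalue, *_ = adfuller(log_price, regression="ct")
print(f"ADF stat {stat:.2f}, p-value {pvalue:.3f}")  # small p => trend-stationary
```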

Relevance:

40.00%

Publisher:

Abstract:

This study tested a dynamic field theory (DFT) of spatial working memory and an associated spatial precision hypothesis (SPH). Between 3 and 6 years of age, there is a qualitative shift in how children use reference axes to remember locations: 3-year-olds’ spatial recall responses are biased toward reference axes after short memory delays, whereas 6-year-olds’ responses are biased away from reference axes. According to the DFT and the SPH, quantitative improvements over development in the precision of excitatory and inhibitory working memory processes lead to this qualitative shift. Simulations of the DFT in Experiment 1 predict that improvements in precision should cause the spatial range of targets attracted toward a reference axis to narrow gradually over development, with repulsion emerging and gradually increasing until responses to most targets show biases away from the axis. Results from Experiment 2 with 3- to 5-year-olds support these predictions. Simulations of the DFT in Experiment 3 quantitatively fit the empirical results and offer insights into the neural processes underlying this developmental change.
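Purely as a sketch of the modeling substrate, the snippet below simulates a one-layer Amari-style dynamic neural field with local excitation and global inhibition, the kind of dynamics DFT builds on: a brief target input creates an activation peak that sustains itself through the memory delay. All parameters are invented and tuned only for illustration; the paper's model involves additional interacting fields and reference-axis inputs.

```python
import numpy as np

# One-dimensional Amari field: tau * du/dt = -u + h + input + w * f(u).
n, dx = 201, 0.5
x = (np.arange(n) - n // 2) * dx       # spatial positions
tau, h = 10.0, -5.0                    # time constant and resting level

def kernel(d, c_exc=4.0, s_exc=3.0, g_inh=1.0):
    """Local excitation with broad (global) inhibition."""
    return c_exc * np.exp(-d**2 / (2 * s_exc**2)) - g_inh

w = kernel(x[:, None] - x[None, :]) * dx
f = lambda u: 1.0 / (1.0 + np.exp(-4.0 * u))      # sigmoidal firing rate

u = np.full(n, h)
target = 10.0 * np.exp(-(x - 5.0) ** 2 / 8.0)     # brief target input near x=5
for step in range(400):
    s = target if step < 100 else 0.0             # input removed during delay
    u += (-u + h + s + w @ f(u)) / tau

print(f"remembered location after delay: {x[np.argmax(u)]:.1f}")
```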

Relevance:

40.00%

Publisher:

Abstract:

The Intersensory Redundancy Hypothesis (IRH; Bahrick & Lickliter, 2000, 2002, 2012) predicts that early in development information presented to a single sense modality will selectively recruit attention to modality-specific properties of stimulation and facilitate learning of those properties at the expense of amodal properties (unimodal facilitation). Vaillant (2010) demonstrated that bobwhite quail chicks prenatally exposed to a maternal call alone (unimodal stimulation) are able to detect a pitch change, a modality-specific property, in subsequent postnatal testing between the familiarized call and the same call with altered pitch. In contrast, chicks prenatally exposed to a maternal call paired with a temporally synchronous light (redundant audiovisual stimulation) were unable to detect a pitch change. According to the IRH (Bahrick & Lickliter, 2012), as development proceeds and the individual's perceptual abilities increase, the individual should detect modality-specific properties in both nonredundant (unimodal) and redundant (bimodal) conditions. However, when the perceiver is presented with a task that is difficult relative to their level of expertise, unimodal facilitation should become evident. The first experiment of the present study exposed bobwhite quail chicks 24 hr after hatching to unimodal auditory, nonredundant audiovisual, or redundant audiovisual presentations of a maternal call for 10 min/hr across 24 hours. All chicks were subsequently tested 24 hr after the completion of the stimulation (72 hr following hatching) between the familiarized maternal call and the same call with altered pitch. Chicks from all experimental groups (unimodal, nonredundant audiovisual, and redundant audiovisual exposure) significantly preferred the familiarized call over the pitch-modified call. The second experiment exposed chicks to the same exposure conditions but created a more difficult task by narrowing the pitch range between the two maternal calls with which they were tested. Chicks in the unimodal and nonredundant audiovisual conditions demonstrated detection of the pitch change, whereas the redundant audiovisual exposure group did not, providing evidence of unimodal facilitation. These results are consistent with predictions of the IRH and provide further support for the effects of unimodal facilitation and the role of task difficulty across early development.

Relevance:

30.00%

Publisher:

Abstract:

Age-related maculopathy (ARM) has remained a challenging topic with respect to its aetiology, pathomechanisms, early detection and treatment since the late 19th century, when it was first described as an entity in its own right. ARM has previously been considered an inflammatory disease, a degenerative disease, a tumor, and the result of choroidal hemodynamic disturbances and ischaemia. The latter processes have repeatedly been suggested to play a key role in its development and progression. In vivo experiments under hypoxic conditions could serve as models for the ischaemic deficits in ARM. Recent research has also linked ARM with gene polymorphisms; it is, however, unclear what triggers a person's gene susceptibility. In this manuscript, a hypothesis linking aetiological factors, including ischaemia and genetics, to the development of early clinicopathological changes in ARM is proposed. New clinical psychophysical and electrophysiological tests that can detect ARM at an early stage are introduced. Models of early ARM based upon hemodynamic, photoreceptor and post-receptoral deficits are described, and the mechanisms by which ischaemia may be involved as a final common pathway are considered. In neovascular age-related macular degeneration (neovascular AMD), ischaemia is thought to promote release of vascular endothelial growth factor (VEGF), which induces chorioretinal neovascularisation; VEGF is also critical in the maintenance of the healthy choriocapillaris. In the final section of the manuscript, the documentation of the effect of new anti-VEGF treatments on retinal function in neovascular AMD is critically reviewed.

Relevance:

30.00%

Publisher:

Abstract:

We consider the problem of detecting statistically significant sequential patterns in multineuronal spike trains. These patterns are characterized by ordered sequences of spikes from different neurons with specific delays between spikes. We have previously proposed a data-mining scheme to efficiently discover such patterns, which occur often enough in the data. Here we propose a method to determine the statistical significance of such repeating patterns. The novelty of our approach is that we use a compound null hypothesis that not only includes models of independent neurons but also models where neurons have weak dependencies. The strength of interaction among the neurons is represented in terms of certain pair-wise conditional probabilities. We specify our null hypothesis by putting an upper bound on all such conditional probabilities. We construct a probabilistic model that captures the counting process and use this to derive a test of significance for rejecting such a compound null hypothesis. The structure of our null hypothesis also allows us to rank-order different significant patterns. We illustrate the effectiveness of our approach using spike trains generated with a simulator.
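A simplified, hypothetical version of the counting argument: occurrences of a fixed-delay three-neuron pattern are counted in binned spike trains, and significance is assessed against a binomial bound obtained by chaining the first neuron's firing rate with an upper bound p* on each pair-wise conditional probability. The rates, delays, and bound are invented, and the paper's probabilistic model of the counting process is more careful than this sketch.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_bins, n_neurons = 20000, 3          # hypothetical 1 ms bins
spikes = rng.random((n_neurons, n_bins)) < 0.02   # ~20 Hz background

# Embed a repeating pattern A -> B (delay 5 bins) -> C (delay 5 bins).
starts = rng.choice(n_bins - 10, size=60, replace=False)
for t in starts:
    spikes[0, t] = spikes[1, t + 5] = spikes[2, t + 10] = True

# Count pattern occurrences across all candidate start bins.
count = int(np.sum(spikes[0, :-10] & spikes[1, 5:-5] & spikes[2, 10:]))

# Compound null: rate of A times an upper bound p* on each pair-wise
# conditional probability, e.g. P(B at t+5 | A at t) <= p*.
p_A = spikes[0].mean()
p_star = 0.03                          # null bound on weak dependencies
p0 = p_A * p_star ** 2                 # chain bound for a 3-neuron pattern
p_value = stats.binom.sf(count - 1, n_bins - 10, p0)
print(f"count={count}, p-value under compound null: {p_value:.2e}")
```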

Relevance:

30.00%

Publisher:

Abstract:

Bootstrap likelihood ratio tests of cointegration rank are commonly used because they tend to have rejection probabilities that are closer to the nominal level than the rejection probabilities of the corresponding asymptotic tests. The effect of bootstrapping the test on its power is largely unknown. We show that a new computationally inexpensive procedure can be applied to the estimation of the power function of the bootstrap test of cointegration rank. The bootstrap test is found to have a power function close to that of the level-adjusted asymptotic test. The bootstrap test estimates the level-adjusted power of the asymptotic test highly accurately. The bootstrap test may have low power to reject the null hypothesis of cointegration rank zero, or underestimate the cointegration rank. An empirical application to Euribor interest rates is provided as an illustration of the findings.
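
To fix ideas, the sketch below estimates the power of a bootstrap unit-root test by brute-force nested simulation, with a simple Dickey-Fuller regression standing in for the likelihood ratio test of cointegration rank. This naive nested loop is exactly the expensive computation that a cheaper procedure, like the one the paper proposes, is designed to avoid; all data-generating parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def dickey_fuller_t(y):
    """t-statistic on rho in  dy_t = rho * y_{t-1} + e_t (no constant)."""
    dy, ylag = np.diff(y), y[:-1]
    rho = ylag @ dy / (ylag @ ylag)
    resid = dy - rho * ylag
    se = np.sqrt(resid @ resid / (len(dy) - 1) / (ylag @ ylag))
    return rho / se, resid

def bootstrap_pvalue(y, n_boot=199):
    stat, resid = dickey_fuller_t(y)
    boot = np.empty(n_boot)
    for b in range(n_boot):
        # Resample residuals and rebuild a random walk (the null model).
        e = rng.choice(resid, size=len(resid), replace=True)
        yb = np.concatenate([[y[0]], y[0] + np.cumsum(e)])
        boot[b], _ = dickey_fuller_t(yb)
    return (1 + np.sum(boot <= stat)) / (1 + n_boot)

# Monte Carlo estimate of the bootstrap test's power at rho = 0.95.
rejections, n_mc = 0, 200
for _ in range(n_mc):
    e = rng.normal(size=200)
    y = np.zeros(200)
    for t in range(1, 200):
        y[t] = 0.95 * y[t - 1] + e[t]
    rejections += bootstrap_pvalue(y) < 0.05
print(f"estimated power: {rejections / n_mc:.2f}")
```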

Relevance:

30.00%

Publisher:

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and model complexity. There is now a plethora of models, based on different assumptions and applicable in different contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.

We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and would make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests: Bayesian Rapid Optimal Adaptive Designs (BROAD), which sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which informs the next most informative test to run. BROAD utilizes the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that these popular criteria can, surprisingly, perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedup over other methods.
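
A toy version of the adaptive loop, using greedy expected-entropy reduction (information gain) rather than EC2, since a faithful EC2 implementation needs the equivalence-class machinery the thesis develops; information gain is, in fact, one of the simpler criteria the thesis argues can fail under noise. Hypotheses, tests, payoffs, and the noise model below are all invented placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypotheses: candidate utility curvatures; tests: (sure amount, lottery) pairs.
thetas = np.array([0.3, 0.5, 0.7, 0.9, 1.1])
prior = np.full(len(thetas), 1.0 / len(thetas))
tests = [(s, (p, z)) for s in (2, 4, 6, 8) for p, z in [(0.5, 10), (0.25, 20)]]

def p_choose_sure(theta, test, noise=0.1):
    """Probability the agent takes the sure amount, with response noise."""
    s, (p, z) = test
    prefers_sure = s ** theta > p * z ** theta
    return (1 - noise) if prefers_sure else noise

def expected_entropy_after(test, prior):
    """Posterior entropy averaged over the two possible responses."""
    like_sure = np.array([p_choose_sure(th, test) for th in thetas])
    out = 0.0
    for like in (like_sure, 1 - like_sure):
        marginal = like @ prior
        post = like * prior / marginal
        out -= marginal * np.sum(post * np.log(post + 1e-12))
    return out

true_theta = 0.7
for trial in range(8):
    test = min(tests, key=lambda t: expected_entropy_after(t, prior))
    chose_sure = rng.random() < p_choose_sure(true_theta, test)
    like = np.array([p_choose_sure(th, test) for th in thetas])
    like = like if chose_sure else 1 - like
    prior = like * prior / (like @ prior)   # Bayesian posterior update
print("posterior:", dict(zip(thetas, prior.round(3))))
```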

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely: expected value, prospect theory, constant relative risk aversion (CRRA) and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types showed that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e., subjects could mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out since it is infeasible in practice, and also since we do not find any signatures of it in our data.
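
For concreteness, a sketch of how three of these theory classes score the same hypothetical mixed lottery. The functional forms are standard, but every parameter value and the lottery itself are invented, and prospect theory's probability weighting is omitted here (its value function appears in a later sketch).

```python
import numpy as np

# One mixed lottery: outcomes and probabilities (hypothetical test item).
outcomes = np.array([-5.0, 0.0, 12.0])
probs = np.array([0.25, 0.25, 0.50])

# Expected value.
ev = probs @ outcomes

# CRRA utility over wealth w0 + x (needs an endowment w0 covering losses).
w0, rho = 20.0, 0.6
crra = probs @ ((w0 + outcomes) ** (1 - rho) - 1) / (1 - rho)

# Moments model: mean, variance and skewness enter linearly.
a_var, a_skew = -0.05, 0.02
m = probs @ outcomes
var = probs @ (outcomes - m) ** 2
skew = probs @ (outcomes - m) ** 3
moments = m + a_var * var + a_skew * skew

print(f"EV {ev:.2f}  CRRA {crra:.2f}  moments {moments:.2f}")
```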

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, "present bias" models (quasi-hyperbolic (α, β) discounting and fixed cost discounting), and generalized-hyperbolic discounting. Forty subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for present bias models and hyperbolic discounting, and most subjects were classified as generalized hyperbolic discounting types, followed by exponential discounting.
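
The discounting families compared here have standard textbook forms, sketched below with invented parameter values; only the functional forms, not the calibrations, are taken from the literature.

```python
import numpy as np

t = np.arange(0, 25)                 # delay in days (illustrative)

def exponential(t, delta=0.97):
    return delta ** t

def hyperbolic(t, k=0.10):
    return 1.0 / (1.0 + k * t)

def quasi_hyperbolic(t, beta=0.7, delta=0.99):
    """Present bias: full weight today, a one-off discount beta afterwards."""
    return np.where(t == 0, 1.0, beta * delta ** t)

def generalized_hyperbolic(t, alpha=1.0, beta_g=2.0):
    return (1.0 + alpha * t) ** (-beta_g / alpha)

for f in (exponential, hyperbolic, quasi_hyperbolic, generalized_hyperbolic):
    print(f"{f.__name__:>22}: weight at t=10 is {float(f(t)[10]):.3f}")
```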

In these models the passage of time is linear. We instead consider a psychological model where the perception of time is subjective. We prove that when the biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.

We also test the predictions of behavioural theories in the "wild". We pay attention to prospect theory, which emerged as the dominant theory in our lab experiments of risky choice. Loss aversion and reference dependence predict that consumers will behave in a way distinct from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, the demand for it will be greater than that explained by its price elasticity. Even more importantly, when the item is no longer discounted, demand for its close substitute would increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications that consumer loss aversion entails, and strategies for competitive pricing.
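
A sketch of the loss-aversion mechanism described here: the Kahneman-Tversky value function weights a loss of a given size more heavily than an equal-sized gain, so a return to the regular price (coded as a loss against a reference price that adapted to the discount) moves demand more than the original discount did. The parameter values are the conventional literature estimates (alpha = 0.88, lambda = 2.25), not values estimated in this thesis.

```python
import numpy as np

def prospect_value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky value function relative to a reference point."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, x ** alpha, -lam * (-x) ** alpha)

# Reference-dependent price evaluation around a discount episode.
regular, discounted = 10.0, 8.0
vals = prospect_value([regular - discounted,    # discount coded as a gain
                       discounted - regular])   # return to regular price: loss
print(f"gain coded as {vals[0]:+.2f}, loss coded as {vals[1]:+.2f}")
# The loss is larger in magnitude, shifting demand toward substitutes.
```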

In future work, BROAD can be widely applied for testing different behavioural models, e.g., in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to more rapidly eliminate hypotheses and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.

Relevance:

30.00%

Publisher:

Abstract:

The polymorphism of a gene or a locus is studied with increasing frequency by multiple laboratories, or by the same group at different times. Such practice results in polymorphism being revealed by different samples at different regions of the locus. Tests of neutrality have been widely conducted for polymorphism data, but commonly used statistical tests cannot be applied directly to such data. This article provides a procedure for conducting a neutrality test on such data, with details given for two commonly used tests. Applying the two new tests to the chemokine-receptor gene (CCR5) in humans, we found that the hypothesis that all mutations are selectively neutral cannot explain the observed pattern of DNA polymorphism.
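
For reference, a sketch of one of the most commonly used neutrality tests, Tajima's D, computed from summary statistics via the constants in Tajima (1989). The article's contribution is adapting such tests to polymorphism data pooled from different samples and regions, which this standard single-sample version does not handle; the input values below are hypothetical.

```python
import numpy as np

def tajimas_d(n, S, pi):
    """Tajima's D from sample size n, segregating sites S, and mean
    pairwise differences pi (Tajima, 1989)."""
    i = np.arange(1, n)
    a1, a2 = np.sum(1.0 / i), np.sum(1.0 / i ** 2)
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n ** 2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1 ** 2
    e1, e2 = c1 / a1, c2 / (a1 ** 2 + a2)
    return (pi - S / a1) / np.sqrt(e1 * S + e2 * S * (S - 1))

# Hypothetical sample: 40 sequences, 30 segregating sites, pi = 4.5.
print(f"Tajima's D = {tajimas_d(40, 30, 4.5):.2f}")  # negative => excess of
                                                     # rare variants
```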

Relevance:

30.00%

Publisher:

Abstract:

We introduce and explore an approach to estimating statistical significance of classification accuracy, which is particularly useful in scientific applications of machine learning where high dimensionality of the data and the small number of training examples render most standard convergence bounds too loose to yield a meaningful guarantee of the generalization ability of the classifier. Instead, we estimate statistical significance of the observed classification accuracy, or the likelihood of observing such accuracy by chance due to spurious correlations of the high-dimensional data patterns with the class labels in the given training set. We adopt permutation testing, a non-parametric technique previously developed in classical statistics for hypothesis testing in the generative setting (i.e., comparing two probability distributions). We demonstrate the method on real examples from neuroimaging studies and DNA microarray analysis and suggest a theoretical analysis of the procedure that relates the asymptotic behavior of the test to the existing convergence bounds.
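A self-contained sketch of the procedure's core idea on assumed data: the training labels are repeatedly permuted, a simple classifier is retrained each time, and the observed accuracy is ranked against the resulting null distribution. The nearest-centroid classifier is only a stand-in for whatever classifier a given study uses, and the data dimensions mimic the small-sample, high-dimensional regime described above.

```python
import numpy as np

rng = np.random.default_rng(6)

def nearest_centroid_accuracy(X_train, y_train, X_test, y_test):
    """Accuracy of a minimal nearest-centroid classifier."""
    c0 = X_train[y_train == 0].mean(axis=0)
    c1 = X_train[y_train == 1].mean(axis=0)
    d0 = np.linalg.norm(X_test - c0, axis=1)
    d1 = np.linalg.norm(X_test - c1, axis=1)
    return np.mean((d1 < d0) == (y_test == 1))

# High-dimensional, small-sample data, as in neuroimaging / microarrays.
n, d = 40, 2000
y = np.repeat([0, 1], n // 2)
X = rng.normal(size=(n, d))
X[y == 1, :5] += 0.8                  # weak signal in 5 of 2000 features

split = rng.permutation(n)
tr, te = split[:30], split[30:]
observed = nearest_centroid_accuracy(X[tr], y[tr], X[te], y[te])

# Permutation null: retrain after shuffling the training labels.
n_perm = 1000
null = np.empty(n_perm)
for b in range(n_perm):
    y_perm = rng.permutation(y[tr])
    null[b] = nearest_centroid_accuracy(X[tr], y_perm, X[te], y[te])
p_value = (1 + np.sum(null >= observed)) / (1 + n_perm)
print(f"accuracy {observed:.2f}, permutation p-value {p_value:.3f}")
```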

Relevance:

30.00%

Publisher:

Abstract:

This paper examines the finite sample properties of three testing regimes for the null hypothesis of a panel unit root against stationary alternatives in the presence of cross-sectional correlation. The regimes of Bai and Ng (2004), Moon and Perron (2004) and Pesaran (2007) are assessed in the presence of multiple factors and other non-standard situations. The behaviour of some information criteria used to determine the number of factors in a panel is examined, and new information criteria with improved properties in small-N panels are proposed. An application to the efficient markets hypothesis is also provided: the null hypothesis of a panel random walk is not rejected by any of the tests, supporting the efficient markets hypothesis in the financial services sector of the Australian Stock Exchange.
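
As an illustration of how cross-sectional dependence can be handled, the sketch below computes a CIPS-style statistic in the spirit of Pesaran (2007): each unit's Dickey-Fuller regression is augmented with cross-section averages that proxy the common factor, and the unit-specific t-statistics are averaged. Lag selection, deterministic terms, and Pesaran's simulated critical values are all omitted; the panel is simulated under the unit-root null.

```python
import numpy as np

rng = np.random.default_rng(7)

def cadf_t(y, ybar):
    """t-statistic on y_{t-1} in a cross-sectionally augmented DF regression."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1], ybar[:-1], np.diff(ybar)])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - X.shape[1])
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

# Panel with a common factor (cross-sectional dependence): y_it = lam_i f_t + e_it.
N, T = 20, 100
f = np.cumsum(rng.normal(size=T))             # nonstationary common factor
lam = rng.uniform(0.5, 1.5, N)
e = rng.normal(size=(N, T)).cumsum(axis=1)    # idiosyncratic random walks
Y = lam[:, None] * f + e                      # panel unit-root null holds

ybar = Y.mean(axis=0)                         # proxies the common factor
cips = np.mean([cadf_t(Y[i], ybar) for i in range(N)])
print(f"CIPS statistic: {cips:.2f}")          # compare to Pesaran's critical values
```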

Relevance:

30.00%

Publisher:

Abstract:

Integrating evidence from multiple domains is useful in prioritizing disease candidate genes for subsequent testing. We ranked all known human genes (n = 3819) under linkage peaks in the Irish Study of High-Density Schizophrenia Families using three different evidence domains: 1) a meta-analysis of microarray gene expression results using the Stanley Brain collection, 2) a schizophrenia protein-protein interaction network, and 3) a systematic literature search. Each gene was assigned a domain-specific p-value and ranked after evaluating the evidence within each domain. For comparison with this ranking process, a large-scale candidate gene hypothesis was also tested by including genes with Gene Ontology terms related to neurodevelopment. Subsequently, genotypes of 3725 SNPs in 167 genes from a custom Illumina iSelect array were used to evaluate the top-ranked vs. hypothesis-selected genes. Seventy-three genes were both highly ranked and involved in neurodevelopment (category 1), while 42 and 52 genes were exclusive to neurodevelopment (category 2) or highly ranked (category 3), respectively. The most significant associations were observed in the genes PRKG1, PRKCE, and CNTN4, but no individual SNPs were significant after correction for multiple testing. Comparison of the approaches showed an excess of significant tests using the hypothesis-driven neurodevelopment category. Random selection of similar-sized gene sets from two independent genome-wide association studies (GWAS) of schizophrenia showed the excess was unlikely to occur by chance. In a further meta-analysis of three GWAS datasets, four candidate SNPs reached nominal significance. Although gene ranking using integrated sources of prior information did not enrich for significant results in the current experiment, gene selection using an a priori hypothesis (neurodevelopment) was superior to random selection. As such, further development of gene ranking strategies using more carefully selected sources of information is warranted.
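
The abstract does not say how the domain-specific p-values were combined into a single ranking; one standard possibility is Fisher's method, sketched below on invented p-values for hypothetical genes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Hypothetical domain-specific p-values per gene, one column per evidence
# domain: expression meta-analysis, interaction network, literature search.
genes = [f"gene{i}" for i in range(5)]
pvals = rng.uniform(0.001, 1.0, size=(5, 3))

# Fisher's method: X = -2 * sum(log p) ~ chi2 with 2k degrees of freedom.
X = -2.0 * np.log(pvals).sum(axis=1)
combined = stats.chi2.sf(X, df=2 * pvals.shape[1])

for g in np.argsort(combined):          # rank genes by combined evidence
    print(f"{genes[g]}: combined p = {combined[g]:.4f}")
```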

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we re-examine two important aspects of the dynamics of relative primary commodity prices, namely the secular trend and the short-run volatility. To do so, we employ 25 series, some of them starting as far back as 1650, and powerful panel data stationarity tests that allow for multiple endogenous structural breaks. Results show that all the series are stationary once multiple endogenous breaks are allowed for. Test results on the Prebisch–Singer hypothesis, which states that relative commodity prices follow a downward secular trend, are mixed, but a majority of series show negative trends. We also make a first attempt at identifying the potential drivers of the structural breaks. We end by investigating the dynamics of the volatility of the 25 relative primary commodity prices, again allowing for multiple endogenous breaks. We describe the often time-varying volatility in commodity prices and show that it has increased in recent years.
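
As a minimal illustration of describing time-varying volatility, the sketch below computes a rolling standard deviation of simulated relative price growth with a single volatility break. The break date, magnitudes, and window are invented, and the paper's endogenous break detection is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical relative commodity price growth with a volatility break at t=200.
T, t_break = 350, 200
sigma = np.where(np.arange(T) < t_break, 0.03, 0.08)
growth = rng.normal(0.0, sigma)

# Rolling standard deviation as a simple description of time-varying volatility.
window = 30
roll_sd = np.array([growth[t - window:t].std(ddof=1) for t in range(window, T)])
print(f"mean rolling sd before break: {roll_sd[:t_break - window].mean():.3f}")
print(f"mean rolling sd after break:  {roll_sd[t_break - window:].mean():.3f}")
```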