919 results for Optimal two-sided tests
Abstract:
This paper considers two-sided tests for the parameter of an endogenous variable in an instrumental variable (IV) model with heteroskedastic and autocorrelated errors. We develop the finite-sample theory of weighted-average power (WAP) tests with normal errors and a known long-run variance. We introduce two weights which are invariant to orthogonal transformations of the instruments, e.g., changing the order in which the instruments appear. While tests using the MM1 weight can be severely biased, optimal tests based on the MM2 weight are naturally two-sided when errors are homoskedastic. We propose two boundary conditions that yield two-sided tests whether errors are homoskedastic or not. The locally unbiased (LU) condition is related to the power around the null hypothesis and is a weaker requirement than unbiasedness. The strongly unbiased (SU) condition is more restrictive than LU, but the associated WAP tests are easier to implement. Several tests are SU in finite samples or asymptotically, including tests robust to weak IV (such as the Anderson-Rubin, score, conditional quasi-likelihood ratio, and I. Andrews' (2015) PI-CLC tests) and two-sided tests which are optimal when the sample size is large and instruments are strong. We refer to the WAP-SU tests based on our weights as MM1-SU and MM2-SU tests. Dropping the restrictive assumptions of normality and known variance, the theory is shown to remain valid at the cost of asymptotic approximations. The MM2-SU test is optimal under strong-IV asymptotics and outperforms other existing tests under weak-IV asymptotics.
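For orientation, here is a minimal sketch of the weak-IV-robust Anderson-Rubin test mentioned above, written for the textbook case of homoskedastic normal errors and a single endogenous regressor (the paper's WAP framework generalizes well beyond this setting); all names are illustrative:

```python
import numpy as np
from scipy import stats

def anderson_rubin_test(y, x, Z, beta0):
    """Anderson-Rubin test of H0: beta = beta0 in y = x*beta + u,
    with instrument matrix Z; homoskedastic normal-error case."""
    n, k = Z.shape
    e = y - x * beta0                               # residuals under the null
    Pz_e = Z @ np.linalg.solve(Z.T @ Z, Z.T @ e)    # projection of e onto Z
    explained = e @ Pz_e                            # variation in e explained by Z
    residual = e @ e - explained
    ar = (explained / k) / (residual / (n - k))
    return ar, stats.f.sf(ar, k, n - k)             # F(k, n-k) null distribution
```

Under H0 the instruments have no explanatory power for the structural residuals, so large values of the statistic reject; the level of the test does not depend on instrument strength, which is what "robust to weak IV" means here.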
Abstract:
Equivalence testing is growing in use in scientific research outside of its traditional role in the drug approval process. Largely because of its ease of use and its recommendation in United States Food and Drug Administration guidance, the most common statistical method for testing (bio)equivalence is the two one-sided tests procedure (TOST). Like classical point-null hypothesis testing, TOST is subject to multiplicity concerns as more comparisons are made. In this manuscript, a condition that bounds the family-wise error rate (FWER) using TOST is given. This condition then leads to a simple solution for controlling the FWER. Specifically, we demonstrate that if all pairwise comparisons of k independent groups are being evaluated for equivalence, then simply scaling the nominal Type I error rate down by a factor of (k - 1) is sufficient to maintain the family-wise error rate at or below the desired value. The resulting rule is much less conservative than the equally simple Bonferroni correction. An example of equivalence testing in a non-drug-development setting is given.
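A hedged sketch of the rule described above: run every pairwise TOST at level alpha/(k - 1) instead of the Bonferroni level alpha divided by the number of pairs k(k - 1)/2. The equivalence margin `delta`, the pooled-degrees-of-freedom t-test, and all names are illustrative choices, not the manuscript's:

```python
import itertools
import numpy as np
from scipy import stats

def tost(x, y, delta, alpha):
    """Two one-sided t-tests for equivalence of means within +/- delta.
    Declares equivalence iff both one-sided tests reject at level alpha."""
    diff = x.mean() - y.mean()
    se = np.sqrt(x.var(ddof=1) / len(x) + y.var(ddof=1) / len(y))
    df = len(x) + len(y) - 2                        # pooled df; Welch df also works
    p_lower = stats.t.sf((diff + delta) / se, df)   # H0: diff <= -delta
    p_upper = stats.t.cdf((diff - delta) / se, df)  # H0: diff >= +delta
    return max(p_lower, p_upper) < alpha

def all_pairs_equivalent(groups, delta, alpha=0.05):
    """All pairwise equivalence tests among k groups, with the nominal
    level scaled down by (k - 1) to bound the FWER, per the abstract."""
    k = len(groups)
    adj_alpha = alpha / (k - 1)
    return {(i, j): tost(groups[i], groups[j], delta, adj_alpha)
            for i, j in itertools.combinations(range(k), 2)}
```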
Abstract:
Following the Introduction, which surveys the existing literature on technology advances and regulation in telecommunications and on two-sided markets, we address specific issues in the industries of the New Economy, which are characterized by the existence of network effects. We seek to explore how each of these industries works, identify potential market failures, and find new solutions at the level of economic regulation that promote social welfare. In Chapter 1 we analyze a regulatory issue on access prices and investments in the telecommunications market. The existing literature on access prices and investment has pointed out that networks underinvest under a regime of mandatory access provision with a fixed access price per end-user. We propose a new access pricing rule, the indexation approach: the access price, per end-user, that network i pays to network j is a function of the investment levels set by both networks. We show that indexation can enhance economic efficiency beyond what is achieved with a fixed access price. In particular, access price indexation can simultaneously induce lower retail prices and higher investment and social welfare as compared to fixed access pricing or a regulatory holidays regime. Furthermore, we provide sufficient conditions under which indexation can implement the socially optimal investment or the Ramsey solution, which would be impossible to obtain under fixed access pricing. Our results contradict the notion that investment efficiency must be sacrificed for gains in pricing efficiency. In Chapter 2 we investigate the effect of regulations that limit advertising airtime on advertising quality and on social welfare. We show, first, that advertising time regulation may reduce the average quality of advertising broadcast on TV networks. Second, an advertising cap may reduce media platforms' and firms' profits, while the net effect on viewers' (subscribers') welfare is ambiguous, because the ad quality reduction resulting from a regulatory cap offsets the subscribers' direct gain from watching fewer ads. We find that if subscribers are sufficiently sensitive to ad quality, i.e., the ad quality reduction outweighs the direct effect of the cap, a cap may reduce social welfare. The welfare results suggest that a regulatory authority trying to increase welfare via regulation of the volume of advertising on TV may need to regulate advertising quality as well or, if regulating quality proves impractical, take the effect of advertising quality into consideration. In Chapter 3 we investigate the rules that govern Electronic Payment Networks (EPNs). In EPNs, the No-Surcharge Rule (NSR) requires that merchants charge at most the same amount for a payment card transaction as for cash. In this chapter, we analyze a three-party model (consumers, merchants, and a proprietary EPN) with endogenous transaction volumes and heterogeneous merchants' transactional benefits of accepting cards to assess the welfare impacts of the NSR. We show that, if merchants are local monopolists and the network externalities from merchants to cardholders are sufficiently strong, all agents except the EPN will be worse off with the NSR, and therefore the NSR is socially undesirable. The positive role of the NSR in improving retail price efficiency for cardholders is also highlighted.
Abstract:
More and more academic journals adopt an open-access policy, by which articles are accessible free of charge, while publication costs are recovered through author fees. We study the consequences of this open-access policy on a journal's quality standard. If the journal's objective were to maximize social welfare, open access would be optimal as long as the positive externalities generated by its diffusion exceed the marginal cost of distribution. However, we show that if an open-access journal has a different objective (such as maximizing readers' payoffs, the impact of the journal, or its profit), it tends to choose a quality standard below the socially efficient level.
Abstract:
In this paper, we develop finite-sample inference procedures for stationary and nonstationary autoregressive (AR) models. The method is based on special properties of Markov processes and a split-sample technique. The results on Markovian processes (intercalary independence and truncation) only require the existence of conditional densities. They are proved for possibly nonstationary and/or non-Gaussian multivariate Markov processes. In the context of a linear regression model with AR(1) errors, we show how these results can be used to simplify the distributional properties of the model by conditioning a subset of the data on the remaining observations. This transformation leads to a new model which has the form of a two-sided autoregression to which standard classical linear regression inference techniques can be applied. We show how to derive tests and confidence sets for the mean and/or autoregressive parameters of the model. We also develop a test on the order of an autoregression. We show that a combination of subsample-based inferences can improve the performance of the procedure. An application to U.S. domestic investment data illustrates the method.
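To make the two-sided autoregression concrete: for a stationary Gaussian AR(1) process, conditioning x_t on its two neighbours gives E[x_t | x_{t-1}, x_{t+1}] = beta * (x_{t-1} + x_{t+1}) with beta = rho / (1 + rho^2), and the conditioned residuals at alternate dates are independent. A simulation sketch of this transformation (illustrative only, not the paper's exact procedure):

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.6, 10_000

# Simulate a stationary Gaussian AR(1): x_t = rho * x_{t-1} + eps_t
x = np.empty(n)
x[0] = rng.normal(scale=1 / np.sqrt(1 - rho**2))
for t in range(1, n):
    x[t] = rho * x[t - 1] + rng.normal()

# Two-sided autoregression: regress x_t on (x_{t-1} + x_{t+1}) at every
# other date, so the resulting regression errors are independent and
# standard classical inference applies.
t_idx = np.arange(1, n - 1, 2)
s = x[t_idx - 1] + x[t_idx + 1]
beta = (s @ x[t_idx]) / (s @ s)      # OLS slope, no intercept

# Invert beta = rho / (1 + rho^2) to recover the AR parameter.
rho_hat = (1 - np.sqrt(1 - 4 * beta**2)) / (2 * beta)
print(f"beta = {beta:.3f}, implied rho = {rho_hat:.3f}")   # approx 0.6
```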
Abstract:
This article is motivated by the prominence of one-sided S,s rules in the literature and by the unrealistically strict conditions necessary for their optimality. It aims to assess whether one-sided pricing rules could be an adequate individual rule for macroeconomic models, despite their suboptimality. It addresses two questions. First, since agents are not fully rational, is it plausible that they use such a non-optimal rule? Second, even if agents adopt optimal rules, is the economist committing a serious mistake by assuming that agents use one-sided S,s rules? Using parameters based on real-economy data, we found that, since the additional cost involved in adopting the simpler rule is relatively small, it is plausible that one-sided rules are used in practice. We also found that suboptimal one-sided rules and optimal two-sided rules are similar in practice, since one of the bounds is not reached very often. We concluded that the macroeconomic effects when one-sided rules are suboptimal are similar to the results obtained under two-sided optimal rules, when the two rules are close to each other. However, this is true only when one-sided rules are used in the context where they are not optimal.
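A hedged sketch of the mechanism behind "one of the bounds is not reached very often": with steady inflation eroding a firm's log real price, the lower adjustment bound binds regularly while the upper bound of a two-sided rule is almost never hit, so the two rules generate nearly identical behavior. The bands and parameters below are illustrative, not the article's calibration:

```python
import numpy as np

rng = np.random.default_rng(1)
pi, sigma, T = 0.02, 0.01, 100_000   # inflation drift, shock sd, periods
s, S, upper = -0.10, 0.0, 0.10       # lower bound, reset point, upper bound

def simulate(two_sided):
    p, hits_lower, hits_upper = S, 0, 0
    for _ in range(T):
        p += -pi + sigma * rng.normal()     # inflation erodes the real price
        if p < s:                           # lower bound: reset to S
            p, hits_lower = S, hits_lower + 1
        elif two_sided and p > upper:       # upper bound: active only
            p, hits_upper = S, hits_upper + 1   # under the two-sided rule
    return hits_lower, hits_upper

print(simulate(two_sided=False))   # one-sided: lower-bound adjustments only
print(simulate(two_sided=True))    # upper bound is reached very rarely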
Abstract:
In this paper we present the operational matrices of the left Caputo fractional derivative, the right Caputo fractional derivative, and the Riemann–Liouville fractional integral for shifted Legendre polynomials. We develop an accurate numerical algorithm to solve the two-sided space–time fractional advection–dispersion equation (FADE) based on a spectral shifted Legendre tau (SLT) method in combination with the derived shifted Legendre operational matrices. The fractional derivatives are described in the Caputo sense. We propose a spectral SLT method in both the temporal and spatial discretizations for the two-sided space–time FADE. This technique reduces the two-sided space–time FADE to a system of algebraic equations, which simplifies the problem. Numerical results are presented to confirm the spectral accuracy and efficiency of the proposed algorithm. By selecting relatively few Legendre polynomial degrees, we are able to obtain very accurate approximations, demonstrating the utility of the new approach over other numerical methods.
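For reference, a representative form of the two-sided space–time FADE treated by such schemes, as it commonly appears in the fractional-diffusion literature (the paper's exact coefficients and orders may differ):

\[
\frac{\partial^{\gamma} u(x,t)}{\partial t^{\gamma}}
= -\,v\,\frac{\partial u(x,t)}{\partial x}
+ d_{+}\,\frac{\partial^{\alpha} u(x,t)}{\partial x^{\alpha}}
+ d_{-}\,\frac{\partial^{\alpha} u(x,t)}{\partial (-x)^{\alpha}}
+ q(x,t),
\qquad 0 < \gamma \le 1,\; 1 < \alpha \le 2,
\]

where the two spatial derivatives are the left- and right-sided Caputo fractional derivatives; the presence of both is what makes the equation "two-sided".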
Abstract:
This paper presents a stylised framework to examine how skill-biased technological change and labour market frictions affect the relationship between economic expansion and unskilled unemployment. The first part of the analysis focuses on the investment decisions in skill-acquisition and technology adoption activities faced by workers and firms in response to the introduction of an innovative technology. The second part examines how endogenous two-sided heterogeneity in the labour market affects the macroeconomic outcomes in terms of unemployment, technological diffusion, and economic expansion. To conclude, the framework is used to discuss the effects of alternative forms of policy intervention on agents' investment decisions and on the macroeconomic outcomes.
Abstract:
We prove two-sided inequalities between the integral moduli of smoothness of a function on $\mathbb{R}^d$/$\mathbb{T}^d$ and the weighted tail-type integrals of its Fourier transform/series. Sharpness of the obtained results is, in particular, given by equivalence results for functions satisfying certain regularity conditions. Applications include a quantitative form of the Riemann-Lebesgue lemma as well as several other questions in approximation theory and the theory of function spaces.
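A representative special case of such a two-sided equivalence, stated here in $L^2(\mathbb{R}^d)$ where it follows from Parseval's identity (the paper treats more general settings): for the $k$-th modulus of smoothness $\omega_k$,

\[
\omega_k\!\left(f, \tfrac{1}{N}\right)_{2}^{2}
\;\asymp\;
N^{-2k}\int_{|\xi|\le N} |\xi|^{2k}\,\big|\widehat{f}(\xi)\big|^{2}\,d\xi
\;+\;
\int_{|\xi|>N} \big|\widehat{f}(\xi)\big|^{2}\,d\xi .
\]

The second term is the weighted tail-type integral of the Fourier transform referred to in the abstract.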
Abstract:
The diagnosis of leprosy continues to be based on clinical symptoms, and early diagnosis and treatment are critical to preventing disability and transmission. Sensitive and specific laboratory tests are not available for diagnosing leprosy. Despite the limited applicability of anti-phenolic glycolipid-I (PGL-I) serology for diagnosis, it has been suggested as an additional tool to classify leprosy patients (LPs) for treatment purposes. Two formats of rapid tests to detect anti-PGL-I antibodies [ML immunochromatography assay (ICA) and ML Flow] were compared in different groups: multibacillary (MB) patients, paucibacillary (PB) patients, household contacts, and healthy controls in Brazil and Nepal. High ML Flow intra-test concordance was observed, and low to moderate agreement between the results of the ML ICA and ML Flow tests on the serum of LPs was observed. LPs were "seroclassified" according to the results of these tests, and the seroclassification was compared to other currently used classification systems: the World Health Organization operational classification, the bacilloscopic index, and the Ridley-Jopling classification. When analysing the usefulness of these tests in the operational classification of PB and MB leprosy for treatment and follow-up purposes, the ML Flow test was the best point-of-care test for subjects in Nepal; despite the need for sample dilution, the ML ICA test yielded better performance among Brazilian subjects. Our results identified possible ways to improve the performance of both tests.
Abstract:
This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium, and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
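The abstract does not spell out the statistical learning algorithm; in this literature (as in Marcet and Sargent 1989, cited above) the canonical choice is recursive least squares, in which each set of agents re-estimates its perceived law of motion as data arrive. A generic one-step update, with assumed names (phi is the belief vector, x_t the regressors, gain the step size, e.g. 1/t for decreasing gain):

```python
import numpy as np

def rls_update(phi, R, x_t, y_t, gain):
    """One step of recursive-least-squares learning: agents revise their
    belief vector phi about y_t = phi' x_t as new data become available."""
    R = R + gain * (np.outer(x_t, x_t) - R)        # moment-matrix update
    phi = phi + gain * np.linalg.solve(R, x_t) * (y_t - phi @ x_t)
    return phi, R
```

With two-sided learning, both the private agents and the central bank run updates of this kind on the (possibly different) data each observes, and their beliefs feed back into the economy's actual law of motion.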
Abstract:
It is proved that Jennrich's (1970) asymptotic $X^2$ test for equality of correlation matrices is algebraically equal to a Wald test statistic derived from Neudecker and Wesselman's (1990) expression of the asymptotic variance matrix of the sample correlation matrix.
Abstract:
Two-sided flux decoration experiments indicate that threading dislocation lines (TDLs), which cross the entire film, are sometimes trapped in metastable states. We calculate the elastic energy associated with the meanderings of a TDL. The TDL behaves as an anisotropic and dispersive string with thermal fluctuations largely along its Burgers vector. These fluctuations also modify the structure factor of the vortex solid. Both effects can, in principle, be used to estimate the elastic moduli of the material.
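For context, a generic version of the string picture invoked above: if the transverse displacements u of a line of length L have an elastic energy quadratic in Fourier modes with a wavevector-dependent (anisotropic, dispersive) stiffness $\varepsilon(q)$, equipartition ties the measured fluctuations to that stiffness. This is a textbook identity, not the paper's specific elastic kernel:

\[
E = \frac{L}{2}\sum_{q} \varepsilon(q)\, q^{2}\, |u_q|^{2}
\qquad\Longrightarrow\qquad
\big\langle |u_q|^{2} \big\rangle = \frac{k_B T}{L\,\varepsilon(q)\, q^{2}},
\]

so decoration measurements of the fluctuation spectrum can, in principle, be inverted for the elastic moduli, which is the estimation strategy the abstract alludes to.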