164 results for Empirical Testing
Abstract:
The paper empirically contrasts the results of alternative methods for estimating the value and the depreciation of mineral resources. Historical data for Mexico and Venezuela, covering the period from the 1920s to the 1980s, are used to compare the results of several methods: the present value method, the net price method, the user cost method and the imputed income method. The paper establishes that the net price and the user cost are not competing methods as such, but alternative adjustments suited to different scenarios of closed and open economies. The results show that the biases of the methods, as commonly described in the theoretical literature, only hold under the most restrictive scenario of constant rents over time. It is argued that the difference between what is expected to happen and what actually did happen is for the most part due to a missing variable, namely technological change. This is an important caveat to the recommendations made on the basis of these models.
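For reference, the two adjustments most often contrasted in this literature are usually written in textbook form (generic formulas, not necessarily the exact specifications estimated in the paper): the net price method charges depreciation equal to the full resource rent, $D_t = (p_t - c_t)\,q_t$, where $p_t$ is the resource price, $c_t$ the unit extraction cost and $q_t$ the quantity extracted; the user cost method (El Serafy) instead treats only part of the rent $R_t$ as income, $X_t/R_t = 1 - 1/(1+r)^{n+1}$, with $r$ the discount rate and $n$ the remaining life of the reserve, the remainder being the user cost to be reinvested.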
Abstract:
Expected utility theory (EUT) has been challenged as a descriptive theory in many contexts, and medical decision analysis is no exception. Several researchers have suggested that rank dependent utility theory (RDUT) may accurately describe how people evaluate alternative medical treatments. Recent research in this domain has addressed a relevant feature of RDU models, probability weighting, but to date no direct test of the theory has been made. This paper provides a test of the main axiomatic difference between EUT and RDUT when health profiles are used as outcomes of risky treatments. Overall, EU best described the data. However, evidence of the editing and cancellation operations hypothesized in Prospect Theory and Cumulative Prospect Theory was apparent in our study: we found that RDU outperformed EU in the presentation of the risky treatment pairs in which the common outcome was not obvious. The influence of framing effects on the performance of RDU, and their importance as a topic for future research, is discussed.
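For context, the standard rank dependent utility functional for a prospect with outcomes ranked $x_1 \ge \ldots \ge x_n$ and probabilities $p_1,\ldots,p_n$ is $RDU = \sum_{i=1}^n \big[ w(p_1+\cdots+p_i) - w(p_1+\cdots+p_{i-1}) \big]\, u(x_i)$, where $w$ is a probability weighting function; EUT is the special case $w(p)=p$. This is the generic form of the model, not necessarily the parametrization used in the paper's test.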
Abstract:
Consider the problem of testing k hypotheses simultaneously. In this paper, we discuss finite and large sample theory of stepdown methods that provide control of the familywise error rate (FWE). In order to improve upon the Bonferroni method or Holm's (1979) stepdown method, Westfall and Young (1993) make effective use of resampling to construct stepdown methods that implicitly estimate the dependence structure of the test statistics. However, their methods depend on an assumption called subset pivotality. The goal of this paper is to construct general stepdown methods that do not require such an assumption. In order to accomplish this, we take a close look at what makes stepdown procedures work, and a key component is a monotonicity requirement on critical values. By imposing such monotonicity on estimated critical values (which is not an assumption on the model but an assumption on the method), it is demonstrated that the problem of constructing a valid multiple test procedure which controls the FWE can be reduced to the problem of constructing a single test which controls the usual probability of a Type 1 error. This reduction allows us to draw upon an enormous resampling literature as a general means of test construction.
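As a rough illustration of the idea, and not the authors' actual construction, the sketch below, in Python, implements a generic resampling-based maxT stepdown procedure; the function name stepdown_maxT and the inputs stats and null_stats are hypothetical, and how the resampled statistics are generated under the null is left to the application. For this particular construction the successive critical values are monotone by design, since they are quantiles of a maximum over a shrinking set of hypotheses.

import numpy as np

def stepdown_maxT(stats, null_stats, alpha=0.05):
    # stats      : length-k array of observed test statistics (large values reject)
    # null_stats : (B, k) array of the same statistics recomputed on B resamples
    #              generated (approximately) under the null hypotheses
    k = len(stats)
    order = np.argsort(stats)[::-1]   # most significant hypothesis first
    reject = np.zeros(k, dtype=bool)
    for step in range(k):
        remaining = order[step:]      # hypotheses not yet rejected
        # Critical value: (1 - alpha)-quantile of the maximum statistic over the
        # remaining hypotheses. Because this set only shrinks, successive critical
        # values are automatically non-increasing (the monotonicity requirement).
        crit = np.quantile(null_stats[:, remaining].max(axis=1), 1.0 - alpha)
        if stats[order[step]] > crit:
            reject[order[step]] = True
        else:
            break                     # stop at the first failure to reject
    return reject

In practice null_stats would come from bootstrap or permutation replicates; the paper's contribution concerns how such critical values can be constructed validly without subset pivotality.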
Abstract:
This chapter highlights the problems that structural methods and SVAR approaches face when estimating DSGE models and examining their ability to capture important features of the data. We show that structural methods are subject to severe identification problems due, in large part, to the nature of DSGE models. The problems can be patched up in a number of ways but solved only if DSGEs are completely reparametrized or respecified. The potential misspecification of the structural relationships gives Bayesian methods an edge over classical ones in structural estimation. SVAR approaches may face invertibility problems, but simple diagnostics can help to detect and remedy them. A pragmatic empirical approach ought to use the flexibility of SVARs against potential misspecification of the structural relationships, but must firmly tie SVARs to the class of DSGE models which could have generated the data.
Abstract:
We consider a dynamic multifactor model of investment with financing imperfections, adjustment costs, and fixed and variable capital. We use the model to derive a test of financing constraints based on a reduced form variable capital equation. Simulation results show that this test correctly identifies financially constrained firms even when the estimation of firms' investment opportunities is very noisy. In addition, the test is well specified in the presence of both concave and convex adjustment costs of fixed capital. We confirm the validity of this test empirically on a sample of small Italian manufacturing companies.
Abstract:
As companies and shareholders begin to note the potential repercussions of intangible assets upon business results, the inability of the traditional financial statement model to reflect these new ways of creating business value has become evident. Companies have widely adopted new management tools to compensate for this shortcoming. However, there are few prior studies measuring in a quantifiable manner the level of productivity left unexplained by the financial statements. In this study, we measure the effect of intangible assets on productivity using data from Spanish firms selected randomly by size and sector over a ten-year period, from 1995 to 2004. Through a sample of more than 10,000 Spanish firms we analyse to what extent labour productivity can be explained by physical capital deepening, by quantified intangible capital deepening and by the firm's economic efficiency (or total factor productivity, TFP). Our results confirm the hypothesis that the weight of TFP has increased during the period studied, especially in those firms that have experienced a significant rise in quantified intangible capital, evidencing important complementary effects between capital investment and intangible resources in the explanation of productivity growth. These results differ significantly across economic sectors and firm sizes.
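Schematically, a decomposition of the kind described can be written as $\ln(Y/L) = \alpha\,\ln(K^{phys}/L) + \beta\,\ln(K^{intang}/L) + \ln TFP$, splitting labour productivity into physical capital deepening, quantified intangible capital deepening and a total factor productivity residual; the exact functional form estimated in the study may differ.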
Abstract:
We compare a set of empirical Bayes and composite estimators of the population means of the districts (small areas) of a country, and show that the natural modelling strategy of searching for a well-fitting empirical Bayes model and using it for estimation of the area-level means can be inefficient.
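For reference, the estimators being compared typically take the generic shrinkage form $\hat\theta_i = \hat\gamma_i\,\bar y_i + (1-\hat\gamma_i)\,x_i'\hat\beta$, where $\bar y_i$ is the direct estimate for district $i$, $x_i'\hat\beta$ is a synthetic model-based prediction and $\hat\gamma_i \in [0,1]$ is a weight that shrinks small districts more heavily towards the model; this is the textbook form, not necessarily the exact models fitted in the paper.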
Abstract:
This paper extends previous results on optimal insurance trading in the presence of a stock market that allows continuous asset trading and substantial personal heterogeneity, and applies those results in a context of asymmetric information, with reference to the role of genetic testing in insurance markets. We find a novel and surprising result under symmetric information: agents may optimally prefer to purchase full insurance despite the presence of unfairly priced insurance contracts and other assets which are correlated with insurance. Asymmetric information has a Hirshleifer-type effect which can be resolved by suspending insurance trading. Nevertheless, agents can attain their first best allocations, which suggests that the practice of restricting insurance contracts from being contingent on genetic tests can be efficient.
Abstract:
This paper illustrates the philosophy which forms the basis of calibration exercises in general equilibrium macroeconomic models and the details of the procedure, the advantages and the disadvantages of the approach, with particular reference to the issue of testing "false" economic models. We provide an overview of the most recent simulation-based approaches to the testing problem and compare them to standard econometric methods used to test the fit of non-linear dynamic general equilibrium models. We illustrate how simulation-based techniques can be used to formally evaluate the fit of a calibrated model to the data and to obtain ideas on how to improve the model design, using a standard problem in the international real business cycle literature: whether a model with complete financial markets and no restrictions on capital mobility is able to reproduce the second order properties of aggregate saving and aggregate investment in an open economy.
Abstract:
The demands of representative design, as formulated by Egon Brunswik (1956), set a high methodological standard. Both experimental participants and the situations with which they are faced should be representative of the populations to which researchers claim to generalize results. Failure to observe the latter has led to notable experimental failures in psychology from which economics could learn. It also raises questions about the meaning of testing economic theories in abstract environments. Logically, abstract tests can only be generalized to abstract realities and these may or may not have anything to do with the empirical realities experienced by economic actors.
Abstract:
Recent research shows that financial reports are losing relevance, mainly due to the growing strategic importance of intangible assets in the performance of a company. A possible solution is to modify accounting standards so that statements include more self-generated intangible assets, taking into account their inherent risk and difficulty of valuation. We surveyed loan officers who were asked to assess the credit-worthiness of a hypothetical company. The only information given was a simplified version of its financial statements. Half the group got statements where research and development costs had been capitalized; the other half got statements in which these costs had been treated as an expense. The findings show that capitalization was significantly more likely to attract a positive response to a loan request. The paper raises the question of whether accounting for intangibles might provide managers with one more creative accounting technique and, in consequence, considers its ethical implications.
Abstract:
Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
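In the notation of the abstract, the selection rule can be sketched as $\hat f = \hat f_{\hat k}$ with $\hat k = \arg\min_k \{\hat L_2(\hat f_k) + \hat C_k\}$, where $\hat f_k$ minimizes empirical risk over the empirical cover of ${\cal F}_k$ built from the first half of the sample, $\hat L_2$ is the empirical risk computed on the second half, and $\hat C_k$ is the complexity term driven by the size of the cover; the precise penalty and covering radii are specified in the paper.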
Abstract:
The demographic shift underway in Southern Europe requires a revision of some of the fundamental principles of the traditional welfare state. We analyze the evolution of several aspects of welfare and social expenditure over the last two decades. We find that, in the context of the present demographic changes and the real estate boom, current social and pension policy leads to a new distribution of benefits and burdens which is highly intergenerationally unequal. We argue for a revised definition of public policy based on Musgrave's proposition as a possible rule for an intergenerationally fair distribution.
Abstract:
We investigate the impact of 20th-century European colonization on growth in Africa. We find that in the 1960-88 period growth has been faster for dependencies than for colonies; for British and French colonies than for Portuguese, Belgian and Italian ones; and for countries with less economic penetration during the colonial period. On average, African growth accelerates after decolonization. Proxies for colonial heritage add explanatory power to growth regressions and make indicators for human capital and for political and ethnic instability lose significance. Colonial variables capture the same effects as a sub-Saharan dummy and reduce its significance when jointly included in a cross-sectional regression with 98 countries.
Abstract:
The results of the examinations taken by high school graduates who want to enrol at a Catalan university are studied here. To do so, the authors address several issues related to the equity of the system: reliability of grading, and the difficulty and discrimination power of the exams. The general emphasis is placed on the concurrent research and empirical evidence about the properties of the examination items and scores. After a discussion of the limitations of the exams' format and the appropriateness of the instruments used in the study, the article concludes with some suggestions to improve such examinations.