944 results for quantitative methods
Abstract:
From a scientific point of view, surveys are undoubtedly a valuable tool for understanding social and political reality, and they are widely used in social science research. However, the researcher's task is often hindered by a series of deficiencies related to technical aspects that make both inference and comparison difficult. The main aim of the present paper is to report and justify the European Social Survey's technical specifications designed to avoid and/or minimize such deficiencies. The article also gives a characterization of the non-respondents in Spain, obtained from the analysis of the 2002 fieldwork data file.
Abstract:
In this paper we consider an insider with privileged information that is affected by an independent noise vanishing as the revelation time approaches. At this time, the information is available to every trader. Our financial markets are based on Wiener space. In probabilistic terms, we obtain an infinite-dimensional extension of Jacod's theorem to cover cases of progressive enlargement of filtrations. The application of this result gives the semimartingale decomposition of the original Wiener process under the progressively enlarged filtration. As an application, we prove that if the rate at which the additional noise in the insider's information vanishes is slow enough, then there is no arbitrage and the additional utility of the insider is finite.
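For context, the generic form of such a decomposition (with notation assumed here rather than taken from the paper) is

$$W_t \;=\; \widetilde W_t + \int_0^t \alpha_s \, ds,$$

where $\widetilde W$ is a Brownian motion with respect to the enlarged filtration and $\alpha$ is the information drift induced by the insider's signal; the no-arbitrage and finite-utility conclusions then hinge on the integrability properties of $\alpha$.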
Abstract:
For the standard kernel density estimate, it is known that one can tune the bandwidth such that the expected L1 error is within a constant factor of the optimal L1 error (obtained when one is allowed to choose the bandwidth with knowledge of the density). In this paper, we pose the same problem for variable bandwidth kernel estimates, where the bandwidths are allowed to depend upon the location. We show in particular that for positive kernels on the real line, for any data-based bandwidth, there exists a density for which the ratio of expected L1 error over optimal L1 error tends to infinity. Thus, the problem of tuning the variable bandwidth in an optimal manner is "too hard". Moreover, from the class of counterexamples exhibited in the paper, it appears that placing conditions on the densities (monotonicity, convexity, smoothness) does not help.
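As a loose illustration of the kind of estimator under discussion (a generic Abramson-style sample-point estimator with placeholder pilot-bandwidth choices, not the paper's construction), a variable-bandwidth kernel estimate lets each data point carry its own bandwidth:

```python
import numpy as np

def variable_bandwidth_kde(x_grid, data, pilot_bw=0.5, alpha=0.5):
    """Sample-point (variable-bandwidth) Gaussian kernel density estimate.

    Each data point X_i receives its own bandwidth h_i, inversely related to a
    pilot density estimate at X_i (Abramson-style h_i ~ f_pilot(X_i)**(-alpha)).
    Generic sketch only; not the estimator analysed in the paper.
    """
    data = np.asarray(data, dtype=float)
    n = len(data)
    # Fixed-bandwidth pilot estimate evaluated at the data points.
    diffs = (data[:, None] - data[None, :]) / pilot_bw
    pilot = np.exp(-0.5 * diffs**2).sum(axis=1) / (n * pilot_bw * np.sqrt(2 * np.pi))
    # Location-dependent bandwidths, normalized by the geometric mean of the pilot.
    h = pilot_bw * (pilot / np.exp(np.mean(np.log(pilot)))) ** (-alpha)
    # Variable-bandwidth estimate on the evaluation grid.
    z = (x_grid[:, None] - data[None, :]) / h[None, :]
    return (np.exp(-0.5 * z**2) / (h * np.sqrt(2 * np.pi))).mean(axis=1)

# Example: estimate a density from simulated data.
rng = np.random.default_rng(0)
sample = rng.normal(size=200)
grid = np.linspace(-4, 4, 101)
density = variable_bandwidth_kde(grid, sample)
```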
Abstract:
We extend to score, Wald and difference test statistics the scaled and adjusted corrections to goodness-of-fit test statistics developed in Satorra and Bentler (1988a,b). The theory is framed in the general context of multisample analysis of moment structures, under general conditions on the distribution of the observable variables. Computational issues, as well as the relation of the scaled and corrected statistics to the asymptotically robust ones, are discussed. A Monte Carlo study illustrates the comparative performance in finite samples of corrected score test statistics.
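For orientation, and with notation assumed here rather than taken from the paper, the Satorra-Bentler scaling idea replaces a test statistic $T$ with $d$ degrees of freedom by the mean-corrected version

$$\bar T \;=\; \frac{T}{\hat c}, \qquad \hat c \;=\; \frac{\operatorname{tr}(\hat U \hat\Gamma)}{d},$$

where $\hat\Gamma$ estimates the asymptotic covariance matrix of the sample moments and $\hat U$ is a weight matrix derived from the fitting function; the adjusted statistic additionally corrects the degrees of freedom so that both the mean and the variance are matched.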
Abstract:
With the two aims of monitoring social change and improving social measurement, the European Social Survey is now closing its third round. This paper shows how the accumulated experience of the first two rounds has been used to validate the questionnaire, better adapt the sampling design to the characteristics of the country, and carry out fieldwork efficiently in Spain. For example, the dynamic character of today's population makes it necessary to estimate design effects for each round from the data of the previous round. The paper also demonstrates how, starting from a response rate of 52% in the first round, a 66% response rate was achieved in the third round thanks to an extensive quality control conducted by the polling agency and the ESS national team, based on a detailed analysis of the non-response cases and the incidents recorded by the interviewers in the contact forms.
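As background for the design-effect remark (standard survey-sampling notation, not taken from the paper), the design effect due to clustering is commonly approximated as

$$\mathrm{deff} \;\approx\; 1 + (\bar b - 1)\,\rho,$$

where $\bar b$ is the average number of achieved interviews per cluster and $\rho$ is the intra-cluster correlation of the target variable; re-estimating $\rho$ and $\bar b$ from the previous round is what allows the effective sample size $n_{\mathrm{eff}} = n/\mathrm{deff}$ to be planned for the next one.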
Resumo:
Power transformations of positive data tables, prior to applying the correspondence analysis algorithm, are shown to open up a family of methods with direct connections to the analysis of log-ratios. Two variations of this idea are illustrated. The first approach is simply to power the original data and perform a correspondence analysis this method is shown to converge to unweighted log-ratio analysis as the power parameter tends to zero. The second approach is to apply the power transformation to thecontingency ratios, that is the values in the table relative to expected values based on the marginals this method converges to weighted log-ratio analysis, or the spectral map. Two applications are described: first, a matrix of population genetic data which is inherently two-dimensional, and second, a larger cross-tabulation with higher dimensionality, from a linguistic analysis of several books.
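A minimal sketch of the first variant (a generic correspondence analysis via the SVD of standardized residuals applied to the powered table; the input table and the power parameter are placeholders, not the paper's data or code):

```python
import numpy as np

def ca_of_powered_table(N, alpha=0.25):
    """Correspondence analysis of the element-wise power N**alpha.

    Generic CA via the SVD of the standardized residuals; as alpha -> 0 this
    family of analyses is reported to approach unweighted log-ratio analysis.
    """
    P = np.power(np.asarray(N, dtype=float), alpha)
    P = P / P.sum()                                        # correspondence matrix
    r = P.sum(axis=1)                                      # row masses
    c = P.sum(axis=0)                                      # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))     # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * sv) / np.sqrt(r)[:, None]            # principal row coordinates
    col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]         # principal column coordinates
    return row_coords, col_coords, sv**2                   # coordinates and principal inertias

# Example with an arbitrary positive table.
table = np.array([[12., 5., 8.], [3., 9., 6.], [7., 4., 10.]])
rows, cols, inertias = ca_of_powered_table(table, alpha=0.1)
```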
Abstract:
This work studies the organization of less-than-truckload trucking from a contractual point of view. We show that the huge number of owner-operators working in the industry hides a much less fragmented reality. Most of those owner-operators are quasi-integrated in higher organizational structures. This hybrid form is generally more efficient than vertical integration because, in the Spanish institutional environment, it lessens serious moral hazard problems, related mainly to the use of the vehicles, and makes it possible to reach economies of scale and density. Empirical evidence suggests that what leads organizations to vertically integrate is not the presence of such economies but hold-up problems, related to the existence of specific assets. Finally, an international comparison hints that institutional constraints are able to explain differences in the evolution of vertical integration across countries.
Abstract:
In this work I study the stability of the dynamics generated by adaptive learning processes in intertemporal economies with lagged variables. I prove that determinacy of the steady state is a necessary condition for the convergence of the learning dynamics, and I show that the converse is not true, characterizing the economies where convergence holds. In the case of existence of cycles, I show that there is not, in general, a relationship between determinacy and convergence of the learning process to the cycle. I also analyze the expectational stability of these equilibria.
Abstract:
In this paper we evaluate the quantitative impact that a number of alternative reform scenarios may have on the total expenditure for public pensions in Spain. Our quantitative findings can be summarized in two sentences. For all the reforms considered, the financial impact of the mechanical effect (change in benefits) is orders of magnitude larger than the behavioral impact (change in behavior). For the two Spanish reforms, we find once again that their effect on the outstanding liability of the Spanish Social Security System is essentially negligible: neither the mechanical nor the behavioral effects amount to much for the 1997 reform, and they amount to very little for the 2002 amendment.
Abstract:
In this paper we propose a simple and general model for computing the Ramsey optimal inflation tax, which includes several models from the previous literature as special cases. We show that it cannot be claimed on theoretical grounds that the Friedman rule is always optimal (or always non-optimal). Whether the Friedman rule is optimal depends on conditions related to the shape of various relevant functions. One contribution of this paper is to relate these conditions to measurable variables such as the interest rate or the consumption elasticity of money demand. We find that it tends to be optimal to tax money when there are economies of scale in the demand for money (the scale elasticity is smaller than one) and/or when money is required for the payment of consumption or wage taxes. We find that it tends to be optimal to tax money more heavily when the interest elasticity of money demand is small. We present empirical evidence on the parameters that determine the optimal inflation tax. Calibrating the model to a variety of empirical studies yields an optimal nominal interest rate of less than 1% per year, although this finding is sensitive to the calibration.
Abstract:
We will call a game a reachable (pure strategy) equilibria game if, starting from any strategy by any player, we are able to reach a (pure strategy) equilibrium by a sequence of best-response moves. We give a characterization of all finite strategy space duopolies with reachable equilibria. We describe some applications of the sufficient conditions of the characterization.
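To make the definition concrete (a generic sketch for a finite two-player game with placeholder payoff matrices, not the paper's characterization), one can follow a sequence of best-response moves from a starting profile and check whether a pure-strategy equilibrium is reached:

```python
import numpy as np

def reaches_equilibrium(A, B, start, max_steps=100):
    """Alternating best-response dynamics in a finite bimatrix game.

    A[i, j] is player 1's payoff and B[i, j] is player 2's payoff when player 1
    plays i and player 2 plays j. Starting from the pure profile `start`, the
    players take turns switching to a best response; the function reports
    whether a pure-strategy Nash equilibrium is reached within max_steps.
    """
    i, j = start
    for step in range(max_steps):
        i_best = int(np.argmax(A[:, j]))   # player 1's best response to j
        j_best = int(np.argmax(B[i, :]))   # player 2's best response to i
        if A[i, j] >= A[i_best, j] and B[i, j] >= B[i, j_best]:
            return True, (i, j)            # nobody gains by deviating: equilibrium
        # Alternate moves: player 1 on even steps, player 2 otherwise.
        if step % 2 == 0 and A[i_best, j] > A[i, j]:
            i = i_best
        elif B[i, j_best] > B[i, j]:
            j = j_best
    return False, (i, j)

# Example: a 2x2 coordination game where best responses reach an equilibrium.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
B = np.array([[2.0, 0.0], [0.0, 1.0]])
print(reaches_equilibrium(A, B, start=(0, 1)))
```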
Abstract:
This article is an introduction to Malliavin calculus for practitioners. We treat one specific application to the calculation of Greeks in finance. We also consider the kernel density method to compute Greeks and an extension of the Vega index called the local Vega index.
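As a flavour of this kind of application (a standard Black-Scholes example with placeholder parameters, not material from the article itself), the Malliavin-weight approach estimates a delta without differentiating the payoff:

```python
import numpy as np

def malliavin_delta_call(S0, K, r, sigma, T, n_paths=200_000, seed=0):
    """Monte Carlo delta of a European call under Black-Scholes.

    Uses the Malliavin (likelihood-ratio) weight W_T / (S0 * sigma * T), so the
    payoff itself is never differentiated; convenient for discontinuous payoffs.
    """
    rng = np.random.default_rng(seed)
    W_T = np.sqrt(T) * rng.standard_normal(n_paths)            # terminal Brownian values
    S_T = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * W_T)  # terminal asset prices
    payoff = np.maximum(S_T - K, 0.0)
    weight = W_T / (S0 * sigma * T)                            # Malliavin weight for delta
    return np.exp(-r * T) * np.mean(payoff * weight)

# Example: delta of an at-the-money one-year call.
print(malliavin_delta_call(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0))
```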
Abstract:
The present paper proposes a model for the persistence of abnormal returns at both the firm and industry levels, when longitudinal data are available for the profits of firms classified into industries. The model produces a two-way variance decomposition of abnormal returns: (a) at the firm versus industry level, and (b) into permanent versus transitory components. This variance decomposition supplies information on the relative importance of the fundamental components of abnormal returns that have been discussed in the literature. The model is applied to a Spanish sample of firms, with results such as: (a) there are significant and permanent differences between profit rates at both the industry and firm levels; (b) the variation of abnormal returns at the firm level is greater than at the industry level; and (c) the firm and industry levels do not differ significantly regarding rates of convergence of abnormal returns.
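For orientation only (generic variance-components notation, not the paper's specification), a decomposition of this kind can be written for firm $i$ in industry $j$ at time $t$ as

$$r_{ijt} \;=\; \mu + \alpha_j + \phi_{ij} + u_{jt} + v_{ijt},$$

where $\alpha_j$ and $\phi_{ij}$ are permanent industry and firm effects and $u_{jt}$, $v_{ijt}$ are transitory components, so that the variance of abnormal returns splits along the two axes of firm versus industry and permanent versus transitory.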
Abstract:
It is widely accepted in the literature on the classical Cournot oligopoly model that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the equilibrium. In this paper, though, we present a model in which a stable, unique, symmetric equilibrium is reached for any number of oligopolists even as the industry price increases with each new entry. Consequently, the suspicion that non-quasi-competitiveness implies instability in the long run is proved false.