13 results for Entanglement Measures
in the Digital Repository of Fundação Getúlio Vargas (FGV)
Abstract:
Market timing performance of mutual funds is usually evaluated with linear models with dummy variables that allow the CAPM beta coefficient to vary across two regimes: bullish and bearish market excess returns. Managers, however, use their predictions of the state of nature to define whether to carry low- or high-beta portfolios, rather than the observed states. Our approach here is to take this into account and model market timing as a regime switch, in a way similar to Hamilton's Markov-switching GNP model. We then build a measure of market timing success and apply it to simulated and real-world data.
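The regime-switching idea can be illustrated with a small simulation. This is a hedged sketch with invented parameters (transition matrix, betas, noise levels, a 70% prediction accuracy), not the paper's estimator: a hidden two-state Markov chain drives market excess returns, the manager's noisy state prediction drives the portfolio beta, and a naive timing-success measure counts how often the carried beta matches the true state.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 2000

# hypothetical regime transition matrix: state 0 = bearish, state 1 = bullish
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])

# simulate the hidden market state
s = np.zeros(T, dtype=int)
for t in range(1, T):
    s[t] = rng.choice(2, p=P[s[t - 1]])

# market excess return: negative drift in bearish periods, positive in bullish
r_m = np.where(s == 1, 0.01, -0.01) + 0.02 * rng.standard_normal(T)

# a manager who predicts the state correctly 70% of the time
pred = np.where(rng.random(T) < 0.7, s, 1 - s)
beta = np.where(pred == 1, 1.5, 0.5)   # high beta when a bull market is predicted
r_f = beta * r_m + 0.005 * rng.standard_normal(T)   # fund excess return

# naive timing-success measure: how often the carried beta matches the true state
success = np.mean((beta > 1.0) == (s == 1))
print(f"timing success rate: {success:.3f}")
```

By construction the success rate hovers around the manager's prediction accuracy; the paper's measure is instead built from the inferred regime probabilities.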
An ordering of measures of the welfare cost of inflation in economies with interest-bearing deposits
Abstract:
This paper builds on Lucas (2000) and on Cysne (2003) to derive and order six alternative measures of the welfare costs of inflation (five of which already exist in the literature) for any vector of opportunity costs. The ordering of the functions is carried out for economies with or without interest-bearing deposits. We provide examples and closed-form solutions for the log-log money demand in both the unidimensional and the multidimensional setting (when interest-bearing monies are present). An estimate of the maximum relative error a researcher can incur when using any particular measure is also provided.
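As a rough numeric illustration of one measure in this family, assuming a hypothetical log-log money demand m(i) = A·i^(-eta) with eta < 1 and parameters invented for the example, Bailey's consumer-surplus welfare cost can be checked against its textbook closed form:

```python
import numpy as np
from scipy.integrate import quad

# hypothetical log-log money demand m(i) = A * i**(-eta), with eta < 1
A, eta = 0.05, 0.5
m = lambda i: A * i ** (-eta)

def bailey(i):
    """Bailey's consumer-surplus welfare cost at nominal rate i:
    area under the money-demand curve minus the rectangle i * m(i)."""
    area, _ = quad(m, 0.0, i)   # quad handles the integrable singularity at 0
    return area - i * m(i)

def closed_form(i):
    # for log-log demand with eta < 1: A * (eta / (1 - eta)) * i**(1 - eta)
    return A * (eta / (1.0 - eta)) * i ** (1.0 - eta)

i = 0.05   # a 5% nominal interest rate
print(bailey(i), closed_form(i))
```

The paper's contribution is to order six such measures against one another; this sketch only verifies the simplest one numerically.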
Abstract:
This paper presents semiparametric estimators of changes in inequality measures of a dependent variable distribution, taking into account possible changes in the distributions of covariates. When we do not impose parametric assumptions on the conditional distribution of the dependent variable given covariates, this problem becomes equivalent to estimating the distributional impacts of interventions (treatment) when selection into the program is based on observable characteristics. The distributional impacts of a treatment are calculated as differences in inequality measures of the potential outcomes of receiving and not receiving the treatment. These differences are called here Inequality Treatment Effects (ITE). The estimation procedure involves a first nonparametric step in which the probability of receiving treatment given covariates, the propensity score, is estimated. Using the inverse probability weighting method to estimate parameters of the marginal distribution of potential outcomes, in the second step weighted sample versions of inequality measures are computed. Root-N consistency, asymptotic normality and semiparametric efficiency are shown for the proposed semiparametric estimators. A Monte Carlo exercise is performed to investigate the finite-sample behavior of the estimators derived in the paper. We also apply our method to the evaluation of a job training program.
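The two-step logic can be sketched on synthetic data. For simplicity this toy plugs in the true propensity score where the paper estimates it nonparametrically in a first step, and uses the Gini coefficient as the inequality measure; all data-generating parameters are made up:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
x = rng.standard_normal(n)                   # observed covariate
# true propensity score (the paper estimates this nonparametrically)
p = 1.0 / (1.0 + np.exp(-x))
d = (rng.random(n) < p).astype(float)        # treatment indicator
y1 = np.exp(1.0 + x + 0.5 * rng.standard_normal(n))   # potential outcome, treated
y0 = np.exp(0.5 + x + 1.0 * rng.standard_normal(n))   # potential outcome, control
y = d * y1 + (1.0 - d) * y0                  # observed outcome

def weighted_gini(y, w):
    """Gini coefficient of a weighted sample (trapezoidal Lorenz formula)."""
    order = np.argsort(y)
    y, w = y[order], w[order]
    lorenz = np.cumsum(w * y) / np.sum(w * y)
    shares = w / np.sum(w)
    return 1.0 - np.sum(shares * (lorenz + np.concatenate(([0.0], lorenz[:-1]))))

# second step: inverse probability weights recover the marginal distributions
# of the two potential outcomes; the ITE is a difference of weighted Ginis
w1, w0 = d / p, (1.0 - d) / (1.0 - p)
ite = weighted_gini(y, w1) - weighted_gini(y, w0)
print(f"Gini inequality treatment effect: {ite:.3f}")
```

Here the control outcome is more dispersed by construction, so the estimated ITE is negative: treatment reduces inequality in this toy design.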
Abstract:
We outline possible actions to be adopted by the European Union to ensure a better share of total coffee revenues for producers in developing countries. Ultimately, this translates into producers receiving a fair price for the commodity they supply, i.e., a market price that results from fair market conditions along the whole coffee production chain. We argue that the proposals should take place in the consuming countries, as market conditions on the consuming-country side of the chain are not fair; market failures and ingenious distortions are responsible for the enormous asymmetry of gains between the two sides. The first of three proposals for consumer-government-supported actions is to help create domestic trading companies to achieve higher export volumes. These trading companies would be associated with roasters that, depending on the final product envisaged, could perform the roasting in the country and export the roasted – and sometimes ground – coffee, breaking the increasing importer-exporter verticalisation. Another measure would be the systematic provision of basic intelligence on the consuming markets. Statistics on the quantities sold by mode of consumption, broad “category of coffee” and point of sale could be produced for each country. These should be matched to export/import data and complemented by (aggregate) country statistics on the roasting sector. This would greatly help producing countries design their own marketing and production strategies. Finally, we suggest a fund, backed by a common EU tax on roasted coffee created within the single-market tax harmonisation programme. This European Coffee Fund would have two main projects. Together with the ICO, it would launch an advertising campaign on coffee in general, aimed at counterbalancing the increasing “brandification” of coffee.
Basic information on the characteristics of the plant and the drink would be conveyed, and the effort could be extended to the future Eastern European members of the Union, as a further assurance that EU processors would not have overly privileged access to these new markets. A quality label for every coffee sold in the Union could complement this initiative, helping to create a level playing field for products from outside the EU. A second project would consist of a careful diversification effort, to take place in selected producing countries.
Abstract:
This paper presents three contributions to the literature on the welfare cost of inflation. First, it introduces a new and sensible way of measuring this cost - that of a compensating variation in consumption or income, instead of the equivalent-variation notion that has been used extensively in empirical and theoretical research during the past fifty years. We find this new measure to be interestingly related to the proxy measure of the shopping-time welfare cost of inflation introduced by Simonsen and Cysne (2001). Secondly, it discusses for which money-demand functions this measure and the shopping-time measure can be evaluated in an economically meaningful way. And, last but not least, it completely orders a comprehensive set of measures of the welfare cost of inflation for these money-demand specifications. All of our results are extended to an economy in which many types of monies are present, and are illustrated with the log-log money-demand specification.
Abstract:
In this paper we construct sunspot equilibria that arise from chaotic deterministic dynamics. These equilibria are robust and therefore observable. We prove that they may be learned by a simple rule based on the histograms of past state variables. This work gives a theoretical justification for deterministic models that might compete with stochastic models to explain real data.
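The histogram-learning idea can be pictured with a standard chaotic map (not the paper's model, just an assumption-laden stand-in): the empirical distribution of the orbit's history converges to the map's known invariant distribution, so an agent tabulating past states effectively learns the long-run law of motion.

```python
import numpy as np

# iterate the chaotic logistic map x_{t+1} = 4 x_t (1 - x_t)
rng = np.random.default_rng(3)
T = 200_000
x = np.empty(T)
x[0] = rng.random()
for t in range(1, T):
    x[t] = 4.0 * x[t - 1] * (1.0 - x[t - 1])

# the distribution "learned" from the orbit's history vs this map's known
# invariant distribution, whose CDF is (2/pi) * arcsin(sqrt(x))
grid = np.linspace(0.05, 0.95, 19)
ecdf = np.searchsorted(np.sort(x), grid) / T
cdf = (2.0 / np.pi) * np.arcsin(np.sqrt(grid))
max_err = np.max(np.abs(ecdf - cdf))
print(f"max CDF error: {max_err:.4f}")
```

Even though the dynamics are fully deterministic, the learned histogram is indistinguishable in the limit from the stationary distribution of a stochastic model, which is the sense in which deterministic models can compete with stochastic ones.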
Abstract:
In trade agreements, governments can design remedies to ensure compliance (property rule) or to compensate victims (liability rule). This paper describes an economic framework to explain the pattern of remedies over non-tariff restrictions—particularly domestic subsidies and nonviolation complaints subject to liability rules. The key determinants of the contract form for any individual measure are the expected joint surplus from an agreement and the expected loss to the constrained government. The loss is higher for domestic subsidies and nonviolations because these are the policies most likely to correct domestic distortions. Governments choose property rules when expected gains from compliance are sufficiently high and expected losses to the constrained country are sufficiently low. Liability rules are preferable when dispute costs are relatively high, because inefficiencies in the compensation process reduce the number of socially inefficient disputes filed.
Abstract:
In this paper I investigate the conditions under which a convex capacity (i.e., a non-additive probability that exhibits uncertainty aversion) can be represented as a squeeze of an (additive) probability measure associated with an uncertainty-aversion function. I then present two alternative formulations of the Choquet integral (and extend these formulations to the Choquet expected utility) in a parametric approach that makes it easy to perform comparative-statics exercises over the uncertainty-aversion function.
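A minimal sketch of the objects involved, with an invented three-state example: the discrete Choquet integral of a nonnegative act with respect to a convex capacity, where the capacity is obtained by "squeezing" an additive probability P through a convex distortion g(p) = p², a simple stand-in for an uncertainty-aversion function.

```python
import numpy as np

def choquet(values, capacity, states):
    """Discrete Choquet integral of a nonnegative act w.r.t. a capacity.

    values[i]  : payoff in state states[i] (assumed >= 0)
    capacity(S): monotone set function with capacity(set of all states) == 1
    """
    order = np.argsort(values)           # sort payoffs in ascending order
    total, prev = 0.0, 0.0
    for k in range(len(order)):
        i = order[k]
        upper = frozenset(states[j] for j in order[k:])  # states paying >= values[i]
        total += (values[i] - prev) * capacity(upper)
        prev = values[i]
    return total

states = ["s1", "s2", "s3"]
P = {"s1": 0.2, "s2": 0.3, "s3": 0.5}

# a convex capacity: squeeze P through the convex distortion g(p) = p**2
capacity = lambda S: sum(P[s] for s in S) ** 2

act = np.array([10.0, 4.0, 1.0])
cev = choquet(act, capacity, states)
ev = sum(P[s] * v for s, v in zip(states, act))
print(cev, ev)   # the Choquet value lies below the expected value under P
```

Making the distortion more convex squeezes the capacity further and lowers the Choquet value, which is exactly the kind of comparative-statics exercise the parametric approach is built for.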
Abstract:
This article proposes an alternative methodology for estimating the effects of non-tariff measures on trade flows, based on the recent literature on gravity models. A two-stage Heckman selection model is applied to the case of Brazilian exports, where the second-stage gravity equation is theoretically grounded in the seminal Melitz model of heterogeneous firms. This extended gravity equation highlights the role played by zero trade flows as well as by firm heterogeneity in explaining bilateral trade among countries, two factors usually omitted in the traditional gravity specifications found in previous literature. Last, it also proposes an economic rationale for the effects of NTMs on trade flows, helping to shed some light on their main operating channels within a rather simple Cournot duopoly framework.
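The Heckman two-step mechanics can be sketched on synthetic data (all variables and parameter values below are invented for the example, not the article's Brazilian-export specification): a probit first stage models whether a bilateral flow is positive at all, and the second-stage gravity regression on positive flows adds the inverse Mills ratio to correct for the resulting selection.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(7)
n = 5000

# synthetic "gravity" data: z drives selection (whether trade occurs at all),
# x drives volume; the third column of z is the exclusion restriction
dist = rng.uniform(1.0, 10.0, n)                     # bilateral distance
z = np.column_stack([np.ones(n), -np.log(dist), rng.standard_normal(n)])
x = np.column_stack([np.ones(n), -np.log(dist)])
g_true, b_true, rho = np.array([1.0, 1.0, 0.8]), np.array([2.0, 1.5]), 0.6

u = rng.standard_normal(n)
e = rho * u + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(n)
trade = z @ g_true + u > 0.0                         # zero vs positive flow
log_flow = x @ b_true + e                            # latent log trade volume

# first stage: probit for the probability of a positive trade flow
q = 2.0 * trade - 1.0
nll = lambda g: -np.sum(norm.logcdf(q * (z @ g)))
g_hat = minimize(nll, np.zeros(z.shape[1]), method="BFGS").x

# second stage: OLS on positive flows, augmented with the inverse Mills ratio
zi = z[trade] @ g_hat
mills = norm.pdf(zi) / norm.cdf(zi)
X2 = np.column_stack([x[trade], mills])
coef, *_ = np.linalg.lstsq(X2, log_flow[trade], rcond=None)
print("distance elasticity estimate:", coef[1])      # true value is 1.5
```

Omitting the Mills-ratio column reproduces the selection bias that motivates the two-stage design: the distance coefficient would absorb part of the selection effect.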
Abstract:
We consider risk-averse convex stochastic programs expressed in terms of extended polyhedral risk measures. We derive computable confidence intervals on the optimal value of such stochastic programs using the Robust Stochastic Approximation and the Stochastic Mirror Descent (SMD) algorithms. When the objective functions are uniformly convex, we also propose a multistep extension of the Stochastic Mirror Descent algorithm and obtain confidence intervals on both the optimal values and optimal solutions. Numerical simulations show that our confidence intervals are much less conservative and quicker to compute than previously obtained confidence intervals for SMD, and that the multistep Stochastic Mirror Descent algorithm can obtain a good approximate solution much more quickly than its non-multistep counterpart. Our confidence intervals are also more reliable than asymptotic confidence intervals when the sample size is not much larger than the problem size.
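The averaging idea behind robust stochastic approximation can be shown on a toy problem (a one-dimensional risk-neutral objective with made-up parameters, using the Euclidean prox, i.e., the simplest instance of mirror descent; the paper's bounds for risk-averse programs are considerably more involved): run fixed-step stochastic gradient steps, average the iterates, and attach a plug-in confidence interval to the resulting objective value.

```python
import numpy as np

rng = np.random.default_rng(11)

# toy stochastic program: minimize f(x) = E[(x - xi)^2], xi ~ N(1, 1),
# with optimal solution x* = 1 and optimal value f(x*) = Var(xi) = 1
N = 50_000
step = 1.0 / np.sqrt(N)         # fixed step of the robust-SA policy
x, x_sum = 0.0, 0.0
for xi in rng.normal(1.0, 1.0, N):
    x -= step * 2.0 * (x - xi)  # stochastic gradient step
    x_sum += x                  # robust SA reports the averaged iterate
x_bar = x_sum / N

# plug-in 95% confidence interval for f(x_bar) from a fresh sample
m = 20_000
vals = (x_bar - rng.normal(1.0, 1.0, m)) ** 2
mean, half = vals.mean(), 1.96 * vals.std(ddof=1) / np.sqrt(m)
print(f"x_bar = {x_bar:.3f}, f(x_bar) in [{mean - half:.3f}, {mean + half:.3f}]")
```

The averaged iterate sits near the optimum and the fresh-sample interval brackets the optimal value 1; the paper's contribution is to construct such intervals with guarantees for risk-averse objectives and to tighten them via the multistep scheme.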