Abstract:
As worldwide consumer demand for high-quality products and for information about these products increases, labels and geographical indications (GIs) can serve to signal quality traits to consumers. However, GI systems among countries are not homogeneous and can be used as trade barriers against competition. Philosophical differences between the European Union and the United States about how GIs should be registered and protected led to the formation of a WTO dispute settlement panel. In this paper we discuss the issues behind the dispute, the World Trade Organization (WTO) panel decision, and the EU response to the panel decision leading to the new Regulation 510/2006. Given the potential for GI labels to supply consumer information, context is provided for the discussion using recent literature on product labeling. Implications are drawn regarding the importance of the panel decision and the EU response relative to GI issues yet to be negotiated under the Doha Round.
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P) with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, rather elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combining statistical information, as in Bayesian updating, the combination of likelihoods, or robust M-estimation functions, amounts to simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turns out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
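To make concrete the claim that Bayesian updating is a simple addition/perturbation in A2(Pprior), here is a minimal worked equation (our notation, not the paper's): Bayes' rule multiplies prior and likelihood, so in logarithmic coordinates, the coordinate change underlying the centered log-ratio transform, it becomes an addition up to a normalizing constant,

\[
\pi(\theta \mid x) \;\propto\; \pi(\theta)\, L(\theta; x)
\quad\Longleftrightarrow\quad
\log \pi(\theta \mid x) = \log \pi(\theta) + \log L(\theta; x) + \text{const},
\]

which is precisely the Aitchison perturbation operation carried over from the simplex to densities.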
Abstract:
The purpose of this research is to clarify the contribution of international dispute adjudication mechanisms to environmental protection. More specifically, the study aims to identify and develop the criterion adopted by the international judge in relation to compensation for environmental damages. In this perspective, the study identifies some gaps between international responsibility and environmental protection interests. The premise sustained throughout the study is that compensation is determinant in reconciling environmental prerogatives with mechanisms of international adjudication, in particular the system of international responsibility. Supported by the analysis of treaties, international decisions and secondary sources, the thesis defends the idea that some elements of international law allow the adjudicator to adapt the compensation to serve certain environmental interests, creating a new approach entitled 'fair compensation'. The antithesis of this approach is the idea that compensation in international law is limited exclusively to the strict reparation of the material losses incurred by the victim. As a synthesis, the study defends the specificity of environmental damages in relation to other kinds of damage that are subject to compensation under international law. The extent to which compensation for environmental damages could be classified as a specific type of damage under international law remains to be determined. The main conclusion of the study is that the existing standard of compensation defined by the theory and practice of international law cannot be strictly respected in cases involving environmental damages. This limitation is mainly due to the complexity of the notion of environment, which constantly conflicts with the anthropocentric view of legal theory. The study supports the idea that the establishment of a 'fair compensation', which takes into account the political, legal and technical context of the environmental damage, is the best possible approach to reconcile international responsibility and environmental interests. This could be implemented through the observance of certain elements by the international judge/arbitrator in a case-by-case analysis.
Abstract:
This paper provides empirical evidence on the explanatory factors affecting introductory prices of new pharmaceuticals in a heavily regulated and highly subsidized market. We collect a data set consisting of all new chemical entities launched in Spain between 1997 and 2005, and model launching prices. We find that, unlike in the US and Sweden, therapeutically "innovative" products are not overpriced relative to "imitative" ones. Price setting is mainly used as a mechanism to adjust for inflation, independently of the degree of innovation. The drugs that enter through the centralized EMA approval procedure are overpriced, which may be a consequence of market globalization and international price setting.
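As a rough illustration only: the abstract does not give the specification, but a launch-price model of the kind described might be a log-linear regression of price on an innovativeness indicator, an EMA-approval indicator, and launch-year effects. All names and the synthetic data below are our assumptions, not the paper's.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per new chemical entity launched in Spain, 1997-2005.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "price": rng.lognormal(3.0, 0.5, n),   # launch price (illustrative units)
    "innovative": rng.integers(0, 2, n),   # 1 = therapeutically "innovative"
    "ema": rng.integers(0, 2, n),          # 1 = centralized EMA approval
    "year": rng.integers(1997, 2006, n),   # launch year
})

# Log-linear launch-price equation with launch-year fixed effects.
model = smf.ols("np.log(price) ~ innovative + ema + C(year)", data=df).fit()
print(model.params)

On real data, the paper's findings would show up as an insignificant coefficient on innovative, a positive coefficient on ema, and year effects tracking inflation.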
Abstract:
We reformulate the Smets-Wouters (2007) framework by embedding the theory of unemployment proposed in Galí (2011a,b). We estimate the resulting model using postwar U.S. data, while treating the unemployment rate as an additional observable variable. Our approach overcomes the lack of identification of wage markup and labor supply shocks highlighted by Chari, Kehoe and McGrattan (2008) in their criticism of New Keynesian models, and allows us to estimate a "correct" measure of the output gap. In addition, the estimated model can be used to analyze the sources of unemployment fluctuations.
Abstract:
I revisit the General Theory's discussion of the role of wages in employment determination through the lens of the New Keynesian model. The analysis points to the key role played by the monetary policy rule in shaping the link between wages and employment, and in determining the welfare impact of enhanced wage flexibility. I show that the latter is not always welfare improving.
Abstract:
This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium, and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
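The abstract does not spell out the learning rule; a standard choice in this literature is constant-gain recursive least squares, sketched below under that assumption (all names are illustrative, not the paper's).

import numpy as np

# Constant-gain recursive least squares: an agent regresses an observed
# outcome y_t on regressors x_t and revises beliefs as new data arrive.
def cg_rls_step(beta, R, x, y, gain=0.02):
    """One belief-revision step with constant gain."""
    R = R + gain * (np.outer(x, x) - R)                           # regressor second moments
    beta = beta + gain * np.linalg.solve(R, x * (y - x @ beta))   # forecast-error correction
    return beta, R

# Illustrative loop: true process y = x'b + noise; the agent learns b.
rng = np.random.default_rng(1)
b_true = np.array([0.5, -0.3])
beta, R = np.zeros(2), np.eye(2)
for _ in range(5000):
    x = rng.normal(size=2)
    y = x @ b_true + 0.1 * rng.normal()
    beta, R = cg_rls_step(beta, R, x, y)
print(beta)  # close to b_true

Two-sided learning amounts to both the private sector and the central bank running updates of this kind on their own, possibly distinct and mis-specified, regressor sets.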
Abstract:
The subject "Value and prices in Russian economic thought (1890-1920)" should evoke several names and debates in the reader's mind. For a long time, Western scholars have been aware that the Russian economists Tugan-Baranovsky and Bortkiewicz were active participants in the debates over the Marxian transformation problem, that the mathematical models of Dmitriev prefigured later neo-Ricardian models, and that many Russian economists either supported the Marxian labour theory of value or were revisionists. Moreover, these ideas prepared the ground for Soviet planning. Russian scholars additionally knew that this period saw the introduction of marginalism in Russia, and that, during this period, economists were actively thinking about the relation of ethics to economic theory. All these issues are well covered in the existing literature. But there is a big gap that this dissertation intends to fill. The existing literature handles these pieces separately, although they are part of a single, more general, history. All these issues (the labour theory of value, marginalism, the Marxian transformation problem, planning, ethics, mathematical economics) were part of what this dissertation calls "the Russian synthesis". The Russian synthesis (in the singular) designates here all the attempts at synthesis between classical political economy and marginalism, between the labour theory of value and marginal utility, and between value and prices that occurred in Russian economic thought between 1890 and 1920, and that embrace the whole set of issues evoked above. This dissertation has the ambition of being the first comprehensive history of that Russian synthesis. In this respect, the contribution is unique. It has always surprised the author of the present dissertation that such a book has not yet been written. Several good reasons, both in terms of the scarce availability of sources and of ideological restrictions, may account for a reasonable delay of several decades. But it is now urgent to remedy the situation before the protagonists of the Russian synthesis are definitively classified under the wrong labels in the pantheon of economic thought. To accomplish this task, it has seldom been sufficient to gather together the various existing studies on aspects of this story. It has been necessary to return to the primary sources in the Russian language. The most important part of the primary literature has never been translated, and in recent years only some of it has been republished in Russian. Therefore, most translations from the Russian have been made by the author of the present dissertation. The secondary literature has been surveyed in the languages that are familiar (Russian, English and French) or almost familiar (German) to the present author, and which are hopefully the most pertinent to the present investigation. Besides, in order to deepen acquaintance with the texts, which was the objective of all this, some archival sources were used. The analysis consists of careful chronological studies of the authors' writings and of their evolution in their historical and intellectual context. As a consequence, the dissertation brings new authors to the foreground, Shaposhnikov and Yurovsky, who were traditionally confined to the substitutes' bench because they only superficially touched the domains quoted above. In the Russian synthesis, however, they played an important part in the story.
As a side effect, some authors who used to play in the foreground, Dmitriev and Bortkiewicz, are relegated to the background, but they are not forgotten. Besides, the dissertation refreshes the views on authors already known, such as Ziber and, especially, Tugan-Baranovsky. The ultimate objective of this dissertation is to change the opinion that one could have on "value and prices in Russian economic thought" by setting the Russian synthesis at the centre of the debates.
Abstract:
Most central banks perceive a trade-off between stabilizing inflation and stabilizing the gap between output and desired output. However, the standard new Keynesian framework implies no such trade-off. In that framework, stabilizing inflation is equivalent to stabilizing the welfare-relevant output gap. In this paper, we argue that this property of the new Keynesian framework, which we call the divine coincidence, is due to a special feature of the model: the absence of nontrivial real imperfections. We focus on one such real imperfection, namely, real wage rigidities. When the baseline new Keynesian model is extended to allow for real wage rigidities, the divine coincidence disappears, and central banks indeed face a trade-off between stabilizing inflation and stabilizing the welfare-relevant output gap. We show that not only does the extended model have more realistic normative implications, but it also has appealing positive properties. In particular, it provides a natural interpretation for the dynamic inflation-unemployment relation found in the data.
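In textbook form (our notation; the paper's own derivation may differ), the mechanism fits in one equation. The baseline New Keynesian Phillips curve is

\[
\pi_t = \beta\, \mathbb{E}_t[\pi_{t+1}] + \kappa\, x_t,
\]

where \(x_t\) is the welfare-relevant output gap; keeping \(\pi_t = 0\) in every period then forces \(x_t = 0\), which is the divine coincidence. Real wage rigidities introduce an endogenous cost-push wedge \(u_t\), so that \(\pi_t = \beta\, \mathbb{E}_t[\pi_{t+1}] + \kappa\, x_t + u_t\), and inflation and the welfare-relevant gap can no longer be stabilized simultaneously.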
Abstract:
We construct a utility-based model of fluctuations, with nominal rigidities and unemployment, and draw its implications for the unemployment-inflation trade-off and for the conduct of monetary policy. We proceed in two steps. We first leave nominal rigidities aside. We show that, under a standard utility specification, productivity shocks have no effect on unemployment in the constrained efficient allocation. We then focus on the implications of alternative real wage setting mechanisms for fluctuations in unemployment. We show the role of labor market frictions and real wage rigidities in determining the effects of productivity shocks on unemployment. We then introduce nominal rigidities in the form of staggered price setting by firms. We derive the relation between inflation and unemployment and discuss how it is influenced by the presence of labor market frictions and real wage rigidities. We show the nature of the trade-off between inflation and unemployment stabilization, and its dependence on labor market characteristics. We draw the implications for optimal monetary policy.
Abstract:
This paper reviews two recent books on Political Economy, one by Allan Drazen and the other by Torsten Persson and Guido Tabellini. It discusses some problems in the recent Political Economy literature.
Abstract:
There is a gap between the importance given to accounting and the low level of bookkeeping and accounting practice in the agricultural sector. Current general accounting rules do not adapt very well to the particularities of farming and are difficult and expensive to implement. The Farm Accountancy Data Network (FADN) and the IASC's Proposed International Accounting Standard on Agriculture (PIASA) could be key elements in improving the use of accounting on European farms. The PIASA provides a strong conceptual framework but might need further instruments for its implementation in practice. FADN is an experienced network that has elaborated very detailed farm accounting procedures. Empirical data indicate that current FADN reports are already considered useful by farmers for different purposes. Some changes in the FADN procedures are suggested, while some aspects of FADN are worth incorporating into the future IAS on agriculture.
Abstract:
This paper evaluates the empirical and theoretical contributions of the Economic Growth Literature since the publication of Paul Romer's seminal paper in 1986.