11 results for Efficiency Rating
at Université de Montréal, Canada
Abstract:
A contingent contract in a transferable utility game under uncertainty specifies an outcome for each possible state. It is assumed that coalitions evaluate these contracts by considering the minimal possible excesses. A main question of the paper concerns the existence and characterization of efficient contracts. It is shown that they exist if and only if the set of possible coalitions contains a balanced subset. Moreover, a characterization of values that result in efficient contracts in the case of minimally balanced collections is provided.
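The balancedness condition driving the existence result can be checked computationally. Below is a minimal sketch, assuming players are indexed 0..n-1 and coalitions are given as sets: a collection is balanced if strictly positive weights exist that sum to 1 over the coalitions containing each player. The small positivity floor `eps` is a practical stand-in for strict positivity.

```python
# Sketch: test whether a collection of coalitions is balanced, i.e. whether
# positive weights lambda_S exist with sum over {S : i in S} of lambda_S = 1
# for every player i. Solved as a linear feasibility problem.
from scipy.optimize import linprog

def is_balanced(n_players, coalitions, eps=1e-9):
    """coalitions: list of sets of player indices drawn from range(n_players)."""
    # Incidence matrix A: A[i][j] = 1 if player i belongs to coalition j.
    A = [[1.0 if i in S else 0.0 for S in coalitions] for i in range(n_players)]
    b = [1.0] * n_players
    res = linprog(c=[0.0] * len(coalitions),          # pure feasibility problem
                  A_eq=A, b_eq=b,
                  bounds=[(eps, None)] * len(coalitions))
    return res.status == 0  # status 0: a feasible weight vector was found

# Example: {1,2}, {1,3}, {2,3} is balanced for N = {1,2,3} with weights 1/2.
print(is_balanced(3, [{0, 1}, {0, 2}, {1, 2}]))  # True
print(is_balanced(3, [{0, 1}, {0, 2}]))          # False
```

Per the abstract, the existence condition asks whether the set of possible coalitions contains *some* balanced subset, so in general this test would be iterated over subcollections.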
Abstract:
In this paper we propose exact likelihood-based mean-variance efficiency tests of the market portfolio in the context of the Capital Asset Pricing Model (CAPM), allowing for a wide class of error distributions which includes normality as a special case. These tests are developed in the framework of multivariate linear regressions (MLR). It is well known, however, that despite their simple statistical structure, standard asymptotically justified MLR-based tests are unreliable. In financial econometrics, exact tests have been proposed for a few specific hypotheses [Jobson and Korkie (Journal of Financial Economics, 1982), MacKinlay (Journal of Financial Economics, 1987), Gibbons, Ross and Shanken (Econometrica, 1989), Zhou (Journal of Finance, 1993)], most of which depend on normality. For the Gaussian model, our tests correspond to Gibbons, Ross and Shanken's mean-variance efficiency tests. In non-Gaussian contexts, we reconsider mean-variance efficiency tests allowing for multivariate Student-t and Gaussian mixture errors. Our framework allows us to shed more light on whether the normality assumption is too restrictive when testing the CAPM. We also propose exact multivariate diagnostic checks (including tests for multivariate GARCH and a multivariate generalization of the well-known variance ratio tests) and goodness-of-fit tests, as well as a set estimate for the intervening nuisance parameters. Our results over five-year subperiods show the following: (i) multivariate normality is rejected in most subperiods, (ii) residual checks reveal no significant departures from the multivariate i.i.d. assumption, and (iii) mean-variance efficiency of the market portfolio is not rejected as frequently once the possibility of non-normal errors is allowed for.
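For reference, the Gaussian benchmark the abstract mentions is the Gibbons-Ross-Shanken F-test. A minimal sketch follows; the inputs `R` (a T x N array of asset excess returns) and `rm` (a length-T array of market excess returns) are hypothetical.

```python
# Sketch: Gibbons-Ross-Shanken (1989) mean-variance efficiency test,
# exact under i.i.d. normal errors.
import numpy as np
from scipy import stats

def grs_test(R, rm):
    """R: T x N excess returns on test assets; rm: length-T market excess returns."""
    T, N = R.shape
    X = np.column_stack([np.ones(T), rm])          # regressors: constant + market
    coef, *_ = np.linalg.lstsq(X, R, rcond=None)   # OLS for all assets at once
    alpha = coef[0]                                # N estimated pricing errors
    resid = R - X @ coef
    Sigma = resid.T @ resid / T                    # MLE of residual covariance
    theta2 = rm.mean() ** 2 / rm.var()             # squared market Sharpe ratio (MLE)
    J = (T - N - 1) / N * (alpha @ np.linalg.solve(Sigma, alpha)) / (1 + theta2)
    return J, stats.f.sf(J, N, T - N - 1)          # exact p-value under normality
```

Under the null that the market portfolio is mean-variance efficient, all intercepts are zero and J follows an F(N, T-N-1) distribution; the paper's contribution is extending exactness beyond this normal case.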
Abstract:
This paper develops a model where the value of the monetary policy instrument is selected by a heterogeneous committee engaged in a dynamic voting game. Committee members differ in their institutional power and, in certain states of nature, they also differ in their preferred instrument value. Preference heterogeneity and concern for the future interact to generate decisions that are dynamically inefficient and inertial around the previously-agreed instrument value. This model endogenously generates autocorrelation in the policy variable and provides an explanation for the empirical observation that the nominal interest rate under the central bank's control is infrequently adjusted.
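The inertia mechanism can be illustrated with a toy simulation. This is purely illustrative and not the paper's model: here a committee retains the status-quo rate unless a supermajority prefers a new proposal, which mechanically produces infrequent adjustment and autocorrelation in the policy path. All parameters are hypothetical.

```python
# Illustrative sketch (not the paper's model): status-quo voting generates
# an inertial, autocorrelated policy rate.
import random

def simulate(periods=200, members=7, threshold=5, seed=0):
    rng = random.Random(seed)
    fundamental, rate, path = 2.0, 2.0, []
    for _ in range(periods):
        fundamental += rng.gauss(0.0, 0.25)            # evolving state of nature
        ideals = [fundamental + rng.gauss(0.0, 0.5) for _ in range(members)]
        proposal = sorted(ideals)[members // 2]        # median member's proposal
        # A member votes for the proposal only if it beats the status quo.
        votes = sum(1 for x in ideals if abs(x - proposal) < abs(x - rate))
        if votes >= threshold:                         # supermajority required
            rate = proposal
        path.append(rate)
    return path

path = simulate()
changes = sum(1 for a, b in zip(path, path[1:]) if a != b)
print(f"{changes} adjustments over {len(path)} periods")  # infrequent changes
```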
Abstract:
This paper is an examination of the Supreme Court of Canada's interpretation of federalism since constitutional repatriation in 1982. It argues that the lure of centralist efficiency is overpowering a fundamentally important part of our federal order: regionalism. The author contends that changes made by the Court to certain fundamental concepts of Canadian constitutional law now provide Parliament with greater latitude than before in the exercise of its legislative powers. According to the author, these changes are disturbing because they are structured so as to preclude consideration of the legitimate concerns of regional polities. Furthermore, he argues that the Court has reinforced the central government's power to regulate the economy, including intraprovincial matters affecting trade, by resorting to highly functional tests that emphasize economic efficiency over other criteria. This, he claims, makes it more difficult to invoke legitimate regional interests that would lead to duplication, overlap and even, in the eyes of some, inefficiency. The author then focuses on the Court's treatment of environmental protection in an attempt to show the tension between the Court's desire to use a functional approach and the need to recognize regional interests. Finally, through an examination of recent case law, he attempts to demonstrate that the Court's dominant perspective remains functional despite its endorsement of a more community-oriented understanding of federalism in the Secession Reference. If the Court chooses to proceed in this manner, it will alienate regional polities and may encourage them to choose more radical means of asserting their differences. Further, the author argues that strict adherence to the functional effectiveness approach will undermine the very values that federalism is meant to promote.
Abstract:
This paper develops a bargaining model of wage and employment determination for the public sector. The solution to the model generates structural wage and employment equations that are estimated using data from New York State teacher-school district collective bargaining agreements.
Abstract:
It is highly desirable for an allocation of goods to be efficient. However, one might also deem it important that an allocation gives individuals what they deserve. This paper investigates whether it is possible for an allocation to be both efficient and give people what they deserve. It first considers comparative desert and concludes that it is possible to satisfy both desiderata. It then considers absolute desert by integrating Shelly Kagan's work on desert with economic theory. The conclusion is that there are potential conflicts between absolute desert and efficiency. The paper then examines how to select the best compromise between the two values, considering several different conceptions of absolute desert.
Abstract:
Aging individuals must cope daily with chronic pain. The aim of this work is to better understand the underlying mechanisms that may contribute to age-related chronic pain and thereby open a path toward new therapeutic perspectives. Diffuse noxious inhibitory controls (DNIC) play a substantial role in pain control. Experimental studies examining the analgesic effect of heterotopic noxious counter-stimulation (HNCS), a protocol used to test the effectiveness of DNIC, suggest that DNIC recruitment is weaker (i.e., less inhibition) in this population compared with younger adults. In contrast, studies examining central sensitization induced by temporal summation (TS) of pain report mixed results. Moreover, an important component influencing the pain experience, cognitive resources, including cognitive inhibition, also declines with age. First, DNIC recruitment was compared between healthy young and older participants using HNCS, and the recruitment of central sensitization mechanisms was compared using TS. Electrical stimulation of the sural nerve was chosen to quantify pain while providing an index of spinal nociception, the spinal nociceptive reflex (RIII). Participants also completed a cognitive task (the Stroop) testing cognitive inhibition. Second, the effectiveness of DNIC and of cognitive inhibition was tested in young and older adults using magnetic resonance imaging (MRI), in order to examine the relationship between these two psychophysical measures and the cortical thickness of the regions involved, as well as the effect of age on them. The results suggest weaker DNIC recruitment in older adults during HNCS. Older participants also showed weaker cognitive inhibition than younger ones. In addition, a correlation between cognitive inhibition and the modulation of the RIII reflex by HNCS was found. For the TS experiment, results were comparable across the two groups, suggesting that the mechanisms involved in pain regulation are not affected by age in the same way. The cortical thickness study revealed a global age-related decrease in cortical thickness, as well as a correlation between HNCS analgesia and cognitive inhibition, and a relationship of both with the cortical thickness of the left lateral orbitofrontal cortex (OFC), suggesting the possible existence of an at least partially shared neural network for descending sensory and cognitive inhibitory control. This work shows that the effect of age on the central mechanisms of pain regulation is far from uniform. It also shows a correlation between endogenous pain modulation and cognitive inhibition, two processes that appear to be associated with the same brain region. These results could help identify other therapeutic methods, opening a new avenue toward additional options in the management of chronic pain in aging individuals.
Abstract:
We consider two new approaches to nonparametric estimation of the leverage effect. The first approach uses stock prices alone. The second approach uses the data on stock prices as well as a certain volatility instrument, such as the CBOE volatility index (VIX) or the Black-Scholes implied volatility. The theoretical justification for the instrument-based estimator relies on a certain invariance property, which can be exploited when high-frequency data are available. The price-only estimator is more robust since it is valid under weaker assumptions. However, in the presence of a valid volatility instrument, the price-only estimator is inefficient, as the instrument-based estimator has a faster rate of convergence. We consider two empirical applications, in which we study the relationship between the leverage effect and the debt-to-equity ratio, credit risk, and illiquidity.
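To convey the idea behind the instrument-based approach, here is a crude sketch: with a volatility proxy observed alongside prices, the leverage effect can be illustrated as the correlation between log-price increments and increments of the proxy. This is a simplified stand-in for the paper's estimators, and the file and column names (e.g. 'spx', 'vix') are hypothetical.

```python
# Sketch: crude leverage-effect proxy from prices and a volatility instrument.
import numpy as np
import pandas as pd

def crude_leverage(prices: pd.Series, vol_instrument: pd.Series) -> float:
    dp = np.log(prices).diff().dropna()      # log-price increments
    dv = vol_instrument.diff().dropna()      # volatility-instrument increments
    dp, dv = dp.align(dv, join="inner")      # keep common timestamps only
    return float(np.corrcoef(dp, dv)[0, 1])  # typically negative for equities

# Usage with hypothetical high-frequency data:
# df = pd.read_csv("spx_vix_5min.csv", index_col=0, parse_dates=True)
# print(crude_leverage(df["spx"], df["vix"]))
```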
Abstract:
Each item in a given collection is characterized by a set of possible performances. A (ranking) method is a function that assigns an ordering of the items to every performance profile. Ranking by Rating consists in evaluating each item’s performance by using an exogenous rating function, and ranking items according to their performance ratings. Any such method is separable: the ordering of two items does not depend on the performances of the remaining items. We prove that every separable method must be of the ranking-by-rating type if (i) the set of possible performances is the same for all items and the method is anonymous, or (ii) the set of performances of each item is ordered and the method is monotonic. When performances are m-dimensional vectors, a separable, continuous, anonymous, monotonic, and invariant method must rank items according to a weighted geometric mean of their performances along the m dimensions.
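The characterized method in the vector case is easy to instantiate. A minimal sketch, assuming positive performance scores and exogenously given (hypothetical) weights:

```python
# Sketch: ranking by rating with a weighted geometric mean over m dimensions.
import math

def rank_by_rating(performances, weights):
    """performances: dict item -> length-m vector of positive scores."""
    def rating(vec):
        # Weighted geometric mean: prod_k vec[k] ** weights[k].
        return math.prod(x ** w for x, w in zip(vec, weights))
    return sorted(performances, key=lambda item: rating(performances[item]),
                  reverse=True)

# Example: two criteria weighted 0.7 / 0.3.
items = {"A": (4.0, 9.0), "B": (6.0, 5.0), "C": (5.0, 6.0)}
print(rank_by_rating(items, (0.7, 0.3)))  # ['B', 'C', 'A']
```

Note that this method is separable by construction: each item's rating depends only on its own performance vector, so the relative order of two items never depends on the rest of the collection.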