15 results for Assimilation Efficiency
at Université de Montréal, Canada
Abstract:
Research report
Abstract:
A contingent contract in a transferable utility game under uncertainty specifies an outcome for each possible state. It is assumed that coalitions evaluate these contracts by considering the minimal possible excesses. A main question of the paper concerns the existence and characterization of efficient contracts. It is shown that they exist if and only if the set of possible coalitions contains a balanced subset. Moreover, a characterization of values that result in efficient contracts in the case of minimally balanced collections is provided.
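For orientation, here is a minimal sketch of the standard notions the abstract invokes, in our own notation (the paper's exact definitions may differ in detail):

```latex
% Excess of coalition S at the state-\omega outcome x_\omega of a contract:
e_\omega(S, x) = v_\omega(S) - \sum_{i \in S} x_{\omega,i},
% and coalitions are assumed to evaluate a contingent contract
% x = (x_\omega)_{\omega \in \Omega} by its minimal excess over states:
\underline{e}(S, x) = \min_{\omega \in \Omega} e_\omega(S, x).
% A collection \mathcal{B} of coalitions is balanced if there exist weights
% \lambda_S > 0, S \in \mathcal{B}, such that for every player i:
\sum_{S \in \mathcal{B},\; S \ni i} \lambda_S = 1.
```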
Abstract:
In this paper we propose exact likelihood-based mean-variance efficiency tests of the market portfolio in the context of the Capital Asset Pricing Model (CAPM), allowing for a wide class of error distributions that includes normality as a special case. These tests are developed in the framework of multivariate linear regressions (MLR). It is well known, however, that despite their simple statistical structure, standard asymptotically justified MLR-based tests are unreliable. In financial econometrics, exact tests have been proposed for a few specific hypotheses [Jobson and Korkie (Journal of Financial Economics, 1982), MacKinlay (Journal of Financial Economics, 1987), Gibbons, Ross and Shanken (Econometrica, 1989), Zhou (Journal of Finance, 1993)], most of which depend on normality. For the Gaussian model, our tests correspond to Gibbons, Ross and Shanken's mean-variance efficiency tests. In non-Gaussian contexts, we reconsider mean-variance efficiency tests allowing for multivariate Student-t and Gaussian mixture errors. Our framework allows us to shed more light on whether the normality assumption is too restrictive when testing the CAPM. We also propose exact multivariate diagnostic checks (including tests for multivariate GARCH and a multivariate generalization of the well-known variance ratio tests) and goodness-of-fit tests, as well as a set estimate for the intervening nuisance parameters. Our results, over five-year subperiods, show the following: (i) multivariate normality is rejected in most subperiods; (ii) residual checks reveal no significant departures from the multivariate i.i.d. assumption; and (iii) mean-variance efficiency of the market portfolio is not rejected as frequently once the possibility of non-normal errors is allowed for.
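Under Gaussian errors the benchmark mentioned here is the Gibbons-Ross-Shanken statistic. The sketch below computes it for a single-factor (market) model; it is a textbook illustration, not the paper's exact simulation-based test procedure, and the function name and interface are our own.

```python
import numpy as np
from scipy import stats

def grs_test(excess_returns, market_excess):
    """Gibbons-Ross-Shanken style F-test that all CAPM intercepts are zero.

    excess_returns : (T, N) array of asset excess returns
    market_excess  : (T,)   array of market excess returns
    Returns (F statistic, p-value); exact under i.i.d. Gaussian errors.
    """
    T, N = excess_returns.shape
    X = np.column_stack([np.ones(T), market_excess])   # intercept + market factor
    B, *_ = np.linalg.lstsq(X, excess_returns, rcond=None)
    alpha = B[0]                                       # (N,) estimated intercepts
    resid = excess_returns - X @ B
    Sigma = resid.T @ resid / T                        # MLE residual covariance
    mu_m = market_excess.mean()
    sig2_m = market_excess.var()                       # MLE market variance (ddof=0)
    quad = alpha @ np.linalg.solve(Sigma, alpha)
    F = ((T - N - 1) / N) * quad / (1 + mu_m**2 / sig2_m)
    return F, stats.f.sf(F, N, T - N - 1)              # F(N, T-N-1) under the null
```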
Abstract:
This paper develops a model where the value of the monetary policy instrument is selected by a heterogeneous committee engaged in a dynamic voting game. Committee members differ in their institutional power and, in certain states of nature, they also differ in their preferred instrument value. Preference heterogeneity and concern for the future interact to generate decisions that are dynamically inefficient and inertial around the previously agreed instrument value. This model endogenously generates autocorrelation in the policy variable and provides an explanation for the empirical observation that the nominal interest rate under the central bank's control is infrequently adjusted.
Abstract:
This paper is an examination of the Supreme Court of Canada's interpretation of federalism since constitutional repatriation in 1982. It argues that the lure of centralist efficiency is overpowering a fundamentally important part of our federal order: regionalism. The author contends that changes made by the Court to certain fundamental concepts of Canadian constitutional law now provide Parliament with greater latitude than before in the exercise of its legislative powers. According to the author, these changes are disturbing because they are structured so as to preclude consideration of the legitimate concerns of regional polities. Furthermore, he argues that the Court has reinforced the central government's power to regulate the economy, including intraprovincial matters affecting trade, by resorting to highly functional tests that emphasize economic efficiency over other criteria. This, he claims, makes it more difficult to invoke legitimate regional interests that would lead to duplication, overlapping and even, in the eyes of some, inefficiency. The author then focuses on the Court's treatment of environmental protection in an attempt to show the tension between the Court's desire to use a functional approach and the need to recognize regional interests. Finally, through an examination of recent case law, he attempts to demonstrate that the Court's dominant perspective remains functional despite its endorsement of a more community-oriented understanding of federalism in the Secession Reference. If the Court chooses to proceed in this manner, it will alienate regional polities and may encourage them to choose more radical means of asserting their differences. Further, the author argues that strict adherence to the functional effectiveness approach will undermine the very values that federalism is meant to promote.
Abstract:
This paper develops a bargaining model of wage and employment determination for the public sector. The solution to the model generates structural wage and employment equations that are estimated using data from New York State teacher-school district collective bargaining agreements.
Abstract:
It is highly desirable for an allocation of goods to be efficient. However, one might also deem it important that an allocation gives individuals what they deserve. This paper investigates whether it is possible for an allocation to be both efficient and give people what they deserve. It will first of all consider comparative desert, and conclude that it is possible to satisfy both desiderata. It will then consider absolute desert by integrating Shelly Kagan’s work on desert and economic theory. The conclusion will be that there are potential conflicts between absolute desert and efficiency. The paper will then examine how to select the best compromise between the two values, considering several different conceptions of absolute desert.
Abstract:
Bodies, Saracen Giants, and the Medieval Romance: Transgression, Difference, and Assimilation examines the treatment of the bodies of three Saracen giants in the romances Roland and Vernagu (c. 1330), Sir Beues of Hamtoun (c. 1330), and The Taill of Rauf Coilyear (c. 1513-42). Through a study of the representation of these three Saracen giants, of the significance of the human body in the Middle Ages, and of the practices of Christendom, in keeping with the discourses and ideologies toward the Near East that prevailed in the medieval West, this master's thesis juxtaposes the Saracen giant and the romance hero to point to an apparent similarity between their two bodies and their respective religions. The romance hesitantly displays a desire to assimilate the Saracen giant into the heroic code as well as into the Christian religion, yet often rejects the giant's body with suspicion through his death on the battlefield. Whether through his death or his assimilation into the heroic code and Christendom, the body of the Saracen giant remains important in the context of the romance, since it contributes to the construction of the identity of the hero, of his faith, and of his society.
Abstract:
Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal.
Abstract:
The use of a data assimilation method, combined with an anelastic convection model, allows us to reconstruct the physical structures of a portion of the convection zone located beneath an active solar region. The results obtained inform us about the processes by which magnetic flux tubes emerge through the convection zone, as well as about the mechanisms by which active regions form. The solar data used come from the MDI instrument aboard the SOHO space observatory and concern mainly active region AR9077 during the "Bastille Day" event of July 14, 2000. This event led to a solar flare, followed by a major coronal mass ejection. The assimilated data (magnetograms, temperature maps, and vertical velocity maps) cover a surface 175 megameters on a side, acquired at the photospheric level. The data assimilation method employed is "back and forth nudging", a Newtonian relaxation method similar to the "quasi-linear inverse 3D" method. Its originality lies in not requiring the computation of the adjoint equations of the physical model, and the simplicity of the method is thus a substantial numerical advantage. Through a simple test, our study shows the applicability of this method to a convection model used within the anelastic approximation. We thereby demonstrate the effectiveness of this method and reveal its potential for the assimilation of solar data. To ensure the mathematical uniqueness of the obtained solution, we impose a regularization over the entire simulated domain. Finally, we show that the interest of the method is not limited to the reconstruction of convective structures: it also allows the optimal interpolation of photospheric magnetograms, and even the prediction of their temporal evolution.
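To make the "back and forth nudging" idea concrete, here is a toy sketch of the Newtonian relaxation scheme on a scalar ODE; the dynamics, gain, and observation setup are illustrative stand-ins, not the thesis's anelastic convection model.

```python
import numpy as np

# Toy Back-and-Forth Nudging (BFN) on the scalar ODE dx/dt = -x.
# Each forward sweep relaxes the model toward the observations; the
# backward sweep integrates in reverse time with the opposite feedback
# sign. Iterating recovers the (unknown) initial state.

def f(x):
    return -x                      # toy dynamics (stand-in for the physical model)

T, n = 2.0, 2000                   # time horizon and number of steps
dt = T / n
t = np.linspace(0.0, T, n + 1)
x_true0 = 1.5
obs = x_true0 * np.exp(-t)         # synthetic "observations" of the true trajectory

K = 10.0                           # nudging (Newtonian relaxation) gain
x0 = 0.0                           # deliberately wrong first guess of the initial state

for _ in range(20):                # back-and-forth iterations
    x = x0
    for k in range(n):             # forward sweep, nudged toward the observations
        x += dt * (f(x) + K * (obs[k] - x))
    for k in range(n, 0, -1):      # backward sweep with reversed feedback sign
        x -= dt * (f(x) - K * (obs[k] - x))
    x0 = x                         # updated estimate of the initial condition

print(f"recovered initial state: {x0:.4f} (true value {x_true0})")
```

Note that neither sweep requires the adjoint of `f`, which is the numerical simplicity the abstract emphasizes.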
Abstract:
With the participation of 38 Montrealers of Congolese origin: 21 women and 17 men.
Abstract:
The full version of this thesis is available only for individual consultation at the Music Library of the Université de Montréal (www.bib.umontreal.ca/MU).
Abstract:
We consider two new approaches to nonparametric estimation of the leverage effect. The first approach uses stock prices alone. The second approach uses the data on stock prices as well as a certain volatility instrument, such as the CBOE volatility index (VIX) or the Black-Scholes implied volatility. The theoretical justification for the instrument-based estimator relies on a certain invariance property, which can be exploited when high-frequency data are available. The price-only estimator is more robust since it is valid under weaker assumptions. However, in the presence of a valid volatility instrument, the price-only estimator is inefficient, as the instrument-based estimator has a faster rate of convergence. We consider two empirical applications, in which we study the relationship between the leverage effect and the debt-to-equity ratio, credit risk, and illiquidity.
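As a rough illustration of the price-only idea, the sketch below correlates block returns with changes in a local realized-variance proxy computed from the same prices. It is our simplification for intuition only: the paper's estimators involve bias corrections and an asymptotic theory omitted here, and the function name, interface, and block size are assumptions.

```python
import numpy as np

def leverage_effect(log_prices, block=390):
    """Naive price-only leverage-effect proxy from high-frequency prices.

    log_prices : 1-D array of intraday log prices
    block      : observations per local window (e.g., one day of 1-minute bars)
    Returns the correlation between block returns and subsequent changes
    in block realized variance (negative under the leverage effect).
    """
    r = np.diff(log_prices)
    n_blocks = r.size // block
    r = r[: n_blocks * block].reshape(n_blocks, block)
    rv = (r ** 2).sum(axis=1)          # realized variance per block (spot-variance proxy)
    ret = r.sum(axis=1)                # return over each block
    dv = np.diff(rv)                   # change in the local variance estimate
    return np.corrcoef(ret[:-1], dv)[0, 1]
```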