959 results for [JEL:G1] Financial Economics - General Financial Markets
Abstract:
Since the late 1970s, but particularly since the mid-1980s, the economy of Nicaragua has had persistent and large macroeconomic imbalances, while GDP per capita has declined to 1950s levels. By the second half of the 1980s, huge fiscal deficits and a reduction in foreign financing resulted in record hyperinflation. The Sandinista government's (1979–1990) harsh stabilization program of 1988–89 had only modest and short-lived success. It was doomed by the government's inability to lower the public sector deficit because of the war, together with diminishing financial support from abroad. Hyperinflation stopped only after the Sandinistas' 1990 electoral defeat ended the war and massive aid began to flow in. Five years later, macroeconomic stability remains very fragile. A sluggish recovery of export agriculture, combined with import liberalization, has impeded a reduction of the huge trade and current account deficits. Facing the prospect of diminished aid flows, the government's strategy has hinged on achieving a real devaluation through a crawling-peg adjustment of the nominal rate. However, at the end of 1995 the external accounts were still critical, and the modest progress achieved was attributable to a cyclical terms-of-trade improvement and changes in the political outlook of agricultural producers. Using a Computable General Equilibrium Model and a Social Accounting Matrix constructed for this dissertation, the importance of structural rigidities in production and demand in explaining this outcome is shown. Under the plausible structural assumptions incorporated in the model, the role of devaluation in the adjustment process is restricted by structural rigidities. Moreover, contrary to the premise of the orthodox economic thinking behind the economic program, it is the contractionary effect of devaluation, more than its expenditure-switching effects, that provides the basis for its use in solving the external sector's problems.
A fixed nominal exchange rate is found to lead to adverse results. The broader conclusion that emerges from the study is that a new social compact, a rapid increase in infrastructure spending, and fiscal support for the traditional agro-export activities are at the center of a successful adjustment towards external viability in Nicaragua.
Abstract:
Since H. G. Johnson's work on tariff retaliation (Review of Economic Studies, 1953–54), the questions of whether a country can win a "tariff war" and how, as well as the broader question of what affects a country's strategic position in setting bilateral tariffs, have been tackled in various settings. Although it is widely accepted that a country will have a strategic advantage in winning the tariff war if its relative monopoly power is sufficiently large, it is unclear what forces lie behind the formation of such power. The goal of this research is to provide a unified framework and to discuss simultaneously various forces such as relative country size, absolute advantages, and relative advantages. In a two-country continuum-of-commodity neoclassical trade model, it is shown that a sufficiently large relative country size is a sufficient condition for a country to choose a non-cooperative tariff Nash equilibrium over free trade. It is also shown that technology disparities such as absolute advantage, the rate of technology disparity, and the distribution of the technology disparity all contribute to a country's strategic position and interact with country size.

The leverage effect is usually invoked to explain the phenomenon of asymmetric volatility in equity returns. However, leverage itself can account for only part of the asymmetry. This research shows that stock return volatility is related to firms' financial status: financially constrained firms tend to be more sensitive to return changes. The financial constraint factor explains why some firms tend to be more volatile than others, and I find that it explains stock return volatility independently of other factors such as firm size, industry affiliation, and leverage. Firms' industry affiliations are shown to be very weak in differentiating volatility, whereas firm size is shown to be a good factor for distinguishing different levels of volatility and volatility-return sensitivity. The leverage hypothesis is also partly corroborated, and the situations in which the leverage effect is not applicable are discussed. Finally, I examine the effects of macroeconomic policy on overall market volatility.
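The asymmetry the essay describes can be made concrete with a small simulation. The sketch below (all parameter values are made up for illustration; this is not the dissertation's specification) generates returns from a GJR-style process in which negative returns raise next-period variance more than positive returns of the same size, and then checks the asymmetry with a simple conditional-variance diagnostic.

```python
import numpy as np

# Minimal numpy sketch of asymmetric volatility: a GJR-GARCH(1,1)-style
# process where the extra 0.14 variance term fires only after losses.
# Parameters are illustrative, not estimated from any data.
rng = np.random.default_rng(0)

n = 20_000
ret = np.zeros(n)
var = np.full(n, 2e-5)
for t in range(1, n):
    neg = 1.0 if ret[t - 1] < 0.0 else 0.0
    var[t] = 1e-6 + (0.03 + 0.14 * neg) * ret[t - 1] ** 2 + 0.85 * var[t - 1]
    ret[t] = np.sqrt(var[t]) * rng.standard_normal()

# Diagnostic: average squared return conditional on the sign of the
# previous return; asymmetry means losses are followed by more volatility.
after_neg = np.mean(ret[1:][ret[:-1] < 0] ** 2)
after_pos = np.mean(ret[1:][ret[:-1] > 0] ** 2)
print(after_neg, after_pos)
```

A leverage-only story would tie the size of this gap to the firm's debt ratio; the essay's point is that financial constraints widen it independently of leverage.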
Abstract:
Exchange rate economics has achieved substantial development in the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studies three puzzling issues with the aim of improving our understanding of exchange rate behavior. Chapter Two uses advanced econometric techniques to model and forecast exchange rate dynamics, while Chapters Three and Four study issues related to exchange rates using the theory of New Open Economy Macroeconomics.

Chapter Two empirically examines the short-run forecastability of nominal exchange rates, analyzing important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH model with a skewed Student-t error distribution is identified, and its forecasting performance is compared with that of a random walk model. The results support the contention that nominal exchange rates are unpredictable over the short run, in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements.

Chapter Three assesses the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It develops a tractable two-country model in which agents face a cash-in-advance constraint and set prices to the local market, and the exogenous money supply process exhibits time-varying volatility. The model yields approximate closed-form solutions for risk premia and real exchange rates. Numerical results provide quantitative evidence that volatile risk premia can arise endogenously in a new open economy macroeconomic model; the model thus has the potential to rationalize the Uncovered Interest Parity puzzle.

Chapter Four seeks to resolve the consumption-real exchange rate anomaly: the inability of most international macro models to generate the negative cross-correlations between real exchange rates and relative consumption across two countries observed in the data. While maintaining the assumption of complete asset markets, this chapter introduces endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results show that such a model can replicate the stylized fact that real exchange rates tend to move in the opposite direction from relative consumption.
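The forecast-evaluation logic of Chapter Two can be illustrated in a few lines. The toy comparison below pits the no-change (random walk) forecast against a rolling-mean alternative on a simulated daily log exchange rate; it is not the chapter's FIGARCH specification and uses simulated rather than actual rates, but it shows why the no-change forecast is so hard to beat when the series is close to a random walk.

```python
import numpy as np

# Sketch: out-of-sample RMSE of the no-change forecast vs. a 20-day
# rolling-mean forecast on a simulated random-walk log exchange rate.
rng = np.random.default_rng(1)

n, window = 5_000, 20
log_fx = np.cumsum(0.005 * rng.standard_normal(n))  # simulated daily log rate

rw_err, roll_err = [], []
for t in range(window, n - 1):
    rw_err.append(log_fx[t + 1] - log_fx[t])        # random-walk forecast
    roll_err.append(log_fx[t + 1] - log_fx[t - window + 1:t + 1].mean())

rmse_rw = np.sqrt(np.mean(np.square(rw_err)))
rmse_roll = np.sqrt(np.mean(np.square(roll_err)))
print(rmse_rw, rmse_roll)
```

On a true random walk the no-change forecast error is just the one-step innovation, while any mean-reverting forecast adds the gap between the current level and its local mean, so its RMSE is strictly larger.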
Abstract:
The challenging living conditions of many Senegalese families, and the absence of a providing spouse, have led women to seek new economic opportunities such as microcredit loans. These loans offer Senegalese women the possibility of financially supporting their households and becoming active economic participants by starting or sustaining micro-businesses. The study takes place in Grand-Yoff, an overpopulated peri-urban area of the Senegalese capital Dakar, where most people face daily survival issues. This research examines the impact of microcredit activities on the households of Senegalese female loan recipients in Grand-Yoff through socioeconomic indicators, in particular outcomes in health, education, and nutrition.

The sample consists of 166 female participants who engage in microcredit activities. The research combines qualitative and quantitative methods: data were gathered through interviews, surveys, participant observation, focus groups with the study participants and some of their household members, and document analysis.

While some women in the study make steady profits from their business activities, others struggle to make ends meet on meager or unreliable profits. Some impoverished participants have no choice but to invest their loans directly in their households' dire needs, thereby forgoing their business purpose. Many women in the study end up in a vicious cycle of debt, defaulting on their loans or making late payments, because they lack the household and socioeconomic conditions needed to take advantage of these loans. Microcredit therefore does not make a significant impact in the households of the poorest participants. The study finds that microcredit improves household well-being - especially nutrition, health, and education - for participants who have significant social capital, such as a providing spouse, formal education, training, business experience, and membership in business or social networks. In short, microcredit's household impact is intimately tied to female borrowers' household conditions and social capital. It is recommended that microcredit services and programs offer their female clients assistance and additional basic services, financial guidance, lower interest rates, and flexible repayment schedules.
Abstract:
This dissertation presents a longitudinal analysis of business start-ups using three waves of data from the Kauffman Firm Survey.

The first essay uses data from 2004-2008 to examine the simultaneous relationship among a firm's capital structure, its human resource policies, and their impact on the level of innovation. Firm leverage was calculated as debt divided by total financial resources, and an index of employee well-being was constructed from a set of nine dichotomous questions in the survey. A negative binomial fixed effects model was used to estimate the effect of employee well-being and leverage on the count of patents and copyrights, used as a proxy for innovation. The essay demonstrates that employee well-being positively affects a firm's innovation, while a higher leverage ratio has a negative impact on innovation; no significant relation was found between leverage and employee well-being.

The second essay uses data from 2004-2009 to ask whether a higher entrepreneurial speed of learning is desirable, and whether the speed of learning is linked to the growth rate of the firm. The change in the speed of learning was measured using a pooled OLS estimator in repeated cross-sections. There was evidence of a declining speed of learning over time, and it was concluded that a higher speed of learning is not necessarily a good thing, because the speed of learning is contingent on the entrepreneur's initial knowledge and the precision of the signals he receives from the market. There was also no reason to expect the speed of learning to be related to the growth of the firm in one direction rather than another.

The third essay uses data from 2004-2010 to determine the timing of diversification by business start-ups. It captures when a start-up diversifies for the first time and explores the association between an early diversification strategy and the firm's survival. A semi-parametric Cox proportional hazards model was used to examine the survival pattern. The results demonstrate that firms that diversify early in their lives show a higher survival rate; however, this effect fades over time.
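The survival comparison in the third essay can be pictured with its descriptive counterpart, the Kaplan-Meier estimator (the essay itself fits a Cox proportional hazards model; this numpy sketch, on synthetic cohorts with made-up hazard rates, only illustrates the kind of survival gap being tested).

```python
import numpy as np

# Kaplan-Meier survival curves for two synthetic start-up cohorts:
# "early diversifiers" with a hypothetical 0.10/yr failure rate and a
# comparison cohort at 0.20/yr, censored at the 7-year survey window.
rng = np.random.default_rng(2)

def kaplan_meier(times, events):
    """Return (event times, survival curve) from right-censored data."""
    surv, curve_t, curve_s = 1.0, [], []
    for t in np.unique(times):
        d = np.sum((times == t) & (events == 1))  # failures at t
        n = np.sum(times >= t)                    # still at risk at t
        if d > 0:
            surv *= 1.0 - d / n
            curve_t.append(t)
            curve_s.append(surv)
    return np.array(curve_t), np.array(curve_s)

n = 500
early = rng.exponential(1 / 0.10, n)   # failure times, early diversifiers
late = rng.exponential(1 / 0.20, n)    # failure times, comparison cohort
censor = 7.0                           # administrative censoring (2004-2010)

t_e, s_e = kaplan_meier(np.minimum(early, censor), (early <= censor).astype(int))
t_l, s_l = kaplan_meier(np.minimum(late, censor), (late <= censor).astype(int))
print(s_e[-1], s_l[-1])
```

The Cox model adds covariates and a formal test on the hazard ratio; the essay's finding corresponds to the early cohort's curve sitting above the other's, with the gap narrowing at longer horizons.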
Abstract:
Prior finance literature lacks a comprehensive analysis of the microstructure characteristics of U.S. futures markets, owing to limited data availability. Utilizing a unique data set covering five different futures contracts, this dissertation fills this gap. In three essays, price discovery, resiliency, and the components of bid-ask spreads in electronic futures markets are examined. To provide comprehensive and robust analysis, both the moderately volatile pre-crisis period and the volatile crisis period are included.

The first essay, entitled "Price Discovery and Liquidity Characteristics for U.S. Electronic Futures and ETF Markets," explores the price discovery process in U.S. futures and ETF markets. Hasbrouck's information share method is applied to futures and ETF instruments. The information share results show that futures markets dominate the price discovery process. The results on the factors that affect price discovery show that when volatility increases, the price leadership of futures markets declines; furthermore, when the relative size of the bid-ask spread in one market increases, its information share decreases.

The second essay, entitled "The Resiliency of Large Trades for U.S. Electronic Futures Markets," examines the effects of large trades in futures markets. How quickly prices and liquidity recover after large trades is an important characteristic of financial markets. The price effects of large trades are greater during the crisis period than during the pre-crisis period, and it also takes more trades during the crisis period for liquidity to return to pre-block-trade levels.

The third essay, entitled "Components of Quoted Bid-Ask Spreads in U.S. Electronic Futures Markets," investigates the components of bid-ask spreads in futures markets, one of the most important subjects in microstructure studies. Utilizing Huang and Stoll's (1997) method, it provides the first analysis of the components of quoted bid-ask spreads in U.S. electronic futures markets. The results show that order processing cost is the largest component of bid-ask spreads, followed by inventory holding costs. During the crisis period, market makers increase bid-ask spreads in response to rising inventory holding and adverse selection risks.
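The core of the Huang-Stoll approach is a trade-indicator regression. The sketch below simulates the "basic" two-component version of their model, in which a share lam of the half-spread reflects adverse-selection and inventory costs combined and the remainder is order-processing cost, and recovers the parameters by OLS (the dissertation's third essay uses the fuller decomposition that separates all three components; the parameter values here are purely illustrative).

```python
import numpy as np

# Basic Huang-Stoll (1997) spread decomposition:
#   dP_t = (S/2)*Q_t - (1 - lam)*(S/2)*Q_{t-1} + eps_t,
# where Q_t is the +1/-1 trade direction indicator, S/2 the half-spread,
# and lam the adverse-selection + inventory share of the half-spread.
rng = np.random.default_rng(3)

n = 100_000
S_half = 0.01                      # true half-spread (illustrative)
lam = 0.30                         # true adverse-selection + inventory share
q = rng.choice([-1.0, 1.0], size=n)
eps = 0.001 * rng.standard_normal(n)

dp = S_half * q[1:] - (1 - lam) * S_half * q[:-1] + eps[1:]

# OLS of price changes on [Q_t, Q_{t-1}] recovers the structural parameters.
X = np.column_stack([q[1:], q[:-1]])
b, *_ = np.linalg.lstsq(X, dp, rcond=None)
S_half_hat = b[0]
lam_hat = 1 + b[1] / S_half_hat
print(S_half_hat, lam_hat)
```

The order-processing share is then 1 - lam_hat, which is why the essay can report it as the largest component once lam_hat is estimated to be well below one.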
Abstract:
This dissertation examines the drivers and implications of international capital flows. The overarching motivation is the observation that countries not at the centre of global financial markets are subject to considerable spillovers from centre countries, notably from their monetary policy. I present new empirical evidence on the determinants of the observed patterns of international capital flows and monetary policy spillovers, and study their effects on both financial markets and the real economy. In Chapter 2 I provide evidence on the determinants of the puzzling negative correlation observed between productivity growth and net capital inflows to developing and emerging market economies (EMEs) since 1980. By disaggregating net capital inflows into their gross components, I show that this negative correlation is explained by capital outflows related to purchases of very liquid assets from the fastest growing countries. My results suggest that a desire for international portfolio diversification in liquid assets by fast-growing countries drives much of the original puzzle. In the remainder of my dissertation I pivot to study the foreign characteristics that drive international capital flows and monetary policy spillovers, with a particular focus on the role of unconventional monetary policy in the United States (U.S.). In Chapter 3 I show that a significant portion of the heterogeneity in EMEs' asset price adjustment following the quantitative easing operations of the Federal Reserve (the Fed) during 2008-2014 can be explained by the degree of bilateral capital market frictions between these countries and the U.S. This holds even after accounting for capital controls, exchange rate regimes, and domestic monetary policies. Chapter 4, co-authored with Michal Ksawery Popiel, studies unconventional monetary policy in a small open economy, looking specifically at the case of Canada since the global financial crisis.
We quantify the effect Canadian unconventional monetary policy shocks had on the real economy, while carefully controlling for and quantifying spillovers from U.S. unconventional monetary policy. Our results indicate that the Bank of Canada's unconventional monetary policy increased Canadian output significantly from 2009-2010, but that spillovers from the Fed's policy were even more important for increasing Canadian output after 2008.
Abstract:
In the highly competitive world of modern finance, new derivatives are continually required to take advantage of changes in financial markets and to hedge businesses against new risks. The research described in this paper aims to accelerate the development and pricing of new derivatives in two ways. First, new derivatives can be specified mathematically within a general framework, enabling new mathematical formulae, rather than just new parameter settings, to be specified. This Generic Pricing Engine (GPE) is expressive enough to specify a wide range of standard pricing engines. Second, the associated price simulation using the Monte Carlo method is accelerated on GPU or multicore hardware. The parallel implementation (in OpenCL) is derived automatically from the mathematical description of the derivative. As a test, for a Basket Option Pricing Engine (BOPE) generated using the GPE, on the largest problem size an NVIDIA GPU runs the generated pricing engine at 45 times the speed of a sequential, hand-coded implementation of the same BOPE. A user can thus rapidly devise, simulate, and experiment with new derivatives without any actual programming.
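The pricing math that the GPE-generated OpenCL kernels parallelize can be sketched on the CPU in a few lines. The numpy version below prices a European basket call by Monte Carlo under correlated geometric Brownian motion; all market parameters are made up for illustration, and this is only the sequential analogue of the paper's generated engine, not its implementation.

```python
import numpy as np

# CPU sketch of Monte Carlo basket-call pricing: simulate correlated
# terminal prices under GBM, average the discounted payoff.
rng = np.random.default_rng(4)

n_paths, n_assets = 200_000, 4
s0 = np.full(n_assets, 100.0)      # spot prices (illustrative)
w = np.full(n_assets, 0.25)        # equal basket weights
r, sigma, T, K = 0.02, 0.20, 1.0, 100.0

# Correlated terminal prices via a Cholesky factor of the correlation matrix.
corr = np.full((n_assets, n_assets), 0.5) + 0.5 * np.eye(n_assets)
L = np.linalg.cholesky(corr)
z = rng.standard_normal((n_paths, n_assets)) @ L.T
sT = s0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)

# Discounted expected payoff of the European basket call.
payoff = np.maximum(sT @ w - K, 0.0)
price = np.exp(-r * T) * payoff.mean()
print(price)
```

Each path is independent of the others, which is exactly why the simulation maps so well onto GPU work-items in the paper's OpenCL version.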
Abstract:
This thesis develops bootstrap methods for the factor models that have been widely used to generate forecasts since Stock and Watson's (2002) pioneering article on diffusion indices. These models accommodate a large number of macroeconomic and financial variables as predictors, a useful feature for incorporating the diverse information available to economic agents. My thesis therefore proposes econometric tools that improve inference in factor models using latent factors extracted from a large panel of observed predictors. It is divided into three complementary chapters, the first two written in collaboration with Sílvia Gonçalves and Benoit Perron.

In the first article, we study how bootstrap methods can be used for inference in models that forecast h periods into the future. To this end, it examines bootstrap inference in a factor-augmented regression setting where the errors may be autocorrelated. It generalizes the results of Gonçalves and Perron (2014) and proposes and justifies two residual-based approaches: the block wild bootstrap and the dependent wild bootstrap. Our simulations show improved coverage rates for the confidence intervals of the estimated coefficients using these approaches, compared with asymptotic theory and the wild bootstrap, in the presence of serial correlation in the regression errors.

The second chapter proposes bootstrap methods for constructing prediction intervals that relax the assumption of normally distributed innovations. We propose bootstrap prediction intervals for an observation h periods into the future and for its conditional mean. We assume these forecasts are made using a set of factors extracted from a large panel of variables. Because we treat these factors as latent, our forecasts depend on both the estimated factors and the estimated regression coefficients. Under regularity conditions, Bai and Ng (2006) proposed constructing asymptotic intervals under the assumption of Gaussian innovations. The bootstrap allows us to relax this assumption and to construct valid prediction intervals under more general assumptions. Moreover, even under Gaussianity, the bootstrap yields more accurate intervals when the cross-sectional dimension is relatively small, because it accounts for the bias of the ordinary least squares estimator, as shown in a recent study by Gonçalves and Perron (2014).

In the third chapter, we suggest consistent selection procedures for factor-augmented regressions in finite samples. We first show that the usual cross-validation method is inconsistent, but that its generalization, leave-d-out cross-validation, selects the smallest set of estimated factors spanning the space generated by the true factors. The second criterion, whose validity we also establish, generalizes Shao's (1996) bootstrap approximation to factor-augmented regressions. Simulations show an improvement in the probability of parsimoniously selecting the estimated factors compared with available selection methods. The empirical application revisits the relationship between macroeconomic and financial factors and excess returns on the U.S. stock market. Among the factors estimated from a large panel of U.S. macroeconomic and financial data, the factors strongly correlated with interest rate spreads and the Fama-French factors have good predictive power for excess returns.
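The resampling idea at the heart of the thesis's proposals can be sketched in its simplest form. The code below runs a plain residual-based wild bootstrap with Rademacher weights for an OLS slope on synthetic data; the thesis's block and dependent wild bootstrap variants extend this scheme to serially correlated errors and estimated factors, so this is only the common building block, not the proposed procedures themselves.

```python
import numpy as np

# Plain wild bootstrap for an OLS slope: refit on pseudo-samples built by
# randomly flipping the signs of the residuals (Rademacher weights), which
# preserves any heteroskedasticity in the errors.
rng = np.random.default_rng(5)

n, B = 200, 2000
x = rng.standard_normal(n)
y = 1.0 + 0.5 * x + rng.standard_normal(n)  # true slope 0.5 (synthetic)

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

boot = np.empty(B)
for b in range(B):
    y_star = X @ beta + resid * rng.choice([-1.0, 1.0], size=n)
    boot[b] = np.linalg.lstsq(X, y_star, rcond=None)[0][1]

# Basic bootstrap confidence interval for the slope.
lo, hi = np.percentile(boot - beta[1], [2.5, 97.5])
ci = (beta[1] - hi, beta[1] - lo)
print(ci)
```

Under serial correlation, sign-flipping each residual independently destroys the dependence structure, which is precisely the failure the block and dependent wild bootstrap of the first chapter are designed to fix.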
Abstract:
What constraints do African firms face? This study aims to identify the main challenges of the African business climate using data from the World Bank Enterprise Survey. In that survey, firm managers were asked two key questions: (i) the severity of the constraints listed for their activity, and (ii) their investment decision during the year preceding the survey. Our results show that, for the firms in our sample, the investment decision is delayed by a set of obstacles including corruption, difficult access to financial markets, unfair competition from the informal sector, and political instability. We then conduct a sensitivity and robustness analysis by adding control variables for firm size, sector of activity, and location. Our results show that medium-sized firms are the most vulnerable, that firms are affected differently depending on whether they are in manufacturing or services, and that companies located in Central Africa are only slightly affected by these constraints.
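The type of regression behind such results can be sketched on synthetic data. The code below simulates a binary investment decision driven by two obstacle-severity scores and fits a logistic regression by gradient ascent; the variable names, severity scales, and coefficients are all invented for illustration and have nothing to do with the actual Enterprise Survey estimates.

```python
import numpy as np

# Illustrative binary-outcome regression: probability that a firm invested
# as a function of reported obstacle severity (all data synthetic).
rng = np.random.default_rng(6)

n = 5_000
corruption = rng.integers(0, 5, n).astype(float)   # 0-4 severity scale
fin_access = rng.integers(0, 5, n).astype(float)
X = np.column_stack([np.ones(n), corruption, fin_access])

true_b = np.array([1.0, -0.4, -0.3])               # obstacles deter investment
p = 1 / (1 + np.exp(-X @ true_b))
invest = (rng.random(n) < p).astype(float)

# Logistic regression fitted by plain gradient ascent on the log-likelihood.
b = np.zeros(3)
for _ in range(10_000):
    grad = X.T @ (invest - 1 / (1 + np.exp(-X @ b))) / n
    b += 0.3 * grad
print(b)
```

Negative estimated coefficients on the obstacle scores correspond to the study's finding that these constraints delay the investment decision.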
Abstract:
This study analyzes the impact of Colombia's School Feeding Program (Programa de Alimentación Escolar) on child labour using several impact-evaluation techniques, including simple matching, genetic matching, and bias-corrected matching. In particular, the program is found to reduce the probability that schoolchildren work by around 4%. The analysis further suggests that child labour falls because the program increases food security, which in turn changes household decisions and removes the work burden on children. The State has made numerous advances in early childhood, but these results provide a basis for building a conceptual framework in which public food policies should be upheld and promoted throughout the school-age years.
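The simplest member of the study's family of estimators can be sketched directly. The code below runs one-to-one nearest-neighbour matching on a propensity score over synthetic data (all quantities, including the 4-percentage-point "true" effect, are made up to mirror the setting); the study's genetic and bias-corrected matching refine this basic idea.

```python
import numpy as np

# One-to-one nearest-neighbour propensity-score matching on synthetic data:
# income confounds both programme take-up and child labour.
rng = np.random.default_rng(7)

n = 4_000
income = rng.standard_normal(n)                      # confounder
p_treat = 1 / (1 + np.exp(-(-0.5 + 0.8 * income)))   # programme take-up prob.
treated = rng.random(n) < p_treat

# Child works with a baseline probability depending on income; the programme
# lowers it by 4 percentage points (the effect to recover).
p_work = np.clip(0.25 - 0.05 * income - 0.04 * treated, 0.01, 0.99)
works = (rng.random(n) < p_work).astype(float)

# Match each treated unit to the nearest control on the score, then average
# the treated-minus-matched-control outcome differences (the ATT).
score = p_treat
t_idx = np.where(treated)[0]
c_idx = np.where(~treated)[0]
matches = c_idx[np.abs(score[c_idx][None, :] - score[t_idx][:, None]).argmin(axis=1)]
att = np.mean(works[t_idx] - works[matches])
print(att)
```

A naive treated-vs-control mean difference would overstate the effect here, because higher-income children both join the programme more and work less; matching on the score removes that confounding.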
Abstract:
The Efficient Market Hypothesis (EMH), one of the most important hypotheses in financial economics, argues that return rates have no memory (correlation), which implies that agents cannot make abnormal profits in financial markets through arbitrage operations. Using return rates for the US stock market, we corroborate that under a linear approach, return rates show no evidence of correlation. However, linear approaches may be incomplete, since return rates can exhibit nonlinearities. Using detrended cross-correlation analysis and its correlation coefficient, a methodology that analyzes long-range behavior between series, we show that the long-range correlation of return rates dies out only at the 149th lag, which corresponds to about seven months. Does this result undermine the EMH?
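The detrended cross-correlation coefficient used here can be implemented compactly. The sketch below is a simplified version with sliding boxes, checked on synthetic series (a correlated pair sharing a common component and an independent pair); it illustrates the statistic, not the paper's exact computation on US returns.

```python
import numpy as np

# Simplified detrended cross-correlation coefficient rho_DCCA(s): the DCCA
# covariance of two integrated, box-wise detrended series, normalized by
# their DFA fluctuation functions.
rng = np.random.default_rng(8)

def dcca_rho(x, y, s):
    """Detrended cross-correlation coefficient at box size s."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    t = np.arange(s)
    n_boxes = len(x) - s + 1
    cov_xy = np.empty(n_boxes)
    var_x = np.empty(n_boxes)
    var_y = np.empty(n_boxes)
    for i in range(n_boxes):
        # Detrend each box by its own least-squares line.
        rx = X[i:i + s] - np.polyval(np.polyfit(t, X[i:i + s], 1), t)
        ry = Y[i:i + s] - np.polyval(np.polyfit(t, Y[i:i + s], 1), t)
        cov_xy[i] = np.mean(rx * ry)
        var_x[i] = np.mean(rx * rx)
        var_y[i] = np.mean(ry * ry)
    return cov_xy.mean() / np.sqrt(var_x.mean() * var_y.mean())

# Two series sharing a common component should give rho near 1;
# independent noise should give rho near 0.
z = rng.standard_normal(2000)
a = z + 0.1 * rng.standard_normal(2000)
b = z + 0.1 * rng.standard_normal(2000)
c = rng.standard_normal(2000)
rho_same = dcca_rho(a, b, 40)
rho_indep = dcca_rho(a, c, 40)
print(rho_same, rho_indep)
```

Applied to returns and their lagged values across box sizes, the coefficient decaying to insignificance only around lag 149 is the paper's headline result.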
Abstract:
The first chapter provides evidence that aggregate Research and Development (R&D) investment drives a persistent component in productivity growth and that this embodies a risk priced in financial markets. In a semi-endogenous growth model, this component is identified by R&D in excess of equilibrium levels and can be approximated by the error correction term in the cointegration between R&D and Total Factor Productivity. Empirically, the component turns out to be well defined and satisfies all key theoretical predictions: it exhibits the appropriate persistence, it forecasts productivity growth, and it is associated with a cross-sectional risk premium.

The CAPM is the most foundational model in financial economics, but it is known to empirically underestimate the expected returns of low-risk assets and overestimate those of high-risk assets. The second chapter studies how the omission of risks and funding tightness jointly contribute to explaining this anomaly, with the former affecting the definition of assets' riskiness and the latter affecting how risk is remunerated. Theoretically, the two effects are shown to counteract each other. Empirically, the spread related to binding leverage constraints is found to be significant at 2% yearly. Nonetheless, the average returns of portfolios that exploit this anomaly are found mostly to reflect omitted risks, in contrast to their use in previous literature.

The third chapter studies how the 'sustainability' of assets affects discount rates, an effect intrinsically mediated by the risk profile of the assets themselves. This has implications for assessing the sustainability-related spread and for hedging changes in sustainability concern. The mechanism is tested on the ESG-score dimension for US data, with inconclusive evidence regarding the existence of an ESG-related premium in the first place. Moreover, the risk profile of the long-short ESG portfolio is unlikely, for the time being, to affect the sign of its average returns relative to the sustainability spread.
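The error correction term in the first chapter can be extracted with a standard two-step Engle-Granger procedure, sketched below on simulated cointegrated series standing in for log R&D and log TFP (the slope and persistence values are made up; this illustrates the construction, not the chapter's estimates).

```python
import numpy as np

# Two-step Engle-Granger sketch: (1) cointegrating regression of "TFP" on
# "R&D"; the residual approximates the Error Correction Term (ECT);
# (2) check that the ECT is persistent but mean-reverting.
rng = np.random.default_rng(9)

n = 2_000
rd = np.cumsum(rng.standard_normal(n))   # I(1) "R&D" series
ect_true = np.zeros(n)
for t in range(1, n):                    # stationary AR(1) gap, phi = 0.9
    ect_true[t] = 0.9 * ect_true[t - 1] + rng.standard_normal()
tfp = 0.5 * rd + ect_true                # cointegrated, slope 0.5

# Step 1: cointegrating regression; the residual is the estimated ECT.
X = np.column_stack([np.ones(n), rd])
beta = np.linalg.lstsq(X, tfp, rcond=None)[0]
ect = tfp - X @ beta

# Step 2: AR(1) coefficient of the ECT -- persistent, yet below unity.
phi = np.sum(ect[:-1] * ect[1:]) / np.sum(ect[:-1] ** 2)
print(beta[1], phi)
```

Superconsistency of the cointegrating regression is what makes the slope estimate so precise despite the trending regressor; the persistent-but-stationary ECT is the component the chapter links to productivity growth and priced risk.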