26 results for Real interest rates
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
The flexible-price two-country monetary model is extended to include a consumption externality with habit persistence. Two methodologies are employed to explore the model's ability to generate volatile and persistent exchange rates. In the first, actual data are used for the exogenous driving processes; in the second, the model is simulated using estimated forcing processes. In both cases, the theory is capable of explaining the high volatility and persistence of real and nominal exchange rates, as well as the high correlation between real and nominal rates.
Abstract:
We propose a new approach for modeling nonlinear multivariate interest rate processes based on time-varying copulas and reducible stochastic differential equations (SDEs). In modeling the marginal processes, we consider a class of nonlinear SDEs that are reducible to the Ornstein-Uhlenbeck (OU) process or the Cox, Ingersoll, and Ross (1985) (CIR) process, with the reducibility achieved via a nonlinear transformation function. The main advantage of this approach is that these SDEs can account for the nonlinear features observed in short-term interest rate series while at the same time admitting exact discretization and closed-form likelihood functions. Although a rich set of specifications may be entertained, our exposition focuses on two nonlinear constant elasticity volatility (CEV) processes, denoted OU-CEV and CIR-CEV, respectively. These two processes encompass a number of existing models that have closed-form likelihood functions. The transition density, the conditional distribution function, and the steady-state density function are derived in closed form, as are the conditional and unconditional moments of both processes. To obtain a more flexible functional form over time, we allow the transformation function to be time varying. Results from our study of U.S. and UK short-term interest rates suggest that the new models outperform existing parametric models with closed-form likelihood functions, and that the time-varying effects in the transformation functions are statistically significant. To examine the joint behavior of the two interest rate series, we propose flexible nonlinear multivariate models built by joining the univariate nonlinear processes via appropriate copulas, and we study the conditional dependence structure of the two rates using Patton's (2006a) time-varying symmetrized Joe-Clayton copula. We find evidence of asymmetric dependence between the two rates, and that the level of dependence is positively related to the level of the rates.
(JEL: C13, C32, G12)
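The exact-discretization property claimed above for the basic processes can be sketched for the OU case. This is a minimal illustration, not the authors' code; the function names and parameter values are invented for the example. The OU transition between any two observation times is Gaussian with known mean and variance, so simulation carries no discretization error and the likelihood of an observed path is available in closed form:

```python
import math
import random

def ou_exact_step(x, kappa, theta, sigma, dt, rng):
    """Advance an Ornstein-Uhlenbeck process one step using its exact
    (Gaussian) transition density -- no Euler discretization error."""
    mean = theta + (x - theta) * math.exp(-kappa * dt)
    var = sigma ** 2 * (1.0 - math.exp(-2.0 * kappa * dt)) / (2.0 * kappa)
    return mean + math.sqrt(var) * rng.gauss(0.0, 1.0)

def ou_log_likelihood(path, kappa, theta, sigma, dt):
    """Exact log-likelihood of an observed OU path, summing Gaussian
    transition log-densities between consecutive observations."""
    ll = 0.0
    for x_prev, x_next in zip(path, path[1:]):
        mean = theta + (x_prev - theta) * math.exp(-kappa * dt)
        var = sigma ** 2 * (1.0 - math.exp(-2.0 * kappa * dt)) / (2.0 * kappa)
        ll += -0.5 * (math.log(2.0 * math.pi * var) + (x_next - mean) ** 2 / var)
    return ll
```

Because the same mean and variance appear in both functions, the simulated path and the likelihood are exactly consistent, which is the practical payoff of reducibility that the abstract emphasises.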
Reducible Diffusions with Time-Varying Transformations with Application to Short-Term Interest Rates
Abstract:
Reducible diffusions (RDs) are nonlinear transformations of analytically solvable Basic Diffusions (BDs); by construction, RDs are therefore analytically tractable yet flexible diffusion processes. The existing literature on RDs has mostly focused on time-homogeneous transformations, which to a significant extent fail to exploit the full potential of RDs from both theoretical and practical points of view. In this paper, we propose flexible and economically justifiable time variations in the transformations of RDs. Concentrating on Constant Elasticity Variance (CEV) RDs, we consider nonlinear dynamics for our time-varying transformations with both deterministic and stochastic designs. Such time variations can greatly enhance the flexibility of RDs while maintaining sufficient tractability in the resulting models, and our modeling approach retains the benefits of classical inferential techniques such as Maximum Likelihood (ML). Our application to UK and US short-term interest rates suggests that, empirically, time-varying transformations are highly relevant and statistically significant. We expect the proposed models to describe more faithfully the time-varying dynamics of economic and financial variables and potentially to improve out-of-sample forecasts significantly.
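The change-of-variables logic behind reducible diffusions can be sketched as follows. This is an illustration only, not the paper's CEV specification: as a stand-in it uses a hypothetical time-varying exponential transformation X_t = exp(a + b*t + Y_t) of a basic OU process Y_t, whose exact transition density transfers to X through the Jacobian of the inverse transformation, so ML estimation remains available in closed form:

```python
import math

def ou_transition_logpdf(y_next, y_prev, kappa, theta, sigma, dt):
    """Exact Gaussian transition log-density of the basic OU diffusion."""
    mean = theta + (y_prev - theta) * math.exp(-kappa * dt)
    var = sigma ** 2 * (1.0 - math.exp(-2.0 * kappa * dt)) / (2.0 * kappa)
    return -0.5 * (math.log(2.0 * math.pi * var) + (y_next - mean) ** 2 / var)

def rd_transition_logpdf(x_next, x_prev, t_next, t_prev, a, b,
                         kappa, theta, sigma):
    """Transition log-density of the reducible diffusion X_t = exp(a + b*t + Y_t):
    invert the time-varying transformation back to the basic OU process and
    add the log-Jacobian of the change of variables at the new observation."""
    y_prev = math.log(x_prev) - a - b * t_prev
    y_next = math.log(x_next) - a - b * t_next
    log_jac = -math.log(x_next)  # |dy/dx| = 1/x for the log inverse
    return ou_transition_logpdf(y_next, y_prev, kappa, theta, sigma,
                                t_next - t_prev) + log_jac
```

Summing `rd_transition_logpdf` over consecutive observations gives an exact likelihood for the transformed process, mirroring how time variation in the transformation enters the inference without sacrificing tractability.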
Abstract:
This is a definitive new account of Britain's economic evolution from a backwater of Europe in 1270 to the hub of the global economy in 1870. For the first time, Britain's national accounts are reconstructed right back into the thirteenth century to show what really happened quantitatively during the centuries leading up to the Industrial Revolution. Contrary to traditional views of the earlier period as one of Malthusian stagnation, they reveal how the transition to modern economic growth built on the earlier foundations of a persistent upward trend in GDP per capita, which doubled between 1270 and 1700. Featuring comprehensive estimates of population, land use, agricultural production, industrial and service-sector production and GDP per capita, as well as analysis of their implications, this is an essential reference work for those interested in British economic history and the origins of modern economic growth more generally.
Abstract:
We examine the dynamic optimization problem for not-for-profit financial institutions (NFPs) that maximize consumer surplus, not profits. We characterize the optimal dynamic policy and find that it involves credit rationing. Interest rates set by mature NFPs will typically be more favorable to customers than market rates, as any surplus is distributed in the form of interest rate subsidies, with credit rationing being required to prevent these subsidies from distorting loan volumes from their optimal levels. Rationing overcomes a fundamental problem in NFPs; it allows them to distribute the surplus without distorting the volume of activity from the efficient level.
Abstract:
We investigate the relationship between information disclosure and depositor behaviour in the Chinese banking sector. Specifically, we ask whether enhanced information disclosure enables investors to infer a banking institution's risk profile more effectively, thereby influencing their deposit decisions. Using an unbalanced panel of financial data from 169 Chinese banks over the 1998–2009 period, we employ generalised-method-of-moments (GMM) estimation to control for potential endogeneity, unobserved heterogeneity, and persistence in the dependent variable. We uncover evidence that: (i) the growth rate of deposits is sensitive to bank fundamentals after controlling for macroeconomic factors, diversity in ownership structure, and government intervention; (ii) a bank that publicly discloses more transparent information in its financial reports is more likely to experience growth in its deposit base; and (iii) banks that are highly transparent, well capitalised, and have adopted international accounting standards are better able to attract funds by offering higher interest rates.
Abstract:
Since the financial crash of 2008, monetary policy has been in a state of stasis – a condition in which things are not changing, moving, or progressing, but rather appear frozen. Interest rates have been held at low levels for a considerable period of time, and inflation targets have consistently been missed, through phases of both overshooting and undershooting. At the same time, a variety of unconventional monetary policies involving asset purchases and liquidity provision have been pursued. Questions have been raised from a variety of sources, including international organizations with distinct positions such as the BIS and the IMF, about the continuing validity and sustainability of existing monetary policy frameworks, not least because inflation targeting has ceased to act as a reliable guide for policy for over six years. Despite this, central banks have been reluctant to debate moving to a new formal policy framework. This article argues that, as an apex policy forum, only the G20 leaders' summit has the necessary political authority to call central banks to account and initiate a wide-ranging debate on the future of monetary policy. A case is made for convening a monetary policy working group to discuss a range of positions, including those of the BIS and the IMF, and to make recommendations, because the G20 has been most effective in displaying international financial leadership when leaders have convened and made use of specialist working groups.
Abstract:
This paper addresses three questions: (1) How severe were the episodes of banking instability experienced by the UK over the past two centuries? (2) What have been the macroeconomic indicators of UK banking instability? (3) What have been the consequences of UK banking instability for the cost of credit? Using a unique dataset of bank share prices from 1830 to 2010 to assess the stability of the UK banking system, we find that banking instability has grown more severe since the 1970s. We also find that interest rates, inflation, lending growth, and equity prices are consistent macroeconomic indicators of UK banking instability over the long run. Furthermore, utilising a unique dataset of corporate-bond yields for the period 1860 to 2010, we find a significant long-run relationship between banking instability and the credit-risk premium faced by businesses.
Abstract:
Pessimistic Malthusian verdicts on the capacity of pre-industrial European economies to sustain a degree of real economic growth under conditions of population growth are challenged using current reconstructions of urbanisation ratios, the real wage rates of building and agricultural labourers, and GDP per capita estimated by a range of methods. Economic growth is shown to have outpaced population growth and to have raised GDP per capita to in excess of $1,500 (1990 international dollars at PPP) in Italy during its twelfth- and thirteenth-century commercial revolution, in Holland during its fifteenth- and sixteenth-century golden age, and in England during the seventeenth- and eighteenth-century run-up to its industrial revolution. During each of these Smithian growth episodes, expanding trade and commerce sustained significant output and employment growth in the manufacturing and service sectors. These positive developments were not necessarily reflected in trends in real wage rates, for the latter were powerfully influenced by associated changes in relative factor prices and in the per capita supply of labour, as workers varied the length of the working year in order to consume either more leisure or more goods. The scale of the divergence between trends in real wage rates and GDP per capita nevertheless varied a great deal between countries, for reasons which have yet to be adequately explained.
Abstract:
Despite 10 years of research on behavior in hypothetical referenda, conflict remains in the literature on whether or not the mechanism generates biased responses compared to real referenda, and the nature and source of any such bias. Almost all previous inquiry in respect of this issue has concentrated on bias at the aggregate level. This paper reports a series of three experiments which focuses on bias at the individual level and how this can translate to bias at the aggregate level. The authors argue that only an individual approach to hypothetical bias is consistent with the concept of incentive compatibility. The results of these experiments reflect these previous conflicting findings but go on to show that individual hypothetical bias is a robust result driven by the differing influence of pure self-interest and other-regarding preferences in real and hypothetical situations, rather than by a single behavioral theory such as free riding. In a hypothetical situation these preferences cause yea-saying and non-demand revealing voting. This suggests that investigation of individual respondents in other hypothetical one-shot binary choices may also provide us with insights into aggregate behavior in these situations.
Abstract:
STUDY QUESTION: Is there an association between high levels of sperm DNA damage and miscarriage?
SUMMARY ANSWER: Miscarriage rates are positively correlated with sperm DNA damage levels.
WHAT IS KNOWN ALREADY: Most ejaculates contain a subpopulation of sperm with DNA damage, also referred to as DNA fragmentation, in the form of double- or single-strand breaks which have been induced in the DNA prior to or following ejaculation. This DNA damage may be particularly elevated in some subfertile men, hence several studies have examined the link between sperm DNA damage levels and conception and miscarriage rates.
STUDY DESIGN, SIZE, DURATION: A systematic review and meta-analysis of studies which examined the effect of sperm DNA damage on miscarriage rates was performed. Searches were conducted on MEDLINE, EMBASE and the Cochrane Library without any language restrictions from database inception to January 2012.
PARTICIPANTS/MATERIALS, SETTING, METHODS: We used the terms 'DNA damage' or 'DNA fragmentation' combined with 'miscarriage', 'abortion' or 'pregnancy' to generate a set of relevant citations. Data extraction was performed by two reviewers. Study quality was assessed using the Newcastle-Ottawa Scale. Meta-analysis of relative risks of miscarriage was performed with a random effects model. Subgroup analyses were performed by the type of DNA damage test, by whether the sperm examined were prepared or from raw semen, and for pregnancies resulting from IVF or ICSI treatment.
MAIN RESULTS AND THE ROLE OF CHANCE: We identified 16 cohort studies (2969 couples), 14 of which were prospective. Eight studies used acridine orange-based assays, six the TUNEL assay and two the COMET assay. Meta-analysis showed a significant increase in miscarriage in patients with high DNA damage compared with those with low DNA damage [risk ratio (RR) = 2.16 (1.54, 3.03), P < 0.00001]. A subgroup analysis showed that the miscarriage association is strongest for the TUNEL assay [RR = 3.94 (2.45, 6.32), P < 0.00001].
LIMITATIONS, REASONS FOR CAUTION: There is some variation in study characteristics, including the use of different assays, different thresholds for DNA damage and the definition of pregnancy loss.
WIDER IMPLICATIONS OF THE FINDINGS: The use of methods which select sperm without DNA damage for use in assisted conception treatment may reduce the risk of miscarriage. This finding indicates that assays detecting DNA damage could be considered in those suffering from recurrent pregnancy loss. Further research is necessary to study the mechanisms of DNA damage and the potential therapeutic effects of antioxidant therapy.
STUDY FUNDING/COMPETING INTEREST(S): None.
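The pooling step behind the reported risk ratios can be sketched with a DerSimonian-Laird random-effects estimator. The abstract does not name the exact estimator used, so this choice is an assumption, and the study inputs below are hypothetical log risk ratios with within-study variances:

```python
def dersimonian_laird(log_rr, var):
    """Pool study-level log risk ratios under a DerSimonian-Laird
    random-effects model; returns (pooled log RR, tau^2).
    Inputs are hypothetical log risk ratios and within-study variances."""
    w = [1.0 / v for v in var]                    # fixed-effect weights
    sw = sum(w)
    fe = sum(wi * y for wi, y in zip(w, log_rr)) / sw
    q = sum(wi * (y - fe) ** 2 for wi, y in zip(w, log_rr))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)  # between-study variance
    w_re = [1.0 / (v + tau2) for v in var]        # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
    return pooled, tau2
```

Exponentiating the pooled log RR yields a pooled risk ratio on the scale reported in the abstract; tau-squared quantifies the between-study heterogeneity that motivates the random-effects choice.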
Abstract:
The potential that laser-based particle accelerators offer to solve the sizing and cost issues arising with conventional proton therapy has generated great interest in the understanding and development of laser ion acceleration, and in investigating the radiobiological effects induced by laser-accelerated ions. Laser-driven ions are produced in bursts of ultra-short duration, resulting in ultra-high dose rates, and a study was carried out at Queen's University Belfast to explore this virtually unexplored regime of cell radiobiology. This employed the TARANIS terawatt laser producing protons in the MeV range for proton irradiation, with dose rates exceeding 10⁹ Gy/s in a single exposure. A clonogenic assay was implemented to analyse the biological effect of proton irradiation on V79 cells, which, when compared to data obtained with the same cell line irradiated with conventionally accelerated protons, showed no significant difference. A relative biological effectiveness of 1.4±0.2 at 10% survival fraction was estimated from a comparison with a 225 kVp X-ray source.
Abstract:
Mortality models used for forecasting are predominantly based on the statistical properties of time series and do not generally incorporate an understanding of the forces driving secular trends. This paper addresses three research questions: Can the factors found in stochastic mortality-forecasting models be associated with real-world trends in health-related variables? Does the inclusion of health-related factors in models improve forecasts? Do the resulting models give better forecasts than existing stochastic mortality models? We consider whether the space spanned by the latent factor structure in mortality data can be adequately described by developments in gross domestic product, health expenditure and lifestyle-related risk factors, using statistical techniques developed in macroeconomics and finance. These covariates are then shown to improve forecasts when incorporated into a Bayesian hierarchical model, with results comparable to or better than those of benchmark stochastic mortality models.
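For context on the benchmark stochastic mortality models mentioned above, a minimal Lee-Carter-style latent factor fit can be sketched. This is an illustration only, not the paper's Bayesian hierarchical model: the first-pass moment estimates below approximate the leading SVD factor of the centered log-mortality surface, and the period index is forecast as a random walk with drift:

```python
def lee_carter_fit(log_m):
    """Fit a minimal Lee-Carter model log m_{x,t} = a_x + b_x * k_t + e:
    a_x as age-specific means, k_t as column sums of the centered surface
    (a standard first-pass approximation to the leading SVD factor), and
    b_x by least squares of each centered age row on k_t."""
    ages, years = len(log_m), len(log_m[0])
    a = [sum(row) / years for row in log_m]
    centered = [[log_m[x][t] - a[x] for t in range(years)] for x in range(ages)]
    k = [sum(centered[x][t] for x in range(ages)) for t in range(years)]
    ss_k = sum(kt * kt for kt in k)
    b = [sum(centered[x][t] * k[t] for t in range(years)) / ss_k
         for x in range(ages)]
    return a, b, k

def forecast_k(k, horizon):
    """Forecast the period index k_t as a random walk with drift."""
    drift = (k[-1] - k[0]) / (len(k) - 1)
    return [k[-1] + drift * (h + 1) for h in range(horizon)]
```

The paper's idea is that covariates such as GDP and health expenditure can describe the space spanned by latent factors like `k`; in this sketch, `forecast_k` is the purely statistical baseline that such covariates would augment.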
Abstract:
Social signals, and the interpretation of the information they carry, are of high importance in Human-Computer Interaction. Often used for affect recognition, the cues within these signals are displayed across various modalities, and fusion of multi-modal signals is a natural and interesting way to improve the automatic classification of emotions transported in social signals. In most present studies of uni-modal affect recognition and multi-modal fusion, decisions are forced for fixed annotation segments across all modalities. In this paper, we investigate the less prevalent approach of event-driven fusion, which indirectly accumulates asynchronous events across all modalities for final predictions. We present a fusion approach that handles short-timed events in a vector space, which is of special interest for real-time applications, and we compare results of segmentation-based uni-modal classification and fusion schemes to the event-driven fusion approach. The evaluation is carried out via detection of enjoyment episodes within the audiovisual Belfast Story-Telling Corpus.