17 results for Real interest rates

in Aston University Research Archive


Relevance:

90.00%

Publisher:

Abstract:

This paper compares the experience of forecasting the UK government bond yield curve before and after the dramatic lowering of short-term interest rates from October 2008. Out-of-sample forecasts for 1, 6 and 12 months are generated from each of a dynamic Nelson-Siegel model, autoregressive models for both yields and the principal components extracted from those yields, a slope regression and a random walk model. At short forecasting horizons, there is little difference in the performance of the models both prior to and after 2008. However, for medium- to longer-term horizons, the slope regression provided the best forecasts prior to 2008, while the recent experience of near-zero short interest rates coincides with a period of forecasting superiority for the autoregressive and dynamic Nelson-Siegel models. © 2014 John Wiley & Sons, Ltd.
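
To make the forecasting set-up concrete, the sketch below illustrates a simplified two-step dynamic Nelson-Siegel forecast: cross-sectional least squares recovers level, slope and curvature factors, and an AR(1) per factor rolls them forward. The decay parameter, the simulated yields and the helper names are illustrative assumptions, not the paper's data or exact specification.

```python
import numpy as np

def nelson_siegel_loadings(maturities, lam=0.0609):
    """Level, slope and curvature loadings of the Nelson-Siegel yield curve."""
    m = np.asarray(maturities, dtype=float)
    decay = (1 - np.exp(-lam * m)) / (lam * m)
    return np.column_stack([np.ones_like(m), decay, decay - np.exp(-lam * m)])

def fit_ns_factors(yields, maturities, lam=0.0609):
    """Cross-sectional OLS of observed yields on the NS loadings, one row of factors per date."""
    X = nelson_siegel_loadings(maturities, lam)
    beta, *_ = np.linalg.lstsq(X, yields.T, rcond=None)   # shape: 3 x T
    return beta.T

def ar1_forecast(series, horizon):
    """Fit an AR(1) by OLS and iterate it `horizon` steps ahead."""
    y, x = series[1:], series[:-1]
    c, phi = np.linalg.lstsq(np.column_stack([np.ones_like(x), x]), y, rcond=None)[0]
    f = series[-1]
    for _ in range(horizon):
        f = c + phi * f
    return f

# Toy monthly yield panel (120 dates x 4 maturities, in per cent) -- purely illustrative
rng = np.random.default_rng(0)
maturities = np.array([0.25, 2.0, 5.0, 10.0])                  # years
yields = 3.0 + 0.05 * rng.standard_normal((120, 4)).cumsum(axis=0)

factors = fit_ns_factors(yields, maturities)
f_12m = np.array([ar1_forecast(factors[:, k], horizon=12) for k in range(3)])
print(nelson_siegel_loadings(maturities) @ f_12m)              # 12-month-ahead curve forecast
```

A random walk benchmark, as used in the paper, would simply carry today's yields forward unchanged.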

Relevance:

80.00%

Publisher:

Abstract:

One of the central explanations of the recent Asian Crisis has been the problem of moral hazard as the source of over-investment and excessive external borrowing. There is, however, rather limited firm-level empirical evidence to characterise inefficient use of internal and external finances. Using a large firm-level panel data-set from four badly affected Asian countries, this paper compares the rates of return to various internal and external funds among firms with low and high debt financing (relative to equity), and among financially constrained and other firms. Selectivity-corrected estimates obtained from a random effects panel data model do suggest evidence of significantly lower rates of return to long-term debt, even among firms relying more on debt relative to equity in our sample. There is also evidence that average effective interest rates often significantly exceeded the average returns to long-term debt in the sample countries in the pre-crisis period. © 2006 Elsevier Inc. All rights reserved.
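
For readers unfamiliar with the estimator mentioned above, here is a minimal sketch of a random effects panel regression of returns on financing variables, using the `linearmodels` package on a simulated firm-year panel. The variable names and data are hypothetical, and the selectivity correction applied in the paper is not reproduced.

```python
import numpy as np
import pandas as pd
from linearmodels.panel import RandomEffects

# Hypothetical firm-year panel: rates of return and financing mix (all simulated)
rng = np.random.default_rng(1)
firms, years = 200, 8
idx = pd.MultiIndex.from_product([range(firms), range(1990, 1990 + years)],
                                 names=["firm", "year"])
df = pd.DataFrame({
    "return_on_assets": rng.normal(0.08, 0.03, firms * years),
    "long_term_debt":   rng.normal(0.30, 0.10, firms * years),        # share of long-term debt
    "equity_finance":   rng.normal(0.40, 0.15, firms * years),        # share of equity finance
    "high_leverage":    np.repeat(rng.integers(0, 2, firms), years),  # firm-level indicator
}, index=idx)

exog = df[["long_term_debt", "equity_finance", "high_leverage"]].assign(const=1.0)
res = RandomEffects(df["return_on_assets"], exog).fit()
print(res.params)   # estimated 'returns' to each source of finance
```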

Relevance:

80.00%

Publisher:

Abstract:

Contrary to the long-received theory of FDI, interest rates or rates of return can motivate foreign direct investment (FDI) in concert with the benefits of direct ownership. Thus, access to investor capital and capital markets is a vital component of the multinational's competitive market structure. Moreover, multinationals can use their superior financial capacity as a competitive advantage in exploiting FDI opportunities in dynamic markets. They can also mitigate higher levels of foreign business risks under dynamic conditions by shifting more financial risk to creditors in the host economy. Furthermore, the investor's expectation of foreign business risk necessarily commands a risk premium for exposing their equity to foreign market risk. Multinationals can modify the profit maximization strategy of their foreign subsidiaries to maximize growth or profits to generate this risk premium. In this context, we investigate how foreign subsidiaries manage their capital funding, business risk, and profit strategies with a diverse sample of 8,000 matched parent and foreign subsidiary accounts from multiple industries in 38 countries. We find that interest rates, asset prices, and expectations in capital markets have a significant effect on the capital movements of foreign subsidiaries. We also find that foreign subsidiaries mitigate their exposure to foreign business risk by modifying their capital structure and debt maturity. Further, we show how the operating strategy of foreign subsidiaries affects their preference for growth or profit maximization. We further show that superior shareholder value, which is a vital link for access to capital for funding foreign expansion in open market economies, is achieved through maintaining stability in the rate of growth and good asset utilization.

Relevance:

80.00%

Publisher:

Abstract:

I model the forward premium in the U.K. gilt-edged market over the period 1982–96 using a two-factor general equilibrium model of the term structure of interest rates. The model permits the decomposition of the forward premium into separate components representing interest rate expectations, the risk premia associated with each of the underlying factors, and terms capturing the direct impact of the variances of the factors on the shape of the forward curve.
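
For orientation, the display below sketches a generic decomposition of the instantaneous forward rate in a Gaussian two-factor affine setting with independent factors; the symbols (factor loadings B_i, market prices of risk lambda_i, volatilities sigma_i) are illustrative notation and are not taken from the paper's own model.

```latex
% Illustrative two-factor decomposition (not the paper's exact specification):
% expectations, factor risk premia and variance (convexity) terms.
f_t(\tau) \;=\;
\underbrace{E_t\!\left[r_{t+\tau}\right]}_{\text{interest rate expectations}}
\;+\; \underbrace{\lambda_1 \sigma_1 B_1(\tau) + \lambda_2 \sigma_2 B_2(\tau)}_{\text{factor risk premia}}
\;-\; \underbrace{\tfrac{1}{2}\!\left[\sigma_1^{2} B_1(\tau)^{2} + \sigma_2^{2} B_2(\tau)^{2}\right]}_{\text{variance terms}}
```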

Relevance:

80.00%

Publisher:

Abstract:

This thesis focuses on the theoretical examination of exchange rate economic (operating) exposure within the context of the theory of the firm, and proposes some hedging solutions using currency options. The examination of economic exposure is based on such parameters as firms' objectives, industry structure and production cost efficiency. In particular, it examines a hypothetical exporting firm with costs in domestic currency, which faces competition from foreign firms in overseas markets and has a market share expansion objective. Within this framework, the hypothesis is established that economic exposure, portrayed in a diagram connecting export prices and real exchange rates, is asymmetric (i.e. the negative effects of a currency appreciation are greater than the positive effects of a currency depreciation). In this case, export business can be seen as a real option given by exporting firms to overseas customers. Different scenarios for the asymmetry hypothesis can be derived under different assumptions about the determinants of economic exposure. Having established the asymmetry hypothesis, hedging against this exposure is analysed, and the hypothesis is established that a currency call option should be used in hedging against asymmetric economic exposure. Further, some advanced currency options strategies are discussed and their use in hedging several scenarios of exposure is indicated, establishing the hypothesis that the optimal options strategy is a function of the determinants of exposure. Some extensions of the theoretical analysis are examined. These include the hedging of multicurrency exposure using options, and the exposure of a purely domestic firm facing import competition. The empirical work addresses two issues: the empirical validity of the asymmetry hypothesis and the examination of the hedging effectiveness of currency options.
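
As one building block for the option strategies discussed above, the sketch below prices a European currency call with the standard Garman-Kohlhagen formula. The spot, strike, rates and volatility are made-up numbers, and whether the call should be written on the home or the foreign currency depends on the invoicing and quoting conventions adopted in the thesis.

```python
import math
from statistics import NormalDist

def gk_call(spot, strike, t, r_dom, r_for, vol):
    """Garman-Kohlhagen value of a European currency call (spot quoted as domestic per unit of foreign)."""
    d1 = (math.log(spot / strike) + (r_dom - r_for + 0.5 * vol ** 2) * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    N = NormalDist().cdf
    return spot * math.exp(-r_for * t) * N(d1) - strike * math.exp(-r_dom * t) * N(d2)

# Hypothetical exporter capping its downside from an adverse exchange rate move
# while keeping the upside from a favourable one.
print(round(gk_call(spot=1.25, strike=1.30, t=0.5, r_dom=0.05, r_for=0.03, vol=0.10), 4))
```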

Relevance:

80.00%

Publisher:

Abstract:

The literature on bond markets and interest rates has focused largely on the term structure of interest rates, specifically, on the so-called expectations hypothesis. At the same time, little is known about the nature of the spread of interest rates in the money market beyond the fact that such spreads are generally unstable. However, with the evolution of complex financial instruments, it has become imperative to identify the time series process that can help one accurately forecast such spreads into the future. This article explores the nature of the time series process underlying the spread between three-month and one-year US rates, and concludes that the movements in this spread over time are best captured by a GARCH(1,1) process. It also suggests the use of a relatively long-term measure of interest rate volatility as an explanatory variable. This exercise has gained added importance in view of the revelation that GARCH-based estimates of option prices consistently outperform the corresponding estimates based on the stylized Black-Scholes algorithm.
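
As a minimal illustration of the conclusion above, the sketch fits a GARCH(1,1) model to a simulated series of spread changes with the Python `arch` package; the simulated data and the constant-mean specification are assumptions and do not reproduce the article's estimates.

```python
import numpy as np
import pandas as pd
from arch import arch_model

# Hypothetical weekly changes (in basis points) of the one-year minus three-month spread
rng = np.random.default_rng(42)
d_spread = pd.Series(5.0 * rng.standard_normal(520))

model = arch_model(d_spread, mean="Constant", vol="GARCH", p=1, q=1)
res = model.fit(disp="off")
print(res.params)                                   # mu, omega, alpha[1], beta[1]
print(res.forecast(horizon=4).variance.iloc[-1])    # conditional variance forecasts
```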

Relevance:

80.00%

Publisher:

Abstract:

The year so far has been a slow start for many businesses, but at least we have not seen the collapse of as many businesses as we were seeing around two years ago. We are, however, still well and truly in the midst of a global recession. Interest rates are still at an all-time low, UK house prices seem to be showing little sign of increase (except in London, where everyone still seems to want to live!) and for the ardent shopper there are bargains to be had everywhere. It seems strange that prices on the high street do not seem to have increased in over ten years. Mobile phones, DVD players, even furniture seem to be cheaper than they used to be. Whilst much of this is down to cheaper manufacturing, the rest could probably be explained by competition within the marketplace. Does this mean that quality has suffered too? We now live in a world where, if a television is not working, it is thrown away and replaced. There was a time when you would take it to some odd-looking man that your father would know who could fix it for you. (I remember our local television fix-it man, with his thick-rimmed bifocal spectacles and a poor comb-over; he had cardboard boxes full of resistors and electrical wires on the floor of his front room, which smelt of soldering irons!) Is this consumerism at an extreme, or has this move to disposability made us a better society?

Before you think these are just ramblings, there is a point to this. According to the latest global figures for contact lens sales, the vast majority of contact lenses fitted around the world are daily, fortnightly or monthly disposable hydrogel lenses. Certainly in the UK over 90% of lenses are disposable (with daily disposables being the most popular, having a market share of over 50%). This begs the question: is this a good thing? Maybe more importantly, do our patients benefit? I think it is worth reminding ourselves why we went down the disposability route with contact lenses in the first place, and, unlike electrical goods, it was not just so we did not have to take them for repair! There are the obvious advantages of overcoming the problems of breakage and tearing of lenses and of lens deterioration with age. The lenses are less likely to be contaminated, and disinfection is either easier or not required at all (in the case of daily disposable lenses). Probably the landmark paper in the field was the work more commonly known as the 'Gothenburg Study'. The paper, entitled 'Strategies for Minimizing the Ocular Effects of Extended Contact Lens Wear', was published in the American Journal of Optometry in 1987 (volume 64, pages 781-789) by Holden, B.A., Swarbrick, H.A., Sweeney, D.F., Ho, A., Efron, N., Vannas, A. and Nilsson, K.T. They suggested that contact lens induced ocular effects were minimised by:
• More frequently removed contact lenses
• More regularly replaced contact lenses
• A lens that was more mobile on the eye (to allow better removal of debris)
• Better flow of oxygen through the lens
All of these issues seem to be solved with disposability, except the oxygen issue, which has been solved with the advent of silicone hydrogel materials. Newer issues have arisen, and most can be solved in practice by the eye care practitioner. The emphasis now seems to be on making lenses more comfortable. The problem of contact lens-related dry eye symptoms seems to be ever present, and maybe this would explain why in the UK we have a pretty constant contact lens-wearing population of just over three million but every year we have over a million dropouts!

That means we must be attracting a million new wearers every year (well done to the marketing departments!), but we are also losing a million wearers every year. We certainly are not losing them all to the refractive surgery clinics. We know that almost anyone can now wear a contact lens, and we know that some lenses will provide sharper vision, some will aid comfort, and some will be useful for patients with dry eyes. So if we still have so many dropouts then we must be doing something wrong! I think the take-home message has to be 'must try harder'!

I must end with an apology for two errors in my editorial of issue 1 earlier this year. Firstly, there was a typo in the first sentence; I meant to state that it was 40 years, not 30 years, since the first commercial soft lens was available in the UK. The second error was one that I was unaware of until colleagues Geoff Wilson (Birmingham, UK) and Tim Bowden (London, UK) wrote to me to explain that soft lenses were actually available in the UK before 1971 (please see their 'Letters to the Editor' in this issue). I am grateful to both of them for correcting the mistake.

Relevance:

80.00%

Publisher:

Abstract:

Most studies investigating the determinants of R&D investment consider pooled estimates. However, if the parameters are heterogeneous, pooled coefficients may not provide reliable estimates of individual industry effects. Hence pooled parameters may conceal valuable information that may help target government tools more efficiently across heterogeneous industries. There is little evidence to date on the decomposition of the determinants of R&D investment by industry. Moreover, the existing work does not distinguish between those R&D determinants for which pooling may be valid and those for which it is not. In this paper, we test the pooling assumption for a panel of manufacturing industries and find that pooling is valid only for output fluctuations, adjustment costs and interest rates. Implementing the test results into our model, we find government funding is significant only for low-tech R&D. Foreign R&D and skilled labour matter only in high-tech sectors. These results suggest important implications for R&D policy.
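
To illustrate what testing the pooling assumption can look like, the sketch below runs a simple Chow-type poolability F-test comparing pooled and industry-specific coefficients on simulated data; the paper's own test and specification may differ, and the data and function names here are hypothetical.

```python
import numpy as np

def poolability_f_test(y_by_industry, X_by_industry):
    """Chow-type F-test of pooled coefficients against separate coefficients per industry."""
    def rss(y, X):
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ b
        return resid @ resid

    Xp, yp = np.vstack(X_by_industry), np.concatenate(y_by_industry)
    k, g = Xp.shape[1], len(y_by_industry)
    rss_pooled = rss(yp, Xp)
    rss_separate = sum(rss(y, X) for y, X in zip(y_by_industry, X_by_industry))
    df1, df2 = (g - 1) * k, len(yp) - g * k
    F = ((rss_pooled - rss_separate) / df1) / (rss_separate / df2)
    return F, df1, df2

# Three hypothetical industries whose slope on the second regressor differs
rng = np.random.default_rng(3)
X_list = [np.column_stack([np.ones(80), rng.normal(size=(80, 2))]) for _ in range(3)]
y_list = [X @ np.array([1.0, 0.5 + 0.3 * i, -0.2]) + rng.normal(scale=0.5, size=80)
          for i, X in enumerate(X_list)]
print(poolability_f_test(y_list, X_list))   # a large F suggests pooling is not valid for these regressors
```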

Relevance:

30.00%

Publisher:

Abstract:

We used magnetoencephalography (MEG) to examine the nature of oscillatory brain rhythms when passively viewing both illusory and real visual contours. Three stimuli were employed: a Kanizsa triangle; a Kanizsa triangle with a real triangular contour superimposed; and a control figure in which the corner elements used to form the Kanizsa triangle were rotated to negate the formation of illusory contours. The MEG data were analysed using synthetic aperture magnetometry (SAM) to enable the spatial localisation of task-related oscillatory power changes within specific frequency bands, and the time-course of activity within given locations-of-interest was determined by calculating time-frequency plots using a Morlet wavelet transform. In contrast to earlier studies, we did not find increases in gamma activity (> 30 Hz) in response to illusory shapes, but instead observed a decrease in 10–30 Hz activity approximately 200 ms after stimulus presentation. The reduction in oscillatory activity was primarily evident within extrastriate areas, including the lateral occipital complex (LOC). Importantly, this same pattern of results was evident for each stimulus type. Our results further highlight the importance of the LOC and a network of posterior brain regions in processing visual contours, be they illusory or real in nature. The similarity of the results for both real and illusory contours, however, leads us to conclude that the broadband (< 30 Hz) decrease in power we observed is more likely to reflect general changes in visual attention than neural computations specific to processing visual contours.
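
For readers curious about the time-frequency step, the sketch below computes power via convolution with complex Morlet wavelets on a toy single-channel trace; the beamforming (SAM) stage is not shown, and the sampling rate, frequencies, wavelet width and signal are illustrative assumptions.

```python
import numpy as np

def morlet_power(signal, sfreq, freqs, n_cycles=7):
    """Time-frequency power from convolution with complex Morlet wavelets."""
    power = np.empty((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)                       # temporal width of the wavelet
        t = np.arange(-3.5 * sigma_t, 3.5 * sigma_t, 1 / sfreq)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t ** 2 / (2 * sigma_t ** 2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))           # unit-energy normalisation
        power[i] = np.abs(np.convolve(signal, wavelet, mode="same")) ** 2
    return power

# Toy trace: a 20 Hz oscillation whose amplitude drops ~200 ms after 'stimulus onset'
rng = np.random.default_rng(5)
sfreq = 600.0
t = np.arange(0, 2.0, 1 / sfreq)
sig = np.sin(2 * np.pi * 20 * t) * np.where(t < 0.2, 1.0, 0.4) + 0.1 * rng.standard_normal(len(t))

power = morlet_power(sig, sfreq, freqs=np.arange(8, 42, 2))
print(power.shape)   # (n_frequencies, n_times)
```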

Relevance:

30.00%

Publisher:

Abstract:

This research is concerned with the development of distributed real-time systems, in which software is used for the control of concurrent physical processes. These distributed control systems are required to periodically coordinate the operation of several autonomous physical processes, with the property of an atomic action. The implementation of this coordination must be fault-tolerant if the integrity of the system is to be maintained in the presence of processor or communication failures. Commit protocols have been widely used to provide this type of atomicity and ensure consistency in distributed computer systems. The objective of this research is the development of a class of robust commit protocols, applicable to the coordination of distributed real-time control systems. Extended forms of the standard two-phase commit protocol, which provide fault-tolerant and real-time behaviour, were developed. Petri nets are used for the design of the distributed controllers, and to embed the commit protocol models within these controller designs. This composition of controller and protocol model allows the analysis of the complete system in a unified manner. A common problem for Petri-net-based techniques is that of state space explosion; a modular approach to both design and analysis would help to cope with this problem. Although extensions to Petri nets that allow module construction exist, generally the modularisation is restricted to the specification, and analysis must be performed on the (flat) detailed net. The Petri net designs for the type of distributed systems considered in this research are both large and complex. The top-down, bottom-up and hybrid synthesis techniques that are used to model large systems in Petri nets are considered. A hybrid approach to Petri net design for a restricted class of communicating processes is developed. Designs produced using this hybrid approach are modular and allow re-use of verified modules. In order to use this form of modular analysis, it is necessary to project an equivalent but reduced behaviour onto the modules used. These projections conceal events local to modules that are not essential for the purpose of analysis. To generate the external behaviour, each firing sequence of the subnet is replaced by an atomic transition internal to the module, and the firing of these transitions transforms the input and output markings of the module. Thus local events are concealed through the projection of the external behaviour of modules. This hybrid design approach preserves properties of interest, such as boundedness and liveness, while the systematic concealment of local events allows the management of state space. The approach presented in this research is particularly suited to distributed systems, as the underlying communication model is used as the basis for the interconnection of modules in the design procedure. This hybrid approach is applied to the Petri net based design and analysis of distributed controllers for two industrial applications that incorporate the robust, real-time commit protocols developed. Temporal Petri nets, which combine Petri nets and temporal logic, are used to capture and verify causal and temporal aspects of the designs in a unified manner.
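
To fix ideas before the Petri net treatment, here is a minimal sketch of the decision rule of the standard (unextended) two-phase commit protocol, written in Python rather than as a net; the class and method names are hypothetical, and the fault-tolerant, real-time extensions developed in the thesis are not represented.

```python
from enum import Enum

class Vote(Enum):
    COMMIT = "commit"
    ABORT = "abort"

class ProcessController:
    """Controller for one autonomous physical process taking part in the atomic action."""
    def __init__(self, name, ready=True):
        self.name, self.ready = name, ready

    def prepare(self):
        # Phase 1: vote; a failed or unready controller votes to abort
        return Vote.COMMIT if self.ready else Vote.ABORT

    def finish(self, decision):
        # Phase 2: apply the coordinator's decision
        print(f"{self.name}: {'committing' if decision is Vote.COMMIT else 'aborting'} coordinated action")

def two_phase_commit(controllers):
    """Standard 2PC: commit only if every participant votes COMMIT, otherwise abort."""
    votes = [c.prepare() for c in controllers]
    decision = Vote.COMMIT if all(v is Vote.COMMIT for v in votes) else Vote.ABORT
    for c in controllers:
        c.finish(decision)
    return decision

print(two_phase_commit([ProcessController("conveyor"), ProcessController("press", ready=False)]))
```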

Relevance:

30.00%

Publisher:

Abstract:

National meteorological offices are largely concerned with synoptic-scale forecasting, where weather predictions are produced for a whole country for 24 hours ahead. In practice, many local organisations (such as emergency services, construction industries, forestry, farming, and sports) require only local, short-term, bespoke weather predictions and warnings. This thesis shows that these less-demanding requirements do not require exceptional computing power and can be met by a modern desk-top system which monitors site-specific ground conditions (such as temperature, pressure, wind speed and direction, etc.) augmented with above-ground information from satellite images to produce `nowcasts'. The emphasis in this thesis has been towards the design of such a real-time system for nowcasting. Local site-specific conditions are monitored using a custom-built, stand-alone, Motorola 6809-based sub-system. Above-ground information is received from the METEOSAT 4 geo-stationary satellite using a sub-system based on commercially available equipment. The information is ephemeral and must be captured in real time. The real-time nowcasting system for localised weather handles the data as a transparent task using the limited capabilities of the PC system. Ground data produces a time series of measurements at a specific location which represents the past-to-present atmospheric conditions of the particular site, from which much information can be extracted. The novel approach adopted in this thesis is one of constructing stochastic models based on the AutoRegressive Integrated Moving Average (ARIMA) technique. The satellite images contain features (such as cloud formations) which evolve dynamically and may be subject to movement, growth, distortion, bifurcation, superposition, or elimination between images. The process of extracting a weather feature, following its motion and predicting its future evolution involves algorithms for normalisation, partitioning, filtering, image enhancement, and correlation of multi-dimensional signals in different domains. To limit the processing requirements, the analysis in this thesis concentrates on an `area of interest'. By this rationale, only a small fraction of the total image needs to be processed, leading to a major saving in time. The thesis also proposes an extension to an existing manual cloud classification technique for its implementation in automatically classifying a cloud feature over the `area of interest' for nowcasting using the multi-dimensional signals.
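
As a small illustration of the ground-data modelling mentioned above, the sketch below fits an ARIMA model to a simulated sensor series with statsmodels and produces a short-horizon nowcast; the ARIMA(1,1,1) order, sampling interval and data are assumptions rather than the thesis's identified models.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical 10-minute ground temperature readings from the site sensors
rng = np.random.default_rng(7)
idx = pd.date_range("2024-06-01", periods=288, freq="10min")
temperature = pd.Series(15.0 + np.cumsum(rng.normal(0.0, 0.05, 288)), index=idx)

model = ARIMA(temperature, order=(1, 1, 1))   # order would normally be identified from the observed series
fit = model.fit()
print(fit.forecast(steps=6))                  # nowcast for the next hour
```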

Relevance:

30.00%

Publisher:

Abstract:

Illiteracy is often associated with people in developing countries. However, an estimated 50% of adults in a developed country such as Canada lack the literacy skills required to cope with the challenges of today's society; for them, tasks such as reading, understanding, basic arithmetic, and using everyday items are a challenge. Many community-based organizations offer resources and support for these adults, yet overall functional literacy rates are not improving. This is due to a wide range of factors, such as poor retention of adult learners in literacy programs, obstacles in transferring the acquired skills from the classroom to real life, personal attitudes toward learning, and the stigma of functional illiteracy. In our research we examined the opportunities afforded by personal mobile devices in providing learning and functional support to low-literacy adults. We present the findings of an exploratory study aimed at investigating the reception and adoption of a technological solution for adult learners. ALEX© is a mobile application designed for use both in the classroom and in daily life in order to help low-literacy adults become increasingly literate and independent. Such a solution complements literacy programs by increasing users' motivation and interest in learning, and raising their confidence levels both in their education pursuits and in facing the challenges of their daily lives. We also reflect on the challenges we faced in designing and conducting our research with two user groups (adults enrolled in literacy classes and in an essential skills program) and contrast the educational impact of, and attitudes toward, such technology between these groups. Our conclusions present the lessons learned from our evaluations and the impact of the studies' specific challenges on the outcome and uptake of such mobile assistive technologies in providing practical support to low-literacy adults in conjunction with literacy and essential skills training. © 2013 Her Majesty the Queen in Right of Canada.

Relevance:

30.00%

Publisher:

Abstract:

The concept of measurement-enabled production is based on integrating metrology systems into production processes and has generated significant interest in industry, due to its potential to increase process capability and accuracy, which in turn reduces production times and eliminates defective parts. One of the most promising methods of integrating metrology into production is the use of external metrology systems to compensate machine tool errors in real time. The development and experimental performance evaluation of a low-cost, laser-tracker-assisted prototype three-axis machine tool are described in this paper. Real-time corrections of the machine tool's absolute volumetric error have been achieved. As a result, significant increases in static repeatability and accuracy have been demonstrated, allowing the low-cost three-axis machine tool to reliably reach static positioning accuracies below 35 μm throughout its working volume without any prior calibration or error mapping. This is a significant technical development that demonstrates the feasibility of the proposed methods and can have wide-scale industrial applications by enabling machine tools that are low-cost and of limited structural integrity, and that could be deployed flexibly as end-effectors of robotic automation, to achieve positional accuracies that were previously the preserve of large, high-precision machine tools.
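
A minimal sketch of one compensation cycle is given below, assuming a hypothetical interface to the external metrology system and the machine controller (the callables and tolerance are made up); it only illustrates the measure-compare-correct idea, not the paper's actual implementation.

```python
import numpy as np

def compensate_once(commanded_xyz, read_tracker_xyz, apply_offset, tolerance_mm=0.005):
    """One cycle of external-metrology compensation: measure the actual position,
    compute the volumetric error against the commanded position, and feed the
    negated error back to the machine controller as an axis offset."""
    actual = np.asarray(read_tracker_xyz(), dtype=float)      # e.g. laser tracker reading, in mm
    error = actual - np.asarray(commanded_xyz, dtype=float)
    if np.linalg.norm(error) > tolerance_mm:
        apply_offset(-error)                                  # nudge the axes to cancel the error
    return error

# Stand-in callables purely for illustration
read_tracker = lambda: [100.012, 50.004, 24.998]
apply_offset = lambda offset: print("controller offset:", np.round(offset, 4))
print(compensate_once([100.0, 50.0, 25.0], read_tracker, apply_offset))
```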