23 results for Stock model
at Aston University Research Archive
Abstract:
Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services with prior arrangement.
Abstract:
This paper examines the impact that the introduction of a closing call auction had on market quality at the London Stock Exchange. Using estimates from the partial adjustment with noise model of Amihud and Mendelson [Amihud, Y., Mendelson, H., 1987. Trading mechanisms and stock returns: An empirical investigation. Journal of Finance 42, 533–553], we show that opening and closing market quality improved for participating stocks. When we stratify our sample securities into five groups based on trading activity, we find that the least active securities experience the greatest improvements to market quality. A control sample of stocks is not characterized by discernible changes to market quality.
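The partial-adjustment-with-noise dynamics cited above can be sketched in a few lines. This is a minimal simulation for intuition only; the function name, parameters and defaults are illustrative and not taken from the paper, which estimates the model from data rather than simulating it.

```python
import random

def simulate_partial_adjustment(n, g, sigma_w, sigma_u, seed=0):
    """Simulate Amihud-Mendelson style partial adjustment with noise.
    v_t is an unobserved 'intrinsic value' random walk; the observed
    price p_t moves a fraction g of the way toward v_t each period,
    plus pricing noise u_t.  g = 1 means full (instantaneous)
    adjustment; 0 < g < 1 means partial adjustment."""
    rng = random.Random(seed)
    v, p = [0.0], [0.0]
    for _ in range(1, n):
        v.append(v[-1] + rng.gauss(0.0, sigma_w))   # value innovation
        u = rng.gauss(0.0, sigma_u)                 # pricing noise
        p.append(p[-1] + g * (v[-1] - p[-1]) + u)
    return p, v
```

With g = 1 and no noise the observed price tracks the value exactly; lowering g makes prices lag value changes, which is the "speed of adjustment" the paper measures.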
Abstract:
The aim of this study is to examine the relationship between momentum profitability and the stock market trading mechanism; it is motivated by recent changes to the trading systems that have taken place on the London Stock Exchange. Since 1975 the London stock market has employed three different trading systems: a floor-based system, a computerized dealer system called SEAQ and the automated auction system SETS. Since each new trading system has reduced the level of execution costs, one might expect, a priori, the magnitude of momentum profits to decline with each amendment to the trading system. However, the opposite empirical result is found: shares trading on the automated system generate higher momentum profits than those trading on the floor system, and companies trading on the SETS system display greater momentum profitability than those trading on SEAQ. Our empirical results concur with the theoretical findings of the trader's hesitation model of Du [Du, J., 2002. Heterogeneity in investor confidence and asset market under- and overreaction. Working paper] and the empirical findings of Arena et al. [Arena, M., Haggard, S., Yan, X., Price momentum and idiosyncratic volatility. Financial Review, in press].
Abstract:
This is the first paper to examine the microstructure of the Irish Stock Market empirically and is motivated by the adoption, on June 7th, of Xetra, the modern pan-European auction trading system. Prior to this, the exchange utilized an antiquated floor-based system. This change was an important event for the market, as a rich literature exists to suggest that the trading system exerts a strong influence over the behavior of security returns. We apply the ICSS algorithm of Inclan and Tiao (1994) to discover whether the change to the trading system caused a shift in unconditional volatility at the time Xetra was introduced. Because the trading mechanism can influence volatility in a number of ways, we also estimate the partial adjustment coefficients of the Amihud and Mendelson (1987) model prior and subsequent to the introduction of Xetra. Although we find no evidence of volatility changes associated with the introduction of Xetra, we do find evidence of an increase in the speed of adjustment (JEL: G15).
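The core of the Inclan and Tiao (1994) ICSS test is the centred cumulative sum-of-squares statistic D_k = C_k/C_T - k/T. A sketch of that statistic, for intuition only: the full algorithm applies it recursively with a critical value, whereas this version only locates the single most likely break.

```python
def icss_dk(returns):
    """Centred cumulative sum-of-squares statistic of Inclan & Tiao
    (1994): D_k = C_k/C_T - k/T, where C_k is the running sum of
    squared returns.  A large |D_k| points to a break in
    unconditional variance near index k."""
    sq = [r * r for r in returns]
    T = len(sq)
    C, total = [], 0.0
    for s in sq:
        total += s
        C.append(total)
    return [C[k] / C[-1] - (k + 1) / T for k in range(T)]

def variance_break(returns):
    """0-based index of the most likely single variance break
    (the argmax of |D_k|)."""
    d = icss_dk(returns)
    return max(range(len(d)), key=lambda k: abs(d[k]))
```

On a series whose volatility jumps partway through, the argmax of |D_k| falls at the jump, which is how the paper dates candidate shifts around the Xetra introduction.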
Abstract:
This paper assesses the extent to which the equity markets of Hungary, Poland, the Czech Republic and Russia have become less segmented. Using a variety of tests, it is shown that there has been a consistent increase in the co-movement of some Eastern European markets and developed markets. Using the variance decompositions from a vector autoregressive representation of returns, it is shown that for Poland and Hungary global factors are having an increasing influence on equity returns, suggestive of increased equity market integration. In this paper we model a system of bivariate equity market correlations as a smooth transition logistic trend model in order to establish how rapidly the countries of Eastern Europe are moving away from market segmentation. We find that Hungary is the country that is most quickly becoming integrated. © 2005 Elsevier Ltd. All rights reserved.
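A smooth transition logistic trend, as used above for the correlations, is just a logistic function of time. The parameterisation below is a common one and is assumed here for illustration; the paper's exact specification may differ.

```python
import math

def smooth_transition(t, alpha, beta, gamma, tau):
    """Logistic smooth-transition trend: the level starts near alpha,
    ends near alpha + beta; gamma sets the speed of the transition
    and tau its midpoint in time.  A large estimated gamma therefore
    means a market is integrating quickly."""
    s = 1.0 / (1.0 + math.exp(-gamma * (t - tau)))
    return alpha + beta * s
```

At t = tau the transition is exactly halfway (s = 0.5), so fitted values of gamma and tau answer the paper's question of how rapidly, and when, each market moved away from segmentation.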
Abstract:
Purpose – The purpose of this paper is to investigate the impact of foreign exchange and interest rate changes on US banks' stock returns.
Design/methodology/approach – The approach employs an EGARCH model to account for the ARCH effects in daily returns. Most prior studies have used standard OLS estimation methods, with the result that the presence of ARCH effects would have affected estimation efficiency. For comparative purposes, the standard OLS estimation method is also used to measure sensitivity.
Findings – The findings are as follows: under the conditional t-distributional assumption, the EGARCH model generated a much better fit to the data, although the goodness-of-fit of the model is not entirely satisfactory; the market index return accounts for most of the variation in stock returns at both the individual bank and portfolio levels; and the degree of sensitivity of the stock returns to interest rate and FX rate changes is not very pronounced, despite the use of high frequency data. Earlier results had indicated that daily data provided greater evidence of exposure sensitivity.
Practical implications – Assuming that banks do not hedge perfectly, these findings have important financial implications, as they suggest that the hedging policies of the banks are not reflected in their stock prices. Alternatively, it is possible that different GARCH-type models might be more appropriate when modelling high frequency returns.
Originality/value – The paper contributes to existing knowledge in the area by showing that ARCH effects do have an impact on measures of sensitivity.
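The EGARCH(1,1) conditional-variance recursion (Nelson, 1991) that underlies the approach can be filtered directly from residuals. This is a sketch of the standard recursion under normality, not the paper's estimated specification; parameter names are illustrative.

```python
import math

def egarch_variance(resid, omega, alpha, gamma, beta):
    """Filter an EGARCH(1,1) conditional-variance path from residuals:
    ln s2_t = omega + beta*ln s2_{t-1} + alpha*(|z| - E|z|) + gamma*z,
    where z = resid_{t-1}/sqrt(s2_{t-1}) and E|z| = sqrt(2/pi) under
    normality.  gamma captures the asymmetric (leverage) response."""
    e_abs_z = math.sqrt(2.0 / math.pi)
    # start at the unconditional level implied by the recursion
    log_s2 = omega / (1.0 - beta)
    out = [math.exp(log_s2)]
    for t in range(1, len(resid)):
        z = resid[t - 1] / math.sqrt(out[-1])
        log_s2 = omega + beta * log_s2 + alpha * (abs(z) - e_abs_z) + gamma * z
        out.append(math.exp(log_s2))
    return out
```

Because the recursion is in logs, the variance stays positive without parameter restrictions, which is the practical reason EGARCH is preferred over plain GARCH for this kind of sensitivity regression.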
Abstract:
Recently, Drăgulescu and Yakovenko proposed an analytical formula for computing the probability density function of stock log returns, based on the Heston model, which they tested empirically. Their research design inadvertently biased the fit of the data to the Heston model in its favour, thus overstating their empirical results. Furthermore, Drăgulescu and Yakovenko did not perform any goodness-of-fit statistical tests. This study employs a research design that facilitates statistical tests of the goodness-of-fit of the Heston model to empirical returns. Robustness checks are also performed. In brief, the Heston model outperformed the Gaussian model only at high frequencies, and even so does not provide a statistically acceptable fit to the data. The Gaussian model performed (marginally) better at medium and low frequencies, where the extra parameters of the Heston model have adverse impacts on the test statistics. © 2005 Taylor & Francis Group Ltd.
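The Heston log-return density requires numerical integration, but the goodness-of-fit step the abstract emphasises can be illustrated against the Gaussian benchmark with a Kolmogorov–Smirnov distance. A minimal sketch, not the study's actual test procedure:

```python
import math

def norm_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(sample, mu, sigma):
    """Kolmogorov-Smirnov distance between the empirical CDF of
    `sample` and a N(mu, sigma^2) CDF - the kind of formal
    goodness-of-fit check the paper argues was missing from the
    original Heston-model study."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = norm_cdf(x, mu, sigma)
        d = max(d, abs((i + 1) / n - f), abs(f - i / n))
    return d
```

Replacing `norm_cdf` with a numerically integrated Heston CDF turns the same statistic into the Heston-versus-data test; comparing the two distances frequency by frequency is essentially the study's horse race.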
Abstract:
This study seeks to explain the leverage effect in UK stock returns by reference to the return volatility, leverage and size characteristics of UK companies. A leverage effect is found that is stronger for smaller companies and has greater explanatory power over the returns of smaller companies. The properties of a theoretical model that predicts that companies with higher leverage ratios will experience greater leverage effects are explored. On examining leverage ratio data, it is found that there is a propensity for smaller companies to have higher leverage ratios. The transmission of volatility shocks between the companies is also examined and it is found that the volatility of larger firm returns is important in determining both the volatility and returns of smaller firms, but not the reverse. Moreover, it is found that where volatility spillovers are important, they improve out-of-sample volatility forecasts. © 2005 Taylor & Francis Group Ltd.
Abstract:
The aim in this paper is to replicate and extend the analysis of visual technical patterns by Lo et al. (2000) using data on the UK market. A non-parametric smoother is used to model a nonlinear trend in stock price series. Technical patterns, such as the 'head-and-shoulders' pattern, that are characterised by a sequence of turning points are identified in the smoothed data. Statistical tests are used to determine whether returns conditioned on the technical patterns are different from random returns and, in an extension to the analysis of Lo et al. (2000), whether they can outperform a market benchmark return. For the stocks in the FTSE 100 and FTSE 250 indices over the period 1986 to 2001, we find that the various technical patterns occur with different frequencies, and that their relative frequencies differ from those found in the US market. Our extended statistical testing indicates that UK stock returns are less influenced by technical patterns than was the case for US stock returns.
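The two mechanical steps of the Lo et al. (2000) approach, smoothing the price path and reading off its turning points, can be sketched as below. The Gaussian kernel smoother is one standard choice of non-parametric smoother; the bandwidth rule and pattern grammar used in the paper are not reproduced here.

```python
import math

def kernel_smooth(prices, bandwidth):
    """Nadaraya-Watson smoother with a Gaussian kernel over the time
    index, in the spirit of Lo et al. (2000).  Bandwidth selection is
    left to the caller."""
    n = len(prices)
    out = []
    for t in range(n):
        w = [math.exp(-0.5 * ((t - s) / bandwidth) ** 2) for s in range(n)]
        total = sum(w)
        out.append(sum(wi * p for wi, p in zip(w, prices)) / total)
    return out

def turning_points(smoothed):
    """Indices of local maxima/minima in the smoothed series.  A
    head-and-shoulders pattern is then a particular five-point
    max-min-higher-max-min-max sequence of these turning points."""
    pts = []
    for t in range(1, len(smoothed) - 1):
        if (smoothed[t] - smoothed[t - 1]) * (smoothed[t + 1] - smoothed[t]) < 0:
            pts.append(t)
    return pts
```

Pattern frequencies are then counted by matching turning-point sequences against each pattern's definition, which is what the paper compares between the UK and US samples.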
Abstract:
A two-factor no-arbitrage model is used to provide a theoretical link between stock and bond market volatility. While this model suggests that short-term interest rate volatility may, at least in part, drive both stock and bond market volatility, the empirical evidence suggests that past bond market volatility affects both markets and feeds back into short-term yield volatility. The empirical modelling goes on to examine the (time-varying) correlation structure between volatility in the stock and bond markets and finds that the sign of this correlation has reversed over the last 20 years. This has important implications for portfolio selection in financial markets. © 2005 Elsevier B.V. All rights reserved.
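The simplest way to see a time-varying, sign-switching correlation of the kind reported above is a rolling-window correlation of the two volatility series. The paper's formal model is more sophisticated; this is only a first-look diagnostic.

```python
import math
import statistics

def pearson(x, y):
    """Plain Pearson correlation of two equal-length series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

def rolling_correlation(x, y, window):
    """Correlation over a sliding window; plotting the result against
    time reveals any sign reversal in the stock-bond relationship."""
    return [pearson(x[t:t + window], y[t:t + window])
            for t in range(len(x) - window + 1)]
```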
Abstract:
In this paper we propose a data envelopment analysis (DEA) based method for assessing the comparative efficiencies of units operating production processes where input-output levels are inter-temporally dependent. One cause of inter-temporal dependence between input and output levels is capital stock, which influences output levels over many production periods. Such units cannot be assessed by traditional or 'static' DEA, which assumes input-output correspondences are contemporaneous in the sense that the output levels observed in a time period are the product solely of the input levels observed during that same period. The method developed in the paper overcomes the problem of inter-temporal input-output dependence by using input-output 'paths' mapped out by operating units over time as the basis of assessing them. As an application, we compare the results of the dynamic and static models for a set of UK universities. The results suggest that the dynamic model captures efficiency better than the static model does. © 2003 Elsevier Inc. All rights reserved.
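For intuition about the static DEA baseline the paper extends: in the one-input, one-output constant-returns case, the input-oriented CCR efficiency has a closed form, since each unit is simply compared with the best observed output/input ratio. The general multi-input case, and the paper's dynamic path-based method, instead require solving a linear programme per unit; this scalar sketch only illustrates the idea.

```python
def ccr_efficiency(x, y, unit):
    """Input-oriented CCR (constant returns to scale) efficiency for
    the one-input, one-output case: the efficiency of `unit` is its
    output/input ratio divided by the best ratio in the sample, so an
    efficient unit scores 1.0 and the rest score below 1.0."""
    best = max(yj / xj for xj, yj in zip(x, y))
    return (y[unit] / x[unit]) / best
```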
Abstract:
This thesis examines the effect of rights issue announcements on stock prices by companies listed on the Kuala Lumpur Stock Exchange (KLSE) between 1987 and 1996. The emphasis is to report whether the KLSE is semi-strongly efficient with respect to the announcement of rights issues and to check whether the implications of corporate finance theories on the effect of an event can be supported in the context of an emerging market. Once the effect is established, potential determinants of abnormal returns identified by previous empirical work and corporate financial theory are analysed. By examining 70 companies making clean rights issue announcements, this thesis will hopefully shed light on some important issues in long term corporate financing. Event study analysis is used to check on the efficiency of the Malaysian stock market, while cross-sectional regression analysis is executed to identify possible explanatory factors behind the rights issue announcements' effect. To ensure the results presented are not contaminated, econometric and statistical issues raised in both analyses have been taken into account. Given the small amount of empirical research conducted in this part of the world, the results of this study will hopefully be of use to investors, security analysts, corporate financial managers, regulators and policy makers, as well as those who are interested in capital-market-based research of an emerging market. It is found that the Malaysian stock market is not semi-strongly efficient, since there exists a persistent non-zero abnormal return. This finding is not consistent with the hypothesis that security returns adjust rapidly to reflect new information. It may be possible that the result is influenced by the sample, consisting mainly of below average size companies which tend to be thinly traded. Nevertheless, these issues have been addressed.
Another important issue which has emerged from the study is that there is some evidence to suggest that insider trading activity existed in this market. In addition to these findings, when the rights issue announcements' effect is compared to the implications of corporate finance theories in predicting the sign of abnormal returns, the signalling model, asymmetric information model, perfect substitution hypothesis and Scholes' information hypothesis cannot be supported.
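The event-study machinery the thesis relies on reduces to abnormal returns and their cumulation over the event window. A minimal sketch; the benchmark ("expected") returns would come from a market model estimated outside the window, which is not shown here.

```python
def abnormal_returns(actual, expected):
    """Event-study abnormal return per period: the realised return
    minus the return the benchmark model predicted.  A persistent
    non-zero value after the announcement is the evidence against
    semi-strong efficiency discussed above."""
    return [a - e for a, e in zip(actual, expected)]

def car(actual, expected):
    """Cumulative abnormal return over the event window."""
    return sum(abnormal_returns(actual, expected))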
Abstract:
A review of published literature was made to establish the fundamental aspects of rolling and allow an experimental programme to be planned. Simulated hot rolling tests, using pure lead as a model material, were performed on a laboratory mill to obtain data on load and torque when rolling square section stock. Billet metallurgy and consolidation of representative defects were studied when modelling the rolling of continuously cast square stock, with a view to determining optimal reduction schedules that would result in a product having properties to the high level found in fully wrought billets manufactured from large ingots. It is difficult to characterize sufficiently the complexity of the porous central region in a continuously cast billet for accurate modelling. However, drilling holes into a lead billet prior to rolling was found to be a good means of assessing central void consolidation in the laboratory. A rolling schedule of 30% (1.429:1) per pass to a total of 60% (2.5:1) will give a homogeneous, fully recrystallized product. To achieve central consolidation, a total reduction of approximately 70% (3.333:1) is necessary. At the reduction necessary to achieve consolidation, full recrystallization is assured. A theoretical analysis using a simplified variational principle with experimentally derived spread data has been developed for a homogeneous material. An upper bound analysis of a single, centrally situated void has been shown to give good predictions of void closure with reduction and the reduction required for void closure for initial void area fractions 0.45%. A limited number of tests in the works has indicated compliance with the results for void closure obtained in the laboratory.
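The "n:1" ratios quoted alongside the percentage reductions above all follow one conversion: a fractional reduction r leaves (1 - r) of the section height, giving a ratio of 1/(1 - r). A small check of the three quoted figures:

```python
def reduction_ratio(reduction):
    """Convert a fractional height reduction to the 'n:1' ratio quoted
    in rolling schedules: a 30% reduction leaves 70% of the height,
    i.e. a ratio of 1/0.70, roughly 1.429:1."""
    return 1.0 / (1.0 - reduction)
```

Applying it to the schedule above: 30% gives about 1.429:1, 60% gives 2.5:1, and 70% gives about 3.333:1, matching the figures in the abstract.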
Abstract:
Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system, and identify the reasons for the differences between the theoretical expectations and the operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis - theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempts to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention into an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
Abstract:
This thesis is concerned with the inventory control of items that can be considered independent of one another. The decisions of when to order and in what quantity are the controllable or independent variables in cost expressions which are minimised. The four systems considered are referred to as (Q, R), (nQ, R, T), (M, T) and (M, R, T). With (Q, R), a fixed quantity Q is ordered each time the order cover (i.e. stock in hand plus on order) equals or falls below R, the re-order level. With the other three systems, reviews are made only at intervals of T. With (nQ, R, T), an order for nQ is placed if on review the inventory cover is less than or equal to R, where n, which is an integer, is chosen at the time so that the new order cover just exceeds R. In (M, T), each order increases the order cover to M. Finally, in (M, R, T), when on review the order cover does not exceed R, enough is ordered to increase it to M. The (Q, R) system is examined at several levels of complexity, so that the theoretical savings in inventory costs obtained with more exact models could be compared with the increases in computational costs. Since the exact model was preferable for the (Q, R) system, only exact models were derived for the other three systems. Several methods of optimization were tried, but most were found inappropriate for the exact models because of non-convergence. However, one method did work for each of the exact models. Demand is considered continuous, and with one exception, the distribution assumed is the normal distribution truncated so that demand is never less than zero. Shortages are assumed to result in backorders, not lost sales. However, the shortage cost is a function of three items, one of which, the backorder cost, may be either a linear, quadratic or an exponential function of the length of time of a backorder, with or without a period of grace. Lead times are assumed constant or gamma distributed. Lastly, the actual supply quantity is allowed to be randomly distributed.
All the sets of equations were programmed for a KDF 9 computer and the computed performances of the four inventory control procedures are compared under each assumption.
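The (Q, R) policy described above is easy to simulate. This is a toy sketch for intuition only: it keeps the thesis's backordering assumption but uses uniform integer demand and a fixed lead time purely for illustration (the thesis assumes truncated-normal demand and constant or gamma-distributed lead times), and it reports a simple fill rate rather than the thesis's cost expressions.

```python
import random

def simulate_qr(Q, R, mean_demand, lead_time, periods, seed=1):
    """Toy simulation of the (Q, R) policy: a fixed quantity Q is
    ordered whenever order cover (stock on hand plus on order) falls
    to R or below, and arrives after a fixed lead time.  Shortages
    are backordered (on-hand goes negative).  Returns the fill rate:
    the fraction of demanded units met without backordering."""
    rng = random.Random(seed)
    on_hand = R + Q
    pipeline = []                      # outstanding orders: (arrival, qty)
    backordered = demanded = 0
    for t in range(periods):
        on_hand += sum(q for a, q in pipeline if a == t)   # receive orders
        pipeline = [(a, q) for a, q in pipeline if a > t]
        d = rng.randint(0, 2 * mean_demand)                # toy demand draw
        short_before = max(0, -on_hand)
        on_hand -= d
        backordered += max(0, -on_hand) - short_before     # newly unmet units
        demanded += d
        if on_hand + sum(q for _, q in pipeline) <= R:     # order cover test
            pipeline.append((t + lead_time, Q))
    return 1.0 - backordered / demanded if demanded else 1.0
```

Sweeping Q and R in such a simulation, and recording the resulting service level and average stock, mirrors the comparison of control indices carried out across the four systems in the thesis.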