838 results for American Stock Exchange.
Abstract:
Each part has a special title page only.
Abstract:
George W. Norris, chairman.
Abstract:
This thesis examines the effect of rights issue announcements on the stock prices of companies listed on the Kuala Lumpur Stock Exchange (KLSE) between 1987 and 1996. The emphasis is on establishing whether the KLSE is semi-strong-form efficient with respect to rights issue announcements and on checking whether the implications of corporate finance theories about the effect of such an event hold in the context of an emerging market. Once the effect is established, potential determinants of abnormal returns identified by previous empirical work and corporate finance theory are analysed. By examining 70 companies making clean rights issue announcements, this thesis aims to shed light on some important issues in long-term corporate financing. Event study analysis is used to test the efficiency of the Malaysian stock market, while cross-sectional regression analysis is carried out to identify possible explanatory variables for the effect of rights issue announcements. To ensure the results presented are not contaminated, econometric and statistical issues raised in both analyses have been taken into account. Given the small amount of empirical research conducted in this part of the world, the results of this study should be of use to investors, security analysts, corporate financial managers, regulators and policy makers, as well as anyone interested in capital-market-based research on an emerging market. It is found that the Malaysian stock market is not semi-strong-form efficient, since there exists a persistent non-zero abnormal return. This finding is not consistent with the hypothesis that security returns adjust rapidly to reflect new information. The result may be influenced by the sample, which consists mainly of below-average-size companies that tend to be thinly traded; nevertheless, these issues have been addressed.
Another important finding to emerge from the study is some evidence suggesting that insider trading activity existed in this market. In addition, when the effect of rights issue announcements is compared with the predictions of corporate finance theories for the sign of abnormal returns, the signalling model, the asymmetric information model, the perfect substitution hypothesis and Scholes' information hypothesis cannot be supported.
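The event-study analysis described above can be sketched in a few lines. This is a generic market-model illustration, not the thesis's actual procedure; the window lengths, the function name `market_model_car`, and the synthetic data are assumptions:

```python
import numpy as np

def market_model_car(stock_ret, market_ret, event_idx, est_win=100, evt_win=5):
    """Estimate abnormal returns around an event using the market model.

    stock_ret, market_ret: 1-D arrays of returns aligned in time.
    event_idx: index of the announcement day.
    est_win: length of the pre-event estimation window.
    evt_win: days on each side of the event to cumulate over.
    """
    # Estimation window ends just before the event window begins.
    est = slice(event_idx - evt_win - est_win, event_idx - evt_win)
    # Fit R_stock = alpha + beta * R_market on the estimation window.
    beta, alpha = np.polyfit(market_ret[est], stock_ret[est], 1)
    evt = slice(event_idx - evt_win, event_idx + evt_win + 1)
    # Abnormal return = actual return minus market-model expected return.
    ar = stock_ret[evt] - (alpha + beta * market_ret[evt])
    return ar, ar.sum()  # daily ARs and the cumulative abnormal return (CAR)
```

In an event study like the one in the thesis, the daily abnormal returns would then be averaged across the sample firms (here, the 70 announcing companies) and tested against zero.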
Abstract:
This thesis focuses on three main questions. The first uses Exchange-Traded Funds (ETFs) to evaluate the adverse selection cost estimates obtained from spread decomposition models. The second compares the Probability of Informed Trading (PIN) in Exchange-Traded Funds to that of control securities. The third examines intra-day ETF trading patterns. The spread decomposition models evaluated are Glosten and Harris (1988); George, Kaul, and Nimalendran (1991); Lin, Sanger, and Booth (1995); Madhavan, Richardson, and Roomans (1997); and Huang and Stoll (1997). Using the characteristics of ETFs, it is shown that only the Glosten and Harris (1988) and Madhavan et al. (1997) models provide theoretically consistent results. When the PIN measure is employed, ETFs are shown to have greater PINs than control securities. The investigation of intra-day trading patterns shows that return volatility and trading volume have a U-shaped intra-day pattern. A study of trading systems shows that ETFs on the American Stock Exchange (AMEX) have a U-shaped intra-day pattern of bid-ask spreads, while ETFs on NASDAQ do not. Specifically, ETFs on NASDAQ have their highest bid-ask spreads at the market opening and their lowest in the middle of the day; at the close, the bid-ask spread of ETFs on NASDAQ is slightly elevated compared to mid-day.
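Intra-day spread patterns like the U-shape reported above are typically measured by bucketing quotes into intraday intervals and averaging the relative quoted spread per bucket. A minimal sketch, in which the function name and the 30-minute bucket width are illustrative assumptions rather than the thesis's design:

```python
import numpy as np

def intraday_spread_profile(times_min, bid, ask, bucket=30):
    """Average relative quoted spread by intraday time bucket.

    times_min: minutes since the open for each quote observation.
    bid, ask: matching arrays of quoted prices.
    Returns (bucket start minutes, mean relative spread per bucket).
    """
    mid = (bid + ask) / 2
    rel_spread = (ask - bid) / mid          # quoted spread relative to midquote
    buckets = (times_min // bucket).astype(int)
    uniq = np.unique(buckets)
    means = np.array([rel_spread[buckets == b].mean() for b in uniq])
    return uniq * bucket, means
```

Plotting the bucket means across the trading day reveals the shape of the pattern (U-shaped, declining, or flat).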
Abstract:
Price impact functions show the relative price change caused by an order of a given value. Knowledge of the price impact function helps market participants predict the price impact of their future orders, estimate the extra trading cost arising from price changes, and design optimal trading algorithms. The method devised by the authors allows market participants to determine a virtual price impact function simply and quickly without knowledge of the full order book: the authors present the relationship between the price impact function and liquidity measures, and show how the price impact function can be estimated from the time series of the Budapest Liquidity Measure (BLM).
The methodology is illustrated on the time series of OTP shares, estimating a virtual price impact function from the share's BLM data for the period 1 January 2007 to 3 June 2011. In the empirical analysis the authors examine the evolution of the price impact function over time and its basic statistical properties, which yields a picture of the past behaviour of the transaction costs that arise in the absence of liquidity. The information obtained may, for instance, help traders in dynamic portfolio optimization.
Abstract:
Alarming statistics show that only 10.2 percent of companies listed on the Swedish stock exchange have achieved gender equality in their top management. In effect, women are discriminated against, since men dominate these positions of power. The study is qualitative in nature and aims at a deeper understanding of, and a knowledge contribution on, how gender-equal companies have achieved gender diversity in their top management. Some of Sweden's highest-ranking business leaders were interviewed to obtain their view, and that of the companies they represent, on what the most important requirements for this achievement have been. The study's main result shows that strong core values and a strong corporate culture are the basic preconditions for a successful gender equality strategy. A deliberate or emergent strategy can then be implemented successfully, and it is mainly the impact of structural barriers that determines which strategy a company uses. For a deliberate strategy, the following measures are crucial in addition to core values and corporate culture: commitment to gender equality, a specific plan with clear objectives, and a consciously objective recruitment process. The results on these two factors and three measures also identify a specific order that must be followed to achieve gender diversity in top management. These findings aim to contribute, in the near future, to a more gender-equal Sweden.
Abstract:
International research shows that low-volatility stocks have beaten high-volatility stocks in terms of returns for decades on multiple markets. This deviation from the traditional risk-return framework is known as the low-volatility anomaly. This study focuses on explaining the anomaly and on measuring how strongly it appears on the NASDAQ OMX Helsinki stock exchange. The data consist of all listed companies from 2001 until close to 2015. The methodology closely follows Baker and Haugen (2012): companies are sorted into deciles according to 3-month volatility, and monthly returns are then calculated for these volatility groups. The annualized return for the lowest-volatility decile is 8.85 %, while the highest-volatility decile destroys wealth at a rate of -19.96 % per annum. Results are similar for quintiles, which contain a larger number of companies and thus dilute outliers. The observation period captures the financial crisis of 2007-2008 and the European debt crisis, which shows up as a low main-index annual return of 1 % but at the same time demonstrates the success of the low-volatility strategy. The low-volatility anomaly is driven by multiple causes, such as leverage-constrained trading and managerial incentives, both of which prompt investment in risky assets; behavioral factors also carry major weight in sustaining the anomaly.
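The Baker and Haugen (2012) style sorting described above — ranking stocks by trailing volatility and averaging the following period's returns by group — can be sketched as follows. The function name and the equal-split grouping are illustrative assumptions, not the study's code:

```python
import numpy as np

def decile_mean_returns(vol, next_ret, n_groups=10):
    """Sort assets into volatility groups and average next-period returns.

    vol: array of trailing (e.g. 3-month) volatilities, one per asset.
    next_ret: array of each asset's following-month return.
    Returns mean next-month returns per group, index 0 = lowest volatility.
    """
    order = np.argsort(vol)                   # assets from calm to volatile
    groups = np.array_split(order, n_groups)  # near-equal-sized deciles
    return np.array([next_ret[g].mean() for g in groups])
```

Repeating this sort each month and compounding the group means produces the annualized decile returns quoted in the abstract.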
Abstract:
This research investigates whether the major stock markets in Latin America (Brazil, Mexico, Chile, Colombia, Peru and Argentina) exhibited herd behavior over the period January 2, 2002 to June 30, 2014, using the variation of returns, overall and by sector, in the most representative stock market index of each country, with the model proposed by Christie and Huang (1995). The results do not reveal any herd behavior in the overall market or in the sectors of the markets examined in the study.
Abstract:
Financial processes may possess long memory, and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties pose the challenges of detecting memory and of modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA), which can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices on the American Stock Exchange (AMEX), while long memory is found to be present in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I.
These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets and then use cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.
The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
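The memory-detection step in Part I rests on DFA-type scaling. A textbook sketch of standard DFA (the q = 2 special case of MF-DFA, with linear detrending) is shown below; the names and scale choices are assumptions for illustration, not the thesis's implementation:

```python
import numpy as np

def dfa(x, scales):
    """Standard detrended fluctuation analysis (q = 2 case of MF-DFA).

    Returns the scaling exponent from a log-log fit of the fluctuation
    function F(s) against window size s: ~0.5 for white noise,
    > 0.5 indicates long-range dependence.
    """
    profile = np.cumsum(x - np.mean(x))  # integrate the demeaned series
    fluct = []
    for s in scales:
        n_win = len(profile) // s
        f2 = []
        for w in range(n_win):
            seg = profile[w * s:(w + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)  # local linear trend in this window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))  # second-moment fluctuation F(s)
    # Scaling exponent = slope of log F(s) versus log s.
    alpha, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return alpha
```

MF-DFA generalises this by replacing the second-moment average of the window variances with q-th-order moments and by using higher-order polynomial detrending.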
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Information Technology Governance (GTI, from the Portuguese Governança de Tecnologia da Informação) has become a more pressing topic in the Brazilian business environment, especially after the worldwide repercussions of the crash of the American Nasdaq stock exchange, the new world configuration following the attacks on the United States on September 11, 2001, and the enactment of the Sarbanes-Oxley Act in 2002. This management model has been implemented both by organizations seeking better management control of Information Technology and by those that must meet the legal compliance requirements imposed by regulatory bodies. Implementing it is a complex and challenging process, given the need to identify the best GTI model among the practices existing in the business world; companies must compose the set of practices that best fits their realities. This study aims to analyze the GTI models adopted by organizations in Brazil and to evaluate their results, maturity levels, benefits, difficulties and trends, thereby contributing to a better understanding of the subject and easing the shortage of studies in this area in Brazil. This empirical study is based on a multiple-case-study methodology applied to five companies, exploring how this management model has been adopted and which structures, methodologies and market practices have been used to make it effective. In this context, the results obtained are presented, along with the aspects involved in implementing GTI models in organizations, the difficulties encountered, the factors that have conditioned their performance, and their trends.