966 results for New York Stock Exchange.
Abstract:
This paper investigates whether the momentum effect exists in the NYSE energy sector. Momentum is defined as the strategy that buys (sells) the stocks that were the best (worst) performers over a pre-specified past period (the 'look-back' period), constructing equally weighted portfolios. Different momentum strategies are obtained by changing the number of stocks included in these portfolios, as well as the look-back period. Next, their performance is compared against two benchmarks: the equally weighted portfolio consisting of most stocks in the NYSE energy index, and the market portfolio, represented by the S&P 500 index. The results indicate that the momentum effect is strongly present in the energy sector and leads to highly profitable portfolios, improving the risk-reward measures and easily outperforming both benchmarks.
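The portfolio construction described above can be illustrated with a minimal sketch. The snippet below assumes a pandas DataFrame of prices with one column per ticker; the function name, parameter choices, and the random stand-in data are hypothetical, not taken from the paper. It simply ranks look-back returns and forms equally weighted winner and loser legs.

```python
import numpy as np
import pandas as pd

def momentum_portfolios(prices: pd.DataFrame, lookback: int, n_stocks: int):
    """Form equally weighted winner and loser portfolios.

    prices: close prices, one column per ticker (hypothetical input).
    lookback: number of periods used to rank past performance.
    n_stocks: number of stocks in each leg.
    """
    past_return = prices.pct_change(lookback).iloc[-1]       # look-back return per stock
    winners = past_return.nlargest(n_stocks).index           # best performers -> long leg
    losers = past_return.nsmallest(n_stocks).index           # worst performers -> short leg
    long_weights = pd.Series(1.0 / n_stocks, index=winners)  # equal weights
    short_weights = pd.Series(-1.0 / n_stocks, index=losers)
    return long_weights, short_weights

# Toy usage with random data standing in for energy-sector prices
rng = np.random.default_rng(0)
fake = pd.DataFrame(rng.lognormal(0, 0.01, size=(250, 20)).cumprod(axis=0),
                    columns=[f"STK{i}" for i in range(20)])
longs, shorts = momentum_portfolios(fake, lookback=120, n_stocks=5)
print(longs, shorts, sep="\n")
```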
Abstract:
George W. Norris, chairman.
Abstract:
"January 25, 1875."
Abstract:
Volumes for 1887/88-1901/02 published with Report of the New York Produce Exchange.
Abstract:
Mode of access: Internet.
Abstract:
Reports for 1883-1886, 1902/1903- contain no statistical reports, the latter being published in separate volumes
Abstract:
The integration between the London and New York Stock Exchanges is analyzed during the era when they were still developing as asset markets. The domestic securities on both exchanges showed little sustained integration, even when controlling for the different characteristics of stocks, implying that the pricing of securities in the US and the UK was still being driven by local factors. However, there was considerable integration between New York and those London listings that operated internationally. These results place a limit on the view that the pre-World War I period was the first era of globalization in terms of capital markets, and suggest that the listing of foreign securities may be one of the primary mechanisms driving asset market integration.
Abstract:
This dissertation addresses the importance of Corporate Governance and Risk Management for Brazilian companies whose shares are traded on the New York and São Paulo stock exchanges. Its main objectives are: to assess the current stage of these Brazilian companies' compliance with the Sarbanes-Oxley Act, and to confirm the importance of risk management for Corporate Governance, seeking to associate the occurrence of asset losses with risk management tools, and the occurrence of fraud with weak internal control standards and with the rules issued by external regulatory bodies. This academic work, an exploratory study, took as its starting point a bibliographic survey of books and technical articles on Corporate Governance with a focus on risk management. The research was carried out by reading the management reports of the selected companies and examining the applicability of the Sarbanes-Oxley rules. As a conclusion, it was possible to confirm with reasonable certainty that the large losses that drove international companies into bankruptcy resulted from the lack of effective risk management or from a deficient internal control system, combined with the absence of preventive action. On the other hand, despite the efforts of Brazilian companies to adapt to the new requirements for operating in the financial markets of Brazil and the United States, part of the companies surveyed are still implementing Audit Committees, internal control standards and procedures, and the other Corporate Governance practices. Further research on the central theme of this study could deepen the question of the cost-benefit trade-off of implementing Corporate Governance practices, and the question of the effectiveness of corporate management and control systems, weighing the costs incurred in their implementation and maintenance against the benefits obtained. A further study is also proposed to review the responsibilities of regulatory authorities with respect to ex ante and ex post control, a dilemma that remains to be resolved and should motivate future researchers. (AU)
Abstract:
Most research on stock prices is based on the present value model or the more general consumption-based model. When applied to real economic data, both are found unable to account for both the stock price level and its volatility. Three essays here attempt both to build a more realistic model and to check whether there is still room for bubbles in explaining fluctuations in stock prices. In the second chapter, several innovations are simultaneously incorporated into the traditional present value model in order to produce more accurate model-based fundamental prices. These innovations comprise replacing the narrower, more commonly used traditional dividends with broad dividends, a nonlinear artificial neural network (ANN) forecasting procedure for these broad dividends instead of the more common linear forecasting models, and a stochastic discount rate in place of the constant discount rate. Empirical results show that the model described above predicts fundamental prices better than alternative models using a linear forecasting process, narrow dividends, or a constant discount factor. Nonetheless, actual prices are still largely detached from fundamental prices. The bubble-like deviations are found to coincide with business cycles. The third chapter examines possible cointegration of stock prices with fundamentals and non-fundamentals. The output gap is introduced to form the non-fundamental part of stock prices. I use a trivariate Vector Autoregression (TVAR) model and a single-equation model to run cointegration tests between these three variables. Neither of the cointegration tests shows strong evidence of explosive behavior in the DJIA and S&P 500 data. Then, I apply a sup augmented Dickey-Fuller test to check for the existence of periodically collapsing bubbles in stock prices. Such bubbles are found in S&P data during the late 1990s. Employing the econometric tests from the third chapter, I continue in the fourth chapter to examine whether bubbles exist in stock prices of conventional economic sectors on the New York Stock Exchange. The ‘old economy’ as a whole is not found to have bubbles, but periodically collapsing bubbles are found in the Materials and Telecommunication Services sectors and the Real Estate industry group.
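As a rough illustration of the right-tailed recursive unit-root testing mentioned above, the sketch below computes a supremum-ADF statistic over forward-expanding windows with statsmodels. It is a minimal sketch, not the author's exact procedure; the minimum window, lag order, and toy data are assumptions. An unusually large statistic, relative to right-tailed critical values, is read as evidence of explosive (bubble-like) behaviour.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

def sup_adf(series, min_window=40, lags=1):
    """Supremum of ADF statistics over forward-expanding windows (SADF-style)."""
    x = np.asarray(series, dtype=float)
    stats = []
    for end in range(min_window, len(x) + 1):
        # ADF regression with an intercept and a fixed lag order on x[0:end]
        stat = adfuller(x[:end], maxlag=lags, regression="c", autolag=None)[0]
        stats.append(stat)
    return max(stats)

# Toy usage: a random walk that turns mildly explosive halfway through
rng = np.random.default_rng(1)
rw = np.cumsum(rng.normal(size=200))
explosive = [rw[-1]]
for _ in range(100):
    explosive.append(1.03 * explosive[-1] + rng.normal())  # AR root above one
print(sup_adf(np.concatenate([rw, explosive])))
```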
Abstract:
Exchange traded funds (ETFs) have increased significantly in popularity since they were first introduced in 1993. However, there is still much that is unknown about ETFs in the extant literature. This dissertation attempts to fill gaps in the ETF literature through three related essays. In these essays, we compare ETFs to closed-ended mutual funds (CEFs) by decomposing the bid-ask spread into its three components; we examine the intraday shape of ETFs, compare it to the intraday shape of equities, and examine the co-integration factor between ETFs on the London Stock Exchange and the New York Stock Exchange; and we examine the differences between leveraged and unleveraged ETFs by analyzing the impact of liquidity and volatility. These three essays are presented in Chapters 1, 2, and 3, respectively. Chapter one uses the Huang and Stoll (1997) model to decompose the bid-ask spread in CEFs and ETFs for two distinct periods: a normal and a volatile period. We show a higher adverse selection component for CEFs than for ETFs regardless of volatility. However, the adverse selection component increases in magnitude for both ETFs and CEFs in the period of high volatility. Chapter two uses a mix of the Werner and Kleidon (1993) and the Hupperets and Menkveld (2002) methods to obtain the intraday shape of ETFs and analyze co-integration between London and New York trading. We find two different shapes for New York and London ETFs. There also appears to be evidence of co-integration in the overlapping two-hour trading period but not over the entire trading day for the two locations. The third chapter discusses the new class of ETFs called leveraged ETFs. We examine the liquidity and depth differences between unleveraged and leveraged ETFs at the aggregate level and when the leveraged ETFs are classified by leverage multiples of -3, -2, -1, 2, and 3, both for a normal and a volatile period. We find distinct differences between leveraged and unleveraged ETFs at the aggregate level, with leveraged ETFs having larger spreads than unleveraged ETFs. Furthermore, while both leveraged and unleveraged ETFs have larger spreads in high volatility, for the leveraged ETFs the change in magnitude is significantly larger than for the unleveraged ETFs. Among the multiples, the -2 leveraged ETF is the most pronounced in its liquidity characteristics, more so in volatile times.
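A minimal sketch of the intraday-shape computation referenced in Chapter two is given below: it buckets quotes by time of day and averages the proportional quoted spread per bucket. The column names 'bid' and 'ask' and the 30-minute bucket width are assumptions, not taken from the dissertation. A co-integration check over the overlapping London/New York window could similarly be run on matched mid-quote series with statsmodels.tsa.stattools.coint.

```python
import pandas as pd

def intraday_spread_shape(quotes: pd.DataFrame, freq: str = "30min") -> pd.Series:
    """Average proportional quoted spread by intraday time bucket.

    quotes: DataFrame indexed by a DatetimeIndex with 'bid' and 'ask' columns
    (hypothetical column names; adjust to the actual quote feed).
    """
    mid = (quotes["bid"] + quotes["ask"]) / 2.0
    prop_spread = (quotes["ask"] - quotes["bid"]) / mid    # proportional quoted spread
    # Group by time of day only, so all trading days are averaged together
    buckets = quotes.index.floor(freq).time
    return prop_spread.groupby(buckets).mean()
```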
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We will take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX) and long memory is found present in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities.

The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation of this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA.

The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We will pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models will be employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of data sets and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.

The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA will be provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for the five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second-order moment, seem to underestimate the long memory dynamics of the process. This highlights the need for and usefulness of fractal methods in modelling non-Gaussian financial processes with long memory.
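For reference, a minimal MF-DFA sketch along the lines described in Part I is given below. It is an illustration, not the thesis's implementation: it scans segments from the start of the series only (the full method also scans from the end), and the scale grid, polynomial order, and white-noise test data are assumptions.

```python
import numpy as np

def mfdfa(x, scales, q_values, order=1):
    """Minimal MF-DFA: q-th order fluctuation function F_q(s) per (q, scale).

    The generalised Hurst exponent h(q) is the slope of log F_q(s) vs log s;
    q = 2 recovers standard DFA.
    """
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())                  # integrated, mean-centred series
    F = np.zeros((len(q_values), len(scales)))
    for j, s in enumerate(scales):
        n_seg = len(profile) // s
        f2 = []                                        # variance of detrended segments
        for v in range(n_seg):
            seg = profile[v * s:(v + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, order)           # local polynomial trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        f2 = np.array(f2)
        for i, q in enumerate(q_values):
            if q == 0:
                F[i, j] = np.exp(0.5 * np.mean(np.log(f2)))
            else:
                F[i, j] = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
    return F

# Toy usage: estimate h(2) for white noise via a log-log fit (approx. 0.5)
rng = np.random.default_rng(2)
scales = np.array([16, 32, 64, 128, 256])
F = mfdfa(rng.normal(size=4096), scales, q_values=[2])
h2 = np.polyfit(np.log(scales), np.log(F[0]), 1)[0]
print(round(h2, 2))
```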