887 results for Market efficiency hypothesis


Relevance:

90.00%

Publisher:

Abstract:

We study market reaction to the announcements of the country selected to host the Summer and Winter Olympic Games, the World Football Cup, the European Football Cup, and World and Specialized Exhibitions. We generalize previous results by analyzing a large number of mega-events of different types, evaluating the effects for winning and losing countries, investigating the determinants of the observed market reaction, and controlling for the ex ante probability of a country being a successful bidder. Average abnormal returns measured at the announcement date and around the event are not significantly different from zero. Further, we find no evidence that industries that were a priori more likely to extract direct benefits from the event experience significant positive effects. Yet, when we control for anticipation, the stock price reactions around the announcements are significant.

Relevance:

90.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's Degree in Finance from the NOVA – School of Business and Economics.

Relevance:

90.00%

Publisher:

Abstract:

We provide a theoretical framework to explain the empirical finding that the estimated betas are sensitive to the sampling interval even when using continuously compounded returns. We suppose that stock prices have both permanent and transitory components. The permanent component is a standard geometric Brownian motion while the transitory component is a stationary Ornstein-Uhlenbeck process. The discrete time representation of the beta depends on the sampling interval and on two components labelled the "permanent" and "transitory" betas. We show that if no transitory component is present in stock prices, then no sampling interval effect occurs. However, the presence of a transitory component implies that the beta is an increasing (decreasing) function of the sampling interval for more (less) risky assets. In our framework, assets are labelled risky if their "permanent beta" is greater than their "transitory beta", and vice versa for less risky assets. Simulations show that our theoretical results provide good approximations for the means and standard deviations of estimated betas in small samples. Our results can be perceived as indirect evidence for the presence of a transitory component in stock prices, as proposed by Fama and French (1988) and Poterba and Summers (1988).
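A minimal simulation sketch of this mechanism, with hypothetical parameter values not taken from the paper: a market log price built from a random-walk (permanent) part plus an Ornstein-Uhlenbeck (transitory) part, and an asset whose "permanent beta" exceeds its "transitory beta", so the OLS beta estimated from continuously compounded returns tends to rise with the sampling interval.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (illustrative only, not taken from the paper)
n_days = 252 * 40              # 40 years of daily log prices
sigma_q = 0.008                # volatility of the permanent (random-walk) component
kappa, sigma_u = 0.05, 0.010   # mean reversion and volatility of the transitory (OU) component
beta_P, beta_T = 1.4, 0.6      # "permanent" and "transitory" betas of the asset

# Market log price: permanent random walk plus stationary OU component
q = np.cumsum(sigma_q * rng.standard_normal(n_days))
u = np.zeros(n_days)
for t in range(1, n_days):
    u[t] = (1 - kappa) * u[t - 1] + sigma_u * rng.standard_normal()
log_market = q + u

# Asset log price loads differently on the two components (plus idiosyncratic noise)
log_asset = beta_P * q + beta_T * u + 0.005 * rng.standard_normal(n_days)

def estimated_beta(h):
    """OLS beta from continuously compounded returns sampled every h days."""
    rm = np.diff(log_market[::h])
    ra = np.diff(log_asset[::h])
    cov = np.cov(ra, rm)
    return cov[0, 1] / cov[1, 1]

for h in (1, 5, 21, 63, 126):  # daily, weekly, monthly, quarterly, semi-annual sampling
    print(f"sampling interval {h:3d} days -> estimated beta ~ {estimated_beta(h):.2f}")
```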

Relevance:

90.00%

Publisher:

Abstract:

In this paper, we propose several finite-sample specification tests for multivariate linear regressions (MLR) with applications to asset pricing models. We focus on departures from the assumption of i.i.d. errors, at the univariate and multivariate levels, with Gaussian and non-Gaussian (including Student t) errors. The univariate tests studied extend existing exact procedures by allowing for unspecified parameters in the error distributions (e.g., the degrees of freedom in the case of the Student t distribution). The multivariate tests are based on properly standardized multivariate residuals to ensure invariance to MLR coefficients and error covariances. We consider tests for serial correlation, tests for multivariate GARCH and sign-type tests against general dependencies and asymmetries. The procedures proposed provide exact versions of those applied in Shanken (1990), which consist of combining univariate specification tests. Specifically, we combine tests across equations using the MC test procedure to avoid Bonferroni-type bounds. Since non-Gaussian based tests are not pivotal, we apply the “maximized MC” (MMC) test method [Dufour (2002)], where the MC p-value for the tested hypothesis (which depends on nuisance parameters) is maximized (with respect to these nuisance parameters) to control the test’s significance level. The tests proposed are applied to an asset pricing model with observable risk-free rates, using monthly returns on New York Stock Exchange (NYSE) portfolios over five-year subperiods from 1926 to 1995. Our empirical results reveal the following. Whereas univariate exact tests indicate significant serial correlation, asymmetries and GARCH in some equations, such effects are much less prevalent once error cross-equation covariances are accounted for. In addition, significant departures from the i.i.d. hypothesis are less evident once we allow for non-Gaussian errors.
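The Monte Carlo (MC) test step can be sketched as follows: the observed statistic is ranked among statistics simulated under the null, and with a pivotal statistic the resulting p-value yields an exact-level test. This is an illustrative sketch with a toy statistic and hypothetical function names, not the authors' code.

```python
import numpy as np

def mc_pvalue(observed_stat, simulate_stat, n_rep=99, seed=0):
    """Monte Carlo p-value: rank the observed statistic among n_rep statistics
    simulated under the null; the (N+1) convention gives an exact level for
    pivotal statistics."""
    rng = np.random.default_rng(seed)
    sims = np.array([simulate_stat(rng) for _ in range(n_rep)])
    return (np.sum(sims >= observed_stat) + 1) / (n_rep + 1)

# Toy example: a statistic distributed as a sum of five squared standard normals under the null
simulate = lambda rng: np.sum(rng.standard_normal(5) ** 2)
print(mc_pvalue(observed_stat=11.07, simulate_stat=simulate, n_rep=999))
```

For non-pivotal statistics, the MMC version maximizes this p-value over the nuisance parameters before comparing it to the nominal level.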

Relevance:

90.00%

Publisher:

Abstract:

We study the problem of testing the error distribution in a multivariate linear regression (MLR) model. The tests are functions of appropriately standardized multivariate least squares residuals whose distribution is invariant to the unknown cross-equation error covariance matrix. Empirical multivariate skewness and kurtosis criteria are then compared to a simulation-based estimate of their expected value under the hypothesized distribution. Special cases considered include testing multivariate normal, Student t, normal mixture and stable error models. In the Gaussian case, finite-sample versions of the standard multivariate skewness and kurtosis tests are derived. To do this, we exploit simple, double and multi-stage Monte Carlo test methods. For non-Gaussian distribution families involving nuisance parameters, confidence sets are derived for the nuisance parameters and the error distribution. The procedures considered are evaluated in a small simulation experiment. Finally, the tests are applied to an asset pricing model with observable risk-free rates, using monthly returns on New York Stock Exchange (NYSE) portfolios over five-year subperiods from 1926 to 1995.
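A minimal sketch of the criteria involved, assuming the standard Mardia-type multivariate skewness and kurtosis statistics and a hypothesized Gaussian error law; the dimensions and simulation design are illustrative only.

```python
import numpy as np

def mardia_stats(X):
    """Empirical multivariate skewness and kurtosis (Mardia's b1p, b2p)
    computed from standardized residuals X (n observations x p equations)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S_inv = np.linalg.inv(Xc.T @ Xc / n)
    D = Xc @ S_inv @ Xc.T               # matrix of Mahalanobis cross-products
    b1p = (D ** 3).sum() / n ** 2       # multivariate skewness
    b2p = (np.diag(D) ** 2).mean()      # multivariate kurtosis
    return b1p, b2p

# Compare observed criteria to a simulated expectation under the hypothesized
# (here Gaussian) error distribution -- illustrative dimensions: 120 months, 12 portfolios
rng = np.random.default_rng(1)
observed = mardia_stats(rng.standard_normal((120, 12)))
simulated = [mardia_stats(rng.standard_normal((120, 12))) for _ in range(200)]
print("observed:", observed, " simulated mean:", np.mean(simulated, axis=0))
```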

Relevance:

90.00%

Publisher:

Abstract:

In this paper, we propose exact inference procedures for asset pricing models that can be formulated in the framework of a multivariate linear regression (such as the CAPM), allowing for stable error distributions. The normality assumption on the distribution of stock returns is usually rejected in empirical studies, due to excess kurtosis and asymmetry. To model such data, we propose a comprehensive statistical approach which allows for alternative - possibly asymmetric - heavy-tailed distributions without the use of large-sample approximations. The methods suggested are based on Monte Carlo test techniques. Goodness-of-fit tests are formally incorporated to ensure that the error distributions considered are empirically sustainable, and exact confidence sets for the unknown tail area and asymmetry parameters of the stable error distribution are derived from them. Tests for the efficiency of the market portfolio (zero intercepts) which explicitly allow for the presence of (unknown) nuisance parameters in the stable error distribution are also derived. The methods proposed are applied to monthly returns on 12 portfolios of the New York Stock Exchange over the period 1926 to 1995 (five-year subperiods). We find that stable, possibly skewed, distributions provide a statistically significant improvement in goodness-of-fit and lead to fewer rejections of the efficiency hypothesis.

Relevance:

90.00%

Publisher:

Abstract:

We present an ultimatum wage bargaining experiment showing that a trade union that facilitates non-binding communication among workers raises wages by simultaneously increasing employers’ posted offers and toughening the bargaining position of employees, without reducing overall market efficiency.

Relevance:

90.00%

Publisher:

Abstract:

This article reports the results of an experiment examining how demand aggregators can discipline vertically integrated firms (generator and distributor-retailer holdings) that hold a high share of a wholesale electricity market organized as a uniform price double auction (UPDA). We initially develop a treatment in which holding members impose supra-competitive prices and redistribute the resulting profit in equal proportions (50%-50%). Subsequently, we introduce a vertical disintegration (unbundling) treatment with information sharing within the holding, where profits are distributed according to market outcomes. Finally, a third treatment introduces two active demand aggregators with flexible interruptible loads in real time. We find that the introduction of responsive demand aggregators neutralizes market power and increases market efficiency, even beyond what is achieved through vertical disintegration.

Relevance:

90.00%

Publisher:

Abstract:

This paper examines two contrasting interpretations of how bank market concentration (Market Power Hypothesis) and banking relationships (Information Hypothesis) affect three sources of small firm liquidity (cash, lines of credit and trade credit). Supportive of the market power interpretation, we find that in a highly concentrated banking market, small firms hold less cash, have less access to lines of credit, are more likely to be financially constrained, use greater amounts of more expensive trade credit, and face higher penalties for late payment of trade credit. We also find support for the information hypothesis: relationship banking improves small business liquidity, particularly in a concentrated banking market, thereby mitigating the adverse effects of bank market concentration derived from market power. Our results are robust to different cash, lines of credit and trade credit measures and to alternative empirical approaches.

Relevance:

90.00%

Publisher:

Abstract:

This work examines share repurchase announcements on BOVESPA between 30/05/1997 and 31/10/1998 and seeks to verify whether such events have a positive or negative effect on stock prices. The various hypotheses that have been used to justify the strategy adopted by firms are presented, together with the expected effect on stock prices under each of them. Using a sample of 110 events, several segmentations are performed, each with different adjustments for infrequent trading. An event study is then conducted on these segmentations. A Student's t test is used to assess the effect of the event on the stock price. The profiles of the AR (abnormal return) and the CAR (cumulative abnormal return) are also evaluated for additional inferences about the effect of the repurchase announcement. Overall, the results support the signalling hypothesis. The CAR profile clearly indicates a positive announcement effect; however, this effect does not appear immediately but builds progressively over the roughly 25 days following the event. The observed pattern suggests that the market reacts slowly to information, which argues against the presence of strong-form or semi-strong-form efficiency in the Brazilian market. Moreover, the results of the various segmentations are not materially affected by the different adjustments for infrequent trading, which reinforces the findings. Banks are also found to behave differently from the rest of the sample, although this result should be interpreted with caution given the sample size involved.
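A minimal sketch of the abnormal-return machinery behind such an event study: market-model expected returns estimated on a pre-event window, abnormal returns cumulated after the announcement, and a simple t-statistic. Variable names and window lengths are illustrative, not taken from the study, and no adjustment for infrequent trading is shown.

```python
import numpy as np

def event_study_car(asset_ret, market_ret, event_idx, est_win=120, ev_win=25):
    """Market-model event study: estimate alpha/beta on a pre-event window,
    then cumulate abnormal returns over the post-announcement window."""
    est = slice(event_idx - est_win, event_idx)
    beta, alpha = np.polyfit(market_ret[est], asset_ret[est], 1)

    ev = slice(event_idx, event_idx + ev_win)
    ar = asset_ret[ev] - (alpha + beta * market_ret[ev])   # abnormal returns
    car = np.cumsum(ar)                                     # cumulative abnormal returns

    # simple t-statistic for the mean abnormal return in the event window
    t_stat = ar.mean() / (ar.std(ddof=1) / np.sqrt(len(ar)))
    return car, t_stat
```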

Relevance:

90.00%

Publisher:

Abstract:

The objective of this work is to examine whether technical analysis adds value to investment decisions. Four technical trading systems were tested using confidence intervals built by bootstrap sampling inference and consistent with the null hypothesis of weak-form market efficiency. More specifically, the results of each system applied to the original price series of each asset were obtained. These results were then compared with the average results obtained when the same systems were applied to 1,000 series of each asset simulated as a random walk. If markets are weak-form efficient, there would be no reason for the results on the original series to be superior to those on the simulated series. The empirical results suggest that the systems tested were unable to anticipate the future using only past data, although some of them generated sizeable returns.
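A minimal sketch of this kind of bootstrap benchmark, assuming a toy moving-average rule (not one of the four systems tested) and random-walk series generated by permuting log returns: the rule's return on the original series is compared with its distribution over the simulated series.

```python
import numpy as np

def ma_rule_return(prices, window=50):
    """Toy trading rule: long when the price is above its moving average, flat otherwise."""
    rets = np.diff(np.log(prices))
    ma = np.convolve(prices, np.ones(window) / window, mode="valid")
    signal = (prices[window - 1:-1] > ma[:-1]).astype(float)   # position set one day ahead
    return float(np.sum(signal * rets[window - 1:]))

def random_walk_benchmark(prices, n_sim=1000, seed=0):
    """Apply the same rule to random-walk series built by permuting the log returns."""
    rng = np.random.default_rng(seed)
    rets = np.diff(np.log(prices))
    results = []
    for _ in range(n_sim):
        shuffled = rng.permutation(rets)
        sim_prices = prices[0] * np.exp(np.cumsum(np.insert(shuffled, 0, 0.0)))
        results.append(ma_rule_return(sim_prices))
    return np.array(results)   # compare the original rule return to this distribution
```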

Relevance:

90.00%

Publisher:

Abstract:

The Brazilian banking industry has been transformed over recent decades by a phenomenon known as consolidation, marked by the concentration of the market in a few institutions. The objective of this work is to test empirically the causes of this process in Brazil. The two hypotheses tested were formulated by Berger, Dick et al. (2007): the efficiency hypothesis holds that technological advances improve the competitiveness of large banks relative to small ones, so that the performance of small banks is sacrificed by this factor. The hubris hypothesis, on the other hand, holds that managers undertake mergers and acquisitions for the larger bonuses paid by large conglomerates, but the diseconomies of scale exceed the competitive gains from technology and, over time, small banks come to compete at an advantage. Panel data models were used to test whether there were competitive pressures during the consolidation process. The conclusion is that the efficiency hypothesis better explains the Brazilian phenomenon empirically, as it does the American one. The pressure to reduce financial revenues was the determining factor behind the deleterious effects suffered by small banks as the weight of large banks in the industry increased.

Relevance:

90.00%

Publisher:

Abstract:

One of the main topics in the study of capital markets is the debate over the market efficiency hypothesis, whose predictions diverge from the observed price behaviour of most assets. This work analyzes the behaviour of the main bitcoin price index (BPI) from July 2010 to September 2014. First, the random walk hypothesis is tested for the BPI. Next, long-range correlations in the financial time series are examined using the Hurst exponent (H), which was originally introduced to measure correlations in natural phenomena and later extended to finance. The study estimates H by distinct methods, notably R/S analysis and DFA. To track the exponent over time, a 90-day moving window shifted in steps of 10 days is used. For the analysis at different scales, the value of H is computed, for each day, over the preceding 360, 180 and 90 days. The results show that the BPI exhibits persistent long memory over practically the entire period analyzed. In addition, the multi-scale analysis suggests the possibility of anticipating turbulent events in the index over the same period. Finally, the fractal market hypothesis was confirmed for the historical return series of the BPI.
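A minimal sketch of the R/S estimate of the Hurst exponent and of a rolling-window version of it; the window and step sizes mirror the abstract, but the implementation details (chunk sizes, regression) are illustrative choices rather than those of the study.

```python
import numpy as np

def hurst_rs(returns, min_chunk=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    regress log(R/S) on log(n) over a range of sub-sample sizes n."""
    returns = np.asarray(returns, dtype=float)
    sizes, rs_vals = [], []
    n = min_chunk
    while n <= len(returns) // 2:
        chunks = []
        for start in range(0, len(returns) - n + 1, n):
            chunk = returns[start:start + n]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviations from the mean
            r = dev.max() - dev.min()               # range
            s = chunk.std(ddof=1)                   # standard deviation
            if s > 0:
                chunks.append(r / s)
        sizes.append(n)
        rs_vals.append(np.mean(chunks))
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope    # H > 0.5 suggests persistent long memory

def rolling_hurst(returns, window=90, step=10):
    """Hurst exponent over a 90-day window moved forward in 10-day steps."""
    return [hurst_rs(returns[i:i + window])
            for i in range(0, len(returns) - window + 1, step)]
```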

Relevance:

90.00%

Publisher:

Abstract:

This paper studies the empirical effects of risk classification in the mandatory third-party motor insurance market in Germany following the European Union’s 1994 directive deregulating insurance tariffs. We find evidence that inefficient risk categories were selected while potentially efficient information was dismissed. Risk classification generally did not improve the efficiency of contracting or the composition of insureds in this market. These findings are partly explained by the continued existence of institutional restraints in this market, such as compulsory fixed coverage and unitary owner insurance.

Relevance:

90.00%

Publisher:

Abstract:

The neuronal causes of individual differences in mental abilities such as intelligence are complex and profoundly important. Understanding these abilities has the potential to facilitate their enhancement. The purpose of this study was to identify functional brain network characteristics and their relation to psychometric intelligence. In particular, we examined whether the functional network exhibits efficient small-world attributes (high clustering and short path length) and whether these small-world network parameters are associated with intellectual performance. High-density resting state electroencephalography (EEG) was recorded in 74 healthy subjects to analyze graph-theoretical functional network characteristics at an intracortical level. Raven's Advanced Progressive Matrices were used to assess intelligence. We found that the clustering coefficient and path length of the functional network are strongly related to intelligence: the more intelligent the subjects, the more their functional brain network resembles a small-world network. We further identified the parietal cortex as a main hub of this resting state network, as indicated by increased degree centrality associated with higher intelligence. Taken together, this is the first study to substantiate the neural efficiency hypothesis as well as the Parieto-Frontal Integration Theory (P-FIT) of intelligence in the context of functional brain network characteristics; these are currently the most established intelligence theories in neuroscience. Our findings reveal robust evidence of an efficiently organized resting state functional brain network underlying highly productive cognition.
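A minimal sketch of the graph-theoretical quantities named here (clustering coefficient, characteristic path length, degree-centrality hubs), computed with networkx on a toy Watts-Strogatz graph standing in for a thresholded functional connectivity network; the graph and its size are illustrative, not the study's EEG-derived networks.

```python
import networkx as nx

def small_world_metrics(G):
    """Clustering coefficient and characteristic path length of a network graph G;
    a small-world network combines high clustering with short path length."""
    clustering = nx.average_clustering(G)
    path_length = nx.average_shortest_path_length(G)
    return clustering, path_length

# Toy stand-in for a thresholded functional connectivity network
G = nx.connected_watts_strogatz_graph(n=64, k=6, p=0.1, seed=0)
print(small_world_metrics(G))

# Hub identification via degree centrality (the study points to the parietal cortex)
centrality = nx.degree_centrality(G)
print("most central node:", max(centrality, key=centrality.get))
```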