96 results for Stock-prices
Abstract:
This paper addresses the issue of policy evaluation in a context in which policymakers are uncertain about the effects of oil prices on economic performance. I consider models of the economy inspired by Solow (1980), Blanchard and Gali (2007), Kim and Loungani (1992) and Hamilton (1983, 2005), which incorporate different assumptions on the channels through which oil prices have an impact on economic activity. I first study the characteristics of the model space and I analyze the likelihood of the different specifications. I show that the existence of plausible alternative representations of the economy forces the policymaker to face the problem of model uncertainty. Then, I use the Bayesian approach proposed by Brock, Durlauf and West (2003, 2007) and the minimax approach developed by Hansen and Sargent (2008) to integrate this form of uncertainty into policy evaluation. I find that, in the environment under analysis, the standard Taylor rule is outperformed under a number of criteria by alternative simple rules in which policymakers introduce persistence in the policy instrument and respond to changes in the real price of oil.
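As an illustration of the class of simple rules compared here, the sketch below contrasts a standard Taylor rule with an inertial rule that also responds to real oil-price changes. The functional forms are the generic ones from this literature; all coefficient values are hypothetical and not taken from the paper.

```python
# Illustrative simple policy rules (all coefficients are hypothetical).

def taylor_rule(pi, y_gap, r_star=2.0, pi_star=2.0, phi_pi=0.5, phi_y=0.5):
    """Standard Taylor (1993) rule: i = r* + pi + 0.5*(pi - pi*) + 0.5*y_gap."""
    return r_star + pi + phi_pi * (pi - pi_star) + phi_y * y_gap

def augmented_rule(pi, y_gap, d_oil, i_lag, rho=0.8, phi_oil=0.1, **kw):
    """Inertial rule that smooths the policy rate (weight rho on its lag)
    and responds to the change in the real oil price (d_oil)."""
    return rho * i_lag + (1 - rho) * (taylor_rule(pi, y_gap, **kw) + phi_oil * d_oil)
```

With inflation at 3%, a 1% output gap, a 10% oil-price rise and a lagged rate of 4%, the inertial rule moves the rate only part of the way toward its static target, which is the persistence property the paper finds desirable.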
Abstract:
We analyze the realized stock-bond correlation. Gradual transitions between negative and positive stock-bond correlation are accommodated by the smooth transition regression (STR) model. The changes in regime are defined by economic and financial transition variables. Both in-sample and out-of-sample results document that STR models with multiple transition variables outperform STR models with a single transition variable. The most important transition variables are the short rate, the yield spread, and the VIX volatility index. Keywords: realized correlation; smooth transition regressions; stock-bond correlation; VIX index. JEL Classifications: C22; G11; G12; G17
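A minimal sketch of the smooth transition mechanism described above: a logistic function of a transition variable (e.g. the short rate) moves the fitted correlation gradually between two regimes. Parameter names and values are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

def logistic_transition(s, gamma, c):
    """Smooth transition function G(s; gamma, c) in (0, 1):
    gamma controls smoothness, c is the threshold."""
    return 1.0 / (1.0 + np.exp(-gamma * (np.asarray(s, float) - c)))

def str_correlation(s, beta_low, beta_high, gamma, c):
    """Fitted correlation: beta_low in one regime, beta_high in the other,
    with a smooth transition between them driven by the variable s."""
    g = logistic_transition(s, gamma, c)
    return beta_low * (1.0 - g) + beta_high * g
```

At s = c the model is exactly halfway between regimes; for s far from c it approaches one of the two regime correlations, which is how gradual sign changes in the stock-bond correlation are accommodated.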
Abstract:
During the recent period of economic crisis, many countries introduced scrappage schemes to boost the sale and production of vehicles, particularly of vehicles designed to pollute less. In this paper, we analyze the impact of a particular scheme in Spain (Plan2000E) on vehicle prices and sales figures, as well as on the reduction of polluting emissions from vehicles on the road. We treat the introduction of this scheme as an exogenous policy change, and because we can distinguish a control group (non-subsidized vehicles) and a treatment group (subsidized vehicles) before and after the introduction of the Plan, we are able to carry out our analysis as a quasi-natural experiment. Our study reveals that manufacturers increased vehicle prices by the same amount they were granted through the Plan (€1,000). In terms of sales, econometric estimations revealed an increase of almost 5% as a result of the implementation of the Plan. With regard to environmental efficiency, we compared the costs of the program (the amount of money invested) and its benefits (reductions in polluting emissions and additional fiscal revenues) and found that the Plan would only be beneficial if it boosted demand by at least 30%.
Abstract:
The goal of this project is to develop a web application that serves and manages a music store, covering both its physical shop and its online shop. The web application is managed by "administrator" users and used by two types of users: administrators and customers. Its main functions are: adding and editing items; managing incoming and outgoing stock; order management; obtaining data for running the company; minimizing management errors; improving the company's image; expanding the business's scope; displaying items correctly; and making it easy to search for and purchase items.
Predicting random level and seasonality of hotel prices. A structural equation growth curve approach
Abstract:
This article examines the effect of different characteristics of holiday hotels in the sun-and-beach segment on price, from a hedonic-function perspective. Monthly prices for the majority of hotels on the Spanish mainland Mediterranean coast were gathered from tour operator catalogues for May to October 1999. Hedonic functions are specified as random-effect models and parametrized as structural equation models with two latent variables: a random peak-season price and a random width of seasonal fluctuations. Characteristics of the hotel and of the region where it is located are used as predictors of both latent variables. Besides hotel category, the region, distance to the beach, availability of parking and room equipment affect the peak price and also seasonality. 3-star hotels have the highest seasonality and hotels located in the southern regions the lowest, which could be explained by a warmer climate in autumn.
Abstract:
Most of the economic literature has conducted its analysis under the assumption of a homogeneous capital stock. However, capital composition differs across countries. What patterns of capital composition are associated with the world's economies? We perform an exploratory statistical analysis based on compositional data transformed by Aitchison log-ratio transformations, and we use tools for visualizing and measuring statistical estimators of association among the components. The goal is to detect distinctive patterns in the composition. Initial findings include:
1. Sectorial components behaved in a correlated way, with building industries on one side and, less clearly, equipment industries on the other.
2. Full-sample estimation shows a negative correlation between the durable goods and other buildings components, and between the transportation and building industries components.
3. Countries with zeros in some components are mainly low-income countries at the bottom of the income category; they behaved in an extreme way, distorting the main results observed in the full sample.
4. After removing these extreme cases, conclusions do not seem very sensitive to the presence of other isolated cases.
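As a concrete illustration of the data treatment mentioned above, here is a minimal sketch of the centered log-ratio (clr) transform, one member of the Aitchison log-ratio family. The abstract does not specify which variant (alr, clr, ilr) is used, and, as the findings note, zero components must be handled before transforming.

```python
import math

def clr(composition):
    """Centered log-ratio transform of a strictly positive composition
    (parts summing to 1): log of each part over the geometric mean.
    The transformed parts always sum to zero."""
    g = math.exp(sum(math.log(x) for x in composition) / len(composition))
    return [math.log(x / g) for x in composition]
```

After the transform, ordinary correlation and covariance tools can be applied to the components without the spurious-correlation problem that raw compositional shares induce.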
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume on some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a field where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data, sampled at one-minute frequency, from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, whether the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to models involving stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor, yet lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration of the strategies' parameters to adapt to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
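A minimal sketch of the market-neutral side discussed above: an Euler discretization of the Ornstein-Uhlenbeck spread dynamics and a rolling z-score entry rule. Parameter values and the entry threshold are illustrative assumptions; the thesis itself works in MATLAB with intraday data.

```python
import numpy as np

def simulate_ou(x0, theta, mu, sigma, dt, n, seed=0):
    """Euler scheme for the Ornstein-Uhlenbeck spread
    dX = theta*(mu - X) dt + sigma dW."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = x[t-1] + theta * (mu - x[t-1]) * dt \
               + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

def zscore_signal(spread, window, entry=2.0):
    """Pairs-trading signal: short the spread (-1) when its rolling z-score
    exceeds `entry`, long (+1) when below -`entry`, flat (0) otherwise."""
    s = np.asarray(spread, float)
    sig = np.zeros_like(s)
    for t in range(window, len(s)):
        w = s[t-window:t]
        z = (s[t] - w.mean()) / w.std()
        sig[t] = -1.0 if z > entry else (1.0 if z < -entry else 0.0)
    return sig
```

The simulated spread mean-reverts to mu, so large z-score excursions are bets that the spread will revert, which is precisely the mean-reversion behavior the market-neutral strategies try to exploit.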
Abstract:
We define stock market contagion as a significant increase in cross-market linkages after a shock to one country or group of countries. Under this definition we study whether contagion occurred from the U.S. financial crisis to the rest of the major stock markets in the world, using the adjusted (unconditional) correlation coefficient approach (Forbes and Rigobon, 2002), which consists of testing whether average cross-market correlations increase significantly during the relevant period of turmoil. We do not reject the null hypothesis of interdependence in favour of contagion if the increase in correlation merely suggests a continuation of high linkages in all states of the world. Moreover, if contagion occurred, this would justify the intervention of the IMF and the sudden portfolio restructuring during the period under study.
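The Forbes-Rigobon adjustment referenced above can be sketched as follows: the crisis-period correlation is deflated by the relative increase in source-market variance, since higher volatility alone mechanically inflates measured correlation. The input numbers in the test are illustrative.

```python
import math

def forbes_rigobon_adjusted(rho_crisis, var_crisis, var_tranquil):
    """Heteroskedasticity-adjusted (unconditional) correlation of Forbes and
    Rigobon (2002): rho* = rho / sqrt(1 + delta*(1 - rho^2)), where delta is
    the relative increase in the source-market return variance."""
    delta = var_crisis / var_tranquil - 1.0
    return rho_crisis / math.sqrt(1.0 + delta * (1.0 - rho_crisis ** 2))
```

Interdependence rather than contagion is the verdict when the adjusted correlation is no longer significantly higher in the turmoil period than in tranquil times.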
Abstract:
In this paper we investigate the goodness of fit of Kirk's approximation formula for spread option prices in the correlated lognormal framework. Towards this end, we use Malliavin calculus techniques to find an expression for the short-time implied volatility skew of options with random strikes. In particular, we obtain that this skew is very pronounced in the case of spread options with extremely high correlations, which cannot be reproduced by a constant-volatility approximation as in Kirk's formula. This fact agrees with the empirical evidence. Numerical examples are given.
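For reference, a sketch of Kirk's approximation as it is usually stated: the sum F2 + K is treated as lognormal, giving a Black-type formula with an effective constant volatility. This is the standard textbook form, which is exactly the constant-volatility approximation the paper shows cannot reproduce the skew at extreme correlations; inputs below are illustrative.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def kirk_spread_call(f1, f2, k, sigma1, sigma2, rho, t, r):
    """Kirk's approximation for a European spread call on max(F1 - F2 - K, 0):
    F2 + K is treated as lognormal with an effective constant volatility."""
    w = f2 / (f2 + k)
    sigma = math.sqrt(sigma1**2 - 2.0 * rho * sigma1 * sigma2 * w + (sigma2 * w)**2)
    d1 = (math.log(f1 / (f2 + k)) + 0.5 * sigma**2 * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return math.exp(-r * t) * (f1 * norm_cdf(d1) - (f2 + k) * norm_cdf(d2))
```

Note that the effective volatility depends on the correlation rho but not on the strike, which is why the formula produces no implied volatility skew by construction.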
Abstract:
Using a new, long-run dataset on capital account regulations in a group of 16 countries over the period 1890-2001, we investigate why equity return correlations changed over the last century, and we show that correlations increase as financial markets are liberalized. These findings are robust to controlling for both the Forbes-Rigobon bias and global averages in equity return correlations. We test the robustness of our conclusions, and show that greater synchronization of fundamentals is not the main cause of increasing correlations. These results imply that the home bias puzzle may be smaller than traditionally claimed.
Abstract:
According to the Taylor principle a central bank should adjust the nominal interest rate by more than one-for-one in response to changes in current inflation. Most of the existing literature supports the view that by following this simple recommendation a central bank can avoid being a source of unnecessary fluctuations in economic activity. The present paper shows that this conclusion is not robust with respect to the modelling of capital accumulation. We use our insights to discuss the desirability of alternative interest rate rules. Our results suggest a reinterpretation of monetary policy under Volcker and Greenspan: The empirically plausible characterization of monetary policy can explain the stabilization of macroeconomic outcomes observed in the early eighties for the US economy. The Taylor principle in itself cannot.
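The logic of the Taylor principle can be illustrated in the simplest flexible-price (Fisherian) toy model, which is far more stripped-down than the capital-accumulation model of the paper: with the rule i_t = r + phi*pi_t and the Fisher equation i_t = r + E[pi_{t+1}], expected inflation obeys pi_{t+1} = phi*pi_t, so an active rule (phi > 1) makes every nonzero path explode and leaves pi = 0 as the only bounded equilibrium.

```python
def inflation_path(pi0, phi, n=20):
    """Iterate pi_{t+1} = phi * pi_t in the Fisherian toy model.
    With phi > 1 (Taylor principle satisfied) any nonzero starting point
    diverges; with phi < 1 many bounded paths exist (indeterminacy)."""
    path = [pi0]
    for _ in range(n):
        path.append(phi * path[-1])
    return path
```

The paper's point is precisely that this determinacy logic is not robust once capital accumulation is modelled, so the toy calculation above should be read as the conventional benchmark being challenged, not as the paper's result.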
Abstract:
This paper argues that low levels of nutrition impaired cognitive development in industrializing England, and that welfare transfers mitigated the adverse effects of high food prices. Age heaping, derived from census data, is used as an indicator of numeracy. For the cohorts born from 1780 to 1850, we analyse the effect of high grain prices during the Napoleonic Wars. We show that numeracy declined markedly for those born during the war years, especially when wheat was dear. Crucially, where the Old Poor Law provided generous relief payments, the adverse impact of high food prices was mitigated. Finally, we show some tentative evidence that Englishmen born in areas with low income support selected into occupations with lower cognitive requirements.
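Age heaping is conventionally quantified with the Whipple index, the measure underlying numeracy indicators of this kind; a minimal sketch follows (the paper's exact construction may differ).

```python
def whipple_index(ages):
    """Whipple index of age heaping over reported ages 23-62: the share of
    ages ending in 0 or 5, scaled so that 100 = no heaping (one fifth of
    ages naturally end in 0 or 5) and 500 = complete heaping."""
    window = [a for a in ages if 23 <= a <= 62]
    heaped = sum(1 for a in window if a % 5 == 0)
    return 500.0 * heaped / len(window)
```

Higher values indicate more rounding of reported ages and hence, on this interpretation, lower numeracy in the population.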
Abstract:
We study the quantitative properties of a dynamic general equilibrium model in which agents face both idiosyncratic and aggregate income risk, state-dependent borrowing constraints that bind in some but not all periods, and incomplete markets. Optimal individual consumption-savings plans and equilibrium asset prices are computed under various assumptions about income uncertainty. We then investigate whether our general equilibrium model with incomplete markets replicates two empirical observations: the high correlation between individual consumption and individual income, and the equity premium puzzle. We find that, when the driving processes are calibrated according to data on wage income in different sectors of the US economy, the results move in the direction of explaining these observations, but the model falls short of explaining the observed correlations quantitatively. If agents' incomes are assumed independent of each other, the observations can be explained quantitatively.
Abstract:
This paper provides empirical evidence on the explanatory factors affecting introductory prices of new pharmaceuticals in a heavily regulated and highly subsidized market. We collect a data set consisting of all new chemical entities launched in Spain between 1997 and 2005, and model launching prices. We found that, unlike in the US and Sweden, therapeutically "innovative" products are not overpriced relative to "imitative" ones. Price setting is mainly used as a mechanism to adjust for inflation independently of the degree of innovation. The drugs that enter through the centralized EMA approval procedure are overpriced, which may be a consequence of market globalization and international price setting.
Abstract:
We show that the welfare of a representative consumer can be related to observable aggregate data. To a first order, the change in welfare is summarized by (the present value of) the Solow productivity residual and by the growth rate of the capital stock per capita. We also show that productivity and the capital stock suffice to calculate differences in welfare across countries, with both variables computed as log level deviations from a reference country. These results hold for arbitrary production technology, regardless of the degree of product market competition, and apply to open economies as well if TFP is constructed using absorption rather than GDP as the measure of output. They require that TFP be constructed using prices and quantities as perceived by consumers. Thus, factor shares need to be calculated using after-tax wages and rental rates, and will typically sum to less than one. We apply these results to calculate welfare gaps and growth rates in a sample of developed countries for which high-quality TFP and capital data are available. We find that under realistic scenarios the United Kingdom and Spain had the highest growth rates of welfare over our sample period of 1985-2005, but the United States had the highest level of welfare.
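A stylized sketch of the accounting described above, assuming log growth rates and after-tax factor shares (so the shares may sum to less than one). The discounting and the first-order expression here are simplified stand-ins for the paper's exact formulas, and all numbers are illustrative.

```python
def solow_residual_growth(dy, dk, dl, labor_share, capital_share):
    """TFP (Solow residual) log growth from log growth of output (dy),
    capital (dk) and labor (dl); with after-tax wages and rental rates
    the factor shares will typically sum to less than one."""
    return dy - capital_share * dk - labor_share * dl

def welfare_growth(tfp_growths, dk_per_capita, beta=0.95):
    """Stylized first-order welfare change: present value (discount beta)
    of TFP growth plus the growth of the capital stock per capita."""
    pv_tfp = sum(beta**t * g for t, g in enumerate(tfp_growths))
    return pv_tfp + dk_per_capita
```

Cross-country welfare gaps would be computed analogously, with the growth rates replaced by log level deviations from a reference country.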