Abstract:
Preface

The starting point for this work, and eventually the subject of the whole thesis, was the question of how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for characteristic functions, has made them the models of choice for many theoretical constructions and practical applications. At the same time, estimating the parameters of stochastic volatility jump-diffusion models is not a straightforward task. The problem stems from the variance process, which is not observable. Several estimation methodologies deal with the problem of latent variables, and one appeared particularly interesting: the Continuous Empirical Characteristic Function (ECF) estimator based on the unconditional characteristic function, which, in contrast to the other methods, requires neither discretization nor simulation of the process. However, the procedure had been derived only for stochastic volatility models without jumps. Thus, it became the subject of my research. This thesis consists of three parts, each written as an independent and self-contained article. At the same time, the questions answered by the second and third parts of this work arise naturally from the issues investigated and the results obtained in the first. The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps in both the asset price and variance processes. The estimation procedure is based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, as well as of the whole thesis, is the closed-form expression for the joint unconditional characteristic function for stochastic volatility jump-diffusion models.
The empirical part of the chapter suggests that, besides stochastic volatility, jumps in both the mean and the volatility equations are relevant for modelling returns of the S&P500 index, which was chosen as a general representative of the stock asset class. Hence, the next question is: what jump process should be used to model returns of the S&P500? In the framework of affine jump-diffusion models, the decision about the jump process boils down to defining the intensity of the compound Poisson process, either a constant or some function of the state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, at least three distributions of the jump size are currently used for the asset log-prices: normal, exponential and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if the S&P500 index is to be modelled by a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either the exponential or the double exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data. The idea of testing the efficiency of the Continuous ECF estimator on simulated data had already appeared when the first estimation results of the first chapter were obtained. In the absence of a benchmark or any ground for comparison, there is no way to be sure that our parameter estimates coincide with the true parameters of the models. The conclusion of the second chapter provides one more reason to perform such a test. Thus, the third part of this thesis concentrates on estimating the parameters of stochastic volatility jump-diffusion models from asset price time series simulated from various "true" parameter sets.
The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of recovering the true parameters, and the third chapter proves that our estimator indeed has this ability. Once it is clear that the Continuous ECF estimator based on the unconditional characteristic function works, the next question naturally arises: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used in its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure. In practice, however, this relationship is not so straightforward, owing to increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, bi-dimensional, one. It turns out that the preference for one or the other depends on the model to be estimated; thus, the computational effort can be reduced in some cases without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter of this thesis, in addition to what was discussed above, compares the performance of estimators based on bi- and three-dimensional unconditional characteristic functions on simulated data.
It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, owing to the limitations of the computing power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for estimating the parameters of stochastic volatility jump-diffusion models.
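The estimation idea described in this abstract can be illustrated in a toy setting: match the empirical characteristic function of the data to the model's characteristic function over a grid of frequencies. Below is a minimal sketch that uses an i.i.d. normal sample instead of a stochastic volatility jump-diffusion model; the frequency grid, weighting, and optimizer are illustrative assumptions, not the thesis procedure.

```python
# Sketch of empirical characteristic function (ECF) estimation on a toy
# model: fit N(mu, sigma^2) by matching the model CF to the empirical CF.
# The grid and objective are illustrative, not the thesis's estimator.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=2.0, size=5000)   # "observed" sample

u_grid = np.linspace(0.1, 1.0, 10)                 # frequencies for matching
ecf = np.array([np.mean(np.exp(1j * u * data)) for u in u_grid])

def model_cf(u, mu, sigma):
    """Characteristic function of N(mu, sigma^2)."""
    return np.exp(1j * u * mu - 0.5 * (u * sigma) ** 2)

def objective(theta):
    mu, log_sigma = theta                          # log-sigma keeps sigma > 0
    cf = model_cf(u_grid, mu, np.exp(log_sigma))
    return np.sum(np.abs(ecf - cf) ** 2)           # squared CF distance

res = minimize(objective, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

With 5,000 observations the estimates land close to the true (1.0, 2.0); for the affine models of the thesis the normal CF would be replaced by the joint unconditional CF, which is the closed-form object the first chapter derives.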
Abstract:
We analysed the specific case of how information in the financial press influences economic bubbles. We found considerable flaws in the information market due to factors on both sides: on the demand side, the predominance of so-called "irrational investors" (herding); on the supply side, the problem that the sources of information are biased and feed back on themselves. A financial bubble is a persistent deviation between the real value of a financial asset and its market price, of speculative origin, fed by the illusion of the owners of these financial assets that they will profit because future prices must be higher than previous ones. Economic information in the media presents three problems. First, it is information generated by the companies themselves. Second, the information circuit is fed back, which creates a problem of informative independence, particularly serious in the case of banks, which often act as creditors. Third, informative biases appear in favour of companies in regulated sectors, which dominate economic news coverage.
Abstract:
A multi-sided Böhm-Bawerk assignment game (Tejada, to appear) is a model for a multilateral market with a finite number of perfectly complementary indivisible commodities owned by different sellers, and inflexible demand and supply functions. We show that for each such market game there is a unique vector of competitive prices for the commodities that is vertical syndication-proof, in the sense that, at those prices, syndication of sellers each owning a different commodity is neither beneficial nor detrimental for the buyers. Since, moreover, the benefits obtained by the agents at those prices correspond to the nucleolus of the market game, we provide a syndication-based foundation for the nucleolus as an appropriate solution concept for market games. For other solution concepts a syndicate can be disadvantageous, and there is no escape from Aumann's paradox (Aumann, 1973). We further show that vertical syndication-proofness and horizontal syndication-proofness, in which sellers of the same commodity collude, are incompatible requirements under some mild assumptions. Our results build on an interesting link between multi-sided Böhm-Bawerk assignment games and bankruptcy games (O'Neill, 1982). We identify a particular subset of Böhm-Bawerk assignment games and show that it is isomorphic to the whole class of bankruptcy games. This isomorphism enables us to show the uniqueness of the vector of vertical syndication-proof prices for the whole class of Böhm-Bawerk assignment markets using well-known results on bankruptcy problems.
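The link to bankruptcy games can be made concrete: for bankruptcy games (O'Neill, 1982), the nucleolus is known to coincide with the awards of the Talmud rule (Aumann and Maschler, 1985). Below is a minimal sketch of that rule, tested on the classical estate-division example from the Talmud; this is a standard textbook construction, not code from the paper.

```python
# Talmud rule for a bankruptcy problem (E, claims). For bankruptcy games
# its awards coincide with the nucleolus (Aumann & Maschler, 1985).
# Standard textbook construction, illustrated here for reference.

def cea(E, claims):
    """Constrained equal awards: each claimant gets min(c_i, lam),
    with the common award lam found by bisection so awards sum to E."""
    lo, hi = 0.0, max(claims)
    for _ in range(100):
        lam = (lo + hi) / 2
        if sum(min(c, lam) for c in claims) < E:
            lo = lam
        else:
            hi = lam
    return [min(c, lam) for c in claims]

def talmud(E, claims):
    """Talmud rule: CEA on half-claims below half the total claim,
    and its self-dual counterpart (CEA on the losses) above it."""
    half = [c / 2 for c in claims]
    if E <= sum(claims) / 2:
        return cea(E, half)
    loss = cea(sum(claims) - E, half)
    return [c - l for c, l in zip(claims, loss)]

awards = talmud(200, [100, 200, 300])   # classical example: (50, 75, 75)
```

The self-duality visible in the second branch (awards for estate E mirror losses for estate C − E) is the structural property that makes such rules tractable for the uniqueness arguments mentioned in the abstract.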
Abstract:
This work carries out an empirical evaluation of the impact of the main mechanism for regulating the prices of medicines in the UK on a variety of pharmaceutical price indices. The empirical evidence shows that the overall impact of the rate-of-return cap appears to have been slight or even null, and in any case that the impact would differ across therapeutic areas. These empirical findings suggest that the price regulation has managed to encourage UK-based firms' diversification into many therapeutic areas.
Abstract:
Expanded abstract: The Iowa Department of Transportation (IA DOT) is finalizing research to streamline field inventory/inspection of culverts by Maintenance and Construction staff while maximizing the use of tablet technologies. The project began in 2011 to develop new best practices for field staff to assist in the inventory, inspection and maintenance of assets along the roadway. The team has spent the past year working through the complexities of identifying the most appropriate tablet hardware for field data collection. A small-scale deployment of tablets occurred in spring 2013 to collect several safety-related assets (culverts, signs, guardrail, and incidents). Data can be collected in disconnected or connected modes, and there is an associated desktop environment where data can be viewed and queried after being synced into the master database. The development of a deployment plan and related workflow processes is underway, which will eventually feed information into IA DOT's larger asset management system and make the information available for decision making. The team is also working with the IA DOT Design Office on Computer Aided Drafting (CAD) data processing, and with the IA DOT Construction Office on a new digital as-built plan process, to leverage the complete data life-cycle so that information can be developed once and leveraged by Maintenance staff farther along in the process.
Abstract:
We explore the linkage between equity and commodity markets, focusing in particular on its evolution over time. We document that a country's equity market value has significant out-of-sample predictive ability for the future global commodity price index for several primary commodity-exporting countries. The out-of-sample predictive ability of the equity market appears around the 2000s. The results are robust to using several control variables as well as firm-level equity data. Finally, our results indicate that exchange rates are a better predictor of commodity prices than equity markets, especially at very short horizons.
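The out-of-sample exercise described in this abstract can be sketched as expanding-window one-step-ahead forecasts benchmarked against the historical mean and summarized by an out-of-sample R². The data-generating process and variable names below are illustrative assumptions, not the paper's data or specification.

```python
# Sketch of an out-of-sample predictability test: expanding-window OLS
# forecasts of a commodity index from an equity-market predictor, scored
# against the historical-mean benchmark. Synthetic data for illustration.
import numpy as np

rng = np.random.default_rng(1)
T = 200
equity = rng.normal(size=T)                         # predictor, known at t
commod = 0.5 * equity + 0.3 * rng.normal(size=T)    # target with real signal

def oos_r2(x, y, start=50):
    """1 - SSE(model)/SSE(mean); positive values mean x helps predict y."""
    sse_model, sse_mean = 0.0, 0.0
    for t in range(start, len(y)):
        X = np.column_stack([np.ones(t), x[:t]])    # fit only on past data
        beta, *_ = np.linalg.lstsq(X, y[:t], rcond=None)
        pred = beta[0] + beta[1] * x[t]
        sse_model += (y[t] - pred) ** 2
        sse_mean += (y[t] - y[:t].mean()) ** 2
    return 1.0 - sse_model / sse_mean

r2 = oos_r2(equity, commod)                         # positive: predictability
```

A time-varying relationship, such as the predictive ability "appearing around the 2000s", would show up as a negative out-of-sample R² on the early subsample and a positive one later.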
Abstract:
We estimate the response of stock prices to exogenous monetary policy shocks using a vector-autoregressive model with time-varying parameters. Our evidence points to protracted episodes in which, after a short-run decline, stock prices increase persistently in response to an exogenous tightening of monetary policy. That response is clearly at odds with the "conventional" view on the effects of monetary policy on bubbles, as well as with the predictions of bubbleless models. We also argue that it is unlikely that such evidence can be accounted for by an endogenous response of the equity premium to the monetary policy shocks.
Abstract:
The Iowa Crop and Livestock Report
Abstract:
This project resulted in the development of a framework for making asset management decisions on low-volume bridges. The research focused on low-volume bridges located in the agricultural counties of Iowa because recent research has shown that these counties have the greatest percentage of structurally deficient bridges in the nation. Many of the same counties also have the highest crop yields in the state, creating a situation where detours caused by deficient bridges on farm-to-market roads increase the cost to transport the crops. Thus, the research proposed the use of social return on investment (SROI), a tool used by international institutions such as the World Bank, as an asset management metric to gauge the socioeconomic impact of structurally deficient bridges on the state, in an effort to provide quantified justification to fund improvements on low-volume assets such as these rural bridges. The study found that combining SROI with current asset management metrics like average daily traffic (ADT) made it possible to prioritize the bridges in such a way that the limited resources available are allocated in a manner that promotes a more equitable distribution and that directly benefits the user, in this case Iowa farmers. The result is a system that more closely aligns itself with the spirit of MAP-21, in that infrastructure investments are used to facilitate economic growth for Iowa's agricultural economy.
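Combining SROI with a traffic metric such as ADT can be sketched as a weighted priority score over normalized inputs. The bridge data, weights, and min-max normalization below are hypothetical illustrations, not the project's actual metric.

```python
# Hypothetical sketch: rank bridges by a weighted blend of normalized
# SROI and ADT. Data, weights, and normalization are illustrative only.

def prioritize(bridges, w_sroi=0.6, w_adt=0.4):
    """Return bridge ids sorted from highest to lowest priority score."""
    def norm(vals):  # min-max scale each metric to [0, 1]
        lo, hi = min(vals), max(vals)
        return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in vals]
    sroi_n = norm([b["sroi"] for b in bridges])
    adt_n = norm([b["adt"] for b in bridges])
    scored = [(w_sroi * s + w_adt * a, b["id"])
              for s, a, b in zip(sroi_n, adt_n, bridges)]
    return [bid for _, bid in sorted(scored, reverse=True)]

bridges = [
    {"id": "B1", "adt": 40, "sroi": 3.2},   # low traffic, costly crop detour
    {"id": "B2", "adt": 120, "sroi": 1.1},  # higher traffic, modest SROI
    {"id": "B3", "adt": 15, "sroi": 0.4},
]
ranking = prioritize(bridges)               # B1 outranks B2 despite lower ADT
```

The point of the blend is visible in the example: a pure ADT ranking would put B2 first, while adding SROI promotes the low-volume bridge whose closure imposes the largest detour cost on farm-to-market traffic.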
Abstract:
This project resulted in the development of a proof of concept for a features inventory process to be used by field staff. The resulting concept is adaptable for different asset classes (e.g. culverts, guardrail) and able to leverage existing DOT resources such as the videolog and LRS and our current technology platforms including Oracle and our GIS web infrastructure. The concept examined the feasibility of newly available technologies, such as mobile devices, while balancing ease of use in the field. Implementation and deployment costs were also important considerations in evaluating the success of the project. These project funds allowed the pilot to address the needs of two DOT districts. A report of findings was prepared, including recommendations for or against full deployment of the pilot solution.
Final Report (SPR Project 90-00-RB10-012) on the Maintenance Asset Management Project Phase II, 2013
Abstract:
This project resulted in the development of a proof of concept for a features inventory process to be used by field staff. The resulting concept is adaptable for different asset classes (e.g. culverts, guardrail) and able to leverage existing DOT resources such as the videolog and LRS and our current technology platforms including Oracle and our GIS web infrastructure. The concept examined the feasibility of newly available technologies, such as mobile devices, while balancing ease of use in the field. Implementation and deployment costs were also important considerations in evaluating the success of the project. These project funds allowed the pilot to address the needs of two DOT districts. A report of findings was prepared, including recommendations for a full deployment of the field data collection solution.
Abstract:
Most local agencies in Iowa currently make their pavement treatment decisions based on their limited experience, due primarily to the lack of a systematic decision-making framework and decision-aid tool. The lack of objective condition assessment data on agency pavements also contributes to this problem. This study developed a systematic pavement treatment selection framework for local agencies to assist them in selecting the most appropriate treatment and to help justify their maintenance and rehabilitation decisions. The framework is based on an extensive literature review of the various pavement treatment techniques in terms of their technical applicability and limitations, meaningful practices of neighboring states, and the results of a survey of local agencies. The treatment selection framework involves three steps: pavement condition assessment, selection of technically feasible treatments using decision trees, and selection of the most appropriate treatment considering the return on investment (ROI) and other non-economic factors. An Excel-based spreadsheet tool that automates the treatment selection framework was also developed, along with a standalone user guide for the tool. The Pavement Treatment Selection Tool (PTST) for Local Agencies allows users to enter the severity and extent levels of existing distresses and then recommends a set of technically feasible treatments. The tool also evaluates the ROI of each feasible treatment and, if necessary, it can also evaluate the non-economic value of each treatment option to help determine the most appropriate treatment for the pavement. It is expected that the framework and tool will help local agencies significantly improve their pavement asset management practices and make more defensible, economically sound decisions on pavement treatment selection.
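The two-stage selection logic described in this abstract, a decision tree for technical feasibility followed by an ROI comparison, can be sketched as follows. The thresholds, treatment names, and ROI figures are hypothetical placeholders, not the PTST's actual rules.

```python
# Sketch of a two-stage pavement treatment selector: a decision tree
# narrows to technically feasible treatments, then ROI picks among them.
# All thresholds and ROI values are hypothetical illustrations.

def feasible_treatments(pci, cracking_severity):
    """Stage 1: decision tree on condition data (illustrative thresholds)."""
    if pci >= 70 and cracking_severity == "low":
        return ["crack seal", "fog seal"]          # preservation candidates
    if pci >= 50:
        return ["chip seal", "microsurfacing"]     # minor rehabilitation
    return ["mill and overlay", "reconstruction"]  # major rehabilitation

def select_treatment(pci, cracking_severity, roi):
    """Stage 2: among feasible options, pick the highest-ROI treatment."""
    options = feasible_treatments(pci, cracking_severity)
    return max(options, key=lambda t: roi.get(t, 0.0))

roi = {"crack seal": 4.0, "fog seal": 2.5, "chip seal": 3.1,
       "microsurfacing": 2.8, "mill and overlay": 1.6, "reconstruction": 0.9}
choice = select_treatment(pci=75, cracking_severity="low", roi=roi)
```

Separating feasibility from economics in this way keeps the decision defensible: a high-ROI treatment can never be recommended for a pavement whose condition rules it out at stage 1.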