963 results for asset pricing model
Abstract:
This dissertation contributes to the rapidly growing body of empirical research in operations management. It contains two essays, tackling two different sets of operations management questions that are motivated by and built on field data sets from two very different industries: air cargo logistics and retailing.
The first essay, based on a data set obtained from a world-leading third-party logistics company, develops a novel and general Bayesian hierarchical learning framework for estimating customers' spillover learning, that is, customers' learning about the quality of a service (or product) from their previous experiences with similar yet not identical services. We apply our model to the data set to study how customers' experiences from shipping on a particular route affect their future decisions about shipping not only on that route, but also on other routes serviced by the same logistics company. We find that customers indeed borrow experiences from similar but different services to update the quality beliefs that determine their future purchase decisions, and that these service quality beliefs have a significant impact on those decisions. Moreover, customers are risk averse: they are averse not only to experience variability but also to belief uncertainty (i.e., customers' uncertainty about their own beliefs). Finally, belief uncertainty affects customers' utilities more than experience variability does.
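The spillover-learning idea can be illustrated with a toy Bayesian update (a minimal sketch; the normal-normal form, the similarity weight `corr`, and all numbers are illustrative assumptions, not the essay's actual hierarchical model):

```python
def update_beliefs(mu, var, signal, noise_var, corr):
    """One spillover update: an observed service experience `signal` on
    route 0 updates the quality belief on route 0 via standard
    normal-normal Bayesian updating, and updates the belief on a similar
    route 1 only partially, scaled by the similarity weight corr in [0, 1]."""
    # Kalman-style gain for the observed route
    gain0 = var[0] / (var[0] + noise_var)
    mu0 = mu[0] + gain0 * (signal - mu[0])
    var0 = (1 - gain0) * var[0]
    # Spillover: the similar route moves only a fraction corr of that step
    mu1 = mu[1] + corr * gain0 * (signal - mu[1])
    var1 = var[1] * (1 - corr * gain0)
    return [mu0, mu1], [var0, var1]

# A good experience (signal = 1.0) on route 0 also lifts the belief on
# the similar route 1, but by less, and shrinks both belief variances.
mu, var = update_beliefs([0.0, 0.0], [1.0, 1.0], 1.0, 1.0, 0.5)
```

Belief uncertainty in the essay's sense corresponds to the posterior variances, which shrink with every (possibly borrowed) experience.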
The second essay is based on a data set obtained from a large Chinese supermarket chain, which contains sales data as well as both wholesale and retail prices of unpackaged perishable vegetables. Recognizing the special characteristics of this particular product category, we develop a structural estimation model in a discrete-continuous choice framework. Building on this framework, we then study an optimization model for joint pricing and inventory management of multiple products, which aims to improve the company's profit from direct sales while reducing food waste and thus improving social welfare.
Collectively, the studies in this dissertation provide useful modeling ideas, decision tools, insights, and guidance for firms to utilize vast sales and operations data to devise more effective business strategies.
Abstract:
This paper examines the effects of higher-order risk attitudes and statistical moments on the optimal allocation of risky assets within the standard portfolio choice model. We derive expressions for the optimal proportion of wealth invested in the risky asset and show that they are functions of the third- and fourth-order moments of portfolio returns as well as of the investor's risk preferences of prudence and temperance. We illustrate the relative importance of these higher-order effects in the decisions of expected-utility maximizers using data for the US.
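A minimal numerical illustration of why higher moments matter (an assumed CRRA utility and made-up return samples, not the paper's closed-form expressions): two return distributions with identical mean and variance but opposite skewness lead an expected-utility maximizer to different risky shares.

```python
def optimal_share(returns, rf=0.01, gamma=5.0, steps=100):
    """Grid-search the share of wealth in the risky asset that maximizes
    average CRRA utility u(w) = w**(1-gamma) / (1-gamma) over a return
    sample. With gamma > 0, CRRA preferences exhibit prudence and
    temperance, so the chosen share responds to sample skewness and
    kurtosis, not just to mean and variance."""
    best_a, best_u = 0.0, float("-inf")
    for i in range(steps + 1):
        a = i / steps
        u = sum((1.0 + rf + a * (r - rf)) ** (1 - gamma) / (1 - gamma)
                for r in returns)
        if u > best_u:
            best_a, best_u = a, u
    return best_a

# Same mean (2.5%) and variance, opposite skewness (mirrored samples)
right_skew = [-0.05, -0.05, -0.05, 0.25]   # occasional large gain
left_skew = [0.10, 0.10, 0.10, -0.20]      # occasional large loss
```

A prudent investor allocates more to the right-skewed asset even though a mean-variance criterion cannot distinguish the two samples.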
Abstract:
This paper examines assumptions about future prices used in real estate applications of DCF models. We confirm both the widespread reliance on an ad hoc rule of increasing period-zero capitalization rates by 50 to 100 basis points to obtain terminal capitalization rates and the inability of that rule to project future real estate pricing. To understand how investors form expectations about future prices, we model the spread between the contemporaneous period-zero going-in and terminal capitalization rates, as well as the spread between terminal rates assigned in period zero and going-in rates assigned in period N. Our regression results confirm statistical relationships between the terminal and next-holding-period going-in capitalization rate spread and the period-zero discount rate, although other economically significant variables are statistically insignificant. Linking terminal capitalization rates by assumption to going-in capitalization rates implies that investors view future real estate pricing with myopic expectations. We discuss alternative specifications, devoid of such linkage, that align more closely with a rational-expectations view of future real estate pricing.
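The ad hoc rule the paper documents is simple arithmetic (a sketch with hypothetical numbers; the 75 bps spread is just one point in the 50-100 bps range):

```python
def terminal_value(noi_next, going_in_cap, spread_bps=75):
    """Terminal value under the ad hoc rule: terminal (exit) cap rate =
    period-zero going-in cap rate + a fixed 50-100 bps spread, then
    capitalize the NOI expected in the year after the holding period."""
    terminal_cap = going_in_cap + spread_bps / 10_000
    return noi_next / terminal_cap

# A 5.25% going-in cap plus 75 bps implies a 6.00% terminal cap,
# so $600,000 of year-N+1 NOI caps out at a $10,000,000 terminal value.
v = terminal_value(600_000, 0.0525)
```

Because the exit rate is tied mechanically to the period-zero rate, no information arriving between period zero and period N can move it, which is the myopic-expectations feature the paper criticizes.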
Abstract:
We find evidence that conflicts of interest are pervasive in the asset management business owned by investment banks. Using data from 1990 to 2008, we compare the alphas of mutual funds, hedge funds, and institutional funds operated by investment banks and non-bank conglomerates. We find that, while no difference exists in performance by fund type, being owned by an investment bank reduces alphas by 46 basis points per year in our baseline model. Making lead loans increases alphas, but the dispersion of fees across portfolios decreases alphas. The economic loss is $4.9 billion per year.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
The purpose of the study was to explore how a public IT-services transferor organization comprised of autonomous entities can effectively develop and organize its data-center cost recovery mechanisms in a fair manner. The lack of a well-defined model for charges and a cost recovery scheme can cause various problems; for example, one entity may end up subsidizing the costs of another. Transfer pricing is in the best interest of each autonomous entity in a CCA. While transfer pricing plays a pivotal role in setting the prices of services and intangible assets, transaction cost economics (TCE) focuses on the arrangement at the boundary between entities. TCE is concerned with the cost, autonomy, and cooperation issues of an organization; the theory examines the factors that influence intra-firm transaction costs and attempts to expose the problems involved in determining the charges or prices of transactions. This study was carried out as a single case study in a public organization. The organization intended to transfer the IT services of its affiliated public entities and was in the process of establishing a joint municipal data center. Nine semi-structured interviews, including two pilot interviews, were conducted with experts and managers of the case company and its affiliated entities. The purpose of these interviews was to explore the charging and pricing issues of intra-firm transactions. To process and summarize the findings, the study employed qualitative techniques with multiple methods of data collection. By reviewing TCE theory and a sample of the transfer pricing literature, the study created an IT-services pricing framework as a conceptual tool for illustrating the structure of transferred costs. Antecedents and consequences of the transfer price based on TCE were developed, and an explanatory fair-charging model was eventually developed and suggested.
The findings of the study suggest that a chargeback system is an inappropriate scheme for an organization with affiliated autonomous entities. The main contribution of the study is the application of transfer pricing methodologies in the public sphere, where tax considerations do not apply.
Abstract:
In this dissertation I quantify residential behavioral responses to interventions designed to reduce electricity demand at different periods of the day. In the first chapter, I examine the effect of information provision coupled with bimonthly billing, monthly billing, and in-home displays, as well as a time-of-use (TOU) pricing scheme, measuring consumption over each month of the Irish Consumer Behavior Trial. I find that time-of-use pricing with real-time usage information reduces electricity usage by up to 8.7 percent during peak times at the start of the trial, but the effect decays over the first three months, after which the in-home display group is indistinguishable from the monthly treatment group. The monthly and bimonthly billing treatments are not statistically different from one another. These findings suggest that for electricity generators who wish to decrease expenses and consumption, increasing billing reports to a monthly frequency may be more cost-effective than providing in-home displays. In the following chapter, I examine the response of residential households after exposure to time-of-use tariffs at different hours of the day. I find that these treatments reduce electricity consumption during peak hours by almost four percent, significantly lowering demand. Within the model, I find evidence of overall conservation in electricity use. In addition, weekday peak reductions appear to carry over to the weekend, when peak pricing is not present, suggesting changes in consumer habits. The final chapter of my dissertation imposes a system-wide time-of-use plan to analyze the potential reduction in carbon emissions from load shifting, based on the Ireland and Northern Ireland Single Electricity Market. I find that CO2 emissions savings are highest during the winter months, when load demand is highest and dirtier power plants are scheduled to meet peak demand. TOU pricing shifts usage from peak to off-peak periods, and this shifted load can be met with cleaner and cheaper electricity from imports, high-efficiency gas units, and hydro units.
Abstract:
This dissertation focuses mainly on coordinated pricing and inventory management problems; the related background is provided in Chapter 1. Several periodic-review models are then discussed in Chapters 2, 3, 4, and 5, respectively. Chapter 2 analyzes a deterministic single-product model in which a price adjustment cost is incurred whenever the current selling price differs from that of the previous period. We develop exact algorithms for the problem under different conditions and find that the computational complexity varies significantly with the cost structure. Moreover, our numerical study indicates that dynamic pricing strategies may outperform static pricing strategies even when the price adjustment cost accounts for a significant portion of the total profit. Chapter 3 develops a single-product model in which demand in a period depends not only on the current selling price but also on past prices through the so-called reference price. Strongly polynomial-time algorithms are designed for the case with no fixed ordering cost, and a heuristic with an error-bound estimate is proposed for the general case. Moreover, we illustrate through numerical studies that incorporating the reference price effect into coordinated pricing and inventory models can have a significant impact on firms' profits. Chapter 4 discusses the stochastic version of the model in Chapter 3 when customers are loss averse. It extends the associated results in the literature and proves that the reference-price-dependent base-stock policy is optimal under certain conditions. Instead of dealing with specific problems, Chapter 5 establishes the preservation of supermodularity in a class of optimization problems.
This property and its extensions include several existing results in the literature as special cases, and provide powerful tools as we illustrate their applications to several operations problems: the stochastic two-product model with cross-price effects, the two-stage inventory control model, and the self-financing model.
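The reference-price mechanism of Chapter 3 can be sketched as exponential smoothing of past prices feeding into demand (a toy sketch; the linear demand form and all parameter values are illustrative assumptions, not the dissertation's calibrated model):

```python
def simulate(prices, r0=10.0, theta=0.7, a=100.0, b=8.0, c=3.0):
    """Reference-price demand sketch: customers carry a reference price
    r_t formed by exponential smoothing of past posted prices, and
    demand falls in the posted price p_t but rises when p_t sits below
    the reference (a perceived deal):
        r_{t+1} = theta * r_t + (1 - theta) * p_t
        d_t     = a - b * p_t + c * (r_t - p_t)
    Returns the demand path and the final reference price."""
    r, demands = r0, []
    for p in prices:
        demands.append(a - b * p + c * (r - p))
        r = theta * r + (1 - theta) * p
    return demands, r

# A price cut below the reference boosts demand beyond the pure price
# effect, but it also drags the reference price down for future periods.
d, r = simulate([10.0, 9.0])
```

This intertemporal link is what makes pricing and inventory decisions in such models genuinely dynamic: today's discount lowers tomorrow's reference point.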
Abstract:
This Ph.D. thesis contains four essays in mathematical finance, focusing on pricing Asian options (Chapter 4), pricing futures and futures options (Chapters 5 and 6), and time-dependent volatility in futures options (Chapter 7). In Chapter 4, the applicability of the comonotonicity approach of Albrecher et al. (2005) is investigated in the context of various benchmark models for equities and commodities. Instead of the classical Levy models of Albrecher et al. (2005), the focus is on the Heston stochastic volatility model, the constant elasticity of variance (CEV) model, and the Schwartz (1997) two-factor model. It is shown that the method delivers rather tight upper bounds for the prices of Asian options in these models and, as a by-product, delivers super-hedging strategies that can be easily implemented. In Chapter 5, two types of three-factor models that allow volatility to be stochastic are studied for valuing commodity futures contracts. Both models have closed-form solutions for futures prices. However, it is shown that Model 2 is better than Model 1 theoretically and also performs very well empirically; moreover, Model 2 can easily be implemented in practice. In comparison to the Schwartz (1997) two-factor model, Model 2 is shown to have unique advantages, making it a good choice for pricing commodity futures contracts. Furthermore, if the two models are used together, a more accurate price for commodity futures contracts can be obtained in most situations. In Chapter 6, the applicability of the asymptotic approach developed in Fouque et al. (2000b) is investigated for pricing commodity futures options in a Schwartz (1997) multi-factor model featuring both stochastic convenience yield and stochastic volatility.
It is shown that the zero-order term in the expansion coincides with the Schwartz (1997) two-factor term with averaged volatility, and an explicit expression for the first-order correction term is provided. With empirical data from the natural gas futures market, it is also demonstrated that a significantly better calibration can be achieved by using the correction term, as compared to the standard Schwartz (1997) two-factor expression, at virtually no extra effort. In Chapter 7, a new pricing formula is derived for futures options in the Schwartz (1997) two-factor model with time-dependent spot volatility. The formula can also be used to back out the time-dependent spot volatility from futures option prices observed in the market. Furthermore, the limitations of the method used to find the time-dependent spot volatility are explained, and it is shown how to verify its accuracy.
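As a concrete building block, a zero-order lognormal term of the kind described above has the Black (1976) form for a European futures call (a self-contained sketch; the input values are illustrative, not calibrated to the natural gas data):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black76_call(F, K, sigma, T, r):
    """Black (1976) price of a European call on a futures contract:
    c = exp(-r*T) * (F*N(d1) - K*N(d2)). In an averaged-volatility
    approximation, sigma would be the effective (averaged) volatility,
    with stochastic-volatility effects entering as correction terms."""
    d1 = (math.log(F / K) + 0.5 * sigma * sigma * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return math.exp(-r * T) * (F * norm_cdf(d1) - K * norm_cdf(d2))

# At-the-money: futures at 100, strike 100, 20% vol, 1 year, 5% rate
price = black76_call(100.0, 100.0, 0.2, 1.0, 0.05)
```

Note that the futures price enters undiscounted inside the bracket; only the payoff is discounted, which is what distinguishes Black (1976) from the spot Black-Scholes formula.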
Abstract:
Ph.D. in the Faculty of Business Administration
Abstract:
What have we learnt from the 2006-2012 crisis, including events such as the subprime crisis, the bankruptcy of Lehman Brothers, and the European sovereign debt crisis, among others? It is usually assumed that for firms with a quoted CDS, the CDS is the key factor in establishing the credit risk premium for a new financial asset. Thus, the CDS is a key element for any investor in taking relative-value opportunities across a firm's capital structure. In the first chapter we study the most relevant aspects of the microstructure of the CDS market in terms of pricing, to form a clear idea of how this market works. We consider such an analysis a necessary step in establishing a solid base for the empirical studies carried out in the remaining chapters. In its document "Basel III: A global regulatory framework for more resilient banks and banking systems", the Basel Committee sets the requirement of a capital charge for credit valuation adjustment (CVA) risk in the trading book, together with the methodology for computing the capital requirement. This regulatory requirement has added extra pressure for in-depth knowledge of the CDS market, and it motivates the analysis performed in this thesis. The problem arises in estimating the credit risk premium for those counterparties without a directly quoted CDS in the market: how can we estimate the credit spread for an issuer without a CDS? In addition, given the highly volatile period in the credit market in recent years and, in particular, after the default of Lehman Brothers on 15 September 2008, we observe large outliers in the distribution of credit spreads across the different combinations of rating, industry, and region. After an exhaustive analysis of the results from the different models studied, we reach the following conclusions. Hierarchical regression models clearly fit the data much better than non-hierarchical regressions.
Furthermore, we generally prefer the median model (50%-quantile regression) to the mean model (standard OLS regression) because of its robustness when assigning a price to a new credit asset without a quoted spread, minimizing the "inversion problem". Finally, an additional fundamental reason to prefer the median model is the typical right-skewed distribution of CDS spreads...
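The robustness argument for the median model can be seen with a toy right-skewed spread sample (the basis-point figures are invented for illustration):

```python
import statistics

# Right-skewed sample of CDS spreads (bps): a few distressed names drag
# the mean far above the typical name, while the median stays with the
# bulk of the distribution -- so a median (50%-quantile) fit assigns a
# more representative spread to a new asset than an OLS (mean) fit.
spreads = [80, 90, 95, 100, 110, 120, 130, 900, 1500]
mean_bps = statistics.mean(spreads)      # pulled up by the outliers
median_bps = statistics.median(spreads)  # stays near the bulk
```

The same asymmetry holds within each rating/industry/region bucket, which is why quantile regression rather than OLS drives the preferred hierarchical specification.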
Abstract:
The MARS (Media Asset Retrieval System) Project is the collaborative effort of public broadcasters, libraries, and schools in the Puget Sound region to create a digital online resource that provides access to content produced by public broadcasters via the public libraries.

Convergence Consortium
The Convergence Consortium is a model for community collaboration, including organizations such as public broadcasters, libraries, museums, and schools in the Puget Sound region, that assess the needs of their constituents and pool resources to develop solutions to meet those needs. Specifically, the archives of public broadcasters have been identified as significant resources for local communities and nationally. These resources can be accessed on the broadcasters' websites and through libraries, used by schools, and integrated with text and photographic archives from other partners.

MARS' goal
Create an online resource that provides effective access to the content produced locally by KCTS (Seattle PBS affiliate) and KUOW (Seattle NPR affiliate). The broadcasts will be made searchable using the CPB Metadata Element Set (under development) and controlled vocabularies (to be developed). This will ensure a user-friendly search and navigation mechanism and user satisfaction. Furthermore, the resource can search the local public library's catalog concurrently and provide the user with relevant TV material, radio material, and books on a given subject. The ultimate goal is to produce a model that can be used in cities around the country. The current phase of the project assesses the community's needs, analyzes the current operational systems, and makes recommendations for the design of the resource.

Deliverables
• Literature review of the issues surrounding the organization, description, and representation of media assets
• Needs assessment report of internal and external stakeholders
• Profile of the systems in the area of managing and organizing media assets for public broadcasting nationwide

Activities
• Analysis of information-seeking behavior
• Analysis of collaboration within the respective organizations
• Analysis of the scope and context of the proposed system
• Examination of the availability of information resources and the exchange of resources among users
Abstract:
The MARS (Media Asset Retrieval System) Project is a collaboration between public broadcasters, libraries, and schools in the Puget Sound region to assess the needs of their constituents and pool resources to develop solutions to meet those needs. The Project's ultimate goal is to create a digital online resource that will provide access to content produced by public broadcasters and libraries. The MARS Project is funded by a grant from the Corporation for Public Broadcasting (CPB) Television Future Fund.

Convergence Consortium
The Convergence Consortium is a model for community collaboration, including representatives from public broadcasting, libraries, and schools in the Puget Sound region. They meet regularly to consider collaborative efforts that will be mutually beneficial to their institutions and constituents. Specifically, the archives of public broadcasters have been identified as significant resources that can be accessed through libraries, used by schools, and integrated with text and photographic archives from other partners. Using the work-centered framework, we collected data through interviews with nine engineers and observation of their searching while they performed their regular, job-related searches on the Web. The framework was used to analyze the data on two levels: 1) the activities, organizational relationships, and constraints of work domains, and 2) users' cognitive and social activities and their subjective preferences during searching.
Abstract:
The first paper sheds light on the informational content of high-frequency data and daily data. I assess the economic value of the two families of models by comparing their performance in forecasting asset volatility through the Value at Risk metric. In running the comparison, this paper introduces two key assumptions: jumps in prices and a leverage effect in volatility dynamics. Findings suggest that high-frequency data models do not exhibit superior performance over daily data models. In the second paper, building on Majewski et al. (2015), I propose an affine discrete-time model, labeled VARG-J, which is characterized by a multifactor volatility specification. In the VARG-J model, volatility experiences periods of extreme movements through a jump factor modeled as an Autoregressive Gamma Zero process. Estimation under the historical measure is done by quasi-maximum likelihood and the Extended Kalman Filter. This strategy allows both volatility factors to be filtered out by introducing a measurement equation that relates realized volatility to latent volatility. The risk-premium parameters are calibrated using call options written on the S&P 500 Index. The results clearly illustrate the important contribution of the jump factor to the pricing performance of options and the economic significance of the volatility jump risk premium. In the third paper, I analyze whether there is empirical evidence of contagion at the bank level, measuring the direction and size of contagion transmission between European markets. To understand and quantify contagion transmission in the banking market, I estimate the econometric model of Aït-Sahalia et al. (2015), in which contagion is defined as the within- and between-country transmission of shocks and asset returns are modeled directly as a Hawkes jump-diffusion process. The empirical analysis indicates clear evidence of contagion from Greece to other European countries as well as self-contagion in all countries.
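The Value at Risk metric used for the model comparison in the first paper can be sketched under a Gaussian assumption (a deliberate simplification; the chapter's models involve jumps and leverage, which this toy formula ignores):

```python
def gaussian_var_95(mu, sigma):
    """One-day 95% Value at Risk under a normal return assumption: the
    loss threshold exceeded with 5% probability, -(mu - z*sigma) with
    z = 1.645 (the 95% standard normal quantile). A volatility forecast
    from any model -- high-frequency or daily -- plugs in directly, so
    competing forecasts can be compared on the same risk metric."""
    z = 1.645
    return -(mu - z * sigma)

# A zero-mean forecast with 2% daily volatility implies a 95% VaR of 3.29%
var_95 = gaussian_var_95(0.0, 0.02)
```

In a backtest, each model's VaR series is scored against realized returns (e.g., by counting exceedances), which is the sense in which the metric measures the economic value of the volatility forecasts.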