873 results for Net expected return
Abstract:
This paper examines the relationships between uncertainty and the perceived usefulness of traditional annual budgets versus flexible budgets in 95 Swedish companies. We hypothesize that both the perceived usefulness of the annual budget and the attitudes towards more flexible budget alternatives are influenced by the uncertainty that the companies face. Our study distinguishes between two separate kinds of uncertainty: exogenous stochastic uncertainty (deriving from the firm’s environment) and endogenous deterministic uncertainty (caused by the strategic choices made by the firm itself). Based on a structural equation modelling analysis of data from a mail survey, we found that the more pronounced the exogenous uncertainty a company faces, the more pronounced the expected trend towards flexibility in the budget system; conversely, the more endogenous uncertainty it faces, the more negative its attitudes towards budget flexibility. We also found that these relationships were not present with regard to attitudes towards the usefulness of the annual budget. It is noteworthy, however, that there was a significant negative relationship between the perceived usefulness of the annual budget and budget flexibility. Thus, our results seem to indicate that the degree of flexibility in the budget system is influenced by general attitudes towards the usefulness of traditional budgets, by the actual degree of exogenous uncertainty a company faces, and by the strategy that it executes.
Abstract:
This study develops a real options approach for analyzing the optimal risk adoption policy in an environment where adoption means a switch from one stochastic flow representation to another. We establish that increased volatility need not decelerate investment, as predicted by the standard literature on real options, once the underlying volatility of the state is made endogenous. We prove that for a decision maker with a convex (concave) objective function, increased post-adoption volatility increases (decreases) the expected cumulative present value of the post-adoption profit flow, which consequently decreases (increases) the option value of waiting and, therefore, accelerates (decelerates) current investment.
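A compact way to see the mechanism claimed above (a sketch in generic notation, not taken from the paper): write the expected cumulative present value of the post-adoption profit flow $\pi(X_t)$, discounted at rate $r$, as

$$
V \;=\; \mathbb{E}\!\left[\int_0^{\infty} e^{-rt}\,\pi(X_t)\,dt\right].
$$

If $\pi$ is convex, a mean-preserving increase in the post-adoption volatility of $X_t$ raises $\mathbb{E}[\pi(X_t)]$ at every date by Jensen's inequality and hence raises $V$; a larger $V$ lowers the option value of waiting and thus accelerates adoption, with the signs reversed in the concave case.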
Abstract:
We characterize the optimal reserves, and the generated probability of a bank run, as a function of the penalty imposed by the central bank, the probability of depositors’ liquidity needs, and the return on outside investment opportunities.
Abstract:
This study contributes to the executive stock option literature by looking at the factors driving the introduction of such a compensation form at the firm level. Using a discrete decision model, I test the explanatory power of several agency-theory-based variables and find strong support for the predictability of the form of executive compensation. Ownership concentration and liquidity are found to have a significant negative effect on the probability of stock option adoption. Furthermore, I find evidence of CEO ownership, institutional ownership, investment intensity, and historical market return having a significant and positive relationship to the likelihood of adopting an executive stock option program.
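A minimal sketch of the kind of discrete decision (logit) model described above, fitted with statsmodels; the file name and variable names are illustrative placeholders, not the paper's data set:

```python
# Hypothetical firm-level data: one row per firm, with a 0/1 adoption indicator
# and the agency-theory variables named in the abstract (names are assumptions).
import pandas as pd
import statsmodels.api as sm

firms = pd.read_csv("firm_data.csv")

X = sm.add_constant(firms[["ownership_concentration", "liquidity", "ceo_ownership",
                           "institutional_ownership", "investment_intensity",
                           "market_return"]])
y = firms["adopted_stock_options"]   # 1 if an executive stock option program exists

logit = sm.Logit(y, X).fit()
print(logit.summary())               # coefficient signs map to the findings above
```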
Abstract:
The likelihood ratio test of cointegration rank is the most widely used test for cointegration. Many studies have shown that its finite sample distribution is not well approximated by the limiting distribution. The article introduces bootstrap and fast double bootstrap (FDB) algorithms for the likelihood ratio test and evaluates them by Monte Carlo simulation experiments. It finds that the performance of the bootstrap test is very good. The more sophisticated FDB produces a further improvement in cases where the performance of the asymptotic test is very unsatisfactory and the ordinary bootstrap does not work as well as it might. Furthermore, the Monte Carlo simulations provide a number of guidelines on when the bootstrap and FDB tests can be expected to work well. Finally, the tests are applied to US interest rate and international stock price series. It is found that the asymptotic test tends to overestimate the cointegration rank, while the bootstrap and FDB tests choose the correct cointegration rank.
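As a rough illustration of the bootstrap idea (a simplified residual resampling scheme, not the paper's exact algorithm, and without the fast double bootstrap), the sketch below re-estimates the Johansen trace statistic on samples generated from a VECM fitted under the null rank; the one-lag, no-deterministic-terms specification and the data file are assumptions:

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM, coint_johansen

def trace_stat(y, rank, k_ar_diff=1):
    """Johansen trace statistic for H0: cointegration rank <= rank."""
    return coint_johansen(y, det_order=-1, k_ar_diff=k_ar_diff).lr1[rank]

def bootstrap_trace_pvalue(y, rank, n_boot=499, k_ar_diff=1, seed=0):
    y = np.asarray(y)
    rng = np.random.default_rng(seed)
    stat_obs = trace_stat(y, rank, k_ar_diff)

    # Estimate the VECM under the null rank (rank >= 1 assumed in this sketch).
    fit = VECM(y, k_ar_diff=k_ar_diff, coint_rank=rank, deterministic="n").fit()
    alpha, beta, gamma = fit.alpha, fit.beta, fit.gamma
    resid = fit.resid - fit.resid.mean(axis=0)

    T = y.shape[0]
    exceed = 0
    for _ in range(n_boot):
        u = resid[rng.integers(0, len(resid), size=T)]
        yb = y[: k_ar_diff + 1].copy()                  # initial values from the data
        for t in range(k_ar_diff + 1, T):
            # One lagged difference only (k_ar_diff=1 assumed).
            dy = alpha @ beta.T @ yb[t - 1] + gamma @ (yb[t - 1] - yb[t - 2]) + u[t]
            yb = np.vstack([yb, yb[t - 1] + dy])
        if trace_stat(yb, rank, k_ar_diff) >= stat_obs:
            exceed += 1
    return (exceed + 1) / (n_boot + 1)

data = np.loadtxt("interest_rates.csv", delimiter=",")   # hypothetical T x k matrix
print(bootstrap_trace_pvalue(data, rank=1))
```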
Abstract:
The use of different time units in option pricing may lead to inconsistent estimates of time decay and spurious jumps in implied volatilities. Different time units in the pricing model lead to different implied volatilities even though the option price itself is the same. The chosen time unit should make it necessary to adjust the volatility parameter only when there are fundamental reasons for it, not because of a wrong specification of the model. This paper examined the effects of option pricing under different time hypotheses and empirically investigated which time frame the option markets in Germany employ over weekdays, specifically trying to get a picture of how the market prices options. The results seem to verify that the German market behaves in a fashion that deviates from the most traditional time units in option pricing, calendar and trading days. The study also showed that the implied volatility on Thursdays was somewhat higher and thus differed from the pattern of the other days of the week. Further investigation with a GARCH model showed that although traditional tests, such as the analysis of variance, indicated a negative return for Thursdays during the same period as that covered by the implied volatilities, this was not supported by the GARCH model.
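The basic point that a single option price maps to different implied volatilities under different time conventions can be illustrated as follows; the price, strike and day counts are made-up numbers, not the paper's data:

```python
# Black-Scholes call price and implied volatility under two time conventions.
from math import exp, log, sqrt
from scipy.optimize import brentq
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

def implied_vol(price, S, K, T, r):
    return brentq(lambda s: bs_call(S, K, T, r, s) - price, 1e-6, 5.0)

price, S, K, r = 5.0, 100.0, 100.0, 0.03
T_calendar = 30 / 365    # time to expiry counted in calendar days
T_trading = 21 / 252     # the same expiry counted in trading days

print(implied_vol(price, S, K, T_calendar, r))   # annualised on a calendar-day basis
print(implied_vol(price, S, K, T_trading, r))    # annualised on a trading-day basis
```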
Abstract:
Although empirical evidence suggests the contrary, many asset pricing models assume stock returns to be symmetrically distributed. In this paper it is argued that the occurrence of negative jumps in a firm's future earnings, and consequently in its stock price, is positively related to the level of network externalities in the firm's product market. If the ex post frequency of these negative jumps in a sample does not equal the ex ante assessed probability of occurrence, the sample is subject to a peso problem. The hypothesis is tested by regressing the skewness coefficient of a firm’s realised stock return distribution on the firm’s R&D intensity, i.e. the ratio of the firm’s research and development expenditure to its net sales. The empirical results support the technology-related peso problem hypothesis. In samples subject to such a peso problem, the returns are biased up and the variance is biased down.
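A minimal sketch of the cross-sectional test described above, with hypothetical file and column names:

```python
# Regress the skewness of each firm's realised returns on its R&D intensity.
import pandas as pd
import statsmodels.api as sm
from scipy.stats import skew

returns = pd.read_csv("daily_returns.csv", index_col=0)       # one column per firm
fundamentals = pd.read_csv("fundamentals.csv", index_col=0)   # per-firm R&D and sales

df = pd.DataFrame({
    "skewness": returns.apply(skew),
    "rd_intensity": fundamentals["rd_expenditure"] / fundamentals["net_sales"],
}).dropna()

ols = sm.OLS(df["skewness"], sm.add_constant(df[["rd_intensity"]])).fit()
print(ols.summary())   # the peso-problem hypothesis predicts a negative slope
```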
Abstract:
This study evaluates three different time units in option pricing: trading time, calendar time, and continuous time using discrete approximations (CTDA). The CTDA-time model partitions the trading day into 30-minute intervals, where each interval is given a weight corresponding to the historical volatility in the respective interval. Furthermore, the non-trading volatility, both overnight and weekend volatility, is included in the first interval of the trading day in the CTDA model. The three models are tested on market prices. The results indicate that the trading-time model gives the best fit to market prices, in line with the results of previous studies but contrary to expectations under non-arbitrage option pricing. Under non-arbitrage pricing, the option premium should reflect the cost of hedging the expected volatility during the option’s remaining life. The study concludes that the historical patterns in volatility are not fully accounted for by the market; rather, the market prices options closer to trading time.
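One way such interval weights could be computed from historical data is sketched below; the folding of overnight variance into the first interval and the 30-minute grid follow the description above, but the file layout and details are assumptions rather than the paper's exact procedure:

```python
# Weight each 30-minute interval by its share of historical return variance.
import pandas as pd

intraday = pd.read_csv("intraday_returns.csv", parse_dates=["timestamp"])
intraday["interval"] = intraday["timestamp"].dt.floor("30min").dt.time

interval_var = intraday.groupby("interval")["return"].var()
overnight_var = pd.read_csv("overnight_returns.csv")["return"].var()

interval_var.iloc[0] += overnight_var        # non-trading variance goes to the first interval
weights = interval_var / interval_var.sum()  # weights over one trading day sum to one
print(weights)
```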
Abstract:
The low predictive power of implied volatility in forecasting subsequently realized volatility is a well-documented empirical puzzle. As suggested by e.g. Feinstein (1989), Jackwerth and Rubinstein (1996), and Bates (1997), we test whether unrealized expectations of jumps in volatility could explain this phenomenon. Our findings show that expectations of infrequently occurring jumps in volatility are indeed priced in implied volatility. This has two important consequences. First, implied volatility is actually expected to exceed realized volatility over long periods of time, only to fall far below realized volatility during infrequently occurring periods of very high volatility. Second, the slope coefficient in the classic forecasting regression of realized volatility on implied volatility is very sensitive to the discrepancy between ex ante expected and ex post realized jump frequencies. If the in-sample frequency of positive volatility jumps is lower than ex ante assessed by the market, the classic regression test tends to reject the hypothesis of informational efficiency even when markets are informationally efficient.
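The classic forecasting regression referred to above is commonly written as

$$
\sigma^{\text{realized}}_{t} \;=\; \alpha + \beta\,\sigma^{\text{implied}}_{t} + \varepsilon_{t},
$$

with informational efficiency corresponding to $\alpha = 0$ and $\beta = 1$; the point above is that unrealized jump expectations can push the estimated $\beta$ away from one even when markets are efficient.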
Abstract:
This paper investigates to what extent the volatility of Finnish stock portfolios is transmitted through the "world volatility". We operationalize the volatility processes of Finnish leverage, industry, and size portfolio returns by asymmetric GARCH specifications following Glosten et al. (1993). We use daily return data from January 2, 1987 to December 30, 1998. We find that the world shock significantly enters the domestic models, and that its impact has increased over time. The same applies to the variance ratios and the correlations with the world. The larger the firm, the larger the world impact. The conditional variance is higher during recessions. The asymmetry parameter is, surprisingly, non-significant, and the leverage hypothesis cannot be verified. The return generating process of the domestic portfolio returns usually does not include the world information set, indicating that the returns are generated by a segmented conditional asset pricing model.
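A minimal sketch of the asymmetric GARCH specification of Glosten et al. (1993), fitted here with the third-party arch package; the data file is hypothetical, and the world-shock regressor used in the paper is omitted for brevity:

```python
import pandas as pd
from arch import arch_model

# Daily portfolio returns, scaled to percent for numerical stability.
returns = pd.read_csv("portfolio_returns.csv", index_col=0, parse_dates=True)["ret"] * 100

# p=1, o=1, q=1 gives a GJR-GARCH(1,1): the o-term is the asymmetry/leverage dummy.
model = arch_model(returns, mean="Constant", vol="GARCH", p=1, o=1, q=1, dist="normal")
result = model.fit(disp="off")
print(result.summary())
```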
Abstract:
A vast literature documents negative skewness and excess kurtosis in stock return distributions on several markets. We approach the issue of negative skewness from a different angle than previous studies by suggesting a model, which we denote the “negative news threshold” hypothesis, that builds on asymmetrically distributed information and symmetric market responses. Our empirical tests reveal that returns on days when non-scheduled news is disclosed are the source of negative skewness in stock returns. This finding lends solid support to our model and suggests that negative skewness in stock returns is induced by asymmetries in the news disclosure policies of firm management.
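The core comparison behind this test can be sketched as follows, with hypothetical file and column names:

```python
# Compare return skewness on days with non-scheduled disclosures against other days.
import pandas as pd
from scipy.stats import skew

data = pd.read_csv("returns_with_news_flags.csv", parse_dates=["date"])

unscheduled = data.loc[data["unscheduled_news"] == 1, "return"]
other_days = data.loc[data["unscheduled_news"] == 0, "return"]

print("skewness, non-scheduled news days:", skew(unscheduled))
print("skewness, remaining days:         ", skew(other_days))
```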
Abstract:
This study examines the intraday and weekend volatility on the German DAX. The intraday volatility is partitioned into smaller intervals and compared to a whole day’s volatility. The estimated intraday variance is U-shaped, and the weekend variance is estimated at 19 % of that of a normal trading day. The patterns in the intraday and weekend volatility are used to develop an extension of the Black and Scholes formula with a new time basis. Calendar or trading days are commonly used for measuring time in option pricing. The Continuous Time using Discrete Approximations (CTDA) model developed in this study uses a measure of time with smaller intervals, approaching continuous time. The model presented accounts for the lapse of time during trading only. Arbitrage pricing suggests that the option price equals the expected cost of hedging volatility during the option’s remaining life. In this model, time is allowed to lapse as volatility occurs on an intraday basis. The measure of time is modified in CTDA to correct for the non-constant volatility and to account for the patterns in volatility.
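A sketch of how such a modified time measure might be accumulated before it enters an option pricing formula; the uniform interval weights and the counts below are placeholders (the paper estimates a U-shaped intraday pattern), and only the 19 % weekend figure is taken from the abstract:

```python
import numpy as np

n_intervals = 17                                            # 30-minute intervals per day (assumption)
interval_weights = np.full(n_intervals, 1.0 / n_intervals)  # placeholder, not the estimated U-shape
WEEKEND_WEIGHT = 0.19                                       # weekend variance as a share of one trading day

def ctda_time(trading_days_left, intervals_left_today, weekends_left):
    """Effective time to expiry, in trading-day units, under the CTDA clock."""
    today = interval_weights[-intervals_left_today:].sum() if intervals_left_today else 0.0
    return today + trading_days_left + weekends_left * WEEKEND_WEIGHT

tau_days = ctda_time(trading_days_left=9, intervals_left_today=8, weekends_left=2)
tau_years = tau_days / 252      # annualise before inserting into an option formula
print(tau_days, tau_years)
```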
Abstract:
We describe the ongoing design and implementation of a sensor network for agricultural management targeted at resource-poor farmers in India. Our focus on semi-arid regions led us to concentrate on water-related issues. Throughout 2004, we carried out a survey of the information needs of the population living in a cluster of villages in our study area. The results highlighted the potential that environment-related information has for the improvement of farming strategies in the face of highly variable conditions, in particular for risk management strategies (choice of crop varieties, sowing and harvest periods, prevention of pests and diseases, efficient use of irrigation water, etc.). This leads us to advocate an original use of Information and Communication Technologies (ICT). We believe our demand-driven approach to the design of appropriate ICT tools targeted at the resource-poor is relatively new. In order to go beyond a purely technocratic approach, we adopted an iterative, participatory methodology.
Abstract:
We provide a survey of some of our recent results ([9], [13], [4], [6], [7]) on the analytical performance modeling of IEEE 802.11 wireless local area networks (WLANs). We first present extensions of the decoupling approach of Bianchi ([1]) to the saturation analysis of IEEE 802.11e networks with multiple traffic classes. We have found that even when analysing WLANs with unsaturated nodes, the following state-dependent service model works well: when a certain set of nodes is nonempty, their channel attempt behaviour is obtained from the corresponding fixed point analysis of the saturated system. We present our experiences in using this approximation to model multimedia traffic over an IEEE 802.11e network using the enhanced DCF channel access (EDCA) mechanism. We have found that we can model TCP-controlled file transfers, VoIP packet telephony, and streaming video in the IEEE 802.11e setting with this simple approximation.
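A minimal sketch of the saturated fixed point that the decoupling approximation rests on, using the standard Bianchi (2000) attempt-probability formula; the backoff parameters and the damping constant are illustrative choices:

```python
def tau_of_p(p, W=32, m=5):
    """Attempt probability of a saturated node as a function of its collision probability."""
    num = 2 * (1 - 2 * p)
    den = (1 - 2 * p) * (W + 1) + p * W * (1 - (2 * p) ** m)
    return num / den

def solve_fixed_point(n, iters=500, damping=0.5):
    """Damped iteration of p -> 1 - (1 - tau(p))**(n - 1) for n saturated nodes."""
    p = 0.1
    for _ in range(iters):
        p = (1 - damping) * p + damping * (1 - (1 - tau_of_p(p)) ** (n - 1))
    return p, tau_of_p(p)

p, tau = solve_fixed_point(n=10)
print(f"collision probability p = {p:.3f}, attempt probability tau = {tau:.3f}")
```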
Abstract:
The unique characteristics of marketspace, in combination with the fast-growing number of consumers interested in e-commerce, have created new research areas of interest to both marketing and consumer behaviour researchers. Consumer behaviour researchers interested in the decision-making processes of consumers have two new sets of questions to answer. The first set of questions is related to how useful theories developed for the marketplace are in a marketspace context. Cyber auctions, Internet communities and the possibilities for consumers to establish dialogues not only with companies but also with other consumers make marketspace unique. The effects of these distinctive characteristics on the behaviour of consumers have not been systematically analysed and therefore constitute the second set of questions which have to be studied. Most companies feel that they have to be online even though the effects of being on the Net are not unambiguously positive. The relevance of the relationship marketing paradigm in a marketspace context has to be studied. The relationship-enhancement effects of websites from the customers’ point of view are therefore emphasized in this research paper. Representatives of the Net generation were analysed, and the results show that companies should develop marketspace strategies, as Net presence has a value-adding effect on consumers. The results also indicate that the decision-making processes of consumers are changing as a result of the progress of marketspace.