894 results for Prices and dividends


Relevance: 100.00%

Abstract:

Most monetary models combine the quantity theory of money with a Phillips curve. This implies a strong short-run correlation between money growth and output (with little or no correlation between money and prices) and a strong long-run correlation between money growth and inflation (with little or no correlation between money growth and output). The empirical evidence on the money/inflation relationship is very robust, but the long-run money/output relationship is ambiguous at best. This paper attempts to explain this by looking at the impact of money growth on firm financing.
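The long-run money/inflation link described above follows from the quantity equation MV = PY: in growth rates, inflation equals money growth plus velocity growth minus output growth. A minimal sketch of this accounting identity, with illustrative numbers not taken from the paper:

```python
# The quantity equation M * V = P * Y implies, in growth rates,
# pi = mu + dv - g: long-run inflation moves one-for-one with money
# growth for given velocity and output growth. Numbers are illustrative.

def implied_inflation(money_growth, output_growth, velocity_growth=0.0):
    """Steady-state inflation implied by the quantity theory of money."""
    return money_growth + velocity_growth - output_growth

rate = implied_inflation(money_growth=0.07, output_growth=0.03)
# with stable velocity, 7% money growth and 3% output growth give
# roughly 4% long-run inflation
```

Raising money growth by one point raises implied long-run inflation by one point, which matches the robust long-run money/inflation evidence; the short-run output effects the paper studies lie outside this identity.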

Relevance: 100.00%

Abstract:

In this paper we present a solution for building a better strategy for participating in external electricity markets. Developing an optimal strategy requires knowledge of both the internal system costs and the future values of the series of electricity prices in external markets. In practice, however, both future electricity prices and costs are unknown: the former must be modeled and forecasted, and the latter must be calculated. Our methodology for building an optimal strategy consists of three steps. The first step is modeling and forecasting market prices in external systems. The second step is calculating costs on the internal system, taking into account the prices expected in the first step. The third step builds on the results of the previous two and consists of preparing the bids for external markets. Unlike many approaches oriented toward increasing GenCos' profits, the main goal here is to reduce consumers' costs.
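The three-step methodology can be sketched as a small pipeline. Every model below is a deliberately naive placeholder (a mean forecast, a constant marginal cost, a fixed bid block), standing in for the paper's actual price and cost models:

```python
# Step 1: forecast external prices; Step 2: compute internal costs given
# those prices; Step 3: prepare bids. All numbers are illustrative.

def forecast_external_prices(history, horizon):
    """Step 1: naive forecast of external market prices (EUR/MWh)."""
    mean = sum(history) / len(history)
    return [mean] * horizon

def internal_marginal_cost():
    """Step 2: marginal cost of serving load internally (assumed constant)."""
    return 32.0

def prepare_bids(price_forecast, block_mwh=10.0):
    """Step 3: bid to buy externally whenever the forecast external price
    is below the internal marginal cost, reducing consumers' costs."""
    cost = internal_marginal_cost()
    return [("buy", block_mwh, p) for p in price_forecast if p < cost]

prices = forecast_external_prices([30.0, 28.0, 35.0, 31.0], horizon=3)
bids = prepare_bids(prices)          # mean forecast 31.0 < 32.0: buy bids
```

The point of the sketch is the ordering of the steps: cost calculation consumes the price forecast, and bid preparation consumes both.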

Relevance: 100.00%

Abstract:

This paper presents a technique for forecasting forward energy prices one day ahead. The technique combines a wavelet transform with forecasting models such as the multi-layer perceptron, linear regression, or GARCH. These techniques are applied to real data from the UK gas markets to evaluate their performance. The results show that forecasting accuracy is improved significantly by using the wavelet transform. The methodology can also be applied to forecasting market clearing prices and electricity/gas loads.
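The decompose-forecast-recombine idea can be illustrated with a one-level Haar transform, the simplest wavelet. A last-value (persistence) forecast stands in here for the MLP, regression and GARCH models used in the paper, and the price series is made up:

```python
# One-level Haar decomposition: split the series into pairwise
# approximation and detail components, forecast each component
# separately, then recombine the component forecasts via the
# inverse transform to obtain the next two observations.

def haar_decompose(x):
    approx = [(a + b) / 2 for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) / 2 for a, b in zip(x[::2], x[1::2])]
    return approx, detail

def haar_reconstruct(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]          # inverse of the pairwise transform
    return out

def naive_forecast(component):
    return component[-1]               # persistence forecast per component

series = [40.0, 42.0, 41.0, 45.0, 44.0, 46.0]   # illustrative prices
approx, detail = haar_decompose(series)
next_pair = haar_reconstruct([naive_forecast(approx)],
                             [naive_forecast(detail)])
```

In the paper's setup, each component would instead be forecast by its own statistical model before recombination; the transform is what isolates the smooth and oscillatory parts of the price series.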

Relevance: 100.00%

Abstract:

The predictive accuracy of competing crude-oil price forecast densities is investigated for the 1994–2006 period. Moving beyond standard ARCH-type models that rely exclusively on past returns, we examine the benefits of utilizing the forward-looking information embedded in the prices of derivative contracts. Risk-neutral densities, obtained from panels of crude-oil option prices, are adjusted to reflect real-world risks using either a parametric or a non-parametric calibration approach. The relative performance of the models is evaluated over the entire support of the density, as well as over regions and intervals of special interest to the economic agent. We find that non-parametric adjustments of risk-neutral density forecasts perform significantly better than their parametric counterparts. Goodness-of-fit tests and out-of-sample likelihood comparisons favor forecast densities obtained from option prices and non-parametric calibration methods over those constructed using historical returns and simulated ARCH processes. © 2010 Wiley Periodicals, Inc. Jrl Fut Mark 31:727–754, 2011
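A common parametric calibration of this kind tilts the risk-neutral density by a power-utility (CRRA) weight and renormalises; the toy sketch below shows that mechanic on a discrete price grid. All numbers are illustrative, and this simple tilt is only one possible parametric adjustment, not necessarily the one used in the paper:

```python
# Tilt a discrete risk-neutral density q(x) by the weight x**gamma and
# renormalise to obtain a candidate real-world density p(x). With
# gamma > 0 the adjustment shifts probability mass toward higher prices.

def calibrate_power(grid, q_density, gamma):
    weights = [q * x**gamma for x, q in zip(grid, q_density)]
    total = sum(weights)
    return [w / total for w in weights]

grid = [80.0, 90.0, 100.0, 110.0, 120.0]   # crude-oil price levels (toy)
q = [0.10, 0.25, 0.30, 0.25, 0.10]         # risk-neutral probabilities
p = calibrate_power(grid, q, gamma=2.0)    # calibrated real-world density
```

A non-parametric calibration would instead estimate the adjustment function from the historical fit of the risk-neutral densities rather than imposing a one-parameter utility form, which is the flexibility the paper finds to matter.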

Relevance: 100.00%

Abstract:

This study pursues two objectives: first, to provide evidence on the information content of dividend policy, conditional on past earnings and dividend patterns prior to an annual earnings decline; second, to examine the effect of the magnitude of low earnings realizations on dividend policy when firms have more-or-less established dividend payouts. The information content of dividend policy for firms that incur earnings reductions following long patterns of positive earnings and dividends has been examined (DeAngelo et al., 1992, 1996; Charitou, 2000). No research, however, has examined the association between the informativeness of dividend policy changes in the event of an earnings drop and varying patterns of past earnings and dividends. Our dataset consists of 4,873 U.S. firm-year observations over the period 1986-2005. Our evidence supports the hypotheses that, among earnings-reducing or loss firms, longer patterns of past earnings and dividends: (a) strengthen the information conveyed by dividends regarding future earnings, and (b) enhance the role of the magnitude of low earnings realizations in explaining dividend policy decisions, in that earnings hold more information content for explaining the likelihood of dividend cuts the longer the past earnings and dividend patterns are. Both results stem from the stylized facts that managers aim to maintain consistency with respect to historic payout policy, being reluctant to proceed with dividend reductions, and that this reluctance grows the more established the historic payout policy is. © 2010 The Authors. Journal compilation © 2010 Accounting Foundation, The University of Sydney.

Relevance: 100.00%

Abstract:

The economic crisis that began in 2008 has made prices even more important to buyers than before. It has long been known that prices fundamentally influence consumers' purchasing decisions; how exactly they do so, however, is a question we cannot always answer precisely. According to economics, falling prices increase consumers' willingness to buy and, conversely, rising prices reduce it. Reality, however, cannot always be described by economic concepts or mathematical formulas. _______ Since the beginning of the global economic recession, prices have become more and more important for sellers and buyers. Studying the role of prices in consumer behaviour is a rather new field of marketing research. The paper starts from the fact that price can be regarded as a multidimensional stimulus that influences the purchasing decision of consumers. The study describes how, in this multidimensional pricing environment, consumers move from the perception and evaluation of prices to the purchasing decision. According to the model constructed by the author, the perception of prices depends on how prices are presented and on people's willingness and ability to numerically perceive and evaluate the different presentations of prices. In the process that leads consumers from perceived prices through expected prices to the purchasing decision, perceived value plays the most important role. Perceived value is shaped by internal and external reference prices and the perceived reference value. The paper concludes that in recession and post-recession times, companies are compelled to understand these processes better in order to set their price points according to changing buyer behaviour.

Relevance: 100.00%

Abstract:

Most research on stock prices is based on the present value model or the more general consumption-based model. When applied to real economic data, both are found unable to account for the stock price level and its volatility. The three essays here attempt both to build a more realistic model and to check whether there is still room for bubbles in explaining fluctuations in stock prices. In the second chapter, several innovations are simultaneously incorporated into the traditional present value model in order to produce more accurate model-based fundamental prices. These innovations comprise replacing the narrower traditional dividends with broad dividends, a nonlinear artificial neural network (ANN) forecasting procedure for these broad dividends instead of the more common linear forecasting models, and a stochastic discount rate in place of the constant discount rate. Empirical results show that this model predicts fundamental prices better than alternative models using a linear forecasting process, narrow dividends, or a constant discount factor. Nonetheless, actual prices remain largely detached from fundamental prices, and the bubble-like deviations are found to coincide with business cycles. The third chapter examines possible cointegration of stock prices with fundamentals and non-fundamentals. The output gap is introduced to form the non-fundamental part of stock prices. I use a trivariate vector autoregression (TVAR) model and a single-equation model to run cointegration tests between these three variables. Neither cointegration test shows strong evidence of explosive behavior in the DJIA and S&P 500 data. I then apply a sup augmented Dickey-Fuller test to check for the existence of periodically collapsing bubbles in stock prices; such bubbles are found in the S&P 500 data during the late 1990s.
Employing the econometric tests of the third chapter, the fourth chapter examines whether bubbles exist in the stock prices of conventional economic sectors on the New York Stock Exchange. The 'old economy' as a whole is not found to have bubbles, but periodically collapsing bubbles are found in the Materials and Telecommunication Services sectors and in the Real Estate industry group.
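The sup augmented Dickey-Fuller idea used in the third and fourth chapters can be condensed: estimate a Dickey-Fuller regression on expanding windows and take the supremum of the t-statistics on the lagged level, with large positive values signalling explosive, bubble-like behaviour. A minimal sketch with no augmentation lags, on simulated data rather than the DJIA/S&P 500 series:

```python
import math
import random

# Dickey-Fuller regression  dy_t = alpha + beta * y_{t-1} + e_t ;
# the SADF statistic is the sup of the beta t-statistics over
# expanding windows. Explosive data (root > 1) push it sharply positive.

def df_tstat(y):
    x = y[:-1]                                     # lagged level
    dy = [b - a for a, b in zip(y[:-1], y[1:])]    # first difference
    n = len(dy)
    mx, mdy = sum(x) / n, sum(dy) / n
    sxx = sum((v - mx) ** 2 for v in x)
    beta = sum((v - mx) * (w - mdy) for v, w in zip(x, dy)) / sxx
    alpha = mdy - beta * mx
    ssr = sum((w - alpha - beta * v) ** 2 for v, w in zip(x, dy))
    se = math.sqrt(ssr / (n - 2) / sxx)
    return beta / se

def sup_adf(y, min_window=20):
    return max(df_tstat(y[:k]) for k in range(min_window, len(y) + 1))

random.seed(1)
explosive = [1.0]
for _ in range(80):                    # mildly explosive root of 1.05
    explosive.append(1.05 * explosive[-1] + random.gauss(0, 0.1))
stat = sup_adf(explosive)              # strongly positive for this series
```

In practice the statistic is compared against simulated critical values, and augmentation lags are added to soak up serial correlation; both are omitted here for brevity.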

Relevance: 100.00%

Abstract:

Some research states that speculation with agricultural commodities on the futures market has raised agricultural commodity spot prices. This work analyzes the causal relationships between the spot prices of corn, wheat, and soybean and agricultural commodity futures trading activity. These relationships between agricultural commodity spot prices and financial variables are tested for Granger causality. The model results show causal relationships between changes in the "volume traded" and "open positions" of futures contracts and changes in spot prices for corn. These results do not support the view that financial speculation is a major driver of rising agricultural commodity prices.
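A Granger-causality test of this kind compares an autoregression of spot-price changes with and without lagged trading-activity variables via an F-statistic. A one-lag sketch on simulated data (the series below are synthetic, not the corn data):

```python
import numpy as np

# Restricted model: regress y_t on a constant and y_{t-1}.
# Unrestricted model: add x_{t-1}. The F-statistic on the extra
# regressor tests H0: "x does not Granger-cause y".

def granger_f(y, x):
    """F-statistic of H0: one lag of x does not help predict y."""
    y, x = np.asarray(y), np.asarray(x)
    Y = y[1:]
    X_r = np.column_stack([np.ones(len(Y)), y[:-1]])          # restricted
    X_u = np.column_stack([np.ones(len(Y)), y[:-1], x[:-1]])  # + lagged x
    ssr_r = np.sum((Y - X_r @ np.linalg.lstsq(X_r, Y, rcond=None)[0]) ** 2)
    ssr_u = np.sum((Y - X_u @ np.linalg.lstsq(X_u, Y, rcond=None)[0]) ** 2)
    dof = len(Y) - X_u.shape[1]
    return (ssr_r - ssr_u) / (ssr_u / dof)     # ~ F(1, dof) under H0

rng = np.random.default_rng(0)
activity = rng.normal(size=200)                # e.g. change in open positions
spot = np.zeros(200)
for t in range(1, 200):                        # spot changes react to lagged activity
    spot[t] = 0.5 * activity[t - 1] + 0.3 * rng.normal()
f_stat = granger_f(spot, activity)             # large: activity Granger-causes spot
```

The statistic is compared against F critical values (or, in applied work, computed by a library routine such as statsmodels' Granger test); the paper's finding is that causality runs from futures activity to corn spot prices but this does not establish speculation as a major price driver.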

Relevance: 90.00%

Abstract:

Financial processes may possess long memory, and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects jumps in the sample paths. The presence of long memory, on the other hand, which contradicts the efficient market hypothesis, is still a subject of debate. These features pose challenges for memory detection and for modelling the co-presence of long memory and heavy tails, and this PhD project aims to respond to them. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory; for these series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I.
These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.
The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using empirical densities and MF-DFA is provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia, and comparisons with the results obtained from R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second order, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
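The q = 2 special case of MF-DFA, standard detrended fluctuation analysis, can be condensed to a few lines: integrate the series, detrend each window linearly, and read the scaling exponent off a log-log fit of fluctuation against window size. A sketch with illustrative window sizes and simulated data:

```python
import numpy as np

# DFA-1: an exponent alpha near 0.5 indicates no memory (white noise),
# alpha > 0.5 indicates long memory, and an integrated (random-walk)
# series yields alpha near 1.5. MF-DFA generalises this by replacing
# the second moment with q-th-order moments.

def dfa_exponent(x, sizes=(16, 32, 64, 128, 256)):
    profile = np.cumsum(x - np.mean(x))            # integrate the series
    fluctuations = []
    for s in sizes:
        n_win = len(profile) // s
        f2 = []
        for i in range(n_win):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrend
            f2.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(sizes), np.log(fluctuations), 1)
    return slope                                   # scaling exponent alpha

rng = np.random.default_rng(42)
noise = rng.normal(size=4096)
alpha_noise = dfa_exponent(noise)                  # near 0.5: no memory
alpha_walk = dfa_exponent(np.cumsum(noise))        # near 1.5: random walk
```

The thesis's MF-DFA additionally varies the detrending order and the moment order q, which is what lets it separate genuine long memory from polynomial trends.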

Relevance: 90.00%

Abstract:

This paper examines the relationship between the volatility implied in option prices and the subsequently realized volatility, using the S&P/ASX 200 index options (XJO) traded on the Australian Stock Exchange (ASX) over a period of five years. Unlike stock index options such as the S&P 100 index options in the US market, the S&P/ASX 200 index options are traded infrequently and in low volumes, and have a long maturity cycle. Thus, an errors-in-variables problem in the measurement of implied volatility is more likely to exist. After accounting for this problem with the instrumental-variable method, it is found that both call and put implied volatilities are superior to historical volatility in forecasting future realized volatility. Moreover, implied call volatility is nearly an unbiased forecast of future volatility.
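The errors-in-variables correction can be illustrated with simulated data: when true implied volatility is observed with noise, OLS of realized on observed implied volatility is attenuated toward zero, while using lagged implied volatility as an instrument recovers the slope. A stylised sketch in which all parameters are made up, not estimates from the XJO data:

```python
import numpy as np

# True volatility follows a persistent AR(1); both the observed implied
# volatility and the realized volatility are noisy measurements of it.
# The lagged observation is correlated with the true regressor but not
# with its measurement error, so it is a valid instrument.

rng = np.random.default_rng(7)
n = 5000
sigma = np.empty(n)
sigma[0] = 0.2
for t in range(1, n):                       # persistent true volatility
    sigma[t] = 0.02 + 0.9 * sigma[t - 1] + rng.normal(scale=0.01)
observed = sigma + rng.normal(scale=0.05, size=n)   # noisy implied vol
realized = sigma + rng.normal(scale=0.02, size=n)   # future realized vol

x, y, z = observed[1:], realized[1:], observed[:-1]  # z: lagged instrument
beta_ols = np.cov(x, y)[0, 1] / np.var(x, ddof=1)    # attenuated slope
beta_iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]    # IV slope, near 1
```

The attenuation of `beta_ols` is exactly the bias the paper corrects for; after the IV correction, an unbiased forecast corresponds to a slope near one.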

Relevance: 90.00%

Abstract:

Credence goods markets are characterized by asymmetric information between sellers and consumers that may give rise to inefficiencies, such as under- and overtreatment or market break-down. In a large experiment with 936 participants, we study the determinants of efficiency in credence goods markets. While theory predicts that either liability or verifiability yields efficiency, we find that liability has a crucial effect but verifiability only a minor one. Allowing sellers to build up reputation has little influence, as predicted. Seller competition drives down prices and yields maximal trade, but does not lead to higher efficiency as long as liability is violated.

Relevance: 90.00%

Abstract:

Fishers are faced with multiple risks, including unpredictability of future catch rates, prices and costs. While the latter two are largely beyond the control of fisheries managers, effective fisheries management should reduce uncertainty about future catches. Different management instruments are likely to have different impacts on the risk perception of fishers, and this should manifest itself in their implicit discount rate. Assuming licence and quota values represent the net present value of the flow of expected future profits, a proxy for the implicit discount rate of vessels in a fishery can be derived as the ratio of the average level of profits to the average licence/quota value. From this, an indication of risk perception can be derived, assuming higher discount rates reflect higher levels of systematic risk. In this paper, we apply the capital asset pricing model (CAPM) to determine the risk premium implicit in the discount rates for a range of Australian fisheries, and compare this with the set of management instruments in place. We test the assumption that rights-based management instruments lower perceptions of risk in fisheries, and find little evidence to support it, although the analysis was based on only limited data.
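The proxy described above reduces to simple arithmetic: the implicit discount rate is average profit over average licence/quota value, and the implied premium is that rate minus a risk-free rate. A toy example with illustrative numbers, not taken from any of the fisheries studied:

```python
# If licence/quota values capitalise expected future profits, then
# rate = avg_profit / avg_licence_value, and the excess of this rate
# over the risk-free rate is the risk premium fishers implicitly demand.

def implicit_discount_rate(avg_profit, avg_licence_value):
    return avg_profit / avg_licence_value

def risk_premium(discount_rate, risk_free_rate):
    return discount_rate - risk_free_rate

rate = implicit_discount_rate(avg_profit=150_000,
                              avg_licence_value=1_000_000)
premium = risk_premium(rate, risk_free_rate=0.05)
# a 15% implicit discount rate over a 5% risk-free rate implies a 10%
# premium, suggesting substantial perceived systematic risk
```

Under CAPM the premium would then be decomposed as beta times the market risk premium, which is how the paper links the proxy to systematic risk.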

Relevance: 90.00%

Abstract:

Open pit mine operations are complex businesses that demand constant assessment of risk. This is because the value of a mine project is typically influenced by many underlying economic and physical uncertainties, such as metal prices, metal grades, costs, schedules, quantities, and environmental issues, among others, which are not known with much certainty at the beginning of the project. Hence, mining projects present a considerable challenge to those involved in the associated investment decisions, such as the owners of the mine and other stakeholders. In general terms, when an option exists to acquire a new or operating mining project, the owners and stockholders need to know the value of the mining project, which is the fundamental criterion for making final decisions about going ahead with the venture. However, obtaining the mine project's value is not an easy task: sophisticated valuation and mine optimisation techniques, which combine advanced theories in geostatistics, statistics, engineering, economics and finance, among others, must be used by the mine analyst or mine planner in order to assess and quantify the existing uncertainty and, consequently, the risk involved in the project investment. Furthermore, current valuation and mine optimisation techniques do not complement each other. That is, valuation techniques based on real options (RO) analysis assume an expected (constant) metal grade and ore tonnage during a specified period, while mine optimisation (MO) techniques assume expected (constant) metal prices and mining costs. These assumptions are not entirely correct, since both sources of uncertainty (that of the orebody, i.e. metal grade and mineral reserves, and that of the future behaviour of metal prices and mining costs) have a great impact on the value of any mining project. Consequently, the key objective of this thesis is twofold.
The first objective consists of analysing and understanding the main sources of uncertainty in an open pit mining project, such as orebody (in situ metal grade), mining cost and metal price uncertainties, and their effect on the final project value. The second objective consists of breaking down the wall of isolation between economic valuation and mine optimisation techniques in order to generate a novel open pit mine evaluation framework called the "Integrated Valuation/Optimisation Framework (IVOF)". One important characteristic of this new framework is that it incorporates the RO and MO valuation techniques into a single integrated process that quantifies and describes uncertainty and risk in a mine project evaluation process, giving a more realistic estimate of the project's value. To achieve this, novel and advanced engineering and econometric methods are used to integrate financial and geological uncertainty into dynamic risk forecasting measures. The proposed valuation/optimisation technique is then applied to a real disseminated gold open pit deposit to estimate its value in the face of orebody, mining cost and metal price uncertainties.

Relevance: 90.00%

Abstract:

Uncertainties such as the stochastic input/output power of a plug-in electric vehicle (due to its stochastic charging and discharging schedule), the stochastic output of wind and photovoltaic generation sources, volatile fuel prices and uncertain future load growth could together lead to risks in determining the optimal siting and sizing of distributed generators (DGs) in distribution systems. Given this background, a new method is presented under the chance constrained programming (CCP) framework to handle these uncertainties in the optimal siting and sizing problem of DGs. First, a mathematical CCP model is developed with the minimization of the DGs' investment, operational, maintenance and network loss costs as the objective, security limitations as constraints, and the siting and sizing of DGs as optimization variables. Then, a genetic algorithm with embedded Monte Carlo simulation is developed to solve the CCP model. Finally, the IEEE 37-node test feeder is employed to verify the feasibility and effectiveness of the developed model and method. This work is supported by an Australian Commonwealth Scientific and Industrial Research Organisation (CSIRO) project on Intelligent Grids under the Energy Transformed Flagship, and a project from Jiangxi Power Company.
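The Monte Carlo step embedded in the genetic algorithm can be sketched as a feasibility check: a candidate DG size satisfies the chance constraint if the security constraint holds in at least the required fraction of sampled scenarios. The distributions below are illustrative assumptions, not those of the IEEE 37-node feeder:

```python
import random

# Chance-constrained feasibility by Monte Carlo: sample uncertain load
# and wind output, count the scenarios in which supply covers load, and
# require that fraction to reach the confidence level. A GA would call
# a check like this inside its fitness evaluation for each candidate.

def chance_constraint_ok(dg_size_mw, confidence=0.95,
                         n_scenarios=2000, seed=3):
    rng = random.Random(seed)
    satisfied = 0
    for _ in range(n_scenarios):
        load = rng.gauss(10.0, 1.5)              # uncertain load (MW)
        wind = max(0.0, rng.gauss(2.0, 1.0))     # uncertain wind output (MW)
        if dg_size_mw + wind >= load:            # security constraint
            satisfied += 1
    return satisfied / n_scenarios >= confidence

feasible = chance_constraint_ok(12.0)     # ample capacity: constraint holds
infeasible = chance_constraint_ok(8.0)    # too small: constraint violated
```

In the full method, the GA searches jointly over siting and sizing while the objective aggregates investment, operation, maintenance and loss costs; the sketch isolates only the chance-constraint evaluation.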

Relevance: 90.00%

Abstract:

There has been substantial interest within the Australian sugar industry in product diversification as a means to reduce its exposure to fluctuating raw sugar prices and to increase its commercial viability. In particular, the industry is looking at fibrous residues from sugarcane harvesting (trash) and from sugarcane milling (bagasse) for cogeneration and the production of biocommodities, as these are complementary to the core process of sugar production. One means of producing surplus residue (biomass) is to process the whole sugarcane crop. In this paper, the composition of juices derived from different harvesting methods, viz. burnt cane with all trash extracted (BE), green cane with half of the trash extracted (GE), and green cane (whole sugarcane crop) with trash unextracted (GU), was investigated, and the results are presented and compared. Determinations of electrical conductivity, inorganic composition and organic acids indicate that both GU and GE cane juice contain a higher proportion of soluble inorganic ions and ionisable organic acids than BE cane juice. It is important to note that there are considerably higher levels of Na ions and citric acid, but relatively low P levels, in the GU samples. A higher level of reducing sugars was found in the GU samples than in the BE samples, owing to the higher proportion of impurities found naturally in sugarcane tops and leaves. The purity of the first expressed juice (FEJ) of GU cane was on average higher than that of FEJ of BE cane. Results also show that GU juices appear to contain higher levels of proteins and polysaccharides, with no significant difference in starch levels.