52 results for Prices and dividends

in Queensland University of Technology - ePrints Archive


Relevance:

100.00%

Publisher:

Abstract:

Sweden’s protest against the Vietnam War was given tangible form in 1969 through the decision to give economic aid to the Government of North Vietnam. The main outcome was an integrated pulp and paper mill in the Vinh Phu Province north-west of Hanoi. Known as Bai Bang after its location, the mill became the most costly, one of the longest-lasting, and the most controversial project in the history of Swedish development cooperation. In 1996 Bai Bang reached full production capacity. Today the mill is exclusively managed and staffed by the Vietnamese, and there are plans for future expansion. At the same time, a substantial amount of money has been spent to reach these achievements. Looking back at the cumbersome history of the project, the results run contrary to many expectations. To learn more about the conditions for sustainable development, Sida commissioned two studies of the Bai Bang project. Together they touch upon several important issues in development cooperation over a period of almost 30 years: the change of aid paradigms over time, the role of foreign policy in development cooperation, cultural obstacles, recipient responsibility versus donor-led development, etc. The two studies were commissioned by Sida’s Department for Evaluation and Internal Audit, which is an independent department reporting directly to Sida’s Board of Directors. One study assesses the financial and economic viability of the pulp and paper mill and the broader development impact of the project in Vietnam. It was carried out by the Centre for International Economics, an Australian private economic research agency. The other study analyses the decision-making processes that created and shaped the project over a period of two decades, and reflects on lessons from the project for development cooperation in general. This study was carried out by the Chr. Michelsen Institute, a Norwegian independent research institution.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this project is to develop a demand-side response model that assists electricity consumers who are exposed to the market price through an aggregator to manage air-conditioning peak electricity demand. The main contribution of this research is to show how consumers can optimise the energy cost of the air-conditioning load, taking into account the electricity market price and network overload. The model is tested with selected room characteristics, Queensland electricity market data from the Australian Energy Market Operator, and Brisbane temperature data from the Bureau of Statistics, for weekdays on hot days during 2011 to 2012.
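As a rough illustration of the kind of optimisation involved, the sketch below schedules air-conditioning energy across half-hourly price intervals so that a required amount of cooling is delivered at minimum cost. The prices, capacity limit and cooling requirement are invented for illustration, and the single energy constraint stands in for the room's thermal model and network-overload limits described in the project.

```python
# Minimal sketch of a price-responsive air-conditioning schedule, assuming a
# simplified setting: the consumer must deliver a fixed amount of cooling
# energy over the afternoon peak, each half-hour interval has a known spot
# price, and the unit is capped at a maximum electrical draw. All numbers
# below are illustrative, not taken from the study.
from scipy.optimize import linprog

prices = [0.08, 0.12, 0.35, 0.90, 2.50, 0.60, 0.15, 0.10]  # $/kWh per interval
max_kwh_per_interval = 1.5          # unit capacity limit (assumed)
required_cooling_kwh = 6.0          # total energy needed to hold comfort (assumed)

# Decision variables: energy drawn in each interval.
# Objective: minimise total cost = sum(price_i * energy_i).
res = linprog(
    c=prices,
    A_eq=[[1.0] * len(prices)],      # total energy must meet the requirement
    b_eq=[required_cooling_kwh],
    bounds=[(0.0, max_kwh_per_interval)] * len(prices),
)

for price, kwh in zip(prices, res.x):
    print(f"price {price:5.2f} $/kWh -> run at {kwh:.2f} kWh")
print(f"total cost: ${res.x @ prices:.2f}")
```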

Relevance:

100.00%

Publisher:

Abstract:

Recent discussions of energy security and climate change have attracted significant attention to clean energy. We hypothesize that rising prices of conventional energy and/or placement of a price on carbon emissions would encourage investments in clean energy firms. The data from three clean energy indices show that oil prices and technology stock prices separately affect the stock prices of clean energy firms. However, the data fail to demonstrate a significant relationship between carbon prices and the stock prices of the firms.
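A minimal sketch of the type of relationship tested, assuming a simple OLS specification on synthetic return series; the study's actual indices, sample and estimator are not reproduced here.

```python
# Illustrative regression of clean-energy index returns on oil-price,
# technology-stock and carbon-price returns. Series names, coefficients and
# the use of statsmodels OLS are assumptions for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500  # synthetic daily observations

oil = rng.normal(0, 0.02, n)          # oil-price returns
tech = rng.normal(0, 0.015, n)        # technology index returns
carbon = rng.normal(0, 0.01, n)       # carbon-price returns
clean = 0.4 * oil + 0.6 * tech + rng.normal(0, 0.01, n)  # clean-energy returns

X = sm.add_constant(np.column_stack([oil, tech, carbon]))
model = sm.OLS(clean, X).fit()
print(model.summary(xname=["const", "oil", "tech", "carbon"]))
```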

Relevance:

90.00%

Publisher:

Abstract:

Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. The presence of long memory, on the other hand, which contradicts the efficient market hypothesis, is still a matter of debate. These difficulties present challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges.

The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA), which can systematically eliminate trends of different orders. This method is based on the identification of scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities.

The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for these models is performed via least squares, and the models are applied to the AMEX stock prices, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA.

The third part of the thesis applies the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets, and then use cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.

The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and to simulate sample paths of the equation. The method is then applied to analyse daily spot prices for the five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second-order moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
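The sketch below illustrates the standard DFA procedure (the q = 2 case that MF-DFA generalises): integrate the series, detrend it in windows of increasing size, and read the scaling exponent off the slope of log-fluctuation versus log-scale. The window sizes and synthetic input are assumptions for illustration only, not the thesis's data or exact procedure.

```python
# Rough sketch of standard detrended fluctuation analysis (DFA), the q = 2
# special case of the MF-DFA used in the thesis, applied to a synthetic
# return series. An i.i.d. series should give an exponent near 0.5; values
# above 0.5 would suggest long memory.
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    """Estimate the DFA scaling exponent of series x."""
    profile = np.cumsum(x - np.mean(x))        # integrated (profile) series
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        rms = []
        for i in range(n_seg):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            coeffs = np.polyfit(t, seg, 1)     # local linear detrending
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
        flucts.append(np.mean(rms))
    # Slope of log F(s) versus log s gives the scaling exponent.
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

returns = np.random.default_rng(1).standard_normal(4096)  # synthetic input
print(f"estimated scaling exponent: {dfa_exponent(returns):.2f}")
```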

Relevance:

90.00%

Publisher:

Abstract:

This paper examines the relationship between the volatility implied in option prices and the subsequently realized volatility using the S&P/ASX 200 index options (XJO) traded on the Australian Stock Exchange (ASX) over a period of five years. Unlike stock index options such as the S&P 100 index options in the US market, the S&P/ASX 200 index options are traded infrequently and in low volumes, and have a long maturity cycle. Thus an errors-in-variables problem in the measurement of implied volatility is more likely to exist. After accounting for this problem by the instrumental variable method, it is found that both call and put implied volatilities are superior to historical volatility in forecasting future realized volatility. Moreover, implied call volatility is nearly an unbiased forecast of future volatility.
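A back-of-the-envelope sketch of the instrumental variable correction for the errors-in-variables problem mentioned above, using two-stage least squares with lagged implied volatility as the instrument. The synthetic data, instrument choice and variable names are illustrative assumptions rather than the paper's specification.

```python
# Sketch of an errors-in-variables correction: realised volatility regressed
# on implied volatility, with lagged implied volatility as an instrument,
# estimated by two-stage least squares on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400

# Persistent "true" implied volatility (AR(1)), observed with measurement error.
true_iv = np.empty(n)
true_iv[0] = 0.20
for t in range(1, n):
    true_iv[t] = 0.02 + 0.90 * true_iv[t - 1] + rng.normal(0, 0.01)
measured_iv = true_iv + rng.normal(0, 0.02, n)     # noisy observed IV
realised = 0.9 * true_iv + rng.normal(0, 0.02, n)  # subsequently realised volatility

# Lagged measured IV is correlated with current true IV but not with the
# current measurement error, so it can serve as an instrument.
instrument = measured_iv[:-1]
y, x = realised[1:], measured_iv[1:]

# Stage 1: project the noisy regressor on the instrument.
x_hat = sm.OLS(x, sm.add_constant(instrument)).fit().fittedvalues
# Stage 2: regress realised volatility on the fitted values.
stage2 = sm.OLS(y, sm.add_constant(x_hat)).fit()
print(stage2.params)  # slope should recover ~0.9 despite the measurement error
```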

Relevance:

90.00%

Publisher:

Abstract:

Credence goods markets are characterized by asymmetric information between sellers and consumers that may give rise to inefficiencies, such as under- and overtreatment or market breakdown. In a large experiment with 936 participants, we study the determinants of efficiency in credence goods markets. While theory predicts that either liability or verifiability yields efficiency, we find that liability has a crucial effect but verifiability only a minor one. Allowing sellers to build up reputation has little influence, as predicted. Seller competition drives down prices and yields maximal trade, but does not lead to higher efficiency as long as liability is violated.

Relevance:

90.00%

Publisher:

Abstract:

Fishers are faced with multiple risks, including unpredictability of future catch rates, prices and costs. While the latter are largely beyond the control of fisheries managers, effective fisheries management should reduce uncertainty about future catches. Different management instruments are likely to have different impacts on the risk perception of fishers, and this should manifest itself in their implicit discount rate. Assuming licence and quota values represent the net present value of the flow of expected future profits, a proxy for the implicit discount rate of vessels in a fishery can be derived as the ratio of the average level of profits to the average licence/quota value. From this, an indication of risk perception can be derived, assuming higher discount rates reflect higher levels of systematic risk. In this paper, we apply the capital asset pricing model (CAPM) to determine the risk premium implicit in the discount rates for a range of Australian fisheries, and compare this with the set of management instruments in place. We test the assumption that rights-based management instruments lower perceptions of risk in fisheries. We find little evidence to support this assumption, although the analysis was based on only limited data.
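To make the proxy concrete, the following arithmetic sketch computes an implicit discount rate from hypothetical average profits and licence values and backs out the implied risk premium and CAPM beta; all figures are invented for illustration and do not come from the paper.

```python
# Back-of-the-envelope sketch of the proxy described above: an implicit
# discount rate from average profit divided by average licence/quota value,
# and the CAPM decomposition r = r_f + beta * (r_m - r_f) used to back out
# the implied risk premium. All figures are hypothetical.
average_profit = 120_000          # $/vessel/year (hypothetical)
average_licence_value = 800_000   # $/vessel (hypothetical)

implicit_discount_rate = average_profit / average_licence_value   # 0.15

risk_free_rate = 0.05
market_return = 0.10
risk_premium = implicit_discount_rate - risk_free_rate             # 0.10
implied_beta = risk_premium / (market_return - risk_free_rate)     # 2.0

print(f"implicit discount rate: {implicit_discount_rate:.1%}")
print(f"implied risk premium:   {risk_premium:.1%}")
print(f"implied CAPM beta:      {implied_beta:.2f}")
```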

Relevance:

90.00%

Publisher:

Abstract:

Open pit mine operations are complex businesses that demand a constant assessment of risk. This is because the value of a mine project is typically influenced by many underlying economic and physical uncertainties, such as metal prices, metal grades, costs, schedules, quantities and environmental issues, among others, which are not known with much certainty at the beginning of the project. Hence, mining projects present a considerable challenge to those involved in the associated investment decisions, such as the owners of the mine and other stakeholders. In general terms, when an option exists to acquire a new or operating mining project, the owners and stockholders of the mine project need to know the value of the mining project, which is the fundamental criterion for making final decisions about going ahead with the venture.

However, obtaining the mine project’s value is not an easy task. The reason for this is that sophisticated valuation and mine optimisation techniques, which combine advanced theories in geostatistics, statistics, engineering, economics and finance, among others, need to be used by the mine analyst or mine planner in order to assess and quantify the existing uncertainty and, consequently, the risk involved in the project investment. Furthermore, current valuation and mine optimisation techniques do not complement each other. That is, valuation techniques based on real options (RO) analysis assume an expected (constant) metal grade and ore tonnage during a specified period, while mine optimisation (MO) techniques assume expected (constant) metal prices and mining costs. These assumptions are not entirely correct, since both sources of uncertainty, that of the orebody (metal grade and mineral reserves) and that of the future behaviour of metal prices and mining costs, have a great impact on the value of any mining project.

Consequently, the key objective of this thesis is twofold. The first objective is to analyse and understand the main sources of uncertainty in an open pit mining project, such as orebody (in situ metal grade), mining cost and metal price uncertainties, and their effect on the final project value. The second objective is to break down the wall of isolation between economic valuation and mine optimisation techniques in order to generate a novel open pit mine evaluation framework called the “Integrated Valuation/Optimisation Framework (IVOF)”. One important characteristic of this new framework is that it incorporates the RO and MO valuation techniques into a single integrated process that quantifies and describes uncertainty and risk in a mine project evaluation process, giving a more realistic estimate of the project’s value. To achieve this, novel and advanced engineering and econometric methods are used to integrate financial and geological uncertainty into dynamic risk forecasting measures. The proposed mine valuation/optimisation technique is then applied to a real disseminated gold open pit deposit to estimate its value in the face of orebody, mining cost and metal price uncertainties.
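As a simplified illustration of propagating metal price uncertainty into project value, the sketch below simulates geometric Brownian motion price paths and averages discounted cash flows for a fixed production schedule. The drift, volatility, production and cost figures are assumptions; the thesis's IVOF additionally couples orebody (grade) uncertainty and pit optimisation, which this sketch does not attempt.

```python
# Hedged sketch of price-uncertainty propagation: simulate a gold price as
# geometric Brownian motion and average discounted cash flows of a fixed
# production schedule. All parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
years, n_paths = 10, 20_000
price0, drift, vol = 1500.0, 0.01, 0.25      # $/oz gold price process (assumed)
ounces_per_year = 50_000                     # fixed production schedule (assumed)
cost_per_ounce = 1100.0                      # all-in cost per ounce (assumed)
discount_rate = 0.08

# Geometric Brownian motion price paths, one step per year.
shocks = rng.normal(drift - 0.5 * vol**2, vol, size=(n_paths, years))
prices = price0 * np.exp(np.cumsum(shocks, axis=1))

cash_flows = ounces_per_year * (prices - cost_per_ounce)
discounts = (1 + discount_rate) ** -np.arange(1, years + 1)
npv_per_path = cash_flows @ discounts

print(f"expected NPV:        ${npv_per_path.mean() / 1e6:,.1f} m")
print(f"5th/95th percentile: ${np.percentile(npv_per_path, 5) / 1e6:,.1f} m / "
      f"${np.percentile(npv_per_path, 95) / 1e6:,.1f} m")
```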

Relevance:

90.00%

Publisher:

Abstract:

Some uncertainties, such as the stochastic input/output power of a plug-in electric vehicle due to its stochastic charging and discharging schedule, that of a wind unit and that of a photovoltaic generation source, volatile fuel prices and uncertain future load growth, could together lead to risks in determining the optimal siting and sizing of distributed generators (DGs) in distribution systems. Given this background, a new method is presented under the chance constrained programming (CCP) framework to handle these uncertainties in the optimal siting and sizing problem of DGs. First, a mathematical CCP model is developed with the minimization of the DGs' investment, operational and maintenance costs, as well as the network loss cost, as the objective, security limitations as constraints, and the siting and sizing of DGs as optimization variables. Then, a Monte Carlo simulation embedded genetic algorithm approach is developed to solve the CCP model. Finally, the IEEE 37-node test feeder is employed to verify the feasibility and effectiveness of the developed model and method. This work is supported by an Australian Commonwealth Scientific and Industrial Research Organisation (CSIRO) project on Intelligent Grids under the Energy Transformed Flagship, and a project from Jiangxi Power Company.
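A minimal sketch of the chance-constraint idea, assuming a hypothetical feeder limit and load distribution: a candidate DG size is deemed feasible only if Monte Carlo samples of uncertain load and DG output satisfy the security limit with the required probability, and the check is wrapped in a GA-style fitness function. The paper's actual network model, constraints and GA operators are not reproduced here.

```python
# Sketch of a Monte Carlo chance-constraint check inside a GA-style fitness
# function for DG sizing. The load model, feeder limit, confidence level and
# cost figures are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)

def chance_constraint_satisfied(dg_size_mw, confidence=0.95, n_samples=5000):
    """True if the feeder stays within its thermal limit with probability
    >= confidence under uncertain load and DG output."""
    load = rng.normal(8.0, 1.5, n_samples)               # MW, uncertain demand
    dg_out = dg_size_mw * rng.uniform(0.0, 1.0, n_samples)  # MW, uncertain output
    net_import = load - dg_out
    within_limit = np.abs(net_import) <= 10.0            # assumed feeder limit (MW)
    return within_limit.mean() >= confidence

def fitness(dg_size_mw, cost_per_mw=1.2e6):
    """GA fitness (to be minimised): investment cost, heavily penalised when
    the chance constraint is violated."""
    cost = dg_size_mw * cost_per_mw
    return cost if chance_constraint_satisfied(dg_size_mw) else cost + 1e9

for size in (1.0, 3.0, 6.0):
    print(f"{size} MW -> fitness {fitness(size):,.0f}")
```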

Relevance:

90.00%

Publisher:

Abstract:

There has been substantial interest within the Australian sugar industry in product diversification as a means to reduce its exposure to fluctuating raw sugar prices and to increase its commercial viability. In particular, the industry is looking at fibrous residues from sugarcane harvesting (trash) and from sugarcane milling (bagasse) for cogeneration and the production of biocommodities, as these are complementary to the core process of sugar production. One means of producing surplus residue (biomass) is to process the whole sugarcane crop. In this paper, the composition of juices derived from different harvesting methods, viz. burnt cane with all trash extracted (BE), green cane with half of the trash extracted (GE), and green cane (whole sugarcane crop) with trash unextracted (GU), was investigated and the results compared. The determination of electrical conductivity, inorganic composition and organic acids indicates that both GU and GE cane juice contain a higher proportion of soluble inorganic ions and ionisable organic acids than BE cane juice. It is important to note that there are considerably higher levels of Na ions and citric acid, but relatively low P levels, in the GU samples. A higher level of reducing sugars was found in the GU samples than in the BE samples, due to the higher proportion of impurities found naturally in sugarcane tops and leaves. The purity of the first expressed juice (FEJ) of GU cane was on average higher than that of the FEJ of BE cane. Results also show that GU juices appear to contain higher levels of proteins and polysaccharides, with no significant difference in starch levels.

Relevance:

90.00%

Publisher:

Abstract:

Over the last three decades neoliberalism has transitioned from occupying the margins of economic policy debate to becoming the dominant approach of governments and their economic advisers, a process that has accelerated since the collapse of the former Stalinist states in Eastern Europe and the Soviet Union. This thesis adopts a Marxist framework for understanding this process, which began in the realm of relatively abstract philosophical and ideological debate and culminated in the permeation of neoliberal values throughout all capitalist institutions, including the state bureaucracy. This necessarily means a focus on the dialectical relationship between the rise of neoliberalism and the shifting balance of class forces that accompanied the success of the neoliberal project in transforming the dominant economic policy paradigm. The extent to which neoliberal reforms impacted on workers and public sector institutions, along with the success or otherwise of traditional working class institutions in defending the material interests of workers, is therefore a recurring theme throughout this body of work. The evidence from this research and analysis suggests a major shift in the dialectic of class struggle in favour of the power of capital over labour during the period covered, with the neoliberal age being one of defeat for a labour movement that largely failed to adopt successful strategies for defending itself.

Relevance:

90.00%

Publisher:

Abstract:

"The financial system is a key influencer of the health and efficiency of an economy. The role of the financial system is to gather money from people and businesses that currently have more money than they need and transfer it to those that can use it for either business or consumer expenditures. This flow of funds through financial markets and institutions in the Australian economy is huge (in the billions of dollars), affecting business profits, the rate of inflation, interest rates and the production of goods and services. In general, the larger the flow of funds and the more efficient the financial system, the greater the economic output and welfare in the economy. It is not possible to have a modern, complex economy such as that in Australia, without an efficient and sound financial system. The global financial crisis (GFC) of late 2007–09 (and the ensuing European debt crisis), where the global financial market was on the brink of collapse with only significant government intervention stopping a catastrophic global failure of the market, illustrated the importance of the financial system. Financial Markets, Institutions and Money 3rd edition introduces students to the financial system, its operations, and participants. The text offers a fresh, succinct analysis of the financial markets and discusses how the many participants in the financial system interrelate. This includes coverage of regulators, regulations and the role of the Reserve Bank of Australia, that ensure the system’s smooth running, which is essential to a modern economy. The text has been significantly revised to take into account changes in the financial world."---publisher website Table of Contents 1. The financial system - an overview 2. The Monetary Authorities 3. The Reserve Bank of Australia and interest rates 4. The level of interest rates 5. Mathematics of finance 6. Bond Prices and interest rate risk 7. The Structure of Interest Rates 8. Money Markets 9. Bond Markets 10. Equity Markets

Relevance:

90.00%

Publisher:

Abstract:

Food prices and food affordability are important determinants of food choices, obesity and non-communicable diseases. As governments around the world consider policies to promote the consumption of healthier foods, data on the relative price and affordability of foods, with a particular focus on the difference between ‘less healthy’ and ‘healthy’ foods and diets, are urgently needed. This paper briefly reviews past and current approaches to monitoring food prices, and identifies key issues affecting the development of practical tools and methods for food price data collection, analysis and reporting. A step-wise monitoring framework, including measurement indicators, is proposed. ‘Minimal’ data collection will assess the differential price of ‘healthy’ and ‘less healthy’ foods; ‘expanded’ monitoring will assess the differential price of ‘healthy’ and ‘less healthy’ diets; and the ‘optimal’ approach will also monitor food affordability, by taking into account household income. The monitoring of the price and affordability of ‘healthy’ and ‘less healthy’ foods and diets globally will provide robust data and benchmarks to inform economic and fiscal policy responses. Given the range of methodological, cultural and logistical challenges in this area, it is imperative that all aspects of the proposed monitoring framework are tested rigorously before implementation.