30 results for Market Price of Risk

in University of Queensland eSpace - Australia


Relevance:

100.00%

Publisher:

Abstract:

The objective of this study is to examine the market valuation of environmental capital expenditure investment related to pollution abatement in the pulp and paper industry. The total environmental capital expenditure of $8.7 billion by our sample firms during 1989-2000 supports the focus on this industry. In order to be capitalized, an asset should be associated with future economic benefits. The existing environmental literature suggests that investors condition their evaluation of the future economic benefits arising from environmental capital expenditure on an assessment of the firms' environmental performance. This literature predicts the emergence of two environmental stereotypes: low-polluting firms that overcomply with existing environmental regulations, and high-polluting firms that just meet minimal environmental requirements. Our valuation evidence indicates that there are incremental economic benefits associated with environmental capital expenditure investment by low-polluting firms but not high-polluting firms. We also find that investors use environmental performance information to assess unbooked environmental liabilities, which we interpret to represent the future abatement spending obligations of high-polluting firms in the pulp and paper industry. We estimate average unbooked liabilities of $560 million for high-polluting firms, or 16.6 percent of market capitalization.
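The implied average market capitalization of the high-polluting firms is not stated in the abstract, but it follows directly from the two figures that are: a quick arithmetic check, assuming the 16.6 percent is computed against market capitalization as reported.

```python
# The abstract reports average unbooked liabilities of $560 million,
# equal to 16.6 percent of market capitalization, for high-polluting
# firms. The implied average market capitalization (a derived figure,
# not stated in the abstract) follows directly:
unbooked_liability_m = 560      # $ millions
share_of_market_cap = 0.166

implied_market_cap_m = unbooked_liability_m / share_of_market_cap
print(round(implied_market_cap_m))  # 3373, i.e. roughly $3.4 billion
```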

Relevance:

100.00%

Publisher:

Abstract:

Electricity market price forecasting is a challenging yet very important task for electricity market managers and participants. Due to the complexity and uncertainties in the power grid, electricity prices are highly volatile and normally carry spikes, which may be tens or even hundreds of times higher than the normal price. Such electricity price spikes are very difficult to predict. So far, most of the research on electricity price forecasting has been based on normal-range electricity prices. This paper proposes a data mining based electricity price forecast framework, which can predict the normal price as well as price spikes. The normal price can be predicted by a previously proposed wavelet and neural network based forecast model, while the spikes are forecast with a data mining approach. This paper focuses on spike prediction and explores the reasons for price spikes based on the measurement of a proposed composite supply-demand balance index (SDI) and relative demand index (RDI). These indices are able to reflect the relationship among electricity demand, electricity supply and electricity reserve capacity. The proposed model is based on a mining database including market clearing price, trading hour, electricity demand, electricity supply and reserve. Bayesian classification and similarity searching techniques are used to mine the database to find the internal relationships between electricity price spikes and these proposed indices. The mining results are used to form the price spike forecast model. The proposed model is able to generate the forecast price spike, the level of the spike and an associated forecast confidence level. The model is tested with Queensland electricity market data with promising results. Crown Copyright (C) 2004 Published by Elsevier B.V. All rights reserved.
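The abstract does not give the SDI or RDI formulas, nor the details of the Bayesian classifier, so the following is only an illustrative sketch: hypothetical index definitions and a simple threshold rule standing in for the paper's mined classifier, showing how a tight supply margin plus unusually high relative demand could flag a likely spike.

```python
# Hypothetical sketch only: the paper's actual SDI/RDI formulas and
# Bayesian classifier are not given in the abstract; these definitions
# are illustrative assumptions.

def sdi(supply, demand, reserve):
    # Assumed composite supply-demand balance index: available margin
    # (supply + reserve - demand) relative to demand.
    return (supply + reserve - demand) / demand

def rdi(demand, demand_history):
    # Assumed relative demand index: current demand versus the recent
    # average demand.
    return demand / (sum(demand_history) / len(demand_history))

def spike_risk(supply, demand, reserve, demand_history,
               sdi_threshold=0.1, rdi_threshold=1.2):
    # Simple rule in place of the paper's Bayesian classification:
    # a tight supply margin combined with unusually high relative
    # demand flags a likely price spike.
    return (sdi(supply, demand, reserve) < sdi_threshold
            and rdi(demand, demand_history) > rdi_threshold)

# Tight margin (SDI = 0.08) and high relative demand (RDI ~ 1.22):
print(spike_risk(5100, 5000, 300, [4000, 4100, 4200]))  # True
# Comfortable margin (SDI = 0.625): no spike flagged.
print(spike_risk(6000, 4000, 500, [4000, 4100, 4200]))  # False
```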

Relevance:

100.00%

Publisher:

Abstract:

Background: Sentinel node biopsy (SNB) is being increasingly used but its place outside randomized trials has not yet been established. Methods: The first 114 sentinel node (SN) biopsies performed for breast cancer at the Princess Alexandra Hospital from March 1999 to June 2001 are presented. In 111 cases axillary dissection was also performed, allowing the accuracy of the technique to be assessed. A standard combination of preoperative lymphoscintigraphy, intraoperative gamma probe and injection of blue dye was used in most cases. Results are discussed in relation to the risk and potential consequences of understaging. Results: Where both probe and dye were used, the SN was identified in 90% of patients. A significant number of patients were treated in two stages and the technique was no less effective in patients who had SNB performed at a second operation after the primary tumour had already been removed. The interval from radioisotope injection to operation was very wide (between 2 and 22 h) and did not affect the outcome. Nodal metastases were present in 42 patients in whom an SN was found, and in 40 of these the SN was positive, giving a false negative rate of 4.8% (2/42), with the overall percentage of patients understaged being 2%. For this particular group as a whole, the increased risk of death due to systemic therapy being withheld as a consequence of understaging (if SNB alone had been employed) is estimated at less than 1/500. The risk for individuals will vary depending on other features of the particular primary tumour. Conclusion: For patients who elect to have the axilla staged using SNB alone, the risk and consequences of understaging need to be discussed. These risks can be estimated by allowing for the specific surgeon's false negative rate for the technique, and considering the likelihood of nodal metastases for a given tumour. 
There appears to be no disadvantage with performing SNB at a second operation after the primary tumour has already been removed. Clearly, for a large number of patients, SNB alone will be safe, but ideally participation in randomized trials should continue to be encouraged.
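The headline rates above come directly from the counts the abstract reports, and can be checked with a couple of lines of arithmetic:

```python
# Worked numbers from the abstract: nodal metastases were present in
# 42 patients in whom a sentinel node was found; the sentinel node was
# positive in 40 of these, leaving 2 false negatives.
node_positive = 42
sn_positive = 40
false_negatives = node_positive - sn_positive

fn_rate = false_negatives / node_positive
print(round(fn_rate * 100, 1))  # 4.8 (percent)

# Overall understaging: the 2 missed cases among the 111 patients who
# also underwent axillary dissection.
print(round(false_negatives / 111 * 100))  # 2 (percent)
```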

Relevance:

100.00%

Publisher:

Abstract:

This paper uses research in a major UK company on the introduction of an electronic document management system to explore perceptions of, and attitudes to, risk. Phenomenological methods were used, with the resulting dialogue transcripts evaluated in the Winmax dialogue software using an adapted theoretical framework based on an analysis of the literature. The paper identifies a number of factors, and builds a framework, that should support a greater understanding of risk assessment and project management by the academic community and practitioners.

Relevance:

100.00%

Publisher:

Abstract:

Ross River virus (RRV) is a mosquito-borne arbovirus responsible for outbreaks of polyarthritic disease throughout Australia. To better understand the human and environmental factors driving such events, 57 historical reports on RRV outbreaks between 1896 and 1998 were examined collectively. The magnitude, regularity, seasonality, and locality of outbreaks were found to be wide ranging; however, analysis of climatic and tidal data highlighted that environmental conditions act differently in tropical, arid, and temperate regions. Overall, rainfall seems to be the single most important risk factor, with over 90% of major outbreak locations receiving higher than average rainfall in preceding months. Mean temperatures were close to average, particularly in tropical populations; however, in arid regions, below average maximum temperatures predominated, and in southeast temperate regions, above average minimum temperatures predominated. High spring tides preceded coastal outbreaks, both in the presence and absence of rainfall, and the relationships between rainfall and the Southern Oscillation Index and La Niña episodes suggest they may be useful predictive tools, but only in southeast temperate regions. Such heterogeneity predisposing outbreaks supports the notion that there are different RRV epidemiologies throughout Australia, but also suggests that generic parameters for the prediction and control of outbreaks are of limited use at a local level.

Relevance:

100.00%

Publisher:

Abstract:

Background and Purpose. The re-admission of patients to intensive care is associated with increased morbidity, mortality, loss of morale for patients and family, and increased health costs. The aim of the present study was to identify factors which place patients at a higher risk of re-admission to intensive care. Method. A prospective study of patients who were re-admitted to a 22-bed tertiary-level intensive care facility within a 12-month period. Data were kept on every patient re-admitted to intensive care, including standard demographic data, initial admission diagnosis, co-morbidities, re-admission diagnosis, mobility on discharge, secretions, airway, chest X-ray, PaCO2, PaO2, PaO2/FiO2 and time of discharge. Subjects included 74 patients who had been re-admitted to intensive care in a 12-month period and a comparison group of patients who were not re-admitted to intensive care. A cross-tabs procedure was initially used to estimate maximum likelihood. Significant factors included age over 65 years (p

Relevance:

100.00%

Publisher:

Abstract:

In this paper we utilise a stochastic address model of broadcast oligopoly markets to analyse the Australian broadcast television market. In particular, we examine the effect of the presence of a single government market participant in this market. An examination of the dynamics of the simulations demonstrates that the presence of a government market participant can simultaneously generate positive outcomes for viewers as well as for other market suppliers. Further examination of simulation dynamics indicates that privatisation of the government market participant results in reduced viewer choice and diversity. We also demonstrate that additional private market participants would not result in significant benefits to viewers.

Relevance:

100.00%

Publisher:

Abstract:

Background and purpose Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis. Methods and results In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, the information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores. Conclusions Detailed documentation of all survey procedures with standardized protocols, training, and quality control are steps towards optimizing data quality. Quantifying data quality is a further step. Methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
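The two uses of quality scores described above, excluding low-scoring populations and weighting by score, can be sketched in a few lines. This is an illustrative toy example only; the MONICA Project's actual scoring scheme and population values are not given in the abstract.

```python
# Illustrative sketch (not the MONICA Project's actual scheme): each
# population's survey estimate carries a quality score, used both to
# weight the pooled estimate and to exclude low-quality populations
# in a sensitivity test. All names and numbers here are hypothetical.

populations = {
    "pop_a": {"value": 12.0, "quality": 3},
    "pop_b": {"value": 15.0, "quality": 1},   # low-quality population
    "pop_c": {"value": 13.0, "quality": 2},
}

def weighted_mean(pops):
    # Pooled estimate with each population weighted by its quality score.
    total_w = sum(p["quality"] for p in pops.values())
    return sum(p["value"] * p["quality"] for p in pops.values()) / total_w

def mean_excluding(pops, min_quality):
    # Sensitivity test: simple mean after dropping low-scoring populations.
    kept = [p["value"] for p in pops.values() if p["quality"] >= min_quality]
    return sum(kept) / len(kept)

print(weighted_mean(populations))       # quality-weighted estimate
print(mean_excluding(populations, 2))   # estimate excluding low scores
```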
