766 results for price discovery
Abstract:
Energy prices are highly volatile and often feature unexpected spikes. The aim of this paper is to examine whether the occurrence of these extreme price events displays any regularities that can be captured using an econometric model. Here we treat these price events as point processes and apply Hawkes and Poisson autoregressive models to model the dynamics in the intensity of this process. We use load and meteorological information to model the time variation in the intensity of the process. The models are applied to data from the Australian wholesale electricity market, and a forecasting exercise illustrates both the usefulness of these models and their limitations when attempting to forecast the occurrence of extreme price events.
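The Hawkes model mentioned above treats each extreme price event as temporarily raising the conditional intensity of further events. A minimal sketch of such a self-exciting intensity with an exponential kernel (all parameter values are illustrative, not estimated from the paper's data) might look like:

```python
import numpy as np

def hawkes_intensity(event_times, t, mu=0.1, alpha=0.5, beta=1.0):
    """Conditional intensity lambda(t) = mu + alpha * sum_i exp(-beta * (t - t_i))
    over past events t_i < t (exponential kernel).
    mu: baseline rate; alpha: jump per event; beta: decay rate.
    All values here are illustrative placeholders."""
    past = np.asarray([ti for ti in event_times if ti < t])
    return mu + alpha * np.exp(-beta * (t - past)).sum()

# Example: a cluster of recent spike times (in days) raises the
# intensity well above the baseline mu.
events = [1.0, 2.5, 2.8]
print(hawkes_intensity(events, 3.0))
```

In a full application the parameters would be estimated by maximum likelihood, with `mu` made time-varying as a function of load and meteorological covariates, as the abstract describes.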
Abstract:
This thesis is a study of the automatic discovery of text features for describing user information needs. It presents an innovative data-mining approach that discovers useful knowledge from both relevance and non-relevance feedback information. The proposed approach can greatly reduce noise in discovered patterns and significantly improve the performance of text-mining systems. This study provides a promising method for research in data mining and web intelligence.
Abstract:
Presently, organisations engage in what are termed Global Business Transformation Projects [GBTPs] to consolidate, innovate, transform and restructure their processes and business strategies while undergoing fundamental change. Culture plays an important role in global business transformation projects, as these involve people of different cultural backgrounds and span countries, industries and disciplinary boundaries. Nevertheless, there is scant empirical research both on how culture is conceptualised beyond national and organisational cultures and on how culture is to be taken into account and dealt with within global business transformation projects. This research is situated in a business context and discovers a theory that aids in describing and dealing with culture. It draws on the lived experiences of thirty-two senior management practitioners, reporting on more than sixty-one global business transformation projects in which they were actively involved. The research method is qualitative and interpretive and applies a grounded theory approach, with rich data generated through interviews. In addition, vignettes were developed to illustrate the derived theoretical models. The findings from this study contribute to knowledge in multiple ways. First, the study provides a holistic account of global business transformation projects that describes the construct of culture through the elements of culture types, cultural differences and cultural diversity. A typology of culture types has been developed which enlarges the view of culture beyond national and organisational culture to include an industry culture, a professional service firm culture and a 'theme' culture. The amalgamation of the culture types instantiated in a global business transformation project comprises its project culture.
Second, the empirically grounded process for managing culture in global business transformation projects integrates the stages of recognition, understanding and management, as well as enablement, providing a roadmap for dealing with culture in global business transformation projects. Third, this study identified contextual variables of global business transformation projects, which provide the means of describing the environment in which such projects are situated, influence the construct of culture and inform the process for managing culture. Fourth, the contribution to research method is the positioning of interview research as a strategy for data generation and the detailed documentation of applying grounded theory to discover theory.
Abstract:
This paper evaluates the efficiency of a number of popular corpus-based distributional models in performing discovery on very large document sets, including online collections. Literature-based discovery is the process of identifying previously unknown connections from text, often published literature, that could lead to the development of new techniques or technologies. Literature-based discovery has attracted growing research interest ever since Swanson's serendipitous discovery of the therapeutic effects of fish oil on Raynaud's disease in 1986. The successful application of distributional models in automating the identification of indirect associations underpinning literature-based discovery has been demonstrated extensively in the medical domain. However, we wish to investigate the computational complexity of distributional models for literature-based discovery on much larger document collections, as they may provide computationally tractable solutions to tasks including predicting future disruptive innovations. In this paper we perform a computational complexity analysis on four successful corpus-based distributional models to evaluate their fit for such tasks. Our results indicate that corpus-based distributional models that store their representations in fixed dimensions provide superior efficiency on literature-based discovery tasks.
Abstract:
As necessary intermediates, in vivo small molecules are involved in numerous critical metabolic pathways and biological processes associated with many essential biological functions and events. There is growing evidence that MS-based metabolomics is emerging as a powerful tool to facilitate the discovery of functional small molecules that can better our understanding of development, infection, nutrition, disease, toxicity, drug therapeutics, gene modifications and host-pathogen interaction from metabolic perspectives. However, further progress must still be made in MS-based metabolomics because of the shortcomings in the current technologies and knowledge. This technique-driven review aims to explore the discovery of in vivo functional small molecules facilitated by MS-based metabolomics and to highlight the analytic capabilities and promising applications of this discovery strategy. Moreover, the biological significance of discovering in vivo functional small molecules in different biological contexts is also interrogated from a metabolic perspective.
Abstract:
As a renewable energy source, wind power is playing an increasingly important role in China’s electricity supply. Meanwhile, China is also the world’s largest market for Clean Development Mechanism (CDM) wind power projects. Based on data from 27 wind power projects in Inner Mongolia registered with the Executive Board of the United Nations (EB) in 2010, this paper constructs a financial model of Net Present Value (NPV) to analyze the cost of wind power electricity. A sensitivity analysis is then conducted to examine the impact of different variables with and without Certified Emission Reduction (CER) income brought about by the CDM. It is concluded that the CDM, along with static investment and annual wind electricity production, is one of the most significant factors in promoting the development of wind power in China. Additionally, wind power is envisaged as a practical proposition for competing with thermal power if the appropriate actions identified in the paper are taken.
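The NPV comparison with and without CER income can be sketched as follows; every figure below (investment, revenue, discount rate, CER income, project life) is a hypothetical placeholder for illustration, not a number from the paper:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at t = 0 (e.g. -investment)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical 20-year wind project (all values illustrative):
# 100 units of upfront investment, 9 units/yr electricity revenue,
# and optionally +2 units/yr CER income under the CDM.
investment, revenue, cer, rate, years = 100.0, 9.0, 2.0, 0.08, 20
without_cdm = npv(rate, [-investment] + [revenue] * years)
with_cdm = npv(rate, [-investment] + [revenue + cer] * years)
print(without_cdm, with_cdm)
```

With these illustrative numbers the project is unviable without CER income and viable with it, which is the kind of sensitivity the paper's analysis examines.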
Abstract:
Consumer awareness and usage of Unit Price (UP) information continues to hold academic interest. Originally designed as a device to enable shoppers to make comparisons between grocery products, it is argued that consumers still lack a sufficient understanding of the device. Previous research has tended to focus on product choice, effect of time, and structural changes to price presentation. No studies have tested the effect of UP consumer education on grocery shopping expenditure. Supported by distributed learning theories, this is the first study to condition participants, over a twenty-week period, to comprehend and employ UP information while shopping. A 3x5 mixed factorial design was employed to collect data from 357 shoppers. A 3 (Control, Massed, Spaced) x 5 (Time Point: Week 0, 5, 10, 15 and 20) mixed factorial analysis of variance (ANOVA) was performed to analyse the data. Preliminary results revealed that the three groups differed in their average expenditure over the twenty weeks. The Control group remained stable across the five time points. Results indicated that both intensive (Massed) and less intensive (Spaced) exposure to UP information achieved similar results, with both groups reducing average expenditure similarly by Week 5. These patterns held for twenty weeks, with conditioned groups reducing their grocery expenditure by over 10%. This research has academic value as a test of applied learning theories. We argue that retailers can attain considerable market advantages, as efforts to enhance customers’ knowledge through consumer education campaigns can have a strong positive impact on customer trust and goodwill toward the organisation. Hence, major practical implications for both regulators and retailers exist.
Abstract:
Food prices and food affordability are important determinants of food choices, obesity and non-communicable diseases. As governments around the world consider policies to promote the consumption of healthier foods, data on the relative price and affordability of foods, with a particular focus on the difference between ‘less healthy’ and ‘healthy’ foods and diets, are urgently needed. This paper briefly reviews past and current approaches to monitoring food prices, and identifies key issues affecting the development of practical tools and methods for food price data collection, analysis and reporting. A step-wise monitoring framework, including measurement indicators, is proposed. ‘Minimal’ data collection will assess the differential price of ‘healthy’ and ‘less healthy’ foods; ‘expanded’ monitoring will assess the differential price of ‘healthy’ and ‘less healthy’ diets; and the ‘optimal’ approach will also monitor food affordability, by taking into account household income. The monitoring of the price and affordability of ‘healthy’ and ‘less healthy’ foods and diets globally will provide robust data and benchmarks to inform economic and fiscal policy responses. Given the range of methodological, cultural and logistical challenges in this area, it is imperative that all aspects of the proposed monitoring framework are tested rigorously before implementation.
Abstract:
This paper examines the dynamic behaviour of relative prices across seven Australian cities by applying panel unit root test procedures with structural breaks to quarterly consumer price index data for 1972 Q1–2011 Q4. We find overwhelming evidence of convergence in city relative prices. Three common structural breaks are endogenously determined at 1985, 1995, and 2007. Further, correcting for two potential biases, namely Nickell bias and time aggregation bias, we obtain half-life estimates of 2.3–3.8 quarters that are much shorter than those reported by previous research. Thus, we conclude that both structural breaks and bias corrections are important to obtain shorter half-life estimates.
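The half-life figures quoted above follow from the standard AR(1) relation, half-life = ln(0.5)/ln(rho), where rho is the estimated persistence of relative-price deviations. A quick check (the rho values are back-solved here for illustration, not taken from the paper):

```python
import math

def half_life(rho):
    """Half-life (in periods) of a deviation in an AR(1) process
    with persistence coefficient 0 < rho < 1."""
    return math.log(0.5) / math.log(rho)

# Quarterly persistence of roughly 0.74-0.83 implies half-lives of
# about 2.3-3.8 quarters, the range reported in the abstract.
print(half_life(0.74), half_life(0.83))
```

This also makes the abstract's point concrete: small upward biases in the estimated rho (such as Nickell bias or time aggregation bias) translate into substantially longer half-life estimates.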
Abstract:
The purpose of this paper is to document and explain the allocation of takeover purchase price to identifiable intangible assets (IIAs), purchased goodwill, and/or target net tangible assets in an accounting environment unconstrained with respect to IIA accounting policy choice. Using a sample of Australian acquisitions during the unconstrained accounting environment from 1988 to 2004, we find the percentage allocation of purchase price to IIAs averaged 19.09%. The percentage allocation to IIAs is significantly positively related to return on assets and insignificantly related to leverage, contrary to opportunism. Efficiency suggests an explanation: profitable firms acquire and capitalise a higher percentage of IIAs in acquisitions. The target's investment opportunity set is significantly positively related to the percentage allocation to IIAs, consistent with information-signalling. The paper contributes to the accounting policy choice literature by showing how Australian firms make the one-off accounting policy choice regarding the allocation of the takeover purchase price (often a substantial dollar amount) in an environment where accounting for IIAs was unconstrained.
Abstract:
Guaranteeing the quality of extracted features that describe relevant knowledge to users or topics is a challenge because of the large number of extracted features. Most popular existing term-based feature selection methods suffer from extracting noisy features that are irrelevant to the user's needs. One popular method is to extract phrases or n-grams to describe the relevant knowledge. However, extracted n-grams and phrases usually contain a lot of noise. This paper proposes a method for reducing the noise in n-grams. The method first extracts more specific features (terms) to remove noisy features. The method then uses an extended random set to accurately weight n-grams based on their distribution in the documents and the distribution of their terms within n-grams. The proposed approach not only reduces the number of extracted n-grams but also improves the performance. The experimental results on the Reuters Corpus Volume 1 (RCV1) data collection and TREC topics show that the proposed method significantly outperforms the state-of-the-art methods underpinned by Okapi BM25, tf*idf and Rocchio.
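The idea of weighting n-grams by both their distribution across documents and the distribution of their constituent terms can be illustrated with a toy scorer. This is a simplified stand-in for illustration only, not the paper's extended-random-set formulation:

```python
from collections import Counter

def weight_ngrams(docs):
    """Toy sketch: weight each n-gram by its document frequency, scaled by
    the average document frequency of its constituent terms, so that
    n-grams built from well-supported terms rank higher.
    `docs` is a list of documents, each a list of n-gram strings."""
    n = len(docs)
    ngram_df = Counter()
    term_df = Counter()
    for doc in docs:
        for ng in set(doc):
            ngram_df[ng] += 1
        for term in {t for ng in doc for t in ng.split()}:
            term_df[term] += 1
    return {ng: (df / n) * (sum(term_df[t] for t in ng.split())
                            / (len(ng.split()) * n))
            for ng, df in ngram_df.items()}

docs = [["price discovery", "market price"],
        ["price discovery"],
        ["noise term"]]
w = weight_ngrams(docs)
print(sorted(w, key=w.get, reverse=True))
```

In this toy corpus "price discovery" outranks "market price" and "noise term" because both its document support and its term support are higher, which is the kind of noise reduction the abstract targets.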
Abstract:
The international tax system, designed a century ago, has not kept pace with the modern multinational entity, rendering it ineffective in taxing many modern businesses according to economic activity. One of those modern multinational entities is the multinational financial institution (MNFI). The recent global financial crisis provides a particularly relevant and significant example of the failure of the current system on a global scale. The modern MNFI is increasingly undertaking more globalised and complex trading operations. A primary reason for the globalisation of financial institutions is that they typically ‘follow the customer’ into jurisdictions where international capital and international investors are required. The International Monetary Fund (IMF) recently reported that from 1995-2009, foreign bank presence in developing countries grew by 122 per cent. The same study indicates that foreign banks have a 20 per cent market share in OECD countries and 50 per cent in emerging markets and developing countries. Hence, it is most significant that MNFIs are increasingly undertaking an intermediary role in developing economies, where they are financing core business activities such as mining and tourism. IMF analysis also suggests that in the future, foreign bank expansion will be greatest in emerging economies. The difficulties for developing countries in applying current international tax rules, especially the current traditional transfer pricing regime, are particularly acute in relation to MNFIs, which are the biggest users of tax havens and offshore finance. This paper investigates whether a unitary taxation approach which reflects economic reality would more easily and effectively ensure that the profits of MNFIs are taxed in the jurisdictions which give rise to those profits.
It has previously been argued that the uniqueness of MNFIs results in a failure of the current system to accurately allocate profits and that unitary tax as an alternative could provide a sounder allocation model for international tax purposes. This paper goes a step further and examines the practicalities of the implementation of unitary taxation for MNFIs in terms of the key components of such a regime, along with their implications. This paper adopts a two-step approach in considering the implications of unitary taxation as a means of improved corporate tax coordination, which requires international acceptance and agreement. First, the definitional issues of the unitary MNFI are examined and, second, an appropriate allocation formula for this sector is investigated. To achieve this, the paper asks, first, how the financial sector should be defined for the purposes of unitary taxation and what should constitute a unitary business for that sector and, second, what is the ‘best practice’ model of an allocation formula for the purposes of the apportionment of the profits of the unitary business of a financial institution.
Abstract:
The aim of this work is to develop a demand-side-response model which assists electricity consumers exposed to the market price to independently and proactively manage air-conditioning peak electricity demand. The main contribution of this research is to show how consumers can optimise the energy cost of the air-conditioning load across several cases, e.g. a normal price, a price spike, and the probability of a price spike. The model also investigates how air-conditioning can apply a pre-cooling method when there is a substantial risk of a price spike. The results indicate the potential of the scheme to achieve financial benefits for consumers and to target the best economic performance for electricity generation, distribution and transmission. The model was tested with Queensland electricity market data from the Australian Energy Market Operator and Brisbane temperature data from the Bureau of Statistics for hot days from 2011 to 2012.
Abstract:
This work presents a demand-side-response (DSR) model which assists small electricity consumers, through an aggregator, exposed to the market price to proactively mitigate price and peak impacts on the electrical system. The proposed model allows consumers to manage air-conditioning as a function of possible price spikes. The main contribution of this research is to demonstrate how consumers can minimise the total expected cost by optimising air-conditioning to account for occurrences of a price spike in the electricity market. The model investigates how a pre-cooling method can be used to minimise energy costs when there is a substantial risk of an electricity price spike. The model was tested with Queensland electricity market data from the Australian Energy Market Operator and Brisbane temperature data from the Bureau of Statistics for hot weekdays in the period 2011 to 2012.
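The pre-cooling trade-off described above can be sketched as a two-period expected-cost comparison: pre-cool now at the normal price to shed peak-period load, or wait and face a possible price spike. All prices, loads and probabilities below are illustrative assumptions, not values from the paper:

```python
def expected_cost(precool, p_spike, normal_price, spike_price,
                  precool_kwh=3.0, avoided_peak_kwh=1.0, peak_kwh=2.0):
    """Toy two-period model (all parameters illustrative).
    precool: buy extra energy now at normal_price to reduce peak load.
    p_spike: probability the peak-period price spikes to spike_price."""
    if precool:
        base = precool_kwh * normal_price          # cost of pre-cooling now
        peak_load = peak_kwh - avoided_peak_kwh    # reduced peak consumption
    else:
        base = 0.0
        peak_load = peak_kwh
    exp_peak_price = p_spike * spike_price + (1 - p_spike) * normal_price
    return base + peak_load * exp_peak_price

# With a high spike probability, pre-cooling lowers the expected cost;
# with a very low one, the extra pre-cooling energy is not worth buying.
print(expected_cost(True, 0.3, 0.05, 12.5), expected_cost(False, 0.3, 0.05, 12.5))
```

The illustrative spike price of 12.5 $/kWh loosely mirrors the scale of Australian wholesale price spikes relative to a 0.05 $/kWh normal price; the decision flips with the spike probability, which is the mechanism the abstract describes.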