9 results for Real electricity markets
in Aston University Research Archive
Abstract:
This paper employs a Component GARCH in Mean model to show that house prices across a number of major US cities between 1987 and 2009 have displayed asset market properties in terms of both risk-return relationships and asymmetric adjustment to shocks. In addition, tests for structural breaks in the mean and variance indicate structural instability across the data range. Multiple breaks are identified across all cities, particularly for the early 1990s and during the post-2007 financial crisis as housing has become an increasingly risky asset. Estimating the models over the individual sub-samples suggests that over the last 20 years the financial sector has increasingly failed to account for the levels of risk associated with real estate markets. This result has possible implications for the way in which financial institutions should be regulated in the future.
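The risk-return channel described above can be illustrated with a minimal GARCH(1,1)-in-mean simulation. This is a simplification of the Component GARCH-in-Mean model the paper actually uses, and all parameter values below are assumed for illustration:

```python
import numpy as np

def simulate_garch_m(n=1000, mu=0.0, lam=0.5,
                     omega=0.1, alpha=0.1, beta=0.8, seed=0):
    """Simulate a GARCH(1,1)-in-mean process:
        r_t = mu + lam * h_t + e_t,   e_t ~ N(0, h_t)
        h_t = omega + alpha * e_{t-1}**2 + beta * h_{t-1}
    The 'in-mean' term lam * h_t makes the expected return rise with risk."""
    rng = np.random.default_rng(seed)
    h = np.empty(n)
    r = np.empty(n)
    e_prev = 0.0
    h_prev = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    for t in range(n):
        h[t] = omega + alpha * e_prev**2 + beta * h_prev
        e_prev = rng.normal(0.0, np.sqrt(h[t]))
        r[t] = mu + lam * h[t] + e_prev
        h_prev = h[t]
    return r, h

returns, cond_var = simulate_garch_m()
```

The in-mean coefficient `lam` ties the expected return to the conditional variance, which is the sense in which a riskier housing market should command a higher expected return.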
Abstract:
Lock-in is observed in real-world markets of experience goods; experience goods are goods whose characteristics are difficult to determine in advance but are ascertained upon consumption. We create an agent-based simulation of consumers choosing between two experience goods available in a virtual market. We model consumers on a grid representing their spatial network. Utilising simple assumptions, including identical distributions of product experience and consumers having a degree of follower tendency, we explore the dynamics of the model through simulations. We conduct simulations to create a lock-in before testing several hypotheses on how to break an existing lock-in; these include the effects of advertising and free give-aways. Our experiments show that the key to successfully breaking a lock-in is the creation of regions in a consumer population. Regions arise from the degree of local conformity between agents within them, and spread throughout the population when a mildly superior competitor is available. These regions may be likened to a market niche that gains in popularity and transitions into the mainstream.
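A toy version of such an agent-based lock-in model can be sketched as follows. This is not the paper's implementation: the grid size, follower weight, noise level and quality gap are all assumed for illustration. Under strong conformity the population stays locked in to the incumbent; with weak conformity the mildly superior product spreads:

```python
import numpy as np

def step(choices, follower_w, rng, quality_gap=0.05):
    """One synchronous update of a torus grid of consumers choosing product
    0 or 1. Each agent mixes a noisy private experience signal (whose mean,
    quality_gap, favours product 1) with the fraction of its four neighbours
    using product 1, weighted by its follower tendency follower_w."""
    neigh = (np.roll(choices, 1, 0) + np.roll(choices, -1, 0) +
             np.roll(choices, 1, 1) + np.roll(choices, -1, 1)) / 4.0
    experience = rng.normal(quality_gap, 0.3, size=choices.shape)
    score = follower_w * (neigh - 0.5) + (1 - follower_w) * experience
    return (score > 0).astype(int)

rng = np.random.default_rng(1)

locked = np.zeros((20, 20), dtype=int)          # everyone starts on product 0
for _ in range(50):
    locked = step(locked, follower_w=0.9, rng=rng)   # strong conformity

free = np.zeros((20, 20), dtype=int)
for _ in range(50):
    free = step(free, follower_w=0.2, rng=rng)       # weak conformity
```

With `follower_w=0.9` the neighbourhood term dominates and the lock-in persists; with `follower_w=0.2` private experience dominates and the better product gains a substantial share.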
Abstract:
Market mechanisms are a means by which resources in contention can be allocated between contending parties, both in human economies and in those populated by software agents. Designing such mechanisms has traditionally been carried out by hand, and more recently by automation. Assessing these mechanisms typically involves evaluating them against multiple conflicting objectives, which can often be nonlinear, noisy, and expensive to compute. For typical performance objectives, it is known that designed mechanisms often fall short of being optimal across all objectives simultaneously. However, in all previous automated approaches, either only a single objective is considered, or the multiple performance objectives are combined into a single one. In this paper we do not aggregate objectives; instead we consider a direct, novel application of multi-objective evolutionary algorithms (MOEAs) to the problem of automated mechanism design. This allows the automatic discovery of the trade-offs that such objectives impose on mechanisms. We pose the problem of mechanism design, specifically for the class of linear redistribution mechanisms, as a naturally occurring multi-objective optimisation problem. We apply a modified version of NSGA-II to design mechanisms within this class, given economically relevant objectives such as welfare and fairness. This application of NSGA-II exposes trade-offs between objectives, revealing relationships between them that were otherwise unknown for this mechanism class. The understanding of the trade-offs gained from applying MOEAs can thus help practitioners apply the discovered mechanisms insightfully in their respective real or artificial markets.
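The core operation behind NSGA-II's ranking, extracting the non-dominated (Pareto) set from a population of candidate mechanisms, can be sketched as follows; the objective values here are purely illustrative:

```python
def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors,
    assuming every objective is to be maximised (e.g. welfare, fairness).
    A point is dominated if another point is at least as good on every
    objective and strictly better on at least one."""
    def dominates(a, b):
        return (all(x >= y for x, y in zip(a, b)) and
                any(x > y for x, y in zip(a, b)))
    return [p for p in points if not any(dominates(q, p) for q in points)]

# toy candidate mechanisms scored on (welfare, fairness)
scores = [(0.9, 0.2), (0.7, 0.7), (0.4, 0.9), (0.6, 0.6), (0.9, 0.1)]
front = pareto_front(scores)   # the trade-off curve between the two objectives
```

The surviving points form the trade-off surface: improving fairness past a point is only possible by giving up welfare, which is exactly the relationship the MOEA exposes.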
Abstract:
This paper presents some forecasting techniques for energy demand and price prediction, one day ahead. These techniques combine the wavelet transform (WT) with fixed and adaptive machine learning/time series models (multi-layer perceptron (MLP), radial basis functions, linear regression, or GARCH). To create an adaptive model, we use an extended Kalman filter or particle filter to update the parameters continuously on the test set. The adaptive GARCH model is a new contribution, broadening the applicability of GARCH methods. We empirically compare two approaches to combining the WT with prediction models: multicomponent forecasts and direct forecasts. These techniques are applied to large sets of real data (both stationary and non-stationary) from the UK energy markets, so as to provide comparative results that are statistically stronger than those previously reported. The results show that forecasting accuracy is significantly improved by using the WT and adaptive models. The best models for the electricity demand and gas price forecasts are the adaptive MLP and adaptive GARCH respectively, each with the multicomponent forecast; their MSEs are 0.02314 and 0.15384.
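The wavelet step can be illustrated with a one-level Haar transform; the paper does not specify the wavelet family here, so Haar is assumed for simplicity. In the multicomponent approach, each output component is forecast separately and the component forecasts are recombined with the inverse transform:

```python
import numpy as np

def haar_decompose(x):
    """One-level Haar wavelet transform: split an even-length series into
    an approximation (low-frequency) and a detail (high-frequency) part,
    each half the original length."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_reconstruct(approx, detail):
    """Inverse of haar_decompose: interleave the components back."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

# synthetic demand-like series: slow trend plus fast oscillation
demand = np.sin(np.linspace(0, 6, 64)) + 0.1 * np.cos(np.linspace(0, 40, 64))
approx, detail = haar_decompose(demand)
```

In a multicomponent forecast, a separate model (e.g. an MLP for `approx`, GARCH for `detail`) would predict each component one step ahead before `haar_reconstruct` recombines them; a direct forecast would instead feed both components into a single model.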
Abstract:
This paper presents a forecasting technique for forward energy prices, one day ahead. This technique combines a wavelet transform with forecasting models such as a multi-layer perceptron, linear regression, or GARCH. These techniques are applied to real data from the UK gas markets to evaluate their performance. The results show that forecasting accuracy is improved significantly by using the wavelet transform. The methodology can also be applied to forecasting market clearing prices and electricity/gas loads.
Abstract:
This paper presents a forecasting technique for forward electricity/gas prices, one day ahead. This technique combines a Kalman filter (KF) and a generalised autoregressive conditional heteroskedasticity (GARCH) model (often used in financial forecasting). The GARCH model is used to compute the next value of a time series; the KF updates the parameters of the GARCH model as each new observation becomes available. This technique is applied to real data from the UK energy markets to evaluate its performance. The results show that forecasting accuracy is improved significantly by using this hybrid model. The methodology can also be applied to forecasting market clearing prices and electricity/gas loads.
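The structure of such a hybrid can be sketched as follows. Note that the update below is a crude gradient step used as a stand-in for the paper's Kalman filter, and all parameter values are assumed; it only illustrates the forecast-then-update recursion:

```python
import numpy as np

def garch_step(params, e_prev, h_prev):
    """One-step-ahead GARCH(1,1) variance forecast:
       h_t = omega + alpha * e_{t-1}**2 + beta * h_{t-1}."""
    omega, alpha, beta = params
    return omega + alpha * e_prev**2 + beta * h_prev

def online_update(params, e_prev, h_prev, e_new, lr=1e-3):
    """Adapt (omega, alpha, beta) when a new observation e_new arrives.
    A gradient step on the squared-innovation error stands in here for the
    Kalman filter update used in the paper."""
    h = garch_step(params, e_prev, h_prev)
    err = e_new**2 - h                          # realised vs forecast variance
    grad = np.array([1.0, e_prev**2, h_prev])   # dh/d(omega, alpha, beta)
    new = np.asarray(params) + lr * err * grad
    return np.clip(new, 1e-6, None)             # keep parameters positive
```

At each time step the model forecasts with `garch_step`, observes the realised innovation, and calls `online_update`, so the parameters track a non-stationary market instead of staying frozen at their fitted values.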
Abstract:
This thesis presents a comparison of integrated biomass-to-electricity systems on the basis of their efficiency, capital cost and electricity production cost. Four systems are evaluated: combustion to raise steam for a steam cycle; atmospheric gasification to produce fuel gas for a dual fuel diesel engine; pressurised gasification to produce fuel gas for a gas turbine combined cycle; and fast pyrolysis to produce pyrolysis liquid for a dual fuel diesel engine. The feedstock in all cases is wood in chipped form. This is the first time that all three thermochemical conversion technologies have been compared in a single, consistent evaluation. The systems have been modelled from the transportation of the wood chips through pretreatment, thermochemical conversion and electricity generation. Equipment requirements during pretreatment are comprehensively modelled and include reception, storage, drying and comminution. The de-coupling of the fast pyrolysis system is examined, where the fast pyrolysis and engine stages are carried out at separate locations. Relationships are also included to allow learning effects to be studied. The modelling is achieved through the use of multiple spreadsheets, where each spreadsheet models part of the system in isolation and the spreadsheets are combined to give the cost and performance of a whole system. The use of the models has shown that on current costs the combustion system remains the most cost-effective generating route, despite its low efficiency. The novel systems only produce lower-cost electricity if learning effects are included, implying that some sort of subsidy will be required during the early development of the gasification and fast pyrolysis systems to make them competitive with the established combustion approach.
The use of decoupling in fast pyrolysis systems is a useful way of reducing system costs if electricity is required at several sites, because a single pyrolysis site can be used to supply all the generators, offering economies of scale at the conversion step. Overall, costs are much higher than conventional electricity generating costs for fossil fuels, due mainly to the small scales used. Biomass-to-electricity opportunities remain restricted to niche markets where electricity prices are high or feed costs are very low. It is highly recommended that further work examine possibilities for combined heat and power, which is suitable for small-scale systems and could increase revenues, thereby reducing electricity prices.
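The economies-of-scale argument behind decoupling can be illustrated with a standard power-law capital cost scaling rule; the exponent and cost figures below are assumed for illustration, not taken from the thesis:

```python
def capital_cost(base_cost, base_size, size, scale_exp=0.7):
    """Power-law capital cost scaling:
       cost = base_cost * (size / base_size) ** scale_exp.
    An exponent below 1 gives economies of scale, so one large plant is
    cheaper than several small ones of the same total capacity."""
    return base_cost * (size / base_size) ** scale_exp

# decoupled layout: one pyrolysis plant feeding four engine sites,
# versus four stand-alone pyrolysis-plus-engine sites
one_shared_plant = capital_cost(10.0, 1.0, 4.0)
four_small_plants = 4 * capital_cost(10.0, 1.0, 1.0)
```

With a scale exponent of 0.7, the single four-unit plant costs roughly two-thirds of the four separate plants, which is the saving decoupling captures at the conversion step.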
Abstract:
In this paper, we discuss some practical implications for implementing adaptable network algorithms applied to non-stationary time series problems. Two real world data sets, containing electricity load demands and foreign exchange market prices, are used to test several different methods, ranging from linear models with fixed parameters, to non-linear models which adapt both parameters and model order on-line. Training with the extended Kalman filter, we demonstrate that the dynamic model-order increment procedure of the resource allocating RBF network (RAN) is highly sensitive to the parameters of the novelty criterion. We investigate the use of system noise for increasing the plasticity of the Kalman filter training algorithm, and discuss the consequences for on-line model order selection. The results of our experiments show that there are advantages to be gained in tracking real world non-stationary data through the use of more complex adaptive models.
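The RAN novelty criterion mentioned above can be sketched as follows; the thresholds are exactly the sensitive parameters the paper discusses, and the values used here are assumed:

```python
import numpy as np

def is_novel(x, error, centres, dist_thresh, err_thresh):
    """RAN-style novelty criterion: add a new RBF centre only when the input
    lies far from every existing centre AND the current prediction error is
    large. Both thresholds govern how fast the model order grows on-line."""
    if not centres:
        return True
    dmin = min(np.linalg.norm(x - c) for c in centres)
    return dmin > dist_thresh and abs(error) > err_thresh
```

An input near an existing centre, or one that is already predicted well, does not grow the network; only inputs that are both distant and poorly predicted do, which is why poorly chosen thresholds lead either to runaway model growth or to a network that never adapts.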
Abstract:
Since the development of large-scale power grid interconnections and power markets, research on available transfer capability (ATC) has attracted great attention. The challenges in accurately assessing ATC originate from the numerous uncertainties in the electricity generation, transmission, distribution and utilization sectors. Power system uncertainties can mainly be described as two types: randomness and fuzziness. However, the traditional transmission reliability margin (TRM) approach considers only randomness. Based on credibility theory, this paper first builds models of generators, transmission lines and loads according to their features of both randomness and fuzziness. A random fuzzy simulation is then applied, along with a novel proposed method for ATC assessment, in which both randomness and fuzziness are considered. The bootstrap method and a multi-core parallel computing technique are introduced to enhance the processing speed. Simulations of the IEEE 30-bus system and a real-life system located in Northwest China verify the viability of the models and the proposed method.
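The bootstrap step can be illustrated with a percentile bootstrap over simulated output. This is a generic sketch, not the paper's implementation, and the sample data below are synthetic:

```python
import numpy as np

def bootstrap_ci(samples, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the mean of simulation
    output (e.g. ATC estimates from repeated random fuzzy simulation runs):
    resample with replacement, recompute the mean, and take quantiles."""
    rng = np.random.default_rng(seed)
    samples = np.asarray(samples, dtype=float)
    means = np.array([rng.choice(samples, size=samples.size).mean()
                      for _ in range(n_boot)])
    lo, hi = np.quantile(means, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# synthetic ATC values (MW) standing in for random fuzzy simulation output
atc_runs = np.random.default_rng(1).normal(100.0, 5.0, 200)
lo, hi = bootstrap_ci(atc_runs)
```

Because each bootstrap resample is independent, the `n_boot` recomputations parallelise trivially across cores, which is presumably why the paper pairs the bootstrap with multi-core parallel computing.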