52 results for Competitive price
Abstract:
The increase of distributed generation (DG) has brought new challenges to electrical networks, electricity markets, and the operation and management of DG units. Several approaches are being developed to manage the emerging potential of DG, such as Virtual Power Players (VPPs), which aggregate DG plants, and Smart Grids, an approach that views generation and associated loads as a subsystem. This paper presents a multi-level negotiation mechanism for the optimal operation of Smart Grids and their negotiation in electricity markets, taking advantage of VPPs' management capabilities. The proposed methodology is implemented and tested in MASCEM, a multi-agent electricity market simulator developed to allow in-depth studies of the interactions between the players that take part in electricity market negotiations.
Abstract:
In recent decades, all over the world, competition in the electric power sector has deeply changed the way this sector's agents play their roles. In most countries, deregulation of the electricity sector was conducted in stages, beginning with the clients at higher voltage levels and with larger electricity consumption, and later extended to all electricity consumers. The liberalization of the sector and the operation of competitive electricity markets were expected to lower prices and improve quality of service, leading to greater consumer satisfaction. Transmission and distribution remain noncompetitive business areas, due to the large infrastructure investments required. However, the industry has yet to clearly establish the best business model for transmission in a competitive environment. After generation, electricity needs to be delivered to the electrical system nodes where demand requires it, taking into consideration transmission constraints and electrical losses. If the amount of power flowing through a certain line is close to or surpasses its safety limits, then cheap but distant generation may have to be replaced by more expensive generation closer to the load, to reduce the excess power flows. In a congested area, the optimal price of electricity rises to the marginal cost of the local generation, or to the level needed to ration demand down to the amount of available electricity. Even without congestion, some power is lost in the transmission system through heat dissipation, so prices reflect that it is more expensive to supply electricity at the far end of a heavily loaded line than close to a generation site. Locational marginal pricing (LMP), resulting from bidding competition, represents the electrical and economic value at nodes or in areas, and can provide economic indicator signals to the market agents. This article proposes a data-mining-based methodology that helps characterize zonal prices in real power transmission networks. To test our methodology, we used an LMP database from the California Independent System Operator (CAISO) for 2009 to identify economic zones. (CAISO is a nonprofit public benefit corporation charged with operating the majority of California's high-voltage wholesale power grid.) To group the buses into typical classes, each representing a set of buses with approximately the same LMP value, we used two-step and k-means clustering algorithms. By analyzing the various LMP components, our goal was to extract knowledge to support the ISO in investment and network-expansion planning.
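As a rough illustration of the clustering step described above, the sketch below groups buses by their hourly LMP profiles with k-means. The data is synthetic (the paper uses the CAISO 2009 LMP database); the zone structure, bus count, and price levels are invented for the example.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_buses, n_hours = 200, 24

# Hypothetical hourly LMPs ($/MWh): three latent price zones plus noise.
zone = rng.integers(0, 3, size=n_buses)
base = np.array([35.0, 42.0, 55.0])[zone][:, None]
lmp = base + rng.normal(0.0, 3.0, size=(n_buses, n_hours))

# Group buses into classes with approximately equal LMP profiles.
km = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = km.fit_predict(lmp)

for k in range(3):
    members = np.where(labels == k)[0]
    print(f"zone {k}: {len(members)} buses, "
          f"mean LMP {lmp[members].mean():.1f} $/MWh")
```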
Abstract:
In competitive electricity markets with deep concerns for the efficiency level, demand response programs gain considerable significance. As demand response levels have decreased after the introduction of competition in the power industry, new approaches are required to take full advantage of demand response opportunities. This paper presents DemSi, a demand response simulator that allows studying demand response actions and schemes in distribution networks. It undertakes the technical validation of the solution using realistic network simulation based on PSCAD. The use of DemSi by a retailer in a situation of energy shortage is presented. Load reduction is obtained using a consumer-based price elasticity approach supported by real-time pricing. Non-linear programming is used to maximize the retailer's profit, determining the optimal solution for each envisaged load reduction. The solution determines the price variations under two different approaches, with price variations set for each individual consumer or for each consumer type, showing that the choice between them does not significantly influence the retailer's profit. The paper presents a case study in a 33-bus distribution network with 5 distinct consumer types. The obtained results and conclusions show the adequacy of the used methodology and its importance for supporting retailers' decision making.
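A minimal sketch of the elasticity-based load reduction described above: a real-time price per consumer type is chosen to meet a load-reduction target while maximizing the retailer's profit, via non-linear programming (scipy's SLSQP here). The demand model, elasticities, tariffs, and cost figures are illustrative assumptions, not DemSi's data.

```python
import numpy as np
from scipy.optimize import minimize

d0   = np.array([120.0, 80.0, 60.0, 40.0, 20.0])   # base demand per type (kW)
p0   = 0.10                                        # base tariff ($/kWh)
eps  = np.array([-0.3, -0.5, -0.4, -0.6, -0.2])    # price elasticities per type
cost = 0.12                                        # marginal supply cost during shortage
target_cut = 40.0                                  # required load reduction (kW)

def demand(p):
    return d0 * (p / p0) ** eps                    # constant-elasticity response

def neg_profit(p):
    d = demand(p)
    return -(np.sum(p * d) - cost * np.sum(d))     # negative retailer profit

# Require the total load to drop by at least target_cut.
cons = {"type": "ineq",
        "fun": lambda p: np.sum(d0) - np.sum(demand(p)) - target_cut}

res = minimize(neg_profit, x0=np.full(5, p0), bounds=[(p0, 0.5)] * 5,
               constraints=cons, method="SLSQP")
print("prices:", np.round(res.x, 3), "profit:", round(-res.fun, 2))
```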
Abstract:
Sustainable development concerns have led renewable energy sources to be increasingly used for distributed electricity generation. However, this is mainly due to incentives or mandatory targets set by energy policies, as in the European Union. Assuring a sustainable future requires distributed generation to be able to participate in competitive electricity markets. To gain more negotiation power in the market and to take advantage of economies of scale, distributed generators can be aggregated, giving rise to a new concept: the Virtual Power Producer (VPP). VPPs are multi-technology and multi-site heterogeneous entities that should adopt organization and management methodologies so that they can make distributed generation a truly profitable activity, able to participate in the market. This paper presents ViProd, a simulation tool that allows simulating VPP operation in the context of MASCEM, a multi-agent based electricity market simulator.
Abstract:
This paper proposes a swarm intelligence long-term hedging tool to support electricity producers in competitive electricity markets. The tool investigates the long-term hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward) and financial (options) settlement. To find the optimal portfolio, the producer's risk preference is stated by a utility function (U) expressing the trade-off between the expectation and the variance of the return. The expected return and the variance estimate are based on a forecasted scenario interval determined by a long-term price-range forecast model, developed by the authors, whose explanation is outside the scope of this paper. The proposed tool makes use of Particle Swarm Optimization (PSO), and its performance has been evaluated by comparing it with a Genetic Algorithm (GA) based approach. To validate the risk management tool, a case study using real historical price data from the Spanish mainland market is presented to demonstrate the effectiveness of the proposed methodology.
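The sketch below shows the general shape of such a PSO-based portfolio search: particle positions are weights over spot, forward, and option positions, and the fitness is a mean-variance utility U = E − λ·Var. The expected returns, covariance matrix, and PSO coefficients are invented for the example; the paper derives its scenarios from the authors' price-range forecast model.

```python
import numpy as np

rng = np.random.default_rng(1)
mu  = np.array([0.06, 0.04, 0.09])       # hypothetical expected returns: spot, forward, options
cov = np.array([[0.040, 0.006, 0.010],   # hypothetical covariance of the returns
                [0.006, 0.010, 0.004],
                [0.010, 0.004, 0.090]])
lam = 2.0                                # risk-aversion weight in U = E - lam * Var

def utility(w):
    return mu @ w - lam * (w @ cov @ w)

def project(w):
    """Keep weights non-negative and summing to one."""
    w = np.clip(w, 0.0, 1.0)
    s = w.sum()
    return w / s if s > 0 else np.full_like(w, 1.0 / w.size)

n, dim, iters = 30, 3, 200
x = np.apply_along_axis(project, 1, rng.random((n, dim)))   # particle positions
v = np.zeros_like(x)                                        # particle velocities
pbest, pval = x.copy(), np.array([utility(w) for w in x])   # personal bests
g = pbest[pval.argmax()].copy()                             # global best

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
    x = np.apply_along_axis(project, 1, x + v)
    val = np.array([utility(w) for w in x])
    better = val > pval
    pbest[better], pval[better] = x[better], val[better]
    g = pbest[pval.argmax()].copy()

print("weights:", np.round(g, 3), "utility:", round(utility(g), 4))
```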
Abstract:
With the increasing importance of large-scale commerce across the Internet, it is becoming increasingly evident that in a few years the Internet will host a large number of interacting software agents. A vast number of them will be economically motivated and will negotiate a variety of goods and services. It is therefore important to consider the economic incentives and behaviours of economic software agents, and to use all available means to anticipate their collective interactions. This paper addresses this concern by presenting a multi-agent market simulator designed for analysing agent market strategies based on a complete understanding of buyer and seller behaviours, preference models and pricing algorithms, considering risk preferences. The system includes agents that are capable of improving their performance with their own experience, by adapting to the market conditions. The results of the negotiations between agents are analysed by data mining algorithms in order to extract rules that give agents feedback to improve their strategies.
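A bare-bones sketch of the kind of buyer/seller interplay such a simulator studies: a seller concedes downward from its ask, a buyer concedes upward from its bid, and both adapt their opening offers from the outcome of each episode. The concession and adaptation rules are invented for illustration and are not the simulator's strategies.

```python
class Agent:
    def __init__(self, opening, reservation, step):
        self.opening, self.reservation, self.step = opening, reservation, step

def negotiate(buyer, seller, rounds=20):
    """Alternating concessions; a deal closes at the midpoint once offers cross."""
    bid, ask = buyer.opening, seller.opening
    for _ in range(rounds):
        if bid >= ask:
            return (bid + ask) / 2
        bid = min(bid + buyer.step, buyer.reservation)     # buyer concedes upward
        ask = max(ask - seller.step, seller.reservation)   # seller concedes downward
    return None                                            # no agreement

buyer  = Agent(opening=30.0, reservation=55.0, step=2.0)
seller = Agent(opening=70.0, reservation=45.0, step=2.0)

for episode in range(5):
    price = negotiate(buyer, seller)
    # Naive adaptation: open more aggressively after a deal,
    # concede a little in the opening offer after a failure.
    if price is not None:
        buyer.opening = max(20.0, buyer.opening - 1.0)
        seller.opening = min(80.0, seller.opening + 1.0)
    else:
        buyer.opening += 1.0
        seller.opening -= 1.0
    print(f"episode {episode}: deal at {price}")
```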
Residential property loans and performance during property price booms: evidence from European banks
Abstract:
Understanding the performance of banks is of the utmost relevance, because of the impact of this sector on economic growth and financial stability. Of all the different assets that make up a bank's portfolio, residential mortgage loans constitute one of its main components. Using the dynamic panel data method, we analyse the influence of residential mortgage loans on bank profitability and risk, using a sample of 555 banks in the European Union (EU-15) over the period from 1995 to 2008. We find that banks with larger weights of residential mortgage loans show lower credit risk in good times. This result helps explain why banks rush to lend on property during booms, given the positive effect such lending has on credit risk. The results further show that credit risk and profitability are lower during the upturn in the residential property price cycle. The results also reveal the existence of a non-linear relationship (a U-shaped marginal effect), as a function of a bank's risk, between profitability and residential mortgage loan exposure. For banks with high credit risk, a large exposure to residential mortgage loans is associated with higher risk-adjusted profitability, through lower risk. For banks with moderate to low credit risk, the effects of higher residential mortgage loan exposure on risk-adjusted profitability are also positive or marginally positive.
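To make the "U-shaped marginal effect" concrete, the toy regression below fits profitability on mortgage exposure and its square over synthetic data and reads the marginal effect off the coefficients. The paper itself estimates a dynamic panel model on real bank data; this plain OLS sketch only illustrates the mechanics, and all variables are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 1000
rml  = rng.uniform(0.0, 0.8, n)                    # residential-mortgage share
risk = rng.uniform(0.0, 1.0, n)                    # bank credit-risk proxy
# Synthetic risk-adjusted profitability with a built-in U-shaped effect.
profit = (0.02 - 0.05 * rml + 0.08 * rml**2 + 0.01 * risk
          + rng.normal(0, 0.005, n))

df = pd.DataFrame({"profit": profit, "rml": rml, "rml2": rml**2, "risk": risk})
fit = smf.ols("profit ~ rml + rml2 + risk", data=df).fit()

b1, b2 = fit.params["rml"], fit.params["rml2"]
print(fit.params.round(4))
# Marginal effect of exposure is b1 + 2*b2*rml; it flips sign at -b1/(2*b2).
print("marginal effect at exposure 0.1:", round(b1 + 2 * b2 * 0.1, 4))
print("turning point:", round(-b1 / (2 * b2), 3))
```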
Abstract:
Master's degree in Electrical Engineering – Electrical Power Systems
Abstract:
Value creation in the healthcare market as a differentiating factor for price negotiation and competitiveness in the context of a global crisis. At a time when the health sector is singled out as a critical cost area, it is becoming increasingly difficult to steer healthcare contracting towards value for patients, that is, towards the outcomes achieved rather than the volume of care provided. The aim was to study value creation in the healthcare market as a differentiating factor for price negotiation and competitiveness in the context of an economic crisis. The operating results of a company providing home oxygen therapy services were compared under two different strategies: a direct price reduction, or price maintenance combined with value creation for the client. The proposals were then presented for evaluation and online voting by a group of 8 hospital managers. The value-based proposal (No. 2) shows better operating results (41%), although it entails higher costs. Regarding the voting on the proposals, and given the scenario presented, half of the managers chose proposal No. 1 (N=4) and the other half proposal No. 2 (N=4). However, most managers (N=7) considered proposal No. 1 the more competitive in a context of competition with more suppliers. It is concluded that, in the negotiation of healthcare contracts, a value-based proposal can ensure that prices are maintained. However, in a persisting economic recession and in a competitive scenario with several suppliers, this type of proposal may not be chosen, because it does not appear to represent immediate gains for the contracting institution.
Abstract:
Ancillary services represent a good business opportunity that must be considered by market players. This paper presents a new methodology for ancillary services market dispatch. The method considers the bids submitted to the market and includes a market clearing mechanism based on deterministic optimization. An Artificial Neural Network is used for day-ahead prediction of Regulation Down, Regulation Up, Spin Reserve and Non-Spin Reserve requirements. Two test cases based on California Independent System Operator data concerning the dispatch of Regulation Down, Regulation Up, Spin Reserve and Non-Spin Reserve services are included in this paper to illustrate the application of the proposed method: (1) dispatch considering simple bids; (2) dispatch considering complex bids.
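A hedged sketch of the day-ahead requirement forecasting step: a small neural network maps calendar and load features to the four reserve requirements. The features, targets, and network size are invented for the example; the paper trains on CAISO data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n = 500
hour = rng.integers(0, 24, n)
load = 20000 + 6000 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 500, n)
X = np.column_stack([hour, load])

# Synthetic targets (MW): Regulation Down, Regulation Up, Spin and
# Non-Spin Reserve requirements, loosely tied to system load plus noise.
Y = np.column_stack([0.010 * load, 0.012 * load,
                     0.030 * load, 0.020 * load]) + rng.normal(0, 50, (n, 4))

scaler = StandardScaler().fit(X)
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
ann.fit(scaler.transform(X), Y)

tomorrow = scaler.transform(np.array([[18, 25500.0]]))   # hour 18, 25,500 MW forecast
print("predicted [RegDown, RegUp, Spin, NonSpin] MW:",
      ann.predict(tomorrow).round(0))
```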
Abstract:
Consider a single processor and a software system. The software system comprises components and interfaces, where each component has an associated interface and each component comprises a set of constrained-deadline sporadic tasks. A scheduling algorithm (called the global scheduler) determines at each instant which component is active. The active component uses another scheduling algorithm (called the local scheduler) to determine which task is selected for execution on the processor. The interface of a component makes certain information about the component visible to other components; the interfaces of all components are used for schedulability analysis. We address the problem of generating an interface for a component based on the tasks inside the component. We desire to (i) incur only a small loss in schedulability analysis due to the interface and (ii) ensure that the amount of space (counted in bits) of the interface is small; this is because such an interface hides as many details of the component as possible. We present an algorithm for generating such an interface.
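One plausible shape of such an interface generator is sketched below: compute the component's demand bound function and, for a candidate replenishment period P, pick the smallest budget Q whose (pessimistic, linear) supply bound covers the demand over a test horizon. This is a generic periodic-resource-style construction for illustration, not the paper's algorithm; the task set and horizon are invented.

```python
from math import floor

# (C, D, T): worst-case execution time, relative deadline, min inter-arrival time.
tasks = [(1, 4, 8), (2, 6, 12), (1, 5, 10)]

def dbf(t):
    """EDF demand bound of the component's sporadic tasks in a window of length t."""
    return sum((floor((t - d) / p) + 1) * c for (c, d, p) in tasks if t >= d)

def lsbf(t, P, Q):
    """Linear lower bound on the supply of a periodic resource with period P, budget Q."""
    return max(0.0, (Q / P) * (t - 2 * (P - Q)))

def smallest_budget(P, horizon=200):
    """Smallest Q such that the supply bound covers the demand up to the horizon."""
    for Q in range(1, P + 1):
        if all(dbf(t) <= lsbf(t, P, Q) for t in range(1, horizon + 1)):
            return Q
    return None

# A compact (P, Q) pair is the candidate interface; smaller periods waste
# less capacity here because the linear supply bound is less pessimistic.
for P in (2, 4, 6):
    print(f"period {P}: smallest budget {smallest_budget(P)}")
```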
Abstract:
Compositional real-time scheduling clearly requires that "normal" real-time scheduling challenges are addressed, but challenges intrinsic to compositionality must be addressed as well, in particular: (i) how should interfaces be described? and (ii) how should numerical values be assigned to the parameters constituting the interfaces? The real-time systems community has traditionally used narrow interfaces for describing a component (for example, a utilization/bandwidth-like metric and the distribution of this bandwidth in time). In this paper, we introduce the concept of the competitive ratio of an interface and show that typical narrow interfaces cause poor performance when scheduling constrained-deadline sporadic tasks (the competitive ratio is infinite). Therefore, we explore more expressive interfaces, in particular a class called medium-wide interfaces. For this class, we propose an interface type and show how the parameters of the interface should be selected. We also prove that this interface is 8-competitive.
Abstract:
Consider the problem of scheduling a set of sporadically arriving tasks on a uniform multiprocessor with the goal of meeting deadlines. A processor p has speed Sp. Tasks can be preempted but cannot migrate between processors. We propose an algorithm which can schedule all task sets that any other possible algorithm can schedule, assuming that our algorithm is given processors that are three times faster.
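The sketch below shows a simple density-based first-fit partitioning in the spirit of this problem setting: each task is placed on the fastest processor whose remaining capacity covers the task's density. The placement rule and the sufficient density test are generic illustrations with an invented task set, not the paper's algorithm.

```python
# (C, D, T): execution time, relative deadline, minimum inter-arrival time.
tasks = [(2, 5, 10), (3, 8, 8), (1, 4, 12), (4, 10, 20)]
speeds = [2.0, 1.5, 1.0]                            # processor speeds S_p

def partition(tasks, speeds):
    """First-fit by density on processors ordered fastest-first."""
    load = [0.0] * len(speeds)                      # density already placed
    order = sorted(range(len(speeds)), key=lambda p: -speeds[p])
    assignment = {}
    for i, (c, d, t) in enumerate(tasks):
        density = c / min(d, t)
        for p in order:
            # Total density <= speed is a sufficient (not tight) EDF test.
            if load[p] + density <= speeds[p]:
                load[p] += density
                assignment[i] = p
                break
        else:
            return None                             # heuristic failed to place the task
    return assignment

print(partition(tasks, speeds))                     # task index -> processor index
```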
Abstract:
Consider the problem of scheduling a set of sporadically arriving tasks on a uniform multiprocessor with the goal of meeting deadlines. A processor p has speed Sp. Tasks can be preempted but cannot migrate between processors. On each processor, tasks are scheduled according to rate-monotonic scheduling. We propose an algorithm that can schedule all task sets that any other possible algorithm can schedule, assuming that our algorithm is given processors that are √2/(√2−1) ≈ 3.41 times faster. No such guarantees were previously known for partitioned static-priority scheduling on uniform multiprocessors.
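For the partitioned rate-monotonic setting, the sketch below shows a classical sufficient admission test: a task is accepted on processor p if the total utilization there stays under the Liu and Layland bound scaled by the speed Sp. The task set and speeds are invented, and this is an illustrative test for implicit-deadline tasks, not the algorithm analysed in the paper.

```python
tasks = [(1, 5), (2, 8), (3, 12), (2, 6)]          # (C, T) pairs
speeds = [1.5, 1.0]                                # processor speeds S_p

def rm_bound(n):
    """Liu & Layland utilization bound for n tasks under rate-monotonic."""
    return n * (2 ** (1.0 / n) - 1)

def try_assign(tasks, speeds):
    """First-fit assignment, admitting a task only if the scaled bound holds."""
    bins = [[] for _ in speeds]
    for c, t in sorted(tasks, key=lambda ct: ct[1]):    # RM order: shortest period first
        for p, s in enumerate(speeds):
            u = sum(ci / ti for ci, ti in bins[p]) + c / t
            if u <= s * rm_bound(len(bins[p]) + 1):     # sufficient, not necessary
                bins[p].append((c, t))
                break
        else:
            return None                                 # no processor could admit the task
    return bins

print(try_assign(tasks, speeds))
```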