170 results for Multi-stage programming
Abstract:
Following the deregulation of retail electricity markets in most countries, the majority of new entrants in the liberalized retail market were pure REPs (retail electricity providers). These entities are exposed to financial risk because of unexpected price variations, price spikes, volatile loads, and the potential exertion of market power by GENCOs (generation companies). A REP can manage these market risks by employing DR (demand response) programs and by using its generation and storage assets in the distribution network to serve its customers. The proposed model suggests how a REP with light physical assets, such as DG (distributed generation) units and ESSs (energy storage systems), can survive in a competitive retail market. The paper discusses effective risk management strategies for REPs to deal with the uncertainties of the DAM (day-ahead market) and to hedge against financial losses in the market. A two-stage stochastic programming problem is formulated, which aims to establish financial incentive-based DR programs and the optimal dispatch of the DG units and ESSs. The uncertainty of the forecasted day-ahead load demand and electricity price is taken into account with a scenario-based approach. The principal advantage of this model for REPs is reducing the risk of financial losses in DAMs; the main benefit for the whole system is market power mitigation, achieved by virtually increasing the price elasticity of demand and reducing peak demand.
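The two-stage structure described above (commit assets first, then face uncertain prices and loads) can be illustrated with a minimal scenario-based sketch. All numbers below (scenario probabilities, prices, loads, DG cost and capacity) are invented for illustration and are not taken from the paper, and the brute-force grid search stands in for a proper stochastic programming solver.

```python
# Illustrative scenario data: probability, day-ahead price, and load demand.
scenarios = [
    {"prob": 0.5, "price": 50.0, "load": 100.0},   # normal day
    {"prob": 0.3, "price": 120.0, "load": 110.0},  # price spike
    {"prob": 0.2, "price": 40.0, "load": 90.0},    # low demand
]
DG_COST = 60.0  # assumed marginal cost of the DG unit (per MWh)
DG_MAX = 50     # assumed DG capacity (MWh)

def expected_cost(dg_commit):
    """First stage: commit DG output. Second stage (recourse): buy the
    residual load in the day-ahead market at the scenario price."""
    total = 0.0
    for s in scenarios:
        residual = max(s["load"] - dg_commit, 0.0)
        total += s["prob"] * (dg_commit * DG_COST + residual * s["price"])
    return total

# Brute-force the first-stage decision over a coarse grid.
best_cost, best_commit = min(
    (expected_cost(x), x) for x in range(0, DG_MAX + 1, 5))
```

Here the expected spot price across scenarios exceeds the DG marginal cost, so the sketch commits the full DG capacity, which is exactly the kind of hedge against price spikes the abstract describes.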
Abstract:
Environmental concerns and the shortage of fossil fuel reserves have driven the growth and globalization of distributed generation. Another increasingly important resource is demand response, which is used to change consumers’ consumption profiles, helping to reduce peak demand. To support the participation of small players in demand response events, the Curtailment Service Provider emerged; this player acts as an aggregator for demand response events. The control of small and medium players acting in smart grid and microgrid environments is enhanced with a multi-agent system employing artificial intelligence techniques: the MASGriP (Multi-Agent Smart Grid Platform). Using strategic behaviours in each player, this system simulates the profiles of real players by means of software agents. This paper shows the importance of modeling these behaviours for studying this type of scenario. A case study with three examples shows the differences between the players and the best behaviour for achieving the highest profit in each situation.
Abstract:
Traditional vertically integrated power utilities around the world have evolved from monopoly structures to open markets that promote competition among suppliers and provide consumers with a choice of services. Market forces drive the price of electricity and reduce the net cost through increased competition. Electricity can be traded both in organized markets and through forward bilateral contracts. This article focuses on bilateral contracts and describes some important features of an agent-based system for bilateral trading in competitive markets. Special attention is devoted to the negotiation process, demand response in bilateral contracting, and risk management. The article also presents a case study on forward bilateral contracting, in which a retailer agent and a customer agent negotiate a 24-hour rate tariff.
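The negotiation process between a retailer agent and a customer agent can be sketched as a simple alternating-concession exchange. The concession strategy, step size, and reservation prices below are illustrative assumptions, not the article's actual negotiation model.

```python
# Hedged sketch of bilateral tariff negotiation between two agents.
# All starting offers and limits (EUR/MWh) are invented for illustration.
def negotiate(retailer_start=90.0, customer_start=50.0,
              retailer_floor=65.0, customer_cap=75.0,
              step=5.0, max_rounds=10):
    """Each round both agents concede a fixed step toward the other side.
    A deal closes when the offers cross; the agents split the difference."""
    r_offer, c_offer = retailer_start, customer_start
    for round_no in range(1, max_rounds + 1):
        if c_offer >= r_offer:  # offers crossed: agreement reached
            return round((r_offer + c_offer) / 2, 2), round_no
        r_offer = max(r_offer - step, retailer_floor)  # retailer concedes down
        c_offer = min(c_offer + step, customer_cap)    # customer concedes up
    return None, max_rounds  # no agreement within the deadline

price, rounds = negotiate()
```

With these parameters both agents meet in the middle after a few rounds; real agent-based negotiation would add risk preferences and demand response terms on top of this skeleton.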
Abstract:
The dynamism and ongoing changes in the electricity markets sector, intensified by the large increase in competitiveness, create the need for simulation platforms that support operators, regulators, and the involved players in understanding and dealing with this complex environment. This paper presents an enhanced electricity market simulator, based on multi-agent technology, which provides an advanced simulation framework for the study of real electricity market operation and the interactions between the involved players. MASCEM (Multi-Agent Simulator of Competitive Electricity Markets) uses real data to create realistic simulation scenarios, which allow the study of the impacts and implications that electricity market transformations bring to different countries. The development of an upper ontology to support communication between participating agents also provides the means for integrating this simulator with other frameworks, such as MAN-REM (Multi-Agent Negotiation and Risk Management in Electricity Markets). A case study using the enhanced simulation platform that results from the integration of several systems and tools is presented, with a scenario based on real data, simulating the MIBEL electricity market environment and comparing the simulation performance with the real electricity market results.
Abstract:
This paper presents the Realistic Scenarios Generator (RealScen), a tool that processes data from real electricity markets to generate realistic scenarios that enable the modeling of electricity market players’ characteristics and strategic behavior. The proposed tool provides significant advantages to the decision making process in an electricity market environment, especially when coupled with a multi-agent electricity markets simulator. The generation of realistic scenarios is performed using mechanisms for intelligent data analysis, which are based on artificial intelligence and data mining algorithms. These techniques allow the study of realistic scenarios, adapted to the existing markets, and improve the representation of market entities as software agents, enabling a detailed modeling of their profiles and strategies. This work contributes significantly to the understanding of the interactions between the entities acting in electricity markets by increasing the capability and realism of market simulations.
Abstract:
Multi-agent approaches have been widely used to model complex systems of a distributed nature with a large number of interactions between the involved entities. Power systems are a reference case, mainly due to the increasing use of distributed energy sources, largely based on renewables, which have driven huge changes in the power systems sector. Dealing with such a large-scale integration of intermittent generation sources led to the emergence of several new players, as well as the development of new paradigms, such as the microgrid concept, and the evolution of demand response programs, which encourage the active participation of consumers. This paper presents a multi-agent based simulation platform that models a microgrid environment, considering several different types of simulated players. These players interact with real physical installations, creating a realistic simulation environment whose results can be observed directly in reality. A case study is presented considering the players’ responses to a demand response event, resulting in an intelligent increase of consumption to absorb the wind generation surplus.
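The demand response event in the case study (raising consumption where wind generation exceeds load) can be illustrated with a toy load-shifting computation. The generation and consumption profiles and the flexibility share below are invented for illustration; the MASGriP agents implement much richer behaviours.

```python
wind = [30, 80, 120, 60]      # assumed wind generation per period (kW)
base_load = [70, 60, 50, 70]  # assumed baseline consumption per period (kW)
FLEX_SHARE = 0.2              # assumed fraction of load players can shift

def absorb_surplus(wind, base_load, flex_share):
    """Shift flexible load out of deficit periods into wind-surplus periods."""
    surplus = [max(w - l, 0) for w, l in zip(wind, base_load)]
    total_surplus = sum(surplus)
    # energy freed in the periods without a wind surplus...
    pool = sum(flex_share * l for l, s in zip(base_load, surplus) if s == 0)
    shifted = [l - flex_share * l if s == 0 else l
               for l, s in zip(base_load, surplus)]
    # ...is redistributed across surplus periods, proportionally to the surplus
    return [l + pool * s / total_surplus if total_surplus else l
            for l, s in zip(shifted, surplus)]

shifted = absorb_surplus(wind, base_load, FLEX_SHARE)
```

Total consumption is conserved; it simply moves toward the periods with excess wind, which is the intelligent consumption increase the abstract describes.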
Abstract:
The sustainability of the energy system is crucial for the economic and social development of present and future societies. To keep power systems operating properly, action is typically taken on generation and on the transmission and distribution networks. However, the growing integration of distributed generation, mainly in medium- and low-voltage distribution networks, the liberalization of energy markets, the development of energy storage mechanisms, the development of automated load control systems, and the technological advances in communication infrastructures demand new methods for managing and controlling power systems. The contribution of this work is the development of an energy resource management methodology in a SmartGrid context, considering an entity called a VPP that manages a set of installations (generation units, consumers, and storage units) and, in some cases, is also responsible for managing part of the electricity network. The developed methods account for the intensive penetration of distributed generation, the emergence of Demand Response programs, and the development of new storage systems. Hierarchical levels of control and decision making are also proposed, managed by entities that act in an environment of both cooperation and competition. The proposed methodology was developed using deterministic techniques, namely mixed-integer nonlinear programming, considering three distinct objective functions (minimum cost, minimum emissions, and minimum load curtailment), which were later combined into a global objective function, allowing the Pareto optima to be determined. The local marginal costs at each bus are also computed, and the uncertainties of the input data, namely generation and consumption, are taken into account.
Thus, the VPP has at its disposal a set of solutions that allow it to make better-informed decisions in line with its operating profile. Two case studies are presented. The first uses a 32-bus distribution network published by Baran & Wu. The second uses a 114-bus distribution network adapted from the IEEE 123-bus network.
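The abstract above combines three objectives (cost, emissions, load curtailment) and determines Pareto optima. A minimal sketch of Pareto filtering over candidate schedules is shown below; the candidate names and objective values are invented for illustration and have nothing to do with the thesis's actual solutions.

```python
# Candidate schedules with illustrative (cost, emissions, curtailment) values;
# all three objectives are to be minimized.
candidates = [
    ("A", (100.0, 20.0, 0.0)),
    ("B", (80.0, 35.0, 0.0)),
    ("C", (120.0, 15.0, 0.0)),
    ("D", (110.0, 25.0, 0.0)),  # dominated by A: worse cost and emissions
]

def dominates(a, b):
    """a dominates b if it is no worse in every objective, better in one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

# Keep only the non-dominated candidates: the Pareto front.
pareto = [name for name, v in candidates
          if not any(dominates(w, v) for _, w in candidates)]
```

The VPP would then pick a point from this front according to its operating profile (cost-driven, emissions-driven, or reliability-driven).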
Abstract:
One of the main goals of science is to understand nature, i.e., to discover and explain how the world around us works. To do so, scientists need to collect data and monitor the environment. In particular, considering that about 70% of the Earth is covered by water, collecting water characterization parameters over large surfaces is a priority. Water conditions are monitored mainly through buoys. However, the buoys available on the market do not meet existing needs. This is one of the main reasons that led the Laboratório de Sistemas Autónomos (LSA) of the Instituto Superior de Engenharia do Porto to launch a project to develop a reconfigurable buoy with two operating modes: environmental monitoring and active regatta mark. The second mode is intended for autonomous sailboat regattas. The project started a year ago with a European Project Semester (EPS) [1] project, carried out by four international students, aimed at building the buoy structure and selecting the most suitable components for the measurement and control system. The architecture defined for this system is of the master-slave type, comprising a master control unit for telemetry and configuration and a slave control unit for data measurement and storage. Development continued with two Belgian students, who worked on communication and data storage. The current project, which continues with the development of measurement and data storage on the slave control unit side, has the following objectives: (i) implement the communication protocol on the slave control unit; (ii) collect and store sensor data on the SD card in real time; (iii) provide data in a timely manner; and (iv) retrieve data from the SD card offline.
Previous contributions were studied and a survey of similar existing projects was carried out. Development of the current project began with the communication protocol. This protocol, designed by the previous students, was a good starting point; nevertheless, it was updated and improved with new functionality. This component was joint work with Laurens Allart, who worked on the telemetry and configuration subsystem during this semester. The protocol was implemented on the slave control unit side through a multithreaded structure. This structure receives messages from the master unit, executes the requested actions, and sends back the results. The buoy is a reconfigurable, multimode device that can be extended with new operating modes in the future. Unfortunately, it suffers from some limitations: it supports a maximum payload of 40 kg, and its deployment area is limited by the maximum distance to the base station.
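The slave-side structure described above (receive a message from the master, execute the action, send back the result, using multiple threads) can be sketched as a worker thread fed by queues. The command names and the sensor reading are invented placeholders; the real buoy firmware and its protocol are, of course, different.

```python
import threading
import queue

requests, replies = queue.Queue(), queue.Queue()

# Hypothetical command handlers standing in for the real protocol actions.
HANDLERS = {
    "PING": lambda arg: "PONG",
    "READ_SENSOR": lambda arg: f"{arg}=21.5",  # placeholder reading
}

def slave_loop():
    """Consume master requests, execute them, and post the results back."""
    while True:
        cmd, arg = requests.get()
        if cmd == "STOP":
            break
        handler = HANDLERS.get(cmd)
        replies.put(handler(arg) if handler else f"ERR unknown {cmd}")

worker = threading.Thread(target=slave_loop)
worker.start()
requests.put(("PING", None))
requests.put(("READ_SENSOR", "water_temp"))
requests.put(("STOP", None))
worker.join()
results = [replies.get(), replies.get()]
```

The queue decouples telemetry from measurement, which is the point of the master-slave split in the buoy's architecture.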
Abstract:
Many-core platforms are an emerging technology in the real-time embedded domain. These devices offer various options for power savings and cost reductions, and contribute to overall system flexibility; however, issues such as unpredictability, scalability and analysis pessimism are serious challenges to their integration into this domain. The focus of this work is on many-core platforms using a limited migrative model (LMM). LMM is an approach based on the fundamental concepts of the multi-kernel paradigm and is a promising step towards scalable and predictable many-cores. In this work, we formulate the problem of real-time application mapping on a many-core platform using LMM, and propose a three-stage method to solve it. An extended version of the existing analysis is used to ensure that the derived mappings (i) guarantee the fulfilment of the timing constraints posed on the worst-case communication delays of individual applications, and (ii) provide an environment in which load balancing can be performed, e.g. for energy/thermal management, fault tolerance and/or performance reasons.
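A much simpler stand-in for the mapping problem above is first-fit assignment of tasks to cores under a utilization bound. The task set and the heuristic are invented for illustration; the paper's three-stage LMM method additionally accounts for worst-case communication delays.

```python
def first_fit_map(utilizations, n_cores, cap=1.0):
    """Assign tasks (heaviest first) to the first core with enough headroom.
    Returns a task -> core mapping, or None if the heuristic fails."""
    cores = [0.0] * n_cores
    mapping = {}
    for task, u in sorted(utilizations.items(), key=lambda kv: -kv[1]):
        for c in range(n_cores):
            if cores[c] + u <= cap:
                cores[c] += u
                mapping[task] = c
                break
        else:
            return None  # no core can host this task
    return mapping

# Illustrative task utilizations on a 2-core platform.
tasks = {"T1": 0.6, "T2": 0.5, "T3": 0.4, "T4": 0.3}
mapping = first_fit_map(tasks, n_cores=2)
```

Real mapping for LMM would then iterate: derive a mapping, run the timing analysis, and remap until the communication-delay constraints hold.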
Abstract:
The last decade has witnessed a major shift towards the deployment of embedded applications on multi-core platforms. However, real-time applications have not been able to fully benefit from this transition, as the computational gains offered by multi-cores are often offset by performance degradation due to shared resources, such as main memory. To efficiently use multi-core platforms for real-time systems, it is hence essential to tightly bound the interference when accessing shared resources. Although there has been much recent work in this area, a remaining key problem is to address the diversity of memory arbiters in the analysis to make it applicable to a wide range of systems. This work handles diverse arbiters by proposing a general framework to compute the maximum interference caused by the shared memory bus and its impact on the execution time of the tasks running on the cores, considering different bus arbiters. Our novel approach clearly demarcates the arbiter-dependent and arbiter-independent stages in the analysis of these upper bounds. The arbiter-dependent stage takes the arbiter and the task memory-traffic pattern as inputs and produces a model of the availability of the bus to a given task. Then, based on the availability of the bus, the arbiter-independent stage determines the worst-case request-release scenario that maximizes the interference experienced by the tasks due to contention for the bus. We show that the framework addresses the diversity problem by applying it to a memory bus shared by a fixed-priority arbiter, a time-division multiplexing (TDM) arbiter, and an unspecified work-conserving arbiter, using applications from the MediaBench test suite. We also experimentally evaluate the quality of the analysis by comparing it with a state-of-the-art TDM analysis approach, consistently showing a considerable reduction in maximum interference.
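For intuition about the arbiter-dependent stage, a textbook worst-case bound for a TDM arbiter can be sketched in a few lines. This is a deliberately coarse bound (each request just misses the core's slot and waits for all other slots), invented for illustration; the paper's framework derives far less pessimistic, traffic-aware bounds.

```python
def tdm_worst_case_delay(n_requests, n_cores, slot_len):
    """Coarse upper bound on bus interference (in cycles) for one core under
    TDM: in the worst case every request has just missed the core's own slot
    and waits for the other (n_cores - 1) slots before being served."""
    per_request = (n_cores - 1) * slot_len
    return n_requests * per_request

# e.g. 10 memory requests, 4 cores sharing the TDM bus, 5-cycle slots:
delay = tdm_worst_case_delay(10, 4, 5)
```

The gap between such a blanket bound and the actual contention a task can suffer is exactly the analysis pessimism the framework aims to reduce.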
Abstract:
10th Conference on Telecommunications (Conftele 2015), Aveiro, Portugal.
Abstract:
8th International Workshop on Multiple Access Communications (MACOM2015), Helsinki, Finland.
Abstract:
8th International Workshop on Multiple Access Communications (MACOM2015), Helsinki, Finland.
Abstract:
4th International Conference on Future Generation Communication Technologies (FGCT 2015), Luton, United Kingdom.