900 results for Constraint solving
Abstract:
The large increase of distributed energy resources, including distributed generation, storage systems, and demand response, especially in distribution networks, makes the management of the available resources a more complex and crucial process. With wind-based generation gaining relevance in the generation mix, and since wind forecasting accuracy drops rapidly as the forecast anticipation time increases, short-term and very short-term re-scheduling is required so that the final implemented solution achieves the lowest possible operation costs. This paper proposes a methodology for energy resource scheduling in smart grids, considering day-ahead, hour-ahead, and five-minutes-ahead scheduling. The short-term scheduling, undertaken five minutes ahead, takes advantage of the high accuracy of very short-term wind forecasting, providing the user with more efficient scheduling solutions. The proposed method uses a Genetic Algorithm based optimization approach that is able to cope with the hard execution time constraint of short-term scheduling. Realistic power system simulation, based on PSCAD, is used to validate the obtained solutions. The paper includes a case study with a 33-bus distribution network with high penetration of distributed energy resources implemented in PSCAD.
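As an illustration of the kind of optimization the abstract describes (not the paper's actual implementation), the sketch below shows a minimal Genetic Algorithm re-scheduling step in Python: a population of dispatch vectors is evolved to minimize a linear operation cost plus a power-balance penalty, with an updated wind forecast as input. All resource data, cost values, and GA parameters are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 5-minute-ahead setting: 10 dispatchable resources with
# linear costs, plus an updated very short-term wind forecast.
n_res, demand, wind = 10, 80.0, 12.0          # MW
cost = rng.uniform(20, 60, n_res)             # EUR/MWh (illustrative)
p_max = rng.uniform(5, 15, n_res)             # MW

def operation_cost(p):
    # Dispatch cost plus a heavy penalty on power-balance violation.
    return p @ cost + 1e3 * abs(p.sum() + wind - demand)

def tournament(pop, fit):
    i, j = rng.integers(0, len(pop), 2)
    return pop[i] if fit[i] < fit[j] else pop[j]

def ga(pop_size=40, generations=200, p_mut=0.2):
    pop = rng.uniform(0, p_max, size=(pop_size, n_res))
    for _ in range(generations):
        fit = np.array([operation_cost(ind) for ind in pop])
        children = []
        for _ in range(pop_size):
            a, b = tournament(pop, fit), tournament(pop, fit)
            mask = rng.random(n_res) < 0.5          # uniform crossover
            child = np.where(mask, a, b)
            if rng.random() < p_mut:                # bounded mutation
                k = rng.integers(n_res)
                child[k] = rng.uniform(0, p_max[k])
            children.append(child)
        pop = np.array(children)
    fit = np.array([operation_cost(ind) for ind in pop])
    return pop[fit.argmin()], fit.min()

schedule, total_cost = ga()
print(f"re-scheduled dispatch cost: {total_cost:.1f}")
```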
Abstract:
In order to develop a flexible simulator, a variety of models for Ancillary Services (AS) negotiation has been implemented in MASCEM, a multi-agent competitive electricity markets simulator. In some of these models the energy and the AS are addressed simultaneously, while in others they are addressed separately. This paper presents an energy and ancillary services joint market simulation and proposes a deterministic approach for solving the joint market. A case study based on the dispatch of Regulation Down, Regulation Up, Spinning Reserve, and Non-Spinning Reserve services is used to demonstrate that the developed methodology is suitable for solving this kind of optimization problem. The presented case study is based on real CAISO AS market data and considers fifteen bids.
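As a rough sketch of a deterministic joint clearing of this kind (simplified to two reserve products and three illustrative bids, with made-up prices, capacities, and requirements, and not the paper's formulation), a linear program co-optimizing energy and reserves could look like this in Python with SciPy:

```python
import numpy as np
from scipy.optimize import linprog

n = 3
ce  = np.array([30.0, 35.0, 40.0])   # energy bid prices ($/MWh)
cru = np.array([ 5.0,  4.0,  6.0])   # regulation-up prices
csr = np.array([ 3.0,  2.5,  3.5])   # spinning-reserve prices
cap = np.array([100.0, 80.0, 60.0])  # unit capacities (MW)
demand, ru_req, sr_req = 180.0, 20.0, 30.0

# Decision vector x = [p_1..p_n, ru_1..ru_n, sr_1..sr_n]
c = np.concatenate([ce, cru, csr])

A_eq = np.zeros((1, 3 * n))
A_eq[0, :n] = 1.0                                       # energy balance
b_eq = [demand]

A_ub, b_ub = [], []
row = np.zeros(3 * n); row[n:2 * n] = -1.0              # sum(ru) >= ru_req
A_ub.append(row); b_ub.append(-ru_req)
row = np.zeros(3 * n); row[2 * n:] = -1.0               # sum(sr) >= sr_req
A_ub.append(row); b_ub.append(-sr_req)
for i in range(n):                                      # p + ru + sr <= cap
    row = np.zeros(3 * n); row[[i, n + i, 2 * n + i]] = 1.0
    A_ub.append(row); b_ub.append(cap[i])

res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
              A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (3 * n),
              method="highs")
print(res.x.reshape(3, n))   # rows: energy, regulation up, spinning reserve
print("total cost:", res.fun)
```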
Fuzzy Monte Carlo mathematical model for load curtailment minimization in transmission power systems
Abstract:
This paper presents a methodology based on statistical failure and repair data of transmission power system components, using fuzzy-probabilistic modeling of the component outage parameters. The statistical records are used to build the fuzzy membership functions of the component outage parameters. The proposed hybrid method, combining fuzzy sets and Monte Carlo simulation based on these fuzzy-probabilistic models, captures both the randomness and the fuzziness of the component outage parameters. Once the system states have been obtained by Monte Carlo simulation, a network contingency analysis is performed to identify any overloading or voltage violation in the network. This is followed by a remedial action algorithm, based on optimal power flow, which reschedules generation to alleviate constraint violations while avoiding any load curtailment if possible or, otherwise, minimizing the total load curtailment for the states identified by the contingency analysis. To illustrate the application of the proposed methodology to a practical case, the paper includes a case study for the IEEE 24-bus Reliability Test System (RTS-1996).
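One possible, much-simplified reading of the hybrid fuzzy/Monte Carlo step is sketched below: failure and repair rates are represented as triangular fuzzy numbers, a value is drawn from each membership function per trial, converted into an unavailability, and component up/down states are sampled from it. The component names and all rate values are invented, and the sampling scheme is an interpretation, not necessarily the paper's exact procedure; each sampled down state would then feed the contingency analysis and the OPF-based remedial action.

```python
import numpy as np

rng = np.random.default_rng(1)

# Triangular fuzzy numbers (low, mode, high) for failure and repair
# rates of two illustrative transmission components (values made up).
components = {
    "line_A": {"lam": (0.1, 0.2, 0.4), "mu": (50.0, 60.0, 80.0)},
    "line_B": {"lam": (0.2, 0.3, 0.5), "mu": (40.0, 55.0, 70.0)},
}

def draw_triangular(tri):
    low, mode, high = tri
    return rng.triangular(low, mode, high)

def sample_states(n_trials=20_000):
    # For each trial, realise the fuzzy rates, convert them to an
    # unavailability U = lambda / (lambda + mu), then sample the state.
    counts = {name: 0 for name in components}
    for _ in range(n_trials):
        for name, par in components.items():
            lam = draw_triangular(par["lam"])
            mu = draw_triangular(par["mu"])
            unavailability = lam / (lam + mu)
            if rng.random() < unavailability:
                counts[name] += 1      # component found out of service
    return {name: c / n_trials for name, c in counts.items()}

print(sample_states())   # estimated outage probabilities per component
```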
Abstract:
The activity of Control Center operators is important to guarantee the effective performance of power systems. Operators' actions are crucial to deal with incidents, especially severe faults such as blackouts. In this paper, we present an Intelligent Tutoring approach for training Portuguese Control Center operators in tasks such as incident analysis and diagnosis, and power system service restoration. The Intelligent Tutoring System (ITS) approach is used in the training of the operators, taking into account context awareness and unobtrusive integration in the working environment. Several Artificial Intelligence techniques were carefully selected and combined to obtain an effective Intelligent Tutoring environment, namely Multiagent Systems, Neural Networks, Constraint-based Modeling, Intelligent Planning, Knowledge Representation, Expert Systems, User Modeling, and Intelligent User Interfaces.
Abstract:
Locational Marginal Prices (LMP) are important pricing signals for the participants of competitive electricity markets, as the effects of transmission losses and binding constraints are embedded in LMPs [1],[2]. This paper presents a software tool that evaluates nodal marginal prices considering losses and congestion. The initial dispatch is based on all the electricity transactions negotiated in the pool and in bilateral contracts. It must be checked whether this initial dispatch leads to congestion problems; if a congestion situation is detected, it must be solved. An AC power flow is used to verify whether there are congestion situations in the initial dispatch. Whenever congestion situations are detected, they are solved and a feasible dispatch (re-dispatch) is obtained. After solving the congestion problems, the simulator evaluates the LMPs. The paper presents a case study based on the IEEE 118-bus test network.
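The tool described uses an AC power flow and congestion re-dispatch; purely as an illustration of what a locational marginal price measures, the sketch below approximates the LMP at a bus as the increase in dispatch cost when the load at that bus is perturbed, on a hypothetical lossless two-bus system with one limited line (all data made up).

```python
import numpy as np
from scipy.optimize import linprog

# Toy lossless 2-bus system: cheap generator at bus 1, expensive one at
# bus 2, load at bus 2, and a limited tie line (all numbers made up).
c = np.array([20.0, 50.0])       # generator marginal costs ($/MWh)
p_max = np.array([200.0, 200.0])
line_limit = 80.0                # MW, flow from bus 1 to bus 2

def dispatch_cost(load_bus2):
    # min c1*p1 + c2*p2  s.t.  p1 + p2 = load,  p1 <= line_limit
    A_eq = [[1.0, 1.0]]
    A_ub = [[1.0, 0.0]]          # all bus-1 output flows over the tie line
    res = linprog(c, A_ub=A_ub, b_ub=[line_limit],
                  A_eq=A_eq, b_eq=[load_bus2],
                  bounds=list(zip([0, 0], p_max)), method="highs")
    return res.fun

load, eps = 120.0, 1.0
lmp_bus2 = (dispatch_cost(load + eps) - dispatch_cost(load)) / eps
print(f"approximate LMP at bus 2: {lmp_bus2:.1f} $/MWh")  # 50 when the line binds
```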
Abstract:
This paper proposes two meta-heuristics (Genetic Algorithm and Evolutionary Particle Swarm Optimization) for solving a 15-bid case of Ancillary Services Dispatch in an electricity market. A Linear Programming approach is also included for comparison purposes. A test case based on the dispatch of Regulation Down, Regulation Up, Spinning Reserve, and Non-Spinning Reserve services is used to demonstrate that the use of meta-heuristics is suitable for solving this kind of optimization problem. Faster execution times and lower computational resource requirements are the most relevant advantages of the used meta-heuristics when compared with the Linear Programming approach.
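As a loose illustration of the second meta-heuristic mentioned, Evolutionary Particle Swarm Optimization, the sketch below uses a highly simplified variant: each particle carries strategy weights (inertia, memory, cooperation) that are mutated every generation, the swarm moves using the mutated weights and a perturbed global best, and a particle keeps its mutated weights only when the move improved its personal best. The objective is a stand-in; the published EPSO and the paper's dispatch model differ in detail.

```python
import numpy as np

rng = np.random.default_rng(2)

def cost(x):
    # Stand-in objective; a real ancillary-services dispatch cost would go here.
    return float(np.sum(x ** 2))

def epso(f, dim=5, pop=30, gens=300, tau=0.2):
    x = rng.uniform(-10, 10, (pop, dim))       # particle positions
    v = np.zeros((pop, dim))                   # velocities
    w = rng.random((pop, 3))                   # strategy weights: inertia, memory, cooperation
    pbest = x.copy()
    pbest_f = np.array([f(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(gens):
        # Mutate the strategy weights (the "evolutionary" part of EPSO).
        w_mut = np.clip(w * (1 + tau * rng.standard_normal(w.shape)), 0.0, 2.0)
        g_pert = gbest + tau * rng.standard_normal(dim)   # perturbed global best
        v = (w_mut[:, [0]] * v
             + w_mut[:, [1]] * (pbest - x)
             + w_mut[:, [2]] * (g_pert - x))
        x = x + v
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        # Keep mutated weights only where the move paid off (crude selection).
        w = np.where(improved[:, None], w_mut, w)
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

print(epso(cost))
```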
Abstract:
Swarm Intelligence generally refers to a problem-solving ability that emerges from the interaction of simple information-processing units. The concept of Swarm suggests multiplicity, distribution, stochasticity, randomness, and messiness. The concept of Intelligence suggests that the problem-solving approach is successful, drawing on learning, creativity, and cognition capabilities. This paper introduces some of the theoretical foundations, the biological motivation, and fundamental aspects of swarm-intelligence-based optimization techniques such as Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), and Artificial Bee Colony (ABC) algorithms for scheduling optimization.
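For readers unfamiliar with the techniques surveyed, a minimal canonical PSO loop is sketched below on a placeholder objective; the inertia and acceleration coefficients are common textbook values, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def cost(x):
    # Placeholder scheduling cost; replace with a real objective function.
    return float(np.sum((x - 3.0) ** 2))

def pso(f, dim=4, particles=25, iters=200, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(-10, 10, (particles, dim))
    v = np.zeros((particles, dim))
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

print(pso(cost))   # should approach x = [3, 3, 3, 3]
```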
Abstract:
The main purpose of this paper is to propose a Multi-Agent Autonomic and Bio-Inspired framework with self-managing capabilities to solve complex scheduling problems through cooperative negotiation. Scheduling resolution requires the intervention of highly skilled human problem-solvers. This is a very hard and challenging domain because current systems are becoming more and more complex, distributed, interconnected, and subject to rapid change. A natural evolution of Autonomic Computing (AC) in relation to current computing is to provide systems with self-managing abilities requiring minimal human interference.
Abstract:
Quality is the watchword governing all processes and interventions of REFER (Rede Ferroviária Nacional). With the evolution of the requirements placed on rail transport, the latest innovations must always be sought so that traffic runs with ever greater safety and comfort, giving users of this mode of transport the highest possible quality. In recent years these requirements have become increasingly demanding, as operating conditions intensify, with higher speeds, tonnage, and train frequency, which calls for greater rigour in track construction and maintenance processes. REFER, after large investments in new railway infrastructure, now plays a decisive and important role in solving the problems that emerge, always relying on state-of-the-art technology to carry out maintenance work that satisfies all the required needs. This work is developed from this maintenance and conservation perspective, following a specific heavy mechanical tamping process through to its certification. The analysis of the various processes characterises the demands of maintenance, particularly on modern track, where the bar of requirements is set higher.
Abstract:
Photovoltaic systems produce clean electricity that is inexhaustible on our time scale. The International Energy Agency regards photovoltaic technology as one of the most promising, expecting in its most optimistic forecasts that by 2050 it may account for 20% of world electricity production, equivalent to 18000 TWh. However, despite the remarkable development of recent decades, the main constraint on a wider proliferation of these systems is their still high cost, combined with their poor overall performance. Although the cost and inefficiency of photovoltaic modules have been decreasing, system efficiency remains dependent on external factors subject to great variability, such as temperature and irradiance, and on the technological limitations and lack of synergy of the constituent equipment. In this context, the objective of this dissertation is to assess the optimization potential of photovoltaic systems using modelling and simulation techniques. To this end, the main factors that constrain the performance of these systems were first identified. Secondly, as a practical case study, several photovoltaic system configurations and their components were modelled in a Matlab/Simulink environment. The main advantages and disadvantages of using different modelling tools to optimize these systems were then analysed, as well as the incorporation of artificial intelligence techniques to answer the new challenges this technology will face in the future. This study concludes that modelling is not only a useful instrument for optimizing current PV systems, but will certainly also be an indispensable tool for addressing the challenges of the new applications of this technology. In this latter respect, modelling techniques based on artificial intelligence (AI) will surely play a leading role. The practical modelling case carried out showed that modelling is also a useful tool to support teaching and research. It should be kept in mind, however, that a model is only an approximation of reality, and its results must always be interpreted critically.
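The modelling referred to above was done in Matlab/Simulink; purely as a language-neutral illustration of the kind of model involved, the sketch below implements a simplified single-diode I-V characteristic in Python and sweeps the voltage to locate the maximum power point for a given irradiance and temperature. All module parameters are invented for the example.

```python
import numpy as np

# Simplified single-diode PV model: I = Iph - I0*(exp(V/(n*Vt)) - 1).
# Parameters are illustrative, not taken from any specific module datasheet.
q, k = 1.602e-19, 1.381e-23      # electron charge (C), Boltzmann constant (J/K)
n_ideality = 1.3                 # diode ideality factor
i_sc_stc = 8.0                   # short-circuit current at STC (A)
i_0 = 6e-8                       # diode saturation current (A)
alpha_i = 0.0005                 # temperature coefficient of current (A/K)

def pv_current(v, irradiance=1000.0, temp_c=25.0, cells=60):
    t_k = temp_c + 273.15
    v_t = k * t_k / q                            # thermal voltage per cell
    i_ph = (i_sc_stc + alpha_i * (t_k - 298.15)) * irradiance / 1000.0
    return i_ph - i_0 * (np.exp(v / (cells * n_ideality * v_t)) - 1.0)

# Sweep the voltage to locate the maximum power point for given conditions.
v = np.linspace(0.0, 40.0, 500)
i = np.clip(pv_current(v, irradiance=800.0, temp_c=45.0), 0.0, None)
p = v * i
print(f"approximate MPP: {p.max():.1f} W at {v[p.argmax()]:.1f} V")
```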
Abstract:
The emergence of new business models, namely the establishment of partnerships between organizations, and the possibility for companies to combine existing web data, especially from the semantic web, with their own information, have highlighted problems existing in databases, particularly those related to data quality. Poor data can result in a loss of competitiveness for the organizations holding these data, and may even lead to their disappearance, since many of their decision-making processes are based on these data. For this reason, data cleaning is essential. Current approaches to solving these problems are closely tied to database schemas and specific domains. For data cleaning to be usable across different repositories, computer systems need to understand these data, i.e., an associated semantics is needed. The solution presented in this paper includes the use of ontologies: (i) for the specification of data cleaning operations and (ii) as a way of solving the semantic heterogeneity problems of data stored in different sources. With data cleaning operations defined at a conceptual level and mappings between domain ontologies and an ontology derived from a database, the operations may be instantiated and proposed to the expert/specialist to be executed over that database, thus enabling their interoperability.
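A minimal sketch of the core idea, conceptual-level cleaning rules instantiated over a concrete schema through an ontology-to-schema mapping, is given below; every class, property, table, and column name is hypothetical, and the operations are naive placeholders.

```python
# Conceptual-level cleaning rules, expressed against ontology terms rather
# than a concrete schema (all ontology and column names are illustrative).
conceptual_rules = [
    {"concept": "Person", "property": "hasEmail",
     "operation": "normalize_lowercase"},
    {"concept": "Person", "property": "hasBirthDate",
     "operation": "enforce_iso_date"},
]

# Mapping from ontology terms to one particular database schema.
schema_mapping = {
    ("Person", "hasEmail"): ("customers", "email"),
    ("Person", "hasBirthDate"): ("customers", "birth_date"),
}

operations = {
    "normalize_lowercase": lambda v: v.strip().lower(),
    "enforce_iso_date": lambda v: v.replace("/", "-"),   # naive placeholder
}

def instantiate(rules, mapping):
    """Translate conceptual rules into concrete (table, column, operation)
    actions that can be proposed to the domain expert before execution."""
    for rule in rules:
        table, column = mapping[(rule["concept"], rule["property"])]
        yield table, column, rule["operation"], operations[rule["operation"]]

for table, column, name, fn in instantiate(conceptual_rules, schema_mapping):
    print(f"would apply {name} to {table}.{column}")
```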
Abstract:
In recent years there has been a considerable increase in the number of people in need of intensive care, especially among the elderly, a phenomenon related to population ageing (Brown 2003). However, this is not exclusive to the elderly, as diseases such as obesity, diabetes, and high blood pressure have been increasing among young adults (Ford and Capewell 2007). This new reality has to be dealt with by the healthcare sector, and particularly by the public one. Thus, finding new and cost-effective ways of delivering healthcare is of particular importance, especially when patients are not to be detached from their environments (WHO 2004). Following this line of thinking, a VirtualECare Multiagent System is presented in section 2, with our efforts centered on its Group Decision modules (Costa, Neves et al. 2007) (Camarinha-Matos and Afsarmanesh 2001). On the other hand, there has been a growing interest in combining the technological advances of the information society (computing, telecommunications, and knowledge) in order to create new methodologies for problem solving, namely those that rely on Group Decision Support Systems (GDSS) based on agent perception. Indeed, the new economy, along with increased competition in today's complex business environments, leads companies to seek complementarities in order to increase competitiveness and reduce risks. Under these scenarios, planning takes a major role in a company's life cycle. However, effective planning depends on the generation and analysis of ideas (innovative or not) and, as a result, the idea generation and management processes are crucial. Our objective is to apply the GDSS referred to above to a new area. We believe that the use of GDSS in the healthcare arena will allow professionals to achieve better results in the analysis of a patient's Electronic Clinical Profile (ECP). This attainment is vital, given the arrival on the market of new drugs and medical practices, which compete for the use of limited resources.
Abstract:
Group decision making plays an important role in today's organisations. The impact of decision making is so high and complex that the decision-making process is rarely carried out individually. In Group Decision Argumentation, there is a set of participants with different profiles and expertise levels who exchange ideas or engage in a process of argumentation and counter-argumentation, negotiate, cooperate, collaborate, or even discuss techniques and/or methodologies for problem solving. This paper proposes a Multi-Agent simulator for representing the behaviour of group members in a decision-making process. Agents' behaviour depends on rational and emotional intelligence, and they use persuasive argumentation to convince others and choose among alternatives.
Abstract:
As time goes on, it has become common sense to involve people scattered around the globe in the decision-making process. Groups are created in a formal or informal way, exchange ideas or engage in a process of argumentation and counter-argumentation, negotiate, cooperate, collaborate, or even discuss techniques and/or methodologies for problem solving. This work proposes an agent-based architecture to support a ubiquitous group decision support system, i.e., one based on the concept of agent, able to exhibit intelligent, emotion-aware behaviour and to support argumentation through interaction with individual persons or groups. The Mixed-Initiative Systems paradigm is enforced, so that the initiative can be taken by human users and/or intelligent agents.
Abstract:
Dissertation presented to the Escola Superior de Educação to obtain the degree of Master in Educational Sciences, specialization in Supervision in Education.