46 results for Multi objective evolutionary algorithms
Abstract:
Smart grids with an intensive penetration of distributed energy resources will play an important role in future power system scenarios. The intermittent nature of renewable energy sources brings new challenges that require efficient management of those sources. Additional storage resources can be beneficially used to address this problem; the massive use of electric vehicles, particularly vehicle-to-grid (usually referred to as gridable vehicles or V2G), becomes a very relevant issue. This paper addresses the impact of Electric Vehicles (EVs) on system operation costs and on the power demand curve for a distribution network with large penetration of Distributed Generation (DG) units. An efficient management methodology for EV charging and discharging is proposed, formulated as a multi-objective optimization problem. The main goals of the proposed methodology are to minimize the system operation costs and to minimize the difference between the minimum and maximum system demand (leveling the power demand curve). The proposed methodology performs the day-ahead scheduling of distributed energy resources in a distribution network with high penetration of DG and a large number of electric vehicles. The case study uses a 32-bus distribution network and considers different scenarios of EV penetration to analyze their impact on the network and on the management of the other energy resources.
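As a toy illustration of the two goals the paper combines, the following sketch (hypothetical data and function names, not the authors' model) scores a day-ahead EV schedule by a weighted sum of operation cost and the max-min gap of the demand curve:

```python
def evaluate_schedule(ev_power, base_load, price, w_cost=0.5, w_level=0.5):
    """Score a day-ahead EV schedule over 24 periods (lower is better).

    ev_power[t] > 0 means charging (extra demand); < 0 means V2G discharge.
    """
    demand = [base_load[t] + ev_power[t] for t in range(24)]
    cost = sum(price[t] * demand[t] for t in range(24))   # operation cost
    leveling = max(demand) - min(demand)                  # demand-curve gap
    return w_cost * cost + w_level * leveling

# Toy usage: flat tariff, EVs charging at night and discharging at the peak.
base = [30 + (20 if 18 <= t <= 21 else 0) for t in range(24)]
price = [0.10] * 24
ev = [3 if t < 6 else (-3 if 18 <= t <= 21 else 0) for t in range(24)]
print(evaluate_schedule(ev, base, price))
```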
Abstract:
This paper proposes a methodology to increase the probability of delivering power to any load point through the identification of new investments in distribution network components. The method minimizes the investment cost as well as the cost of energy not supplied in the network. A DC optimization model based on mixed-integer non-linear programming is developed, using the Pareto front technique to identify the investments in distribution network components that increase the probability of delivering power to any customer in the distribution system at the minimum possible cost for the system operator, while minimizing the cost of energy not supplied. A multi-objective problem is thus formulated. To illustrate the application of the proposed methodology, the paper includes a case study that considers a 180-bus distribution network.
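The Pareto front technique mentioned here can be illustrated with a minimal dominance filter; the plan encoding and the cost figures below are invented for illustration, not the paper's MINLP model:

```python
def pareto_front(plans):
    """Keep the plans not dominated in both objectives (lower is better)."""
    front = []
    for p in plans:
        dominated = any(q["inv"] <= p["inv"] and q["ens"] <= p["ens"]
                        and (q["inv"] < p["inv"] or q["ens"] < p["ens"])
                        for q in plans)
        if not dominated:
            front.append(p)
    return front

# inv = investment cost, ens = cost of energy not supplied (toy numbers)
plans = [{"inv": 100, "ens": 50}, {"inv": 150, "ens": 20},
         {"inv": 120, "ens": 60}, {"inv": 200, "ens": 15}]
print(pareto_front(plans))   # (120, 60) is dominated by (100, 50)
```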
Abstract:
The sustainability of the energy system is crucial for the economic and social development of present and future societies. To guarantee the proper operation of power systems, action is typically taken on generation and on the transmission and distribution networks. However, the growing integration of distributed generation, mainly in medium- and low-voltage distribution networks, the liberalization of energy markets, the development of energy storage mechanisms, the development of automated load control systems and the technological advances in communication infrastructures demand the development of new methods for the management and control of power systems. The contribution of this work is the development of an energy resource management methodology in a SmartGrid context, considering an entity designated as a VPP that manages a set of installations (generation units, consumers and storage units) and, in some cases, is responsible for managing part of the electrical network. The methods developed contemplate the intensive penetration of distributed generation, the emergence of Demand Response programs and the development of new storage systems. Hierarchical levels of control and decision making are also proposed, managed by entities that act in an environment of cooperation but also of competition with one another. The proposed methodology was developed using deterministic techniques, namely mixed-integer non-linear programming, considering three distinct objective functions (minimum costs, minimum emissions and minimum load curtailment), which subsequently give rise to a global objective function used to determine the Pareto optima. The locational marginal cost at each bus is also determined, and the uncertainties of the input data, namely generation and consumption, are considered. The VPP thus has at its disposal a set of solutions that allow it to make better-founded decisions, in accordance with its operating profile. Two case studies are presented. The first uses a 32-bus distribution network published by Baran & Wu. The second uses a 114-bus distribution network adapted from the IEEE 123-bus network.
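A minimal sketch of the scalarization described in the abstract, three objective values combined into a global objective whose weight sweeps pick out Pareto-optimal solutions, with purely illustrative numbers:

```python
def global_objective(solution, w_cost, w_emis, w_curt):
    return (w_cost * solution["cost"]
            + w_emis * solution["emissions"]
            + w_curt * solution["curtailment"])

candidates = [
    {"cost": 100.0, "emissions": 40.0, "curtailment": 5.0},
    {"cost": 120.0, "emissions": 25.0, "curtailment": 2.0},
    {"cost": 90.0,  "emissions": 55.0, "curtailment": 8.0},
]

# Sweep weights summing to 1 and record which candidate wins each sweep;
# the winners form a subset of the Pareto-optimal set.
winners = set()
for i in range(11):
    w1 = i / 10
    for j in range(11 - i):
        w2 = j / 10
        w3 = 1 - w1 - w2
        best = min(candidates, key=lambda s: global_objective(s, w1, w2, w3))
        winners.add(candidates.index(best))
print(sorted(winners))
```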
Abstract:
This contribution introduces the fundamental mathematical aspects of fractional calculus (FC) and discusses some of their consequences. Based on the FC concepts, the chapter reviews the main approaches for implementing fractional operators and discusses the adoption of FC in control systems. Finally, some applications in the areas of modeling and control are presented, namely fractional PID, heat diffusion systems, electromagnetism, fractional electrical impedances, evolutionary algorithms, robotics, and nonlinear system control.
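As a concrete taste of the fractional operators the chapter reviews, here is a minimal Grünwald-Letnikov approximation of a fractional derivative; the order, step and test signal are illustrative choices, not taken from the chapter:

```python
def gl_fractional_derivative(samples, alpha, h):
    """Approximate D^alpha f at the last sample from a uniform history."""
    n = len(samples)
    total = 0.0
    coeff = 1.0                          # (-1)^k * binomial(alpha, k), k = 0
    for k in range(n):
        total += coeff * samples[n - 1 - k]
        coeff *= (k - alpha) / (k + 1)   # recurrence for the next weight
    return total / h ** alpha

# Half-derivative of f(t) = t sampled on [0, 1]; the exact value at t = 1
# is 2 / sqrt(pi), roughly 1.128.
h = 0.001
samples = [i * h for i in range(1001)]
print(gl_fractional_derivative(samples, 0.5, h))
```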
Abstract:
Evolutionary Computation belongs to the field of Artificial Intelligence and is a branch of computer science that has been applied to solve problems in several areas of Engineering. This work presents the state of the art of Evolutionary Computation, as well as some of its applications in electronics, known as Evolutionary Electronics (or Evolvable Hardware), with emphasis on the synthesis of combinational digital circuits. It first introduces Artificial Intelligence and then Evolutionary Computation in its main branches: Evolutionary Algorithms, based on Charles Darwin's process of the evolution of species, and Swarm Intelligence, based on the collective behavior of some animals. Regarding Evolutionary Algorithms, evolution strategies, genetic programming, evolutionary programming and, with greater emphasis, Genetic Algorithms are described. Regarding Swarm Intelligence, ant colony optimization and particle swarm optimization are described. A study of Evolutionary Electronics was also carried out, briefly explaining some of its application areas, among them robotics, FPGAs, printed circuit board routing, the synthesis of digital and analog circuits, telecommunications and controllers. To substantiate the study, a case study of the application of genetic algorithms to the synthesis of combinational digital circuits is presented, based on the analysis and comparison of three references by different authors. This study made it possible to compare not only the results obtained by each author, but also the way the genetic algorithms were implemented, namely regarding the parameters, the genetic operators used, the evaluation function, the hardware implementation and the circuit encoding.
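As a pocket-sized illustration of the case study's theme, the sketch below evolves a combinational circuit (a 2-input XOR from a small gate set) with a plain genetic algorithm; the encoding, parameters and gate set are illustrative, not those of the compared references:

```python
import random

GATES = {"AND": lambda a, b: a & b, "OR": lambda a, b: a | b,
         "NAND": lambda a, b: 1 - (a & b), "NOR": lambda a, b: 1 - (a | b)}
OPS = list(GATES)
N_GATES, TARGET = 4, [0, 1, 1, 0]      # goal: a 2-input XOR

def random_gene(i):
    # gate i reads the primary inputs (indices 0, 1) or any earlier gate
    return (random.choice(OPS), random.randrange(i + 2), random.randrange(i + 2))

def fitness(genome):
    hits = 0
    for case, (a, b) in enumerate([(0, 0), (0, 1), (1, 0), (1, 1)]):
        signals = [a, b]
        for op, x, y in genome:
            signals.append(GATES[op](signals[x], signals[y]))
        hits += signals[-1] == TARGET[case]   # last gate is the circuit output
    return hits

def mutate(genome):
    g = list(genome)
    i = random.randrange(N_GATES)
    g[i] = random_gene(i)
    return g

pop = [[random_gene(i) for i in range(N_GATES)] for _ in range(50)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == 4:                  # all truth-table rows matched
        break
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(40)]
best = max(pop, key=fitness)
print(fitness(best), best)
```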
Abstract:
A construction project is a group of discernible tasks or activities that are conducted in a coordinated effort to accomplish one or more objectives. Construction projects require varying levels of cost, time and other resources. To plan and schedule a construction project, activities must be defined sufficiently; the level of detail determines the number of activities contained within the project plan and schedule. Finding feasible schedules that efficiently use scarce resources is therefore a challenging task within project management. In this context, the well-known Resource-Constrained Project Scheduling Problem (RCPSP) has been studied over the last decades. In the RCPSP, the activities of a project have to be scheduled such that the makespan of the project is minimized, observing the technological precedence constraints as well as the limitations of the renewable resources required to accomplish the activities. Once started, an activity may not be interrupted. This problem has been extended to a more realistic model, the multi-mode resource-constrained project scheduling problem (MRCPSP), where each activity can be performed in one out of several modes. Each mode of an activity represents an alternative way of combining different levels of resource requirements with a related duration. Each renewable resource, such as manpower and machines, has a limited availability for the entire project. This paper presents a hybrid genetic algorithm for the multi-mode resource-constrained project scheduling problem, in which multiple execution modes are available for each of the activities of the project. The objective function is the minimization of the construction project completion time. To solve the problem, a two-level genetic algorithm is applied, which makes use of two separate levels and extends the parameterized schedule generation scheme. The quality of the schedules is evaluated, and detailed comparative computational results for the MRCPSP are presented, revealing that this approach is a competitive algorithm.
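A minimal sketch of the serial schedule-generation step that mode-aware decoders of this kind build on: an activity list plus a mode choice per activity, decoded into start times under precedence and a single renewable resource. The toy project data is invented, and this is not the authors' two-level hybrid GA:

```python
def decode(order, modes, dur, req, preds, capacity, horizon=100):
    usage = [0] * horizon          # resource units in use at each period
    finish = {}
    for act in order:              # precedence-feasible activity list
        d, r = dur[act][modes[act]], req[act][modes[act]]
        t = max([finish[p] for p in preds[act]], default=0)
        while any(usage[u] + r > capacity for u in range(t, t + d)):
            t += 1                 # shift right until the resource fits
        for u in range(t, t + d):
            usage[u] += r
        finish[act] = t + d
    return max(finish.values())    # project makespan

# Activities 0..2; two modes each: (fast, resource-hungry) vs (slow, lean).
dur = {0: [3, 5], 1: [2, 4], 2: [4, 6]}
req = {0: [3, 1], 1: [2, 1], 2: [3, 2]}
preds = {0: [], 1: [0], 2: [0]}
print(decode([0, 1, 2], {0: 0, 1: 0, 2: 1}, dur, req, preds, capacity=4))
```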
Abstract:
This paper presents a genetic algorithm for the multi-mode resource-constrained project scheduling problem (MRCPSP), in which multiple execution modes are available for each of the activities of the project. The objective function is the minimization of the construction project completion time. To solve the problem, a two-level genetic algorithm is applied, which makes use of two separate levels and extends the parameterized schedule generation scheme by introducing an improvement procedure. The quality of the schedules is evaluated, and detailed comparative computational results for the MRCPSP are presented, revealing that this approach is a competitive algorithm.
Abstract:
Computerized scheduling methods and computerized scheduling systems according to exemplary embodiments. A computerized scheduling method may be stored in a memory and executed on one or more processors. The method may include defining a main multi-machine scheduling problem as a plurality of single machine scheduling problems; independently solving the plurality of single machine scheduling problems thereby calculating a plurality of near optimal single machine scheduling problem solutions; integrating the plurality of near optimal single machine scheduling problem solutions into a main multi-machine scheduling problem solution; and outputting the main multi-machine scheduling problem solution.
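The decomposition flow in the abstract can be sketched roughly as follows; the SPT rule stands in for whatever near-optimal single-machine solver an embodiment would use, and all names are illustrative:

```python
def solve_single_machine(jobs):
    """Order one machine's jobs by shortest processing time (SPT)."""
    return sorted(jobs, key=lambda j: j["proc"])

def solve_multi_machine(jobs):
    by_machine = {}
    for job in jobs:                         # 1) define the subproblems
        by_machine.setdefault(job["machine"], []).append(job)
    solution = {m: solve_single_machine(js)  # 2) solve them independently
                for m, js in by_machine.items()}
    return solution                          # 3) integrated per-machine plan

jobs = [{"id": "J1", "machine": "M1", "proc": 5},
        {"id": "J2", "machine": "M1", "proc": 2},
        {"id": "J3", "machine": "M2", "proc": 4}]
for machine, seq in sorted(solve_multi_machine(jobs).items()):
    print(machine, [j["id"] for j in seq])
```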
Abstract:
This paper presents a swarm-based cooperation mechanism for scheduling optimization. We conceptualize real manufacturing systems as interacting autonomous entities in order to support decision making in agile manufacturing environments. Agents coordinate their actions automatically, without human supervision, towards a common objective, a global scheduling solution, taking advantage of the collective behavior of species through implicit and explicit cooperation. The performance of the cooperation mechanism is evaluated considering, at a first stage, implicit cooperation through the ACS, PSO and ABC algorithms, and then explicit cooperation through the application of the cooperation mechanism.
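As an example of the implicit-cooperation stage, here is a minimal particle swarm optimizer in which particles cooperate only through the shared best-known position; the continuous toy objective stands in for a real scheduling encoding:

```python
import random

def pso(f, dim=2, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]          # knowledge shared by the swarm
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):   # personal memory update
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]   # implicit cooperation step
    return gbest

print(pso(lambda x: sum(v * v for v in x)))
```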
Abstract:
This paper presents an agent-based simulator designed for analysing agent market strategies based on a complete understanding of buyer and seller behaviours, preference models and pricing algorithms, considering user risk preferences. The system includes agents that are capable of improving their performance with their own experience, by adapting to the market conditions. In the simulated market, agents interact in several different ways and may join together to form coalitions. In this paper we address multi-agent coalitions to analyse Distributed Generation in Electricity Markets.
Abstract:
A novel agent-based approach to meta-heuristic self-configuration is proposed in this work. Meta-heuristics are examples of algorithms whose parameters need to be set up as efficiently as possible in order to ensure their performance. This paper presents a learning module for the self-parameterization of meta-heuristics (MHs) in a Multi-Agent System (MAS) for the resolution of scheduling problems. The learning is based on Case-based Reasoning (CBR), and two different integration approaches are proposed. A computational study is carried out to compare the two CBR integration perspectives. In the end, some conclusions are reached and future work is outlined.
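A minimal sketch of the CBR retrieval step on which such self-parameterization rests: find the stored case closest to the new problem instance and reuse its parameters. The feature descriptors and parameter names are illustrative:

```python
import math

# Each case: problem descriptors (e.g. jobs, machines) -> MH parameters that
# worked well on it.
case_base = [
    {"features": (50, 10), "params": {"pop": 100, "mutation": 0.05}},
    {"features": (500, 40), "params": {"pop": 400, "mutation": 0.01}},
]

def retrieve(features):
    """Nearest-neighbour retrieval over the problem descriptors."""
    return min(case_base, key=lambda c: math.dist(c["features"], features))

new_instance = (80, 12)
print(retrieve(new_instance)["params"])   # reuse; adapt and retain later
```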
Abstract:
With the increasing importance of electronic commerce across the Internet, it is becoming increasingly evident that in a few years the Internet will host a large number of interacting software agents. A vast number of them will be economically motivated and will negotiate a variety of goods and services. It is therefore important to consider the economic incentives and behaviours of economic software agents, and to use all available means to anticipate their collective interactions. This paper addresses this concern by presenting a multi-agent market simulator designed for analysing agent market strategies based on a complete understanding of buyer and seller behaviours, preference models and pricing algorithms, considering risk preferences. The system includes agents that are capable of increasing their performance with their own experience, by adapting to the market conditions. The results of the negotiations between agents are analysed by data mining algorithms in order to extract rules that give agents feedback to improve their strategies.
Abstract:
Introduction: Image resizing is a standard feature incorporated into Nuclear Medicine digital imaging. Upsampling is done by manufacturers to adequately fit the acquired images on the display screen, and it is applied whenever there is a need to increase, or decrease, the total number of pixels. This paper aims to compare the "hqnx" and the "nxSaI" magnification algorithms with two interpolation algorithms, "nearest neighbor" and "bicubic interpolation", in image upsampling operations. Material and Methods: Three distinct Nuclear Medicine images were enlarged 2 and 4 times with the different digital image resizing algorithms (nearest neighbor, bicubic interpolation, nxSaI and hqnx). To evaluate the pixel changes between the different output images, 3D whole-image plot profiles and surface plots were used in addition to the visual approach in the 4x upsampled images. Results: In the 2x enlarged images the visual differences were not noteworthy, although it was clearly noticeable that bicubic interpolation presented the best results. In the 4x enlarged images the differences were significant, with the bicubic interpolated images presenting the best results. Hqnx-resized images presented better quality than the 4xSaI and nearest neighbor interpolated images; however, their intense "halo effect" greatly affects the definition and boundaries of the image contents. Conclusion: The hqnx and the nxSaI algorithms were designed for images with clear edges, so their use in Nuclear Medicine images is clearly inadequate. Of the algorithms studied, bicubic interpolation seems the most suitable, as its ever-wider range of applications suggests, and it can be assumed to be an efficient algorithm across image types.
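For the two classical methods compared in the paper, the upsampling operations can be reproduced with Pillow as sketched below (hqnx and nxSaI are not available in Pillow, and the input file name is a placeholder):

```python
from PIL import Image

img = Image.open("mpi_slice.png")                 # placeholder input image
for factor in (2, 4):                             # the paper's 2x and 4x cases
    size = (img.width * factor, img.height * factor)
    img.resize(size, Image.NEAREST).save(f"nearest_{factor}x.png")
    img.resize(size, Image.BICUBIC).save(f"bicubic_{factor}x.png")
```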
Abstract:
Introduction: A major focus of the data mining process, and especially of machine learning research, is to automatically learn to recognize complex patterns and help take adequate decisions based strictly on the acquired data. Since imaging techniques like MPI (Myocardial Perfusion Imaging in Nuclear Cardiology) can take up a large part of the daily workflow and generate gigabytes of data, computerized analysis of the data could have advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc. Objectives: The aim of this study is to evaluate the efficacy of this methodology in the evaluation of MPI stress studies and in the decision process concerning the continuation, or not, of the evaluation of each patient. The objective pursued was to automatically classify a patient test into one of three groups: "Positive", "Negative" and "Indeterminate". "Positive" tests would proceed directly to the Rest part of the exam, "Negative" tests would be directly exempted from continuation, and only the "Indeterminate" group would require the clinician's analysis, thus saving clinician effort, increasing workflow fluidity at the technologist's level and probably sparing patients' time. Methods: The WEKA v3.6.2 open-source software was used to make a comparative analysis of three WEKA algorithms ("OneR", "J48" and "Naïve Bayes") in a retrospective study, using as reference the comparison with the corresponding clinical results signed off by expert nuclear cardiologists, on the "SPECT Heart Dataset" available at the University of California, Irvine, Machine Learning Repository. For evaluation purposes, criteria such as "Precision", "Incorrectly Classified Instances" and "Receiver Operating Characteristic (ROC) Areas" were considered. Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms. Conclusions: It is believed, and apparently supported by the findings, that machine learning algorithms could significantly assist, at an intermediary level, in the analysis of scintigraphic data obtained in MPI, namely after stress acquisition, eventually increasing the efficiency of the entire system and potentially easing the roles of both Technologists and Nuclear Cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve system accuracy.
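A rough scikit-learn analogue of the WEKA comparison, useful only to show the shape of such a study: OneR has no direct scikit-learn equivalent, so a depth-1 decision tree stands in for it, and the data here is synthetic with the SPECT dataset's dimensions rather than the real dataset:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in with the SPECT Heart Dataset's shape (267 x 22).
X, y = make_classification(n_samples=267, n_features=22, random_state=0)

models = {"OneR-like": DecisionTreeClassifier(max_depth=1),
          "J48-like": DecisionTreeClassifier(),
          "NaiveBayes": GaussianNB()}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10, scoring="roc_auc")
    print(f"{name}: mean ROC area = {scores.mean():.3f}")
```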
Abstract:
Screening of topologies developed by hierarchical heuristic procedures can be carried out by comparing their optimal performance. In this work we explore mono-objective process optimization using two algorithms, simulated annealing and tabu search, and four different objective functions: two of the net present value type, one of them including environmental costs, and two of the global potential impact type. The hydrodealkylation of toluene to produce benzene was used as the case study, considering five topologies with different complexities, mainly obtained by including or excluding liquid recycling and heat integration. The performance of the algorithms together with the objective functions was observed, analyzed and discussed from various perspectives: average deviation of results for each algorithm, capacity for producing a high-purity product, screening of topologies, robustness of the objective functions in the screening of topologies, trade-offs between economic and environmental objective functions, and variability of the optimum solutions.
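As an illustration of the first of the two search algorithms, here is a minimal simulated annealing loop; the continuous toy objective stands in for the NPV and environmental-impact functions of the study:

```python
import math
import random

def anneal(f, x, step=0.5, t0=10.0, cooling=0.95, iters=2000):
    best, fx = x[:], f(x)
    t = t0
    for _ in range(iters):
        cand = [v + random.uniform(-step, step) for v in x]   # neighbor move
        fc = f(cand)
        # Accept improvements always; accept worse moves with a probability
        # that shrinks as the temperature cools.
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < f(best):
                best = x[:]
        t *= cooling
    return best, f(best)

print(anneal(lambda v: sum(u * u for u in v), [3.0, -2.0]))
```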