979 results for Constrained nonlinear optimization
Abstract:
The two-Higgs-doublet model can be constrained by imposing Higgs-family symmetries and/or generalized CP symmetries. It is known that there are only six independent classes of such symmetry-constrained models. We study the CP properties of all cases in the bilinear formalism. An exact symmetry implies CP conservation. We show that soft breaking of the symmetry can lead to spontaneous CP violation (CPV) in three of the classes.
Abstract:
This paper addresses the problem of short-term hydro scheduling (STHS), particularly concerning a head-dependent hydro chain. We propose a novel mixed-integer nonlinear programming (MINLP) approach, considering hydroelectric power generation as a nonlinear function of water discharge and of the head. As a new contribution with respect to earlier studies, we model the on-off behavior of the hydro plants using integer variables, in order to avoid water discharges in forbidden areas. Thus, an enhanced STHS is provided due to the more realistic modeling presented in this paper. Our approach has been applied successfully to solve a test case based on one of the Portuguese cascaded hydro systems with a negligible computational time requirement.
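The on-off modeling the abstract describes can be sketched as follows. Power is a nonlinear function of discharge q and head h, and a binary commitment variable u forces the discharge either to zero or into the allowed operating band, keeping it out of the forbidden zone. All constants and names here are illustrative assumptions, not the paper's formulation.

```python
RHO_G = 9.81e3   # water density times gravity (N per cubic metre)
ETA = 0.9        # assumed turbine-generator efficiency

def hydro_power(q, h):
    """Generated power (W) as a nonlinear function of discharge and head."""
    return ETA * RHO_G * q * h

def feasible_discharge(q, u, q_min=5.0, q_max=50.0):
    """On/off constraint u*q_min <= q <= u*q_max with binary u: the unit is
    either off (q = 0) or operating inside the allowed band."""
    return u * q_min <= q <= u * q_max

# A discharge inside the forbidden band (0, q_min) is rejected:
assert not feasible_discharge(2.0, 1)
assert feasible_discharge(0.0, 0)
assert feasible_discharge(20.0, 1)
assert hydro_power(20.0, 30.0) == 0.9 * 9.81e3 * 20 * 30
```

In a full MINLP the binary u appears per plant and per period; a solver then chooses the schedule that maximizes total generation subject to these constraints.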
Abstract:
The main objectives of this work were the development of one-component polyurethane foams with superior flame-resistance properties (B1 & B2), applied by gun or by adaptor/straw, and the optimization of a one-component winter polyurethane foam applied by gun. The work is divided into two distinct projects: i. The first project consisted of developing one-component foams with flame-resistance properties (classified as B1 and B2 according to the German standard DIN 4102), applied by gun (GWB1 and GWB2) or by adaptor/straw (AWB), using modified aromatic polyester polyols and halogenated flame-retardant additives. These foams should also exhibit acceptable properties at low temperatures. After testing several formulations, it was possible to develop an AWB2 foam with only 3.3% polyester polyol in the prepolymer and with properties equivalent to those of the best commercial foam even at 5/-10 (can temperature/foam curing temperature, in °C), as well as a flame height of only 11 cm. From two formulations (AWB2) that passed the B2 test, a GWB2 foam and a GWB1 foam were also obtained with properties equivalent to those of the best competitor foam at -10/-10 and at 23/5, respectively, although they were not submitted to the B2 and B1 tests after the modifications made. ii. The second project consisted of optimizing a one-component winter polyurethane foam applied by gun (GWB3). The initial foam exhibited glass bubbles when dispensed from a full can, and this problem had to be overcome. It was solved by decreasing the LPG/DME ratio through an increase of the volume percentage of DME in the prepolymer to 14%; however, the dimensional stability worsened slightly.
The reagent FCA 400 was removed from the previous formulation (6925) in an attempt to reduce the cost of the foam, yielding an acceptable foam at 23/23 and at 5/5, with a 4% reduction in production cost and a 5.5% reduction in the cost per litre of dispensed foam compared with its predecessor. Finally, the influence of the concentration of different surfactants in formulation 6925 was evaluated, showing that the cell structure of the foam improves at higher surfactant concentrations, the effect being more noticeable at lower temperatures (5/5). Of the surfactants studied, B 8871 showed the best performance at 5/5 at the lowest concentration, making it the best surfactant, whereas Struksilon 8003 proved the least suitable for this particular formulation, giving the worst overall results. The surfactants L-5351, L-5352, and B 8526 are also unsuitable for this formulation, since the resulting foams exhibit cell collapse, especially at 5/5. For L-5351 and L-5352, this behaviour worsens at higher concentrations. In each project, benchmark tests were also carried out on selected commercial foams, with the main goal of comparing all the results of the foams developed in both projects with competitor foams.
Abstract:
This dissertation aims to design and implement a fault-tolerant control system for the experimental irrigation canal of the University of Évora, using a model implemented in MATLAB/SIMULINK®. To address this challenge, several fault-diagnosis techniques were analysed, and neural-network-based techniques were chosen for the development of a fault detection and isolation system for the irrigation canal, regardless of the type of control system used. Neural networks were thus the nonlinear processors used, being the most recommended in situations where process data are abundant, because they learn from examples and are supported by statistical and optimization theories, focusing not only on signal processing but also on expanding the horizons of that processing. The emphasis of neural network models is on their dynamics, stability, and behaviour. The research work that led to this dissertation therefore had as its main objectives the development of neural network models that best represent the dynamics of the irrigation canal, in order to obtain a fault-detection system that compares the values produced by the models with those of the process. From this difference, which yields a residual, it is possible to develop both the detection and the isolation systems based on neural networks, thus enabling a fault-tolerant control system comprising the detection, isolation/diagnosis, and reconfiguration modules of the irrigation canal. In summary, this dissertation developed a system that allows the process to be reconfigured when faults occur, significantly improving the performance of the irrigation canal.
Abstract:
Topology optimization consists of finding the spatial distribution of a given total volume of material for the resulting structure to have some optimal property, for instance, maximization of structural stiffness or maximization of the fundamental eigenfrequency. In this paper a Genetic Algorithm (GA) employing a representation method based on trees is developed to generate initial feasible individuals that remain feasible upon crossover and mutation and as such do not require any repairing operator to ensure feasibility. Several application examples are studied involving the topology optimization of structures where the objective function is the maximization of the stiffness and the maximization of the first and the second eigenfrequencies of a plate, all cases having a prescribed material volume constraint.
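The key property claimed above, operators that keep individuals feasible with no repair step, can be illustrated with a simpler encoding than the paper's tree-based one (this sketch is our illustration, not the authors' method): each individual is a set of filled cells of fixed cardinality equal to the material volume, and crossover and mutation preserve that cardinality by construction.

```python
import random

VOLUME = 4  # fixed number of filled cells (the material volume constraint)

def crossover(a, b, rng):
    """Keep the cells the parents share, then fill the remaining quota from
    their symmetric difference; the offspring always has VOLUME cells."""
    child = set(a) & set(b)
    pool = list((set(a) | set(b)) - child)
    rng.shuffle(pool)
    child.update(pool[:VOLUME - len(child)])
    return child

def mutate(ind, cells, rng):
    """Move one filled cell to an empty cell; the volume is unchanged."""
    ind = set(ind)
    ind.remove(rng.choice(sorted(ind)))
    ind.add(rng.choice(sorted(set(cells) - ind)))
    return ind

rng = random.Random(0)
cells = range(16)          # a 4x4 design domain, flattened
a = {0, 1, 2, 3}
b = {2, 3, 4, 5}
child = crossover(a, b, rng)
assert len(child) == VOLUME            # volume constraint holds, no repair
assert len(mutate(child, cells, rng)) == VOLUME
```

The fitness evaluation (stiffness or eigenfrequency of the resulting plate) would require a finite-element analysis and is deliberately omitted here.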
Abstract:
In this paper, a stochastic programming approach is proposed for trading wind energy in a market environment under uncertainty. Uncertainty in the energy market prices is the main cause of high volatility of profits achieved by power producers. The volatile and intermittent nature of wind energy represents another source of uncertainty. Hence, each uncertain parameter is modeled by scenarios, where each scenario represents a plausible realization of the uncertain parameters with an associated occurrence probability. Also, an appropriate risk measurement is considered. The proposed approach is applied on a realistic case study, based on a wind farm in Portugal. Finally, conclusions are duly drawn.
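The scenario-based evaluation described above can be sketched in a few lines: each scenario pairs a plausible realization of the uncertain parameters with an occurrence probability, and the trader evaluates expected profit together with a risk measure. CVaR is used here as one plausible choice of risk measurement, and the numbers are made up; the paper does not specify either.

```python
def expected_profit(scenarios):
    """scenarios: list of (probability, profit) pairs."""
    return sum(p * profit for p, profit in scenarios)

def cvar(scenarios, alpha=0.95):
    """Conditional value-at-risk: the expected profit over the worst
    (1 - alpha) probability mass of scenarios."""
    tail = 1.0 - alpha
    acc, remaining = 0.0, tail
    for p, profit in sorted(scenarios, key=lambda s: s[1]):  # worst first
        take = min(p, remaining)
        acc += take * profit
        remaining -= take
        if remaining <= 0:
            break
    return acc / tail

# Three illustrative scenarios: low wind / negative imbalance profit,
# average, and high wind with favourable prices.
scenarios = [(0.05, -100.0), (0.45, 50.0), (0.50, 120.0)]
assert abs(expected_profit(scenarios) - 77.5) < 1e-9
assert abs(cvar(scenarios) - (-100.0)) < 1e-9  # worst 5% of the mass
```

A risk-averse producer would then maximize a weighted combination of these two quantities over the offering decisions.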
Abstract:
In practical applications of optimization it is common to have several conflicting objective functions to optimize. Frequently, these functions are subject to noise or can be of black-box type, preventing the use of derivative-based techniques. We propose a novel multiobjective derivative-free methodology, calling it direct multisearch (DMS), which does not aggregate any of the objective functions. Our framework is inspired by the search/poll paradigm of direct-search methods of directional type and uses the concept of Pareto dominance to maintain a list of nondominated points (from which the new iterates or poll centers are chosen). The aim of our method is to generate as many points in the Pareto front as possible from the polling procedure itself, while keeping the whole framework general enough to accommodate other disseminating strategies, in particular, when using the (here also) optional search step. DMS generalizes to multiobjective optimization (MOO) all direct-search methods of directional type. We prove under the common assumptions used in direct search for single objective optimization that at least one limit point of the sequence of iterates generated by DMS lies in (a stationary form of) the Pareto front. However, extensive computational experience has shown that our methodology has an impressive capability of generating the whole Pareto front, even without using a search step. Two by-products of this paper are (i) the development of a collection of test problems for MOO and (ii) the extension of performance and data profiles to MOO, allowing a comparison of several solvers on a large set of test problems, in terms of their efficiency and robustness to determine Pareto fronts.
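The core bookkeeping DMS relies on, maintaining a list of nondominated points under Pareto dominance, can be sketched as follows (minimization in every objective; the function names are ours, not the paper's).

```python
def dominates(u, v):
    """u Pareto-dominates v: no worse in every objective, better in one."""
    return all(a <= b for a, b in zip(u, v)) and any(a < b for a, b in zip(u, v))

def add_point(front, p):
    """Insert p into the nondominated list, discarding any points that p
    dominates; if p is itself dominated, the front is unchanged."""
    if any(dominates(q, p) for q in front):
        return front
    return [q for q in front if not dominates(p, q)] + [p]

front = []
for p in [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0), (2.0, 3.0)]:
    front = add_point(front, p)
# (2.0, 3.0) is dominated by (1.0, 3.0) and is discarded:
assert front == [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
```

In DMS the new iterates and poll centers are chosen from exactly such a list, which is why the polling procedure itself populates the Pareto front.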
Abstract:
This paper presents a methodology that aims to increase the probability of delivering power to any load point of the electrical distribution system by identifying new investments in distribution components. The methodology is based on statistical failure and repair data of the distribution power system components and it uses fuzzy-probabilistic modelling for system component outage parameters. Fuzzy membership functions of system component outage parameters are obtained by statistical records. A mixed integer non-linear optimization technique is developed to identify adequate investments in distribution networks components that allow increasing the availability level for any customer in the distribution system at minimum cost for the system operator. To illustrate the application of the proposed methodology, the paper includes a case study that considers a real distribution network.
Abstract:
This paper addresses the problem of energy resources management using modern metaheuristic approaches, namely Particle Swarm Optimization (PSO), New Particle Swarm Optimization (NPSO) and Evolutionary Particle Swarm Optimization (EPSO). The addressed problem is intended for aggregators operating in a smart grid context, dealing with Distributed Generation (DG) and gridable vehicles intelligently managed on a multi-period basis according to their users’ profiles and requirements. The aggregator can also purchase additional energy from external suppliers. The paper includes a case study considering a 30 kV distribution network with one substation, 180 buses and 90 load points. The distribution network in the case study considers intense penetration of DG, including 116 units from several technologies, and one external supplier. A scenario of 6000 EVs for the given network is simulated over 24 periods, corresponding to one day. The results of the application of the PSO approaches to this case study are discussed in detail in the paper.
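The variants named above all build on the classical PSO update (inertia, cognitive and social terms). A minimal PSO minimizing a toy cost function gives the flavor; the parameter values are common textbook choices, not the paper's settings.

```python
import random

def pso(cost, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Classical PSO: each particle is pulled toward its personal best
    (cognitive term) and the swarm's global best (social term)."""
    rng = random.Random(seed)
    x = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]
    gbest = min(pbest, key=cost)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] += v[i][d]
            if cost(x[i]) < cost(pbest[i]):
                pbest[i] = x[i][:]
        gbest = min(pbest, key=cost)
    return gbest

sphere = lambda p: sum(c * c for c in p)
best = pso(sphere, dim=2)
assert sphere(best) < 1e-2  # the swarm converges near the origin
```

In the energy-resources problem the position vector would encode the dispatch of each DG unit and vehicle per period, with the cost function penalizing constraint violations.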
Abstract:
Metaheuristics performance is highly dependent on the respective parameters, which need to be tuned. Parameter tuning may allow larger flexibility and robustness, but requires a careful initialization. The process of defining which parameter settings should be used is not obvious: the values depend mainly on the problem, the instance to be solved, the search time available to spend in solving the problem, and the required quality of solution. This paper presents a learning module for the autonomous parameterization of metaheuristics, integrated into a Multi-Agent System for the resolution of Dynamic Scheduling problems. The proposed learning module is inspired by the Autonomic Computing Self-Optimization concept, which states that systems must continuously and proactively improve their performance. The learning is implemented with Case-based Reasoning, which uses previous similar data to solve new cases, under the assumption that similar cases have similar solutions. After a literature review on the topics used, both the AutoDynAgents system and the Self-Optimization module are described. Finally, a computational study is presented in which the proposed module is evaluated, the results obtained are compared with previous ones, conclusions are drawn, and future work is outlined. This proposal is expected to be a significant contribution to the self-parameterization of metaheuristics and to the resolution of scheduling problems in dynamic environments.
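The Case-based Reasoning step described, retrieving the most similar past case and reusing its parameter setting, can be sketched with a plain nearest-neighbour lookup over case features. The feature choice (jobs, machines) and the distance are illustrative assumptions, not the AutoDynAgents design.

```python
def retrieve(case_base, query):
    """Return the stored parameters of the past case closest to `query`.
    case_base: list of (features, params); features are numeric tuples."""
    def dist(f):
        return sum((a - b) ** 2 for a, b in zip(f, query))
    features, params = min(case_base, key=lambda c: dist(c[0]))
    return params

# features: (number of jobs, number of machines); params: tuned settings
case_base = [
    ((10, 2), {"population": 50, "mutation": 0.10}),
    ((100, 8), {"population": 200, "mutation": 0.02}),
]
# A new 90-job, 10-machine instance is closest to the large stored case:
assert retrieve(case_base, (90, 10)) == {"population": 200, "mutation": 0.02}
```

A full CBR cycle would also revise the reused setting after the run and retain the new case, which is how the module "continuously and proactively" improves.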
Abstract:
Scheduling is a critical function that is present throughout many industries and applications. A great need exists for developing scheduling approaches that can be applied to a number of different scheduling problems with significant impact on the performance of business organizations. A challenge is emerging in the design of scheduling support systems for manufacturing environments where dynamic adaptation and optimization become increasingly important. In this paper, we describe a Self-Optimizing Mechanism for a Scheduling System through Nature Inspired Optimization Techniques (NIT).
Abstract:
The paper proposes a methodology to increase the probability of delivering power to any load point by identifying new investments in distribution energy systems. The proposed methodology is based on statistical failure and repair data of distribution components and it uses a fuzzy-probabilistic modeling for the components outage parameters. The fuzzy membership functions of the outage parameters of each component are based on statistical records. A mixed integer nonlinear programming optimization model is developed in order to identify the adequate investments in distribution energy system components which allow increasing the probability of delivering power to any customer in the distribution system at the minimum possible cost for the system operator. To illustrate the application of the proposed methodology, the paper includes a case study that considers a 180 bus distribution network.
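One common way to build the fuzzy membership functions mentioned above is a triangular function over the statistical range of an outage parameter (for example, repair time), with the mode at the most frequent recorded value. The triangular shape and the numbers below are our illustrative assumption; the paper only states that the memberships are based on statistical records.

```python
def triangular(x, a, m, b):
    """Triangular fuzzy membership with support [a, b] and mode m:
    0 outside the support, rising linearly to 1 at m, then falling."""
    if x <= a or x >= b:
        return 0.0
    if x <= m:
        return (x - a) / (m - a)
    return (b - x) / (b - m)

# Repair-time records suggest a support of 1 to 9 hours with a mode of 4 h:
assert triangular(4.0, 1.0, 4.0, 9.0) == 1.0   # full membership at the mode
assert triangular(1.0, 1.0, 4.0, 9.0) == 0.0   # zero at the support edge
assert abs(triangular(6.5, 1.0, 4.0, 9.0) - 0.5) < 1e-9
```

These memberships then enter the MINLP model as fuzzy outage parameters of each component.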
Abstract:
The operation of power systems in a Smart Grid (SG) context brings new opportunities for consumers as active players, in order to fully reach the SG advantages. In this context, concepts such as smart homes and smart buildings are promising approaches to optimize consumption while reducing electricity costs. This paper proposes an intelligent methodology to support the consumption optimization of an industrial consumer that has a Combined Heat and Power (CHP) facility. A SCADA (Supervisory Control and Data Acquisition) system developed by the authors is used to support the implementation of the proposed methodology. An optimization algorithm implemented in the system determines the optimal consumption and CHP levels at each instant, according to the Demand Response (DR) opportunities. The paper includes a case study with several scenarios of consumption and heat demand in the context of a DR event that specifies a maximum demand level for the consumer.
Abstract:
This paper proposes a particle swarm optimization (PSO) approach to support electricity producers in multiperiod optimal contract allocation. The producer's risk preference is stated by a utility function (U) expressing the tradeoff between the expectation and the variance of the return. Variance estimation and expected return are based on a forecasted scenario interval determined by a price range forecasting model developed by the authors. A certain confidence level is associated with each forecasted scenario interval. The proposed model makes use of contracts with physical (spot and forward) and financial (options) settlement. PSO performance was evaluated by comparing it with a genetic algorithm-based approach. This model can be used by producers in deregulated electricity markets but can easily be adapted to load serving entities and retailers. Moreover, it can easily be adapted to the use of other types of contracts.
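A common concrete form for a utility trading off expectation against variance of the return is U = E[r] - k·Var[r] for a risk-aversion weight k; that form and the numbers below are our assumption, since the abstract does not give the exact expression.

```python
def utility(returns, probs, k=0.5):
    """Mean-variance utility over a forecasted scenario set: the expected
    return penalized by k times the variance of the return."""
    mean = sum(p * r for p, r in zip(probs, returns))
    var = sum(p * (r - mean) ** 2 for p, r in zip(probs, returns))
    return mean - k * var

# Two contract mixes with the same expected return but different spread:
safe = utility([10.0, 12.0], [0.5, 0.5])   # low spread
risky = utility([2.0, 20.0], [0.5, 0.5])   # same mean, high spread
assert safe > risky  # a risk-averse producer prefers the low-spread mix
```

In the paper's setting the PSO would search over contract allocations, with each candidate scored by such a utility built from the forecasted scenario intervals.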
Abstract:
Distributed Energy Resources (DER) scheduling in smart grids presents a new challenge to system operators. The increase of new resources, such as storage systems and demand response programs, results in additional computational effort for optimization problems. On the other hand, since natural resources such as wind and sun can only be forecasted accurately a short time ahead, short-term scheduling is especially relevant, requiring very good performance on large-dimension problems. Traditional techniques such as Mixed-Integer Non-Linear Programming (MINLP) do not cope well with large-scale problems. This type of problem can be appropriately addressed by metaheuristic approaches. This paper proposes a new methodology called Signaled Particle Swarm Optimization (SiPSO) to address the energy resources management problem in the scope of smart grids, with intensive use of DER. The proposed methodology's performance is illustrated by a case study with 99 distributed generators, 208 loads, and 27 storage units. The results are compared with those obtained with other methodologies, namely MINLP, Genetic Algorithm, original Particle Swarm Optimization (PSO), Evolutionary PSO, and New PSO. SiPSO's performance is superior to that of the other tested PSO variants, demonstrating its adequacy for solving large-dimension problems that require a decision in a short period of time.