44 results for Optimal values


Relevance:

20.00%

Publisher:

Abstract:

This paper presents a new and efficient methodology for distribution network reconfiguration integrated with optimal power flow (OPF), based on a Benders decomposition approach. The objective is to minimize power losses while balancing load among feeders, subject to constraints on branch capacity, minimum and maximum power limits of substations or distributed generators, minimum bus-voltage deviation, and radial operation of the network. The Generalized Benders decomposition algorithm is applied to solve the problem. The formulation is decomposed into two stages: the first is the Master problem, formulated as a mixed-integer non-linear programming problem, which determines the radial topology of the distribution network; the second is the Slave problem, formulated as a non-linear programming problem, which checks the feasibility of the Master solution by means of an OPF and provides the information needed to formulate the linear Benders cuts that connect both problems. The model is programmed in GAMS. The effectiveness of the proposal is demonstrated through two examples taken from the literature.
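
The abstract does not reproduce the cut expression; purely as a reminder of the classical Generalized Benders structure (not necessarily the exact formulation used in the paper), at iteration $k$ the Slave OPF is solved for the topology $y^{(k)}$ fixed by the Master, returning its optimal value $z^{(k)}$ and dual multipliers $\lambda^{(k)}$, and the Master is re-solved with one additional linear cut:

$$ \min_{y \in \{0,1\}^n,\ \alpha}\; \alpha \quad \text{s.t.}\quad \alpha \ge z^{(j)} + \lambda^{(j)\top}\bigl(y - y^{(j)}\bigr),\; j = 1,\dots,k, \qquad y \text{ radial}. $$

The loop stops when the Master (lower) and Slave (upper) bounds coincide within a tolerance.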

Relevance:

20.00%

Publisher:

Abstract:

This paper studies an Optimal Intelligent Supervisory Control System (OISCS) model for the design of control systems that can operate in the presence of cyber-physical elements with privacy protection. The development of such an architecture has the potential to provide new ways of integrating control into systems where large amounts of fast computation are not easily available, whether due to limitations on power, physical size, or the choice of computing elements.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a Unit Commitment model with reactive power compensation, solved by Genetic Algorithm (GA) optimization techniques. The GA has been developed as a computational tool programmed in MATLAB. The main objective is to find the best generation schedule that minimizes active power losses and the reactive power to be compensated, subject to the power system technical constraints, namely the full AC power flow equations and the active and reactive power generation limits. All constraints represented in the objective function are weighted with penalty factors. The IEEE 14-bus system is used as a test case to demonstrate the effectiveness of the proposed algorithm. Results and conclusions are duly drawn.
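
As a sketch of the penalty-weighted objective described above (the symbols are illustrative, not taken from the paper), the GA can minimise a fitness of the form

$$ F = P_{\text{loss}} + Q_{\text{comp}} + \sum_{j} w_j \,\max\bigl(0,\; g_j\bigr)^2, $$

where $P_{\text{loss}}$ are the active power losses, $Q_{\text{comp}}$ the reactive power to be compensated, $g_j$ the violations of the AC power flow and generation-limit constraints, and $w_j$ the penalty factors.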

Relevance:

20.00%

Publisher:

Abstract:

Native agars from Gracilaria vermiculophylla produced in sustainable aquaculture systems (IMTA) were extracted under conventional (TWE) and microwave (MAE) heating. The optimal extracts from both processes were compared in terms of their properties. The agars’ structure was further investigated through Fourier transform infrared and NMR spectroscopy. Both samples showed a regular structure with an identical backbone of β-D-galactose (G) and 3,6-anhydro-α-L-galactose (LA) units; a considerable degree of methylation was found at C6 of the G units and, to a lesser extent, at C2 of the LA residues. The methylation degree of the G units was lower for the MAEopt agar, and the sulfate content was also reduced. MAE led to higher agar recoveries with drastic reductions in extraction time and solvent volume. The roughly two-fold lower values of [η] and Mv obtained for the MAEopt sample indicate substantial depolymerization of the polysaccharide backbone; this was reflected in its gelling properties, yet the agar remained clearly appropriate for commercial application in soft-texture food products.
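
For reference, $M_v$ is the viscosity-average molecular weight, normally derived from the intrinsic viscosity through the Mark-Houwink-Sakurada relation (the constants $K$ and $a$ are system-specific and are not given in the abstract):

$$ [\eta] = K \, M_v^{\,a}. $$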

Relevance:

20.00%

Publisher:

Abstract:

The main objectives of this study are the characterisation of one of the extrusion lines at Cabelte, namely the extrusion line EP5, which comprises two extruders, the determination of energy and process indicators, and the optimisation of energy consumption, with respect to the energy consumed and the thermal losses of this line. To monitor the EP5 extrusion line, a central metering device was installed on the line's main switchboard; for the auxiliary extruder, however, measurements were made with a clamp ammeter and a power-factor meter. Tests were also carried out to evaluate the amount of material processed, using a gravimetric dosing (weighing) unit fitted to the extruders. The temperature measurements needed for the thermal-loss calculations of the main extruder and for the characterisation of the plastic materials were made with a digital thermometer. Throughput tests were performed on the auxiliary and main extruders, and the variation of the power factor with screw speed was studied. From the end user's perspective, optimisation for the rational use of energy lies in reducing the electricity bill, which depends not only on the amount of energy used but also on when it is used, since the cost of electricity depends strongly on the period in which it is consumed. A different production-planning methodology, scheduling the cables with the highest specific cost for the hours with the lowest energy cost, would reduce specific costs by 18.7% under the summer tariff and by 20.4% under the winter tariff. The sheathing materials used (PE and PVC) directly influence the energy costs, since polyethylene (PE) always shows higher enthalpy values (0.317 kWh/kg and 0.281 kWh/kg) and requires higher working temperatures than polyvinyl chloride (PVC) (0.141 kWh/kg and 0.124 kWh/kg). The specific consumption tends to decrease as the screw speed increases, up to an optimal speed beyond which the trend reverses. For the two extruders studied, cos φ always increases with the screw speed. This study made it possible to assess the optimal conditions of the cable-sheathing process so as to minimise energy consumption. Reducing every kind of waste (over-consumption, purge waste) is a management priority that combines effectiveness with efficiency and is a fundamental tool to secure the company's future. The average power factor measured for line EP5 (0.38) is extremely low and is associated with reactive energy; besides its inherent economic cost, it constrains future expansions. The power factor can be corrected by installing a 500 kVAr capacitor bank. Under the new tariff scheme applied to reactive energy, this yields a saving of 36,167.4 Euro/year, with a payback period of 0.37 year (4.5 months). This measure also implies an annual reduction of 6.5% in the amount of CO2 emitted. Quantifying the thermal losses is important, because only in this way can actions be defined to increase energy efficiency.
Without deep knowledge of the processes and of the correct methodologies there can be no efficient solutions, so it is important to measure before moving forward with any management measure.
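
A quick consistency check of the payback figures quoted above (the capacitor-bank investment is inferred from them, not stated explicitly):

$$ \text{payback} = \frac{\text{investment}}{\text{annual saving}} \;\Rightarrow\; \text{investment} \approx 0.37 \times 36\,167.4 \approx 13\,400 \text{ Euro}, \qquad 0.37 \times 12 \approx 4.5 \text{ months}. $$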

Relevance:

20.00%

Publisher:

Abstract:

This work presents and discusses several aspects related to fuel cells, with particular focus on the modelling of proton exchange membrane fuel cells. The work is divided into several chapters. Chapter 1 presents the motivation and objectives of the thesis. Chapter 2 introduces fuel cells in general: their origin, the different types, what distinguishes them from other power-generation technologies, and their advantages and disadvantages. Chapter 3 discusses fuel cell modelling. The different types of models are presented and explained, followed by the model selected for study, with reference to its theoretical foundations, a detailed exposition of the mathematical formulation, and the parameters that characterise the model. The implementation of the model in Matlab/Simulink is also presented. Chapter 4 presents and discusses the approach used to identify the parameters of the fuel cell model. It is proposed, and shown, that an approach based on an intelligent optimisation algorithm provides an effective and accurate method for parameter identification; this approach requires some experimental data, which are also presented. The algorithm used is Particle Swarm Optimization (PSO). Its foundations, characteristics, implementation in Matlab/Simulink and optimisation strategy are presented, namely the configuration of the algorithm, the definition of the objective function and the bounds of the parameters. The results of the optimisation process are presented, together with additional model-validation results, a robustness analysis of the optimal parameter set and a sensitivity analysis of the parameters. The final chapter presents some conclusions, among which the following stand out: the good performance of the PSO algorithm for identifying the parameters of the fuel cell model; an interesting robustness of the PSO algorithm, in the sense that several runs of the method yield parameter values and objective-function values with very low variability; and a good fuel cell model which, when characterised by the optimal parameter set, systematically shows mean relative errors below 2.5% over a wide range of operating conditions.
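
The PSO-based identification described in Chapter 4 is implemented in Matlab/Simulink; the following is only a minimal, generic PSO sketch in Python, with hypothetical usage (fuel_cell_voltage, load_experimental_data and the bounds are placeholders, not names from the thesis):

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise `objective` over the box `bounds` (list of (low, high)) with a basic PSO."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = len(bounds)
    x = rng.uniform(lo, hi, (n_particles, dim))            # particle positions
    v = np.zeros_like(x)                                   # particle velocities
    pbest = x.copy()                                       # personal best positions
    pbest_f = np.array([objective(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()                     # global best position
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                         # keep particles inside the bounds
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Hypothetical usage: fit model parameters theta to measured polarisation data.
# `fuel_cell_voltage(theta, I_exp)` stands in for the Simulink model evaluated at current I_exp.
# V_exp, I_exp = load_experimental_data()               # not shown here
# obj = lambda theta: np.mean((V_exp - fuel_cell_voltage(theta, I_exp)) ** 2)
# theta_opt, err = pso(obj, bounds=[(0.0, 1.0)] * 7)
```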

Relevance:

20.00%

Publisher:

Abstract:

The main objective of this paper is to evaluate the key elements in the construction of consistent organisational messages over time. In order to accomplish that, we propose the alignment of several elements: vision, mission, objectives, cultural values, optimal identity attributes, positioning, type of messages, communication style and means, and image...

Relevance:

20.00%

Publisher:

Abstract:

Tympanometry values of children between 3 and 45 months of age during the cold season, obtained with a 226 Hz probe tone.

Relevance:

20.00%

Publisher:

Abstract:

The problem of uncertainty propagation in composite laminate structures is studied. An approach based on the optimal design of composite structures to achieve a target reliability level is proposed. Using the Uniform Design Method (UDM), a set of design points is generated over a design domain centred at the mean values of the random variables, in order to study the variability of the design space. The most critical Tsai number, the structural reliability index and the sensitivities are obtained for each UDM design point, using the maximum load obtained from the optimal design search. Using the UDM design points as input/output patterns, an Artificial Neural Network (ANN) is developed based on supervised evolutionary learning. Finally, using the developed ANN, a Monte Carlo simulation procedure is implemented and the variability of the structural response is studied through global sensitivity analysis (GSA). The GSA is based on first-order Sobol indices and relative sensitivities, and an appropriate algorithm for obtaining the Sobol indices is proposed. The most important sources of uncertainty are identified.
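
For reference, the first-order Sobol index mentioned above has the standard definition

$$ S_i = \frac{\operatorname{Var}_{X_i}\!\bigl[\,\mathbb{E}(Y \mid X_i)\,\bigr]}{\operatorname{Var}(Y)}, $$

i.e. the fraction of the variance of the structural response $Y$ explained by the random variable $X_i$ alone.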

Relevance:

20.00%

Publisher:

Abstract:

LLF (Least Laxity First) scheduling, which assigns a higher priority to a task with a smaller laxity, has been known as an optimal preemptive scheduling algorithm on a single-processor platform. However, little work has been done to illuminate its characteristics on multiprocessor platforms. In this paper, we identify the dynamics of laxity from the system’s viewpoint and translate these dynamics into an LLF multiprocessor schedulability analysis. More specifically, we first characterize laxity properties under LLF scheduling, focusing on the laxity dynamics associated with a deadline miss. These dynamics describe a lower bound, leading to the deadline miss, on the number of tasks with certain laxity values at certain time instants. This lower bound is significant because it represents invariants for highly dynamic system parameters (laxity values). Since the laxity of a task depends on the amount of interference from higher-priority tasks, we can then derive a set of conditions to check whether a given task system can evolve into the laxity dynamics that lead to a deadline miss. This way, to the best of our knowledge, we propose the first LLF multiprocessor schedulability test based on its own laxity properties. We also develop an improved schedulability test that exploits slack values. We mathematically prove that the proposed LLF tests dominate the state-of-the-art EDZL tests. We also present simulation results to evaluate the schedulability performance of both the original and improved LLF tests in a quantitative manner.
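
The laxity referred to above is, in the usual definition assumed by such analyses, the slack a job has before it can no longer meet its deadline even if it runs continuously:

$$ \ell_i(t) = d_i(t) - t - c_i^{\text{rem}}(t), $$

where $d_i(t)$ is the absolute deadline of the current job of task $\tau_i$ and $c_i^{\text{rem}}(t)$ its remaining execution time; LLF gives higher priority to smaller $\ell_i(t)$.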

Relevance:

20.00%

Publisher:

Abstract:

This paper studies the optimization of complex-order algorithms for the discrete-time control of linear and nonlinear systems. The fundamentals of fractional systems and genetic algorithms are introduced. Based on these concepts, complex-order control schemes and their implementation are evaluated from the perspective of evolutionary optimization. The results demonstrate not only that complex-order derivatives constitute a valuable alternative for deriving control algorithms, but also the feasibility of the adopted optimization strategy.
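
The abstract does not reproduce the discretisation; a commonly used truncated Grünwald-Letnikov approximation of a derivative of (possibly complex) order $\alpha$, given here only for illustration, is

$$ D^{\alpha} x(t) \approx \frac{1}{T^{\alpha}} \sum_{k=0}^{r} (-1)^{k} \binom{\alpha}{k}\, x(t - kT), $$

with sampling period $T$ and truncation order $r$; a complex-order scheme takes $\alpha \in \mathbb{C}$, typically combining conjugate orders so that the control signal remains real-valued.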

Relevance:

20.00%

Publisher:

Abstract:

Scheduling of constrained deadline sporadic task systems on multiprocessor platforms is an area which has received much attention in the recent past. It is widely believed that finding an optimal scheduler is hard, and therefore most studies have focused on developing algorithms with good processor utilization bounds. These algorithms can be broadly classified into two categories: partitioned scheduling, in which tasks are statically assigned to individual processors, and global scheduling, in which each task is allowed to execute on any processor in the platform. In this paper we consider a third, more general, approach called cluster-based scheduling. In this approach each task is statically assigned to a processor cluster, tasks in each cluster are globally scheduled among themselves, and clusters in turn are scheduled on the multiprocessor platform. We develop techniques to support such cluster-based scheduling algorithms, and also consider properties that minimize the total processor utilization of individual clusters. In the last part of this paper, we develop new virtual cluster-based scheduling algorithms. For implicit deadline sporadic task systems, we develop an optimal scheduling algorithm that is neither Pfair nor ERfair. We also show that the processor utilization bound of us-edf{m/(2m−1)} can be improved by using virtual clustering. Since neither partitioned nor global strategies dominate the other, cluster-based scheduling is a natural research direction towards achieving improved processor utilization bounds.
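
For scale, the us-edf{m/(2m−1)} utilisation bound quoted above evaluates to

$$ \frac{m}{2m-1}\bigg|_{m=2} = \tfrac{2}{3} \approx 0.67, \qquad \frac{m}{2m-1}\bigg|_{m=4} = \tfrac{4}{7} \approx 0.57, \qquad \lim_{m\to\infty} \frac{m}{2m-1} = \tfrac{1}{2}, $$

i.e. roughly half of the platform capacity on large machines, which is what the virtual-clustering result improves upon.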

Relevance:

20.00%

Publisher:

Abstract:

Consider a wireless sensor network (WSN) where a broadcast from a sensor node does not reach all sensor nodes in the network; such networks are often called multihop networks. Sensor nodes take individual sensor readings; however, in many cases it is relevant to compute aggregated quantities of these readings. In fact, the minimum and maximum of all sensor readings at a given instant are often interesting because they indicate abnormal behavior; for example, a very high maximum temperature may indicate that a fire has broken out. In this context, we propose an algorithm for computing the min or max of sensor readings in a multihop network. This algorithm has the particularly interesting property that its time complexity does not depend on the number of sensor nodes; only the network diameter and the range of the value domain of the sensor readings matter.
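
The abstract does not detail the proposed algorithm itself (whose complexity depends on the network diameter and on the range of the value domain rather than on the number of nodes). Purely as a point of reference, a naive diameter-bounded flooding baseline for distributed max, with made-up node IDs and readings, looks like the sketch below; it is not the algorithm proposed in the paper.

```python
def flood_max(readings, neighbours, diameter):
    """Baseline distributed max: every node repeatedly takes the max over its
    neighbourhood; after `diameter` rounds each node holds the global maximum.
    `readings` maps node -> value, `neighbours` maps node -> iterable of nodes."""
    value = dict(readings)
    for _ in range(diameter):
        value = {n: max([value[n]] + [value[m] for m in neighbours[n]])
                 for n in value}
    return value

# Example: a 4-node line network 0-1-2-3 (diameter 3).
nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(flood_max({0: 17, 1: 42, 2: 8, 3: 23}, nbrs, diameter=3))  # every node -> 42
```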

Relevance:

20.00%

Publisher:

Abstract:

An approach for the analysis of uncertainty propagation in reliability-based design optimization of composite laminate structures is presented. Using the Uniform Design Method (UDM), a set of design points is generated over a domain centered on the mean reference values of the random variables. A methodology based on inverse optimal design of composite structures to achieve a specified reliability level is proposed, and the corresponding maximum load is obtained as a function of the ply angle. Using the generated UDM design points as input/output patterns, an Artificial Neural Network (ANN) is developed based on an evolutionary learning process. A Monte Carlo simulation using the developed ANN is then performed to simulate the behavior of the critical Tsai number, the structural reliability index, and their relative sensitivities as functions of the ply angle of the laminates. The results are generated for uniformly distributed random variables on a domain centered on the mean values. The statistical analysis of the results enables the study of the variability of the reliability index and of its sensitivity relative to the ply angle. Numerical examples showing the utility of the approach for the robust design of angle-ply laminates are presented.
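
As a minimal sketch of the Monte Carlo step through a trained surrogate (the function names, the sampling routine and the Tsai-number failure threshold of 1 are assumptions for illustration, not taken from the paper):

```python
import numpy as np
from scipy.stats import norm

def mc_reliability(surrogate, sample_inputs, n_samples=100_000, seed=0):
    """Monte Carlo estimate of failure probability and reliability index, using a
    trained surrogate (e.g. an ANN) in place of the expensive structural model.
    `sample_inputs(n, rng)` draws n random-variable vectors; `surrogate(x)` returns
    the critical Tsai number for each row (failure assumed when it drops below 1)."""
    rng = np.random.default_rng(seed)
    x = sample_inputs(n_samples, rng)
    tsai = surrogate(x)
    p_fail = float(np.mean(tsai < 1.0))
    beta = -norm.ppf(p_fail) if 0.0 < p_fail < 1.0 else np.inf
    return p_fail, beta

# Hypothetical usage with a trained ANN `ann.predict` and a sampler `draw_inputs`:
# p_f, beta = mc_reliability(ann.predict, draw_inputs)
```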

Relevance:

20.00%

Publisher:

Abstract:

PROFIBUS is an international standard (IEC 61158, EN 50170) for factory-floor communications, with several thousand installations worldwide. Taking into account the increasing need for mobile devices in industrial environments, one obvious solution is to extend traditional wired PROFIBUS networks with wireless capabilities. In this paper, we outline the major aspects of a hybrid wired/wireless PROFIBUS-based architecture, in which most of the design options were made in order to guarantee the real-time behaviour of the overall network. We also introduce the timing unpredictability problems resulting from the coexistence of heterogeneous physical media in the same network. The major focus of this paper, however, is on how to guarantee real-time communications in such a hybrid network, where nodes (and whole segments) can move between different radio cells (inter-cell mobility). Assuming a simple mobility management mechanism based on mobile nodes performing periodic radio channel assessment and switching, we propose a methodology to compute values for specific parameters that enable an optimal (minimum) and bounded duration of the handoff procedure.
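
Purely as a hypothetical illustration of why such parameters bound the handoff duration (the symbols below are not taken from the paper): if a mobile node must assess $n_{ch}$ radio channels, spending $t_{as}$ per assessment and $t_{sw}$ per channel switch, then the handoff procedure is bounded by roughly

$$ t_{\text{handoff}} \lesssim n_{ch}\,(t_{as} + t_{sw}), $$

so choosing those parameters fixes an optimal (minimum) bounded handoff duration. The actual parameters and expression used in the paper are not given in the abstract.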