91 results for Electronic optimization


Relevance: 20.00%

Abstract:

Intensive use of Distributed Generation (DG) represents a change in the paradigm of power systems operation, making small-scale energy generation and storage decisions relevant to the operation of the whole system. This paradigm led to the concept of the smart grid, for which efficient management, in both technical and economic terms, must be assured. This paper presents a new approach to solve the economic dispatch problem in smart grids. The proposed methodology for resource management involves two stages. The first uses fuzzy set theory to define range forecasts for the natural resources as well as for the load. The second uses heuristic optimization to determine the economic dispatch, considering the generation forecast, storage management and demand response.
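The abstract does not detail the two stages, but the idea can be sketched as follows: a fuzzy (here, triangular) range forecast for the renewable resource and the load, followed by a simple merit-order heuristic that dispatches storage, generation and demand response against a pessimistic scenario. The sketch is written in Python; all unit names, capacities and costs are illustrative assumptions, not values from the paper.

# Minimal two-stage sketch: fuzzy range forecast + heuristic economic dispatch.
# All unit data and costs below are illustrative, not taken from the paper.

def triangular(low, mode, high):
    """Represent a fuzzy forecast as a triangular number (low, mode, high)."""
    return {"low": low, "mode": mode, "high": high}

# Stage 1: fuzzy range forecasts for renewable generation and load (MW).
wind_forecast = triangular(10.0, 18.0, 24.0)
load_forecast = triangular(55.0, 60.0, 68.0)

# Dispatchable resources: (name, capacity MW, marginal cost EUR/MWh).
units = [("storage_discharge", 10.0, 40.0),
         ("chp_unit", 20.0, 55.0),
         ("diesel_backup", 15.0, 120.0),
         ("demand_response", 8.0, 90.0)]  # paid load curtailment

def greedy_dispatch(residual_load, units):
    """Stage 2 heuristic: fill the residual load in merit order (cheapest first)."""
    schedule, remaining = {}, residual_load
    for name, capacity, cost in sorted(units, key=lambda u: u[2]):
        power = min(capacity, max(remaining, 0.0))
        schedule[name] = power
        remaining -= power
    return schedule, remaining

# Dispatch against a pessimistic scenario: high load, low renewable output.
residual = load_forecast["high"] - wind_forecast["low"]
schedule, unserved = greedy_dispatch(residual, units)
print(schedule, "unserved MW:", round(unserved, 1))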

Relevance: 20.00%

Abstract:

Swarm Intelligence (SI) is a growing research field of Artificial Intelligence (AI). SI is the general term for several computational techniques that use ideas and draw inspiration from the social behaviour of insects and other animals. This paper presents the hybridization and combination of different AI approaches, such as Bio-Inspired Techniques (BIT), Multi-Agent Systems (MAS) and Machine Learning Techniques (MLT). The resulting system is applied to the problem of scheduling jobs to machines in dynamic manufacturing environments.

Relevance: 20.00%

Abstract:

Scheduling is a critical function that is present throughout many industries and applications. A great need exists for developing scheduling approaches that can be applied to a number of different scheduling problems with significant impact on the performance of business organizations. A challenge is emerging in the design of scheduling support systems for manufacturing environments, where dynamic adaptation and optimization become increasingly important. In this scenario, self-optimization arises as the ability of the agent to monitor its state and performance and proactively tune itself in response to environmental stimuli.

Relevance: 20.00%

Abstract:

With the increasing importance of commerce across the Internet, it is becoming increasingly evident that in a few years the Internet will host a large number of interacting software agents. A vast number of them will be economically motivated and will negotiate a variety of goods and services. It is therefore important to consider the economic incentives and behaviours of economic software agents, and to use all available means to anticipate their collective interactions. This paper addresses this concern by presenting a multi-agent market simulator designed for analysing agent market strategies based on a complete understanding of buyer and seller behaviours, preference models and pricing algorithms, considering risk preferences. The system includes agents that are capable of increasing their performance with their own experience, by adapting to the market conditions. The results of the negotiations between agents are analysed by data mining algorithms in order to extract rules that give agents feedback to improve their strategies.
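The abstract does not describe the pricing algorithms themselves, so the following is only a minimal sketch of the kind of experience-based adaptation mentioned: a hypothetical seller agent that raises its ask after a successful negotiation and concedes after a failed one. All names, parameters and the buyer model are illustrative assumptions.

import random

class AdaptiveSeller:
    """Hypothetical seller agent that adapts its ask price to market feedback."""
    def __init__(self, ask=100.0, step=0.05, floor=60.0):
        self.ask, self.step, self.floor = ask, step, floor

    def propose(self):
        return self.ask

    def feedback(self, sold):
        # Experience-based adaptation: ask more after a sale,
        # concede after a failed negotiation, never below the cost floor.
        self.ask *= (1 + self.step) if sold else (1 - self.step)
        self.ask = max(self.ask, self.floor)

random.seed(1)
seller = AdaptiveSeller()
for _ in range(20):
    buyer_limit = random.uniform(70.0, 110.0)   # buyer's private reservation price
    sold = seller.propose() <= buyer_limit
    seller.feedback(sold)
print("ask price after 20 rounds:", round(seller.ask, 2))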

Relevance: 20.00%

Abstract:

In real optimization problems, the analytical expression of the objective function and of its derivatives is usually unknown, or too complex to be useful. In these cases it becomes essential to use optimization methods where the calculation of the derivatives, or the verification of their existence, is not necessary: direct search methods, or derivative-free methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, the choice of the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a bi-objective problem; in this problem a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter-dependent than penalty functions. In this work, we present a new direct search method, based on simplex methods, for general constrained optimization that combines the features of simplex and filter methods. This method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behaviour of our algorithm through some examples. The proposed methods were implemented in Java.
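The abstract only outlines the acceptance rule (a trial point is accepted if it is not dominated, in objective value and constraint violation, by previously stored points), so the sketch below is an illustrative reading of a simplex search with filter acceptance, written in Python rather than the authors' Java; the test problem, reflection and shrink coefficients and iteration limit are assumptions.

import numpy as np

# Illustrative problem (not from the paper): minimise f subject to g(x) <= 0.
def f(x):            # objective
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def violation(x):    # aggregated constraint violation h(x) >= 0
    g = x[0] + x[1] - 2.0          # single constraint: x0 + x1 <= 2
    return max(g, 0.0)

def acceptable(point, filt):
    """Filter acceptance: (h, f) must not be dominated by any filter entry."""
    h, fx = point
    return all(h < hi or fx < fi for hi, fi in filt)

def simplex_filter(x0, iters=200, step=0.5):
    n = len(x0)
    simplex = [np.array(x0, float)] + \
              [np.array(x0, float) + step * np.eye(n)[i] for i in range(n)]
    filt = []                                    # list of (h, f) pairs
    for _ in range(iters):
        simplex.sort(key=lambda x: (violation(x), f(x)))
        best, worst = simplex[0], simplex[-1]
        centroid = np.mean(simplex[:-1], axis=0)
        reflected = centroid + (centroid - worst)          # reflection step
        cand = (violation(reflected), f(reflected))
        if acceptable(cand, filt):
            simplex[-1] = reflected
            filt = [(h, v) for h, v in filt
                    if not (cand[0] <= h and cand[1] <= v)] + [cand]
        else:                                              # shrink towards best
            simplex = [best + 0.5 * (x - best) for x in simplex]
    return min(simplex, key=lambda x: (violation(x), f(x)))

x_star = simplex_filter([3.0, 3.0])
print("approximate solution:", np.round(x_star, 3))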

Relevance: 20.00%

Abstract:

The filter method is a technique for solving nonlinear programming problems. The filter algorithm has two phases in each iteration: the first reduces a measure of infeasibility, while the second reduces the objective function value. In real optimization problems, the objective function is often not differentiable or its derivatives are unknown. In these cases it becomes essential to use optimization methods where the calculation of the derivatives, or the verification of their existence, is not necessary: direct search methods, or derivative-free methods, are examples of such techniques. In this work we present a new direct search method, based on simplex methods, for general constrained optimization that combines the features of simplex and filter methods. This method neither computes nor approximates derivatives, penalty constants or Lagrange multipliers.
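The two phases described here amount to maintaining a set of non-dominated (infeasibility, objective) pairs. A minimal sketch of that bookkeeping is shown below; the class name and tie-breaking details are assumptions, since the abstract does not specify them.

class Filter:
    """Keeps non-dominated (infeasibility h, objective f) pairs."""
    def __init__(self):
        self.entries = []        # list of (h, f)

    def acceptable(self, h, f):
        # A trial point is acceptable if no stored pair dominates it,
        # i.e. it improves infeasibility or the objective w.r.t. every entry.
        return all(h < he or f < fe for he, fe in self.entries)

    def add(self, h, f):
        # Drop entries dominated by the new pair, then store it.
        self.entries = [(he, fe) for he, fe in self.entries
                        if not (h <= he and f <= fe)]
        self.entries.append((h, f))

filt = Filter()
for pair in [(2.0, 10.0), (0.5, 12.0), (0.0, 11.0), (0.0, 9.0)]:
    if filt.acceptable(*pair):
        filt.add(*pair)
print(filt.entries)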

Relevance: 20.00%

Abstract:

In this work we solve Mathematical Programs with Complementarity Constraints (MPCC) using the hyperbolic smoothing strategy. Under this approach, the complementarity condition is relaxed through the use of the hyperbolic smoothing function, which involves a positive parameter that can be decreased to zero. An iterative algorithm was implemented in the MATLAB language and tested on a set of AMPL problems from the MacMPEC database.
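The exact form of the hyperbolic smoothing function is not given in the abstract, so the sketch below uses one common choice: in min(x, y) = (x + y - |x - y|)/2, the absolute value is replaced by sqrt((x - y)^2 + tau^2), and tau is driven towards zero. The toy problem is not from MacMPEC, and the example uses Python with SciPy instead of the MATLAB/AMPL setup mentioned above.

import numpy as np
from scipy.optimize import minimize

# Toy MPCC (not from MacMPEC): min (x-1)^2 + (y-2)^2  s.t.  x >= 0, y >= 0, x*y = 0.
def objective(z):
    x, y = z
    return (x - 1.0) ** 2 + (y - 2.0) ** 2

def smoothed_complementarity(z, tau):
    # Hyperbolic smoothing of min(x, y) = 0: replace |x - y| in
    # min(x, y) = (x + y - |x - y|)/2 by sqrt((x - y)^2 + tau^2).
    x, y = z
    return 0.5 * (x + y - np.sqrt((x - y) ** 2 + tau ** 2))

z = np.array([1.0, 1.0])
for tau in [1.0, 0.1, 0.01, 0.001]:          # drive the smoothing parameter to zero
    res = minimize(objective, z, method="SLSQP",
                   constraints=[{"type": "eq", "fun": smoothed_complementarity,
                                 "args": (tau,)},
                                {"type": "ineq", "fun": lambda z: z}])  # x, y >= 0
    z = res.x
print("solution:", np.round(z, 4), "objective:", round(float(objective(z)), 4))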

Relevance: 20.00%

Abstract:

Master's degree in Informatics Engineering.

Relevance: 20.00%

Abstract:

We have developed a new single-drop microextraction (SDME) method for the preconcentration of organochlorine pesticides (OCPs) from complex matrices. It is based on the use of a silicone ring at the tip of the syringe. A 5 μL drop of n-hexane applied to an aqueous extract containing the OCPs was found to be adequate to preconcentrate the OCPs prior to analysis by GC in combination with tandem mass spectrometry. Fourteen OCPs were determined using this technique in combination with programmable temperature vaporization, which is shown to have many advantages over traditional split/splitless injection. The effects of the type of organic solvent, exposure time, agitation and organic drop volume were optimized. Relative recoveries ranged from 59 to 117%, with repeatabilities (coefficient of variation) below 15%. The limits of detection ranged from 0.002 to 0.150 μg kg⁻¹. The method was applied to the preconcentration of OCPs in fresh strawberry, strawberry jam, and soil.

Relevance: 20.00%

Abstract:

A QuEChERS method has been developed for the determination of 14 organochlorine pesticides in 14 soils from different Portuguese regions with a wide range of compositions. The extracts were analysed by GC-ECD (gas chromatography with electron-capture detection) and confirmed by GC-MS/MS (gas chromatography with tandem mass spectrometry). The organic matter content is a key factor in the process efficiency, so the optimization was carried out according to the soils' organic carbon level, divided into two groups: HS (organic carbon > 2.3%) and LS (organic carbon < 2.3%). The method was validated through linearity, recovery, precision and accuracy studies. Quantification was carried out using matrix-matched calibration to minimize matrix effects. Acceptable recoveries were obtained (70–120%) with a relative standard deviation of ≤16% for the three levels of contamination. In HS soils, the limits of detection ranged from 3.42 to 23.77 μg kg⁻¹ and the limits of quantification from 11.41 to 79.23 μg kg⁻¹. For LS soils, the limits of detection ranged from 6.11 to 14.78 μg kg⁻¹ and the limits of quantification from 20.37 to 49.27 μg kg⁻¹. Of the 14 collected soil samples, only one showed a residue of dieldrin (45.36 μg kg⁻¹) above the limit of quantification. This methodology combines the advantages of QuEChERS, GC-ECD detection and GC-MS/MS confirmation, producing a very rapid, sensitive and reliable procedure that can be applied in routine analytical laboratories.

Relevance: 20.00%

Abstract:

Scientific evidence has shown an association between exposure to organochlorine compounds (OCC) and human health hazards. Accordingly, OCC detection in human adipose samples has to be considered a public health priority. This study evaluated the efficacy of various solid-phase extraction (SPE) and cleanup methods for OCC determination in human adipose tissue. Octadecylsilyl endcapped (C18-E), benzenesulfonic acid modified silica cation exchanger (SA), poly(styrene-divinylbenzene) (EN) and EN/RP18 SPE sorbents were evaluated. The relative sample cleanup provided by these SPE columns was evaluated using gas chromatography with electron-capture detection (GC-ECD). The C18-E columns with strong homogenization were found to provide the most effective cleanup, removing the greatest amount of interfering substances while ensuring analyte recoveries higher than 70%. Recoveries above 70% with standard deviations (SD) below 15% were obtained for all compounds under the selected conditions. Method detection limits were in the 0.003–0.009 mg/kg range. The positive samples were confirmed by gas chromatography coupled with tandem mass spectrometry (GC-MS/MS). The OCC found at the highest percentages in real samples were HCB, o,p′-DDT and methoxychlor, which were detected in 80 and 95% of the samples analyzed, respectively.

Relevance: 20.00%

Abstract:

A multiclass analysis method was optimized for the analysis of pesticide traces by gas chromatography with ion-trap tandem mass spectrometry (GC-MS/MS). The influence of several analytical parameters on the pesticide signal response was explored. Five ion-trap mass spectrometry (IT-MS) operating parameters, namely isolation time (IT), excitation voltage (EV), excitation time (ET), maximum excitation energy or "q" value (q), and isolation mass window (IMW), were numerically tested in order to maximize the instrument's analytical signal response. Multiple linear regression was used in the data analysis to evaluate the influence of the five parameters on the analytical response of the ion-trap mass spectrometer and to predict its response. The assessment of the five parameters based on the regression equations substantially increased the sensitivity of IT-MS/MS in the MS/MS mode. The results obtained show that, for most of the pesticides, these parameters have a strong influence on both signal response and detection limit. Using the optimized method, a multiclass pesticide analysis was performed for 46 pesticides in a strawberry matrix. Levels higher than the limit established for strawberries by the European Union were found in some samples.
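As an illustration of the regression step only, the sketch below fits an ordinary least-squares model relating the five IT-MS parameters to a simulated signal response and uses it for prediction. The data, parameter ranges and coefficients are synthetic assumptions; they are not the values studied in the paper.

import numpy as np

# Synthetic illustration: relate five IT-MS parameters to an observed peak area.
# Columns: isolation time, excitation voltage, excitation time, q value, isolation mass window.
rng = np.random.default_rng(0)
X = rng.uniform([2, 0.2, 10, 0.2, 1], [12, 1.5, 40, 0.9, 5], size=(30, 5))
true_coef = np.array([1.5, 40.0, 0.8, -25.0, -3.0])          # made-up sensitivities
y = 100.0 + X @ true_coef + rng.normal(0, 5.0, size=30)      # simulated responses

# Ordinary least squares fit of y = b0 + b1*IT + b2*EV + b3*ET + b4*q + b5*IMW.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("estimated intercept and coefficients:", np.round(coef, 2))

# Predict the response for one candidate setting of the five parameters.
candidate = np.array([1.0, 8.0, 1.0, 30.0, 0.45, 2.0])       # leading 1 = intercept term
print("predicted response:", round(float(candidate @ coef), 1))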

Relevance: 20.00%

Abstract:

Master's degree in Electrical and Computer Engineering, specialization in Telecommunications.

Relevance: 20.00%

Abstract:

This work arose within the scope of the Master's thesis in Chemical Engineering (Energy Optimization in the Chemical Industry branch), in response to the need of the company Monteiro Ribas – Indústrias, S.A. to solve some problems related to the ovens of unit J of its coatings factory. Another objective was to propose energy-efficiency improvements in this sector of the company. To this end, an energy survey of the whole unit was carried out, which showed that the drying ovens (Coating lines 1 and 2) should be the main object of study. The energy survey of the company revealed an annual energy consumption of 697.9 toe, which classifies it, under Decree-Law no. 71 of 15 April 2008, as an Intensive Energy Consumer (CIE). In addition, the situations that should be targeted for improvement are: the thermal-fluid network, which has uninsulated valves; the lighting system, which is not the most efficient; and the compressed-air distribution network, which does not have the most adequate structure. It is therefore suggested that the valves of the thermal-fluid distribution network be insulated with rock wool; the total investment is €2,481.56, but the savings can reach €21,145.14/year, with a payback period of 0.12 years. For the lighting system, the replacement of conventional ballasts by electronic ones is proposed; the total investment is €13,873.74 and the savings €2,620.26/year, with a payback period of 5 years. In the drying process of the coating lines, the temperatures of all components and the air velocities were measured, which made it possible to determine how the heat supplied by the thermal fluid is distributed. In Coating line 1, the air receives between 39 and 51% of the total heat, the web receives about 25%, and in the third oven this value is only 6%. On this line, radiation heat losses range between 6 and 11%, while convection losses represent about 17 to 44%. Since the heat received by the web is much lower than the heat received by the air in Coating line 1, a reduction of the air flow entering the oven is proposed, which will certainly lead to thermal energy savings. In Coating line 2, the heat supplied to the air represents about 51 to 77% of the total heat and the heat transferred to the web ranges between 2 and 3%. Convection heat losses range between 12 and 26%, while radiation losses are between 4 and 8%. The heat needed to evaporate the solvents ranges between 4 and 13%. The mass and energy balances performed on the drying process also made it possible to determine the efficiency of the three ovens of Coating line 1: 36, 47 and 24% for ovens 1, 2 and 3, respectively. In Coating line 2 the efficiencies were higher, with values close to 41, 81 and 88% for ovens 1, 2 and 3, respectively. In view of these results, a re-engineering of the process is proposed, introducing compact heat exchangers to preheat the air before it enters the ovens. This modification was studied only for oven 1 of Coating line 1, yielding a heat-transfer area of 6.80 m², an associated investment of €8,867.81 and savings of €708.88/year, with a payback period of 13 years. Another suggestion is the recirculation of part of the outlet air (5%), which leads to savings of €158.02/year. These figures are not very significant and do not encourage the adoption of these last suggestions.
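The payback periods quoted above follow from a simple, undiscounted payback calculation (investment divided by annual savings); the short check below reproduces them from the figures given in the abstract.

# Simple payback check for the measures quoted in the abstract
# (payback = investment / annual savings; no discounting assumed).
measures = {
    "thermal-fluid valve insulation": (2481.56, 21145.14),
    "electronic lighting ballasts":   (13873.74, 2620.26),
    "compact air pre-heater (oven 1)": (8867.81, 708.88),
}
for name, (investment, savings) in measures.items():
    print(f"{name}: {investment / savings:.2f} years")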