940 results for Evolutionary optimization methods


Relevance:

30.00%

Publisher:

Abstract:

Distributed Energy Resources (DER) scheduling in smart grids presents a new challenge to system operators. The increase of new resources, such as storage systems and demand response programs, results in additional computational effort for optimization problems. On the other hand, since natural resources such as wind and sun can only be forecasted accurately over short horizons, short-term scheduling is especially relevant, requiring very good performance on large-dimension problems. Traditional techniques such as Mixed-Integer Non-Linear Programming (MINLP) do not cope well with large-scale problems; problems of this type can be appropriately addressed by metaheuristic approaches. This paper proposes a new methodology called Signaled Particle Swarm Optimization (SiPSO) to address the energy resources management problem in the scope of smart grids with intensive use of DER. The proposed methodology's performance is illustrated by a case study with 99 distributed generators, 208 loads, and 27 storage units. The results are compared with those obtained by other methodologies, namely MINLP, Genetic Algorithm, original Particle Swarm Optimization (PSO), Evolutionary PSO, and New PSO. SiPSO's performance is superior to that of the other tested PSO variants, demonstrating its adequacy for large-dimension problems that require a decision in a short period of time.
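The abstract does not give SiPSO's update rules, but the PSO family it extends follows a common template: each particle updates a velocity from inertia, a pull toward its personal best, and a pull toward the swarm best. A minimal sketch of that baseline PSO (not the paper's SiPSO; all parameter values are illustrative) on a toy minimization problem:

```python
import random

def pso(f, dim, bounds, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal baseline particle swarm optimizer (illustrative, not SiPSO)."""
    lo, hi = bounds
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                       # personal best positions
    pbest_val = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]    # swarm (global) best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))  # stay in bounds
            val = f(X[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = X[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = X[i][:], val
    return gbest, gbest_val

# Usage: minimize the 5-dimensional sphere function (fixed seed for reproducibility).
random.seed(42)
best, val = pso(lambda x: sum(t * t for t in x), dim=5, bounds=(-10, 10))
```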


Power system planning, control, and operation require an adequate use of existing resources so as to increase system efficiency. The use of optimal solutions in power systems allows huge savings, stressing the need for adequate optimization and control methods. These must be able to solve the envisaged optimization problems in time scales compatible with operational requirements. Power systems are complex, uncertain, and changing environments, which makes the use of traditional optimization methodologies impracticable in most real situations. Computational intelligence methods present good characteristics for addressing this kind of problem and have already proved efficient for very diverse power system optimization problems. Evolutionary computation, fuzzy systems, swarm intelligence, artificial immune systems, neural networks, and hybrid approaches are presently seen as the most adequate methodologies for several planning, control, and operation problems in power systems. Future power systems, with intensive use of distributed generation and electricity market liberalization, will be more complex and bring huge challenges to the forefront of the power industry. Decentralized intelligence and decision making require more effective optimization and control techniques, so that the involved players can make the most adequate use of existing resources in the new context. The application of computational intelligence methods to several problems of future power systems is presented in this chapter. Four different applications are presented to illustrate the promise and potential of computational intelligence.


In this work we solve Mathematical Programs with Complementarity Constraints using the hyperbolic smoothing strategy. Under this approach, the complementarity condition is relaxed through the use of the hyperbolic smoothing function, involving a positive parameter that can be decreased to zero. An iterative algorithm was implemented in MATLAB and tested on a set of AMPL problems from the MacMPEC database.
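The abstract does not reproduce the smoothing function itself. One common choice, assumed here, smooths the equivalent condition min(a, b) = 0 by replacing |t| with sqrt(t² + τ²); as the positive parameter τ decreases toward zero, the smoothed residual of a feasible complementary pair also goes to zero:

```python
import math

def smoothed_min(a, b, tau):
    # Hyperbolic smoothing of min(a, b), using
    # min(a, b) = (a + b - |a - b|) / 2 with |t| ~ sqrt(t**2 + tau**2).
    return 0.5 * (a + b - math.sqrt((a - b) ** 2 + tau ** 2))

# The complementarity condition a >= 0, b >= 0, a*b = 0 is equivalent to
# min(a, b) = 0; an iterative scheme drives the smoothed residual to zero
# while shrinking tau, as sketched on the feasible pair (3, 0):
tau = 1.0
residuals = []
while tau > 1e-6:
    residuals.append(abs(smoothed_min(3.0, 0.0, tau)))
    tau *= 0.1   # decrease the smoothing parameter toward zero
```

As tau shrinks, smoothed_min approaches the exact min and the residual of the feasible pair vanishes.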


Scientific evidence has shown an association between exposure to organochlorine compounds (OCC) and human health hazards. Accordingly, OCC detection in human adipose samples must be considered a public health priority. This study evaluated the efficacy of various solid-phase extraction (SPE) and cleanup methods for OCC determination in human adipose tissue. Octadecylsilyl endcapped (C18-E), benzenesulfonic acid modified silica cation exchanger (SA), poly(styrene-divinylbenzene) (EN) and EN/RP18 SPE sorbents were evaluated. The relative sample cleanup provided by these SPE columns was evaluated using gas chromatography with electron capture detection (GC-ECD). The C18-E columns with strong homogenization were found to provide the most effective cleanup, removing the greatest amount of interfering substances while ensuring good analyte recoveries. Recoveries >70% with standard deviations (SD) <15% were obtained for all compounds under the selected conditions. Method detection limits were in the 0.003–0.009 mg/kg range. Positive samples were confirmed by gas chromatography coupled with tandem mass spectrometry (GC-MS/MS). The OCC most frequently found in real samples were HCB, o,p′-DDT and methoxychlor, detected in 80–95% of the samples analyzed. Copyright © 2012 John Wiley & Sons, Ltd.


The BALA project (Biodiversity of Arthropods of Laurisilva of the Azores) is a research initiative to quantify the spatial distribution of arthropod biodiversity in native forests of the Azores archipelago. Arthropods were collected using a combination of two techniques, targeting epigean (ground-dwelling) and canopy (arboreal) arthropods: pitfall traps (with Turquin and Ethylene solutions) and beating samples (using the three most dominant plant species). A total of 109 transects distributed amongst 18 forest fragments in seven of the nine Azorean islands were used in this study. The performance of alternative sampling methods and levels of effort was tested. No significant differences were found in the accumulated number of species captured whether an alternative method was used or whether another transect with similar effort was established elsewhere in the same fragment. A combination of Ethylene and Turquin traps captured more species per individual, Turquin traps and beating captured more species per sample, and Turquin traps captured more species per unit time. An optimization exercise showed that the protocol applied in recent years is very close to optimal, allowing its future replication with confidence. The minimum combinations of sampling effort and methods needed to monitor or to inventory diversity, for different proportions of sample completeness, are discussed.
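The per-method efficiency measures compared above (species per individual, species per sample) can be computed directly from capture records. A minimal sketch with hypothetical data, where each record represents one captured individual:

```python
from collections import defaultdict

# Hypothetical capture records: (method, sample_id, species), one per individual.
records = [
    ("turquin", 1, "sp_a"), ("turquin", 1, "sp_b"), ("turquin", 2, "sp_a"),
    ("ethylene", 1, "sp_a"), ("ethylene", 2, "sp_c"),
    ("beating", 1, "sp_b"), ("beating", 1, "sp_d"), ("beating", 2, "sp_d"),
]

def efficiency(records):
    """Species per individual and species per sample, for each method."""
    by_method = defaultdict(lambda: {"individuals": 0, "samples": set(), "species": set()})
    for method, sample, species in records:
        m = by_method[method]
        m["individuals"] += 1       # each record is one captured individual
        m["samples"].add(sample)
        m["species"].add(species)
    return {method: (len(m["species"]) / m["individuals"],
                     len(m["species"]) / len(m["samples"]))
            for method, m in by_method.items()}

eff = efficiency(records)
```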


Objective: A new protocol for fixation and slide preservation was evaluated in order to improve the quality of immunocytochemical reactions on cytology slides. Methods: The quality of immunoreactions was evaluated retrospectively on 186 cytology slides (130 direct smears, 56 cytospins) prepared from different cytology samples. Ninety-three of the slides were air dried, stored at -20 °C and fixed in acetone for 10 minutes (Protocol 1), whereas the other 93 were immediately fixed in methanol at -20 °C for at least 30 minutes, subsequently protected with polyethylene glycol (PEG) and stored at room temperature (Protocol 2). Immunocytochemical staining, with eight primary antibodies, was performed on a Ventana BenchMark Ultra instrument using an UltraView Universal DAB Detection Kit. The following parameters were evaluated for each immunoreaction: morphology preservation, intensity of specific staining, background and counterstain. The slides were blinded and independently scored by four observers with marks from 0 to 20. Results: The quality of immunoreactions was better on methanol-fixed slides protected with PEG than on air-dried slides stored in the freezer: mean score 14.44 ± 3.58 versus 11.02 ± 3.86, respectively (P < 0.001). Conclusion: Immediate fixation of cytology slides in cold methanol with subsequent application of PEG is an easy and straightforward procedure that improves the quality of immunocytochemical reactions and allows the storage of the slides at room temperature.


An optimised version of the Quick, Easy, Cheap, Effective, Rugged and Safe (QuEChERS) method for the simultaneous determination of 14 organochlorine pesticides in carrots was developed using gas chromatography with electron-capture detection (GC-ECD) and confirmation by gas chromatography tandem mass spectrometry (GC-MS/MS). A citrate-buffered version of QuEChERS was applied for the extraction of the organochlorine pesticides, and for the extract clean-up, primary secondary amine, octadecyl-bonded silica (C18), magnesium sulphate (MgSO4) and graphitized carbon black were used as sorbents. The GC-ECD determination of the target compounds was achieved in less than 20 min. The limits of detection were below the EU maximum residue limits (MRLs) for carrots, 10–50 μg kg−1, although the limit of quantification exceeded 10 μg kg−1 for hexachlorobenzene (HCB). The introduction of a sonication step was shown to improve the recoveries. The overall average recoveries in carrots, at the four tested levels (60, 80, 100 and 140 μg kg−1), ranged from 66 to 111% with relative standard deviations in the range of 2–15% (n = 3) for all analytes, with the exception of HCB. The method was applied to the analysis of 21 carrot samples from different Portuguese regions; β-HCH was the pesticide most frequently found, with concentrations ranging from below the limit of quantification to 14.6 μg kg−1. Only one sample had a pesticide residue (β-HCH) above the MRL, at 14.6 μg kg−1. This methodology combines the advantages of both QuEChERS and GC-ECD, producing a very rapid, sensitive and reliable procedure suitable for routine analytical laboratories.


This paper presents a modified Particle Swarm Optimization (PSO) methodology to solve the problem of energy resources management with high penetration of distributed generation and Electric Vehicles (EVs) with gridable capability (V2G). The objective of the day-ahead scheduling problem in this work is to minimize operation costs, namely energy costs, regarding the management of these resources in the smart grid context. The modifications applied to the PSO aimed to improve its adequacy for the mentioned problem. The proposed Application Specific Modified Particle Swarm Optimization (ASMPSO) includes an intelligent mechanism to adjust velocity limits during the search process, as well as self-parameterization of the PSO parameters, making it more user-independent. It presents better robustness and convergence characteristics than the tested PSO variants, as well as better constraint handling. This enables its use for addressing real-world large-scale problems in much shorter times than deterministic methods, providing system operators with adequate decision support and achieving efficient resource scheduling, even when a significant number of alternative scenarios must be considered. The paper includes two realistic case studies with different penetrations of gridable vehicles (1000 and 2000). The proposed methodology is about 2600 times faster than the Mixed-Integer Non-Linear Programming (MINLP) reference technique, reducing the time required from 25 h to 36 s for the scenario with 2000 vehicles, with about one percent of difference in the objective function cost value.
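The abstract does not specify how ASMPSO adjusts its velocity limits; a simple assumed mechanism with the same intent (tighten the clamp while the swarm keeps improving, loosen it when progress stalls) could look like the following, where all rule and parameter choices are hypothetical:

```python
def adaptive_vmax(vmax, improved, shrink=0.95, grow=1.05, vmin=1e-3, vcap=10.0):
    """Illustrative velocity-limit update (an assumed rule, not the paper's):
    tighten the clamp when the swarm improved this iteration (exploitation),
    loosen it when it stalled (exploration), within fixed lower/upper caps."""
    new = vmax * (shrink if improved else grow)
    return max(vmin, min(vcap, new))

# Usage: after each PSO iteration, update the clamp from an improvement flag.
v = 1.0
v = adaptive_vmax(v, improved=True)   # swarm improved: tighten the limit
v = adaptive_vmax(v, improved=False)  # progress stalled: loosen it again
```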


Short-term electricity load forecasting is very important for the operation of power systems. In this work, a classical exponential smoothing model, Holt-Winters with double seasonality, was tested for accurate predictions on the Portuguese demand time series. Several metaheuristic algorithms were used for the optimal selection of the smoothing parameters of the Holt-Winters forecast function; tests on the time series showed little difference among methods, so simple local search algorithms are recommended, as they are easier to implement.
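The tuning task described above, searching the smoothing-parameter space for the best in-sample fit, can be illustrated on a simpler model. The sketch below applies hill-climbing local search to the single parameter of simple exponential smoothing; the paper's Holt-Winters model with double seasonality has several such parameters, but the search idea is the same:

```python
def ses_sse(series, alpha):
    """One-step-ahead squared-error sum for simple exponential smoothing."""
    level, sse = series[0], 0.0
    for y in series[1:]:
        sse += (y - level) ** 2                 # forecast error at this step
        level = alpha * y + (1 - alpha) * level  # update the smoothed level
    return sse

def local_search_alpha(series, alpha=0.5, step=0.25, tol=1e-4):
    """Hill climbing over the smoothing parameter, halving the step when no
    neighbouring value improves the fit."""
    best = ses_sse(series, alpha)
    while step > tol:
        moved = False
        for cand in (alpha - step, alpha + step):
            if 0.0 < cand < 1.0:
                val = ses_sse(series, cand)
                if val < best:
                    alpha, best, moved = cand, val, True
        if not moved:
            step /= 2
    return alpha, best

# Usage on a short toy demand series (hypothetical values).
alpha_opt, sse_opt = local_search_alpha([10, 12, 11, 13, 12, 14, 13, 15])
```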



The foot and the ankle are small structures commonly affected by disorders, and their complex anatomy presents significant diagnostic challenges. SPECT/CT image fusion can add the missing anatomical and bone structure information to functional imaging, which is particularly useful for increasing diagnostic certainty of bone pathology. However, due to the duration of SPECT acquisition, a patient's involuntary movements may lead to misalignment between SPECT and CT images. Patient motion can be reduced using a dedicated patient support. We aimed to design an ankle and foot immobilizing device and to measure its efficacy at improving image fusion. Methods: We enrolled 20 patients undergoing distal lower-limb SPECT/CT of the ankle and the foot with and without a foot holder. The misalignment between SPECT and CT images was computed by manually measuring 14 fiducial markers chosen among anatomical landmarks also visible on bone scintigraphy. Analysis of variance was performed for statistical analysis. Results: The absolute average difference without and with the support was 5.1±5.2 mm (mean±SD) and 3.1±2.7 mm, respectively, a significant difference (p<0.001). Conclusion: The introduction of the foot holder significantly decreases misalignment between SPECT and CT images, which may have clinical influence on the precise localization of foot and ankle pathology.
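The reported misalignment figures are absolute per-marker offsets summarized as mean ± SD. A minimal sketch of that summary using hypothetical offsets for a single patient (the study measured 14 fiducial markers per patient and used analysis of variance for the significance test):

```python
import math

def misalignment_stats(distances_mm):
    """Mean and sample standard deviation of fiducial-marker offsets (mm)
    between the SPECT and CT volumes."""
    n = len(distances_mm)
    mean = sum(distances_mm) / n
    var = sum((d - mean) ** 2 for d in distances_mm) / (n - 1)  # sample variance
    return mean, math.sqrt(var)

# Hypothetical per-marker offsets (mm) for one patient, without / with holder.
without = [4.0, 9.5, 2.1, 6.8, 3.0]
with_holder = [2.5, 3.9, 1.8, 4.2, 2.6]
m_wo, sd_wo = misalignment_stats(without)
m_w, sd_w = misalignment_stats(with_holder)
```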


Optimization problems arise in science, engineering, economics, and beyond, and we need to find the best solution for each setting. The methods used to solve these problems depend on several factors, including the amount and type of accessible information, the available algorithms, and, obviously, the intrinsic characteristics of the problem. There are many kinds of optimization problems and, consequently, many kinds of methods to solve them. When the involved functions are nonlinear and their derivatives are unknown or very difficult to calculate, suitable methods are rarer. Such functions are frequently called black-box functions. To solve problems of this kind without constraints (unconstrained optimization), we can use direct search methods, which require no derivatives or approximations of them. But when the problem has constraints (nonlinear programming problems) and, additionally, the constraint functions are black-box functions, it is much more difficult to find the most appropriate method. Penalty methods can then be used. They transform the original problem into a sequence of unconstrained problems derived from the initial one, and this sequence can be solved using the methods available for unconstrained optimization. In this chapter, we present a classification of some of the existing penalty methods and describe some of their assumptions and limitations. These methods allow solving optimization problems with continuous, discrete, and mixed constraints, without requiring continuity, differentiability, or convexity. Thus, penalty methods can be used as the first step in the resolution of constrained problems, by means of methods typically used for unconstrained problems. We also discuss a new class of penalty methods for nonlinear optimization, which adjusts the penalty parameter dynamically.
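A concrete instance of the scheme described above is the classic quadratic penalty method: each outer iteration solves an unconstrained subproblem whose penalty term punishes constraint violation, and the penalty parameter is then increased. The sketch below pairs it with a crude coordinate search standing in for any derivative-free unconstrained solver; all parameter values are illustrative:

```python
def quadratic_penalty_solve(f, g_list, x0, mu=1.0, rho=10.0, outer=8):
    """Quadratic-penalty scheme for min f(x) s.t. g_i(x) <= 0.
    Each subproblem P(x) = f(x) + mu * sum(max(0, g_i(x))**2) is solved
    without constraints by a simple coordinate search."""
    def P(x, mu):
        return f(x) + mu * sum(max(0.0, g(x)) ** 2 for g in g_list)

    x = list(x0)
    for _ in range(outer):
        step = 1.0
        while step > 1e-6:                  # coordinate search on the subproblem
            improved = False
            for d in range(len(x)):
                for s in (step, -step):
                    cand = x[:]
                    cand[d] += s
                    if P(cand, mu) < P(x, mu):
                        x, improved = cand, True
            if not improved:
                step /= 2                   # refine once no move helps
        mu *= rho                           # tighten the penalty each outer pass
    return x

# Usage: minimize (x - 3)^2 subject to x <= 1; the iterates are pushed to x ~ 1.
sol = quadratic_penalty_solve(lambda x: (x[0] - 3) ** 2,
                              [lambda x: x[0] - 1.0], [0.0])
```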


Constrained nonlinear optimization problems can be solved using penalty or barrier functions. This strategy, based on solving unconstrained problems derived from the original one, has proven effective, particularly when used with direct search methods. An alternative is the filter method. The filter method, introduced by Fletcher and Leyffer in 2002, has been widely used to solve problems of the type mentioned above. These methods use a strategy different from barrier or penalty functions: the latter define a new function that combines the objective function and the constraints, while filter methods treat optimization problems as bi-objective problems that minimize the objective function and a function that aggregates the constraints. Motivated by the work of Audet and Dennis in 2004, which used the filter method with derivative-free algorithms, the authors developed works in which other direct search methods were used, combining their potential with the filter method. More recently, a new variant of these methods was presented, in which alternative aggregations of the constraints for the construction of filters were proposed. This paper presents a variant of the filter method, more robust than the previous ones, implemented with a safeguard procedure in which the values of the function and the constraints are interlinked and not treated completely independently.
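The basic filter machinery described above reduces to a dominance test on pairs (f, h), where f is the objective value and h aggregates the constraint violations. A minimal sketch of the plain Fletcher-Leyffer acceptance rule (without the envelope margins or the safeguard interlinking proposed in the paper):

```python
def dominates(a, b):
    """Filter dominance: entry a = (f_a, h_a) dominates b = (f_b, h_b) when it
    is no worse in both objective value and aggregated constraint violation."""
    return a[0] <= b[0] and a[1] <= b[1]

def acceptable(point, filt):
    """Accept a trial point only if no stored filter entry dominates it."""
    return not any(dominates(entry, point) for entry in filt)

def update(point, filt):
    """Add an accepted point and purge entries it dominates."""
    return [e for e in filt if not dominates(point, e)] + [point]

# Usage: a filter with two entries; dominance decides acceptance.
filt = [(5.0, 0.4), (7.0, 0.1)]
assert acceptable((6.0, 0.2), filt)       # not dominated by either entry
assert not acceptable((8.0, 0.5), filt)   # dominated by (5.0, 0.4)
filt = update((4.0, 0.05), filt)          # dominates both stored entries
```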


The increasing use of Carbon-Fibre Reinforced Plastic (CFRP) laminates in high-responsibility applications raises the issue of how to handle them after damage. The availability of efficient repair methods is essential to restore the strength of the structure. Accurate predictive tools for repair behaviour are equally essential to reduce the costs and time associated with extensive testing. This work reports a numerical study of the tensile behaviour of three-dimensional (3D) adhesively-bonded scarf repairs in CFRP structures, using a ductile adhesive. The Finite Element (FE) analysis was performed in ABAQUS® and Cohesive Zone Models (CZMs) were used to simulate damage in the adhesive layer. A parametric study was performed on two geometric parameters. The use of overlaminating plies covering the repaired region on the outer or both repair surfaces was also tested in an attempt to increase repair efficiency. The results allowed the proposal of design principles for repairing CFRP structures.
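The damage simulation mentioned above typically assigns the adhesive layer a traction-separation law. A common choice in CZM work, sketched here with illustrative parameter values (the paper does not state its law or adhesive properties), is the bilinear (triangular) law: linear elastic up to a peak traction, then linear softening to complete failure:

```python
def bilinear_traction(delta, delta0, deltaf, t0):
    """Bilinear (triangular) cohesive traction-separation law:
    linear elastic up to (delta0, t0), then linear softening to zero
    traction at deltaf. Units and values here are illustrative."""
    if delta <= 0.0:
        return 0.0
    if delta <= delta0:
        return t0 * delta / delta0                        # undamaged branch
    if delta < deltaf:
        return t0 * (deltaf - delta) / (deltaf - delta0)  # softening branch
    return 0.0                                            # complete failure

# Usage: peak traction at delta0; zero traction at and beyond deltaf.
peak = bilinear_traction(0.01, 0.01, 0.1, 30.0)
```

The area under the curve, t0 * deltaf / 2, is the fracture energy dissipated by the cohesive element.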


The trajectory planning of redundant robots is an important area of research, and efficient optimization algorithms are needed. Pseudoinverse control is not repeatable, causing drift in joint space, which is undesirable for physical control. This paper presents a new technique that combines the closed-loop pseudoinverse method with genetic algorithms, leading to an optimization criterion for repeatable control of redundant manipulators and avoiding the joint-angle drift problem. Computer simulations based on redundant and hyper-redundant planar manipulators show that, when the end-effector traces a closed path in the workspace, the robot returns to its initial configuration. The solution is repeatable for a workspace with and without obstacles in the sense that, after executing several cycles, the initial and final states of the manipulator are very close.
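The closed-loop pseudoinverse step at the core of the method computes dq = J⁺e from the task-space error e. A minimal sketch for a 3-link planar arm (redundant for its 2-D positioning task), with J⁺ = Jᵀ(JJᵀ)⁻¹ inverted by hand for the 2×3 case; link lengths, gain, and target are illustrative:

```python
import math

def fk(q, l=(1.0, 1.0, 1.0)):
    """End-effector position of a 3-link planar arm (redundant for a 2-D task)."""
    x = y = a = 0.0
    for qi, li in zip(q, l):
        a += qi
        x += li * math.cos(a)
        y += li * math.sin(a)
    return x, y

def jacobian(q, l=(1.0, 1.0, 1.0)):
    """2x3 task Jacobian of the planar arm (cumulative joint angles)."""
    a = [q[0], q[0] + q[1], q[0] + q[1] + q[2]]
    J = [[0.0] * 3 for _ in range(2)]
    for j in range(3):
        J[0][j] = -sum(l[k] * math.sin(a[k]) for k in range(j, 3))
        J[1][j] = sum(l[k] * math.cos(a[k]) for k in range(j, 3))
    return J

def pinv_step(q, target, gain=0.5):
    """One closed-loop pseudoinverse update dq = J+ * gain * (x* - x),
    with J+ = J^T (J J^T)^-1 and the 2x2 matrix J J^T inverted analytically."""
    x, y = fk(q)
    e = (gain * (target[0] - x), gain * (target[1] - y))
    J = jacobian(q)
    a = sum(J[0][k] * J[0][k] for k in range(3))
    b = sum(J[0][k] * J[1][k] for k in range(3))
    d = sum(J[1][k] * J[1][k] for k in range(3))
    det = a * d - b * b
    w = ((d * e[0] - b * e[1]) / det, (a * e[1] - b * e[0]) / det)
    return [q[k] + J[0][k] * w[0] + J[1][k] * w[1] for k in range(3)]

# Usage: iterate toward a reachable target. The bare pseudoinverse converges
# in task space but, as the paper notes, is not repeatable in joint space.
q = [0.3, 0.4, 0.5]
for _ in range(50):
    q = pinv_step(q, (1.8, 1.5))
```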