67 results for Interior point algorithm


Relevance:

20.00%

Publisher:

Abstract:

An improved procedure is proposed for the determination of the pesticide carbaryl in natural waters based on double cloud point extraction. The clean-up step was carried out only with Triton X-114 in alkaline medium in order to avoid the use of toxic organic solvents as well as to minimise waste generation. Cloud point preconcentration of the product of the reaction of the analyte with p-aminophenol and cetyltrimethylammonium bromide was exploited to increase sensitivity and improve the detection limit. Linear response was achieved between 10 and 500 µg L⁻¹ and the apparent molar absorptivity was estimated as 4.6 × 10⁵ L mol⁻¹ cm⁻¹. The detection limit was estimated as 7 µg L⁻¹ at the 99.7% confidence level and the coefficient of variation was 3.4% (n = 8). Recoveries between 91 and 99% were obtained for carbaryl-spiked water samples. The results obtained for natural water samples agreed with those achieved by the batch spectrophotometric procedure at the 95% confidence level. The proposed procedure is therefore a simple, fast, inexpensive and greener alternative for carbaryl determination.
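As a quick sanity check on the figures above, the Beer-Lambert law relates the reported molar absorptivity to the expected absorbance across the linear range. The sketch below assumes a 1 cm optical path and the molar mass of carbaryl (C12H11NO2, about 201.2 g/mol), neither of which is stated in the abstract:

```python
# Hypothetical back-of-the-envelope check of the reported sensitivity,
# assuming a 1 cm optical path and the molar mass of carbaryl (C12H11NO2).
MOLAR_MASS = 201.22   # g/mol, carbaryl (assumption, not from the abstract)
EPSILON = 4.6e5       # L mol^-1 cm^-1, apparent molar absorptivity (from the abstract)
PATH_CM = 1.0         # assumed cuvette path length

def absorbance(conc_ug_per_L: float) -> float:
    """Beer-Lambert estimate: A = epsilon * b * c (c in mol/L)."""
    c_mol_per_L = conc_ug_per_L * 1e-6 / MOLAR_MASS
    return EPSILON * PATH_CM * c_mol_per_L

# Lower and upper ends of the reported linear range (10-500 ug/L):
print(f"A(10 ug/L)  ~ {absorbance(10):.4f}")   # ~0.023
print(f"A(500 ug/L) ~ {absorbance(500):.3f}")  # ~1.14
# The 99.7% confidence level quoted for the 7 ug/L detection limit is
# consistent with the common 3-sigma criterion: LOD = 3 * s_blank / slope.
```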

Relevance:

20.00%

Publisher:

Abstract:

A flow injection (FI) micelle-mediated separation/preconcentration procedure for the determination of lead and cadmium by flame atomic absorption spectrometry (FAAS) is proposed. The analytes react with 1-(2-thiazolylazo)-2-naphthol (TAN) to form hydrophobic chelates, which are extracted into the micelles of 0.05% (w/v) Triton X-114 in a solution buffered at pH 8.4. In the preconcentration stage, the micellar solution was continuously injected into a flow system with four mini-columns packed with cotton, glass wool, or TNT compresses for phase separation. The analyte-containing micelles were eluted from the mini-columns by a stream of 3 mol L⁻¹ HCl solution and the analytes were determined by FAAS. Chemical and flow variables affecting the preconcentration of the analytes were studied. For 15 mL of preconcentrated solution, the enhancement factors varied between 15.1 and 20.3, and the limits of detection were approximately 4.5 and 0.75 µg L⁻¹ for lead and cadmium, respectively. For a solution containing 100 µg L⁻¹ of lead and 10 µg L⁻¹ of cadmium, the R.S.D. values varied from 1.6 to 3.2% (n = 7). The accuracy of the preconcentration system was evaluated by recovery measurements on spiked water samples. The method was susceptible to matrix effects, but these interferences were minimized by adding barium ions as a masking agent to the sample solutions, and recoveries from spiked samples varied in the range of 95.1-107.3%. (C) 2008 Elsevier B.V. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

The concentration of hydrogen peroxide is an important parameter in the decoloration of azo dyes by advanced oxidation processes, particularly oxidation via UV/H2O2. Above a certain concentration, hydrogen peroxide acts as a scavenger of the very hydroxyl radicals it generates, and the oxidizing power of the system decreases. The critical point of the process (the maximum amount of hydrogen peroxide to be added) was determined through a "thorough mapping", or discretization, of the target region, based on the maximization of an objective function (the pseudo-first-order kinetic rate constant). The discretization of the operational region was carried out with a feedforward backpropagation neural model. The neural model obtained presented a remarkable coefficient of correlation between real and predicted values of the absorbance variable, above 0.98. In the present work, the neural model had as its phenomenological basis the decoloration process of the Acid Brown 75 dye. The critical point of hydrogen peroxide addition, expressed as the mass ratio (F) between the hydrogen peroxide mass and the dye mass, was established in the interval 50 < F < 60. (C) 2007 Elsevier B.V. All rights reserved.
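The objective function mentioned above, a pseudo-first-order rate constant, can be illustrated with a short sketch: fit k in A(t) = A0·e^(−kt) for each H2O2/dye mass ratio F and keep the F that maximizes k. All numbers below are hypothetical placeholders, not data from the paper:

```python
import numpy as np

# Estimate the pseudo-first-order rate constant k from absorbance decay,
# A(t) = A0 * exp(-k * t), for each tested H2O2/dye mass ratio F, then
# pick the F that maximizes k.
def rate_constant(t_min: np.ndarray, absorbance: np.ndarray) -> float:
    """Least-squares slope of ln(A0/A) vs t gives k (1/min)."""
    y = np.log(absorbance[0] / absorbance)
    k, _ = np.polyfit(t_min, y, 1)
    return k

t = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
trials = {
    40: np.array([1.00, 0.78, 0.61, 0.37, 0.14]),  # F = 40 (hypothetical)
    55: np.array([1.00, 0.70, 0.49, 0.24, 0.06]),  # F = 55 (hypothetical)
    80: np.array([1.00, 0.80, 0.64, 0.41, 0.17]),  # excess H2O2 scavenges OH*
}
best_F = max(trials, key=lambda F: rate_constant(t, trials[F]))
print("F maximizing k:", best_F)
```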

Relevance:

20.00%

Publisher:

Abstract:

The power loss reduction in distribution systems (DSs) is a nonlinear and multiobjective problem. Service restoration in DSs is computationally even harder, since it additionally requires a solution in real time. Both DS problems are computationally complex: for large-scale networks, the usual problem formulation has thousands of constraint equations. The node-depth encoding (NDE) enables a modeling of DS problems that eliminates several constraint equations from the usual formulation, making the problem solution simpler. In turn, a multiobjective evolutionary algorithm (EA) based on subpopulation tables adequately handles several objectives and constraints, enabling a better exploration of the search space. The combination of the multiobjective EA with the NDE (MEAN) is the approach proposed here for solving DS problems on large-scale networks. Simulation results show that MEAN is able to find adequate restoration plans for a real DS with 3860 buses and 632 switches in a running time of 0.68 s. Moreover, MEAN has shown a running time that is sublinear in the system size. Tests with networks ranging from 632 to 5166 switches indicate that MEAN can find network configurations corresponding to a power loss reduction of 27.64% for very large networks while requiring relatively low running time.
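To make the node-depth encoding concrete, here is a minimal sketch (not the authors' implementation) of the core idea: each feeder tree is stored as the (node, depth) sequence of a depth-first traversal, so any subtree is a contiguous slice of the list, which is what makes tree-manipulation operators cheap:

```python
# Toy illustration of the node-depth encoding (NDE): a tree is stored as
# the sequence of (node, depth) pairs produced by a depth-first traversal,
# and a subtree is a contiguous slice of that list. The graph below is a
# toy example, not one of the networks from the paper.
def nde_from_tree(adj: dict, root) -> list[tuple]:
    """Depth-first traversal producing the node-depth list."""
    out, stack, seen = [], [(root, 0)], set()
    while stack:
        node, depth = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        out.append((node, depth))
        for nb in reversed(adj[node]):
            if nb not in seen:
                stack.append((nb, depth + 1))
    return out

def subtree_slice(nde: list, i: int) -> list:
    """Subtree rooted at position i: the contiguous run of entries after i
    whose depth is greater than nde[i]'s depth."""
    d = nde[i][1]
    j = i + 1
    while j < len(nde) and nde[j][1] > d:
        j += 1
    return nde[i:j]

adj = {0: [1, 4], 1: [0, 2, 3], 2: [1], 3: [1], 4: [0]}
nde = nde_from_tree(adj, root=0)
print(nde)                    # [(0, 0), (1, 1), (2, 2), (3, 2), (4, 1)]
print(subtree_slice(nde, 1))  # subtree rooted at node 1: [(1, 1), (2, 2), (3, 2)]
```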

Relevance:

20.00%

Publisher:

Abstract:

This work presents the study and development of a combined fault location scheme for three-terminal transmission lines using wavelet transforms (WTs). The methodology is based on the low- and high-frequency components of the transient signals originating from fault situations, registered at the terminals of the system. By processing these signals with the WT, it is possible to determine the arrival times of the travelling waves of voltage and/or current from the fault point to the terminals, as well as to estimate the fundamental frequency components. The new approach combines these different solutions into a reliable and accurate fault location scheme; the main idea is a decision routine that selects which method should be used in each situation presented to the algorithm. The combined algorithm was tested for different fault conditions by simulations using the ATP (Alternative Transients Program) software. The results obtained are promising and demonstrate a highly satisfactory degree of accuracy and reliability for the proposed method.
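For the travelling-wave part of such schemes, the underlying algebra is simple. The hedged sketch below shows the classical double-ended (two-terminal) form rather than the paper's three-terminal extension, with an assumed propagation speed near the speed of light:

```python
# Double-ended travelling-wave principle: given synchronized arrival times
# of the first wavefront at two terminals, the fault distance follows from
# simple algebra. Values are illustrative; the wave speed on overhead lines
# is close to the speed of light.
def fault_distance_km(line_len_km: float, t_a_s: float, t_b_s: float,
                      wave_speed_km_s: float = 2.9e5) -> float:
    """If the fault is d km from terminal A: t_a = d/v, t_b = (L - d)/v,
    hence d = (L + v * (t_a - t_b)) / 2."""
    return (line_len_km + wave_speed_km_s * (t_a_s - t_b_s)) / 2.0

# Fault 30 km from A on a 100 km line (arrival times in seconds):
print(fault_distance_km(100.0, 30 / 2.9e5, 70 / 2.9e5))  # ~30.0
```

In practice the arrival times themselves are picked from the first sharp peak of the high-frequency wavelet coefficients of the recorded transients, which is where the WT enters the scheme.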

Relevance:

20.00%

Publisher:

Abstract:

The main objective of this paper is to relieve power system engineers of the burden of the complex and time-consuming process of power system stabilizer (PSS) tuning. To this end, the paper proposes an automatic, computerized tuning procedure for PSSs, based on an iterative process that uses a linear matrix inequality (LMI) solver to find the PSS parameters. It is shown that PSS tuning can be written as a search problem over a non-convex feasible set. The proposed algorithm solves this feasibility problem using an iterative LMI approach and a suitable initial condition, corresponding to a PSS designed for the nominal operating condition only (a quite simple task, since the required phase compensation is uniquely defined). Knowledge about PSS tuning is also incorporated into the algorithm through bounds on the allowable PSS parameters. The application of the proposed algorithm to a benchmark test system, together with nonlinear simulation of the resulting closed-loop models, demonstrates its efficiency. (C) 2009 Elsevier Ltd. All rights reserved.
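The inner step of such an iterative LMI approach is a feasibility test. As a hedged illustration, with a toy state matrix rather than a power system model and cvxpy as the LMI solver, one can check quadratic stability of a fixed closed-loop matrix A (i.e., of one candidate set of PSS parameters) by searching for P ≻ 0 with AᵀP + PA ≺ 0:

```python
import numpy as np
import cvxpy as cp

# Lyapunov-type LMI feasibility test for a fixed closed-loop state matrix.
# The 2x2 matrix below is a toy stand-in, not a power system model.
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("feasible:", prob.status == cp.OPTIMAL)  # True -> A is Hurwitz
```

The paper's algorithm wraps a test of this kind in an outer iteration over the PSS parameters; that outer loop is what handles the non-convexity.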

Relevance:

20.00%

Publisher:

Abstract:

In this article, a novel algorithm based on the chemotaxis process of Escherichia coli is developed to solve multiobjective optimization problems. The algorithm uses a fast nondominated sorting procedure, communication between colony members, and a simple chemotactic strategy to change the bacterial positions, exploring the search space to find several optimal solutions. The proposed algorithm is validated on 11 benchmark problems, using three different performance measures to compare its performance with the NSGA-II genetic algorithm and with the particle-swarm-based algorithm NSPSO. (C) 2009 Elsevier Ltd. All rights reserved.
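The fast nondominated sorting procedure mentioned above is the NSGA-II building block that splits a population into successive Pareto fronts. A minimal sketch, assuming minimization of every objective:

```python
# Fast nondominated sorting (from NSGA-II): split a population, given its
# objective vectors, into successive Pareto fronts.
def dominates(a, b) -> bool:
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_nondominated_sort(objs: list) -> list:
    n = len(objs)
    S = [[] for _ in range(n)]   # indices of solutions dominated by i
    counts = [0] * n             # how many solutions dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if dominates(objs[i], objs[j]):
                S[i].append(j)
            elif dominates(objs[j], objs[i]):
                counts[i] += 1
        if counts[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in S[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

pts = [(1, 5), (2, 2), (4, 1), (3, 4), (5, 5)]
print(fast_nondominated_sort(pts))  # [[0, 1, 2], [3], [4]]
```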

Relevance:

20.00%

Publisher:

Abstract:

The general flowshop scheduling problem is a production problem where a set of n jobs must be processed with an identical flow pattern on m machines. In permutation flowshops the sequence of jobs is the same on all machines. A significant research effort has been devoted to sequencing jobs in a flowshop so as to minimize the makespan. This paper describes the application of a Constructive Genetic Algorithm (CGA) to makespan minimization in flowshop scheduling. The CGA was proposed recently as an alternative to traditional GA approaches, particularly for evaluating schemata directly. The population, initially formed only by schemata, evolves under recombination into a population of well-adapted structures (schemata instantiation). The CGA implemented here is based on the classic NEH heuristic, with a local search heuristic used to define the fitness functions. The parameters of the CGA are calibrated using a Design of Experiments (DOE) approach. The computational results are compared against other successful algorithms from the literature on Taillard's well-known standard benchmark. The computational experience shows that this CGA approach provides competitive results for flowshop scheduling problems. (C) 2007 Elsevier Ltd. All rights reserved.
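For reference, the NEH heuristic on which the CGA is based can be sketched in a few lines: order jobs by decreasing total processing time, then insert each job at the position of the partial sequence that yields the smallest makespan (toy data below, not a Taillard instance):

```python
# NEH heuristic for permutation flowshop makespan minimization.
def makespan(seq, p):
    """p[j][k] = processing time of job j on machine k."""
    m = len(p[0])
    c = [0.0] * m  # completion time of the last scheduled job per machine
    for j in seq:
        c[0] += p[j][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def neh(p):
    jobs = sorted(range(len(p)), key=lambda j: -sum(p[j]))
    seq = [jobs[0]]
    for j in jobs[1:]:
        cands = [seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)]
        seq = min(cands, key=lambda s: makespan(s, p))
    return seq

# Toy 4-job, 3-machine instance (hypothetical data):
p = [[3, 4, 6], [5, 4, 2], [1, 2, 7], [4, 5, 5]]
s = neh(p)
print(s, makespan(s, p))
```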

Relevance:

20.00%

Publisher:

Abstract:

This study proposes a new PSOS (hybrid PSO-Simplex) model-based damage identification procedure using frequency domain data. The objective function of the minimization problem is formulated from the Frequency Response Functions (FRFs) of the system. A novel strategy for the control of the Particle Swarm Optimization (PSO) parameters based on the Nelder-Mead algorithm (Simplex method) is presented; consequently, the convergence of the PSOS becomes independent of the heuristic constants, and its stability and reliability are enhanced. The resulting hybrid method performs better on different benchmark functions than Simulated Annealing (SA) and the basic PSO (PSO(b)). Two damage identification problems, taking into consideration the effects of noisy and incomplete data, were studied: a 10-bar truss and a cracked free-free beam, both modeled with finite elements. In both cases, the damage location and extent were successfully determined. Finally, a nonlinear oscillator (Duffing oscillator) was identified by the PSOS with good results. (C) 2009 Elsevier Ltd. All rights reserved.
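At the core of the PSOS is the standard PSO update. The sketch below shows only that basic loop: the Nelder-Mead control of w, c1 and c2 described above is omitted, so the constants are the usual hand-picked values, and a sphere function stands in for an FRF-based damage objective:

```python
import numpy as np

# Basic particle swarm optimization loop (velocity + position update with
# personal and global bests). Constants w, c1, c2 are conventional values.
def pso(f, dim, n=30, iters=200, lo=-5.0, hi=5.0, w=0.7, c1=1.5, c2=1.5,
        seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros((n, dim))
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()

# Sphere function as a stand-in for an FRF residual objective:
print(pso(lambda z: float(np.sum(z * z)), dim=3))
```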

Relevance:

20.00%

Publisher:

Abstract:

The analysis of heteroplasmy (the presence of more than one type of mitochondrial DNA in an individual) is used as a tool in human identification studies and anthropology, and more recently in studies relating heteroplasmy to longevity. The frequency of heteroplasmy and its correlation with age have been analyzed in different tissues, such as blood, muscle, heart, bone and brain, and in different regions of the mitochondrial DNA, but this analysis had never been performed on hair samples. In this study, hair samples were sequenced in order to ascertain whether the presence of heteroplasmy varied according to the age, sex and haplogroup of origin of the individuals. The samples were grouped by age (3 groups), gender (male and female) and haplogroup of origin (European, African and Native American), and analyzed using the chi-square statistical test (χ²). Based on the statistical results obtained, we conclude that, in hair samples, there is no relationship between heteroplasmy and sex, age or haplogroup of origin.
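The chi-square test of independence used for these groupings can be illustrated as follows, with a hypothetical contingency table of heteroplasmy presence versus age group (the counts are made up for illustration, not taken from the study):

```python
from scipy.stats import chi2_contingency

# Chi-square test of independence on a hypothetical 2x3 contingency table:
# heteroplasmy presence (rows) by age group (columns).
table = [[12, 15, 11],   # heteroplasmy observed
         [38, 35, 39]]   # heteroplasmy not observed
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# p > 0.05 would be consistent with the study's conclusion of no
# association between heteroplasmy and age group.
```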

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an Adaptive Maximum Entropy (AME) approach for modeling biological species. The Maximum Entropy algorithm (MaxEnt) is one of the most widely used methods for modeling the geographical distribution of biological species. The approach presented here is an alternative to the classical algorithm: instead of using the same set of features throughout training, the AME approach tries to insert or remove a single feature at each iteration. The aim is to reach convergence faster without degrading the performance of the generated models. Preliminary experiments showed improvements in both accuracy and execution time. Comparisons with other algorithms are beyond the scope of this paper. Several lines of research are proposed as future work.
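The stepwise feature insertion/removal idea can be sketched as below. This is not the authors' exact AME procedure: logistic regression stands in for MaxEnt (the two are closely related), and the acceptance rule and data are illustrative assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Greedy one-feature-per-iteration variant of model building: toggle a
# single candidate feature and keep the change only if the cross-validated
# score does not degrade. Synthetic data stands in for presence records.
X, y = make_classification(n_samples=300, n_features=10, n_informative=4,
                           random_state=0)

def score(feats):
    if not feats:
        return 0.0
    model = LogisticRegression(max_iter=1000)
    return cross_val_score(model, X[:, sorted(feats)], y, cv=5).mean()

active = {0}
best = score(active)
for f in range(1, X.shape[1]):   # one candidate move per iteration
    trial = active ^ {f}         # insert if absent, remove if present
    s = score(trial)
    if s >= best:
        active, best = trial, s
print(sorted(active), round(best, 3))
```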

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a free software tool that supports next-generation mobile communications through the automatic generation of neural network models of electronic components and devices. The tool enables the creation, training, validation and simulation of a model directly from measurements made on the device of interest, through an interface entirely oriented towards non-experts in neural models. The resulting model can be exported automatically to a traditional circuit simulator in order to test different scenarios.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a new methodology to estimate unbalanced harmonic distortions in a power system, based on measurements at a limited number of given sites. The algorithm utilizes evolutionary strategies (ES), a branch of evolutionary algorithms. The proposed problem-solving algorithm makes use of data from various power quality meters, which can be synchronized either by GPS devices or by using information from a fundamental-frequency load flow, making the overall power quality monitoring system much less costly. The ES-based harmonic estimation model is applied to a 14-bus network to compare its performance with a conventional Monte Carlo approach. It is also applied to a 50-bus subtransmission network in order to compare the three-phase and single-phase approaches, as well as to assess the robustness of the proposed method. (C) 2010 Elsevier B.V. All rights reserved.
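A minimal (μ + λ) evolution strategy of the kind referred to above, with Gaussian mutation and truncation selection, can be sketched as follows (a sphere function stands in for the harmonic-state estimation residual; all parameters are illustrative):

```python
import numpy as np

# (mu + lambda) evolution strategy: Gaussian mutation, then selection of
# the best mu individuals from parents and offspring together.
def es(f, dim, mu=5, lam=20, sigma=0.3, iters=100, seed=1):
    rng = np.random.default_rng(seed)
    pop = rng.normal(0.0, 1.0, (mu, dim))
    for _ in range(iters):
        parents = pop[rng.integers(0, mu, lam)]
        children = parents + rng.normal(0.0, sigma, (lam, dim))
        both = np.vstack([pop, children])
        fit = np.apply_along_axis(f, 1, both)
        pop = both[np.argsort(fit)[:mu]]   # (mu + lambda) selection
    return pop[0], f(pop[0])

# Sphere function as a stand-in for the estimation residual:
best, val = es(lambda z: float(np.sum(z * z)), dim=4)
print(best, val)
```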

Relevance:

20.00%

Publisher:

Abstract:

This research proposes an improvement to quality two-dimensional Delaunay mesh generation that combines the mesh refinement strategies of Ruppert and Shewchuk. The developed technique uses the diametral lens criterion, introduced by L. P. Chew, with the purpose of eliminating extremely obtuse triangles at the mesh boundary. The method splits boundary segments to obtain an initial pre-refinement, thus reducing the number of iterations necessary to generate a high-quality sequential triangulation. Moreover, it decreases the amount of communication and synchronization between subdomains in parallel mesh refinement.
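The encroachment tests behind these refinement rules reduce to the angle a point subtends at a segment: at least 90° for the diametral circle, and a larger threshold (commonly taken as 120°, an assumption here) for Chew's smaller diametral lens. A hedged sketch:

```python
import math

# Segment encroachment tests via the subtended angle. A point lies inside
# the diametral circle of segment (a, b) iff the angle it subtends at the
# segment is at least 90 degrees; the diametral lens uses a larger angle
# threshold (120 degrees assumed here for illustration).
def subtended_angle(a, b, p) -> float:
    ax, ay = a[0] - p[0], a[1] - p[1]
    bx, by = b[0] - p[0], b[1] - p[1]
    dot = ax * bx + ay * by
    na, nb = math.hypot(ax, ay), math.hypot(bx, by)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

def encroaches_circle(a, b, p):
    return subtended_angle(a, b, p) >= 90.0

def encroaches_lens(a, b, p, lens_deg=120.0):
    return subtended_angle(a, b, p) >= lens_deg

a, b, p = (0.0, 0.0), (2.0, 0.0), (1.0, 0.9)
print(encroaches_circle(a, b, p), encroaches_lens(a, b, p))  # True False
```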

Relevance:

20.00%

Publisher:

Abstract:

Most post-processors for boundary element (BE) analysis use an auxiliary domain mesh to display domain results, working against the convenient modelling process of a pure boundary discretization. This paper introduces a novel visualization technique which preserves this basic property of the boundary element method. The proposed algorithm requires no domain discretization and is based on the direct and automatic identification of isolines. Another critical aspect of the visualization of domain results in BE analysis is the effort required to evaluate results at interior points. In order to address this issue, the present article also compares the performance of two different BE formulations (conventional and hybrid). In addition, the paper presents an overview of the most common post-processing and visualization techniques in BE analysis, such as the classical scan-line algorithms and interpolation over a domain discretization. The results presented herein show that the proposed algorithm offers very high performance compared with other visualization procedures.
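One way to picture isoline identification without a domain mesh is to treat the BE interior-point evaluation u(x, y) as a black box and locate u = c along probe lines by bisection. The sketch below uses an analytic placeholder field and illustrates the general idea only, not the paper's algorithm:

```python
# Locate a point of the isoline u = c along a probe segment by bisection,
# using u(x, y) as a black box (a BE interior-point evaluation in practice;
# an analytic placeholder here).
def u(x: float, y: float) -> float:
    return x * x + y * y  # placeholder field for illustration

def isoline_point(c, p0, p1, tol=1e-10):
    """Find t in [0, 1] with u(p(t)) = c on the segment p0 -> p1,
    assuming u - c changes sign between the endpoints."""
    def g(t):
        x = p0[0] + t * (p1[0] - p0[0])
        y = p0[1] + t * (p1[1] - p0[1])
        return u(x, y) - c
    lo, hi = 0.0, 1.0
    if g(lo) * g(hi) > 0:
        return None  # isoline does not cross this probe line
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if g(lo) * g(mid) > 0 else (lo, mid)
    t = 0.5 * (lo + hi)
    return (p0[0] + t * (p1[0] - p0[0]), p0[1] + t * (p1[1] - p0[1]))

# Where does the isoline u = 1 cross the segment from (0, 0) to (2, 0)?
print(isoline_point(1.0, (0.0, 0.0), (2.0, 0.0)))  # ~(1.0, 0.0)
```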