864 results for Genetic Algorithms and Simulated Annealing


Relevance: 100.00%

Publisher:

Abstract:

The Self-Organizing Map (SOM) is a popular unsupervised neural network able to provide effective clustering and data visualization for multidimensional input datasets. In this paper, we present an application of the simulated annealing procedure to the SOM learning algorithm with the aim of obtaining faster learning and better performance in terms of quantization error. The proposed learning algorithm, called the Fast Learning Self-Organized Map, preserves the simplicity of the standard SOM learning algorithm. It also improves the quality of the resulting maps by providing better clustering quality and topology preservation of multidimensional input data. Several experiments compare the proposed approach with the original algorithm and with some of its modifications and speed-up techniques.
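
The abstract does not spell out the exact Fast Learning SOM update rule, so the following is only a minimal sketch of a plain SOM whose learning rate and neighbourhood radius decay under an exponential, annealing-style schedule; the grid size, decay constants and names (train_som, lr0, sigma0) are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def train_som(data, grid_h=8, grid_w=8, n_iter=2000,
              lr0=0.5, sigma0=3.0, seed=0):
    """Plain SOM training loop; learning rate and neighbourhood radius
    decay exponentially, mimicking a cooling schedule (hypothetical values)."""
    rng = np.random.default_rng(seed)
    n, dim = data.shape
    weights = rng.random((grid_h, grid_w, dim))
    # Grid coordinates used to compute neighbourhood distances.
    gy, gx = np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij")

    for t in range(n_iter):
        frac = t / n_iter
        lr = lr0 * np.exp(-3.0 * frac)        # "cooled" learning rate
        sigma = sigma0 * np.exp(-3.0 * frac)  # shrinking neighbourhood
        x = data[rng.integers(n)]
        # Best matching unit (BMU): node whose weight vector is closest to x.
        dists = np.linalg.norm(weights - x, axis=2)
        by, bx = np.unravel_index(np.argmin(dists), dists.shape)
        # Gaussian neighbourhood function centred on the BMU.
        h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
        weights += lr * h[..., None] * (x - weights)

    # Quantization error: mean distance from each sample to its BMU.
    qe = np.mean([np.min(np.linalg.norm(weights - v, axis=2)) for v in data])
    return weights, qe

if __name__ == "__main__":
    pts = np.random.default_rng(1).random((500, 3))
    _, qe = train_som(pts)
    print(f"quantization error: {qe:.4f}")
```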

Relevance: 100.00%

Publisher:

Abstract:

Controllers for feedback substitution schemes demonstrate a trade-off between noise power gain and normalized response time. Using as an example the design of a controller for a radiometric transduction process subjected to arbitrary noise power gain and robustness constraints, a Pareto front of optimal controller solutions fulfilling a range of time-domain design objectives can be derived. In this work, we consider designs using a loop shaping design procedure (LSDP). The approach uses linear matrix inequalities to specify a range of objectives and a genetic algorithm (GA) to perform a multi-objective optimization of the controller weights (MOGA). A clonal selection algorithm is used to further direct the GA's search towards the Pareto front. We demonstrate that, with the proposed methodology, it is possible to design higher-order controllers with superior performance in terms of response time, noise power gain and robustness.
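
The paper combines LSDP controller weights with a multi-objective GA searching for a Pareto front. As a rough illustration of the Pareto machinery only, here is a minimal non-dominated-sorting GA over two toy objectives standing in for response time and noise power gain; the objective functions, mutation scheme and names (moga, dominates) are placeholders, and the LMI/LSDP and clonal-selection parts are not reproduced.

```python
import random

def dominates(a, b):
    """True if objective vector a is no worse than b in all objectives
    and strictly better in at least one (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def objectives(w):
    # Placeholder objectives standing in for response time and noise gain.
    resp_time = (w[0] - 0.3) ** 2 + 0.1 * abs(w[1])
    noise_gain = (w[1] - 0.7) ** 2 + 0.1 * abs(w[0])
    return (resp_time, noise_gain)

def moga(pop_size=60, n_gen=100, seed=0):
    rng = random.Random(seed)
    pop = [[rng.random(), rng.random()] for _ in range(pop_size)]
    for _ in range(n_gen):
        scored = [(objectives(w), w) for w in pop]
        # Keep the non-dominated front, then refill by mutating front members.
        front = [w for f, w in scored
                 if not any(dominates(g, f) for g, _ in scored)]
        pop = front[:]
        while len(pop) < pop_size:
            parent = rng.choice(front)
            child = [max(0.0, min(1.0, x + rng.gauss(0, 0.05))) for x in parent]
            pop.append(child)
    return [(objectives(w), w) for w in front]

if __name__ == "__main__":
    for f, w in sorted(moga())[:5]:
        print(f"objectives=({f[0]:.3f}, {f[1]:.3f})  weights={w}")
```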

Relevance: 100.00%

Publisher:

Abstract:

We describe a model-data fusion (MDF) inter-comparison project (REFLEX), which compared various algorithms for estimating the parameters of a simple carbon (C) model consistently with measured carbon fluxes and states. Participants were provided with the model and with both synthetic net ecosystem exchange (NEE) of CO2 and leaf area index (LAI) data, generated from the model with added noise, and observed NEE and LAI data from two eddy covariance sites. Participants endeavoured to estimate model parameters and states consistent with the model for all cases over the two years for which data were provided, and to generate predictions for one additional year without observations. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. For the synthetic data case, parameter estimates compared well with the true values. The results of the analyses indicated that parameters linked directly to gross primary production (GPP) and ecosystem respiration, such as those related to foliage allocation and turnover, or to the temperature sensitivity of heterotrophic respiration, were best constrained and characterised. Poorly estimated parameters were those related to the allocation to and turnover of fine root/wood pools. Estimates of confidence intervals varied among algorithms, but several algorithms successfully located the true values of annual fluxes in the synthetic experiments within relatively narrow 90% confidence intervals, achieving a >80% success rate and mean NEE confidence intervals <110 gC m−2 year−1. Annual C flux estimates generated by participants generally agreed with gap-filling approaches using half-hourly data. The estimation of ecosystem respiration and GPP through MDF agreed well with outputs from partitioning studies using half-hourly data. Confidence limits on annual NEE increased by an average of 88% in the prediction year compared to the previous year, when data were available. Confidence intervals on annual NEE increased by 30% when observed data were used instead of synthetic data, reflecting and quantifying the addition of model error. Finally, our analyses indicated that incorporating additional constraints, using data on C pools (wood, soil and fine roots), would help to reduce uncertainties for model parameters poorly served by eddy covariance data.
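
Several REFLEX participants used Metropolis-type samplers to estimate parameters. The sketch below shows a generic random-walk Metropolis chain fitting a single parameter of a toy exponential "flux" model under a Gaussian likelihood; the model, noise level and names (metropolis, log_like) are illustrative and unrelated to the actual REFLEX C model.

```python
import numpy as np

def metropolis(obs, model, sigma_obs, n_steps=5000, step=0.1, seed=0):
    """Random-walk Metropolis over one parameter with a Gaussian likelihood."""
    rng = np.random.default_rng(seed)
    theta = 1.0  # initial guess

    def log_like(p):
        return -0.5 * np.sum(((obs - model(p)) / sigma_obs) ** 2)

    ll = log_like(theta)
    chain = []
    for _ in range(n_steps):
        prop = theta + rng.normal(0, step)
        ll_prop = log_like(prop)
        # Accept with probability min(1, exp(delta log-likelihood)).
        if np.log(rng.random()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        chain.append(theta)
    return np.array(chain)

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    t = np.linspace(0, 10, 50)
    true_rate = 0.35
    toy_model = lambda r: np.exp(-r * t)          # stand-in "flux" model
    data = toy_model(true_rate) + rng.normal(0, 0.05, t.size)
    samples = metropolis(data, toy_model, sigma_obs=0.05)
    burn = samples[1000:]                          # discard burn-in
    print(f"posterior mean {burn.mean():.3f}, 90% CI "
          f"[{np.quantile(burn, 0.05):.3f}, {np.quantile(burn, 0.95):.3f}]")
```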

Relevance: 100.00%

Publisher:

Abstract:

J.A. Ferreira Neto, E.C. Santos Junior, U. Fra Paleo, D. Miranda Barros, and M.C.O. Moreira. 2011. Optimal subdivision of land in agrarian reform projects: an analysis using genetic algorithms. Cien. Inv. Agr. 38(2): 169-178. The objective of this manuscript is to develop a new procedure to achieve optimal land subdivision using genetic algorithms (GA). The genetic algorithm was tested in the rural settlement of Veredas, located in Minas Gerais, Brazil. The implementation was based on land aptitude and its productivity index. The tests were carried out in two areas with eight different agricultural aptitude classes: one area of 391.88 ha subdivided into 12 lots and another of 404.1763 ha subdivided into 14 lots. The effectiveness of the method was measured using the standard deviation of the productivity indices of the lots in the parceled area. To evaluate each parameter, a sequence of 15 runs was performed, recording the average fitness of the best individual (MMI) found for each parameter variation. The best parameter combination found in testing, and used to generate the new parceling with the GA, was the following: 320 generations, a population of 40 individuals, a mutation rate of 0.8, and a renewal rate of 0.3. The solution generated rather homogeneous lots in terms of productive capacity.
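
The reported fitness criterion is essentially the standard deviation of the lots' productivity indices (lower means more homogeneous lots). Below is a minimal, hypothetical GA sketch with that objective; the cell productivities, the selection and crossover operators, and the helper names (subdivide, fitness) are invented for illustration, although the population size (40), generation count (320) and mutation rate (0.8) echo the values quoted in the abstract. The 0.3 renewal rate is not modelled.

```python
import random
import statistics

def subdivide(cell_productivity, n_lots, pop_size=40, n_gen=320,
              mut_rate=0.8, seed=0):
    """GA assigning each land cell to one of n_lots; fitness is the standard
    deviation of the lots' total productivity (lower = more homogeneous)."""
    rng = random.Random(seed)
    n_cells = len(cell_productivity)

    def fitness(assign):
        totals = [0.0] * n_lots
        for cell, lot in enumerate(assign):
            totals[lot] += cell_productivity[cell]
        return statistics.pstdev(totals)

    pop = [[rng.randrange(n_lots) for _ in range(n_cells)]
           for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]        # simple truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_cells)
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < mut_rate:         # mutate one random cell
                child[rng.randrange(n_cells)] = rng.randrange(n_lots)
            children.append(child)
        pop = survivors + children
    best = min(pop, key=fitness)
    return best, fitness(best)

if __name__ == "__main__":
    rng = random.Random(1)
    productivities = [rng.uniform(0.2, 1.0) for _ in range(120)]
    assign, spread = subdivide(productivities, n_lots=12)
    print(f"std. dev. of lot productivity: {spread:.4f}")
```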

Relevance: 100.00%

Publisher:

Abstract:

The simulated annealing optimization technique has been successfully applied to a number of electrical engineering problems, including transmission system expansion planning. The method is general in the sense that it does not assume any particular property of the problem being solved, such as linearity or convexity. Moreover, it has the ability to provide solutions arbitrarily close to an optimum (i.e. it is asymptotically convergent) as the cooling process slows down. The drawback of the approach is the computational burden: finding optimal solutions may be extremely expensive in some cases. This paper presents a Parallel Simulated Annealing (PSA) algorithm for solving the long-term transmission network expansion planning problem. A strategy that does not affect the basic convergence properties of the sequential simulated annealing algorithm has been implemented and tested. The paper investigates the conditions under which the parallel algorithm is most efficient. The parallel implementations have been tested on three example networks: a small 6-bus network and two complex real-life networks. Excellent results are reported in the test section of the paper: in addition to reductions in computing time, the proposed Parallel Simulated Annealing algorithm shows significant improvements in solution quality for the largest of the test networks.
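
The abstract does not describe the parallelisation strategy in detail. The sketch below simply runs several independent sequential SA chains in parallel with Python's standard library and keeps the best solution, a generic multi-start scheme rather than the paper's PSA algorithm; the cost function, cooling schedule and names (anneal, cost) are toy assumptions standing in for the expansion-planning objective.

```python
import math
import random
from concurrent.futures import ProcessPoolExecutor

def cost(x):
    # Toy multimodal cost standing in for the expansion-planning objective.
    return x * x + 10.0 * math.sin(3.0 * x)

def anneal(seed, n_steps=20000, t0=5.0, cooling=0.9995):
    """One sequential SA chain: random-walk proposals, Metropolis acceptance."""
    rng = random.Random(seed)
    x = rng.uniform(-10, 10)
    best_x, best_c = x, cost(x)
    t = t0
    for _ in range(n_steps):
        cand = x + rng.gauss(0, 0.5)
        delta = cost(cand) - cost(x)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
            if cost(x) < best_c:
                best_x, best_c = x, cost(x)
        t *= cooling            # geometric cooling schedule
    return best_c, best_x

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(anneal, range(8)))   # 8 independent chains
    best_c, best_x = min(results)
    print(f"best cost {best_c:.4f} at x = {best_x:.4f}")
```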

Relevance: 100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Publisher:

Abstract:

We have compared the recently introduced generalized simulated annealing (GSA) with conventional simulated annealing (CSA). GSA was tested as a tool to obtain the ground-state geometry of molecules. We used selected silicon clusters (Si_n, n = 4-7, 10) as test cases. Total energies were calculated through tight-binding molecular dynamics. We found that replacing Boltzmann statistics (CSA) with Tsallis statistics (GSA) has the potential to speed up optimizations with no loss of accuracy. Next, we applied the GSA method to study the ground-state geometry of a 20-atom silicon cluster, and found a new geometry, apparently lower in energy than those previously described in the literature.
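
The essential change from CSA to GSA is replacing the Boltzmann acceptance probability with the Tsallis generalised form, which reduces to the Boltzmann expression as the parameter q_a approaches 1. The sketch below contrasts the two acceptance rules inside the same annealing loop; the q value, cooling schedule and toy energy function are illustrative, and no tight-binding energies are computed.

```python
import math
import random

def boltzmann_accept(delta_e, temp):
    """CSA acceptance: classical Metropolis/Boltzmann weight."""
    return 1.0 if delta_e <= 0 else math.exp(-delta_e / temp)

def tsallis_accept(delta_e, temp, qa=1.3):
    """GSA-style acceptance using Tsallis statistics; tends to the
    Boltzmann form as qa -> 1 (the qa value here is illustrative)."""
    if delta_e <= 0:
        return 1.0
    base = 1.0 + (qa - 1.0) * delta_e / temp
    if base <= 0.0:
        return 0.0
    return base ** (-1.0 / (qa - 1.0))

def anneal(accept, energy, n_steps=5000, t0=2.0, seed=0):
    """Shared annealing loop; only the acceptance rule differs."""
    rng = random.Random(seed)
    x = rng.uniform(-5, 5)
    e = energy(x)
    for k in range(1, n_steps + 1):
        temp = t0 / math.log(1.0 + k)          # slow, generic cooling
        cand = x + rng.gauss(0, 0.3)
        de = energy(cand) - e
        if rng.random() < accept(de, temp):
            x, e = cand, energy(cand)
    return x, e

if __name__ == "__main__":
    toy_energy = lambda x: (x * x - 4.0) ** 2 + x   # stand-in potential
    for name, rule in [("CSA", boltzmann_accept), ("GSA", tsallis_accept)]:
        x, e = anneal(rule, toy_energy)
        print(f"{name}: x = {x:.3f}, energy = {e:.3f}")
```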

Relevance: 100.00%

Publisher:

Abstract:

A combined methodology consisting of successive linear programming (SLP) and a simple genetic algorithm (SGA) solves the reactive planning problem. The problem is divided into operating and planning subproblems. The operating subproblem, which is nonlinear, ill-conditioned and nonconvex, consists of determining the voltage control and the adjustment of reactive sources. The planning subproblem consists of obtaining the optimal reactive source expansion considering the operational, economical and physical characteristics of the system. SLP solves the optimal reactive dispatch problem for the real (continuous) variables, while the SGA determines the necessary adjustments of the binary and discrete variables in the model. Once the set of candidate busbars has been defined, the implemented program gives the location and size of any reactive sources needed to satisfy the operating and security constraints.
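
In this combined methodology, the SGA handles the binary and discrete decisions (whether and how much reactive support to install at each candidate busbar) while SLP solves the continuous dispatch. The sketch below is a minimal simple GA over a binary placement vector with a penalised investment cost; the candidate-bus data, costs, penalty weight and names (sga, evaluate) are made up, and the SLP operating subproblem is replaced by a trivial surrogate.

```python
import random

CANDIDATE_BUSES = 10          # hypothetical set of candidate busbars
REQUIRED_MVAR = 35.0          # hypothetical reactive support needed
SIZE_MVAR = [5, 10, 15, 20, 10, 5, 15, 20, 10, 5]    # size if installed
COST = [1.0, 1.8, 2.5, 3.1, 1.7, 0.9, 2.4, 3.0, 1.8, 1.0]

def evaluate(plan):
    """Investment cost plus a penalty if installed capacity is short.
    A real implementation would call the SLP operating subproblem here."""
    installed = sum(s for bit, s in zip(plan, SIZE_MVAR) if bit)
    invest = sum(c for bit, c in zip(plan, COST) if bit)
    shortfall = max(0.0, REQUIRED_MVAR - installed)
    return invest + 100.0 * shortfall

def sga(pop_size=30, n_gen=80, mut_rate=0.05, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(CANDIDATE_BUSES)]
           for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=evaluate)
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, CANDIDATE_BUSES)
            child = a[:cut] + b[cut:]           # one-point crossover
            for i in range(CANDIDATE_BUSES):    # bit-flip mutation
                if rng.random() < mut_rate:
                    child[i] ^= 1
            children.append(child)
        pop = parents + children
    best = min(pop, key=evaluate)
    return best, evaluate(best)

if __name__ == "__main__":
    plan, total = sga()
    print("install at buses:", [i for i, bit in enumerate(plan) if bit])
    print(f"penalised cost: {total:.2f}")
```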