97 results for "Multi objective evolutionary algorithms"
Abstract:
Evolutionary algorithms have been widely used to train Artificial Neural Networks (ANN), the idea being to update the neurons' weights using the social dynamics of living organisms in order to decrease the classification error. In this paper, we introduce Social-Spider Optimization (SSO) to improve the training phase of ANNs with Multilayer Perceptrons (MLP), and we validate the proposed approach in the context of Parkinson's Disease recognition. The experimental section compares SSO against five other well-known metaheuristic techniques and shows that SSO can be a suitable approach for the ANN-MLP training step.
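The idea of metaheuristic ANN training is to treat the flattened weight vector of the network as a candidate solution and the classification error as the fitness to minimize. The sketch below illustrates this for a tiny 2-3-1 MLP; the data are hypothetical, and the update rule is a generic elitist swarm drift standing in for SSO's actual cooperative female/male spider operators, which are more involved.

```python
import math
import random

random.seed(0)

# Toy two-feature binary classification set (hypothetical, for illustration).
DATA = [((0.1, 0.2), 0), ((0.9, 0.8), 1), ((0.2, 0.1), 0),
        ((0.8, 0.9), 1), ((0.15, 0.3), 0), ((0.85, 0.7), 1)]

N_HIDDEN = 3
N_WEIGHTS = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1  # w1, b1, w2, b2

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict(w, x):
    """Forward pass of a 2-3-1 MLP whose weights are flattened into w."""
    h = [sigmoid(w[2 * j] * x[0] + w[2 * j + 1] * x[1] + w[2 * N_HIDDEN + j])
         for j in range(N_HIDDEN)]
    out = sum(w[3 * N_HIDDEN + j] * h[j] for j in range(N_HIDDEN)) + w[-1]
    return 1 if sigmoid(out) >= 0.5 else 0

def error(w):
    """Classification error: the fitness the metaheuristic minimizes."""
    return sum(1 for x, y in DATA if predict(w, x) != y) / len(DATA)

# Population-based search; each candidate is one flattened weight vector.
pop = [[random.uniform(-1, 1) for _ in range(N_WEIGHTS)] for _ in range(20)]
for _ in range(100):
    best = min(pop, key=error)
    # Elitist update: keep the best, drift the rest toward it with noise.
    pop = [best] + [[wi + random.gauss(0, 0.3) * (bi - wi)
                     for wi, bi in zip(w, best)] for w in pop[1:]]
final_error = error(min(pop, key=error))
print(final_error)
```

Because the incumbent best vector is carried over unchanged each generation, the best error is non-increasing, which is the property that makes such population searches usable as drop-in ANN trainers.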
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Pós-graduação em Engenharia Elétrica - FEB
Abstract:
This paper presents the application of a new metaheuristic algorithm to solve the transmission expansion planning problem. A simple heuristic, using a relaxed network model associated with cost perturbation, is applied to generate a set of high-quality initial solutions with different topologies. The population is evolved using a multi-move path-relinking with the objective of finding the minimum investment cost for the transmission expansion planning problem, employing the DC representation. The algorithm is tested on the southern Brazilian system, obtaining the optimal solution for the system with better performance than similar metaheuristic algorithms applied to the same problem. ©2010 IEEE.
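Path-relinking evolves a population by walking from one elite solution toward another, changing one decision at a time and keeping the best intermediate solution visited. The sketch below does this for transmission expansion encoded as lines-per-corridor vectors; the corridor costs, the capacity-penalty objective, and the two elite topologies are all hypothetical stand-ins for the DC network evaluation used in the paper.

```python
# Hypothetical corridor costs; a solution is the number of lines per corridor.
LINE_COST = [10.0, 14.0, 8.0, 20.0, 12.0]

def cost(sol):
    """Toy objective: investment cost plus a penalty for too little capacity.
    The paper evaluates candidates with a DC network model instead."""
    investment = sum(n * c for n, c in zip(sol, LINE_COST))
    penalty = 100.0 * max(0, 6 - sum(sol))  # require at least 6 lines total
    return investment + penalty

def path_relink(start, guide):
    """Walk from start to guide one corridor at a time; keep best visited."""
    current = list(start)
    best, best_cost = list(current), cost(current)
    while current != list(guide):
        # Multi-move step: try each differing coordinate, take cheapest move.
        moves = [i for i in range(len(current)) if current[i] != guide[i]]
        candidates = []
        for i in moves:
            step = list(current)
            step[i] += 1 if guide[i] > current[i] else -1
            candidates.append((cost(step), step))
        c, current = min(candidates)
        if c < best_cost:
            best, best_cost = current[:], c
    return best, best_cost

s1 = [3, 0, 2, 0, 1]   # two hypothetical elite topologies
s2 = [0, 2, 3, 1, 0]
best, best_cost = path_relink(s1, s2)
print(best, best_cost)
```

Since the walk starts at an elite solution and only records improvements, the returned cost can never exceed the cost of the initiating solution, which is what makes path-relinking a safe intensification step.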
Abstract:
The multi-relational Data Mining approach has emerged as an alternative for the analysis of structured data, such as relational databases. Unlike traditional algorithms, multi-relational proposals mine multiple tables directly, avoiding costly join operations. This paper presents a comparative study between the traditional Patricia Mine algorithm and its proposed multi-relational counterpart, MR-Radix, in order to evaluate the performance of the two approaches for mining association rules in relational databases. The study makes two original contributions: the proposal of the multi-relational algorithm MR-Radix, which is efficient for relational databases both in execution time and in memory usage, and empirical evidence of the performance advantage of the multi-relational approach over several tables, which avoids costly join operations across multiple tables. © 2011 IEEE.
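The cost the multi-relational approach avoids is the materialized join, which duplicates one-side rows once per matching many-side row before any mining starts. A minimal sketch of the contrast, with a hypothetical two-table schema (this is the general idea only, not the MR-Radix data structures):

```python
# Hypothetical two-table schema: one customer row, many purchase rows.
customers = {1: "gold", 2: "silver", 3: "gold"}          # id -> segment
purchases = [(1, "tea"), (1, "mug"), (2, "tea"), (3, "tea"), (3, "mug")]

# Traditional route: materialize the join, then mine the single flat table.
joined = [(customers[cid], item) for cid, item in purchases]
print(len(joined))  # one row per purchase: customer rows get duplicated

# Multi-relational route: count rule support per customer by following the
# foreign key, without ever building the joined table.
support = 0
for cid, segment in customers.items():
    items = {item for c, item in purchases if c == cid}
    if segment == "gold" and "tea" in items:
        support += 1
print(support, "/", len(customers))  # support of pattern {gold, buys tea}
```

On real schemas the duplication factor of the join grows with the fan-out of every one-to-many relationship, which is why mining the tables in place can win on both time and memory.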
Abstract:
This work develops two approaches based on fuzzy set theory to solve a class of fuzzy mathematical optimization problems with uncertainties in both the objective function and the set of constraints. The first approach is an adaptation of an iterative method that obtains cut levels and then maximizes the membership function of the fuzzy decision using the bound search method. The second is a metaheuristic approach that adapts a standard genetic algorithm to operate on fuzzy numbers. Both approaches use a decision criterion called the satisfaction level, which identifies the best solution in the uncertain environment. Selected examples from the literature are presented to compare and validate the efficiency of the methods, with emphasis on a fuzzy optimization problem arising in import-export companies in the south of Spain. © 2012 Brazilian Operations Research Society.
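The satisfaction-level criterion follows the classic fuzzy-decision idea: the decision membership is the intersection (minimum) of the goal and constraint memberships, and the best solution maximizes that minimum. A minimal single-variable sketch, with hypothetical linear membership ramps and a plain grid search standing in for the paper's cut-level/bound-search iteration:

```python
# Hypothetical single-variable problem: choose production level x.
# Goal: profit 3x should be "essentially >= 90" (fully satisfied at 120).
# Constraint: resource use 2x should be "essentially <= 70" (tolerance to 90).

def mu_goal(x):
    p = 3 * x
    if p <= 90:
        return 0.0
    if p >= 120:
        return 1.0
    return (p - 90) / 30

def mu_constraint(x):
    r = 2 * x
    if r <= 70:
        return 1.0
    if r >= 90:
        return 0.0
    return (90 - r) / 20

def satisfaction(x):
    # Fuzzy decision: intersect goal and constraint memberships.
    return min(mu_goal(x), mu_constraint(x))

# Grid search stands in for the cut-level / bound-search iteration.
best_x = max((i * 0.01 for i in range(6001)), key=satisfaction)
print(best_x, satisfaction(best_x))
```

With these ramps the goal membership rises while the constraint membership falls, so the maximizer sits where the two ramps cross (x = 37.5, satisfaction 0.75), the balance point between ambition and feasibility that the satisfaction level formalizes.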
Abstract:
This paper tackles a Nurse Scheduling Problem which consists of generating work schedules for a set of nurses while considering their shift preferences and other requirements. The objective is to maximize the satisfaction of nurses' preferences and minimize the violation of soft constraints. This paper presents a new deterministic heuristic algorithm, called MAPA (multi-assignment problem-based algorithm), which is based on successive resolutions of the assignment problem. The algorithm has two phases: a constructive phase and an improvement phase. The constructive phase builds a full schedule by solving successive assignment problems, one for each day in the planning period. The improvement phase uses a couple of procedures that re-solve assignment problems to produce a better schedule. Given the deterministic nature of this algorithm, the same schedule is obtained each time the algorithm is applied to the same problem instance. The performance of MAPA is benchmarked against published results for almost 250,000 instances from the NSPLib dataset. In most cases, particularly on large instances of the problem, the results produced by MAPA are better than the best-known solutions from the literature. The experiments reported here also show that the MAPA algorithm finds more feasible solutions than other algorithms in the literature, which suggests that the proposed approach is effective and robust. © 2013 Springer Science+Business Media New York.
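The constructive phase can be pictured as one assignment problem per day: nurses on one side, that day's shifts on the other, with costs derived from preference penalties. The sketch below uses hypothetical penalties and brute-force enumeration in place of a proper assignment-problem solver, and it omits the cross-day requirements (rest rules, coverage constraints) that make the real problem hard:

```python
from itertools import permutations

# Hypothetical preference penalties: PREF[nurse][shift], lower is better.
# Shifts per day: 0 = early, 1 = late, 2 = night (one nurse per shift).
PREF = {
    "Ana":  [0, 2, 5],
    "Bea":  [3, 0, 2],
    "Cris": [4, 3, 0],
}
DAYS = 3

def solve_assignment(pref):
    """Optimal one-nurse-per-shift assignment by enumeration; MAPA uses a
    proper assignment-problem algorithm here instead."""
    nurses = list(pref)
    best = min(permutations(range(len(nurses))),
               key=lambda p: sum(pref[n][s] for n, s in zip(nurses, p)))
    return dict(zip(nurses, best))

# Constructive phase: one assignment problem per day of the planning period.
schedule = [solve_assignment(PREF) for _ in range(DAYS)]
total_penalty = sum(PREF[n][s] for day in schedule for n, s in day.items())
print(schedule[0], total_penalty)
```

The improvement phase would then fix part of this schedule and re-solve the per-day assignment problems to repair soft-constraint violations; since every step is deterministic, repeated runs reproduce the same schedule.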
Abstract:
Objective. To determine the influence of cement thickness and ceramic/cement bonding on stresses and failure of CAD/CAM crowns, using both multi-physics finite element analysis (FEA) and monotonic testing. Methods. Axially symmetric FEA models were created for stress analysis of a stylized monolithic crown having resin cement thicknesses from 50 to 500 μm under occlusal loading. The ceramic-cement interface was modeled as bonded or non-bonded (cement-dentin as bonded). Cement polymerization shrinkage was simulated as a thermal contraction. Loads necessary to reach stresses for radial cracking from the intaglio surface were calculated by FEA. Experimentally, feldspathic CAD/CAM crowns based on the FEA model were machined with different occlusal cementation spaces, etched and cemented to dentin analogs. Non-bonding of etched ceramic was achieved using a thin layer of poly(dimethylsiloxane). Crowns were loaded to failure at 5 N/s, with radial cracks detected acoustically. Results. Failure loads depended on the bonding condition and the cement thickness in both FEA and physical testing. Average fracture loads for bonded crowns were 673.5 N at 50 μm cement thickness and 300.6 N at 500 μm. FEA stresses due to polymerization shrinkage increased with cement thickness, overwhelming the protective effect of bonding, as was also seen experimentally. At 50 μm cement thickness, bonded crowns withstood at least twice the failure load of non-bonded crowns. Significance. Occlusal "fit" can have structural implications for CAD/CAM crowns; pre-cementation spaces of around 50-100 μm are recommended from this study. Bonding benefits were lost at thicknesses approaching 450-500 μm due to polymerization shrinkage stresses. © 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Abstract:
A comparative study of two groups of patients with paracoccidioidomycosis was carried out with the objective of comparing the serologic, clinical and radiologic evolution after 6, 12, 15 and 18 months of treatment with ketoconazole (22 patients) or amphotericin B plus sulfonamides (32 patients). The serologic data, analyzed as a whole, showed a tendency toward sharper drops in antibody titers in the patients treated with ketoconazole. Clinically, patients treated with ketoconazole fared better, but the differences were not statistically significant. No statistical difference was detected between the groups in terms of radiologic evolution. © 1985 Martinus Nijhoff/Dr W. Junk Publishers.
Abstract:
Piecewise-Linear Programming (PLP) is an important area of Mathematical Programming and concerns the minimisation of a convex separable piecewise-linear objective function subject to linear constraints. In this paper, a subarea of PLP called Network Piecewise-Linear Programming (NPLP) is explored. The paper presents four specialised algorithms for NPLP: (Strongly Feasible) Primal Simplex, the Dual Method, Out-of-Kilter, and (Strongly Polynomial) Cost-Scaling, and studies their relative efficiency. A statistically designed experiment is used to perform a computational comparison of the algorithms. The response variable observed in the experiment is the CPU time to solve randomly generated network piecewise-linear problems, classified according to problem class (Transportation, Transshipment and Circulation), problem size, extent of capacitation, and number of breakpoints per arc. Results and conclusions on the performance of the algorithms are reported.
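What makes the convex separable case tractable is that each arc's piecewise-linear cost has nondecreasing marginal costs, so every segment can be treated as a parallel arc in an ordinary min-cost-flow model and filled greedily in cost order. A minimal single-arc sketch of that standard transformation, with hypothetical breakpoints (not an implementation of any of the four algorithms compared in the paper):

```python
# A convex piecewise-linear arc cost given as (segment_capacity, unit_cost)
# pairs with nondecreasing unit costs; the standard transformation turns
# each segment into a parallel arc of an ordinary min-cost-flow network.
SEGMENTS = [(4, 1.0), (3, 2.5), (5, 6.0)]  # hypothetical breakpoints

def pwl_cost(flow):
    """Cheapest way to push `flow` units: fill segments in unit-cost order.
    Convexity guarantees this greedy fill is exact."""
    total = 0.0
    for cap, unit in sorted(SEGMENTS, key=lambda s: s[1]):
        used = min(cap, flow)
        total += used * unit
        flow -= used
        if flow <= 0:
            break
    if flow > 0:
        raise ValueError("flow exceeds total arc capacity")
    return total

print(pwl_cost(6))   # fills the 1.0-cost segment, then part of the 2.5 one
```

Without convexity the cheap segments could sit behind expensive ones and this greedy fill would be wrong, which is why the specialised NPLP algorithms all assume the convex separable form.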