13 results for Algorithm Comparison
in Repositório Institucional UNESP - Universidade Estadual Paulista "Julio de Mesquita Filho"
Abstract:
The paper presents an extended genetic algorithm for solving the optimal transmission network expansion planning problem. Two main improvements have been introduced in the genetic algorithm: (a) an initial population obtained by conventional optimisation-based methods; (b) a mutation approach inspired by the simulated annealing technique. The proposed method is general in the sense that it does not assume any particular property of the problem being solved, such as linearity or convexity. Excellent performance is reported in the test results section of the paper for a difficult large-scale real-life problem: a substantial reduction in investment costs has been obtained with regard to previous solutions obtained via conventional optimisation methods and simulated annealing algorithms, and statistical comparison procedures have been employed in benchmarking the different versions of the genetic algorithm and simulated annealing methods.
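To make the two improvements concrete, the sketch below shows a toy genetic algorithm in which part of the initial population is seeded from a heuristic solution and the mutation step uses a Metropolis-style acceptance rule borrowed from simulated annealing. The fitness function, encoding, and all parameters are illustrative assumptions, not the authors' implementation.

```python
# Toy sketch only: GA seeded from a "conventional" heuristic solution, with an
# SA-style Metropolis acceptance rule inside the mutation step.
import math
import random

def fitness(x):
    # Hypothetical cost to minimise (stands in for the investment cost of a plan).
    return sum(xi * xi for xi in x)

def seed_population(size, n_genes, heuristic):
    # Improvement (a): half of the initial population copies the heuristic solution.
    pop = [list(heuristic) for _ in range(size // 2)]
    pop += [[random.uniform(-5, 5) for _ in range(n_genes)] for _ in range(size - len(pop))]
    return pop

def sa_mutation(ind, temperature):
    # Improvement (b): a worsening mutation is accepted only with probability exp(-delta/T).
    candidate = ind[:]
    candidate[random.randrange(len(candidate))] += random.gauss(0, 1)
    delta = fitness(candidate) - fitness(ind)
    if delta < 0 or random.random() < math.exp(-delta / max(temperature, 1e-9)):
        return candidate
    return ind

def run_ga(generations=200, pop_size=30, n_genes=6):
    heuristic = [1.0] * n_genes            # pretend this came from a conventional optimiser
    pop = seed_population(pop_size, n_genes, heuristic)
    temperature = 1.0
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_genes)
            children.append(sa_mutation(a[:cut] + b[cut:], temperature))
        pop = parents + children
        temperature *= 0.98                # cooling schedule borrowed from simulated annealing
    return min(pop, key=fitness)

print(fitness(run_ga()))
```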
Abstract:
In this work, genetic algorithm concepts, along with a rotamer library for protein side chains, are used to optimize the tertiary structure of the hydrophobic core of Cytochrome b(562), starting from the known PDB structure of its backbone, which is kept fixed while the side chains of the hydrophobic core are allowed to adopt the conformations present in the rotamer library. The atoms of the side chains forming the core interact via van der Waals energy. Besides the prediction of the native core structure, a set of different amino acid sequences for this core is also suggested. Comparisons between these new cores and the native one are made in terms of their volumes, van der Waals energy values, and the number of contacts made by the side chains forming the cores. This paper demonstrates that genetic algorithms are efficient at designing new sequences for the protein core. (C) 2007 Elsevier B.V. All rights reserved.
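A minimal sketch of the idea, assuming an invented rotamer library and a standard 12-6 Lennard-Jones term for the van der Waals interactions: a chromosome assigns one rotamer index to each core residue and the GA minimizes the total inter-residue side-chain energy. The paper's actual library, energy parameters, and GA settings are not reproduced.

```python
# Illustrative sketch with a hypothetical rotamer library.
import itertools
import math
import random

random.seed(1)
N_RESIDUES, N_ROTAMERS, N_ATOMS = 8, 4, 3

# Hypothetical library: ROTAMERS[residue][rotamer] is a list of 3-D atom positions.
ROTAMERS = [
    [[(random.uniform(0, 15), random.uniform(0, 15), random.uniform(0, 15))
      for _ in range(N_ATOMS)] for _ in range(N_ROTAMERS)]
    for _ in range(N_RESIDUES)
]

def lj(a, b, eps=0.2, sigma=3.5):
    # 12-6 Lennard-Jones (van der Waals) term between two atoms.
    r = max(math.dist(a, b), 1e-6)
    return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

def core_energy(choice):
    # choice[i] = rotamer index for residue i; sum energy over inter-residue atom pairs.
    total = 0.0
    for i, j in itertools.combinations(range(N_RESIDUES), 2):
        for a in ROTAMERS[i][choice[i]]:
            for b in ROTAMERS[j][choice[j]]:
                total += lj(a, b)
    return total

def ga(pop_size=40, generations=100):
    pop = [[random.randrange(N_ROTAMERS) for _ in range(N_RESIDUES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=core_energy)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_RESIDUES)
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:          # point mutation: swap one rotamer
                child[random.randrange(N_RESIDUES)] = random.randrange(N_ROTAMERS)
            children.append(child)
        pop = survivors + children
    return min(pop, key=core_energy)

print(core_energy(ga()))
```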
Abstract:
The third primary production algorithm round robin (PPARR3) compares output from 24 models that estimate depth-integrated primary production from satellite measurements of ocean color, as well as from seven general circulation models (GCMs) coupled with ecosystem or biogeochemical models. Here we compare the global primary production fields corresponding to eight months of 1998 and 1999 as estimated from common input fields of photosynthetically available radiation (PAR), sea-surface temperature (SST), mixed-layer depth, and chlorophyll concentration. We also quantify the sensitivity of the ocean-color-based models to perturbations in their input variables. The pair-wise correlation between ocean-color models was used to cluster them into groups of related output, which reflect the regions and environmental conditions under which they respond differently. The groups do not follow model complexity with regard to wavelength or depth dependence, though they are related to the manner in which temperature is used to parameterize photosynthesis. Global average PP varies by a factor of two between models. The models diverged the most for the Southern Ocean, SST under 10 degrees C, and chlorophyll concentrations exceeding 1 mg Chl m(-3). Based on the conditions under which the model results diverge most, we conclude that current ocean-color-based models are challenged by high-nutrient low-chlorophyll conditions and by extreme temperatures or chlorophyll concentrations. The GCM-based models predict primary production comparable to that of the ocean-color-based models: they estimate higher values in the Southern Ocean, at low SST, and in the equatorial band, while they estimate lower values in eutrophic regions (probably because the area of high chlorophyll concentrations is smaller in the GCMs). Further progress in primary production modeling requires improved understanding of the effect of temperature on photosynthesis and better parameterization of the maximum photosynthetic rate. (c) 2006 Elsevier Ltd. All rights reserved.
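The clustering step can be illustrated as follows, under stated assumptions: random stand-ins replace the real model output fields, Pearson correlation is computed for every pair of models, and models are greedily grouped when they correlate above an arbitrary threshold. The authors' actual clustering procedure may differ.

```python
# Stand-in data and an arbitrary threshold replace the real PPARR3 fields;
# the point is only the pairwise-correlation grouping.
import random

random.seed(0)
N_MODELS, N_CELLS = 6, 500
outputs = {f"model_{m}": [random.random() for _ in range(N_CELLS)] for m in range(N_MODELS)}

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Greedy grouping: a model joins a group only if it correlates strongly
# (r > 0.8, an arbitrary cut-off) with every member already in that group.
groups = []
for name, field in outputs.items():
    for g in groups:
        if all(pearson(field, outputs[member]) > 0.8 for member in g):
            g.append(name)
            break
    else:
        groups.append([name])

print(groups)
```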
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
This paper presents a comparison of reactive power support in distribution networks provided by switched Capacitor Banks (CBs) and Distributed Generators (DGs). Regarding switched CBs, a Tabu Search metaheuristic algorithm is developed to determine their optimal operation with the objective of reducing the power losses in the lines of the system while meeting network constraints. On the other hand, the optimal operation of DGs is analyzed through an evolutionary Multi-Objective (MO) programming approach. The objectives of this approach are the minimization of power losses and of the operation cost of the DGs. The comparison of the reactive power support provided by switched CBs and DGs is carried out using a modified IEEE 34-bus distribution test system.
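A minimal Tabu Search sketch for the capacitor-bank side of the comparison is given below. The loss function is a toy surrogate; a real study would evaluate losses with a power flow on the modified IEEE 34-bus network and enforce voltage and network constraints, none of which is reproduced here.

```python
# Toy Tabu Search over discrete capacitor-bank switching steps.
import random

random.seed(0)
N_CB, MAX_STEP = 4, 5   # four switched banks, each with discrete steps 0..5

def losses(setting):
    # Hypothetical surrogate for line losses; a real study would run a power flow.
    target = (2, 4, 1, 3)
    return sum((s - t) ** 2 for s, t in zip(setting, target))

def neighbors(setting):
    # A move changes one bank by one step up or down.
    for i in range(N_CB):
        for delta in (-1, 1):
            s = setting[i] + delta
            if 0 <= s <= MAX_STEP:
                yield (i, delta), setting[:i] + (s,) + setting[i + 1:]

def tabu_search(iterations=50, tenure=5):
    current = tuple(random.randint(0, MAX_STEP) for _ in range(N_CB))
    best = current
    tabu = {}   # move -> first iteration at which it is allowed again
    for it in range(iterations):
        candidates = [(losses(cand), move, cand) for move, cand in neighbors(current)
                      if tabu.get(move, 0) <= it or losses(cand) < losses(best)]  # aspiration
        if not candidates:
            break
        cost, move, current = min(candidates)
        tabu[(move[0], -move[1])] = it + tenure   # forbid undoing this move for `tenure` iterations
        if cost < losses(best):
            best = current
    return best, losses(best)

print(tabu_search())
```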
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
We introduce a new hybrid approach to determine the ground-state geometry of molecular systems. First, we compared the ability of a genetic algorithm (GA) and simulated annealing (SA) to find the lowest-energy geometry of silicon clusters with six and ten atoms. This comparison showed that GA exhibits fast initial convergence, but its performance deteriorates as it approaches the desired global extremum. Interestingly, SA showed a complementary convergence pattern, in addition to high accuracy. Our new procedure combines selected features from GA and SA to achieve weak dependence on initial parameters, a parallel search strategy, fast convergence, and high accuracy. This hybrid algorithm outperforms GA and SA by one order of magnitude for small silicon clusters (Si6 and Si10). Next, we applied the hybrid method to study the geometry of a 20-atom silicon cluster. It was able to find an original geometry, apparently lower in energy than those previously described in the literature. In principle, our procedure can be applied successfully to any molecular system. © 1998 Elsevier Science B.V.
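The staged hybridization can be sketched as follows, with a simple Lennard-Jones pair potential standing in for the real energy of a silicon cluster: a genetic algorithm provides fast coarse convergence and simulated annealing then refines the best geometry, matching the complementary behaviour described above. All parameters are illustrative assumptions, not the paper's settings.

```python
# Two-stage hybrid sketch: GA for coarse search, SA for refinement.
import math
import random

random.seed(2)
N_ATOMS = 6

def energy(coords):
    # Lennard-Jones cluster energy (a stand-in for the real silicon potential).
    e = 0.0
    for i in range(N_ATOMS):
        for j in range(i + 1, N_ATOMS):
            r = max(math.dist(coords[i], coords[j]), 1e-6)
            e += 4 * ((1 / r) ** 12 - (1 / r) ** 6)
    return e

def random_geometry():
    return [(random.uniform(-2, 2), random.uniform(-2, 2), random.uniform(-2, 2))
            for _ in range(N_ATOMS)]

def ga_stage(pop_size=30, generations=60):
    pop = [random_geometry() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=energy)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_ATOMS)
            children.append(a[:cut] + b[cut:])
        pop = parents + children
    return min(pop, key=energy)

def sa_stage(coords, steps=5000, t0=0.5):
    current, e_cur = coords, energy(coords)
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-4
        i = random.randrange(N_ATOMS)
        trial = list(current)
        x, y, z = trial[i]
        trial[i] = (x + random.gauss(0, 0.05), y + random.gauss(0, 0.05), z + random.gauss(0, 0.05))
        e_new = energy(trial)
        if e_new < e_cur or random.random() < math.exp((e_cur - e_new) / t):
            current, e_cur = trial, e_new
    return current, e_cur

refined, e = sa_stage(ga_stage())
print(e)
```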
Abstract:
The risk for venous thromboembolism (VTE) in medical patients is high, but risk assessment is rarely performed because there is not yet a good method to identify candidates for prophylaxis. Purpose: To perform a systematic review of VTE risk factors (RFs) in hospitalized medical patients and generate recommendations (RECs) for prophylaxis that can be implemented into practice. Data sources: A multidisciplinary group of experts from 12 Brazilian Medical Societies searched MEDLINE, Cochrane, and LILACS. Study selection: Two experts independently classified the evidence for each RF by its scientific quality in a standardized manner. A risk-assessment algorithm was created based on the results of the review. Data synthesis: Several VTE RFs have enough evidence to support RECs for prophylaxis in hospitalized medical patients (eg, increasing age, heart failure, and stroke). Other factors are considered adjuncts of risk (eg, varices, obesity, and infections). According to the algorithm, hospitalized medical patients ≥40 years old with decreased mobility and ≥1 RF should receive chemoprophylaxis with heparin, provided they do not have contraindications. High prophylactic doses of unfractionated heparin or low-molecular-weight heparin must be administered and maintained for 6-14 days. Conclusions: A multidisciplinary group generated evidence-based RECs and an easy-to-use algorithm to facilitate VTE prophylaxis in medical patients. © 2007 Rocha et al, publisher and licensee Dove Medical Press Ltd.
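The core decision rule stated in the abstract can be written as a short function. This is a simplified, hypothetical encoding for illustration only, not the published instrument and not medical advice.

```python
# Simplified illustration of the stated rule: age >= 40, decreased mobility,
# at least one risk factor, and no contraindication -> heparin prophylaxis for 6-14 days.
from dataclasses import dataclass, field

@dataclass
class Patient:
    age: int
    decreased_mobility: bool
    risk_factors: list = field(default_factory=list)   # e.g. ["heart failure", "stroke"]
    contraindication: bool = False                      # e.g. active bleeding

def recommend_prophylaxis(p: Patient) -> str:
    if p.contraindication:
        return "no chemoprophylaxis (contraindicated)"
    if p.age >= 40 and p.decreased_mobility and len(p.risk_factors) >= 1:
        return "UFH or LMWH at prophylactic dose, maintained for 6-14 days"
    return "routine chemoprophylaxis not indicated by this rule; reassess risk"

print(recommend_prophylaxis(Patient(72, True, ["heart failure"])))
```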
Abstract:
This chapter studies a two-level production planning problem where, on each level, a lot sizing and scheduling problem with parallel machines, capacity constraints and sequence-dependent setup costs and times must be solved. The problem can be found in soft drink companies where the production process involves two interdependent levels with decisions concerning raw material storage and soft drink bottling. Models and solution approaches proposed so far are surveyed and conceptually compared. Two different approaches have been selected to perform a series of computational comparisons: an evolutionary technique comprising a genetic algorithm and its memetic version, and a decomposition and relaxation approach. © 2008 Springer-Verlag Berlin Heidelberg.
Abstract:
Hepatocellular carcinoma (HCC) is a primary tumor of the liver. After local therapies, tumor evaluation is based on the mRECIST criteria, which involve measuring the maximum diameter of the viable lesion. This paper describes a computational methodology to measure the maximum diameter of the tumor through the contrast-enhanced area of the lesions. Sixty-three computed tomography (CT) slices from 23 patients were assessed. Non-contrasted liver and typical HCC nodules were evaluated, and a virtual phantom was developed for this purpose. Detection and quantification by the algorithm were optimized using the virtual phantom. After that, we compared the algorithm's findings for the maximum diameter of the target lesions against the radiologist's measurements. The computed results for the maximum diameter are in good agreement with those obtained by radiologist evaluation, indicating that the algorithm was able to detect the tumor limits properly. A comparison of the maximum diameter estimated by the radiologist versus the algorithm revealed differences on the order of 0.25 cm for large-sized tumors (diameter > 5 cm), whereas agreement within 1.0 cm was found for small-sized tumors. Differences between algorithm and radiologist measurements were small for small-sized tumors, with a trend toward a slight increase for tumors greater than 5 cm. Therefore, traditional methods for measuring lesion diameter should be complemented with non-subjective measurement methods, which would allow a more correct evaluation of the contrast-enhanced areas of HCC according to the mRECIST criteria.
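One plausible way to implement the measurement, assuming details not given in the abstract, is to threshold the contrast-enhanced lesion into a binary mask and take the maximum pairwise distance between lesion pixels, scaled by the pixel spacing, as the mRECIST maximum diameter:

```python
# Sketch of the measurement idea only (assumed thresholds and spacing, not the authors' code).
import itertools
import math

PIXEL_SPACING_CM = 0.08                     # assumed in-plane spacing of the CT slice

def lesion_mask(slice_hu, low, high):
    # Keep pixels whose attenuation falls in an assumed "enhancing lesion" window.
    return [(r, c) for r, row in enumerate(slice_hu)
            for c, v in enumerate(row) if low <= v <= high]

def max_diameter_cm(pixels):
    if len(pixels) < 2:
        return 0.0
    return max(math.dist(a, b) for a, b in itertools.combinations(pixels, 2)) * PIXEL_SPACING_CM

# Toy 2-D "slice": a bright blob on a darker background.
slice_hu = [[60 if 2 <= r <= 6 and 3 <= c <= 8 else 20 for c in range(12)] for r in range(10)]
print(round(max_diameter_cm(lesion_mask(slice_hu, 50, 200)), 2))
```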
Abstract:
The algorithm creates a buffer area around the cartographic features of interest in one of the images and compares it with the other image. During the comparison, the algorithm counts the number of equal and different points and uses these counts to calculate the statistical values of the analysis. One calculated statistical value is the correctness, which shows the user the percentage of points that were correctly extracted. Another is the completeness, which shows the percentage of points that really belong to the feature of interest. The third value expresses the quality achieved by the extraction method, since the algorithm uses the previously calculated correctness and completeness to compute it. In all tests performed with this algorithm, the calculated statistical values were able to represent quantitatively the quality achieved by the extraction method. Therefore, the developed algorithm can be used to analyze extraction methods for cartographic features of interest, since the results obtained were promising.
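Under the commonly used definitions of these measures (an assumption, since the abstract does not restate the formulas), correctness is the fraction of extracted points falling inside a buffer around the reference feature, completeness is the fraction of reference points matched by the extraction, and quality combines both. A small sketch:

```python
# Buffer-based evaluation sketch with assumed (standard) definitions of
# correctness, completeness, and quality.
import math

def within_buffer(p, points, radius):
    return any(math.dist(p, q) <= radius for q in points)

def evaluate(extracted, reference, radius=1.0):
    tp = sum(within_buffer(p, reference, radius) for p in extracted)   # correctly extracted points
    fp = len(extracted) - tp                                           # points outside the buffer
    matched_ref = sum(within_buffer(q, extracted, radius) for q in reference)
    fn = len(reference) - matched_ref                                  # reference points missed
    correctness = tp / (tp + fp) if extracted else 0.0
    completeness = matched_ref / len(reference) if reference else 0.0
    quality = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0
    return correctness, completeness, quality

reference = [(x, 0.0) for x in range(10)]                    # reference feature points
extracted = [(x, 0.3) for x in range(8)] + [(20.0, 5.0)]     # extraction with one outlier
print(evaluate(extracted, reference))
```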
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The increasing number of sequences stored in genomic databases has made sequential analysis unfeasible. Parallel computing has therefore brought its power to Bioinformatics through parallel algorithms for aligning and analyzing sequences, providing improvements mainly in the running time of these algorithms. In many situations, the parallel strategy contributes to reducing the computational complexity of large problems. This work shows some results obtained by an implementation of a parallel score estimation technique for the score matrix calculation stage, which is the first stage of a progressive multiple sequence alignment. The performance and quality of the parallel score estimation are compared with the results of a dynamic programming approach also implemented in parallel. This comparison shows a significant reduction in running time. Moreover, the quality of the final alignment using the new strategy is analyzed and compared with the quality of the dynamic programming approach.
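Because each pairwise score is independent, the score matrix stage parallelizes naturally over sequence pairs. The sketch below distributes pairs across worker processes; the cheap k-mer similarity is only an assumed stand-in for the paper's score estimation technique.

```python
# Parallel pairwise score matrix sketch using worker processes, one task per sequence pair.
from itertools import combinations
from multiprocessing import Pool

SEQUENCES = {
    "s1": "ACGTACGTGACT",
    "s2": "ACGTACGAGACT",
    "s3": "TTGTACGTGACA",
    "s4": "ACGAACGTGGCT",
}

def kmer_set(seq, k=3):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def pair_score(pair):
    a, b = pair
    sa, sb = kmer_set(SEQUENCES[a]), kmer_set(SEQUENCES[b])
    return pair, len(sa & sb) / len(sa | sb)        # Jaccard similarity of k-mers

if __name__ == "__main__":
    pairs = list(combinations(SEQUENCES, 2))
    with Pool() as pool:                            # each pair scored in a separate worker
        score_matrix = dict(pool.map(pair_score, pairs))
    print(score_matrix)
```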