912 results for "Simulated annealing algorithms"


Relevance: 80.00%

Abstract:

This work aimed to evaluate a strategy used to generate management alternatives in the formulation and solution of forest planning problems with covering constraints. The forest planning problem was formulated via Model I and Model II, as these formulations were named by Johnson and Scheurman (1977), resulting in integer linear programming problems with 63 and 42 management alternatives, respectively. As expected, the problem formulated via Model I showed no violation of the covering constraints, whereas in the problem formulated via Model II some management units were fractioned; this too was expected, since that formulation does not guarantee the integrity of the management units. To ensure the integrity of the management units under the Model II formulation, the problem had to be reformulated as an integer nonlinear programming problem, which is even harder to solve than integer linear programs. Solving integer nonlinear programming problems efficiently is hampered by the limited efficiency of the main exact algorithms and by the scarcity of applications of approximate algorithms, such as the metaheuristics simulated annealing, tabu search and genetic algorithms, to this type of problem, which makes it an attractive area for research.
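Problems of this kind are a natural target for metaheuristics. A minimal simulated-annealing sketch for assigning one management alternative to each unit while minimizing a total cost; the encoding and cooling schedule here are illustrative assumptions, not the formulation used in the study:

```python
import math
import random

def simulated_annealing(n_units, n_alts, cost, t0=10.0, cooling=0.995,
                        steps=5000, seed=0):
    """Assign one alternative to every management unit (keeping units whole)
    while minimizing total cost via simulated annealing."""
    rng = random.Random(seed)
    state = [rng.randrange(n_alts) for _ in range(n_units)]
    energy = sum(cost(u, a) for u, a in enumerate(state))
    best, best_e, t = list(state), energy, t0
    for _ in range(steps):
        u = rng.randrange(n_units)            # perturb one unit's assignment
        a_new = rng.randrange(n_alts)
        delta = cost(u, a_new) - cost(u, state[u])
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            state[u] = a_new
            energy += delta
            if energy < best_e:
                best, best_e = list(state), energy
        t *= cooling                          # geometric cooling
    return best, best_e
```

Harder constraints (e.g. covering constraints linking units) would enter through penalty terms in the cost function rather than the move set.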

Relevance: 80.00%

Abstract:

Polyglutamine is a naturally occurring peptide found within several proteins in neuronal cells of the brain, and its aggregation has been implicated in several neurodegenerative diseases, including Huntington's disease. The resulting aggregates have been demonstrated to possess β-sheet structure, and aggregation has been shown to start with a single misfolded peptide. The current project sought to computationally examine the structural tendencies of three mutant polyglutamine peptides that had been studied experimentally and found to aggregate with varying efficiencies. Low-energy structures were generated for each peptide by simulated annealing and analyzed quantitatively by various geometry- and energy-based methods. According to the results, the experimentally observed inhibition of aggregation appears to be due to localized conformational restraint placed on the peptide backbone by inserted prolines, which in turn confines the peptide to its native coil structure, discouraging the transition towards the β-sheet structure required for aggregation. Such knowledge could prove useful in the design of future treatments for Huntington's and related diseases.

Relevance: 80.00%

Abstract:

Predicting a protein's conformation helps to understand its function, enables modeling, and supports the possible synthesis of the studied protein. Our research focuses on a sub-problem of protein folding known as side-chain packing, whose computational complexity has been proven NP-hard. The motivation behind our study is to offer the scientific community a means of obtaining faster conformation approximations for small to large proteins than currently available methods provide. As protein size increases, current techniques become unusable owing to the exponential nature of the problem. We investigated the ability of a hybrid genetic algorithm/simulated annealing technique to predict the low-energy conformational states of proteins of various sizes and to generate statistical distributions of the studied proteins' molecular ensembles for pKa predictions. Our algorithm produced errors within acceptable margins of experimental results and offered considerable speedup, depending on the protein and on the rotameric-state resolution used.
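The hybrid idea can be sketched abstractly: a genetic algorithm supplies global recombination while a short simulated-annealing pass refines each offspring. The bit-string encoding and fitness below are an illustrative toy, not the rotamer representation or energy function used in the study:

```python
import math
import random

def hybrid_ga_sa(fitness, n_genes, pop_size=20, gens=30, sa_steps=50, t0=1.0):
    """Toy GA/SA hybrid on bit strings: one-point crossover plus a short
    Metropolis refinement of each child; fitness is minimized."""
    random.seed(1)

    def sa_refine(ind):
        e, t = fitness(ind), t0
        for _ in range(sa_steps):
            j = random.randrange(n_genes)
            ind2 = ind[:j] + [1 - ind[j]] + ind[j + 1:]   # flip one bit
            e2 = fitness(ind2)
            if e2 <= e or random.random() < math.exp((e - e2) / t):
                ind, e = ind2, e2
            t *= 0.95                                     # cool during refinement
        return ind

    pop = [[random.randint(0, 1) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)                             # elitist selection
        parents = pop[:pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_genes)
            children.append(sa_refine(a[:cut] + b[cut:])) # crossover + SA
        pop = parents + children
    return min(pop, key=fitness)
```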

Relevance: 80.00%

Abstract:

The technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] provides an attractive method of building exact tests from statistics whose finite-sample distribution is intractable but can be simulated (provided it does not involve nuisance parameters). We extend this method in two ways: first, by allowing for MC tests based on exchangeable, possibly discrete, test statistics; second, by generalizing the method to statistics whose null distributions involve nuisance parameters (maximized MC tests, MMC). Simplified, asymptotically justified versions of the MMC method are also proposed, and it is shown that they provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics (e.g., unit-root asymptotics). Parametric bootstrap tests may be interpreted as a simplified version of the MMC method (without the general validity properties of the latter).
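The basic MC test construction (the nuisance-parameter-free case; the MMC extension maximizes the resulting p-value over the nuisance space) fits in a few lines: with n simulated statistics, p = (1 + #{S_i >= s0})/(n + 1) yields an exact level-alpha test when (n + 1)·alpha is an integer and the statistics are exchangeable and continuous:

```python
import random

def mc_pvalue(s0, simulate_null, n=99, seed=0):
    """Dwass/Barnard-style Monte Carlo p-value for an observed statistic s0,
    given a function that draws one statistic under the null."""
    rng = random.Random(seed)
    sims = [simulate_null(rng) for _ in range(n)]
    # Count simulated statistics at least as extreme as the observed one.
    return (1 + sum(s >= s0 for s in sims)) / (n + 1)
```

With n = 99, rejecting when p <= 0.05 gives an exact 5% test, since 100 × 0.05 is an integer.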

Relevance: 80.00%

Abstract:

The results presented in this thesis clarify certain aspects of the function of the Na+/glucose cotransporter (SGLT1), a transmembrane protein that uses the favourable electrochemical gradient of Na+ ions to accumulate glucose inside the epithelial cells of the small intestine and kidney. We first used two-microelectrode electrophysiology on Xenopus oocytes to identify the ions that make up the leak current of SGLT1, a current measured in the absence of glucose that is uncoupled from the strict 2 Na+/1 glucose stoichiometry characterizing cotransport. Our results showed that cations such as Li+, K+ and Cs+, which interact only weakly with the binding sites of SGLT1 and do not permit the conformations induced by Na+ binding, could nevertheless generate a leak current of amplitude comparable to that measured in the presence of Na+. This suggests that the leak current crosses SGLT1 through a permeation pathway different from the one defined by the conformational changes underlying Na+/glucose cotransport, possibly similar to the pathway used by passive water permeability. Second, we sought to estimate the turnover rate of SGLT1 cotransport cycles using the ion-trap technique, in which the large tip of a selective electrode (~100 μm) is pressed against the plasma membrane of an oocyte, enclosing a small volume of extracellular solution called the trap. The ionic concentration changes occurring in the trap as a result of SGLT1 activity allowed us to deduce that Na+/glucose cotransport proceeds at a rate of about 13 s-1 when the membrane potential is held at -155 mV. We then turned to the development of a kinetic model of SGLT1. Using the simulated annealing algorithm, we built a 7-state kinetic scheme that accurately reproduces the cotransporter currents as a function of extracellular Na+ and glucose. Our model predicts that, in the presence of a saturating glucose concentration, the reorientation of SGLT1 in the membrane following the intracellular release of its substrates is the rate-limiting step of cotransport.
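A generic simulated-annealing parameter fit of the kind used to calibrate such kinetic schemes can be sketched as follows; here `residual` stands in for the sum of squared errors between the model's predicted currents and the recordings, and the move sizes and cooling schedule are illustrative assumptions:

```python
import math
import random

def fit_by_annealing(residual, p0, steps=20000, t0=1.0, cooling=0.9995, seed=0):
    """Generic SA parameter fit: perturb one parameter at a time and accept
    moves by the Metropolis rule; residual(p) is the misfit to minimize."""
    rng = random.Random(seed)
    p = list(p0)
    e = residual(p)
    best_p, best_e, t = list(p), e, t0
    for _ in range(steps):
        i = rng.randrange(len(p))
        q = list(p)
        q[i] += rng.gauss(0.0, 0.1 * t + 0.01)   # smaller moves as T falls
        e2 = residual(q)
        if e2 <= e or rng.random() < math.exp((e - e2) / t):
            p, e = q, e2
            if e < best_e:
                best_p, best_e = list(p), e
        t *= cooling
    return best_p, best_e
```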

Relevance: 80.00%

Abstract:

This thesis examines the structure-function relationship in Na+/glucose cotransporters (SGLTs). SGLTs are membrane proteins that use the transmembrane electrochemical gradient of Na+ to accumulate their substrates in the cell. An introduction first presents a brief summary of current knowledge in the field, followed by an overview of the experimental techniques used in this work. The work can be divided into three projects. The first project addressed the structural basis of water permeation through SGLTs. Using molecular modelling techniques together with voltage-clamp volumetry, we identified the structural basis of this permeation: we identified in silico a passive water permeation pathway through the cotransporter, and then corroborated these results with measurements on the human Na+/glucose cotransporter (hSGLT1) expressed in oocytes. A second project elucidated some structural features of hSGLT1 using dipicrylamine (DPA), a fluorescence acceptor whose distribution in the lipid membrane depends on the membrane potential. The use of DPA, combined with voltage-clamp fluorometry and FRET (fluorescence resonance energy transfer), demonstrated the extracellular position of part of loop 12-13 and showed that hSGLT1 forms dimers whose subunits are linked by a disulfide bridge. A final project characterized the steady-state and pre-steady-state currents of another member of the SGLT family, the human Na+/myo-inositol cotransporter hSMIT2, in order to propose a kinetic model describing its operation. We showed that phlorizin poorly inhibits the pre-steady-state currents following a depolarization, and that there are leak currents that vary with time, membrane potential and substrates. A simulated annealing algorithm was developed to allow the objective determination of the connectivity and of the various parameters associated with the kinetic modelling.

Relevance: 80.00%

Abstract:

This paper presents a Reinforcement Learning (RL) approach to economic dispatch (ED) using a Radial Basis Function neural network. We formulate ED as an N-stage decision-making problem, propose a novel architecture to store Q-values, and present a learning algorithm for the weights of the neural network. Although many stochastic search techniques, such as simulated annealing, genetic algorithms and evolutionary programming, have been applied to ED, they require searching for the optimal solution anew for each load demand, and they are limited in handling stochastic cost functions. In our approach, once the Q-values are learned, the dispatch can be found for any load demand. We recently proposed an RL approach to ED in which only the optimum dispatch for a set of specified discrete values of power demand could be found. The performance of the proposed algorithm is validated on the IEEE 6-bus system, considering transmission losses.
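The N-stage decision formulation can be illustrated with tabular Q-values on a toy two-unit dispatch; the paper stores Q-values in an RBF network rather than a table, and the cost functions and integer discretization below are assumptions:

```python
import random

def q_learning_dispatch(stage_costs, demand, episodes=8000, alpha=0.1,
                        eps=0.2, seed=0):
    """Toy tabular Q-learning for economic dispatch: state = (stage, remaining
    demand), action = integer MW assigned to the current unit; the total
    generation cost is minimized over the N stages."""
    rng = random.Random(seed)
    n = len(stage_costs)
    Q = {}

    def q_table(stage, remaining):
        key = (stage, remaining)
        if key not in Q:
            # Last unit must absorb whatever demand remains.
            acts = [remaining] if stage == n - 1 else range(remaining + 1)
            Q[key] = {a: 0.0 for a in acts}
        return Q[key]

    for _ in range(episodes):
        remaining = demand
        for stage in range(n):
            qs = q_table(stage, remaining)
            a = rng.choice(list(qs)) if rng.random() < eps else min(qs, key=qs.get)
            r = stage_costs[stage](a)               # immediate generation cost
            nxt = remaining - a
            future = 0.0 if stage == n - 1 else min(q_table(stage + 1, nxt).values())
            qs[a] += alpha * (r + future - qs[a])   # cost-minimizing backup
            remaining = nxt

    remaining, plan = demand, []                    # greedy dispatch after learning
    for stage in range(n):
        qs = q_table(stage, remaining)
        plan.append(min(qs, key=qs.get))
        remaining -= plan[-1]
    return plan
```

Once trained, the greedy pass extracts a dispatch without re-running any search, which is the advantage over per-demand stochastic search the abstract points to.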

Relevance: 80.00%

Abstract:

The system described herein represents the first example of a recommender system in digital ecosystems where agents negotiate services on behalf of small companies. The small companies compete not only on price or quality, but with a wider service-by-service composition achieved by subcontracting with other companies. The final result of these offerings depends on negotiations at the scale of millions of small companies. This scale requires new platforms for supporting digital business ecosystems, as well as related services such as open-id, trust management, monitors and recommenders. This is done in the Open Negotiation Environment (ONE), an open-source platform that allows agents, on behalf of small companies, to negotiate and use the ecosystem services, and that enables the development of new agent technologies. The methods and tools of cyber engineering are necessary to build Open Negotiation Environments that are stable, a basic condition for predictable and reliable business environments. Aiming to build stable digital business ecosystems by means of improved collective intelligence, we introduce a model of negotiation style dynamics from the point of view of computational ecology. This model inspires an ecosystem monitor as well as a novel negotiation style recommender. The ecosystem monitor provides hints to the negotiation style recommender to achieve greater stability of an open negotiation environment in a digital business ecosystem. The greater stability provides the small companies with higher predictability, and therefore better business results. The negotiation style recommender is implemented with a simulated annealing algorithm at a constant temperature, and its impact is shown by applying it to a real case of an open negotiation environment populated by Italian companies.
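Simulated annealing at a constant temperature reduces to the Metropolis acceptance rule applied at a fixed T, which keeps the search perpetually able to escape local optima. A minimal sketch of how a style recommender might use it; the style names and scoring function are hypothetical, not the ONE platform's API:

```python
import math
import random

def metropolis_accept(current_e, proposed_e, temperature, rng):
    """Constant-temperature acceptance rule: always accept improvements,
    accept worsenings with probability exp(-delta / T)."""
    delta = proposed_e - current_e
    return delta <= 0 or rng.random() < math.exp(-delta / temperature)

def recommend_style(score, styles, steps=1000, temperature=0.5, seed=0):
    """Random-walk over candidate negotiation styles at fixed temperature,
    treating -score as the energy, and keep the best style seen."""
    rng = random.Random(seed)
    cur = rng.choice(styles)
    best = cur
    for _ in range(steps):
        cand = rng.choice(styles)
        if metropolis_accept(-score(cur), -score(cand), temperature, rng):
            cur = cand
            if score(cur) > score(best):
                best = cur
    return best
```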

Relevance: 80.00%

Abstract:

The simulated annealing approach to structure solution from powder diffraction data, as implemented in the DASH program, is easily amenable to parallelization at the individual run level. Very large scale increases in speed of execution can therefore be achieved by distributing individual DASH runs over a network of computers. The GDASH program achieves this by packaging DASH in a form that enables it to run under the Univa UD Grid MP system, which harnesses networks of existing computing resources to perform calculations.

Relevance: 80.00%

Abstract:

The simulated annealing approach to structure solution from powder diffraction data, as implemented in the DASH program, is easily amenable to parallelization at the individual run level. Modest increases in speed of execution can therefore be achieved by executing individual DASH runs on the individual cores of CPUs.
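Because the runs are independent, this per-run parallelization is embarrassingly parallel and can be sketched generically; DASH itself is not a Python program, so `sa_run` below is a toy stand-in for one independent structure-solution run:

```python
import random
from multiprocessing import Pool

def sa_run(seed):
    """One independent run (stand-in for a DASH run): returns
    (best_energy, seed). Runs share no state, so they map trivially
    onto separate cores."""
    rng = random.Random(seed)
    best = min(rng.uniform(0.0, 100.0) for _ in range(10000))  # toy search
    return best, seed

if __name__ == "__main__":
    with Pool() as pool:                       # one worker per core by default
        results = pool.map(sa_run, range(8))   # eight independent runs
    best_energy, best_seed = min(results)
    print(f"best run: seed={best_seed}, energy={best_energy:.4f}")
```

The speed-up is bounded by the number of cores, which is why this per-machine version is described as modest compared with distributing runs across a whole grid.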

Relevance: 80.00%

Abstract:

Matheron's usual variogram estimator can result in unreliable variograms when data are strongly asymmetric or skewed. Asymmetry in a distribution can arise from a long tail of values in the underlying process or from outliers that belong to another population that contaminates the primary process. This paper, the first of two, examines the effects of underlying asymmetry on the variogram and on the accuracy of prediction; the second paper examines the effects arising from outliers. Standard geostatistical texts suggest ways of dealing with underlying asymmetry; however, these suggestions are based on informed intuition rather than detailed investigation. To determine whether the methods generally used to deal with underlying asymmetry are appropriate, the effects of different coefficients of skewness on the shape of the experimental variogram and on the model parameters were investigated. Simulated annealing was used to create normally distributed random fields of different sizes from variograms with different nugget:sill ratios. These data were then modified to give different degrees of asymmetry, and the experimental variogram was computed in each case. The effects of standard data transformations on the form of the variogram were also investigated. Cross-validation was used to assess quantitatively the performance of the different variogram models for kriging. The results showed that the shape of the variogram was affected by the degree of asymmetry, and that the effect increased as the size of the data set decreased. Transformations of the data were more effective in reducing the skewness coefficient in the larger data sets. Cross-validation confirmed that variogram models from transformed data were more suitable for kriging than those from the raw asymmetric data. The results of this study have implications for 'standard best practice' in dealing with asymmetry in data for geostatistical analyses. (C) 2007 Elsevier Ltd. All rights reserved.
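Matheron's method-of-moments estimator, whose sensitivity to skewness motivates the study, can be written for a regular one-dimensional transect as gamma(h) = (1 / 2N(h)) * sum of (z_i - z_{i+h})^2 over the N(h) pairs at lag h:

```python
def matheron_variogram(values, max_lag):
    """Matheron's estimator on a regularly spaced 1-D transect:
    half the mean squared difference between all pairs at each lag h."""
    gam = {}
    for h in range(1, max_lag + 1):
        pairs = [(values[i] - values[i + h]) ** 2
                 for i in range(len(values) - h)]
        gam[h] = sum(pairs) / (2 * len(pairs))
    return gam
```

The squared differences are exactly why skewed data and outliers distort the estimate: a single extreme value inflates every pair it participates in.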

Relevance: 80.00%

Abstract:

Asymmetry in a distribution can arise from a long tail of values in the underlying process or from outliers that belong to another population that contaminate the primary process. The first paper of this series examined the effects of the former on the variogram and this paper examines the effects of asymmetry arising from outliers. Simulated annealing was used to create normally distributed random fields of different size that are realizations of known processes described by variograms with different nugget:sill ratios. These primary data sets were then contaminated with randomly located and spatially aggregated outliers from a secondary process to produce different degrees of asymmetry. Experimental variograms were computed from these data by Matheron's estimator and by three robust estimators. The effects of standard data transformations on the coefficient of skewness and on the variogram were also investigated. Cross-validation was used to assess the performance of models fitted to experimental variograms computed from a range of data contaminated by outliers for kriging. The results showed that where skewness was caused by outliers the variograms retained their general shape, but showed an increase in the nugget and sill variances and nugget:sill ratios. This effect was only slightly more for the smallest data set than for the two larger data sets and there was little difference between the results for the latter. Overall, the effect of size of data set was small for all analyses. The nugget:sill ratio showed a consistent decrease after transformation to both square roots and logarithms; the decrease was generally larger for the latter, however. Aggregated outliers had different effects on the variogram shape from those that were randomly located, and this also depended on whether they were aggregated near to the edge or the centre of the field. 
The results of cross-validation showed that the robust estimators and the removal of outliers were the most effective ways of dealing with outliers for variogram estimation and kriging. (C) 2007 Elsevier Ltd. All rights reserved.

Relevance: 80.00%

Abstract:

Visual exploration of scientific data in the life sciences is a growing research field owing to the large amount of available data. Kohonen's Self-Organizing Map (SOM) is a widely used tool for the visualization of multidimensional data. In this paper we present a fast learning algorithm for SOMs that uses a simulated annealing method to adapt the learning parameters. The algorithm has been adopted in a data analysis framework for the generation of similarity maps. Such maps provide an effective tool for the visual exploration of large and multidimensional input spaces. The approach has been applied to data generated during the high-throughput screening of molecular compounds; the generated maps allow a visual exploration of molecules with similar topological properties. The experimental analysis on real-world data from the National Cancer Institute shows the speed-up of the proposed SOM training process in comparison to a traditional approach. The resulting visual landscape groups molecules with similar chemical properties in densely connected regions.
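A minimal SOM whose learning rate and neighbourhood radius follow an annealing-style temperature schedule might look like the following; this is a generic stand-in and does not reproduce the paper's adaptation scheme:

```python
import math
import random

def train_som(data, grid=(4, 4), epochs=20, lr0=0.5, t0=1.0, cooling=0.9,
              seed=0):
    """Minimal 2-D SOM: for each sample, find the best-matching unit (BMU)
    and pull neighbouring weights toward the sample. Learning rate and
    neighbourhood radius both shrink with an annealed temperature t."""
    rng = random.Random(seed)
    dim = len(data[0])
    w = {(i, j): [rng.random() for _ in range(dim)]
         for i in range(grid[0]) for j in range(grid[1])}
    t = t0
    for _ in range(epochs):
        lr, radius = lr0 * t, max(grid) * t       # both decay with temperature
        for x in data:
            bmu = min(w, key=lambda n: sum((a - b) ** 2
                                           for a, b in zip(w[n], x)))
            for n, wn in w.items():
                d2 = (n[0] - bmu[0]) ** 2 + (n[1] - bmu[1]) ** 2
                h = lr * math.exp(-d2 / (2 * radius ** 2))
                for k in range(dim):
                    wn[k] += h * (x[k] - wn[k])   # move toward the sample
        t *= cooling                              # anneal the schedule
    return w
```

Cooling both parameters jointly is what lets training start with broad, coarse ordering and end with fine local adjustment, which is the source of the speed-up the abstract describes.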

Relevance: 80.00%

Abstract:

The results of applying a fragment-based protein tertiary structure prediction method to the prediction of 14 CASP5 target domains are described. The method is based on the assembly of supersecondary structural fragments taken from highly resolved protein structures using a simulated annealing algorithm. A number of good predictions for proteins with novel folds were produced, although not always as the first model. For two fold-recognition targets, FRAGFOLD produced the most accurate model in both cases, despite the fact that the predictions were not based on a template structure. Although clear progress has been made in improving FRAGFOLD since CASP4, the ranking of final models still seems to be the main problem that needs to be addressed before the next CASP experiment.

Relevance: 80.00%

Abstract:

In cooperative communication networks, owing to the nodes' arbitrary geographical locations and individual oscillators, the system is fundamentally asynchronous. This can damage some of the key properties of space-time codes and lead to substantial performance degradation. In this paper, we study the design of linear dispersion codes (LDCs) for such asynchronous cooperative communication networks. First, the concept of conventional LDCs is extended to a delay-tolerant version and new design criteria are discussed. We then propose a new design method that yields delay-tolerant LDCs reaching the optimal Jensen's upper bound on ergodic capacity as well as minimum average pairwise error probability. The proposed design employs a stochastic gradient algorithm to approach a local optimum, and is further improved by simulated-annealing-type optimization to increase the likelihood of reaching the global optimum. The method allows for a flexible number of nodes, receive antennas and modulated symbols, and a flexible codeword length. Simulation results confirm the performance of the newly proposed delay-tolerant LDCs.