982 results for Simulated annealing (Mathematics)


Relevance: 80.00%
Abstract:

Dissertation for the degree of Master in Geological Engineering (Georesources)

Relevance: 80.00%
Abstract:

This thesis presents a literature review of physics-based modelling of power semiconductors, followed by a performance analysis of two stochastic methods, Particle Swarm Optimization (PSO) and Simulated Annealing (SA), when used for the efficient identification of the parameters of physics-based power semiconductor device models. Knowing the values of these parameters for each device is essential for an accurate simulation of the semiconductor's dynamic behaviour. The parameters are extracted step by step during transient simulation and play a relevant role. Another interesting aspect of this thesis is that, in recent years, modelling methods for power devices with high accuracy and low execution time have emerged, based on the Ambipolar Diffusion Equation (ADE) for power diodes and implemented in MATLAB within a formal optimization strategy. The ADE is solved numerically under various injection conditions, and the model is developed and implemented as a subcircuit in the IsSpice simulator. Depletion-layer widths, total device area and doping level, among others, are some of the parameters extracted from the model. Parameter extraction is an important part of model development: the goal of parameter extraction and optimization is to determine the device-model parameter values that minimize the differences between a set of measured characteristics and the results obtained by simulating the device model. This minimization process is often called fitting the model characteristics to measurement data. The implemented algorithm, PSO, is a promising, efficient heuristic optimization technique, recently proposed by Kennedy and Eberhart and based on social behaviour. The proposed techniques are found to be robust and capable of reaching a solution that is both accurate and global. Compared with the SA algorithm already implemented, the performance of the proposed technique was tested using experimental data, extracting the parameters of real devices from measured I-V characteristics. To validate the model, comparisons between the results of the developed model and those of a previously developed model are presented.
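The PSO update rule referred to above (inertia plus cognitive and social pulls) can be sketched as follows. This is a minimal illustration: the quadratic objective stands in for the measured-vs-simulated I-V error, and the bounds and coefficients are hypothetical, not the thesis' diode model.

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Basic particle swarm optimization over box-constrained parameters."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize positions uniformly within the bounds; velocities start at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + cognitive pull (own best) + social pull (swarm best).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective standing in for the measured-vs-simulated fitting error.
best, err = pso_minimize(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
                         [(-5.0, 5.0), (-5.0, 5.0)])
```

In the thesis' setting, `f` would evaluate the device-model simulation against measured I-V data rather than a closed-form function.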

Relevance: 80.00%
Abstract:

The aim of this contribution is to extend the techniques of composite materials design to non-linear material behaviour and to apply them to the design of new materials for passive vibration control. As a first step, a computational tool allowing the determination of macroscopic optimized one-dimensional isolator behaviour was developed. Voigt, Maxwell, standard and more complex material models can be implemented. The objective function considers minimization of the initial reaction and/or displacement peak, as well as minimization of the steady-state amplitude of reaction and/or displacement. The complex stiffness approach is used to formulate the governing equations in an efficient way. Material stiffness parameters are assumed to be non-linear functions of the displacement. The numerical solution is performed in the complex space. The steady-state solution in the complex space is obtained by an iterative process based on the shooting method, which imposes the conditions of periodicity with respect to the known value of the period. An extension of the shooting method to the complex space is presented and verified. The non-linear behaviour of the material parameters is then optimized by a generic probabilistic meta-heuristic, simulated annealing. The dependence of the global optimum on several combinations of the leading parameters of the simulated annealing procedure, such as the neighbourhood definition and the annealing schedule, is also studied and analyzed. The procedure is programmed in the MATLAB environment.
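The two leading parameters studied above, the neighbourhood definition and the annealing schedule, appear explicitly in a generic SA loop. The sketch below uses a geometric cooling schedule and takes the neighbourhood as an argument; the 1-D objective and step size are illustrative assumptions, not the paper's material model.

```python
import math
import random

def simulated_annealing(f, x0, neighbour, t0=1.0, alpha=0.95,
                        steps_per_t=50, t_min=1e-4, seed=0):
    """Generic SA: Metropolis acceptance with a geometric schedule T <- alpha*T."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(steps_per_t):
            y = neighbour(x, rng)       # neighbourhood definition is pluggable
            fy = f(y)
            # Always accept improvements; accept worse moves with
            # Boltzmann probability exp(-(fy - fx) / T).
            if fy <= fx or rng.random() < math.exp((fx - fy) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= alpha                      # annealing schedule
    return best, fbest

# Hypothetical 1-D parameter tuned to minimize a toy response functional.
neighbour = lambda x, rng: x + rng.uniform(-0.5, 0.5)
x_best, f_best = simulated_annealing(lambda x: (x - 3.0) ** 2, 0.0, neighbour)
```

Changing `neighbour` (e.g. the step width) or `alpha` directly reproduces the kind of sensitivity study the paper describes.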

Relevance: 80.00%
Abstract:

Optimization in modern applications is strongly interdisciplinary, reflecting the need to integrate different techniques and paradigms when solving complex real-world problems. Scheduling is a recurrent problem in production planning: whenever a manufacturing order is released, it is necessary to determine which resources will be used and in which sequence the activities will be executed, so as to optimize a given performance measure. Although some companies still address scheduling through simple heuristics, proposals for scheduling systems have become prominent in the literature. This dissertation analyses the performance of optimization techniques, namely meta-heuristics, on complex optimization problems, specifically task scheduling, and in particular the weighted tardiness minimization problem, 1||ΣwjTj. A prototype was therefore developed to support the computational study evaluating the performance of Simulated Annealing (SA) and the Discrete Artificial Bee Colony (DABC). Efficiently solving a problem generally requires applying different methods and tuning their parameters. Parameter tuning can provide greater flexibility and robustness, but requires careful initialization: the parameters can strongly influence the efficiency and effectiveness of the search, so their values should result from a careful experimental effort. In this master's work, Taguchi's design of experiments was used to support the parameterization of the meta-heuristics under analysis. The analysis of the results showed a statistically significant advantage of DABC in performance, but in terms of efficiency SA has the advantage, requiring less computation time.
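The 1||ΣwjTj objective and an SA solver over job permutations can be sketched as follows. The four-job instance is hypothetical, and the dissertation's prototype, the DABC comparison, and the Taguchi-tuned parameters are not reproduced here.

```python
import math
import random

def weighted_tardiness(seq, p, d, w):
    """Objective for 1||sum w_j T_j: T_j = max(0, C_j - d_j) along the sequence."""
    t, total = 0, 0
    for j in seq:
        t += p[j]                       # completion time of job j
        total += w[j] * max(0, t - d[j])
    return total

def sa_schedule(p, d, w, t0=10.0, alpha=0.9, iters=2000, seed=0):
    """SA over job permutations with a random pairwise-swap neighbourhood."""
    rng = random.Random(seed)
    n = len(p)
    seq = list(range(n))
    cost = weighted_tardiness(seq, p, d, w)
    best, best_cost = seq[:], cost
    t = t0
    for _ in range(iters):
        i, j = rng.sample(range(n), 2)
        cand = seq[:]
        cand[i], cand[j] = cand[j], cand[i]   # swap two jobs
        c = weighted_tardiness(cand, p, d, w)
        if c <= cost or rng.random() < math.exp((cost - c) / max(t, 1e-9)):
            seq, cost = cand, c
            if cost < best_cost:
                best, best_cost = seq[:], cost
        t *= alpha
    return best, best_cost

# Hypothetical 4-job instance: processing times, due dates, weights.
p, d, w = [3, 1, 4, 2], [4, 2, 9, 5], [2, 3, 1, 2]
order, cost = sa_schedule(p, d, w)
```

For this instance the sequence [1, 0, 3, 2] attains a total weighted tardiness of 3, which brute force confirms is optimal for these four jobs.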

Relevance: 80.00%
Abstract:

Nowadays the main honey-producing countries require accurate labeling of honey before commercialization, including floral classification. Traditionally, this classification is made by melissopalynology analysis, an accurate but time-consuming task requiring laborious sample pre-treatment and highly skilled technicians. In this work the potential use of a potentiometric electronic tongue for pollinic assessment is evaluated, using monofloral and polyfloral honeys. The results showed that, after splitting honeys according to color (white, amber and dark), the novel methodology enabled quantifying the relative percentage of the main pollens (Castanea sp., Echium sp., Erica sp., Eucalyptus sp., Lavandula sp., Prunus sp., Rubus sp. and Trifolium sp.). Multiple linear regression models were established for each type of pollen, based on the best sensor sub-sets selected using the simulated annealing algorithm. To minimize the overfitting risk, a repeated K-fold cross-validation procedure was implemented, ensuring that at least 10-20% of the honeys were used for internal validation. With this approach, a minimum average determination coefficient of 0.91 ± 0.15 was obtained. The proposed technique also enabled the correct classification of 92% and 100% of monofloral and polyfloral honeys, respectively. The quite satisfactory performance of the novel procedure for quantifying the relative pollen frequency suggests its applicability for honey labeling and geographical origin identification. Nevertheless, this approach is not a full alternative to traditional melissopalynologic analysis; it may be seen as a practical complementary tool for preliminary honey floral classification, leaving only problematic cases for pollinic evaluation.
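The SA-based selection of sensor sub-sets that feeds the regression models can be sketched generically: anneal over subsets, toggling one sensor per move. The score function below is a toy stand-in for cross-validated regression error, and the "informative" sensor set is an assumption made purely for illustration.

```python
import math
import random

def sa_subset_select(n_sensors, score, iters=1500, t0=1.0, alpha=0.995, seed=0):
    """SA over sensor subsets: each move flips one sensor in or out."""
    rng = random.Random(seed)
    subset = set()
    cur = score(subset)
    best, best_val = set(subset), cur
    t = t0
    for _ in range(iters):
        cand = set(subset)
        s = rng.randrange(n_sensors)
        cand.symmetric_difference_update({s})   # toggle membership of one sensor
        val = score(cand)
        if val <= cur or rng.random() < math.exp((cur - val) / max(t, 1e-12)):
            subset, cur = cand, val
            if cur < best_val:
                best, best_val = set(subset), cur
        t *= alpha
    return best, best_val

# Toy score standing in for cross-validated regression error: the assumed
# informative sensors {0, 2, 5} reduce the error; extra sensors add a penalty.
informative = {0, 2, 5}
score = lambda s: 3 - len(s & informative) + 0.1 * len(s - informative)
chosen, err = sa_subset_select(10, score)
```

In the paper's setting, `score` would refit the multiple linear regression on the candidate sub-set and return its cross-validated error.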

Relevance: 80.00%
Abstract:

Olive oils may be commercialized as intense, medium or light, according to the intensity perception of fruitiness, bitterness and pungency attributes, assessed by a sensory panel. In this work, the capability of an electronic tongue to correctly classify olive oils according to the sensory intensity perception levels was evaluated. Cross-sensitivity and non-specific lipid polymeric membranes were used as sensors. The sensor device was first tested using quinine monohydrochloride standard solutions. Mean sensitivities of 14±2 to 25±6 mV/decade, depending on the type of plasticizer used in the lipid membranes, were obtained, showing the device's capability for evaluating bitterness. Then, linear discriminant models based on sub-sets of sensors, selected by a meta-heuristic simulated annealing algorithm, were established, enabling the correct classification of 91% of olive oils according to their intensity sensory grade (leave-one-out cross-validation procedure). This capability was further evaluated using a repeated K-fold cross-validation procedure, showing that the electronic tongue allowed an average correct classification of 80% of the olive oils used for internal validation. The electronic tongue can thus be seen as a taste sensor, allowing the differentiation of olive oils with different sensory intensities, and could be used as a preliminary, complementary and practical tool for panelists during olive oil sensory analysis.

Relevance: 80.00%
Abstract:

Olive oil quality grading is traditionally assessed by human sensory evaluation of positive and negative attributes (olfactory, gustatory, and final olfactory-gustatory sensations). However, it is not guaranteed that trained panelists can correctly classify monovarietal extra-virgin olive oils according to olive cultivar. In this work, the potential application of human (sensory panelists) and artificial (electronic tongue) sensory evaluation of olive oils was studied, aiming to discriminate eight single-cultivar extra-virgin olive oils. Linear discriminant, partial least squares discriminant, and sparse partial least squares discriminant analyses were evaluated. The best predictive classification was obtained using linear discriminant analysis with the simulated annealing selection algorithm. A low-level data-fusion approach (18 electronic tongue signals and nine sensory attributes) enabled 100% leave-one-out cross-validation correct classification, improving on the discrimination capability of the individual use of sensor profiles or sensory attributes (70% and 57% leave-one-out correct classifications, respectively). Human sensory evaluation and electronic tongue analysis may thus be used as complementary tools, allowing successful monovarietal olive oil discrimination.

Relevance: 80.00%
Abstract:

Natural mineral waters (still), effervescent natural mineral waters (sparkling) and aromatized waters with fruit flavors (still or sparkling) are an emerging market. In this work, the capability of a potentiometric electronic tongue, comprising lipid polymeric membranes, to quantitatively estimate routine physicochemical quality parameters (pH and conductivity), as well as to qualitatively classify water samples according to the type of water, was evaluated. The study showed that a linear discriminant model, based on 21 sensors selected by the simulated annealing algorithm, could correctly classify 100% of the water samples (leave-one-out cross-validation). This potential was further demonstrated by applying a repeated K-fold cross-validation (guaranteeing that at least 15% of independent samples were used only for internal validation), for which 96% of correct classifications were attained. The satisfactory recognition performance of the E-tongue could be attributed to the pH, conductivity, sugar and organic acid contents of the studied waters, which resulted in significant differences in sweetness perception indexes and total acid flavor. Moreover, the E-tongue combined with multivariate linear regression models, based on sub-sets of sensors selected by the simulated annealing algorithm, could accurately estimate the waters' pH (25 sensors: R² equal to 0.99 and 0.97 for leave-one-out and repeated K-fold cross-validation, respectively) and conductivity (23 sensors: R² equal to 0.997 and 0.99 for leave-one-out and repeated K-fold cross-validation, respectively). The overall satisfactory results thus allow envisaging a potential future application of electronic tongue devices for bottled water analysis and classification.

Relevance: 80.00%
Abstract:

Keywords: Hardware-Software Co-Design, Simulated Annealing, Real-Time Image Processing, Automated Hardware-Software Partitioning

Relevance: 80.00%
Abstract:

One of the greatest threats to biological diversity is habitat loss, so one alternative for protecting biodiversity is reserve selection, using optimization procedures to establish priority areas for conservation. In this study, a simulated annealing algorithm was used to examine how the periphery of species distributions influences the selection of areas in the Cerrado for the conservation of 131 anuran amphibian species. Two data sets were analysed, one containing the original species distributions and another excluding the periphery of the distributions. The optimal networks found from the original distributions contained 17 grid cells, while those found from the restricted distributions were larger, with 22 cells. Cells with a high degree of irreplaceability were retained in all networks, and new regions of replaceable cells, located at the edge of the biome, emerged when only the reduced distributions were used.
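Reserve selection of this kind can be framed as a penalized set-cover problem and handed to simulated annealing: minimize the number of selected cells while penalizing every unrepresented species. The five-cell, four-species grid below is hypothetical, not the Cerrado data.

```python
import math
import random

def sa_reserve(species_by_cell, n_species, iters=3000, t0=2.0, alpha=0.998, seed=0):
    """SA for reserve selection: minimize selected cells subject to full
    species coverage, enforced via a heavy penalty per missing species."""
    rng = random.Random(seed)
    n_cells = len(species_by_cell)

    def cost(sel):
        covered = set()
        for c in sel:
            covered |= species_by_cell[c]
        # One unit per selected cell plus a penalty for each species left out.
        return len(sel) + 10 * (n_species - len(covered))

    sel = set(range(n_cells))           # start from the full network
    cur = cost(sel)
    best, best_cost = set(sel), cur
    t = t0
    for _ in range(iters):
        cand = set(sel)
        c = rng.randrange(n_cells)
        cand.symmetric_difference_update({c})   # add or drop one cell
        val = cost(cand)
        if val <= cur or rng.random() < math.exp((cur - val) / max(t, 1e-12)):
            sel, cur = cand, val
            if cur < best_cost:
                best, best_cost = set(sel), cur
        t *= alpha
    return best, best_cost

# Hypothetical grid: each cell hosts a subset of 4 species (ids 0-3).
cells = [{0, 1}, {1, 2}, {2, 3}, {0, 3}, {1}]
reserve, c = sa_reserve(cells, 4)
```

Irreplaceability in the study's sense corresponds to cells that appear in every low-cost solution of this search.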

Relevance: 80.00%
Abstract:

Na,K-ATPase is the main active transport system that maintains the large gradients of Na(+) and K(+) across the plasma membrane of animal cells. The crystal structure of a K(+)-occluding conformation of this protein has recently been published, but the movements of its different domains allowing for the cation-pumping mechanism are not yet known. The structure of many more conformations is known for the related calcium ATPase SERCA, but the reliability of homology modeling is poor for several domains with low sequence identity, in particular the extracellular loops. To better define the structure of the large fourth extracellular loop between the seventh and eighth transmembrane segments of the alpha subunit, we studied the formation of a disulfide bond between pairs of cysteine residues introduced by site-directed mutagenesis into the second and the fourth extracellular loops. We found a specific pair of cysteine positions (Y308C and D884C) for which extracellular treatment with an oxidizing agent inhibited the Na,K-pump function, which could be rapidly restored by a reducing agent. The formation of the disulfide bond occurred preferentially under the E2-P conformation of Na,K-ATPase, in the absence of extracellular cations. Using the recently published crystal structure and a distance constraint reproducing the existence of the disulfide bond, we performed an extensive conformational space search using simulated annealing and showed that the Tyr(308) and Asp(884) residues can be in close proximity while, simultaneously, the SYGQ motif of the fourth extracellular loop, known to interact with the extracellular domain of the beta subunit, remains exposed to the exterior of the protein and can easily interact with the beta subunit.

Relevance: 80.00%
Abstract:

An ab initio structure prediction approach adapted to the peptide-major histocompatibility complex (MHC) class I system is presented. Based on structure comparisons of a large set of peptide-MHC class I complexes, a molecular dynamics protocol is proposed using simulated annealing (SA) cycles to sample the conformational space of the peptide in its fixed MHC environment. A set of 14 peptide-human leukocyte antigen (HLA) A0201 and 27 peptide-non-HLA A0201 complexes for which X-ray structures are available is used to test the accuracy of the prediction method. For each complex, 1000 peptide conformers are obtained from the SA sampling. A graph-theory clustering algorithm based on heavy-atom root-mean-square deviation (RMSD) values is applied to the sampled conformers. The clusters are ranked using cluster size and mean effective or conformational free energies, with solvation free energies computed using the Generalized Born MV 2 (GB-MV2) and Poisson-Boltzmann (PB) continuum models. The final conformation is chosen as the center of the best-ranked cluster. With conformational free energies, the overall prediction success is 83% using a 1.00 Å crystal RMSD criterion for main-chain atoms, and 76% using a 1.50 Å RMSD criterion for heavy atoms. The prediction success is even higher for the set of 14 peptide-HLA A0201 complexes: 100% of the peptides have main-chain RMSD values ≤ 1.00 Å and 93% have heavy-atom RMSD values ≤ 1.50 Å. This structure prediction method can be applied to complexes of natural or modified antigenic peptides in their MHC environment, with the aim of performing rational structure-based optimizations of tumor vaccines.

Relevance: 80.00%
Abstract:

The integration of geophysical data into the subsurface characterization problem has been shown in many cases to significantly improve hydrological knowledge by providing information at spatial scales and locations that is unattainable using conventional hydrological measurement techniques. The investigation of exactly how much benefit can be brought by geophysical data in terms of its effect on hydrological predictions, however, has received considerably less attention in the literature. Here, we examine the potential hydrological benefits brought by a recently introduced simulated annealing (SA) conditional stochastic simulation method designed for the assimilation of diverse hydrogeophysical data sets. We consider the specific case of integrating crosshole ground-penetrating radar (GPR) and borehole porosity log data to characterize the porosity distribution in saturated heterogeneous aquifers. In many cases, porosity is linked to hydraulic conductivity and thus to flow and transport behavior. To perform our evaluation, we first generate a number of synthetic porosity fields exhibiting varying degrees of spatial continuity and structural complexity. Next, we simulate the collection of crosshole GPR data between several boreholes in these fields, and the collection of porosity log data at the borehole locations. The inverted GPR data, together with the porosity logs, are then used to reconstruct the porosity field using the SA-based method, along with a number of other more elementary approaches. Assuming that the grid-cell-scale relationship between porosity and hydraulic conductivity is unique and known, the porosity realizations are then used in groundwater flow and contaminant transport simulations to assess the benefits and limitations of the different approaches.
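The core idea of SA-based conditional simulation can be sketched in a deliberately simplified 1-D analogue: permute values at non-conditioned cells to match a target spatial statistic, so the histogram and the conditioning ("borehole") data are honoured by construction. The lag-1 correlation target and the synthetic porosity profile are illustrative assumptions, far simpler than the method the paper evaluates.

```python
import math
import random

def lag1_corr(x):
    """Lag-1 spatial autocorrelation of a 1-D profile."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1))
    den = sum((v - m) ** 2 for v in x)
    return num / den

def sa_sim(x0, target, fixed, iters=4000, t0=0.1, alpha=0.999, seed=0):
    """SA conditional simulation sketch: swap values at non-conditioned cells
    to match a target lag-1 correlation. Moves only permute values, so the
    histogram is preserved, and conditioning cells are never touched."""
    rng = random.Random(seed)
    free = [i for i in range(len(x0)) if i not in fixed]
    x = list(x0)
    cur = (lag1_corr(x) - target) ** 2
    t = t0
    for _ in range(iters):
        i, j = rng.sample(free, 2)
        x[i], x[j] = x[j], x[i]
        val = (lag1_corr(x) - target) ** 2
        if val <= cur or rng.random() < math.exp((cur - val) / t):
            cur = val
        else:
            x[i], x[j] = x[j], x[i]     # undo rejected swap
        t *= alpha
    return x, cur

# Synthetic porosity histogram, shuffled, with two "borehole" conditioning cells.
rng0 = random.Random(42)
x0 = [0.05 + 0.015 * k for k in range(20)]
rng0.shuffle(x0)
cond = {0, 19}
field, mismatch = sa_sim(x0, target=0.5, fixed=cond)
```

The real method conditions on inverted crosshole GPR tomograms and porosity logs and matches far richer spatial statistics, but the accept/reject structure is the same.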

Relevance: 80.00%
Abstract:

The system described herein represents the first example of a recommender system in digital ecosystems where agents negotiate services on behalf of small companies. The small companies compete not only on price or quality, but through a wider service-by-service composition, subcontracting with other companies. The final result of these offerings depends on negotiations at the scale of millions of small companies. This scale requires new platforms for supporting digital business ecosystems, as well as related services like open-id, trust management, monitors and recommenders. This is done in the Open Negotiation Environment (ONE), an open-source platform that allows agents, on behalf of small companies, to negotiate and use the ecosystem services, and enables the development of new agent technologies. The methods and tools of cyber engineering are necessary to build Open Negotiation Environments that are stable, a basic condition for predictable and reliable business environments. Aiming to build stable digital business ecosystems by means of improved collective intelligence, we introduce a model of negotiation-style dynamics from the point of view of computational ecology. This model inspires an ecosystem monitor as well as a novel negotiation-style recommender. The ecosystem monitor provides hints to the negotiation-style recommender to achieve greater stability of an open negotiation environment in a digital business ecosystem. The greater stability provides the small companies with higher predictability, and therefore better business results. The negotiation-style recommender is implemented with a simulated annealing algorithm at a constant temperature, and its impact is shown by applying it to a real case of an open negotiation environment populated by Italian companies.
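Simulated annealing at a constant temperature, as used by the recommender, reduces to a Metropolis walk that never freezes, which keeps exploring even when the objective drifts over time. A minimal sketch follows; the negotiation styles and their "instability" costs are hypothetical placeholders, not the ONE platform's model.

```python
import math
import random

def metropolis_constant_t(cost, states, neighbour, t=0.5, iters=2000, seed=0):
    """SA at constant temperature: a Metropolis walk over discrete states that
    keeps exploring instead of converging to a frozen configuration."""
    rng = random.Random(seed)
    x = states[0]
    cx = cost(x)
    best, best_cost = x, cx
    for _ in range(iters):
        y = neighbour(x, rng)
        cy = cost(y)
        # Fixed-T Metropolis acceptance: worse states stay reachable forever.
        if cy <= cx or rng.random() < math.exp((cx - cy) / t):
            x, cx = y, cy
            if cx < best_cost:
                best, best_cost = x, cx
    return best, best_cost

# Hypothetical discrete negotiation styles with illustrative instability costs.
styles = ["aggressive", "cooperative", "neutral", "concessive"]
instability = {"aggressive": 3.0, "cooperative": 0.5,
               "neutral": 1.0, "concessive": 2.0}
pick = lambda x, rng: rng.choice(styles)
style, c = metropolis_constant_t(lambda s: instability[s], styles, pick)
```

Because the temperature never decreases, the walk can track a changing cost landscape, which is the property that matters in a live ecosystem monitor.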

Relevance: 80.00%
Abstract:

Accurate characterization of the spatial distribution of hydrological properties in heterogeneous aquifers at a range of scales is a key prerequisite for reliable modeling of subsurface contaminant transport, and is essential for designing effective and cost-efficient groundwater management and remediation strategies. To this end, high-resolution geophysical methods have shown significant potential to bridge a critical gap in subsurface resolution and coverage between traditional hydrological measurement techniques such as borehole log/core analyses and tracer or pumping tests. An important and still largely unresolved issue, however, is how to best quantitatively integrate geophysical data into a characterization study in order to estimate the spatial distribution of one or more pertinent hydrological parameters, thus improving hydrological predictions. Recognizing the importance of this issue, the aim of the research presented in this thesis was to first develop a strategy for the assimilation of several types of hydrogeophysical data having varying degrees of resolution, subsurface coverage, and sensitivity to the hydrologic parameter of interest. In this regard a novel simulated annealing (SA)-based conditional simulation approach was developed and then tested in its ability to generate realizations of porosity given crosshole ground-penetrating radar (GPR) and neutron porosity log data. This was done successfully for both synthetic and field data sets. A subsequent issue that needed to be addressed involved assessing the potential benefits and implications of the resulting porosity realizations in terms of groundwater flow and contaminant transport. This was investigated synthetically assuming first that the relationship between porosity and hydraulic conductivity was well-defined. Then, the relationship was itself investigated in the context of a calibration procedure using hypothetical tracer test data. Essentially, the relationship best predicting the observed tracer test measurements was determined given the geophysically derived porosity structure. Both of these investigations showed that the SA-based approach, in general, allows much more reliable hydrological predictions than other more elementary techniques considered. Further, the developed calibration procedure was seen to be very effective, even at the scale of tomographic resolution, for predictions of transport. This also held true at locations within the aquifer where only geophysical data were available. This is significant because the acquisition of hydrological tracer test measurements is clearly more complicated and expensive than the acquisition of geophysical measurements. Although the above methodologies were tested using porosity logs and GPR data, the findings are expected to remain valid for a large number of pertinent combinations of geophysical and borehole log data of comparable resolution and sensitivity to the hydrological target parameter. Moreover, the obtained results allow us to have confidence for future developments in integration methodologies for geophysical and hydrological data to improve the 3-D estimation of hydrological properties.