849 results for real genetic algorithm


Relevance:

80.00%

Publisher:

Abstract:

Heterogeneous materials are ubiquitous in nature and as synthetic materials. These materials provide unique combinations of desirable mechanical properties that emerge from their heterogeneities at different length scales. Future structural and technological applications will require the development of advanced lightweight materials with superior strength and toughness. Cost-effective design of advanced high-performance synthetic materials by tailoring their microstructure is the challenge facing the materials design community, and prior knowledge of structure-property relationships for these materials is imperative for optimal design. Understanding such relationships for heterogeneous materials is therefore of primary interest. Furthermore, computational burden is becoming a critical concern in several areas of heterogeneous materials design, so computationally efficient and accurate predictive tools are essential. The present study focuses mainly on the mechanical behavior of soft cellular materials and of a tough biological material, the mussel byssus thread. Cellular materials exhibit microstructural heterogeneity through an interconnected network of a single material phase, whereas the mussel byssus thread comprises two distinct material phases. A robust numerical framework is developed to investigate the micromechanisms behind the macroscopic response of both materials. Using this framework, the effect of microstructural parameters on the stress state of cellular specimens during a split Hopkinson pressure bar test has been addressed. A Voronoi tessellation-based algorithm has been developed to simulate the cellular microstructure. The micromechanisms (microinertia, microbuckling, and microbending) governing the macroscopic behavior of cellular solids are investigated thoroughly with respect to various microstructural and loading parameters. To understand the origin of the high toughness of the mussel byssus thread, a Genetic Algorithm (GA) based optimization framework has been developed. It is found that the two different material phases (collagens) of the mussel byssus thread are optimally distributed along the thread. These applications demonstrate that the presence of heterogeneity in a system demands substantial computational resources for simulation and modeling. Thus, a Higher Dimensional Model Representation (HDMR) based surrogate modeling concept has been proposed to reduce computational complexity. The applicability of this methodology has been demonstrated in failure envelope construction and in multiscale finite element techniques. It is observed that the surrogate-based model can capture the behavior of complex material systems with sufficient accuracy. The computational algorithms presented in this thesis pave the way for accurate prediction of the macroscopic deformation behavior of various classes of advanced materials from their measurable microstructural features at a reasonable computational cost.
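Below is a minimal sketch of the kind of GA framework described above: a binary chromosome encodes which of the two collagen phases occupies each segment along the thread, and the fitness function stands in for the thesis's micromechanical model. The toughness() function and all parameters are hypothetical placeholders.

```python
import random

N_SEGMENTS, POP, GENS = 40, 60, 100

def toughness(phases):
    # Hypothetical surrogate: reward a gradient from compliant (0) segments at
    # one end to stiff (1) segments at the other, penalize phase switching.
    grad = sum(p * i / N_SEGMENTS for i, p in enumerate(phases))
    switches = sum(a != b for a, b in zip(phases, phases[1:]))
    return grad - 0.1 * switches

def evolve():
    pop = [[random.randint(0, 1) for _ in range(N_SEGMENTS)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=toughness, reverse=True)
        parents = pop[: POP // 2]
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_SEGMENTS)       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:                   # bit-flip mutation
                j = random.randrange(N_SEGMENTS)
                child[j] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=toughness)

print("best phase layout:", "".join(map(str, evolve())))
```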

Relevance:

80.00%

Publisher:

Abstract:

With proper application of Best Management Practices (BMPs), the impact of sediment on water bodies can be minimized. However, finding the optimal allocation of BMPs can be difficult, since there are numerous possible options. Economics also plays an important role in BMP affordability and, therefore, in the number of BMPs that can be placed in a given budget year. In this study, two methodologies are presented for determining the optimal cost-effective BMP allocation, by coupling a watershed-level model, the Soil and Water Assessment Tool (SWAT), with two different methods: targeting and a multi-objective genetic algorithm (Non-dominated Sorting Genetic Algorithm II, NSGA-II). For demonstration, both methodologies were applied to an agriculture-dominated watershed in Lower Michigan to find the optimal allocation of filter strips and grassed waterways. For targeting, three different criteria were investigated for sediment yield minimization; in the process, it was found that grassed waterways near the watershed outlet reduced the watershed outlet sediment yield the most under the study conditions. Cost minimization was also included as a second objective during the cost-effective BMP allocation selection. NSGA-II was used to find the optimal BMP allocation for both sediment yield reduction and cost minimization. Comparing the results and computational time of both methodologies, targeting was determined to be the better method for finding the optimal cost-effective BMP allocation under the study conditions, since it provided more than 13 times as many solutions with better fitness for the objective functions while using less than one eighth of the SWAT computational time required by NSGA-II with 150 generations.
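As an illustration of the NSGA-II side of the comparison, the sketch below implements the non-dominated sorting step at the core of the algorithm, applied to hypothetical (sediment yield, cost) pairs for candidate BMP allocations; both objectives are minimized. Crowding distance and the genetic operators are omitted.

```python
def dominates(a, b):
    """True if solution a is at least as good in all objectives and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    fronts, remaining = [], list(range(len(points)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i]) for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# hypothetical (sediment yield in t/yr, cost in $1000) for six BMP allocations
objs = [(12.0, 80), (9.5, 120), (15.0, 60), (9.0, 200), (12.5, 75), (20.0, 50)]
for rank, front in enumerate(non_dominated_sort(objs), 1):
    print(f"front {rank}: {[objs[i] for i in front]}")
```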

Relevance:

80.00%

Publisher:

Abstract:

An invisibility cloak is a device that can hide a target by enclosing it and shielding it from incident radiation. This intriguing device has attracted a great deal of attention since it was first implemented at microwave frequencies in 2006. However, the problems of existing cloak designs prevent them from being widely applied in practice. In this dissertation, we try to remove or alleviate three constraints on practical applications: lossy cloaking media, high implementation complexity, and the small size of hidden objects relative to the incident wavelength. To facilitate cloak design and experimental characterization, several devices and relevant techniques for measuring the complex permittivity of dielectric materials at microwave frequencies are developed. In particular, a unique parallel-plate waveguide chamber has been set up to automatically map the electromagnetic (EM) field distribution for waves propagating through resonator arrays and cloaking structures. The total scattering cross section of the cloaking structures was derived from the scattered field measured with this apparatus. To overcome the adverse effects of lossy cloaking media, microwave cloaks composed of identical dielectric resonators made of low-loss ceramic materials are designed and implemented. The effective permeability dispersion is provided by tailoring the dielectric resonator filling fractions. The cloak performance has been verified by full-wave simulation of true multi-resonator structures and by experimental measurements of the fabricated prototypes. With the aim of reducing the implementation complexity caused by the use of metamaterials for cloaking, we propose to design 2-D cylindrical cloaks and 3-D spherical cloaks using multi-layer coatings of ordinary dielectric materials (εr > 1). A genetic algorithm is employed to optimize the dielectric profiles of the cloaking shells so as to minimize the scattering cross sections of the cloaked targets. The designed cloaks can easily be scaled to various operating frequencies. The simulation results show that the multi-layer cylindrical cloak substantially outperforms a similarly sized metamaterial-based cloak designed using transformation-optics-based reduced parameters. For the designed spherical cloak, the simulated scattering pattern shows that the total scattering cross section is greatly reduced, and that the scattering in specific directions can be reduced even further. It is shown that the cloaking efficiency for larger targets can be improved by employing lossy materials in the shell. Finally, we propose to hide a target inside a waveguide structure filled with epsilon-near-zero materials only, which are easy to implement in practice. The cloaking efficiency of this method, which was found to increase for larger targets, has been confirmed both theoretically and by simulation.
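A hedged sketch of the GA search over multi-layer dielectric shells follows. The scattering_cross_section() function is a toy stand-in (the dissertation evaluates the actual scattering cross section with full-wave calculations); only the optimization loop mirrors the described approach, with layer permittivities constrained to εr > 1.

```python
import random

N_LAYERS = 8
EPS_MIN, EPS_MAX = 1.05, 10.0   # eps_r > 1, as required above

def scattering_cross_section(eps):
    # Toy surrogate standing in for a full-wave solver: smooth, low-contrast
    # profiles score lower.
    jumps = sum((a - b) ** 2 for a, b in zip(eps, eps[1:]))
    return jumps + 0.05 * sum(eps)

def ga(pop_size=40, gens=200, mut=0.2):
    pop = [[random.uniform(EPS_MIN, EPS_MAX) for _ in range(N_LAYERS)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=scattering_cross_section)
        survivors = pop[: pop_size // 2]
        pop = [s[:] for s in survivors]
        while len(pop) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [random.choice(g) for g in zip(a, b)]   # uniform crossover
            if random.random() < mut:                       # Gaussian mutation
                j = random.randrange(N_LAYERS)
                child[j] = min(EPS_MAX, max(EPS_MIN, child[j] + random.gauss(0, 0.5)))
            pop.append(child)
    return min(pop, key=scattering_cross_section)

print("optimized permittivity profile:", [round(e, 2) for e in ga()])
```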

Relevance:

80.00%

Publisher:

Abstract:

The aim of this paper is to evaluate the diagnostic contribution of various types of texture features to the discrimination of hepatic tissue in abdominal non-enhanced Computed Tomography (CT) images. Regions of Interest (ROIs) corresponding to the classes normal liver, cyst, hemangioma, and hepatocellular carcinoma were drawn by an experienced radiologist. For each ROI, five distinct sets of texture features were extracted using First Order Statistics (FOS), the Spatial Gray Level Dependence Matrix (SGLDM), the Gray Level Difference Method (GLDM), Laws' Texture Energy Measures (TEM), and Fractal Dimension Measurements (FDM). In order to evaluate the ability of the texture features to discriminate the various types of hepatic tissue, each set of texture features, or its reduced version after genetic algorithm-based feature selection, was fed to a feed-forward Neural Network (NN) classifier. For each NN, the area under the Receiver Operating Characteristic (ROC) curve (Az) was calculated for all one-vs-all discriminations of hepatic tissue. Additionally, the total Az for the multi-class discrimination task was estimated. The results show that features derived from FOS perform better than the other texture features (total Az: 0.802±0.083) in the discrimination of hepatic tissue.
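For concreteness, the sketch below computes a typical FOS feature set from the gray-level histogram of an ROI. The exact features used in the paper may differ; these five are common FOS choices, and the ROI here is random placeholder data.

```python
import numpy as np

def fos_features(roi):
    """First Order Statistics computed from an ROI's gray levels."""
    x = roi.astype(float).ravel()
    mean, var = x.mean(), x.var()
    std = x.std() or 1.0
    skew = ((x - mean) ** 3).mean() / std ** 3
    kurt = ((x - mean) ** 4).mean() / std ** 4 - 3.0
    counts = np.bincount(roi.ravel().astype(np.int64), minlength=256)
    p = counts / counts.sum()
    p = p[p > 0]                                   # histogram probabilities
    entropy = float(-(p * np.log2(p)).sum())
    return {"mean": mean, "variance": var, "skewness": skew,
            "kurtosis": kurt, "entropy": entropy}

roi = np.random.randint(0, 256, size=(32, 32))     # stand-in for a liver ROI
print(fos_features(roi))
```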

Relevance:

80.00%

Publisher:

Abstract:

In this paper, a computer-aided diagnostic (CAD) system for the classification of hepatic lesions from computed tomography (CT) images is presented. Regions of interest (ROIs) taken from nonenhanced CT images of normal liver, hepatic cysts, hemangiomas, and hepatocellular carcinomas have been used as input to the system. The proposed system consists of two modules: the feature extraction module and the classification module. The feature extraction module calculates the average gray level and 48 texture characteristics derived from the spatial gray-level co-occurrence matrices obtained from the ROIs. The classification module consists of three sequentially placed feed-forward neural networks (NNs). The first NN classifies regions as normal or pathological liver. Pathological liver regions are characterized by the second NN as cyst or "other disease." The third NN classifies "other disease" as hemangioma or hepatocellular carcinoma. Three feature selection techniques have been applied to each individual NN: sequential forward selection, sequential floating forward selection, and a genetic algorithm for feature selection. A comparative study of these dimensionality reduction methods shows that genetic algorithms result in lower-dimensional feature vectors and improved classification performance.
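The genetic-algorithm feature selection named above can be sketched as a wrapper around the NN classifier: a binary chromosome masks the 49 features (average gray level plus 48 co-occurrence features), and fitness is cross-validated accuracy penalized by feature count. The data here are random placeholders, not liver ROIs, and the GA settings are illustrative.

```python
import random
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 49))        # placeholder feature vectors
y = rng.integers(0, 2, size=120)      # placeholder binary labels

def fitness(mask):
    idx = [i for i, m in enumerate(mask) if m]
    if not idx:
        return -1.0
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=300)
    acc = cross_val_score(clf, X[:, idx], y, cv=3).mean()
    return acc - 0.002 * len(idx)     # prefer smaller feature subsets

pop = [[random.randint(0, 1) for _ in range(49)] for _ in range(10)]
for _ in range(5):                    # a few generations, for illustration
    pop.sort(key=fitness, reverse=True)
    elite = pop[:5]
    pop = elite + [[g if random.random() > 0.05 else 1 - g   # bit-flip mutation
                    for g in random.choice(elite)] for _ in range(5)]
best = max(pop, key=fitness)
print("selected features:", [i for i, m in enumerate(best) if m])
```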

Relevance:

80.00%

Publisher:

Abstract:

A viscosity model calibrated on process data is proposed and applied to predict the viscosity of a polyamide 12 (PA12) melt as a function of time, temperature, and shear rate. In a first step, the viscosity model was derived from experimental data. It is based mainly on the three-parameter Carreau approach, supplemented with two additional shift factors. The temperature dependence of the viscosity is accounted for by the Arrhenius shift factor aT. A further shift factor aSC (Structural Change) is introduced, which describes the structural change of PA12 caused by the processing conditions during laser sintering. This structural change was observed as a significant increase in viscosity. It was concluded that this viscosity increase is attributable to a build-up of molecular weight and can be understood as post-condensation. Depending on the time and temperature conditions, the viscosity was found to approach an irreversible limit exponentially as a consequence of the molecular weight build-up. The rate of this post-condensation is time- and temperature-dependent. It is assumed that the powder bed temperature causes molecular weight build-up and hence chain extension. This progressive increase in chain length reduces molecular mobility and suppresses further post-condensation. The shift factor aSC expresses this physico-chemical picture and contains two additional parameters: aSC,UL corresponds to the upper viscosity limit, whereas k0 specifies the rate of structural change. It was further found to be useful to distinguish between a flow activation energy and a structural-change activation energy in the calculation of aT and aSC. The model parameters were optimized using a genetic algorithm. Good agreement was found between calculated and measured viscosities, so the viscosity model is able to predict the viscosity of a PA12 melt subjected to the combined time and temperature influences of laser sintering. In a second step, the model was applied to calculate the viscosity during the laser sintering process as a function of energy density. For this purpose, process data such as melt temperature and exposure time, measured on-line with a high-speed thermographic camera, were used. Finally, the influence of the structural change on the viscosity level in the process was demonstrated.
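A worked sketch of the model, assembled from the description above: a three-parameter Carreau base curve, the Arrhenius shift factor aT, and a structural-change factor aSC that rises exponentially from 1 toward the upper limit aSC,UL at a temperature-dependent rate governed by k0. The exact functional forms and all parameter values here are assumptions for illustration, not the fitted ones; note the separate flow and structural-change activation energies, as the abstract suggests.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def a_T(T, T0=473.15, E_flow=60e3):
    """Arrhenius shift factor for the temperature dependence of viscosity."""
    return math.exp(E_flow / R * (1.0 / T - 1.0 / T0))

def a_SC(t, T, a_UL=5.0, k0=1e-4, E_sc=80e3, T0=473.15):
    """Structural-change factor: rises from 1 toward the upper limit a_UL."""
    k = k0 * math.exp(-E_sc / R * (1.0 / T - 1.0 / T0))  # faster at higher T
    return a_UL - (a_UL - 1.0) * math.exp(-k * t)

def viscosity(gamma_dot, T, t, A=2000.0, B=0.05, C=0.6):
    """Three-parameter Carreau curve A/(1+B*gd)^C, shifted by a_T and a_SC."""
    shift = a_T(T) * a_SC(t, T)
    return shift * A / (1.0 + B * shift * gamma_dot) ** C

# melt at 190 C after one hour in the powder bed, sheared at 100 1/s
print(viscosity(gamma_dot=100.0, T=463.15, t=3600.0))
```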

Relevance:

80.00%

Publisher:

Abstract:

Ventricular assist devices (VADs) are blood pumps that offer an option to support the circulation of patients with severe heart failure. Since a failing heart retains some pump function, its interaction with the VAD influences the hemodynamics. Ideally, the heart's action is taken into account when actuating the device, such that the device is synchronized to the natural cardiac cycle. To realize this in practice, a reliable real-time algorithm for the automatic synchronization of the VAD to the heart rate is required. This paper defines the tasks such an algorithm needs to fulfill: the automatic detection of irregular heart beats and the feedback control of the phase shift between the systolic phases of the heart and the assist device. We demonstrate a possible solution to these problems and analyze its performance in two steps. First, the algorithm is tested using the MIT-BIH arrhythmia database. Second, the algorithm is implemented in a controller for a pulsatile and a continuous-flow VAD. These devices are connected to a hybrid mock circulation where three test scenarios are evaluated. The proposed algorithm ensures a reliable synchronization of the VAD to the cardiac cycle while remaining insensitive to irregularities in the heart rate.
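A minimal sketch of the two tasks defined above: flagging irregular beats from the R-R interval series and proportionally steering the VAD trigger delay toward a target phase shift. Thresholds and gains are illustrative, not the paper's.

```python
def irregular_beats(rr_ms, tol=0.2):
    """Flag beats whose R-R interval deviates more than tol from a running mean."""
    flags, mean = [], rr_ms[0]
    for rr in rr_ms:
        flags.append(abs(rr - mean) > tol * mean)
        if not flags[-1]:
            mean = 0.9 * mean + 0.1 * rr   # update the mean on regular beats only
    return flags

def phase_controller(measured_shift, target_shift, delay, k_p=0.5):
    """Proportional update of the VAD trigger delay (all times in ms)."""
    return delay + k_p * (target_shift - measured_shift)

rr = [800, 810, 790, 1200, 805, 795]       # ms; one ectopic-like interval
print(irregular_beats(rr))
print(phase_controller(measured_shift=120, target_shift=50, delay=100))
```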

Relevance:

80.00%

Publisher:

Abstract:

Two new approaches to quantitatively analyze diffuse diffraction intensities from faulted layer stacking are reported. The parameters of a probability-based growth model are determined with two iterative global optimization methods: a genetic algorithm (GA) and particle swarm optimization (PSO). The results are compared with those from a third global optimization method, a differential evolution (DE) algorithm [Storn & Price (1997). J. Global Optim. 11, 341–359]. The algorithm efficiencies in the early and late stages of iteration are compared. The accuracy of the optimized parameters improves with increasing size of the simulated crystal volume. The wall-clock time for computing quite large crystal volumes can be kept within reasonable limits by the parallel calculation of many crystals (clones) generated for each model parameter set on a super- or grid computer. The faulted layer stacking in single crystals of trigonal three-pointed-star-shaped tris(bicyclo[2.1.1]hexeno)benzene molecules serves as an example for the numerical computations. Based on numerical values of seven model parameters (reference parameters), nearly noise-free reference intensities of 14 diffuse streaks were simulated from 1280 clones, each consisting of 96 000 layers (reference crystal). The parameters derived from the reference intensities with GA, PSO and DE were compared with the original reference parameters as a function of the simulated total crystal volume. The statistical distribution of structural motifs in the simulated crystals is in good agreement with that in the reference crystal. The results found with the growth model for layer stacking disorder are applicable to other disorder types and modeling techniques, Monte Carlo in particular.
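Since the DE algorithm of Storn & Price is the best documented of the three optimizers, here is a compact DE/rand/1/bin sketch. The objective below is a generic test function; in the paper, the objective compares simulated and observed diffuse streak intensities, which is far too heavy to reproduce here.

```python
import random

def de(objective, bounds, np_=20, f=0.8, cr=0.9, gens=200):
    """Minimal DE/rand/1/bin in the spirit of Storn & Price (1997)."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    for _ in range(gens):
        for i in range(np_):
            a, b, c = random.sample([x for j, x in enumerate(pop) if j != i], 3)
            j_rand = random.randrange(dim)        # force at least one mutated gene
            trial = [a[j] + f * (b[j] - c[j])
                     if (random.random() < cr or j == j_rand) else pop[i][j]
                     for j in range(dim)]
            trial = [min(hi, max(lo, v)) for v, (lo, hi) in zip(trial, bounds)]
            if objective(trial) <= objective(pop[i]):   # greedy selection
                pop[i] = trial
    return min(pop, key=objective)

sphere = lambda x: sum(v * v for v in x)
print(de(sphere, bounds=[(-5, 5)] * 7))   # 7 parameters, as in the growth model
```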

Relevance:

80.00%

Publisher:

Abstract:

Advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance might suffer, leading to violations of Service Level Agreements (SLAs) and possibly inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs as a means of specifying quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from the instantiation of service VMs in the correct order using an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads, such as the number of users. The application owner is interested in finding the optimum amount of computing and network resources for ensuring that the performance requirements of all their applications are met. They are also interested in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, the infrastructure providers are interested in optimally provisioning the virtual resources onto the available physical infrastructure so that their operational costs are minimized, while maximizing the performance of tenants' applications. Motivated by the complexities associated with the management and scaling of distributed applications, while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of application services bound to virtual machines (VMs). We describe several algorithms for adapting the number of VMs allocated to the distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for the dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces. We show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We also provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it. We present a resource management system based on a genetic algorithm, which performs allocation of virtual resources while considering the optimization of multiple criteria. We show that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
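A hedged sketch of the GA-based allocation idea: a chromosome assigns each VM to a physical host, and fitness combines a cost proxy (number of powered-on hosts) with a performance proxy (overcommitment) in a weighted sum. The thesis's actual objectives and constraints are richer; all numbers here are hypothetical.

```python
import random

VM_CPU = [2, 4, 1, 8, 2, 4, 2, 1]   # CPU demands of 8 VMs (hypothetical)
HOSTS, HOST_CAP = 4, 12

def fitness(assign):
    """Lower is better: hosts powered on, plus a heavy overload penalty."""
    load = [0] * HOSTS
    for vm, host in enumerate(assign):
        load[host] += VM_CPU[vm]
    hosts_on = sum(1 for l in load if l > 0)
    overload = sum(max(0, l - HOST_CAP) for l in load)
    return hosts_on + 10 * overload

pop = [[random.randrange(HOSTS) for _ in VM_CPU] for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness)
    elite = pop[:15]
    pop = elite + [[g if random.random() > 0.1 else random.randrange(HOSTS)
                    for g in random.choice(elite)] for _ in range(15)]
print("best placement (VM -> host):", min(pop, key=fitness))
```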

Relevance:

80.00%

Publisher:

Abstract:

Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are formed to enable a biological function and are disassembled as the process completes. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, and the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the better design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structure of the different constituent components of the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques, such as cryo-electron microscopy (cryo-EM), are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail. In this dissertation, several modeling methods are introduced either to integrate cryo-EM datasets with structural data from X-ray crystallography or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail to permit a direct atomic interpretation, i.e., one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, for example structural data from other sources such as X-ray crystallography, to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features that in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system, as in manuscript III. Three manuscripts are presented as part of this PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate the atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly. Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions.
The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting from the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The generated atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal the different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces the evolutionary tabu search strategies applied to enable multi-body registration. The technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote proper exploration of the high-dimensional search space. As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the different components but not for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling proper docking in the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions up to 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions.
In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data and are visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail where alpha helices are visible. Up to a resolution of 12 Å, the method achieves sensitivities of 70-100% in experimental test cases, i.e., 70-100% of the alpha helices were correctly predicted in an automatic manner in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
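A skeleton of the evolutionary tabu search idea from manuscript II is sketched below, reduced to one dimension for brevity (the manuscript registers full poses of component structures in the map). A GA population explores placements while a tabu list pushes offspring away from recently visited solutions; the score() function is a toy stand-in for the fit to the cryo-EM map.

```python
import random

def score(x):
    # Toy multimodal landscape; the real score is the fit to the cryo-EM map.
    return -(x - 3.0) ** 2 + 2.0 * abs(x % 2 - 1)

def evolutionary_tabu_search(gens=100, pop_size=20, tabu_radius=0.2):
    pop = [random.uniform(-10.0, 10.0) for _ in range(pop_size)]
    tabu = []
    for _ in range(gens):
        pop.sort(key=score, reverse=True)
        tabu = (tabu + [pop[0]])[-15:]          # bounded list of recent bests
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2.0 + random.gauss(0, 0.5)
            if any(abs(child - t) <= tabu_radius for t in tabu):
                child += random.gauss(0, 2.0)   # kick away from tabu regions
            children.append(child)
        pop = parents + children
    return max(pop, key=score)

print("best placement:", evolutionary_tabu_search())
```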

Relevance:

80.00%

Publisher:

Abstract:

A new paradigm for some, or a mere political-media slogan for others, the concept of sustainable development provokes contradictory reactions in the scientific world. There are notable differences in the approach to sustainable development between representatives of the natural sciences and those of the human sciences. The nomadic character of the concept (Stengers, 1987) of sustainable development is perceived by some researchers in geography as a true genetic defect, while others believe, on the contrary, that it can serve as a lever to renew the geographical approach to the great contemporary problems. What position should geographical research adopt? Can sustainable development, as a nomadic concept, become an effective and innovative tool? Or does its semantic uncertainty lead us inexorably toward a wandering of ideas? The central aim of this reflection is to attempt to situate the concept of sustainable development within the orbit of geographical thought, in a comparative perspective with neighboring disciplines, while questioning its value for research in our disciplinary field. To that end, it is first necessary to approach, without taboos, the problems posed by this concept: should the conceptual nomadism of sustainable development be regarded as a kind of original sin that would render it an ineffective instrument for geography? Second, demystifying the apparent novelty of the concept, it is worth analyzing its filiations with geography and other disciplines. And finally, to examine concrete cases of the concept's use in geographical research (the sustainable city; sustainable forests), underlining its contributions but also bringing to light its limits.

Relevance:

80.00%

Publisher:

Abstract:

A new Relativistic Screened Hydrogenic Model has been developed to calculate atomic data needed to compute the optical and thermodynamic properties of high-energy-density plasmas. The model is based on a new set of universal screening constants, including nlj-splitting, that has been obtained by fitting to a large database of ionization potentials and excitation energies. This database was built with energies compiled from the National Institute of Standards and Technology (NIST) database of experimental atomic energy levels, and energies calculated with the Flexible Atomic Code (FAC). The screening constants have been computed up to the 5p3/2 subshell using a genetic algorithm technique with an objective function designed to minimize both the relative error and the maximum error. To select the best set of screening constants, additional physical criteria have been applied, based on reproducing the filling order of the shells and on obtaining the best ground-state configuration. A statistical error analysis has been performed to test the model, indicating that approximately 88% of the data lie within a ±10% error interval. We validate the model by comparing its results with ionization energies, transition energies, and wave functions computed with sophisticated self-consistent codes and with experimental data.
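The objective function described above can be sketched as a blend of mean relative error and maximum error, so the GA cannot trade a few badly wrong levels for a good average. The weighting and the sample data below are illustrative.

```python
def objective(predicted, reference, w_max=0.5):
    """Blend of mean relative error and maximum relative error (minimize)."""
    rel = [abs(p - r) / abs(r) for p, r in zip(predicted, reference)]
    return (1 - w_max) * sum(rel) / len(rel) + w_max * max(rel)

# e.g. ionization potentials in eV: model output vs. NIST/FAC reference values
pred = [13.2, 54.8, 120.1]
ref = [13.6, 54.4, 122.4]
print(objective(pred, ref))
```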

Relevance:

80.00%

Publisher:

Abstract:

A compact planar array with parasitic elements is studied for use in MIMO systems. Classical compact arrays suffer from high coupling, which degrades both correlation and matching efficiency. A proper matching network mitigates these shortcomings, although its bandwidth is low and it may increase the antenna size. The proposed antenna instead makes use of parasitic elements to improve both correlation and efficiency. Specific software based on the Method of Moments (MoM) has been developed to analyze radiating structures with several feed points. The array is optimized through a genetic algorithm that determines the positions of the parasitic elements so as to fulfill different figures of merit. The proposed design provides the correlation and matching efficiency required for good performance over a significant bandwidth.
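A hedged sketch of the optimization loop: a GA searches the position of a single parasitic element, and the figure of merit combines correlation and matching efficiency. evaluate_array() is a toy stand-in for the authors' MoM solver, included only so the loop runs self-contained.

```python
import math
import random

def evaluate_array(d):
    # Toy stand-in for the MoM solver: wider spacing between fed and parasitic
    # element lowers correlation; matching efficiency peaks near 0.2 lambda.
    corr = math.exp(-5.0 * d)
    eff = math.exp(-((d - 0.2) ** 2) / 0.01)
    return corr, eff

def merit(d):
    corr, eff = evaluate_array(d)
    return eff - corr            # maximize efficiency, minimize correlation

pop = [random.uniform(0.02, 0.5) for _ in range(20)]   # spacing in wavelengths
for _ in range(50):
    pop.sort(key=merit, reverse=True)
    elite = pop[:10]
    pop = elite + [min(0.5, max(0.02, random.choice(elite) + random.gauss(0, 0.02)))
                   for _ in range(10)]
best = max(pop, key=merit)
print(f"parasitic element at {best:.3f} wavelengths (merit {merit(best):.3f})")
```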

Relevance:

80.00%

Publisher:

Abstract:

This paper proposes the EvoBANE system. EvoBANE automatically generates Bayesian networks for solving special-purpose problems. It evolves a population of individuals that codify Bayesian networks until it finds a near-optimal individual that solves a given classification problem. EvoBANE has the flexibility to modify the constraints that condition the solution search space, self-adapting to the specifications of the problem to be solved. The system extends the GGEAS architecture. GGEAS is a general-purpose grammar-guided evolutionary automatic system whose modular structure favors its application to the automatic construction of intelligent systems. EvoBANE has been applied to two classification benchmark datasets belonging to different application domains and statistically compared with a genetic algorithm performing the same tasks. Results show that the proposed system performs better, as it manages different complexity constraints in order to find the simplest solution that best solves each problem.
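One core issue such a system must handle is keeping the evolved individuals inside the space of valid (acyclic) network structures. The sketch below shows one simple encoding that guarantees this by sampling edges consistent with a random topological order; EvoBANE itself constrains the search space with a grammar instead, so this is only an illustration of the representation problem.

```python
import random

N = 5   # number of variables in the network

def random_dag_individual(edge_prob=0.4):
    """Sample a candidate Bayesian network structure that is acyclic by
    construction: edges only run forward along a random topological order."""
    order = random.sample(range(N), N)
    edges = [(order[i], order[j])
             for i in range(N) for j in range(i + 1, N)
             if random.random() < edge_prob]
    return order, edges

order, edges = random_dag_individual()
print("topological order:", order)
print("edges (guaranteed acyclic):", edges)
```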

Relevance:

80.00%

Publisher:

Abstract:

A new method for conceptual design in Aeronautics is presented that is based on the use of reduced models, also called surrogate models. The various ingredients of the target function are calculated for each individual using surrogates of the associated technical disciplines that are constructed via high-order singular value decomposition (HOSVD) and one-dimensional interpolation. These surrogates result from a limited number of CFD-calculated snapshots. The surrogates can be combined with an optimization method, which may be either a global optimization method, such as a genetic algorithm, or a local gradient-like method. The resulting method is both flexible and much more computationally efficient than the conventional approach based on direct calculation of the target function, especially if a large number of free design parameters and/or tunable modeling parameters are present. The method is illustrated considering a simplified version of the conceptual design of an aircraft empennage.
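The surrogate construction can be illustrated in two design dimensions, where plain SVD plays the role of the HOSVD used for higher-dimensional parameter spaces: snapshots are factored, and the retained singular vectors are interpolated one-dimensionally to evaluate the surrogate at untried parameter values. The cfd() function is a toy placeholder for a CFD-computed quantity.

```python
import numpy as np

alphas = np.linspace(0.0, 1.0, 9)     # design parameter 1 (snapshot grid)
betas = np.linspace(0.0, 2.0, 11)     # design parameter 2 (snapshot grid)

def cfd(a, b):
    # Placeholder for a CFD-computed ingredient of the target function.
    return np.sin(3 * a) * np.exp(-b) + a * b

# snapshot matrix over the parameter grid, factored by SVD
snapshots = np.array([[cfd(a, b) for b in betas] for a in alphas])
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
r = 3                                  # truncation rank

def surrogate(a, b):
    # Interpolate each retained singular vector one-dimensionally, then sum
    # the rank-r reconstruction at the requested parameter point.
    u = [np.interp(a, alphas, U[:, k]) for k in range(r)]
    v = [np.interp(b, betas, Vt[k, :]) for k in range(r)]
    return sum(s[k] * u[k] * v[k] for k in range(r))

print(surrogate(0.37, 1.21), "vs exact", cfd(0.37, 1.21))
```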