12 results for Evolutionary optimization methods
at Universidad de Alicante
Abstract:
This paper presents a new approach to the delineation of local labor markets based on evolutionary computation. The aim of the exercise is the division of a given territory into functional regions based on travel-to-work flows. Such regions are defined so as to guarantee a high degree of inter-regional separation and of intra-regional integration, both measured in terms of commuting flows. Additional requirements include the absence of overlap between delineated regions and the exhaustive coverage of the whole territory. The procedure is based on the maximization of a fitness function that measures aggregate intra-region interaction under constraints of inter-region separation and minimum size. In the experimentation stage, two variations of the fitness function are used, and the process is also applied as a final optimization stage to the results of one of the most successful existing methods, which is used by the British authorities for the delineation of travel-to-work areas (TTWAs). The empirical exercise is conducted using real data for a territory large enough to be considered representative, given the density and variety of the travel-to-work patterns that it embraces. The paper includes a quantitative comparison with traditional alternative methods, an assessment of the performance of the set of operators specifically designed to handle the regionalization problem, and an evaluation of the convergence process. The robustness of the solutions, crucial in a research and policy-making context, is also discussed.
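The fitness-maximization step described above can be sketched in a minimal form. The flow matrix, the partition encoding and the minimum-size rule below are illustrative stand-ins, not the paper's exact formulation, and the inter-region separation constraint is omitted for brevity:

```python
import numpy as np

def region_fitness(flows, labels, min_size=2):
    """Aggregate intra-region commuting interaction for a candidate partition.
    flows[i, j] = commuters travelling from unit i to unit j;
    labels[i]   = region assigned to unit i.
    Partitions containing an undersized region are rejected outright."""
    labels = np.asarray(labels)
    _, counts = np.unique(labels, return_counts=True)
    if counts.min() < min_size:
        return float("-inf")
    same_region = labels[:, None] == labels[None, :]
    return float(flows[same_region].sum())

# toy commuting matrix for four basic units
flows = np.array([[0, 9, 1, 0],
                  [8, 0, 0, 1],
                  [1, 0, 0, 7],
                  [0, 2, 6, 0]])
good = region_fitness(flows, [0, 0, 1, 1])   # two balanced regions
bad = region_fitness(flows, [0, 1, 1, 1])    # region 0 is undersized
```

An evolutionary loop would then maximize this fitness over candidate partitions of the basic units.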
Abstract:
Aims. Despite their importance to a number of astrophysical fields, the lifecycles of very massive stars are still poorly defined. In order to address this shortcoming, we present a detailed quantitative study of the physical properties of four early-B hypergiants (BHGs) of spectral type B1-4 Ia+: Cyg OB2 #12, ζ1 Sco, HD 190603 and BP Cru. These are combined with an analysis of their long-term spectroscopic and photometric behaviour in order to determine their evolutionary status. Methods. Quantitative analysis of UV–radio photometric and spectroscopic datasets was undertaken with a non-LTE model atmosphere code in order to derive physical parameters for comparison with apparently closely related objects, such as B supergiants (BSGs) and luminous blue variables (LBVs), and with theoretical evolutionary predictions. Results. The long-term photometric and spectroscopic datasets compiled for the early-B hypergiants revealed that they are remarkably stable over long periods (≥40 yr), with the possible exception of ζ1 Sco prior to the 20th century; this is in contrast to the excursions that typically characterise LBVs. Quantitative analysis of ζ1 Sco, HD 190603 and BP Cru yielded physical properties intermediate between those of BSGs and LBVs; we therefore suggest that BHGs are the immediate descendants and progenitors (respectively) of such stars, for initial masses in the range ~30−60 M⊙. Comparison of the properties of ζ1 Sco with the stellar population of its host cluster/association NGC 6231/Sco OB1 provides further support for such an evolutionary scenario. In contrast, while the wind properties of Cyg OB2 #12 are consistent with this hypothesis, the combination of extreme luminosity and spectroscopic mass (~110 M⊙) with a comparatively low temperature means that it cannot be accommodated in such a scheme.
Likewise, despite its co-location with several LBVs above the Humphreys-Davidson (HD) limit, the lack of long-term variability and its unevolved chemistry apparently exclude such an identification. Since such massive stars are not expected to evolve to such cool temperatures, instead traversing an O4-6 Ia → O4-6 Ia+ → WN7-9ha pathway, the properties of Cyg OB2 #12 are difficult to understand under current evolutionary paradigms. Finally, we note that, as with AG Car in its cool phase, despite exceeding the HD limit the properties of Cyg OB2 #12 imply that it lies below the Eddington limit; we therefore conclude that the HD limit does not define a region of the HR diagram inherently inimical to the presence of massive stars.
Abstract:
Given a territory composed of basic geographical units, the delineation of local labour market areas (LLMAs) can be seen as a problem in which those units are grouped subject to multiple constraints. In previous research, standard genetic algorithms were not able to find valid solutions, and a specific evolutionary algorithm was developed. The inclusion of multiple ad hoc operators allowed the algorithm to find better solutions than those of a widely-used greedy method. However, the percentage of invalid solutions was still very high. In this paper we improve that evolutionary algorithm through the inclusion of (i) a repair process that allows every invalid individual to fulfil the constraints and contribute to the evolution, and (ii) a hill-climbing optimisation procedure applied to each generated individual by means of an appropriate reassignment of some of its constituent units. We compare the results of both techniques against the previous results and against a greedy method.
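The hill-climbing reassignment step can be sketched as follows. This is a minimal illustration using a toy flow matrix: the repair process and the full constraint set are omitted, and `hill_climb` and its rules are our own names, not the paper's:

```python
import numpy as np
from collections import Counter

def intra_flow(flows, labels):
    """Aggregate commuting flow between units assigned to the same region."""
    labels = np.asarray(labels)
    return flows[labels[:, None] == labels[None, :]].sum()

def hill_climb(flows, labels, n_regions):
    """Greedy local optimisation: reassign single units between regions while
    the aggregate intra-region flow improves (an illustrative version of the
    reassignment step; the repair process would be handled analogously)."""
    labels = list(labels)
    improved = True
    while improved:
        improved = False
        for i in range(len(labels)):
            if Counter(labels)[labels[i]] == 1:
                continue                      # moving unit i would empty its region
            current = labels[i]
            best, best_fit = current, intra_flow(flows, labels)
            for r in range(n_regions):
                if r == current:
                    continue
                labels[i] = r
                f = intra_flow(flows, labels)
                if f > best_fit:
                    best, best_fit, improved = r, f, True
            labels[i] = best
    return labels

# toy commuting matrix: units {0, 1} and {2, 3} form two natural markets
flows = np.array([[0, 9, 1, 0],
                  [8, 0, 0, 1],
                  [1, 0, 0, 7],
                  [0, 2, 6, 0]])
labels = hill_climb(flows, [0, 1, 1, 1], 2)
```

Starting from an unbalanced assignment, the procedure moves unit 1 into region 0 and then stops, recovering the two natural commuting clusters.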
Abstract:
Context. Historically, supergiant (sg)B[e] stars have been difficult to include in theoretical schemes for the evolution of massive OB stars. Aims. The location of Wd1-9 within the coeval starburst cluster Westerlund 1 means that it may be placed into a proper evolutionary context; we therefore aim to utilise a comprehensive multiwavelength dataset to determine its physical properties and consequently its relation to other sgB[e] stars and to the global population of massive evolved stars within Wd1. Methods. Multi-epoch R- and I-band VLT/UVES and VLT/FORS2 spectra are used to constrain the properties of the circumstellar gas, while an ISO-SWS spectrum covering 2.45−45 μm is used to investigate the distribution, geometry and composition of the dust via a semi-analytic irradiated disk model. Radio emission enables a long-term mass-loss history to be determined, while X-ray observations reveal the physical nature of high-energy processes within the system. Results. Wd1-9 exhibits the rich optical emission-line spectrum that is characteristic of sgB[e] stars. Likewise, its mid-IR spectrum resembles those of the LMC sgB[e] stars R66 and R126, revealing the presence of equatorially concentrated silicate dust with a mass of ~10⁻⁴ M⊙. Extreme historical and ongoing mass loss (≳10⁻⁴ M⊙ yr⁻¹) is inferred from the radio observations. The X-ray properties of Wd1-9 imply the presence of high-temperature plasma within the system and are directly comparable to those of a number of confirmed short-period colliding-wind binaries within Wd1. Conclusions. The most complete explanation for the observational properties of Wd1-9 is that it is a massive interacting binary currently undergoing, or recently emerged from, rapid Roche-lobe overflow, supporting the hypothesis that binarity mediates the formation of (a subset of) sgB[e] stars.
The mass-loss rate of Wd1-9 is consistent with such an assertion, while viable progenitor and descendant systems are present within Wd1, and comparable sgB[e] binaries have been identified in the Galaxy. Moreover, the rarity of sgB[e] stars, with only two examples identified in a census of ~68 young massive Galactic clusters and associations containing ~600 post-main-sequence stars, is explicable given the rapidity (~10⁴ yr) expected for this phase of massive binary evolution.
Abstract:
We consider quasi-Newton methods for generalized equations in Banach spaces under metric regularity and give a sufficient condition for q-linear convergence. Then we show that the well-known Broyden update satisfies this sufficient condition in Hilbert spaces. We also establish various modes of q-superlinear convergence of the Broyden update under strong metric subregularity, metric regularity and strong metric regularity. In particular, we show that the Broyden update applied to a generalized equation in Hilbert spaces satisfies the Dennis–Moré condition for q-superlinear convergence. Simple numerical examples illustrate the results.
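As a concrete finite-dimensional special case (a single-valued smooth equation F(x) = 0 in R^n, not the generalized-equation setting of the paper), the Broyden update can be sketched as follows; the test system and starting point are our own illustration:

```python
import numpy as np

def broyden(F, x0, B0, tol=1e-10, max_iter=50):
    """Broyden's (good) method for F(x) = 0 in R^n:
    B_{k+1} = B_k + ((y_k - B_k s_k) s_k^T) / (s_k^T s_k),
    where s_k = x_{k+1} - x_k and y_k = F(x_{k+1}) - F(x_k)."""
    x = np.asarray(x0, dtype=float)
    B = np.asarray(B0, dtype=float)
    Fx = F(x)
    for _ in range(max_iter):
        s = np.linalg.solve(B, -Fx)                # quasi-Newton step
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx
        B = B + np.outer(y - B @ s, s) / (s @ s)   # rank-one secant update
        x, Fx = x_new, F_new
        if np.linalg.norm(Fx) < tol:
            break
    return x

# smooth test system: x0^2 + x1^2 = 2 and x0 = x1, with root (1, 1)
F = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 2.0, x[0] - x[1]])
x0 = np.array([1.5, 1.2])
B0 = np.array([[2 * x0[0], 2 * x0[1]], [1.0, -1.0]])   # Jacobian at x0
root = broyden(F, x0, B0)
```

Starting with the exact Jacobian, the iterates exhibit the fast (q-superlinear) convergence characterised by the Dennis–Moré condition.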
Abstract:
The optimization of chemical processes in which the flowsheet topology is not kept fixed is a challenging discrete-continuous optimization problem. Usually, this task has been performed through equation-based models. This approach presents several problems, such as the tedious and complicated estimation of component properties or the handling of huge problems (with thousands of equations and variables). We propose a generalized disjunctive programming (GDP) approach, coupled with a flowsheeting program, as an alternative to MINLP models. The novelty of this approach lies in using a commercial modular process simulator in which the superstructure is drawn directly on the graphical user interface of the simulator. This methodology takes advantage of modular process simulators (specially tailored numerical methods, reliability and robustness) and of the flexibility of the GDP formulation for modeling and solution. The proposed optimization tool is successfully applied to the synthesis of a methanol plant in which different alternatives are available for the streams, equipment and process conditions.
Abstract:
Tuning compilations is the process of adjusting the values of compiler options to improve features of the final application. In this paper, a strategy based on a genetic algorithm and a multi-objective scheme is proposed to deal with this task. Unlike previous works, we take advantage of knowledge of this domain to provide a problem-specific genetic operator that improves both the speed of convergence and the quality of the results. The strategy is evaluated by means of a case study aimed at improving the performance of the well-known web server Apache. Experimental results show that an overall improvement of 7.5% can be achieved. Furthermore, the adaptive approach has shown an ability to markedly speed up the convergence of the original strategy.
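A flag-tuning loop of this kind can be sketched minimally. The flag list and the scoring function below are synthetic stand-ins for compiling and benchmarking the application, and the operators shown are generic rather than the paper's problem-specific ones:

```python
import random

# illustrative option set; real tuning would toggle actual compiler flags
FLAGS = ["-funroll-loops", "-fomit-frame-pointer",
         "-finline-functions", "-ftree-vectorize"]

def benchmark(active):
    """Synthetic stand-in for compiling the application with the active
    flags and measuring its performance (higher is better)."""
    weights = [3.0, 1.0, 2.0, 1.5]
    return sum(w for w, a in zip(weights, active) if a)

def evolve(pop_size=20, generations=30, p_mut=0.1, seed=1):
    rng = random.Random(seed)
    pop = [[rng.random() < 0.5 for _ in FLAGS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=benchmark, reverse=True)
        parents = pop[: pop_size // 2]                 # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(FLAGS))
            child = a[:cut] + b[cut:]                  # one-point crossover
            child = [(not g) if rng.random() < p_mut else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=benchmark)

best = evolve()   # best flag combination found
```

Each individual encodes one flag combination as a bit vector; selection keeps the best half of the population, so the best score never decreases.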
Abstract:
We present a derivative-free optimization algorithm coupled with a chemical process simulator for the optimal design of individual and complex distillation processes using a rigorous tray-by-tray model. The proposed approach serves as an alternative tool to the various models based on nonlinear programming (NLP) or mixed-integer nonlinear programming (MINLP). This is accomplished by combining the advantages of a commercial process simulator (Aspen Hysys), including especially suited numerical methods developed for the convergence of distillation columns, with the benefits of the particle swarm optimization (PSO) metaheuristic, which does not require gradient information and has the ability to escape from local optima. Our method inherits the superstructure developed in Yeomans, H.; Grossmann, I. E. Optimal design of complex distillation columns using rigorous tray-by-tray disjunctive programming models. Ind. Eng. Chem. Res. 2000, 39 (11), 4326–4335, in which nonexisting trays are considered simple bypasses of liquid and vapor flows. The implemented tool provides the optimal configuration of distillation column systems, involving both continuous and discrete variables, through the minimization of the total annual cost (TAC). The robustness and flexibility of the method are proven through the successful design and synthesis of three distillation systems of increasing complexity.
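A basic global-best PSO loop of the kind described can be sketched as follows. The cost function stands in for a simulator-evaluated TAC, and the parameter values are illustrative, not the paper's:

```python
import random

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise f over box bounds with a basic particle swarm:
    velocities blend inertia, attraction to each particle's best position,
    and attraction to the swarm's global best."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])        # clip to the box
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# toy cost surface standing in for the simulator's TAC evaluation
cost = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2 + 5.0
best, val = pso(cost, [(-10.0, 10.0), (-10.0, 10.0)])
```

In the paper's setting, each cost evaluation would instead trigger a tray-by-tray column convergence inside the simulator.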
Abstract:
Microalgae have many applications, such as biodiesel production or use as a food supplement. Depending on the application, the optimization of certain fractions of the biochemical composition (proteins, carbohydrates and lipids) is required. Therefore, samples obtained under different culture conditions must be analyzed in order to compare the content of such fractions. Nevertheless, traditional methods necessitate lengthy analytical procedures with prolonged sample turn-around times. Results for the biochemical composition of Nannochloropsis oculata samples with different protein, carbohydrate and lipid contents obtained by conventional analytical methods have been compared with those obtained by thermogravimetric analysis (TGA) and by a Pyroprobe device coupled to a gas chromatograph with a mass spectrometer detector (Py–GC/MS), showing a clear correlation. These results suggest the potential applicability of these techniques as fast and easy methods to qualitatively compare the biochemical composition of microalgal samples.
Abstract:
Modern compilers offer a great and ever-increasing number of options that can modify the features and behavior of a compiled program. Many of these options often go unused because of the comprehensive knowledge they require about both the underlying architecture and the internal processes of the compiler. In this context, it is common to have not a single design goal but a more complex set of objectives. In addition, the dependencies between different goals are difficult to infer a priori. This paper proposes a strategy for tuning the compilation of any given application. This is accomplished by automatically varying the compilation options by means of multi-objective optimization and evolutionary computation driven by the NSGA-II algorithm, which allows finding compilation options that simultaneously optimize different objectives. The advantages of our proposal are illustrated by means of a case study based on the well-known Apache web server. Our strategy has demonstrated an ability to find improvements of up to 7.5% and up to 27% in context switches and L2 cache misses, respectively, and also discovers the most important bottlenecks affecting application performance.
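The core of NSGA-II's ranking, the fast non-dominated sort, can be sketched as follows; the objective values below are hypothetical option-set measurements, with both objectives minimised:

```python
def nondominated_sort(points):
    """Rank objective vectors into Pareto fronts, as in NSGA-II's
    fast non-dominated sort (all objectives minimised)."""
    dominates = lambda a, b: (all(x <= y for x, y in zip(a, b))
                              and any(x < y for x, y in zip(a, b)))
    n = len(points)
    dominated_by = [[] for _ in range(n)]   # indices that point p dominates
    dom_count = [0] * n                     # how many points dominate p
    for p in range(n):
        for q in range(n):
            if dominates(points[p], points[q]):
                dominated_by[p].append(q)
            elif dominates(points[q], points[p]):
                dom_count[p] += 1
    fronts, current = [], [p for p in range(n) if dom_count[p] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for p in current:
            for q in dominated_by[p]:
                dom_count[q] -= 1
                if dom_count[q] == 0:
                    nxt.append(q)
        current = nxt
    return fronts

# hypothetical (context switches, L2 misses) for four option sets
configs = [(100, 40), (80, 60), (90, 50), (120, 70)]
fronts = nondominated_sort(configs)
```

The first front holds the mutually non-dominated trade-offs; option set 3 is dominated by all the others and falls into the second front.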
Abstract:
This paper studies stability properties of linear optimization problems with finitely many variables and an arbitrary number of constraints, when only left-hand-side coefficients can be perturbed. The coefficients of the constraints are assumed to be continuous functions with respect to an index ranging over a certain compact Hausdorff topological space, and these properties are preserved by the admissible perturbations. In more detail, the paper analyzes the continuity properties of the feasible set, the optimal set and the optimal value, as well as the preservation of desirable properties (boundedness, uniqueness) of the feasible and optimal sets, under sufficiently small perturbations.
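In compact form (our notation, not necessarily the paper's), the setting is the parametric linear semi-infinite problem

\begin{aligned}
P(a):\quad & \min_{x \in \mathbb{R}^n} \; c^{\mathsf T} x \\
\text{s.t.}\quad & a_t^{\mathsf T} x \ge b_t, \qquad t \in T,
\end{aligned}

where $T$ is a compact Hausdorff index space, $t \mapsto a_t$ is continuous, and only the left-hand-side data $a = (a_t)_{t \in T}$ are perturbed within $C(T, \mathbb{R}^n)$ equipped with the uniform norm, while $c$ and $b = (b_t)_{t \in T}$ are held fixed.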
Abstract:
The rigorous economic design of a distillation column is a difficult problem for which superstructure approaches are the natural solution. However, these methods require complex initialization procedures and are hard to solve, and for this reason they have not been used extensively. In this work, we present a methodology for the rigorous optimization of chemical processes implemented on a commercial simulator, using surrogate models based on kriging interpolation. Several examples were studied, but in this paper we present the optimization of a superstructure for a non-sharp separation to show the efficiency and effectiveness of the method. It is noteworthy that sufficiently accurate surrogate models can be obtained with up to seven degrees of freedom.
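A minimal kriging-style interpolator of the kind used as a surrogate can be sketched as follows. This is a one-dimensional illustration with a Gaussian kernel; the sampled response stands in for expensive simulator runs, and the function and parameter names are our own:

```python
import numpy as np

def kriging_fit(X, y, length=0.5, nugget=1e-8):
    """Fit a simple kriging-style interpolator (Gaussian kernel, 1-D inputs);
    a stand-in for the surrogate models fitted to simulator evaluations."""
    K = np.exp(-(((X[:, None] - X[None, :]) / length) ** 2))
    w = np.linalg.solve(K + nugget * np.eye(len(X)), y)   # kernel weights
    def predict(x):
        x = np.atleast_1d(x)
        k = np.exp(-(((x[:, None] - X[None, :]) / length) ** 2))
        return k @ w
    return predict

# sample an "expensive" response (here sin(x) stands in for a simulator run)
X = np.linspace(0.0, 4.0, 9)
y = np.sin(X)
surrogate = kriging_fit(X, y)
```

The surrogate reproduces the sampled points (up to the small nugget regularisation) and can then be evaluated cheaply inside an optimization loop instead of calling the simulator.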