16 results for Multi-objective functions

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance:

100.00%

Publisher:

Abstract:

Current SoC design trends are characterized by the integration of a larger number of IPs targeting a wide range of application fields. Such multi-application systems are constrained by a set of requirements. In this scenario, networks-on-chip (NoCs) are becoming more important as the on-chip communication structure. Designing an optimal NoC that satisfies the requirements of each individual application requires the specification of a large set of configuration parameters, leading to a wide solution space. It has been shown that IP mapping is one of the most critical parameters in NoC design, strongly influencing SoC performance. IP mapping has been solved for single-application systems using single- and multi-objective optimization algorithms. In this paper we propose the use of a multi-objective adaptive immune algorithm (M(2)AIA), an evolutionary approach, to solve the multi-application NoC mapping problem. Latency and power consumption were adopted as the target multi-objective functions. To assess the efficiency of our approach, our results are compared with those of genetic and branch-and-bound multi-objective mapping algorithms. We tested 11 well-known benchmarks, including random and real applications, combining up to 8 applications on the same SoC. The experimental results show that M(2)AIA reduces power consumption and latency on average by 27.3% and 42.1%, respectively, compared to the branch-and-bound approach, and by 29.3% and 36.1% compared to the genetic approach.
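As an illustration of the trade-off handled by these mappers, the sketch below shows a Pareto dominance test over (latency, power) pairs, the comparison on which any multi-objective mapping algorithm (immune, genetic or branch-and-bound) ultimately relies. The function names and the sample tuples are assumptions for illustration, not the paper's code.

```python
# Minimal sketch: Pareto dominance over (latency, power) objectives,
# both to be minimized. Candidate mappings are assumed to be evaluated
# elsewhere; the tuples below are illustrative values only.

def dominates(a, b):
    """True if a dominates b: no worse in every objective,
    strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated (latency, power) pairs."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical evaluations of four IP mappings: (latency, power)
mappings = [(120.0, 3.1), (95.0, 4.0), (130.0, 2.9), (95.0, 3.5)]
print(pareto_front(mappings))   # only (95.0, 4.0) is dominated, by (95.0, 3.5)
```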

Relevance:

100.00%

Publisher:

Abstract:

Many engineering sectors are challenged by multi-objective optimization problems. Even though the idea behind these problems is simple and well established, implementing any procedure to solve them is not a trivial task. The use of evolutionary algorithms to find candidate solutions is widespread. Usually they supply a discrete picture of the non-dominated solutions, a Pareto set. Although it is very interesting to know the non-dominated solutions, an additional criterion is needed to select one solution to be deployed. To better support the design process, this paper presents a new method for solving non-linear multi-objective optimization problems by adding a control function that guides the optimization process over the Pareto set, which does not need to be found explicitly. The proposed methodology differs from the classical methods that combine the objective functions in a single scale, and is based on a single run of non-linear single-objective optimizers.
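For contrast, the sketch below illustrates the classical scalarization baseline the abstract refers to: objectives combined into a single scale with fixed weights and handed to a single-objective optimizer, each weight choice yielding at most one Pareto point. The two objective functions are illustrative assumptions, not the paper's method.

```python
# Minimal sketch of the classical weighted-sum scalarization that the
# proposed approach is contrasted against. Objectives f1 and f2 are
# illustrative convex costs to be minimized.
import numpy as np
from scipy.optimize import minimize

def f1(x):          # e.g. one design cost
    return (x[0] - 1.0) ** 2 + x[1] ** 2

def f2(x):          # e.g. a competing design cost
    return x[0] ** 2 + (x[1] - 1.0) ** 2

def weighted_sum(x, w=0.5):
    return w * f1(x) + (1.0 - w) * f2(x)

res = minimize(weighted_sum, x0=np.zeros(2), method="Nelder-Mead")
print(res.x)        # one Pareto-optimal trade-off for this weight choice
```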

Relevance:

100.00%

Publisher:

Abstract:

Decision-tree induction algorithms represent one of the most popular techniques for dealing with classification problems. However, traditional decision-tree induction algorithms implement a greedy approach for node splitting that is inherently susceptible to convergence to local optima. Evolutionary algorithms can avoid the problems associated with a greedy search and have been successfully applied to the induction of decision trees. Previously, we proposed a lexicographic multi-objective genetic algorithm for decision-tree induction, named LEGAL-Tree. In this work, we propose extending this approach substantially, particularly with respect to two important evolutionary aspects: the initialization of the population and the fitness function. We carry out a comprehensive set of experiments to validate our extended algorithm. The experimental results suggest that it is able to outperform both traditional algorithms for decision-tree induction and another evolutionary algorithm in a variety of application domains.
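A minimal sketch of the lexicographic comparison idea underlying such an algorithm is given below: objectives are ranked by priority, and a lower-priority objective only breaks ties that fall within a tolerance on the higher-priority one. The objectives, tolerances and values are assumptions for illustration, not LEGAL-Tree's actual fitness function.

```python
# Minimal sketch of a lexicographic comparison of fitness tuples.

def lexicographic_better(a, b, tolerances):
    """True if fitness tuple a is preferred over b. Objectives are listed
    in decreasing order of priority, higher values being better."""
    for fa, fb, tol in zip(a, b, tolerances):
        if abs(fa - fb) > tol:       # difference is meaningful at this level
            return fa > fb
        # otherwise treat as a tie and fall through to the next objective
    return False                      # equal within all tolerances

# accuracy has priority over tree simplicity (here 1 / number of nodes)
ind_a = (0.91, 1 / 15)
ind_b = (0.90, 1 / 7)
print(lexicographic_better(ind_a, ind_b, tolerances=(0.02, 0.0)))
# False: the accuracies tie within 0.02, so b's simpler tree wins
```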

Relevance:

100.00%

Publisher:

Abstract:

Network reconfiguration for service restoration (SR) in distribution systems is a complex optimization problem. For large-scale distribution systems, it is computationally hard to find adequate SR plans in real time since the problem is combinatorial and non-linear, involving several constraints and objectives. Two multi-objective evolutionary algorithms that use Node-Depth Encoding (NDE) have proved able to efficiently generate adequate SR plans for large distribution systems: (i) a hybridization of the Non-Dominated Sorting Genetic Algorithm-II (NSGA-II) with NDE, named NSGA-N; and (ii) a multi-objective evolutionary algorithm based on subpopulation tables that uses NDE, named MEAN. Further challenges remain: designing SR plans for larger systems that are as good as those for relatively smaller ones, and plans for multiple faults that are as good as those for a single fault. In order to tackle both challenges, this paper proposes a method that combines NSGA-N, MEAN and a new heuristic. This heuristic focuses the application of NDE operators on alarming network zones according to technical constraints. The method generates SR plans of similar quality in distribution systems of significantly different sizes (from 3860 to 30,880 buses). Moreover, the number of switching operations required to implement the SR plans generated by the proposed method increases in a moderate way with the number of faults.
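The sketch below illustrates the non-dominated sorting step at the core of NSGA-II, one of the two algorithms combined by the proposed method: the population is ranked into successive Pareto fronts. The objective pairs are illustrative assumptions, not the paper's data.

```python
# Minimal sketch of non-dominated sorting into successive Pareto fronts
# (the ranking step used by NSGA-II); both objectives are minimized.

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and a != b

def non_dominated_sort(population):
    """Return a list of fronts, each a list of indices into population."""
    remaining = set(range(len(population)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(population[j], population[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining -= set(front)
    return fronts

# e.g. (switching operations, out-of-service load) for four hypothetical SR plans
plans = [(4, 120.0), (6, 80.0), (5, 130.0), (7, 90.0)]
print(non_dominated_sort(plans))   # the first front holds the best trade-offs
```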

Relevance:

90.00%

Publisher:

Abstract:

This paper presents a technique for performing analog design synthesis at circuit level, providing feedback to the designer through exploration of the Pareto frontier. A modified simulated annealing algorithm, able to perform crossover with past anchor points when a local minimum is found, is used as the optimization algorithm in the initial synthesis procedure. After all specifications are met, the algorithm searches for the extreme points of the Pareto frontier in order to obtain a non-exhaustive exploration of the Pareto front. Finally, multi-objective particle swarm optimization is used to spread the results and to find a more accurate frontier. Piecewise linear functions are used as single-objective cost functions to produce a smooth and equal convergence of all measurements to the desired specifications during the composition of the aggregate objective function. To verify the presented technique, two circuits were designed: a Miller amplifier with 96 dB voltage gain, 15.48 MHz unity-gain frequency, a slew rate of 19.2 V/µs and a current supply of 385.15 µA, and a complementary folded cascode with 104.25 dB voltage gain, 18.15 MHz unity-gain frequency and a slew rate of 13.370 MV/µs. These circuits were synthesized using a 0.35 µm technology. The results show that the method provides a fast approach to good solutions using the modified SA, with further exploration of the Pareto front through its connection to the particle swarm optimization algorithm.
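A minimal sketch of the aggregation idea described above is given below, assuming hypothetical specifications: each measurement gets a piecewise-linear cost that is zero once its specification is met and grows linearly with the violation, and the aggregate objective is the weighted sum of these costs.

```python
# Minimal sketch: piecewise-linear per-specification costs combined into
# one aggregate objective. Spec names, targets and weights are
# illustrative assumptions, not the paper's numbers.

def piecewise_cost(value, target, minimize=False, weight=1.0):
    """Zero when the spec is satisfied, linear in the violation otherwise."""
    violation = (value - target) if minimize else (target - value)
    return weight * max(0.0, violation) / abs(target)

def aggregate_objective(measures, specs):
    return sum(piecewise_cost(measures[name], *spec) for name, spec in specs.items())

# (target, minimize?, weight) for each measured performance figure
specs = {
    "gain_db":   (90.0,  False, 1.0),   # want >= 90 dB
    "ugf_mhz":   (10.0,  False, 1.0),   # want >= 10 MHz
    "supply_ua": (400.0, True,  1.0),   # want <= 400 uA
}
measures = {"gain_db": 86.0, "ugf_mhz": 12.0, "supply_ua": 450.0}
print(aggregate_objective(measures, specs))   # > 0 until all specs are met
```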

Relevance:

80.00%

Publisher:

Abstract:

Sensors and actuators based on laminated piezocomposite shells have shown increasing demand in the field of smart structures. The distribution of piezoelectric material within the material layers affects the performance of these structures; therefore, its amount, shape, size, placement, and polarization should be considered simultaneously in an optimization problem. In addition, previous works suggest that the concept of a laminated piezocomposite structure including a fiber-reinforced composite layer can increase the performance of these piezoelectric transducers; however, the design optimization of these devices has not been fully explored yet. Thus, this work aims at developing a methodology, based on topology optimization techniques, for the static design of laminated piezocomposite shell structures, considering the optimization of the piezoelectric material and polarization distributions together with the optimization of the fiber angles of the composite orthotropic layers, which are free to assume different values along the same composite layer. The finite element model is based on laminated piezoelectric shell theory, using the degenerate three-dimensional solid approach and first-order shell theory kinematics that account for transverse shear deformation and rotary inertia effects. The topology optimization formulation combines the piezoelectric material with penalization and polarization model, whose design variables describe the amount of piezoelectric material and the polarization sign at each finite element, with discrete material optimization, whose design variables are the fiber angles. Three different objective functions are formulated for the design of actuators, sensors, and energy harvesters. Results for laminated piezocomposite shell transducers are presented to illustrate the method. Copyright (C) 2012 John Wiley & Sons, Ltd.

Relevance:

80.00%

Publisher:

Abstract:

This paper proposes two new approaches for the sensitivity analysis of multiobjective design optimization problems whose performance functions are highly susceptible to small variations in the design variables and/or design environment parameters. In both methods, the less sensitive design alternatives are preferred over others during the multiobjective optimization process. In the first approach, the designer chooses the design variables and/or parameters that cause uncertainty, then associates a robustness index with each design alternative and adds each index as an objective function in the optimization problem. In the second approach, the designer must know, a priori, the interval of variation in the design variables or in the design environment parameters, because the resulting interval of variation in the objective functions will be accepted. The second method does not require any probability distribution law for the uncontrollable variations. Finally, the authors give two illustrative examples to highlight the contributions of the paper.
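A minimal sketch of the first approach, under the assumption that the robustness index is estimated by finite differences of the performance function with respect to the uncertain variable, is shown below; the performance function and step size are illustrative.

```python
# Minimal sketch: a robustness index (local sensitivity of the
# performance function to an uncertain design variable) appended as an
# extra objective to be minimized alongside the performance itself.
import numpy as np

def f(x):
    # hypothetical performance function, very sensitive near x[0] = 0
    return np.sin(10.0 * x[0]) + x[1] ** 2

def robustness_index(x, i=0, h=1e-3):
    """Magnitude of df/dx_i by central differences: smaller is more robust."""
    xp, xm = x.copy(), x.copy()
    xp[i] += h
    xm[i] -= h
    return abs(f(xp) - f(xm)) / (2.0 * h)

x = np.array([0.05, 0.3])
objectives = (f(x), robustness_index(x))   # optimize performance and robustness together
print(objectives)
```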

Relevance:

80.00%

Publisher:

Abstract:

Protein structure prediction (PSP) is a computationally complex problem. Simplified models of the protein molecule (such as the HP model) and the use of Evolutionary Algorithms (EAs) are among the main techniques investigated for PSP. However, the evaluation of a structure represented by the HP model considers only the number of hydrophobic contacts, which makes it impossible to distinguish between structures with the same number of hydrophobic contacts. This work presents a new multi-objective formulation for PSP in the HP model. Two metrics are evaluated: the number of hydrophobic contacts and the distance between the hydrophobic amino acids, which are handled by the multi-objective evolutionary algorithm based on subpopulation tables (AEMT). The algorithm proved to be fast and robust.
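A minimal sketch of the two metrics on a 2D lattice HP conformation is given below: the number of hydrophobic (H) contacts between residues that are lattice neighbours but not chain neighbours, and the sum of pairwise distances between H residues. The sequence and coordinates are illustrative assumptions.

```python
# Minimal sketch of the two HP-model metrics on a 2D square lattice.
from itertools import combinations

def hh_contacts(sequence, coords):
    h = [i for i, a in enumerate(sequence) if a == "H"]
    return sum(1 for i, j in combinations(h, 2)
               if abs(i - j) > 1                           # not adjacent in the chain
               and abs(coords[i][0] - coords[j][0])
                 + abs(coords[i][1] - coords[j][1]) == 1)  # lattice neighbours

def hh_distance(sequence, coords):
    h = [i for i, a in enumerate(sequence) if a == "H"]
    return sum(((coords[i][0] - coords[j][0]) ** 2
                + (coords[i][1] - coords[j][1]) ** 2) ** 0.5
               for i, j in combinations(h, 2))

seq = "HPHPPH"
conf = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 2), (1, 2)]  # a self-avoiding walk
print(hh_contacts(seq, conf), hh_distance(seq, conf))
```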

Relevance:

80.00%

Publisher:

Abstract:

Small-scale fluid flow systems have been studied for various applications, such as chemical reagent dosing and cooling devices for compact electronic components. This work presents the complete development cycle of a heat sink optimized with the Topology Optimization Method (TOM) for best performance, including minimization of the pressure drop in the fluid flow and maximization of heat dissipation effects, aiming at small-scale applications. The TOM is applied to a design domain to obtain an optimized channel topology, according to a given multi-objective function that combines pressure drop minimization and heat transfer maximization. The Stokes flow hypothesis is adopted. Moreover, both conduction and forced convection effects are included in the steady-state heat transfer model. The topology optimization procedure combines the Finite Element Method (to carry out the physical analysis) with Sequential Linear Programming (as the optimization algorithm). Two-dimensional topology optimization results of channel layouts obtained for a heat sink design are presented as an example to illustrate the design methodology. 3D computational simulations and prototype manufacturing have been carried out to validate the proposed design methodology.
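A minimal sketch of such a combined objective, assuming normalization by reference values and a single trade-off weight (not necessarily the paper's formulation), is shown below.

```python
# Minimal sketch: a weighted combination of pressure drop (minimize)
# and heat dissipation (maximize), normalized by reference values so
# the weight is dimensionless. Values are illustrative assumptions.

def multi_objective(pressure_drop, heat_dissipated,
                    dp_ref=1.0, q_ref=1.0, w=0.5):
    """Smaller is better: w trades pumping penalty against cooling benefit."""
    return w * (pressure_drop / dp_ref) - (1.0 - w) * (heat_dissipated / q_ref)

# two hypothetical channel layouts evaluated by the finite element model
print(multi_objective(0.8, 1.1))   # layout A
print(multi_objective(0.5, 0.9))   # layout B: less pumping power, less cooling
```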

Relevance:

40.00%

Publisher:

Abstract:

The objective of this study was to estimate (co)variance components using random regression on B-spline functions applied to weight records obtained from birth to adulthood. A total of 82 064 weight records of 8145 females, obtained from the data bank of the Nellore Breeding Program (PMGRN/Nellore Brazil), which started in 1987, were used. The models included direct additive and maternal genetic effects and animal and maternal permanent environmental effects as random effects. Contemporary group and dam age at calving (linear and quadratic effects) were included as fixed effects, and orthogonal Legendre polynomials of age (cubic regression) were considered as a random covariate. The random effects were modeled using B-spline functions, considering linear, quadratic and cubic polynomials for each individual segment. Residual variances were grouped into five age classes. Direct additive genetic and animal permanent environmental effects were modeled using up to seven knots (six segments). A single segment with two knots at the end points of the curve was used for the estimation of maternal genetic and maternal permanent environmental effects. A total of 15 models were studied, with the number of parameters ranging from 17 to 81. The models that used B-splines were compared with multi-trait analyses with nine weight traits and with a random regression model that used orthogonal Legendre polynomials. A model fitting quadratic B-splines, with four knots (three segments) for the direct additive genetic and animal permanent environmental effects and two knots for the maternal additive genetic and maternal permanent environmental effects, was the most appropriate and parsimonious model to describe the covariance structure of the data. Selection for higher weight at young ages should be performed taking into account the associated increase in mature cow weight. This is particularly important in most Nellore beef cattle production systems, where the cow herd is maintained under range conditions. There is limited scope for modifying the growth curve of Nellore cattle with the aim of selecting for rapid growth at young ages while maintaining constant adult weight.
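For illustration, the sketch below evaluates B-spline basis functions by the Cox-de Boor recursion over an age range, the kind of covariate row such a random regression model uses; the knot positions and ages are assumptions, not the study's values.

```python
# Minimal sketch: Cox-de Boor evaluation of B-spline basis functions
# over age, producing the covariate row for one record at one age.

def bspline_basis(i, degree, knots, x):
    """Value of the i-th B-spline basis function of a given degree at x."""
    if degree == 0:
        return 1.0 if knots[i] <= x < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + degree] != knots[i]:
        left = (x - knots[i]) / (knots[i + degree] - knots[i]) \
               * bspline_basis(i, degree - 1, knots, x)
    right = 0.0
    if knots[i + degree + 1] != knots[i + 1]:
        right = (knots[i + degree + 1] - x) / (knots[i + degree + 1] - knots[i + 1]) \
                * bspline_basis(i + 1, degree - 1, knots, x)
    return left + right

# quadratic segments with interior knots at 240 and 480 days of age
degree = 2
knots = [0, 0, 0, 240, 480, 730, 730, 730]   # clamped knot vector
n_basis = len(knots) - degree - 1             # 5 basis functions
age = 365.0
row = [bspline_basis(i, degree, knots, age) for i in range(n_basis)]
print(row)   # basis values sum to 1 inside the age range
```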

Relevance:

30.00%

Publisher:

Abstract:

Objective: To evaluate whether there are visual and neuropsychological decrements in workers with low exposure to Hg vapor. Methods: Visual fields, contrast sensitivity, color vision, and neuropsychological functions were measured in 10 workers (32.5 +/- 8.5 years) chronically exposed to Hg vapor (4.3 +/- 2.8 years; urinary Hg concentration 22.3 +/- 9.3 µg/g creatinine). Results: For the worst eyes, we found altered visual field thresholds, lower contrast sensitivity, and poorer color discrimination compared with controls (P < 0.05). There were no significant differences between Hg-exposed subjects and controls on neuropsychological tests. Nevertheless, duration of exposure was statistically correlated with verbal memory and depression scores. Conclusions: Chronic exposure to Hg vapor at currently accepted safety levels was found to be associated with visual losses but not with neuropsychological dysfunctions in the sample of workers studied. (J Occup Environ Med. 2009;51:1403-1412)

Relevance:

30.00%

Publisher:

Abstract:

Introduction. The number of patients with terminal heart failure has increased more than the number of available organs, leading to a high mortality rate on the waiting list. The use of marginal and expanded-criteria donors has increased due to the heart shortage. Objective. We analyzed all heart transplantations (HTx) in Sao Paulo state over 8 years with respect to donor profile and recipient risk factors. Method. This multi-institutional review collected HTx data from all institutions in the state of Sao Paulo, Brazil. From 2002 to 2008 (6 years), only 512 (28.8%) of 1777 available heart donors were accepted for transplantation. All medical records were analyzed retrospectively; none of the donors used was excluded, even those considered nonstandard. Results. The hospital mortality rate was 27.9% (n = 143) and the average follow-up time was 29.4 +/- 28.4 months. The survival rate was 55.5% (n = 285) at 6 years after HTx. Univariate analysis examined the impact of the following factors on survival: age (P = .0004), arterial hypertension (P = .4620), norepinephrine (P = .0450), cardiac arrest (P = .8500), diabetes mellitus (P = .5120), infection (P = .1470), CKMB (creatine kinase MB) (P = .8694), creatinine (P = .7225), and Na+ (P = .3273). On multivariate analysis, only age showed significance; logistic regression showed a significant cut-off at 40 years: organs from donors older than 40 years showed a lower late survival rate (P = .0032). Conclusions. Donor age older than 40 years represents an important risk factor for survival after HTx. Neither donor gender nor norepinephrine use negatively affected early survival.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a structural damage detection methodology based on genetic algorithms and dynamic parameters. Three chromosomes are used to encode an individual in the population. The first and second chromosomes locate and quantify damage, respectively; the third permits the self-adaptation of the genetic parameters. The natural frequencies and mode shapes are used to formulate the objective function. A numerical analysis was performed for several truss structures under different damage scenarios. The results show that the methodology can reliably identify damage scenarios from noisy measurements, with only a few misidentified elements. (C) 2012 Civil-Comp Ltd and Elsevier Ltd. All rights reserved.
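A minimal sketch of a frequency/mode-shape objective of this kind is given below; the relative weighting and the use of the Modal Assurance Criterion (MAC) are assumptions of the sketch, not necessarily the paper's exact formulation.

```python
# Minimal sketch: penalize a candidate damage scenario by the mismatch
# between measured and model-predicted natural frequencies and by how
# far each mode-shape MAC value is from 1.
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape vectors."""
    return np.dot(phi_a, phi_b) ** 2 / (np.dot(phi_a, phi_a) * np.dot(phi_b, phi_b))

def objective(freq_meas, freq_model, modes_meas, modes_model, w=0.5):
    freq_term = np.sum(((freq_meas - freq_model) / freq_meas) ** 2)
    mode_term = np.sum([1.0 - mac(a, b) for a, b in zip(modes_meas, modes_model)])
    return w * freq_term + (1.0 - w) * mode_term

# hypothetical data for two modes of a small truss model
f_meas  = np.array([12.1, 33.4])
f_model = np.array([12.6, 32.8])
phi_meas  = [np.array([1.0, 0.62, 0.18]), np.array([1.0, -0.40, -0.95])]
phi_model = [np.array([1.0, 0.60, 0.20]), np.array([1.0, -0.45, -0.90])]
print(objective(f_meas, f_model, phi_meas, phi_model))
```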

Relevance:

30.00%

Publisher:

Abstract:

Our objective here is to prove that the uniform convergence of a sequence of Kurzweil integrable functions implies the convergence of the sequence formed by their corresponding integrals.
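A worked statement of the result, assuming a compact interval [a, b] (the interval and notation are choices of this sketch, not quoted from the paper):

```latex
% Uniform convergence theorem for the Kurzweil (Henstock-Kurzweil)
% integral on a compact interval [a,b]; hypotheses as assumed here.
\begin{theorem}
Let $f_n \colon [a,b] \to \mathbb{R}$ be Kurzweil integrable for every $n$
and suppose $f_n \to f$ uniformly on $[a,b]$. Then $f$ is Kurzweil
integrable and
\[
  \lim_{n \to \infty} \int_a^b f_n = \int_a^b f .
\]
\end{theorem}
% Key estimate: once $\lVert f_n - f_m \rVert_\infty < \varepsilon$, one has
% $\bigl| \int_a^b f_n - \int_a^b f_m \bigr| \le \varepsilon\,(b-a)$,
% so the integrals form a Cauchy sequence.
```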

Relevance:

30.00%

Publisher:

Abstract:

The Strategic Environmental Assessment (SEA) of the sugar and alcohol sector guides territorial and sectoral planning that benefits most of the local society and supports this economic activity in all its stages. In this context, the present work aims to determine an index that aggregates the indicators generated in the baseline of the SEA process, called the Index of Sustainability of Expansion of the Sugar and Alcohol Sector (IScana). To this end, the indicators of each city were normalized using fuzzy logic and weighted using the Analytic Hierarchy Process (AHP). The IScana values were then spatialized over the 'Grande Dourados' region of Mato Grosso do Sul State. The northern portion concentrated the highest IScana values, 0.48 and 0.55, for the cities of Nova Alvorada do Sul and Rio Brilhante, while in the central portion the city of Dourados presented the lowest value, 0.10. The selection of the set of indicators forming the IScana, and of their relative importance, was satisfactory for the application of the fuzzy logic and AHP techniques. The IScana index supplies objective information for the diagnosis of the region in support of the SEA.
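A minimal sketch of the aggregation step, assuming hypothetical indicators, bounds and AHP-derived weights (the pairwise-comparison step that produces the weights is omitted):

```python
# Minimal sketch: fuzzy (linear membership) normalization of each
# indicator to [0, 1], then a weighted sum with AHP-derived weights.
# Indicator names, bounds and weights are illustrative assumptions.

def fuzzy_normalize(value, worst, best):
    """Linear membership: 0 at the worst observed value, 1 at the best."""
    if best == worst:
        return 1.0
    score = (value - worst) / (best - worst)
    return min(1.0, max(0.0, score))

def aggregate_index(indicators, bounds, weights):
    return sum(weights[k] * fuzzy_normalize(indicators[k], *bounds[k])
               for k in indicators)

bounds  = {"water_use": (100.0, 20.0), "jobs": (0.0, 5000.0), "soil_aptitude": (0.0, 1.0)}
weights = {"water_use": 0.40, "jobs": 0.25, "soil_aptitude": 0.35}   # must sum to 1
city    = {"water_use": 55.0, "jobs": 2100.0, "soil_aptitude": 0.7}
print(aggregate_index(city, bounds, weights))   # index in [0, 1]
```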