971 results for Meta heuristic algorithm
Abstract:
Genetic Algorithms (GAs) are adaptive heuristic search algorithms based on the evolutionary ideas of natural selection and genetics. The basic concept of GAs is to simulate the processes in natural systems necessary for evolution, specifically those that follow the principle of survival of the fittest first laid down by Charles Darwin. Particle Swarm Optimization (PSO), on the other hand, is a population-based stochastic optimization technique inspired by the social behavior of bird flocking or fish schooling. PSO shares many similarities with evolutionary computation techniques such as GAs: the system is initialized with a population of random solutions and searches for optima by updating generations. However, unlike GAs, PSO has no evolution operators such as crossover and mutation. In PSO, the potential solutions, called particles, fly through the problem space by following the current optimum particles. PSO is attractive because there are few parameters to adjust. This paper presents a hybridization of a GA and a PSO algorithm (crossing the two algorithms). The resulting algorithm is applied to the synthesis of combinational logic circuits. With this combination it is possible to take advantage of the best features of each algorithm.
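As an illustration of the PSO side of the hybrid described above, the particle update (inertia, cognitive, and social terms) can be sketched as follows. The sphere objective and all parameter values are stand-ins, not the paper's circuit-synthesis setup:

```python
import random

def pso(objective, dim, n_particles=20, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer (minimization)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # velocity update: inertia + cognitive + social terms
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# illustrative run on the sphere function
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```

In the hybrid scheme, such a swarm update would be interleaved with GA operators; the sketch keeps only the plain PSO loop.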
Abstract:
Several phenomena present in electrical systems have motivated the development of comprehensive models based on the theory of fractional calculus (FC). Bearing these ideas in mind, in this work FC concepts are applied to define and evaluate an electrical potential of fractional order, based on a genetic algorithm optimization scheme. The feasibility and convergence of the proposed method are evaluated.
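A genetic algorithm optimization scheme of the kind invoked above can be sketched generically; the objective below is a stand-in, not the fractional-order electrical potential from the paper, and all operator choices and parameter values are illustrative:

```python
import random

def genetic_algorithm(fitness, dim, pop_size=30, generations=150,
                      p_cross=0.8, p_mut=0.1, seed=2):
    """Minimal real-coded GA (minimization) with tournament selection,
    arithmetic crossover, and Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) < fitness(b) else b

    for _ in range(generations):
        new_pop = [min(pop, key=fitness)]           # elitism: keep the best
        while len(new_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < p_cross:              # arithmetic crossover
                alpha = rng.random()
                child = [alpha * x + (1 - alpha) * y for x, y in zip(p1, p2)]
            else:
                child = p1[:]
            child = [x + rng.gauss(0, 0.3) if rng.random() < p_mut else x
                     for x in child]                # Gaussian mutation
            new_pop.append(child)
        pop = new_pop
    best = min(pop, key=fitness)
    return best, fitness(best)

# illustrative run on the sphere function
best, best_val = genetic_algorithm(lambda x: sum(v * v for v in x), dim=2)
```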
Abstract:
OBJECTIVE To estimate the prevalence of hypertension among adolescent Brazilian students. METHODS A systematic review of school-based cross-sectional studies was conducted. The articles were searched in the MEDLINE, Embase, Scopus, LILACS, SciELO, and Web of Science databases, the CAPES thesis database, and the Trip Database. In addition, we examined the reference lists of relevant studies to identify potentially eligible articles. No restrictions regarding publication date, language, or status were applied. The studies were selected by two independent evaluators, who also extracted the data and assessed the methodological quality following eight criteria related to sampling, blood pressure measurement, and presentation of results. The meta-analysis was calculated using a random-effects model, and analyses were performed to investigate heterogeneity. RESULTS We retrieved 1,577 articles from the search and included 22 in the review. The included articles corresponded to 14,115 adolescents, of whom 51.2% (n = 7,230) were female. We observed a variety of techniques, equipment, and references used. The prevalence of hypertension was 8.0% (95%CI 5.0–11.0; I2 = 97.6%): 9.3% (95%CI 5.6–13.6; I2 = 96.4%) in males and 6.5% (95%CI 4.2–9.1; I2 = 94.2%) in females. The meta-regression failed to identify the causes of the heterogeneity among studies. CONCLUSIONS Despite the differences found in the methodologies of the included studies, the results of this systematic review indicate that hypertension is prevalent in the Brazilian adolescent school population. For future investigations, we suggest the standardization of techniques, equipment, and references, aiming at improving the methodological quality of the studies.
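The random-effects pooling reported above (pooled estimate, 95% CI, and I²) can be sketched with the DerSimonian–Laird estimator. The prevalences and variances below are made up for illustration and are not the review's data:

```python
import math

def dersimonian_laird(estimates, variances):
    """DerSimonian-Laird random-effects pooled estimate with I^2."""
    w = [1.0 / v for v in variances]              # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    # Cochran's Q statistic measures observed between-study dispersion
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

# hypothetical study prevalences (proportions) and sampling variances
prev = [0.08, 0.05, 0.11, 0.07]
var = [0.0002, 0.0001, 0.0004, 0.0003]
pooled, ci, i2 = dersimonian_laird(prev, var)
```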
Abstract:
This paper presents an optimization approach for the job shop scheduling problem (JSSP). The JSSP is a difficult problem in combinatorial optimization for which extensive investigation has been devoted to the development of efficient algorithms. The proposed approach is based on a genetic algorithm technique. Scheduling rules such as SPT and MWKR are integrated into the process of genetic evolution. The chromosome representation of the problem is based on random keys. Schedules are constructed using a priority rule, in which the priorities and delay times of the operations are defined by the genetic algorithm, and a procedure that generates parameterized active schedules. After a schedule is obtained, a local search heuristic is applied to improve the solution. The approach is tested on a set of standard instances taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed approach.
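The random-key decoding mentioned above can be sketched as follows. This simplified decoder uses the keys only as job priorities and interleaves operations round-robin in priority order; the paper's scheme also evolves delay times to build parameterized active schedules, which is omitted here, and the tiny instance is invented:

```python
def decode_schedule(keys, jobs, n_machines):
    """Build a feasible JSSP schedule from a random-key chromosome.

    jobs[j] is the list of (machine, duration) operations of job j;
    keys[j] is job j's priority gene in [0, 1]."""
    n = len(jobs)
    next_op = [0] * n                  # next operation index per job
    job_ready = [0] * n                # time each job becomes available
    mach_ready = [0] * n_machines      # time each machine becomes free
    makespan = 0
    order = sorted(range(n), key=lambda j: keys[j], reverse=True)
    for _ in range(max(len(ops) for ops in jobs)):
        for j in order:                # one operation per job per round
            if next_op[j] >= len(jobs[j]):
                continue
            machine, dur = jobs[j][next_op[j]]
            start = max(job_ready[j], mach_ready[machine])
            job_ready[j] = mach_ready[machine] = start + dur
            makespan = max(makespan, start + dur)
            next_op[j] += 1
    return makespan

# invented 2-job, 2-machine instance: (machine, duration) per operation
jobs = [[(0, 3), (1, 2)], [(1, 2), (0, 4)]]
makespan = decode_schedule([0.9, 0.1], jobs, n_machines=2)
```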
Abstract:
This work addresses the signal propagation and the fractional-order dynamics during the evolution of a genetic algorithm (GA). In order to investigate the phenomena involved in the evolution of the GA population, the mutation is exposed to excitation perturbations during some generations and the corresponding fitness variations are evaluated. Three distinct fitness functions are used to study their influence on the GA dynamics. The input and output signals are studied, revealing a fractional-order dynamic evolution characteristic of a long-term system memory.
Abstract:
IEEE International Symposium on Circuits and Systems, pp. 724–727, Seattle, USA
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa for the degree of Master in Informatics Engineering.
Abstract:
Description of the revision process of the future ISO 9001:2015, its main changes, and the expected benefits and difficulties.
Abstract:
The ability to respond sensibly to changing and conflicting beliefs is an integral part of intelligent agency. To this end, we outline the design and implementation of a Distributed Assumption-based Truth Maintenance System (DATMS) appropriate for controlling cooperative problem solving in a dynamic real world multi-agent community. Our DATMS works on the principle of local coherence which means that different agents can have different perspectives on the same fact provided that these stances are appropriately justified. The belief revision algorithm is presented, the meta-level code needed to ensure that all system-wide queries can be uniquely answered is described, and the DATMS’ implementation in a general purpose multi-agent shell is discussed.
Abstract:
Recent integrated circuit technologies have opened the possibility of designing parallel architectures with hundreds of cores on a single chip. The design space of these parallel architectures is huge, with many architectural options. Exploring the design space becomes even more difficult if, beyond performance and area, we also consider metrics such as performance and area efficiency, where the designer tries to achieve the best performance per chip area and the best sustainable performance. In this paper we present an algorithm-oriented approach to designing a many-core architecture. Instead of exploring the design space of the many-core architecture based on the experimental execution results of a particular benchmark of algorithms, our approach is to make a formal analysis of the algorithms considering the main architectural aspects and to determine how each particular architectural aspect relates to the performance of the architecture when running an algorithm or set of algorithms. The architectural aspects considered include the number of cores, the local memory available in each core, the communication bandwidth between the many-core architecture and the external memory, and the memory hierarchy. To exemplify the approach, we carried out a theoretical analysis of a dense matrix multiplication algorithm and determined an equation that relates the number of execution cycles to the architectural parameters. Based on this equation, a many-core architecture has been designed. The results obtained indicate that a 100 mm² integrated circuit design of the proposed architecture, using a 65 nm technology, is able to achieve 464 GFLOPs (double-precision floating point) for a memory bandwidth of 16 GB/s. This corresponds to a performance efficiency of 71%. Considering a 45 nm technology, a 100 mm² chip attains 833 GFLOPs, which corresponds to 84% of peak performance. These figures are better than those obtained by previous many-core architectures, except for the area efficiency, which is limited by the lower memory bandwidth considered. The results achieved are also better than those of previous state-of-the-art many-core architectures designed specifically to achieve high performance for matrix multiplication.
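The dependence of attainable performance on external memory bandwidth discussed above can be illustrated with a roofline-style bound. This is not the paper's cycle-count equation; the peak, bandwidth, and tile size below are illustrative numbers only:

```python
def attainable_gflops(peak_gflops, bandwidth_gbs, flops_per_byte):
    """Roofline-style bound: performance is capped either by compute
    (peak) or by memory traffic (bandwidth times arithmetic intensity)."""
    return min(peak_gflops, bandwidth_gbs * flops_per_byte)

def blocked_matmul_intensity(block, bytes_per_word=8):
    """Arithmetic intensity of a blocked matrix multiply that keeps a
    block x block tile in on-chip memory: about 2*block^3 FLOPs per
    3*block^2 words moved (double precision, 8 bytes per word)."""
    flops = 2 * block ** 3
    bytes_moved = 3 * block ** 2 * bytes_per_word
    return flops / bytes_moved

# illustrative: 16 GB/s external bandwidth, 512-wide on-chip tiles
intensity = blocked_matmul_intensity(512)
perf = attainable_gflops(peak_gflops=650.0, bandwidth_gbs=16.0,
                         flops_per_byte=intensity)
```

The larger the on-chip tile, the higher the intensity, which is why local memory per core and external bandwidth appear together among the architectural aspects analysed.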
Abstract:
An adaptive antenna array combines the signals of its elements, using some constraints to produce the radiation pattern of the antenna while maximizing the performance of the system. Direction-of-arrival (DOA) algorithms are applied to determine the directions of impinging signals, whereas beamforming techniques are employed to determine the appropriate weights for the array elements so as to create the desired pattern. In this paper, a detailed analysis of both categories of algorithms is made for the case of a planar antenna array. Several simulation results show that it is possible to point an antenna array in a desired direction based on the DOA estimation and on the beamforming algorithms. A comparison of the performance of the algorithms used is made in terms of runtime and accuracy. These characteristics depend on the SNR of the incoming signal.
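As a minimal illustration of beamforming, delay-and-sum weights for a uniform linear array can be sketched as follows (the paper considers a planar array; the element count, half-wavelength spacing, and steering angle here are arbitrary assumptions):

```python
import cmath
import math

def steering_vector(n_elements, spacing_wavelengths, theta_deg):
    """Array response of a uniform linear array toward angle theta
    (measured from broadside), element spacing in wavelengths."""
    theta = math.radians(theta_deg)
    phase = 2 * math.pi * spacing_wavelengths * math.sin(theta)
    return [cmath.exp(1j * phase * k) for k in range(n_elements)]

def array_gain(weights, theta_deg, spacing=0.5):
    """Power response of the weighted array toward angle theta."""
    a = steering_vector(len(weights), spacing, theta_deg)
    out = sum(wk.conjugate() * ak for wk, ak in zip(weights, a))
    return abs(out) ** 2

# delay-and-sum weights pointing an 8-element array at 20 degrees
n = 8
w = [ak / n for ak in steering_vector(n, 0.5, 20.0)]
```

With these weights the response is maximal at the steered angle and falls off elsewhere, which is the basic mechanism the adaptive algorithms in the paper refine.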
Abstract:
In Portugal, especially from the 1970s onwards, women's studies contributed to the emergence of the concept of gender and to the feminist criticism of the prevailing models of the differences between the sexes. Until then, women had been absent from scientific research both as subjects and as objects. Feminism brought more reflexivity to scientific thinking. After the 25th of April 1974, because of the consequent political openness, several innovative research themes emerged, together with new concepts and fields of study. However, as far as the relationship between gender and science is concerned, such studies concentrate especially on higher education institutions. Feminist thinking seems to have two main objectives: to give women visibility, on the one hand, and to denounce men's dominance of the several fields of knowledge, on the other. In 1977 the "Feminine Commission" was created, and since then it has been publishing studies on women's condition and contributing to the reflection on the female condition at all levels. In the 1980s, the growing feminisation of tertiary education (both of students and academics) favoured the development of women's studies, especially on their condition within universities, with a special focus on the glass ceiling, despite the lack of statistical data disaggregated by gender, which makes it difficult to analyse the integration of women in several sectors, namely in educational and scientific research activities. Other aggregating themes are the family, social and legal condition, work, education, and feminine intervention in political and social movements. In the 1990s, Women's Studies were institutionalised in the academic context with the creation of the first Master in Women's Studies at the Universidade Aberta (Open University), in Lisbon. In 1999, the first Portuguese journal of women's studies, "Faces de Eva", was created. Seminars, conferences, theses, journals, and projects on women's studies are more and more common. However, results and publications are not as widely disseminated as they should be, because of the lack of comprehensive and coordinated databases.
2. Analysis by topics
2.1. Horizontal and vertical segregation
Research questions
This is one of the main areas of research in Portugal. Essentially two issues have been considered:
- The analysis of vertical gender segregation in educational and professional fields, which affects women's professional career progression, with special attention to men's power in control positions and the glass ceiling.
- The analysis of horizontal segregation, especially in higher education (teaching and research), where women have less visibility than men, and the under-representation of women in technology and technological careers.
Research in this area mainly focuses on description, showing the under-representation of women in certain scientific areas and senior positions. Nevertheless, the studies that analyse horizontal segregation in the field of education adopt a more analytical approach, which focuses on the mechanisms of reproduction of gender stereotypes, especially socialisation, influencing educational and career choices.
Abstract:
The container loading problem (CLP) is a combinatorial optimization problem for the spatial arrangement of cargo inside containers so as to maximize the usage of space. The algorithms for this problem are of limited practical applicability if real-world constraints are not considered, one of the most important of which is deemed to be stability. This paper addresses static stability, as opposed to dynamic stability, looking at the stability of the cargo during container loading. This paper proposes two algorithms. The first is a static stability algorithm based on static mechanical equilibrium conditions that can be used as a stability evaluation function embedded in CLP algorithms (e.g. constructive heuristics, metaheuristics). The second proposed algorithm is a physical packing sequence algorithm that, given a container loading arrangement, generates the actual sequence by which each box is placed inside the container, considering static stability and loading operation efficiency constraints.
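A common simplified stand-in for the static-stability evaluation described above is a base-support check: a box is accepted only if enough of its base rests on the boxes (or floor) directly beneath it. The full static mechanical equilibrium conditions proposed in the paper are more involved; the box encoding below is an assumption for illustration:

```python
def supported_area(box, placed):
    """Fraction of box's base area resting on boxes directly beneath it.

    Each box is (x, y, z, w, d, h): the position of its minimum corner
    plus width, depth, and height. The container floor is z == 0."""
    x, y, z, w, d, _ = box
    if z == 0:
        return 1.0  # resting directly on the container floor
    support = 0.0
    for (px, py, pz, pw, pd, ph) in placed:
        if pz + ph != z:
            continue  # its top face is not at this box's base level
        # overlap of the two rectangles in the x-y plane
        ox = max(0, min(x + w, px + pw) - max(x, px))
        oy = max(0, min(y + d, py + pd) - max(y, py))
        support += ox * oy
    return support / (w * d)

def is_statically_stable(box, placed, min_support=1.0):
    """Full base support (min_support=1.0) is the classic conservative
    criterion; partial-support variants relax the threshold."""
    return supported_area(box, placed) >= min_support
```

Embedded in a constructive heuristic or metaheuristic, such a predicate plays the role of the stability evaluation function mentioned in the abstract.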
Abstract:
“Many-core” systems based on a Network-on-Chip (NoC) architecture offer various opportunities in terms of performance and computing capabilities, but at the same time they pose many challenges for the deployment of real-time systems, which must fulfill specific timing requirements at runtime. It is therefore essential to identify, at design time, the parameters that have an impact on the execution time of the tasks deployed on these systems, and to derive upper bounds on the other key parameters. The focus of this work is to determine an upper bound on the traversal time of a packet when it is transmitted over the NoC infrastructure. Towards this aim, we first identify and explore some limitations in the existing recursive-calculus-based approaches to computing the Worst-Case Traversal Time (WCTT) of a packet. Then, we extend the existing model by integrating the characteristics of the tasks that generate the packets. For this extended model, we propose an algorithm called “Branch and Prune” (BP). Our proposed method provides tighter, yet safe, estimates than the existing recursive-calculus-based approaches. Finally, we introduce a more general approach, namely “Branch, Prune and Collapse” (BPC), which offers a configurable parameter that provides a flexible trade-off between the computational complexity and the tightness of the computed estimate. The recursive-calculus methods and BP correspond to the special cases of BPC in which the trade-off parameter is 1 or ∞, respectively. Through simulations, we analyze this trade-off, reason about the implications of certain choices, and also provide some case studies to observe the impact of task parameters on the WCTT estimates.
Abstract:
This paper focuses on a PV system linked to the electric grid by power electronic converters, on the identification of the five-parameter model of photovoltaic systems, and on the assessment of the shading effect. Normally, the technical information available for photovoltaic panels is too restricted to identify the five parameters. An undemanding heuristic method is used to find the five parameters of photovoltaic systems, requiring only the open-circuit, maximum-power, and short-circuit data. The I-V and P-V curves for monocrystalline, polycrystalline, and amorphous photovoltaic systems are computed from the identified parameters and validated by comparison with experimental ones. The I-V and P-V curves under the effect of partial shading are also obtained from those parameters. The modeling of the converters emulates the association of a DC-DC boost converter with a two-level power inverter in order to follow the performance of a commercial inverter employed in an experimental system.
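The five-parameter (single-diode) model referred to above relates module current and voltage through the photo-generated current Iph, the diode saturation current I0, the ideality factor n, and the series and parallel resistances Rs and Rp. A fixed-point sketch of the implicit I-V equation follows; the cell count, thermal voltage, and parameter values are illustrative assumptions, not identified values from the paper:

```python
import math

def pv_current(v, iph, i0, n, rs, rp, cells=36, vt=0.02585):
    """Solve the implicit single-diode equation
        I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rp
    by damped fixed-point iteration (Vt ~ 25.85 mV at 25 C)."""
    i = iph  # start from the photo-generated current
    for _ in range(200):
        vd = v + i * rs  # voltage across the diode/parallel branch
        i_new = iph - i0 * (math.exp(vd / (n * cells * vt)) - 1) - vd / rp
        i = 0.5 * i + 0.5 * i_new  # damping for robust convergence
    return i

# illustrative parameters for a small 36-cell module
params = dict(iph=5.0, i0=1e-9, n=1.3, rs=0.2, rp=300.0)
isc = pv_current(0.0, **params)  # short-circuit current, close to Iph
```

Sweeping v and plotting pv_current(v, ...) and v times that current would yield the I-V and P-V curves discussed in the abstract.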