969 results for attori, concorrenza, COOP, Akka, benchmark


Relevance: 10.00%

Abstract:

A new search-space-updating technique for genetic algorithms is proposed for continuous optimisation problems. Rather than gradually reducing the search space during the evolution process at a fixed reduction rate set 'a priori', the technique dynamically adjusts the upper and lower boundaries of each variable in the objective function based on its distribution statistics. To test its effectiveness, the technique is applied to a number of benchmark optimisation problems and compared with three other techniques, namely the genetic algorithm with parameter space size adjustment (GAPSSA) technique [A.B. Djurišic, Elite genetic algorithms with adaptive mutations for solving continuous optimization problems – application to modeling of the optical constants of solids, Optics Communications 151 (1998) 147–159], the successive zooming genetic algorithm (SZGA) [Y. Kwon, S. Kwon, S. Jin, J. Kim, Convergence enhanced genetic algorithm with successive zooming method for solving continuous optimization problems, Computers and Structures 81 (2003) 1715–1725] and a simple GA. The tests show that for well-posed problems, existing search-space-updating techniques perform well in terms of convergence speed and solution precision; for some ill-posed problems, however, these techniques are statistically inferior to a simple GA. All the tests show that the proposed new search-space-updating technique is statistically superior to its counterparts.
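As a rough sketch of the idea in Python (a minimal illustration: the mean ± k·σ update rule, the NumPy population layout and the parameter names are my assumptions, not the paper's exact statistics):

```python
import numpy as np

def update_bounds(population, lower, upper, k=3.0):
    """Adjust per-variable search bounds from population statistics.

    Hypothetical rule (mean +/- k * std per variable) standing in for
    the paper's distribution-based update; unlike a fixed a-priori
    reduction rate, the bounds follow where the population clusters.
    """
    mean = population.mean(axis=0)     # per-variable mean
    std = population.std(axis=0)       # per-variable spread
    new_lower = np.maximum(lower, mean - k * std)
    new_upper = np.minimum(upper, mean + k * std)
    return new_lower, new_upper
```

Each generation, the GA would then resample or clip individuals into the updated box before continuing the evolution.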

Relevance: 10.00%

Abstract:

In this paper, a parallel-matching processor architecture with early jump-out (EJO) control is proposed to carry out high-speed retrieval from biometric fingerprint databases. The processor performs the retrieval using minutia-point matching. An EJO method is applied to the proposed architecture to speed up retrieval from large databases. The processor is implemented on a Xilinx Virtex-E, occupies 6,825 slices and runs at up to 65 MHz. A software/hardware co-simulation benchmark with a database of 10,000 fingerprints verifies that the matching speed reaches up to 1.22 million fingerprints per second. EJO yields a gain of about 22% in computing efficiency.
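The jump-out idea can be modelled in software roughly as follows (a sketch only: `same_minutia`, the tolerance and the score threshold are hypothetical stand-ins for the hardware's minutia comparison logic):

```python
def same_minutia(p, q, tol=8.0):
    # Hypothetical predicate: Euclidean distance test on (x, y) minutiae.
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5 <= tol

def match_with_ejo(probe, candidate, min_required):
    """Count matching minutiae, jumping out early once the points still
    unchecked can no longer lift the score to min_required."""
    matched, remaining = 0, len(probe)
    for p in probe:
        if matched + remaining < min_required:
            return False                       # early jump-out
        if any(same_minutia(p, c) for c in candidate):
            matched += 1
        remaining -= 1
    return matched >= min_required
```

Skipping the tail of hopeless comparisons is where the reported efficiency gain would come from.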

Relevance: 10.00%

Abstract:

We present results from three-dimensional protein folding simulations in the HP model on ten benchmark problems. The simulations are executed by a simulated-annealing-based algorithm with a time-dependent cooling schedule. The neighbourhood relation is determined by the pull-move set. The results provide experimental evidence that the maximum depth D of local minima of the underlying energy landscape can be upper-bounded by D < n^(2/3). The local search procedure employs the stopping criterion (m/delta)^(D/gamma), where m is an estimate of the average number of neighbouring conformations, gamma relates to the mean of the non-zero differences of the objective function between neighbouring conformations, and 1 − delta is the confidence that a minimum conformation has been found. The bound is consistent with the results obtained for the ten benchmark problems.
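A compact sketch of such a run (assuming the stopping criterion has the form (m/delta)^(D/gamma) as read above; the energy function and pull-move neighbourhood generator are supplied by the caller, and the cooling constant c is illustrative):

```python
import math
import random

def lsa(start, energy, neighbours, m, D, gamma, delta, c=1.0):
    """Simulated annealing under a logarithmic cooling schedule
    t_k = c / ln(k + 1), stopped after (m/delta)**(D/gamma) steps."""
    budget = int((m / delta) ** (D / gamma))
    conf = best = start
    for k in range(1, budget + 1):
        t = c / math.log(k + 1)                  # logarithmic cooling
        cand = random.choice(neighbours(conf))   # e.g. a pull move
        dE = energy(cand) - energy(conf)
        if dE <= 0 or random.random() < math.exp(-dE / t):
            conf = cand                          # Metropolis acceptance
            if energy(conf) < energy(best):
                best = conf
    return best
```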

Relevance: 10.00%

Abstract:

The total cross sections for single ionization of helium and single and double ionization of argon by antiproton impact have been measured in the kinetic energy range from 3 to 25 keV using a new technique for the creation of intense slow antiproton beams. The new data provide benchmark results for the development of advanced descriptions of atomic collisions, and we show that they can be used to judge, for the first time, the validity of many recent theories.

Relevance: 10.00%

Abstract:

Computationally efficient sequential learning algorithms are developed for direct-link resource-allocating networks (DRANs). These are achieved by decomposing existing recursive training algorithms on a layer-by-layer and neuron-by-neuron basis. This allows network weights to be updated in an efficient parallel manner and facilitates the implementation of minimal-update extensions that yield a significant reduction in the computational load per iteration compared with existing sequential learning methods employed in resource-allocating network (RAN) and minimal RAN (MRAN) approaches. The new algorithms, which also incorporate a pruning strategy to control network growth, are evaluated on three different system identification benchmark problems and shown to outperform existing methods in terms of both training-error convergence and computational efficiency.
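For orientation, the classic RAN-style sequential loop that such methods refine looks roughly like this (a toy sketch: the thresholds, Gaussian widths and the LMS update are illustrative, and the paper's layer-by-layer decomposition, minimal updates and pruning are not reproduced):

```python
import numpy as np

class RANSketch:
    """Toy resource-allocating network: allocate a Gaussian unit when an
    input is novel, otherwise adapt the existing weights by an LMS step."""

    def __init__(self, dim, eps=0.5, e_min=0.1, lr=0.05, width=1.0):
        self.centres = np.empty((0, dim))
        self.weights = np.empty(0)
        self.eps, self.e_min, self.lr, self.width = eps, e_min, lr, width

    def _phi(self, x):
        # Gaussian activations of all hidden units for input x.
        d2 = ((self.centres - x) ** 2).sum(axis=1)
        return np.exp(-d2 / self.width ** 2)

    def predict(self, x):
        return float(self.weights @ self._phi(x))

    def observe(self, x, y):
        err = y - self.predict(x)
        novel = (self.centres.shape[0] == 0 or
                 np.min(np.linalg.norm(self.centres - x, axis=1)) > self.eps)
        if abs(err) > self.e_min and novel:
            # Novelty criterion met: grow the network by one unit.
            self.centres = np.vstack([self.centres, x])
            self.weights = np.append(self.weights, err)
        else:
            # Otherwise adapt existing weights (LMS gradient step).
            self.weights += self.lr * err * self._phi(x)
```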

Relevance: 10.00%

Abstract:

PEGS (Production and Environmental Generic Scheduler) is a generic production scheduler that produces good schedules over a wide range of problems. It is centralised, using search strategies based on the Shifting Bottleneck algorithm. We have also developed an alternative distributed approach using software agents. In some cases this reduces run times by a factor of 10 or more. In most cases, the agent-based program also produces good solutions on published benchmark data, and the short run times make the program useful for a wide range of problems. Test results show that the agents can produce schedules comparable to the best found so far for some benchmark datasets, and better schedules than PEGS on our own random datasets. The flexibility that agents can provide for today's dynamic scheduling is also appealing. We suggest that in this sort of generic or commercial system, the agent-based approach is a good alternative.

Relevance: 10.00%

Abstract:

Surrogate-based optimization methods provide a means to achieve high-fidelity design optimization at reduced computational cost by using a high-fidelity model in combination with lower-fidelity models that are less expensive to evaluate. This paper presents a provably convergent trust-region model-management methodology for variable-parameterization design models, that is, models for which the design parameters are defined over different spaces. Corrected space mapping is introduced as a method to map between the variable-parameterization design spaces. It is then used with a sequential-quadratic-programming-like trust-region method on two aerospace-related design optimization problems. Results for a wing design problem and a flapping-flight problem show that the method outperforms direct optimization in the high-fidelity space. On the wing design problem, the new method achieves 76% savings in high-fidelity function calls. On a bat-flight design problem, it achieves approximately 45% time savings, although it converges to a different local minimum than the benchmark did.
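The surrounding trust-region model-management loop follows a standard pattern, sketched below under my own assumptions: SciPy's generic minimizer stands in for the SQP-like subproblem solver, `surrogate_at` is a hypothetical factory returning a cheap model corrected to match the high-fidelity function at x, and the paper's corrected space mapping between parameterizations is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def trust_region_mm(x, f_hi, surrogate_at, radius=1.0, tol=1e-6,
                    eta=0.1, max_iter=50):
    """Accept or shrink steps based on the ratio of the actual
    high-fidelity reduction to the reduction predicted by the
    corrected surrogate, optimised inside the trust region."""
    x = np.asarray(x, dtype=float)
    fx = f_hi(x)
    for _ in range(max_iter):
        model = surrogate_at(x)                       # corrected cheap model
        bounds = [(xi - radius, xi + radius) for xi in x]
        sub = minimize(model, x, bounds=bounds)       # trust-region subproblem
        predicted = model(x) - sub.fun
        if predicted <= 0 or radius < tol:
            break
        f_new = f_hi(sub.x)                           # one high-fidelity call
        rho = (fx - f_new) / predicted                # actual vs predicted
        if rho > eta:
            x, fx = sub.x, f_new                      # accept the step
            if rho > 0.75:
                radius *= 2.0                         # model trusted: expand
        else:
            radius *= 0.5                             # reject: shrink region
    return x, fx
```

The savings in high-fidelity calls come from charging only one `f_hi` evaluation per accepted or rejected step while the surrogate absorbs the inner optimisation work.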

Relevance: 10.00%

Abstract:

Exam timetabling is one of the most important administrative activities that take place in academic institutions. In this paper we present a critical discussion of the research on exam timetabling over the last decade or so. These ten years have seen an increased level of attention paid to this important topic, with a range of significant contributions to the scientific literature in terms of both theoretical and practical aspects. The main aim of this survey is to highlight the new trends and key research achievements of the last decade. We also aim to outline a range of relevant and important research issues and challenges that have been generated by this body of work.

We first define the problem and review previous survey papers. Algorithmic approaches are then classified and discussed. These include early techniques (e.g. graph heuristics) and state-of-the-art approaches including meta-heuristics, constraint-based methods, multi-criteria techniques, hybridisations, and recent new trends concerning neighbourhood structures, which are motivated by the aim of raising the generality of the approaches. Summarising tables are presented to provide an overall view of these techniques. We discuss issues concerning decomposition techniques, system tools and languages, models and complexity. We also present and discuss some important issues which have come to light concerning the public benchmark exam timetabling data. Different versions of problem datasets with the same name have been circulating in the scientific community over the last ten years, which has generated a significant amount of confusion. We clarify the situation and present a re-naming of the widely studied datasets to avoid future confusion. We also highlight which research papers have dealt with which dataset. Finally, we draw upon our discussion of the literature to present a (non-exhaustive) range of potential future research directions and open issues in exam timetabling research.

Relevance: 10.00%

Abstract:

In this paper, we present a random iterative graph-based hyper-heuristic that produces a collection of heuristic sequences for constructing solutions of different quality. These heuristic sequences can be seen as dynamic hybridisations of different graph colouring heuristics that construct solutions step by step. Based on these sequences, we statistically analyse the way in which graph colouring heuristics are automatically hybridised. This, to our knowledge, represents a new direction in hyper-heuristic research. It is observed that spending the search effort on hybridising Largest Weighted Degree with Saturation Degree at the early stage of solution construction tends to generate high-quality solutions. Based on these observations, an iterative hybrid approach is developed to adaptively hybridise these two graph colouring heuristics at different stages of solution construction. The overall aim is to automate the heuristic design process, which draws upon an emerging research theme on developing computer methods to design and adapt heuristics automatically. Experimental results on benchmark exam timetabling and graph colouring problems demonstrate the effectiveness and generality of this adaptive hybrid approach compared with previous methods for automatically generating and adapting heuristics. Indeed, we also show that the approach is competitive with state-of-the-art human-produced methods.
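The construction scheme can be pictured as follows (a simplified skeleton: the heuristic registry, `assign` and the sequence length are illustrative placeholders for the paper's low-level graph colouring heuristics and timetabling model):

```python
import random

def construct(events, heuristics, sequence, assign):
    """Apply one low-level heuristic per construction step: the k-th
    name in `sequence` selects which heuristic picks the k-th event
    to schedule, so the sequence encodes a dynamic hybridisation."""
    remaining, solution = list(events), {}
    for name in sequence:
        if not remaining:
            break
        event = heuristics[name](remaining, solution)  # e.g. 'LWD' or 'SD'
        remaining = assign(event, solution, remaining) # place event, update
    return solution

def random_sequence(names, length):
    # The random iterative hyper-heuristic samples many such sequences
    # and keeps statistics on which hybridisations construct well.
    return [random.choice(names) for _ in range(length)]
```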

Relevance: 10.00%

Abstract:

In this paper, we present an investigation into using fuzzy methodologies to guide the construction of high-quality feasible examination timetabling solutions. The provision of automated solutions to the examination timetabling problem is achieved through a combination of construction and improvement. The enhancement of solutions through techniques such as metaheuristics is, in some cases, dependent on the quality of the solution obtained during the construction process. With a few notable exceptions, recent research has concentrated on the improvement of solutions rather than on investigating the 'best' approaches to the construction phase. Addressing this issue, our approach is based on combining multiple criteria when deciding how the construction phase should proceed. Fuzzy methods were used to combine three single construction heuristics into three different pairwise combinations of heuristics in order to guide the order in which exams were selected for insertion into the timetable solution. To investigate the approach, we compared the performance of the various heuristic approaches with respect to a number of important criteria (overall cost penalty, number of skipped exams, number of iterations of a rescheduling procedure required, and computational time) on twelve well-known benchmark problems. We demonstrate that the fuzzy combination of heuristics allows high-quality solutions to be constructed. On one of the twelve problems we obtained a lower penalty than any previously published constructive method, and on all twelve we obtained a lower penalty than when any of the single heuristics was used alone. Furthermore, we demonstrate that the fuzzy approach used less backtracking when constructing solutions than any of the single heuristics. We conclude that this novel fuzzy approach is a highly effective method for heuristically constructing solutions and, as such, has particular relevance to real-world situations in which the construction of feasible solutions is often a difficult task in its own right.
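One way to picture such a pairwise fuzzy combination (a toy sketch: the linear membership functions and the use of max as the fuzzy OR are my assumptions, not the paper's rule base):

```python
def fuzzy_priority(sat_deg, largest_deg, max_sat, max_deg):
    """Fuse two construction criteria into one scheduling priority via a
    single fuzzy rule: IF saturation is high OR degree is high THEN the
    exam is hard to place and should be scheduled early."""
    mu_sat = sat_deg / max_sat if max_sat else 0.0      # 'saturation is high'
    mu_deg = largest_deg / max_deg if max_deg else 0.0  # 'degree is high'
    return max(mu_sat, mu_deg)                          # fuzzy OR (maximum)

# Usage: pick the next exam to insert into the timetable, e.g.
# next_exam = max(unscheduled, key=lambda e: fuzzy_priority(
#     sat[e], deg[e], max(sat.values()), max(deg.values())))
```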

Relevance: 10.00%

Abstract:

In many domains, when several competing classifiers are available, we want to synthesize them, or some of them, into a more accurate classifier by means of a combination function. In this paper we propose a 'class-indifferent' method for combining classifier decisions represented by evidential structures called triplets and quartets, using Dempster's rule of combination. This method is unique in that it distinguishes important elements from trivial ones in representing classifier decisions, makes use of more information than others in calculating the support for class labels, and provides a practical way to apply the theoretically appealing Dempster–Shafer theory of evidence to the problem of ensemble learning. We present a formalism for modelling classifier decisions as triplet mass functions and establish a range of formulae for combining these mass functions in order to arrive at a consensus decision. In addition, we carry out a comparative study with the alternatives of simplet and dichotomous structures, and compare two combination methods, Dempster's rule and majority voting, on the UCI benchmark data, to demonstrate the advantage our approach offers. (This continues work in this area published in IEEE Transactions on Knowledge and Data Engineering and at conferences.)
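Dempster's rule itself is standard and can be sketched directly; the triplet-style example in the trailing comment is my reading of the structure (mass on the two strongest class labels plus the whole frame), not the paper's exact formalism.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose
    focal elements are frozensets; the conflicting mass K is
    normalised away by the 1/(1 - K) factor."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2            # mass assigned to empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# A triplet-style mass function over the frame {A, B, C}: mass on the
# two favoured singletons plus the whole frame (ignorance).
m = {frozenset({'A'}): 0.6, frozenset({'B'}): 0.3,
     frozenset({'A', 'B', 'C'}): 0.1}
```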

Relevance: 10.00%

Abstract:

We present experimental results on benchmark problems in 3D cubic lattice structures with the Miyazawa-Jernigan energy function for two local search procedures that utilise the pull-move set: (i) population-based local search (PLS), which traverses the energy landscape with greedy steps towards (potential) local minima followed by upward steps up to a certain level of the objective function; (ii) simulated annealing with a logarithmic cooling schedule (LSA). The parameter settings for PLS are derived from short LSA runs executed in pre-processing, and the procedure utilises tabu lists generated for each member of the population. In terms of the total number of energy-function evaluations the two methods perform equally well; PLS, however, has the potential to be parallelised, with an expected speed-up in the region of the population size. Furthermore, both methods require a significantly smaller number of function evaluations when compared with Monte Carlo simulations with kink-jump moves.
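A single PLS move for one population member might look like this (a simplified sketch under my own assumptions: conformations are hashable, `up_limit` and the tabu list come from the pre-processing LSA runs, and population management is omitted):

```python
import random

def pls_member_step(conf, energy, neighbours, tabu, up_limit):
    """Greedy descent when a downhill pull move exists; otherwise a
    random uphill move bounded by up_limit, to escape local minima."""
    e0 = energy(conf)
    moves = [n for n in neighbours(conf) if n not in tabu]
    if not moves:
        return conf
    best = min(moves, key=energy)
    if energy(best) < e0:                 # greedy step towards a minimum
        tabu.add(best)
        return best
    uphill = [n for n in moves if energy(n) - e0 <= up_limit]
    if uphill:                            # bounded upward step
        nxt = random.choice(uphill)
        tabu.add(nxt)
        return nxt
    return conf
```

Because each member steps independently, running the members on separate workers gives the expected speed-up in the region of the population size.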

Relevance: 10.00%

Abstract:

Purpose: The purpose of this paper is to examine the extent and nature of greening the supply chain (SC) in the UK manufacturing sector; and the factors that influence the breadth and depth of this activity.

Design/methodology/approach: The study is based on findings from a sample of manufacturing organisations drawn from the membership of The Chartered Institute of Purchasing and Supply. Data were collected using a questionnaire, piloted and pre-tested before distribution, with responses received from 60 manufacturing companies.

Findings: On average, manufacturers perceive the greatest pressure to improve environmental performance to come from legislation and internal drivers (IDs). The least influential pressures are societal drivers and SC pressures from individual customers. Green supply chain management (GSCM) practices amongst this "average" group of UK manufacturing organisations focus on internal, higher-risk, descriptive activities rather than proactive, external engagement processes. Environmental attitude (EA) is a key predictor of GSCM activity, and organisations that have a progressive attitude are also operationally very active. EA shows some relationship to legislative drivers, but other factors are also influential. Operational activity may also be moderated by organisational contingencies such as risk, size and nationality.

Research limitations/implications: The main limitation to this paper is the relatively small manufacturing sample.

Practical implications: This paper presents a series of constructs identifying GSCM operational activities against which companies can benchmark themselves. It suggests which factors are driving these operational changes and how industry contingencies may be influential.

Originality/value: This paper explores what is driving environmental behaviour amongst an “average” sample of manufacturers, what specific management practices take place and the relationships between them.

Keywords: Manufacturing industries, Environmental management, Supply chain management, Sustainable development, United Kingdom
Paper type: Research paper

Relevance: 10.00%

Abstract:

We have measured the two-electron contribution to the ground-state energy of helium-like argon ions using an electron beam ion trap (EBIT). A two-dimensional map was measured showing the intensity of x-rays from the trap passing through a krypton-filled absorption cell, with electron beam energy and x-ray energy as the independent axes. From this map, we deduced the two-electron contribution to the ground-state energy of helium-like argon. This experimentally determined value (312.4 +/- 9.5 eV) was found to be in good agreement with our calculated value (about 303.35 eV) and with previous calculations of the same quantity. Based on these measurements, we show that a ten-day absorption spectroscopy run with a super-EBIT should be sufficient to provide a new benchmark value for the two-electron contribution to the ground state of helium-like krypton. Such a measurement would then constitute a test of quantum electrodynamics to second order.

Relevance: 10.00%

Abstract:

To increase eco-efficiency, environmental information needs to be integrated into corporate decision making. For decision makers, however, the interpretation of eco-efficiency as a ratio can be quite difficult in practice. One of the reasons is that eco-efficiency as a ratio is measured in a unit that is difficult to interpret. This article therefore suggests an alternative measure of eco-efficiency. The Environmental Value Added, the measure proposed in this paper, reflects the excess economic benefit resulting from the difference between the eco-efficiency under consideration and a benchmark eco-efficiency. It is measured in a purely monetary unit and is thus easier to interpret and integrate than eco-efficiency as a ratio.
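One plausible formalisation of this construction (the notation is mine, not the article's): with economic value V, environmental impact I and eco-efficiency EE = V/I,

\[
\mathrm{EVA} \;=\; I \cdot (\mathrm{EE} - \mathrm{EE}_b) \;=\; V - I \cdot \mathrm{EE}_b ,
\]

where EE_b is the benchmark eco-efficiency. Rescaling the difference of ratios by the impact I expresses the result in monetary units: the value created in excess of what the benchmark eco-efficiency would have generated from the same environmental impact.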