108 results for BENCHMARK


Relevance:

10.00%

Publisher:

Abstract:

The total cross sections for single ionization of helium and for single and double ionization of argon by antiproton impact have been measured in the kinetic energy range from 3 to 25 keV, using a new technique for the creation of intense slow antiproton beams. The new data provide benchmark results for the development of advanced descriptions of atomic collisions, and we show that they can be used to judge, for the first time, the validity of many recent theories.

Relevance:

10.00%

Publisher:

Abstract:

Computationally efficient sequential learning algorithms are developed for direct-link resource-allocating networks (DRANs). These are achieved by decomposing existing recursive training algorithms on a layer-by-layer and neuron-by-neuron basis. This allows network weights to be updated in an efficient parallel manner and facilitates the implementation of minimal update extensions that yield a significant reduction in computational load per iteration compared to existing sequential learning methods employed in resource-allocating network (RAN) and minimal RAN (MRAN) approaches. The new algorithms, which also incorporate a pruning strategy to control network growth, are evaluated on three different system identification benchmark problems and shown to outperform existing methods both in terms of training error convergence and computational efficiency.
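To give a flavour of the resource-allocating networks being extended here, the sketch below implements the classic Platt-style RAN novelty criterion: a new Gaussian hidden unit is allocated only when an input is both poorly predicted and far from existing centres; otherwise the output weights get a cheap per-neuron LMS update. This is a minimal sketch, not the paper's DRAN decomposition (which adds a direct input-output link and parallel recursive updates); the thresholds eps and delta and all parameter values are illustrative assumptions.

```python
import numpy as np

class RAN:
    """Minimal Platt-style resource-allocating network (sketch)."""

    def __init__(self, dim, eps=0.05, delta=0.5, kappa=0.9, lr=0.05):
        self.centres = np.empty((0, dim))  # Gaussian RBF centres
        self.widths = np.empty(0)          # RBF widths
        self.weights = np.empty(0)         # output-layer weights
        self.bias = 0.0
        self.eps, self.delta, self.kappa, self.lr = eps, delta, kappa, lr

    def _phi(self, x):
        if len(self.weights) == 0:
            return np.empty(0)
        d2 = np.sum((self.centres - x) ** 2, axis=1)
        return np.exp(-d2 / self.widths ** 2)

    def predict(self, x):
        return float(self.weights @ self._phi(x) + self.bias)

    def update(self, x, y):
        err = y - self.predict(x)
        dists = (np.linalg.norm(self.centres - x, axis=1)
                 if len(self.weights) else np.array([np.inf]))
        if abs(err) > self.eps and dists.min() > self.delta:
            # Novelty: allocate a new hidden unit centred on this input
            # (the first unit gets a default width).
            self.centres = np.vstack([self.centres, x])
            self.widths = np.append(self.widths,
                                    self.kappa * min(dists.min(), 1.0))
            self.weights = np.append(self.weights, err)
        else:
            # Familiar input: cheap LMS update of the output layer only.
            phi = self._phi(x)
            self.weights += self.lr * err * phi
            self.bias += self.lr * err

# Usage sketch: one sequential pass over a toy 1-D regression stream.
ran = RAN(dim=1)
for x in np.linspace(0, 2 * np.pi, 200):
    ran.update(np.array([x]), np.sin(x))
```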

Relevance:

10.00%

Publisher:

Abstract:

PEGS (Production and Environmental Generic Scheduler) is a generic production scheduler that produces good schedules over a wide range of problems. It is centralised, using search strategies based on the Shifting Bottleneck algorithm. We have also developed an alternative distributed approach using software agents. In some cases this reduces run times by a factor of 10 or more. In most cases, the agent-based program also produces good solutions for published benchmark data, and the short run times make our program useful for a large range of problems. Test results show that the agents can produce schedules comparable to the best found so far for some benchmark datasets, and actually better schedules than PEGS on our own random datasets. The flexibility that agents can provide for today's dynamic scheduling is also appealing. We suggest that in this sort of generic or commercial system, the agent-based approach is a good alternative.

Relevance:

10.00%

Publisher:

Abstract:

Surrogate-based optimization methods provide a means to achieve high-fidelity design optimization at reduced computational cost by using a high-fidelity model in combination with lower-fidelity models that are less expensive to evaluate. This paper presents a provably convergent trust-region model-management methodology for variable-parameterization design models: that is, models for which the design parameters are defined over different spaces. Corrected space mapping is introduced as a method to map between the variable-parameterization design spaces. It is then used with a sequential-quadratic-programming-like trust-region method for two aerospace-related design optimization problems. Results for a wing design problem and a flapping-flight problem show that the method outperforms direct optimization in the high-fidelity space. On the wing design problem, the new method achieves 76% savings in high-fidelity function calls. On a bat-flight design problem, it achieves approximately 45% time savings, although it converges to a different local minimum than did the benchmark.
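The trust-region model-management loop itself can be sketched compactly. The toy below assumes both models share one design space and uses a zeroth-order additive correction, whereas the paper's provable convergence relies on first-order-consistent corrections and corrected space mapping between differently parameterized spaces; hi_fi and lo_fi are stand-in functions, and the thresholds are conventional illustrative values.

```python
import numpy as np
from scipy.optimize import minimize

def hi_fi(x):  # expensive high-fidelity model (stand-in)
    return (x[0] - 1) ** 2 + 10 * (x[1] - x[0] ** 2) ** 2

def lo_fi(x):  # cheap, biased low-fidelity model (stand-in)
    return (x[0] - 1) ** 2 + 8 * (x[1] - x[0] ** 2) ** 2 + 0.5 * x[0]

def trust_region_mm(x, radius=0.5, tol=1e-6, max_iter=50):
    """Trust-region model management with an additive correction (sketch)."""
    fx = hi_fi(x)
    for _ in range(max_iter):
        beta = fx - lo_fi(x)                     # correction so models agree at x
        surrogate = lambda s: lo_fi(s) + beta
        # Minimise the corrected surrogate inside the trust region (a box).
        res = minimize(surrogate, x,
                       bounds=[(xi - radius, xi + radius) for xi in x])
        s, f_new = res.x, hi_fi(res.x)
        pred = fx - surrogate(s)                 # decrease predicted by surrogate
        actual = fx - f_new                      # actual high-fidelity decrease
        rho = actual / pred if pred > 0 else -1.0
        if rho > 0.75:
            radius *= 2.0                        # model trusted: expand region
        elif rho < 0.25:
            radius *= 0.25                       # model poor: shrink region
        if rho > 0:
            x, fx = s, f_new                     # accept any real improvement
        if radius < tol:
            break
    return x, fx

x_opt, f_opt = trust_region_mm(np.array([-1.0, 1.0]))
```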

Relevance:

10.00%

Publisher:

Abstract:

Exam timetabling is one of the most important administrative activities that takes place in academic institutions. In this paper we present a critical discussion of the research on exam timetabling in the last decade or so. The last ten years have seen an increased level of attention on this important topic, with a range of significant contributions to the scientific literature in terms of both theoretical and practical aspects. The main aim of this survey is to highlight the new trends and key research achievements of the last decade. We also aim to outline a range of relevant and important research issues and challenges that have been generated by this body of work.

We first define the problem and review previous survey papers. Algorithmic approaches are then classified and discussed. These include early techniques (e.g. graph heuristics) and state-of-the-art approaches including meta-heuristics, constraint-based methods, multi-criteria techniques, hybridisations, and recent new trends concerning neighbourhood structures, which are motivated by raising the generality of the approaches. Summarising tables are presented to provide an overall view of these techniques. We discuss some issues on decomposition techniques, system tools and languages, models and complexity. We also present and discuss some important issues which have come to light concerning the public benchmark exam timetabling data. Different versions of problem datasets with the same name have been circulating in the scientific community in the last ten years, which has generated a significant amount of confusion. We clarify the situation and present a re-naming of the widely studied datasets to avoid future confusion. We also highlight which research papers have dealt with which dataset. Finally, we draw upon our discussion of the literature to present a (non-exhaustive) range of potential future research directions and open issues in exam timetabling research.

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we present a random iterative graph based hyper-heuristic to produce a collection of heuristic sequences to construct solutions of different quality. These heuristic sequences can be seen as dynamic hybridisations of different graph colouring heuristics that construct solutions step by step. Based on these sequences, we statistically analyse the way in which graph colouring heuristics are automatically hybridised. This, to our knowledge, represents a new direction in hyper-heuristic research. It is observed that spending the search effort on hybridising Largest Weighted Degree with Saturation Degree at the early stage of solution construction tends to generate high quality solutions. Based on these observations, an iterative hybrid approach is developed to adaptively hybridise these two graph colouring heuristics at different stages of solution construction. The overall aim here is to automate the heuristic design process, which draws upon an emerging research theme on developing computer methods to design and adapt heuristics automatically. Experimental results on benchmark exam timetabling and graph colouring problems demonstrate the effectiveness and generality of this adaptive hybrid approach compared with previous methods on automatically generating and adapting heuristics. Indeed, we also show that the approach is competitive with state-of-the-art human-produced methods.
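The hybridisation the paper analyses can be illustrated with a greedy graph-colouring constructor that switches ordering heuristics part-way through: Largest Weighted Degree early, Saturation Degree later. This is a minimal sketch under assumed data structures; the switch fraction and the reading of "weighted degree" as degree scaled by a vertex weight (e.g. exam enrolment) are illustrative simplifications of the paper's heuristic sequences.

```python
def construct_colouring(graph, weights, switch_frac=0.3):
    """Greedy colouring: Largest Weighted Degree early, Saturation Degree late.

    graph:   dict vertex -> set of adjacent vertices
    weights: dict vertex -> weight (e.g. exam enrolment), illustrative
    """
    colouring, uncoloured = {}, set(graph)
    switch_point = int(switch_frac * len(graph))
    while uncoloured:
        if len(colouring) < switch_point:
            # Early stage: hardest-looking vertices by weighted degree.
            v = max(uncoloured, key=lambda u: weights[u] * len(graph[u]))
        else:
            # Later stage: most saturated vertices (distinct neighbour colours).
            v = max(uncoloured,
                    key=lambda u: len({colouring[n] for n in graph[u]
                                       if n in colouring}))
        used = {colouring[n] for n in graph[v] if n in colouring}
        colouring[v] = next(c for c in range(len(graph)) if c not in used)
        uncoloured.remove(v)
    return colouring
```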

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we present an investigation into using fuzzy methodologies to guide the construction of high quality feasible examination timetabling solutions. The provision of automated solutions to the examination timetabling problem is achieved through a combination of construction and improvement. The enhancement of solutions through the use of techniques such as metaheuristics is, in some cases, dependent on the quality of the solution obtained during the construction process. With a few notable exceptions, recent research has concentrated on the improvement of solutions as opposed to focusing on investigating the ‘best’ approaches to the construction phase. Addressing this issue, our approach is based on combining multiple criteria in deciding on how the construction phase should proceed. Fuzzy methods were used to combine three single construction heuristics into three different pairwise combinations of heuristics in order to guide the order in which exams were selected to be inserted into the timetable solution. In order to investigate the approach, we compared the performance of the various heuristic approaches with respect to a number of important criteria (overall cost penalty, number of skipped exams, number of iterations of a rescheduling procedure required, and computational time) on twelve well-known benchmark problems. We demonstrate that the fuzzy combination of heuristics allows high quality solutions to be constructed. On one of the twelve problems we obtained a lower penalty than any previously published constructive method, and for all twelve we obtained a lower penalty than when any of the single heuristics was used alone. Furthermore, we demonstrate that the fuzzy approach used less backtracking when constructing solutions than any of the single heuristics. We conclude that this novel fuzzy approach is a highly effective method for heuristically constructing solutions and, as such, has particular relevance to real-world situations in which the construction of feasible solutions is often a difficult task in its own right.
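A heavily simplified sketch of such a pairwise fuzzy combination: normalise two heuristic values for each exam and fuse them with a fuzzy AND (here the product t-norm) into a single difficulty score used to order the exams. The real system uses a full fuzzy rule base with membership functions; the function below only conveys the flavour of the approach, and all names are hypothetical.

```python
def fuzzy_difficulty(sat_deg, weighted_deg, sat_max, deg_max):
    """Fuse two normalised heuristic values into one ordering score (sketch)."""
    mu_sat = sat_deg / sat_max if sat_max else 0.0       # "highly saturated" membership
    mu_deg = weighted_deg / deg_max if deg_max else 0.0  # "large degree" membership
    return mu_sat * mu_deg  # product t-norm as the fuzzy AND

# Exams with the highest fused difficulty are inserted into the timetable first.
```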

Relevance:

10.00%

Publisher:

Abstract:

In many domains, when several competing classifiers are available, we want to synthesize some or all of them into a more accurate classifier by means of a combination function. In this paper we propose a ‘class-indifferent’ method for combining classifier decisions represented by evidential structures called triplet and quartet, using Dempster's rule of combination. This method is unique in that it distinguishes important elements from trivial ones in representing classifier decisions, makes use of more information than others in calculating the support for class labels, and provides a practical way to apply the theoretically appealing Dempster–Shafer theory of evidence to the problem of ensemble learning. We present a formalism for modelling classifier decisions as triplet mass functions and establish a range of formulae for combining these mass functions in order to arrive at a consensus decision. In addition, we carry out a comparative study with the alternatives of the simplet and dichotomous structures, and compare two combination methods, Dempster's rule and majority voting, over the UCI benchmark data to demonstrate the advantage our approach offers. (A continuation of work in this area previously published in IEEE Transactions on Knowledge and Data Engineering and at conferences.)
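Dempster's rule itself is standard and easy to state in code: multiply the masses of all pairs of focal elements, assign each product to the intersection, and renormalise by the mass not lost to empty intersections (the conflict). Below is a self-contained sketch; the triplet-style mass functions at the end, which keep only the strongest singletons plus the whole frame, use hypothetical class labels.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    m1, m2: dicts mapping frozenset focal elements to masses summing to 1.
    Returns the normalised combined mass function.
    """
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb                 # mass lost to empty intersections
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Triplet-style decisions: two strongest singletons plus the whole frame
# (class labels are hypothetical).
m_clf1 = {frozenset({"A"}): 0.6, frozenset({"B"}): 0.3,
          frozenset({"A", "B", "C"}): 0.1}
m_clf2 = {frozenset({"A"}): 0.5, frozenset({"C"}): 0.2,
          frozenset({"A", "B", "C"}): 0.3}
print(dempster_combine(m_clf1, m_clf2))
```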

Relevance:

10.00%

Publisher:

Abstract:

We present experimental results on benchmark problems in 3D cubic lattice structures with the Miyazawa–Jernigan energy function for two local search procedures that utilise the pull-move set: (i) population-based local search (PLS), which traverses the energy landscape with greedy steps towards (potential) local minima followed by upward steps up to a certain level of the objective function; (ii) simulated annealing with a logarithmic cooling schedule (LSA). The parameter settings for PLS are derived from short LSA runs executed in pre-processing, and the procedure utilises tabu lists generated for each member of the population. In terms of the total number of energy function evaluations, the two methods perform equally well. However, PLS has the potential of being parallelised, with an expected speed-up in the region of the population size. Furthermore, both methods require a significantly smaller number of function evaluations when compared to Monte Carlo simulations with kink-jump moves.
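The logarithmic cooling schedule behind LSA is simple to state: at step k the temperature is T_k = c / ln(k + 2), the schedule for which simulated annealing is known to converge (slowly) to minimum-energy states, with c a problem-dependent constant related to the depth of local minima. A minimal Metropolis acceptance test under this schedule might look as follows; the default value of c and the energy-difference convention are assumptions.

```python
import math
import random

def lsa_accept(delta_e, step, c=2.0):
    """Metropolis acceptance under logarithmic cooling (sketch).

    delta_e: energy change of the proposed pull-move (negative = downhill)
    step:    current iteration index k
    c:       problem-dependent constant (illustrative default)
    """
    if delta_e <= 0:
        return True                   # downhill moves are always accepted
    t = c / math.log(step + 2)        # T_k = c / ln(k + 2)
    return random.random() < math.exp(-delta_e / t)
```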

Relevance:

10.00%

Publisher:

Abstract:

Purpose: The purpose of this paper is to examine the extent and nature of greening the supply chain (SC) in the UK manufacturing sector, and the factors that influence the breadth and depth of this activity.

Design/methodology/approach: The paper is based on findings from a sample of manufacturing organisations drawn from the membership of The Chartered Institute for Purchasing and Supply. Data were collected using a questionnaire, piloted and pre-tested before distribution, with responses from 60 manufacturing companies.

Findings: On average, manufacturers perceive the greatest pressure to improve environmental performance through legislation and internal drivers (IDs). The least influential pressures are related to societal drivers and SC pressures from individual customers. Green supply chain management (GSCM) practices amongst this “average” group of UK manufacturing organisations focus on internal, higher-risk, descriptive activities rather than proactive, external engagement processes. Environmental attitude (EA) is a key predictor of GSCM activity, and those organisations that have a progressive attitude are also operationally very active. EA shows some relationship to legislative drivers, but other factors are also influential. Operational activity may also be moderated by organisational contingencies such as risk, size, and nationality.

Research limitations/implications: The main limitation to this paper is the relatively small manufacturing sample.

Practical implications: This paper presents a series of constructs that identify GSCM operational activities for companies to benchmark themselves against. It suggests which factors are driving these operational changes and how industry contingencies may be influential.

Originality/value: This paper explores what is driving environmental behaviour amongst an “average” sample of manufacturers, what specific management practices take place and the relationships between them.

Keywords: Manufacturing industries, Environmental management, Supply chain management, Sustainable development, United Kingdom
Paper type: Research paper

Relevance:

10.00%

Publisher:

Abstract:

We have measured the two-electron contribution to the ground-state energy of helium-like argon ions using an electron beam ion trap (EBIT). A two-dimensional map was measured showing the intensity of x-rays from the trap passing through a krypton-filled absorption cell, with electron beam energy and x-ray energy as the independent axes. From this map, we deduced the two-electron contribution to the ground state of helium-like argon. This experimentally determined value (312.4 ± 9.5 eV) was found to be in good agreement with our calculated value (about 303.35 eV) and previous calculations of the same quantity. Based on these measurements, we have shown that a ten-day absorption spectroscopy run with a super-EBIT should be sufficient to provide a new benchmark value for the two-electron contribution to the ground state of helium-like krypton. Such a measurement would then constitute a test of quantum electrodynamics to second order.

Relevance:

10.00%

Publisher:

Abstract:

To increase eco-efficiency, environmental information needs to be integrated into corporate decision making. For decision makers, however, the interpretation of eco-efficiency as a ratio can be quite difficult in practice. One of the reasons for this is that eco-efficiency as a ratio is measured in a unit that is difficult to interpret. This article therefore suggests an alternative measure for eco-efficiency. The Environmental Value Added, the measure proposed in this paper, reflects the excess economic benefit resulting from the difference between the eco-efficiency under consideration and a benchmark eco-efficiency. It is measured in a purely monetary unit and is thus easier to interpret and integrate than eco-efficiency as a ratio.
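Read literally, the definition in the abstract suggests a formula of the following shape; the notation (EV for economic value created, EI for environmental impact added, EE = EV/EI for eco-efficiency, EE_B for the benchmark eco-efficiency) is ours, not the article's:

```latex
% Hypothetical notation: EE = EV / EI is the eco-efficiency under
% consideration and EE_B the benchmark eco-efficiency.
\[
  \mathrm{EVA} \;=\; EI \cdot \bigl( EE - EE_{B} \bigr) \;=\; EV - EI \cdot EE_{B}
\]
```

On this reading, EVA is the economic value created in excess of what a benchmark-efficient actor would have created with the same environmental impact, which is why it comes out in purely monetary units.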

Relevance:

10.00%

Publisher:

Abstract:

Nurse rostering is a difficult search problem with many constraints. In the literature, a number of approaches have been investigated, including penalty function methods, to tackle these constraints within genetic algorithm frameworks. In this paper, we investigate an extension of a previously proposed stochastic ranking method, which has demonstrated superior performance to other constraint handling techniques when tested against a set of constrained optimisation benchmark problems. An initial experiment on nurse rostering problems demonstrates that the stochastic ranking method is better at finding feasible solutions but fails to obtain good results with regard to the objective function. To improve the performance of the algorithm, we hybridise it with a recently proposed simulated annealing hyper-heuristic within a local search and genetic algorithm framework. The hybrid algorithm shows significant improvement over both the genetic algorithm with stochastic ranking and the simulated annealing hyper-heuristic alone. The hybrid algorithm also considerably outperforms the methods in the literature that previously held the best known results.
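Stochastic ranking, in the original Runarsson and Yao formulation that this work extends, is a bubble-sort-like procedure: adjacent individuals are compared by the objective with probability Pf (or whenever both are feasible) and by constraint violation otherwise. A compact sketch, with f, phi and pf as assumed interfaces:

```python
import random

def stochastic_rank(pop, f, phi, pf=0.45):
    """Stochastic ranking sketch (Runarsson & Yao style).

    pop: candidate rosters; f: objective (lower is better);
    phi: total constraint violation (0 for feasible rosters).
    """
    pop = list(pop)
    for _ in range(len(pop)):          # at most N bubble-sort sweeps
        swapped = False
        for i in range(len(pop) - 1):
            a, b = pop[i], pop[i + 1]
            if (phi(a) == phi(b) == 0) or random.random() < pf:
                out_of_order = f(a) > f(b)      # compare by objective
            else:
                out_of_order = phi(a) > phi(b)  # compare by violation
            if out_of_order:
                pop[i], pop[i + 1] = b, a
                swapped = True
        if not swapped:                # early exit once the ordering is stable
            break
    return pop
```

Tuning pf below 0.5 biases the ranking towards feasibility while still letting good infeasible rosters survive, which matches the trade-off the abstract describes.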

Relevance:

10.00%

Publisher:

Abstract:

This article examines the national and regional pressures in Northern Ireland in the post-war period for parity in public sector pay with the rest of the UK. Northern Ireland had a devolved legislature and government within the UK from 1921 and was constitutionally in an essentially federal relationship with the rest of the UK. However, the Stormont Government chose to use legislative devolution to minimize policy differences with the rest of the UK. The article highlights the national industrial relations environment as the backdrop for provincial developments in pay setting. It establishes the important role played by the Social Services Agreement negotiated with the Labour Government at Westminster in triggering the principle of parity in public sector pay in the early post-war years. The principle of pay parity subsequently became a benchmark for regional trade union coercive comparisons in collective bargaining across the devolved public sector. The article highlights the policy relevance of these developments both to the UK Treasury and to devolved governments in the UK, as they address the issue of regional public sector pay.

Relevance:

10.00%

Publisher:

Abstract:

Purpose: Environmental turbulence, including rapid changes in technology and markets, has resulted in the need for new approaches to performance measurement and benchmarking. There is a need for studies that attempt to measure and benchmark upstream, leading or developmental aspects of organizations. Therefore, the aim of this paper is twofold. The first is to conduct an in-depth case analysis of lead performance measurement and benchmarking, leading to the further development of a conceptual model derived from the extant literature and initial survey data. The second is to outline future research agendas that could further develop the framework and the subject area.

Design/methodology/approach: A multiple case analysis involving repeated in-depth interviews with managers in organisational areas of upstream influence in the case organisations.

Findings: It was found that the effect of external drivers for lead performance measurement and benchmarking was mediated by organisational context factors, such as the level of progression in business improvement methods. Moreover, although the business improvement methods used for this purpose were typical, their legitimation had been extended beyond their original purpose through the development of bespoke sets of lead measures.

Practical implications: Examples of methods and lead measures are given that can be used by organizations in developing a programme of lead performance measurement and benchmarking.

Originality/value: There is a paucity of in-depth studies relating to the theory and practice of lead performance measurement and benchmarking in organisations.