928 results for BENCHMARK


Relevance:

10.00%

Publisher:

Abstract:

This project was commissioned to generate an improved understanding of the sensitivities of seagrass habitats to pressures associated with human activities in the marine environment, to provide an evidence base to facilitate and support management advice for Marine Protected Areas, development of UK marine monitoring and assessment, and conservation advice to offshore marine industries. Seagrass bed habitats are identified as a Priority Marine Feature (PMF) under the Marine (Scotland) Act 2010; they are also included on the OSPAR list of threatened and declining species and habitats, and are a Habitat of Principal Importance (HPI) under the Natural Environment and Rural Communities (NERC) Act 2006 in England and Wales. The purpose of this project was to produce sensitivity assessments with supporting evidence for the HPI, OSPAR and PMF seagrass/Zostera bed habitat definitions, clearly documenting the evidence behind the assessments and any differences between assessments. Nineteen pressures, falling into five categories - biological, hydrological, physical damage, physical loss, and pollution and other chemical changes - were assessed in this report. Assessments were based on the three British seagrasses Zostera marina, Z. noltei and Ruppia maritima. Z. marina var. angustifolia was considered to be a subspecies of Z. marina, but it was specified where studies had considered it as a species in its own right. Where possible, other components of the community were investigated, but the basis of the assessment focused on the seagrass species. To develop each sensitivity assessment, the resistance and resilience of the key elements were assessed against the pressure benchmark using the available evidence. The benchmarks were designed to provide a ‘standard’ level of pressure against which to assess sensitivity. Overall, seagrass beds were highly sensitive to a number of human activities:

• penetration or disturbance of the substratum below the surface;
• habitat structure changes – removal of substratum;
• physical change to another sediment type;
• physical loss of habitat;
• siltation rate changes, including smothering; and
• changes in suspended solids.

High sensitivity was recorded for pressures which directly impacted the factors that limit seagrass growth and health, such as light availability. Physical pressures that caused mechanical modification of the sediment, and hence damage to roots and leaves, also resulted in high sensitivity. Seagrass beds were assessed as ‘not sensitive’ to microbial pathogens or ‘removal of target species’. These assessments were based on the benchmarks used. Z. marina is known to be sensitive to Labyrinthula zosterae, but this was not included in the benchmark used. Similarly, ‘removal of target species’ addresses only the biological effects of removal and not the physical effects of the process used. For example, seagrass beds are probably not sensitive to the removal of scallops found within the bed but are highly sensitive to the effects of dredging for scallops, as assessed under the pressure ‘penetration or disturbance of the substratum below the surface’. This is also an example of a synergistic effect between pressures. Where possible, synergistic effects were highlighted, but synergistic and cumulative effects are outside the scope of this study. The report found that no distinct differences in sensitivity exist between the HPI, PMF and OSPAR definitions.

Individual biotopes do, however, have different sensitivities to pressures. These differences were determined by the species affected, the position of the habitat on the shore and the sediment type. For instance, evidence showed that beds growing in soft and muddy sand were more vulnerable to physical damage than beds on harder, more compact substratum. Temporal effects can also influence the sensitivity of seagrass beds. On a seasonal time frame, physical damage to roots and leaves occurring in the reproductive season (the summer months) will have a greater impact than damage in winter. On a daily basis, the tidal regime could accentuate or attenuate the effects of pressures depending on high and low tide. A variety of factors must therefore be taken into account in order to assess the sensitivity of a particular seagrass habitat at any location. No clear difference in resilience was established across the three seagrass definitions assessed in this report. The resilience of seagrass beds, and their ability to recover from human-induced pressures, depends on a combination of the environmental conditions of the site, the growth rates of the seagrass, and the frequency and intensity of the disturbance. This highlights the importance of considering the species affected, the ecology of the seagrass bed, the environmental conditions, the types and nature of activities giving rise to the pressure, and the effects of that pressure. For example, pressures that result in sediment modification (e.g. pitting or erosion), sediment change or sediment removal prolong recovery. Therefore, the resilience of each biotope and habitat definition is discussed for each pressure. Using a clearly documented, evidence-based approach to create sensitivity assessments allows the assessment and any subsequent decision making or management plans to be readily communicated, transparent and justifiable. The assessments can be replicated and updated where new evidence becomes available, ensuring the longevity of the sensitivity assessment tool. The evidence review has reduced the uncertainty around assessments previously undertaken in the MB0102 project (Tillin et al., 2010) by assigning a single sensitivity score to each pressure as opposed to a range. Finally, as seagrass habitats may also contribute to ecosystem function and the delivery of ecosystem services, understanding the sensitivity of these biotopes may also support assessment and management in regard to these. Whatever objective measures are applied to data to assess sensitivity, the final sensitivity assessment is indicative. The evidence, the benchmarks, the confidence in the assessments and the limitations of the process require a sense-check by experienced marine ecologists before the outcome is used in management decisions.

Relevance:

10.00%

Publisher:

Abstract:

The Joint Nature Conservation Committee (JNCC) commissioned this project to generate an improved understanding of the sensitivities of Sabellaria spinulosa reefs based on the OSPAR habitat definition. This work aimed to provide an evidence base to facilitate and support management advice for Marine Protected Areas, development of UK marine monitoring and assessment, and conservation advice to offshore marine industries. The OSPAR list of threatened and declining species and habitats refers to subtidal S. spinulosa reefs on hard or mixed substratum. S. spinulosa may also occur as thin crusts or individual worms, but these are not the focus of conservation. The purpose of this project was to produce sensitivity assessments with supporting evidence for S. spinulosa reefs, clearly documenting the evidence behind the assessments and the confidence in these assessments. Sixteen pressures, falling into five categories - biological, hydrological, physical damage, physical loss, and pollution and other chemical changes - were assessed in this report. To develop each sensitivity assessment, the resistance and resilience of the key elements of the habitat were assessed against the pressure benchmark using the available evidence. The benchmarks were designed to provide a ‘standard’ level of pressure against which to assess sensitivity. The highest sensitivity (‘medium’) was recorded for physical pressures which directly impact the reefs, including:

• habitat structure changes – removal of substratum;
• abrasion and penetration and sub-surface disturbance;
• physical loss of habitat and change to habitat; and
• siltation rate changes, including smothering.

The report found no evidence for differences in the sensitivity of the three EUNIS S. spinulosa biotopes that comprise the OSPAR definition. However, this evidence review has identified significant information gaps regarding sensitivity, ecological interactions with other species and resilience. No clear difference in resilience was established across the OSPAR S. spinulosa biotopes that were assessed in this report. Using a clearly documented, evidence-based approach to create sensitivity assessments allows the assessment and any subsequent decision making or management plans to be readily communicated, transparent and justifiable. The assessments can be replicated and updated where new evidence becomes available, ensuring the longevity of the sensitivity assessment tool. Finally, as S. spinulosa habitats may also contribute to ecosystem function and the delivery of ecosystem services, understanding the sensitivity of these biotopes may also support assessment and management in regard to these. Whatever objective measures are applied to data to assess sensitivity, the final sensitivity assessment is indicative. The evidence, the benchmarks, the confidence in the assessments and the limitations of the process require a sense-check by experienced marine ecologists before the outcome is used in management decisions.

Relevance:

10.00%

Publisher:

Abstract:

The Joint Nature Conservation Committee (JNCC) commissioned this project to generate an improved understanding of the sensitivities of blue mussel (Mytilus edulis) beds, found in UK waters, to pressures associated with human activities in the marine environment. The work will provide an evidence base that will facilitate and support management advice for Marine Protected Areas, development of UK marine monitoring and assessment, and conservation advice to offshore marine industries. Blue mussel beds are identified as a Habitat of Principal Importance (HPI) under the Natural Environment and Rural Communities (NERC) Act 2006, as a Priority Marine Feature (PMF) under the Marine (Scotland) Act 2010, and are included on the OSPAR (Annex V) list of threatened and declining species and habitats. The purpose of this project was to produce sensitivity assessments for the blue mussel biotopes included within the HPI, PMF and OSPAR habitat definitions, and to clearly document the supporting evidence behind the assessments and any differences between them. A total of 20 pressures, falling into five categories - biological, hydrological, physical damage, physical loss, and pollution and other chemical changes - were assessed in this report. The review examined seven blue mussel bed biotopes found on littoral sediment and sublittoral rock and sediment. The assessments were based on the sensitivity of M. edulis rather than associated species, as M. edulis was considered the most important characteristic species in blue mussel beds. To develop each sensitivity assessment, the resistance and resilience of the key elements were assessed against the pressure benchmark using the available evidence gathered in this review. The benchmarks were designed to provide a ‘standard’ level of pressure against which to assess sensitivity. Blue mussel beds were highly sensitive to a few human activities:

• introduction or spread of non-indigenous species (NIS);
• habitat structure changes - removal of substratum (extraction); and
• physical loss (to land or freshwater habitat).

Physical loss of habitat and removal of substratum are particularly damaging pressures, while the sensitivity of blue mussel beds to non-indigenous species depended on the species assessed. Crepidula fornicata and Crassostrea gigas both had the potential to outcompete and replace mussel beds, resulting in a high sensitivity assessment. Mytilus spp. populations are considered to have a strong ability to recover from environmental disturbance. Good annual recruitment may allow a bed to recover rapidly, though this cannot always be expected due to the sporadic nature of M. edulis recruitment. Therefore, blue mussel beds were considered to have a 'Medium' resilience (recovery within 2-10 years). As a result, even where the removal or loss of a proportion of a mussel bed was expected due to a pressure, a sensitivity of 'Medium' was reported. Hence, most of the sensitivities reported were 'Medium'. It was noted, however, that the recovery rates of blue mussel beds were reported to be anywhere from two years to several decades. In addition, M. edulis is considered very tolerant of a range of physical and chemical conditions. As a result, blue mussel beds were considered to be 'Not sensitive' to changes in temperature, salinity, de-oxygenation, nutrient and organic enrichment, and substratum type, at the benchmark level of pressure. The report found that no distinct differences in overall sensitivity exist between the HPI, PMF and OSPAR definitions.

Individual biotopes do, however, have different sensitivities to pressures, and the OSPAR definition only includes blue mussel beds on sediment. These differences were determined by the position of the habitat on the shore and the sediment type. For example, the infralittoral rock biotope (A3.361) was unlikely to be exposed to pressures that affect sediments. However, in the case of increased water flow, mixed sediment biotopes were considered more stable and ‘Not sensitive’ (at the benchmark level), while the remaining biotopes were likely to be affected.

Using a clearly documented, evidence-based approach to create sensitivity assessments allows the assessment basis and any subsequent decision making or management plans to be readily communicated, transparent and justifiable. The assessments can be replicated and updated where new evidence becomes available, ensuring the longevity of the sensitivity assessment tool. For every pressure where sensitivity was previously assessed as a range of scores in MB0102, the assessment made by this evidence review has supported one of the MB0102 assessments. The evidence review has reduced the uncertainty around assessments previously undertaken in the MB0102 project (Tillin et al., 2010) by assigning a single sensitivity score to each pressure as opposed to a range. Finally, as blue mussel bed habitats also contribute to ecosystem function and the delivery of ecosystem services, understanding the sensitivity of these biotopes may also support assessment and management in regard to these. Whatever objective measures are applied to data to assess sensitivity, the final sensitivity assessment is indicative. The evidence, the benchmarks, the confidence in the assessments and the limitations of the process require a sense-check by experienced marine ecologists before the outcome is used in management decisions.

Relevance:

10.00%

Publisher:

Abstract:

We describe here a method of assessment for students. A number of shortcomings of traditional assessment methods, especially essays and examinations, are discussed, and an alternative assessment method, the student project, is suggested. The method aims not just to overcome the shortcomings of more traditional methods, but also to provide over-worked and under-resourced academics with viable primary data for socio-legal research work. Limitations of the method are discussed, with proposals for minimising their impact. The whole ‘student project’ approach is also discussed with reference to the Quality Assurance Agency benchmark standards for law degrees, standards which are expected of all institutions in the UK.

Relevance:

10.00%

Publisher:

Abstract:

Course scheduling consists of assigning lecture events to a limited set of specific timeslots and rooms. The objective is to satisfy as many soft constraints as possible while maintaining a feasible timetable. The most successful techniques to date require a compute-intensive examination of the solution neighbourhood to direct the search towards an optimum solution. Although they may require fewer neighbourhood moves than more exhaustive techniques to obtain comparable results, they can take considerably longer to achieve success. This paper introduces an extended version of the Great Deluge algorithm for the course timetabling problem which, while avoiding the problem of getting trapped in local optima, uses simple neighbourhood search heuristics to obtain solutions in a relatively short amount of time. The paper presents results on a standard set of benchmark datasets, beating over half of the currently published best results, in some cases by up to 60%.
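
The defining feature of the Great Deluge scheme is its acceptance rule: a move is accepted if it improves on the current solution or if its cost stays below a steadily falling 'water level'. The following minimal Python sketch illustrates that basic rule only; the cost, neighbour and decay_rate inputs are hypothetical placeholders, and the extensions introduced in the paper are not reproduced here.

```python
def great_deluge(initial, cost, neighbour, decay_rate, max_iters):
    """Basic Great Deluge acceptance scheme (sketch).

    Accept a candidate if it improves the current solution OR if its
    cost is below the falling water level; the latter permits some
    worsening moves, which is what avoids entrapment in local optima.
    """
    current, current_cost = initial, cost(initial)
    level = current_cost                      # initial water level
    best, best_cost = current, current_cost
    for _ in range(max_iters):
        candidate = neighbour(current)        # simple neighbourhood move
        candidate_cost = cost(candidate)
        if candidate_cost <= current_cost or candidate_cost <= level:
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        level -= decay_rate                   # water level falls each step
    return best, best_cost
```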

Relevance:

10.00%

Publisher:

Abstract:

A new search-space-updating technique for genetic algorithms is proposed for continuous optimisation problems. Rather than gradually reducing the search space during the evolution process with a fixed reduction rate set ‘a priori’, the upper and lower boundaries for each variable in the objective function are dynamically adjusted based on its distribution statistics. To test its effectiveness, the technique is applied to a number of benchmark optimisation problems and compared with three other techniques, namely genetic algorithms with parameter space size adjustment (GAPSSA) [A.B. Djurišic, Elite genetic algorithms with adaptive mutations for solving continuous optimization problems – application to modeling of the optical constants of solids, Optics Communications 151 (1998) 147–159], the successive zooming genetic algorithm (SZGA) [Y. Kwon, S. Kwon, S. Jin, J. Kim, Convergence enhanced genetic algorithm with successive zooming method for solving continuous optimization problems, Computers and Structures 81 (2003) 1715–1725] and a simple GA. The tests show that for well-posed problems, existing search-space-updating techniques perform well in terms of convergence speed and solution precision; however, for some ill-posed problems these techniques are statistically inferior to a simple GA. All the tests show that the proposed new search-space-updating technique is statistically superior to its counterparts.
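
As an illustration of the general idea, the sketch below recomputes each variable's bounds from the current population's distribution statistics. The specific rule used here (mean plus or minus k standard deviations, clipped to the original feasible region) is an assumption for illustration, not the paper's exact update.

```python
import numpy as np

def update_search_space(population, orig_lower, orig_upper, k=2.0):
    """Dynamically adjust per-variable bounds from the population's
    distribution statistics (sketch: mean +/- k standard deviations),
    never expanding beyond the original feasible region.

    population: array of shape (pop_size, n_vars)
    """
    mean = population.mean(axis=0)
    std = population.std(axis=0)
    new_lower = np.maximum(orig_lower, mean - k * std)
    new_upper = np.minimum(orig_upper, mean + k * std)
    return new_lower, new_upper
```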

Relevance:

10.00%

Publisher:

Abstract:

In this paper, a parallel-matching processor architecture with early jump-out (EJO) control is proposed to carry out high-speed biometric fingerprint database retrieval. The processor performs the fingerprint retrieval using minutia point matching. An EJO method is applied to the proposed architecture to speed up retrieval from large databases. The processor is implemented on a Xilinx Virtex-E FPGA, occupies 6,825 slices and runs at up to 65 MHz. A software/hardware co-simulation benchmark with a database of 10,000 fingerprints verifies that the matching speed can reach up to 1.22 million fingerprints per second. EJO yields a gain of about 22% in computing efficiency.
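
The early jump-out idea can be shown in a few lines of software, although the paper realises it in FPGA hardware. In this hypothetical sketch, the comparison against a database record is abandoned as soon as the acceptance threshold can no longer be reached; match_minutia is a placeholder for the minutia point-matching test.

```python
def ejo_match_score(probe_minutiae, record, threshold, match_minutia):
    """Count matched minutiae, jumping out early once the score can
    no longer reach the threshold. Returns None on early rejection.
    """
    score = 0
    remaining = len(probe_minutiae)
    for m in probe_minutiae:
        remaining -= 1
        if match_minutia(m, record):       # placeholder point-match test
            score += 1
        if score + remaining < threshold:  # best case cannot reach threshold
            return None                    # early jump-out
    return score
```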

Relevance:

10.00%

Publisher:

Abstract:

We present results from three-dimensional protein folding simulations in the HP model on ten benchmark problems. The simulations are executed by a simulated annealing-based algorithm with a time-dependent cooling schedule. The neighbourhood relation is determined by the pull-move set. The results provide experimental evidence that the maximum depth D of local minima of the underlying energy landscape can be upper bounded by D < n^(2/3). The local search procedure employs the stopping criterion (m/delta)^(D/gamma), where m is an estimation of the average number of neighbouring conformations, gamma relates to the mean of non-zero differences of the objective function for neighbouring conformations, and 1-delta is the confidence that a minimum conformation has been found. The bound complies with the results obtained for the ten benchmark problems.
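
To make the stopping criterion concrete, the sketch below computes the implied number of local-search steps for hypothetical parameter values; the values chosen are illustrative only and are not taken from the paper.

```python
import math

def stopping_steps(m, delta, D, gamma):
    """Steps implied by the stopping criterion (m/delta)^(D/gamma):
    m     - estimated average number of neighbouring conformations,
    gamma - mean non-zero objective difference between neighbours,
    D     - estimated maximum depth of local minima,
    delta - 1-delta is the confidence a minimum has been found.
    """
    return math.ceil((m / delta) ** (D / gamma))

# Illustrative values: a chain of length n, with D bounded by n^(2/3).
n = 48
D = n ** (2.0 / 3.0)
print(stopping_steps(m=100, delta=0.05, D=D, gamma=4.0))
```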

Relevance:

10.00%

Publisher:

Abstract:

The total cross sections for single ionization of helium and single and double ionization of argon by antiproton impact have been measured in the kinetic energy range from 3 to 25 keV using a new technique for the creation of intense slow antiproton beams. The new data provide benchmark results for the development of advanced descriptions of atomic collisions, and we show that they can be used to judge, for the first time, the validity of many recent theories.

Relevance:

10.00%

Publisher:

Abstract:

Computationally efficient sequential learning algorithms are developed for direct-link resource-allocating networks (DRANs). These are achieved by decomposing existing recursive training algorithms on a layer-by-layer and neuron-by-neuron basis. This allows network weights to be updated in an efficient parallel manner and facilitates the implementation of minimal-update extensions that yield a significant reduction in computation load per iteration compared to existing sequential learning methods employed in resource-allocating network (RAN) and minimal RAN (MRAN) approaches. The new algorithms, which also incorporate a pruning strategy to control network growth, are evaluated on three different system identification benchmark problems and shown to outperform existing methods both in terms of training error convergence and computational efficiency.
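
The spirit of a minimal update can be illustrated with a toy sequential step for a Gaussian RBF-style network, where only the most strongly activated neuron is adjusted for each new sample. This is a hypothetical sketch of the general idea, not the paper's exact DRAN equations.

```python
import numpy as np

def minimal_update_step(x, y, centres, widths, weights, lr=0.1):
    """Sketch of a 'minimal update': rather than adapting every neuron
    for each new sample (as full recursive training does), only the
    most strongly activated neuron is adjusted, cutting the
    computation load per iteration.

    centres: (N, d) array, widths: (N,) array, weights: (N,) array.
    """
    # Gaussian activations of all hidden neurons for input x
    phi = np.exp(-np.sum((centres - x) ** 2, axis=1) / widths ** 2)
    pred = phi @ weights                 # network output
    err = y - pred
    j = int(np.argmax(phi))              # most activated neuron
    weights[j] += lr * err * phi[j]      # update only that neuron's weight
    return pred, err
```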

Relevance:

10.00%

Publisher:

Abstract:

PEGS (Production and Environmental Generic Scheduler) is a generic production scheduler that produces good schedules over a wide range of problems. It is centralised, using search strategies with the Shifting Bottleneck algorithm. We have also developed an alternative distributed approach using software agents. In some cases this reduces run times by a factor of 10 or more. In most cases, the agent-based program also produces good solutions for published benchmark data, and the short run times make our program useful for a large range of problems. Test results show that the agents can produce schedules comparable to the best found so far for some benchmark datasets and actually better schedules than PEGS on our own random datasets. The flexibility that agents can provide for today's dynamic scheduling is also appealing. We suggest that in this sort of generic or commercial system, the agent-based approach is a good alternative.

Relevance:

10.00%

Publisher:

Abstract:

Surrogate-based optimization methods provide a means to achieve high-fidelity design optimization at reduced computational cost by using a high-fidelity model in combination with lower-fidelity models that are less expensive to evaluate. This paper presents a provably convergent trust-region model-management methodology for variable-parameterization design models: that is, models for which the design parameters are defined over different spaces. Corrected space mapping is introduced as a method to map between the variable-parameterization design spaces. It is then used with a sequential-quadratic-programming-like trust-region method for two aerospace-related design optimization problems. Results for a wing design problem and a flapping-flight problem show that the method outperforms direct optimization in the high-fidelity space. On the wing design problem, the new method achieves 76% savings in high-fidelity function calls. On a bat-flight design problem, it achieves approximately 45% time savings, although it converges to a different local minimum than the benchmark.
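
The trust-region management logic at the core of such frameworks can be summarised in a short, generic sketch using standard textbook constants; the paper's corrected space mapping between design spaces is not reproduced here.

```python
def trust_region_update(rho, radius, eta1=0.25, eta2=0.75,
                        shrink=0.5, grow=2.0):
    """Classic trust-region management: rho is the ratio of the actual
    high-fidelity improvement to the improvement predicted by the
    (corrected) lower-fidelity surrogate. Reliable predictions enlarge
    the region in which the surrogate is trusted; poor ones shrink it,
    which is what underpins provable convergence.
    """
    accept = rho >= eta1      # accept the step only if prediction was adequate
    if rho < eta1:
        radius *= shrink      # surrogate unreliable here: shrink region
    elif rho > eta2:
        radius *= grow        # surrogate very accurate: expand region
    return accept, radius
```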

Relevance:

10.00%

Publisher:

Abstract:

Exam timetabling is one of the most important administrative activities that take place in academic institutions. In this paper we present a critical discussion of the research on exam timetabling in the last decade or so. These last ten years have seen an increased level of attention on this important topic. There has been a range of significant contributions to the scientific literature, both in terms of theoretical and practical aspects. The main aim of this survey is to highlight the new trends and key research achievements of the last decade. We also aim to outline a range of relevant and important research issues and challenges that have been generated by this body of work.

We first define the problem and review previous survey papers. Algorithmic approaches are then classified and discussed. These include early techniques (e.g. graph heuristics) and state-of-the-art approaches including meta-heuristics, constraint-based methods, multi-criteria techniques, hybridisations, and recent new trends concerning neighbourhood structures, which are motivated by raising the generality of the approaches. Summarising tables are presented to provide an overall view of these techniques. We discuss some issues concerning decomposition techniques, system tools and languages, models and complexity. We also present and discuss some important issues which have come to light concerning the public benchmark exam timetabling data. Different versions of problem datasets with the same name have been circulating in the scientific community in the last ten years, which has generated a significant amount of confusion. We clarify the situation and present a re-naming of the widely studied datasets to avoid future confusion. We also highlight which research papers have dealt with which dataset. Finally, we draw upon our discussion of the literature to present a (non-exhaustive) range of potential future research directions and open issues in exam timetabling research.

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we present a random iterative graph-based hyper-heuristic to produce a collection of heuristic sequences to construct solutions of different quality. These heuristic sequences can be seen as dynamic hybridisations of different graph colouring heuristics that construct solutions step by step. Based on these sequences, we statistically analyse the way in which graph colouring heuristics are automatically hybridised. This, to our knowledge, represents a new direction in hyper-heuristic research. It is observed that spending the search effort on hybridising Largest Weighted Degree with Saturation Degree at the early stage of solution construction tends to generate high quality solutions. Based on these observations, an iterative hybrid approach is developed to adaptively hybridise these two graph colouring heuristics at different stages of solution construction. The overall aim here is to automate the heuristic design process, which draws upon an emerging research theme on developing computer methods to design and adapt heuristics automatically. Experimental results on benchmark exam timetabling and graph colouring problems demonstrate the effectiveness and generality of this adaptive hybrid approach compared with previous methods for automatically generating and adapting heuristics. Indeed, we also show that the approach is competitive with the state-of-the-art human-produced methods.
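
A minimal sketch of this kind of two-stage hybridisation is given below, with Largest Weighted Degree ordering the early insertions and Saturation Degree the rest. The weight, saturation and assign callbacks and the split point are hypothetical placeholders rather than the paper's adaptive mechanism.

```python
def hybrid_construct(exams, weight, saturation, assign, split=0.3):
    """Construct a timetable by hybridising two graph colouring
    heuristics: Largest Weighted Degree (LWD) early, Saturation
    Degree (SD) later, echoing the observation that this ordering
    tends to yield high quality solutions.
    """
    ordered = sorted(exams, key=weight, reverse=True)  # LWD ordering
    cutoff = int(split * len(ordered))
    for exam in ordered[:cutoff]:
        assign(exam)                                   # early stage: LWD
    rest = ordered[cutoff:]
    while rest:
        # Saturation degree changes as exams are placed, so re-rank
        # before each pick; the most saturated exam is scheduled next.
        rest.sort(key=saturation, reverse=True)
        assign(rest.pop(0))                            # later stage: SD
```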

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we present an investigation into using fuzzy methodologies to guide the construction of high quality feasible examination timetabling solutions. The provision of automated solutions to the examination timetabling problem is achieved through a combination of construction and improvement. The enhancement of solutions through techniques such as metaheuristics is, in some cases, dependent on the quality of the solution obtained during the construction process. With a few notable exceptions, recent research has concentrated on the improvement of solutions, as opposed to investigating the ‘best’ approaches to the construction phase. Addressing this issue, our approach is based on combining multiple criteria in deciding how the construction phase should proceed. Fuzzy methods were used to combine three single construction heuristics into three different pairwise combinations of heuristics in order to guide the order in which exams were selected to be inserted into the timetable solution. To investigate the approach, we compared the performance of the various heuristic approaches with respect to a number of important criteria (overall cost penalty, number of skipped exams, number of iterations of a rescheduling procedure required, and computational time) on twelve well-known benchmark problems. We demonstrate that the fuzzy combination of heuristics allows high quality solutions to be constructed. On one of the twelve problems we obtained a lower penalty than any previously published constructive method, and on all twelve we obtained a lower penalty than when any of the single heuristics were used alone. Furthermore, we demonstrate that the fuzzy approach used less backtracking when constructing solutions than any of the single heuristics. We conclude that this novel fuzzy approach is a highly effective method for heuristically constructing solutions and, as such, has particular relevance to real-world situations in which the construction of feasible solutions is often a difficult task in its own right.
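
As an illustration of combining two construction heuristics into a single ordering criterion, the sketch below normalises each heuristic over the unscheduled exams and merges the pair. A simple weighted average stands in for the paper's fuzzy rule base, and h1/h2 are hypothetical heuristic functions (e.g. largest degree and saturation degree).

```python
def combined_priority(exam, unscheduled, h1, h2, w1=0.5, w2=0.5):
    """Merge two heuristic values into one construction priority.

    Each heuristic is min-max normalised to [0, 1] over the currently
    unscheduled exams; exams with the highest combined priority are
    inserted into the timetable first. (A weighted average is used as
    a stand-in for a fuzzy inference system.)
    """
    def normalised(h):
        vals = [h(e) for e in unscheduled]
        lo, hi = min(vals), max(vals)
        return 0.0 if hi == lo else (h(exam) - lo) / (hi - lo)
    return w1 * normalised(h1) + w2 * normalised(h2)
```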