924 results for Resources use optimization


Relevance: 30.00%

Abstract:

Many ecosystem services are delivered by organisms that depend on habitats that are segregated spatially or temporally from the location where services are provided. Management of mobile organisms contributing to ecosystem services requires consideration not only of the local scale where services are delivered, but also of the distribution of resources at the landscape scale and of the foraging ranges and dispersal movements of the mobile agents. We develop a conceptual model for exploring how one such mobile-agent-based ecosystem service (MABES), pollination, is affected by land-use change, and then generalize the model to other MABES. The model includes interactions and feedbacks among policies affecting land use, market forces, and the biology of the organisms involved. Animal-mediated pollination contributes to the production of goods of value to humans, such as crops; it also bolsters reproduction of wild plants on which other services or service-providing organisms depend. About one-third of crop production depends on animal pollinators, while 60-90% of plant species require an animal pollinator. The sensitivity of mobile organisms to ecological factors that operate across spatial scales makes the services provided by a given community of mobile agents highly contextual. Services vary depending on the spatial and temporal distribution of resources surrounding the site and on biotic interactions occurring locally, such as competition among pollinators for resources and among plants for pollinators. The value of the resulting goods or services may feed back via market-based forces to influence land-use policies, which in turn influence land management practices that alter local habitat conditions and landscape structure. Developing conceptual models for MABES aids in identifying knowledge gaps, determining research priorities, and targeting interventions that can be applied in an adaptive management context.

Relevance: 30.00%

Abstract:

A series of government initiatives has raised both the profile of ICT in the curriculum and the expectation that high-quality teaching and learning resources will be accessible across electronic networks. For e-learning resources such as websites to have the maximum educational impact, teachers need to be involved in their design and development. Use-case analysis provides a means of defining user requirements and other constraints in such a way that software developers can produce e-learning resources which reflect teachers' professional knowledge and support their classroom practice. It has some features in common with the participatory action research used to develop other aspects of classroom practice. Two case studies are presented: one involves the development of an on-line resource centred on transcripts of original historical documents; the other describes how 'Learning how to Learn', a major distributed research project funded under the ESRC Teaching and Learning Research Programme, is using use-case analysis to develop web resources and services.

Relevance: 30.00%

Abstract:

This paper deals with the design of optimal multiple gravity assist trajectories with deep space manoeuvres. A pruning method which considers the sequential nature of the problem is presented. The method locates feasible vectors using local optimization and applies a clustering algorithm to find reduced bounding boxes which can be used in a subsequent optimization step. Since multiple local minima remain within the pruned search space, the use of a global optimization method, such as Differential Evolution, is suggested for finding solutions which are likely to be close to the global optimum. Two case studies are presented.
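
The global step suggested here can be illustrated with a standard Differential Evolution solver. The sketch below is a minimal, hypothetical example: the objective function and the bounds are placeholders standing in for the trajectory's delta-v cost and the reduced bounding boxes produced by the pruning and clustering stage.

```python
# Minimal sketch: Differential Evolution over a pruned search box.
# Objective and bounds are placeholders, not the paper's trajectory model.
import numpy as np
from scipy.optimize import differential_evolution

def delta_v(x):
    # Stand-in for the total delta-v of a multiple-gravity-assist
    # trajectory with deep space manoeuvres.
    return np.sum((x - 0.3) ** 2)

# Reduced bounding box, as the pruning/clustering step would return.
pruned_bounds = [(0.0, 1.0), (0.0, 1.0), (0.0, 1.0)]

result = differential_evolution(delta_v, pruned_bounds, seed=1)
print(result.x, result.fun)
```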

Relevance: 30.00%

Abstract:

Whilst radial basis function (RBF) equalizers have been employed to combat the linear and nonlinear distortions in modern communication systems, most of them do not take into account the equalizer's generalization capability. In this paper, it is first proposed that the model's generalization capability can be improved by treating the modelling problem as a multi-objective optimization (MOO) problem, with each objective based on one of several training sets. Then, as a modelling application, a new RBF equalizer learning scheme is introduced based on directional evolutionary MOO (EMOO). Directional EMOO improves the computational efficiency of conventional EMOO, which has been widely applied in solving MOO problems, by explicitly making use of directional information. Computer simulation demonstrates that the new scheme can be used to derive RBF equalizers with good performance not only in explaining the training samples but also in predicting unseen samples.

Relevance: 30.00%

Abstract:

The Danish Eulerian Model (DEM) is a powerful air pollution model, designed to calculate the concentrations of various dangerous species over a large geographical region (e.g. Europe). It takes into account the main physical and chemical processes between these species, the actual meteorological conditions, emissions, etc. This is a huge computational task and requires significant storage and CPU time, so parallel computing is essential for the efficient practical use of the model. Several efficient parallel versions of the model have been created over the past several years. A parallel version of DEM using the Message Passing Interface (MPI) library was implemented on two powerful supercomputers at EPCC, Edinburgh, available via the HPC-Europa programme for transnational access to research infrastructures in the EC: a Sun Fire E15K and an IBM HPCx cluster. Although the implementation is in principle the same for both supercomputers, a few modifications were needed to port the code successfully to the IBM HPCx cluster. Performance analysis and parallel optimization were carried out next, and results from benchmarking experiments are presented in this paper. Another set of experiments was carried out to investigate the sensitivity of the model to variation of some chemical rate constants in the chemical submodel, which required certain modifications of the code. The results obtained will be used for further sensitivity analysis studies using Monte Carlo simulation.
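
As an illustration of the parallelization strategy (not the model's actual code), the sketch below decomposes the concentration grid across MPI ranks using mpi4py; the grid size, species count, and chemistry step are invented placeholders.

```python
# Hypothetical sketch of an MPI domain decomposition in the style of DEM:
# each rank advances the chemistry on its own slab of the grid.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

NX, NSPECIES = 960, 35             # illustrative grid rows and species count
rows = NX // size                  # assumes NX is divisible by the rank count
local = np.ones((rows, NSPECIES))  # this rank's slab of concentrations

def chemistry_step(c, dt=1.0):
    # Placeholder for the chemical submodel (in reality a stiff ODE solve).
    return c * np.exp(-0.01 * dt)

local = chemistry_step(local)
slabs = comm.gather(local, root=0)  # collect slabs on rank 0 for output
if rank == 0:
    grid = np.vstack(slabs)
    print(grid.shape)
```

A script like this would be launched with, for example, `mpiexec -n 4 python dem_sketch.py`.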

Relevance: 30.00%

Abstract:

Combining geological knowledge with proved plus probable ('2P') oil discovery data indicates that over 60 countries are now past their resource-limited peak of conventional oil production. The data show that the global peak of conventional oil production is close. Many analysts who rely only on proved ('1P') oil reserves data draw a very different conclusion, but proved oil reserves contain no information about the true size of discoveries, being variously under-reported, over-reported and not reported. Reliance on 1P data has led to a number of misconceptions, including the notions that past oil forecasts were incorrect, that oil reserves grow very significantly due to technology gain, and that the global supply of oil is ensured provided sufficient investment is forthcoming to 'turn resources into reserves'. These misconceptions have been widely held, including within academia, governments, some oil companies, and organisations such as the IEA. In addition to conventional oil, the world contains large quantities of non-conventional oil. Most current detailed models show that, past the conventional oil peak, the non-conventional oils are unlikely to come on-stream fast enough to offset conventional oil's decline. To determine the extent of future oil supply constraints, calculations are required to determine fundamental rate limits for the production of non-conventional oils, as well as oil from gas, coal and biomass, and of oil substitution. Such assessments will need to examine technological readiness and lead-times, as well as rate constraints on investment, pollution, and net-energy return. (C) 2007 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

Purpose - The purpose of this paper is to identify the most popular techniques used to rank a web page highly in Google.
Design/methodology/approach - The paper presents the results of a study of 50 highly optimized web pages that were created as part of a Search Engine Optimization competition. The study focuses on the most popular techniques used to rank highest in this competition, and includes an analysis of the use of PageRank, number of pages, number of in-links, domain age, and the use of third-party sites such as directories and social bookmarking sites. A separate study was made of 50 non-optimized web pages for comparison.
Findings - The paper provides insight into the techniques that successful search engine optimizers use to ensure a page ranks highly in Google, and recognizes the importance of PageRank and links, as well as directories and social bookmarking sites.
Research limitations/implications - Only the top 50 web sites for a specific query were analysed. Analysing more web sites and comparing with similar studies in different competitions would provide more concrete results.
Practical implications - The paper offers a revealing insight into the techniques used by industry experts to rank highly in Google, and the success or otherwise of those techniques.
Originality/value - This paper fulfils an identified need for web sites and e-commerce sites keen to attract a wider web audience.
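
For readers unfamiliar with the measure, the sketch below shows the standard PageRank power iteration on a toy link graph. It illustrates the metric the study analyses, not the study's methodology or Google's production system.

```python
# Standard PageRank power iteration on a small invented link graph.
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}  # page -> pages it links to
n, d = 4, 0.85                               # number of pages, damping factor

pr = np.full(n, 1.0 / n)
for _ in range(50):
    new = np.full(n, (1 - d) / n)            # teleportation term
    for page, outs in links.items():
        for target in outs:
            new[target] += d * pr[page] / len(outs)
    pr = new
print(pr)  # scores sum to 1; page 2 ranks highest here
```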

Relevance: 30.00%

Abstract:

Traditional resource management has had as its main objective the optimization of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The Self-organizing ICT Resource Management (SORMA) project aims at allowing resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA's motivation is to achieve efficient resource utilization by maximizing revenue for resource providers and minimizing the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that the desired quality of service levels meet the expectations of market participants. This paper explains the proposed use of an economically enhanced resource manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximization across multiple service level agreements and provides an application scenario to demonstrate its usefulness and effectiveness. Copyright © 2008 John Wiley & Sons, Ltd.
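
To make the revenue-maximization goal concrete, the sketch below shows a deliberately simple greedy allocator over invented SLA offers; it is a toy illustration of the objective, not the EERM's actual economic model.

```python
# Hypothetical greedy allocation: accept the most profitable offers
# (by price per unit of resource) that fit within capacity.
def allocate(offers, capacity):
    """offers: list of (name, price, demand); returns accepted offer names."""
    accepted, used = [], 0
    for name, price, demand in sorted(
            offers, key=lambda o: o[1] / o[2], reverse=True):
        if used + demand <= capacity:
            accepted.append(name)
            used += demand
    return accepted

offers = [("sla_a", 10.0, 4), ("sla_b", 6.0, 1), ("sla_c", 7.0, 5)]
print(allocate(offers, capacity=6))  # -> ['sla_b', 'sla_a']
```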

Relevance: 30.00%

Abstract:

This paper describes the design, implementation and testing of an intelligent knowledge-based supervisory control (IKBSC) system for a hot rolling mill process. A novel architecture is used to integrate an expert system with an existing supervisory control system and a new optimization methodology for scheduling the soaking pits in which the material is heated prior to rolling. The resulting IKBSC system was applied to an aluminium hot rolling mill process to improve the shape quality of low-gauge plate and to optimise the use of the soaking pits to reduce energy consumption. The results from the trials demonstrate the advantages to be gained from the IKBSC system that integrates knowledge contained within data, plant and human resources with existing model-based systems. (c) 2005 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

In this paper, a new equalizer learning scheme is introduced based on the directional evolutionary multi-objective optimization (EMOO) algorithm. Whilst nonlinear channel equalizers such as radial basis function (RBF) equalizers have been widely studied to combat the linear and nonlinear distortions in modern communication systems, most of them do not take into account the equalizers' generalization capabilities. In this paper, equalizers are designed with the aim of improving their generalization capabilities. It is proposed that this objective can be achieved by treating the equalizer design problem as a multi-objective optimization (MOO) problem, with each objective based on one of several training sets, and then deriving equalizers that recover the signals well for all the training sets. Conventional EMOO, which is widely applied to MOO problems, suffers from disadvantages such as slow convergence. Directional EMOO improves the computational efficiency of conventional EMOO by explicitly making use of directional information. The new equalizer learning scheme based on directional EMOO is applied to RBF equalizer design. Computer simulation demonstrates that the new scheme can be used to derive RBF equalizers with good generalization capabilities, i.e., good performance in predicting unseen samples.
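
The multi-objective framing can be sketched concretely: one mean-squared-error objective per training set, with candidates compared by Pareto dominance. The Gaussian RBF form and the parameter layout below are generic assumptions; the directional EMOO search itself is not reproduced here.

```python
# Sketch of the per-training-set objectives an EMOO search would minimize.
import numpy as np

def rbf_out(x, centres, widths, weights):
    # Gaussian RBF equalizer output for inputs x of shape (n_samples, dim).
    d2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / widths ** 2) @ weights

def objectives(params, training_sets):
    # One MSE per training set: the objective vector of the MOO problem.
    centres, widths, weights = params
    return [np.mean((rbf_out(X, centres, widths, weights) - y) ** 2)
            for X, y in training_sets]

def dominates(a, b):
    # Standard Pareto dominance for comparing candidate equalizers.
    return all(p <= q for p, q in zip(a, b)) and any(p < q for p, q in zip(a, b))
```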

Relevance: 30.00%

Abstract:

The success of matrix-assisted laser desorption/ionisation (MALDI) in fields such as proteomics has been partially, but not exclusively, due to the development of improved data acquisition and sample preparation techniques. This has been required to overcome some of the shortcomings of the commonly used solid-state MALDI matrices such as α-cyano-4-hydroxycinnamic acid (CHCA) and 2,5-dihydroxybenzoic acid (DHB). Solid-state matrices form crystalline samples with highly inhomogeneous topography and morphology, which results in large fluctuations in analyte signal intensity from spot to spot and between positions within a spot. This means that efficient tuning of the mass spectrometer can be impeded and the use of MALDI MS for quantitative measurements is severely restricted. Recently, new MALDI liquid matrices have been introduced which promise to be an effective alternative to crystalline matrices. Generally, the liquid matrices comprise either ionic liquid matrices (ILMs) or a usually viscous liquid matrix doped with a UV light-absorbing chromophore [1-3]. The advantages are that the droplet surface is smooth and relatively uniform, with the analyte homogeneously distributed within it. The liquid matrices can replenish a sampling position between shots, negating the need to search for sample hot-spots, and their liquid nature allows additional additives to be used to change the environment to which the analyte is added.

Relevance: 30.00%

Abstract:

Deep Brain Stimulation (DBS) has been successfully used throughout the world for the treatment of Parkinson's disease symptoms. To control abnormal spontaneous electrical activity in target brain areas, DBS utilizes a continuous stimulation signal. This continuous power draw means that its implanted battery power source needs to be replaced every 18-24 months. To prolong the life span of the battery, a technique is discussed here to accurately recognize and predict the onset of Parkinson's disease tremors in human subjects, and thus to implement an on-demand stimulator. The approach is to use a radial basis function neural network (RBFNN) based on particle swarm optimization (PSO) and principal component analysis (PCA), with Local Field Potential (LFP) data recorded via the stimulation electrodes, to predict activity related to tremor onset. To test this approach, LFPs from the subthalamic nucleus (STN), obtained through deep brain electrodes implanted in a Parkinson's patient, are used to train the network. To validate the network's performance, electromyographic (EMG) signals from the patient's forearm are recorded in parallel with the LFPs to accurately determine occurrences of tremor, and these are compared to the performance of the network. It has been found that detection accuracies of up to 89% are possible. Performance comparisons have also been made between a conventional RBFNN and an RBFNN based on PSO, which show a marginal decrease in performance but a notable reduction in computational overhead.
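
A rough sketch of such a detection pipeline is given below, assuming windowed LFP feature vectors: PCA for dimensionality reduction followed by a Gaussian RBF network scoring tremor onset. The centres, widths, and weights are random placeholders for parameters the paper selects by PSO against EMG-labelled data.

```python
# Illustrative PCA + RBF-network pipeline with placeholder parameters.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))        # 200 mock LFP windows, 32 features each

# PCA via SVD: project onto the top 3 principal components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:3].T

# Gaussian RBF network; in the paper, PSO would tune these parameters.
centres = Z[rng.choice(len(Z), 5, replace=False)]
width, weights = 1.0, rng.normal(size=5)
d2 = ((Z[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
scores = np.exp(-d2 / width ** 2) @ weights
tremor_predicted = scores > 0.0       # threshold on the network output
print(tremor_predicted[:10])
```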

Relevance: 30.00%

Abstract:

Progress is reported in the development of a new synthesis method for the design of filters and coatings for use in spaceborne infrared optics. The method uses the Golden Section optimization routine to search, using designated dielectric thin-film combinations, for the coating design which fulfils the spectral requirements. The final design is the one which uses the fewest layers for the given thin-film materials in the starting design. This synthesis method has been used successfully to design broadband anti-reflection coatings on infrared substrates; the 6-18 micrometer anti-reflection coating for the germanium optics of the HIRDLS instrument, to be flown on the NASA EOS-Chem satellite, is given as an example. By correctly defining the target function to describe any specific type of filter in the optimization part of the method, the synthesis method may be used to design general filters for use in spaceborne infrared optics.
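
Golden-section search itself is a standard one-dimensional routine; a minimal implementation is sketched below against a placeholder merit function of a single layer thickness, whereas the synthesis method applies it inside a multilayer design loop.

```python
# Golden-section search: shrink [a, b] around the minimizer of f.
import math

def golden_section_min(f, a, b, tol=1e-6):
    invphi = (math.sqrt(5) - 1) / 2           # 1/phi ~ 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c                       # keep the left interval
            c = b - invphi * (b - a)
        else:
            a, c = c, d                       # keep the right interval
            d = a + invphi * (b - a)
    return (a + b) / 2

# Placeholder merit: squared deviation of a mock spectral response
# from its target, as a function of one layer thickness.
merit = lambda t: (t - 0.3) ** 2
print(golden_section_min(merit, 0.0, 1.0))    # ~0.3
```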