47 results for Resources use optimization
Abstract:
The Danish Eulerian Model (DEM) is a powerful air pollution model, designed to calculate the concentrations of various dangerous species over a large geographical region (e.g. Europe). It takes into account the main physical and chemical processes between these species, the actual meteorological conditions, emissions, etc. This is a huge computational task that requires significant storage and CPU time, so parallel computing is essential for the efficient practical use of the model. Several efficient parallel versions of the model have been created over the past several years. A suitable parallel version of DEM using the Message Passing Interface (MPI) library was implemented on two powerful supercomputers at EPCC, Edinburgh, available via the HPC-Europa programme for transnational access to research infrastructures in the EC: a Sun Fire E15K and an IBM HPCx cluster. Although the implementation is, in principle, the same for both supercomputers, a few modifications had to be made to port the code successfully to the IBM HPCx cluster. Performance analysis and parallel optimization were then carried out, and results from benchmarking experiments are presented in this paper. Another set of experiments was carried out to investigate the sensitivity of the model to variation of some chemical rate constants in the chemical submodel; certain modifications of the code were necessary for this task. The results obtained will be used for further sensitivity analysis studies using Monte Carlo simulation.
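For readers unfamiliar with how such a model is parallelised, the sketch below shows the general pattern of a row-wise domain decomposition with halo exchange in MPI (via mpi4py). It is a toy illustration under our own assumptions: the grid size, the diffusion update and all names are invented here and are not taken from the DEM code.

```python
# Minimal sketch: row-wise domain decomposition of a 2-D concentration
# field with halo exchange between neighbouring MPI ranks (mpi4py).
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

NX, NY = 96, 96             # illustrative global grid
local = NX // size          # rows owned by this rank (assume it divides)
# owned block plus one halo row above and one below
c = np.zeros((local + 2, NY))
c[1:-1, :] = np.random.rand(local, NY)   # stand-in initial concentrations

up   = rank - 1 if rank > 0 else MPI.PROC_NULL
down = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(10):
    # exchange halo rows with neighbouring ranks
    comm.Sendrecv(c[1, :],  dest=up,   recvbuf=c[-1, :], source=down)
    comm.Sendrecv(c[-2, :], dest=down, recvbuf=c[0, :],  source=up)
    # toy explicit diffusion update on interior points only
    c[1:-1, 1:-1] += 0.1 * (c[:-2, 1:-1] + c[2:, 1:-1]
                            + c[1:-1, :-2] + c[1:-1, 2:] - 4 * c[1:-1, 1:-1])
```

Run under e.g. `mpiexec -n 4 python dem_sketch.py`; each rank advances its own strip of the grid and only the halo rows cross the network, which is what makes this decomposition scale.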
Abstract:
Combining geological knowledge with proved-plus-probable ('2P') oil discovery data indicates that over 60 countries are now past their resource-limited peak of conventional oil production. The data show that the global peak of conventional oil production is close. Many analysts who rely only on proved ('1P') oil reserves data draw a very different conclusion, but proved oil reserves contain no information about the true size of discoveries, being variously under-reported, over-reported and not reported. Reliance on 1P data has led to a number of misconceptions, including the notions that past oil forecasts were incorrect, that oil reserves grow very significantly due to technology gain, and that the global supply of oil is ensured provided sufficient investment is forthcoming to 'turn resources into reserves'. These misconceptions have been widely held, including within academia, governments, some oil companies, and organisations such as the IEA. In addition to conventional oil, the world contains large quantities of non-conventional oil. Most current detailed models show that, past the conventional oil peak, non-conventional oils are unlikely to come on-stream fast enough to offset the decline in conventional oil. To determine the extent of future oil supply constraints, calculations are required to determine fundamental rate limits for the production of non-conventional oils, as well as of oil from gas, coal and biomass, and of oil substitution. Such assessments will need to examine technological readiness and lead-times, as well as rate constraints on investment, pollution, and net-energy return. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Purpose - The purpose of this paper is to identify the most popular techniques used to rank a web page highly in Google. Design/methodology/approach - The paper presents the results of a study of 50 highly optimized web pages that were created as part of a Search Engine Optimization competition. The study focuses on the most popular techniques used to rank highest in this competition, and includes an analysis of the use of PageRank, number of pages, number of in-links, domain age and the use of third-party sites such as directories and social bookmarking sites. A separate study was made of 50 non-optimized web pages for comparison. Findings - The paper provides insight into the techniques that successful Search Engine Optimizers use to ensure a page ranks highly in Google, and recognizes the importance of PageRank and links, as well as directories and social bookmarking sites. Research limitations/implications - Only the top 50 web sites for a specific query were analysed. Analysing more web sites and comparing them with similar studies in different competitions would provide more concrete results. Practical implications - The paper offers a revealing insight into the techniques used by industry experts to rank highly in Google, and the success or otherwise of those techniques. Originality/value - This paper fulfils an identified need for web sites and e-commerce sites keen to attract a wider web audience.
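As background to the PageRank analysis mentioned above, the following is a minimal sketch of the PageRank computation by power iteration on a toy link graph; the graph, damping factor and all names are illustrative, not Google's implementation.

```python
# Minimal sketch of the PageRank measure: power iteration on a tiny,
# invented link graph.
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}   # page -> pages it links to
n, d = 4, 0.85                                 # number of pages, damping

# build the column-stochastic transition matrix
M = np.zeros((n, n))
for src, outs in links.items():
    for dst in outs:
        M[dst, src] = 1.0 / len(outs)

r = np.full(n, 1.0 / n)       # start from a uniform rank vector
for _ in range(50):           # iterate until (effectively) converged
    r = (1 - d) / n + d * M @ r

print(r / r.sum())            # pages with more in-links score higher
```

The sketch illustrates why in-links matter so much in the study's findings: page 2, with three in-links, ends up with the largest share of the rank mass.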
Abstract:
Traditional resource management has had as its main objective the optimization of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The Self-organizing ICT Resource Management (SORMA) project aims at allowing resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA's motivation is to achieve efficient resource utilization by maximizing revenue for resource providers and minimizing the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that the desired quality of service levels meet the expectations of market participants. This paper explains the proposed use of an economically enhanced resource manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximization across multiple service level agreements and provides an application scenario to demonstrate its usefulness and effectiveness. Copyright © 2008 John Wiley & Sons, Ltd.
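The abstract does not detail the EERM's algorithms, but the flavour of revenue maximization across service level agreements can be sketched as a toy selection problem: admit the subset of SLAs that maximizes revenue within provider capacity, here a 0/1 knapsack solved by dynamic programming. All names and figures below are invented for illustration.

```python
# Toy sketch (not the EERM implementation): choose which service level
# agreements to admit so that revenue is maximized without exceeding
# provider capacity -- a 0/1 knapsack solved by dynamic programming.
def max_revenue(slas, capacity):
    """slas: list of (cpu_demand, revenue) pairs; capacity: total CPUs."""
    best = [0] * (capacity + 1)
    for demand, revenue in slas:
        # iterate capacity downwards so each SLA is admitted at most once
        for c in range(capacity, demand - 1, -1):
            best[c] = max(best[c], best[c - demand] + revenue)
    return best[capacity]

# e.g. three offered SLAs on a 10-CPU provider
print(max_revenue([(4, 70), (3, 50), (6, 90)], 10))   # -> 160
```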
Abstract:
This paper describes the design, implementation and testing of an intelligent knowledge-based supervisory control (IKBSC) system for a hot rolling mill process. A novel architecture is used to integrate an expert system with an existing supervisory control system and a new optimization methodology for scheduling the soaking pits in which the material is heated prior to rolling. The resulting IKBSC system was applied to an aluminium hot rolling mill process to improve the shape quality of low-gauge plate and to optimise the use of the soaking pits to reduce energy consumption. The results from the trials demonstrate the advantages to be gained from the IKBSC system that integrates knowledge contained within data, plant and human resources with existing model-based systems. (c) 2005 Elsevier Ltd. All rights reserved.
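The paper's soaking-pit scheduling methodology is not spelled out in the abstract; purely to illustrate the shape of the problem, the sketch below assigns each ingot batch to the pit that becomes free first, a simple greedy heuristic with invented data.

```python
# Illustrative only -- not the IKBSC optimization methodology. Greedy
# sketch: assign each ingot batch to the soaking pit that frees up first,
# reducing idle (energy-wasting) pit time.
import heapq

def schedule(batches, n_pits):
    """batches: list of (batch_id, soak_hours); returns (id, pit, start, end)."""
    pits = [(0.0, p) for p in range(n_pits)]   # (time pit is free, pit id)
    heapq.heapify(pits)
    plan = []
    for batch_id, soak in batches:
        free_at, pit = heapq.heappop(pits)
        plan.append((batch_id, pit, free_at, free_at + soak))
        heapq.heappush(pits, (free_at + soak, pit))
    return plan

for row in schedule([("A", 4), ("B", 6), ("C", 3), ("D", 5)], 2):
    print(row)
```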
Abstract:
In this paper, a new equalizer learning scheme is introduced, based on the algorithm of directional evolutionary multi-objective optimization (EMOO). While nonlinear channel equalizers such as radial basis function (RBF) equalizers have been widely studied to combat the linear and nonlinear distortions in modern communication systems, most of them do not take into account the equalizers' generalization capabilities. In this paper, equalizers are designed with the aim of improving their generalization capabilities. It is proposed that this objective can be achieved by treating the equalizer design problem as a multi-objective optimization (MOO) problem, with each objective based on one of several training sets, and then deriving equalizers that recover the signals well for all the training sets. Conventional EMOO, which is widely applied to MOO problems, suffers from disadvantages such as slow convergence. Directional EMOO improves the computational efficiency of conventional EMOO by explicitly making use of directional information. The new equalizer learning scheme based on directional EMOO is applied to RBF equalizer design. Computer simulation demonstrates that the new scheme can be used to derive RBF equalizers with good generalization capabilities, i.e., good performance in predicting unseen samples.
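To make the multi-objective framing concrete, here is a small sketch, under our own assumptions, of scoring a candidate RBF equalizer once per training set and comparing candidates by Pareto dominance. The network form, data and parameters are stand-ins, and the directional EMOO search itself is not reproduced.

```python
# Sketch of the MOO framing: one MSE objective per training set, with
# candidate equalizers compared by Pareto dominance.
import numpy as np

def rbf_out(x, centres, width, weights):
    """Gaussian RBF network output for input vectors x of shape (n, d)."""
    d2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / width**2) @ weights

def objectives(eq, train_sets):
    """Objective vector to be minimised: one MSE per training set."""
    return np.array([np.mean((rbf_out(X, *eq) - y) ** 2)
                     for X, y in train_sets])

def dominates(f, g):
    """True if objective vector f Pareto-dominates g."""
    return np.all(f <= g) and np.any(f < g)

rng = np.random.default_rng(0)
sets = [(rng.normal(size=(50, 2)), rng.choice([-1.0, 1.0], 50))
        for _ in range(3)]                       # three noisy training sets
eq_a = (rng.normal(size=(4, 2)), 1.0, rng.normal(size=4))
eq_b = (rng.normal(size=(4, 2)), 1.0, rng.normal(size=4))
print(dominates(objectives(eq_a, sets), objectives(eq_b, sets)))
```

An evolutionary search over such candidates keeps the non-dominated ones, which is exactly the population the paper's directional EMOO is steering.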
Abstract:
The success of matrix-assisted laser desorption/ionisation (MALDI) in fields such as proteomics has been due partly, but not exclusively, to the development of improved data acquisition and sample preparation techniques. These have been required to overcome some of the shortcomings of the commonly used solid-state MALDI matrices such as α-cyano-4-hydroxycinnamic acid (CHCA) and 2,5-dihydroxybenzoic acid (DHB). Solid-state matrices form crystalline samples with highly inhomogeneous topography and morphology, which results in large fluctuations in analyte signal intensity from spot to spot and between positions within a spot. This means that efficient tuning of the mass spectrometer can be hampered and the use of MALDI MS for quantitative measurements is severely impeded. Recently, new MALDI liquid matrices have been introduced which promise to be an effective alternative to crystalline matrices. Generally, the liquid matrices comprise either ionic liquid matrices (ILMs) or a usually viscous liquid matrix doped with a UV light-absorbing chromophore [1-3]. Their advantages are that the droplet surface is smooth and relatively uniform, with the analyte homogeneously distributed within it. They can replenish a sampling position between shots, removing the need to search for sample hot-spots. The liquid nature of the matrix also allows additional additives to be used to change the environment to which the analyte is added.
Abstract:
Deep Brain Stimulation (DBS) has been used successfully throughout the world for the treatment of Parkinson's disease symptoms. To control abnormal spontaneous electrical activity in target brain areas, DBS utilizes a continuous stimulation signal. This continuous power draw means that the implanted battery power source needs to be replaced every 18–24 months. To prolong the life span of the battery, a technique is discussed here to accurately recognize and predict the onset of Parkinson's disease tremors in human subjects and thus implement an on-demand stimulator. The approach is to use a radial basis function neural network (RBFNN) based on particle swarm optimization (PSO) and principal component analysis (PCA), with Local Field Potential (LFP) data recorded via the stimulation electrodes, to predict activity related to tremor onset. To test this approach, LFPs from the subthalamic nucleus (STN), obtained through deep brain electrodes implanted in a Parkinson's patient, are used to train the network. To validate the network's performance, electromyographic (EMG) signals from the patient's forearm are recorded in parallel with the LFPs to accurately determine occurrences of tremor, and these are compared with the performance of the network. Detection accuracies of up to 89% have been found to be possible. Performance comparisons have also been made between a conventional RBFNN and the PSO-based RBFNN, which show a marginal decrease in performance but a notable reduction in computational overhead.
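The PSO-tuned RBFNN itself is not reproduced here, but the overall pipeline the abstract describes can be sketched as follows: PCA-reduce feature windows, build an RBF hidden layer, and fit a readout that flags tremor versus no tremor. All data, dimensions and parameters below are stand-ins of our own choosing.

```python
# Illustrative pipeline only (not the paper's PSO-tuned RBFNN):
# PCA-reduce stand-in LFP feature windows, then fit an RBF network
# readout by least squares to flag tremor / no-tremor.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 16))            # stand-in LFP feature windows
y = (X[:, :4].sum(1) > 0).astype(float)   # stand-in tremor labels (from EMG)

# PCA via SVD: keep the top 3 principal components
Xc = X - X.mean(0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:3].T

# RBF hidden layer with fixed centres/width (this is what PSO would tune)
centres = Z[rng.choice(len(Z), 10, replace=False)]
width = 1.5
Phi = np.exp(-((Z[:, None] - centres[None]) ** 2).sum(-1) / width**2)

w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # least-squares readout
acc = np.mean((Phi @ w > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

In the paper's setting, the role of PSO is to search over the centres and widths fixed by hand above, trading a small amount of accuracy for much less computation.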
Abstract:
Progress is reported in the development of a new synthesis method for the design of filters and coatings for use in spaceborne infrared optics. The method uses the Golden Section optimization routine to search, using designated dielectric thin-film combinations, for the coating design which fulfills the spectral requirements. The final design is the one that uses the fewest layers for the given thin-film materials in the starting design. This synthesis method has been used successfully to design broadband anti-reflection coatings on infrared substrates; the 6 to 18 micrometer anti-reflection coating for the germanium optics of the HIRDLS instrument, to be flown on the NASA EOS-Chem satellite, is given as an example. By correctly defining the target function to describe any specific type of filter in the optimization part of the method, this synthesis method may be used to design general filters for use in spaceborne infrared optics.
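The Golden Section routine named above is a standard one-dimensional minimiser; the sketch below shows it applied to a stand-in merit function (the real method evaluates a spectral merit over multilayer coating designs, which is not reproduced here).

```python
# Golden-section search: minimise a unimodal function on [a, b] by
# shrinking the bracket by the golden ratio each step.
import math

def golden_section(f, a, b, tol=1e-6):
    g = (math.sqrt(5) - 1) / 2          # 1/phi ~ 0.618
    c, d = b - g * (b - a), a + g * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):                 # minimum lies in [a, d]
            b, d = d, c
            c = b - g * (b - a)
        else:                           # minimum lies in [c, b]
            a, c = c, d
            d = a + g * (b - a)
    return (a + b) / 2

# e.g. find the layer thickness minimising a toy reflectance merit
merit = lambda t: (t - 1.37) ** 2 + 0.02    # stand-in merit function
print(golden_section(merit, 0.0, 3.0))       # ~ 1.37
```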
Abstract:
Various techniques have been published for optimizing the net present value of tenders by use of discounted cash flow theory and linear programming. These approaches to tendering appear to have been largely ignored by the industry. This paper utilises six case studies of tendering practice to establish the reasons for this apparent disregard. Tendering is demonstrated to be a market-orientated function in which many subjective judgements are made regarding a firm's environment; detailed consideration of 'internal' factors such as cash flow is therefore judged to be unjustified. Systems theory is then drawn upon and applied to the separate processes of estimating and tendering. Estimating is seen as taking place in a relatively sheltered environment and as such operates as a relatively closed system. Tendering, however, takes place in a changing and dynamic environment and as such must operate as a relatively open system. The use of sophisticated methods to optimize the value of tenders is then identified as depending on an assumption of rationality, which is justified in the case of a relatively closed system (i.e. estimating), but not for a relatively open system (i.e. tendering).
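For context on the techniques the paper discusses, a minimal discounted-cash-flow calculation of the kind tender-optimization methods build on is sketched below; all figures are invented.

```python
# Minimal discounted-cash-flow sketch: the net present value of a tender's
# staged cash flows at a given discount rate (illustrative figures).
def npv(rate, cashflows):
    """NPV of cashflows[t] received at the end of period t (t=0 is now)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# tender with an initial outlay followed by staged receipts, at 10%
print(round(npv(0.10, [-100_000, 40_000, 45_000, 50_000]), 2))
```

The published optimization techniques the paper refers to go a step further, using linear programming to choose tender terms that maximise this quantity; the paper's point is that practitioners rarely do.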
Abstract:
Research in the late 1980s showed that many corporate real estate users were not fully aware of the full extent of their property holdings. In many cases, not only was the value of the holdings unknown, but there was uncertainty over the actual extent of ownership within the portfolio. This led a large number of corporate occupiers to review their property holdings during the 1990s, initially to create a definitive asset register, but also to benefit from a more efficient use of space. Good management of corporately owned property assets is as important as the management of the other principal resources within the company. A comprehensive asset register can be seen as the first step towards a rational property audit. For the effective, efficient and economic delivery of services, it is vital that all property holdings are utilised to the best advantage; this requires that the property provider and the property user are both fully conversant with the value of the property holding and that an asset/internal rent/charge is made accordingly. The advantages of internal rent charging are twofold. First, it requires the occupying department to “contribute” an amount to the business equivalent to the open market rental value of the space it occupies. This prevents space being treated as a free good and, as individual profit centres, each department will then rationalise its holdings to minimise its costs. The second advantage is strategic: by charging an asset rent, the holding department can identify the performance of its real estate holdings, which can then be compared with an internal or external benchmark to help determine whether the company has adopted the most efficient tenure pattern for its properties. This paper investigates the use of internal rents by UK-based corporate businesses and explains internal rents as a form of transfer pricing in the context of management and responsibility accounting. The research finds that the majority of charging organisations introduced internal rents primarily to help calculate true profits at the business unit level, while fewer than 10% of the charging organisations introduced internal rents primarily to capture the return on assets within the business. There was also a sizeable element of the market that had no plans to introduce internal rents. It appears that, despite academic and professional views that internal rents improve the efficient use of property, this proposition has not been universally accepted at the business and operational level.
Abstract:
This paper critically explores the politics that mediate the use of environmental science assessments as the basis of resource management policy. Drawing on recent literature in the political ecology tradition that has emphasised the politicised nature of the production and use of scientific knowledge in environmental management, the paper analyses a hydrological assessment in a small river basin in Chile, undertaken in response to concerns over the possible overexploitation of groundwater resources. The case study illustrates the limitations of an approach based predominantly on hydrogeological modelling to ascertain the effects of increased groundwater abstraction. In particular, it identifies the subjective ways in which the assessment was interpreted and used by the state water resources agency to underpin water allocation decisions in accordance with its own interests, and the role that a desocialised assessment played in reproducing unequal patterns of resource use and configuring uneven waterscapes. Nevertheless, as Chile’s ‘neoliberal’ political-economic framework privileges the role of science and technocracy, producing other forms of environmental knowledge to complement environmental science is likely to be contentious. In conclusion, the paper considers the potential of mobilising the concept of the hydrosocial cycle to further critically engage with environmental science.
Abstract:
Tourism is the world's largest employer, accounting for 10% of jobs worldwide (WTO, 1999). There are over 30,000 protected areas around the world, covering about 10% of the land surface (IUCN, 2002). Protected area management is moving towards a more integrated form of management, which recognises the social and economic needs of the world's finest areas and seeks to provide long-term income streams and support social cohesion through active but sustainable use of resources. Ecotourism - 'responsible travel to natural areas that conserves the environment and improves the well-being of local people' (The Ecotourism Society, 1991) - is often cited as a panacea for incorporating the principles of sustainable development in protected area management. However, few examples exist worldwide to substantiate this claim. In reality, ecotourism struggles to provide social and economic empowerment locally and fails to secure proper protection of the local and global environment. Current analysis of ecotourism provides a useful checklist of interconnected principles for more successful initiatives, but no overall framework of analysis or theory. This paper argues that applying common property theory to ecotourism can help to establish a more rigorous, multi-layered analysis that identifies the institutional demands of community-based ecotourism (CBE). The paper draws on the existing literature on ecotourism and several new case studies from developed and developing countries around the world. It focuses on the governance of CBE initiatives, particularly the interaction between local stakeholders and government and the role that third-party non-governmental organisations can play in brokering appropriate institutional arrangements. The paper concludes by offering future research directions.
Organisational semiotics methods to assess organisational readiness for internal use of social media
Abstract:
The paper presents organisational semiotics (OS) as an approach for identifying organisational readiness factors for the internal use of social media within information-intensive organisations (IIOs). The paper examines OS methods, such as organisational morphology, containment analysis and collateral analysis, to reveal factors of readiness within an organisation. These models also help to identify the essential patterns of activities needed for social media use within an organisation, which can provide a basis for future analysis. The findings confirmed many of the factors previously identified in the literature, while also revealing new factors through the OS methods. The factors for organisational readiness for internal use of social media include resources, organisational climate, processes, motivational readiness, benefit and organisational control factors. The organisational control factors revealed are security/privacy, policies, communication procedures, accountability and fallback.