887 results for network cost models
Abstract:
Government targets for CO2 reductions are being progressively tightened; the Climate Change Act sets the UK target at an 80% reduction by 2050 relative to 1990 levels. The residential sector accounts for about 30% of emissions. This paper discusses current modelling techniques in the residential sector, principally top-down and bottom-up. Top-down models work on a macro-economic basis and can be used to consider large-scale economic changes; bottom-up models are detail-rich and can model technological change. Bottom-up models demonstrate what is technically possible. However, there are differences between the technical potential and what is likely, given the limited economic rationality of the typical householder. This paper recommends research to better understand individuals’ behaviour. Such research needs to include actual choices, stated preferences and opinion research to allow a detailed understanding of the individual end user. This increased understanding can then be used in an agent-based model (ABM). In an ABM, agents model real-world actors and can be given a rule set intended to emulate the actions and behaviours of real people. This can help in understanding how new technologies diffuse. In this way a degree of micro-economic realism can be added to domestic carbon modelling. Such a model should then be of use both for forward projections of CO2 and for analysing the cost-effectiveness of various policy measures.
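The agent-based diffusion idea described above can be illustrated with a minimal sketch. All parameters here (population size, adoption probabilities, social-influence weight) are hypothetical and not drawn from the paper: each household agent adopts a low-carbon technology with a probability that rises with the current adoption share, producing the S-shaped uptake curve typical of technology diffusion.

```python
import random

def run_abm(n_agents=100, n_steps=20, base_prob=0.02,
            social_weight=0.3, seed=42):
    """Each household agent adopts with probability base_prob plus a term
    proportional to the current adoption share, so uptake feeds back on
    itself and diffusion follows an S-shaped curve."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    history = []
    for _ in range(n_steps):
        share = sum(adopted) / n_agents  # start-of-step adoption share
        for i in range(n_agents):
            if not adopted[i] and rng.random() < base_prob + social_weight * share:
                adopted[i] = True
        history.append(sum(adopted))
    return history
```

Policy levers (subsidies, information campaigns) would enter such a sketch as changes to `base_prob` or `social_weight` for subsets of agents, which is how an ABM can be used to compare the cost-effectiveness of measures.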
Abstract:
The accurate prediction of the biochemical function of a protein is becoming increasingly important, given the unprecedented growth of both structural and sequence databanks. Consequently, computational methods are required to analyse such data in an automated manner to ensure genomes are annotated accurately. Protein structure prediction methods, for example, are capable of generating approximate structural models on a genome-wide scale. However, the detection of functionally important regions in such crude models, as well as structural genomics targets, remains an extremely important problem. The method described in the current study, MetSite, represents a fully automatic approach for the detection of metal-binding residue clusters applicable to protein models of moderate quality. The method involves using sequence profile information in combination with approximate structural data. Several neural network classifiers are shown to be able to distinguish metal sites from non-sites with a mean accuracy of 94.5%. The method was demonstrated to identify metal-binding sites correctly in LiveBench targets where no obvious metal-binding sequence motifs were detectable using InterPro. Accurate detection of metal sites was shown to be feasible for low-resolution predicted structures generated using mGenTHREADER where no side-chain information was available. High-scoring predictions were observed for a recently solved hypothetical protein from Haemophilus influenzae, indicating a putative metal-binding site.
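As a toy illustration of the kind of discrimination task MetSite's neural classifiers perform, the sketch below trains a single logistic unit to separate "site" from "non-site" residues. This is a generic classifier on one invented synthetic feature (a conservation-like score), not the authors' method or their sequence-profile features.

```python
import math
import random

def train_logistic(data, lr=0.5, epochs=300):
    """Single logistic unit trained by gradient descent.
    data: list of (feature_vector, label) pairs with labels 0/1."""
    w = [0.0] * len(data[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y  # gradient of cross-entropy loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def make_synthetic(n=100, seed=7):
    """Hypothetical 1-D feature: metal sites score higher on a
    conservation-like measure than non-sites."""
    rng = random.Random(seed)
    sites = [([rng.gauss(0.8, 0.1)], 1) for _ in range(n)]
    non_sites = [([rng.gauss(0.2, 0.1)], 0) for _ in range(n)]
    return sites + non_sites
```

The real method combines many profile and structural features and uses several neural network classifiers; the point here is only the shape of the site/non-site decision problem.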
Abstract:
The Mitigation Options for Phosphorus and Sediment (MOPS) project investigated the effectiveness of within-field control measures (tramline management, straw residue management, type and direction of cultivation, and vegetative buffers) in mitigating sediment and phosphorus loss from winter-sown combinable cereal crops, using three case study sites. To determine the cost of the approaches, simple financial spreadsheet models were constructed at both farm and regional levels. Taking into account crop areas, crop rotation margins per hectare were calculated to reflect the costs of crop establishment, fertiliser and agro-chemical applications, harvesting, and the associated labour and machinery costs. Variable and operating costs associated with each mitigation option were then incorporated to demonstrate the impact on the relevant crop enterprise and crop rotation margins. These costs were then compared to runoff, sediment and phosphorus loss data obtained from monitoring hillslope-length-scale field plots. Each of the mitigation options explored in this study had potential for reducing sediment and phosphorus losses from arable land under cereal crops. Sediment losses were reduced by between 9 kg ha−1 and as much as 4780 kg ha−1, with corresponding reductions in phosphorus loss of between 0.03 kg ha−1 and 2.89 kg ha−1. In percentage terms, phosphorus reductions were between 9% and 99%. Impacts on crop rotation margins also varied. Minimum tillage resulted in cost savings (up to £50 ha−1) whilst other options showed increased costs (up to £19 ha−1 for straw residue incorporation). Overall, the results indicate that each of the options has potential for on-farm implementation.
However, tramline management appeared to have the greatest potential for reducing runoff, sediment, and phosphorus losses from arable land (between 69% and 99%) and is likely to be considered cost-effective given its small additional cost of £2–4 ha−1, although further work is needed to evaluate alternative tramline management methods. Tramline management is also the only option not incorporated within current policy mechanisms associated with reducing soil erosion and phosphorus loss; in light of its potential, it is an approach that should be encouraged once further evidence is available.
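The trade-off reported above can be expressed as a simple cost-effectiveness ratio. The sketch below uses the headline tramline figures quoted in the abstract (£2–4 ha−1 extra cost against up to 2.89 kg ha−1 phosphorus saved) purely as worked arithmetic; it is not a model from the paper.

```python
def cost_per_kg_p_saved(extra_cost_per_ha, p_saved_kg_per_ha):
    """£ spent per kg of phosphorus loss avoided; lower is more
    cost-effective. Both inputs are per-hectare figures."""
    return extra_cost_per_ha / p_saved_kg_per_ha
```

At a mid-range £3 ha−1 extra cost and 2.89 kg ha−1 phosphorus saved, tramline management costs roughly £1 per kg of phosphorus retained, which is the sense in which the abstract calls it cost-effective.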
Abstract:
The African Technology Policy Studies Network (ATPS) is a multidisciplinary network of researchers, private sector actors, policymakers and civil society. ATPS has the vision to become the leading international centre of excellence and reference in science, technology and innovation (STI) systems research, training and capacity building, communication and sensitization, knowledge brokerage, policy advocacy and outreach in Africa. It has a Regional Secretariat in Nairobi, Kenya, and operates through national chapters in 29 countries (including 27 in Africa and two chapters in the United Kingdom and USA for Africans in the Diaspora), with an expansion plan to cover the entire continent by 2015. The ATPS Phase VI Strategic Plan aims to improve the understanding and functioning of STI processes and systems to strengthen the learning capacity, social responses, and governance of STI for addressing Africa's development challenges, with a specific focus on the Millennium Development Goals (MDGs). A team of external evaluators carried out a midterm review to assess the effectiveness and efficiency of the implementation of the Strategic Plan for the period January 1, 2009 to December 31, 2010. The evaluation methodology involved multiple quantitative and qualitative methods to assess the qualitative and quantitative inputs (human resources, financial resources, time, etc.) into ATPS activities (both thematic and facilitative) and their tangible and intangible outputs, outcomes and impacts. Methods included a questionnaire survey of ATPS members and stakeholders, key informant interviews, and focus group discussions (FGDs) with members in six countries.

Effectiveness of Programmes
Under all six strategic goals, very good progress has been made towards planned outputs and outcomes. This is evidenced by key performance indicators (KPIs) generated from desk review, ratings from the survey respondents, and the themes that run through the FGDs.

Institutional and Programme Cost Effectiveness
Institutional Effectiveness: assessment of institutional effectiveness suggests that adequate management frameworks are in place and are being used effectively and transparently. Technical and financial accounting mechanisms are also being followed in accordance with grant agreements and with global good practice. This is evidenced by KPIs generated from desk review.
Programme Cost Effectiveness: assessment of the cost-effectiveness of programme execution shows that the organisational structure is efficient, delivering high-quality, relevant research at relatively low cost by international standards. The evidence includes KPIs from desk review: the ratio of administrative costs to programme costs has fallen steadily, to around 10%; the average size of research grants is modest, without compromising quality; and there is a high level of pro bono input by ATPS members.

ATPS Programmes Strategic Evaluation
ATPS research and STI-related activities are unique and well aligned with STI issues and needs facing Africa and the world. The multi-disciplinary and trans-boundary nature of the research activities is creating a unique group of research scientists. The ATPS approach to research and STI issues is paving the way for the so-called Third Generation University (3GU). Recognising this unique positioning, an increasing number of international multilateral agencies are seeking partnership with ATPS, and ATPS is seeing an increasing level of funding commitments by donor partners.

Recommendations for ATPS Continued Growth and Effectiveness
On-going reform of the ATPS administrative structure to continue: The on-going reforms that have taken place within the Board, Regional Secretariat, and at the National Chapter coordination levels are welcomed. Such reform should continue until fully functional corporate governance policy and practices are established and implemented across the ATPS governance structures. This will further strengthen ATPS to achieve the vision of being the leading STI policy brokerage organization in Africa. Although training in corporate governance has recently been carried out for all sectors of the ATPS leadership structure, there is some evidence that these systems have not yet been fully and effectively implemented within all the governance structures of the organization, especially at the Board and National Chapter levels. Future training should emphasize practical application, with exercises relevant to the ATPS leadership structure from the Board to the National Chapter levels.

Training on transformational leadership - leading change: Though a subject of intense debate amongst economists and social scientists, it is generally agreed that cultural mindsets and attitudes can enhance or hinder organizational progress. ATPS's vision demands transformational leadership skills amongst its leaders, from the Board members to the National Chapter Coordinators. To lead such a change, ATPS leaders must understand and avoid personal and cultural mindsets and value systems that hinder change, while embracing those that enhance it. This requires deliberate assessment of cultural and behavioural patterns that could hinder progress, and a willingness to adopt cultural and personal habits that make for progress.

Improvement of relationships amongst the Board, Secretariat, and National Chapters: A large number of ATPS members and stakeholders feel they do not have effective communications with, and/or access to, Board, National Chapter Coordinator and Regional Secretariat activities. Effort should be made to improve the implementation of the ATPS communication strategy and the flow of information between ATPS management and the members. The results of the survey and the FGDs suggest that progress has been made in this direction during the past two years, but more could be done to ensure the effective flow of pertinent information to members through ATPS communications channels.

Strategies for increased funding for National Chapters: There is a big gap between the fundraising skills of the Regional Secretariat and those of the National Coordinators. In some cases, funds successfully raised by the Secretariat and disbursed to national chapters were not followed up with timely progress and financial reports by some national chapters. Adequate training in the skills required for effective interactions with key STI policy players should be conducted regularly for National Chapter Coordinators and ATPS members. The ongoing training in grant writing should continue and be made continent-wide if funding permits. Funding of National Chapters should be strategic, such that capacity in a specific area of research is built which, with time, will not only lead to a strong research capacity in that area but also strengthen academic programmes. For example, a strong climate change programme is emerging at the University of Nigeria Nsukka (UNN), with strong collaborations with universities from neighbouring states.

Strategies to increase national government buy-in and support for STI: Translating STI research outcomes into policies requires a great deal of emotional intelligence, a skill often lacking in first- and second-generation universities. In the era of the science-based or second-generation universities (2GUs), governments were content with universities carrying out scientific research and providing scientific education. Now they desire to see universities as incubators of new science- or technology-based commercial activities, whether by existing firms or start-ups. Hence, governments demand that universities take an active and leading role in the exploitation of their knowledge, and they are willing to make funds available to support such activities. Thus, for universities to gain the attention of national leadership they must become centres of excellence and explicit instruments of economic development in the knowledge-based economy. The universities must do this while working collaboratively with government departments, parastatals, institutions and dedicated research establishments. ATPS should anticipate these shifts and devise programmes to assist both government and universities to relate effectively.

New administrative structures in member organizations to sustain and manage the emerging STI multidisciplinary teams: Second Generation Universities (2GUs) tend to focus on pure science and often do not regard the application of their know-how as their task. In contrast, Third Generation Universities (3GUs) actively stimulate techno-starters – students or academics – to pursue the exploitation or commercialisation of the knowledge they generate, viewing this as equal in importance to the objectives of scientific research and education. Administratively, research in the 2GU era was mainly monodisciplinary and departments were structured along disciplines. Emerging interdisciplinary scientific teams focused on specific research areas work against the current mono-disciplinary, faculty-based administrative structure of 2GUs; for interdisciplinary teams, the current faculty system is an obstacle. There is a need for new organisational forms of university management that can create responsibility for the task of know-how exploitation. ATPS must anticipate this and begin to strategize solutions for its member institutions to transition to 3GU administrative structures; otherwise ATPS growth will plateau, and the progress achieved so far may be stunted.
Abstract:
Spiking neural networks are usually limited in their applications due to their complex mathematical models and the lack of intuitive learning algorithms. In this paper, a simpler, novel neural network derived from a leaky integrate-and-fire neuron model, the ‘cavalcade’ neuron, is presented. A simulation environment for the neural network has been developed and two basic learning algorithms implemented within it. These algorithms successfully learn some basic temporal and instantaneous problems. Neural network structures inspired by these experiments are then applied to process sensor information so as to successfully control a mobile robot.
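The leaky integrate-and-fire dynamics that the ‘cavalcade’ neuron simplifies can be sketched in a few lines. The parameter values below (time constant, threshold, reset) are illustrative defaults, not those of the paper.

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire: the membrane potential decays toward
    rest, integrates the input current, and on crossing threshold the
    neuron emits a spike and resets."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        v += (-(v - v_rest) + i_in) * (dt / tau)  # leaky integration
        if v >= v_thresh:
            spikes.append(t)  # record spike time, then reset
            v = v_reset
    return spikes
```

Driven by a constant supra-threshold current, such a neuron fires at a regular rate; temporal learning rules of the kind the paper describes operate on these spike times rather than on instantaneous activations.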
Abstract:
Background
Cortical cultures grown long-term on multi-electrode arrays (MEAs) are frequently and extensively used as models of cortical networks in studies of neuronal firing activity, neuropharmacology, toxicology and mechanisms underlying synaptic plasticity. However, in contrast to the predominantly asynchronous neuronal firing activity exhibited by intact cortex, the electrophysiological activity of mature cortical cultures is dominated by spontaneous, epileptiform-like global burst events, which hinder their effective use in network-level studies, particularly for neurally-controlled animat (‘artificial animal’) applications. Thus, the identification of culture features that can be exploited to produce neuronal activity more representative of that seen in vivo could increase the utility and relevance of studies that employ these preparations. Acetylcholine has a recognised neuromodulatory role affecting excitability, rhythmicity, plasticity and information flow in vivo, although its endogenous production by cortical cultures and its subsequent functional influence upon neuronal excitability remain unknown.

Results
Using MEA electrophysiological recording supported by immunohistochemical and RT-qPCR methods, we demonstrate for the first time the presence of intrinsic cholinergic neurons and significant, endogenous cholinergic tone in cortical cultures, with a characterisation of the muscarinic and nicotinic components that underlie modulation of spontaneous neuronal activity. We found that tonic muscarinic ACh receptor (mAChR) activation affects global excitability and burst event regularity in a culture age-dependent manner whilst, in contrast, tonic nicotinic ACh receptor (nAChR) activation can modulate burst duration and the proportion of spikes occurring within bursts in a spatio-temporal fashion.

Conclusions
We suggest that the presence of significant endogenous cholinergic tone in cortical cultures and the comparability of its modulatory effects to those seen in intact brain tissues support emerging, exploitable commonalities between in vivo and in vitro preparations. We conclude that experimental manipulation of endogenous cholinergic tone could offer a novel opportunity to improve the use of cortical cultures for studies of network-level mechanisms in a manner that remains largely consistent with its functional role.
Abstract:
There is a current need to constrain the parameters of gravity wave drag (GWD) schemes in climate models using observational information instead of tuning them subjectively. In this work, an inverse technique is developed using data assimilation principles to estimate gravity wave parameters. Because most GWD schemes assume instantaneous vertical propagation of gravity waves within a column, observations in a single column can be used to formulate a one-dimensional assimilation problem to estimate the unknown parameters. We define a cost function that measures the differences between the unresolved drag inferred from observations (referred to here as the ‘observed’ GWD) and the GWD calculated with a parametrisation scheme. The geometry of the cost function presents some difficulties, including multiple minima and ill-conditioning because of the non-independence of the gravity wave parameters. To overcome these difficulties we propose a genetic algorithm to minimize the cost function, which provides a robust parameter estimation over a broad range of prescribed ‘true’ parameters. When real experiments using an independent estimate of the ‘observed’ GWD are performed, physically unrealistic values of the parameters can result due to the non-independence of the parameters. However, by constraining one of the parameters to lie within a physically realistic range, this degeneracy is broken and the other parameters are also found to lie within physically realistic ranges. This argues for the essential physical self-consistency of the gravity wave scheme. A much better fit to the observed GWD at high latitudes is obtained when the parameters are allowed to vary with latitude. However, a close fit can be obtained either in the upper or the lower part of the profiles, but not in both at the same time. This result is a consequence of assuming an isotropic launch spectrum.
The changes of sign in the GWD found in the tropical lower stratosphere, which are associated with part of the quasi-biennial oscillation forcing, cannot be captured by the parametrisation with optimal parameters.
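A genetic algorithm of the general kind proposed here for cost-function minimisation can be sketched as follows. This is a generic real-coded GA (truncation selection, averaging crossover, bounded Gaussian mutation) applied to a toy quadratic cost, not the authors' implementation or their GWD cost function.

```python
import random

def genetic_minimise(cost, bounds, pop_size=30, gens=40, seed=1):
    """Minimise cost(params) over box bounds with a simple real-coded GA.
    bounds: list of (lo, hi) per parameter."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        elite = pop[:pop_size // 2]          # keep the fitter half unchanged
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)      # two elite parents
            child = [(x + y) / 2.0 for x, y in zip(a, b)]  # averaging crossover
            child = [min(hi, max(lo, g + rng.gauss(0.0, 0.05 * (hi - lo))))
                     for g, (lo, hi) in zip(child, bounds)]  # mutate, clip to bounds
            children.append(child)
        pop = elite + children
    return min(pop, key=cost)
```

Because the population samples the parameter space broadly, a GA of this shape is robust to the multiple minima and ill-conditioning the abstract describes, where a gradient method could stall.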
Abstract:
Purpose – Commercial real estate is a highly specific asset: heterogeneous, indivisible and with less information transparency than most other commonly held investment assets. These attributes encourage the use of intermediaries during asset acquisition and disposal. However, there are few attempts to explain the use of different brokerage models (with differing costs) in different markets. This study aims to address this gap.
Design/methodology/approach – The study analyses 9,338 real estate transactions in London and New York City from 2001 to 2011. Data are provided by Real Capital Analytics and cover over $450 billion of investments in this period. Brokerage trends in the two cities are compared and probit regressions are used to test whether the decision to transact with broker representation varies with investor or asset characteristics.
Findings – Results indicate greater use of brokerage in London, especially by purchasers. This persists when data are disaggregated by sector, time or investor type, pointing to the role of local market culture and institutions in shaping brokerage models and transaction costs. Within each city, the nature of the investors involved seems to be a more significant influence on broker use than the characteristics of the assets being traded.
Originality/value – Brokerage costs are the single largest non-tax charge to an investor when trading commercial real estate, yet there is little research in this area. This study examines the role of brokers and provides empirical evidence on factors that influence the use and mode of brokerage in two major investment destinations.
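The probit specification behind the study's regressions models the probability of broker representation as Φ(a + b·x), where Φ is the standard normal CDF. A minimal maximum-likelihood fit by gradient ascent is sketched below; the covariate, coefficients, and synthetic data are invented for illustration and are unrelated to the study's actual variables.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def fit_probit(xs, ys, lr=0.1, epochs=500):
    """Fit P(y=1 | x) = phi(a + b*x) by batch gradient ascent on the
    log-likelihood. xs: covariate values; ys: 0/1 outcomes."""
    a, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            z = a + b * x
            p = min(max(phi(z), 1e-9), 1.0 - 1e-9)  # clip for stability
            pdf = math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)
            score = pdf * (y - p) / (p * (1.0 - p))  # d(log-lik)/dz
            ga += score
            gb += score * x
        a += lr * ga / n
        b += lr * gb / n
    return a, b
```

In the study's setting, x would be an investor or asset characteristic (for instance, a cross-border-investor dummy) and a positive fitted b would indicate that the characteristic raises the probability of transacting with broker representation.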
Abstract:
Traditional resource management has had as its main objective the optimisation of throughput, based on parameters such as CPU, memory, and network bandwidth. With the appearance of Grid Markets, new variables that determine economic expenditure, benefit and opportunity must be taken into account. The SORMA project aims to allow resource owners and consumers to exploit market mechanisms to sell and buy resources across the Grid. SORMA’s motivation is to achieve efficient resource utilisation by maximising revenue for resource providers, and minimising the cost of resource consumption within a market environment. An overriding factor in Grid markets is the need to ensure that desired Quality of Service levels meet the expectations of market participants. This paper explains the proposed use of an Economically Enhanced Resource Manager (EERM) for resource provisioning based on economic models. In particular, this paper describes techniques used by the EERM to support revenue maximisation across multiple Service Level Agreements.
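One standard way to frame revenue maximisation across multiple SLAs is as a 0/1 knapsack: accept the subset of resource requests whose total demand fits the provider's capacity and whose total revenue is greatest. This is a generic sketch of that framing, not SORMA's actual EERM algorithm.

```python
def max_revenue(requests, capacity):
    """0/1 knapsack over SLA requests. requests: list of
    (resource_units, revenue) pairs; capacity: total resource units.
    Returns the maximum total revenue achievable within capacity."""
    best = [0.0] * (capacity + 1)
    for units, revenue in requests:
        # iterate capacity downwards so each request is used at most once
        for c in range(capacity, units - 1, -1):
            best[c] = max(best[c], best[c - units] + revenue)
    return best[capacity]
```

For example, with requests of (3 units, 60), (2 units, 50) and (2 units, 40) and a capacity of 4 units, the revenue-maximising choice is the two smaller requests for a total of 90, even though the 3-unit request pays the most individually.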
Abstract:
A wide range of applications, from urban heat island studies to precision farming, would benefit from dense networks of air temperature observations, but costs, existing siting guidelines and the risk of damage to sensors are limiting; new methods are therefore required to gain a high-resolution understanding of the spatio-temporal patterns of urban meteorological phenomena. With the launch of a new generation of low-cost sensors it is possible to deploy a network to monitor air temperature at finer spatial resolutions. Here we investigate the Aginova Sentinel Micro (ASM) sensor with a bespoke radiation shield (together < US$150), which can provide secure near-real-time air temperature data to a server using existing (or user-deployed) Wireless Fidelity (Wi-Fi) networks. This makes it ideally suited for deployment where wireless communications readily exist, notably urban areas. Assessment of the performance of the ASM relative to traceable standards in a water bath and atmospheric chamber shows it to have good measurement accuracy, with mean errors < ± 0.22 °C between -25 and 30 °C and a time constant in ambient air of 110 ± 15 s. In subsequent field tests within the bespoke shield, it also performed excellently (root-mean-square error = 0.13 °C) over a range of meteorological conditions relative to a traceable operational UK Met Office platinum resistance thermometer. These results indicate that the ASM and bespoke shield are more than fit-for-purpose for dense network deployment in urban areas at relatively low cost compared with existing observation techniques.
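The accuracy statistics quoted above (mean error, root-mean-square error) are straightforward to compute from paired sensor and reference readings; the values in the usage example are invented, not the paper's data.

```python
import math

def mean_error(observed, reference):
    """Bias of the sensor relative to the reference instrument (°C)."""
    return sum(o - r for o, r in zip(observed, reference)) / len(observed)

def rmse(observed, reference):
    """Root-mean-square error against the reference instrument (°C)."""
    return math.sqrt(sum((o - r) ** 2
                         for o, r in zip(observed, reference)) / len(observed))
```

RMSE penalises occasional large errors more heavily than mean error does, which is why both are reported: a sensor can be unbiased on average yet noisy reading-to-reading.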
Abstract:
Runoff generation processes and pathways vary widely between catchments. Credible simulations of solute and pollutant transport in surface waters are dependent on models which facilitate appropriate, catchment-specific representations of perceptual models of the runoff generation process. Here, we present a flexible, semi-distributed landscape-scale rainfall-runoff modelling toolkit suitable for simulating a broad range of user-specified perceptual models of runoff generation and stream flow occurring in different climatic regions and landscape types. PERSiST (the Precipitation, Evapotranspiration and Runoff Simulator for Solute Transport) is designed for simulating present-day hydrology; projecting possible future effects of climate or land use change on runoff and catchment water storage; and generating hydrologic inputs for the Integrated Catchments (INCA) family of models. PERSiST has limited data requirements and is calibrated using observed time series of precipitation, air temperature and runoff at one or more points in a river network. Here, we apply PERSiST to the River Thames in the UK and describe a Monte Carlo tool for model calibration, sensitivity and uncertainty analysis.
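The Monte Carlo calibration idea mentioned above can be sketched with a toy single-store runoff model. The linear-reservoir model and the parameter range below are illustrative only and far simpler than PERSiST itself, but they show the sample-evaluate-keep-best loop that Monte Carlo calibration relies on.

```python
import random

def bucket_model(precip, k):
    """Single linear reservoir: each step, outflow is a fraction k of
    storage. precip: list of per-step inputs; returns simulated flows."""
    storage = 0.0
    flow = []
    for p in precip:
        storage += p
        q = k * storage
        storage -= q
        flow.append(q)
    return flow

def monte_carlo_calibrate(precip, observed, n_samples=500, seed=0):
    """Draw candidate k values at random and keep the one with the
    smallest sum-of-squares error against observed runoff."""
    rng = random.Random(seed)
    best_k, best_err = None, float("inf")
    for _ in range(n_samples):
        k = rng.uniform(0.01, 0.99)
        sim = bucket_model(precip, k)
        err = sum((s - o) ** 2 for s, o in zip(sim, observed))
        if err < best_err:
            best_k, best_err = k, err
    return best_k, best_err
```

Keeping the full set of sampled (parameter, error) pairs rather than only the best one is what turns the same loop into a sensitivity and uncertainty analysis.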
Abstract:
Last fall, a network of the European Cooperation in Science and Technology (COST), called “Basic Concepts for Convection Parameterization in Weather Forecast and Climate Models” (COST Action ES0905; see http://w3.cost.esf.org/index.php?id=205&action_number=ES0905), organized a 10-day training course on atmospheric convection and its parameterization. The aim of the workshop, held on the island of Brac, Croatia, was to help young scientists develop an in-depth understanding of the core theory underpinning convection parameterizations. The speakers also sought to impart an appreciation of the various approximations, compromises, and ansätze necessary to translate theory into operational practice for numerical models.
Abstract:
With the prospect of exascale computing, computational methods requiring only local data become especially attractive. Consequently, the typical domain decomposition of atmospheric models means horizontally-explicit vertically-implicit (HEVI) time-stepping schemes warrant further attention. In this analysis, Runge-Kutta implicit-explicit schemes from the literature are analysed for their stability and accuracy using a von Neumann stability analysis of two linear systems. Attention is paid to the numerical phase to indicate the behaviour of phase and group velocities. Where the analysis is tractable, analytically derived expressions are considered. For more complicated cases, amplification factors have been numerically generated and the associated amplitudes and phase diagnosed. Analysis of a system describing acoustic waves has necessitated attributing the three resultant eigenvalues to the three physical modes of the system. To do so, a series of algorithms has been devised to track the eigenvalues across the frequency space. The result enables analysis of whether the schemes exactly preserve the non-divergent mode; and whether there is evidence of spurious reversal in the direction of group velocities or asymmetry in the damping for the pair of acoustic modes. Frequency ranges that span next-generation high-resolution weather models to coarse-resolution climate models are considered; and a comparison is made of errors accumulated from multiple stability-constrained shorter time-steps from the HEVI scheme with a single integration from a fully implicit scheme over the same time interval. Two schemes, “Trap2(2,3,2)” and “UJ3(1,3,2)”, both already used in atmospheric models, are identified as offering consistently good stability and representation of phase across all the analyses. Furthermore, according to a simple measure of computational cost, “Trap2(2,3,2)” is the least expensive.
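The amplification-factor machinery at the heart of a von Neumann analysis is easy to illustrate on the scalar oscillation equation dy/dt = iωy. The two schemes below are generic textbook examples chosen for contrast, not the "Trap2(2,3,2)" or "UJ3(1,3,2)" schemes analysed in the paper.

```python
def amp_explicit_euler(omega_dt):
    """Amplification factor of forward Euler for dy/dt = i*omega*y:
    y_{n+1} = (1 + i*omega*dt) y_n, so |A| > 1 for any omega_dt != 0
    and the scheme amplifies oscillations."""
    return 1.0 + 1j * omega_dt

def amp_trapezoidal(omega_dt):
    """Amplification factor of the (implicit) trapezoidal rule:
    A = (1 + z/2)/(1 - z/2) with z = i*omega*dt; |A| = 1 exactly,
    so amplitude is preserved and only phase errors remain."""
    z = 1j * omega_dt
    return (1.0 + z / 2.0) / (1.0 - z / 2.0)
```

The modulus of A diagnoses damping or instability, while its argument compared with the exact phase ωΔt diagnoses the phase and group-velocity errors the paper examines; an IMEX scheme is analysed the same way, with separate z terms for the explicitly and implicitly treated frequencies.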
Abstract:
To overcome divergent estimates produced from the same data, the proposed digital costing process adopts an integrated information-system design in which process knowledge and the costing system are designed together. By employing and extending a widely used international standard, Industry Foundation Classes (IFC), the system provides an integrated process that can harvest information and knowledge from current quantity surveying practice, including costing methods and data. Knowledge of quantification is encoded from the literature, a motivating case, and standards; this can reduce the time consumed by current manual practice. Further development will represent the pricing process using a Bayesian network based knowledge representation approach. This hybrid form of knowledge representation can produce reliable estimates for construction projects. In practical terms, knowledge management of quantity surveying can improve construction estimation systems. The theoretical significance of this study lies in the fact that its content and conclusions make it possible to develop an automatic estimation system based on a hybrid knowledge representation approach.
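The Bayesian-network pricing idea can be illustrated with a toy two-node network. The nodes, probabilities and unit rates below are invented for illustration and are not from the proposed system: a binary "market condition" node influences the unit rate, and the expected rate then prices a measured quantity from the take-off.

```python
def expected_rate(p_high_market, rate_high=120.0, rate_normal=100.0):
    """Expected unit rate (e.g. per m2), marginalising over a binary
    'market condition' node with P(high) = p_high_market."""
    return p_high_market * rate_high + (1.0 - p_high_market) * rate_normal

def price_item(quantity, p_high_market):
    """Price a measured quantity using the expected unit rate."""
    return quantity * expected_rate(p_high_market)
```

The value of the Bayesian formulation is that it yields a distribution over prices rather than a single figure, so two surveyors starting from the same data and the same network produce the same estimate, which is the divergence problem the abstract sets out to solve.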