24 results for Hydroelectric generators
in CentAUR: Central Archive University of Reading - UK
Abstract:
We describe, and make publicly available, two problem instance generators for a multiobjective version of the well-known quadratic assignment problem (QAP). The generators allow a number of instance parameters to be set, including those controlling epistasis and inter-objective correlations. Based on these generators, several initial test suites are provided and described. For each test instance we measure some global properties and, for the smallest ones, make some initial observations of the Pareto optimal sets/fronts. Our purpose in providing these tools is to facilitate the ongoing study of problem structure in multiobjective (combinatorial) optimization, and its effects on search landscape and algorithm performance.
Abstract:
This paper investigates random number generators in stochastic iteration algorithms that require infinite uniform sequences. We take a simple model of the general transport equation and solve it using a linear congruential generator, the Mersenne twister, the mother-of-all generator, and a true random number generator based on quantum effects. With this simple model we show that, for reasonably contractive operators, sequences that are not theoretically infinite-uniform also perform well. Finally, we demonstrate the power of stochastic iteration for the solution of the light transport problem.
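As a much simpler analogue of the experiment described, the sketch below feeds a textbook linear congruential generator into a contractive stochastic iteration whose time-average converges to the fixed point of the expected map. The LCG constants are the classic glibc ones, and the toy map is illustrative rather than the paper's transport model:

```python
class LCG:
    """Minimal linear congruential generator (glibc-style constants)."""
    def __init__(self, seed=1):
        self.state = seed

    def uniform(self):
        self.state = (1103515245 * self.state + 12345) % 2**31
        return self.state / 2**31

def stochastic_iteration(rng, a=0.5, n=100_000, burn_in=1_000):
    """Iterate x <- a*x + u with u ~ U(0,1). For |a| < 1 the expected map
    is a contraction, and the time-average of the iterates converges to
    the fixed point of x = a*x + E[u], i.e. 0.5 / (1 - a)."""
    x, total = 0.0, 0.0
    for i in range(n):
        x = a * x + rng.uniform()
        if i >= burn_in:
            total += x
    return total / (n - burn_in)

est = stochastic_iteration(LCG(seed=42))
print(est)  # ~1.0 = 0.5 / (1 - 0.5)
```

The same harness could be driven by `random.Random` (a Mersenne twister) or a hardware entropy source for a side-by-side comparison in the spirit of the paper.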
Abstract:
An integration by parts formula is derived for the first order differential operator corresponding to the action of translations on the space of locally finite simple configurations of infinitely many points on R^d. As reference measures, tempered grand canonical Gibbs measures are considered corresponding to a non-constant non-smooth intensity (one-body potential) and translation invariant potentials fulfilling the usual conditions. It is proven that such Gibbs measures fulfill the intuitive integration by parts formula if and only if the action of the translation is not broken for this particular measure. The latter is automatically fulfilled in the high temperature and low intensity regime.
Abstract:
Electricity load shifting is becoming a big topic in the world of ‘green’ retail. Marks & Spencer (M&S) aims to become the world’s most sustainable retailer (1), and part of that commitment means contributing to the future electricity network. While intelligent operation of fridges and Heating, Ventilation and Air Conditioning (HVAC) systems is a wide area of research, standby generators should be considered too, as they are the most widely adopted form of distributed generation. In this paper, the experience of using standby generators in Northern Ireland to support the grid is shared and the logistics of future projects are discussed. Interactions with maintenance schedules, electricity costs, grid code, staffing and store opening times are discussed, as well as the financial implications associated with running generators for grid support.
Abstract:
Integrating renewable energy into built environments requires additional attention to the balancing of supply and demand due to their intermittent nature. Demand Side Response (DSR) has the potential to make money for organisations as well as support the System Operator as the generation mix changes. There is an opportunity to increase the use of existing technologies in order to manage demand. Company-owned standby generators are a rarely used resource; their maintenance schedule often accounts for a majority of their running hours. DSR encompasses a range of technologies and organisations; Sustainability First (2012) suggests that the System Operator (SO), energy supply companies, Distribution Network Operators (DNOs), Aggregators and Customers all stand to benefit from DSR. It is therefore important to consider the impact of DSR measures on each of these stakeholders. This paper assesses the financial implications of organisations using existing standby generation equipment for DSR in order to avoid peak electricity charges. It concludes that under the current GB electricity pricing structure, there are several regions where running diesel generators at peak times is financially beneficial to organisations. Issues such as fuel costs, Carbon Reduction Commitment (CRC) charges, maintenance costs and electricity prices are discussed.
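The peak-avoidance arithmetic the paper assesses reduces to a marginal-cost comparison: a kWh of self-generation is worthwhile when fuel plus maintenance cost less than the peak import rate. The efficiency, maintenance and tariff figures below are hypothetical placeholders, not values from the study:

```python
def diesel_cost_per_kwh(fuel_price_per_litre, litres_per_kwh=0.3,
                        maintenance_per_kwh=0.02):
    """Marginal cost of one kWh from a standby diesel set.
    0.3 l/kWh and 0.02 GBP/kWh maintenance are illustrative assumptions."""
    return fuel_price_per_litre * litres_per_kwh + maintenance_per_kwh

def peak_running_saving(peak_rate, fuel_price_per_litre, kwh):
    """Net saving from self-generating `kwh` during a peak-rate period
    instead of importing at `peak_rate` (GBP/kWh)."""
    return (peak_rate - diesel_cost_per_kwh(fuel_price_per_litre)) * kwh

# Hypothetical example: 0.45 GBP/kWh peak rate, 1.20 GBP/l fuel,
# 500 kWh generated over the peak window.
print(peak_running_saving(0.45, 1.20, 500))  # ≈ 35.0 GBP saved
```

In practice the regional variation the paper identifies would enter through `peak_rate`, which differs across GB distribution regions and charging bands.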
Abstract:
A parallel hardware random number generator for use with a VLSI genetic algorithm processing device is proposed. The design uses a systolic array of mixed congruential random number generators. The generators are constantly reseeded with the outputs of the preceding generators to avoid significant biasing of the randomness of the array, which would result in longer times for the algorithm to converge to a solution.
1 Introduction
In recent years there has been a growing interest in developing hardware genetic algorithm devices [1, 2, 3]. A genetic algorithm (GA) is a stochastic search and optimization technique which attempts to capture the power of natural selection by evolving a population of candidate solutions through a process of selection and reproduction [4]. In keeping with the evolutionary analogy, the solutions are called chromosomes, with each chromosome containing a number of genes. Chromosomes are commonly simple binary strings, the bits being the genes.
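A software sketch of the reseeding idea, assuming a ring of 16-bit mixed congruential cells. The word size, multiplier and increments are illustrative choices; the paper's VLSI design is not reproduced here:

```python
M = 2**16      # modulus: 16-bit cells, as might suit a compact VLSI array
A = 75         # multiplier; A and the odd increments are illustrative

def systolic_step(states, increments):
    """Advance each mixed congruential cell x <- (A*x + c) mod M, then
    reseed every cell with the output of the preceding cell (ring
    topology), so no cell free-runs long enough to bias the array."""
    outputs = [(A * s + c) % M for s, c in zip(states, increments)]
    states = [outputs[k - 1] for k in range(len(outputs))]  # shift/reseed
    return states, outputs

# Four cells with distinct odd increments, run for a few systolic steps.
states = [1, 2, 3, 4]
increments = [1, 3, 5, 7]
for _ in range(3):
    states, outputs = systolic_step(states, outputs if False else increments)[0], \
                      systolic_step(states, increments)[1]
```

Note: the loop above advances the array one step at a time; in hardware all cells update simultaneously, which is what the list comprehension over `zip(states, increments)` models.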
Abstract:
Seasonal climate prediction offers the potential to anticipate variations in crop production early enough to adjust critical decisions. Until recently, interest in exploiting seasonal forecasts from dynamic climate models (e.g. general circulation models, GCMs) for applications that involve crop simulation models has been hampered by the difference in spatial and temporal scale of GCMs and crop models, and by the dynamic, nonlinear relationship between meteorological variables and crop response. Although GCMs simulate the atmosphere on a sub-daily time step, their coarse spatial resolution and resulting distortion of day-to-day variability limits the use of their daily output. Crop models have used daily GCM output with some success by either calibrating simulated yields or correcting the daily rainfall output of the GCM to approximate the statistical properties of historic observations. Stochastic weather generators are used to disaggregate seasonal forecasts either by adjusting input parameters in a manner that captures the predictable components of climate, or by constraining synthetic weather sequences to match predicted values. Predicting crop yields, simulated with historic weather data, as a statistical function of seasonal climatic predictors, eliminates the need for daily weather data conditioned on the forecast, but must often address poor statistical properties of the crop-climate relationship. Most of the work on using crop simulation with seasonal climate forecasts has employed historic analogs based on categorical ENSO indices. Other methods based on classification of predictors or weather types can provide daily weather inputs to crop models conditioned on forecasts. 
Advances in climate-based crop forecasting in the coming decade are likely to include more robust evaluation of the methods reviewed here, dynamically embedding crop models within climate models to account for crop influence on regional climate, enhanced use of remote sensing, and research in the emerging area of 'weather within climate'.
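One of the disaggregation routes mentioned, adjusting the generator's input parameters to carry the predictable climate signal, can be sketched with the simplest possible weather generator: Bernoulli wet-day occurrence with exponential wet-day amounts. All parameter values below are invented for illustration:

```python
import random

def daily_rain(p_wet, mean_amount, days, rng):
    """Bernoulli wet-day occurrence with exponential amounts - the
    simplest stochastic weather generator (real ones use Markov chains
    for occurrence and richer amount distributions)."""
    return [rng.expovariate(1 / mean_amount) if rng.random() < p_wet else 0.0
            for _ in range(days)]

def conditioned_season(forecast_total, clim_total, p_wet_clim=0.3,
                       mean_amount=8.0, days=90, seed=0):
    """Disaggregate a seasonal rainfall forecast by scaling the wet-day
    probability so the generator's expected total matches the forecast."""
    p_wet = min(1.0, p_wet_clim * forecast_total / clim_total)
    rng = random.Random(seed)
    return daily_rain(p_wet, mean_amount, days, rng)

# Climatological expectation: 0.3 * 8.0 mm * 90 d = 216 mm per season.
series = conditioned_season(forecast_total=260.0, clim_total=216.0)
print(len(series), round(sum(series), 1))
```

The alternative route described in the abstract, constraining synthetic sequences to match predicted totals, would instead generate unconditioned sequences and accept or rescale those whose seasonal sum falls near the forecast.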
Abstract:
Development research has responded to a number of charges over the past few decades. For example, when traditional research was accused of being 'top-down', the response was participatory research, linking the 'receptors' to the generators of research. As participatory processes were recognised as producing limited outcomes, the demand-led agenda was born. In response to the alleged failure of research to deliver its products, the 'joined-up' model, which links research with the private sector, has become popular. However, using examples from animal-health research, this article demonstrates that all the aforementioned approaches are seriously limited in their attempts to generate outputs to address the multi-faceted problems facing the poor. The article outlines a new approach to research: the Mosaic Model. By combining different knowledge forms, and focusing on existing gaps, the model aims to bridge basic and applied findings to enhance the efficiency and value of research, past, present, and future.
Abstract:
Asynchronous optical sampling has the potential to improve the signal-to-noise ratio in THz transient spectrometry. The design of an inexpensive control scheme for synchronising two femtosecond pulse frequency comb generators at an offset frequency of 20 kHz is discussed. The suitability of a range of signal processing schemes, adopted from the Systems Identification and Control Theory community, for further processing recorded THz transients in the time and frequency domains is outlined. Finally, possibilities for femtosecond pulse shaping using genetic algorithms are mentioned.
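The offset frequency fixes the equivalent-time sampling step: with two combs at repetition rates f and f + Δf, consecutive pulse pairs slip in delay by Δt = Δf / (f·(f + Δf)), and one full delay scan takes 1/Δf. A quick check, assuming a 100 MHz repetition rate (the abstract states only the 20 kHz offset):

```python
def asops_resolution(f_rep, f_offset):
    """Equivalent-time sampling step and scan duration for asynchronous
    optical sampling with combs at f_rep and f_rep + f_offset (Hz)."""
    dt = f_offset / (f_rep * (f_rep + f_offset))  # delay slip per pulse pair
    scan_time = 1.0 / f_offset                    # one full delay sweep
    return dt, scan_time

# Assuming 100 MHz repetition-rate combs (not stated in the abstract).
dt, scan = asops_resolution(100e6, 20e3)
print(dt, scan)  # ≈ 2.0e-12 s step, 5.0e-05 s per scan
```

At these assumed figures each sweep delivers roughly 2 ps delay resolution at a 20 kHz scan rate, which is the averaging advantage the abstract alludes to.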
Abstract:
This paper describes a new method for reconstructing 3D surface points and a wireframe on the surface of a freeform object using a small number, e.g. 10, of 2D photographic images. The images are taken from different viewing directions by a perspective camera with full prior knowledge of the camera configurations. The reconstructed surface points are frontier points and the wireframe is a network of contour generators. Both are reconstructed by pairing apparent contours in the 2D images. Unlike previous works, we empirically demonstrate that if the viewing directions are uniformly distributed around the object's viewing sphere, then the reconstructed 3D points automatically cluster closely on highly curved parts of the surface and are widely spread on smooth or flat parts. The advantage of this property is that the reconstructed points along a surface or a contour generator are not under-sampled or under-represented, because surfaces and contours should be sampled more densely where their curvature is high. The more complex the contour's shape, the greater the number of points required, and the greater the number of points automatically generated by the proposed method. Given that the viewing directions are uniformly distributed, the number and distribution of the reconstructed points depend on the shape or curvature of the surface, regardless of the size of the surface or of the object. The unique pattern of the reconstructed points and contours may be used in 3D object recognition and measurement without computationally intensive full surface reconstruction. The results are obtained from both computer-generated and real objects.
Abstract:
This paper describes a method for reconstructing 3D frontier points, contour generators and surfaces of anatomical objects or smooth surfaces from a small number, e.g. 10, of conventional 2D X-ray images. The X-ray images are taken from different viewing directions with full prior knowledge of the X-ray source and sensor configurations. Unlike previous works, we empirically demonstrate that if the viewing directions are uniformly distributed around the object's viewing sphere, then the reconstructed 3D points automatically cluster closely on highly curved parts of the surface and are widely spread on smooth or flat parts. The advantage of this property is that the reconstructed points along a surface or a contour generator are not under-sampled or under-represented, because surfaces and contours should be sampled more densely where their curvature is high. The more complex the contour's shape, the greater the number of points required, and the greater the number of points automatically generated by the proposed method. Given that the number of viewing directions is fixed and the viewing directions are uniformly distributed, the number and distribution of the reconstructed points depend on the shape or curvature of the surface, regardless of the size of the surface or of the object. The technique may be used not only in medicine but also in industrial applications.
Abstract:
Some 50,000 Win Studies in Chess challenge White to find an effectively unique route to a win. Judging the impact of less than absolute uniqueness requires both technical analysis and artistic judgment. Here, for the first time, an algorithm is defined to help analyse uniqueness in endgame positions objectively. The key idea is to examine how critical certain positions are to White in achieving the win. The algorithm uses sub-n-man endgame tables (EGTs) for both Chess and relevant, adjacent variants of Chess. It challenges authors of EGT generators to generalise them to create EGTs for these chess variants. It has already proved efficient and effective in an implementation for Starchess, itself a variant of chess. The approach also addresses a number of similar questions arising in endgame theory, games and compositions.
Abstract:
A description is given of the global atmospheric electric circuit operating between the Earth’s surface and the ionosphere. Attention is drawn to the huge range of horizontal and vertical spatial scales, ranging from 10−9 m to 1012 m, concerned with the many important processes at work. A similarly enormous range of time scales is involved from 10−6 s to 109 s, in the physical effects and different phenomena that need to be considered. The current flowing in the global circuit is generated by disturbed weather such as thunderstorms and electrified rain/shower clouds, mostly occurring over the Earth’s land surface. The profile of electrical conductivity up through the atmosphere, determined mainly by galactic cosmic ray ionization, is a crucial parameter of the circuit. Model simulation results on the variation of the ionospheric potential, ∼250 kV positive with respect to the Earth’s potential, following lightning discharges and sprites are summarized. Experimental results comparing global circuit variations with the neutron rate recorded at Climax, Colorado, are then discussed. Within the return (load) part of the circuit in the fair weather regions remote from the generators, charge layers exist on the upper and lower edges of extensive layer clouds; new experimental evidence for these charge layers is also reviewed. Finally, some directions for future research in the subject are suggested.
Abstract:
Almost all the electricity currently produced in the UK is generated as part of a centralised power system designed around large fossil fuel or nuclear power stations. This power system is robust and reliable but the efficiency of power generation is low, resulting in large quantities of waste heat. The principal aim of this paper is to investigate an alternative concept: energy production by small scale generators in close proximity to the energy users, integrated into microgrids. Microgrids—de-centralised electricity generation combined with on-site production of heat—bear the promise of substantial environmental benefits, brought about by a higher energy efficiency and by facilitating the integration of renewable sources such as photovoltaic arrays or wind turbines. By virtue of a good match between generation and load, microgrids have a low impact on the electricity network, despite a potentially significant level of generation by intermittent energy sources. The paper discusses the technical and economic issues associated with this novel concept, giving an overview of the generator technologies, the current regulatory framework in the UK, and the barriers that have to be overcome if microgrids are to make a major contribution to the UK energy supply. The focus of this study is a microgrid of domestic users powered by small Combined Heat and Power generators and photovoltaics. Focusing on the energy balance between generation and load, it is found that the optimum combination of generators in the microgrid (around 1.4 kWp of PV array per household and 45% household ownership of micro-CHP generators) will maintain energy balance on a yearly basis if supplemented by energy storage of 2.7 kWh per household. We find that there is no fundamental technological reason why microgrids cannot contribute an appreciable part of the UK energy demand.
Indeed, an estimate of cost indicates that the microgrids considered in this study would supply electricity at a cost comparable with the present electricity supply if the current support mechanisms for photovoltaics were maintained. Combining photovoltaics and micro-CHP and a small battery requirement gives a microgrid that is independent of the national electricity network. In the short term, this has particular benefits for remote communities but more wide-ranging possibilities open up in the medium to long term. Microgrids could meet the need to replace current generation nuclear and coal fired power stations, greatly reducing the demand on the transmission and distribution network.