24 results for AC generators

in CentAUR: Central Archive, University of Reading - UK


Relevance: 30.00%

Abstract:

We describe, and make publicly available, two problem instance generators for a multiobjective version of the well-known quadratic assignment problem (QAP). The generators allow a number of instance parameters to be set, including those controlling epistasis and inter-objective correlations. Based on these generators, several initial test suites are provided and described. For each test instance we measure some global properties and, for the smallest ones, make some initial observations of the Pareto optimal sets/fronts. Our purpose in providing these tools is to facilitate the ongoing study of problem structure in multiobjective (combinatorial) optimization, and its effects on search landscape and algorithm performance.
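As a generic illustration of what such instances involve (not the authors' released generator), below is a minimal sketch of evaluating the objective vector of a multiobjective QAP solution, assuming NumPy and hypothetical `dist` and `flows` matrices:

    import numpy as np

    def mqap_objectives(perm, dist, flows):
        """One objective value per flow matrix for an assignment given as a
        permutation (facility i -> location perm[i])."""
        perm = np.asarray(perm)
        # Standard QAP cost: sum_{i,j} flow[i, j] * dist[perm[i], perm[j]]
        return [float(np.sum(f * dist[np.ix_(perm, perm)])) for f in flows]

    # Tiny example: 4 facilities, 2 objectives built from related flow matrices.
    rng = np.random.default_rng(0)
    dist = rng.integers(1, 10, size=(4, 4))
    base = rng.integers(1, 10, size=(4, 4))
    flows = [base, base + rng.integers(0, 3, size=(4, 4))]  # mildly correlated
    print(mqap_objectives([2, 0, 3, 1], dist, flows))

In such a formulation, inter-objective correlation is governed by how the flow matrices are generated relative to one another.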

Relevance: 30.00%

Abstract:

This paper investigates random number generators in stochastic iteration algorithms that require infinite uniform sequences. We take a simple model of the general transport equation and solve it using a linear congruential generator, the Mersenne twister, the mother-of-all generators, and a true random number generator based on quantum effects. With this simple model we show that, for reasonably contractive operators, sequences that are not theoretically infinite-uniform also perform well. Finally, we demonstrate the power of stochastic iteration for the solution of the light transport problem.
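For reference, here is a minimal sketch of the simplest generator named above, a linear congruential generator; the multiplier and increment are the familiar Numerical Recipes constants, chosen only for illustration:

    def lcg(seed, a=1664525, c=1013904223, m=2**32):
        """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m,
        yielding uniform samples in [0, 1)."""
        x = seed
        while True:
            x = (a * x + c) % m
            yield x / m

    gen = lcg(seed=42)
    print([next(gen) for _ in range(5)])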

Relevance: 30.00%

Abstract:

An integration by parts formula is derived for the first-order differential operator corresponding to the action of translations on the space of locally finite simple configurations of infinitely many points on R^d. As reference measures, tempered grand canonical Gibbs measures are considered, corresponding to a non-constant, non-smooth intensity (one-body potential) and translation invariant potentials fulfilling the usual conditions. It is proven that such Gibbs measures satisfy the intuitive integration by parts formula if and only if the translation symmetry is not broken for the particular measure. The latter is automatically fulfilled in the high temperature and low intensity regime.
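In schematic form (our notation, not necessarily the paper's), an integration by parts formula of this kind relates the directional derivative along a fixed direction v in R^d to a logarithmic derivative B_v^mu of the reference measure mu:

    \int_{\Gamma} \bigl(\nabla^{\Gamma}_{v} F\bigr)(\gamma)\, \mu(\mathrm{d}\gamma)
      \;=\; -\int_{\Gamma} F(\gamma)\, B^{\mu}_{v}(\gamma)\, \mu(\mathrm{d}\gamma),

where, heuristically, B_v^mu collects the directional derivatives of the one-body and pair potentials; the result summarised above says that this identity holds exactly when translation symmetry is unbroken for the measure.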

Relevance: 30.00%

Abstract:

Electricity load shifting is becoming a big topic in the world of ‘green’ retail. Marks & Spencer (M&S) aim to become the world’s most sustainable retailer (1), and part of that commitment means contributing to the future electricity network. While the intelligent operation of fridges and Heating, Ventilation and Air Conditioning (HVAC) systems is a wide area of research, standby generators should be considered too, as they are the most widely adopted form of distributed generation. In this paper, the experience of using standby generators in Northern Ireland to support the grid is shared and the logistics of future projects are discussed. Interactions with maintenance schedules, electricity costs, grid code, staffing and store opening times are discussed, as well as the financial implications of running generators for grid support.

Relevance: 30.00%

Abstract:

Integrating renewable energy into built environments requires additional attention to the balancing of supply and demand due to their intermittent nature. Demand Side Response (DSR) has the potential to make money for organisations as well as support the System Operator as the generation mix changes. There is an opportunity to increase the use of existing technologies in order to manage demand. Company-owned standby generators are a rarely used resource; their maintenance schedule often accounts for a majority of their running hours. DSR encompasses a range of technologies and organisations; Sustainability First (2012) suggest that the System Operator (SO), energy supply companies, Distribution Network Operators (DNOs), Aggregators and Customers all stand to benefit from DSR. It is therefore important to consider the impact of DSR measures on each of these stakeholders. This paper assesses the financial implications of organisations using existing standby generation equipment for DSR in order to avoid peak electricity charges. It concludes that under the current GB electricity pricing structure, there are several regions where running diesel generators at peak times is financially beneficial to organisations. Issues such as fuel costs, Carbon Reduction Commitment (CRC) charges, maintenance costs and electricity prices are discussed.
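A minimal sketch of the kind of break-even comparison assessed here follows; all numbers are illustrative assumptions, not figures from the paper:

    def diesel_vs_peak(peak_price_gbp_per_kwh, fuel_price_gbp_per_litre,
                       litres_per_kwh=0.3, maint_gbp_per_kwh=0.02,
                       crc_gbp_per_kwh=0.005):
        """Saving per kWh from self-generating at peak times instead of
        importing at the peak tariff (negative means not worthwhile)."""
        generation_cost = (fuel_price_gbp_per_litre * litres_per_kwh
                           + maint_gbp_per_kwh + crc_gbp_per_kwh)
        return peak_price_gbp_per_kwh - generation_cost

    # Illustrative only: 45 p/kWh peak tariff, diesel at 1.20 GBP/litre.
    print(f"Saving per kWh: {diesel_vs_peak(0.45, 1.20):.3f} GBP")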

Relevance: 20.00%

Abstract:

A parallel hardware random number generator for use with a VLSI genetic algorithm processing device is proposed. The design uses a systolic array of mixed congruential random number generators. The generators are constantly reseeded with the outputs of the preceding generators to avoid significant biasing of the randomness of the array, which would result in longer times for the algorithm to converge to a solution.

In recent years there has been a growing interest in developing hardware genetic algorithm devices [1, 2, 3]. A genetic algorithm (GA) is a stochastic search and optimization technique which attempts to capture the power of natural selection by evolving a population of candidate solutions through a process of selection and reproduction [4]. In keeping with the evolutionary analogy, the solutions are called chromosomes, with each chromosome containing a number of genes. Chromosomes are commonly simple binary strings, the bits being the genes.
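A rough software analogue of the reseeding idea is sketched below; the paper describes a VLSI design, and the constants and the XOR-based mixing here are assumptions made only for illustration:

    class ReseededArray:
        """Software analogue of a systolic array of mixed congruential
        generators, each cell continually reseeded by mixing in the output
        of the preceding cell."""
        def __init__(self, n_cells, seed=1, m=2**32):
            self.m = m
            self.a = 1664525                                   # shared multiplier (assumed)
            self.c = [1013904223 + 2 * i for i in range(n_cells)]  # distinct odd increments
            self.state = [(seed + i * 9973) % m for i in range(n_cells)]

        def step(self):
            out = [(self.a * x + self.c[i]) % self.m
                   for i, x in enumerate(self.state)]
            # Reseed: mix each cell's output with its predecessor's output (ring).
            self.state = [out[i] ^ out[i - 1] for i in range(len(out))]
            return out

    arr = ReseededArray(n_cells=4)
    for _ in range(3):
        print(arr.step())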

Relevance: 20.00%

Abstract:

Seasonal climate prediction offers the potential to anticipate variations in crop production early enough to adjust critical decisions. Until recently, interest in exploiting seasonal forecasts from dynamic climate models (e.g. general circulation models, GCMs) for applications that involve crop simulation models has been hampered by the difference in spatial and temporal scale of GCMs and crop models, and by the dynamic, nonlinear relationship between meteorological variables and crop response. Although GCMs simulate the atmosphere on a sub-daily time step, their coarse spatial resolution and resulting distortion of day-to-day variability limits the use of their daily output. Crop models have used daily GCM output with some success by either calibrating simulated yields or correcting the daily rainfall output of the GCM to approximate the statistical properties of historic observations. Stochastic weather generators are used to disaggregate seasonal forecasts either by adjusting input parameters in a manner that captures the predictable components of climate, or by constraining synthetic weather sequences to match predicted values. Predicting crop yields, simulated with historic weather data, as a statistical function of seasonal climatic predictors, eliminates the need for daily weather data conditioned on the forecast, but must often address poor statistical properties of the crop-climate relationship. Most of the work on using crop simulation with seasonal climate forecasts has employed historic analogs based on categorical ENSO indices. Other methods based on classification of predictors or weather types can provide daily weather inputs to crop models conditioned on forecasts. Advances in climate-based crop forecasting in the coming decade are likely to include more robust evaluation of the methods reviewed here, dynamically embedding crop models within climate models to account for crop influence on regional climate, enhanced use of remote sensing, and research in the emerging area of 'weather within climate'.
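As background on the "adjusting input parameters" route, here is a minimal sketch of a Richardson-style daily rainfall generator: a two-state Markov chain for wet/dry occurrence plus gamma-distributed wet-day amounts. All parameter values are assumptions; conditioning them on a seasonal forecast is the adjustment step described above.

    import random

    def generate_rainfall(n_days, p_wet_given_dry=0.2, p_wet_given_wet=0.6,
                          gamma_shape=0.8, gamma_scale=6.0, seed=1):
        """Daily rainfall series (mm) from a first-order Markov occurrence
        model and gamma-distributed wet-day amounts."""
        rng = random.Random(seed)
        wet, series = False, []
        for _ in range(n_days):
            p = p_wet_given_wet if wet else p_wet_given_dry
            wet = rng.random() < p
            series.append(rng.gammavariate(gamma_shape, gamma_scale) if wet else 0.0)
        return series

    print(sum(generate_rainfall(90)))  # total rainfall for one synthetic season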

Relevance: 20.00%

Abstract:

Development research has responded to a number of charges over the past few decades. For example, when traditional research was accused of being 'top-down', the response was participatory research, linking the 'receptors' to the generators of research. As participatory processes were recognised as producing limited outcomes, the demand-led agenda was born. In response to the alleged failure of research to deliver its products, the 'joined-up' model, which links research with the private sector, has become popular. However, using examples from animal-health research, this article demonstrates that all the aforementioned approaches are seriously limited in their attempts to generate outputs to address the multi-faceted problems facing the poor. The article outlines a new approach to research: the Mosaic Model. By combining different knowledge forms, and focusing on existing gaps, the model aims to bridge basic and applied findings to enhance the efficiency and value of research, past, present, and future.

Relevance: 20.00%

Abstract:

AC microsatellites have proved particularly useful as genetic markers. For some purposes, such as in population biology, the inferences drawn depend on the quantitative values of their mutation rates. This, together with intrinsic biological interest, has led to widespread study of microsatellite mutational mechanisms. Now, however, inconsistencies are appearing in the results of marker-based versus non-marker-based studies of mutational mechanisms. The reasons for this have not been investigated, but one possibility, pursued here, is that the differences result from structural differences between markers and genomic microsatellites. Here we report a comparison between the CEPH AC marker microsatellites and the global population of AC microsatellites in the human genome. AC marker microsatellites are longer than the global average. Controlling for length, marker microsatellites contain on average fewer interruptions, and have longer segments, than their genomic counterparts. Related to this, marker microsatellites show a greater tendency to concentrate the majority of their repeats into one segment. These differences plausibly result from scientists selecting markers for their high polymorphism. In addition to the structural differences, there are differences in the base composition of flanking sequences, marker flanking regions being richer in C and G and poorer in A and T. Our results indicate that there are profound differences between marker and genomic microsatellites that almost certainly affect their mutation rates. There is a need for a unified model of mutational mechanisms that accounts for both marker-derived and genomic observations. A suggestion is made as to how this might be done.
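To make the structural terms concrete, here is a toy sketch of extracting uninterrupted (AC)n segments from a sequence; this is an illustrative definition, not the study's extraction criteria:

    import re

    def ac_segments(seq):
        """Lengths (in repeat units) of uninterrupted (AC)n runs. Consecutive
        runs separated by short interruptions form one interrupted
        microsatellite in the sense used above."""
        return [len(m.group(0)) // 2 for m in re.finditer(r"(?:AC){2,}", seq.upper())]

    seq = "TTACACACACGGACACACACACACTT"   # two segments: 4 and 6 repeats
    print(ac_segments(seq))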

Relevance: 20.00%

Abstract:

Microsatellite lengths change over evolutionary time through a process of replication slippage. A recently proposed model of this process holds that the expansionary tendencies of slippage mutation are balanced by point mutations breaking longer microsatellites into smaller units and that this process gives rise to the observed frequency distributions of uninterrupted microsatellite lengths. We refer to this as the slippage/point-mutation theory. Here we derive the theory's predictions for interrupted microsatellites comprising regions of perfect repeats, labeled segments, separated by dinucleotide interruptions containing point mutations. These predictions are tested by reference to the frequency distributions of segments of AC microsatellite in the human genome, and several predictions are shown not to be supported by the data, as follows. The estimated slippage rates are relatively low for the first four repeats, and then rise initially linearly with length, in accordance with previous work. However, contrary to expectation and the experimental evidence, the inferred slippage rates decline in segments above 10 repeats. Point mutation rates are also found to be higher within microsatellites than elsewhere. The theory provides an excellent fit to the frequency distribution of peripheral segment lengths but fails to explain why internal segments are shorter. Furthermore, there are fewer microsatellites with many segments than predicted. The frequencies of interrupted microsatellites decline geometrically with microsatellite size measured in number of segments, so that for each additional segment, the number of microsatellites is 33.6% less. Overall we conclude that the detailed structure of interrupted microsatellites cannot be reconciled with the existing slippage/point-mutation theory of microsatellite evolution, and we suggest that microsatellites are stabilized by processes acting on interior rather than on peripheral segments.
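The reported geometric decline can be restated compactly as

    N(k) \;\propto\; (1 - 0.336)^{\,k-1} \;=\; 0.664^{\,k-1},

where N(k) is the number of interrupted microsatellites comprising k segments, so each additional segment reduces the count by 33.6%.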

Relevance: 20.00%

Abstract:

Asynchronous Optical Sampling has the potential to improve the signal-to-noise ratio in THz transient spectrometry. The design of an inexpensive control scheme for synchronising two femtosecond pulse frequency comb generators at an offset frequency of 20 kHz is discussed. The suitability of a range of signal processing schemes adopted from the Systems Identification and Control Theory community for further processing of recorded THz transients in the time and frequency domains is outlined. Finally, possibilities for femtosecond pulse shaping using genetic algorithms are mentioned.
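The basic timing relations behind asynchronous optical sampling are simple to evaluate; the sketch below does so for an assumed comb repetition rate of 1 GHz (only the 20 kHz offset frequency is taken from the abstract):

    def asops_timing(f_rep_hz, delta_f_hz):
        """Delay window, equivalent delay step per pulse pair, and scan rate
        for asynchronous optical sampling with two offset frequency combs."""
        window = 1.0 / f_rep_hz                                   # full delay range scanned
        step = delta_f_hz / (f_rep_hz * (f_rep_hz + delta_f_hz))  # delay increment per pair
        return window, step, delta_f_hz                           # scans per second = offset

    window, step, rate = asops_timing(1e9, 20e3)
    print(f"window = {window*1e12:.1f} ps, step = {step*1e15:.1f} fs, "
          f"{rate:.0f} scans/s")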

Relevance: 20.00%

Abstract:

This paper describes a new method for reconstructing 3D surface points and a wireframe on the surface of a freeform object using a small number, e.g. 10, of 2D photographic images. The images are taken at different viewing directions by a perspective camera with full prior knowledge of the camera configurations. The reconstructed surface points are frontier points and the wireframe is a network of contour generators. Both are reconstructed by pairing apparent contours in the 2D images. Unlike previous works, we empirically demonstrate that if the viewing directions are uniformly distributed around the object's viewing sphere, then the reconstructed 3D points automatically cluster closely on highly curved parts of the surface and are widely spread on smooth or flat parts. The advantage of this property is that the reconstructed points along a surface or a contour generator are not under-sampled or under-represented, because surfaces and contours should be sampled more densely where their curvature is high. The more complex the contour's shape, the more points are required, and the more points the proposed method automatically generates. Given that the viewing directions are uniformly distributed, the number and distribution of the reconstructed points depend on the shape or curvature of the surface regardless of the size of the surface or the size of the object. The unique pattern of the reconstructed points and contours may be used in 3D object recognition and measurement without computationally intensive full surface reconstruction. The results are obtained from both computer-generated and real objects.
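Because the result hinges on viewing directions being uniformly distributed over the viewing sphere, a common generic way to construct such a set, Fibonacci-sphere sampling (not necessarily what the authors used), is sketched below:

    import math

    def fibonacci_sphere(n):
        """n approximately uniformly distributed unit vectors (viewing directions)."""
        golden = math.pi * (3.0 - math.sqrt(5.0))      # golden angle
        dirs = []
        for i in range(n):
            z = 1.0 - 2.0 * (i + 0.5) / n              # uniform spacing in z
            r = math.sqrt(1.0 - z * z)
            theta = golden * i
            dirs.append((r * math.cos(theta), r * math.sin(theta), z))
        return dirs

    for d in fibonacci_sphere(10):                      # e.g. 10 camera directions
        print("%.3f %.3f %.3f" % d)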

Relevance: 20.00%

Abstract:

This paper describes a method for reconstructing 3D frontier points, contour generators and surfaces of anatomical objects or smooth surfaces from a small number, e.g. 10, of conventional 2D X-ray images. The X-ray images are taken at different viewing directions with full prior knowledge of the X-ray source and sensor configurations. Unlike previous works, we empirically demonstrate that if the viewing directions are uniformly distributed around the object's viewing sphere, then the reconstructed 3D points automatically cluster closely on highly curved parts of the surface and are widely spread on smooth or flat parts. The advantage of this property is that the reconstructed points along a surface or a contour generator are not under-sampled or under-represented, because surfaces and contours should be sampled more densely where their curvature is high. The more complex the contour's shape, the more points are required, and the more points the proposed method automatically generates. Given that the number of viewing directions is fixed and the viewing directions are uniformly distributed, the number and distribution of the reconstructed points depend on the shape or curvature of the surface regardless of the size of the surface or the size of the object. The technique may be used not only in medicine but also in industrial applications.

Relevance: 20.00%

Abstract:

Some 50,000 Win Studies in Chess challenge White to find an effectively unique route to a win. Judging the impact of less than absolute uniqueness requires both technical analysis and artistic judgment. Here, for the first time, an algorithm is defined to help analyse uniqueness in endgame positions objectively. The key idea is to examine how critical certain positions are to White in achieving the win. The algorithm uses sub-n-man endgame tables (EGTs) for both Chess and relevant, adjacent variants of Chess. It challenges authors of EGT generators to generalise them to create EGTs for these chess variants. It has already proved efficient and effective in an implementation for Starchess, itself a variant of chess. The approach also addresses a number of similar questions arising in endgame theory, games and compositions.