51 results for Multi-objective evolutionary algorithm


Relevance:

30.00%

Publisher:

Abstract:

This paper introduces a new fast, effective and practical model structure construction algorithm for a mixture of experts network system utilising only process data. The algorithm is based on a novel forward constrained regression procedure. Given a full set of experts as potential model bases, the structure construction algorithm selects the most significant model base one at a time so as to minimise the overall system approximation error at each iteration; the gate parameters of the mixture of experts network are adjusted accordingly to satisfy the convex constraints required in the derivation of the forward constrained regression procedure. The procedure continues until a proper system model is constructed that utilises some or all of the experts. A pruning algorithm for the resulting mixture of experts network is also derived, yielding an overall parsimonious construction algorithm. Numerical examples demonstrate the effectiveness of the new algorithms. The mixture of experts framework can be applied to a wide variety of applications, ranging from multiple-model controller synthesis to multi-sensor data fusion.
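A rough sketch of the greedy forward-selection idea described above (omitting the convex gate constraints and the pruning stage; the function and variable names are invented for illustration): at each step, the expert whose inclusion most reduces the least-squares approximation error is added.

```python
import numpy as np

def forward_select(expert_preds, y, max_experts=3, tol=1e-8):
    """Greedy forward selection: at each iteration, add the expert
    (column of expert_preds) whose inclusion minimises the residual
    of the combined least-squares model."""
    n, m = expert_preds.shape
    selected = []
    for _ in range(max_experts):
        best_err, best_j = None, None
        for j in range(m):
            if j in selected:
                continue
            X = expert_preds[:, selected + [j]]
            w, *_ = np.linalg.lstsq(X, y, rcond=None)
            err = float(np.sum((X @ w - y) ** 2))
            if best_err is None or err < best_err:
                best_err, best_j = err, j
        selected.append(best_j)
        if best_err < tol:
            break
    return selected
```

The paper's procedure additionally keeps the gate weights on the probability simplex; here plain least squares stands in for that constrained fit.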


An input variable selection procedure is introduced for the identification and construction of multi-input multi-output (MIMO) neurofuzzy operating point dependent models. The algorithm is an extension of a forward modified Gram-Schmidt orthogonal least squares procedure for a linear model structure which is modified to accommodate nonlinear system modeling by incorporating piecewise locally linear model fitting. The proposed input nodes selection procedure effectively tackles the problem of the curse of dimensionality associated with lattice-based modeling algorithms such as radial basis function neurofuzzy networks, enabling the resulting neurofuzzy operating point dependent model to be widely applied in control and estimation. Some numerical examples are given to demonstrate the effectiveness of the proposed construction algorithm.
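A minimal sketch of the forward Gram-Schmidt orthogonal least squares idea underlying the selection procedure (linear case only, without the piecewise locally linear extension; names are illustrative): candidates are ranked by their error-reduction ratio, and the remaining candidates are orthogonalised against each selection.

```python
import numpy as np

def ols_err_ranking(X, y):
    """Forward OLS input ranking: pick the candidate column with the
    largest error-reduction ratio (ERR), then orthogonalise the rest
    against it (modified Gram-Schmidt), and repeat."""
    X = X.astype(float).copy()
    y = y.astype(float)
    order, errs = [], []
    remaining = list(range(X.shape[1]))
    yy = y @ y
    for _ in range(X.shape[1]):
        def err_ratio(j):
            return (X[:, j] @ y) ** 2 / ((X[:, j] @ X[:, j]) * yy)
        best = max(remaining, key=err_ratio)
        order.append(best)
        errs.append(err_ratio(best))
        q = X[:, best] / np.linalg.norm(X[:, best])
        remaining.remove(best)
        for j in remaining:
            X[:, j] -= (q @ X[:, j]) * q
    return order, errs
```

An ERR of 1.0 means the selected input alone explains the output exactly.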


Multi-rate multicarrier DS-CDMA is a potentially attractive multiple access method for future wireless networks that must support multimedia, and thus multi-rate, traffic. Since high-performance detection such as coherent demodulation requires explicit knowledge of the channel, this paper proposes a subspace-based blind adaptive algorithm for timing acquisition and channel estimation in asynchronous multi-rate multicarrier DS-CDMA systems, applicable to both multicode and variable-spreading-factor systems.
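A toy illustration of the subspace property such blind methods exploit (this is not the paper's algorithm; dimensions and names are invented): the noise subspace of the received-signal covariance is nearly orthogonal to the users' effective signature vectors, so those signatures can be estimated blindly.

```python
import numpy as np

def noise_subspace_residual(n_chips=8, n_users=2, snapshots=20000,
                            sigma=0.1, seed=0):
    """Build a synthetic received signal, eigendecompose its sample
    covariance, and measure how orthogonal the noise subspace is to
    the users' effective signatures (small residual = near-orthogonal)."""
    rng = np.random.default_rng(seed)
    H = rng.standard_normal((n_chips, n_users))        # effective signatures
    S = rng.choice([-1.0, 1.0], size=(n_users, snapshots))  # BPSK symbols
    R = H @ S + sigma * rng.standard_normal((n_chips, snapshots))
    C = R @ R.T / snapshots                            # sample covariance
    _, V = np.linalg.eigh(C)                           # eigenvalues ascending
    noise_sub = V[:, : n_chips - n_users]              # noise subspace
    return float(np.linalg.norm(noise_sub.T @ H) / np.linalg.norm(H))
```

With enough snapshots the residual shrinks towards zero, which is what makes subspace-based timing and channel estimation possible without training symbols.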


Where users are interacting in a distributed virtual environment, the actions of each user must be observed by peers with sufficient consistency and within a limited delay so as not to be detrimental to the interaction. The consistency control issue may be split into three parts: update control; consistent enactment and evolution of events; and causal consistency. The delay in the presentation of events, termed latency, is primarily dependent on the network propagation delay and the consistency control algorithms. The latency induced by the consistency control algorithm, in particular causal ordering, is proportional to the number of participants. This paper describes how the effect of network delays may be reduced and introduces a scalable solution that provides sufficient consistency control while minimising its effect on latency. The principles described have been developed at Reading over the past five years. Similar principles are now emerging in the simulation community through the HLA standard. This paper attempts to validate the suggested principles within the schema of distributed simulation and virtual environments and to compare and contrast with those described by the HLA definition documents.
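The per-participant cost of causal ordering mentioned above can be seen in the classic vector-clock delivery test, where each message carries a clock entry per participant (a generic sketch, not the specific Reading or HLA mechanism):

```python
def causally_ready(msg_clock, sender, local_clock):
    """Vector-clock causal delivery check: a message is deliverable when
    it is the next event expected from its sender and the sender had seen
    nothing the receiver has not.  Both clocks have one entry per
    participant, so cost grows with the number of participants."""
    return (msg_clock[sender] == local_clock[sender] + 1 and
            all(msg_clock[k] <= local_clock[k]
                for k in range(len(msg_clock)) if k != sender))
```

Messages failing the check are buffered until the missing causally prior events arrive, which is one source of the latency the paper seeks to minimise.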


This paper is concerned with the use of a genetic algorithm to select financial ratios for corporate distress classification models. For this purpose, the fitness value associated with a set of ratios is made to reflect the requirements of maximizing the amount of information available for the model and minimizing the collinearity between the model inputs. A case study involving 60 failed and continuing British firms in the period 1997-2000 is used for illustration. The classification model based on ratios selected by the genetic algorithm compares favorably with a model employing ratios usually found in the financial distress literature.
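One plausible form of such a fitness function (a hypothetical sketch; the paper's exact information and collinearity measures, and the weighting `alpha`, are not specified here): reward the variance carried by the selected ratios and penalise their average pairwise correlation.

```python
import numpy as np

def fitness(X, subset, alpha=0.5):
    """Hypothetical GA fitness for a candidate ratio subset:
    information term  = total variance of the selected columns,
    collinearity term = mean absolute off-diagonal correlation."""
    S = X[:, subset]
    info = float(np.var(S, axis=0).sum())
    if len(subset) > 1:
        c = np.corrcoef(S, rowvar=False)
        k = len(subset)
        collin = (np.abs(c).sum() - k) / (k * (k - 1))
    else:
        collin = 0.0
    return info - alpha * collin
```

A GA would evolve bit-strings marking which ratios are in `subset`, favouring informative, mutually uncorrelated sets.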


We derive general analytic approximations for pricing European basket and rainbow options on N assets. The key idea is to express the option’s price as a sum of prices of various compound exchange options, each with different pairs of subordinate multi- or single-asset options. The underlying asset prices are assumed to follow lognormal processes, although our results can be extended to certain other price processes for the underlying. For some multi-asset options a strong condition holds, whereby each compound exchange option is equivalent to a standard single-asset option under a modified measure, and in such cases an almost exact analytic price exists. More generally, approximate analytic prices for multi-asset options are derived using a weak lognormality condition, where the approximation stems from making constant volatility assumptions on the price processes that drive the prices of the subordinate basket options. The analytic formulae for multi-asset option prices, and their Greeks, are defined in a recursive framework. For instance, the option delta is defined in terms of the delta relative to subordinate multi-asset options, and the deltas of these subordinate options with respect to the underlying assets. Simulations test the accuracy of our approximations, given some assumed values for the asset volatilities and correlations. Finally, a calibration algorithm is proposed and illustrated.
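The elementary building block of the compound-exchange-option recursion is the standard two-asset exchange option, for which Margrabe's closed form exists (shown here as an illustrative base case; the paper's recursion nests such prices, which this sketch does not attempt):

```python
import math

def margrabe(S1, S2, sigma1, sigma2, rho, T):
    """Margrabe's formula: price of the option to exchange asset 2 for
    asset 1 at time T, under lognormal dynamics with correlation rho."""
    sigma = math.sqrt(sigma1**2 - 2.0 * rho * sigma1 * sigma2 + sigma2**2)
    d1 = (math.log(S1 / S2) + 0.5 * sigma**2 * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # normal CDF
    return S1 * Phi(d1) - S2 * Phi(d2)
```

Deep in the money the price approaches the intrinsic value S1 - S2, a useful sanity check on any implementation.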


Whilst hydrological systems can show resilience to short-term streamflow deficiencies during within-year droughts, prolonged deficits during multi-year droughts are a significant threat to water resources security in Europe. This study uses a threshold-based objective classification of regional hydrological drought to qualitatively examine the characteristics, spatio-temporal evolution and synoptic climatic drivers of multi-year drought events in 1962–64, 1975–76 and 1995–97, on a European scale but with particular focus on the UK. Whilst all three events are multi-year, pan-European phenomena, their development and causes can be contrasted. The critical factor in explaining the unprecedented severity of the 1975–76 event is the consecutive occurrence of winter and summer drought. In contrast, 1962–64 was a succession of dry winters, mitigated by quiescent summers, whilst 1995–97 lacked spatial coherence and was interrupted by wet interludes. Synoptic climatic conditions vary within and between multi-year droughts, suggesting that regional factors modulate the climate signal in streamflow drought occurrence. Despite being underpinned by qualitatively similar climatic conditions and commonalities in evolution and characteristics, each of the three droughts has a unique spatio-temporal signature. An improved understanding of the spatio-temporal evolution and characteristics of multi-year droughts has much to contribute to monitoring and forecasting capability, and to improved mitigation strategies.
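The threshold-based identification underlying such a classification can be sketched simply (a generic illustration, not the study's full regional method): drought events are the contiguous runs during which streamflow stays below a chosen threshold.

```python
def drought_events(flows, threshold):
    """Threshold-based drought identification: return (start, end) index
    pairs for each contiguous run where flow stays below the threshold.
    Multi-year droughts appear as long runs spanning many periods."""
    events, start = [], None
    for i, q in enumerate(flows):
        if q < threshold and start is None:
            start = i                       # drought begins
        elif q >= threshold and start is not None:
            events.append((start, i - 1))   # drought ends
            start = None
    if start is not None:                   # drought still ongoing at end
        events.append((start, len(flows) - 1))
    return events
```

In practice the threshold is usually a flow percentile (e.g. Q90) that may vary seasonally, and short interruptions like the 1995-97 wet interludes can be pooled or kept as event breaks depending on the pooling rule.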


The problem of planning multiple vehicles concerns the design of an effective algorithm that enables multiple autonomous vehicles on the road to communicate and generate a collaborative optimal travel plan. Our modelling of the problem considers vehicles that vary greatly in both size and speed, which makes it suboptimal to have a faster vehicle follow a slower vehicle or for vehicles to drive in predefined speed lanes. It is essential that the planning algorithm be fast whilst remaining probabilistically complete. The Rapidly-exploring Random Trees (RRT) algorithm developed and reported on here uses a problem-specific coordination axis, a local optimization algorithm, priority-based coordination, and a module for deciding travel speeds. Vehicles are assumed to remain in their current relative lateral position on the road unless otherwise instructed. Experimental results presented here show regular driving behaviours, namely vehicle following, overtaking, and complex obstacle avoidance. The ability to showcase complex behaviours in the absence of speed lanes is characteristic of the solution developed.
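A minimal single-vehicle RRT in a 2-D workspace illustrates the core loop the paper builds on (the coordination axis, priorities and speed module are omitted; the 10x10 workspace and parameter values are arbitrary):

```python
import math
import random

def rrt(start, goal, is_free, step=0.5, iters=5000, goal_tol=1.0, seed=1):
    """Basic RRT sketch: repeatedly sample the workspace, steer the
    nearest tree node a small step toward the sample, and stop once a
    node lands in the goal region; return the path root-to-goal."""
    random.seed(seed)
    nodes, parent = [start], {0: None}
    for _ in range(iters):
        sample = (random.uniform(0.0, 10.0), random.uniform(0.0, 10.0))
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        near = nodes[i]
        d = math.dist(near, sample)
        if d == 0.0:
            continue
        if d > step:   # steer only a bounded step toward the sample
            new = (near[0] + step * (sample[0] - near[0]) / d,
                   near[1] + step * (sample[1] - near[1]) / d)
        else:
            new = sample
        if not is_free(new):          # collision check against obstacles
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < goal_tol:
            path, j = [], len(nodes) - 1
            while j is not None:      # backtrack to the root
                path.append(nodes[j])
                j = parent[j]
            return path[::-1]
    return None
```

The random sampling is what gives the method its probabilistic completeness: the probability of finding a path, if one exists, approaches one as iterations grow.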


The planning of semi-autonomous vehicles in traffic scenarios is a relatively new problem that contributes towards the goal of making road travel free of human drivers. An algorithm needs to ensure optimal real-time planning of multiple vehicles (moving in either direction along a road) in the presence of a complex obstacle network. Unlike other approaches, here we assume that speed lanes are not present and that separate lanes need not be maintained for inbound and outbound traffic. Our basic hypothesis is to carry forward the planning task so as to ensure that each vehicle maintains a sufficient distance from all other vehicles, obstacles and road boundaries. We present a 4-layer planning algorithm consisting of road selection (selecting the individual roads of traversal to reach the goal), pathway selection (a strategy to avoid and/or overtake obstacles, road diversions and other blockages), pathway distribution (selecting the position of a vehicle at every instant of time within a pathway), and trajectory generation (generating a curve smooth enough to allow the maximum possible speed). Cooperation between vehicles is handled separately at the different levels, the aim being to maximize the separation between vehicles. Simulated results exhibit smooth, efficient and safe driving in multiple scenarios, along with typical vehicle behaviours including following and overtaking.
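For the trajectory-generation layer, one common way to smooth a corner between waypoints is a quadratic Bezier curve (an illustrative choice; the paper does not state which curve family it uses):

```python
def bezier(p0, p1, p2, n=4):
    """Quadratic Bezier sketch for trajectory generation: a smooth
    curve from p0 to p2 pulled toward control point p1, sampled at
    n+1 parameter values.  Smoothness permits higher travel speed."""
    pts = []
    for i in range(n + 1):
        t = i / n
        x = (1 - t)**2 * p0[0] + 2 * (1 - t) * t * p1[0] + t**2 * p2[0]
        y = (1 - t)**2 * p0[1] + 2 * (1 - t) * t * p1[1] + t**2 * p2[1]
        pts.append((x, y))
    return pts
```

The curve interpolates its endpoints exactly while rounding off the corner at p1, so curvature (and hence the speed limit it implies) stays bounded.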


Boreal winter wind storm situations over Central Europe are investigated by means of an objective cluster analysis. Surface data from the NCEP Reanalysis and the ECHAM4/OPYC3 climate change GHG simulation (IS92a) are considered. To achieve an optimum separation of clusters of extreme storm conditions, 55 clusters of weather patterns are differentiated. To reduce the computational effort, a PCA is initially performed, leading to a data reduction of about 98 %. The clustering itself was computed on 3-day periods constructed from the first six PCs using the k-means clustering algorithm. The applied method enables an evaluation of the time evolution of the synoptic developments. The climate change signal is constructed by projecting the GCM simulation onto the EOFs obtained from the NCEP Reanalysis. Consequently, the same clusters are obtained and frequency distributions can be compared. For Central Europe, four primary storm clusters are identified. These clusters feature almost 72 % of the historical extreme storm events yet account for only 5 % of the total relative frequency. Moreover, they show a statistically significant signature in the associated wind fields over Europe. An increased frequency of Central European storm clusters is detected under enhanced GHG conditions, associated with a strengthening of the pressure gradient over Central Europe. Consequently, more intense wind events over Central Europe are expected. The presented algorithm will be highly valuable for the analysis of large data volumes, as required for e.g. multi-model ensemble analysis, particularly because of the enormous data reduction.


Hybrid multiprocessor architectures which combine re-configurable computing and multiprocessors on a chip are being proposed to transcend the performance of standard multi-core parallel systems. Both fine-grained and coarse-grained parallel algorithm implementations are feasible in such hybrid frameworks. A compositional strategy for designing fine-grained multi-phase regular processor arrays to target hybrid architectures is presented in this paper. The method is based on deriving component designs using classical regular array techniques and composing the components into a unified global design. Effective designs with phase-changes and data routing at run-time are characteristics of these designs. In order to describe the data transfer between phases, the concept of communication domain is introduced so that the producer–consumer relationship arising from multi-phase computation can be treated in a unified way as a data routing phase. This technique is applied to derive new designs of multi-phase regular arrays with different dataflow between phases of computation.


For an increasing number of applications, mesoscale modelling systems now aim to better represent urban areas. The complexity of processes resolved by urban parametrization schemes varies with the application. The concept of fitness-for-purpose is therefore critical for both the choice of parametrizations and the way in which the scheme should be evaluated. A systematic and objective model response analysis procedure (Multiobjective Shuffled Complex Evolution Metropolis (MOSCEM) algorithm) is used to assess the fitness of the single-layer urban canopy parametrization implemented in the Weather Research and Forecasting (WRF) model. The scheme is evaluated regarding its ability to simulate observed surface energy fluxes and the sensitivity to input parameters. Recent amendments are described, focussing on features which improve its applicability to numerical weather prediction, such as a reduced and physically more meaningful list of input parameters. The study shows a high sensitivity of the scheme to parameters characterizing roof properties in contrast to a low response to road-related ones. Problems in partitioning of energy between turbulent sensible and latent heat fluxes are also emphasized. Some initial guidelines to prioritize efforts to obtain urban land-cover class characteristics in WRF are provided. Copyright © 2010 Royal Meteorological Society and Crown Copyright.
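MOSCEM itself is a multi-objective sampling algorithm well beyond a short sketch, but the simplest form of the parameter-response question it addresses can be illustrated with a one-at-a-time sensitivity check (purely illustrative; this is not the MOSCEM procedure, and the model and names are invented):

```python
def sensitivity(model, params, frac=0.1):
    """One-at-a-time sensitivity sketch: perturb each parameter by
    +/- frac and report a normalised response magnitude.  Parameters
    with near-zero response (like the road-related ones in the study)
    are candidates for fixing at default values."""
    base = model(params)
    out = {}
    for name, v in params.items():
        up = dict(params, **{name: v * (1 + frac)})
        dn = dict(params, **{name: v * (1 - frac)})
        resp = abs(model(up) - model(dn))
        out[name] = resp / (2 * frac * abs(base)) if base else 0.0
    return out
```

The study's conclusion that roof parameters dominate while road parameters matter little corresponds to a large spread in exactly this kind of response measure.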


An extensive off-line evaluation of the Noah/Single Layer Urban Canopy Model (Noah/SLUCM) urban land-surface model is presented using data from 15 sites to assess (1) the ability of the scheme to reproduce the surface energy balance observed in a range of urban environments, including seasonal changes, and (2) the impact of increasing complexity of input parameter information. Model performance is found to be most dependent on representation of vegetated surface area cover; refinement of other parameter values leads to smaller improvements. Model biases in net all-wave radiation and trade-offs between turbulent heat fluxes are highlighted using an optimization algorithm. Here we use the Urban Zones to characterize Energy partitioning (UZE) as the basis to assign default SLUCM parameter values. A methodology (FRAISE) to assign sites (or areas) to one of these categories based on surface characteristics is evaluated. Using three urban sites from the Basel Urban Boundary Layer Experiment (BUBBLE) dataset, an independent evaluation of the model performance with the parameter values representative of each class is performed. The scheme copes well with both seasonal changes in the surface characteristics and intra-urban heterogeneities in energy flux partitioning, with RMSE performance comparable to similar state-of-the-art models for all fluxes, sites and seasons. The potential of the methodology for high-resolution atmospheric modelling application using the Weather Research and Forecasting (WRF) model is highlighted. This analysis supports the recommendations that (1) three classes are appropriate to characterize the urban environment, and (2) that the parameter values identified should be adopted as default values in WRF.


In this paper, multi-hop cooperative networks implementing channel state information (CSI)-assisted amplify-and-forward (AF) relaying in the presence of in-phase and quadrature-phase (I/Q) imbalance are investigated. We propose a compensation algorithm for the I/Q imbalance. The performance of the multi-hop CSI-assisted AF cooperative networks with and without compensation for I/Q imbalance in Nakagami-m fading environment is evaluated in terms of average symbol error probability. Numerical results are provided and show that the proposed compensation method can effectively mitigate the impact of I/Q imbalance.
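The standard widely-linear model of receiver I/Q imbalance, and its inversion when the distortion coefficients are known, can be sketched as follows (the gain and phase values are illustrative, and the paper's estimation of the coefficients over the multi-hop AF链 is not reproduced here):

```python
import numpy as np

def iq_imbalance(s, g=1.05, phi=0.05):
    """Standard I/Q imbalance model: r = K1*s + K2*conj(s), where g and
    phi are the amplitude and phase mismatch between the I and Q branches."""
    K1 = (1 + g * np.exp(-1j * phi)) / 2
    K2 = (1 - g * np.exp(1j * phi)) / 2
    return K1 * s + K2 * np.conj(s), K1, K2

def compensate(r, K1, K2):
    """Invert the widely-linear distortion (K1, K2 assumed known;
    in practice they must be estimated)."""
    D = np.abs(K1) ** 2 - np.abs(K2) ** 2
    return (np.conj(K1) * r - K2 * np.conj(r)) / D
```

Since conj(r) = conj(K1)*conj(s) + conj(K2)*s, the combination conj(K1)*r - K2*conj(r) cancels the image term exactly, recovering s up to the factor D.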


Distributed generation plays a key role in reducing CO2 emissions and losses in the transmission of power. However, due to the nature of renewable resources, distributed generation requires suitable control strategies to assure reliability and optimality for the grid. Multi-agent systems are perfect candidates for providing distributed control of distributed generation stations as well as providing reliability and flexibility for grid integration. The proposed multi-agent energy management system consists of single-type agents that control one or more grid entities, which are represented as generic sub-agent elements. The agent applies one control algorithm across all elements and uses a cost function to evaluate the suitability of each element as a supplier. The behavior set by the agent's user defines which parameters of an element have greater weight in the cost function, allowing the user to specify supplier preferences dynamically. This study shows the ability of the multi-agent energy management system to select suppliers according to the selection behavior given by the user. The optimality of the supplier for the required demand is ensured by the cost function based on the parameters of the element.
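The weighted cost-function selection described above might look like the following (a hypothetical sketch; the element parameters, weight names and values are invented for illustration):

```python
def select_supplier(elements, weights):
    """Pick the supplier element minimising a weighted cost function.
    The weights encode the user-defined selection behaviour: a larger
    weight on a parameter makes that parameter dominate the choice."""
    def cost(e):
        return sum(weights[k] * e[k] for k in weights)
    return min(elements, key=cost)
```

Changing the weights at run time changes which supplier wins, which is how the user steers supplier preference dynamically.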