56 results for Objective measure
Abstract:
Many optimal control problems are characterized by multiple performance measures that are often noncommensurable and in competition with each other. The presence of multiple objectives in a problem usually gives rise to a set of optimal solutions, widely known as Pareto-optimal solutions. Evolutionary algorithms have been recognized as well suited for multi-objective optimization because of their capability to evolve a set of nondominated solutions distributed along the Pareto front. This has led to the development of many evolutionary multi-objective optimization algorithms, among which the Nondominated Sorting Genetic Algorithm (NSGA, and its enhanced version NSGA-II) has been found effective in solving a wide variety of problems. Recently, we reported a genetic-algorithm-based technique for solving dynamic single-objective optimization problems, with single as well as multiple control variables, that appear in fed-batch bioreactor applications. The purpose of this study is to extend this methodology to the solution of multi-objective optimal control problems under the framework of NSGA-II. The applicability of the technique is illustrated by solving two optimal control problems, taken from the literature, which have usually been solved by several methods as single-objective dynamic optimization problems. (C) 2004 Elsevier Ltd. All rights reserved.
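The Pareto-optimality notion this abstract relies on can be sketched in a few lines. The following is a generic illustration of Pareto dominance and nondominated filtering, the building block of NSGA-II's nondominated sorting; it is not code from the paper, the sample objective vectors are invented, and minimization of both objectives is assumed.

```python
def dominates(a, b):
    """True if a is no worse than b in every objective and strictly
    better in at least one (both objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_front(points):
    """Keep only the points not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical objective vectors (objective 1, objective 2):
objectives = [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0), (3.0, 4.0)]
front = nondominated_front(objectives)
# (3.0, 4.0) is dominated by (2.0, 3.0); the other three trade off the
# two objectives and together form the nondominated front.
```

NSGA-II applies this filtering repeatedly to rank a whole population into successive fronts, and adds a crowding-distance measure to keep each front well spread.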
Abstract:
Regional impacts of climate change remain subject to large uncertainties accumulating from various sources, including those due to the choice of general circulation models (GCMs), scenarios, and downscaling methods. Objective constraints to reduce the uncertainty in regional predictions have proven elusive. In most studies to date the nature of the downscaling relationship (DSR) used for such regional predictions has been assumed to remain unchanged in a future climate. However, studies have shown that climate change may manifest in terms of changes in the frequencies of occurrence of the leading modes of variability, and hence stationarity of DSRs is not really a valid assumption in regional climate impact assessment. This work presents an uncertainty modeling framework where, in addition to GCM and scenario uncertainty, uncertainty in the nature of the DSR is explored by linking downscaling with changes in the frequencies of such modes of natural variability. Future projections of the regional hydrologic variable obtained by training a conditional random field (CRF) model on each natural cluster are combined using the weighted Dempster-Shafer (D-S) theory of evidence combination. Each projection is weighted with the future projected frequency of occurrence of that cluster ("cluster linking") and scaled by the GCM performance with respect to the associated cluster for the present period ("frequency scaling"). The D-S theory was chosen for its ability to express beliefs in some hypotheses, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The methodology is tested for predicting monsoon streamflow of the Mahanadi River at Hirakud Reservoir in Orissa, India. The results show an increasing probability of extreme, severe, and moderate droughts due to climate change.
Significantly improved agreement between GCM predictions owing to cluster linking and frequency scaling is seen, suggesting that by linking regional impacts to natural regime frequencies, uncertainty in regional predictions can be realistically quantified. Additionally, by using a measure of GCM performance in simulating natural regimes, this uncertainty can be effectively constrained.
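As a rough illustration of the evidence-combination step, the sketch below applies plain Dempster's rule to two mass functions over a tiny frame of discernment. The paper's weighted variant additionally scales each cluster's evidence by its projected frequency and the GCM's skill, which is omitted here; the frame and the mass values are invented for the example.

```python
def combine(m1, m2):
    """Dempster's rule of combination for two mass functions given as
    {frozenset: mass} dicts over the same frame of discernment."""
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc          # mass assigned to disjoint sets
    # Renormalize by the non-conflicting mass.
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

drought = frozenset({"drought"})
normal = frozenset({"normal"})
either = drought | normal                     # total ignorance between the two
m1 = {drought: 0.6, either: 0.4}              # evidence from one source
m2 = {drought: 0.5, normal: 0.2, either: 0.3} # evidence from another
m = combine(m1, m2)
belief_drought = m[drought]                   # mass committed exactly to drought
```

The mass left on the full frame (`either`) is what lets the theory represent ignorance explicitly, which is the property the abstract cites as the reason for choosing D-S over a plain probabilistic average.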
Abstract:
In this paper, we present a generic method/model for multi-objective design optimization of laminated composite components, based on the Vector Evaluated Artificial Bee Colony (VEABC) algorithm. VEABC is a parallel, vector-evaluated, swarm intelligence multi-objective variant of the Artificial Bee Colony (ABC) algorithm. In the current work a modified version of the VEABC algorithm for discrete variables has been developed and implemented successfully for the multi-objective design optimization of composites. The problem is formulated with the multiple objectives of minimizing the weight and the total cost of the composite component to achieve a specified strength. The primary optimization variables are the number of layers, the stacking sequence (the orientation of the layers) and the thickness of each layer. Classical lamination theory is utilized to determine the stresses in the component, and the design is evaluated based on three failure criteria: a failure-mechanism-based criterion, the maximum stress criterion and the Tsai-Wu criterion. The optimization method is validated for a number of different loading configurations: uniaxial, biaxial and bending loads. The design optimization has been carried out for variable stacking sequences as well as fixed standard stacking schemes, and a comparative study of the different design configurations evolved has been presented. Finally, the performance is evaluated in comparison with other nature-inspired techniques, which include Particle Swarm Optimization (PSO), Artificial Immune System (AIS) and Genetic Algorithm (GA). The performance of ABC is on par with that of PSO, AIS and GA for all the loading configurations. (C) 2009 Elsevier B.V. All rights reserved.
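To make the two competing objectives concrete, here is a hypothetical sketch of how a candidate laminate might be scored on weight and cost from its discrete design variables. The density and price constants are assumptions, and the real evaluation additionally applies classical lamination theory and the three failure criteria before a design is accepted.

```python
DENSITY_KG_M3 = 1600.0   # assumed carbon/epoxy density, kg/m^3
COST_PER_KG = 40.0       # assumed material cost per kg

def evaluate(n_plies, ply_thickness_m, area_m2=1.0):
    """Return the (weight, cost) objective pair for a candidate laminate
    described by its ply count and per-ply thickness."""
    volume = n_plies * ply_thickness_m * area_m2
    weight = DENSITY_KG_M3 * volume
    return weight, COST_PER_KG * weight

# A 16-ply laminate with 0.125 mm plies over 1 m^2:
w, c = evaluate(n_plies=16, ply_thickness_m=0.000125)
```

A VEABC-style search would generate many such candidates, keep the nondominated (weight, cost) pairs, and discard any candidate that violates a failure criterion.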
Abstract:
Channel assignment in multi-channel multi-radio wireless networks poses a significant challenge due to the scarcity of channels available in the wireless spectrum. Further, additional care has to be taken to account for the interference characteristics of the nodes in the network, especially when nodes are in different collision domains. This work views the problem of channel assignment in multi-channel multi-radio networks with multiple collision domains as a non-cooperative game where the objective of the players is to maximize their individual utility by minimizing their interference. Necessary and sufficient conditions are derived for the channel assignment to be a Nash equilibrium (NE), and the efficiency of the NE is analyzed by deriving a lower bound on the price of anarchy of this game. A new fairness measure in the multiple collision domain context is proposed, and necessary and sufficient conditions for NE outcomes to be fair are derived. The equilibrium conditions are then applied to solve the channel assignment problem by proposing three algorithms, based on perfect/imperfect information, which rely on explicit communication between the players for arriving at an NE. A no-regret learning algorithm known as the Freund and Schapire Informed algorithm, which has the additional advantage of low overhead in terms of information exchange, is proposed and its convergence to the stabilizing outcomes is studied. New performance metrics are proposed and extensive simulations are carried out in MATLAB to obtain a thorough understanding of the performance of these algorithms on various topologies with respect to these metrics. It was observed that the proposed algorithms were able to achieve good convergence to an NE, resulting in efficient channel assignment strategies.
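The no-regret scheme mentioned here is, at its core, a multiplicative-weights update. The sketch below is a generic Hedge-style learner over three channels with made-up interference losses, not the paper's algorithm; in the paper, the loss would reflect the interference a node experiences on each channel.

```python
import math

def hedge_update(weights, losses, eta=0.5):
    """One multiplicative-weights step: scale each action's weight by
    exp(-eta * loss) and renormalize to a probability distribution."""
    scaled = [w * math.exp(-eta * l) for w, l in zip(weights, losses)]
    total = sum(scaled)
    return [w / total for w in scaled]

# Three channels; channel 1 consistently causes the least interference.
weights = [1 / 3, 1 / 3, 1 / 3]
for _ in range(50):
    losses = [0.9, 0.1, 0.8]   # stylized per-channel interference losses
    weights = hedge_update(weights, losses)

best = max(range(3), key=lambda i: weights[i])
# The learner's play concentrates on the least-interfered channel.
```

The appeal of such learners for channel assignment is exactly the low overhead the abstract notes: each node only needs its own per-channel losses, not the full strategies of the other players.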
Abstract:
We have developed a theory for an electrochemical way of measuring the statistical properties of a nonfractally rough electrode. We obtained an expression for the current transient on a rough electrode which shows three time regions: the short- and long-time limits and the transition region between them. The expressions for these time ranges are exploited to extract morphological information about the surface roughness. In the short- and long-time regimes, we extract information regarding various morphological features like the roughness factor, average roughness, curvature, correlation length, dimensionality of roughness, and a polynomial approximation for the correlation function. Formulas for the surface structure factors (the measure of surface roughness) of rough surfaces in terms of measured reversible and diffusion-limited current transients are also obtained. Finally, we explore the feasibility of making such measurements.
Abstract:
A new method based on the analysis of a single diffraction pattern is proposed to measure deflections in micro-cantilever (MC) based sensor probes, achieving typical deflection resolutions of 1 nm and surface stress changes of 50 µN/m. The proposed method employs a double MC structure where the deflection of one of the micro-cantilevers relative to the other due to surface stress changes results in a linear shift of the intensity maxima of the Fraunhofer diffraction pattern of the transilluminated MC. Measurement of such shifts in the intensity maxima of a particular order along the length of the structure can be done to an accuracy of 0.01 mm, leading to the proposed sensitivity of deflection measurement in a typical micro-cantilever. This method can overcome the fundamental measurement sensitivity limit set by diffraction and the pointing stability of the laser beam in the widely used Optical Beam Deflection Method (OBDM).
Abstract:
Notched three-point bend (TPB) specimens were tested under crack mouth opening displacement (CMOD) control at a rate of 0.0004 mm/s, and during the fracture process acoustic emissions (AE) were simultaneously monitored. It was observed that AE energy could be related to fracture energy. An experimental study was done to understand the behavior of AE energy with parameters of concrete such as its strength and size. In this study, AE energy was used as a quantitative measure of the size-independent specific fracture energy of concrete beams, and the concepts of boundary effect and local fracture energy were used to obtain size-independent AE energy, from which size-independent fracture energy was obtained. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
The problem of quantifying the intelligence of humans, and of intelligent systems, has been a challenging and controversial topic. IQ tests have traditionally been used to quantify human intelligence based on the results of tests designed by psychologists. It is in general very difficult to quantify intelligence. In this paper the authors consider a simple question-answering (Q-A) system and use it to quantify intelligence. The authors quantify intelligence as a vector with three components: a measure of knowledge in asking questions, the effectiveness of the questions asked, and the correctness of deduction. The authors formalize these parameters and have conducted experiments on humans to measure them.
Abstract:
The suitability of the European Centre for Medium-Range Weather Forecasts (ECMWF) operational wind analysis for the period 1980-1991 for studying interannual variability is examined. The changes in the model and the analysis procedure are shown to give rise to a systematic and significant trend in the large-scale circulation features. A new method of removing the systematic errors at all levels is presented using multivariate EOF analysis. The objectively detrended analysis of the three-dimensional wind field agrees well with the independent Florida State University (FSU) wind analysis at the surface. It is shown that the interannual variations in the detrended surface analysis agree well in amplitude as well as spatial patterns with those of the FSU analysis. Therefore, the detrended analyses at other levels are also expected to be useful for studies of variability and predictability at interannual time scales. It is demonstrated that this trend in the wind field is due to the shift in the climatologies from the period 1980-1985 to the period 1986-1991.
Abstract:
Two smectite samples having different layer charges were pillared using hydroxy-aluminium oligomers at an OH/Al ratio of 2.5 and at pH 4.3 to 4.6. Pillaring was carried out under different conditions of ageing, temperature and base addition time of the pillaring solution, and also in the presence of the nonionic surfactant polyoxyethylene sorbitan monooleate (Tween-80). The primary objective of preparing at different conditions was to introduce varied quantities of aluminium oligomer between the layers and to study its effect on the properties of the pillared products. A simple method has been followed to estimate the amount of interlayer aluminium. A quantity called the pillar density number (PDN), based on the ratio of interlayer Al adsorbed to the CEC of the parent clay, has been effectively used to evaluate the nature of the resulting pillared product. The PDN, for a given clay, was found to correlate well with the sharpness of the d(001) peaks for the air-dried samples. The calculated number of pillars varied from 3.00 x 10^18 to 5.32 x 10^18 per meq charge. The present study shows that a higher value of PDN is indicative of better thermal stability. The pillar density number may be conveniently used as a measure of the thermal stability of pillared samples.
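The PDN itself is simple arithmetic: the ratio of interlayer Al adsorbed to the cation exchange capacity (CEC) of the parent clay. The numbers in the sketch below are invented for illustration, not values reported in the abstract.

```python
def pillar_density_number(interlayer_al_meq, cec_meq):
    """PDN: ratio of interlayer Al adsorbed to the CEC of the parent clay
    (both in the same units, e.g. meq per 100 g of clay)."""
    return interlayer_al_meq / cec_meq

# Hypothetical values: 210 meq/100 g of adsorbed Al on a clay with a
# CEC of 100 meq/100 g gives a PDN of 2.1.
pdn = pillar_density_number(interlayer_al_meq=210.0, cec_meq=100.0)
```

Per the study, a higher PDN for a given clay correlates with sharper d(001) peaks and better thermal stability of the pillared product.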
Abstract:
A new approach based on occupation measures is introduced for studying stochastic differential games. For two-person zero-sum games, the existence of values and optimal strategies for both players is established for various payoff criteria. For N-person games, the existence of equilibria in Markov strategies is established for various cases.
Abstract:
Seizure electroencephalography (EEG) was recorded from two channels, right (Rt) and left (Lt), during bilateral electroconvulsive therapy (ECT) (n = 12) and unilateral ECT (n = 12). The EEG was also acquired into a microcomputer and was analyzed without knowledge of the clinical details. EEG recordings of both ECT procedures yielded seizures of comparable duration. The Strength Symmetry Index (SSI) was computed from the early- and mid-seizure phases using the fractal dimension of the EEG. The seizures of unilateral ECT were characterized by significantly smaller SSI in both phases. More unilateral than bilateral ECT seizures had a smaller-than-median SSI in both phases. The seizures also differed on other measures as reported in the literature. The findings indicate that SSI may be a potential measure of seizure adequacy that remains to be validated in future research.
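The abstract does not say which fractal-dimension estimator underlies the SSI, so as a purely illustrative stand-in, the sketch below computes the Katz fractal dimension of a 1-D signal; a symmetry index could then compare the values obtained from the right- and left-channel EEG.

```python
import math

def katz_fd(x):
    """Katz fractal dimension of a 1-D signal x (a list of samples)."""
    n = len(x) - 1                                        # number of steps
    length = sum(abs(x[i + 1] - x[i]) for i in range(n))  # total curve length
    extent = max(abs(xi - x[0]) for xi in x[1:])          # max excursion from start
    return math.log10(n) / (math.log10(n) + math.log10(extent / length))

line = list(range(100))   # a straight ramp has dimension exactly 1
fd_line = katz_fd(line)
# A jagged trace, e.g. [0, 2, 1, 3, 2, 4, 3, 5], yields a value above 1.
```

A ratio of the two channels' dimensions (Rt vs. Lt) is one plausible way to turn such per-channel values into a symmetry index, but the SSI's actual definition is not given in the abstract.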
Abstract:
The conventional definition of redundancy is applicable to skeletal structural systems only, whereas the concept of redundancy has never been discussed in the context of a continuum. Generally, structures in civil engineering constitute a combination of both skeletal and continuum segments. Hence, this paper presents a generalized definition of redundancy in terms of structural response sensitivity, which is applicable to both continuum and discrete structures. In contrast to the conventional definition of redundancy, which is assumed to be fixed for a given structure and is believed to be independent of loading and material properties, the new definition depends on the strength and response of the structure at a given stage of its service life. The redundancy measure proposed in this paper is linked to the structural response sensitivities. Thus, the structure can have different degrees of redundancy during its lifetime, depending on the response sensitivity under consideration. It is believed that this new redundancy measure would be more relevant in structural evaluation, damage assessment, and reliability analysis of structures at large.
Abstract:
We introduce a multifield comparison measure for scalar fields that helps in studying relations between them. The comparison measure is insensitive to noise in the scalar fields and to noise in their gradients. Further, it can be computed robustly and efficiently. Results from the visual analysis of various data sets from climate science and combustion applications demonstrate the effective use of the measure.