880 results for Fuzzy Multi-Objective Linear Programming


Relevance: 100.00%

Abstract:

This article addresses the problem of selecting the optimal combination of sensors and determining their optimal placement in a surveillance region so as to meet the given performance requirements at minimal cost for a multimedia surveillance system. We propose to solve this problem by obtaining a performance vector, with its elements representing the performances of subtasks, for a given input combination of sensors and their placement. We then show that the optimal sensor selection problem can be converted into an Integer Linear Programming (ILP) problem by using a linear model for computing the optimal performance vector corresponding to a sensor combination, where the optimal performance vector for a combination refers to the performance vector obtained with that combination's optimal placement. To demonstrate the utility of our technique, we design and build a surveillance system consisting of PTZ (Pan-Tilt-Zoom) cameras and active motion sensors for capturing faces. Finally, we show experimentally that optimal placement of sensors based on the design maximizes the system performance.
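
As a rough illustration of the kind of ILP described above (the candidate sensors, costs and the additive performance model below are invented for this sketch, not taken from the article), a minimal Python example:

    # Hypothetical sensor-selection ILP: pick a minimum-cost subset of candidate
    # sensor placements so that every surveillance subtask reaches its required
    # performance level, assuming subtask performance adds linearly across sensors.
    import numpy as np
    from scipy.optimize import milp, LinearConstraint, Bounds

    cost = np.array([400.0, 650.0, 300.0, 500.0])   # cost of each candidate placement
    perf = np.array([[0.6, 0.2, 0.7, 0.3],          # contribution of each candidate
                     [0.1, 0.8, 0.2, 0.6]])         # to subtasks 1 and 2
    required = np.array([0.9, 0.9])                 # required performance per subtask

    res = milp(c=cost,                                           # minimise total cost
               constraints=LinearConstraint(perf, lb=required),  # perf @ x >= required
               integrality=np.ones(cost.size),                   # x_j integer
               bounds=Bounds(0, 1))                              # 0 <= x_j <= 1, i.e. binary
    print("selected sensors:", res.x, "total cost:", res.fun)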

Relevance: 100.00%

Abstract:

In this paper, non-linear programming techniques are applied to the problem of controlling the vibration pattern of a stretched string. First, the problem of finding the magnitudes of two control forces applied at two points l1 and l2 on the string to reduce the energy of vibration over the interval (l1, l2) relative to the energy outside the interval (l1, l2) is considered. For this problem the relative merits of various methods of non-linear programming are compared. The more complicated problem of finding the positions and magnitudes of two control forces to obtain the desired energy pattern is then solved by using the slack unconstrained minimization technique with the Fletcher-Powell search. In the discussion of the results it is shown that the position of the control force is very important in controlling the energy pattern of the string.
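
A toy numerical sketch in the spirit of the first problem above; the string model, driving frequency, force locations and the use of BFGS in place of the Fletcher-Powell search are all assumptions of this sketch, not details from the paper:

    # Pinned-pinned string driven by a unit harmonic disturbance; two control
    # forces of unknown magnitude act at fixed points l1 and l2.  We minimise the
    # steady-state vibration energy inside (l1, l2) relative to the energy outside
    # it, over the two force magnitudes, with an unconstrained quasi-Newton search.
    import numpy as np
    from scipy.optimize import minimize

    L, c, omega = 1.0, 1.0, 7.0             # string length, wave speed, driving frequency
    N = 20                                  # number of modes retained
    l1, l2, l_dist = 0.3, 0.7, 0.5          # control-force points and disturbance point
    x = np.linspace(0.0, L, 501)
    dx = x[1] - x[0]
    n = np.arange(1, N + 1)
    omega_n = n * np.pi * c / L             # natural frequencies of the modes

    def deflection(forces, positions):
        """Steady-state modal response to harmonic point forces."""
        q = np.zeros(N)
        for f, a in zip(forces, positions):
            q += f * np.sin(n * np.pi * a / L) / (omega_n**2 - omega**2)
        return np.sin(np.outer(x, n) * np.pi / L) @ q

    def energy_ratio(u):
        w = deflection([1.0, u[0], u[1]], [l_dist, l1, l2])
        inside = (x >= l1) & (x <= l2)
        e_in = np.sum(w[inside]**2) * dx
        e_out = np.sum(w[~inside]**2) * dx
        return e_in / e_out

    res = minimize(energy_ratio, x0=np.zeros(2), method="BFGS")
    print("control force magnitudes:", res.x, "energy ratio:", res.fun)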

Relevance: 100.00%

Abstract:

This thesis investigates factors that impact the energy efficiency of a mining operation. An innovative mathematical framework and solution approach are developed to model, solve and analyse an open-pit coal mine. A case study in South East Queensland is investigated to validate the approach and explore the opportunities for using it to aid long-, medium- and short-term decision makers.

Relevance: 100.00%

Abstract:

A linear optimization model was used to calculate seven wood procurement scenarios for the years 1990, 2000 and 2010. Productivity and cost functions for seven cutting methods, five terrain transport methods, three long-distance transport methods and various work supervision and scaling methods were calculated from available work study reports. All methods are based on the Nordic cut-to-length system. Finland was divided into three parts to describe the harvesting conditions, and twenty imaginary wood processing points with their wood procurement areas were created for these areas. The procurement systems, which consist of the harvesting conditions and work productivity functions, were described as a simulation model. In the LP model the wood procurement system has to fulfil the volume and wood assortment requirements of the processing points while minimizing the procurement cost. The model consists of 862 variables and 560 restrictions. The results show that it is economical to increase the share of mechanized work in harvesting. Cost increment alternatives have only a small effect on the profitability of manual work. The areas of later thinnings and of seed tree and shelterwood cuttings increase at the expense of first thinnings. In mechanized work one method, the 10-tonne single-grip harvester with forwarder, is gaining an advantage over the other methods. Working hours of the forwarder are decreasing, in contrast to those of the harvester. There is only little need to increase the number of harvesters and trucks, or their drivers, from today's level. Quite large fluctuations in the level of procurement and in cost can be handled with a constant number of machines, by varying the number of seasonal workers and by running machines in two shifts. This is possible provided that some environmental problems of large-scale summertime harvesting can be solved.
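
The LP described above is far too large to reproduce here, but a much smaller model of the same flavour (three supply regions, two mills, invented costs and volumes) shows the structure of a cost-minimising procurement LP:

    # Minimise harvesting-plus-transport cost of supplying two mills from three
    # regions, subject to regional harvestable volumes and mill requirements.
    import numpy as np
    from scipy.optimize import linprog

    cost = np.array([[22.0, 27.0],             # cost (EUR/m3) from region i to mill j
                     [25.0, 23.0],
                     [30.0, 24.0]])
    supply = np.array([400.0, 350.0, 500.0])   # harvestable volume per region (1000 m3)
    demand = np.array([600.0, 500.0])          # mill requirements (1000 m3)

    c = cost.ravel()                                   # decision variables x[i, j], flattened
    A_supply = np.kron(np.eye(3), np.ones((1, 2)))     # sum_j x[i, j] <= supply[i]
    A_demand = np.kron(np.ones((1, 3)), np.eye(2))     # sum_i x[i, j] >= demand[j]
    res = linprog(c,
                  A_ub=np.vstack([A_supply, -A_demand]),
                  b_ub=np.concatenate([supply, -demand]),
                  bounds=(0, None))
    print(res.x.reshape(3, 2), "total cost:", res.fun)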

Relevance: 100.00%

Abstract:

By using the lower bound limit analysis in conjunction with finite elements and linear programming, the bearing capacity factors due to cohesion, surcharge and unit weight, respectively, have been computed for a circular footing with different values of phi. The recent axisymmetric formulation proposed by the authors under the phi = 0 condition, which is based on the concept that the magnitude of the hoop stress (sigma_theta) remains closer to the least compressive normal stress (sigma_3), is extended to a general c-phi soil. The computational results are found to compare quite well with the available numerical results from the literature. It is expected that the study will be useful for solving various axisymmetric geotechnical stability problems.

Relevance: 100.00%

Abstract:

By incorporating the variation of the peak soil friction angle (phi) with the mean principal stress (sigma_m), the effect of anchor width (B) on the vertical uplift resistance of a strip anchor plate has been examined. The anchor was embedded horizontally in a granular medium. The analysis was performed using lower bound finite element limit analysis and linear programming. An iterative procedure, proposed recently by the authors, was implemented to incorporate the variation of phi with sigma_m. It is noted that for a given embedment ratio, with a decrease in anchor width (B), (i) the uplift factor (F_gamma) increases continuously and (ii) the average ultimate uplift pressure (q_u) decreases quite significantly. The scale effect becomes more pronounced at greater embedment ratios.

Relevance: 100.00%

Abstract:

The problem of determining optimal power spectral density models for earthquake excitation which satisfy constraints on total average power and zero-crossing rate, and which produce the highest response variance in a given linear system, is considered. The solution to this problem is obtained using linear programming methods. The resulting solutions are shown to display a highly deterministic structure and therefore fail to capture the stochastic nature of the input. A modification to the definition of critical excitation is proposed which takes into account the entropy rate as a measure of uncertainty in the earthquake loads. The resulting problem is solved using the calculus of variations and also within a linear programming framework. Illustrative examples on specifying seismic inputs for a nuclear power plant and a tall earth dam are considered, and the resulting solutions are shown to be realistic.
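
A compact sketch of the first (unmodified) formulation, with an assumed single-degree-of-freedom system and an assumed power bound; it reproduces the qualitative observation above that the LP solution concentrates all power near resonance:

    # Maximise the response variance  sum |H(w_i)|^2 S_i dw  of a SDOF oscillator
    # over the discretised PSD ordinates S_i >= 0, subject only to a bound on the
    # total average power  sum S_i dw <= P0.  linprog minimises, hence the sign flip.
    import numpy as np
    from scipy.optimize import linprog

    omega = np.linspace(0.1, 30.0, 300)
    dw = omega[1] - omega[0]
    wn, zeta = 10.0, 0.05                                        # natural frequency, damping
    H2 = 1.0 / ((wn**2 - omega**2)**2 + (2*zeta*wn*omega)**2)    # |H(w)|^2

    P0 = 1.0                                                     # total average power bound
    res = linprog(c=-H2 * dw,
                  A_ub=np.ones((1, omega.size)) * dw,
                  b_ub=[P0],
                  bounds=(0, None))
    print("critical PSD concentrates power at w =", omega[np.argmax(res.x)])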

Relevance: 100.00%

Abstract:

The problem of spurious patterns in neural associative memory models is discussed. Some suggestions to solve this problem from the literature are reviewed and their inadequacies are pointed out. A solution based on the notion of neural self-interaction with a suitably chosen magnitude is presented for the Hebb learning rule. For an optimal learning rule based on linear programming, asymmetric dilution of synaptic connections is presented as another solution to the problem of spurious patterns. With varying percentages of asymmetric dilution it is demonstrated numerically that this optimal learning rule leads to near total suppression of spurious patterns. For practical usage of neural associative memory networks, a combination of the two solutions with the optimal learning rule is recommended as the best proposition.
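
A small numerical sketch of the first remedy mentioned above; the pattern size, load and self-interaction magnitude d are arbitrary choices for illustration, not values from the paper:

    # Hebbian synaptic matrix for +/-1 patterns with an explicitly chosen
    # self-interaction term d on the diagonal, and a simple synchronous recall.
    import numpy as np

    rng = np.random.default_rng(0)
    n, p, d = 100, 10, 0.5
    patterns = rng.choice([-1, 1], size=(p, n))

    W = patterns.T @ patterns / n          # Hebb learning rule
    np.fill_diagonal(W, d)                 # neural self-interaction of magnitude d

    def recall(state, steps=20):
        for _ in range(steps):
            state = np.sign(W @ state)
            state[state == 0] = 1          # break ties deterministically
        return state

    probe = patterns[0].copy()
    probe[:10] *= -1                       # corrupt 10 of the 100 bits
    print("overlap with stored pattern:", recall(probe) @ patterns[0] / n)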

Relevance: 100.00%

Abstract:

Routing of floods is essential to control the flood flow at the flood control station so that it stays within the specified safe limit. In this paper, the applicability of the extended Muskingum method is examined for routing of floods in a case study of the Hirakud reservoir, Mahanadi river basin, India. The inflows to the flood control station are of two types: a controllable component, which comprises reservoir releases for power and spill, and an uncontrollable component, which comprises inflow from lower tributaries and the intermediate catchment between the reservoir and the flood control station. The Muskingum model is improved to incorporate multiple sources of inflow and a single outflow to route the flood in the reach. Instead of time lag and prismoidal flow parameters, suitable coefficients for the various types of inflow were derived using linear programming. At present, decisions about the operation of the gates of Hirakud dam are taken once every 12 h during floods. However, four routing time intervals of 24, 18, 12 and 6 h are examined to test the sensitivity of the computed flood flow at the flood control station to the routing interval. It is observed that the mean relative error decreases with decreasing routing interval in both the calibration and the testing phase. It is concluded that the extended Muskingum method can be explored for similar reservoir configurations, such as the Hirakud reservoir, with suitable modifications.
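
A hedged sketch of the coefficient-fitting idea (the routing relation, the synthetic series and the nonnegativity bounds below are assumptions of this sketch): minimising the sum of absolute routing errors becomes an LP once one slack variable per time step is introduced.

    # Fit coefficients of a Muskingum-type relation with two inflow sources,
    #   O_t ~ a*I1_t + b*I2_t + c*O_{t-1},
    # by minimising sum_t |O_t - prediction_t| as a linear program.
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(1)
    T = 40
    I1 = 500 + 300*np.sin(np.linspace(0, np.pi, T))            # controllable inflow (m3/s)
    I2 = 200 + 100*rng.random(T)                               # uncontrolled inflow (m3/s)
    O = 0.4*I1 + 0.3*I2 + 100 + 20*rng.random(T)               # synthetic observed outflow

    X = np.column_stack([I1[1:], I2[1:], O[:-1]])              # regressors for O_t
    y = O[1:]
    m, k = X.shape

    # decision vector z = [k routing coefficients, m absolute-error slacks e]
    c_obj = np.concatenate([np.zeros(k), np.ones(m)])
    A_ub = np.block([[ X, -np.eye(m)],                         #  X@coef - e <=  y
                     [-X, -np.eye(m)]])                        # -X@coef - e <= -y
    b_ub = np.concatenate([y, -y])
    res = linprog(c_obj, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
    print("fitted routing coefficients:", res.x[:k])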

Relevance: 100.00%

Abstract:

Combinatorial exchanges are double-sided marketplaces with multiple sellers and multiple buyers trading with the help of combinatorial bids. The allocation and other associated problems in such exchanges are known to be among the hardest to solve of all economic mechanisms. It has been shown that the problems of surplus maximization and volume maximization in combinatorial exchanges are inapproximable even with free disposal. In this paper, the surplus maximization problem is formulated as an integer linear programming problem and we propose a Lagrangian relaxation based heuristic to find a near optimal solution. We develop computationally efficient tâtonnement mechanisms for clearing combinatorial exchanges in which the Lagrangian multipliers can be interpreted as the prices of the items set by the exchange in each iteration. Our mechanisms satisfy the individual rationality and budget nonnegativity properties. Computational experiments performed on representative data sets show that the proposed heuristic produces a feasible solution with a negligible optimality gap.
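
For concreteness, a toy version of the surplus-maximisation ILP for a two-item exchange; the bids, asks and quantities are invented, and the Lagrangian relaxation and price interpretation are not shown here:

    # x_j = 1 if bid j is accepted.  Free disposal: for every item, the accepted
    # buy quantity must not exceed the accepted sell quantity.  Surplus is the sum
    # of accepted buyers' valuations minus accepted sellers' asks.
    import numpy as np
    from scipy.optimize import milp, LinearConstraint, Bounds

    # columns of each matrix are bids, rows are the two items A and B
    qty_buy  = np.array([[1, 0],            # quantities demanded by buy bids 1, 2
                         [1, 1]])
    qty_sell = np.array([[1, 1],            # quantities offered by sell bids 1, 2
                         [0, 2]])
    value    = np.array([90.0, 120.0, -60.0, -70.0])   # bids (+) and asks (-)

    A = np.hstack([qty_buy, -qty_sell])     # buy quantity - sell quantity <= 0 per item
    res = milp(c=-value,                    # milp minimises, so negate the surplus
               constraints=LinearConstraint(A, ub=np.zeros(2)),
               integrality=np.ones(4),
               bounds=Bounds(0, 1))
    print("accepted bids:", res.x, "surplus:", -res.fun)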

Relevance: 100.00%

Abstract:

Owing to the increasing trend of intensive rice cultivation in a coastal river basin, crop planning and groundwater management are imperative for sustainable agriculture. For effective management, two models have been developed: a groundwater balance model and an optimum cropping and groundwater management model, used to determine the optimum cropping pattern and the groundwater allocation from private and government tubewells according to soil type (saline and non-saline), type of agriculture (rainfed and irrigated) and season (monsoon and winter). The groundwater balance model is based on a mass balance approach; the components considered are recharge from rainfall and from irrigated rice and non-rice fields, base flow from rivers and seepage flow from surface drains. In the second phase, a linear programming optimization model is developed for optimal cropping and groundwater management to maximize economic returns. The models were applied to a portion of a coastal river basin in Orissa State, India, and the optimal cropping pattern was obtained for various scenarios of river flow and groundwater availability.
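
A deliberately small LP of the same shape; the crop set, returns, water demands and resource limits below are invented figures, not data from the basin study:

    # Choose monsoon/winter crop areas so that net return is maximised, subject
    # to the cultivable land per season and the groundwater available.
    import numpy as np
    from scipy.optimize import linprog

    # variables: areas (ha) of [monsoon rice (rainfed), winter rice, winter pulses]
    net_return = np.array([300.0, 450.0, 250.0])        # USD/ha
    water_use  = np.array([0.0, 8.0, 3.0])              # groundwater demand (1000 m3/ha)

    land_total = 1000.0                                 # cultivable area per season (ha)
    gw_avail   = 4000.0                                 # groundwater available (1000 m3)

    A_ub = np.array([[1.0, 0.0, 0.0],                   # monsoon land
                     [0.0, 1.0, 1.0],                   # winter land
                     water_use])                        # groundwater
    b_ub = np.array([land_total, land_total, gw_avail])

    res = linprog(-net_return, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
    print("areas (ha):", res.x, "net return (USD):", -res.fun)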

Relevance: 100.00%

Abstract:

We address the problem of allocating a single divisible good to a number of agents. The agents have concave valuation functions parameterized by a scalar type, and they report only the type. The goal is to find allocatively efficient, strategy-proof, nearly budget-balanced mechanisms within the Groves class. Near budget balance is attained by returning as much of the received payments as rebates to agents. Two performance criteria are of interest: the maximum ratio of budget surplus to efficient surplus, and the expected budget surplus, within the class of linear rebate functions; the goal is to minimize them. Assuming that the valuation functions are known, we show that both problems reduce to convex optimization problems, where the convex constraint sets are characterized by a continuum of half-plane constraints parameterized by the vector of reported types. We then propose a randomized relaxation of these problems by sampling constraints. The relaxed problem is a linear programming problem (LP). We identify the number of samples needed for "near-feasibility" of the relaxed constraint set and, under some conditions on the valuation function, show that the value of the approximate LP is close to the optimal value. Simulation results show significant improvements of our proposed method over the Vickrey-Clarke-Groves (VCG) mechanism without rebates. In the special case of indivisible goods, the mechanisms in this paper fall back to those proposed by Moulin, by Guo and Conitzer, and by Gujar and Narahari, without any need for randomization. Extensions of the proposed mechanisms to situations where the valuation functions are not known to the central planner are also discussed.

Note to Practitioners: Our results will be useful in all resource allocation problems that involve gathering of information privately held by strategic users, where the utilities are concave functions of the allocations and where the resource planner is not interested in maximizing revenue but in efficient sharing of the resource. Such situations arise quite often in fair sharing of internet resources, fair sharing of funds across departments within the same parent organization, auctioning of public goods, and so on. We study methods to achieve near budget balance by first collecting payments according to the celebrated VCG mechanism and then returning as much of the collected money as rebates. Our focus on linear rebate functions allows for easy implementation. The resulting convex optimization problem is solved via relaxation to a randomized linear programming problem, for which several efficient solvers exist. This relaxation is enabled by constraint sampling; keeping practitioners in mind, we identify the number of samples that assures a desired level of near-feasibility with the desired confidence level. Our methodology will occasionally require a subsidy from outside the system. We demonstrate via simulation, however, that if the mechanism is repeated several times over independent instances, then past surplus can support the subsidy requirements. We also extend our results to situations where the strategic users' utility functions are not known to the allocating entity, a common situation in the context of internet users and other problems.
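
The rebate problem itself is involved, but the constraint-sampling device can be illustrated in isolation; in the following sketch the semi-infinite constraint family, the coefficient map a(theta) and the normalisation are all invented for illustration and are not the paper's formulation:

    # Semi-infinite LP  min t  s.t.  a(theta) . x <= t  for all theta, relaxed by
    # drawing K random type profiles and keeping only the sampled constraints.
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(2)
    dim, K = 3, 500                                    # number of weights, sampled constraints

    def a(theta):
        """Assumed constraint coefficients as a function of a reported type profile."""
        return np.array([theta.sum(), (theta**2).sum(), np.prod(theta)])

    thetas = rng.uniform(0.0, 1.0, size=(K, 4))        # sampled type profiles
    A_theta = np.array([a(th) for th in thetas])       # K x dim constraint matrix

    # decision vector z = [x (dim weights, x >= 0, sum x = 1), t]
    c = np.concatenate([np.zeros(dim), [1.0]])
    A_ub = np.hstack([A_theta, -np.ones((K, 1))])      # a(theta) . x - t <= 0
    A_eq = np.append(np.ones(dim), 0.0).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(K),
                  A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)]*dim + [(None, None)])
    print("weights:", res.x[:dim], "worst sampled constraint value:", res.fun)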

Relevance: 100.00%

Abstract:

Today's feature-rich multimedia products require embedded system solutions with complex Systems-on-Chip (SoC) to meet market expectations of high performance at low cost and low energy consumption. The memory architecture of an embedded system strongly influences critical system design objectives such as area, power and performance. Hence the embedded system designer performs a complete memory architecture exploration to custom design a memory architecture for a given set of applications; further, the designer is interested in multiple optimal design points to address various market segments. However, tight time-to-market constraints enforce a short design cycle time. In this paper we address the multi-level multi-objective memory architecture exploration problem through a combination of exhaustive-search based memory exploration at the outer level and a two-step integrated data layout for SPRAM-Cache based architectures at the inner level. The two-step integrated data layout for SPRAM-Cache based hybrid architectures consists of a data-partitioning step that partitions data between SPRAM and Cache, followed by a cache-conscious data layout step. We formulate the cache-conscious data layout as a graph partitioning problem and show that our approach gives up to 34% improvement over an existing approach while also optimizing the off-chip memory address space. We evaluated our approach on 3 embedded multimedia applications; it explores several hundred memory configurations for each application, yielding several optimal design points in a few hours of computation on a standard desktop.
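
As a small illustration of the data-partitioning step only (the greedy frequency-per-byte rule, the object list and the SPRAM size are assumptions of this sketch, not the formulation used in the paper):

    # Place the most profitably accessed objects in on-chip SPRAM until its
    # capacity is exhausted; everything else stays in the cached off-chip region.
    def partition_data(objects, spram_size):
        """objects: list of (name, size_in_bytes, access_count)."""
        in_spram, used = [], 0
        # rank by accesses per byte, a common density heuristic for scratchpads
        for name, size, accesses in sorted(objects, key=lambda o: o[2] / o[1], reverse=True):
            if used + size <= spram_size:
                in_spram.append(name)
                used += size
        return in_spram

    objects = [("coeff_tbl", 2048, 90000), ("frame_buf", 16384, 120000),
               ("huff_lut", 1024, 30000), ("scratch", 4096, 8000)]
    print(partition_data(objects, spram_size=8192))     # names of data kept in SPRAM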

Relevance: 100.00%

Abstract:

The radius of direct attraction of a discrete neural network is a measure of the stability of the network. It is known that Hopfield networks designed using Hebb's rule have a radius of direct attraction of Omega(n/p), where n is the size of the input patterns and p is the number of them. This lower bound is tight if p is no larger than 4. We construct a family of such networks with radius of direct attraction Omega(n/sqrt(p log p)) for any p greater than or equal to 5. The techniques used to prove the result led us to the first polynomial-time algorithm for designing a neural network with maximum radius of direct attraction around arbitrary input patterns. The optimal synaptic matrix is computed using the ellipsoid method of linear programming in conjunction with an efficient separation oracle. Restrictions of symmetry and non-negative diagonal entries in the synaptic matrix can be accommodated within this scheme.

Relevance: 100.00%

Abstract:

An improvised algorithm is presented for optimal VAr allocation in a large power system using a linear programming technique. The proposed method requires less computer memory than those algorithms currently available.