891 results for Multicommodity capacitated network design problem
Abstract:
An integrated approach to energy planning, when applied to large hydroelectric projects, requires that the energy-opportunity cost of the land submerged under the reservoir be incorporated into the planning methodology. Biomass energy lost from the submerged land has to be compared to the electrical energy generated, for which we develop four alternative formulations of the net-energy function. The design problem is posed as an LP problem and is solved for two sites in India. Our results show that the proposed designs may not be viable in net-energy terms, whereas a marginal reduction in the generation capacity could lead to an optimal design that gives substantial savings in the submerged area. Allowing seasonal variations in the hydroelectric generation capacity also reduces the reservoir size. A mixed hydro-wood generation system is then examined and is found to be viable.
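As a rough illustration of the kind of linear program this abstract describes, the formulation below is a hedged sketch only: the symbols P (installed generation capacity), A (submerged area), e_h and e_b (energy yields per unit capacity and per unit area), and the linearized area-capacity relation are assumptions introduced here and do not correspond to the paper's four net-energy formulations.

```latex
% Hedged sketch: maximize net energy = hydroelectric energy generated minus
% biomass energy foregone on the submerged area (all coefficients assumed).
\begin{align*}
\max_{P,\,A}\quad & E_{\text{net}} \;=\; e_h\,P \;-\; e_b\,A \\
\text{s.t.}\quad  & A \;\ge\; \alpha\,P
      && \text{(assumed linearized area-capacity relation)}\\
                  & 0 \;\le\; P \;\le\; P_{\max}
\end{align*}
```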
Abstract:
In this paper we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we design a novel auction which we call the OPT (optimal) auction. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality of the advertisers. We show that the OPT mechanism is superior to two of the most commonly used mechanisms for sponsored search, namely (1) GSP (Generalized Second Price) and (2) VCG (Vickrey-Clarke-Groves). We then show an important revenue equivalence result: the expected revenue earned by the search engine is the same for all three mechanisms, provided the advertisers are symmetric and the number of sponsored slots is strictly less than the number of advertisers.
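As a quick, hedged illustration of the two baseline mechanisms named in this abstract (not of the paper's OPT auction), the sketch below computes per-keyword GSP and VCG payments at one fixed, assumed bid profile; the bids and click-through rates are invented, and since the comparison is made at identical bids rather than at equilibrium, it does not by itself exhibit the revenue equivalence result.

```python
# Hedged sketch: expected payments under GSP and VCG for one keyword with
# position-dependent click-through rates (all numbers below are assumptions).

def gsp_payments(bids, ctrs):
    """Slot i's expected payment: the (i+1)-th highest bid per click."""
    order = sorted(bids, reverse=True)
    return [ctr * (order[i + 1] if i + 1 < len(order) else 0.0)
            for i, ctr in enumerate(ctrs)]

def vcg_payments(bids, ctrs):
    """Slot i's expected payment: the externality imposed on lower slots."""
    order = sorted(bids, reverse=True)
    a = list(ctrs) + [0.0]                      # CTRs padded with a zero slot
    pays = []
    for i in range(len(ctrs)):
        pays.append(sum((a[j] - a[j + 1]) *
                        (order[j + 1] if j + 1 < len(order) else 0.0)
                        for j in range(i, len(ctrs))))
    return pays

bids = [4.0, 3.0, 2.0, 1.0]   # assumed bids from 4 symmetric advertisers
ctrs = [0.3, 0.2, 0.1]        # assumed click-through rates for 3 slots
print("GSP revenue:", sum(gsp_payments(bids, ctrs)))   # 1.4
print("VCG revenue:", sum(vcg_payments(bids, ctrs)))   # 1.0
```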
Abstract:
Design creativity involves developing novel and useful solutions to design problems. The research in this article is an attempt to understand how novelty of a design resulting from a design process is related to the kind of outcomes, described here as constructs, involved in the design process. A model of causality, the SAPPhIRE model, is used as the basis of the analysis. The analysis is based on previous research that shows that designing involves development and exploration of the seven basic constructs of the SAPPhIRE model, which constitute the causal connection between the various levels of abstraction at which a design can be described. The constructs are state change, action, parts, phenomenon, input, organs, and effect. The following two questions are asked: Is there a relationship between novelty and the constructs? If there is a relationship, what is the degree of this relationship? A hypothesis is developed to answer the questions: an increase in the number and variety of ideas explored while designing should enhance the variety of the concept space, leading to an increase in the novelty of the concept space. Eight existing observational studies of designing sessions are used to empirically validate the hypothesis. Each designing session involves an individual designer, experienced or novice, solving a design problem by producing concepts and following a think-aloud protocol. The results indicate dependence of novelty of concept space on variety of concept space, and dependence of variety of concept space on variety of idea space, thereby validating the hypothesis. The results also reveal a strong correlation between novelty and the constructs; the correlation value decreases as the abstraction level of the constructs reduces, signifying the importance of using constructs at higher abstraction levels for enhancing novelty.
Abstract:
Energy harvesting sensors (EHS), which harvest energy from the environment in order to sense and then communicate their measurements over a wireless link, provide the tantalizing possibility of perpetual lifetime operation of a sensor network. The wireless communication link design problem needs to be revisited for these sensors, as the energy harvested can be random, small, and not available when required. In this paper, we develop a simple model that captures the interactions between important parameters that govern the communication link performance of an EHS node, and analyze its outage probability for both slow fading and fast fading wireless channels. Our analysis brings out the critical importance of the energy profile and the energy storage capability on the EHS link performance. Our results show that properly tuning the transmission parameters of the EHS node and having even a small amount of energy storage capability improves the EHS link performance considerably.
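The Monte Carlo sketch below (not the paper's closed-form analysis) illustrates the kind of outage estimate this abstract refers to; the harvesting probability, battery capacity, transmit cost, fading model, and SNR threshold are all values assumed here for illustration.

```python
import math
import random

# Hedged sketch: simulate an energy harvesting sensor link with a finite
# battery over Rayleigh block fading and estimate its outage probability.
HARVEST_P = 0.6     # assumed probability of harvesting one energy unit per slot
BATTERY_CAP = 5     # assumed battery capacity (energy units)
TX_COST = 1         # assumed energy units needed per transmission
MEAN_SNR = 10.0     # assumed mean received SNR when transmitting
SNR_MIN = 2.0       # assumed SNR threshold for successful decoding

def estimate_outage(slots=100_000, seed=1):
    rng = random.Random(seed)
    battery, outages = 0, 0
    for _ in range(slots):
        battery = min(BATTERY_CAP, battery + (1 if rng.random() < HARVEST_P else 0))
        if battery < TX_COST:
            outages += 1                                  # energy outage
            continue
        battery -= TX_COST
        snr = -MEAN_SNR * math.log(1.0 - rng.random())    # exponential (Rayleigh power) fade
        if snr < SNR_MIN:
            outages += 1                                  # channel outage
    return outages / slots

print(f"estimated outage probability: {estimate_outage():.3f}")
```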
Abstract:
The major contribution of this paper is to introduce load compatibility constraints in the mathematical model for the capacitated vehicle routing problem with pickups and deliveries. The employee transportation problem in Indian call centers and the transportation of hazardous materials provided the motivation for this variation. In this paper we develop an integer programming model for the vehicle routing problem with load compatibility constraints. Specifically, two types of load compatibility constraints are introduced, namely mutual exclusion and conditional exclusion. The model is demonstrated with an application from the employee transportation problem in Indian call centers.
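One plausible way to write the two constraint types named in this abstract is sketched below; the assignment variable y_{ik} (load i carried by vehicle k), the incompatible-pair sets M and C, and the reading of "conditional exclusion" as a dependence of one load's assignment on another's are all assumptions made here for illustration, not the paper's exact model.

```latex
% Hedged sketch of the two load-compatibility constraint types, with an
% assumed binary variable y_{ik} = 1 if load i is assigned to vehicle k.
\begin{align*}
y_{ik} + y_{jk} &\le 1 && \forall k,\ \forall (i,j)\in M
    && \text{(mutual exclusion: incompatible loads never share a vehicle)}\\
y_{ik} &\le y_{jk} && \forall k,\ \forall (i,j)\in C
    && \text{(conditional exclusion: load } i \text{ rides on } k \text{ only if load } j \text{ does)}
\end{align*}
```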
Abstract:
A link failure in the path of a virtual circuit in a packet data network will lead to premature disconnection of the circuit by the end-points. A soft failure will result in degraded throughput over the virtual circuit. If these failures can be detected quickly and reliably, then appropriate rerouteing strategies can automatically reroute the virtual circuits that use the failed facility. In this paper, we develop a methodology for analysing and designing failure detection schemes for digital facilities. Based on errored second data, we develop a Markov model for the error and failure behaviour of a T1 trunk. The performance of a detection scheme is characterized by its false alarm probability and the detection delay. Using the Markov model, we analyse the performance of detection schemes that use physical layer or link layer information. The schemes basically rely upon detecting the occurrence of severely errored seconds (SESs). A failure is declared when a counter, driven by the occurrence of SESs, reaches a certain threshold. For hard failures, the design problem reduces to a proper choice of the threshold at which failure is declared, and of the connection reattempt parameters of the virtual circuit end-point session recovery procedures. For soft failures, the performance of a detection scheme depends, in addition, on how long and how frequent the error bursts are in a given failure mode. We also propose and analyse a novel Level 2 detection scheme that relies only upon anomalies observable at Level 2, i.e. CRC failures and idle-fill flag errors. Our results suggest that Level 2 schemes that perform as well as Level 1 schemes are possible.
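A minimal sketch of a counter-based SES detector of the kind described above; the leaky-decrement rule and the threshold value used here are assumptions for illustration, not the calibrated design obtained from the Markov model.

```python
# Hedged sketch: a leaky counter driven by severely errored seconds (SES).
# The counter increments on each SES, decrements (down to zero) on each clean
# second, and a failure is declared once it reaches the threshold.
def ses_detector(ses_stream, threshold=8, decrement=1):
    """Return the 1-based second at which failure is declared, or None."""
    counter = 0
    for t, is_ses in enumerate(ses_stream, start=1):
        counter = counter + 1 if is_ses else max(0, counter - decrement)
        if counter >= threshold:
            return t
    return None

# Example: a sustained SES burst (as in a hard failure) crosses the threshold,
# while short, isolated bursts do not.
hard_failure = [False] * 20 + [True] * 12
soft_noise = ([False] * 5 + [True] * 2) * 10
print(ses_detector(hard_failure))   # declares failure during the burst
print(ses_detector(soft_noise))     # None: counter keeps leaking back to zero
```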
Abstract:
In this paper, we study the problem of wireless sensor network design by deploying a minimum number of additional relay nodes (to minimize network design cost) at a subset of given potential relay locations in order to convey the data from already existing sensor nodes (hereafter called source nodes) to a Base Station within a certain specified mean delay bound. We formulate this problem in two different ways, and show that the problem is NP-hard. For a problem in which the number of existing sensor nodes and potential relay locations is n, we propose an O(n) approximation algorithm of polynomial time complexity. Results show that the algorithm performs efficiently (in over 90% of the tested scenarios, it gave solutions that were either optimal or exceeded the optimum by just one relay) in various randomly generated network scenarios.
Abstract:
This paper presents a decentralized/peer-to-peer architecture-based parallel version of the vector evaluated particle swarm optimization (VEPSO) algorithm for multi-objective design optimization of laminated composite plates using message passing interface (MPI). The design optimization of laminated composite plates, being a combinatorially explosive constrained non-linear optimization problem (CNOP) with many design variables and a vast solution space, warrants the use of non-parametric and heuristic optimization algorithms like PSO. Optimization requires minimizing both the weight and cost of these composite plates simultaneously, which renders the problem multi-objective. Hence VEPSO, a multi-objective variant of the PSO algorithm, is used. Despite the use of such a heuristic, the application problem, being computationally intensive, suffers from long execution times due to sequential computation. Hence, a parallel version of the PSO algorithm for the problem has been developed to run on several nodes of an IBM P720 cluster. The proposed parallel algorithm, using MPI's collective communication directives, establishes a peer-to-peer relationship between the constituent parallel processes, deviating from the more common master-slave approach, and achieves a reduction of computation time by a factor of up to 10. Finally, we show the effectiveness of the proposed parallel algorithm by comparing it with a serial implementation of VEPSO and a parallel implementation of the vector evaluated genetic algorithm (VEGA) for the same design problem. (c) 2012 Elsevier Ltd. All rights reserved.
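The sketch below (not the paper's implementation) illustrates the peer-to-peer exchange pattern described in this abstract using mpi4py's collective allgather, so every rank receives every other rank's best solution without a master process; the swarm update is a stand-in and all parameter values are assumptions.

```python
# Hedged sketch: peer-to-peer exchange of per-swarm bests via MPI collectives,
# in the spirit of a parallel VEPSO where each rank evolves one swarm.
# Run with, e.g.: mpirun -np 2 python vepso_p2p_sketch.py
import random
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def local_swarm_step(best):
    """Stand-in for one PSO iteration of this rank's swarm on its objective."""
    return [x + random.uniform(-0.1, 0.1) for x in best]

best = [random.uniform(-1.0, 1.0) for _ in range(4)]   # assumed 4 design variables

for it in range(10):
    best = local_swarm_step(best)
    # Collective, peer-to-peer exchange: no master gathers and redistributes.
    all_bests = comm.allgather(best)
    # Stand-in for VEPSO's cross-swarm social term: drift toward the best
    # found by the neighbouring rank's swarm (a different objective).
    neighbour_best = all_bests[(rank + 1) % size]
    best = [0.5 * (b + n) for b, n in zip(best, neighbour_best)]

if rank == 0:
    print(f"finished {it + 1} iterations across {size} peer swarms")
```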
Abstract:
We report a novel resistor grid network based space cloth for application in single- and multi-layer radar absorbers. The space cloth is analyzed and relations are derived for the sheet resistance in terms of the resistors in the grid network. Design curves are drawn using MATLAB, and the space cloth is analyzed using HFSS™ software in a Salisbury screen for the S, C and X bands. Next, prediction and simulation results are reported for a three-layer Jaumann absorber using a square grid resistor network, with a Radar Cross Section Reduction (RCSR) of -15 dB over the C, X and Ku bands. The simulation results are encouraging and have led to the fabrication of a prototype broadband radar absorber; experimental work is in progress.
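For context, the Salisbury screen configuration in which the space cloth is evaluated can be modelled with a standard transmission-line calculation; the sketch below is a generic textbook model with assumed sheet resistance and spacer thickness, not the paper's derived relations or its HFSS analysis.

```python
import numpy as np

# Hedged sketch: normal-incidence reflection of a Salisbury screen, i.e. a
# resistive sheet Rs placed a distance d in front of a ground plane, using
# the transmission-line model. All values below are assumptions.
ETA0 = 376.73        # free-space wave impedance (ohm)
C0 = 3.0e8           # speed of light (m/s)
RS = 377.0           # assumed sheet resistance of the space cloth (ohm/sq)
D = 7.5e-3           # assumed air-spacer thickness (m), quarter wave near 10 GHz

def reflection_db(f_hz):
    beta = 2.0 * np.pi * f_hz / C0
    z_short = 1j * ETA0 * np.tan(beta * D)    # shorted air-filled spacer line
    z_in = (RS * z_short) / (RS + z_short)    # resistive sheet in parallel
    gamma = (z_in - ETA0) / (z_in + ETA0)
    return 20.0 * np.log10(np.abs(gamma))

for f in (8e9, 10e9, 12e9):
    print(f"{f / 1e9:4.0f} GHz: {reflection_db(f):7.1f} dB")
```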
Abstract:
We have developed SmartConnect, a tool that addresses the growing need for the design and deployment of multihop wireless relay networks for connecting sensors to a control center. Given the locations of the sensors, the traffic that each sensor generates, the quality of service (QoS) requirements, and the potential locations at which relays can be placed, SmartConnect helps design and deploy a low-cost wireless multihop relay network. SmartConnect adopts a field-interactive, iterative approach, with model-based network design, field evaluation, and relay augmentation performed iteratively until the desired QoS is met. The design process is based on approximate combinatorial optimization algorithms. In the paper, we present the design choices made in SmartConnect and describe the experimental work that led to these choices. Finally, we provide results from some experimental deployments.
Abstract:
In recent times, crowdsourcing over social networks has emerged as an active tool for complex task execution. In this paper, we address the problem faced by a planner to incentivize agents in the network to execute a task and also help in recruiting other agents for this purpose. We study this mechanism design problem under two natural resource optimization settings: (1) cost critical tasks, where the planner's goal is to minimize the total cost, and (2) time critical tasks, where the goal is to minimize the total time elapsed before the task is executed. We define a set of fairness properties that should ideally be satisfied by a crowdsourcing mechanism. We prove that no mechanism can satisfy all these properties simultaneously. We relax some of these properties and define their approximate counterparts. Under appropriate approximate fairness criteria, we obtain a non-trivial family of payment mechanisms. Moreover, we provide precise characterizations of cost critical and time critical mechanisms.
Abstract:
Conceptual design involves identification of the required functions of the intended design, generation of concepts to fulfill these functions, and evaluation of these concepts to select the most promising ones for further development. The focus of this paper is the second phase, concept generation, in which a challenge has been to develop possible physical embodiments to offer designers for exploration and evaluation. This paper investigates the issue of how to transform and thus synthesise possible generic physical embodiments and reports an implemented method that could automatically generate these embodiments. In this paper, a method is proposed to transform a variety of possible initial solutions to a design problem into a set of physical solutions that are described in terms of abstraction of mechanical movements. The underlying principle of this method is to make it possible to link common attributes between a specific abstract representation and its possible physical objects. For a given input, this method can produce a set of concepts in terms of their generic physical embodiments. The method can be used to support designers to start with a given input-output function and systematically search for physical objects for design consideration in terms of simplified functional, spatial, and mechanical movement requirements.
Abstract:
Internal analogies are created if the knowledge of the source domain is obtained only from the cognition of the designers. In this paper, an understanding of the use of internal analogies in conceptual design is developed by studying: the types of internal analogies; the roles of internal analogies; the influence of design problems on the creation of internal analogies; the role of the experience of designers on the use of internal analogies; the levels of abstraction at which internal analogies are searched in the target domain, identified in the source domain, and realized in the target domain; and the effect of internal analogies from the natural and artificial domains on the solution space created using these analogies. To facilitate this understanding, empirical studies of design sessions from earlier research, each involving a designer solving a design problem by identifying requirements and developing conceptual solutions without using any support, are used. The following are the important findings: designers use analogies from the natural and artificial domains; analogies are used for generating requirements and solutions; the nature of the design problem influences the use of analogies; the role of the experience of designers on the use of analogies is not clearly ascertained; analogical transfer is observed only at a few levels of abstraction while many levels remain unexplored; and analogies from the natural domain seem to have a more positive influence than the artificial domain on the number of ideas and the variety of the idea space.
Abstract:
In this paper we empirically investigate which structural characteristics can help to predict the complexity of NK-landscape instances for estimation of distribution algorithms. To this end, we evolve instances that maximize the estimation of distribution algorithm's complexity in terms of its success rate. Similarly, instances that minimize the algorithm's complexity are evolved. We then identify network measures, computed from the structures of the NK-landscape instances, that have a statistically significant difference between the sets of easy and hard instances. The features identified are consistently significant for different values of N and K.
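For readers unfamiliar with the model, a basic NK-landscape can be built as sketched below; this generic construction (random neighbourhoods and lazily drawn lookup tables) is only an illustration of the fitness model, not the instance-evolution procedure used in the paper.

```python
import random

# Hedged sketch: an NK-landscape fitness function over N-bit strings, where
# each bit's contribution depends on itself and K randomly chosen other bits
# through a random lookup table (entries drawn lazily on first use).
def make_nk_landscape(N, K, seed=0):
    rng = random.Random(seed)
    neighbours = [rng.sample([j for j in range(N) if j != i], K) for i in range(N)]
    tables = [{} for _ in range(N)]

    def fitness(bits):
        total = 0.0
        for i in range(N):
            key = (bits[i],) + tuple(bits[j] for j in neighbours[i])
            if key not in tables[i]:
                tables[i][key] = rng.random()
            total += tables[i][key]
        return total / N

    return fitness

f = make_nk_landscape(N=10, K=3, seed=42)
genome = [random.randint(0, 1) for _ in range(10)]
print(f"fitness of a random genome: {f(genome):.3f}")
```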
Abstract:
Planning in design processes is modeled in terms of connectivities between product developments. Each product development comprises a network of processes. Similarity between processes is analysed by a layered classification ranging from common components to shared design knowledge. The connectivities between products arising from similarities among products are represented by a multidimensional network. Design planning is described by flows or 'traffic' on this network which represents a structural model of complexity. Comparison is made with information based measures of the complexity of designs and processes.