18 results for Auction

at Indian Institute of Science - Bangalore - India


Relevance:

20.00%

Publisher:

Abstract:

Business processes and application functionality are becoming available as internal web services inside enterprise boundaries, as well as commercial web services from enterprise solution vendors and web services marketplaces. Typically, there are multiple web service providers offering services capable of fulfilling a particular functionality, although with different Quality of Service (QoS). Dynamic creation of business processes requires composing an appropriate set of web services that best suit the current need. This paper presents a novel combinatorial auction approach to QoS-aware dynamic web services composition. Such an approach enables not only stand-alone web services but also composite web services to be part of a business process. The combinatorial auction leads to an integer programming formulation for the web services composition problem. An important feature of the model is the incorporation of service level agreements. We describe QWESC, a software tool for QoS-aware web services composition based on the proposed approach.
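
The abstract does not reproduce the integer program itself. As an illustration only, the sketch below sets up a tiny combinatorial-auction winner determination for service composition, assuming each provider bids a price and an aggregate QoS score for a bundle of tasks; the bid structure, the QoS constraint, and the data are hypothetical, and the paper's QWESC tool additionally models service level agreements.

    from itertools import combinations

    # A bundle bid: a provider offers a set of tasks at a price, with an aggregate QoS score.
    # These structures are illustrative; the paper's actual model also encodes SLA terms.
    bids = [
        {"provider": "A", "tasks": {"pay", "ship"}, "price": 12.0, "qos": 0.95},
        {"provider": "B", "tasks": {"pay"},         "price": 5.0,  "qos": 0.90},
        {"provider": "C", "tasks": {"ship"},        "price": 6.0,  "qos": 0.99},
        {"provider": "D", "tasks": {"invoice"},     "price": 4.0,  "qos": 0.92},
    ]
    required = {"pay", "ship", "invoice"}

    def compose(bids, required, min_qos=0.9):
        """Exhaustive winner determination: pick a set of bundle bids that covers
        every required task exactly once, meets the QoS floor, and minimizes price."""
        best = None
        for r in range(1, len(bids) + 1):
            for combo in combinations(bids, r):
                covered = [t for b in combo for t in b["tasks"]]
                if sorted(covered) != sorted(required):        # exact cover of the workflow
                    continue
                if min(b["qos"] for b in combo) < min_qos:     # simple QoS constraint
                    continue
                price = sum(b["price"] for b in combo)
                if best is None or price < best[0]:
                    best = (price, [b["provider"] for b in combo])
        return best

    print(compose(bids, required))   # (15.0, ['B', 'C', 'D'])

In practice the same model is written as a 0-1 integer program and handed to a solver; the exhaustive search above only makes the objective and constraints explicit.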

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we design a novel auction which we call the OPT (optimal) auction. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality for the advertisers. We show that the OPT mechanism is superior to two of the most commonly used mechanisms for sponsored search, namely (1) GSP (Generalized Second Price) and (2) VCG (Vickrey-Clarke-Groves). We then show an important revenue equivalence result: the expected revenue earned by the search engine is the same under all three mechanisms, provided the advertisers are symmetric and the number of sponsored slots is strictly less than the number of advertisers.
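
The abstract does not spell out the OPT allocation rule. The sketch below shows the standard Myerson-style construction that revenue-optimal, Bayesian incentive compatible auctions are typically built on: rank bidders by virtual value psi(v) = v - (1 - F(v)) / f(v) and never serve a bidder with negative virtual value. Treating this as the core of OPT, and extending it slot-wise, is an assumption here, as is the uniform-distribution example.

    def virtual_value(v, F, f):
        """Myerson virtual valuation for a bidder with reported value v,
        value distribution CDF F and density f."""
        return v - (1.0 - F(v)) / f(v)

    def opt_style_allocation(bids, slots, F, f):
        """Assign slots to the bidders with the highest non-negative virtual values.
        Illustrative only: the paper's OPT mechanism also specifies payments that
        make the rule Bayesian incentive compatible and individually rational."""
        scored = sorted(((virtual_value(b, F, f), i) for i, b in enumerate(bids)),
                        reverse=True)
        return [i for psi, i in scored[:slots] if psi >= 0]

    # Example: values drawn from Uniform[0, 1], so F(v) = v, f(v) = 1 and psi(v) = 2v - 1.
    bids = [0.9, 0.7, 0.4, 0.3]
    print(opt_style_allocation(bids, slots=2, F=lambda v: v, f=lambda v: 1.0))  # [0, 1]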

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we develop a novel auction algorithm for the procurement of wireless channels by a wireless node in a heterogeneous wireless network. We assume that the service providers of the heterogeneous wireless network are selfish and non-cooperative, in the sense that they are only interested in maximizing their own utilities. The wireless user needs to procure wireless channels to execute multiple tasks. To solve the wireless user's problem, we propose a reverse optimal (REVOPT) auction and derive an expression for the expected payment by the wireless user. The proposed REVOPT auction satisfies important game-theoretic properties such as Bayesian incentive compatibility and individual rationality.
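
The expected-payment expression for REVOPT is not given in the abstract. Purely as a point of reference for the reverse (procurement) setting, the sketch below implements a plain reverse Vickrey rule, where the user buys the channel from the lowest-cost provider and pays the second-lowest ask; REVOPT itself is a revenue-optimal Bayesian mechanism, not this rule.

    def reverse_vickrey(asks):
        """Reverse second-price auction: the provider with the lowest ask wins and
        is paid the second-lowest ask. The payment does not depend on the winner's
        own ask, which is what makes truthful bidding a dominant strategy here."""
        order = sorted(range(len(asks)), key=lambda i: asks[i])
        winner, runner_up = order[0], order[1]
        return winner, asks[runner_up]

    asks = [3.2, 2.5, 4.1]          # asks from three selfish service providers
    print(reverse_vickrey(asks))    # (1, 3.2): provider 1 wins and is paid 3.2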

Relevance:

20.00%

Publisher:

Abstract:

Combinatorial exchanges are double-sided marketplaces with multiple sellers and multiple buyers trading with the help of combinatorial bids. The allocation and other associated problems in such exchanges are known to be among the hardest to solve of all economic mechanisms. In this paper, we develop computationally efficient iterative auction mechanisms for clearing combinatorial exchanges. Our mechanisms satisfy the individual rationality (IR) and budget nonnegativity (BN) properties. We also show that our method is bounded and convergent. Our numerical experiments show that the algorithm produces good-quality solutions and is computationally efficient.
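
The iterative price-adjustment rules are not described in the abstract. The toy clearing routine below only illustrates the two properties mentioned: it matches bundle bids and asks by brute force and uses a pay-your-bid / receive-your-ask rule, under which individual rationality and budget nonnegativity hold by construction. The data and the payment rule are illustrative; the paper's mechanisms are iterative and computationally far more economical.

    from itertools import product

    # Buy bids and sell asks over bundles of items (illustrative data).
    buy_bids  = [({"a", "b"}, 10.0), ({"a"}, 6.0)]
    sell_asks = [({"a"}, 3.0), ({"b"}, 4.0)]

    def clear(buy_bids, sell_asks):
        """Pick the subset of bids and asks that maximizes trade surplus,
        subject to every sold item being bought (brute force over all subsets)."""
        best = (0.0, (), ())
        for bmask, smask in product(range(2 ** len(buy_bids)), range(2 ** len(sell_asks))):
            bs = [buy_bids[i] for i in range(len(buy_bids)) if bmask >> i & 1]
            ss = [sell_asks[j] for j in range(len(sell_asks)) if smask >> j & 1]
            bought = [x for b, _ in bs for x in b]
            sold   = [x for s, _ in ss for x in s]
            if sorted(bought) != sorted(sold):          # supply must match demand exactly
                continue
            surplus = sum(p for _, p in bs) - sum(p for _, p in ss)
            if surplus > best[0]:
                best = (surplus, bs, ss)
        return best

    surplus, winning_bids, winning_asks = clear(buy_bids, sell_asks)
    # Pay-your-bid / receive-your-ask payments: buyers never pay more than they bid and
    # sellers never receive less than they ask (IR); the exchange keeps the surplus,
    # which is non-negative by construction (budget nonnegativity).
    print(surplus, winning_bids, winning_asks)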

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we describe two well-known mechanisms for sponsored search auctions: Generalized Second Price (GSP) and Vickrey-Clarke-Groves (VCG). We then derive a new mechanism for the sponsored search auction which we call the optimal (OPT) mechanism. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality for the advertisers. We then undertake a detailed comparative study of the mechanisms GSP, VCG, and OPT. We compute and compare the expected revenue earned by the search engine under the three mechanisms when the advertisers are symmetric and some special conditions are satisfied. We also compare the three mechanisms in terms of incentive compatibility, individual rationality, and computational complexity.

Note to Practitioners: The advertiser-supported web site is one of the successful business models in the emerging web landscape. When an Internet user enters a keyword (i.e., a search phrase) into a search engine, the user gets back a page of results containing the links most relevant to the query as well as sponsored links (also called paid advertisement links). When a sponsored link is clicked, the user is directed to the corresponding advertiser's web page. The advertiser pays the search engine in some appropriate manner for sending the user to its web page. Against every search performed by any user on any keyword, the search engine faces the problem of matching a set of advertisers to the sponsored slots. In addition, the search engine also needs to decide on the price to be charged to each advertiser. Due to increasing demand for Internet advertising space, most search engines currently use auction mechanisms for this purpose. These are called sponsored search auctions. A significant percentage of the revenue of Internet giants such as Google, Yahoo!, and MSN comes from sponsored search auctions. In this paper, we study two auction mechanisms, GSP and VCG, which are quite popular in the sponsored search context, and pursue the objective of designing a mechanism superior to both. In particular, we propose a new mechanism which we call the OPT mechanism. This mechanism maximizes the search engine's expected revenue subject to achieving Bayesian incentive compatibility and individual rationality. Bayesian incentive compatibility guarantees that it is optimal for each advertiser to bid his/her true value provided that all other agents also bid their respective true values. Individual rationality ensures that the agents participate voluntarily in the auction since they are assured of a non-negative payoff by doing so.
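
The GSP and VCG payment rules for position auctions are standard and can be made concrete. The sketch below assumes slots with known click-through rates alpha_1 >= alpha_2 and computes total expected payments per impression under both rules for a fixed bid profile; OPT's payment rule depends on the value distributions and is not shown.

    def gsp_payments(bids, alphas):
        """GSP: the bidder in slot i pays the (i+1)-th highest bid per click,
        so its total expected payment is alpha_i * b_{i+1}."""
        bids = sorted(bids, reverse=True)
        m = len(alphas)
        return [alphas[i] * (bids[i + 1] if i + 1 < len(bids) else 0.0) for i in range(m)]

    def vcg_payments(bids, alphas):
        """VCG for position auctions via the standard recursion:
        p_m = alpha_m * b_{m+1},  p_i = p_{i+1} + (alpha_i - alpha_{i+1}) * b_{i+1}."""
        bids = sorted(bids, reverse=True)
        m = len(alphas)
        pay = [0.0] * m
        pay[m - 1] = alphas[m - 1] * (bids[m] if m < len(bids) else 0.0)
        for i in range(m - 2, -1, -1):
            pay[i] = pay[i + 1] + (alphas[i] - alphas[i + 1]) * bids[i + 1]
        return pay

    bids = [5.0, 4.0, 2.0]             # three advertisers, bids per click
    alphas = [0.3, 0.1]                # two slots, click-through rates
    print(gsp_payments(bids, alphas))  # [1.2, 0.2]
    print(vcg_payments(bids, alphas))  # [1.0, 0.2]

For this profile GSP collects 1.4 and VCG 1.2 per impression, which matches the usual observation that GSP extracts at least as much as VCG from the same bids; the revenue equivalence discussed in the paper concerns expected revenue at equilibrium under the stated symmetry conditions, not a fixed bid profile.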

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we exploit the idea of decomposition to match buyers and sellers in an electronic exchange for trading large volumes of homogeneous goods, where the buyers and sellers specify marginal-decreasing piecewise-constant price curves to capture volume discounts. Such exchanges are relevant for automated trading in many e-business applications. The problem of determining winners and Vickrey prices in such exchanges is known to have a worst-case complexity equal to that of as many as (1 + m + n) NP-hard problems, where m is the number of buyers and n is the number of sellers. Our method decomposes the overall exchange problem into two separate and simpler problems: 1) a forward auction and 2) a reverse auction, which turn out to be generalized knapsack problems. In the proposed approach, we first determine the quantity of units to be traded between the sellers and the buyers using fast heuristics developed by us. Next, we solve the forward auction and the reverse auction using fully polynomial time approximation schemes available in the literature. The proposed approach has worst-case polynomial time complexity, and our experimentation shows that it produces good-quality solutions to the problem.

Note to Practitioners: In recent times, electronic marketplaces have provided an efficient way for businesses and consumers to trade goods and services. The use of innovative mechanisms and algorithms has made it possible to improve the efficiency of electronic marketplaces by enabling optimization of revenues for the marketplace and of utilities for the buyers and sellers. In this paper, we look at single-item, multi-unit electronic exchanges. These are electronic marketplaces where buyers submit bids and sellers submit asks for multiple units of a single item. We allow buyers and sellers to specify volume discounts using suitable functions. Such exchanges are relevant for high-volume business-to-business trading of standard products, such as silicon wafers, very large-scale integrated chips, desktops, telecommunications equipment, commoditized goods, etc. The problem of determining winners and prices in such exchanges is known to involve solving many NP-hard problems. Our paper exploits the familiar idea of decomposition, uses certain algorithms from the literature, and develops two fast heuristics to solve the problem near-optimally in worst-case polynomial time.
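
The fast heuristics for fixing the traded quantity are not given in the abstract. One common way to do that step, sketched below on made-up curves, is to expand each marginal-decreasing piecewise-constant curve into unit steps, sort buy steps by decreasing price and sell steps by increasing cost, and trade until the curves cross; the forward and reverse knapsack auctions of the paper would then allocate that quantity on each side. This is an illustration of the decomposition, not the authors' heuristic.

    def unit_steps(curve):
        """Expand a piecewise-constant marginal price curve, given as
        (quantity, marginal_price) segments, into one price per unit."""
        return [price for qty, price in curve for _ in range(qty)]

    def traded_quantity(buy_curves, sell_curves):
        """Greedy clearing: match the highest-value buy units with the
        cheapest sell units for as long as value exceeds cost."""
        buys = sorted((p for c in buy_curves for p in unit_steps(c)), reverse=True)
        sells = sorted(p for c in sell_curves for p in unit_steps(c))
        q = 0
        while q < min(len(buys), len(sells)) and buys[q] >= sells[q]:
            q += 1
        return q

    # One buyer willing to pay 9 per unit for 2 units, then 6 for 3 more;
    # one seller supplying 3 units at cost 5, then 2 more at cost 8.
    buyers  = [[(2, 9.0), (3, 6.0)]]
    sellers = [[(3, 5.0), (2, 8.0)]]
    print(traded_quantity(buyers, sellers))   # 3 units trade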

Relevance:

10.00%

Publisher:

Abstract:

The decision to patent a technology is a difficult one for the top management of any organization. The value that the patent is expected to deliver in the market is an important factor in this judgement. Earlier researchers have suggested that patent prices are better indicators of the value of a patent and that auction prices are the best way of determining value. However, the lack of public data on pricing has prevented research on the dynamics of patent pricing. Our paper uses singleton patent auction price data from Ocean Tomo LLC to study the prices of patents. We describe the price characteristics of these patents. The price of these patents was correlated with their age, and a significant correlation was found. A price-age matrix was developed, and we describe the price characteristics of patents using the four quadrants of the matrix, namely young and old patents with low and high prices. We also found that patents owned by small firms are transacted more often and that inventor-owned patents attracted better prices than assignee-owned patents.
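
The statistical treatment can be outlined in a few lines. The snippet below, on made-up numbers, computes the price-age correlation and assigns each patent to one of the four quadrants of a price-age matrix split at the medians; the study itself uses Ocean Tomo lot-level data, which is not reproduced here.

    from statistics import median

    # Hypothetical (age in years, realized auction price) pairs.
    patents = [(3, 80_000), (5, 120_000), (9, 30_000), (12, 25_000), (15, 200_000), (2, 60_000)]

    def pearson(xs, ys):
        """Sample Pearson correlation coefficient."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    ages, prices = zip(*patents)
    print("price-age correlation:", round(pearson(ages, prices), 3))

    # Four quadrants of the price-age matrix, split at the medians.
    a_med, p_med = median(ages), median(prices)
    quadrant = {
        (a, p): ("old" if a > a_med else "young") + "/" + ("high" if p > p_med else "low")
        for a, p in patents
    }
    print(quadrant)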

Relevance:

10.00%

Publisher:

Abstract:

Reduction of carbon emissions is of paramount importance in the context of global warming. Countries and global companies are now engaged in understanding systematic ways of achieving well-defined emission targets. In fact, carbon credits have become significant and strategic instruments of finance for countries and global companies. In this paper, we formulate and solve the carbon allocation problem, which involves determining a cost-minimizing allocation of carbon credits among different emitting agents. We address this challenge in the context of a global company faced with the task of allocating carbon credit caps among its divisions in a cost-effective way. The problem is formulated as a reverse auction in which the company plays the role of buyer or carbon planning authority and the divisions within the company are the emitting agents, each specifying a cost curve for carbon credit reductions. Two natural variants of the problem are considered: (a) with an unlimited budget and (b) with a limited budget. Suitable assumptions are made on the cost curves, and in each of the two cases we show that the resulting formulation is a knapsack problem that can be solved optimally using a greedy heuristic. The solution of the allocation problem provides critical decision support to global companies seriously engaged in green programs.
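
Under the paper's assumptions the allocation reduces to a knapsack solved greedily. The sketch below illustrates the unlimited-budget variant under one such assumption, namely that each division reports a step cost curve with non-decreasing marginal cost per tonne: the planner simply buys the cheapest reduction steps until the company-wide target is met. The limited-budget variant would stop when the budget is exhausted instead. Division names and curves are hypothetical.

    def allocate(curves, target):
        """curves: {division: [(tonnes, cost_per_tonne), ...]} with non-decreasing
        marginal costs. Greedily buy the cheapest reduction steps until `target`
        tonnes of reduction are allocated; returns (allocation, total cost)."""
        steps = [(cost, tonnes, div)
                 for div, segs in curves.items() for tonnes, cost in segs]
        steps.sort()                                   # cheapest marginal reductions first
        alloc, total_cost, remaining = {}, 0.0, target
        for cost, tonnes, div in steps:
            if remaining <= 0:
                break
            take = min(tonnes, remaining)
            alloc[div] = alloc.get(div, 0) + take
            total_cost += take * cost
            remaining -= take
        return alloc, total_cost

    curves = {
        "steel":     [(100, 12.0), (100, 30.0)],
        "logistics": [(50, 8.0), (80, 20.0)],
    }
    print(allocate(curves, target=180))   # ({'logistics': 80, 'steel': 100}, 2200.0)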

Relevance:

10.00%

Publisher:

Abstract:

In the pay-per-click sponsored search auctions currently used extensively by search engines, the auction for a keyword involves a certain number of advertisers (say k) competing for the available slots (say m) to display their ads. This auction is typically conducted over a number of rounds (say T). There are click probabilities mu_ij associated with agent-slot pairs. The search engine's goal is to maximize social welfare, that is, the sum of the values of the advertisers. The search engine knows neither the true value an advertiser places on a click on her ad nor the click probabilities mu_ij. A key problem for the search engine is therefore to learn these during the T rounds of the auction while ensuring that the auction mechanism is truthful. Mechanisms addressing such learning and incentive issues have recently been introduced and are referred to as multi-armed bandit (MAB) mechanisms. When m = 1, characterizations of truthful MAB mechanisms are available in the literature, and it has been shown that the regret for such mechanisms is O(T^{2/3}). In this paper, we seek to derive a characterization in the realistic but non-trivial general case where m > 1, and we obtain several interesting results.
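
For the single-slot case cited in the abstract, a standard way to obtain a truthful MAB mechanism with O(T^{2/3}) regret is to separate exploration from exploitation: estimate the click probabilities in a bid-independent exploration phase, then repeatedly show the advertiser maximizing bid times estimated CTR. The sketch below follows that template; it is not the exact class of mechanisms characterized in the paper, and the parameters are illustrative.

    import random

    def explore_then_exploit(bids, true_ctrs, T, seed=0):
        """Single-slot MAB mechanism sketch: explore advertisers round-robin for
        ~T^(2/3) rounds (allocation independent of bids, hence no incentive to
        misreport), then always show the advertiser maximizing bid * estimated CTR."""
        rng = random.Random(seed)
        k = len(bids)
        n_explore = int(T ** (2 / 3))
        clicks, shown = [0] * k, [0] * k
        for t in range(n_explore):                 # exploration: ignore bids entirely
            i = t % k
            shown[i] += 1
            clicks[i] += rng.random() < true_ctrs[i]
        est = [clicks[i] / shown[i] if shown[i] else 0.0 for i in range(k)]
        winner = max(range(k), key=lambda i: bids[i] * est[i])
        return winner, est

    print(explore_then_exploit(bids=[4.0, 3.0, 5.0], true_ctrs=[0.10, 0.20, 0.05], T=10_000))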

Relevance:

10.00%

Publisher:

Abstract:

Web services are now a key ingredient of the software services offered by software enterprises. Many standardized web services are available as commodity offerings from web service providers. An important problem for a web service requester is the web service composition problem, which involves selecting the right mix of web service offerings to execute an end-to-end business process. Web service offerings are now available in bundled form as composite web services and, more recently, volume discounts are also on offer, based on the number of web service executions requested. In this paper, we develop efficient algorithms for the web service composition problem in the presence of composite web service offerings and volume discounts. We model this problem as a combinatorial auction with volume discounts. We first develop efficient polynomial-time algorithms when the end-to-end service involves a linear workflow of web services. Next, we develop efficient polynomial-time algorithms when the end-to-end service involves a tree workflow of web services.
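
For a linear workflow, one natural polynomial-time formulation, assumed here since the abstract does not state the algorithm, is a shortest-path style dynamic program in which every offer, stand-alone or composite, covers a contiguous block of tasks at a quoted price that may already reflect a volume discount. The DP below computes the cheapest cover of tasks 1..n; the offer data is hypothetical.

    def cheapest_composition(n, offers):
        """offers: list of (start, end, price) covering tasks start..end inclusive
        (1-indexed) in a linear workflow. dp[i] = cheapest cost to cover tasks 1..i."""
        INF = float("inf")
        dp = [0.0] + [INF] * n
        for i in range(1, n + 1):
            for start, end, price in offers:
                if end == i and dp[start - 1] + price < dp[i]:
                    dp[i] = dp[start - 1] + price
        return dp[n]

    offers = [
        (1, 1, 4.0),            # stand-alone service for task 1
        (2, 2, 5.0),
        (3, 3, 6.0),
        (1, 2, 7.0),            # composite offering bundling tasks 1-2
        (2, 3, 9.0),
    ]
    print(cheapest_composition(3, offers))   # 13.0 (e.g. the bundle for tasks 1-2 plus task 3)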

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we address a key problem faced by advertisers in sponsored search auctions on the web: how much to bid, given the bids of the other advertisers, so as to maximize individual payoffs? Assuming the generalized second price auction as the auction mechanism, we formulate this problem in the framework of an infinite-horizon alternating-move game of advertiser bidding behavior. For a sponsored search auction involving two advertisers, we characterize all the pure strategy and mixed strategy Nash equilibria. We also prove that the bid prices converge to a Nash equilibrium if the advertisers follow a myopic best response bidding strategy. Following this, we investigate the bidding behavior of the advertisers when they use Q-learning. We observe empirically an interesting trend: the Q-values converge even when both advertisers learn simultaneously.
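
The learning experiment can be sketched compactly. In the toy version below, two advertisers with private per-click values repeatedly bid for a single slot under a second-price rule, and each runs stateless (bandit-style) Q-learning over a discrete bid grid; whether the learned values settle down when both sides learn at once is precisely the empirical question the paper examines. The payoff model, grid, and parameters are illustrative, and the paper's setting is an alternating-move game rather than this simultaneous one.

    import random

    def learn_bids(values=(6.0, 4.0), bid_grid=(0, 1, 2, 3, 4, 5, 6),
                   rounds=20_000, lr=0.1, eps=0.1, seed=1):
        """Two advertisers repeatedly bid for one slot; the higher bid wins and pays
        the rival's bid (second price). Each advertiser runs stateless Q-learning
        over the bid grid, updating only the Q-value of the bid it just played."""
        rng = random.Random(seed)
        Q = [{b: 0.0 for b in bid_grid} for _ in values]
        for _ in range(rounds):
            bids = [rng.choice(bid_grid) if rng.random() < eps    # explore
                    else max(Q[i], key=Q[i].get)                  # exploit
                    for i in range(2)]
            winner = 0 if bids[0] > bids[1] else 1 if bids[1] > bids[0] else rng.randint(0, 1)
            rewards = [0.0, 0.0]
            rewards[winner] = values[winner] - bids[1 - winner]   # pay the rival's bid
            for i in range(2):
                Q[i][bids[i]] += lr * (rewards[i] - Q[i][bids[i]])
        return [max(q, key=q.get) for q in Q]     # each advertiser's learned bid

    print(learn_bids())   # learned bids after 20,000 rounds (depends on seed and parameters)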

Relevance:

10.00%

Publisher:

Abstract:

This paper presents an intelligent procurement marketplace for finding the best mix of web services to dynamically compose the business process desired by a web service requester. We develop a combinatorial auction approach that leads to an integer programming formulation for the web services composition problem. The model takes into account Quality of Service (QoS) and Service Level Agreements (SLA) to differentiate among multiple service providers capable of fulfilling a given functionality. An important feature of the model is interface-aware composition.
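
Interface-aware composition can be illustrated with a minimal compatibility check: a candidate chain of services is feasible only if each service's output interface matches the next service's input interface. The interface names below are hypothetical, and in the paper this requirement presumably enters as constraints of the integer program rather than as a separate filter.

    # Each candidate service advertises an input and an output interface (hypothetical types).
    services = {
        "quote":   {"in": "RFQ",     "out": "Quote"},
        "order":   {"in": "Quote",   "out": "OrderID"},
        "ship":    {"in": "OrderID", "out": "Tracking"},
        "ship_v2": {"in": "Quote",   "out": "Tracking"},   # incompatible after "order"
    }

    def interface_compatible(chain):
        """True iff every adjacent pair in the composed chain has matching interfaces."""
        return all(services[a]["out"] == services[b]["in"]
                   for a, b in zip(chain, chain[1:]))

    print(interface_compatible(["quote", "order", "ship"]))      # True
    print(interface_compatible(["quote", "order", "ship_v2"]))   # False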

Relevance:

10.00%

Publisher:

Abstract:

This paper studies the relationship between patent latent variables and patent price. From the existing literature, seven patent latent variables, namely age, generality, originality, foreign filings, technology field, forward citations, and backward citations, were identified as having an influence on patent value. We used Ocean Tomo's patent auction price data in this study. We transformed the price and the predictor variables (excluding the dummy variables) to their logarithmic values. The OLS estimates revealed that forward citations and foreign filings were positively correlated with price. The two variables jointly explained 14.79% of the variance in patent pricing. We did not find sufficient evidence to draw definite conclusions on the relationship between price and the variables age, technology field, generality, backward citations, and originality. The Heckman two-stage sample selection model was used to test for selection bias.
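
The estimation step is standard and can be sketched with statsmodels. The data frame and column names below are hypothetical stand-ins for the Ocean Tomo variables, and the Heckman two-stage selection correction is omitted here.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical singleton-patent auction data; the study uses Ocean Tomo lot data.
    df = pd.DataFrame({
        "price":             [52_000, 31_000, 120_000, 8_000, 64_000],
        "forward_citations": [12, 4, 30, 1, 9],
        "foreign_filings":   [3, 1, 8, 0, 4],
        "age":               [6, 11, 4, 15, 7],
    })

    # Log-transform price and the (non-dummy) predictors, as in the paper; +1 guards zeros.
    y = np.log(df["price"])
    X = sm.add_constant(np.log(df[["forward_citations", "foreign_filings", "age"]] + 1))

    ols = sm.OLS(y, X).fit()
    print(ols.params)       # signs on forward_citations and foreign_filings expected positive
    print(ols.rsquared)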

Relevance:

10.00%

Publisher:

Abstract:

In the pay-per-click sponsored search auctions currently used extensively by search engines, the auction for a keyword involves a certain number of advertisers (say k) competing for the available slots (say m) to display their advertisements (ads for short). A sponsored search auction for a keyword is typically conducted over a number of rounds (say T). There are click probabilities mu_ij associated with each agent-slot pair (agent i and slot j). The search engine would like to maximize the social welfare of the advertisers, that is, the sum of the values of the advertisers for the keyword. However, the search engine knows neither the true values the advertisers have for a click on their respective advertisements nor the click probabilities. A key problem for the search engine is therefore to learn these click probabilities during the initial rounds of the auction while ensuring that the auction mechanism is truthful. Mechanisms addressing such learning and incentive issues have recently been introduced. These mechanisms, due to their connection to the multi-armed bandit problem, are aptly referred to as multi-armed bandit (MAB) mechanisms. When m = 1, exact characterizations of truthful MAB mechanisms are available in the literature. Recent work has focused on the more realistic but non-trivial general case where m > 1, and a few promising results have started appearing. In this article, we consider this general case with m > 1 and prove several interesting results. Our contributions include: (1) when the mu_ij are unconstrained, we prove that any truthful mechanism must satisfy strong pointwise monotonicity, and we show that the regret for such mechanisms is Theta(T^{2/3}); (2) when the clicks on the ads follow a certain click precedence property, we show that weak pointwise monotonicity is necessary for MAB mechanisms to be truthful; (3) if the search engine has a coarse pre-estimate of the mu_ij values and wishes to update them during the course of the T rounds, we show that weak pointwise monotonicity and type-I separatedness are necessary, while weak pointwise monotonicity and type-II separatedness are sufficient, for the MAB mechanisms to be truthful; (4) if the click probabilities are separable into agent-specific and slot-specific terms, we provide a characterization of the MAB mechanisms that are truthful in expectation.
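
The monotonicity notions have precise definitions in the article that the abstract does not reproduce. As a rough illustration only, the check below tests an informal reading of pointwise monotonicity: fixing the other bids, raising an agent's own bid should never reduce the allocation it receives. The allocation rule and bid grid are toy examples, and the paper's actual conditions are stated over click realizations as well.

    def weakly_monotone(allocate, bid_grid, others, agent=0):
        """Informal monotonicity check for one agent: `allocate(bids)` returns the
        number of clicks/impressions given to each agent; raising the agent's own
        bid, others fixed, must not decrease its allocation."""
        prev = None
        for b in sorted(bid_grid):
            bids = list(others)
            bids.insert(agent, b)
            alloc = allocate(bids)[agent]
            if prev is not None and alloc < prev:
                return False
            prev = alloc
        return True

    def highest_takes_all(bids):
        """Toy allocation rule: the highest bidder receives all 10 impressions."""
        top = max(range(len(bids)), key=lambda j: bids[j])
        return [10 if i == top else 0 for i in range(len(bids))]

    print(weakly_monotone(highest_takes_all, bid_grid=[1, 2, 3, 4], others=[2.5]))  # True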