73 results for Search engine results page
at Indian Institute of Science - Bangalore - India
Abstract:
The search engine log files have been used to gather direct user feedback on the relevance of the documents presented in the results page. Typically, the relative position of the clicks gathered from the log files is used as a proxy for the direct user feedback. In this paper, we identify reasons why the relative position of clicks is incomplete for deciphering user preferences. We therefore propose the use of the time spent by the user in reading through a document as indicative of user preference for that document with respect to a query. We also identify the issues involved in using the time measure and propose means to address them.
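A minimal sketch of the dwell-time signal described above, assuming (my assumption, not stated in the abstract) that the log yields per-session, time-ordered (query, url, timestamp) click tuples and that the gap to the next click approximates reading time:

```python
from collections import defaultdict

def dwell_times(session_clicks):
    """Estimate per-document reading time from one session's clicks.

    session_clicks: time-ordered list of (query, url, timestamp_seconds)
    tuples. The dwell time of a clicked document is approximated by the
    gap until the next click; the last click of a session has no such
    gap and is skipped here (one of the "issues" a real system must
    address, e.g., by capping or imputing it).
    """
    times = defaultdict(list)
    for (q, url, t), (_, _, t_next) in zip(session_clicks, session_clicks[1:]):
        times[(q, url)].append(t_next - t)
    return times
```

Aggregated over many sessions, a longer average dwell time on a document for a query would be read as stronger evidence of preference than click position alone.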
Abstract:
The keyword-based search technique suffers from the problem of synonymic and polysemic queries. Current approaches address only the problem of synonymic queries, in which different queries might have the same information requirement. But the problem of polysemic queries, i.e., the same query having different intentions, still remains unaddressed. In this paper, we propose the notion of intent clusters, the members of which have the same intention. We develop a clustering algorithm that uses the user session information in query logs, in addition to query-URL entries, to identify clusters of queries having the same intention. The proposed approach has been studied through case examples from actual AOL log data, and the clustering algorithm is shown to be successful in discerning user intentions.
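The abstract does not spell out the clustering algorithm; the sketch below is one plausible reading, not the authors' method: queries whose sets of clicked URLs (collected per session from the log) overlap beyond a threshold are merged into one intent cluster. The threshold and the greedy single-link merging are my assumptions:

```python
def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def intent_clusters(query_urls, threshold=0.3):
    """Greedy clustering of queries by clicked-URL overlap.

    query_urls: dict mapping query -> set of URLs clicked for it.
    Queries whose URL sets overlap a cluster's URLs by at least
    `threshold` (Jaccard) are assumed to share that cluster's intent.
    """
    clusters = []  # each entry: (set_of_queries, union_of_their_urls)
    for q, urls in query_urls.items():
        for queries, cluster_urls in clusters:
            if jaccard(urls, cluster_urls) >= threshold:
                queries.add(q)
                cluster_urls |= urls
                break
        else:
            clusters.append(({q}, set(urls)))
    return [queries for queries, _ in clusters]
```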
Abstract:
In this paper, we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we describe two well-known mechanisms for sponsored search auctions: Generalized Second Price (GSP) and Vickrey-Clarke-Groves (VCG). We then derive a new mechanism for sponsored search auctions which we call the optimal (OPT) mechanism. The OPT mechanism maximizes the search engine's expected revenue, while achieving Bayesian incentive compatibility and individual rationality of the advertisers. We then undertake a detailed comparative study of the mechanisms GSP, VCG, and OPT. We compute and compare the expected revenue earned by the search engine under the three mechanisms when the advertisers are symmetric and some special conditions are satisfied. We also compare the three mechanisms in terms of incentive compatibility, individual rationality, and computational complexity.

Note to Practitioners: The advertiser-supported web site is one of the successful business models in the emerging web landscape. When an Internet user enters a keyword (i.e., a search phrase) into a search engine, the user gets back a page with results containing the links most relevant to the query and also sponsored links (also called paid advertisement links). When a sponsored link is clicked, the user is directed to the corresponding advertiser's web page. The advertiser pays the search engine in some appropriate manner for sending the user to its web page. For every search performed by any user on any keyword, the search engine faces the problem of matching a set of advertisers to the sponsored slots. In addition, the search engine also needs to decide on a price to be charged to each advertiser. Due to increasing demand for Internet advertising space, most search engines currently use auction mechanisms for this purpose. These are called sponsored search auctions. A significant percentage of the revenue of Internet giants such as Google, Yahoo!, and MSN comes from sponsored search auctions. In this paper, we study two auction mechanisms, GSP and VCG, which are quite popular in the sponsored search context, and pursue the objective of designing a mechanism that is superior to these two mechanisms. In particular, we propose a new mechanism which we call the OPT mechanism. This mechanism maximizes the search engine's expected revenue subject to achieving Bayesian incentive compatibility and individual rationality. Bayesian incentive compatibility guarantees that it is optimal for each advertiser to bid his/her true value provided that all other agents also bid their respective true values. Individual rationality ensures that the agents participate voluntarily in the auction, since they are assured of gaining a non-negative payoff by doing so.
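For concreteness, here are the textbook GSP and VCG payment rules for a slot auction, assuming bids sorted in descending order and slot click-through rates that do not depend on the advertiser (the standard separable setting; the parameter names are mine):

```python
def gsp_payments(bids, ctrs):
    """Per-click GSP prices: the winner of slot j pays the next-highest bid.

    bids: advertiser bids, sorted descending; ctrs: slot CTRs, sorted descending.
    """
    m = min(len(ctrs), len(bids))
    return [bids[j + 1] if j + 1 < len(bids) else 0.0 for j in range(m)]

def vcg_payments(bids, ctrs):
    """Expected (per-impression) VCG payment of each slot winner:
    the externality imposed on the bidders pushed down one slot."""
    m = min(len(ctrs), len(bids))
    alpha = list(ctrs[:m]) + [0.0]                    # alpha_{m+1} = 0
    b = lambda k: bids[k] if k < len(bids) else 0.0   # first excluded bid, else 0
    return [sum((alpha[k] - alpha[k + 1]) * b(k + 1) for k in range(j, m))
            for j in range(m)]

# Example: bids [10, 8, 5, 2], CTRs [0.3, 0.2]
# GSP per-click prices: [8, 5]
# VCG expected payments: [1.8, 1.0], i.e., per-click [6.0, 5.0];
# under the same bids, GSP here charges weakly more than VCG.
```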
Abstract:
In pay-per-click sponsored search auctions, which are currently extensively used by search engines, the auction for a keyword involves a certain number of advertisers (say k) competing for available slots (say m) to display their advertisements (ads for short). A sponsored search auction for a keyword is typically conducted for a number of rounds (say T). There are click probabilities mu_ij associated with each agent-slot pair (agent i and slot j). The search engine would like to maximize the social welfare of the advertisers, that is, the sum of the values of the advertisers for the keyword. However, the search engine knows neither the true values the advertisers have for a click to their respective advertisements nor the click probabilities. A key problem for the search engine therefore is to learn these click probabilities during the initial rounds of the auction and also to ensure that the auction mechanism is truthful. Mechanisms for addressing such learning and incentive issues have recently been introduced. These mechanisms, due to their connection to the multi-armed bandit problem, are aptly referred to as multi-armed bandit (MAB) mechanisms. When m = 1, exact characterizations for truthful MAB mechanisms are available in the literature. Recent work has focused on the more realistic but non-trivial general case when m > 1, and a few promising results have started appearing. In this article, we consider this general case when m > 1 and prove several interesting results. Our contributions include: (1) when the mu_ij's are unconstrained, we prove that any truthful mechanism must satisfy strong pointwise monotonicity and show that the regret will be Theta(T^{2/3}) for such mechanisms; (2) when the clicks on the ads follow a certain click precedence property, we show that weak pointwise monotonicity is necessary for MAB mechanisms to be truthful; (3) if the search engine has a certain coarse pre-estimate of the mu_ij values and wishes to update them during the course of the T rounds, we show that weak pointwise monotonicity and type-I separatedness are necessary while weak pointwise monotonicity and type-II separatedness are sufficient conditions for the MAB mechanisms to be truthful; (4) if the click probabilities are separable into agent-specific and slot-specific terms, we provide a characterization of MAB mechanisms that are truthful in expectation.
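The flavor of a MAB mechanism is easiest to see in the single-slot (m = 1) case that the article generalizes: spend bid-independent rounds estimating the click probabilities, then allocate by estimated value. The sketch below is only that exploration skeleton, with a simulated click model; it deliberately omits the payment rule, which is exactly what the truthfulness characterizations in the paper constrain:

```python
import random

def explore_then_exploit(bids, true_ctrs, T, explore_frac=0.2):
    """Single-slot sketch: round-robin exploration (independent of bids),
    then exploitation of the ad maximizing bid * estimated CTR.
    `true_ctrs` stands in for the unknown click probabilities and is
    used only to simulate clicks."""
    k = len(bids)
    clicks, shows = [0] * k, [0] * k
    for t in range(int(explore_frac * T)):
        i = t % k                                    # bid-independent choice
        shows[i] += 1
        clicks[i] += random.random() < true_ctrs[i]  # simulated click
    mu_hat = [c / s if s else 0.0 for c, s in zip(clicks, shows)]
    winner = max(range(k), key=lambda i: bids[i] * mu_hat[i])
    return winner, mu_hat  # show `winner` for the remaining rounds
```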
Abstract:
In this paper, we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we design a novel auction which we call the OPT (optimal) auction. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality of the advertisers. We show that the OPT mechanism is superior to two of the most commonly used mechanisms for sponsored search, namely (1) GSP (Generalized Second Price) and (2) VCG (Vickrey-Clarke-Groves). We then show an important revenue equivalence result: the expected revenue earned by the search engine is the same for all three mechanisms, provided the advertisers are symmetric and the number of sponsored slots is strictly less than the number of advertisers.
Abstract:
In pay-per-click sponsored search auctions, which are currently extensively used by search engines, the auction for a keyword involves a certain number of advertisers (say k) competing for available slots (say m) to display their ads. This auction is typically conducted for a number of rounds (say T). There are click probabilities mu_ij associated with agent-slot pairs. The search engine's goal is to maximize social welfare, that is, the sum of the values of the advertisers. The search engine does not know the true value of an advertiser for a click to her ad, and also does not know the click probabilities mu_ij. A key problem for the search engine therefore is to learn these during the T rounds of the auction and also to ensure that the auction mechanism is truthful. Mechanisms for addressing such learning and incentive issues have recently been introduced and are referred to as multi-armed bandit (MAB) mechanisms. When m = 1, characterizations for truthful MAB mechanisms are available in the literature, and it has been shown that the regret for such mechanisms will be O(T^{2/3}). In this paper, we seek to derive a characterization in the realistic but nontrivial general case when m > 1 and obtain several interesting results.
Abstract:
Bid optimization is now becoming quite popular in sponsored search auctions on the web. Given a keyword and the maximum willingness to pay of each advertiser interested in the keyword, the bid optimizer generates a profile of bids for the advertisers with the objective of maximizing customer retention without compromising the revenue of the search engine. In this paper, we present a bid optimization algorithm that is based on a Nash bargaining model where the first player is the search engine and the second player is a virtual agent representing all the bidders. We make the realistic assumption that each bidder specifies a maximum willingness-to-pay value and a discrete, finite set of bid values. We show that the Nash bargaining solution for this problem always lies on a certain edge of the convex hull such that one endpoint of the edge is the vector of maximum willingness to pay of all the bidders. We show that the other endpoint of this edge can be computed as the solution of a linear programming problem. We also show how the solution can be transformed to a bid profile of the advertisers.
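As a toy illustration of the Nash bargaining view (not the paper's convex-hull/LP construction), one can brute-force the bid profile maximizing the Nash product of the two players' utilities on a small instance; the utility definitions and zero disagreement points below are my simplifying assumptions:

```python
from itertools import product

def nash_bargaining_profile(bid_sets, max_pay):
    """Enumerate discrete bid profiles and return one maximizing the
    Nash product (search-engine revenue) * (aggregate bidder surplus).

    bid_sets: per-advertiser finite lists of allowed bid values.
    max_pay:  per-advertiser maximum willingness to pay.
    """
    best, best_profile = -1.0, None
    for profile in product(*bid_sets):
        if any(b > w for b, w in zip(profile, max_pay)):
            continue                    # never bid above willingness to pay
        u_engine = sum(profile)         # revenue proxy for player 1
        u_bidders = sum(w - b for b, w in zip(profile, max_pay))  # retained surplus
        if u_engine * u_bidders > best:
            best, best_profile = u_engine * u_bidders, profile
    return best_profile

# e.g. nash_bargaining_profile([[1, 2, 3], [2, 4]], [3, 4]) -> (1, 2)
# (ties broken by enumeration order)
```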
Abstract:
Users can rarely reveal their information need in full detail to a search engine within 1-2 words, so search engines need to "hedge their bets" and present diverse results within the precious 10 response slots. Diversity in ranking is of much recent interest. Most existing solutions estimate the marginal utility of an item given a set of items already in the response, and then use variants of greedy set cover. Others design graphs with the items as nodes and choose diverse items based on visit rates (PageRank). Here we introduce a radically new and natural formulation of diversity as finding centers in resistive graphs. Unlike in PageRank, we do not specify the edge resistances (equivalently, conductances) and ask for node visit rates. Instead, we look for a sparse set of center nodes so that the effective conductance from the center to the rest of the graph has maximum entropy. We give a cogent semantic justification for turning PageRank thus on its head. In marked deviation from prior work, our edge resistances are learnt from training data. Inference and learning are NP-hard, but we give practical solutions. In extensive experiments with subtopic retrieval, social network search, and document summarization, our approach convincingly surpasses recently published diversity algorithms like subtopic cover, max-marginal relevance (MMR), Grasshopper, DivRank, and SVMdiv.
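The resistive-graph quantity underlying this formulation is effective resistance (the reciprocal of effective conductance), computable from the graph Laplacian's pseudoinverse. The paper's inference and learning are NP-hard and solved approximately; the snippet below only shows the basic quantity, with a hypothetical dense-Laplacian interface:

```python
import numpy as np

def effective_resistance(L, u, v):
    """Effective resistance between nodes u and v of a graph with
    (dense) Laplacian L: r(u, v) = (e_u - e_v)^T L^+ (e_u - e_v).
    Effective conductance is 1 / r(u, v)."""
    L_pinv = np.linalg.pinv(L)
    e = np.zeros(L.shape[0])
    e[u], e[v] = 1.0, -1.0
    return float(e @ L_pinv @ e)
```

Center selection would then favor a sparse node set whose conductances to the rest of the graph are as evenly spread (maximum entropy) as possible.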
Abstract:
Query suggestion is an important feature of search engines, given the explosive and diverse growth of web content. Different kinds of suggestions, such as queries, images, movies, music, and books, are used every day. Various types of data sources are used for the suggestions. If we model the data as graphs of various kinds, we can build a general method for any kind of suggestion. In this paper, we propose a general method for query suggestion by combining two graphs: (1) a query click graph, which captures the relationship between queries frequently clicked on common URLs, and (2) a query text similarity graph, which measures the similarity between two queries using Jaccard similarity. The proposed method provides both literally and semantically relevant queries for users' needs. Simulation results show that the proposed algorithm outperforms the heat diffusion method by providing a larger number of relevant queries. It can be used for recommendation tasks such as query, image, and product suggestion.
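A minimal sketch of combining the two graphs, assuming precomputed, normalized co-click weights; the convex combination and its weight `lam` are my illustrative choices, not necessarily how the paper fuses the graphs:

```python
def term_jaccard(q1, q2):
    """Jaccard similarity of the two queries' term sets."""
    a, b = set(q1.split()), set(q2.split())
    return len(a & b) / len(a | b) if a | b else 0.0

def suggest(query, click_weight, candidates, lam=0.5, k=5):
    """Rank candidate suggestions by
    lam * click-graph weight + (1 - lam) * text similarity.

    click_weight: dict mapping (q1, q2) -> normalized co-click strength
    (the query click graph); candidates: queries eligible as suggestions.
    """
    scores = {
        q: lam * click_weight.get((query, q), 0.0)
           + (1 - lam) * term_jaccard(query, q)
        for q in candidates if q != query
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]
```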
Abstract:
Fragment Finder 2.0 is a web-based interactive computing server which can be used to retrieve structurally similar protein fragments from 25 and 90% nonredundant data sets. The computing server identifies structurally similar fragments using the protein backbone C alpha angles. In addition, the identified fragments can be superimposed using either of the two structural superposition programs, STAMP and PROFIT, provided in the server. The freely available Java plug-in Jmol has been interfaced with the server for the visualization of the query and superposed fragments. The server is the updated version of a previously developed search engine and employs an in-house-developed fast pattern matching algorithm. This server can be accessed freely over the World Wide Web through the URL http://cluster.physics.iisc.ernet.in/ff/.
Abstract:
Al-Si-graphite particle composite alloy pistons containing different percentages of about 80 μm uncoated graphite particles were successfully cast by foundry techniques. Tests with a 5 hp single-cylinder diesel engine show that Al-Si-graphite particle composite pistons can withstand an endurance test of 500 h without any apparent deterioration and do not seize during the running-in period. The use of the Al-Si-3% graphite particle composite piston also results in (a) up to 3% reduction in the specific fuel consumption, (b) considerable reduction in the wear of all four piston rings, (c) a reduction in piston wear, (d) a 9% reduction in the frictional horsepower losses of the engine as determined by the motoring test and (e) a slight increase in the exhaust gas temperature. These reductions (a)–(d) appear to be due to increased lubrication from the graphite particles which are smeared on the bearing surface, the higher damping capacity of the composite pistons and the reduced coefficient of thermal expansion of the composite pistons. Preliminary results indicate that aluminum-graphite particle composite alloy is a promising material for automotive pistons.
Abstract:
A nonlinear control design based on modern system theory is discussed in this paper for the successful operation of an air-breathing engine operating at supersonic speed. The primary objective of the control design of such an air-breathing engine is to ensure that the engine dynamically produces thrust that tracks a commanded value of thrust as closely as possible, by regulating the fuel flow to the combustion system. However, since the engine operates in the supersonic range, an important secondary objective is to manage the shock wave configuration in the intake section of the engine, which is manipulated by varying the throat area of the nozzle. A nonlinear sliding mode control technique has been successfully used to achieve both of the above objectives. In this problem, since the process is faster than the actuators, independent control designs are also carried out for the actuators to assure the satisfactory performance of the system. Moreover, to filter out the sensor and process noise and to estimate the states so that the control design can operate on output feedback, an Extended Kalman Filter based state estimation design is also carried out. The promising simulation results suggest that the proposed control design approach is quite successful in obtaining robust performance of the air-breathing engine.
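To make the sliding-mode idea concrete on something far simpler than the engine model above, here is a toy first-order tracking loop: the plant parameters are deliberately mismatched from the controller's nominal model, and the switching term drives the tracking error to the sliding surface despite that mismatch. The plant, gains, and reference are all invented for illustration:

```python
import numpy as np

def simulate_smc(T=5.0, dt=1e-3, k_s=4.0):
    """Sliding-mode tracking of r(t) = sin(t) for the uncertain plant
    x' = -a*x + b*u. The controller only knows nominal (a_hat, b_hat);
    the switching gain k_s dominates the resulting model error."""
    a, b = 1.2, 0.8            # "true" plant, unknown to the controller
    a_hat, b_hat = 1.0, 1.0    # nominal model used by the controller
    x, log = 0.0, []
    for i in range(int(T / dt)):
        t = i * dt
        r, r_dot = np.sin(t), np.cos(t)        # commanded trajectory
        e = x - r                              # sliding surface s = e
        u = (a_hat * x + r_dot - k_s * np.sign(e)) / b_hat
        x += dt * (-a * x + b * u)             # Euler step of the true plant
        log.append((t, x, r))
    return log
```

The sign() term produces the characteristic chattering of sliding-mode control; practical designs smooth it (e.g., with a boundary layer), and, as the abstract notes, actuator dynamics and state estimation (EKF) must be designed separately.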
Abstract:
The operation of a stand-alone (as opposed to grid-connected) generation system, using a slip-ring induction machine as the electrical generator, is considered. In contrast to an alternator, a slip-ring induction machine can run at variable speed and still deliver constant-frequency power to loads. This feature enables optimization of the system when the prime mover is inherently variable-speed in nature, e.g., wind turbines, as well as in diesel-driven systems, where there is scope for economizing on fuel consumption. Experimental results from a system driven by a 44 bhp diesel engine are presented. Operation at sub-synchronous as well as super-synchronous speeds is examined. The measurements facilitate the understanding of the system as well as its design.
Abstract:
Time-domain finite-wave analysis of engine exhaust systems is usually carried out by means of the method of characteristics. The theory and the computational details of the stationary-frame method have been worked out in the accompanying paper (Part I). In this paper (Part II), typical computed results are given and discussed. A setup designed for experimental corroboration is described. The results obtained from the simulation are found to be in good agreement with experimental observations.
Abstract:
Antibodies to LH/chorionic gonadotrophin receptor (LH/CG-R; molecular weight 67 000), isolated in a homogeneous state (established by SDS-PAGE and ligand blotting) from sheep luteal membrane using human CG (hCG)-Sepharose affinity chromatography, were raised in three adult male rabbits (R-I, R-II and R-III). Each of the rabbits received 20-30 μg of the purified receptor in Freund's complete adjuvant at a time. Primary immunization was followed by booster injections at intervals. Production of receptor antibodies was monitored by (1) determining the dilution of the serum (IgG fraction) that could specifically bind 50% of the I-125-LH/CG-R added and (2) analysing sera for any change in testosterone levels. Following primary immunization and the first booster, all three rabbits exhibited a 2.5- to 6.0-fold increase in serum testosterone over basal levels, and this effect was spread over a period of time (about 40 days) coinciding with the rise and fall of receptor antibodies. The maximal antibody titre (ED50) produced at this time ranged from 1:350 to 1:100 to below detectable limits for R-I, R-II and R-III respectively. Subsequent immunizations followed by the second booster resulted in a substantial increase in antibody titre (ED50 of 1:5000) in R-I, but this was not accompanied by any change in serum testosterone over preimmune levels, suggesting that with the progress of immunization the character of the antibody produced had also changed. Two pools of antisera from R-I, collected 10 days following the booster (at day 70 (bleed I) and day 290 (bleed II)), were used in further experiments. IgG isolated from bleed I but not from bleed II antiserum showed a dose-dependent stimulation of testosterone production by mouse Leydig cells in vitro, thus confirming the in vivo hormone-mimicking activity of antibodies generated during the early immunization phase. The IgG fractions from both bleeds were, however, capable of inhibiting (1) I-125-hCG binding to crude sheep luteal membrane (EC50 of 1:70 and 1:350 for bleed I and II antisera respectively) and (2) ovine LH-stimulated testosterone production by mouse Leydig cells in vitro, indicating the presence of antagonistic antibodies irrespective of the period of time during which the rabbits were immunized. The fact that bleed I-stimulated testosterone production could be inhibited in a dose-dependent manner by the addition of IgG from bleed II to the mouse Leydig cell in vitro assay system showed that the agonistic activity is intrinsic to the bleed I antibody. The receptor antibody (bleed II) was also capable of blocking LH action in vivo, as rabbits passively (for 24 h with LH/CG-R antiserum) as well as actively (for 130 days) immunized against LH/CG-R failed to respond to a bolus injection of LH (50 μg). At no time, however, was the serum testosterone reduced below the basal level. This study clearly shows that, unlike with LH antibody, attempts to achieve an LH-deficiency effect in vivo by resorting to immunization with whole LH receptor are difficult, as receptor antibodies exhibit both hormone-mimicking (agonistic) and hormone-blocking (antagonistic) activities.