31 results for Web image search
at Indian Institute of Science - Bangalore - India
Abstract:
In this paper, we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we describe two well-known mechanisms for sponsored search auctions: Generalized Second Price (GSP) and Vickrey-Clarke-Groves (VCG). We then derive a new mechanism for the sponsored search auction, which we call the optimal (OPT) mechanism. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality of the advertisers. We then undertake a detailed comparative study of the mechanisms GSP, VCG, and OPT. We compute and compare the expected revenue earned by the search engine under the three mechanisms when the advertisers are symmetric and certain special conditions are satisfied. We also compare the three mechanisms in terms of incentive compatibility, individual rationality, and computational complexity. Note to Practitioners: The advertiser-supported web site is one of the successful business models in the emerging web landscape. When an Internet user enters a keyword (i.e., a search phrase) into a search engine, the user gets back a page of results containing the links most relevant to the query, along with sponsored links (also called paid advertisement links). When a sponsored link is clicked, the user is directed to the corresponding advertiser's web page, and the advertiser pays the search engine in some appropriate manner for sending the user there. For every search performed by any user on any keyword, the search engine faces the problem of matching a set of advertisers to the sponsored slots; in addition, it must decide on a price to be charged to each advertiser. Due to increasing demand for Internet advertising space, most search engines currently use auction mechanisms for this purpose, called sponsored search auctions. A significant percentage of the revenue of Internet giants such as Google, Yahoo!, and MSN comes from sponsored search auctions. In this paper, we study two auction mechanisms, GSP and VCG, which are quite popular in the sponsored search context, and pursue the objective of designing a mechanism superior to both. In particular, we propose a new mechanism which we call the OPT mechanism. This mechanism maximizes the search engine's expected revenue subject to achieving Bayesian incentive compatibility and individual rationality. Bayesian incentive compatibility guarantees that it is optimal for each advertiser to bid his/her true value provided all other advertisers also bid their true values. Individual rationality ensures that the advertisers participate voluntarily in the auction, since they are assured of a non-negative payoff by doing so.
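Since the payment rules are the crux of the GSP/VCG comparison, a minimal sketch may help; the bids, slot click-through rates, and the assumption that advertisers are already sorted by bid are illustrative, not taken from the paper.

    def gsp_payments(bids, num_slots):
        # Under GSP, the winner of slot j pays the next-highest bid per click.
        return [bids[j + 1] if j + 1 < len(bids) else 0.0
                for j in range(min(num_slots, len(bids)))]

    def vcg_payments(bids, ctrs):
        # Under VCG, the winner of slot j pays the externality imposed on the
        # bidders below, via the standard recursion
        # p_j*ctr_j = p_{j+1}*ctr_{j+1} + b_{j+1}*(ctr_j - ctr_{j+1}).
        k = min(len(ctrs), len(bids))
        total = [0.0] * k                      # per-impression payments
        for j in range(k - 1, -1, -1):
            nxt_total = total[j + 1] if j + 1 < k else 0.0
            nxt_bid = bids[j + 1] if j + 1 < len(bids) else 0.0
            nxt_ctr = ctrs[j + 1] if j + 1 < k else 0.0
            total[j] = nxt_total + nxt_bid * (ctrs[j] - nxt_ctr)
        return [t / c for t, c in zip(total, ctrs[:k])]

    bids = [4.0, 3.0, 2.0, 1.0]                # sorted in decreasing order
    ctrs = [0.3, 0.2, 0.1]                     # click-through rates of the slots
    print(gsp_payments(bids, 3))               # [3.0, 2.0, 1.0]
    print(vcg_payments(bids, ctrs))            # [2.0, 1.5, 1.0]

The example reproduces the familiar ordering in which each VCG per-click payment is no higher than the corresponding GSP payment.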
Abstract:
Owing to high evolutionary divergence, it is not always possible to identify distantly related protein domains by sequence search techniques. Intermediate sequences possess sequence features of more than one protein and facilitate the detection of remotely related proteins. We have recently demonstrated Cascade PSI-BLAST, in which we perform PSI-BLAST for many 'generations', initiating searches from newly detected homologues as well. Such a rigorous propagation through generations of PSI-BLAST effectively exploits the role of intermediates in detecting distant similarities between proteins. This approach has been tested on a large number of folds, and its performance in detecting superfamily-level relationships is approximately 35% better than simple PSI-BLAST searches. We present a web server for this search method that permits users to perform Cascade PSI-BLAST searches against the Pfam, SCOP and SwissProt databases. The URL for this server is http://crick.mbu.iisc.ernet.in/~CASCADE/CascadeBlast.html.
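The cascading logic itself is compact; the sketch below shows the generation-by-generation propagation, where run_psiblast is a hypothetical wrapper (e.g. around the BLAST+ psiblast binary) that returns the set of hit identifiers for one query, and is not part of the server's published interface.

    def cascade_psiblast(query, database, generations, run_psiblast):
        found = set()                 # all homologues detected so far
        frontier = {query}            # seeds for the current generation
        for _ in range(generations):
            new_hits = set()
            for q in frontier:
                new_hits |= run_psiblast(q, database)
            frontier = new_hits - found   # only unseen hits seed new searches
            found |= new_hits
            if not frontier:              # converged: no new homologues found
                break
        return found

Intermediate sequences enter through the frontier set: a hit found in generation g becomes a query in generation g+1, which is what lets the search reach homologues invisible to the original query.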
Abstract:
In this paper, we address a key problem faced by advertisers in sponsored search auctions on the web: how much to bid, given the bids of the other advertisers, so as to maximize individual payoffs? Assuming the generalized second price (GSP) auction as the auction mechanism, we formulate this problem in the framework of an infinite-horizon alternating-move game of advertiser bidding behavior. For a sponsored search auction involving two advertisers, we characterize all the pure strategy and mixed strategy Nash equilibria. We also prove that the bid prices lead to a Nash equilibrium if the advertisers follow a myopic best-response bidding strategy. Following this, we investigate the bidding behavior of the advertisers when they use Q-learning. We discover empirically an interesting trend: the Q-values converge even when both advertisers learn simultaneously.
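The alternating-move structure is easy to see in a small simulation; the slot click-through rates, valuations, and bid increment below are illustrative assumptions, and the simple undercut/overbid rule is a stand-in for the myopic best response analysed in the paper, whose result identifies when such bid sequences reach a Nash equilibrium (the toy rule here may cycle).

    CTR = [0.3, 0.1]    # click-through rates of slots 1 and 2 (assumed)
    EPS = 0.01          # minimum bid increment (assumed)

    def best_response(value, opponent_bid):
        # Take slot 1 (paying the opponent's bid per click) only if that
        # beats settling for slot 2; otherwise undercut the opponent.
        win_payoff = CTR[0] * (value - opponent_bid)
        lose_payoff = CTR[1] * value
        if win_payoff > lose_payoff:
            return opponent_bid + EPS
        return max(opponent_bid - EPS, 0.0)

    values, bids = [5.0, 4.0], [5.0, 4.0]   # start from truthful bids
    for t in range(100):                    # advertisers move alternately
        i = t % 2
        bids[i] = best_response(values[i], bids[1 - i])
    print(bids)                             # snapshot of the bid trajectory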
Abstract:
Representing images and videos in the form of compact codes has emerged as an important research interest in the vision community, in the context of web-scale image/video search. The recently proposed Vector of Locally Aggregated Descriptors (VLAD) has been shown to outperform existing retrieval techniques while giving the desired compact representation. VLAD aggregates the local features of an image in the feature space. In this paper, we propose to represent the local features extracted from an image as sparse codes over an over-complete dictionary obtained by the K-SVD dictionary training algorithm. The proposed VLAD aggregates the residuals in the space of these sparse codes to obtain a compact representation for the image. Experiments are performed on the 'Holidays' database using SIFT features, and the performance of the proposed method is compared with the original VLAD. A 4% improvement in mean average precision (mAP) indicates the better retrieval performance of the proposed sparse-coding-based VLAD.
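For contrast with the proposed variant, here is a minimal sketch of standard VLAD aggregation (hard assignment to the nearest codeword, residual accumulation, L2 normalisation); the descriptor and codebook sizes are illustrative. The paper's method replaces the hard assignment with sparse codes over a K-SVD-trained dictionary and aggregates residuals in that space.

    import numpy as np

    def vlad(descriptors, codebook):
        # descriptors: (n, d) local features; codebook: (k, d) centroids.
        k, d = codebook.shape
        dists = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        assign = dists.argmin(axis=1)              # nearest codeword per feature
        v = np.zeros((k, d))
        for i, a in enumerate(assign):
            v[a] += descriptors[i] - codebook[a]   # accumulate residuals
        v = v.ravel()
        return v / (np.linalg.norm(v) + 1e-12)     # L2-normalised compact code

    rng = np.random.default_rng(0)
    desc = rng.normal(size=(500, 128))   # stand-in for SIFT descriptors of one image
    cb = rng.normal(size=(64, 128))      # codebook (k-means or K-SVD atoms)
    print(vlad(desc, cb).shape)          # (8192,) = k * d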
Abstract:
In this paper, we present a machine learning approach to measuring the visual quality of JPEG-coded images. The features for predicting perceived image quality are extracted by considering key human visual sensitivity (HVS) factors such as edge amplitude, edge length, background activity and background luminance. Image quality assessment involves estimating the functional relationship between HVS features and subjective test scores. The quality of the compressed images is estimated without reference to the original images (a 'no-reference' metric). Here, the problem of quality estimation is transformed into a classification problem and solved using the extreme learning machine (ELM) algorithm. In ELM, the input weights and bias values are chosen randomly, and the output weights are calculated analytically. The generalization performance of the ELM algorithm on classification problems with an imbalance in the number of samples per quality class depends critically on the input weights and bias values. Hence, we propose two schemes, namely the k-fold selection scheme (KS-ELM) and the real-coded genetic algorithm (RCGA-ELM), to select the input weights and bias values such that the generalization performance of the classifier is maximized. Results indicate that the proposed schemes significantly improve the performance of the ELM classifier under imbalanced conditions for image quality assessment, and that the visual quality estimated by the proposed RCGA-ELM tracks the mean opinion score very well. The experimental results are compared with an existing JPEG no-reference image quality metric and the full-reference structural similarity image quality metric.
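The ELM core that the two schemes build on fits in a few lines: input weights and biases are random, and only the output weights are solved analytically. The sketch below uses made-up data sizes and a single random draw of the weights; KS-ELM and RCGA-ELM differ precisely in how that draw is selected.

    import numpy as np

    def elm_train(X, T, hidden, rng):
        W = rng.normal(size=(X.shape[1], hidden))   # random input weights
        b = rng.normal(size=hidden)                 # random bias values
        H = np.tanh(X @ W + b)                      # hidden-layer outputs
        beta = np.linalg.pinv(H) @ T                # analytic output weights
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 10))                  # HVS feature vectors (made up)
    T = np.eye(5)[rng.integers(0, 5, size=200)]     # one-hot quality classes
    W, b, beta = elm_train(X, T, hidden=50, rng=rng)
    labels = elm_predict(X, W, b, beta).argmax(axis=1)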
Abstract:
The presence of folded solution conformations in the peptides Boc-Ala-(Aib-Ala)₂-OMe, Boc-Val-(Aib-Val)₂-OMe, Boc-Ala-(Aib-Ala)₃-OMe and Boc-Val-(Aib-Val)₃-OMe has been established by 270 MHz ¹H NMR. Intramolecularly H-bonded NH groups have been identified using the temperature and solvent dependence of NH chemical shifts and paramagnetic-radical-induced broadening of NH resonances. Both pentapeptides adopt 3₁₀-helical conformations possessing three intramolecular H-bonds in CDCl₃ and (CD₃)₂SO. The heptapeptides favour helical structures with five H-bonds in CDCl₃; in (CD₃)₂SO only four H-bonds are readily detected.
Abstract:
Lateral or transaxial truncation of cone-beam data can occur either due to the field-of-view limitation of the scanning apparatus or due to region-of-interest tomography. In this paper, we suggest two new methods to handle lateral truncation in helical scan CT. Reconstruction from laterally truncated projection data, assuming it to be complete, gives severe artifacts which penetrate even into the field of view. A row-by-row data completion approach using linear prediction is introduced for helical scan truncated data, along with an extension of this technique known as the windowed linear prediction approach. The efficacy of the two techniques is shown using simulations with standard phantoms. A quantitative image quality measure of the resulting reconstructed images is used to evaluate the performance of the proposed methods against an extension of a standard existing technique.
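To make the row-by-row completion idea concrete, here is a small sketch of linear-prediction extrapolation of one detector row; the predictor order, the synthetic row, and the least-squares fitting route are illustrative assumptions rather than the paper's exact scheme (whose windowed variant refits the predictor over a sliding window near the truncation edge).

    import numpy as np

    def lp_extend(row, p, n_extend):
        # Fit an order-p linear predictor row[t] ~ sum_k a[k] * row[t-1-k]
        # to the measured samples, then extrapolate past the truncation edge.
        A = np.array([row[t - p:t][::-1] for t in range(p, len(row))])
        y = row[p:]
        a, *_ = np.linalg.lstsq(A, y, rcond=None)
        out = list(row)
        for _ in range(n_extend):
            out.append(float(np.dot(a, out[-1:-p - 1:-1])))
        return np.asarray(out)

    row = np.sin(np.linspace(0.0, 6.0, 120))   # stand-in for one truncated detector row
    completed = lp_extend(row, p=10, n_extend=30)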
Abstract:
A novel method, designated the holographic spectrum reconstruction (HSR) method, is proposed for achieving simultaneous display of the spectrum and image of an object in a single plane. A study of the scaling behaviour of both the spectrum and the image has been carried out, and based on this study it is demonstrated that a lensless coherent optical processor can be realized.
Abstract:
In order to understand the molecular mechanism of the non-oxidative decarboxylation of aromatic acids observed in microbial systems, 2,3-dihydroxybenzoic acid (DHBA) decarboxylase from […] was purified to homogeneity by affinity chromatography. The enzyme (Mr 120 kDa) had four identical subunits (28 kDa each) and was specific for DHBA. It had a pH optimum of 5.2 and a Km of 0.34 mM. The decarboxylation did not require any cofactors, nor did the enzyme have a pyruvoyl group at the active site. The carboxyl group and the hydroxyl group in the […] position were required for activity. Preliminary spectroscopic properties of the enzyme are also reported.
Abstract:
The existing internet computing resource, Biomolecules Segment Display Device (BSDD), has been updated with several additional useful features. An advanced option is provided to superpose the structural motifs obtained from a search of the Protein Data Bank (PDB), in order to see whether the three-dimensional structures adopted by identical or similar sequence motifs are the same. Furthermore, options to display structural features such as inter- and intra-molecular interactions, ion pairs, disulphide bonds, etc., have been provided. The updated resource is interfaced with an up-to-date copy of the public-domain PDB as well as with 25% and 90% non-redundant sets of protein structures. Users can also upload three-dimensional atomic coordinates (PDB format) from the client machine. A free molecular graphics program, Jmol, is interfaced with the resource to display the three-dimensional structures.
Abstract:
Microsomes (105,000 × g sediment) prepared from induced cells of […] were found to hydroxylate progesterone to 11α-hydroxyprogesterone (11α-OHP) in high yields (85-90% in 30 min) in the presence of NADPH and O₂. The pH optimum for the hydroxylase was found to be 7.7; however, for the isolation of active microsomes, grinding of the mycelium should be carried out at pH 8.3. Metyrapone, carbon monoxide, SKF-525A, p-CMB and N-methyl maleimide inhibited the hydroxylase activity, indicating the involvement of a cytochrome P-450 system. The inhibition of the hydroxylase by cytochrome c and the presence of high levels of NADPH-cytochrome c reductase in induced microsomes suggest that the reductase could be one of the components of the hydroxylase system.
Abstract:
A soluble fraction of […] catalyzed the hydroxylation of mandelic acid to p-hydroxymandelic acid. The enzyme had a pH optimum of 5.4 and showed an absolute requirement for Fe²⁺, tetrahydropteridine and NADPH. p-Hydroxymandelate, the product of the enzyme reaction, was identified by paper chromatography, thin-layer chromatography, and UV and IR spectra.
Abstract:
tRNA isolated from […] grown in a medium containing [75Se]sodium selenosulfate was converted to nucleosides and analysed for selenonucleosides on a phosphocellulose column. Upon chromatography, the radioactivity resolved into three peaks. The first peak consisted of free selenium and traces of undigested nucleotides. The second peak was identified as 4-selenouridine by co-chromatography with an authentic sample of 4-selenouridine. The identity of the third peak was not established. The second and third peaks represented 93% and 7%, respectively, of the selenium present in the nucleosides.
Abstract:
Business processes and application functionality are becoming available as internal web services inside enterprise boundaries, as well as commercial web services from enterprise solution vendors and web services marketplaces. Typically, multiple web service providers offer services capable of fulfilling a particular functionality, although with different Quality of Service (QoS). Dynamic creation of business processes requires composing an appropriate set of web services that best suits the current need. This paper presents a novel combinatorial auction approach to QoS-aware dynamic web services composition; such an approach enables not only stand-alone web services but also composite web services to be part of a business process. The combinatorial auction leads to an integer programming formulation for the web services composition problem, an important feature of which is the incorporation of service level agreements. We describe a software tool, QWESC, for QoS-aware web services composition based on the proposed approach.
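The winner-determination core of such a combinatorial auction can be sketched in a few lines; the task names, bids, and brute-force search below are illustrative stand-ins for the paper's integer programming formulation, which additionally encodes QoS and service level agreement constraints.

    from itertools import combinations

    # Bids offer bundles of process tasks at a price; we want disjoint
    # bids that cover every task at minimum total cost.
    tasks = {"pay", "ship", "track"}
    bids = [("A", {"pay"}, 3.0),
            ("B", {"ship", "track"}, 4.0),
            ("C", {"pay", "ship"}, 6.0),
            ("D", {"track"}, 2.0)]

    best, best_cost = None, float("inf")
    for r in range(1, len(bids) + 1):
        for combo in combinations(bids, r):
            bundles = [bundle for _, bundle, _ in combo]
            covered = set().union(*bundles)
            disjoint = sum(len(b) for b in bundles) == len(covered)
            cost = sum(c for _, _, c in combo)
            if covered == tasks and disjoint and cost < best_cost:
                best, best_cost = combo, cost
    print(best_cost, [name for name, _, _ in best])   # 7.0 ['A', 'B']

An integer program replaces the brute force with a binary decision variable per bid, the disjoint-cover conditions as linear constraints, and cost (or QoS-weighted utility) as the objective.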
Abstract:
In this paper, we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we design a novel auction which we call the OPT (optimal) auction. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality of the advertisers. We show that the OPT mechanism is superior to two of the most commonly used mechanisms for sponsored search, namely (1) GSP (Generalized Second Price) and (2) VCG (Vickrey-Clarke-Groves). We then show an important revenue equivalence result: the expected revenue earned by the search engine is the same for all three mechanisms, provided the advertisers are symmetric and the number of sponsored slots is strictly less than the number of advertisers.
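A toy Monte Carlo conveys the flavour of the equivalence in the simplest case: with a single slot and truthful bidding, GSP and VCG both charge the second-highest valuation, so expected revenue coincides. The uniform value distribution and advertiser count below are illustrative assumptions.

    import random

    n, trials = 5, 200_000
    # Expected second-highest of n i.i.d. U(0,1) values is (n-1)/(n+1).
    revenue = sum(sorted(random.random() for _ in range(n))[-2]
                  for _ in range(trials)) / trials
    print(revenue, (n - 1) / (n + 1))   # both approximately 0.667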