858 results for COMPUTER SCIENCE, THEORY
Abstract:
Recently, mean-variance analysis has been proposed as a novel paradigm to model document ranking in Information Retrieval. The main merit of this approach is that it diversifies the ranking of retrieved documents. In its original formulation, the strategy considers both the mean of relevance estimates of retrieved documents and their variance. However, when this strategy has been empirically instantiated, the concepts of mean and variance are discarded in favour of a point-wise estimation of relevance (to replace the mean) and of a parameter to be tuned or, alternatively, a quantity dependent upon the document length (to replace the variance). In this paper we revisit this ranking strategy by going back to its roots: mean and variance. For each retrieved document, we infer a relevance distribution from a series of point-wise relevance estimations provided by a number of different systems. This is used to compute the mean and the variance of document relevance estimates. On the TREC ClueWeb collection, we show that this approach improves retrieval performance. This development could lead to new strategies for fusing the relevance estimates provided by different systems.
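A minimal sketch of the idea in Python: given per-document relevance estimates collected from several systems, rank by the mean penalized by the variance. The function name, the risk parameter b, and the linear "mean - b * variance" objective are illustrative assumptions, not the paper's exact instantiation.

```python
import statistics

def mean_variance_rank(scores_by_system, b=1.0):
    """Rank documents by mean relevance minus b times the variance of
    the estimates collected from several systems.

    scores_by_system: dict mapping doc_id -> list of relevance
    estimates, one per contributing system. The risk parameter b and
    the linear objective are assumptions for illustration only.
    """
    ranked = []
    for doc_id, estimates in scores_by_system.items():
        mu = statistics.mean(estimates)
        var = statistics.pvariance(estimates)
        ranked.append((mu - b * var, doc_id))
    return [doc for _, doc in sorted(ranked, reverse=True)]

# Example: three systems scored each document.
scores = {"d1": [0.9, 0.7, 0.8], "d2": [0.95, 0.3, 0.6]}
print(mean_variance_rank(scores))  # risk-averse ranking: d1 before d2
```

Note that d2 has a high best estimate but high disagreement across systems, so the variance penalty demotes it below the more consistently scored d1.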
Abstract:
The aim of this paper is to investigate the role of emotion features in diversifying document rankings to improve the effectiveness of Information Retrieval (IR) systems. For this purpose, two approaches are proposed to consider emotion features for diversification, and they are empirically tested on the TREC 678 Interactive Track collection. The results show that emotion features are capable of enhancing retrieval effectiveness.
Abstract:
Quantum-inspired models have recently attracted increasing attention in Information Retrieval. An intriguing characteristic of the mathematical framework of quantum theory is the presence of complex numbers. However, it is unclear what such numbers would actually represent or mean in Information Retrieval. The goal of this paper is to discuss the role of complex numbers within the context of Information Retrieval. First, we introduce how complex numbers are used in quantum probability theory. Then, we examine van Rijsbergen's proposal of evoking complex-valued representations of information objects. We empirically show that such a representation is unlikely to be effective in practice (confuting its usefulness in Information Retrieval). We then explore alternative proposals which may be more successful at realising the power of complex numbers.
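For readers unfamiliar with how complex numbers enter quantum probability, here is a small worked example of amplitudes and the Born rule. It is illustrative only and is not the representation the paper evaluates.

```python
import cmath

# Two complex amplitudes for the same event reached via two paths.
# Probabilities come from squared magnitudes (the Born rule), so the
# relative phase between amplitudes matters: opposite phases cancel.
a1 = cmath.rect(0.5, 0.0)        # magnitude 0.5, phase 0
a2 = cmath.rect(0.5, cmath.pi)   # magnitude 0.5, phase pi

p_classical = abs(a1) ** 2 + abs(a2) ** 2  # summing probabilities: 0.5
p_quantum = abs(a1 + a2) ** 2              # summing amplitudes: ~0.0

print(p_classical, p_quantum)  # destructive interference suppresses the event
```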
Abstract:
We study the natural problem of secure n-party computation (in the computationally unbounded attack model) of circuits over an arbitrary finite non-Abelian group (G, ·), which we call G-circuits. Besides its intrinsic interest, this problem is also motivated by a completeness result of Barrington, stating that such protocols can be applied for general secure computation of arbitrary functions. For flexibility, we are interested in protocols which only require black-box access to the group G (i.e. the only computations performed by players in the protocol are a group operation, a group inverse, or sampling a uniformly random group element). Our investigations focus on the passive adversarial model, where up to t of the n participating parties are corrupted.
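The black-box group model can be made concrete with a toy sketch: the only operations used below are composition, inversion, and uniform sampling, here instantiated for the symmetric group S3 (non-Abelian). The n-out-of-n sharing shown is a textbook construction, not the paper's t-private protocol.

```python
import random
from itertools import permutations

# Black-box interface for S3: elements are permutations of {0,1,2}.
ELEMENTS = list(permutations(range(3)))

def op(p, q):                # group operation: composition (p after q)
    return tuple(p[q[i]] for i in range(3))

def inv(p):                  # group inverse
    r = [0] * 3
    for i, v in enumerate(p):
        r[v] = i
    return tuple(r)

def sample():                # uniformly random group element
    return random.choice(ELEMENTS)

def share(x, n):
    """n-out-of-n sharing of x as x = g_1 * g_2 * ... * g_n: the first
    n-1 shares are uniform, the last one is forced (requires n >= 2)."""
    shares = [sample() for _ in range(n - 1)]
    prefix = shares[0]
    for g in shares[1:]:
        prefix = op(prefix, g)
    shares.append(op(inv(prefix), x))
    return shares

def reconstruct(shares):
    out = shares[0]
    for g in shares[1:]:
        out = op(out, g)
    return out

secret = (1, 2, 0)
assert reconstruct(share(secret, 5)) == secret
```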
Abstract:
Boolean functions and their Möbius transforms are involved in logical calculation, digital communications, coding theory and modern cryptography. So far, little is known about the relations between Boolean functions and their Möbius transforms. This work is composed of three parts. In the first part, we present relations between a Boolean function and its Möbius transform that allow the truth table/algebraic normal form (ANF) to be converted to the ANF/truth table of a function under different conditions. In the second part, we focus on the special case in which a Boolean function is identical to its Möbius transform. We call such functions coincident. In the third part, we generalize the concept of coincident functions and indicate that any Boolean function has the coincidence property even if it is not coincident.
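The truth-table/ANF conversion mentioned in the first part corresponds to the standard fast binary Möbius transform, sketched here; the coincident property can then be checked directly. The specific relations derived in the paper are not reproduced.

```python
def mobius_transform(tt):
    """Fast binary Moebius transform: maps the truth table of a Boolean
    function on n variables (list of 0/1 of length 2^n) to the
    coefficient vector of its algebraic normal form, and vice versa
    (over GF(2) the transform is an involution)."""
    t = list(tt)
    n = len(t).bit_length() - 1
    step = 1
    for _ in range(n):
        for i in range(0, len(t), 2 * step):
            for j in range(i, i + step):
                t[j + step] ^= t[j]   # XOR butterfly over GF(2)
        step *= 2
    return t

# A function is "coincident" when it equals its own Moebius transform.
f = [0, 1, 1, 0, 1, 0, 0, 1]      # truth table of x1 XOR x2 XOR x3
print(mobius_transform(f))        # ANF coefficients: [0,1,1,0,1,0,0,0]
print(mobius_transform(f) == f)   # coincidence check (False here)
```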
Abstract:
We consider the problem of increasing the threshold parameter of a secret-sharing scheme after the setup (share distribution) phase, without further communication between the dealer and the shareholders. Previous solutions to this problem require one to start off with a nonstandard scheme designed specifically for this purpose, or to have communication between shareholders. In contrast, we show how to increase the threshold parameter of the standard Shamir secret-sharing scheme without communication between the shareholders. Our technique can thus be applied to existing Shamir schemes even if they were set up without consideration to future threshold increases. Our method is a new positive cryptographic application for lattice reduction algorithms, inspired by recent work on lattice-based list decoding of Reed-Solomon codes with noise bounded in the Lee norm. We use fundamental results from the theory of lattices (geometry of numbers) to prove quantitative statements about the information-theoretic security of our construction. These lattice-based security proof techniques may be of independent interest.
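For reference, a minimal sketch of the standard Shamir scheme the paper starts from; the lattice-based threshold-increase technique itself is not shown. The prime P and the evaluation points 1..n are illustrative choices.

```python
import random

P = 2_147_483_647  # a Mersenne prime; any prime larger than the secret works

def make_shares(secret, t, n):
    """Standard (t, n) Shamir sharing: evaluate a random degree t-1
    polynomial with constant term `secret` at the points 1..n."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        y = 0
        for c in reversed(coeffs):   # Horner evaluation mod P
            y = (y * x + c) % P
        return y
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 from any t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789   # any 3 of the 5 suffice
```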
Abstract:
Although recommender systems and reputation systems have quite different theoretical and technical bases, both types of systems have the purpose of providing advice for decision making in e-commerce and online service environments. The similarity in purpose makes it natural to integrate both types of systems in order to produce better online advice, but their difference in theory and implementation makes the integration challenging. In this paper, we propose mappings to subjective opinions from the values produced by recommender systems as well as from the scores produced by reputation systems, and we combine the resulting opinions within the framework of subjective logic.
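A small sketch of the subjective-logic side, assuming the standard evidence-to-opinion mapping and the standard cumulative fusion operator with equal base rates; how the paper derives the evidence counts from recommender values and reputation scores is its own contribution and is not reproduced here. The function names are hypothetical.

```python
def opinion_from_evidence(r, s, a=0.5, W=2.0):
    """Map r positive and s negative observations to a subjective
    opinion (belief, disbelief, uncertainty, base rate) using the
    standard evidence mapping of subjective logic."""
    k = r + s + W
    return (r / k, s / k, W / k, a)

def cumulative_fusion(A, B):
    """Standard cumulative fusion of two opinions on the same topic
    (assumes equal base rates and non-dogmatic opinions, u > 0)."""
    bA, dA, uA, a = A
    bB, dB, uB, _ = B
    k = uA + uB - uA * uB
    return ((bA * uB + bB * uA) / k,
            (dA * uB + dB * uA) / k,
            (uA * uB) / k,
            a)

def expected(opinion):
    b, _, u, a = opinion
    return b + a * u    # projected probability of the opinion

rec = opinion_from_evidence(8, 2)    # e.g. derived from recommender values
rep = opinion_from_evidence(45, 5)   # e.g. derived from a reputation score
print(expected(cumulative_fusion(rec, rep)))
```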
Abstract:
Affordance is an important concept in HCI. There are various interpretations of affordances, but it has been difficult to use this concept for design purposes. The treatment of affordances in the current HCI literature has often been as a one-to-one relationship between a user and an artefact. In our view, affordance is a dynamic, continually emerging relationship between a human and his or her environment. We believe that the social and cultural contexts within which an artefact is situated affect the way in which the artefact is used. Using a Structuration Theory approach, we argue that affordances also need to be treated at a much broader level, encompassing social and cultural aspects. We suggest that affordances should be seen at three levels: single user, organizational (or work group) and societal. Focusing on organizational-level affordances, we provide details of several important factors that affect the emergence of affordances.
Abstract:
The notion of identity-based (IB) cryptography was proposed by Shamir [177] as a specialization of public-key (PK) cryptography which dispensed with the need for cumbersome directories, certificates, and revocation lists.
Abstract:
The concept of affordance has different interpretations in the field of Human-Computer Interaction (HCI). However, its treatment has largely been as a one-to-one relationship between a user and a technology. We believe that a broader view of affordances is needed, one which encompasses the social and cultural aspects of our everyday life. We propose an interaction-centered view of affordance that can be useful for developing a better understanding of designed artefacts. An interaction-centered view suggests that affordance is an interpretative relationship between users and the technology that emerges during the users' interaction with the technology in their lived environments. We distinguish two broad classes of affordances: affordance in information and affordance in articulation. Affordance in information refers to users' understanding of a technology based on their semantic and syntactic interpretation; affordance in articulation refers to users' interpretations about the use of the technology. We also argue that the notion of affordance should be treated at two levels: the 'artefact level' and the 'practice level'. Consequently, we provide two examples to demonstrate our arguments.
Abstract:
In this chapter, we discuss four related areas of cryptology, namely, authentication, hashing, message authentication codes (MACs), and digital signatures. These topics represent active and growing research areas in cryptology. Space limitations allow us to concentrate only on the essential aspects of each topic. The bibliography is intended to supplement our survey; we have selected those items which provide an overview of the current state of knowledge in the above areas. Authentication deals with the problem of providing assurance to a receiver that a communicated message originates from a particular transmitter, and that the received message has the same content as the transmitted message. A typical authentication scenario occurs in computer networks, where the identity of two communicating entities is established by means of authentication. Hashing is concerned with the problem of providing a relatively short digest, or fingerprint, of a much longer message or electronic document. A hashing function must satisfy (at least) the critical requirement that it be infeasible to find two distinct messages with the same fingerprint. Hashing functions have numerous applications in cryptology. They are often used as primitives to construct other cryptographic functions. MACs are symmetric-key primitives that provide message integrity against active spoofing by appending a cryptographic checksum to a message that is verifiable only by the intended recipient of the message. Message authentication is one of the most important ways of ensuring the integrity of information that is transferred by electronic means. Digital signatures provide electronic equivalents of handwritten signatures. They preserve the essential features of handwritten signatures and can be used to sign electronic documents. Digital signatures can potentially be used in legal contexts.
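As an illustration of the MAC primitive described above, a short example using HMAC-SHA256 via Python's standard hmac module; this is one standard instantiation, whereas the chapter surveys MACs in general.

```python
import hmac
import hashlib

# A MAC appends a keyed checksum that only holders of the shared
# secret can generate and verify.
key = b"shared-secret-key"
message = b"transfer 100 to account 42"

tag = hmac.new(key, message, hashlib.sha256).digest()

def verify(key, message, tag):
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)   # constant-time comparison

assert verify(key, message, tag)
assert not verify(key, message + b"0", tag)     # a spoofed message fails
```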
Abstract:
In this chapter we continue the exposition of crypto topics that was begun in the previous chapter. This chapter covers secret sharing, threshold cryptography, signature schemes, and finally quantum key distribution and quantum cryptography. As in the previous chapter, we have focused only on the essentials of each topic. We have selected in the bibliography a list of representative items, which can be consulted for further details. First we give a synopsis of the topics that are discussed in this chapter. Secret sharing is concerned with the problem of how to distribute a secret among a group of participating individuals, or entities, so that only predesignated collections of individuals are able to recreate the secret by collectively combining the parts of the secret that were allocated to them. There are numerous applications of secret-sharing schemes in practice. One example of secret sharing occurs in banking. For instance, the combination to a vault may be distributed in such a way that only specified collections of employees can open the vault by pooling their portions of the combination. In this way the authority to initiate an action, e.g., the opening of a bank vault, is divided for the purposes of providing security and for added functionality, such as auditing, if required. Threshold cryptography is a relatively recently studied area of cryptography. It deals with situations where the authority to initiate or perform cryptographic operations is distributed among a group of individuals. Many of the standard operations of single-user cryptography have counterparts in threshold cryptography. Signature schemes deal with the problem of generating and verifying (electronic) signatures for documents. A subclass of signature schemes is concerned with the shared generation and shared verification of signatures, where a collaborating group of individuals is required to perform these actions. A new paradigm of security has recently been introduced into cryptography with the emergence of the ideas of quantum key distribution and quantum cryptography. While classical cryptography employs various mathematical techniques to restrict eavesdroppers from learning the contents of encrypted messages, in quantum cryptography the information is protected by the laws of physics.
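The vault example can be made concrete with the simplest possible scheme, n-out-of-n sharing by XOR, where every share is required to recover the secret; threshold schemes such as Shamir's (sketched earlier in this listing) generalise this so that predesignated smaller collections suffice. This is a generic textbook sketch, not a scheme from the chapter.

```python
import secrets
from functools import reduce

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def split(secret, n):
    """n-of-n sharing: n-1 uniformly random shares plus one forced
    share; any n-1 shares are statistically independent of the secret."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    shares.append(reduce(xor_bytes, shares, secret))
    return shares

def combine(shares):
    return reduce(xor_bytes, shares)

combo = b"36-24-36"              # the vault combination
shares = split(combo, 3)         # one share per designated employee
assert combine(shares) == combo  # all three together reopen the vault
```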
Abstract:
A pseudonym provides anonymity by protecting the identity of a legitimate user. A user with a pseudonym can interact with an unknown entity and be confident that his/her identity is secret even if the other entity is dishonest. In this work, we present a system that allows users to create pseudonyms from a trusted master public-secret key pair. The proposed system is based on the intractability of factoring and of finding square roots of a quadratic residue modulo a composite number, where the composite number is a product of two large primes. Our proposal differs from previously published pseudonym systems in that, in addition to the standard notion of protecting the privacy of a user, our system offers colligation between seemingly independent pseudonyms. This new property, when combined with a trusted platform that stores a master secret key, is extremely beneficial to a user, as it offers a convenient way to generate a large number of pseudonyms using relatively small storage.
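A toy sketch of the underlying hardness assumption: squaring modulo N = p*q is easy for anyone, while extracting square roots is easy only with the factorization (the trapdoor). The pseudonym derivation itself is the paper's construction and is not shown; the primes below are illustrative toy values.

```python
# Both primes are congruent to 3 mod 4, so a square root of a
# quadratic residue y mod p is simply y^((p+1)/4) mod p.
p, q = 10007, 10039          # toy primes; real systems use large primes
N = p * q

x = 123456                   # chosen by the holder of the master secret
y = pow(x, 2, N)             # publicly computable quadratic residue

# Trapdoor root extraction using the factorization:
rp = pow(y, (p + 1) // 4, p)            # square root of y mod p
rq = pow(y, (q + 1) // 4, q)            # square root of y mod q

# Combine the two roots with the Chinese Remainder Theorem.
root = (rp * q * pow(q, -1, p) + rq * p * pow(p, -1, q)) % N
assert pow(root, 2, N) == y  # without p and q, finding such a root
                             # is believed as hard as factoring N
```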
Abstract:
We analyse the security of the cryptographic hash function LAKE-256 proposed at FSE 2008 by Aumasson, Meier and Phan. By exploiting non-injectivity of some of the building primitives of LAKE, we show three different collision and near-collision attacks on the compression function. The first attack uses differences in the chaining values and the block counter and finds collisions with complexity 2^33. The second attack utilizes differences in the chaining values and salt and yields collisions with complexity 2^42. The final attack uses differences only in the chaining values to yield near-collisions with complexity 2^99. All our attacks are independent of the number of rounds in the compression function. We illustrate the first two attacks by showing examples of collisions and near-collisions.
Abstract:
There has been tremendous interest in watermarking multimedia content during the past two decades, mainly for proving ownership and detecting tampering. Digital fingerprinting, which deals with identifying malicious users, has also received significant attention. While extensive work has been carried out on watermarking images, other multimedia objects still have enormous research potential. Watermarking database relations is one of several areas which demand research focus owing to the commercial implications of database theft. Progress in database watermarking has been limited, with most watermarking schemes modeled after the irreversible database watermarking scheme proposed by Agrawal and Kiernan. Reversibility is the ability to regenerate the original (unmarked) relation from the watermarked relation using a secret key. As explained in our paper, reversible watermarking schemes provide greater security against secondary watermarking attacks, where an attacker watermarks an already marked relation in an attempt to erase the original watermark. This paper proposes an improvement over the reversible and blind watermarking scheme presented in [5], identifying and eliminating a critical problem with the previous model. Experiments show that the average watermark detection rate is around 91% even with the attacker distorting half of the attributes. The current scheme provides security against secondary watermarking attacks.