993 results for Lattice theory.
Abstract:
We present a technique for delegating a short lattice basis that has the advantage of keeping the lattice dimension unchanged upon delegation. Building on this result, we construct two new hierarchical identity-based encryption (HIBE) schemes, with and without random oracles. The resulting systems are very different from earlier lattice-based HIBEs and in some cases result in shorter ciphertexts and private keys. We prove security from classic lattice hardness assumptions.
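The abstract does not spell out the delegation mechanism. As a minimal sketch of how delegation can keep the dimension fixed (an assumption on our part, with the names A, R, B, T_A, T_B introduced only for illustration), one can multiply the public matrix by a low-norm invertible matrix instead of concatenating new columns:

\[
\mathsf{BasisDel}(A,\,R,\,T_A)\;\longrightarrow\;T_B,
\qquad B \;=\; A\,R^{-1}\in\mathbb{Z}_q^{\,n\times m},
\]
where $T_A$ is a short basis of $\Lambda_q^{\perp}(A)$, $R\in\mathbb{Z}^{m\times m}$ is a low-norm invertible matrix derived from the child identity, and $T_B$ is a (somewhat longer) short basis of $\Lambda_q^{\perp}(B)$. Because $B$ has the same shape as $A$, the lattice dimension, and hence key and ciphertext sizes, need not grow with the delegation depth, which is the property the abstract highlights.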
Abstract:
We construct an efficient identity-based encryption system based on the standard learning with errors (LWE) problem. Our security proof holds in the standard model. The key step in the construction is a family of lattices for which there are two distinct trapdoors for finding short vectors. One trapdoor enables the real system to generate short vectors in all lattices in the family. The other trapdoor enables the simulator to generate short vectors for all lattices in the family except for one. We extend this basic technique to an adaptively secure IBE and a hierarchical IBE.
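The abstract leaves the lattice family implicit. A common way to realize such a "two trapdoor" family in standard-model lattice IBE, sketched here only as an assumed illustration (the names A, A_1, B, H, R are ours), is a matrix family of the form

\[
F_{\mathrm{id}} \;=\; \bigl[\,A \;\big|\; A_1 + H(\mathrm{id})\,B\,\bigr]\in\mathbb{Z}_q^{\,n\times 2m}.
\]
Real trapdoor: a short basis $T_A$ of $\Lambda_q^{\perp}(A)$ yields short vectors in $\Lambda_q^{\perp}(F_{\mathrm{id}})$ for every identity. Simulation trapdoor: setting $A_1 = A\,R - H(\mathrm{id}^{*})\,B$ for a low-norm $R$ gives
\[
F_{\mathrm{id}} \;=\; \bigl[\,A \;\big|\; A\,R + \bigl(H(\mathrm{id})-H(\mathrm{id}^{*})\bigr)\,B\,\bigr],
\]
so a trapdoor for $B$ yields short vectors whenever $H(\mathrm{id})-H(\mathrm{id}^{*})$ is invertible, i.e. for every $\mathrm{id}\neq\mathrm{id}^{*}$, and it vanishes exactly at the challenge identity $\mathrm{id}^{*}$.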
Abstract:
We propose a framework for adaptive security from hard random lattices in the standard model. Our approach borrows from the recent Agrawal-Boneh-Boyen families of lattices, which can admit reliable and punctured trapdoors, respectively used in reality and in simulation. We extend this idea to make the simulation trapdoors cancel not for a specific forgery but on a non-negligible subset of the possible challenges. Conceptually, we build a compactly representable, large family of input-dependent “mixture” lattices, set up with trapdoors that “vanish” for a secret subset which we hope the forger will target. Technically, we tweak the lattice structure to achieve “naturally nice” distributions for arbitrary choices of subset size. The framework is very general. Here we obtain fully secure signatures, and also IBE, that are compact, simple, and elegant.
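Continuing the sketch above, and again only as an assumed picture of the "mixture" idea rather than the paper's exact construction, the input-dependent lattices can be imagined as

\[
F_{x} \;=\; \Bigl[\,A \;\Big|\; C_0 + \sum_{i=1}^{\ell} x_i\,C_i\,\Bigr],
\]
with the matrices $C_i$ programmed in the simulation so that the hidden trapdoor cancels not at a single point but on a planned, non-negligible subset of inputs $x$, the subset on which the forger's challenge is hoped to land.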
Abstract:
The last fifty years have witnessed the growing pervasiveness of the figure of the map in critical, theoretical, and fictional discourse. References to mapping and cartography are endemic in poststructuralist theory, and, similarly, geographically and culturally diverse authors of twentieth-century fiction seem fixated upon mapping. While the map metaphor has been employed for centuries to highlight issues of textual representation and epistemology, the map metaphor itself has undergone a transformation in the postmodern era. This metamorphosis draws together poststructuralist conceptualizations of epistemology, textuality, cartography, and metaphor, and signals a shift away from modernist preoccupations with temporality and objectivity to a postmodern pragmatics of spatiality and subjectivity. Cartographic Strategies of Postmodernity charts this metamorphosis of cartographic metaphor, and argues that the ongoing reworking of the map metaphor renders it a formative and performative metaphor of postmodernity.
Abstract:
In this paper we introduce a formalization of Logical Imaging applied to IR in terms of Quantum Theory, through an analogy between the states of a quantum system and the terms in text documents. Our formalization relies upon the Schrödinger Picture, creating an analogy between the dynamics of a physical system and the kinematics of the probabilities generated by Logical Imaging. By using Quantum Theory, it is possible to model contextual information more precisely, in a seamless and principled fashion, within the Logical Imaging process. While further work is needed to validate this empirically, the foundations for doing so are provided.
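To make the Logical Imaging step concrete, the following Python sketch shows the classic Kripke-style imaging move (probability mass of terms absent from a document is transferred to the most similar term that does occur in it). The similarity function, prior, and numbers are toy placeholders, not the paper's quantum-theoretic formalization.

def image_on_document(prior, similarity, doc_terms):
    """Move each term's prior probability to the most similar term
    occurring in the document (Kripke-style imaging)."""
    imaged = {t: 0.0 for t in prior}
    for t, p in prior.items():
        target = t if t in doc_terms else max(
            doc_terms, key=lambda u: similarity(t, u))
        imaged[target] += p
    return imaged

def retrieval_score(prior, similarity, doc_terms, query_terms):
    """Score = total imaged probability landing on the query terms."""
    imaged = image_on_document(prior, similarity, doc_terms)
    return sum(imaged.get(t, 0.0) for t in query_terms)

# Toy usage: uniform prior over a small vocabulary, similarity from a
# hand-made co-occurrence table (purely hypothetical numbers).
vocab = ["quantum", "theory", "retrieval", "music"]
prior = {t: 1.0 / len(vocab) for t in vocab}
co = {("quantum", "theory"): 0.9, ("retrieval", "theory"): 0.6,
      ("music", "theory"): 0.1, ("quantum", "retrieval"): 0.5,
      ("music", "retrieval"): 0.2, ("music", "quantum"): 0.1}
sim = lambda a, b: 1.0 if a == b else co.get((a, b), co.get((b, a), 0.0))

doc = {"theory", "retrieval"}
print(retrieval_score(prior, sim, doc, {"quantum", "theory"}))  # 0.5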
Abstract:
The notion of certificateless public-key encryption (CL-PKE) was introduced by Al-Riyami and Paterson in 2003 to avoid the drawbacks of both traditional PKI-based public-key encryption (i.e., the need to establish a public-key infrastructure) and identity-based encryption (i.e., key escrow). Thus, like identity-based encryption, CL-PKE is certificate-free; unlike identity-based encryption, it is free of key escrow. In this paper, we introduce a simple and efficient CCA-secure CL-PKE scheme based on (hierarchical) identity-based encryption. Our construction is of both theoretical and practical interest. First, our generic transformation gives a new way of constructing CCA-secure CL-PKE. Second, instantiating our transformation with lattice-based primitives results in a more efficient CCA-secure CL-PKE scheme than its counterpart introduced by Dent in 2008.
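For readers unfamiliar with the certificateless setting, the Python skeleton below sketches the algorithm interfaces only; the names are ours and this is not the paper's HIBE-based transformation. The point is structural: decryption needs both the KGC-issued partial key and a user-chosen secret.

from dataclasses import dataclass

@dataclass
class PartialKey:        # issued by the KGC from the user's identity
    value: bytes

@dataclass
class UserKeys:          # generated by the user, never seen by the KGC
    secret: bytes
    public: bytes

class CLPKE:
    """Interface sketch of a certificateless public-key encryption scheme."""
    def extract_partial_key(self, master_secret, identity) -> PartialKey: ...
    def set_user_keys(self, params) -> UserKeys: ...
    def encrypt(self, params, identity, user_public, message) -> bytes: ...
    def decrypt(self, partial_key, user_secret, ciphertext) -> bytes: ...

# Because decryption requires BOTH components, no certificate is needed
# to bind the identity to the public key (the partial key does that),
# and the KGC alone cannot decrypt (it never learns the user secret),
# which is exactly the escrow-freeness claimed in the abstract.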
Abstract:
In recent years several works have investigated a formal model for Information Retrieval (IR) based on the mathematical formalism underlying quantum theory. These works have mainly exploited geometric and logical-algebraic features of the quantum formalism, for example entanglement, superposition of states, collapse onto basis states, and lattice relationships. In this poster I present an analogy between a typical IR scenario and the double-slit experiment. This experiment exhibits interference phenomena between events in a quantum system, causing the Kolmogorovian law of total probability to fail. The analogy allows us to put forward routes for the application of quantum probability theory in IR. However, several questions still need to be addressed; they will be the subject of my PhD research.
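The failure of the law of total probability mentioned above is easy to see numerically. The small Python example below (our illustration, not the poster's) sums squared amplitudes over the two slits and compares this with the squared amplitude of their sum; the two differ by the interference term 2·Re(ψ₁*ψ₂).

import cmath

def detection_probabilities(phase):
    psi1 = 0.5 * cmath.exp(1j * 0.0)      # amplitude via slit 1
    psi2 = 0.5 * cmath.exp(1j * phase)    # amplitude via slit 2
    p_total_prob = abs(psi1) ** 2 + abs(psi2) ** 2   # Kolmogorov: sum over the slits
    p_both_open = abs(psi1 + psi2) ** 2              # what is actually observed
    interference = 2 * (psi1.conjugate() * psi2).real
    return p_total_prob, p_both_open, interference

for phase in (0.0, cmath.pi / 2, cmath.pi):
    pc, pq, i = detection_probabilities(phase)
    # p_both_open == p_total_prob + interference, so the law of total
    # probability fails whenever the interference term is non-zero.
    print(f"phase={phase:.2f}  total-prob={pc:.2f}  observed={pq:.2f}  interference={i:+.2f}")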
Abstract:
Recently, Portfolio Theory (PT) has been proposed for Information Retrieval. However, under non-trivial conditions PT violates the original Probability Ranking Principle (PRP). In this poster, we shall explore whether PT upholds a different ranking principle based on Quantum Theory, i.e. the Quantum Probability Ranking Principle (QPRP), and examine the relationship between this new model and the new ranking principle. We make a significant contribution to the theoretical development of PT and show that under certain circumstances PT upholds the QPRP, and thus guarantees an optimal ranking according to the QPRP. A practical implication of this finding is that the parameters of PT can be automatically estimated via the QPRP, instead of resorting to extensive parameter tuning.
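As an illustration of QPRP-style ranking (our sketch; the poster's actual interference estimator may differ), the greedy procedure below picks, at each step, the document that maximizes its probability of relevance plus an interference term with the documents already ranked, here estimated from a pairwise similarity with a negative sign so that near-duplicates interfere destructively.

import math

def qprp_rank(relevance, similarity, k=None):
    """relevance: dict doc -> P(relevance); similarity: f(d1, d2) -> [0, 1].
    Greedily rank documents under a QPRP-style objective."""
    remaining = set(relevance)
    ranked = []
    k = k or len(remaining)
    while remaining and len(ranked) < k:
        def score(d):
            interference = sum(
                -2.0 * math.sqrt(relevance[d] * relevance[r]) * similarity(d, r)
                for r in ranked)
            return relevance[d] + interference
        best = max(remaining, key=score)
        ranked.append(best)
        remaining.remove(best)
    return ranked

# Toy usage with hypothetical scores: d2 and d3 are near-duplicates,
# so interference pushes the more novel d1 above d3.
rel = {"d1": 0.6, "d2": 0.8, "d3": 0.75}
sim_table = {("d2", "d3"): 0.9, ("d1", "d2"): 0.1, ("d1", "d3"): 0.1}
sim = lambda a, b: sim_table.get((a, b), sim_table.get((b, a), 0.0))
print(qprp_rank(rel, sim))   # ['d2', 'd1', 'd3']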
Abstract:
An encryption scheme is non-malleable if giving an encryption of a message to an adversary does not increase its chances of producing an encryption of a related message (under a given public key). Fischlin introduced a stronger notion, known as complete non-malleability, which requires attackers to have negligible advantage even if they are allowed to transform the public key under which the related message is encrypted. Ventre and Visconti later proposed a comparison-based definition of this security notion, which is more in line with the well-studied definitions proposed by Bellare et al. The same authors also provided additional feasibility results by proposing two constructions of completely non-malleable schemes: one in the common reference string model using non-interactive zero-knowledge proofs, and another using interactive encryption schemes. Hence, the only previously known completely non-malleable (and non-interactive) scheme in the standard model is quite inefficient, as it relies on the generic NIZK approach; the existence of efficient schemes in the common reference string model was left as an open problem. Recently, two efficient public-key encryption schemes have been proposed by Libert and Yung, and by Barbosa and Farshim, both based on pairing-based identity-based encryption. At ACISP 2011, Sepahi et al. proposed a method to achieve completely non-malleable encryption in the public-key setting using lattices, but no security proof was given for the proposed scheme. In this paper we review that scheme and provide its security proof in the standard model. Our study shows that Sepahi's scheme remains secure even in a post-quantum world, since there are currently no known quantum algorithms for solving lattice problems that perform significantly better than the best known classical (i.e., non-quantum) algorithms.
Abstract:
This article reports on a review of selected theory and practice in sports journalism to determine if the prominence of female journalists reporting the news of a major sporting movement, and industry, the Australian Football League (AFL) could be attributed to a feminist response to the traditional domination of male values in the sports media complex. The article reviews selected literature to establish that, on the evidence presented, male values have traditionally dominated the news. It then considers feminist theory and alternative feminist responses to the domination of male values in the newsroom. Consideration is also given to Australian research on the ‘seriousness’ of sports news and its coverage (or lack thereof) of more ‘feminine’ news values including human interest stories, stories about culture and those on serious social issues. Interviews with a select group of female journalists who write about the AFL for The Age newspaper in Melbourne are recounted, with a focus on the journalists’ work experiences. The article concludes by drawing together the research findings to demonstrate that, although feminine news values are represented in only a small proportion of AFL news stories, there is evidence to suggest they are afforded a high degree of presentational prominence which reflects the needs and expectations of a female audience. It shows that female journalists do play a meaningful role in the AFL media and that, given the evidence presented, a feminist response to the traditional domination of male values in the sports media complex could indeed be applicable, and taking place.
Abstract:
In this paper, we seek to operationalize Amartya Sen's concept of human capability to guide a scholarly investigation of student career choice capability. We begin by outlining factors affecting youth labour markets in Australia, a prosperous country that is affected by a ‘two-speed’ national economy. We then examine recent government initiatives that have been designed to combat youth unemployment and cyclical disadvantage by enhancing the aspirations and career knowledge of secondary school students. We argue that these policy measures are based on four assumptions: first, that career choice capability is a problem of individual agency; second, that the dissemination of career information can empower students to act as ‘consumers’ in an unequal job market; third, that agency is simply a question of will; and finally, that school education and career advice – as a means to freedom in the space of career development – is of equal quality, distribution and value to an increasingly diverse range of upper secondary school students. The paper concludes by outlining a conceptual framework capable of informing an empirical research project that aims to test these assumptions by measuring and comparing differences between groups in the range of freedom to achieve and, therefore, to choose.
Abstract:
This project develops and evaluates a model of curriculum design that aims to assist student learning of foundational disciplinary ‘Threshold Concepts’. The project uses phenomenographic action research, cross-institutional peer collaboration and the Variation Theory of Learning to develop and trial the model. Two contrasting disciplines (Physics and Law) and four institutions (two research-intensive and two universities of technology) were involved in the project, to ensure broad applicability of the model across different disciplines and contexts. The Threshold Concepts that were selected for curriculum design attention were measurement uncertainty in Physics and legal reasoning in Law. Threshold Concepts are key disciplinary concepts that are inherently troublesome, transformative and integrative in nature. Once understood, such concepts transform students’ views of the discipline because they enable students to coherently integrate what were previously seen as unrelated aspects of the subject, providing new ways of thinking about it (Meyer & Land 2003, 2005, 2006; Land et al. 2008). However, the integrative and transformative nature of such threshold concepts makes them inherently difficult for students to learn, with resulting misunderstandings of concepts being prevalent...