556 results for Predicate encryption
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Using the work and ideas of the French theorist Michel Foucault, the writer examines s 3LA of the Crimes Act, which gives law enforcement officers the power to compel a person to reveal their private encryption keys and other personal information, and concludes that such a section creates fear, redirects the flow of power between law enforcement agencies and citizens, and generates resistance.
Abstract:
We provide an abstract command language for real-time programs and outline how a partial correctness semantics can be used to compute execution times. The notions of a timed command, refinement of a timed command, the command traversal condition, and the worst-case and best-case execution time of a command are formally introduced and investigated with the help of an underlying weakest liberal precondition semantics. The central result is a theory for the computation of worst-case and best-case execution times from the underlying semantics based on supremum and infimum calculations. The framework is applied to the analysis of a message transmitter program and its implementation. (c) 2005 Elsevier B.V. All rights reserved.
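As a rough illustration of the sup/inf characterisation the abstract refers to (a generic formulation, not necessarily the paper's exact definitions; here \(\tau(\sigma)\) denotes the duration of an execution \(\sigma\) of a timed command \(C\)):

\[
\mathrm{wcet}(C) = \sup\{\, \tau(\sigma) \mid \sigma \text{ is an execution of } C \text{ satisfying its traversal condition} \,\},
\qquad
\mathrm{bcet}(C) = \inf\{\, \tau(\sigma) \mid \sigma \text{ is an execution of } C \text{ satisfying its traversal condition} \,\}.
\]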
Abstract:
High-level language program compilation strategies can be proven correct by modelling the process as a series of refinement steps from source code to a machine-level description. We show how this can be done for programs containing recursively defined procedures in the well-established predicate transformer semantics for refinement. To do so, the formalism is extended with an abstraction of the way stack frames are created at run time for procedure parameters and variables.
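For orientation, the predicate transformer notion of refinement that such correctness arguments rest on can be sketched as follows (a textbook formulation, not necessarily the paper's own definitions, which additionally involve the stack-frame abstraction):

\[
C \sqsubseteq D \;\iff\; \forall Q.\; \mathrm{wp}(C, Q) \Rightarrow \mathrm{wp}(D, Q),
\]

so a chain of refinement steps \(\mathit{source} \sqsubseteq I_1 \sqsubseteq \dots \sqsubseteq I_k \sqsubseteq \mathit{machine}\) yields \(\mathit{source} \sqsubseteq \mathit{machine}\) by transitivity.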
Abstract:
This paper presents a DES/3DES core that supports cipher block chaining (CBC) and includes a built-in key generator; together these take up about 10% of the resources of a Xilinx Virtex II 1000-4. The core achieves up to 200 Mbit/s of encryption or decryption. Also presented is a network architecture that allows these CBC-capable 3DES cores to perform their processing in parallel.
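For readers unfamiliar with CBC, the chaining the core implements can be sketched as follows; this is a minimal software illustration over a placeholder block cipher, not the hardware design described in the abstract (the XOR-based `block_encrypt` merely stands in for the 3DES rounds). Because each block depends on the previous ciphertext block, a single CBC stream is inherently sequential, which is presumably why the parallelism comes from running independent streams on separate cores.

```python
# Minimal sketch of CBC-mode chaining over an abstract 64-bit block cipher.
# `block_encrypt` is a placeholder standing in for the 3DES core; it is NOT
# the hardware implementation described in the abstract.

def block_encrypt(key: bytes, block: bytes) -> bytes:
    # Placeholder only: a real design would apply the DES/3DES round function here.
    return bytes(b ^ k for b, k in zip(block, key))

def cbc_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
    assert len(plaintext) % 8 == 0, "plaintext must be a multiple of the 64-bit block size"
    prev, out = iv, bytearray()
    for i in range(0, len(plaintext), 8):
        # XOR the plaintext block with the previous ciphertext block, then encrypt.
        block = bytes(p ^ c for p, c in zip(plaintext[i:i + 8], prev))
        prev = block_encrypt(key, block)
        out += prev
    return bytes(out)

ciphertext = cbc_encrypt(b"K" * 8, b"\x00" * 8, b"eight by" * 4)
```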
Abstract:
This thesis is an exploration of several completeness phenomena, in both the constructive and the classical settings. After some introductory chapters in the first part of the thesis, where we outline the background used later on, the constructive part contains a categorical formulation of several constructive completeness theorems available in the literature, presented here in a unified framework. We develop them from a constructive reverse-mathematics viewpoint, highlighting the metatheory used in each case and the strength of the corresponding completeness theorems. The classical part of the thesis focuses on infinitary intuitionistic propositional and predicate logic. We consider a propositional axiomatic system with a special distributivity rule that is enough to prove a completeness theorem, and we introduce weakly compact cardinals as the adequate metatheoretical assumption for this development. Finally, we return to the categorical formulation, focusing this time on infinitary first-order intuitionistic logic. We propose a first-order system with a special rule, transfinite transitivity, that embodies both distributivity and a form of dependent choice, and study the extent to which completeness theorems can be established. We prove completeness using a weakly compact cardinal and, as in the constructive part, we study disjunction-free fragments as well. The assumption of weak compactness is shown to be essential for the completeness theorems to hold.
Abstract:
This monograph proposes a general model for the analysis of polysemy. The underspecified content of polysemic items produces interpretations in relation to characterised contextual indicators. This ternary set of schematic representations is applied to the meaning organisation of the French indefinite qui que ce soit ('any'). Universal positive readings, existential readings, and opposition readings are generated by a modalised predicate, a modalised proposition, and syntactically adjoined functions. "Affective" contexts yield a negative polarity interpretation; this interpretation is generated through concessive reasoning, which explains how scalar values can be evoked by an item representing arbitrary selection. Such contextual reasoning and characterised contextual indicators are the two modes for the calculation of the contextual interpretation of polysemous items.
Abstract:
This book untangles the old grammatical paradox of several negations being allowed within the same negative clause through its treatment of the scope of negation. The scope of each negation over the same predicate is what allows for concordant values. The frequent co-occurrence of negative items, cases of double negation, and the expletive negative, as compared with constituent negation, help to demonstrate this. Analysis of these phenomena is based on a large body of data from different varieties of French, considered in the light of historical, typological, and psycholinguistic tendencies. While extensive reference is made to current analyses, independence is maintained from any particular model. Starting from syntactic generalisations, the work provides an innovative solution to a classic interpretative issue.
Abstract:
The organization of linguistic meaning is animated by the duality between the sense of signs and the reference to the experience of speakers. How the presuppositions communicated by speakers emanate from the conventional value of signs and their cotextual dependencies is explored in this monograph on the scope and focus of negation. Negation can have scope over the predicate of the sequence in which it is used. The body of data brought together shows that a variety of configurations preclude command of the predicate by the negative scoping over it, and that scope is a semantic rather than a structural relation. Scope defines the domain in which an item can be focused by negation. Negative focus depends on the evocation of an alternative value, which may be generated by lexical antonymy, syntactic determination or contextual corrections. The study of the focus and scope of negation on the basis of attested examples from different varieties of French demonstrates how the independently motivated semantic principles of relation to the predicate and reference to an alternative value account for the observed effects.
Abstract:
For the last several years, mobile devices and platform security threats, including those arising from wireless networking technology, have been top security issues. A departure has occurred from the automatic anti-virus software of traditional PC defense: risk management (authentication and encryption), compliance, and disaster recovery in the wake of polymorphic viruses and malware are now the primary activities within many organizations and government services alike. This chapter covers research in Turkey as a reflection of the current market, where e-government officially started in 2008. The case of this emerging country illustrates the current situation and the resistance encountered while engaging with mobile and e-government interfaces. The authors contend that research is needed to understand more precisely the security threats and, above all, the potential solutions for sustaining future intention to use m-government services. Finally, beyond the success or failure of m-government initiatives, the mechanisms related to building public administration mobile technical capacity and addressing security issues are discussed.
Abstract:
Partial information leakage in deterministic public-key cryptosystems refers to a problem that arises when information about either the plaintext or the key is leaked in subtle ways. Quite a common case is where there is only a small number of possible messages that may be sent, so an attacker may be able to crack the scheme simply by enumerating all the possible ciphertexts. Two methods are proposed for addressing the partial information leakage problem in RSA; both incorporate a random element into the encrypted message to increase the number of possible ciphertexts. The resulting scheme is, effectively, an RSA-like cryptosystem which exhibits probabilistic encryption. The first method involves encrypting several similar messages with RSA and then using the Quadratic Residuosity Problem (QRP) to mark the intended one. In this way, an adversary who has correctly guessed two or more of the ciphertexts is still in doubt about which message is the intended one. The cryptographic strength of the combined system is equal to the computational difficulty of factorising a large integer; ideally, this should be infeasible. The second scheme uses error-correcting codes to accommodate the random component. The plaintext is processed with an error-correcting code and deliberately corrupted before encryption. The introduced corruption lies within the error-correcting ability of the code, so as to enable recovery of the original message. The random corruption offers a vast number of possible ciphertexts corresponding to a given plaintext; hence an attacker cannot deduce any useful information from it. The proposed systems are compared with other cryptosystems sharing similar characteristics, in terms of execution time and ciphertext size, so as to determine their practical utility. Finally, the parameters which determine the characteristics of the proposed schemes are also examined.
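A minimal sketch of the second idea (error-correcting code plus deliberate corruption, followed by deterministic RSA) is given below. The toy parameters (p = 61, q = 53, e = 17), the 3-repetition code, and the 3-bit messages are illustrative assumptions chosen so the example runs; they are not the parameters or the code used by the proposed scheme.

```python
import random

# Textbook RSA with toy parameters (assumed for illustration only).
P, Q, E = 61, 53, 17
N = P * Q                            # 3233
D = pow(E, -1, (P - 1) * (Q - 1))    # private exponent

def rep3_encode(m: int, bits: int = 3) -> int:
    """Repeat each plaintext bit three times; rep-3 corrects one flip per group."""
    code = 0
    for i in range(bits):
        code |= (0b111 * ((m >> i) & 1)) << (3 * i)
    return code

def rep3_decode(c: int, bits: int = 3) -> int:
    """Majority-vote each 3-bit group back to a single plaintext bit."""
    m = 0
    for i in range(bits):
        group = (c >> (3 * i)) & 0b111
        m |= (bin(group).count("1") >= 2) << i
    return m

def probabilistic_encrypt(m: int) -> int:
    code = rep3_encode(m)
    # Deliberately flip one random bit: this stays within the code's correction
    # capacity but yields many distinct ciphertexts for the same plaintext.
    code ^= 1 << random.randrange(9)
    return pow(code, E, N)           # deterministic RSA applied to the corrupted codeword

def decrypt(c: int) -> int:
    return rep3_decode(pow(c, D, N))

m = 5                                # any 3-bit message
assert all(decrypt(probabilistic_encrypt(m)) == m for _ in range(20))
```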
Abstract:
A property of sparse representations in relation to their capacity for information storage is discussed. It is shown that this feature can be used for an application that we term Encrypted Image Folding. The proposed procedure is realizable through any suitable transformation. In particular, in this paper we illustrate the approach by recourse to the Discrete Cosine Transform and a combination of redundant Cosine and Dirac dictionaries. The main advantage of the proposed technique is that both storage and encryption can be achieved simultaneously using simple processing steps.
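The sparsity property the abstract builds on can be illustrated with a small sketch: a smooth signal is well approximated by a few large DCT coefficients, leaving most of the representation free to host other information. This is only an illustration of the underlying property (assuming NumPy and SciPy are available), not the Encrypted Image Folding procedure itself.

```python
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)
signal = np.cumsum(rng.standard_normal(256))    # a smooth-ish 1-D stand-in for an image row

coeffs = dct(signal, norm="ortho")
k = 32                                          # keep only the 32 largest-magnitude coefficients
sparse = np.zeros_like(coeffs)
keep = np.argsort(np.abs(coeffs))[-k:]
sparse[keep] = coeffs[keep]

approx = idct(sparse, norm="ortho")
rel_err = np.linalg.norm(signal - approx) / np.linalg.norm(signal)
print(f"{k}/{len(signal)} coefficients kept, relative error ~ {rel_err:.3f}")
```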
Abstract:
The statistical distribution, when determined from an incomplete set of constraints, is shown to be suitable as a host for encrypted information. We design an encoding/decoding scheme that embeds such a distribution with hidden information. The encryption security is based on the extreme instability of the encoding procedure. The essential feature of the proposed system lies in the fact that the key for retrieving the code is generated by random perturbations of very small value. The security of the proposed encryption relies on the secure exchange of the secret key. Hence, it appears to be a good complement to the quantum key distribution protocol. © 2005 Elsevier B.V. All rights reserved.