Abstract:
Basing signature schemes on strong lattice problems has been a long-standing open issue. Today, two families of lattice-based signature schemes are known: those based on the hash-and-sign construction of Gentry et al., and Lyubashevsky’s schemes, which are based on the Fiat-Shamir framework. In this paper we show for the first time how to adapt Lyubashevsky’s schemes to the ring signature setting. In particular, we transform the scheme of ASIACRYPT 2009 into a ring signature scheme with strong security properties in the random oracle model. Anonymity is ensured in the sense that signatures of different users are within negligible statistical distance even under full key exposure. In fact, the scheme satisfies a notion stronger than the classical full key exposure setting: even if the key pair of the signing user is adversarially chosen, the statistical distance between signatures of different users remains negligible. Regarding unforgeability, the best lattice-based ring signature schemes provide either unforgeability against arbitrary chosen-subring attacks or against insider corruption in log-sized rings. In this paper we present two variants of our scheme. In the basic one, unforgeability is ensured in both of those settings. By increasing signature and key sizes by a factor k (typically 80-100), we obtain a variant in which unforgeability is ensured against insider corruption attacks for arbitrary rings. The technique used is quite general and can be adapted to other existing schemes.
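As a point of reference, the anonymity claim can be written out using the standard notion of statistical distance; the notation below is illustrative and not taken from the paper.

% Sketch of the anonymity condition in terms of statistical distance (illustrative notation).
\[
  \Delta\bigl(\Sigma_i,\Sigma_j\bigr)
  \;=\; \tfrac{1}{2}\sum_{\sigma}\Bigl|\Pr[\mathrm{Sign}(sk_i,\mu,R)=\sigma]
  - \Pr[\mathrm{Sign}(sk_j,\mu,R)=\sigma]\Bigr| \;\le\; \mathrm{negl}(n),
\]
for every message $\mu$ and every ring $R$ containing users $i$ and $j$, required to hold even when the key pairs $(pk_i,sk_i)$ and $(pk_j,sk_j)$ are chosen adversarially.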
Abstract:
Proxy re-encryption (PRE) is a highly useful cryptographic primitive whereby Alice and Bob can endow a proxy with the capacity to change ciphertext recipients from Alice to Bob, without the proxy itself being able to decrypt, thereby providing delegation of decryption authority. Key-private PRE (KP-PRE) specifies an additional level of confidentiality, requiring pseudo-random proxy keys that leak no information on the identity of the delegators and delegatees. In this paper, we propose a CPA-secure KP-PRE scheme in the standard model (which we then transform into a CCA-secure scheme in the random oracle model). Both schemes enjoy highly desirable properties such as uni-directionality and multi-hop delegation. Unlike the few prior constructions of PRE and KP-PRE, which typically rely on bilinear maps under ad hoc assumptions, the security of our construction is based on the hardness of the standard Learning-With-Errors (LWE) problem, itself reducible from worst-case lattice problems that are conjectured immune to quantum cryptanalysis, or “post-quantum”. Of independent interest, we further examine the practical hardness of the LWE assumption, using Kannan’s exhaustive search algorithm coupled with pruning techniques. This leads to state-of-the-art parameters not only for our scheme, but also for a number of other LWE-based primitives published in the literature.
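For readers unfamiliar with the assumption, a minimal sketch of what an LWE instance looks like is given below (toy parameters and illustrative names; this is not the paper's scheme or its parameter selection).

# Illustrative generation of LWE samples (a_i, b_i = <a_i, s> + e_i mod q).
# Toy parameters for demonstration only; real schemes use much larger n and q.
import numpy as np

n, q, m, sigma = 16, 3329, 32, 2.0   # dimension, modulus, number of samples, error width

rng = np.random.default_rng(0)
s = rng.integers(0, q, size=n)                         # secret vector
A = rng.integers(0, q, size=(m, n))                    # uniform public matrix
e = np.rint(rng.normal(0, sigma, size=m)).astype(int)  # small Gaussian errors
b = (A @ s + e) % q                                    # noisy inner products

# Search-LWE: given (A, b), recover s.  Decision-LWE: distinguish b from uniform.
print(A.shape, b[:5])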
Abstract:
We revisit the venerable question of access credentials management, which concerns the techniques that we, humans with limited memory, must employ to safeguard our various access keys and tokens in a connected world. Although many existing solutions can be employed to protect a long secret using a short password, those solutions typically require certain assumptions on the distribution of the secret and/or the password, and are helpful against only a subset of the possible attackers. After briefly reviewing a variety of approaches, we propose a user-centric comprehensive model to capture the possible threats posed by online and offline attackers, from the outside and the inside, against the security of both the plaintext and the password. We then propose a few very simple protocols, adapted from the Ford-Kaliski server-assisted password generator and the Boldyreva unique blind signature in particular, that provide the best protection against all kinds of threats, for all distributions of secrets. We also quantify the concrete security of our approach in terms of online and offline password guesses made by outsiders and insiders, in the random-oracle model. The main contribution of this paper lies not in the technical novelty of the proposed solution, but in the identification of the problem and its model. Our results have an immediate and practical application for the real world: they show how to implement single-sign-on stateless roaming authentication for the internet, in an ad hoc, user-driven fashion that requires no change to protocols or infrastructure.
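The blinded-exponentiation pattern behind a Ford-Kaliski-style server-assisted password generator can be sketched as follows; the group, parameter sizes and helper names are illustrative placeholders rather than the protocol as specified in the paper.

# Sketch of blinded server-assisted password hardening (Ford-Kaliski style):
# the hardened secret is H(pw)^k, where k is a server-held exponent; blinding
# hides the password (and the result) from the server.
# The modulus and helper names are illustrative, not a production parameter set.
import hashlib
import math
import secrets

p = 2**255 - 19                 # an illustrative prime modulus
ORDER = p - 1                   # exponent arithmetic is done modulo p - 1

def hash_to_group(pw: bytes) -> int:
    return int.from_bytes(hashlib.sha512(pw).digest(), "big") % p

server_k = secrets.randbelow(ORDER - 1) + 1     # server's long-term secret exponent

def client_blind(pw: bytes):
    while True:                                  # pick an invertible blinding exponent
        r = secrets.randbelow(ORDER - 1) + 1
        if math.gcd(r, ORDER) == 1:
            return pow(hash_to_group(pw), r, p), r

def server_evaluate(blinded: int) -> int:
    return pow(blinded, server_k, p)             # the server never sees pw or H(pw)

def client_unblind(evaluated: int, r: int) -> int:
    return pow(evaluated, pow(r, -1, ORDER), p)  # strips r, leaving H(pw)^k

blinded, r = client_blind(b"correct horse battery staple")
hardened = client_unblind(server_evaluate(blinded), r)
assert hardened == pow(hash_to_group(b"correct horse battery staple"), server_k, p)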
Abstract:
We describe a short signature scheme that is strongly existentially unforgeable under an adaptive chosen message attack in the standard security model. Our construction works in groups equipped with an efficient bilinear map, or, more generally, an algorithm for the Decision Diffie-Hellman problem. The security of our scheme depends on a new intractability assumption we call Strong Diffie-Hellman (SDH), by analogy to the Strong RSA assumption with which it shares many properties. Signature generation in our system is fast and the resulting signatures are as short as DSA signatures for comparable security. We give a tight reduction proving that our scheme is secure in any group in which the SDH assumption holds, without relying on the random oracle model.
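For orientation, the q-SDH assumption and the basic signature it supports can be stated as follows; this is a textbook-style sketch of the weakly secure core, and the full scheme adds a randomizer that is omitted here.

% q-Strong Diffie-Hellman assumption and the basic signature built on it (sketch).
\[
  \text{$q$-SDH: given } \bigl(g,\; g^{x},\; g^{x^{2}},\dots,g^{x^{q}}\bigr)
  \text{ it is hard to output a pair } \bigl(c,\; g^{1/(x+c)}\bigr).
\]
\[
  \text{Basic signature on } m:\quad \sigma = g^{1/(x+m)},\qquad
  \text{verify } e\bigl(\sigma,\; u\cdot g^{m}\bigr) = e(g,g),
  \text{ where } u = g^{x} \text{ is the public key.}
\]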
Abstract:
We consider the following problem: members of a dynamic group retrieve their encrypted data from an untrusted server based on keywords, without any loss of data confidentiality or member privacy. In this paper, we investigate common secure indices for conjunctive keyword-based retrieval over encrypted data, and construct an efficient scheme from the dynamic accumulator of Wang et al., the Nyberg combinatorial accumulator and the public-key encryption system of Kiayias et al. The proposed scheme is trapdoorless and keyword-field free. Security is proved in the random oracle model under the decisional composite residuosity and extended strong RSA assumptions.
Efficient extension of standard Schnorr/RSA signatures into Universal Designated-Verifier Signatures
Abstract:
Universal Designated-Verifier Signature (UDVS) schemes are digital signature schemes with additional functionality which allows any holder of a signature to designate the signature to any desired designated-verifier such that the designated-verifier can verify that the message was signed by the signer, but is unable to convince anyone else of this fact. Since UDVS schemes reduce to standard signatures when no verifier designation is performed, it is natural to ask how to extend the classical Schnorr or RSA signature schemes into UDVS schemes, so that the existing key generation and signing implementation infrastructure for these schemes can be used without modification. We show how this can be efficiently achieved, and provide proofs of security for our schemes in the random oracle model.
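For reference, the underlying Schnorr signature, which the UDVS extension is designed to leave unchanged, is recalled below; the designation mechanism itself is specific to the paper and not reproduced here.

% Standard Schnorr signature (group of prime order q with generator g,
% secret key x, public key y = g^x), as used unmodified by the extension.
\[
  r = g^{k},\quad e = H(m \,\|\, r),\quad s = k + x e \bmod q,
  \qquad \text{signature } (e, s);
\]
\[
  \text{verification: recompute } r' = g^{s} y^{-e} \text{ and accept iff } H(m \,\|\, r') = e.
\]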
Abstract:
A parallel authentication and public-key encryption scheme is introduced and exemplified on joint encryption and signing, which compares favorably with sequential Encrypt-then-Sign (EtS) or Sign-then-Encrypt (StE) schemes as far as both efficiency and security are concerned. A security model for signcryption, and thus for joint encryption and signing, has recently been defined which considers possible attacks and security goals. Such a scheme is considered secure if the encryption part guarantees indistinguishability and the signature part prevents existential forgeries, for outsider as well as insider adversaries. We propose two schemes of parallel signcryption, which are an efficient alternative to Commit-then-Sign-and-Encrypt (Ct&S&E). They are both provably secure in the random oracle model. The first one, called generic parallel encrypt and sign, is secure if the encryption scheme is semantically secure against chosen-ciphertext attacks and the signature scheme prevents existential forgeries against random-message attacks. The second scheme, called optimal parallel encrypt and sign, applies random oracles in a manner similar to the OAEP technique in order to achieve security using encryption and signature components with very weak security requirements: the encryption is only expected to be one-way under chosen-plaintext attacks, while the signature only needs to be secure against universal forgeries under random-plaintext attacks, which is actually the case for both the plain-RSA encryption and signature under the usual RSA assumption. Both proposals are generic in the sense that any suitable encryption and signature schemes (i.e., ones that simply achieve the required security) can be used. Furthermore, they allow parallel encryption and signing, as well as parallel decryption and verification. Properties of parallel encrypt-and-sign schemes are considered, and a new security standard for parallel signcryption is proposed.
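A rough structural sketch of commitment-based parallel encryption and signing is given below; it illustrates the paradigm only, with stand-in primitives, and does not reproduce the paper's actual constructions or their precise security requirements.

# Structural sketch of commitment-based parallel encrypt-and-sign (illustrative only).
# encrypt/sign are placeholders for any suitably secure encryption/signature schemes.
import hashlib
import secrets
from concurrent.futures import ThreadPoolExecutor

def commit(message: bytes):
    opening = secrets.token_bytes(32)                    # de-commitment value
    com = hashlib.sha256(opening + message).digest()     # hash-based commitment
    return com, opening

def parallel_encrypt_and_sign(message, encrypt, sign):
    com, opening = commit(message)
    with ThreadPoolExecutor(max_workers=2) as pool:          # both halves run concurrently
        ct_future = pool.submit(encrypt, opening + message)  # hides the message
        sig_future = pool.submit(sign, com)                  # binds the sender to it
        return com, ct_future.result(), sig_future.result()

# Example with trivial stand-in primitives (NOT secure; structure only).
stream_key = secrets.token_bytes(32)
stub_encrypt = lambda pt: bytes(a ^ b for a, b in zip(pt, hashlib.sha512(stream_key).digest() * 4))
stub_sign = lambda data: hashlib.sha256(b"signing-key" + data).digest()

print(parallel_encrypt_and_sign(b"hello", stub_encrypt, stub_sign))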
Abstract:
Motivated by privacy issues associated with dissemination of signed digital certificates, we define a new type of signature scheme called a ‘Universal Designated-Verifier Signature’ (UDVS). A UDVS scheme can function as a standard publicly-verifiable digital signature but has additional functionality which allows any holder of a signature (not necessarily the signer) to designate the signature to any desired designated-verifier (using the verifier’s public key). Given the designated-signature, the designated-verifier can verify that the message was signed by the signer, but is unable to convince anyone else of this fact. We propose an efficient deterministic UDVS scheme constructed using any bilinear group-pair. Our UDVS scheme functions as a standard Boneh-Lynn-Shacham (BLS) signature when no verifier-designation is performed, and is therefore compatible with the key-generation, signing and verifying algorithms of the BLS scheme. We prove that our UDVS scheme is secure in the sense of our unforgeability and privacy notions for UDVS schemes, under the Bilinear Diffie-Hellman (BDH) assumption for the underlying group-pair, in the random-oracle model. We also demonstrate a general constructive equivalence between a class of unforgeable and unconditionally-private UDVS schemes having unique signatures (which includes the deterministic UDVS schemes) and a class of ID-Based Encryption (IBE) schemes which contains the Boneh-Franklin IBE scheme but not the Cocks IBE scheme.
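For reference, the plain BLS signature that the scheme coincides with when no designation is performed is recalled below; the designation algorithm itself is specific to the paper and not reproduced here.

% Plain BLS signature in a bilinear group pair with pairing e, generator g,
% hash H into the signature group, secret key x and public key y = g^x.
\[
  \sigma = H(m)^{x}, \qquad \text{verify: } e\bigl(\sigma, g\bigr) = e\bigl(H(m), y\bigr).
\]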
Abstract:
Preneel, Govaerts and Vandewalle (PGV) analysed the security of single-block-length block cipher based compression functions assuming that the underlying block cipher has no weaknesses. They showed that 12 out of 64 possible compression functions are collision and (second) preimage resistant. Black, Rogaway and Shrimpton formally proved this result in the ideal cipher model. However, in the indifferentiability security framework introduced by Maurer, Renner and Holenstein, all these 12 schemes are easily differentiable from a fixed input-length random oracle (FIL-RO) even when their underlying block cipher is ideal. We address the problem of building indifferentiable compression functions from the PGV compression functions. We consider a general form of 64 PGV compression functions and replace the linear feed-forward operation in this generic PGV compression function with an ideal block cipher independent of the one used in the generic PGV construction. This modified construction is called a generic modified PGV (MPGV). We analyse indifferentiability of the generic MPGV construction in the ideal cipher model and show that 12 out of 64 MPGV compression functions in this framework are indifferentiable from a FIL-RO. To our knowledge, this is the first result showing that two independent block ciphers are sufficient to design indifferentiable single-block-length compression functions.
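As a concrete example, one of the twelve collision-resistant PGV modes, Davies-Meyer, is sketched below in Python (assuming the pycryptodome package); the MPGV modification, which replaces the XOR feed-forward with a second independent block cipher, is only indicated in a comment, not implemented.

# Davies-Meyer, one of the 12 provably collision-resistant PGV compression functions:
#   f(h, m) = E_m(h) XOR h   (the block cipher is keyed by the message block).
# The MPGV idea discussed above would replace the "XOR h" feed-forward with a second,
# independent block cipher. Requires pycryptodome (pip install pycryptodome).
from Crypto.Cipher import AES

BLOCK = 16  # AES block/key size in bytes (single-block-length construction)

def davies_meyer(h: bytes, m: bytes) -> bytes:
    assert len(h) == BLOCK and len(m) == BLOCK
    ct = AES.new(m, AES.MODE_ECB).encrypt(h)       # E_m(h)
    return bytes(a ^ b for a, b in zip(ct, h))     # feed-forward: XOR with h

h = bytes(BLOCK)                                    # toy IV of all zeros
for block in [b"A" * BLOCK, b"B" * BLOCK]:          # iterate over message blocks
    h = davies_meyer(h, block)
print(h.hex())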
The suffix-free-prefix-free hash function construction and its indifferentiability security analysis
Abstract:
In this paper, we observe that in the seminal work on indifferentiability analysis of iterated hash functions by Coron et al. and in subsequent works, the initial value (IV) of hash functions is fixed. In addition, these indifferentiability results do not depend on the Merkle–Damgård (MD) strengthening in the padding functionality of the hash functions. We propose a generic n-bit-iterated hash function framework based on an n-bit compression function called suffix-free-prefix-free (SFPF) that works for arbitrary IVs and does not possess MD strengthening. We formally prove that SFPF is indifferentiable from a random oracle (RO) when the compression function is viewed as a fixed input-length random oracle (FIL-RO). We show that some hash function constructions proposed in the literature fit in the SFPF framework while others that do not fit in this framework are not indifferentiable from an RO. We also show that the SFPF hash function framework with the provision of MD strengthening generalizes any n-bit-iterated hash function based on an n-bit compression function and with an n-bit chaining value that is proven indifferentiable from an RO.
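To make the baseline concrete, a plain Merkle-Damgård iteration with MD strengthening is sketched below, with the FIL compression function modeled by a stand-in; this illustrates the construction that SFPF departs from (arbitrary IV, no strengthening), not the SFPF construction itself.

# Plain Merkle-Damgard iteration with MD strengthening (message length in the padding).
# The compression function is a stand-in for a fixed-input-length primitive.
import hashlib

BLOCK = 32  # block size in bytes (illustrative)

def compress(chaining: bytes, block: bytes) -> bytes:
    # Stand-in FIL compression function: 2n bits -> n bits.
    return hashlib.sha256(chaining + block).digest()

def md_pad(msg: bytes) -> bytes:
    # Append a 1-bit (0x80), zero fill, then the 8-byte message length: MD strengthening.
    length = len(msg).to_bytes(8, "big")
    pad_len = (-(len(msg) + 1 + 8)) % BLOCK
    return msg + b"\x80" + b"\x00" * pad_len + length

def md_hash(msg: bytes, iv: bytes = bytes(32)) -> bytes:
    h = iv
    padded = md_pad(msg)
    for i in range(0, len(padded), BLOCK):
        h = compress(h, padded[i:i + BLOCK])
    return h

print(md_hash(b"abc").hex())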
Abstract:
What are the musical features that turn a song into a hit? The aim of this research is to explore the musical features of hit tunes by studying the 224 most popular Finnish evergreens from the 1930s to the 1990s. It is remarkable that 80-90% of Finnish oldies are in a minor key, though parallel major keys have also been widely employed within single pieces through, for example, modulations. Furthermore, melodies are usually diatonic, staying mostly in the same key. Consequently, chromatically altered tones in the melody and short modulations in the bridge sections become more prominent. I have concentrated in particular on the melodic lines in order to find the most typical melodic formulas in the data. These analyzed melodic formulas play an important role, because they serve as leading phrases and punchlines in songs. Analysis has revealed three major melodic formulas, which most often appear in the melodic lines of hit tunes. All of these formulas share common thematic ground, because they originate from the triadic tonic chord. Because the tonic chord is the most conventional opening chord in the verse parts, it is logical that these formulas occur most often in verses. The strong dominance of these formulas is very much a result of the rhythmic flexibility they possess; for instance, they can be found in every musical style from waltz to foxtrot. Alongside the major formulas lies a miscellaneous group of other tonic-related melodic formulas. One group of melodic formulas consists of melodic quotations. These quotations appear in a different musical context, for instance in a harmonically altered form, and are therefore often difficult to recognize as such. Yet despite the contextual manipulation, the distinctive character of the cited melody usually remains the same. Composers have also made use of certain popular chord progressions in order to create new but familiar-sounding melodies. The most important individual progression in this case is what is known as the "circle of fifths" and its shortened, prolonged and altered versions. Because that progression is harmonically strong, it is also a contrastive tool used especially in chorus parts and middle sections (AABA). I have also paid attention to ragtime and jazz influences, which can be found in harmony parts and in certain melody notes that extend, suspend or alter the accompanying chords. Other influences from jazz and ragtime in the Finnish evergreen are evident in the use of typical Tin Pan Alley popular song forms. The most important is the AABA form, which dominates the data along with the verse/chorus-type popular song form. To briefly illustrate the main results, the basic concept of the hit tune can be traced back to Tin Pan Alley songs, whereas the major stylistic aspects, such as minor keys and musical styles, bear influences from Russian, Western European, and Finnish traditions.
Abstract:
The aim of the study was to explore why the MuPSiNet project, a computer- and network-supported learning environment for the field of health care and social work, did not develop as expected. To grasp the problem, some hypotheses were formulated. The hypotheses concerned the teachers' skills in and attitudes towards computing and their attitudes towards constructivist study methods. An online survey containing 48 items was performed. The survey targeted all the teachers within the field of health care and social work in the country, and it produced 461 responses that were analysed against the hypotheses. The reliability of the variables was tested using the Cronbach alpha coefficient and t-tests. Poor basic computing skills among the teachers combined with a vulnerable technical solution, and inadequate project management combined with a lack of administrative models for transforming economic resources into manpower, were the factors that turned out to play a decisive role in the project. Other important findings were that the teachers had rather poor skills and knowledge in computing, computer safety and computer-supported instruction, and that these skills were significantly poorer among female teachers, who were in the majority in the sample. The fraction of teachers who were familiar with software for electronic patient records (EPR) was low. The attitudes towards constructivist teaching methods were positive, and further education seemed to considerably increase the teachers' readiness to use alternative teaching methods. The most important conclusions were the following: In order to integrate EPR software as a natural tool in teaching planning and documenting health care, it is crucial that the teachers have sufficient basic skills in computing and that more teachers have personal experience of using EPR software. In order for computer-supported teaching to become accepted, it is necessary to arrange extensive further education for the teachers presently working, and for that further education to succeed it should be backed up locally, among other things, by sufficient support in matters concerning computer-supported teaching. The attitudes towards computing showed significant gender differences. Based on the findings, it is suggested that basic skills in computing should also include an awareness of data safety in relation to work in different kinds of computer networks, and that projects of this kind should be built around a proper project organisation with sufficient resources. Suggestions concerning curricular development and further education are also presented. Conclusions concerning the research method were that reminders have a greater effect, and that respondents tend to answer open-ended questions more verbosely, in electronically distributed online surveys compared to traditional surveys. A method of utilising randomized passwords to guarantee respondent anonymity while maintaining sample control is presented. Keywords: computer-assisted learning, computer-assisted instruction, health care, social work, vocational education, computerized patient record, online survey
Abstract:
A smooth map is said to be stable if small perturbations of the map only differ from the original one by a smooth change of coordinates. Smoothly stable maps are generic among the proper maps between given source and target manifolds when the source and target dimensions belong to the so-called nice dimensions, but outside this range of dimensions, smooth maps cannot generally be approximated by stable maps. This leads to the definition of topologically stable maps, where the smooth coordinate changes are replaced with homeomorphisms. The topologically stable maps are generic among proper maps for any dimensions of source and target. The purpose of this thesis is to investigate methods for proving topological stability by constructing extremely tame (E-tame) retractions onto the map in question from one of its smoothly stable unfoldings. In particular, we investigate how to use E-tame retractions from stable unfoldings to find topologically ministable unfoldings for certain weighted homogeneous maps or germs. Our first results are concerned with the construction of E-tame retractions and their relation to topological stability. We study how to construct the E-tame retractions from partial or local information, and these results form our toolbox for the main constructions. In the next chapter we study the group of right-left equivalences leaving a given multigerm f invariant, and show that when the multigerm is finitely determined, the group has a maximal compact subgroup and that the corresponding quotient is contractible. This means, essentially, that the group can be replaced with a compact Lie group of symmetries without much loss of information. We also show how to split the group into a product whose components only depend on the monogerm components of f. In the final chapter we investigate representatives of the E- and Z-series of singularities, discuss their instability and use our tools to construct E-tame retractions for some of them. The construction is based on describing the geometry of the set of points where the map is not smoothly stable, discovering that by using induction and our constructional tools, we already know how to construct local E-tame retractions along the set. The local solutions can then be glued together using our knowledge about the symmetry group of the local germs. We also discuss how to generalize our method to the whole E- and Z-series.
Abstract:
Free and Open Source Software (FOSS) has gained increased interest in the computer software industry, but assessing its quality remains a challenge. FOSS development is frequently carried out by globally distributed development teams, and all stages of development are publicly visible. Several product- and process-level quality factors can be measured using the public data. This thesis presents a theoretical background for software quality and metrics and their application in a FOSS environment. The information available from FOSS projects in three information spaces is presented, and a quality model suitable for use in a FOSS context is constructed. The model includes both process and product quality metrics, and takes into account the tools and working methods commonly used in FOSS projects. A subset of the constructed quality model is applied to three FOSS projects, highlighting both theoretical and practical concerns in implementing automatic metric collection and analysis. The experiment shows that useful quality information can be extracted from the vast amount of data available. In particular, projects vary in their growth rate, complexity, modularity and team structure.
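As a toy illustration of extracting a process-level metric from public project data, the snippet below counts commits per author in a local git clone; it is not the quality model or the tooling used in the thesis, and the repository path is a placeholder.

# Toy process-metric extraction from a public FOSS repository: commits per author,
# a rough proxy for team structure. Assumes `git` is installed and repo_path is a local clone.
import subprocess
from collections import Counter

def commits_per_author(repo_path: str) -> Counter:
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--pretty=format:%ae"],
        capture_output=True, text=True, check=True,
    )
    return Counter(log.stdout.splitlines())

for author, count in commits_per_author(".").most_common(5):
    print(f"{count:6d}  {author}")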
Abstract:
Free software is viewed as a revolutionary and subversive practice, and in particular has dealt a strong blow to the traditional conception of intellectual property law (although in its current form it could be considered a 'hack' of IP rights). However, other (capitalist) areas of law have been swift to embrace free software, or at least to incorporate it into their own tenets. One area in particular is that of competition (antitrust) law, which has itself long been in theoretical conflict with intellectual property, due to the restriction on competition inherent in the grant of ‘monopoly’ rights by copyrights, patents and trademarks. This contribution examines how competition law has approached free software by considering instances in which courts have had to deal with such initiatives, for instance in the Oracle/Sun Microsystems merger, and the implications that these decisions have for free software initiatives. The presence or absence of corporate involvement in initiatives will be an important factor in this investigation, with it being posited that true instances of ‘commons-based peer production’ can still subvert the capitalist system, including perplexing its laws beyond intellectual property.