953 results for metadata schemes


Relevance:

20.00%

Publisher:

Abstract:

Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection for messages. It is more efficient than the two-step process of encrypting a message for confidentiality and, in a separate pass, generating a Message Authentication Code (MAC) for integrity. AE using symmetric ciphers can be provided either by stream ciphers with built-in authentication mechanisms or by block ciphers using appropriate modes of operation. Stream ciphers, however, have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers, which makes them suitable for resource-constrained environments where storage and computational power are limited. Several recent stream cipher proposals claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for analysing these two properties of AE stream ciphers simultaneously.

This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms, examining the mechanisms for providing both confidentiality and integrity. Greater emphasis is placed on the analysis of the integrity mechanisms, as little appears in the public literature on these in the context of authenticated encryption. The thesis makes four main contributions.

The first contribution is a framework for classifying AE stream ciphers according to three characteristics. The first classification applies Bellare and Namprempre's work on the order in which the encryption and authentication processes take place. The second is based on the method used for accumulating the input message (either directly or indirectly) into the internal state of the cipher to generate a MAC. The third is based on whether the sequence used to provide encryption and authentication is generated using a single key and initial vector, or two keys and two initial vectors.

The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, SSS and ZUC. The method treats the nonlinear filter (NLF) of these ciphers as a combiner with memory, enabling us to construct equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both ciphers are secure against this type of algebraic attack, and conclude that using a key-dependent S-box in the NLF twice, and using two different S-boxes in the NLF of ZUC, prevents it.

The third contribution is a new general matrix-based model for MAC generation in which the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers, SSS, NLSv2 and SOBER-128, can be considered as instances of this model, which is more general than previous investigations into direct injection. Possible forgery attacks against the model are investigated, and it is shown that using a nonlinear filter in the accumulation process prevents collision-based forgery attacks when either the input message or the initial state of the register is unknown.

The last contribution is a new general matrix-based model for MAC generation in which the input message is injected indirectly into the internal state, acting as a controller that accumulates a keystream sequence into an accumulation register. We show that three current AE stream ciphers, ZUC, Grain-128a and Sfinks, can be considered as instances of this model, and we establish the conditions under which it is susceptible to forgery and side-channel attacks.
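
As a hedged, invented toy (not any of the ciphers analysed in the thesis), the sketch below illustrates the "direct injection" idea behind the third contribution: message bits are XORed straight into the internal state of a register, and a nonlinear filter of the state is accumulated into the tag. All parameters, taps and the filter function are made up for the example.

```python
# Toy illustration of "direct injection" MAC accumulation: message bits are
# XORed into the internal state of a (here, linear-feedback) register, and the
# filtered register contents are folded into an accumulation register that
# yields the tag. Everything here is invented for the sketch; real ciphers
# such as SSS or SOBER-128 are far more involved.

def lfsr_step(state: int) -> int:
    """One step of a toy 16-bit LFSR (taps chosen arbitrarily)."""
    fb = ((state >> 15) ^ (state >> 13) ^ (state >> 12) ^ (state >> 10)) & 1
    return ((state << 1) | fb) & 0xFFFF

def nonlinear_filter(state: int) -> int:
    """Toy nonlinear filter: mixes the state bytes multiplicatively (not secure)."""
    hi, lo = state >> 8, state & 0xFF
    return (hi * lo + hi + lo) & 0xFF

def direct_injection_mac(key: int, message_bits: list[int]) -> int:
    state = key & 0xFFFF                    # initial state derived from the key
    acc = 0                                 # accumulation register
    for bit in message_bits:
        state ^= bit                        # inject the message bit directly into the state
        state = lfsr_step(state)
        acc ^= nonlinear_filter(state)      # accumulate filtered state into the tag
    return acc

tag = direct_injection_mac(0xBEEF, [1, 0, 1, 1, 0, 0, 1, 0])
print(f"tag = {tag:#04x}")
```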

Relevance:

20.00%

Publisher:

Abstract:

Review of the book 'Access to East European and Eurasian culture: publishing, acquisitions, digitization, metadata', edited by Miranda Remnek, published by Haworth Information Press, 2009.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present three counterfeiting attacks on block-wise dependent fragile watermarking schemes. We consider vulnerabilities such as the exploitation of weak correlation among block-wise dependent watermarks to modify valid watermarked (medical or other digital) images so that they are still verified as authentic, even though they are not. Experimental results demonstrate the practicality and consequences of the proposed attacks on several relevant schemes. The proposed attack models can be used as a means to systematically examine the security levels of similar watermarking schemes.
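
To make the vulnerability concrete, here is a minimal invented toy model (not any of the schemes attacked in the paper): each block carries a short, keyless checksum of its neighbour, so any forged block whose checksum collides still passes verification. All names and the checksum are hypothetical.

```python
# Toy block-wise dependent fragile watermark: each block stores an 8-bit
# checksum of its left neighbour (circularly). Because the dependency is short
# and keyless, an attacker can substitute any block with a colliding checksum
# and the image still "verifies". Invented purely for illustration.

def checksum(block: bytes) -> int:
    return sum(block) & 0xFF              # deliberately weak dependency

def embed(blocks: list[bytes]) -> list[tuple[bytes, int]]:
    """Attach to each block the checksum of the previous block."""
    return [(b, checksum(blocks[i - 1])) for i, b in enumerate(blocks)]

def verify(marked: list[tuple[bytes, int]]) -> bool:
    return all(mark == checksum(marked[i - 1][0])
               for i, (_, mark) in enumerate(marked))

original = [b"roof", b"door", b"wall"]
marked = embed(original)
assert verify(marked)

# Counterfeit: replace "door" with a block whose checksum collides.
forged = b"oodr"                          # same byte multiset -> same sum
assert checksum(forged) == checksum(b"door")
marked[1] = (forged, marked[1][1])
print("forged image verifies:", verify(marked))   # True: attack succeeds
```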

Relevance:

20.00%

Publisher:

Abstract:

Basing signature schemes on strong lattice problems has been a long-standing open issue. Today, two families of lattice-based signature schemes are known: those based on the hash-and-sign construction of Gentry et al., and Lyubashevsky's schemes, which are based on the Fiat-Shamir framework. In this paper we show for the first time how to adapt Lyubashevsky's schemes to the ring signature setting. In particular, we transform the scheme of ASIACRYPT 2009 into a ring signature scheme that provides strong security properties in the random oracle model. Anonymity is ensured in the sense that signatures of different users are within negligible statistical distance even under full key exposure. In fact, the scheme satisfies a notion stronger than the classical full key exposure setting: even if the key pair of the signing user is adversarially chosen, the statistical distance between signatures of different users remains negligible. Regarding unforgeability, the best lattice-based ring signature schemes provide unforgeability either against arbitrary chosen-subring attacks or against insider corruption in log-sized rings. We present two variants of our scheme: in the basic one, unforgeability is ensured in both of those settings; by increasing signature and key sizes by a factor k (typically 80–100), we obtain a variant in which unforgeability is ensured against insider corruption attacks for arbitrary rings. The technique used is quite general and can be adapted to other existing schemes.

Relevance:

20.00%

Publisher:

Abstract:

Multiple-time signatures are digital signature schemes in which the signer can sign a predetermined number of messages. They are interesting cryptographic primitives because they make it possible to solve many important cryptographic problems while offering a substantial efficiency advantage over ordinary digital signature schemes such as RSA. Multiple-time signature schemes have found numerous applications in ordinary, on-line/off-line and forward-secure signatures, and in multicast/stream authentication. We propose a multiple-time signature scheme with very efficient signing and verifying. Our construction is based on a combination of one-way functions and cover-free families, and it is secure against the adaptive chosen-message attack.
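
For background, and not as the proposed scheme, the sketch below is the classic Lamport one-time signature built from a one-way function alone (here SHA-256); the cover-free-family approach in the abstract extends this style of construction so that more than one message can be signed safely.

```python
# Baseline sketch: a Lamport one-time signature from a one-way function.
# Signing reveals half of the secret key, so each key pair must be used once;
# the abstract's scheme uses cover-free families to get multiple signings.

import hashlib
import secrets

H = lambda x: hashlib.sha256(x).digest()
BITS = 256  # we sign the SHA-256 digest of the message

def keygen():
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(BITS)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def message_bits(msg: bytes):
    digest = int.from_bytes(H(msg), "big")
    return [(digest >> i) & 1 for i in range(BITS)]

def sign(sk, msg: bytes):
    return [pair[bit] for pair, bit in zip(sk, message_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(s) == pair[bit]
               for pair, bit, s in zip(pk, message_bits(msg), sig))

sk, pk = keygen()
sig = sign(sk, b"hello")
print(verify(pk, b"hello", sig))     # True
print(verify(pk, b"tampered", sig))  # False (with overwhelming probability)
```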

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this paper is to describe a new decomposition construction for perfect secret sharing schemes with graph access structures. The previous decomposition construction proposed by Stinson is a recursive method that uses small secret sharing schemes as building blocks in the construction of larger schemes. When Stinson's method is applied to graph access structures, the number of such "small" schemes is typically exponential in the number of participants, resulting in an exponential algorithm. Our method has the same flavour as Stinson's decomposition construction; however, the linear programming problem involved is formulated so that the number of "small" schemes is polynomial in the number of participants, which in turn gives rise to a polynomial-time construction. We also show that if Stinson's construction is applied to the "small" schemes arising from our new construction, both constructions yield the same information rate.
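
As a minimal, invented illustration of the building-block idea behind decomposition constructions (not the paper's LP-based method), the sketch below handles each edge of a graph access structure with its own tiny 2-out-of-2 XOR scheme; a participant's share is the collection of pieces from every edge it touches. Real decompositions cover many edges per subscheme to improve the information rate.

```python
# Each edge of the graph access structure gets an independent 2-out-of-2
# additive scheme over bytes (XOR). Exactly the pairs joined by an edge can
# recover the secret; non-adjacent pairs learn nothing. Invented toy only.

import secrets

def share_graph(secret: int, edges: list[tuple[str, str]]):
    """Return {participant: {edge: piece}} for a byte-valued secret."""
    shares: dict[str, dict[tuple[str, str], int]] = {}
    for (u, v) in edges:
        r = secrets.randbelow(256)
        shares.setdefault(u, {})[(u, v)] = r
        shares.setdefault(v, {})[(u, v)] = secret ^ r
    return shares

def reconstruct(shares, u: str, v: str):
    for edge in shares.get(u, {}):
        if edge in shares.get(v, {}):
            return shares[u][edge] ^ shares[v][edge]
    return None  # (u, v) is not a qualified pair

edges = [("A", "B"), ("B", "C")]        # path graph: AB and BC are qualified
s = share_graph(0x5A, edges)
print(hex(reconstruct(s, "A", "B")))    # 0x5a
print(reconstruct(s, "A", "C"))         # None: A and C together learn nothing
```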

Relevance:

20.00%

Publisher:

Abstract:

We consider the problem of increasing the threshold parameter of a secret-sharing scheme after the setup (share distribution) phase, without further communication between the dealer and the shareholders. Previous solutions to this problem require either starting with a nonstandard scheme designed specifically for this purpose, or communication between shareholders. In contrast, we show how to increase the threshold parameter of the standard Shamir secret-sharing scheme without communication between the shareholders. Our technique can thus be applied to existing Shamir schemes even if they were set up without future threshold increases in mind. Our method is a new positive cryptographic application of lattice reduction algorithms, inspired by recent work on lattice-based list decoding of Reed-Solomon codes with noise bounded in the Lee norm. We use fundamental results from the theory of lattices (geometry of numbers) to prove quantitative statements about the information-theoretic security of our construction. These lattice-based security proof techniques may be of independent interest.
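
For context, here is a minimal sketch of the standard Shamir (t, n) scheme that the abstract's technique takes as its starting point; the lattice-based threshold-increase step itself is not reproduced. The field size and function names are illustrative choices, not the paper's.

```python
# Standard Shamir (t, n) secret sharing over a prime field. Shares are points
# (i, f(i)) of a random degree-(t-1) polynomial with f(0) = secret; any t
# shares determine f, and hence the secret, by Lagrange interpolation.

import secrets

P = 2**127 - 1  # a Mersenne prime, large enough for a toy demo

def make_shares(secret: int, t: int, n: int):
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    f = lambda x: sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(i, f(i)) for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(42, t=3, n=5)
print(reconstruct(shares[:3]))  # 42: any 3 of the 5 shares suffice
```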

Relevance:

20.00%

Publisher:

Abstract:

Enterprise resource planning (ERP) systems are rapidly being combined with "big data" analytics processes and publicly available "open data sets", usually from outside the enterprise, to expand activity through better service to current clients and the identification of new opportunities. These activities are now largely based on software systems hosted in a "cloud computing" environment. The over-fifty-year-old phrase "garbage in, garbage out" (GIGO) describes the problem of unqualified and unquestioning dependency on information systems. A more pertinent interpretation arose later, namely "garbage in, gospel out": with large-scale information systems built around ERP, open data sets and "big data" analytics, particularly in a cloud environment, verifying the authenticity and integrity of the data sets used may be almost impossible. This can easily result in decision making based on questionable, unverifiable results. Illicit "impersonation" of, and modifications to, legitimate data sets may become a reality, while the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. This paper discusses the pressing need to enhance identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment. Some appropriate technologies currently on offer are also examined. However, severe limitations in addressing the identified problems are found, and the paper proposes further necessary research for the area. (Note: this paper is based on an earlier unpublished paper/presentation, "Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a 'Cloud' Computing Environment: A Review and Proposal", presented to the Department of Accounting and IT, College of Management, National Chung Cheng University, 20 November 2013.)

Relevance:

20.00%

Publisher:

Abstract:

A multi-secret sharing scheme allows several secrets to be shared amongst a group of participants. In 2005, Shao and Cao developed a verifiable multi-secret sharing scheme in which each participant's share can be used several times, reducing the number of interactions between the dealer and the group members. In addition, some secrets may require a higher security level than others, calling for different threshold values. Recently, Chan and Chang designed such a scheme, but their construction allows only a single secret to be shared per threshold value. In this article we combine the two approaches to design a multiple-time verifiable multi-secret sharing scheme in which several secrets can be shared for each threshold value. Since running time is an important factor for practical applications, we also provide a complexity comparison of our combined approach with the previous schemes.

Relevance:

20.00%

Publisher:

Abstract:

The construction and operation of infrastructure assets can have significant impacts on society and the region. A sustainability assessment framework can be an effective means of building sustainability aspects into the design, construction and operation of infrastructure assets. Conventional evaluation processes and procedures for infrastructure projects do not necessarily measure the qualitative/quantitative effectiveness of all aspects of sustainability: environment, social wellbeing and economy. As a result, several infrastructure sustainability rating schemes have been developed to assess the level of sustainability attained in infrastructure projects. These include Infrastructure Sustainability (Australia), CEEQUAL (UK) and Envision (USA); road-sector-specific schemes such as Greenroads (USA) and Invest (Australia) have also been developed. These schemes address several aspects of sustainability with varying emphasis (weightings) on areas such as use of resources; emissions, pollution and waste; ecology; people and place; management and governance; and innovation. The sustainability attained by an infrastructure project depends largely on addressing whole-of-life environmental issues. This study analyses the rating schemes' coverage of different environmental components of road infrastructure across the five phases of a project (material, construction, use, maintenance and end-of-life), based on a comprehensive life cycle assessment (LCA) system boundary. The findings indicate that, to deliver sustainable road projects, the schemes need to consider key (high-impact) life cycle environmental components such as traffic congestion during construction, rolling resistance due to surface roughness and structural stiffness of the pavement, albedo, lighting, and end-of-life management (recycling).

Relevance:

20.00%

Publisher:

Abstract:

QUT Software Finder (https://researchdatafinder.qut.edu.au/scf) is a searchable repository of metadata describing software and source code created as a result of QUT research activities. It was launched in December 2013. The registry was designed to aid the discovery and visibility of QUT research outputs and to encourage sharing and re-use of code and software throughout the research community, both nationally and internationally. The repository platform used is VIVO, an open source product initially developed at Cornell University. QUT Software Finder records describing software or code are connected to information about the researchers involved, their research groups, related publications and related projects. Links to where the software or code can be accessed are provided, alongside licensing and re-use information.

Relevance:

20.00%

Publisher:

Abstract:

Digital forensics concerns the analysis of electronic artifacts to reconstruct events such as cyber crimes. This research produced a framework that supports forensic analyses by identifying associations in digital evidence using metadata. It showed that metadata-based associations can help uncover the inherent relationships between heterogeneous digital artifacts, aiding the reconstruction of past events by identifying artifact dependencies and time sequencing. It also showed that metadata-association-based analysis is amenable to automation, by virtue of the ubiquitous nature of metadata across forensic disk images, files, system and application logs, and network packet captures. The results demonstrate that metadata-based associations can be used to extract meaningful relationships between digital artifacts, potentially benefiting real-life forensic investigations.
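
As one hedged illustration of the kind of metadata-based association such a framework can automate (not the framework itself, which spans disk images, logs and packet captures), the sketch below clusters files whose filesystem modification times fall within a common window, a crude form of time sequencing. The path and window size are invented for the example.

```python
# Cluster files by modification-time proximity: artifacts modified within
# WINDOW seconds of each other are grouped as a candidate association.

from pathlib import Path

WINDOW = 300  # seconds: files modified within 5 minutes are associated

def mtime_clusters(root: str):
    files = sorted(
        ((p, p.stat().st_mtime) for p in Path(root).rglob("*") if p.is_file()),
        key=lambda item: item[1],
    )
    clusters, current = [], []
    for path, mtime in files:
        if current and mtime - current[-1][1] > WINDOW:
            clusters.append(current)   # gap too large: close the cluster
            current = []
        current.append((path, mtime))
    if current:
        clusters.append(current)
    return clusters

for cluster in mtime_clusters("."):
    if len(cluster) > 1:               # only multi-artifact associations are interesting
        print([str(p) for p, _ in cluster])
```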

Relevance:

20.00%

Publisher:

Abstract:

We characterise ideal threshold schemes from different approaches. Since the characteristic properties are independent of any particular description of threshold schemes, all ideal threshold schemes can be examined from new points of view, and new results on ideal threshold schemes can be discovered.

Relevance:

20.00%

Publisher:

Abstract:

This work presents a new method for the design of ideal secret sharing schemes. The method uses regular mappings, which are well suited to the construction of perfect secret sharing. Restricting the regular mappings to permutations gives a convenient tool for investigating the relation between permutations and the ideal secret sharing schemes they generate.

Relevance:

20.00%

Publisher:

Abstract:

We observe that MDS codes have interesting properties that can be used to construct ideal threshold schemes. These schemes permit the combiner to detect cheating, identify cheaters and recover the correct secret. The construction is then generalised so that the resulting secret sharing is resistant against Tompa-Woll cheating.
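
Shamir shares form a Reed-Solomon codeword, which is an MDS code, so the redundancy view can be shown with a toy consistency check: reusing make_shares and reconstruct from the Shamir sketch above, one share beyond the threshold already lets the combiner detect a tampered share. Identifying which share is bad and still recovering the secret, as in the abstract's construction, needs more redundancy than this sketch provides.

```python
# Toy cheating detection via MDS redundancy: with more than t shares, every
# t-subset must reconstruct the same secret; any tampered share breaks this.
# Assumes make_shares, reconstruct and P from the Shamir sketch above.

from itertools import combinations

def detect_cheating(shares, t: int) -> bool:
    """Return True iff all t-subsets of the shares agree on the secret."""
    candidates = {reconstruct(list(sub)) for sub in combinations(shares, t)}
    return len(candidates) == 1

shares = make_shares(42, t=3, n=5)
print(detect_cheating(shares[:4], t=3))        # True: shares are consistent

x0, y0 = shares[0]
tampered = [(x0, (y0 + 1) % P)] + shares[1:4]  # one cheater alters a share
print(detect_cheating(tampered, t=3))          # False: cheating detected
```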