858 results for COMPUTER SCIENCE, THEORY
Abstract:
This book constitutes the refereed proceedings of the 11th International Conference on Cryptology and Network Security, CANS 2012, held in Darmstadt, Germany, in December 2012. The 22 revised full papers presented were carefully reviewed and selected from 99 submissions. The papers are organized in topical sections on cryptanalysis; network security; cryptographic protocols; encryption; and s-box theory.
Abstract:
In a paper published at FSE 2007, a way of obtaining near-collisions, and in theory also collisions, for the FORK-256 hash function was presented [8]. The paper contained examples of near-collisions for the compression function, but in practice the attack could not be extended to the full function due to large memory requirements and computation time. In this paper we improve the attack and show that it is possible to find near-collisions in practice for any given value of IV. In particular, this means that the full hash function with the prespecified IV is vulnerable in practice, not just in theory. We exhibit an example of a near-collision for the complete hash function.
Abstract:
We consider the problem of increasing the threshold parameter of a secret-sharing scheme after the setup (share distribution) phase, without further communication between the dealer and the shareholders. Previous solutions to this problem require one to start off with a non-standard scheme designed specifically for this purpose, or to have communication between shareholders. In contrast, we show how to increase the threshold parameter of the standard Shamir secret-sharing scheme without communication between the shareholders. Our technique can thus be applied to existing Shamir schemes even if they were set up without consideration to future threshold increases. Our method is a new positive cryptographic application for lattice reduction algorithms, inspired by recent work on lattice-based list decoding of Reed-Solomon codes with noise bounded in the Lee norm. We use fundamental results from the theory of lattices (Geometry of Numbers) to prove quantitative statements about the information-theoretic security of our construction. These lattice-based security proof techniques may be of independent interest.
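The standard Shamir scheme whose threshold the abstract's technique raises can be sketched in a few lines. The prime field, threshold, and parameters below are illustrative assumptions, not values from the paper; the sketch shows only the unmodified dealing and reconstruction phases that the paper's lattice-based method takes as given.

```python
# Toy sketch of standard Shamir (t, n) secret sharing over a prime field.
import random

P = 2**13 - 1  # 8191, a small illustrative prime field (not from the paper)

def make_shares(secret, t, n):
    """Deal n shares with threshold t: shares are points on a random
    degree-(t-1) polynomial whose constant term is the secret."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from any t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(1234, t=3, n=5)
assert reconstruct(shares[:3]) == 1234  # any 3 of the 5 shares suffice
```

The point of the paper is that a dealer who set up exactly this scheme can later have its threshold raised without redistributing shares; the sketch only fixes the baseline being modified.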
Abstract:
Algebraic immunity AI(f), defined for a Boolean function f, measures the resistance of the function against algebraic attacks. Currently known algorithms for computing the optimal annihilator of f and AI(f) are inefficient. This work consists of two parts. In the first part, we extend the concept of algebraic immunity. In particular, we argue that a function f may be replaced by another Boolean function f^c, called the algebraic complement of f. This motivates us to examine AI(f^c). We define the extended algebraic immunity of f as AI*(f) = min{AI(f), AI(f^c)}. We prove that 0 ≤ AI(f) − AI*(f) ≤ 1. Since AI(f) − AI*(f) = 1 holds for a large number of cases, the difference between AI(f) and AI*(f) cannot be ignored in algebraic attacks. In the second part, we link Boolean functions to hypergraphs so that we can apply known results in hypergraph theory to Boolean functions. This not only allows us to find annihilators in a fast and simple way but also provides a good estimate of the upper bound on AI*(f).
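The inefficient baseline the abstract refers to, exhaustive search for a minimum-degree annihilator, can be made concrete for a toy function. The choice n = 3 and the example f below are illustrative assumptions, and the paper's algebraic complement f^c is not modelled; this computes plain AI(f) only, using the fact that g annihilates f exactly when g vanishes on the support of f.

```python
# Brute-force computation of algebraic immunity AI(f) for a tiny Boolean
# function, by exhaustive search over all candidate annihilators.
from itertools import product

n = 3
points = list(product([0, 1], repeat=n))  # inputs in lexicographic order

def anf_degree(tt):
    """Algebraic degree of a truth table via the binary Moebius transform."""
    coeffs = list(tt)
    for i in range(n):
        for j in range(len(points)):
            if points[j][i] == 1:
                coeffs[j] ^= coeffs[j - 2 ** (n - 1 - i)]  # xor value with bit i cleared
    return max((sum(p) for p, c in zip(points, coeffs) if c), default=-1)

def min_annihilator_degree(f):
    """Smallest algebraic degree of a nonzero g with f(x)*g(x) = 0 for all x."""
    support = [j for j, v in enumerate(f) if v == 1]
    best = None
    for mask in range(1, 2 ** len(points)):        # every nonzero truth table g
        g = [(mask >> j) & 1 for j in range(len(points))]
        if all(g[j] == 0 for j in support):        # g annihilates f
            d = anf_degree(g)
            best = d if best is None else min(best, d)
    return best

f = [0, 1, 1, 0, 1, 0, 0, 1]  # truth table of x1 xor x2 xor x3 (illustrative)
ai = min(min_annihilator_degree(f),
         min_annihilator_degree([1 - v for v in f]))  # annihilators of f and 1+f
print("AI(f) =", ai)  # affine functions have AI 1: g = 1+f is a degree-1 annihilator
```

The double-exponential loop over all 2^(2^n) truth tables is exactly why such direct search is infeasible beyond toy sizes, which is the inefficiency the paper's hypergraph connection addresses.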
Abstract:
The paper addresses cheating prevention in secret sharing. We consider secret sharing with binary shares; the secret is also binary. This model allows us to use results and constructions from the well-developed theory of cryptographically strong Boolean functions. In particular, we prove that for a given secret sharing scheme, the average cheating probability over all cheating vectors and all original vectors, i.e., ρ = 1/(n·2^n) Σ_{c=1..n} Σ_{α∈V_n} ρ_{c,α}, satisfies ρ ≥ 1/2, and equality holds if and only if ρ_{c,α} = 1/2 for every cheating vector c and every original vector α. In this case the secret sharing is said to be cheating immune. We further establish a relationship between cheating-immune secret sharing and cryptographic criteria of Boolean functions. This enables us to construct cheating-immune secret sharing schemes.
Abstract:
Pseudorandom Generators (PRGs) based on the RSA inversion (one-wayness) problem have been extensively studied in the literature over the last 25 years. These generators have the attractive feature of provable pseudorandomness security assuming the hardness of the RSA inversion problem. However, despite extensive study, the most efficient provably secure RSA-based generators output asymptotically only at most O(log n) bits per multiply modulo an RSA modulus of bitlength n, and hence are too slow to be used in many practical applications. To bring theory closer to practice, we present a simple modification to the proof of security by Fischlin and Schnorr of an RSA-based PRG, which shows that one can obtain an RSA-based PRG which outputs Ω(n) bits per multiply and has provable pseudorandomness security assuming the hardness of a well-studied variant of the RSA inversion problem, where a constant fraction of the plaintext bits are given. Our result gives a positive answer to an open question posed by Gennaro (J. of Cryptology, 2005) regarding finding a PRG beating the rate O(log n) bits per multiply at the cost of a reasonable assumption on RSA inversion.
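The generic iterated-RSA PRG design being discussed can be sketched as a state update x → x^e mod N that emits r low-order bits per modular exponentiation. This is a minimal sketch of that generic shape, not the Fischlin-Schnorr construction or the paper's improved analysis; the tiny modulus and the choice of r are illustrative assumptions, and real instantiations need a full-size RSA modulus with r chosen according to the security proof.

```python
# Toy iterated-RSA pseudorandom generator: one modular exponentiation per
# output batch of r bits. Parameters are deliberately tiny and INSECURE.
import math

p, q, e = 1009, 1013, 5          # toy RSA primes and public exponent
N = p * q
assert math.gcd(e, (p - 1) * (q - 1)) == 1   # e must be a valid RSA exponent

def rsa_prg(seed, r, nbits):
    """Generate nbits pseudorandom bits, r bits per modular exponentiation."""
    x, out = seed % N, []
    while len(out) < nbits:
        x = pow(x, e, N)                              # RSA state update
        out.extend((x >> i) & 1 for i in range(r))    # emit r low-order bits
    return out[:nbits]

bits = rsa_prg(seed=123456, r=8, nbits=64)
```

The rate question in the abstract is exactly how large r may be while keeping a security proof: O(log n) in prior work versus a constant fraction of n under the stronger inversion assumption.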
Abstract:
While organizations strive to leverage the vast information generated daily from social media platforms, and both decision makers and consultants are keen to identify and exploit this information's value, there has been little research into social media in the business context. Social media are diverse, varying in scope and functionality; this diversity entails a complex of attributes and characteristics, resulting in confusion for both researchers and organizations. Taxonomies are important precursors in emerging fields and are foundational for rigorous theory building. Though aspects of social media have been studied from various discipline perspectives, this work has been largely descriptive. Thus, while the need for a rigorous taxonomy of social media is strong, previous efforts to classify social media suffer from limitations, e.g. the lack of a systematic taxonomic method, overreliance on intuition, disregard for the user's perspective, and inadequate consideration of purpose. This study was therefore initiated by the overarching question: "How can social media in the business context be usefully classified?" In order to address this gap, the current paper proposes a systematic method for developing a taxonomy appropriate to the study of social media in an organizational context, combining Nickerson et al.'s (2012) IS taxonomy-building guidelines and a Repertory Grid (RepGrid) approach.
Abstract:
Numeric sets can be used to store and distribute important information such as currency exchange rates and stock forecasts. It is useful to watermark such data to prove ownership in case of illegal distribution. This paper analyzes the numerical-set watermarking model presented by Sion et al. in "On Watermarking Numeric Sets", identifies its weaknesses, and proposes a novel scheme that overcomes these problems. One of the weaknesses of Sion's watermarking scheme is the requirement of a normally distributed set, which does not hold for many numeric sets such as forecast figures. Experiments indicate that the scheme is also susceptible to subset-addition and secondary-watermarking attacks. The watermarking model we propose can be used for numeric sets with arbitrary distribution. Theoretical analysis and experimental results show that the scheme is strongly resilient against sorting, subset-selection, subset-addition, distortion, and secondary-watermarking attacks.
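The general idea behind numeric-set watermarking, encoding key-dependent bits through small bounded distortions of the values, can be illustrated with a toy quantization scheme. This is an assumed minimal sketch of the generic approach, not Sion et al.'s scheme or the one proposed in the paper, and it is deliberately fragile (it indexes by position, so it would not survive the subset attacks the paper studies).

```python
# Toy keyed watermark for a numeric set: force the parity of each value's
# quantized form to match a key-derived bit. Per-item distortion <= 1.5*STEP/... 
# actually bounded by STEP/2 (rounding) + STEP (parity fix) = 0.015 here.
import hashlib

STEP = 0.01  # quantization step; controls the distortion bound

def keyed_bit(key, index):
    """Deterministic pseudorandom bit derived from the secret key and item index."""
    return hashlib.sha256(f"{key}:{index}".encode()).digest()[0] & 1

def embed(values, key):
    """Round each value so its quantized parity equals the keyed bit."""
    marked = []
    for i, v in enumerate(values):
        q = round(v / STEP)
        if q % 2 != keyed_bit(key, i):
            q += 1
        marked.append(q * STEP)
    return marked

def detect(values, key):
    """Fraction of items whose quantized parity matches the keyed bit (1.0 = marked)."""
    hits = sum((round(v / STEP) % 2) == keyed_bit(key, i)
               for i, v in enumerate(values))
    return hits / len(values)

data = [3.14159, 2.71828, 1.41421, 1.73205]
marked = embed(data, key="secret")
```

Nothing here assumes a normal distribution, which hints at why distribution-free embedding of this general kind is possible; making it robust to sorting, subset selection, and subset addition is precisely the paper's contribution.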
Abstract:
The Australian Civil Aviation Safety Authority (CASA) currently lists more than 100 separate entities or organisations which maintain a UAS Operator Certificate (UOC) [1]. Approved operations are overwhelmingly a permutation of aerial photography, surveillance, survey or spotting and predominantly are restricted to Visual Line of Sight (VLOS) operations, below 400 feet, and not within 3 NM of an aerodrome. However, demand is increasing for a Remotely Piloted Aircraft System (RPAS) regulatory regime which facilitates more expansive operations, in particular unsegregated, Beyond Visual Line of Sight (BVLOS) operations. Despite this demand, there is national and international apprehension regarding the necessary levels of airworthiness and operational regulation required to maintain safety and minimise the risk associated with unsegregated operations. Fundamental to addressing these legitimate concerns will be the mechanisms that underpin safe separation and collision avoidance. Whilst a large body of research has been dedicated to investigating the on-board Sense and Avoid (SAA) technology necessary to meet this challenge, this paper focuses on the contribution of the National Airspace System (NAS) to separation assurance, and how it will support, as well as complicate, RPAS integration. The paper collates and presents key, but historically disparate, threads of Australian RPAS- and NAS-related information, and distils it with a filter focused on minimising RPAS collision risk. Our ongoing effort is motivated by the need to better understand the separation assurance contribution provided by the NAS layers, in the first instance, and subsequently employ this information to identify scenarios where the coincident collision risk is demonstrably low, providing legitimate substantiation for concessions on equipage and airworthiness standards.
Abstract:
We explore relationships between habits and technology interaction by reporting on older people's experience of the Kinect for Xbox. We contribute to theoretical and empirical understandings of habits in the use of technology to inform understanding of the habitual qualities of our interactions with computing technologies, particularly systems exploiting natural user interfaces. We situate ideas of habit in relation to user experience and usefulness in interaction design, and draw on critical approaches to the concept of habit from cultural theory to understand the embedded, embodied, and situated contexts in our interactions with technologies. We argue that understanding technology habits as a process of reciprocal habituation in which people and technologies adapt to each other over time through design, adoption, and appropriation offers opportunities for research on user experience and interaction design within human-computer interaction, especially as newer gestural and motion control interfaces promise to reshape the ways in which we interact with computers.
Abstract:
This thesis presents an empirical study of the effects of topology on cellular automata rule spaces. The classical definition of a cellular automaton is restricted to that of a regular lattice, often with periodic boundary conditions. This definition is extended to allow for arbitrary topologies. The dynamics of cellular automata within the triangular tessellation were analysed when transformed to 2-manifolds of topological genus 0, genus 1 and genus 2. Cellular automata dynamics were analysed from a statistical mechanics perspective. The sample sizes required to obtain accurate entropy calculations were determined by an entropy error analysis which observed the error in the computed entropy against increasing sample sizes. Each cellular automata rule space was sampled repeatedly and the selected cellular automata were simulated over many thousands of trials for each topology. This resulted in an entropy distribution for each rule space. The computed entropy distributions are indicative of the cellular automata dynamical class distribution. Through the comparison of these dynamical class distributions using the E-statistic, it was identified that such topological changes cause these distributions to alter. This is a significant result which implies that both global structure and local dynamics play an important role in defining the long-term behaviour of cellular automata.
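The sampling-then-entropy pipeline described above can be sketched on the simplest baseline case, a one-dimensional regular lattice with periodic boundaries, rather than the triangular-tessellation manifolds the thesis studies. The rule number, lattice size, relaxation time, and sample count below are illustrative assumptions, and the block entropy here stands in for whichever entropy estimator the thesis uses.

```python
# Minimal sketch of the entropy measurement pipeline: repeatedly sample
# initial conditions, run the CA to a quasi-steady state, and estimate the
# Shannon entropy of the resulting configurations.
import random
from collections import Counter
from math import log2

def step(cells, rule):
    """One synchronous update of an elementary CA with periodic boundaries."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2
                      + cells[(i + 1) % n])) & 1
            for i in range(n)]

def block_entropy(samples, k=3):
    """Shannon entropy (bits) of length-k blocks pooled over sampled runs."""
    counts = Counter()
    for cells in samples:
        for i in range(len(cells) - k + 1):
            counts[tuple(cells[i:i + k])] += 1
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

random.seed(0)
samples = []
for _ in range(50):                           # repeated sampling of the rule
    cells = [random.randint(0, 1) for _ in range(64)]
    for _ in range(100):                      # relax toward long-term behaviour
        cells = step(cells, rule=110)
    samples.append(cells)
print(round(block_entropy(samples), 3))
```

Repeating this over many rules drawn from a rule space yields the entropy distribution the thesis compares across topologies; the error analysis it describes amounts to tracking how this estimate stabilises as the number of samples grows.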
Abstract:
Quantum-like models can be fruitfully used to model attitude change in a social context. Next steps require data and higher-dimensional models. Here, we discuss an exploratory study that demonstrates an order effect when three question sets about Climate Beliefs, Political Affiliation and Attitudes Towards Science are presented in different orders within a larger study of n = 533 subjects. A quantum-like model seems possible, and we propose a new experiment which could be used to test between three possible models for this scenario.
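The core mechanism quantum-like models use to produce question-order effects, non-commuting measurements, can be demonstrated with a two-dimensional toy: two yes/no "questions" modelled as projectors at different angles give different joint "yes, yes" probabilities depending on which is asked first. The angles and initial state below are illustrative assumptions, not values fitted to the study's data.

```python
# Toy order-effect demonstration: sequential projective measurements on a
# single real "qubit". P(yes then yes) = |P2 P1 psi|^2 depends on order
# whenever the projectors do not commute.
import math

def projector(theta):
    """Rank-1 projector onto the direction (cos theta, sin theta) in R^2."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c * c, c * s], [s * c, s * s]]

def apply(m, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def p_yes_then_yes(first, second, state):
    """Probability of answering yes to the first question, then yes to the second."""
    v = apply(second, apply(first, state))
    return v[0] ** 2 + v[1] ** 2

psi = [1.0, 0.0]                                   # illustrative belief state
A, B = projector(math.pi / 8), projector(math.pi / 3)
p_ab = p_yes_then_yes(A, B, psi)                   # question A asked first
p_ba = p_yes_then_yes(B, A, psi)                   # question B asked first
assert abs(p_ab - p_ba) > 1e-6                     # order changes the statistics
```

The study's three question sets would need a higher-dimensional state space, which is exactly the "higher-dimensional models" direction the abstract points to.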
Abstract:
This conceptual paper is a preliminary part of an ongoing study into take-up of electronic personal health records (ePHRs). The purpose of this work is to contextually 'operationalise' Grönroos' (2012) model of value co-creation in service for ePHRs. Using findings in the extant literature, we enhance theoretical and practical understanding of the potential for co-creation of value with ePHRs for relevant stakeholders. The research design focused on the selection and evaluation of relevant literature to include in the discussion. The objective was to demonstrate which articles can be used to 'contextualise' the concepts in relation to relevant healthcare providers and patient engagement in the co-creation of value from having shared ePHRs. Starting at the service concept, that is, what the service provider wants to achieve and for whom, there is little doubt that there are recognised benefits that co-create value for both healthcare providers and healthcare consumers (i.e. patients) through shared ePHRs. We further highlight both alignments and misalignments in the resources and activities concepts between stakeholder groups. Examples include the types of functionalities, as well as the interactive and peer-communication needs, perceived as useful by healthcare providers compared to healthcare consumers. The paper has implications for theory and practice and is an original and innovative approach to studying the co-creation of value in eHealth delivery.