Abstract:
Motivated by the need for private set operations in a distributed environment, we extend the two-party private matching problem proposed by Freedman, Nissim and Pinkas (FNP) at Eurocrypt '04 to the distributed setting. Using a secret sharing scheme, we provide a distributed solution to the FNP private matching, called distributed private matching. In our distributed private matching scheme, we represent one party's dataset as a polynomial, as in FNP, and then distribute the polynomial to multiple servers. We extend our solution to distributed set intersection and the cardinality of the intersection, and further show how to apply distributed private matching to compute the distributed subset relation. Our work extends the private matching and set intersection primitives of Freedman et al. Our distributed construction may be of particular value when the dataset is outsourced and its privacy is the main concern: our solutions preserve the utility of these set operations without compromising dataset privacy. Compared with previous works, we achieve a more computationally efficient solution. All protocols constructed in this paper are provably secure against a semi-honest adversary under the Decisional Diffie-Hellman assumption.
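The set-as-polynomial encoding that FNP-style matching starts from can be sketched in a few lines. This is an illustrative toy with plaintext coefficients (no homomorphic encryption or secret sharing), and all names are our own:

```python
def set_to_poly(elements):
    """Coefficients (lowest degree first) of P(x) = prod over s of (x - s)."""
    coeffs = [1]
    for s in elements:
        # Multiply the current polynomial by (x - s).
        new = [0] * (len(coeffs) + 1)
        for i, c in enumerate(coeffs):
            new[i + 1] += c       # c * x^(i+1)
            new[i] -= s * c       # -s * c * x^i
        coeffs = new
    return coeffs

def eval_poly(coeffs, x):
    return sum(c * x ** i for i, c in enumerate(coeffs))

server_set = {3, 7, 11}
P = set_to_poly(server_set)
assert eval_poly(P, 7) == 0    # P(y) = 0 exactly when y is in the set
assert eval_poly(P, 5) != 0
```

In FNP the coefficients are encrypted under an additively homomorphic scheme, so the other party can evaluate the polynomial on its own elements without learning the set; the distributed variant described in the abstract additionally shares these coefficients across multiple servers.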
Abstract:
We consider the following problem: users of an organization wish to outsource the storage of sensitive data to a large database server. The server storing the data is assumed to be untrusted, so the stored data must be encrypted. We further suppose that the manager of the organization has the right to access all data, but no member of the organization can access any data alone; a member must collaborate with other members to search for the desired data. In this paper, we investigate the notion of threshold privacy-preserving keyword search (TPPKS) and define its security requirements. We construct a TPPKS scheme and prove its security under the intractability assumptions of the discrete logarithm, decisional Diffie-Hellman and computational Diffie-Hellman problems.
Abstract:
Security models for two-party authenticated key exchange (AKE) protocols have developed over time to provide security even when the adversary learns certain secret keys. In this work, we advance the modelling of AKE protocols by considering more granular, continuous leakage of long-term secrets of protocol participants: the adversary can adaptively request arbitrary leakage of long-term secrets even after the test session is activated, with limits on the amount of leakage per query but no bounds on the total leakage. We present a security model supporting continuous leakage even when the adversary learns certain ephemeral secrets or session keys, and give a generic construction of a two-pass leakage-resilient key exchange protocol that is secure in the model; our protocol achieves continuous, after-the-fact leakage resilience with not much more cost than a previous protocol with only bounded, non-after-the-fact leakage.
Abstract:
A key derivation function (KDF) transforms secret, non-uniformly random source material, together with some public strings, into one or more cryptographic keys. These cryptographic keys are used with a cryptographic algorithm to protect electronic data during both transmission over insecure channels and storage. In this thesis, we propose a new method for constructing a generic stream-cipher-based key derivation function. We show that our proposed key derivation function is secure if the underlying stream cipher is secure. We simulate instances of this stream-cipher-based key derivation function using three eSTREAM finalists: Trivium, Sosemanuk and Rabbit. The simulation results show that these stream-cipher-based key derivation functions offer efficiency advantages over the more commonly used key derivation functions based on block ciphers and hash functions.
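As a rough illustration of the structure (not the thesis's actual construction, which keys a real stream cipher such as Trivium): seed a keystream generator with the secret source material plus a public context string, and take the first bytes of keystream as the derived key. Here SHAKE-128 merely stands in for the stream cipher's keystream generator, and all names are illustrative:

```python
import hashlib

def kdf(secret: bytes, context: bytes, n_bytes: int) -> bytes:
    # Keystream generator seeded with (secret, public context);
    # SHAKE-128 is only a stand-in for a stream cipher keystream here.
    return hashlib.shake_128(secret + b"|" + context).digest(n_bytes)

enc_key = kdf(b"source material", b"encryption", 32)
mac_key = kdf(b"source material", b"mac", 32)
assert len(enc_key) == 32 and enc_key != mac_key  # distinct key per context
```

Deriving several keys from one secret by varying the public context is the typical usage pattern for KDFs of this shape.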
Abstract:
Voluntary and compliance markets for forest carbon and other (emission avoidance and biosequestration) activities are growing internationally and across Australia. Queensland and its Natural Resource Management (NRM) regions have an opportunity to take a variety of actions to help guide these markets to secure multiple landscape benefits and to build landscape resilience in the face of climate change. As the national arrangements for offsets within Australia’s Clean Energy Package (CEP) and emissions trading environment emerge, Queensland’s regions can prepare themselves and their landholding communities to take advantage of these opportunities to deliver improved climate resilience in their regional landscapes.
Abstract:
Regional and remote communities in tropical Queensland are among Australia's most vulnerable in the face of climate change. At the same time, these socially and economically vulnerable regions house some of Australia's most significant biodiversity values. Past approaches to terrestrial biodiversity management have focused on biophysical interventions informed by biophysical knowledge. An equally important focus should be placed on building regional-scale community resilience if some of the worst biodiversity impacts of climate change are to be avoided or mitigated. Despite the critical need, more systemic or holistic approaches to natural resource management have rarely been trialled and tested in a structured way. Currently, most strategic interventions to improve regional community resilience are ad hoc, short term and not theory-based. Past planning approaches have not been durable, nor have they been well informed by clear indicators. Research into indicators for community resilience has been poorly integrated within adaptive planning and management cycles. This project aimed to resolve this problem by:
* Reviewing the community and social resilience and adaptive planning literature to reconceptualise an improved framework for applying community resilience concepts;
* Harvesting and extending work undertaken in MTSRF Phase 1 to identify the lessons emerging from past MTSRF research;
* Distilling these findings to identify new theoretical and practical approaches to the application of community resilience in natural resource use and management;
* Reconsidering the potential interplay between a region's biophysical and social planning processes, with a focus on exploring spatial tools to communicate climate change risk and its consequent environmental, economic and social impacts; and
* Trialling new approaches to indicator development and adaptive planning to improve community resilience, using a sub-regional pilot in the Wet Tropics.
In doing so, we also looked at ways to improve the use and application of relevant spatial information. Our theoretical review drew upon the community development, psychology and emergency management literature to better frame the concept of community resilience relative to the aligned concepts of social resilience, vulnerability and adaptive capacity. First, we consider community resilience a concept that can be examined at a range of scales (e.g. regional, locality, communities of interest). We also consider that overall resilience at higher scales will be influenced by resilience levels at lesser scales (inclusive of the resilience of constituent institutions, families and individuals). We illustrate that, at any scale, resilience and vulnerability are not necessarily polar opposites, and that some understanding of vulnerability is important in determining resilience. We position social resilience (a concept focused on the social characteristics of communities and individuals) as an important attribute of community resilience, but one that needs to be considered alongside economic, natural resource, capacity-based and governance attributes. The findings from the review of theory and MTSRF Phase 1 projects were synthesised and refined by the wider project team. Five predominant themes were distilled from this literature, the research review and an expert analysis:
1. Indicators have most value within an integrated and adaptive planning context, requiring an active co-research relationship between community resilience planners, managers and researchers if real change is to be secured;
2. Indicators of community resilience form the basis for planning for social assets, and the resilience of social assets is directly related to the longer-term resilience of natural assets. This encourages, and indeed requires, the explicit development and integration of social planning within a broader natural resource planning and management framework;
3. Past indicator research and application has not provided a broad picture of the key attributes of community resilience, and there have been many attempts to elicit lists of "perfect" indicators that may never be useful within the time and resource limitations of real-world regional planning and management. We consider that modelling resilience for proactive planning and prediction purposes requires the consideration of simple but integrated clusters of attributes;
4. Depending on the time and resources available for planning and management, the combined use of well-suited indicators and/or other lesser "lines of evidence" is more flexible than the pursuit of perfect indicators; and
5. Index-based, collaborative and participatory approaches need to be applied to the development, refinement and reporting of indicators over longer time frames.
We trialled the practical application of these concepts via the establishment of a collaborative regional alliance of planners and managers involved in the development of climate change adaptation strategies across tropical Queensland (the Gulf, Wet Tropics, Cape York and Torres Strait sub-regions). A focus on the Wet Tropics as a pilot sub-region enabled other Far North Queensland sub-regions to participate and explore the potential extension of this approach. The pilot activities included:
* Further exploring ways to innovatively communicate the region's likely climate change scenarios and possible environmental, economic and social impacts.
We particularly looked at using spatial tools to overlay climate change risks to geographic communities and social vulnerabilities within those communities;
* Developing a cohesive first pass of a State of the Region-style approach to reporting community resilience, inclusive of regional economic viability, community vitality, capacity-based and governance attributes. This framework integrated a literature review, expert (academic and community) and alliance-based contributions; and
* Early consideration of critical strategies that need to be included in unfolding regional planning activities within Far North Queensland.
The pilot assessment finds that rural, Indigenous and some urban populations in the Wet Tropics are highly vulnerable and sensitive to climate change and may require substantial support to adapt and become more resilient. It also finds that under current conditions (i.e. if significant adaptation actions are not taken) the Wet Tropics as a whole may be seriously affected by the most significant features of climate change and extreme climatic events. Without early and substantive action, this could result in declining social and economic wellbeing and natural resource health. Of the four attributes we consider important to understanding community resilience, the Wet Tropics region is particularly vulnerable in two: its economic vitality, and its knowledge, aspirations and capacity. The other two attributes, community vitality and institutional governance, are relatively resilient but vulnerable in some key respects. Across all four attributes, however, there is some emerging capacity to manage the shocks that may accompany climate change and extreme climatic events. This capacity needs to be carefully fostered and further developed to achieve broader community resilience outcomes.
There is an immediate need to build individual, household, community and sectoral resilience across all four attribute groups to enable populations and communities in the Wet Tropics region to adapt in the face of climate change. Preliminary strategies of importance for improving regional community resilience have been identified. These emerging strategies have also been integrated into the emerging Regional Development Australia Roadmap, which will ensure that implementation is progressed and coordinated effectively. They will also inform emerging strategy development to secure implementation of the FNQ 2031 Regional Plan. Most significantly in our view, this project has taken a co-research approach from the outset, with explicit and direct influence within the region's formal planning and management arrangements. As such, the research:
* Now forms the foundations of the first attempt at "Social Asset" planning within the Wet Tropics Regional NRM Plan review;
* Is assisting local government at the regional scale to consider aspects of climate change adaptation in emerging planning scheme/community planning processes;
* Has partnered with the State government (via the Department of Infrastructure and Planning and the Regional Managers Coordination Network Chair) in progressing the climate change adaptation agenda set down within the FNQ 2031 Regional Plan;
* Is informing new approaches to reporting on community resilience within the GBRMPA Outlook reporting framework; and
* Now forms the foundation for the region's wider climate change adaptation priorities in the Regional Roadmap developed by Regional Development Australia.
Through the auspices of Regional Development Australia, the outcomes of the research will now inform emerging negotiations with State and Federal governments concerning a wider package of climate change adaptation priorities.
Next stage research priorities are also being developed to enable an ongoing alliance between researchers and the region’s climate change response.
Abstract:
We present a text watermarking scheme that embeds a bitstream watermark W_i in a text document P while preserving the meaning, context, and flow of the document. The document is viewed as a set of paragraphs, each paragraph being a set of sentences. The sequence of paragraphs and sentences used to embed watermark bits is permuted using a secret key. English-language sentence transformations are then used to modify sentence lengths, embedding the watermark bits in the least significant bits (LSBs) of the sentences' cardinalities. The embedding and extraction algorithms are public, while the secrecy and security of the watermark depend on a secret key K. The probability of false positives is extremely small, avoiding incidental occurrences of our watermark in random text documents. Majority voting provides security against text addition, deletion, and swapping attacks, further reducing the probability of false positives. The scheme is secure against general attacks on text watermarks such as reproduction (photocopying, fax), reformatting, synonym substitution, text addition, text deletion, text swapping, paragraph shuffling and collusion attacks.
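The extraction side of such a scheme can be sketched as a toy that permutes the sentence order with the secret key and reads one watermark bit from the LSB of each sentence's word count. The embedding side, which applies sentence transformations to set those lengths, is not shown, and all names here are ours, not the paper's:

```python
import random

def extract_watermark(sentences, key, n_bits):
    # Key-seeded permutation of the sentence order, then one bit per
    # sentence from the least significant bit of its word count.
    order = list(range(len(sentences)))
    random.Random(key).shuffle(order)
    return [len(sentences[i].split()) & 1 for i in order[:n_bits]]

doc = ["the quick brown fox", "hello world", "a b c", "one two three four"]
bits = extract_watermark(doc, key=42, n_bits=3)
assert len(bits) == 3 and all(b in (0, 1) for b in bits)
```

Without the key, an attacker does not know which sentences carry which bit positions, which is what makes targeted tampering hard.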
Abstract:
To authenticate card holders conducting electronic transactions on mobile devices, VISA and MasterCard independently proposed two electronic payment protocols: Visa 3D Secure and MasterCard Secure Code. The protocols use pre-registered passwords to provide card holder authentication, Secure Sockets Layer/Transport Layer Security (SSL/TLS) for data confidentiality over wired networks, and Wireless Transport Layer Security (WTLS) between a wireless device and a Wireless Application Protocol (WAP) gateway. The paper presents our analysis of the security properties of the proposed protocols using the formal-method tools Casper and FDR2. We also highlight issues concerning payment security in the proposed protocols.
Abstract:
A secure protocol for electronic, sealed-bid, single-item auctions is presented. The protocol caters to both first-price and second-price (Vickrey) auctions and provides full price flexibility. Both computational and communication costs are linear in the number of bidders, and the protocol uses only standard cryptographic primitives. The protocol strictly divides knowledge of the bidders' identities and their actual bids between, respectively, a registration authority and an auctioneer, who are assumed not to collude but may be separately corrupt. This assures strong bidder anonymity, though only weak bid privacy. The protocol is structured in two phases, each involving only off-line communication. Registration, which requires the public key infrastructure, is simultaneous with hash-sealed bid commitment and generates a receipt to the bidder containing a pseudonym. This phase is followed by encrypted bid submission. Both phases involve the registration authority acting as a communication conduit, but the actual message size is quite small. It is argued that this structure guarantees non-repudiation by both the winner and the auctioneer. Second-price correctness is enforced either by observing the absence of registration of the claimed second-price bid or, where it is registered but lower than the actual second price, through the cooperation of the second-price bidder, presumably motivated by self-interest. The use of the registration authority in other contexts is also considered, with a view to developing an architecture for efficient secure multiparty transactions.
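A hash-sealed bid commitment of the kind used at registration can be sketched in a few lines; this is an illustrative minimum, not the paper's full protocol, and the values are invented:

```python
import hashlib
import os

def seal_bid(bid: int, nonce: bytes) -> bytes:
    # Commitment = H(bid || nonce): hides the bid until the nonce is
    # revealed, and binds the bidder to the bid afterwards.
    return hashlib.sha256(str(bid).encode() + nonce).digest()

nonce = os.urandom(16)
commitment = seal_bid(1500, nonce)        # submitted at registration time
# At bid opening the bidder reveals (bid, nonce), and anyone can check:
assert seal_bid(1500, nonce) == commitment
assert seal_bid(1499, nonce) != commitment  # a changed bid fails to verify
```

The random nonce prevents the auctioneer from confirming a guessed bid by brute force over plausible bid values.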
Abstract:
Universal One-Way Hash Functions (UOWHFs) may be used in place of collision-resistant functions in many public-key cryptographic applications. At Asiacrypt 2004, Hong, Preneel and Lee introduced the stronger security notion of higher order UOWHFs to allow construction of long-input UOWHFs using the Merkle-Damgård domain extender. However, they did not provide any provably secure constructions for higher order UOWHFs. We show that the subset sum hash function is a kth order Universal One-Way Hash Function (hashing n bits to m < n bits) under the Subset Sum assumption for k = O(log m). Therefore we strengthen a previous result of Impagliazzo and Naor, who showed that the subset sum hash function is a UOWHF under the Subset Sum assumption. We believe our result is of theoretical interest; as far as we are aware, it is the first example of a natural and computationally efficient UOWHF which is also a provably secure higher order UOWHF under the same well-known cryptographic assumption, whereas this assumption does not seem sufficient to prove its collision-resistance. A consequence of our result is that one can apply the Merkle-Damgård extender to the subset sum compression function with ‘extension factor’ k+1, while losing (at most) about k bits of UOWHF security relative to the UOWHF security of the compression function. The method also leads to a saving of up to m log(k+1) bits in key length relative to the Shoup XOR-Mask domain extender applied to the subset sum compression function.
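The subset sum hash function itself is simple to state: given a public key of m-bit integers a_1, ..., a_n, the digest of an n-bit input x is the sum of the a_i at positions where x_i = 1, reduced modulo 2^m. A minimal sketch with illustrative parameters:

```python
import random

def subset_sum_hash(key, x_bits, m):
    # H_a(x) = sum of a_i over positions with x_i = 1, mod 2^m.
    return sum(a for a, b in zip(key, x_bits) if b) % (2 ** m)

n, m = 16, 8                                     # hash n bits to m < n bits
rng = random.Random(0)
key = [rng.randrange(2 ** m) for _ in range(n)]  # public key a_1..a_n
x = [rng.randrange(2) for _ in range(n)]         # n-bit input
digest = subset_sum_hash(key, x, m)
assert 0 <= digest < 2 ** m
```

Finding a second input with the same digest amounts to finding a different subset with the same sum, which is closely related to solving a subset sum instance.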
Abstract:
A dynamic accumulator is an algorithm that merges a large set of elements into a single constant-size value, such that for each accumulated element there is a witness confirming that the element was included in the value, and accumulated elements can be dynamically added to and deleted from the original set. Recently, Wang et al. presented a dynamic accumulator for batch updates at ICICS 2007. However, their construction suffers from two serious problems. We analyse them and propose a way to repair their scheme. We then use the accumulator to construct a new scheme for common secure indices with conjunctive keyword-based retrieval.
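For intuition about accumulators generally (this is the classic RSA-style construction, not Wang et al.'s batch-update scheme), elements can be accumulated as exponents of a base g modulo N, and the membership witness for an element is the accumulator computed without it:

```python
# Toy parameters only: a real accumulator needs a large RSA modulus of
# unknown factorisation, with elements mapped to primes.
N, g = 3233, 2            # N = 61 * 53, insecure and purely illustrative
elements = [3, 5, 7]

prod = 1
for e in elements:
    prod *= e
acc = pow(g, prod, N)                 # constant-size accumulator value

wit5 = pow(g, prod // 5, N)           # witness for 5: omit its factor
assert pow(wit5, 5, N) == acc         # raising the witness to 5 recovers acc
```

Adding an element raises the accumulator to that element's power; deletion (and Wang et al.'s batch updates) requires extra trapdoor machinery not shown here.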
Abstract:
A parallel authentication and public-key encryption scheme is introduced and exemplified on joint encryption and signing, which compares favourably with sequential Encrypt-then-Sign (EtS) or Sign-then-Encrypt (StE) schemes in terms of both efficiency and security. A security model for signcryption, and thus joint encryption and signing, has recently been defined which considers possible attacks and security goals. Such a scheme is considered secure if the encryption part guarantees indistinguishability and the signature part prevents existential forgeries, for outsider as well as insider adversaries. We propose two schemes of parallel signcryption, which are an efficient alternative to Commit-then-Sign-and-Encrypt (CtS&E). Both are provably secure in the random oracle model. The first, called generic parallel encrypt and sign, is secure if the encryption scheme is semantically secure against chosen-ciphertext attacks and the signature scheme prevents existential forgeries against random-message attacks. The second, called optimal parallel encrypt and sign, applies random oracles, similar to the OAEP technique, to achieve security using encryption and signature components with very weak security requirements: encryption need only be one-way under chosen-plaintext attacks, while the signature need only resist universal forgeries under random-plaintext attack, which is the case for both plain-RSA encryption and plain-RSA signatures under the usual RSA assumption. Both proposals are generic in the sense that any suitable encryption and signature schemes (i.e. ones that achieve the required security) can be used. Furthermore, they allow parallel encryption and signing as well as parallel decryption and verification. Properties of parallel encrypt-and-sign schemes are considered, and a new security standard for parallel signcryption is proposed.
Abstract:
We study the multicast stream authentication problem when an opponent can drop, reorder and inject data packets into the communication channel. In this context, bandwidth limitation and fast authentication are the core concerns; any authentication scheme should therefore reduce as much as possible the packet overhead and the time spent at the receiver to check the authenticity of collected elements. Recently, Tartary and Wang developed a provably secure protocol with small packet overhead and a reduced number of signature verifications at the receiver. In this paper, we propose a hybrid scheme based on Tartary and Wang's approach and Merkle hash trees. Our construction exhibits a smaller overhead and much faster processing at the receiver, making it even more suitable for multicast than the earlier approach. Like Tartary and Wang's protocol, our construction is provably secure and allows total recovery of the data stream despite erasures and injections occurring during transmission.
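The Merkle-hash-tree ingredient works as follows: hash the packets into leaves, hash pairs upward to a single root, and sign only the root; each packet then travels with its authentication path of sibling hashes. A minimal sketch with our own naming, not the paper's exact construction:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    # Hash pairs of nodes upward until a single root remains.
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # duplicate the last node if odd
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

packets = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
root = merkle_root(packets)          # only this value needs a signature

# Verifying pkt1 needs just two sibling hashes, not one signature per packet:
path = [h(b"pkt0"), h(h(b"pkt2") + h(b"pkt3"))]
node = h(b"pkt1")
node = h(path[0] + node)  # pkt1 is the right child at the leaf level
node = h(node + path[1])  # its parent is the left child one level up
assert node == root
```

This is why tree-based schemes cut receiver-side cost: one signature verification authenticates the root, and each packet is checked with a logarithmic number of hashes.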
Abstract:
We consider the problem of increasing the threshold parameter of a secret-sharing scheme after the setup (share distribution) phase, without further communication between the dealer and the shareholders. Previous solutions to this problem require one to start off with a non-standard scheme designed specifically for this purpose, or to have secure channels between shareholders. In contrast, we show how to increase the threshold parameter of the standard CRT secret-sharing scheme without secure channels between the shareholders. Our method can thus be applied to existing CRT schemes even if they were set up without consideration of future threshold increases. Our method is a positive cryptographic application of lattice reduction algorithms, and we also use techniques from lattice theory (the geometry of numbers) to prove statements about the correctness and information-theoretic security of our constructions.
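The underlying CRT (Mignotte-style) threshold scheme that such work builds on can be sketched with toy parameters; the threshold-increase technique itself, which relies on lattice reduction, is not shown, and the numbers below are purely illustrative:

```python
from math import prod

# (t, n) = (3, 4) Mignotte-style sharing: pairwise coprime moduli, with the
# secret between the product of the (t-1) largest moduli and the product of
# the t smallest (here 17*19 < secret < 11*13*17).
moduli = [11, 13, 17, 19]
secret = 1000
shares = [(m, secret % m) for m in moduli]   # one residue per shareholder

def reconstruct(subset):
    # Standard CRT over any t = 3 shares recovers the secret uniquely.
    M = prod(m for m, _ in subset)
    return sum(s * (M // m) * pow(M // m, -1, m) for m, s in subset) % M

assert reconstruct(shares[:3]) == secret
assert reconstruct(shares[1:]) == secret     # any 3 of the 4 shares suffice
```

Any two shares leave the secret ambiguous over a range larger than the gap between the two modulus products, which is where the scheme's secrecy comes from.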
Abstract:
Pseudorandom Generators (PRGs) based on the RSA inversion (one-wayness) problem have been extensively studied in the literature over the last 25 years. These generators have the attractive feature of provable pseudorandomness security assuming the hardness of the RSA inversion problem. However, despite extensive study, the most efficient provably secure RSA-based generators output asymptotically only at most O(log n) bits per multiply modulo an RSA modulus of bit length n, and hence are too slow to be used in many practical applications. To bring theory closer to practice, we present a simple modification to the proof of security by Fischlin and Schnorr of an RSA-based PRG, which shows that one can obtain an RSA-based PRG which outputs Ω(n) bits per multiply and has provable pseudorandomness security assuming the hardness of a well-studied variant of the RSA inversion problem, where a constant fraction of the plaintext bits are given. Our result gives a positive answer to an open question posed by Gennaro (J. of Cryptology, 2005) regarding finding a PRG beating the rate O(log n) bits per multiply at the cost of a reasonable assumption on RSA inversion.
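The shape of such an RSA-based PRG can be sketched with toy numbers: iterate the RSA map x → x^e mod N and output some low-order bits of each state. The parameters below are insecure and purely illustrative; the cited results concern exactly how many bits per iteration can be output while keeping provable pseudorandomness.

```python
N, e = 3233, 17           # toy RSA modulus 61 * 53 and public exponent

def rsa_prg(seed, n_outputs, bits_per_iter=3):
    x, out = seed, []
    for _ in range(n_outputs):
        x = pow(x, e, N)                            # one RSA iteration
        out.append(x & ((1 << bits_per_iter) - 1))  # emit low-order bits
    return out

stream = rsa_prg(seed=123, n_outputs=5)
assert len(stream) == 5 and all(0 <= v < 8 for v in stream)
```

Raising `bits_per_iter` from O(log n) toward a constant fraction of n is precisely the efficiency gain the abstract claims, at the cost of the stronger inversion assumption.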