954 results for Document Signatures
Abstract:
The concentrations of Na, K, Ca, Mg, Ba, Sr, Fe, Al, Mn, Zn, Pb, Cu, Ni, Cr, Co, Se, U and Ti were determined in the osteoderms and/or flesh of estuarine crocodiles (Crocodylus porosus) captured in three adjacent catchments within the Alligator Rivers Region (ARR) of northern Australia. Results from multivariate analysis of variance showed that when all metals were considered simultaneously, catchment effects were significant (P≤0.05). Despite considerable within-catchment variability, linear discriminant analysis (LDA) showed that differences in elemental signatures in the osteoderms and/or flesh of C. porosus amongst the catchments were sufficient to classify individuals accurately to their catchment of occurrence. Using cross-validation, the accuracy of classifying a crocodile to its catchment of occurrence was 76% for osteoderms and 60% for flesh. These data suggest that osteoderms provide better predictive accuracy than flesh for discriminating crocodiles amongst catchments. There was no advantage in combining the osteoderm and flesh results to increase the accuracy of classification (i.e. 67%). Based on the discriminant function coefficients for the osteoderm data, Ca, Co, Mg and U were the most important elements for discriminating amongst the three catchments. For flesh data, Ca, K, Mg, Na, Ni and Pb were the most important metals for discriminating amongst the catchments. Reasons for differences in the elemental signatures of crocodiles between catchments are generally not interpretable, due to limited data on surface water and sediment chemistry of the catchments or chemical composition of dietary items of C. porosus. From a wildlife management perspective, the provenance or source catchment(s) of 'problem' crocodiles captured at settlements or recreational areas along the ARR coastline may be established using catchment-specific elemental signatures. If the incidence of problem crocodiles can be reduced in settled or recreational areas by effective management at their source, then public safety concerns about these predators may be moderated, as well as the cost of their capture and removal. Copyright © 2002 Elsevier Science B.V.
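For readers unfamiliar with the statistical workflow described above, the sketch below shows, on synthetic data, how elemental concentrations can be fed into a linear discriminant analysis and scored by cross-validation. It uses scikit-learn and an arbitrary subset of elements; it is an illustration of the general technique, not the study's data or code.

```python
# Illustrative sketch (not the paper's data or code): classify individuals to
# their catchment from elemental concentrations with linear discriminant
# analysis and leave-one-out cross-validation, using scikit-learn.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
elements = ["Na", "K", "Ca", "Mg", "Co", "U"]          # subset, for illustration
n_per_catchment, catchments = 20, ["A", "B", "C"]

# Synthetic concentrations: each catchment gets a slightly shifted mean profile.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_catchment, len(elements)))
               for i, _ in enumerate(catchments)])
y = np.repeat(catchments, n_per_catchment)

lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X, y, cv=LeaveOneOut()).mean()
print(f"cross-validated classification accuracy: {accuracy:.2f}")

# The fitted coefficients indicate which elements drive the discrimination.
lda.fit(X, y)
print(dict(zip(elements, np.round(np.abs(lda.coef_).sum(axis=0), 2))))
```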
Abstract:
Digital signatures are often used by trusted authorities to make unique bindings between a subject and a digital object; for example, certificate authorities certify that a public key belongs to a domain name, and time-stamping authorities certify that a certain piece of information existed at a certain time. Traditional digital signature schemes, however, impose no uniqueness conditions, so a trusted authority could make multiple certifications for the same subject but different objects, whether intentionally, by accident, or under (legal or illegal) coercion. We propose the notion of a double-authentication-preventing signature, in which a value to be signed is split into two parts: a subject and a message. If a signer ever signs two different messages for the same subject, enough information is revealed to allow anyone to compute valid signatures on behalf of the signer. This double-signature forgeability property discourages signers from misbehaving---a form of self-enforcement---and would give binding authorities like CAs some cryptographic arguments to resist legal coercion. We give a generic construction using a new type of trapdoor function with extractability properties, which we show can be instantiated using the group of sign-agnostic quadratic residues modulo a Blum integer.
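The paper's construction relies on extractable trapdoor functions and is not reproduced here; the sketch below only pins down the DAPS functionality the abstract describes, as a hypothetical Python interface. All names are illustrative, not the paper's.

```python
# Interface-level sketch of a double-authentication-preventing signature (DAPS).
# This is NOT the paper's construction (which uses extractable 2:1 trapdoor
# functions); it only fixes the functionality described in the abstract.
from typing import Protocol, Tuple

class DAPS(Protocol):
    def keygen(self) -> Tuple[bytes, bytes]:
        """Return (signing_key, verification_key)."""

    def sign(self, signing_key: bytes, subject: bytes, message: bytes) -> bytes:
        """Sign the pair (subject, message)."""

    def verify(self, verification_key: bytes, subject: bytes,
               message: bytes, signature: bytes) -> bool:
        """Standard signature verification."""

    def extract(self, verification_key: bytes, subject: bytes,
                msg1: bytes, sig1: bytes, msg2: bytes, sig2: bytes) -> bytes:
        """If msg1 != msg2 and both signatures verify for the same subject,
        recover enough key material to sign arbitrary values: the
        double-signature forgeability property that deters misbehaviour."""
```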
Abstract:
One-time proxy signatures are one-time signatures for which a primary signer can delegate his or her signing capability to a proxy signer. In this work we propose two one-time proxy signature schemes with different security properties. Unlike other existing one-time proxy signatures that are constructed from public key cryptography, our proposed schemes are based on one-way functions without trapdoors, and so they inherit the communication and computation efficiency of traditional one-time signatures. Although, from a verifier's point of view, signatures generated by the proxy are indistinguishable from those created by the primary signer, a trusted authority can be equipped with an algorithm that allows the authority to settle disputes between the signers. In our constructions, we use a combination of one-time signatures, oblivious transfer protocols and certain combinatorial objects. We characterise these new combinatorial objects and present constructions for them.
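As background for the trapdoor-free setting mentioned above, here is a minimal sketch of a Lamport-style one-time signature built only from a one-way function (SHA-256). It illustrates the kind of primitive such constructions build on; it is not the proposed proxy scheme itself.

```python
# Minimal Lamport-style one-time signature built only from a one-way function.
# Background illustration, not the proxy construction from the abstract.
import hashlib, secrets

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[H(x) for x in pair] for pair in sk]
    return sk, pk

def sign(sk, message: bytes):
    bits = bin(int.from_bytes(H(message), "big"))[2:].zfill(256)
    return [sk[i][int(b)] for i, b in enumerate(bits)]   # reveal one preimage per bit

def verify(pk, message: bytes, sig) -> bool:
    bits = bin(int.from_bytes(H(message), "big"))[2:].zfill(256)
    return all(H(s) == pk[i][int(b)] for i, (b, s) in enumerate(zip(bits, sig)))

sk, pk = keygen()
sig = sign(sk, b"delegate signing for one message only")
assert verify(pk, b"delegate signing for one message only", sig)
```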
Efficient extension of standard Schnorr/RSA signatures into Universal Designated-Verifier Signatures
Abstract:
Universal Designated-Verifier Signature (UDVS) schemes are digital signature schemes with additional functionality which allows any holder of a signature to designate the signature to any desired designated-verifier such that the designated-verifier can verify that the message was signed by the signer, but is unable to convince anyone else of this fact. Since UDVS schemes reduce to standard signatures when no verifier designation is performed, it is natural to ask how to extend the classical Schnorr or RSA signature schemes into UDVS schemes, so that the existing key generation and signing implementation infrastructure for these schemes can be used without modification. We show how this can be efficiently achieved, and provide proofs of security for our schemes in the random oracle model.
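For orientation, the sketch below implements the classical Schnorr signature that such a UDVS extension starts from, using deliberately tiny, insecure toy parameters. The verifier-designation step is omitted, since the abstract does not spell out its construction.

```python
# Toy-parameter sketch of the classical Schnorr signature (parameters are far
# too small to be secure; the verifier-designation step is not shown).
import hashlib, secrets

p, q = 607, 101                 # q divides p - 1 = 606 = 2 * 3 * 101
g = pow(2, (p - 1) // q, p)     # generator of the order-q subgroup

def H(*parts) -> int:
    return int.from_bytes(hashlib.sha256(b"|".join(parts)).digest(), "big") % q

def keygen():
    x = secrets.randbelow(q - 1) + 1
    return x, pow(g, x, p)                      # (signing key, public key)

def sign(x, msg: bytes):
    k = secrets.randbelow(q - 1) + 1
    r = pow(g, k, p)
    e = H(str(r).encode(), msg)
    return e, (k + e * x) % q                   # signature (e, s)

def verify(X, msg: bytes, sig) -> bool:
    e, s = sig
    r = (pow(g, s, p) * pow(X, -e, p)) % p      # g^s * X^(-e) recovers r
    return e == H(str(r).encode(), msg)

x, X = keygen()
assert verify(X, b"hello", sign(x, b"hello"))
```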
Abstract:
This study demonstrates a novel method for testing the hypothesis that variations in primary and secondary particle number concentration (PNC) in urban air are related to residual fuel oil combustion at a coastal port lying 30 km upwind, by examining the correlation between PNC and airborne particle composition signatures chosen for their sensitivity to the elemental contaminants present in residual fuel oil. Residual fuel oil combustion indicators were chosen by comparing the sensitivity of a range of concentration ratios to airborne emissions originating from the port. The most responsive were combinations of the vanadium and sulfur concentrations ([V], [S]) expressed as ratios with respect to the black carbon concentration ([BC]). These correlated significantly with ship activity at the port and with the fraction of time during which the wind blew from the port. The average [V] when the wind was predominantly from the port was 0.52 ng m⁻³ (87%) higher than the average for all wind directions and 0.83 ng m⁻³ (280%) higher than that for the lowest vanadium-yielding wind direction, considered to approximate the natural background. Shipping was found to be the main source of V impacting urban air quality in Brisbane. However, contrary to the stated hypothesis, increases in PNC-related measures did not correlate with ship emission indicators or ship traffic. Hence, at this site, ship emissions were not found to be a major contributor to PNC compared to other fossil fuel combustion sources such as road traffic, airport and refinery emissions.
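A minimal sketch of the indicator computation described above: ratio the vanadium and sulfur concentrations to black carbon and correlate the ratios (and PNC) with ship activity. The data and column names below are synthetic placeholders, not the study's measurements.

```python
# Sketch of the ship-emission indicator computation: [V]/[BC] and [S]/[BC]
# ratios correlated with ship activity. Data and column names are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 500                                   # e.g. hourly observations
ships_in_port = rng.poisson(3, n)

df = pd.DataFrame({
    "ships_in_port": ships_in_port,
    "V_ng_m3":  0.3 + 0.15 * ships_in_port + rng.normal(0, 0.1, n),
    "S_ug_m3":  0.5 + 0.10 * ships_in_port + rng.normal(0, 0.2, n),
    "BC_ug_m3": 1.0 + rng.normal(0, 0.2, n),
    "PNC_cm3":  8000 + rng.normal(0, 1500, n),   # unrelated to shipping here
})

df["V_to_BC"] = df["V_ng_m3"] / df["BC_ug_m3"]
df["S_to_BC"] = df["S_ug_m3"] / df["BC_ug_m3"]

# Correlate each indicator (and PNC) with ship activity.
print(df[["V_to_BC", "S_to_BC", "PNC_cm3"]].corrwith(df["ships_in_port"]))
```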
Abstract:
This article presents a study of how humans perceive and judge the relevance of documents. Humans are adept at making reasonably robust and quick decisions about what information is relevant to them, despite the ever-increasing complexity and volume of their surrounding information environment. The literature on document relevance has identified various dimensions of relevance (e.g., topicality, novelty, etc.); however, little is understood about how these dimensions may interact. We performed a crowdsourced study of how human subjects judge two relevance dimensions in relation to document snippets retrieved from an internet search engine. The order of the judgments was controlled. For those judgments exhibiting an order effect, a q-test was performed to determine whether the order effects can be explained by a quantum decision model based on incompatible decision perspectives. Some evidence of incompatibility was found, which suggests that incompatible decision perspectives are appropriate for explaining interacting dimensions of relevance in such instances.
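The sketch below illustrates one commonly cited order-effect check from the quantum-cognition literature (the QQ equality: the rate of "disagreeing" answers should be the same in both judgment orders). The counts are invented, and the paper's exact q-test statistic may differ from this simple q.

```python
# Minimal sketch of an order-effect check in the spirit of the q-test above:
# under the QQ (quantum question) equality, the probability of "disagreeing"
# answers is predicted to be equal in both judgment orders. Counts are made up.
counts_AB = {"yy": 41, "yn": 19, "ny": 22, "nn": 18}   # dimension A judged first
counts_BA = {"yy": 44, "yn": 11, "ny": 30, "nn": 15}   # dimension B judged first

def disagree_rate(counts):
    total = sum(counts.values())
    return (counts["yn"] + counts["ny"]) / total

q = disagree_rate(counts_AB) - disagree_rate(counts_BA)
print(f"q = {q:.3f}  (q near 0 is consistent with the quantum-model prediction)")
```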
Abstract:
Motivated by privacy issues associated with dissemination of signed digital certificates, we define a new type of signature scheme called a ‘Universal Designated-Verifier Signature’ (UDVS). A UDVS scheme can function as a standard publicly-verifiable digital signature but has additional functionality which allows any holder of a signature (not necessarily the signer) to designate the signature to any desired designated-verifier (using the verifier’s public key). Given the designated-signature, the designated-verifier can verify that the message was signed by the signer, but is unable to convince anyone else of this fact. We propose an efficient deterministic UDVS scheme constructed using any bilinear group-pair. Our UDVS scheme functions as a standard Boneh-Lynn-Shacham (BLS) signature when no verifier-designation is performed, and is therefore compatible with the key-generation, signing and verifying algorithms of the BLS scheme. We prove that our UDVS scheme is secure in the sense of our unforgeability and privacy notions for UDVS schemes, under the Bilinear Diffie-Hellman (BDH) assumption for the underlying group-pair, in the random-oracle model. We also demonstrate a general constructive equivalence between a class of unforgeable and unconditionally-private UDVS schemes having unique signatures (which includes the deterministic UDVS schemes) and a class of ID-Based Encryption (IBE) schemes which contains the Boneh-Franklin IBE scheme but not the Cocks IBE scheme.
Abstract:
A combination of laser plasma ablation and strain control in CdO/ZnO heterostructures is used to produce and stabilize a metastable wurtzite CdO nanophase. According to the Raman selection rules, this nanophase is Raman-active whereas the thermodynamically preferred rocksalt phase is inactive. The wurtzite-specific and thickness/strain-dependent Raman fingerprints and phonon modes are identified and can be used for reliable and inexpensive nanophase detection. The wurtzite nanophase formation is also confirmed by x-ray diffractometry. The demonstrated ability of the metastable phase and phonon mode control in CdO/ZnO heterostructures is promising for the development of next-generation light emitting sources and exciton-based laser diodes.
Abstract:
This paper is about localising across extreme lighting and weather conditions. We depart from the traditional point-feature-based approach because matching under dramatic appearance changes is brittle and difficult. Point-feature detectors are fixed and rigid procedures which pass over an image examining small, low-level structure such as corners or blobs. They apply the same criteria to all images of all places. This paper takes a contrary view and asks what is possible if instead we learn a bespoke detector for every place. Our localisation task then turns into curating a large bank of spatially indexed detectors, and we show that this yields vastly superior robustness in exchange for a reduced but tolerable metric precision. We present an unsupervised system that produces broad-region detectors for distinctive visual elements, called scene signatures, which can be associated across almost all appearance changes. We show, using 21 km of data collected over a period of 3 months, that our system is capable of producing metric localisation estimates from night-to-day or summer-to-winter conditions.
Abstract:
Selumetinib (AZD6244, ARRY-142886) is a selective, non-ATP-competitive inhibitor of mitogen-activated protein/extracellular signal-regulated kinase kinase (MEK)-1/2. The range of antitumor activity seen preclinically and in patients highlights the importance of identifying determinants of response to this drug. In large tumor cell panels of diverse lineage, we show that MEK inhibitor response does not have an absolute correlation with mutational or phospho-protein markers of BRAF/MEK, RAS, or phosphoinositide 3-kinase (PI3K) activity. We aimed to enhance predictivity by measuring pathway output through coregulated gene networks displaying differential mRNA expression exclusive to resistant cell subsets and correlated to mutational or dynamic pathway activity. We discovered an 18-gene signature enabling measurement of MEK functional output independent of tumor genotype. Where the MEK pathway is activated but the cells remain resistant to selumetinib, we identified a 13-gene signature that implicates the existence of compensatory signaling from RAS effectors other than PI3K. The ability of these signatures to stratify samples according to functional activation of MEK and/or selumetinib sensitivity was shown in multiple independent melanoma, colon, breast, and lung tumor cell lines and in xenograft models. Furthermore, we were able to measure these signatures in fixed archival melanoma tumor samples using a single RT-qPCR-based test and found intergene correlations and associations with genetic markers of pathway activity to be preserved. These signatures offer useful tools for the study of MEK biology and clinical application of MEK inhibitors, and the novel approaches taken may benefit other targeted therapies.
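As a generic illustration of signature-based stratification (not the authors' scoring method, which is derived from coregulated gene networks and measured by RT-qPCR), the sketch below scores synthetic samples by the mean z-score of a stand-in 18-gene signature and splits them into high- and low-output groups.

```python
# Generic sketch of stratifying samples with a gene-expression signature score
# (mean z-score across the signature genes). Data are synthetic and the scoring
# is a common simplification, not the authors' exact method.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
signature_genes = [f"GENE_{i}" for i in range(18)]   # stand-in for the 18-gene signature
samples = [f"sample_{i}" for i in range(40)]

# Synthetic expression matrix (samples x genes).
expr = pd.DataFrame(rng.normal(size=(len(samples), len(signature_genes))),
                    index=samples, columns=signature_genes)

z = (expr - expr.mean()) / expr.std()                # per-gene z-scores
score = z.mean(axis=1)                               # signature score per sample

# Stratify: e.g. call the top tertile "pathway output high".
high = score >= score.quantile(2 / 3)
print(score.describe())
print(f"high-output samples: {high.sum()} / {len(samples)}")
```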
Abstract:
The use of ‘topic’ concepts has been shown to improve search performance for a given query by bringing together relevant documents that use different terms to describe the same higher-level concept. In this paper, we propose a method for discovering and utilizing concepts in indexing and search for a domain-specific document collection used in industry. This approach differs from others in that we collect only focused concepts to build the concept space, and instead of turning a user’s query into a concept-based query, we experiment with different techniques for combining the original query with a concept query. We apply the proposed approach to a real-world document collection, and the results show that in this scenario the use of concept knowledge at indexing and search time can improve the relevance of results.
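A minimal sketch of one way to combine the original query with a concept query at search time, by linearly interpolating BM25 scores. The toy collection, the concept expansion and the weight are illustrative assumptions; the paper experiments with several combination techniques, which may differ from this one.

```python
# Sketch: combine an original query with a concept-based query at search time
# by interpolating their BM25 scores (requires the rank_bm25 package).
from rank_bm25 import BM25Okapi

docs = [
    "turbine blade inspection procedure for hairline cracks",
    "scheduled maintenance of gas turbine compressors",
    "office safety induction checklist",
]
bm25 = BM25Okapi([d.split() for d in docs])

original_query = "blade crack".split()
concept_query = "turbine inspection maintenance".split()   # terms from a matched concept

lam = 0.3                                                   # weight on the concept query
combined = [(1 - lam) * o + lam * c
            for o, c in zip(bm25.get_scores(original_query),
                            bm25.get_scores(concept_query))]

for score, doc in sorted(zip(combined, docs), reverse=True):
    print(f"{score:6.3f}  {doc}")
```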
Abstract:
Clustering is an important technique for organising and categorising web-scale document collections. The main challenges in clustering the billions of documents available on the web are the processing power required and the sheer size of the datasets. More importantly, it is nigh impossible to generate labels for a general web document collection containing billions of documents and a vast taxonomy of topics; however, document clusters are most commonly evaluated by comparison to a ground-truth set of labels. This paper presents a clustering and labeling solution in which Wikipedia is clustered and hundreds of millions of web documents in ClueWeb12 are mapped onto those clusters. The solution is based on the assumption that Wikipedia covers such a wide range of diverse topics that it represents a small-scale web. We found that it was possible to perform the web-scale document clustering and labeling process on one desktop computer in under a couple of days for the Wikipedia clustering solution containing about 1,000 clusters; solutions with finer-granularity clusterings, such as 10,000 or 50,000 clusters, take longer to execute. These results were evaluated using a set of external data.
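The sketch below illustrates the two-stage idea on a toy scale: cluster a Wikipedia-like reference corpus, then map other web documents onto the learned clusters by nearest centroid. The corpus, vectoriser settings and number of clusters are stand-ins, not the paper's pipeline.

```python
# Sketch: cluster a Wikipedia-like reference corpus, then assign other web
# documents to the learned clusters by nearest centroid (scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import MiniBatchKMeans

wikipedia_like = [
    "association football is a team sport played with a spherical ball",
    "the electric guitar is a stringed musical instrument",
    "a stock exchange is a marketplace for trading shares",
    "basketball is a team sport played on a rectangular court",
]
web_docs = [
    "live scores and fixtures for tonight's football matches",
    "beginner guitar chords and practice exercises",
]

vectorizer = TfidfVectorizer(stop_words="english")
X_ref = vectorizer.fit_transform(wikipedia_like)

kmeans = MiniBatchKMeans(n_clusters=3, random_state=0, n_init=10).fit(X_ref)

# Describe each cluster by its reference members, then assign web documents.
members = {c: [d for d, l in zip(wikipedia_like, kmeans.labels_) if l == c]
           for c in range(3)}
for doc, c in zip(web_docs, kmeans.predict(vectorizer.transform(web_docs))):
    print(f"cluster {c}: {doc!r}  (reference members: {members[c]})")
```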
Abstract:
At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to second preimage resistance rather than on the collision resistance property of the hash functions. One of the randomized hash function modes was named the RMX hash function mode and was recommended for practical purposes. The National Institute of Standards and Technology (NIST), USA standardized a variant of the RMX hash function mode and published this standard in Special Publication (SP) 800-106. In this article, we first discuss a generic online birthday existential forgery attack due to Dang and Perlner on the RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge the other randomize-hash-then-sign schemes. We point out practical limitations of the generic forgery attack on the RMX-hash-then-sign schemes. We then show that these limitations can be overcome for the RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, as is the case for the Davies-Meyer construction used in popular hash functions such as MD5, designed by Rivest, and the SHA family of hash functions, designed by the National Security Agency (NSA), USA and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures by using a variant of Dean’s method of finding fixed-point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack is also applicable to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and their applicability to signature schemes based on hash functions with ‘built-in’ randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
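The fixed-point property exploited above is easy to demonstrate. For a Davies-Meyer compression function h_i = E_m(h_{i-1}) XOR h_{i-1}, the value h = D_m(0) is a fixed point for any message block m, since E_m(h) XOR h = 0 XOR h = h. The sketch below checks this with AES standing in as the block cipher (it assumes the pycryptodome package; it is an illustration of the property, not the attack itself).

```python
# Davies-Meyer fixed points: h = D_m(0) satisfies E_m(h) XOR h = h for any
# message block m. AES is only a stand-in block cipher (needs pycryptodome).
from Crypto.Cipher import AES

def davies_meyer(h: bytes, m: bytes) -> bytes:
    e = AES.new(m, AES.MODE_ECB).encrypt(h)
    return bytes(a ^ b for a, b in zip(e, h))

message_block = b"any 16-byte blk!"                      # acts as the AES key
fixed_point = AES.new(message_block, AES.MODE_ECB).decrypt(b"\x00" * 16)

assert davies_meyer(fixed_point, message_block) == fixed_point
print("fixed point found:", fixed_point.hex())
```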
Abstract:
Halevi and Krawczyk proposed a message randomization algorithm called RMX as a front-end tool to hash-then-sign digital signature schemes such as DSS and RSA, in order to free their reliance on the collision resistance property of the hash functions. They have shown that to forge an RMX-hash-then-sign signature scheme, one has to solve a cryptanalytical task which is related to finding second preimages for the hash function. In this article, we show how to use Dean’s method of finding expandable messages for finding a second preimage in the Merkle-Damgård hash function to existentially forge a signature scheme based on a t-bit RMX-hash function which uses Davies-Meyer compression functions (e.g., MD4, MD5, the SHA family) in 2^(t/2) chosen messages plus 2^(t/2+1) off-line operations of the compression function and a similar amount of memory. This forgery attack also works on signature schemes that use Davies-Meyer schemes and a variant of RMX published by NIST in its Draft Special Publication (SP) 800-106. We discuss some important applications of our attack.
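As a rough worked instance of the stated costs (an illustration only, not a figure from the paper), take t = 160, as for a SHA-1-sized hash:

```latex
2^{t/2} = 2^{80} \ \text{chosen messages}, \qquad
2^{t/2+1} = 2^{81} \ \text{off-line compression-function computations}.
```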