170 results for Signature


Relevance:

20.00%

Publisher:

Abstract:

Standard signature schemes are usually designed only to achieve weak unforgeability, i.e. preventing forgery of signatures on new messages not previously signed. However, most signature schemes are randomised and allow many possible signatures for a single message, in which case it may be possible to produce a new signature on a previously signed message. Some applications require that this type of forgery also be prevented; this requirement is called strong unforgeability. At PKC 2006, Boneh, Shen, and Waters presented an efficient transform, based on any randomised trapdoor hash function, which converts a weakly unforgeable signature into a strongly unforgeable one, and applied it to construct a strongly unforgeable signature based on the CDH problem. However, the transform of Boneh et al. applies only to a class of so-called partitioned signatures. Although many schemes fall into this class, some do not, for example DSA. Hence it is natural to ask whether one can obtain a truly generic efficient transform, based on any randomised trapdoor hash function, which converts any weakly unforgeable signature into a strongly unforgeable one. We answer this question in the affirmative by presenting a simple modification of the Boneh-Shen-Waters transform. Our modified transform uses two randomised trapdoor hash functions.
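The randomised trapdoor hash functions the transform relies on are often instantiated as chameleon hashes. Purely as an illustration of that primitive (a toy discrete-log instance with insecure parameter sizes; this is not the paper's construction), a minimal sketch:

```python
# Toy discrete-log chameleon (randomised trapdoor) hash. p = 2q + 1 with q
# prime; g generates the order-q subgroup. Parameters far too small to be secure.
p, q, g = 2039, 1019, 4

def keygen(x):
    """Trapdoor x in [1, q-1]; public hash key h = g^x mod p."""
    return pow(g, x, p)

def chash(h, m, r):
    """H(m, r) = g^m * h^r mod p: randomised, many (m, r) pairs per digest."""
    return (pow(g, m, p) * pow(h, r, p)) % p

def collide(x, m, r, m2):
    """Only the trapdoor holder can do this: find r2 with H(m2, r2) = H(m, r).
    Since H(m, r) = g^(m + x*r), solve m + x*r = m2 + x*r2 (mod q)."""
    return (r + (m - m2) * pow(x, -1, q)) % q
```

Without the trapdoor x the hash behaves like an ordinary collision-resistant hash; with it, a signature's randomness can be re-bound to a new message, which is the mechanism the transform exploits.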

Relevance:

20.00%

Publisher:

Abstract:

In this paper we present an original approach for finding approximate nearest neighbours in collections of locality-sensitive hashes. The paper demonstrates that this approach makes high-performance nearest-neighbour searching feasible on Web-scale collections and commodity hardware with minimal degradation in search quality.
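As a hedged illustration of the underlying idea only (random-hyperplane bit signatures ranked by Hamming distance; the paper's actual index structure is not reproduced here):

```python
import random

# Random-hyperplane LSH: each vector is reduced to a bit signature, and
# approximate nearest neighbours are found by Hamming distance between
# signatures rather than by exact distance in the original space.
def signature(vec, planes):
    """One bit per hyperplane: 1 if vec lies on its non-negative side."""
    bits = 0
    for plane in planes:
        dot = sum(v * w for v, w in zip(vec, plane))
        bits = (bits << 1) | (dot >= 0)
    return bits

def hamming(a, b):
    return bin(a ^ b).count("1")

def nearest(query_sig, sigs):
    """Index of the stored signature with the fewest differing bits."""
    return min(range(len(sigs)), key=lambda i: hamming(query_sig, sigs[i]))

rng = random.Random(1)
planes = [[rng.gauss(0, 1) for _ in range(3)] for _ in range(32)]
docs = [[1.0, 0.1, 0.0], [0.0, 1.0, 0.2], [0.9, 0.2, 0.1]]
sigs = [signature(d, planes) for d in docs]
```

Because signatures are fixed-width integers, the Hamming comparison is a cheap XOR and popcount, which is what makes searching large collections on commodity hardware feasible.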

Relevance:

20.00%

Publisher:

Abstract:

The proliferation of the web presents an unsolved problem: automatically analyzing billions of pages of natural language. We introduce a scalable algorithm that clusters hundreds of millions of web pages into hundreds of thousands of clusters, on a single mid-range machine, using efficient algorithms and compressed document representations. It is applied to two web-scale crawls covering tens of terabytes: ClueWeb09 and ClueWeb12 contain 500 and 733 million web pages respectively and were clustered into 500,000 to 700,000 clusters. To the best of our knowledge, such fine-grained clustering has not been demonstrated previously. Previous approaches clustered a sample, which limits the maximum number of discoverable clusters. The proposed EM-tree algorithm uses the entire collection in clustering and produces several orders of magnitude more clusters than existing algorithms. Fine-grained clustering is necessary for meaningful clustering in massive collections, where the number of distinct topics grows linearly with collection size. These fine-grained clusters show improved cluster quality when assessed with two novel evaluations using ad hoc search relevance judgments and spam classifications for external validation. These evaluations solve the problem of assessing cluster quality where categorical labeling is unavailable or infeasible.
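As a simplified illustration of the tree-of-means idea only (recursive k-means on toy 2-D points; the published EM-tree operates over compressed bit-signature representations and is not reproduced here):

```python
import random

# Recursive k-means splits the collection until clusters fall below a leaf
# size, so the number of leaf clusters grows with collection size.
def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, rng, iters=10):
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: dist2(p, centroids[i]))].append(p)
        centroids = [[sum(col) / len(c) for col in zip(*c)] if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return clusters

def cluster_tree(points, k, leaf_size, rng):
    if len(points) <= leaf_size:
        return [points]
    leaves = []
    for c in kmeans(points, k, rng):
        if 0 < len(c) < len(points):       # guard against degenerate splits
            leaves.extend(cluster_tree(c, k, leaf_size, rng))
        elif c:
            leaves.append(c)
    return leaves

rng = random.Random(0)
pts = [[rng.random(), rng.random()] for _ in range(60)]
leaves = cluster_tree(pts, 2, 5, rng)
```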

Relevance:

20.00%

Publisher:

Abstract:

Email is rapidly replacing other channels as the preferred means of communication between contracting parties. The recent decision in Stellard Pty Ltd v North Queensland Fuel Pty Ltd [2015] QSC 119 reinforces judicial acceptance of email as an effective means of creating a binding agreement, and the courts' willingness to adopt a liberal concept of ‘signing’ in an electronic environment.

Relevance:

20.00%

Publisher:

Abstract:

Background: Multiple sclerosis (MS) is thought to be a T cell-mediated autoimmune disorder. MS pathogenesis is likely due to a genetic predisposition triggered by a variety of environmental factors. Epigenetics, particularly DNA methylation, provides a logical interface through which environmental factors can influence the genome. In this study we aimed to identify DNA methylation changes associated with MS in CD8+ T cells from 30 relapsing-remitting MS patients and 28 healthy blood donors, using Illumina 450K methylation arrays. Findings: Seventy-nine differentially methylated CpGs were associated with MS. The methylation profile of CD8+ T cells was distinct from our previously published data on CD4+ T cells in the same cohort. Most notably, there was no major CpG effect at the MS risk locus HLA-DRB1 in the CD8+ T cells. Conclusion: CD8+ T cells and CD4+ T cells have distinct DNA methylation profiles. This case–control study highlights the importance of examining distinct cell subtypes when investigating epigenetic changes in MS and other complex diseases.

Relevance:

10.00%

Publisher:

Abstract:

The research presented in this thesis addresses inherent problems in signature-based intrusion detection systems (IDSs) operating in heterogeneous environments, proposing a solution to the difficulties associated with multi-step attack scenario specification and detection in such environments. The research focuses on two distinct problems: the representation of events derived from heterogeneous sources, and multi-step attack specification and detection.

The first part of the research investigates the application of an event abstraction model to event logs collected from a heterogeneous environment. The event abstraction model comprises a hierarchy of events derived from different log sources, such as system audit data, application logs, captured network traffic, and intrusion detection system alerts. Unlike existing event abstraction models, where low-level information may be discarded during the abstraction process, the model presented in this work preserves all low-level information while also providing high-level information in the form of abstract events. The model was designed independently of any particular IDS and may therefore be used by any IDS, intrusion forensic tool, or monitoring tool.

The second part of the research investigates the use of unification for multi-step attack scenario specification and detection. Multi-step attack scenarios are hard to specify and detect, as they often involve correlating events from multiple sources which may be affected by time uncertainty. The unification algorithm provides a simple and straightforward scenario matching mechanism through variable instantiation, where variables represent events as defined in the event abstraction model.

The third part of the research addresses time uncertainty. Clock synchronisation is crucial for detecting multi-step attack scenarios which involve logs from multiple hosts, yet issues involving time uncertainty have been largely neglected by intrusion detection research. The system presented in this research introduces two techniques for addressing them: clock skew compensation and clock drift modelling using linear regression.

An off-line IDS prototype for detecting multi-step attacks has been implemented. The prototype comprises two modules: an implementation of the abstract event system architecture (AESA), and the scenario detection module, which implements our signature language (developed based on Python syntax) and the unification-based scenario detection engine. The prototype has been evaluated using a publicly available dataset of real attack traffic and event logs, and a synthetic dataset. A distinctive feature of the public dataset is that it contains multi-step attacks involving multiple hosts with clock skew and clock drift, which allows us to demonstrate the application and advantages of this research's contributions. All instances of multi-step attacks in the dataset were correctly identified despite significant clock skew and drift.

Future work identified by this research includes developing a refined unification algorithm suitable for processing streams of events, to enable on-line detection, and, for time uncertainty, developing mechanisms which allow automatic clock skew and clock drift identification and correction. The immediate application of the research presented in this thesis is the framework of an off-line IDS which processes events from heterogeneous sources using abstraction and which can detect multi-step attack scenarios that may involve time uncertainty.
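The clock drift modelling via linear regression mentioned above can be sketched with ordinary least squares; the offset(t) = skew + drift·t model form and the synthetic observations below are illustrative assumptions, not the thesis's implementation:

```python
# Ordinary least squares fit of observed clock offset against time, then a
# correction mapping a drifting host timestamp back onto the reference timeline.
def fit_drift(times, offsets):
    n = len(times)
    mt, mo = sum(times) / n, sum(offsets) / n
    drift = (sum((t - mt) * (o - mo) for t, o in zip(times, offsets))
             / sum((t - mt) ** 2 for t in times))
    skew = mo - drift * mt
    return skew, drift

def correct(timestamp, skew, drift):
    """Remove the modelled skew and drift from a host timestamp."""
    return timestamp - (skew + drift * timestamp)

times = [0.0, 10.0, 20.0, 30.0, 40.0]        # synthetic observation times (s)
offsets = [2.0 + 0.001 * t for t in times]   # 2 s skew, 1 ms/s drift
skew, drift = fit_drift(times, offsets)
```

Once fitted, every event timestamp from that host is corrected before scenario matching, so events from multiple hosts can be ordered on a common timeline.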

Relevance:

10.00%

Publisher:

Abstract:

An interpretative methodology for understanding meaning in cinema since the 1950s, auteur analysis is an approach to film studies in which an individual, usually the director, is studied as the author of her or his films. The principal argument of this thesis is that proponents of auteurism have privileged examination of the visual components of a film-maker's body of work, neglecting the potentially significant role played by sound. The thesis seeks to address this imbalance by interrogating the creative use of sound in the films written and directed by Rolf de Heer, asking the question: does his use of sound make Rolf de Heer an aural auteur? In so far as the term ‘aural’ encompasses everything in the film that is heard by the audience, the analysis seeks to discover whether de Heer has, as Peter Wollen suggests of the auteur and her or his directing of the visual components (1968, 1972 and 1998), unconsciously left a detectable aural signature on his films. The thesis delivers an innovative outcome by demonstrating that auteur analysis which goes beyond the mise-en-scène (i.e. the visuals) is a productive and worthwhile interpretive response to film. De Heer's use of the aural point of view and binaural sound recording, his interest in providing a ‘voice’ for marginalised people, his self-penned song lyrics, his close and early collaboration with composer Graham Tardif and sound designer Jim Currie, his ‘hands-on’ approach to sound recording and editing, and his predilection for making films about sound are all shown to be examples of his aural auteurism. In addition to three published (or accepted for publication) interviews with de Heer, Tardif and Currie, the dissertation comprises seven papers refereed and published (or accepted for publication) in journals and international conference proceedings, a literature review, and a unifying essay. The papers present close textual analyses of de Heer's films which, considered as a whole, support the thesis's overall argument and serve as a comprehensive auteur analysis: the first such sustained study of his work, and the first with an emphasis on the aural.

Relevance:

10.00%

Publisher:

Abstract:

This document describes algorithms based on Elliptic Curve Cryptography (ECC) for use within the Secure Shell (SSH) transport protocol. In particular, it specifies Elliptic Curve Diffie-Hellman (ECDH) key agreement, Elliptic Curve Menezes-Qu-Vanstone (ECMQV) key agreement, and the Elliptic Curve Digital Signature Algorithm (ECDSA) for use in the SSH Transport Layer protocol.
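The ECDH key agreement at the heart of the document can be illustrated on a textbook curve; the curve y² = x³ + 2x + 2 over F₁₇ with base point (5, 1) of order 19 and the private keys below are toy choices for exposition, not the curves the SSH specification mandates:

```python
# Toy elliptic-curve Diffie-Hellman. Illustrative only; real implementations
# use standardised curves and constant-time library code.
P_MOD, A = 17, 2     # curve y^2 = x^3 + 2x + 2 over F_17
G = (5, 1)           # base point of order 19

def point_add(P, Q):
    """Add two curve points; None is the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                    # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    y3 = (lam * (x1 - x3) - y1) % P_MOD
    return (x3, y3)

def scalar_mult(k, P):
    """Double-and-add: compute k*P."""
    R = None
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

alice_pub = scalar_mult(3, G)        # private keys 3 and 7 (toy values)
bob_pub = scalar_mult(7, G)
shared_a = scalar_mult(3, bob_pub)   # both parties reach the same point 21*G
shared_b = scalar_mult(7, alice_pub)
```

Each party sends only its public point; the shared point is then fed to a key-derivation step, which in SSH produces the session keys.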

Relevance:

10.00%

Publisher:

Abstract:

We provide the first description of, and security model for, authenticated key exchange protocols with predicate-based authentication. In addition to the standard goal of session key security, our security model also provides for credential privacy: a participating party learns nothing more about the other party's credentials than whether they satisfy the given predicate. Our model also encompasses attribute-based key exchange, since it is a special case of predicate-based key exchange.

We demonstrate how to realize a secure predicate-based key exchange protocol by combining any secure predicate-based signature scheme with the basic Diffie-Hellman key exchange protocol, providing an efficient and simple solution.
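The signed-DH pattern underlying the construction can be sketched with a toy Schnorr signature standing in for the predicate-based scheme (that substitution and the group parameters are assumptions for illustration; the paper's actual scheme is not reproduced):

```python
import hashlib
import random

# Each party signs its Diffie-Hellman share with its long-term key, so the
# shares are authenticated. Insecure toy parameters: p = 2q + 1, g generates
# the order-q subgroup.
p, q, g = 2039, 1019, 4

def h_int(*vals):
    data = ":".join(str(v) for v in vals).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def schnorr_sign(x, msg, rng):
    k = rng.randrange(1, q)             # per-signature nonce
    big_r = pow(g, k, p)
    e = h_int(big_r, msg)
    return big_r, (k + e * x) % q       # (R, s) with s = k + e*x mod q

def schnorr_verify(y, msg, sig):
    big_r, s = sig
    return pow(g, s, p) == (big_r * pow(y, h_int(big_r, msg), p)) % p

rng = random.Random(7)
x_a, x_b = rng.randrange(1, q), rng.randrange(1, q)   # long-term signing keys
y_a, y_b = pow(g, x_a, p), pow(g, x_b, p)
e_a, e_b = rng.randrange(1, q), rng.randrange(1, q)   # ephemeral DH exponents
share_a, share_b = pow(g, e_a, p), pow(g, e_b, p)
sig_a = schnorr_sign(x_a, share_a, rng)               # sign the DH shares
sig_b = schnorr_sign(x_b, share_b, rng)
key_a = pow(share_b, e_a, p)                          # both derive g^(e_a*e_b)
key_b = pow(share_a, e_b, p)
```

In the predicate-based setting, verifying the signature would additionally convince each party that the signer's credentials satisfy the predicate, without revealing which credentials were used.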

Relevance:

10.00%

Publisher:

Abstract:

Interdisciplinary studies are fundamental to the signature practices of the middle years of schooling. Middle-years researchers claim that interdisciplinarity in teaching meets the needs of early adolescents by tying concepts together, providing frameworks for the relevance of knowledge, and demonstrating how disparate information can be linked to solve novel problems. Cognitive research is not wholeheartedly supportive of this position: learning theorists assert that applying knowledge in novel situations to solve problems actually depends on deep discipline-based understandings. The present research contrasts the abilities of early adolescent students from discipline-based and interdisciplinary curriculum contexts to successfully solve multifaceted real-world problems. This will inform the development of effective management of the middle years curriculum.

Relevance:

10.00%

Publisher:

Abstract:

This approach to sustainable design explores the possibility of creating an architectural design process which can iteratively produce optimised and sustainable design solutions. Driven by an evolutionary process based on genetic algorithms, the system allows the designer to “design the building design generator” rather than to “design the building”. The design concept is abstracted into a digital design schema, which allows the transfer of the human creative vision into the rational language of a computer. The schema is then elaborated through genetic algorithms to evolve innovative, performative and sustainable design solutions. The prioritisation of the project's constraints, and the design solutions subsequently synthesised during design generation, are expected to resolve most of the major conflicts in the evaluation and optimisation phases. Mosques are used as the example building typology to ground the research activity. The spatial organisations of various mosque typologies are represented graphically by adjacency constraints between spaces. Each configuration is represented by a planar graph, which is then translated into a non-orthogonal dual graph and fed into the genetic algorithm system with fixed constraints and expected performance criteria set to govern evolution. The resultant Hierarchical Evolutionary Algorithmic Design System is developed by linking the evaluation process with environmental assessment tools to rank candidate designs. The proposed system generates the concept, the seed, and the schema, with environmental performance as one of the main criteria driving optimisation.
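A minimal genetic-algorithm loop of the kind that could drive such a design generator; the bit-string genome and the fitness function here are stand-ins for the design schema and the environmental-performance ranking (assumptions for illustration, not the thesis's system):

```python
import random

# Bit-string genomes, elitist truncation selection, one-point crossover,
# and single-bit mutation: the core evolve-evaluate-select cycle.
def evolve(fitness, genome_len=20, pop_size=30, generations=60, seed=3):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]           # keep the fittest half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, genome_len)     # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(genome_len)] ^= 1  # single-bit mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve(sum)  # toy objective: maximise the number of 1-bits
```

In the thesis's setting, decoding a genome would yield a candidate spatial configuration, and the fitness call would be replaced by the environmental assessment tools ranking that candidate.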

Relevance:

10.00%

Publisher:

Abstract:

ERP systems generally implement controls to prevent certain common kinds of fraud. However, as evidenced by the legal requirement for company audits and the common incidence of fraud, there is also a pressing need to detect more sophisticated patterns of fraudulent activity. This paper describes the design and implementation of a framework for detecting such patterns in ERP systems. We describe six fraud scenarios and the process of specifying and detecting their occurrence in ERP user log data using the prototype software we have developed. The test results for detecting these scenarios in log data have been verified and confirm the success of our approach, which can be generalized to ERP systems broadly.
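As a hypothetical illustration of specifying and detecting one such scenario over user log data (the field names and the scenario encoding below are assumptions, not the paper's prototype or its six scenarios):

```python
# One classic segregation-of-duties fraud pattern: the same user creates a
# vendor record and later approves a payment to that vendor.
def detect_self_approval(log):
    created = {}  # vendor id -> user who created it
    hits = []
    for entry in log:
        if entry["action"] == "create_vendor":
            created[entry["vendor"]] = entry["user"]
        elif entry["action"] == "approve_payment":
            if created.get(entry["vendor"]) == entry["user"]:
                hits.append(entry)    # creator approving their own vendor
    return hits

log = [
    {"user": "u1", "action": "create_vendor", "vendor": "v9"},
    {"user": "u2", "action": "approve_payment", "vendor": "v9"},
    {"user": "u1", "action": "approve_payment", "vendor": "v9"},
]
suspicious = detect_self_approval(log)
```

A full framework would express each scenario declaratively and match it against the log stream, but the single-pass state-tracking shape is the same.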

Relevance:

10.00%

Publisher:

Abstract:

Scaffolds manufactured from biological materials promise better clinical functionality, provided that their characteristic features are preserved. Collagen, a prominent biopolymer, is used extensively for tissue engineering applications because its signature biological and physico-chemical properties are retained in in vitro preparations. We show here for the first time that the very properties that have established collagen as the leading natural biomaterial are lost when it is electro-spun into nano-fibres out of fluoroalcohols such as 1,1,1,3,3,3-hexafluoro-2-propanol or 2,2,2-trifluoroethanol, and we identify the use of fluoroalcohols as the major culprit. The resultant nano-scaffolds lack the unique ultra-structural axial periodicity that confirms quarter-staggered supramolecular assembly, as well as the capacity to generate second-harmonic signals representing the typical crystalline triple-helical structure. They were also characterised by low denaturation temperatures, similar to those obtained from gelatin preparations (p > 0.05). Likewise, circular dichroism spectra revealed extensive denaturation of the electro-spun collagen. Using pepsin digestion in combination with quantitative SDS-PAGE, we corroborate losses of up to 99% of triple-helical collagen. In conclusion, electro-spinning collagen out of fluoroalcohols effectively denatures this biopolymer and thus appears to defeat its purpose, namely to create biomimetic scaffolds emulating the collagen structure and function of the extracellular matrix.