251 results for Entity Authentication


Relevance: 10.00%

Abstract:

As a concept, the magic circle is in reality just 4 years old. Whilst often attributed to Johan Huizinga (1955), the modern usage of the term in truth belongs to Katie Salen and Eric Zimmerman. It became established in academia following the publication of “Rules of Play” in 2003. Because of the terminology used, it carries with it unhelpful preconceptions that the game world, or play-space, excludes reality. In this paper, I argue that Salen and Zimmerman (2003) have taken a term used as an example, and applied a meaning to it that was never intended, based primarily upon definitions given by other authors, namely Apter (1991) and Sniderman (n.d.). I further argue that the definition itself contains a logical fallacy, which has prevented the full understanding of the definition in later work. Through a study of the literature in Game Theory, and examples of possible issues which could arise in contemporary games, I suggest that the emotions of the play experience continue beyond the play space, and that emotions from the “real world” enter it with the participants. I consider a reprise of the Stanley Milgram Obedience Experiment (2006), and what that tells us about human emotions and the effect that events taking place in a virtual environment can have upon them. I evaluate the opinion espoused by some authors that there are different magic circles for different players, and assert that this is not a useful approach to take when studying games, because it prevents the analysis of a game as a single entity. Furthermore, I consider the reasons given by other authors for the existence of the Magic Circle, and I assert that the term “Magic Circle” should be discarded, that it has no relevance to contemporary games, and that it indeed acts as a hindrance to the design and study of games. 
I conclude that the play space which it claims to protect from the courts and other governmental authorities would be better served by the existing concepts of intent, consent, and commonly accepted principles associated with international travel.

Relevance: 10.00%

Abstract:

We present CHURNs, a method for providing freshness and authentication assurances to human users. In computer-to-computer protocols, it has long been accepted that assurances of freshness such as random nonces are required to prevent replay attacks. Typically, no such assurance of freshness is presented to a human in a human-and-computer protocol. A Computer–HUman Recognisable Nonce (CHURN) is a computer-aided random sequence that the human has a measure of control over and input into. Our approach overcomes limitations such as ‘humans cannot do random’ and that humans will follow the easiest path. Our findings show that CHURNs are significantly more random than values produced by unaided humans; that humans may be used as a second source of randomness, and we give measurements as to how much randomness can be gained from humans using our approach; and that our CHURN-generator makes the user feel more in control, thus removing the need for complete trust in devices and underlying protocols. We give an example of how a CHURN may be used to provide assurances of freshness and authentication for humans in a widely used protocol.
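The role a random nonce plays in providing freshness can be illustrated with a minimal challenge–response sketch. This is the standard computer-to-computer pattern the abstract contrasts with, not the CHURN construction itself, and all names are illustrative:

```python
import hmac, hashlib, secrets

def make_challenge() -> bytes:
    # A fresh random nonce: a replayed old response cannot match a new challenge.
    return secrets.token_bytes(16)

def respond(key: bytes, nonce: bytes) -> bytes:
    # The prover binds its identity (the shared key) to this specific nonce.
    return hmac.new(key, nonce, hashlib.sha256).digest()

def verify(key: bytes, nonce: bytes, response: bytes) -> bool:
    return hmac.compare_digest(respond(key, nonce), response)

key = secrets.token_bytes(32)
nonce = make_challenge()
resp = respond(key, nonce)
assert verify(key, nonce, resp)
# A response captured earlier fails against a fresh nonce: the replay is detected.
assert not verify(key, make_challenge(), resp)
```

The point of a CHURN, as described above, is to give a human user the same kind of freshness assurance that the `nonce` gives the verifier here.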

Relevance: 10.00%

Abstract:

In recent years, increasing focus has been placed on making good business decisions utilizing the product of data analysis. With the advent of the Big Data phenomenon, this is more apparent than ever before. But how can organizations trust decisions made on the basis of results obtained from the analysis of untrusted data? They need assurance that the data and datasets informing these decisions have not been tainted by an outside agency. This study proposes enabling the authentication of datasets by extending the RESTful architectural scheme to include authentication parameters, while operating within a larger holistic security framework or model compliant with legislation.
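The abstract does not specify what the authentication parameters look like. As a hedged illustration only, one plausible shape is a RESTful request whose query string carries a key identifier, a timestamp, and an HMAC signature over the canonical request; the parameter names `key_id`, `ts`, and `sig` are hypothetical:

```python
import hmac, hashlib, time
from urllib.parse import urlencode

def sign_request(method: str, path: str, params: dict,
                 key_id: str, secret: bytes) -> str:
    # Hypothetical authentication parameters (key_id, ts, sig) appended to the
    # query string so a dataset server could verify origin and integrity.
    auth = dict(params, key_id=key_id, ts=str(int(time.time())))
    canonical = method + "\n" + path + "\n" + urlencode(sorted(auth.items()))
    auth["sig"] = hmac.new(secret, canonical.encode(), hashlib.sha256).hexdigest()
    return path + "?" + urlencode(sorted(auth.items()))

url = sign_request("GET", "/datasets/42", {"format": "csv"},
                   "analyst-1", b"shared-secret")
```

The server would recompute the signature from the shared secret and reject requests whose `sig` does not match or whose `ts` is stale.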

Relevance: 10.00%

Abstract:

While existing multi-biometric Dempster–Shafer theory fusion approaches have demonstrated promising performance, they do not model the uncertainty appropriately, suggesting that further improvement can be achieved. This research seeks to develop a unified framework for multimodal biometric fusion that takes advantage of the uncertainty concept of Dempster–Shafer theory, improving the performance of multi-biometric authentication systems. Modeling uncertainty as a function of the uncertainty factors affecting the recognition performance of the biometric systems helps to address the uncertainty of the data and the confidence of the fusion outcome. A weighted combination of quality measures and classifier performance (Equal Error Rate) is proposed to encode the uncertainty concept and improve the fusion. We also found that quality measures contribute unequally to the recognition performance; thus, selecting only significant factors and fusing them with a Dempster–Shafer approach to generate an overall quality score plays an important role in the success of uncertainty modeling. The proposed approach achieved competitive performance (approximately 1% EER) in comparison with other Dempster–Shafer based approaches and other conventional fusion approaches.
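Dempster's rule of combination, which underlies this kind of fusion, can be sketched on a toy two-modality example. The mass values and the quality-to-uncertainty mapping below are illustrative, not the paper's:

```python
# Toy basic probability assignments (BPAs) over {G (genuine), I (impostor)},
# with mass on the full frame Θ = {G, I} (key "T") encoding uncertainty.
def dempster_combine(m1, m2):
    combined = {"G": 0.0, "I": 0.0, "T": 0.0}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            if a == "T":
                target = b          # Θ ∩ X = X
            elif b == "T" or a == b:
                target = a
            else:                   # {G} ∩ {I} = ∅: conflicting evidence
                conflict += wa * wb
                continue
            combined[target] += wa * wb
    # Normalise out the conflict mass (Dempster's rule).
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Illustrative masses: a lower-quality face sample puts more mass on Θ.
face   = {"G": 0.6, "I": 0.1, "T": 0.3}
finger = {"G": 0.7, "I": 0.2, "T": 0.1}
fused = dempster_combine(face, finger)
```

Two agreeing modalities reinforce each other: the fused mass on "genuine" exceeds either input's, while the residual uncertainty mass shrinks.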

Relevance: 10.00%

Abstract:

Australian labour law, at least from the mid-twentieth century, was dominated by the employment paradigm: the assumption that labour law’s scope was the regulation of employment relationships – full-time and part-time, and continuing, fixed term or casual – with a single (usually corporate) entity employer. But no sooner had the employment paradigm established and consolidated its shape than it began to fall apart. Since the 1980s there has been a significant growth of patterns of work that fall outside this paradigm, driven by organisational restructuring and management techniques such as labour hire, sub-contracting and franchising. Beyond Employment analyses the way in which Australian labour law is being reframed in this shift away from the pre-eminence of the employment paradigm. Its principal concern is with the legal construction and regulation of various forms of contracting, including labour hire arrangements, complex contractual chains and modern forms like franchising, and of casual employment. It outlines the current array of work relationships in Australia, and describes and analyses the way in which those outside continuous and fixed term employment are regulated. The book seeks to answer the central question: How does law (legal rules and principles) construct these work relationships, and how does it regulate these relationships? The book identifies the way in which current law draws the lines between the various work relationships through the use of contract and property ownership, and describes, analyses and synthesises the legal rules that govern these different forms of work relationships. The legal rules that govern work relationships are explored through the traditional lens of labour law’s protective function, principally in four themes: control of property, and the distribution of risks and rewards; maintenance of income security; access to collective voice mechanisms, focusing on collective bargaining; and health, safety and welfare. 
The book critically evaluates the gaps in the coverage and content of these rules and principles, and the implications of these gaps for workers. It also reflects upon the power relationships that underpin the work arrangements that are the focus of the book and that are enhanced through the laws of contract and property. Finally, it frames an agenda to address the gaps and identified weaknesses insofar as they affect the economic wellbeing, democratic voice, and health and safety of workers.

Relevance: 10.00%

Abstract:

In a pilot application based on a web search engine, called Web-based Relation Completion (WebRC), we propose to join two columns of entities linked by a predefined relation by mining knowledge from the web through a web search engine. To achieve this, a novel retrieval task, Relation Query Expansion (RelQE), is modelled: given an entity (query), the task is to retrieve documents containing entities in a predefined relation to the given one. Solving this problem entails expanding the query before submitting it to a web search engine to ensure that mostly documents containing the linked entity are returned in the top K search results. In this paper, we propose a novel Learning-based Relevance Feedback (LRF) approach to solve this retrieval task. Expansion terms are learned from training pairs of entities linked by the predefined relation and applied to new entity-queries to find entities linked by the same relation. After describing the approach, we present experimental results on real-world web data collections, which show that the LRF approach always improves the precision of top-ranked search results, to up to 8.6 times the baseline. Using LRF, WebRC also shows performance well above the baseline.
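A much-simplified sketch of the expansion idea follows, with a toy term-frequency "learner" standing in for the paper's LRF training step; the documents and stop-word list are illustrative only:

```python
from collections import Counter

def learn_expansion_terms(training_docs, top_k=3):
    # Learn terms that co-occur with the linked entity in training snippets;
    # a toy stand-in for the Learning-based Relevance Feedback (LRF) step.
    counts = Counter(w for doc in training_docs for w in doc.lower().split())
    stop = {"the", "of", "in", "a", "is"}
    return [w for w, _ in counts.most_common() if w not in stop][:top_k]

def expand_query(entity, terms):
    # The expanded query biases the top-K results toward the linked entity.
    return entity + " " + " ".join(terms)

docs = ["Canberra is the capital of Australia",
        "the capital city Canberra lies in Australia"]
terms = learn_expansion_terms(docs)
query = expand_query("Australia", terms)
```

In the paper's setting the expansion terms would be learned once per relation from training pairs, then reused for every new entity-query in that relation.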

Relevance: 10.00%

Abstract:

Recent arguments on the ethics of stem cell research have taken a novel approach to the question of the moral status of the embryo. One influential argument focuses on a property that the embryo is said to possess—namely, the property of being an entity with a rational nature or, less controversially, an entity that has the potential to acquire a rational nature—and claims that this property is also possessed by a somatic cell. Since nobody seriously thinks that we have a duty to preserve the countless such cells we wash off our body every day in the shower, the argument is intended as a reductio ad absurdum of the claim that the embryo should be afforded the same moral status as a fully developed human being. This article argues that this argument is not successful and that it consequently plays into the hands of those who oppose embryonic stem cell research. It is therefore better to abandon this argument and focus instead on the different argument that potentiality, as such, is not a sufficient ground for the creation of moral obligations towards the embryo.

Relevance: 10.00%

Abstract:

Cryptographic hash functions are an important tool of cryptography and play a fundamental role in efficient and secure information processing. A hash function processes an arbitrary finite-length input message to a fixed-length output referred to as the hash value. As a security requirement, a hash value should not serve as an image for two distinct input messages, and it should be difficult to find the input message from a given hash value. Secure hash functions serve data integrity, non-repudiation and authenticity of the source in conjunction with digital signature schemes. Keyed hash functions, also called message authentication codes (MACs), serve data integrity and data origin authentication in the secret key setting. The building blocks of hash functions can be designed using block ciphers, modular arithmetic or from scratch. The design principles of the popular Merkle–Damgård construction are followed in almost all widely used standard hash functions such as MD5 and SHA-1.
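The properties described above can be demonstrated with Python's standard `hashlib` and `hmac` modules. SHA-1 and SHA-256 are shown; note that MD5 and SHA-1 are no longer considered collision resistant and are used here only because the text names them:

```python
import hashlib, hmac

msg = b"transfer 100 to alice"

# Arbitrary-length input, fixed-length output (SHA-1: 160 bits = 20 bytes).
digest = hashlib.sha1(msg).digest()
assert len(digest) == 20
assert len(hashlib.sha1(msg * 1000).digest()) == 20

# Keyed hash (MAC): data integrity and data origin authentication
# in the secret-key setting.
key = b"shared secret"
tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

# Any change to the message yields a different tag, so tampering is detected.
assert hmac.new(key, msg + b"!", hashlib.sha256).hexdigest() != tag
```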

Relevance: 10.00%

Abstract:

Recently, Gao et al. proposed a lightweight RFID mutual authentication protocol [3] to resist intermittent position trace attacks and desynchronization attacks, calling it RIPTA-DA. They also verified their protocol’s security by a data reduction method based on the learning parity with noise (LPN) problem, and formally verified the functionality of the proposed scheme using Colored Petri Nets. In this paper, we investigate RIPTA-DA’s security. We present an efficient secret disclosure attack against the protocol, which can be used to mount both de-synchronization and traceability attacks against it. Thus our attacks show that the RIPTA-DA protocol is not, in fact, RIPTA-DA.

Relevance: 10.00%

Abstract:

At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to the second preimage resistance rather than on the collision resistance property of the hash functions. One of the randomized hash function modes was named the RMX hash function mode and was recommended for practical purposes. The National Institute of Standards and Technology (NIST), USA standardized a variant of the RMX hash function mode and published this standard in the Special Publication (SP) 800-106. In this article, we first discuss a generic online birthday existential forgery attack of Dang and Perlner on the RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge the other randomize-hash-then-sign schemes. We point out practical limitations of the generic forgery attack on the RMX-hash-then-sign schemes. We then show that these limitations can be overcome for the RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, such as for the Davies-Meyer construction used in the popular hash functions such as MD5 designed by Rivest and the SHA family of hash functions designed by the National Security Agency (NSA), USA and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures by using a variant of Dean’s method of finding fixed point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack is also applicable to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and discuss their applicability on signature schemes based on hash functions with ‘built-in’ randomization. 
Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
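The Davies–Meyer fixed-point observation the attack relies on can be sketched with a toy invertible "block cipher" (modular addition, purely illustrative; a real instance would use an actual cipher such as those inside MD5 or SHA-1). For the compression function H_i = E_m(H_{i-1}) XOR H_{i-1}, choosing H = E_m^{-1}(0) gives a fixed point for any message block m:

```python
MASK = (1 << 32) - 1

# Toy "block cipher": modular addition keyed by the message block.
# Purely illustrative -- any invertible keyed map exhibits the same property.
def E(m, x):
    return (x + m) & MASK

def E_inv(m, y):
    return (y - m) & MASK

def davies_meyer(m, h):
    # Davies-Meyer compression: H_i = E_{m_i}(H_{i-1}) XOR H_{i-1}
    return E(m, h) ^ h

m = 0xDEADBEEF
fp = E_inv(m, 0)                 # choose H with E_m(H) = 0
assert davies_meyer(m, fp) == fp  # fixed point: the chaining value never changes
```

Because E_m is invertible, such an H exists for every block m, which is what makes fixed-point expandable messages easy to build for this construction.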

Relevance: 10.00%

Abstract:

Many RFID protocols use cryptographic hash functions for their security. The resource-constrained nature of RFID systems forces the use of lightweight cryptographic algorithms. Tav-128 is one such 128-bit lightweight hash function proposed by Peris-Lopez et al. for a low-cost RFID tag authentication protocol. Apart from some statistical tests for randomness by the designers themselves, Tav-128 has not undergone any other thorough security analysis. Based on these tests, the designers claimed that Tav-128 does not possess any trivial weaknesses. In this article, we carry out the first third-party security analysis of Tav-128 and show that this hash function is neither collision resistant nor second preimage resistant. Firstly, we show a practical collision attack on Tav-128 having a complexity of 2^37 calls to the compression function, and produce message pairs of arbitrary length which produce the same hash value under this hash function. We then show a second preimage attack on Tav-128 which succeeds with a complexity of 2^62 calls to the compression function. Finally, we study the constituent functions of Tav-128 and show that the concatenation of the nonlinear functions A and B produces a 64-bit permutation from 32-bit messages. This could be a useful lightweight primitive for future RFID protocols.
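For context, a generic birthday collision search costs about 2^(n/2) hash calls for an n-bit hash, which is the baseline the 2^37 figure should be compared against (2^64 for a 128-bit hash). The generic search can be demonstrated on a 32-bit truncation of SHA-256; this is illustrative only and is not the Tav-128 attack:

```python
import hashlib
from itertools import count

def h32(msg: bytes) -> bytes:
    # 32-bit truncated hash: a generic birthday collision in ~2^16 trials.
    return hashlib.sha256(msg).digest()[:4]

def find_collision():
    seen = {}                       # digest -> first message seen with it
    for i in count():
        m = i.to_bytes(8, "big")
        d = h32(m)
        if d in seen:
            return seen[d], m       # two distinct messages, same truncated hash
        seen[d] = m

m1, m2 = find_collision()
assert m1 != m2 and h32(m1) == h32(m2)
```

An attack like the article's 2^37 collision on a 128-bit hash is dramatically cheaper than this generic bound, which is exactly what "not collision resistant" means in practice.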

Relevance: 10.00%

Abstract:

The literature on “entrepreneurial opportunities” has grown rapidly since the publication of Shane and Venkataraman (2000). By directing attention to the earliest stages of development of new economic activities and organizations, this marks a sound redirection of entrepreneurship research. However, our review shows that theoretical and empirical progress has been limited on important aspects of the role of “opportunities” and their interaction with actors, i.e., the “nexus”. We argue that this is rooted in inherent and inescapable problems with the “opportunity” construct itself, when applied in the context of a prospective, micro-level (i.e., individual[s], venture, or individual–venture dyad) view of entrepreneurial processes. We therefore suggest a fundamental re-conceptualization using the constructs External Enablers, New Venture Ideas, and Opportunity Confidence to capture the many important ideas commonly discussed under the “opportunity” label. This re-conceptualization makes important distinctions where prior conceptions have been blurred: between explananda and explanantia; between the actor and the entity acted upon; between external conditions and subjective perceptions; and between the contents and the favorability of the entity acted upon. These distinctions facilitate theoretical precision and can guide empirical investigation towards more fruitful designs.

Relevance: 10.00%

Abstract:

The NLM stream cipher, designed by Hoon Jae Lee, Sang Min Sung and Hyeong Rag Kim, is a strengthened version of the LM summation generator that combines linear and non-linear feedback shift registers. In recent works, the NLM cipher has been used for message authentication in lightweight communication over wireless sensor networks and for RFID authentication protocols. This work analyses the security of the NLM stream cipher and the NLM-MAC scheme that is built on top of the NLM cipher. We first show that the NLM cipher suffers from two major weaknesses that lead to key recovery and forgery attacks. We show that the internal state of the NLM cipher can be recovered with time complexity about n^(log₂ 7), where the total length of the internal state is 2·n + 2 bits. The attack needs about n^2 key-stream bits. We also show that an adversary is able to forge any MAC tag very efficiently, by having only one (MAC tag, ciphertext) pair. The proposed attacks are practical and break the scheme with a negligible error probability.

Relevance: 10.00%

Abstract:

Integration of biometrics is considered an attractive solution to the issues associated with password-based human authentication, as well as for the secure storage and release of cryptographic keys, which is one of the critical issues in modern cryptography. However, the widespread adoption of bio-cryptographic solutions is somewhat restricted by the fuzziness associated with biometric measurements. Therefore, error control mechanisms must be adopted to make sure that the fuzziness of biometric inputs can be sufficiently countered. In this paper, we outline the existing techniques used in bio-cryptography, explaining how they are deployed in different types of solutions. Finally, we elaborate on the important factors to be considered when choosing appropriate error correction mechanisms for a particular biometric-based solution.
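One widely used error control construction in this area is the fuzzy commitment scheme of Juels and Wattenberg, in which an error-correcting code absorbs the noise between enrollment and verification samples. A minimal sketch follows, using a repetition code and a toy bit-string biometric model; the parameters are illustrative:

```python
import hashlib, secrets

REP = 5  # repetition factor: majority vote corrects up to 2 flips per key bit

def encode(bits):
    # Repetition-code encoder: each key bit copied REP times.
    return [b for bit in bits for b in [bit] * REP]

def decode(bits):
    # Majority-vote decoder over each block of REP copies.
    return [int(sum(bits[i:i + REP]) > REP // 2)
            for i in range(0, len(bits), REP)]

def enroll(biometric_bits):
    key = [secrets.randbelow(2) for _ in range(len(biometric_bits) // REP)]
    codeword = encode(key)
    helper = [c ^ b for c, b in zip(codeword, biometric_bits)]  # public helper data
    commitment = hashlib.sha256(bytes(key)).hexdigest()         # stored check value
    return helper, commitment

def release(helper, commitment, noisy_bits):
    # XOR with the fresh sample leaves the codeword plus biometric noise,
    # which the repetition code corrects if the noise is small enough.
    key = decode([h ^ b for h, b in zip(helper, noisy_bits)])
    return key if hashlib.sha256(bytes(key)).hexdigest() == commitment else None

enrolled = [secrets.randbelow(2) for _ in range(40)]   # 8 key bits x REP copies
helper, com = enroll(enrolled)
noisy = enrolled.copy()
noisy[0] ^= 1          # two measurement errors in the fresh sample
noisy[17] ^= 1
assert release(helper, com, noisy) is not None  # key recovered despite the noise
```

Real deployments replace the repetition code with stronger codes (e.g. BCH or Reed–Solomon) matched to the error statistics of the particular biometric.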

Relevance: 10.00%

Abstract:

Legal Theories: Contexts and Practices presents legal theory as a living and evolving entity. The reader is brought into its story as an active participant who is challenged to think about where they sit within the history and traditions of legal theory and jurisprudence. This second edition explores how lawyers and the courts adopt theoretical and jurisprudential positions and how they are influenced by the historical, social, cultural, and legal conditions characteristic of the time in which they live. It considers how legal theories, too, are influenced by those conditions, and how these combined forces influence and continue to affect contemporary legal thinking and legal interpretation.