884 results for "LWE practical hardness"
Abstract:
Proxy re-encryption (PRE) is a highly useful cryptographic primitive whereby Alice and Bob can endow a proxy with the capacity to change ciphertext recipients from Alice to Bob, without the proxy itself being able to decrypt, thereby providing delegation of decryption authority. Key-private PRE (KP-PRE) specifies an additional level of confidentiality, requiring pseudo-random proxy keys that leak no information on the identity of the delegators and delegatees. In this paper, we propose a CPA-secure KP-PRE scheme in the standard model (which we then transform into a CCA-secure scheme in the random oracle model). Both schemes enjoy highly desirable properties such as uni-directionality and multi-hop delegation. Unlike (the few) prior constructions of PRE and KP-PRE that typically rely on bilinear maps under ad hoc assumptions, the security of our construction is based on the hardness of the standard Learning-With-Errors (LWE) problem, itself reducible from worst-case lattice problems that are conjectured immune to quantum cryptanalysis, or "post-quantum". Of independent interest, we further examine the practical hardness of the LWE assumption, using Kannan's exhaustive search algorithm coupled with pruning techniques. This leads to state-of-the-art parameters not only for our scheme, but also for a number of other LWE-based primitives published in the literature.
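To make the LWE assumption underlying the abstract above concrete, here is a minimal toy sketch of how LWE samples are formed. The parameters n, q and the error bound are arbitrary small placeholders chosen for illustration only; this is not the paper's scheme, and real constructions derive their parameters from concrete security estimates such as the exhaustive-search analysis the abstract mentions.

```python
import random

# Toy LWE sample generator (illustrative placeholder parameters).
def lwe_samples(n=16, q=97, m=32, error_bound=2, seed=0):
    """Return (A, b, s) with b[i] = <A[i], s> + e[i] mod q."""
    rng = random.Random(seed)
    s = [rng.randrange(q) for _ in range(n)]                  # secret vector
    A = [[rng.randrange(q) for _ in range(n)] for _ in range(m)]
    b = []
    for row in A:
        e = rng.randint(-error_bound, error_bound)            # small noise
        b.append((sum(a * x for a, x in zip(row, s)) + e) % q)
    return A, b, s

A, b, s = lwe_samples()
# Without the noise terms e, the secret s could be recovered from (A, b)
# by Gaussian elimination; the small per-sample errors are precisely what
# makes recovering s (the search-LWE problem) conjecturally hard.
```

The decision variant asks to distinguish such (A, b) pairs from uniformly random ones, which is the pseudo-randomness property key-private PRE relies on.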
Abstract:
Digital signatures are an important primitive for building secure systems and are used in most real-world security protocols. However, almost all popular signature schemes are based either on the factoring assumption (RSA) or on the hardness of the discrete logarithm problem (DSA/ECDSA). In the event of classical cryptanalytic advances or progress in the development of quantum computers, the hardness of these closely related problems might be seriously weakened. A potential alternative approach is the construction of signature schemes based on the hardness of certain lattice problems that are assumed to be intractable for quantum computers. Due to significant research advances in recent years, lattice-based schemes have now become practical and appear to be a viable alternative to number-theoretic cryptography. In this article, we focus on recent developments and the current state of the art in lattice-based digital signatures and provide a comprehensive survey discussing signature schemes with respect to practicality. Additionally, we discuss future research areas that are essential for the continued development of lattice-based cryptography.
Abstract:
Recent technological developments in the field of experimental quantum annealing have made prototypical annealing optimizers with hundreds of qubits commercially available. The experimental demonstration of a quantum speedup for optimization problems has since become a coveted, albeit elusive, goal. Recent studies have shown that the so-far inconclusive results regarding a quantum enhancement may have been partly due to the unsuitability of the benchmark problems used. In particular, these problems had an inherently too simple structure, allowing both traditional resources and quantum annealers to solve them with no special effort. The need has therefore arisen for the generation of harder benchmarks that would hopefully possess the discriminative power to separate the classical scaling of performance with size from the quantum one. We introduce here a practical technique for the engineering of extremely hard spin-glass Ising-type problem instances that does not require "cherry picking" from large ensembles of randomly generated instances. We accomplish this by treating the generation of hard optimization problems itself as an optimization problem, for which we offer a heuristic algorithm that solves it. We demonstrate the genuine thermal hardness of our generated instances by examining them thermodynamically and analyzing their energy landscapes, as well as by testing the performance of various state-of-the-art algorithms on them. We argue that a proper characterization of the generated instances offers a practical, efficient way to properly benchmark experimental quantum annealers, as well as any other optimization algorithm.
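For readers unfamiliar with the objective being benchmarked above, here is a minimal sketch of the Ising energy function that such spin-glass instances minimize. The couplings J used here are random placeholders; the point of the paper summarized above is precisely that the couplings should instead be engineered (via an outer optimization) to make the energy landscape hard, so this is context, not the authors' method.

```python
import random

# Minimal Ising-model energy evaluation over spins s_i in {-1, +1}.
def ising_energy(spins, couplings):
    """E = -sum_{i<j} J[i][j] * s_i * s_j."""
    n = len(spins)
    return -sum(couplings[i][j] * spins[i] * spins[j]
                for i in range(n) for j in range(i + 1, n))

rng = random.Random(1)
n = 8
# Placeholder random +/-1 couplings; hard instances would tune these.
J = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
state = [rng.choice((-1, 1)) for _ in range(n)]
E = ising_energy(state, J)
# Note the global spin-flip symmetry: negating every spin leaves E unchanged,
# so the ground state always comes in degenerate pairs.
```

An annealer (quantum or classical) searches over the 2^n spin configurations for the state minimizing this energy; "hard" instances are those whose landscape traps local search in deep suboptimal minima.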
Abstract:
Both traditional and progressive curricula are inadequate for the task of responding to the economic, political, social, and cultural changes that have occurred as a result of globalization. This book documents some of the ongoing work occurring in early childhood settings that is aimed at improving, and ultimately transforming, early childhood practice in these changed and changing times. The authors do not simply critique developmental approaches or the increasing standardization of the field. Instead, they describe how they are playing around with postmodern ideas in practice and developing unique approaches to the diverse educational circumstances that confront early childhood educators. Whether it is preparing teachers, using materials, or developing policies, each chapter provides readers with possibilities for enacting pedagogies that are responsive to the contemporary circumstances shaping the lives of young children.
Abstract:
Schizophrenia is a serious mental disorder currently undergoing a renaissance of scientific interest. This book summarizes current knowledge and details some of the recent developments. Divided into three sections, it presents an overview of the disorder, including its features, social impact and aetiology; reviews methods of assessing the symptoms and disabilities of schizophrenia and examining the sufferer's social environment; and discusses a range of interventions, from pharmacological treatment to skill training for individuals and families. Issues that arise in planning services for sufferers and their families are also reviewed. All of the chapters focus on clinical practice and many include guides to assessment and practice. This handbook should be particularly useful in training health professionals in psychiatric/mental health services and general practice. It also aims to offer practitioners and researchers a comprehensive update on schizophrenia that is both easy to read and practical in its focus.
Abstract:
This is a guidebook for clinicians on how to conduct assessment interviews with patients presenting with common psychological disorders. The orientation is behavioural and cognitive, so the book has wide applicability, as most clinicians explicitly or implicitly accept this combination of models as a useful basis for assessing and treating these problems. The problem areas covered are: fear and anxiety problems; depression; obesity; interpersonal problems; sexual dysfunction; insomnia; headache; and substance abuse.
Abstract:
Dehydration has been associated with increased morbidity and mortality. Dehydration risk increases with advancing age and will progressively become an issue as the population ages. Worldwide, those aged 60 years and over are the fastest-growing segment of the population. The study aimed to develop a clinically practical means of identifying dehydration amongst older people in the clinical care setting. Older people aged 60 years or over admitted to the Geriatric and Rehabilitation Unit (GARU) of two tertiary teaching hospitals were eligible for participation in the study. Ninety potential screening questions and 38 clinical parameters were initially tested on a single sample (n=33), with the most promising 11 parameters selected to undergo further testing in an independent group (n=86). Of the almost 130 variables explored, tongue dryness was most strongly associated with poor hydration status, demonstrating 64% sensitivity and 62% specificity within the study participants. The result was not confounded by age, gender or body mass index. With minimal training, inter-rater repeatability was over 90%. This study identified tongue dryness as a potentially practical tool to identify dehydration risk amongst older people in the clinical care setting. Further studies to validate the potential screen in larger and varied populations of older people are required.