941 results for Algebraic lattices


Relevance: 10.00%

Publisher:

Abstract:

We consider the problem of increasing the threshold parameter of a secret-sharing scheme after the setup (share distribution) phase, without further communication between the dealer and the shareholders. Previous solutions to this problem require one to start off with a non-standard scheme designed specifically for this purpose, or to have communication between shareholders. In contrast, we show how to increase the threshold parameter of the standard Shamir secret-sharing scheme without communication between the shareholders. Our technique can thus be applied to existing Shamir schemes even if they were set up without consideration to future threshold increases. Our method is a new positive cryptographic application for lattice reduction algorithms, inspired by recent work on lattice-based list decoding of Reed-Solomon codes with noise bounded in the Lee norm. We use fundamental results from the theory of lattices (Geometry of Numbers) to prove quantitative statements about the information-theoretic security of our construction. These lattice-based security proof techniques may be of independent interest.
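
For context, the scheme whose threshold is being raised is standard (t, n) Shamir secret sharing over a prime field. The minimal Python sketch below shows plain Shamir sharing and reconstruction only; it is not the paper's lattice-based threshold-increase technique, and the prime, secret and parameters are purely illustrative.

```python
import random

P = 2**127 - 1  # illustrative prime modulus (not taken from the paper)

def share(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it."""
    # Random polynomial of degree t-1 with constant term equal to the secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = share(secret=42, t=3, n=5)
assert reconstruct(shares[:3]) == 42
```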

Relevance: 10.00%

Publisher:

Abstract:

The power of sharing computation in a cryptosystem is crucial in several real-life applications of cryptography. Cryptographic primitives and tasks to which threshold cryptosystems have been applied include variants of digital signature, identification, public-key encryption and block ciphers. It is desirable to extend the domain of cryptographic primitives to which threshold cryptography can be applied. This paper studies threshold message authentication codes (threshold MACs). Threshold cryptosystems usually exploit algebraically homomorphic properties of the underlying cryptographic primitives: a typical approach to constructing a threshold cryptographic scheme is to combine a (linear) secret sharing scheme with an algebraically homomorphic cryptographic primitive. The lack of algebraic properties of MACs rules out such an approach to sharing MACs. In this paper, we propose a method of obtaining a threshold MAC using a combinatorial approach. Our method is generic in the sense that it is applicable to any secure conventional MAC, making use of certain combinatorial objects such as cover-free families and their variants. We discuss the issue of anonymity in threshold cryptography, a subject that has not previously been addressed in the literature, and we show that there are trade-offs between the anonymity and efficiency of threshold MACs.
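
To make the combinatorial idea concrete, one can picture each shareholder holding a subset of independent MAC keys, with a complete tag consisting of the MACs under all keys; a group of players is authorized exactly when their key sets jointly cover the whole pool. The sketch below is purely illustrative: the hard-coded key-to-player assignment is a toy example, not a cover-free family and not the construction proposed in the paper.

```python
import hmac, hashlib

# Toy key-to-player assignment (NOT a cover-free family from the paper):
# 4 players, 6 keys; each player holds 3 of them.
ASSIGNMENT = {0: {0, 1, 2}, 1: {0, 3, 4}, 2: {1, 3, 5}, 3: {2, 4, 5}}
KEYS = {k: bytes([k]) * 16 for k in range(6)}  # illustrative keys only

def partial_tag(player, msg):
    """A player's contribution: MACs under the keys it holds."""
    return {k: hmac.new(KEYS[k], msg, hashlib.sha256).digest()
            for k in ASSIGNMENT[player]}

def combine(partials):
    """Merge partial tags; succeeds only if every key is covered."""
    tag = {}
    for p in partials:
        tag.update(p)
    if set(tag) != set(KEYS):
        raise ValueError("not enough players to cover all keys")
    return tag

msg = b"example message"
# Players 0, 1 and 3 jointly hold all six keys, so the full tag can be formed.
full_tag = combine([partial_tag(p, msg) for p in (0, 1, 3)])
```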

Relevance: 10.00%

Publisher:

Abstract:

The authors have collaborated in the development and initial evaluation of a curriculum for mathematics acceleration. This paper reports on the difficulties encountered in documenting student understanding using pen-and-paper assessment tasks. This leads to a discussion of the impact of students' language and literacy on mathematical performance, and of the consequences for motivation and engagement of simplifying the language in the tests and extending student work to algebraic representations. In turn, implications are drawn for revisions to the assessment used within the project and for the language and literacy focus included within student learning experiences.

Relevance: 10.00%

Publisher:

Abstract:

Initial attempts to obtain lattice-based signatures were closely related to reducing a vector modulo the fundamental parallelepiped of a secret basis (like GGH [9] or NTRUSign [12]). This approach leaked some information on the secret, namely the shape of the parallelepiped, which has been exploited in practical attacks [24]. NTRUSign was an extremely efficient scheme, and thus there has been noticeable interest in developing countermeasures to the attacks, but with little success [6]. In [8], Gentry, Peikert and Vaikuntanathan proposed a randomized version of Babai's nearest plane algorithm such that the distribution of a reduced vector modulo a secret parallelepiped depends only on the size of the basis used. Using this algorithm and generating large, close-to-uniform public keys, they managed to obtain provably secure GGH-like lattice-based signatures. Recently, Stehlé and Steinfeld obtained a provably secure scheme very close to NTRUSign [26] (from a theoretical point of view). In this paper we present an alternative approach to seal the leak of NTRUSign. Instead of modifying the lattices and algorithms used, we compute a classic leaky NTRUSign signature and hide it with Gaussian noise using techniques present in Lyubashevsky's signatures. Our main contributions are thus a set of strong NTRUSign parameters, obtained by taking into account the latest known attacks against the scheme, and a statistical way to hide the leaky NTRU signature so that this particular instantiation of a CVP-based signature scheme becomes zero-knowledge and secure against forgeries, based on the worst-case hardness of the Õ(N^1.5)-Shortest Independent Vector Problem over NTRU lattices. Finally, we give a set of concrete parameters to gauge the efficiency of the obtained signature scheme.
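
The hiding step relies on the rejection-sampling technique of Lyubashevsky-style signatures: add Gaussian noise to the secret-dependent part and accept the result only with a probability chosen so that the output distribution no longer depends on the secret. A minimal sketch of that generic idea follows; the use of a continuous Gaussian, the dimension, and the SIGMA and M values are illustrative stand-ins, not the paper's discrete-Gaussian parameters.

```python
import math, random

SIGMA = 100.0   # illustrative Gaussian parameter
M = 3.0         # rejection constant; in practice derived from ||v|| and SIGMA

def gauss(x, center=0.0):
    """Unnormalised Gaussian weight with standard deviation SIGMA."""
    return math.exp(-((x - center) ** 2) / (2 * SIGMA ** 2))

def hide(v):
    """Return z = y + v whose distribution, after rejection, is independent of v."""
    while True:
        y = [random.gauss(0.0, SIGMA) for _ in v]
        z = [yi + vi for yi, vi in zip(y, v)]
        # Accept with probability D_sigma(z) / (M * D_{sigma, v}(z)).
        ratio = math.prod(gauss(zi) for zi in z) / (
            M * math.prod(gauss(zi, vi) for zi, vi in zip(z, v)))
        if random.random() < min(1.0, ratio):
            return z

leaky_part = [3.0, -7.0, 2.5]   # stands in for the secret-dependent signature part
blinded = hide(leaky_part)
```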

Relevance: 10.00%

Publisher:

Abstract:

Nth-Dimensional Truncated Polynomial Ring (NTRU) is a lattice-based public-key cryptosystem that offers encryption and digital signature solutions. It was designed by Silverman, Hoffstein and Pipher. The NTRU cryptosystem was patented by NTRU Cryptosystems Inc. (later acquired by Security Innovation) and is available as the IEEE 1363.1 and X9.98 standards. NTRU is resistant to attacks based on quantum computing, to which the standard RSA and ECC public-key cryptosystems are vulnerable. In addition, NTRU has performance advantages over these cryptosystems. Given this importance of NTRU, it is highly recommended to adopt NTRU as part of a cipher suite, alongside widely used cryptosystems, for internet security protocols and applications. In this paper, we present our analytical study on the implementation of the NTRU encryption scheme, which serves as a guideline for security practitioners who are new to lattice-based cryptography, or even to cryptography in general. In particular, we show some non-trivial issues that should be considered towards a secure and efficient NTRU implementation.
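
At the heart of any NTRU implementation is the cyclic convolution product in the ring Z[x]/(x^N - 1) and the textbook encryption rule e = r (*) h + m (mod q), where the public key h already absorbs the factor p. The sketch below shows only that step; key generation and decryption, which require polynomial inverses modulo p and q, are omitted, and the tiny parameters and hard-coded polynomials are illustrative and insecure.

```python
N, p, q = 11, 3, 32  # toy parameters, far too small to be secure

def conv(a, b):
    """Cyclic convolution in Z[x]/(x^N - 1); polynomials as coefficient lists."""
    c = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[(i + j) % N] += ai * bj
    return c

def encrypt(m, r, h):
    """Textbook NTRU encryption: e = r (*) h + m (mod q)."""
    e = conv(r, h)
    return [(ei + mi) % q for ei, mi in zip(e, m)]

# Placeholder public key; in a real scheme h = p * g * f^{-1} (mod q).
h = [3, 0, 1, 5, 0, 2, 7, 0, 4, 1, 6]
r = [1, -1, 0, 1, 0, 0, -1, 0, 1, 0, -1]   # small blinding polynomial
m = [1, 0, -1, 0, 1, 1, 0, 0, -1, 0, 1]    # message with ternary coefficients
e = encrypt(m, r, h)
```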

Relevance: 10.00%

Publisher:

Abstract:

To date, a number of two-dimensional (2D) topological insulators (TIs) have been realized in Group 14 elemental honeycomb lattices, but all are inversion-symmetric. Here, based on first-principles calculations, we predict a new family of 2D inversion-asymmetric TIs with sizeable bulk gaps from 105 meV to 284 meV, in X2–GeSn (X = H, F, Cl, Br, I) monolayers, making them in principle suitable for room-temperature applications. The nontrivial topological characteristics of inverted band orders are identified in pristine X2–GeSn with X = (F, Cl, Br, I), whereas H2–GeSn undergoes a nontrivial band inversion at 8% lattice expansion. Topologically protected edge states are identified in X2–GeSn with X = (F, Cl, Br, I), as well as in strained H2–GeSn. More importantly, the edges of these systems, which exhibit single-Dirac-cone characteristics located exactly in the middle of their bulk band gaps, are ideal for dissipationless transport. Thus, Group 14 elemental honeycomb lattices provide a fascinating playground for the manipulation of quantum states.

Relevance: 10.00%

Publisher:

Abstract:

Projective Hjelmslev planes and affine Hjelmslev planes are generalisations of projective planes and affine planes. We present an algorithm for constructing projective Hjelmslev planes and affine Hjelmslev planes that uses projective planes, affine planes and orthogonal arrays. We show that all 2-uniform projective Hjelmslev planes, and all 2-uniform affine Hjelmslev planes, can be constructed in this way. As a corollary it is shown that all 2-uniform affine Hjelmslev planes are sub-geometries of 2-uniform projective Hjelmslev planes.

Relevance: 10.00%

Publisher:

Abstract:

This paper presents a novel algebraic formulation of the central problem of screw theory, namely the determination of the principal screws of a given system. Using the algebra of dual numbers, it shows that the principal screws can be determined via the solution of a generalised eigenproblem of two real, symmetric matrices. This approach allows the study of the principal screws of the general two- and three-systems associated with a manipulator of arbitrary geometry in terms of closed-form expressions of its architecture and configuration parameters. We also present novel methods for the determination of the principal screws of four- and five-systems which do not require the explicit computation of the reciprocal systems. Principal screws of systems of different orders are identified from one uniform criterion, namely that the pitches of the principal screws are the extreme values of the pitch. The classical results of screw theory, namely the equations for the cylindroid and the pitch hyperboloid associated with the two- and three-systems respectively, have been derived within the proposed framework. Algebraic conditions have been derived for some of the special screw systems. The formulation is also illustrated with several examples, including two spatial manipulators of serial and parallel architecture, respectively.
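
The computational core described here, extracting principal screws from a generalised eigenproblem of two real symmetric matrices, can be prototyped directly with a symmetric generalised eigensolver. In the sketch below the matrices A and B are hypothetical placeholders, not derived from any particular manipulator or from the paper's dual-number expressions.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical symmetric matrices standing in for the pair produced by a
# two- or three-system formulation (B must be positive definite for eigh).
A = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.5, 0.2],
              [0.1, 0.2, 0.9]])
B = np.array([[1.0, 0.1, 0.0],
              [0.1, 1.2, 0.1],
              [0.0, 0.1, 1.1]])

# Generalised symmetric eigenproblem A x = pitch * B x.
pitches, screws = eigh(A, B)

# The extreme eigenvalues correspond to the extreme (principal) pitches.
print("principal pitches:", pitches.min(), pitches.max())
```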

Relevance: 10.00%

Publisher:

Abstract:

This note is concerned with the problem of determining approximate solutions of Fredholm integral equations of the second kind. Approximating the solution of a given integral equation by means of a polynomial, an over-determined system of linear algebraic equations is obtained involving the unknown coefficients, which is finally solved by using the least-squares method. Several examples are examined in detail.
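
The procedure described, approximating the unknown by a polynomial, collocating at more points than there are coefficients, and solving the over-determined linear system by least squares, can be sketched numerically for the generic second-kind equation u(x) - lam * ∫_0^1 K(x,t) u(t) dt = f(x). The kernel, manufactured solution, and degrees below are illustrative choices, not the paper's examples.

```python
import numpy as np

lam, deg, n_colloc, n_quad = 0.5, 6, 40, 200

K = lambda x, t: x * t                 # illustrative kernel
u_exact = lambda x: np.exp(x)          # manufactured solution for checking

# Gauss-Legendre nodes mapped to [0, 1] for the integral term.
tq, wq = np.polynomial.legendre.leggauss(n_quad)
tq, wq = 0.5 * (tq + 1.0), 0.5 * wq

def f(x):
    # Right-hand side manufactured so that u_exact solves the equation.
    return u_exact(x) - lam * np.sum(wq * K(x, tq) * u_exact(tq))

# Collocation points and the over-determined system A c = b for the
# coefficients c of u(x) ~= sum_j c_j x^j.
xs = np.linspace(0.0, 1.0, n_colloc)
A = np.zeros((n_colloc, deg + 1))
b = np.array([f(x) for x in xs])
for j in range(deg + 1):
    for i, x in enumerate(xs):
        A[i, j] = x**j - lam * np.sum(wq * K(x, tq) * tq**j)

coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
u_approx = lambda x: sum(c * x**j for j, c in enumerate(coeffs))
print("max error at collocation points:",
      max(abs(u_approx(x) - u_exact(x)) for x in xs))
```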

Relevance: 10.00%

Publisher:

Abstract:

Recovering the motion of a non-rigid body from a set of monocular images permits the analysis of dynamic scenes in uncontrolled environments. However, the extension of factorisation algorithms for rigid structure from motion to the low-rank non-rigid case has proved challenging. This stems from the comparatively hard problem of finding a linear “corrective transform” which recovers the projection and structure matrices from an ambiguous factorisation. We elucidate that this greater difficulty is due to the need to find multiple solutions to a non-trivial problem, casting a number of previous approaches as alleviating this issue by either a) introducing constraints on the basis, making the problems nonidentical, or b) incorporating heuristics to encourage a diverse set of solutions, making the problems inter-dependent. While it has previously been recognised that finding a single solution to this problem is sufficient to estimate cameras, we show that it is possible to bootstrap this partial solution to find the complete transform in closed-form. However, we acknowledge that our method minimises an algebraic error and is thus inherently sensitive to deviation from the low-rank model. We compare our closed-form solution for non-rigid structure with known cameras to the closed-form solution of Dai et al. [1], which we find to produce only coplanar reconstructions. We therefore make the recommendation that 3D reconstruction error always be measured relative to a trivial reconstruction such as a planar one.
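
The closing recommendation, to report 3D reconstruction error relative to a trivial reconstruction such as a planar one, can be realised with a simple baseline computation like the one sketched below; the error metric, the SVD-based plane fit and the random data are illustrative, not taken from the paper.

```python
import numpy as np

def planar_baseline_error(points):
    """RMS distance of 3D points from their best-fit plane (SVD fit)."""
    centered = points - points.mean(axis=0)
    # The right singular vector with smallest singular value is the plane normal.
    normal = np.linalg.svd(centered)[2][-1]
    return np.sqrt(np.mean((centered @ normal) ** 2))

def relative_error(reconstruction, ground_truth):
    """Reconstruction RMSE normalised by the trivial planar baseline."""
    rmse = np.sqrt(np.mean(np.sum((reconstruction - ground_truth) ** 2, axis=1)))
    return rmse / planar_baseline_error(ground_truth)

gt = np.random.rand(100, 3)                   # illustrative ground-truth points
rec = gt + 0.01 * np.random.randn(100, 3)     # illustrative reconstruction
print("error relative to planar baseline:", relative_error(rec, gt))
```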

Relevance: 10.00%

Publisher:

Abstract:

In an effort to develop a fully computerized approach for the structural synthesis of kinematic chains, the steps involved in the method of structural synthesis based on transformation of binary chains [38] have been recast in a format suitable for implementation on a digital computer. The methodology thus evolved has been combined with the algebraic procedures for structural analysis [44] to develop a unified computer program for the structural synthesis and analysis of simple jointed kinematic chains with a degree of freedom 0. Applications of this program are presented in the succeeding parts of the paper.

Relevance: 10.00%

Publisher:

Abstract:

Numerically discretized dynamic optimization problems having active inequality and equality path constraints that, along with the dynamics, induce locally high-index differential algebraic equations often cause the optimizer to fail to converge or to produce degraded control solutions. In many applications, regularization of the numerically discretized problem in direct transcription schemes by perturbing the high-index path constraints helps the optimizer to converge to useful control solutions. For complex engineering problems with many constraints it is often difficult to find effective nondegenerate perturbations that produce useful solutions in some neighborhood of the correct solution. In this paper we describe a numerical discretization that regularizes the numerically consistent discretized dynamics and does not perturb the path constraints. For all values of the regularization parameter the discretization remains numerically consistent with the dynamics and the path constraints specified in the original problem. The regularization is quantifiable in terms of the time step size in the mesh and the regularization parameter. For fully regularized systems the scheme converges linearly in time step size. The method is illustrated with examples.

Relevance: 10.00%

Publisher:

Abstract:

From Arithmetic to Algebra: changes in skills in comprehensive school over 20 years. In recent decades the understanding of calculation has been emphasized in mathematics teaching. Many studies have found that better understanding helps pupils apply skills in new conditions and that the ability to think on an abstract level increases transfer to new contexts. In my research I treat competence as a matrix in which content runs along the horizontal axis and levels of thinking along the vertical axis. The know-how in question is intellectual and strategic flexibility and understanding. The resources and limitations of memory affect learning in different ways in different phases; therefore both flexible conceptual thinking and automatization must be considered in learning. The research questions I examine are what kinds of changes have occurred in mathematical skills in comprehensive school over the last 20 years, and what kind of conceptual thinking is demonstrated by students in this decade. The study consists of two parts. The first part is a statistical analysis of mathematical skills and their changes over the last 20 years in comprehensive school; in the test the pupils did not use calculators. The second part is a qualitative analysis of the conceptual thinking of pupils in comprehensive school in this decade. The study shows significant differences in algebra and in some parts of arithmetic. The largest differences were detected in calculation skills with fractions. In the 1980s two out of three pupils were able to complete tasks with fractions, but in the 2000s only one out of three pupils was able to do the same tasks. Also remarkable is that, of the students who could complete the tasks with fractions, only one out of three was at the conceptual level in his or her thinking. This means that about 10% of pupils are able to understand the algebraic expression which has the same isomorphic structure as the arithmetical expression. This finding is important because the ability to think innovatively is created when learning the basic concepts. Keywords: arithmetic, algebra, competence

Relevance: 10.00%

Publisher:

Abstract:

The development of low-energy-cost membranes to separate He from noble gas mixtures is highly desirable. In this work, we studied He purification using recently experimentally realized two-dimensional stanene (2D Sn) and decorated 2D Sn (SnH and SnF) honeycomb lattices by density functional theory calculations. To increase the permeability of noble gases through pristine 2D Sn at room temperature (298 K), two practical strategies (the application of strain and functionalization) are proposed. With their high concentration of large pores, 2D Sn-based membrane materials demonstrate excellent helium purification and can serve as superior membranes compared with traditionally used porous materials. In addition, the separation performance of these 2D Sn-based membrane materials can be significantly tuned by the application of strain to optimize the He purification properties, taking both diffusion and selectivity into account. Our results are the first calculations of He separation in a defect-free honeycomb lattice, highlighting interesting new materials for helium separation for future experimental validation.
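
A standard way to quantify the two factors mentioned here, diffusion and selectivity, in DFT membrane studies (and not necessarily the exact expressions used in this work) is an Arrhenius estimate of the diffusion rate from the computed energy barrier, with the selectivity taken as a ratio of rates:

$$ r_X = A\, e^{-E_X / k_B T}, \qquad S_{\mathrm{He}/X} = \frac{r_{\mathrm{He}}}{r_X} = e^{-(E_{\mathrm{He}} - E_X)/k_B T}, $$

where E_X is the barrier for gas X through the pore, A is a prefactor assumed common to all gases, and T = 298 K for the room-temperature estimates quoted above.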

Relevance: 10.00%

Publisher:

Abstract:

The transition parameters for the freezing of two one-component liquids into crystalline solids are evaluated by two theoretical approaches. The first system considered is liquid sodium, which crystallizes into a body-centered-cubic (bcc) lattice; the second is the freezing of adhesive hard spheres into a face-centered-cubic (fcc) lattice. Two related theoretical techniques are used in this evaluation: one is based upon a recently developed bifurcation analysis; the other is based upon the theory of freezing developed by Ramakrishnan and Yussouff. For liquid sodium, where experimental information is available, the predictions of the two theories agree well with experiment and with each other. The adhesive-hard-sphere system, which displays a triple point and can be used to fit some liquids accurately, shows a temperature dependence of the freezing parameters similar to that of Lennard-Jones systems. At very low temperature, the fractional density change on freezing shows a dramatic increase as a function of temperature, indicating the importance of all the contributions due to the triplet direct correlation function. We also consider the freezing of a one-component liquid into a simple-cubic (sc) lattice by bifurcation analysis and show that this transition is highly unfavorable, independent of the choice of interatomic potential. The bifurcation diagrams for the three lattices considered are compared and found to be strikingly different. Finally, a new stability analysis of the bifurcation diagrams is presented.