931 results for Lattice-Valued Fuzzy connectives. Extensions. Retractions. E-operators
Abstract:
The invention of asymmetric encryption back in the seventies was a conceptual leap that vastly increased the expressive power of the encryption schemes of the time. For the first time, it allowed the sender of a message to designate the intended recipient in a cryptographic way, expressed as a “public key” that was related to but distinct from the “private key” that, alone, embodied the ability to decrypt. This made large-scale encryption a practical and scalable endeavour and, more than anything else save the internet itself, led to the advent of electronic commerce as we know and practice it today.
Abstract:
This paper presents ongoing work toward constructing an efficient completely non-malleable public-key encryption scheme based on lattices in the standard (common reference string) model. An encryption scheme is completely non-malleable if every attacker has only negligible advantage, even when allowed to transform the public key under which the related message is encrypted. Ventre and Visconti proposed two inefficient constructions of completely non-malleable schemes: one in the common reference string model using non-interactive zero-knowledge proofs, and another using interactive encryption schemes. Recently, two efficient public-key encryption schemes have been proposed, both based on pairing-based identity-based encryption.
Abstract:
At Crypto 2008, Shamir introduced a new algebraic attack called the cube attack, which allows us to solve black-box polynomials if we are able to tweak the inputs by varying an initialization vector. In a stream cipher setting where the filter function is known, we can extend it to the cube attack with annihilators: by applying the cube attack to Boolean functions for which we can find low-degree multiples (equivalently, annihilators), the attack complexity can be improved. When the size of the filter function is smaller than that of the LFSR, we can improve the attack complexity further by considering a sliding-window version of the cube attack with annihilators. Finally, we extend the cube attack to vectorial Boolean functions by finding implicit relations with low-degree polynomials.
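As an illustration of the core idea only (not taken from the paper), the following toy sketch shows how XOR-summing a black-box GF(2) polynomial over a "cube" of public input bits isolates a low-degree superpoly in the secret key bits. The polynomial `f` and the cube {v1, v2} are hypothetical:

```python
from itertools import product

def f(v1, v2, k1, k2):
    # hypothetical black-box polynomial over GF(2); k1, k2 are secret key bits
    return (v1 & v2 & k1) ^ (v1 & v2) ^ (v1 & k2) ^ k1 ^ 1

def cube_sum(poly, key):
    # XOR the polynomial over all assignments of the cube variables (v1, v2)
    acc = 0
    for v1, v2 in product((0, 1), repeat=2):
        acc ^= poly(v1, v2, *key)
    return acc

# every term not divisible by v1*v2 cancels in the sum, leaving the
# superpoly k1 ^ 1 -- one linear equation in the secret bits
for key in product((0, 1), repeat=2):
    assert cube_sum(f, key) == key[0] ^ 1
```

Observing the cube sum on a real cipher would thus hand the attacker a linear equation in the key; collecting enough such equations (and, per the paper, replacing `f` by a low-degree annihilator of the filter function) is what drives the attack.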
Abstract:
Semantic Space models, which provide a numerical representation of word meaning extracted from a corpus of documents, have been formalized in terms of Hermitian operators over real-valued Hilbert spaces by Bruza et al. [1]. The collapse of a word onto a particular meaning has been investigated by applying the notion of quantum collapse of superpositional states [2]. While the semantic association between words in a Semantic Space can be computed by means of the Minkowski distance [3] or the cosine of the angle between the vector representations of each pair of words, a new procedure is needed in order to establish relations between two or more Semantic Spaces. We address the question: how can the distance between different Semantic Spaces be computed? By representing each Semantic Space as a subspace of a more general Hilbert space, the relationship between Semantic Spaces can be computed by means of the subspace distance. Such a distance needs to take into account the difference in dimension between subspaces. The availability of a distance for comparing different Semantic Subspaces would enable a deeper understanding of the geometry of Semantic Spaces, which could possibly translate into better effectiveness in Information Retrieval tasks.
Abstract:
The decision of Young J in McCosker v Lovitt (1995) 12 BCL 146 places an interpretation upon s 74J of the Real Property Act 1900 (NSW) likely to surprise the unwary respondent to proceedings in New South Wales involving an application for an order to extend a caveat. Further, the similarity in critical respects between s 74J and the legislation relating to lapse and extension of caveats in some jurisdictions, when contrasted with other lapse provisions, suggests that a court order extending a caveat for a specified period only may have very different consequences in different jurisdictions.
Abstract:
We consider the problem of increasing the threshold parameter of a secret-sharing scheme after the setup (share distribution) phase, without further communication between the dealer and the shareholders. Previous solutions to this problem require one to start off with a non-standard scheme designed specifically for this purpose, or to have secure channels between shareholders. In contrast, we show how to increase the threshold parameter of the standard CRT secret-sharing scheme without secure channels between the shareholders. Our method can thus be applied to existing CRT schemes even if they were set up without consideration of future threshold increases. Our method is a positive cryptographic application for lattice reduction algorithms, and we also use techniques from lattice theory (geometry of numbers) to prove statements about the correctness and information-theoretic security of our constructions.
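The paper's threshold-increase technique relies on lattice reduction and is not reproduced here; as background, a minimal sketch of a CRT-based (Mignotte-style) threshold scheme, with toy moduli and a toy secret chosen purely for illustration:

```python
from math import prod

def crt(residues, moduli):
    # standard Chinese Remainder reconstruction modulo prod(moduli)
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(Mi, -1, m) is the modular inverse
    return x % M

# toy (t, n) = (3, 5) scheme over pairwise-coprime moduli
moduli = [97, 98, 99, 101, 103]
# Mignotte condition: prod(largest t-1 moduli) < secret < prod(smallest t)
secret = 123456                     # 101*103 = 10403 < 123456 < 97*98*99
shares = [secret % m for m in moduli]

# any t = 3 shares determine the secret via CRT; fewer leave it ambiguous
rec = crt([shares[0], shares[2], shares[4]],
          [moduli[0], moduli[2], moduli[4]])
assert rec == secret
```

Raising the threshold after setup means arranging that fewer residues than before suffice no longer, which is exactly the step the paper achieves without secure channels.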
Abstract:
We consider the problem of increasing the threshold parameter of a secret-sharing scheme after the setup (share distribution) phase, without further communication between the dealer and the shareholders. Previous solutions to this problem require one to start off with a non-standard scheme designed specifically for this purpose, or to have communication between shareholders. In contrast, we show how to increase the threshold parameter of the standard Shamir secret-sharing scheme without communication between the shareholders. Our technique can thus be applied to existing Shamir schemes even if they were set up without consideration of future threshold increases. Our method is a new positive cryptographic application for lattice reduction algorithms, inspired by recent work on lattice-based list decoding of Reed-Solomon codes with noise bounded in the Lee norm. We use fundamental results from the theory of lattices (Geometry of Numbers) to prove quantitative statements about the information-theoretic security of our construction. These lattice-based security proof techniques may be of independent interest.
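For reference, the standard Shamir scheme that the paper builds on can be sketched as follows (the field prime and parameters are illustrative; the paper's lattice-based threshold-increase step is not shown):

```python
import random

P = 2 ** 61 - 1   # a Mersenne prime; all arithmetic is over GF(P)

def make_shares(secret, t, n):
    # random polynomial of degree t-1 with constant term equal to the secret
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if j != i:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(42, t=3, n=5)
assert reconstruct(shares[:3]) == 42    # any t shares suffice
assert reconstruct(shares[1:4]) == 42
```

Increasing t after the fact, without the dealer redistributing shares, is nontrivial precisely because any t of the original shares already determine the secret; that is the gap the lattice-based technique addresses.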
Abstract:
Vertical line extensions, both step-up and step-down, are a common occurrence in consumer products. For example, Timex recently launched its luxury high-end Valentino line. On the other hand, many companies use downscale extensions to increase overall sales volume. For instance, a number of luxury watch brands recently introduced watch collections with lower price points, like TAG Heuer’s affordable Aquaracer Calibre 5. Previous literature on vertical extensions has investigated how the number of products in the line (Dacin and Smith 1994), the direction of the extension, brand concept (Kim, Lavack, and Smith 2001), and perceived risk (Lei, de Ruyter, and Wetzels 2008) affect extension evaluations. Common to this literature is the use of models based on adaptation-level theory, which states that all relevant price information is integrated into a single prototype value and used in consumer judgments of price (Helson 1947; Mazumdar, Raj, and Sinha 2005). In the current research we argue that, while adaptation-level theory can be viewed as a useful simplification for understanding consumers’ evaluations, it misses important contextual influences caused by a brand’s price range. Drawing on research on range-frequency theory (Mellers and Cooke 1994; Parducci 1965), we investigate the effects of price point distance and the parent brand’s price range on evaluations of vertical extensions. Our reasoning leads to two important predictions that we test in a series of three experiments...
Abstract:
Micrometre-sized MgB2 crystals of varying quality, synthesized at low temperature and autogenous pressure, are compared using a combination of Raman and Infra-Red (IR) spectroscopy. These data, which include new peak positions in both spectroscopies for high-quality MgB2, are interpreted using DFT calculations on phonon behaviour for symmetry-related structures. Raman and IR activity additional to that predicted by point group analyses of the P6/mmm symmetry is detected. These additional peaks, as well as the overall shapes of calculated phonon dispersion (PD) models, are explained by assuming a double super-lattice, consistent with a lower-symmetry structure for MgB2. A 2x super-lattice in the c-direction allows a simple correlation of the pair-breaking energy and the superconducting gap by activation of corresponding acoustic frequencies. A consistent physical interpretation of these spectra is obtained when the position of a phonon anomaly defines a super-lattice modulation in the a-b plane.
Abstract:
This research is a step forward in improving the accuracy of anomaly detection in a data graph representing connectivity between people in an online social network. The proposed hybrid methods are based on fuzzy machine learning techniques utilising different types of structural input features. The methods are presented within a multi-layered framework which provides the full set of requirements needed for finding anomalies in data graphs generated from online social networks, including data modelling and analysis, labelling, and evaluation.
Abstract:
Driver training is one of the interventions aimed at reducing the number of crashes that involve novice drivers. Our failure to understand what is really important for learners, in terms of risky driving, is one of the many obstacles preventing us from building better training programs. Currently, there is a need to develop and evaluate Advanced Driving Assistance Systems that can comprehensively assess driving competencies. The aim of this paper is to present a novel Intelligent Driver Training System (IDTS) that analyses crash risks for a given driving situation, providing avenues for the improvement and personalisation of driver training programs. The analysis takes into account numerous variables acquired synchronously from the Driver, the Vehicle and the Environment (DVE). The system then segments out the manoeuvres within a drive. This paper further presents the use of fuzzy set theory to develop the safety inference rules for each manoeuvre executed during the drive. Finally, it presents a framework and an associated prototype that can be used to comprehensively view and assess complex driving manoeuvres, and then provide a comprehensive analysis of the drive that is used to give feedback to novice drivers.
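As a hedged illustration of how fuzzy safety inference rules of this general kind might look (the membership ranges, rule weights, and names below are invented for this sketch and are not taken from the IDTS):

```python
def tri(x, a, b, c):
    # triangular membership function: 0 outside [a, c], peaking at b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def manoeuvre_risk(speed_kmh, following_gap_s):
    # degrees of membership in the linguistic terms (ranges are assumptions)
    fast = tri(speed_kmh, 60, 100, 140)
    close = tri(following_gap_s, -1, 0, 2)
    # rule 1: IF speed is fast AND gap is close THEN risk is high (min t-norm)
    high_risk = min(fast, close)
    # rule 2: IF speed is fast OR gap is close THEN risk is moderate (max s-norm)
    moderate_risk = max(fast, close)
    # crude weighted defuzzification into a single 0..1 risk score
    return 0.7 * high_risk + 0.3 * moderate_risk

# a fast, tailgating manoeuvre should score riskier than a calm one
assert manoeuvre_risk(100, 0.5) > manoeuvre_risk(70, 1.8)
```

A full system would combine many such rules over the synchronously acquired DVE variables and defuzzify per manoeuvre; this sketch only shows the inference pattern.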
Abstract:
A common finding in brand extension literature is that extension’s favorability is a function of the perceived fit between the parent brand and its extension (Aaker and Keller 1990; Park, Milberg, and Lawson 1991; Volckner and Sattler 2006) that is partially mediated by perceptions of risk (Milberg, Sinn, and Goodstein 2010; Smith and Andrews 1995). In other words, as fit between the parent brand and its extension increases, parent brand beliefs become more readily available, thus increasing consumer certainty and confidence about the new extension, which results in more positive evaluations. On the other hand, as perceived fit decreases, consumer certainty about the parent brand’s ability to introduce the extension is reduced, leading to more negative evaluations. Building on the notion that perceived fit of vertical line extensions is a function of the price/quality distance between parent brand and its extension (Lei, de Ruyter, and Wetzels 2008), traditional brand extension knowledge predicts a directionally consistent impact of perceived fit on evaluations of vertical extensions. Hence, vertical (upscale or downscale) extensions that are placed closer to the parent brand in the price/quality spectrum should lead to higher favorability ratings compared to more distant ones.