22 results for Decryption
Abstract:
The aim of this thesis is to reflect on the issues at stake in a history of the real-time strategy (RTS) game. The goal is to better understand the contexts in which the genre takes on meaning, in order to historicize its emergence and its classical period. This thesis seeks to document, on the one hand, the crystallization of the RTS as an object with a relatively stable form and as a precise, identified corpus and, on the other hand, the emergence of the classical forms of RTS gameplay. The first part describes the object of this research, to better grasp the complexity of the term "strategy" and of the "strategy game" category. The second part lays out the epistemological reflection by showing how gameplay can be taken into account in historical work. It defines the concept of the gameplay paradigm as a discursive formation that groups different actional statements into a logical unit which is not necessarily equivalent to the genre. The third part maps the emergence of the genre between the wargames of the 1970s and the multiplayer games of the following decade. Two gameplay paradigms stand out in forming the classical RTS: the decryption paradigm and the prediction paradigm. The fourth part explains and contextualizes the classical RTS by showing that it comprises these two gameplay paradigms in two game modes that offer fundamentally different experiences.
Abstract:
This paper presents a DES/3DES core that supports cipher block chaining (CBC) and has a built-in key generator; together they take up about 10% of the resources in a Xilinx Virtex II 1000-4. The core achieves up to 200 Mbit/s of encryption or decryption. Also presented is a network architecture that allows these CBC-capable 3DES cores to perform their processing in parallel.
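The chaining structure the abstract refers to can be sketched briefly. A trivial XOR "cipher" stands in for the DES/3DES core here; it is not secure and is purely an illustrative assumption, meant only to show how each plaintext block is XORed with the previous ciphertext block before encryption:

```python
BLOCK = 8  # DES block size in bytes

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def toy_encrypt(block: bytes, key: bytes) -> bytes:
    return xor_bytes(block, key)  # placeholder for the real DES/3DES core

def toy_decrypt(block: bytes, key: bytes) -> bytes:
    return xor_bytes(block, key)  # XOR is its own inverse

def cbc_encrypt(plaintext: bytes, key: bytes, iv: bytes) -> bytes:
    assert len(plaintext) % BLOCK == 0
    prev, out = iv, []
    for i in range(0, len(plaintext), BLOCK):
        c = toy_encrypt(xor_bytes(plaintext[i:i + BLOCK], prev), key)
        out.append(c)
        prev = c  # chaining: each ciphertext block feeds into the next
    return b"".join(out)

def cbc_decrypt(ciphertext: bytes, key: bytes, iv: bytes) -> bytes:
    prev, out = iv, []
    for i in range(0, len(ciphertext), BLOCK):
        c = ciphertext[i:i + BLOCK]
        out.append(xor_bytes(toy_decrypt(c, key), prev))
        prev = c
    return b"".join(out)
```

Because each block depends on the previous ciphertext block, a single CBC stream cannot be encrypted in parallel; the parallelism the paper describes comes from running independent streams on multiple cores.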
Abstract:
The security and reliability of a class of public-key cryptosystems were discussed with respect to attacks by unauthorized parties who have acquired partial knowledge of one or more of the private-key components and/or of the message. The standard statistical mechanical methods for dealing with diluted spin systems under replica-symmetric considerations were analyzed. The dynamical transition which defines decryption success in practical situations was studied. Phase diagrams showing the dynamical threshold as a function of the partially acquired knowledge of the private key were also presented.
Abstract:
The phenomenal growth of the Internet has connected us to a vast amount of computation and information resources around the world. However, making use of these resources is difficult due to the unparalleled massiveness, high communication latency, share-nothing architecture and unreliable connections of the Internet. In this dissertation, we present a distributed software agent approach, which brings a new distributed problem-solving paradigm to Internet computing research with an enhanced client-server scheme, inherent scalability and heterogeneity. Our study discusses the role of a distributed software agent in Internet computing and classifies it into three major categories by the objects it interacts with: computation agent, information agent and interface agent. The discussion of the problem domain and the deployment of the computation agent and the information agent are presented with the analysis, design and implementation of experimental systems in high-performance Internet computing and in scalable Web searching. In the computation agent study, high-performance Internet computing is achieved with our proposed Java massive computation agent (JAM) model. We analyzed the JAM computing scheme and built a brute-force ciphertext decryption prototype. In the information agent study, we discuss the scalability problem of existing Web search engines and design an approach to Web searching with distributed collaborative index agents. This approach can be used to construct a more accurate, reusable and scalable solution that copes with the growth of the Web and of the information on the Web. Our research reveals that with the deployment of distributed software agents in Internet computing, we gain a more cost-effective way to make better use of the gigantic network of computation and information resources on the Internet.
The case studies in our research show that we are now able to solve many practically hard or previously unsolvable problems caused by the inherent difficulties of Internet computing.
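The brute-force decryption prototype lends itself to agent-style distribution because the keyspace partitions cleanly. A hedged single-process sketch, in which each key range stands in for the share one JAM-style agent would search independently (the toy XOR cipher and the 8-bit keyspace are illustrative assumptions, not the prototype's actual parameters):

```python
def toy_decrypt(ciphertext: bytes, key: int) -> bytes:
    # Illustrative stand-in for the real cipher being attacked.
    return bytes(b ^ key for b in ciphertext)

def search_range(ciphertext: bytes, known_plaintext: bytes, lo: int, hi: int):
    """One agent's share of the keyspace: try every key in [lo, hi)."""
    for key in range(lo, hi):
        if toy_decrypt(ciphertext, key) == known_plaintext:
            return key
    return None

def brute_force(ciphertext: bytes, known_plaintext: bytes,
                keyspace: int = 256, agents: int = 4):
    # Partition the keyspace; each range could be shipped to a
    # separate agent and searched concurrently.
    step = keyspace // agents
    for lo in range(0, keyspace, step):
        key = search_range(ciphertext, known_plaintext, lo, lo + step)
        if key is not None:
            return key
    return None
```

Since the ranges share no state, the scheme scales to as many agents as there are partitions, which is the property the JAM model exploits.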
Abstract:
Recent years have seen an astronomical rise in SQL Injection Attacks (SQLIAs) used to compromise the confidentiality, authentication and integrity of organisations' databases. Intruders are becoming smarter at obfuscating web requests to evade detection; combined with increasing volumes of web traffic from the Internet of Things (IoT), cloud-hosted and on-premise business applications, this has made it evident that existing approaches, based mostly on static signatures, lack the ability to cope with novel signatures. A SQLIA detection and prevention solution can be achieved by exploring an alternative bio-inspired supervised learning approach that takes as input a labelled dataset of numerical attributes for classifying true positives and negatives. We present in this paper Numerical Encoding to Tame SQLIA (NETSQLIA), which implements a proof of concept for the scalable numerical encoding of features into dataset attributes with a labelled class obtained from deep web-traffic analysis. For the numerical attribute encoding, the model leverages a proxy for the interception and decryption of web traffic. The intercepted web requests are then assembled for front-end SQL parsing and pattern matching by applying a traditional Non-Deterministic Finite Automaton (NFA). This paper presents a technique for extracting numerical attributes of any size, primed as an input dataset to an Artificial Neural Network (ANN) and to statistical Machine Learning (ML) algorithms implemented using a Two-Class Averaged Perceptron (TCAP) and Two-Class Logistic Regression (TCLR) respectively. This methodology then forms the subject of an empirical evaluation of the suitability of the model for the accurate classification of both legitimate web requests and SQLIA payloads.
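The core idea of mapping a web request to numeric attributes can be sketched minimally. The particular features below (length, quote count, keyword hits, URL-encoded bytes, a tautology flag) are illustrative assumptions, not the paper's actual attribute set; they merely show the shape of the vector a perceptron or logistic-regression classifier would consume:

```python
import re

# Hypothetical keyword list; the real system derives patterns via NFA matching.
SQL_KEYWORDS = ("select", "union", "insert", "drop", " or ", " and ", "--")

def encode_request(request: str) -> list:
    """Map one web request string to a fixed-length numeric feature vector."""
    lowered = request.lower()
    return [
        len(request),                                  # raw request length
        lowered.count("'") + lowered.count('"'),       # quote characters
        sum(lowered.count(k) for k in SQL_KEYWORDS),   # SQL keyword hits
        len(re.findall(r"%[0-9a-f]{2}", lowered)),     # URL-encoded bytes
        int("1=1" in lowered.replace(" ", "")),        # classic tautology flag
    ]
```

Each labelled request becomes one numeric row, so the encoder's output can be fed directly to any two-class learner without further preprocessing.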
Abstract:
One of the main practical implications of quantum mechanical theory is quantum computing, and therefore the quantum computer. Quantum computing (for example, with Shor's algorithm) challenges the computational hardness assumptions, such as the factoring problem and the discrete logarithm problem, that anchor the safety of cryptosystems. The scientific community is therefore studying how to defend cryptography; there are two defense strategies: quantum cryptography (which involves the use of quantum cryptographic algorithms on quantum computers) and post-quantum cryptography (based on classical cryptographic algorithms, but resistant to quantum computers). For example, the National Institute of Standards and Technology (NIST) is collecting and standardizing post-quantum ciphers, as it established DES and AES as symmetric cipher standards in the past. In this thesis an introduction to quantum mechanics was given, in order to be able to discuss quantum computing and to analyze Shor's algorithm. The differences between quantum and post-quantum cryptography were then analyzed. Subsequently the focus was placed on the mathematical problems assumed to be resistant to quantum computers. To conclude, the post-quantum digital signature algorithms selected by NIST were studied and compared in order to apply them in practice.
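The link between Shor's algorithm and the factoring problem can be made concrete with a small classical sketch: Shor's method reduces factoring N to finding the period r of a^x mod N, then derives factors from gcd(a^(r/2) ± 1, N). Here the period is found by brute force, which is exponential in general; that period-finding step is precisely what the quantum part accelerates. A toy illustration, not an implementation of the quantum algorithm:

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Smallest r > 0 with a^r ≡ 1 (mod n), found by brute force."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n: int, a: int):
    """Factor n using the period of a mod n; returns None on a bad choice of a."""
    assert gcd(a, n) == 1
    r = find_period(a, n)
    if r % 2:
        return None  # odd period: retry with another a
    y = pow(a, r // 2, n)
    p = gcd(y - 1, n)
    if 1 < p < n:
        return (p, n // p)
    return None  # trivial factor: retry with another a
```

For RSA-sized moduli the period search above is hopeless classically, which is exactly why a quantum computer running Shor's algorithm breaks the factoring assumption.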