992 results for proofofknowledge concurrent zero knowledge
Abstract:
Internet use has grown considerably in recent years, and electronic commerce is rising sharply. We can now easily make purchases over the Internet without leaving home and have access to countless sources of information. However, browsing the Internet also enables the creation of detailed databases describing each user's habits, information that is then used by third parties to profile their target clientele, a practice that worries many stakeholders. Information about an individual can be gathered by intercepting transactional data, by online surveillance, and by logging IP addresses. To address these privacy problems and to ensure that merchants comply with the applicable legislation as well as the requirements put forward by the European Commission, several companies such as Zero-knowledge Systems Inc. and Anonymizer.com offer software for protecting privacy online (privacy-enhancing technologies, or PETs). These programs use encryption, a method that renders data unreadable to everyone except the intended recipient. The goal of the technology used has been to create mathematically rigorous systems that can prevent even the most determined attacker from discovering the author's identity, thereby reducing the risk of information theft or the accidental disclosure of confidential data. Although this privacy-protection software allows greater compliance with the European Directives on the matter, a deeper analysis of the subject shows that these technologies could be contrary to the laws governing encryption under Canadian, American, and French law.
Abstract:
Some problems seem impossible to solve without the help of a trusted third party. How can two millionaires find out who is richer without revealing to each other the value of their assets? What can be done to prevent satellite collisions when the trajectories are secret? How can researchers learn the links between drugs and diseases without compromising patients' privacy rights? How can an organization prevent the government from abusing the information at its disposal when the organization itself must have no access to that information? Multiparty computation, a branch of cryptography, studies how to build protocols that carry out such tasks without a trusted third party. The protocols must be private, correct, efficient, and robust. A protocol is private if an adversary learns nothing more than what a trusted third party would give it. A protocol is correct if an honest player receives what a trusted third party would give it. A protocol should, of course, be efficient. Robustness means that a protocol works even if a small set of players cheats. We show that, under the assumption of a simultaneous broadcast channel, one can trade robustness for validity and for privacy against certain sets of adversaries. Multiparty computation has four basic tools: oblivious transfer, commitment, secret sharing, and circuit garbling. Multiparty computation protocols can be built from these tools alone. Protocols can also be built from computational assumptions. Protocols built from these tools are flexible and can withstand technological change and algorithmic improvements. We ask whether efficiency requires computational assumptions. We show that it does not by constructing efficient protocols from these basic tools. This thesis consists of four articles written in collaboration with other researchers. They constitute the mature part of my research and are my main contributions during this period. In the first work presented in this thesis, we study the commitment capacity of noisy channels. We first prove a strict lower bound which implies that, unlike oblivious transfer, there is no constant-rate protocol for bit commitment. We then show that, by restricting how commitments may be opened, we can do better and even achieve a constant rate in some cases. This is done by exploiting the notion of cover-free families. In the second article, we show that for certain problems there is a trade-off among robustness, validity, and privacy. It is obtained using verifiable secret sharing, a zero-knowledge proof, the concept of ghosts, and a technique we call balls and bins. In our third contribution, we show that a large number of protocols in the literature based on computational assumptions can be instantiated from a primitive called Verifiable Oblivious Transfer, via the concept of Generalized Oblivious Transfer. The protocol uses secret sharing as a basic tool.
In the last publication, we construct an efficient constant-round protocol for two-party computation. The protocol's efficiency comes from replacing the core of a standard protocol with a primitive that works only more or less well but is very inexpensive. We then protect the protocol against these defects using the concept of privacy amplification.
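To make one of the four basic tools mentioned above concrete, here is a minimal Python sketch of additive secret sharing over a prime field; the modulus, function names, and party count are illustrative assumptions, not constructions from the thesis.

```python
# Minimal sketch of additive secret sharing over a prime field.
# The modulus P and helper names are illustrative assumptions, not from the thesis.
import secrets

P = 2**61 - 1  # a Mersenne prime used here as the field size

def share(secret: int, n: int) -> list[int]:
    """Split `secret` into n additive shares that sum to it modulo P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    last = (secret - sum(shares)) % P
    return shares + [last]

def reconstruct(shares: list[int]) -> int:
    """Recover the secret by summing all shares modulo P."""
    return sum(shares) % P

if __name__ == "__main__":
    s = 123456789
    parts = share(s, 5)
    assert reconstruct(parts) == s
    # Any strict subset of shares is uniformly random and reveals nothing about s.
    print(parts, "->", reconstruct(parts))
```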
Abstract:
The advent of personal communication systems within the last decade has depended upon the utilization of advanced digital schemes for source and channel coding and for modulation. The inherent digital nature of the communications processing has allowed the convenient incorporation of cryptographic techniques to implement security in these communications systems. There are various security requirements, of both the service provider and the mobile subscriber, which may be provided for in a personal communications system. Such security provisions include the privacy of user data, the authentication of communicating parties, the provision for data integrity, and the provision for both location confidentiality and party anonymity. This thesis is concerned with an investigation of the private-key and public-key cryptographic techniques pertinent to the security requirements of personal communication systems, and an analysis of the security provisions of Second-Generation personal communication systems is presented. Particular attention has been paid to the properties of the cryptographic protocols which have been employed in current Second-Generation systems. It has been found that certain security-related protocols implemented in the Second-Generation systems have specific weaknesses. A theoretical evaluation of these protocols has been performed using formal analysis techniques, and certain assumptions made during the development of the systems are shown to contribute to the security weaknesses. Various attack scenarios which exploit these protocol weaknesses are presented. The Fiat-Shamir zero-knowledge cryptosystem is presented as an example of how asymmetric algorithm cryptography may be employed as part of an improved security solution. Various modifications to this cryptosystem have been evaluated, and their critical parameters are shown to be capable of being optimized to suit particular applications. The implementation of such a system using current smart card technology has been evaluated.
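Since the abstract singles out the Fiat-Shamir zero-knowledge scheme, the following is a minimal Python sketch of one round of the Fiat-Shamir identification protocol; the toy modulus, secret, and round count are illustrative assumptions rather than parameters from the thesis.

```python
# Hedged sketch of the Fiat-Shamir identification protocol (one challenge bit per round).
# The tiny modulus is for readability only; real deployments use an RSA-sized modulus
# whose factorization is kept secret.
import secrets

n = 3233            # toy modulus (61 * 53); an illustrative assumption
s = 123             # prover's secret
v = pow(s, 2, n)    # public value registered with the verifier

def prover_commit():
    r = secrets.randbelow(n - 1) + 1
    return r, pow(r, 2, n)                       # keep r, send x = r^2 mod n

def prover_respond(r, e):
    return (r * pow(s, e, n)) % n                # y = r * s^e mod n

def verifier_check(x, e, y):
    return pow(y, 2, n) == (x * pow(v, e, n)) % n  # y^2 ?= x * v^e mod n

if __name__ == "__main__":
    for _ in range(20):                          # repeating rounds shrinks cheating probability
        r, x = prover_commit()
        e = secrets.randbelow(2)                 # verifier's random challenge bit
        y = prover_respond(r, e)
        assert verifier_check(x, e, y)
    print("all rounds verified")
```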
Abstract:
The past several years have seen the surprising and rapid rise of Bitcoin and other “cryptocurrencies.” These are decentralized peer-to-peer networks that allow users to transmit money, to compose financial instruments, and to enforce contracts between mutually distrusting peers, and that show great promise as a foundation for financial infrastructure that is more robust, efficient and equitable than ours today. However, it is difficult to reason about the security of cryptocurrencies. Bitcoin is a complex system, comprising many intricate and subtly-interacting protocol layers. At each layer it features design innovations that (prior to our work) have not undergone any rigorous analysis. Compounding the challenge, Bitcoin is but one of hundreds of competing cryptocurrencies in an ecosystem that is constantly evolving. The goal of this thesis is to formally reason about the security of cryptocurrencies, reining in their complexity, and providing well-defined and justified statements of their guarantees. We provide a formal specification and construction for each layer of an abstract cryptocurrency protocol, and prove that our constructions satisfy their specifications. The contributions of this thesis are centered around two new abstractions: “scratch-off puzzles,” and the “blockchain functionality” model. Scratch-off puzzles are a generalization of the Bitcoin “mining” algorithm, its most iconic and novel design feature. We show how to provide secure upgrades to a cryptocurrency by instantiating the protocol with alternative puzzle schemes. We construct secure puzzles that address important and well-known challenges facing Bitcoin today, including wasted energy and dangerous coalitions. The blockchain functionality is a general-purpose model of a cryptocurrency rooted in the “Universal Composability” cryptography theory. We use this model to express a wide range of applications, including transparent “smart contracts” (like those featured in Bitcoin and Ethereum), and also privacy-preserving applications like sealed-bid auctions. We also construct a new protocol compiler, called Hawk, which translates user-provided specifications into privacy-preserving protocols based on zero-knowledge proofs.
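As a point of reference for the mining algorithm that scratch-off puzzles generalize, here is a hedged Python sketch of the underlying hash-based proof-of-work puzzle; the difficulty setting and names are illustrative assumptions, not the thesis's scratch-off construction.

```python
# Hedged sketch of the hash-based proof-of-work puzzle underlying Bitcoin mining,
# which "scratch-off puzzles" generalize. Difficulty and names are illustrative only.
import hashlib
from itertools import count

DIFFICULTY_BITS = 18  # required leading zero bits; a toy setting

def solve(block_header: bytes) -> int:
    """Search for a nonce whose hash with the header falls below the target."""
    target = 1 << (256 - DIFFICULTY_BITS)
    for nonce in count():
        digest = hashlib.sha256(block_header + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(block_header: bytes, nonce: int) -> bool:
    """Checking a solution costs a single hash, unlike the costly search above."""
    target = 1 << (256 - DIFFICULTY_BITS)
    digest = hashlib.sha256(block_header + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < target

if __name__ == "__main__":
    header = b"example block header"
    n = solve(header)
    print(n, verify(header, n))
```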
Abstract:
This work describes a methodology to extract symbolic rules from trained neural networks. In our approach, patterns in the network are codified using formulas of a Lukasiewicz logic. For this we take advantage of the fact that every connective in this multi-valued logic can be evaluated by a neuron in an artificial network having as activation function the identity truncated to zero and one. This fact simplifies symbolic rule extraction and allows the easy injection of formulas into a network architecture. We trained this type of neural network using a back-propagation algorithm based on the Levenberg-Marquardt algorithm, where in each learning iteration we restricted the knowledge dissemination in the network structure. This makes the descriptive power of the produced neural networks similar to the descriptive power of the Lukasiewicz logic language, minimizing the information loss in the translation between connectionist and symbolic structures. To avoid redundancy in the generated networks, the method simplifies them in a pruning phase, using the "Optimal Brain Surgeon" algorithm. We tested this method on the task of finding the formula used to generate a given truth table. For real-data tests, we selected the Mushrooms data set, available in the UCI Machine Learning Repository.
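To illustrate the stated fact that every Lukasiewicz connective can be evaluated by a neuron whose activation is the identity truncated to zero and one, here is a minimal Python sketch; the weight and bias encodings are the standard truncated-linear representations, assumed here for illustration rather than extracted from the paper.

```python
# Hedged sketch: Lukasiewicz connectives as single neurons with the
# identity-truncated-to-[0,1] activation. Weights/biases follow the standard encodings.
def clip01(z: float) -> float:
    """Activation: identity truncated to zero and one."""
    return max(0.0, min(1.0, z))

def neuron(inputs, weights, bias):
    """A single neuron: weighted sum plus bias, then the truncated identity."""
    return clip01(sum(w * x for w, x in zip(weights, inputs)) + bias)

def luk_and(x, y):      # x (*) y = max(0, x + y - 1)
    return neuron([x, y], [1.0, 1.0], -1.0)

def luk_or(x, y):       # x (+) y = min(1, x + y)
    return neuron([x, y], [1.0, 1.0], 0.0)

def luk_implies(x, y):  # x -> y = min(1, 1 - x + y)
    return neuron([x, y], [-1.0, 1.0], 1.0)

if __name__ == "__main__":
    # approximately 0.3, 1.0, 0.9: each connective computed by one clipped-linear neuron
    print(luk_and(0.7, 0.6), luk_or(0.7, 0.6), luk_implies(0.7, 0.6))
```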
Abstract:
In this article I will analyse anaphoric references in German texts and their translation into Portuguese. I will take as main corpus Heinrich Böll's novel Haus ohne Hüter and its translation into Portuguese by Jorge Rosa with the title Casa Indefesa. I will concentrate on the use of personal pronouns and possessives in references to both people and objects in source text and target text, and I will present patterns of symmetries and asymmetries. I will claim that asymmetries in the translation of such anaphoric references can be accounted for mainly by differences in the pronominal systems and verbal systems of both languages and by differences in the way each language marks theme/topic continuity/discontinuity in discourse. Issues related to style and the translation of anaphors will also be addressed. I will finally raise some questions related to ambiguous references which cannot be solved within the scope of syntax or semantics, thus requiring pragmatic interpretation based on cultural knowledge/world knowledge.
Abstract:
In memory of our beloved Professor José Rodrigues Santos de Sousa Ramos (1948-2007), whose student João Cabral, one of the authors of this paper, had the honor of being between 2000 and 2006, we wrote this paper following research by experimentation, using new technologies to capture new insight into a problem, as he so much loved to do. His passion was to create new relations between different fields of mathematics. He was a builder of bridges of knowledge, encouraging the birth of new ways to understand this science. One of the areas that Sousa Ramos researched was the iteration of maps and the description of their behavior using symbolic dynamics. So, in this issue of this journal, honoring his memory, we use experimental results to find some stable regions of a specific family of real rational maps, the one he worked on with João Cabral. In this paper we describe a parameter space (a,b) for the real rational maps f_{a,b}(x) = (x^2 - a)/(x^2 - b), using some tools of dynamical systems, such as the study of the critical point orbit and Lyapunov exponents. We give some results regarding the stability of this family of maps under iteration, especially those connected with iteration of order 3. We hope that our results will help to better understand the behavior of these maps, preparing the ground for a more efficient use of Kneading Theory on this family of maps using symbolic dynamics.
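As a hedged numerical companion to the tools named above (the critical point orbit and Lyapunov exponents of f_{a,b}(x) = (x^2 - a)/(x^2 - b)), the following Python sketch estimates the Lyapunov exponent along the critical orbit; the parameter values, burn-in, and iteration counts are illustrative assumptions, not those used in the paper.

```python
# Hedged sketch: iterate the critical orbit of f_{a,b}(x) = (x^2 - a)/(x^2 - b)
# and estimate its Lyapunov exponent. Parameters and iteration counts are illustrative.
import math

def f(x, a, b):
    return (x * x - a) / (x * x - b)

def df(x, a, b):
    # derivative of f: 2x(a - b) / (x^2 - b)^2
    return 2.0 * x * (a - b) / (x * x - b) ** 2

def lyapunov_of_critical_orbit(a, b, burn_in=500, iters=5000):
    x = 0.0                      # the critical point of f_{a,b}
    for _ in range(burn_in):     # discard the transient part of the orbit
        x = f(x, a, b)
    total = 0.0
    for _ in range(iters):
        x = f(x, a, b)
        total += math.log(abs(df(x, a, b)))
    return total / iters         # negative values suggest a stable regime

if __name__ == "__main__":
    print(lyapunov_of_critical_orbit(a=1.5, b=0.5))
```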
Abstract:
The interest in zero-valent iron nanoparticles has been increasing significantly since the development of a green production method in which extracts from natural products or wastes are used. However, this field of application is still poorly studied and lacks the knowledge that would allow a full understanding of the production and application processes. The aim of the present work was to evaluate the viability of using several tree leaves to produce extracts capable of reducing iron(III) in aqueous solution to form nZVIs. The quality of the extracts was evaluated with respect to their antioxidant capacity. The results show that: i) dried leaves produce extracts with higher antioxidant capacities than non-dried leaves; ii) the most favorable extraction conditions (temperature, contact time, and volume:mass ratio) were identified for each leaf; iii) with the aim of developing a green, but also low-cost, method, water was chosen as solvent; iv) the extracts can be classified into three categories according to their antioxidant capacity (expressed as Fe(II) concentration): >40 mmol L−1; 20–40 mmol L−1; and 2–10 mmol L−1; with oak, pomegranate and green tea leaves producing the richest extracts; and v) TEM analysis proves that nZVIs (d = 10–20 nm) can be produced using the tree leaf extracts.
Abstract:
Zero-valent iron nanoparticles (nZVIs) are often used in environmental remediation. Their high surface area, associated with their high reactivity, makes them an excellent agent capable of transforming/degrading contaminants in soils and waters. Due to the recent development of green methods for the production of nZVIs, the use of this material has become even more attractive. However, knowledge of its capacity to degrade distinct types of contaminants is still scarce. The present work describes the study of the application of green nZVIs to the remediation of soils contaminated with a common anti-inflammatory drug, ibuprofen. The main objectives of this work were to produce nZVIs using extracts of grape marc, black tea and vine leaves, to verify the degradation of ibuprofen in aqueous solutions by the nZVIs, to study the remediation process of a sandy soil contaminated with ibuprofen using the nZVIs, and to compare the experiments with other common chemical oxidants. The produced nZVIs had nanometric sizes and were able to degrade ibuprofen (54 to 66% of the initial amount) in aqueous solutions. Similar remediation efficiencies were obtained in sandy soils. In this case the remediation could be enhanced (achieving degradation efficiencies above 95%) through the complementation of the process with a catalyzed nZVI Fenton-like reaction. These results indicate that this remediation technology represents a good alternative to traditional and more aggressive technologies.
Abstract:
Although cases of leishmaniasis co-infection have been described in acquired immunodeficiency syndrome patients as well as those who have undergone organ transplants, to our knowledge, the present report is the first documented case of simultaneous cutaneous, visceral and ocular leishmaniasis due to Leishmania (Viannia) braziliensis in a transplant patient. The patient had been using immunosuppressive drugs since receiving a transplanted kidney. The first clinical signs of leishmaniasis included fever, thoracic pain, hepatosplenomegaly, leucopenia and anemia. The cutaneous disease was revealed by the presence of amastigotes in the skin biopsy. After three months, the patient presented fever with conjunctival hyperemia, intense ocular pain and low visual acuity. Parasites isolated from the iliac crest, aqueous humor and vitreous body were examined using a range of molecular techniques. The same strain of L. (V.) braziliensis was responsible for the different clinical manifestations. The immunosuppressive drugs probably contributed to the dissemination of Leishmania.
Abstract:
There is hardly a case in exploration geology where the studied data does not include below-detection-limit and/or zero values, and since most geological data follows lognormal distributions, these "zero data" represent a mathematical challenge for interpretation. We need to start by recognizing that there are zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot co-exist with nepheline. Another common essential zero is a North azimuth; however, we can always change that zero for the value of 360°. These are known as "essential zeros", but what can we do with "rounded zeros" that result from values below the detection limit of the equipment? Amalgamation, e.g. adding Na2O and K2O as total alkalis, is a solution, but sometimes we need to differentiate between a sodic and a potassic alteration. Pre-classification into groups requires a good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the detection limit of the equipment used will generate spurious distributions, especially in ternary diagrams. The same will occur if we replace the zero values by a small amount using non-parametric or parametric techniques (imputation). The method that we propose takes into consideration the well-known relationships between some elements. For example, in copper porphyry deposits there is always a good direct correlation between the copper values and the molybdenum values, but while copper will always be above the limit of detection, many of the molybdenum values will be "rounded zeros". So we take the lower quartile of the real molybdenum values and establish a regression equation with copper, and then we estimate the "rounded" zero values of molybdenum from their corresponding copper values. The method can be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the "rounded zeros", but one that depends on the value of the other variable. Key words: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
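As a hedged sketch of the proposed treatment of rounded zeros (regressing below-detection molybdenum on correlated copper using the lower quartile of the detected values), the following Python snippet uses synthetic numbers; the data, the log-log model form, and the quartile handling are illustrative assumptions, not values from the abstract.

```python
# Hedged sketch of regression-based estimation of "rounded zeros":
# fit a log-log regression of Mo on Cu using the lower quartile of the detected Mo
# values, then predict Mo for below-detection samples from their Cu values.
# All numbers and the model form are illustrative assumptions.
import math

# (cu, mo) pairs in ppm; mo = None marks a value below the detection limit
samples = [(1200, 8.0), (950, 6.5), (700, 4.2), (520, 2.9),
           (400, 2.1), (310, 1.6), (260, 1.3), (210, 1.0),
           (180, None), (150, None), (90, None)]

detected = sorted(((cu, mo) for cu, mo in samples if mo is not None), key=lambda p: p[1])
low = detected[: max(2, len(detected) // 4)]   # lower quartile of detected Mo values

# least-squares fit of log(mo) = a + b * log(cu) on the low-Mo subset
xs = [math.log(cu) for cu, _ in low]
ys = [math.log(mo) for _, mo in low]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

imputed = [(cu, mo if mo is not None else round(math.exp(a + b * math.log(cu)), 2))
           for cu, mo in samples]
print(imputed)  # rounded zeros replaced by Cu-dependent estimates, not by one constant
```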
Abstract:
Traditional economic wisdom says that free entry in a market will drive profits down to zero. This conclusion is usually drawn under the assumption of perfect information. We assume that a priori there exists imperfect information about the profitability of the market, but that potential entrants may learn the demand curve perfectly at negligible cost by engaging in market research. Even if in equilibrium firms learn the demand perfectly, profits may be strictly positive because of insufficient entry. The mere fact that it will not become common knowledge that every entrant has perfect information about demand causes this surprising result. Belief means doubt. Knowing means certainty. Introduction to the Kabalah.
Abstract:
[N. 1:4400000].