982 results for certificate-based cryptography
Abstract:
One of the main features that confer high quality to seed is its genetic purity, and one of the major causes of contamination is self-pollination of the female parent. To date, there are no accurate and fast methods for detecting such contamination. Thus, this work was carried out to certify the genetic purity of hybrid maize seeds using different biochemical and DNA-based markers. Two single-cross hybrids and their parental lines derived from the maize breeding program at UFLA were evaluated by the isoenzymatic patterns of alcohol dehydrogenase (ADH), esterase (EST), acid phosphatase (ACP), glutamate-oxaloacetate transaminase (GOT), malate dehydrogenase (MDH), isocitrate dehydrogenase (IDH), phosphoglucomutase (PGM), 6-phosphogluconate dehydrogenase (PGDH), catalase (CAT) and ß-glucosidase (ßGLU), and by microsatellite markers. The enzymatic systems able to distinguish the hybrids from their parental lines were catalase, isocitrate dehydrogenase and esterase. Esterase showed a Mendelian segregation pattern for the UFLA 8/3 hybrid, which enables a safer genetic purity certification. Microsatellites were able to differentiate the hybrids from their respective parental lines. Moreover, this technique was fast, precise and free of environmental effects. For microsatellites, the amplification pattern was identical whether young leaves or seeds were used as the DNA source. The possibility of using seeds as a DNA source would accelerate and facilitate the whole process of genetic purity analysis.
Abstract:
Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal
Abstract:
Communication is the process of transmitting data across a channel. Whenever data is transmitted across a channel, errors are likely to occur. Coding theory is a branch of science that deals with finding efficient ways to encode and decode data so that any likely errors can be detected and corrected. There are many methods of encoding and decoding; one of them is algebraic geometric codes, which can be constructed from curves. Cryptography is the science of transmitting messages securely from a sender to a receiver. The objective is to encrypt the message in such a way that an eavesdropper would not be able to read it. A cryptosystem is a set of algorithms for encryption and decryption. Public key cryptosystems such as RSA and DSS have traditionally been preferred for secure communication over a channel. However, elliptic curve cryptosystems have become a viable alternative, since they provide greater security while using shorter keys than other existing cryptosystems. Elliptic curve cryptography is based on the group of points on an elliptic curve over a finite field. This thesis deals with algebraic geometric codes and their relation to cryptography using elliptic curves. Here Goppa codes are used, and the curves used are elliptic curves over a finite field. We relate algebraic geometric codes to cryptography by developing a cryptographic algorithm, which includes the processes of encryption and decryption of messages. We make use of fundamental properties of elliptic curve cryptography to generate the algorithm and to relate the two areas.
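As a quick illustration of the group law this abstract refers to, the following sketch implements point addition and double-and-add scalar multiplication on a toy elliptic curve over a small prime field. The curve parameters and base point are textbook toy values chosen for illustration, not taken from the thesis, and the sketch omits the Goppa-code side entirely.

```python
# Minimal sketch of the group law on an elliptic curve y^2 = x^3 + ax + b over F_p.
# The curve parameters and points below are illustrative assumptions, not from the thesis.

def inv_mod(x, p):
    """Modular inverse via Fermat's little theorem (p prime)."""
    return pow(x, p - 2, p)

def ec_add(P, Q, a, p):
    """Add two points on the curve; None represents the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                        # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * inv_mod(2 * y1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * inv_mod(x2 - x1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

def ec_mul(k, P, a, p):
    """Scalar multiplication by double-and-add."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P, a, p)
        P = ec_add(P, P, a, p)
        k >>= 1
    return R

# Example: the curve y^2 = x^3 + 2x + 2 over F_17 with base point (5, 1) (textbook toy values).
p, a, b = 17, 2, 2
G = (5, 1)
print(ec_mul(7, G, a, p))   # 7*G on the toy curve
```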
Abstract:
Objective: To introduce a new approach to problem-based learning (PBL) used in the context of medicinal chemistry practical classes for pharmacy students. Design: The described chemistry practical is based on independent studies by small groups of undergraduate students (4-5), who design their own practical work taking relevant professional standards into account. Students are carefully guided by feedback and acquire a set of skills important to their future profession as healthcare professionals. This model has been tailored to the application of PBL in a chemistry practical class setting for a large student cohort (150 students). Assessment: The achievement of learning outcomes is based on the submission of relevant documentation, including a certificate of analysis, in addition to peer assessment. Some of the learning outcomes are also assessed in the final written examination at the end of the academic year. Conclusion: The described design of a novel PBL chemistry laboratory course for pharmacy students has been found to be successful. Self-reflective learning and engagement with feedback were encouraged, and students enjoyed the challenging learning experience. Skills that are essential for the students' future careers as healthcare professionals are promoted.
Abstract:
A novel cryptography method based on the Lorenz attractor chaotic system is presented. The proposed algorithm is secure and fast, making it practical for general use. We introduce the chaotic operation mode, which provides an interaction among the password, the message and a chaotic system. It ensures that the algorithm yields a secure encoding, even if the nature of the chaotic system is known. The algorithm has been implemented in two versions: one sequential and slow, the other parallel and fast. Our algorithm assures the integrity of the ciphertext (we know if it has been altered, which is not guaranteed by traditional algorithms) and consequently its authenticity. Numerical experiments are presented and discussed, showing the behavior of the method in terms of security and performance. The fast version of the algorithm has performance comparable to AES, a popular encryption algorithm in widespread commercial use, but it is more secure, which makes it immediately suitable for general-purpose cryptography applications. An internet page has been set up, which enables readers to test the algorithm and also to try to break the cipher.
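The paper's chaotic operation mode is not specified in the abstract. As a rough, hypothetical illustration of how a keystream might be derived from the Lorenz system, the sketch below seeds the initial condition from a password hash, integrates the equations with forward Euler, and XORs a crude quantization of the state with the message; it does not provide the integrity or authenticity guarantees the paper claims, and every parameter and quantization choice is an assumption.

```python
# Illustrative sketch only: derive a keystream from the Lorenz system and XOR it with the message.
# The password-to-state mapping, step size and quantization are assumptions, not the paper's scheme.
import hashlib

def lorenz_keystream(password, n_bytes, sigma=10.0, rho=28.0, beta=8.0 / 3.0, dt=0.005):
    # Seed the initial condition from the password (hypothetical mapping).
    h = hashlib.sha256(password).digest()
    x, y, z = 1 + h[0] / 255.0, 1 + h[1] / 255.0, 1 + h[2] / 255.0
    out = bytearray()
    # Discard a transient so the trajectory settles onto the attractor.
    for i in range(1000 + n_bytes):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt   # forward Euler step
        if i >= 1000:
            out.append(int(abs(x) * 1e6) % 256)           # crude quantization of the state
    return bytes(out)

def xor_cipher(message, password):
    ks = lorenz_keystream(password, len(message))
    return bytes(m ^ k for m, k in zip(message, ks))

ct = xor_cipher(b"hello chaos", b"secret")
print(xor_cipher(ct, b"secret"))   # round-trips back to b'hello chaos'
```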
Abstract:
Cryptographic systems are secure, but managing their cryptographic keys is a difficult task. Keys are usually protected by password-based authentication mechanisms, which are a weak link in conventional cryptographic systems, as passwords can be easily copied or stolen. Using a biometric approach to release the keys is an alternative to password-based mechanisms, but just like passwords, the biometric signal itself must be kept safe. One such approach is biometric key cryptography: cryptographic systems that use biometric characteristics as keys are called biometric cryptographic systems. This article presents an implementation of Fuzzy Vault, a biometric cryptographic system written in Java, along with its performance evaluation. Fuzzy Vault was tested in a real application using smartcards.
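The fuzzy vault construction itself is not detailed in the abstract. As a rough sketch of the underlying idea (in the spirit of the Juels-Sudan formulation), the secret is encoded as polynomial coefficients, genuine biometric feature points are evaluated on that polynomial, and chaff points are added to hide them; the field size, feature encoding and parameters below are illustrative assumptions, not the article's Java implementation.

```python
# Rough sketch of fuzzy vault locking (Juels-Sudan style), not the article's Java implementation.
# All parameters (field size, number of chaff points, feature encoding) are illustrative assumptions.
import random

P = 65537  # small prime field for the sketch

def eval_poly(coeffs, x):
    """Evaluate a polynomial with the given coefficients at x over F_P (Horner's rule)."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

def lock_vault(secret_coeffs, features, n_chaff=200):
    """Bind the secret polynomial to the genuine feature points and hide them among chaff."""
    genuine = {x % P: eval_poly(secret_coeffs, x % P) for x in features}
    vault = dict(genuine)
    while len(vault) < len(genuine) + n_chaff:
        x = random.randrange(P)
        if x in vault:
            continue
        y = random.randrange(P)
        if y != eval_poly(secret_coeffs, x):   # chaff must not lie on the polynomial
            vault[x] = y
    points = list(vault.items())
    random.shuffle(points)
    return points

# Unlocking (not shown) matches a fresh biometric reading against the vault points and
# reconstructs the polynomial from the matched subset, e.g. via interpolation with error correction.

secret = [1234, 42, 7]                        # secret encoded as polynomial coefficients (assumed)
features = [1021, 2300, 4097, 8191, 16001]    # quantized biometric features (assumed)
vault = lock_vault(secret, features)
print(len(vault), "vault points")
```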
Abstract:
Access control is a fundamental concern in any system that manages resources, e.g., operating systems, file systems, databases and communications systems. The problem we address is how to specify, enforce, and implement access control in distributed environments. This problem occurs in many applications such as management of distributed project resources, e-newspaper and pay-TV subscription services. Starting from an access relation between users and resources, we derive a user hierarchy, a resource hierarchy, and a unified hierarchy. The unified hierarchy is then used to specify the access relation in a way that is compact and that allows efficient queries. It is also used in cryptographic schemes that enforce the access relation. We introduce three specific cryptography-based hierarchical schemes, which can effectively enforce and implement access control and are designed for distributed environments because they do not need the presence of a central authority (except perhaps for set-up).
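The three schemes themselves are not described in the abstract. As a generic illustration of how cryptography can enforce a hierarchy without a central authority, the sketch below derives each node's key from its parent's key with a keyed hash, so holding an ancestor key lets a user recompute every descendant key locally; the hierarchy, labels and choice of HMAC-SHA256 are assumptions, not the paper's schemes.

```python
# Generic illustration of hierarchy-based key derivation (not one of the paper's three schemes).
# A node's key is derived from its parent's key, so an ancestor key grants access to all descendants.
import hmac, hashlib

def derive_child_key(parent_key, child_label):
    """Derive a child node's key from its parent's key with HMAC-SHA256."""
    return hmac.new(parent_key, child_label.encode(), hashlib.sha256).digest()

def key_for_path(root_key, path):
    """Walk down the hierarchy, deriving keys along the given list of labels."""
    key = root_key
    for label in path:
        key = derive_child_key(key, label)
    return key

# Hypothetical resource hierarchy: root -> subscriptions -> payTV
root = b"\x00" * 32                                   # root key held at the top (assumed)
paytv_key = key_for_path(root, ["subscriptions", "payTV"])
# A user holding the "subscriptions" key can recompute the payTV key locally:
subs_key = derive_child_key(root, "subscriptions")
assert derive_child_key(subs_key, "payTV") == paytv_key
```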
Abstract:
Three-party password-authenticated key exchange (3PAKE) protocols allow entities to negotiate a secret session key with the aid of a trusted server with whom they share a human-memorable password. Recently, Lou and Huang proposed a simple 3PAKE protocol based on elliptic curve cryptography, which is claimed to be secure and to provide superior efficiency when compared with similar-purpose solutions. In this paper, however, we show that the solution is vulnerable to key-compromise impersonation and offline password guessing attacks by system insiders or outsiders, which indicates that the empirical approach used to evaluate the scheme's security is flawed. These results highlight the need for provable security approaches when designing and analyzing PAKE schemes. Copyright (c) 2011 John Wiley & Sons, Ltd.
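The Lou-Huang protocol is not reproduced in the abstract. As a generic illustration of why an offline password guessing attack is fatal, suppose (hypothetically) that an insider or eavesdropper obtains a transcript value of the form H(password || nonce) for a known nonce; they can then test candidate passwords at full speed with no further interaction. Every name and message format in the sketch is a hypothetical placeholder, not the actual protocol.

```python
# Generic illustration of an offline password-guessing attack against a (hypothetical)
# transcript value H(password || nonce); this is NOT the Lou-Huang protocol itself.
import hashlib

def transcript_value(password, nonce):
    # Hypothetical protocol message that binds the password to a public nonce.
    return hashlib.sha256(password.encode() + nonce).digest()

def offline_guess(observed, nonce, dictionary):
    # The attacker replays the computation for every candidate password, with no online interaction.
    for candidate in dictionary:
        if transcript_value(candidate, nonce) == observed:
            return candidate
    return None

nonce = b"\x01" * 16                                   # public value seen on the wire (assumed)
observed = transcript_value("sunshine", nonce)         # value captured by the eavesdropper
print(offline_guess(observed, nonce, ["123456", "password", "sunshine"]))  # -> 'sunshine'
```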
Abstract:
A chaotic encryption algorithm is proposed based on "Life-like" cellular automata (CA), which act as a pseudo-random number generator (PRNG). The paper's main focus is applying chaos theory to cryptography; thus, CA were explored in search of this "chaos" property. Accordingly, the manuscript concentrates on tests such as the Lyapunov exponent, entropy and Hamming distance to measure chaos in CA, as well as statistical analyses such as the DIEHARD and ENT suites. Our results achieved higher randomness quality than other ciphers in the literature. These results reinforce the supposition of a strong relationship between chaos and randomness quality. Thus, the "chaos" property of CA is a good reason to employ them in cryptography, along with their simplicity, low implementation cost and respectable encryption power. (C) 2012 Elsevier Ltd. All rights reserved.
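The specific Life-like rule and bit-extraction scheme are not given in the abstract. As a minimal sketch of the idea, the code below steps a 2D outer-totalistic CA under the B3/S23 rule (Conway's Game of Life, one member of the Life-like family) on a torus and harvests one byte per step from the grid; the rule choice, grid size, seeding and extraction are assumptions for illustration.

```python
# Minimal sketch of using a Life-like cellular automaton as a pseudo-random byte generator.
# Rule (B3/S23), grid size, seeding and bit extraction are illustrative assumptions.
import hashlib

SIZE = 32  # 32x32 toroidal grid

def seed_grid(key):
    """Expand a key into an initial grid of cells via repeated hashing."""
    bits, h = [], key
    while len(bits) < SIZE * SIZE:
        h = hashlib.sha256(h).digest()
        bits.extend((byte >> i) & 1 for byte in h for i in range(8))
    return [bits[r * SIZE:(r + 1) * SIZE] for r in range(SIZE)]

def step(grid):
    """One synchronous update with the B3/S23 (Game of Life) rule on a torus."""
    new = [[0] * SIZE for _ in range(SIZE)]
    for r in range(SIZE):
        for c in range(SIZE):
            n = sum(grid[(r + dr) % SIZE][(c + dc) % SIZE]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            new[r][c] = 1 if (n == 3 or (grid[r][c] and n == 2)) else 0
    return new

def ca_bytes(key, n):
    """Harvest one byte per CA step from a fixed row of cells (crude extraction)."""
    grid, out = seed_grid(key), bytearray()
    for _ in range(n):
        grid = step(grid)
        out.append(sum(bit << i for i, bit in enumerate(grid[0][:8])))
    return bytes(out)

print(ca_bytes(b"seed", 8).hex())
```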
Abstract:
Aims of the study: To assess the prevalence of antiepileptic drug (AED) exposure in pregnant women with or without epilepsy and the comparative risk of terminations of pregnancy (TOPs), spontaneous abortions, stillbirth, major congenital malformations (MCMs) and foetal growth retardation (FGR) following intrauterine AED exposure in the Emilia Romagna region (RER), Northern Italy (4 million inhabitants). Methods: Data were obtained from official regional registries: Certificate of Delivery Assistance, Hospital Discharge Card, reimbursed prescription databases and Registry of Congenital Malformations. We identified all the deliveries, hospitalized abortions and MCMs that occurred between January 2009 and December 2011. Results: We identified 145,243 pregnancies: 111,284 deliveries (112,845 live births and 279 stillbirths), 16,408 spontaneous abortions and 17,551 TOPs. Six hundred and eleven pregnancies (0.42%, 95% CI: 0.39-0.46) were exposed to AEDs. Twenty-one per cent of pregnancies ended in TOP in the AED group vs 12% in the non-exposed (OR: 2.24; CI: 1.41-3.56). The rate of spontaneous abortions and stillbirth was comparable in the two groups. Three hundred and fifty-three babies (0.31%, 95% CI: 0.28-0.35) were exposed to AEDs during the first trimester. The rate of MCMs was 2.3% in the AED group (2.2% in babies exposed to monotherapy and 3.1% in babies exposed to polytherapy) vs 2.0% in the non-exposed. The risk of FGR was 12.7% in the exposed group compared to 10% in the non-exposed. Discussion and Conclusion: The prevalence of AED exposure in pregnancy in the RER was 0.42%. The rate of MCMs in children exposed to AEDs in utero was almost superimposable to that of the non-exposed; however, polytherapy carried a slightly increased risk. The rate of TOPs was significantly higher in the exposed women. Further studies are needed to clarify whether this high rate reflects a higher rate of MCMs detected prenatally or other more elusive reasons.
Abstract:
Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as aero- and hydro-dynamical systems, which dominate various physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while facilitating user observation and understanding of the flow field in a clear manner. My research mainly focuses on the analysis and visualization of flow fields using various techniques, e.g. information-theoretic techniques and graph-based representations. Since streamline visualization is a popular technique in flow field visualization, how to select good streamlines to capture flow patterns and how to pick good viewpoints to observe flow fields become critical. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using the dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, so the selected streamlines do not change across viewpoints. Another work of mine [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates when the view changes gradually. When projecting 3D streamlines to 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we design FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration. We enable observation and exploration of the relationships among field line clusters, spatiotemporal regions and their interconnection in the transformed space. Most viewpoint selection methods only consider external viewpoints outside the flow field, which does not convey a clear observation when the flow field is cluttered near the boundary. Therefore, we propose a new way to explore flow fields by selecting several internal viewpoints around the flow features inside the flow field and then generating a B-Spline curve path traversing these viewpoints to provide users with close-up views for detailed observation of hidden or occluded internal flow features [54]. This work is also extended to deal with unsteady flow fields. Besides flow field visualization, some other topics relevant to visualization also attract my attention. In iGraph [31], we leverage a distributed system along with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools forms my other research focus. Since most cryptography algorithms use sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it. Therefore, we develop a set of visualization tools to provide users with an intuitive way to learn and understand these algorithms.
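The dual information channel formulation [81] is not reproduced in the abstract. As a very rough illustration of the information-theoretic flavor of viewpoint selection, the sketch below scores candidate viewpoints by the Shannon entropy of the streamline-visibility distribution seen from each viewpoint and picks the one that reveals the streamlines most evenly; the visibility matrix and scoring rule are assumptions, not the paper's method.

```python
# Rough illustration of information-theoretic viewpoint scoring (not the paper's dual channel [81]):
# score each candidate viewpoint by the Shannon entropy of the streamline-visibility distribution.
import math

def entropy(probabilities):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def best_viewpoint(visibility):
    """visibility[v][s]: projected area of streamline s seen from viewpoint v (assumed given)."""
    best, best_score = None, -1.0
    for v, areas in enumerate(visibility):
        total = sum(areas)
        if total == 0:
            continue
        score = entropy(a / total for a in areas)
        if score > best_score:
            best, best_score = v, score
    return best, best_score

# Toy visibility matrix: 3 viewpoints x 4 streamlines (illustrative numbers).
visibility = [
    [10.0, 0.0, 0.0, 0.0],   # viewpoint 0 sees only one streamline
    [4.0, 3.0, 2.0, 1.0],    # viewpoint 1 sees all four, unevenly
    [2.5, 2.5, 2.5, 2.5],    # viewpoint 2 sees all four evenly -> highest entropy
]
print(best_viewpoint(visibility))   # -> (2, 2.0)
```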
Abstract:
Since the tragic events of September 11, 2001, the United States has engaged in building the infrastructure and developing the expertise necessary to protect its borders and its citizens from further attacks against its homeland. One approach has been the development of academic courses to educate individuals on the nature and dangers of subversive attacks and to prepare them to respond to attacks and other large-scale emergencies in their roles as working professionals, participating members of their communities, and collaborators with first responders. An initial review of the literature failed to reveal any university-based emergency management courses or programs with a disaster medical component, despite the public health significance and need for such programs. In the Fall of 2003, The School of Management at The University of Texas at Dallas introduced a continuing education Certificate in Emergency Management and Preparedness Program. This thesis will (1) describe the development and implementation of a new Disaster Medical Track as a component of this Certificate in Emergency Management and Preparedness Program, (2) analyze the need for and effectiveness of this Disaster Medical Track, and (3) propose improvements in the track based on this analysis.
Abstract:
Current approaches to mobile code safety – inspired by the technique of Proof-Carrying Code (PCC) [4] – associate safety information (in the form of a certificate) with programs. The certificate (or proof) is created by the code supplier at compile time and packaged along with the untrusted code. The consumer who receives the code+certificate package can then run a checker which, by a straightforward inspection of the code and the certificate, is able to verify the validity of the certificate and thus compliance with the safety policy. The main practical difficulty of PCC techniques is in generating safety certificates which simultaneously: i) allow expressing interesting safety properties, ii) can be generated automatically, and iii) are easy and efficient to check.
Abstract:
Abstraction-Carrying Code (ACC) is a framework for mobile code safety in which the code supplier provides a program together with an abstraction (or abstract model of the program) whose validity entails compliance with a predefined safety policy. The abstraction thus plays the role of safety certificate, and its generation is carried out automatically by a fixed-point analyzer. The advantage of providing a (fixed-point) abstraction to the code consumer is that its validity is checked in a single pass (i.e., one iteration) of an abstract interpretation-based checker. A main challenge in making ACC useful in practice is to reduce the size of certificates as much as possible, while at the same time not increasing checking time. Intuitively, we only include in the certificate the information which the checker is unable to reproduce without iterating. We introduce the notion of reduced certificate, which characterizes the subset of the abstraction which a checker needs in order to validate (and re-construct) the full certificate in a single pass. Based on this notion, we show how to instrument a generic analysis algorithm with the necessary extensions in order to identify the information relevant to the checker.
Abstract:
Abstraction-Carrying Code (ACC) has recently been proposed as a framework for mobile code safety in which the code supplier provides a program together with an abstraction (or abstract model of the program) whose validity entails compliance with a predefined safety policy. The abstraction thus plays the role of safety certificate, and its generation is carried out automatically by a fixpoint analyzer. The advantage of providing a (fixpoint) abstraction to the code consumer is that its validity is checked in a single pass (i.e., one iteration) of an abstract interpretation-based checker. A main challenge in making ACC useful in practice is to reduce the size of certificates as much as possible while at the same time not increasing checking time. The intuitive idea is to only include in the certificate information that the checker is unable to reproduce without iterating. We introduce the notion of reduced certificate, which characterizes the subset of the abstraction which a checker needs in order to validate (and re-construct) the full certificate in a single pass. Based on this notion, we instrument a generic analysis algorithm with the necessary extensions in order to identify the information relevant to the checker. Interestingly, the fact that the reduced certificate omits (parts of) the abstraction has implications for the design of the checker. We provide sufficient conditions which allow us to ensure that 1) if the checker succeeds in validating the certificate, then the certificate is valid for the program (correctness) and 2) the checker will succeed for any reduced certificate which is valid (completeness). Our approach has been implemented and benchmarked within the CiaoPP system. The experimental results show that our proposal is able to greatly reduce the size of certificates in practice. To appear in Theory and Practice of Logic Programming (TPLP).
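The CiaoPP analysis and checking machinery is not reproduced here. As a toy illustration of the single-pass idea, the sketch below checks that a supplied abstraction (a sign abstraction of program variables over a straight-line toy program, both assumed for illustration) is consistent with the program's abstract semantics in one pass, instead of re-running the fixpoint iteration that produced it; a real checker over programs with loops would additionally verify join points.

```python
# Toy illustration of single-pass certificate checking in the spirit of ACC (not the CiaoPP algorithm).
# The abstract domain (signs), program representation and transfer functions are assumptions.

def leq(a, b):
    """Partial order on the sign domain: BOT <= {NEG, ZERO, POS} <= TOP."""
    return a == b or a == "BOT" or b == "TOP"

def transfer(stmt, env):
    """Abstract transfer for a tiny language: ('assign_const', var, n) and ('negate', var)."""
    env = dict(env)
    kind = stmt[0]
    if kind == "assign_const":
        _, var, n = stmt
        env[var] = "POS" if n > 0 else "NEG" if n < 0 else "ZERO"
    elif kind == "negate":
        _, var = stmt
        s = env.get(var, "TOP")
        env[var] = {"POS": "NEG", "NEG": "POS"}.get(s, s)
    return env

def check_certificate(program, certificate):
    """One pass: the transfer of each certified state must stay below the next certified state."""
    for i, stmt in enumerate(program):
        post = transfer(stmt, certificate[i])
        if not all(leq(v, certificate[i + 1].get(var, "TOP")) for var, v in post.items()):
            return False   # certificate is not consistent with the program
    return True

program = [("assign_const", "x", 5), ("negate", "x")]
certificate = [{}, {"x": "POS"}, {"x": "NEG"}]   # abstraction supplied by the code producer (assumed)
print(check_certificate(program, certificate))   # -> True
```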