940 results for Encryption schemes
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
Transmitting sensitive data over non-secret channels has always required encryption technologies to ensure that the data arrives without exposure to eavesdroppers. The Internet has made it possible to transmit vast volumes of data more rapidly and cheaply, and to a wider audience, than ever before. At the same time, strong encryption makes it possible to send data securely, to digitally sign it, to prove it was sent or received, and to guarantee its integrity. The Internet and encryption make bulk transmission of data a commercially viable proposition. However, there are implementation challenges to solve before commercial bulk transmission becomes mainstream. Powerful encryption tools have a performance cost, and may affect quality of service. Without encryption, intercepted data may be illicitly duplicated and re-sold, or its commercial value diminished because its secrecy is lost. Performance degradation and the potential for commercial loss discourage the bulk transmission of data over the Internet in any commercial application. This paper outlines technical solutions to these problems. We develop new technologies and combine existing ones in new and powerful ways to minimise commercial loss without compromising performance or inflating overheads.
Abstract:
Secure transmission of bulk data is of interest to many content providers. A commercially viable distribution of content requires technology to prevent unauthorised access. Encryption tools are powerful, but have a performance cost. Without encryption, intercepted data may be illicitly duplicated and re-sold, or its commercial value diminished because its secrecy is lost. Two technical solutions make it possible to perform bulk transmissions while retaining security without too high a performance overhead:

(a) Hierarchical encryption. The stronger the encryption, the harder it is to break, but also the more computationally expensive it is. A hierarchical approach to key exchange means that simple and relatively weak encryption and keys are used to encrypt small chunks of data, for example 10 seconds of video. Each chunk has its own key. New keys for this bottom-level encryption are exchanged using a slightly stronger encryption; for example, a whole-video key could govern the exchange of the 10-second chunk keys. At a higher level again, there could be daily or weekly keys securing the exchange of whole-video keys, and at a yet higher level, a subscriber key could govern the exchange of weekly keys. At higher levels the encryption becomes stronger but is used less frequently, so that the overall computational cost is minimal. The main observation is that the value of each encrypted item determines the strength of the key used to secure it (a sketch of this key hierarchy follows the abstract).

(b) Non-symbolic fragmentation with signal diversity. Communications are usually assumed to be sent over a single communications medium, and the data to have been encrypted and/or partitioned in whole-symbol packets. Network and path diversity break up a file or data stream into fragments which are then sent over many different channels, either in the same network or in different networks. For example, a message could be transmitted partly over the phone network and partly via satellite. While TCP/IP does a similar thing in sending different packets over different paths, this is done for load-balancing purposes and is invisible to the end application. Network and path diversity deliberately introduce the same principle as a secure communications mechanism: an eavesdropper would need to intercept not just one transmission path but all paths used. Non-symbolic fragmentation of data is also introduced to further confuse any intercepted stream of data. This involves breaking up data into bit strings which are subsequently disordered prior to transmission. Even if all transmissions were intercepted, the cryptanalyst still needs to determine fragment boundaries and correctly order them.

These two solutions depart from the usual idea of data encryption. Hierarchical encryption is an extension of the combined encryption of systems such as PGP, but with the distinction that the strength of encryption at each level is determined by the "value" of the data being transmitted. Non-symbolic fragmentation suppresses or destroys bit patterns in the transmitted data in what is essentially a bit-level transposition cipher, but with unpredictable, irregularly-sized fragments. Both technologies have applications outside the commercial sphere and can be used in conjunction with other forms of encryption, being functionally orthogonal.
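A minimal sketch of the key hierarchy in solution (a), assuming a toy XOR-keystream cipher in place of the graded-strength ciphers the abstract envisages; all names and parameters are illustrative, not the authors' implementation:

```python
import hashlib, os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream (illustration only)."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# Key hierarchy: subscriber key -> weekly key -> video key -> per-chunk keys.
subscriber_key = os.urandom(32)
weekly_key, video_key = os.urandom(32), os.urandom(32)
chunk_keys = [os.urandom(32) for _ in range(3)]          # one per ~10 s chunk

# Each level's key travels wrapped (encrypted) under the level above, so the
# strong, expensive key is used rarely while cheap chunk keys do the bulk work.
wrapped_weekly = keystream_xor(subscriber_key, weekly_key)
wrapped_video = keystream_xor(weekly_key, video_key)
wrapped_chunks = [keystream_xor(video_key, k) for k in chunk_keys]

chunks = [b"10 seconds of video #%d" % i for i in range(3)]
ciphertexts = [keystream_xor(k, c) for k, c in zip(chunk_keys, chunks)]

# The receiver unwraps top-down, then decrypts each chunk with its own key.
rec_weekly = keystream_xor(subscriber_key, wrapped_weekly)
rec_video = keystream_xor(rec_weekly, wrapped_video)
recovered = [keystream_xor(keystream_xor(rec_video, wk), ct)
             for wk, ct in zip(wrapped_chunks, ciphertexts)]
assert recovered == chunks
```

In a real deployment each level would use a progressively stronger cipher; the point of the sketch is only the wrapping structure, under which the costliest key is touched least often.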
Abstract:
Procedures that provide detection, location and correction of tampering in documents are known as anti-tampering schemes. In this paper we describe how to construct an anti-tampering scheme using a pre-computed tree of hashes. The main problems of constructing such a scheme are its computational feasibility and its candidate reduction process. We show how to solve both problems by the use of secondary hashing over a tree structure. Finally, we give brief comments on our ongoing work in this area.
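A minimal sketch of the tree-of-hashes idea, assuming a binary tree over fixed document blocks; the helper names are hypothetical, and the paper's secondary-hashing refinement of the candidate reduction process is omitted:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_tree(blocks):
    """Return the tree as a list of levels, leaf hashes first, root last."""
    level = [h(b) for b in blocks]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:                  # duplicate the last node on odd levels
            level = level + [level[-1]]
        level = [h(a + b) for a, b in zip(level[::2], level[1::2])]
        levels.append(level)
    return levels

def locate_tampering(blocks, stored):
    """Detect and locate altered blocks against a precomputed tree."""
    fresh = build_tree(blocks)
    if fresh[-1][0] == stored[-1][0]:       # roots agree: nothing detected
        return []
    # For brevity we compare the leaf level directly; the tree structure lets a
    # real scheme descend from the root and narrow candidates logarithmically.
    return [i for i, (a, b) in enumerate(zip(fresh[0], stored[0])) if a != b]

doc = [b"page-1", b"page-2", b"page-3", b"page-4"]
stored = build_tree(doc)
doc[2] = b"page-3 (altered)"
print(locate_tampering(doc, stored))        # -> [2]
```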
Airframe sound simulation based on staggered-grid higher order schemes and finite volume CFD methods
Abstract:
Abstract not available
Abstract:
Homomorphic encryption is a particular type of encryption method that enables computing over encrypted data. This has a wide range of real-world ramifications, such as being able to blindly compute a search result sent to a remote server without revealing its content. In the first part of this thesis, we discuss how database search queries can be made secure using a homomorphic encryption scheme based on the ideas of Gahi et al. Gahi’s method is based on the integer-based fully homomorphic encryption scheme proposed by van Dijk et al. We propose a new database search scheme called the Homomorphic Query Processing Scheme, which can be used with the ring-based fully homomorphic encryption scheme proposed by Brakerski. In the second part of this thesis, we discuss the cybersecurity of the smart electric grid. Specifically, we use the Homomorphic Query Processing scheme to construct a keyword search technique in the smart grid. Our work is based on the Public Key Encryption with Keyword Search (PEKS) method introduced by Boneh et al. and a Multi-Key Homomorphic Encryption scheme proposed by López-Alt et al. A summary of the results of this thesis (specifically the Homomorphic Query Processing Scheme) was published at the 14th Canadian Workshop on Information Theory (CWIT).
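As a concrete illustration of computing over encrypted data, here is a toy version of the symmetric, integer-based somewhat-homomorphic scheme of van Dijk et al. that the thesis builds on; the parameters are far too small to be secure, and this is a sketch of the underlying scheme, not of Gahi's protocol or the thesis's query processing:

```python
import random

p = 10007                                   # secret key: an odd integer

def enc(m: int) -> int:
    """Encrypt one bit m as c = p*q + 2*r + m, with small noise r."""
    q = random.randrange(1, 2**20)
    r = random.randrange(-15, 16)           # keep |2r + m| well below p/2
    return p * q + 2 * r + m

def dec(c: int) -> int:
    z = c % p
    if z > p // 2:                          # centred reduction modulo p
        z -= p
    return z % 2

m1, m2 = 1, 0
c1, c2 = enc(m1), enc(m2)
assert dec(c1 + c2) == m1 ^ m2              # adding ciphertexts computes XOR
assert dec(c1 * c2) == m1 & m2              # multiplying computes AND (noise grows)
```

The homomorphic XOR/AND pair is what lets a server evaluate a search predicate over encrypted records without ever seeing the query, which is the property the Homomorphic Query Processing Scheme exploits.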
Abstract:
Miniaturization of power generators to the MEMS scale, based on the hydrogen-air fuel cell, is the object of this research. The micro fuel cell approach has been adopted for its advantages of both high power and energy densities. On-board hydrogen production/storage and an efficient control scheme that facilitates integration with a fuel cell membrane electrode assembly (MEA) are key elements for micro energy conversion. Millimeter-scale reactors (ca. 10 µL) have been developed for hydrogen production through hydrolysis of CaH2 and LiAlH4, yielding volumetric energy densities of the order of 200 Wh/L. Passive microfluidic control schemes have been implemented in order to facilitate delivery and self-regulation, and at the same time eliminate bulky auxiliaries that run on parasitic power. One technique uses surface tension to pump water in a microchannel for hydrolysis and is self-regulated, based on load, by back pressure from accumulated hydrogen acting on a gas-liquid microvalve. This control scheme improves uniformity of power delivery during long periods of lower power demand, with fast switching to the mass-transport regime on the order of seconds, thus providing peak power density of up to 391.85 W/L. Another method takes advantage of water recovery by backward transport through the MEA of water vapor generated in the cathode half-cell reaction. This regulation-free scheme increases available reactor volume to yield an energy density of 313 Wh/L, and provides peak power density of 104 W/L. Prototype devices have been tested for a range of duty periods from 2 to 24 hours, with multiple switching of power demand in order to establish operation across multiple regimes. Issues identified as critical to the realization of the integrated power MEMS include the effects of water transport and byproduct hydrate swelling on hydrogen production in the micro reactor, and of ambient relative humidity on fuel cell performance.
Abstract:
In this talk, we propose an all-regime Lagrange-Projection-like numerical scheme for the gas dynamics equations. By all-regime, we mean that the numerical scheme is able to compute accurate approximate solutions with an under-resolved discretization with respect to the Mach number M, i.e. such that the ratio between the Mach number M and the mesh size or the time step is small with respect to 1. The key idea is to decouple acoustic and transport phenomena and then alter the numerical flux in the acoustic approximation to obtain a uniform truncation error in terms of M. This modified scheme is conservative and endowed with good stability properties with respect to the positivity of the density and the internal energy. A discrete entropy inequality under a condition on the modification is obtained thanks to a reinterpretation of the modified scheme in the Harten, Lax and van Leer formalism. A natural extension to multi-dimensional problems discretized over unstructured meshes is proposed. A simple and efficient semi-implicit scheme is then also proposed. The resulting scheme is stable under a CFL condition driven by the (slow) material waves and not by the (fast) acoustic waves, and thus satisfies the all-regime property. Numerical evidence is presented, showing the ability of the scheme to deal with tests where the flow regime may vary from low to high Mach values.
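A small numerical illustration (not the paper's scheme) of why a CFL condition driven by the material waves rather than the acoustic waves pays off at low Mach number; the field values below are hypothetical:

```python
import numpy as np

def dt_acoustic(u, c, dx, cfl=0.5):
    """Explicit time step, restricted by the fast acoustic waves |u| + c."""
    return cfl * dx / np.max(np.abs(u) + c)

def dt_material(u, dx, cfl=0.5):
    """Semi-implicit time step, restricted only by the slow material waves |u|."""
    return cfl * dx / np.max(np.abs(u))

u = np.full(100, 1.0)         # material velocity field (m/s)
c, dx = 340.0, 1e-2           # sound speed and mesh size, so M = |u|/c ~ 3e-3
print(dt_acoustic(u, c, dx), dt_material(u, dx))   # the ratio scales like M
```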
Abstract:
In this work, we introduce a new class of numerical schemes for rarefied gas dynamics problems described by collisional kinetic equations. The idea consists in reformulating the problem using a micro-macro decomposition and then solving the microscopic part by means of asymptotic-preserving Monte Carlo methods. We consider two types of decomposition, the first leading to the Euler system of gas dynamics and the second to the Navier-Stokes equations for the macroscopic part. In addition, the particle method which solves the microscopic part is designed in such a way that the global scheme becomes computationally less expensive as the solution approaches the equilibrium state, as opposed to standard methods for kinetic equations, whose computational cost increases with the number of interactions. At the same time, the statistical error due to the particle part of the solution decreases as the system approaches the equilibrium state. This causes the method to degenerate to the solution of the macroscopic hydrodynamic equations alone (Euler or Navier-Stokes) in the limit of an infinite number of collisions. In the last part, we show the behavior of this new approach in comparison to standard Monte Carlo techniques for solving the kinetic equation, testing it on different problems which typically arise in rarefied gas dynamics simulations.
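A minimal sketch of the micro-macro decomposition f = M[f] + g on a 1-D velocity grid; the distribution below is hypothetical, and the Monte Carlo particle treatment of g that the abstract describes is omitted:

```python
import numpy as np

v = np.linspace(-8.0, 8.0, 201)
f = np.exp(-(v - 0.5)**2) + 0.05 * np.exp(-(v + 2.0)**2)   # non-equilibrium f

dv = v[1] - v[0]
rho = np.sum(f) * dv                       # density
u = np.sum(v * f) * dv / rho               # mean velocity
T = np.sum((v - u)**2 * f) * dv / rho      # temperature

# Maxwellian with the same first three moments, and the microscopic remainder.
M = rho / np.sqrt(2 * np.pi * T) * np.exp(-(v - u)**2 / (2 * T))
g = f - M                                  # vanishes as f relaxes to equilibrium
print(np.sum(np.abs(g)) * dv)              # small when f is near Maxwellian
```

In the paper's setting, g is represented by particles; since g shrinks near equilibrium, fewer particles are needed there, which is why the cost decreases as the flow approaches the hydrodynamic limit.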
Abstract:
This paper deals with the development and analysis of asymptotically stable and consistent schemes in the joint quasi-neutral and fluid limits for the collisional Vlasov-Poisson system. In these limits, classical explicit schemes suffer from time step restrictions due to the small plasma period and Knudsen number. To solve this problem, we propose a new scheme that is stable for choices of time step independent of the small-scale dynamics and has computational cost comparable to standard explicit schemes. In addition, this scheme reduces automatically to consistent discretizations of the underlying asymptotic systems. In this first work on the subject, we propose a first-order-in-time scheme and perform a relative linear stability analysis to deal with such problems. The framework we propose permits extending this approach to high-order schemes in the near future. We finally show the capability of the method in dealing with small scales through numerical experiments.
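A generic illustration of the stiffness issue (not the paper's scheme): for the model problem y' = -y/ε, an explicit step is stable only for dt < 2ε, whereas an implicit step allows dt chosen independently of ε, which is the property such asymptotically stable schemes transfer to the plasma-period and Knudsen scales:

```python
eps = 1e-6
dt = 1e-2                                  # time step chosen independently of eps
y_exp = y_imp = 1.0
for _ in range(100):
    y_exp = y_exp - dt * y_exp / eps       # explicit Euler: blows up since dt >> eps
    y_imp = y_imp / (1.0 + dt / eps)       # implicit Euler: decays for any dt
print(abs(y_exp), y_imp)                   # enormous vs ~0
```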
Abstract:
It is suggested here that the ultimate accuracy of DFT methods arises from the type of hybridization scheme followed. This idea can be cast into a mathematical formulation utilizing an integrand connecting the noninteracting and the interacting particle system. We consider two previously developed models for it, dubbed HYB0 and QIDH, and assess a large number of exchange-correlation functionals against the AE6, G2/148, and S22 reference data sets. An interesting consequence of these hybridization schemes is that the error bars, including the standard deviation, are found to markedly decrease with respect to the density-based (nonhybrid) case. This improvement is substantially larger than the variations due to the underlying density functional used. We thus finally hypothesize about the universal character of the HYB0 and QIDH models.
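For reference, the integrand alluded to is the adiabatic-connection one standard in the DFT literature (a textbook form, not a formula quoted from this paper): the exchange-correlation energy is written as an average over the coupling strength λ,

```latex
% \lambda = 0: noninteracting (Kohn-Sham) system; \lambda = 1: fully interacting.
% Hybridization schemes such as HYB0 and QIDH differ in how the integrand
% W_\lambda is modelled between these two end points.
E_{\mathrm{xc}} = \int_{0}^{1} W_{\lambda}\,\mathrm{d}\lambda
```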
Abstract:
Things change. Words change, meaning changes, and use changes both words and meaning. In information access systems this means concept schemes such as thesauri or classification schemes change. They always have. Concept schemes that have survived have evolved over time, moving from one version, often called an edition, to the next. If we want to manage how words and meanings - and as a consequence use - change in an effective manner, and if we want to be able to search across versions of concept schemes, we have to track these changes. This paper explores how we might expand SKOS, a World Wide Web Consortium (W3C) draft recommendation, in order to do that kind of tracking.

The Simple Knowledge Organization System (SKOS) Core Guide is sponsored by the Semantic Web Best Practices and Deployment Working Group. The second draft, edited by Alistair Miles and Dan Brickley, was issued in November 2005. SKOS is a “model for expressing the basic structure and content of concept schemes such as thesauri, classification schemes, subject heading lists, taxonomies, folksonomies, other types of controlled vocabulary and also concept schemes embedded in glossaries and terminologies” in RDF. How SKOS handles versions in concept schemes is an open issue. The current draft guide suggests using OWL and DCTERMS as mechanisms for concept scheme revision.

As it stands, an editor of a concept scheme can make notes or declare in OWL that more than one version exists. This paper adds to the SKOS Core by introducing a tracking system for changes in concept schemes. We call this tracking system vocabulary ontogeny. Ontogeny is a biological term for the development of an organism during its lifetime. Here we use the ontogeny metaphor to describe how vocabularies change over their lifetime. Our purpose here is to create a conceptual mechanism that will track these changes and in so doing enhance information retrieval and prevent document loss through versioning, thereby enabling persistent retrieval.
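For concreteness, a minimal sketch (using rdflib; the scheme URIs are hypothetical) of the version declaration the current draft already permits, linking two editions of a concept scheme through DCTERMS and OWL:

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import DCTERMS, OWL, RDF, SKOS

EX = Namespace("http://example.org/vocab/")
g = Graph()
g.add((EX.scheme_ed1, RDF.type, SKOS.ConceptScheme))
g.add((EX.scheme_ed2, RDF.type, SKOS.ConceptScheme))
g.add((EX.scheme_ed2, DCTERMS.replaces, EX.scheme_ed1))    # edition succession
g.add((EX.scheme_ed2, OWL.versionInfo, Literal("second edition")))
print(g.serialize(format="turtle"))
```

This records only that editions exist and succeed one another; the ontogeny proposal goes further by tracking the changes to individual concepts between editions.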
Abstract:
This thesis examines cultural policy for film in Scotland, from 1997 to 2010. It explores the extent to which the industry is shaped by film policy strategies and through the agency of public funding bodies. It reflects on how Scottish Screen, Scotland’s former screen agency, articulated its role as a national institution concerned with both commercial and cultural remits, and with the conflicting interests of different industry groups. The study examines how the agency developed funding schemes to fulfil policy directives during a tumultuous period in Scottish cultural policy history, following the establishment of the Scottish Parliament with the Scotland Act 1998 and preceding the Independence Referendum Act 2013. In order to investigate how policy has shaped the development of a national film industry, a further two case studies are explored. These are Tartan Shorts, Scotland’s former flagship short film scheme, and the Audience Development Fund, Scotland’s first project-based film exhibition scheme. The first study explores the planning, implementation and evaluation of the scheme as part of the agency’s talent development strategy. The outcomes of this study show the potential impact of funding methods aimed at developing and retaining Scottish filmmaking talent. Thereafter, the Scottish exhibition sector is discussed, a previously unexplored field within film policy discussions and academic debate. The thesis outlines Scottish Screen’s legacy to current film exhibition funding practices and the practical mechanisms the agency utilised to foster Scottish audiences. By mapping the historical and political terrain, the research analyses the specificity of Scotland within the UK context and explores areas in which short-term, context-driven policies become problematic. The work concludes by presenting the advantages and issues caused by film funding practices, advocating what is needed for the film industry in Scotland today, with suggestions for long-term and cohesive policy development.