70 results for quantum information theory
Abstract:
The paper "The Importance of Convexity in Learning with Squared Loss" gave a lower bound on the sample complexity of learning with quadratic loss using a nonconvex function class. The proof contains an error. We show that the lower bound nevertheless holds under a stronger condition, one that is satisfied in many cases of interest.
Abstract:
Here we present a sequential Monte Carlo (SMC) algorithm that can be used for any one-at-a-time Bayesian sequential design problem in the presence of model uncertainty where discrete data are encountered. Our focus is on adaptive design for model discrimination but the methodology is applicable if one has a different design objective such as parameter estimation or prediction. An SMC algorithm is run in parallel for each model and the algorithm relies on a convenient estimator of the evidence of each model which is essentially a function of importance sampling weights. Other methods for this task such as quadrature, often used in design, suffer from the curse of dimensionality. Approximating posterior model probabilities in this way allows us to use model discrimination utility functions derived from information theory that were previously difficult to compute except for conjugate models. A major benefit of the algorithm is that it requires very little problem specific tuning. We demonstrate the methodology on three applications, including discriminating between models for decline in motor neuron numbers in patients suffering from neurological diseases such as Motor Neuron disease.
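The evidence estimator the abstract refers to can be illustrated on a toy conjugate model, where the SMC estimate is checkable against the exact evidence. A minimal sketch of sequential importance sampling with multinomial resampling (the Beta-Bernoulli model, particle count and resampling threshold are illustrative choices, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(1)

def smc_evidence(data, n_particles=20000):
    """Estimate the model evidence p(data) for a Beta(1,1)-Bernoulli model
    by sequential importance sampling: the evidence is the running product
    of the weighted means of the incremental likelihoods (a function of
    the importance sampling weights, as the abstract describes)."""
    theta = rng.beta(1.0, 1.0, size=n_particles)  # particles from the prior
    w = np.full(n_particles, 1.0 / n_particles)   # normalised weights
    evidence = 1.0
    for y in data:
        incr = np.where(y == 1, theta, 1.0 - theta)  # p(y | theta_i)
        evidence *= np.sum(w * incr)                 # E_w[p(y | theta)]
        w *= incr
        w /= w.sum()
        # Multinomial resampling when the effective sample size collapses.
        if 1.0 / np.sum(w**2) < n_particles / 2:
            theta = rng.choice(theta, size=n_particles, p=w)
            w = np.full(n_particles, 1.0 / n_particles)
    return evidence
```

For the data [1, 1, 0, 1] the exact evidence under this conjugate model is the integral of theta^3 (1 - theta), i.e. 1/20, so the estimate can be validated directly; in the model-discrimination setting one would run one such sampler per candidate model and normalise the evidences into posterior model probabilities.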
Abstract:
Mutually unbiased bases (MUBs) have been used in several cryptographic and communications applications. There has been much speculation regarding connections between MUBs and finite geometries, most of which has focused on a connection with projective and affine planes. We propose a connection with higher-dimensional projective geometries and projective Hjelmslev geometries, and we show that this proposed geometric structure is present in several constructions of MUBs.
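The defining property of MUBs is easy to verify numerically in the standard qubit example (the eigenbases of the three Pauli matrices; this is textbook material, not the higher-dimensional geometric construction the abstract proposes):

```python
import numpy as np

# Two orthonormal bases are mutually unbiased in dimension d when every
# cross overlap satisfies |<e_i|f_j>|^2 = 1/d.  The eigenbases of the
# three Pauli matrices form the complete set of d + 1 = 3 MUBs for a
# qubit (d = 2); basis vectors are the matrix columns.
s = 1 / np.sqrt(2)
bases = [
    np.array([[1, 0], [0, 1]], dtype=complex),             # Z eigenbasis
    np.array([[s, s], [s, -s]], dtype=complex),            # X eigenbasis
    np.array([[s, s], [1j * s, -1j * s]], dtype=complex),  # Y eigenbasis
]

def unbiased(B1, B2, d=2):
    """Check that all squared overlaps between two bases equal 1/d."""
    overlaps = np.abs(B1.conj().T @ B2) ** 2
    return np.allclose(overlaps, 1.0 / d)
```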
Abstract:
We offer an exposition of Boneh, Boyen, and Goh’s “uber-assumption” family for analyzing the validity and strength of pairing assumptions in the generic-group model, and augment the original BBG framework with a few simple but useful extensions.
Abstract:
We consider the problem of increasing the threshold parameter of a secret-sharing scheme after the setup (share distribution) phase, without further communication between the dealer and the shareholders. Previous solutions to this problem require one to start off with a nonstandard scheme designed specifically for this purpose, or to have communication between shareholders. In contrast, we show how to increase the threshold parameter of the standard Shamir secret-sharing scheme without communication between the shareholders. Our technique can thus be applied to existing Shamir schemes even if they were set up without consideration to future threshold increases. Our method is a new positive cryptographic application for lattice reduction algorithms, inspired by recent work on lattice-based list decoding of Reed-Solomon codes with noise bounded in the Lee norm. We use fundamental results from the theory of lattices (geometry of numbers) to prove quantitative statements about the information-theoretic security of our construction. These lattice-based security proof techniques may be of independent interest.
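The construction above starts from standard Shamir (t, n) sharing. A minimal sketch of that baseline scheme over a prime field (the modulus and share points are illustrative; the paper's lattice-based threshold-increase technique is not shown here):

```python
import random

P = 2**31 - 1  # prime modulus; all share arithmetic is over GF(P)

def make_shares(secret, t, n):
    """Split `secret` into n shares with threshold t: evaluate a random
    degree-(t-1) polynomial with constant term `secret` at x = 1..n."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Recover the secret from at least t shares by Lagrange
    interpolation of the polynomial at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Any t of the n shares reconstruct the secret; fewer than t reveal nothing, which is the information-theoretic security property the paper's lattice argument must preserve after the threshold increase.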
Abstract:
The notion of identity-based (IB) cryptography was proposed by Shamir [177] as a specialization of public-key (PK) cryptography which dispensed with the need for cumbersome directories, certificates, and revocation lists.
Abstract:
We analyse the security of the cryptographic hash function LAKE-256 proposed at FSE 2008 by Aumasson, Meier and Phan. By exploiting non-injectivity of some of the building primitives of LAKE, we show three different collision and near-collision attacks on the compression function. The first attack uses differences in the chaining values and the block counter and finds collisions with complexity 2^33. The second attack utilizes differences in the chaining values and salt and yields collisions with complexity 2^42. The final attack uses differences only in the chaining values to yield near-collisions with complexity 2^99. All our attacks are independent of the number of rounds in the compression function. We illustrate the first two attacks by showing examples of collisions and near-collisions.
Abstract:
We discuss algorithms for combining sequential prediction strategies, a task which can be viewed as a natural generalisation of the concept of universal coding. We describe a graphical language based on Hidden Markov Models for defining prediction strategies, and we provide both existing and new models as examples. The models include efficient, parameterless models for switching between the input strategies over time, a model for the case where switches tend to occur in clusters, and a new model for the scenario where the prediction strategies have a known relationship and jumps are typically between strongly related ones. This last model is relevant for coding time series data where parameter drift is expected. As theoretical contributions, we introduce an interpolation construction that is useful in the development and analysis of new algorithms, and we prove a new lemma for analysing the individual-sequence regret of parameterised models.
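The simplest instance of combining prediction strategies is the fixed Bayesian mixture, whose cumulative log loss exceeds the best expert's by at most log of the number of experts. A minimal sketch (binary outcomes and log loss are illustrative assumptions; the paper's HMM-based switching models generalise this baseline):

```python
import math

def mixture_log_loss(experts, outcomes):
    """Combine binary prediction strategies with a uniform Bayesian
    mixture under log loss; returns the mixture's cumulative log loss,
    which exceeds the best expert's by at most log(#experts)."""
    k = len(experts)
    log_w = [math.log(1.0 / k)] * k  # log posterior weights
    total = 0.0
    for t, y in enumerate(outcomes):
        probs = [e(t) for e in experts]  # each expert's P(y_t = 1)
        w = [math.exp(lw) for lw in log_w]
        p = sum(wi * pi for wi, pi in zip(w, probs)) / sum(w)
        total += -math.log(p if y == 1 else 1.0 - p)
        # Bayes update: reweight each expert by its predictive probability.
        log_w = [lw + math.log(pi if y == 1 else 1.0 - pi)
                 for lw, pi in zip(log_w, probs)]
    return total
```

With two constant experts predicting 0.9 and 0.1 on an all-ones sequence, the mixture's loss lands between the best expert's loss and that loss plus log 2, which is the classical universal-coding regret bound.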
Abstract:
Through a combinatorial approach involving experimental measurement and plasma modelling, it is shown that a high degree of control over diamond-like nanocarbon film sp3/sp2 ratio (and hence film properties) may be exercised, starting at the level of electrons (through modification of the plasma electron energy distribution function). Hydrogenated amorphous carbon nanoparticle films with high percentages of diamond-like bonds are grown using a middle-frequency (2 MHz) inductively coupled Ar + CH4 plasma. The sp3 fractions measured by X-ray photoelectron spectroscopy (XPS) and Raman spectroscopy in the thin films are explained qualitatively using sp3/sp2 ratios 1) derived from calculated sp3 and sp2 hybridized precursor species densities in a global plasma discharge model and 2) measured experimentally. It is shown that at high discharge power and lower CH4 concentrations, the sp3/sp2 fraction is higher. Our results suggest that a combination of predictive modeling and experimental studies is instrumental in achieving deterministically grown, made-to-order diamond-like nanocarbons suitable for a variety of applications spanning from nano-magnetic resonance imaging to spin-flip quantum information devices. This deterministic approach can be extended to graphene, carbon nanotips, nanodiamond and other nanocarbon materials for a variety of applications.
Abstract:
Plasma nanoscience is an emerging multidisciplinary research field at the cutting edge of a large number of disciplines including but not limited to physics and chemistry of plasmas and gas discharges, materials science, surface science, nanoscience and nanotechnology, solid-state physics, space physics and astrophysics, photonics, optics, plasmonics, spintronics, quantum information, physical chemistry, biomedical sciences and related engineering subjects. This paper examines the origin, progress and future perspectives of this research field driven by the global scientific and societal challenges. The future potential of plasma nanoscience to remain a highly topical area in the global research and technological agenda in the age of fundamental-level control for a sustainable future is assessed using a framework of the five Grand Challenges for Basic Energy Sciences recently mapped by the US Department of Energy. It is concluded that the ongoing research is very relevant and is expected to substantially expand to competitively contribute to the solution of all of these Grand Challenges. The approach to controlling energy and matter at nano- and subnanoscales is based on identifying the prevailing carriers and transfer mechanisms of the energy and matter at the spatial and temporal scales that are most relevant to any particular nanofabrication process. Strong emphasis is placed on the competitive edge of plasma-based nanotechnology in applications related to the major socio-economic issues (energy, food, water, health and environment) that are crucial for a sustainable development of humankind. Several important emerging topics, opportunities and multidisciplinary synergies for plasma nanoscience are highlighted. The main nanosafety issues are also discussed, and the environment- and human health-friendly features of plasma-based nanotechnology are emphasized.
Abstract:
In traditional communication and information theory, noise is the demon Other, an unwelcome disruption in the passage of information. Noise is "anything that is added to the signal between its transmission and reception that is not intended by the source...anything that makes the intended signal harder to decode accurately". It is, in Michel Serres' formulation, the "third man" in dialogue who is always assumed, and whom interlocutors continually struggle to exclude. Noise is simultaneously a condition and a by-product of the act of communication; it represents the ever-present possibility of disruption, interruption, misunderstanding. In sonic or musical terms noise is cacophony, dissonance. For economists, noise is an arbitrary element, both a barrier to the pursuit of wealth and a basis for speculation. For Mick (Jeremy Sims) and his mate Kev (Ben Mendelsohn) in David Caesar's Idiot Box (1996), as for Hando (Russell Crowe) and his gang of skinheads in Geoffrey Wright's Romper Stomper (1992), or Dazey (Ben Mendelsohn) and Joe (Aden Young) in Wright's Metal Skin (1994), and all those like them starved of (useful) information and excluded from the circuit - the information poor - their only option, their only point of intervention in the loop, is to make noise, to disrupt, to discomfort, to become Serres' "third man", "the prosopopoeia of noise" (5).
Abstract:
Health care services are typically consumed out of necessity, usually to recover from illness. Although the consumption of health care services can be emotional, with consumers experiencing fear, hope, relief, and joy, there is surprisingly little research on the role of consumer affect in health care consumption. We propose that consumer affect is a heuristic cue that drives the evaluation of health care services. Drawing from cognitive appraisal theory and affect-as-information theory, this article tests a research model (N = 492) that investigates the effect of consumer affect, resulting from service performance, on subsequent service outcomes.
Abstract:
Broad knowledge is required when a business process is modeled by a business analyst. We argue that existing Business Process Management methodologies do not consider business goals at the appropriate level. In this paper we present an approach to integrate business goals and business process models. We design a Business Goal Ontology for modeling business goals. Furthermore, we devise a modeling pattern for linking the goals to process models and show how the ontology can be used in query answering. In this way, we integrate the intentional perspective into our business process ontology framework, enriching the process description and enabling new types of business process analysis. © 2008 IEEE.