918 results for Chinese information processing
Abstract:
A key element in the architecture of a quantum-information processing network is a reliable physical interface between fields and qubits. We study a process of entanglement transfer engineering, where two remote qubits respectively interact with an entangled two-mode continuous-variable (CV) field. We quantify the entanglement induced in the qubit state at the expense of the loss of entanglement in the CV system. We discuss the range of mixed entangled states which can be obtained with this setup. Furthermore, we suggest a protocol to determine the residual entangling power of the light fields, thus inferring the entanglement left in the field modes, which, after the interaction, are no longer in a Gaussian state. Two different setups are proposed: a cavity-QED system and an interface between superconducting qubits and field modes. We address in detail the practical difficulties inherent in these two proposals, showing that the latter is promising in many aspects.
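The abstract does not spell out the qubit-field coupling; as a hedged illustration only, the resonant Jaynes-Cummings interaction commonly assumed for such cavity-QED interfaces between one qubit and one field mode reads

\hat{H}_{\mathrm{int}} = \hbar g \, ( \hat{\sigma}_+ \hat{a} + \hat{\sigma}_- \hat{a}^\dagger ),

where g is the coupling strength, \hat{a} (\hat{a}^\dagger) annihilates (creates) a photon in the mode, and \hat{\sigma}_\pm are the qubit raising and lowering operators; in the two-mode scheme described above, each remote qubit would couple locally to its own mode in this way.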
Abstract:
The realization of nonclassical states is an important task for many applications of quantum information processing. Usually, properly tailored interactions, different from goal to goal, are considered in order to accomplish specific tasks within the general framework of quantum state engineering. In this paper, we remark on the flexibility of a cross-Kerr nonlinear coupling in hybrid systems as an important ingredient in the engineering of nonclassical states. The general scenario we consider is the implementation of a high cross-Kerr nonlinearity in cavity quantum electrodynamics. In this context, we discuss the possibility of performing entanglement transfer and swapping between matter qubits and light fields initially prepared in separable coherent states. The recently introduced concept of entanglement reciprocation is also considered and shown to be possible with our scheme. We reinterpret some of our results in terms of applications of a generalized Ising interaction to systems of different natures.
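For reference, the cross-Kerr coupling between two bosonic modes \hat{a} and \hat{b} invoked above is conventionally written (standard form, not necessarily the paper's exact notation) as

\hat{H}_{\mathrm{cK}} = \hbar \chi \, \hat{a}^\dagger \hat{a} \, \hat{b}^\dagger \hat{b},

where \chi is the nonlinear strength: each mode acquires a phase shift proportional to the photon number in the other, which is what makes the coupling useful for the entanglement transfer and swapping operations discussed here.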
Abstract:
We address the propagation of a single-photon pulse with two polarization components, i.e., a polarization qubit, in an inhomogeneously broadened "phaseonium" \Lambda-type three-level medium. We combine some of the non-trivial propagation effects characteristic of this kind of coherently prepared system with the controlled reversible inhomogeneous broadening technique to propose several quantum information processing applications, such as a protocol for polarization qubit filtering and sieving as well as a tunable polarization beam splitter. Moreover, we show that, by imposing a spatial variation of the atomic coherence phase, an efficient quantum memory for the incident polarization qubit can also be implemented in \Lambda-type three-level systems.
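As a hedged reminder of the underlying level scheme (standard notation, not necessarily the paper's), the resonant \Lambda-type coupling between the ground state |g\rangle, the storage state |s\rangle and the excited state |e\rangle, driven by probe and control fields with Rabi frequencies \Omega_p and \Omega_c, reads

\hat{H} = -\hbar \left( \Omega_p |e\rangle\langle g| + \Omega_c |e\rangle\langle s| + \mathrm{H.c.} \right),

with the inhomogeneous broadening entering as atom-dependent detunings whose controlled reversal rephases the atomic coherences and triggers re-emission of the stored pulse.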
Abstract:
Three experiments examined children’s and adults’ abilities to use statistical and temporal information to distinguish between common cause and causal chain structures. In Experiment 1, participants were provided with conditional probability information and/or temporal information and asked to infer the causal structure of a three-variable mechanical system that operated probabilistically. Participants of all ages preferentially relied on the temporal pattern of events in their inferences, even when this conflicted with statistical information. In Experiments 2 and 3, participants observed a series of interventions on the system, which in these experiments operated deterministically. In Experiment 2, participants found it easier to use temporal pattern information than statistical information provided as a result of interventions. In Experiment 3, in which no temporal pattern information was provided, children from 6-7 years of age, but not younger children, were able to use intervention information to make causal chain judgments, although they had difficulty when the structure was a common cause. The findings suggest that participants, and children in particular, may find it more difficult to use statistical information than temporal pattern information because of its demands on information-processing resources. However, there may also be an inherent preference for temporal information.
Abstract:
We present a new algorithm for exactly solving decision-making problems represented as influence diagrams. We do not require the usual assumptions of no-forgetting and regularity, which allows us to solve problems with limited information. The algorithm, which implements a sophisticated variable elimination procedure, is empirically shown to outperform a state-of-the-art algorithm on randomly generated problems with up to 150 variables and 10^64 strategies.
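The elimination procedure itself is not given in the abstract; purely as a toy illustration of the two operations any such scheme alternates between (summing out chance variables, maximising out decisions), consider a hypothetical one-decision, one-chance-variable problem with made-up numbers:

# Toy variable elimination for a decision problem: eliminate the chance
# variable C by summation, then the decision variable D by maximisation.
# Real influence diagrams interleave many such eliminations; this sketch
# handles neither limited information nor multiple decisions.
P = {0: 0.3, 1: 0.7}                      # P(C = c)
U = {(0, 0): 1.0, (0, 1): 4.0,            # utility U(d, c)
     (1, 0): 3.0, (1, 1): 2.0}

expected = {d: sum(P[c] * U[d, c] for c in P) for d in (0, 1)}
best = max(expected, key=expected.get)
print(best, expected[best])               # -> 0 3.1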
Abstract:
Pre-processing (PP) of the received symbol vector and channel matrices is an essential prerequisite operation for Sphere Decoder (SD)-based detection in Multiple-Input Multiple-Output (MIMO) wireless systems. PP is a highly complex operation, but relative to the total SD workload it represents only a small fraction of the overall computational cost of detecting an OFDM MIMO frame in standards such as 802.11n. Despite this, real-time PP architectures are highly inefficient and dominate the resource cost of real-time SD architectures. This paper resolves this issue. By reorganising the ordering and QR decomposition sub-operations of PP, we describe a Field Programmable Gate Array (FPGA)-based PP architecture for the Fixed Complexity Sphere Decoder (FSD) applied to 4 × 4 802.11n MIMO which reduces resource cost by 50% compared to state-of-the-art solutions whilst maintaining real-time performance.
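The FPGA architecture itself is not reproduced here; purely as a sketch of the two PP sub-operations being reorganised (column ordering followed by QR decomposition), a numpy version for a hypothetical 4 × 4 channel could look as follows, noting that the ordering rule below is a simple column-norm heuristic rather than the exact FSD ordering:

import numpy as np

def preprocess(H):
    # Order the detection sequence, then QR-decompose the permuted channel;
    # Q^H rotates the received vector before the sphere-decoder tree search.
    order = np.argsort(np.linalg.norm(H, axis=0))   # illustrative rule only
    Q, R = np.linalg.qr(H[:, order])
    return Q, R, order

# Hypothetical 4 x 4 complex Gaussian channel and received vector
H = (np.random.randn(4, 4) + 1j * np.random.randn(4, 4)) / np.sqrt(2)
y = H @ np.ones(4)
Q, R, order = preprocess(H)
z = Q.conj().T @ y          # input to the tree-search stage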
Abstract:
The main object of this thesis is the study of algorithms for the automatic processing and representation of data, in particular of information obtained from sensors mounted on board vehicles (2D and 3D), with application in the context of driving-assistance systems. The work addresses some of the problems that both autonomous driving (AD) systems and advanced driver assistance systems (ADAS) face today. The document consists of two parts. The first describes the design, construction and development of three robotic prototypes, including details of the sensors mounted on board the robots, the algorithms and the software architectures. These robots were used as test platforms to validate the proposed techniques. In addition, they took part in several autonomous driving competitions and obtained very good results. The second part of this document presents several algorithms used to generate intermediate representations of sensory data. These can be used to improve existing pattern recognition, detection or navigation techniques, and thereby contribute to future AD or ADAS applications. Since autonomous vehicles carry a large number of sensors of different natures, intermediate representations are particularly suitable, as they can deal with problems related to the diverse nature of the data (2D, 3D, photometric, etc.), with the asynchronous character of the data (multiple sensors sending data at different frequencies), or with data alignment (calibration problems, different sensors providing different measurements for the same object). In this context, new techniques are proposed for computing a multi-camera multi-modal inverse perspective mapping representation, for performing colour correction between images in order to obtain high-quality mosaics, and for generating a scene representation based on polygonal primitives, capable of handling large amounts of 3D and 2D data and even of refining the representation as new sensory data are received.
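By way of illustration only, a single-camera inverse perspective mapping — the building block that the multi-camera, multi-modal representation proposed in the thesis generalises — can be sketched with OpenCV as below; the pixel and bird's-eye coordinates, as well as the file name, are placeholders rather than values from the thesis:

import cv2
import numpy as np

# Four image points on the road plane and their bird's-eye-view targets
src = np.float32([[420, 720], [860, 720], [720, 480], [560, 480]])   # image pixels (hypothetical)
dst = np.float32([[300, 600], [500, 600], [500, 200], [300, 200]])   # top-down pixels (hypothetical)
M = cv2.getPerspectiveTransform(src, dst)

frame = cv2.imread("camera_frame.png")                 # hypothetical on-board camera frame
birds_eye = cv2.warpPerspective(frame, M, (800, 600))  # inverse-perspective (top-down) view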
Abstract:
The Wyner-Ziv video coding (WZVC) rate-distortion performance is highly dependent on the quality of the side information, an estimation of the original frame created at the decoder. This paper characterizes the WZVC efficiency when motion-compensated frame interpolation (MCFI) techniques are used to generate the side information, a difficult problem in WZVC, especially because the decoder only has some reference decoded frames available. The proposed WZVC compression efficiency rate model relates the power spectral density of the estimation error to the accuracy of the MCFI motion field. Some interesting conclusions may then be derived about the impact of the motion field smoothness, and of its correlation with the true motion trajectories, on the compression performance.
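For orientation, a bare-bones version of the MCFI step that produces the side information — block-wise bidirectional averaging of two decoded reference frames along an estimated motion field — can be sketched as follows; the block size and the motion-estimation stage are placeholders, and the paper's rate model itself is not reproduced:

import numpy as np

def mcfi_side_info(prev_ref, next_ref, motion, block=8):
    # motion[i, j] holds the (dy, dx) vector from prev_ref to next_ref for
    # block (i, j); the interpolated frame takes half the motion towards each
    # reference and averages the two blocks. Frame dimensions are assumed to
    # be multiples of the block size.
    h, w = prev_ref.shape
    side = np.empty((h, w), dtype=np.float64)
    for by in range(0, h, block):
        for bx in range(0, w, block):
            dy, dx = motion[by // block, bx // block]
            py = int(np.clip(by - dy // 2, 0, h - block))
            px = int(np.clip(bx - dx // 2, 0, w - block))
            ny = int(np.clip(by + dy // 2, 0, h - block))
            nx = int(np.clip(bx + dx // 2, 0, w - block))
            side[by:by + block, bx:bx + block] = (
                prev_ref[py:py + block, px:px + block].astype(np.float64)
                + next_ref[ny:ny + block, nx:nx + block]) / 2.0
    return side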
Abstract:
Quantum information theory studies the fundamental limits that the laws of physics impose on data-processing tasks such as compression and the transmission of data over a noisy channel. This thesis presents general techniques that allow several fundamental problems of quantum information theory to be solved within a single framework. The central theorem of this thesis establishes the existence of a protocol for transmitting quantum data that the receiver already partially knows, using a single use of a noisy quantum channel. This theorem moreover yields, as immediate corollaries, several central theorems of quantum information theory. The following chapters use this theorem to prove the existence of new protocols for two other types of quantum channels, namely quantum broadcast channels and quantum channels with side information available at the transmitter. These protocols also deal with the transmission of quantum data partially known to the receiver using a single use of the channel, and have as corollaries asymptotic versions with and without auxiliary entanglement. The asymptotic versions with auxiliary entanglement can, in both cases, be regarded as quantum versions of the best known coding theorems for the classical versions of these problems. The last chapter deals with a purely quantum phenomenon called locking: it is possible to encode a classical message in a quantum state such that, by removing a subsystem whose size is logarithmic relative to the total size, one can ensure that no measurement has a significant correlation with the message. The message is thus "locked" by a key of logarithmic size. This thesis presents the first locking protocol whose success criterion is that the trace distance between the joint distribution of the message and the measurement outcome, and the product of their marginals, be sufficiently small.
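In symbols, the success criterion stated in the last sentence requires that, for every measurement, the joint distribution p_{MX} of the message M and the measurement outcome X be close in trace distance (with the usual 1/2 normalisation) to the product of its marginals,

\tfrac{1}{2} \, \| p_{MX} - p_M \otimes p_X \|_1 \le \epsilon,

so that no measurement performed after the removal of the logarithmic-size key subsystem is significantly correlated with the message.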
Abstract:
Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal.
Abstract:
Sharing information with those in need of it has always been an idealistic goal of networked environments. With the proliferation of computer networks, information is so widely distributed among systems that it is imperative to have well-organized schemes for its retrieval and discovery. This thesis investigates the problems associated with such schemes and suggests a software architecture aimed at achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron.

The investigations focus on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal, so that a proper mechanism for carrying out information discovery becomes feasible. An empirical study undertaken with the aid of an election database of geographically distributed constituencies provided the insights required. This is manifested in the Election Counting and Reporting Software (ECRS) System, an essentially distributed software system designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports.

Most distributed systems of the nature of ECRS normally possess a "fragile architecture" which makes them liable to collapse with the occurrence of minor faults. This is resolved with the help of the proposed penta-tier architecture, which places five different technologies at the different tiers of the architecture.

The results of the experiments conducted and their analysis show that such an architecture helps keep the different components of the software intact and impermeable to internal or external faults. The architecture thus evolved needed a mechanism to support information processing and discovery. This necessitated the introduction of the novel concept of infotrons. Further, when a computing machine has to perform any meaningful extraction of information, it is guided by what is termed an infotron dictionary.

Another empirical study was conducted to find out which of the two prominent markup languages, HTML and XML, is better suited for the incorporation of infotrons. A comparative study of 200 documents in HTML and XML was undertaken; the result was in favor of XML.

The concepts of the infotron and the infotron dictionary were applied to implement an Information Discovery System (IDS). IDS is essentially a system that starts with the infotron(s) supplied as clue(s) and produces the information required to satisfy the need of the information discoverer by utilizing the documents available at its disposal (as its information space). The various components of the system and their interactions follow the penta-tier architectural model and can therefore be considered fault-tolerant. IDS is generic in nature, and its characteristics and specifications were drawn up accordingly. Many subsystems interact with multiple infotron dictionaries maintained in the system.

To demonstrate the working of IDS and to discover information without modifying a typical Library Information System (LIS), an Information Discovery in Library Information System (IDLIS) application was developed. IDLIS is essentially a wrapper for the LIS, which maintains all the databases of the library. The purpose was to demonstrate that the functionality of a legacy system can be enhanced by augmenting it with IDS, leading to an information discovery service. IDLIS demonstrates IDS in action and shows that any legacy system can be effectively augmented with IDS to provide the additional functionality of an information discovery service. Possible applications of IDS and the scope for further research in the field are covered.
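The thesis-specific notions of the infotron and the infotron dictionary are not defined in this abstract; purely as an analogy for the clue-driven lookup it describes, a plain inverted index over a document collection behaves like the hypothetical sketch below (names and data are illustrative, not drawn from the thesis):

from collections import defaultdict

# Hypothetical stand-in for clue-driven discovery: map each term to the
# documents containing it, then intersect the sets for the supplied clues.
documents = {
    "d1": "election counting report for district A",
    "d2": "library catalogue of the information system",
    "d3": "district election results and statutory reports",
}

index = defaultdict(set)
for doc_id, text in documents.items():
    for term in text.split():
        index[term].add(doc_id)

def discover(*clues):
    # Return the documents matching every supplied clue term.
    hits = [index[c] for c in clues]
    return set.intersection(*hits) if hits else set()

print(discover("election", "district"))   # -> {'d1', 'd3'} (order may vary)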
Abstract:
This research investigates what information German Fairtrade coffee consumers search for during pre-purchase information seeking and to what extent that information is retrieved. Furthermore, the sequence of the information search as well as the degree of cognitive involvement is highlighted. The role of labeling, the importance of additional ethical information and its quality in terms of concreteness, as well as the importance of product price and organic origin, are addressed. A set of information relevant to Fairtrade consumers was tested by means of the Information Display Matrix (IDM) method with 389 Fairtrade consumers. Results show that, prior to purchase, information on product packages plays an important role and is retrieved rather extensively, but search strategies that reduce the information-processing effort are applied as well. Furthermore, general information is preferred over specific information. Results of two regression analyses indicate that purchase decisions are related to search-behavior variables rather than to socio-demographic variables and purchase motives. In order to match product information with consumers’ needs, marketers should offer information that is reduced to the central aspects of Fairtrade.