867 results for Measurement-based quantum computing
Abstract:
This paper analyzes the measurement of the diversity of sets based on the dissimilarity of the objects contained in the set. We discuss axiomatic approaches to diversity measurement and examine the considerations underlying the application of specific measures. Our focus is on descriptive issues: rather than assuming a specific ethical position or restricting attention to properties that are appealing in specific applications, we address the foundations of the measurement issue as such in the context of diversity.
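To make the object of study concrete, one elementary dissimilarity-based measure simply aggregates pairwise dissimilarities over the set. The toy sketch below is illustrative only; it is not the axiomatized measure the paper develops.

```python
# Illustrative toy measure only: total pairwise dissimilarity of a set.
# This is NOT the measure axiomatized in the paper; it merely shows the
# kind of object under study (set + dissimilarity -> diversity value).
from itertools import combinations

def diversity(objects, dissimilarity):
    """Sum of dissimilarities over all unordered pairs in the set."""
    return sum(dissimilarity(a, b) for a, b in combinations(objects, 2))

# Example: points on the real line, with absolute difference as the
# dissimilarity between two objects.
print(diversity([0.0, 1.0, 3.0], lambda a, b: abs(a - b)))  # 6.0
```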
Abstract:
This note develops general model-free adjustment procedures for the calculation of unbiased volatility loss functions based on practically feasible realized volatility benchmarks. The procedures, which exploit the recent asymptotic distributional results in Barndorff-Nielsen and Shephard (2002a), are both easy to implement and highly accurate in empirically realistic situations. On properly accounting for the measurement errors in the volatility forecast evaluations reported in Andersen, Bollerslev, Diebold and Labys (2003), the adjustments result in markedly higher estimates for the true degree of return-volatility predictability.
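As a sketch of the objects involved: realized volatility is computed from squared intraday returns, and the adjustment de-biases a loss evaluated against this noisy benchmark. The correction below is a schematic stand-in for the Barndorff-Nielsen and Shephard (2002a) asymptotics, not the paper's actual procedure.

```python
# Schematic sketch: realized volatility as a feasible benchmark, plus a
# measurement-error-aware MSE. The de-biasing term is a placeholder for the
# Barndorff-Nielsen and Shephard (2002a) adjustment, not the paper's formula.
import numpy as np

def realized_variance(intraday_returns) -> float:
    """Realized variance: sum of squared high-frequency returns."""
    return float(np.sum(np.asarray(intraday_returns) ** 2))

def adjusted_mse(forecasts, rv_benchmarks, rv_error_variances) -> float:
    """MSE against noisy RV benchmarks, de-biased by the estimated variance
    of the benchmark's measurement error."""
    forecasts = np.asarray(forecasts)
    rv = np.asarray(rv_benchmarks)
    naive_mse = float(np.mean((forecasts - rv) ** 2))
    # If the RV measurement error is independent of the forecast error,
    # subtracting its average variance targets the latent true volatility
    # rather than the noisy proxy.
    return naive_mse - float(np.mean(rv_error_variances))
```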
Abstract:
Quantum information theory studies the fundamental limits that the laws of physics impose on data-processing tasks such as data compression and transmission over a noisy channel. This thesis presents general techniques for solving several fundamental problems of quantum information theory within a single framework. The central theorem of this thesis establishes the existence of a protocol for transmitting quantum data that the receiver already partially knows, using a single use of a noisy quantum channel; several central theorems of quantum information theory follow from it as immediate corollaries. Subsequent chapters use this theorem to prove the existence of new protocols for two further types of quantum channels: quantum broadcast channels and quantum channels with side information at the transmitter. These protocols also address the transmission of quantum data partially known to the receiver using a single channel use, and they yield as corollaries asymptotic versions with and without entanglement assistance. In both cases, the entanglement-assisted asymptotic versions can be regarded as quantum analogues of the best known coding theorems for the classical versions of these problems. The last chapter treats a purely quantum phenomenon called locking: a classical message can be encoded in a quantum state such that removing a subsystem whose size is only logarithmic in the total size guarantees that no measurement can have significant correlation with the message. The message is thus "locked" by a key of logarithmic size. This thesis presents the first locking protocol whose success criterion is that the trace distance between the joint distribution of the message and the measurement outcome and the product of their marginals be sufficiently small.
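In symbols, the locking criterion described above can be written as follows (a reconstruction from the abstract's wording; the thesis's exact normalization may differ):

```latex
\left\lVert \, p_{MY} - p_M \otimes p_Y \, \right\rVert_1 \le \varepsilon
```

where p_{MY} is the joint distribution of the message M and the measurement outcome Y, and p_M ⊗ p_Y is the product of its marginals.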
Abstract:
PériCulture is the name of a research project at the Université de Montréal that formed part of a larger project based at the Université de Sherbrooke. The latter aimed to build a research network for the management of Canadian digital cultural content. The general objective of the PériCulture research was to study methods for indexing non-textual cultural content on the Web, more specifically images. The research results presented here build on previous work in image indexing and in automatic (text) indexing, through a study of the properties of text associated with images in a networked environment. The goal was to understand how text associated with images on Web pages (called peritext) can be exploited to index the corresponding images. We studied this question in the context of selected Web pages: pages of Canadian cultural content containing multimedia objects with associated text (more than simply file names and captions) that were bilingual (English and French). We identified useful indexing keywords located near the described object. Potential indexing terms were identified in various HTML tags and in the full text (each treated as a different source of peritext). Our study revealed that a large number of useful indexing terms are available in the peritext of many Web sites with cultural content, and that peritext from different sources varies in its usefulness for information retrieval. Our results suggest that these terms can be exploited in different ways in information retrieval systems to improve search results.
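A minimal sketch of the kind of extraction described above, collecting peritext candidates from several HTML sources around each image. The tag choices and the "enclosing paragraph" proxy for nearby text are illustrative assumptions, not the project's actual extraction rules.

```python
# Hedged sketch: harvesting peritext candidates for image indexing.
# Tag choices and the notion of "nearby text" are assumptions, not the
# PériCulture project's actual pipeline.
from bs4 import BeautifulSoup

def peritext_candidates(html: str) -> dict:
    """Collect text associated with each <img> from several peritext sources."""
    soup = BeautifulSoup(html, "html.parser")
    results = {}
    for img in soup.find_all("img"):
        parent_p = img.find_parent("p")
        sources = {
            "alt": img.get("alt", ""),
            "title": img.get("title", ""),
            # Text of the enclosing paragraph, a crude "nearby text" proxy.
            "enclosing_paragraph": parent_p.get_text(" ", strip=True) if parent_p else "",
            "page_title": soup.title.get_text(strip=True) if soup.title else "",
        }
        # Keep only non-empty peritext sources for this image.
        results[img.get("src", "")] = {k: v for k, v in sources.items() if v}
    return results
```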
Abstract:
In operation since 2008, the ATLAS experiment is the largest of all the experiments at the LHC. The ATLAS-MPX (MPX) detectors installed in ATLAS are based on the Medipix2 silicon pixel detector, developed by the Medipix collaboration at CERN for real-time imaging. The MPX detectors can be used to measure luminosity. They were installed at sixteen different locations in the ATLAS experimental and technical areas in 2008. The MPX network successfully collected data independently of the ATLAS data-recording chain from 2008 to 2013, and each MPX detector provides measurements of the integrated LHC luminosity. This thesis describes the method for calibrating the absolute luminosity measured with the MPX detectors and the performance of the MPX detectors on the 2012 luminosity data. A luminosity calibration constant was determined. The calibration is based on the van der Meer (vdM) technique, which allows the sizes of the two overlapping beams to be measured in the vertical and horizontal planes at the ATLAS interaction point (IP1). The determination of the absolute luminosity requires precise knowledge of the beam intensities and of the number of bunches. The three calibration scans were analyzed, and the results obtained with the MPX detectors were compared to those of the other ATLAS detectors dedicated specifically to luminosity measurement. The luminosity obtained from the vdM scans was compared to the luminosity of proton-proton collisions before and after the vdM scans. The MPX detector network provides reliable information for the luminosity determination of the ATLAS experiment over a wide range (from 5 × 10^29 cm−2 s−1 to 7 × 10^33 cm−2 s−1).
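For context, the vdM scan determines the absolute luminosity from the measured horizontal and vertical convolved beam widths; in its standard textbook form (not specific to this thesis):

```latex
L = \frac{n_b \, f_r \, N_1 N_2}{2\pi \, \Sigma_x \Sigma_y}
```

where n_b is the number of colliding bunch pairs, f_r the revolution frequency, N_1 and N_2 the bunch intensities, and Σ_x, Σ_y the convolved beam sizes measured in the two scan planes.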
Abstract:
Efficiently simulating global illumination is one of the most important open problems in computer graphics. Accurately computing the effects of indirect illumination, caused by secondary bounces of light off the surfaces of a 3D scene, is generally an expensive process, often addressed with algorithms such as path tracing or photon mapping. These techniques numerically solve the rendering equation using Monte Carlo ray tracing. Ward et al. proposed a technique called irradiance caching to accelerate such methods when computing the indirect component of global illumination on diffuse surfaces. Krivanek extended the approach of Ward and Heckbert to handle the more complex case of glossy (specular) surfaces, introducing an approach called radiance caching. Jarosz et al. and Schwarzhaupt et al. proposed a model using the Hessian and visibility information to refine the placement of cache points in the scene, significantly improving the quality and performance of the earlier approaches. In this thesis, we extend the approaches introduced in this prior work to the radiance caching problem in order to improve the placement of cache elements. We also uncovered an important problem that previous work had overlooked because of the choice of test scenes. We conducted a preliminary study of this problem and found two potential solutions that merit further research.
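For reference, classic irradiance caching decides where cached records may be reused through a weight that penalizes distance and normal divergence; Ward's original weight has the form below (quoted from the irradiance caching literature, not from this thesis):

```latex
w_i(\mathbf{x}) = \left( \frac{\lVert \mathbf{x} - \mathbf{x}_i \rVert}{R_i} + \sqrt{1 - \mathbf{n}(\mathbf{x}) \cdot \mathbf{n}_i} \right)^{-1}
```

where R_i is the harmonic mean distance to the surfaces visible from record i. The Hessian-based approaches cited above refine exactly this reuse and placement criterion.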
Abstract:
STUDY DESIGN: Concurrent validity between postural indices obtained from digital photographs (two-dimensional [2D]), surface topography imaging (three-dimensional [3D]), and radiographs. OBJECTIVE: To assess the validity of a quantitative clinical postural assessment tool of the trunk based on photographs (2D) as compared to a surface topography system (3D) as well as indices calculated from radiographs. SUMMARY OF BACKGROUND DATA: To monitor progression of scoliosis or change in posture over time in young persons with idiopathic scoliosis (IS), noninvasive and nonionizing methods are recommended. In a clinical setting, posture can be quite easily assessed by calculating key postural indices from photographs. METHODS: Quantitative postural indices of 70 subjects aged 10 to 20 years old with IS (Cobb angle, 15°-60°) were measured from photographs and from 3D trunk surface images taken in the standing position. Shoulder, scapula, trunk list, pelvis, scoliosis, and waist angle indices were calculated with specially designed software. Frontal and sagittal Cobb angles and trunk list were also calculated on radiographs. The Pearson correlation coefficient (r) was used to estimate the concurrent validity of the 2D clinical postural tool of the trunk with indices extracted from the 3D system and with those obtained from radiographs. RESULTS: The correlation between 2D and 3D indices was good to excellent for shoulder, pelvis, trunk list, and thoracic scoliosis (0.81 < r < 0.97; P < 0.01) but fair to moderate for thoracic kyphosis, lumbar lordosis, and thoracolumbar or lumbar scoliosis (0.30 < r < 0.56; P < 0.05). The correlation between 2D and radiographic spinal indices was fair to good (−0.33 to −0.80 for Cobb angles and 0.76 for trunk list; P < 0.05). CONCLUSION: This tool will facilitate clinical practice by monitoring trunk posture among persons with IS. Further, it may contribute to a reduction in the use of radiographs to monitor scoliosis progression.
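A minimal sketch of the concurrent-validity computation on paired index measurements; the data values are made up and scipy's pearsonr stands in for whatever statistics software the study used.

```python
# Hedged sketch: concurrent validity as a Pearson correlation between paired
# postural indices. All values are illustrative, not the study's data.
import numpy as np
from scipy.stats import pearsonr

# Paired measurements of one postural index (e.g., trunk list) per subject.
photo_2d = np.array([2.1, 3.4, 1.8, 4.0, 2.9])       # degrees, from photographs
topography_3d = np.array([2.3, 3.1, 2.0, 4.2, 3.0])  # degrees, from 3D imaging

r, p_value = pearsonr(photo_2d, topography_3d)
print(f"r = {r:.2f}, P = {p_value:.3f}")
```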
Abstract:
Transparent conducting oxides (TCOs) have been known and used for technologically important applications for more than 50 years. Oxide materials such as In2O3, SnO2 and the impurity-doped SnO2:Sb, SnO2:F and In2O3:Sn (indium tin oxide) were primarily used as TCOs. Indium-based oxides were widely used as TCOs for the past few decades, but the recent increase in the cost of indium and the scarcity of the material have made low-cost TCOs difficult to obtain. Hence the search for alternative TCO materials has been a topic of active research, resulting in the development of various binary and ternary compounds. The advantage of binary oxides is the ease of controlling composition and deposition parameters. ZnO has been identified as one of the promising candidates for transparent electronic applications owing to its exciting optoelectronic properties. Some optoelectronic applications of ZnO overlap with those of GaN, another wide band gap semiconductor widely used for the production of green, blue-violet and white light emitting devices. However, ZnO has some advantages over GaN, among which are the availability of fairly high quality ZnO bulk single crystals and a large excitonic binding energy. ZnO also has a much simpler crystal-growth technology, resulting in a potentially lower cost for ZnO based devices. Most TCOs are n-type semiconductors and are utilized as transparent electrodes in a variety of commercial applications such as photovoltaics, electrochromic windows and flat panel displays. TCOs offer great potential for realizing a diverse range of active functions, and novel functions can be integrated into the materials as required. Nevertheless, the application of TCOs has been restricted to transparent electrodes. The basic reason is the lack of a p-type TCO, since many of the active functions in semiconductors originate from the nature of the pn-junction. In 1997, H. Kawazoe et al. reported CuAlO2 as the first p-type TCO, along with a chemical design concept for the exploration of other p-type TCOs. This has led to the fabrication of all-transparent diodes and transistors. Fabrication of TCO nanostructures has been a focus of an ever-increasing number of researchers worldwide, mainly due to their unique optical and electronic properties, which make them ideal for a wide spectrum of applications ranging from flexible displays and quantum well lasers to in vivo biological imaging and therapeutic agents. ZnO is a highly multifunctional material system with promising application potential for UV light emitting diodes, diode lasers, sensors, etc. ZnO nanocrystals and nanorods doped with transition metal impurities have also recently attracted great interest for their spin-electronic applications. This thesis summarizes results on the growth and characterization of ZnO based diodes and nanostructures by pulsed laser ablation. Various ZnO based heterojunction diodes have been fabricated using pulsed laser deposition (PLD), and their electrical characteristics were interpreted using existing models. Pulsed laser ablation has been employed to fabricate ZnO quantum dots, ZnO nanorods and ZnMgO/ZnO multiple quantum well structures with the aim of studying their luminescent properties.
Abstract:
The quantum yields of singlet oxygen production and lifetimes at the gas–solid interface in silica gel material are determined. Different photosensitizers (PS) are encapsulated in parallelepipedic xerogel monoliths (PS-SG). The PS were chosen according to their known photooxidation properties: 9,10-dicyanoanthracene (DCA), 9,10-anthraquinone (ANT), and a benzophenone derivative, 4-benzoyl benzoic acid (4BB). These experiments are mainly based on time-resolved 1O2 phosphorescence detection, and the obtained ΦΔ and τΔ values are compared with those of a reference sensitizer for 1O2 production, 1H-phenalen-1-one (PN), included in the same xerogel. The relation between the sensitizers' ability to oxidize organic pollutants in the gas phase and their efficiency for 1O2 production is investigated through photooxidation experiments on a test pollutant, dimethylsulfide (DMS). The ΦΔ value is high for DCA-SG relative to the PN reference, whereas it is slightly lower for 4BB-SG and for ANT-SG. ΦΔ is related to the production of sulfoxide and sulfone as the main oxidation products of DMS photosensitized oxidation. Additional mechanisms, leading to C–S bond cleavage, appear to occur mainly for the less efficient singlet oxygen sensitizers 4BB-SG and ANT-SG.
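The comparative measurement implied above is usually expressed as a relative actinometry relation; under matched absorbed photon counts it takes the standard form below (a generic formulation, not necessarily the paper's exact expression):

```latex
\Phi_\Delta^{\mathrm{PS}} = \Phi_\Delta^{\mathrm{PN}} \cdot \frac{S_\Delta^{\mathrm{PS}}}{S_\Delta^{\mathrm{PN}}}
```

where S_Δ denotes the time-resolved 1O2 phosphorescence amplitude for the sample and for the PN reference, respectively.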
Abstract:
The operation of a previously proposed terahertz (THz) detector is formulated in detail. The detector is based on the hot-electron effect in the 2D electron gas (2DEG) in the quantum well (QW) of a GaAs/AlGaAs heterostructure. The interaction between the THz radiation and the 2DEG, the current enhancement due to the hot-electron effect, and the noise performance of the detector are analyzed.
Abstract:
Photoplethysmography (PPG) is a simple and inexpensive optical technique that can be used to detect blood volume changes in the microvascular bed of tissue. There has been a resurgence of interest in the technique in recent years, driven by the demand for low-cost, simple and portable technology for primary care and community-based clinical settings, the wide availability of low-cost, small semiconductor components, and the advancement of computer-based pulse wave analysis techniques. The present research work deals with the design of a PPG sensor for recording blood volume pulse signals and with selected cardiovascular studies based on these signals. The interaction of light with tissue, the early and recent history of PPG, instrumentation, measurement protocols and pulse wave analysis are also discussed. The effects of aging, mild cold exposure, and variation in body posture on the PPG signal have been experimentally studied.
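A minimal sketch of one basic pulse wave analysis step, heart-rate estimation from a PPG trace by systolic peak detection. The signal and parameters are synthetic stand-ins, not the thesis's processing pipeline.

```python
# Hedged sketch: heart-rate estimation from a PPG trace via peak detection.
# The synthetic signal and thresholds are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0  # sampling rate in Hz (assumed)

# Synthetic stand-in for a recorded blood volume pulse signal at ~72 bpm.
t = np.arange(0, 10, 1 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.randn(t.size)

# Systolic peaks: enforce a refractory distance of ~0.4 s between beats.
peaks, _ = find_peaks(ppg, distance=int(0.4 * fs), prominence=0.5)
intervals = np.diff(peaks) / fs  # inter-beat intervals in seconds
print(f"estimated heart rate: {60.0 / intervals.mean():.0f} bpm")
```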
Abstract:
This thesis presents analytical and numerical results from studies based on the multiple quantum well laser rate equation model. We address the problem of controlling the chaos produced by direct modulation of laser diodes. We consider delay feedback control methods for this purpose and study their performance using numerical simulation. Besides the control of chaos, the control of other nonlinear effects such as quasiperiodicity and bistability using delay feedback methods is also investigated. A number of secure communication schemes based on synchronization of chaos in semiconductor lasers have been successfully demonstrated theoretically and experimentally. Current investigations in this field include the study of practical issues in the implementation of such encryption schemes. We theoretically study issues such as channel delay, phase mismatch and frequency detuning in the synchronization of chaos in directly modulated laser diodes. These results should be helpful for designing and implementing chaotic encryption schemes using synchronization of chaos in modulated semiconductor lasers.
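For orientation, delay feedback control conventionally refers to the Pyragas scheme, which adds to the modulation input a term proportional to the difference between the delayed and current output; in generic form (a textbook formulation, not necessarily the thesis's exact implementation):

```latex
F(t) = K \left[ P(t - \tau) - P(t) \right]
```

where P(t) is the laser output (photon density), τ the delay, and K the feedback gain. F(t) vanishes once an orbit of period τ is stabilized, which makes the control non-invasive.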
Abstract:
Sharing information with those in need of it has always been an idealistic goal of networked environments. With the proliferation of computer networks, information is so widely distributed among systems that it is imperative to have well-organized schemes for retrieval and also for discovery. This thesis investigates the problems associated with such schemes and suggests a software architecture aimed at achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron.

The investigations focus on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal and a proper mechanism for carrying out information discovery becomes feasible. An empirical study undertaken with the aid of an election database of geographically distributed constituencies provided the required insights. This is manifested in the Election Counting and Reporting Software (ECRS) system, an essentially distributed software system designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports.

Most distributed systems of the nature of ECRS normally possess a "fragile architecture" that makes them amenable to collapse upon the occurrence of minor faults. This is resolved with the help of the proposed penta-tier architecture, which places five different technologies at the different tiers of the architecture. The results of the experiment conducted and their analysis show that such an architecture helps to keep the different components of the software intact and impermeable to internal or external faults. The architecture thus evolved needed a mechanism to support information processing and discovery, which necessitated the introduction of the novel concept of infotrons. Further, when a computing machine has to perform any meaningful extraction of information, it is guided by what is termed an infotron dictionary.

A second empirical study examined which of the two prominent markup languages, HTML and XML, is better suited to the incorporation of infotrons. A comparative study of 200 documents in HTML and XML was undertaken; the result favored XML.

The concepts of the infotron and the infotron dictionary were applied to implement an Information Discovery System (IDS). IDS is essentially a system that starts with the infotron(s) supplied as clue(s) and brews the information required to satisfy the need of the information discoverer by utilizing the documents available at its disposal (as the information space). The various components of the system and their interactions follow the penta-tier architectural model and can therefore be considered fault-tolerant. IDS is generic in nature, and its characteristics and specifications were drawn up accordingly. Many subsystems interact with the multiple infotron dictionaries maintained in the system.

In order to demonstrate the working of IDS and to discover information without modifying a typical Library Information System (LIS), an Information Discovery in Library Information System (IDLIS) application was developed. IDLIS is essentially a wrapper for the LIS, which maintains all the databases of the library. The purpose was to demonstrate that the functionality of a legacy system can be enhanced by augmenting it with IDS, leading to an information discovery service. IDLIS demonstrates IDS in action and proves that any legacy system can be effectively augmented with IDS to provide the additional functionality of an information discovery service. Possible applications of IDS and the scope for further research in the field are covered.
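Purely as an illustration of the discovery flow described above (infotrons as clues, an infotron dictionary guiding matching over a document space), here is a minimal hypothetical sketch; every name and structure in it is an assumption, since the thesis's actual data structures are not given in the abstract.

```python
# Hypothetical sketch only: the real infotron and infotron-dictionary
# structures are defined in the thesis, not here. This mimics the described
# flow: clue infotrons -> dictionary-guided matching -> discovered documents.
from dataclasses import dataclass

@dataclass
class Infotron:
    key: str  # the clue term supplied by the information discoverer

# An "infotron dictionary" as a mapping from clue keys to related terms
# that guide extraction (an assumed representation).
INFOTRON_DICTIONARY = {
    "election": ["constituency", "counting", "returning officer"],
    "library": ["catalogue", "circulation", "accession"],
}

def discover(clues: list[Infotron], documents: list[str]) -> list[str]:
    """Return documents relevant to any clue or its dictionary expansion."""
    hits = []
    for doc in documents:
        text = doc.lower()
        for clue in clues:
            terms = [clue.key] + INFOTRON_DICTIONARY.get(clue.key, [])
            if any(term in text for term in terms):
                hits.append(doc)
                break
    return hits
```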