986 results for quantum Fisher information
Abstract:
A mechanical electroscope based on a change in the resonant frequency of a cantilever one micron in size in the presence of charge has recently been fabricated. We derive the decoherence rate of a charge superposition during measurement with such a device using a master equation theory adapted from quantum optics. We also investigate the information produced by such a measurement, using a quantum trajectory approach. Such instruments could be used in mesoscopic electronic systems, and future solid-state quantum computers, so it is useful to know how they behave when used to measure quantum superpositions of charge.
Abstract:
Quantum feedback can stabilize a two-level atom against decoherence (spontaneous emission), putting it into an arbitrary (specified) pure state. This requires perfect homodyne detection of the atomic emission, and instantaneous feedback. Inefficient detection was considered previously by two of us. Here we allow for a non-zero delay time τ in the feedback circuit. Because a two-level atom is a non-linear optical system, an analytical solution is not possible. However, quantum trajectories allow a simple numerical simulation of the resulting non-Markovian process. We find the effect of the time delay to be qualitatively similar to that of inefficient detection. The solution of the non-Markovian quantum trajectory will not remain fixed, so that the time-averaged state will be mixed, not pure. In the case where one tries to stabilize the atom in the excited state, an approximate analytical solution to the quantum trajectory is possible. The result, that the purity (P = 2Tr[ρ²] − 1) of the average state is given by P = 1 − 4γτ (where γ is the spontaneous emission rate), is found to agree very well with the numerical results. (C) 2001 Elsevier Science B.V. All rights reserved.
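The purity measure quoted in the abstract, P = 2Tr[ρ²] − 1, can be checked directly. A minimal sketch (NumPy) evaluating it for a pure and a maximally mixed qubit, and plugging hypothetical values of γ and τ into the approximate delay formula P ≈ 1 − 4γτ:

```python
import numpy as np

def purity(rho):
    """P = 2 Tr[rho^2] - 1: equals 1 for a pure qubit state, 0 when maximally mixed."""
    return 2.0 * np.trace(rho @ rho).real - 1.0

# Pure excited state |e><e|
rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])

# Maximally mixed qubit
rho_mixed = 0.5 * np.eye(2)

# Abstract's approximate result for the time-averaged state under delayed
# feedback: P ≈ 1 - 4*gamma*tau. The values below are hypothetical, chosen
# only to illustrate the small-delay regime where the formula applies.
gamma, tau = 1.0, 0.01
P_approx = 1.0 - 4.0 * gamma * tau
```

For small γτ the time-averaged state stays close to pure (P_approx here is 0.96), consistent with the abstract's claim that the delay degrades, but does not destroy, the stabilization.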
Abstract:
We show that stochastic electrodynamics and quantum mechanics give quantitatively different predictions for the quantum nondemolition (QND) correlations in travelling wave second harmonic generation. Using phase space methods and stochastic integration, we calculate correlations in both the positive-P and truncated Wigner representations, the latter being equivalent to the semi-classical theory of stochastic electrodynamics. We show that the semiclassical results are different in the regions where the system performs best in relation to the QND criteria, and that they significantly overestimate the performance in these regions. (C) 2001 Published by Elsevier Science B.V.
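The SHG phase-space equations themselves are not reproduced in the abstract. As an illustration of the stochastic-integration step it relies on, here is a minimal Euler–Maruyama integrator applied to a generic damped mode with additive noise (an assumed stand-in for demonstration, not the positive-P or truncated Wigner equations of the paper):

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, dt, n_steps, n_traj, rng):
    """Integrate dx = drift(x) dt + diffusion(x) dW for an ensemble of trajectories."""
    x = np.full(n_traj, x0, dtype=float)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=n_traj)  # Wiener increments
        x = x + drift(x) * dt + diffusion(x) * dW
    return x

rng = np.random.default_rng(0)
kappa, noise = 1.0, 0.1   # hypothetical damping rate and noise strength
xs = euler_maruyama(lambda x: -kappa * x,
                    lambda x: noise * np.ones_like(x),
                    x0=1.0, dt=1e-3, n_steps=2000, n_traj=5000, rng=rng)
# After t = 2, the ensemble mean should have decayed to about exp(-kappa * 2)
```

Phase-space methods reduce operator equations to stochastic differential equations of exactly this form; moments of the simulated ensemble then estimate the quantum (positive-P) or semiclassical (truncated Wigner) correlations being compared.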
Abstract:
Recent rapid advances in communication technology have changed global structural patterns and produced new concepts and poles of dynamism in international relations. One such technology, which is increasingly causing a mixed reaction across international boundaries, is that of the Internet. For the first time in history the emergence of the Internet has produced an anarchic power that is capable of influencing individuals, societies and governments on a scale previously unimaginable.
Abstract:
By exhibiting a violation of a novel form of the Bell-CHSH inequality, Żukowski has recently established that the quantum correlations exploited in the standard perfect teleportation protocol cannot be recovered by any local hidden variables model. In the case of imperfect teleportation, we show that a violation of a generalized form of Żukowski's teleportation inequality can only occur if the channel state, considered by itself, already violates a Bell-CHSH inequality. On the other hand, the fact that the channel state violates a Bell-CHSH inequality is not sufficient to imply a violation of Żukowski's teleportation inequality (or any of its generalizations). The implication does hold, however, if the fidelity of the teleportation exceeds ≈ 0.90. © 2001 Elsevier Science B.V. All rights reserved.
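As a concrete reminder of what a Bell-CHSH violation by the channel state looks like, a minimal sketch computing the CHSH combination for a singlet channel, using the standard textbook measurement angles (not angles taken from the paper):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def meas(theta):
    """Spin measurement along angle theta in the X-Z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet state |psi> = (|01> - |10>)/sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def E(a, b):
    """Correlation <A(a) x B(b)> in the singlet state."""
    return np.real(np.trace(rho @ np.kron(meas(a), meas(b))))

# Standard CHSH angles giving the maximal quantum value
a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
# |S| reaches 2*sqrt(2) ≈ 2.83, exceeding the local-hidden-variable bound of 2
```

A channel state whose CHSH value |S| exceeds 2 meets the necessary condition discussed above; the abstract's point is that this alone is not sufficient for a violation of Żukowski's teleportation inequality unless the fidelity also exceeds ≈ 0.90.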
Abstract:
This paper describes the inception, planning and first delivery of a security course as part of a postgraduate e-commerce program. The course is reviewed in terms of existing literature on security courses, the common body of knowledge established for security professionals and the job market into which students will graduate. The course described in this paper is a core subject for the e-commerce program. This program was established in 1999 and the first batch of students graduated in 2001. The program is offered at both postgraduate and undergraduate levels. The work described here relates to the postgraduate offering. Students on this program are graduates of diverse disciplines and do not have a common e-commerce or business background.
Abstract:
Internationalisation occurs when the firm expands its selling, production, or other business activities into international markets. Many enterprises, especially small- and medium-size firms (SMEs), are internationalising today at an unprecedented rate. Managers are strategically using information to achieve degrees of internationalisation previously considered the domain of large firms. We extend existing explanations of firm internationalisation by examining the nature and fundamental, antecedent role of internalising appropriate information and translating it into relevant knowledge. Based on case studies of internationalising firms, we advance a conceptualisation of information internalisation and knowledge creation within the firm as it achieves internationalisation readiness. In the process, we offer several propositions intended to guide future research. (C) 2002 Elsevier Science Inc. All rights reserved.
Abstract:
Management are keen to maximize the life span of an information system because of the high cost, organizational disruption, and risk of failure associated with the re-development or replacement of an information system. This research investigates the effects that various factors have on an information system's life span by understanding how the factors affect an information system's stability. The research builds on a previously developed two-stage model of information system change whereby an information system is either in a stable state of evolution, in which the information system's functionality is evolving, or in a state of revolution, in which the information system is being replaced because it is not providing the functionality expected by its users. A case study surveyed a number of systems within one organization. The aim was to test whether a relationship existed between the base value of the volatility index (a measure of the stability of an information system) and certain system characteristics. Data relating to some 3000 user change requests covering 40 systems over a 10-year period were obtained. The following factors were hypothesized to have significant associations with the base value of the volatility index: language level (generation of language of construction), system size, system age, and the timing of changes applied to a system. Significant associations were found in the hypothesized directions, except that the timing of user changes was not associated with any change in the value of the volatility index. Copyright (C) 2002 John Wiley & Sons, Ltd.
Abstract:
Within the information systems field, the task of conceptual modeling involves building a representation of selected phenomena in some domain. High-quality conceptual-modeling work is important because it facilitates early detection and correction of system development errors. It also plays an increasingly important role in activities like business process reengineering and documentation of best-practice data and process models in enterprise resource planning systems. Yet little research has been undertaken on many aspects of conceptual modeling. In this paper, we propose a framework to motivate research that addresses the following fundamental question: How can we model the world to better facilitate our developing, implementing, using, and maintaining more valuable information systems? The framework comprises four elements: conceptual-modeling grammars, conceptual-modeling methods, conceptual-modeling scripts, and conceptual-modeling contexts. We provide examples of the types of research that have already been undertaken on each element and illustrate research opportunities that exist.
Abstract:
The efficacy of psychological treatments emphasising a self-management approach to chronic pain has been demonstrated by substantial empirical research. Nevertheless, high drop-out and relapse rates and low or unsuccessful engagement in self-management pain rehabilitation programs have prompted the suggestion that people vary in their readiness to adopt a self-management approach to their pain. The Pain Stages of Change Questionnaire (PSOCQ) was developed to assess a patient's readiness to adopt a self-management approach to their chronic pain. Preliminary evidence has supported the PSOCQ's psychometric properties. The current study was designed to further examine the psychometric properties of the PSOCQ, including its reliability, factorial structure and predictive validity. A total of 107 patients with an average age of 36.2 years (SD = 10.63) attending a multi-disciplinary pain management program completed the PSOCQ, the Pain Self-Efficacy Questionnaire (PSEQ) and the West Haven-Yale Multidimensional Pain Inventory (WHYMPI) pre-admission and at discharge from the program. Initial data analysis found inadequate internal consistencies of the precontemplation and action scales of the PSOCQ and a high correlation (r = 0.66, P < 0.01) between the action and maintenance scales. Principal component analysis supported a two-factor structure: 'Contemplation' and 'Engagement'. Subsequent analyses revealed that the PSEQ was a better predictor of treatment outcome than the PSOCQ scales. Discussion centres upon the utility of the PSOCQ in a clinical pain setting in light of the above findings, and a need for further research. (C) 2002 International Association for the Study of Pain. Published by Elsevier Science B.V. All rights reserved.
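The internal-consistency analysis mentioned above is conventionally done with Cronbach's alpha. A minimal sketch on synthetic item data (the respondent counts, item counts, and scores are hypothetical, not the PSOCQ sample):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the scale total
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))                 # shared trait driving all items
items = latent + 0.5 * rng.normal(size=(200, 4))   # four items measuring it with noise
alpha = cronbach_alpha(items)                      # high alpha: items cohere
```

Scales whose items share little common variance (as reported for the precontemplation and action scales) yield low alpha, which is one reason the principal component analysis collapsed the four stages into two factors.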