929 results for "Dahlberg error"
Abstract:
Performing experiments on small-scale quantum computers is certainly a challenging endeavor. Many parameters need to be optimized to achieve high-fidelity operations. This can be done efficiently for operations acting on single qubits, as errors can be fully characterized. For multiqubit operations, though, this is no longer the case, as in the most general case, analyzing the effect of the operation on the system requires a full state tomography for which resources scale exponentially with the system size. Furthermore, in recent experiments, additional electronic levels beyond the two-level system encoding the qubit have been used to enhance the capabilities of quantum-information processors, which additionally increases the number of parameters that need to be controlled. For the optimization of the experimental system for a given task (e.g., a quantum algorithm), one has to find a satisfactory error model and also efficient observables to estimate the parameters of the model. In this manuscript, we demonstrate a method to optimize the encoding procedure for a small quantum error correction code in the presence of unknown but constant phase shifts. The method, which we implement here on a small-scale linear ion-trap quantum computer, is readily applicable to other AMO platforms for quantum-information processing.
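The calibration task described above, finding an unknown but constant phase shift by optimizing an efficiently measurable observable, can be illustrated with a toy sketch. The cosine signal model, the hidden shift value 0.35, and the brute-force scan are illustrative assumptions, not the experiment's actual observable or optimization method:

```python
import math

def parity_signal(control_phase, unknown_shift=0.35):
    """Toy observable: peaks when the applied control phase compensates
    the unknown (hypothetical) constant phase shift."""
    return math.cos((control_phase - unknown_shift) / 2) ** 2

def calibrate(steps=6283):
    """Scan the control phase over [0, 2*pi) and return the value that
    maximizes the observable -- a brute-force stand-in for the
    experimental parameter optimization described in the abstract."""
    best_phase, best_val = 0.0, -1.0
    for k in range(steps):
        phase = 2 * math.pi * k / steps
        val = parity_signal(phase)
        if val > best_val:
            best_phase, best_val = phase, val
    return best_phase
```

With ~6283 steps the grid spacing is about 0.001 rad, so the recovered phase agrees with the hidden shift to well within experimental relevance for this toy model.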
Abstract:
Laser trackers are widely used across many industries to meet increasingly stringent accuracy requirements. In laser tracker measurement, accurate error analysis and uncertainty evaluation are complex and difficult. This paper first reviews the working principle of single-beam laser trackers and the state of the art of key technologies from both industrial and academic efforts, followed by a comprehensive analysis of uncertainty sources. A generic laser tracker modelling method is formulated and the framework of the virtual laser system (VLS) is proposed. The VLS can be used for measurement planning, measurement accuracy optimization, and uncertainty evaluation. The completed virtual laser tracking system should take all the uncertainty sources affecting coordinate measurement into consideration and establish an uncertainty model that behaves identically to the real system. © Springer-Verlag Berlin Heidelberg 2010.
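The kind of uncertainty evaluation a virtual tracker enables can be sketched by Monte Carlo propagation through the basic single-beam geometry (one distance, two angles). The Gaussian noise model and the specific standard deviations below are illustrative assumptions, not the paper's uncertainty model:

```python
import math
import random

def to_cartesian(d, azimuth, elevation):
    """Convert a spherical tracker measurement (distance, two angles)
    to Cartesian coordinates."""
    x = d * math.cos(elevation) * math.cos(azimuth)
    y = d * math.cos(elevation) * math.sin(azimuth)
    z = d * math.sin(elevation)
    return x, y, z

def monte_carlo_uncertainty(d, az, el, sd_d, sd_ang, n=20000, seed=1):
    """Propagate assumed Gaussian measurement noise through the
    geometry model; returns per-axis means and standard deviations."""
    rng = random.Random(seed)
    samples = [
        to_cartesian(rng.gauss(d, sd_d), rng.gauss(az, sd_ang),
                     rng.gauss(el, sd_ang))
        for _ in range(n)
    ]
    means = [sum(s[i] for s in samples) / n for i in range(3)]
    sds = [math.sqrt(sum((s[i] - means[i]) ** 2 for s in samples) / n)
           for i in range(3)]
    return means, sds
```

A full virtual tracker would add systematic error sources (gimbal offsets, beam misalignment, environmental refraction) to the same sampling loop.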
Abstract:
Clear and distinct perception is the element on which Descartes's metaphysical certainty rests. However, the skeptical arguments raised against the Cartesian method of doubt have shown the need to find a justification for the criterion of clear and distinct perception itself. Against attempts based on the indubitability of perception or on the guarantee provided by divine goodness, an alternative pragmatist justification is defended.
Abstract:
Reliability has emerged as a critical design constraint, especially in memories. Designers go to great lengths to guarantee fault-free operation of the underlying silicon by adopting redundancy-based techniques, which essentially try to detect and correct every single error. However, such techniques come at the cost of large area, power, and performance overheads, which lead many researchers to doubt their efficiency, especially for error-resilient systems where 100% accuracy is not always required. In this paper, we present an alternative method focusing on confining the output error induced by any reliability issue. Focusing on memory faults, the proposed method, rather than correcting every single error, exploits the statistical characteristics of the target application and replaces any erroneous data with the best available estimate of that data. To realize the proposed method, a RISC processor is augmented with custom instructions and special-purpose functional units. We apply the method on the enhanced processor by studying the statistical characteristics of the various algorithms involved in a popular multimedia application. Our experimental results show that, in contrast to state-of-the-art fault tolerance approaches, we are able to reduce runtime and area overheads by 71.3% and 83.3%, respectively.
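The core idea, replacing erroneous data with the best available statistical estimate rather than correcting every bit, can be sketched in software. The windowed-mean estimator below is a hypothetical stand-in for the application-specific statistics the paper exploits, and the error flags stand in for whatever fault-detection signal the hardware provides:

```python
def confine_errors(values, error_flags, window=2):
    """Replace flagged (erroneous) samples with a local statistical
    estimate -- here, the mean of nearby non-erroneous samples."""
    repaired = list(values)
    for i, bad in enumerate(error_flags):
        if not bad:
            continue
        lo = max(0, i - window)
        hi = min(len(values), i + window + 1)
        good = [values[j] for j in range(lo, hi) if not error_flags[j]]
        if good:  # fall back to the raw value if no clean neighbor exists
            repaired[i] = sum(good) / len(good)
    return repaired
```

For smooth multimedia data (audio samples, pixel rows), such an estimate keeps the output error bounded even when the underlying fault is never corrected.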
Abstract:
Energy efficiency improvement has been a key objective of China's long-term energy policy. In this paper, we derive single-factor technical energy efficiency (abbreviated as energy efficiency) in China from multi-factor efficiency estimated by means of a translog production function and a stochastic frontier model, on the basis of panel data on 29 Chinese provinces over the period 2003–2011. We find that average energy efficiency increased over the research period and that the provinces with the highest energy efficiency lie on the east coast and those with the lowest in the west, with an intermediate corridor in between. In the analysis of the determinants of energy efficiency by means of a spatial Durbin error model, factors in both the own province and first-order neighboring provinces are considered. Per capita income in the own province has a positive effect. Furthermore, foreign direct investment and population density in the own province and in neighboring provinces have positive effects, whereas the share of state-owned enterprises in Gross Provincial Product in the own province and in neighboring provinces has negative effects. It follows from the analysis that inflow of foreign direct investment and reform of state-owned enterprises are important policy levers.
Abstract:
Contribution to a roundtable on the 70th anniversary of the publication of W. E. B. DuBois's classic study of US slave emancipation, Black Reconstruction, 1860-1880, including original research on the context in which the book was launched and reflections on its impact on the recent historiography of the American Civil War and its aftermath.
Abstract:
To improve reactor physics calculations for 2D and 3D nuclear reactors via the diffusion equation, an adaptive automatic finite element remeshing method, based on elementary area (2D) or volume (3D) constraints, has been developed. The adaptive remeshing technique, guided by an a posteriori error estimator, makes use of two external mesh generator programs: Triangle and TetGen. These free external finite element mesh generators, combined with an adaptive remeshing technique based on the continuity of the current field, prove to be powerful tools for improving the calculated neutron flux distribution, and consequently the power solution of the reactor core, even though they have only a minor influence on the criticality coefficient in the reactor core examples calculated. Two numerical examples are presented: the 2D IAEA reactor core numerical benchmark and a 3D model of the Argonauta research reactor, built in Brazil.
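An adaptive refinement loop driven by an a posteriori error estimator can be illustrated in one dimension. The trapezoid-based local indicator and the bisection refinement rule below are simplified stand-ins for the Triangle/TetGen-based area and volume constraint procedure, not the paper's actual estimator:

```python
def estimate_error(f, a, b):
    """A posteriori error indicator for one element [a, b]: the change
    in the trapezoid value when the element is split in two."""
    mid = (a + b) / 2
    coarse = (f(a) + f(b)) * (b - a) / 2
    fine = (f(a) + f(mid)) * (mid - a) / 2 + (f(mid) + f(b)) * (b - mid) / 2
    return abs(coarse - fine)

def adapt_mesh(f, nodes, tol=1e-4, max_iter=20):
    """Bisect only the elements whose local indicator exceeds tol,
    mimicking estimator-guided remeshing, until all elements pass."""
    for _ in range(max_iter):
        new_nodes = [nodes[0]]
        refined = False
        for a, b in zip(nodes, nodes[1:]):
            if estimate_error(f, a, b) > tol:
                new_nodes.append((a + b) / 2)
                refined = True
            new_nodes.append(b)
        nodes = new_nodes
        if not refined:
            break
    return nodes
```

The loop concentrates nodes where the solution varies fastest, which is the same mechanism that sharpens the flux distribution near strong gradients in the reactor core.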
Abstract:
This article introduces the concept of error recovery performance, followed by the development and validation of an instrument to measure it. The first objective of this article is to broaden the current concept of service recovery so that it is relevant to back-of-house operations. The second objective is to examine the influence of leader behavioral integrity (BI) on error recovery performance. Moreover, the study examines the mediating effect of job satisfaction between BI and error recovery performance. Finally, the study links error recovery performance with work-unit effectiveness. Data for Study 1 were collected from 369 hotel employees in Turkey. The same relationships were tested again in Study 2 to validate the findings of Study 1 with a different sample. Data for Study 2 were collected from 33 departmental managers from the same hotels. Linear regression analysis was used to test the direct effects. The mediating effects were tested using the mediation test suggested by Preacher and Hayes. In addition, in Study 2, general managers of the hotels were asked to rate the effectiveness of each manager and their respective department. Results from Study 1 indicate that BI drives error recovery performance and that this impact is mediated by employee job satisfaction. Results of Study 2 confirm this model and further show that managers' self-rated error recovery performance was associated with their general managers' assessment of their deliverables and of their department's overall performance.
Abstract:
This talk is about using research and design to reduce medical errors. It doesn't matter whether you deliver healthcare in the old-fashioned pathogenic way or salutogenically: it all falls apart if systems and protocols let the patient down and cause harm.