967 results for graphic computation


Relevance: 10.00%

Abstract:

It has recently been stated that the parametrization of the time variables in the one-dimensional (1-D) mixing-frequency electron spin-echo envelope modulation (MIF-ESEEM) experiment is incorrect and hence that the wrong frequencies for correlated nuclear transitions are predicted. This paper is a direct response to that claim; its purpose is to show that the parametrization in 1- and 2-D MIF-ESEEM experiments possesses the same form as that used in other 4-pulse incrementation schemes and predicts the same correlation frequencies. We show that the parametrization represents a shearing transformation of the 2-D time domain and relate the resulting frequency-domain spectrum to the HYSCORE spectrum in terms of a skew projection. It is emphasized that the parametrization of the time-domain variables may be chosen arbitrarily and affects neither the computation of the correct nuclear frequencies nor the resulting resolution. The usefulness or otherwise of MIF parameter values γ > 1 is addressed, together with the validity of the original authors' claims with respect to resolution enhancement in cases of purely homogeneous and inhomogeneous broadening. Numerical simulations are provided to illustrate the main points.
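
As a geometric aside, a shearing reparametrization of the 2-D time domain acts on the conjugate frequencies in a complementary way. The explicit map below, with γ the MIF mixing parameter, is an assumption chosen purely for illustration, since the abstract does not reproduce the actual parametrization:

```latex
\[
  t_1' = t_1, \qquad t_2' = t_2 + \gamma\, t_1
  \quad\Longrightarrow\quad
  \omega_1' = \omega_1 - \gamma\, \omega_2, \qquad \omega_2' = \omega_2 .
\]
```

Under such a map a cross-peak at (ω_α, ω_β) in the HYSCORE spectrum appears at (ω_α - γω_β, ω_β), i.e. as a skew projection. Because the transformation is invertible, no frequency information is created or destroyed, which is the abstract's point that the choice of parametrization cannot affect the recovered nuclear frequencies.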

Relevance: 10.00%

Abstract:

We show that quantum feedback control can be used as a quantum-error-correction process for errors induced by a weak continuous measurement. In particular, when the error model is restricted to one perfectly measured error channel per physical qubit, quantum feedback can act to perfectly protect a stabilizer codespace. Using the stabilizer formalism we derive an explicit scheme, involving feedback and an additional constant Hamiltonian, to protect an (n-1)-qubit logical state encoded in n physical qubits. This works for both Poisson (jump) and white-noise (diffusion) measurement processes. Universal quantum computation is also possible in this scheme. As an example, we show that detected-spontaneous-emission error correction with a driving Hamiltonian can greatly reduce the amount of redundancy required to protect a state from that which has been previously postulated [e.g., Alber et al., Phys. Rev. Lett. 86, 4402 (2001)].
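
To see why a single measured error channel leaves room for n-1 logical qubits, note that an [[n, n-1]] stabilizer code has exactly one stabilizer generator, so its codespace is half of the full Hilbert space. Here is a minimal numerical sketch; the generator X on every qubit is an illustrative assumption, not the paper's derived stabilizer:

```python
import numpy as np

# Minimal sketch: the codespace of an [[n, n-1]] stabilizer code is the
# +1 eigenspace of a single stabilizer generator S.  The generator chosen
# here (X on every qubit) is an illustrative assumption; the paper derives
# the appropriate stabilizer for its measurement/feedback setting.
X = np.array([[0, 1], [1, 0]])

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

n = 3
S = kron_all([X] * n)          # single stabilizer generator
P = (np.eye(2 ** n) + S) / 2   # projector onto the +1 eigenspace

assert np.allclose(P @ P, P)               # P is a projector
print(int(round(np.trace(P))))             # 4 == 2**(n-1): codespace dim
```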

Relevance: 10.00%

Abstract:

We are currently in the midst of a second quantum revolution. The first quantum revolution gave us new rules that govern physical reality. The second quantum revolution will take these rules and use them to develop new technologies. In this review we discuss the principles upon which quantum technology is based and the tools required to develop it. We discuss a number of examples of research programs that could deliver quantum technologies in the coming decades, including quantum information technology, quantum electromechanical systems, coherent quantum electronics, quantum optics, and coherent matter technology.

Relevance: 10.00%

Abstract:

We describe a method by which the decoherence time of a solid-state qubit may be measured. The qubit is coded in the orbital degree of freedom of a single electron bound to a pair of donor impurities in a semiconductor host. The qubit is manipulated by adiabatically varying an external electric field. We show that by measuring the total probability of a successful qubit rotation as a function of the control field parameters, the decoherence rate may be determined. We estimate various system parameters, including the decoherence rates due to electromagnetic fluctuations and acoustic phonons. We find that, for reasonable physical parameters, the experiment is possible with existing technology. In particular, the use of adiabatic control fields implies that the experiment can be performed with control electronics with a time resolution of tens of nanoseconds.
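
A hedged numerical illustration of the underlying idea, not the paper's model: a two-level system is inverted by a slow Landau-Zener-style sweep while pure dephasing acts at rate γ, and the success probability of the rotation falls as γ grows, which is what lets the decoherence rate be read off from measured success probabilities. All Hamiltonian and sweep parameters below are invented for the demo:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def lindblad_rhs(rho, H, gamma):
    # Pure dephasing: d rho/dt = -i[H, rho] + gamma (sz rho sz - rho)
    return -1j * (H @ rho - rho @ H) + gamma * (sz @ rho @ sz - rho)

def success_probability(gamma, T=50.0, omega=1.0, delta0=10.0, steps=20000):
    """Adiabatically sweep the detuning and return the |1> population."""
    dt = T / steps
    rho = np.array([[1, 0], [0, 0]], dtype=complex)   # start in |0>
    for k in range(steps):
        delta = delta0 * (2 * k * dt / T - 1)         # sweep -delta0 -> +delta0
        H = 0.5 * (delta * sz + omega * sx)
        # One RK4 step of the master equation (H held fixed within a step)
        k1 = lindblad_rhs(rho, H, gamma)
        k2 = lindblad_rhs(rho + 0.5 * dt * k1, H, gamma)
        k3 = lindblad_rhs(rho + 0.5 * dt * k2, H, gamma)
        k4 = lindblad_rhs(rho + dt * k3, H, gamma)
        rho = rho + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return rho[1, 1].real

for gamma in (0.0, 0.01, 0.05):
    print(f"gamma={gamma:5.2f}  P(success)={success_probability(gamma):.3f}")
```

Plotting P(success) against the sweep duration or the control parameters and fitting the decay is the kind of procedure the abstract describes for extracting the decoherence rate.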

Relevance: 10.00%

Abstract:

This article deals with the efficiency of fractional integration parameter estimators. The study was based on Monte Carlo experiments involving simulated stochastic processes with integration orders in the open interval (-1, 1). The evaluated estimation methods were classified into two groups: heuristic and semiparametric/maximum likelihood (ML). The study revealed that the comparative efficiency of the estimators, measured by the lowest mean squared error, depends on the stationarity/non-stationarity and persistence/anti-persistence conditions of the series. The ML estimator was shown to be superior for stationary persistent processes; the wavelet spectrum-based estimators were better for non-stationary, mean-reverting and invertible anti-persistent processes; the weighted periodogram-based estimator was shown to be superior for non-invertible anti-persistent processes.
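
As a concrete example of the semiparametric family studied here, the sketch below simulates an ARFIMA(0, d, 0) process and recovers d with the Geweke/Porter-Hudak (GPH) log-periodogram regression; the bandwidth rule and MA truncation are assumptions for the demo rather than the paper's exact experimental design:

```python
import numpy as np

rng = np.random.default_rng(0)

def arfima0d0(n, d, burn=2000):
    """Simulate ARFIMA(0,d,0) via the truncated MA(inf) representation."""
    m = n + burn
    eps = rng.standard_normal(m)
    psi = np.empty(m)
    psi[0] = 1.0
    for j in range(1, m):                 # psi_j = psi_{j-1} (j-1+d)/j
        psi[j] = psi[j - 1] * (j - 1 + d) / j
    return np.convolve(eps, psi)[:m][burn:]

def gph(x, power=0.5):
    """GPH estimator: regress the log-periodogram on frequency."""
    n = len(x)
    m = int(n ** power)                   # number of low frequencies used
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2 * np.pi * n)
    reg = -np.log(4 * np.sin(lam / 2) ** 2)
    return np.polyfit(reg, np.log(I), 1)[0]   # slope estimates d

x = arfima0d0(4096, d=0.3)
print(f"estimated d = {gph(x):.3f}")          # should be near 0.3
```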

Relevance: 10.00%

Abstract:

We demonstrate complete characterization of a two-qubit entangling process, a linear optics controlled-NOT gate operating with coincident detection, by quantum process tomography. We use maximum-likelihood estimation to convert the experimental data into a physical process matrix. The process matrix allows an accurate prediction of the operation of the gate for arbitrary input states and a calculation of gate performance measures such as the average gate fidelity, average purity, and entangling capability of our gate, which are 0.90, 0.83, and 0.73, respectively.
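
For reference, once a process matrix (or an equivalent Kraus description) is in hand, the average gate fidelity follows from the process fidelity via F_avg = (d·F_pro + 1)/(d + 1), with d = 4 for two qubits. The sketch below applies this to an ideal CNOT degraded by depolarizing noise; the noise model and its strength p are illustrative assumptions, not the reconstructed gate:

```python
import numpy as np
from itertools import product

I = np.eye(2); X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]]); Z = np.array([[1, 0], [0, -1]])
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)
d = 4

# Kraus operators for CNOT followed by two-qubit depolarizing noise of
# strength p (an assumed noise model for the demo).
paulis = [np.kron(a, b) for a, b in product([I, X, Y, Z], repeat=2)]
p = 0.1
kraus = [np.sqrt(1 - p) * CNOT] + [np.sqrt(p / 16) * P @ CNOT for P in paulis]

# Process (entanglement) fidelity to the ideal CNOT, then average fidelity.
fe = sum(abs(np.trace(CNOT.conj().T @ K)) ** 2 for K in kraus) / d ** 2
favg = (d * fe + 1) / (d + 1)
print(f"process fidelity = {fe:.4f}, average gate fidelity = {favg:.4f}")
```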

Relevance: 10.00%

Abstract:

We study the existence of asymptotically almost periodic classical solutions for a class of abstract neutral integro-differential equations with unbounded delay. A concrete application to partial neutral integro-differential equations arising in the study of heat conduction in materials with fading memory is considered.
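
For orientation, equations of this class are often written in a form along the following lines; the concrete shape below is assumed for illustration, with u_t(θ) = u(t + θ) for θ ≤ 0 the history function that makes the delay unbounded:

```latex
\[
  \frac{d}{dt}\bigl[\, u(t) + g(t, u_t) \,\bigr]
  = A\,u(t) + \int_{-\infty}^{t} B(t-s)\, u(s)\, ds + f(t, u_t),
  \qquad t \ge 0,
\]
```

where, typically, A generates a semigroup on the state space, the convolution kernel B encodes the fading memory of the material, and the "neutral" term g appears under the time derivative.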

Relevance: 10.00%

Abstract:

We present a scheme which offers a significant reduction in the resources required to implement linear optics quantum computing. The scheme is a variation of the proposal of Knill, Laflamme and Milburn, and makes use of an incremental approach to the error encoding to boost the probability of success.

Relevance: 10.00%

Abstract:

We discuss the expectation propagation (EP) algorithm for approximate Bayesian inference using a factorizing posterior approximation. For neural network models, we use a central limit theorem argument to make EP tractable when the number of parameters is large. For two types of models, we show that EP can achieve optimal generalization performance when data are drawn from a simple distribution.
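
For readers unfamiliar with EP's moment-matching loop, here is a minimal sketch on Minka's classic "clutter" toy model rather than the paper's neural networks; the model, prior, and update schedule are assumptions for the demo:

```python
import numpy as np

# Toy model: y_i ~ (1-w) N(theta, 1) + w N(0, a), broad Gaussian prior on theta.
rng = np.random.default_rng(1)
w, a, theta_true, n = 0.25, 10.0, 2.0, 200
clutter = rng.random(n) < w
y = np.where(clutter, rng.normal(0, np.sqrt(a), n),
             rng.normal(theta_true, 1.0, n))

def npdf(x, m, v):
    return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

# Each site is a Gaussian factor in natural parameters (precision, precision*mean).
r_site = np.zeros(n); h_site = np.zeros(n)
r0, h0 = 1.0 / 100.0, 0.0                    # N(0, 100) prior

for sweep in range(10):
    for i in range(n):
        R = r0 + r_site.sum(); H = h0 + h_site.sum()
        r_cav = R - r_site[i]; h_cav = H - h_site[i]   # remove site i (cavity)
        if r_cav <= 0:
            continue                          # skip an ill-posed cavity
        m_c, v_c = h_cav / r_cav, 1.0 / r_cav
        # Moment-match cavity * exact likelihood (a 2-component mixture).
        z1 = (1 - w) * npdf(y[i], m_c, v_c + 1.0)
        z2 = w * npdf(y[i], 0.0, a)
        rho = z1 / (z1 + z2)
        m1 = m_c + v_c * (y[i] - m_c) / (v_c + 1.0)
        v1 = v_c - v_c ** 2 / (v_c + 1.0)
        m = rho * m1 + (1 - rho) * m_c
        v = rho * (v1 + m1 ** 2) + (1 - rho) * (v_c + m_c ** 2) - m ** 2
        # New site = matched Gaussian divided by the cavity.
        r_site[i] = 1.0 / v - r_cav
        h_site[i] = m / v - h_cav

R = r0 + r_site.sum(); H = h0 + h_site.sum()
print(f"posterior mean {H / R:.3f} (true theta {theta_true}), sd {R ** -0.5:.3f}")
```

Each update divides out one approximate factor, moment-matches the cavity against the exact likelihood term, and stores the ratio back as the new site; a factorizing EP for network weights follows the same skeleton, with the central limit theorem making the per-site moments tractable.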

Relevance: 10.00%

Abstract:

Qu-Prolog is an extension of Prolog which performs meta-level computations over object languages, such as predicate calculi and lambda-calculi, which have object-level variables, and quantifier or binding symbols creating local scopes for those variables. As in Prolog, the instantiable (meta-level) variables of Qu-Prolog range over object-level terms, and in addition other Qu-Prolog syntax denotes the various components of the object-level syntax, including object-level variables. Further, the meta-level operation of substitution into object-level terms is directly represented by appropriate Qu-Prolog syntax. Again as in Prolog, the driving mechanism in Qu-Prolog computation is a form of unification, but this is substantially more complex than for Prolog because of Qu-Prolog's greater generality, and especially because substitution operations are evaluated during unification. In this paper, the Qu-Prolog unification algorithm is specified, formalised and proved correct. Further, the analysis of the algorithm is carried out in a framework which straightforwardly allows the 'completeness' of the algorithm to be proved: though fully explicit answers to unification problems are not always provided, no information is lost in the unification process.
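
For contrast with the quantified case, the classical first-order (Robinson-style) unification that Qu-Prolog generalizes fits in a few lines; this sketch omits exactly the features that make Qu-Prolog's algorithm hard, namely object-level variables, quantifiers, and substitution evaluation during unification:

```python
# Terms: variables are strings starting with an uppercase letter;
# compound terms are tuples (functor, arg1, ..., argn).

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Resolve a variable through the substitution."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    t = walk(t, subst)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, subst) for a in t[1:])

def unify(t1, t2, subst=None):
    """Return a most general unifier as a dict, or None on failure."""
    if subst is None:
        subst = {}
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        return None if occurs(t1, t2, subst) else {**subst, t1: t2}
    if is_var(t2):
        return unify(t2, t1, subst)
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        for a, b in zip(t1[1:], t2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

# f(X, g(Y)) unified with f(a, g(X)) gives {X: 'a', Y: 'a'}
print(unify(('f', 'X', ('g', 'Y')), ('f', 'a', ('g', 'X'))))
```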

Relevance: 10.00%

Abstract:

Background & aims: Severe obesity imposes physical limitations on body composition assessment. Our aim was to compare body fat (BF) estimations of severely obese patients obtained by bioelectrical impedance analysis (BIA) and air displacement plethysmography (ADP) for the development of new equations for BF prediction. Methods: Severely obese subjects (83 female/36 male, mean age = 41.6 +/- 11.6 years) had BF estimated by BIA and ADP. The agreement of the data was evaluated using Bland-Altman plots and the concordance correlation coefficient (CCC). A multivariate regression analysis was performed to develop and validate new predictive equations. Results: BF estimations from BIA (64.8 +/- 15 kg) and ADP (65.6 +/- 16.4 kg) did not differ (p > 0.05, with good accuracy, precision, and CCC), but the Bland-Altman plot showed wide limits of agreement (-10.4; 8.8). The standard BIA equation overestimated BF in women (-1.3 kg) and underestimated BF in men (5.6 kg; p < 0.05). Two new BF predictive equations were generated from the BIA measurements, which predicted BF with better accuracy, precision, CCC, and limits of agreement than the standard BIA equation. Conclusions: Standard BIA equations were inadequate for estimating BF in severely obese patients. Equations developed specifically for this population provide more accurate BF assessment.
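
The two agreement statistics used here are easy to compute directly; the sketch below does so on synthetic stand-ins for the BIA and ADP estimates (the simulated bias and spread are loosely inspired by the reported values, not the study data):

```python
import numpy as np

rng = np.random.default_rng(42)
adp = rng.normal(65.6, 16.4, 119)            # n = 119 subjects, as in the paper
bia = adp + rng.normal(-0.8, 4.9, 119)       # synthetic disagreement

# Bland-Altman: mean difference (bias) and 95% limits of agreement.
diff = bia - adp
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

def ccc(x, y):
    """Lin's concordance correlation coefficient."""
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (x.var() + y.var() + (mx - my) ** 2)

print(f"bias {bias:.2f} kg, limits of agreement ({loa[0]:.1f}; {loa[1]:.1f})")
print(f"CCC {ccc(bia, adp):.3f}")
```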

Relevance: 10.00%

Abstract:

In this work we show that the dengue epidemic in the city of Singapore organized itself into a scale-free network of transmission as the 2000-2005 outbreaks progressed. This scale-free network of clusters comprised geographical breeding places for the Aedes mosquitoes, which acted as super-spreader nodes in a network of transmission. The geographical organization of the network was analysed through the corresponding distribution of the weekly number of new cases. Our hypothesis is therefore that the distribution of dengue cases reflects the geographical organization of a transmission network, which evolved towards a power law as the epidemic intensity progressed until 2005.
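
The power-law claim amounts to fitting an exponent to a heavy-tailed case-count distribution. A standard maximum-likelihood sketch on synthetic Pareto data follows; the estimator and the data are illustrative, while the paper works with Singapore's actual weekly counts:

```python
import numpy as np

rng = np.random.default_rng(7)
xmin, alpha_true = 1.0, 2.5

# Draw Pareto-distributed "weekly case counts" by inverse-CDF sampling.
u = rng.random(500)
cases = xmin * (1 - u) ** (-1 / (alpha_true - 1))

# Continuous maximum-likelihood (Hill) estimate of the exponent.
alpha_hat = 1 + len(cases) / np.sum(np.log(cases / xmin))
print(f"estimated exponent alpha = {alpha_hat:.2f} (true {alpha_true})")
```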

Relevance: 10.00%

Abstract:

Cancellation tasks have been widely used to evaluate visuospatial function and attention. Cognitive evaluation of low-literacy subjects remains a challenge in developing countries, where it is necessary to distinguish between what is pathological and what is biased by low education. The performance of riverbank dwellers of the Amazon region was studied in a structured nonverbal cancellation task, assessing their searching strategies (randomized/organized), time of completion, number of correctly cancelled targets, and number of false-positive targets. A difference was observed in performance and searching strategies between illiterates and literates with only a few years of schooling (mean = 0.8, S.D. = 1.6 years of education) across all measures. There was a significant difference between literate groups in searching strategy, as well as between illiterates who had never attended school and those who had, showing that even minimal contact with graphic presentations and the organization of writing was able to modify this cognitive function.

Relevance: 10.00%

Abstract:

There is a positive correlation between the intensity of use of a given antibiotic and the prevalence of resistant strains. The more you treat, the more patients infected with resistant strains appear and, as a consequence, the higher the mortality due to the infection and the longer the hospitalization time. In contrast, the less you treat, the higher the mortality rates and the longer the hospitalization times of patients infected with sensitive strains that could have been successfully treated. The hypothesis proposed in this paper is an attempt to resolve this conflict: there must be an optimum treatment intensity that minimizes both the additional mortality and the hospitalization time due to infection by both sensitive and resistant bacterial strains. To test this hypothesis we applied a simple mathematical model that allowed us to estimate the optimum proportion of patients to be treated in order to minimize the total number of deaths and the hospitalization time due to the infection in a hospital setting.
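
A toy version of the trade-off makes the existence of an interior optimum plain: if the burden from untreated sensitive infections falls with the treated fraction p while the resistance-driven burden rises with p, the total is minimized strictly between 0 and 1. The cost functions and coefficients below are invented for the demo and are not the paper's model:

```python
import numpy as np

p = np.linspace(0, 1, 1001)               # fraction of patients treated
sensitive_burden = 1.0 * (1 - p)           # untreated sensitive cases fare worse
resistant_burden = 1.5 * p ** 2            # resistance grows with antibiotic use
total = sensitive_burden + resistant_burden

p_opt = p[np.argmin(total)]
print(f"optimal treated fraction ~ {p_opt:.2f}")
```

With these particular toy costs the minimum lands at p = 1/3; the paper's point is that some such interior optimum exists once both burdens are modelled together.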