151 results for incremental computation


Relevance: 10.00%

Abstract:

We present the conditional quantum dynamics of an electron tunneling between two quantum dots subject to a measurement using a low-transparency point contact or tunnel junction. The double-dot system forms a single qubit, and the measurement corresponds to a continuous-in-time readout of the occupancy of the quantum dot. We illustrate the difference between conditional and unconditional dynamics of the qubit. The conditional dynamics is discussed in two regimes depending on the rate of tunneling through the point contact: quantum jumps, in which individual electron tunneling events can be distinguished, and diffusive dynamics, in which individual events are ignored and the time-averaged current is treated as a continuous diffusive variable. We include the effect of inefficient measurement and the influence of the relative phase between the two tunneling amplitudes of the double dot/point contact system.
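
As a rough illustration of the quantum-jump regime described above, the following Python sketch runs a single Monte Carlo (quantum-trajectory) simulation of a two-level system monitored by a detector whose click rate depends on which dot is occupied. The Hamiltonian, the jump operator, and all parameter values are illustrative assumptions, not the model or parameters of this paper.

```python
# Minimal quantum-jump (Monte Carlo wavefunction) sketch for a monitored qubit.
# All parameters (Omega, Gamma, dt, ...) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

Omega = 1.0          # inter-dot tunneling amplitude (assumed)
Gamma = 0.5          # detector jump-rate scale (assumed)
dt, steps = 0.01, 5000

sx = np.array([[0, 1], [1, 0]], dtype=complex)
n1 = np.array([[0, 0], [0, 1]], dtype=complex)   # occupation of dot 1

H = Omega * sx                       # coherent tunneling between the dots
L = np.sqrt(Gamma) * n1              # jump operator: detector clicks sense dot-1 occupancy
H_eff = H - 0.5j * (L.conj().T @ L)  # non-Hermitian effective Hamiltonian for no-jump evolution

psi = np.array([1.0, 0.0], dtype=complex)        # start in dot 0
occupancy, jumps = [], 0

for _ in range(steps):
    p_jump = dt * np.real(psi.conj() @ (L.conj().T @ L) @ psi)
    if rng.random() < p_jump:                    # a detection event: apply jump, renormalize
        psi = L @ psi
        jumps += 1
    else:                                        # no-jump evolution under H_eff
        psi = psi - 1j * dt * (H_eff @ psi)
    psi /= np.linalg.norm(psi)
    occupancy.append(np.real(psi.conj() @ n1 @ psi))

print(f"{jumps} jumps; mean conditional occupancy = {np.mean(occupancy):.3f}")
```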

Relevance: 10.00%

Abstract:

A data warehouse is a data repository which collects and maintains a large amount of data from multiple distributed, autonomous and possibly heterogeneous data sources. Often the data is stored in the form of materialized views in order to provide fast access to the integrated data. One of the most important decisions in designing a data warehouse is the selection of views for materialization. The objective is to select an appropriate set of views that minimizes the total query response time under the constraint that the total maintenance time for these materialized views is within a given bound. This view selection problem is fundamentally different from the view selection problem under a disk space constraint. In this paper the view selection problem under the maintenance time constraint is investigated. Two efficient heuristic algorithms for the problem are proposed. The key to devising the proposed algorithms is to define good heuristic functions and to reduce the problem to some well-solved optimization problems. As a result, an approximate solution of the known optimization problem gives a feasible solution of the original problem. (C) 2001 Elsevier Science B.V. All rights reserved.
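
One generic way to attack this kind of constrained selection problem is a greedy heuristic that ranks candidate views by query-time saving per unit of maintenance time. The sketch below illustrates that idea only; it is not the algorithms proposed in the paper, and the view names and numbers are invented.

```python
# Greedy sketch for materialized-view selection under a maintenance-time bound.
# Generic heuristic illustration; the paper's own algorithms differ.

def select_views(views, maintenance_bound):
    """views: list of (name, query_time_saving, maintenance_time) tuples (hypothetical)."""
    selected, used = [], 0.0
    # Rank candidates by saving per unit of maintenance time (greedy heuristic).
    for name, saving, maint in sorted(views, key=lambda v: v[1] / v[2], reverse=True):
        if used + maint <= maintenance_bound:
            selected.append(name)
            used += maint
    return selected, used

views = [("v_sales_by_region", 120.0, 30.0),
         ("v_daily_totals",     80.0, 10.0),
         ("v_customer_join",    60.0, 45.0)]
print(select_views(views, maintenance_bound=50.0))
# -> (['v_daily_totals', 'v_sales_by_region'], 40.0)
```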

Relevance: 10.00%

Abstract:

Background: Supplementation with propionyl-L-carnitine (PLC) may be of use in improving the exercise capacity of people with peripheral arterial disease (PAD). Methods: After a 2-wk exercise familiarization phase, seven subjects displaying intermittent claudication were studied over a 12-wk period consisting of three 4-wk phases: baseline (B), supplementation (S), and placebo (P). PLC was supplemented at 2 g per day, and subjects were blinded to the order of supplementation. Unilateral calf strength and endurance were assessed weekly. Walking performance was assessed at the end of each phase using an incremental protocol, during which respiratory gases were collected. Results: Although there was not a significant increase in maximal walking time (~14%) in the whole group, walking time improved to a greater extent than the individual baseline coefficient of variation in four of the seven subjects. The changes in walking performance were correlated with changes in the respiratory exchange ratio both at steady state (r = 0.59) and maximal exercise (r = 0.79). Muscle strength increased significantly from 695 ± 198 N to 812 ± 249 N by the end of S. Changes in calf strength from B to S were modestly related to changes in walking performance (r = 0.56). No improvements in calf endurance were detected throughout the study. Conclusions: These preliminary data suggest that, in addition to walking performance, muscle strength can be increased in PAD patients after 4 wk of supplementation with propionyl-L-carnitine.

Relevance: 10.00%

Abstract:

The numerical implementation of the complex image approach for the Green's function of a mixed-potential integral-equation formulation is examined and is found to be limited to low values of k₀ρ (in this context k₀ρ = 2πρ/λ₀, where ρ is the distance between the source and the field points of the Green's function and λ₀ is the free-space wavelength). This is a clear limitation for problems of large dimension or high frequency, where this limit is easily exceeded. This paper examines the various strategies and proposes a hybrid method whereby most of the above problems can be avoided. An efficient integral method that is valid for large k₀ρ is combined with the complex image method in order to take advantage of the relative merits of both schemes. It is found that a wide overlapping region exists between the two techniques, allowing a very efficient and consistent approach for accurately calculating the Green's functions. In this paper, the method developed for the computation of the Green's function is used for planar structures containing both lossless and lossy media.
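
The hybrid strategy can be pictured as a simple dispatcher that uses the complex-image approximation where it is valid (small k₀ρ) and switches to the large-argument integral evaluation beyond an overlap region. The evaluator functions and the crossover values in the sketch below are hypothetical placeholders, not the paper's formulation.

```python
# Sketch of a hybrid dispatcher for evaluating a Green's function: use the
# complex-image approximation at small k0*rho and an integral/asymptotic scheme at
# large k0*rho. Both evaluators and the overlap bounds are hypothetical placeholders.

K0RHO_LOW, K0RHO_HIGH = 2.0, 5.0   # assumed overlap region where both schemes are valid

def complex_image_green(k0_rho):
    raise NotImplementedError("complex-image approximation (small k0*rho)")

def large_argument_green(k0_rho):
    raise NotImplementedError("integral/asymptotic evaluation (large k0*rho)")

def hybrid_green(k0_rho):
    """Dispatch to whichever scheme is valid; inside the overlap either may be used."""
    if k0_rho <= K0RHO_LOW:
        return complex_image_green(k0_rho)
    if k0_rho >= K0RHO_HIGH:
        return large_argument_green(k0_rho)
    # Overlap region: either scheme is acceptable, which also allows cross-checking;
    # here we simply prefer the cheaper complex-image evaluation.
    return complex_image_green(k0_rho)
```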

Relevance: 10.00%

Abstract:

We present some applications of high-efficiency quantum interrogation (interaction-free measurement) for the creation of entangled states of separate atoms and of separate photons. The quantum interrogation of a quantum object in a superposition of object-in and object-out leaves the object and probe in an entangled state. The probe can then be further entangled with other objects in subsequent quantum interrogations. By then projecting out those cases in which the probe is left in a particular final state, the quantum objects can themselves be left in various entangled states. In this way, we show how to generate two-, three-, and higher-qubit entanglement between atoms and between photons. The effect of finite efficiency for the quantum interrogation is delineated for the various schemes.
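
A minimal numerical sketch of the idea, assuming the usual Zeno-type interrogation scheme (a probe polarization rotated by π/2N per cycle, with one path blocked whenever the object is present): for an object prepared in a superposition of in/out, the surviving probe ends up entangled with the object. The cycle count and basis labels are assumptions for illustration, not any specific scheme from the paper.

```python
# Sketch: high-efficiency (Zeno-type) interrogation of an object in a superposition
# of in/out, showing probe-object entanglement. Parameters are illustrative.
import numpy as np

N = 50                      # number of interrogation cycles (assumed)
theta = np.pi / (2 * N)     # polarization rotation per cycle

# Joint amplitudes indexed [polarization, object]: pol 0=H, 1=V; obj 0=out, 1=in.
amp = np.zeros((2, 2), dtype=complex)
amp[0, 0] = amp[0, 1] = 1 / np.sqrt(2)   # photon H, object in (|out> + |in>)/sqrt(2)

R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

for _ in range(N):
    amp = R @ amp            # rotate the probe polarization
    amp[1, 1] = 0.0          # the V component is absorbed when the object is "in"

p_survive = np.sum(np.abs(amp) ** 2)     # probability the probe is not absorbed
amp /= np.sqrt(p_survive)                # post-select on a surviving probe

print("survival probability:", round(p_survive, 4))
print("conditional joint amplitudes (rows H,V; cols out,in):")
print(np.round(np.abs(amp), 3))          # ~ |V,out> + |H,in>: probe and object entangled
```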

Relevance: 10.00%

Abstract:

A new algorithm has been developed for smoothing the surfaces in finite element formulations of contact-impact. A key feature of this method is that the smoothing is done implicitly by constructing smooth signed distance functions for the bodies. These functions are then employed for the computation of the gap and other variables needed for implementation of contact-impact. The smoothed signed distance functions are constructed by a moving least-squares approximation with a polynomial basis. Results show that when nodes are placed on a surface, the surface can be reproduced with an error of about one per cent or less with either a quadratic or a linear basis. With a quadratic basis, the method exactly reproduces a circle or a sphere even for coarse meshes. Results are presented for problems involving the contact of circular bodies. Copyright (C) 2002 John Wiley & Sons, Ltd.
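
For readers unfamiliar with moving least squares, the sketch below shows a generic 2D MLS fit with a quadratic basis applied to scattered samples of a signed-distance function (here, distance to a unit circle). The weight function, support radius, and sampling are illustrative choices, not the paper's specific construction.

```python
# Generic moving-least-squares (MLS) sketch: smooth approximation of a signed-distance
# function from scattered nodal samples. Basis, weight, and support are assumptions.
import numpy as np

rng = np.random.default_rng(1)

def basis(p):                      # quadratic basis in 2D
    x, y = p
    return np.array([1.0, x, y, x * x, x * y, y * y])

def weight(r, support):            # Gaussian weight with finite support (assumed form)
    return np.exp(-(r / (0.4 * support)) ** 2) * (r < support)

def mls_eval(query, nodes, values, support=0.7):
    """Evaluate the MLS fit of (nodes, values) at a query point."""
    d = np.linalg.norm(nodes - query, axis=1)
    w = weight(d, support)
    P = np.array([basis(p) for p in nodes])
    A = P.T @ (w[:, None] * P)                 # weighted moment matrix
    b = P.T @ (w * values)
    a = np.linalg.solve(A, b)                  # local polynomial coefficients
    return basis(query) @ a

# Nodal samples of the signed distance to the unit circle, scattered around it.
nodes = rng.uniform(-1.5, 1.5, size=(600, 2))
values = np.linalg.norm(nodes, axis=1) - 1.0

query = np.array([0.9, 0.5])
exact = np.linalg.norm(query) - 1.0
print("MLS:", round(mls_eval(query, nodes, values), 4), "exact:", round(exact, 4))
```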

Relevance: 10.00%

Abstract:

OBJECTIVE - This study sought to determine whether stress echocardiography using exercise (when feasible) or dobutamine echo could be used to predict mortality in patients with diabetes. RESEARCH DESIGN AND METHODS - Stress echo was performed in 937 patients with diabetes (aged 59 ± 13 years, 529 men) for symptom evaluation (42%) and follow-up of known coronary artery disease (CAD) (58%). Stress echocardiography using exercise was performed in 333 patients able to exercise maximally, and dobutamine echo using a standard dobutamine stress was used in 604 patients. Patients were followed for ≤9 years (mean 3.9 ± 2.3) for all-cause mortality. RESULTS - Normal studies were obtained in 567 (60%) patients; 29% had resting left ventricular (LV) dysfunction, and 25% had ischemia. Abnormalities were confined to one territory in 183 (20%) patients and to multiple territories in 187 (20%) patients. Death (in 275 [29%] patients) was predicted by referral for pharmacologic stress (hazard ratio [HR] 3.94, P < 0.0001), ischemia (HR 1.77, P < 0.0001), age (HR 1.02, P = 0.002), and heart failure (HR 1.54, P = 0.01). The risk of death in patients with a normal scan was 4% per year, and this was associated with age and selection for pharmacologic stress testing. In stepwise models replicating the sequence of clinical evaluation, the predictive power of independent clinical predictors (age, selection for pharmacologic stress, previous infarction, and heart failure; model χ² = 104.8) was significantly enhanced by the addition of stress echo data (model χ² = 122.9). CONCLUSIONS - The results of stress echo are independent predictors of death in diabetic patients with known or suspected CAD. Ischemia adds risk that is incremental to clinical risks and LV dysfunction.

Relevance: 10.00%

Abstract:

Stress echocardiography has been shown to improve the diagnosis of coronary artery disease in the presence of hypertension, but its value in prognostic evaluation is unclear. We sought to determine whether stress echocardiography could be used to predict mortality in 2363 patients with hypertension, who were followed for up to 10 years (mean 4.0 ± 1.8) for death and revascularization. Stress echocardiograms were normal in 1483 patients (63%), 16% had resting left ventricular (LV) dysfunction alone, and 21% had ischemia. Abnormalities were confined to one territory in 489 patients (21%) and to multiple territories in 365 patients (15%). Cardiac death was less frequent among the patients able to exercise than among those undergoing dobutamine echocardiography (4% versus 7%, P < 0.001). The risk of death in patients with a negative stress echocardiogram was <1% per year. Ischemia identified by stress echocardiography was an independent predictor of mortality in those able to exercise (hazard ratio 2.21, 95% confidence interval 1.10 to 4.43, P = 0.0001) as well as those undergoing dobutamine echo (hazard ratio 2.39, 95% confidence interval 1.53 to 3.75, P = 0.0001); other predictors were age, heart failure, resting LV dysfunction, and the Duke treadmill score. In stepwise models replicating the sequence of clinical evaluation, the results of stress echocardiography added prognostic power to models based on clinical and stress-testing variables. Thus, the results of stress echocardiography are an independent predictor of cardiac death in hypertensive patients with known or suspected coronary artery disease, incremental to clinical risks and exercise results.

Relevance: 10.00%

Abstract:

Complex chemical reactions in the gas phase can be decomposed into a network of elementary (e.g., unimolecular and bimolecular) steps which may involve multiple reactant channels, multiple intermediates, and multiple products. The modeling of such reactions involves describing the molecular species and their transformation by reaction at a detailed level. Here we focus on a detailed modeling of the C(3P) + allene (C3H4) reaction, for which molecular beam experiments and theoretical calculations have previously been performed. In our previous calculations, product branching ratios for a nonrotating isomerizing unimolecular system were predicted. We extend the previous calculations to predict absolute unimolecular rate coefficients and branching ratios using microcanonical variational transition state theory (μ-VTST) with full energy and angular momentum resolution. Our calculation of the initial capture rate is facilitated by systematic ab initio potential energy surface calculations that describe the interaction potential between carbon and allene as a function of the angle of attack. Furthermore, the chemical kinetic scheme is enhanced to explicitly treat the entrance channels in terms of a predicted overall input flux and also to allow for the possibility of redissociation via the entrance channels. Thus, the computation of total bimolecular reaction rates and partial capture rates is now possible. (C) 2002 American Institute of Physics.
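
For reference, the E,J-resolved rate coefficient in microcanonical variational TST has the standard RRKM form, with the dividing surface chosen to minimize the sum of states; this is the textbook expression rather than a formula quoted from the paper:

```latex
k(E,J) \;=\; \frac{N^{\ddagger}(E,J)}{h\,\rho(E,J)},
\qquad
k_{\mu\text{-VTST}}(E,J) \;=\; \min_{s}\,\frac{N^{\ddagger}(E,J;s)}{h\,\rho(E,J)}
```

Here N‡(E,J;s) is the sum of states of a trial dividing surface at position s along the reaction coordinate, ρ(E,J) is the reactant density of states, and h is Planck's constant.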

Relevance: 10.00%

Abstract:

We investigate the difference between classical and quantum dynamics of coupled magnetic dipoles. We prove that in general the dynamics of the classical interaction Hamiltonian differs from the corresponding quantum model, regardless of the initial state. The difference appears as nonpositive-definite diffusion terms in the quantum evolution equation of an appropriate positive phase-space probability density. Thus, it is not possible to express the dynamics in terms of a convolution of a positive transition probability function and the initial condition as can be done in the classical case. It is this feature that enables the quantum system to evolve to an entangled state. We conclude that the dynamics are a quantum element of nuclear magnetic resonance quantum-information processing. There are two limits where our quantum evolution coincides with the classical one: the short-time limit before spin-spin interaction sets in and the long-time limit when phase diffusion is incorporated.

Relevance: 10.00%

Abstract:

What interactions are sufficient to simulate arbitrary quantum dynamics in a composite quantum system? Dodd [Phys. Rev. A 65, 040301(R) (2002)] provided a partial solution to this problem in the form of an efficient algorithm to simulate any desired two-body Hamiltonian evolution using any fixed two-body entangling N-qubit Hamiltonian, and local unitaries. We extend this result to the case where the component systems are qudits, that is, have D dimensions. As a consequence we explain how universal quantum computation can be performed with any fixed two-body entangling N-qudit Hamiltonian, and local unitaries.
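
The core trick, that conjugating the fixed entangling Hamiltonian with local unitaries changes which two-body term is applied, and that Trotterization then stitches such terms into a target evolution, can be seen in a small qubit example. The choice of sigma_z x sigma_z as the fixed Hamiltonian and the target below are illustrative assumptions; the paper's result concerns general N-qudit Hamiltonians.

```python
# Sketch: simulating a target two-qubit Hamiltonian using only a fixed entangling
# Hamiltonian (sigma_z x sigma_z, an assumed example) plus local unitaries, via
# conjugation and Trotterization. Illustrates the general idea only.
import numpy as np
from scipy.linalg import expm

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

H_fixed = np.kron(sz, sz)                                      # the fixed entangling interaction
Hd = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard
U = np.kron(Hd, I2)                                            # local unitary on qubit 1 only
# Local conjugation turns the fixed coupling into a different two-body term:
assert np.allclose(U @ H_fixed @ U.conj().T, np.kron(sx, sz))

# Target: H = sz x sz + sx x sz, built by alternating the bare fixed Hamiltonian
# with its locally conjugated version in short Trotter steps.
T, n = 1.0, 200
tau = T / n
step = expm(-1j * H_fixed * tau) @ U @ expm(-1j * H_fixed * tau) @ U.conj().T
approx = np.linalg.matrix_power(step, n)

exact = expm(-1j * (np.kron(sz, sz) + np.kron(sx, sz)) * T)
print("Trotter error:", np.linalg.norm(approx - exact))        # shrinks as n grows
```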

Relevance: 10.00%

Abstract:

Recently a scheme has been proposed for constructing quantum error-correcting codes that embed a finite-dimensional code space in the infinite-dimensional Hilbert space of a system described by continuous quantum variables. One of the difficult steps in this scheme is the preparation of the encoded states. We show how these states can be generated by coupling a continuous quantum variable to a single qubit. An ion trap quantum computer provides a natural setting for a continuous system coupled to a qubit. We discuss how encoded states may be generated in an ion trap.

Relevance: 10.00%

Abstract:

Objective: To outline the major methodological issues relevant to the use of the population impact number (PIN) and the disease impact number (DIN) in health policy decision making. Design: Review of the literature and calculation of PIN and DIN statistics in different settings. Setting: The DIN and the PIN, previously proposed extensions to the number needed to treat (NNT) that give a population perspective to this measure. Main results: The PIN and DIN allow the population impact of different interventions to be compared, either within the same disease or across different diseases or conditions. The primary studies used for relative risk estimates should have outcomes, time periods and comparison groups that are congruent and relevant to the local setting. These need to be combined with local data on disease rates and population size. Depending on the particular problem, the target may be disease incidence or prevalence, and the effect of interest may be either the incremental impact or the total impact of each intervention. For practical application, it will be important to use sensitivity analyses to determine plausible intervals for the impact numbers. Conclusions: Attention to these methodological issues will permit the DIN and PIN to be used to assist health policy makers in assigning a population perspective to measures of risk.
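
As a hedged arithmetic sketch of how the impact numbers relate to the NNT: the commonly cited definitions take NNT = 1/ARR and then rescale by the fraction of diseased people exposed to the intervention (DIN) and additionally by disease prevalence (PIN). The numbers below are invented, and the formulas should be checked against the original definitions rather than taken from this sketch.

```python
# Hedged sketch of impact-number arithmetic using commonly cited definitions
# (NNT = 1/ARR; DIN and PIN rescale by exposed fraction and by prevalence).
# All input values are invented for illustration.

arr = 0.02            # absolute risk reduction from the intervention (assumed)
p_exposed = 0.6       # proportion of people with the disease who receive it (assumed)
prevalence = 0.05     # proportion of the whole population with the disease (assumed)

nnt = 1 / arr                               # patients treated per event prevented
din = 1 / (arr * p_exposed)                 # people with the disease per event prevented
pin = 1 / (arr * p_exposed * prevalence)    # people in the population per event prevented

print(f"NNT = {nnt:.0f}, DIN = {din:.0f}, PIN = {pin:.0f}")
# NNT = 50, DIN = 83, PIN = 1667
```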

Relevance: 10.00%

Abstract:

Objective: To measure the cost-effectiveness of cholesterol-lowering therapy with pravastatin in patients with established ischaemic heart disease and average baseline cholesterol levels. Design: Prospective economic evaluation within a double-blind randomised trial (Long-Term Intervention with Pravastatin in Ischaemic Disease [LIPID]), in which patients with a history of unstable angina or previous myocardial infarction were randomised to receive 40 mg of pravastatin daily or matching placebo. Patients and setting: 9014 patients aged 35-75 years from 85 centres in Australia and New Zealand, recruited from June 1990 to December 1992. Main outcome measures: Cost per death averted, cost per life-year gained, and cost per quality-adjusted life-year gained, calculated from measures of hospitalisations, medication use, outpatient visits, and quality of life. Results: The LIPID trial showed a 22% relative reduction in all-cause mortality (P < 0.001). Over a mean follow-up of 6 years, hospital admissions for coronary heart disease and coronary revascularisation were reduced by about 20%. Over this period, pravastatin cost $A4913 per patient, but reduced total hospitalisation costs by $A1385 per patient and other long-term medication costs by $A360 per patient. In a subsample of patients, average quality of life was 0.98 (where 0 = dead and 1 = normal good health); the treatment groups were not significantly different. The absolute reduction in all-cause mortality was 3.0% (95% CI, 1.6%-4.4%), and the incremental cost was $3246 per patient, resulting in a cost per life saved of $107 730 (95% CI, $68 626-$209 881) within the study period. Extrapolating long-term survival from the placebo group, the undiscounted cost per life-year saved was $7695 (and $10 938 with costs and life-years discounted at an annual rate of 5%). Conclusions: Pravastatin therapy for patients with a history of myocardial infarction or unstable angina and average cholesterol levels reduces all-cause mortality and appears cost-effective compared with accepted treatments in high-income countries.
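
The headline cost-per-life-saved figure follows from dividing the incremental cost per patient by the absolute mortality reduction; the quick check below uses only the rounded figures quoted above, so it reproduces the published $107 730 only approximately.

```python
# Back-of-envelope check of the cost-effectiveness arithmetic reported above,
# using only the rounded figures quoted in the abstract.

incremental_cost = 3246      # incremental cost per patient over the study period
arr_mortality = 0.030        # absolute reduction in all-cause mortality (3.0%)

cost_per_life_saved = incremental_cost / arr_mortality
print(f"~${cost_per_life_saved:,.0f} per life saved")   # ~ $108,200 vs the reported $107 730
```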