Abstract:
Patients with schizophrenia display numerous cognitive deficits, including problems in working memory, time estimation, and absolute identification of stimuli. Research in these fields has traditionally been conducted independently. We examined these cognitive processes using tasks that are structurally similar and that yield rich error data. Relative to healthy control participants (n = 20), patients with schizophrenia (n = 20) were impaired on a duration identification task and a probed-recall memory task but not on a line-length identification task. These findings do not support the notion of a global impairment in absolute identification in schizophrenia. However, we suggest that some aspect of temporal information processing is indeed disturbed in schizophrenia.
Abstract:
With the advent of new video standards such as MPEG-4 Part 10 / H.264 (formerly H.26L), demand for advanced video coding, particularly in the area of variable block-size motion estimation (VBSME), is increasing. In this paper, we propose a new one-dimensional (1-D) very large-scale integration (VLSI) architecture for full-search VBSME (FSVBSME). The VBS sum-of-absolute-differences (SAD) computation is performed by reusing the results of smaller sub-block computations, which are distributed and combined by a shuffling mechanism within each processing element. Whereas a conventional 1-D architecture can process only one motion vector (MV), the new architecture can process up to 41 MV sub-blocks (within a macroblock) in the same number of clock cycles.
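The sub-block reuse idea can be illustrated in software: compute the sixteen 4x4 base SADs of a 16x16 macroblock once, then derive every larger block's SAD by summing smaller ones, giving the 16 + 8 + 8 + 4 + 2 + 2 + 1 = 41 sub-block SADs of H.264 without revisiting any pixel. This is a minimal NumPy sketch of the arithmetic reuse only, not the paper's 1-D hardware pipeline or its shuffling mechanism; the function name is ours.

```python
import numpy as np

def vbs_sads(cur, ref):
    """All 41 variable-block-size SADs of a 16x16 macroblock,
    derived from the sixteen 4x4 base SADs (illustrative sketch of
    the result-reuse idea; not the paper's hardware architecture)."""
    diff = np.abs(cur.astype(np.int64) - ref.astype(np.int64))
    # 4x4 base SADs on a 4x4 grid: entry [r, c] is the SAD of the
    # sub-block covering rows 4r..4r+3, cols 4c..4c+3.
    sad4x4 = diff.reshape(4, 4, 4, 4).sum(axis=(1, 3))
    # Every larger SAD is a sum of smaller ones -- no pixel is re-read.
    sad8x4  = sad4x4[:, 0::2] + sad4x4[:, 1::2]   # 8 blocks (8 wide, 4 tall)
    sad4x8  = sad4x4[0::2, :] + sad4x4[1::2, :]   # 8 blocks (4 wide, 8 tall)
    sad8x8  = sad8x4[0::2, :] + sad8x4[1::2, :]   # 4 blocks
    sad16x8 = sad8x8[:, 0] + sad8x8[:, 1]         # 2 blocks
    sad8x16 = sad8x8[0, :] + sad8x8[1, :]         # 2 blocks
    sad16x16 = sad16x8.sum()                      # 1 block
    return {'4x4': sad4x4, '8x4': sad8x4, '4x8': sad4x8, '8x8': sad8x8,
            '16x8': sad16x8, '8x16': sad8x16, '16x16': sad16x16}
```

The hierarchy is why a hardware design can serve all 41 motion vectors in the cycle budget of one: only the 4x4 level ever touches pixel data.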
Abstract:
We establish a mapping between a continuous-variable (CV) quantum system and a discrete quantum system of arbitrary dimension. This opens up the general possibility to perform any quantum information task with a CV system as if it were a discrete system. The Einstein-Podolsky-Rosen state is mapped onto the maximally entangled state in any finite-dimensional Hilbert space and thus can be considered as a universal resource of entanglement. An explicit example of the map and a proposal for its experimental realization are discussed.
Abstract:
There have been theoretical and experimental studies of quantum nonlocality for continuous variables based on dichotomic observables. In particular, we are interested in two such dichotomic observables for a continuous-variable light field: one distinguishes even from odd photon numbers, the other the absence from the presence of photons. We analyze various observables to find the maximum violation of Bell's inequalities for continuous-variable states, and we discuss an observable that yields a violation of Bell's inequality for any entangled pure continuous-variable state. However, the state need not be maximally entangled to give the maximal violation of Bell's inequality. This is attributed to a generic problem of testing the quantum nonlocality of an infinite-dimensional state using a dichotomic observable.
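For background, a widely used concrete form of the even/odd photon-number observable is the displaced parity operator of Banaszek and Wódkiewicz, entered into a CHSH combination; the notation below is standard and is given as context, not taken from this abstract:

```latex
\Pi(\alpha) = D(\alpha)\,(-1)^{\hat n}\,D^{\dagger}(\alpha),
\qquad
\Pi(\alpha,\beta) = \bigl\langle \Pi_1(\alpha)\otimes\Pi_2(\beta) \bigr\rangle,
```

```latex
\mathcal{B} = \Pi(\alpha,\beta) + \Pi(\alpha,\beta') + \Pi(\alpha',\beta) - \Pi(\alpha',\beta'),
\qquad
|\mathcal{B}| \le 2 \quad \text{(local realism)},
```

where \(D(\alpha)\) is the displacement operator and \(\hat n\) the photon-number operator; a quantum state violates local realism when \(|\mathcal{B}| > 2\).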
Abstract:
Measures of entanglement, fidelity, and purity are basic yardsticks in quantum-information processing. We propose how to implement these measures using linear devices and homodyne detectors for continuous-variable Gaussian states. In particular, the test of entanglement becomes simple with some prior knowledge that is relevant to current experiments.
Abstract:
For interpreting past changes on a regional or global scale, the timings of proxy-inferred events are usually aligned with data from other locations. Too often, however, chronological uncertainties are ignored in proxy diagrams and multi-site comparisons, making it possible for researchers to fall into the trap of sucking separate events into one illusory event (or vice versa). Here we largely solve this "suck in and smear syndrome" for radiocarbon (14C) dated sequences. In a Bayesian framework, millions of plausible age-models are constructed to quantify the chronological uncertainties within and between proxy archives. We test the technique on replicated high-resolution 14C-dated peat cores deposited during the "Little Ice Age" (c. AD 1400-1900), a period characterized by abrupt climate changes and severe 14C calibration problems. Owing to internal variability in the proxy data and uncertainties in the age-models, these (and possibly many more) archives are not consistent in recording decadal climate change. Explicit statistical tests of palaeoenvironmental hypotheses allow a move toward systematic interpretation of proxy data; at present, however, the chronological uncertainties of non-annually resolved palaeoclimate records are too large to answer decadal-timescale questions.
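The core of the age-model ensemble idea can be sketched in a few lines: sample many plausible ages at the dated depths, discard draws that violate stratigraphic order (deeper must be older), and interpolate each surviving age-depth model to the depth of a proxy event, so the event's timing becomes a distribution rather than a point. This is a heavily simplified illustration with hypothetical depths and ages, approximating calibrated 14C dates as Gaussians; a real analysis samples from full calibration-curve distributions (e.g. IntCal) and uses the paper's Bayesian machinery.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical dated depths (cm) with calibrated ages (cal yr AD),
# approximated here as Gaussians for illustration only.
depths   = np.array([10, 30, 50, 70])
age_mean = np.array([1850, 1700, 1550, 1420])
age_sd   = np.array([25, 30, 30, 40])

def sample_age_models(n, event_depth=40):
    """Monte Carlo over plausible age-depth models: draw ages at the
    dated depths, keep only stratigraphically ordered draws (AD ages
    must decrease downcore), and linearly interpolate each surviving
    model to the depth of a proxy event."""
    ages = rng.normal(age_mean, age_sd, size=(n, len(depths)))
    ordered = np.all(np.diff(ages, axis=1) < 0, axis=1)
    valid = ages[ordered]
    return np.array([np.interp(event_depth, depths, a) for a in valid])

event_ages = sample_age_models(20000)
lo, hi = np.percentile(event_ages, [2.5, 97.5])
# The 95% interval [lo, hi] is the event's chronological uncertainty.
```

When the same exercise is run for two cores, overlap of the two event-age distributions (rather than of two point estimates) is what decides whether the records plausibly register one event or two.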