923 results for COHERENCE
Abstract:
In this thesis I study various quantum coherence phenomena and lay some of the foundations for a systematic theory of coherence. So far, the approach to quantum coherence in science has been purely phenomenological. In the thesis I try to answer the question of what quantum coherence is and how it should be approached within the framework of physics, the metatheory of physics, and the terminology related to them. It is worth noting that quantum coherence is a conserved quantity that can be defined exactly. I propose a way to define quantum coherence mathematically from the density matrix of the system. Degenerate quantum gases, i.e., Bose condensates and ultracold Fermi systems, form a good laboratory for studying coherence, since their entropy is small and their coherence is large, so they exhibit strong coherence phenomena. Concerning coherence phenomena in degenerate quantum gases, I concentrate mainly on the collective association of atoms into molecules, Rabi oscillations, and decoherence. It turns out that collective association and oscillations do not depend on the spin statistics of the particles. Moreover, I study the logical features of decoherence in closed systems via a simple spin model. I argue that decoherence is a valid concept even in systems that can undergo recoherence, i.e., Poincaré recurrences. Metatheoretically this is a remarkable result, since it justifies quantum cosmology: studying the whole universe (i.e., physical reality) purely quantum physically is meaningful and valid science, in which decoherence explains why the quantum physical universe appears very classical to cosmologists and other scientists. The study of the logical structure of closed systems also reveals that sufficiently complex closed (physical) systems obey a principle similar to Gödel's incompleteness theorem in logic. According to this principle, it is impossible to describe a closed system completely from within the system, and the inside and outside descriptions of the system can differ remarkably. Understanding this feature may make it possible to comprehend coarse-graining better and to define the mutual entanglement of quantum systems uniquely.
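The abstract does not state the proposed definition itself; as a hedged illustration of what a coherence measure "defined from the density matrix" can look like, the sketch below computes the widely used l1-norm coherence (the sum of the magnitudes of the off-diagonal density-matrix elements in a fixed reference basis). The measure and the basis choice are assumptions made for illustration, not necessarily the thesis's own definition.

    import numpy as np

    def l1_coherence(rho):
        """l1-norm coherence: sum of |rho_ij| over the off-diagonal
        elements of the density matrix in a fixed reference basis."""
        rho = np.asarray(rho, dtype=complex)
        return float(np.abs(rho).sum() - np.abs(np.diag(rho)).sum())

    # The equal superposition (|0> + |1>)/sqrt(2) is maximally coherent in
    # the computational basis; the maximally mixed state has no coherence.
    superposition = np.array([[0.5, 0.5], [0.5, 0.5]])
    maximally_mixed = np.array([[0.5, 0.0], [0.0, 0.5]])
    print(l1_coherence(superposition))    # 1.0
    print(l1_coherence(maximally_mixed))  # 0.0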
Abstract:
Hypertexts are digital texts characterized by interactive hyperlinking and a fragmented textual organization. Increasingly prominent since the early 1990s, hypertexts have become a common text type both on the Internet and in a variety of other digital contexts. Although hypertext has been studied widely in disciplines like hypertext theory and media studies, formal linguistic approaches to it remain relatively rare. This study examines coherence negotiation in hypertext with particular reference to hypertext fiction. Coherence, or the quality of making sense, is a fundamental property of textness. Proceeding from the premise that coherence is a subjectively evaluated property rather than an objective quality arising directly from textual cues, the study focuses on the processes through which readers interact with hyperlinks and negotiate continuity between hypertextual fragments. The study begins with a typological discussion of textuality and an overview of the historical and technological precedents of modern hypertexts. Then, drawing on text linguistic, discourse analytical, pragmatic, and narratological approaches to textual coherence, the study takes established models developed for analyzing and describing conventional texts and examines their applicability to hypertext. Primary data derived from a collection of hyperfictions is used throughout to illustrate the mechanisms in practice. Hypertextual coherence negotiation is shown to require the ability to operate cognitively between local and global coherence by processing lexical cohesion, discourse topical continuities, inferences and implications, and shifting cognitive frames. The main conclusion of the study is that the style of reading required by hypertextuality fosters a new paradigm of coherence. Defined as fuzzy coherence, this new approach to textual sensemaking is predicated on an acceptance of the coherence challenges readers experience when the act of reading comes to involve repeated encounters with referentially imprecise hyperlinks and discourse topical shifts. A practical application of fuzzy coherence is shown to be at work in the way coherence is actively manipulated in hypertext narratives.
Abstract:
In this paper we present a cache coherence protocol for multistage interconnection network (MIN)-based multiprocessors with two distinct private caches: private-blocks caches (PCache) containing blocks private to a process, and shared-blocks caches (SCache) containing data accessible by all processes. The architecture is extended with a coherence control bus connecting all shared-blocks cache controllers. Timing problems due to variable transit delays through the MIN are handled by introducing transient states in the proposed cache coherence protocol. The impact of the coherence protocol on system performance is evaluated through a three-phase performance study. Assuming homogeneity of all nodes, a single-node queuing model (phase 3) is developed to analyze system performance. This model is solved for processor and coherence-bus utilizations using the mean value analysis (MVA) technique, with shared-blocks steady-state probabilities (phase 1) and communication delays (phase 2) as input parameters. The performance of our system is compared to that of a system with an equivalent-sized unified cache and to a multiprocessor implementing a directory-based coherence protocol. System performance measures are verified through simulation.
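The abstract only names the MVA technique; as a rough illustration of how a mean value analysis step works for a closed queuing model, the sketch below implements exact MVA for single-server FCFS stations. The two-station setup and its numerical service demands are invented for illustration and are not the paper's phase-1/phase-2 inputs.

    def mva(demands, customers):
        """Exact mean value analysis for a closed network of single-server
        FCFS stations. demands[i] = total service demand at station i.
        Returns system throughput, per-station utilizations, queue lengths."""
        M = len(demands)
        Q = [0.0] * M                                         # queue lengths with 0 customers
        X = 0.0
        for n in range(1, customers + 1):
            R = [demands[i] * (1.0 + Q[i]) for i in range(M)]  # residence times
            X = n / sum(R)                                     # throughput with n customers
            Q = [X * r for r in R]                             # Little's law per station
        U = [X * d for d in demands]                           # station utilizations
        return X, U, Q

    # Illustrative two-station model (say, a processor and a coherence bus).
    X, U, Q = mva(demands=[1.0, 0.4], customers=8)
    print(U)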
Abstract:
Multisensor recordings are becoming commonplace. When studying functional connectivity between different brain areas using such recordings, one defines regions of interest, and each region of interest is often characterized by a set (block) of time series. Presently, for two such regions, the interdependence is typically computed by estimating the ordinary coherence for each pair of individual time series and then summing or averaging the results over all such pairs of channels (one from block 1 and the other from block 2). The aim of this paper is to generalize the concept of coherence so that it can be computed for two blocks of non-overlapping time series. This quantity, called block coherence, is first shown mathematically to have properties similar to those of ordinary coherence, and is then applied to analyze local field potential recordings from a monkey performing a visuomotor task. It is found that an increase in block coherence between the channels from the V4 region and those from the prefrontal region in the beta band leads to a decrease in response time.
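For reference, the ordinary per-channel-pair coherence that block coherence generalizes can be computed as in the sketch below; the signals and their parameters are synthetic stand-ins, not the monkey LFP data from the paper.

    import numpy as np
    from scipy.signal import coherence

    # Two synthetic channels sharing a 20 Hz component plus independent noise.
    fs = 200.0
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(0)
    common = np.sin(2 * np.pi * 20 * t)
    x = common + 0.5 * rng.standard_normal(t.size)
    y = common + 0.5 * rng.standard_normal(t.size)

    # Magnitude-squared coherence C_xy(f) = |S_xy|^2 / (S_xx * S_yy), in [0, 1].
    f, Cxy = coherence(x, y, fs=fs, nperseg=256)
    print(f[np.argmax(Cxy)])   # peak near 20 Hz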
Abstract:
Let $\mathbb{D}$ denote the open unit disk in $\mathbb{C}$ centered at 0, and let $H^{\infty}_{\mathbb{R}}$ denote the set of all bounded holomorphic functions $f$ on $\mathbb{D}$ that satisfy $f(z) = \overline{f(\bar{z})}$ for all $z \in \mathbb{D}$. It is shown that $H^{\infty}_{\mathbb{R}}$ is a coherent ring.
Abstract:
We address the problem of high-resolution reconstruction in frequency-domain optical-coherence tomography (FDOCT). The traditional method uses the inverse discrete Fourier transform, whose resolution is limited by the Heisenberg uncertainty principle. We propose a reconstruction technique based on zero-crossing (ZC) interval analysis. The motivation for our approach lies in the observation that, for a multilayered specimen, the backscattered signal may be expressed as a sum of sinusoids, and each sinusoid manifests as a peak in the FDOCT reconstruction. The successive ZC intervals of a sinusoid exhibit high consistency, with the intervals being inversely related to the frequency of the sinusoid. The statistics of the ZC intervals are used for detecting the frequencies present in the input signal. The noise robustness of the proposed technique is improved by using a cosine-modulated filter bank to separate the input into different frequency bands, and the ZC analysis is carried out on each band separately. The design of the filter bank requires the design of a prototype filter, which we accomplish using a Kaiser window approach. We show that the proposed method gives good results on synthesized and experimental data. The resolution is enhanced, and the noise robustness is higher than with the standard Fourier reconstruction. (c) 2012 Optical Society of America
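The core observation, that successive zero-crossing intervals of a sinusoid are half-periods and therefore encode its frequency, can be sketched as below. The test signal and its parameters are invented, and the cosine-modulated filter bank and Kaiser-window design stages described in the abstract are omitted.

    import numpy as np

    fs = 10_000.0                                # sampling rate (illustrative)
    t = np.arange(0, 0.2, 1 / fs)
    rng = np.random.default_rng(0)
    x = np.sin(2 * np.pi * 440 * t) + 0.05 * rng.standard_normal(t.size)

    # Indices where the signal changes sign, refined by linear interpolation.
    i = np.where(x[:-1] * x[1:] < 0)[0]
    tz = t[i] - x[i] * (t[i + 1] - t[i]) / (x[i + 1] - x[i])

    # Each zero-crossing interval of a sinusoid is roughly half its period,
    # so the dominant frequency is the reciprocal of twice the typical interval.
    f_est = 1.0 / (2.0 * np.median(np.diff(tz)))
    print(f_est)                                 # close to 440 Hz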
Abstract:
We address the reconstruction problem in frequency-domain optical-coherence tomography (FDOCT) from under-sampled measurements within the framework of compressed sensing (CS). Specifically, we propose optimal sparsifying bases for accurate reconstruction by analyzing the backscattered signal model. Although one might expect Fourier bases to be optimal for the FDOCT reconstruction problem, it turns out that the optimal sparsifying bases are windowed cosine functions where the window is the magnitude spectrum of the laser source. Further, the windowed cosine bases can be phase locked, which allows one to obtain higher accuracy in reconstruction. We present experimental validations on real data. The findings reported in this Letter are useful for optimal dictionary design within the framework of CS-FDOCT. (C) 2012 Optical Society of America
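A rough sketch of the kind of dictionary described, with cosine atoms multiplied by a common spectral window, is given below; here the window is a Gaussian stand-in for the laser source's magnitude spectrum, and the sizes are arbitrary rather than taken from the Letter.

    import numpy as np

    def windowed_cosine_dictionary(n_samples, n_atoms, window):
        """Columns are cosines of increasing frequency, each multiplied by the
        same spectral window, then normalized to unit norm (illustrative)."""
        k = np.arange(n_samples)[:, None]                    # sample index
        f = np.arange(n_atoms)[None, :]                      # atom (frequency) index
        atoms = window[:, None] * np.cos(np.pi * f * (k + 0.5) / n_samples)
        return atoms / np.linalg.norm(atoms, axis=0)

    n = 512
    gaussian_window = np.exp(-0.5 * ((np.arange(n) - n / 2) / (n / 8)) ** 2)
    D = windowed_cosine_dictionary(n, 128, gaussian_window)
    print(D.shape)                                           # (512, 128)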
Abstract:
We address the problem of phase retrieval, which is frequently encountered in optical imaging. The measured quantity is the magnitude of the Fourier spectrum of a function (in optics, the function is also referred to as an object). The goal is to recover the object based on the magnitude measurements. In doing so, the standard assumptions are that the object is compactly supported and positive. In this paper, we consider objects that admit a sparse representation in some orthonormal basis. We develop a variant of the Fienup algorithm to incorporate the condition of sparsity and to successively estimate and refine the phase starting from the magnitude measurements. We show that the proposed iterative algorithm possesses Cauchy convergence properties. As far as the modality is concerned, we work with measurements obtained using a frequency-domain optical-coherence tomography experimental setup. The experimental results on real measured data show that the proposed technique exhibits good reconstruction performance even with fewer coefficients taken into account for reconstruction. It also suppresses the autocorrelation artifacts to a significant extent since it estimates the phase accurately.
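The abstract does not spell out the iteration, so the sketch below shows only the generic shape of a Fienup-style loop with an added hard-sparsity step (keep the K largest-magnitude object-domain coefficients, here in the canonical basis). The constants, basis choice, and stopping rule are assumptions, not the paper's algorithm.

    import numpy as np

    def fienup_with_sparsity(magnitude, K, n_iter=300, seed=0):
        """Generic Fienup-style phase retrieval from Fourier magnitudes with
        positivity and a keep-K-largest sparsity step (illustrative sketch)."""
        rng = np.random.default_rng(seed)
        phase = np.exp(1j * rng.uniform(0, 2 * np.pi, magnitude.shape))
        xs = np.zeros(magnitude.shape)
        for _ in range(n_iter):
            x = np.fft.ifft(magnitude * phase).real        # object-domain estimate
            keep = np.argsort(np.abs(x))[-K:]              # sparsity: K largest entries
            xs = np.zeros_like(x)
            xs[keep] = np.clip(x[keep], 0.0, None)         # positivity constraint
            phase = np.exp(1j * np.angle(np.fft.fft(xs)))  # keep phase; magnitude re-imposed next pass
        return xs

    # A 3-sparse nonnegative test signal; recovery is unique only up to the
    # usual trivial ambiguities (circular shift and flip).
    x_true = np.zeros(128)
    x_true[[10, 40, 90]] = [1.0, 0.6, 0.8]
    x_hat = fienup_with_sparsity(np.abs(np.fft.fft(x_true)), K=3)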
Abstract:
The H-1 NMR spectroscopic discrimination of enantiomers in the solution state and the measurement of enantiomeric composition are most often hindered by either very small chemical shift differences between the discriminated peaks or severe overlap with transitions from other chemically non-equivalent protons. In addition, the use of chiral auxiliaries such as crown ethers and chiral lanthanide shift reagents may cause severe line broadening or provide little discrimination beyond a certain crown-ether-to-substrate ratio, hampering the discrimination. To circumvent such problems, we propose exploiting the difference in the sums of all the chemical shifts of a scalar-coupled spin system. The excitation and detection of the appropriate highest-quantum coherence yields a measurable frequency difference between two transitions in the maximum-quantum dimension, one pertaining to each enantiomer, permitting their discrimination; the F-2 cross section at each of these frequencies then yields an enantiopure spectrum. The advantage of the proposed method is demonstrated on several chiral compounds whose conventional one-dimensional H-1 NMR spectra fail to differentiate the enantiomers.
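A hedged way to state the underlying idea (the specific spin systems and pulse sequences are in the paper): in the weak-coupling limit the highest-quantum coherence of an N-spin system evolves at the sum of all the chemical-shift offsets, so the enantiomer separation in the maximum-quantum dimension is the sum of the individual per-proton differences rather than any single one of them:

    \nu_{NQ} = \sum_{i=1}^{N} \nu_i ,
    \qquad
    \Delta\nu_{NQ}^{(R,S)} = \sum_{i=1}^{N} \left( \nu_i^{R} - \nu_i^{S} \right).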
Abstract:
Facet-based sentiment analysis involves discovering the latent facets, sentiments and their associations. Traditional facet-based sentiment analysis algorithms typically perform the various tasks in sequence, and fail to take advantage of the mutual reinforcement between the tasks. Additionally, inferring sentiment levels typically requires domain knowledge or human intervention. In this paper, we propose a series of probabilistic models that jointly discover latent facets and sentiment topics, and also order the sentiment topics with respect to a multi-point scale, in a language- and domain-independent manner. This is achieved by simultaneously capturing both short-range syntactic structure and long-range semantic dependencies between the sentiment and facet words. The models further incorporate coherence in reviews, where reviewers dwell on one facet or sentiment level before moving on, for more accurate facet and sentiment discovery. For reviews that are supplemented with ratings, our models automatically order the latent sentiment topics, without requiring seed words or domain knowledge. To the best of our knowledge, our work is the first attempt to combine the notions of syntactic and semantic dependencies in the domain of review mining; the concept of facet and sentiment coherence has not been explored before either. Extensive experimental results on real-world review data show that the proposed models outperform various state-of-the-art baselines for facet-based sentiment analysis.
Abstract:
Recently it has been discovered, contrary to the expectations of physicists as well as biologists, that the energy transport during photosynthesis, from the chlorophyll pigment that captures the photon to the reaction centre where glucose is synthesised from carbon dioxide and water, is highly coherent even at ambient temperature and in the cellular environment. This process and the key molecular ingredients it depends on are described. By looking at the process from the computer science viewpoint, we can study what has been optimised and how. A spatial search algorithmic model based on robust features of wave dynamics is presented.
Abstract:
Exploiting the performance potential of GPUs requires managing the data transfers to and from them efficiently, which is an error-prone and tedious task. In this paper, we develop a software coherence mechanism to fully automate all data transfers between the CPU and GPU without any assistance from the programmer. Our mechanism uses compiler analysis to identify potential stale accesses and a runtime to initiate transfers as necessary. This allows us to avoid the redundant transfers exhibited by all other existing automatic memory management proposals. We integrate our automatic memory manager into the X10 compiler and runtime, and find that it not only results in smaller and simpler programs, but also eliminates redundant memory transfers. Tested on eight programs ported from the Rodinia benchmark suite, it achieves (i) a 1.06x speedup over hand-tuned manual memory management, and (ii) a 1.29x speedup over another recently proposed compiler-runtime automatic memory management system. Compared to other existing runtime-only and compiler-only proposals, it also transfers 2.2x to 13.3x less data on average.
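The abstract describes the mechanism only at a high level; the toy sketch below illustrates the general idea of tracking which copy of an array is valid and transferring only on a potentially stale access. The class and method names are invented for illustration and are not the X10 compiler or runtime API.

    class CoherentArray:
        """Toy host/GPU coherence tracker (illustrative, not the paper's system):
        a transfer happens only when the side being accessed may hold stale data."""

        def __init__(self, host_data):
            self.host, self.device = list(host_data), None
            self.host_valid, self.device_valid = True, False
            self.transfers = 0

        def _to_device(self):
            if not self.device_valid:            # potential stale access on the GPU side
                self.device = list(self.host)    # stands in for a CPU -> GPU copy
                self.device_valid = True
                self.transfers += 1

        def _to_host(self):
            if not self.host_valid:
                self.host = list(self.device)    # stands in for a GPU -> CPU copy
                self.host_valid = True
                self.transfers += 1

        def write_on_device(self, i, value):
            self._to_device()
            self.device[i] = value
            self.host_valid = False              # host copy is now stale

        def read_on_host(self, i):
            self._to_host()
            return self.host[i]

    a = CoherentArray([1, 2, 3])
    a.write_on_device(0, 10)                # first device access: one transfer
    a.write_on_device(1, 20)                # no redundant transfer: device copy valid
    print(a.read_on_host(0), a.transfers)   # 10 2  (one copy back, two transfers total)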
Abstract:
We demonstrate that universal conductance fluctuations (UCF) can be used as a direct probe of the valley quantum states in disordered graphene. The UCF magnitude in graphene is suppressed by a factor of four at high carrier densities, where short-range disorder essentially breaks the degeneracy of the K and K' valleys, leading to a density-dependent crossover of symmetry class from symplectic near the Dirac point to orthogonal at high densities.
Abstract:
We address the problem of reconstructing a sparse signal from its DFT magnitude. We refer to this problem as the sparse phase retrieval (SPR) problem, which finds applications in tomography, digital holography, electron microscopy, etc. We develop a Fienup-type iterative algorithm, referred to as the Max-K algorithm, to enforce sparsity and successively refine the estimate of phase. We show that the Max-K algorithm possesses Cauchy convergence properties under certain conditions, that is, the MSE of reconstruction does not increase with iterations. We also formulate the problem of SPR as a feasibility problem, where the goal is to find a signal that is sparse in a known basis and whose Fourier transform magnitude is consistent with the measurement. Subsequently, we interpret the Max-K algorithm as alternating projections onto the object-domain and measurement-domain constraint sets and generalize it to a parameterized relaxation, known as the relaxed averaged alternating reflections (RAAR) algorithm. On the application front, we work with measurements acquired using a frequency-domain optical-coherence tomography (FDOCT) experimental setup. Experimental results on measured data show that the proposed algorithms exhibit good reconstruction performance compared with the direct inversion technique, homomorphic technique, and the classical Fienup algorithm without sparsity constraint; specifically, the autocorrelation artifacts and background noise are suppressed to a significant extent. We also demonstrate that the RAAR algorithm offers a broader framework for FDOCT reconstruction, of which the direct inversion technique and the proposed Max-K algorithm become special instances corresponding to specific values of the relaxation parameter.
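As a hedged sketch of the RAAR relaxation named in the abstract, the block below alternates reflections about a measurement set (measured Fourier magnitudes) and an object set (the K largest-magnitude entries); the projector choices, relaxation parameter, and iteration count are assumptions, not the paper's tuned settings.

    import numpy as np

    def project_measurement(x, magnitude):
        """Keep the current Fourier phase, impose the measured magnitude."""
        X = np.fft.fft(x)
        return np.fft.ifft(magnitude * np.exp(1j * np.angle(X))).real

    def project_sparse(x, K):
        """Keep only the K largest-magnitude entries (object-domain constraint)."""
        out = np.zeros_like(x)
        keep = np.argsort(np.abs(x))[-K:]
        out[keep] = x[keep]
        return out

    def raar(magnitude, K, beta=0.7, n_iter=300, seed=0):
        """Relaxed averaged alternating reflections (generic sketch):
        x <- (beta/2) * (R_S R_M + I) x + (1 - beta) * P_M x."""
        rng = np.random.default_rng(seed)
        x = rng.standard_normal(magnitude.shape)
        for _ in range(n_iter):
            pm = project_measurement(x, magnitude)
            rm = 2.0 * pm - x                       # reflection about the measurement set
            rs = 2.0 * project_sparse(rm, K) - rm   # reflection about the sparsity set
            x = 0.5 * beta * (rs + x) + (1.0 - beta) * pm
        return project_sparse(x, K)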
Abstract:
Neural activity across the brain shows both spatial and temporal correlations at multiple scales, and understanding these correlations is a key step toward understanding cortical processing. Correlation in the local field potential (LFP) recorded from two brain areas is often characterized by computing the coherence, which is generally taken to reflect the degree of phase consistency across trials between two sites. Coherence, however, depends on two factors, phase consistency as well as amplitude covariation across trials, but the spatial structure of amplitude correlations across sites and its contribution to coherence are not well characterized. We recorded LFP from an array of microelectrodes chronically implanted in the primary visual cortex of monkeys and studied correlations in amplitude across electrodes as a function of interelectrode distance. We found that amplitude correlations showed a trend similar to that of coherence as a function of frequency and interelectrode distance. Importantly, even when phases were completely randomized between two electrodes, amplitude correlations introduced significant coherence. To quantify the contributions of phase consistency and amplitude correlations to coherence, we simulated pairs of sinusoids with varying phase consistency and amplitude correlations. These simulations confirmed that amplitude correlations can significantly bias coherence measurements, resulting in either over- or underestimation of the true phase coherence. Our results highlight the importance of accounting for correlations in amplitude when using coherence to study phase relationships across sites and frequencies.
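A minimal version of the kind of simulation described (trial counts and amplitude distributions are illustrative, not the paper's): with completely random phases at both sites, shared trial-to-trial amplitude fluctuations still produce coherence well above the roughly 1/sqrt(trials) floor expected for fully independent data.

    import numpy as np

    def trial_coherence(X, Y):
        """Magnitude coherence of complex single-frequency estimates across trials."""
        num = np.abs(np.mean(X * np.conj(Y)))
        return num / np.sqrt(np.mean(np.abs(X) ** 2) * np.mean(np.abs(Y) ** 2))

    rng = np.random.default_rng(0)
    n_trials = 200
    ph1 = rng.uniform(0, 2 * np.pi, n_trials)     # phases fully randomized
    ph2 = rng.uniform(0, 2 * np.pi, n_trials)     # and independent across sites

    shared = rng.lognormal(0.0, 1.0, n_trials)    # amplitudes shared by both sites
    a1 = rng.lognormal(0.0, 1.0, n_trials)        # independent amplitudes
    a2 = rng.lognormal(0.0, 1.0, n_trials)

    print(trial_coherence(shared * np.exp(1j * ph1), shared * np.exp(1j * ph2)))
    print(trial_coherence(a1 * np.exp(1j * ph1), a2 * np.exp(1j * ph2)))
    # The first value comes out much larger despite the absence of phase consistency.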