40 results for quantization artifacts


Relevance: 10.00%

Abstract:

The generation of an entangled coherent state is one of the most important ingredients of quantum information processing using coherent states, and numerous schemes to achieve this task have recently been proposed. In order to generate travelling-wave entangled coherent states, cross-phase modulation, enhanced via the optical Kerr effect in a dense medium in an electromagnetically induced transparency (EIT) regime, seems very promising. In this scenario, we propose a fully quantized model of a double-EIT scheme recently proposed [D. Petrosyan and G. Kurizki, Phys. Rev. A 65, 033833 (2002)]; the quantization step is performed using a fully Hamiltonian approach. This allows us to write effective equations of motion for two interacting quantum fields of light that show how the dynamics of one field depends on the photon-number operator of the other. The preparation of a Schrödinger cat state, a superposition of two distinct coherent states, is briefly described. It is based on the nonlinear interaction of two light fields (initially prepared in coherent states) via double EIT, followed by a detection step using a 50:50 beam splitter and two photodetectors. To demonstrate the entanglement of an entangled coherent state, we suggest measuring the joint quadrature variance of the field, and we show that the entangled coherent states satisfy the sufficient condition for entanglement based on quadrature-variance measurement. We also show how robust the scheme is against low detection efficiency of the homodyne detectors.
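The photon-number dependence described above can be made explicit with a minimal cross-Kerr sketch (the notation and effective Hamiltonian H = ħχ n̂_a n̂_b are assumptions for illustration, not taken from the paper):

```latex
e^{-i\chi t\,\hat n_a \hat n_b}\,|\alpha\rangle_a|\beta\rangle_b
  \;=\; e^{-|\alpha|^2/2}\sum_{n=0}^{\infty}\frac{\alpha^{n}}{\sqrt{n!}}\,
        |n\rangle_a\,\bigl|\beta\,e^{-i\chi t n}\bigr\rangle_b ,
```

so each Fock component of mode a rotates the coherent amplitude of mode b; at χt = π the even and odd components map β to ±β, which is the basic mechanism behind cat-state preparation.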

Relevance: 10.00%

Abstract:

We present the first marine reservoir age and Delta R determination for the island of St. Helena, using marine mollusk radiocarbon dates obtained from a historical context of known age. This represents the first marine reservoir age and Delta R determination in the southern Atlantic Ocean within thousands of kilometers of the island. The depletion of C-14 in the shells indicates a rather larger reservoir age for that portion of the surface Atlantic than models indicate. The implication is that old upwelled water along the Namibian coast is transported a considerable distance, although the effect is likely to be variable on a decadal timescale. An artilleryman's button, together with other artifacts found in a midden, demonstrates the association of the mollusk shells with a narrow historical period, AD 1815-1835.

Relevance: 10.00%

Abstract:

Radiocarbon dating has been used infrequently as a chronological tool for research in Anglo-Saxon archaeology. Primarily, this is because the uncertainty of calibrated dates provides little advantage over traditional archaeological dating in this period. Recent advances in Bayesian methodology in conjunction with high-precision 14C dating have, however, created the possibility of both testing and refining the established Anglo-Saxon chronologies based on the typology of artifacts. The calibration process within such a confined age range, however, relies heavily on the structural accuracy of the calibration curve. We have previously reported decadal measurements on a section of the Irish oak chronology for the period AD 495–725 (McCormac et al. 2004). In this paper, we present decadal measurements for the periods AD 395–485 and AD 735–805, which extend the original calibration set.

Relevance: 10.00%

Abstract:

This paper is concerned with the universal (blind) image steganalysis problem and introduces a novel method to detect spatial-domain steganographic methods in particular. The proposed steganalyzer models linear dependencies of image rows/columns in local neighborhoods using a singular value decomposition (SVD) transform, and achieves content independence through a Wiener filtering process. Experimental results show that the novel method has superior performance compared with its counterparts on spatial-domain steganography. Experiments also demonstrate the reasonable ability of the method to detect discrete cosine transform-based steganography as well as the perturbation quantization method.
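A minimal sketch of the two ingredients named above, not the authors' exact pipeline: a Wiener filter suppresses image content so that the residual carries noise and any stego signal, and the singular values of local neighborhoods of that residual serve as features. The block size, filter window, and pooling are my own illustrative choices.

```python
import numpy as np
from scipy.signal import wiener

def svd_features(image, block=8):
    """Wiener-filter the image to suppress content, then pool the
    normalized singular-value spectra of non-overlapping
    block x block neighborhoods of the residual as features."""
    img = np.asarray(image, dtype=float)
    residual = img - wiener(img, mysize=(5, 5))  # content-suppressed part
    h, w = residual.shape
    feats = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            s = np.linalg.svd(residual[i:i + block, j:j + block],
                              compute_uv=False)
            feats.append(s / (s.sum() + 1e-12))  # normalized spectrum
    return np.mean(feats, axis=0)                # pooled feature vector

rng = np.random.default_rng(0)
cover = rng.normal(128.0, 10.0, size=(64, 64))
f = svd_features(cover)
print(f.shape)
```

A classifier (trained on cover and stego examples) would then operate on these pooled feature vectors.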

Relevance: 10.00%

Abstract:

This paper investigates the application of complex wavelet transforms to the field of digital data hiding. Complex wavelets offer improved directional selectivity and shift invariance over their discretely sampled counterparts, allowing for better adaptation of watermark distortions to the host media. Two methods of deriving visual models for the watermarking system are adapted to the complex wavelet transforms and their performances are compared. To improve capacity, a spread-transform embedding algorithm is devised; this combines the robustness of spread-spectrum methods with the high capacity of quantization-based methods. Using established information-theoretic methods, limits of watermark capacity are derived that demonstrate the superiority of complex wavelets over discretely sampled wavelets. Finally, results for the algorithm against commonly used attacks demonstrate its robustness and the improved performance offered by complex wavelet transforms.
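The spread-transform idea can be sketched independently of the wavelet domain: project the host vector onto a spreading direction, quantize the projection with one of two interleaved quantizers (dither modulation) to encode a bit, and apply the correction back along the spreading direction only. The function names and the step size delta below are illustrative assumptions.

```python
import numpy as np

def st_embed(host, bit, u, delta=1.0):
    """Spread-transform dither modulation: quantize the projection of
    `host` onto unit vector `u` onto the coset for `bit` (0 or 1)."""
    u = u / np.linalg.norm(u)
    p = host @ u                               # scalar projection
    dither = bit * delta / 2.0                 # two interleaved lattices
    q = np.round((p - dither) / delta) * delta + dither
    return host + (q - p) * u                  # correct along u only

def st_extract(received, u, delta=1.0):
    """Decode by finding the nearer coset of the projection."""
    u = u / np.linalg.norm(u)
    p = received @ u
    d0 = abs(p - np.round(p / delta) * delta)
    d1 = abs(p - delta / 2 - np.round((p - delta / 2) / delta) * delta)
    return 0 if d0 <= d1 else 1

rng = np.random.default_rng(1)
u = rng.normal(size=16)                        # spreading direction
x = rng.normal(size=16)                        # host coefficients
marked = st_embed(x, 1, u, delta=0.8)
print(st_extract(marked, u, delta=0.8))        # prints 1
```

Spreading the quantization error across many coefficients is what buys spread-spectrum-like robustness while keeping quantization-style capacity.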

Relevance: 10.00%

Abstract:

The many-electron-correlated scattering (MECS) approach to quantum electronic transport was investigated in the linear-response regime [I. Bâldea and H. Köppel, Phys. Rev. B 78, 115315 (2008)]. The authors suggested, based on numerical calculations, that the manner in which the method imposes boundary conditions prevents it from reproducing the well-known phenomenon of conductance quantization. We introduce an analytical model and demonstrate that conductance quantization is correctly obtained using open-system boundary conditions within the MECS approach.

Relevance: 10.00%

Abstract:

For any proposed software project, once the software requirements specification has been established, requirements changes may result not only in a modification of the requirements specification but also in a series of modifications to all existing artifacts produced during development. Effective and flexible management of requirements changes is therefore necessary. In this paper, we present an approach to managing requirements changes based on Booth's negotiation-style framework for belief revision. Informally, we regard the current requirements specification as a belief set about the system-to-be, and the requirements-change request as new information about the same system-to-be; executing the change is then a process of revising beliefs about the system-to-be. We design a family of belief negotiation models appropriate for different processes of requirements revision, including settings in which the change request is fully accepted, the current specification is fully preserved, or the two reach a compromise. In particular, the prioritization of requirements plays an important role in reaching an agreement in each belief negotiation model designed in this paper.
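A toy illustration of the prioritized-revision idea, which is my own drastic simplification and not the paper's formal negotiation models: requirements are signed literals, the change request is merged with the specification, and conflicts are resolved in favor of higher priority. Giving the request top priority then corresponds to the "fully accepted" setting.

```python
def revise(spec, change):
    """spec, change: lists of (priority, literal), where literals are
    strings like 'A' or '-A' and higher priority wins a conflict.
    Returns a consistent revised specification as a set of literals."""
    merged = sorted(spec + change, key=lambda pr: -pr[0])
    accepted = set()
    for _, lit in merged:
        neg = lit[1:] if lit.startswith('-') else '-' + lit
        if neg not in accepted:          # keep only if still consistent
            accepted.add(lit)
    return accepted

spec = [(2, 'web_ui'), (1, 'local_storage')]
change = [(3, '-local_storage'), (1, 'cloud_storage')]
print(sorted(revise(spec, change)))
```

Here the high-priority change request overrides 'local_storage', while the untouched parts of the specification survive, mirroring the compromise behavior described above.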

Relevance: 10.00%

Abstract:

Studies concerning the physiological significance of Ca2+ sparks often depend on the detection and measurement of large populations of events in noisy microscopy images. Automated detection methods have been developed to quickly and objectively distinguish potential sparks from noise artifacts. However, previously described algorithms are not suited to the reliable detection of sparks in images where the local baseline fluorescence and noise properties can vary significantly, and risk introducing additional bias when applied to such data sets. Here, we describe a new, conceptually straightforward approach to spark detection in linescans that addresses this issue by combining variance stabilization with local baseline subtraction. We also show that in addition to greatly increasing the range of images in which sparks can be automatically detected, the use of a more accurate noise model enables our algorithm to achieve similar detection sensitivities with fewer false positives than previous approaches when applied both to synthetic and experimental data sets. We propose, therefore, that it might be a useful tool for improving the reliability and objectivity of spark analysis in general, and describe how it might be further optimized for specific applications.
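The two steps named above can be sketched for a single 1-D fluorescence trace; the Anscombe transform as the variance stabilizer and a running median as the local baseline are illustrative choices, not necessarily the authors' exact algorithm.

```python
import numpy as np
from scipy.ndimage import median_filter

def detect_sparks(trace, win=51, k=4.0):
    """Variance-stabilize a photon-count trace (Anscombe transform),
    subtract a running-median local baseline, and flag samples more
    than k robust standard deviations above it."""
    stab = 2.0 * np.sqrt(np.asarray(trace, float) + 3.0 / 8.0)
    baseline = median_filter(stab, size=win)     # local baseline
    resid = stab - baseline
    # robust sigma via median absolute deviation
    sigma = 1.4826 * np.median(np.abs(resid - np.median(resid)))
    return np.flatnonzero(resid > k * max(sigma, 1e-9))

rng = np.random.default_rng(2)
counts = rng.poisson(20.0, size=2000).astype(float)
counts[700:706] += 60.0                          # synthetic "spark"
hits = detect_sparks(counts)
print(hits)
```

Because both the baseline and the noise scale are estimated locally and robustly, the same threshold k works across regions with different baseline fluorescence, which is the problem the abstract highlights.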

Relevance: 10.00%

Abstract:

We consider the stimulated Raman transition between two long-lived states via multiple intermediate states, such as between the hyperfine ground states of the alkali-metal atoms. We present a concise treatment of the general, multilevel, off-resonant case, and we show how the light shift emerges naturally in this approach. We illustrate our results by application to alkali-metal atoms, with specific reference to cesium. We also comment on some artifacts, due solely to the geometrical overlap of states, which are relevant to existing experiments.
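For orientation, the standard adiabatic-elimination result for the off-resonant multilevel case (notation assumed here, not necessarily the paper's) gives an effective two-photon Rabi frequency and light shifts summed over the intermediate states i:

```latex
\Omega_{\mathrm{eff}} \;=\; \sum_{i}\frac{\Omega_{1i}\,\Omega_{2i}^{*}}{2\Delta_{i}},
\qquad
\delta_{1} \;=\; \sum_{i}\frac{|\Omega_{1i}|^{2}}{4\Delta_{i}},
\qquad
\delta_{2} \;=\; \sum_{i}\frac{|\Omega_{2i}|^{2}}{4\Delta_{i}},
```

where Ω_{1i} and Ω_{2i} couple the two long-lived states to intermediate state i, Δ_i is the detuning from that state, and the differential shift δ₁ − δ₂ is the light shift of the Raman resonance.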

Relevance: 10.00%

Abstract:

Artifact removal from physiological signals is an essential component of the biosignal processing pipeline. The need for powerful and robust methods for this process has become particularly acute as healthcare technology deployment undergoes transition from the current hospital-centric setting toward a wearable and ubiquitous monitoring environment. Currently, determining the relative efficacy and performance of the multiple artifact removal techniques available on real-world data can be problematic, due to incomplete information on the uncorrupted desired signal. The majority of techniques are presently evaluated using simulated data, and therefore, the quality of the conclusions is contingent on the fidelity of the model used. Consequently, in the biomedical signal processing community, there is considerable focus on the generation and validation of appropriate signal models for use in artifact suppression. Most approaches rely on mathematical models which capture suitable approximations to the signal dynamics or underlying physiology and, therefore, introduce some uncertainty into subsequent predictions of algorithm performance. This paper describes a more empirical approach to modeling the desired signal, demonstrated for functional brain monitoring tasks, which allows the procurement of a ground-truth signal that is highly correlated with the true desired signal underlying the artifact-contaminated recording. The availability of this ground truth, together with the corrupted signal, can then aid in determining the efficacy of selected artifact removal techniques. A number of commonly implemented artifact removal techniques were evaluated using the described methodology to validate the proposed novel test platform. © 2012 IEEE.
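Given such a ground truth, evaluating an artifact-removal method reduces to comparing its output against the known clean signal. A minimal sketch, where the synthetic signals and the choice of metrics (correlation and signal-to-error ratio) are mine:

```python
import numpy as np

def removal_metrics(clean, cleaned):
    """Compare an artifact-removal output against the known clean
    signal: Pearson correlation and signal-to-error ratio in dB."""
    clean = np.asarray(clean, float)
    cleaned = np.asarray(cleaned, float)
    r = np.corrcoef(clean, cleaned)[0, 1]
    ser = 10 * np.log10(np.sum(clean**2) / np.sum((clean - cleaned)**2))
    return r, ser

t = np.linspace(0.0, 1.0, 500)
clean = np.sin(2 * np.pi * 8 * t)             # known "desired" signal
artifact = 2.0 * (t > 0.5)                    # step-like motion artifact
corrupted = clean + artifact
recovered = corrupted - 0.9 * artifact        # imperfect removal attempt
r_before, ser_before = removal_metrics(clean, corrupted)
r_after, ser_after = removal_metrics(clean, recovered)
print(r_after > r_before, ser_after > ser_before)  # prints True True
```

With real data the same comparison would be run per technique, ranking methods by how much they move the corrupted signal back toward the ground truth.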

Relevance: 10.00%

Abstract:

A bit-level systolic array system for performing a binary tree vector quantization (VQ) codebook search is described. This is based on a highly regular VLSI building block circuit. The system in question exhibits a very high data rate suitable for a range of real-time applications. A technique is described which reduces the storage requirements of such a system by 50%, with a corresponding decrease in hardware complexity.
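The search being accelerated can be sketched in software: a binary tree of test vectors routes each input down one path, so a depth-d tree replaces a 2^d-entry exhaustive search with d two-way decisions. The tree-building rule below (splitting the codebook in half and steering by half-centroids) is an illustrative choice, not the paper's circuit.

```python
import numpy as np

def build_tree(codebook):
    """Recursively split the codebook in half; each node keeps the
    centroid of each half to steer the descent (illustrative split)."""
    if len(codebook) == 1:
        return codebook[0]
    mid = len(codebook) // 2
    left, right = codebook[:mid], codebook[mid:]
    return (left.mean(axis=0), build_tree(left),
            right.mean(axis=0), build_tree(right))

def tree_search(node, x):
    """Descend the tree with one two-way distance test per level."""
    while isinstance(node, tuple):
        cl, left, cr, right = node
        node = left if np.sum((x - cl)**2) <= np.sum((x - cr)**2) else right
    return node

rng = np.random.default_rng(3)
codebook = np.sort(rng.normal(size=(8, 1)), axis=0)  # 1-D, sorted
tree = build_tree(codebook)
print(tree_search(tree, np.array([0.1])))
```

In the systolic realization each level's decision maps onto a pipeline stage, which is what gives the high data rate described above.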

Relevance: 10.00%

Abstract:

A number of high-performance VLSI architectures for real-time image coding applications are described. In particular, attention is focused on circuits for computing the 2-D DCT (discrete cosine transform) and for 2-D vector quantization. The former circuits are based on Winograd algorithms and comprise a number of bit-level systolic arrays with a bit-serial, word-parallel input. The latter circuits exhibit a similar data organization and consist of a number of inner product array circuits. Both circuits are highly regular and allow extremely high data rates to be achieved through extensive use of parallelism.
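The 2-D DCT in such circuits is typically computed separably, as 1-D transforms along one axis and then the other. A software sketch of that row-column decomposition, using SciPy's DCT-II rather than a Winograd formulation:

```python
import numpy as np
from scipy.fft import dct

def dct2(block):
    """Separable 2-D DCT-II: 1-D DCT along each axis in turn."""
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

block = np.full((8, 8), 100.0)   # flat 8x8 block
coeffs = dct2(block)
print(coeffs[0, 0])              # prints 800.0 (DC term); all AC terms ~0
```

The separability is what lets the hardware reuse one bit-level systolic 1-D transform array for both passes.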

Relevance: 10.00%

Abstract:

A bit-level systolic array system for performing a binary tree Vector Quantization codebook search is described. This consists of a linear chain of regular VLSI building blocks and exhibits data rates suitable for a wide range of real-time applications. A technique is described which reduces the computation required at each node in the binary tree to that of a single inner product operation. This method applies to all the common distortion measures (including the Euclidean distance, the Weighted Euclidean distance and the Itakura-Saito distortion measure) and significantly reduces the hardware required to implement the tree search system. © 1990 Kluwer Academic Publishers.
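For the Euclidean case the single-inner-product reduction can be checked directly: when choosing between children c1 and c2 for input x, comparing ||x − c1||² with ||x − c2||² is equivalent to thresholding one inner product, since ||x − c||² = ||x||² − 2x·c + ||c||² and the ||x||² term cancels. A sketch (function names are mine):

```python
import numpy as np

def node_decision(x, c1, c2):
    """Full two-distance decision at a tree node."""
    return 0 if np.sum((x - c1)**2) <= np.sum((x - c2)**2) else 1

def inner_product_decision(x, c1, c2):
    """Same decision from a single inner product against a
    direction and threshold that can be precomputed per node."""
    w = c1 - c2                                    # precomputed
    t = (np.sum(c1**2) - np.sum(c2**2)) / 2.0      # precomputed
    return 0 if x @ w >= t else 1

rng = np.random.default_rng(4)
c1, c2 = rng.normal(size=4), rng.normal(size=4)
for _ in range(100):
    x = rng.normal(size=4)
    assert node_decision(x, c1, c2) == inner_product_decision(x, c1, c2)
print("decisions agree")
```

Since w and t depend only on the codebook, each tree node reduces to one inner product and one comparison at search time, which is the hardware saving the abstract describes.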

Relevance: 10.00%

Abstract:

The design of a System-on-a-Chip (SoC) demonstrator for a baseline JPEG encoder core is presented. This combines a highly optimized Discrete Cosine Transform (DCT) and quantization unit with an entropy coder which has been realized using off-the-shelf synthesizable IP cores (Run-length coder, Huffman coder and data packer). When synthesized in a 0.35 µm CMOS process, the core can operate at speeds up to 100 MHz and contains 50 k gates plus 11.5 kbits of RAM. This is approximately 20% less than similar JPEG encoder designs reported in literature. When targeted at FPGA the core can operate up to 30 MHz and is capable of compressing 9-bit full-frame color input data at NTSC or PAL rates.
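The DCT-and-quantization stage of a baseline JPEG encoder can be sketched in a few lines. The table below is the example luminance quantization table from the JPEG standard (Annex K); quality scaling and the entropy-coding stages are omitted.

```python
import numpy as np
from scipy.fft import dctn

# Example luminance quantization table (JPEG standard, Annex K)
Q_LUMA = np.array([
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99]])

def encode_block(block):
    """Level-shift an 8x8 sample block, apply the 2-D DCT,
    and quantize element-wise against the table."""
    coeffs = dctn(block.astype(float) - 128.0, norm='ortho')
    return np.round(coeffs / Q_LUMA).astype(int)

block = np.full((8, 8), 168, dtype=np.uint8)   # flat gray block
q = encode_block(block)
print(q[0, 0])                                 # prints 20; all AC terms 0
```

The quantized coefficients would then feed the run-length, Huffman, and packing stages realized by the off-the-shelf IP cores in the design above.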

Relevance: 10.00%

Abstract:

An overview is given of a systolic VLSI compiler (SVC) tool currently under development for the automated design of high-performance digital signal processing (DSP) chips. Attention is focused on the design of systolic vector quantization chips for use in both speech and image coding systems. The software in question consists of a cell library, silicon assemblers, simulators, test pattern generators, and a specially designed graphical shell interface, which together make the tool expandable and user-friendly. It allows very high performance digital coding systems to be rapidly designed in VLSI.