995 results for Sampling Theorem


Relevance:

100.00%

Publisher:

Abstract:

The classical Kramer sampling theorem provides a method for obtaining orthogonal sampling formulas. Moreover, it has been the cornerstone of a significant mathematical literature on sampling theorems associated with differential and difference problems. In this work we provide, in a unified way, new and old generalizations of this result corresponding to various settings; all of these generalizations are illustrated with examples. All the situations treated in the paper share a basic approach: the functions to be sampled are obtained by duality in a separable Hilbert space H through an H-valued kernel K defined on an appropriate domain.
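
For reference, the classical Kramer theorem underlying all of these generalizations can be stated as follows; the notation below is the standard textbook one and is not taken from the paper itself.

```latex
% Classical Kramer sampling theorem (standard statement; generic notation).
% Suppose K(x,t) is a kernel with K(., t) in L^2(I) for every t, and there
% is a sequence {t_n} such that {K(., t_n)} is a complete orthogonal
% family in L^2(I). Then every function of the form
\[
f(t) = \int_I g(x)\,\overline{K(x,t)}\,dx, \qquad g \in L^2(I),
\]
% admits the orthogonal sampling expansion
\[
f(t) = \sum_n f(t_n)\, S_n(t), \qquad
S_n(t) = \frac{\int_I K(x,t_n)\,\overline{K(x,t)}\,dx}{\|K(\cdot,t_n)\|^2}.
\]
```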

Relevance:

70.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 94A12, 94A20, 30D20, 41A05.

Relevance:

60.00%

Publisher:

Abstract:

The classical approach to A/D conversion has been uniform sampling, and we get perfect reconstruction of bandlimited signals by satisfying the Nyquist sampling theorem. We propose a non-uniform sampling scheme based on level-crossing (LC) time information. We show stable reconstruction of bandpass signals with the correct scale factor, and hence a unique reconstruction, from the non-uniform time information alone. For reconstruction from the level crossings we use sparse-reconstruction-based optimization, constraining the bandpass signal to be sparse in its frequency content. Whereas the literature resorts to an overdetermined system of equations, we use an underdetermined approach together with a sparse reconstruction formulation. We obtain a reconstruction SNR > 20 dB and perfect support recovery with probability close to 1 in the noiseless case, and with lower probability in the noisy case. Randomly picking LCs from different levels over the same limited signal duration, for the same amount of information, is seen to be advantageous for reconstruction.
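
To illustrate the kind of recovery described, here is a generic Python sketch, not the authors' scheme: uniform random sample times stand in for level-crossing instants, and orthogonal matching pursuit stands in for their sparse-reconstruction formulation; all sizes and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frequency-sparse test signal: K active tones out of N DFT bins.
N, K, M = 256, 4, 60                 # bins, sparsity, non-uniform samples
support = rng.choice(N, K, replace=False)
coeffs = rng.standard_normal(K) + 1j * rng.standard_normal(K)

t = np.sort(rng.uniform(0, 1, M))    # non-uniform sample times
                                     # (stand-in for level-crossing instants)
freqs = np.arange(N)
A = np.exp(2j * np.pi * t[:, None] * freqs[None, :]) / np.sqrt(N)
y = A[:, support] @ coeffs           # M < N: underdetermined measurements

# Orthogonal matching pursuit: greedily pick the best-correlated atom,
# then re-fit all chosen atoms by least squares.
residual, chosen = y.copy(), []
for _ in range(K):
    k = int(np.argmax(np.abs(A.conj().T @ residual)))
    chosen.append(k)
    x_ls, *_ = np.linalg.lstsq(A[:, chosen], y, rcond=None)
    residual = y - A[:, chosen] @ x_ls

print("true support:     ", sorted(support))
print("recovered support:", sorted(chosen))
```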

Relevance:

60.00%

Publisher:

Abstract:

The discretization size is limited by the sampling theorem: the limit is one half of the wavelength at the highest frequency of the problem. However, one half of the wavelength is an ideal value. In practice, the discretization size needed to ensure the accuracy of the simulation in the traditional finite element method is much smaller than this value. The possible reason for this phenomenon is analyzed in this paper, and an efficient method is given to improve the simulation accuracy.
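
To make the stated limit concrete, in generic notation not used in the abstract: for a wave problem with propagation speed c and maximum frequency f_max, the sampling-theorem bound on the element size h is

```latex
\[
\lambda_{\min} = \frac{c}{f_{\max}}, \qquad h \le \frac{\lambda_{\min}}{2},
\]
```

i.e. at least two elements per shortest wavelength. The abstract's point is that, in practice, traditional FEM needs h far below this bound to control accuracy (rules of thumb on the order of ten elements per wavelength are common).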

Relevance:

60.00%

Publisher:

Abstract:

The dissertation addresses the problems of signal reconstruction and data restoration in seismic data processing. It takes signal-representation methods as its main thread and seismic information reconstruction (signal separation and trace interpolation) as its core. On signal representation in natural bases, I present the fundamentals and algorithms of ICA and its original applications to the separation of natural earthquake signals and of survey seismic signals. On signal representation in deterministic bases, the dissertation proposes least-squares-inversion regularization methods for seismic data reconstruction, sparseness constraints, preconditioned conjugate gradient (PCG) methods, and their applications to seismic deconvolution, Radon transformation, etc. The core contents concern a de-aliased uneven seismic data reconstruction algorithm and its application to seismic interpolation. Although the dissertation discusses two cases of signal representation, they can be integrated into one framework, because both deal with the restoration of signals or information: the former reconstructs original signals from mixed signals, the latter reconstructs complete data from sparse or irregular data. Their common goal is to provide pre-processing and post-processing methods for seismic pre-stack depth migration.

ICA can separate original signals from mixed signals, or abstract the basic structure from analyzed data. I survey the fundamentals, algorithms and applications of ICA. Comparing with the KL transformation, I propose the concept of the independent components transformation (ICT). On the basis of the negentropy measure of independence, I implement FastICA and improve it via the covariance matrix. By analyzing the characteristics of seismic signals, I introduce ICA into seismic signal processing, a first in the geophysical community, and implement the separation of noise from seismic signal. Synthetic and real data examples show the usability of ICA for seismic signal processing, and initial results are achieved. ICA is applied to separating earthquake converted waves from multiples in a sedimentary area, with good results, leading to a more reasonable interpretation of subsurface discontinuities. The results show the prospects for applying ICA to geophysical signal processing. By virtue of the relationship between ICA and blind deconvolution, I survey seismic blind deconvolution and discuss the prospects of applying ICA to it, with two possible solutions. The relationship between PCA, ICA and the wavelet transform is stated, and it is proved that the reconstruction of wavelet prototype functions is a Lie group representation. In addition, an over-sampled wavelet transform is proposed to enhance seismic data resolution, and is validated by numerical examples.

The key to pre-stack depth migration is the regularization of pre-stack seismic data, for which seismic interpolation and missing-data reconstruction are necessary procedures. First, I review seismic imaging methods in order to argue the critical effect of regularization. Reviewing seismic interpolation algorithms, I observe that de-aliased uneven data reconstruction is still a challenge. The fundamentals of seismic reconstruction are discussed first; then sparseness constraints on least-squares inversion and a preconditioned conjugate gradient solver are studied and implemented. Choosing a constraint term with a Cauchy distribution, I program the PCG algorithm and implement sparse seismic deconvolution and high-resolution Radon transformation by PCG, in preparation for seismic data reconstruction. As for seismic interpolation, de-aliased even-data interpolation and uneven-data reconstruction each work well separately, but they cannot be combined with each other. In this dissertation, a novel Fourier-transform-based method and algorithm are proposed that can reconstruct seismic data that are both uneven and aliased. I formulate band-limited data reconstruction as a minimum-norm least-squares inversion problem with an adaptive DFT-weighted norm regularization term. The inverse problem is solved by the preconditioned conjugate gradient method, which makes the solutions stable and quickly convergent. Under the assumption that seismic data consist of finitely many linear events, it follows from the sampling theorem that aliased events can be attenuated via least-squares weights predicted linearly from the low frequencies. Three application issues are discussed: even-gap trace interpolation, uneven-gap filling, and high-frequency trace reconstruction from low-frequency data constrained by a few high-frequency traces. Both synthetic and real data examples show that the proposed method is valid, efficient and applicable. The research is valuable for seismic data regularization and cross-well seismics. To meet the data requirements of 3D shot-profile depth migration, schemes must be adopted to make the data even and consistent with the velocity dataset. The methods of this dissertation are used to interpolate and extrapolate the shot gathers instead of simply embedding zero traces; the migration aperture is thereby enlarged and the migration result improved. The results show the effectiveness and practicability of the approach.
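
As a rough illustration of the central ingredient, here is a minimal numpy sketch of minimum weighted-norm Fourier reconstruction of an unevenly sampled trace, solved by conjugate gradients. It is a generic stand-in for the dissertation's adaptive DFT-weighted PCG scheme, not its implementation; every parameter and weighting choice below is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "trace": a few harmonics on a regular grid of N points,
# observed only at irregular (uneven) positions given by `mask`.
N = 128
n = np.arange(N)
signal = np.cos(2 * np.pi * 5 * n / N) + 0.5 * np.sin(2 * np.pi * 12 * n / N)
mask = np.sort(rng.choice(N, size=48, replace=False))
d = signal[mask]

# Unknown: DFT spectrum x, with data modeled as d = S F^H x (S = sampling).
# Solve the weighted-norm regularized normal equations
#   (F S^T S F^H + mu * W) x = F S^T d
# by plain conjugate gradients. W down-weights bins that a low-resolution
# pre-scan suggests are active -- a crude stand-in for the adaptive
# DFT-weighted regularization described in the abstract.
def FH(x):                       # inverse DFT (unitary), spectrum -> samples
    return np.fft.ifft(x) * np.sqrt(N)

def F(v):                        # forward DFT (unitary), samples -> spectrum
    return np.fft.fft(v) / np.sqrt(N)

def pad(v):                      # S^T: scatter observed values onto the grid
    out = np.zeros(N, dtype=complex)
    out[mask] = v
    return out

prescan = np.abs(F(pad(d)))      # rough spectral estimate from the data
W = 1.0 / (prescan / prescan.max() + 1e-2)   # small weight = favored bin
mu = 0.05

def normal_op(x):                # Hermitian positive definite operator
    return F(pad(FH(x)[mask])) + mu * W * x

b = F(pad(d))
x = np.zeros(N, dtype=complex)
r, p = b - normal_op(x), (b - normal_op(x))
for _ in range(200):
    Ap = normal_op(p)
    alpha = np.vdot(r, r) / np.vdot(p, Ap)
    x = x + alpha * p
    r_new = r - alpha * Ap
    beta = np.vdot(r_new, r_new) / np.vdot(r, r)
    p = r_new + beta * p
    r = r_new

recon = FH(x).real
print("relative error:", np.linalg.norm(recon - signal) / np.linalg.norm(signal))
```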

Relevance:

60.00%

Publisher:

Abstract:

The Shannon/Nyquist sampling theorem specifies that, to avoid losing information when capturing a signal, one must sample at least twice as fast as the signal bandwidth. In order to capture and represent compressible signals at a rate significantly below the Nyquist rate, a new method, called compressive sensing (CS), has therefore been proposed. CS theory asserts that one can recover certain signals from far fewer samples or measurements than traditional methods use. It employs non-adaptive linear projections that preserve the structure of the sparse signal; the signal is then reconstructed from these projections by an optimization process. CS is believed to have far-reaching implications, although most publications concentrate on signal processing fields (especially for images). In this paper, we provide a concise introduction to CS and then discuss some of its potential applications in structural engineering. The recorded vibration time history of a steel beam and the wave-propagation result on a steel rebar are studied in detail. CS is adopted to reconstruct the time histories using only parts of the signals. The results under different conditions are compared, and they confirm that CS will be a promising tool for structural engineering.
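
A minimal Python sketch of the pipeline the abstract outlines: non-adaptive random linear projections of a sparse signal, followed by optimization-based reconstruction (here plain iterative soft thresholding for the LASSO). This is a generic illustration, not the paper's procedure; all sizes and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Sparse "signal": N-dimensional with only K nonzeros (e.g. a vibration
# record that is compressible in some basis; here the basis is identity).
N, K, M = 512, 10, 120
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

# Non-adaptive linear projections: M << N random Gaussian measurements.
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x_true

# Reconstruction by ISTA (iterative soft thresholding) for the LASSO:
#   min_x 0.5 * ||Phi x - y||^2 + lam * ||x||_1
lam = 0.01
step = 1.0 / np.linalg.norm(Phi, 2) ** 2   # 1 / Lipschitz constant
x = np.zeros(N)
for _ in range(2000):
    grad = Phi.T @ (Phi @ x - y)
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```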

Relevance:

60.00%

Publisher:

Abstract:

The classical Kramer sampling theorem, which provides a method for obtaining orthogonal sampling formulas, can be formulated in a more general nonorthogonal setting. In this setting, a challenging problem is to characterize the situations in which the resulting nonorthogonal sampling formulas can be expressed as Lagrange-type interpolation series. In this article a necessary and sufficient condition is given in terms of the zero-removing property. Roughly speaking, this property concerns the stability of the sampled functions under the removal of a finite number of their zeros.
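
For context, a Lagrange-type interpolation series has the following standard form (generic notation, not the article's):

```latex
% {t_n} are the sampling points and P is an entire function whose zero
% set is exactly {t_n}; the series interpolates f at every t_n.
\[
f(t) = \sum_n f(t_n)\, \frac{P(t)}{(t - t_n)\, P'(t_n)}.
\]
```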

Relevance:

60.00%

Publisher:

Abstract:

The classical Kramer sampling theorem provides a method for obtaining orthogonal sampling formulas. In particular, when the involved kernel is analytic in the sampling parameter, the theorem can be stated in an abstract setting of reproducing kernel Hilbert spaces of entire functions that includes the classical Shannon sampling theory as a particular case. This abstract setting allows us to obtain a sort of converse result and to characterize when the sampling formula associated with an analytic Kramer kernel can be expressed as a Lagrange-type interpolation series. On the other hand, the de Branges spaces of entire functions satisfy orthogonal sampling formulas which can be written as Lagrange-type interpolation series. In this work some links between all these ideas are established.
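
For reference, the orthogonal sampling formulas in question take the standard reproducing-kernel form below (generic notation, not the work's own): if the kernel sections k(., t_n) form an orthogonal basis of the space, then

```latex
\[
f(t) = \sum_n f(t_n)\, \frac{k(t, t_n)}{k(t_n, t_n)},
\]
% since f(t_n) = <f, k(., t_n)> and ||k(., t_n)||^2 = k(t_n, t_n).
```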

Relevance:

30.00%

Publisher:

Abstract:

Teachers in many introductory statistics courses demonstrate the Central Limit Theorem by using a computer to draw a large number of random samples of size n from a population distribution and plot the resulting empirical sampling distribution of the sample mean. There are many computer applications that can be used for this (see, for example, the Rice Virtual Lab in Statistics: http://www.ruf.rice.edu/~lane/rvls.html). The effectiveness of such demonstrations has been questioned (see delMas et al (1999)), but in the work presented in this paper we do not rely on sampling distributions to convey or teach statistical concepts, only on the fact that the sampling distribution is independent of the distribution of the population, provided the sample size is sufficiently large.
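
A Python analogue of this demonstration (the Rice applets and the census data are not reproduced here; the skewed population below is invented for illustration):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)

# A skewed finite "population" standing in for a census data column.
population = rng.exponential(scale=2.0, size=20_000)

n, reps = 30, 3_000               # sample size and number of repetitions
means = np.array([rng.choice(population, n, replace=False).mean()
                  for _ in range(reps)])

# The empirical sampling distribution of the mean is close to normal
# even though the population itself is strongly skewed.
fig, axes = plt.subplots(1, 2, figsize=(9, 3))
axes[0].hist(population, bins=60)
axes[0].set_title("population")
axes[1].hist(means, bins=60)
axes[1].set_title(f"sample means (n={n})")
plt.tight_layout()
plt.show()
```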

We describe a lesson that starts out with a demonstration of the CLT but samples from a (finite) population for which actual census data are provided; doing this may help students relate more easily to the concepts: they can see the original data as a column of numbers and, if the samples are shown, they can also see random samples being taken. We continue with this theme of sampling from census data to teach the basic ideas of inference, and we end with standard resampling/bootstrap procedures.

We also demonstrate how Excel can provide a tool for developing learning objects to support the program; a workbook called Sampling.xls is available from www.deakin.edu.au/~rodneyc/PS > Sampling.xls.

Relevance:

20.00%

Publisher: