994 results for signal reconstruction


Relevance:

20.00%

Publisher:

Abstract:

We uncover the underlying potential energy landscape for a cellular network. We find that the potential energy landscape of the mitogen-activated protein kinase signal transduction network is funneled toward the global minimum. The funneled landscape is quite robust against random perturbations, which naturally explains robustness from a physical point of view. The ratio of the slope to the roughness of the landscape becomes a quantitative measure of the robustness of the network. The funneled landscape is a realization of the Darwinian principle of natural selection at the level of the cellular network, and it provides an optimal criterion for network connections and design. Our approach is general and can be applied to other cellular networks.
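
The slope-to-roughness ratio proposed above can be illustrated on a toy one-dimensional landscape. Nothing below comes from the paper: the moving-average smoothing, the window width, and all names are illustrative assumptions for a sketch of the measure.

```python
import numpy as np

def slope_roughness_ratio(U, window=11):
    """Slope: overall depth of the smoothed funnel; roughness: standard
    deviation of the local fluctuations around that smoothed shape."""
    U = np.asarray(U, dtype=float)
    kernel = np.ones(window) / window
    trend = np.convolve(U, kernel, mode="same")   # expose the funnel shape
    slope = trend.max() - trend.min()             # depth of the funnel
    roughness = np.std(U - trend)                 # local bumpiness
    return slope / roughness

# Example: a funneled landscape (quadratic well) plus random bumps.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 500)
U = 5.0 * x ** 2 + 0.2 * rng.standard_normal(x.size)
print(f"slope/roughness ratio: {slope_roughness_ratio(U):.1f}")
```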

Relevance:

20.00%

Publisher:

Abstract:

The present study reports an application of the searching combination moving window partial least squares (SCMWPLS) algorithm to the determination of ethenzamide and acetaminophen in quaternary powdered samples by near infrared (NIR) spectroscopy. Another purpose of the study was to examine the effects of the spectral resolution and signal-to-noise ratio of the Buchi NIRLab N-200 FT-NIR spectrometer equipped with an InGaAs detector. The informative spectral intervals of the NIR spectra of a series of quaternary powdered mixture samples were first located for ethenzamide and acetaminophen by moving window partial least squares regression (MWPLSR). These intervals were then further optimised by SCMWPLS for subsequent partial least squares (PLS) model development. The improved results are attributed both to less complex PLS models and to the higher accuracy of the predicted concentrations of ethenzamide and acetaminophen in the optimised informative spectral intervals, which are characterized by NIR bands. SCMWPLS is also demonstrated to be a viable route for wavelength selection.
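
The MWPLSR step described above lends itself to a short sketch. This is a minimal illustration only, assuming a spectra matrix X (samples by wavelength channels) and concentrations y; the window width, component count, and cross-validation setup are assumptions, and the SCMWPLS refinement itself is not shown.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

def mwplsr(X, y, window=21, n_components=3):
    """Slide a window across the wavelength channels and record the
    cross-validated RMSE of a PLS model built on each window."""
    n_channels = X.shape[1]
    rmse = np.full(n_channels, np.nan)
    for start in range(n_channels - window + 1):
        cols = slice(start, start + window)
        pls = PLSRegression(n_components=n_components)
        scores = cross_val_score(pls, X[:, cols], y, cv=5,
                                 scoring="neg_root_mean_squared_error")
        rmse[start + window // 2] = -scores.mean()  # index by window centre
    # Minima of the rmse curve mark the informative spectral intervals,
    # whose boundaries SCMWPLS would then refine.
    return rmse
```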

Relevance:

20.00%

Publisher:

Abstract:

The discrete wavelet transform (DWT) was applied to noise removal on capillary electrophoresis-electrochemiluminescence (CE-ECL) electropherograms. Several typical wavelet families, including Haar, Daublets, Coiflets, and Symmlets, were evaluated. Four threshold-determination methods (fixed form threshold, rigorous Stein's unbiased estimate of risk (rigorous SURE), heuristic SURE, and minimax), combined with hard and soft thresholding, were compared. A denoising study on synthetic signals showed that the Symmlet 4 wavelet with a decomposition level of 5 and heuristic SURE with hard thresholding provides the optimum denoising strategy. Using this strategy, the noise on CE-ECL electropherograms could be removed adequately. Compared with the Savitzky-Golay and Fourier transform denoising methods, DWT is an efficient method for noise removal with better preservation of peak shapes.
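
A sketch of the favoured strategy (Symmlet 4, level-5 decomposition, hard thresholding) using PyWavelets. PyWavelets has no built-in heuristic-SURE rule, so the universal (fixed form) threshold stands in for it here; that substitution and all names are assumptions.

```python
import numpy as np
import pywt

def dwt_denoise(signal, wavelet="sym4", level=5):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Estimate the noise level from the finest detail coefficients (MAD rule).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))  # universal threshold
    # Hard-threshold the detail coefficients; leave the approximation intact.
    coeffs[1:] = [pywt.threshold(c, thr, mode="hard") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)

# Usage: denoised = dwt_denoise(electropherogram)
```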

Relevance:

20.00%

Publisher:

Abstract:

C-37 unsaturated alkenones were analyzed in a core retrieved from the middle Okinawa Trough. The calculated U-37(K') displays a trend generally parallel to those of the oxygen isotopic compositions of two planktonic foraminiferal species, Neogloboquadrina dutertrei and Globigerinoides sacculifer, suggesting that in this region SST has varied in phase with global ice volume change over the last glacial-interglacial cycle. The U-37(K')-derived SST ranged from ca. 24.0 to 27.5 degrees C, with the highest value (27.5 degrees C) occurring in marine isotope stage 5 and the lowest (~24.0 degrees C) in marine isotope stage 2. This trend is consistent with continental records from the East Asian monsoon domain and marine records from the Equatorial Pacific. The deglacial increase of the U-37(K')-derived SST is ~2.4 degrees C from the Last Glacial Maximum to the Holocene.
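
The abstract does not state which U-37(K') calibration was applied; as an illustration only, the widely used Müller et al. (1998) global core-top calibration, U-37(K') = 0.033 SST + 0.044, is assumed below.

```python
# Assumed calibration (Mueller et al., 1998); the paper's own choice is not stated.
def sst_from_uk37(uk37):
    return (uk37 - 0.044) / 0.033  # SST in degrees C

print(round(sst_from_uk37(0.95), 1))  # U-37(K') of ~0.95 maps to ~27.5 degrees C
```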

Relevance:

20.00%

Publisher:

Abstract:

Empirical Orthogonal Function (EOF) analysis is used in this study to generate the main eigenvector fields of historical temperature for the China Seas (here referring to Chinese marine territories) and adjacent waters from 1930 to 2002 (510 143 profiles). A temperature profile is reconstructed from several subsurface in situ temperature observations, and the thermocline is estimated using the model. The results show that: 1) For the study area, the first four principal components explain 95% of the overall variance, and the vertical distribution of temperature is most stable when the in situ temperature observations near the surface are used. 2) Model verification against observed CTD data from the East China Sea (ECS), the South China Sea (SCS), and the areas around Taiwan Island shows that the reconstructed profiles correlate highly with the observed ones (confidence level > 95%) and, in particular, describe the characteristics of the thermocline well. The average errors between the reconstructed and observed profiles in these three areas are 0.69°C, 0.52°C, and 1.18°C, respectively, and the model RMS error is less than or close to the climatological error. The statistical model can thus estimate the vertical structure of the temperature profile well. 3) Comparing the thermocline characteristics of the reconstructed and observed profiles, the results in the ECS show average absolute errors of 1.5 m, 1.4 m, and 0.17°C/m, and average relative errors of 24.7%, 8.9%, and 22.6%, for the upper thermocline boundary, the lower thermocline boundary, and the gradient, respectively. Although the relative errors are appreciable, the absolute errors are small. In the SCS, the average absolute errors are 4.1 m, 27.7 m, and 0.007°C/m, and the average relative errors are 16.1%, 16.8%, and 9.5%, respectively, all < 20%. Although the average absolute error of the lower thermocline boundary is considerable, it is small (16.8%) relative to the average depth of that boundary (165 m). The model can therefore be used to estimate the thermocline well.
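
The reconstruction scheme described above reduces to a short sketch: derive vertical EOFs from the historical profiles, then fit the amplitudes of the leading modes to the sparse observations by least squares. The four retained modes follow the abstract; the array names and everything else are illustrative assumptions.

```python
import numpy as np

def fit_eofs(profiles, n_modes=4):
    """profiles: (n_profiles, n_depths) historical temperature matrix."""
    mean = profiles.mean(axis=0)
    _, _, vt = np.linalg.svd(profiles - mean, full_matrices=False)
    return mean, vt[:n_modes]            # climatology and leading EOFs

def reconstruct(mean, eofs, obs_idx, obs_values):
    """Solve for mode amplitudes from sparse observations, then rebuild."""
    A = eofs[:, obs_idx].T               # (n_obs, n_modes) design matrix
    b = obs_values - mean[obs_idx]       # observed anomalies
    amps, *_ = np.linalg.lstsq(A, b, rcond=None)
    return mean + amps @ eofs            # full-depth temperature profile
```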

Relevance:

20.00%

Publisher:

Abstract:

Offshore seismic exploration involves high investment and high risk, and the data suffer from problems such as multiples, so high-resolution, high-S/N processing of marine seismic data has become an important task. In this paper, multi-scale decomposition of both prestack and poststack seismic data based on the wavelet and Hilbert-Huang transforms, together with the theory of phase deconvolution, is proposed from an analysis of marine seismic exploration, a survey of the literature, and an integration of current mainstream and emerging technology, and the related algorithms are studied. The pyramid algorithm of decomposition and reconstruction follows from the Mallat algorithm of the discrete wavelet transform; it is introduced here into seismic data processing, and its validity is shown by tests with field data. The main idea of the Hilbert-Huang transform is empirical mode decomposition (EMD), with which any complicated data set can be decomposed into a finite and often small number of intrinsic mode functions that admit a well-behaved Hilbert transform. After the decomposition, an analytic signal is constructed by the Hilbert transform, from which the instantaneous frequency and amplitude, and hence the Hilbert spectrum, can be obtained. This decomposition method is adaptive and highly efficient; since it is based on the local characteristic time scales of the data, it is applicable to nonlinear and non-stationary processes. The phenomena of fitting overshoot, undershoot, and end swings in the Hilbert-Huang transform are analyzed, and effective methods for eliminating them are studied in the paper. Multi-scale decomposition of both prestack and poststack seismic data permits amplitude-preserving processing, greatly enhances seismic resolution, and overcomes the problem that the conventional method cannot restore the amplitudes of different frequency components uniformly. The phase deconvolution method overcomes the minimum-phase limitation of traditional deconvolution and better matches the practical fact that seismic wavelets are mixed phase, so it gives a more reliable result. In the applied research, high-resolution, relative-amplitude-preserving processing results were obtained by careful analysis and by applying the above methods to seismic data from four different target areas of the China Sea. Finally, a set of processing flows and methods was formed, which has been applied in actual production and has achieved good progress and considerable economic benefit.
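
The instantaneous-attribute step is compact enough to sketch: build the analytic signal with the Hilbert transform and read off instantaneous amplitude, phase, and frequency. The EMD stage (sifting each trace into intrinsic mode functions) is omitted here, and the names are illustrative.

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_attributes(trace, dt):
    analytic = hilbert(trace)                        # trace + i * Hilbert(trace)
    amplitude = np.abs(analytic)                     # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic))            # instantaneous phase
    frequency = np.diff(phase) / (2.0 * np.pi * dt)  # instantaneous frequency (Hz)
    return amplitude, phase, frequency
```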

Relevance:

20.00%

Publisher:

Abstract:

The modeling formula based on the seismic wavelet can simulate zero-phase and mixed-phase wavelets well, and approximates maximum-phase and minimum-phase wavelets in a certain sense. The modeled wavelet can be used as a wavelet function after a suitable modification term is added so that the required conditions are met. On the basis of the modified Morlet wavelet, a derivative wavelet function has been derived. As a basic wavelet, it can be used for high-resolution frequency-division processing and instantaneous feature extraction, in accordance with the expansion of the signal in the time and scale domains by each constructed wavelet. An application example proves the effectiveness and reasonableness of the method. Based on an analysis of SVD (singular value decomposition) filtering, and by taking this wavelet as the basic wavelet and combining the SVD filter with the wavelet transform, a new multi-dimensional, multi-space de-noising method is proposed, and its implementation is discussed in detail. Theoretical analysis and modeling show that the method has a strong capacity for de-noising while preserving the attributes of the effective wave; it is a good tool for de-noising when the S/N ratio is poor. Emphasizing the high-frequency information of reflection events in important layers while taking account of other frequencies is difficult for a deconvolution filter, and a Fourier-transform filter also has problems in achieving this goal. In this paper a new method is put forward: frequency-division processing of seismic data by wavelet transform and reconstruction. In ordinary resolution-improving methods, the deconvolution operator has poor local characteristics, which affects its frequency response; in the wavelet transform, the wavelet function has very good local characteristics. Frequency-division processing in the wavelet transform also yields quite good high-resolution data, but it takes more time than the deconvolution method. On the basis of the frequency-division processing method in the wavelet domain, a new technique is put forward, which involves 1) designing filter operators in the wavelet transform equivalent to the deconvolution operator in the time and frequency domains, 2) obtaining a derivative wavelet function suitable for high-resolution seismic data processing, and 3) processing high-resolution seismic data by the deconvolution method in the time domain. When instantaneous characteristic signals are produced with the Hilbert transform, the transform is very sensitive to high-frequency random noise; even weak high-frequency noise in the seismic signals can submerge the obtained instantaneous characteristics. A method for obtaining instantaneous characteristics in the wavelet domain is put forward, which derives them directly from the real part (the seismic signal itself) and the imaginary part (its Hilbert transform) of the wavelet transform. The method performs frequency division and noise removal at the same time. What is more, weak waves whose frequencies are lower than that of the high-frequency random noise are retained in the obtained instantaneous characteristics, and they can be seen in instantaneous characteristic sections (such as instantaneous frequency, instantaneous phase, and instantaneous amplitude). Impedance inversion is one of the tools for describing oil reservoirs. One impedance inversion method is generalized linear inversion, which has high precision but is sensitive to noise in the seismic data, so erroneous results may be obtained. In describing a reservoir within an important geological layer, not only the high-frequency impedance (to resolve thin sand layers) but also other frequency bands of the impedance are needed, a goal difficult for some impedance inversion methods to achieve. The wavelet transform is very good at de-noising and frequency-division processing; therefore, a method of impedance inversion based on the wavelet transform is put forward, namely impedance inversion in frequency division by wavelet transform and reconstruction. Methods of time-frequency analysis based on the wavelet transform are also given. Finally, the above methods are applied to a real oil field, the Sansan oil field.
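
The SVD filtering underlying the de-noising method can be sketched as plain rank truncation on a 2-D gather: the leading singular components carry the laterally coherent energy, and the remainder is discarded as noise. The combination with the wavelet transform (filtering each scale separately) is not shown, and the rank is an illustrative assumption.

```python
import numpy as np

def svd_filter(gather, rank=5):
    """gather: (n_traces, n_samples) seismic section; returns a low-rank,
    denoised section that keeps only the leading singular components."""
    u, s, vt = np.linalg.svd(gather, full_matrices=False)
    s[rank:] = 0.0                      # zero the small singular values
    return (u * s) @ vt                 # rank-truncated reconstruction
```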

Relevance:

20.00%

Publisher:

Abstract:

The dissertation addresses the problems of signal reconstruction and data restoration in seismic data processing, taking signal representation methods as the main thread and seismic information reconstruction (signal separation and trace interpolation) as the core. For representation on natural bases, I present the fundamentals and algorithms of ICA and its original applications to the separation of natural earthquake signals and of survey seismic signals. For representation on deterministic bases, the dissertation proposes least-squares inversion regularization methods for seismic data reconstruction, sparseness constraints, preconditioned conjugate gradient (PCG) methods, and their applications to seismic deconvolution, the Radon transform, etc. The core content is a de-aliasing algorithm for reconstructing unevenly sampled seismic data and its application to seismic interpolation. Although the dissertation discusses two cases of signal representation, they can be integrated into one framework, because both deal with signal or information restoration: the former reconstructs original signals from mixed signals, while the latter reconstructs complete data from sparse or irregular data. Their common goal is to provide pre-processing and post-processing methods for seismic prestack depth migration. ICA can separate original signals from their mixtures, or abstract the basic structure from the analyzed data. I survey the fundamentals, algorithms, and applications of ICA and, comparing it with the KL transform, propose the concept of the independent component transform (ICT). On the basis of the negentropy measure of independence, I implemented FastICA and improved it using the covariance matrix. After analyzing the characteristics of seismic signals, I introduce ICA into seismic signal processing, a first in the geophysical community, and implement the separation of noise from seismic signals (sketched below). Synthetic and real data examples show the usability of ICA for seismic signal processing, and initial results are achieved. ICA is applied to separating earthquake converted waves from multiples in a sedimentary area, with good results, yielding a more reasonable interpretation of underground discontinuities. The results show the promise of applying ICA to geophysical signal processing. By virtue of the relationship between ICA and blind deconvolution, I survey seismic blind deconvolution and discuss the prospect of applying ICA to it, with two possible solutions. The relationship among PCA, ICA, and the wavelet transform is described, and it is proved that the reconstruction of wavelet prototype functions is a Lie group representation. In passing, an over-sampled wavelet transform is proposed to enhance seismic data resolution, validated by numerical examples. The key to prestack depth migration is the regularization of prestack seismic data, for which seismic interpolation and missing-data reconstruction are necessary procedures. I first review seismic imaging methods to argue the critical effect of regularization, and from a review of seismic interpolation algorithms I conclude that de-aliased reconstruction of unevenly sampled data is still a challenge. The fundamentals of seismic reconstruction are discussed first; then sparseness constraints on least-squares inversion and a preconditioned conjugate gradient solver are studied and implemented.
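
As a minimal illustration of the ICA separation step referenced above, scikit-learn's FastICA can stand in for the dissertation's own negentropy-based implementation; the two-channel mixing setup below is synthetic and purely illustrative.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2000)
sources = np.c_[np.sin(2 * np.pi * 30 * t),        # "signal" component
                rng.standard_normal(t.size)]       # "noise" component
mixed = sources @ rng.random((2, 2)).T             # two mixed recordings

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(mixed)               # estimated independent components
```
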
Choosing a constraint term with a Cauchy distribution, I programmed the PCG algorithm and implemented sparse seismic deconvolution and high-resolution Radon transforms by PCG, in preparation for seismic data reconstruction. For seismic interpolation, de-aliased interpolation of evenly sampled data and reconstruction of unevenly sampled data each work very well separately, but they could not previously be combined. In this paper, a novel Fourier-transform-based method and algorithm are proposed that can reconstruct seismic data that are both uneven and aliased. I formulate band-limited data reconstruction as a minimum-norm least-squares inversion problem in which an adaptive DFT-weighted norm regularization term is used. The inverse problem is solved by a preconditioned conjugate gradient method, which makes the solution stable and quickly convergent. Based on the assumption that seismic data consist of a finite number of linear events, and following the sampling theorem, aliased events can be attenuated via least-squares weights predicted linearly from the low frequencies. Three application issues are discussed: interpolation across even gaps in traces, filling of uneven gaps, and reconstruction of high-frequency traces from low-frequency data constrained by a few high-frequency traces. Both synthetic and real data examples show that the proposed method is valid, efficient, and applicable; the research is valuable for seismic data regularization and cross-well seismics. To meet the data requirements of 3D shot-profile depth migration, schemes must be adopted to make the data even and consistent with the velocity dataset. The methods of this paper are used to interpolate and extrapolate the shot gathers instead of simply embedding zero traces, so the migration aperture is enlarged and the migration result is improved. The results show the method's effectiveness and practicability.
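
The band-limited reconstruction can be sketched as a damped minimum-norm least-squares problem on a small grid. The adaptive DFT-weighted norm, the preconditioning, and the low-frequency antialias weighting of the actual method are all omitted, with plain damping standing in for the regularization term, so this is a toy illustration only.

```python
import numpy as np
from scipy.sparse.linalg import lsqr

n, band = 256, 12                       # grid length, bandwidth in harmonics
rng = np.random.default_rng(0)
idx = np.sort(rng.choice(n, size=80, replace=False))   # irregular sample points

t = np.arange(n)
k = np.arange(1, band + 1)
F = np.hstack([np.ones((n, 1)),
               np.cos(2 * np.pi * np.outer(t, k) / n),
               np.sin(2 * np.pi * np.outer(t, k) / n)])  # band-limited basis

truth = F @ rng.standard_normal(F.shape[1])              # synthetic full signal
data = truth[idx]                                        # its irregular samples

# Damped least squares: minimise ||F[idx] m - d||^2 + damp^2 ||m||^2 via LSQR.
m = lsqr(F[idx], data, damp=1e-2)[0]
reconstructed = F @ m                   # data restored on the full regular grid
```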

Relevance:

20.00%

Publisher:

Abstract:

We address the computational role that the construction of a complete surface representation may play in the recovery of 3-D structure from motion. We present a model that combines a feature-based structure-from-motion algorithm with smooth surface interpolation. This model can represent multiple surfaces in a given viewing direction, incorporates surface constraints from object boundaries, and groups image features using their 2-D image motion. Computer simulations relate the model's behavior to perceptual observations. In a companion paper, we discuss further perceptual experiments regarding the role of surface reconstruction in the human recovery of 3-D structure from motion.
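
The surface-interpolation stage admits a small sketch: given sparse feature depths recovered by structure from motion, fill in a smooth dense surface. A thin-plate-spline interpolant is used below as a stand-in for the model's own smoothing; all names and values are illustrative.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(50, 2))        # image positions of features
depth = np.sin(3 * pts[:, 0]) + pts[:, 1] ** 2   # their recovered depths

surface = RBFInterpolator(pts, depth, kernel="thin_plate_spline", smoothing=0.01)

# Evaluate the smooth dense surface on a regular image grid.
gx, gy = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
dense_depth = surface(np.c_[gx.ravel(), gy.ravel()]).reshape(64, 64)
```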

Relevance:

20.00%

Publisher:

Abstract:

This thesis investigates the problem of estimating the three-dimensional structure of a scene from a sequence of images. Structure information is recovered from images continuously using shading, motion or other visual mechanisms. A Kalman filter represents structure in a dense depth map. With each new image, the filter first updates the current depth map by a minimum variance estimate that best fits the new image data and the previous estimate. Then the structure estimate is predicted for the next time step by a transformation that accounts for relative camera motion. Experimental evaluation shows the significant improvement in quality and computation time that can be achieved using this technique.
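
The per-pixel update the thesis describes is the scalar minimum-variance (Kalman) blend sketched below; the prediction step that warps the depth map for relative camera motion is omitted, and all names are illustrative.

```python
import numpy as np

def kalman_update(depth, var, meas, meas_var):
    """Minimum-variance fusion of the predicted depth map with new
    per-pixel depth measurements and their variances."""
    gain = var / (var + meas_var)                 # Kalman gain per pixel
    depth_new = depth + gain * (meas - depth)     # updated depth estimate
    var_new = (1.0 - gain) * var                  # reduced uncertainty
    return depth_new, var_new
```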