879 results for Production engineering Data processing


Relevance:

100.00%

Publisher:

Abstract:

3D wave-equation prestack depth migration is an effective tool for accurately imaging complex geological structures and is one stage of 3D seismic data processing. 3D seismic processing is a high-dimensional signal-processing problem that raises several difficult questions: how to handle high-dimensional operators, how to improve focusing, and how to construct the deconvolution operator. The realization of 3D wave-equation prestack depth migration not only marks the leap from poststack to prestack imaging but also provides an important means of attacking these high-dimensional problems. This thesis addresses those problems, both in service of realizing 3D wave-equation prestack depth migration and in order to improve the migration result itself. The work is presented in five parts, summarized as follows. In the first part, I project the 3D data volume onto lower-dimensional sections by means of large-matrix transposition and trace rearrangement, thereby enabling linear processing of the high-dimensional signal. I first give the mathematical expression of 3D seismic data and its physical meaning, present the basic idea of large-matrix transposition, and describe the implementation of five transposition models as examples; I then present the basic idea of, and rules for, trace rearrangement and parallel computation of 3D traces, with an example. In the part on conventional DMO focusing, I first review the history of DMO processing, give its fundamentals, and derive the DMO equation and its impulse response. I also prove, from the kinematic character of DMO, the equivalence between DMO and prestack time migration, and derive the relationship between wave-equation-based DMO and prestack time migration.
Finally, I give an example DMO processing flow applied to synthetic data from theoretical models. In the part on wave-equation prestack depth migration, I first review the history of migration, from time to depth, from poststack to prestack, and from 2D to 3D, summarize the main migration methods, and point out their merits and shortcomings; I then obtain common-image-point gathers using the decomposed migration code. In the part on residual moveout, I first describe the Viterbi algorithm, based on Markov processes and compound decision theory, and show how it solves the shortest-path problem; on this basis, I implement residual-moveout correction after 3D wave-equation prestack depth migration and give an example on real 3D seismic data. In the part on the migration Green function, I first give the concept of the migration Green function and the 2D Green-function migration equation in the far-field approximation; I then prove the equivalence of the wave-equation depth-extrapolation algorithms and derive the Green-function migration equation. Finally, I present the response and migration result of the Green function for a point source and analyze the effect of the migration aperture on the prestack migration result. This research helps clarify the effect of migration aperture on the migration result, and supports the study of Green-function deconvolution to improve migration focusing.
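
The residual-moveout correction described above treats the pick as a shortest-path problem on a cost panel solved by the Viterbi algorithm. A minimal sketch of that dynamic program (the function name, toy cost panel, and jump constraint are illustrative, not the thesis's actual implementation):

```python
def viterbi_min_path(cost, max_jump=1):
    """Minimum-cost path through a (depth x shift) cost panel.

    cost[i][j] is the penalty for choosing shift j at depth step i;
    between consecutive depth steps the shift may change by at most
    max_jump samples, enforcing a smooth residual-moveout curve.
    Returns the list of picked shift indices, one per depth step.
    """
    n, m = len(cost), len(cost[0])
    total = list(cost[0])                # accumulated cost so far
    back = []                            # back-pointers per depth step
    for i in range(1, n):
        ptr = [0] * m
        new = [0.0] * m
        for j in range(m):
            lo, hi = max(0, j - max_jump), min(m, j + max_jump + 1)
            k = min(range(lo, hi), key=lambda t: total[t])
            ptr[j] = k
            new[j] = total[k] + cost[i][j]
        total, back = new, back + [ptr]
    # backtrack from the cheapest final state
    j = min(range(m), key=lambda t: total[t])
    path = [j]
    for ptr in reversed(back):
        j = ptr[j]
        path.append(j)
    return path[::-1]
```

Run on a depth-by-shift misfit panel, the returned indices trace the cheapest smooth trajectory, which is the sense in which Viterbi solves the shortest-path problem here.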

Relevance:

100.00%

Publisher:

Abstract:

Oil-industry and scientific groups have focused on 3D wave-equation prestack depth migration because it can accurately image complex geological structures while preserving wavefield information, which favors lithological imaging. The symplectic method was first proposed by Feng Kang in 1984 and has become a hotspot of numerical-computation research; its great theoretical virtues mean it will of necessity find wide application in many scientific fields. Against this background, this paper combines the symplectic method with 3D wave-equation prestack depth migration to produce an effective numerical wavefield-extrapolation technique. After a deep analysis of the computational method and of the performance of PC clusters, a seismic prestack depth-migration workflow has been formed that exploits the virtues of both the migration method and the PC cluster. The resulting software, "3D Wave Equation Prestack Depth Migration of Symplectic Method", has been registered with the National Bureau of Copyright (No. 0013767), and the Dagang and Daqing oil fields have put it into use in field data processing. In this paper, the one-way wave-equation operator is decomposed into a phase-shift operator, a time-shift operator, and a correction term, using a high-order symplectic approximation of the exponential. After reviewing operator anti-aliasing, computation of the maximum migration angle, and the imaging condition, we present impulse-response tests of the symplectic method. Taking the imaging of the SEG/EAGE salt and overthrust models as examples, and examining our software's ability to image complex geological structure, the paper discusses the effect of imaging-parameter selection and of the seismic wavelet on the migration result, and compares the 2D and 3D prestack depth-migration results for the salt model.
We also present impulse-response tests on the overthrust model. The imaging results for the two international benchmark models indicate that the symplectic 3D prestack depth migration accommodates strong lateral velocity variation and complex geological structure. The huge computational cost is the key obstacle preventing oil-industry adoption of 3D wave-equation prestack depth migration. After a deep analysis of the migration workflow and the characteristics of PC clusters, the paper puts forward: i) parallel algorithms in the shot and frequency domains for common-shot-gather 3D wave-equation prestack migration; ii) an optimized breakpoint-setting scheme for field data processing; and iii) dynamic and static load balancing among the nodes of the PC cluster. It has been shown that the computation time of 3D prestack depth-migration imaging is greatly shortened when these methods are adopted. In addition, considering the 3D wave-equation prestack depth-migration workflow in complex media, together with field-data examples, the paper emphasizes: i) seismic data preprocessing; ii) 2.5D prestack depth-migration velocity analysis; and iii) 3D prestack depth migration. The field-data results show the satisfactory applicability of the proposed workflow.
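
The operator decomposition above builds on the standard phase-shift extrapolator for the one-way wave equation. A minimal single-frequency sketch of that base operator (the high-order symplectic correction terms of the paper are not reproduced; names and units are illustrative):

```python
import cmath
import math

def phase_shift_step(P_kx, omega, v, dz, kx_values):
    """One depth step of phase-shift wavefield extrapolation.

    P_kx: wavefield samples at depth z for one angular frequency omega,
    indexed by horizontal wavenumber kx. Propagating components are
    multiplied by exp(i*kz*dz) with kz = sqrt((omega/v)**2 - kx**2);
    evanescent components (|kx| > omega/v) are exponentially damped.
    """
    out = []
    for P, kx in zip(P_kx, kx_values):
        k2 = (omega / v) ** 2 - kx ** 2
        if k2 >= 0.0:
            kz = math.sqrt(k2)
            out.append(P * cmath.exp(1j * kz * dz))   # pure phase rotation
        else:
            kz = math.sqrt(-k2)
            out.append(P * math.exp(-kz * dz))        # evanescent decay
    return out
```

For propagating wavenumbers the operator is unitary (a pure phase rotation), which is the amplitude-preserving property that motivates symplectic integration of the exponential.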

Relevance:

100.00%

Publisher:

Abstract:

Extraction of reflectivity sequences is a key part of impedance inversion in seismic exploration. Although many valid inversion methods exist, with crosswell seismic data the frequency band of the data cannot be broadened enough to satisfy practical needs, and this remains an urgent problem. Prestack depth migration, developed in recent years, has become increasingly robust in exploration; it is a powerful technology for imaging geological targets with complex structure, and its final product is a reflectivity image. Based on reflectivity imaging of crosswell seismic data and the wave equation, this paper completes the following work. First, a blind-deconvolution workflow is completed in which a Cauchy criterion regularizes the inversion (sparse inversion). A preconditioned conjugate-gradient (PCG) method based on Krylov subspaces is incorporated to reduce the computation and improve speed, and the system matrix is no longer required to be symmetric positive definite. The method is applied to high-frequency recovery of crosswell seismic sections with satisfactory results. Second, rotation transforms and the Viterbi algorithm are applied in the preprocessing for wave-equation prestack depth migration. Wave-equation prestack depth migration requires the seismic dataset to lie on a regular grid, but complex terrain and surface conditions can make the acquisition geometry irregular; at the same time, to avoid aliasing produced by sparse sampling along the inline direction, traces must be interpolated. In this paper, I use a rotation transform to make the inlines parallel to the coordinate axes, and the Viterbi algorithm to automatically pick events; the results are satisfactory. Third, besides extrapolation, imaging is a key part of prestack depth migration: however accurate the extrapolation operator, the imaging condition strongly influences the final reflectivity image.
The author migrates the Marmousi model under different imaging conditions and analyzes the methods according to the results. The computations show that the imaging condition that stabilizes the source wavefield, and the least-squares imaging condition proposed in this paper, are better than the conventional cross-correlation imaging condition. The traditional pattern of "distributed computing and mass decision" is widely adopted in seismic data processing and is becoming an obstacle to raising the level of enterprise management. At the end of the paper, therefore, a systematic solution employing the mode of "distributed computing - centralized storage - instant release" is brought forward, based on a combination of C/S and B/S release models. The architecture of the solution, the corresponding web technology, and the client software are introduced; application shows the validity of the scheme.
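
The comparison above contrasts the conventional cross-correlation imaging condition with a source-stabilized, deconvolution-type one. A toy sketch of the two, evaluated at a single image point from per-frequency up- and downgoing wavefields (function names and the stabilization factor eps are illustrative):

```python
def crosscorrelation_image(up, down):
    """Conventional cross-correlation imaging condition:
    I = Re sum over frequencies of U(w) * conj(D(w))."""
    return sum(u * d.conjugate() for u, d in zip(up, down)).real

def stabilized_decon_image(up, down, eps):
    """Source-stabilized deconvolution imaging condition:
    I = sum of Re[ U * conj(D) / (|D|^2 + eps) ], which compensates
    the image for the illumination strength of the source wavefield."""
    return sum((u * d.conjugate() / (abs(d) ** 2 + eps)).real
               for u, d in zip(up, down))
```

When the receiver wavefield is exactly a scaled source wavefield, the stabilized deconvolution returns that scale (the reflectivity) per frequency, whereas plain cross-correlation weights it by source energy.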

Relevance:

100.00%

Publisher:

Abstract:

This dissertation addresses signal reconstruction and data restoration in seismic data processing, taking signal-representation methods as the main thread and seismic-information reconstruction (signal separation and trace interpolation) as the core. For representation on natural bases, I present the fundamentals and algorithms of independent component analysis (ICA) and its original applications to the separation of natural-earthquake signals and of survey seismic signals. For representation on deterministic bases, the dissertation proposes regularized least-squares inversion for seismic data reconstruction, sparseness constraints, preconditioned conjugate-gradient methods, and their applications to seismic deconvolution, Radon transformation, and related problems. The core contribution is a de-aliased reconstruction algorithm for unevenly sampled seismic data and its application to seismic interpolation. Although the dissertation discusses two cases of signal representation, they can be integrated into one framework, because both deal with signal or information restoration: the former reconstructs original signals from mixtures, the latter reconstructs complete data from sparse or irregular data. Both aim to provide pre- and post-processing methods for seismic prestack depth migration. ICA can separate original signals from their mixtures, or abstract the basic structure of the analyzed data. I survey the fundamentals, algorithms, and applications of ICA and, comparing it with the KL transformation, propose the concept of an independent-components transformation (ICT). On the basis of the negentropy measure of independence, I implement FastICA and improve it via the covariance matrix. By analyzing the characteristics of seismic signals, I introduce ICA into seismic signal processing, among the first such applications in the geophysical community, and implement noise separation from seismic signals.
Synthetic and real-data examples show that ICA is usable in seismic signal processing, and initial results are achieved. ICA is applied to separating earthquake converted waves from multiples in a sedimentary area, with good results, yielding a more reasonable interpretation of subsurface discontinuities. The results show the promise of applying ICA to geophysical signal processing. Exploiting the relationship between ICA and blind deconvolution, I survey seismic blind deconvolution and discuss the prospect of applying ICA to it, with two possible solutions. The relationship among PCA, ICA, and the wavelet transform is set out, and it is proved that the reconstruction of wavelet prototype functions is a Lie-group representation. In passing, an oversampled wavelet transform is proposed to enhance seismic data resolution and is validated by numerical examples. The key to prestack depth migration is the regularization of prestack seismic data, for which seismic interpolation and missing-data reconstruction are necessary procedures. I first review seismic imaging methods in order to argue the critical effect of regularization; reviewing seismic interpolation algorithms, I note that de-aliased reconstruction of uneven data remains a challenge. The fundamentals of seismic reconstruction are discussed first; then sparseness-constrained least-squares inversion and a preconditioned conjugate-gradient solver are studied and implemented. Choosing a constraint term with a Cauchy distribution, I program the PCG algorithm and implement sparse seismic deconvolution and high-resolution Radon transformation by PCG, in preparation for seismic data reconstruction. As for seismic interpolation, de-aliased interpolation of evenly sampled data and reconstruction of unevenly sampled data each work well separately, but existing methods could not combine them.
In this dissertation, a novel Fourier-transform-based method and algorithm are proposed that can reconstruct seismic data that are both unevenly sampled and aliased. I formulate band-limited data reconstruction as a minimum-norm least-squares inversion in which an adaptive, DFT-weighted norm serves as the regularization term. The inverse problem is solved by a preconditioned conjugate-gradient method, which makes the solution stable and rapidly convergent. Based on the assumption that seismic data consist of a finite number of linear events, and invoking the sampling theorem, aliased events can be attenuated via least-squares weights predicted linearly from the low frequencies. Three applications are discussed: interpolation across even gaps, filling of uneven gaps, and reconstruction of high-frequency traces from low-frequency data constrained by a few high-frequency traces. Synthetic and real-data examples show that the proposed method is valid, efficient, and applicable. The research is valuable for seismic data regularization and for crosswell seismics. To meet the data requirements of 3D shot-profile depth migration, schemes must be taken to regularize the data and fit them to the velocity dataset; the methods of this dissertation are used to interpolate and extrapolate the shot gathers instead of simply embedding zero traces. The migration aperture is thereby enlarged and the migration result improved, and the results show the method's effectiveness and practicability.
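
As a sketch of the regularized minimum-norm reconstruction idea, the toy below fills a gap in a single trace by conjugate gradients on the normal equations, with a simple second-difference roughness penalty standing in for the dissertation's adaptive DFT-weighted norm (all names are illustrative; this is not the actual algorithm):

```python
def cg_solve(A, b, niter=50, tol=1e-10):
    """Conjugate-gradient solver for a symmetric positive-definite
    system A x = b (A given as a list of rows)."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual b - A x, with x = 0
    p = r[:]
    rr = sum(v * v for v in r)
    for _ in range(niter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rr / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rr_new = sum(v * v for v in r)
        if rr_new < tol:
            break
        p = [r[i] + (rr_new / rr) * p[i] for i in range(n)]
        rr = rr_new
    return x

def reconstruct(data, mask, lam=0.01):
    """Fill gaps in a trace by regularized least squares:
    minimize |S m - d|^2 + lam * |D m|^2, where S samples the known
    positions (mask) and D is a second-difference roughness penalty
    standing in for the adaptive DFT weight of the dissertation."""
    n = len(data)
    # normal equations: (S^T S + lam D^T D) m = S^T d
    A = [[0.0] * n for _ in range(n)]
    b = [0.0] * n
    for i in range(n):
        if mask[i]:
            A[i][i] += 1.0
            b[i] = data[i]
    for i in range(1, n - 1):     # rows of D: m[i-1] - 2 m[i] + m[i+1]
        row = [0.0] * n
        row[i - 1], row[i], row[i + 1] = 1.0, -2.0, 1.0
        for a in range(n):
            for c in range(n):
                A[a][c] += lam * row[a] * row[c]
    return cg_solve(A, b)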

Relevance:

100.00%

Publisher:

Abstract:

The seismic survey is the most effective geophysical prospecting method in the exploration and development of oil and gas. As the structure and lithology of geological targets become increasingly complex, seismic sections must offer higher resolution if targets are to be described accurately, and a high signal-to-noise ratio is the precondition of high resolution. Stacking, an important seismic processing step, is an effective means of suppressing noise in the records, and broadening the stacked surface area further enhances genuine reflection signals while suppressing unwanted coherent and random ambient noise. Common-reflection-surface (CRS) stacking is a macro-model-independent seismic imaging method: based on the similarity of CRP gathers within one coherence zone, CRS stacking effectively improves the S/N ratio by stacking over more CMP gathers, and it is regarded as an important seismic processing method. Performing a CRS stack depends on three attributes; however, the CRS traveltime equation becomes invalid at large offsets. In this thesis, a method based on a depth-domain velocity model is put forward: ray tracing is used to determine the CRP traveltimes within one common reflection surface, and the CRS equation is regressed by the least-squares method. We then stack the coherent seismic data along these traveltimes to obtain the zero-offset section. At the end of the CRS workflow, a method using the dip angle is applied to further enhance the S/N ratio. Application to synthetic examples and field seismic records shows excellent performance of the algorithm in both accuracy and efficiency.
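
The least-squares regression of a traveltime surface mentioned above reduces, in its simplest form, to fitting the stacking hyperbola t^2 = t0^2 + h^2/v^2 by linear regression of squared traveltimes against squared offsets. A minimal illustrative sketch (not the thesis's full three-attribute CRS operator):

```python
def fit_hyperbolic_traveltime(offsets, times):
    """Least-squares fit of t^2 = t0^2 + h^2 / v^2 from
    (offset, traveltime) pairs; returns (t0, v). This is the
    elementary regression underlying the fit of a CRS-type
    traveltime surface to ray-traced CRP times."""
    xs = [h * h for h in offsets]
    ys = [t * t for t in times]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))      # slope = 1 / v^2
    intercept = my - slope * mx                     # intercept = t0^2
    return intercept ** 0.5, (1.0 / slope) ** 0.5
```

On noise-free hyperbolic data the regression recovers the zero-offset time and stacking velocity exactly; on ray-traced times it gives the least-squares-best hyperbola.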

Relevance:

100.00%

Publisher:

Abstract:

Formation resistivity is one of the most important parameters in reservoir evaluation, and various resistivity logging tools have been developed to acquire the true resistivity of the virgin formation. However, as proved reserves increase, the pay zones of interest are becoming thinner and thinner, especially in terrestrial-deposit oil fields, so electrical logging tools, limited by the contradictory requirements of resolution and depth of investigation, cannot deliver the true formation resistivity directly. Resistivity inversion techniques have therefore become popular for determining true formation resistivity from the improved logging data of new tools. In geophysical inverse problems, non-unique solutions are inevitable because of noisy data and insufficient measurements. I address this problem in my dissertation from three aspects: data acquisition, data processing and inversion, and application of the results together with uncertainty evaluation of the non-unique solution. Some other problems of traditional inversion methods, such as slow convergence and dependence on the initial model, are also treated. First, I deal with the uncertainties in the data to be processed. The combination of the micro-spherically focused log (MSFL) and the dual laterolog (DLL) is the standard program for determining formation resistivity. During inversion, the corrected MSFL readings are taken as the resistivity of the invaded zone; however, the errors can be as large as 30 percent owing to mudcake influence, even when rugose-borehole effects on the MSFL readings can be ignored. Furthermore, there is still debate over whether the two logs can be used together quantitatively, given their different measurement principles. Thus, a new type of laterolog tool is designed theoretically.
The new tool provides three curves with different investigation depths and nearly the same resolution, about 0.4 m. Second, because the popular iterative inversion method based on least-squares estimation cannot solve for more than two parameters simultaneously, and the new laterolog tool has not yet been applied in practice, my work focuses on two-parameter inversion (invasion radius and virgin-formation resistivity) of traditional dual-laterolog data. An unequally weighted damping-factor revision method is developed to replace the parameter-revision technique of the traditional inversion: the parameter update depends not only on the damping itself but also on the misfit between the measured and fitted data in the different layers. At least two iterations fewer are needed than with the older method, reducing the computational cost of the inversion. The damped least-squares inversion method realizes Tikhonov's tradeoff between smoothness of the solution and stability of the inversion; it is implemented by linearizing the nonlinear inverse problem, which necessarily makes the solution depend on the initial parameter values. Severe debate about the efficiency of such methods has therefore grown alongside the development of nonlinear processing methods. An artificial-neural-network method is accordingly proposed in this dissertation. A database of tool responses to formation parameters is built by modeling the laterolog tool and is then used to train the neural networks. A unit model is put forward to simplify the data space, and an additional physical constraint is applied to optimize the network after cross-validation.
Results show that the neural-network inversion could replace the traditional inversion in a single formation and can serve to determine the initial model of the traditional method. Whatever method is developed, non-uniqueness and uncertainty of the solution are inevitable, so it is wise to evaluate them when applying the inversion results; Bayes' theorem provides a way to do so. The method is discussed illustratively for a single formation and achieves plausible results. Finally, the traditional least-squares inversion is used to process raw logging data; compared with core analysis, the calculated oil saturation increases by 20 percent relative to the unprocessed result.
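
The unequally weighted damping revision described above modifies the standard damped least-squares update. A two-parameter sketch of one such update step, with per-parameter damping factors supplied by the caller (the names, the 2x2 restriction, and the direct solve are illustrative, not the dissertation's code):

```python
def damped_ls_step(params, residuals, jacobian, damps):
    """One damped least-squares (Levenberg-Marquardt style) update:
    dp = (J^T J + diag(damps))^(-1) J^T r, for two parameters.
    damps holds one damping factor per parameter; making the two
    factors unequal (e.g. scaled by per-layer misfit) is the
    revision the dissertation describes."""
    m = len(residuals)
    JtJ = [[sum(jacobian[k][i] * jacobian[k][j] for k in range(m))
            for j in range(2)] for i in range(2)]
    Jtr = [sum(jacobian[k][i] * residuals[k] for k in range(m))
           for i in range(2)]
    # solve the damped 2x2 normal equations by Cramer's rule
    a, b = JtJ[0][0] + damps[0], JtJ[0][1]
    c, d = JtJ[1][0], JtJ[1][1] + damps[1]
    det = a * d - b * c
    dp = [(d * Jtr[0] - b * Jtr[1]) / det,
          (a * Jtr[1] - c * Jtr[0]) / det]
    return [p + q for p, q in zip(params, dp)]
```

With zero damping and a linear forward model, one step lands on the exact least-squares solution; raising a damping factor shrinks the corresponding parameter's update, trading speed for stability.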

Relevance:

100.00%

Publisher:

Abstract:

This thesis concerns the wavelet transform and frequency-division methods. It describes frequency-division processing of prestack and poststack seismic data and its applications: inversion-based noise attenuation, frequency-division residual static correction, and the use of high-resolution data in reservoir inversion. The thesis not only develops frequency division and inversion in theory but also verifies them on models; all the methods are integrated, and actual data processing demonstrates the results. From wavelet-transform theory, the thesis analyzes the differences and limitations of t-x and f-x prediction-filter noise attenuation. It argues that, according to the differences between noise and signal in phase, amplitude, and frequency, noise and signal can be attenuated separately by wavelet frequency division. Comparison with f-x coherent-noise removal confirms the effectiveness and practicability of frequency division for isolating coherent and random noise. To avoid side effects outside noisy areas, an area-constraint method is adopted: frequency-division processing is applied only in the noise area, which solves the problem of low-frequency loss elsewhere. Residual moveout differences in seismic processing strongly affect stack image quality and resolution, and different frequency components have different residual moveouts. The frequency-division residual static correction performs the frequency division and computes the residual correction for each band, solving the problem of band-dependent residual corrections and protecting the high-frequency information in the data. Actual data processing yields good results in eliminating phase residual-moveout differences in prestack data, in stack image quality, and in improving resolution.
The thesis then analyzes the character of random noise and its description in the time and frequency domains, gives inversion-based prediction methods, and realizes frequency-division inversion attenuation of random noise. Analysis of real-data results shows that noise removal by inversion has its own advantages. By analyzing the parameters governing resolution and the technology of high-resolution processing, the thesis describes the relation between the frequency domain and resolution, the parameters that control resolution, and methods to increase it; it also gives the processing flow for high-resolution data and the effect of high-resolution data on reservoir inversion, finally demonstrating the accuracy and precision of the reservoir-inversion results. The research shows that frequency-division noise attenuation, frequency-division residual correction, and inversion noise attenuation are effective methods for increasing the S/N ratio and resolution of seismic data.
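
The frequency-division idea, splitting a trace into bands and treating only the flagged noise zone, can be sketched with a one-level Haar wavelet transform (the thesis's actual wavelet and zoning logic are not reproduced; names are illustrative):

```python
def haar_split(trace):
    """One-level Haar wavelet decomposition: split an even-length
    trace into a low-frequency approximation and a high-frequency
    detail, the elementary frequency-division step."""
    s = 2 ** 0.5
    approx = [(trace[i] + trace[i + 1]) / s for i in range(0, len(trace), 2)]
    detail = [(trace[i] - trace[i + 1]) / s for i in range(0, len(trace), 2)]
    return approx, detail

def haar_merge(approx, detail):
    """Inverse one-level Haar transform (perfect reconstruction)."""
    s = 2 ** 0.5
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])
    return out

def denoise_in_zone(trace, zone):
    """Area-constrained frequency-division denoising sketch: zero
    the high-frequency band only inside the flagged noise zone
    (zone[i] applies to detail coefficient i), leaving the rest of
    the trace untouched so no low-frequency content is lost there."""
    approx, detail = haar_split(trace)
    detail = [0.0 if z else d for d, z in zip(detail, zone)]
    return haar_merge(approx, detail)
```

Outside the flagged zone the trace is reconstructed exactly, which is the point of the area constraint: the filter has no side effect where there is no noise.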

Relevance:

100.00%

Publisher:

Abstract:

The article first introduces the basic theory of magnetotellurics and the essential methods of data acquisition and preprocessing. It then introduces the methods of computing the transfer function, together with their advantages: the least-squares method, the remote-reference method, and the Robust method. It also describes the cause and influence of static shift and summarizes how to correct it efficiently, then emphasizes the theory of the popular impedance-tensor decomposition methods: the phase-sensitive method, the Groom-Bailey method, general tensor analysis, and Mohr-circle analysis. The kernel step of magnetotelluric data processing is inversion, which is another important topic of the article. The article introduces the basic theory of the popular one-dimensional inversion methods (Automod, Occam, Rhoplus, Bostick, and Ipi2win) and of the two-dimensional methods (Occam, Rebocc, Abie, and NLCG), then makes a parallel analysis, on practical models, of the advantages of each inversion method and draws meaningful conclusions. Visual program design for magnetotelluric data processing is another kernel of the article. Past visual program designs for magnetotelluric data processing have been unsatisfactory and unsystematic: the processing methods were single-purpose, the data management unsystematic, and the data formats non-uniform. The article bases its visual program design on practicability, structure, variety, and extensibility, adopting database technology and a mixed-language design method; the result is a magnetotelluric data management and processing system that integrates database storage and retrieval, data processing, and graphical display.
Finally, the article turns to magnetotelluric application: taking the Tulargen Cu-Ni mining area in Xinjiang as a practical example and using the data-processing methods introduced above, it gives a detailed account of the magnetotelluric data-interpretation procedure.
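
A small concrete piece of the processing chain described above: converting an impedance estimate to apparent resistivity and phase, rho_a = |Z|^2 / (omega * mu0), the quantities the 1D and 2D inversions operate on (function names are illustrative):

```python
import cmath
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def apparent_resistivity(Z, freq):
    """Apparent resistivity (ohm*m) and impedance phase (degrees)
    from a magnetotelluric impedance element Z (ohms, SI convention)
    at the given frequency (Hz): rho_a = |Z|^2 / (omega * mu0)."""
    omega = 2.0 * math.pi * freq
    rho = abs(Z) ** 2 / (omega * MU0)
    phase = math.degrees(cmath.phase(Z))
    return rho, phase
```

Over a uniform half-space of resistivity rho, Z = sqrt(i * omega * mu0 * rho), so the function returns rho itself with a 45-degree phase, a standard sanity check.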

Relevance:

100.00%

Publisher:

Abstract:

To address some key problems in multi-component seismic exploration, several methods are introduced in this thesis, based on an extensive review of multi-component seismic theory and methods. First, to detect fracture density and orientation in the igneous, carbonate, and shale reservoirs that abound in domestic oil fields at low degrees of exploration and development, a new fast/slow shear-wave separation method, the Ratio Method, based on S-wave splitting theory, is discussed; from it the anisotropy coefficient and fracture parameters such as density and azimuth can be obtained. A second main point of the thesis is the application of the seismic velocity ratio (Vp/Vs) to predict lithological parameters of the subsurface medium. Because velocity-ratio estimation from traveltime ratios is often infeasible owing to the usually low signal-to-noise ratio of S-wave data acquired on land, a new method based on detailed velocity analysis is introduced. Third, prestack Kirchhoff integral migration is a method developed in recent years by which both P- and S-component images as well as P/S amplitude-ratio sections can be obtained; this thesis investigates the use of P- and S-wave sections and amplitude-ratio sections to interpret low-amplitude structures and lithological traps. The fast/slow shear-wave separation method is then applied to detect the density and azimuth of fractures in an igneous-rock gas reservoir and in the coal formation of a coal field. The two velocity-ratio methods are applied to lithological prediction at the gas field and the coal field, after a summary of a large body of experimental results from home and abroad.
P- and S-wave sections and amplitude-ratio sections are used to identify low-amplitude structures and lithological traps in the slope area of an oil-bearing sedimentary basin. The fracture density and azimuth calculated by the introduced method match the regional stress field and actual drilling data well; the predicted lithology reflects the drilling data; and some of the low-amplitude and lithological traps determined by the Kirchhoff migration method are verified by drilling. These results indicate that the methods are meaningful for complex oil and gas reservoirs and can be applied in other areas.
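
The lithology-prediction use of Vp/Vs rests on standard elasticity relations; for example, Poisson's ratio follows directly from the velocity ratio (an illustrative helper, not the thesis's method):

```python
def poisson_ratio(vp_vs):
    """Poisson's ratio from the velocity ratio gamma = Vp/Vs:
    nu = (gamma^2 - 2) / (2 * (gamma^2 - 1)), valid for gamma > sqrt(2)
    (nu >= 0) in an isotropic elastic medium."""
    g2 = vp_vs * vp_vs
    return (g2 - 2.0) / (2.0 * (g2 - 1.0))
```

The classical reference point Vp/Vs = sqrt(3) gives nu = 0.25; higher ratios, typical of unconsolidated or fluid-rich rocks, push nu toward 0.5, which is why the ratio carries lithological information.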

Relevance:

100.00%

Publisher:

Abstract:

Reflection seismic prospecting is an important and widely used method in petroleum and coal surveying and has matured from data acquisition through data processing to interpretation. Seismic prospecting in metallic mines, however, and especially high-resolution seismic prospecting, is still under study. In this paper, the basic theory and present state of metallic-mine seismic reflection are first expounded; the basic theory, improvements, convergence speed, and capability of the integrated global-optimization method are set out in detail; and the basic theory, implementation, and practical effect of the vector noise-suppression algorithm are then introduced. Applying the integrated global-optimization method to static correction, together with the vector noise-suppression algorithm, we carefully processed seismic data from the Tongling metallic mine, presenting the processing flow, the key steps, and the results. From the processed sections we analyzed the major reflection characteristics, the geological interpretation, the reflection structure of the upper crust, the spatial distribution of the Wutong set, the spatial shape of some lithological bodies, and the contact relations of the horizons unveiled.

Relevance:

100.00%

Publisher:

Abstract:

The content of this paper is based on research carried out while the author took part in a key project of the NSFC and a key Knowledge Innovation project of the CAS. The whole paper develops from the unavoidable boundary problem in seismic migration and inversion. The boundary problem is a common issue in seismic data processing: at an artificial boundary, reflected waves that do not exist in reality arise when the incident seismic wave reaches the boundary, interfering with the propagation of the seismic wave and producing aliased information on the processed profile; the quality of the whole seismic profile then degrades and the subsequent work suffers. The paper also reviews the development of seismic migration, sets out its current state, and predicts possible breakthroughs. Aiming at the absorbing-boundary problem in migration, we derive a wide-angle absorbing boundary condition and compare it with the boundary effect of fast approximate Toeplitz-matrix computation. In the fast approximate inversion of Toeplitz systems, we introduce a preconditioned conjugate-gradient method that employs a circulant extension to construct the preconditioner; in particular, a combined preconditioner reduces the boundary effect during the computation. Comparing the boundary problem in seismic migration with that in Toeplitz-matrix inversion, we find that a change of boundary condition changes the eigenvalues of the coefficient matrix, and the change of eigenvalues causes the boundary effect. The author gives a qualitative analysis of the relationship between the coefficient-matrix eigenvalues and the boundary effect; quantitative analysis deserves further research.
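
The circulant extension used above to build the preconditioner can be sketched directly: an n x n Toeplitz matrix embeds as the leading block of a 2n x 2n circulant (dense construction for clarity; in practice the circulant is applied via the FFT; names are illustrative):

```python
def circulant_embedding(col, row):
    """Embed an n x n Toeplitz matrix (given by its first column and
    first row, with col[0] == row[0]) into a 2n x 2n circulant, the
    construction behind circulant-extension preconditioners: the
    Toeplitz matrix reappears as the leading n x n block."""
    n = len(col)
    # generating vector of the circulant: column, one pad entry,
    # then the reversed off-diagonal part of the row
    c = list(col) + [0.0] + list(reversed(row[1:]))  # length 2n
    m = len(c)
    # circulant matrix: C[i][j] = c[(i - j) mod m]
    return [[c[(i - j) % m] for j in range(m)] for i in range(m)]
```

Because circulant systems diagonalize under the DFT, the extension lets a Toeplitz multiply (or an approximate inverse, used as a preconditioner) run in O(n log n) instead of O(n^2).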

Relevance:

100.00%

Publisher:

Abstract:

The molar heat capacities of 2-(chloromethylthio)benzothiazole (molecular formula C8H6ClNS2, CA registry no. 28908-00-1) were measured with an adiabatic calorimeter over the temperature range from 80 K to 350 K. The construction and procedures of the calorimeter are described in detail. The performance of the calorimetric apparatus was evaluated by heat-capacity measurements on alpha-Al2O3: the deviation of the experimental heat capacities from the corresponding smoothed values lies within 0.3%, while the uncertainty is within +/-0.5% of the recommended reference data over the whole experimental temperature range. A fusion transition was found in the Cp-T curve of 2-(chloromethylthio)benzothiazole. The melting temperature and the molar enthalpy and entropy of fusion of the compound were determined to be Tm = (315.11 +/- 0.04) K, delta_fus Hm = (17.02 +/- 0.03) kJ/mol, and delta_fus Sm = (54.04 +/- 0.05) J/(mol K), respectively. The thermodynamic functions (H_T - H_298.15) and (S_T - S_298.15) were also derived from the heat-capacity data. The mole-fraction purity of the 2-(chloromethylthio)benzothiazole sample used in the present calorimetric study was determined to be 99.21% by fractional melting.
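
The tabulated functions (H_T - H_298.15) and (S_T - S_298.15) follow from the smoothed heat capacities by the standard integrals dH = Cp dT and dS = (Cp/T) dT; a sketch by trapezoidal quadrature (illustrative, not the authors' procedure):

```python
def enthalpy_entropy_increments(temps, cps):
    """Trapezoidal integration of smoothed heat-capacity data:
    H(T2)-H(T1) = integral of Cp dT and S(T2)-S(T1) = integral of
    (Cp/T) dT, the relations used to tabulate (H_T - H_298.15) and
    (S_T - S_298.15) from measured Cp. temps in K, cps in J/(mol K);
    returns (dH in J/mol, dS in J/(mol K))."""
    dH = dS = 0.0
    for (t1, c1), (t2, c2) in zip(zip(temps, cps), zip(temps[1:], cps[1:])):
        dH += 0.5 * (c1 + c2) * (t2 - t1)          # trapezoid on Cp
        dS += 0.5 * (c1 / t1 + c2 / t2) * (t2 - t1)  # trapezoid on Cp/T
    return dH, dS
```

For a constant Cp the enthalpy increment is exact and the entropy increment approaches Cp * ln(T2/T1) as the grid refines.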

Relevance:

100.00%

Publisher:

Abstract:

Comprehensive two-dimensional gas chromatography (GC x GC) has attracted much attention for the analysis of complex samples. Even with the large peak capacity of GC x GC, peak overlap is often encountered. In this paper, a new method was developed to resolve overlapped peaks based on mass conservation and the exponentially modified Gaussian (EMG) model. Linear relationships were obtained between the calculated sigma and tau of the primary peaks and the corresponding retention time (t(R)), with correlation coefficients over 0.99. Based on these relationships, the elution profile of each compound in an overlapped peak could be simulated, even for peaks never separated in the second dimension. The proposed method has proven to give more accurate peak areas than the general data-processing method. (c) 2005 Elsevier B.V. All rights reserved.
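
The EMG elution model referred to above is a Gaussian (center mu, width sigma) convolved with an exponential decay of time constant tau; a sketch of the profile and of the mass-conservation (area) check, with illustrative parameter names:

```python
import math

def emg(t, area, mu, sigma, tau):
    """Exponentially modified Gaussian elution profile, normalized so
    the total peak area equals `area` (mass conservation):
    f(t) = (area/(2*tau)) * exp(sigma^2/(2*tau^2) - (t-mu)/tau)
           * erfc((sigma/tau - (t-mu)/sigma) / sqrt(2))."""
    z = (sigma / tau - (t - mu) / sigma) / math.sqrt(2.0)
    return (area / (2.0 * tau)
            * math.exp(sigma ** 2 / (2.0 * tau ** 2) - (t - mu) / tau)
            * math.erfc(z))

def peak_area(ts, ys):
    """Trapezoidal peak area, used to check mass conservation."""
    return sum(0.5 * (y1 + y2) * (t2 - t1)
               for (t1, y1), (t2, y2) in zip(zip(ts, ys), zip(ts[1:], ys[1:])))
```

Integrating the simulated profile over a sufficiently wide window recovers the injected area, which is the mass-conservation constraint the deconvolution of overlapped peaks relies on.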