17 results for normalization
in the Chinese Academy of Sciences Institutional Repositories Grid Portal
Abstract:
The performance of a comprehensive two-dimensional liquid chromatography system is greatly improved over that reported previously by using a silica monolithic column for the second-dimension separation. Owing to the higher elution speed of the second-dimension monolithic column, the first-dimension column efficiency and analysis rate can be greatly improved compared with a conventional second-dimension column. The developed system was applied to the analysis of methanol extracts of two Umbelliferae herbs, Ligusticum chuanxiong Hort. and Angelica sinensis (Oliv.) Diels, using a CN column for the first-dimension separation and a silica monolithic ODS column for the second-dimension separation. The resulting three-dimensional chromatograms were treated by normalizing peak heights to the value of the highest peak, or to a preset value, using software written in-house. Because the amounts of components in TCM extracts differ widely, many more peaks of low-abundance components could be clearly detected here than reported before. With the above improvements in separation performance and data treatment, about 120 components in the methanol extract of Rhizoma Chuanxiong and 100 in A. sinensis were separated with UV detection within 130 min. Compared with the previously reported result, the number of detected peaks doubled while the analysis time was halved. (c) 2005 Published by Elsevier B.V.
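The peak-height normalization described above (dividing by the highest peak, or by a preset ceiling so that weak peaks are not swamped by dominant ones) is easy to illustrate. The sketch below is only a minimal stand-in for the authors' unpublished in-house software; all names and values are illustrative.

```python
import numpy as np

def normalize_peaks(chromatogram, ceiling=None):
    """Normalize a 2D-LC intensity map by its highest peak, optionally
    clipping to a preset ceiling first so low-abundance peaks remain visible."""
    data = np.asarray(chromatogram, dtype=float)
    if ceiling is not None:
        data = np.minimum(data, ceiling)  # cap dominant peaks at the set value
    return data / data.max()              # scale so the tallest peak equals 1

# Example: a strong peak (100) next to a weak one (2)
trace = np.array([[0.0, 2.0, 0.0], [0.0, 100.0, 0.0]])
print(normalize_peaks(trace))               # weak peak ~0.02, nearly invisible
print(normalize_peaks(trace, ceiling=10.0)) # weak peak 0.2, clearly visible
```

Clipping before rescaling is what lets low-abundance components show up in the rendered three-dimensional chromatogram.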
Abstract:
In this paper, we propose a method for classifying viruses' complete genomes based on graph geometrical theory. First, a triangular geometrical graph model is put forward; feature-space-samples-graphs are then constructed for classes of viruses' complete genomes in feature space after feature extraction and normalization. Finally, we present a classification algorithm for viruses' complete genomes based on these feature-space-samples-graphs. Experiments comparing it with the BLAST algorithm demonstrate its efficiency.
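The abstract does not specify which normalization is used after feature extraction. As a hedged illustration only, a common choice for genome-derived feature vectors (k-mer frequencies are assumed here purely for the example) is per-dimension min-max scaling:

```python
import numpy as np

def minmax_normalize(features):
    """Scale each feature dimension to [0, 1] so that no single feature
    dominates distances in feature space (one common normalization)."""
    X = np.asarray(features, dtype=float)
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)  # avoid division by zero
    return (X - lo) / span

# Toy feature vectors for three genomes (e.g. k-mer frequencies; illustrative)
X = np.array([[10.0, 0.1], [20.0, 0.3], [15.0, 0.2]])
print(minmax_normalize(X))
```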
Abstract:
Existing methods for discriminating varieties of commodity corn seed cannot process batch data or speed up identification, and are very time-consuming and costly. This paper develops a new approach to the fast discrimination of commodity corn varieties using near-infrared spectral data. First, spectra of 37 varieties of commodity corn seed were acquired with a Fourier transform near-infrared spectrometer over the wavenumber range 4 000 to 12 000 cm⁻¹. Second, the original data were pretreated by statistical normalization to eliminate noise and improve the efficiency of the models. Third, a new method based on the sample standard deviation was used to select the characteristic spectral regions; it finds the most discriminative wavenumbers among all wavenumbers and reduces the amount of data. Fourth, principal component analysis (PCA) was used to compress the spectral data into a few variables; the cumulative reliability of the first ten components exceeded 99.98%. Finally, recognition models based on BPR were established on the first ten components. Of the 25 samples of each variety, 15 were randomly selected as the training set, the remaining 10 of the same variety were used as the first testing set, and all 900 samples of the other varieties were used as the second testing set. The average correct recognition rate over the 37 corn seed varieties was 94.3%. Testing results indicate that the method discriminates the various commodity corn seed varieties with high precision. In short, it is feasible to discriminate varieties of commodity corn seed based on near-infrared spectroscopy and BPR.
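A minimal sketch of the pretreatment chain described above (per-spectrum normalization, standard-deviation-based band selection, PCA compression). The array shapes and parameter values are illustrative, and the BPR classifier itself is omitted.

```python
import numpy as np
from sklearn.decomposition import PCA

def pretreat_and_reduce(spectra, n_bands=500, n_components=10):
    """Vector-normalize each spectrum, keep the wavenumbers with the
    largest between-sample standard deviation, then compress with PCA."""
    X = np.asarray(spectra, dtype=float)
    X = X / np.linalg.norm(X, axis=1, keepdims=True)  # per-spectrum normalization
    sd = X.std(axis=0)                                # variability of each wavenumber
    keep = np.argsort(sd)[-n_bands:]                  # most discriminative bands
    return PCA(n_components=n_components).fit_transform(X[:, keep])

# 925 samples x 1500 wavenumber points of synthetic data (shapes illustrative)
rng = np.random.default_rng(0)
scores = pretreat_and_reduce(rng.normal(size=(925, 1500)))
print(scores.shape)  # (925, 10): ten principal-component scores per sample
```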
Abstract:
Compared with other existing methods, feature point-based image watermarking schemes can resist global and local geometric attacks, especially cropping and random bending attacks (RBAs), by binding watermark synchronization to salient image characteristics. However, the watermark detection rate remains low in current feature point-based watermarking schemes. The main reason is that both feature point extraction and watermark embedding are more or less related to pixel position, which is seriously distorted by interpolation error and the shift problem during geometric attacks. In view of these facts, this paper proposes a geometrically robust image watermarking scheme based on local histograms. Our scheme mainly consists of three components: (1) feature point extraction and local circular region (LCR) construction are conducted using the Harris-Laplace detector; (2) a graph theoretical clustering-based feature selection mechanism chooses a set of non-overlapping LCRs, which are then made geometrically invariant through dominant orientation normalization; and (3) the histogram and mean, which are statistically independent of pixel position, are calculated over the selected LCRs and used to embed the watermarks. Experimental results demonstrate that the proposed scheme provides sufficient robustness against geometric attacks as well as common image processing operations. (C) 2010 Elsevier B.V. All rights reserved.
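Step (3) relies on the fact that a histogram and a mean computed over a region do not depend on where each pixel sits inside it. A minimal numpy sketch of those two statistics over one LCR; the Harris-Laplace detection and the embedding itself are omitted, and all names are illustrative.

```python
import numpy as np

def lcr_histogram_and_mean(image, center, radius, bins=16):
    """Gray-level histogram and mean over a local circular region (LCR).
    Both statistics ignore where pixels sit inside the region, which is
    what makes them robust to position-distorting geometric attacks."""
    img = np.asarray(image, dtype=float)
    yy, xx = np.mgrid[:img.shape[0], :img.shape[1]]
    mask = (yy - center[0])**2 + (xx - center[1])**2 <= radius**2
    values = img[mask]
    hist, _ = np.histogram(values, bins=bins, range=(0, 255))
    return hist, values.mean()

# Toy 64x64 image with an LCR of radius 10 at its center
img = np.arange(64 * 64, dtype=float).reshape(64, 64) % 256
hist, mean = lcr_histogram_and_mean(img, center=(32, 32), radius=10)
print(hist.sum(), round(mean, 1))  # pixel count inside the LCR, and its mean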
Abstract:
Feature-based image watermarking schemes, which aim to survive various geometric distortions, have attracted great attention in recent years. Existing schemes have shown robustness against rotation, scaling, and translation, but few are resistant to cropping, non-isotropic scaling, random bending attacks (RBAs), and affine transformations. Seo and Yoo presented a geometrically invariant image watermarking based on affine covariant regions (ACRs) that provides a certain degree of robustness. To further enhance the robustness, we propose a new image watermarking scheme on the basis of Seo's work that is insensitive to geometric distortions as well as common image processing operations. Our scheme is mainly composed of three components: 1) a feature selection procedure based on a graph theoretical clustering algorithm is applied to obtain a set of stable, non-overlapping ACRs; 2) for each chosen ACR, local normalization and orientation alignment are performed to generate a geometrically invariant region, which markedly improves the robustness of the proposed watermarking scheme; and 3) to prevent the degradation in image quality caused by normalization and inverse normalization, indirect inverse normalization is adopted to achieve a good compromise between imperceptibility and robustness. Experiments were carried out on a set of 100 images collected from the Internet, and the preliminary results demonstrate that the developed method outperforms some representative image watermarking approaches in terms of robustness.
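The "local normalization" in step 2 is commonly realized by whitening a region with its second-order intensity moments, which cancels an affine distortion up to a residual rotation (then fixed by orientation alignment). The sketch below shows that standard construction, on the assumption that this is the variant meant; the paper's exact procedure is not given here.

```python
import numpy as np

def moment_normalization_matrix(region):
    """2x2 whitening transform from a region's second-order intensity
    moments. If the region is distorted by an affine map A, its moment
    covariance becomes A @ cov @ A.T, so cov**(-1/2) maps all affine
    versions to one canonical shape (up to rotation)."""
    img = np.asarray(region, dtype=float)
    total = img.sum()
    yy, xx = np.mgrid[:img.shape[0], :img.shape[1]]
    cy, cx = (yy * img).sum() / total, (xx * img).sum() / total
    mu20 = ((xx - cx)**2 * img).sum() / total
    mu02 = ((yy - cy)**2 * img).sum() / total
    mu11 = ((xx - cx) * (yy - cy) * img).sum() / total
    cov = np.array([[mu20, mu11], [mu11, mu02]])
    vals, vecs = np.linalg.eigh(cov)            # assumes a non-degenerate region
    return vecs @ np.diag(vals**-0.5) @ vecs.T  # inverse square root of cov

rng = np.random.default_rng(2)
region = rng.random((32, 48))          # toy anisotropic region
print(moment_normalization_matrix(region).shape)  # (2, 2) whitening transform
```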
Abstract:
This paper deals with evaluating the reliability of analytical results obtained by Kalman filtering. Two evaluation criteria are compared: one is based on autocorrelation analysis of the innovation sequence, the so-called NAC criterion; the other is the innovations number, which is in fact the autocorrelation coefficient of the innovation sequence at the initial wavelength. Both criteria allow compensation for wavelength positioning errors in spectral scans, but they work differently. The NAC criterion can provide information about the reliability of an individual result, which is very useful for indicating unmodelled emissions, whereas the innovations number must be combined with normalization of the innovations, or with inspection of the sequence itself, for the same purpose. The major limitation of the NAC criterion is that it does not allow theoretical modelling of continuous backgrounds, which, however, is convenient in practical analysis and is possible with the innovations number criterion.
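Both criteria build on the whiteness of the innovation sequence: for a correctly modelled system the innovations should be uncorrelated. A minimal sketch of that underlying computation for a scalar innovation sequence; this is the generic normalized autocorrelation, not the NAC criterion's exact implementation.

```python
import numpy as np

def innovation_autocorrelation(innovations, lag=1):
    """Normalized autocorrelation of a Kalman-filter innovation sequence.
    Near zero for white (well-modelled) innovations; a large value flags
    unmodelled components, the idea behind whiteness tests."""
    v = np.asarray(innovations, dtype=float)
    v = v - v.mean()
    return np.dot(v[:-lag], v[lag:]) / np.dot(v, v)

rng = np.random.default_rng(1)
white = rng.normal(size=500)                           # well-modelled case
drift = white + 0.5 * np.sin(np.linspace(0, 20, 500))  # unmodelled background
print(round(innovation_autocorrelation(white), 3))  # near 0
print(round(innovation_autocorrelation(drift), 3))  # clearly nonzero
```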
Abstract:
Based on the second-order random wave solutions of the water wave equations in finite water depth, a statistical distribution of the wave-surface elevation is derived using the characteristic function expansion method. It is found that, after normalization of the wave-surface elevation, the distribution depends on only two parameters. One parameter describes the small mean bias of the surface produced by second-order wave-wave interactions; the other is approximately proportional to the skewness of the distribution. Both parameters can be determined from the water depth and the wave-number spectrum of ocean waves. As an illustrative example, we consider a fully developed wind-generated sea and calculate the parameters for various wind speeds and water depths using the Donelan and Pierson spectrum. It is also found that, for deep water, the dimensionless distribution reduces to the third-order Gram-Charlier series obtained by Longuet-Higgins [J. Fluid Mech. 17 (1963) 459]. The newly proposed distribution is compared with the data of Bitner [Appl. Ocean Res. 2 (1980) 63], with the Gaussian distribution, and with the fourth-order Gram-Charlier series, and it is found to give a more reasonable fit to the data. (C) 2002 Elsevier Science B.V. All rights reserved.
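For reference, the classical third-order Gram-Charlier form mentioned above can be written in standard notation (the paper's own parameterization may differ):

```latex
p(\eta) = \frac{1}{\sqrt{2\pi}}\, e^{-\eta^{2}/2}
\left[\, 1 + \frac{\lambda_{3}}{6}\, H_{3}(\eta) \right],
\qquad H_{3}(\eta) = \eta^{3} - 3\eta,
```

where \eta is the normalized surface elevation, \lambda_3 its skewness coefficient, and H_3 the third Hermite polynomial.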
Abstract:
The PSD is a high-resolution, real-time optoelectronic position-sensitive device with broad application prospects. Under varying illumination, however, its output signal exhibits nonlinear drift, which degrades its detection accuracy as a position sensor and limits its applicability, especially in 3D measurement. To address this problem, a nonlinear error compensation method for PSD position sensors is proposed. Targeting the nonlinear drift of the PSD output caused by changes in the spatial distance of the target, the method corrects the error with a normalization model, substantially improving the consistency of the PSD output and enhancing the performance of PSD-based 3D measurement systems.
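For background, the standard normalization used with a 1-D lateral-effect PSD divides the electrode-current difference by their sum, which cancels the overall light intensity. The sketch below illustrates that principle only; it is not the paper's specific compensation model.

```python
def psd_position(i1, i2, length):
    """Position on a 1-D lateral-effect PSD from its two electrode
    currents. Dividing the current difference by the current sum
    normalizes out the total light intensity, which is why this kind
    of normalization suppresses illumination-dependent drift."""
    return 0.5 * length * (i2 - i1) / (i1 + i2)

# Same spot position under two illumination levels: the result is unchanged
print(psd_position(3.0, 1.0, length=10.0))  # -2.5
print(psd_position(6.0, 2.0, length=10.0))  # -2.5 (intensity doubled)
```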
Abstract:
In this paper, fundamental seismic data and theoretical methods are analysed in detail, and tests of some new technologies are performed. For the seismic data processing assembly, several key technologies are developed and applied, such as global static correction, amplitude consistency processing, wavelet consistency shaping, fine velocity model building and prestack time migration. These technologies efficiently settle the problems arising in multi-block jointed prestack time migration processing, and they are highly significant for holding the oil output of Daqing oilfield at 40,000,000 tons. This dissertation makes the following contributions: (1) A combination of the near-surface model method and the refraction static correction method is developed and applied to solve global static correction for the whole merged area. (2) A fold-based prestack amplitude normalization method is developed; it eliminates the effect of fold on amplitude uniformity and solves the energy uniformity problem in tie-area prestack migration processing (see the sketch after this abstract). (3) Wavelet consistency is investigated. For the multiple survey blocks in the area, an optimal wavelet shaping method is developed that removes the waveform variance between adjacent blocks. (4) The controlled velocity inversion (CVI) technique is used to build the migration velocity field; it greatly shortens the velocity modelling period and improves velocity analysis precision. (5) A floating datum level technique is employed, which guarantees prestack migration results for shallow subsurface layers. (6) Static partitioning of the seismic data volume according to migration aperture is developed for the first time, realizing precise prestack time migration imaging of huge data volumes. (7) Numerical forward simulation and prestack migration processing are combined, for the first time, to study the migration technique for a complex geological structure using practical field data; this combination is a feasible way to achieve fine imaging of complex volcanic structures and helps in selecting appropriate migration parameters.
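Item (2) describes fold-based amplitude normalization. A minimal sketch of the basic idea, assuming the correction is a per-bin division of stacked amplitude by fold; the production method is certainly more elaborate, and the names here are illustrative.

```python
import numpy as np

def normalize_by_fold(stacked, fold):
    """Divide a stacked amplitude section by its fold (number of traces
    summed per bin) so that bins covered by more traces are not
    artificially louder: a simple form of fold-based normalization."""
    fold = np.asarray(fold, dtype=float)
    return np.asarray(stacked, dtype=float) / np.maximum(fold, 1.0)

# Two bins with identical signal but different coverage
stacked = np.array([60.0, 20.0])   # summed amplitudes
fold = np.array([30, 10])          # traces per bin
print(normalize_by_fold(stacked, fold))  # [2. 2.]: uniform after normalization
```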
Abstract:
Based on the features of the soft soil in the Tianjin Coastal New Developing Area, this soil with different sand contents was researched systematically through laboratory experiments on its strength and deformation characteristics. The main results are summarized as follows. First, on the basis of the engineering geological investigation, systematic experiments on the physical characteristics were conducted. The test samples were taken from the grey and grey-yellow silty soft soil formed by near-shore marine sediment and marine-continental interactive sediment. The samples were saturated in their original condition, with basic indexes of liquid limit 36.1%, plastic limit 18.8%, and plasticity index. The consolidation characteristics of the soft soil were then analysed through high-pressure consolidation tests. The results show that, in the various loading series, the coefficients of compressibility under P = 100 kPa and 200 kPa are all larger than 0.5 MPa⁻¹, so the sample soil is a highly compressible soil. Second, the triaxial strength of undisturbed and remoulded soil was researched using triaxial tests. The stress-strain curves of both undisturbed and remoulded soil are of the stabilizing and softening types, showing a distinctly plastic character. Furthermore, the cohesion and friction angle of the undisturbed soil change with the ambient pressure rather than remaining constant, while those of the remoulded soil change with compactness and sand content and are on the whole higher than those of the undisturbed soil. Finally, the stress-strain results of both undisturbed and remoulded soil were normalized using the ambient pressure as the normalization factor. The results show that both undisturbed and remoulded soils exhibit some degree of normalization, but the normalization of the undisturbed soil is poorer than that of the remoulded soil; the main reasons are that undisturbed samples are less uniform and that unavoidable disturbance during sampling and testing prevents good normalization. It is therefore feasible to normalize the soil of the Tianjin Coastal New Developing Area with the ambient pressure as the normalization factor.
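A minimal sketch of the normalization used in the last step, assuming the deviator stress is simply divided by the ambient (confining) pressure; the toy curves below stand in for test data.

```python
import numpy as np

def normalize_stress_strain(deviator_stress, confining_pressure):
    """Normalize a triaxial deviator-stress curve by the ambient
    (confining) pressure; if curves from different pressures collapse
    onto one band, the soil is said to normalize well."""
    return np.asarray(deviator_stress, dtype=float) / float(confining_pressure)

strain = np.linspace(0.0, 0.15, 4)
q_100 = 100.0 * strain / (0.01 + strain)   # toy curve at 100 kPa confinement
q_200 = 200.0 * strain / (0.01 + strain)   # toy curve at 200 kPa confinement
print(normalize_stress_strain(q_100, 100.0))  # identical after normalization...
print(normalize_stress_strain(q_200, 200.0))  # ...so these toy data normalize perfectly
```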
Abstract:
Unsaturated expansive soil is a hotspot and a difficulty in soil mechanics at home and abroad. Expansive soil in China is among the most widely distributed and largest in area, and disasters caused by it occur continually as a result. Soil mechanics tests, monitoring, numerical simulation and engineering practice are used to research the swell-shrink characteristics, boundary-surface strength characteristics and unsaturated strength characteristics of Mengzi expansive soil. The seepage and stability of fissured expansive soil slopes are analysed, and, based on the proposed slope disaster mechanism, two new techniques for use in expansive soil areas are put forward. The reinforcement technique for road embankments is also optimized. In connection with the engineering geology of Mengzi expansive soil, the mineral composition, chemical composition, specific surface area and cation content, soluble salts and cementation, microscopic fabric, origin, and depth of atmospheric influence are analysed to explain the intrinsic cause and essence of its swelling and shrinkage. The relationship between swell-shrink behaviour and the initial state, namely initial water content, initial dry density and initial pressure, can be used for construction control; a dose-response model, based on ternary regression analysis, is suitable for simulating this relationship, which has great significance for expansive soil engineering in saline or alkaline areas. The mechanical behaviour of the expansive soil under CD, CU and GCU tests is researched with boundary surface theory to explain the remarkable effects of consolidation pressure, initial dry density, initial water content, shearing rate, drainage and reinforcement on the boundary-surface strength characteristics. The weakly hardening stress-strain curves can be fitted with a hyperbolic model, and the weakly softening curves with an exponential model (a hyperbolic-fit sketch follows this abstract). Normalization theory is used to reveal the intrinsic unity behind the differences that different test methods introduce into the shear strength of the same kinds of samples. The unsaturated strain-softening characteristics and strength envelopes of remoulded samples are researched by suction-controlled triaxial shear tests, the results of which are simulated with an exponential function. The strength parameters of the unsaturated samples are obtained for use in unsaturated seepage analysis associated with rainfall. The elastic and plastic characteristics of the expansive soil are researched to obtain the parameters of a modified G-A model. The wetting-induced failure characteristics of the expansive soil are discussed to study the slope disaster mechanism under increasing back pressure and decreasing suction during biased-pressure consolidation. Indoor and outdoor SWCCs are measured to research the influencing factors and the relationships under different stress and filling environments; the moisture absorption curves express the in-situ relationship between suction and water content. The SWCCs of Mengzi expansive soil are measured with a GDS stress path triaxial system, and the unsaturated infiltration function is obtained for researching the seepage and stability of expansive soil slopes. Rainfall infiltration and slope stability under multiple factors are studied by analysing the causes of fissuring in Mengzi expansive soil, and a slope disaster mechanism governed by the dual controlling effects of suction and fissures is put forward.
Two new techniques are put forward to mitigate expansive soil disasters, and the embankment reinforcement technique is optimized, providing useful help in solving engineering problems.
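The hyperbolic fit mentioned for the weakly hardening curves is classically done by linearizing q = eps / (a + b*eps). A minimal sketch with synthetic data; the thesis's actual fitting procedure is not specified, so this shows only the standard construction.

```python
import numpy as np

def fit_hyperbola(strain, deviator_stress):
    """Fit the hyperbolic model q = eps / (a + b*eps) via the classic
    linearization eps/q = a + b*eps; 1/a is the initial stiffness and
    1/b the asymptotic (ultimate) deviator stress."""
    eps = np.asarray(strain, dtype=float)
    q = np.asarray(deviator_stress, dtype=float)
    b, a = np.polyfit(eps, eps / q, 1)    # slope b, intercept a
    return a, b

eps = np.array([0.005, 0.01, 0.02, 0.05, 0.10])
q = eps / (0.0002 + 0.008 * eps)          # synthetic hardening curve
a, b = fit_hyperbola(eps, q)
print(round(a, 5), round(b, 4), round(1 / b, 1))  # recovers a, b, and q_ult = 125.0
```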
Abstract:
The ionogram acquired by vertical ionospheric sounding is the oldest data type in the history of ionospheric research. Using modern microelectronics and computer technology to digitize, analyse and preserve the huge volume of historical film ionograms has become increasingly important and urgent. This paper describes progress in film ionogram digitization, using digital image processing technologies to correct and repair film ionograms and convert them into an exchangeable format. Analysis and conversion software based on this method has been developed, and its application is introduced in combination with the SAO Explorer program, for Wuhan film ionograms and for pseudo-colour ionograms from Yamagawa, Japan. The results show that our method is reliable and the software is user-friendly, providing a practical solution for the digitization and analysis of large numbers of historical film ionograms. First, we briefly introduce the film ionogram and the process of its digitization. By examining a large number of film ionograms, we identify the following common characteristics of the digitized images: (1) image rotation is introduced by scanning; (2) the vertical axes of many film ionograms are more or less tilted and bent; (3) the coordinates of the film ionograms are non-uniform as a result of instability in the driving motor rotation and errors in altitude cursor positioning. Based on these characteristics and on the SAO Explorer software, which is widely used worldwide for digital ionogram analysis, a new method has been developed for film ionogram processing. The method comprises image geometric correction and film ionogram format conversion. The geometric correction includes image rotation correction, vertical correction and coordinate scale correction; after geometric correction, the BMP-format images are converted to SBF-format images. We also discuss the data format conversion methods, which include two image data mappings, based on normalization and on the logarithm, together with preprocessing by noise filtering and threshold setting. Combined with the SAO Explorer software, we successfully obtain ionospheric parameters and electron density profiles from the converted SBF-format digital ionograms. Based on the above method, we developed software for film ionograms that performs correction, analysis and image format conversion, and we introduce its function and operation. The software was then applied to Wuhan film ionograms observed in a high and a low solar activity year of the 1980s. The results reveal that the converted SBF digital ionograms preserve almost all the echo information of the film ionograms. Furthermore, we specifically discuss the application to the Wuhan film ionograms of 1958 in order to validate the applicability and credibility of the software; the important information of the film ionograms is shown to be maintained in the SBF digital ionograms, demonstrating that the conversion remains credible for older film ionograms. In sum, this software can be applied to the digitization and analysis of large numbers of historical film ionograms.
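A minimal sketch of the two image data mappings mentioned above, normalization-based (linear) and logarithm-based, with a simple threshold as the noise floor; parameter choices and names are illustrative, not the software's actual interface.

```python
import numpy as np

def map_intensity(image, mode="linear", threshold=0):
    """Map scanned-film gray levels to the output range either linearly
    (min-max normalization) or logarithmically, after suppressing pixels
    below a noise threshold."""
    img = np.asarray(image, dtype=float)
    img = np.where(img < threshold, 0.0, img)  # simple noise floor
    if mode == "log":
        img = np.log1p(img)                    # compress dynamic range
    lo, hi = img.min(), img.max()
    return (255 * (img - lo) / (hi - lo)).astype(np.uint8)

film = np.array([[5, 40, 200], [0, 90, 255]], dtype=float)
print(map_intensity(film, "linear", threshold=10))
print(map_intensity(film, "log", threshold=10))  # faint echoes boosted
```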
Last, we extended the software with some new conversion methods and applied it to the pseudo-colour ionograms of Yamagawa, Japan. The results show that the converted ionograms basically retain the important ionogram information and that the scaling error of the converted SBF-format images is acceptable, even though the original ionograms received no preprocessing. Hence, by improving the method and software, its applicable range could be extended to all kinds of analogue ionogram images.
Abstract:
Characterization of the platinum group elements (PGE) has been applied in the earth, space and environmental sciences. All these applications, however, rest on a basic prerequisite: that PGE concentrations or ratios in the objects of study can be determined accurately and precisely. In fact, progress in these studies poses a great challenge to the analytical chemistry of the PGE, because their content in (non-mineralized) geological samples is often extremely low, ranging from ppt (10⁻¹² g/g) to ppb (10⁻⁹ g/g) levels, and their distribution is highly heterogeneous, usually concentrated in single particles or phases. Accurate determination of these elements therefore remains a problem in analytical chemistry and obstructs research on PGE geochemistry. The scientific community has made great efforts toward reliable determination of very low amounts of PGE, focusing on reducing the background level of the reagents used and on overcoming the probable heterogeneity of PGE in samples. Undoubtedly, the fire-assay method is one of the best ways to deal with heterogeneity, since a large sample weight (10-50 g) can be used. This thesis is mainly aimed at developing methodology for the separation, preconcentration and determination of ultra-trace PGE in rock and peat samples, which is then applied to trace PGE in the ophiolite suite at Kudi, West Kunlun, and in the 1908 Tunguska explosion. The achievements of the study are summarized as follows: 1. A PGE laboratory is established in the Laboratory of Lithosphere Tectonic Evolution, IGG, CAS. 2. A modified method for determining PGE in geological samples by NiS fire assay with inductively coupled plasma mass spectrometry (ICP-MS) is set up, with the following technical improvements: (1) the background levels of the reagents used are investigated; the contents of Au, Pt and Pd are found to be 30, 0.6 and 0.6 ng/g, respectively, in carbonyl nickel powder and 0.35, 7.5 and 6.4 ng, respectively, in the other flux, while the contents of Ru, Rh and Os in all reagents used are very low (below or near the detection limits of ICP-MS); (2) PGE recoveries with different collectors (Ni+S) are measured; 1.5 g of carbonyl nickel is found to be effective for recovering the PGE from 15 g samples (recoveries above 90%), reducing the inherent blank due to reagent impurities; (3) the nickel button is dissolved directly in a Teflon bomb and Te-precipitation is used, reducing the loss of PGE during preconcentration and improving recoveries (above 60% for Os and 93.6-106.3% for the other PGE, using 2 g of carbonyl nickel); (4) the procedure for analysing osmium is simplified; (5) the method detection limits for a 15 g sample are 8.6, 4.8, 43, 2.4 and 82 pg/g for Ru, Rh, Pd, Ir and Pt, respectively. 3. An analytical method is set up to determine ultra-trace PGE in peat samples, with method detection limits of 0.06, 0.1, 0.001, 0.001 and 0.002 ng/mL for Ru, Rh, Pd, Ir and Pt, respectively. 4. Using this method, distinct anomalies of Pd and Os are found for the first time in peat sampled near the Tunguska explosion site. 5. Applying the method to the study of the origin of the Tunguska explosion leads to the following conclusions: (1) the excess elements most likely resulted from the Tunguska Cosmic Body (TCB) explosion of 1908; (2) the Tunguska explosive body was composed of (solid) materials similar to C1 chondrite and was most probably a cometary object weighing more than 10⁷ tons with a radius of more than 126 m. 6. The analytical method for ultra-trace PGE in rock samples is successfully used to study the PGE characteristics of the Kudi ophiolite suite, leading to the following conclusions: (1) the differences in the mantle-normalized PGE patterns of the dunite, harzburgite and lherzolite at Kudi indicate that they are residues of multi-stage partial melting of the mantle, and their similar degree of Ir depletion probably indicates an Ir-depleted upper mantle; (2) with the evolution of the magma produced by partial melting of the mantle, strong fractionation develops between IPGE and PPGE, becoming more and more distinct from pyroxenite to basalt; (3) the magma forming the Kudi ophiolite probably underwent an S-saturation process.
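Mantle normalization of PGE patterns, as used in conclusion 6(1), divides each measured concentration by a primitive-mantle reference value. A minimal sketch; the reference values below are rough placeholders, not the compilation used in the thesis.

```python
import numpy as np

# Illustrative primitive-mantle reference values in ng/g (ppb); real studies
# use published reference compilations, so treat these as placeholders only.
MANTLE_PPB = {"Os": 3.4, "Ir": 3.2, "Ru": 5.0, "Pt": 7.1, "Pd": 3.9}

def mantle_normalize(sample_ppb):
    """Divide each measured PGE concentration by the mantle reference
    value; plotting the ratios in IPGE-to-PPGE order gives the
    mantle-normalized PGE pattern used to compare rock types."""
    return {el: sample_ppb[el] / MANTLE_PPB[el] for el in sample_ppb}

dunite = {"Os": 4.1, "Ir": 2.2, "Ru": 6.0, "Pt": 2.8, "Pd": 1.1}  # toy data
pattern = mantle_normalize(dunite)
print({el: round(v, 2) for el, v in pattern.items()})  # Ir depletion shows as a ratio < 1
```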