970 results for "invalid match"


Relevance: 10.00%

Abstract:

The seismic survey is the most effective geophysical prospecting method in the exploration and development of oil and gas. As the structure and lithology of geological bodies become increasingly complex, seismic sections must offer higher resolution if the targets are to be described accurately, and a high signal-to-noise (S/N) ratio is the precondition of high resolution. Stacking, an important seismic data processing method, is an effective means of suppressing noise in the records, and broadening the stacked surface area further enhances genuine reflection signals while suppressing unwanted coherent and random ambient noise. The common reflection surface (CRS) stack is a macro-model-independent seismic imaging method: based on the similarity of CRP trace gathers within one coherent zone, it effectively improves the S/N ratio by stacking over more CMP trace gathers, and it is regarded as an important seismic data processing method. Performing the CRS stack depends on three attributes; however, the CRS equation becomes invalid at large offsets. In this thesis, a method based on a velocity model in the depth domain is put forward. Ray tracing is used to determine the traveltimes of CRPs on a common reflection surface, and least-squares fitting is used to regress the CRS equation. We then stack the coherent seismic data set along these traveltimes to obtain the zero-offset section. At the end of the CRS-stack workflow, a method that uses the dip angle to further enhance the S/N ratio is applied. Applied to synthetic examples and field seismic records, the method shows excellent performance in both accuracy and efficiency.
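
The abstract does not write out the CRS operator; for reference, the standard hyperbolic 2D CRS traveltime approximation that depends on the three kinematic wavefield attributes (the emergence angle alpha and the two wavefront radii R_NIP and R_N) is usually given as below, where x_0 is the central midpoint, x_m the midpoint, h the half-offset, v_0 the near-surface velocity and t_0 the zero-offset traveltime. Its breakdown at large offsets is what motivates the depth-domain, ray-tracing-based fit described above; the formula itself is the textbook expression, not one quoted from this thesis.

```latex
t^2(x_m, h) =
  \left( t_0 + \frac{2\sin\alpha}{v_0}\,(x_m - x_0) \right)^2
  + \frac{2\,t_0 \cos^2\alpha}{v_0}
    \left( \frac{(x_m - x_0)^2}{R_\mathrm{N}} + \frac{h^2}{R_\mathrm{NIP}} \right)
```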

Relevance: 10.00%

Abstract:

In order to discover the distribution law of the remaining oil, this work focuses on the quantitative characterization of reservoir heterogeneity and on the distribution of fluid barriers and interbeds, based on a fine-scale geological study of the reservoir in the Liuhua 11-1 oil field. A refined quantitative reservoir geological model has been established by means of core analysis, logging evaluation of vertical and horizontal wells, and seismic interpretation and prediction. Using an integrated technology that combines dynamic data with static data, the distribution characteristics, formation conditions and controlling factors of the remaining oil in the Liuhua 11-1 oil field have been illustrated. The study identifies the enrichment regions of the remaining oil and gives scientific direction for its further development. The main achievements are as follows. 1. On the basis of reservoir division and correlation, eight lithohorizons (layers A, B_1, B_2, B_3, C, D, E and F, from top to bottom) are discriminated. The reef facies is subdivided into reef-core, fore-reef and back-reef subfacies, which are further subdivided into five microfacies: coral-algal limestone, coral-algal micrite, coral-algal clastic limestone, bioclastic limestone and foraminiferal limestone. To illustrate the distribution of remaining oil in the high water-cut period, the stratigraphic structure model and sedimentary model are reconstructed. 2. To study intra-layer, inter-layer and areal reservoir heterogeneity, a new method that characterizes heterogeneity with an Index of Reservoir Heterogeneity (IRH) is introduced. The results indicate that heterogeneity is medium in layers B_1 and B_3, strong in layers A, B_2, C and E, and weak in layer D. 3. Based on the distribution of fluid barriers and interbeds, their effect on fluid seepage is revealed. Fluid barriers and interbeds are abundant in layer A, where they control the distribution of crude oil; they are relatively abundant in layers B_2, C and E, where they control the upward movement of the bottom water; layers B_1, B_3 and D tend to be waterflooded because fluid barriers and interbeds are poorly developed there. 4. Based on the analysis of reservoir heterogeneity, fluid barriers and interbeds, and the distribution of bottom water, four contributing regions are identified: the main one lies to the north of well LH11-1A, two minor ones lie to the east of well LH11-1-3 and between wells LH11-1-3 and LH11-1-5, and the last lies in layer E, where the interbeds are discontinuous. 5. The reservoir and fluid parameters are obtained from core analysis, logging evaluation of vertical and horizontal wells, and seismic interpretation and prediction; these parameters provide the data for the quantitative characterization of reservoir heterogeneity and of the distribution of fluid barriers and interbeds. 6. An integrated method for predicting the distribution of remaining oil is put forward on the basis of the refined reservoir geological model and reservoir numerical simulation; the precision of the history match and of the remaining-oil prediction is greatly improved, and this integrated study reflects the latest trend in the field. 7. It is shown that the enrichment of the remaining oil at high water cut in the Liuhua 11-1 oil field is influenced by reservoir heterogeneity, fluid barriers and interbeds, the sealing property of faults, the drive mechanism of the bottom water, and the exploitation mode of horizontal wells. 8. Using microfacies, IRH, reservoir structure, effective thickness, reservoir physical properties, the distribution of fluid barriers and interbeds, the analysis of oil and water movement, and production data, twelve new sidetracked holes are proposed and demonstrated. The result usefully guides oil field development and has produced good effects.
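
The abstract does not define how the IRH is computed. As a purely illustrative stand-in, the sketch below computes the classical Dykstra-Parsons coefficient, a widely used single-number measure of permeability heterogeneity from core data; the layer names and permeability values are hypothetical and are not taken from the thesis.

```python
import numpy as np

def dykstra_parsons(perm_md):
    """Dykstra-Parsons coefficient V from core permeabilities (mD).

    V ~ 0 for a homogeneous reservoir, V -> 1 for a strongly
    heterogeneous one.  Assumes an approximately log-normal
    permeability distribution.
    """
    logk = np.log(np.asarray(perm_md, dtype=float))
    k50 = np.exp(np.mean(logk))                   # median under log-normality
    k841 = np.exp(np.mean(logk) - np.std(logk))   # one std dev below the median
    return (k50 - k841) / k50

# Hypothetical core permeabilities for two layers (mD)
layer_B1 = [120, 95, 140, 110, 130, 105]
layer_A = [15, 300, 40, 800, 5, 120]
print(dykstra_parsons(layer_B1))   # low value  -> weak heterogeneity
print(dykstra_parsons(layer_A))    # high value -> strong heterogeneity
```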

Relevance: 10.00%

Abstract:

Historical land use and land cover change is one of the key issues in LUCC research, yet China's achievements in this field do not yet match its position in the world, and the reliability of the quantitative records in Chinese historical literature, the basic data for historical land use research, has been doubted. This research focuses on Re-Cha-Sui, a typical area of the farming-pastoral region in northern China, as a detailed case study. Based on deep mining and calibration of data from massive historical documents and land-use surveys, the author analyses in detail the evolution of administrative regions, historical population dynamics, reclamation policy, and the land statistics system. Through textual research, parallel validation and physical geographical analysis, a unified land use series for the last 300 years, anchored to the results of modern land-use surveys, is constructed, and thematic maps of the cultivation index for different counties in several temporal sections are plotted. On this basis, the dynamics of forest and steppe are reconstructed, the temporal-spatial patterns of land use/land cover change in the area are analysed, and the influence of different driving forces is discussed. The main conclusions are as follows. 1. The quantitative records in the literature on the Re-Cha-Sui area reflect the real amounts of cropland; it is practical to reconstruct results comparable with modern land-use surveys based on deep mining and careful validation of historical documents, and the wholesale negative attitude towards numerical records in historical documents is unnecessary. 2. In the last 300 years, three climaxes of reclamation occurred in the Re-Cha-Sui area and turned a purely pastoral area into a farming-pastoral region. The intervals were, respectively, the early to middle Qing dynasty, the end of the Qing dynasty to the early Republic of China (ROC), and the period after A.D. 1949. After the first expansion, the cropland area in the region reached 2.0 million ha; among the sub-areas, Guisui, the most densely cultivated, had a cultivation index over 30%, similar to the modern situation. The second expansion covered a broader area, and cropland reached 3.5 million ha. The increase of farming area after 1949 was due to the recultivation of abandoned farmland; the current cropland area in the region is 5.6 million ha. In the southern area, reclaimed early, the amount of cropland fluctuated over the 300 years, while in the newly reclaimed area in the north it kept increasing. 3. Owing to the different natural conditions, most forests in the Re-Cha-Sui area are distributed in the mountain area of northern Hebei province and the uplands of western Liaoning province, especially the former, with forest coverage near 70%. However, most of these forests were destroyed before the end of the Qing dynasty; by 1949 the natural forest near Chengde had been almost cleared. They were partly restored after 1949 through plantation. 4. In the steppe zone, such as northern Rehe, Suiyuan and Chahar, the area of steppe correlates negatively with that of cropland: with the expansion of cropland, the percentage of steppe shrank from over 80% to 53%. In the mountain area of northern Hebei province, steppe expanded with the shrinkage of forest even though cropland was also expanding; its percentage once reached 60% and then fell with the recovery of the forest. In the uplands of western Liaoning province, however, the steppe shrank slowly from an original 50% to the current 26% as cropland expanded. 5. Land use and land cover change in the Re-Cha-Sui area over the last 300 years has been driven by various factors, including human dimensions such as population, government policy, social disorder and cultural tradition, and natural factors such as climate change and natural disasters. Among them, pressure from surplus population is the basic driving force.

Relevance: 10.00%

Abstract:

With the progress of oil and gas exploration, continental exploration in China is shifting from structural oil and gas reservoirs to subtle (lithologic) oil and gas reservoirs. The reserves of the subtle reservoirs found so far account for more than 60 percent of the discovered oil and gas reserves, so exploration for subtle reservoirs is becoming more and more important and can be taken as the main direction for the growth of oil and gas reserves. The characteristics of continental sedimentary facies determine the complexity of lithologic exploration. Most of the continental rift basins in East China have entered exploration stages of medium to high maturity. Although the quality of the seismic data is relatively good, these areas are characterized by thin sands, small faults and stratigraphic units of limited extent, which require seismic data of high resolution; improving the signal-to-noise ratio of the high-frequency components of the seismic data is therefore an important task. In West China there are complex landforms, deeply buried exploration targets, complex geological structures, many faults, traps of small extent, poor rock properties, many overpressured formations and drilling difficulties, which translate into low signal-to-noise ratios and many kinds of noise in the seismic records; methods and techniques for noise attenuation in data acquisition and processing therefore need to be developed. Thus oil and gas exploration needs high-resolution geophysical techniques in order to implement the oil resources strategy of keeping production and reserves stable in East China and increasing crude production and reserves in West China. A high signal-to-noise ratio of the seismic data is the basis: high resolution and high fidelity are impossible without it. We place emphasis on research into improving the signal-to-noise ratio in structurally complex areas based on structure analysis, and several noise-attenuation methods are put forward to truly reflect the geological features: they preserve the geological structures, keep the edges of geological features and improve the identification of oil and gas traps. The ideas of emphasizing the foundations, giving prominence to innovation and paying attention to application run through the thesis. The conventional dip-scanning method, centered on the scanned point, inevitably blurs the edges of geological features such as faults and fractures; to solve this problem we develop a new dip-scanning method that takes the analysis point as an end point and scans toward its two sides. Using this new dip-scanning scheme, we put forward methods of signal estimation with coherence, of exploiting the coherence of seismic wave characteristics, and of most-homogeneous dip scanning for noise attenuation; they preserve the geological character, suppress random noise and improve the signal-to-noise ratio and resolution. Routine dip scanning operates in the time-space domain; a new dip-scanning method in the frequency-wavenumber domain is put forward for noise attenuation. It exploits the property that reflections with different dips separate in the f-k domain, so it can reduce noise and recover dip information. We describe a methodology for studying and developing filtering methods based on differential equations: it transforms filtering equations in the frequency or f-k domain into the time or time-space domain and uses a finite-difference algorithm to solve the resulting equations. This method does not require the seismic data to be stationary, so the filter parameters can vary at every temporal and spatial point, which enhances the adaptability of the filter, and it is computationally efficient. We put forward a matching-pursuit method for noise suppression: it decomposes a signal into a linear expansion of waveforms selected from a redundant dictionary of functions, chosen to best match the signal structures, so the effective signal can be extracted from the noisy record and the noise reduced. We also introduce a beamforming filtering method for noise elimination; processing of real seismic data shows that it is effective in attenuating multiples, including internal multiples, improving the signal-to-noise ratio and resolution while preserving the fidelity of the effective signals. Calculations on theoretical models and application to real seismic data processing prove that the methods in this thesis can effectively suppress random noise, eliminate coherent noise and improve the resolution of the seismic data; they are highly practical and their effect is evident.
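
The matching-pursuit decomposition mentioned above is a standard greedy algorithm; the sketch below is a minimal, generic numpy implementation rather than the thesis code, and it assumes a precomputed dictionary of unit-norm atoms. For seismic traces the dictionary would typically consist of Ricker or Gabor wavelets at different scales and positions.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_iter=50, tol=1e-6):
    """Greedy matching pursuit: represent `signal` as a sparse linear
    combination of the unit-norm columns of `dictionary`.

    Returns the coefficient vector over the dictionary atoms.
    """
    residual = np.asarray(signal, dtype=float).copy()
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(n_iter):
        corr = dictionary.T @ residual        # correlation with every atom
        best = np.argmax(np.abs(corr))        # best-matching atom
        if np.abs(corr[best]) < tol:
            break
        coeffs[best] += corr[best]            # accumulate its contribution
        residual -= corr[best] * dictionary[:, best]
    return coeffs

# Toy example: a noisy sum of two atoms from a random unit-norm dictionary
rng = np.random.default_rng(0)
D = rng.standard_normal((256, 64))
D /= np.linalg.norm(D, axis=0)
clean = 3.0 * D[:, 5] - 2.0 * D[:, 40]
noisy = clean + 0.3 * rng.standard_normal(256)
denoised = D @ matching_pursuit(noisy, D, n_iter=10)
```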

Relevance: 10.00%

Abstract:

Metacognitive illusion, or metacognitive bias, is a concept closely tied to the accuracy of metacognitive monitoring. In this dissertation, metacognitive illusion refers mainly to the absolute difference between judgments of learning (JOLs) and recall that arises when individuals are misled by invalid cues or information. JOL is one kind of metacognitive judgment, a prediction about future performance on learned materials, and its mechanism and accuracy are the key issues in JOL research. The cue-utilization framework proposed by Koriat (1997) summarized previous findings and provided a significant advance in understanding how people make JOLs, but the model cannot explain individual differences in JOL accuracy. From the perspective of people's cognitive limits, our study uses a posteriori associative word pairs, which readily produce metacognitive bias, to explore the underlying psychological mechanism of metacognitive bias. Moreover, we investigate the causes of the stronger metacognitive illusions shown by children with learning disabilities (LD). On this basis, the study also looks for ways of correcting metacognitive illusions. Finally, summarizing the present findings and the previous literature, we propose a revised account of cue selection and utilization by children with LD based on Koriat's cue-utilization model. The results indicated that: (1) Children showed stable metacognitive illusions for weakly associated and a posteriori associated word pairs, but not for strongly associated word pairs. Metacognitive illusions were stronger for children with LD than for normal children, and there were significant grade differences. A priori associative strength exerted a weaker effect on JOL than on recall. (2) Children with LD mainly used retrieval fluency to make JOLs under both immediate and delayed conditions, whereas normal children showed some differentiation between encoding fluency and retrieval fluency as potential cues for JOLs across immediate and delayed conditions; children with LD evidently lacked flexibility in cue selection and utilization. (3) When the word pairs formed a new list, the analytic-inference group showed stronger metacognitive transfer effects than the heuristic-inference group among normal children in the second block. Metacognitive relative accuracy increased for children both with and without LD across the experimental conditions, but it improved significantly only for normal children in the analytic-inference group.
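
For concreteness, the two quantities the dissertation relies on, absolute bias (the difference between mean JOL and mean recall) and relative accuracy, are commonly computed as sketched below. This is a generic illustration rather than the dissertation's analysis code; the 0-100 JOL scale and the use of Goodman-Kruskal gamma for relative accuracy are assumptions.

```python
import numpy as np
from itertools import combinations

def jol_bias_and_gamma(jol, recall):
    """Absolute metacognitive bias and Goodman-Kruskal gamma.

    jol    : per-item judgments of learning (assumed 0-100 scale)
    recall : per-item recall outcomes (1 = recalled, 0 = not recalled)
    """
    jol = np.asarray(jol, float)
    recall = np.asarray(recall, float)
    bias = jol.mean() / 100.0 - recall.mean()   # > 0 means overconfidence (illusion)
    concordant = discordant = 0
    for i, j in combinations(range(len(jol)), 2):
        product = (jol[i] - jol[j]) * (recall[i] - recall[j])
        if product > 0:
            concordant += 1
        elif product < 0:
            discordant += 1
    pairs = concordant + discordant
    gamma = (concordant - discordant) / pairs if pairs else np.nan
    return bias, gamma

# Hypothetical data for one participant
print(jol_bias_and_gamma([80, 70, 90, 40, 60], [1, 0, 1, 0, 0]))
```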

Relevance: 10.00%

Abstract:

The relationship between exogenous and endogenous attention is still unclear. We synthesize the results of previous studies and of our own experiments by proposing a convergence hypothesis. Considerable evidence from ERP and fMRI studies supports the distinction between exogenous and endogenous attention, a distinction well suited to complex environments. Some studies found that exogenous and endogenous attention activate some common brain areas, but whether those areas serve a common function remained unclear; previous studies failed to disentangle the relationship between the two mechanisms in detail and mostly compared them separately. We used the cue-target paradigm and controlled the SOA and cue validity to separate endogenous from exogenous attention, aiming to clarify the relationship between them. We found that in the exogenous attention condition the P1 amplitude was larger, and the N1 amplitude smaller, in valid than in invalid trials, whereas in the endogenous attention condition both the P1 and N1 amplitudes were larger in valid than in invalid trials. The difference waves for endogenous and exogenous attention showed that the two processes are separate at the beginning and then converge. A stimulus appearing in the visual field activates the "external trigger" of exogenous attention processing, after which the signal is passed to the next stage; in endogenous attention processing, when the observer intends to attend, the "inner trigger" is activated and the signal is passed on. The difference between the "external trigger" and the "inner trigger" of endogenous and exogenous attention is significant, so the difference at the beginning of processing is significant; once the signal reaches the next processing stage, the difference between endogenous and exogenous attention diminishes, which constitutes a convergence process. Using a cue-target paradigm in which a central cue and a peripheral cue appeared within the same trial, we investigated the relationship between endogenous and exogenous attention under different perceptual load conditions. The difference-wave results showed that the relationship between the two forms of attention was consistent with the convergence hypothesis; we also found that the relationship differed across perceptual load conditions, with perceptual load influencing the "external trigger" and the "inner trigger". Finally, a cue-target paradigm was used to investigate the relationship between attention, age and perceptual load. Visual attention function declined in the older group, who needed more attentional resources for the task; ageing influenced the "external trigger" and the "inner trigger", and the effects of age and perceptual load were much larger in the exogenous attention condition.
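
The difference waves referred to above are simple condition subtractions of averaged epochs. The sketch below is a minimal numpy illustration, using random placeholder data rather than the study's recordings, of how valid-minus-invalid difference waves for the two cueing conditions might be formed and compared.

```python
import numpy as np

def erp_average(epochs):
    """Average single-trial epochs (trials x timepoints) into an ERP."""
    return np.asarray(epochs, float).mean(axis=0)

def cueing_difference_wave(valid_epochs, invalid_epochs):
    """Valid-minus-invalid difference wave for one cueing condition."""
    return erp_average(valid_epochs) - erp_average(invalid_epochs)

# Placeholder epochs: 40 trials x 500 timepoints per condition
rng = np.random.default_rng(1)
exo_valid, exo_invalid, endo_valid, endo_invalid = (
    rng.standard_normal((40, 500)) for _ in range(4))

exo_diff = cueing_difference_wave(exo_valid, exo_invalid)
endo_diff = cueing_difference_wave(endo_valid, endo_invalid)
# Comparing the two difference waves over time is one way to ask where
# exogenous and endogenous attention diverge and where they converge.
convergence_curve = exo_diff - endo_diff
```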

Relevance: 10.00%

Abstract:

Although the robust finding has been that positive mood leads to greater creativity, several other studies have found that negative mood sometimes yields more creative performance than positive or neutral mood. Researchers have proposed several possible explanations for this emotion-creativity link, but no definitive research has yet identified the mechanism(s) behind it. This research takes an initial step in that direction, examining from a developmental perspective the possibility that Intelligence Current may be a contributing mechanism in the emotion-creativity link. The aims of the present study were: 1) to examine, using the Unusual Uses Test, the effects and mechanism(s) of emotions on the development of creativity from adolescence to young adulthood; and 2) to examine the possibility that Intelligence Current is a contributing mechanism in the emotion-creativity link. The participants were 849 adolescents in high schools and 267 university undergraduates, aged 11 to 22 years. The mechanism(s) of the emotion-creativity link were explored through cognitive flexibility (assessed with the Abstract Match Task), tolerance (inclusiveness) ratings (assessed with the Categorization of Analogy Task), originality ratings of uses, and confidence ratings. The results indicated that: 1) The level of creativity varied with age: it increased from age 11, decreased at about 14, and improved again from 15 to 22. 2) Emotions affected the development of creativity differently among adolescents and undergraduates, but across all participants the effects of positive and negative emotions on creativity did not differ from each other; moreover, compared with positive and negative emotions, neutral emotions produced the lowest creativity in the 11.00-13.99 age group but the highest creativity in the 14.00-14.99 and 17.00-21.99 age groups. 3) On the Abstract Match Task, positive emotions enabled individuals to reach a higher level of cognitive flexibility and better performance than negative emotions, which can be taken as evidence that Intelligence Current may be a contributing mechanism in the emotion-creativity link. 4) Compared with negative emotions, positive emotions also led most participants, especially the 14.00-14.99 age group, to be more confident, to categorize items more flexibly, to see more potential relatedness among unusual and atypical members of categories, and to rate items as more original. In sum, the present study helps to further explicate the term 'Intelligence Current' and the issues found in the relationship between creativity and emotions; further research is required to explore and confirm these conclusions.

Relevance: 10.00%

Abstract:

Unlike alphabetic languages, Chinese uses an ideographic writing system: each Chinese character is a single syllable and usually carries a direct meaning, so Chinese characters are valuable experimental material for research on reading and for comparing the reading mechanisms of different languages. In this work, normal participants and patients with semantic dementia took part in two sets of experiments on the orthographic, phonological, semantic and frequency effects in reading Chinese characters. A Stroop-like character-picture interference paradigm was used to investigate the orthographic, phonological, semantic and frequency effects of Chinese characters on picture naming when characters were presented together with pictures to normal participants. The results indicated that an orthographic facilitation effect, a phonological facilitation effect and a semantic interference effect occurred at different SOA values, and that the orthographic and phonological facilitation effects were independent. It was shown for the first time that an interaction between the orthographic and semantic variables occurs when high-frequency Chinese characters are read. Comparison of the SOAs showed that phonological representations were activated more quickly than semantic representations; in general, this means that normal readers can read Chinese characters without accessing meaning. The orthographic, phonological, semantic, frequency and concreteness effects of Chinese characters were further investigated in dementia patients with DAT (dementia of the Alzheimer type), CVA, or both, all of whom have impaired semantic memory. The results showed that patients with dementia could read the names of the pictures aloud even when they could not name the pictures or match them with the right character; this is reading without meaning in Chinese among the dementia patients. Meanwhile, they showed a selective reading impairment and made more LARC (legitimate alternative reading of components) errors, especially when reading low-frequency irregular, low-frequency inconsistent and abstract Chinese characters. As the patients' semantic impairment progressed, their ability to read the picture names remained, whereas their ability to read low-frequency irregular and low-frequency inconsistent Chinese characters declined. These results indicate that low-frequency irregular Chinese characters can be read correctly only when reading is supported by their semantic information. Based on the findings of reading without meaning and of the reading of low-frequency irregular characters being supported by semantic information, it is reasonable to suggest that at least two routes are involved in reading Chinese characters: a direct phonological route and an indirect semantic route, and that the two routes are independent.

Relevance: 10.00%

Abstract:

A method for localization and positioning in an indoor environment is presented. The method is based on representing the scene as a set of 2D views and predicting the appearances of novel views by linear combinations of the model views. The method is accurate under weak perspective projection. Analysis of this projection as well as experimental results demonstrate that in many cases it is sufficient to accurately describe the scene. When the weak perspective approximation is invalid, an iterative solution that accounts for the perspective distortions can be employed. A simple algorithm for repositioning, the task of returning to a previously visited position defined by a single view, is derived from this method.
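
Below is a minimal numpy sketch of the linear-combination-of-views idea described above. It assumes feature-point correspondences between the stored model views and the new image are already available, which the abstract does not cover, and it is an illustration rather than the paper's implementation; the residual of the fit indicates how well the new image matches the stored scene.

```python
import numpy as np

def view_design_matrix(model_views):
    """Stack x and y columns of each stored 2D view plus a constant term."""
    n_pts = model_views[0].shape[0]
    return np.hstack(list(model_views) + [np.ones((n_pts, 1))])

def fit_view_coefficients(model_views, novel_view):
    """Fit a novel view as a linear combination of model views.

    model_views : list of (N, 2) arrays of image coordinates of the same
                  N feature points in each stored view
    novel_view  : (N, 2) array of the same points in the new image
    """
    A = view_design_matrix(model_views)
    coeff_x, *_ = np.linalg.lstsq(A, novel_view[:, 0], rcond=None)
    coeff_y, *_ = np.linalg.lstsq(A, novel_view[:, 1], rcond=None)
    return coeff_x, coeff_y

def predict_points(model_views, coeff_x, coeff_y):
    """Predict where the model points appear in the novel view."""
    A = view_design_matrix(model_views)
    return np.column_stack([A @ coeff_x, A @ coeff_y])

# Synthetic correspondences: 8 points seen in 3 model views
rng = np.random.default_rng(0)
views = [rng.standard_normal((8, 2)) for _ in range(3)]
novel = 0.4 * views[0] - 0.2 * views[1] + 0.7 * views[2] + 1.5
cx, cy = fit_view_coefficients(views, novel)
print(np.allclose(predict_points(views, cx, cy), novel))  # True
```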

Relevance: 10.00%

Abstract:

We describe an approach to parallel compilation that seeks to harness the vast amount of fine-grain parallelism that is exposed through partial evaluation of numerically-intensive scientific programs. We have constructed a compiler for the Supercomputer Toolkit parallel processor that uses partial evaluation to break down data abstractions and program structure, producing huge basic blocks that contain large amounts of fine-grain parallelism. We show that this fine-grain parallelism can be effectively utilized even on coarse-grain parallel architectures by selectively grouping operations together so as to adjust the parallelism grain-size to match the inter-processor communication capabilities of the target architecture.
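
The grain-size adjustment mentioned in the last sentence can be illustrated with a toy greedy packer: operations, represented here only by cost estimates in a legal execution order with data dependences ignored, are accumulated into a task until its compute time amortizes the inter-processor communication cost. This is an illustrative sketch, not the Toolkit compiler's actual scheduling algorithm, and the 4x compute-to-communication ratio is an arbitrary assumption.

```python
def group_operations(op_costs, comm_cost, compute_to_comm_ratio=4.0):
    """Greedily pack fine-grain operations into coarse-grain tasks.

    op_costs  : estimated cycle counts of individual operations,
                listed in a legal execution order
    comm_cost : estimated cycles to ship a task's results between processors

    Operations are appended to the current task until its total compute
    time reaches `compute_to_comm_ratio` times the communication cost,
    so each task amortizes the cost of inter-processor transfers.
    """
    tasks, current, current_cost = [], [], 0.0
    threshold = compute_to_comm_ratio * comm_cost
    for op, cost in enumerate(op_costs):
        current.append(op)
        current_cost += cost
        if current_cost >= threshold:
            tasks.append(current)
            current, current_cost = [], 0.0
    if current:
        tasks.append(current)
    return tasks

# Hypothetical per-operation costs (cycles) and a 20-cycle transfer cost
print(group_operations([5, 8, 3, 20, 7, 50, 2, 9, 30, 4], comm_cost=20))
# -> [[0, 1, 2, 3, 4, 5], [6, 7, 8, 9]]
```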

Relevance: 10.00%

Abstract:

Under normal viewing conditions, humans find it easy to distinguish between objects made out of different materials such as plastic, metal, or paper. Untextured materials such as these have different surface reflectance properties, including lightness and gloss. With single isolated images and unknown illumination conditions, the task of estimating surface reflectance is highly underconstrained, because many combinations of reflection and illumination are consistent with a given image. In order to work out how humans estimate surface reflectance properties, we asked subjects to match the appearance of isolated spheres taken out of their original contexts. We found that subjects were able to perform the task accurately and reliably without contextual information to specify the illumination. The spheres were rendered under a variety of artificial illuminations, such as a single point light source, and a number of photographically-captured real-world illuminations from both indoor and outdoor scenes. Subjects performed more accurately for stimuli viewed under real-world patterns of illumination than under artificial illuminations, suggesting that subjects use stored assumptions about the regularities of real-world illuminations to solve the ill-posed problem.

Relevance: 10.00%

Abstract:

Data and procedures and the values they amass, Higher-order functions to combine and mix and match, Objects with their local state, the messages they pass, A property, a package, the control point for a catch - In the Lambda Order they are all first-class. One thing to name them all, one thing to define them, one thing to place them in environments and bind them, in the Lambda Order they are all first-class. Keywords: Scheme, Lisp, functional programming, computer languages.
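
The verse is about entities being first-class: they can be named, defined, passed around, and bound in environments. As a small illustration in Python (the memo's own setting is Scheme and Lisp; no Scheme source is given here), data, higher-order procedures and objects with local state can all be treated as first-class values:

```python
# Data, procedures, higher-order functions and stateful "objects" are all
# first-class: they can be named, passed, returned and stored.

def compose(f, g):                 # higher-order: takes and returns procedures
    return lambda x: f(g(x))

def make_counter():                # object with local state, built from a closure
    count = 0
    def dispatch(message):         # the message it accepts
        nonlocal count
        if message == "increment":
            count += 1
        return count
    return dispatch

inc_then_double = compose(lambda x: 2 * x, lambda x: x + 1)
counter = make_counter()
environment = {"f": inc_then_double, "counter": counter}   # bound in an environment
print(environment["f"](3))                  # 8
print(environment["counter"]("increment"))  # 1
```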

Relevance: 10.00%

Abstract:

Statistical shape and texture appearance models are powerful image representations, but previously had been restricted to 2D or simple 3D shapes. In this paper we present a novel 3D morphable model based on image-based rendering techniques, which can represent complex lighting conditions, structures, and surfaces. We describe how to construct a manifold of the multi-view appearance of an object class using light fields and show how to match a 2D image of an object to a point on this manifold. In turn we use the reconstructed light field to render novel views of the object. Our technique overcomes the limitations of polygon based appearance models and uses light fields that are acquired in real-time.
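
As a heavily simplified sketch of the appearance-manifold idea, the code below builds a linear basis over vectorized exemplar light fields and matches a new observation by projection onto that basis. The real model additionally handles multi-view correspondence, warping and rendering of novel views, none of which is shown here, and the variable names are illustrative rather than taken from the paper.

```python
import numpy as np

def build_appearance_basis(light_fields, n_components=5):
    """PCA basis over vectorized exemplar light fields (one row each)."""
    X = np.asarray(light_fields, float)
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]        # mean and principal directions

def match_to_manifold(observation, mean, basis):
    """Project a new vectorized observation onto the appearance manifold,
    returning its coefficients and the reconstruction on the manifold."""
    coeffs = basis @ (observation - mean)
    reconstruction = mean + basis.T @ coeffs
    return coeffs, reconstruction

# Hypothetical exemplars: 20 vectorized light fields of dimension 5000
rng = np.random.default_rng(0)
exemplars = rng.standard_normal((20, 5000))
mean, basis = build_appearance_basis(exemplars, n_components=5)
coeffs, recon = match_to_manifold(exemplars[0], mean, basis)
```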

Relevance: 10.00%

Abstract:

I have invented "Internet Fish," a novel class of resource-discovery tools designed to help users extract useful information from the Internet. Internet Fish (IFish) are semi-autonomous, persistent information brokers; users deploy individual IFish to gather and refine information related to a particular topic. An IFish will initiate research, continue to discover new sources of information, and keep tabs on new developments in that topic. As part of the information-gathering process the user interacts with his IFish to find out what it has learned, answer questions it has posed, and make suggestions for guidance. Internet Fish differ from other Internet resource discovery systems in that they are persistent, personal and dynamic. As part of the information-gathering process IFish conduct extended, long-term conversations with users as they explore. They incorporate deep structural knowledge of the organization and services of the net, and are also capable of on-the-fly reconfiguration, modification and expansion. Human users may dynamically change an IFish in response to changes in the environment, or the IFish may initiate such changes itself. IFish maintain internal state, including models of their own structure, behavior, information environment and user; these models permit an IFish to perform meta-level reasoning about its own structure. To facilitate rapid assembly of particular IFish I have created the Internet Fish Construction Kit. This system provides enabling technology for the entire class of Internet Fish tools; it facilitates both creation of new IFish as well as additions of new capabilities to existing ones. The Construction Kit includes a collection of encapsulated heuristic knowledge modules that may be combined in mix-and-match fashion to create a particular IFish; interfaces to new services written with the Construction Kit may be immediately added to "live" IFish. Using the Construction Kit I have created a demonstration IFish specialized for finding World-Wide Web documents related to a given group of documents. This "Finder" IFish includes heuristics that describe how to interact with the Web in general, explain how to take advantage of various public indexes and classification schemes, and provide a method for discovering similarity relationships among documents.
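
The mix-and-match combination of encapsulated heuristic modules described above can be pictured with a toy, entirely hypothetical registry sketch in Python; the actual Construction Kit's interfaces are not described in this abstract, so the class and field names below are illustrative only.

```python
# Hypothetical sketch of mix-and-match heuristic modules attached to a
# live agent; not the actual Internet Fish Construction Kit API.

class HeuristicModule:
    def __init__(self, name, applies_to, act):
        self.name = name
        self.applies_to = applies_to   # predicate over an information item
        self.act = act                 # produces follow-up items or questions

class IFish:
    def __init__(self, modules):
        self.modules = list(modules)   # modules can be added to a "live" fish
        self.findings = []

    def add_module(self, module):
        self.modules.append(module)

    def process(self, item):
        for module in self.modules:
            if module.applies_to(item):
                self.findings.extend(module.act(item))

follow_links = HeuristicModule(
    "follow-links",
    applies_to=lambda item: item.get("type") == "web-page",
    act=lambda item: [{"type": "url", "value": u} for u in item.get("links", [])],
)

fish = IFish([follow_links])
fish.process({"type": "web-page", "links": ["http://example.org/a"]})
print(fish.findings)
```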

Relevance: 10.00%

Abstract:

A quantitative analysis of the individual compounds in tobacco essential oils was performed by comprehensive two-dimensional gas chromatography (GC x GC) combined with a flame ionization detector (FID). A time-of-flight mass spectrometer (TOF/MS) was coupled to GC x GC for the identification of the resolved peaks. The response of the flame ionization detector to different compound classes was calibrated using multiple internal standards. In total, 172 compounds were identified with good matches and 61 compounds with high probability values were reliably quantified. For comparative purposes, the essential oil sample was also quantified by one-dimensional gas chromatography-mass spectrometry (GC/MS) with the multiple-internal-standards method. The results showed close agreement between the two analysis methods when the peak purity and match quality in one-dimensional GC/MS are high enough.
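
Internal-standard quantification of the kind described rests on simple ratio arithmetic: a relative response factor is calibrated per compound class, then applied to the analyte-to-internal-standard area ratio. The sketch below uses entirely hypothetical peak areas and concentrations; the paper's actual standards, compounds and values are not given in the abstract.

```python
def relative_response_factor(area_std, conc_std, area_is, conc_is):
    """Relative response factor of a compound class versus its internal
    standard, determined from a calibration run."""
    return (area_std / conc_std) / (area_is / conc_is)

def quantify(area_analyte, area_is, conc_is, rrf):
    """Internal-standard quantification: analyte concentration from its
    FID peak area, the internal standard's area and known concentration,
    and the calibrated relative response factor."""
    return (area_analyte / area_is) * conc_is / rrf

# Hypothetical numbers: calibration gives RRF ~ 1.1 for this class,
# and the internal standard is spiked at 50 ug/mL.
rrf = relative_response_factor(area_std=2.2e5, conc_std=100.0,
                               area_is=1.0e5, conc_is=50.0)
print(quantify(area_analyte=8.8e4, area_is=1.0e5, conc_is=50.0, rrf=rrf))
# -> 40.0 ug/mL
```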