922 results for High-frequency data


Relevance:

90.00%

Publisher:

Abstract:

Using approximate high-frequency asymptotic methods to solve the scalar wave equation, we obtain the eikonal equation and the transport equation. Solving the eikonal equation by the method of characteristics provides a mathematical derivation of the ray tracing equations, so the ray tracing system is fully based on approximate high-frequency asymptotics. If the eikonal is complex, or more precisely, real on the rays and complex off the rays, we can derive the Gaussian beam. This article concentrates mainly on the theory of Gaussian beams. Compared with classical ray tracing theory, the Gaussian beam method (GBM) has several advantages. First, rays are no longer required to terminate exactly at the receiver positions, so time-consuming two-point ray tracing can be avoided. Second, the GBM yields stable results in regions of the wavefield where standard ray theory fails (e.g., caustics, shadow zones and the critical distance). Third, unlike seismograms computed by conventional ray tracing techniques, GBM synthetic data are less influenced by minor details in the model representation. Here, I implement the kinematic and dynamic ray tracing systems and, based on them, the GBM, and I give some numerical examples. These examples show the importance and feasibility of the ray tracing system. In addition, I have studied the reflection coefficient of an inhomogeneous S-type electromagnetic wave at the interface between conductive media. Based on the difference between the directions of the phase-shift constant and the attenuation constant when an electromagnetic wave propagates in a conductive medium, and using the boundary conditions of the electromagnetic field at the interface, we derive the reflection coefficient of the inhomogeneous S-type wave and plot its curves. The curves show that quasi-total reflection occurs when the wave is incident from the medium with greater conductivity onto the medium with smaller conductivity. There are two peak values, at the critical angles of the phase-shift constant and the attenuation constant, and the reflection coefficient is smaller than 1. This conclusion clearly differs from the total reflection of light.
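The kinematic ray tracing system referred to above follows from the characteristics of the eikonal equation |grad T|^2 = 1/v^2, which give dx/dt = v^2 p and dp/dt = -grad(v)/v with traveltime t as the parameter and p the slowness vector. As a rough illustration only, not the implementation described in this abstract, the sketch below integrates these equations in a hypothetical 2-D constant-gradient model v(z) = 2.0 + 0.5 z.

```python
import numpy as np

def kinematic_ray_2d(x0, z0, take_off_deg, v, grad_v, dt=1e-3, t_max=4.0):
    """Integrate dx/dt = v^2 p, dp/dt = -grad(v)/v (traveltime parameterization).

    v(x, z) returns velocity; grad_v(x, z) returns (dv/dx, dv/dz).
    A minimal forward-Euler sketch; real codes use RK4 and proper stopping tests.
    """
    theta = np.deg2rad(take_off_deg)
    x = np.array([x0, z0], dtype=float)
    p = np.array([np.sin(theta), np.cos(theta)]) / v(*x)   # slowness vector, |p| = 1/v
    path = [x.copy()]
    for _ in range(int(t_max / dt)):
        vi = v(*x)
        gx, gz = grad_v(*x)
        x = x + dt * vi**2 * p
        p = p - dt * np.array([gx, gz]) / vi
        path.append(x.copy())
        if x[1] < 0.0:               # ray has curved back to the surface
            break
    return np.array(path)

# Hypothetical constant-gradient medium v(z) = 2.0 + 0.5 z (km/s, km)
v = lambda x, z: 2.0 + 0.5 * z
grad_v = lambda x, z: (0.0, 0.5)
ray = kinematic_ray_2d(0.0, 0.0, 45.0, v, grad_v)
print(ray[-1])   # the ray turns and returns to the surface near x ~ 8 km here
```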

Relevance:

90.00%

Publisher:

Abstract:

Reflectivity sequence extraction is a key part of impedance inversion in seismic exploration. Although many valid inversion methods exist, with crosswell seismic data the frequency band of the data cannot be broadened enough to satisfy practical needs, and this remains an urgent problem. Prestack depth migration, which has developed rapidly in recent years, has become increasingly robust in exploration; it is a powerful imaging technology for geological targets with complex structure, and its final result is a reflectivity image. Based on reflectivity imaging of crosswell seismic data and the wave equation, this paper completes the following work. A blind deconvolution workflow is completed in which the Cauchy criterion is used to regularize the inversion (sparse inversion). A preconditioned conjugate gradient (PCG) method based on Krylov subspaces is combined with it to reduce the computation and improve the speed, and the system matrix no longer needs to be positive definite and symmetric. This method is applied to high-frequency recovery of crosswell seismic sections, with satisfactory results. Rotation transforms and the Viterbi algorithm are applied in the preprocessing for wave equation prestack depth migration. Wave equation prestack depth migration requires the seismic dataset to lie on a regular grid, but because of complex terrain and folding the acquisition geometry is sometimes irregular; at the same time, to avoid aliasing caused by sparse sampling along the line direction, interpolation between traces is needed. In this paper, I use a rotation transform to make the line direction parallel to the coordinate axis and the Viterbi algorithm to pick events automatically, with satisfactory results. Imaging is a key part of prestack depth migration besides wavefield extrapolation: however accurate the extrapolation operator is, the imaging condition can greatly influence the final reflectivity image. The author migrates the Marmousi model under different imaging conditions and analyzes the methods according to the results. The computations show that the imaging condition that stabilizes the source wavefield and the least-squares-estimation imaging condition proposed in this paper are better than the conventional correlation imaging condition. The traditional pattern of "distributed computing and mass decision" is widely adopted in seismic data processing and is becoming an obstacle to improving the level of enterprise management. At the end of this paper, a systematic solution is therefore put forward that employs the mode of "distributed computing - centralized storage - instant release", based on a combination of the C/S and B/S release models. The architecture of the solution, the corresponding web technology and the client software are introduced, and the application shows the validity of this scheme.
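As a rough sketch of the Cauchy-regularized sparse inversion solved with conjugate gradients described above (not the dissertation's code), the example below estimates a sparse reflectivity from a single trace by iteratively reweighted least squares: the Cauchy criterion enters as a diagonal reweighting term, and each linearized system is solved with a conjugate gradient solver. The Ricker wavelet, noise level and the hyperparameters lam and sigma are assumptions.

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.sparse.linalg import cg

def ricker(f=30.0, dt=0.002, n=51):
    t = (np.arange(n) - n // 2) * dt
    return (1 - 2 * (np.pi * f * t) ** 2) * np.exp(-(np.pi * f * t) ** 2)

def cauchy_sparse_decon(d, w, lam=0.01, sigma=0.05, n_irls=10):
    """Minimize ||W r - d||^2 + lam * sum log(1 + r^2/sigma^2) via IRLS + CG."""
    n = len(d)
    col = np.zeros(n); col[:len(w)] = w
    W = toeplitz(col, np.zeros(n))          # convolution matrix (wavelet as causal)
    r = np.zeros(n)
    for _ in range(n_irls):
        q = lam / (sigma**2 + r**2)         # Cauchy reweighting (diagonal term)
        A = W.T @ W + np.diag(q)            # symmetric positive definite system
        r, _ = cg(A, W.T @ d, x0=r, maxiter=200)
    return r

# Synthetic example with an assumed sparse reflectivity
rng = np.random.default_rng(0)
r_true = np.zeros(200); r_true[[40, 90, 150]] = [1.0, -0.7, 0.5]
w = ricker()
d = np.convolve(r_true, w)[:200] + 0.02 * rng.standard_normal(200)
r_est = cauchy_sparse_decon(d, w)
print("largest estimated spikes at samples:", np.sort(np.argsort(np.abs(r_est))[-3:]))
```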

Relevance:

90.00%

Publisher:

Abstract:

Crosswell seismic is a relatively new geophysical method in which both the source and the receivers are placed in wells to observe the seismic wavefield of the geological body between them. Because the raypaths avoid the low-velocity weathered layer, absorption of the high-frequency components of the seismic signal is avoided, so an extremely high-resolution signal can be acquired and a very fine image of the interwell formations, structure and reservoirs can be achieved. An integrated study of the high-frequency S-wave and P-wave data, together with other data, is used to determine small faults and small structures and to resolve issues concerning thin beds, reservoir connectivity, fluid distribution, steam injection and fractures. The method links high-resolution surface seismic, well logging and reservoir engineering. In this paper, based on the exploration and production situation in the oilfield and the theory of geophysical exploration, crosswell seismic technology is studied in general and its key problems in particular. A technological series of integrated field acquisition, data processing and interpretation was developed, together with integrated application research, so that this new method can be applied to oilfield development and to optimizing development schemes. The contents and results of this paper are as follows. An overview is given of the status and development of the crosswell seismic method, the problems it faces, and the differences between crosswell seismic technology in China and abroad. Foreign-made field acquisition systems for crosswell seismic are analyzed and compared, and the pros and cons of the systems manufactured by the two foreign companies are pointed out, which is highly valuable for importing crosswell seismic acquisition systems into China. After analyzing the geometry design and field data of the crosswell seismic method, a common wavefield time-depth curve equation is derived, three types of tube waves are identified for the first time, and the mechanism of their generation is studied. Based on wavefield separation theory for crosswell seismic, we believe that different wave types have different attribute characteristics in different gather domains; multiple methods (for instance, F-K filtering and median filtering) are applied to eliminate and suppress crosswell noise, and the upgoing and downgoing waves are successfully separated with satisfactory results. In the area of wavefield numerical simulation for crosswell seismic, conventional ray tracing methods and their shortcomings are analyzed, and a minimum traveltime ray tracing method based on Fermat's principle is proposed. This method is not only fast, but also avoids rays running into a "dead end" or "blind spot" after numerous iterations, and it is better suited to complex velocity models.
For the first time, traveltime interpolation is brought into this framework, and a dynamic shortest-path ray tracing method is developed for the first arrivals of arbitrarily complex media, including transmissions, diffractions and refractions. It removes the limitation of traveling only from one node to another, increases the accuracy of the minimum traveltime and of the ray path, and derives a solution and the corresponding boundary conditions for the fourth-order differential sonic wave equation. The final step is to compute crosswell seismic synthetics for given sources and receivers over multiple geological bodies. In this way, the real crosswell wavefield can be recognized by scientific means, providing an important foundation for guiding crosswell field geometry design. A least-squares conjugate gradient velocity tomographic inversion was developed for crosswell seismic, the objective function of the older high-frequency ray tracing method was modified, and a thin-bed-oriented finite-frequency velocity tomographic inversion method is put forward. Theoretical models and results demonstrate that the method is simple and effective and is very important for seismic ray tomographic imaging of complex geological bodies. Based on the characteristics of crosswell seismic data, a processing flow was built, optimized and applied in production, yielding good velocity tomographic inversion and crosswell reflection imaging sections. Crosswell seismic data are acquired in the depth domain, and how to interpret depth-domain data and retrieve attributes is a brand new subject. After studying synthetics and trace integration in the depth domain for crosswell seismic interpretation, research was first conducted on log-constrained wave impedance inversion of crosswell data, and a crosswell interpretation flow was initially set up. Applying it to crosswell data produced good geological results in velocity tomographic inversion and reflection depth imaging and resolved many difficult problems in oilfield development. This powerful new method helps optimize oilfield development schemes and increase EOR. Based on conventional reservoir geological model building from logging data, a new method is also discussed that constrains the accuracy of the reservoir geological model with high-resolution crosswell seismic data; it has been applied to the Fan 124 project with good results, which indicates a bright future for crosswell seismic technology.
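The minimum traveltime and shortest-path ray tracing ideas above are commonly illustrated with a graph search on a gridded velocity model: nodes are cells, edge weights are local traveltimes, and Dijkstra's algorithm returns first-arrival times from a source. The sketch below is such a generic illustration, not the dissertation's algorithm; the two-layer interwell model and the 8-connected stencil are assumptions.

```python
import heapq
import numpy as np

def first_arrivals(vel, src, h=10.0):
    """First-arrival traveltimes on a 2-D grid via Dijkstra shortest paths.

    vel : 2-D array of velocities (m/s), h : grid spacing (m),
    src : (iz, ix) source cell. 8-connected neighbours only, a coarse
    approximation; production codes use denser stencils or fast marching.
    """
    nz, nx = vel.shape
    t = np.full((nz, nx), np.inf)
    t[src] = 0.0
    heap = [(0.0, src)]
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    while heap:
        ti, (iz, ix) = heapq.heappop(heap)
        if ti > t[iz, ix]:
            continue
        for dz, dx in nbrs:
            jz, jx = iz + dz, ix + dx
            if 0 <= jz < nz and 0 <= jx < nx:
                dist = h * np.hypot(dz, dx)
                slow = 0.5 * (1.0 / vel[iz, ix] + 1.0 / vel[jz, jx])
                tj = ti + dist * slow
                if tj < t[jz, jx]:
                    t[jz, jx] = tj
                    heapq.heappush(heap, (tj, (jz, jx)))
    return t

# Hypothetical two-layer interwell model: 2000 m/s over 3500 m/s
vel = np.full((100, 60), 2000.0); vel[50:, :] = 3500.0
times = first_arrivals(vel, src=(10, 0))
print("traveltime at the far well, same depth: %.3f s" % times[10, 59])
```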

Relevance:

90.00%

Publisher:

Abstract:

The dissertation addresses signal reconstruction and data restoration in seismic data processing, taking signal representation methods as the main thread and seismic information reconstruction (signal separation and trace interpolation) as the core. For signal representation on natural bases, I present the fundamentals and algorithms of independent component analysis (ICA) and its original applications to the separation of natural earthquake signals and exploration seismic signals. For representation on deterministic bases, the dissertation proposes least-squares inversion regularization methods for seismic data reconstruction, sparseness constraints and preconditioned conjugate gradient methods, and their applications to seismic deconvolution, Radon transforms and related problems. The core content is a de-aliasing reconstruction algorithm for unevenly sampled seismic data and its application to seismic interpolation. Although the dissertation discusses two cases of signal representation, they can be integrated into one framework, because both deal with signal or information restoration: the former reconstructs original signals from mixed signals, the latter reconstructs complete data from sparse or irregular data. Their common goal is to provide pre-processing and post-processing methods for seismic prestack depth migration. ICA can separate original signals from their mixtures, or extract the basic structure of the analyzed data. I survey the fundamentals, algorithms and applications of ICA and, in comparison with the KL transform, propose the concept of an independent component transform (ICT). Based on the negentropy measure of independence, I implement FastICA and improve it using the covariance matrix. By analyzing the characteristics of seismic signals, I introduce ICA into seismic signal processing for the first time in the geophysical community and implement the separation of noise from seismic signals. Synthetic and real data examples show that ICA is usable for seismic signal processing and that initial results are promising. ICA is applied to separating earthquake converted waves from multiples in a sedimentary area, with good results, so a more reasonable interpretation of subsurface discontinuities is obtained. The results show the prospects of applying ICA to geophysical signal processing. Using the relationship between ICA and blind deconvolution, I survey seismic blind deconvolution and discuss the prospects of applying ICA to it, with two possible solutions. The relationship between PCA, ICA and the wavelet transform is clarified, and it is proved that the reconstruction of wavelet prototype functions is a Lie group representation. In addition, an over-sampled wavelet transform is proposed to enhance seismic data resolution, which is validated by numerical examples. The key to prestack depth migration is the regularization of prestack seismic data, for which seismic interpolation and missing-data reconstruction are necessary. I first review seismic imaging methods in order to argue the critical effect of regularization, and a review of seismic interpolation algorithms shows that de-aliased reconstruction of uneven data is still a challenge. The fundamentals of seismic reconstruction are discussed first; then sparseness-constrained least-squares inversion and a preconditioned conjugate gradient solver are studied and implemented.
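As a generic illustration of ICA separating mixed records (not the improved FastICA described above), the sketch below unmixes two synthetic traces with scikit-learn's FastICA, which uses a negentropy-based contrast function; the mixing matrix and the synthetic signal/noise pair are assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)
s1 = np.sign(np.sin(2 * np.pi * 8 * t))           # "signal": square-wave-like series
s2 = 0.5 * rng.standard_normal(t.size)            # "noise" record (assumed)
S = np.c_[s1, s2]

A = np.array([[1.0, 0.6], [0.4, 1.0]])            # assumed mixing matrix
X = S @ A.T                                       # two observed mixed traces

ica = FastICA(n_components=2, random_state=0)     # negentropy-based FastICA
S_est = ica.fit_transform(X)                      # estimated independent components
best = max(abs(np.corrcoef(S_est[:, k], s1)[0, 1]) for k in range(2))
print("best correlation of an estimated component with s1: %.2f" % best)
```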
Choosing a constraint term with a Cauchy distribution, I program the PCG algorithm and implement sparse seismic deconvolution and high-resolution Radon transforms by PCG, in preparation for seismic data reconstruction. For seismic interpolation, de-aliased interpolation of evenly sampled data and reconstruction of unevenly sampled data each work well separately, but they could not previously be combined. In this paper, a novel Fourier-transform-based method and algorithm are proposed that can reconstruct seismic data that are both uneven and aliased. I formulate band-limited data reconstruction as a minimum-norm least-squares inversion problem in which an adaptive DFT-weighted norm regularization term is used. The inverse problem is solved by a preconditioned conjugate gradient method, which makes the solutions stable and quickly convergent. Based on the assumption that seismic data consist of a finite number of linear events, and following the sampling theorem, aliased events can be attenuated via least-squares weights predicted linearly from the low frequencies. Three application issues are discussed: interpolation of regular gaps, filling of irregular gaps, and reconstruction of high-frequency traces from low-frequency data constrained by a few high-frequency traces. Both synthetic and real data examples show that the proposed method is valid, efficient and applicable. The research is valuable for seismic data regularization and crosswell seismics. To meet the data requirements of 3-D shot-profile depth migration, schemes must be adopted to make the data regular and consistent with the velocity dataset. The methods of this paper are used to interpolate and extrapolate the shot gathers instead of simply inserting zero traces, so the migration aperture is enlarged and the migration result is improved. The results show the effectiveness and practicability of the approach.
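As a much simplified cousin of the DFT-weighted, PCG-solved reconstruction described above (fixed band limit, plain damping, no adaptive weights or anti-alias prediction), the sketch below fills the gaps of an irregularly sampled record by band-limited least squares, with conjugate gradients applied to the normal equations. The harmonic test signal and the band limit k_max are assumptions.

```python
import numpy as np
from scipy.sparse.linalg import cg

def bandlimited_reconstruct(d, known, n, k_max, damp=1e-3):
    """Minimum-norm band-limited reconstruction of irregularly sampled data.

    d: observed samples at grid indices `known` on a length-n regular grid;
    k_max: highest Fourier mode retained.
    """
    ks = np.arange(1, k_max + 1)

    def basis(idx):
        ang = 2 * np.pi * np.outer(idx, ks) / n
        return np.hstack([np.ones((len(idx), 1)), np.cos(ang), np.sin(ang)])

    A = basis(np.asarray(known))                 # sampling of the band-limited basis
    M = A.T @ A + damp * np.eye(A.shape[1])      # SPD normal-equation matrix
    x, _ = cg(M, A.T @ d)                        # conjugate gradient solve
    return basis(np.arange(n)) @ x               # evaluate on the full regular grid

# Assumed test: harmonic "events" with half the traces missing
rng = np.random.default_rng(1)
n = 128
i = np.arange(n)
full = np.cos(2 * np.pi * 5 * i / n) + 0.5 * np.sin(2 * np.pi * 9 * i / n)
known = np.sort(rng.choice(n, size=n // 2, replace=False))
recon = bandlimited_reconstruct(full[known], known, n, k_max=15)
print("max reconstruction error: %.3f" % np.max(np.abs(recon - full)))
```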

Relevance:

90.00%

Publisher:

Abstract:

This thesis mainly concerns the wavelet transform and the frequency-division method. It describes frequency-division processing of prestack and post-stack seismic data and its application to inversion-based noise attenuation, frequency-division residual static corrections, and the use of high-resolution data in reservoir inversion. The thesis not only describes frequency division and inversion in theory, but also verifies them with model calculations; all the methods are integrated together, and processing of real data demonstrates the results. The thesis analyzes, from wavelet transform theory, the differences and limitations of t-x and f-x prediction-filter noise attenuation. It argues that, because noise and signal differ in phase, amplitude and frequency, they can be separated by frequency-division processing based on the wavelet transform. Comparison with f-x coherent noise removal confirms the effectiveness and practicability of frequency division for isolating coherent and random noise. To avoid side effects in noise-free areas, an area-constraint approach is adopted and frequency-division processing is applied only in the noisy areas, which solves the problem of low-frequency loss in noise-free zones. Residual moveout differences in seismic data processing strongly affect stacked images and resolution, and different frequency components have different residual moveout. Frequency-division residual static correction performs the frequency division and computes residual statics per band; it thus handles frequency-dependent residual statics and protects the high-frequency information in the data. Processing of real data yields good results in removing residual phase moveout from prestack data, improving stack image quality and improving resolution. The thesis also analyzes the characteristics of random noise and its description in the time and frequency domains, gives an inversion-based prediction solution, and implements frequency-division inversion attenuation of random noise; analysis of real data processing shows that noise removal by inversion has its own advantages. By analyzing resolution-related parameters and high-resolution processing technology, the thesis describes the relation between the frequency domain and resolution, the parameters that control resolution, and methods to increase it. It also gives processing flows for high-resolution data and discusses the effect of high-resolution data on reservoir inversion, and finally verifies the accuracy and precision of the reservoir inversion results. The research shows that frequency-division noise attenuation, frequency-division residual statics and inversion-based noise attenuation are effective methods for increasing the signal-to-noise ratio and resolution of seismic data.
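As a generic sketch of frequency-division processing (not the thesis's workflow), the example below splits a trace into wavelet frequency bands with the PyWavelets package and zeroes the bands regarded as noise-dominated; the db4 wavelet, the decomposition level and the choice of bands to keep are assumptions.

```python
import numpy as np
import pywt

def frequency_division_denoise(trace, wavelet="db4", level=4, keep=(2, 3, 4)):
    """Split a trace into wavelet frequency bands and keep only selected ones.

    Detail levels 1..level run from the highest to the lowest frequency band;
    the approximation holds the lowest frequencies and is always kept. Bands
    not listed in `keep` are zeroed, a crude stand-in for band-wise denoising.
    """
    coeffs = pywt.wavedec(trace, wavelet, level=level)
    # coeffs = [approx, detail_level, ..., detail_1]
    for i in range(1, len(coeffs)):
        detail_level = level - i + 1
        if detail_level not in keep:
            coeffs[i] = np.zeros_like(coeffs[i])
    return pywt.waverec(coeffs, wavelet)[: len(trace)]

# Assumed example: low-frequency signal plus high-frequency noise
rng = np.random.default_rng(0)
t = np.arange(1024)
trace = np.sin(2 * np.pi * t / 64) + 0.4 * rng.standard_normal(t.size)
clean = frequency_division_denoise(trace, keep=(3, 4))   # drop the two highest bands
print("rms before %.2f, after %.2f" % (trace.std(), clean.std()))
```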

Relevance:

90.00%

Publisher:

Abstract:

In this study, 260 mollusk fossil samples from a Red Clay sequence at Xifeng, Gansu Province, in northern China were analyzed quantitatively. Twelve fossil species and four fossil zones were identified, and three main ecological groups were determined from the ecological requirements of each mollusk taxon. Based on the fossil composition and the succession of the three ecological groups, the author discusses the origin and sedimentary environment of the red clay deposits, the ecological and environmental changes, and the variations of the East Asian monsoons during 6.2-2.4 Ma on the Loess Plateau. A preliminary study of the periodicity of paleoclimatic change was also conducted using spectral analysis. The main results and conclusions are as follows. A continuous land mollusk fossil sequence spanning 6.2-2.4 Ma was established for the Xifeng Red Clay Formation, providing basic data for studying environmental changes from the late Miocene to the Pliocene. The composition and preservation of the mollusk fossils reveal a terrestrial, in situ ecological population in the Red Clay Formation; all identifiable species are terrestrial taxa, supporting the view that the Red Clay is of eolian origin, similar to the overlying Quaternary loess deposits. The mollusk record reveals the ecological and environmental changes during 6.2-2.4 Ma on the Loess Plateau: climate was cold and dry from 6.2-5.4 Ma, warm and wet during 5.4-4.5 Ma, mild and moderate from 4.5-3.4 Ma, and cooled and dried rapidly after 3.4 Ma. From 5.4-2.4 Ma, climate cooled stepwise, a trend in good agreement with the global cooling documented by marine oxygen isotope records over this period. Three marked ecological shifts took place in the mollusk assemblages between 6.2 and 2.4 Ma, centered at about 5.4, 4.5 and 3.4 Ma. The warming shift around 5.4 Ma was probably related to a rise in global temperature, whereas the cooling shifts around 4.5 and 3.4 Ma might be closely linked to the uplift of the Tibetan Plateau and the development of the Northern Hemisphere ice sheets. The succession of mollusk ecological groups also records the variability of the East Asian winter and summer monsoons: the winter monsoon dominated from 6.2-5.4 Ma and from 3.4-2.4 Ma, while the summer monsoon was strong during 5.4-4.5 Ma, and the winter and summer monsoons varied in phase during 4.5-3.4 Ma. Monsoon regimes changed with a duration of about 1 Ma, roughly corresponding to cycles driven by tectonic activity on time scales of 10^6-10^7 years; in addition, the mollusk fossils record large-amplitude, high-frequency fluctuations superimposed on these 10^5-10^7 year climate cycles. Maximum entropy spectral analysis and filter-band analysis of the total mollusk individuals and the three typical ecological groups suggest that climate changes, controlled mainly by solar insolation, had periods of about 70 ka and 40 ka on the 10^5 year time scale during the late Miocene-Pliocene. Climatic periodicity intensified from 4.0 Ma, reflecting strengthened forcing by high-latitude ice volume.
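The spectral analysis mentioned above can be illustrated, for an unevenly sampled proxy series, with a Lomb-Scargle periodogram; this is only a stand-in for the maximum entropy and filter-band analyses used in the study, and the toy series below (40 ka and 70 ka cycles plus noise) is an assumption, not the Xifeng record.

```python
import numpy as np
from scipy.signal import lombscargle

# Assumed toy proxy series sampled at uneven ages between 6.2 and 2.4 Ma
rng = np.random.default_rng(0)
ages_ka = np.sort(rng.uniform(2400, 6200, size=400))          # ages in ka
proxy = (np.sin(2 * np.pi * ages_ka / 40.0)
         + 0.8 * np.sin(2 * np.pi * ages_ka / 70.0)
         + 0.5 * rng.standard_normal(ages_ka.size))

periods_ka = np.linspace(20, 150, 500)                        # trial periods
ang_freqs = 2 * np.pi / periods_ka                            # rad per ka
power = lombscargle(ages_ka, proxy - proxy.mean(), ang_freqs, normalize=True)

for p in periods_ka[np.argsort(power)[-2:]]:
    print("strong spectral peak near %.0f ka" % p)
```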

Relevance:

90.00%

Publisher:

Abstract:

With the development of oil and gas exploration, continental exploration in China has shifted from structural oil and gas reservoirs to subtle oil and gas reservoirs. The reserves found in subtle reservoirs already account for more than 60 percent of the discovered oil and gas reserves, so exploration for subtle reservoirs is becoming more and more important and can be taken as the main direction for increasing oil and gas reserves. The characteristics of continental sedimentary facies determine the complexity of lithological exploration. Most of the continental rift basins in East China have entered exploration stages of medium to high maturity. Although the quality of the seismic data is relatively good, these areas feature thin sands, small faults and strata of limited extent, which demand seismic data of high resolution; improving the signal-to-noise ratio of the high-frequency components of the seismic data is therefore an important task. In West China there are complex landforms, deeply buried exploration targets, complex geological structures, many faults, traps of small extent, poor rock properties, many high-pressure formations and drilling difficulties, which lead to low signal-to-noise ratios and complex noise in the seismic records; methods and techniques for noise attenuation in data acquisition and processing therefore need to be developed. Oil and gas exploration thus needs high-resolution geophysical techniques to implement the oil resources strategy of keeping production and reserves stable in East China while increasing production and reserves in West China. A high signal-to-noise ratio of the seismic data is the basis: high resolution and high fidelity are impossible without it. We emphasize research based on structural analysis for improving the signal-to-noise ratio in complex areas, and several noise attenuation methods are put forward that truly reflect the geological features, preserve the edges of geological structures and improve the identification of oil and gas traps. The ideas of emphasizing fundamentals, highlighting innovation and paying attention to application run through the paper. Dip scanning centered on the scanned point inevitably blurs the edges of geological features, such as faults and fractures; we develop a new dip-scanning scheme that scans one-sided windows on either side of the point to solve this problem. Using this scheme, we put forward coherence-based signal estimation, coherence-based characterization of the seismic waveform, and most-homogeneous dip scanning for noise attenuation. They preserve the geological character, suppress random noise and improve the signal-to-noise ratio and resolution. Routine dip scanning works in the time-space domain; a new dip-scanning method in the frequency-wavenumber domain is also put forward for noise attenuation. It exploits the separation of reflections with different dips in the f-k domain, and can reduce noise while recovering dip information. We also describe a methodology for studying and developing filtering methods based on differential equations.
It transforms the filtering equations from the frequency domain or the f-k domain into the time or time-space domain and uses a finite-difference algorithm to solve them. This method does not require the seismic data to be stationary, so the filter parameters can vary at every temporal and spatial point, which enhances the adaptability of the filter, and it is computationally efficient. We put forward a matching pursuit method for noise suppression: it decomposes a signal into a linear expansion of waveforms selected from a redundant dictionary of functions, chosen to best match the signal structures, so the effective signal can be extracted from the noisy record and the noise reduced. We also introduce a beamforming filtering method for noise elimination; real seismic data processing shows that it is effective in attenuating multiples, including internal multiples, improving the signal-to-noise ratio and resolution while preserving the fidelity of the effective signals. Calculations on theoretical models and applications to real seismic data prove that the methods in this paper effectively suppress random noise, eliminate coherent noise and improve the resolution of seismic data; they are highly practical and their effects are obvious.
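The matching pursuit method mentioned above expands a record greedily over a redundant dictionary, subtracting the best-matching atom at each step. The sketch below is a generic illustration, not the thesis's implementation; the Gabor-like dictionary, the synthetic event and the iteration count are assumptions.

```python
import numpy as np

def gabor_dictionary(n, widths=(8, 16, 32), freqs=np.linspace(0.02, 0.4, 12)):
    """Build a redundant dictionary of unit-norm, windowed cosine atoms."""
    t = np.arange(n)
    atoms = []
    for w in widths:
        for c in range(0, n, w // 2):                     # shifted centers
            for f in freqs:
                g = np.exp(-0.5 * ((t - c) / w) ** 2) * np.cos(2 * np.pi * f * (t - c))
                atoms.append(g / (np.linalg.norm(g) + 1e-12))
    return np.array(atoms)

def matching_pursuit(signal, atoms, n_iter=30):
    """Greedy matching pursuit: repeatedly subtract the best-matching atom."""
    residual = signal.astype(float).copy()
    approx = np.zeros_like(residual)
    for _ in range(n_iter):
        corr = atoms @ residual                           # inner products
        k = np.argmax(np.abs(corr))
        approx += corr[k] * atoms[k]
        residual -= corr[k] * atoms[k]
    return approx, residual

# Assumed test: a wavelet-like event buried in random noise
rng = np.random.default_rng(0)
n = 256
t = np.arange(n)
event = np.exp(-0.5 * ((t - 128) / 20.0) ** 2) * np.cos(2 * np.pi * 0.1 * (t - 128))
noisy = event + 0.3 * rng.standard_normal(n)
D = gabor_dictionary(n)
est, res = matching_pursuit(noisy, D, n_iter=10)
print("residual energy fraction: %.2f" % (np.sum(res**2) / np.sum(noisy**2)))
```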

Relevance:

90.00%

Publisher:

Abstract:

As part of national road No. 318, the Sichuan-Tibet (Chengdu-Lhasa) Highway is one of the traffic lifelines connecting the Tibet Autonomous Region to inland China and is very important to Tibet's economic development. It is also an important national defense route, with an extremely important strategic position in maintaining the stability and unity of Tibet and consolidating national defense. Particular geological, topographic and hydrometeorological conditions cause large-scale debris flows, landslides (including slumps) and similar geological hazards to occur frequently along the highway. These high-frequency geological hazards not only cause heavy casualties and property losses, but also repeatedly block traffic, seriously obstructing the Sichuan-Tibet highway. On the basis of extensive engineering geological investigation and analysis of previous studies, it is found that one of the dominant reasons why landslides and debris flows recur at a given place is the abundant loose material accumulated in the valleys and on the slopes along the highway. Taking the landslides and debris flows along the Ranwu-Lulang section of the Sichuan-Tibet highway as study objects, this paper analyzes in detail the sources and formation of the loose accumulated material in the study area, and systematically investigates the major hazard-inducing conditions, the hazard, the dynamic risk, the prediction of the susceptibility of landslides and debris flows, and the relations between landslides and debris flows and the various hazard-inducing conditions. All of this provides a scientific foundation for future highway renovation and for reducing and preventing geological hazards. To analyze landslide and debris flow hazards quantitatively, the concepts of entropy and information entropy are extended, the concept of geological hazard entropy is put forward, and a corresponding mathematical model is built. Additionally, a new approach to dynamic risk analysis of landslides and debris flows is put forward, based on the dynamic characteristics of the hazard of the hazard-inducing factors and the vulnerability of the hazard-bearing bodies. The formation of landslides and debris flows is a nonlinear process affected jointly by many factors, and its mechanics is extremely complex. To address this, a multi-factor classification and overlay technique is put forward on the basis of the engineering geomechanics meta-synthesis (EGMS) approach, and a corresponding mathematical model is built to predict the susceptibility of landslides and debris flows; example analyses prove the validity of this approach. To study whether the formation and spatial distribution of landslides and debris flows are controlled by one or several hazard-inducing conditions, thematic maps of landslide and debris flow hazard and of the various hazard-inducing conditions are overlaid to determine the relationships between them; on this basis, a semi-quantitative engineering zonation of the study area is carried out.
In addition, an overlay analysis method for the hazard-inducing conditions of landslides and debris flows based on a "digital graphics system" is advanced, to organize and manage the spatial and attribute data of the hazard and hazard-inducing-condition thematic maps in an orderly and effective way, and to realize the effective combination of graphics, images and figures.
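The entropy-based, multi-factor overlay ideas above can be loosely illustrated with the standard entropy-weight method: factor layers that vary more across the area receive larger weights in a weighted overlay index. The sketch below is generic, not the paper's geological hazard entropy model, and the three factor layers (slope, rainfall, loose-material thickness) and their ranges are hypothetical.

```python
import numpy as np

def entropy_weights(scores):
    """Entropy-weight method on non-negative factor scores, shape (n_cells, n_factors):
    factors with more spatial variability receive larger weights."""
    p = scores / (scores.sum(axis=0, keepdims=True) + 1e-12)
    n = scores.shape[0]
    entropy = -(p * np.log(p + 1e-12)).sum(axis=0) / np.log(n)
    d = 1.0 - entropy                       # divergence degree of each factor
    return d / d.sum()

# Hypothetical factor layers per grid cell: slope (deg), annual rainfall (mm),
# thickness of loose accumulated material (m)
rng = np.random.default_rng(0)
cells = rng.uniform([10, 400, 0.5], [45, 900, 8.0], size=(1000, 3))

norm = (cells - cells.min(axis=0)) / np.ptp(cells, axis=0)   # min-max normalize layers
w = entropy_weights(norm)
idx = norm @ w                              # weighted overlay susceptibility index (0-1)
print("factor weights:", np.round(w, 3))
print("share of cells with susceptibility index > 0.7: %.2f" % np.mean(idx > 0.7))
```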

Relevance:

90.00%

Publisher:

Abstract:

Chaplin, W. J.; Dumbill, A. M.; Elsworth, Y.; Isaak, G. R.; McLeod, C. P.; Miller, B. A.; New, R.; Pintér, B., Studies of the solar mean magnetic field with the Birmingham Solar-Oscillations Network (BiSON), Monthly Notices of the Royal Astronomical Society, Volume 343, Issue 3, pp. 813-818. RAE2008

Relevance:

90.00%

Publisher:

Abstract:

Current low-level networking abstractions on modern operating systems are commonly implemented in the kernel to provide sufficient performance for general-purpose applications. However, it is desirable for high-performance applications to have more control over the networking subsystem to support optimizations for their specific needs. One approach is to allow networking services to be implemented at user level. Unfortunately, this typically incurs costs due to scheduling overheads and unnecessary data copying via the kernel. In this paper, we describe a method to implement efficient application-specific network service extensions at user level that removes the cost of scheduling and provides protected access to lower-level system abstractions. We present a networking implementation that, with minor modifications to the Linux kernel, passes data between "sandboxed" extensions and the Ethernet device without copying or processing in the kernel. Using this mechanism, we put a customizable networking stack into a user-level sandbox and show how it can be used to efficiently process and forward data via proxies, or intermediate hosts, in the communication path of high-performance data streams. Unlike other user-level networking implementations, our method requires no special hardware to avoid unnecessary data copies. Results show that we achieve a substantial increase in throughput over comparable user-space methods using our networking stack implementation.
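The kernel-bypass, zero-copy mechanism described above cannot be reproduced in a few portable lines. Purely as a conceptual stand-in for "forwarding data via proxies, or intermediate hosts, in the communication path", the sketch below is an ordinary user-space TCP forwarding proxy; it copies every byte through the kernel, which is precisely the overhead the paper's approach removes, and the endpoint names in the usage comment are hypothetical.

```python
import socket
import threading

def pipe(src, dst):
    """Relay bytes from src to dst until EOF (ordinary copies through the kernel)."""
    while True:
        data = src.recv(65536)
        if not data:
            break
        dst.sendall(data)
    dst.close()

def forward_proxy(listen_port, target_host, target_port):
    """Accept one client and relay its stream to and from the target host."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", listen_port))
    srv.listen(1)
    client, _ = srv.accept()
    upstream = socket.create_connection((target_host, target_port))
    threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
    pipe(upstream, client)

# Usage sketch (hypothetical endpoints):
#   forward_proxy(9000, "data-source.example.org", 9000)
```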

Relevance:

90.00%

Publisher:

Abstract:

We investigate adaptive buffer management techniques for approximate evaluation of sliding window joins over multiple data streams. In many applications, data stream processing systems have limited memory or have to deal with very high-speed data streams. In both cases, computing the exact results of joins between these streams may not be feasible, mainly because the buffers used to compute the joins contain a much smaller number of tuples than the sliding windows do. A stream buffer management policy is therefore needed. We show that the buffer replacement policy is an important determinant of the quality of the produced results. To that end, we propose GreedyDual-Join (GDJ), an adaptive and locality-aware buffering technique for managing these buffers. GDJ exploits the temporal correlations (at both long and short time scales) that we found to be prevalent in many real data streams. We note that our algorithm is readily applicable to multiple data streams and multiple joins and requires almost no additional system resources. We report results of an experimental study using both synthetic and real-world data sets. Our results demonstrate the superiority and flexibility of our approach when contrasted with other recently proposed techniques.
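As a generic illustration of approximate window joins under bounded buffers (not the GDJ policy itself, whose priority function is not given here), the sketch below performs a symmetric hash join on a key, caps each stream buffer at a fixed number of tuples, and evicts the key least recently involved in a match, a crude locality-aware stand-in for a real replacement policy.

```python
from collections import OrderedDict

class ApproxWindowJoin:
    """Approximate symmetric hash join over two streams with bounded buffers.

    Each buffer holds at most `capacity` tuples; when full, the entries under
    the key that matched least recently are evicted first.
    """
    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.buffers = {0: OrderedDict(), 1: OrderedDict()}   # key -> list of tuples

    def insert(self, stream_id, key, tup):
        other = self.buffers[1 - stream_id]
        results = [(tup, m) for m in other.get(key, [])]
        if key in other:
            other.move_to_end(key)            # matched keys are "hot": keep them
        buf = self.buffers[stream_id]
        buf.setdefault(key, []).append(tup)
        buf.move_to_end(key)
        while sum(len(v) for v in buf.values()) > self.capacity:
            buf.popitem(last=False)           # evict the coldest key
        return results                        # join results produced by this arrival

# Usage sketch with hypothetical tuples
join = ApproxWindowJoin(capacity=4)
join.insert(0, key="a", tup=("a", 1))
print(join.insert(1, key="a", tup=("a", 99)))   # -> [(("a", 99), ("a", 1))]
```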

Relevance:

90.00%

Publisher:

Abstract:

A neural model of peripheral auditory processing is described and used to separate features of coarticulated vowels and consonants. After preprocessing of speech via a filterbank, the model splits into two parallel channels, a sustained channel and a transient channel. The sustained channel is sensitive to relatively stable parts of the speech waveform, notably synchronous properties of the vocalic portion of the stimulus; it extends the dynamic range of eighth-nerve filters using coincidence detectors that combine operations of raising to a power, rectification, delay, multiplication, time averaging, and preemphasis. The transient channel is sensitive to critical features at the onsets and offsets of speech segments. It is built up from fast excitatory neurons that are modulated by slow inhibitory interneurons. These units are combined over high-frequency and low-frequency ranges using operations of rectification, normalization, multiplicative gating, and opponent processing. Detectors sensitive to frication and to the onset or offset of stop consonants and vowels are described. Model properties are characterized by mathematical analysis and computer simulations. Neural analogs of model cells in the cochlear nucleus and inferior colliculus are noted, as are psychophysical data about the perception of CV syllables that may be explained by the sustained-transient channel hypothesis. The proposed sustained and transient processing appears to be an auditory analog of the sustained and transient processing known to occur in vision.
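As a loose illustration of the transient channel idea, fast excitation modulated by slower inhibition, and not the article's neural model, the sketch below flags onsets and offsets of a band envelope by differencing a fast and a slow leaky integrator; the time constants and the test envelope are assumptions.

```python
import numpy as np

def leaky_integrate(x, tau, dt=1e-3):
    """First-order leaky integrator (simple low-pass) with time constant tau (s)."""
    y = np.zeros_like(x)
    a = dt / tau
    for i in range(1, len(x)):
        y[i] = y[i - 1] + a * (x[i] - y[i - 1])
    return y

def transient_channel(envelope, tau_fast=0.005, tau_slow=0.05, dt=1e-3):
    """Fast excitation minus slow inhibition: positive lobes mark onsets,
    negative lobes mark offsets; half-wave rectified into two detector outputs."""
    fast = leaky_integrate(envelope, tau_fast, dt)
    slow = leaky_integrate(envelope, tau_slow, dt)
    diff = fast - slow
    return np.maximum(diff, 0.0), np.maximum(-diff, 0.0)   # onset, offset signals

# Assumed test: an envelope that switches on at 0.2 s and off at 0.6 s
t = np.arange(0, 1.0, 1e-3)
env = ((t > 0.2) & (t < 0.6)).astype(float)
onset, offset = transient_channel(env)
print("onset peak at %.3f s, offset peak at %.3f s"
      % (t[np.argmax(onset)], t[np.argmax(offset)]))
```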

Relevance:

90.00%

Publisher:

Abstract:

This thesis focuses on the comparison and selection of magnetic materials for high-power non-isolated dc-dc converters for industrial applications and for electric, hybrid and fuel cell vehicles; the application of high-frequency bi-directional soft-switched dc-dc converters is also investigated. The thesis initially outlines the motivation for an energy-efficient transportation system with minimum environmental impact and reduced dependence on exhaustible resources. This is followed by a general overview of the power system architectures for electric, hybrid and fuel cell vehicles; the vehicle power sources and general dc-dc converter topologies are discussed, and the dc-dc converter components are reviewed with emphasis on recent semiconductor advances. A novel bi-directional soft-switched dc-dc converter with an auxiliary cell is introduced in this thesis. The soft-switching cell allows the MOSFET's intrinsic body diode to operate in a half-bridge without reduced efficiency. The converter's mode-by-mode operation is analysed, closed-form expressions are presented for the average current gain, and the design issues and circuit limitations are discussed. Magnetic materials for the main dc-dc converter inductor are compared and contrasted. Novel magnetic material comparisons are introduced, which include the material dc-bias capability and thermal conductivity. An inductor design algorithm is developed and used to compare the various magnetic materials for the application, and the area-product analysis is presented for the minimum inductor size, highlighting the optimum magnetic materials. Finally, the high-flux magnetic materials are experimentally compared: the practical effects of frequency, dc bias, converter duty cycle for arbitrary flux-density waveforms, air-gap effects on the core and winding, winding shielding, and thermal configuration are investigated. The thesis results have been documented at the IEEE EPE conference in 2007 and 2008, IEEE APEC in 2009 and 2010, and IEEE VPPC in 2010, and a journal paper has been accepted by IEEE Transactions on Power Electronics in 2011.
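The area-product analysis mentioned above relates the required core size to the energy handled by the inductor. The sketch below uses the standard textbook form A_p = L I_pk I_rms / (k_u B_max J), which is not necessarily the thesis's algorithm, and the converter numbers in the example are assumptions.

```python
def area_product_cm4(L, i_peak, i_rms, b_max=0.3, j_rms=4e6, k_u=0.4):
    """Required core area product A_p = A_w * A_c for an inductor.

    L in henries, currents in amps, b_max in tesla, j_rms in A/m^2,
    k_u window utilization. Returns A_p in cm^4 (1 m^4 = 1e8 cm^4).
    Standard area-product sizing; material-dependent effects such as
    dc-bias roll-off and core loss are ignored in this sketch.
    """
    a_p_m4 = (L * i_peak * i_rms) / (b_max * j_rms * k_u)
    return a_p_m4 * 1e8

# Assumed example: 100 uH inductor, 40 A peak, 30 A rms
print("required area product: %.1f cm^4" % area_product_cm4(100e-6, 40.0, 30.0))
```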

Relevance:

90.00%

Publisher:

Abstract:

With the flood and time-sensitivity of data in the organizational context, the decision maker's choice of an appropriate decision alternative in a given situation is challenged. In particular, operational actors face the challenge of making business-critical decisions in a short time and at high frequency. The construct of Situation Awareness (SA) has been established in cognitive psychology as a valid basis for understanding the behavior and decision making of human beings in complex and dynamic systems. SA gives decision makers the possibility to make informed, time-critical decisions and thereby improve the performance of the respective business process. This research paper leverages SA as the starting point of a design science project for Operational Business Intelligence and Analytics systems and suggests a first version of design principles.

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. METHODS: Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. RESULTS: Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring full commitment of CRCs, transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. CONCLUSIONS: This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials workflows.