974 results for Multiple attenuation. Deconvolution. Seismic processing
Abstract:
There is increasing evidence to suggest that the presence of mesoscopic heterogeneities constitutes the predominant attenuation mechanism at seismic frequencies. As a consequence, centimeter-scale perturbations of the subsurface physical properties should be taken into account for seismic modeling whenever detailed and accurate responses of the target structures are desired. This is, however, computationally prohibitive since extremely small grid spacings would be necessary. A convenient way to circumvent this problem is to use an upscaling procedure to replace the heterogeneous porous media by equivalent visco-elastic solids. In this work, we solve Biot's equations of motion to perform numerical simulations of seismic wave propagation through porous media containing mesoscopic heterogeneities. We then use an upscaling procedure to replace the heterogeneous poro-elastic regions by homogeneous equivalent visco-elastic solids and repeat the simulations using visco-elastic equations of motion. We find that, despite the equivalent attenuation behavior of the heterogeneous poro-elastic medium and the equivalent visco-elastic solid, the seismograms may differ due to diverging boundary conditions at fluid-solid interfaces, where there exist additional options for the poro-elastic case. In particular, we observe that the seismograms agree for closed-pore boundary conditions, but differ significantly for open-pore boundary conditions. This is an interesting result, which has potentially important implications for wave-equation-based algorithms in exploration geophysics involving fluid-solid interfaces, such as, for example, wave field decomposition.
Abstract:
P130 A HIGH-RESOLUTION 2D/3D SEISMIC STUDY OF A THRUST FAULT ZONE IN LAKE GENEVA, SWITZERLAND. M. SCHEIDHAUER, M. BERES, D. DUPUY and F. MARILLIER, Institute of Geophysics, University of Lausanne, 1015 Lausanne, Switzerland. Summary: A high-resolution three-dimensional (3D) seismic reflection survey has been conducted in Lake Geneva near the city of Lausanne, Switzerland, where the faulted molasse basement (Tertiary sandstones) is overlain by complex Quaternary sedimentary structures. Using a single 48-channel streamer, an area of 1200 m x 600 m was surveyed in 10 days. With a 5-m shot spacing and a receiver spacing of 2.5 m in the inline direction and 7.5 m in the crossline direction, 12-fold data coverage was achieved. A maximum penetration depth of ~150 m was achieved with a 15 cu. in. water gun operated at 140 bars. The multi-channel data allow the determination of an accurate velocity field for 3D processing, and they show particularly clean images of the fault zone and the overlying sediments in horizontal and vertical sections. In order to compare different sources, inline 55 was repeated with a 30/30 and a 15/15 cu. in. double-chamber air gun (Mini GI) operated at 100 and 80 bars, respectively. A maximum penetration depth of ~450 m was achieved with this source.
Abstract:
In this work we analyze how patchy distributions of CO2 and brine within sand reservoirs may lead to significant attenuation and velocity dispersion effects, which in turn may have a profound impact on surface seismic data. The ultimate goal of this paper is to contribute to the understanding of these processes within the framework of the seismic monitoring of CO2 sequestration, a key strategy to mitigate global warming. We first carry out a Monte Carlo analysis to study the statistical behavior of attenuation and velocity dispersion of compressional waves traveling through rocks with properties similar to those at the Utsira Sand, Sleipner field, containing quasi-fractal patchy distributions of CO2 and brine. These results show that the mean patch size and CO2 saturation play key roles in the observed wave-induced fluid flow effects. The latter can be remarkably important when CO2 concentrations are low and mean patch sizes are relatively large. To analyze these effects on the corresponding surface seismic data, we perform numerical simulations of wave propagation considering reservoir models and CO2 accumulation patterns similar to those of the CO2 injection site in the Sleipner field. These numerical experiments suggest that wave-induced fluid flow effects may produce changes in the reservoir's seismic response, significantly modifying the main seismic attributes usually employed in the characterization of these environments. Consequently, determining the nature of the fluid distributions, as well as properly modeling the seismic data, constitute important aspects that should not be ignored in the seismic monitoring of CO2 sequestration problems.
Abstract:
Diffusion MRI is a well-established imaging modality providing a powerful way to non-invasively probe the structure of the white matter. Despite the potential of the technique, the intrinsically long scan times of these sequences have hampered their use in clinical practice. For this reason, a wide variety of methods have been proposed to shorten acquisition times. [...] Here we review a recent work in which we propose to further exploit the versatility of compressed sensing and convex optimization with the aim of characterizing the sparsity of the fiber orientation distribution more effectively. We re-formulate the spherical deconvolution problem as a constrained l0 minimization.
Abstract:
At seismic frequencies, wave-induced fluid flow is a major cause of P-wave attenuation in partially saturated porous rocks. Attenuation is of great importance for the oil industry in the interpretation of seismic field data. Here, the effects on P-wave attenuation resulting from changes in oil saturation are studied for media with coexisting water, oil, and gas. For that, creep experiments are numerically simulated by solving Biot's equations for consolidation of poroelastic media with the finite-element method. The experiments yield time-dependent stress-strain relations that are used to calculate the complex P-wave modulus, from which frequency-dependent P-wave attenuation is determined. The models are layered media with periodically alternating triplets of layers. Models consisting of triplets of layers with randomly varying layer thicknesses are also considered. The layers in each triplet are fully saturated with water, oil, and gas, respectively. The layer saturated with water has lower porosity and permeability than the layers saturated with oil and gas. These models represent hydrocarbon reservoirs in which water is the wetting fluid, preferentially saturating regions of lower porosity. The numerical experiments showed that increasing oil saturation, accompanied by a decrease in gas saturation, resulted in a significant increase of attenuation at low frequencies (below 2 Hz). Furthermore, replacing the oil with water resulted in a distinguishable behavior of the frequency-dependent attenuation. These results imply that, according to the physical mechanism of wave-induced fluid flow, frequency-dependent attenuation in media saturated with water, oil, and gas is a potential indicator of oil saturation.
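The last step of the workflow above, extracting frequency-dependent attenuation from the complex P-wave modulus via Q^-1(w) = Im M(w) / Re M(w), can be sketched in a few lines. The standard-linear-solid (Zener) modulus and its parameters below are illustrative stand-ins for the modulus obtained from the creep simulations, not values from the paper:

```python
import numpy as np

# Hypothetical Zener (standard-linear-solid) complex modulus, standing in
# for the P-wave modulus computed from the simulated creep experiments.
M_r = 8.0e9                    # relaxed modulus [Pa] (illustrative)
tau_eps, tau_sig = 1.2, 1.0    # strain/stress relaxation times [s], tau_eps > tau_sig

freqs = np.logspace(-3, 2, 500)   # frequency axis [Hz]
omega = 2 * np.pi * freqs
M = M_r * (1 + 1j * omega * tau_eps) / (1 + 1j * omega * tau_sig)

# Frequency-dependent attenuation from the complex modulus:
Q_inv = np.imag(M) / np.real(M)

f_peak = freqs[np.argmax(Q_inv)]
print(f"peak attenuation 1/Q = {Q_inv.max():.4f} at ~{f_peak:.3f} Hz")
```

For a Zener solid the attenuation peak sits at f = 1/(2*pi*sqrt(tau_sig*tau_eps)), so shifting the relaxation times moves the peak across frequency, which is qualitatively how changes in fluid saturation relocate the attenuation maximum in the models described above.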
Abstract:
A defining characteristic of fractured rocks is their very high level of seismic attenuation, which so far has been assumed to be mainly due to wave-induced fluid flow (WIFF) between the fractures and the pore space of the embedding matrix. Using oscillatory compressibility simulations based on the quasi-static poroelastic equations, we show that another important, and as yet undocumented, manifestation of WIFF is at play in the presence of fracture connectivity. This additional energy loss is predominantly due to fluid flow within the connected fractures and is sensitive to their lengths, permeabilities, and intersection angles. Correspondingly, it contains key information on the governing hydraulic properties of fractured rock masses and hence should be accounted for whenever realistic seismic models of such media are needed.
Abstract:
Background: Conventional magnetic resonance imaging (MRI) techniques are highly sensitive in detecting multiple sclerosis (MS) plaques, enabling a quantitative assessment of inflammatory activity and lesion load. In quantitative analyses of focal lesions, manual or semi-automated segmentations have been widely used to compute the total number of lesions and the total lesion volume. These techniques, however, are both challenging and time-consuming, and are also prone to intra-observer and inter-observer variability.
Aim: To develop an automated approach to segment brain tissues and MS lesions from brain MR images. The goal is to reduce user interaction and to provide an objective tool that eliminates inter- and intra-observer variability.
Methods: Based on the recent methods developed by Souplet et al. and de Boer et al., we propose a novel pipeline which includes the following steps: bias correction, skull stripping, atlas registration, tissue classification, and lesion segmentation. After the initial pre-processing steps, an MRI scan is automatically segmented into 4 classes: white matter (WM), grey matter (GM), cerebrospinal fluid (CSF) and partial volume. An expectation-maximisation method which fits a multivariate Gaussian mixture model to T1-w, T2-w and PD-w images is used for this purpose. Based on the obtained tissue masks, and using the estimated GM mean and variance, we apply an intensity threshold to the FLAIR image, which provides the lesion segmentation. With the aim of improving this initial result, spatial information coming from the neighbouring tissue labels is used to refine the final lesion segmentation.
Results: The experimental evaluation was performed using real 1.5T data sets and the corresponding ground truth annotations provided by expert radiologists. The following values were obtained: a true positive (TP) fraction of 64%, a false positive (FP) fraction of 80%, and an average surface distance of 7.89 mm. The results of our approach were quantitatively compared to our implementations of the works of Souplet et al. and de Boer et al., obtaining higher TP and lower FP values.
Conclusion: Promising MS lesion segmentation results have been obtained in terms of TP. However, the high number of FPs, still a well-known problem of all automated MS lesion segmentation approaches, has to be reduced before these methods can be used in standard clinical practice. Our future work will focus on tackling this issue.
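The intensity-threshold step of the pipeline can be sketched as follows. The synthetic image, the all-ones GM mask, and the offset k are illustrative assumptions; in the actual pipeline the GM mean and variance come from the expectation-maximisation tissue classification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical FLAIR slice: GM-like background plus a small hyperintense lesion.
flair = rng.normal(loc=100.0, scale=10.0, size=(64, 64))
flair[20:24, 30:34] += 80.0            # synthetic lesion hyperintensity

gm_mask = np.ones_like(flair, dtype=bool)   # stand-in for the GM tissue mask
gm_mean = flair[gm_mask].mean()
gm_std = flair[gm_mask].std()

# MS lesions appear hyperintense on FLAIR: threshold at GM mean + k * std.
k = 3.0                                # illustrative offset, tuned in practice
lesion_mask = flair > gm_mean + k * gm_std
print("candidate lesion voxels:", int(lesion_mask.sum()))
```

The refinement step described in the abstract would then prune isolated supra-threshold voxels using the labels of their neighbouring tissue.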
Abstract:
Brain perfusion can be assessed by CT and MR. For CT, two major techniques are used. First, Xenon CT is an equilibrium technique based on a freely diffusible tracer. First pass of iodinated contrast injected intravenously is a second, more widely available method. Both methods are proven to be robust and quantitative, thanks to the linear relationship between contrast concentration and x-ray attenuation. For the CT methods, concerns regarding the x-ray doses delivered to the patients need to be addressed. MR is also able to assess brain perfusion using the first pass of a gadolinium-based contrast agent injected intravenously. This method has to be considered semi-quantitative because of the non-linear relationship between contrast concentration and MR signal changes. Arterial spin labelling is another MR method assessing brain perfusion without injection of contrast. In this case, the blood flow in the carotids is magnetically labelled by an external radiofrequency pulse and observed during its first pass through the brain. Each of these various CT and MR techniques has advantages and limits that will be illustrated and summarised.
Learning Objectives:
1. To understand and compare the different techniques for brain perfusion imaging.
2. To learn about the methods of acquisition and post-processing of brain perfusion by first pass of contrast agent for CT and MR.
3. To learn about non-contrast MR methods (arterial spin labelling).
Abstract:
When dealing with nonlinear blind processing algorithms (deconvolution or post-nonlinear source separation), complex mathematical estimations must be performed, resulting in very slow algorithms. This is the case, for example, in speech processing, spike signal deconvolution or microarray data analysis. In this paper, we propose a simple method to reduce the computational time for the inversion of Wiener systems or the separation of post-nonlinear mixtures, by using a linear approximation in a minimum mutual information algorithm. Simulation results demonstrate that linear spline interpolation is fast and accurate, obtaining very good results (similar to those obtained without approximation) while computational time is dramatically decreased. Cubic spline interpolation also obtains similarly good results but, due to its intrinsic complexity, makes the global algorithm much slower and hence is not useful for our purpose.
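The core trick, replacing an expensive pointwise nonlinear estimate with a linear spline evaluated over a coarse knot grid, can be illustrated with numpy's np.interp. The nonlinearity below is an arbitrary stand-in, not the paper's score-function estimator:

```python
import numpy as np

# Stand-in for a costly nonlinearity evaluated inside the
# minimum-mutual-information iteration.
def expensive_nonlinearity(x):
    return np.tanh(x) + 0.1 * x**3

samples = np.linspace(-3, 3, 100_000)   # signal samples to transform

# Exact evaluation at every sample vs. linear-spline approximation
# built from only 64 exact evaluations at the knots.
exact = expensive_nonlinearity(samples)
knots = np.linspace(-3, 3, 64)
approx = np.interp(samples, knots, expensive_nonlinearity(knots))

max_err = np.max(np.abs(exact - approx))
print(f"max interpolation error: {max_err:.4f}")
```

The approximation calls the expensive function 64 times instead of 100,000, while the pointwise error stays small for any smooth nonlinearity, which mirrors the speed/accuracy trade-off reported in the abstract.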
Abstract:
In this paper we present a method for blind deconvolution of linear channels based on source separation techniques, for real-world signals. Applied to blind deconvolution problems, this technique exploits not the spatial independence between signals but the temporal independence between samples of the signal. Our objective is to minimize the mutual information between samples of the output in order to retrieve the original signal. To make use of this idea, the input signal must be a non-Gaussian i.i.d. signal. Because most real-world signals do not have this i.i.d. nature, we need to preprocess the original signal before transmission into the channel. Likewise, we must ensure that the transmitted signal has non-Gaussian statistics for the algorithm to function correctly. The strategy used for this preprocessing is presented in this paper. If the receiver has the inverse of the preprocessing, the original signal can be reconstructed without the convolutive distortion.
Abstract:
There is increasing evidence to suggest that the presence of mesoscopic heterogeneities constitutes an important seismic attenuation mechanism in porous rocks. As a consequence, centimetre-scale perturbations of the rock physical properties should be taken into account for seismic modelling whenever detailed and accurate responses of specific target structures are desired, which is, however, computationally prohibitive. A convenient way to circumvent this problem is to use an upscaling procedure to replace each of the heterogeneous porous media composing the geological model by corresponding equivalent visco-elastic solids and to solve the visco-elastic equations of motion for the inferred equivalent model. While the overall qualitative validity of this procedure is well established, there are as yet no quantitative analyses regarding the equivalence of the seismograms resulting from the original poro-elastic and the corresponding upscaled visco-elastic models. To address this issue, we compare poro-elastic and visco-elastic solutions for a range of marine-type models of increasing complexity. We find that despite the identical dispersion and attenuation behaviour of the heterogeneous poro-elastic and the equivalent visco-elastic media, the seismograms may differ substantially due to diverging boundary conditions, where there exist additional options for the poro-elastic case. In particular, we observe that at the fluid/porous-solid interface, the poro- and visco-elastic seismograms agree for closed-pore boundary conditions, but differ significantly for open-pore boundary conditions. This is an important result which has potentially far-reaching implications for wave-equation-based algorithms in exploration geophysics involving fluid/porous-solid interfaces, such as, for example, wavefield decomposition.
Abstract:
In 1H magnetic resonance spectroscopy, macromolecule signals underlie metabolite signals, and knowing their contribution is necessary for reliable metabolite quantification. When macromolecule signals are measured using an inversion-recovery pulse sequence, special care needs to be taken to correctly remove residual metabolite signals to obtain a pure macromolecule spectrum. Furthermore, since a single spectrum is commonly used for quantification in multiple experiments, the impact on metabolite quantification of potential macromolecule signal variability, arising from regional differences or pathologies, has to be assessed. In this study, we introduce a novel method to post-process measured macromolecule signals that offers a flexible and robust way of removing residual metabolite signals. This method was applied to investigate regional differences in the mouse brain macromolecule signals that may affect metabolite quantification when not taken into account. However, since no significant differences in metabolite quantification were detected, it was concluded that a single macromolecule spectrum can generally be used for the quantification of healthy mouse brain spectra. In contrast, the study of a mouse model of human glioma showed several alterations of the macromolecule spectrum, including, but not limited to, increased mobile lipid signals, which had to be taken into account to avoid significant metabolite quantification errors.
Abstract:
Problem solving (including insight and divergent thinking) seems to rely on the right hemisphere (RH). These functions are difficult to assess behaviorally. We propose anagram resolution as a suitable paradigm. University students (n=32) performed three tachistoscopic lateralized visual half-field experiments (stimulus presentation 150 ms). In Experiment 1, participants recalled four-letter strings. Subsequently, participants provided solutions for four-letter anagrams (one solution in Experiment 2; two solutions in Experiment 3). Additionally, participants completed a schizotypy questionnaire (O-LIFE). Results showed a right visual field advantage in Experiments 1 and 2, but no visual field advantage in Experiment 3. In Experiment 1, increasing positive schizotypy was associated with a RH performance shift. Problem solving seems to increasingly require the RH when facing several solutions rather than one. This result supports previous studies on the RH's role in remote associative, metaphor and discourse processing. The more complex the language requirements, the less personality traits seem to matter.
Abstract:
The global structural connectivity of the brain, the human connectome, is now accessible at millimeter scale with the use of MRI. In this paper, we describe an approach to map the connectome by constructing normalized whole-brain structural connection matrices derived from diffusion MRI tractography at 5 different scales. Using a template-based approach to match cortical landmarks of different subjects, we propose a robust method that allows (a) the selection of identical cortical regions of interest of desired size and location in different subjects, with identification of the associated fiber tracts, (b) straightforward construction and interpretation of anatomically organized whole-brain connection matrices, and (c) statistical inter-subject comparison of brain connectivity at various scales. The fully automated post-processing steps necessary to build such matrices are detailed in this paper. Extensive validation tests are performed to assess the reproducibility of the method in a group of 5 healthy subjects, and its reliability is also discussed in detail for a group of 20 healthy subjects.
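The construction of a normalized whole-brain connection matrix from labelled streamline endpoints can be sketched as follows. The region labels and endpoints are synthetic, and normalizing by the total fiber count is one simple choice among several; the paper's normalization may differ:

```python
import numpy as np

n_regions = 5                          # ROIs at one parcellation scale
rng = np.random.default_rng(1)

# Each streamline connects the region of its first endpoint to the region
# of its last endpoint (synthetic endpoint labels for illustration).
n_fibers = 1000
start = rng.integers(0, n_regions, n_fibers)
end = rng.integers(0, n_regions, n_fibers)

# Accumulate a fiber-count matrix, then symmetrize: structural
# connectivity from tractography is undirected.
C = np.zeros((n_regions, n_regions))
np.add.at(C, (start, end), 1)
C = C + C.T
np.fill_diagonal(C, 0)                 # drop self-connections

# Normalize so matrices are comparable across subjects and scales.
density = C / C.sum()
print(density.round(3))
```

Repeating this at each of the 5 parcellation scales yields the multi-scale connection matrices the abstract describes, with rows and columns ordered by the template's anatomical labels.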