921 results for seismic data processing
Abstract:
This paper focuses on the problem of Grid system decomposition by developing its object model. The Unified Modelling Language (UML) is used as a formalization tool. This approach is motivated by the complexity of the system being analysed and the need for simulation model design.
Abstract:
The software architecture and development considerations for an open metadata extraction and processing framework are outlined. Special attention is paid to reliability and fault tolerance. Grid infrastructure is shown to be a useful backend for general-purpose tasks.
Abstract:
In this paper, conceptual foundations for the development of Grid systems aimed at satellite data processing are discussed. The state of the art in the development of such Grid systems is analyzed, and a model of a Grid system for satellite data processing is proposed. Experience gained during the development of the Grid system for satellite data processing at the Space Research Institute of NASU-NSAU is discussed.
Abstract:
Implementation of the GEOSS/GMES initiative requires the creation and integration of service providers, most of which deliver geospatial data output from a Grid system to interactive users. This paper considers the approaches to integrating DOS centers (service providers) used in the Ukrainian segment of GEOSS/GMES, and suggests template solutions for geospatial data visualization subsystems. The developed patterns are implemented in the DOS center of the Space Research Institute of the National Academy of Sciences of Ukraine and the National Space Agency of Ukraine (NASU-NSAU).
Abstract:
This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. To select the best inscription parameters, combinations of different inscription parameters were tested using three fs laser systems with different operating properties on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing can take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field-programmable gate arrays (FPGAs). More recently, however, graphics processing unit (GPU) based data processing methods have been developed to minimize this processing and rendering time.
These processing techniques include standard processing methods, comprising a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time; its processing throughput is currently limited by the camera capture rate. OCT-phantoms have been used extensively for the qualitative characterization and fine-tuning of the operating conditions of the OCT system. Investigations are currently under way to characterize OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In the process of developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, an extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the fabrication of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
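The core of the standard FD-OCT processing chain described above, which the thesis accelerates on GPUs, can be sketched on the CPU in a few lines: background removal, spectral apodization, and an inverse FFT take each detector spectrum to an A-scan. The function names and array shapes below are illustrative assumptions, not the thesis code.

```python
import numpy as np

def spectrum_to_ascan(spectrum, window=None):
    """Convert one spectral interferogram (detector row) to an A-scan.

    Minimal FD-OCT steps: DC (background) removal, spectral windowing
    to suppress side lobes, inverse FFT to depth space, log magnitude.
    """
    s = spectrum - spectrum.mean()           # remove the DC term
    if window is None:
        window = np.hanning(len(s))          # apodization window
    depth_profile = np.fft.ifft(s * window)  # spectral -> depth domain
    half = len(s) // 2                       # keep positive depths only
    return 20 * np.log10(np.abs(depth_profile[:half]) + 1e-12)

# B-scan: apply the same transform to every spectrum in a frame
frame = np.random.rand(512, 2048)            # 512 A-scans, 2048 spectral pixels
bscan = np.stack([spectrum_to_ascan(row) for row in frame])
print(bscan.shape)                           # (512, 1024)
```

Because every A-scan is independent, this loop maps directly onto GPU threads, which is why GPU implementations reach camera-limited throughput.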
Abstract:
Data processing services for the Meteosat geostationary satellite are presented. The implemented services correspond to different levels of remote-sensing data processing: noise reduction at the preprocessing level, cloud mask extraction at the low level, and fractal dimension estimation at the high level. The cloud mask is obtained by Markovian segmentation of the infrared data. To overcome the high computational complexity of Markovian segmentation, a parallel algorithm is developed. The fractal dimension of Meteosat data is estimated using fractional Brownian motion models.
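A fractal-dimension estimate based on fractional Brownian motion can be illustrated with a spectral-slope method: for a 1-D fBm profile the power spectrum follows S(f) ∝ f^(−β), and the fractal dimension is D = (5 − β)/2. A minimal sketch on synthetic data rather than Meteosat imagery (the paper's actual estimator may differ):

```python
import numpy as np

def fbm_profile(n, hurst, seed=0):
    """Synthesize a 1-D fractional Brownian motion profile spectrally."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n)[1:]            # skip the DC bin
    beta = 2 * hurst + 1                      # 1-D spectral exponent
    amps = freqs ** (-beta / 2)               # power ~ f^-beta
    phases = rng.uniform(0, 2 * np.pi, len(freqs))
    spectrum = np.concatenate(([0], amps * np.exp(1j * phases)))
    return np.fft.irfft(spectrum, n)

def fractal_dimension(profile):
    """Estimate D from the power-spectrum slope: S(f) ~ f^-beta, D = (5 - beta)/2."""
    spec = np.abs(np.fft.rfft(profile)) ** 2
    freqs = np.fft.rfftfreq(len(profile))
    mask = (freqs > 0) & (freqs < freqs.max())   # drop DC and Nyquist bins
    slope, _ = np.polyfit(np.log(freqs[mask]), np.log(spec[mask]), 1)
    beta = -slope
    return (5 - beta) / 2

x = fbm_profile(4096, hurst=0.5)
print(round(fractal_dimension(x), 2))  # 1.5, i.e. D = 2 - H
```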
Abstract:
As massive data sets become increasingly available, people face the problem of how to effectively process and understand them. Traditional sequential computing models are giving way to parallel and distributed computing models such as MapReduce, both because of the large size of the data sets and because of their high dimensionality. This dissertation, in the same direction as other research based on MapReduce, develops effective techniques and applications using MapReduce that can help people solve large-scale problems. Three different problems are tackled. The first deals with processing terabytes of raster data in a spatial data management system: aerial imagery files are broken into tiles to enable data-parallel computation. The second and third problems deal with dimension reduction techniques for data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions on the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute the CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce by carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
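Nonnegative matrix factorization is built from matrix products, which are exactly what a MapReduce implementation partitions across workers (e.g. by blocking V into row or column strips). A serial sketch of the classical Lee-Seung multiplicative updates, one common NMF variant; the dissertation's specific variants and partitioning scheme are not reproduced here:

```python
import numpy as np

def nmf(V, rank, iters=500, seed=0):
    """Lee-Seung multiplicative updates for V ~ W @ H (all nonnegative).

    Each update is dominated by matrix products (W.T @ V, H @ H.T, ...);
    in a MapReduce setting these products are what gets distributed.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    eps = 1e-9                                  # avoid division by zero
    for _ in range(iters):
        H *= (W.T @ V) / ((W.T @ W) @ H + eps)  # update H, W fixed
        W *= (V @ H.T) / (W @ (H @ H.T) + eps)  # update W, H fixed
    return W, H

# Exact nonnegative rank-5 matrix, so a good factorization exists
rng = np.random.default_rng(1)
V = rng.random((40, 5)) @ rng.random((5, 30))
W, H = nmf(V, rank=5)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(round(err, 3))
```

The multiplicative form guarantees W and H stay nonnegative as long as they are initialized positive, which is why it parallelizes cleanly: each update is a pure matrix-product pipeline with no projection step.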
Abstract:
The generation of heterogeneous big data sources with ever-increasing volumes, velocities and veracities over the last few years has inspired the data science and research community to address the challenge of extracting knowledge from big data. Such a wealth of generated data across the board can be intelligently exploited to advance our knowledge about our environment, public health, critical infrastructure and security. In recent years we have developed generic approaches to process such big data at multiple levels to advance decision support. These specifically concern data processing with semantic harmonisation, low-level fusion, analytics, knowledge modelling with high-level fusion, and reasoning. Such approaches will be introduced and presented in the context of the TRIDEC project results on critical oil and gas industry drilling operations, and of the ongoing eVacuate project on critical crowd behaviour detection in confined spaces.
Abstract:
The Data Processing Department of ISHC has developed coding forms to be used for the data to be entered into the program. The Highway Planning and Programming and the Design Departments are responsible for coding and submitting the necessary data forms to Data Processing for the noise prediction on the highway sections.
Abstract:
New morpho-bathymetric and tectono-stratigraphic data on the Naples and Salerno Gulfs, derived from bathymetric and seismic data analysis and integrated geological interpretation, are presented here. The CUBE (Combined Uncertainty Bathymetric Estimator) method has been applied to complex morphologies, such as the Capri continental slope and the related geological structures occurring in the Salerno Gulf. The bathymetric data analysis has been carried out for marine geological maps of the whole Campania continental margin at scales ranging from 1:25.000 to 1:10.000, including focused examples in the Naples and Salerno Gulfs, Naples harbour, the Capri and Ischia Islands and the Salerno Valley. Seismic data analysis has allowed the correlation of the main morpho-structural lineaments recognized at regional scale through multichannel profiles with morphological features cropping out at the sea bottom, evident from the bathymetry. The main fault systems in the area have been represented on a tectonic sketch map, including the master fault located north of the Salerno Valley half graben. Some normal faults parallel to the master fault have been interpreted from the slope map derived from the bathymetric data. A complex system of antithetic faults bounds two morpho-structural highs located 20 km south of the Capri Island. Some hints of compressional reactivation of normal faults in an extensional setting involving the whole Campania continental margin are shown by the seismic interpretation.
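A slope map of the kind used for the fault interpretation can be derived from a gridded bathymetric surface by taking finite-difference gradients of depth. A minimal sketch on a synthetic grid (cell size and values are illustrative, not from the survey):

```python
import numpy as np

def slope_map_deg(depth_grid, cell_size_m):
    """Slope map (degrees) from a gridded bathymetric surface.

    Central-difference gradients along both grid axes; depths in metres.
    """
    dz_dy, dz_dx = np.gradient(depth_grid, cell_size_m)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Synthetic dipping seafloor: 0.1 m of deepening per metre eastward
x = np.arange(100.0)
grid = np.tile(0.1 * x, (50, 1))            # 50 x 100 depth grid
slopes = slope_map_deg(grid, cell_size_m=1.0)
print(round(float(slopes.mean()), 2))       # 5.71 deg, i.e. arctan(0.1)
```

On real multibeam grids the same computation highlights fault scarps as linear bands of anomalously steep slope.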
Abstract:
The structure of the Moroccan and Nova Scotia conjugate rifted margins is of key importance for understanding the Mesozoic break-up and evolution of the northern central Atlantic Ocean basin. Seven combined multichannel reflection (MCS) and wide-angle seismic (OBS) data profiles were acquired along the Atlantic Moroccan margin between the latitudes of 31.5° and 33° N during the MIRROR seismic survey in 2011, in order to image the transition from continental to oceanic crust, to study the variation in crustal structure and to characterize the crust under the West African Coast Magnetic Anomaly (WACMA). The data were modeled using a forward modeling approach. The final models image crustal thinning from 36 km thickness below the continent to approximately 8 km in the oceanic domain. A 100 km wide zone characterized by rough basement topography and high seismic velocities up to 7.4 km/s in the lower crust is observed westward of the West African Coast Magnetic Anomaly. No basin underlain by continental crust has been imaged in this region, as has been identified north of our study area. Comparison to the conjugate Nova Scotian margin shows a similar continental crustal thickness and layer geometry, and the existence of exhumed and serpentinized upper mantle material on the Canadian side only. The oceanic crustal thickness is lower on the Canadian margin.
Abstract:
A crosswell data set contains a range of angles limited only by the geometry of the source and receiver configuration, the separation of the boreholes and the depth to the target. However, the wide-angle reflections present in crosswell imaging produce amplitude-versus-angle (AVA) features not usually observed in surface data. These features include reflections at angles that are near-critical and beyond critical for many of the interfaces; some of these reflections are visible only over a small range of angles, presumably near their critical angle. High-resolution crosswell seismic surveys were conducted over a Silurian (Niagaran) reef at two fields in northern Michigan, Springdale and Coldspring. The Springdale wells extended to much greater depths than the reef, and imaging was conducted from both above and beneath the reef. Combining the results from images obtained from above with those from beneath provides additional information, first by exhibiting ranges of angles that differ between the two images, especially for reflectors at shallow depths, and second by providing additional constraints on the solutions of the Zoeppritz equations. Inversion of seismic data for impedance has become a standard part of the workflow for quantitative reservoir characterization. Inversion of crosswell data using either deterministic or geostatistical methods can, however, lead to poor results because of the phase change beyond the critical angle, so simultaneous pre-stack inversion of partial angle stacks may be best conducted with angles restricted to less than critical. Deterministic inversion is designed to yield only a single best-fit model of elastic properties, while geostatistical inversion produces multiple models (realizations) of elastic properties, lithology and reservoir properties. Geostatistical inversion produces results with far more detail than deterministic inversion.
The difference in detail between the two types of inversion becomes increasingly pronounced for thinner reservoirs, particularly those beyond the vertical resolution of the seismic data. For any interface imaged from both above and beneath, the resulting AVA character must arise from identical contrasts in elastic properties in the two sets of images, albeit in reverse order. An inversion approach that handles both datasets simultaneously, at pre-critical angles, is demonstrated in this work. The main exploration problem for carbonate reefs is determining the porosity distribution. Images of elastic properties, obtained from deterministic and geostatistical simultaneous inversion of a high-resolution crosswell seismic survey, were used to obtain the internal structure and reservoir properties (porosity) of a Niagaran Michigan reef. The images obtained are the best of any Niagaran pinnacle reef to date.
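The near- and post-critical behaviour discussed above is governed by Snell's law: a critical angle θc = arcsin(v1/v2) exists at an interface only where the lower medium is faster, and reflections beyond it undergo the phase change that complicates inversion. A small sketch with hypothetical velocities (illustrative values, not from the survey):

```python
import numpy as np

def critical_angle_deg(v_upper, v_lower):
    """Critical incidence angle (degrees) for a wave in the upper layer.

    By Snell's law a critical angle exists only when the lower layer
    is faster; otherwise None is returned.
    """
    if v_lower <= v_upper:
        return None
    return float(np.degrees(np.arcsin(v_upper / v_lower)))

# Hypothetical P-wave velocities (m/s) for illustration only
interfaces = {"shale over carbonate": (3000.0, 5500.0),
              "carbonate over anhydrite": (5500.0, 6000.0)}

for name, (v1, v2) in interfaces.items():
    theta_c = critical_angle_deg(v1, v2)
    print(f"{name}: critical angle = {theta_c:.1f} deg")
```

Angle stacks for simultaneous pre-stack inversion would then be restricted to angles safely below the smallest such θc along the imaged interfaces.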
Abstract:
By providing vehicle-to-vehicle and vehicle-to-infrastructure wireless communications, vehicular ad hoc networks (VANETs), also known as "networks on wheels", can greatly enhance traffic safety, traffic efficiency and the driving experience in intelligent transportation systems (ITS). However, the unique features of VANETs, such as high mobility and the uneven distribution of vehicular nodes, pose critical efficiency and reliability challenges for their implementation. This dissertation is motivated by the great application potential of VANETs in the design of efficient in-network data processing and dissemination. Considering the significance of message aggregation, data dissemination and data collection, this dissertation targets enhancing traffic safety and traffic efficiency, as well as developing novel commercial applications based on VANETs, along four directions: 1) accurate and efficient message aggregation to detect on-road safety-relevant events, 2) reliable data dissemination to notify remote vehicles, 3) efficient and reliable spatial data collection from vehicular sensors, and 4) novel promising applications exploiting the commercial potential of VANETs. Specifically, to enable cooperative detection of safety-relevant events on the roads, the structure-less message aggregation (SLMA) scheme is proposed to improve communication efficiency and message accuracy. The scheme of relative-position-based message dissemination (RPB-MD) is proposed to reliably and efficiently disseminate messages to all intended vehicles in the zone of relevance under varying traffic density. Given the abundance of vehicular sensor data available in VANETs, the scheme of compressive-sampling-based data collection (CS-DC) is proposed to efficiently collect spatially correlated data at large scale, especially in dense traffic.
In addition, with novel and efficient solutions proposed for the application-specific issues of data dissemination and data collection, several appealing value-added applications for VANETs are developed to exploit their commercial potential, namely the general-purpose automatic survey (GPAS), VANET-based ambient ad dissemination (VAAD), and VANET-based vehicle performance monitoring and analysis (VehicleView). Thus, by improving the efficiency and reliability of in-network data processing and dissemination, including message aggregation, data dissemination and data collection, together with the development of novel promising applications, this dissertation helps push VANETs further toward the stage of massive deployment.
Abstract:
Processing of a seismic line recently acquired in the northeastern South China Sea by Project 973 has been conducted to study the character and distribution of gas-hydrate bottom-simulating reflectors (BSRs) on the Hengchun ridge. Analysis of different types of seismic profiles shows that the distribution of BSRs in this area can be revealed to some extent by single-channel profiles, but seismic data processing plays an important role in resolving their full distribution. BSRs in the northeastern South China Sea have the typical characteristics of BSRs on continental margins worldwide: they cross sediment bed reflections, they are generally parallel to the seafloor, and the associated reflections have strong amplitude and negative polarity. The characteristics of the BSRs in this area are clear, and they indicate the occurrence of gas-hydrate-bearing sediments in the northeastern South China Sea. The depth of the base of the gas-hydrate stability zone was calculated using the phase stability boundary curves of methane hydrate and of hydrate with a mixed gas composition, and compared with the observed BSR depth. If a single geothermal gradient is used for the calculation, the base of the stability zone for methane hydrate or for hydrate with a mixed gas composition does not match the depth of the BSRs observed along the whole seismic profile; the geothermal gradient therefore changes significantly along the profile. The geothermal gradient and heat flow were estimated from the BSR data, and the calculations show that both decrease from west to east, as the distance from the trench increases and the distance to the island arc decreases. The calculated heat flow ranges from 28 to 64 mW/m², basically consistent with the heat flow measured offshore southwestern Taiwan.
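Estimating the geothermal gradient from a BSR rests on the fact that the BSR marks the base of the hydrate stability zone: the hydrate stability temperature at the BSR pressure, minus the seafloor temperature, divided by the sub-bottom BSR depth gives the gradient, and q = k·dT/dz gives heat flow. A simplified sketch with an illustrative logarithmic fit to the methane-hydrate phase boundary and hypothetical constants (not the curves or parameters used in the paper):

```python
import math

def hydrate_stability_temp_c(pressure_mpa):
    """Approximate methane-hydrate dissociation temperature (deg C).

    Rough illustrative log fit to the phase boundary; the constants
    are hypothetical, not from the paper.
    """
    return 8.9 * math.log(pressure_mpa) - 8.0

def gradient_and_heatflow(water_depth_m, bsr_subbottom_m,
                          seafloor_temp_c=3.0, k_w_per_mk=1.0):
    # Hydrostatic pressure at the BSR (rho * g * z), in MPa
    p_mpa = 1030 * 9.81 * (water_depth_m + bsr_subbottom_m) / 1e6
    t_bsr = hydrate_stability_temp_c(p_mpa)
    grad_c_per_km = (t_bsr - seafloor_temp_c) / bsr_subbottom_m * 1000
    # q = k * dT/dz; with dT/dz in K/km and k in W/(m*K), q is in mW/m^2
    heat_flow_mw_m2 = k_w_per_mk * grad_c_per_km
    return grad_c_per_km, heat_flow_mw_m2

grad, q = gradient_and_heatflow(water_depth_m=2000, bsr_subbottom_m=350)
print(f"gradient ~ {grad:.0f} C/km, heat flow ~ {q:.0f} mW/m^2")
```

Repeating this calculation at each BSR pick along the profile is what yields the along-profile trend in gradient and heat flow described above.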
Abstract:
With the progress of oilfield exploration and exploitation, the problems of sustaining production and enhancing gas recovery arise, placing new and higher demands on the precision of seismic data. On the basis of a study of the exploration status, resource potential, and 3D seismic data quality of representative mature oilfields, and taking the Ken71 zone of the Shengli field as the study object, this paper applies high-density 3D seismic techniques to solve the complex geological problems encountered in the exploration and development of mature regions, with in-depth research on the acquisition and processing of high-density 3D seismic data. The dissertation studies the roles of routine 3D seismic, high-density 3D seismic, 3D VSP seismic, and multi-wave multi-component seismic surveys in solving the geological problems of mature regions, introduces in particular the advantages and shortcomings of high-density 3D seismic exploration, and puts forward an integrated study method that gives priority to high-density 3D seismic data while combining it with other seismic data to enhance exploration accuracy in mature regions. On the basis of a detailed study of the acquisition methods for high-density 3D seismic and 3D VSP seismic surveys, physical and numerical simulations were carried out to design and optimize the observation system. A "four-combination" integrated acquisition method, combining borehole and surface seismic with a "three-synchronization" technique, was optimized: combined acquisition of P-waves and S-waves, of digital and analog geophones, of 3D VSP and surface seismic, and of crosswell seismic. Synchronized recording by surface equipment and downhole instruments, shared and synchronized recording of 3D VSP and surface shots, and synchronized acquisition of high-density P-wave and high-density multi-wave data were implemented, yielding a large volume of high-quality seismic data.
On the basis of a detailed analysis of the high-density analog-geophone data, amplitude-preserving processing techniques were adopted, the trade-off between signal-to-noise ratio and resolution was studied to improve the resolution of the seismic profile, and post-stack cascaded migration, pre-stack time migration, and pre-stack depth migration were used for high-precision imaging, yielding reliable high-resolution data. High-precision processing of the high-density digital-geophone data likewise gave clear improvements in resolution, fidelity, fault-point clarity, interbed information, formation characterization, and so on. Comparison of the processing results shows that high-density analog-geophone acquisition with high-precision imaging can enhance resolution, and that high-density seismic data based on digital geophones can better solve subsurface geological problems. At the same time, fine processing of the synchronously acquired converted-wave and 3D VSP seismic data achieved good results. On the basis of the high-density seismic data acquisition and processing, high-precision structural interpretation and inversion were carried out, together with preliminary interpretation and analysis of the 3D VSP and multi-wave multi-component seismic data. The high-precision interpretation indicates that, after high-resolution processing, the structural maps obtained from the high-density seismic data accord better with the true geological situation.