941 results for open source seismic data processing packages
Abstract:
TEXTABLE is a new open-source visual programming tool for textual data analysis. The design implications of this software are discussed from the standpoint of interoperability and flexibility, along with the question of its suitability for educational use. A brief introduction to the principles of visual programming for textual data analysis is also provided.
Abstract:
We present an open-source ITK implementation of a direct Fourier method for tomographic reconstruction, applicable to parallel-beam x-ray images. Direct Fourier reconstruction makes use of the central-slice theorem to build a polar 2D Fourier space from the 1D transformed projections of the scanned object, which is then resampled onto a Cartesian grid. An inverse 2D Fourier transform eventually yields the reconstructed image. Additionally, we provide a complex wrapper to the BSplineInterpolateImageFunction to overcome ITK's current lack of image interpolators dealing with complex data types. A sample application is presented and extensively illustrated on the Shepp-Logan head phantom. We show that appropriate input zero-padding and 2D-DFT oversampling rates, together with radial cubic b-spline interpolation, improve 2D-DFT interpolation quality and are efficient remedies to reduce reconstruction artifacts.
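The reconstruction above rests on the central-slice theorem: the 1D Fourier transform of a parallel-beam projection equals a radial line through the 2D Fourier transform of the object. A minimal numpy sketch (ours, not the authors' ITK code) verifies this numerically for the zero-degree projection:

```python
import numpy as np

# Toy stand-in for a scanned object (a random image, not a real phantom).
rng = np.random.default_rng(0)
image = rng.random((64, 64))

# Parallel-beam projection along y (angle 0): sum over rows.
projection = image.sum(axis=0)

# 1D Fourier transform of the projection ...
slice_from_projection = np.fft.fft(projection)

# ... equals the k_y = 0 line of the 2D Fourier transform of the image.
slice_from_image = np.fft.fft2(image)[0, :]

assert np.allclose(slice_from_projection, slice_from_image)
print("central-slice theorem verified for the 0-degree projection")
```

Repeating this for all acquisition angles fills a polar Fourier grid, which the method then resamples onto a Cartesian grid before the inverse 2D transform.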
Abstract:
The relief of the seafloor is an important source of data for many scientists. In this paper we present an optical system for underwater 3D reconstruction. The system comprises three cameras that take images synchronously at a constant frame rate. We use the images taken by these cameras to compute dense 3D reconstructions, and Bundle Adjustment to estimate the motion of the trinocular rig. Given the path followed by the system, we obtain a dense map of the observed scene by registering the different dense local reconstructions into a single, larger one.
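The final registration step amounts to applying the rig pose estimated by Bundle Adjustment to each local point cloud, p_global = R p + t. A minimal numpy sketch of that step; the pose and points below are made up for illustration:

```python
import numpy as np

def to_global(points_local, R, t):
    """Map a local dense reconstruction into the global frame: p_g = R p + t."""
    return points_local @ R.T + t

# Hypothetical pose from Bundle Adjustment: a 90-degree yaw plus a translation.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([2.0, 0.0, 0.0])

local_points = np.array([[1.0, 0.0, 0.0],
                         [0.0, 1.0, 0.0]])
global_points = to_global(local_points, R, t)
print(global_points)
```

Accumulating the transformed clouds from every pose along the path yields the single, larger dense map described above.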
Abstract:
Gaia is the most ambitious space astrometry mission currently envisaged and is a technological challenge in all its aspects. We describe a proposal for the payload data handling system of Gaia, as an example of a high-performance, real-time, concurrent, and pipelined data system. This proposal includes the front-end systems for the instrumentation, the data acquisition and management modules, the star data processing modules, and the payload data handling unit. We also review other payload and service module elements and we illustrate a data flow proposal.
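A concurrent, pipelined data system of this kind can be sketched as chained producer-consumer stages, each running in its own thread and handing results to the next over a queue. The stage functions and payloads below are illustrative inventions, not Gaia's actual modules:

```python
import queue
import threading

def stage(fn, q_in, q_out):
    # Each pipeline stage consumes from its input queue, applies its
    # processing step, and forwards the result; None signals shutdown.
    while True:
        item = q_in.get()
        if item is None:
            q_out.put(None)
            break
        q_out.put(fn(item))

# Two illustrative stages: acquisition framing, then star-data processing.
q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
t1 = threading.Thread(target=stage, args=(lambda x: ('frame', x), q1, q2))
t2 = threading.Thread(target=stage, args=(lambda f: (f[0], f[1] * 2), q2, q3))
for t in (t1, t2):
    t.start()
for sample in range(5):
    q1.put(sample)
q1.put(None)
t1.join(); t2.join()

results = []
while True:
    item = q3.get()
    if item is None:
        break
    results.append(item)
print(results)
```

Because the stages run concurrently, a new sample can enter acquisition while earlier samples are still being processed downstream, which is the point of a pipelined design.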
Abstract:
Extensible Markup Language (XML) is a generic computing language that provides an outstanding case study in the commodification of service standards. The development of this language in the late 1990s marked a shift in computer science, as its extensibility allowed any kind of data to be stored and shared. Many office software suites rely on it. The chapter highlights how the largest multinational firms pay special attention to gaining a recognised international standard for such a major technological innovation. It argues that standardisation processes affect market structures and can lead to market capture. By examining how a strategic use of standardisation arenas can generate profits, it shows that Microsoft succeeded in making its own technical solution a recognised ISO standard in 2008, although the same arena had already adopted, two years earlier, the open source standard set by IBM and Sun Microsystems. Yet XML standardisation also helped to establish a distinct model of information technology services at the expense of Microsoft's monopoly on proprietary software.
Abstract:
BACKGROUND: Qualitative frameworks, especially those based on the logical discrete formalism, are increasingly used to model regulatory and signalling networks. A major advantage of these frameworks is that they do not require precise quantitative data, and that they are well-suited for studies of large networks. While numerous groups have developed specific computational tools that provide original methods to analyse qualitative models, a standard format to exchange qualitative models has been missing. RESULTS: We present the Systems Biology Markup Language (SBML) Qualitative Models Package ("qual"), an extension of the SBML Level 3 standard designed for computer representation of qualitative models of biological networks. We demonstrate the interoperability of models via SBML qual through the analysis of a specific signalling network by three independent software tools. Furthermore, the collective effort to define the SBML qual format paved the way for the development of LogicalModel, an open-source model library, which will facilitate the adoption of the format as well as the collaborative development of algorithms to analyse qualitative models. CONCLUSIONS: SBML qual allows the exchange of qualitative models among a number of complementary software tools. SBML qual has the potential to promote collaborative work on the development of novel computational approaches, as well as on the specification and the analysis of comprehensive qualitative models of regulatory and signalling networks.
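To give a flavour of what a qualitative model looks like on the wire, here is a schematic fragment built with Python's ElementTree. The element and attribute names follow the SBML qual package's vocabulary, but this is a sketch, not a schema-complete or validated model:

```python
import xml.etree.ElementTree as ET

# Namespace URI in the style of SBML Level 3 package declarations
# (illustrative here; consult the qual specification for the exact form).
QUAL = "http://www.sbml.org/sbml/level3/version1/qual/version1"
ET.register_namespace("qual", QUAL)

# A model with one qualitative species, e.g. a gene that is on or off.
model = ET.Element("model")
species_list = ET.SubElement(model, f"{{{QUAL}}}listOfQualitativeSpecies")
ET.SubElement(
    species_list,
    f"{{{QUAL}}}qualitativeSpecies",
    {f"{{{QUAL}}}id": "geneA", f"{{{QUAL}}}maxLevel": "1"},
)

xml_text = ET.tostring(model, encoding="unicode")
print(xml_text)
```

A real SBML qual file additionally declares transitions with input thresholds and output levels, which is what lets independent tools exchange and re-analyse the same logical model.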
Abstract:
Knowledge of the reflectivity of the sediment-covered seabed is of significant importance to marine seismic data acquisition and interpretation as it governs the generation of reverberations in the water layer. In this context pertinent, but largely unresolved, questions concern the importance of the typically very prominent vertical seismic velocity gradients as well as the potential presence and magnitude of anisotropy in soft surficial seabed sediments. To address these issues, we explore the seismic properties of granulometric end-member-type clastic sedimentary seabed models consisting of sand, silt, and clay as well as scale-invariant stochastic layer sequences of these components characterized by realistic vertical gradients of the P- and S-wave velocities. Using effective media theory, we then assess the nature and magnitude of seismic anisotropy associated with these models. Our results indicate that anisotropy is rather benign for P-waves, and that the S-wave velocities in the axial directions differ only slightly. Because of the very high P- to S-wave velocity ratios in the vicinity of the seabed our models nevertheless suggest that S-wave triplications may occur at very small incidence angles. To numerically evaluate the P-wave reflection coefficient of our seabed models, we apply a frequency-slowness technique to the corresponding synthetic seismic wavefields. Comparison with analytical plane-wave reflection coefficients calculated for corresponding isotropic elastic half-space models shows that the differences tend to be most pronounced in the vicinity of the elastic equivalent of the critical angle as well as in the post-critical range. We also find that the presence of intrinsic anisotropy in the clay component of our layered models tends to dramatically reduce the overall magnitude of the P-wave reflection coefficient as well as its variation with incidence angle.
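The critical angle referred to above follows from Snell's law: total reflection sets in at sin θc = v1/v2 when the lower medium is faster. A small sketch with illustrative velocities (not the paper's seabed models):

```python
import math

def critical_angle_deg(v_upper, v_lower):
    """Critical incidence angle for a wave entering a faster medium."""
    if v_lower <= v_upper:
        raise ValueError("no critical angle: lower medium is not faster")
    return math.degrees(math.asin(v_upper / v_lower))

# Hypothetical water-over-sand contrast in m/s; real seabed sediments have
# strong vertical gradients, which is what the study's models capture.
print(round(critical_angle_deg(1500.0, 1700.0), 1))
```

The weaker the velocity contrast across the seabed, the larger the critical angle, which is why post-critical behaviour is sensitive to the near-seabed gradient structure.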
Abstract:
This study proposes a new concept for upscaling local information on failure surfaces derived from geophysical data, in order to develop the spatial information and quickly estimate the magnitude and intensity of a landslide. A new vision of seismic interpretation on landslides is also demonstrated by taking into account basic geomorphic information with a numeric method based on the Sloping Local Base Level (SLBL). The SLBL is a generalization of the base level defined in geomorphology applied to landslides, and allows the calculation of the potential geometry of the landslide failure surface. This approach was applied to a large-scale landslide formed mainly in gypsum and situated in a former glacial valley along the Rhone within the Western European Alps. Previous studies identified the existence of two sliding surfaces that may continue below the level of the valley. In this study, seismic refraction-reflection surveys were carried out to verify the existence of these failure surfaces. The analysis of the seismic data provides a four-layer model where three velocity layers (<1000 m s⁻¹, 1500 m s⁻¹ and 3000 m s⁻¹) are interpreted as the mobilized mass at different levels of weathering and compaction. The highest velocity layer (>4000 m s⁻¹) with a maximum depth of ~58 m is interpreted as the stable anhydrite bedrock. Two failure surfaces were interpreted from the seismic surveys: an upper failure surface and a much deeper one (respectively 25 and 50 m deep). The upper failure surface depth deduced from geophysics is slightly different from the results obtained using the SLBL, and the deeper failure surface depth calculated with the SLBL method is underestimated in comparison with the geophysical interpretations. Optimal results were therefore obtained by including the seismic data in the SLBL calculations according to the geomorphic limits of the landslide (maximal volume of mobilized mass = 7.5 × 10⁶ m³).
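The SLBL construction can be sketched in one dimension as an iterative lowering of each interior node towards the mean of its neighbours minus a tolerance; the converged surface approximates the potential failure surface under the topographic profile. The profile and tolerance below are made up, and real applications run on 2D grids:

```python
import numpy as np

def slbl_1d(z, tol=0.0, n_iter=10000):
    """1D sketch of the Sloping Local Base Level: lower each interior node
    to the mean of its neighbours minus `tol`, whenever that is lower."""
    surface = z.astype(float).copy()
    for _ in range(n_iter):
        candidate = 0.5 * (surface[:-2] + surface[2:]) - tol
        lower = candidate < surface[1:-1]
        if not lower.any():
            break  # converged: no node can be lowered further
        surface[1:-1] = np.where(lower, candidate, surface[1:-1])
    return surface

# Hypothetical elevation profile across a slope (arbitrary units).
profile = np.array([10.0, 14.0, 12.0, 15.0, 11.0, 13.0, 10.0])
failure_surface = slbl_1d(profile, tol=0.1)
print(failure_surface)
```

The volume of the mobilized mass then follows by integrating the difference between the topographic profile and the computed failure surface.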
Abstract:
After a rockfall event, a usual post-event survey includes qualitative volume estimation, trajectory mapping and determination of departing zones. However, quantitative measurements are not usually made. Additional relevant quantitative information could be useful in determining the spatial occurrence of rockfall events and could help in quantifying their size. Seismic measurements could be suitable for detection purposes since they are non-invasive and relatively inexpensive. Moreover, seismic techniques could provide important information on rockfall size and the location of impacts. On 14 February 2007 the Avalanche Group of the University of Barcelona obtained the seismic data generated by an artificially triggered rockfall event at the Montserrat massif (near Barcelona, Spain) carried out in order to purge a slope. Two 3-component seismic stations were deployed in the area, about 200 m from the explosion point that triggered the rockfall. Seismic signals and video images were obtained simultaneously. The initial volume of the rockfall was estimated at 75 m³ by laser scanner data analysis. After the explosion, dozens of boulders ranging from 10⁻⁴ to 5 m³ in volume impacted the ground at different locations. The blocks fell onto a terrace 120 m below the release zone. The impact generated a small continuous mass movement composed of a mixture of rocks, sand and dust that ran down the slope and impacted the road 60 m below. Time, time-frequency evolution and particle motion analysis of the seismic records, as well as seismic energy estimation, were performed. The results are as follows: (1) a rockfall event generates seismic signals with specific characteristics in the time domain; (2) the seismic signals generated by the mass movement show a time-frequency evolution different from that of other seismogenic sources (e.g. earthquakes, explosions or a single rock impact), a feature that could be used for detection purposes; (3) particle motion plot analysis shows that the procedure to locate the rock impact using two stations is feasible; (4) the feasibility and validity of seismic methods for the detection of rockfall events, their localization and size determination are confirmed.
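The time-frequency evolution used to discriminate rockfall signals from other sources can be computed with a short-time Fourier transform over sliding windows. The sketch below uses a synthetic signal whose dominant frequency shifts over time, standing in for a real seismogram; window and hop sizes are illustrative:

```python
import numpy as np

def stft_magnitude(signal, win=64, hop=32):
    """Magnitude of a short-time Fourier transform over Hann-windowed frames."""
    frames = [signal[i:i + win] * np.hanning(win)
              for i in range(0, len(signal) - win + 1, hop)]
    return np.abs(np.fft.rfft(np.asarray(frames), axis=1))

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
# A 50 Hz burst early and a 200 Hz burst late, mimicking evolving content.
signal = (np.sin(2 * np.pi * 50 * t) * (t < 0.3)
          + np.sin(2 * np.pi * 200 * t) * (t > 0.7))

spec = stft_magnitude(signal)
early_peak = spec[1].argmax()    # dominant frequency bin in an early frame
late_peak = spec[-2].argmax()    # dominant frequency bin in a late frame
print(early_peak, late_peak)
```

A single rock impact concentrates energy in a short broadband burst, whereas a continuous mass movement shows a sustained, drifting spectral signature, which is the contrast the detection criterion exploits.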
Abstract:
BACKGROUND: The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms, which allow one to obtain parameter posterior distributions based on simulations, without likelihood computations. RESULTS: Here we present ABCtoolbox, a series of open source programs to perform Approximate Bayesian Computations (ABC). It implements various ABC algorithms including rejection sampling, MCMC without likelihood, a particle-based sampler and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can interact with most simulation and summary statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates, and to find that males show smaller population sizes but much higher levels of migration than females. CONCLUSION: ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis: parameter sampling from prior distributions, data simulation, computation of summary statistics, estimation of posterior distributions, model choice, validation of the estimation procedure, and visualization of the results.
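The simplest of the algorithms named above, ABC rejection sampling, is short enough to sketch in full: draw parameters from the prior, simulate data, and keep only draws whose summary statistic lands close to the observed one. This is a toy illustration (normal model, mean as the sole summary statistic), not ABCtoolbox itself:

```python
import numpy as np

rng = np.random.default_rng(42)

# "Observed" data from a model with true mean 2.0; the summary statistic
# is the sample mean, so no likelihood is ever evaluated.
observed = rng.normal(2.0, 1.0, size=200)
s_obs = observed.mean()

accepted = []
for _ in range(20000):
    theta = rng.uniform(-5.0, 5.0)               # draw from the prior
    simulated = rng.normal(theta, 1.0, size=200)  # simulate under theta
    if abs(simulated.mean() - s_obs) < 0.05:      # distance < epsilon
        accepted.append(theta)

posterior = np.array(accepted)
print(len(posterior), round(posterior.mean(), 2))
```

Shrinking the tolerance epsilon tightens the approximation to the true posterior at the cost of a lower acceptance rate; the more elaborate samplers (likelihood-free MCMC, particle methods, ABC-GLM) exist to spend simulations more efficiently.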
Abstract:
This work extends previously developed research on the use of local model predictive control in differential-drive mobile robots. Experimental results are presented as a way to improve the methodology by considering aspects such as trajectory accuracy and time performance. In this sense, the cost function and the prediction horizon are important aspects to be considered. The aim of the present work is to test the control method by measuring trajectory-tracking accuracy and time performance. Moreover, strategies for integration with the perception system and path planning are briefly introduced; monocular image data can be used to plan safe trajectories using goal-attraction potential fields.
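The role of the cost function and prediction horizon can be made concrete with a finite-horizon tracking cost for a differential-drive (unicycle) model: predicted tracking error plus control effort over N steps. Weights, horizon and controls below are illustrative, not the paper's tuning:

```python
import numpy as np

def predict(x0, controls, dt=0.1):
    """Unicycle prediction over the horizon for controls (v, w)."""
    x, y, th = x0
    states = []
    for v, w in controls:
        x += v * np.cos(th) * dt
        y += v * np.sin(th) * dt
        th += w * dt
        states.append(np.array([x, y, th]))
    return states

def mpc_cost(states, reference, controls, q=1.0, r=0.1):
    """Finite-horizon cost: position tracking error plus control effort."""
    tracking = sum(q * np.sum((s[:2] - reference) ** 2) for s in states)
    effort = sum(r * (u[0] ** 2 + u[1] ** 2) for u in controls)
    return tracking + effort

goal = np.array([1.0, 0.0])
x0 = (0.0, 0.0, 0.0)
straight = [(0.5, 0.0)] * 5   # candidate: drive towards the goal
stopped = [(0.0, 0.0)] * 5    # candidate: stand still
print(mpc_cost(predict(x0, straight), goal, straight) <
      mpc_cost(predict(x0, stopped), goal, stopped))
```

An MPC controller evaluates many candidate control sequences this way each cycle, applies the first control of the cheapest one, and re-plans; lengthening the horizon improves anticipation at the price of computation time, which is exactly the accuracy-versus-time trade-off studied here.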
Abstract:
This research work deals with the problem of modeling and designing a low-level speed controller for the mobile robot PRIM. The main objective is to develop an effective educational and research tool. On one hand, the interest in using the open mobile platform PRIM lies in integrating several subjects highly related to automatic control theory in an educational context, embracing communications, signal processing, sensor fusion and hardware design, amongst others. On the other hand, the idea is to implement useful navigation strategies so that the robot can serve as a mobile multimedia information point. It is in this context, when navigation strategies are oriented to goal achievement, that a local model predictive control approach is adopted. Such studies therefore represent a very interesting control strategy for developing the future capabilities of the system. The research also includes visual information as a meaningful source for detecting obstacle position coordinates and for planning the obstacle-free trajectory that the robot should follow.
Abstract:
During the last few years the Institute of Geophysics of the University of Lausanne developed a 2D and 3D high-resolution multichannel seismic reflection acquisition system. The objective of the present work was to carry on this development while improving our knowledge of the geology under Lake Geneva, in particular by studying the configuration of the large faults affecting the Tertiary Molasse that forms the basement of most Quaternary deposits. In its 2D configuration, our system makes it possible to acquire seismic profiles with a CDP interval of 1.25 m. The fold varies from 6 to 18 depending on the number of traces and the shooting interval. Our air gun (15/15 cu. in.) provides a vertical resolution of 1.25 m and a maximum penetration of approximately 300 m below the water bottom. We acquired more than 400 km of 2D sections in the Grand Lac and the Haut Lac between October 2000 and July 2004. A 3D seismic survey off the city of Evian provided data over a surface of 442.5 m x 1450 m (0.64 km2). The ship's navigation as well as hydrophone and source positioning were carried out with differential GPS. The seismic data were processed following a conventional sequence, without applying AGC and using post-stack migration.
The interpretation of the pre-Quaternary substratum is based on seismic facies, on their relationships with the geological units adjacent to the lake, and on some borehole data. We thus obtained a map of the geological units in the Grand Lac. We defined the location of the subalpine thrust from Lausanne, on the north shore, to the Sciez Basin, on the south shore. Within the Plateau Molasse, we identified the already known Pontarlier and St. Cergue strike-slip faults as well as several previously unrecognized faults. We also mapped the faults that affect the subalpine Molasse and the plane along which the flysch is thrust over the Molasse near the lake's south shore. A new tectonic map of the Lake Geneva region could thus be drawn up. The substratum does not show faults indicating a tectonic origin for the Lake Geneva Basin. However, we suggest that the orientation of glacial erosion, and thus the shape of Lake Geneva, was influenced by the presence of faults in the pre-Quaternary basement.
The analysis of the Quaternary sediments enabled us to map the various interfaces and units that compose them. The map of the top of the pre-Quaternary basement shows channels of glacial origin, the deepest of them reaching an elevation of -200 m. The channels' slopes are directed to the north-east, opposite to the present direction of water flow. We explain this observation by the existence of artesian subglacial water circulation. The glacial sediments, whose maximum thickness reaches 150 m in the central part of the lake, record several glacial recurrences. In the Evian area, we found lenses of glacio-lacustrine sediments perched on the flank of the Lake Geneva Basin. We correlated these units with on-land borehole data and concluded that they represent the lower complex of the Evian sedimentary pile. This complex, older than 30,000 years, could be a kame deposit associated with a periglacial lake. Our 3D seismic reflection survey enables us to specify the supply direction of the detrital material in this unit, and the detail of the seismic images allowed us to establish which types of erosion affected certain units. The lacustrine sediments, whose imaged thickness exceeds 225 m and probably reaches 400 m under the Rhone delta, indicate several depositional mechanisms. At their base, a megaturbidite, about thirty metres thick on average, spreads between the Dranse mouth and the Rhone delta. Above this unit, settling of suspended particles of biological and detrital origin provides most of the sediments. In the eastern part of the lake, the detrital contribution of the Rhone builds a delta that progrades westwards, imbricating with the settled sediments. The shallow structure of the delta evolved abruptly, probably following the catastrophic Tauredunum event (563 A.D.), whose probable trace is marked by an erosive surface that we mapped. The delta geometry subsequently changed, notably with a displacement of the sublacustrine channels. In none of our seismic sections do we observe faults in the Quaternary sediments that would attest to postglacial tectonic activity of the basement.
Abstract:
Acoustic waveform inversions are an increasingly popular tool for extracting subsurface information from seismic data. They are computationally much more efficient than elastic inversions. Naturally, an inherent disadvantage is that any elastic effects present in the recorded data are ignored in acoustic inversions. We investigate the extent to which elastic effects influence seismic crosshole data. Our numerical modeling studies reveal that in the presence of high-contrast interfaces, at which P-to-S conversions occur, elastic effects can dominate the seismic sections, even for experiments involving pressure sources and pressure receivers. Comparisons of waveform inversion results using a purely acoustic algorithm on synthetic data that is either acoustic or elastic show that subsurface models comprising small low-to-medium contrast (≤30%) structures can be successfully resolved in the acoustic approximation. However, in the presence of extended high-contrast anomalous bodies, P-to-S conversions may substantially degrade the quality of the tomographic images. In particular, extended low-velocity zones are difficult to image. Likewise, relatively small low-velocity features are unresolved, even when advanced a priori information is included. One option for mitigating elastic effects is data windowing, which suppresses later-arriving seismic phases such as shear waves. Our tests of this approach found it to be inappropriate because elastic effects are also included in earlier-arriving wavetrains. Furthermore, data windowing removes later-arriving P-wave phases that may provide critical constraints on the tomograms. Finally, we investigated the extent to which acoustic inversions of elastic data are useful for time-lapse analyses of high-contrast engineered structures, for which accurate reconstruction of the subsurface structure is not as critical as imaging differential changes between sequential experiments.
Based on a realistic scenario for monitoring a radioactive waste repository, we demonstrated that acoustic inversions of elastic data yield substantial distortions of the tomograms and also unreliable information on trends in the velocity changes.
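The data-windowing idea tested above can be sketched as a simple time mute: samples after a cut-off time are zeroed, with a short cosine taper to avoid an abrupt edge. The trace, sampling interval and cut-off below are synthetic placeholders:

```python
import numpy as np

def window_trace(trace, dt, t_cut, taper=0.02):
    """Zero out samples after t_cut, with a cosine taper of length `taper` (s)."""
    t = np.arange(len(trace)) * dt
    w = np.ones_like(trace)
    ramp = (t > t_cut - taper) & (t <= t_cut)
    # Cosine ramp from 1 down to 0 across the taper interval.
    w[ramp] = 0.5 * (1 + np.cos(np.pi * (t[ramp] - (t_cut - taper)) / taper))
    w[t > t_cut] = 0.0
    return trace * w

dt = 0.001                      # 1 ms sampling
trace = np.ones(1000)           # flat dummy trace, 1 s long
muted = window_trace(trace, dt, t_cut=0.4)
print(muted[100], muted[900])
```

As the study notes, such a mute cannot separate elastic from acoustic energy cleanly: converted-wave energy contaminates the retained early window, while useful late P-wave phases are discarded with the rejected one.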
Abstract:
This paper describes the results of research on applying diverse areas of information technology to cartography. Its final result is a complete, custom geographic web information system, designed and implemented to manage archaeological information for the city of Tarragona. The goal of the platform is to present geographical and alphanumerical data in a web-focused application and to provide specific queries to explore them. Various tools, among others, have been used: the PostgreSQL database management system together with its geographical extension PostGIS, the GeoServer geographic server, GeoWebCache tile caching, the map viewer with map and satellite imagery from Google Maps, location imagery from Google Street View, and other open-source libraries. The technology was chosen based on an investigation of the project's requirements, and this choice shaped a large part of the development. Except for the Google Maps tools, which are not open source but are free, the whole design has been implemented with open-source and free tools.