941 results for open source seismic data processing packages


Abstract:

The system described herein represents the first example of a recommender system in digital ecosystems where agents negotiate services on behalf of small companies. The small companies compete not only on price or quality, but also on a wider service-by-service composition, achieved by subcontracting with other companies. The final result of these offerings depends on negotiations at the scale of millions of small companies. This scale requires new platforms for supporting digital business ecosystems, as well as related services such as OpenID, trust management, monitors, and recommenders. This is done in the Open Negotiation Environment (ONE), an open-source platform that allows agents, on behalf of small companies, to negotiate and use the ecosystem services, and that enables the development of new agent technologies. The methods and tools of cyber engineering are necessary to build Open Negotiation Environments that are stable, a basic condition for predictable and reliable business environments. Aiming to build stable digital business ecosystems by means of improved collective intelligence, we introduce a model of negotiation style dynamics from the point of view of computational ecology. This model inspires an ecosystem monitor as well as a novel negotiation style recommender. The ecosystem monitor provides hints to the negotiation style recommender to achieve greater stability of an open negotiation environment in a digital business ecosystem. The greater stability provides the small companies with higher predictability, and therefore better business results. The negotiation style recommender is implemented with a simulated annealing algorithm at constant temperature, and its impact is shown by applying it to a real case of an open negotiation environment populated by Italian companies.
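To make the last point concrete, the following is a minimal sketch of what a simulated annealing recommender run at constant temperature looks like: a Metropolis loop that proposes candidate negotiation styles and accepts them with a Boltzmann probability. The style labels, the utility values, and the score callback are purely hypothetical; the abstract does not specify them.

```python
import math
import random

def recommend_style(styles, score, temperature=0.2, n_steps=500):
    """Metropolis sampling at a fixed temperature: propose a random style and
    accept it when it improves the score, or with Boltzmann probability otherwise."""
    current = random.choice(styles)
    for _ in range(n_steps):
        candidate = random.choice(styles)
        delta = score(candidate) - score(current)
        if delta >= 0 or random.random() < math.exp(delta / temperature):
            current = candidate
    return current

# Toy example with hypothetical styles and a made-up stability/utility score.
styles = ["collaborative", "competitive", "compromising", "accommodating"]
utility = {"collaborative": 0.8, "competitive": 0.3, "compromising": 0.6, "accommodating": 0.5}
print(recommend_style(styles, lambda s: utility[s]))
```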

Abstract:

The main activity carried out by the geophysicist when interpreting seismic data, in terms of both importance and time spent, is tracking (or picking) seismic events. In practice, this activity turns out to be rather challenging, particularly when the targeted event is interrupted by discontinuities such as geological faults or exhibits lateral changes in seismic character. In recent years, several automated schemes, known as auto-trackers, have been developed to assist the interpreter in this tedious and time-consuming task. The automatic tracking tools available in modern interpretation software packages often employ artificial neural networks (ANNs) to identify seismic picks belonging to target events through a pattern recognition process. The ability of ANNs to track horizons across discontinuities largely depends on how reliably the data patterns characterise these horizons. While seismic attributes are commonly used to characterise the amplitude peaks forming a seismic horizon, some researchers in the field claim that inherent seismic information is lost in the attribute extraction process and advocate instead the use of raw data (amplitude samples). This paper investigates the performance of ANNs using either characterisation method, and demonstrates how the complementarity of seismic attributes and raw data can be exploited, in conjunction with other geological information, in a fuzzy inference system (FIS) to achieve enhanced auto-tracking performance.
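As an illustration of the kind of attribute/raw-data fusion a fuzzy inference system can perform, here is a deliberately small sketch that combines two ANN confidence scores (one from attributes, one from raw amplitudes) with three hand-written fuzzy rules. The membership functions, rule set, and score names are invented for the example and are not taken from the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with corners a, b, c."""
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def fuse_confidence(attr_score, raw_score):
    """Tiny Mamdani-style fusion of two ANN confidences in [0, 1]:
    IF both are high THEN accept; IF either is low THEN reject; otherwise undecided.
    Defuzzified as a weighted average of crisp rule outputs (0=reject, 0.5=undecided, 1=accept)."""
    high = lambda s: tri(s, 0.5, 1.0, 1.5)
    low = lambda s: tri(s, -0.5, 0.0, 0.5)
    w_accept = min(high(attr_score), high(raw_score))
    w_reject = max(low(attr_score), low(raw_score))
    w_undecided = 1.0 - max(w_accept, w_reject)
    weights = np.array([w_reject, w_undecided, w_accept])
    outputs = np.array([0.0, 0.5, 1.0])
    return float((weights * outputs).sum() / (weights.sum() + 1e-12))

print(fuse_confidence(0.9, 0.8))   # both ANN scores high -> high pick confidence
print(fuse_confidence(0.9, 0.2))   # raw-data score low   -> low pick confidence
```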

Abstract:

Measurements of the ionospheric E-region during total solar eclipses have been used to provide information about the evolution of the solar magnetic field and EUV and X-ray emissions from the solar corona and chromosphere. By measuring levels of ionisation during an eclipse and comparing these measurements with an estimate of the unperturbed ionisation levels (such as those made during a control day, where available) it is possible to estimate the percentage of ionising radiation being emitted by the solar corona and chromosphere. Previously unpublished data from the two eclipses presented here are particularly valuable as they provide information that supplements the data published to date. The eclipse of 23 October 1976 over Australia provides information in a data gap that would otherwise have spanned the years 1966 to 1991. The eclipse of 4 December 2002 over Southern Africa is important as it extends the published sequence of measurements. Comparing measurements from eclipses between 1932 and 2002 with the solar magnetic source flux reveals that changes in the solar EUV and X-ray flux lag the open source flux measurements by approximately 1.5 years. We suggest that this unexpected result comes about from changes to the relative size of the limb corona between eclipses, with the lag representing the time taken to populate the coronal field with plasma hot enough to emit the EUV and X-rays ionising our atmosphere.
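Estimating such a lag between two time series typically comes down to finding the shift that maximizes their cross-correlation. The sketch below shows the generic technique on synthetic data (the series names, lengths, and the three-sample delay are illustrative assumptions; this is not the paper's analysis code):

```python
import numpy as np

def best_lag(x, y, max_lag):
    """Lag (in samples) at which y best matches x; positive means y follows x."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = list(range(-max_lag, max_lag + 1))
    corr = [np.mean(x[max(0, -k):len(x) - max(0, k)] * y[max(0, k):len(y) - max(0, -k)])
            for k in lags]
    return lags[int(np.argmax(corr))]

rng = np.random.default_rng(0)
source_flux = np.convolve(rng.standard_normal(200), np.ones(5) / 5, mode="same")  # stand-in series
euv_proxy = np.roll(source_flux, 3) + 0.05 * rng.standard_normal(200)             # delayed copy plus noise
print(best_lag(source_flux, euv_proxy, max_lag=10))                               # expect 3
```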

Abstract:

Sinkholes and sudden terrain collapses occur frequently in the Fazenda Belém oil field (Potiguar Basin, Ceará State, Brazil), associated with an unconsolidated sedimentary cap covering the Jandaíra karst. This research was carried out in order to understand the mechanisms that generate these collapses. The main tool used was Ground Penetrating Radar (GPR). The work has two aspects: one concerns methodological improvements in GPR data processing, while the other concerns the geological study of the Jandaíra karst. The second aspect was strongly supported both by the analysis of outcropping karst structures (in other regions of the Potiguar Basin) and by the interpretation of radargrams of the subsurface karst in Fazenda Belém. A suitable flow for processing GPR data was designed and tested, adapted from a typical seismic data processing flow. The changes were introduced to take into account important differences between the GPR and reflection seismic methods, in particular: poor coupling between source and ground, mixed-phase wavelet, low signal-to-noise ratio, single-channel acquisition, and strong wave propagation effects, notably dispersion. High-frequency components of the GPR pulse suffer more pronounced attenuation than low-frequency components, resulting in resolution losses in the radargrams. In Fazenda Belém, a suitable GPR processing flow is especially needed because of both the very high level of aerial events and the complexity of the imaged subsurface karst structures. The key point of the processing flow was an improved correction of the attenuation effects on the GPR pulse, based on their influence on the amplitude and phase spectra of the GPR signals. In low- and moderate-loss dielectric media, the propagated signal suffers significant changes only in its amplitude spectrum; that is, the phase spectrum of the propagated signal remains practically unaltered over the usual travel-time ranges. Based on this fact, it is shown using real data that the judicious application of the well-known tools of time gain and spectral balancing can efficiently correct the attenuation effects. The proposed approach can be applied in heterogeneous media and does not require precise knowledge of the attenuation parameters of the media. As an additional benefit, the judicious application of spectral balancing promotes a partial deconvolution of the data without changing its phase; in other words, spectral balancing acts in a similar way to a zero-phase deconvolution. In GPR data, the resolution increase obtained with spectral balancing is greater than that obtained with spike and predictive deconvolution. The evolution of the Jandaíra karst in the Potiguar Basin is associated with at least three episodes of subaerial exposure of the carbonate platform during the Turonian, Santonian, and Campanian. In the Fazenda Belém region, during the mid Miocene, the Jandaíra karst was covered by continental siliciclastic sediments. These sediments partially filled the void space associated with the dissolution structures and fractures. Therefore, the development of the karst in this region was attenuated in comparison with other places in the Potiguar Basin where this karst is exposed.
In Fazenda Belém, the generation of sinkholes and terrain collapses is controlled mainly by: (i) the presence of an unconsolidated sedimentary cap which is thick enough to cover the karst completely but whose sediment volume is smaller than the available space associated with the dissolution structures in the karst; (ii) the existence of important SW-NE and NW-SE structural alignments, which promote a localized increase in hydraulic connectivity, channeling the groundwater and thus facilitating carbonate dissolution; and (iii) the existence of a hydraulic barrier to groundwater flow, associated with the Açu-4 Unit. The terrain collapse mechanism in Fazenda Belém follows this temporal evolution: meteoric water infiltrates through the unconsolidated sedimentary cap and promotes its remobilization into the void space associated with the dissolution structures in the Jandaíra Formation. This remobilization starts at the base of the sedimentary cap, where abrasion increases due to the change from laminar to turbulent flow regime as the groundwater reaches the open karst structures. The remobilized sediments progressively fill the karst voids from bottom to top, so the void space migrates continuously upwards, ultimately reaching the surface and causing the sudden terrain collapses observed. This phenomenon is particularly active during the rainy season, when the water table, normally located within the karst, may temporarily rise into the unconsolidated sedimentary cap.
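The two processing tools highlighted above, time gain and phase-preserving spectral balancing, can be sketched in a few lines. The sketch below is a simplified single-trace illustration (the gain exponent, stabilization constant, and sampling interval are assumed values; the thesis's actual flow is more elaborate):

```python
import numpy as np

def time_power_gain(trace, dt, power=1.5):
    """Multiply each sample by t**power to compensate spreading/attenuation losses
    (the exponent is an illustrative choice)."""
    t = np.arange(len(trace)) * dt
    return trace * t ** power

def spectral_balance(trace, eps=1e-3):
    """Flatten the amplitude spectrum while leaving the phase spectrum untouched,
    acting like a zero-phase deconvolution."""
    spec = np.fft.rfft(trace)
    amp = np.abs(spec)
    balanced = spec / (amp + eps * amp.max())
    return np.fft.irfft(balanced, n=len(trace))

# Illustrative use on one radargram trace with an assumed 0.8 ns sampling interval.
dt = 0.8e-9
trace = np.random.randn(1024)            # stand-in for a recorded GPR trace
processed = spectral_balance(time_power_gain(trace, dt))
```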

Abstract:

The increasing use of high-resolution shallow seismic methods for the investigation of geological, environmental, and industrial problems has driven the development of processing techniques, flows, and computational algorithms. Until recently, such data were rarely processed at all; they were interpreted essentially as acquired. In order to facilitate and improve current practice, a free and open-source graphical application called OpenSeismic was developed, based on the free Seismic Un*x package, which is widely used in the processing of conventional seismic data for hydrocarbon reservoir exploration. The data used to validate the initiative were high-resolution marine seismic data acquired by the Laboratory of Geology and Marine Geophysics and Environmental Monitoring (GGEMMA) of the Federal University of Rio Grande do Norte (UFRN) for the SISPLAT Project, in the region of the Rio Açu paleo-valley. These data were submitted to the processing flow developed by Gomes (2009) using the free software developed in this work, OpenSeismic, as well as another free package, Seismic Un*x, and the commercial software ProMAX; despite their peculiarities, the three produced similar results.
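For readers unfamiliar with Seismic Un*x, the kind of command pipeline that a graphical front end such as OpenSeismic drives looks like the sketch below, here launched from Python. The input file name, filter corners, and AGC window are hypothetical choices, not the Gomes (2009) flow.

```python
import subprocess

# Band-pass filter, AGC gain, and on-screen display of an SU data file.
pipeline = (
    "sufilter f=20,40,400,500 amps=0,1,1,0 < sisplat_line.su "
    "| sugain agc=1 wagc=0.05 "
    "| suximage perc=95 title='filtered + AGC'"
)
subprocess.run(pipeline, shell=True, check=True)
```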

Abstract:

In order to evaluate the use of the shallow seismic technique to delineate geological and geotechnical features up to 40 meters deep in noisy urban areas covered with asphalt pavement, five survey lines were conducted in the metropolitan area of São Paulo City. The data were acquired using a 24-bit, 24-channel seismograph, 30 and 100 Hz geophones, and a sledgehammer-plate system as the seismic source. Seismic reflection data were recorded using a CMP (common midpoint) acquisition method. The processing routine consisted of: prestack band-pass filtering (90-250 Hz); automatic gain control (AGC); muting (digital zeroing) of dead/noisy traces, ground roll, air wave and refracted wave; CMP sorting; velocity analysis; normal moveout corrections; residual static corrections; f-k filtering; and CMP stacking. The near surface is geologically characterized by unconsolidated fill materials and Quaternary sediments with organic material overlying Tertiary sediments, with the water table 2 to 5 m below the surface. The basement is composed of granite and gneiss. Reflections were observed from 40 to 65 ms two-way traveltime and were related to the contact between the silty clay and fine sand layers of the Tertiary sediments and to the weathered basement. The CMP seismic-reflection technique has been shown to be useful for mapping the sedimentary layers and the bedrock of the São Paulo sedimentary basin for shallow investigations related to engineering problems. In spite of the strong cultural noise observed in these urban areas and the problems with planting geophones, we verified that, with the proper equipment, suitable field parameters and, particularly, great care in data collection and processing, we can overcome the adverse field conditions and image reflections from layers as shallow as 20 meters.
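The normal moveout step in the routine above amounts to flattening reflection hyperbolas before stacking. A minimal single-velocity sketch follows; the sampling interval, offsets, and NMO velocity are assumed illustration values, not the survey parameters:

```python
import numpy as np

def nmo_correct(trace, dt, offset, v_nmo):
    """Flatten a reflection hyperbola on one trace: for each zero-offset time t0,
    take the amplitude at t(x) = sqrt(t0**2 + (offset / v_nmo)**2), linearly interpolated."""
    t0 = np.arange(len(trace)) * dt
    t_x = np.sqrt(t0 ** 2 + (offset / v_nmo) ** 2)
    return np.interp(t_x, t0, trace, left=0.0, right=0.0)

# Illustrative values only: 0.25 ms sampling, 24 channels, 2 m group interval, 1800 m/s.
dt, v = 0.00025, 1800.0
cmp_gather = np.random.randn(24, 2000)              # stand-in for a CMP gather
offsets = 2.0 * (np.arange(24) + 1)
corrected = np.array([nmo_correct(tr, dt, x, v) for tr, x in zip(cmp_gather, offsets)])
stack = corrected.mean(axis=0)                       # CMP stack of the corrected gather
```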

Abstract:

Common-reflection-surface stacking (CRS stacking) is a new seismic processing method for simulating zero-offset (ZO) and common-offset (CO) sections. The method is based on a second-order hyperbolic paraxial approximation of the reflection traveltimes in the vicinity of a central ray. For ZO section simulation the central ray is a normal ray, whereas for CO section simulation the central ray is a finite-offset ray. In addition to the ZO section, CRS stacking also provides estimates of the kinematic wavefield attributes, which are used, for example, in the determination (by an inversion process) of interval velocities, the calculation of geometrical spreading, the estimation of the Fresnel zone, and the simulation of diffraction traveltime events, the latter being of great importance for prestack migration. In this work a new strategy for prestack depth migration is proposed that uses the kinematic wavefield attributes derived from CRS stacking, known as the CRS-PSDM (CRS-based prestack depth migration) method. The CRS-PSDM method uses the results of the CRS method, i.e., the sections of the kinematic wavefield attributes, to build a stacking traveltime surface along which the amplitudes of the multi-coverage seismic data are summed, the result of the summation being assigned to a given depth point in the target migration zone, which is defined on a regular grid. Similarly to the conventional Kirchhoff-type migration method (K-PSDM), the CRS-PSDM method requires a migration velocity model. In contrast to K-PSDM, CRS-PSDM only needs to compute zero-offset traveltimes, i.e., along a single ray connecting the considered depth point to a given coincident source-receiver position at the surface. The final result of this procedure is a depth-domain seismic image of the reflectors obtained from the multi-coverage data.
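For reference, a commonly used 2D form of the second-order hyperbolic paraxial traveltime approximation underlying zero-offset CRS stacking is (in the standard CRS notation of the literature, which may differ from the thesis):

\[
t^{2}(x_m, h) = \left[t_0 + \frac{2\sin\beta}{v_0}\,(x_m - x_0)\right]^{2}
+ \frac{2\,t_0\cos^{2}\beta}{v_0}\left[K_{N}\,(x_m - x_0)^{2} + K_{\mathrm{NIP}}\,h^{2}\right],
\]

where x_m and h are the midpoint and half-offset coordinates, x_0 is the central point, t_0 the zero-offset traveltime, v_0 the near-surface velocity, β the emergence angle of the normal ray, and K_N and K_NIP the curvatures of the normal and normal-incidence-point wavefronts; β, K_N and K_NIP are the kinematic wavefield attributes estimated by the stack.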

Abstract:

Artificial neural networks have proven to be a powerful technique for solving a wide variety of optimization problems. In this dissertation a new recurrent neural network, without self-feedback loops and without hidden neurons, is developed for seismic signal processing, providing the time position, the polarity, and the estimated amplitudes of the seismic reflectors, represented by their reflection coefficients. The main characteristic of this new neural network is the type of activation function used, which allows three possible states for each neuron. The goal is to estimate the positions of the seismic reflectors and to reproduce their true polarities. The basic idea of this new type of network, here called a discrete neural network (DNN), is to relate an objective function that describes the geophysical problem to the Lyapunov function that describes the dynamics of the neural network. In this way, the network dynamics lead to a local minimization of its Lyapunov function and, consequently, to a minimization of the objective function. Thus, with a convenient encoding of the network output signal, a solution of the geophysical problem is obtained. The operational evaluation of this network architecture is carried out on synthetic data generated with the simple convolutional model and with ray theory, in order to characterize the behaviour of the network with noise-contaminated data and with minimum-, maximum-, and mixed-phase source pulses.
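The network itself is not specified in the abstract, but the underlying idea, minimizing a convolutional misfit over a ternary state space so that the Lyapunov/objective function can only decrease, can be sketched with a simple coordinate-wise descent. Everything below (wavelet, trace length, noise level, update rule) is an illustrative assumption, not the DNN architecture itself:

```python
import numpy as np

def ricker(f, dt, n):
    """Zero-phase Ricker wavelet used as an illustrative source pulse."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

def misfit(d, w, r):
    """Convolutional-model misfit between the data d and the reflectivity r."""
    return 0.5 * np.sum((d - np.convolve(r, w, mode="same")) ** 2)

def discrete_descent(d, w, n_sweeps=10):
    """Coordinate-wise energy descent over the ternary states {-1, 0, +1}."""
    r = np.zeros(len(d))
    for _ in range(n_sweeps):
        for i in range(len(r)):
            best_state, best_e = r[i], None
            for s in (-1.0, 0.0, 1.0):
                r[i] = s
                e = misfit(d, w, r)
                if best_e is None or e < best_e:
                    best_state, best_e = s, e
            r[i] = best_state
    return r

# Synthetic example: two unit reflectors of opposite polarity plus noise.
dt = 0.004
w = ricker(30.0, dt, 41)
true_r = np.zeros(128)
true_r[40], true_r[90] = 1.0, -1.0
data = np.convolve(true_r, w, mode="same") + 0.05 * np.random.randn(128)
est_r = discrete_descent(data, w)
print("spike positions:", np.nonzero(est_r)[0], "polarities:", est_r[np.nonzero(est_r)])
```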

Abstract:

This project aims to develop methods for data classification in a Data Warehouse for decision-making purposes. A second goal is the reduction of an attribute set in a Data Warehouse, such that the reduced set keeps the same properties as the original one. Once a reduced set is obtained, the computational cost of processing is lower, attributes that are irrelevant to certain kinds of situations can be identified, and patterns can be recognized in the database that support decision making. To achieve these objectives, the Rough Sets algorithm will be implemented. We chose PostgreSQL as the database management system because of its efficiency, its maturity and, finally, because it is an open-source (freely distributed) system.
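The rough-set criterion for accepting a reduced attribute set can be sketched compactly: a candidate subset is acceptable when it yields the same dependency degree (size of the positive region) as the full condition set. The toy decision table and attribute names below are invented for illustration and are not the project's warehouse schema; in this toy case the candidate subset loses information, so the check returns False:

```python
from collections import defaultdict

def partition(rows, attrs):
    """Group row indices by their values on the given attributes (indiscernibility classes)."""
    blocks = defaultdict(list)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return list(blocks.values())

def dependency(rows, cond_attrs, dec_attr):
    """Fraction of rows in the positive region: blocks of the condition partition
    whose members all share the same decision value."""
    pos = 0
    for block in partition(rows, cond_attrs):
        if len({rows[i][dec_attr] for i in block}) == 1:
            pos += len(block)
    return pos / len(rows)

# Toy decision table: {"outlook", "humidity"} is a candidate reduct only if it
# preserves the dependency degree of the full condition attribute set.
rows = [
    {"outlook": "sun", "humidity": "high", "wind": "weak", "play": "no"},
    {"outlook": "sun", "humidity": "normal", "wind": "weak", "play": "yes"},
    {"outlook": "rain", "humidity": "high", "wind": "strong", "play": "no"},
    {"outlook": "rain", "humidity": "high", "wind": "weak", "play": "yes"},
]
full = dependency(rows, ["outlook", "humidity", "wind"], "play")
cand = dependency(rows, ["outlook", "humidity"], "play")
print(full, cand, cand == full)
```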

Abstract:

We present a non-linear technique to invert strong-motion records with the aim of obtaining the final slip and rupture velocity distributions on the fault plane. In this thesis, the ground motion simulation is obtained by evaluating the representation integral in the frequency domain. The Green's tractions are computed using the discrete wavenumber integration technique, which provides the full wavefield in a 1D layered propagation medium. The representation integral is computed through a finite element technique based on a Delaunay triangulation of the fault plane. The rupture velocity is defined on a coarser regular grid, and rupture times are computed by integration of the eikonal equation. For the inversion, the slip distribution is parameterized by 2D overlapping Gaussian functions, which make it easy to relate the spectrum of the possible solutions to the minimum resolvable wavelength determined by the source-station distribution and the data processing. The inverse problem is solved by a two-step procedure aimed at separating the computation of the rupture velocity from the evaluation of the slip distribution, the latter being a linear problem when the rupture velocity is fixed. The non-linear step is solved by optimization of an L2 misfit function between synthetic and real seismograms, and the solution is searched for using the Neighbourhood Algorithm; the conjugate gradient method is used to solve the linear step. The developed methodology has been applied to the M7.2 Iwate-Miyagi Nairiku, Japan, earthquake. The estimated seismic moment is 2.63 × 10^26 dyne·cm, which corresponds to a moment magnitude Mw 6.9, while the mean rupture velocity is 2.0 km/s. A large slip patch extends from the hypocenter to the southern shallow part of the fault plane. A second, relatively large slip patch is found in the northern shallow part. Finally, we give a quantitative estimate of the errors associated with the parameters.
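The consistency of the quoted moment and magnitude can be verified with the standard moment-magnitude relation (Hanks & Kanamori, with the moment expressed in dyne·cm); a one-line check:

```python
import math

def moment_magnitude(m0_dyne_cm):
    """Hanks & Kanamori relation with the seismic moment in dyne*cm."""
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

print(round(moment_magnitude(2.63e26), 1))   # 6.9, matching the value quoted above
```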

Abstract:

Meditation is a self-induced and willfully initiated practice that alters the state of consciousness. The meditation practice of Zazen, like many other meditation practices, aims at disregarding intrusive thoughts while controlling body posture. It is an open monitoring meditation characterized by detached moment-to-moment awareness and reduced conceptual thinking and self-reference. Which brain areas differ in electric activity during Zazen compared to task-free resting? Since scalp electroencephalography (EEG) waveforms are reference-dependent, conclusions about the localization of active brain areas are ambiguous. Computing intracerebral source models from the scalp EEG data solves this problem. In the present study, we applied source modeling using low resolution brain electromagnetic tomography (LORETA) to 58-channel scalp EEG data recorded from 15 experienced Zen meditators during Zazen and no-task resting. Zazen compared to no-task resting showed increased alpha-1 and alpha-2 frequency activity in an exclusively right-lateralized cluster extending from prefrontal areas including the insula to parts of the somatosensory and motor cortices and temporal areas. Zazen also showed decreased alpha and beta-2 activity in the left angular gyrus and decreased beta-1 and beta-2 activity in a large bilateral posterior cluster comprising the visual cortex, the posterior cingulate cortex and the parietal cortex. The results include parts of the default mode network and suggest enhanced automatic memory and emotion processing, reduced conceptual thinking and self-reference on a less judgmental, i.e., more detached moment-to-moment basis during Zazen compared to no-task resting.
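The band-specific comparisons reported above (alpha-1, alpha-2, beta-1, beta-2) rest on estimating spectral power in fixed frequency bands. A minimal single-channel sketch using Welch's method is shown below; the sampling rate, the synthetic signal, and the exact band limits are assumptions for illustration, not the study's parameters:

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, fmin, fmax):
    """Integrated power spectral density of signal x within [fmin, fmax] Hz."""
    f, pxx = welch(x, fs=fs, nperseg=2 * int(fs))
    mask = (f >= fmin) & (f <= fmax)
    return np.trapz(pxx[mask], f[mask])

# Illustrative use on one channel of an EEG recording (stand-in data, 60 s at 250 Hz).
fs = 250.0
eeg = np.random.randn(int(60 * fs))
alpha1 = band_power(eeg, fs, 8.5, 10.0)    # assumed alpha-1 limits
alpha2 = band_power(eeg, fs, 10.5, 12.0)   # assumed alpha-2 limits
print(alpha1, alpha2)
```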

Abstract:

Clinical Research Data Quality Literature Review and Pooled Analysis. We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods used. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 to 5019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70-5019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers. Error rates for data processed with single entry in the presence of on-screen checks were comparable to those for double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis. Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language. This lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality, which builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization of data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractors' Perceptions of Factors Impacting the Accuracy of Abstracted Data. Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses. Factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess abstractors' perceptions of these factors. The Delphi process identified 9 factors that were not found in the literature, and differed from the literature by 5 factors in the top 25%. The Delphi results refuted seven factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms. Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates. Distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate the cognitive demands of medical record abstraction and the extent of external cognitive support provided by a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high in 61% of the sampled data elements, and exceedingly so in 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping or calculation. The representational analysis used here can be employed to identify data elements with high cognitive demands.

Abstract:

Camera traps have become a widely used technique for conducting biological inventories, generating a large number of database records of great interest. The main aim of this paper is to describe a new free and open-source software (FOSS) application developed to facilitate the management of camera-trap data originating from a protected Mediterranean area (SE Spain). In the last decade some other useful alternatives have been proposed, but ours focuses especially on collaborative work and on the importance of the spatial information underpinning common camera-trap studies. This FOSS application, named "Camera Trap Manager" (CTM), has been designed to expedite the processing of pictures on the .NET platform. CTM has a very intuitive user interface, automatic extraction of some image metadata (date, time, moon phase, location, temperature, and atmospheric pressure, among others), analytical capabilities (Geographical Information Systems, statistics, and charts, among others), and reporting capabilities (ESRI Shapefiles, Microsoft Excel spreadsheets, and PDF reports, among others). Using this application we have achieved much simpler management, faster analysis, and a significant reduction of costs. While we were able to classify an average of 55 pictures per hour manually, CTM has made it possible to process over 1000 photographs per hour, consequently retrieving a greater amount of data.
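The metadata extraction step that CTM automates on the .NET platform can be illustrated in a few lines; the sketch below reads the EXIF tags of a single picture with the Pillow library in Python. The file name is hypothetical, and real camera-trap workflows also parse vendor-specific fields such as temperature and moon phase.

```python
from PIL import Image
from PIL.ExifTags import TAGS

def exif_summary(path):
    """Return a dict of human-readable EXIF tags (capture date, camera make/model, ...)."""
    exif = Image.open(path).getexif()
    return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

# Hypothetical camera-trap picture.
meta = exif_summary("IMG_0001.JPG")
print(meta.get("DateTime"), meta.get("Make"), meta.get("Model"))
```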

Abstract:

The commercial data acquisition systems used for seismic exploration are usually expensive equipment. In this work, a low-cost data acquisition system (Geophonino) has been developed for recording seismic signals from a vertical geophone. The signal first passes through an instrumentation amplifier, the INA155, which is suitable for low-amplitude signals such as seismic noise, and then through an anti-aliasing filter based on the MAX7404 switched-capacitor filter. After that, the amplified and filtered signal is digitized and processed by an Arduino Due and recorded on an SD memory card. Geophonino is configured for continuous recording, with user-defined sampling frequency, amplitude gain and recording time. The complete prototype is an open-source and open-hardware system. It has been tested by comparing the recorded signals with those obtained with different commercial data recording systems and different kinds of geophones. The results show good agreement between the tested measurements, presenting Geophonino as a low-cost alternative for seismic data recording.
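One simple way to quantify the agreement between a Geophonino trace and a commercial-recorder trace of the same signal is the peak normalized cross-correlation. The sketch below assumes both traces have already been exported to plain-text files at a common sampling rate; the file names and format are hypothetical, and the paper does not state that this is the comparison metric used.

```python
import numpy as np

def peak_normalized_xcorr(a, b):
    """Peak of the normalized cross-correlation between two equal-length traces."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.correlate(a, b, mode="full").max())

geo = np.loadtxt("geophonino_trace.txt")             # hypothetical exported trace
ref = np.loadtxt("reference_recorder_trace.txt")     # hypothetical reference trace
n = min(len(geo), len(ref))
print("peak correlation:", peak_normalized_xcorr(geo[:n], ref[:n]))
```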