45 results for open source seismic data processing packages

at Repositório Institucional UNESP - Universidade Estadual Paulista "Julio de Mesquita Filho"


Relevance:

100.00%

Publisher:

Abstract:

Technology is developing rapidly, but some components present in computers, such as their processing capacity, are reaching physical limits. Quantum computation may offer solutions to these limitations and to issues that arise from them. In the field of information security, encryption is of paramount importance, motivating the development of quantum methods in place of classical ones, given the computational power offered by quantum computing. In the quantum world, physical states can be interrelated, giving rise to the phenomenon called entanglement. This study presents both a theoretical essay on the fundamentals of quantum mechanics, computing, information, cryptography, and quantum entropy, and some simulations, implemented in the C language, of the effects of the entropy of entanglement of photons on data transmission, using the Von Neumann entropy and the Tsallis entropy.
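The two entropy measures named in the abstract can be sketched in Python (the original simulations were written in C); the Bell state, the eigenvalue-based formulas, and the choice q = 2 are illustrative assumptions, not the study's actual parameters:

```python
import math

def von_neumann_entropy(eigvals):
    """Von Neumann entropy S = -sum p*log2(p) of a density matrix,
    given its eigenvalues (result in bits)."""
    return -sum(p * math.log2(p) for p in eigvals if p > 0)

def tsallis_entropy(eigvals, q):
    """Tsallis entropy S_q = (1 - sum p**q) / (q - 1), for q != 1."""
    return (1.0 - sum(p ** q for p in eigvals)) / (q - 1.0)

# The reduced density matrix of one photon of the Bell state
# |phi+> = (|00> + |11>)/sqrt(2) is diagonal with eigenvalues 1/2, 1/2.
bell = [0.5, 0.5]
print(von_neumann_entropy(bell))     # 1.0 bit: maximal entanglement
print(tsallis_entropy(bell, q=2.0))  # 0.5
```

A separable state (eigenvalues [1.0]) gives zero for both measures, which is what makes these entropies usable as entanglement indicators.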

Relevance:

100.00%

Publisher:

Abstract:

Graduate Program in Computer Science (Pós-graduação em Ciência da Computação) - IBILCE

Relevance:

100.00%

Publisher:

Abstract:

The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data. © 2010 IOP Publishing Ltd and SISSA.

Relevance:

100.00%

Publisher:

Abstract:

The applications of the Finite Element Method (FEM) for three-dimensional domains are already well documented in the framework of Computational Electromagnetics. However, despite the power and reliability of this technique for solving partial differential equations, there are only a few open source codes available that are dedicated to solid modeling and automatic constrained tetrahedralization, which are the most time-consuming steps in a typical three-dimensional FEM simulation. Moreover, these open source codes are usually developed separately by distinct software teams, sometimes under conflicting specifications. In this paper, we describe an experiment of open source code integration for solid modeling and automatic mesh generation. The integration strategy and techniques are discussed, and examples and performance results are given, especially for complicated and irregular volumes that are not simply connected. © 2011 IEEE.
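A central concern when integrating tetrahedral mesh generators is element quality. A minimal sketch of one common normalized shape measure (1 for a regular tetrahedron, near 0 for degenerate "slivers"); the metric choice and test points are illustrative, not taken from the paper:

```python
import math

def tet_volume(p0, p1, p2, p3):
    """Signed volume of a tetrahedron via the scalar triple product."""
    a = [p1[i] - p0[i] for i in range(3)]
    b = [p2[i] - p0[i] for i in range(3)]
    c = [p3[i] - p0[i] for i in range(3)]
    det = (a[0] * (b[1] * c[2] - b[2] * c[1])
         - a[1] * (b[0] * c[2] - b[2] * c[0])
         + a[2] * (b[0] * c[1] - b[1] * c[0]))
    return det / 6.0

def tet_quality(p0, p1, p2, p3):
    """Normalized quality q = 6*sqrt(2)*|V| / l_rms^3 in (0, 1],
    where l_rms is the RMS edge length; equals 1 for a regular tet."""
    pts = (p0, p1, p2, p3)
    edges = [(i, j) for i in range(4) for j in range(i + 1, 4)]
    l2 = [sum((pts[i][k] - pts[j][k]) ** 2 for k in range(3))
          for i, j in edges]
    l_rms = math.sqrt(sum(l2) / 6.0)
    return 6.0 * math.sqrt(2.0) * abs(tet_volume(*pts)) / l_rms ** 3

# Regular tetrahedron (all edges sqrt(2)) -> quality 1.0
reg = [(0, 0, 0), (1, 1, 0), (1, 0, 1), (0, 1, 1)]
print(round(tet_quality(*reg), 6))
```

Mesh generators typically reject or refine elements whose quality falls below a threshold, which is why constrained tetrahedralization dominates the run time.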

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, L1 SBAS signals can be used in combined GPS+SBAS data processing. However, such a situation restricts studies to short baselines. Besides increasing satellite availability, the SBAS satellite orbit configuration is different from that of GPS. In order to analyze how these characteristics can impact GPS positioning in the southeast of Brazil, experiments involving GPS-only and combined GPS+SBAS data were performed. Solutions using single point and relative positioning were computed to show the impact on satellite geometry, positioning accuracy, and short-baseline ambiguity resolution. Results showed that the inclusion of SBAS satellites can improve positioning accuracy. Nevertheless, the poor quality of the data broadcast by these satellites limits their usage. © Springer-Verlag Berlin Heidelberg 2012.
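The satellite-geometry effect studied here is usually quantified with dilution of precision (DOP). A minimal sketch, assuming an east-north-up design matrix and made-up azimuth/elevation values (the extra SBAS slot is hypothetical):

```python
import math

def mat_inverse(m):
    """Invert a small square matrix by Gauss-Jordan elimination."""
    n = len(m)
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
           for i, row in enumerate(m)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

def pdop(sats):
    """PDOP from (azimuth, elevation) pairs in degrees: square root of
    the trace of the position block of (G^T G)^-1."""
    G = []
    for az_deg, el_deg in sats:
        az, el = math.radians(az_deg), math.radians(el_deg)
        G.append([math.cos(el) * math.sin(az),   # east
                  math.cos(el) * math.cos(az),   # north
                  math.sin(el),                  # up
                  1.0])                          # receiver clock term
    N = [[sum(g[i] * g[j] for g in G) for j in range(4)] for i in range(4)]
    Q = mat_inverse(N)
    return math.sqrt(Q[0][0] + Q[1][1] + Q[2][2])

gps_only = [(30, 60), (150, 45), (270, 30), (200, 75)]
with_sbas = gps_only + [(350, 25)]   # hypothetical geostationary SBAS slot
print(pdop(gps_only) >= pdop(with_sbas))  # adding a satellite never worsens DOP
```

Adding a row to G makes the normal matrix larger in the positive-semidefinite order, so PDOP can only stay the same or improve; whether that improvement translates into accuracy depends on the signal quality, which is the abstract's caveat.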

Relevance:

100.00%

Publisher:

Abstract:

Summary: The objective of this work was to evaluate the sperm motility of 13 Steindachneridion parahybae males using open-source software (the CASA plugin for ImageJ). The sperm activation procedure and image capture were initiated after semen collection. Four experimental phases were defined from the videos captured for each male, as follows: (i) standardization of the dialogue box generated by the CASA plugin within ImageJ; (ii) the number of frames used to perform the analysis; (iii) post-activation motility between 10 and 20 s, with analysis every 1 s; and (iv) post-activation motility between 10 and 50 s, with analysis every 10 s. The settings used in the CASA dialogue box were satisfactory, and the results were consistent. These analyses should be performed using 50 frames immediately after sperm activation because spermatozoa quickly lose their vigor. At 10 s post-activation, 89.1% motile sperm were observed, with a curvilinear velocity of 107.2 μm s-1, an average path velocity of 83.6 μm s-1, a straight-line velocity of 77.1 μm s-1, 91.6% straightness, and 77.1% wobble. The CASA plugin within ImageJ can be applied to sperm analysis of the study species using the established settings. © 2013 Blackwell Verlag GmbH.
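The kinematic parameters reported above follow standard CASA definitions (VCL along the raw track, VAP along a smoothed "average path", VSL along the straight line, STR = VSL/VAP, WOB = VAP/VCL). A minimal sketch with a synthetic track; the smoothing window is an assumption, not the plugin's actual setting:

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def path_length(pts):
    return sum(dist(a, b) for a, b in zip(pts, pts[1:]))

def smooth(pts, w=5):
    """Moving-average smoothing of the track (approximates the CASA
    'average path'; the window size w is an assumption)."""
    out = []
    for i in range(len(pts)):
        lo, hi = max(0, i - w // 2), min(len(pts), i + w // 2 + 1)
        out.append((sum(p[0] for p in pts[lo:hi]) / (hi - lo),
                    sum(p[1] for p in pts[lo:hi]) / (hi - lo)))
    return out

def casa_metrics(track, duration_s):
    """VCL, VSL, VAP (distance units per second) and the STR, WOB
    ratios from a list of (x, y) positions."""
    vcl = path_length(track) / duration_s
    vsl = dist(track[0], track[-1]) / duration_s
    vap = path_length(smooth(track)) / duration_s
    return {"VCL": vcl, "VSL": vsl, "VAP": vap,
            "STR": vsl / vap, "WOB": vap / vcl}

# Synthetic zig-zag track, 50 frames over one second
track = [(i, 2.0 * (i % 2)) for i in range(50)]
m = casa_metrics(track, duration_s=1.0)
print(m["VCL"] > m["VAP"] > m["VSL"])  # expected ordering for a wobbly path
```

The ordering VCL > VAP > VSL mirrors the abstract's 107.2 > 83.6 > 77.1 μm s-1, which is the signature of a curvilinear, oscillating trajectory.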

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

The Finite Element Method (FEM) is a numerical solution technique applied in different areas, such as simulations used in studies to improve cardiac ablation procedures. For this purpose, the meshes should have the same size and histological features as the structures of interest. Some methods and tools used to generate tetrahedral meshes are limited mainly by their conditions of use. In this paper, the integration of open source software is presented as an alternative for solid modeling and automatic mesh generation. To demonstrate its efficiency, cardiac structures were considered as a first application context: atria, ventricles, valves, arteries, and the pericardium. The proposed method is able to produce refined meshes in an acceptable time and with the quality required for simulations using the FEM.

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

In order to evaluate the use of the shallow seismic technique to delineate geological and geotechnical features up to 40 meters deep in noisy urban areas covered with asphalt pavement, five survey lines were conducted in the metropolitan area of São Paulo City. The data were acquired using a 24-bit, 24-channel seismograph, 30 and 100 Hz geophones, and a sledgehammer-plate system as the seismic source. Seismic reflection data were recorded using the CMP (common midpoint) acquisition method. The processing routine consisted of: prestack band-pass filtering (90-250 Hz); automatic gain control (AGC); muting (digital zeroing) of dead/noisy traces, ground roll, air waves and refracted waves; CMP sorting; velocity analyses; normal move-out corrections; residual static corrections; f-k filtering; and CMP stacking. The near surface is geologically characterized by unconsolidated fill materials and Quaternary sediments with organic material overlying Tertiary sediments, with the water table 2 to 5 m below the surface. The basement is composed of granite and gneiss. Reflections were observed from 40 to 65 ms two-way traveltime and were related to the contact between the silty clay and fine sand layers of the Tertiary sediments and to the weathered basement. The CMP seismic-reflection technique has been shown to be useful for mapping the sedimentary layers and the bedrock of the São Paulo sedimentary basin for shallow investigations related to engineering problems. In spite of the strong cultural noise observed in these urban areas and problems with planting geophones, we verified that, with the proper equipment, field parameters, and particularly great care in data collection and processing, we can overcome the adverse field conditions and image reflections from layers as shallow as 20 meters.
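One step of the routine above, automatic gain control, can be sketched as follows; the window length and the synthetic trace are illustrative assumptions, not the survey's actual parameters:

```python
import math

def agc(trace, window=50, eps=1e-12):
    """Automatic gain control: scale each sample by the inverse RMS
    amplitude in a centred sliding window, balancing weak late
    arrivals against strong early ones."""
    n = len(trace)
    out = []
    for i in range(n):
        lo, hi = max(0, i - window // 2), min(n, i + window // 2 + 1)
        rms = math.sqrt(sum(s * s for s in trace[lo:hi]) / (hi - lo))
        out.append(trace[i] / (rms + eps))
    return out

# Synthetic trace: a strong early wavelet followed by a weak late one
trace = [0.0] * 200
for k in range(10):
    trace[20 + k] = 10.0 * math.sin(2 * math.pi * k / 10)  # strong arrival
    trace[150 + k] = 0.1 * math.sin(2 * math.pi * k / 10)  # weak arrival
balanced = agc(trace)
# After AGC the two arrivals have comparable peak amplitude
print(max(abs(s) for s in balanced[140:170]) /
      max(abs(s) for s in balanced[10:40]))
```

Because AGC destroys true relative amplitudes, it is applied here for display and picking rather than for amplitude-versus-offset analysis.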

Relevance:

100.00%

Publisher:

Abstract:

This project aims to develop methods for data classification in a Data Warehouse for decision-making purposes. Another goal is the reduction of an attribute set in a Data Warehouse, such that the reduced set keeps the same properties as the original one. Once we obtain a reduced set, the computational cost of processing is smaller, we are able to identify attributes that are not relevant to certain kinds of situations, and we are able to recognize patterns in the database that help us make decisions. To achieve these objectives, the Rough Sets algorithm will be implemented. We chose PostgreSQL as our database management system because it is efficient, well established, and an open-source system (free distribution).
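The core rough-set idea, checking whether a reduced attribute set discerns the decision as well as the full set, can be sketched in plain Python; the toy decision table and attribute names are hypothetical:

```python
def partition(rows, attrs):
    """Group row indices into indiscernibility classes: rows with
    equal values on every attribute in `attrs` fall together."""
    classes = {}
    for i, row in enumerate(rows):
        classes.setdefault(tuple(row[a] for a in attrs), []).append(i)
    return sorted(classes.values())

def is_reduct_candidate(rows, attrs, decision):
    """True if every indiscernibility class over `attrs` is pure with
    respect to the decision attribute, i.e. the reduced set preserves
    the classification of the full table."""
    return all(len({rows[i][decision] for i in cls}) == 1
               for cls in partition(rows, attrs))

# Toy decision table (hypothetical warehouse records)
rows = [
    {"region": "S", "season": "dry", "volume": "high", "buy": "yes"},
    {"region": "S", "season": "wet", "volume": "high", "buy": "yes"},
    {"region": "N", "season": "dry", "volume": "low",  "buy": "no"},
    {"region": "N", "season": "wet", "volume": "low",  "buy": "no"},
]
print(is_reduct_candidate(rows, ["volume"], "buy"))  # True: 'volume' suffices
print(is_reduct_candidate(rows, ["season"], "buy"))  # False: 'season' does not
```

In a real system the same purity test would be pushed into SQL GROUP BY queries against PostgreSQL rather than run over in-memory dictionaries.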

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

Until mid 2006, SCIAMACHY data processors for the operational retrieval of nitrogen dioxide (NO2) column data were based on the historical version 2 of the GOME Data Processor (GDP). On top of known problems inherent to GDP 2, ground-based validations of SCIAMACHY NO2 data revealed issues specific to SCIAMACHY, like a large cloud-dependent offset occurring at Northern latitudes. In 2006, the GDOAS prototype algorithm of the improved GDP version 4 was transferred to the off-line SCIAMACHY Ground Processor (SGP) version 3.0. In parallel, the calibration of SCIAMACHY radiometric data was upgraded. Before operational switch-on of SGP 3.0 and public release of upgraded SCIAMACHY NO2 data, we have investigated the accuracy of the algorithm transfer: (a) by checking the consistency of SGP 3.0 with prototype algorithms; and (b) by comparing SGP 3.0 NO2 data with ground-based observations reported by the WMO/GAW NDACC network of UV-visible DOAS/SAOZ spectrometers. This delta-validation study concludes that SGP 3.0 is a significant improvement with respect to the previous processor IPF 5.04. For three particular SCIAMACHY states, the study reveals unexplained features in the slant columns and air mass factors, although the quantitative impact on SGP 3.0 vertical columns is not significant.
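The slant columns and air mass factors mentioned above are linked by the standard DOAS conversion VCD = SCD / AMF. A minimal illustration with hypothetical NO2 values (not from the validation study):

```python
def vertical_column(slant_column, amf):
    """DOAS retrieval step: vertical column density equals the slant
    column density divided by the air mass factor (VCD = SCD / AMF)."""
    return slant_column / amf

# Hypothetical NO2 values in molecules/cm^2; AMF is dimensionless
scd = 6.0e15
amf = 2.5
print(vertical_column(scd, amf))  # 2.4e+15
```

This is why the study can report anomalies in slant columns and air mass factors separately while still finding the resulting vertical columns unaffected: errors in SCD and AMF can partially cancel in the quotient.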

Relevance:

100.00%

Publisher:

Abstract:

In geophysics and seismology, raw data need to be processed to generate useful information that can be turned into knowledge by researchers. The number of sensors that are acquiring raw data is increasing rapidly. Without good data management systems, more time can be spent in querying and preparing datasets for analyses than in acquiring raw data. Also, a lot of good quality data acquired at great effort can be lost forever if they are not correctly stored. Local and international cooperation will probably be reduced, and a lot of data will never become scientific knowledge. For this reason, the Seismological Laboratory of the Institute of Astronomy, Geophysics and Atmospheric Sciences at the University of São Paulo (IAG-USP) has concentrated fully on its data management system. This report describes the efforts of the IAG-USP to set up a seismology data management system to facilitate local and international cooperation. © 2011 by the Istituto Nazionale di Geofisica e Vulcanologia. All rights reserved.