962 results for Data Processing


Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

This work aims to visualize meteorological information by building isosurfaces, exploiting the advantages of three-dimensional geometric models to communicate the meaning of the data in a clear and efficient way. The evolution of data processing technology makes it possible to interpret ever-growing masses of data through robust algorithms. Meteorology, in particular, can benefit from this, given the large amount of data required for analysis and statistics. The choice of algorithm and of the tools involved in this work facilitates the manipulation of the data by users from other areas. The project was developed in distinct modules, increasing its flexibility and reusability for future studies.
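A minimal sketch of the kind of isosurface pipeline described above, using VTK's Python bindings. The input file name, scalar field and isovalue are illustrative assumptions, not the project's actual data or code.

```python
# Hypothetical sketch: extract and display an isosurface from a gridded
# meteorological field with VTK. File name and isovalue are illustrative only.
import vtk

reader = vtk.vtkXMLImageDataReader()          # regular 3D grid (e.g. a temperature volume)
reader.SetFileName("temperature_volume.vti")  # assumed input file

contour = vtk.vtkContourFilter()              # builds the isosurface
contour.SetInputConnection(reader.GetOutputPort())
contour.SetValue(0, 273.15)                   # example isovalue: 0 deg C

mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(contour.GetOutputPort())

actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)

window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)

interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()
```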

Relevance:

60.00%

Publisher:

Abstract:

With the rapid growth of Web applications in various fields of knowledge, the term Web service has come into prominence; it refers to services of different origins and purposes, offered through local networks and, in some cases, also available on the Internet. Since the architecture of this type of application performs data processing on the server side, it is very attractive for running complex and slow processes, which is the case for most visualization algorithms. VTK is a library intended for visualization that offers a large variety of methods and algorithms for this purpose, but its graphics engine requires significant processing capacity. Combining these two resources can bring interesting results and contribute to performance improvements in the use of the VTK library. This combination is investigated in this project through testing and analysis of the communication overhead.
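As a rough illustration of the server-side processing idea (not the project's actual architecture), the sketch below runs a VTK contouring step behind a minimal HTTP endpoint built with Python's standard library; the endpoint, synthetic data source and returned payload are assumptions.

```python
# Hypothetical sketch: a tiny web service that runs a VTK pipeline on the server
# and returns only a lightweight JSON result to the client. A real service would
# accept uploaded data and stream geometry back.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

import vtk


def contour_cell_count(isovalue):
    """Run a contour filter on a synthetic volume and return the triangle count."""
    source = vtk.vtkRTAnalyticSource()        # built-in synthetic scalar volume
    contour = vtk.vtkContourFilter()
    contour.SetInputConnection(source.GetOutputPort())
    contour.SetValue(0, isovalue)
    contour.Update()
    return contour.GetOutput().GetNumberOfCells()


class ContourHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        isovalue = float(query.get("isovalue", ["150.0"])[0])
        body = json.dumps({"isovalue": isovalue,
                           "triangles": contour_cell_count(isovalue)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ContourHandler).serve_forever()
```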

Relevance:

60.00%

Publisher:

Abstract:

X-ray fluorescence analysis (XRF) is an important technique for the qualitative and quantitative determination of chemical components in a sample. It is based on measuring the intensity of the characteristic radiation emitted by the elements of the sample after they have been properly excited. One of the modalities of this technique is total reflection X-ray fluorescence (TXRF). In TXRF, the refraction angle of the incident beam tends to zero and the refracted beam is tangent to the sample-support interface; below a critical angle of incidence there is no refracted beam and all the incident radiation undergoes total reflection. Because the technique is applied to very small samples in a thin-film format, self-absorption effects should not be very relevant. In this study, we evaluated the feasibility of using the MCNPX (Monte Carlo N-Particle eXtended) code to simulate a measurement performed with the TXRF technique. We verified the quality of the response of a TXRF spectroscopy system using synchrotron radiation as the excitation beam for a simple setup, by retrieving the characteristic energies and the concentrations of the elements in the sample. The data processing steps after obtaining the excitation spectra were the same as in a real experiment and included obtaining the sensitivity curve for the simulated system. The difference between the theoretical and simulated values of the Kα characteristic energies for different elements was lower than 1%. The concentrations obtained for the elements of the sample had relatively high errors (between 6% and 60%), mainly due to the lack of knowledge of some realistic physical parameters of the sample, such as its density. Even so, this result does not preclude the use of the MCNPX code for this type of application.
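The quantification step mentioned above is commonly carried out by combining element sensitivities from the sensitivity curve with an internal standard; the sketch below shows that standard relation with invented numbers and is not the simulated data or the exact procedure used in this study.

```python
# Hypothetical sketch of a common TXRF quantification scheme against an internal
# standard: C_x = C_is * (N_x / N_is) * (S_is / S_x), where N are net peak
# intensities and S are relative sensitivities from the sensitivity curve.
# All numbers below are illustrative only.

def concentration(net_counts, sensitivity, counts_std, sens_std, conc_std):
    """Element concentration from its net counts and relative sensitivity."""
    return conc_std * (net_counts / counts_std) * (sens_std / sensitivity)

# Assumed internal standard and two sample elements:
C_STD = 10.0            # mg/L, known concentration of the internal standard
N_STD, S_STD = 5.0e4, 1.00

for element, counts, sens in [("Fe", 1.2e5, 0.85), ("Zn", 3.0e4, 1.10)]:
    c = concentration(counts, sens, N_STD, S_STD, C_STD)
    print(f"{element}: {c:.2f} mg/L")
```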

Relevance:

60.00%

Publisher:

Abstract:

Estimation of tropospheric gradients in GNSS data processing is a well-known technique to improve positioning (e.g. Bar-Sever et al., 1998; Chen and Herring, 1997). More recently, several authors have also focused on the estimation of such parameters for meteorological studies and demonstrated their potential benefits (e.g. Champollion et al., 2004). Today, they are routinely estimated by several global and regional GNSS analysis centres, but they are still not used for operational meteorology. This paper discusses the physical meaning of tropospheric gradients estimated from GPS observations recorded in 2011 by 13 permanent stations located on Corsica Island (a French island in the Mediterranean, west of Italy). Corsica is a particularly interesting location for such a study as it presents a significant environmental contrast between land and sea, as well as a steep topography. We estimated the Zenith Total Delay (ZTD) and tropospheric gradients using two software packages: GAMIT/GLOBK (GAMIT version 10.5) and GIPSY-OASIS II version 6.1. Our results were then compared to radiosonde observations and to the IGS final troposphere products. For all stations we found good agreement between the Zenith Wet Delay (ZWD) estimated by the two packages (the mean of the ZWD differences is 1 mm with a standard deviation of 6 mm), while the tropospheric gradients agree less well (the mean of the gradient differences is 0.1 mm with a standard deviation of 0.7 mm), despite the differences in processing strategy (double differences for GAMIT/GLOBK versus zero differences for GIPSY-OASIS). We also observe that gradient amplitudes are correlated with the seasonal behaviour of humidity: like the ZWD estimates, they are larger in summer than in winter. Their directions are stable over time but not correlated with the IWV anomaly observed by ERA-Interim. Tropospheric gradients observed at many sites point inland throughout the year. These preferred directions are almost opposite to the largest slope of the local topography as derived from the ASTER GDEM v2 world Digital Elevation Model. These first results give a physical meaning to the gradients, but the origin of such directions needs further investigation.
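For reference, tropospheric gradients are usually introduced in the slant delay model through an azimuth-dependent term such as the Chen and Herring (1997) formulation cited above; the sketch below evaluates that commonly used term for made-up values, not the Corsica estimates.

```python
# Illustrative sketch of the Chen & Herring (1997) gradient term used in GNSS
# tropospheric modelling: the slant contribution of the horizontal gradients
# (G_N, G_E) at elevation e and azimuth a. Input values below are invented.
import math

def gradient_delay(g_north_mm, g_east_mm, elevation_deg, azimuth_deg, c=0.0032):
    """Slant delay contribution (mm) of the tropospheric gradients."""
    e = math.radians(elevation_deg)
    a = math.radians(azimuth_deg)
    m_grad = 1.0 / (math.sin(e) * math.tan(e) + c)   # gradient mapping function
    return m_grad * (g_north_mm * math.cos(a) + g_east_mm * math.sin(a))

# Example: a 1 mm north gradient seen at 10 degrees elevation, looking north,
# maps to roughly 30 mm of additional slant delay.
print(gradient_delay(1.0, 0.0, 10.0, 0.0))
```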

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

The collection of prices for the basic goods basket is very important for the population: from the collection and processing of these data, the CLI (Cost of Living Index), among other indicators, is calculated, helping consumers to shop more rationally and with a clearer view of the impact of each product on their household budget, covering not only food but also cleaning and personal hygiene products. The price collection for the basic goods basket is currently carried out weekly in Botucatu, SP, using a spreadsheet. The aim of this work was to develop software that uses mobile devices in the data collection and storage phase of the basic goods survey in Botucatu, SP, eliminating the need to take notes on paper spreadsheets, increasing efficiency and accelerating data processing. The work drew on mobile technology and its development tools: the .NET Compact Framework platform and the Visual Basic .NET programming language were used in the handheld phase, making it possible to develop the system with object-oriented programming techniques and with greater speed and reliability in writing the code. An HP Pavilion dv3 personal computer and an Eten Glofish X500+ handheld computer were used. With the software complete, covering collection, data storage and processing into a report, the in loco paper spreadsheet phase was eliminated, and the whole process proved faster, more consistent, safer and more efficient, and the data became more readily available.
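To illustrate the kind of processing the collected prices feed into, the sketch below computes a simple weighted basket cost and its variation between two collection weeks; the product names, weights and prices are invented and do not reflect the actual CLI methodology used in Botucatu.

```python
# Hypothetical sketch: turn two weeks of collected prices into a weighted
# basket cost and its percentage variation (a simplified cost-of-living view).
# Products, weights and prices are illustrative only.

BASKET_WEIGHTS = {"rice": 0.30, "beans": 0.20, "soap": 0.15, "toothpaste": 0.35}

def basket_cost(prices, weights=BASKET_WEIGHTS):
    """Weighted cost of the basket for one collection week."""
    return sum(weights[item] * prices[item] for item in weights)

week1 = {"rice": 22.50, "beans": 8.90, "soap": 3.20, "toothpaste": 5.10}
week2 = {"rice": 23.10, "beans": 8.50, "soap": 3.20, "toothpaste": 5.40}

cost1, cost2 = basket_cost(week1), basket_cost(week2)
print(f"Week-to-week variation: {100.0 * (cost2 - cost1) / cost1:+.2f}%")
```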

Relevance:

60.00%

Publisher:

Abstract:

Degeneration of tendon tissue is a common cause of tendon dysfunction, with symptoms of repeated episodes of pain and a palpable increase in tendon thickness. Tendon mechanical properties are directly related to its physiological composition and to the structural organization of its interior collagen fibers, which can be altered by tendon degeneration due to overuse or injury. Thus, measuring the mechanical properties of tendon tissue may provide a quantitative measure of pain, reduced function, and tissue health. Ultrasound elasticity imaging has been developed over the last two decades and has proved to be a promising tool for tissue elasticity imaging. To date, however, well-established protocols for diagnosing early- or late-stage tendon degeneration with tendinopathy elasticity imaging do not exist. This thesis describes the re-creation of a dynamic ultrasound elasticity imaging method and the development of an ultrasound transient shear wave elasticity imaging platform for tendon and other musculoskeletal tissue imaging. An experimental mechanical stage with proper supporting systems and accurate translation stages was designed and built. A variety of high-quality tissue-mimicking phantoms were made to simulate homogeneous and heterogeneous soft tissues as well as tendon tissue. A series of data acquisition and data processing programs were developed to collect the displacement data from the phantoms and calculate the shear modulus and Young's modulus of the target. The imaging platform was found to be capable of conducting comparative measurements of the elastic parameters of the phantoms and of quantitatively mapping elasticity onto ultrasound B-mode images. This suggests the system has great potential not only to benefit individuals with tendinopathy through earlier detection, intervention and better rehabilitation, but also to provide a medical tool for quantifying musculoskeletal tissue dysfunction in other regions of the body such as the shoulder, elbow and knee.
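The displacement-to-modulus step described above commonly relies on the textbook relations mu = rho * c_s^2 and, for nearly incompressible soft tissue, E ~ 3*mu; the sketch below applies those standard relations to made-up values, not data from this platform.

```python
# Hypothetical sketch of the standard elasticity relations used in shear wave
# imaging: shear modulus from shear wave speed, mu = rho * c_s**2, and
# Young's modulus E = 2*(1 + nu)*mu, roughly 3*mu for incompressible tissue.
# The wave speed and density below are illustrative.

def shear_modulus(wave_speed_m_s, density_kg_m3=1000.0):
    """Shear modulus (Pa) from the measured shear wave speed."""
    return density_kg_m3 * wave_speed_m_s ** 2

def youngs_modulus(mu_pa, poisson_ratio=0.5):
    """Young's modulus (Pa); reduces to ~3*mu for incompressible tissue."""
    return 2.0 * (1.0 + poisson_ratio) * mu_pa

c_s = 3.0                      # m/s, a plausible phantom shear wave speed
mu = shear_modulus(c_s)
print(f"mu = {mu / 1e3:.1f} kPa, E = {youngs_modulus(mu) / 1e3:.1f} kPa")
```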

Relevance:

60.00%

Publisher:

Abstract:

This paper provides a brief but comprehensive guide to creating, preparing and dissecting a 'virtual' fossil, using a worked example to demonstrate some standard data processing techniques. Computed tomography (CT) is a 3D imaging modality for producing 'virtual' models of an object on a computer. In the last decade, CT technology has greatly improved, allowing bigger and denser objects to be scanned increasingly rapidly. The technique has now reached a stage where systems can facilitate large-scale, non-destructive comparative studies of extinct fossils and their living relatives. Consequently, the main limiting factor in CT-based analyses is no longer scanning, but the hurdles of data processing (see disclaimer). The latter comprises the techniques required to convert a 3D CT volume (a stack of digital slices) into a virtual image of the fossil that can be prepared (separated) from the matrix and 'dissected' into its anatomical parts. This technique can be applied to specimens, or parts of specimens, embedded in the rock matrix that until now have been impossible to visualise. This paper presents a suggested workflow explaining the steps required, using as an example a fossil tooth of Sphenacanthus hybodoides (Egerton), a shark from the Late Carboniferous of England. The original NHMUK copyrighted CT slice stack can be downloaded for practice of the described techniques, which include segmentation, rendering, movie animation, stereo-anaglyphy, data storage and dissemination. Fragile, rare specimens and type material in university and museum collections can therefore be virtually processed for a variety of purposes, including virtual loans, website illustrations, publications and digital collections. Micro-CT and other 3D imaging techniques are increasingly utilized to facilitate data sharing among scientists and in education and outreach projects. Hence there is the potential to usher in a new era of global scientific collaboration and public communication using specimens in museum collections.
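As an illustration of the segmentation and surface-extraction steps in such a workflow (not the NHMUK dataset or the software used in the paper), the sketch below thresholds a CT slice stack and extracts a triangulated surface with scikit-image; the file pattern and the use of Otsu thresholding are assumptions.

```python
# Hypothetical sketch: segment a CT slice stack by global thresholding and
# extract a surface mesh with marching cubes. File pattern and threshold
# choice are illustrative, not those of the Sphenacanthus dataset.
import glob
import numpy as np
import imageio.v3 as iio
from skimage import filters, measure

# Load the slice stack into a 3D volume (one image per CT slice, assumed path)
slices = [iio.imread(path) for path in sorted(glob.glob("ct_slices/*.tif"))]
volume = np.stack(slices).astype(np.float32)

# Separate dense tooth material from the rock matrix with an automatic threshold
threshold = filters.threshold_otsu(volume)
verts, faces, normals, values = measure.marching_cubes(volume, level=threshold)

print(f"Surface mesh: {len(verts)} vertices, {len(faces)} triangles")
```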

Relevance:

60.00%

Publisher:

Abstract:

The project contains simulation, data processing, mapping and localization modules, developed in C++ using ROS (Robot Operating System) and PCL (Point Cloud Library). It was developed within the AVORA underwater robotics project. The vehicle and the sensor were characterized, and different sensor and mapping technologies were analyzed. The data pass through three stages: conversion to a point cloud, threshold filtering, removal of spurious points and, optionally, shape detection. These data are used to build a multi-level surface map. The other tool developed is a modified Iterative Closest Point (ICP) algorithm that takes into account the operating mode of the imaging sonar used.
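A rough Python sketch of the first two processing stages described above (intensity thresholding and statistical removal of spurious points), using NumPy as a stand-in for the project's C++/PCL implementation; the thresholds, neighbour count and synthetic data are assumptions.

```python
# Hypothetical sketch of the threshold filtering and spurious point removal
# stages: keep sonar returns above an intensity threshold, then drop points
# whose mean distance to their nearest neighbours is unusually large.
import numpy as np

def threshold_filter(points, intensities, min_intensity=0.3):
    """Keep only points whose sonar intensity exceeds the threshold."""
    return points[intensities >= min_intensity]

def remove_outliers(points, k=8, std_factor=1.0):
    """Drop points whose mean distance to the k nearest neighbours is large."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)
    dists.sort(axis=1)
    mean_knn = dists[:, 1:k + 1].mean(axis=1)        # skip the zero self-distance
    keep = mean_knn <= mean_knn.mean() + std_factor * mean_knn.std()
    return points[keep]

# Illustrative synthetic cloud: 200 plausible points plus a few spurious returns
rng = np.random.default_rng(0)
cloud = np.vstack([rng.normal(size=(200, 3)),
                   rng.normal(scale=10.0, size=(5, 3))])
intensity = rng.uniform(size=len(cloud))

filtered = remove_outliers(threshold_filter(cloud, intensity))
print(f"{len(cloud)} points in, {len(filtered)} points out")
```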