991 results for GPU - graphics processing unit
Abstract:
The Optical, Spectroscopic, and Infrared Remote Imaging System OSIRIS is the scientific camera system onboard the Rosetta spacecraft (Figure 1). The advanced high performance imaging system will be pivotal for the success of the Rosetta mission. OSIRIS will detect 67P/Churyumov-Gerasimenko from a distance of more than 10⁶ km, characterise the comet's shape, volume and rotational state, and find a suitable landing spot for Philae, the Rosetta lander. OSIRIS will observe the nucleus, its activity and surroundings down to a scale of ~2 cm px⁻¹. The observations will begin well before the onset of cometary activity and will extend over months until the comet reaches perihelion. During the rendezvous episode of the Rosetta mission, OSIRIS will provide key information about the nature of cometary nuclei and reveal the physics of cometary activity that leads to the gas and dust coma. OSIRIS comprises a high resolution Narrow Angle Camera (NAC) unit and a Wide Angle Camera (WAC) unit accompanied by three electronics boxes. The NAC is designed to obtain high resolution images of the surface of comet 67P/Churyumov-Gerasimenko through 12 discrete filters over the wavelength range 250–1000 nm at an angular resolution of 18.6 μrad px⁻¹. The WAC is optimised to provide images of the near-nucleus environment in 14 discrete filters at an angular resolution of 101 μrad px⁻¹. The two units use identical shutter, filter wheel, front door, and detector systems. They are operated by a common Data Processing Unit. The OSIRIS instrument has a total mass of 35 kg and is provided by institutes from six European countries.
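A quick back-of-the-envelope check of the scales quoted above, as a Python sketch; the small-angle pinhole model is our assumption, and only the 18.6 μrad px⁻¹ figure comes from the abstract:

```python
# Small-angle estimate: surface scale (m/px) = angular resolution (rad/px) * distance (m).
NAC_IFOV = 18.6e-6  # rad per pixel, from the abstract

def surface_scale_m_per_px(distance_m: float, ifov_rad: float = NAC_IFOV) -> float:
    """Surface scale in metres per pixel at a given spacecraft distance."""
    return ifov_rad * distance_m

# At ~1 km from the nucleus the NAC resolves about 1.9 cm per pixel,
# consistent with the ~2 cm/px observation scale quoted in the abstract.
print(f"{surface_scale_m_per_px(1_000) * 100:.1f} cm/px at 1 km")
```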
Abstract:
Implementation of a Monte Carlo simulation for the solution of population balance equations (PBEs) requires choice of initial sample number (N₀), number of replicates (M), and number of bins for probability distribution reconstruction (n). It is found that the squared Hellinger distance, H², is a useful measurement of the accuracy of Monte Carlo (MC) simulation, and can be related directly to N₀, M, and n. Asymptotic approximations of H² are deduced and tested for both one-dimensional (1-D) and 2-D PBEs with coalescence. The central processing unit (CPU) cost, C, is found to follow a power-law relationship, C = aMN₀^b, with the CPU cost index, b, indicating the weighting of N₀ in the total CPU cost. n must be chosen to balance accuracy and resolution. For fixed n, M × N₀ determines the accuracy of the MC prediction; if b > 1, then the optimal solution strategy uses multiple replicates and a small sample size. Conversely, if 0 < b < 1, one replicate and a large initial sample size are preferred. © 2015 American Institute of Chemical Engineers AIChE J, 61: 2394–2402, 2015
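For readers unfamiliar with the quantities above, a minimal sketch (our own illustration, not the paper's code) of the squared Hellinger distance for binned distributions and the power-law cost model C = aMN₀^b; the values of a and b below are illustrative assumptions:

```python
import numpy as np

def squared_hellinger(p: np.ndarray, q: np.ndarray) -> float:
    """Squared Hellinger distance between two discrete distributions
    reconstructed on the same n bins: H^2 = 1 - sum(sqrt(p_i * q_i))."""
    p = p / p.sum()
    q = q / q.sum()
    return 1.0 - float(np.sum(np.sqrt(p * q)))

def cpu_cost(M: int, N0: int, a: float, b: float) -> float:
    """Power-law CPU cost model from the abstract: C = a * M * N0**b."""
    return a * M * N0**b

# For a fixed accuracy budget (fixed M * N0) the cheaper strategy depends on b:
# b > 1 favours many replicates with small samples; 0 < b < 1 the opposite.
budget = 10**6  # target M * N0 (hypothetical)
for M in (1, 10, 100):
    N0 = budget // M
    print(M, N0, cpu_cost(M, N0, a=1e-6, b=1.5))  # cost falls as M grows when b > 1
```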
Abstract:
In the oil industry, natural gas is a vital component of the world energy supply and an important source of hydrocarbons. It is one of the cleanest, safest and most relevant of all energy sources, and helps to meet the world's growing demand for clean energy. With the growing share of natural gas in the Brazilian energy matrix, its main use has been the supply of electricity by thermal power generation. In the current production process, as in a Natural Gas Processing Unit (NGPU), natural gas undergoes various separation units aimed at producing liquefied natural gas and fuel gas. The latter should be specified to meet the requirements of the thermal machines. In the case of remote wells, absorption of the heavy components aims at bringing the gas to fuel-gas specification and is thereby an alternative for expanding the energy matrix. Currently, due to the high demand for this raw gas, research and development techniques aimed at conditioning natural gas are being studied. Conventional methods employed today, such as physical absorption, show good results. The objective of this dissertation is to evaluate the removal of heavy components of natural gas by absorption. In this research, octyl alcohol (1-octanol) was used as the absorbent. The influence of temperature (5 and 40 °C) and flowrate (25 and 50 ml/min) on the absorption process was studied. The absorption capacity, expressed by the amount absorbed, and the kinetic parameters, expressed by the mass transfer coefficient, were evaluated. As expected from the literature, it was observed that absorption of the heavy hydrocarbon fraction is favored by lowering the temperature. Moreover, both temperature and flowrate favor mass transfer (a kinetic effect). The absorption kinetics for removal of heavy components was monitored by chromatographic analysis, and the experimental results demonstrated a high percentage of recovery of heavy components. Furthermore, the use of octyl alcohol as the absorbent was found to be feasible for the requested separation process.
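As a rough illustration of the kinetics discussed above, a minimal sketch assuming a well-mixed absorber with a constant overall mass-transfer coefficient kLa; the coefficient values below are hypothetical, not the dissertation's measured ones:

```python
import numpy as np

def absorbed_fraction(t_min: np.ndarray, kLa: float) -> np.ndarray:
    """Fraction of the equilibrium loading reached at time t for a
    well-mixed absorber: C(t)/C* = 1 - exp(-kLa * t)."""
    return 1.0 - np.exp(-kLa * t_min)

# Hypothetical coefficients: a colder absorber has a higher capacity (C*),
# while a higher flowrate raises kLa (faster kinetics), as the abstract notes.
t = np.linspace(0, 60, 7)              # minutes
print(absorbed_fraction(t, kLa=0.05))  # slower transfer at low flowrate
print(absorbed_fraction(t, kLa=0.10))  # faster transfer at high flowrate
```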
Abstract:
Graphics Processing Units (GPUs) are becoming popular accelerators in modern High-Performance Computing (HPC) clusters. Installing GPUs on every node of a cluster is inefficient, resulting in high costs and power consumption as well as underutilisation of the accelerators. The research reported in this paper is motivated by the use of a few physical GPUs, providing cluster nodes with on-demand access to remote GPUs for a financial risk application. We hypothesise that sharing GPUs between several nodes, referred to as multi-tenancy, reduces the execution time and energy consumed by an application. Two data transfer modes between the CPU and the GPUs, namely concurrent and sequential, are explored. The key result from the experiments is that multi-tenancy with few physical GPUs using sequential data transfers lowers the execution time and the energy consumed, thereby improving the overall performance of the application.
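A toy fluid model (our own assumption, not the paper's experimental setup) hints at why sequential transfers can win: when k tenants share the CPU-GPU link concurrently, each gets roughly 1/k of the bandwidth, so every transfer finishes at the worst-case time, whereas sequential transfers let early tenants start computing sooner:

```python
def sequential_finish_times(sizes_gb, bw_gbs):
    """Tenants transfer one after another over the shared link;
    each finishes at the cumulative elapsed time (seconds)."""
    t, out = 0.0, []
    for s in sizes_gb:
        t += s / bw_gbs
        out.append(t)
    return out

def concurrent_finish_times(sizes_gb, bw_gbs):
    """Fluid model: k simultaneous transfers share the link equally,
    each progressing at bw/k; contention overheads are ignored."""
    k = len(sizes_gb)
    return [s / (bw_gbs / k) for s in sizes_gb]

sizes = [2.0, 2.0, 2.0, 2.0]                        # GB per tenant (hypothetical)
print(sequential_finish_times(sizes, bw_gbs=8.0))   # [0.25, 0.5, 0.75, 1.0]
print(concurrent_finish_times(sizes, bw_gbs=8.0))   # [1.0, 1.0, 1.0, 1.0]
```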
Abstract:
A major weakness among loading models for pedestrians walking on flexible structures proposed in recent years is the various uncorroborated assumptions made in their development. This applies to the spatio-temporal characteristics of pedestrian loading and the nature of multi-object interactions. To alleviate this problem, a framework for the determination of localised pedestrian forces on full-scale structures is presented using wireless attitude and heading reference systems (AHRS). An AHRS comprises a triad of tri-axial accelerometers, gyroscopes and magnetometers managed by a dedicated data processing unit, allowing motion in three-dimensional space to be reconstructed. A pedestrian loading model based on a single-point inertial measurement from an AHRS is derived and shown to perform well against benchmark data collected on an instrumented treadmill. Unlike other models, the current model does not take any predefined form, nor does it require any extrapolation as to the timing and amplitude of pedestrian loading. In order to correctly assess the influence of the moving pedestrian on the behaviour of a structure, an algorithm for tracking the point of application of the pedestrian force is developed based on data from a single AHRS attached to a foot. A set of controlled walking tests with a single pedestrian is conducted on a real footbridge for validation purposes. A remarkably good match between the measured and simulated bridge response is found, confirming the applicability of the proposed framework.
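A single-axis illustration of the single-point inertial idea, heavily simplified relative to the paper's model (which is derived from the full AHRS output); the mass and sample values below are hypothetical:

```python
import numpy as np

def vertical_force(a_z_world, mass_kg, g=9.81):
    """Vertical pedestrian force from the world-frame vertical acceleration
    of a point assumed to track the body's centre of mass:
    GRF(t) = m * (a_z(t) + g)   (Newton's second law, single-point model)."""
    return mass_kg * (np.asarray(a_z_world) + g)

# Hypothetical world-frame vertical accelerations (m/s^2) from an AHRS stream
a_z = np.array([0.4, 1.1, 0.6, -0.3, -0.9, -0.2])
print(vertical_force(a_z, mass_kg=75.0))  # N, oscillating about body weight
```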
Abstract:
The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
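A brute-force sketch of the quantity being approximated, under a first-order model with i.i.d. letters (the alphabet probabilities below are hypothetical). The exhaustive enumeration grows as |alphabet|^length, which is exactly why the paper's approximations are needed for realistic sizes:

```python
import itertools
import math

def expected_guesses(letter_probs, word_len):
    """Average number of guesses when candidate words are tried in
    decreasing order of probability (first-order language model)."""
    probs = [math.prod(c) for c in itertools.product(letter_probs, repeat=word_len)]
    probs.sort(reverse=True)
    return sum(i * p for i, p in enumerate(probs, start=1))

# Tiny 4-letter alphabet; exhaustive computation is feasible only for short words.
print(expected_guesses([0.5, 0.25, 0.15, 0.10], word_len=5))
```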
Abstract:
In this paper, we develop a fast implementation of a hyperspectral coded aperture (HYCA) algorithm on different platforms using OpenCL, an open standard for parallel programming on heterogeneous systems, which encompasses a wide variety of devices, from dense multicore systems from major manufacturers such as Intel or ARM to accelerators such as graphics processing units (GPUs), field programmable gate arrays (FPGAs), the Intel Xeon Phi and other custom devices. Our proposed implementation of HYCA significantly reduces its computational cost. Our experiments have been conducted using simulated data and reveal considerable acceleration factors. Implementations of this kind, using the same descriptive language on different architectures, are very important in order to properly assess the feasibility of using heterogeneous platforms for efficient hyperspectral image processing in real remote sensing missions.
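The HYCA kernels themselves are not reproduced here; as a minimal illustration of the portability argument, the canonical pyopencl element-wise kernel below runs unchanged on whichever device the OpenCL context selects (GPU, multicore CPU, Xeon Phi, ...):

```python
import numpy as np
import pyopencl as cl

a = np.random.rand(50_000).astype(np.float32)

ctx = cl.create_some_context()   # picks any available OpenCL device
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The same OpenCL C source compiles for every supported architecture.
prg = cl.Program(ctx, """
__kernel void square(__global const float *a, __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] * a[gid];
}
""").build()

prg.square(queue, a.shape, None, a_buf, out_buf)
out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a * a)
```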
Abstract:
This work presents a low cost architecture for the development of synchronized phasor measurement units (PMUs). The device is intended to be connected to the low voltage grid, which allows the monitoring of transmission and distribution networks. This project includes a complete PMU, with an instrumentation module for use in the low voltage network, a GPS module to provide the sync signal and time stamp for the measurements, a processing unit with the acquisition system, phasor estimation and data formatting according to the standard and, finally, a communication module for data transmission. For the development and performance evaluation of this PMU, a set of applications was developed in the LabVIEW environment with specific features that allow analyzing the behavior of the measurements and identifying the error sources of the PMU, as well as applying all the tests proposed by the standard. The first application, useful for the development of the instrumentation, consists of a function generator integrated with an oscilloscope, which allows the generation and acquisition of signals synchronously, in addition to the handling of samples. The second and main application is the test platform, capable of generating all tests specified by the synchrophasor measurement standard IEEE C37.118.1 and allowing data to be stored or the measurements to be analyzed in real time. Finally, a third application was developed to evaluate the test results and generate calibration curves to adjust the PMU. The results include all the tests proposed by the synchrophasor standard and an additional test that evaluates the impact of noise. Moreover, through two prototypes connected to the electrical installations of consumers in the same distribution circuit, monitoring records were obtained that allowed the identification of consumer loads and power quality analysis, as well as event detection at the distribution and transmission levels.
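For context, a one-cycle DFT estimator at nominal frequency is the textbook core of a PMU's phasor estimation stage; the sketch below is that standard estimator under hypothetical sampling parameters, not necessarily the one implemented in this work:

```python
import numpy as np

def estimate_phasor(samples, samples_per_cycle):
    """One-cycle DFT phasor estimator at the nominal frequency:
    X = (sqrt(2)/N) * sum_n x[n] * exp(-j*2*pi*n/N).
    Returns (RMS magnitude, phase in degrees)."""
    N = samples_per_cycle
    n = np.arange(N)
    X = np.sqrt(2) / N * np.sum(samples[:N] * np.exp(-2j * np.pi * n / N))
    return abs(X), np.degrees(np.angle(X))

# 60 Hz test signal, 16 samples/cycle (fs = 960 Hz), 311 V peak, +30 degrees
fs, f0, N = 960, 60, 16
t = np.arange(N) / fs
x = 311 * np.cos(2 * np.pi * f0 * t + np.radians(30))
print(estimate_phasor(x, N))   # ~ (219.9 V rms, 30.0 deg)
```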
Abstract:
The Solar Intensity X-ray and particle Spectrometer (SIXS) on board BepiColombo's Mercury Planetary Orbiter (MPO) will study solar energetic particles moving towards Mercury and solar X-rays on the dayside of Mercury. The SIXS instrument consists of two detector sub-systems: the X-ray detector SIXS-X and the particle detector SIXS-P. The SIXS-P sub-detector will detect solar energetic electrons and protons in a broad energy range using a particle telescope approach with five outer Si detectors around a central CsI(Tl) scintillator. The measurements made by the SIXS instrument are necessary for other instruments on board the spacecraft. SIXS data will be used to study the solar X-ray corona, solar flares, solar energetic particles, the Hermean magnetosphere, and solar eruptions. The SIXS-P detector was calibrated by comparing experimental measurement data from the instrument with Geant4 simulation data. Calibration curves were produced for the different side detectors and the core scintillator for electrons and protons, respectively. The side detector energy response was found to be linear for both electrons and protons. The core scintillator energy response to protons was found to be non-linear. The core scintillator calibration for electrons was omitted due to insufficient experimental data. The electron and proton acceptance of the SIXS-P detector was determined with Geant4 simulations. The electron and proton energy channels are clean in the main energy range of the instrument. At higher energies, protons and electrons produce a non-ideal response in the energy channels. Due to the limited bandwidth of the spacecraft's telemetry, the particle measurements made by SIXS-P have to be pre-processed in the data processing unit of the SIXS instrument. A lookup table for the pre-processing of the data was created with Geant4 simulations, and the ability of the lookup table to provide spectral information from a simulated electron event was analysed. The lookup table produces clean electron and proton channels and is able to separate protons from electrons. Based on a simulated solar energetic electron event, the incident electron spectrum cannot be determined from the channel particle counts with a standard analysis method.
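To make the lookup-table idea concrete, a small sketch in which deposited energies in a side detector and the core scintillator are binned and mapped to a telemetry channel; all bin edges and channel numbers below are hypothetical placeholders, not the flight table:

```python
import numpy as np

# Hypothetical energy-bin edges (keV) for one side detector and the core
SIDE_EDGES = np.array([20, 50, 100, 300, 1000])
CORE_EDGES = np.array([0, 100, 500, 2000, 10000])

# Hypothetical channel table: rows = side-detector bin, cols = core bin;
# -1 marks (side, core) combinations rejected as non-ideal response.
CHANNELS = np.array([
    [ 0,  1, -1, -1],
    [ 2,  3,  4, -1],
    [-1,  5,  6,  7],
    [-1, -1,  8,  9],
])

def classify(e_side_kev: float, e_core_kev: float) -> int:
    """Map one telescope event to a telemetry channel via the lookup table."""
    i = int(np.digitize(e_side_kev, SIDE_EDGES)) - 1
    j = int(np.digitize(e_core_kev, CORE_EDGES)) - 1
    if not (0 <= i < CHANNELS.shape[0] and 0 <= j < CHANNELS.shape[1]):
        return -1
    return int(CHANNELS[i, j])

print(classify(60.0, 300.0))  # -> channel 3 (hypothetical)
```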
Abstract:
The research described in this thesis was motivated by the need for a robust model capable of representing 3D data obtained with 3D sensors, which are inherently noisy. In addition, time constraints have to be considered, as these sensors are capable of providing a 3D data stream in real time. This thesis proposes the use of Self-Organizing Maps (SOMs) as a 3D representation model. In particular, we propose the use of the Growing Neural Gas (GNG) network, which has been successfully used for clustering, pattern recognition and topology representation of multi-dimensional data. Until now, Self-Organizing Maps have been computed primarily offline, and their application to 3D data has mainly focused on noise-free models, without considering time constraints. A hardware implementation is proposed that leverages the computing power of modern GPUs, taking advantage of the paradigm known as General-Purpose Computing on Graphics Processing Units (GPGPU). The proposed methods were applied to different problems and applications in the area of computer vision, such as the recognition and localization of objects, visual surveillance and 3D reconstruction.
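For orientation, a single CPU-side Growing Neural Gas adaptation step, heavily simplified: the full algorithm also ages edges, inserts units and prunes the graph, and the thesis's actual contribution, its GPGPU formulation, is not shown here. The network topology and learning rates below are illustrative:

```python
import numpy as np

def gng_adapt(units, edges, x, eps_w=0.05, eps_n=0.006):
    """One simplified GNG adaptation step: find the unit nearest to
    sample x, then move it and its topological neighbours towards x."""
    d = np.linalg.norm(units - x, axis=1)
    winner = int(np.argmin(d))
    units[winner] += eps_w * (x - units[winner])
    for nb in edges.get(winner, ()):        # neighbours of the winner
        units[nb] += eps_n * (x - units[nb])
    return winner

# Tiny 2-D example with three units and one edge (hypothetical topology)
units = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
edges = {0: [1], 1: [0], 2: []}
gng_adapt(units, edges, x=np.array([0.9, 0.1]))
print(units)  # the winner (unit 1) and its neighbour have moved towards x
```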
Abstract:
Virtual Screening (VS) methods can considerably aid clinical research by predicting how ligands interact with drug targets. Most VS methods assume a unique binding site for the target, but it has been demonstrated that diverse ligands interact with unrelated parts of the target, and many VS methods do not take this relevant fact into account. This problem is circumvented by a novel VS methodology named BINDSURF, which scans the whole protein surface to find new hotspots where ligands might potentially interact, and which is implemented on massively parallel Graphics Processing Units, allowing fast processing of large ligand databases. BINDSURF can thus be used in drug discovery, drug design and drug repurposing, and therefore helps considerably in clinical research. However, the accuracy of most VS methods is constrained by limitations in the scoring function that describes biomolecular interactions, and even nowadays these uncertainties are not completely understood. In order to solve this problem, we propose a novel approach in which neural networks are trained on databases of known active (drugs) and inactive compounds, and later used to improve VS predictions.
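A minimal sketch of the rescoring idea on stand-in data: the descriptors, labels and network size below are hypothetical, and BINDSURF's actual features and architecture are not reproduced here:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hypothetical stand-in data: rows are compound descriptor vectors,
# labels mark known actives (1) and inactives (0).
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 16))
y = (X[:, :4].sum(axis=1) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)

# The predicted activity probabilities could then re-rank / refine the
# docking scores produced by the VS stage.
print(clf.score(X_te, y_te))
```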
Abstract:
Recent years have seen massive growth in wearable technology; everything can be smart: phones, watches, glasses, shirts, etc. These technologies are prevalent in various fields, from wellness/sports/fitness to the healthcare domain. The spread of this phenomenon led the World Health Organization to define the term 'mHealth' as "medical and public health practice supported by mobile devices, such as mobile phones, patient monitoring devices, personal digital assistants, and other wireless devices". Furthermore, mHealth solutions are suitable for real-time wearable Biofeedback (BF) systems: sensors in the body area network connected to a processing unit (smartphone) and a feedback device (loudspeaker) measure human functions and return them to the user as a (bio)feedback signal. During the COVID-19 pandemic, this transformation of the healthcare system was dramatically accelerated by new clinical demands, including the need to prevent hospital surges and to assure continuity of clinical care services, allowing pervasive healthcare. Now more than ever, the integration of mHealth technologies will be the basis of this new era of clinical practice. In this scenario, the primary goal of this PhD thesis is to investigate new and innovative mHealth solutions for the assessment and rehabilitation of different neuromotor functions and diseases. For clinical assessment, there is a need to overcome the limitations of subjective clinical scales. By creating new pervasive and self-administrable mHealth solutions, this thesis investigates the possibility of employing innovative systems for objective clinical evaluation. For rehabilitation, we explored the clinical feasibility and effectiveness of mHealth systems. In particular, we developed innovative mHealth solutions with BF capability to allow tailored rehabilitation. The main goal an mHealth system should have is to improve the person's quality of life, increasing or maintaining their autonomy and independence. To this end, inclusive design principles may be crucial, next to technical and technological ones, to improve the usability of mHealth systems.
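The BF pipeline described above reduces to a simple closed loop; a skeleton sketch with hypothetical stub callables standing in for the sensor and the audio feedback:

```python
import time

def read_sensor():
    """Stub: would return, e.g., trunk tilt in degrees from a body-worn IMU."""
    return 2.0

def render_feedback(error):
    """Stub: would drive a loudspeaker, e.g. with a pitch-modulated tone."""
    print(f"feedback cue, deviation = {error:+.1f}")

def biofeedback_loop(target, tol, period_s=0.1, cycles=10):
    """Measure a human function, compare it with a target, and return the
    deviation to the user as a (bio)feedback signal."""
    for _ in range(cycles):
        error = read_sensor() - target
        if abs(error) > tol:
            render_feedback(error)
        time.sleep(period_s)

biofeedback_loop(target=0.0, tol=1.0)
```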
Abstract:
This paper discusses the qualitative comparative evaluation performed on the results of two machine translation systems with different approaches to the processing of multi-word units. It proposes a solution for overcoming the difficulties multi-word units present to machine translation by adopting a methodology that combines the lexicon-grammar approach with the OpenLogos ontology and semantico-syntactic rules. The paper also discusses the importance of qualitative evaluation metrics to correctly evaluate the performance of machine translation engines with regard to multi-word units.
Abstract:
Umbilical cord blood (UCB) is a source of hematopoietic stem cells that was initially used exclusively for the hematopoietic reconstitution of pediatric patients. Its use is now suggested for adults as well, a fact that increases the pressure to obtain units with high cellularity. Therefore, the optimization of UCB processing is a priority.