157 results for submarine pipeline


Abstract:

High-throughput DNA sequencing (HTS) instruments today can generate millions of sequencing reads in a short period of time, and this represents a serious challenge to current bioinformatics pipelines, which must process such an enormous amount of data in a fast and economical fashion. Modern graphics cards are powerful processing units that consist of hundreds of scalar processors working in parallel to render high-definition graphics in real time. It is this computational capability that we propose to harness in order to accelerate some of the time-consuming steps in analyzing data generated by HTS instruments. We have developed BarraCUDA, a novel sequence mapping software package that utilizes the parallelism of NVIDIA CUDA graphics cards to map sequencing reads to locations on a reference genome. While delivering a mapping fidelity similar to that of other mainstream programs, BarraCUDA is an order of magnitude faster in mapping throughput than its CPU counterparts. The software can also use multiple CUDA devices in parallel to further accelerate mapping throughput. BarraCUDA is designed to take advantage of GPU parallelism to accelerate the mapping of millions of sequencing reads generated by HTS instruments. By doing this, we could, at least in part, streamline the current bioinformatics pipeline so that the wider scientific community can benefit from the sequencing technology. BarraCUDA is currently available at http://seqbarracuda.sf.net
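The core task described above — locating each short read within a long reference sequence — can be illustrated with a naive exact-match mapper. This is a minimal CPU sketch of the problem BarraCUDA parallelizes on the GPU, not BarraCUDA's actual (BWT-based, mismatch-tolerant) algorithm; the reference string, reads, and function name are hypothetical.

```python
# Naive exact-match read mapping: for each read, report every position in the
# reference where it occurs. Illustrative only; real mappers use indexed
# structures (e.g. the Burrows-Wheeler transform) and allow mismatches.

def map_reads(reference: str, reads: list[str]) -> dict[str, list[int]]:
    """Return every 0-based position in `reference` where each read matches."""
    hits: dict[str, list[int]] = {}
    for read in reads:
        positions = []
        start = reference.find(read)
        while start != -1:
            positions.append(start)
            start = reference.find(read, start + 1)  # allow overlapping hits
        hits[read] = positions
    return hits

reference = "ACGTACGTGACCA"
reads = ["ACGT", "GACC", "TTTT"]
print(map_reads(reference, reads))
# {'ACGT': [0, 4], 'GACC': [8], 'TTTT': []}
```

Because each read is mapped independently, the outer loop is trivially parallel — this independence is what makes the task a good fit for the hundreds of scalar processors on a GPU.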

Abstract:

BACKGROUND: With the maturation of next-generation DNA sequencing (NGS) technologies, the throughput of DNA sequencing reads has soared to over 600 gigabases from a single instrument run. General-purpose computing on graphics processing units (GPGPU) extracts the computing power of hundreds of parallel stream processors within graphics processing cores and provides a cost-effective and energy-efficient alternative to traditional high-performance computing (HPC) clusters. In this article, we describe the implementation of BarraCUDA, a GPGPU sequence alignment software package based on BWA, to accelerate the alignment of sequencing reads generated by these instruments to a reference DNA sequence. FINDINGS: Using the NVIDIA Compute Unified Device Architecture (CUDA) software development environment, we ported the most computationally intensive alignment component of BWA to the GPU to take advantage of its massive parallelism. As a result, BarraCUDA offers an order-of-magnitude boost in alignment throughput compared to a CPU core while delivering the same level of alignment fidelity. The software can also use multiple CUDA devices in parallel to further accelerate alignment throughput. CONCLUSIONS: BarraCUDA is designed to take advantage of GPU parallelism to accelerate the alignment of millions of sequencing reads generated by NGS instruments. By doing this, we could, at least in part, streamline the current bioinformatics pipeline so that the wider scientific community can benefit from the sequencing technology. BarraCUDA is currently available from http://seqbarracuda.sf.net.
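BWA, on which BarraCUDA is based, aligns reads by backward search over the Burrows-Wheeler transform (BWT) of the reference. A toy CPU illustration of that search follows; it is a sketch of the underlying technique only, using naive O(n) occurrence counting rather than a proper sampled FM-index, and is in no way the paper's GPU code.

```python
# Toy BWT backward search: count exact occurrences of a pattern in a text
# using only the BWT string. This is the core operation BWA performs per read.

def bwt(text: str) -> str:
    """Burrows-Wheeler transform via sorted rotations ('$' terminates)."""
    text += "$"
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(rot[-1] for rot in rotations)

def backward_search(bwt_str: str, pattern: str) -> int:
    """Count occurrences of `pattern` by FM-index backward search."""
    # C[c]: number of characters in the text lexicographically smaller than c.
    sorted_bwt = sorted(bwt_str)
    first_occ = {c: sorted_bwt.index(c) for c in set(bwt_str)}

    def occ(c: str, i: int) -> int:  # occurrences of c in bwt_str[:i]
        return bwt_str[:i].count(c)

    lo, hi = 0, len(bwt_str)
    for c in reversed(pattern):      # extend the match one character at a time
        if c not in first_occ:
            return 0
        lo = first_occ[c] + occ(c, lo)
        hi = first_occ[c] + occ(c, hi)
        if lo >= hi:
            return 0
    return hi - lo

b = bwt("ACGTACGT")
print(backward_search(b, "ACGT"))  # 2
print(backward_search(b, "GTA"))   # 1
```

Each read's search touches only the shared, read-only index, so thousands of reads can be searched concurrently — the property the paper exploits when porting this component to CUDA.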

Abstract:

In situ tests in deep-water West African clays show crust-like shear strengths within the top few metres of sediment. Typical strength profiles show s_u rising from the mud-line to 10 kPa to 15 kPa before dropping back to normally consolidated strengths of 3 kPa to 4 kPa by 1.5 m to 2 m depth. A Cam-shear device is used to better understand the mechanical behaviour of undisturbed crust samples under pipelines. Extremely variable peak and residual shear strengths are observed over a range of pipeline consolidation stresses and test shear rates, with residual strengths approaching zero. ESEM imaging of undisturbed samples and wet-sieved samples from various core depths shows numerous randomly located groups of invertebrate faecal pellets. It is therefore proposed that the cause of the strength variability during shear testing, and indeed of the crust's origin, is the presence of random groups of faecal pellets within the sediment. © 2011 Taylor & Francis Group, London.
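The reported profile shape — s_u rising to a crustal peak and then falling back to normally consolidated values — can be captured as a simple piecewise-linear idealization. The break depths (0.5 m peak, recovery by 1.75 m) and the mid-range values below are illustrative assumptions consistent with the ranges quoted above, not measured data.

```python
# Idealized crustal undrained shear strength profile, s_u (kPa) vs depth (m):
# zero at the mud-line, peak of ~12.5 kPa in the crust, then ~3.5 kPa
# (normally consolidated) from ~1.75 m down. Breakpoints are assumptions.

def crust_su_profile(depth_m: float) -> float:
    """Piecewise-linear s_u (kPa) at a given depth (m) for an idealized crust."""
    profile = [(0.0, 0.0), (0.5, 12.5), (1.75, 3.5), (5.0, 3.5)]
    if depth_m <= profile[0][0]:
        return profile[0][1]
    for (z0, s0), (z1, s1) in zip(profile, profile[1:]):
        if depth_m <= z1:
            return s0 + (s1 - s0) * (depth_m - z0) / (z1 - z0)
    return profile[-1][1]

print(crust_su_profile(0.5))   # 12.5  (crustal peak)
print(crust_su_profile(3.0))   # 3.5   (normally consolidated)
```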

Abstract:

Vibration methods are used to identify faults, such as spanning and loss of cover, in long offshore pipelines. A pipeline `pig', propelled by fluid flow, generates transverse vibration in the pipeline, and the measured vibration amplitude reflects the nature of the support condition. Large quantities of vibration data are collected and analyzed by Fourier and wavelet methods.
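The Fourier step described above amounts to extracting the dominant frequency content of the measured transverse vibration, which shifts when the support condition changes (e.g. over a free span). The sketch below illustrates this on a synthetic record; the sampling rate, signal frequency, and noise level are assumptions, not values from the study.

```python
# Find the dominant frequency of a vibration record via the FFT.
# Synthetic data: a 12 Hz tone plus noise, sampled at 1 kHz for 2 s.

import numpy as np

def dominant_frequency(signal: np.ndarray, sample_rate_hz: float) -> float:
    """Return the frequency (Hz) of the largest non-DC spectral peak."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    peak = 1 + int(np.argmax(spectrum[1:]))   # skip the DC bin
    return float(freqs[peak])

rng = np.random.default_rng(0)
fs = 1000.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
vibration = np.sin(2 * np.pi * 12.0 * t) + 0.2 * rng.standard_normal(t.size)
print(dominant_frequency(vibration, fs))      # 12.0
```

Wavelet methods complement this by localizing such frequency changes in time, i.e. along the pig's position in the pipeline.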

Abstract:

The paper presents centrifuge test data on tunnelling effects on buried pipelines and compares them to predictions made using DEM simulations. The paper focuses on the examination of pipeline bending moments, their distribution along the pipe, and their development with tunnel volume loss. Centrifuge results are obtained by PIV analysis and compared to results obtained using the DEM model. The DEM model was built to replicate the centrifuge model as closely as possible and included numerical features formulated specially for this task, such as structural elements to represent the tunnel and pipeline. Results are extremely encouraging, with very small deviations between the DEM and centrifuge bending moment results. © 2010 Taylor & Francis Group, London.

Abstract:

Videogrammetry is an inexpensive and easy-to-use technology for spatial 3D scene recovery. When applied to large-scale civil infrastructure scenes, only a small percentage of the collected video frames are required to achieve robust results. However, choosing the right frames requires careful consideration. Videotaping a built infrastructure scene results in large video files filled with blurry, noisy, or redundant frames. This is due to frame-rate-to-camera-speed ratios that are often higher than necessary; camera and lens imperfections and limitations that result in imaging noise; and occasional jerky motions of the camera that result in motion blur; all of which can significantly affect the performance of the videogrammetric pipeline. To tackle these issues, this paper proposes a novel method for automating the selection of an optimized number of informative, high-quality frames. According to this method, as the first step, blurred frames are removed using thresholds determined from the minimum level of frame quality required to obtain robust results. Then, an optimum number of key frames are selected from the remaining frames using the selection criteria devised by the authors. Experimental results show that the proposed method outperforms existing methods in terms of improved 3D reconstruction results, while maintaining the optimum number of extracted frames needed to generate high-quality 3D point clouds. © 2012 Elsevier Ltd. All rights reserved.
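The two-stage idea — score frames for blur, discard those below a quality threshold, then keep a reduced set of key frames — can be sketched as follows. The variance-of-Laplacian blur score and the uniform temporal spacing rule below are common stand-ins, not the authors' specific thresholds or selection criteria.

```python
# Stage 1: reject blurred frames via a variance-of-Laplacian quality score.
# Stage 2: pick a fixed number of key frames, evenly spaced among survivors.

import numpy as np

def blur_score(gray: np.ndarray) -> float:
    """Variance of a 3x3 Laplacian response; low values indicate blur."""
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

def select_key_frames(frames, threshold: float, n_keys: int):
    """Drop blurred frames, then keep `n_keys` frames evenly spaced in time."""
    sharp = [i for i, f in enumerate(frames) if blur_score(f) >= threshold]
    if len(sharp) <= n_keys:
        return sharp
    picks = np.linspace(0, len(sharp) - 1, n_keys).round().astype(int)
    return [sharp[i] for i in picks]

rng = np.random.default_rng(1)
sharp = rng.random((8, 8))           # high-texture stand-in for a sharp frame
blurred = np.full((8, 8), 0.5)       # featureless stand-in for a blurred frame
frames = [sharp, blurred, sharp, blurred, sharp]
print(select_key_frames(frames, threshold=0.01, n_keys=2))  # [0, 4]
```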

Abstract:

Obtaining accurate confidence measures for automatic speech recognition (ASR) transcriptions is an important task which stands to benefit from the use of multiple information sources. This paper investigates the application of conditional random field (CRF) models as a principled technique for combining multiple features from such sources. A novel method for combining suitably defined features is presented, allowing for confidence annotation using lattice-based features of hypotheses other than the lattice 1-best. The resulting framework is applied to different stages of a state-of-the-art large vocabulary speech recognition pipeline, and consistent improvements are shown over a sophisticated baseline system. Copyright © 2011 ISCA.
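The feature-combination idea can be illustrated with a deliberately simpler per-word logistic model; the paper's actual method is a CRF over the hypothesis sequence, which additionally models dependencies between neighbouring words. The feature names and weights below are hypothetical.

```python
# Combine multiple per-word confidence features into a single [0, 1] score via
# a logistic model. A simplified stand-in for the paper's CRF combination.

import math

def combine_features(features: dict[str, float],
                     weights: dict[str, float], bias: float) -> float:
    """Map a weighted sum of confidence features to a [0, 1] score."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features for one word: lattice posterior, LM score, duration.
word_features = {"lattice_posterior": 0.92, "lm_score": -0.3, "duration": 0.12}
weights = {"lattice_posterior": 3.0, "lm_score": 0.5, "duration": 1.0}
print(round(combine_features(word_features, weights, bias=-1.5), 3))  # 0.774
```

In practice the weights would be trained on held-out data against binary correct/incorrect word labels; the CRF generalizes this by scoring the whole label sequence jointly.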