823 results for Pipeline


Relevance:

10.00%

Publisher:

Abstract:

Pipelines are one of the safest means of transporting crude oil, but they are not spill-free. This is of concern in North America because of the large volumes of crude oil shipped by Canadian producers and the lengthy network of pipelines. Each pipeline crosses many rivers that support a wide variety of human activities and rich aquatic life. However, there is a knowledge gap concerning the risk of contamination of river beds by oil spills. This thesis addresses this gap by focusing on the mechanisms that transport water (and contaminants) from the free surface flow to the bed sediments, and vice versa. The work focuses on gravel rivers, in which bed sediments are sufficiently permeable that pressure gradients, caused by the interaction of the flow with topographic elements (gravel bars) or by changes in direction, induce exchanges of water between the free surface flow and the bed, known as hyporheic flows. The objectives of the thesis are: to present a new method to visualize and quantify hyporheic flows in laboratory experiments; and to conduct a novel series of experiments on hyporheic flow induced by a gravel bar under different free surface flows. The new method to quantify hyporheic flows rests on injections of a solution of dye and water. The method yielded accurate flow lines and reasonable estimates of the hyporheic flow velocities. The present series of experiments was carried out in an 11 m long, 0.39 m wide, and 0.41 m deep tilting flume. The gravel had a mean particle size of 7.7 mm. Different free surface flows were imposed by changing the flume slope and flow depth. Measured hyporheic flows were turbulent. Smaller free surface flow depths resulted in stronger hyporheic flows (higher velocities and deeper dye penetration into the sediment). A significant finding is that different free surface flows (different velocities, Reynolds numbers, etc.) produce similar hyporheic flows as long as the downstream hydraulic gradients are similar. This suggests that, for a specified bar geometry, the characteristics of the hyporheic flows depend on the downstream hydraulic gradients and not, or only minimally, on the internal dynamics of the free surface flow.
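The gradient-controlled exchange described above is consistent with Darcy's law for flow through permeable sediments (only a first approximation here, since the thesis reports turbulent hyporheic flows). A minimal sketch, with all numbers illustrative rather than taken from the experiments:

```python
def darcy_velocity(K, head_drop, length):
    """Darcy (seepage) velocity q = K * dh/dx through a permeable bed.

    K: hydraulic conductivity (m/s), head_drop: head loss dh (m),
    length: flow path length dx (m).  All values used below are
    illustrative, not measurements from the thesis.
    """
    return K * head_drop / length

# Two hypothetical free-surface flows with the same downstream
# hydraulic gradient dh/dx = 0.004 yield the same seepage velocity,
# mirroring the thesis finding that the gradient, not the surface
# flow itself, controls the hyporheic flow.
q1 = darcy_velocity(K=0.05, head_drop=0.004, length=1.0)
q2 = darcy_velocity(K=0.05, head_drop=0.008, length=2.0)
```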

Relevance:

10.00%

Publisher:

Abstract:

Pipelines extend thousands of kilometers across wide geographic areas as a network that provides essential services for modern life. It is inevitable that pipelines must pass through unfavorable ground conditions, which are susceptible to natural disasters. This thesis investigates the behaviour of buried pressure pipelines experiencing ground distortions induced by normal faulting. A recent large database of physical modelling observations on buried pipes of different stiffness relative to the surrounding soil, subjected to normal faults, provided a unique opportunity to calibrate numerical tools. Three-dimensional finite element models were developed to further the understanding of the complex soil-structure interaction phenomena, especially gap formation beneath the pipe and the trench effect associated with the interaction between backfill and native soils. The benchmarked numerical tools were then used to perform parametric analyses of project geometry, backfill material, relative pipe-soil stiffness and pipe diameter. Seismic loading produces a soil displacement profile that can be characterized by i_soil, the distance between the peak curvature and the point of contraflexure. A simplified design framework based on this length scale (i.e., the Kappa method) was developed, which estimates the longitudinal bending moments of buried pipes using a characteristic length, i_pipe, the distance from peak to zero curvature. Recent studies indicated that empirical soil springs calibrated against rigid pipes are not suitable for analyzing flexible pipes, since they lead to excessive conservatism in design. A large-scale split-box normal fault simulator was therefore assembled to produce experimental data on the response of flexible PVC pipes to a normal fault. Digital image correlation (DIC) was employed to analyze the soil displacement field, and both optical fibres and conventional strain gauges were used to measure pipe strains.
A refinement to the Kappa method was introduced to enable the calculation of axial strains as a function of pipe elongation induced by flexure and an approximation of the longitudinal ground deformations. A closed-form Winkler solution of flexural response was also derived to account for the distributed normal fault pattern. Finally, these two analytical solutions were evaluated against the pipe responses observed in the large-scale laboratory tests.
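The use of a characteristic length in the Kappa method parallels the classical beam-on-Winkler-foundation result, where the flexural response decays over a length set by the pipe's bending stiffness and the soil stiffness. A hedged sketch with purely illustrative values (pipe dimensions, modulus and subgrade stiffness below are assumptions, not the thesis data):

```python
import math

def winkler_char_length(E, I, k):
    """Characteristic length 1/lambda of a beam on a Winkler foundation,
    with lambda = (k / (4 E I)) ** 0.25.

    E: pipe material modulus (Pa), I: second moment of area (m^4),
    k: modulus of subgrade reaction per unit length (Pa).
    """
    lam = (k / (4.0 * E * I)) ** 0.25
    return 1.0 / lam

# Hypothetical flexible PVC pipe: E = 3 GPa, OD = 0.26 m, wall = 8 mm.
OD, t = 0.26, 0.008
I = math.pi / 64.0 * (OD**4 - (OD - 2 * t)**4)   # hollow circular section
L_char = winkler_char_length(E=3.0e9, I=I, k=5.0e6)
```

With these assumed numbers the response length comes out below a metre, which is why flexible pipes concentrate bending near the fault.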

Relevance:

10.00%

Publisher:

Abstract:

INTRODUCTION: Acute myeloid leukemia (AML) is a heterogeneous clonal disorder often associated with dismal overall survival. The clinical diversity of AML is reflected in the range of recurrent somatic mutations in several genes, many of which have a prognostic and therapeutic value. Targeted next-generation sequencing (NGS) of these genes has the potential for translation into clinical practice. In order to assess this potential, an inter-laboratory evaluation of a commercially available AML gene panel across three diagnostic centres in the UK and Ireland was performed.

METHODS: DNA from six AML patient samples was distributed to each centre and processed using a standardised workflow, including a common sequencing platform, sequencing chips and bioinformatics pipeline. A duplicate sample in each centre was run to assess inter- and intra-laboratory performance.

RESULTS: An average sample read depth of 2725X (range 629-5600) was achieved using six samples per chip, with some variability observed in the depth of coverage generated for individual samples and between centres. A total of 16 somatic mutations were detected in the six AML samples, with a mean of 2.7 mutations per sample (range 1-4), representing nine genes on the panel. 15/16 mutations were identified by all three centres. Allelic frequencies of the mutations ranged from 5.6 to 53.3% (median 44.4%), with a high level of concordance of these frequencies between centres for the mutations detected.

CONCLUSION: In this inter-laboratory comparison, high concordance, reproducibility and robustness were demonstrated using a commercially available NGS AML gene panel and platform.
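As a small illustration of the inter-centre comparison, allele-frequency concordance can be checked with a simple tolerance rule. The 5-percentage-point tolerance and the frequencies below are illustrative assumptions, not values from the study:

```python
def max_pairwise_diff(values):
    """Largest absolute difference between any two reported values."""
    return max(values) - min(values)

def concordant(vafs, tolerance=5.0):
    """True if variant allele frequencies (%) reported by the centres
    agree within `tolerance` percentage points.  The tolerance is an
    illustrative choice, not one specified in the study."""
    return max_pairwise_diff(vafs) <= tolerance

# Hypothetical VAFs (%) for one mutation across the three centres.
ok = concordant([44.1, 44.4, 45.0])
```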

Relevance:

10.00%

Publisher:

Abstract:

This paper aims to determine the water flowrate using Time Transient and Cross-Correlation techniques. The detection system uses two NaI(Tl) detectors adequately positioned on the outside of the pipe and a gamma-ray source (82Br radiotracer). The water flowrate measurements obtained with the Time Transient and Cross-Correlation techniques were compared to invasive conventional measurements from the flowmeter previously installed in the pipeline. Discrepancies between the flowrate values from the Time Transient and Cross-Correlation techniques and the conventional ones were found to be less than 3%.
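The Cross-Correlation technique can be sketched as follows: the lag that maximises the correlation between the two detector signals gives the tracer transit time, and the flowrate follows from the detector spacing and pipe cross-section. The detector spacing, pipe radius and signal shapes below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def transit_time(upstream, downstream, dt):
    """Transit time between two detector signals via cross-correlation.

    The lag that maximises the correlation of the downstream signal
    against the upstream one estimates the tracer's travel time
    between the two NaI(Tl) detectors.
    """
    upstream = upstream - upstream.mean()
    downstream = downstream - downstream.mean()
    corr = np.correlate(downstream, upstream, mode="full")
    lag = corr.argmax() - (len(upstream) - 1)
    return lag * dt

# Synthetic example: a tracer pulse seen 0.50 s later downstream.
dt = 0.01
t = np.arange(0, 4, dt)
pulse = np.exp(-((t - 1.0) / 0.1) ** 2)      # upstream detector
delayed = np.exp(-((t - 1.5) / 0.1) ** 2)    # downstream detector
tau = transit_time(pulse, delayed, dt)
# Flowrate = spacing / transit time * cross-section
# (0.8 m spacing and 5 cm pipe radius are illustrative).
flowrate = 0.8 / tau * np.pi * 0.05**2
```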

Relevance:

10.00%

Publisher:

Abstract:

Fusobacterium necrophorum, a Gram-negative, anaerobic bacterium, is a common cause of acute pharyngitis and tonsillitis and a rare cause of more severe infections of the head and neck. At the beginning of the project, there was no available genome sequence for F. necrophorum. The aim of this project was to sequence the F. necrophorum genome and to identify and study the putative virulence factors it contains using in silico and in vitro analyses. Type strains JCM 3718 and JCM 3724, F. necrophorum subspecies necrophorum (Fnn) and funduliforme (Fnf), respectively, and strain ARU 01 (Fnf), isolated from a patient with LS, were commercially sequenced by Roche 454 GS-FLX+ next generation sequencing and assembled into contigs using Roche GS Assembler. Sequence data were annotated semi-automatically using the xBASE pipeline, BLASTp and Pfam. The F. necrophorum genome was determined to be approximately 2.1–2.3 Mb in size, with an estimated 1,950 ORFs, and includes genes for a leukotoxin, ecotin, haemolysin, haemagglutinin, haemin receptor, adhesin and type Vb and Vc secretion systems. The prevalence of the leukotoxin gene was investigated in strains JCM 3718, JCM 3724 and ARU 01, as well as in a clinical collection of 25 Fnf strains identified using biochemical and molecular tests. The leukotoxin operon was found by PCR to be universal within the strain collection. HL-60 cells subjected to aliquots of concentrated high molecular weight culture supernatant, predicted to contain the secreted leukotoxins of strains JCM 3718, JCM 3724 and ARU 01, were killed in a dose-dependent manner. The cytotoxic effect of the leukotoxin against white blood cells from human donors was also tested to validate the HL-60 assay. The differences in the results between the two assays were not statistically significant.
Ecotin, a serine protease inhibitor, was found to be present in 100 % of the strain collection and had a highly conserved sequence with primary and secondary binding sites exposed on opposing sides of the protein. During enzyme inhibition studies, a purified recombinant F. necrophorum ecotin protein inhibited human neutrophil elastase, a protease that degrades bacteria at inflammation sites, and human plasma kallikrein, a component of the host clotting cascade. The recombinant ecotin also prolonged human plasma clotting times by up to 7-fold for the extrinsic pathway, and up to 40-fold for the intrinsic pathway. The genome sequence data provides important information about F. necrophorum type strains and enables comparative study between strains and subspecies. Results from the leukotoxin and ecotin assays can be used to build up an understanding of how the organism behaves during infection.

Relevance:

10.00%

Publisher:

Abstract:

Sensor networks consist of a set of devices, each able to take measurements of a particular environment and to exchange information in order to obtain a high-level representation of the activities taking place in the area of interest. Such distributed sensing, with many devices located close to the phenomena of interest, is relevant in fields such as surveillance, agriculture, environmental observation, industrial monitoring, etc. In this thesis we propose several approaches to optimize the spatio-temporal operation of these devices, determining where to place them in the environment and how to control them over time in order to detect the moving targets of interest. The first contribution is a realistic sensing model representing the coverage of a sensor network in its environment. For this we propose a probabilistic 3D model of a sensor's detection capability in its surroundings. The model also integrates information about the environment through a visibility evaluation based on the field of view. Building on this sensing model, spatial optimization is performed by searching for the best placement and orientation of each sensor in the network. To this end, we propose a new gradient-descent-based algorithm that compared favourably with other generic "black-box" optimization methods in terms of terrain coverage, while being more computationally efficient. Once the sensors are placed in the environment, temporal optimization consists in properly covering a group of moving targets in the environment. First, the future positions of the moving targets detected by the sensors are predicted. The prediction is made either from the history of other targets that crossed the same environment (long-term prediction) or using only the previous movements of the same target (short-term prediction). We propose new algorithms in each category that perform better than, or produce results comparable to, existing methods. Once the future target locations are predicted, the sensor parameters are optimized so that the targets are properly covered for some time, according to the predictions. To this end, we propose a heuristic sensor-control method based on the probabilistic trajectory predictions of the targets and on the probabilistic coverage of the targets by the sensors. Finally, the proposed spatial and temporal optimization methods were integrated and successfully applied, demonstrating a complete and effective approach to the spatio-temporal optimization of sensor networks.
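The gradient-based placement step can be illustrated with a heavily simplified 2-D stand-in: a sensor with an isotropic Gaussian detection footprint is moved by gradient ascent to maximise coverage of a few target points. The real model in the thesis is probabilistic, 3-D and visibility-aware; everything below is illustrative:

```python
import math

def coverage(sensor, targets, sigma=1.0):
    """Total detection mass: sum of Gaussian footprints over targets
    (a much simplified stand-in for the probabilistic 3D model)."""
    return sum(math.exp(-((sensor[0] - x) ** 2 + (sensor[1] - y) ** 2)
                        / (2 * sigma**2))
               for x, y in targets)

def gradient_step(sensor, targets, lr=0.2, sigma=1.0):
    """One gradient-ascent step on the coverage objective."""
    gx = sum((x - sensor[0]) / sigma**2 *
             math.exp(-((sensor[0] - x) ** 2 + (sensor[1] - y) ** 2)
                      / (2 * sigma**2))
             for x, y in targets)
    gy = sum((y - sensor[1]) / sigma**2 *
             math.exp(-((sensor[0] - x) ** 2 + (sensor[1] - y) ** 2)
                      / (2 * sigma**2))
             for x, y in targets)
    return (sensor[0] + lr * gx, sensor[1] + lr * gy)

# Move a sensor from the origin towards a cluster of targets.
targets = [(2.0, 2.0), (2.5, 1.5), (1.5, 2.5)]
s = (0.0, 0.0)
for _ in range(300):
    s = gradient_step(s, targets)
```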

Relevance:

10.00%

Publisher:

Abstract:

Extracting information from the surrounding world is a very important goal of modern computer science, enabling the design of robots, self-driving vehicles, recognition systems and much more. Computer vision is the branch of computer science that deals with this, and it is gaining more and more ground. To reach this goal, a stereo vision pipeline is used, whose rectification and disparity map generation steps are the subject of this thesis. In particular, since these steps are often delegated to dedicated hardware devices (such as FPGAs), there is a need for algorithms that are portable to this kind of technology, where resources are far more limited. This thesis shows how approximation techniques can be applied to these algorithms so as to save resources while still guaranteeing very good results.
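A minimal example of the disparity-map step: sum-of-absolute-differences (SAD) block matching over a single rectified scanline, the kind of fixed-window, integer-friendly algorithm that ports well to FPGAs. The scanline data are synthetic, and the thesis's actual approximated algorithms are not reproduced here:

```python
import numpy as np

def disparity_row(left, right, window=3, max_disp=8):
    """Disparity for one rectified scanline pair via SAD block matching.

    For each pixel, returns the leftward shift that best aligns a small
    window of the left row against the right row (x_right = x_left - d).
    """
    half = window // 2
    n = len(left)
    disp = np.zeros(n, dtype=int)
    for i in range(half, n - half):
        patch = left[i - half:i + half + 1]
        best, best_d = None, 0
        for d in range(0, max_disp + 1):
            j = i - d
            if j - half < 0:          # candidate window leaves the image
                break
            sad = np.abs(patch - right[j - half:j + half + 1]).sum()
            if best is None or sad < best:
                best, best_d = sad, d
        disp[i] = best_d
    return disp

# Synthetic scanline: the right image is the left shifted by 4 pixels.
left = np.array([0, 0, 0, 10, 50, 90, 50, 10, 0, 0, 0, 0, 0, 0, 0, 0],
                dtype=float)
right = np.roll(left, -4)
disp = disparity_row(left, right)
```

Around the bright feature the recovered disparity is 4 pixels; flat, textureless regions are ambiguous, which is exactly where real pipelines add aggregation or consistency checks.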

Relevance:

10.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

10.00%

Publisher:

Abstract:

Continuous delivery (CD) is a software engineering approach that focuses on creating a short delivery cycle by automating parts of the deployment pipeline, which includes the build, deploy, test and release processes. CD is based on the idea that, during development, it should always be possible to automatically generate a release from the source code in its current state. One of CD's many advantages is that continuous releases provide a quick feedback loop, leading to faster and more efficient implementation of new functionality as well as error fixes. Although CD has many advantages, there are also several challenges a maintenance management project must handle in the transition to CD. These challenges may differ depending on the maturity level of the maintenance management project and on the project's strengths and weaknesses. Our research question was: "What challenges can a maintenance management project face in the transition to Continuous delivery?" The purpose of this study is to describe Continuous delivery and the challenges a maintenance management project may face during a transition to it. A descriptive case study was carried out, with interviews and documents as data collection methods. A situation analysis was created from the collected data in the form of a process model representing the maintenance management project's release process. The process model was used as the basis for a SWOT analysis and for an analysis using Rehn et al.'s maturity model. From these analyses we identified challenges a maintenance management project may face in the transition to CD. The challenges concern the customers' and management's attitudes towards a transition to CD, but the biggest challenge is the automation of the deployment pipeline steps.

Relevance:

10.00%

Publisher:

Abstract:

Estimating the relative orientation and position of a camera is one of the integral topics in the field of computer vision. The accuracy of a certain Finnish technology company's traffic sign inventory and localization process can be improved by utilizing this concept. The company's localization process uses video data produced by a vehicle-mounted camera. The accuracy of the estimated traffic sign locations depends on the relative orientation between the camera and the vehicle. This thesis proposes a computer vision based software solution that can estimate a camera's orientation relative to the movement direction of the vehicle by utilizing video data. The task was solved using feature-based methods and open source software. When using simulated data sets, the camera orientation estimates had an absolute error of 0.31 degrees on average. The software solution can be integrated into the traffic sign localization pipeline of the company in question.
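As a deliberately simplified planar stand-in for the feature-based estimation (the thesis's actual geometry is not reproduced here), the yaw offset between the camera and the direction of travel can be read off the mean direction of feature displacements while the vehicle drives straight:

```python
import math

def yaw_offset_deg(flow_vectors):
    """Estimate the camera's yaw relative to the vehicle's direction of
    travel from feature displacement vectors between two frames.

    Simplified planar sketch: with the vehicle moving straight, the
    mean apparent motion direction of tracked features, compared with
    the image's forward axis, reflects the camera-to-vehicle yaw
    offset.  Real pipelines estimate the full relative pose instead.
    """
    sx = sum(dx for dx, dy in flow_vectors)
    sy = sum(dy for dx, dy in flow_vectors)
    return math.degrees(math.atan2(sy, sx))

# Synthetic flow: features move 2 degrees off the image x-axis.
flows = [(math.cos(math.radians(2.0)), math.sin(math.radians(2.0)))] * 5
offset = yaw_offset_deg(flows)
```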

Relevance:

10.00%

Publisher:

Abstract:

Effective school discipline practices are essential to keeping schools safe and creating an optimal learning environment. However, overreliance on exclusionary discipline often removes students from the school setting and deprives them of the opportunity to learn. Previous research has suggested that students are being introduced to the juvenile justice system through the use of school-based juvenile court referrals. In 2011, approximately 1.2 million delinquency cases were referred to the juvenile courts in the United States. Preliminary evidence suggests that an increasing number of these referrals have originated in the schools. This study investigated school-based referrals to the juvenile courts as an element of the School-to-Prison Pipeline (StPP). The likelihood of school-based juvenile court referrals and the rate of dismissal of these referrals were examined in several states using data from the National Juvenile Court Data Archives. In addition, the study examined race and special education status as predictors of school-based juvenile court referrals. Descriptive statistics, logistic regression and odds ratios were used to analyze the data, draw conclusions from the findings and recommend appropriate school discipline practices.
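The odds-ratio analysis mentioned above reduces, for a 2x2 table, to a one-line computation. The counts below are hypothetical and not the study's data:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
        a = group 1, referred      b = group 1, not referred
        c = group 2, referred      d = group 2, not referred
    """
    return (a / b) / (c / d)

# Hypothetical counts: 120 of 1000 students in one group received a
# school-based referral vs 60 of 1000 in another group.
or_value = odds_ratio(120, 880, 60, 940)
```

An odds ratio above 1 indicates that the first group has higher odds of referral; logistic regression generalises this to several predictors at once.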

Relevance:

10.00%

Publisher:

Abstract:

In the manufacturing industry, the term Process Planning (PP) is concerned with determining the sequence of individual manufacturing operations needed to produce a given part or product with a certain machine. In this technical report we propose a preliminary analysis of the scientific literature on process planning for Additive Manufacturing (AM) technologies (i.e. 3D printing). We observe that process planning for additive manufacturing consists of a small set of standard operations (repairing, orientation, supports, slicing and toolpath generation). We analyze each of them in order to emphasize the most critical aspects of the current pipeline and to highlight the future challenges for this emerging manufacturing technology.
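The slicing operation listed above rests on intersecting each mesh triangle with a horizontal plane. A minimal sketch (degenerate cases, such as vertices lying exactly on the plane, are ignored for brevity):

```python
def slice_triangle(tri, z):
    """Intersect one mesh triangle with the horizontal plane at height z,
    the core geometric operation of the slicing step.

    tri: three (x, y, z) vertices.  Returns the intersection segment as
    a pair of (x, y) points, or None if the plane misses the triangle.
    """
    pts = []
    for i in range(3):
        (x1, y1, z1), (x2, y2, z2) = tri[i], tri[(i + 1) % 3]
        if (z1 - z) * (z2 - z) < 0:          # edge crosses the plane
            t = (z - z1) / (z2 - z1)          # linear interpolation
            pts.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    return (pts[0], pts[1]) if len(pts) == 2 else None

# One triangle of a mesh, sliced at half its height.
tri = ((0.0, 0.0, 0.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0))
seg = slice_triangle(tri, 0.5)
```

A slicer applies this to every triangle at each layer height and chains the resulting segments into closed contours for toolpath generation.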

Relevance:

10.00%

Publisher:

Abstract:

Discovery Driven Analysis (DDA) is a common feature of OLAP technology for analyzing structured data. In essence, DDA helps analysts discover anomalous data by highlighting 'unexpected' values in the OLAP cube. By giving the analyst indications of which dimensions to explore, DDA speeds up the process of discovering anomalies and their causes. However, Discovery Driven Analysis (and OLAP in general) is only applicable to structured data, such as records in databases. We propose a system that extends DDA technology to semi-structured text documents, that is, text documents with some structured data. Our system pipeline consists of two stages: first, the text part of each document is structured around user-specified dimensions using the semi-PLSA algorithm; then, we adapt DDA to these fully structured documents, thus enabling DDA on text documents. We present some applications of this system in OLAP analysis and show how scalability issues are solved. Results show that our system can handle reasonable datasets of documents, in real time, without any need for pre-computation.
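A toy version of DDA's "unexpected value" highlighting, using a simple z-score surprise measure on a 2-D slice of a cube (the actual DDA surprise model is more elaborate than this):

```python
def unexpected_cells(cube, threshold=2.0):
    """Flag 'unexpected' cells whose value deviates from the slice mean
    by more than `threshold` standard deviations, a minimal stand-in
    for DDA's surprise measure.  Returns (row, column) indices."""
    values = [v for row in cube for v in row]
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [(i, j)
            for i, row in enumerate(cube)
            for j, v in enumerate(row)
            if std > 0 and abs(v - mean) / std > threshold]

# A 3x3 cube slice where one cell is clearly anomalous.
cube = [[10, 11, 9],
        [10, 12, 10],
        [11, 10, 60]]
flags = unexpected_cells(cube)
```

Pointing the analyst at the flagged cell, and at the dimensions that explain it, is what speeds up cause discovery.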

Relevance:

10.00%

Publisher:

Abstract:

Efficient crop monitoring and pest damage assessments are key to protecting the Australian agricultural industry and ensuring its leading position internationally. An important element in pest detection is gathering reliable crop data frequently and integrating analysis tools for decision making. Unmanned aerial systems are emerging as a cost-effective solution to a number of precision agriculture challenges. An important advantage of this technology is that it provides a non-invasive aerial sensor platform to accurately monitor broad acre crops. In this presentation, we will give an overview of how unmanned aerial systems and machine learning can be combined to address crop protection challenges. A recent 2015 study on insect damage in sorghum illustrates the effectiveness of this methodology. A UAV platform equipped with a high-resolution camera was deployed to autonomously perform a flight pattern over the target area. We describe the image processing pipeline implemented to create a georeferenced orthoimage and visualize the spatial distribution of the damage. An image analysis tool has been developed to minimize human input requirements. The computer program is based on a machine learning algorithm that automatically creates a meaningful partition of the image into clusters. Results show the algorithm delivers decision boundaries that accurately classify the field into crop health levels. The methodology presented in this paper represents an avenue for further research towards automated crop protection assessments in the cotton industry, with applications in detecting, quantifying and monitoring the presence of mealybug, mite and aphid pests.
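The abstract does not name the clustering algorithm; as an illustrative stand-in, a tiny 1-D k-means can partition hypothetical vegetation-index pixel values into damage/health classes:

```python
def kmeans_1d(values, k=2, iters=20):
    """Tiny 1-D k-means, an illustrative stand-in for the clustering
    used to partition the orthoimage into crop health classes (the
    abstract does not specify the exact algorithm)."""
    # Spread the initial centers across the sorted values.
    centers = sorted(values)[::max(1, len(values) // k)][:k]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            idx = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            groups[idx].append(v)
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return sorted(centers)

# Hypothetical vegetation-index pixels: damaged (~0.2) vs healthy (~0.8).
pixels = [0.18, 0.22, 0.21, 0.19, 0.79, 0.81, 0.83, 0.78]
centers = kmeans_1d(pixels, k=2)
```

The midpoint between the two recovered centers acts as the decision boundary separating the two crop health levels.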

Relevance:

10.00%

Publisher:

Abstract:

Hepatitis C virus (HCV) is emerging as one of the leading causes of morbidity and mortality in individuals infected with HIV and has overtaken AIDS-defining illnesses as a cause of death in HIV patient populations with access to highly active antiretroviral therapy. For many years, clonal analysis was the reference method for investigating viral diversity. In this thesis, a next generation sequencing (NGS) approach was developed using 454 pyrosequencing and Illumina-based technology. A sequencing pipeline was developed using two different NGS approaches: nested PCR and metagenomics. The pipeline was used to study the viral populations in the sera of HCV-infected patients from a unique cohort of 160 HIV-positive patients with early HCV infection. These pipelines resulted in an improved understanding of HCV quasispecies dynamics, especially with regard to the response to treatment. Low viral diversity at baseline correlated with sustained virological response (SVR), while high viral diversity at baseline was associated with treatment failure. The emergence of new viral strains following treatment failure was most commonly associated with the emerging dominance of pre-existing minority variants rather than with re-infection. In the new era of direct-acting antivirals, next generation sequencing technologies are the most promising tool for identifying minority variants present in the HCV quasispecies populations at baseline. In this cohort, several mutations conferring resistance were detected in genotype 1a treatment-naïve patients. Further research into the impact of baseline HCV variants on SVR rates should be carried out in this population. A clearer understanding of the properties of viral quasispecies would enable clinicians to make improved treatment choices for their patients.
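Minority-variant calling of the kind described can be sketched as a threshold rule on per-position read counts; the frequency and read-support cut-offs below are illustrative, not those of the thesis pipeline:

```python
def minority_variants(read_counts, depth, min_freq=0.01, min_reads=10):
    """Call minority variants from per-position alternate read counts.

    Simplified sketch of how NGS pipelines separate low-frequency
    quasispecies variants from sequencing noise: a variant is kept only
    if both its frequency and its absolute read support clear the
    thresholds.  All thresholds and counts here are illustrative.
    """
    calls = {}
    for pos, alt_reads in read_counts.items():
        freq = alt_reads / depth
        if freq >= min_freq and alt_reads >= min_reads:
            calls[pos] = round(freq, 4)
    return calls

# Hypothetical alternate-allele counts at three positions, 2000x depth:
# position 155 (5 reads) falls below the read-support floor.
calls = minority_variants({101: 60, 155: 5, 230: 1100}, depth=2000)
```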