490 results for Workflow
Abstract:
INTRODUCTION: Acute myeloid leukemia (AML) is a heterogeneous clonal disorder often associated with dismal overall survival. The clinical diversity of AML is reflected in the range of recurrent somatic mutations in several genes, many of which have prognostic and therapeutic value. Targeted next-generation sequencing (NGS) of these genes has the potential for translation into clinical practice. To assess this potential, an inter-laboratory evaluation of a commercially available AML gene panel was performed across three diagnostic centres in the UK and Ireland.
METHODS: DNA from six AML patient samples was distributed to each centre and processed using a standardised workflow, including a common sequencing platform, sequencing chips and bioinformatics pipeline. A duplicate sample in each centre was run to assess inter- and intra-laboratory performance.
RESULTS: An average sample read depth of 2725X (range 629-5600) was achieved using six samples per chip, with some variability in the depth of coverage between individual samples and between centres. A total of 16 somatic mutations were detected in the six AML samples, with a mean of 2.7 mutations per sample (range 1-4), representing nine genes on the panel. 15/16 mutations were identified by all three centres. Allelic frequencies of the mutations ranged from 5.6 to 53.3 % (median 44.4 %), with a high level of concordance between centres for the mutations detected.
CONCLUSION: In this inter-laboratory comparison, high concordance, reproducibility and robustness were demonstrated using a commercially available NGS AML gene panel and platform.
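For readers less familiar with the metrics reported above, here is a minimal sketch (not the study's pipeline; the function name and all numbers are illustrative assumptions) of how a variant allele frequency is derived from raw read counts on a targeted NGS panel:

```python
# Minimal sketch, not the study's pipeline: deriving a variant allele
# frequency (VAF) from per-variant read counts on a targeted NGS panel.
# All numbers are illustrative.

def variant_allele_frequency(alt_reads: int, total_reads: int) -> float:
    """VAF as the percentage of covering reads that support the variant."""
    if total_reads <= 0:
        raise ValueError("total_reads must be positive")
    return 100.0 * alt_reads / total_reads

# Example: 150 variant-supporting reads out of 2700 covering reads
# corresponds to a VAF of about 5.6 %, the lower bound reported above.
print(round(variant_allele_frequency(150, 2700), 1))  # 5.6
```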
Abstract:
AIMS: Mutation detection accuracy has been described extensively; however, pre-PCR processing of formalin-fixed paraffin-embedded (FFPE) samples has surprisingly not been systematically assessed in a clinical context. We designed a RING trial to (i) investigate pre-PCR variability, (ii) correlate pre-PCR variation with EGFR/BRAF mutation testing accuracy and (iii) investigate causes for observed variation. METHODS: 13 molecular pathology laboratories were recruited. 104 blinded FFPE curls, including engineered FFPE curls, cell-negative FFPE curls and control FFPE tissue samples, were distributed to participants for pre-PCR processing and mutation detection. Follow-up analysis was performed to assess sample purity, DNA integrity and DNA quantitation. RESULTS: The rate of mutation detection failure was 11.9%. Of these failures, 80% were attributed to pre-PCR error. Significant differences in DNA yields across all samples were seen using analysis of variance (p
Abstract:
Scientific workflows orchestrate the execution of complex experiments, frequently using distributed computing platforms. Meta-workflows are an emerging type of workflow that aims to reuse existing workflows, potentially from different workflow systems, to achieve more complex experimentation while minimizing workflow design and testing efforts. Workflow interoperability plays a profound role in achieving this objective. This paper focuses on fostering interoperability across meta-workflows that combine workflows of different workflow systems from diverse scientific domains. This is achieved by formalizing definitions of the meta-workflow and its different types in order to standardize the data structures used to describe workflows to be published and shared via public repositories. The paper also includes a thorough formalization of two workflow interoperability approaches based on this formal description: the coarse-grained and the fine-grained workflow interoperability approach. The paper presents a case study from Astrophysics which successfully demonstrates the use of the concepts of meta-workflows and workflow interoperability within a scientific simulation platform.
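As a rough sketch of the kind of data structure such a formal description standardizes, the following is a hypothetical meta-workflow record referencing sub-workflows from different systems; all field names, system names and URLs are illustrative assumptions, not the paper's formal notation:

```python
# Hypothetical sketch of a standardized meta-workflow description that
# references sub-workflows from different workflow systems. Field names,
# system names and URLs are assumptions, not the paper's formal notation.
from dataclasses import dataclass, field
from typing import List


@dataclass
class SubWorkflowRef:
    system: str          # e.g. "WS-PGRADE" or "Taverna"
    repository_url: str  # where the published workflow can be retrieved
    granularity: str     # "coarse-grained": run as a black box by its own
                         # system; "fine-grained": tasks interleaved by the
                         # enacting engine


@dataclass
class MetaWorkflow:
    name: str
    domain: str
    sub_workflows: List[SubWorkflowRef] = field(default_factory=list)


meta_wf = MetaWorkflow(
    name="simulation-campaign",
    domain="Astrophysics",
    sub_workflows=[
        SubWorkflowRef("WS-PGRADE", "https://repo.example.org/wf/1", "coarse-grained"),
        SubWorkflowRef("Taverna", "https://repo.example.org/wf/2", "fine-grained"),
    ],
)
print(meta_wf.name, len(meta_wf.sub_workflows))
```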
Abstract:
Preserving the cultural heritage of the performing arts raises difficult and sensitive issues, as each performance is unique by nature and the juxtaposition between the performers and the audience cannot be easily recorded. In this paper, we report on an experimental research project to preserve another aspect of the performing arts—the history of their rehearsals. We have specifically designed non-intrusive video recording and on-site documentation techniques to make this process transparent to the creative crew, and have developed a complete workflow to publish the recorded video data and their corresponding meta-data online as Open Data using state-of-the-art audio and video processing to maximize non-linear navigation and hypervideo linking. The resulting open archive is made publicly available to researchers and amateurs alike and offers a unique account of the inner workings of the worlds of theater and opera.
Abstract:
Software testing is performed to verify that a system meets its specified requirements and to find defects. It is an important part of system development and includes, among other things, regression testing. Regression tests are run to ensure that a change to the system does not negatively affect other parts of the system. Document management systems often handle sensitive organizational data, which places high demands on security. Permissions in such systems must therefore be tested thoroughly to ensure that data does not end up in the wrong hands. Document management systems make it possible for several organizations to pool their resources and knowledge in order to reach common goals. Shared work processes are supported by workflows that contain a number of different states, and different permissions apply in each of these states. When a permission is changed, regression tests are required to ensure that the change has not affected other permissions. This study was carried out as a qualitative case study whose purpose was to describe the challenges of regression testing roles and permissions in document workflows in document management systems. Interviews and an observation revealed that a major challenge of these tests is that workflow states follow a predetermined sequence; completing this sequence involves an enormous number of permissions that must be tested, which makes the testing effort very extensive in terms of, among other things, time and cost. The study focused on the document management system ProjectWise, which is managed by Trafikverket. A decision basis was produced for a technical solution for automated regression testing of roles and permissions in ProjectWise workflows. Based on a requirements-gathering effort, the proposed solution involved Team Foundation Server (TFS), Coded UI and a keyword-driven test method. Finally, the differences this technical solution could make compared to manual testing were assessed. Based on literature, a document study and first-hand experience, test automation was found to make a difference in a number of identified problem areas, including time and cost.
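To illustrate the keyword-driven test method mentioned in the proposed solution, here is a minimal sketch in Python; the actual solution targeted Coded UI and TFS, and every function name, keyword, document and state below is a made-up example rather than part of that solution:

```python
# Illustrative sketch of the keyword-driven idea: test steps are data
# (keyword + arguments) dispatched to reusable actions. The actual
# ProjectWise solution used Coded UI and TFS; every function name,
# keyword, document and state below is a made-up example.

def login(role):
    print(f"log in as role '{role}'")

def advance_state(document, target_state):
    print(f"move '{document}' to workflow state '{target_state}'")

def assert_permission(role, document, action, expected):
    verb = "can" if expected else "cannot"
    print(f"check that '{role}' {verb} {action} '{document}'")

KEYWORDS = {
    "Login": login,
    "AdvanceState": advance_state,
    "AssertPermission": assert_permission,
}

# A regression test is a table of keyword rows, easy to extend per role
# and per workflow state without writing new test code.
test_case = [
    ("Login", ("Reviewer",)),
    ("AdvanceState", ("drawing-042", "Under review")),
    ("AssertPermission", ("Reviewer", "drawing-042", "edit", False)),
]

for keyword, args in test_case:
    KEYWORDS[keyword](*args)
```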
Abstract:
E-books on their own are complex; they become even more so in the context of course reserves. In FY2016 the Resource Sharing & Reserves and Acquisitions units developed a new workflow for vetting requested e-books to ensure that they were suitable for course reserves (i.e. that they permit unlimited simultaneous users) before posting links to them within the university’s online learning management system. In the Spring 2016 semester, 46 e-books were vetted through this process, resulting in 18 purchases. Preliminary data analysis sheds light on the suitability of the Libraries’ current e-book collections for course reserves as well as faculty preferences, with potential implications for the Libraries’ ordering process. We hope this lightning talk will generate discussion about these issues among selectors, collection managers, and reserves staff alike.
Abstract:
Current practices in agricultural management involve the application of rules and techniques to ensure high-quality and environmentally friendly production. Based on their experience, agricultural technicians and farmers make critical decisions affecting crop growth while considering several interwoven agricultural, technological, environmental, legal and economic factors. In this context, decision support systems, and the knowledge models that underpin them, enable the incorporation of valuable experience into software systems, providing support to agricultural technicians to make rapid and effective decisions for efficient crop growth. Pest control is an important issue in agricultural management because of the crop yield reductions caused by pests, and it involves expert knowledge. This paper presents a formalisation of the pest control problem and of the workflow followed by agricultural technicians and farmers in integrated pest management, the crop production strategy that combines different practices for growing healthy crops whilst minimising pesticide use. A generic decision schema for estimating the infestation risk of a given pest on a given crop is defined; it acts as a metamodel for the maintenance and extension of the knowledge embedded in a pest management decision support system, which is also presented. This software tool has been implemented by integrating a rule-based tool into a web-based architecture. Evaluation from validity and usability perspectives concluded that both agricultural technicians and farmers considered it a useful tool in pest control, particularly for training new technicians and inexperienced farmers.
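As a hedged illustration of what a generic, rule-based decision schema for estimating infestation risk might look like, the following sketch combines a few invented factors and thresholds; none of the rules, factor names or levels come from the paper's knowledge model:

```python
# Hedged sketch of a generic decision schema for estimating infestation
# risk: each rule inspects observed factors and, when it fires, contributes
# a risk level; the overall risk is the most severe level fired. Factor
# names, thresholds and levels are invented, not the paper's knowledge model.

RISK_LEVELS = ["low", "medium", "high"]

RULES = [
    # (condition on the observations, risk level contributed when it holds)
    (lambda o: o["mean_temp_c"] > 25 and o["humidity_pct"] > 70, "high"),
    (lambda o: o["trap_catches_per_week"] > 20, "high"),
    (lambda o: 10 < o["trap_catches_per_week"] <= 20, "medium"),
    (lambda o: o["crop_stage"] in {"flowering", "fruit set"}, "medium"),
]

def infestation_risk(observations: dict) -> str:
    fired = [level for condition, level in RULES if condition(observations)]
    return max(fired, key=RISK_LEVELS.index, default="low")

print(infestation_risk({
    "mean_temp_c": 27,
    "humidity_pct": 80,
    "trap_catches_per_week": 12,
    "crop_stage": "flowering",
}))  # high
```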
Abstract:
The traditional process of filling medicine trays and dispensing medicines to patients in hospitals is done manually by reading the printed paper medicine chart. This process can be very strenuous and error-prone, given the number of sub-tasks involved in the entire workflow and the dynamic nature of the work environment. Therefore, efforts are being made to digitalise the medication dispensation process by introducing a mobile application called the Smart Dosing application. The introduction of the Smart Dosing application into the hospital workflow raises security concerns and calls for a security requirement analysis. This thesis is written as a part of the smart medication management project at the Embedded Systems Laboratory, Åbo Akademi University. The project aims at digitising the medicine dispensation process by integrating information from various health systems and making it available through the Smart Dosing application. This application is intended to be used on a tablet computer incorporated on the medicine tray. The smart medication management system includes the medicine tray, the tablet device, and the medicine cups with the cup holders. Introducing the Smart Dosing application should not interfere with the existing process carried out by the nurses, and it should result in minimal modifications to the tray design and the workflow. Re-designing the tray would include integrating the device running the application into the tray in a manner that the users find convenient and that leads them to make fewer errors while using it. The main objective of this thesis is to enhance the security of the hospital medicine dispensation process by ensuring the security of the Smart Dosing application at various levels. The method used for writing this thesis was to analyse how the tray design and the application user interface design can help prevent errors, and which secure technology choices have to be made before starting the development of the next prototype of the Smart Dosing application. The thesis first establishes the context of use of the application, the end-users and their needs, and the errors made in the everyday medication dispensation workflow, through continuous discussions with the nursing researchers. The thesis then gains insight into the vulnerabilities, threats and risks of using a mobile application in the hospital medication dispensation process. The resulting list of security requirements was produced by analysing the previously built prototype of the Smart Dosing application, through continuous interactive discussions with the nursing researchers, and through an exhaustive state-of-the-art study on the security risks of using mobile applications in a hospital context. The thesis also uses the OCTAVE Allegro method to help readers understand the likelihood and impact of threats, and what steps should be taken to prevent or mitigate them. The security requirements obtained as a result are a starting point for the developers of the next iteration of the Smart Dosing application prototype.
Abstract:
Final Master's project submitted to obtain the degree of Master in Informatics and Computer Engineering
Abstract:
Every learning environment must strike a balance between three requirements: delivering content, fostering student activity, and supporting learning- and work-related interaction. Drawing on approaches to task-technology fit and to process losses in group performance, a workflow-based model of a learning and working environment for cooperative and collaborative learning and working in psychology and the empirical social sciences is presented as a way to achieve these goals. It is shown how reception-oriented learning processes, stimulated by courseware, can be complemented by cooperation functionalities. It is further shown how production-oriented learning processes can be fostered by collaborative learning projects that support the learning and working steps within a student work group. The use of a shared workspace for activities both in the courseware and in the learning project is discussed. (DIPF/Orig.)
Abstract:
A human genome contains more than 20 000 protein-encoding genes. The human proteome, in contrast, has been estimated to be much more complex and dynamic. The most powerful tool for studying proteins today is mass spectrometry (MS). MS-based proteomics relies on measuring the masses of charged peptide ions in the gas phase. The peptide amino acid sequence can be deduced, and matching proteins found, using software that correlates MS data with sequence database information. Quantitative proteomics allows the estimation of the absolute or relative abundance of a given protein in a sample. Label-free quantification methods use the intrinsic MS peptide signals to calculate the quantitative values, enabling the comparison of peptide signals from numerous patient samples. In this work, a quantitative MS methodology was established to study aromatase-overexpressing (AROM+) male mouse liver and ovarian endometriosis tissue samples. The label-free quantitative proteomics workflow was optimized in terms of sensitivity and robustness, allowing the quantification of 1500 proteins with a low coefficient of variation in both sample types. Additionally, five statistical methods were evaluated for use with label-free quantitative proteomics data. The proteome data were integrated with other omics datasets, such as mRNA microarray and metabolite data sets. As a result, an altered lipid metabolism was discovered in the liver of male AROM+ mice. The results suggest reduced beta-oxidation of long-chain phospholipids in the liver and increased levels of pro-inflammatory fatty acids in the circulation in these mice. Conversely, in the endometriosis tissues, a set of proteins highly specific to ovarian endometrioma was discovered, many of which are under the regulation of the growth factor TGF-β1. This finding supports subsequent biomarker verification in a larger number of endometriosis patient samples.
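To make the label-free quantification idea concrete, here is a minimal sketch of normalizing per-run protein intensities and deriving a coefficient of variation and a between-run ratio; protein names and intensity values are placeholders, and the study's actual pipeline and statistics are not reproduced:

```python
# Minimal sketch of label-free relative quantification: per-run protein
# intensities are normalized to the run's total signal, then a coefficient
# of variation (CV) and a between-run ratio are derived. Protein names and
# values are placeholders; the study's pipeline is not reproduced here.
import statistics

def normalize(run: dict) -> dict:
    """Scale each protein intensity by the run's total signal."""
    total = sum(run.values())
    return {protein: intensity / total for protein, intensity in run.items()}

# Two tiny runs with two placeholder proteins (arbitrary intensity units)
run_a = normalize({"protein_1": 1.8e6, "protein_2": 3.0e6})
run_b = normalize({"protein_1": 2.2e6, "protein_2": 2.9e6})

values = [run_a["protein_1"], run_b["protein_1"]]
cv_pct = statistics.stdev(values) / statistics.mean(values) * 100
ratio = run_a["protein_2"] / run_b["protein_2"]

print(f"protein_1 CV: {cv_pct:.1f} %, protein_2 A/B ratio: {ratio:.2f}")
```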
Abstract:
Proceedings paper published by Society of American Archivists. Presented at conference in 2015 in Cleveland, OH (http://www2.archivists.org/proceedings/research-forum/2015/agenda#papers). Published by SAA in 2016.
Abstract:
The work outlined in this dissertation will allow biochemists and cell biologists to characterize the polyubiquitin chains involved in their cellular environment by following a facile mass spectrometry-based workflow. The characterization of polyubiquitin chains has been of interest since their discovery in 1984. The profound effects of ubiquitination on the movement and processing of cellular proteins depend exclusively on the structures of the mono- and polyubiquitin modifications, anchored or unanchored, on the protein within the cellular environment. However, structure-function studies have been hindered by the difficulty of identifying complex chain structures, owing to the limited instrument capabilities of the past. Genetic mutations or reiterative immunoprecipitations have previously been used to characterize polyubiquitin chains, but their tedium makes it difficult to study a broad ubiquitinome. Top-down and middle-out mass spectrometry-based proteomic studies have been reported for polyubiquitin and have had success in characterizing parts of the chain, but no method to date has been able to differentiate all theoretical ubiquitin chain isomers (chain lengths from dimer to tetramer alone have 1340 possible isomers). The workflow presented here can identify the chain length, topology and linkages present using a chromatographic-time-scale-compatible LC-MS/MS workflow. To accomplish this feat, the strategy had to exploit the most recent advances in top-down mass spectrometry, including the advanced electron transfer dissociation (ETD) activation and the sensitivity for large masses of the Orbitrap Fusion Lumos. The spectral interpretation had to be done manually, with the aid of a graphical interface to assign mass shifts, because of the lack of software capable of interpreting fragmentation across isopeptide linkages. However, the method outlined can be applied to any mass-spectrometry-based system provided it yields extensive fragmentation across the polyubiquitin chain, making this method adaptable to future advances in the field.