607 results for Workflow


Relevance:

10.00%

Publisher:

Abstract:

Final Master's project submitted for the degree of Master in Informatics and Computer Engineering (Engenharia Informática e de Computadores)

Relevance:

10.00%

Publisher:

Abstract:

Every learning environment must ensure a balance among three requirements: delivery of content, promotion of student activity, and support for learning- and work-related interactions. Against the background of approaches to task-technology fit and to process losses in group performance, a workflow-based model of a learning and working environment for cooperative and collaborative learning and working in psychology and the empirical social sciences is presented to achieve these goals. It is shown how reception-oriented learning processes stimulated by learning programs can be complemented by cooperation functionality. It is further shown how production-oriented learning processes can be promoted by collaborative learning projects that support the learning and work steps of a student working group. The use of a shared workspace for activities both in the learning program and in the learning project is discussed. (DIPF/Orig.)

Relevance:

10.00%

Publisher:

Abstract:

The human genome contains more than 20,000 protein-coding genes; the human proteome, however, is estimated to be far more complex and dynamic. The most powerful tool for studying proteins today is mass spectrometry (MS). MS-based proteomics rests on measuring the masses of charged peptide ions in the gas phase. The peptide amino acid sequence can be deduced, and matching proteins found, using software that correlates MS data with sequence database information. Quantitative proteomics allows estimation of the absolute or relative abundance of a given protein in a sample. Label-free quantification methods use the intrinsic MS peptide signals to calculate the quantitative values, enabling the comparison of peptide signals across numerous patient samples. In this work, a quantitative MS methodology was established to study aromatase-overexpressing (AROM+) male mouse liver and ovarian endometriosis tissue samples. The label-free quantitative proteomics workflow was optimized for sensitivity and robustness, allowing the quantification of 1500 proteins with a low coefficient of variation in both sample types. Additionally, five statistical methods were evaluated for use with label-free quantitative proteomics data. The proteome data were integrated with other omics datasets, such as mRNA microarray and metabolite data. As a result, altered lipid metabolism was discovered in the liver of male AROM+ mice. The results suggest reduced beta-oxidation of long-chain phospholipids in the liver and increased levels of pro-inflammatory fatty acids in the circulation of these mice. In the endometriosis tissues, conversely, a set of proteins highly specific to ovarian endometrioma was discovered, many of them under the regulation of the growth factor TGF-β1. This finding supports subsequent biomarker verification in a larger number of endometriosis patient samples.
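
The robustness criterion above is typically screened with a per-protein coefficient of variation (CV) across replicate runs. Below is a minimal sketch of that calculation; the function name and the intensity-matrix layout are illustrative assumptions, not details from the thesis.

```python
import numpy as np

def per_protein_cv(intensities: np.ndarray) -> np.ndarray:
    """Coefficient of variation per protein across replicate runs.

    `intensities` is an (n_proteins, n_runs) matrix of label-free
    MS signal values: one row per protein, one column per run.
    """
    mean = intensities.mean(axis=1)
    std = intensities.std(axis=1, ddof=1)
    return std / mean  # low CV indicates robust label-free quantification
```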

Relevance:

10.00%

Publisher:

Abstract:

Proceedings paper published by the Society of American Archivists. Presented at the SAA Research Forum, Cleveland, OH, in 2015 (http://www2.archivists.org/proceedings/research-forum/2015/agenda#papers). Published by SAA in 2016.

Relevance:

10.00%

Publisher:

Abstract:

The work outlined in this dissertation will allow biochemists and cellular biologists to characterize the polyubiquitin chains involved in their cellular environment by following a facile mass spectrometry-based workflow. The characterization of polyubiquitin chains has been of interest since their discovery in 1984. The profound effects of ubiquitination on the movement and processing of cellular proteins depend exclusively on the structures of the mono- and polyubiquitin modifications, anchored or unanchored, on the protein within the cellular environment. However, structure-function studies have been hindered by the difficulty of identifying complex chain structures, owing to the limited instrument capabilities of the past. Genetic mutations or reiterative immunoprecipitations have previously been used to characterize polyubiquitin chains, but their tedium makes it difficult to study a broad ubiquitinome. Top-down and middle-out mass spectrometry-based proteomic studies have been reported for polyubiquitin and have had success in characterizing parts of the chain, but no method to date has been able to differentiate all theoretical ubiquitin chain isomers (chain lengths from dimer to tetramer alone have 1340 possible isomers). The workflow presented here can identify chain length, topology and the linkages present on a chromatographic time scale using LC-MS/MS. To accomplish this feat, the strategy exploits the most recent advances in top-down mass spectrometry, including the advanced electron transfer dissociation (ETD) activation and large-mass sensitivity of the Orbitrap Fusion Lumos. Spectral interpretation had to be done manually, with the aid of a graphical interface for assigning mass shifts, because no software is yet capable of interpreting fragmentation across isopeptide linkages. The method outlined can, however, be applied to any mass spectrometry system, provided it yields extensive fragmentation across the polyubiquitin chain, making it adaptable to future advances in the field.
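
The isomer count quoted in parentheses can be reproduced by modeling a polyubiquitin chain as a rooted tree in which every monomer offers eight linkage sites (the seven lysines plus the N-terminal methionine) and each site carries at most one child. The following sketch counts topologies under that assumption; it illustrates the combinatorics and is not code from the dissertation.

```python
from functools import lru_cache
from math import comb

SITES = 8  # linkage sites per ubiquitin: K6, K11, K27, K29, K33, K48, K63, M1

@lru_cache(maxsize=None)
def topologies(n: int) -> int:
    """Distinct chain topologies with n monomers (rooted trees in which
    each monomer accepts at most one child per linkage site)."""
    if n == 1:
        return 1
    return sum(comb(SITES, k) * distribute(n - 1, k)   # k occupied root sites
               for k in range(1, min(SITES, n - 1) + 1))

@lru_cache(maxsize=None)
def distribute(m: int, k: int) -> int:
    """Ways to split m monomers into k ordered, non-empty subtrees."""
    if k == 1:
        return topologies(m)
    return sum(topologies(first) * distribute(m - first, k - 1)
               for first in range(1, m - k + 2))

# dimer, trimer, tetramer: 8 + 92 + 1240 = 1340 possible isomers
print(sum(topologies(n) for n in range(2, 5)))  # -> 1340
```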

Relevance:

10.00%

Publisher:

Abstract:

Preserving the cultural heritage of the performing arts raises difficult and sensitive issues, as each performance is unique by nature and the juxtaposition between the performers and the audience cannot be easily recorded. In this paper, we report on an experimental research project to preserve another aspect of the performing arts—the history of their rehearsals. We have specifically designed non-intrusive video recording and on-site documentation techniques to make this process transparent to the creative crew, and have developed a complete workflow to publish the recorded video data and their corresponding meta-data online as Open Data using state-of-the-art audio and video processing to maximize non-linear navigation and hypervideo linking. The resulting open archive is made publicly available to researchers and amateurs alike and offers a unique account of the inner workings of the worlds of theater and opera.

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Bone scintigraphy is one of the most frequent examinations in nuclear medicine. This medical imaging modality requires an appropriate balance between image quality and radiation dose: the images acquired must contain the minimum number of counts needed to provide quality considered sufficient for diagnostic purposes. Objective: The main objective of this study is to apply the Enhanced Planar Processing (EPP) software to bone scintigraphy examinations of patients with breast and prostate carcinoma presenting bone metastases, and thereby to evaluate the performance of the EPP algorithm in clinical practice, in terms of quality and diagnostic confidence, when the acquisition time is reduced by 50%. Materials and methods: This investigation took place in the Department of Radiology and Nuclear Medicine of the Radboud University Nijmegen Medical Centre. Fifty-one patients with suspected bone metastases were administered 500 MBq of technetium-99m-labelled methylene diphosphonate. Each patient underwent two acquisitions: the first followed the department's standard protocol (scan speed = 8 cm/min), and in the second the acquisition time was halved (scan speed = 16 cm/min). The images acquired with the second protocol were processed with the EPP algorithm. All images underwent objective and subjective evaluation. For the subjective analysis, three nuclear medicine physicians evaluated the images in terms of lesion detectability, image quality, diagnostic acceptability, lesion localization and diagnostic confidence. For the objective evaluation, two regions of interest were selected, one in the middle third of the femur and the other in the adjacent soft tissue, to obtain the signal-to-noise ratio, contrast-to-noise ratio and coefficient of variation. Results: The results show that the images processed with the EPP software give physicians sufficient diagnostic information for detecting metastases, as no statistically significant differences were found (p>0.05). Moreover, inter-observer agreement between these images and the images acquired with the standard protocol was 95% (k=0.88). Regarding image quality, statistically significant differences were found when the imaging modalities were compared with one another (p≤0.05). For diagnostic acceptability, no statistically significant differences were found between the images acquired with the standard protocol and the images processed with the EPP software (p>0.05), with inter-observer agreement of 70.6%. However, statistically significant differences were found between the images acquired with the standard protocol and the images acquired with the second protocol but not processed with the EPP software (p≤0.05). In addition, no statistically significant differences (p>0.05) were found in signal-to-noise ratio, contrast-to-noise ratio or coefficient of variation between the images acquired with the standard protocol and the images processed with EPP.
Conclusion: From the results of this study it can be concluded that the EPP algorithm, developed by Siemens, offers the possibility of reducing acquisition time by 50% while maintaining image quality considered sufficient for diagnostic purposes. Besides increasing patient satisfaction, the use of this technology is highly advantageous for the department's workflow.
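
For reference, the three objective metrics named above are conventionally computed from the two ROIs as sketched below, assuming the usual definitions (signal ROI on the femur, background ROI on soft tissue); the study's exact formulas are not given in the abstract.

```python
import numpy as np

def roi_metrics(femur_roi: np.ndarray, soft_tissue_roi: np.ndarray):
    """SNR, CNR and CV from two ROIs (femur = signal, soft tissue = background)."""
    mean_s, mean_b = femur_roi.mean(), soft_tissue_roi.mean()
    std_b = soft_tissue_roi.std(ddof=1)
    snr = mean_s / std_b                 # signal-to-noise ratio
    cnr = (mean_s - mean_b) / std_b      # contrast-to-noise ratio
    cv = femur_roi.std(ddof=1) / mean_s  # coefficient of variation
    return snr, cnr, cv
```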

Relevance:

10.00%

Publisher:

Abstract:

Single-cell functional proteomics assays can connect genomic information to biological function through quantitative and multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past five years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single-cell functional proteomics, focusing on the development of single-cell barcode chips (SCBCs), with applications in fundamental and translational cancer research.

The discussion begins with a microchip designed to simultaneously quantify a panel of secreted, cytoplasmic and membrane proteins from single cells, the prototype for subsequent proteomic microchips of more sophisticated design used in preclinical cancer research and clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or defined numbers of cells, within microchambers, each equipped with a large antibody microarray (the barcode); between a few hundred and ten thousand microchambers are included within a single microchip. Functional proteomics assays at single-cell resolution yield unique pieces of information that significantly shape the way we think about cancer research. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, follows.

The SCBC is a powerful tool for resolving the functional heterogeneity of cancer cells. It has the capacity to extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We demonstrate this point by applying SCBCs to investigate three isogenic cell lines of glioblastoma multiforme (GBM).

The cancer cell population is highly heterogeneous, with high-amplitude fluctuations at the single-cell level that in turn confer robustness on the entire population. A stable population existing in the presence of random fluctuations is reminiscent of many physical systems that are successfully understood using statistical physics, so tools derived from that field can plausibly use fluctuations to determine the nature of signaling networks. In the second part of the thesis, we focus on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia: single-cell proteomics assays coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics yield detailed and surprising predictions, which were found to be correct in both cell-line and primary tumor models.

The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study GBM cancer cell resistance to molecular targeted therapy. Physical approaches to anticipating therapy resistance and identifying effective therapy combinations are discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to perturbation by a targeted inhibitor. Strongly coupled protein-protein interactions constitute most signaling cascades; a physical analogy of such a system is the strongly coupled atom-atom interactions in a crystal lattice. Just as atomic interactions can be decomposed into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can be achieved by diagonalizing protein-protein correlation or covariance matrices, decomposing the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e., independent signaling modes). By doing so, two independent signaling modes were resolved, one associated with mTOR signaling and a second with ERK/Src signaling, which in turn allowed us to anticipate resistance and to design effective combination therapies, as well as to identify those therapies and therapy combinations that would be ineffective. We validated our predictions in mouse tumor models, and all predictions were borne out.
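
The diagonalization described here is, at heart, an eigendecomposition of the measured protein-protein covariance matrix. A minimal sketch follows; the function name and matrix layout are illustrative assumptions, and the leading eigenvectors play the role of the mTOR- and ERK/Src-associated modes discussed above.

```python
import numpy as np

def signaling_modes(levels: np.ndarray):
    """Decompose pairwise protein-protein correlations into independent
    signaling modes; `levels` is an (n_cells, n_proteins) matrix of
    single-cell phosphoprotein measurements."""
    cov = np.cov(levels, rowvar=False)        # protein-protein covariance
    eigvals, eigvecs = np.linalg.eigh(cov)    # symmetric matrix: real spectrum
    order = np.argsort(eigvals)[::-1]         # strongest modes first
    return eigvals[order], eigvecs[:, order]  # each column is one signaling mode
```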

In the last part, preliminary results on the clinical translation of single-cell proteomics chips are presented. The successful demonstration of our work on human-derived xenografts provides the rationale for extending the current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians on a clinically relevant time scale. The technical challenges of clinical translation are presented, along with our solutions for addressing them. A clinical case study then follows, in which preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor are presented to demonstrate the general protocol and workflow of the proposed clinical studies.

Relevance:

10.00%

Publisher:

Abstract:

AIMS AND OBJECTIVES: The aim of this study is to describe healthcare professionals' experiences and perceptions of an intervention implemented in an action research project conducted to improve nursing documentation practices in four municipalities in Norway.

BACKGROUND: Documentation of individualized patient care is a continuing concern in healthcare services and can impact the quality and safety of healthcare. Use of electronic systems has made some aspects of documentation more comprehensive, but creation of an individualized care plan remains a pressing issue.

DESIGN: A qualitative descriptive design was used.

METHODS: An action research project was conducted between 2010 and 2012 to improve the content and quality of nursing documentation in community healthcare services in four municipalities. One year after the project was completed four focus group interviews were conducted with healthcare professionals, one for each involved municipality. Two unit managers were interviewed individually. Qualitative content analysis was used.

RESULTS: Three themes emerged: healthcare professionals perceived competing interests; they experienced having to manage complexity and change; and they highlighted a clear and visible leader as important for success.

CONCLUSIONS: Quality improvement activities are essential. Healthcare professionals experience a complicated situation when electronic health record (EHR) systems do not support their workflow. Further research is recommended on the functionality and user interfaces of EHR systems, and on the role of leadership when implementing changes in clinical practice.

Relevance:

10.00%

Publisher:

Abstract:

Temporal violations often occur during the execution of large batches of parallel business cloud workflows, seriously affecting the on-time completion of massive concurrent user requests. Existing studies have shown that local temporal violations (namely, delays of workflow activities) occurring during cloud workflow execution are the fundamental cause of failed on-time completion. Accurate prediction of temporal violations is therefore an important yet challenging task for business cloud workflows. In this paper, based on an epidemic model, a novel temporal violation prediction strategy is proposed to estimate, before workflow execution, the number of local temporal violations and the number of violations that must be handled in order to achieve a given on-time completion rate. The prediction result can serve as an important reference for temporal violation prevention and handling strategies such as static resource reservation and dynamic provisioning. Specifically, we first analyze the queuing process of the parallel workflow activities, and then predict the number of potential temporal violations based on a novel temporal violation transmission model inspired by an epidemic model. Comprehensive experimental results demonstrate that our strategy achieves very high prediction accuracy under a range of situations.
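
As a rough illustration of the transmission idea, a discrete-time SIR-style model can estimate how activity delays propagate through a batch of parallel activities. Everything below (parameter values, function name, time grid) is an illustrative assumption, not the paper's actual model.

```python
def predict_violations(n_activities: int, initially_delayed: int,
                       beta: float = 0.3, gamma: float = 0.1,
                       steps: int = 100) -> float:
    """SIR-style estimate of the total local temporal violations in a batch."""
    s = float(n_activities - initially_delayed)  # on-time ("susceptible")
    i = float(initially_delayed)                 # delayed ("infectious")
    r = 0.0                                      # handled ("recovered")
    for _ in range(steps):
        new_delays = beta * s * i / n_activities  # delays spreading via shared queues
        handled = gamma * i                       # violations handled per time step
        s -= new_delays
        i += new_delays - handled
        r += handled
    return r + i  # expected violations over the prediction horizon

print(predict_violations(n_activities=1000, initially_delayed=10))
```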

Relevance:

10.00%

Publisher:

Abstract:

Cloud computing has established itself as the latest computing paradigm in recent years. As doing science in the cloud becomes a reality, scientists can now access public cloud centers and employ high-performance computing resources to run scientific applications. However, due to the dynamic nature of the cloud environment, the usability of scientific cloud workflow systems can deteriorate significantly without effective service-quality assurance strategies. Specifically, workflow temporal verification, the major approach for assuring workflow temporal QoS (Quality of Service), plays a critical role in the on-time completion of large-scale scientific workflows. Great effort has been dedicated to workflow temporal verification in recent years, and it is time to define the key research issues for scientific cloud workflows in order to keep research on the right track. In this paper, we systematically investigate this problem and present four key research issues based on the introduction of a generic temporal verification framework. State-of-the-art solutions for each research issue and open challenges are also presented. Finally, SwinDeW-V, an ongoing research project on temporal verification within our SwinDeW-C cloud workflow system, is demonstrated.

Relevance:

10.00%

Publisher:

Abstract:

The interaction and integration of uncertainties in on-site and off-site project activities often create a risk of delays and schedule overruns in hybrid construction projects. To address this problem, a holistic risk analysis approach is proposed that assesses the integrated impact of uncertainties on completion times. The results of the analysis show that growth in project size and work quantities intensifies pairwise and group interconnection of tasks within and between groups of on-site and off-site activities, lengthening completion times and causing deviations from project plans. Unavailability of resources, risk-seeking attitudes and workflow variability are other major contributors to the risk of late completion in hybrid construction. While project managers often analyze on-site and off-site uncertainties separately, the practical implications of these results suggest adopting a holistic approach in which risk management practices in the two environments are integrated. This approach significantly improves tangible performance measures in projects.

Relevance:

10.00%

Publisher:

Abstract:

Conducting research in the rapidly evolving fields constituting the digital social sciences raises challenging ethical and technical issues, especially when the subject matter includes the activities of stigmatised populations. Our study of a dark-web drug-use community provides a case example of how to conduct studies in digital environments where sensitive and illicit activities are discussed. In this paper we present the workflow from our digital ethnography and consider the consequences of particular choices of action for knowledge production. Key considerations that our workflow responded to include adapting to volatile field sites, researcher safety in digital environments, data security and encryption, and ethical-legal challenges. We anticipate that this workflow may assist other researchers to emulate, test and adapt our approach across the diverse range of illicit online activities studied. We argue that active engagement with stigmatised communities through multi-sited digital ethnography can complement and augment the findings of digital trace analyses.

Relevance:

10.00%

Publisher:

Abstract:

A crosswell data set contains a range of angles limited only by the geometry of the source and receiver configuration, the separation of the boreholes and the depth to the target. However, the wide-angle reflections present in crosswell imaging produce amplitude-versus-angle (AVA) features not usually observed in surface data. These features include reflections from angles that are near critical and beyond critical for many of the interfaces; some of these reflections are visible only over a small range of angles, presumably near their critical angle. High-resolution crosswell seismic surveys were conducted over a Silurian (Niagaran) reef at two fields in northern Michigan, Springdale and Coldspring. The Springdale wells extended to much greater depths than the reef, and imaging was conducted from above and from beneath the reef. Combining the results of images obtained from above with those from beneath provides additional information, first by exhibiting ranges of angles that differ between the two images, especially for reflectors at shallow depths, and second by providing additional constraints on the solutions of the Zoeppritz equations. Inversion of seismic data for impedance has become a standard part of the workflow for quantitative reservoir characterization. Inversion of crosswell data using either deterministic or geostatistical methods can, however, give poor results where phase changes occur beyond the critical angle, so simultaneous pre-stack inversion of partial angle stacks may be best conducted with angles restricted to less than critical. Deterministic inversion is designed to yield a single best-fit model of elastic properties, while geostatistical inversion produces multiple models (realizations) of elastic properties, lithology and reservoir properties. Geostatistical inversion produces results with far more detail than deterministic inversion, and the difference in detail becomes increasingly pronounced for thinner reservoirs, particularly those beyond the vertical resolution of the seismic. For any interface imaged from above and from beneath, the resulting AVA characters must arise from identical contrasts in elastic properties in the two sets of images, albeit in reverse order. An inversion approach that handles both datasets simultaneously, at pre-critical angles, is demonstrated in this work. The main exploration problem for carbonate reefs is determining the porosity distribution. Images of elastic properties, obtained from deterministic and geostatistical simultaneous inversion of a high-resolution crosswell seismic survey, were used to obtain the internal structure and reservoir properties (porosity) of a Niagaran Michigan reef. The images obtained are the best of any Niagaran pinnacle reef to date.
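
The critical angle that governs these wide-angle effects follows directly from Snell's law. A small sketch, with velocity values that are illustrative assumptions rather than values from the survey:

```python
import numpy as np

v1, v2 = 3500.0, 5500.0                   # m/s: P-wave velocity above/below the interface
theta_c = np.degrees(np.arcsin(v1 / v2))  # Snell's law: sin(theta_c) = v1 / v2
print(f"critical angle: {theta_c:.1f} degrees")  # reflections beyond this change phase
```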