680 results for Workflow Management
Abstract:
Proceedings paper published by the Society of American Archivists. Presented at the 2015 conference in Cleveland, OH (http://www2.archivists.org/proceedings/research-forum/2015/agenda#papers). Published by SAA in 2016.
Abstract:
The work outlined in this dissertation will allow biochemists and cellular biologists to characterize the polyubiquitin chains involved in their cellular environment by following a facile mass spectrometry-based workflow. The characterization of polyubiquitin chains has been of interest since their discovery in 1984. The profound effects of ubiquitination on the movement and processing of cellular proteins depend exclusively on the structures of the mono- and polyubiquitin modifications, anchored or unanchored, on the protein within the cellular environment. However, structure-function studies have been hindered by the difficulty of identifying complex chain structures, owing to the limited instrument capabilities of the past. Genetic mutations or reiterative immunoprecipitations have previously been used to characterize polyubiquitin chains, but their tedium makes it difficult to study a broad ubiquitinome. Top-down and middle-out mass spectrometry-based proteomic studies have been reported for polyubiquitin and have had success in characterizing parts of the chain, but no method to date has been successful at differentiating all theoretical ubiquitin chain isomers (chain lengths from dimer to tetramer alone have 1340 possible isomers). The workflow presented here can identify the chain length, topology and linkages present using a chromatographic-time-scale-compatible LC-MS/MS based workflow. To accomplish this feat, the strategy had to exploit the most recent advances in top-down mass spectrometry, including the advanced electron transfer dissociation (ETD) activation and sensitivity for large masses of the Orbitrap Fusion Lumos. The spectral interpretation had to be done manually, with the aid of a graphical interface to assign mass shifts, because of a lack of software capable of interpreting fragmentation across isopeptide linkages. However, the method outlined can be applied to any mass spectrometry-based system provided it yields extensive fragmentation across the polyubiquitin chain, making this method adaptable to future advances in the field.
Abstract:
Preserving the cultural heritage of the performing arts raises difficult and sensitive issues, as each performance is unique by nature and the juxtaposition between the performers and the audience cannot be easily recorded. In this paper, we report on an experimental research project to preserve another aspect of the performing arts—the history of their rehearsals. We have specifically designed non-intrusive video recording and on-site documentation techniques to make this process transparent to the creative crew, and have developed a complete workflow to publish the recorded video data and their corresponding meta-data online as Open Data using state-of-the-art audio and video processing to maximize non-linear navigation and hypervideo linking. The resulting open archive is made publicly available to researchers and amateurs alike and offers a unique account of the inner workings of the worlds of theater and opera.
Abstract:
Introduction: Bone scintigraphy is one of the most frequently performed examinations in Nuclear Medicine. This medical imaging modality requires an appropriate balance between image quality and radiation dose: the acquired images must contain the minimum number of counts needed to provide quality considered sufficient for diagnostic purposes. Objective: The main objective of this study is the application of the Enhanced Planar Processing (EPP) software to bone scintigraphy examinations of patients with breast and prostate carcinoma presenting bone metastases, in order to evaluate the performance of the EPP algorithm in clinical practice in terms of image quality and diagnostic confidence when the acquisition time is reduced by 50%. Material and Methods: This investigation took place in the department of Radiology and Nuclear Medicine of the Radboud University Nijmegen Medical Centre. Fifty-one patients with suspected bone metastases were administered 500 MBq of technetium-99m-labelled methylene diphosphonate. Each patient underwent two image acquisitions: the first followed the department's standard protocol (scan speed = 8 cm/min) and in the second the acquisition time was reduced by half (scan speed = 16 cm/min). The images acquired with the second protocol were processed with the EPP algorithm. All images underwent objective and subjective evaluation. For the subjective analysis, three Nuclear Medicine physicians evaluated the images in terms of lesion detectability, image quality, diagnostic acceptability, lesion localization and diagnostic confidence. For the objective evaluation, two regions of interest were selected, one located in the middle third of the femur and the other in the adjacent soft tissue, in order to obtain signal-to-noise ratio, contrast-to-noise ratio and coefficient of variation values (these metrics are illustrated in the sketch following this abstract). Results: The results show that the images processed with the EPP software offer physicians sufficient diagnostic information for the detection of metastases, as no statistically significant differences were found (p>0.05). Moreover, inter-observer agreement between these images and the images acquired with the standard protocol was 95% (k=0.88). Regarding image quality, on the other hand, statistically significant differences were found when the imaging modalities were compared with each other (p≤0.05). Regarding diagnostic acceptability, no statistically significant differences were found between the images acquired with the standard protocol and the images processed with the EPP software (p>0.05), with an inter-observer agreement of 70.6%. However, statistically significant differences were found between the images acquired with the standard protocol and the images acquired with the second protocol without EPP processing (p≤0.05). In addition, no statistically significant differences (p>0.05) were found in terms of signal-to-noise ratio, contrast-to-noise ratio and coefficient of variation between the images acquired with the standard protocol and the images processed with EPP.
Conclusion: From the results obtained in this study, it can be concluded that the EPP algorithm, developed by Siemens, offers the possibility of reducing the acquisition time by 50% while maintaining image quality considered sufficient for diagnostic purposes. Besides increasing patient satisfaction, the use of this technology is highly advantageous for the department's workflow.
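As an illustration of the objective metrics referred to in the abstract above, the sketch below computes a signal-to-noise ratio, contrast-to-noise ratio and coefficient of variation from two regions of interest. The variable names and the exact metric definitions are assumptions made for illustration and may differ from those used in the study.

import numpy as np

def roi_metrics(femur_roi, soft_tissue_roi):
    """Illustrative ROI statistics from pixel-count arrays drawn from the
    mid-femur and adjacent soft-tissue regions of interest."""
    mean_f, std_f = femur_roi.mean(), femur_roi.std()
    mean_s, std_s = soft_tissue_roi.mean(), soft_tissue_roi.std()
    snr = mean_f / std_s                 # signal relative to background noise
    cnr = (mean_f - mean_s) / std_s      # femur/soft-tissue contrast relative to background noise
    cv = std_f / mean_f                  # relative dispersion within the femur ROI
    return snr, cnr, cv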
Abstract:
Single-cell functional proteomics assays can connect genomic information to biological function through quantitative and multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past five years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single-cell functional proteomics, focusing on the development of single-cell barcode chips (SCBCs) with applications in fundamental and translational cancer research.
The microchip designed to simultaneously quantify a panel of secreted, cytoplasmic and membrane proteins from single cells is discussed first; it is the prototype for subsequent proteomic microchips with more sophisticated designs for preclinical cancer research or clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or defined numbers of cells, within microchambers, each of which is equipped with a large antibody microarray (the barcode), with between a few hundred and ten thousand microchambers included within a single microchip. Functional proteomics assays at single-cell resolution yield unique pieces of information that significantly shape thinking in cancer research. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, follows.
The SCBC is a powerful tool for resolving the functional heterogeneity of cancer cells. It has the capacity to extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We demonstrate this point by applying the SCBCs to investigate three isogenic cell lines of glioblastoma multiforme (GBM).
The cancer cell population is highly heterogeneous, with high-amplitude fluctuations at the single-cell level, which in turn confer robustness on the entire population. The concept of a stable population existing in the presence of random fluctuations is reminiscent of many physical systems that are successfully understood using statistical physics. Thus, tools derived from that field can be applied, using fluctuations to determine the nature of signaling networks. In the second part of the thesis, we focus on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia: single-cell proteomics assays coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics yield detailed and surprising predictions, which were found to be correct in both cell line and primary tumor models.
The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study GBM cancer cell resistance to molecular targeted therapy. Physical approaches to anticipate therapy resistance and to identify effective therapy combinations are discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to perturbation by a targeted inhibitor. Most signaling cascades are constituted by strongly coupled protein-protein interactions. A physical analogy of such a system is the strongly coupled atom-atom interactions in a crystal lattice. Just as the atomic interactions can be decomposed into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can be achieved by diagonalizing protein-protein correlation or covariance matrices to decompose the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e., independent signaling modes). By doing so, two independent signaling modes – one associated with mTOR signaling and a second associated with ERK/Src signaling – have been resolved, which in turn allow us to anticipate resistance and to design effective combination therapies, as well as to identify those therapies and therapy combinations that will be ineffective. We validated our predictions in mouse tumor models and all predictions were borne out. (A minimal numerical sketch of this matrix decomposition follows this abstract.)
In the last part, some preliminary results on the clinical translation of single-cell proteomics chips are presented. The successful demonstration of our work on human-derived xenografts provides the rationale to extend our current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could potentially yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians within a clinically relevant time scale. The technical challenges of the clinical translation are presented, and our solutions to address them are discussed as well. A clinical case study then follows, in which preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor are presented to demonstrate the general protocol and the workflow of the proposed clinical studies.
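A minimal numerical sketch of the mode decomposition described in the abstract above, assuming single-cell phosphoprotein measurements arranged as a cells-by-proteins matrix; the data, variable names, and use of a plain eigendecomposition of the covariance matrix are illustrative assumptions, not the thesis's actual analysis pipeline.

import numpy as np

# Hypothetical single-cell data: rows are cells, columns are measured phosphoproteins.
rng = np.random.default_rng(0)
levels = rng.random((500, 6))            # placeholder for SCBC protein measurements

cov = np.cov(levels, rowvar=False)       # protein-protein covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # diagonalize: each eigenvector is one candidate mode

# Sort modes by the amount of fluctuation they explain; the dominant eigenvectors are the
# independent signaling modes (linear combinations of the measured proteins).
order = np.argsort(eigvals)[::-1]
modes = eigvecs[:, order]
explained = eigvals[order] / eigvals.sum()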
Abstract:
Master's degree in Human Resource Management
Abstract:
A crosswell data set contains a range of angles limited only by the geometry of the source and receiver configuration, the separation of the boreholes and the depth to the target. However, the wide-angle reflections present in crosswell imaging result in amplitude-versus-angle (AVA) features not usually observed in surface data. These features include reflections from angles that are near critical and beyond critical for many of the interfaces; some of these reflections are visible only for a small range of angles, presumably near their critical angle. High-resolution crosswell seismic surveys were conducted over a Silurian (Niagaran) reef at two fields in northern Michigan, Springdale and Coldspring. The Springdale wells extended to much greater depths than the reef, and imaging was conducted from above and from beneath the reef. Combining the images obtained from above with those from beneath provides additional information: first, the ranges of angles exhibited are different for the two images, especially for reflectors at shallow depths; second, the combination provides additional constraints on the solutions of the Zoeppritz equations. Inversion of seismic data for impedance has become a standard part of the workflow for quantitative reservoir characterization. Inversion of crosswell data using either deterministic or geostatistical methods can, however, lead to poor results because of the phase change beyond the critical angle, so simultaneous pre-stack inversion of partial angle stacks may be best conducted with angles restricted to less than critical. Deterministic inversion is designed to yield only a single (best-fit) model of elastic properties, while geostatistical inversion produces multiple models (realizations) of elastic properties, lithology and reservoir properties. Geostatistical inversion produces results with far more detail than deterministic inversion, and the difference in detail between the two becomes increasingly pronounced for thinner reservoirs, particularly those beyond the vertical resolution of the seismic data. For any interface imaged from both above and beneath, the resulting AVA characters must arise from identical contrasts in elastic properties in the two sets of images, albeit in reverse order. An inversion approach that handles both datasets simultaneously, at pre-critical angles, is demonstrated in this work. The main exploration problem for carbonate reefs is determining the porosity distribution. Images of elastic properties, obtained from deterministic and geostatistical simultaneous inversion of a high-resolution crosswell seismic survey, were used to obtain the internal structure and reservoir properties (porosity) of a Niagaran Michigan reef. The images obtained are the best of any Niagaran pinnacle reef to date.
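For context on the angle dependence discussed above, the sketch below evaluates the standard Aki-Richards linearization of the Zoeppritz PP reflection coefficient. The linearization holds only for small contrasts at pre-critical angles, which is one reason near- and post-critical crosswell reflections are problematic for inversion. The interface properties in the example are illustrative placeholders, not values from the surveys.

import numpy as np

def aki_richards_pp(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
    """Linearized (Aki-Richards) PP reflection coefficient versus incidence angle.
    Valid only for small contrasts and pre-critical angles; near and beyond the
    critical angle the full Zoeppritz equations, with their phase changes, are required."""
    theta = np.radians(theta_deg)
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    k = (vs / vp) ** 2
    return (0.5 * (1 - 4 * k * np.sin(theta) ** 2) * drho / rho
            + dvp / (2 * vp * np.cos(theta) ** 2)
            - 4 * k * np.sin(theta) ** 2 * dvs / vs)

# Illustrative two-layer contrast, evaluated at angles well below critical.
print(aki_richards_pp(3000, 1500, 2400, 4500, 2500, 2650, np.array([0, 10, 20, 30])))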
Abstract:
Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified by using profiling tools. Hardware acceleration can yield significant performance improvements for highly mathematical calculations or repeated functions. The performance of SoC systems can then be improved if the hardware acceleration method is used to accelerate the element that incurs performance overheads. The concepts presented in this study can be easily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core. The hotspot function of the target application is identified by using critical attributes such as cycles per loop, loop rounds, etc. (2) A hardware acceleration method based on Field-Programmable Gate Arrays (FPGAs) is used to resolve system bottlenecks and improve system performance. The identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods – central bus design and co-processor design – are implemented for comparison in the proposed architecture. (3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed; the trade-off among these three factors is compared and balanced, and different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on the Integrated Circuit (IC) workflow, and hardware optimization techniques are used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
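The profile-then-accelerate strategy described above is governed by Amdahl's law: the whole-application speedup is bounded by the fraction of run time spent in the accelerated hotspot. The sketch below illustrates the relationship with made-up numbers, not figures from the thesis.

def overall_speedup(hotspot_fraction, accel_factor):
    """Amdahl's law: speedup of the whole application when only the hotspot,
    taking hotspot_fraction of the original run time, runs accel_factor times faster."""
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / accel_factor)

# Illustrative only: a hotspot consuming 70% of run time, accelerated 10x in FPGA logic,
# yields roughly a 2.7x whole-application speedup.
print(overall_speedup(0.7, 10.0))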
A Digital Collection Center's Experience: ETD Discovery, Promotion, and Workflows in Digital Commons
Abstract:
This presentation was given at the Digital Commons Southeastern User Group conference at Winthrop University, South Carolina on June 5, 2015. The presentation discusses how the digital collections center (DCC) at Florida International University uses Digital Commons as their tool for ingesting, editing, tracking, and publishing university theses and dissertations. The basic DCC workflow is covered as well as institutional repository promotion.
Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified against the partial order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method of mining Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable, as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to balance precision against coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
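For readers unfamiliar with the bug class the dissertation's prediction tool targets, the minimal example below (not drawn from the dissertation) shows an atomicity violation: a check-then-act sequence on a shared variable that is intended to be atomic but can be interleaved by a second thread.

import threading

balance = 100  # shared variable

def withdraw(amount):
    global balance
    # Check-then-act: intended as one atomic step, but another thread can run
    # between the check and the update, so both withdrawals may succeed.
    if balance >= amount:
        balance = balance - amount

t1 = threading.Thread(target=withdraw, args=(80,))
t2 = threading.Thread(target=withdraw, args=(80,))
t1.start(); t2.start(); t1.join(); t2.join()
print(balance)  # 20 under a serial interleaving, possibly -60 under the buggy one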
Abstract:
The Building Information Modeling (BIM) concept is able to reshape each AEC project and the industry in general, offering a comprehensive collaboration process over a model of the structure with regularly updated and synchronized information. This report presents an overview of BIM with a focus on its core concepts, applications across the project life cycle and benefits for project stakeholders, through four case studies carried out during an internship in the engineering office NEWTON - Engineering Consultancy Company. The aim of the four case studies was to cover multidisciplinary and varied projects. The first case study highlights the engineering project's workflow and presents a comparison of traditional procedures and BIM concepts applied to the rehabilitation of an existing building. In the second and third case studies, attention is focused on the goals achieved, particularly by the structural engineer, through the implementation of this technology in a full-lifecycle BIM project of a small residence and in a complex residential building project in Porto, and on its architectural integration. In the fourth case study, the spatial coordination of Mechanical, Electrical and Plumbing (MEP) systems in a large-scale hotel project is analyzed and accomplished, highlighting the merits of BIM at this project stage: by reducing the space used for facilities and infrastructure and by identifying conflicts and nullifying the related costs, its advantage for a complex building was demonstrated.
Abstract:
Introduction: There has been continuous development of new technologies in healthcare derived from national quality registries. However, this innovation needs to be translated into the workflow of healthcare delivery to enable children with long-term conditions to get the best possible support in managing their health during everyday life. Since children living with long-term conditions experience different levels of interference in their lives, healthcare professionals need to assess the impact of care on children's day-to-day lives as a complement to biomedical assessments. Aim: The overall aim of this thesis was to explore and describe the use of health-related quality of life (HRQOL) instruments in outpatient care for children with long-term conditions on the basis of a national quality registry system. Methods: The research was conducted using comparative, cross-sectional and explorative designs, and data were collected using different methods: the DISABKIDS Chronic Generic Measure-37 questionnaire, semi-structured interviews and video recordings of consultations. Altogether, 156 children (8-18 years) and nine healthcare professionals participated in the studies. Children with Type 1 Diabetes (T1D) (n=131) answered the DISABKIDS questionnaire, and children with rheumatic diseases, kidney diseases and T1D (n=25) were interviewed after their consultation at the outpatient clinic, where web-DISABKIDS had been used. In total, nine healthcare professionals used the HRQOL instrument as an assessment tool during encounters, which were video-recorded (n=21). Quantitative deductive content analysis was used to describe the content of different HRQOL instruments, statistical inference was used to analyse the DISABKIDS results, and qualitative content analysis was used to analyse the interviews and video recordings. Results: The findings showed that, from a biopsychosocial perspective, both generic and disease-specific instruments should be used to gain a comprehensive evaluation of the child's HRQOL. The DISABKIDS instrument is applicable for describing different aspects of health in children with T1D. When DISABKIDS was used in the encounters, children reported positive experiences of sharing their results with the healthcare professional. Different approaches by the healthcare professionals using DISABKIDS during the encounter led to different outcomes for the child: when an instructing approach is used, the child's ability to learn more about their health and how to improve it is limited; when an inviting or engaging approach is used, the child may become more involved in the conversation. Conclusions: It could be argued that HRQOL instruments should be used as a complement to biomedical variables to promote a biopsychosocial perspective on the child's health. According to the children in this thesis, feedback on their results after answering web-DISABKIDS is important, which implies that healthcare professionals need to prioritize time in the encounters for discussing the results from HRQOL instruments. If healthcare professionals involve the child in the discussion of the HRQOL results, misinterpreted answers can be corrected during the conversation. At the same time, this requires that healthcare professionals invite and engage the child.
Abstract:
Background: Physician-rating websites have become a popular tool to create more transparency about the quality of health care providers. So far, it remains unknown whether online rating websites have the potential to contribute to a better standard of care. Objective: Our goal was to examine which health care providers use online rating websites and for what purposes, and whether health care providers use online patient ratings to improve patient care. Methods: We conducted an online cross-sectional study by surveying 2360 physicians and other health care providers (September 2015). In addition to descriptive statistics, we performed multilevel logistic regression models to ascertain the effects of providers' demographics as well as report card-related variables on the likelihood that providers implement measures to improve patient care. Results: Overall, more than half of the responding providers surveyed (54.66%, 1290/2360) used online ratings to derive measures to improve patient care (implemented measures: mean 3.06, SD 2.29). Ophthalmologists (68%, 40/59) and gynecologists (65.4%, 123/188) were most likely to implement any measures. The most widely implemented quality measures were related to communication with patients (28.77%, 679/2360), the appointment scheduling process (23.60%, 557/2360), and office workflow (21.23%, 501/2360). Scaled-survey results had a greater impact on deriving measures than narrative comments. Multilevel logistic regression models revealed medical specialty, the frequency of report card use, and the appraisal of the trustworthiness of scaled-survey ratings to be significant predictors of implementing measures to improve patient care in response to online ratings. Conclusions: Our results suggest that online ratings displayed on physician-rating websites have an impact on patient care. Despite the limitations of our study and the unintended consequences of physician-rating websites, they may still have the potential to improve patient care.
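The multilevel logistic regression mentioned above can be written, in a generic form that is an assumption rather than the authors' exact specification, as a logistic model with provider-level predictors and a random intercept for the grouping level (e.g., medical specialty):

\operatorname{logit}\, P(\text{implement}_{ij} = 1) = \beta_0 + \boldsymbol{\beta}^{\top} \mathbf{x}_{ij} + u_j, \qquad u_j \sim \mathcal{N}(0, \sigma_u^2)

where \mathbf{x}_{ij} collects the demographic and report card-related variables for provider i in group j, and u_j is the group-level random effect.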
Abstract:
This descriptive study reviews 68 articles from 11 Latin American countries in order to present an overview of the organizational landscape with respect to organizational culture and leadership in the region and how it has evolved over time. The methodology was based on a frequency count using the leadership and organizational culture model of Bass and Avolio (Bass, 1999), which allowed the information found in the literature review to be organized into three leadership styles, each associated with its beliefs, the latter taken as variables of organizational culture. Different trends were found in the types of leadership implemented in organizations and in the organizational culture adopted. The need for deeper, empirical research on this topic is raised, so that the transformations that have taken place in the organizational context become known and the impact is greater in the near future.
Abstract:
The service station "La Americana S.A.S" was one of the first service stations in Bucaramanga, dedicated to the sale and distribution of gasoline as well as of spare parts and accessories for vehicles. For this reason, from its founding to the present day it has provided an outstanding, effective service, always meeting market demand in proportion to consumption. This experience in the sector has meant that, over the years, various entities, mostly private and governmental, have sought agreements and organizational negotiations with the station; such strategic alliances and negotiations are of great importance because they give La Americana due recognition and provide the organization with regular, fixed income. As a result of these agreements, most of the station's sales come from purchases made by vehicles belonging to the governmental and private entities under contract; it is important to note that all of these sales are made on credit, so payment for the service is received one or two months after it is provided. At the same time, consumption by private cars has declined sharply over time due to various internal and external factors (geographic, competitive and process-related), which is one reason why total sales and subsequent profits have not met expectations. If this continues, the company could in the future face serious problems affecting its participation in this market. The heavy dependence on credit sales (public and private entities) and the continuing decline in cash sales (private customers) have for some time been causing La Americana low financial liquidity and low inventory turnover, as well as a considerable reduction in profits. We therefore believe that implementing a marketing model, together with the creation of a system for counting and supervising inventories, will help La Americana overcome this small crisis and remain a sustainable company in the coming years.