562 results for Workflow
Abstract:
Service integration from the perspective of citizens and businesses, together with the need to guarantee characteristics of Public Administration such as versatility and competitiveness, places constraints on the design of service integration architectures. To integrate services in a way that preserves the mutability of Public Administration, workflows must be created dynamically. However, the dynamic creation of workflows raises security concerns, namely regarding the privacy of the results produced during workflow execution and the enforcement of policies by which the various executors control their participation in the workflow. In this work we present a set of principles and rules (an architecture) that enables the creation and execution of dynamic workflows, resolving these issues through a security model. The architecture uses service composition to build complex services that may embody a dynamic workflow, and it relies on a paradigm of standard message exchange between the service providers involved in a dynamic workflow. The proposed security model is closely tied to the set of messages defined in the architecture. As part of this work, several service integration architectures and/or platforms were identified and analysed, with the aim of identifying those that allow the creation of dynamic workflows and, among these, those that provide privacy mechanisms for results and participation control for the executors of such workflows. The integration architecture we present is versatile and scalable, allows concurrent service provision among service providers, and supports the creation of dynamic workflows. It lets the entities executing a workflow decide on their own participation, on the participation of third parties (to whom they delegate services), and to whom they deliver the results. Participants are accredited by certification authorities recognised by the other participants, and the credentials issued by these authorities are the starting point for enforcing security policies within the architecture. To validate the proposed architecture, several use cases were identified that illustrate the need to build dynamic workflows for complex services (not provided in full by a single entity), and these were implemented in a prototype of the architecture developed for this purpose. This experimentation showed that the architecture is suitable for delivering such services using dynamic workflows and that, during their execution, executors have adequate security mechanisms to control their own participation, the participation of third parties, and the privacy of the results produced.
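The thesis itself is text only; as a loose sketch of the credential-based participation control described above (all class and function names here are invented for illustration, not taken from the work), an executor's decision logic could look like this in Python:

# Illustrative sketch only: Credential, ParticipationPolicy and the two
# predicates are hypothetical names, not the thesis's actual design.
from dataclasses import dataclass

@dataclass
class Credential:
    subject: str   # the service provider presenting the credential
    issuer: str    # the certification authority that issued it

@dataclass
class ParticipationPolicy:
    trusted_issuers: set       # certification authorities this executor recognises
    delegable_partners: set    # third parties to whom services may be delegated
    result_recipients: set     # parties allowed to receive produced results

def may_participate(policy: ParticipationPolicy, requester: Credential) -> bool:
    # An executor joins a dynamic workflow only if the requester's credential
    # comes from a certification authority it recognises.
    return requester.issuer in policy.trusted_issuers

def may_delegate(policy: ParticipationPolicy, partner: Credential) -> bool:
    # Delegation requires both a recognised issuer and an explicitly
    # allowed partner.
    return (partner.issuer in policy.trusted_issuers
            and partner.subject in policy.delegable_partners)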
Abstract:
As a Linked-Data-based system, the research information system VIVO makes it possible to reuse data from other sources. In practice, this can run into conversion problems. Data are often available only in tabular form, e.g. as CSV files. Various tools exist for converting such data, but many of them either require special technical environments (often Linux systems) or are very demanding to operate. This article describes a workflow for converting data from GeoNames for VIVO using Google Refine.
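The workflow in the article is built with Google Refine; as an aside for readers who prefer scripting, a rough Python equivalent using rdflib might look as follows. The CSV column names and the choice of the wgs84/gn vocabularies are assumptions for illustration, and the exact target properties depend on the VIVO ontology version in use.

# Scripted alternative to the Google Refine workflow (sketch only).
import csv
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, RDFS

WGS84 = Namespace("http://www.w3.org/2003/01/geo/wgs84_pos#")
GN = Namespace("http://www.geonames.org/ontology#")

g = Graph()
g.bind("wgs84", WGS84)
g.bind("gn", GN)

# GeoNames dumps carry columns such as geonameid, name, latitude,
# longitude; header names are assumed here for readability.
with open("geonames.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        uri = URIRef("http://sws.geonames.org/%s/" % row["geonameid"])
        g.add((uri, RDF.type, GN.Feature))
        g.add((uri, RDFS.label, Literal(row["name"])))
        g.add((uri, WGS84.lat, Literal(row["latitude"])))
        g.add((uri, WGS84.long, Literal(row["longitude"])))

g.serialize(destination="geonames_vivo.rdf", format="xml")  # RDF/XML for VIVO ingest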
Abstract:
Master's thesis in Bioinformatics and Computational Biology (Bioinformatics), presented to the Universidade de Lisboa through the Faculdade de Ciências, 2014.
Abstract:
This paper describes the impact of cloud computing and the use of GPUs on the performance of Autodock and Gromacs respectively. Cloud computing was applicable to reducing the "tail" seen when running Autodock on desktop grids, and the GPU version of Gromacs showed significant improvement over the CPU version. A large library of small molecules (200,000 compounds), seven sialic acid analogues of the putative substrate, and 8000 sugar molecules were converted into pdbqt format and used to interrogate the Trichomonas vaginalis neuraminidase using Autodock Vina. Good binding energy was noted for some of the small molecules (approximately -9 kcal/mol), but the sugars bound with an affinity of less than -7.6 kcal/mol. Screening of the sugar library produced a "top hit" with α-2,3-sialyllacto-N-fucopentaose III, a derivative of the sialyl Lewis x structure and a known substrate of the enzyme; indeed, 8 of the top 100 hits were related to this structure. A comparison of Autodock Vina and Autodock 4.2 was made for the high-affinity small molecules; in some cases the results were superimposable, whereas in others the match was less good. Validation of this work will require extensive "wet lab" experiments to determine the utility of the workflow in predicting potential enzyme inhibitors.
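As an illustration of the screening step described above, a minimal batch-docking loop around the AutoDock Vina command line could look like the sketch below. The receptor file name, search-box centre and size are placeholders, not values from the study.

# Batch virtual screening sketch around the Vina CLI (placeholder paths).
import glob
import re
import subprocess

def dock(ligand, receptor="neuraminidase.pdbqt"):
    out = ligand.replace(".pdbqt", "_out.pdbqt")
    subprocess.run(
        ["vina", "--receptor", receptor, "--ligand", ligand,
         "--center_x", "0", "--center_y", "0", "--center_z", "0",
         "--size_x", "20", "--size_y", "20", "--size_z", "20",
         "--out", out],
        check=True, capture_output=True, text=True,
    )
    # Vina writes the best pose first; its score sits in a REMARK line.
    with open(out) as f:
        for line in f:
            m = re.search(r"REMARK VINA RESULT:\s+(-?\d+\.\d+)", line)
            if m:
                return float(m.group(1))  # binding energy in kcal/mol

scores = {lig: dock(lig) for lig in glob.glob("library/*.pdbqt")}
top_hits = sorted(scores.items(), key=lambda kv: kv[1])[:100]  # most negative first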
Abstract:
A vast majority of scientific grid applications are either parameter sweep applications, or a significant part of them belongs to the class of parameter sweep activities. The paper describes a new graphical workflow language in which any node of the DAG-based workflow can be a parameter sweep node; the execution of these nodes is transparently carried out either in service grids or in desktop grids, depending on the computational complexity of the workflow node. The new concept is supported by the CancerGrid portal, which has been established for a chemist community.
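To make the parameter sweep concept concrete, the following sketch (not CancerGrid portal code; all names are invented) expands a sweep node into independent jobs and routes each one to a service grid or a desktop grid by a simple complexity threshold:

# Parameter sweep expansion and grid routing (illustrative only).
import itertools

def expand_sweep(node_cmd, parameters):
    # parameters such as {"dose": [1, 2, 4], "temp": [300, 310]}
    # expand to the cartesian product of all values.
    keys = list(parameters)
    for values in itertools.product(*(parameters[k] for k in keys)):
        yield node_cmd.format(**dict(zip(keys, values)))

def route(job, estimated_cpu_hours, threshold=1.0):
    # Heavier jobs go to the service grid, light ones to the desktop grid.
    grid = "service_grid" if estimated_cpu_hours > threshold else "desktop_grid"
    return grid, job

jobs = expand_sweep("simulate --dose {dose} --temp {temp}",
                    {"dose": [1, 2, 4], "temp": [300, 310]})
placements = [route(j, estimated_cpu_hours=0.5) for j in jobs]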
Abstract:
Researchers want to run scientific experiments focused on their disciplines. They do not want to know how and where the experiments are executed. Science gateways hide these details by coordinating the execution of experiments across different infrastructures and workflow systems. The ER-flow/SHIWA and SCI-BUS projects developed repositories to share artefacts such as applications, portlets, and workflows inside and among research communities. Sharing artefacts in repositories enables gateway developers to reuse them when building a new gateway and/or creating a new application.
Abstract:
Researchers want to analyse Health Care data, which may require large pools of compute and data resources. To obtain these they need access to Distributed Computing Infrastructures (DCIs), and using them requires expertise that researchers may not have. Workflows can hide the infrastructures, but the many existing workflow systems are not interoperable. Learning a workflow system and creating workflows in it may require significant effort; given this effort, it is not reasonable to expect researchers to learn new workflow systems simply to run workflows built in other systems. As a result, the lack of interoperability prevents workflow sharing, and a vast amount of research effort is wasted. The FP7 Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs (SHIWA) project developed Coarse-Grained Interoperability (CGI) to enable workflow sharing, and created the SHIWA Simulation Platform (SSP) to support CGI as a production-level service. The paper describes how the CGI approach can be used for analysis and simulation in Health Care.
Abstract:
Existing Workflow Management Systems (WFMSs) follow a pragmatic approach. They often use a proprietary modelling language with an intuitive graphical layout, but the underlying semantics lack a formal foundation. As a consequence, analysis issues such as proving correctness (i.e. soundness and completeness) and reliable execution are not supported at design level. This project will use an applied-ontology approach, formally defining key terms such as process, sub-process, and action/task on the basis of formal temporal theory. Current business process modelling (BPM) standards such as Business Process Modelling Notation (BPMN) and the Unified Modelling Language (UML) Activity Diagram (AD) model their constructs with no logical basis. This investigation will contribute to research and industry by providing a framework that grounds BPM, so that a correct business process (BP) can be represented and reasoned about. This is missing in the current BPM domain, and the framework may reduce design costs and avert the burden of redundant terms used by the current standards. A graphical tool will be introduced that implements the formal ontology defined in the framework; it can be used as a modelling tool and will at the same time serve to validate the model. The research will also fill the existing gap by providing a unified graphical representation of a BP in a logically consistent manner for the mainstream modelling standards in business and IT. A case study will analyse a catalogue of existing 'patient pathways' (i.e. processes) of King's College Hospital NHS Trust, including current performance statistics. Following application of the framework, a mapping will be conducted and new performance statistics collected. A cost/benefit analysis report will be produced comparing the results of the two approaches.
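As a rough illustration of the structural side of soundness mentioned above (the project's formal temporal ontology is far richer than this), the sketch below checks that every task in a process graph is reachable from the start and can reach the end; the example 'patient pathway' graph is invented:

# Structural soundness check on a process graph (illustrative only).
from collections import defaultdict

def reachable(edges, source):
    seen, stack = set(), [source]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(edges.get(n, ()))
    return seen

def is_sound(nodes, edges, start, end):
    forward = reachable(edges, start)        # reachable from the start
    reverse = defaultdict(list)
    for u, vs in edges.items():
        for v in vs:
            reverse[v].append(u)
    backward = reachable(reverse, end)       # can reach the end
    return all(n in forward and n in backward for n in nodes)

edges = {"start": ["register"], "register": ["triage"], "triage": ["end"]}
print(is_sound({"start", "register", "triage", "end"}, edges, "start", "end"))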
Abstract:
We present a novel data analysis strategy which, combined with subcellular fractionation and liquid chromatography-mass spectrometry (LC-MS) based proteomics, provides a simple and effective workflow for global drug profiling. Five subcellular fractions were obtained by differential centrifugation, followed by high-resolution LC-MS and complete functional regulation analysis. The methodology combines functional regulation and enrichment analysis into a single visual summary, enabling improved insight into the perturbations caused by drugs. We provide a statistical argument that even crude subcellular fractions lead to improved functional characterization. We demonstrate the strategy on data obtained in an MS-based global drug profiling study, but it can also be applied to other types of large-scale biological data.
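As an illustration of the enrichment component of such a workflow, the sketch below applies a standard hypergeometric over-representation test with SciPy; the counts are placeholders, not data from the study:

# Hypergeometric enrichment test for a functional category (sketch).
from scipy.stats import hypergeom

def enrichment_p(total, category, regulated, overlap):
    # P(X >= overlap) when drawing `regulated` proteins out of `total`,
    # of which `category` belong to the functional category of interest.
    return hypergeom.sf(overlap - 1, total, category, regulated)

# e.g. 5000 quantified proteins, 200 in the category, 400 regulated by
# the drug, 40 of those falling in the category:
print(enrichment_p(5000, 200, 400, 40))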
Abstract:
This paper presents work in progress to develop an efficient and economic way to produce Technetium-99 metastable (99mTc) directly using low-energy cyclotrons. Its importance is well established and relates to the growing worldwide difficulty of delivering 99mTc to the Nuclear Medicine Departments that rely on this radioisotope. Since the present delivery strategy has clearly demonstrated its intrinsic limits, our group decided to follow a distinct approach that exploits the broad distribution of low-energy cyclotrons and the accessibility of Molybdenum-100 (100Mo) as the Target material. This is an important point, since the system presented here, named CYCLOTECH, is not based on the use of Highly Enriched (or even Low Enriched) Uranium-235 (235U) and so complies entirely with current international trends and directives concerning the use of this potentially highly critical material. The production technique is based on the nuclear reaction 100Mo (p,2n) 99mTc, whose production yields have already been documented. To date, two patent applications have been submitted (the first at the INPI in Portugal, the second at the USPTO in the USA); others are being prepared for submission in the near future. The aim of the CYCLOTECH system is to present 99mTc to Nuclear Medicine radiopharmacists in a routine, reliable and efficient manner that remains flexible and blends entirely with established protocols. To facilitate workflow and Radiation Protection measures, a Target Station has been developed that can be installed on most existing PET cyclotrons and that will tolerate up to 400 μA of beam by allowing the beam to strike the Target material at a suitably oblique angle. The Target Station permits the remote and automatic loading and discharge of the Targets from a carriage of 10 Target bodies. In addition, several methods of Target material deposition and several Target substrates are presented. The aim was to create a cost-effective means of depositing the Target material at intermediate thicknesses (25-100 μm), with minimal loss, on a substrate able to easily transport the heat associated with high beam currents. Finally, the separation techniques presented combine physical and column chemistry; the object was to extract and deliver 99mTc in the identical form now used in radiopharmacies worldwide. In addition, the Target material is recovered and can be recycled.
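As a small worked example of the distribution logistics the paper motivates, the decay arithmetic for cyclotron-produced 99mTc (half-life about 6.01 h) is sketched below; the activity figures are placeholders, not CYCLOTECH data:

# Decay correction for 99mTc between end of bombardment and delivery.
import math

T_HALF_99MTC_H = 6.01  # hours

def activity_remaining(a0_gbq, hours):
    # A(t) = A0 * exp(-ln(2) * t / T_half)
    return a0_gbq * math.exp(-math.log(2) * hours / T_HALF_99MTC_H)

# e.g. activity left 3 hours after end of bombardment:
print(round(activity_remaining(100.0, 3.0), 1), "GBq")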
Abstract:
Learning and teaching processes, like all human activities, can be mediated through the use of tools. Information and communication technologies are now widespread within education. Their use in the daily life of teachers and learners affords engagement with educational activities at any place and time and not necessarily linked to an institution or a certificate. In the absence of formal certification, learning under these circumstances is known as informal learning. Despite the lack of certification, learning with technology in this way presents opportunities to gather information about and present new ways of exploiting an individual’s learning. Cloud technologies provide ways to achieve this through new architectures, methodologies, and workflows that facilitate semantic tagging, recognition, and acknowledgment of informal learning activities. The transparency and accessibility of cloud services mean that institutions and learners can exploit existing knowledge to their mutual benefit. The TRAILER project facilitates this aim by providing a technological framework using cloud services, a workflow, and a methodology. The services facilitate the exchange of information and knowledge associated with informal learning activities ranging from the use of social software through widgets, computer gaming, and remote laboratory experiments. Data from these activities are shared among institutions, learners, and workers. The project demonstrates the possibility of gathering information related to informal learning activities independently of the context or tools used to carry them out.
Abstract:
Final Master's project submitted to obtain the degree of Master in Communication Networks and Multimedia Engineering.
Abstract:
Project work presented to the Escola Superior de Comunicação Social as part of the requirements for the degree of Master in Audiovisual and Multimedia.
Abstract:
Dissertation submitted to the Escola Superior de Teatro e Cinema in fulfilment of the requirements for the degree of Master in Cinematic Project Development - Post-Production Technologies.