490 results for Workflow


Relevance: 20.00%

Abstract:

Digital rock physics combines modern imaging with advanced numerical simulations to analyze the physical properties of rocks. In this paper we suggest a special segmentation procedure, applied to a carbonate rock from Switzerland. The starting point is a CT scan of a specimen of Hauptmuschelkalk. The first step applied to the raw image data is a non-local means filter. We then apply different thresholds to identify pore and solid phases. Because we are aware of a non-negligible amount of unresolved microporosity, we also define intermediate phases. Based on this segmentation, we determine porosity-dependent values for the P-wave velocity and for the permeability. The porosity measured in the laboratory is then used to compare our numerical data with experimental data, and we observe a good agreement. Future work includes an analytic validation of the numerical results for the P-wave velocity upper bound, employing different filters for the image segmentation, and using data with higher resolution.
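
The described pipeline (non-local means filtering followed by thresholding into pore, intermediate and solid phases) can be sketched in a few lines of Python. The filter parameters, the input file name and the 50% weight given to the intermediate phase below are illustrative assumptions, not values from the paper:

    # Sketch of the segmentation steps on a single 2D slice, using scikit-image.
    import numpy as np
    from skimage import io, restoration, filters

    ct = io.imread("ct_slice.tif").astype(np.float64)   # hypothetical input file
    ct /= ct.max()                                      # normalize to [0, 1]

    # Step 1: non-local means filter on the raw image data.
    denoised = restoration.denoise_nl_means(ct, patch_size=5,
                                            patch_distance=6, h=0.05)

    # Step 2: two thresholds separate pore, intermediate (unresolved
    # microporosity) and solid phases.
    t_low, t_high = filters.threshold_multiotsu(denoised, classes=3)
    phases = np.digitize(denoised, bins=[t_low, t_high])  # 0=pore, 1=intermediate, 2=solid

    # Assign an assumed 50% porosity to the intermediate phase.
    porosity = (phases == 0).mean() + 0.5 * (phases == 1).mean()
    print(f"estimated porosity: {porosity:.3f}")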

Relevance: 20.00%

Abstract:

The Many-Body Perturbation Theory approach is among the most successful theoretical frameworks for the study of excited-state properties. It makes it possible to describe excitonic interactions, which play a fundamental role in the optical response of insulators and semiconductors. The first part of the thesis focuses on the quasiparticle, optical and excitonic properties of bulk Transition Metal Oxide (TMO) perovskites, using a G0W0 + Bethe-Salpeter Equation (BSE) approach. A representative set of 14 compounds has been selected, including 3d, 4d and 5d perovskites. An approximation of the BSE scheme, based on an analytic diagonal expression for the inverse dielectric function, is used to compute the exciton binding energies and is carefully benchmarked against the standard BSE results. In 2019 an important breakthrough was achieved with the synthesis of ultrathin SrTiO3 films down to the monolayer limit. This allows us to explore how the quasiparticle and optical properties of SrTiO3 evolve from the bulk to the two-dimensional limit. The electronic structure is computed with the G0W0 approach: we show that the inclusion of the off-diagonal self-energy terms is required to avoid unphysical band dispersions. The excitonic properties are investigated beyond the optical limit, at finite momenta. Lastly, a study of the optical response of the topological nodal-line semimetal ZrSiS under pressure is presented, in conjunction with experimental results from the group of Prof. Dr. Kuntscher at the University of Augsburg. The second part of the thesis discusses the implementation of a workflow to automate G0W0 and BSE calculations with the VASP software. The workflow adopts a convergence scheme based on an explicit basis-set extrapolation approach [J. Klimeš et al., Phys. Rev. B 90, 075125 (2014)], which reduces the number of intermediate calculations required to reach convergence and yields an explicit estimate of the error associated with the basis-set truncation.
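
The extrapolation idea behind such a convergence scheme can be illustrated with a short sketch: assuming quasiparticle energies converge linearly in the inverse basis-set size, a few moderate-size calculations fix the infinite-basis limit and the truncation error. The numbers below are invented for illustration, not results from the thesis:

    # Basis-set extrapolation sketch: fit E(N) ~ E_inf + a/N.
    import numpy as np

    n_basis = np.array([400, 800, 1600])    # basis-set sizes (illustrative)
    e_qp    = np.array([3.42, 3.30, 3.24])  # QP gaps in eV (illustrative)

    a, e_inf = np.polyfit(1.0 / n_basis, e_qp, deg=1)
    print(f"extrapolated E_inf: {e_inf:.3f} eV")
    print(f"estimated truncation error at N=1600: {abs(e_qp[-1] - e_inf):.3f} eV")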

Relevance: 20.00%

Abstract:

LHC experiments produce an enormous amount of data, estimated at the order of a few petabytes per year. Data management takes place on the Worldwide LHC Computing Grid (WLCG) infrastructure, both for storage and for processing. In recent years, however, many more resources have become available on High Performance Computing (HPC) farms, which generally have many computing nodes with a high number of processors. Large collaborations are working to use these resources as efficiently as possible, within the constraints imposed by the computing models (data distributed on the Grid, authentication, software dependencies, etc.). The aim of this thesis project is to develop a software framework that allows users to process a typical data analysis workflow of the ATLAS experiment on HPC systems. The developed analysis framework is to be deployed on the computing resources of the Open Physics Hub project and on the CINECA Marconi100 cluster, in view of the switch-on of the Leonardo supercomputer, foreseen for 2023.
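
The kind of glue such a framework needs can be pictured with a minimal sketch: wrapping one analysis task as a batch job for a Slurm-managed cluster such as Marconi100. The partition name, resource requests and analysis command below are placeholders, not the actual framework:

    # Submit a single (hypothetical) ATLAS analysis task as a Slurm job.
    import subprocess, textwrap

    job = textwrap.dedent("""\
        #!/bin/bash
        #SBATCH --job-name=atlas-analysis
        #SBATCH --partition=some_partition
        #SBATCH --nodes=1 --ntasks=32 --time=04:00:00
        srun ./run_analysis --input data/input.root --output out.root
        """)

    with open("job.sh", "w") as f:
        f.write(job)
    subprocess.run(["sbatch", "job.sh"], check=True)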

Relevance: 20.00%

Abstract:

Modern society faces significant difficulties in attempting to preserve its architectural heritage, and numerous challenges consequently arise in documentation, preservation and restoration. Fortunately, new perspectives on architectural heritage are emerging owing to the rapid development of digitalization, which presents new challenges for architects, restorers and specialists and has changed the way they approach the study of existing heritage, moving from conventional 2D drawings towards the increasingly required 3D representations. Recently, Building Information Modelling for historic buildings (HBIM) has emerged as a trend for interconnecting geometric and informational data. The latest 3D geomatics techniques, based on 3D laser scanning enhanced with photogrammetry, together with the continuous improvement of the BIM industry, now allow an enhanced 3D digital reconstruction of historical and existing buildings. This research study aimed to develop an integrated workflow for the 3D digital reconstruction of heritage buildings starting from a point cloud. The Pieve of San Michele in Acerboli's Church in Santarcangelo di Romagna (6th century) served as the test bed. The point cloud was used as the essential reference for modelling the BIM geometry in Autodesk Revit® 2022. To validate the accuracy of the model, a deviation analysis was performed in the CloudCompare software to determine the degree of deviation between the HBIM model and the point cloud. The findings showed a very promising average distance between the HBIM model and the point cloud, demonstrating the viability of producing an accurate BIM geometry from point clouds.
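
The deviation-analysis step can be sketched as follows. The study used CloudCompare; the open3d-based snippet below is only an illustration of the same cloud-to-model distance computation, with placeholder file names:

    # Cloud-to-model deviation sketch with open3d.
    import numpy as np
    import open3d as o3d

    scan  = o3d.io.read_point_cloud("survey_point_cloud.ply")   # placeholder
    model = o3d.io.read_triangle_mesh("hbim_model.obj")         # placeholder

    # Sample the BIM surface so both datasets are point sets, then take the
    # nearest-neighbour distance from each scan point to the model.
    model_pts = model.sample_points_uniformly(number_of_points=500_000)
    dev = np.asarray(scan.compute_point_cloud_distance(model_pts))

    print(f"mean deviation : {dev.mean() * 1000:.1f} mm")
    print(f"95th percentile: {np.percentile(dev, 95) * 1000:.1f} mm")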

Relevance: 20.00%

Abstract:

The furniture sector is fragmented, a characteristic that has never allowed full digitalization of its segments: communication has often been difficult, responsible for misunderstandings and frequent delays, with a consequent waste of resources. In this thesis project I analyze mainly the upholstered-furniture supply chain and the textile chain connected to it. My target is furniture manufacturers that use configurators to present their products to customers. I propose a workflow divided into two phases: one feasible in the present, and a second involving a change achievable in the near future. For the first phase I designed an agile workflow to digitize textures starting from photographic acquisitions. By analyzing the behaviour of various fabrics and developing a method of digital reproduction, I tested a methodology for an accurate, photorealistic surface rendering of upholstered furniture, both on web platforms and in augmented reality. The study arises from the need to improve current configurators and make them genuinely useful for their intended purpose: simplifying the purchase process by meeting users' need to preview the final appearance of their furniture, placed in their living space. The resulting configurator returns a faithful visualization of the fabrics, including in augmented reality. Artificial intelligence will support the user's decision process, and the choice of the materials composing the furniture will be guided so as to give the user a sense of awareness of the environmental impacts. For the second phase I propose a fully digital fabric-creation process, in favour of a sustainable supply chain that can thereby cut costs and waste, with a consequent saving of resources.

Relevance: 20.00%

Abstract:

A witness to industrialization in Bologna since its first generation was built in the late 1760s, the Battiferro lock accompanied the innovation the city experienced over the centuries, until it lost its function when technological development led Bologna's canals to be progressively covered from the 1950s onwards, under Giuseppe Dozza's administration, as part of the reconstruction, reclamation and urban requalification carried out across the whole city in the aftermath of World War II. The primary interest of the research on this case study was to bring this still-intact landmark into what is considered the fourth generation of the industrial revolution in the construction field, known as Construction 4.0, by means of Historic (or Heritage) Building Information Modelling (HBIM) and Virtual Reality (VR). A scan-to-BIM approach was followed to create a 3D as-built BIM model, as a first step towards the storytelling of the abandoned industrial built asset in a VR environment, and as a seed for future applications such as Digital Twins (DT), digital heritage learning, sustainability impact studies, or interfacing with other systems such as GIS. Based on the HBIM product, examples of primary BIM deliverables such as 2D layouts are given; a workflow to VR is then proposed, the reliability of the data and the types of users who may benefit from the VR experience are examined, and the potential future development of the model is investigated, with a comparison to a broadly similar experience in the UK.

Relevance: 10.00%

Abstract:

This document records the process of migrating eprints.org data to a Fez repository. Fez is a Web-based digital repository and workflow management system based on Fedora (http://www.fedora.info/). At the time of migration, the University of Queensland Library was using EPrints 2.2.1 [pepper] for its ePrintsUQ repository. Once we began to develop Fez, we did not upgrade to later versions of the eprints.org software, since we knew we would be migrating data from ePrintsUQ to the Fez-based UQ eSpace. Since this document records our experiences of migration from an earlier version of eprints.org, anyone seeking to migrate eprints.org data into a Fez repository might encounter some small differences. Moving UQ publication data from an eprints.org repository into a Fez repository (hereafter called UQ eSpace, http://espace.uq.edu.au/) was part of a plan to integrate metadata (and, in some cases, full texts) about all UQ research outputs, including theses, images, multimedia and datasets, in a single repository. This tied in with the plan to identify and capture the research output of a single institution, the main task of the eScholarshipUQ testbed for the Australian Partnership for Sustainable Repositories project (http://www.apsr.edu.au/). The migration could not occur at UQ until the functionality in Fez was at least equal to that of the existing ePrintsUQ repository. Accordingly, as Fez development proceeded throughout 2006, a list of eprints.org functionality not yet supported in Fez was maintained so that the corresponding development could be planned and implemented.
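
At the heart of any such migration is a metadata crosswalk from the source schema to the target schema. The sketch below is purely illustrative; the field names are hypothetical and do not reproduce the actual EPrints or Fez schemas:

    # Hypothetical crosswalk: eprints.org record -> Dublin-Core-style dict.
    EPRINTS_TO_DC = {
        "title":    "dc:title",
        "creators": "dc:creator",
        "date":     "dc:date",
        "abstract": "dc:description",
        "type":     "dc:type",
    }

    def crosswalk(eprint_record: dict) -> dict:
        """Translate one eprints.org record for ingest into the new repository."""
        return {dc: eprint_record[src]
                for src, dc in EPRINTS_TO_DC.items()
                if src in eprint_record}

    record = {"title": "Sample thesis", "date": "2006", "type": "thesis"}
    print(crosswalk(record))   # {'dc:title': 'Sample thesis', ...}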

Relevance: 10.00%

Abstract:

The current development of parallel, compute-intensive applications (HPC, High Performance Computing) hosted on cluster computers relies heavily on the message-passing model, where the standardization efforts stand out, e.g. MPI (Message Passing Interface). On the other hand, with the spread of the object-oriented programming paradigm to distributed environments (Java RMI, .NET Remoting), it becomes possible to regard the execution of a parallel, compute-intensive application as decomposable into several parallel execution flows, where each flow consists of one or more tasks executed in the context of distributed objects. Normally, in environments based on distributed objects, the specification, control and synchronization of the various parallel execution flows is done explicitly, hard-coded in a main program, which hinders later, often necessary, modifications. There are, however, works in this context that propose a decomposition approach following the workflow paradigm, with interactions between tasks expressed by, among others, data-flow, control-flow and finite-state-machine mechanisms. This work proposes and explores a model for the execution, synchronization and control of multiple tasks that makes it possible to design compute-intensive applications flexibly, taking advantage of the parallel execution of tasks on different machines. The proposed model, and its implementation in an experimental prototype, make it possible to specify applications as execution flows, to submit flows for execution, and to control and monitor the execution of those flows. The tasks involved in the execution flows can run on a set of distributed resources. The main features of the proposed model are its extensibility and the decoupling between the different components involved in executing the flows. Test cases that validated the model and the implemented prototype are also described. Aware of the need to pursue this line of research in the future, this work is a contribution towards demonstrating that the workflow paradigm is suitable for expressing and executing complex, compute-intensive applications in a parallel and distributed way.
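
The data-flow interaction style mentioned above can be illustrated with a minimal, single-machine sketch in Python: each task declares its inputs and runs as soon as they are ready. The distributed-object layer (Java RMI, .NET Remoting) and the monitoring components of the actual model are out of scope here:

    # Toy data-flow workflow: a task starts once its upstream results exist.
    from concurrent.futures import ThreadPoolExecutor

    def load():       return list(range(10))
    def square(xs):   return [x * x for x in xs]
    def total(xs):    return sum(xs)

    # Workflow specification: name -> (callable, names of upstream tasks).
    # Insertion order is assumed to be a topological order.
    workflow = {
        "load":   (load,   []),
        "square": (square, ["load"]),
        "total":  (total,  ["square"]),
    }

    def run(workflow):
        futures = {}
        # One worker per task avoids deadlock in this toy scheduler.
        with ThreadPoolExecutor(max_workers=len(workflow)) as pool:
            def make_task(fn, deps):
                def task():
                    # Block inside the worker until all inputs are ready.
                    return fn(*[futures[d].result() for d in deps])
                return task
            for name, (fn, deps) in workflow.items():
                futures[name] = pool.submit(make_task(fn, deps))
            return {name: f.result() for name, f in futures.items()}

    print(run(workflow)["total"])   # 285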

Relevance: 10.00%

Abstract:

We present a novel data analysis strategy which, combined with subcellular fractionation and liquid chromatography-mass spectrometry (LC-MS) based proteomics, provides a simple and effective workflow for global drug profiling. Five subcellular fractions were obtained by differential centrifugation, followed by high-resolution LC-MS and complete functional regulation analysis. The methodology combines functional regulation and enrichment analysis into a single visual summary, and the workflow enables improved insight into perturbations caused by drugs. We provide a statistical argument that even crude subcellular fractions lead to improved functional characterization. We demonstrate this data analysis strategy on data obtained in an MS-based global drug profiling study; however, the strategy can also be applied to other types of large-scale biological data.
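
The enrichment-analysis ingredient of such a workflow is typically a hypergeometric over-representation test: is a functional category more frequent among the regulated proteins of a fraction than chance would suggest? A minimal sketch, with invented counts rather than data from the study:

    # Hypergeometric enrichment test with scipy.
    from scipy.stats import hypergeom

    M = 4000   # proteins quantified in total (illustrative)
    n = 120    # of these, proteins annotated to the category of interest
    N = 300    # proteins significantly regulated in this fraction
    k = 25     # regulated proteins that fall in the category

    p_enrich = hypergeom.sf(k - 1, M, n, N)   # P(X >= k)
    print(f"enrichment p-value: {p_enrich:.2e}")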

Relevance: 10.00%

Abstract:

This paper presents work in progress to develop an efficient and economic way to directly produce technetium-99m (99mTc) using low-energy cyclotrons. Its importance is well established and relates to the growing global difficulty in delivering 99mTc to the Nuclear Medicine Departments that rely on this radioisotope. Since the present delivery strategy has clearly demonstrated its intrinsic limits, our group decided to follow a distinct approach that exploits the broad distribution of low-energy cyclotrons and the accessibility of molybdenum-100 (100Mo) as the target material. This is an important point, since the system presented here, named CYCLOTECH, is not based on the use of Highly Enriched (or even Low Enriched) Uranium-235 (235U), thus entirely complying with current international trends and directives concerning the use of this highly sensitive material. The production technique is based on the nuclear reaction 100Mo (p,2n) 99mTc, whose production yields have already been documented. So far, two patent applications have been submitted (the first at the INPI, in Portugal, the second at the USPTO, in the USA); others are being prepared for submission in the near future. The object of the CYCLOTECH system is to present 99mTc to Nuclear Medicine radiopharmacists in a routine, reliable and efficient manner that remains flexible and blends entirely with established protocols. To facilitate workflow and radiation protection measures, a target station has been developed that can be installed on most existing PET cyclotrons and that will tolerate up to 400 μA of beam by allowing the beam to strike the target material at a suitably oblique angle. The target station permits the remote and automatic loading and discharge of the targets from a carriage of 10 target bodies. Several methods of target material deposition and several target substrates are also presented. The aim was to create a cost-effective means of depositing target material of intermediate thickness (25-100 μm), with minimal loss, on a substrate able to efficiently carry away the heat associated with high beam currents. Finally, the separation techniques presented are a combination of physical and column chemistry. The aim was to extract and deliver 99mTc in the identical form now in use in radiopharmacies worldwide. In addition, the target material is recovered and can be recycled.
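
The thermal problem that motivates the oblique incidence can be seen from a back-of-envelope calculation. The proton energy below is an assumption (a typical PET-cyclotron value; the paper does not state one), and the angle is illustrative:

    # Beam power on target and the footprint gain from oblique incidence.
    import math

    energy_MeV = 18.0    # assumed proton energy of a typical PET cyclotron
    current_uA = 400.0   # maximum beam current quoted above

    power_kW = energy_MeV * current_uA / 1e3   # 1 MeV * 1 uA = 1 W
    print(f"beam power on target: {power_kW:.1f} kW")

    theta_deg = 10.0     # illustrative grazing angle
    spread = 1.0 / math.sin(math.radians(theta_deg))
    print(f"footprint (heat-spreading) factor: {spread:.1f}x")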

Relevance: 10.00%

Abstract:

Introduction: A major focus of the data mining process, and especially of machine learning research, is to automatically learn to recognize complex patterns and to support adequate decisions based strictly on the acquired data. Since imaging techniques such as Myocardial Perfusion Imaging (MPI) in Nuclear Cardiology can take up a large part of the daily workflow and generate gigabytes of data, computerized analysis could offer advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc. Objectives: This study evaluates the efficacy of this methodology on MPI stress studies and on the decision of whether or not to continue the evaluation of each patient. The objective was to automatically classify each patient test into one of three groups: "positive", "negative" and "indeterminate". "Positive" tests would proceed directly to the rest part of the exam, "negative" tests would be exempted from continuation, and only the "indeterminate" group would require the clinician's analysis, saving clinician effort, increasing workflow fluidity at the technologists' level, and probably saving patients' time. Methods: The WEKA v3.6.2 open-source software was used for a comparative analysis of three WEKA algorithms ("OneR", "J48" and "Naïve Bayes") in a retrospective study of the "SPECT Heart Dataset", available at the University of California, Irvine Machine Learning Repository, using the corresponding clinical results, signed off by expert nuclear cardiologists, as the reference. For evaluation, criteria such as precision, incorrectly classified instances and Receiver Operating Characteristic (ROC) areas were considered. Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms. Conclusions: It is believed, and apparently supported by the findings, that machine learning algorithms could significantly assist, at an intermediate level, in the analysis of scintigraphic data obtained in MPI, namely after stress acquisition, eventually increasing the efficiency of the entire system and potentially easing the roles of both technologists and nuclear cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve system accuracy.
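
An analogous comparison can be reproduced in a few lines of Python with scikit-learn (the study itself used WEKA). The classifiers below are scikit-learn counterparts of Naïve Bayes and a pruned decision tree, and the file path is a placeholder for the UCI SPECT files, whose first column is the class label:

    # Cross-validated comparison of two classifiers on the SPECT Heart data.
    import numpy as np
    from sklearn.naive_bayes import BernoulliNB
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    data = np.loadtxt("SPECT.train", delimiter=",")   # placeholder path
    y, X = data[:, 0], data[:, 1:]                    # class label, 22 binary features

    for name, clf in [("NaiveBayes", BernoulliNB()),
                      ("Tree (~J48)", DecisionTreeClassifier(max_depth=5))]:
        auc = cross_val_score(clf, X, y, cv=10, scoring="roc_auc")
        print(f"{name}: ROC AUC = {auc.mean():.3f} +/- {auc.std():.3f}")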

Relevance: 10.00%

Abstract:

This work presents a solution for enterprise information systems in the Quality domain. Developed in a real-world environment on the .NET platform, its effectiveness has been proven in several deployments. Many organizations need to implement processes based on a single document, which must hold the record of all the information while requiring the intervention of different collaborators; this need varies widely between organizations. To address this problem a system called Intelligent Documents was developed which, through several tools, allows processes to be configured, without resorting to coding, based on information registration, workflow, access control and alerts. It is precisely the various technological components used to solve these problems that this work describes. In terms of software development, practical concepts of domain engineering, RAD, layered architectures, abstraction and workflow were applied, which gave the developed system greater flexibility and, at the same time, leads to a rapid implementation of software solutions in this area.
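
The core idea, processes defined as configuration rather than code, can be pictured with a toy sketch (in Python for brevity; the actual system is built on .NET, and the states, roles and alert mechanism below are invented):

    # A document workflow lives entirely in configuration: adding a new
    # process means a new definition, not new programming.
    WORKFLOW = {            # state -> (roles allowed to act, next state)
        "draft":    ({"author"},          "review"),
        "review":   ({"quality_manager"}, "approved"),
        "approved": (set(),               None),
    }

    def advance(state: str, role: str) -> str:
        allowed, nxt = WORKFLOW[state]
        if role not in allowed:
            raise PermissionError(f"role '{role}' cannot act on '{state}'")
        print(f"alert: document moved {state} -> {nxt}")   # stand-in for alerts
        return nxt

    s = advance("draft", "author")        # -> review
    s = advance(s, "quality_manager")     # -> approved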

Relevance: 10.00%

Abstract:

Learning and teaching processes, like all human activities, can be mediated through the use of tools, and information and communication technologies are now widespread within education. Their use in the daily life of teachers and learners affords engagement with educational activities at any place and time, not necessarily linked to an institution or a certificate. In the absence of formal certification, learning under these circumstances is known as informal learning. Despite the lack of certification, learning with technology in this way presents opportunities to gather information about, and present new ways of exploiting, an individual's learning. Cloud technologies provide ways to achieve this through new architectures, methodologies, and workflows that facilitate semantic tagging, recognition, and acknowledgment of informal learning activities. The transparency and accessibility of cloud services mean that institutions and learners can exploit existing knowledge to their mutual benefit. The TRAILER project facilitates this aim by providing a technological framework using cloud services, a workflow, and a methodology. The services facilitate the exchange of information and knowledge associated with informal learning activities, ranging from the use of social software through widgets to computer gaming and remote laboratory experiments. Data from these activities are shared among institutions, learners, and workers. The project demonstrates the possibility of gathering information related to informal learning activities independently of the context or tools used to carry them out.

Relevance: 10.00%

Abstract:

Final project report submitted for the degree of Master in Communication Networks and Multimedia Engineering.

Relevance: 10.00%

Abstract:

In global scientific experiments with collaborative scenarios involving multinational teams, there are big challenges related to data access: moving data to other regions or clouds is often precluded by constraints on latency, cost, data privacy and data ownership. Furthermore, each site processes local data sets using specialized algorithms, producing intermediate results that are helpful as inputs to applications running on remote sites. This paper shows how to model such collaborative scenarios as a scientific workflow implemented with AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic), a decentralized framework offering a feasible solution to run the workflow activities on distributed data centers in different regions without the need for large data movements. The AWARD workflow activities are independently monitored and dynamically reconfigured and steered by different users, namely by hot-swapping the algorithms to enhance the computation results, or by changing the workflow structure to support feedback dependencies, in which an activity receives feedback output from a successor activity. A real implementation of one practical scenario and its execution on multiple data centers of the Amazon cloud is presented, including experimental results with steering by multiple users.
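
The hot-swapping idea can be pictured with a minimal single-process sketch (AWARD itself does this for activities running across distributed data centers; the activity and the two algorithms below are invented for illustration):

    # An activity holds a reference to its algorithm; a steering user can
    # replace it between iterations without stopping the workflow.
    class Activity:
        def __init__(self, algorithm):
            self.algorithm = algorithm        # current implementation

        def swap(self, new_algorithm):        # steering operation
            self.algorithm = new_algorithm

        def step(self, data):
            return self.algorithm(data)

    coarse  = lambda xs: sum(xs) / len(xs)          # first algorithm (mean)
    refined = lambda xs: sorted(xs)[len(xs) // 2]   # swapped-in replacement (median)

    act = Activity(coarse)
    print(act.step([1, 2, 100]))   # 34.33... (mean)
    act.swap(refined)              # hot-swap while the workflow keeps running
    print(act.step([1, 2, 100]))   # 2 (median)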