959 results for Workflow Execution


Relevance:

20.00%

Publisher:

Abstract:

Digital rock physics combines modern imaging with advanced numerical simulations to analyze the physical properties of rocks. In this paper we propose a special segmentation procedure, which is applied to a carbonate rock from Switzerland. The starting point is a CT scan of a specimen of Hauptmuschelkalk. The first step applied to the raw image data is a non-local means filter. We then apply different thresholds to identify pore and solid phases. Because we are aware of a non-negligible amount of unresolved microporosity, we also define intermediate phases. Based on this segmentation, we determine porosity-dependent values for the P-wave velocity and for the permeability. The porosity measured in the laboratory is then used to compare our numerical data with experimental data, and we observe good agreement. Future work includes an analytic validation of the numerical upper bound for the P-wave velocity, employing different filters for the image segmentation, and using data with higher resolution.
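The multi-threshold segmentation into pore, intermediate (microporous) and solid phases can be sketched as follows. The threshold values and the toy volume are illustrative assumptions, not values from the paper.

```python
import numpy as np

def segment_phases(image, pore_thr=0.3, solid_thr=0.7):
    """Classify voxels into pore (0), intermediate/microporous (1),
    and solid (2) phases by gray-value thresholding.
    Thresholds are illustrative, not the paper's values."""
    phases = np.full(image.shape, 1, dtype=np.uint8)  # intermediate by default
    phases[image < pore_thr] = 0   # low attenuation -> pore
    phases[image > solid_thr] = 2  # high attenuation -> solid
    return phases

# Toy gray-value volume in [0, 1] standing in for the filtered CT scan
rng = np.random.default_rng(0)
vol = rng.random((4, 4, 4))
seg = segment_phases(vol)
porosity = np.mean(seg == 0)  # fraction of resolved pore voxels
```

In a real pipeline the non-local means filter would be applied to the raw data first, and the resolved porosity would be combined with an assumed microporosity for the intermediate phase.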

Relevance:

20.00%

Publisher:

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency produces a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and formally verified using model checking. Second, Petri net models are automatically mined from event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified against the partial order models using model checking. Our formal specification and verification of Mondex contribute to the worldwide effort to develop a verified software repository. Our method to mine Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable because it exploits the nature of atomicity violations, considering only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the tradeoff between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation prediction: 1) a post-prediction analysis method to increase coverage while ensuring precision; 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
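The class of bug targeted here (a pair of threads, accesses to a single shared variable) can be illustrated with a minimal sketch. The account-balance scenario is hypothetical, not taken from the dissertation.

```python
import threading

balance = 100  # the single shared variable

def withdraw(amount):
    """Check-then-act on the shared balance: the check and the update
    are not one atomic step, so two threads can both pass the check.
    This is the kind of atomicity violation a dynamic predictor flags."""
    global balance
    if balance >= amount:
        balance = balance - amount  # read-modify-write on the shared variable

threads = [threading.Thread(target=withdraw, args=(80,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Depending on the interleaving, balance ends at 20 (one withdrawal)
# or at -60 (both checks passed before either update ran).
```

A predictive tool observes one interleaving and reasons about the others, which is why restricting attention to thread pairs and one shared variable keeps the analysis scalable.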

Relevance:

20.00%

Publisher:

Abstract:

Many-Body Perturbation Theory is among the most successful theoretical frameworks for the study of excited-state properties. It makes it possible to describe excitonic interactions, which play a fundamental role in the optical response of insulators and semiconductors. The first part of the thesis focuses on the quasiparticle, optical and excitonic properties of \textit{bulk} Transition Metal Oxide (TMO) perovskites using a G$_0$W$_0$ + Bethe-Salpeter Equation (BSE) approach. A representative set of 14 compounds has been selected, including 3d, 4d and 5d perovskites. An approximation of the BSE scheme, based on an analytic diagonal expression for the inverse dielectric function, is used to compute exciton binding energies and is carefully benchmarked against standard BSE results. In 2019 an important breakthrough was achieved with the synthesis of ultrathin SrTiO$_3$ films down to the monolayer limit. This allows us to explore how the quasiparticle and optical properties of SrTiO$_3$ evolve from the bulk to the two-dimensional limit. The electronic structure is computed with the G$_0$W$_0$ approach: we show that the inclusion of the off-diagonal self-energy terms is required to avoid unphysical band dispersions. The excitonic properties are investigated beyond the optical limit at finite momenta. Lastly, a study of the optical response under pressure of the topological nodal-line semimetal ZrSiS is presented, in conjunction with experimental results from the group of Prof. Dr. Kuntscher at the University of Augsburg. The second part of the thesis discusses the implementation of a workflow to automate G$_0$W$_0$ and BSE calculations with the VASP software. The workflow adopts a convergence scheme based on an explicit basis-extrapolation approach [J. Klimeš \textit{et al.}, Phys. Rev. B 90, 075125 (2014)], which reduces the number of intermediate calculations required to reach convergence and gives an explicit estimate of the error associated with the basis-set truncation.
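The basis-extrapolation idea can be sketched as a linear fit in 1/N, where N is the basis-set size: the intercept at 1/N → 0 is the complete-basis estimate, and its distance from the largest finite-basis value estimates the truncation error. The function and the synthetic data below are illustrative assumptions, not the workflow's actual implementation.

```python
import numpy as np

def extrapolate_basis(n_basis, energies):
    """Fit E(N) ~ E_inf + a/N and extrapolate to the complete-basis
    limit 1/N -> 0. Returns (E_inf, truncation_error), the latter
    estimated against the largest finite basis. Illustrative of the
    1/N extrapolation scheme, not of VASP's internals."""
    x = 1.0 / np.asarray(n_basis, dtype=float)
    a, e_inf = np.polyfit(x, energies, 1)  # slope, intercept
    trunc_err = abs(e_inf - energies[np.argmax(n_basis)])
    return e_inf, trunc_err

# Synthetic quasiparticle energies that follow E(N) = 2.0 + 50/N exactly
n = np.array([200, 400, 800, 1600])
e = 2.0 + 50.0 / n
e_inf, err = extrapolate_basis(n, e)  # e_inf recovers 2.0
```

Because the fit needs only a few basis-set sizes, the scheme avoids pushing a single calculation to brute-force convergence.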

Relevance:

20.00%

Publisher:

Abstract:

LHC experiments produce an enormous amount of data, estimated at a few petabytes per year. Data management, both for storage and for processing, relies on the Worldwide LHC Computing Grid (WLCG) infrastructure. In recent years, however, many additional resources have become available on High Performance Computing (HPC) farms, which generally offer many computing nodes with a high number of processors. Large collaborations are working to use these resources as efficiently as possible, within the constraints imposed by their computing models (data distributed on the Grid, authentication, software dependencies, etc.). The aim of this thesis project is to develop a software framework that allows users to run a typical ATLAS data analysis workflow on HPC systems. The developed analysis framework is to be deployed on the computing resources of the Open Physics Hub project and on the CINECA Marconi100 cluster, in view of the switch-on of the Leonardo supercomputer, foreseen in 2023.

Relevance:

20.00%

Publisher:

Abstract:

Modern society faces significant difficulties in preserving its architectural heritage, and numerous challenges arise in its documentation, preservation and restoration. Fortunately, new perspectives on architectural heritage are emerging thanks to the rapid development of digitalization, which presents new challenges for architects, restorers and specialists and has changed how they approach the study of existing heritage, moving from conventional 2D drawings toward increasingly required 3D representations. Recently, Building Information Modelling for historic buildings (HBIM) has emerged as a trend for interconnecting geometric and informational data. The latest 3D geomatics techniques, based on 3D laser scanners with enhanced photogrammetry, along with continuous improvement in the BIM industry, now allow an enhanced 3D digital reconstruction of historical and existing buildings. This study aimed to develop an integrated workflow for the 3D digital reconstruction of heritage buildings starting from a point cloud. The Pieve of San Michele in Acerboli's Church in Santarcangelo di Romagna (6th century) served as the test bed. The point cloud was used as the essential reference to model the BIM geometry in Autodesk Revit® 2022. To validate the accuracy of the model, a deviation analysis was performed in CloudCompare to determine the degree of deviation between the HBIM model and the point cloud. The findings showed a very promising average distance between the HBIM model and the point cloud, demonstrating the viability of producing precise BIM geometry from point clouds.
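The cloud-to-model deviation analysis performed in CloudCompare can be sketched, at its simplest, as a nearest-neighbor distance computation between two point sets. The brute-force function and the toy coordinates below are illustrative, not the study's data.

```python
import numpy as np

def cloud_to_model_deviation(scan_pts, model_pts):
    """For each scanned point, the distance to the nearest model point --
    a brute-force stand-in for CloudCompare's cloud-to-cloud distance.
    Returns per-point deviations and their mean."""
    # Pairwise distance matrix of shape (n_scan, n_model)
    diff = scan_pts[:, None, :] - model_pts[None, :, :]
    d = np.linalg.norm(diff, axis=2)
    dev = d.min(axis=1)
    return dev, dev.mean()

# Toy example: the model is the scan shifted by a 1 cm systematic offset
scan = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
model = scan + np.array([0.01, 0.0, 0.0])
dev, mean_dev = cloud_to_model_deviation(scan, model)  # all deviations 0.01
```

Production tools use spatial indexing (and point-to-mesh rather than point-to-point distances), but the reported "average distance" metric is conceptually this mean.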

Relevance:

20.00%

Publisher:

Abstract:

The furniture sector is fragmented, a characteristic that has never allowed full digitalization of its segments: communication has often been difficult, responsible for misunderstandings and frequent delays, with a consequent waste of resources. In this thesis project I mainly analyze the upholstered-furniture supply chain and the textile chain connected to it. My target is furniture manufacturers that use configurators to present their products to customers. I propose a workflow divided into two phases: one feasible today, and a second involving a change feasible in the near future. For the first phase I designed an agile workflow for digitizing textures from photographic acquisitions. By analyzing the behavior of various fabrics and developing a digital reproduction method, I tested a methodology for an accurate, photorealistic surface rendering of upholstered furniture, both on web platforms and in augmented reality. The study arises from the need to improve current configurators and make them genuinely useful for their intended purpose: simplifying the purchase process by letting users preview the final appearance of their furniture placed in their living space. The resulting configurator returns a faithful visualization of the fabrics in augmented reality as well. Artificial intelligence will support the user's decision process, and the choice of the materials composing the furniture will be guided so as to make the user aware of the environmental impacts. For the second phase I propose a completely digital fabric-creation process, in favor of a sustainable supply chain that can thereby cut costs and waste, with a consequent saving of resources.

Relevance:

20.00%

Publisher:

Abstract:

A witness to industrialization in Bologna since its first generation was built in the late 1760s, the Battiferro lock has accompanied the innovation the city experienced over the centuries, until it lost its function as technological development led Bologna's canals to be gradually covered, starting from the 1950s under Giuseppe Dozza's administration, as part of the reconstruction, reclamation and urban requalification carried out across the whole city in the aftermath of World War II. The aim of the research carried out on this case study was primarily to reintroduce this still-intact landmark into what is considered the fourth generation of the industrial revolution in the construction field, known as Construction 4.0, by means of Historic (or Heritage) Building Information Modelling (HBIM) and Virtual Reality (VR). A scan-to-BIM approach was followed to create a 3D as-built BIM model, as a first step towards the storytelling of the abandoned industrial asset in a VR environment, and as a seed for future applications such as Digital Twins (DT), digital heritage learning, sustainability impact studies, or interfaces with other systems such as GIS. Based on the HBIM product, examples of primary BIM deliverables such as 2D layouts are given; a workflow to VR is then proposed, the reliability of the data and the types of users that may benefit from the VR experience are investigated, and the potential future development of the model is discussed, with a comparison to a relatively similar experience in the UK.

Relevance:

10.00%

Publisher:

Abstract:

Disconnection between Default Mode Network (DMN) nodes can cause clinical symptoms and cognitive deficits in Alzheimer's disease (AD). We aimed to examine the structural connectivity between DMN nodes, to verify the extent to which white matter disconnection affects cognitive performance. MRI data of 76 subjects (25 mild AD patients, 21 amnestic Mild Cognitive Impairment subjects and 30 controls) were acquired on a 3.0 T scanner. ExploreDTI software (fractional anisotropy threshold = 0.25, angular threshold = 60°) was used to calculate axial, radial and mean diffusivities, fractional anisotropy and streamline count. AD patients showed lower fractional anisotropy (P=0.01) and streamline count (P=0.029), and higher radial diffusivity (P=0.014), than controls in the cingulum. After correction for white matter atrophy, only the fractional anisotropy and radial diffusivity differences remained significant (P=0.003 and P=0.05). In the parahippocampal bundle, AD patients had lower mean and radial diffusivities (P=0.048 and P=0.013) than controls, of which only radial diffusivity survived adjustment for white matter atrophy (P=0.05). Regression models revealed that cognitive performance is also accounted for by white matter microstructural values. Structural connectivity within the DMN is important for the execution of high-complexity tasks, probably due to its relevant role in the integration of the network.
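The four diffusion metrics reported above follow from the three diffusion-tensor eigenvalues by standard DTI formulas. A minimal sketch, with illustrative eigenvalues (not study data):

```python
import numpy as np

def dti_metrics(eigvals):
    """Axial, radial, and mean diffusivity plus fractional anisotropy
    from the diffusion-tensor eigenvalues (l1 >= l2 >= l3).
    Standard DTI definitions; the example values are illustrative."""
    l1, l2, l3 = eigvals
    ad = l1                      # axial diffusivity: largest eigenvalue
    rd = (l2 + l3) / 2.0         # radial diffusivity: mean of the two smaller
    md = (l1 + l2 + l3) / 3.0    # mean diffusivity
    fa = np.sqrt(1.5 * ((l1 - md)**2 + (l2 - md)**2 + (l3 - md)**2)
                 / (l1**2 + l2**2 + l3**2))
    return ad, rd, md, fa

# Eigenvalues in mm^2/s typical of coherent white matter (illustrative)
ad, rd, md, fa = dti_metrics((1.7e-3, 0.3e-3, 0.3e-3))
```

Higher radial diffusivity with lower fractional anisotropy, the pattern seen in the AD patients' cingulum, corresponds to the two smaller eigenvalues growing relative to the largest.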

Relevance:

10.00%

Publisher:

Abstract:

The experiences induced by psychedelics share a wide variety of subjective features, related to the complex changes in perception and cognition induced by this class of drugs. A remarkable increase in introspection is at the core of these altered states of consciousness. Self-oriented mental activity has been consistently linked to the Default Mode Network (DMN), a set of brain regions more active during rest than during the execution of a goal-directed task. Here we used fMRI to inspect the DMN during the psychedelic state induced by Ayahuasca in ten experienced subjects. Ayahuasca is a potion traditionally used by Amazonian Amerindians, composed of a mixture of compounds that increase monoaminergic transmission. In particular, we examined whether Ayahuasca changes the activity and connectivity of the DMN and the connection between the DMN and the task-positive network (TPN). Ayahuasca caused a significant decrease in activity throughout most parts of the DMN, including its most consistent hubs: the Posterior Cingulate Cortex (PCC)/Precuneus and the medial Prefrontal Cortex (mPFC). Functional connectivity within the PCC/Precuneus decreased after Ayahuasca intake. No significant change was observed in the DMN-TPN orthogonality. Altogether, our results support the notion that the altered state of consciousness induced by Ayahuasca, like those induced by psilocybin (another serotonergic psychedelic), meditation and sleep, is linked to modulation of the activity and connectivity of the DMN.
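At its core, a functional-connectivity measure between two regions is the Pearson correlation of their BOLD time series. A minimal sketch with synthetic signals (the region names and coupling are illustrative, not study data):

```python
import numpy as np

def functional_connectivity(ts_a, ts_b):
    """Functional connectivity between two regions as the Pearson
    correlation of their time series -- a minimal stand-in for the
    connectivity analysis described above."""
    return np.corrcoef(ts_a, ts_b)[0, 1]

# Synthetic time series: a PCC-like signal and a noisier coupled signal
rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 200)
pcc = np.sin(t)
mpfc = np.sin(t) + 0.5 * rng.standard_normal(200)
r = functional_connectivity(pcc, mpfc)  # strong positive correlation
```

A drug-induced decrease in connectivity corresponds to this correlation dropping between the pre- and post-intake scans.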

Relevance:

10.00%

Publisher:

Abstract:

Errors are always present in experimental measurements, so it is important to identify them and understand how they affect the results of experiments. Statistical principles suggest that experiments should be executed in random order, but complete randomization is not always viable for practical reasons. One possible simplification is a blocked experiment, in which the levels of certain factors are held fixed while the levels of the others are randomized. However, this has a cost: although the experimental part is simplified, the statistical analysis becomes more complex.
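A blocked run order, with randomization only within each block, can be sketched as follows (the block and treatment names are hypothetical):

```python
import random

def blocked_run_order(blocks, treatments, seed=42):
    """Run order for a blocked experiment: the blocking factor is held
    fixed while the treatment order is randomized within each block."""
    rng = random.Random(seed)
    order = []
    for block in blocks:
        runs = list(treatments)
        rng.shuffle(runs)  # randomize only within the block
        order.extend((block, t) for t in runs)
    return order

# Two blocks (e.g. days) of three treatments each
plan = blocked_run_order(["day1", "day2"], ["A", "B", "C"])
```

All runs for one block are executed together, which is the practical simplification; the analysis must then treat the block as a (possibly random) effect rather than pooling runs freely.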

Relevance:

10.00%

Publisher:

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física
