12 results for Process Modeling, Collaboration, Distributed Modeling, Collaborative Technology

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance: 100.00%

Abstract:

This thesis analyzes Activiti, a development platform for BPMN. It first explains the Business Process Modeling notation, then describes how Activiti works and how it is structured, and finally explains how to use the APIs provided with the engine.

Relevance: 100.00%

Abstract:

The work is divided into three macro-areas. The first is a theoretical analysis of how intrusions work, which software is used to carry them out, and how to protect against them (using the devices generically known as firewalls). The second macro-area analyzes an intrusion coming from outside and aimed at sensitive servers of a LAN. This analysis is carried out on the traffic files captured by the two network interfaces configured in promiscuous mode on a probe placed inside the LAN; two interfaces are needed in order to connect to two LAN segments with different subnet masks. The attack is analyzed with several tools, and this can be regarded as the third part of the work: the captured files are examined first with software for full-content data, such as Wireshark, then with software for session data, processed with Argus, and finally for statistical data, processed with Ntop. The penultimate chapter, before the conclusions, covers the installation of Nagios and its configuration to monitor, through plugins, the remaining disk space on a remote agent machine and the MySQL and DNS services. Naturally, Nagios can be configured to monitor any kind of service offered on the network.
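A Nagios plugin of the kind mentioned above follows a simple contract: it prints a one-line status message and exits with code 0 (OK), 1 (WARNING), 2 (CRITICAL) or 3 (UNKNOWN). The following is a minimal sketch of a disk-space check in Python; the mount point and thresholds are hypothetical, and the setup described in the thesis may well rely on the standard bundled plugins rather than a custom script.

```python
#!/usr/bin/env python3
"""Minimal Nagios-style plugin: check free disk space on a path.

Sketch only: the mount point and thresholds are hypothetical examples,
not taken from the configuration described in the thesis.
"""
import shutil
import sys

MOUNT_POINT = "/"       # path to check (hypothetical)
WARN_FREE_GB = 10.0     # warn below this many free GiB
CRIT_FREE_GB = 2.0      # critical below this many free GiB

def main() -> int:
    usage = shutil.disk_usage(MOUNT_POINT)
    free_gb = usage.free / 1024 ** 3
    if free_gb < CRIT_FREE_GB:
        print(f"DISK CRITICAL - {free_gb:.1f} GiB free on {MOUNT_POINT}")
        return 2
    if free_gb < WARN_FREE_GB:
        print(f"DISK WARNING - {free_gb:.1f} GiB free on {MOUNT_POINT}")
        return 1
    print(f"DISK OK - {free_gb:.1f} GiB free on {MOUNT_POINT}")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```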

Relevance: 60.00%

Abstract:

To improve in vitro cell culture techniques, bioreactor systems are increasingly used, e.g. in bone tissue engineering. Spinner flasks, rotating bioreactors and flow-perfusion systems are in use today, each with its own advantages and drawbacks. This work describes the development of a simple perfusion bioreactor and the results of the evaluation methodology employed, based on X-ray μCT analysis and 3D modelling techniques. A simple bioreactor with a propeller-driven flow generator was designed and built with the goal of improving the differentiation of mesenchymal stem cells derived from human embryos (HES-MP); the cells were seeded on porous titanium scaffolds, which ensure better adhesion of the mineralised matrix. Through a microcontroller and a graphical interface, the bioreactor generates three flow modes: forward (clockwise), backward (counter-clockwise) and pulsed (forward and backward). A simple model was built to estimate the pressure generated by the flow on the scaffolds (3·10⁻² Pa). Three scaffolds in static culture were compared with three kept inside the bioreactor. They were incubated for 21 days, fixed in paraformaldehyde (4% w/v) and imaged by X-ray μCT. The resulting images were then processed with 3D imaging software; the scaffolds were "virtually" sectioned in order to obtain the distribution of the grey-value gradient for samples taken from their surface and from their interior. This distribution is used to distinguish the different components present in the images, in this case the scaffolds from the putative cell matrix. The results show that both on the surface and inside the scaffolds kept in the bioreactor the density of grey-value gradients is higher, which suggests better deposition of mineralised matrix. The lessons learned from building this bioreactor will be used to design a new version capable of analysing more than 20 scaffolds at once, allowing further assessment of the quality of differentiation with molecular and histochemical methods.
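As an illustration of the image-analysis step described above, the grey-value gradient of a μCT volume can be computed voxel by voxel and its distribution compared between a surface sample and an interior sample of a scaffold. A minimal sketch with NumPy; the volume here is a synthetic stand-in for a reconstructed μCT dataset, and the slab coordinates and bin count are hypothetical placeholders.

```python
"""Sketch of the grey-value gradient analysis described above.

Illustrative only: the volume is a synthetic stand-in for a loaded
μCT reconstruction; slab coordinates and bin count are hypothetical.
"""
import numpy as np

rng = np.random.default_rng(0)
volume = rng.random((128, 128, 128))   # stand-in for the μCT grey values

def gradient_magnitude(vol: np.ndarray) -> np.ndarray:
    """Voxel-wise magnitude of the grey-value gradient."""
    g0, g1, g2 = np.gradient(vol.astype(float))
    return np.sqrt(g0**2 + g1**2 + g2**2)

def gradient_histogram(vol: np.ndarray, bins: int = 128):
    """Normalised distribution of gradient magnitudes in a (sub-)volume."""
    return np.histogram(gradient_magnitude(vol), bins=bins, density=True)

# "Virtual sectioning": compare a surface slab with an interior slab
surface_sample = volume[:, :, :20]       # hypothetical surface slab
interior_sample = volume[:, :, 50:70]    # hypothetical interior slab

hist_surface, edges = gradient_histogram(surface_sample)
hist_interior, _ = gradient_histogram(interior_sample)
```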

Relevance: 50.00%

Abstract:

Slope failure occurs in many areas throughout the world and becomes an important problem when it interferes with human activity, where disasters cause loss of life and property damage. In this research we investigate slope failure through centrifuge modeling, in which a reduced-scale model, N times smaller than the full-scale prototype, is used while the acceleration is increased by N times (compared with gravity) to preserve the stress and strain behavior. The aims of this research, "Centrifuge modeling of sandy slopes", are, in brief: 1) to test the reliability of centrifuge modeling as a tool to investigate the behavior of sandy slope failure; 2) to understand how the failure mechanism is affected by changing the slope angle and to obtain information useful for design. To achieve this aim, the work is arranged as follows. Chapter one: centrifuge modeling of slope failure. This chapter gives an overview of the context of the work: what a slope failure is, how it happens and which tools are available to investigate the phenomenon; it then introduces the technology used to study the topic, the geotechnical centrifuge. Chapter two: testing apparatus. The first section of this chapter describes the procedures and facilities used to perform a test in the centrifuge; it then presents the characteristics of the soil (Nevada sand), such as dry unit weight, water content and relative density, and its strength parameters (c, φ), which were determined in the laboratory through triaxial tests. Chapter three: centrifuge tests. This part presents all the results of the centrifuge tests, namely the acceleration at failure for each model tested and its failure surface. In our case study we tested models with the same soil and geometric characteristics but different slope angles: 60°, 75° and 90°. Chapter four: slope stability analysis. We introduce the features and concept of the software ReSSA (2.0), which allows us to calculate the theoretical failure surfaces of the prototypes, and we compare the experimental failure surfaces of the prototypes, traced in the laboratory, with those calculated by the software. Chapter five: conclusion. The conclusion presents the results obtained in relation to the two main aims mentioned above.
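The stress-preserving scaling invoked above can be made explicit: with all model lengths reduced by the factor N and the centrifugal acceleration raised to N·g, the self-weight stress at homologous depths is the same in model and prototype. This is the standard centrifuge scaling relation, written here for the vertical stress only:

```latex
% Standard centrifuge scaling of self-weight stress (vertical component)
\sigma_{v,\text{model}}
  \;=\; \rho \,(N g)\,\frac{h_{\text{prototype}}}{N}
  \;=\; \rho\, g\, h_{\text{prototype}}
  \;=\; \sigma_{v,\text{prototype}}
```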

Relevance: 50.00%

Abstract:

Synthetic Biology is a relatively new discipline, born at the beginning of the new millennium, that brings the typical engineering approach (abstraction, modularity and standardization) to biotechnology. These principles aim to tame the extreme complexity of the various components and aid the construction of artificial biological systems with specific functions, usually by means of synthetic genetic circuits implemented in bacteria or in simple eukaryotes such as yeast. The cell becomes a programmable machine whose low-level programming language is made of strings of DNA. This work was performed in collaboration with researchers of the Department of Electrical Engineering of the University of Washington in Seattle and with a student of the Corso di Laurea Magistrale in Ingegneria Biomedica at the University of Bologna, Marilisa Cortesi. During the collaboration I contributed to a Synthetic Biology project already started in the Klavins Laboratory: I modeled and then simulated a synthetic genetic circuit designed to implement a multicelled behavior in a growing bacterial microcolony. The first chapter introduces the foundations of molecular biology (structure of the nucleic acids, transcription, translation and methods to regulate gene expression) and closes with an introduction to Synthetic Biology. The second chapter describes the synthetic genetic circuit, conceived to make two different groups of cells, termed leaders and followers, emerge spontaneously from an isogenic microcolony of bacteria. The circuit exploits the intrinsic stochasticity of gene expression and intercellular communication via small molecules to break the symmetry in the phenotype of the microcolony. The four modules of the circuit (coin flipper, sender, receiver and follower) and their interactions are then illustrated. The third chapter derives the mathematical representation of the various components of the circuit and makes the main simplifying assumptions explicit: transcription and translation are modeled as a single step, and gene expression is a function of the intracellular concentration of the transcription factors that act on the different promoters of the circuit. A list of the parameters and a justification of their values closes the chapter. The fourth chapter describes the main characteristics of the gro simulation environment, developed by the Self Organizing Systems Laboratory of the University of Washington, and then details a sensitivity analysis performed to pinpoint the desirable characteristics of the various genetic components. The sensitivity analysis uses a cost function based on the fraction of cells in each of the possible states at the end of the simulation and on the desired outcome; thanks to a particular kind of scatter plot, the parameters are ranked and, starting from an initial condition in which all parameters assume their nominal value, the ranking suggests which parameter to tune in order to reach the goal. Obtaining a microcolony in which almost all the cells are in the follower state and only a few in the leader state turns out to be the most difficult task: a small number of leader cells struggle to produce enough signal to turn the rest of the microcolony into the follower state, and such a microcolony can be obtained only by increasing the production of signal as much as possible. Reaching a microcolony split evenly between leaders and followers is comparatively easy; the best strategy seems to be a slight increase in the production of the enzyme. To end up with a majority of leaders, instead, it is advisable to increase the basal expression of the coin flipper module. At the end of the chapter, a possible future application of the leader election circuit, the spontaneous formation of spatial patterns in a microcolony, is modeled with the finite state machine formalism. The gro simulations provide insights into the genetic components needed to implement this behavior: since both examples of pattern formation rely on a local version of leader election, a short-range communication system is essential, and new synthetic components that can reliably downregulate the growth rate in specific cells without side effects need to be developed. The appendix lists the gro code used to simulate the model of the circuit, a Python script used to split the simulations on a Linux cluster, and the Matlab code developed to analyze the data.
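The lumped modeling choice described in the third chapter (transcription and translation collapsed into one step, with expression driven by the intracellular concentration of the transcription factors acting on each promoter) is commonly written as a Hill-type production term with first-order degradation and dilution. A minimal single-gene sketch in Python follows; the parameter values and the repressor time course are hypothetical and do not correspond to the four-module circuit of the thesis.

```python
"""Sketch of a lumped one-step gene-expression model with Hill regulation.

Illustrative only: a single repressed gene with hypothetical parameters;
the actual circuit in the thesis has four interacting modules.
"""
from scipy.integrate import solve_ivp

alpha = 10.0    # maximal production rate (a.u./min), hypothetical
alpha0 = 0.1    # basal (leaky) production, hypothetical
K = 1.0         # repression threshold, hypothetical
n = 2.0         # Hill coefficient, hypothetical
gamma = 0.05    # degradation/dilution rate (1/min), hypothetical

def repressor(t: float) -> float:
    """Hypothetical time course of the repressing transcription factor."""
    return 2.0 if t > 100.0 else 0.0

def dpdt(t, p):
    """Production repressed by a Hill function, minus first-order decay."""
    production = alpha0 + alpha / (1.0 + (repressor(t) / K) ** n)
    return [production - gamma * p[0]]

sol = solve_ivp(dpdt, t_span=(0.0, 300.0), y0=[0.0], max_step=1.0)
print(f"protein level at t = 300 min: {sol.y[0, -1]:.2f} (a.u.)")
```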

Relevance: 50.00%

Abstract:

The aim of Tissue Engineering is to develop biological substitutes that restore lost morphological and functional features of diseased or damaged portions of organs. Recently, computer-aided technology has received considerable attention in the area of tissue engineering, and the advance of additive manufacturing (AM) techniques has significantly improved control over the pore-network architecture of tissue engineering scaffolds. To regenerate tissues more efficiently, an ideal scaffold should have appropriate porosity and pore structure, and increasingly sophisticated pore-network configurations and scaffolding structures are required to mimic the intricate architecture and complexity of native organs and tissues. This study adopts a macro-structural shape-design approach to the production of open porous materials (titanium foams), using spatial periodicity as a simple way to generate the models. Among the various pore architectures that have been studied, this work represents the pore structure with triply periodic minimal surfaces (TPMS), which prove to be a versatile source of biomorphic scaffold designs, and a set of tissue scaffolds was designed using TPMS-based unit-cell libraries. The TPMS-based titanium foams were intended to be 3D-printed with the corresponding predicted geometry, microstructure and, consequently, mechanical properties. Through finite element analysis (FEA), the mechanical properties of the designed scaffolds were determined in compression and analyzed in terms of their porosity and unit-cell assemblies. The purpose of this work was to investigate the mechanical performance of TPMS models and to identify the best compromise between the mechanical and geometrical requirements of the scaffolds. The intention was to predict the structural modulus of open porous materials via the structural design of interconnected three-dimensional lattices, thereby optimising their geometrical properties. With the aid of the FEA results, the effective mechanical properties of the TPMS-based scaffold units are expected to be usable for designing optimized scaffolds for tissue engineering applications. Regardless of the fabrication method, it is desirable to calculate scaffold properties so that their effect on tissue regeneration may be better understood.
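A widely used TPMS is the gyroid, whose implicit (level-set) approximation sin x cos y + sin y cos z + sin z cos x = t defines the solid/pore interface; shifting the threshold t or the cell period changes the porosity of the resulting unit cell. The sketch below, which is only an illustration and not the design pipeline used in the thesis, voxelizes one gyroid unit cell and estimates its porosity.

```python
"""Sketch: voxelised gyroid (a common TPMS) unit cell and its porosity.

Illustrative only; grid resolution, threshold and cell period are
hypothetical and unrelated to the scaffolds designed in the thesis.
"""
import numpy as np

N = 64             # voxels per edge of the unit cell (hypothetical)
t = 0.0            # level-set threshold; shifts the solid fraction
cell = 2 * np.pi   # one period of the gyroid

x, y, z = np.meshgrid(
    np.linspace(0.0, cell, N),
    np.linspace(0.0, cell, N),
    np.linspace(0.0, cell, N),
    indexing="ij",
)

# Gyroid level set: solid where the implicit function exceeds the threshold
gyroid = np.sin(x) * np.cos(y) + np.sin(y) * np.cos(z) + np.sin(z) * np.cos(x)
solid = gyroid > t

porosity = 1.0 - solid.mean()
print(f"estimated porosity of the voxelised unit cell: {porosity:.2f}")
```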

Relevance: 50.00%

Abstract:

The present thesis proposes a new physical equivalent-circuit model for a recently proposed semiconductor transistor, the 2-drain MSET (Multiple State Electrostatically Formed Nanowire Transistor), and presents a new software-based experimental setup developed to carry out numerical simulations on the device and on equivalent circuits. As of 2015, the ubiquitous CMOS technology that has been at the forefront of mainstream technological advancement is approaching its scaling limits, so many researchers are exploring different ideas in the realm of electrical devices for logic applications, among them MSET transistors. The idea underlying MSETs is that a single multiple-terminal device could replace many traditional transistors. In particular, a 2-drain MSET is akin to a silicon multiplexer: it consists of a junction FET with independent gates but with a split drain, so that a voltage-controlled conductive path can connect either of the drains to the source. The first chapter presents the theory of classical JFETs and their common equivalent-circuit models: the physical model and its derivation are presented, and the current state of equivalent circuits for the JFET is discussed. A physical model of a JFET with two independent gates, derived from previous results, is presented at the end of the chapter. Chapter 2 reviews the characteristics of the MSET device and presents the proposed physical model and its formulation; a listing of the SPICE model is attached as an appendix at the end of this document. Chapter 3 concerns the results of the numerical simulations on the device: first the search for a suitable geometry is discussed, and then results from finite-element simulations are compared with equivalent-circuit runs; where the two numerical results diverge in challenging ways, the relevant physical processes are discussed. The fourth chapter discusses the experimental setup, describing the GUI-based environments that allow the four-dimensional solution space to be explored and the physical variables inside the device to be analyzed. It is shown how the software project has been structured to overcome the technical challenges of running multiple simulations in sequence and to provide a flexible platform for future research in the field.
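For reference, the classical JFET model reviewed in the first chapter is usually written in square-law (Shichman-Hodges) form, with beta = I_DSS / V_P^2 and distinct triode and saturation regions. The sketch below evaluates this static characteristic for an n-channel device with hypothetical parameter values; it does not include the independent-gate or split-drain extensions developed in the thesis.

```python
"""Sketch of the classical square-law (Shichman-Hodges) JFET drain current.

n-channel device; I_DSS and V_P are hypothetical example values, and the
model omits channel-length modulation and the MSET-specific extensions.
"""

I_DSS = 10e-3   # saturation current at V_GS = 0 (A), hypothetical
V_P = -4.0      # pinch-off voltage (V), hypothetical
BETA = I_DSS / V_P ** 2

def jfet_id(v_gs: float, v_ds: float) -> float:
    """Static drain current of an n-channel JFET (square-law model)."""
    v_ov = v_gs - V_P                       # overdrive above pinch-off
    if v_ov <= 0.0:                         # channel pinched off
        return 0.0
    if v_ds < v_ov:                         # triode (ohmic) region
        return BETA * (2.0 * v_ov * v_ds - v_ds ** 2)
    return BETA * v_ov ** 2                 # saturation region

if __name__ == "__main__":
    for v_gs in (0.0, -1.0, -2.0, -3.0):
        print(f"V_GS = {v_gs:+.1f} V  ->  I_D = {jfet_id(v_gs, 5.0) * 1e3:.2f} mA")
```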

Relevance: 50.00%

Abstract:

Global climate change in recent decades has strongly influenced the Arctic, generating pronounced warming accompanied by a significant reduction of sea ice in seasonally ice-covered seas and a dramatic increase of open-water regions exposed to wind [Stephenson et al., 2011]. By strongly scattering the wave energy, thick multiyear ice prevents swell from penetrating deeply into the Arctic pack ice. With the recent changes affecting Arctic sea ice, however, waves gain more energy from the extended fetch and can therefore penetrate further into the pack ice. Arctic sea ice also appears weaker during the melt season, extending the transition zone between thick multi-year ice and the open ocean. This region is called the Marginal Ice Zone (MIZ). In the Arctic, the MIZ is mainly encountered in the marginal seas, such as the Nordic Seas, the Barents Sea, the Beaufort Sea and the Labrador Sea. Formed by numerous blocks of sea ice of various diameters (floes), the MIZ under certain conditions allows maritime transportation, stimulating dreams of industrial and touristic exploitation of these regions and possibly allowing, in the near future, a maritime connection between the Atlantic and the Pacific. With the increasing human presence in the Arctic, waves pose security and safety issues. As marginal seas are targeted for oil and gas exploitation, understanding and predicting ocean waves and their effects on sea ice become crucial for structure design and for the real-time safety of operations. The juxtaposition of waves and sea ice represents a risk for personnel and equipment deployed on ice, and may complicate critical operations such as platform evacuations. The risk is difficult to evaluate because there are no long-term observations of waves in ice, swell events are difficult to predict from local conditions, ice breakup can occur on very short time-scales, and wave-ice interactions are beyond the scope of current forecasting models [Liu and Mollo-Christensen, 1988; Marko, 2003]. In this thesis, a newly developed Waves in Ice Model (WIM) [Williams et al., 2013a; Williams et al., 2013b] and its related Ocean and Sea Ice Model (OSIM) will be used to study the MIZ and the improvement of wave modeling in ice-infested waters. The work has been conducted in collaboration with the Nansen Environmental and Remote Sensing Center and within the SWARP project, which aims to extend operational services supporting human activity in the Arctic by including forecasts of waves in ice-covered seas, forecasts of sea ice in the presence of waves, and remote sensing of both wave and sea-ice conditions. The WIM will be included in the downstream forecasting services provided by the Copernicus marine environment monitoring service.
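A common building block in wave-in-ice modeling, and a simple way to picture why swell now penetrates further into a weaker and more broken pack, is an exponential attenuation of wave energy with distance traveled through the ice, E(x) = E0·exp(-αx); the significant wave height then decays with half that exponent. The sketch below uses a single hypothetical attenuation coefficient, whereas a model such as the WIM computes attenuation as a function of wave period, ice thickness and floe size.

```python
"""Sketch: exponential attenuation of wave energy with distance into the ice.

Illustrative only: a single, hypothetical attenuation coefficient is used,
whereas wave-in-ice models compute attenuation from wave and ice properties.
"""
import numpy as np

alpha = 2e-4    # energy attenuation coefficient (1/m), hypothetical
H_S0 = 3.0      # incident significant wave height (m), hypothetical

def significant_wave_height(x_m: np.ndarray) -> np.ndarray:
    """H_s after travelling x metres into the ice.

    Energy decays as exp(-alpha * x); H_s scales with sqrt(energy).
    """
    return H_S0 * np.exp(-0.5 * alpha * x_m)

distances = np.array([0.0, 5_000.0, 20_000.0, 50_000.0])
for x, hs in zip(distances, significant_wave_height(distances)):
    print(f"{x / 1000:5.0f} km into the ice: H_s = {hs:.2f} m")
```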

Relevance: 50.00%

Abstract:

One of the biggest challenges facing contaminant hydrogeology is how to adequately address the uncertainty associated with model predictions. Uncertainty arises from multiple sources, such as interpretive error, calibration accuracy, parameter sensitivity and variability. This critical issue needs to be properly addressed in order to support environmental decision-making processes. In this study, we perform a Global Sensitivity Analysis (GSA) on a contaminant transport model for the assessment of hydrocarbon concentration in groundwater. We provide a quantification of the environmental impact and, given the incomplete knowledge of the hydrogeological parameters, we evaluate which of them are the most influential and therefore require greater accuracy in the calibration process. Parameters are treated as random variables and a variance-based GSA is performed in an optimized numerical Monte Carlo framework. The Sobol indices are adopted as sensitivity measures and are computed by employing meta-models to characterize the migration process while reducing the computational cost of the analysis. The proposed methodology allows us to extend the number of Monte Carlo iterations, identify the influence of the uncertain parameters, and achieve considerable savings in computational time while retaining acceptable accuracy.
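As an illustration of the variance-based approach, Sobol indices for a surrogate of the transport model can be computed with a Saltelli sampling scheme. The sketch below assumes the SALib package and replaces the actual groundwater meta-model with a hypothetical placeholder function; parameter names and ranges are illustrative only.

```python
"""Sketch of variance-based GSA with Sobol indices via Saltelli sampling.

Assumes the SALib package; the model below is a hypothetical placeholder
standing in for the meta-model of hydrocarbon transport used in the study.
"""
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hypothetical uncertain hydrogeological parameters and ranges
problem = {
    "num_vars": 3,
    "names": ["log_conductivity", "porosity", "dispersivity"],
    "bounds": [[-6.0, -3.0], [0.05, 0.35], [0.1, 10.0]],
}

def meta_model(x: np.ndarray) -> float:
    """Hypothetical surrogate returning a concentration-like scalar."""
    log_k, phi, a_l = x
    return np.exp(log_k + 3.0) / (phi * (1.0 + a_l))

param_values = saltelli.sample(problem, 1024)       # N * (2D + 2) samples
Y = np.array([meta_model(row) for row in param_values])

Si = sobol.analyze(problem, Y)
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name:18s}  S1 = {s1:6.3f}  ST = {st:6.3f}")
```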

Relevance: 50.00%

Abstract:

Ultrafast pump-probe spectroscopy is a conceptually simple and versatile tool for resolving photoinduced dynamics in molecular systems. Thanks to the rapid development of new experimental setups, such as synchrotron light sources and X-ray free-electron lasers (XFELs), new spectral windows are becoming accessible. On the one hand, these sources have enabled scientists to access ever faster time scales and to reach unprecedented insight into the dynamical properties of matter. On the other hand, the complementarity of well-established and novel techniques makes it possible to study the same physical process from different points of view, integrating the advantages and overcoming the limitations of each approach. In this context, it is highly desirable to reach a clear understanding of which type of spectroscopy is best suited to capture a given facet of a given photo-induced process, that is, to establish a correlation between the process to be unraveled and the technique to be used. In this thesis, I will show how computational spectroscopy can be a tool to establish such a correlation. I will study a specific process, the ultrafast energy transfer in the nicotinamide adenine dinucleotide dimer (NADH). This process will be observed in different spectral windows (from the UV-VIS to X-rays), assessing the ability of different spectroscopic techniques to unravel the system evolution by means of state-of-the-art theoretical models and methodologies. The comparison of the different spectroscopic simulations will demonstrate their complementarity, eventually allowing the identification of the type of spectroscopy best suited to resolve the ultrafast energy transfer.

Relevance: 50.00%

Abstract:

This work is focused on studying the kinetics of the esterification of levulinic acid in an isothermal batch reactor, using ethanol both as a reactant and as a protic polar solvent, in the presence of an acid catalyst (sulfuric acid). The choice of solvent is important because it affects the kinetics and thermodynamics of the reaction system; moreover, knowledge of the reaction kinetics plays an important role in the design of the process. The work is divided into two stages. The first stage is the experimental part, in which the experimental matrix was developed by changing the process variables one at a time (temperature, molar ratio between reactants, and catalyst concentration) in order to study their influence on the kinetics; the second stage uses the data obtained from the experiments to build the model and estimate the thermodynamic parameters.
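A common starting point for such a system is a reversible second-order rate law for the acid-catalyzed esterification (levulinic acid + ethanol ⇌ ethyl levulinate + water) with Arrhenius-type rate constants. The following isothermal batch sketch illustrates the structure of such a model; all parameter values and initial concentrations are hypothetical placeholders, not the values estimated in the thesis.

```python
"""Sketch: isothermal batch esterification with a reversible 2nd-order rate law.

LA + EtOH <-> ester + H2O. All kinetic parameters and initial concentrations
are hypothetical placeholders, not the values fitted in the thesis.
"""
import numpy as np
from scipy.integrate import solve_ivp

R = 8.314       # gas constant, J/(mol K)
T = 343.15      # isothermal batch temperature (K), hypothetical
k1 = 5.0e5 * np.exp(-55_000.0 / (R * T))   # forward rate constant (L/mol/min), hypothetical
K_EQ = 2.5                                  # equilibrium constant, hypothetical
k2 = k1 / K_EQ                              # reverse rate constant

def batch_rhs(t, c):
    """c = [LA, EtOH, ester, H2O] in mol/L; net esterification rate r."""
    la, etoh, ester, water = c
    r = k1 * la * etoh - k2 * ester * water
    return [-r, -r, r, r]

c0 = [1.0, 8.0, 0.0, 0.0]                   # hypothetical initial charge (mol/L)
sol = solve_ivp(batch_rhs, (0.0, 600.0), c0, max_step=1.0)

conversion = 1.0 - sol.y[0, -1] / c0[0]
print(f"levulinic acid conversion after 600 min: {conversion:.1%}")
```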

Relevance: 50.00%

Abstract:

Fiber-reinforced concrete (FRC) is a composite material consisting of discrete, discontinuous, and uniformly distributed fibers in plain concrete, used primarily to enhance the tensile properties of the concrete. FRC performance depends on the fiber, interface, and matrix properties. The use of fiber-reinforced concrete has been increasing substantially in recent years in different fields of the construction industry, such as ground-level applications in sidewalks and building floors, tunnel linings, aircraft parking areas, runways, slope stabilization, etc. Many experiments have been performed in the last decade to observe the short-term and long-term mechanical behavior of fiber-reinforced concrete, and numerous numerical models have been formulated to capture its response accurately. The main purpose of this dissertation is to numerically calibrate, at the mesoscale, the concrete and fiber parameters governing the short-term response for the three-point bending test and the cube compression test in the MARS framework, which is based on the lattice discrete particle model (LDPM), and later to validate the same parameters on round panels. LDPM is the most extensively validated of the mesoscale theories for concrete. Different seeds, representing different orientations of the concrete and fiber particles, are simulated to produce the mean numerical response. The results of the numerical simulations show that the lattice discrete particle model for fiber-reinforced concrete can capture the results of experimental tests on the behavior of fiber-reinforced concrete to a great extent.
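The calibration described above can be framed as a least-squares fit in which the mesoscale parameters are adjusted until the simulated load-deflection curves match the three-point bending and cube-compression tests. The sketch below shows only that outer loop: `run_mesoscale_model` is a toy surrogate standing in for a full MARS/LDPM simulation, the "experimental" curve is synthetic, and the parameter names are illustrative.

```python
"""Sketch of the outer calibration loop for mesoscale (LDPM-type) parameters.

Everything here is illustrative: `run_mesoscale_model` is a toy surrogate for
a MARS/LDPM three-point-bending run, and the "experimental" curve is synthetic;
only the calibration structure is the point.
"""
import numpy as np
from scipy.optimize import least_squares

deflection = np.linspace(0.0, 2.0, 40)     # deflection grid (mm), hypothetical

def run_mesoscale_model(params: np.ndarray, d: np.ndarray) -> np.ndarray:
    """Toy stand-in for an LDPM run: rising load followed by softening."""
    peak, softening = params
    return peak * d * np.exp(-softening * d)

# Synthetic "experimental" curve (in practice: mean of the laboratory tests)
true_params = np.array([12.0, 1.5])
exp_load = run_mesoscale_model(true_params, deflection)
exp_load += np.random.default_rng(0).normal(0.0, 0.1, exp_load.size)

def residuals(params: np.ndarray) -> np.ndarray:
    """Mismatch between simulated and experimental load-deflection curves."""
    return run_mesoscale_model(params, deflection) - exp_load

result = least_squares(residuals, x0=np.array([8.0, 1.0]),
                       bounds=([1.0, 0.1], [30.0, 5.0]))
print("calibrated parameters:", result.x)
```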