14 results for computer-aided qualitative data analysis software
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
The aim of tissue engineering is to develop biological substitutes that restore the lost morphological and functional features of diseased or damaged portions of organs. Recently, computer-aided technology has received considerable attention in tissue engineering, and the advance of additive manufacturing (AM) techniques has significantly improved control over the pore-network architecture of tissue engineering scaffolds. To regenerate tissues more efficiently, an ideal scaffold should have appropriate porosity and pore structure. More sophisticated porous configurations, with pore-network architectures and scaffolding structures that mimic the intricate architecture and complexity of native organs and tissues, are therefore required. This study adopts a macro-structural shape-design approach to the production of open porous materials (titanium foams), which exploits spatial periodicity as a simple way to generate the models. Among the various pore architectures that have been studied, this work models the pore structure with triply-periodic minimal surfaces (TPMS) for the construction of tissue engineering scaffolds. TPMS are shown to be a versatile source of biomorphic scaffold designs. A set of tissue scaffolds was designed using TPMS-based unit-cell libraries. The TPMS-based titanium foams were intended to be 3D-printed with the corresponding predicted geometry, microstructure and, consequently, mechanical properties. Through finite element analysis (FEA), the mechanical properties of the designed scaffolds were determined in compression and analyzed in terms of their porosity and unit-cell assemblies. The purpose of this work was to investigate the mechanical performance of TPMS models in order to identify the best compromise between the mechanical and geometrical requirements of the scaffolds. The intention was to predict the structural modulus of open porous materials via the structural design of interconnected three-dimensional lattices, hence optimizing their geometrical properties. With the aid of the FEA results, it is expected that the effective mechanical properties of the TPMS-based scaffold units can be used to design optimized scaffolds for tissue engineering applications. Regardless of the influence of the fabrication method, it is desirable to calculate scaffold properties so that their effect on tissue regeneration may be better understood.
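For illustration, here is a minimal sketch of how a TPMS level set defines a porous unit cell and how its porosity can be estimated by voxel sampling. It assumes the Schoen gyroid, one of the classic TPMS cells used in scaffold design; the function names and the sampling approach are illustrative, not the thesis's actual unit-cell library.

```python
import numpy as np

def gyroid(x, y, z):
    """Level-set approximation of the Schoen gyroid TPMS:
    f(x, y, z) = sin(x)cos(y) + sin(y)cos(z) + sin(z)cos(x)."""
    return np.sin(x) * np.cos(y) + np.sin(y) * np.cos(z) + np.sin(z) * np.cos(x)

def porosity(iso=0.0, n=64):
    """Estimate the pore volume fraction of one 2*pi-periodic unit cell
    by voxel sampling: voxels with f < iso are treated as pore space."""
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    x, y, z = np.meshgrid(t, t, t, indexing="ij")
    return float(np.mean(gyroid(x, y, z) < iso))

if __name__ == "__main__":
    # Sweeping the iso-value shifts the solid/pore balance of the cell.
    for iso in (-0.6, 0.0, 0.6):
        print(f"iso = {iso:+.1f}  ->  porosity ~ {porosity(iso):.2f}")
```

Sweeping the iso-value is one simple lever for trading porosity against strut thickness, which is the kind of geometrical requirement weighed against the mechanical one in the study.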
Abstract:
LHC experiments produce an enormous amount of data, estimated to be of the order of a few petabytes per year. Data management takes place on the Worldwide LHC Computing Grid (WLCG) infrastructure, for both storage and processing operations. In recent years, however, many more resources have become available on High Performance Computing (HPC) farms, which generally consist of many computing nodes with a high number of processors. Large collaborations are working to use these resources in the most efficient way, compatibly with the constraints imposed by their computing models (data distributed on the Grid, authentication, software dependencies, etc.). The aim of this thesis project is to develop a software framework that allows users to run a typical data analysis workflow of the ATLAS experiment on HPC systems. The developed analysis framework will be deployed on the computing resources of the Open Physics Hub project and on the CINECA Marconi100 cluster, in view of the switch-on of the Leonardo supercomputer, foreseen in 2023.
Abstract:
The Agile principles, published in the eponymous Manifesto more than 20 years ago, are nowadays embodied in a multitude of frameworks: Scrum, XP, Kanban, Lean, Adaptive, Crystal, etc. The first part of the thesis (Chapters 1 and 2) describes some of these frameworks and analyses how an Agile approach is applied in practice in a specific use case: the development, by a lab51 team, of a software platform supporting an e-grocery system. The differences and similarities with respect to some Agile methods formalized in the literature are examined, explaining the reasons that led the team to deviate from these frameworks and illustrating the resulting advantages for the team. The second part of the thesis (Chapters 3 and 4) presents an analysis of the data collected by the online supermarket over recent years, with the aim of improving the reordering algorithm. In particular, to forecast the sales of individual products, so as to place orders of more adequate quantity and frequency, several approaches were studied: from statistical time series forecasting models, to neural networks, up to a methodology developed ad hoc.
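As a minimal sketch of the statistical time series forecasting family mentioned above (not the thesis's actual model), the snippet below fits a simple ARIMA model to a product's weekly sales with statsmodels and forecasts the next few weeks; the sales series and the model order are illustrative assumptions.

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical weekly sales history for a single product (units sold).
sales = pd.Series(
    [12, 15, 14, 20, 18, 22, 19, 25, 23, 28, 26, 30],
    index=pd.date_range("2023-01-01", periods=12, freq="W"),
)

# A simple ARIMA(1, 1, 1): one autoregressive term, first differencing
# to remove trend, one moving-average term.
model = ARIMA(sales, order=(1, 1, 1)).fit()

# Forecast demand for the next 4 weeks to drive reorder quantities.
print(model.forecast(steps=4))
```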
Abstract:
The objective of this dissertation is to study the structure and behavior of the Atmospheric Boundary Layer (ABL) in stable conditions. This type of boundary layer is not yet completely understood, although it is very important for many practical applications, from forecast modeling to the atmospheric dispersion of pollutants. We analyzed data from the SABLES98 experiment (Stable Atmospheric Boundary Layer Experiment in Spain, 1998) and compared their behaviour against Monin-Obukhov similarity functions for wind speed and potential temperature. By analyzing the vertical profiles of various variables, in particular the thermal and momentum fluxes, we identified two main contrasting structures describing two different states of the SBL: a traditional and an upside-down boundary layer. We determined the main features of these two states in terms of vertical profiles of potential temperature and wind speed, turbulent kinetic energy and fluxes, studying the time series and the vertical structure of the atmosphere for two separate nights in the dataset, taken as case studies. We also developed an original classification of the SBL, aimed at separating the influence of mesoscale phenomena from turbulent behavior, using the wind speed and the gradient Richardson number as parameters. We then compared these two formulations on the SABLES98 dataset, verifying their validity for different variables (wind speed and potential temperature, and their differences, at different heights) and with different stability parameters (ζ or the gradient Richardson number). Despite the completely different physical origins of the two classifications, we were able to find some common behavior, in particular under weak stability conditions.
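For reference, Monin-Obukhov similarity theory expresses the dimensionless wind-speed and potential-temperature gradients as universal functions of the stability parameter ζ = z/L. The block below quotes the standard definitions together with a common stable-side (Businger-Dyer type) parameterization, with β ≈ 5 as an indicative textbook value; it is background material, not a result of the thesis.

```latex
% Dimensionless gradients in Monin-Obukhov similarity theory:
\phi_m(\zeta) = \frac{\kappa z}{u_*}\,\frac{\partial U}{\partial z}, \qquad
\phi_h(\zeta) = \frac{\kappa z}{\theta_*}\,\frac{\partial \theta}{\partial z}, \qquad
\zeta = \frac{z}{L}
% Common stable-side (Businger-Dyer type) forms, with beta ~ 5:
\phi_m(\zeta) \simeq \phi_h(\zeta) \simeq 1 + \beta\,\zeta
% Gradient Richardson number, the alternative stability parameter:
Ri_g = \frac{(g/\theta)\,\partial\theta/\partial z}{(\partial U/\partial z)^2}
```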
Abstract:
In particle physics, performing data analysis requires a large amount of computing and storage capacity. The LHC Computing Grid is a globally distributed computing infrastructure and, at the same time, a set of services, developed by a large community of physicists and computer scientists and spread across computing centres all over the world. This infrastructure has proven its value in the analysis of the data collected during Run 1 of the LHC, playing a fundamental role in the discovery of the Higgs boson. Today, Cloud computing is emerging as a new computing paradigm for accessing large amounts of resources shared by numerous scientific communities. Given the technical requirements of LHC Run 2 (and beyond), the scientific community is interested in contributing to the development of Cloud technologies and in verifying whether they can provide a complementary approach, or even a valid alternative, to the existing technological solutions. The purpose of this thesis is to test a Cloud infrastructure and compare its performance with that of the LHC Computing Grid. Chapter 1 gives a general account of the Standard Model. Chapter 2 describes the LHC accelerator and the experiments operating at it, with particular attention to the CMS experiment. Chapter 3 covers computing in high-energy physics and examines the Grid and Cloud paradigms. Chapter 4, the last of this work, reports the results of my comparative analysis of Grid and Cloud performance.
Abstract:
Over time, Twitter has become a fundamental source of news. As a step forward, researchers have tried to analyse whether tweets carry predictive power. In the financial field, much past research has aimed at proposing a function that takes as input all the tweets about a particular stock or index s, analyses them, and predicts the stock or index price of s. In this work, we take an alternative approach: using stock prices and tweet information, we investigate the following questions. 1. Is there any relation between the amount of tweets generated and the volume of stock exchanged? 2. Is there any relation between the sentiment of the tweets and stock prices? 3. What is the structure of the graph that describes the relationships between users?
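As a minimal sketch of question 1 (an illustrative assumption, not the thesis's actual pipeline), the snippet below correlates daily tweet counts with daily traded volume for a single stock; the column names and data are hypothetical.

```python
import pandas as pd

# Hypothetical aligned daily series for one stock: number of tweets
# mentioning the ticker, and the number of shares exchanged.
df = pd.DataFrame(
    {
        "tweet_count": [120, 340, 95, 410, 280, 150, 390],
        "traded_volume": [1.1e6, 2.9e6, 0.8e6, 3.4e6, 2.2e6, 1.3e6, 3.1e6],
    },
    index=pd.date_range("2015-03-02", periods=7, freq="B"),
)

# Pearson correlation between tweet activity and trading activity;
# values near +1 would suggest the two series move together.
print(df["tweet_count"].corr(df["traded_volume"]))
```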
Abstract:
VIRTIS, on board Venus Express, is a spectrometer capable of operating from 0.25 to 5 µm. Between 2006 and 2011 it collected an enormous amount of data, but to date the limb observations have been little used for the study of clouds and hazes, especially at night. Limb spectra at mesospheric altitudes are dominated by the radiance coming from the clouds and scattered towards the instrument by the hazes. The interpretation of limb spectra therefore cannot disregard the characterization of the entire atmospheric column. The objective of this thesis is to carry out a statistical analysis of the nadir observations and to propose a methodology for characterizing the hazes by combining nadir and limb observations. The clouds were characterized on a sample of more than 3700 nadir observations. A large dataset of synthetic spectra was created by varying, in an initial model, several cloud parameters such as chemical composition and particle number and size. A fitting procedure was applied to the observations to establish which model best described the observed spectra, and a statistical analysis was then performed on the results for the sample. A very high sulphuric acid concentration was derived for the lower clouds, equal to 96% by mass, which departs from the generally adopted value of 75%. The nadir results were then integrated with a targeted study of a few limb observations, selected so that their tangent point intercepts the atmospheric column observed at nadir, in order to derive information on the hazes. The results of a Monte Carlo model indicate that the particle number and sizes predicted by the baseline model must be reduced significantly. In particular, a lowering of the maximum altitude of the hazes is observed with respect to daytime observations.
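A minimal sketch of the kind of fitting procedure described above (illustrative only; the thesis's actual forward model and fit metric are not specified here): each observed spectrum is compared against a library of synthetic spectra, and the best-fitting model is selected by least squares.

```python
import numpy as np

def best_fit_model(observed, synthetic_library):
    """Select the synthetic spectrum minimizing the sum of squared
    residuals against the observed radiance spectrum.

    observed: 1-D array of radiances on a common wavelength grid.
    synthetic_library: dict mapping model labels to 1-D arrays on
        the same grid (hypothetical structure).
    """
    scores = {
        label: float(np.sum((observed - model) ** 2))
        for label, model in synthetic_library.items()
    }
    best = min(scores, key=scores.get)
    return best, scores[best]

# Toy usage with made-up spectra on a 100-point grid (0.25-5 um).
grid = np.linspace(0.25, 5.0, 100)
library = {"h2so4_75pct": np.exp(-grid), "h2so4_96pct": np.exp(-1.2 * grid)}
obs = np.exp(-1.19 * grid) + np.random.default_rng(0).normal(0, 0.01, grid.size)
print(best_fit_model(obs, library))
```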
Abstract:
The aim of this novel experimental study is to investigate the behaviour of a 2 m × 2 m model of a masonry groin vault, built by assembling blocks made of a 3D-printed plastic skin filled with mortar. The groin vault was chosen because of the widespread presence of this vulnerable roofing system in the historical built heritage. Experimental tests on the shaking table are carried out to explore the vault response under two support boundary conditions, involving four lateral confinement modes. Processing of the marker displacement data made it possible to examine the collapse mechanisms of the vault, based on the deformed shapes of the arches. A numerical evaluation then follows, to provide the orders of magnitude of the displacements associated with these mechanisms. Given that these displacements are related to the shortening and elongation of the arches, the final objective is the definition of a critical elongation between two diagonal bricks and, consequently, of a diagonal portion. This study aims to continue the previous work and to take a further step forward in research on the effects of ground motion on masonry structures.
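As a minimal sketch of the kind of marker-based processing described above (the function name and the data are hypothetical), the snippet below computes the time history of the distance between two markers and its elongation relative to the initial configuration.

```python
import numpy as np

def elongation(marker_a, marker_b):
    """Relative elongation of the segment between two markers over time.

    marker_a, marker_b: arrays of shape (n_frames, 3) with x, y, z
    positions recorded during a shaking-table test (hypothetical data).
    Returns (d(t) - d(0)) / d(0): positive values indicate elongation,
    negative values indicate shortening.
    """
    d = np.linalg.norm(marker_a - marker_b, axis=1)
    return (d - d[0]) / d[0]

# Toy usage: two markers on diagonal bricks, four recorded frames (m).
a = np.array([[0.0, 0.0, 0.0], [0.001, 0.0, 0.0],
              [0.002, 0.0, 0.0], [0.004, 0.0, 0.0]])
b = np.array([[1.0, 1.0, 0.0], [1.0, 1.001, 0.0],
              [1.0, 1.002, 0.0], [1.0, 1.005, 0.0]])
print(elongation(a, b))  # fractions of the initial diagonal length
```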
Abstract:
The Probe for LUminosity MEasurement (PLUME) detector is a luminometer for the LHCb experiment at CERN. It will provide instantaneous luminosity measurements for LHCb during Run 3 of the LHC. The goal of this thesis is to evaluate, with simulated data, the expected performance of PLUME, such as the occupancy of the PMTs that make up the detector, and to report the analysis of the first data obtained by PLUME during a Van der Meer scan. In particular, three measurements of the cross-section value, needed to calibrate the detector, were obtained: σ1Da = (1.14 ± 0.11) mb, σ1Db = (1.13 ± 0.10) mb, σ2D = (1.20 ± 0.02) mb, where the subscripts 1D and 2D correspond to one-dimensional and two-dimensional Van der Meer scans. All the results are in agreement with each other.
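For context, a Van der Meer scan determines the visible cross-section by sweeping the two beams across each other and measuring the interaction rate as a function of beam separation. The standard one-dimensional-scan relation is quoted below for reference (textbook form, not taken from the thesis).

```latex
% sigma_vis from a Van der Meer scan: mu_vis^max is the peak number of
% visible interactions per bunch crossing at zero beam separation,
% Sigma_x and Sigma_y the effective convolved beam widths from the x and
% y scans, and N_1, N_2 the colliding bunch populations.
\sigma_{\mathrm{vis}} = \frac{2\pi\,\Sigma_x\,\Sigma_y\,\mu_{\mathrm{vis}}^{\mathrm{max}}}{N_1\,N_2}
```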
Abstract:
The thesis is the result of work conducted over a six-month period at the Strategy department of Automobili Lamborghini S.p.A. in Sant'Agata Bolognese (BO) and concerns the study and analysis of Big Data relating to Lamborghini's connected cars. The Big Data initiative is part of the Connected Car Project House, an inter-departmental team working toward the definition of Lamborghini's corporate connectivity strategy and its implementation in the product portfolio. Connected-car data is currently one of the hottest topics in the automotive industry; all the largest automotive companies are investing heavily in this direction, in order to derive the greatest advantages both from a purely economic point of view, because these data reveal a great deal about the behaviours and habits of each driver, and from a technological point of view, because they will increasingly promote the development of 5G, which will be an important enabler for the future of connectivity. The main purpose of the work, from Lamborghini's perspective, is to analyze the data of the connected cars, in particular a dataset referring to connected Huracans already placed on the market, and, starting from that point, derive valuable Key Performance Indicators (KPIs) on which the company could partly base the decisions to be made in the near future. The key result obtained at the end of this period was the creation of a dashboard in which it is possible to visualize many parameters and indicators related both to driving habits and to the use of the vehicle itself; it has provided great insight into the huge potential and value behind the study of these data. The final demo of the project attracted great interest, not only from the whole Strategy department but also from all the other business areas of Lamborghini, creating widespread awareness that this will be the road to follow in the coming years.
Abstract:
Many natural events can negatively affect the urban ecosystem, but weather-climate variations are certainly among the most significant. The history of settlements has been characterized by extreme events such as earthquakes and floods, which recur at different times, causing extensive damage to the built heritage at structural and urban scales. Changes in climate also alter various climatic subsystems, changing rainfall regimes and hydrological cycles and increasing the frequency and intensity of extreme precipitation events (heavy rainfall). From a hydrological risk perspective, it is crucial to understand which future events could occur, and with what magnitude, in order to design safer infrastructures. Unfortunately, future scenarios are not easy to anticipate, as the complexity of the climate system is enormous. For this thesis, precipitation and discharge extremes were primarily used as data sources. It is important to underline that the two datasets are not independent: changes in the rainfall regime due to climate change could significantly affect overflows into receiving water bodies. Understanding and modeling the effects of climate change on water structures is imperative to support the development of adaptation strategies. The main purpose of this thesis is to identify suitable water structures for a road located along the Tione River; through a hydrological analysis of the area, we aim to guarantee the safety of the infrastructure over time. The observations made are intended to show how models, such as stochastic ones, can improve the quality of an analysis for design purposes and influence design choices.
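As a minimal sketch of a standard extreme-value workflow for design events (an illustrative assumption; the thesis's stochastic model is not specified here), the snippet below fits a Generalized Extreme Value (GEV) distribution to hypothetical annual maximum discharges with SciPy and estimates the 100-year return level.

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical annual maximum discharges (m^3/s) for a river section.
annual_maxima = np.array(
    [85.0, 120.0, 64.0, 140.0, 98.0, 155.0, 77.0, 110.0, 132.0, 90.0,
     102.0, 168.0, 73.0, 125.0, 95.0]
)

# Fit a GEV distribution to the annual maxima (shape, location, scale).
shape, loc, scale = genextreme.fit(annual_maxima)

# The T-year return level is the (1 - 1/T) quantile of the fitted GEV.
T = 100
q100 = genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
print(f"Estimated {T}-year design discharge: {q100:.1f} m^3/s")
```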
Abstract:
Nowadays, product development in all its phases plays a fundamental role in the industrial chain. The need for a company to compete at a high level and to respond quickly to market demands, and therefore to engineer its products rapidly and with a high level of quality, has driven the adoption of new, more advanced methods and processes. In recent years, industry has been moving away from 2D-based design and production and approaching the concept of Model-Based Definition. With this approach, increasingly complex systems become easier to handle and, above all, cheaper to obtain. Thanks to Model-Based Definition, data can be shared in a lean and simple way with the entire engineering and production chain of the product. The great advantage of this approach is precisely the uniqueness of the information. In this thesis work, the approach has been exploited in the context of tolerances with the aid of CAD/CAT software. Tolerance analysis, or dimensional variation analysis, is a way to understand how sources of variation in part dimensions and assembly constraints propagate between parts and assemblies, and how the resulting variation affects the ability of a design to meet its requirements. It is critically important to note that tolerances directly affect the cost and performance of products. Worst-Case Analysis (WCA) and statistical Root Sum of Squares (RSS) analysis are the two principal methods in dimensional variation analysis. The thesis aims to show the advantages of statistical dimensional analysis by creating and examining various case studies, using the PTC CREO software for CAD modeling and CETOL 6σ for tolerance analysis. Moreover, a comparison between manual and 3D analysis is provided, focusing on the information lost in the 1D case. The results obtained highlight the need to use this approach from the early stages of the product design cycle.
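As a minimal illustration of the two stack-up methods named above (generic textbook formulas, not the CETOL 6σ implementation): worst-case analysis sums the individual tolerances linearly, while RSS combines them as the square root of the sum of squares, which is less conservative when the contributions are statistically independent.

```python
import math

def worst_case(tolerances):
    """Worst-case stack-up: every contributor at its limit simultaneously."""
    return sum(tolerances)

def rss(tolerances):
    """Root-sum-of-squares stack-up: contributors treated as independent
    random variables, so their variances (not their limits) add."""
    return math.sqrt(sum(t * t for t in tolerances))

# Hypothetical four-part linear stack with +/- tolerances in mm.
stack = [0.10, 0.05, 0.08, 0.12]
print(f"WCA: +/- {worst_case(stack):.3f} mm")  # 0.350 mm
print(f"RSS: +/- {rss(stack):.3f} mm")         # ~0.183 mm

# RSS predicts a tighter assembly variation, which is why statistical
# analysis can relax part tolerances (and cost) for the same requirement.
```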
Abstract:
This thesis was developed in the context of the Ritmare project WP1, whose main objective is the development of a sustainable fishery through the identification of population boundaries in commercially important species in Italian seas. Three main objectives are discussed in support of the overall goal of identifying stock boundaries in Parapenaeus longirostris: 1) development of a representative sampling design for the Italian seas; 2) evaluation of the 2b-RAD protocol; 3) investigation of populations through biological data analysis. First of all, we defined and carried out a sampling design that properly represents all Italian seas. We then used information and data on the distribution of nursery areas, the abundance of populations and the importance of P. longirostris in local fisheries to develop an experimental design that prioritizes the most important areas, so as to maximize results within the available project funds. We introduced, for the first time on this species, the use of 2b-RAD, a genotyping method based on sequencing the uniform fragments produced by type IIB restriction endonucleases. Thanks to this method we were able to move from genetics to the more complex field of genomics. In order to proceed with 2b-RAD, we performed several tests to identify the best DNA extraction kit and protocol, and finally obtained 192 high-quality DNA extracts ready for processing. We tested 2b-RAD on five samples and, after high-throughput sequencing of the libraries, used the software “Stacks” to analyze the sequences. We obtained positive results, identifying a large number of SNP markers among the five samples. To guarantee a multidisciplinary approach, we used the biological data associated with the collected samples to investigate differences between geographical samples. This approach ensures continuity with other projects, for instance STOCKMED, which likewise combine molecular and biological analyses.
Abstract:
One of the problems most underestimated by smartphone users is the security of the data on their mobile devices. Today smartphones and tablets are used to send messages and photos and, especially, to stay connected with social networks, forums and other platforms. These devices contain a great deal of private information, such as passwords, phone numbers, private photos and emails, and an attacker may choose to steal or destroy this information. The main topic of this thesis is the security of the applications present on the most popular stores (App Store for iOS and Play Store for Android) and of the stores' mechanisms for managing security. The analysis focuses on how the architecture of the two systems protects users from threats, and highlights the actual presence of malware and spyware in the respective application stores. The work described in the subsequent chapters covers the study of the behavior of 50 Android applications and 50 iOS applications, performed using network analysis software. Furthermore, this thesis presents some statistics about the malware and spyware present on the respective stores and the permissions they require. In the end, the reader will be able to understand how to recognize malicious applications and which of the two systems is more suitable for them. The thesis is structured as follows. The first chapter introduces the security mechanisms of the Android and iOS platform architectures and of their respective application stores. The second chapter explains the work done: what tools were needed to complete our analysis, why, and how we chose them. The third chapter discusses the execution of the tests, the protocol followed and the approach used to assess the “level of danger” of each application checked. The fourth chapter presents the results of the tests and introduces some statistics on the presence of malicious applications on the Play Store and App Store. The fifth chapter is devoted to the study of users: what they think about malicious applications and how they might avoid them. The sixth chapter seeks to establish, following our methodology, which application store is safer. Finally, the seventh chapter concludes the thesis.