17 results for Design and Analysis of Computer Experiments (DACE)
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
Laterally loaded piles occur in a large number of situations in which deep foundations are used. The dissertation reported herein focuses on the numerical simulation of laterally loaded piles. The first chapter discusses the best model settings at length, in order to give a clear picture of the effects of interface adoption, model dimension, refinement clusters and mesh coarseness. At a second stage, three distinct parametric analyses study the sensitivity of the model response to variations in the interface reduction factor, Eps50 and tensile cut-off. In addition, the adoption of an advanced soil model (NGI-ADP) is analysed, in order to capture the complex behaviour (involving different undrained shear strengths) that governs the resisting process of clay under short-term static loads. Once a definitive model was set, a series of analyses was carried out with the objective of deriving resistance-deflection (P-y) curves from Plaxis 3D (2013) data. The main result of a large number of comparisons with curves from the API (American Petroleum Institute) recommendation is that the empirical curves have almost the same ultimate resistance but a higher initial stiffness. In the second part of the thesis a simplified preliminary structural design of a jacket structure is carried out to evaluate the environmental forces acting on it and on its pile foundation. Finally, the lateral response of the piles is studied using the empirical curves.
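The API comparison above rests on the Matlock-type soft-clay p-y formulation, which combines the ultimate resistance, the reference deflection y50 (built from Eps50) and a cube-root backbone. As a rough illustration, the static curve can be sketched as follows; the function and the parameter values are assumptions for demonstration, not the thesis' Plaxis 3D results:

```python
def api_soft_clay_py(y, c, D, z, gamma_eff, eps50, J=0.5):
    """Static p-y curve for soft clay (Matlock-type, as in the API
    recommendation).  Returns soil resistance p [kN/m] at lateral
    deflection y [m].
    c: undrained shear strength [kPa], D: pile diameter [m],
    z: depth [m], gamma_eff: effective unit weight [kN/m3],
    eps50: strain at half the maximum deviatoric stress [-]."""
    # Ultimate resistance: wedge failure near the surface, capped by
    # flow-around failure (Np = 9) at depth
    Np = min(3.0 + gamma_eff * z / c + J * z / D, 9.0)
    pu = Np * c * D
    y50 = 2.5 * eps50 * D              # reference deflection
    if y >= 8.0 * y50:
        return pu                      # plateau: ultimate resistance
    return 0.5 * pu * (y / y50) ** (1.0 / 3.0)
```

The cube-root branch gives the empirical curve its very stiff initial response, consistent with the comparison reported above; the curve reaches the plateau p = pu at y = 8·y50.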
Abstract:
In a world focused on the need to produce energy for a growing population while reducing atmospheric emissions of carbon dioxide, organic Rankine cycles represent one solution to this goal. This study focuses on the design and optimization of axial-flow turbines for organic Rankine cycles. From the turbine designer's point of view, most of these fluids exhibit peculiar characteristics, such as a small enthalpy drop, a low speed of sound and a large expansion ratio. A computational model for the prediction of axial-flow turbine performance is developed and validated against experimental data; it calculates turbine performance within an accuracy of ±3%. The design procedure is coupled with an optimization process performed with a genetic algorithm, where the turbine total-to-static efficiency is the objective function. The computational model is integrated into a wider analysis of thermodynamic cycle units by providing the optimal turbine design. First, the calculation routine is applied in the context of the Draugen offshore platform, where three heat recovery systems are compared. Turbine performance is investigated for three competing bottoming cycles: an organic Rankine cycle (operating with cyclopentane), a steam Rankine cycle and an air bottoming cycle. Findings indicate the air turbine as the most efficient solution (total-to-static efficiency = 0.89), while the cyclopentane turbine proves to be the most flexible and compact technology (2.45 ton/MW and 0.63 m3/MW). Furthermore, the study shows that, for the organic and steam Rankine cycles, the optimal design configurations of the expanders do not coincide with those of the thermodynamic cycles; this suggests that a more accurate analysis could be obtained by including the computational model in the simulations of the thermodynamic cycles. Afterwards, the performance analysis is carried out by comparing three organic fluids: cyclopentane, MDM and R245fa.
Results suggest MDM as the most effective fluid from the turbine performance viewpoint (total-to-total efficiency = 0.89). On the other hand, cyclopentane guarantees a greater net power output of the organic Rankine cycle (P = 5.35 MW), while R245fa represents the most compact solution (1.63 ton/MW and 0.20 m3/MW). Finally, the influence of the composition of an isopentane/isobutane mixture on both the thermodynamic cycle performance and the expander isentropic efficiency is investigated. Findings show how the mixture composition affects the turbine efficiency and thus the cycle performance. Moreover, the analysis demonstrates that the use of binary mixtures leads to an enhancement of the thermodynamic cycle performance.
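The coupling of a performance model with a genetic algorithm, as described above, can be illustrated with a minimal sketch. The surrogate efficiency function below is a made-up placeholder standing in for the validated mean-line model, and the design variables (flow and loading coefficients) and their bounds are assumptions:

```python
import random

def eta_ts(phi, psi):
    """Placeholder surrogate for the turbine total-to-static efficiency
    as a function of flow coefficient phi and loading coefficient psi.
    In the thesis this role is played by the validated mean-line model;
    here it is just a smooth single-optimum function for demonstration."""
    return 0.89 - 0.3 * (phi - 0.6) ** 2 - 0.15 * (psi - 1.0) ** 2

def genetic_optimize(fitness, bounds, pop_size=30, generations=60, seed=1):
    """Maximize fitness over a box with a simple real-coded GA:
    truncation selection, blend crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(*ind), reverse=True)
        elite = pop[: pop_size // 2]               # keep the best half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            w = rng.random()                       # blend crossover
            child = [w * x + (1.0 - w) * y for x, y in zip(a, b)]
            i = rng.randrange(len(bounds))         # mutate one gene
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0.0, 0.05 * (hi - lo))))
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda ind: fitness(*ind))

# Optimal flow/loading coefficients over assumed bounds
best_phi, best_psi = genetic_optimize(eta_ts, [(0.2, 1.2), (0.6, 2.0)])
```

Because the elite half is carried over unchanged, the best design never degrades from one generation to the next, which suits a deterministic objective like a mean-line efficiency evaluation.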
Abstract:
Numerical modelling and simulations are needed to develop and test specific analysis methods by providing test data before BIRDY is launched. This document describes the "satellite data simulator", a multi-sensor, multi-spectral satellite simulator produced especially for the BIRDY mission, which could also be used to analyse data from other satellite missions that provide energetic-particle data in the Solar system.
Abstract:
This work introduces the problem of ground heave following large excavations in clay. Heave after an excavation can go unnoticed, but there are numerous cases in which swelling continues for many years and even decades (Shell Centre, London; Lion Yard, Cambridge; Bell Common, London; etc.). This heave is most often restrained by the presence of foundations, generating a distributed pressure that, if not considered at the design stage, can lead to cracking of the foundation itself. The core of the project is the modelling and analysis of the heave of large excavations in clay, comparing the results with real data available in the literature. The idea for the project stems from the difficulty of obtaining reliable estimates and predictions of heave following large excavations in overconsolidated clay. I first examined the theory and the factors that influence the magnitude and rate of swelling, such as stiffness, permeability, fissuring, soil structure, etc. I then studied the swelling behaviour of overconsolidated clays following stress relief (excavation), highlighting the importance of distinguishing primary swelling from secondary swelling due to creep. The central theme of the project is the numerical analysis with FLAC of two large excavations in clay: Lion Yard, Cambridge, and Bell Common, London. Through a detailed parametric analysis I was able to find the parameters that best model the real behaviour in the two case studies, making it possible to obtain reliable estimates and predictions of ground heave following large excavations.
The modelled excavations, Lion Yard and Bell Common, are in Gault Clay and London Clay respectively. Drawing on well-known recent scientific articles, I was able to highlight the main properties that distinguish the two soils; these properties differ markedly from the characteristics normally assumed when designing in clayey ground, and I was thus able to implement the best parameters to describe the behaviour of the two soils in the various models. I also studied soil-structure interaction: the pressure exerted by ground swelling is strictly a function of the connection between the shallow foundation and the retaining wall, and this pressure must not be ignored at the design stage since it can reach significant values. In the Lion Yard excavation, considering the presence of deep foundations, I showed that heave creates a distributed shear force between the foundation piles and the soil; this action should also be considered in design. The problem is not limited to soil-foundation interaction: during the excavation of major London foundations, stress relief has caused significant upward displacement of stretches of underground railway tunnels, a phenomenon that can create serious safety problems in the public transport network. Finally, the FLAC results were compared with those of simplified methods: using O'Brien's iterative method, the results are close to reality and the computation time is far lower than the 2-3 days required by FLAC. In conclusion, a detailed parametric analysis made it possible to estimate ground heave in overconsolidated clay in the two cases analysed.
Abstract:
The aim of this work is to conduct a finite element analysis of a small-size concrete beam and of a full-size concrete beam, both internally reinforced with BFRP and exposed to elevated temperatures. Experimental tests performed at Kingston University were used to compare the results of the numerical analysis for the small-size concrete beam. Once the behavior of the small-size beam at room temperature is investigated, the heating phase follows: the reinforced beams are tested at 100°C, 200°C and 300°C under load. The finite element analysis aims to reproduce the three-point bending test carried out in the oven during the exposure of the beam at room temperature and at elevated temperatures. The performance and deformability of reinforced beams are directly correlated to the material properties, and a broad analysis of the elastic modulus and the coefficient of thermal expansion is given in this work. Developing a good correlation between the numerical model and the experimental test is the main objective of the analysis of the small-size concrete beam; for both models, a further aim is to estimate the deterioration of the material properties due to the heating process and the influence of different parameters on the final result. The focus of the full-size modelling, which occupies the last part of this work, is to evaluate the effect of elevated temperatures, the material deterioration and the deflection trend on a reinforced beam of a different size. A comparison between the results of the different models has been developed.
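The link between modulus degradation and deflection can be sketched with elementary beam theory. The midspan formula is the standard three-point-bending result; the retention factors below are illustrative placeholders, not the values derived from the Kingston University tests:

```python
def midspan_deflection(P, L, E, I):
    """Elastic midspan deflection of a simply supported beam under a
    central point load (three-point bending): delta = P*L^3/(48*E*I)."""
    return P * L**3 / (48.0 * E * I)

def degraded_modulus(E_room, T, retention=None):
    """Elastic modulus at temperature T [degC], via a linearly
    interpolated retention factor.  The factors are placeholders, not
    the experimentally derived values."""
    if retention is None:
        retention = {20: 1.00, 100: 0.95, 200: 0.80, 300: 0.60}
    temps = sorted(retention)
    if T <= temps[0]:
        return E_room * retention[temps[0]]
    if T >= temps[-1]:
        return E_room * retention[temps[-1]]
    for t0, t1 in zip(temps, temps[1:]):
        if t0 <= T <= t1:
            f = retention[t0] + (retention[t1] - retention[t0]) * (T - t0) / (t1 - t0)
            return E_room * f

# Same beam, same load, at room temperature and at 300 degC
d20 = midspan_deflection(10e3, 2.0, degraded_modulus(30e9, 20), 1e-4)
d300 = midspan_deflection(10e3, 2.0, degraded_modulus(30e9, 300), 1e-4)
```

Since the deflection scales with 1/E, any loss of stiffness with temperature shows up directly as a proportionally larger midspan deflection, which is the trend the full-size model is meant to capture.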
Abstract:
This research work presents the design and implementation of an FFT pruning block, an extension to the FFT core for OFDM demodulation that enables run-time pruning of the FFT algorithm without any restrictions on the distribution pattern of the active/inactive sub-carriers. The design and implementation of the FFT processor core itself is not part of this work. The whole design was prototyped on an ALTERA STRATIX V FPGA to evaluate the performance of the pruning engine. Synthesis and simulation results showed that the logic overhead introduced by the pruning block is limited to 10% of the total resource utilization. Moreover, in the presence of a medium-high scattering of the sub-carriers, the power and energy consumption of the FFT core were reduced by 30%.
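The idea of pruning can be illustrated in software with a toy output-pruned DFT: only the bins of active sub-carriers are evaluated, and all work tied to inactive ones is skipped. This is a conceptual sketch (a direct per-bin sum), not the butterfly-level pruning engine implemented on the FPGA:

```python
import cmath

def pruned_dft(x, active_bins):
    """Compute only the DFT bins listed in active_bins (the active OFDM
    sub-carriers).  Skipping the inactive bins illustrates the pruning
    principle; a hardware pruning engine instead skips whole butterflies
    inside the FFT dataflow."""
    N = len(x)
    out = {}
    for k in active_bins:
        out[k] = sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                     for n in range(N))
    return out
```

For a single complex tone on sub-carrier 3, only bins 0 and 3 are computed below; bin 3 collects the full energy N while bin 0 sums to zero over the period.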
Abstract:
Data Distribution Management (DDM) is a core part of the High Level Architecture (HLA) standard: its goal is to optimize the resources used by simulation environments to exchange data. It has to filter and match the set of information generated during a simulation so that each federate (a simulation entity) only receives the information it needs. It is important that this is done quickly and well, to achieve better performance and avoid the transmission of irrelevant data, otherwise network resources may saturate quickly. The main topic of this thesis is the implementation of an impartial DDM testbed. It evaluates the quality of DDM approaches of all kinds: it supports both region-based and grid-based approaches, and it may also support other methods yet to be devised. It ranks them using three factors: execution time, memory and distance from the optimal solution. A prearranged set of instances is already available, but we also allow the creation of instances with user-provided parameters. The thesis is structured as follows. We start by introducing what DDM and HLA are and what they do in detail. Then, in the first chapter, we describe the state of the art, providing an overview of the most well-known resolution approaches and the pseudocode of the most interesting ones. The third chapter describes how the testbed we implemented is structured. In the fourth chapter we present and compare the results obtained from the execution of the four approaches we implemented. The result of the work described in this thesis can be downloaded from SourceForge at the following link: https://sourceforge.net/projects/ddmtestbed/. It is licensed under the GNU General Public License version 3.0 (GPLv3).
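For region-based approaches, matching reduces to testing whether an update region and a subscription region overlap on every dimension. A brute-force baseline of the kind such a testbed can compare against might look like the sketch below (the names and the region encoding are illustrative, not the testbed's actual API):

```python
def overlaps(r1, r2):
    """Two regions match iff their extents overlap on every dimension.
    A region is a list of (lower, upper) bounds, one pair per dimension."""
    return all(lo1 < up2 and lo2 < up1
               for (lo1, up1), (lo2, up2) in zip(r1, r2))

def brute_force_matching(updates, subscriptions):
    """Baseline region-based DDM matching: test every update region
    against every subscription region, O(n*m) overlap tests.  Smarter
    (e.g. grid-based or sort-based) approaches can be checked for
    correctness against this exact result."""
    return {(i, j)
            for i, u in enumerate(updates)
            for j, s in enumerate(subscriptions)
            if overlaps(u, s)}
```

Because the brute-force result is exact, it doubles as the "optimal solution" yardstick when ranking approximate approaches by distance from the correct match set.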
Abstract:
Composite material component design and production techniques are discussed in this graduation paper. In particular, it covers the design and production process of a carbon-fiber composite component for a high-performance car, more specifically the Dallara T12 race car. The paper is split in two parts. After a brief introduction on existing composite materials (their origins and applications), the first part covers the main theoretical concepts behind the design of composite material components, with particular focus on carbon-fiber composites. The second part covers the whole design and production process that the candidate carried out to create the new front mainplane of the Dallara T12 race car. This graduation paper is the result of a six-month internship that the candidate completed as a Design Office Trainee at Dallara Automobili S.p.A.
Abstract:
The association of several favorable factors has resulted in the development of a wide barchan dune field that stands out as a fundamental element of the coastal landscape of southern Santa Catarina state, Brazil. This original ecosystem is being destroyed and heavily modified by urbanization. This work identifies and discusses its basic characteristics and analyzes the factors favoring its preservation, with a view to both a sustainable future and potential income from ecotourism. Knowledge of the geologic evolution makes it possible to associate this transgressive Holocene dune formation with more dissipative beach conditions. Spatial differences in morphodynamics are related to local and regional contrasts in the sediment budget, which influence the gradients of wave attenuation on the inner shelf and consequently the level of coastal erosion. The link between relative sea-level changes and coastal eolian sedimentation can be used to integrate coastal eolian systems into the sequence stratigraphy model. The main accumulation phase of eolian sediments would occur during the final transgressive and highstand systems tracts. Considering the global character of Quaternary relative sea-level changes, the Laguna transgressive dune field should be correlated with similar eolian deposits developed along other parts of the Brazilian coast, consistent with the model of dune-field initiation during rising and highstand sea-level phases.
Abstract:
Design, testing and production of electronic boards for the study of the atmosphere in harsh environmental conditions.
Abstract:
Cloud services are becoming ever more important in everyone's life. Cloud storage? Web mail? We don't need to work in big IT companies to be surrounded by cloud services. Another thing that is growing in importance, or at least should be considered ever more important, is the concept of privacy. The more we rely on services we know close to nothing about, the more we should be worried about our privacy. In this work, I analyze a prototype software based on a peer-to-peer architecture for offering cloud services, to see whether it is possible to make it completely anonymous: not only will the users be anonymous, but the peers composing it will not know each other's real identity either. To make this possible, I make use of anonymizing networks like Tor. I start by studying the state of the art of cloud computing, looking at some real examples, and then analyze the architecture of the prototype, trying to expose the differences between its distributed nature and the somewhat centralized solutions offered by the famous vendors. After that, I go as deep as possible into the working principles of the anonymizing networks, because they are not something that can just be 'applied' mindlessly: some de-anonymizing techniques are very subtle, so things must be studied carefully. I then implement the required changes and test the new anonymized prototype to see how its performance differs from that of the standard one. The prototype is run on many machines, orchestrated by a tester script that automatically starts, stops and makes all the required API calls. As for where to find all these machines, I make use of Amazon EC2 cloud services and their on-demand instances.
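One concrete piece of such an anonymization is routing every peer's API call through Tor's local SOCKS proxy. A hypothetical helper (the default port and the use of the `requests` library are assumptions, not details from the prototype) could look like:

```python
def tor_proxies(socks_port=9050):
    """Proxy settings that route HTTP(S) traffic through a local Tor
    client listening on the default SOCKS port.  The socks5h scheme
    makes DNS resolution happen inside Tor as well, avoiding DNS
    leaks -- one of the subtle de-anonymization channels mentioned
    above."""
    url = f"socks5h://127.0.0.1:{socks_port}"
    return {"http": url, "https": url}
```

With the `requests` library installed with SOCKS support (`pip install requests[socks]`), a peer's API call would then be made as `requests.get(api_url, proxies=tor_proxies())`.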