9 results for verification algorithm

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance: 40.00%

Abstract:

With the increase in load demand across various sectors, protection and safety of the electric grid and distribution network are key factors to be taken into consideration. A Phasor Measurement Unit (PMU) is an intelligent electronic device that collects data in the form of real-time synchrophasors with a precise time tag obtained via GPS (Global Positioning System) and transfers the data to the grid control centre for monitoring and assessment. The measurements made by a PMU must be very precise in order to protect relays and measuring equipment, in accordance with IEEE 60255-118-1 (2018). Since a PMU is a very expensive device on which to research and develop new functionalities, an alternative is needed; many open-source virtual libraries are therefore available that replicate the exact function of a PMU in a software environment, allowing research on multiple objectives to continue while producing very small errors when verified. In this thesis I carried out performance and compliance verification of a virtual PMU developed in MATLAB using the I-DFT (Interpolated Discrete Fourier Transform) C-class algorithm. A test environment was developed in MATLAB, and the virtually developed PMU was tested in both steady state and dynamic state to verify compliance with the latest standard (IEEE 60255-118-1).
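As a rough illustration of the two ingredients such a verification rests on, the sketch below pairs a three-point interpolated-DFT phasor estimate with the Total Vector Error (TVE), the steady-state compliance metric of IEEE 60255-118-1; the sampling rate, window length and test-signal parameters are illustrative assumptions, not the thesis settings.

```python
# Minimal sketch (not the thesis MATLAB code): interpolated-DFT phasor
# estimation plus the TVE compliance metric of IEEE 60255-118-1.
import numpy as np

def ipdft_phasor(samples, fs):
    """Amplitude, frequency and phase of the dominant tone (Hann window)."""
    n = len(samples)
    win = np.hanning(n)
    spec = np.fft.rfft(samples * win)
    k = int(np.argmax(np.abs(spec[1:]))) + 1       # dominant bin
    a, b, c = np.abs(spec[k - 1]), np.abs(spec[k]), np.abs(spec[k + 1])
    delta = 0.5 * (a - c) / (a - 2 * b + c)        # fractional-bin offset
    freq = (k + delta) * fs / n
    amp = 2 * b / win.sum()                        # exact-bin amplitude
    phase = float(np.angle(spec[k]))               # exact-bin phase; off-bin
    return amp, freq, phase                        # tones need a correction

def tve(est, ref):
    """Total Vector Error (%) between estimated and reference phasors."""
    return 100.0 * abs(est - ref) / abs(ref)

fs, f0, A, phi = 10_000.0, 50.0, 1.0, 0.3          # assumed steady-state test
t = np.arange(int(0.2 * fs)) / fs                  # 0.2 s observation window
x = A * np.cos(2 * np.pi * f0 * t + phi)
amp, freq, phase = ipdft_phasor(x, fs)
est = amp / np.sqrt(2) * np.exp(1j * phase)        # estimated phasor (RMS)
ref = A / np.sqrt(2) * np.exp(1j * phi)            # reference phasor
print(freq, tve(est, ref))                         # steady state needs <1% TVE
```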

Relevance: 20.00%

Abstract:

This thesis analyses the time-dependent behaviour observed in a four-point bending test on a precast beam with a cast-in-place topping slab. The results of the load and failure tests on the beam are reported. The tests were carried out at the Structural Testing Laboratory (DISTART) of the Università di Bologna. The beam is composed of two types of concrete, cast at different times and with different mechanical properties. The aim of the experimental campaign is to study the effect of the construction phases on the evolution of stresses and strains over time and on the ultimate strength of the beam. The experimental behaviour was compared with numerical results obtained with a fibre model capable of also capturing creep-related phenomena.
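As a minimal sketch of the fibre-model idea mentioned above (plane sections, one constitutive law per fibre, two concretes cast at different times), the snippet below integrates fibre stresses into section forces; the geometry, material laws and the creep-free constitutive model are assumptions for illustration only.

```python
# Illustrative fibre-section sketch (the thesis model additionally includes
# creep through a time-dependent constitutive law). All numbers are assumed.
import numpy as np

def section_forces(eps0, kappa, fibers):
    """Axial force N and moment M from a plane-sections strain field.
    fibers: list of (y, area, stress_fn), y measured from the centroid."""
    N = M = 0.0
    for y, area, stress_fn in fibers:
        eps = eps0 + kappa * y          # plane sections remain plane
        sigma = stress_fn(eps)
        N += sigma * area
        M += sigma * area * y
    return N, M

def concrete(E, fc):
    # crude elastic law capped at crushing (compression negative)
    return lambda eps: max(E * eps, -fc)

precast = concrete(E=35e9, fc=50e6)     # assumed precast concrete
topping = concrete(E=30e9, fc=30e6)     # assumed cast-in-place topping

# assumed 300x600 mm rectangular section, top 150 mm is the topping slab
ys = np.linspace(-0.3, 0.3, 60)         # 60 fibre centroids over the depth
fibers = [(y, 0.30 * 0.01, topping if y > 0.15 else precast) for y in ys]

print(section_forces(eps0=0.0, kappa=-2e-3, fibers=fibers))
```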

Relevance: 20.00%

Abstract:

In the past, a change in temperature of 5°C most often occurred over intervals of thousands of years. According to IPCC estimates, an increase in average temperatures in Europe of between 1.8 and 4.0°C (best estimates) is expected in the 21st century, caused by emissions of carbon dioxide and other greenhouse gases from human activities. Beyond its environmental and economic effects, global warming will also affect road safety. Several studies have already shown how increasing temperatures can worsen some types of road surface damage, especially rutting: a permanent deformation of the road structure consisting of a longitudinal depression in the wheelpath, mostly due to the rheological behaviour of bitumen. This deformation develops during the hot season because of the heat stored in the asphalt layers; indeed, the road surface temperature can be up to 24°C higher than the air temperature. In this thesis, the Wheeltrack test was used to study the behaviour of several asphalt concrete mixtures subjected to fatigue testing at different temperatures. The objectives of this study are: to determine the strain variation of different bituminous mixtures subjected to fatigue testing under different temperature conditions, and to investigate the effect of aggregate, bitumen and mixture characteristics on rutting. The samples were made in the laboratory, mostly using ready-prepared mixtures, the others by preparing the asphalt concrete from the grading curve and bitumen content. The same procedure was followed for each specimen: preparation, compaction with the roller compactor, and cooling and heating before the test. The tests were carried out at 40, 50 and 60°C in order to follow the evolution of deformation with temperature, except for some mixtures tested only at 50°C. In elaborating the results, the testing parameters, component properties and mixture characteristics were all considered. Among the testing parameters, temperature was varied for each sample, and the mixtures responded to this variation with different behaviours (linear, logarithmic or exponential) not directly correlated with the asphalt characteristics; the other parameters, such as load, passage frequency and test conditions, were kept constant. According to the results obtained, the main contributions to deformation are the type of binder used (mixtures with modified bitumen responded better than the same mixtures containing traditional bitumen), the porosity (which negatively affects the behaviour of the samples) and, other things being equal, the homogeneity. The granulometric composition did not seem to influence the results. Overall, at working temperature the bitumen composition proved decisively more important than the other characteristics of the mixture; this dominance tends to disappear with heating, in favour of an increased dependence of rutting resistance on the granulometric composition of the sample. In particular, what matters is not so much the mechanical characteristics of the binder as its chemical properties conferred by polymer modification. To confirm some of the results, the maximum bulk density and the air-void content were determined. The tests were conducted in the laboratories of the Civil Engineering Department at NTNU in Trondheim, according to European standards.
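The three growth shapes mentioned above (linear, logarithmic, exponential) can be compared by least-squares fitting of rut depth against load cycles, as in the hedged sketch below; the data points are invented placeholders, not thesis measurements.

```python
# Hedged sketch: fitting the three deformation-growth shapes to Wheeltrack
# rut-depth data and comparing residuals. Placeholder data, not thesis data.
import numpy as np
from scipy.optimize import curve_fit

cycles = np.array([100, 500, 1000, 5000, 10000, 20000], dtype=float)
rut_mm = np.array([0.8, 1.6, 2.0, 3.1, 3.6, 4.2])      # invented rut depths

models = {
    "linear":      lambda n, a, b: a + b * n,
    "logarithmic": lambda n, a, b: a + b * np.log(n),
    "exponential": lambda n, a, b: a * np.exp(b * n),
}

for name, f in models.items():
    params, _ = curve_fit(f, cycles, rut_mm, p0=(1.0, 1e-4), maxfev=10_000)
    resid = rut_mm - f(cycles, *params)
    print(f"{name:12s} params={params}  SSE={np.sum(resid**2):.3f}")
```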

Relevance: 20.00%

Abstract:

Complex network analysis is a very popular topic in computer science. Unfortunately these networks, extracted from different contexts, are usually very large, and their analysis may be complicated: computing metrics on these structures can be very expensive. Among all such metrics, we analyse the extraction of subnetworks called communities: groups of nodes that probably play the same role within the whole structure. Community extraction is an interesting operation in many different fields (biology, economics, ...). In this work we present a parallel community detection algorithm that can operate on networks with huge numbers of nodes and edges. After an introduction to graph theory and high-performance computing, we explain our design strategies and our implementation. We then show a performance evaluation carried out on a distributed-memory architecture, namely the IBM BlueGene/Q supercomputer "Fermi" at the CINECA supercomputing centre, Italy, and comment on our results.
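The abstract does not name the detection algorithm, so the sketch below illustrates the community-detection concept with simple serial label propagation (each node repeatedly adopts the most frequent label among its neighbours); it is a stand-in for the idea, not the thesis's parallel implementation.

```python
# Illustrative (serial) label-propagation community detection; the thesis
# algorithm is parallel and is not reproduced here.
import random
from collections import Counter

def label_propagation(adj, max_iters=100, seed=0):
    """adj: dict node -> list of neighbours. Returns dict node -> label."""
    rng = random.Random(seed)
    labels = {v: v for v in adj}            # start with one label per node
    nodes = list(adj)
    for _ in range(max_iters):
        rng.shuffle(nodes)
        changed = False
        for v in nodes:
            if not adj[v]:
                continue
            counts = Counter(labels[u] for u in adj[v])
            best = max(counts.values())
            choice = rng.choice([l for l, c in counts.items() if c == best])
            if choice != labels[v]:
                labels[v] = choice
                changed = True
        if not changed:                      # labels stabilised
            break
    return labels

# two triangles joined by one edge -> two communities expected
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
       3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
print(label_propagation(adj))
```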

Relevance: 20.00%

Abstract:

The aim of my thesis is to parallelize the Weighted Histogram Analysis Method (WHAM), a popular algorithm used to calculate the free energy of a molecular system in Molecular Dynamics simulations. WHAM works in post-processing, in cooperation with another algorithm called Umbrella Sampling. Umbrella Sampling adds a bias to the potential energy of the system in order to force it to sample a specific region of configurational space. N independent simulations are performed in order to sample the whole region of interest; subsequently, the WHAM algorithm is used to estimate the original system energy starting from the N atomic trajectories. The parallelization of WHAM has been performed with CUDA, a language for programming the GPUs of NVIDIA graphics cards, which have a parallel architecture. The parallel implementation can appreciably speed up WHAM execution compared with previous serial CPU implementations; the WHAM CPU code, however, shows timing criticalities at very large numbers of iterations. The algorithm has been written in C++ and executed on UNIX systems equipped with NVIDIA graphics cards. The results were satisfying, with a performance increase when the model was executed on graphics cards of higher compute capability. Nonetheless, the GPUs used to test the algorithm are quite old and not designed for scientific computing; a further performance increase would likely be obtained if the algorithm were executed on GPU clusters of high computational efficiency. The thesis is organized as follows: first, the mathematical formulation of Umbrella Sampling and the WHAM algorithm is described, with their applications to the study of ion channels and to molecular docking (Chapter 1); then the CUDA architectures used to implement the model are presented (Chapter 2); finally, the results obtained on model systems are presented (Chapter 3).
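For reference, WHAM solves two self-consistent equations, P(x) = Σ_i h_i(x) / Σ_j N_j exp(β(f_j − U_j(x))) and exp(−β f_i) = Σ_x P(x) exp(−β U_i(x)). The serial NumPy sketch below iterates them on a toy one-dimensional system; the grid, harmonic biases and fake histograms are assumptions, and the thesis implementation is CUDA/C++, not Python.

```python
# Hedged serial sketch of the WHAM self-consistent iteration (not the
# thesis CUDA code). Toy 1-D setup with assumed harmonic umbrella biases.
import numpy as np

def wham(hist, n_samples, bias, beta, iters=5000, tol=1e-10):
    """hist: (N, M) counts per window/bin; bias: (N, M) values of U_i(x_m);
    n_samples: (N,) samples per window. Returns P: (M,), f: (N,)."""
    f = np.zeros(len(n_samples))
    for _ in range(iters):
        # P(x) = sum_i h_i(x) / sum_j N_j exp(beta*(f_j - U_j(x)))
        denom = np.sum(n_samples[:, None] *
                       np.exp(beta * (f[:, None] - bias)), axis=0)
        P = hist.sum(axis=0) / denom
        # exp(-beta*f_i) = sum_x P(x) exp(-beta*U_i(x))
        f_new = -np.log(np.sum(P[None, :] * np.exp(-beta * bias),
                               axis=1)) / beta
        f_new -= f_new[0]                 # fix the arbitrary offset
        if np.max(np.abs(f_new - f)) < tol:
            break
        f = f_new
    return P / P.sum(), f

# toy setup: 4 harmonic windows over a 1-D reaction coordinate (assumed)
x = np.linspace(0.0, 1.0, 50)
centers = np.array([0.2, 0.4, 0.6, 0.8])
bias = 0.5 * 100.0 * (x[None, :] - centers[:, None]) ** 2   # k = 100
hist = np.exp(-2.0 * bias) + 1.0          # fake, strictly positive counts
P, f = wham(hist, n_samples=hist.sum(axis=1), bias=bias, beta=2.0)
print(f)                                   # per-window free energies
```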

Relevance: 20.00%

Abstract:

Due to its practical importance and inherent complexity, the optimisation of distribution networks for supplying drinking water has been the subject of extensive study for the past 30 years. The optimisation concerns sizing the pipes in the water distribution network (WDN) and/or optimising specific parts of the network, such as pumps and tanks, or analysing and optimising the reliability of a WDN. In this thesis, the author analysed two different WDNs (the Anytown and Cabrera city networks), solving a multi-objective optimisation problem (MOOP). The two main objectives in both cases were the minimisation of energy cost (€) or energy consumption (kWh), along with the total number of pump switches (TNps) during a day. For this purpose, a decision-support system generator for multi-objective optimisation was used: GANetXL, developed by the Centre for Water Systems at the University of Exeter. GANetXL works by calling the EPANET hydraulic solver each time a hydraulic analysis is required. The main algorithm used was NSGA-II, a second-generation multi-objective optimisation algorithm, which provided the Pareto front of each configuration. The first experiment concerned the Anytown network: a large network with a pumping station of four fixed-speed parallel pumps driving the water. The main intervention was to replace these pumps with variable-speed-driven pumps (VSDPs), installing inverters capable of varying their speed during the day. Great energy and cost savings were thus achieved, along with a minimisation of the number of pump switches. The results of this research are thoroughly illustrated in Chapter 7, with comments and a variety of graphs for the different configurations. The second experiment concerned the Cabrera network, a smaller WDN with a single fixed-speed pump. The optimisation problem was the same: the minimisation of energy consumption and, in parallel, the minimisation of TNps, using the same optimisation tool (GANetXL). The main scope was to carry out several experiments over a wide variety of configurations, using different pumps (this time keeping the fixed-speed mode), different tank levels, different pipe diameters and different emitter coefficients. All these different modes produced a large number of results, which are compared in Chapter 8. In conclusion, the optimisation of WDNs is a very interesting field with a vast space of options: a large number of algorithms to choose from, different techniques and configurations, and different decision-support system generators. The researcher has to be ready to "roam" among these choices until a satisfactory result shows that a good optimisation point has been reached.
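The two objectives described above are easy to state concretely; the hedged sketch below evaluates daily energy cost and the total number of pump switches (TNps) for a 24-hour on/off schedule, with invented tariffs, pump power and schedule standing in for quantities that in the thesis come from EPANET runs driven by GANetXL.

```python
# Hedged sketch of the two objective functions (energy cost, TNps) for a
# 24-hour pump schedule. All numbers are invented placeholders.
from typing import Sequence

def energy_cost(status: Sequence[int], power_kw: float,
                tariff: Sequence[float]) -> float:
    """Cost (EUR) of running a pump of power_kw during the hours where
    status[h] == 1, with an hourly tariff in EUR/kWh."""
    return sum(power_kw * tariff[h] for h, on in enumerate(status) if on)

def pump_switches(status: Sequence[int]) -> int:
    """Number of off->on transitions over the day (switch count)."""
    return sum(1 for h in range(1, len(status))
               if status[h] == 1 and status[h - 1] == 0)

schedule = [0]*6 + [1]*4 + [0]*3 + [1]*6 + [0]*5   # assumed 24 h schedule
tariff = [0.08]*7 + [0.20]*13 + [0.08]*4           # assumed EUR/kWh tariff
print(energy_cost(schedule, power_kw=75.0, tariff=tariff),
      pump_switches(schedule))
```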

Relevance: 20.00%

Abstract:

The aim of this thesis work is the characterisation of an optical sensor for reading haematocrit and the development of the device's calibration algorithm. In other words, using data obtained from a suitably planned calibration session, the developed algorithm returns the interpolation curve of the data that characterises the transducer. The main steps of the thesis work are summarised as follows. 1) Planning of the calibration session needed for data collection, and consequent construction of a black-box model. Output: reading from the optical sensor (expressed in mV). Input: haematocrit value expressed in percentage points (this quantity represents the true blood-volume fraction and was obtained with a blood-centrifugation device). 2) Development of the algorithm. The algorithm, developed and used offline, returns the regression curve of the data. Macroscopically, the code can be divided into two main parts: 1. acquisition of the data coming from the sensor and of the operating state of the biphasic pump; 2. normalisation of the acquired data with respect to the sensor's reference value, and implementation of the regression algorithm. The data-normalisation step is a fundamental statistical tool for comparing otherwise non-uniform quantities. Existing studies also demonstrate a morphological change of the red blood cell in response to mechanical stress. A further aspect addressed in this work concerns the blood-flow velocity imposed by the pump and how this quantity can influence the haematocrit reading.
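A minimal sketch of the offline calibration pipeline described above (normalisation against a sensor reference value, then regression from readings to haematocrit) could look as follows; the sample points, reference value and cubic model are assumptions for illustration.

```python
# Hedged sketch of the offline calibration: normalise raw readings (mV)
# against a reference value, then fit a regression curve to haematocrit (%).
import numpy as np

mv_raw = np.array([820.0, 760.0, 705.0, 655.0, 610.0])   # sensor output, mV
hct = np.array([20.0, 27.0, 34.0, 41.0, 48.0])           # centrifuge truth, %

mv_ref = 1000.0                  # assumed sensor reference value
mv_norm = mv_raw / mv_ref        # normalisation step from the outline above

coeffs = np.polyfit(mv_norm, hct, deg=3)   # regression (calibration) curve
calibrate = np.poly1d(coeffs)

print(calibrate(0.68))           # haematocrit estimate for a 680 mV reading
```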

Relevance: 20.00%

Abstract:

The present thesis work was performed in the frame of the ESEO (European Student Earth Orbiter) project. The activities described in this document were carried out in the Microsatellites and Space Microsystems Lab, led by Professor Paolo Tortora, and at the ALMASpace company facilities. The thesis deals with ESEO structural analysis, at system and unit level, and verification: after determining the design limit loads to be applied to the spacecraft as an envelope of the load profiles of different launchers, a finite-element structural analysis was performed on the model of the satellite in order to ensure its capability to withstand the loads encountered during launch; all the analyses were performed according to ESA standards, using the software MSC NASTRAN SIMXPERT. Amplification factors were derived and used to determine the loads to be considered at unit level. In particular, structural analyses were carried out on the GPS unit, the payload developed for ESEO by students of the University of Bologna, and the results were used in the preparation of the GPS payload design definition file. As for the verification phase, a study on the panels and inserts to be used in the spacecraft was performed: different designs were created, exploiting methods to optimise weight and mechanical behaviour. The configurations were analysed and the results compared in order to select the final design.
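Two of the steps described above, enveloping launcher load profiles into design limit loads and scaling them by amplification factors to obtain unit-level loads, reduce to simple array operations; the sketch below uses invented placeholder numbers, not ESEO data.

```python
# Hedged illustration of load enveloping and unit-level load derivation.
# All accelerations and factors are invented placeholders, not ESEO data.
import numpy as np

# quasi-static accelerations (g) per candidate launcher: [axial, lateral]
launchers = {
    "launcher_A": np.array([7.5, 2.0]),
    "launcher_B": np.array([6.0, 3.0]),
    "launcher_C": np.array([8.1, 1.5]),
}

# design limit loads = worst case over all launchers, per direction
envelope = np.max(np.vstack(list(launchers.values())), axis=0)

# unit-level loads = system envelope scaled by derived amplification factors
amplification = np.array([1.4, 1.6])       # assumed, e.g. from FEM response
gps_unit_loads = envelope * amplification

print(envelope, gps_unit_loads)
```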

Relevance: 20.00%

Abstract:

This thesis work was carried out at the Medical Physics service of the Policlinico Sant'Orsola-Malpighi in Bologna. The study focused on the comparison between standard reconstruction techniques (Filtered Back Projection, FBP) and iterative ones in Computed Tomography. The work was divided into two parts. In the first, the quality of the images acquired with a multislice CT scanner (iCT 128, Philips system) was analysed using both the FBP algorithm and the iterative one (in our case iDose4). To assess image quality, the following parameters were analysed: the Noise Power Spectrum (NPS), the Modulation Transfer Function (MTF) and the contrast-to-noise ratio (CNR). The first two quantities were studied by performing measurements on a phantom supplied by the manufacturer, which simulated the body and head regions with two cylinders of 32 and 20 cm respectively. The measurements confirm the noise reduction, although to a different extent for the different convolution filters used. The MTF study, however, revealed that using standard or iterative techniques does not change the spatial resolution: the curves obtained are perfectly identical (apart from the intrinsic differences in the convolution filters), contrary to what the manufacturer states. For the CNR analysis two phantoms were used: the first, the Catphan 600, is the phantom used to characterise CT systems; the second, the Cirs 061, contains inserts that simulate lesions with densities typical of the abdominal region. The study showed that, for both phantoms, the contrast-to-noise ratio increases when the iterative reconstruction technique is used. The second part of the thesis work was an assessment of the dose reduction, considering several protocols used in clinical practice: a large number of examinations was analysed, and the mean CTDI and DLP values were calculated on a sample of examinations reconstructed with FBP and with iDose4. The results show that the values obtained with the iterative algorithm are below the national diagnostic reference levels (DRLs) and below those obtained without iterative systems.
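As an illustration of the CNR measurement used in the comparison, the sketch below computes CNR = |mean(ROI) − mean(background)| / std(background) on a synthetic slice; the noise level, lesion contrast and ROI positions are arbitrary assumptions.

```python
# Hedged sketch of the CNR metric used to compare FBP and iterative
# reconstructions. The synthetic image stands in for a phantom slice.
import numpy as np

def cnr(image, roi, bg):
    """CNR = |mean(ROI) - mean(background)| / std(background).
    roi, bg: (row_slice, col_slice) index pairs."""
    signal = image[roi].mean()
    bg_mean, bg_std = image[bg].mean(), image[bg].std()
    return abs(signal - bg_mean) / bg_std

rng = np.random.default_rng(0)
slice_hu = rng.normal(40.0, 12.0, (256, 256))   # noisy "abdomen" background
slice_hu[100:120, 100:120] += 25.0              # low-contrast "lesion" insert

lesion = (slice(100, 120), slice(100, 120))
background = (slice(180, 220), slice(180, 220))
print(cnr(slice_hu, lesion, background))
```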