12 results for optimisation algorithms

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

30.00%

Publisher:

Abstract:

Due to its practical importance and inherent complexity, the optimisation of distribution networks for supplying drinking water has been the subject of extensive study for the past 30 years. Such optimisation typically involves sizing the pipes of the water distribution network (WDN), optimising specific components of the network such as pumps and tanks, or analysing and improving the reliability of the WDN. In this thesis, the author analyses two different WDNs (the Anytown and Cabrera networks), formulating and solving a multi-objective optimisation problem (MOOP) for each. In both cases the two main objectives were the minimisation of the energy cost (€) or energy consumption (kWh), together with the total number of pump switches (TNps) during a day. For this purpose, GANetXL, a decision-support system generator for multi-objective optimisation developed by the Centre for Water Systems at the University of Exeter, was used. GANetXL calls the EPANET hydraulic solver every time a hydraulic analysis is required. The main algorithm used was NSGA-II, a second-generation algorithm for multi-objective optimisation, which produced the Pareto front of each configuration. The first experiment concerned the Anytown network, a large network whose pumping station contains four fixed-speed (FS) parallel pumps boosting the flow. The main intervention was to replace these pumps with variable-speed-driven pumps (VSDPs) by installing inverters able to vary their speed during the day; this achieved considerable energy and cost savings together with a reduction in the number of pump switches. The results are illustrated in detail in chapter 7, with comments, graphs and the different configurations. The second experiment concerned the Cabrera network, a smaller WDN with a single FS pump. The optimisation problem was the same, namely the simultaneous minimisation of energy consumption and of TNps, and the same optimisation tool (GANetXL) was used. The main aim was to carry out several different experiments over a wide variety of configurations, using different pumps (this time keeping the FS mode), different tank levels, different pipe diameters and different emitter coefficients. These configurations produced a large number of results, which are compared in chapter 8. In conclusion, the optimisation of WDNs is a very interesting field with a vast space of options: a large number of algorithms to choose from, different techniques and configurations, and different decision-support system generators. The researcher has to be ready to "roam" among these choices until a satisfactory result shows that a good optimisation point has been reached.
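
To illustrate the Pareto-based selection performed by NSGA-II, the following Python sketch extracts the non-dominated schedules from a random population under the two objectives above (energy and number of pump switches). The schedule encoding and the surrogate energy model are illustrative assumptions only; in the thesis each candidate is evaluated through the EPANET hydraulic solver and evolved by GANetXL.

    # Minimal sketch of the two-objective trade-off explored in the thesis
    # (energy vs. number of pump switches). The cost model below is a
    # placeholder: in the actual work each schedule is evaluated by calling
    # the EPANET hydraulic solver, which is not reproduced here.
    import random

    HOURS = 24

    def random_schedule():
        # hourly relative pump speed: 0.0 means the pump is off
        return [random.choice([0.0, 0.6, 0.8, 1.0]) for _ in range(HOURS)]

    def energy(schedule):
        # hypothetical surrogate: pump power grows roughly with speed cubed
        return sum(s ** 3 for s in schedule)

    def pump_switches(schedule):
        # count off->on and on->off transitions over the day (TNps)
        states = [s > 0 for s in schedule]
        return sum(1 for a, b in zip(states, states[1:]) if a != b)

    def dominates(p, q):
        # p dominates q if it is no worse in both objectives and better in one
        return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

    def pareto_front(points):
        return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

    population = [random_schedule() for _ in range(200)]
    objectives = [(energy(s), pump_switches(s)) for s in population]
    for e, n in sorted(set(pareto_front(objectives))):
        print(f"energy = {e:6.2f}  switches = {n}")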

Relevance:

20.00%

Publisher:

Abstract:

Network Theory is a prolific and lively field, especially where it meets Biology. New concepts from this theory find application in areas where extensive datasets are already available for analysis, without the need to invest money to collect them. The only tools necessary to carry out an analysis are easily accessible: a computing machine and a good algorithm. As these two tools progress, thanks to technological advancement and human effort, wider and wider datasets can be analysed. The aim of this work is twofold. Firstly, to provide an overview of one of these concepts, which originates at the meeting point between Network Theory and Statistical Mechanics: the entropy of a network ensemble. This quantity has been described from different angles in the literature, and our approach tries to be a synthesis of the different points of view. The second part of the work is devoted to presenting a parallel algorithm that can evaluate this quantity over an extensive dataset. Finally, the algorithm is also used to analyse high-throughput data coming from biology.
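
As a concrete example of the quantity discussed above, the sketch below evaluates one common formulation of the ensemble entropy, that of a canonical ensemble of simple graphs with independent link probabilities p_ij; the thesis compares several equivalent viewpoints, and its parallel implementation is not reproduced here.

    # Entropy of a canonical network ensemble with independent links:
    #   S = -sum_{i<j} [ p_ij ln p_ij + (1 - p_ij) ln (1 - p_ij) ]
    import numpy as np

    def ensemble_entropy(p):
        """Shannon entropy of an ensemble of simple undirected graphs
        defined by a symmetric matrix of link probabilities p[i, j]."""
        iu = np.triu_indices_from(p, k=1)          # each pair (i, j) counted once
        q = np.clip(p[iu], 1e-12, 1 - 1e-12)       # avoid log(0)
        return float(-np.sum(q * np.log(q) + (1 - q) * np.log(1 - q)))

    # toy example: a 100-node ensemble with random link probabilities
    rng = np.random.default_rng(0)
    p = rng.random((100, 100))
    p = (p + p.T) / 2                               # make the matrix symmetric
    print(ensemble_entropy(p))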

Relevance:

20.00%

Publisher:

Abstract:

Heuristic algorithms for solving the Travelling Deliveryman Problem.
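
The abstract only names the topic, so the following sketch shows one simple constructive heuristic for the Travelling Deliveryman Problem (also known as the minimum latency problem), whose objective is the sum of the arrival times at all customers; it is not necessarily one of the heuristics developed in the thesis.

    # Illustrative greedy constructive heuristic for the Travelling Deliveryman
    # Problem: minimise the sum of the arrival times at all customers.
    import math

    def nearest_neighbour_tdp(points, depot=0):
        unvisited = set(range(len(points))) - {depot}
        tour, current, elapsed, latency = [depot], depot, 0.0, 0.0
        while unvisited:
            # always move to the closest unvisited customer
            nxt = min(unvisited, key=lambda j: math.dist(points[current], points[j]))
            elapsed += math.dist(points[current], points[nxt])
            latency += elapsed                      # arrival time at this customer
            tour.append(nxt)
            unvisited.remove(nxt)
            current = nxt
        return tour, latency

    points = [(0, 0), (2, 1), (5, 0), (1, 4), (6, 3)]
    print(nearest_neighbour_tdp(points))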

Relevance:

20.00%

Publisher:

Abstract:

The full blood cell (FBC) count is the most common indicator of disease. At present hematology analyzers are used for blood cell characterization but, recently, there has been interest in techniques that take advantage of microscale devices and of intrinsic properties of cells for increased automation and decreased cost. Microfluidic technologies offer solutions for handling and processing small volumes of blood (2-50 µL taken by finger prick) for point-of-care (PoC) applications. Several PoC blood analyzers are in use and may have applications in telemedicine, outpatient monitoring and medical care in resource-limited settings. They have the advantage of being easy to move and much cheaper than traditional analyzers, which require bulky instruments and consume large amounts of reagents. The development of miniaturized point-of-care diagnostic tests may be enabled by chip-based technologies for cell separation and sorting. Many current diagnostic tests depend on fractionated blood components: plasma, red blood cells (RBCs), white blood cells (WBCs), and platelets. In particular, white blood cell differentiation and counting provide valuable information for diagnostic purposes. For example, a low number of WBCs, called leukopenia, may be an indicator of bone marrow deficiency or failure, collagen-vascular diseases, or diseases of the liver or spleen. Leukocytosis, a high number of WBCs, may be due to anemia, infectious diseases, leukemia or tissue damage. In the Hybrid Biodevices laboratory at the University of Southampton, a functioning micro impedance cytometer technology for WBC differentiation and counting was developed. It is capable of classifying cells and particles on the basis of their dielectric properties, in addition to their size, without the need for labeling, in a flow format similar to that of a traditional flow cytometer. It was demonstrated that the micro impedance cytometer system can detect and differentiate monocytes, neutrophils and lymphocytes, which are the three major human leukocyte populations. The simplicity and portability of the microfluidic impedance chip offer a range of potential applications in cell analysis, including point-of-care diagnostic systems. The microfluidic device has been integrated into a sample-preparation cartridge that semi-automatically performs erythrocyte lysis before leukocyte analysis. Generally erythrocytes are lysed manually according to a specific chemical lysis protocol, but this process has been automated in the cartridge. In this research work the chemical lysis protocol, defined in patent US 5155044 A, was optimized in order to improve the white blood cell differentiation and count performed by the integrated cartridge.
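
As a purely illustrative picture of how impedance events can be turned into leukocyte counts, the sketch below clusters synthetic events in a size-opacity plane (the low-frequency impedance magnitude tracks cell size, while the high-to-low-frequency ratio, the opacity, reflects dielectric properties). The data, feature values and clustering step are invented and do not represent the Southampton system's actual processing chain.

    # Hypothetical post-processing of impedance-cytometry events: each event
    # yields a low- and a high-frequency impedance magnitude; events are
    # clustered into three groups in the (size, opacity) plane.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    # synthetic magnitudes for three overlapping populations (arbitrary units)
    z_low = rng.normal(loc=[1.0, 1.6, 1.4], scale=0.05, size=(300, 3)).ravel()
    z_high = z_low * rng.normal(loc=[0.9, 0.7, 0.8], scale=0.03, size=(300, 3)).ravel()

    features = np.column_stack([z_low, z_high / z_low])   # (size, opacity)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
    for k in range(3):
        print(f"population {k}: {np.sum(labels == k)} events")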

Relevance:

20.00%

Publisher:

Abstract:

Magnetic Resonance Spectroscopy (MRS) is an advanced clinical and research application which provides a specific biochemical and metabolic characterization of tissues through the detection and quantification of key metabolites, for diagnosis and disease staging. The "Associazione Italiana di Fisica Medica (AIFM)" has promoted the activity of the "Interconfronto di spettroscopia in RM" working group. The purpose of the study is to compare and analyze results obtained by performing MRS on scanners from different manufacturers, in order to compile a robust protocol for spectroscopic examinations in clinical routine. This thesis takes part in this project using the GE Signa HDxt 1.5 T scanner at Pavilion no. 11 of the S.Orsola-Malpighi hospital in Bologna. The spectral analyses have been performed with the jMRUI package, which includes a wide range of preprocessing and quantification algorithms for signal analysis in the time domain. After quality assurance on the scanner with standard and innovative methods, spectra both with and without suppression of the water peak were acquired on the GE test phantom. The comparison of the ratios of the metabolite amplitudes over creatine computed by the workstation software, which works in the frequency domain, and by jMRUI shows good agreement, suggesting that quantification in the two domains may lead to consistent results. The characterization of an in-house phantom provided by the working group achieved its goal of assessing the solution content and the metabolite concentrations with good accuracy. The soundness of the experimental procedure and data analysis has been demonstrated by the correct estimation of the T2 of water, the observed biexponential relaxation curve of creatine and the correct TE value at which the modulation by J-coupling causes the lactate doublet to be inverted in the spectrum. The work of this thesis has demonstrated that it is possible to perform measurements and establish protocols for data analysis, based on the physical principles of NMR, which are able to provide robust values for the spectral parameters of clinical use.
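
One of the analysis steps mentioned above, the estimation of the T2 of water, can be sketched as a monoexponential fit of the signal amplitude against the echo time; the echo times and amplitudes below are synthetic, not the phantom data.

    # Minimal sketch: estimate T2 from amplitudes measured at several echo
    # times, assuming S(TE) = S0 * exp(-TE / T2).
    import numpy as np
    from scipy.optimize import curve_fit

    def mono_exp(te, s0, t2):
        return s0 * np.exp(-te / t2)

    te = np.array([30., 60., 100., 150., 200., 300.])        # echo times, ms
    true_s0, true_t2 = 1000.0, 180.0                          # synthetic truth
    noise = 1 + 0.01 * np.random.default_rng(0).standard_normal(te.size)
    signal = mono_exp(te, true_s0, true_t2) * noise

    (p_s0, p_t2), _ = curve_fit(mono_exp, te, signal, p0=(signal[0], 100.0))
    print(f"estimated T2 = {p_t2:.1f} ms")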

Relevance:

20.00%

Publisher:

Abstract:

Computing the weighted geometric mean of large sparse matrices is an operation that tends to become rapidly intractable as the size of the matrices involved grows. However, if we are not interested in computing the matrix function itself, but just in computing its product with a vector, the problem becomes simpler and there is a chance of solving it even when the matrix mean itself would be impossible to compute. Our interest is motivated by the fact that this calculation has practical applications, related to the preconditioning of some operators arising in the domain decomposition of elliptic problems. In this thesis, we explore how such a computation can be performed efficiently. First, we exploit the properties of the weighted geometric mean and find several equivalent ways to express it through real powers of a matrix. We then focus our attention on matrix powers and examine how well-known techniques can be adapted to the solution of the problem at hand. In particular, we consider two broad families of approaches for the computation of f(A) v, namely quadrature formulae and Krylov subspace methods, and generalize them to the pencil case f(A\B) v. Finally, we provide an extensive experimental evaluation of the proposed algorithms and also try to assess how convergence speed and execution time are influenced by some characteristics of the input matrices. Our results suggest that a few elements have some bearing on the performance and that, although there is no best choice in general, knowing the conditioning and the sparsity of the arguments beforehand can considerably help in choosing the best strategy to tackle the problem.
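
The following dense sketch makes the quantity concrete: it applies the weighted geometric mean of two symmetric positive definite matrices to a vector using explicit matrix powers, and checks it against the equivalent pencil form A (A\B)^t v. Forming the matrix functions explicitly is only a brute-force reference; it does not reproduce the quadrature or Krylov methods developed in the thesis for large sparse problems.

    # (A #_t B) v = A^(1/2) (A^(-1/2) B A^(-1/2))^t A^(1/2) v, here with t = 1/2.
    import numpy as np
    from scipy.linalg import fractional_matrix_power, inv, sqrtm

    def weighted_geometric_mean_apply(A, B, v, t=0.5):
        A_half = sqrtm(A)
        core = fractional_matrix_power(inv(A_half) @ B @ inv(A_half), t)
        return A_half @ (core @ (A_half @ v))

    rng = np.random.default_rng(0)
    M, N = rng.standard_normal((6, 6)), rng.standard_normal((6, 6))
    A = M @ M.T + 6 * np.eye(6)          # symmetric positive definite
    B = N @ N.T + 6 * np.eye(6)
    v = rng.standard_normal(6)

    y = weighted_geometric_mean_apply(A, B, v)
    y_alt = A @ fractional_matrix_power(inv(A) @ B, 0.5) @ v   # pencil form A (A\B)^t v
    print(np.linalg.norm(y - y_alt) / np.linalg.norm(y))       # relative gap, should be tiny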

Relevance:

20.00%

Publisher:

Abstract:

The problem of localizing a scatterer, which represents a tumor, in a homogeneous circular domain, which represents a breast, is addressed. A breast imaging method based on microwaves is considered. Microwave imaging involves several techniques for detecting, localizing and characterizing tumors in breast tissue, and all such methods involve an electromagnetic inverse scattering problem. For scatterer detection, an algorithm based on a linear procedure, inspired by the MUltiple SIgnal Classification (MUSIC) algorithm and the Time Reversal (TR) method, is implemented. The algorithm returns a reconstructed image of the investigation domain, called a pseudospectrum, in which the scatterer position is detected. A preliminary performance analysis of the algorithm against the working frequency is carried out: the resolution and the signal-to-noise ratio of the pseudospectra are improved if a multi-frequency approach is considered. The Geometrical Mean-MUSIC algorithm (GM-MUSIC) is proposed as the multi-frequency method. The performance of GM-MUSIC is tested in different realistic computer simulations. The analysis shows that the algorithm detects the scatterer as long as the electrical parameters of the breast are known. This is an evident limit since, in a real-life situation, the anatomy of the breast is unknown. An improvement to GM-MUSIC is therefore proposed: the Eye-GMMUSIC algorithm, which needs no a priori information on the electrical parameters of the breast. It is an optimization algorithm based on pattern search: it searches for the breast parameters that minimize the Signal-to-Clutter Mean Ratio (SCMR) in the signal. Finally, the GM-MUSIC and Eye-GMMUSIC algorithms are tested on a microwave breast cancer detection system consisting of a dipole antenna, a Vector Network Analyzer and a novel breast phantom built at the University of Bologna. The reconstruction of the experimental data confirms the ability of GM-MUSIC to localize a scatterer in a homogeneous medium.
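
The sketch below illustrates a single-frequency MUSIC pseudospectrum for localizing one point scatterer in a homogeneous 2D background, in the spirit of the method described above: the multistatic matrix is decomposed, the noise subspace is extracted, and the pseudospectrum peaks where the background Green's function vector is orthogonal to it. The antenna layout, the Born-approximation data model and the free-space Green's function are simplifying assumptions, not the GM-MUSIC setup used in the thesis.

    import numpy as np
    from scipy.special import hankel1

    k = 2 * np.pi / 0.03                       # wavenumber for a 3 cm wavelength
    angles = np.linspace(0, 2 * np.pi, 16, endpoint=False)
    antennas = 0.08 * np.column_stack([np.cos(angles), np.sin(angles)])   # ring, r = 8 cm

    def green(points, sources):
        """2-D free-space Green's function between each point and each source."""
        d = np.linalg.norm(points[:, None, :] - sources[None, :, :], axis=-1)
        return 1j / 4 * hankel1(0, k * d)

    scatterer = np.array([[0.02, -0.01]])
    g_s = green(antennas, scatterer)            # steering vector of the target
    K = g_s @ g_s.T                             # multistatic matrix, Born approximation

    U, s, _ = np.linalg.svd(K)
    noise = U[:, 1:]                            # noise subspace (signal rank is 1 here)

    # pseudospectrum 1 / ||P_noise g(r)||^2 evaluated on a grid
    xs = np.linspace(-0.05, 0.05, 101)
    X, Y = np.meshgrid(xs, xs)
    grid = np.column_stack([X.ravel(), Y.ravel()])
    G = green(antennas, grid)
    P = 1.0 / np.sum(np.abs(noise.conj().T @ G) ** 2, axis=0)
    print("estimated scatterer position:", grid[np.argmax(P)])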

Relevance:

20.00%

Publisher:

Abstract:

Prostate cancer (PCa) is the most common non-cutaneous malignant tumor in men and the second leading cause of cancer death in Western countries. The need for new non-invasive techniques for the early diagnosis of PCa has grown over the years. 1H-MRS (proton magnetic resonance spectroscopy) and 1H-MRSI (proton magnetic resonance spectroscopy imaging) are advanced magnetic resonance spectroscopy techniques that make it possible to detect metabolites such as citrate, choline, creatine and, in some cases, polyamines in one or more voxels of the prostatic tissue. The abundance or absence of one of these metabolites makes it possible to discriminate healthy from pathological tissue. MR spectroscopy techniques are currently used in clinical practice for the brain and liver, with dedicated software for spectral analysis. Quantifying metabolites in the prostate, however, can be difficult because of the low signal-to-noise ratio (SNR) of the spectra and the strong J-coupling of citrate. The main aim of this work is to propose a prototype software tool for the automatic quantification of citrate, choline and creatine in the prostate. The development of the program and of its algorithms was carried out at IRST (Istituto Romagnolo per lo Studio e la cura dei Tumori) with the help of the medical physics unit. The core of the program is an iterative spectral-fitting algorithm that uses MRS simulations developed with the GAMMA C++ library package. The accuracy of the quantification was tested on phantoms built in the institute's laboratories. All spectroscopic measurements were performed with the new Philips Ingenia 3T scanner, one of the most advanced magnetic resonance machines for clinical applications. Finally, after the in vitro tests on the phantoms, prostate spectra of some healthy volunteers were acquired, to test whether the program was able to work under low-SNR conditions.
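
As a hypothetical sketch of the core fitting idea, the code below models a spectrum as a non-negative linear combination of basis signals for citrate, choline and creatine and estimates the amplitudes by least squares; the Lorentzian single-peak basis functions stand in for the GAMMA simulations used by the actual program, and citrate in particular is really a strongly coupled multiplet.

    import numpy as np
    from scipy.optimize import nnls

    ppm = np.linspace(0.5, 4.5, 2048)

    def lorentzian(center, width=0.03):
        return width ** 2 / ((ppm - center) ** 2 + width ** 2)

    # crude single-peak stand-ins at typical chemical shifts (ppm)
    basis = np.column_stack([
        lorentzian(2.6),     # citrate
        lorentzian(3.2),     # choline
        lorentzian(3.0),     # creatine
    ])

    rng = np.random.default_rng(0)
    true_amps = np.array([4.0, 1.5, 1.0])
    spectrum = basis @ true_amps + 0.05 * rng.standard_normal(ppm.size)

    amps, _ = nnls(basis, spectrum)            # non-negative least-squares fit
    print(dict(zip(["citrate", "choline", "creatine"], amps.round(2))))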

Relevance:

20.00%

Publisher:

Abstract:

Polar codes are the first class of error-correcting codes proven to achieve capacity on every symmetric, discrete, memoryless channel, thanks to a recently introduced method called "channel polarization". This thesis describes the main encoding and decoding algorithms in detail. In particular, the performance of the simulators developed for the Successive Cancellation decoder and for the Successive Cancellation List decoder is compared with the results reported in the literature. In order to improve the minimum distance, and consequently the performance, a concatenated scheme is used, with the polar code as the inner code and a CRC as the outer code. A new technique is also proposed to analyse channel polarization for transmission over the AWGN channel, which is the most appropriate statistical model for satellite communications and deep-space applications. In addition, the importance of an accurate approximation of the polarization functions is investigated.
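
As a small illustration of the encoding side, the sketch below applies the polar transform x = u F^{⊗n} over GF(2) with the standard butterfly recursion (the bit-reversal permutation is omitted); the frozen-bit positions are chosen for illustration only, whereas in a real design they come from the channel-polarization reliabilities.

    def polar_encode(u):
        """Apply the Arikan transform to a list of 0/1 bits (length a power of 2)."""
        x, n, step = list(u), len(u), 1
        while step < n:
            for i in range(0, n, 2 * step):
                for j in range(i, i + step):
                    x[j] ^= x[j + step]          # butterfly: (a XOR b, b)
            step *= 2
        return x

    N = 8
    frozen = {0, 1, 2, 4}                        # illustrative frozen set, rate 1/2
    info_bits = [1, 0, 1, 1]

    u = [0] * N                                  # frozen positions carry zeros
    for pos, bit in zip(sorted(set(range(N)) - frozen), info_bits):
        u[pos] = bit                             # information bits on unfrozen positions
    print("codeword:", polar_encode(u))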

Relevance:

20.00%

Publisher:

Abstract:

In recent years radar sensor networks for localization and tracking in indoor environments have generated more and more interest, especially for anti-intrusion security systems. These networks often use Ultra Wide Band (UWB) technology, which consists in sending very short (a few nanoseconds) impulse signals. This approach guarantees high resolution and accuracy as well as other advantages such as low price, low power consumption and robustness to narrow-band interference (jamming). In this thesis the overall data processing (carried out in the MATLAB environment) is discussed, starting from the experimental measurements of the sensor devices and ending with the 2D visualization of target movements over time, focusing mainly on the detection and localization algorithms. Moreover, two different scenarios and both single- and multiple-target tracking are analyzed.
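
The localization step can be illustrated by a basic range-based multilateration: given the sensor positions and the target ranges extracted from the UWB round-trip delays, the 2D position is found by nonlinear least squares. The geometry and noise level below are invented, and the sketch is in Python although the thesis processing chain is implemented in MATLAB.

    import numpy as np
    from scipy.optimize import least_squares

    sensors = np.array([[0.0, 0.0], [6.0, 0.0], [6.0, 4.0], [0.0, 4.0]])   # metres
    target = np.array([2.5, 1.8])

    rng = np.random.default_rng(0)
    # ranges estimated from the round-trip delays, with a little noise
    ranges = np.linalg.norm(sensors - target, axis=1) + 0.02 * rng.standard_normal(4)

    def residuals(p):
        # difference between predicted and measured ranges for a candidate position p
        return np.linalg.norm(sensors - p, axis=1) - ranges

    estimate = least_squares(residuals, x0=np.array([3.0, 2.0])).x
    print("estimated target position:", estimate)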

Relevance:

20.00%

Publisher:

Abstract:

Streaming is a technique for delivering multimedia content over the global network, used for example by services such as YouTube and Netflix; after a short wait, during which a safety buffer is filled, the user can enjoy the requested content. Cisco and Sandvine, which regularly publish reports on the state of the Internet, state that video streaming has, and will increasingly have, a large impact on the global network. Good design of streaming applications therefore plays an important role, both for user satisfaction and for the stability of the infrastructure. HTTP Adaptive Streaming denotes a family of implementations aimed at offering the best possible video quality (in terms of bit rate) as a function of the quality of the end user's Internet connection: the media player can change the bit rate at any moment, choosing it from a predefined set and adapting to the network conditions. To obtain information about the state of the connection, two families of methods are possible: measuring the download speed of previous transfers (rate-based approach) or, as recently proposed by Netflix, using the buffer occupancy as the main signal (buffer-based approach). In this work we analyse adaptation algorithms from both families, with the aim of comparing them on metrics concerning user satisfaction, network utilization and competition at a bottleneck. The results of our tests do not identify a clear winner: they acknowledge the merit of the new proposal, but also show that buffer-based algorithms do not always manage to share network resources fairly.
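
A minimal sketch of the buffer-based idea follows: the next bit rate is chosen from the available ladder as a function of the buffer occupancy alone, rising between a low "reservoir" and an upper "cushion" threshold. The thresholds and the bit-rate ladder are illustrative and do not correspond to any deployed player.

    LADDER_KBPS = [350, 600, 1000, 2000, 3000, 5000]   # available encodings
    RESERVOIR, CUSHION = 10.0, 50.0                     # seconds of buffered video

    def buffer_based_rate(buffer_s):
        """Map the buffer occupancy (seconds) to one of the available bit rates."""
        if buffer_s <= RESERVOIR:
            return LADDER_KBPS[0]                       # protect against rebuffering
        if buffer_s >= CUSHION:
            return LADDER_KBPS[-1]                      # buffer is comfortably full
        frac = (buffer_s - RESERVOIR) / (CUSHION - RESERVOIR)
        return LADDER_KBPS[int(frac * (len(LADDER_KBPS) - 1))]

    for b in (5, 15, 30, 45, 60):
        print(f"buffer = {b:2d} s  ->  {buffer_based_rate(b)} kbps")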