5 results for Radiality constraints in distribution systems
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
Due to its practical importance and inherent complexity, the optimisation of distribution networks for supplying drinking water has been the subject of extensive study for the past 30 years. Such optimisation concerns sizing the pipes of the water distribution network (WDN), optimising specific components of the network such as pumps and tanks, or analysing and improving the reliability of a WDN. In this thesis, the author has analysed two different WDNs (the Anytown and Cabrera city networks), solving a multi-objective optimisation problem (MOOP) in each case. The two main objectives in both cases were the minimisation of energy cost (€) or energy consumption (kWh), together with the total number of pump switches (TNps) during a day. For this purpose, a decision support system generator for multi-objective optimisation, GANetXL, developed by the Centre for Water Systems at the University of Exeter, was used. GANetXL calls the EPANET hydraulic solver each time a hydraulic analysis is required. The main algorithm used was NSGA-II, a second-generation multi-objective optimisation algorithm, which provided the Pareto fronts for each configuration. The first experiment concerned the Anytown network, a large network whose pumping station consists of four fixed-speed parallel pumps. The main intervention was to replace these pumps with variable speed driven pumps (VSDPs) by installing inverters capable of varying their speed during the day. This achieved substantial energy and cost savings, together with a reduction in the number of pump switches. The results are thoroughly illustrated in Chapter 7, with comments and a variety of graphs and configurations. The second experiment concerned the Cabrera network, a smaller WDN with a single fixed-speed (FS) pump. The optimisation problem was the same: minimisation of energy consumption together with minimisation of TNps, again using GANetXL. The main scope was to carry out several different experiments over a wide variety of configurations, using a different pump (this time keeping the FS mode), different tank levels, different pipe diameters and different emitter coefficients. These configurations produced a large number of results, which are compared in Chapter 8. In conclusion, the optimisation of WDNs is a very interesting field with a vast space of options: a large number of algorithms to choose from, different techniques and configurations, and different decision support system generators. The researcher has to be ready to “roam” among these choices until a satisfactory result indicates that a good optimisation point has been reached.
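As a minimal illustration of the two objectives named in this abstract, the Python sketch below evaluates one candidate 24-hour pump schedule for its daily energy cost and its total number of pump switches (TNps). The schedule, rated power, tariff and the affinity-law power model are invented for illustration only; the thesis obtains the actual hydraulics from EPANET through GANetXL.

# Illustrative evaluation of one candidate 24-hour pump schedule against the two objectives.
# All figures (rated power, tariff, schedule) are assumptions, not values from the thesis.

def total_pump_switches(schedule):
    # TNps: count every off->on and on->off transition over the day.
    return sum(1 for prev, cur in zip(schedule, schedule[1:]) if (prev > 0) != (cur > 0))

def energy_cost(schedule, rated_kw=30.0, tariff_eur_kwh=0.15):
    # For a variable speed driven pump, power is assumed to scale with the cube of relative speed.
    return sum((speed ** 3) * rated_kw * tariff_eur_kwh for speed in schedule)

# Relative pump speed for each hour of the day (0 = off, 1 = full speed).
schedule = [0, 0, 0, 0, 0.8, 0.8, 1.0, 1.0, 0.9, 0.9, 0.7, 0.7,
            0.7, 0.7, 0.8, 0.8, 1.0, 1.0, 1.0, 0.9, 0.8, 0, 0, 0]

print("TNps:", total_pump_switches(schedule))
print("Energy cost (EUR/day):", round(energy_cost(schedule), 2))

A multi-objective algorithm such as NSGA-II evaluates many such schedules and keeps the non-dominated ones, which is how the Pareto fronts between the two objectives are obtained.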
Abstract:
The research work presented in the thesis describes a new methodology for the automated near real-time detection of pipe bursts in Water Distribution Systems (WDSs). The methodology analyses the pressure/flow data gathered by SCADA systems in order to extract useful information that goes beyond the usual monitoring activities and/or regulatory reporting, enabling the water company to proactively manage sections of the WDS. The work has an interdisciplinary nature, covering AI techniques and WDS management processes such as data collection, manipulation and analysis for event detection. The methodology makes use of (i) an Artificial Neural Network (ANN) for the short-term forecasting of future pressure/flow signal values and (ii) a rule-based model for burst detection at sensor and district level. The results of applying the new methodology to a District Metered Area in the Emilia-Romagna region, Italy, are also reported in the thesis. These results illustrate how the methodology is capable of detecting the aforementioned failure events in a fast and reliable manner. The methodology allows water companies to save water, energy and money, helping them to achieve higher levels of operational efficiency, compliance with current regulations and, last but not least, improved customer service.
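As a minimal sketch of the forecast-plus-rule idea summarised above, the Python snippet below compares each observed flow value with a short-term forecast and raises a burst alarm when the residual stays above a threshold for several consecutive samples. The moving-average forecast merely stands in for the thesis's ANN, and the window, threshold and persistence values are assumptions made for illustration.

# Sketch of residual-based burst detection on a flow signal.
# A moving-average forecast stands in for the ANN; threshold and persistence are assumed.

def detect_burst(observed, window=4, threshold=5.0, persistence=3):
    alarms, consecutive = [], 0
    for t in range(window, len(observed)):
        forecast = sum(observed[t - window:t]) / window   # stand-in short-term forecast
        residual = observed[t] - forecast                 # a burst shows up as sustained extra flow
        consecutive = consecutive + 1 if residual > threshold else 0
        if consecutive >= persistence:
            alarms.append(t)
    return alarms

night_flow = [12, 11, 12, 13, 12, 11, 12, 25, 26, 27, 26, 25]  # synthetic DMA inflow (L/s)
print(detect_burst(night_flow))   # time steps at which the rule raises a burst alarm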
Abstract:
The objective of the thesis project, developed within the Line Control & Software Engineering team of the G.D company, is to analyze and identify the appropriate tool to automate the HW configuration process using Beckhoff technologies by importing data from an ECAD tool. This would save a great deal of time, since the I/O topology created as part of the electrical planning is currently imported manually into the related SW project of the machine. Moreover, a manual import is more error-prone, due to human mistakes, than an automatic configuration tool. First, an introduction to TwinCAT 3, EtherCAT and the Automation Interface is provided; then the official Beckhoff tool, XCAD Interface, is analyzed, together with the requirements the electrical planning must satisfy in order to use it: the interface is realized by means of the AutomationML format. Finally, due to some limitations observed, a company-internal tool is designed and implemented. Tests and validation of the tool are performed on a sample production line of the company.
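As a rough illustration of what such an import involves, the hypothetical Python snippet below (unrelated to the company tool and to the TwinCAT Automation Interface) reads a simplified AutomationML/CAEX export and lists its InternalElement hierarchy, which is where an ECAD tool typically describes the I/O devices. The file name and the reliance on only the Name and RefBaseSystemUnitPath attributes are assumptions about a simplified export.

# Sketch: list the device hierarchy of a (simplified) AutomationML/CAEX export.
# A real import must additionally map these devices onto the EtherCAT topology of the TwinCAT project.
import xml.etree.ElementTree as ET

def list_io_topology(path):
    root = ET.parse(path).getroot()
    for elem in root.iter():
        # CAEX nests devices as InternalElement nodes; XML namespaces are ignored for brevity.
        if elem.tag.endswith("InternalElement"):
            name = elem.get("Name")
            ref = elem.get("RefBaseSystemUnitPath")  # e.g. the terminal/device type
            print(name, "->", ref)

# Example (hypothetical file name): list_io_topology("electrical_planning.aml")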
Abstract:
Carbon capture and storage (CCS) represents an interesting climate mitigation option; however, as for any other human activity, there is a pressing need to assess and manage the associated risks. This study specifically addresses the marine environmental risk posed by CO2 leakages associated with the CCS subsea engineering system, understood here as offshore pipelines and injection or plugged and abandoned wells. The aim of this thesis work is to begin developing a complete and standardized practical procedure for performing a quantified environmental risk assessment (ERA) for CCS, with reference to the specific activities mentioned above. Such an effort would be of extreme relevance not only for companies willing to implement CCS, as methodological guidance, but also, by standardizing the ERA procedure, as a way to begin changing people’s perception of CCS, which is often discredited due to the evident lack of comprehensive and systematic methods for assessing the impacts on the marine environment. The backbone of the framework developed consists of the integration of the main steps of ERA with those of quantified risk assessment (QRA), with the aim of quantitatively characterizing risk and describing it as a combination of the magnitude of the consequences and their frequency. The framework developed in this work remains, however, at a high level, as not every single aspect has been dealt with in the required detail. Thus, several alternative options are presented for use depending on the situation. Further specific studies should address their accuracy and efficiency and resolve the knowledge gaps that emerged, in order to establish and validate a final, complete procedure. Regardless of the knowledge gaps and uncertainties, which surely need to be addressed, this preliminary framework is already relevant to field applications, as non-stringent guidance for performing a CCS ERA, and it constitutes the foundation of the final framework.
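A minimal numerical sketch of the quantification step described above, in which risk is expressed as the combination of consequence magnitude and occurrence frequency and then aggregated over leak scenarios, is given below in Python. The scenario list, frequencies and magnitudes are placeholder values chosen purely for illustration, not results of the thesis.

# Sketch: risk = frequency x magnitude, aggregated over CO2 leak scenarios.
# Frequencies (events/year) and magnitudes (relative environmental impact) are placeholders.

scenarios = [
    {"name": "pipeline small leak",        "frequency": 1e-3, "magnitude": 0.2},
    {"name": "pipeline full-bore rupture", "frequency": 1e-5, "magnitude": 1.0},
    {"name": "injection well leak",        "frequency": 5e-4, "magnitude": 0.5},
    {"name": "abandoned well leak",        "frequency": 2e-4, "magnitude": 0.4},
]

for s in scenarios:
    s["risk"] = s["frequency"] * s["magnitude"]        # expected impact per year

for s in sorted(scenarios, key=lambda s: s["risk"], reverse=True):
    print(s["name"], "->", format(s["risk"], ".2e"))
print("Total risk:", format(sum(s["risk"] for s in scenarios), ".2e"))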
Abstract:
Decentralized systems have allowed users to share information without a centralized intermediary that holds sovereignty over the exchanged data, introduces security risks, and can become a bottleneck. However, practical systems for retrieving the information stored on them that do not include a centralized component are rare. This thesis presents the development of an application whose purpose is to let users upload images in a fully decentralized architecture, thanks to Decentralized File Storage, and to subsequently search for and retrieve those objects through a Distributed Hash Table (DHT) in which the necessary Content IDentifiers (CIDs) are stored. The main objective was to find a better allocation of the images within the DHT through the use of the International Standard Content Code (ISCC), an ISO standard that, by means of content-driven, locality-sensitive and similarity-preserving hash functions, assigns the IPFS CIDs of the images to the DHT nodes efficiently, so as to reduce hops between nodes as much as possible and retrieve images consistent with the executed query. The results obtained from allocating the image CIDs to the nodes are then analysed by comparing ISCC with the SHA-256 cryptographic hash, to verify whether ISCC better represents the similarity between images by allocating similar images to nodes that are close to each other.
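As a toy illustration of why a similarity-preserving code can place similar images on nearby DHT nodes while a cryptographic hash scatters them, the Python sketch below contrasts SHA-256 with a simple locality-sensitive bit code that merely stands in for ISCC. The node count, the tiny pixel-list image representation and the stand-in code are assumptions for illustration, not the thesis implementation.

# Toy comparison: node assignment by cryptographic hash (SHA-256) vs a similarity-preserving code.
# The bit code below only stands in for ISCC; "images" are reduced to small grayscale pixel lists.
import hashlib

NUM_NODES = 16

def sha256_node(pixels):
    # Cryptographic hashing: a one-pixel change usually lands the image on an unrelated node.
    return int(hashlib.sha256(bytes(pixels)).hexdigest(), 16) % NUM_NODES

def similarity_node(pixels):
    # Stand-in locality-sensitive code: one bit per pixel (above/below the mean),
    # so near-duplicate images produce near-identical codes and nearby node IDs.
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2) % NUM_NODES

img_a = [10, 12, 200, 190, 11, 13, 195, 188]   # tiny grayscale "image"
img_b = [11, 12, 201, 189, 10, 14, 196, 189]   # a slightly altered copy of img_a

print("SHA-256 nodes:   ", sha256_node(img_a), sha256_node(img_b))          # typically far apart
print("Similarity nodes:", similarity_node(img_a), similarity_node(img_b))  # identical or adjacent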