13 results for implementation of organizational values

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

Isolated DC-DC converters play a significant role in fast charging and in maintaining the variable output voltage required for EV applications. This study investigates the different isolated DC-DC converters for onboard and offboard chargers; once a topology is selected, it studies the relevant control techniques and, finally, develops a real-time converter model to obtain Hardware-In-The-Loop (HIL) results. Among the isolated DC-DC topologies, the Dual Active Bridge (DAB) converter has the advantage of allowing bidirectional power flow, which enables operation in both Grid-to-Vehicle (G2V) and Vehicle-to-Grid (V2G) modes. Recently, DAB converters have been adopted in offboard chargers for high-voltage applications thanks to SiC and GaN MOSFETs; this technology also allows the use of higher switching frequencies. By employing soft-switching techniques to reduce switching losses, higher switching-frequency operation becomes possible in the DAB. There are four phase-shift control techniques for the DAB converter: Single Phase Shift, Extended Phase Shift, Dual Phase Shift, and Triple Phase Shift. This thesis considers two control strategies, Single Phase Shift and Dual Phase Shift, to understand the circulating currents, power losses, and output-capacitor size reduction in the DAB. Hardware-In-The-Loop (HIL) experiments are carried out for both controls at high switching frequencies using the PLECS software tool and the RT Box that supports PLECS. The Root Mean Square Error of the steady-state output voltage is also calculated for different sampling frequencies in both controls, to identify the sampling frequency achievable in real time. A DSP implementation is also carried out to emulate the optimized DAB converter design, and the final real-time simulation results are discussed for both the Single Phase Shift and Dual Phase Shift controls.
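As background for the phase-shift discussion above, the power transferred by a DAB under Single Phase Shift control is commonly written as follows (a textbook expression assuming ideal switches, transformer turns ratio n, series/leakage inductance L, switching frequency f_s, and phase shift φ between the two bridges; the converter model developed in the thesis may include further non-idealities):

```latex
P = \frac{n\,V_1 V_2}{2\pi^2 f_s L}\,\varphi\,(\pi - |\varphi|), \qquad -\pi \le \varphi \le \pi
```

Positive and negative values of φ correspond to the G2V and V2G power-flow directions, respectively.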

Relevance:

100.00%

Publisher:

Abstract:

This report presents the design of an innovative satellite-based monitoring approach, applied to the Iraqi Marshlands, to survey the extent and distribution of marshland re-flooding and to assess the development of wetland vegetation cover. The study, conducted in collaboration with MEEO Srl, uses images collected by the (A)ATSR sensor onboard the ESA ENVISAT satellite to gather data at multi-temporal scales, and an analysis was adopted to observe the evolution of marshland re-flooding. The methodology follows a multi-temporal pixel-based approach built on classification maps produced by the classification tool SOIL MAPPER®. The catalogue of classification maps is available as a web service through the Service Support Environment Portal (SSE, supported by ESA). The inundation of the Iraqi marshlands, continuous since April 2003, is characterized by a high degree of variability, ad-hoc interventions, and uncertainty. Given the security constraints and the vastness of the Iraqi marshlands, as well as cost-effectiveness considerations, satellite remote sensing was the only viable tool to observe the changes taking place on a continuous basis. The proposed system (ALCS - AATSR Land Classification System) avoids the direct use of the (A)ATSR images and foresees the application of LULCC evolution models directly to the 'stock' of classified maps. This approach is made possible by the availability of a 13-year classified image database, conceived and implemented in the CARD project (http://earth.esa.int/rtd/Projects/#CARD). The approach presented here evolves toward an innovative, efficient, and fast method to exploit the potential of multi-temporal LULCC analysis of (A)ATSR images. The two main objectives of this work are both a form of assessment: the first is to assess the modelling capability of the web application ALCS using AATSR images classified with SOIL MAPPER®, and the second is to evaluate the magnitude, character, and extent of wetland rehabilitation.
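To make the multi-temporal pixel-based idea concrete (this is a generic illustration, not the ALCS/SOIL MAPPER® implementation), the sketch below compares two classification maps acquired at different dates and flags pixels whose class switched from dry land to water, a simple proxy for re-flooding; the class codes and map sizes are hypothetical.

```python
import numpy as np

# Hypothetical class codes used by the illustrative classification maps
DRY_LAND, WATER, VEGETATION = 1, 2, 3

def reflooding_mask(map_t0: np.ndarray, map_t1: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels that changed from dry land to water."""
    return (map_t0 == DRY_LAND) & (map_t1 == WATER)

# Toy example: two 3x3 classification maps from different acquisition dates
t0 = np.array([[1, 1, 3],
               [1, 2, 3],
               [1, 1, 1]])
t1 = np.array([[2, 1, 3],
               [2, 2, 3],
               [1, 2, 1]])

mask = reflooding_mask(t0, t1)
print(f"Re-flooded pixels: {int(mask.sum())} of {mask.size}")
```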

Relevance:

100.00%

Publisher:

Abstract:

Human reasoning is a fascinating and complex cognitive process that can be applied in different research areas such as philosophy, psychology, law, and finance. Unfortunately, developing supporting software (for those different areas) able to cope with such complex reasoning is difficult and requires a suitable abstract logical formalism. In this thesis we aim to develop a program whose job is to evaluate a theory (a set of rules) with respect to a goal and provide results such as "the goal is derivable from the knowledge base of the theory". In order to achieve this goal we need to analyse different logics and choose the one that best meets our needs. In logic we usually try to determine whether a given conclusion is logically implied by a set of assumptions T (the theory). However, when we deal with logic programming we need an efficient algorithm to find such implications. In this work we use a logic rather close to human reasoning. Indeed, human reasoning requires an extension of first-order logic able to reach a conclusion from premises that are not definitely true and that belong to an incomplete set of knowledge. Thus, we implemented a defeasible logic framework able to manipulate defeasible rules. Defeasible logic is a non-monotonic logic designed by Nute for efficient defeasible reasoning (see Chapter 2). These kinds of applications are useful in the legal domain, especially if they offer an implementation of an argumentation framework that provides a formal model of a game. Roughly speaking, if the theory is the set of laws, a key claim is the conclusion that one party wants to prove (and the other wants to defeat), and the dynamic assertion of rules corresponds to the facts put forward by the parties, then we can play an argumentative challenge between two players and decide whether the conclusion is provable or not depending on the strategies adopted by the players. Implementing a game model requires one more meta-interpreter able to evaluate the defeasible logic framework; indeed, following Gödel's theorem (see page 127), we cannot evaluate the meaning of a language using only the tools provided by the language itself: we need a meta-language able to manipulate the object language. Thus, rather than a single meta-interpreter, we propose a meta-level containing different meta-evaluators: the first has been explained above, the second is needed to run the game model, and the last is used to change the game execution and tree-derivation strategies.
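To give a flavour of the kind of inference involved (a heavily simplified propositional sketch, not the Nute-style meta-interpreter actually developed in the thesis), the code below treats a literal as defeasibly provable when some applicable rule supports it and every applicable attacking rule is overridden by the superiority relation.

```python
# Minimal propositional sketch of defeasible reasoning (illustrative only;
# Nute's full defeasible logic also has strict rules, defeaters, and proof
# conditions that this toy example deliberately omits).

FACTS = {"bird", "penguin"}                    # undisputed knowledge
RULES = {
    "r1": ({"bird"}, "flies"),                 # birds normally fly
    "r2": ({"penguin"}, "~flies"),             # penguins normally do not
}
SUPERIOR_TO = {"r2": {"r1"}}                   # r2 overrides r1

def complement(literal: str) -> str:
    return literal[1:] if literal.startswith("~") else "~" + literal

def applicable(rule_id: str) -> bool:
    antecedents, _ = RULES[rule_id]
    return antecedents <= FACTS                # all antecedents are known facts

def defeasibly_provable(goal: str) -> bool:
    for rid, (_, head) in RULES.items():
        if head == goal and applicable(rid):
            # attacking rules: applicable rules for the complement of the goal
            # that are not overridden by the supporting rule rid
            attackers = [
                aid for aid, (_, ahead) in RULES.items()
                if ahead == complement(goal)
                and applicable(aid)
                and aid not in SUPERIOR_TO.get(rid, set())
            ]
            if not attackers:
                return True
    return False

print(defeasibly_provable("flies"))    # False: r2 defeats r1
print(defeasibly_provable("~flies"))   # True
```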

Relevance:

100.00%

Publisher:

Abstract:

This research work presents the design and implementation of an FFT pruning block, an extension to an FFT core for OFDM demodulation that enables run-time pruning of the FFT algorithm without any restriction on the distribution pattern of the active/inactive sub-carriers. The design and implementation of the FFT processor core itself is not part of this work. The whole design was prototyped on an ALTERA STRATIX V FPGA to evaluate the performance of the pruning engine. Synthesis and simulation results showed that the logic overhead introduced by the pruning block is limited to 10% of the total resource utilization. Moreover, in the presence of a medium-to-high scattering of the sub-carriers, the power and energy consumption of the FFT core were reduced by 30%.
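As a purely conceptual illustration of pruning (the thesis prunes butterflies inside the hardware FFT data flow, which is far more involved than this software sketch), the code below evaluates only the bins corresponding to active sub-carriers and skips the rest:

```python
import numpy as np

def pruned_dft(x: np.ndarray, active_bins) -> dict:
    """Evaluate only the DFT bins listed in active_bins (output pruning).

    A full FFT always computes all N bins; when only a sparse, arbitrarily
    scattered subset of sub-carriers is needed, the remaining work can be
    skipped, which is the idea exploited by a pruning engine.
    """
    N = len(x)
    n = np.arange(N)
    return {k: np.dot(x, np.exp(-2j * np.pi * k * n / N)) for k in active_bins}

# Toy check against a full FFT
rng = np.random.default_rng(0)
x = rng.standard_normal(64) + 1j * rng.standard_normal(64)
active = [3, 17, 42]                       # arbitrary scattered sub-carriers
full = np.fft.fft(x)
pruned = pruned_dft(x, active)
print(all(np.isclose(pruned[k], full[k]) for k in active))   # True
```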

Relevance:

100.00%

Publisher:

Abstract:

Data Distribution Management (DDM) is a core part of the High Level Architecture (HLA) standard: its goal is to optimize the resources used by simulation environments to exchange data. It has to filter and match the information generated during a simulation so that each federate (that is, each simulation entity) only receives the information it needs. It is important that this is done quickly and well, both to obtain better performance and to avoid the transmission of irrelevant data; otherwise, network resources may saturate quickly. The main topic of this thesis is the implementation of a super partes DDM testbed. It evaluates the quality of DDM approaches of all kinds: it supports both region-based and grid-based approaches, and it can accommodate other methods not yet devised. It uses three factors to rank them: execution time, memory usage, and distance from the optimal solution. A prearranged set of instances is already available, but we also allow the creation of instances with user-provided parameters. The thesis is structured as follows. We start by introducing what DDM and HLA are and what they do in detail. We then describe the state of the art, providing an overview of the best-known resolution approaches and the pseudocode of the most interesting ones. The third chapter describes how the testbed we implemented is structured. In the fourth chapter we present and compare the results obtained from the execution of the four approaches we implemented. The result of the work described in this thesis can be downloaded from SourceForge at the following link: https://sourceforge.net/projects/ddmtestbed/. It is licensed under the GNU General Public License version 3.0 (GPLv3).
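For reference, the region-based brute-force matching that such a testbed has to benchmark can be sketched as follows (region extents, dimensionality, and the instance format are illustrative; the testbed's own interfaces are described in the thesis):

```python
from itertools import product

# A region is a list of (lower, upper) extents, one pair per dimension.

def overlaps(sub, upd) -> bool:
    """Two regions match if their extents overlap in every dimension."""
    return all(s_lo <= u_hi and u_lo <= s_hi
               for (s_lo, s_hi), (u_lo, u_hi) in zip(sub, upd))

def brute_force_matching(subscriptions, updates):
    """O(n*m) baseline: test every subscription region against every update."""
    return [(i, j)
            for (i, s), (j, u) in product(enumerate(subscriptions),
                                          enumerate(updates))
            if overlaps(s, u)]

subs = [[(0, 10), (0, 10)], [(20, 30), (5, 15)]]
upds = [[(8, 12), (9, 11)], [(40, 50), (0, 1)]]
print(brute_force_matching(subs, upds))   # [(0, 0)]
```

Grid-based approaches replace this pairwise test with a projection of the regions onto a fixed grid of cells, which is exactly the kind of time/memory/accuracy trade-off that the three ranking factors above are meant to capture.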

Relevance:

100.00%

Publisher:

Abstract:

Network Theory is a prolific and lively field, especially where it meets Biology. New concepts from this theory find application in areas where extensive datasets are already available for analysis, without the need to invest money to collect them. The only tools necessary to carry out an analysis are easily accessible: a computing machine and a good algorithm. As these two tools progress, thanks to technological advancement and human effort, larger and larger datasets can be analysed. The aim of this work is twofold. The first is to provide an overview of one of these concepts, which originates at the meeting point between Network Theory and Statistical Mechanics: the entropy of a network ensemble. This quantity has been described from different angles in the literature, and our approach tries to be a synthesis of the different points of view. The second part of the work is devoted to presenting a parallel algorithm that can evaluate this quantity over an extensive dataset. Finally, the algorithm is also used to analyse high-throughput data coming from biology.
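One common formulation of this quantity, for ensembles in which each link (i, j) appears independently with probability p_ij (as in maximum-entropy ensembles with soft constraints), is the Shannon entropy

```latex
S \;=\; -\sum_{i<j} \Big[\, p_{ij}\ln p_{ij} + (1-p_{ij})\ln(1-p_{ij}) \,\Big],
```

a sum over independent node pairs that decomposes naturally across processors, which is what makes a parallel evaluation over large datasets attractive; the exact definition and constraints adopted in this work may differ.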

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents a CMOS amplifier with high common-mode rejection, designed in UMC 130 nm technology. The goal is to achieve a high amplification factor for a wide range of biological signals (with frequencies from 10 Hz to 1 kHz) and to reject the common-mode noise signal. A Data Acquisition System is presented, composed of a Delta-Sigma-like Modulator and an antenna, which forms the core of a portable low-complexity radio system; the amplifier is designed to interface this data acquisition system with the sensor that acquires the electrical signal. The Modulator asynchronously acquires and samples human muscle activity, sending a Quasi-Digital pattern that encodes the acquired signal. Translating the muscle activity with this pattern causes only a minor loss of information compared to an encoding technique that uses a standard digital signal via Impulse-Radio Ultra-Wide Band (IR-UWB). The biological signals needed for electromyographic analysis have an amplitude of 10 to 100 μV and need to be highly amplified and separated from the overwhelming 50 mV common-mode noise signal. Various proof-of-concept tests are presented, as well as evidence that the design also works with different sensors, such as radiation measurement for dosimetry studies.
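For context, the rejection targeted by such a design is usually quantified by the common-mode rejection ratio (the definition below is the standard one; the actual specification achieved by the amplifier is reported in the thesis, not here):

```latex
\mathrm{CMRR} = 20\,\log_{10}\!\left(\frac{A_d}{A_{cm}}\right)\ \mathrm{dB},
```

where A_d is the differential gain and A_cm the common-mode gain, so amplifying 10-100 μV differential signals in the presence of a 50 mV common-mode disturbance calls for a large A_d/A_cm ratio.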

Relevance:

100.00%

Publisher:

Abstract:

This is a report on the design and implementation of the software platform that provides the archive of the SATNET project. University satellites are in view of their own Ground Station for only a few minutes per day: SATNET addresses the need to communicate with a university satellite in low Earth orbit for more than the few minutes per day that a single Ground Station allows. This is achieved through a network of satellite Ground Stations linked by specific shared missions, which pool the data received from one or more satellites, increasing their data-per-day yield and allowing better use of the Ground Stations themselves. The network uses the Internet as its communication channel and requires an archive in which to store the received data, so that they can later be consulted and retrieved. The subject of this thesis work was the development and implementation of that archive: implemented as a dynamic website, the software satisfies all of the requirements outlined above, allowing authenticated users to insert data and other users to access it. The software is complete and functional but not finished, since some requirements have not yet been specified; for example, neither the type of information that can be uploaded nor the fields required in the registration form for the various users has been defined. In these cases generic fields were introduced, leaving the user the possibility of modifying them later. The software was therefore conceived to be easily customizable and modifiable even by inexperienced users simply by reading the thesis, which thus serves as a genuine guide to the use, installation, customization, and maintenance of the software platform. The thesis highlights the objectives and requirements, shows the appearance of the website and its functionality, and explains step by step the procedure for changing the look of the pages and some configuration parameters. Moreover, in case substantial changes to the project become necessary, it introduces the programming languages needed for web development and helps the user understand the structure of the software. It concludes with some suggestions for possible modifications, which can be carried out only after a proper definition of the objectives and of the specific requirements. In the future, the implementation and customization of the software are expected, as well as the integration of the archive within the SATNET project, with the goal of improving and fostering the dissemination and sharing of joint projects among European and non-European universities.

Relevance:

100.00%

Publisher:

Abstract:

EBPR (Enhanced Biological Phosphorus Removal) is a type of secondary treatment in WWTPs (WasteWater Treatment Plants) that is widely used in full-scale plants worldwide. Phosphorus occurring in aquatic systems in high amounts can cause eutrophication and, consequently, the death of fauna and flora. A specific biomass is used to remove the phosphorus: the so-called PAOs (Polyphosphate Accumulating Organisms), which accumulate phosphorus in the form of polyphosphate in their cells. Some of these organisms, the so-called DPAOs (Denitrifying Polyphosphate Accumulating Organisms), use nitrate or nitrite as the electron acceptor, thereby also contributing to the removal of these compounds from the wastewater, although side reactions may lead to the formation of nitrous oxide. The aim of this project was to reproduce an EBPR process at laboratory scale, acclimatizing and enriching the specialized biomass. Two bioreactors were operated as Sequencing Batch Reactors (SBRs), one enriched in Accumulibacter, the other in Tetrasphaera (both PAOs): Tetrasphaera microorganisms are able to take up amino acids as a carbon source, while Accumulibacter take up organic carbon (volatile fatty acids, VFA). In order to measure the removal of COD, phosphorus, and nitrogen-derived compounds, different analyses were performed: spectrophotometric measurement of phosphorus, nitrate, nitrite, and ammonia concentrations; TOC (Total Organic Carbon, measuring carbon consumption); VFA via HPLC (High Performance Liquid Chromatography); total and volatile suspended solids following APHA standard methods; and qualitative characterization of the microbial population via FISH (Fluorescence In Situ Hybridization). Batch tests were also performed to monitor NOx production. Both specialized populations were enriched as a result of SBR operation; however, Accumulibacter was found to take up phosphate to a greater extent. Both populations were able to efficiently remove the nitrates and organic compounds present in the feed. The experimental work was carried out at the FCT of Universidade Nova de Lisboa (FCT-UNL) from February to July 2014.
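The removal performance referred to above is conventionally expressed, for each monitored species (phosphate, nitrate, ammonia, COD), as a percentage removal efficiency; the formula below is the standard definition, not a value reported by this work:

```latex
\eta = \frac{C_{\mathrm{in}} - C_{\mathrm{out}}}{C_{\mathrm{in}}} \times 100\%,
```

where C_in and C_out are the concentrations measured in the influent and in the effluent, respectively.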

Relevance:

100.00%

Publisher:

Abstract:

Cloud services are becoming ever more important in everyone's life. Cloud storage? Web mail? We don't need to work in big IT companies to be surrounded by cloud services. Another thing that is growing in importance, or at least that should be considered ever more important, is the concept of privacy. The more we rely on services about which we know close to nothing, the more we should be worried about our privacy. In this work, I will analyze a prototype software based on a peer-to-peer architecture for offering cloud services, to see whether it is possible to make it completely anonymous, meaning that not only the users will be anonymous, but also the peers composing the network will not know each other's real identity. To make this possible, I will use anonymizing networks such as Tor. I will start by studying the state of the art of Cloud Computing, looking at some real examples, and then analyze the architecture of the prototype, trying to highlight the differences between its distributed nature and the somewhat centralized solutions offered by the famous vendors. After that, I will go as deep as possible into the working principles of anonymizing networks, because they are not something that can simply be 'applied' mindlessly: some de-anonymizing techniques are very subtle, so things must be studied carefully. I will then implement the required changes and test the new anonymized prototype to see how its performance differs from that of the standard one. The prototype will be run on many machines, orchestrated by a tester script that automatically starts and stops them and issues all the required API calls. As for where to find all these machines, I will make use of Amazon EC2 cloud services and their on-demand instances.
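As a minimal illustration of how application traffic can be routed through Tor (assuming a local Tor client listening on its default SOCKS port 9050 and the requests library installed with SOCKS support, pip install requests[socks]; the prototype integrates anonymization at the peer level, which goes well beyond this single call):

```python
import requests

# socks5h:// makes DNS resolution happen inside Tor as well,
# avoiding DNS leaks that could de-anonymize the client.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def fetch_via_tor(url: str) -> str:
    """Fetch a URL with all traffic (including DNS) routed through Tor."""
    response = requests.get(url, proxies=TOR_PROXIES, timeout=60)
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    # check.torproject.org reports whether the request arrived through Tor
    print(fetch_via_tor("https://check.torproject.org/")[:200])
```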

Relevance:

100.00%

Publisher:

Abstract:

This thesis work is developed within the European Student Earth Orbiter (ESEO) project, supported by the European Space Agency (ESA) to help prepare a well-qualified space-engineering workforce for Europe's future. In the following chapters we analyse how to simulate some of the ESEO subsystems. First, the Thermal Subsystem, which evaluates the temperature evolution of the on-board instruments; for this purpose it is also necessary to simulate the orbital and attitude dynamics of the spacecraft, in order to evaluate the external environmental fluxes. The Power Subsystem is the next step: it models the ability of the spacecraft to produce and store electrical energy. Finally, we integrate into our software a block capable of simulating the communication link between the satellite and the Ground Station (GS). This last block is designed and validated during the preparation of the thesis.
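As a heavily simplified example of the kind of computation a thermal subsystem simulator performs (a single lumped node with solar absorption, internal dissipation, and radiative emission; the actual ESEO model involves many nodes, conductive couplings, albedo and Earth-IR fluxes, and attitude-dependent view factors, and the parameter values below are illustrative, not ESEO data):

```python
import numpy as np

SIGMA = 5.670e-8      # Stefan-Boltzmann constant [W m^-2 K^-4]
SOLAR = 1361.0        # solar constant at 1 AU [W m^-2]
ALPHA = 0.7           # solar absorptivity (illustrative)
EPS   = 0.8           # IR emissivity (illustrative)
AREA  = 0.1           # exposed/radiating area [m^2] (illustrative)
C_TH  = 800.0         # lumped thermal capacitance [J K^-1] (illustrative)
Q_INT = 5.0           # internal dissipation [W] (illustrative)

def simulate(t_end=6000.0, dt=1.0, T0=290.0, eclipse=lambda t: False):
    """Explicit-Euler integration of a single thermal node."""
    T, temps = T0, []
    for t in np.arange(0.0, t_end, dt):
        q_sun = 0.0 if eclipse(t) else ALPHA * SOLAR * AREA
        q_out = EPS * SIGMA * AREA * T**4          # radiated to deep space
        T += dt * (q_sun + Q_INT - q_out) / C_TH
        temps.append(T)
    return np.array(temps)

# Toy orbit: sunlit for 3600 s, then in eclipse for the rest of the run
temps = simulate(eclipse=lambda t: t > 3600.0)
print(f"End of sunlit arc: {temps[3599]:.1f} K, end of run: {temps[-1]:.1f} K")
```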