14 results for rule-based algorithms

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

Recently, the interest of the automotive market in hybrid vehicles has increased due to more restrictive pollutant emission legislation and to the need to reduce fossil fuel consumption, since such a solution allows a considerable improvement of the overall vehicle efficiency. The term hybridization refers to the energy flow in the powertrain of a vehicle: a standard vehicle usually has only one energy source and one energy tank, whereas a hybrid vehicle has at least two energy sources. In most cases the prime mover is an internal combustion engine (ICE), while the auxiliary energy source can be mechanical, electrical, pneumatic or hydraulic. The control unit of a hybrid vehicle is expected to use the ICE in its high-efficiency operating regions and to shut it down when it is more convenient, while using the electric motor-generator (EMG) at partial loads and as a fast torque response during transients. However, the battery state of charge may represent a limitation for such a strategy. This is why, in most cases, energy management strategies are based on the control of the State Of Charge (SOC). Several studies have been conducted on this topic and many different approaches have been illustrated. The purpose of this dissertation is to develop an online (usable on-board) control strategy in which the operating modes are defined using an instantaneous optimization method that minimizes the equivalent fuel consumption of a hybrid electric vehicle. The equivalent fuel consumption is calculated by taking into account the total energy used by the hybrid powertrain during the propulsion phases. The first chapter presents the characteristics of hybrid vehicles. The second chapter describes the global model, with a particular focus on the energy management strategies usable for the supervisory control of such a powertrain. The third chapter compares the performance of the implemented controller on the NEDC cycle with that obtained with the original control strategy.
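
The instantaneous optimization idea can be illustrated with a minimal sketch (hypothetical engine and battery maps, a fixed equivalence factor s, and no SOC feedback; not the controller developed in the thesis): at every step the supervisor evaluates candidate torque splits and picks the one minimizing fuel power plus weighted battery power.

```python
import numpy as np

# Placeholder maps (hypothetical, for illustration only): a real controller would
# interpolate measured engine fuel maps and battery/motor efficiency maps.
def fuel_power(t_ice, speed):
    """Chemical power drawn from fuel for a given ICE torque [Nm] and speed [rad/s]."""
    return np.where(t_ice > 0, (t_ice * speed) / 0.35 + 2e3, 0.0)   # ~35% efficiency + idle losses

def battery_power(t_em, speed):
    """Electrical power drawn from the battery for a given EM torque and speed."""
    mech = t_em * speed
    return np.where(mech >= 0, mech / 0.9, mech * 0.9)              # 90% efficiency both ways

def ecms_split(torque_request, speed, s=2.5, n=21):
    """Instantaneous optimization: choose the ICE share of the torque request
    that minimizes the equivalent fuel power  P_fuel + s * P_batt."""
    shares = np.linspace(0.0, 1.0, n)
    t_ice = shares * torque_request
    t_em = (1.0 - shares) * torque_request
    cost = fuel_power(t_ice, speed) + s * battery_power(t_em, speed)
    return shares[np.argmin(cost)]

print(ecms_split(torque_request=120.0, speed=200.0))   # share of the torque assigned to the ICE
```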

Relevance:

80.00%

Publisher:

Abstract:

Recommender systems as we know them emerged at the end of the 20th century and have evolved up to the present day, reaching into numerous fields; among these we will examine software engineering, medicine, corporate network management and, finally, as the focal topic of this thesis, e-Learning. After a quick overview of the current state of the art of recommender systems, moving briefly from pure methods to hybrid methods obtained as combinations of the former, we will examine various practical applications to give the reader an idea of how varied the application domains of this kind of software can be. We will cover in detail how various recommendation techniques work in the e-Learning setting, analysing the issues that distinguish this field from all the others. In particular, an entire section is devoted to the psychology of the learner and to how understanding their cognitive profile helps to suggest the right resource to learn in the most appropriate way. Finally, privacy must be addressed: as we will see in the first chapter, recommender systems make extensive use of users' sensitive data in order to provide suggestions that are as accurate as possible. How can this data be protected against intrusions and thus against privacy violations? The aim of this thesis is therefore to present the current state of recommender systems, in e-Learning and beyond, so as to provide a clear, simple yet complete reference for anyone wishing to approach the extraordinary and fascinating world of online recommendation.

Relevance:

80.00%

Publisher:

Abstract:

This work introduces the basic concepts of Natural Language Processing, focusing on Information Extraction, analysing its application domains, its main tasks and how it differs from Information Retrieval. It then examines the Named Entity Recognition process, concentrating on the main issues in text annotation and on the methods used to evaluate the quality of entity extraction. Finally, it provides an overview of GATE/ANNIE, an open-source language-processing software platform, describing its architecture and main components, with particular attention to the tools GATE offers for the rule-based approach to Named Entity Recognition.
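
As a toy illustration of the rule-based approach (plain Python, not GATE/ANNIE components or JAPE grammar syntax; the gazetteer and the pattern are invented), entities can be matched by combining a lookup list with a contextual rule:

```python
import re

# Toy gazetteer and context rule: a real system such as ANNIE combines large
# gazetteer lists with grammar rules applied over annotations.
CITIES = {"Bologna", "Roma", "Milano"}
TITLE_RULE = re.compile(r"\b(?:Dott\.|Prof\.|Ing\.)\s+([A-Z][a-z]+(?:\s[A-Z][a-z]+)?)")

def extract_entities(text):
    entities = []
    for match in TITLE_RULE.finditer(text):            # rule: title followed by a capitalized name -> Person
        entities.append(("Person", match.group(1)))
    for token in re.findall(r"[A-Z][a-zà-ù]+", text):   # gazetteer lookup -> Location
        if token in CITIES:
            entities.append(("Location", token))
    return entities

print(extract_entities("Il Prof. Mario Rossi insegna a Bologna."))
# [('Person', 'Mario Rossi'), ('Location', 'Bologna')]
```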

Relevance:

80.00%

Publisher:

Abstract:

In recent years, systems engineering has become one of the major research domains. The complexity of systems has increased constantly, and Cyber-Physical Systems (CPS) are nowadays a category of particular interest: these are systems composed of a cyber part (computer-based algorithms) that monitors and controls physical processes. Their development and simulation are both complex because of the importance of the interaction between the cyber and the physical entities: there are many models, written in different languages, that need to exchange information with each other. Normally an orchestrator takes care of simulating the models and exchanging information between them. This orchestrator is usually developed manually, which is tedious and time-consuming work. Our proposal is to generate the orchestrator automatically through Co-Modeling, i.e. by modeling the coordination. Before reaching this ultimate goal, it is important to understand the mechanisms and de facto standards that could be used in a co-modeling framework. I therefore studied FMI, a technology employed in industry for co-simulation. In order to better understand the FMI standard, I implemented an automatic export, in the FMI format, of the models created in TimeSquare, an existing software tool for discrete modeling. I also developed a simple physical model in the open-source tool OpenModelica. Finally, I investigated how an orchestrator works by developing a simple one: this will be useful in the future for generating an orchestrator automatically.
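
A minimal fixed-step orchestrator can be sketched as follows (a hypothetical Python illustration, not the FMI/TimeSquare code of the thesis; the doStep-like interface and the two toy models are assumptions): the master alternates a data-exchange phase and a time-advancement phase.

```python
class Model:
    """Minimal stand-in for an FMI-like co-simulation unit (hypothetical interface)."""
    def __init__(self, name, dynamics, state=0.0):
        self.name, self.dynamics, self.state, self.inputs = name, dynamics, state, {}

    def set_input(self, port, value):
        self.inputs[port] = value

    def do_step(self, t, dt):                  # advance the model by one communication step
        self.state = self.dynamics(self.state, self.inputs, dt)

    def get_output(self):
        return self.state

def orchestrate(models, connections, t_end, dt):
    """Fixed-step master: exchange outputs along the connections, then step every model."""
    t = 0.0
    while t < t_end:
        for (src, dst, port) in connections:   # data-exchange phase
            dst.set_input(port, src.get_output())
        for m in models:                       # time-advancement phase
            m.do_step(t, dt)
        t += dt
    return {m.name: m.state for m in models}

# Cyber part: a discrete on/off controller; physical part: a first-order plant driven by the command.
controller = Model("controller", lambda s, u, dt: 1.0 if u.get("y", 0.0) < 0.5 else 0.0)
plant = Model("plant", lambda s, u, dt: s + dt * (u.get("cmd", 0.0) - s))
print(orchestrate([controller, plant],
                  [(plant, controller, "y"), (controller, plant, "cmd")],
                  t_end=1.0, dt=0.1))
```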

Relevance:

80.00%

Publisher:

Abstract:

The research work presented in the thesis describes a new methodology for the automated, near real-time detection of pipe bursts in Water Distribution Systems (WDSs). The methodology analyses the pressure/flow data gathered by means of SCADA systems in order to extract useful information that goes beyond simple monitoring-type activities and/or regulatory reporting, enabling the water company to proactively manage WDS sections. The work has an interdisciplinary nature, covering AI techniques and WDS management processes such as data collection, manipulation and analysis for event detection. Indeed, the methodology makes use of (i) an Artificial Neural Network (ANN) for the short-term forecasting of future pressure/flow signal values and (ii) a Rule-based Model for burst detection at sensor and district level. The results of applying the new methodology to a District Metered Area in the Emilia-Romagna region, Italy, are also reported in the thesis. They illustrate how the methodology is capable of detecting the aforementioned failure events in a fast and reliable manner. The methodology enables water companies to save water, energy and money, and therefore helps them achieve higher levels of operational efficiency, compliance with current regulations and, last but not least, an improvement in customer service.
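
A hedged sketch of the sensor-level detection rule is given below (hypothetical tolerance and persistence window; the ANN forecaster is replaced here by a pre-computed forecast array): a burst is flagged when the observed flow exceeds the forecast by more than a tolerance for several consecutive steps.

```python
import numpy as np

def detect_burst(observed, forecast, tol=0.15, persistence=3):
    """Rule-based check: flag time steps where the relative residual between observed
    and forecast flow stays above `tol` for `persistence` consecutive samples."""
    residual = (observed - forecast) / np.maximum(forecast, 1e-6)
    above = residual > tol
    alarms = np.zeros_like(above)
    run = 0
    for i, flag in enumerate(above):
        run = run + 1 if flag else 0
        if run >= persistence:
            alarms[i] = True
    return alarms

forecast = np.full(12, 10.0)     # e.g. ANN short-term flow forecast [l/s]
observed = np.array([10.1, 9.8, 10.2, 12.5, 12.8, 13.0, 12.9, 13.1, 12.7, 12.6, 12.8, 13.0])
print(np.where(detect_burst(observed, forecast))[0])   # indices where a burst alarm is raised
```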

Relevance:

80.00%

Publisher:

Abstract:

The present work proposes different approaches to extend the mathematical methods of supervisory energy management used in terrestrial applications to the maritime sector, which differs in constraints, variables and disturbances. The aim is to find the optimal real-time solution that includes the minimization of a defined track time, while maintaining the classical energetic approach. Starting from the analysis and modelling of the powertrain and boat dynamics, the energy economy problem is formulated following the mathematical principles behind optimal control theory. Then, an adaptation aimed at finding a winning strategy for the Monaco Energy Boat Challenge endurance trial is performed via the ECMS and A-ECMS control strategies, which leads to a more accurate knowledge of the energy sources and of the boat's behaviour. The simulations show that the algorithm accomplishes the fuel economy and time optimization targets, but the latter adds considerable tuning and computational complexity. In order to assess a practical implementation on real hardware, the knowledge gained from the previous approaches has been translated into a rule-based algorithm, so that it can run on an embedded CPU. Finally, the algorithm has been tuned and tested in a real-world race scenario, showing promising results.
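
To give an idea of how an optimization-derived strategy can be reduced to rules executable on an embedded CPU, here is a minimal sketch with hypothetical thresholds and power limits (not the race-tuned calibration of the thesis): the supervisor selects the power split from the battery SOC and the requested propulsive power.

```python
def rule_based_split(soc, p_request, p_source_max=4000.0):
    """Illustrative rule-based supervisory logic: split the propulsive power
    request [W] between the primary energy source and the battery based on SOC."""
    if soc < 0.25:                                    # low SOC: primary source covers the load and recharges
        p_source = min(p_request + 500.0, p_source_max)
    elif soc > 0.80 and p_request < 0.5 * p_source_max:
        p_source = 0.0                                # high SOC, low load: run on the battery alone
    else:
        p_source = min(p_request, p_source_max)       # nominal: follow the load, battery covers peaks
    p_batt = p_request - p_source                     # positive = discharge, negative = charge
    return p_source, p_batt

print(rule_based_split(soc=0.60, p_request=4500.0))   # battery assists when the request exceeds p_source_max
```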

Relevance:

80.00%

Publisher:

Abstract:

This master thesis work focuses on the development of a predictive EHC control function for a diesel plug-in hybrid electric vehicle equipped with a EURO 7-compliant exhaust aftertreatment system (EATS), with the purpose of showing the advantages provided by a predictive control strategy with respect to a rule-based one. A preliminary step is the definition of an accurate powertrain and EATS physical model, starting from already existing and validated applications. Then, a rule-based control strategy managing the torque split between the electric motor (EM) and the internal combustion engine (ICE) is developed and calibrated, with the main target of limiting tailpipe NOx emissions by taking into account EM and ICE operating conditions together with EATS conversion efficiency. The information available from vehicle connectivity is used to reconstruct the future driving scenario, also referred to as the electronic horizon (eHorizon), and in particular to predict the first ICE start. Based on this knowledge, an EATS pre-heating phase can be planned to avoid low pollutant conversion efficiencies, thus preventing high NOx emissions due to engine cold start. Consequently, the final NOx emissions over the complete driving cycle are strongly reduced, making it possible to comply with the limits potentially set by the incoming EURO 7 regulation. Moreover, for the same NOx emission target, the gain achieved thanks to the implementation of an EHC predictive control function makes it possible to consider a simplified EATS layout, thus reducing the related manufacturing cost. The promising results achieved in terms of NOx emission reduction show the effectiveness of a predictive control strategy focused on EATS thermal management and highlight the potential of a complete integration and parallel development of the vehicle physical systems involved, the control software and the connectivity data management.
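
The scheduling idea behind a predictive pre-heating function can be sketched as follows (a hypothetical illustration, not the actual control function developed in the thesis; the light-off temperature, heating rate and signal names are assumed): heating is triggered just early enough that the catalyst reaches light-off by the predicted ICE start.

```python
def ehc_heating_command(time_to_ice_start, cat_temp, light_off_temp=220.0, heat_rate=2.0):
    """Predictive EHC trigger (illustrative): start electrical heating when the time
    needed to reach the light-off temperature equals the predicted time to ICE start.

    time_to_ice_start : predicted seconds until the ICE is started (from the eHorizon)
    cat_temp          : current catalyst temperature [degC]
    heat_rate         : assumed heating rate provided by the EHC [degC/s]
    """
    if cat_temp >= light_off_temp:
        return False                                    # already warm enough, no heating needed
    time_to_light_off = (light_off_temp - cat_temp) / heat_rate
    return time_to_ice_start <= time_to_light_off       # heat now, otherwise keep waiting

print(ehc_heating_command(time_to_ice_start=60.0, cat_temp=80.0))    # True: 70 s of heating still needed
print(ehc_heating_command(time_to_ice_start=300.0, cat_temp=80.0))   # False: too early to start heating
```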

Relevance:

80.00%

Publisher:

Abstract:

The focus of the thesis is the application of different attitude determination algorithms to data acquired with MEMS sensors mounted on a board provided by the University of Bologna. MEMS sensors are a very cheap option for measuring acceleration and angular velocity. Magnetometers based on the Hall effect can provide further data. The disadvantage is that these sensors suffer from significant noise and drift, which can affect the results. The algorithms that have been used are: pitch and roll from the accelerometer, yaw from the magnetometer, attitude from the gyroscope, TRIAD, QUEST, Madgwick, Mahony, the Extended Kalman Filter and a GPS-aided INS Kalman filter. In this work the algorithms have been rewritten to fit the data provided by the MEMS sensors. The data collected by the board are the accelerations on the three axes, the angular velocities on the three axes, the magnetic field on the three axes, and latitude, longitude and altitude from the GPS. Several tests and comparisons have been carried out by installing the electronic board on different vehicles operating in the air and on the ground. The conclusion that can be drawn from this study is that the Madgwick filter is the best trade-off between the computational capabilities required and the results obtained. If attitude angles are obtained from accelerometers, gyroscopes and magnetometers alone, inconsistent data are obtained in cases where high vibration levels are present. On the other hand, Kalman-filter-based algorithms require a high computational burden, while the TRIAD and QUEST algorithms do not perform as well as the filters.
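
For reference, the simplest estimators in the list above can be written in a few lines (a generic sketch under common conventions; axis signs and frames depend on the board mounting and are assumed here):

```python
import numpy as np

def pitch_roll_from_accel(ax, ay, az):
    """Static attitude from the gravity vector measured by the accelerometer (any unit)."""
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.sqrt(ay**2 + az**2))
    return roll, pitch

def yaw_from_mag(mx, my, mz, roll, pitch):
    """Tilt-compensated heading from the magnetometer, using roll/pitch from above."""
    mx_h = mx * np.cos(pitch) + mz * np.sin(pitch)
    my_h = (mx * np.sin(roll) * np.sin(pitch) + my * np.cos(roll)
            - mz * np.sin(roll) * np.cos(pitch))
    return np.arctan2(-my_h, mx_h)

roll, pitch = pitch_roll_from_accel(0.0, 0.0, 9.81)     # board level: roll = pitch = 0
print(np.degrees([roll, pitch, yaw_from_mag(0.2, 0.0, 0.4, roll, pitch)]))
```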

Relevance:

40.00%

Publisher:

Abstract:

Recent years have witnessed increased development of small, autonomous fixed-wing Unmanned Aerial Vehicles (UAVs). In order to unlock the widespread applicability of these platforms, they need to be capable of operating under a variety of environmental conditions. Due to their small size, low weight, and low speeds, they require the capability of coping with wind speeds that approach or even exceed the nominal airspeed. In this thesis, a nonlinear-geometric guidance strategy addressing this problem is presented. More broadly, a methodology is proposed for the high-level control of non-holonomic unicycle-like vehicles in the presence of strong flowfields (e.g. winds, underwater currents) which may exceed the maximum vehicle speed. The proposed strategy guarantees convergence to a safe and stable vehicle configuration with respect to the flowfield, while preserving some tracking performance with respect to the target path. As an alternative approach, an algorithm based on Model Predictive Control (MPC) is developed, and a comparison between the advantages and disadvantages of both approaches is drawn. Evaluations in simulation and a challenging real-world flight experiment in very windy conditions confirm the feasibility of the proposed guidance approach.
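
The core geometric issue can be illustrated with a small sketch (an assumption-laden illustration, not the thesis controller): given the wind vector and the airspeed, check whether a desired ground course is trackable and, if so, compute the crab heading that realizes it.

```python
import numpy as np

def heading_for_course(course, airspeed, wind):
    """Return the air-relative heading that makes the ground velocity point along
    `course` (rad), given wind = (wx, wy) in the same frame, plus the resulting
    ground speed. Returns None when the crosswind exceeds the airspeed and the
    course is not trackable."""
    wx, wy = wind
    w_perp = -wx * np.sin(course) + wy * np.cos(course)   # wind component across the desired course
    if abs(w_perp) > airspeed:
        return None                        # flowfield stronger than the vehicle: fall back to a safe mode
    correction = np.arcsin(w_perp / airspeed)
    heading = course - correction          # crab into the wind to cancel the cross-track drift
    w_along = wx * np.cos(course) + wy * np.sin(course)
    ground_speed = airspeed * np.cos(correction) + w_along
    return heading, ground_speed

print(heading_for_course(course=0.0, airspeed=12.0, wind=(-4.0, 6.0)))
print(heading_for_course(course=0.0, airspeed=12.0, wind=(0.0, 15.0)))   # None: course not trackable
```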

Relevance:

40.00%

Publisher:

Abstract:

The main objective of my thesis work is to exploit Kubeflow, the Google-born, open-source ML platform, and specifically Kubeflow Pipelines, to execute a scalable Federated Learning (FL) process in a simplified 5G-like test architecture hosting a Kubernetes cluster, applying the widely adopted FedAVG algorithm and its optimization FedProx, supported by the platform's ability to ease the development and production cycle of this specific FL process. FL algorithms are increasingly promising and adopted both in cloud application development and in 5G communication enhancement, where data coming from the monitoring of the underlying telco infrastructure are used for training and data aggregation at edge nodes in order to optimize the global model (which could be used, for example, for resource provisioning to meet an agreed QoS for the underlying network slice). After a study of the available papers and scientific articles related to FL, carried out with the help of the CTTC, which suggested that I study and use Kubeflow to run the algorithm, we found that this approach to deploying the whole FL cycle was not documented and might be worth investigating in more depth. This study may help prove the efficiency of the Kubeflow platform itself for the development of new FL algorithms that will support new applications, and in particular it tests the performance of the FedAVG algorithm in a simulated client-to-cloud communication, using the MNIST dataset as an FL benchmark.
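
The aggregation step at the heart of FedAVG is simple enough to show directly (a NumPy sketch of the algorithm itself, not of the Kubeflow pipeline components used in the thesis; layer shapes and sample counts are toy values). FedProx changes the client-side objective by adding a proximal term, while the server-side aggregation shown here stays the same.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """FedAvg aggregation: the global model is the average of the client models,
    weighted by the number of local training samples held by each client."""
    total = sum(client_sizes)
    layers = len(client_weights[0])
    return [sum(w[l] * (n / total) for w, n in zip(client_weights, client_sizes))
            for l in range(layers)]

# Two clients, one weight matrix and one bias vector each (toy values).
client_a = [np.array([[1.0, 2.0], [3.0, 4.0]]), np.array([0.1, 0.2])]
client_b = [np.array([[3.0, 2.0], [1.0, 0.0]]), np.array([0.3, 0.0])]
global_model = fedavg([client_a, client_b], client_sizes=[600, 400])
print(global_model[0])   # element-wise 0.6*A + 0.4*B
```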

Relevance:

30.00%

Publisher:

Abstract:

Magnetic Resonance Spectroscopy (MRS) is an advanced clinical and research application which provides a specific biochemical and metabolic characterization of tissues through the detection and quantification of key metabolites for diagnosis and disease staging. The "Associazione Italiana di Fisica Medica (AIFM)" has promoted the activity of the "Interconfronto di spettroscopia in RM" working group. The purpose of the study is to compare and analyze the results obtained by performing MRS on scanners from different manufacturers, in order to compile a robust protocol for spectroscopic examinations in clinical routine. This thesis contributes to the project by using the GE Signa HDxt 1.5 T scanner at Pavilion no. 11 of the S.Orsola-Malpighi hospital in Bologna. The spectral analyses have been performed with the jMRUI package, which includes a wide range of preprocessing and quantification algorithms for signal analysis in the time domain. After quality assurance on the scanner with standard and innovative methods, spectra with and without suppression of the water peak have been acquired on the GE test phantom. The comparison of the ratios of the metabolite amplitudes over Creatine computed by the workstation software, which works in the frequency domain, and by jMRUI shows good agreement, suggesting that quantification in either domain may lead to consistent results. The characterization of an in-house phantom provided by the working group has achieved its goal of assessing the solution content and the metabolite concentrations with good accuracy. The soundness of the experimental procedure and data analysis has been demonstrated by the correct estimation of the T2 of water, the observed biexponential relaxation curve of Creatine and the correct TE value at which modulation by J-coupling causes the Lactate doublet to be inverted in the spectrum. The work of this thesis has demonstrated that it is possible to perform measurements and establish protocols for data analysis, based on the physical principles of NMR, which are able to provide robust values for the spectral parameters of clinical use.
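
As an example of the kind of analysis involved, a mono-exponential T2 estimate from multi-echo signal amplitudes can be sketched as follows (synthetic data; the echo times, noise level and true T2 are hypothetical, and the biexponential behaviour observed for Creatine would require a second decay term):

```python
import numpy as np
from scipy.optimize import curve_fit

def t2_decay(te, s0, t2):
    """Mono-exponential transverse relaxation: S(TE) = S0 * exp(-TE / T2)."""
    return s0 * np.exp(-te / t2)

# Synthetic multi-echo amplitudes (arbitrary units) with a true T2 of 80 ms plus noise.
te = np.array([30.0, 60.0, 90.0, 120.0, 150.0, 200.0])   # echo times [ms]
rng = np.random.default_rng(0)
signal = t2_decay(te, s0=100.0, t2=80.0) + rng.normal(0.0, 1.0, te.size)

popt, _ = curve_fit(t2_decay, te, signal, p0=[90.0, 50.0])
print(f"estimated S0 = {popt[0]:.1f}, T2 = {popt[1]:.1f} ms")
```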

Relevance:

30.00%

Publisher:

Abstract:

Nowadays communication is switching from a centralized scenario, where media such as newspapers, radio and TV programs produce information and people are mere consumers, to a completely different decentralized scenario, where everyone is potentially an information producer through social networks, blogs and forums that allow real-time worldwide information exchange. These new instruments, as a result of their widespread diffusion, have started playing an important socio-economic role. They are the most used communication media and, as a consequence, they constitute the main source of information that enterprises, political parties and other organizations can rely on. Analyzing data stored in servers all over the world is feasible by means of Text Mining techniques such as Sentiment Analysis, which aims to extract opinions from huge amounts of unstructured text. This can be used to determine, for instance, the degree of user satisfaction with products, services, politicians and so on. In this context, this dissertation presents new Document Sentiment Classification methods based on the mathematical theory of Markov Chains. All these approaches rely on a Markov Chain based model, which is language independent and whose key features are simplicity and generality, making it attractive with respect to previous, more sophisticated techniques. Every discussed technique has been tested in both Single-Domain and Cross-Domain Sentiment Classification settings, comparing its performance with that of two previous works. The performed analysis shows that some of the examined algorithms produce results comparable with the best methods in the literature, with reference to both single-domain and cross-domain tasks, in 2-class (i.e. positive vs. negative) Document Sentiment Classification. However, there is still room for improvement: this work also indicates how performance could be enhanced, namely that a good novel feature selection process would be enough to outperform the state of the art. Furthermore, since some of the proposed approaches show promising results in 2-class Single-Domain Sentiment Classification, future work will also address validating these results in tasks with more than two classes.
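
A minimal version of such a classifier can be sketched as follows (a toy illustration in the spirit of the approach, not the exact model of the dissertation): one word-transition model is estimated per class, and a document is assigned to the class giving the highest log-likelihood of its word sequence.

```python
from collections import defaultdict
import math

def train(docs):
    """Estimate word-transition counts from a list of tokenized documents."""
    counts, totals = defaultdict(lambda: defaultdict(int)), defaultdict(int)
    for doc in docs:
        for prev, cur in zip(doc, doc[1:]):
            counts[prev][cur] += 1
            totals[prev] += 1
    return counts, totals

def log_likelihood(doc, model, vocab_size, alpha=1.0):
    """Log-probability of the word sequence under a class model (add-alpha smoothing)."""
    counts, totals = model
    ll = 0.0
    for prev, cur in zip(doc, doc[1:]):
        ll += math.log((counts[prev][cur] + alpha) / (totals[prev] + alpha * vocab_size))
    return ll

pos = train([["great", "movie", "really", "great"], ["really", "good", "movie"]])
neg = train([["bad", "movie", "really", "bad"], ["really", "boring", "movie"]])
vocab = {"great", "movie", "really", "good", "bad", "boring"}

doc = ["really", "great", "movie"]
label = "positive" if log_likelihood(doc, pos, len(vocab)) > log_likelihood(doc, neg, len(vocab)) else "negative"
print(label)   # positive
```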

Relevance:

30.00%

Publisher:

Abstract:

The problem of localizing a scatterer, which represents a tumor, in a homogeneous circular domain, which represents a breast, is addressed. A breast imaging method based on microwaves is considered. Microwave imaging involves several techniques for detecting, localizing and characterizing tumors in breast tissue, and all such methods involve an electromagnetic inverse scattering problem. For scatterer detection, an algorithm based on a linear solution procedure, inspired by the MUltiple SIgnal Classification (MUSIC) algorithm and the Time Reversal (TR) method, is implemented. The algorithm returns a reconstructed image of the investigation domain, called a pseudospectrum, in which the scatterer position is detected. A preliminary performance analysis of the algorithm varying the working frequency is performed: the resolution and the signal-to-noise ratio of the pseudospectra improve if a multi-frequency approach is considered. The Geometrical Mean-MUSIC algorithm (GM-MUSIC) is proposed as a multi-frequency method. The performance of GM-MUSIC is tested in different realistic computer simulations. The performed analysis shows that the algorithm detects the scatterer as long as the electrical parameters of the breast are known. This is an evident limitation, since, in a real-life situation, the anatomy of the breast is unknown. An improvement of GM-MUSIC is proposed: the Eye-GMMUSIC algorithm. Eye-GMMUSIC needs no a priori information on the electrical parameters of the breast. It is an optimization algorithm based on pattern search: it searches for the breast parameters that minimize the Signal-to-Clutter Mean Ratio (SCMR) in the signal. Finally, the GM-MUSIC and Eye-GMMUSIC algorithms are tested on a microwave breast cancer detection system consisting of a dipole antenna, a Vector Network Analyzer and a novel breast phantom built at the University of Bologna. The reconstruction of the experimental data confirms the ability of GM-MUSIC to localize a scatterer in a homogeneous medium.
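
The pseudospectrum construction at the core of the method can be illustrated with a generic single-frequency MUSIC sketch (the geometry, the free-space-like steering vector and the noise level are hypothetical simplifications, not the thesis implementation): the pseudospectrum peaks where the steering vector of a trial point is orthogonal to the noise subspace of the multistatic data matrix.

```python
import numpy as np

# Hypothetical 2-D setup: antennas on a circle around a homogeneous breast-like domain.
rng = np.random.default_rng(0)
wavelength = 0.04
k = 2 * np.pi / wavelength
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
antennas = 0.08 * np.stack([np.cos(angles), np.sin(angles)], axis=1)   # 8 antennas on an 8 cm radius
scatterer = np.array([0.02, 0.01])                                     # true scatterer position [m]

def steering(point):
    """Free-space-like steering vector from every antenna to a trial point."""
    d = np.linalg.norm(antennas - point, axis=1)
    return np.exp(-1j * k * d) / d

# Multistatic data matrix for a single point scatterer plus a little noise.
g = steering(scatterer)
K = np.outer(g, g) + 0.01 * (rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8)))

# Noise subspace from the SVD; one dominant singular value for one scatterer.
U, s, _ = np.linalg.svd(K)
U_noise = U[:, 1:]

def pseudospectrum(point):
    """MUSIC pseudospectrum: large where the steering vector lies in the signal subspace."""
    a = steering(point)
    a = a / np.linalg.norm(a)
    return 1.0 / np.linalg.norm(U_noise.conj().T @ a) ** 2

xs = np.linspace(-0.06, 0.06, 61)
grid = np.array([[pseudospectrum(np.array([x, y])) for x in xs] for y in xs])
iy, ix = np.unravel_index(np.argmax(grid), grid.shape)
print(f"estimated scatterer position: ({xs[ix]:.3f}, {xs[iy]:.3f}) m")   # close to (0.020, 0.010)
```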

Relevance:

30.00%

Publisher:

Abstract:

Streaming is a technique for delivering multimedia content over the global network, used for example by services such as YouTube and Netflix; after a short wait, during which a safety buffer is filled, the user can enjoy the requested content. Cisco and Sandvine, which regularly publish reports on the state of the Internet, state that video streaming has, and will increasingly have, a large impact on the global network. Good design of streaming applications therefore plays an important role, both for user satisfaction and for the stability of the infrastructure. HTTP Adaptive Streaming denotes a family of implementations aimed at offering the best possible video quality (in terms of bit rate) as a function of the quality of the end user's Internet connection: the media player can change the bit rate at any moment, choosing it from a predefined set and adapting to network conditions. To obtain information about the state of the connection, two families of methods are possible: measuring the download speed of previous transfers (rate-based approach) or, as recently proposed by Netflix, using buffer occupancy as the main input (buffer-based approach). In this work we analyse adaptation algorithms from both families, with the goal of comparing them on metrics concerning user satisfaction, network utilization and competition over a bottleneck link. The results of our tests do not identify a clear winner; they acknowledge the merit of the new proposal, but also show that buffer-based algorithms do not always manage to share network resources fairly.
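
A buffer-based adaptation rule can be sketched in a few lines (an illustration in the spirit of the buffer-based approach, with hypothetical reservoir/cushion thresholds, not the exact algorithms compared in the thesis): the selected bit rate is a piecewise-linear function of buffer occupancy, independent of the measured throughput.

```python
BITRATES = [350, 800, 1500, 3000, 6000]      # available video bit rates [kbit/s]

def buffer_based_bitrate(buffer_s, reservoir=10.0, cushion=40.0):
    """Map buffer occupancy [s] to a bit rate: lowest rate below the reservoir,
    highest above reservoir + cushion, linear interpolation in between."""
    if buffer_s <= reservoir:
        return BITRATES[0]
    if buffer_s >= reservoir + cushion:
        return BITRATES[-1]
    frac = (buffer_s - reservoir) / cushion                   # position inside the cushion, 0..1
    target = BITRATES[0] + frac * (BITRATES[-1] - BITRATES[0])
    return max(r for r in BITRATES if r <= target)            # highest available rate not above the target

for b in (5, 15, 30, 55):
    print(b, "s of buffer ->", buffer_based_bitrate(b), "kbit/s")
```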