16 results for L-systems
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The thesis inquires into the issue of innovation and of organizational and institutional change in public administration, with regard to the increasingly widespread adoption of participatory devices and practices in various arenas of public policy. The field of reference concerns transformations in the types of public action and regulation systems, i.e. governance. Together with the crisis of the public function and of the role played by institutions, what is emerging are different levels of government, both in a supranational and a local direction, and a plurality of social interlocutors, followed by a post-bureaucratic pattern of public administration that is opening itself towards the environment and citizens. The public administration is no longer considered an inert object within the bureaucratic paradigm, but as a series of communicative processes, choices, cultures and practices that actively builds itself and the environment it interacts with. Therefore, the output of the public administration is not simply the service supplied but the relationship enacted with the citizen, a relationship that becomes the constituent basis of administrative processes. The intention of the thesis is to consider the relation between innovation in public administration and participatory experimentations and implementations, regarded as exchanges in which citizens and the public administration hold talks and debates. The issue of the organizational change of the public administration, as output and effect of inclusive deliberative practices, has been analysed starting from an institutionalist approach, in other words by examining the constituent features of institutions, “rediscovering” them with regard to their public nature, their ability to elaborate collective values and meanings, and the social definition of problems and solutions.
The participatory device employed by the Forlì city council, which involved enterprises and cultural associations of the area in order to build a participatory Table, has been studied through a qualitative methodology (participant observation and semi-structured interviews). The analysis inquired into the public nature of both the participatory device and the administrative action itself, as well as into elements pertaining to the deliberative setting, the regulative reference framework and the actors that took part in the process.
Abstract:
In recent years, due to the rapid convergence of multimedia services, the Internet and wireless communications, there has been a growing trend towards heterogeneity (in terms of channel bandwidths, mobility levels of terminals, and end-user quality-of-service (QoS) requirements) in emerging integrated wired/wireless networks. Moreover, in today's systems a multitude of users coexists within the same network, each with its own QoS requirements and bandwidth availability. In this framework, embedded source coding, which allows partial decoding at various resolutions, is an appealing technique for multimedia transmission. This dissertation covers my PhD research, mainly devoted to the study of embedded multimedia bitstreams in heterogeneous networks, developed at the University of Bologna, advised by Prof. O. Andrisano and Prof. A. Conti, and at the University of California, San Diego (UCSD), where I spent eighteen months as a visiting scholar, advised by Prof. L. B. Milstein and Prof. P. C. Cosman. In order to improve multimedia transmission quality over wireless channels, joint source and channel coding optimization is investigated in a 2D time-frequency resource block for an OFDM system. We show that knowing the order of diversity in the time and/or frequency domain can assist image (video) coding in selecting optimal channel code rates (source and channel code rates). Then, adaptive modulation techniques, aimed at maximizing spectral efficiency, are investigated as another possible solution for improving multimedia transmission. For both slow and fast adaptive modulation, the effects of imperfect channel estimation are evaluated, showing that the fast technique, optimal in ideal systems, may be outperformed by slow adaptive modulation when a real test case is considered.
Finally, the effects of co-channel interference and of approximated bit error probability (BEP) expressions are evaluated for adaptive modulation techniques, providing new decision-region concepts and showing how the widely used BEP approximations lead to a substantial loss in overall performance.
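The rate-selection logic behind slow adaptive modulation can be sketched with the widely used exponential M-QAM BEP approximation that abstracts like this one refer to. The constellation set and target BEP below are illustrative choices, not values from the thesis:

```python
import math

def bep_qam_approx(snr_linear: float, M: int) -> float:
    # Widely used M-QAM BEP approximation: ~0.2 * exp(-1.5 * SNR / (M - 1))
    return 0.2 * math.exp(-1.5 * snr_linear / (M - 1))

def select_modulation(snr_db: float, target_bep: float = 1e-3):
    # Slow adaptive modulation: pick the largest constellation whose
    # approximated BEP still meets the target at the current average SNR.
    snr = 10 ** (snr_db / 10)
    best = None
    for M in (4, 16, 64, 256):
        if bep_qam_approx(snr, M) <= target_bep:
            best = M
    return best  # None: even QPSK cannot meet the target
```

The looseness of such approximations shifts the switching thresholds between constellations, which is exactly where exact decision regions can recover performance.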
Abstract:
Investigation of impulsive signals originated by Partial Discharge (PD) phenomena represents an effective tool for preventing electrical failures in High Voltage (HV) and Medium Voltage (MV) systems. The determination of both sensor and instrument bandwidths is the key to achieving meaningful measurements, that is to say, obtaining the maximum Signal-to-Noise Ratio (SNR). The optimum bandwidth depends on the characteristics of the system under test, which can often be represented as a transmission line characterized by signal attenuation and dispersion phenomena. It is therefore necessary to develop both models and techniques which can accurately characterize the PD propagation mechanisms in each system and work out the frequency characteristics of the PD pulses at the detection point, in order to design sensors able to carry out on-line PD measurements with maximum SNR. Analytical models will be devised in order to predict PD propagation in MV apparatuses. Furthermore, simulation tools will be used where complex geometries make analytical models unfeasible. In particular, PD propagation in MV cables, transformers and switchgears will be investigated, taking into account both radiated and conducted signals associated with PD events, in order to design suitable sensors.
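The attenuation-and-dispersion view of the propagation path can be sketched as a frequency-domain transfer function applied to a PD pulse. The skin-effect coefficient and propagation velocity below are illustrative placeholders, not values from the thesis:

```python
import numpy as np

def propagate_pd_pulse(x, fs, length_m, alpha0=1e-6, v=1.5e8):
    # Lossy, dispersive line in the frequency domain:
    # H(f) = exp(-(alpha(f) + j*beta(f)) * L), with
    # alpha(f) = alpha0*sqrt(|f|) mimicking skin-effect attenuation.
    n = len(x)
    f = np.fft.fftfreq(n, d=1 / fs)
    alpha = alpha0 * np.sqrt(np.abs(f))
    beta = 2 * np.pi * f / v
    H = np.exp(-(alpha + 1j * beta) * length_m)
    return np.fft.ifft(np.fft.fft(x) * H).real

# A narrow Gaussian pulse attenuates, broadens and is delayed with distance,
# which is why the optimum detection bandwidth depends on the cable length.
fs = 1e9
t = np.arange(2048) / fs
pulse = np.exp(-((t - 200e-9) / 10e-9) ** 2)
out = propagate_pd_pulse(pulse, fs, length_m=100.0)
```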
Abstract:
Synchronization is a key issue in any communication system, but it becomes fundamental in navigation systems, which are entirely based on the estimation of the time delay of the signals coming from the satellites. Thus, even though synchronization has been a well-known topic for many years, the introduction of new modulations and new physical-layer techniques in modern standards makes traditional synchronization strategies completely ineffective. For this reason, the design of advanced and innovative synchronization techniques for modern communication systems, such as DVB-SH, DVB-T2, DVB-RCS, WiMAX and LTE, and for modern navigation systems, such as Galileo, has been the topic of this activity. Recent years have seen the consolidation of two different trends: the introduction of Orthogonal Frequency Division Multiplexing (OFDM) in communication systems, and of the Binary Offset Carrier (BOC) modulation in modern Global Navigation Satellite Systems (GNSS). Thus, particular attention has been given to the investigation of synchronization algorithms in these two areas.
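The reason BOC breaks traditional code synchronization can be seen directly in the autocorrelation of a BOC(1,1) waveform, whose sharp main peak is flanked by large negative side peaks a conventional tracking loop can lock onto. The random spreading code below is an illustrative stand-in for a real GNSS ranging code:

```python
import numpy as np

def boc11_waveform(chips, samples_per_chip=8):
    # BOC(1,1): each spreading chip is multiplied by one full period of a
    # square-wave subcarrier (+1 over the first half, -1 over the second).
    sub = np.where(np.arange(samples_per_chip) < samples_per_chip // 2, 1.0, -1.0)
    return np.repeat(chips, samples_per_chip) * np.tile(sub, len(chips))

rng = np.random.default_rng(0)
chips = rng.choice([-1.0, 1.0], size=1023)
s = boc11_waveform(chips)

# Normalized circular autocorrelation via FFT: the side peak near half a
# chip of lag (~ -0.5) is the ambiguity that BOC-aware synchronizers solve.
S = np.fft.fft(s)
r = np.fft.ifft(S * np.conj(S)).real / len(s)
```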
Abstract:
This thesis presents the outcomes of a Ph.D. course in telecommunications engineering. It is focused on the optimization of the physical layer of digital communication systems and provides innovations for both multi- and single-carrier systems. For the former type we first addressed the problem of capacity in the presence of several impairments. Moreover, we extended the concept of the Single Frequency Network to the satellite scenario, and then introduced a novel concept in subcarrier data mapping, resulting in a very low PAPR of the OFDM signal. For single-carrier systems we proposed a method to optimize constellation design in the presence of strong distortion, such as the nonlinear distortion introduced by a satellite's on-board high-power amplifier; we then developed a method to calculate the bit/symbol error rate of a given constellation, achieving improved accuracy with respect to the traditional Union Bound at no additional complexity. Finally, we designed a low-complexity SNR estimator which saves one half of the multiplications with respect to the ML estimator while achieving similar estimation accuracy.
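The complexity/accuracy trade-off in SNR estimation can be illustrated with the classic moment-based M2M4 estimator, shown here only as a low-complexity textbook baseline, not the estimator developed in the thesis:

```python
import numpy as np

def snr_m2m4(y):
    # Moment-based (M2M4) SNR estimator for a constant-modulus signal in
    # complex AWGN: with M2 = E|y|^2 and M4 = E|y|^4,
    # S_hat = sqrt(2*M2^2 - M4) and N_hat = M2 - S_hat.
    m2 = np.mean(np.abs(y) ** 2)
    m4 = np.mean(np.abs(y) ** 4)
    s = np.sqrt(max(2 * m2 ** 2 - m4, 0.0))
    n = max(m2 - s, 1e-12)
    return 10 * np.log10(s / n)

# Synthetic check: unit-power QPSK symbols at a true SNR of 10 dB
rng = np.random.default_rng(1)
true_snr_db = 10.0
snr = 10 ** (true_snr_db / 10)
sym = rng.choice([1, -1, 1j, -1j], size=200_000)
noise = (rng.standard_normal(200_000) + 1j * rng.standard_normal(200_000)) / np.sqrt(2 * snr)
est = snr_m2m4(sym + noise)
```

Unlike data-aided ML estimation, this needs no knowledge of the transmitted symbols, at the price of a larger variance at high SNR.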
Abstract:
This work describes the development of a simulation tool which allows the simulation of the Internal Combustion Engine (ICE), the transmission and the vehicle dynamics. It is a control-oriented simulation tool, designed to perform both off-line (Software In the Loop) and on-line (Hardware In the Loop) simulation. In the first case the simulation tool can be used to optimize Engine Control Unit strategies (regarding, for example, fuel consumption or engine performance), while in the second case it can be used to test the control system. In recent years the use of HIL simulations has proved to be very useful in the development and testing of control systems. Hardware In the Loop simulation is a technology where the actual vehicles, engines or other components are replaced by a real-time simulation, based on a mathematical model and running on a real-time processor. The processor reads the ECU (Engine Control Unit) output signals which would normally feed the actuators and, by using mathematical models, provides the signals which would be produced by the actual sensors. The simulation tool, fully designed within Simulink, includes the possibility of simulating the engine alone, the transmission and vehicle dynamics alone, or the engine together with the transmission and vehicle dynamics, allowing in the latter case the evaluation of the performance and operating conditions of the Internal Combustion Engine once it is installed on a given vehicle. Furthermore, the simulation tool offers different levels of complexity, since it is possible to use, for example, either a zero-dimensional or a one-dimensional model of the intake system (the latter only for off-line applications, because of the higher computational effort).
Given these preliminary remarks, an important goal of this work is the development of a simulation environment that can be easily adapted to different engine types (single- or multi-cylinder, four-stroke or two-stroke, diesel or gasoline) and transmission architectures without reprogramming. Also, the same simulation tool can be rapidly configured for both off-line and real-time applications. The Matlab-Simulink environment has been adopted to achieve these objectives, since its graphical programming interface allows building flexible and reconfigurable models, and real-time simulation is possible with standard, off-the-shelf software and hardware platforms (such as dSPACE systems).
Abstract:
This thesis proposes design methods and test tools for optical systems intended for use in an industrial environment, where not only precision and reliability but also ease of use is important. The approach has been conceived to be as general as possible, although in the present work the design of a portable device for automatic identification applications has been studied, since this doctorate was funded by Datalogic Scanning Group s.r.l., a world-class producer of barcode readers. The main functional components of the complete device are the electro-optical imaging, illumination and pattern generator systems. For the electro-optical imaging system, a characterization tool and an analysis tool have been developed to check whether the desired performance of the system has been achieved. Moreover, two design tools for optimizing the imaging system have been implemented. The first optimizes just the core of the system, the optical part, improving its performance while ignoring all other contributions and generating a good starting point for the optimization of the whole complex system. The second tool optimizes the system taking into account its behaviour, with a model as close as possible to reality, including optics, electronics and detection. For the illumination and pattern generator systems, two tools have been implemented. The first allows the design of free-form lenses described by an arbitrary analytical function, excited by an incoherent source, and is able to provide custom illumination conditions for all kinds of applications. The second tool consists of a new method to design Diffractive Optical Elements excited by a coherent source for large pattern angles, using the Iterative Fourier Transform Algorithm. The design tools have been validated, wherever possible, by comparing the performance of the designed systems with that of fabricated prototypes; in the other cases simulations have been used.
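The Iterative Fourier Transform Algorithm mentioned for DOE design can be sketched in its basic Gerchberg-Saxton form: alternate between the element plane (unit amplitude, free phase) and the far field (target amplitude, free phase). The tiny spot pattern below is illustrative; the thesis extends the method to large pattern angles, which this paraxial sketch does not cover:

```python
import numpy as np

def ifta_phase(target_intensity, iterations=50, seed=0):
    # Find a phase-only element whose far field (here: a plain 2D FFT)
    # approximates a target intensity pattern.
    rng = np.random.default_rng(seed)
    target_amp = np.sqrt(target_intensity)
    phase = rng.uniform(0, 2 * np.pi, target_amp.shape)
    for _ in range(iterations):
        far = np.fft.fft2(np.exp(1j * phase))          # propagate to far field
        far = target_amp * np.exp(1j * np.angle(far))  # impose target amplitude
        near = np.fft.ifft2(far)                       # back-propagate
        phase = np.angle(near)                         # keep phase only (phase-only DOE)
    return phase

# Three-spot target: most of the diffracted energy ends up in the spots
target = np.zeros((32, 32))
target[8, 8] = target[8, 24] = target[24, 16] = 1.0
ph = ifta_phase(target)
recon = np.abs(np.fft.fft2(np.exp(1j * ph))) ** 2
recon /= recon.sum()
```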
Abstract:
In recent years, the energy question has taken on a central role in the global debate, driven by four main factors: the non-reproducibility of natural resources, the exponential growth of consumption, economic interests, and the safeguarding of our planet's environmental and climatic equilibrium. It is therefore necessary to change the model of energy production and consumption, especially in cities, where energy consumption is most concentrated. For these reasons, recourse to Renewable Energy Sources (RES) has become a necessary, appropriate and urgent measure in urban planning as well. To improve the overall energy performance of the city system, policies governing urban transformation must move beyond a “building-centric” operational logic and encompass, beyond the single building, aggregations of buildings and their relations/interactions in terms of material and energy inputs and outputs. A wholesale replacement of the existing building stock with new hyper-technological buildings is not feasible. How, then, can planning regulations and practice be redefined to generate energy-efficient building fabrics? This research proposes the integration of the nascent energy planning of the territory with the more consolidated urban planning regulations, in order to generate “energy saving” urban fabrics that add the energy and environmental performance of the context to that of the individual buildings, within an overall energy balance. After describing and comparing the main RES available today, the study suggests a methodology for a preliminary assessment of the mix of technologies and RES best suited to each site configured as an “energy district”.
The results of this process provide the basic elements for preparing the actions needed to integrate energy matters into urban plans, through the application of equalization principles in the definition of performance requirements at the settlement scale, which are indispensable for a correct transition to the design of urban “objects” and “systems”.
Abstract:
The research project falls within the field of judicial informatics, the sector that studies the information systems implemented in judicial offices with the aim of improving the efficiency of the service and providing a lever for reducing lengthy trial times, with the ultimate goal of better guaranteeing the rights of citizens and increasing the country's competitiveness. The specific object of study of the project is the use of ICT in criminal proceedings. This is a less studied area than civil proceedings, yet the efficiency crisis is no less acute there: the backlog as of 30 June 2011 was quantified at 3.4 million criminal cases, with an average resolution time of four years and nine months. Looking at the criminal trial through the eyes of information systems design means seeing an uninterrupted flow of information that includes realities located upstream and downstream of the trial itself: from the transmission of the crime report to the execution of the sentence. In this perspective, the importance of correct information management becomes evident: the quantity and accuracy of information, and the speed of access to it, are factors so crucial to criminal proceedings that the efficiency of the information system and the quality of the justice delivered are strongly interrelated. The research project aims to identify the conditions under which efficiency can actually be achieved and, above all, to verify which technological choices can preserve, or even strengthen, the principles and guarantees of criminal procedure.
Criminal proceedings, in fact, involve fundamental rights of the individual such as personal liberty, dignity and privacy, fundamental rights that are protected through a wide range of procedural rights such as the presumption of innocence, the right of defence, the right to an adversarial hearing, and the rehabilitative purpose of punishment.
Abstract:
We have modelled various soft-matter systems with molecular dynamics (MD) simulations. The first topic concerns liquid crystal (LC) biaxial nematic (Nb) phases, which can potentially be used in fast displays. We investigated the phase organization of biaxial Gay-Berne (GB) mesogens, considering the effects of the orientation, strength and position of a molecular dipole. We observed that for systems with a central dipole, biaxial nematic phases disappear as the dipole strength increases, while for systems characterized by an offset dipole the Nb phase is stabilized at very low temperatures. In a second project, in view of their increasing importance as nanomaterials in LC phases, we are developing a DNA coarse-grained (CG) model in which sugar and phosphate groups are represented by Lennard-Jones spheres and bases by GB ellipsoids. We obtained shape, position and orientation parameters for each bead so as to best reproduce the atomistic structure of a B-DNA helix. Starting from atomistic simulation results, we completed a first parametrization of the force-field terms, accounting for bonded (bonds, angles and dihedrals) and non-bonded interactions (H-bonding and stacking). We are currently validating the model by investigating the stability and melting temperature of various sequences. Finally, in a third project, we aim to explain the mechanism of enantiomeric discrimination due to the presence of a chiral helix of poly(gamma-benzyl L-glutamate) (PBLG) in dimethylformamide (DMF) solution, interacting with chiral or pro-chiral molecules (in our case heptyl butyrate, HEP), after properly tuning an atomistic force field (AMBER). We observed that DMF and HEP molecules solvate the PBLG helix uniformly, but the pro-chiral solute is on average found closer to the helix than the DMF. The solvent shows isotropic diffusion about twice as fast as that of HEP, which also indicates a stronger interaction of the solute with the helix.
Abstract:
Modern internal combustion engines are becoming increasingly complex. The introduction of the EURO VI emission regulations will require a significant reduction of exhaust pollutants; the most critical issue is the reduction of NOx for Diesel engines, on top of the reductions already in force under previous regulations. The calibration of a new engine typically involves a series of specific tests on the test bench. The ever-growing number of combustion control parameters, a consequence of the greater mechanical complexity of the engine itself, causes an exponential increase in the number of tests required to characterize the whole system. The goal of this doctoral project is to build a real-time combustion analysis system implementing several algorithms not yet present in modern engine control units, paying particular attention to the choice of the hardware on which to implement the analysis algorithms. The aim is to create a Rapid Control Prototyping (RCP) platform that exploits most of the sensors already fitted to a production vehicle, and that can shorten the time and cost of powertrain testing by reducing the need for a-posteriori analysis of previously acquired data in favour of a larger amount of computation performed in real time. The proposed solution guarantees upgradability, i.e. the possibility of keeping the computing platform at the highest technological level, staving off obsolescence and replacement costs. This property translates into the need to maintain compatibility between hardware and software of different generations, making it possible to replace the components that limit performance without redesigning the software.
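One example of the per-cycle combustion indices such a real-time analysis system computes from in-cylinder pressure is the Indicated Mean Effective Pressure (IMEP). The idealized rectangular p-V loop below is purely illustrative, not engine data:

```python
import numpy as np

def imep(pressure_pa, volume_m3, displacement_m3):
    # IMEP = (cyclic integral of p dV) / displaced volume
    work_j = np.trapz(pressure_pa, volume_m3)
    return work_j / displacement_m3

# Idealized cycle: expand at 20 bar, return at 1 bar over a 0.5 L displacement
V_up = np.linspace(5e-5, 5.5e-4, 100)   # m^3, expansion stroke
V_dn = np.linspace(5.5e-4, 5e-5, 100)   # m^3, return stroke
V = np.concatenate([V_up, V_dn])
p = np.concatenate([np.full(100, 2.0e6), np.full(100, 1.0e5)])  # Pa
imep_pa = imep(p, V, 5.0e-4)  # (2.0e6 - 1.0e5) = 1.9e6 Pa for this loop
```

In a real system the volume curve comes from the slider-crank geometry and the pressure from a crank-angle-synchronous acquisition, which is what makes the choice of real-time hardware critical.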
Abstract:
Neurorehabilitation is a process through which individuals affected by neurological diseases aim at achieving a full recovery or realizing their optimal physical, mental and social potential. Essential elements of effective rehabilitation are: clinical assessment by a multidisciplinary team, a targeted rehabilitation programme, and the evaluation of outcomes through scientifically sound and clinically appropriate measures. The main objective of this thesis was to develop quantitative methods and tools for the motor treatment and assessment of neurological patients. Conventional rehabilitation treatments require neurological patients to perform repetitive exercises, reducing their motivation; virtual reality and feedback can engage them in the treatment, while allowing repeatability and standardization of the protocols. A tool based on augmented feedback for trunk control was developed and evaluated. Moreover, virtual reality allows the treatment to be tailored to the patient's needs: a virtual application for gait rehabilitation was developed and tested in a training programme with multiple sclerosis patients, assessing its feasibility and acceptance and demonstrating the efficacy of the treatment. Quantitative assessment of patients' motor abilities is usually performed with motion capture systems; since their use in clinical practice is limited, a methodology based on inertial sensors was proposed to assess arm swing in Parkinsonian subjects. These sensors are small, accurate and flexible, but they accumulate errors over long measurements.
This problem was addressed, and the results suggest that if the sensor is placed on the foot and the accelerations are integrated starting from the mid-stance phase, the error and its consequences on the estimation of spatial gait parameters are limited. Finally, a validation of the Kinect sensor for gait tracking in a virtual environment was presented; preliminary results allow the scope of use of the sensor in rehabilitation to be defined.
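The mid-stance integration strategy can be sketched as a zero-velocity-update (ZUPT) double integration of the forward foot acceleration. This is a deliberately simplified 1-D version of a full inertial gait pipeline (no orientation tracking, synthetic data):

```python
import numpy as np

def stride_length_zupt(acc, fs, stance_idx):
    # Double-integrate acceleration between consecutive mid-stance instants,
    # when the foot is momentarily still, so v = 0 at each integration start.
    dt = 1.0 / fs
    strides = []
    for i0, i1 in zip(stance_idx[:-1], stance_idx[1:]):
        v = np.cumsum(acc[i0:i1]) * dt          # velocity, v(mid-stance) = 0
        v -= np.linspace(0, v[-1], len(v))      # de-drift: v must be 0 at next stance
        strides.append(np.sum(v) * dt)          # displacement over the stride
    return strides

# Synthetic forward velocity: two 1-s strides of 0.5 m each
fs = 100
t = np.arange(200) / fs
v_true = np.sin(np.pi * (t % 1.0)) ** 2
acc = np.gradient(v_true, 1 / fs)
strides_m = stride_length_zupt(acc, fs, [0, 100, 200])
```

Starting each integration at mid-stance bounds the drift to a single stride, which is why the per-stride error stays small even with low-cost sensors.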
Abstract:
The growing international concern for human exposure to magnetic fields generated by electric power lines has unavoidably led to the imposition of legal limits. Respecting these limits implies being able to calculate the generated magnetic field easily and accurately, also in complex configurations; the twisting of phase conductors is such a case. The consolidated exact and approximated theory for a single-circuit twisted three-phase power cable line is reported, along with the proposal of an innovative simplified formula obtained by means of a heuristic procedure. This formula, although dramatically simpler, is shown to be a good approximation of the analytical formula and at the same time much more accurate than the approximated formula found in the literature. The double-circuit twisted three-phase power cable line has been studied following different approaches of increasing complexity and accuracy; in this framework, the effectiveness of the above-mentioned innovative formula is also examined. The experimental verification of the correctness of the twisted double-circuit theoretical analysis has permitted its extension to multiple-circuit twisted three-phase power cable lines. In addition, appropriate 2D and, in particular, 3D numerical codes have been created for simulating real existing overhead power lines and calculating the magnetic field in their vicinity. Finally, an innovative 'smart' measurement and evaluation system for the magnetic field is proposed, described and validated; it deals with the experimentally based evaluation of the total magnetic field B generated by multiple sources in complex three-dimensional arrangements, carried out on the basis of the measurement of the three Cartesian field components and their correlation with the line currents via multilinear regression techniques. The ultimate goal is to verify that the magnetic induction intensity is within the prescribed limits.
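The basic building block of all these calculations is the 2D phasor superposition of the fields of long parallel conductors, B = mu0*I/(2*pi*r) per conductor. The trefoil geometry and currents below are illustrative, not from the thesis:

```python
import numpy as np

MU0 = 4e-7 * np.pi

def b_rms_three_phase(positions, currents_rms, point):
    # RMS flux density at `point` from long straight parallel conductors.
    # `currents_rms` are complex RMS phasors; the field of each conductor is
    # tangential: B = mu0*I/(2*pi*r) * (-dy, dx)/r.
    bx = by = 0j
    for (x, y), I in zip(positions, currents_rms):
        dx, dy = point[0] - x, point[1] - y
        r2 = dx ** 2 + dy ** 2
        k = MU0 * I / (2 * np.pi * r2)
        bx += -dy * k
        by += dx * k
    return np.sqrt(abs(bx) ** 2 + abs(by) ** 2)

# Balanced three-phase set at trefoil positions: the net current is zero,
# so the field decays roughly as 1/r^2 away from the bundle.
I = 100.0
phasors = [I, I * np.exp(-2j * np.pi / 3), I * np.exp(2j * np.pi / 3)]
pos = [(0.0, 0.0), (0.1, 0.0), (0.05, 0.0866)]
near = b_rms_three_phase(pos, phasors, (0.3, 0.0))
far = b_rms_three_phase(pos, phasors, (10.0, 0.0))
```

Twisting accelerates this cancellation along the line axis, which is exactly what the exact and simplified formulas of the thesis quantify.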
Abstract:
The monitoring of cognitive functions aims at gaining information about the current cognitive state of the user by decoding brain signals. In recent years, this approach has made it possible to acquire valuable information about the cognitive aspects of the interaction of humans with the external world. From this consideration, researchers started to consider passive applications of the brain-computer interface (BCI) in order to provide a novel input modality for technical systems based solely on brain activity. The objective of this thesis is to demonstrate how passive Brain-Computer Interface (BCI) applications can be used to assess the mental states of users, in order to improve human-machine interaction. Two main studies have been carried out. The first investigates whether morphological variations of Event-Related Potentials (ERPs) can be used to predict users' mental states (e.g. attentional resources, mental workload) during different reactive BCI tasks (e.g. P300-based BCIs), and whether this information can predict the subjects' performance in the tasks. In the second study, a passive BCI system able to estimate online the mental workload of the user, relying on the combination of EEG and ECG biosignals, has been proposed. The latter study was performed by simulating an operational scenario in which the occurrence of errors or a lack of performance could have significant consequences. The results showed that the proposed system can estimate the mental workload of the subjects online, discriminating three different task difficulty levels with high reliability.
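A minimal sketch of the kind of spectral EEG feature such a passive BCI relies on is the theta/alpha power ratio, a common workload proxy. This is not necessarily the exact feature set of the thesis, and the traces below are synthetic:

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    # Fraction of periodogram power falling in [f_lo, f_hi) Hz
    psd = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(len(x), 1 / fs)
    band = (f >= f_lo) & (f < f_hi)
    return psd[band].sum() / psd.sum()

def workload_index(eeg, fs):
    # Theta (4-8 Hz) over alpha (8-12 Hz) power: tends to rise with workload
    return band_power(eeg, fs, 4, 8) / band_power(eeg, fs, 8, 12)

# Synthetic traces: theta-dominated ("high load") vs alpha-dominated ("low load")
fs = 128
rng = np.random.default_rng(2)
t = np.arange(4 * fs) / fs
high_load = np.sin(2 * np.pi * 6 * t) + 0.1 * rng.standard_normal(len(t))
low_load = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))
```

An online system would compute such features on sliding windows and feed them, together with ECG-derived features, to a classifier that outputs the discrete workload level.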
Abstract:
This research presents a new integrated approach, in support of operators and designers, for managing the entire design process of energy and architectural retrofitting of the recent building stock, using innovative building-envelope technologies. The study necessarily requires the acquisition of a selected repertoire of envelope construction systems, as a starting point for developing retrofit design solutions for post-war schools built in reinforced concrete, mostly prefabricated. The project identifies eco-compatible construction processes for the design of prefabricated "active", adaptable and efficient envelope components for dry assembly, complying with the performance requirements of current regulations. The research addresses the management of the entire process, supported by geometric surveying systems linked to parametric programming software for modelling surfaces that adapt to the morphology of the buildings concerned. These computerized CAD-CAM tools are connected to CNC numerically controlled machines for the industrialized production of "made-to-measure" building elements. As an example of the proposed innovative approach, two possible envelope solutions are formulated in line with the research paradigms and with the principles of sustainability, understood as modularity, speed of installation, reversibility, and recovery and recycling of materials.
In particular, the innovative solutions share the assembly of prefabricated elements, the adoption of a hexagonal pattern for the tessellation of the new facade surface, and the use of the same recycled, eco-sustainable, low-environmental-impact plastic and inorganic thermal insulation material (AAM - Alkali Activated Materials). The proposed design solutions, developed at the two institutions involved in the joint supervision (Università di Bologna, Université Paris-Est), follow a scientific protocol comprising: design of the construction system, mechanical and thermal analysis, construction experimentation, and verification of installation techniques and performance requirements.