238 results for INFORMATICA
Abstract:
The field of computer security is often considered something between art and science. This is partly due to the lack of widely agreed, standardized methodologies for evaluating the degree of security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that may be applied to different threat scenarios with the same degree of effectiveness. Security testing methodologies are the first step towards standardized security evaluation processes and towards understanding how security threats evolve over time. This dissertation analyzes some of the most widely used methodologies, identifying differences and commonalities that are useful for comparing them and assessing their quality. The dissertation then proposes a new, enhanced methodology built by keeping the best of every analyzed methodology. The designed methodology is tested on different systems with very effective results, which is the main evidence that it can really be applied in practical cases. Much of the dissertation discusses and demonstrates how the presented testing methodology can be applied to such different systems, and even used to evade security measures by inverting goals and scopes. Real cases are often hard to find in methodology documents; on the contrary, this dissertation aims to show real and practical cases, offering technical details about how to apply the methodology. Electronic voting systems are the first field test considered, with Pvote and Scantegrity being the two electronic voting systems tested. The usability and effectiveness of the designed methodology for electronic voting systems are demonstrated through this field-case analysis. Furthermore, reputation and anti-virus engines have also been analyzed, with similar results.
The dissertation concludes by presenting some general guidelines for building a coordination-based approach to electronic voting systems that improves security without decreasing system modularity.
Abstract:
This thesis is mainly devoted to showing how EEG data and related phenomena can be reproduced and analyzed using mathematical models of neural masses (NMMs). The aim is to describe some of these phenomena, to show in which ways the design of the models' architecture is influenced by them, to point out the difficulties of tuning the dozens of model parameters in order to reproduce the activity recorded with EEG systems during different kinds of experiments, and to suggest some strategies to cope with these problems. In particular, the chapters are organized as follows: chapter I gives a brief overview of the aims and issues addressed in the thesis; in chapter II the main characteristics of the cortical column, of the EEG signal and of neural mass models are presented, in order to show the relationships that hold between these entities; chapter III describes a study in which an NMM from the literature has been used to assess brain connectivity changes in tetraplegic patients; in chapter IV a modified version of the NMM is presented, developed to overcome some of the previous version's intrinsic limitations; chapter V describes a study in which the new NMM has been used to reproduce the electrical activity evoked in the cortex by transcranial magnetic stimulation (TMS); chapter VI presents some preliminary results obtained in the simulation of the neural rhythms associated with memory recall; finally, some general conclusions are drawn in chapter VII.
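As a concrete illustration of the kind of model discussed above, the following is a minimal sketch of a Jansen-Rit-style neural mass model integrated with the forward Euler method. The parameter values are the standard ones from the literature, not necessarily those used in this thesis, and the constant input drive is an illustrative assumption.

```python
import math

def sigmoid(v, e0=2.5, r=0.56, v0=6.0):
    """Population potential-to-rate sigmoid (spikes/s)."""
    return 2.0 * e0 / (1.0 + math.exp(r * (v0 - v)))

def simulate_jansen_rit(p=220.0, dt=1e-4, t_end=1.0):
    """Euler integration of the Jansen-Rit neural mass model.

    Returns the EEG-like output y1 - y2 (mV) at each time step.
    p is a constant external input (illustrative value).
    """
    A, B = 3.25, 22.0            # excitatory/inhibitory synaptic gains (mV)
    a, b = 100.0, 50.0           # inverse synaptic time constants (1/s)
    C = 135.0                    # connectivity constant
    C1, C2, C3, C4 = C, 0.8 * C, 0.25 * C, 0.25 * C
    y = [0.0] * 6                # states y0..y2 and derivatives y3..y5
    out = []
    for _ in range(int(t_end / dt)):
        y0, y1, y2, y3, y4, y5 = y
        dy3 = A * a * sigmoid(y1 - y2) - 2 * a * y3 - a * a * y0
        dy4 = A * a * (p + C2 * sigmoid(C1 * y0)) - 2 * a * y4 - a * a * y1
        dy5 = B * b * C4 * sigmoid(C3 * y0) - 2 * b * y5 - b * b * y2
        y = [y0 + dt * y3, y1 + dt * y4, y2 + dt * y5,
             y3 + dt * dy3, y4 + dt * dy4, y5 + dt * dy5]
        out.append(y[1] - y[2])  # pyramidal membrane potential = model "EEG"
    return out

trace = simulate_jansen_rit()
```

With these literature parameters the output settles into an alpha-band-like oscillation; tuning the dozens of parameters to fit recorded EEG is exactly the difficulty the thesis discusses.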
Abstract:
This doctoral dissertation aims to establish fiber-optic technologies that overcome the issues limiting data communications in indoor environments. Specific applications are broadband mobile distribution in different in-building scenarios and high-speed digital transmission over short-range wired optical systems. Two key enabling technologies are considered: Radio over Fiber (RoF) techniques over standard silica fibers for distributed antenna systems (DAS), and plastic optical fibers (POFs) for short-range communications. Hence, the objectives and achievements of this thesis relate to the application of RoF and POF technologies in different in-building scenarios. On the one hand, a theoretical and experimental analysis combined with demonstration activities has been performed on cost-effective RoF systems. Extensive modeling of the impact of modal noise on both the linear and non-linear characteristics of RoF links over silica multimode fiber has been carried out to derive link design rules for an optimum choice of the transmitter, receiver and launching technique. Successful transmission of Long Term Evolution (LTE) mobile signals over the resulting optimized RoF system on silica multimode fiber, employing a Fabry-Perot laser diode, the central launch technique and a photodiode with a built-in ball lens, was demonstrated up to 525 m with performance fully compliant with standard requirements. On the other hand, digital signal processing techniques to overcome the bandwidth limitation of POF have been investigated. An uncoded net bit-rate of 5.15 Gbit/s was obtained on a 50 m long POF link employing an eye-safe transmitter, a silicon photodiode, and DMT modulation with a bit- and power-loading algorithm. With the introduction of 3x2^N quadrature amplitude modulation constellation formats, an uncoded net bit-rate of 5.4 Gbit/s was obtained on a 50 m long POF link employing an eye-safe transmitter and a silicon avalanche photodiode.
Moreover, the simultaneous transmission of a 2 Gbit/s baseband DMT signal and a 200 Mbit/s ultra-wideband radio signal has been validated over a 50 m long POF link.
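To illustrate the kind of bit-loading step mentioned above, here is a minimal sketch of per-tone bit allocation for a DMT link. The SNR-gap formulation and all numerical values (gap, SNR profile, bit cap) are illustrative assumptions, not the thesis parameters.

```python
import math

def dmt_bit_loading(snr_db, gamma_db=9.8, max_bits=10):
    """Per-subcarrier bit allocation for a DMT transmitter.

    Each tone carries floor(log2(1 + SNR/Gamma)) bits, where Gamma is the
    SNR gap to capacity for the target error rate (hypothetical value here).
    Tones with insufficient SNR get zero bits and are left unused.
    """
    gamma = 10 ** (gamma_db / 10)
    bits = []
    for s_db in snr_db:
        snr = 10 ** (s_db / 10)
        b = int(math.log2(1 + snr / gamma))  # round down to an integer load
        bits.append(min(b, max_bits))        # cap at the largest constellation
    return bits

# A POF channel is strongly low-pass: high SNR on low-frequency tones,
# little on high ones (toy SNR profile, in dB).
snrs = [30, 28, 25, 20, 14, 8, 3]
alloc = dmt_bit_loading(snrs)
```

Summing the per-tone loads and multiplying by the DMT symbol rate would give the achievable bit-rate; practical loaders also redistribute power across tones, which this sketch omits.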
Abstract:
The research activity presented in this thesis has focused on the development and implementation of techniques for the nonlinear/electromagnetic co-simulation and co-design of non-conventional wireless systems. This work presents a rigorous method for accounting for the interactions between two systems placed in both near-field and far-field conditions. In essence, the effects of the transmitting system are represented by an equivalent Norton generator placed in parallel with the antenna of the receiving system, computed by means of the reciprocity theorem and the equivalence theorem. The correctness of the method has been verified through simulations and measurements, which are in mutual agreement. The same theory, extended with the introduction of scattering effects, has been used to evaluate an analogous condition in which the transmitting element coincides with the receiving one (die) enclosed within a metallic structure (package). The results have been compared with those obtainable through FEM and FDTD/FIT techniques, which require simulation times more than an order of magnitude longer. Thanks to the nonlinear/EM co-simulation methods described above, a system for the localization and identification of tagged objects in indoor environments has been designed and verified. This has been achieved by equipping the reader system, called RID (Remotely Identify and Detect), with angular scanning functions and the mono-pulse RADAR technique. The experimental system, built with low-cost devices, operates at 2.5 GHz and has dimensions comparable to those of an ordinary PDA. The ability of the RID to localize static and moving objects in indoor scenarios has been verified experimentally.
Abstract:
Cost, performance and availability considerations are forcing even the most conservative high-integrity embedded real-time systems industry to migrate from simple hardware processors to ones equipped with caches and other acceleration features. This migration disrupts the practices and solutions that industry had developed and consolidated over the years to perform timing analysis. Industries that are confident in the efficiency and effectiveness of their verification and validation processes for old-generation processors do not have sufficient insight into the effects of the migration to cache-equipped processors. Caches are perceived as an additional source of complexity, with the potential to shatter the guarantees of cost- and schedule-constrained qualification of their systems. The current industrial approach to timing analysis is ill-equipped to cope with the variability incurred by caches. Conversely, the application of advanced WCET analysis techniques to real-world industrial software, developed without analysability in mind, is hardly feasible. We propose a development approach aimed at minimising cache jitter, as well as at enabling the application of advanced WCET analysis techniques to industrial systems. Our approach builds on: (i) identification of those software constructs that may impede or complicate timing analysis in industrial-scale systems; (ii) elaboration of practical means, under the model-driven engineering (MDE) paradigm, to enforce the automated generation of software that is analysable by construction; (iii) implementation of a layout optimisation method to remove cache jitter stemming from the software layout in memory, with the intent of facilitating incremental software development, which is of high strategic interest to industry.
The integration of those constituents in a structured approach to timing analysis achieves two interesting properties: the resulting software is analysable from the earliest releases onwards (as opposed to becoming so only when the system is final) and is more easily amenable to advanced timing analysis by construction, regardless of system scale and complexity.
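To make layout-induced cache jitter concrete, the following sketch checks whether functions in a memory layout compete for the same sets of a direct-mapped instruction cache, which is the kind of conflict a layout optimisation would remove. The cache geometry and the function map are invented for illustration and are not taken from the thesis.

```python
def cache_set_footprint(start, size, line_size=32, num_sets=256):
    """Return the set indices touched by code at [start, start+size)
    in a direct-mapped cache (hypothetical 32-byte lines, 256 sets)."""
    first = start // line_size
    last = (start + size - 1) // line_size
    return {line % num_sets for line in range(first, last + 1)}

def conflicting_pairs(functions, line_size=32, num_sets=256):
    """Given {name: (start_address, code_size)}, list the function pairs
    whose footprints overlap in at least one cache set: with a direct-mapped
    cache, calling one can evict the other, producing timing jitter."""
    sets_of = {name: cache_set_footprint(s, sz, line_size, num_sets)
               for name, (s, sz) in functions.items()}
    names = sorted(sets_of)
    return [(a, b) for i, a in enumerate(names) for b in names[i + 1:]
            if sets_of[a] & sets_of[b]]

# Toy layout: the cache spans 32 * 256 = 8 KiB, so code placed 8 KiB apart
# aliases to the same sets.
layout = {"isr": (0x0000, 64), "control_loop": (0x2000, 128),
          "logging": (0x0040, 64)}
conflicts = conflicting_pairs(layout)
```

A layout optimiser would move one function of each conflicting pair so their footprints become disjoint, making the cache behaviour of each release independent of later additions.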
Abstract:
The work of the present thesis is focused on the implementation of microelectronic voltage sensing devices, with the purpose of transmitting and extracting analog information between devices of a different nature at short distances or upon contact. Initially, chip-to-chip communication was studied, and circuitry for 3D capacitive coupling was implemented. Such circuits allow communication between dies fabricated in different technologies. Due to their novelty, they are not standardized and are currently not supported by standard CAD tools. To overcome this burden, a novel approach for the characterization of such communication links has been proposed, resulting in shorter design times and increased accuracy. Communication between an integrated circuit (IC) and a probe card has been extensively studied as well. Today, wafer probing is a costly test procedure with many drawbacks, which could be overcome by a different communication approach such as capacitive coupling. For this reason, wireless wafer probing has been investigated as an alternative to standard on-contact wafer probing. Interfaces between integrated circuits and biological systems have also been investigated. Active electrodes for simultaneous electroencephalography (EEG) and electrical impedance tomography (EIT) have been implemented for the first time in a 0.35 um process. The number of wires has been minimized by sharing the analog outputs and the supply on a single wire, thus implementing electrodes that require only four wires for their operation. Minimizing the number of wires reduces the cable weight and thus limits the patient's discomfort. The physical channel for communication between an IC and a biological medium is represented by the electrode itself.
As this is a crucial point for biopotential acquisition, considerable effort has been devoted to investigating the different electrode technologies and geometries, and an electromagnetic model is presented to characterize the properties of the electrode-to-skin interface.
Abstract:
This thesis proposes design methods and test tools for optical systems that may be used in an industrial environment, where not only precision and reliability but also ease of use is important. The approach to the problem has been conceived to be as general as possible, although in the present work the design of a portable device for automatic identification applications has been studied, because this doctorate was funded by Datalogic Scanning Group s.r.l., a world-class producer of barcode readers. The main functional components of the complete device are the electro-optical imaging, illumination and pattern generator systems. Regarding the electro-optical imaging system, a characterization tool and an analysis tool have been developed to check whether the desired performance of the system has been achieved. Moreover, two design tools for optimizing the imaging system have been implemented. The first optimizes just the core of the system, the optical part, improving its performance while ignoring all other contributions and generating a good starting point for the optimization of the whole complex system. The second tool optimizes the system taking into account its behavior with a model as close as possible to reality, including optics, electronics and detection. Regarding the illumination and pattern generator systems, two tools have been implemented. The first allows the design of free-form lenses described by an arbitrary analytical function, excited by an incoherent source, and is able to provide custom illumination conditions for all kinds of applications. The second tool consists of a new method to design Diffractive Optical Elements excited by a coherent source for large pattern angles, using the Iterative Fourier Transform Algorithm. Validation of the design tools has been obtained, whenever possible, by comparing the performance of the designed systems with that of fabricated prototypes; in other cases, simulations have been used.
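The Iterative Fourier Transform Algorithm mentioned above can be sketched in its simplest Gerchberg-Saxton form: alternate between the DOE plane (where the element is phase-only) and the far field (where the target intensity is imposed). The grid size, target pattern and iteration count below are arbitrary toy values, not the thesis's large-angle method.

```python
import numpy as np

def ifta_phase(target_intensity, iterations=50, seed=0):
    """Gerchberg-Saxton-style IFTA: find a phase-only DOE profile whose
    far-field diffraction pattern approximates the target intensity."""
    rng = np.random.default_rng(seed)
    target_amp = np.sqrt(target_intensity)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
    for _ in range(iterations):
        far = np.fft.fft2(np.exp(1j * phase))            # DOE plane -> far field
        far = target_amp * np.exp(1j * np.angle(far))    # impose target amplitude
        near = np.fft.ifft2(far)                         # back to the DOE plane
        phase = np.angle(near)                           # keep only the phase
    return phase

# Toy target: two bright off-axis spots on a 32x32 grid.
target = np.zeros((32, 32))
target[8, 8] = target[8, 24] = 1.0
phi = ifta_phase(target)
reconstruction = np.abs(np.fft.fft2(np.exp(1j * phi))) ** 2
```

The `reconstruction` array shows the far field actually produced by the computed phase; the thesis's contribution concerns extending this class of algorithm to large pattern angles, which the paraxial FFT model here does not capture.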
The Lungarni area of Pisa in the late Middle Ages (14th-15th centuries): an attempt at a 3D reconstruction.
Abstract:
The aim of this research is the reconstruction of the Lungarni of Pisa in the late Middle Ages (14th-15th centuries); the study intends to highlight the urban transformations that changed the face of Pisa over time, and to recall that the riverfront area played a leading role as the commercial and economic centre of the city, a vocation that was largely lost in the modern and contemporary eras. The methodology followed, refined and perfected during participation in the Nu.M.E. project (Nuovo Museo Elettronico della Città di Bologna), is based on the analysis and comparison of heterogeneous but complementary sources, which include previous studies on urban history, a corpus of medieval documentation (administrative provisions such as the Statutes of the Commune of Pisa, but also descriptions by chroniclers and travellers), iconographic sources, including sixteenth-century and later views and maps, and material sources, such as the medieval remains still observable inside buildings and the finds recovered during several archaeological excavation campaigns. The 3D model is not conceived as static and "closed", but can be freely explored within a 3D engine; the resulting product can address different levels of users, including both scholars and specialists interested in accessing more information and deepening the research, and ordinary citizens with a passion for history, as well as younger users such as secondary and middle school students.
Abstract:
Biomedical analyses are becoming increasingly complex, with respect to both the type of data to be produced and the procedures to be executed, and this trend is expected to continue in the future. The development of information and protocol management systems that can sustain this challenge is therefore becoming an essential enabling factor for all actors in the field. The use of custom-built solutions that require the biology domain expert to acquire or procure software engineering expertise in the development of the laboratory infrastructure is not fully satisfactory, because it incurs undesirable mutual knowledge dependencies between the two camps. We propose instead an infrastructure concept that enables domain experts to express laboratory protocols using proper domain knowledge, free from the incidence and mediation of software implementation artefacts. In the system that we propose, this is made possible by basing the modelling language on an authoritative domain-specific ontology and then using modern model-driven architecture technology to transform the user models into software artefacts ready for execution on a multi-agent execution platform specialized for biomedical laboratories.
Abstract:
The evolution of embedded electronics applications forces electronic systems designers to meet ever-increasing requirements. This evolution pushes the computational power of digital signal processing systems, as well as the energy required to accomplish the computations, due to the increasing mobility of such applications. Current approaches to matching these requirements rely on the adoption of application-specific signal processors. Such devices exploit powerful accelerators, which are able to meet both performance and energy requirements. On the other hand, the high specificity of such accelerators often results in a lack of flexibility, which affects non-recurring engineering costs, time to market, and market volumes. The state of the art mainly proposes two solutions to overcome these issues with the ambition of delivering reasonable performance and energy efficiency: reconfigurable computing and multi-processor computing. Both of these solutions benefit from post-fabrication programmability, which results in increased flexibility. Nevertheless, the gap between these approaches and dedicated hardware is still too wide for many application domains, especially when targeting the mobile world. In this scenario, flexible and energy-efficient acceleration can be achieved by merging these two computational paradigms in order to address all the constraints introduced above. This thesis focuses on the exploration of the design and application spectrum of reconfigurable computing, exploited as application-specific accelerators for multi-processor systems on chip. More specifically, it introduces a reconfigurable digital signal processor featuring a heterogeneous set of reconfigurable engines, and a homogeneous multi-core system exploiting three different flavours of reconfigurable and mask-programmable technologies as implementation platforms for application-specific accelerators.
In this work, the various trade-offs concerning the utilization of multi-core platforms and the different configuration technologies are explored, characterizing the design space of the proposed approach in terms of programmability, performance, energy efficiency and manufacturing costs.
Abstract:
The continuous advancement and enhancement of wireless systems are enabling new compelling scenarios where mobile services can adapt to the current execution context, represented by the computational resources available at the local device, the current physical location, the people in physical proximity, and so forth. Such services, called context-aware services, require the timely delivery of all relevant information describing the current context, and this introduces several unsolved complexities, spanning from low-level context data transmission up to context data storage and replication in the mobile system. In addition, to ensure correct and scalable context provisioning, it is crucial to integrate and interoperate with different wireless technologies (WiFi, Bluetooth, etc.) and modes (infrastructure-based and ad-hoc), and to use decentralized solutions to store and replicate context data on mobile devices. These challenges call for novel middleware solutions, here called Context Data Distribution Infrastructures (CDDIs), capable of delivering relevant context data to mobile devices while hiding all the issues introduced by data distribution in heterogeneous and large-scale mobile settings. This dissertation thoroughly analyzes CDDIs for mobile systems, with the main goal of achieving a holistic approach to the design of this type of middleware. We discuss the main functions needed by context data distribution in large mobile systems, and we advocate the precise definition and strict observance of quality-based contracts between context consumers and the CDDI in order to reconfigure the main middleware components at runtime. We present the design and implementation of our proposals, in both simulation-based and real-world scenarios, along with an extensive evaluation that confirms the technical soundness of the proposed CDDI solutions.
Finally, we consider three highly heterogeneous scenarios, namely disaster areas, smart campuses, and smart cities, to further demonstrate the broad technical validity of our analysis and solutions under different network deployments and quality constraints.
Abstract:
This thesis deals with the development of the upcoming aeronautical mobile airport communications system (AeroMACS). We analyzed the performance of AeroMACS and investigated potential solutions for enhancing it. Since the most critical results correspond to the channel scenario with the least diversity, we tackled this problem by investigating potential solutions for increasing the diversity of the system and therefore improving its performance. We considered different forms of diversity, such as space diversity and time diversity. More specifically, space (antenna and cooperative) diversity and time diversity are analyzed as countermeasures against the harsh fading conditions that are typical of airport environments. Among the analyzed techniques, two novel concepts are introduced, namely unequal diversity coding and flexible packet-level codes. The proposed techniques have been analyzed on a novel airport channel model, derived from a measurement campaign at the airport of Munich (Germany). The introduced techniques largely improve the performance of the conventional AeroMACS link, thus representing appealing solutions for the long-term evolution of the system.
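As a toy illustration of packet-level coding for time diversity (far simpler than the flexible packet-level codes introduced in the thesis), a single XOR parity packet lets a receiver recover from any one packet erasure caused by a fade:

```python
def xor_bytes(a, b):
    """Bytewise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

def encode_parity(packets):
    """Systematic packet-level erasure code: append one XOR parity packet
    so the block survives the loss of any single packet."""
    parity = packets[0]
    for p in packets[1:]:
        parity = xor_bytes(parity, p)
    return packets + [parity]

def recover(received):
    """received: the coded block with exactly one packet replaced by None.
    Returns the original source packets."""
    missing = received.index(None)
    acc = None
    for i, p in enumerate(received):
        if i == missing:
            continue
        acc = p if acc is None else xor_bytes(acc, p)
    restored = received[:missing] + [acc] + received[missing + 1:]
    return restored[:-1]  # drop the parity packet

packets = [b"ctrl", b"data", b"tail"]
coded = encode_parity(packets)
coded[1] = None                 # simulate one packet lost in a deep fade
restored = recover(coded)
```

Because the erasures are spread over time, such codes trade latency for robustness; the thesis's flexible codes generalize this idea beyond a single fixed parity packet.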
Abstract:
The first chapter of this work is devoted to the efforts of local administrators and of a governmental body such as the Chamber of Commerce to decipher the actual characteristics of the local context from the economic and social points of view, and in terms of the perception and meaning assumed by consumption and by the urban spaces devoted to it. The most original feature noted by the administrators (whose ranks included scholars such as Ardigò, Zangheri and Bellettini) is the remarkable political and cultural homogeneity of the social fabric. This despite massive immigration which was, proportionally, second only to that of Milan, but which for the overwhelming majority came from the same province or, at most, from the region, and from similar paths of socialization and education. Building essentially on this homogeneity (chapter two), the Bolognese institutions sought to govern the transformation of the city as well as the expansion of consumption, which appeared to be affected by excesses and distortions. Leveraging the severe crises of 1963-1965 and 1973-1977, the local administrators aimed to ground their own legitimacy precisely in consumption, on the basis of a precise understanding of the new trends that were arriving, but taking a firm stand to contain their disruptive effects on the local fabric and directing the acquisitive efforts of the Bolognese according to a tempering rationalization nourished by urban planning. Delays, pressure from below, and bureaucratic and legislative obstacles made these paths difficult, or at any rate far from linear, until the beginning of the 1980s significantly altered their course. But this, given the current state of knowledge, is already a subject for new research. The third chapter is devoted to the cartographic visualization (GIS) of the expansion of urban commercial spaces during the most significant phases of the economic miracle.
Abstract:
This work is concerned with the increasing relationship between two distinct multidisciplinary research fields, Semantic Web technologies and scholarly publishing, which in this context converge into one precise research topic: Semantic Publishing. In the spirit of the original aim of Semantic Publishing, i.e. the improvement of scientific communication by means of semantic technologies, this thesis proposes theories, formalisms and applications for opening up semantic publishing to an effective interaction between scholarly documents (e.g., journal articles) and their related semantic and formal descriptions. In fact, the main aim of this work is to increase users' comprehension of documents and to allow document enrichment, discovery and linkage to document-related resources and contexts, such as other articles and raw scientific data. In order to achieve these goals, this thesis investigates and proposes solutions for three of the main issues that semantic publishing promises to address, namely: the need for tools linking document text to a formal representation of its meaning, the lack of complete metadata schemas for describing documents according to the publishing vocabulary, and the absence of effective user interfaces for easily acting on semantic publishing models and theories.
From fall-risk assessment to fall detection: inertial sensors in the clinical routine and daily life
Abstract:
Falls are caused by complex interactions between multiple risk factors, which may be modified by age, disease and environment. A variety of methods and tools for fall-risk assessment have been proposed, but none of them is universally accepted, and existing tools are generally not capable of providing a quantitative, predictive assessment of fall risk. Objective, cost-effective and clinically applicable methods are needed to enable quantitative assessment of fall risk on a subject-specific basis. Objectively tracking fall risk could provide timely feedback about the effectiveness of administered interventions, enabling intervention strategies to be modified or changed if found to be ineffective. Moreover, some of the fundamental factors leading to falls, and what actually happens during a fall, remain unclear. Objectively documented and measured falls are needed to improve knowledge of falling in order to develop more effective prevention strategies and prolong independent living. In the last decade, several research groups have developed sensor-based automatic or semi-automatic fall-risk assessment tools using wearable inertial sensors; this approach may also serve to detect falls. At the moment, i) several fall-risk assessment studies based on inertial sensors, even if promising, lack a biomechanical model-based approach that could provide accurate and more detailed measures of interest (e.g., joint moments and forces), and ii) the number of published real-world fall data of older people is minimal, since most authors have used simulations with healthy volunteers as a surrogate for real-world falls.
With these limitations in mind, this thesis aims i) to propose a novel method for the kinematic and dynamic evaluation of functional motor tasks, often used in clinics for fall-risk evaluation, through a body sensor network and a biomechanical approach, and ii) to define the guidelines for a fall detection algorithm based on the availability of a real-world fall database.
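As a sketch of the simplest class of inertial fall detection algorithms (an impact spike followed by post-fall inactivity), the thresholds and window lengths below are illustrative assumptions, not the guidelines derived in the thesis:

```python
import math

def detect_falls(samples, fs=100, impact_g=2.5, rest_tol=0.3, rest_s=1.0):
    """Flag candidate falls in tri-axial accelerometer data (units of g).

    A fall is flagged when the acceleration magnitude exceeds impact_g
    (the impact) and then stays within rest_tol of 1 g for rest_s seconds
    (the subject lying still). All thresholds are hypothetical.
    Returns the sample indices of the detected impacts.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    n_rest = int(rest_s * fs)
    falls = []
    i = 0
    while i < len(mags):
        if mags[i] > impact_g:
            rest = mags[i + 1 : i + 1 + n_rest]
            if len(rest) == n_rest and all(abs(m - 1.0) < rest_tol for m in rest):
                falls.append(i)
                i += n_rest  # skip past the quiet interval
        i += 1
    return falls

# Toy trace at 100 Hz: standing (1 g), one impact spike, then lying still.
data = [(0.0, 0.0, 1.0)] * 200 + [(0.0, 0.0, 3.2)] + [(1.0, 0.0, 0.0)] * 150
falls = detect_falls(data)
```

Simple thresholds like these are exactly what real-world fall databases are needed to validate: impacts followed by continued movement (e.g., sitting down hard and standing back up) are correctly rejected by the inactivity check, but the thresholds themselves must be tuned on genuine falls rather than simulations.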