15 results for Open source
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
To understand a city and its urban structure it is necessary to study its history. This is feasible through GIS (Geographical Information Systems) and their web-based by-products. Starting from a cartographic view, they allow an initial understanding of, and comparison between, present and past data, together with easy and intuitive access to database information. The research carried out led to the creation of a GIS for the city of Bologna, based on heterogeneous data such as historical maps, vector data and alphanumeric historical data. After building the GIS, we set out to publish and share the collected data on the Web, studying the two solutions available on the market: Web Mapping and WebGIS. In this study we discuss the stages, beginning with the development of the Historical GIS of Bologna, which led to the creation of an open source WebGIS (MapServer and Chameleon) and of Web Mapping services (Google Earth, Google Maps and OpenLayers).
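MapServer, mentioned above, typically publishes layers through the standard OGC WMS interface. Purely as an illustrative sketch (the host, map file, layer name and bounding box below are hypothetical, not those of the Bologna project), a GetMap request could be issued from Python as follows:

```python
# Illustrative sketch only: requesting a map image from a WMS endpoint such as
# one published by MapServer. Host, map file, layer name and bounding box are
# hypothetical placeholders.
from urllib.parse import urlencode
from urllib.request import urlretrieve

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "bologna_1884",            # hypothetical historical map layer
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "11.30,44.47,11.38,44.52",   # lon/lat window around Bologna
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}
url = "http://example.org/cgi-bin/mapserv?map=/maps/bologna.map&" + urlencode(params)
urlretrieve(url, "bologna_1884.png")     # saves the rendered map image
```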
Abstract:
Since the 1970s there has been a progressive geopolitical realignment at the global level. Thanks also to technological evolution and its mass diffusion, time and space contract in the process of globalization that has characterized contemporary societies, where information and communication now play a central role in the dynamics of knowledge. This study first aims to shed light on the discipline of intelligence as defined in the military and "civilian" spheres, in particular in the US, NATO and UN contexts, in order to highlight the peculiarities of a new intelligence discipline, so-called Open Source Intelligence, whose innovative element is the use of unclassified information. After addressing the problem of the conceptualization and evolution of the terrorist phenomenon, the focus turns to the criminal expression of greatest current concern, international terrorism of Islamist matrix, in a multidimensional perspective, thanks to the adoption of interdisciplinary criminological concepts. On the experimental side, we therefore decided to propose, design and develop the architecture of the Open Source Intelligence Analysis Platform, an operational support tool for the Open Source Intelligence analyst, conceived as a resource for criminological analysis and able to make a valuable contribution, thanks to the merging of practice and research, to the application of this informational approach to the terrorist phenomenon.
Abstract:
The aim of the research is to analyse the impact of the so-called "open" culture in the light of the current state of the World Wide Web. In particular, we consider the genesis of the movement, starting from its roots in hacker culture and its evolution into the free software philosophy, with the ultimate goal of identifying the current role of the open source model in the existing scenario. An introduction to the concept of Open Access completes the research, also in view of the recent reaffirmation of knowledge as a commons within the Information Society.
Abstract:
The aim of this work is to use a Geographical Information System (GIS) to carry out safety analyses, monitoring and environmental impact assessments. Today, virtually all GIS operations can be performed with open source software; here we chose GRASS GIS (Geographic Resources Analysis Support System), available under the GNU General Public License (GPL), demonstrating the usability and remarkable potential of this software, as well as the quality of the products that can be obtained, never inferior to those produced with the most established and widespread proprietary programs. In Chapter 4 we present the application to the analysis of the consequences of hypothetical accidents during the decommissioning of the nuclear fuel processing plant at Bosco Marengo (AL). In Chapter 5 we present applications in the field of air quality monitoring through the analysis of satellite images.
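GRASS GIS modules can be scripted through its Python interface. Purely as an illustration of the kind of safety-distance analysis described for Chapter 4 (the map names and buffer distances below are hypothetical, not those of the thesis), a set of buffers around a site could be computed as follows:

```python
# Illustrative sketch only: concentric safety-distance buffers around a site
# computed with the GRASS GIS Python scripting interface. Must be run inside a
# GRASS session; map names and distances are hypothetical.
import grass.script as gs

# Align the computational region to the raster marking the plant location
gs.run_command("g.region", raster="plant_site", flags="p")

# Concentric buffers (metres) around the non-null cells of plant_site
gs.run_command("r.buffer",
               input="plant_site",
               output="safety_zones",
               distances=[500, 1000, 2000])

# Areal report of the resulting zone raster, in square kilometres
print(gs.read_command("r.report", map="safety_zones", units="k"))
```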
Abstract:
In recent years there has been renewed interest in Mixed Integer Non-Linear Programming (MINLP) problems. This can be explained by several factors: (i) the performance of solvers handling non-linear constraints has improved considerably; (ii) the awareness that most real-world applications can be modeled as MINLP problems; (iii) the challenging nature of this very general class of problems. It is well known that MINLP problems are NP-hard, being a generalization of MILP problems, which are NP-hard themselves. However, MINLPs are, in general, also hard to solve in practice. We address non-convex MINLPs, i.e. problems whose continuous relaxations are non-convex: the presence of non-convexities in the model usually makes these problems even harder to solve. The aim of this Ph.D. thesis is to give a flavor of the different approaches one can study to attack MINLP problems with non-convexities, with special attention to real-world problems. In Part 1 of the thesis we introduce the problem and present three special cases of general MINLPs together with the most common methods used to solve them; these techniques play a fundamental role in the resolution of general MINLP problems. We then describe algorithms addressing general MINLPs. Parts 2 and 3 contain the main contributions of the thesis. In particular, Part 2 presents four different methods aimed at solving different classes of MINLP problems. Part 3 is devoted to real-world applications: two different problems and approaches to MINLPs are presented, namely Scheduling and Unit Commitment for Hydro-Plants and Water Network Design problems. The results show that each of these methods has advantages and disadvantages; thus, the method adopted to solve a real-world problem should typically be tailored to the characteristics, structure and size of the problem. Part 4 of the thesis is a brief review of the tools commonly used for general MINLP problems, which constituted an integral part of this Ph.D. work (especially the use and development of open-source software); we present the main characteristics of solvers for each special case of MINLP.
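Purely as an illustrative example, not drawn from the thesis, a tiny non-convex MINLP with a bilinear objective term and one integer variable can be written as:

```latex
\begin{align*}
\min_{x,\,y}\quad & x\,y + y^{2} \\
\text{s.t.}\quad  & x^{2} - 4y \le 0, \\
                  & 0 \le x \le 10,\ x \in \mathbb{Z}, \\
                  & 0 \le y \le 5,\ y \in \mathbb{R}.
\end{align*}
```

The bilinear term xy makes the continuous relaxation non-convex even before the integrality requirement on x is imposed, which is the kind of non-convexity the thesis addresses.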
Abstract:
Healthcare, Human Computer Interfaces (HCI), security and biometrics are the most promising application scenarios directly involved in the evolution of Body Area Networks (BANs). Both wearable devices and sensors directly integrated into garments envision a world in which each of us is supervised by an invisible assistant monitoring our health and daily-life activities. New opportunities are enabled by improvements in sensor miniaturization and in the transmission efficiency of wireless protocols, which have made it possible to integrate high computational power into independent, energy-autonomous, small-form-factor devices. The purposes of such applications are various: (I) data collection for off-line knowledge discovery; (II) notification to the user of his/her activities or of dangerous situations; (III) biofeedback rehabilitation; (IV) remote alarm activation in case the subject needs assistance; (V) a more natural interaction with the surrounding computerized environment; (VI) user identification through physiological or behavioural characteristics. Telemedicine and mHealth [1] are two of the leading concepts directly related to healthcare. The ability to wear unobtrusive devices supports users' autonomy: the user gains a new sense of freedom, sustained not only by psychological reassurance but by a real improvement in safety. Furthermore, the medical community aims to introduce new devices to innovate patient treatment, in particular by extending ambulatory analysis to real-life scenarios through continuous acquisition. The wide diffusion of emerging portable wellness equipment has also extended the usability of wearable devices to fitness and training, by monitoring the user's performance on the working task. Learning the correct execution techniques for work, sport or music can be supported by an electronic trainer providing adequate assistance. HCIs have made real the concepts of Ubiquitous and Pervasive Computing and Calm Technology introduced in 1988 by Mark Weiser and John Seely Brown. They promote the creation of pervasive environments that enhance the human experience: context-aware, adaptive and proactive environments serve and help people by becoming sensitive and reactive to their presence, since electronics is ubiquitous and deployed everywhere. In this thesis we pay attention to the integration of all the aspects involved in the development of a BAN. Starting from the choice of sensors, we design the node, configure the radio network, implement real-time data analysis and provide feedback to the user. We present algorithms to be implemented in a wearable assistant for posture and gait analysis and to provide assistance in different walking conditions, preventing falls. Our aim, expressed by the intention to contribute to the development of non-proprietary solutions, drove us to integrate commercial and standard solutions into our devices: we use sensors available on the market rather than designing specialized sensors in ASIC technologies, and we employ standard radio protocols and open source projects wherever possible. The specific contributions of the PhD research activities are presented and discussed in the following.
• We have designed and built several wireless sensor nodes providing both sensing and actuation capabilities, focusing on flexibility, small form factor and low power consumption. The key idea was to develop a simple and general-purpose architecture for rapid analysis, prototyping and deployment of BAN solutions. Two different sensing units are integrated: kinematic (3D accelerometer and 3D gyroscope) and kinetic (foot-floor contact pressure forces). Two kinds of feedback were implemented: audio and vibrotactile.
• Since the system built is a suitable platform for testing and measuring the features and constraints of a sensor network (radio communication, network protocols, power consumption and autonomy), we compared Bluetooth and ZigBee performance in terms of throughput and energy efficiency. Field tests evaluated usability in the fall-detection scenario.
• To prove the flexibility of the architecture designed, we implemented a wearable system for human posture rehabilitation. The application was developed in conjunction with biomedical engineers, who provided the audio algorithms used to give the user biofeedback about his/her stability.
• We explored off-line gait analysis of collected data, developing an algorithm to detect foot inclination in the sagittal plane during walking.
• In collaboration with the Wearable Lab – ETH, Zurich, we developed an algorithm to monitor the user during several walking conditions in which the user carries a load.
The remainder of the thesis is organized as follows. Chapter I gives an overview of Body Area Networks (BANs), illustrating the relevant features of this technology and the key challenges still open; it concludes with a short list of real solutions and prototypes proposed by academic research and by manufacturers. The domain of posture and gait analysis, and the methodologies and technologies used to provide real-time feedback on detected events, are illustrated in Chapter II. Chapters III and IV present BANs developed to detect falls and to monitor gait, taking advantage of two inertial measurement units and of baropodometric insoles. Chapter V reports an audio-biofeedback system to improve balance based on information about the user's centre of mass. A walking assistant based on a KNN classifier to detect gait alterations during load carriage is described in Chapter VI.
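A common baseline for accelerometer-based fall detection of the kind discussed above is a two-stage threshold test (a free-fall phase followed by an impact spike). The following minimal sketch is purely illustrative, not the thesis code, and the thresholds are hypothetical values chosen only for the example:

```python
# Minimal illustrative sketch (not the thesis algorithm): threshold-based fall
# detection on a stream of 3D accelerometer samples expressed in g.
# Thresholds below are hypothetical.
import math

FREE_FALL_G = 0.4    # assumed near-free-fall threshold, in g
IMPACT_G = 2.5       # assumed impact threshold, in g

def detect_fall(samples):
    """samples: iterable of (ax, ay, az) in g.
    Returns True if a free-fall phase is later followed by an impact spike."""
    free_fall_seen = False
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude < FREE_FALL_G:
            free_fall_seen = True              # body briefly in free fall
        elif free_fall_seen and magnitude > IMPACT_G:
            return True                        # impact after free fall
    return False

# Example: simulated quiet standing, then free fall, then impact
trace = [(0.0, 0.0, 1.0)] * 50 + [(0.0, 0.0, 0.1)] * 10 + [(0.5, 0.3, 3.0)]
print(detect_fall(trace))  # True
```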
Abstract:
The contemporary media landscape is characterized by the emergence of hybrid forms of digital communication that contribute to the ongoing redefinition of our societies' cultural context. An incontrovertible consequence of this phenomenon is the new public dimension that characterizes the transmission of historical knowledge in the twenty-first century. Awareness of this new epistemic scenario has led us to reflect on the following methodological questions: what strategies should be devised to establish a communication system, based on new technology, that is scientifically rigorous but at the same time engaging for museum visitors and Internet users? How can a comparative analysis of ancient documentary sources form a solid base of information for the virtual reconstruction of thirteenth-century Bologna in the Metaverse? What benefits can the phenomenon of cross-mediality bring to virtual heritage? The implementation of a new version of the Nu.M.E. project made it possible to answer many of these questions. The investigation carried out between 2008 and 2010 has shown that real-time 3D graphics and collaborative virtual environments can indeed be feasible tools for representing the urban medieval landscape philologically and for communicating properly validated historical data to the general public. This research focuses on the study and implementation of a pipeline that permits mass communication of historical information about an area of vital importance in late medieval Bologna: Piazza di Porta Ravegnana. The originality of the project developed is not limited to the methodological dimension of historical research: the technological perspective adopted is an excellent example of the innovation that digital technologies can bring to cultural heritage. The main result of this research is the creation of Nu.M.E. 2010, a cross-media system of real-time 3D visualization based on some of the most advanced free and open source technologies available today free of charge.
Abstract:
The aim of this study is the creation of a Historical GIS that spatially references data retrieved from Italian and Catalan historical sources and records. This metasource was generated through the integral acquisition of source-oriented records and the insertion of mark-up fields, while maintaining, where possible, the original encoding of the source documents. In order to standardize the set of information contained in the original documents, and thus allow queries to the database, additional fields were introduced. Once the initial phase of data research and analysis was concluded, the new virtual source was published online within an open source WebGIS. The result is a dynamic, spatially referenced database of geo-historical information, configured so as to guarantee the best possible accessibility.
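To make the idea of a spatially referenced, source-oriented record concrete, the following sketch shows one possible encoding as a GeoJSON Feature. It is purely illustrative: the field names and values are hypothetical and are not taken from the Italian or Catalan sources used in the thesis.

```python
# Illustrative sketch only: a spatially referenced historical record encoded as
# a GeoJSON Feature. Field names and values are hypothetical.
import json

record = {
    "type": "Feature",
    "geometry": {
        "type": "Point",
        "coordinates": [11.3426, 44.4939],   # lon, lat (Bologna, WGS84)
    },
    "properties": {
        "source_id": "DOC-0001",         # hypothetical identifier of the archival record
        "original_text": "...",          # source-oriented transcription kept as in the document
        "normalized_place": "Bologna",   # standardized field added to enable queries
        "year": 1249,
    },
}

print(json.dumps(record, ensure_ascii=False, indent=2))
```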
Abstract:
Antibiotic resistance is a public health problem whose management requires a surveillance system based on the collection and analysis of laboratory epidemiological data. The doctoral project consisted in the development of a web application, usable at hospital level, for managing antibiotic susceptibility data of clinical isolates. A web platform associated with a relational database was created, so as to obtain a dynamic application that can be easily updated by entering new data without manually modifying the HTML pages that make up the application itself. The open-source database MySQL was used, since it offers numerous advantages: it is extremely stable, offers high performance, is supported by a large online community and is, moreover, free. The dynamic content of the web application must be generated by a scripting language that automates the insertion, modification, deletion and display of large quantities of data. PHP was chosen, an open-source language developed specifically for building dynamic web pages and perfectly suited to use with the MySQL database. The database architecture was defined by creating the tables containing the data and the relations between them: patient records, sample data, isolated microorganisms and antibiograms, with the interpretive categories for each antibiotic. Once tables and relations had been defined, the code associated with the main functions was written: manual entry of antibiograms, import of multiple antibiograms from files exported by automated instruments, modification and deletion of antibiograms previously entered into the system, and analysis of the data in the database, with trends in the prevalence of microbial species and in their resistance to antimicrobials, accompanied by charts. Development included continuous testing of the functions as they were implemented, using real clinical data; appropriate validation controls and a simple, clean graphical layout were also introduced.
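The thesis implements this with MySQL and PHP. Purely as an illustrative sketch of the kind of relational schema described (isolates linked to antibiograms carrying interpretive categories), the snippet below builds a minimal example; SQLite is used only to keep the example self-contained, and all table and column names are hypothetical:

```python
# Illustrative sketch only: a minimal relational schema of the kind described.
# The thesis uses MySQL + PHP; SQLite keeps this example self-contained.
# Table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE isolates (
    id INTEGER PRIMARY KEY,
    sample_code TEXT NOT NULL,
    organism TEXT NOT NULL,          -- isolated microorganism
    isolation_date TEXT NOT NULL
);
CREATE TABLE antibiograms (
    id INTEGER PRIMARY KEY,
    isolate_id INTEGER NOT NULL REFERENCES isolates(id),
    antibiotic TEXT NOT NULL,
    category TEXT NOT NULL           -- interpretive category: S, I or R
);
""")

# Hypothetical example rows
conn.execute("INSERT INTO isolates VALUES (1, 'S-0001', 'E. coli', '2010-01-15')")
conn.execute("INSERT INTO antibiograms VALUES (1, 1, 'ciprofloxacin', 'R')")

# Example analysis query: resistance counts per organism and antibiotic
for row in conn.execute("""
    SELECT i.organism, a.antibiotic, COUNT(*) AS n_resistant
    FROM antibiograms a JOIN isolates i ON a.isolate_id = i.id
    WHERE a.category = 'R'
    GROUP BY i.organism, a.antibiotic
"""):
    print(row)
```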
Abstract:
By pulling on protein homomers with the Atomic Force Microscope (AFM) and releasing the tension at different pulling speeds, dwell times and dwell distances, the observed force-response of the protein can be fitted with suitable theoretical models. To this end we developed mathematical procedures and open-source computer codes for driving such experiments and for fitting Bell's model to experimental protein unfolding forces and protein folding frequencies. We applied these techniques to the study of the proteins GB1 (the B1 IgG-binding domain of protein G from Streptococcus) and I27 (a module of human cardiac titin) in aqueous solutions of protecting osmolytes such as dimethyl sulfoxide (DMSO), glycerol and trimethylamine N-oxide (TMAO). In order to gain a molecular understanding of the experimental results we developed an Ising-like model for proteins that incorporates the osmophobic nature of their backbone. The model benefits from analytical thermodynamics and kinetics amenable to Monte Carlo simulation. The prevailing view used to be that small protecting osmolytes bridge the separating beta-strands of proteins with mechanical resistance, presumably shifting the transition state to significantly higher distances that correlate with the molecular size of the osmolyte. Our experiments showed instead that protecting osmolytes slow down protein unfolding and speed up protein folding at physiological pH without shifting the protein transition state along the mechanical reaction coordinate. Together with the theoretical results of the Ising model, our results lend support to the osmophobic theory, according to which osmolyte stabilisation results from the preferential exclusion of osmolyte molecules from the protein backbone. The results obtained during this thesis work have markedly improved our understanding of the strategy selected by Nature to strengthen protein stability in hostile environments, shifting the focus from hypothetical protein-osmolyte interactions to a more general mechanism based on the osmophobicity of the protein backbone.
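Bell's model, referred to above, relates the force-dependent unfolding rate to the force applied along the pulling coordinate. In its standard form (symbols follow common usage and are not necessarily the thesis notation):

```latex
k_u(F) = k_u^{0} \exp\!\left(\frac{F\,\Delta x_u}{k_B T}\right)
```

where k_u^0 is the unfolding rate at zero force, Δx_u the distance from the folded state to the transition state along the mechanical reaction coordinate, k_B Boltzmann's constant and T the absolute temperature. The experimental finding that protecting osmolytes slow unfolding without shifting the transition state corresponds, in these terms, to a reduced k_u^0 with an unchanged Δx_u.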
Abstract:
The new generation of multicore processors opens new perspectives for the design of embedded systems. Multiprocessing, however, poses new challenges to the scheduling of real-time applications, in which ever-increasing computational demands are constantly flanked by the need to meet critical time constraints. Many research works have contributed to this field by introducing new advanced scheduling algorithms. However, although many of these works have solidly demonstrated their effectiveness, the actual support for multiprocessor real-time scheduling offered by current operating systems is still very limited. This dissertation deals with the implementation aspects of real-time schedulers in modern embedded multiprocessor systems. The first contribution is an open-source scheduling framework capable of realizing complex multiprocessor scheduling policies, such as G-EDF, on conventional operating systems, exploiting only their native scheduler from user space. A set of experimental evaluations compares the proposed solution to other research projects that pursue the same goals by means of kernel modifications, highlighting comparable scheduling performance. The principles that underpin the operation of the framework, originally designed for symmetric multiprocessors, have been further extended first to asymmetric ones, which are subject to major restrictions such as the lack of support for task migration, and later to re-programmable hardware architectures (FPGAs). In the latter case, this work introduces a scheduling accelerator which offloads most of the scheduling operations to hardware and exhibits extremely low scheduling jitter. The realization of a portable scheduling framework presented many interesting software challenges, one of which was timekeeping. In this regard, a further contribution is a novel data structure, called the addressable binary heap (ABH). The ABH, which is conceptually a pointer-based implementation of a binary heap, shows very interesting average- and worst-case performance when addressing the problem of tick-less timekeeping with high-resolution timers.
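G-EDF (Global Earliest Deadline First), named above, dispatches at every scheduling point the m ready jobs with the earliest absolute deadlines onto the m processors. The following toy sketch illustrates only the selection rule; it is not the user-space framework developed in the dissertation:

```python
# Minimal illustrative sketch of the G-EDF selection rule: the m ready jobs
# with the earliest absolute deadlines run on the m processors. This is a toy
# model, not the thesis framework.
import heapq
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    deadline: float  # absolute deadline

def gedf_dispatch(ready_jobs, num_cpus):
    """Return the jobs to dispatch now, at most one per CPU."""
    return heapq.nsmallest(num_cpus, ready_jobs, key=lambda j: j.deadline)

ready = [Job("tau1", 12.0), Job("tau2", 5.0), Job("tau3", 9.0), Job("tau4", 20.0)]
print([j.name for j in gedf_dispatch(ready, num_cpus=2)])  # ['tau2', 'tau3']
```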
Abstract:
Over the last 60 years, computers and software have enabled incredible advances in every field. Nowadays, however, these systems are so complicated that it is difficult to understand whether they meet some requirement or are able to exhibit some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to performing this conformance check, so as to identify any deviation from the desired behaviour as soon as possible and, where possible, apply corrections. The declarative framework that implements our approach, entirely developed on the promising open source forward-chaining Production Rule System (PRS) named Drools, consists of three components: 1. a monitoring module based on a novel, efficient implementation of Event Calculus (EC); 2. a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy and rule-based reasoning; 3. a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system. The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if the corrective actions are explicitly given, the reactive nature of this methodology makes it possible to reconcile any deviation from the desired behaviour as soon as it is detected. In conclusion, the proposed methodology brings some advancement towards solving the problem of conformance checking, helping to fill the gap between humans and increasingly complex technology.
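The monitoring module relies on the Event Calculus, in which fluents are initiated and terminated by events and queried at a time point. The thesis implements this in Drools; purely as an illustration of the underlying idea, a simplified sketch in Python (the fluent and event names are hypothetical):

```python
# Purely illustrative sketch of the Event Calculus idea behind the monitoring
# module (the thesis implements it in Drools, not Python): a fluent holds at
# time t if an earlier event initiated it and no later event before t
# terminated it. Fluent and event names are hypothetical.
INITIATES = {"door_opened": "door_open"}
TERMINATES = {"door_closed": "door_open"}

def holds_at(fluent, t, narrative):
    """narrative: list of (timestamp, event) pairs, in any order."""
    state = False
    for ts, event in sorted(narrative):
        if ts >= t:
            break
        if INITIATES.get(event) == fluent:
            state = True
        elif TERMINATES.get(event) == fluent:
            state = False
    return state

events = [(1, "door_opened"), (5, "door_closed"), (9, "door_opened")]
print(holds_at("door_open", 3, events))   # True
print(holds_at("door_open", 7, events))   # False
```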
Abstract:
Coastal flooding poses serious threats to coastal areas around the world, causing billions of dollars in damage to property and infrastructure and threatening the lives of millions of people. Disaster management and risk assessment therefore aim at detecting vulnerabilities and capacities in order to reduce coastal flood disaster risk. In particular, non-specialized researchers, emergency management personnel and land use planners require an accurate, inexpensive method to determine and map the risk associated with storm surge events and with the long-term sea level rise associated with climate change. This study contributes to the spatial evaluation and mapping of social, economic and environmental vulnerability and risk at sub-national scale through the development of appropriate tools and methods successfully embedded in a Web-GIS Decision Support System. A new set of raster-based models was studied and developed so as to be easily implemented in the Web-GIS framework, with the purpose of quickly assessing and mapping flood hazard characteristics, damage and vulnerability in a multi-criteria approach. The Web-GIS DSS is developed using open source software and programming languages, and its main peculiarity is that it is available and usable by coastal managers and land use planners without requiring a strong scientific background in hydraulic engineering. The effectiveness of the system in coastal risk assessment is evaluated through its application to a real case study.
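As an illustration of the kind of raster-based screening such a system can perform (not the thesis models themselves), a minimal "bathtub" flood-extent sketch with NumPy flags cells whose elevation falls below an assumed surge level and computes the flood depth; the DEM values and surge level are hypothetical:

```python
# Minimal illustrative sketch (not the thesis models): "bathtub" raster
# screening of coastal flooding. Cells whose elevation lies below the assumed
# surge level are flagged as flooded; depth is the difference.
import numpy as np

dem = np.array([[0.2, 0.8, 1.5],
                [0.5, 1.2, 2.0],
                [1.0, 1.8, 2.5]])    # elevation above mean sea level, metres (hypothetical)

surge_level = 1.0                     # assumed storm-surge water level, metres

flooded = dem < surge_level                        # boolean flood-extent raster
depth = np.where(flooded, surge_level - dem, 0.0)  # flood depth raster, metres

print(flooded)
print(np.round(depth, 2))
```

Note that a bathtub screening ignores hydraulic connectivity and flow dynamics, which is why more refined raster models and a multi-criteria treatment, as described above, are needed for actual risk assessment.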