7 results for Election Counting and Reporting Software,
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
The work presented in this thesis has been part of a Cranfield University research project. The thesis aims to design a flight control law for large cargo aircraft using predictive control, which can ensure that the aircraft follows the flight path accurately and on time. In particular, this work involves the modelling of a Boeing C-17 Globemaster III 6DOF model (used as a case study) using DATCOM and Matlab/Simulink software. A predictive control algorithm has then been developed; the majority of the work is done in a Matlab/Simulink environment. Finally, the predictive control algorithm has been applied to the aircraft model, and its performance in tracking a given trajectory, optimized through a 4DT Research Software, has been evaluated.
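To make the idea of predictive control concrete, the following is a minimal sketch of one unconstrained model-predictive-control step for a linear system, written in Python as an illustration only: it is not the thesis' actual controller, and the double-integrator matrices at the bottom are hypothetical numbers.

```python
import numpy as np

def mpc_step(A, B, x0, ref, horizon, u_weight=0.1):
    """One unconstrained MPC step: stack the prediction model
    x_{k+1} = A x_k + B u_k over the horizon and solve the
    resulting least-squares tracking problem."""
    n, m = B.shape
    # Prediction matrices: X = F x0 + G U
    F = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(horizon)])
    G = np.zeros((n * horizon, m * horizon))
    for i in range(horizon):
        for j in range(i + 1):
            G[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B
    # Minimise ||G U - (ref - F x0)||^2 + u_weight ||U||^2
    H = G.T @ G + u_weight * np.eye(m * horizon)
    U = np.linalg.solve(H, G.T @ (ref - F @ x0))
    return U[:m]  # apply only the first input (receding horizon)

# Toy double-integrator example (hypothetical numbers)
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.005], [0.1]])
x = np.array([0.0, 0.0])
ref = np.tile([1.0, 0.0], 10)  # hold position 1, zero velocity
u = mpc_step(A, B, x, ref, horizon=10)
```

Recomputing the input at every step from the current state is what lets such a controller track a trajectory "exactly and on time" despite disturbances.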
Abstract:
Our generation of computational scientists is living in an exciting time: not only do we get to pioneer important algorithms and computations, we also get to set standards on how computational research should be conducted and published. From Euclid’s reasoning and Galileo’s experiments, it took hundreds of years for the theoretical and experimental branches of science to develop standards for publication and peer review. Computational science, rightly regarded as the third branch, can walk the same road much faster. The success and credibility of science are anchored in the willingness of scientists to expose their ideas and results to independent testing and replication by other scientists. This requires the complete and open exchange of data, procedures and materials. The idea of “replication by other scientists” in reference to computations is more commonly known as “reproducible research”. In this context, the journal “EAI Endorsed Transactions on Performance & Modeling, Simulation, Experimentation and Complex Systems” had the exciting and original idea of letting scientists submit, together with the article, the computational materials (software, data, etc.) used to produce its contents. The goal of this procedure is to allow the scientific community to verify the content of the paper by reproducing it on the platform, independently of the chosen OS, to confirm or invalidate it, and especially to allow its reuse to produce new results. This procedure is of little help, however, without a minimum of methodological support: the raw data sets and the software are difficult to exploit without the logic that guided their use or production. This led us to think that, in addition to the data sets and the software, one more element must be provided: the workflow that ties all of them together.
Abstract:
The aim of this Thesis work is to study the multi-frequency properties of the Ultra Luminous Infrared Galaxy (ULIRG) IRAS 00183-7111 (I00183) at z = 0.327, connecting ALMA sub-mm/mm observations with those at high energies in order to place constraints on the properties of its central power source and verify whether the gas traced by the CO may be responsible for the obscuration observed in X-rays. I00183 was selected from the so-called Spoon diagnostic diagram (Spoon et al. 2007) for mid-infrared spectra of infrared galaxies, based on the equivalent width of the 6.2 μm Polycyclic Aromatic Hydrocarbon (PAH) emission feature versus the 9.7 μm silicate strength. Such features are a powerful tool to investigate the contribution of star formation and AGN activity in this class of objects. I00183 lies in the top-left region of the plot, where the most obscured sources, characterized by a strong Si absorption feature, are located. To link the sub-mm/mm to the X-ray properties of I00183, ALMA archival Cycle 0 data in Band 3 (87 GHz) and Band 6 (270 GHz) have been calibrated and analyzed using the CASA software. ALMA Cycle 0 was the Early Science program, for which data reprocessing is strongly suggested. The main work of this Thesis consisted of reprocessing the raw data to provide an improvement with respect to the available archival products and results, which were obtained using standard procedures. The high-energy data consist of Chandra, XMM-Newton and NuSTAR observations, which provide a broad coverage of the spectrum in the 0.5 − 30 keV energy range. Chandra and XMM-Newton archival data were used, with exposure times of 22 and 22.2 ks, respectively; their reduction was carried out using the CIAO and SAS software. The 100 ks NuSTAR data are still proprietary, and the spectra were obtained by courtesy of the PI (K. Iwasawa).
A detailed spectral analysis was performed using the XSPEC software; the spectral shape was reproduced starting from simple phenomenological models, and then more physical models were introduced to account for the complex mechanisms at work in this source. In Chapter 1, an overview of the scientific background is given, with a focus on the target, I00183, and the Spoon diagnostic diagram from which it was originally selected. In Chapter 2, the basic principles of interferometry are briefly introduced, with a description of the calibration theory applied to interferometric observations. In Chapter 3, ALMA and its capabilities, both current and future, are presented, explaining also the complex structure of the ALMA archive. In Chapter 4, the calibration of the ALMA data is presented and discussed, showing also the obtained imaging products. In Chapter 5, the analysis and discussion of the main results obtained from the ALMA data are presented. In Chapter 6, the X-ray observations, data reduction and spectral analysis are reported, with a brief introduction to the basic principles of X-ray astronomy and the instruments with which the observations were carried out. Finally, the overall work is summarized, with particular emphasis on the main results obtained and possible future perspectives.
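As a rough illustration of what a "simple phenomenological model" for this kind of source looks like, the sketch below evaluates an absorbed power law over the 0.5–30 keV band mentioned above. This is not the thesis' fitted model: the cross-section scaling is a deliberately crude assumption (XSPEC's absorption models use tabulated photoelectric cross-sections), and all parameter values are hypothetical.

```python
import numpy as np

def absorbed_powerlaw(energy_keV, norm, gamma, nh_1e22):
    """Phenomenological X-ray model: power law times photoelectric
    absorption, with a crude sigma(E) ~ E^-3 scaling (illustrative
    only; real fits use tabulated cross-sections)."""
    sigma = 2.0e-3 * energy_keV**-3.0   # hypothetical cross-section scale
    return norm * energy_keV**-gamma * np.exp(-nh_1e22 * sigma)

E = np.linspace(0.5, 30.0, 100)   # band covered by Chandra/XMM/NuSTAR here
flux = absorbed_powerlaw(E, norm=1.0, gamma=1.8, nh_1e22=50.0)
```

The exponential term suppresses the soft end of the spectrum, which is why heavily obscured sources like I00183 require broad-band coverage up to NuSTAR energies to constrain the intrinsic power law.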
Abstract:
Vision systems are powerful tools playing an increasingly important role in modern industry, where they detect defects and maintain product standards. With the growing availability of affordable industrial cameras, computer vision algorithms have been increasingly applied to the monitoring of industrial manufacturing processes. Until a few years ago, industrial computer vision applications relied only on ad-hoc algorithms designed for the specific object and acquisition setup being monitored, with a strong focus on co-designing the acquisition and processing pipeline. Deep learning has overcome these limits, providing greater flexibility and faster re-configuration. In this work, the process to be inspected is the formation of packs of vials entering a freeze-dryer, a common scenario in pharmaceutical active ingredient packaging lines. To ensure that the machine produces proper packs, a vision system is installed at the entrance of the freeze-dryer to detect possible anomalies, with execution times compatible with the production specifications. Other constraints come from the sterility and safety standards required in pharmaceutical manufacturing. This work presents an overview of the production line, with particular focus on the vision system designed, and of all the trials conducted to obtain the final performance. Transfer learning, which alleviates the need for large amounts of training data, combined with data augmentation methods consisting in the generation of synthetic images, was used to effectively increase performance while reducing the cost of data acquisition and annotation. The proposed vision algorithm is composed of two main subtasks, devoted respectively to vial counting and discrepancy detection. The first was trained on more than 23k vials (about 300 images) and tested on 5k more (about 75 images), whereas 60 training images and 52 testing images were used for the second.
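The data-augmentation idea mentioned above can be sketched in a few lines: generate synthetic training variants by perturbing each acquired image. This is a generic illustration under assumed transformations (flip, brightness, small shift), not the thesis' actual augmentation pipeline.

```python
import numpy as np

def augment(image, rng):
    """Generate a synthetic variant of a training image through
    simple geometric and photometric perturbations."""
    out = image.copy()
    if rng.random() < 0.5:           # random horizontal flip
        out = out[:, ::-1]
    gain = rng.uniform(0.8, 1.2)     # random brightness change
    out = np.clip(out * gain, 0.0, 1.0)
    shift = rng.integers(-2, 3)      # small horizontal translation
    out = np.roll(out, shift, axis=1)
    return out

rng = np.random.default_rng(0)
img = rng.random((32, 32))                      # stand-in for a vial image
batch = [augment(img, rng) for _ in range(8)]   # 8 synthetic variants
```

Each real annotated image thus yields several labeled training samples, which is how augmentation reduces the cost of data acquisition and annotation.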
Abstract:
This work, the result of an internship at the DIN laboratory in Montecuccolino, presents a software architecture containing all the control laws needed to carry out a Pick and Place task with a Delta Robot. The work is organized into three phases: an analysis of the Robot, a calibration of the assembled system, and the development of the control software, followed by experimental validation of the results obtained.
Abstract:
The increasing number of Resident Space Objects (RSOs) is a threat to spaceflight operations. Conjunction Data Messages (CDMs) are sent to satellite operators to warn them of possible future collisions and report their probabilities. The research project described herein developed an algorithm able to update the collision probability directly on board, starting from the CDMs and from the state vector of the host satellite, which is constantly updated thanks to an onboard GNSS receiver. A large set of methods for computing the collision probability was analyzed in order to find the best ones for this application. The selected algorithm was then tested to assess and improve its performance. Finally, parts of the algorithm and external software were implemented on a Raspberry Pi 3B+ board to demonstrate the compatibility of this approach with computational resources similar to those typically available onboard modern spacecraft.
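One common family of collision-probability methods integrates a bivariate Gaussian (the combined position covariance of the two objects) over the hard-body circle in the encounter plane. The sketch below does this by Monte Carlo as an illustration of the quantity being computed; it is not the on-board algorithm selected in the thesis, and the CDM-like numbers are hypothetical.

```python
import numpy as np

def collision_probability(miss, cov, radius, n=200_000, seed=0):
    """Short-encounter collision probability in the 2-D encounter
    plane: integrate a bivariate Gaussian centred on the miss
    vector over the hard-body circle, via Monte Carlo sampling."""
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(miss, cov, size=n)
    inside = np.linalg.norm(samples, axis=1) < radius
    return inside.mean()

# Hypothetical CDM-like numbers: 200 m miss distance,
# 100 m position sigmas, 20 m combined hard-body radius.
miss = np.array([200.0, 0.0])
cov = np.diag([100.0**2, 100.0**2])
pc = collision_probability(miss, cov, radius=20.0)
```

Deterministic quadrature versions of this integral (rather than sampling) are what make such computations cheap enough for hardware like a Raspberry Pi 3B+.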
Abstract:
Planning is an important sub-field of artificial intelligence (AI) focusing on letting intelligent agents deliberate on the most adequate course of action to attain their goals. Thanks to the recent boost in the number of critical domains and systems which exploit planning for their internal procedures, there is an increasing need for planning systems to become more transparent and trustworthy. Along this line, planning systems are now required to produce not only plans but also explanations about those plans, or the way they were attained. To address this issue, a new research area is emerging in the AI panorama: eXplainable AI (XAI), within which explainable planning (XAIP) is a pivotal sub-field. As a recent domain, XAIP is far from mature. No consensus has been reached in the literature about what explanations are, how they should be computed, and what they should explain in the first place. Furthermore, existing contributions are mostly theoretical, and software implementations are rarely more than preliminary. To overcome such issues, in this thesis we design an explainable planning framework bridging the gap between theoretical contributions from the literature and software implementations. More precisely, taking inspiration from the state of the art, we develop a formal model for XAIP, and the software tool enabling its practical exploitation. Accordingly, the contribution of this thesis is fourfold. First, we review the state of the art of XAIP, supplying an outline of its most significant contributions from the literature. We then generalise the aforementioned contributions into a unified model for XAIP, aimed at supporting model-based contrastive explanations. Next, we design and implement an algorithm-agnostic library for XAIP based on our model. Finally, we validate our library from a technological perspective, via an extensive testing suite, and we assess its performance and usability through a set of benchmarks and end-to-end examples.
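To give a flavour of what "model-based contrastive explanations" means, the toy sketch below validates and costs plans against a STRIPS-style model, which is the basis for answering "why this plan rather than the alternative?": the alternative is either invalid under the model or costlier. This is a generic illustration of the concept, not the API of the library developed in the thesis, and the domain is invented.

```python
def apply(state, action):
    """Apply a STRIPS-style action (pre, add, delete, cost)
    if its preconditions hold in the given state."""
    pre, add, delete, _ = action
    if not pre <= state:
        return None
    return (state - delete) | add

def plan_cost(init, goal, plan):
    """Total cost of a plan if it is valid and reaches the goal,
    otherwise None."""
    state = set(init)
    cost = 0
    for a in plan:
        state = apply(state, a)
        if state is None:
            return None
        cost += a[3]
    return cost if goal <= state else None

# Invented domain: move directly vs. via a waypoint.
direct = ({"atA"}, {"atC"}, {"atA"}, 5)
leg1 = ({"atA"}, {"atB"}, {"atA"}, 2)
leg2 = ({"atB"}, {"atC"}, {"atB"}, 2)
planner_plan = [leg1, leg2]       # cost 4
user_alternative = [direct]       # cost 5
c1 = plan_cost({"atA"}, {"atC"}, planner_plan)
c2 = plan_cost({"atA"}, {"atC"}, user_alternative)
```

A contrastive explainer built on such a model can then answer: the planner's plan was preferred because both plans are valid but the alternative costs more (5 vs. 4).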