11 results for Development of large software systems,
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
In recent years, locating people and objects and communicating with them in real time has become part of everyday life. Nowadays, the state of the art in indoor location systems lacks a dominant technology, unlike outdoor location, where GPS dominates. In fact, each indoor location technology presents a set of features that prevents its use across all application scenarios, yet, due to its characteristics, it can coexist well with similar technologies, without any one being dominant or more widely adopted than the other indoor location systems. In this context, the European project SELECT studies the opportunity of combining all these different features in an innovative system that can be used in a large number of application scenarios. The goal of this project is to realize a wireless system in which a network of fixed readers queries one or more tags attached to the objects to be located. The SELECT consortium is composed of European institutions and companies, including Datalogic S.p.A. and CNIT, which deal with the software and firmware development of the baseband receiving section of the readers, whose function is to acquire and process the information received from generic tagged objects. Since the SELECT project has a highly innovative content, one of the key stages of the system design is the debug phase. This work aims to study and develop tools and techniques for debugging the firmware of the baseband receiving section of the readers.
Abstract:
The objective of this thesis project, developed within the Line Control & Software Engineering team of the G.D company, is to analyze and identify the appropriate tool to automate the HW configuration process using Beckhoff technologies by importing data from an ECAD tool. This would save a great deal of time, since the I/O topology created as part of the electrical planning is presently imported manually into the related SW project of the machine. Moreover, a manual import is more error-prone than an automatic configuration tool because of human mistakes. First, an introduction to TwinCAT 3, EtherCAT and the Automation Interface is provided; then the official Beckhoff tool, XCAD Interface, is analyzed, together with the requirements it places on the electrical planning: the interface is realized by means of the AutomationML format. Finally, due to some observed limitations, a company-internal tool is designed and implemented. Tests and validation of the tool are performed on a sample production line of the company.
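AutomationML is an XML-based exchange format (built on CAEX), so the core of such an import tool is walking the instance hierarchy of the exported file. As a rough illustration only — the element names below follow CAEX conventions, but the fragment is hypothetical and far simpler than a real ECAD export — a sketch in Python:

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily simplified AutomationML/CAEX fragment: a real export
# from an ECAD tool carries many more attributes and nested hierarchies.
AML_SAMPLE = """<CAEXFile>
  <InstanceHierarchy Name="Plant">
    <InternalElement Name="EK1100" RefBaseSystemUnitPath="EtherCAT/Coupler">
      <InternalElement Name="EL1008" RefBaseSystemUnitPath="EtherCAT/DI8"/>
      <InternalElement Name="EL2008" RefBaseSystemUnitPath="EtherCAT/DO8"/>
    </InternalElement>
  </InstanceHierarchy>
</CAEXFile>"""

def extract_topology(xml_text):
    """Walk the instance hierarchy and return (name, type) pairs in document order."""
    root = ET.fromstring(xml_text)
    return [(el.get("Name"), el.get("RefBaseSystemUnitPath"))
            for el in root.iter("InternalElement")]

for name, kind in extract_topology(AML_SAMPLE):
    print(name, kind)
```

A tool along these lines would then feed each (name, type) pair to the TwinCAT Automation Interface to instantiate the corresponding EtherCAT device in the SW project.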
Abstract:
This study investigates the growth and metabolite production of microorganisms causing spoilage of Atlantic cod (Gadus morhua) fillets packaged under air and modified atmosphere (60 % CO2, 40 % O2). Samples were provided by two different retailers (A and B), and packaged fillets were stored at 4 °C and 8 °C. The microbiological quality and metabolite production of cod fillets stored in MAP at 4 °C, MAP at 8 °C and air were monitored over 13 days, 7 days and 3 days of storage, respectively. Volatile compound concentrations in the headspace were quantified by selected ion flow tube mass spectrometry, and their correlation with microbiological spoilage was studied. The onset of volatile compound detection was observed mostly around 7 log cfu/g of total psychrotrophic count. Trimethylamine and dimethyl sulfide were the dominant volatiles in all tested storage conditions; nevertheless, there was no close correlation between the concentration of each main VOC and the percentage of rejection based on sensory evaluation. It was therefore concluded that these compounds cannot be considered sole indicators of the quality of cod fillets stored in modified atmosphere and air.
Abstract:
This master thesis work focuses on the development of a predictive EHC control function for a diesel plug-in hybrid electric vehicle equipped with a EURO 7 compliant exhaust aftertreatment system (EATS), with the purpose of showing the advantages provided by a predictive control strategy with respect to a rule-based one. A preliminary step is the definition of an accurate powertrain and EATS physical model, starting from already existing and validated applications. Then, a rule-based control strategy managing the torque split between the electric motor (EM) and the internal combustion engine (ICE) is developed and calibrated, with the main target of limiting tailpipe NOx emissions by taking into account EM and ICE operating conditions together with EATS conversion efficiency. The information available from vehicle connectivity is used to reconstruct the future driving scenario, also referred to as the electronic horizon (eHorizon), and in particular to predict the first ICE start. Based on this knowledge, an EATS pre-heating phase can be planned to avoid low pollutant conversion efficiencies, thus preventing the high NOx emissions caused by an engine cold start. Consequently, the final NOx emissions over the complete driving cycle are strongly reduced, making it possible to comply with the limits potentially set by the incoming EURO 7 regulation. Moreover, for the same NOx emission target, the gain achieved by the predictive EHC control function allows a simplified EATS layout to be considered, thus reducing the related manufacturing cost. The promising results achieved in terms of NOx emission reduction show the effectiveness of a predictive control strategy focused on EATS thermal management, and highlight the potential of a complete integration and parallel development of the vehicle physical systems, control software and connectivity data management involved.
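The pre-heating decision described above amounts to working backwards from the predicted ICE start. A toy sketch (all names, units and the safety margin are illustrative assumptions, not the thesis' actual control function; EHC is assumed here to denote the electrically heated catalyst):

```python
def plan_preheat(t_ice_start_pred, warmup_time_s, t_now, safety_margin_s=10.0):
    """Decide whether EHC heating should be active at time t_now.

    t_ice_start_pred : predicted ICE first-start time from the eHorizon [s]
    warmup_time_s    : time the EATS needs to reach light-off temperature [s]
    Returns True once heating must begin so the catalyst is warm at ICE start.
    """
    t_heat_on = t_ice_start_pred - warmup_time_s - safety_margin_s
    return t_now >= t_heat_on

# Example: ICE predicted to start at t = 300 s, catalyst needs 60 s to warm up.
print(plan_preheat(300.0, 60.0, t_now=200.0))  # → False (too early to heat)
print(plan_preheat(300.0, 60.0, t_now=240.0))  # → True (start heating now)
```

A rule-based strategy, by contrast, can only react once the ICE start is imminent or has already occurred, which is exactly where the cold-start NOx penalty arises.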
Abstract:
In the field of industrial automation, there is an increasing need for optimal control systems with low tracking errors and low power and energy consumption. The motors considered here are mainly Permanent Magnet Synchronous Motors (PMSMs), controlled by three different controllers: a position controller, a speed controller, and a current controller. This thesis therefore acts on the gains of the first two controllers, using the TwinCAT 3 software to find the best set of parameters. To do this, starting from the default parameters recommended by TwinCAT, two main methods were applied and then compared: the Ziegler-Nichols method, which is a tabular method, and advanced tuning, an auto-tuning software method of TwinCAT. To analyse which set of parameters was the best, several experiments were performed for each case, using the Motion Control Function Blocks. Moreover, some machines, such as large robotic arms, have vibration problems. To analyse them in detail, the Bode Plot tool was used, which highlights the frequencies at which resonance and anti-resonance peaks occur. This tool also makes it easier to decide which filters to apply, and where, in order to improve the control.
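The tabular nature of the Ziegler-Nichols method mentioned above can be made concrete: once the ultimate gain Ku and the ultimate oscillation period Tu have been found experimentally, the classic closed-loop table gives the PID parameters directly. A minimal sketch of the generic textbook rules (not the thesis' TwinCAT-specific settings):

```python
def ziegler_nichols_pid(ku, tu):
    """Classic closed-loop Ziegler-Nichols rules: from the ultimate gain Ku and
    the ultimate oscillation period Tu, return the PID parameters
    (proportional gain Kp, integral time Ti, derivative time Td)."""
    return {"Kp": 0.6 * ku, "Ti": tu / 2.0, "Td": tu / 8.0}

# Example: sustained oscillation found at Ku = 10 with period Tu = 2 s.
gains = ziegler_nichols_pid(10.0, 2.0)
print(gains)
```

The resulting set is only a starting point; as in the thesis, it is then compared experimentally against the auto-tuned parameters.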
Abstract:
The seismic behaviour of one-storey asymmetric structures has been studied since the 1970s by a number of research studies, which identified the coupled nature of the translational-torsional response of this class of systems, leading to severe displacement magnifications at the perimeter frames and therefore to a significant increase of the local peak seismic demand on the structural elements with respect to equivalent non-eccentric systems (Kan and Chopra 1987). These studies identified the fundamental parameters governing the torsional behaviour of in-plan asymmetric structures (such as the fundamental period TL, the normalized eccentricity e, and the torsional-to-lateral frequency ratio Ωϑ) and the associated trends of behaviour. It has been clearly recognized that asymmetric structures characterized by Ωϑ > 1, referred to as torsionally-stiff systems, behave quite differently from structures with Ωϑ < 1, referred to as torsionally-flexible systems. Previous research works by some of the authors proposed a simple closed-form estimation of the maximum torsional response of one-storey elastic systems (Trombetti et al. 2005 and Palermo et al. 2010), leading to the so-called “Alpha Method” for the evaluation of the displacement magnification factors at the corner sides. The present paper provides an upgrade of the “Alpha Method” that removes the assumption of linear elastic response of the system. The main objective is to evaluate how the excursion of the structural elements into the inelastic field (due to the yield strength being reached) affects the displacement demand of one-storey in-plan asymmetric structures. The system proposed by Chopra and Goel in 2007, which is claimed to capture the main features of the non-linear response of in-plan asymmetric systems, is used to perform a large parametric analysis varying all the fundamental parameters of the system, including the inelastic demand, by varying the force reduction factor from 2 to 5.
Magnification factors for different force reduction factors are proposed, and comparisons with the results obtained from linear analysis are provided.
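For reference, the governing parameters cited above are usually defined as follows in the literature (standard definitions; the exact normalization adopted by the authors may differ):

```latex
% e = static eccentricity between centre of mass and centre of stiffness
% r = mass radius of gyration about the vertical axis through the centre of mass
\[
  \bar{e} = \frac{e}{r}, \qquad
  \Omega_\vartheta = \frac{\omega_\vartheta}{\omega_L}, \qquad
  \omega_L = \sqrt{\frac{k_L}{m}}, \qquad
  \omega_\vartheta = \sqrt{\frac{k_\vartheta}{m\,r^{2}}}
\]
```

With these definitions, Ωϑ > 1 (torsionally-stiff) means the uncoupled torsional frequency exceeds the uncoupled lateral one.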
Abstract:
This thesis is set within the STOCKMAPPING project, one of the studies developed in the framework of the RITMARE Flagship project. The main goals of STOCKMAPPING were the creation of a genomic mapping for stocks of demersal target species and the assembly of a population genomics database, in order to identify stocks and stock boundaries. The thesis focuses on three main objectives, representing the core of the initial assessment of the methodologies and structure to be applied to the entire STOCKMAPPING project: definition of an analytical design to identify and locate stocks and stock boundaries of Mullus barbatus; application of a multidisciplinary approach to validate the biological methods; and an initial assessment and improvement of the genotyping-by-sequencing technique utilized (2b-RAD). The first step is the definition of an analytical design that takes into account the biological characteristics of red mullet while being representative of the STOCKMAPPING commitments. In this framework, a reduction and selection step was needed due to budget cuts, and the sampling areas were ranked according to four priorities. To guarantee a multidisciplinary approach, the biological data associated with the collected samples were used to investigate differences between sampling areas and GSAs. Genomic techniques were applied to red mullet for the first time, so an initial assessment of the molecular protocols for DNA extraction and 2b-RAD processing was needed. In the end, 192 good-quality DNA samples were extracted and eight samples were processed with 2b-RAD. Using the software Stacks for sequence analysis, a large number of SNP markers were identified among the eight samples. Several tests were performed, changing the main parameters of the Stacks pipeline, in order to identify the most informative and functional set of parameters.
Abstract:
This thesis is developed in the context of the RITMARE project WP1, whose main objective is the development of a sustainable fishery through the identification of population boundaries in commercially important species in Italian seas. Three main objectives are discussed in order to help reach the overall purpose of identifying stock boundaries in Parapenaeus longirostris: 1) development of a sampling design representative of the Italian seas; 2) evaluation of the 2b-RAD protocol; 3) investigation of populations through biological data analysis. First of all, we defined and carried out a sampling design that properly represents all Italian seas. We then used information and data about the distribution of nursery areas, the abundance of populations and the importance of P. longirostris in local fisheries to develop an experimental design that prioritizes the most important areas, so as to maximize the results with the available project funds. We introduced for the first time the use of 2b-RAD on this species, a genotyping method based on sequencing the uniform fragments produced by type IIB restriction endonucleases; thanks to this method we were able to move from genetics to the more complex field of genomics. In order to proceed with 2b-RAD, we performed several tests to identify the best DNA extraction kit and protocol, and finally obtained 192 high-quality DNA extracts ready to be processed. We tested 2b-RAD with five samples and, after high-throughput sequencing of the libraries, used the software Stacks to analyze the sequences. We obtained positive results, identifying a large number of SNP markers among the five samples. To guarantee a multidisciplinary approach, we used the biological data associated with the collected samples to investigate differences between geographical samples. Such an approach ensures continuity with other projects, for instance STOCKMED, which likewise utilizes a combination of molecular and biological analyses.
Abstract:
In the last decade, the near-surface mounted (NSM) strengthening technique using carbon fibre reinforced polymers (CFRP) has been increasingly used to improve the load-carrying capacity of concrete members. Compared to externally bonded reinforcement (EBR), the NSM system presents considerable advantages. The technique consists of inserting carbon fibre reinforced polymer laminate strips into pre-cut slits opened in the concrete cover of the elements to be strengthened. The CFRP reinforcement is bonded to the concrete with an appropriate groove filler, typically epoxy adhesive or cement grout. Up to now, research efforts have mainly focused on structural aspects such as bond behaviour, flexural and/or shear strengthening effectiveness, and the energy dissipation capacity of beam-column joints. In such research works, as well as in field applications, the most widespread adhesives used to bond the reinforcement to concrete are epoxy resins. It is largely accepted that the performance of the whole NSM application strongly depends on the mechanical properties of the epoxy resins, for which proper curing conditions must be assured. Therefore, non-destructive methods that allow monitoring the curing process of epoxy resins in NSM CFRP systems are desirable, in order to obtain continuous information about the effectiveness of curing and the expected bond behaviour of CFRP/adhesive/concrete systems. The experimental research was developed at the Laboratory of the Structural Division of the Civil Engineering Department of the University of Minho in Guimarães, Portugal (LEST). The main objective was to develop and propose a new method for continuous quality control of the curing of epoxy resins applied in NSM CFRP strengthening systems.
This objective is pursued through the adaptation of an existing technique, termed EMM-ARM (Elasticity Modulus Monitoring through Ambient Response Method), originally developed for monitoring the early stiffness evolution of cement-based materials. The experimental program was composed of two parts: (i) direct pull-out tests on concrete specimens strengthened with NSM CFRP laminate strips, conducted to assess the evolution of the bond behaviour between CFRP and concrete from early ages; and (ii) EMM-ARM tests, carried out to monitor the progressive stiffness development of the structural adhesive used in CFRP applications. In order to verify the capability of the proposed method to evaluate the elastic modulus of the epoxy, the static E-modulus was also determined through tension tests. The results of the two series of tests were then combined and compared to evaluate the feasibility of a new method for the continuous monitoring and quality control of NSM CFRP applications.
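The core idea of EMM-ARM is inverting a beam vibration formula: the measured first resonance frequency of the composite mould beam yields its flexural stiffness, and hence the modulus of the curing material. A simplified sketch under the (strong) assumption of a uniform cantilever Euler-Bernoulli beam — the real test setup, boundary conditions and composite section differ:

```python
import math

BETA1_L = 1.8751  # first-mode eigenvalue of a uniform cantilever beam

def first_frequency(e_mod, rho_lin, length, inertia):
    """First resonance [Hz] of a uniform cantilever:
    f1 = (beta1*L)^2 / (2*pi) * sqrt(E*I / (rho_lin * L^4))"""
    return BETA1_L**2 / (2.0 * math.pi) * math.sqrt(
        e_mod * inertia / (rho_lin * length**4))

def modulus_from_frequency(f1_hz, rho_lin, length, inertia):
    """Invert the formula above: estimate E from the measured first resonance.
    rho_lin: mass per unit length [kg/m], length [m], inertia I [m^4]."""
    omega1 = 2.0 * math.pi * f1_hz
    return (omega1 / BETA1_L**2) ** 2 * rho_lin * length**4 / inertia

# Round-trip check with arbitrary illustrative values (E = 10 GPa).
f1 = first_frequency(10e9, 5.0, 0.5, 1e-9)
print(modulus_from_frequency(f1, 5.0, 0.5, 1e-9))
```

Tracking f1 over time as the epoxy cures therefore gives a continuous, non-destructive estimate of its stiffness evolution, which is the quantity the thesis compares against static tension tests.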
Abstract:
In software development, concurrency has always been seen as the way of the future. Nevertheless, it has often been ignored, because continuous hardware improvements allowed developers to keep writing sequential software without worrying about performance. In an era in which new hardware architectures feature multi-core processors, this is no longer possible. The objective of this thesis was to consider the Actor Model as a valid alternative for the development of mobile applications, and therefore to design, develop and distribute a new framework based on this model. The work starts with an overview of Swift, the new programming language presented by Apple at WWDC 2014, analysing in detail the mechanisms that enable concurrency. The actor model is then described in terms of actors, properties, communication and synchronization. An analysis of the main implementations of this model follows, including Scala, Erlang and Akka; the latter inspired the design and development of the Actor Kit framework. The fourth chapter describes all the concepts, ideas and principles on which the Actor Kit framework was designed and developed. Finally, the last chapter presents the use of the framework in two common mobile-programming scenarios: 1) acquiring data from a Web API and displaying it in the user interface; 2) acquiring data from the device's sensors. In conclusion, Actor Kit enables the design and development of applications with an entirely new approach in the mobile field. A possible future development could be the extension of the framework with actors that wrap Apple's standard frameworks; for this reason it will be released publicly, in the hope that other developers will evolve it and make it even more complete and performant.
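The essence of the actor model described above — a private mailbox per actor, asynchronous sends, and strictly serialized message handling so that state needs no locks — can be sketched in a few lines. Actor Kit itself is a Swift framework; the following is only a language-agnostic illustration in Python, not the thesis' code:

```python
import queue
import threading

class Actor:
    """Minimal actor: one private mailbox, one worker thread, serialized handling."""
    def __init__(self):
        self._mailbox = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, message):
        """Asynchronous: enqueue the message and return immediately."""
        self._mailbox.put(message)

    def _run(self):
        while True:
            message = self._mailbox.get()
            if message is None:      # poison pill: stop the actor
                break
            self.receive(message)

    def receive(self, message):      # override in subclasses
        raise NotImplementedError

    def stop(self):
        """Drain the mailbox, then join the worker thread."""
        self._mailbox.put(None)
        self._thread.join()

class Counter(Actor):
    """State is touched only from the actor's own thread, so no locks are needed."""
    def __init__(self):
        self.count = 0
        super().__init__()

    def receive(self, message):
        if message == "inc":
            self.count += 1

counter = Counter()
for _ in range(1000):
    counter.send("inc")
counter.stop()
print(counter.count)  # → 1000
```

Because every message is processed by the same single thread, the 1000 increments never race, which is exactly the property that makes actors attractive for UI and sensor pipelines on mobile devices.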