9 results for Supervisory Control and Data Acquisition (SCADA) Topology
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The Gaia space mission is a major project for the European astronomical community. The processing and analysis of the huge data flow incoming from Gaia, challenging as it is, is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD Thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of both the different observing sites involved and the huge number of frames expected (≃100,000), it is essential to maintain the maximum homogeneity in data quality, acquisition and treatment, and particular care has to be taken to test the capabilities of each telescope/instrument combination (through the “instrument familiarization plan”) and to devise methods to keep under control, and where necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, and in particular relative photometry for the production of short-term light curves. In this context I defined and tested a semi-automated pipeline which allows for the pre-reduction of imaging SPSS data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality-control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light-curve production and analysis.
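As an illustration of the kind of step such a pipeline automates, the following minimal Python sketch performs sky-subtracted aperture photometry on a single frame and attaches simple quality-control flags; the aperture radii, sky-annulus geometry and saturation level are illustrative assumptions, not the actual pipeline configuration.

```python
import numpy as np

def aperture_photometry(frame, x0, y0, r_ap=8.0, r_in=12.0, r_out=16.0,
                        saturation=60000):
    """Sky-subtracted counts in a circular aperture, with basic QC flags.
    All radii and the saturation level are illustrative assumptions."""
    yy, xx = np.indices(frame.shape)
    dist = np.hypot(xx - x0, yy - y0)           # pixel distance from the star
    aperture = dist <= r_ap                     # source aperture
    annulus = (dist >= r_in) & (dist <= r_out)  # sky annulus
    sky = np.median(frame[annulus])             # local sky level per pixel
    flux = float(np.sum(frame[aperture] - sky))
    flags = []
    if frame[aperture].max() >= saturation:     # nonlinear/saturated pixels
        flags.append("SATURATED")
    if (x0 - r_out < 0 or y0 - r_out < 0 or
            x0 + r_out >= frame.shape[1] or y0 + r_out >= frame.shape[0]):
        flags.append("EDGE")                    # annulus leaves the frame
    return flux, flags
```

A catalogue entry would then carry the flux together with the flags, so later light-curve stages can filter compromised measurements automatically.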
Abstract:
The Italian radio telescopes are currently undergoing a major upgrade in response to the growing demand for deep radio observations, such as surveys on large sky areas or observations of vast samples of compact radio sources. The optimised employment of the Italian antennas, originally constructed mainly for VLBI activities and provided with a control system (FS – Field System) not tailored to single-dish observations, required important modifications, in particular to the guiding software and the data acquisition system. The production of a completely new control system called ESCS (Enhanced Single-dish Control System) for the Medicina dish started in 2007, in synergy with the software development for the forthcoming Sardinia Radio Telescope (SRT). The aim is to produce a system optimised for single-dish observations in continuum, spectrometry and polarimetry; ESCS is also planned to be installed at the Noto site. A substantial part of this thesis work consisted of designing and developing subsystems within ESCS, in order to provide this software with tools to carry out large maps, ranging from the implementation of On-The-Fly fast scans (following both conventional and innovative observing strategies) to the production of standard single-dish output files and the realisation of tools for the quick-look inspection of the acquired data. The test period coincided with the commissioning phase of two devices temporarily installed – while waiting for the SRT to be completed – on the Medicina antenna: an 18-26 GHz 7-feed receiver and the 14-channel analogue backend developed for its use. It is worth stressing that this is, at present, the only K-band multi-feed receiver available worldwide. The commissioning of the overall hardware/software system constituted a considerable section of the thesis work. Tests were carried out to verify the system stability and its capabilities, down to sensitivity levels which had never been reached at Medicina using the previous observing techniques and hardware devices. The aim was also to assess the scientific potential of the multi-feed receiver for the production of wide maps, exploiting its temporary availability on a mid-sized antenna. Dishes like the 32-m antennas at Medicina and Noto, in fact, offer the best conditions for large-area surveys, especially at high frequencies, as they provide a suitable compromise between beam sizes large enough to cover wide areas of the sky quickly (typical of small telescopes) and sensitivity (typical of large telescopes). The KNoWS (K-band Northern Wide Survey) project is aimed at the realisation of a full-northern-sky survey at 21 GHz; its pilot observations, performed using the new ESCS tools and a purpose-designed observing strategy, constituted an ideal test-bed both for ESCS itself and for the multi-feed/backend system. The KNoWS group, of which I am part, supported the commissioning activities also by providing map-making and source-extraction tools, in order to complete the necessary data reduction pipeline and assess the general scientific capabilities of the system. The K-band observations, which were carried out in several sessions between December 2008 and March 2010, were accompanied by the realisation of a 5 GHz test survey during the summer months, which are not suitable for high-frequency observations. This activity was conceived in order to check the new analogue backend separately from the multi-feed receiver, and at the same time to produce original scientific data (the 6-cm Medicina Survey, 6MS, a polar-cap survey intended to complete PMN-GB6 and provide all-sky coverage at 5 GHz).
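To fix ideas on what an On-The-Fly map amounts to, the short Python sketch below generates the timestamped pointings of a back-and-forth raster: constant-speed sweeps along one axis with fixed steps along the other. The scan speed, dump rate and map geometry are illustrative values, not ESCS defaults.

```python
import numpy as np

def otf_raster(ra0, dec0, width_deg, height_deg, scan_speed_dps, dump_rate_hz,
               dec_step_deg):
    """Yield (time_s, ra, dec) samples for a boustrophedon OTF raster.
    All parameter values are illustrative assumptions."""
    t = 0.0
    n_dumps = int(width_deg / scan_speed_dps * dump_rate_hz)  # per subscan
    for row, dec_off in enumerate(np.arange(0.0, height_deg, dec_step_deg)):
        ra_line = np.linspace(-width_deg / 2, width_deg / 2, n_dumps)
        if row % 2:                       # alternate sweep direction
            ra_line = ra_line[::-1]
        for ra_off in ra_line:
            yield t, ra0 + ra_off, dec0 + dec_off
            t += 1.0 / dump_rate_hz       # one backend dump per sample

# e.g. a 2 x 1 deg map at 4 deg/min with 10 Hz dumps and 1 arcmin steps:
samples = list(otf_raster(45.0, 30.0, 2.0, 1.0, 4.0 / 60.0, 10.0, 1.0 / 60.0))
```

With a multi-feed receiver, each pointing would be replicated per feed with the appropriate focal-plane offsets, which is what makes wide maps fast on a mid-sized dish.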
Abstract:
Motion control is a sub-field of automation in which the position and/or velocity of machines are controlled using some type of device. In motion control the position, velocity, force, pressure, etc. profiles are designed in such a way that the different mechanical parts work as a harmonious whole, in which a perfect synchronization must be achieved. The real-time exchange of information in the distributed system that a modern industrial plant has become plays an important role in achieving ever better performance, effectiveness and safety. The network for connecting field devices such as sensors and actuators, field controllers such as PLCs, regulators and drive controllers, and man-machine interfaces is commonly called a fieldbus. Since motion transmission is now a task of the communication system, and no longer of kinematic chains as in the past, the communication protocol must ensure that the desired profiles, and their properties, are correctly transmitted to the axes and then reproduced; otherwise the synchronization among the different parts is lost, with all the resulting consequences. In this thesis, the problem of trajectory reconstruction in the case of an event-triggered communication system is addressed. The most important feature that a real-time communication system must have is the preservation of the following temporal and spatial properties: absolute temporal consistency, relative temporal consistency and spatial consistency. Starting from the basic system composed of one master and one slave, and passing through systems made up of many slaves and one master, or many masters and one slave, the problems in profile reconstruction and in the preservation of the temporal properties, and subsequently the synchronization of different profiles in networks adopting an event-triggered communication system, are examined. These networks are characterized by the fact that common knowledge of a global time is not available; they are therefore non-deterministic networks. Each topology is analyzed, and the proposed solution based on phase-locked loops, adopted for the basic master-slave case, is extended to cope with the other configurations.
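A minimal sketch of the phase-locked-loop idea for the basic master-slave case is given below, in Python for readability: the slave extrapolates the profile with its own clock, and each event-triggered network sample acts as a phase detector that corrects the local position and the estimated master velocity. The gains and the update model are illustrative assumptions, not the thesis design.

```python
import random

class TrajectoryPLL:
    """Second-order software PLL: the proportional term pulls the local
    position toward each received master sample, the integral term learns
    the master velocity so the slave can extrapolate between events."""

    def __init__(self, kp=0.5, ki=0.1):
        self.kp, self.ki = kp, ki
        self.position = 0.0     # locally reconstructed profile
        self.velocity = 0.0     # estimated master velocity ("VCO" input)

    def on_master_update(self, master_position):
        """Event-triggered: runs whenever a sample arrives on the fieldbus."""
        error = master_position - self.position   # phase detector
        self.position += self.kp * error          # phase correction
        self.velocity += self.ki * error          # frequency correction

    def tick(self, dt):
        """Local time-triggered task: extrapolate between network events."""
        self.position += self.velocity * dt
        return self.position

# Demo: a master ramp sampled sporadically over a non-deterministic network.
pll, t, dt = TrajectoryPLL(), 0.0, 0.001
for _ in range(5000):
    t += dt
    if random.random() < 0.05:          # jittered, event-triggered updates
        pll.on_master_update(0.25 * t)  # master profile: constant velocity
    slave_position = pll.tick(dt)
```

Between events the slave coasts on the learned velocity, which is what preserves the profile even though the network provides no common global time.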
Abstract:
Recently, an ever increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products and services generated, productivity, efficiency and low costs in design, realization and maintenance. This growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, or buy boxed products such as food or cigarettes, and so on. Another indication of their complexity derives from the fact that the consortium of machine producers has estimated around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the “packaging valley”. Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems, organized in a modular and distributed manner. Even if the success of a modern AMS, from a functional and behavioural point of view, is still to be attributed to the design choices operated in the definition of the mechanical structure and of the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial, because of the large number of duties associated with it. Apart from the activity inherent in the automation of the machine cycles, the supervisory system is called upon to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through the verification of the correctness of the processing; guiding the machine operator to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing in real time information on diagnostics, as a support to the maintenance operations of the machine. The facilities that designers can directly find on the market, in terms of software component libraries, in fact provide adequate support for the implementation of both top-level and bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have traditionally been adopted for a long time, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different, usually very “unstructured” way. No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different “dynamical framework” of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to enlighten the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been receiving this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems.
This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and these electronic devices are more vulnerable by their own nature. The problem of diagnosis and fault isolation in a generic dynamical system consists in the design of a processing unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults in the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and, where necessary, reconfiguring the control system so that faults are tolerated. On this topic, an important improvement to the formal verification of logic control and to fault diagnosis and fault-tolerant control derives from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey of the state of the software engineering paradigms applied to industrial automation is presented. Chapter 3 presents an architecture for industrial automated systems based on the new concept of Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of formal software verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader in understanding some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
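To fix ideas on what such a reusable, technology-independent logic control component looks like, here is a minimal discrete-event sketch in Python in the spirit of the Generalized Actuator concept; the states, events and fault-handling path are illustrative assumptions, not the actual interface defined in the thesis.

```python
class GeneralizedActuator:
    """A reusable logic-control block modelled as a finite-state automaton,
    independent of the implementation technology (relays, PLC, software).
    States and events here are illustrative, not the thesis interface."""

    TRANSITIONS = {
        ("Idle",    "start"): "Running",
        ("Running", "stop"):  "Idle",
        ("Running", "fault"): "Faulted",   # raised by online diagnostics
        ("Faulted", "reset"): "Idle",      # recovery after fault handling
    }

    def __init__(self):
        self.state = "Idle"

    def dispatch(self, event):
        """Consume one discrete event; events not enabled in the current
        state are simply ignored (no transition fires)."""
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state

# Composing the same component for two axes of a machine:
axis1, axis2 = GeneralizedActuator(), GeneralizedActuator()
for ev in ("start", "fault", "reset"):
    axis1.dispatch(ev)
assert axis1.state == "Idle"
```

Because the behaviour is an explicit transition table rather than implementation-specific code, components of this kind lend themselves directly to the formal verification techniques from Discrete Event Systems theory discussed in Chapter 5.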
Abstract:
Hybrid vehicles (HV), comprising a conventional ICE-based powertrain and a secondary energy source that can also be converted into mechanical power, represent a well-established alternative for substantially reducing both the fuel consumption and the tailpipe emissions of passenger cars. Several HV architectures are either being studied or already available on the market, e.g. Mechanical, Electric, Hydraulic and Pneumatic Hybrid Vehicles. Among these, the Electric (HEV) and Mechanical (HSF-HV) parallel hybrid configurations are examined throughout this Thesis. To fully exploit the potential of HVs, the hybrid components to be installed must be properly chosen and designed, while an effective Supervisory Control must be adopted to coordinate the way the different power sources are managed and how they interact. Real-time controllers can be derived starting from the optimal benchmark results thus obtained. However, the application of these powerful instruments requires a simplified, yet reliable and accurate, model of the hybrid vehicle system. This can be a complex task, especially when the complexity of the system grows, as for the HSF-HV system assessed in this Thesis. The first task of this dissertation is to establish the optimal modeling approach for an innovative and promising mechanical hybrid vehicle architecture. It will be shown how the chosen modeling paradigm can affect the quality of the solution and the computational effort required, using an optimization technique based on Dynamic Programming. The second goal concerns the control of pollutant emissions in a parallel Diesel HEV. The emissions level obtained under real-world driving conditions is substantially higher than the usual result obtained in a homologation cycle. For this reason, an on-line control strategy capable of guaranteeing compliance with the desired emissions level, while minimizing fuel consumption and avoiding excessive battery depletion, is the target of the corresponding section of the Thesis.
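As an indication of how a Dynamic Programming benchmark of this kind is set up, the following Python sketch runs backward value iteration over a discretized battery state-of-charge (SOC) grid to pick a torque-split (electric-fraction) policy; the fuel model, SOC dynamics and drive-cycle values are toy assumptions, not the thesis's vehicle model.

```python
import numpy as np

soc_grid = np.linspace(0.3, 0.8, 51)     # discretized battery SOC states
u_grid = np.linspace(0.0, 1.0, 11)       # control: electric fraction of demand
p_demand = [20.0, 35.0, 15.0, 40.0]      # kW demanded per cycle step (toy)
dt, capacity_kwh = 1.0, 1.5              # step length [s], battery size

def fuel_rate(p_engine_kw):              # toy engine fuel cost model [g/s]
    return 0.08 * p_engine_kw + (0.5 if p_engine_kw > 0 else 0.0)

V = np.where(soc_grid < 0.5, 1e6, 0.0)   # terminal penalty: end near start SOC
policy = []
for p in reversed(p_demand):             # backward value iteration
    V_new, best_u = np.empty_like(V), np.empty_like(V)
    for i, soc in enumerate(soc_grid):
        costs = []
        for u in u_grid:
            soc_next = soc - (u * p) * dt / 3600.0 / capacity_kwh
            costs.append(fuel_rate((1.0 - u) * p) * dt +
                         np.interp(soc_next, soc_grid, V, left=1e6, right=1e6))
        k = int(np.argmin(costs))
        V_new[i], best_u[i] = costs[k], u_grid[k]
    V = V_new
    policy.insert(0, best_u)             # policy[t][i]: best split at step t
```

The grid resolutions chosen here illustrate the point made in the abstract: the modeling paradigm and discretization directly determine both the quality of the benchmark solution and the computational effort.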
Abstract:
The aging process is characterized by the progressive fitness decline experienced at all levels of physiological organization, from single molecules up to the whole organism. Studies have confirmed inflammaging, a chronic low-level inflammation, as a deeply intertwined partner of the aging process, which may provide the “common soil” upon which age-related diseases develop and flourish. Thus, although inflammation per se represents a physiological process, it can rapidly become detrimental if it goes out of control, causing an excess of local and systemic inflammatory response, a striking risk factor for the elderly population. Developing interventions to counteract the establishment of this state is thus a top priority. Diet, among other factors, represents a good candidate for regulating inflammation. Building on this consideration, the EU project NU-AGE is now trying to assess whether a Mediterranean diet, fortified for the needs of the elderly population, may help in modulating inflammaging. To do so, NU-AGE enrolled a total of 1250 subjects, half of whom followed the diet for one year, and characterized them by means of the most advanced omics and non-omics analyses. The aim of this thesis was the development of a solid data management pipeline able to cope efficiently with the results of these assays, which are now flowing into a centralized database, ready to be used to test the most disparate scientific hypotheses. At the same time, the work described here encompasses the data analysis of the GEHA project, which was focused on identifying the genetic determinants of longevity, with a particular focus on developing and applying a method for detecting epistatic interactions in human mtDNA. Finally, in an effort to propel the adoption of NGS technologies into everyday pipelines, we developed an NGS variant-calling pipeline devoted to solving the sequencing-related issues of mtDNA.
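As a rough illustration of a pairwise epistasis scan of the kind mentioned above (not the GEHA method itself), the Python sketch below compares, for every pair of mtDNA sites, the frequency of carrying both minor alleles in long-lived cases versus controls with a plain 2x2 chi-square statistic; the data layout and the significance threshold are assumptions.

```python
import numpy as np
from itertools import combinations

def chi2_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return n * (a * d - b * c) ** 2 / den if den else 0.0

def epistasis_scan(genotypes, is_case, threshold=10.0):
    """genotypes: (subjects, sites) 0/1 minor-allele matrix; is_case: bool
    array. Returns (site_i, site_j, statistic) for pairs over threshold."""
    hits = []
    for i, j in combinations(range(genotypes.shape[1]), 2):
        carrier = (genotypes[:, i] & genotypes[:, j]).astype(bool)
        a = int(np.sum(carrier & is_case))        # cases carrying both
        b = int(np.sum(~carrier & is_case))       # cases without both
        c = int(np.sum(carrier & ~is_case))       # controls carrying both
        d = int(np.sum(~carrier & ~is_case))      # controls without both
        stat = chi2_2x2(a, b, c, d)
        if stat > threshold:
            hits.append((i, j, stat))
    return hits
```

mtDNA is small enough (about 16.6 kb) that an exhaustive pairwise scan like this remains computationally trivial, which is part of what makes it an attractive testbed for interaction methods.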
Abstract:
The miniaturization race in the hardware industry, aiming at a continuous increase of transistor density on a die, no longer brings corresponding application performance improvements. One of the most promising alternatives is to exploit the heterogeneous nature of common applications in hardware. Supported by reconfigurable computation, which has already proved its efficiency in accelerating data-intensive applications, this concept promises a breakthrough in contemporary technology development. Memory organization in such heterogeneous reconfigurable architectures becomes very critical. Two primary aspects introduce a sophisticated trade-off: on the one hand, a memory subsystem should provide a well-organized distributed data structure and guarantee the required data bandwidth; on the other hand, it should hide the heterogeneous hardware structure from the end-user, in order to support feasible high-level programmability of the system. This thesis work explores heterogeneous reconfigurable hardware architectures and presents possible solutions to cope with the problem of memory organization and data structure. Through the example of the MORPHEUS heterogeneous platform, the discussion follows the complete design cycle, from decision making and justification to hardware realization. Particular emphasis is placed on the methods used to support high system performance, meet application requirements, and provide a user-friendly programmer interface. As a result, the research introduces a complete heterogeneous platform enhanced with a hierarchical memory organization, which copes with its task by separating computation from communication, providing the reconfigurable engines with computation and configuration data, and unifying the heterogeneous computational devices by means of local storage buffers. It is distinguished from related solutions by its distributed data-flow organization, specifically engineered mechanisms to operate on data in local domains, a particular communication infrastructure based on a Network-on-Chip, and thorough methods to prevent computation and communication stalls. In addition, a novel advanced technique to accelerate memory access was developed and implemented.
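The "separating computation from communication" principle can be sketched with a classic double-buffering (ping-pong) scheme: while a DMA-like loader fills one local buffer over the interconnect, the compute engine works on the other. The Python sketch below emulates this with two threads and queues; the buffer count, tile sizes and stand-in kernel are illustrative assumptions, unrelated to the actual MORPHEUS hardware.

```python
import threading
import queue

def dma_loader(tiles, free_bufs, ready_bufs):
    """Communication side: fills whichever local buffer is free."""
    for tile in tiles:
        buf = free_bufs.get()        # wait for an empty local buffer
        buf[:] = tile                # emulate a NoC/DMA burst transfer
        ready_bufs.put(buf)          # hand it to the compute engine
    ready_bufs.put(None)             # end-of-stream marker

def compute_engine(free_bufs, ready_bufs, results):
    """Computation side: works on one buffer while the other is loading."""
    while (buf := ready_bufs.get()) is not None:
        results.append(sum(buf))     # stand-in for the real kernel
        free_bufs.put(buf)           # recycle the buffer for the loader

tiles = [[i, i + 1, i + 2] for i in range(0, 12, 3)]
free_bufs, ready_bufs = queue.Queue(), queue.Queue()
for _ in range(2):                   # two buffers -> ping-pong operation
    free_bufs.put([0, 0, 0])
results = []
loader = threading.Thread(target=dma_loader,
                          args=(tiles, free_bufs, ready_bufs))
loader.start()
compute_engine(free_bufs, ready_bufs, results)
loader.join()
```

With enough buffers and balanced transfer/compute times, neither side ever waits, which is exactly the stall-prevention goal the abstract describes.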
Abstract:
Sixty strains (belonging to the genera Lactobacillus, Bifidobacterium, Leuconostoc and Enterococcus) were tested for their capacity to inhibit the growth of 3 strains of Campylobacter jejuni. Lactobacilli and bifidobacteria were left to grow in MRS or TPY broth at 37°C overnight under anaerobic conditions; Campylobacter jejuni was inoculated on blood agar plates at 37°C for 24-48 hours under microaerophilic conditions. The inhibition experiments were carried out in vitro using the “spot agar test” and “well diffusion assay” techniques, testing both the cellular activity and that of the supernatant. Eleven strains proved to inhibit the growth of Campylobacter jejuni. These strains were subsequently analysed in order to evaluate their resistance to particular stress situations found in the gastrointestinal tract and during industrial transformation processes (starvation stress, osmotic stress, heat stress, resistance to pH and to bile salts). Resistance to starvation stress: all strains but one seemed to resist the stress. Resistance to osmotic stress: all strains but one were relatively resistant to a concentration of 6% w/v NaCl. Resistance to heat stress: only one strain showed little resistance to a temperature of 55°C. Resistance to pH: in the presence of a low pH (2.5), many strains rapidly lost their viability after approximately 1 hour. Resistance to bile salts: except for one strain, all strains seemed to be relatively resistant to a 2% w/v concentration of bile salts. Afterwards, the strains were identified using phenotypic and molecular techniques. Phenotypic identification was carried out using the API 50 CHL (bioMérieux) and API 20 STREP (bioMérieux) identification systems; molecular identification was performed with species-specific PCR, and the molecular techniques confirmed the results of the phenotypic identification. For testing the antibiotic resistance profile, bacterial strains were subcultured in MRS or TPY broth and incubated for 18 h at 37°C under anaerobic conditions. The antibiotics tested (Tetracycline, Trimethoprim, Cefuroxime, Kanamycin, Chloramphenicol, Vancomycin, Ampicillin, Streptomycin, Erythromycin) were diluted to final concentrations of 2, 4, 8, 16, 32, 64, 128 and 256 mg/ml. Then, 20 μl of fresh bacterial culture (final concentration in the plates approximately 10^6 cfu/ml) were added to 160 μl of MRS or TPY broth and 20 μl of antibiotic solution. As a positive control, the bacterial culture (20 μl) was added to broth (160 μl) and water (20 μl). The test was performed on 96-well plates which, after inoculation, were incubated for 24 h at 37°C; antibiotic resistance was then determined by measuring the Optical Density (OD) at 620 nm with a Multiscan EX reader. All strains showed a similar behaviour: resistance to all the antibiotics tested. Further studies are needed.
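For illustration, here is a minimal Python sketch of how a minimum inhibitory concentration (MIC) could be called from OD620 readings of a dilution series like the one above; the 50%-of-control growth threshold is an assumed convention, not the protocol's stated criterion.

```python
# Concentrations of the dilution series, as reported in the abstract.
CONCENTRATIONS = [2, 4, 8, 16, 32, 64, 128, 256]   # mg/ml

def mic(od_by_conc, od_control, threshold=0.5):
    """od_by_conc: OD620 at each concentration, ordered as CONCENTRATIONS;
    od_control: OD620 of the antibiotic-free positive control.
    Returns the lowest concentration whose OD falls below
    threshold * control (growth inhibited), or None if the strain grew
    at every concentration (resistant to all tested levels)."""
    for conc, od in zip(CONCENTRATIONS, od_by_conc):
        if od < threshold * od_control:
            return conc
    return None

# Example: a strain growing at every dilution, as observed in this study,
# yields no MIC within the tested range:
print(mic([0.61, 0.60, 0.58, 0.59, 0.57, 0.55, 0.54, 0.52], od_control=0.62))
# -> None
```

A per-well rule of this kind maps the plate-reader output directly onto the resistant/susceptible calls summarized in the abstract.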