11 results for Critical infrastructures. Fault Tree. Dependability. Framework. Industrial environments
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Investments in transport infrastructure were long regarded as general policy measures falling within the exclusive competence of the Member States, notwithstanding the general prohibition of public support measures in favour of undertakings under Art. 107 TFEU. The judgments delivered by the EU courts concerning the airports of Paris and Leipzig-Halle triggered a genuine jurisprudential revirement, in light of international economic transformations, calling into question the concept of undertaking, as well as the settled interpretation according to which the financing of infrastructure (as public goods intended to satisfy citizens' mobility needs) would escape the application of State aid rules. Despite the need for constant modernisation and development of infrastructure, the new regulatory framework subsequently adopted by the European Union inevitably led the Member States to submit every new infrastructure investment to the prior scrutiny of the Commission. This dissertation, starting from an analysis of the State aid rules under Arts. 107 et seq. TFEU, examines the principles developed by case law and scholarship through the interpretation of the primary sources, highlighting the main underlying legal issues, also in light of the peculiarities of the infrastructures concerned, of their ownership and governance models, and of the competences and decision-making powers over new investment projects. Finally, the dissertation focuses on major infrastructure projects at the European and international level concerning transport networks, analysing the new challenges they raise, while considering the need to ensure, also with respect to them, the safeguarding of the so-called level playing field and substantive compliance with State aid rules.
Abstract:
Machines with moving parts give rise to vibrations and, consequently, noise. The set-up and status of each machine yield a peculiar vibration signature. A change in the vibration signature, due to a change in the machine state, can therefore be used to detect incipient defects before they become critical. This is the goal of condition monitoring, in which the information obtained from a machine signature is used to detect faults at an early stage. A large number of signal processing techniques can be used to extract interesting information from a measured vibration signal. This study seeks to detect rotating machine defects using a range of techniques including synchronous time averaging, Hilbert transform-based demodulation, the continuous wavelet transform, the Wigner-Ville distribution and the spectral correlation density function. The detection and diagnostic capability of these techniques are discussed and compared on the basis of experimental results concerning gear tooth faults, i.e. fatigue cracks at the tooth root and tooth spalls of different sizes, as well as assembly faults in a diesel engine. Moreover, the sensitivity to fault severity is assessed by applying these signal processing techniques to gear tooth faults of different sizes.
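The Hilbert transform-based demodulation mentioned above can be sketched in a few lines. This is an illustrative example on synthetic data (the signal, frequencies, and modulation depth are assumptions, not the thesis's experimental setup): the envelope of an amplitude-modulated signal is recovered as the magnitude of the analytic signal, and the envelope spectrum reveals the fault-related modulation frequency.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT (the standard discrete Hilbert construction)."""
    n = len(x)                    # assumed even here for simplicity
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0             # double positive frequencies
    h[n // 2] = 1.0               # keep Nyquist bin
    return np.fft.ifft(spectrum * h)

# Synthetic stand-in for a gear vibration signal: a 2 kHz carrier
# amplitude-modulated at 30 Hz, mimicking a once-per-revolution tooth defect.
fs = 20_000                       # sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
fault_freq = 30.0                 # defect modulation frequency [Hz]
x = (1 + 0.5 * np.cos(2 * np.pi * fault_freq * t)) * np.sin(2 * np.pi * 2000 * t)

# Demodulation: the envelope is the magnitude of the analytic signal.
envelope = np.abs(analytic_signal(x))

# The envelope spectrum peaks at the fault frequency.
spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
detected = freqs[np.argmax(spec)]
print(f"{detected:.0f} Hz")  # → 30 Hz
```

In practice the carrier would be a measured mesh-resonance band, isolated by band-pass filtering before demodulation.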
Abstract:
Machine learning comprises a series of techniques for the automatic extraction of meaningful information from large collections of noisy data. In many real-world applications, data is naturally represented in structured form. Since traditional methods in machine learning deal with vectorial information, they require an a priori preprocessing step. Among the learning techniques for dealing with structured data, kernel methods are recognized as having a strong theoretical background and as being effective approaches. They do not require an explicit vectorial representation of the data in terms of features, but rely on a measure of similarity between any pair of objects of a domain: the kernel function. Designing fast and good kernel functions is a challenging problem. In the case of tree-structured data two issues become relevant: kernels for trees should not be sparse and should be fast to compute. The sparsity problem arises when, given a dataset and a kernel function, most structures of the dataset are completely dissimilar to one another. In those cases the classifier has too little information for making correct predictions on unseen data; in fact, it tends to produce a discriminating function behaving like the nearest-neighbour rule. Sparsity is likely to arise for some standard tree kernel functions, such as the subtree and subset tree kernels, when they are applied to datasets with node labels belonging to a large domain. A second drawback of tree kernels is the time complexity required in both the learning and classification phases. Such complexity can sometimes prevent the application of the kernel in scenarios involving large amounts of data. This thesis proposes three contributions for resolving the above issues of kernels for trees. The first contribution aims at creating kernel functions which adapt to the statistical properties of the dataset, thus reducing sparsity with respect to traditional tree kernel functions.
Specifically, we propose to encode the input trees with an algorithm able to project the data onto a lower-dimensional space with the property that similar structures are mapped similarly. By building kernel functions on the lower-dimensional representation, we are able to perform inexact matchings between different inputs in the original space. The second contribution is a novel kernel function based on the convolution kernel framework. A convolution kernel measures the similarity of two objects in terms of the similarities of their subparts. Most convolution kernels are based on counting the number of shared substructures, partially discarding information about their position in the original structure. The kernel function we propose is, instead, especially focused on this aspect. The third contribution is devoted to reducing the computational burden of calculating a kernel function between a tree and a forest of trees, which is a typical operation in the classification phase and, for some algorithms, also in the learning phase. We propose a general methodology applicable to convolution kernels. Moreover, we show an instantiation of our technique for kernels such as the subtree and subset tree kernels. In those cases, Directed Acyclic Graphs can be used to compactly represent shared substructures in different trees, thus reducing the computational burden and storage requirements.
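As a rough illustration of how counting shared substructures works, the sketch below implements the classic subtree (ST) kernel by encoding every subtree as a canonical string and counting matches. This is an assumed simplification for illustration, not the thesis's method: the DAG-based technique described above avoids re-encoding subtrees shared across a forest.

```python
from collections import Counter

def signatures(tree):
    """Return the canonical string of `tree` and a Counter over the
    canonical strings of all its subtrees (root included).
    Trees are (label, [children]) tuples."""
    label, children = tree
    child_results = [signatures(c) for c in children]
    sig = "(" + label + "".join(r[0] for r in child_results) + ")"
    counts = Counter([sig])
    for _, child_counts in child_results:
        counts.update(child_counts)       # accumulate descendants' subtrees
    return sig, counts

def subtree_kernel(t1, t2):
    """K(t1, t2) = number of pairs of identical full subtrees."""
    _, c1 = signatures(t1)
    _, c2 = signatures(t2)
    return sum(c1[s] * c2[s] for s in c1)

# Two toy trees sharing the subtree rooted at 'b'
t1 = ("a", [("b", [("d", []), ("e", [])]), ("c", [])])
t2 = ("f", [("b", [("d", []), ("e", [])])])
print(subtree_kernel(t1, t2))  # → 3: matches (b (d)(e)), (d), (e)
```

With node labels drawn from a large domain, few signatures coincide and most kernel values collapse to near zero, which is exactly the sparsity issue the thesis addresses.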
Abstract:
The hierarchical organisation of biological systems plays a crucial role in the pattern formation of gene expression resulting from morphogenetic processes, where the autonomous internal dynamics of cells, as well as cell-to-cell interactions through membranes, are responsible for the emergent peculiar structures of the individual phenotype. Being able to reproduce the system dynamics at different levels of such a hierarchy can be very useful for studying this complex phenomenon of self-organisation. The idea is to model the phenomenon as a large and dynamic network of compartments, where the interplay between inter-compartment and intra-compartment events determines the emergent behaviour resulting in the formation of spatial patterns. On these premises, the thesis reviews the different approaches already developed for modelling developmental biology problems, as well as the main models and infrastructures available in the literature for modelling biological systems, analysing their capabilities in tackling multi-compartment / multi-level models. The thesis then introduces a practical framework, MS-BioNET, for modelling and simulating these scenarios by exploiting the potential of multi-level dynamics. This is based on (i) a computational model featuring networks of compartments and an enhanced chemical-reaction model addressing molecule transfer, (ii) a logic-oriented language to flexibly specify complex simulation scenarios, and (iii) a simulation engine based on the many-species/many-channels optimised version of Gillespie's direct method. The thesis finally proposes the adoption of the agent-based model as an approach capable of capturing multi-level dynamics. To overcome the problem of parameter tuning in the model, the simulators are supplied with a module for parameter optimisation.
The task is defined as an optimisation problem over the parameter space, in which the objective function to be minimised is the distance between the output of the simulator and a target output. The problem is tackled with a metaheuristic algorithm. As an example of application of the MS-BioNET framework and of the agent-based model, a model of the first stages of Drosophila melanogaster development is realised. The model's goal is to generate the early spatial pattern of gap gene expression. The correctness of the models is shown by comparing the simulation results with real gene-expression data with spatial and temporal resolution, acquired from free online sources.
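Gillespie's direct method, on whose optimised version the simulation engine above is based, can be illustrated on a toy reaction. The dimerisation system and rate constant below are assumptions for illustration, not part of MS-BioNET: the method draws an exponential waiting time from the total propensity and fires one reaction per step.

```python
import math
import random

def gillespie_dimerisation(n_a, k, t_end, seed=0):
    """Gillespie's direct method for the single reaction 2A -> B
    with stochastic rate constant k (toy system for illustration)."""
    rng = random.Random(seed)
    t, n_b = 0.0, 0
    while True:
        a = k * n_a * (n_a - 1) / 2              # propensity of 2A -> B
        if a <= 0.0:
            break                                 # no reaction can fire
        tau = -math.log(1.0 - rng.random()) / a   # exponential waiting time
        if t + tau > t_end:
            break                                 # next event falls past t_end
        t += tau
        n_a -= 2                                  # fire the reaction
        n_b += 1
    return n_a, n_b

a_left, b_made = gillespie_dimerisation(n_a=100, k=0.01, t_end=10.0)
print(a_left + 2 * b_made)  # → 100: molecule mass is conserved
```

With several reaction channels, the direct method additionally draws a second random number to select which channel fires, weighted by the individual propensities; the many-species/many-channels variant optimises exactly that selection step.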
Abstract:
The prospect of the continuous multiplication of lifestyles, the obsolescence of traditional typological diagrams, and the usability of spaces on different territorial scales impose on contemporary architecture the search for new models of living. Limited densities in urban development have produced the erosion of territory and the increase of harmful emissions and energy consumption. High-density housing cannot refuse the social emergency of ensuring high-quality and low-cost dwellings for a new target population: students, temporary workers, key workers, foreigners, young couples without children, large families and, in general, people who carry out public services. Social housing strategies have become particularly relevant in regenerating high-density urban outskirts. The choice of this research topic derives from the desire to deal with the recent accommodation emergency from different perspectives, with a view to contributing to the current literature by proposing tools for the correct design of social housing, ensuring good-quality, cost-effective, and eco-sustainable solutions, from the concept phase, through management and maintenance, until the end of the building life cycle. The purpose of the thesis is to define a framework of guidelines that become effective instruments to be used in designing social housing. They should also integrate the existing regulations and are mainly intended for those who work in this sector. They also aim at supporting students who have to cope with this particular residential theme, and the users themselves. From the scientific evidence of both the recent specialized literature and the solutions adopted in case studies within the selected metropolitan areas of Milan, London and São Paulo, it is possible to identify the principles of this new design approach, in which the connection between typology, morphology and technology pursues the goal of a high living standard.
Abstract:
In the last decade, manufacturing companies have been facing two significant challenges. First, digitalization requires adopting Industry 4.0 technologies and allows creating smart, connected, self-aware, and self-predictive factories. Second, the attention to sustainability requires evaluating and reducing the impact of the implemented solutions from economic and social points of view. In manufacturing companies, the maintenance of physical assets assumes a critical role. Increasing the reliability and availability of production systems minimises system downtime; in addition, proper system functioning avoids production waste and potentially catastrophic accidents. Digitalization and new ICT technologies have assumed a relevant role in maintenance strategies. They allow assessing the health condition of machinery at any point in time. Moreover, they allow predicting the future behaviour of machinery, so that maintenance interventions can be planned and the useful life of components can be exploited until just before their fault. This dissertation provides insights on Predictive Maintenance goals and tools in Industry 4.0 and proposes a novel data acquisition, processing, sharing, and storage framework that addresses typical issues machine producers and users encounter. The research elaborates on two research questions that narrow down the potential approaches to data acquisition, processing, and analysis for fault diagnostics in evolving environments. The research activity is developed according to a research framework, where the research questions are addressed by research levers that are explored according to research topics.
Each topic requires a specific set of methods and approaches; however, the overarching methodological approach presented in this dissertation includes three fundamental aspects: the maximization of the quality level of input data, the use of Machine Learning methods for data analysis, and the use of case studies deriving from both controlled environments (laboratory) and real-world instances.
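As a hedged illustration of Machine Learning-based fault diagnostics of the kind the dissertation discusses (the features, synthetic data, and classifier below are assumptions for illustration, not the dissertation's pipeline), faulty vibration signatures can often be separated from healthy ones using simple statistical features such as RMS and kurtosis:

```python
import numpy as np

rng = np.random.default_rng(42)

def features(sig):
    """Two classic condition-monitoring features: RMS and kurtosis."""
    rms = np.sqrt(np.mean(sig ** 2))
    kurt = np.mean((sig - sig.mean()) ** 4) / np.var(sig) ** 2
    return np.array([rms, kurt])

def sample(faulty):
    """Synthetic signal: Gaussian noise; faults add periodic impacts,
    which raise both RMS and (strongly) kurtosis."""
    sig = rng.normal(0, 1, 2048)
    if faulty:
        sig[::128] += 8.0
    return features(sig)

# Train a nearest-centroid classifier on labelled synthetic examples.
X = np.array([sample(False) for _ in range(20)] + [sample(True) for _ in range(20)])
y = np.array([0] * 20 + [1] * 20)
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(sig):
    f = features(sig)
    return int(np.argmin(np.linalg.norm(centroids - f, axis=1)))

fault_sig = rng.normal(0, 1, 2048)
fault_sig[::128] += 8.0
print(predict(fault_sig))  # → 1 (fault detected)
```

A real pipeline would of course rely on measured data and stronger models, but the structure — feature extraction, supervised training, prediction on new signatures — is the same.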
Abstract:
Modern networks are undergoing a fast and drastic evolution, with software taking a more predominant role. Virtualization and cloud-like approaches are replacing physical network appliances, reducing the management burden on operators. Furthermore, networks now expose programmable interfaces for fast and dynamic control over traffic forwarding. This evolution is backed by standards organizations such as ETSI, 3GPP, and IETF. This thesis will describe the main trends in this evolution. Then, it will present solutions developed during the three years of the Ph.D. to exploit the capabilities these new technologies offer and to study their possible limitations, in order to push the state of the art further. Namely, it will deal with programmable network infrastructure, introducing the concept of Service Function Chaining (SFC) and presenting two possible solutions, one with OpenStack and OpenFlow and the other using Segment Routing and IPv6. Then, it will continue with network service provisioning, presenting concepts from Network Function Virtualization (NFV) and Multi-access Edge Computing (MEC). These concepts will be applied to network slicing for mission-critical communications and the Industrial IoT (IIoT). Finally, it will deal with network abstraction, with a focus on Intent-Based Networking (IBN). To summarize, the thesis will include solutions for data plane programming with evaluation on well-known platforms, performance metrics on virtual resource allocation, a novel practical application of network slicing to mission-critical communications, an architectural proposal and its implementation for edge technologies in Industrial IoT scenarios, and a formal definition of intent using a category theory approach.
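As a small illustration of the Segment Routing over IPv6 flavour of Service Function Chaining mentioned above (the function names and addresses below are hypothetical), a service chain can be expressed as an ordered list of segment identifiers (SIDs) carried with the packet; following the Segment Routing Header convention, the list is stored in reverse order and "segments left" indexes the next segment to visit.

```python
import ipaddress

chain = ["firewall", "dpi", "nat"]   # desired order of service functions
sid_of = {                            # hypothetical SID plan: one IPv6
    "firewall": "fc00:1::1",          # address per service function
    "dpi": "fc00:2::1",
    "nat": "fc00:3::1",
}

# SRH segment list: reverse order, so the first function sits at the
# highest index, which is where "segments left" initially points.
srh_segments = [ipaddress.IPv6Address(sid_of[f]) for f in reversed(chain)]
segments_left = len(srh_segments) - 1
next_hop = srh_segments[segments_left]
print(next_hop)  # → fc00:1::1, the first function in the chain
```

Each service function decrements segments left and forwards the packet toward the next SID, so the chain is traversed without any per-flow state in the core network.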
Abstract:
With the entry into force of the latest Italian Building Code (NTC 2008, 2018), innovative criteria were introduced, especially as regards the seismic verification of large infrastructures. In particular, for buildings considered strategic, such as large dams, a seismotectonic study of the site was declared necessary, which involves a re-assessment of the basic seismic hazard. This PhD project fits into this context, being part of the seismic re-evaluation process of large dams launched on a national scale following O.P.C.M. 3274/2003 and D.L. 79/2004. A full seismotectonic study of the region of two large earth dams in Southern Italy was carried out. We identified and characterized the structures that could generate earthquakes in our study area, together with defining the local seismic history. This information was used for the reassessment of the basic seismic hazard, using probabilistic seismic hazard assessment approaches. In recent years, fault-based models for seismic hazard assessment have been proposed all over the world as an emerging methodology. For this reason, we decided to test the innovative SHERIFS approach on our study area. The seismotectonic study also gave the opportunity to focus on the characteristics of the seismic stations that provided the data for the study itself. In the context of the work presented here, we focused on the 10 stations that had been active for the longest time and carried out a geophysical characterization, whose data fed into a more general study on soil-structure interaction at seismic stations and on the ways in which it could affect the SHA. Lastly, an additional experimental study on the two dams and their associated minor structures is also presented, aimed at defining their main dynamic parameters, useful for subsequent dynamic structural and geotechnical studies.
Abstract:
Biobanks are key infrastructures in data-driven biomedical research. The counterpoint of this optimistic vision is the reality of biobank governance, which must address various ethical, legal and social issues, especially in terms of open consent, privacy and secondary uses which, if not sufficiently resolved, may undermine participants’ and society’s trust in biobanking. The effect of the digital paradigm on biomedical research has only accentuated these issues by adding new pressure for the data protection of biobank participants against the risks of covert discrimination, abuse of power against individuals and groups, and critical commercial uses. Moreover, the traditional research-ethics framework has been unable to keep pace with the transformative developments of the digital era, and has proven inadequate in protecting biobank participants and providing guidance for ethical practices. To this must be added the challenge of an increased tendency towards exploitation and the commercialisation of personal data in the field of biomedical research, which may undermine the altruistic and solidaristic values associated with biobank participation and risk losing alignment with societal interests in biobanking. My research critically analyses, from a bioethical perspective, the challenges and the goals of biobank governance in data-driven biomedical research in order to understand the conditions for the implementation of a governance model that can foster biomedical research and innovation, while ensuring adequate protection for biobank participants and an alignment of biobank procedures and policies with society’s interests and expectations. The main outcome is a conceptualisation of a socially-oriented and participatory model of biobanks by proposing a new ethical framework that relies on the principles of transparency, data protection and participation to tackle the key challenges of biobanks in the digital age and that is well-suited to foster these goals.
Abstract:
This doctoral dissertation represents a cluster of research activities carried out at the DICAM Department of the University of Bologna during a three-year Ph.D. course. The goal of this research is to show how the development of an interconnected infrastructure network, aimed at promoting the accessibility and sustainability of places, is fundamental in a framework of deep urban regeneration. Sustainable urban mobility plays an important role in improving the quality of life of citizens. From an environmental point of view, a sustainable mobility system means reducing exhaust emissions and energy waste and, in general, aims to promote low carbon emissions. At the same time, a socially and economically sustainable mobility system should be accessible to everybody and create more job opportunities through better connectivity and mobility. Environmentally friendly means of transport such as non-motorized transport, electric vehicles, and hybrid vehicles play an important role in achieving sustainability but require a planned approach at the local policy level. The aim of this study is to demonstrate that, through a targeted reconnection of road and cycle-pedestrian routes, the quality of life of an urban area subject to degradation can be significantly improved simply by increasing its accessibility and sustainability. Starting from a detailed study of European policies and from comparison with similar real cases, the case study of the Canal Port of Rimini (Italy) has been analysed within the European project FRAMESPORT. The analysis allowed the elaboration of a multi-criteria methodology leading to the definition of a project proposal and a priority ranking of interventions. The applied methodology is a valuable tool that may be used in the future in similar urban contexts. Finally, the whole project was represented using virtual reality to visually show the difference before and after the regeneration intervention.
Abstract:
The continuous and swift progression of both wireless and wired communication technologies in today's world owes its success to the foundational systems established earlier. These systems serve as the building blocks that enable the enhancement of services to cater to evolving requirements. Studying the vulnerabilities of previously designed systems and their current usage leads to the development of new communication technologies replacing the old ones, such as GSM-R in the railway field. Current industrial research has a specific focus on finding an appropriate telecommunication solution for railway communications to replace the GSM-R standard, which will be switched off in the coming years. Various standardization organizations are currently exploring and designing a radio-frequency-based standard solution to serve railway communications in the form of FRMCS (Future Railway Mobile Communication System) as a substitute for the current GSM-R. On this topic, the primary strategic objective of the research is to assess the feasibility of leveraging current public network technologies, such as LTE, to cater to mission- and safety-critical communication for low-density lines. The research aims to identify the constraints, define a service level agreement with telecom operators, and establish the necessary implementations to make the system as reliable as possible over an open and public network, while considering safety and cybersecurity aspects. The LTE infrastructure would be utilized to transmit the vital data for the communication of a railway system and to gather and transmit all the field measurements to the control room for maintenance purposes. Given the significance of maintenance activities in the railway sector, the ongoing research includes the implementation of a machine learning algorithm to detect railway equipment faults, reducing the time and human analysis errors due to the large volume of measurements from the field.