11 results for Generic Security Services Application Program Interface (GSS-API)
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Generic programming is likely to become a new challenge for a critical mass of developers. It is therefore crucial to refine the support for generic programming in mainstream Object-Oriented languages, both at the design and at the implementation level, and to suggest novel ways to exploit the additional degree of expressiveness made available by genericity. This study is meant to contribute towards bringing Java genericity to a more mature stage with respect to mainstream programming practice, by increasing the effectiveness of its implementation and by revealing its full expressive power in real-world scenarios. With respect to the current research setting, the main contribution of the thesis is twofold. First, we propose a revised implementation of Java generics that greatly increases the expressiveness of the Java platform by adding reification support for generic types. Second, we show how Java genericity can be leveraged in a real-world case study in the context of multi-paradigm language integration. Several approaches have been proposed to overcome the lack of reification of generic types in the Java programming language. Existing approaches tackle the problem by defining new translation techniques that would allow a runtime representation of generics and wildcards. Unfortunately, most approaches suffer from several problems: heterogeneous translations are known to be problematic when considering reification of generic methods and wildcards. On the other hand, more sophisticated techniques that require changes to the Java runtime support reified generics through a true language extension (where clauses), so that backward compatibility is compromised.
In this thesis we develop a sophisticated type-passing technique to address the problem of reification of generic types in the Java programming language; this approach, first pioneered by the so-called EGO translator, is here turned into a full-blown solution which reifies generic types inside the Java Virtual Machine (JVM) itself, thus overcoming both the performance penalties and the compatibility issues of the original EGO translator.
Java-Prolog integration. Integrating Object-Oriented and declarative programming has been the subject of several research efforts and corresponding technologies. Such proposals come in two flavours: either attempting to join the two paradigms, or simply providing an interface library for accessing Prolog declarative features from a mainstream Object-Oriented language such as Java. Both solutions, however, have drawbacks: in the case of hybrid languages featuring both Object-Oriented and logic traits, the resulting language is typically too complex, making mainstream application development a harder task; in the case of library-based integration approaches there is no true language integration, and some “boilerplate code” has to be written to bridge the paradigm mismatch. In this thesis we develop a framework called PatJ which promotes seamless exploitation of Prolog programming in Java. A sophisticated usage of generics and wildcards allows a precise mapping between Object-Oriented and declarative features to be defined. PatJ defines a hierarchy of classes where the bidirectional semantics of Prolog terms is modelled directly at the level of the Java generic type system.
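The erasure limitation that motivates reification can be illustrated with a minimal sketch (plain standard Java, not the thesis' reified translator; the class name is ours):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of Java type erasure: type arguments are dropped at
// compile time, so distinct instantiations share one runtime class.
public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> integers = new ArrayList<>();

        // Both lists have runtime class java.util.ArrayList: the type
        // arguments String and Integer are not reified.
        System.out.println(strings.getClass() == integers.getClass()); // true

        // "o instanceof List<String>" is rejected by the compiler;
        // only the raw type can be tested at runtime.
        Object o = strings;
        System.out.println(o instanceof List); // true, element type is lost
    }
}
```

A reified implementation, by contrast, keeps a runtime representation of the instantiation, so such tests become meaningful.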
Abstract:
One of the most dangerous situations for a tractor driver is lateral rollover in operating conditions. Several accidents involving tractor rollover have indeed been recorded, requiring the design of a robust Roll-Over Protective Structure (ROPS). The aim of the thesis was to evaluate tractor behaviour in the rollover phase so as to calculate the energy absorbed by the ROPS to ensure driver safety. A mathematical model representing the behaviour of a generic tractor during a lateral rollover is proposed, with the possibility of modifying the geometry, the inertia of the tractor, and the environmental boundary conditions. The purpose is to define a method allowing the prediction of the elasto-plastic behaviour of the subsequent impacts occurring in the rollover phase. A tyre impact model capable of analysing the influence of the wheels on the energy to be absorbed by the ROPS has also been developed. Different tractor design parameters affecting the rollover behaviour, such as mass and dimensions, have been considered, permitting the evaluation of their influence on the amount of energy to be absorbed by the ROPS. The mathematical model was designed and calibrated against the results of actual lateral upset tests carried out on a narrow-track tractor. The dynamic behaviour of the tractor and the energy absorbed by the ROPS, obtained from the actual tests, were shown to match the results of the model developed. The proposed approach represents a valuable tool for understanding the dynamics (kinetic energy) and kinematics (position, velocity, angular velocity, etc.) of the tractor in the phases of lateral rollover and the factors mainly affecting the event. The amount of energy to be absorbed can be predicted with good accuracy in several accident cases, which can then help in designing protective structures or active safety devices.
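The starting quantity of such an energy analysis can be illustrated with a toy computation (illustrative values and class name are ours, not the thesis' calibrated model): the rigid-body kinetic energy of the tractor at impact, translational plus rotational, bounds the energy the ROPS may have to absorb.

```java
// Toy illustration of the rigid-body kinetic energy of a tractor at the
// moment of lateral impact (not the calibrated thesis model).
public class RolloverEnergy {
    // Translational plus rotational kinetic energy:
    // E = 1/2 * m * v^2 + 1/2 * I * omega^2
    static double kineticEnergy(double massKg, double speedMs,
                                double inertiaKgM2, double omegaRadS) {
        return 0.5 * massKg * speedMs * speedMs
             + 0.5 * inertiaKgM2 * omegaRadS * omegaRadS;
    }

    public static void main(String[] args) {
        // A narrow-track tractor tipping sideways (illustrative numbers only).
        double e = kineticEnergy(2600.0, 1.5, 1800.0, 2.0);
        System.out.printf("Kinetic energy at impact: %.1f J%n", e); // 6525.0 J
    }
}
```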
Abstract:
With the CERN LHC program underway, there has been an acceleration of data growth in the High Energy Physics (HEP) field, and the usage of Machine Learning (ML) in HEP will be critical during the HL-LHC program, when the data produced will reach the exascale. ML techniques have been successfully used in many areas of HEP; nevertheless, the development of a ML project and its implementation for production use is a highly time-consuming task that requires specific skills. Complicating this scenario is the fact that HEP data is stored in the ROOT data format, which is mostly unknown outside of the HEP community. The work presented in this thesis focuses on the development of a ML as a Service (MLaaS) solution for HEP, aiming to provide a cloud service that allows HEP users to run ML pipelines via HTTP calls. These pipelines are executed using the MLaaS4HEP framework, which allows reading data, processing data, and training ML models directly from ROOT files of arbitrary size, hosted in local or distributed data sources. Such a solution provides HEP users who are not ML experts with a tool that allows them to apply ML techniques in their analyses in a streamlined manner. Over the years the MLaaS4HEP framework has been developed, validated, and tested, and new features have been added. A first MLaaS solution was developed by automating the deployment of a platform equipped with the MLaaS4HEP framework. Then, a service with APIs was developed, so that a user, after being authenticated and authorized, can submit MLaaS4HEP workflows producing trained ML models ready for the inference phase. A working prototype of this service is currently running on a virtual machine of INFN-Cloud and meets the requirements for inclusion in the INFN Cloud portfolio of services.
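The interaction model of such a service can be sketched as follows (the endpoint path, class name, and payload shape are hypothetical illustrations, not the service's real API): an authenticated user submits a workflow description over HTTP and later retrieves the trained model.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Hypothetical sketch of a client submitting an ML workflow over HTTP.
// Endpoint and JSON fields are illustrative, not the real MLaaS4HEP API.
public class MlaasClient {
    static HttpRequest submitWorkflowRequest(String baseUrl, String token,
                                             String workflowJson) {
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/workflows"))
                .header("Authorization", "Bearer " + token)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(workflowJson))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest r = submitWorkflowRequest(
                "https://mlaas.example.org", "my-token",
                "{\"data\": \"file.root\", \"model\": \"keras\"}");
        System.out.println(r.method() + " " + r.uri());
    }
}
```

Sending the request with `java.net.http.HttpClient` would then return a handle for polling the training status; that part is omitted here.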
Abstract:
Historical evidence shows that chemical, process, and Oil&Gas facilities where dangerous substances are stored or handled are targets of deliberate malicious attacks (security attacks) aiming at interfering with normal operations. Physical attacks and cyber-attacks may generate events with consequences on people, property, and the surrounding environment that are comparable to those of major accidents with safety-related causes. The security aspects of these facilities are commonly addressed using Security Vulnerability/Risk Assessment (SVA/SRA) methodologies. Most of these methodologies are semi-quantitative, non-systematic approaches that strongly rely on expert judgment, leading to security assessments that are not reproducible. Moreover, they do not consider synergies with the safety domain. The present 3-year research aims at filling this gap by providing knowledge on security attacks, as well as rigorous and systematic methods supporting existing SVA/SRA studies suitable for the chemical, process, and Oil&Gas industry. The different nature of cyber and physical attacks led to the development of different methods for the two domains. The first part of the research was devoted to the development and statistical analysis of security databases, which made it possible to derive new knowledge and lessons learned on security threats. Based on this background, a Bow-Tie based procedure and two reverse-HazOp based methodologies were developed as hazard identification approaches for physical and cyber threats respectively. To support the quantitative estimation of the security risk, a quantitative procedure based on Bayesian Networks was developed, allowing the calculation of the probability of success of physical security attacks. All the developed methods have been applied to case studies addressing chemical, process, and Oil&Gas facilities (offshore and onshore), proving the quality of the results that can be achieved in improving site security. Furthermore, the outcomes achieved represent a step forward in developing synergies and promoting integration between safety and security management.
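The flavour of such a quantitative estimate can be conveyed with a deliberately simplified sketch (ours, not the thesis' Bayesian Network procedure): assuming independent security barriers in series, the probability that a physical attack succeeds is the product of the per-barrier defeat probabilities.

```java
// Simplified sketch (not the thesis' Bayesian Network procedure): with
// independent security barriers in series, an attack succeeds only if the
// attacker defeats every barrier, so the probabilities multiply.
public class AttackSuccess {
    static double probabilityOfSuccess(double[] pDefeatBarrier) {
        double p = 1.0;
        for (double pi : pDefeatBarrier) {
            p *= pi; // the attacker must defeat each barrier in turn
        }
        return p;
    }

    public static void main(String[] args) {
        // Fence, intrusion detection, armed response (illustrative numbers).
        double p = probabilityOfSuccess(new double[] {0.9, 0.5, 0.2});
        System.out.printf("P(success) = %.2f%n", p); // P(success) = 0.09
    }
}
```

A Bayesian Network generalizes this by dropping the independence assumption and conditioning each barrier's performance on the attack scenario.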
Abstract:
Gossip protocols have proved to be a viable solution for setting up and managing large-scale P2P services or applications in a fully decentralised scenario. The gossip, or epidemic, communication scheme is heavily based on stochastic behaviour and is the fundamental idea behind many large-scale P2P protocols. It provides many remarkable features, such as scalability, robustness to failures, emergent load-balancing capabilities, fast spreading, and redundancy of information. In some sense, these services or protocols mimic the behaviour of natural systems in order to achieve their goals. The key observation of this work is that the remarkable properties of gossip hold only when all participants follow the rules dictated by the protocols. If one or more malicious nodes join the network and start cheating according to some strategy, the result can be catastrophic. In order to study how serious the threat posed by malicious nodes can be, and what can be done to prevent attackers from cheating, we focused on a general attack model aimed at defeating a key service in gossip overlay networks (the Peer Sampling Service [JGKvS04]). We also focused on the problem of protecting against forged information exchanged in gossip services. We propose a solution technique for each problem; both techniques are general enough to be applied to distinct service implementations. Like gossip protocols, our solutions are based on stochastic behaviour and are fully decentralised. In addition, each technique's behaviour is abstracted by a general primitive function extending the basic gossip scheme; this approach allows the adoption of our solutions with minimal changes in different scenarios. We provide an extensive experimental evaluation to support the effectiveness of our techniques. These techniques are intended as building blocks, or P2P architecture guidelines, for building more resilient and more secure P2P services.
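The basic gossip scheme underlying such services can be sketched as follows (a minimal illustration with names of our choosing, not the Peer Sampling Service of [JGKvS04]): each node keeps a small partial view of the network and periodically merges it with the view of a randomly chosen neighbour, keeping a random subset of bounded size.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Random;
import java.util.Set;

// Minimal sketch of a gossip-style push-pull view exchange (illustrative,
// not the protocol of [JGKvS04]).
public class GossipNode {
    final String id;
    final List<String> view = new ArrayList<>(); // partial view of the overlay
    final Random rnd;

    GossipNode(String id, long seed) {
        this.id = id;
        this.rnd = new Random(seed);
    }

    // One exchange with a peer: merge the two views, drop any self-reference,
    // then keep a random subset of at most maxView entries (stochastic step).
    void exchange(GossipNode peer, int maxView) {
        Set<String> merged = new LinkedHashSet<>(view);
        merged.addAll(peer.view);
        merged.add(peer.id);
        merged.remove(id);                 // never keep a self-reference
        List<String> m = new ArrayList<>(merged);
        Collections.shuffle(m, rnd);       // random survivor selection
        view.clear();
        view.addAll(m.subList(0, Math.min(maxView, m.size())));
    }
}
```

A malicious node breaks the scheme precisely here: by always answering with forged views it can bias every honest node's sample of the network, which is the attack model studied in this thesis.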
Abstract:
One of the most interesting challenges of the coming years will be the automation of airspace systems. This process will involve different aspects, such as Air Traffic Management, Aircraft and Airport Operations, and Guidance and Navigation Systems. The use of UAS (Uninhabited Aerial Systems) for civil missions will be one of the most important steps in this automation process. In civil airspace, Air Traffic Controllers (ATC) manage the air traffic, ensuring that a minimum separation between the controlled aircraft is always provided. For this purpose ATCs use several operative avoidance techniques, such as holding patterns or rerouting. The use of UAS in this context will require the definition of strategies for a common management of piloted and unpiloted air traffic that allow the UAS to self-separate. As a first employment in civil airspace we consider a UAS surveillance mission that consists of departing from a ground base, taking pictures over a set of mission targets, and returning to the same ground base. Throughout the mission a set of piloted aircraft fly in the same airspace, and the UAS has to self-separate using the ATC avoidance techniques mentioned above. We consider two objectives: the first consists in minimizing the impact of the air traffic on the mission; the second consists in minimizing the impact of the mission on the air traffic. A particular version of the well-known Travelling Salesman Problem (TSP), called the Time-Dependent TSP (TDTSP), has been studied to deal with traffic problems in big urban areas. Its basic idea is that the cost of the route between two clients depends on the period of the day in which the route is traversed. This thesis argues that the same idea can be applied to air traffic as well, using a convenient time horizon compatible with aircraft operations.
The cost of a UAS sub-route will depend on the air traffic that the UAS meets when starting that route at a specific moment, and consequently on the avoidance maneuver used to resolve the conflict. Conflict avoidance is a topic that has been extensively developed in past years using different approaches. In this thesis we propose a new approach based on the use of ATC operative techniques, which makes it possible both to model the UAS problem within a TDTSP framework and to adopt an Air Traffic Management perspective. Starting from this kind of mission, the problem of UAS insertion in civil airspace is formalized as the UAS Routing Problem (URP). To this end we introduce a new structure called the Conflict Graph, which makes it possible to model the avoidance maneuvers and to define the arc cost as a function of the departure time. Two Integer Linear Programming formulations of the problem are proposed. The first is based on a TDTSP formulation that, unfortunately, is weaker than the TSP formulation. Thus a new formulation, based on a TSP variation that uses specific penalties to model the holdings, is proposed. Different algorithms are presented: exact algorithms, simple heuristics used as upper bounds on the number of time steps used, and metaheuristic algorithms such as Genetic Algorithms and Simulated Annealing. Finally, an air traffic scenario has been simulated using real air traffic data in order to test our algorithms. Graphic tools have been used to represent the Milano Linate airspace and its air traffic during different days. These data were provided by ENAV S.p.A. (the Italian Agency for Air Navigation Services).
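The time-dependent cost idea can be illustrated with a small sketch (class name and cost data are ours, not the thesis' formulation): the cost of leg (i, j) is looked up by the time slot in which the leg starts, so the same route evaluated at different departure times yields different totals.

```java
// Illustrative sketch of the time-dependent arc cost behind a TDTSP:
// cost[i][j][slot] is the cost of leg (i, j) when started in 'slot';
// here the cost is also taken as the leg's traversal time.
public class TimeDependentRoute {
    static double routeCost(double[][][] cost, double slotLength,
                            int[] route, double startTime) {
        double t = startTime, total = 0.0;
        for (int k = 0; k + 1 < route.length; k++) {
            int slots = cost[route[k]][route[k + 1]].length;
            int slot = Math.min((int) (t / slotLength), slots - 1);
            double c = cost[route[k]][route[k + 1]][slot];
            total += c;
            t += c; // arrival time becomes the next leg's departure time
        }
        return total;
    }

    public static void main(String[] args) {
        // Two nodes, two 15-minute slots; leg (0, 1) is three times more
        // expensive in the second slot (e.g. heavier traffic).
        double[][][] cost = new double[2][2][2];
        cost[0][1] = new double[] {10.0, 30.0};
        int[] route = {0, 1};
        System.out.println(routeCost(cost, 15.0, route, 0.0));  // 10.0
        System.out.println(routeCost(cost, 15.0, route, 20.0)); // 30.0
    }
}
```

In the thesis' setting, the slot-dependent entry would encode the avoidance maneuver (e.g. a holding pattern) forced by the traffic present at that departure time.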
Abstract:
The national context has recently changed with the introduction of the new geodetic system, coincident with the European one (ETRS89, frame ETRF00) and realised through the stations of the Rete Dinamica Nazionale. This geodetic system, associated with the UTM_ETRF00 cartographic system, has become mandatory by decree for public administrations. The change has made it possible to survey cartographic data in absolute ETRF00 coordinates with much higher accuracy. However, when data surveyed in this way are used for cartographic updates, they lose their original coordinates and are adapted to the surrounding cartographic features. To design a modernisation of cadastral maps and technical maps aimed at allowing updates to be introduced without modifying their original absolute coordinates, the study began by evaluating how to exploit developments in the structuring of topographic data in the Geotopographic Database, 3D building models from the INSPIRE cadastral experiences, and the integration, within the MUDE framework, between building projects and their realisations. The study then evaluated the NRTK real-time positioning services available in Italy. Experiments were also carried out to verify, at the local level as well, the precision and reliability of the available positioning services. The critical issues of cadastral cartography essentially derive from two facts: it was originally framed in 850 different systems and later transformed into Roma40 with a very low density of re-measured points; and until 1988 it was updated with non-rigorous, low-quality procedures. To resolve these issues, the study proposes exploiting NRTK surveying to locally increase the density of re-measured points and re-frame the cadastral maps.
The test, carried out in Bologna, involved a preliminary analysis to identify which fiducial points (Punti Fiduciali) could be considered consistent with the cartographic specifications, and then to use them to locally increase the density of re-measured points. The experiment allowed the project to be realised, so that future updates can be inserted without modifying the ETRF00 coordinates obtained from the positioning service.
Abstract:
Within the transdisciplinary conceptual and methodological approach of Sustainability Science, this thesis develops a theoretical background for conceptualising a definition of sustainability, on which basis it proposes a development model alternative to the dominant one, articulated as concrete proposals within a case study on European regulation of energy saving. Through a transdisciplinary analysis, the research identifies a structural crisis of the dominant development model, based on economic growth as the (sole) indicator of well-being, together with a crisis of values. Attention then turns to identifying a paradigm capable of responding to the critical issues emerging from the analysis. To this end, the concepts of sustainable development and sustainability are examined, leading to the proposal of a new paradigm ("ecosystemic sustainability") that accounts for the impossibility of infinite growth in a system characterised by limited resources. Proposals for a sustainable development model alternative to the dominant one are then presented. This theoretical elaboration is translated into concrete terms through a case study. To this end, the function of regulation as an instrument to guarantee the practical application of the theoretical model is first analysed. Attention is focused on the case study of European Union policy and regulation on energy saving and energy efficiency. The analysis reveals a progressive conflation of the two concepts of energy saving and energy efficiency, for which motivations are advanced and risks identified in terms of rebound effects.
To address the inconsistencies between the European Union's declared objective of reducing energy consumption and the policy actually pursued, a form of "regulation for sustainability" in the residential housing sector is proposed which, by promoting the sharing of energy services, recovers the proper meaning of energy saving as a reduction in consumption through behavioural change, enriching it with a new connotation as a "relational good" for the promotion of relational and individual well-being.
Abstract:
Sustainability encompasses three dimensions that must coexist simultaneously, namely the environmental, social, and economic ones. The economic and social dimensions have been gaining the spotlight in recent years, especially within food systems. To assess social and economic impacts, indicators and tools play a fundamental role in contributing to the achievement of sustainability targets, although few of them focus deeply on social and economic impacts. Moreover, in a framework of citizen science and bottom-up approaches to improving food systems, citizens play a key role in defining their priorities in terms of social and economic interventions. This research expands the knowledge of social and economic sustainability indicators within food systems for robust policy insights and interventions. This work accomplishes the following objectives: 1) to define social and economic indicators within the supply chain from a stakeholder perspective; 2) to test social and economic sustainability indicators for future food systems by engaging young generations. The first objective was accomplished through a systematic literature review of 34 social sustainability tools, based on five food supply chain stages, namely production, processing, wholesale, retail, and consumption, considering farmers, workers, consumers, and society as stakeholders. The second objective was achieved by defining and testing new social and economic sustainability indicators for food systems through youth engagement, to provide policymakers with informed and robust suggestions that incorporate the priorities of young generations. Future food system scenarios were evaluated by youth through focus groups, whose results were analyzed with NVivo and then through a survey on a wider platform.
The conclusions address the main areas of policy intervention that youth pointed out as needing action in terms of the social and economic aspects of sustainable food systems, ranging from food labels reporting sustainable origins to better access to online food services.
Abstract:
Matita ("pencil" in Italian) is a new interactive theorem prover under development at the University of Bologna. Compared with state-of-the-art proof assistants, Matita presents both traditional and innovative aspects. The underlying calculus of the system, namely the Calculus of (Co)Inductive Constructions (CIC for short), is well known and is used as the basis of another mainstream proof assistant, Coq, with which Matita is to some extent compatible. In the same spirit as several other systems, proof authoring is conducted by the user as a goal-directed proof search, using a script to store textual commands for the system. In the tradition of LCF, the proof language of Matita is procedural and relies on tactics and tacticals to proceed toward proof completion. The interaction paradigm offered to the user is based on the script management technique at the basis of the popularity of the Proof General generic interface for interactive theorem provers: while editing a script, the user can move the execution point forward to deliver commands to the system, or backward to retract (or "undo") past commands. Matita has been developed from scratch over the past 8 years by several members of the Helm research group, of whom this thesis' author is one. Matita is now a full-fledged proof assistant with a library of about 1,000 concepts. Several innovative solutions spun off from this development effort. This thesis is about the design and implementation of some of those solutions, in particular those relevant to user interaction with theorem provers, to which this thesis' author was a major contributor. Joint work with other members of the research group is pointed out where needed. The main topics discussed in this thesis are briefly summarized below.
Disambiguation. Most activities connected with interactive proving require the user to input mathematical formulae. Since mathematical notation is ambiguous, parsing formulae typeset as mathematicians like to write them on paper is a challenging task; a challenge neglected by several theorem provers, which usually prefer to fix an unambiguous input syntax. Exploiting features of the underlying calculus, Matita offers an efficient disambiguation engine which permits typing formulae in the familiar mathematical notation.
Step-by-step tacticals. Tacticals are higher-order constructs used in proof scripts to combine tactics. With tacticals, scripts can be made shorter, more readable, and more resilient to changes. Unfortunately, they are de facto incompatible with state-of-the-art user interfaces based on script management: such interfaces do not permit positioning the execution point inside complex tacticals, thus introducing a trade-off between the usefulness of structured scripts and a tedious big-step execution behaviour during script replay. In Matita we break this trade-off with tinycals: an alternative to a subset of LCF tacticals which can be evaluated in a more fine-grained manner.
Extensible yet meaningful notation. Proof assistant users often need to create new mathematical notation to ease the use of new concepts. The framework used in Matita for extensible notation both accounts for high-quality bidimensional rendering of formulae (with the expressivity of MathML Presentation) and provides meaningful notation, where presentational fragments are kept synchronized with the semantic representation of terms. Using our approach, interoperability with other systems can be achieved at the content level, and direct manipulation of formulae acting on their rendered forms is possible too.
Publish/subscribe hints. Automation plays an important role in interactive proving, as users like to delegate tedious proving sub-tasks to decision procedures or external reasoners. Exploiting the Web-friendliness of Matita, we experimented with a broker and a network of web services (called tutors) which independently try to complete open sub-goals of the proof currently being authored in Matita. The user receives hints from the tutors on how to complete sub-goals and can apply them to the current proof interactively or automatically. Another innovative aspect of Matita, only marginally touched by this thesis, is the embedded content-based search engine Whelp, which is exploited to various ends, from automatic theorem proving to avoiding duplicate work for the user. We also discuss the (potential) reusability in other systems of the widgets presented in this thesis, and how we envisage the evolution of user interfaces for interactive theorem provers in the Web 2.0 era.