6 results for security, usability, identity management, authentication, authorization
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
Today more than ever, with the recent war in Ukraine and the growing number of attacks that strike the systems of nations and companies every day, the world is realising that cybersecurity can no longer be considered just a “cost”. It must become a pillar of the infrastructures on which the security of our nations and the safety of people depend. Critical infrastructures, such as energy, financial services, and healthcare, have become the target of many cyberattacks by criminal groups with ever-increasing resources and competencies, putting at risk the security and safety of companies and entire nations. This thesis investigates the state of the art of best practices for securing industrial control systems (ICS). We study the differences between two security frameworks. The first is the Industrial Demilitarized Zone (I-DMZ), a perimeter-based security solution. The second is the Zero Trust Architecture (ZTA), which removes the concept of perimeter to offer an entirely new approach to cybersecurity based on the slogan “Never trust, always verify”. Starting from this premise, the Zero Trust model embeds strict authentication, authorization, and monitoring controls for any access to any resource. We have defined two architectures, according to the state of the art and cybersecurity experts’ guidelines, to compare the I-DMZ and Zero Trust approaches to ICS security. The goal is to demonstrate how a Zero Trust approach dramatically reduces the possibility of an attacker penetrating the network or moving laterally to compromise the entire infrastructure. A third architecture has been defined, based on Cloud and fog/edge computing technology. It shows how Cloud solutions can improve the security and reliability of infrastructure and production processes, which can benefit from a range of new functionalities that the Cloud can offer as a Service. We have implemented and tested our Zero Trust solution and its ability to block intrusions or attempted attacks.
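The “never trust, always verify” principle described above can be illustrated with a minimal policy-enforcement sketch (all names are hypothetical illustrations, not the thesis's actual implementation): every request is evaluated independently, access is denied by default, and each decision is logged for monitoring.

```python
# Minimal Zero Trust policy decision sketch (hypothetical names; not the
# thesis's implementation). Deny by default: a request is granted only if
# the session is authenticated AND the (role, resource) pair is explicitly
# allowed by policy. Every decision is recorded for monitoring.

POLICY = {
    ("operator", "plc-01/read"),   # operators may read the controller
    ("engineer", "plc-01/write"),  # engineers may write to it
}

AUDIT_LOG = []

def authorize(session: dict, resource: str) -> bool:
    """Evaluate a single access request against the policy."""
    authenticated = session.get("mfa_verified", False)
    allowed = (session.get("role"), resource) in POLICY
    decision = authenticated and allowed
    AUDIT_LOG.append((session.get("user"), resource, decision))
    return decision

op = {"user": "alice", "role": "operator", "mfa_verified": True}
print(authorize(op, "plc-01/read"))   # True: authenticated and allowed
print(authorize(op, "plc-01/write"))  # False: not in policy
# Valid role but unverified session: denied ("never trust").
print(authorize({"user": "bob", "role": "engineer"}, "plc-01/write"))  # False
```

Note that, unlike a perimeter model, there is no notion of a "trusted inside": even a request from a known engineer is rejected when its session is not verified.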
Abstract:
This thesis aims to compare the different metadata management software systems used by the major European identity federations and adopted by education and research organisations for their identity management.
Analysis and redesign of the ICT risk management process: an applied case study at Telecom Italia
Abstract:
This thesis work stems from IT security topics and is the result of eight months of work within the Technical Security function of Telecom Italia Information Technology. The primary task of this business unit is to reduce the IT risk of Telecom Italia's systems by carrying out the ICT Risk Management process, which involves the entire organisation and was redesigned during 2012. To extend this process to all IT systems, specifically those affected by non-compliance issues, the Structured Security Programme (Programma Strutturato di Sicurezza) was launched at the beginning of 2013: a particularly articulated and complex aggregate of four three-year projects. The planning of this Programme involved, among others, the team I was part of, which collaborated with Telecom Italia by performing some of the support functions typical of a Project Management Office (PMO).
Abstract:
In modern society, the security issues of IT systems are intertwined with interdisciplinary aspects, from social life to sustainability, and threats endanger many aspects of everyone’s daily life. To address the problem, it is important that the systems we use guarantee a certain degree of security; to achieve this, however, it is necessary to be able to measure the amount of security. Measuring security is not an easy task, but many initiatives, including European regulations, aim to make it possible. One method of measuring security is based on security metrics: these are a way of assessing, from various angles, vulnerabilities, methods of defense, risks and impacts of successful attacks, as well as the efficacy of reactions, giving precise results through mathematical and statistical techniques. I have carried out a literature review to provide an overview of the meaning, the effects, the problems, the applications and the overall current situation of security metrics, with particular emphasis on giving practical examples. This thesis starts with a summary of the state of the art in the field of security metrics and application examples, to outline the gaps in the current literature and the difficulties found when the application context changes, and then advances research questions aimed at fostering the discussion towards the definition of a more complete and applicable view of the subject. Finally, it stresses the lack of security metrics that consider interdisciplinary aspects, giving some potential starting points for developing security metrics that cover all the aspects involved, taking the field to a new level of formal soundness and practical usability.
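As an illustrative example of the kind of quantitative metric discussed above (not taken from the thesis itself), a simple risk-based metric scores each vulnerability as likelihood times impact and aggregates the scores per system:

```python
# Illustrative security metric sketch (hypothetical values, not from the
# thesis): score each vulnerability as likelihood * impact, then report
# a total and a worst-case score for the whole system.

def risk_score(likelihood: float, impact: float) -> float:
    """Risk of one vulnerability: likelihood in [0, 1], impact in [0, 10]."""
    return likelihood * impact

def system_risk(vulns):
    """Aggregate metric over a list of (likelihood, impact) pairs."""
    scores = [risk_score(l, i) for l, i in vulns]
    return {"total": sum(scores), "max": max(scores)}

# Three hypothetical vulnerabilities on one system.
vulns = [(0.5, 8.0), (0.25, 9.0), (0.75, 4.0)]
print(system_risk(vulns))  # {'total': 9.25, 'max': 4.0}
```

Even this toy metric shows the issues the thesis raises: the result looks precise, but its meaning depends entirely on how likelihood and impact are estimated, which changes with the application context.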
Abstract:
Group work allows participants to pool their thoughts and examine problems from several angles. In these settings it is possible to attempt things that an individual could not achieve, combining a variety of abilities and knowledge to tackle more complicated and large-scale challenges. That is why collaborative work is becoming more and more widespread as a way to solve complex innovation dilemmas. Since innovation is not a tangible thing, most innovation teams tend to make decisions based on performance KPIs such as forecasted engagement, projected profitability, required investments, cultural impact, etc. Have you ever wondered why group innovation processes sometimes produce decisions that are not the optimal meeting point of all the KPIs? Has the decision been influenced by other factors? Some researchers attribute part of this phenomenon to the emotions in group-based interaction between participants. I develop a literature review split into three parts: first, I consider some theories of emotion from an individual perspective; secondly, I provide a wider view of theories of collective interaction; lastly, I review some recent empirical studies on collective interaction. After the theoretical and empirical gaps have been addressed, the study moves on to a methodological discussion of the Circumplex Model, the model I used to evaluate emotions in my research. This model has been applied to the SUGAR project, the biggest design thinking academy worldwide.
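The Circumplex Model mentioned above places each emotional state on a two-dimensional valence (pleasant–unpleasant) by arousal (activation) plane. A minimal sketch of that idea, with hypothetical quadrant labels that simplify the model's actual emotion terms, could look like this:

```python
# Simplified sketch of the Circumplex Model of affect (hypothetical
# labels; a coarse simplification, not the thesis's coding scheme).
# Each observation is a point on a valence x arousal plane, and the
# quadrant gives a rough emotion category.

def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Classify a (valence, arousal) observation, both in [-1, 1]."""
    if valence >= 0 and arousal >= 0:
        return "excited/elated"   # high activation, pleasant
    if valence < 0 and arousal >= 0:
        return "tense/angry"      # high activation, unpleasant
    if valence < 0:
        return "sad/bored"        # low activation, unpleasant
    return "calm/relaxed"         # low activation, pleasant

print(circumplex_quadrant(0.8, 0.6))   # excited/elated
print(circumplex_quadrant(-0.5, 0.7))  # tense/angry
```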
Abstract:
In this paper, a joint location-inventory model is proposed that simultaneously optimises strategic supply chain design decisions, such as facility location and the allocation of customers to facilities, and tactical-operational inventory management and production scheduling decisions. All this is analysed in a context of demand and supply uncertainty. While demand uncertainty stems from potential fluctuations in customer demand over time, supply-side uncertainty is associated with the risk of “disruption” to which facilities may be subject. The latter is caused by external factors such as natural disasters, strikes, changes of ownership and information technology security incidents. The proposed model is formulated as a non-linear mixed-integer programming problem that minimises the expected total cost, which includes four basic cost items: the fixed cost of locating facilities at candidate sites, the cost of transport from facilities to customers, the cost of working inventory, and the cost of safety stock. Next, since the optimisation problem is very complex and the number of instances that can be solved exactly is very low, a “matheuristic” solution approach is presented. This approach has a twofold objective: on the one hand, it considers a larger number of facilities and customers within the network, in order to reproduce a supply chain configuration that more closely reflects a real-world context; on the other hand, it generates a starting solution and performs a series of iterations to try to improve it. Thanks to this algorithm, it was possible to obtain a solution with a lower total system cost than the initial solution. The study concludes with some reflections and a description of possible future developments.
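The four cost terms of the objective can be sketched for a single candidate design. The sketch below uses hypothetical data and deliberately simplified formulas (an EOQ-style term for working inventory and a lead-time safety-stock term), so it illustrates the structure of the expected total cost rather than the thesis's exact model:

```python
# Illustrative evaluation of a location-inventory design (hypothetical
# data and simplified cost terms, not the thesis's exact formulation).
# Given a set of open facilities and an assignment of customers to them,
# sum the four cost items: fixed location cost, transport cost, working
# inventory (cycle stock) cost, and safety stock cost.
import math

def total_cost(open_facilities, assign, fixed, transport, demand,
               order_cost=100.0, holding=1.0, z=1.96,
               sigma=5.0, lead_time=2.0):
    cost = sum(fixed[j] for j in open_facilities)       # fixed location cost
    cost += sum(transport[j][i] * demand[i]             # transport cost
                for i, j in assign.items())
    for j in open_facilities:
        d_j = sum(demand[i] for i, k in assign.items() if k == j)
        # working inventory: EOQ-style cycle stock cost at facility j
        cost += math.sqrt(2 * order_cost * holding * d_j)
        # safety stock held against demand variability over the lead time
        cost += holding * z * sigma * math.sqrt(lead_time * d_j)
    return cost

fixed = {0: 500.0, 1: 800.0}
transport = {0: {0: 2.0, 1: 4.0}, 1: {0: 3.0, 1: 1.0}}
demand = {0: 40.0, 1: 60.0}
# Open only facility 0 and serve both customers from it.
print(total_cost([0], {0: 0, 1: 0}, fixed, transport, demand))
```

A matheuristic in the spirit described above would repeatedly perturb the open set and the assignment (e.g. open/close a facility, reassign a customer), re-evaluate this objective, and keep the change when the total cost decreases.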