936 results for Cryptographic Protocols, Provable Security, ID-Based Cryptography
Abstract:
There has been a great deal of interest in the area of cyber security in recent years. But what is cyber security exactly? And should society really care about it? We look at some of the challenges of being an academic working in the area of cyber security and explain why cyber security is, to put it rather simply, hard!

Speaker biography: Prof. Keith Martin is Professor of Information Security at Royal Holloway, University of London. He received his BSc (Hons) in Mathematics from the University of Glasgow in 1988 and a PhD from Royal Holloway in 1991. Between 1992 and 1996 he held a Research Fellowship at the University of Adelaide, investigating mathematical modelling of cryptographic key distribution problems. In 1996 he joined the COSIC research group of the Katholieke Universiteit Leuven in Belgium, working on security for third-generation mobile communications. Keith rejoined Royal Holloway in January 2000, became a Professor in Information Security in 2007, and was Director of the Information Security Group between 2010 and 2015. Keith's research interests range across cyber security, with a focus on cryptographic applications. He is the author of 'Everyday Cryptography', published by Oxford University Press.
Abstract:
The research aimed to understand the challenges involved in implementing the proposed integration between the Civil Police and the Military Police in Rio Grande do Norte under the proposals of the SUSP. The study set out to explore the gap concerning the possible causes that may hinder integrated work between the police forces in public security, through a specific analysis of the state of Rio Grande do Norte. It was based on a theoretical framework covering public policies (general concepts, the stages of a public policy, the implementation stage), public security (conceptual definitions, security policies in Brazil, the structure of public security in Brazil, and police systems), Military Police versus Civil Police (roles and conflicts), integration in public security (the challenges to be overcome), the Unified Public Safety System (SUSP), and the main difficulties in integrating the police forces. The research is classified as exploratory in purpose and qualitative in approach. The research unit was the Center for Integrated Public Safety Operations (CIOSP), studied through three subjects: the head of the CIOSP, the representative of the Military Police working with the CIOSP, and the representative of the Civil Police also working with the CIOSP. These subjects were chosen on the understanding that individuals in senior positions would be better able to answer the questions guiding the research problem. Data were collected through a set of interviews; qualitative data analysis was performed using content analysis, based on the definition of categories of analysis, in a cross-sectional time frame. The results revealed that the main problems of integration between the state police forces are treatment protocols, lack of political will, and lack of infrastructure. The relationship between the Military Police and the Civil Police in Rio Grande do Norte differs in cultural aspects, but can be considered good in terms of values, professionalism, and integrated operations. The implementation of CIOSP-RN followed the characteristics of the top-down model, and the main difficulties in implementing the proposals of the SUSP were the lack of dedicated resources, the lack of standardization in public safety, and the lack of professional training in public safety. It was concluded that, with respect to the challenges of implementing the proposed integration between the Civil Police and the Military Police in Rio Grande do Norte under the proposals of the SUSP, the actions follow the characteristics of the top-down model, with no autonomy for public administrators to weigh in on decisions, which restricts the state's view of public safety.
Abstract:
This thesis focuses on the private membership test (PMT) problem and presents three single-server protocols to solve it. In the presented solutions, a client can perform an inclusion test for some record x in a server's database without revealing their record. Moreover, after executing the protocols, the contents of the server's database remain secret. In each of these solutions, a different cryptographic protocol is utilized to construct a privacy-preserving variant of a Bloom filter. The three suggested solutions differ slightly from each other, both from a privacy perspective and from a complexity point of view. Therefore, their use cases are different, and no one of them is clearly the best of the three. We present the software implementations of the three protocols in pseudocode. The performance of our implementation is measured based on a real-world scenario. This thesis is a spin-off from the Academy of Finland research project "Cloud Security Services".
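The abstract does not spell out the protocol details, but the building block it names, a Bloom filter used for membership testing, can be sketched as follows. This is a minimal sketch of a plain (non-private) Bloom filter; the thesis's privacy-preserving variants would wrap such a structure in a cryptographic protocol, and all names and parameters here are illustrative assumptions.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: supports insertion and probabilistic membership tests.
    False positives are possible; false negatives are not."""

    def __init__(self, num_bits: int = 1 << 16, num_hashes: int = 4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8)

    def _positions(self, item: bytes):
        # Derive k bit positions from salted SHA-256 digests (illustrative choice).
        for i in range(self.num_hashes):
            digest = hashlib.sha256(i.to_bytes(4, "big") + item).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, item: bytes) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, item: bytes) -> bool:
        return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item))

# The server would hold such a filter over its database; a PMT protocol lets the
# client query it without revealing x and without learning the database contents.
bf = BloomFilter()
bf.add(b"alice@example.com")
assert bf.might_contain(b"alice@example.com")
```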
Abstract:
Security Onion is a Network Security Monitoring (NSM) platform that provides multiple Intrusion Detection Systems (IDS), including Host IDS (HIDS) and Network IDS (NIDS). Many types of data can be acquired using Security Onion for analysis, including data related to hosts, networks, sessions, assets, alerts, and protocols. Security Onion can be implemented as a standalone deployment, with server and sensor included, or with a master server and multiple sensors, allowing the system to be scaled as required. Many interfaces and tools are available for managing the system and analysing data, such as Sguil, Snorby, Squert, and Enterprise Log Search and Archive (ELSA). These interfaces can be used to analyse alerts and captured events, which can then be exported for analysis in Network Forensic Analysis Tools (NFAT) such as NetworkMiner, CapME, or Xplico. The Security Onion platform also provides various methods of management, such as Secure Shell (SSH) for managing server and sensors, and web-client remote access. Together with the ability to replay and analyse sample malicious traffic, this makes Security Onion a suitable low-cost alternative for network security management. In this paper, we review the features and functionality of Security Onion in terms of types of data, configuration, interfaces, tools, and system management.
Abstract:
Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files. During this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user in control of the session. Therefore, highly secure authentication methods must be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model captures sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make them unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operation. Large deviations from "normal behavior" can indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in the web logs generated in response to a user's actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grained (i.e., role) and fine-grained (i.e., individual) analysis. A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types. The tool is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous monitoring technique can be used with other user-based systems, such as mobile devices, and for the analysis of network traffic.
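To make the modelling step concrete, the following is a minimal sketch of how an n-gram model over user actions might score a session for anomaly. It is not the Intruder Detector implementation; the smoothing scheme, vocabulary size, and action alphabet are illustrative assumptions.

```python
import math
from collections import Counter

def train_ngram(actions, n=3):
    """Count n-grams and their (n-1)-gram prefixes from a user's action history."""
    ngrams, prefixes = Counter(), Counter()
    for i in range(len(actions) - n + 1):
        gram = tuple(actions[i:i + n])
        ngrams[gram] += 1
        prefixes[gram[:-1]] += 1
    return ngrams, prefixes

def session_score(actions, ngrams, prefixes, n=3, vocab_size=100):
    """Average per-n-gram negative log-likelihood with add-one smoothing;
    high scores mean the session deviates from the trained profile."""
    score, count = 0.0, 0
    for i in range(len(actions) - n + 1):
        gram = tuple(actions[i:i + n])
        p = (ngrams[gram] + 1) / (prefixes[gram[:-1]] + vocab_size)
        score -= math.log(p)
        count += 1
    return score / max(count, 1)

# A score far above the user's historical average would be flagged as
# possible account takeover or unintended behaviour.
history = ["login", "view", "edit", "view", "edit", "save", "logout"] * 20
ng, pf = train_ngram(history)
print(session_score(["login", "view", "edit", "save"], ng, pf))   # low: familiar
print(session_score(["login", "export", "export", "export"], ng, pf))  # high: unseen
```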
Abstract:
Secure computation involves multiple parties computing a common function while keeping their inputs private, and is a growing field of cryptography due to its potential for maintaining privacy guarantees in real-world applications. However, current secure computation protocols are not yet efficient enough to be used in practice. We argue that this is due to much of the research effort being focused on generality rather than specificity. Namely, current research tends to focus on constructing and improving protocols for the strongest notions of security or for an arbitrary number of parties. However, in real-world deployments, these security notions are often too strong, or the number of parties running a protocol is smaller. In this thesis we take several steps towards bridging the efficiency gap of secure computation by focusing on constructing efficient protocols for specific real-world settings and security models. In particular, we make the following four contributions:
- We show an efficient (when amortized over multiple runs) maliciously secure two-party secure computation (2PC) protocol in the multiple-execution setting, where the same function is computed multiple times by the same pair of parties.
- We improve the efficiency of 2PC protocols in the publicly verifiable covert security model, where a party can cheat with some probability, but if it gets caught then the honest party obtains a certificate proving that the given party cheated.
- We show how to optimize existing 2PC protocols when the function to be computed includes predicate checks on its inputs.
- We demonstrate an efficient maliciously secure protocol in the three-party setting.
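For readers new to the area, here is a toy illustration of the general idea of secure computation: two parties compute a sum without either revealing its input, using additive secret sharing over a prime field. This is a minimal, semi-honest sketch for intuition only; it is far weaker than the maliciously secure and covert-security protocols the thesis constructs, and the modulus is an arbitrary illustrative choice.

```python
import secrets

P = 2**61 - 1  # a prime modulus; all shares live in the field Z_P (illustrative choice)

def share(x: int):
    """Split x into two additive shares: x = s0 + s1 (mod P)."""
    s0 = secrets.randbelow(P)
    return s0, (x - s0) % P

# Each party secret-shares its input and sends one share to the other party.
a0, a1 = share(41)   # party A's input: 41
b0, b1 = share(1)    # party B's input: 1

# Each party locally adds the shares it holds; a single share reveals nothing.
c0 = (a0 + b0) % P   # computed by party A
c1 = (a1 + b1) % P   # computed by party B

# Reconstructing reveals only the sum, not the individual inputs.
assert (c0 + c1) % P == 42
```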
Abstract:
Nowadays, information security is a very important topic. In particular, wireless networks are experiencing ongoing widespread diffusion, also thanks to the increasing number of Internet of Things devices, which generate and transmit large amounts of data: protecting wireless communications is of fundamental importance, ideally through a simple but secure method. Physical Layer Security is an umbrella of techniques that leverages the characteristics of the wireless channel to provide security for the transmission. In particular, physical-layer key generation aims at allowing two users to generate a random symmetric key autonomously, hence without the aid of a trusted third party. Physical-layer key generation relies on observations of the wireless channel, from which entropy is harvested; however, an attacker might possess a channel simulator, for example a ray tracing simulator, to replicate the channel between the legitimate users, in order to guess the secret key and break the security of the communication. This thesis focuses on the feasibility of such a ray tracing attack: the assessment method consists of a set of channel measurements, under different channel conditions, that are then compared with the channel simulated by the ray tracer, in order to compute the mutual information between measurements and simulations. Furthermore, the thesis also presents the possibility of using ray tracing as a tool to evaluate the impact of channel parameters (e.g. the bandwidth or the directivity of the antenna) on physical-layer key generation. The measurements were carried out at the Barkhausen Institut gGmbH in Dresden (Germany), in the framework of the existing cooperation agreement between BI and the Dept. of Electrical, Electronics and Information Engineering "G. Marconi" (DEI) at the University of Bologna.
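The comparison step described above, estimating the mutual information between measured and ray-traced channel observations, can be sketched as follows. This is a minimal histogram-based estimator for intuition; the thesis's actual estimator, binning, and data are not given in the abstract, so every parameter here is an assumption.

```python
import numpy as np

def mutual_information(x, y, bins=32):
    """Histogram estimate of I(X;Y) in bits between two 1-D samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # joint distribution estimate
    px = pxy.sum(axis=1, keepdims=True)       # marginal of X
    py = pxy.sum(axis=0, keepdims=True)       # marginal of Y
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

# Illustrative check: a noisy replica of the measured gains shares information
# with the original, while an independent sequence shares almost none.
rng = np.random.default_rng(0)
measured = rng.normal(size=10_000)                      # stand-in for measured gains
simulated = measured + 0.5 * rng.normal(size=10_000)    # stand-in for ray-traced gains
print(mutual_information(measured, simulated))                 # clearly positive
print(mutual_information(measured, rng.normal(size=10_000)))   # near zero
```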
Abstract:
Recent technological advancements have played a key role in seamlessly integrating cloud, edge, and Internet of Things (IoT) technologies, giving rise to the Cloud-to-Thing Continuum paradigm. This cloud model connects many heterogeneous resources that generate large amounts of data and collaborate to deliver next-generation services. While it has the potential to reshape several application domains, the number of connected entities remarkably broadens the security attack surface. One of the main problems is the lack of security measures able to adapt to the dynamic and evolving conditions of the Cloud-to-Thing Continuum. To address this challenge, this dissertation proposes novel adaptable security mechanisms. Adaptable security is the capability of security controls, systems, and protocols to dynamically adjust to changing conditions and scenarios. However, since the design and development of novel security mechanisms can be explored from different perspectives and levels, we place our attention on threat modeling and access control. The contributions of the thesis can be summarized as follows. First, we introduce a model-based methodology that secures the design of edge and cyber-physical systems. This solution identifies threats, security controls, and moving target defense techniques based on system features. Then, we focus on access control management. Since access control policies are subject to modification, we evaluate how they can be efficiently shared among distributed areas, highlighting the effectiveness of distributed ledger technologies. Furthermore, we propose a risk-based authorization middleware that adjusts permissions based on real-time data, and a federated learning framework that enhances trustworthiness by weighting each client's contribution according to the quality of its partial model. Finally, since authorization revocation is another critical concern, we present an efficient revocation scheme for verifiable credentials in IoT networks, featuring decentralization and requiring minimal storage and computing capabilities. All the mechanisms have been evaluated under different conditions, proving their adaptability to the Cloud-to-Thing Continuum landscape.
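As one concrete point of reference for the access-control contributions, here is a minimal sketch of a risk-based authorization check that adjusts decisions from real-time signals. The risk factors, weights, and threshold are invented for illustration and do not reflect the dissertation's actual middleware.

```python
from dataclasses import dataclass

@dataclass
class RequestContext:
    """Real-time signals accompanying an access request (illustrative factors)."""
    device_trust: float    # 0.0 (unknown device) .. 1.0 (attested device)
    network_trust: float   # 0.0 (public network) .. 1.0 (trusted segment)
    anomaly_score: float   # 0.0 (typical behaviour) .. 1.0 (highly anomalous)

def risk(ctx: RequestContext) -> float:
    """Weighted risk score in [0, 1]; the weights are assumptions, not the thesis's."""
    return (0.35 * (1 - ctx.device_trust)
            + 0.25 * (1 - ctx.network_trust)
            + 0.40 * ctx.anomaly_score)

def authorize(ctx: RequestContext, sensitivity: float, threshold: float = 0.5) -> bool:
    """Permit only if the sensitivity-scaled risk stays below the policy threshold:
    more sensitive resources tolerate less risk."""
    return risk(ctx) * (1 + sensitivity) < threshold

# The same user may be allowed from a trusted context and denied from a risky one.
trusted = RequestContext(device_trust=0.9, network_trust=0.8, anomaly_score=0.1)
risky = RequestContext(device_trust=0.2, network_trust=0.3, anomaly_score=0.7)
print(authorize(trusted, sensitivity=0.5))  # True
print(authorize(risky, sensitivity=0.5))    # False
```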
Abstract:
One of the main practical implications of quantum mechanical theory is quantum computing, and therefore the quantum computer. Quantum computing (for example, with Shor's algorithm) challenges the computational hardness assumptions, such as the factoring problem and the discrete logarithm problem, that anchor the security of cryptosystems. The scientific community is therefore studying how to defend cryptography; there are two defense strategies: quantum cryptography (which involves the use of quantum cryptographic algorithms on quantum computers) and post-quantum cryptography (based on classical cryptographic algorithms, but resistant to quantum computers). For example, the National Institute of Standards and Technology (NIST) is collecting and standardizing post-quantum ciphers, just as it established DES and AES as symmetric cipher standards in the past. In this thesis, an introduction to quantum mechanics is given, in order to discuss quantum computing and analyze Shor's algorithm. The differences between quantum and post-quantum cryptography are then analyzed. Subsequently, the focus moves to the mathematical problems assumed to be resistant to quantum computers. To conclude, the post-quantum digital signature algorithms selected by NIST are studied and compared with a view to applying them in everyday life.
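To give a concrete sense of why Shor's algorithm threatens factoring-based cryptosystems, here is the classical post-processing step: given the order r of a base a modulo N, the factors of N fall out of gcd computations. The quantum part (finding r efficiently) is replaced below by brute force, so this sketch only illustrates the number theory, not a quantum speedup.

```python
from math import gcd

def order(a: int, n: int) -> int:
    """Multiplicative order of a mod n, found by brute force here;
    Shor's algorithm performs this step efficiently on a quantum computer."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_step(n: int, a: int):
    """If the order r of a mod n is even and a^(r/2) != -1 (mod n),
    then gcd(a^(r/2) +/- 1, n) yields nontrivial factors of n."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # lucky guess already shares a factor with n
    r = order(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None               # bad base: retry with another random a
    p = gcd(pow(a, r // 2, n) - 1, n)
    return p, n // p

# Textbook example: N = 15, a = 7 has order 4, and gcd(7^2 - 1, 15) = 3.
print(shor_classical_step(15, 7))  # (3, 5)
```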
Abstract:
The development and maintenance of the seal of the root canal system is key to the success of root canal treatment. Resin-based adhesive materials have the potential to reduce root canal microleakage because of their adhesive properties and penetration into the dentinal walls. Moreover, the irrigation protocol may influence the adhesiveness of resin-based sealers to root dentin. The objective of the present study was to evaluate the effect of different irrigation protocols on coronal bacterial microleakage of gutta-percha/AH Plus and Resilon/Real Seal Self-Etch systems. One hundred and ninety premolars were used. The teeth were divided into 18 experimental groups according to the irrigation protocols and filling materials used. The protocols used were: distilled water; sodium hypochlorite (NaOCl)+EDTA; NaOCl+H3PO4; NaOCl+EDTA+chlorhexidine (CHX); NaOCl+H3PO4+CHX; CHX+EDTA; CHX+H3PO4; CHX+EDTA+CHX; and CHX+H3PO4+CHX. Gutta-percha/AH Plus or Resilon/Real Seal SE were used as root-filling materials. Coronal microleakage was evaluated for 90 days against Enterococcus faecalis. Data were statistically analyzed using the Kaplan-Meier survival test and the Kruskal-Wallis and Mann-Whitney tests. No significant difference was found between the groups using chlorhexidine or sodium hypochlorite during chemo-mechanical preparation followed by EDTA or phosphoric acid for smear layer removal. The same results were found for the filling materials. However, the statistical analyses revealed that a final flush with 2% chlorhexidine significantly reduced coronal microleakage. A final flush with 2% chlorhexidine after smear layer removal reduces coronal microleakage of teeth filled with gutta-percha/AH Plus or Resilon/Real Seal SE.
Abstract:
Different surface treatment protocols of poly(methyl methacrylate) have been proposed to improve the adhesion of silicone-based resilient denture liners to poly(methyl methacrylate) surfaces. The purpose of this study was to evaluate the effect of different poly(methyl methacrylate) surface treatments on the adhesion of silicone-based resilient denture liners. Poly(methyl methacrylate) specimens were prepared and divided into 4 treatment groups: no treatment (control), methyl methacrylate for 180 seconds, acetone for 30 seconds, and ethyl acetate for 60 seconds. Poly(methyl methacrylate) disks (30.0 × 5.0 mm; n = 10) were evaluated regarding surface roughness and surface free energy. To evaluate tensile bond strength, the resilient material was applied between 2 treated poly(methyl methacrylate) bars (60.0 × 5.0 × 5.0 mm; n = 20 for each group) to form a 2-mm-thick layer. Data were analyzed by 1-way ANOVA and the Tukey honestly significant difference tests (α = .05). A Pearson correlation test verified the influence of surface properties on tensile bond strength. Failure type was assessed, and the poly(methyl methacrylate) surface treatment modifications were visualized with scanning electron microscopy. The surface roughness was increased (P < .05) by methyl methacrylate treatment. For the acetone and ethyl acetate groups, the surface free energy decreased (P < .05). The tensile bond strength was higher for the methyl methacrylate and ethyl acetate groups (P < .05). No correlation was found regarding surface properties and tensile bond strength. Specimens treated with acetone and methyl methacrylate presented a cleaner surface, whereas the ethyl acetate treatment produced a porous topography. The methyl methacrylate and ethyl acetate surface treatment protocols improved the adhesion of a silicone-based resilient denture liner to poly(methyl methacrylate).
Abstract:
The objective of the present study was to improve the detection of B. abortus by PCR in organs of aborted fetuses from infected cows, an important mechanism for finding infected herds in the eradication phase of the program. Different DNA extraction protocols were therefore compared, focusing on PCR detection of B. abortus in clinical samples collected from aborted fetuses or calves born from cows challenged with the B. abortus 2308 strain. Two gold-standard groups were built based on classical bacteriology, comprising 32 lungs (17 positive), 26 spleens (11 positive), 23 livers (8 positive), and 22 bronchial lymph nodes (7 positive). All samples were submitted to three DNA extraction protocols, followed by the same amplification process with primers B4 and B5. From the accumulated results per organ, the proportion of positives for the lungs was higher than for the livers (p=0.04) or bronchial lymph nodes (p=0.004) and equal to that for the spleens (p=0.18). From the accumulated results per DNA extraction protocol, the proportion of positives for the Boom protocol was higher than for the PK (p<0.0001) and GT (p=0.0004) protocols. There was no difference between the PK and GT protocols (p=0.5). Some samples positive by classical bacteriology were negative by PCR, and vice versa. Therefore, the best strategy for B. abortus detection in the organs of aborted fetuses or calves born from infected cows is to use, in parallel, isolation by classical bacteriology and PCR, with DNA extraction performed by the Boom protocol.
Abstract:
Few articles deal with lead and strontium isotopic analysis of water samples. The aim of this study was to define the chemical procedures for Pb and Sr isotopic analyses of groundwater samples from an urban sedimentary aquifer. Thirty lead and fourteen strontium isotopic analyses were performed to test different analytical procedures. Pb and Sr isotopic ratios as well as Sr concentration did not vary using different chemical procedures. However, the Pb concentrations were very dependent on the different procedures. Therefore, the choice of the best analytical procedure was based on the Pb results, which indicated a higher reproducibility from samples that had been filtered and acidified before the evaporation, had their residues totally dissolved, and were purified by ion chromatography using the Biorad® column. Our results showed no changes in Pb ratios with the storage time.
Abstract:
The increasing adoption of information systems in healthcare has led to a scenario where patient information security is more and more regarded as a critical issue. Allowing patient information to be put in jeopardy may lead to irreparable physical, moral, and social damage to the patient, potentially shaking the credibility of the healthcare institution. Medical images play a crucial role in this context, given their importance in diagnosis, treatment, and research. Therefore, it is vital to take measures to prevent tampering and to determine their provenance. This demands the adoption of security mechanisms that assure information integrity and authenticity. There are a number of works in this field, based on two major approaches: the use of metadata and the use of watermarking. However, there are still limitations in both approaches that must be properly addressed. This paper presents a new method using cryptographic means to improve the trustworthiness of medical images, providing a stronger link between the image and the information on its integrity and authenticity, without compromising image quality for the end user. The use of Digital Imaging and Communications in Medicine (DICOM) structures is also an advantage for ease of development and deployment.
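The paper's cryptographic method is not detailed in this abstract, but the primitive it builds on, binding an image to integrity and authenticity information, can be sketched with a hash plus a digital signature. This is a generic illustration using standard primitives (SHA-256 and Ed25519 via the `cryptography` package), not the paper's actual scheme or its DICOM integration.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

def sign_image(pixel_data: bytes, signer: Ed25519PrivateKey) -> bytes:
    """Hash the pixel data and sign the digest, binding integrity (hash)
    and authenticity (signature) to the image without altering its pixels."""
    digest = hashlib.sha256(pixel_data).digest()
    return signer.sign(digest)

def verify_image(pixel_data: bytes, signature: bytes, public_key) -> bool:
    """Recompute the digest and check the signature; any pixel tampering
    changes the hash and invalidates the signature."""
    digest = hashlib.sha256(pixel_data).digest()
    try:
        public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False

# The signature (and signer certificate) could travel in image metadata,
# e.g. a DICOM attribute, rather than being watermarked into the pixels.
key = Ed25519PrivateKey.generate()
image = b"\x00\x10\x20"  # stand-in for pixel data
sig = sign_image(image, key)
print(verify_image(image, sig, key.public_key()))              # True
print(verify_image(image + b"tamper", sig, key.public_key()))  # False
```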