125 results for secure interoperation
Abstract:
A fully homomorphic encryption (FHE) scheme is envisioned as a key cryptographic tool in building a secure and reliable cloud computing environment, as it allows arbitrary evaluation of a ciphertext without revealing the plaintext. However, existing FHE implementations remain impractical due to very high time and resource costs. To the authors’ knowledge, this paper presents the first hardware implementation of a full encryption primitive for FHE over the integers using FPGA technology. A large-integer multiplier architecture utilising Integer-FFT multiplication is proposed, and a large-integer Barrett modular reduction module is designed incorporating the proposed multiplier. The encryption primitive used in the integer-based FHE scheme is designed employing the proposed multiplier and modular reduction modules. The designs are verified using the Xilinx Virtex-7 FPGA platform. Experimental results show that a speed improvement factor of up to 44 is achievable for the hardware implementation of the FHE encryption scheme when compared to its corresponding software implementation. Moreover, performance analysis shows further speed improvements of the integer-based FHE encryption primitives may still be possible, for example through further optimisations or by targeting an ASIC platform.
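The paper's modular-reduction module is built around Barrett reduction, which replaces division by the modulus with multiplications by a precomputed constant — a structure that maps well to multiplier-rich FPGA fabric. The following is a minimal software sketch of the Barrett algorithm itself, for illustration only; it is not the authors' hardware architecture.

```python
# Barrett modular reduction: compute x mod m without a divider, using one
# precomputed constant and two multiplications plus cheap corrections.
# Illustrative software sketch of the algorithm family the paper maps to
# FPGA logic, not the authors' large-integer architecture.

def barrett_setup(m: int) -> tuple[int, int]:
    """Precompute k (bit length of m) and mu = floor(4^k / m)."""
    k = m.bit_length()
    mu = (1 << (2 * k)) // m
    return k, mu

def barrett_reduce(x: int, m: int, k: int, mu: int) -> int:
    """Reduce 0 <= x < m^2 modulo m using the precomputed mu."""
    q = (x * mu) >> (2 * k)   # estimate of x // m, too small by at most 2
    r = x - q * m             # so 0 <= r < 3m
    while r >= m:             # at most two correction subtractions
        r -= m
    return r
```

In hardware, the two large multiplications in `barrett_reduce` are exactly where the paper's Integer-FFT multiplier would be instantiated.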
Abstract:
The aim of this research was to explore consumer perceptions of personalised nutrition and to compare these across three different levels of "medicalization": lifestyle assessment (no blood sampling); phenotypic assessment (blood sampling); and genomic assessment (blood and buccal sampling). The protocol was developed from two pilot focus groups conducted in the UK. Two focus groups (one comprising only "older" individuals between 30 and 60 years old, the other adults 18-65 years of age) were run in the UK, Spain, the Netherlands, Poland, Portugal, Ireland, Greece and Germany (N = 16). The analysis (guided using grounded theory) suggested that personalised nutrition was perceived in terms of benefit to health and fitness and that convenience was an important driver of uptake. Negative attitudes were associated with internet delivery but not with personalised nutrition per se. Barriers to uptake were linked to broader technological issues associated with data protection and trust in regulators and service providers. Services that required a fee were expected to be of better quality and more secure. An efficacious, transparent and trustworthy regulatory framework for personalised nutrition is required to alleviate consumer concern. In addition, developing trust in service providers is important if such services are to be successful. (C) 2013 Elsevier Ltd. All rights reserved.
Abstract:
Tephrochronology, a key tool in the correlation of Quaternary sequences, relies on the extraction of tephra shards from sediments for visual identification and high-precision geochemical comparison. A prerequisite for the reliable correlation of tephra layers is that the geochemical composition of glass shards remains unaltered by natural processes (e.g. chemical exchange in the sedimentary environment) and/or by laboratory analytical procedures. However, natural glasses, particularly when in the form of small shards with a high surface-to-volume ratio, are prone to chemical alteration in both acidic and basic environments. Current techniques for the extraction of distal tephra from sediments involve the 'cleaning' of samples in precisely such environments and at elevated temperatures. The acid phase of the 'cleaning' process risks alteration of the geochemical signature of the shards, while the basic phase leads to considerable sample loss through dissolution of the silica network. Here, we illustrate the degree of alteration and loss to which distal tephras may be prone, and introduce a less destructive procedure for their extraction. This method, based on stepped heavy liquid flotation, results in samples of sufficient quality for analysis while preserving their geochemical integrity. In trials, this method outperformed chemical extraction procedures in terms of the number of shards recovered and has resulted in the detection of new tephra layers with low shard concentrations. The implications of this study are highly significant because (i) the current database of distal tephra records and their corresponding geochemical signatures may require refinement and (ii) the record of distal tephras may be incomplete due to sample loss induced by corrosive laboratory procedures. It is therefore vital that less corrosive laboratory procedures are developed to make the detection and classification of distal glass tephra more secure.
Abstract:
Introduction Asthma is now one of the most common long-term conditions in the UK. It is therefore important to develop a comprehensive appreciation of the healthcare and societal costs in order to inform decisions on care provision and planning. We plan to build on our earlier estimates of national prevalence and costs from asthma by filling the data gaps previously identified in relation to healthcare and broadening the field of enquiry to include societal costs. This work will provide the first UK-wide estimates of the costs of asthma. In the context of asthma for the UK and its member countries (i.e., England, Northern Ireland, Scotland and Wales), we seek to: (1) produce a detailed overview of estimates of incidence, prevalence and healthcare utilisation; (2) estimate health and societal costs; (3) identify any remaining information gaps and explore the feasibility of filling these and (4) provide insights into future research that has the potential to inform changes in policy leading to the provision of more cost-effective care.
Methods and analysis Secondary analyses of data from national health surveys, primary care, prescribing, emergency care, hospital, mortality and administrative data sources will be undertaken to estimate prevalence, healthcare utilisation and outcomes from asthma. Data linkages and economic modelling will be undertaken in an attempt to populate data gaps and estimate costs. Separate prevalence and cost estimates will be calculated for each of the UK-member countries and these will then be aggregated to generate UK-wide estimates.
Ethics and dissemination Approvals have been obtained from the NHS Scotland Information Services Division's Privacy Advisory Committee, the Secure Anonymised Information Linkage Collaboration Review System, the NHS South-East Scotland Research Ethics Service and The University of Edinburgh's Centre for Population Health Sciences Research Ethics Committee. We will produce a report for Asthma-UK, submit papers to peer-reviewed journals and construct an interactive map.
Abstract:
This chapter presents an analysis of the unprecedented use of electronic voting by expatriates during the French 2012 legislative elections, when, for the first time, they elected their own representatives (referred to here as 'deputies') to the National Assembly in Paris in 11 newly created overseas constituencies.
The study is presented within the broader perspective of electronic voting in France more generally, and in the historical context of extra-territorial voting by French expatriates. The authors discuss the main issues and controversies that arose during the 2012 elections, and in a final section analyse the results. The authors conclude by drawing attention to recent developments in electronic voting in France since the 2012 elections, which suggest that although experts expressed much criticism of electronic voting as to the security and transparency of the system used, the official discourse that acclaimed the experience as a success appears to have convinced its target audience.
Abstract:
In contrast to the mathematical techniques adopted by classical cryptography at higher protocol layers, it is shown that characteristics intrinsic to the physical layer can be exploited to secure useful information. It is shown that a retrodirective array can be made to operate more securely by incorporating directional modulation (DM) concepts. The presented new approach allows DM to operate in a multipath environment, whereas previous DM systems could only operate in free space.
Abstract:
Leniency (amnesty) plus is one of the tools used in the fight against anticompetitive agreements. It allows a cartelist who did not manage to secure complete immunity under general leniency to secure an additional reduction of sanctions in exchange for cooperation with the authorities with respect to the operation of another prohibited agreement on an unrelated market. The instrument was developed in the United States and, in recent years, has been introduced in a number of jurisdictions. This article contextualises the operation of and rationale behind leniency plus, forewarning about its potential pro-collusive effects and the possibility of its strategic (mis)use by cartelists. It discusses theoretical, moral, and systemic (deterrence-related) problems surrounding this tool. It also provides a comparison of leniency plus in eleven jurisdictions, identifying common design flaws. This piece argues that leniency plus tends to be a problematic and poorly transplanted US legal innovation. Policy-makers considering its introduction should analyse it in light of institutional limits and local realities. Some of the regimes that have already introduced it would be better off abandoning it.
Abstract:
This letter proposes several relay selection policies for secure communication in cognitive decode-and-forward (DF) relay networks, where a pair of cognitive relays is opportunistically selected for security protection against eavesdropping. The first relay transmits the secrecy information to the destination, and the second relay, acting as a friendly jammer, transmits a jamming signal to confound the eavesdropper. We present new exact closed-form expressions for the secrecy outage probability. Our analysis and simulation results strongly support our conclusion that the proposed relay selection policies can enhance the performance of secure cognitive radio. We also confirm that an error floor arises in the absence of jamming.
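The secrecy outage probability studied here can also be estimated numerically. The sketch below is a deliberately simplified single-hop Rayleigh-fading wiretap link — not the letter's cognitive DF relay model or its closed-form expressions — declaring an outage whenever the instantaneous secrecy capacity falls below a target rate Rs.

```python
# Monte Carlo estimate of secrecy outage probability for a toy Rayleigh-fading
# wiretap link: outage occurs when the instantaneous secrecy capacity
# log2(1 + SNR_d*g_d) - log2(1 + SNR_e*g_e) falls below a target rate Rs.
# Simplified single-hop illustration; parameters are assumptions.
import math
import random

def secrecy_outage_prob(snr_d: float, snr_e: float, rate_s: float,
                        trials: int = 100_000, seed: int = 1) -> float:
    rng = random.Random(seed)
    outages = 0
    for _ in range(trials):
        g_d = rng.expovariate(1.0)  # Rayleigh fading -> exponential power gain
        g_e = rng.expovariate(1.0)
        cs = math.log2(1 + snr_d * g_d) - math.log2(1 + snr_e * g_e)
        if cs < rate_s:
            outages += 1
    return outages / trials
```

As expected, raising the eavesdropper's average SNR pushes the estimated outage probability up, which is the qualitative behaviour such closed-form analyses capture exactly.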
Abstract:
Cognitive radio has emerged as an essential recipe for future high-capacity high-coverage multi-tier hierarchical networks. Securing data transmission in these networks is of utmost importance. In this paper, we consider the cognitive wiretap channel and propose using multiple antennas to secure the transmission at the physical layer, where the eavesdropper overhears the transmission from the secondary transmitter to the secondary receiver. The secondary receiver and the eavesdropper are equipped with multiple antennas, and passive eavesdropping is considered, where the channel state information of the eavesdropper's channel is not available at the secondary transmitter. We present new closed-form expressions for the exact and asymptotic secrecy outage probability. Our results reveal the impact of the primary network on the secondary network in the presence of a multi-antenna wiretap channel.
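The benefit of equipping the legitimate receiver with multiple antennas can be checked with a quick simulation. The toy model below uses selection combining (best of N independent Rayleigh power gains) at the legitimate receiver against a single-antenna passive eavesdropper; it illustrates the trend only and is not the paper's cognitive-network model or its exact/asymptotic expressions.

```python
# Toy check of the multi-antenna effect in a wiretap channel: the legitimate
# receiver applies selection combining over n_antennas Rayleigh branches,
# while the passive eavesdropper keeps a single antenna. Parameters are
# illustrative assumptions.
import math
import random

def outage_with_antennas(n_antennas: int, snr: float, rate_s: float,
                         trials: int = 50_000, seed: int = 7) -> float:
    rng = random.Random(seed)
    outages = 0
    for _ in range(trials):
        g_d = max(rng.expovariate(1.0) for _ in range(n_antennas))
        g_e = rng.expovariate(1.0)
        cs = math.log2(1 + snr * g_d) - math.log2(1 + snr * g_e)
        if cs < rate_s:
            outages += 1
    return outages / trials
```

Running this with 1 versus 4 receive antennas shows the secrecy outage probability dropping as antennas are added, the qualitative effect the paper quantifies in closed form.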
Abstract:
The next-generation smart grid will rely heavily on telecommunications infrastructure for data transfer between various systems. Any point of data transfer in a system is a potential security threat. When we consider the possibility of smart grid data being at the heart of our critical systems infrastructure, it is imperative that we do all we can to ensure the confidentiality, availability and integrity of the data. A discussion of security itself is outside the scope of this paper, but if we assume the network to be as secure as possible, we must consider what we can do to detect when that security fails, or when an attack comes from inside the network. One way to do this is to set up a hacker-trap, or honeypot. A honeypot is a device or service on a network which appears legitimate, but is in fact a trap set up to catch breach attempts. This paper identifies the different types of honeypot and describes where each may be used. The authors have set up a test honeypot system which has been live for some time. The test system has been set up to emulate a device on a utility network. The system has had many hits, which are described in detail by the authors. Finally, the authors discuss how larger-scale systems in utilities may benefit from honeypot placement.
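The core mechanism — a fake service that looks legitimate but only exists to log connection attempts — can be sketched in a few lines. This is a minimal low-interaction illustration; the banner text and the idea of emulating a utility-network device are assumptions for the example, not the authors' actual test system.

```python
# Minimal low-interaction honeypot sketch: listen on a TCP port, log every
# connection attempt, present a fake service banner, then close. A real
# deployment would emulate the target device's protocol in depth; the
# "SCADA-GW" banner here is purely illustrative.
import socket
import threading
import datetime

class Honeypot:
    def __init__(self, host: str = "127.0.0.1", port: int = 0,
                 banner: bytes = b"220 SCADA-GW v1.2 ready\r\n"):
        self.banner = banner
        self.log: list[str] = []          # in-memory record of hits
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.sock.bind((host, port))      # port 0 = pick any free port
        self.sock.listen(5)
        self.port = self.sock.getsockname()[1]

    def serve_one(self) -> None:
        """Accept a single connection, log it, send the banner, close."""
        conn, addr = self.sock.accept()
        stamp = datetime.datetime.now().isoformat()
        self.log.append(f"{stamp} hit from {addr[0]}:{addr[1]}")
        conn.sendall(self.banner)
        conn.close()
```

In practice the log would be shipped to a monitoring system, since any hit on a honeypot is by definition suspicious traffic.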
Abstract:
One of the core properties of Software Defined Networking (SDN) is the ability for third parties to develop network applications. This introduces increased potential for innovation in networking from performance-enhanced to energy-efficient designs. In SDN, the application connects with the network via the SDN controller. A specific concern relating to this communication channel is whether an application can be trusted or not. For example, what information about the network state is gathered by the application? Is this information necessary for the application to execute or is it gathered for malicious intent? In this paper we present an approach to secure the northbound interface by introducing a permissions system that ensures that controller operations are available to trusted applications only. Implementation of this permissions system with our Operation Checkpoint adds negligible overhead and illustrates successful defense against unauthorized control function access attempts.
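The permissions idea — controller operations are available to trusted applications only — can be sketched as an explicit grant table consulted before every northbound call. The class and operation names below are illustrative assumptions, not the authors' Operation Checkpoint API.

```python
# Sketch of a northbound-interface permissions check in the spirit of the
# paper's Operation Checkpoint: each application is granted an explicit set
# of controller operations, and any call outside that set is rejected.
# Operation names are illustrative, not the authors' actual API.

class UnauthorizedOperation(Exception):
    """Raised when an app invokes an operation it was not granted."""

class OperationCheckpoint:
    def __init__(self) -> None:
        self._grants: dict[str, set[str]] = {}

    def grant(self, app_id: str, operations: set[str]) -> None:
        """Register the exact set of operations an application may call."""
        self._grants[app_id] = set(operations)

    def check(self, app_id: str, operation: str) -> None:
        """Deny-by-default: unknown apps and ungranted operations fail."""
        if operation not in self._grants.get(app_id, set()):
            raise UnauthorizedOperation(f"{app_id} may not call {operation}")

checkpoint = OperationCheckpoint()
checkpoint.grant("load-balancer", {"read_topology", "set_flow"})
```

The deny-by-default stance matters: an application the controller has never seen gets no operations at all, which matches the paper's goal of defending against unauthorized control-function access.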
Abstract:
Renewed archaeological investigation of the West Mouth of Niah Cave, Borneo has demonstrated that even within lowland equatorial environments depositional conditions do exist where organic remains of late glacial and early post-glacial age can be preserved. Excavations by the Niah Cave Research Project (NCP) (2000-2003) towards the rear of the archaeological reserve produced several bone points and worked stingray spines, which exhibit evidence of hafting mastic and fibrous binding still adhering to their shafts. The position of both gives a strong indication of how these cartilaginous points were hafted and gives insight into their potential function. These artefacts were recovered from secure, 14C-dated stratigraphic horizons. The results of this study have implications for our understanding of the function of the Terminal Pleistocene and Early Holocene bone tools recovered from other regions of Island Southeast Asia. They demonstrate that by the end of the Pleistocene, rainforest foragers in Borneo were producing composite technologies that probably included fishing leisters and potentially the bow and arrow. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Recent research in Europe, Africa, and Southeast Asia suggests that we can no longer assume a direct and exclusive link between anatomically modern humans and behavioral modernity (the 'human revolution'), or that the presence of either one implies the presence of the other: discussions of the emergence of cultural complexity have to proceed with greater scrutiny of the evidence on a site-by-site basis to establish secure associations between the archaeology present there and the hominins who created it. This paper presents one such case study: Niah Cave in Sarawak on the island of Borneo, famous for the discovery in 1958 in the West Mouth of the Great Cave of a modern human skull, the 'Deep Skull,' controversially associated with radiocarbon dates of ca. 40,000 years before the present. A new chronostratigraphy has been developed through a re-investigation of the lithostratigraphy left by the earlier excavations, AMS-dating using three different comparative pre-treatments including ABOX of charcoal, and U-series dating using the Diffusion-Absorption model applied to fragments of bone from the Deep Skull itself. Stratigraphic reasons for earlier uncertainties about the antiquity of the skull are examined, and it is shown not to be an 'intrusive' artifact. It was probably excavated from fluvial-pond-desiccation deposits that accumulated episodically in a shallow basin immediately behind the cave entrance lip, in a climate that ranged from times of comparative aridity with complete desiccation to episodes of greater surface wetness, changes attributed to regional climatic fluctuations. Vegetation outside the cave varied significantly over time, including wet lowland forest, montane forest, savannah, and grassland. The new dates and the lithostratigraphy relate the Deep Skull to evidence of episodes of human activity that range in date from ca. 46,000 to ca. 34,000 years ago.
Initial investigations of sediment scorching, pollen, palynomorphs, phytoliths, plant macrofossils, and starch grains recovered from existing exposures, and of vertebrates from the current and the earlier excavations, suggest that human foraging during these times was marked by habitat-tailored hunting technologies, the collection and processing of toxic plants for consumption, and, perhaps, the use of fire at some forest-edges. The Niah evidence demonstrates the sophisticated nature of the subsistence behavior developed by modern humans to exploit the tropical environments that they encountered in Southeast Asia, including rainforest. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
Side-channel analysis of cryptographic systems can allow an adversary to recover secret information even where the underlying algorithms have been shown to be provably secure. This is achieved by exploiting the unintentional leakages inherent in the implementation of the algorithm in software or hardware. Within this field of research, a class of attacks known as profiling attacks — or, more specifically, as used here, template attacks — have been shown to be extremely efficient at extracting secret keys. Template attacks assume a strong adversarial model, in which an attacker has an identical device with which to profile the power consumption of various operations; this profile can then be used to attack the target device efficiently. Inherent in this assumption is that the power consumption across the devices under test is somewhat similar. This central tenet of the attack is largely unexplored in the literature, with the research community generally performing the profiling stage on the same device as is being attacked. This is beneficial for evaluation or penetration testing, as it is essentially the best-case scenario for an attacker, in which the model built during the profiling stage matches that of the target device exactly; however, it is not necessarily a reflection of how the attack will work in reality. In this work, a large-scale evaluation of this assumption is performed, comparing key recovery performance across 20 identical smart-cards when performing a profiling attack.
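The profile-then-classify structure of a template attack can be illustrated with a toy univariate Gaussian model: learn a (mean, std) leakage template per key hypothesis on a "profiling device", then pick the hypothesis that maximises the likelihood of traces from the "target device". Real template attacks are multivariate over many trace points and use measured power traces; everything below, including the simulated leakage in the usage example, is an illustrative assumption.

```python
# Toy univariate template attack: Gaussian (mean, std) template per key
# hypothesis from profiling traces, then maximum-likelihood classification
# of target traces. Illustrates the structure only; real attacks are
# multivariate over many points of interest.
import math
import statistics

def build_templates(profiling_traces: dict[int, list[float]]) -> dict[int, tuple[float, float]]:
    """Map each key hypothesis to a (mean, std) leakage template."""
    return {k: (statistics.mean(v), statistics.stdev(v))
            for k, v in profiling_traces.items()}

def classify(templates: dict[int, tuple[float, float]], traces: list[float]) -> int:
    """Return the key hypothesis with the highest Gaussian log-likelihood."""
    def loglik(k: int) -> float:
        mu, sd = templates[k]
        return sum(-math.log(sd) - (t - mu) ** 2 / (2 * sd * sd) for t in traces)
    return max(templates, key=loglik)
```

The paper's question maps directly onto this sketch: if the templates are built on one device but the target traces come from a different (nominally identical) device whose leakage means are shifted, classification accuracy degrades — which is exactly what the 20-smart-card evaluation measures.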
Abstract:
Digital signatures are an important primitive for building secure systems and are used in most real-world security protocols. However, almost all popular signature schemes are either based on the factoring assumption (RSA) or the hardness of the discrete logarithm problem (DSA/ECDSA). In the case of classical cryptanalytic advances or progress on the development of quantum computers, the hardness of these closely related problems might be seriously weakened. A potential alternative approach is the construction of signature schemes based on the hardness of certain lattice problems that are assumed to be intractable by quantum computers. Due to significant research advancements in recent years, lattice-based schemes have now become practical and appear to be a very viable alternative to number-theoretic cryptography. In this article, we focus on recent developments and the current state of the art in lattice-based digital signatures and provide a comprehensive survey discussing signature schemes with respect to practicality. Additionally, we discuss future research areas that are essential for the continued development of lattice-based cryptography.