886 results for Pearl Harbor (Hawaii), Attack on, 1941
Abstract:
Greater complexity and interconnectivity across systems embracing Smart Grid technologies have meant that cyber-security issues have attracted significant attention. This paper describes pertinent cyber-security requirements, in particular the cyber attacks and countermeasures which are critical for reliable Smart Grid operation. Relevant published literature is presented for critical aspects of Smart Grid cyber-security, such as vulnerability, interdependency, simulation, and standards. Furthermore, a preliminary case study is given which demonstrates the impact of a cyber attack that violates data integrity on the load management of a real power system. Finally, the paper proposes a future work plan which focuses on applying intrusion detection and prevention technology to address cyber-security issues. This paper also provides an overview of Smart Grid cyber-security with reference to related cross-disciplinary research topics.
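To make the case study's attack class concrete, the toy sketch below shows how violating the integrity of reported load measurements can distort a simple load-management decision. The feeder names, capacity, and shedding rule are hypothetical and are not taken from the paper's case study.

```python
# Illustrative sketch only: a toy example of how a data-integrity attack on
# reported load measurements could distort a simple load-management decision.
# Feeder names, capacity, and the shedding rule are hypothetical.

true_loads_mw = {"feeder_A": 42.0, "feeder_B": 35.0, "feeder_C": 53.0}

def shed_load(loads_mw, capacity_mw=150.0):
    """Shed the largest feeders until total reported load fits the capacity."""
    shed = []
    total = sum(loads_mw.values())
    for name, mw in sorted(loads_mw.items(), key=lambda kv: -kv[1]):
        if total <= capacity_mw:
            break
        shed.append(name)
        total -= mw
    return shed

# Attacker violates integrity by inflating one measurement in transit.
tampered = dict(true_loads_mw, feeder_A=142.0)

print("decision on true data:    ", shed_load(true_loads_mw))  # no shedding needed
print("decision on tampered data:", shed_load(tampered))       # unnecessary shedding
```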
Abstract:
We consider the problem of self-healing in peer-to-peer networks that are under repeated attack by an omniscient adversary. We assume that, over a sequence of rounds, an adversary either inserts a node with arbitrary connections or deletes an arbitrary node from the network. The network responds to each such change by quick "repairs," which consist of adding or deleting a small number of edges. These repairs essentially preserve closeness of nodes after adversarial deletions, without increasing node degrees by too much, in the following sense. At any point in the algorithm, nodes v and w whose distance would have been ℓ in the graph formed by considering only the adversarial insertions (not the adversarial deletions) will be at distance at most ℓ log n in the actual graph, where n is the total number of vertices seen so far. Similarly, at any point, a node v whose degree would have been d in the graph with adversarial insertions only will have degree at most 3d in the actual graph. Our distributed data structure, which we call the Forgiving Graph, has low latency and bandwidth requirements. The Forgiving Graph improves on the Forgiving Tree distributed data structure from Hayes et al. (2008) in the following ways: 1) it ensures low stretch over all pairs of nodes, while the Forgiving Tree only ensures low diameter increase; 2) it handles both node insertions and deletions, while the Forgiving Tree only handles deletions; 3) it requires only a very simple and minimal initialization phase, while the Forgiving Tree initially requires construction of a spanning tree of the network. © Springer-Verlag 2012.
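The stretch guarantee can be illustrated with a minimal sketch (this is not the Forgiving Graph repair algorithm itself): compare shortest-path distances in the actual, repaired graph against distances in the hypothetical insert-only graph. The graphs and the repair edge below are invented, and base-2 logarithm is assumed.

```python
# A minimal sketch (not the Forgiving Graph algorithm itself) of how the stretch
# guarantee can be checked: compare shortest-path distances in the actual,
# repaired graph against distances in the graph containing only the adversarial
# insertions. The graphs and repair edge below are made-up examples.
from collections import deque
import math

def bfs_dist(adj, src, dst):
    """Shortest-path distance by BFS; returns math.inf if unreachable."""
    seen, frontier, d = {src}, deque([src]), {src: 0}
    while frontier:
        u = frontier.popleft()
        if u == dst:
            return d[u]
        for v in adj.get(u, ()):
            if v not in seen:
                seen.add(v)
                d[v] = d[u] + 1
                frontier.append(v)
    return math.inf

# Insert-only graph: a path a-b-c-d (node b will later be deleted by the adversary).
insert_only = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
# Actual graph after deleting b and adding a repair edge a-c.
actual = {"a": {"c"}, "c": {"a", "d"}, "d": {"c"}}

n = 4  # vertices seen so far
for v, w in [("a", "c"), ("a", "d")]:
    ell = bfs_dist(insert_only, v, w)
    act = bfs_dist(actual, v, w)
    print(v, w, act, "<=", ell * math.log2(n), act <= ell * math.log2(n))
```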
Abstract:
This paper presents an integrated design and costing method for large stiffened panels, for the purpose of investigating the influence and interaction of lay-up technology and production rate on manufacturing cost. A series of wing cover panels (≈586 kg, 19.9 m²) has been sized with realistic requirements, considering manual and automated lay-up routes. The integrated method has enabled the quantification of component unit cost sensitivity to changes in annual production rate and employed equipment maximum deposition rate. Moreover, the results demonstrate the interconnected relationship among lay-up process, panel design, and unit cost. The optimum unit cost solution when using automated lay-up is a combination of the minimum deposition rate and the minimum number of lay-up machines that meet the required production rate. However, the location of the optimum unit cost, at the boundaries between the numbers of lay-up machines required, can make unit cost very sensitive to small changes in component design, production rate, and equipment maximum deposition rate.
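As a rough illustration of why unit cost steps at the boundaries between machine counts, the arithmetic sketch below amortises equipment cost over a range of hypothetical production rates; every figure (machine price, available hours, deposition rate) is invented and is not part of the paper's cost model.

```python
# Illustrative arithmetic only: how the number of lay-up machines needed (and
# hence amortised equipment cost per panel) steps with annual production rate
# and machine deposition rate. All figures below are hypothetical.
import math

def machines_needed(panels_per_year, panel_mass_kg, dep_rate_kg_per_h,
                    hours_per_machine_year=4000.0):
    layup_hours = panels_per_year * panel_mass_kg / dep_rate_kg_per_h
    return math.ceil(layup_hours / hours_per_machine_year)

def unit_equipment_cost(panels_per_year, machine_price=4e6, amort_years=10, **kw):
    m = machines_needed(panels_per_year, **kw)
    return m * machine_price / amort_years / panels_per_year

for rate in (50, 100, 150, 200, 250):
    cost = unit_equipment_cost(rate, panel_mass_kg=586, dep_rate_kg_per_h=5)
    print(f"{rate:>3} panels/yr -> {cost:,.0f} per panel")
```

Because the machine count is a ceiling function, the amortised cost per panel is not monotone in production rate, which mirrors the sensitivity at machine-count boundaries described above.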
Abstract:
This paper investigates cyber attacks on ICS which rely on IEC 60870-5-104 for telecontrol communications. The main focus of the paper is on man-in-the-middle attacks, covering modification and injection of commands; it also details capture and replay attacks. An initial set of attacks is performed in a local, software-simulated laboratory. Final experiments and validation of a man-in-the-middle attack are performed in a comprehensive testbed environment in conjunction with an electricity distribution operator.
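A simplified sketch of one command-modification manipulation of the kind described: flipping the command state bit of an IEC 60870-5-104 single command (C_SC_NA_1) inside an I-format frame. The field offsets assume the default IEC-104 parameter sizes (2-octet cause of transmission, 2-octet common address, 3-octet information object address), and the frame bytes are a fabricated example rather than captured traffic.

```python
# A simplified sketch of a man-in-the-middle command modification on an
# IEC 60870-5-104 I-frame carrying a single command (C_SC_NA_1, type 45):
# locate the single-command object (SCO) octet and flip its command state bit.
# Offsets assume default IEC-104 field sizes; the frame below is fabricated.

def flip_single_command(apdu: bytes) -> bytes:
    assert apdu[0] == 0x68, "not an IEC-104 APDU"
    assert apdu[2] & 0x01 == 0, "not an I-format frame"
    assert apdu[6] == 45, "not a single command (C_SC_NA_1)"
    sco_index = 6 + 1 + 1 + 2 + 2 + 3      # type, VSQ, COT, common addr, IOA
    frame = bytearray(apdu)
    frame[sco_index] ^= 0x01                # toggle SCS bit: ON <-> OFF
    return bytes(frame)

# Fabricated example frame: APCI (6 bytes) + ASDU for "switch ON" at IOA 16.
example = bytes([0x68, 0x0E, 0x02, 0x00, 0x02, 0x00,   # start, length, Tx/Rx seq
                 45, 0x01, 0x06, 0x00, 0x01, 0x00,     # type, VSQ, COT=act, CA=1
                 0x10, 0x00, 0x00, 0x01])              # IOA=16, SCO: SCS=1 (ON)
print(flip_single_command(example).hex())
```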
Abstract:
In this short paper, we present an integrated approach to detecting and mitigating cyber-attacks on modern, interconnected industrial control systems. One of the primary goals of this approach is to be cost-effective; thus, whenever possible, it builds on open-source security technologies and open standards, complemented with novel security solutions that address the specific challenges of securing critical infrastructures.
Abstract:
Most cryptographic devices inevitably require resistance against the threat of side-channel attacks. To this end, masking and hiding schemes have been proposed since 1999. The security validation of these countermeasures is an ongoing research topic, as a wider range of new and existing attack techniques is tested against them. This paper examines the side-channel security of the balanced encoding countermeasure, whose aim is to process the secret key-related data under constant Hamming weight and/or Hamming distance leakage. Unlike previous works, we assume that the leakage model coefficients conform to a normal distribution, producing a model with closer fidelity to real-world implementations. We perform analysis on the balanced-encoded PRINCE block cipher with a simulated leakage model and also on an implementation on an AVR board. We consider both standard correlation power analysis (CPA) and bit-wise CPA. We confirm the resistance of the countermeasure against standard CPA; however, we find with bit-wise CPA that we can reveal the key with only a few thousand traces.
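For reference, the sketch below shows generic correlation power analysis on simulated single-sample traces, contrasting a standard Hamming-weight model with a bit-wise (single-bit) model. The 4-bit S-box, leakage model, and noise level are placeholders and are not taken from the paper's PRINCE implementation.

```python
# A generic sketch of correlation power analysis (CPA) on simulated traces,
# contrasting a standard Hamming-weight model with a bit-wise (single-bit)
# model. The S-box and noise parameters below are placeholders.
import numpy as np

rng = np.random.default_rng(0)
SBOX = np.array([0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
                 0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2])   # placeholder 4-bit S-box
HW = np.array([bin(x).count("1") for x in range(16)])

secret_key = 0x7
n_traces = 2000
plaintexts = rng.integers(0, 16, n_traces)
intermediate = SBOX[plaintexts ^ secret_key]
# Simulated single-sample traces: Hamming-weight leakage plus Gaussian noise.
traces = HW[intermediate] + rng.normal(0, 1.0, n_traces)

def cpa(model_fn):
    """Return the key guess whose hypothetical leakage correlates best with the traces."""
    scores = [abs(np.corrcoef(model_fn(SBOX[plaintexts ^ k]), traces)[0, 1])
              for k in range(16)]
    return int(np.argmax(scores))

print("standard CPA guess:", cpa(lambda v: HW[v]))     # Hamming-weight model
print("bit-wise CPA guess:", cpa(lambda v: v & 1))     # least-significant-bit model
```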
Abstract:
In this paper we identify requirements for choosing a threat-modelling formalisation for modelling sophisticated malware such as Duqu 2.0. We discuss the gaps in current formalisations and propose the use of Attack Trees with Sequential Conjunction for analysing complex attacks. The paper models Duqu 2.0 based on the latest information drawn from formal and informal sources, providing a well-structured model which can be used for future analysis of Duqu 2.0 and related attacks.
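A minimal sketch of the sequential-conjunction idea (not the paper's Duqu 2.0 model): OR means any child suffices, AND means all children in any order, and SAND means all children completed in left-to-right order. The leaf names are illustrative only.

```python
# A minimal sketch of an attack tree with sequential conjunction (SAND).
# Leaf names are illustrative and do not reproduce the paper's Duqu 2.0 model.
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    op: str = "LEAF"                          # LEAF, OR, AND, or SAND
    children: list = field(default_factory=list)

    def completion(self, events):
        """Index in `events` (ordered observed leaves) at which this goal is
        first achieved, or None if it is not achieved."""
        if self.op == "LEAF":
            return events.index(self.label) if self.label in events else None
        times = [c.completion(events) for c in self.children]
        if self.op == "OR":
            done = [t for t in times if t is not None]
            return min(done) if done else None
        if None in times:
            return None
        if self.op == "AND":
            return max(times)
        return max(times) if times == sorted(times) else None   # SAND: in order

root = Node("compromise domain", "SAND", [
    Node("spear-phishing foothold"),
    Node("escalate privileges", "OR",
         [Node("kernel exploit"), Node("stolen credentials")]),
    Node("move laterally"),
])

print(root.completion(["spear-phishing foothold", "kernel exploit",
                       "move laterally"]) is not None)   # True: steps in order
print(root.completion(["move laterally", "spear-phishing foothold",
                       "kernel exploit"]) is not None)    # False: out of order
```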
Abstract:
Cryptographic algorithms are designed to be computationally secure; however, it has been shown that when they are implemented in hardware, the devices leak side-channel information that can be used to mount an attack that recovers the secret encryption key. In this paper, an overlapping-window power spectral density (PSD) side-channel attack targeting an FPGA device running the Advanced Encryption Standard is proposed. This improves upon previous research into PSD attacks by reducing the amount of pre-processing (effort) required. It is shown that the proposed overlapping-window method requires less processing effort than a sliding-window approach, whilst overcoming the issues of sampling boundaries. The method is shown to be effective for both aligned and misaligned data sets and is therefore recommended as an improved approach in comparison with existing time-domain correlation attacks.
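The core frequency-domain idea can be illustrated with a generic Welch-style, overlapping-window PSD estimate: spectral magnitudes are largely insensitive to small trace misalignments. The trace parameters below are synthetic, and this is not the paper's attack pipeline.

```python
# A generic sketch of the frequency-domain idea: power spectral density (PSD)
# estimates computed over overlapping windows (Welch's method) are largely
# insensitive to small misalignments of the traces. Parameters are synthetic.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(1)
fs, n = 1_000_000, 4096
t = np.arange(n) / fs
leak = np.sin(2 * np.pi * 50_000 * t)            # synthetic "leaky" component

trace = leak + 0.5 * rng.normal(size=n)
misaligned = np.roll(trace, 137)                 # crude stand-in for trigger jitter

f, psd_a = welch(trace, fs=fs, nperseg=512, noverlap=256)
_, psd_b = welch(misaligned, fs=fs, nperseg=512, noverlap=256)

print("PSD correlation despite misalignment:",
      round(float(np.corrcoef(psd_a, psd_b)[0, 1]), 3))
```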
Abstract:
Side-channel attacks permit the recovery of the secret key held within a cryptographic device. This paper presents a new EM attack in the frequency domain, using a power spectral density analysis that permits the use of variable spectral window widths for each trace of the data set, and demonstrates how this attack can therefore overcome both inter- and intra-round random-insertion countermeasures. We also propose a novel re-alignment method exploiting the minimal power markers exhibited by electromagnetic emanations. The technique can be used for the extraction and re-alignment of round data in the time domain.
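As a loose illustration only (not the paper's re-alignment algorithm), the sketch below aligns traces on a low-power marker, taken here as the first sample where a smoothed trace drops below a threshold. The threshold, smoothing width, and synthetic traces are arbitrary.

```python
# Illustrative sketch only: re-align traces by locating a low-power marker
# (first sample where a smoothed trace drops below a threshold) and shifting
# every trace so its marker lines up with that of a reference trace.
import numpy as np

def first_low_power_marker(trace, threshold, width=16):
    smoothed = np.convolve(np.abs(trace), np.ones(width) / width, mode="same")
    below = np.nonzero(smoothed < threshold)[0]
    return int(below[0]) if below.size else 0

def realign(traces, threshold):
    ref = first_low_power_marker(traces[0], threshold)
    return np.stack([np.roll(tr, ref - first_low_power_marker(tr, threshold))
                     for tr in traces])

# Synthetic demo: identical traces with random offsets, then re-aligned.
rng = np.random.default_rng(2)
base = np.concatenate([rng.normal(1.0, 0.05, 300),   # activity
                       np.full(50, 0.05),            # quiet "marker" region
                       rng.normal(1.0, 0.05, 300)])
traces = np.stack([np.roll(base, rng.integers(-20, 20)) for _ in range(8)])
aligned = realign(traces, threshold=0.2)
print("marker spread after alignment:",
      np.ptp([first_low_power_marker(tr, 0.2) for tr in aligned]))
```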
Abstract:
This is a due date card for the book titled Spring Came on Forever, stamped with Works Progress Administration.
Abstract:
Master's thesis in Conservation Biology, presented to the Universidade de Lisboa through the Faculdade de Ciências, 2016.
Abstract:
Congressional dominance theory holds that not only can the US Congress control the executive, it does. The terrorist attacks on New York and Washington on 11 September 2001 and the Bush administration's ensuing global 'war on terror' suggest a different result. Bush's response to 9/11 signalled not only new directions in US foreign and domestic policy but a new stage in the aggrandisement of presidential power in the United States and a further step in the marginalisation of the Congress. Informed by a constitutional doctrine unknown to the framers of the US Constitution, the Bush administration pursued a presidentialist or 'ultra-separationist' governing strategy that was disrespectful to the legislature's intended role in the separated system. Using its unilateral powers, in public and in secret, claiming 'inherent' authority from the Constitution, and exploiting the public's fear of a further terrorist attack and of endangering the lives of US troops abroad, the administration skilfully drove its legislation through the Congress. Occasionally, the Congress was able to extract concessions - notably in the immediate aftermath of 9/11, when partisan control of the government was split - but more typically, for most of the period, the Congress acquiesced to administration demands, albeit with the consolation of minor concessions. The administration not only dominated the lawmaking process, it also cowed legislators into legitimating often highly controversial (and sometimes illegal) administration-determined definitions of counter-terrorism and national security policy. Certainly, the Congress undertook a considerable amount of oversight during the period of the 'war on terror'; lawmakers also complained. But the effects on policy were marginal. This finding held true for periods of Democratic as well as Republican majorities.
Abstract:
Researchers who want to analyse Health Care data may require large pools of compute and data resources, which means they need access to Distributed Computing Infrastructures (DCIs); using such infrastructures, in turn, requires expertise which researchers may not have. Workflows can hide the infrastructures, but there are many workflow systems and they are not interoperable, and learning a workflow system and creating workflows in it may require significant effort. Given this effort, it is not reasonable to expect researchers to learn a new workflow system whenever they want to run workflows built in another one. As a result, the lack of interoperability prevents workflow sharing, and a vast amount of research effort is wasted. The FP7 Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs (SHIWA) project developed the Coarse-Grained Interoperability (CGI) approach to enable workflow sharing. The project created the SHIWA Simulation Platform (SSP) to support CGI as a production-level service. The paper describes how the CGI approach can be used for analysis and simulation in Health Care.
Abstract:
BACKGROUND: Earlobe crease (ELC) has been associated with cardiovascular diseases (CVD) or risk factors (CVRF) and could be a marker predisposing to CVD. However, most studies have examined only a small number of CVRF, and no complete assessment of the associations between ELC and CVRF has been performed in a single study. METHODS: Population-based study (n = 4635, 46.7 % men) conducted between 2009 and 2012 in Lausanne, Switzerland. RESULTS: Eight hundred six participants (17.4 %) had an ELC. Presence of ELC was associated with male gender and older age. After adjusting for age and gender (and medication whenever necessary), presence of ELC was significantly (p < 0.05) associated with higher levels of body mass index (BMI) [adjusted mean ± standard error: 27.0 ± 0.2 vs. 26.02 ± 0.07 kg/m²], triglycerides [1.40 ± 0.03 vs. 1.36 ± 0.01 mmol/L] and insulin [8.8 ± 0.2 vs. 8.3 ± 0.1 μIU/mL]; lower levels of HDL cholesterol [1.61 ± 0.02 vs. 1.64 ± 0.01 mmol/L]; and higher frequency of abdominal obesity [odds ratio and (95 % confidence interval) 1.20 (1.02; 1.42)], hypertension [1.41 (1.18; 1.67)], diabetes [1.43 (1.15; 1.79)], high HOMA-IR [1.19 (1.00; 1.42)], metabolic syndrome [1.28 (1.08; 1.51)] and history of CVD [1.55 (1.21; 1.98)]. No associations were found between ELC and estimated cardiovascular risk, inflammatory or liver markers. After further adjustment for BMI, only the associations between ELC and hypertension [1.30 (1.08; 1.56)] and history of CVD [1.47 (1.14; 1.89)] remained significant. For history of CVD, further adjustment for diabetes, hypertension, total cholesterol and smoking led to similar results [1.36 (1.05; 1.77)]. CONCLUSION: In this community-based sample, ELC was significantly and independently associated with hypertension and history of CVD.