920 results for chosen-plaintext attack


Relevance: 20.00%

Abstract:

Background: Good blood pressure (BP) control reduces the risk of recurrence of stroke/transient ischaemic attack (TIA). Although there is strong evidence that BP telemonitoring helps achieve good control, none of the major trials have considered its effectiveness in stroke/TIA survivors. We therefore conducted a feasibility study for a trial of BP telemonitoring for stroke/TIA survivors with uncontrolled BP in primary care.
Method: Phase 1 was a pilot trial involving 55 patients stratified by stroke/TIA, randomised 3:1 to BP telemonitoring for 6 months or usual care. Phase 2 was a qualitative evaluation comprising semi-structured interviews with 16 trial participants who received telemonitoring and 3 focus groups with 23 members of stroke support groups and 7 carers.
Results: Overall, 125 patients (60 stroke patients, 65 TIA patients) were approached and 55 (44%) were randomised, including 27 stroke patients and 28 TIA patients. Fifty-two participants (95%) attended the 6-month follow-up appointment, but one declined the second daytime ambulatory blood pressure monitoring (ABPM) measurement, resulting in a 93% completion rate for ABPM, the proposed primary outcome measure for a full trial. Adherence to telemonitoring was good; of the 40 participants in the telemonitoring group, 38 continued to provide readings throughout the 6 months. There was a mean reduction of 10.1 mmHg in systolic ABPM in the telemonitoring group compared with 3.8 mmHg in the control group, suggesting the potential for a substantial effect from telemonitoring. Our qualitative analysis found that many stroke patients were concerned about their BP; telemonitoring increased their engagement and was easy, convenient and reassuring.
Conclusions: A full-scale trial is feasible and is likely to recruit well, with good rates of compliance and follow-up.

Relevance: 20.00%

Abstract:

Web threats are becoming a major issue for both governments and companies. Web threats increased by as much as 600% during the last year (WebSense, 2013). This is a significant issue, since many major businesses deliver their services over the web. Denial of Service (DoS) attacks are among the most significant web threats and generally aim to exhaust the resources of the target machine (Mirkovic & Reiher, 2004). Distributed Denial of Service (DDoS) attacks are typically executed from many sources and can result in large traffic flows. During the last year, 11% of DDoS attacks exceeded 60 Gbps (Prolexic, 2013a). DDoS attacks are usually launched from large botnets, which are networks of remotely controlled computers. There is an increasing effort by governments and companies to shut down botnets (Dittrich, 2012), which has led attackers to look for alternative DDoS attack methods. One of the techniques to which attackers are returning is the DDoS amplification attack. Amplification attacks use intermediate devices called amplifiers to amplify the attacker's traffic. This work outlines an evaluation tool and evaluates an amplification attack based on the Trivial File Transfer Protocol (TFTP). This attack could have an amplification factor of approximately 60, which rates highly alongside other researched amplification attacks. This could be a substantial issue globally, because the protocol is exposed by approximately 599,600 publicly open TFTP servers. Mitigation methods for this threat have also been considered and a variety of countermeasures are proposed. The effects of this attack on both amplifier and target were analysed using the proposed metrics. While it has been reported that abuse of TFTP in this way would be possible (Schultz, 2013), this paper provides a complete methodology for setting up the attack and verifying it.
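
As a rough illustration of the amplification-factor notion used in this abstract (the byte counts below are placeholders, not measurements from the paper's evaluation tool), a bandwidth amplification factor is simply the ratio of traffic reflected to the victim versus traffic sent by the attacker:

```python
def amplification_factor(request_bytes: int, response_bytes: int) -> float:
    """Bandwidth amplification factor: bytes delivered to the victim
    per byte the attacker sends to the amplifier."""
    return response_bytes / request_bytes

# Hypothetical numbers for illustration only: a small spoofed request
# eliciting a much larger reflected response.
request = 25            # bytes in the spoofed request (placeholder)
response = 25 * 60      # bytes reflected to the victim (placeholder)

print(f"amplification factor ≈ {amplification_factor(request, response):.1f}")
```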

Relevance: 20.00%

Abstract:

Memorial Sermon preached in memory of the Rev. Walter Gardner Webster

Relevance: 20.00%

Abstract:

The exploding demand for services like the World Wide Web reflects the potential presented by globally distributed information systems. The number of WWW servers world-wide has doubled every 3 to 5 months since 1993, outstripping even the growth of the Internet. At each of these self-managed sites, the Common Gateway Interface (CGI) and Hypertext Transfer Protocol (HTTP) already constitute a rudimentary basis for contributing local resources to remote collaborations. However, the Web has serious deficiencies that make it unsuited for use as a true medium for metacomputing --- the process of bringing hardware, software, and expertise from many geographically dispersed sources to bear on large-scale problems. These deficiencies are, paradoxically, the direct result of the very simple design principles that enabled its exponential growth. There are many symptoms of the problems exhibited by the Web: disk and network resources are consumed extravagantly; information search and discovery are difficult; protocols are aimed at data movement rather than task migration, and ignore the potential for distributing computation. However, all of these can be seen as aspects of a single problem: as a distributed system for metacomputing, the Web offers unpredictable performance and unreliable results. The goal of our project is to use the Web as a medium (within either the global Internet or an enterprise intranet) for metacomputing in a reliable way with performance guarantees. We attack this problem on four levels:
(1) Resource Management Services: Globally distributed computing allows novel approaches to the old problems of performance guarantees and reliability. Our first set of ideas involves setting up a family of real-time resource management models organized by the Web Computing Framework with a standard Resource Management Interface (RMI), a Resource Registry, a Task Registry, and resource management protocols that allow resource needs and availability information to be collected and disseminated, so that a family of algorithms with varying computational precision and accuracy of representation can be chosen to meet real-time and reliability constraints.
(2) Middleware Services: Complementary to techniques for allocating and scheduling available resources to serve application needs under real-time and reliability constraints, the second set of ideas aims at reducing communication latency, traffic congestion, server workload, etc. We develop customizable middleware services that exploit application characteristics in traffic analysis to drive new server/browser design strategies (e.g., exploiting the self-similarity of Web traffic), derive document access patterns via multiserver cooperation, and use them in speculative prefetching, document caching, and aggressive replication to reduce server load and bandwidth requirements.
(3) Communication Infrastructure: To achieve any guarantee of quality of service or performance, one must get at the network layer that can provide the basic guarantees of bandwidth, latency, and reliability. Therefore, the third area is a set of new techniques in network service and protocol designs.
(4) Object-Oriented Web Computing Framework: A useful resource management system must deal with job priority, fault tolerance, quality of service, complex resources such as ATM channels, probabilistic models, etc., and models must be tailored to represent the best tradeoff for a particular setting. This requires a family of models, organized within an object-oriented framework, because no one-size-fits-all approach is appropriate. This presents a software engineering challenge requiring integration of solutions at all levels: algorithms, models, protocols, and profiling and monitoring tools. The framework captures the abstract class interfaces of the collection of cooperating components, but allows the concretization of each component to be driven by the requirements of a specific approach and environment.
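
Purely as an illustrative sketch of the abstract-interface-plus-registry structure such an object-oriented framework might take (none of these class or method names come from the abstract; they are hypothetical), the fragment below shows a family of resource-management models registered behind a common interface so that a model can be chosen per setting:

```python
from abc import ABC, abstractmethod

class ResourceManager(ABC):
    """Abstract interface that a concrete resource-management model implements.
    Class and method names are hypothetical, purely for illustration."""

    @abstractmethod
    def admit(self, task: str, deadline_ms: int) -> bool:
        """Decide whether a task with the given deadline can be accepted."""

class ResourceRegistry:
    """Maps model names to concrete ResourceManager implementations, so a model
    matching a setting's real-time and reliability needs can be selected."""

    def __init__(self):
        self._models = {}  # name -> ResourceManager

    def register(self, name, model):
        self._models[name] = model

    def choose(self, name):
        return self._models[name]

class BestEffortManager(ResourceManager):
    """Trivial concrete model: accepts every task (placeholder policy)."""

    def admit(self, task, deadline_ms):
        return True

registry = ResourceRegistry()
registry.register("best-effort", BestEffortManager())
print(registry.choose("best-effort").admit("render-page", deadline_ms=200))
```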

Relevance: 20.00%

Abstract:

The research described in this thesis focuses on the design and synthesis of stable α-diazosulfoxides and investigation of their reactivity under a variety of conditions (transition-metal catalysis, thermal, photochemical and microwave), with a particular emphasis on the synthesis of novel heterocyclic compounds with potential biological activity. The exclusive reaction pathway for these α-diazosulfoxides was found to be hetero-Wolff rearrangement to give α-oxosulfine intermediates. In the first chapter, a literature review of sulfines is presented, including a discussion of naturally occurring sulfines and an overview of the synthesis and reactivity of sulfines. The potential of sulfines in organic synthesis and recent developments in particular are highlighted. The second chapter discusses the synthesis and reactivity of α-diazosulfoxides, building on earlier results in this research group. The synthesis of lactone-based α-diazosulfoxides and, for the first time, ketone-based benzofused and monocyclic α-diazosulfoxides is described. The reactivity of these α-diazosulfoxides is then explored under a variety of conditions, such as transition-metal catalysis, photochemical and microwave conditions, generating labile α-oxosulfine intermediates, which are trapped using amines and dienes, in addition to the spontaneous reaction pathways which occur with α-oxosulfines in the absence of a trap. A new reaction pathway was explored with the lactone-based α-oxosulfines, involving reaction with amines to generate novel 3-aminofuran-2(5H)-ones via carbophilic attack, in very good yields. The reactivity of ketone-based α-diazosulfoxides was explored for the first time, and once again, pseudo-Wolff rearrangement to the α-oxosulfines was the exclusive reaction pathway observed. The intermediacy of the α-oxosulfines was confirmed by trapping as cycloadducts, with the stereochemical features dependent on the reaction conditions. In the absence of a diene trap, a number of reaction fates from the α-oxosulfines were observed, including complete sulfinyl extrusion to give indanones, sulfur extrusion to give indanediones, and, to a lesser extent, dimerisation. The indanediones were trapped as quinoxalines to enable full characterisation. One of the overriding outcomes of this thesis was the provision of new insights into the behaviour of α-oxosulfines with different transition metal catalysts, and under thermal, microwave and photolysis conditions. A series of 3-aminofuran-2(5H)-ones and benzofused dihydro-2H-thiopyran S-oxides were submitted for anticancer screening at the U.S. National Cancer Institute. A number of these derivatives were identified as hit compounds, with excellent cell growth inhibition. One 3-aminofuran-2(5H)-one derivative has been chosen for further screening. The third chapter details the full experimental procedures, including spectroscopic and analytical data for the compounds prepared during this research. The data for the crystal structures are contained on the attached CD.

Relevance: 20.00%

Abstract:

Traditionally, attacks on cryptographic algorithms looked for mathematical weaknesses in the underlying structure of a cipher. Side-channel attacks, however, look to extract secret key information based on the leakage from the device on which the cipher is implemented, be it smart-card, microprocessor, dedicated hardware or personal computer. Attacks based on power consumption, electromagnetic emanations and execution time have all been practically demonstrated on a range of devices to reveal partial secret-key information from which the full key can be reconstructed. The focus of this thesis is power analysis, more specifically a class of attacks known as profiling attacks. These attacks assume a potential attacker has access to, or can control, an identical device to that which is under attack, which allows him to profile the power consumption of operations or data flow during encryption. This assumes a stronger adversary than traditional non-profiling attacks such as differential or correlation power analysis; however, the ability to model a device allows templates to be used post-profiling to extract key information from many different target devices using the power consumption of very few encryptions. This allows an adversary to overcome protocols intended to prevent secret key recovery by restricting the number of available traces. In this thesis a detailed investigation of template attacks is conducted, examining how the selection of various attack parameters practically affects the efficiency of secret key recovery, and testing the underlying assumption of profiling attacks that the power consumption of one device can be used to extract secret keys from another. Trace-only attacks, where the corresponding plaintext or ciphertext data is unavailable, are then investigated against both symmetric and asymmetric algorithms with the goal of key recovery from a single trace. This allows an adversary to bypass many of the currently proposed countermeasures, particularly in the asymmetric domain. An investigation into machine-learning methods for side-channel analysis as an alternative to template or stochastic methods is also conducted, with support vector machines, logistic regression and neural networks investigated from a side-channel viewpoint. Both binary and multi-class classification attack scenarios are examined in order to explore the relative strengths of each algorithm. Finally, these machine-learning-based alternatives are empirically compared with template attacks, with their respective merits examined with regard to attack efficiency.
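
As a hedged illustration of the machine-learning framing mentioned above (the traces, leakage model and labels below are synthetic placeholders, not data or methods from the thesis), a profiling-style attack can be posed as supervised classification of power traces by a key-dependent intermediate value, here its Hamming weight:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "profiling device" traces: 200 time samples per trace, labelled by
# a hypothetical key-dependent intermediate byte's Hamming weight (0..8).
n_traces, n_samples = 2000, 200
labels = rng.integers(0, 9, size=n_traces)
traces = rng.normal(0.0, 1.0, size=(n_traces, n_samples))
traces[:, 100] += labels * 0.5   # leakage injected at one point in time

X_train, X_test, y_train, y_test = train_test_split(
    traces, labels, test_size=0.25, random_state=0)

# Multi-class SVM as the profiling model; a template, logistic-regression or
# neural-network classifier could be swapped in at this step.
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
print("classification accuracy on held-out traces:", clf.score(X_test, y_test))
```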

Relevance: 20.00%

Abstract:

Our research was conducted to improve the timeliness, coordination, and communication during the detection, investigation and decision-making phases of the response to an aerosolized anthrax attack in the metropolitan Washington, DC, area, with the goal of reducing casualties. We gathered information on the current response protocols through an extensive literature review and interviews with relevant officials and experts in order to identify potential problems in the various steps of detection, investigation, and response. Interviewing officials from private- and government-sector agencies allowed the development of a set of models of interactions and a communication network to identify discrepancies and redundancies that would lengthen the delay in initiating a public health response. In addition, we created a computer simulation that models aerosol spread using weather patterns and population density to estimate the infected population within a target region, depending on the virulence and dimensions of the weaponized spores. We developed conceptual models in order to design recommendations to be presented to our collaborating contacts and agencies, which would use such policy and analysis interventions to improve the overall response to an aerosolized anthrax attack, primarily through changes to emergency protocol functions and suggestions for technological detection and monitoring.
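
As a deliberately toy sketch of the kind of population-weighted exposure estimate such a simulation produces (the abstract does not describe its model; the grid, decay law and dose-response parameters below are invented placeholders and do not reflect the project's simulation or any real data):

```python
import numpy as np

# Toy grid: people per cell and a crude "dose" field that decays with
# distance from a hypothetical release point at cell (0, 0).
grid = 50
people = np.full((grid, grid), 400.0)      # placeholder population per cell
y, x = np.mgrid[0:grid, 0:grid]
distance = np.hypot(x, y) + 1.0
dose = 5e4 / distance**2                   # arbitrary decay, not a real plume model

# Simple exponential dose-response curve (parameter is a placeholder).
p_infection = 1.0 - np.exp(-dose / 1e4)
expected_infected = float((people * p_infection).sum())
print(f"estimated infected in target region: {expected_infected:,.0f}")
```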

Relevance: 20.00%

Abstract:

Background: Transient ischemic attack (TIA) is a condition causing focal neurological deficits lasting less than 24 hours. TIA patients present similarly to patients with other conditions that cause rapid-onset neurological symptoms, such as migraine. The accurate diagnosis of TIA is critical because it serves as a warning for subsequent stroke. Furthermore, cognitive deficits associated with TIA may predict the development of dementia. Therefore, characterizing the cognitive symptoms of TIA patients and discriminating these patients from those with similar symptoms is important for proper diagnosis and treatment. Currently the diagnosis of TIA is made on clinical and radiographic evidence. Robotic assessment, with instruments such as the KINARM, may improve the identification of cognitive impairment in TIA patients.
Methods: In this prospective cohort study, two KINARM tests, the trail making task (TMT) and the spatial span task (SST), were used to detect cognitive deficits. Two study groups were formed. The TIA group was tested at 5 time points over the span of a year. The migraine active control group had one initial visit and another a year later. Both groups were compared with a normative database of approximately 400 healthy volunteers. From this database, age- and sex-matched normative data were used to calculate Z-scores for the TMT. The Montreal Cognitive Assessment (MoCA) was also administered to both groups.
Results: 31 participants were recruited: 20 in the TIA group and 11 active controls (mean ± SD age = 66 ± 11.3 and 62 ± 14.5 years, respectively). There was no significant difference between TIA and active control group MoCA scores. The TMT was able to detect cognitive impairment in both the TIA and migraine groups. Both KINARM tasks could also detect significant differences in performance between TIA and migraine patients, while the MoCA could not. Changes in TIA and migraine performance on the MoCA, TMT, and SST were observed across visits.
Conclusions: The robotic KINARM exoskeleton can be used to assess cognitive deficits in TIA patients.
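
As a minimal sketch of the normative-comparison step described above (the patient value and the normative mean/SD below are invented placeholders, not data from the study or its normative database), a Z-score for a trail making task time against age- and sex-matched norms could be computed as:

```python
def z_score(value: float, norm_mean: float, norm_sd: float) -> float:
    """Standardise a score against an age- and sex-matched normative mean/SD."""
    return (value - norm_mean) / norm_sd

# Hypothetical example: a patient's TMT completion time (seconds) compared
# with placeholder normative statistics for their age/sex stratum.
patient_tmt_time = 48.0
norm_mean, norm_sd = 35.0, 8.0
print(f"TMT Z-score: {z_score(patient_tmt_time, norm_mean, norm_sd):+.2f}")
```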