893 results for DDoS attack
Abstract:
Faculty of History: Institute of History
Abstract:
Postgraduate project/dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Pharmaceutical Sciences.
Abstract:
In this paper, we expose an unorthodox adversarial attack that exploits the transients of a system's adaptive behavior, as opposed to its limited steady-state capacity. We show that a well-orchestrated attack could introduce significant inefficiencies that could potentially deprive a network element of much of its capacity, or significantly reduce its service quality, while evading detection by consuming an unsuspiciously small fraction of that element's hijacked capacity. This type of attack stands in sharp contrast to traditional brute-force, sustained high-rate DoS attacks, as well as to recently proposed attacks that exploit specific protocol settings such as TCP timeouts. We exemplify what we term Reduction of Quality (RoQ) attacks by exposing the vulnerabilities of common adaptation mechanisms. We develop control-theoretic models and associated metrics to quantify these vulnerabilities. We present numerical and simulation results, which we validate with observations from real Internet experiments. Our findings motivate the need for the development of adaptation mechanisms that are resilient to these new forms of attacks.
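To make the transient-exploitation idea concrete, the fragment below is a minimal sketch, not the paper's control-theoretic model: an AIMD-style controller converges toward link capacity, and short periodic bursts force it into repeated back-off. All parameter values are hypothetical.

```python
# Minimal sketch of a RoQ-style attack on an adaptive controller.
# An AIMD-like rate controller converges toward capacity; a short
# periodic burst triggers multiplicative back-off, so a tiny average
# attack rate buys a large loss of delivered capacity.
CAPACITY = 100.0        # service units per time slot (hypothetical)
ALPHA, BETA = 1.0, 0.5  # additive increase, multiplicative decrease
T_ATTACK = 50           # burst period in slots (hypothetical)
BURST = 30.0            # attack load per burst slot (hypothetical)

rate, goodput, attack_cost = CAPACITY, 0.0, 0.0
for t in range(10_000):
    burst = BURST if t % T_ATTACK == 0 else 0.0
    if rate + burst > CAPACITY:          # transient overload: back off
        rate *= BETA
    else:                                # steady adaptation toward capacity
        rate = min(rate + ALPHA, CAPACITY)
    goodput += rate
    attack_cost += burst

loss = CAPACITY * 10_000 - goodput
print(f"capacity lost: {loss:.0f}, attack traffic: {attack_cost:.0f}, "
      f"damage per unit of attack traffic: {loss / attack_cost:.1f}")
```

The ratio printed at the end is in the spirit of the paper's potency metric: damage inflicted per unit of attack cost, which stays high even though the attacker's average rate is a small fraction of capacity.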
Abstract:
The exploding demand for services like the World Wide Web reflects the potential that is presented by globally distributed information systems. The number of WWW servers world-wide has doubled every 3 to 5 months since 1993, outstripping even the growth of the Internet. At each of these self-managed sites, the Common Gateway Interface (CGI) and Hypertext Transfer Protocol (HTTP) already constitute a rudimentary basis for contributing local resources to remote collaborations. However, the Web has serious deficiencies that make it unsuited for use as a true medium for metacomputing --- the process of bringing hardware, software, and expertise from many geographically dispersed sources to bear on large-scale problems. These deficiencies are, paradoxically, the direct result of the very simple design principles that enabled its exponential growth. There are many symptoms of the problems exhibited by the Web: disk and network resources are consumed extravagantly; information search and discovery are difficult; protocols are aimed at data movement rather than task migration, and ignore the potential for distributing computation. However, all of these can be seen as aspects of a single problem: as a distributed system for metacomputing, the Web offers unpredictable performance and unreliable results. The goal of our project is to use the Web as a medium (within either the global Internet or an enterprise intranet) for metacomputing in a reliable way with performance guarantees. We attack this problem on four levels: (1) Resource Management Services: Globally distributed computing allows novel approaches to the old problems of performance guarantees and reliability. Our first set of ideas involves setting up a family of real-time resource management models organized by the Web Computing Framework with a standard Resource Management Interface (RMI), a Resource Registry, a Task Registry, and resource management protocols that allow resource needs and availability information to be collected and disseminated, so that a family of algorithms with varying computational precision and accuracy of representations can be chosen to meet real-time and reliability constraints. (2) Middleware Services: Complementary to techniques for allocating and scheduling available resources to serve application needs under real-time and reliability constraints, the second set of ideas aims at reducing communication latency, traffic congestion, server workload, and the like. We develop customizable middleware services that exploit application characteristics in traffic analysis to drive new server/browser design strategies (e.g., exploiting the self-similarity of Web traffic), derive document access patterns via multiserver cooperation, and use them in speculative prefetching, document caching, and aggressive replication to reduce server load and bandwidth requirements. (3) Communication Infrastructure: To achieve any guarantee of quality of service or performance, one must work at the network layer, which provides the basic guarantees of bandwidth, latency, and reliability. The third area is therefore a set of new techniques in network service and protocol design. (4) Object-Oriented Web Computing Framework: A useful resource management system must deal with job priority, fault-tolerance, quality of service, complex resources such as ATM channels, probabilistic models, etc., and models must be tailored to represent the best tradeoff for a particular setting.
This requires a family of models, organized within an object-oriented framework, because no one-size-fits-all approach is appropriate. This presents a software engineering challenge requiring integration of solutions at all levels: algorithms, models, protocols, and profiling and monitoring tools. The framework captures the abstract class interfaces of the collection of cooperating components, but allows the concretization of each component to be driven by the requirements of a specific approach and environment.
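As a sketch of how such an object-oriented framework might be organized, the fragment below defines abstract interfaces for components the abstract names (Resource Management Interface, Resource Registry, tasks) plus one concrete model. The class shapes, fields, and method signatures are assumptions of my own, not the project's actual design.

```python
# Hypothetical sketch of the framework's abstract-interface idea:
# concrete resource-management models plug in behind a shared
# interface, so precision/cost tradeoffs can vary per setting.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    bandwidth: float      # e.g., Mb/s (illustrative attribute)
    reliability: float    # e.g., probability of availability

@dataclass
class Task:
    name: str
    demand: float         # bandwidth the task needs

class ResourceRegistry:
    """Collects and disseminates resource availability information."""
    def __init__(self) -> None:
        self._resources: list[Resource] = []
    def register(self, r: Resource) -> None:
        self._resources.append(r)
    def available(self, demand: float) -> list[Resource]:
        return [r for r in self._resources if r.bandwidth >= demand]

class ResourceManagementInterface(ABC):
    """Abstract RMI: each concrete model trades precision for cost."""
    @abstractmethod
    def admit(self, task: Task, deadline: float) -> bool: ...

class GreedyRealTimeModel(ResourceManagementInterface):
    """One concrete model: admit a task if any registered resource fits."""
    def __init__(self, registry: ResourceRegistry) -> None:
        self.registry = registry
    def admit(self, task: Task, deadline: float) -> bool:
        return bool(self.registry.available(task.demand))
```

The point of the abstract base class is the one the abstract makes: no single model fits all settings, so each concretization satisfies the same interface while choosing its own representation and algorithm.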
Abstract:
This paper explores the problem of protecting a site on the Internet against hostile external Java applets while allowing trusted internal applets to run. With careful implementation, a site can be made resistant to current Java security weaknesses as well as those yet to be discovered. In addition, we describe a new attack on certain sophisticated firewalls that is most effectively realized as a Java applet.
Abstract:
As new multi-party edge services are deployed on the Internet, application-layer protocols with complex communication models and event dependencies are increasingly being specified and adopted. To ensure that such protocols (and compositions thereof with existing protocols) do not result in undesirable behaviors (e.g., livelocks), there needs to be a methodology for the automated checking of the "safety" of these protocols. In this paper, we present ingredients of such a methodology. Specifically, we show how SPIN, a tool from the formal systems verification community, can be used to quickly identify problematic behaviors of application-layer protocols with non-trivial communication models, such as HTTP with the addition of the "100 Continue" mechanism. As a case study, we examine several versions of the specification for the Continue mechanism; our experiments mechanically uncovered multi-version interoperability problems, including some which motivated revisions of HTTP/1.1 and some which persist even with the current version of the protocol. One such problem resembles a classic degradation-of-service attack, but can arise between well-meaning peers. We also discuss how the methods we employ can be used to make explicit the requirements for hardening a protocol's implementation against potentially malicious peers, and for verifying an implementation's interoperability with the full range of allowable peer behaviors.
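To illustrate the kind of mechanical check described here, the toy search below explores the joint state space of a client that sends "Expect: 100-continue" and a legacy server that ignores the mechanism, and reports the stuck state where each side waits on the other. The peer models are illustrative stand-ins of my own, not the paper's specifications (which are written in Promela for SPIN, not Python).

```python
# A toy, hand-rolled reachability search in the spirit of what SPIN
# automates: enumerate joint states, flag states with no enabled moves.
from collections import deque

def successors(state):
    """Joint moves of an 'Expect: 100-continue' client and a legacy
    server that ignores the mechanism and waits for the body."""
    client, server = state
    succ = []
    if client == "idle":
        succ.append(("waiting_100", server))   # client sends headers + Expect
    if client == "waiting_100" and server == "idle":
        succ.append((client, "waiting_body"))  # server reads the headers
    # A 100-Continue-aware server would emit the interim response here;
    # the legacy model has no such transition, so neither side can move.
    return succ

start = ("idle", "idle")
seen, queue = {start}, deque([start])
while queue:
    s = queue.popleft()
    nxt = successors(s)
    if not nxt:
        print("stuck state (both peers waiting):", s)
    for t in nxt:
        if t not in seen:
            seen.add(t)
            queue.append(t)
```

Running this prints the deadlock `('waiting_100', 'waiting_body')`: the client waits for an interim response the legacy server will never send, while the server waits for a body the client withholds, which is exactly the flavor of interoperability problem the paper uncovers mechanically.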
Abstract:
We consider the problem of building robust fuzzy extractors, which allow two parties holding similar random variables W, W' to agree on a secret key R in the presence of an active adversary. Robust fuzzy extractors were defined by Dodis et al. in Crypto 2006 [6] to be noninteractive, i.e., only one message P, which can be modified by an unbounded adversary, can pass from one party to the other. This allows them to be used by a single party at different points in time (e.g., for key recovery or biometric authentication), but also presents an additional challenge: what if R is used, and thus possibly observed by the adversary, before the adversary has a chance to modify P? Fuzzy extractors secure against such a strong attack are called post-application robust. We construct a fuzzy extractor with post-application robustness that extracts a shared secret key of up to (2m − n)/2 bits (depending on error-tolerance and security parameters), where n is the bit-length and m is the entropy of W. The previously best known result, also of Dodis et al. [6], extracted up to (2m − n)/3 bits (depending on the same parameters).
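To see what the improvement means in concrete terms, here is a worked comparison with illustrative parameters of my own choosing (not from the paper), ignoring the error-tolerance and security-parameter deductions that both bounds share:

```latex
% Illustrative parameters: an n-bit source W with entropy m.
\[
  n = 1000, \quad m = 700:
  \qquad
  \frac{2m - n}{2} = \frac{400}{2} = 200 \text{ bits (this work)},
  \qquad
  \frac{2m - n}{3} = \frac{400}{3} \approx 133 \text{ bits (Dodis et al.\ [6])}.
\]
```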
Abstract:
We propose an economic mechanism to reduce the incidence of malware that delivers spam. Earlier research proposed attention markets as a solution for unwanted messages and showed they could provide more net benefit than alternatives such as filtering and taxes. Because it uses a currency system, Attention Bonds faces a challenge: zombies, botnets, and various forms of malware might steal valuable currency instead of merely stealing unused CPU cycles. We resolve this problem by taking advantage of the fact that the spam-bot problem has thereby been reduced to financial fraud, so the large body of existing work in that realm can be brought to bear. By drawing an analogy between sending and spending, we show how a market mechanism can detect and prevent spam malware. We prove that, by using a currency, (i) each instance of spam increases the probability of detecting infections, and (ii) the value of eradicating infections can justify insuring users against fraud. This approach attacks spam at the source, a virtue missing from filters that attack spam at the destination. Additionally, the exchange of currency provides signals of interest that can improve the targeting of ads. ISPs benefit from data management services, and consumers benefit from the higher average value of the messages they receive. We explore these and other secondary effects of attention markets and find them to offer, on the whole, attractive economic benefits for all involved, including consumers, advertisers, and ISPs.
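Claim (i) can be illustrated numerically under an independence assumption that is mine, not the paper's: if each currency-spending spam message is independently flagged as fraudulent with probability p, then the probability an infection is detected grows with every message the malware sends.

```python
# Illustrative only: detection probability after k spam messages,
# assuming each message is independently flagged with probability p.
p = 0.02                      # hypothetical per-message flag rate
for k in (1, 10, 50, 200):    # messages sent by an infected host
    detect = 1 - (1 - p) ** k
    print(f"after {k:>3} spam messages: P(detected) = {detect:.2f}")
```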
Abstract:
Recent research has exposed new breeds of attacks that are capable of denying service or inflicting significant damage on TCP flows without sustained attack traffic. Such attacks are often referred to as "low-rate" attacks, and they stand in sharp contrast to traditional Denial of Service (DoS) attacks that completely shut off TCP flows by flooding an Internet link. In this paper, we study the impact of these new breeds of attacks and the extent to which defense mechanisms are capable of mitigating their impact. By adopting a simple discrete-time model with a single TCP flow and a non-oblivious adversary, we expose new variants of these low-rate attacks that could potentially have high attack potency per attack burst. Our analysis focuses on worst-case scenarios, so our results should be regarded as upper bounds on the impact of low-rate attacks rather than as an assessment under a specific attack scenario.
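A minimal discrete-time sketch in the spirit of the model described here (all parameters hypothetical, and much cruder than the paper's): a periodic burst fills the bottleneck queue and forces the single TCP flow into timeout, so an attacker active in a small fraction of slots erases most of the flow's throughput.

```python
# Toy low-rate attack on a single TCP flow, one slot per RTT.
RTO = 4        # timeout penalty, in slots (hypothetical)
PERIOD = 20    # slots between attack bursts (hypothetical)
CMAX = 50.0    # window cap from the receiver/bottleneck (hypothetical)
SLOTS = 10_000

cwnd, goodput, frozen, burst_slots = 1.0, 0.0, 0, 0
for t in range(SLOTS):
    if frozen:                   # flow sits in timeout: no progress
        frozen -= 1
        continue
    if t % PERIOD == 0:          # burst causes loss of a whole window
        burst_slots += 1
        cwnd, frozen = 1.0, RTO
    else:
        goodput += cwnd
        cwnd = min(cwnd + 1.0, CMAX)   # congestion avoidance, +1 pkt/RTT

print(f"goodput under attack: {goodput:.0f} packets")
print(f"attacker active in {burst_slots / SLOTS:.1%} of slots")
```

With these numbers the flow never escapes slow growth between bursts even though the attacker is active in only 5% of slots, which is the qualitative point: damage per burst, not sustained rate, is what matters.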
Abstract:
The lives of Thomas and Anna Haslam were dedicated to the attainment of women's equality. They were feminists before the word was coined. In an era when respectable women were not supposed to know of the existence of prostitutes, Anna became empowered to do the unthinkable: not only to speak in public but to discuss sexual matters openly and to attack the double standard of sexuality which was enshrined in the official treatment of prostitutes. Their life-long commitment to the cause of women's suffrage never faltered, despite the repeated discouragement of bills defeated in the House of Commons. The Haslams represented an Ireland which did not survive them. While they were dedicated to the union with Westminster, they worked happily with those who applied themselves to its destruction. Although in many ways they exemplified the virtues of their Quaker backgrounds, they did not subscribe to any organised religion. Despite living in straitened circumstances, they were part of an urban intellectual elite and participated in the social and cultural life of Dublin for over fifty years. It is tempting to speculate how the Haslams would have fared in post-independence Ireland. Hanna Sheehy Skeffington, who had impeccable nationalist credentials, was effectively marginalised. It is likely that they would have protested against discriminatory legislation in their usual law-abiding manner but, in a country which quickly developed an overwhelmingly Roman Catholic ethos, would they have had a voice or a constituency? Ironically, Thomas's teaching on chastity would have found favour with the hierarchy; his message was disseminated in a simpler and more pious manner in numerous Catholic Truth Society pamphlets. The Protestant minority never sought to subvert the institutions of the state, was careful not to criticise and kept its collective head down. Dáil Éireann was not bombarded with petitions for the restoration of divorce facilities or the unbanning of birth control. Those who sought such amenities obtained them quietly 'in another jurisdiction.' Fifty years were to pass before the condom-wielding 'comely maidens' erupted onto the front pages of the Sunday papers. They were, one imagines, the spiritual descendants of the militant rather than the constitutional suffrage movement. "Once and for all we need to commit ourselves to the concept that women's rights are not factional or sectional privileges, bestowed on the few at the whim of the many. They are human rights. In a society in which the rights and potential of women are constrained no man can be truly free." These words, spoken by Mary Robinson as President of Ireland, are an echo of the principles to which the Haslams dedicated their lives and are, perhaps, a tribute to their efforts.
Abstract:
The research described in this thesis focuses on the design and synthesis of stable α-diazosulfoxides and the investigation of their reactivity under a variety of conditions (transition-metal catalysis, thermal, photochemical and microwave), with a particular emphasis on the synthesis of novel heterocyclic compounds with potential biological activity. The exclusive reaction pathway for these α-diazosulfoxides was found to be hetero-Wolff rearrangement to give α-oxosulfine intermediates. In the first chapter, a literature review of sulfines is presented, including a discussion of naturally occurring sulfines and an overview of the synthesis and reactivity of sulfines. The potential of sulfines in organic synthesis, and recent developments in particular, are highlighted. The second chapter discusses the synthesis and reactivity of α-diazosulfoxides, building on earlier results in this research group. The synthesis of lactone-based α-diazosulfoxides and, for the first time, ketone-based benzofused and monocyclic α-diazosulfoxides is described. The reactivity of these α-diazosulfoxides is then explored under a variety of conditions, such as transition-metal catalysis, photochemical and microwave conditions, generating labile α-oxosulfine intermediates, which are trapped using amines and dienes; the spontaneous reaction pathways which occur with α-oxosulfines in the absence of a trap are also examined. A new reaction pathway was explored with the lactone-based α-oxosulfines, involving reaction with amines to generate novel 3-aminofuran-2(5H)-ones via carbophilic attack, in very good yields. The reactivity of ketone-based α-diazosulfoxides was explored for the first time and, once again, pseudo-Wolff rearrangement to the α-oxosulfines was the exclusive reaction pathway observed. The intermediacy of the α-oxosulfines was confirmed by trapping as cycloadducts, with the stereochemical features dependent on the reaction conditions. In the absence of a diene trap, a number of reaction fates of the α-oxosulfines were observed, including complete sulfinyl extrusion to give indanones, sulfur extrusion to give indanediones and, to a lesser extent, dimerisation. The indanediones were trapped as quinoxalines to enable full characterisation. One of the overriding outcomes of this thesis was the provision of new insights into the behaviour of α-oxosulfines with different transition-metal catalysts, and under thermal, microwave and photolysis conditions. A series of 3-aminofuran-2(5H)-ones and benzofused dihydro-2H-thiopyran S-oxides were submitted for anticancer screening at the U.S. National Cancer Institute. A number of these derivatives were identified as hit compounds, with excellent cell growth inhibition, and one 3-aminofuran-2(5H)-one derivative has been chosen for further screening. The third chapter details the full experimental procedures, including spectroscopic and analytical data for the compounds prepared during this research. The data for the crystal structures are contained on the attached CD.
Abstract:
The gastro-intestinal (GI) tract is a unique region of the body. Our innate immune system maintains a fine homeostatic balance, avoiding inappropriate inflammatory responses against the myriad commensal microbes residing in the gut while remaining active enough to prevent invasive pathogenic attack. The intestinal epithelium represents the frontline of this interface. It has long been known to act as a physical barrier preventing the lumenal bacteria of the gastro-intestinal tract from activating an inflammatory immune response in the immune cells of the underlying mucosa. In recent years, however, an appreciation has grown of the role played by the intestinal epithelium in regulating innate immune responses, both in the prevention of infection and in maintaining a homeostatic environment through modulation of innate immune signalling systems. The aim of this thesis was to identify novel innate immune mechanisms regulating inflammation in the GI tract. To achieve this aim, we examined several aspects of the regulatory mechanisms utilised in this region by the innate immune system. We identified several commensal strains of bacteria expressing proteins containing signalling domains used by Pattern Recognition Receptors (PRRs) of the innate immune system. Three such bacterial proteins were studied for their potentially subversive roles in host innate immune signalling as a means of regulating homeostasis in the GI tract. We also examined differential responses to PRR activation depending on their sub-cellular localisation. This was investigated based on reports that apical Toll-Like Receptor (TLR) 9 activation abrogates inflammatory responses mediated by other TLRs in Intestinal Epithelial Cells (IECs), such as basolateral TLR4 activation. Using the well-studied invasive intracellular pathogen Listeria monocytogenes as a model for infection, we also used a PRR siRNA library screen to identify novel PRRs used by IECs in both the inhibition and activation of inflammatory responses. Many of the PRRs identified in this screen were previously believed not to be expressed in IECs. Furthermore, the same study led to the identification of the previously uncharacterised TLR10 as a functional inflammatory receptor of IECs. Further analysis revealed a similar role in macrophages, where TLR10 was shown to respond to intracellular and motile pathogens such as Gram-positive L. monocytogenes and Gram-negative Salmonella typhimurium. TLR10 expression in IECs was predominantly intracellular, likely to avoid inappropriate inflammatory activation through the recognition of commensal microbial antigens on the apical cell surface of IECs. Moreover, these results reveal a more complex network of innate immune signalling mechanisms involved in both activating and inhibiting inflammatory responses in IECs than was previously believed. This contribution to our understanding of innate immune regulation in this region has several direct and indirect benefits. The novel PRRs identified here as activators or inhibitors of inflammation in the GI tract may serve as therapeutic targets in the treatment of disease, both for inducing tolerance and reducing inflammation, and as targets for adjuvant activation in the development of oral vaccines against pathogenic attack.
Abstract:
Traditionally, attacks on cryptographic algorithms looked for mathematical weaknesses in the underlying structure of a cipher. Side-channel attacks, however, look to extract secret-key information from the leakage of the device on which the cipher is implemented, be it smart card, microprocessor, dedicated hardware or personal computer. Attacks based on power consumption, electromagnetic emanations and execution time have all been practically demonstrated on a range of devices to reveal partial secret-key information from which the full key can be reconstructed. The focus of this thesis is power analysis, more specifically a class of attacks known as profiling attacks. These attacks assume a potential attacker has access to, or can control, a device identical to the one under attack, which allows him to profile the power consumption of operations or data flow during encryption. This assumes a stronger adversary than traditional non-profiling attacks such as differential or correlation power analysis; however, the ability to model a device allows templates to be used post-profiling to extract key information from many different target devices using the power consumption of very few encryptions. This allows an adversary to overcome protocols intended to prevent secret-key recovery by restricting the number of available traces. In this thesis a detailed investigation of template attacks is conducted, examining how the selection of various attack parameters affects the efficiency of secret-key recovery in practice, as well as the underlying assumption of profiling attacks: that the power consumption of one device can be used to extract secret keys from another. Trace-only attacks, where the corresponding plaintext or ciphertext data is unavailable, are then investigated against both symmetric and asymmetric algorithms with the goal of key recovery from a single trace. This allows an adversary to bypass many of the currently proposed countermeasures, particularly in the asymmetric domain. An investigation into machine-learning methods for side-channel analysis as an alternative to template or stochastic methods is also conducted, with support vector machines, logistic regression and neural networks investigated from a side-channel viewpoint. Both binary and multi-class classification attack scenarios are examined in order to explore the relative strengths of each algorithm. Finally, these machine-learning alternatives are empirically compared with template attacks, and their respective merits examined with regard to attack efficiency.
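As a sketch of the two phases a template attack comprises, the fragment below fits a per-hypothesis Gaussian template (mean and covariance of a few trace points) on synthetic "profiling" traces, then classifies traces from a "target" device by maximum likelihood. The leakage model and all numbers are stand-ins, not measurements from a real device.

```python
# Minimal template-attack sketch on synthetic power traces.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
N_KEYS, N_PROFILE, N_POINTS = 4, 200, 5

# Profiling phase: on the controlled, identical device, record traces
# for each key hypothesis and fit a Gaussian template per hypothesis.
true_means = rng.normal(0, 1, (N_KEYS, N_POINTS))   # stand-in leakage model
templates = []
for k in range(N_KEYS):
    traces = true_means[k] + rng.normal(0, 0.5, (N_PROFILE, N_POINTS))
    templates.append((traces.mean(axis=0), np.cov(traces, rowvar=False)))

# Attack phase: classify a handful of target-device traces by summing
# log-likelihoods against each stored template and taking the argmax.
secret_key = 2
target = true_means[secret_key] + rng.normal(0, 0.5, (3, N_POINTS))
scores = [sum(multivariate_normal.logpdf(t, mean=m, cov=c) for t in target)
          for m, c in templates]
print("recovered key hypothesis:", int(np.argmax(scores)))
```

Note how few target traces the attack phase needs once the expensive profiling is done, which is exactly why trace-limiting protocols offer little protection against a profiling adversary.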
Abstract:
In the event of a terrorist-mediated attack in the United States using radiological or improvised nuclear weapons, it is expected that hundreds of thousands of people could be exposed to life-threatening levels of ionizing radiation. We have recently shown that genome-wide expression analysis of the peripheral blood (PB) can generate gene expression profiles that predict radiation exposure and distinguish the dose level of exposure following total body irradiation (TBI). However, in the event of a radiation mass-casualty scenario, many victims will have heterogeneous exposure due to partial shielding, and it is unknown whether PB gene expression profiles would be useful in predicting the status of partially irradiated individuals. Here, we identified gene expression profiles in the PB that were characteristic of anterior hemibody, posterior hemibody and single-limb irradiation at 0.5 Gy, 2 Gy and 10 Gy in C57BL/6 mice. These PB signatures predicted the radiation status of partially irradiated mice with a high level of accuracy (range 79-100%) compared to non-irradiated mice. Interestingly, PB signatures of partial-body irradiation were poorly predictive of radiation status by site of injury (range 16-43%), suggesting that the PB molecular response to partial-body irradiation is anatomic-site specific. Importantly, PB gene signatures generated from TBI-treated mice failed completely to predict the radiation status of partially irradiated animals or non-irradiated controls. These data demonstrate that partial-body irradiation, even to a single limb, generates a characteristic PB signature of radiation injury and thus may necessitate the use of multiple signatures, both partial-body and total-body, to accurately assess the status of an individual exposed to radiation.
Abstract:
Our long-term goal is the detection and characterization of vulnerable plaque in the coronary arteries of the heart using intravascular ultrasound (IVUS) catheters. Vulnerable plaque, characterized by a thin fibrous cap and a soft, lipid-rich necrotic core, is a precursor to heart attack and stroke. Early detection of such plaques may potentially alter the course of treatment of the patient to prevent ischemic events. We have previously described the characterization of carotid plaques using external linear arrays operating at 9 MHz. In addition, we previously modified circular-array IVUS catheters by short-circuiting several neighboring elements to produce fixed beamwidths for intravascular hyperthermia applications. In this paper, we modified Volcano Visions 8.2 French, 9 MHz catheters and Volcano Platinum 3.5 French, 20 MHz catheters by short-circuiting portions of the array for acoustic radiation force impulse (ARFI) imaging applications. The catheters had effective transmit aperture sizes of 2 mm and 1.5 mm, respectively. The catheters were connected to a Verasonics scanner and driven with pushing pulses of 180 V peak-to-peak to acquire ARFI data from a soft gel phantom with a Young's modulus of 2.9 kPa. The dynamic response of the tissue-mimicking material demonstrates a typical ARFI motion of 1 to 2 microns as the gel phantom displaces and then recovers to its resting position. The hardware modifications applied to our IVUS catheters mimic potential beamforming modifications that could be implemented on IVUS scanners. Our results demonstrate that the generation of radiation force from IVUS catheters and the development of intravascular ARFI may be feasible.