773 results for DDoS attacks


Relevance:

10.00%

Publisher:

Abstract:

Gunning, Jeroen. Hizballah and the logic of political participation. In: 'Terror, Insurgency and the State: Ending Protracted Conflicts', Heiberg, Marianne, O'Leary, Brendan & Tirman, John (Philadelphia: University of Pennsylvania Press), pp. 157-188, 2007. RAE2008

Relevance:

10.00%

Publisher:

Abstract:

Unstable arterial plaque is likely the key component of atherosclerosis, a disease responsible for two-thirds of heart attacks and strokes and approximately 1 million deaths in the United States. Ultrasound imaging is able to detect plaque but cannot yet distinguish unstable plaque from stable plaque. In this work a scanning acoustic microscope (SAM) was implemented and validated as a tool to measure the acoustic properties of a sample. The goal for the SAM is to provide quantitative measurements of the acoustic properties of different plaque types, in order to understand the physical basis by which plaque may be identified acoustically. The SAM consists of a spherically focused transducer which operates in pulse-echo mode and is scanned in a 2D raster pattern over a sample. A plane wave analysis is presented which allows the impedance, attenuation and phase velocity of a sample to be determined from measurements of the echoes from the front and back of the sample. After the measurements, the attenuation and phase velocity were analysed to ensure that they were consistent with causality. The backscatter coefficient of the samples was obtained using the technique outlined by Chen et al. [8]. The transducer used here was able to determine acoustic properties from 10-40 MHz. The results for the impedance, attenuation and phase velocity were validated against published results for high- and low-density polyethylene. The plane wave approximation was validated by measuring the properties throughout the focal region and over a range of incidence angles from the transducer. The SAM was used to characterize a set of recipes for tissue-mimicking phantoms which demonstrate independent control over the impedance, attenuation, phase velocity and backscatter coefficient. An initial feasibility study on a human artery was performed.
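The plane-wave relations behind such pulse-echo measurements can be made explicit. The following is a sketch under standard assumptions (normal incidence, a sample of thickness d immersed in water of impedance Z_w; R is the front-surface reflection coefficient, A_f and A_b the front- and back-echo amplitudes); the paper's actual analysis may differ in detail.

```latex
% Impedance from the front-echo reflection coefficient:
R = \frac{Z - Z_w}{Z + Z_w} \;\;\Rightarrow\;\; Z = Z_w\,\frac{1+R}{1-R}

% Phase velocity from the delay \Delta t between front- and back-surface echoes:
c = \frac{2d}{\Delta t}

% Attenuation (nepers per unit length) from the echo amplitude ratio,
% corrected for the two-way interface transmission loss (1 - R^2):
\alpha = \frac{1}{2d}\,\ln\!\left(\frac{(1-R^2)\,\lvert A_f\rvert}{\lvert A_b\rvert}\right)
```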

Relevance:

10.00%

Publisher:

Abstract:

Detecting and understanding anomalies in IP networks is an open and ill-defined problem. Toward this end, we have recently proposed the subspace method for anomaly diagnosis. In this paper we present the first large-scale exploration of the power of the subspace method when applied to flow traffic. An important aspect of this approach is that it fuses information from flow measurements taken throughout a network. We apply the subspace method to three different types of sampled flow traffic in a large academic network: multivariate time series of byte counts, packet counts, and IP-flow counts. We show that each traffic type brings into focus a different set of anomalies via the subspace method. We illustrate and classify the set of anomalies detected. We find that almost all of the anomalies detected represent events of interest to network operators. Furthermore, the anomalies span a remarkably wide spectrum of event types, including denial of service attacks (single-source and distributed), flash crowds, port scanning, downstream traffic engineering, high-rate flows, worm propagation, and network outages.
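As a rough illustration of the subspace idea (not the paper's exact procedure): project the traffic matrix onto the subspace spanned by its top principal components, and flag timebins whose residual energy outside that "normal" subspace is large. The matrix X, the number of components k, and the threshold below are all illustrative assumptions.

```python
import numpy as np

def subspace_anomalies(X, k=4, thresh_quantile=0.995):
    """Flag anomalous timebins in X (timebins x links) via a PCA subspace split.

    The top-k principal components span the "normal" subspace; the squared
    residual outside it is large when traffic deviates from typical patterns.
    """
    Xc = X - X.mean(axis=0)                # center each link's time series
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:k].T                           # normal-subspace basis (links x k)
    residual = Xc - Xc @ P @ P.T           # project out the normal part
    spe = np.sum(residual**2, axis=1)      # squared prediction error per timebin
    return np.where(spe > np.quantile(spe, thresh_quantile))[0]

# Toy usage: 1000 timebins on 20 links, with a burst injected on one link.
rng = np.random.default_rng(0)
X = rng.normal(100, 5, size=(1000, 20))
X[700, 3] += 300                           # simulated volume anomaly
print(subspace_anomalies(X))               # should include timebin 700
```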

Relevance:

10.00%

Publisher:

Abstract:

Content providers often consider the costs of security to be greater than the losses they might incur without it; many view "casual piracy" as their main concern. Our goal is to provide a low-cost defense against such attacks while maintaining rigorous security guarantees. Our defense is integrated with and leverages fast forward error correcting codes, such as Tornado codes, which are widely used to facilitate reliable delivery of rich content. We tune one such family of codes, while preserving their original desirable properties, to guarantee that none of the original content can be recovered whenever a key subset of encoded packets is missing. Ultimately we encrypt only these key codewords (only 4% of all transmissions), making the security overhead negligible.
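The tuned Tornado codes themselves are not described in this abstract. The toy sketch below only illustrates the underlying principle with a generic sparse-graph erasure code: a peeling decoder cannot even start without the degree-1 ("key") packets, so encrypting just those few packets renders the rest of the stream useless to anyone without the key. The construction and all parameters here are illustrative, not the paper's.

```python
import random
import secrets

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(blocks, n_packets, rng):
    """Toy sparse-graph erasure encoder: each packet XORs a random subset of
    source blocks. Degree-1 packets bootstrap the peeling decoder, so they
    play the role of the 'key' packets in this illustration."""
    packets = []
    for _ in range(n_packets):
        idxs = frozenset(rng.sample(range(len(blocks)), rng.choice([1, 2, 3])))
        data = bytes(len(blocks[0]))       # all-zero accumulator
        for i in idxs:
            data = xor(data, blocks[i])
        packets.append((idxs, data))
    return packets

def peel(packets, n_blocks):
    """Peeling decoder: repeatedly resolve packets with one unknown block."""
    known = {}
    progress = True
    while progress:
        progress = False
        for idxs, data in packets:
            unknown = idxs.difference(known)
            if len(unknown) == 1:
                for j in idxs.intersection(known):
                    data = xor(data, known[j])
                known[next(iter(unknown))] = data
                progress = True
    return known if len(known) == n_blocks else None

rng = random.Random(1)
blocks = [secrets.token_bytes(32) for _ in range(8)]
packets = encode(blocks, 24, rng)

# With all packets, peeling (very likely, for these sizes) recovers everything...
full = peel(packets, len(blocks))
print(full is not None and all(full[i] == blocks[i] for i in range(8)))

# ...but withholding (i.e., encrypting) only the degree-1 packets leaves the
# decoder nothing to start from, even though most packets are in the clear.
print(peel([p for p in packets if len(p[0]) > 1], len(blocks)) is None)
```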

Relevance:

10.00%

Publisher:

Abstract:

The SafeWeb anonymizing system has been lauded by the press and loved by its users; self-described as "the most widely used online privacy service in the world," it served over 3,000,000 page views per day at its peak. SafeWeb was designed to defeat content blocking by firewalls and to defeat Web server attempts to identify users, all without degrading Web site behavior or requiring users to install specialized software. In this article we describe how these fundamentally incompatible requirements were realized in SafeWeb's architecture, resulting in spectacular failure modes under simple JavaScript attacks. These exploits allow adversaries to turn SafeWeb into a weapon against its users, inflicting more damage on them than would have been possible if they had never relied on SafeWeb technology. By bringing these problems to light, we hope to remind readers of the chasm that continues to separate popular and technical notions of security.

Relevance:

10.00%

Publisher:

Abstract:

We propose an economic mechanism to reduce the incidence of malware that delivers spam. Earlier research proposed attention markets as a solution for unwanted messages, and showed they could provide more net benefit than alternatives such as filtering and taxes. Because it uses a currency system, Attention Bonds faces a challenge: zombies, botnets, and various forms of malware might steal valuable currency instead of stealing unused CPU cycles. We resolve this problem by taking advantage of the fact that the spam-bot problem has been reduced to financial fraud, so the large body of existing work in that realm can be brought to bear. By drawing an analogy between sending and spending, we show how a market mechanism can detect and prevent spam malware. We prove that, by using a currency, (i) each instance of spam increases the probability of detecting infections, and (ii) the value of eradicating infections can justify insuring users against fraud. This approach attacks spam at the source, a virtue missing from filters that attack spam at the destination. Additionally, the exchange of currency provides signals of interest that can improve the targeting of ads. ISPs benefit from data management services and consumers benefit from the higher average value of the messages they receive. We explore these and other secondary effects of attention markets, and find them to offer, on the whole, attractive economic benefits for all, including consumers, advertisers, and ISPs.
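Claim (i) can be unpacked with a one-line independence argument: if each currency transaction that funds a message is independently audited with probability p, an infection that sends n spam messages is detected with probability 1 - (1 - p)^n, which is strictly increasing in n. A toy check (p is an assumed audit rate, not a figure from the paper):

```python
# Detection probability after n audited spam-funding transactions, assuming
# each is independently flagged with probability p (illustrative values only).
p = 0.002
for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} messages -> P(detect) = {1 - (1 - p) ** n:.3f}")
```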

Relevance:

10.00%

Publisher:

Abstract:

The TCP/IP architecture was originally designed without taking security measures into consideration. Over the years, it has been subjected to many attacks, which has led to many patches to counter them. Our investigations into the fundamental principles of networking have shown that carefully following an abstract model of Interprocess Communication (IPC) addresses many problems [1]. Guided by this IPC principle, we designed a clean-slate Recursive INternet Architecture (RINA) [2]. In this paper, we show how, without the aid of cryptographic techniques, the bare-bones architecture of RINA can resist most of the security attacks faced by TCP/IP. We also show how hard it is for an intruder to compromise RINA. Then, we show how RINA inherently supports security policies in a more manageable, on-demand basis, in contrast to the rigid, piecemeal approach of TCP/IP.

Relevance:

10.00%

Publisher:

Abstract:

The past two decades have seen a dramatic upheaval in the international world order: the end of the Cold War, the 9/11 attacks and the subsequent 'War on Terror', increased Jihadist activity, the accelerated pace of globalization, climate change and the 2008 global financial crisis have contributed to fear, uncertainty, poverty, conflict, massive displacements of asylum seekers and refugees globally, and a proliferation of Protracted Refugee Situations (PRS), defined as situations in which refugees have been in exile 'for 5 years or more after their initial displacement, without immediate prospects for implementation of durable solutions'. More than 7.2 million refugees are now trapped in PRS, with a further 16 million internally displaced persons (IDPs) trapped in camps within their own countries. The Dadaab refugee complex in Kenya, which as of March 2012 holds over 463,000 refugees, is the most significant and extreme example of a PRS in recent times. It was established in 1991 following the collapse of the Somali government of dictator Siad Barre and the disintegration of Somalia into the chaos that still exists today. PRS such as Dadaab raise particular issues about humanitarianism in terms of aid, protection, security, human rights and the actions (or inaction) of the various stakeholders at the international, national and local levels. This thesis investigates these issues through a case study of Dadaab as a PRS, framed in the context of humanitarianism and, in particular, the issues that arise in how the international community, the UN system and individual states provide assistance and protection to vulnerable populations. Although the camps have existed (as of 2012) for over 20 years, no such detailed study of Dadaab (or any other PRS) has been undertaken to date; this study will be of interest to academics in international relations, refugee/migration studies and global governance, as well as to practitioners in both humanitarian response and development.

Relevance:

10.00%

Publisher:

Abstract:

Along with the growing demand for cryptosystems in systems ranging from large servers to mobile devices, cryptographic protocols suitable for use under particular constraints are becoming more and more important. Constraints such as calculation time, area, efficiency and security must be considered by the designer. Elliptic curves, since their introduction to public key cryptography in 1985, have challenged established public key and signature generation schemes such as RSA, offering more security per bit. Amongst elliptic curve based systems, pairing-based cryptosystems are thoroughly researched and can be used in many public key protocols, such as identity-based schemes. For hardware implementations of pairing-based protocols, all components which calculate operations over elliptic curves must be considered. Designers of pairing algorithms must choose calculation blocks and arrange the basic operations carefully so that the implementation can meet the constraints of time and hardware resource area. This thesis deals with different hardware architectures to accelerate pairing-based cryptosystems over fields of characteristic two. Using different top-level architectures, the hardware efficiency of operations that run at different times is first considered. Security is another important aspect of pairing-based cryptography, particularly with respect to Side Channel Analysis (SCA) attacks. Naively implemented hardware accelerators for pairing-based cryptosystems can be vulnerable when physical analysis attacks are taken into consideration. This thesis considers the weaknesses of pairing-based public key cryptography and identifies the particular calculations that are insecure; countermeasures are then applied to protect these weak links of the implementation. Some important rules that designers must obey to improve the security of these cryptosystems are proposed. Following these rules, three countermeasures that protect pairing-based cryptosystems against SCA attacks are applied; their implementations are presented and their performance is investigated.
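The abstract does not spell out the three countermeasures. A classic example of the kind of rule involved is keeping the sequence of curve operations independent of secret key bits. The sketch below shows a Montgomery-ladder scalar multiplication on a toy short-Weierstrass curve over a small prime field; the curve, the affine arithmetic, and Python itself are illustrative stand-ins (real pairing accelerators work over characteristic-two fields in hardware).

```python
# Toy curve y^2 = x^3 + 2x + 3 over F_97, affine coordinates; None = infinity.
p, a = 97, 2

def inv(x):
    return pow(x, p - 2, p)          # modular inverse via Fermat's little theorem

def add(P, Q):
    """Point addition/doubling with the usual affine chord-and-tangent formulas."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                  # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * inv(2 * y1) % p
    else:
        lam = (y2 - y1) * inv(x2 - x1) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ladder(k, P):
    """Montgomery ladder: exactly one add and one double per key bit, in the
    same pattern whatever the bit's value, so the *sequence* of operations
    leaks nothing about k (data-dependent leakage needs further measures)."""
    R0, R1 = None, P
    for bit in bin(k)[2:]:
        if bit == '1':
            R0, R1 = add(R0, R1), add(R1, R1)
        else:
            R1, R0 = add(R0, R1), add(R0, R0)
    return R0

P = (0, 10)                          # on the curve: 10^2 = 100 = 3 (mod 97)
assert ladder(5, P) == add(P, add(P, add(P, add(P, P))))
```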

Relevance:

10.00%

Publisher:

Abstract:

Terrorist attacks by transnational armed groups cause on average 15,000 deaths every year worldwide, and the law enforcement agencies of some states face many challenges in bringing those responsible to justice. Despite various attempts to codify the law on transnational terrorism since the 1930s, a crime of transnational terrorism under international law remains contested, reflecting concerns regarding the relative importance of prosecuting members of transnational armed groups before the International Criminal Court. However, a study of the emerging jurisprudence of the International Criminal Court suggests that terrorist attacks cannot be classified as a war crime or a crime against humanity. Therefore, using organisational network theory, this thesis probes the limits of international criminal law in bringing members of transnational armed groups to justice in the context of changing methods of warfare. Determining the organisational structure of transnational armed groups provides a powerful analytical framework for examining the challenges in holding their members accountable before the International Criminal Court, in the context of the relationship between commanders and the subordinate members of the group.

Relevance:

10.00%

Publisher:

Abstract:

In this work we introduce a new mathematical tool for optimization of routes, topology design, and energy efficiency in wireless sensor networks: a vector field formulation that models communication in the network, with routing performed in the direction of this vector field at every location. The magnitude of the vector field at each location represents the density of the data being transited through that location. We define the total communication cost in the network as the integral of a quadratic form of the vector field over the network area. With this formulation, we introduce mathematical machinery based on partial differential equations very similar to Maxwell's equations in electrostatics, and we show that to minimize the cost, the routes should be found from the solution of these partial differential equations. In our formulation, the sensors are sources of information, similar to the positive charges in electrostatics; the destinations are sinks of information, similar to the negative charges; and the network is similar to a non-homogeneous dielectric medium with a variable dielectric constant (permittivity coefficient).

As one application of this vector field model, we offer a scheme for energy-efficient routing. The scheme raises the permittivity coefficient in the places of the network where nodes have high residual energy, and lowers it where nodes have little energy left. Our simulations show that this method gives a significant increase in network lifetime compared to the shortest path and weighted shortest path schemes. Our initial focus is on the case of a single destination in the network; we later extend the approach to multiple destinations. With multiple destinations, the network must be partitioned into regions of attraction, one per destination, with each destination responsible for collecting all messages generated in its region. The crux of the optimization problem is then how to define the regions of attraction and how much communication load to assign to each destination so as to optimize network performance. We use the vector field model to solve this problem: we define a conservative vector field, which can therefore be written as the gradient of a scalar (potential) field, and show that in the optimal assignment of the communication load, the potential takes the same value at the locations of all the destinations.

Another application of the vector field model is finding the optimal locations of the destinations in the network. We show that the vector field gives the gradient of the cost function with respect to the destination locations; based on this fact, we suggest an algorithm to be applied during the design phase of a network to relocate the destinations and reduce the communication cost. The performance of our proposed schemes is confirmed by several examples and simulation experiments.
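The electrostatics analogy can be written out concretely. The following is our rendering under standard assumptions (the notation is not necessarily the paper's): D(x) is the information flux density, ρ(x) the net rate of data generation (positive at sensors, negative at destinations), and ε(x) the permittivity-like weight.

```latex
% Flow conservation, with sensors as sources and destinations as sinks:
\nabla \cdot \mathbf{D}(\mathbf{x}) = \rho(\mathbf{x})

% Quadratic communication cost over the network area A:
J[\mathbf{D}] = \int_{A} \frac{\lvert\mathbf{D}(\mathbf{x})\rvert^{2}}{\varepsilon(\mathbf{x})}\, dA

% Minimizing J subject to the constraint yields a potential, exactly as in
% electrostatics (Thomson's principle): flux concentrates where \varepsilon is
% large, which is how raising \varepsilon in energy-rich regions steers routes.
\mathbf{D} = -\varepsilon \nabla\phi, \qquad \nabla \cdot (\varepsilon \nabla\phi) = -\rho
```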
In another part of this work we focus on the notions of responsiveness and conformance of TCP traffic in communication networks. We introduce the notion of responsiveness for TCP aggregates, defined as the degree to which a TCP aggregate reduces its sending rate in response to packet drops. We define metrics that describe the responsiveness of TCP aggregates and suggest two methods for determining their values. The first method is based on a test in which we intentionally drop a few packets from the aggregate and measure the resulting rate decrease of that aggregate. This kind of test is not robust to multiple simultaneous tests performed at different routers. We make it robust by using ideas from the CDMA approach to multiple-access channels in communication theory. Based on this approach, we introduce tests of responsiveness for aggregates, which we call the CDMA-based Aggregate Perturbation Method (CAPM). We use CAPM to perform congestion control; a distinguishing feature of our congestion control scheme is that it maintains a degree of fairness among different aggregates. We then modify CAPM to offer methods for estimating the proportion of an aggregate of TCP traffic that does not conform to protocol specifications, and hence may belong to a DDoS attack. Our methods work by intentionally perturbing the aggregate, dropping a very small number of packets from it and observing its response. We offer two methods for conformance testing. In the first, we apply the perturbation tests to SYN packets sent at the start of the TCP 3-way handshake, using the fact that the rate of ACK packets exchanged in the handshake should follow the rate of perturbations. In the second, we apply the perturbation tests to TCP data packets, using the fact that the rate of retransmitted data packets should follow the rate of perturbations. In both methods we use signature-based perturbations, meaning that packet drops are performed at a rate given by a function of time. We exploit the analogy of our problem with multiple-access communication to find the signatures; specifically, we assign orthogonal CDMA-based signatures to different routers in a distributed implementation of our methods. As a result of orthogonality, performance does not degrade due to cross-interference between simultaneously testing routers. We have shown the efficacy of our methods through mathematical analysis and extensive simulation experiments.
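A rough sketch of the CDMA idea (our illustration, not the authors' implementation): each router modulates its small extra drop rate with its own row of a Walsh-Hadamard matrix. Because the rows are orthogonal, a router correlating the aggregate's observed rate with its own signature sees only its own test, even when other routers test the same aggregate simultaneously. The linear response model, the sensitivity s, and all parameters below are illustrative assumptions.

```python
import numpy as np

def hadamard(n):
    """Walsh-Hadamard matrix of order n (a power of two); rows are mutually
    orthogonal +/-1 signatures."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

rng = np.random.default_rng(0)
T = 64                                    # number of test slots (illustrative)
H = hadamard(T)[1:4, :]                   # rows 1-3: zero-mean signatures, one per router

# Three routers perturb the same aggregate at once, each modulating a small
# extra drop probability with its own signature.
eps, base = 0.005, 0.02
drops = base + eps * H                    # per-router, per-slot drop probability

# Toy responsive (TCP-like) aggregate: its rate dips in proportion to the total
# drop rate with sensitivity s; an unresponsive (possibly DDoS) aggregate would
# show s close to 0.
s = 40.0
rate = 100 - s * drops.sum(axis=0) + rng.normal(0, 0.02, T)

# Each router correlates the observed rate with its own signature. Row
# orthogonality cancels both the constant terms and the other routers' tests.
est = -(rate @ H.T) / (eps * T)
print(np.round(est, 1))                   # all three routers recover s ~= 40
```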

Relevance:

10.00%

Publisher:

Abstract:

This study investigates the effect of serious health events, including new diagnoses of heart attacks, strokes, cancers, chronic lung disease, chronic heart failure, diabetes, and heart disease, on future smoking status up to 6 years post-event. Data come from the Health and Retirement Study, a nationally representative longitudinal survey of Americans aged 51-61 in 1991, followed every 2 years from 1992 to 1998. Smoking status is evaluated at each of three follow-ups (1994, 1996, and 1998) as a function of health events between each of the four waves. Acute and chronic health events are associated with a much lower likelihood of smoking, both in the wave immediately following the event and up to 6 years later. However, future events do not retrospectively predict past cessation. In sum, serious health events have substantial impacts on cessation rates of older smokers. Notably, these effects persist for as much as 6 years after a health event.

Relevance:

10.00%

Publisher:

Abstract:

Cognitive-emotional distinctiveness (CED), the extent to which an individual separates emotions from an event in the cognitive representation of the event, was explored in four studies. CED was measured using a modified multidimensional scaling procedure. The first study found that lower levels of CED in memories of the September 11 terrorist attacks predicted greater frequency of intrusive thoughts about the attacks. The second study revealed that CED levels are higher for negative events than for positive events, and that low CED levels in emotionally intense negative events are associated with a pattern of greater event-related distress. The third study replicated these findings when examining CED levels in participants' memories of the 2004 Presidential election. The fourth study revealed that low CED in emotionally intense negative events is associated with worse mental health. We argue that CED is an adaptive and healthy coping feature of stressful memories.

Relevance:

10.00%

Publisher:

Abstract:

On September 12, 2001, 54 Duke students recorded their memory of first hearing about the terrorist attacks of September 11 and of a recent everyday event. They were tested again either 1, 6, or 32 weeks later. Consistency for the flashbulb and everyday memories did not differ, in both cases declining over time. However, ratings of vividness, recollection, and belief in the accuracy of memory declined only for everyday memories. Initial visceral emotion ratings correlated with later belief in accuracy, but not consistency, for flashbulb memories. Initial visceral emotion ratings predicted later posttraumatic stress disorder symptoms. Flashbulb memories are not special in their accuracy, as previously claimed, but only in their perceived accuracy.

Relevance:

10.00%

Publisher:

Abstract:

Heavy metal-bearing waste usually needs solidification/stabilisation (s/s) prior to landfill to lower the leaching rate. Cement is the most adaptable binder currently available for the immobilisation of heavy metals, and the selection of cements and operating parameters depends upon an understanding of the chemistry of the system. This paper discusses the interactions of heavy metals and cement phases in the solidification/stabilisation process, and clarifies the effects of heavy metals on cement hydration. As judged by the decomposition rate of minerals, heavy metals accelerate the hydration of tricalcium silicate (C3S) and Portland cement, although they retard the precipitation of portlandite owing to the reduction in pH resulting from hydrolysis of heavy metal ions. The chemical mechanism underlying the accelerating effect of heavy metals is considered to be H+ attack on cement phases and the precipitation of calcium heavy metal double hydroxides, which consumes calcium ions and thereby promotes the decomposition of C3S. In this work, molecular models of calcium silicate hydrate gel are presented based on the examination of Si-29 solid-state magic angle spinning nuclear magnetic resonance (MAS/NMR). This paper also reviews immobilisation mechanisms of heavy metals in hydrated cement matrices, focusing on sorption, precipitation and chemical incorporation by cement hydration products. It is concluded that further research on phase development during cement hydration in the presence of heavy metals, and on thermodynamic modelling, is needed to improve the effectiveness of cement-based s/s and extend this waste management technique.
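For reference, the tricalcium silicate hydration that the heavy metals are said to accelerate is conventionally written in cement chemists' shorthand (C = CaO, S = SiO2, H = H2O; the C-S-H stoichiometry is only approximate):

```latex
2\,\mathrm{C_3S} + 6\,\mathrm{H} \;\longrightarrow\;
\underbrace{\mathrm{C_3S_2H_3}}_{\text{C-S-H gel}}
+ 3\,\underbrace{\mathrm{CH}}_{\text{portlandite}}
```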