997 results for Internet addresses


Relevance:

70.00%

Publisher:

Abstract:

Today’s evolving networks experience a large number of attacks, ranging from system break-ins and infection by automated attack tools such as worms, viruses and Trojan horses, to denial of service (DoS). One important aspect of such attacks is that they are often indiscriminate, targeting Internet addresses without regard to whether those addresses are legitimately allocated. Because no host services are advertised on unused IP addresses, any traffic observed there is by definition unsolicited and likely to be either opportunistic or malicious. Analyzing large repositories of such traffic can extract useful information about both ongoing and new attack patterns and unearth unusual attack behaviors. Such analysis is difficult, however, because of the size and nature of the traffic collected on unused address spaces. In this dissertation, we present a network traffic analysis technique that uses traffic collected from unused address spaces and relies on its statistical properties to detect new and ongoing network anomalies quickly and accurately. Detection is based on the observation that anomalous activity usually transforms network parameters so that their statistical properties no longer remain constant, resulting in abrupt changes. We use sequential analysis techniques to identify changes in the behavior of network traffic targeting unused address spaces and thereby unveil both ongoing and new attack patterns. Specifically, we have developed a dynamic sliding-window, non-parametric cumulative sum (CUSUM) change detection technique for identifying changes in network traffic. Furthermore, we have introduced dynamic thresholds to detect changes in network traffic behavior and to determine when a particular change has ended.
Experimental results are presented that demonstrate the operational effectiveness and efficiency of the proposed approach, using both synthetically generated datasets and real network traces collected from a dedicated block of unused IP addresses.
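The sliding-window non-parametric CUSUM with a dynamic threshold described above can be sketched as follows. This is a minimal illustration, not the dissertation's actual implementation: the window size, the 3-sigma threshold factor, and the synthetic packet-rate series are all assumptions.

```python
# Sketch of a sliding-window non-parametric CUSUM change detector
# (illustrative parameters; not the dissertation's actual implementation).
from collections import deque

def cusum_detect(series, window=20, k_sigma=3.0):
    """Flag indices where the CUSUM statistic exceeds a dynamic threshold.

    The reference mean and the threshold are re-estimated from a sliding
    window, so the detector adapts as traffic behaviour drifts.
    """
    recent = deque(maxlen=window)
    s = 0.0                     # cumulative sum statistic
    alarms = []
    for i, x in enumerate(series):
        if len(recent) == window:
            mean = sum(recent) / window
            var = sum((v - mean) ** 2 for v in recent) / window
            # accumulate only positive deviations from the windowed mean
            s = max(0.0, s + (x - mean))
            # dynamic threshold scales with the windowed standard deviation
            threshold = k_sigma * (var ** 0.5) * window ** 0.5
            if s > threshold:
                alarms.append(i)
                s = 0.0         # reset after reporting a change
        recent.append(x)
    return alarms

# Example: a flat packet-rate series with an abrupt jump at index 30
traffic = [10.0] * 30 + [60.0] * 20
print(cusum_detect(traffic))
```

The key property mirrored from the text is that both the reference statistics and the threshold are recomputed per window, so no fixed traffic model is assumed.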

Relevance:

60.00%

Publisher:

Abstract:

A selection of links related to the world of leisure.

Relevance:

60.00%

Publisher:

Abstract:

A selection of links related to the topic of coexistence.

Relevance:

60.00%

Publisher:

Abstract:

Mint is a bakery specializing in products for people with conditions such as diabetes and obesity. It will be located in districts of Bogotá with the highest rates of these conditions, starting in Kennedy, the district with the largest population of children and older adults suffering from diabetes and obesity. Mint seeks to change the perception of healthy food, and food for diabetics in particular, as boring, bad-tasting and unappealing, by offering a wide variety of flavors, textures and designs. This change will be driven mainly by the design of the products and the decoration of the shop. Since the shop will serve socioeconomic strata 2, 3 and 4, products will be priced according to customers' purchasing power. The premises will cover a total area of 87.4 square meters, dominated by color and design, and will operate every day of the week from 10 a.m. to 8 p.m. The aim is not only to sell products, but to raise customers' awareness of what matters most: their health. The estimated investment is 6,510,000 in machinery, 3,200,000 in equipment and 6,500,000 in furniture and fixtures; in addition, estimated monthly operating costs are 15,473,400, with the break-even point reached at sales of 467,465,004. The shop will run customer-loyalty campaigns, such as promotions and discounts, and will offer home-delivery orders over the internet.

Relevance:

60.00%

Publisher:

Abstract:

This work lists some free electronic books available on the Internet and intended for the medical field.

Relevance:

60.00%

Publisher:

Abstract:

This work lists some sites that can be considered as alternatives for free access to online books available on the Internet and intended for the medical field.

Relevance:

30.00%

Publisher:

Abstract:

Monitoring Internet traffic is critical for acquiring a good understanding of threats to computer and network security and for designing efficient security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content, including monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another widely tried and accepted method for monitoring and analyzing malicious traffic is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is by definition potentially malicious or suspicious. This unique characteristic reduces the amount of collected traffic and makes honeypots a more valuable source of information than other existing techniques. Currently, there is insufficient research in the field of honeypot data analysis. To date, most work on honeypots has been devoted to designing new honeypots or optimizing existing ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, remain immature: analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, give early warning of new vulnerabilities or outbreaks of new automated malicious code, such as worms.
The outcomes of this research include:
• Identification of repeated use of attack tools and attack processes, by grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• Application of principal component analysis (PCA) to detect the structure of attackers’ activities present in low-interaction honeypots and to visualize attackers’ behaviors;
• Detection of new attacks in low-interaction honeypot traffic through the use of the principal components’ residual space and the squared prediction error (SPE) statistic;
• Real-time detection of new attacks using recursive principal component analysis;
• A proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
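The residual-space idea in the outcomes above can be illustrated with a short sketch. Everything here is an assumption for demonstration, not the thesis's data or method: the synthetic feature matrix, the number of retained components, and the simple 3-sigma control limit (a proper analysis would use the Q-statistic confidence limit for the SPE).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "honeypot traffic" feature matrix: rows are time bins,
# columns are per-bin features (e.g. packet counts per targeted port).
normal = rng.normal(0.0, 1.0, size=(200, 6))
normal[:, 1] = 0.8 * normal[:, 0]          # correlated features, as real traffic has

# Fit PCA on normal traffic via SVD of the centered data.
mean = normal.mean(axis=0)
X = normal - mean
U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 3                                       # retained principal components (assumption)
P = Vt[:k].T                                # basis of the principal subspace

def spe(x):
    """Squared prediction error: squared distance of x from the principal subspace."""
    xc = x - mean
    residual = xc - P @ (P.T @ xc)
    return float(residual @ residual)

# Control limit from the training data (simple 3-sigma rule, an assumption).
train_spe = np.array([spe(row) for row in normal])
limit = train_spe.mean() + 3.0 * train_spe.std()

# A new observation that breaks the learned correlation structure
# (a candidate "new attack") has a large residual and exceeds the limit.
attack = np.array([3.0, -3.0, 0.0, 0.0, 0.0, 0.0])
print(spe(attack) > limit)
```

The attack vector violates the correlation between the first two features that PCA learned from normal traffic, so it projects heavily into the residual space even though its individual feature values are unremarkable.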

Relevance:

30.00%

Publisher:

Abstract:

High-rate flooding attacks (also known as Distributed Denial of Service, or DDoS, attacks) continue to constitute a pernicious threat within the Internet domain. In this work we demonstrate how packet source IP addresses, coupled with a change-point analysis of the rate of arrival of new IP addresses, may be sufficient to detect the onset of a high-rate flooding attack. Importantly, minimizing the number of features to be examined directly addresses the issue of scaling the detection process to higher network speeds. Using a proof-of-concept implementation, we show how pre-onset IP addresses can be efficiently represented using a bit vector and used to modify a “white list” filter in a firewall as part of the mitigation strategy.
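A minimal sketch of the bit-vector idea, under stated assumptions: the address space is shrunk for the demo (a full IPv4 bit vector would need 2^32 bits, 512 MiB), sources are plain integers, and a fixed threshold on the new-address rate stands in for the paper's change-point test.

```python
# Bit vector of previously seen source addresses; demo-sized space (assumption).
SPACE_BITS = 2 ** 16
seen = bytearray(SPACE_BITS // 8)        # one bit per possible source address

def mark(ip_index):
    """Set the bit for a source address (adds it to the pre-onset 'white list')."""
    seen[ip_index // 8] |= 1 << (ip_index % 8)

def is_new(ip_index):
    """True if this source address has never been seen before."""
    return not (seen[ip_index // 8] >> (ip_index % 8)) & 1

def new_ip_rate(window):
    """Fraction of sources in a traffic window not seen before the window."""
    fresh = sum(1 for ip in window if is_new(ip))
    for ip in window:
        mark(ip)
    return fresh / len(window)

# Pre-onset traffic: a small pool of repeat visitors fills the bit vector.
for ip in [7, 42, 99, 7, 42, 123, 99]:
    mark(ip)

# Normal window: mostly known sources, so the new-IP rate stays low.
print(new_ip_rate([7, 42, 99, 7, 200]))  # one new source out of five

# Flooding onset: a burst of never-seen sources drives the rate up sharply,
# which is the abrupt change a change-point test would flag.
attack_window = list(range(1000, 1020))
rate = new_ip_rate(attack_window)
THRESHOLD = 0.5                          # illustrative alarm threshold
print(rate > THRESHOLD)
```

The bit vector gives constant-time membership tests and a compact representation, which is what makes the single-feature detector plausible at high line rates.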

Relevance:

30.00%

Publisher:

Abstract:

Spatial representations, metaphors and imaginaries (cyberspace, web pages) have been the mainstay of internet research for a long time. Instead of repeating these themes, this paper seeks to answer the question of how we might understand the concept of time in relation to internet research. After a brief excursus on the general history of the concept, this paper proposes three different approaches to the conceptualisation of internet time. The common thread underlying all the approaches is the notion of time as an assemblage of elements such as technical artefacts, social relations and metaphors. By drawing out time in this way, the paper addresses the challenge of thinking of internet time as coexistence, a clash of fluxes, metaphors, lived experiences and assemblages. In other words, this paper proposes a way to articulate internet time as a multiplicity.


Relevance:

30.00%

Publisher:

Abstract:

This is a presentation made (by invitation from the Queensland Police, Fraud Squad) to a group of Queenslanders, all of whom had fallen victim to internet scams. The paper addresses the subject of guilt and why we may 'suffer' from it after a traumatic experience in which the individual and/or the family has gone through a major financial or emotional loss.

Relevance:

30.00%

Publisher:

Abstract:

Despite the potential for e-commerce growth in Latin America, studies investigating factors that influence consumers’ Internet purchasing behavior are very limited. This research addresses this limitation with a consumer centric study in Chile using the Theory of Reasoned Action. The study examines Chilean consumers’ beliefs, perceptions of risk, and subjective norms about continued purchasing on the Internet. Findings show that consumers’ attitude towards purchasing on the Internet is an influential factor on intentions to continue Internet purchasing. Additionally, compatibility and result demonstrability are influential factors on attitudes towards this behavior. The study contributes to the important area of technology post adoption behavior.

Relevance:

30.00%

Publisher:

Abstract:

This submission is directed to issues arising from the need to recognise and support access to the internet for all Australian residents and citizens. As such it addresses the following questions only: Question 2-1: What general principles or criteria should be applied to help determine whether a law that interferes with freedom of speech is justified? Question 2-2: Which Commonwealth laws unjustifiably interfere with freedom of speech, and why are these laws unjustified?

Relevance:

30.00%

Publisher:

Abstract:

As distributed information services like the World Wide Web become increasingly popular on the Internet, problems of scale are clearly evident. A promising technique that addresses many of these problems is service (or document) replication. However, when a service is replicated, clients then need the additional ability to find a "good" provider of that service. In this paper we report on techniques for finding good service providers without a priori knowledge of server location or network topology. We consider the use of two principal metrics for measuring distance in the Internet: hops, and round-trip latency. We show that these two metrics yield very different results in practice. Surprisingly, we show data indicating that the number of hops between two hosts in the Internet is not strongly correlated to round-trip latency. Thus, the distance in hops between two hosts is not necessarily a good predictor of the expected latency of a document transfer. Instead of using known or measured distances in hops, we show that the extra runtime cost incurred by dynamic latency measurement is well justified by the resulting improved performance. In addition, we show that selection based on dynamic latency measurement performs much better in practice than any static selection scheme. Finally, the difference between the distribution of hops and latencies is fundamental enough to suggest differences in algorithms for server replication. We show that conclusions drawn about service replication based on the distribution of hops need to be revised when the distribution of latencies is considered instead.
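The dynamic-latency selection the authors advocate can be sketched as follows. The server names and the probe function are placeholders (the probe simulates round-trip times; a real client would time a small request, such as an HTTP HEAD, to each replica):

```python
import random

def probe_latency(server):
    """Measure round-trip latency to one replica with a tiny request.

    Placeholder: simulated RTTs plus jitter stand in for a real timed
    exchange with the server.
    """
    simulated_rtt = {"replica-a": 0.120, "replica-b": 0.045, "replica-c": 0.210}
    return simulated_rtt[server] + random.uniform(0.0, 0.005)

def pick_server(servers, probes=3):
    """Select the replica with the lowest median of a few latency probes.

    Taking the median of several probes damps transient jitter, which a
    single measurement (or a static hop count) would not.
    """
    def median_rtt(server):
        samples = sorted(probe_latency(server) for _ in range(probes))
        return samples[probes // 2]
    return min(servers, key=median_rtt)

best = pick_server(["replica-a", "replica-b", "replica-c"])
print(best)
```

This is the paper's point in miniature: the choice is driven by measured latency at request time, not by any static property such as hop count.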

Relevance:

30.00%

Publisher:

Abstract:

The TCP/IP architecture was originally designed without taking security measures into consideration. Over the years, it has been subjected to many attacks, which has led to many patches to counter them. Our investigations into the fundamental principles of networking have shown that carefully following an abstract model of Interprocess Communication (IPC) addresses many problems [1]. Guided by this IPC principle, we designed a clean-slate Recursive INternet Architecture (RINA) [2]. In this paper, we show how, without the aid of cryptographic techniques, the bare-bones architecture of RINA can resist most of the security attacks faced by TCP/IP. We also show how hard it is for an intruder to compromise RINA. Then, we show how RINA inherently supports security policies in a more manageable, on-demand basis, in contrast to the rigid, piecemeal approach of TCP/IP.