160 results for IP routing


Relevance: 10.00%

Abstract:

Monitoring Internet traffic is critical to acquiring a good understanding of threats to computer and network security and to designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content, including monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another widely tried and accepted method for monitoring and analyzing malicious traffic is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. Because honeypots run no production services, any contact with them is by definition potentially malicious or suspicious. This unique characteristic reduces the amount of collected traffic and makes the honeypot a more valuable source of information than other existing techniques. Research in honeypot data analysis is currently insufficient: to date, most work on honeypots has been devoted to designing new honeypots or optimizing existing ones, approaches for analyzing data collected from honeypots, especially low-interaction honeypots, remain immature, and analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need for more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, give early warning of new vulnerabilities or outbreaks of new automated malicious code, such as worms. The outcomes of this research include:

• Identification of repeated use of attack tools and attack processes, by grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm;
• Application of principal component analysis to detect the structure of attackers' activities present in low-interaction honeypots and to visualize attackers' behaviors;
• Detection of new attacks in low-interaction honeypot traffic through the use of the principal components' residual space and the square prediction error statistic;
• Real-time detection of new attacks using recursive principal component analysis;
• A proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
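The residual-space idea named above can be made concrete with a small sketch: fit principal components on baseline honeypot traffic features, then score each new window by the squared prediction error (SPE) of its projection onto the residual space. This is a minimal illustration using numpy, with synthetic feature vectors standing in for real honeypot statistics; it is not the dissertation's implementation.

```python
import numpy as np

def fit_pca(X, k):
    """Fit PCA on baseline traffic features X (rows = time windows)."""
    mu = X.mean(axis=0)
    # Principal directions from the SVD of the centered baseline data.
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k].T          # mean and retained subspace (d x k)

def spe(x, mu, P):
    """Squared prediction error: squared norm of x's residual after
    removing its projection onto the principal subspace."""
    r = (x - mu) - P @ (P.T @ (x - mu))
    return float(r @ r)

# Baseline: feature vectors (e.g. packet counts, unique ports,
# inter-arrival statistics) from known-benign observation windows.
rng = np.random.default_rng(0)
baseline = rng.normal(size=(500, 8))
mu, P = fit_pca(baseline, k=3)

# Threshold calibrated on the baseline SPE distribution (99th percentile).
threshold = np.percentile([spe(x, mu, P) for x in baseline], 99)

# A window far outside the baseline structure scores a high SPE.
attack = np.full(8, 6.0)
print(spe(attack, mu, P) > threshold)   # True: flagged as a new pattern
```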

Relevance: 10.00%

Abstract:

This Report, prepared for Smart Service Queensland (“SSQ”), addresses legal issues, areas of risk and other factors associated with activities conducted on three popular online platforms—YouTube, MySpace and Second Life (which are referred to throughout this Report as the “Platforms”). The Platforms exemplify online participatory spaces and behaviours, including blogging and networking, multimedia sharing, and immersive virtual environments.

Relevance: 10.00%

Abstract:

Design talks LOUDLY!!! is a series of interactive presentations exploring issues and opportunities involving professional design.

These seminars are organised by the Industrial Design Network Queensland (IDnetQLD) in coordination with the Design Institute of Australia (DIA). This event was held at the State Library of Queensland (SLQ), with invited public presentations by a panel of industry experts from the Australian Government – IP Australia.

The first seminar, "Intellectual Property: designing 4 success", highlighted to design professionals how the various forms of intellectual property interact, what protections and pitfalls exist, and how these impact the work and responsibilities of designers. The overlaps, gaps, and incongruities in the various IP protection systems were highlighted by the expert line-up of speakers.

The underlying message is that a clear understanding of all IP types is necessary to gain the best advantage from IP protection and thereby eliminate potential IP ownership issues before they become a problem.

Relevance: 10.00%

Abstract:

The programming and retasking of sensor nodes could benefit greatly from the use of a virtual machine (VM), since byte code is compact, can be loaded on demand, and can be interpreted on a heterogeneous set of devices. The challenge is to combine good programming tools with a footprint small enough to meet the memory constraints of typical WSN platforms. To this end we propose Darjeeling, a virtual machine modelled after the Java VM and capable of executing a substantial subset of the Java language, but designed specifically to run on 8- and 16-bit microcontrollers with 2-10 KB of RAM. The Darjeeling VM uses a 16- rather than a 32-bit architecture, which is more efficient on the targeted platforms. Darjeeling features a novel memory organisation with strict separation of reference from non-reference types, which eliminates the need for run-time type inspection in the underlying compacting garbage collector. Darjeeling uses a linked stack model that provides lightweight threads and supports synchronisation. The VM has been implemented on three different platforms and was evaluated with micro-benchmarks and a real-world application. The latter includes a pure Java implementation of the collection tree routing protocol, conveniently programmed as a set of cooperating threads, and a reimplementation of an existing environmental monitoring application. The results show that Darjeeling is a viable solution for deploying large-scale heterogeneous sensor networks.
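The reference/non-reference split is the key trick here: if every slot that can hold a reference lives on its own stack, the garbage collector can enumerate roots without any run-time type inspection. A toy Python model of that idea (names and structure are illustrative, not Darjeeling's actual internals):

```python
class Frame:
    """Toy model of a split stack frame: references and raw values
    live on separate stacks, so a slot's kind is known from which
    stack holds it, never from a run-time tag."""

    def __init__(self):
        self.ref_stack = []   # object references only
        self.val_stack = []   # plain integers only

    def push_ref(self, obj):
        self.ref_stack.append(obj)

    def push_val(self, v):
        self.val_stack.append(v & 0xFFFF)   # model the 16-bit slots


def gc_roots(frames):
    """A compacting collector can enumerate all roots with a plain
    walk over the reference stacks: no type inspection required."""
    for frame in frames:
        yield from frame.ref_stack


# Linked frames stand in for the linked stack model.
f = Frame()
f.push_val(42)          # non-reference: invisible to the GC
f.push_ref(object())    # reference: a GC root
print(len(list(gc_roots([f]))))  # 1
```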

Relevance: 10.00%

Abstract:

The implementation of a robotic security solution generally requires one algorithm to route the robot around the environment and another algorithm to perform anomaly detection. Solutions to the routing problem require the robot to have a good estimate of its own pose. We present a novel security system that uses metrics generated by the localisation algorithm to perform adaptive anomaly detection. The localisation algorithm is a vision-based SLAM solution called RatSLAM, based on mechanisms within the hippocampus. The anomaly detection algorithm is based on the mechanisms used by the immune system to identify threats to the body. The system is explored using data gathered within an unmodified office environment. It is shown that the algorithm successfully reacts to the presence of people and objects in areas where they are not usually present and is tolerised against the presence of people in environments that are usually dynamic.
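The abstract gives the immune-system analogy but not the algorithm, so the following is only a loose sketch of the tolerisation idea under assumed details: per-location statistics of a scene-change score are learned online, so locations that are normally dynamic acquire a wide tolerance and routine activity there is not flagged.

```python
import math
from collections import defaultdict

class TolerantDetector:
    """Per-location running mean/variance of a scene-change score.
    Normally dynamic locations accumulate high variance, so activity
    there stays within tolerance (the system is 'tolerised')."""

    def __init__(self, sigmas=3.0):
        # Per location: sample count, running mean, sum of squared
        # deviations (Welford's online algorithm).
        self.stats = defaultdict(lambda: [0, 0.0, 0.0])
        self.sigmas = sigmas

    def observe(self, location, score):
        n, mean, m2 = self.stats[location]
        n += 1
        delta = score - mean
        mean += delta / n
        m2 += delta * (score - mean)
        self.stats[location] = [n, mean, m2]

    def is_anomalous(self, location, score):
        n, mean, m2 = self.stats[location]
        if n < 2:
            return False                     # not enough history yet
        std = math.sqrt(m2 / (n - 1))
        return abs(score - mean) > self.sigmas * max(std, 1e-6)
```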

Relevance: 10.00%

Abstract:

Objective: To demonstrate properties of the International Classification of External Causes of Injury (ICECI) as a tool for use in injury prevention research.

Methods: The Childhood Injury Prevention Study (CHIPS) is a prospective longitudinal follow-up study of a cohort of 871 children 5–12 years of age, with a nested case-crossover component. The ICECI is the latest tool in the International Classification of Diseases (ICD) family and has been designed to improve the precision of coding injury events. The details of all injury events recorded in the study, as well as all measured injury-related exposures, were coded using the ICECI. This paper reports a substudy on the utility and practicability of using the ICECI in the CHIPS to record exposures. Interrater reliability was quantified for a sample of injured participants using the Kappa statistic to measure concordance between codes independently assigned by two research staff.

Results: There were 767 diaries collected at baseline, with event details from 563 injuries and exposure details from injury crossover periods. There were no event, location, or activity details that could not be coded using the ICECI. Kappa statistics for concordance between raters within each of the dimensions ranged from 0.31 to 0.93 for the injury events and were 0.94 and 0.97 for activity and location in the control periods.

Discussion: This study represents the first detailed account of the properties of the ICECI revealed by its use in a primary analytic epidemiological study of injury prevention. The results provide considerable support for the ICECI and its further use.
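The concordance measure reported here, Cohen's kappa, corrects raw inter-rater agreement for the agreement expected by chance. A minimal implementation over two raters' categorical code assignments:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed - chance agreement) / (1 - chance)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # Chance agreement: both raters independently pick category c.
    expected = sum(ca[c] * cb[c] for c in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Example: two staff independently coding the same six injury events.
print(cohens_kappa(list("AABBCC"), list("AABBCA")))  # 0.75
```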

Relevance: 10.00%

Abstract:

Ad hoc networks are vulnerable to attacks due to their distributed nature and lack of infrastructure. Intrusion detection systems (IDS) provide audit and monitoring capabilities that offer local security to a node and help it assess the trust level of other nodes. Clustering protocols can be taken as an additional advantage in these processing-constrained networks, allowing intrusions to be detected collaboratively with less power usage and minimal overhead. Existing clustering protocols are not suitable for intrusion detection purposes because they are tied to routes: route establishment and route renewal affect the clusters and, as a consequence, processing and traffic overhead increase due to the instability of the clusters. Ad hoc networks are battery- and power-constrained, so a trusted monitoring node should be available to detect and respond to intrusions in time. This can be achieved only if the clusters are stable for long periods; if clusters change regularly with the routes, intrusion detection will not be effective. We therefore propose a generalized clustering algorithm that can run on top of any routing protocol and can monitor intrusions constantly, irrespective of the routes. The proposed simplified clustering scheme has been used to detect intrusions, resulting in high detection rates and low processing and memory overhead irrespective of the routes, connections, traffic types, and mobility of nodes in the network. Clustering is also useful for detecting intrusions collaboratively, since an individual node can neither detect a malicious node alone nor take action against it on its own.
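As a concrete, if simplified, illustration of route-independent clustering, the classic lowest-ID scheme elects cluster-heads from radio neighbourhoods alone, so cluster membership survives route changes; the algorithm proposed in the abstract may differ in its details.

```python
def elect_cluster_heads(neighbours):
    """Lowest-ID clustering: visit nodes in ID order; an unassigned
    node becomes a cluster-head and claims its still-unassigned
    one-hop neighbours as members. Heads then serve as the trusted
    monitoring nodes for intrusion detection."""
    head_of = {}
    for node in sorted(neighbours):
        if node in head_of:
            continue
        head_of[node] = node                # node heads its own cluster
        for nbr in neighbours[node]:
            head_of.setdefault(nbr, node)   # unassigned neighbours join
    return head_of

# Radio neighbourhoods (undirected). Clusters depend only on these,
# not on any routing state, so route changes leave clusters intact.
topology = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2}, 4: {2, 5}, 5: {4}}
print(elect_cluster_heads(topology))  # {1: 1, 2: 1, 3: 1, 4: 4, 5: 4}
```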

Relevance: 10.00%

Abstract:

Mobile ad hoc networks (MANETs) are temporary wireless networks useful in emergency rescue services, battlefield operations, mobile conferencing, and a variety of other applications. Due to their dynamic nature and lack of centralized monitoring points, these networks are highly vulnerable to attacks. Intrusion detection systems (IDS) provide audit and monitoring capabilities that offer local security to a node and help it assess the trust level of other nodes. We take advantage of the clustering concept in MANETs for effective communication between nodes, where each cluster comprises a number of member nodes and is managed by a cluster-head. In these battery- and memory-constrained networks, clustering benefits intrusion detection by separating tasks between head and member nodes, while at the same time providing an opportunity for a collaborative detection approach. Clustering schemes are generally used for routing purposes to enhance route efficiency; however, a change of cluster tends to change the route, which degrades performance. This paper presents a low-overhead clustering algorithm designed for detecting intrusions rather than for efficient routing. It also discusses intrusion detection techniques built on this simplified clustering scheme.

Relevance: 10.00%

Abstract:

Emerging data streaming applications in Wireless Sensor Networks require reliable and energy-efficient transport protocols. Our recent Wireless Sensor Network deployment in the Burdekin delta, Australia, for water monitoring [T. Le Dinh, W. Hu, P. Sikka, P. Corke, L. Overs, S. Brosnan, Design and deployment of a remote robust sensor network: experiences from an outdoor water quality monitoring network, in: Second IEEE Workshop on Practical Issues in Building Sensor Network Applications (SenseApp 2007), Dublin, Ireland, 2007] is one such example. This application involves periodically streaming sensed data such as pressure, water flow rate, and salinity from many scattered sensors to the sink node, which in turn relays them via an IP network to a remote site for archiving, processing, and presentation. While latency is not a primary concern in this class of application (the sampling rate is usually in terms of minutes or hours), energy efficiency is. Continuous long-term operation and reliable delivery of the sensed data to the sink are also desirable. This paper proposes ERTP, an Energy-efficient and Reliable Transport Protocol for Wireless Sensor Networks. ERTP is designed for data streaming applications in which sensor readings are transmitted from one or more sensor sources to a base station (or sink). ERTP uses a statistical reliability metric which ensures that the number of data packets delivered to the sink exceeds a defined threshold. Our extensive discrete-event simulations and experimental evaluations show that ERTP is significantly more energy-efficient than current approaches, reducing energy consumption by more than 45%. Consequently, sensor nodes consume less energy and the lifespan of the unattended WSN is increased.
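The statistical-reliability idea can be illustrated with simple arithmetic: given a per-hop delivery probability and an end-to-end reliability target, each hop needs only a bounded number of (re)transmissions, rather than retrying until every packet gets through. A sketch of that calculation, assuming uniform links; this is not ERTP's actual estimator:

```python
import math

def transmissions_per_hop(p_delivery, target, hops):
    """Smallest per-hop transmission count n such that the end-to-end
    delivery probability (1 - (1 - p)^n)^hops meets the target."""
    per_hop_target = target ** (1.0 / hops)
    n = math.log(1.0 - per_hop_target) / math.log(1.0 - p_delivery)
    return math.ceil(n)

# 80% per-hop link delivery, 95% end-to-end target over 5 hops:
# capping each hop at n transmissions instead of retrying forever
# is where the energy saving over strict reliability comes from.
print(transmissions_per_hop(0.8, 0.95, 5))  # 3
```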

Relevance: 10.00%

Abstract:

In the past few years, numerous data collection protocols have been developed for wireless sensor networks (WSNs), but there has been no comparison of their relative performance in realistic environments. Here we report the results of an empirical study using a Fleck3 sensor network testbed for four different data collection protocols: one-phase pull Directed Diffusion (DD), Expected Number of Transmissions (ETX), ETX with explicit acknowledgment (ETX-eAck), and ETX with implicit acknowledgment (ETX-iAck). Our empirical study provides useful insights for future sensor network deployments. When the required application end-to-end reliability is not strict (e.g., 70%) and link quality is good, DD and ETX are the best options because of their simplicity and low routing overhead. Both ETX-eAck and ETX-iAck achieve more than 90% end-to-end reliability when link quality is reasonable (less than 25% packet loss). When link quality is good, ETX-iAck introduces significantly less routing overhead (up to 50% less) than ETX-eAck. However, if the radio transceiver supports variable packet length, ETX-eAck can outperform ETX-iAck when link quality is poor. The important message of this paper is that the choice of data collection protocol should come only after the operating environment is understood, including the characteristics of the radio transceiver and link-loss statistics from a long-term (across seasons and weather variation) radio survey of the site.
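For context, ETX estimates the expected number of transmissions on a link as 1/(df x dr), where df and dr are the measured forward and reverse delivery ratios, and a route's cost is the sum of its link costs. A small sketch of the metric:

```python
def link_etx(d_forward, d_reverse):
    """Expected transmissions for one link: the data packet must get
    across (d_forward) and its acknowledgment must return (d_reverse)."""
    return 1.0 / (d_forward * d_reverse)

def route_etx(links):
    """Route cost is additive: data collection picks the parent that
    minimises the summed ETX to the sink."""
    return sum(link_etx(df, dr) for df, dr in links)

# Two candidate routes to the sink: one short lossy hop versus two
# clean hops. The two-hop route wins despite the extra hop.
print(route_etx([(0.5, 0.6)]))              # ~3.33 transmissions
print(route_etx([(0.9, 0.9), (0.9, 0.9)]))  # ~2.47 transmissions
```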

Relevance: 10.00%

Abstract:

In this paper we discuss how a network of sensors and robots can cooperate to solve important robotics problems such as localization and navigation. We use a robot to localize sensor nodes, and we then use these localized nodes to navigate robots and humans through the sensorized space. We explore these novel ideas with results from two large-scale sensor network and robot experiments involving 50 motes, two types of flying robot: an autonomous helicopter and a large indoor cable array robot, and a human-network interface. We present the distributed algorithms for localization, geographic routing, path definition and incremental navigation. We also describe how a human can be guided using a simple hand-held device that interfaces to this same environmental infrastructure.
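The navigation idea, in which each node holds a gradient value toward a goal and the robot repeatedly moves toward the neighbouring node with the smallest value, can be sketched with a hop-count gradient flood. This is an illustrative centralised version, not the paper's actual distributed algorithm:

```python
from collections import deque

def build_gradient(neighbours, goal):
    """Flood a hop-count gradient field outward from the goal node."""
    dist = {goal: 0}
    queue = deque([goal])
    while queue:
        node = queue.popleft()
        for nbr in neighbours[node]:
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return dist

def navigate(neighbours, dist, start):
    """Robot (or guided human) descends the gradient to the goal."""
    path, node = [start], start
    while dist[node] > 0:
        node = min(neighbours[node], key=dist.get)
        path.append(node)
    return path

# Toy connectivity graph of localized sensor nodes.
nbrs = {0: {1}, 1: {0, 2, 3}, 2: {1, 4}, 3: {1, 4}, 4: {2, 3}}
d = build_gradient(nbrs, goal=4)
print(navigate(nbrs, d, start=0))  # [0, 1, 2, 4] (or via node 3)
```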

Relevance: 10.00%

Abstract:

This paper introduces the application of a sensor network to navigate a flying robot. We have developed distributed algorithms and efficient geographic routing techniques to incrementally guide one or more robots to points of interest based on sensor gradient fields, or along paths defined in terms of Cartesian coordinates. The robot itself is an integral part of the localization process which establishes the positions of sensors which are not known a priori. We use this system in a large-scale outdoor experiment with Mote sensors to guide an autonomous helicopter along a path encoded in the network. A simple handheld device, using this same environmental infrastructure, is used to guide humans.

Relevance: 10.00%

Abstract:

Today's evolving networks experience a large number of different attacks, ranging from system break-ins and infection by automated attack tools such as worms, viruses, and trojan horses, to denial of service (DoS). One important aspect of such attacks is that they are often indiscriminate, targeting Internet addresses without regard to whether they are bona fide allocated or not. In the absence of any advertised host services, the traffic observed on unused IP addresses is by definition unsolicited and likely to be either opportunistic or malicious. Analysis of large repositories of such traffic can extract useful information about both ongoing and new attack patterns and unearth unusual attack behaviors; however, such analysis is difficult due to the size and nature of the traffic collected on unused address spaces. In this dissertation, we present a network traffic analysis technique which uses traffic collected from unused address spaces and relies on the statistical properties of the collected traffic to accurately and quickly detect new and ongoing network anomalies. Detection of network anomalies is based on the observation that anomalous activity usually transforms the network parameters in such a way that their statistical properties no longer remain constant, resulting in abrupt changes. We use sequential analysis techniques to identify changes in the behavior of network traffic targeting unused address spaces, unveiling both ongoing and new attack patterns. Specifically, we have developed a dynamic sliding-window-based non-parametric cumulative sum (CUSUM) change detection technique for identifying changes in network traffic. Furthermore, we have introduced dynamic thresholds to detect changes in network traffic behavior and to detect when a particular change has ended. Experimental results demonstrate the operational effectiveness and efficiency of the proposed approach, using both synthetically generated datasets and real network traces collected from a dedicated block of unused IP addresses.
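The heart of the approach, a non-parametric CUSUM, accumulates deviations of a traffic statistic above its recent sliding-window mean and raises an alert when the sum crosses a threshold. A minimal sketch with a fixed threshold, whereas the dissertation makes the thresholds dynamic:

```python
from collections import deque

def cusum_alerts(series, window=50, drift=0.5, threshold=8.0):
    """Non-parametric CUSUM over a traffic statistic (e.g. packets per
    minute arriving at unused addresses). Deviations above the sliding
    window mean, less a drift allowance, accumulate in s; crossing the
    threshold flags an abrupt change and resets the statistic."""
    recent = deque(maxlen=window)
    s, alerts = 0.0, []
    for t, x in enumerate(series):
        mean = sum(recent) / len(recent) if recent else x
        s = max(0.0, s + (x - mean - drift))
        if s > threshold:
            alerts.append(t)   # change detected at time t
            s = 0.0
        recent.append(x)
    return alerts

# Quiet background traffic followed by a sudden scanning burst.
trace = [1, 2, 1, 2, 1, 2, 1, 2] * 5 + [9, 10, 11, 12, 10, 11]
print(cusum_alerts(trace))  # alert shortly after the burst begins
```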

Relevance: 10.00%

Abstract:

Discusses the contentious issues surrounding computer software patents and patenting in connection with the Peer-to-Patent Australia project, a joint initiative of Queensland University of Technology (QUT) and New York Law School (NYLS) that operates with the support and endorsement of IP Australia, the government body housing Australia's patent office. Explains that the project is based on the successful Peer-to-Patent pilots recently run in the USA and Japan, which are designed to improve the quality of issued patents and the patent examination process by facilitating community participation in that process. Describes how members of the public may put forward prior art references that will be considered by IP Australia's patent examiners when determining whether participating applications are novel and inventive, and therefore deserving of a patent. Concludes that, while Peer-to-Patent Australia is not a complete solution to the problems besetting patent law, the model has considerable advantages over the traditional model of patent examination.
