825 results for network traffic analysis


Relevance: 90.00%

Abstract:

Ethernet is a key component of the standards used for digital process buses in transmission substations, namely IEC 61850 and IEEE Std 1588-2008 (PTPv2). These standards use multicast Ethernet frames that can be processed by more than one device. This presents significant engineering challenges when implementing a sampled value process bus because of the large volume of network traffic. A system of network traffic segregation using a combination of Virtual LAN (VLAN) and multicast address filtering with managed Ethernet switches is presented. This includes VLAN prioritisation of traffic classes such as the IEC 61850 protocols GOOSE, MMS and sampled values (SV), and other protocols like PTPv2. Multicast address filtering is used to limit SV/GOOSE traffic to defined subsets of subscribers. A method to map substation plant reference designations to multicast address ranges is proposed that enables engineers to determine the type of traffic and the location of its source by inspecting the destination address. This method and the proposed filtering strategy simplify future changes to the prioritisation of network traffic, and are applicable to both process bus and station bus applications.
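
As an illustration of the address-mapping idea, the sketch below packs a traffic type and a numeric bay designation into an IEC 61850-style multicast destination MAC. The 01-0C-CD-01/04 prefixes follow the commonly recommended GOOSE and SV ranges, but the way the designation is encoded in the last two octets is an assumption for illustration, not the paper's actual scheme.

    # Illustrative sketch: encode traffic type and a numeric plant/bay
    # designation into an IEC 61850-style multicast destination MAC address.
    # The 01-0C-CD-0x prefixes follow the commonly recommended IEC 61850
    # ranges (01 = GOOSE, 04 = sampled values); packing the bay number into
    # the last two octets is an assumption for illustration.

    TRAFFIC_TYPE_OCTET = {"GOOSE": 0x01, "SV": 0x04}

    def designation_to_multicast(traffic_type: str, bay_number: int) -> str:
        """Build a multicast MAC from a traffic type and a bay designation."""
        if traffic_type not in TRAFFIC_TYPE_OCTET:
            raise ValueError(f"unsupported traffic type: {traffic_type}")
        if not 0 <= bay_number <= 0xFFFF:
            raise ValueError("bay number must fit in two octets")
        octets = [0x01, 0x0C, 0xCD, TRAFFIC_TYPE_OCTET[traffic_type],
                  (bay_number >> 8) & 0xFF, bay_number & 0xFF]
        return "-".join(f"{o:02X}" for o in octets)

    def multicast_to_designation(mac: str) -> tuple[str, int]:
        """Recover traffic type and bay designation from a destination MAC."""
        octets = [int(p, 16) for p in mac.split("-")]
        reverse = {v: k for k, v in TRAFFIC_TYPE_OCTET.items()}
        return reverse[octets[3]], (octets[4] << 8) | octets[5]

    if __name__ == "__main__":
        mac = designation_to_multicast("SV", 0x0012)   # e.g. bay 18
        print(mac)                                     # 01-0C-CD-04-00-12
        print(multicast_to_designation(mac))           # ('SV', 18)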

Relevance: 90.00%

Abstract:

Data preprocessing is widely recognized as an important stage in anomaly detection. This paper reviews the data preprocessing techniques used by anomaly-based network intrusion detection systems (NIDS), concentrating on which aspects of the network traffic are analyzed, and what feature construction and selection methods have been used. Motivation for the paper comes from the large impact data preprocessing has on the accuracy and capability of anomaly-based NIDS. The review finds that many NIDS limit their view of network traffic to the TCP/IP packet headers. Time-based statistics can be derived from these headers to detect network scans, network worm behavior, and denial-of-service attacks. A number of other NIDS perform deeper inspection of request packets to detect attacks against network services and network applications. More recent approaches analyze full service responses to detect attacks targeting clients. The review covers a wide range of NIDS, highlighting which classes of attack are detectable by each of these approaches. Data preprocessing is found to predominantly rely on expert domain knowledge for identifying the most relevant parts of network traffic and for constructing the initial candidate set of traffic features. On the other hand, automated methods have been widely used for feature extraction to reduce data dimensionality, and feature selection to find the most relevant subset of features from this candidate set. The review shows a trend toward deeper packet inspection to construct more relevant features through targeted content parsing. These context-sensitive features are required to detect current attacks.
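
As a concrete example of the time-based header statistics mentioned above, the following minimal sketch counts the distinct destination ports a source contacts within a sliding window, a feature commonly used to flag scanning behaviour; the window length and field names are illustrative assumptions.

    # Minimal sketch: derive a time-windowed header statistic often used by
    # anomaly-based NIDS -- the number of distinct destination ports a source
    # contacts within the last W seconds. Field names and the window length
    # are illustrative assumptions, not taken from the reviewed systems.
    from collections import defaultdict, deque

    WINDOW = 5.0  # seconds

    class PortScanFeature:
        def __init__(self, window: float = WINDOW):
            self.window = window
            self.events = defaultdict(deque)  # src_ip -> deque of (time, dst_port)

        def update(self, timestamp: float, src_ip: str, dst_port: int) -> int:
            """Record a packet header and return the current feature value."""
            q = self.events[src_ip]
            q.append((timestamp, dst_port))
            while q and timestamp - q[0][0] > self.window:
                q.popleft()
            return len({port for _, port in q})  # distinct ports in window

    if __name__ == "__main__":
        feat = PortScanFeature()
        for t, port in enumerate(range(20, 60)):          # synthetic "scan"
            distinct = feat.update(float(t) * 0.1, "10.0.0.5", port)
        print("distinct ports in window:", distinct)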

Relevance: 90.00%

Abstract:

Many existing schemes for malware detection are signature-based. Although they can effectively detect known malware, they cannot detect variants of known malware or new malware. Most network server applications, such as online shopping malls, Picasa, YouTube and Blogger, do not expect executable code in their inbound network traffic. Therefore, such network applications can be protected from malware infection by monitoring their ports to see if incoming packets contain any executable content. This paper proposes a content-classification scheme that identifies executable content in incoming packets. The proposed scheme analyzes the packet payload in two steps. It first analyzes the payload to determine whether it contains multimedia-type data. If not, it classifies the payload as either text-type or executable. Although in our experiments the proposed scheme shows low rates of false negatives and false positives (4.69% and 2.53%, respectively), the remaining inaccuracies still require further inspection to efficiently detect the occurrence of malware. In this paper, we also propose a simple statistical and combinatorial analysis to deal with false positives and negatives.
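
The following sketch illustrates the general two-step idea under stated assumptions: a first pass checks for common multimedia signatures, and a second pass separates text-like payloads from likely executable content. The signature list and the printable-byte heuristic are placeholders, not the paper's actual classifier.

    # Illustrative two-step payload classifier: first check for common
    # multimedia signatures, then separate text-like payloads from likely
    # executable content. The signature list and the printable-byte
    # heuristic are assumptions, not the scheme proposed in the paper.

    MULTIMEDIA_MAGIC = {
        b"\xFF\xD8\xFF": "jpeg",
        b"\x89PNG\r\n\x1a\n": "png",
        b"GIF8": "gif",
        b"ID3": "mp3",
    }
    EXECUTABLE_MAGIC = (b"MZ", b"\x7fELF")  # PE and ELF headers

    def classify_payload(payload: bytes) -> str:
        for magic in MULTIMEDIA_MAGIC:
            if payload.startswith(magic):
                return "multimedia"
        if payload.startswith(EXECUTABLE_MAGIC):
            return "executable"
        printable = sum(32 <= b < 127 or b in (9, 10, 13) for b in payload)
        return "text" if payload and printable / len(payload) > 0.95 else "executable"

    if __name__ == "__main__":
        print(classify_payload(b"\x89PNG\r\n\x1a\n...."))     # multimedia
        print(classify_payload(b"GET /index.html HTTP/1.1"))  # text
        print(classify_payload(b"MZ\x90\x00\x03\x00\x00"))    # executable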

Relevance: 90.00%

Abstract:

Objective: Effective management of multi-resistant organisms is an important issue for hospitals both in Australia and overseas. This study investigates the utility of Bayesian Network (BN) analysis for examining relationships between risk factors and colonization with Vancomycin Resistant Enterococcus (VRE). Design: Bayesian Network analysis was performed using infection control data collected over a period of 36 months (2008-2010). Setting: Princess Alexandra Hospital (PAH), Brisbane. Outcome of interest: Number of new VRE isolates. Methods: A BN is a probabilistic graphical model that represents a set of random variables and their conditional dependencies via a directed acyclic graph (DAG). A BN enables multiple interacting agents to be studied simultaneously. The initial BN model was constructed based on the infectious disease physician's expert knowledge and the current literature. Continuous variables were dichotomised using the third-quartile values of the 2008 data. The BN was used to examine the probabilistic relationships between VRE isolates and risk factors, and to establish which factors were associated with an increased probability of a high number of VRE isolates. Software: Netica (version 4.16). Results: Preliminary analysis revealed that VRE transmission and VRE prevalence were the most influential factors in predicting a high number of VRE isolates. Interestingly, several factors (hand hygiene and cleaning) known from the literature to be associated with VRE prevalence did not appear to be as influential as expected in this BN model. Conclusions: This preliminary work has shown that Bayesian Network analysis is a useful tool for examining clinical infection prevention issues, where there is often a web of factors that influence outcomes. This BN model can be restructured easily, enabling various combinations of agents to be studied.
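
A minimal sketch of the dichotomisation step described above, assuming a simple tabular layout with hypothetical column names: each continuous indicator is recoded as high/low relative to the third quartile of its 2008 values.

    # Sketch of the dichotomisation step: continuous monthly indicators are
    # cut at the third quartile of the 2008 baseline year. Column names are
    # hypothetical and the toy data are for illustration only.
    import pandas as pd

    def dichotomise_by_2008_q3(df: pd.DataFrame, columns: list[str]) -> pd.DataFrame:
        """Return a copy with each column recoded as 'high'/'low' relative to
        the third quartile of that column's 2008 values."""
        out = df.copy()
        baseline = df[df["year"] == 2008]
        for col in columns:
            q3 = baseline[col].quantile(0.75)
            out[col] = (df[col] > q3).map({True: "high", False: "low"})
        return out

    if __name__ == "__main__":
        data = pd.DataFrame({
            "year": [2008, 2008, 2009, 2010],
            "vre_isolates": [3, 7, 9, 2],
            "hand_hygiene_rate": [0.60, 0.72, 0.65, 0.80],
        })
        print(dichotomise_by_2008_q3(data, ["vre_isolates", "hand_hygiene_rate"]))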

Relevance: 90.00%

Abstract:

The Macroscopic Fundamental Diagram (MFD) relates space-mean density and flow, and its existence, together with its dynamic features, was confirmed in the congested urban network of downtown Yokohama using a real data set. Since the MFD represents area-wide network traffic performance, studies on perimeter control strategies and area traffic state estimation utilizing the MFD concept have been reported. However, limited work has been reported on real-world examples from signalised arterial networks. This paper fuses data from multiple sources (Bluetooth, Loops and Signals) and develops a framework for the development of the MFD for Brisbane, Australia. The existence of the MFD in the Brisbane arterial network is confirmed. Different MFDs (from the whole network and from several sub-regions) are evaluated to discover the spatial partitioning in network performance representation. The findings confirm the usefulness of appropriate network partitioning for traffic monitoring and incident detection. The discussion addresses future research directions.
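
For context, an MFD point is typically obtained by length-weighted averaging of link-level flow and density over one time interval. The sketch below shows that aggregation; the input layout and weighting are assumptions, not the paper's data-fusion framework.

    # Minimal sketch: aggregate per-link flow (veh/h) and density (veh/km)
    # into one network-level MFD point per time interval, weighting each
    # link by its length. Input layout and field names are assumptions.
    from dataclasses import dataclass

    @dataclass
    class LinkObservation:
        link_length_km: float
        flow_veh_h: float
        density_veh_km: float

    def mfd_point(observations: list[LinkObservation]) -> tuple[float, float]:
        """Return (space-mean density, space-mean flow) for one interval."""
        total_length = sum(o.link_length_km for o in observations)
        density = sum(o.density_veh_km * o.link_length_km for o in observations) / total_length
        flow = sum(o.flow_veh_h * o.link_length_km for o in observations) / total_length
        return density, flow

    if __name__ == "__main__":
        interval = [
            LinkObservation(0.5, 900.0, 25.0),
            LinkObservation(1.2, 600.0, 40.0),
            LinkObservation(0.8, 300.0, 70.0),
        ]
        print(mfd_point(interval))  # one (density, flow) point on the MFD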

Relevance: 90.00%

Abstract:

Obtaining attribute values of non-chosen alternatives in a revealed preference context is challenging because non-chosen alternative attributes are unobserved by choosers, chooser perceptions of attribute values may not reflect reality, existing methods for imputing these values suffer from shortcomings, and obtaining non-chosen attribute values is resource intensive. This paper presents a unique Bayesian (multiple) Imputation Multinomial Logit model that imputes unobserved travel times and distances of non-chosen travel modes based on random draws from the conditional posterior distribution of missing values. The calibrated Bayesian (multiple) Imputation Multinomial Logit model imputes non-chosen time and distance values that convincingly replicate observed choice behavior. Although network skims were used for calibration, more realistic data such as supplemental geographically referenced surveys or stated preference data may be preferred. The model is ideally suited for imputing variation in intrazonal non-chosen mode attributes and for assessing the marginal impacts of travel policies, programs, or prices within traffic analysis zones.
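
To make the imputation idea concrete, the sketch below draws an unobserved travel time for a non-chosen mode from an assumed distribution and averages multinomial logit choice probabilities over the draws. The lognormal prior and the utility coefficient are placeholders, not the paper's calibrated posterior.

    # Conceptual sketch of the imputation idea: draw an unobserved travel time
    # for a non-chosen mode from an assumed conditional distribution, then
    # evaluate multinomial logit probabilities with the completed data.
    # The lognormal draw and the utility coefficient are placeholders.
    import numpy as np

    rng = np.random.default_rng(0)

    def mnl_probabilities(times: dict[str, float], beta_time: float = -0.08) -> dict[str, float]:
        """Multinomial logit probabilities with utility V_j = beta * time_j."""
        exp_u = {m: np.exp(beta_time * t) for m, t in times.items()}
        denom = sum(exp_u.values())
        return {m: e / denom for m, e in exp_u.items()}

    def impute_and_evaluate(observed: dict[str, float], missing_mode: str,
                            n_draws: int = 100) -> dict[str, float]:
        """Average choice probabilities over draws of the missing travel time."""
        accum = {m: 0.0 for m in list(observed) + [missing_mode]}
        for _ in range(n_draws):
            draw = rng.lognormal(mean=np.log(30.0), sigma=0.3)  # placeholder prior
            probs = mnl_probabilities({**observed, missing_mode: draw})
            for m, p in probs.items():
                accum[m] += p / n_draws
        return accum

    if __name__ == "__main__":
        print(impute_and_evaluate({"car": 20.0, "bus": 35.0}, missing_mode="walk"))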

Relevance: 90.00%

Abstract:

The Macroscopic Fundamental Diagram (MFD) relates space-mean density and flow, and its existence, together with its dynamic features, was confirmed in the congested urban network of downtown Yokohama using a real data set. Since the MFD represents area-wide network traffic performance, studies on perimeter control strategies and area traffic state estimation utilizing the MFD concept have been reported. However, limited work has been reported on real-world examples from signalised arterial networks. This paper fuses data from multiple sources (Bluetooth, Loops and Signals) and presents a framework for the development of the MFD for Brisbane, Australia. The existence of the MFD in the Brisbane arterial network is confirmed. Different MFDs (from the whole network and from several sub-regions) are evaluated to discover the spatial partitioning for network performance representation. The findings confirm the usefulness of appropriate network partitioning for traffic monitoring and incident detection. The discussion addresses future research directions.

Relevance: 90.00%

Abstract:

This thesis explored traffic characteristics at the aggregate level for area-wide traffic monitoring of a large urban area. It focused on three aspects: understanding macroscopic network performance under real-time traffic information provision, measuring the traffic performance of a signalised arterial network using available data sets, and discussing network zoning for monitoring purposes in the case of Brisbane, Australia. This work presented the use of probe vehicle data for estimating traffic state variables, and illustrated dynamic features of the regional traffic performance of Brisbane. The results confirmed the viability and effectiveness of area-wide traffic monitoring.
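
As a small example of a probe-based traffic state estimate, the sketch below computes the space-mean speed of a region as total probe distance divided by total probe travel time; the data layout is an assumption.

    # Small sketch of one probe-vehicle traffic state estimate used in
    # area-wide monitoring: the space-mean speed of a region over an
    # interval, computed as total distance travelled divided by total
    # travel time of the probes observed there. Data layout is assumed.

    def space_mean_speed(probe_records: list[tuple[float, float]]) -> float:
        """probe_records: (distance_km, travel_time_h) per probe trajectory."""
        total_distance = sum(d for d, _ in probe_records)
        total_time = sum(t for _, t in probe_records)
        return total_distance / total_time  # km/h

    if __name__ == "__main__":
        probes = [(1.2, 0.05), (0.8, 0.04), (2.0, 0.10)]
        print(f"space-mean speed: {space_mean_speed(probes):.1f} km/h")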

Relevance: 90.00%

Abstract:

This research quantifies traffic congestion and travel time reliability with a case study of a major arterial road in Brisbane. The focus is on analysing the impact of incidents (e.g., road accidents) on travel time reliability. Real traffic (Bluetooth) and incident records from Coronation Drive, Brisbane are utilized for the study. The findings show a significant impact of incidents on traffic congestion and travel time reliability. The knowledge gained is useful in various applications such as traveler information systems and cost-benefit analysis of strategies to reduce traffic incidents and their impacts.
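
For illustration, the sketch below computes two standard travel time reliability measures (planning time index and buffer index) from matched travel time samples; the toy values and the incident split are illustrative, not the Coronation Drive data.

    # Sketch of common travel time reliability measures computed from matched
    # Bluetooth travel times; the metric definitions are standard, the sample
    # values and the incident split are illustrative only.
    import numpy as np

    def reliability_metrics(travel_times_min: list[float]) -> dict[str, float]:
        tt = np.asarray(travel_times_min, dtype=float)
        mean, p95 = tt.mean(), np.percentile(tt, 95)
        free_flow = np.percentile(tt, 5)  # proxy for free-flow travel time
        return {
            "mean": mean,
            "planning_time_index": p95 / free_flow,
            "buffer_index": (p95 - mean) / mean,
        }

    if __name__ == "__main__":
        no_incident = [8.1, 8.4, 8.0, 8.9, 9.2, 8.3, 8.6]
        with_incident = [8.2, 9.5, 14.8, 17.3, 12.1, 9.0, 8.7]
        print("no incident:  ", reliability_metrics(no_incident))
        print("with incident:", reliability_metrics(with_incident))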

Relevance: 90.00%

Abstract:

In transport networks, Origin-Destination matrices (ODM) are classically estimated from road traffic counts, whereas recent technologies also grant access to sample car trajectories. One example is the deployment in cities of Bluetooth scanners that measure the trajectories of Bluetooth-equipped cars. Exploiting such sample trajectory information, the classical ODM estimation problem is extended here into a link-dependent ODM (LODM) one. This much larger estimation problem is formulated in variational form as an inverse problem. We develop a convex optimization algorithm that incorporates network constraints and study the results of the proposed algorithm on simulated network traffic.
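
A much simplified version of the underlying inverse problem can be written as non-negative least squares on link counts, as sketched below; this omits the link-dependent (LODM) extension and the trajectory information, and the assignment matrix and counts are toy values.

    # Simplified sketch of the classical inverse problem behind ODM
    # estimation: recover non-negative OD flows x from link counts c = A x,
    # where A maps OD pairs to the links their trips traverse. This is a
    # plain non-negative least-squares version, not the paper's
    # link-dependent (LODM) variational algorithm; A and c are toy values.
    import numpy as np
    from scipy.optimize import lsq_linear

    # 3 links, 2 OD pairs: entry A[i, j] = 1 if OD pair j uses link i.
    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 1.0]])
    c = np.array([120.0, 300.0, 180.0])   # observed link counts

    result = lsq_linear(A, c, bounds=(0.0, np.inf))
    print("estimated OD flows:", result.x)          # approx. [120, 180]
    print("reproduced link counts:", A @ result.x)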

Relevance: 90.00%

Abstract:

Analysing the engagement of students in university-based Facebook groups can shed light on the nature of their learning experience and highlight leverage points to build on student success. While post-semester surveys and demographic participation data can highlight who was involved and how they subsequently felt about the experience, these techniques do not necessarily reflect real-time engagement. One way to gain insight into in-situ student experiences is by categorising the original posts and comments into predetermined frameworks of learning. This paper offers a systematic method of coding Facebook contributions within various engagement categories: motivation, discourse, cognition and emotive responses.
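
Purely as an illustration of tallying contributions against the four categories named above, the sketch below applies simple keyword rules; the coding described in the paper is a manual, framework-driven process, and the keyword lists are invented for demonstration.

    # Purely illustrative helper: tally posts and comments against the four
    # engagement categories using simple keyword rules. The paper's coding
    # is a manual, framework-driven process; these keywords are assumptions
    # for demonstration only.
    from collections import Counter

    CATEGORY_KEYWORDS = {
        "motivation": ["goal", "deadline", "marks", "keen"],
        "discourse": ["agree", "reply", "what do you think", "discuss"],
        "cognition": ["because", "therefore", "example", "explain"],
        "emotive": ["stressed", "excited", "worried", "love"],
    }

    def code_post(text: str) -> list[str]:
        text = text.lower()
        return [cat for cat, words in CATEGORY_KEYWORDS.items()
                if any(w in text for w in words)] or ["uncoded"]

    if __name__ == "__main__":
        posts = ["Can someone explain question 2? I agree with Sam's reply.",
                 "So stressed about the deadline but keen to get good marks!"]
        tally = Counter(cat for p in posts for cat in code_post(p))
        print(tally)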

Relevance: 90.00%

Abstract:

In this paper, we present an improved load distribution strategy, for arbitrarily divisible processing loads, to minimize the processing time in a distributed linear network of communicating processors by an efficient utilization of their front-ends. Closed-form solutions are derived, with the processing load originating at the boundary and at the interior of the network, under some important conditions on the arrangement of processors and links in the network. Asymptotic analysis is carried out to explore the ultimate performance limits of such networks. Two important theorems are stated regarding the optimal load sequence and the optimal load origination point. Comparative study of this new strategy with an earlier strategy is also presented.
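
A numerical sketch of the with-front-end timing condition for a boundary-originated load on a linear chain is given below: each processor's computation time equals the time to forward the remaining load to its neighbour plus that neighbour's computation time, and the fractions sum to one. The speed parameters are illustrative; the paper's closed-form solutions and asymptotic results are not reproduced here.

    # Numerical sketch of the timing equations for a linear daisy chain with
    # front-ends and load originating at the boundary processor P1: every
    # processor stops computing at the same instant, which gives
    #   a_i * w_i * Tcp = (sum_{j>i} a_j) * z_i * Tcm + a_{i+1} * w_{i+1} * Tcp
    # together with sum_i a_i = 1. Parameter values are illustrative.
    import numpy as np

    def optimal_fractions(w, z, Tcp=1.0, Tcm=1.0):
        """Solve for load fractions a_1..a_m so all processors finish together."""
        m = len(w)
        A = np.zeros((m, m))
        b = np.zeros(m)
        for i in range(m - 1):
            A[i, i] = w[i] * Tcp
            A[i, i + 1] = -(w[i + 1] * Tcp) - z[i] * Tcm
            A[i, i + 2:] = -z[i] * Tcm      # remaining load also crosses link i
        A[m - 1, :] = 1.0                   # normalisation: fractions sum to one
        b[m - 1] = 1.0
        return np.linalg.solve(A, b)

    if __name__ == "__main__":
        w = [1.0, 1.0, 1.0, 1.0]   # inverse computing speeds
        z = [0.2, 0.2, 0.2]        # inverse link speeds between neighbours
        a = optimal_fractions(w, z)
        print("load fractions:", np.round(a, 4), "finish time:", a[0] * w[0])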

Relevance: 90.00%

Abstract:

Several replacement policies for web caches have been proposed and studied extensively in the literature. Different replacement policies perform better in terms of (i) the number of objects found in the cache (cache hits), (ii) the network traffic avoided by fetching the referenced object from the cache, or (iii) the savings in response time. In this paper, we propose a simple and efficient replacement policy (hereafter known as SE) which improves all three performance measures. Trace-driven simulations were performed to evaluate the performance of SE. We compare SE with two widely used and efficient replacement policies, namely the Least Recently Used (LRU) and Least Unified Value (LUV) algorithms. Our results show that SE performs at least as well as, if not better than, both these replacement policies. Unlike various other replacement policies proposed in the literature, our SE policy does not require parameter tuning or a priori trace analysis and has an efficient and simple implementation that can be incorporated in any existing proxy server or web server with ease.
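
Since the SE policy itself is not spelled out in the abstract, the sketch below shows only a minimal LRU baseline of the kind used in trace-driven comparisons; object sizes and values are ignored for simplicity.

    # Minimal LRU cache of the kind used as a baseline in trace-driven web
    # cache simulations. Object sizes are ignored for simplicity; this is a
    # comparison baseline, not the SE policy proposed in the paper.
    from collections import OrderedDict

    class LRUCache:
        def __init__(self, capacity: int):
            self.capacity = capacity
            self.store = OrderedDict()
            self.hits = self.requests = 0

        def access(self, key: str) -> bool:
            """Reference an object; return True on a cache hit."""
            self.requests += 1
            if key in self.store:
                self.store.move_to_end(key)          # mark as most recently used
                self.hits += 1
                return True
            if len(self.store) >= self.capacity:
                self.store.popitem(last=False)       # evict least recently used
            self.store[key] = True
            return False

    if __name__ == "__main__":
        cache = LRUCache(capacity=3)
        trace = ["a", "b", "c", "a", "d", "b", "a", "a", "e", "c"]
        for obj in trace:
            cache.access(obj)
        print(f"hit ratio: {cache.hits / cache.requests:.2f}")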

Relevance: 90.00%

Abstract:

Instruction reuse is a microarchitectural technique that improves the execution time of a program by removing redundant computations at run-time. Although this is the job of an optimizing compiler, compilers often do not succeed because of their limited knowledge of run-time data. In this paper we examine instruction reuse of integer ALU and load instructions in network processing applications. Specifically, this paper attempts to answer the following questions: (1) How much instruction reuse is inherent in network processing applications? (2) Can reuse be improved by reducing interference in the reuse buffer? (3) What characteristics of network applications can be exploited to improve reuse? (4) What is the effect of reuse on resource contention and memory accesses? We propose an aggregation scheme that combines a high-level concept of network traffic, i.e. "flows", with a low-level microarchitectural feature of programs, i.e. repetition of instructions and data, along with an architecture that exploits temporal locality in incoming packet data to improve reuse. We find that for the benchmarks considered, 1% to 50% of instructions are reused, while the speedup achieved varies between 1% and 24%. As a side effect, instruction reuse reduces memory traffic and can therefore be considered as a scheme for low power.
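
The sketch below illustrates the basic reuse-buffer idea under simplifying assumptions: results of integer operations are memoised on the program counter and operand values, so a repeated instruction with identical inputs can skip execution. The instruction format and direct-mapped indexing are assumptions, not the paper's architecture.

    # Conceptual sketch of a reuse buffer: results of integer operations are
    # memoised on (program counter, operation, operand values), so a repeated
    # instruction with identical inputs can skip execution. The instruction
    # format and the direct-mapped indexing are simplifying assumptions.

    class ReuseBuffer:
        def __init__(self, entries: int = 256):
            self.entries = entries
            self.table = {}            # index -> (tag, result)
            self.hits = self.lookups = 0

        def execute(self, pc: int, op: str, a: int, b: int) -> int:
            self.lookups += 1
            index = pc % self.entries            # direct-mapped on the PC
            tag = (pc, op, a, b)
            cached = self.table.get(index)
            if cached and cached[0] == tag:
                self.hits += 1                   # reuse: skip the ALU
                return cached[1]
            result = {"add": a + b, "sub": a - b, "and": a & b}[op]
            self.table[index] = (tag, result)
            return result

    if __name__ == "__main__":
        rb = ReuseBuffer()
        # The same header-processing instructions repeat across packets of a
        # flow, often with identical operand values.
        for _ in range(1000):
            rb.execute(pc=0x40, op="add", a=20, b=14)
            rb.execute(pc=0x44, op="and", a=0x1FFF, b=0x0)
        print(f"reuse rate: {rb.hits / rb.lookups:.2%}")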