49 results for Traffic engineering computing


Relevance: 30.00%

Abstract:

Traffic classification is an essential tool for network and system security in complex environments such as cloud computing. State-of-the-art traffic classification methods exploit flow statistical features and machine learning techniques, but their performance suffers when supervised information is limited and unknown applications are present. To achieve effective network traffic classification, we propose a new method that tackles the problem of unknown applications in the critical situation of a small supervised training set. The proposed method can detect unknown flows generated by unknown applications and exploit the correlation information among real-world network traffic to boost classification performance. A theoretical analysis is provided to confirm the performance benefit of the proposed method. Moreover, a comprehensive performance evaluation on two real-world network traffic datasets shows that the proposed scheme outperforms existing methods in this critical network environment.
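
A minimal sketch of the general idea (not the authors' exact algorithm): cluster labelled and unlabelled flows together on their statistical features, map each cluster to a known application through the labelled flows it contains, and flag clusters with no labelled member as traffic from unknown applications. The feature dimensions, cluster count and toy data below are illustrative assumptions.

```python
# Illustrative sketch only: semi-supervised detection of unknown-application flows
# by clustering flow statistics (not the exact method proposed in the paper).
import numpy as np
from sklearn.cluster import KMeans

def classify_with_unknown(X_labeled, y_labeled, X_unlabeled, n_clusters=20, seed=0):
    """Cluster all flows together; clusters containing no labelled flow are
    treated as traffic generated by unknown applications."""
    X_all = np.vstack([X_labeled, X_unlabeled])
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(X_all)

    labeled_clusters = km.labels_[: len(X_labeled)]
    unlabeled_clusters = km.labels_[len(X_labeled):]

    # Majority vote of labelled flows inside each cluster -> application label.
    cluster_to_app = {}
    for c in np.unique(labeled_clusters):
        members = y_labeled[labeled_clusters == c]
        cluster_to_app[c] = np.bincount(members).argmax()

    # Unlabelled flows falling in clusters with no labelled member are "unknown" (-1).
    return np.array([cluster_to_app.get(c, -1) for c in unlabeled_clusters])

# Example usage with random flow-statistics features (packet sizes, durations, ...):
rng = np.random.default_rng(0)
X_lab, y_lab = rng.normal(size=(100, 8)), rng.integers(0, 3, size=100)
X_unl = rng.normal(size=(500, 8))
predictions = classify_with_unknown(X_lab, y_lab, X_unl)
```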

Relevance: 30.00%

Abstract:

A soft computing framework to classify and optimize text-based information extracted from customers' product reviews is proposed in this paper. The framework performs classification and optimization in two stages. Given a set of keywords extracted from unstructured text-based product reviews, a Support Vector Machine (SVM) classifies the reviews into two categories (positive and negative) in the first stage. An ensemble of evolutionary algorithms performs optimization in the second stage. Specifically, the Modified micro Genetic Algorithm (MmGA) optimizer is applied to maximize classification accuracy and minimize the number of keywords used in classification. Two Amazon product review databases are employed to evaluate the effectiveness of the SVM classifier and the ensemble of MmGA optimizers in classifying and optimizing product-related keywords. The results are analyzed and compared with those published in the literature. The outputs potentially serve as a list of impression words that captures useful information from the customers' viewpoint. These impression words can be further leveraged for product design and improvement activities in accordance with the Kansei engineering methodology.
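
For illustration, a minimal sketch of the two-stage idea under assumed toy data: a linear SVM classifies reviews represented by keyword counts, and a fitness function of the kind an MmGA-style optimizer would maximize trades classification accuracy against the number of keywords kept. The reviews, weights and vocabulary here are placeholders, not the paper's setup, and no actual MmGA is implemented.

```python
# Sketch of the two-stage idea: SVM sentiment classification over keyword features,
# plus the kind of fitness an MmGA-style optimizer would maximize (illustrative only).
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

reviews = ["great battery life", "poor build quality", "excellent screen", "terrible support"]
labels = np.array([1, 0, 1, 0])          # 1 = positive, 0 = negative (toy data)

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(reviews).toarray()
n_keywords = X.shape[1]

def fitness(mask, w_acc=0.8, w_size=0.2):
    """Higher is better: accurate classification using few keywords.
    `mask` is a binary vector selecting which keywords the SVM may use."""
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(LinearSVC(), X[:, mask.astype(bool)], labels, cv=2).mean()
    return w_acc * acc + w_size * (1 - mask.sum() / n_keywords)

# A real MmGA would evolve `mask`; here we simply score one random candidate.
candidate = np.random.default_rng(0).integers(0, 2, size=n_keywords)
print(fitness(candidate))
```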

Relevance: 30.00%

Abstract:

The thesis addresses a number of critical problems in fully automating the process of network traffic classification and protocol identification. Several effective solutions based on statistical analysis and machine learning techniques are proposed, which significantly reduce the need for human intervention in network traffic classification systems.

Relevance: 30.00%

Abstract:

The trust problem in Software as a Service (SaaS) cloud computing covers a broad range of a Data Owner's concerns about data held in the Cloud. These concerns arise from the way the data is handled in locations and on machines that are unknown to the Data Owner.

Relevance: 30.00%

Abstract:

This thesis presents a number of applications of symbolic computing to the study of differential equations. In particular, three packages have been produced for the computer algebra system MAPLE and used to find a variety of symmetries (and corresponding invariant solutions) for a range of differential systems.

Relevance: 30.00%

Abstract:

This research focused on building Software as a Service clouds to support mammalian genomic applications such as personalized medicine. Outcomes of this research include a Software as a Service cloud framework, the Uncinus research cloud, and novel genomic analysis software. Results have been published in high-ranking peer-reviewed international journals.

Relevance: 30.00%

Abstract:

This paper investigates architectural design potentials of a responsive material system combined with physical computing. Contemporary architects and designers are seeking to integrate physical computing into responsive architectural designs; however, they have largely borrowed mechanical devices and components from engineering technology. There is an opportunity to investigate an unexplored design approach that exploits the responsive capacity of material properties as an alternative to the current focus on mechanical components and discrete sensing devices. This opportunity creates a different design paradigm for responsive architecture, one that integrates physical computing and responsive materials as a single material system. Instead of adopting highly intricate and expensive materials, this approach is explored through accessible, off-the-shelf materials that form a responsive material system called Lumina. Lumina is implemented as an architectural installation called Cloud, which serves as a morphing architectural skin. Cloud is a proof of concept that embodies a responsive material system with physical computing to create a reciprocal and luminous architectural intervention for a selected dark corridor. It represents a different design paradigm for responsive architecture through an alternative exploitation of contemporary materials and parametric design tools. © 2014, The Association for Computer-Aided Architectural Design Research in Asia (CAADRIA), Hong Kong.

Relevance: 30.00%

Abstract:

Cloud-based service computing has started to change how research in science, in particular biology, medicine, and engineering, is carried out. Researchers in mammalian genomics have taken advantage of cloud computing technology to cost-effectively process large amounts of data and speed up discovery. Mammalian genomics is limited by the cost and complexity of analysis, which requires large amounts of computational resources to analyse huge amounts of data and biology specialists to interpret the results. On the other hand, applying this technology requires computing knowledge, in particular programming and operations management skills, to develop high performance computing (HPC) applications and deploy them on HPC clouds. We carried out a survey of cloud-based service computing solutions, as the most recent and promising instantiations of distributed computing systems, in the context of their use in mammalian genomic analysis research. We describe our most recent research and development effort, which focuses on building Software as a Service (SaaS) clouds to simplify the use of HPC clouds for carrying out mammalian genomic analysis.

Relevance: 30.00%

Abstract:

Statistics-based Internet traffic classification using machine learning techniques has attracted extensive research interest lately, because of the increasing ineffectiveness of traditional port-based and payload-based approaches. In particular, unsupervised learning, that is, traffic clustering, is very important in real-life applications, where labeled training data are difficult to obtain and new patterns keep emerging. Although previous studies have applied some classic clustering algorithms such as K-Means and EM for the task, the quality of resultant traffic clusters was far from satisfactory. In order to improve the accuracy of traffic clustering, we propose a constrained clustering scheme that makes decisions with consideration of some background information in addition to the observed traffic statistics. Specifically, we make use of equivalence set constraints indicating that particular sets of flows are using the same application layer protocols, which can be efficiently inferred from packet headers according to the background knowledge of TCP/IP networking. We model the observed data and constraints using Gaussian mixture density and adapt an approximate algorithm for the maximum likelihood estimation of model parameters. Moreover, we study the effects of unsupervised feature discretization on traffic clustering by using a fundamental binning method. A number of real-world Internet traffic traces have been used in our evaluation, and the results show that the proposed approach not only improves the quality of traffic clusters in terms of overall accuracy and per-class metrics, but also speeds up the convergence.
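
As a rough illustration of the constraint idea (a simplification of the paper's constrained mixture model): flows sharing a destination IP, destination port and transport protocol are placed in the same equivalence set, a Gaussian mixture is fitted to the flow statistics, and every flow in an equivalence set is then forced to the component most of its set chose. The field names and the post-hoc majority step are assumptions for illustration; the paper incorporates the constraints into the likelihood itself.

```python
# Illustrative simplification: equivalence-set constraints applied after a plain
# Gaussian mixture fit, rather than inside the likelihood as in the paper.
import numpy as np
from collections import defaultdict
from sklearn.mixture import GaussianMixture

def equivalence_sets(flows):
    """Flows sharing (dst_ip, dst_port, protocol) are assumed to carry the same
    application-layer protocol (background TCP/IP knowledge)."""
    groups = defaultdict(list)
    for i, f in enumerate(flows):
        groups[(f["dst_ip"], f["dst_port"], f["proto"])].append(i)
    return list(groups.values())

def constrained_cluster(X, flows, n_components=10, seed=0):
    """Cluster flow feature vectors X, then enforce each equivalence set to the
    mixture component chosen by the majority of its members."""
    gm = GaussianMixture(n_components=n_components, random_state=seed).fit(X)
    labels = gm.predict(X)
    for members in equivalence_sets(flows):
        majority = np.bincount(labels[members]).argmax()
        labels[members] = majority
    return labels
```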

Relevance: 30.00%

Abstract:

To alleviate traffic congestion and reduce the complexity of traffic control and management, it is necessary to divide a city into traffic sub-areas, which should be effective for traffic planning. Some researchers have applied the K-Means algorithm to divide traffic sub-areas from taxi trajectories. However, traditional K-Means algorithms have difficulty processing large-scale Global Positioning System (GPS) trajectories of taxicabs because of restrictions on memory, I/O and computing performance. This paper proposes a Parallel Traffic Sub-Areas Division (PTSD) method, based on the Parallel K-Means (PKM) algorithm, which consists of two stages. In the first stage, we cluster traffic sub-areas using the PKM algorithm; in the second stage, we identify the boundaries of the traffic sub-areas from the clustering result. Using this method, we divide the traffic sub-areas of Beijing based on real-world GPS trajectories of taxicabs. The experiment and discussion show that the method is effective in dividing traffic sub-areas.
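
A minimal sketch of the map/reduce pattern behind a parallel K-Means of this kind (not the PTSD implementation itself): each worker computes partial centroid sums over its chunk of GPS points, and a reduce step merges the partial results into new centroids. The chunking, worker count and toy coordinate box are illustrative assumptions.

```python
# Sketch of one map/reduce-style parallel K-Means iteration over GPS points
# (latitude, longitude); illustrative of the PKM idea, not the paper's code.
import numpy as np
from multiprocessing import Pool

def partial_sums(args):
    """Map step: assign a chunk of points to the nearest centroid and
    return per-centroid coordinate sums and counts."""
    chunk, centroids = args
    dists = np.linalg.norm(chunk[:, None, :] - centroids[None, :, :], axis=2)
    assign = dists.argmin(axis=1)
    k = len(centroids)
    sums = np.zeros_like(centroids)
    counts = np.zeros(k)
    for c in range(k):
        mask = assign == c
        sums[c] = chunk[mask].sum(axis=0)
        counts[c] = mask.sum()
    return sums, counts

def parallel_kmeans_step(points, centroids, n_workers=4):
    """Reduce step: merge partial sums from all workers into new centroids."""
    chunks = np.array_split(points, n_workers)
    with Pool(n_workers) as pool:
        results = pool.map(partial_sums, [(c, centroids) for c in chunks])
    total_sums = sum(r[0] for r in results)
    total_counts = sum(r[1] for r in results)
    return total_sums / np.maximum(total_counts, 1)[:, None]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gps = rng.uniform([39.8, 116.2], [40.1, 116.6], size=(100_000, 2))  # toy box of points
    centroids = gps[rng.choice(len(gps), 8, replace=False)]
    for _ in range(10):                      # a few iterations for illustration
        centroids = parallel_kmeans_step(gps, centroids)
```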

Relevance: 30.00%

Abstract:

At present, companies and standards organizations are enhancing Ethernet as the unified switch fabric for all of the TCP/IP traffic, storage traffic and high performance computing traffic in data centers. Backward congestion notification (BCN) is the basic mechanism for the end-to-end congestion management enhancement of Ethernet. To fulfill the special requirements of the unified switch fabric, i.e., losslessness and low transmission delay, BCN should hold the buffer occupancy tightly around a target point. Thus, the stability of the control loop and the buffer size are critical to BCN. Currently, the impact of delay on the performance of BCN is not well understood. When the speed of Ethernet increases to 40 Gbps or 100 Gbps in the near future, the number of in-flight packets will become of the same order as the buffer size of a switch, and the impact of delay will become significant. In this paper, we analyze BCN, paying special attention to delay. We model the BCN system with a set of segmented delayed differential equations and then deduce a sufficient condition for the uniformly asymptotic stability of BCN. Subsequently, the bounds on buffer occupancy are estimated, which provides direct guidelines on setting the buffer size. Finally, numerical analysis and experiments on the NetFPGA platform verify our theoretical analysis.
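
For orientation only, a generic delayed fluid model of the kind such an analysis builds on (an assumed form; the paper's actual segmented delayed differential equations and parameters are not reproduced here): the buffer occupancy integrates the delayed source rates against the link capacity, and each source adapts its rate on delayed feedback computed around the target point.

```latex
% Illustrative delayed fluid model (assumed form, not the paper's exact equations).
\begin{aligned}
\dot{q}(t) &= \sum_{i=1}^{N} r_i\!\left(t-\tau_i^{f}\right) - C,
  && \text{buffer occupancy vs.\ link capacity } C,\\
F_b(t) &= -\bigl(q(t)-Q_{\mathrm{eq}}\bigr) - w\,\dot{q}(t),
  && \text{feedback around the target point } Q_{\mathrm{eq}},\\
\dot{r}_i(t) &= G\, r_i(t)\, F_b\!\left(t-\tau_i^{b}\right),
  && \text{delayed rate adaptation at source } i.
\end{aligned}
```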

Relevance: 30.00%

Abstract:

Cloud services to smart things face latency and intermittent-connectivity issues. Fog devices are positioned between the cloud and smart devices; their high-speed Internet connection to the cloud and physical proximity to users enable real-time applications, location-based services and mobility support. Cisco promoted the fog computing concept in the areas of smart grid, connected vehicles, and wireless sensor and actuator networks. This survey article expands the concept to decentralized smart building control, recognizes cloudlets as a special case of fog computing, and relates it to software defined network (SDN) scenarios. Our literature review identifies only a handful of articles. Cooperative data scheduling and adaptive traffic light problems in SDN-based vehicular networks, and demand response management in macro-station and micro-grid based smart grids, are discussed. Security, privacy and trust issues, control information overhead and network control policies do not seem to have been studied so far within the fog computing concept.

Relevance: 30.00%

Abstract:

Urban traffic, one of the most important challenges of modern city life, needs practically effective and efficient solutions. Artificial intelligence methods have gained popularity for optimal traffic light control. In this paper, a review of the most important work in the field of traffic signal timing control is presented, in particular studies focusing on Q-learning, neural networks, and fuzzy logic systems. According to the existing literature, the intelligent methods show higher performance than traditional control methods. However, a study that compares the performance of these different learning methods has not yet been published. In this paper, the aforementioned computational intelligence methods and a fixed-time method are implemented to set signal timings and minimize total delay for an isolated intersection. The methods are developed and compared on the same platform. The intersection is treated as an intelligent agent that learns to propose an appropriate green time for each phase, estimated from the received traffic information. A comprehensive comparison is made between the performance of the Q-learning, neural network, and fuzzy logic system controllers in two different scenarios. The three intelligent controllers show similar performance across multiple replications in both scenarios. On average, Q-learning achieves 66%, the neural network 71%, and the fuzzy logic system 74% higher performance than the fixed-time controller.
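
To make the Q-learning variant concrete, a minimal tabular sketch under assumed encodings: the state is a discretization of queue lengths on the approaches, the actions are candidate green times, and the reward is the negative total delay. The bin size, learning parameters and candidate durations are placeholders, not values from the reviewed studies.

```python
# Minimal tabular Q-learning sketch for choosing a green time at one intersection.
# State = discretized queue lengths; reward = negative total delay (illustrative).
import numpy as np
from collections import defaultdict

GREEN_TIMES = [10, 20, 30, 40]            # candidate green durations (s)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1     # learning rate, discount, exploration

Q = defaultdict(lambda: np.zeros(len(GREEN_TIMES)))
rng = np.random.default_rng(0)

def discretize(queues, bin_size=5):
    """Map raw queue lengths (vehicles) to a coarse state tuple."""
    return tuple(int(q // bin_size) for q in queues)

def choose_action(state):
    if rng.random() < EPSILON:                       # explore
        return int(rng.integers(len(GREEN_TIMES)))
    return int(np.argmax(Q[state]))                  # exploit

def update(state, action, reward, next_state):
    """Standard Q-learning update toward the bootstrapped target."""
    target = reward + GAMMA * np.max(Q[next_state])
    Q[state][action] += ALPHA * (target - Q[state][action])

# One illustrative interaction step (a traffic simulator would supply these values):
state = discretize([12, 3])
action = choose_action(state)
green_time = GREEN_TIMES[action]                      # duration applied to the phase
reward = -37.5                                        # e.g. minus the measured delay (s)
next_state = discretize([6, 8])
update(state, action, reward, next_state)
```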