87 results for EM algorithms


Relevance:

20.00%

Abstract:

Zero-day or unknown malware is created using code obfuscation techniques that modify the parent code to produce offspring copies with the same functionality but different signatures. Current techniques reported in the literature lack the capability to detect zero-day malware with the required accuracy and efficiency. In this paper, we propose and evaluate a novel method that employs several data mining techniques to detect and classify zero-day malware, with high accuracy and efficiency, based on the frequency of Windows API calls. This paper describes the methodology employed for the collection of large data sets to train the classifiers, and analyses the performance of the various data mining algorithms adopted for the study, using a fully automated tool developed in this research to conduct the experimental investigations and evaluation. Through the performance results of these algorithms, we evaluate and discuss the advantages of one data mining algorithm over another for accurately detecting zero-day malware. The data mining framework employed in this research learns by analysing the behavior of existing malicious and benign code in large datasets. We have employed robust classifiers, namely the Naïve Bayes (NB) algorithm, the k-Nearest Neighbor (kNN) algorithm, the Sequential Minimal Optimization (SMO) algorithm with four different kernels (Normalized PolyKernel, PolyKernel, Puk, and Radial Basis Function (RBF)), the Backpropagation Neural Network algorithm, and the J48 decision tree, and have evaluated their performance. Overall, the automated data mining system implemented for this study achieved a high true positive (TP) rate of more than 98.5% and a low false positive (FP) rate of less than 0.025, which has not previously been achieved in the literature. This is much higher than the required commercial acceptance level, indicating that our novel technique is a major leap forward in detecting zero-day malware. This paper also offers future directions for researchers exploring the different aspects of obfuscation affecting the IT world today.
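The frequency-of-Windows-API-calls representation, combined with a kNN classifier (one of the classifiers listed above), can be sketched as follows. The API names, traces, and labels below are illustrative inventions, not data from the paper:

```python
from collections import Counter
from math import dist

# Hypothetical API-call vocabulary -- illustrative only.
API_VOCAB = ["CreateFileW", "WriteFile", "RegSetValueW", "VirtualAlloc"]

def frequency_vector(trace):
    """Map a trace of Windows API calls to call-frequency counts over a fixed vocabulary."""
    counts = Counter(trace)
    return [counts[api] for api in API_VOCAB]

def knn_predict(train, labels, x, k=3):
    """Classify x by majority vote among its k nearest training vectors (Euclidean)."""
    ranked = sorted(range(len(train)), key=lambda i: dist(train[i], x))
    votes = Counter(labels[i] for i in ranked[:k])
    return votes.most_common(1)[0][0]

# Toy training set: the "malware" traces lean on registry/memory APIs.
train = [
    frequency_vector(["CreateFileW", "WriteFile", "WriteFile"]),
    frequency_vector(["CreateFileW", "CreateFileW", "WriteFile"]),
    frequency_vector(["RegSetValueW", "VirtualAlloc", "VirtualAlloc"]),
    frequency_vector(["VirtualAlloc", "RegSetValueW", "RegSetValueW"]),
]
labels = ["benign", "benign", "malware", "malware"]

sample = frequency_vector(["VirtualAlloc", "VirtualAlloc", "RegSetValueW"])
print(knn_predict(train, labels, sample))  # -> malware
```

The same frequency vectors would feed any of the other classifiers named in the abstract.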

Relevance:

20.00%

Abstract:

Satellite image processing is a complex task that has received considerable attention from many researchers. In this paper, an interactive image query system for satellite imagery searching and retrieval is proposed. As in most image retrieval systems, feature extraction is the most important step and has a great impact on retrieval performance. Thus, a new technique that fuses color and texture features for segmentation is introduced. The applicability of the proposed technique is assessed using a database containing multispectral satellite imagery. The experiments demonstrate that the proposed segmentation technique improves the quality of the segmentation results as well as the retrieval performance.
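A minimal sketch of fusing a colour feature with a texture feature for segmentation, assuming intensity as the colour cue and local variance as the texture cue (the paper's actual features and fusion rule are not specified here); the image and centroids are toy values:

```python
def local_variance(img, r, c):
    """Variance over the (clipped) 3x3 neighbourhood -- a crude texture cue."""
    vals = [img[i][j]
            for i in range(max(0, r - 1), min(len(img), r + 2))
            for j in range(max(0, c - 1), min(len(img[0]), c + 2))]
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

def fused_features(img):
    """Per-pixel fused feature: (colour/intensity, texture/local variance)."""
    return [[(img[r][c], local_variance(img, r, c))
             for c in range(len(img[0]))] for r in range(len(img))]

def segment(img, centroids):
    """Label each pixel with the index of the nearest centroid in fused feature space."""
    def nearest(f):
        return min(range(len(centroids)),
                   key=lambda k: sum((a - b) ** 2 for a, b in zip(f, centroids[k])))
    return [[nearest(f) for f in row] for row in fused_features(img)]

# Toy image: smooth dark region on the left, bright textured region on the right.
img = [[0, 0, 9, 1],
       [0, 0, 1, 9]]
print(segment(img, [(0.0, 0.0), (5.0, 15.0)]))
```

Because the fused vector carries both cues, the smooth dark pixels and the textured bright pixels fall into different clusters even where their intensities alone would overlap.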

Relevance:

20.00%

Abstract:

Multi-frame super-resolution algorithms aim to increase spatial resolution by fusing information from several low-resolution views of a scene. While a wide array of super-resolution algorithms now exists, the comparative capability of these techniques in practical scenarios has not been adequately explored. In addition, a standard quantitative method for assessing the relative merit of super-resolution algorithms is required. This paper presents a comprehensive practical comparison of existing super-resolution techniques using a shared platform and four common greyscale reference images. In total, 13 different super-resolution algorithms are evaluated, and as accurate alignment is critical to the super-resolution process, six registration algorithms are also included in the analysis. Pixel-based visual information fidelity (VIFP) is selected from the 12 image quality metrics reviewed as the measure best suited to the appraisal of super-resolved images. Experimental results show that Bayesian super-resolution methods using the simultaneous autoregressive (SAR) prior produce the highest-quality images when combined with generalized stochastic Lucas-Kanade optical flow registration.
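For context, the simplest multi-frame fusion idea that the evaluated algorithms build on, shift-and-add on a 1-D signal with known sub-pixel shifts, can be sketched as follows (this is a purely illustrative baseline, not one of the 13 algorithms compared):

```python
def shift_and_add_1d(frames, shifts, factor):
    """Naive shift-and-add fusion: place each low-res sample onto the
    high-res grid at its registered sub-pixel position (in low-res pixel
    units) and average samples landing in the same high-res cell."""
    n_hr = len(frames[0]) * factor
    acc, cnt = [0.0] * n_hr, [0] * n_hr
    for frame, shift in zip(frames, shifts):
        for i, v in enumerate(frame):
            h = round((i + shift) * factor)
            if 0 <= h < n_hr:
                acc[h] += v
                cnt[h] += 1
    return [acc[h] / cnt[h] if cnt[h] else 0.0 for h in range(n_hr)]

# Two low-res frames sampled from the same scene, offset by half a pixel.
lr_a = [0, 2, 4, 6]   # samples at positions 0, 1, 2, 3
lr_b = [1, 3, 5, 7]   # samples at positions 0.5, 1.5, 2.5, 3.5
print(shift_and_add_1d([lr_a, lr_b], [0, 0.5], factor=2))
# -> [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
```

The sketch also shows why registration accuracy matters: a wrong shift places samples in the wrong high-resolution cells, which is why the paper evaluates six registration algorithms alongside the super-resolution methods.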

Relevance:

20.00%

Abstract:

The thickness of the retinal nerve fiber layer (RNFL) has become a diagnostic measure for glaucoma assessment. To measure this thickness, accurate segmentation of the RNFL in optical coherence tomography (OCT) images is essential. Identification of a suitable segmentation algorithm will facilitate enhancement of the RNFL thickness measurement accuracy. This paper investigates the performance of six algorithms for the segmentation of the RNFL in OCT images: normalised cuts, region growing, k-means clustering, active contours, and two level-set methods, the Piecewise Gaussian Method (PGM) and the Kernelized Method (KM). The performance of the six algorithms is determined through a set of experiments on OCT retinal images, using an experimental procedure designed to measure the performance of the tested algorithms. The measured segmentation precision-recall results of the six algorithms are compared and discussed.
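The pixel-wise precision-recall evaluation mentioned above can be sketched as follows (a generic formulation; the paper's exact evaluation protocol is not given here):

```python
def precision_recall(predicted, truth):
    """Pixel-wise precision and recall of a binary segmentation mask
    against a ground-truth mask (both given as flat 0/1 sequences)."""
    tp = sum(1 for p, t in zip(predicted, truth) if p == 1 and t == 1)
    fp = sum(1 for p, t in zip(predicted, truth) if p == 1 and t == 0)
    fn = sum(1 for p, t in zip(predicted, truth) if p == 0 and t == 1)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Toy masks: the segmenter finds half of the true layer and adds one false pixel.
print(precision_recall([1, 1, 0, 0], [1, 0, 1, 0]))  # -> (0.5, 0.5)
```

Reporting both numbers matters for layer segmentation: an over-segmented RNFL inflates recall at the cost of precision, and vice versa.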

Relevance:

20.00%

Abstract:

In this paper, a 6-RRCRR parallel-robot-assisted minimally invasive surgery/microsurgery system (PRAMiSS) is introduced. Remote centre-of-motion (RCM) control algorithms of PRAMiSS suitable for minimally invasive surgery and microsurgery are also presented. The programmable RCM approach is implemented to achieve manipulation under the constraint of moving through a fixed penetration point. Besides minimising the displacements of the mobile platform of the parallel micropositioning robot, the algorithms apply an orientation constraint to the instrument, preventing the tool tip from reorienting due to robot movements during manipulation. Experimental results are provided to verify the accuracy and effectiveness of the proposed RCM control algorithms for minimally invasive surgery.
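A hedged sketch of the geometric constraint behind an RCM: the instrument shaft must always pass through the fixed penetration point. The functions below only illustrate that constraint, not the paper's 6-RRCRR control algorithms:

```python
from math import sqrt

def shaft_direction(rcm, tip):
    """Unit direction the instrument shaft must take so the line through
    the tool tip also passes through the fixed penetration point (the RCM)."""
    d = [t - r for t, r in zip(tip, rcm)]
    n = sqrt(sum(c * c for c in d))
    return [c / n for c in d]

def on_shaft(rcm, tip, point, tol=1e-9):
    """True if `point` lies on the shaft axis through the RCM and the tip."""
    u = shaft_direction(rcm, tip)
    v = [p - r for p, r in zip(point, rcm)]
    # zero cross-product magnitude <=> collinear with the shaft axis
    cx = u[1] * v[2] - u[2] * v[1]
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    return abs(cx) + abs(cy) + abs(cz) < tol

rcm, tip = (0.0, 0.0, 0.0), (0.0, 0.0, 2.0)
print(shaft_direction(rcm, tip))             # -> [0.0, 0.0, 1.0]
print(on_shaft(rcm, tip, (1.0, 0.0, 1.0)))   # -> False
```

A programmable RCM enforces this collinearity in software rather than through a mechanically fixed pivot, which is what allows the penetration point to be reassigned.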

Relevance:

20.00%

Abstract:

A variety of type reduction (TR) algorithms have been proposed for interval type-2 fuzzy logic systems (IT2 FLSs). The existing literature focuses mainly on the computational requirements of TR algorithms, and researchers often favour the computationally less expensive ones. This paper evaluates and compares five frequently used TR algorithms from a forecasting performance perspective. The algorithms are judged on the generalization power of the IT2 FLS models developed using them. Four synthetic and real-world case studies with different levels of uncertainty are considered to examine the effects of TR algorithms on forecast accuracy. It is found that the Coupland-John TR algorithm leads to models with better forecasting performance. However, there is no clear relationship between the width of the type-reduced set and the TR algorithm.
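As an illustration of what type reduction computes, here is a sketch of the Karnik-Mendel endpoint iterations, the baseline TR procedure in this area (whether it is among the paper's five algorithms is an assumption; the rule data are toy values):

```python
def km_endpoint(y, f_low, f_up, right):
    """One endpoint of the type-reduced interval via Karnik-Mendel iterations.
    y: rule consequent centroids in ascending order;
    [f_low[i], f_up[i]]: interval firing strength of rule i;
    right=True computes y_r (maximising switch), right=False computes y_l."""
    n = len(y)
    w = [(lo + up) / 2 for lo, up in zip(f_low, f_up)]  # initial weights
    while True:
        yp = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
        k = 0
        while k < n - 1 and y[k + 1] <= yp:  # switch point: y[k] <= yp < y[k+1]
            k += 1
        if right:  # maximise: low weights on small centroids, high on large
            new_w = [f_low[i] if i <= k else f_up[i] for i in range(n)]
        else:      # minimise: high weights on small centroids, low on large
            new_w = [f_up[i] if i <= k else f_low[i] for i in range(n)]
        if new_w == w:  # weights stable -> converged
            return yp
        w = new_w

# Two rules with centroids 1 and 3 and interval firing strengths.
y_l = km_endpoint([1, 3], [0.5, 0.2], [0.9, 0.7], right=False)
y_r = km_endpoint([1, 3], [0.5, 0.2], [0.9, 0.7], right=True)
print(y_l, y_r)
```

The interval [y_l, y_r] is the type-reduced set whose width the abstract refers to; the five compared algorithms differ in how (and how cheaply) they compute or approximate these endpoints.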

Relevance:

20.00%

Abstract:

Detection of depression from structural MRI (sMRI) scans is relatively new in mental health diagnosis. Such detection requires several processes, including image acquisition and pre-processing, feature extraction and selection, and classification. Identification of a suitable feature selection (FS) algorithm will facilitate enhancement of the detection accuracy through the selection of important features. In the field of depression study, very few works evaluate feature selection algorithms for sMRI data. This paper investigates the performance of four algorithms for FS of volumetric attributes in sMRI scans: One Rule (OneR), Support Vector Machine (SVM), Information Gain (IG), and ReliefF. The performance of the algorithms is determined through a set of experiments on sMRI brain scans, using an experimental procedure developed to measure the performance of the tested algorithms. The results of the evaluation of the FS algorithms are discussed using a number of analyses.
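Of the four FS algorithms, Information Gain is the simplest to sketch; a minimal discrete-feature version follows (toy data, not the paper's sMRI volumetric attributes):

```python
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG of a discrete feature: H(labels) minus the expected entropy of the
    labels after partitioning the samples by feature value."""
    n = len(labels)
    by_value = {}
    for f, lab in zip(feature, labels):
        by_value.setdefault(f, []).append(lab)
    remainder = sum(len(part) / n * entropy(part) for part in by_value.values())
    return entropy(labels) - remainder

# A perfectly predictive feature vs. an uninformative one (toy labels).
print(information_gain([0, 0, 1, 1], [0, 0, 1, 1]))  # -> 1.0
print(information_gain([0, 1, 0, 1], [0, 0, 1, 1]))  # -> 0.0
```

Ranking volumetric attributes by a score of this kind, then keeping only the top-ranked ones, is the generic shape of filter-style feature selection.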

Relevance:

20.00%

Abstract:

This paper examines and analyzes different aggregation algorithms for improving the accuracy of forecasts obtained using neural network (NN) ensembles. These algorithms include the equal-weights combination of the best NN models, the combination of trimmed forecasts, and Bayesian Model Averaging (BMA). The predictive performance of these algorithms is evaluated using Australian electricity demand data. The outputs of the aggregation algorithms for the NN ensembles are compared with a naive approach. Mean absolute percentage error is applied as the performance index for assessing the quality of the aggregated forecasts. Through comprehensive simulations, it is found that the aggregation algorithms can significantly improve forecasting accuracy. The BMA algorithm demonstrates the best performance amongst the aggregation algorithms investigated in this study.
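The trimmed-forecast combination and the MAPE performance index can be sketched as follows (toy forecasts; the trimming amount is an assumption, since the paper's trimming fraction is not stated here):

```python
def trimmed_combination(forecasts, trim=1):
    """Combine ensemble forecasts per time step by discarding the `trim`
    lowest and `trim` highest members, then averaging the remainder."""
    combined = []
    for step in zip(*forecasts):  # one tuple of member forecasts per step
        kept = sorted(step)[trim:len(step) - trim]
        combined.append(sum(kept) / len(kept))
    return combined

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Three ensemble members, two forecast steps; the third member is an outlier.
members = [[10, 20], [12, 18], [30, 40]]
print(trimmed_combination(members, trim=1))  # -> [12.0, 20.0]
```

Trimming makes the combined forecast robust to a single wild ensemble member, which a plain equal-weights average is not.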

Relevance:

20.00%

Abstract:

Computational Intelligence (CI) holds the key to the development of the smart grid, helping overcome the challenges of planning and optimization through accurate prediction of Renewable Energy Sources (RES). This paper presents an architectural framework for the construction of a hybrid intelligent predictor for solar power. The research investigates the applicability of heterogeneous regression algorithms for six-hour-ahead solar power availability forecasting using historical data from Rockhampton, Australia. Real-life solar radiation data were collected at hourly resolution across six years, from 2005 to 2010. We observe that the hybrid prediction method is suitable for reliable smart grid energy management. The prediction reliability of the proposed hybrid method is assessed in terms of prediction error performance, using statistical and graphical methods. The experimental results show that the proposed hybrid method achieves acceptable prediction accuracy, making it applicable as a local predictor for six-hour-ahead forecasting in real-life smart grid operation, helping ensure a constant solar power supply.
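A hedged sketch of a hybrid predictor in the spirit described above, averaging two heterogeneous base regressors (a persistence baseline and a least-squares linear trend); the base models and data are illustrative choices, not the paper's:

```python
def linear_trend_forecast(series, horizon):
    """Least-squares linear trend, extrapolated `horizon` steps past the end."""
    n = len(series)
    mx = (n - 1) / 2
    my = sum(series) / n
    num = sum((x - mx) * (v - my) for x, v in enumerate(series))
    den = sum((x - mx) ** 2 for x in range(n))
    slope = num / den
    return my + slope * (n - 1 + horizon - mx)

def persistence_forecast(series, horizon):
    """Persistence baseline: the future equals the last observation."""
    return series[-1]

def hybrid_forecast(series, horizon=6):
    """Equal-weights combination of the heterogeneous base predictors."""
    preds = [linear_trend_forecast(series, horizon),
             persistence_forecast(series, horizon)]
    return sum(preds) / len(preds)

# Toy hourly solar-power values rising linearly; forecast two steps ahead.
print(hybrid_forecast([0, 1, 2, 3], horizon=2))  # -> 4.0
```

Combining structurally different regressors is the point of a heterogeneous hybrid: each base model's systematic errors are partly cancelled by the others'.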

Relevance:

20.00%

Abstract:

Two multidimensional HPLC separations of an Australian red wine are presented, in which more than 70% of the available separation space was used. A porous graphitic carbon (PGC) stationary phase was used as the first dimension in both separations, with an RP core-shell column and a fully porous hydrophilic interaction chromatography column used separately in the second dimension. To overcome peak analysis problems caused by signal noise and low detection limits, the data were pre-processed with penalised least-squares smoothing. The PGC × RP combination separated 85 peaks with a spreading angle of 71°, and the PGC × hydrophilic interaction chromatography combination separated 207 peaks with a spreading angle of 80°. Both 2D-HPLC separations were completed in 76 min using a comprehensive stop-and-go approach. A smoothing step added to the peak-picking process greatly reduced the number of false peaks caused by noise in the chromatograms. Thresholding alone could not reject the noise because of the small magnitude of the peaks: 1874 peaks were located in the non-smoothed PGC × RP separation, which reduced to 227 peaks once smoothing was included.
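Penalised least-squares smoothing of this kind is commonly implemented as the Whittaker-Eilers smoother; a minimal sketch follows (assuming a second-order difference penalty; the paper's exact penalty order and smoothing parameter are not given here, and the trace is toy data):

```python
import numpy as np

def whittaker_smooth(y, lam=10.0, d=2):
    """Whittaker-Eilers smoother: minimise ||y - z||^2 + lam * ||D z||^2,
    where D is the d-th order difference matrix; dense solve for small n."""
    n = len(y)
    D = np.diff(np.eye(n), n=d, axis=0)   # (n - d) x n difference matrix
    A = np.eye(n) + lam * D.T @ D
    return np.linalg.solve(A, np.asarray(y, dtype=float))

# A noisy trace: smoothing shrinks the squared second differences (roughness),
# which is what suppresses spurious noise peaks before peak picking.
y = [1.0, 3.0, 2.0, 4.0, 3.0, 5.0]
z = whittaker_smooth(y, lam=10.0)
roughness = lambda v: float(np.sum(np.diff(v, 2) ** 2))
print(roughness(z) < roughness(np.array(y)))  # -> True
```

Because the penalty acts on differences, genuine broad chromatographic peaks survive while narrow noise spikes are flattened below the picking threshold, matching the abstract's drop from 1874 located peaks to 227.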