435 results for malware detection


Relevance:

20.00%

Publisher:

Abstract:

With the identification of common single locus point mutations as risk factors for thrombophilia, many DNA testing methodologies have been described for detecting these variations. Traditionally, functional or immunological testing methods have been used to investigate quantitative anticoagulant deficiencies. However, with the emergence of the genetic variations factor V Leiden, prothrombin 20210 and, to a lesser extent, the methylene tetrahydrofolate reductase (MTHFR677) and factor V HR2 haplotype, traditional testing methodologies have proved to be less useful and DNA technology is instead more commonly employed in diagnostics. This review considers many of the DNA techniques that have proved useful in the detection of common genetic variants that predispose to thrombophilia. Techniques involving gel analysis are used to detect the presence or absence of restriction sites, electrophoretic mobility shifts, as in single strand conformation polymorphism or denaturing gradient gel electrophoresis, and product formation in allele-specific amplification. Such techniques may be sensitive, but are unwieldy and often need to be validated objectively. In order to overcome some of the limitations of gel analysis, especially when dealing with larger sample numbers, many alternative detection formats, such as closed tube systems, microplates and microarrays (minisequencing, real-time polymerase chain reaction, and oligonucleotide ligation assays), have been developed. In addition, many of the emerging technologies take advantage of colourimetric or fluorescence detection (including energy transfer), which allows qualitative and quantitative interpretation of results. With the large variety of DNA technologies available, the choice of methodology will depend on several factors, including cost and the need for speed, simplicity and robustness. © 2000 Lippincott Williams & Wilkins.

Relevance:

20.00%

Publisher:

Abstract:

PCR-based cancer diagnosis requires detection of rare mutations in k-ras, p53 or other genes. The assumption has been that mutant and wild-type sequences amplify with near equal efficiency, so that they are eventually present in proportions representative of the starting material. Work on factor IX suggests that this assumption is invalid for one case of near-sequence identity. To test the generality of this phenomenon and its relevance to cancer diagnosis, primers distant from point mutations in p53 and k-ras were used to amplify wild-type and mutant sequences from these genes. A substantial bias against PCR amplification of mutants was observed for two regions of the p53 gene and one region of k-ras. For k-ras and p53, bias was observed when the wild-type and mutant sequences were amplified separately or when mixed in equal proportions before PCR. Bias was present with both proofreading and non-proofreading polymerases. Mutant and wild-type segments of the factor V, cystic fibrosis transmembrane conductance regulator and prothrombin genes were amplified and did not exhibit PCR bias. Therefore, the assumption of equal PCR efficiency for point mutant and wild-type sequences is invalid in several systems. Quantitative or diagnostic PCR will require validation for each locus, and enrichment strategies may be needed to optimize detection of mutants.
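The proportion drift described above follows directly from compounding per-cycle efficiencies: even a modest, constant efficiency deficit grows exponentially over 30 cycles. A minimal sketch, with efficiency values invented purely for illustration:

```python
def amplify(copies, efficiency, cycles):
    """Template copies after the given number of PCR cycles at a fixed
    per-cycle amplification efficiency (0 = no gain, 1 = perfect doubling)."""
    return copies * (1.0 + efficiency) ** cycles

def mutant_fraction(mut0, wt0, eff_mut, eff_wt, cycles):
    """Fraction of mutant template in the final amplified product."""
    mut = amplify(mut0, eff_mut, cycles)
    wt = amplify(wt0, eff_wt, cycles)
    return mut / (mut + wt)

# Invented numbers: a 50:50 input with a modest per-cycle efficiency
# deficit for the mutant (0.85 vs 0.95) ends up markedly under-represented.
biased = mutant_fraction(1000, 1000, 0.85, 0.95, cycles=30)
```

With equal efficiencies the 50:50 ratio is preserved exactly; with the deficit above, the mutant falls well below half of the product, which is the kind of bias that would distort a diagnostic call.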

Relevance:

20.00%

Publisher:

Abstract:

This paper describes an effective method for signal-authentication and spoofing detection for civilian GNSS receivers using the GPS L1 C/A and the Galileo E1-B Safety of Life service. The paper discusses various spoofing attack profiles and how the proposed method is able to detect these attacks. This method is relatively low-cost and can be suitable for numerous mass-market applications. This paper is the subject of a pending patent.

Relevance:

20.00%

Publisher:

Abstract:

Given the recent emergence of the smart grid and related technologies, their security is a prime concern. Intrusion detection provides a second line of defense, but conventional intrusion detection systems (IDSs) are unable to adequately address the unique requirements of the smart grid. This paper presents a gap analysis of contemporary IDSs from a smart grid perspective, highlighting the lack of adequate intrusion detection within the smart grid and discussing the limitations of current IDS approaches. The gap analysis identifies current IDSs as being unsuited to smart grid application without significant changes to address smart-grid-specific requirements.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, a plasmonic “ac Wheatstone bridge” circuit is proposed and theoretically modeled for the first time. The bridge circuit consists of three metallic nanoparticles, shaped as rectangular prisms, with two nanoparticles acting as parallel arms of a resonant circuit and the third bridging nanoparticle acting as an optical antenna providing an output signal. Polarized light excites localized surface plasmon resonances in the two arms of the circuit, which generate an optical signal dependent on the phase-sensitive excitations of surface plasmons in the antenna. The circuit is analyzed using a plasmonic coupling theory and numerical simulations. The analyses show that the plasmonic circuit is sensitive to phase shifts between the arms of the bridge and has the potential to detect the presence of single molecules.

Relevance:

20.00%

Publisher:

Abstract:

In previous research (Chung et al., 2009), the potential of the continuous risk profile (CRP) to proactively detect the systematic deterioration of freeway safety levels was presented. In this paper, this potential is investigated further, and an algorithm is proposed for proactively detecting sites where the collision rate is not sufficiently high to be classified as a high collision concentration location but where a systematic deterioration of the safety level is observed. The proposed approach compares the weighted CRP across different years and uses the cumulative sum (CUSUM) algorithm to detect the sites where changes in collision rate are observed. The CRPs of the detected sites are then compared for reproducibility. When high reproducibility is observed, a growth factor is used for sequential hypothesis testing to determine whether the collision profiles are increasing over time. Findings from applying the proposed method to empirical data are documented in the paper, together with a detailed description of the method.
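The CUSUM step can be sketched as a one-sided tabular CUSUM that accumulates deviations above a reference level and alarms when the sum crosses a decision threshold. The values of `target`, `k` and `h` below are illustrative tuning parameters, not values from the paper:

```python
def cusum_upper(series, target, k, h):
    """One-sided (upper) tabular CUSUM.

    target: in-control mean of the monitored quantity (e.g. collision rate)
    k:      reference value (slack) that absorbs normal fluctuation
    h:      decision threshold; returns the first index where it is exceeded,
            or None if no upward shift is detected.
    """
    s = 0.0
    for i, x in enumerate(series):
        s = max(0.0, s + (x - target - k))  # accumulate only upward drift
        if s > h:
            return i
    return None

# A stable rate followed by a sustained upward shift from index 5 on.
rates = [2.1, 1.9, 2.0, 2.2, 2.0, 3.1, 3.4, 3.6, 3.8, 4.0]
alarm = cusum_upper(rates, target=2.0, k=0.25, h=2.0)  # alarm at index 7
```

Note the alarm fires two observations after the shift begins: the slack `k` delays detection slightly but suppresses alarms on isolated noisy values.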

Relevance:

20.00%

Publisher:

Abstract:

Although many incidents involving fake online consumer reviews have been reported, very few studies have been conducted to date to examine the trustworthiness of online consumer reviews. One reason is the lack of an effective computational method to separate untruthful reviews (i.e., spam) from legitimate ones (i.e., ham), given that prominent spam features are often missing in online reviews. The main contribution of our research is the development of a novel review spam detection method underpinned by an unsupervised inferential language modeling framework. Another contribution of this work is the development of a high-order concept association mining method which provides the essential term association knowledge to bootstrap the performance of untruthful review detection. Our experimental results confirm that the proposed inferential language model, equipped with high-order concept association knowledge, is effective in untruthful review detection when compared with other baseline methods.
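The paper's inferential language model is not specified in the abstract. As a rough, simplified stand-in for the underlying idea of scoring a review against a background language model, one can measure how far a review's smoothed unigram distribution diverges from a corpus of presumed-legitimate reviews; all corpus data and the vocabulary size below are invented:

```python
from collections import Counter
import math

def divergence_score(review_tokens, background_counts, vocab_size, alpha=1.0):
    """Smoothed (partial) KL divergence of the review's unigram distribution
    from the background corpus distribution; higher = more atypical wording."""
    bg_total = sum(background_counts.values())
    counts = Counter(review_tokens)
    total = len(review_tokens)
    score = 0.0
    for word, c in counts.items():
        # Laplace-smoothed probabilities under review and background models.
        p = (c + alpha) / (total + alpha * vocab_size)
        q = (background_counts.get(word, 0) + alpha) / (bg_total + alpha * vocab_size)
        score += p * math.log(p / q)
    return score

# Invented toy corpus of presumed-legitimate ("ham") reviews.
background = Counter("great food friendly staff good value great service".split())
ham_score = divergence_score("great food good service".split(), background, vocab_size=10)
spam_score = divergence_score("buy cheap pills now buy cheap pills now".split(), background, vocab_size=10)
```

A review whose vocabulary never appears in the background corpus scores far higher than one that reuses it, which is the ranking behaviour an unsupervised spam filter needs; the paper's actual model adds inference over high-order concept associations rather than raw unigrams.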

Relevance:

20.00%

Publisher:

Abstract:

Data preprocessing is widely recognized as an important stage in anomaly detection. This paper reviews the data preprocessing techniques used by anomaly-based network intrusion detection systems (NIDS), concentrating on which aspects of the network traffic are analyzed, and what feature construction and selection methods have been used. Motivation for the paper comes from the large impact data preprocessing has on the accuracy and capability of anomaly-based NIDS. The review finds that many NIDS limit their view of network traffic to the TCP/IP packet headers. Time-based statistics can be derived from these headers to detect network scans, network worm behavior, and denial of service attacks. A number of other NIDS perform deeper inspection of request packets to detect attacks against network services and network applications. More recent approaches analyze full service responses to detect attacks targeting clients. The review covers a wide range of NIDS, highlighting which classes of attack are detectable by each of these approaches. Data preprocessing is found to predominantly rely on expert domain knowledge for identifying the most relevant parts of network traffic and for constructing the initial candidate set of traffic features. On the other hand, automated methods have been widely used for feature extraction to reduce data dimensionality, and feature selection to find the most relevant subset of features from this candidate set. The review shows a trend toward deeper packet inspection to construct more relevant features through targeted content parsing. These context sensitive features are required to detect current attacks.
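The time-based statistics derived from packet headers can be illustrated with a toy scan detector: count the distinct destination ports each source contacts within a time window, a classic hand-constructed feature for flagging port scans. The field layout and threshold are assumptions for illustration, not taken from any particular NIDS:

```python
from collections import defaultdict

def distinct_ports_per_source(packets, window_start, window_end):
    """packets: iterable of (timestamp, src_ip, dst_port) header tuples.
    Returns {src_ip: number of distinct destination ports} for the window."""
    seen = defaultdict(set)
    for ts, src, dport in packets:
        if window_start <= ts < window_end:
            seen[src].add(dport)
    return {src: len(ports) for src, ports in seen.items()}

# One host browsing normally, another sweeping several ports quickly.
packets = [
    (0.1, "10.0.0.5", 80), (0.2, "10.0.0.5", 443),
    (0.3, "10.0.0.9", 22), (0.4, "10.0.0.9", 23),
    (0.5, "10.0.0.9", 25), (0.6, "10.0.0.9", 80),
]
counts = distinct_ports_per_source(packets, 0.0, 1.0)
suspects = [src for src, n in counts.items() if n >= 4]  # illustrative threshold
```

This is exactly the kind of expert-constructed candidate feature the review describes; automated feature selection would then decide whether it survives into the final feature set.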

Relevance:

20.00%

Publisher:

Abstract:

The construction of timelines of computer activity is a part of many digital investigations. These timelines of events are composed of traces of historical activity drawn from system logs and potentially from evidence of events found in the computer file system. A potential problem with the use of such information is that some of it may be inconsistent and contradictory, thus compromising its value. This work introduces a software tool (CAT Detect) for the detection of inconsistency within timelines of computer activity. We examine the impact of deliberate tampering through experiments conducted with our prototype software tool. Based on the results of these experiments, we discuss techniques which can be employed to deal with such temporal inconsistencies.
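CAT Detect's actual rules are not given in the abstract; the general idea of inconsistency detection can be sketched as checking happens-before constraints over recorded timestamps. The single constraint below (a file cannot be created after it was last modified) and all event data are invented for illustration:

```python
def find_inconsistencies(events):
    """events: list of dicts with 'name', 'created' and 'modified' timestamps.
    Returns (name, reason) pairs for events violating a basic
    happens-before constraint, a possible sign of clock or record tampering."""
    issues = []
    for e in events:
        if e["created"] > e["modified"]:
            issues.append((e["name"], "created after last modified"))
    return issues

timeline = [
    {"name": "report.doc", "created": 100, "modified": 250},  # consistent
    {"name": "log.txt",    "created": 300, "modified": 120},  # contradictory
]
flagged = find_inconsistencies(timeline)
```

A real timeline tool would apply many such constraints across heterogeneous sources (logs, file-system metadata, registry entries) and reason about which of the contradictory records is more likely to be genuine.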

Relevance:

20.00%

Publisher:

Abstract:

Unusual event detection in crowded scenes remains challenging because of the diversity of events and noise. In this paper, we present a novel approach for unusual event detection via sparse reconstruction of dynamic textures over an overcomplete basis set, with the dynamic texture described by local binary patterns from three orthogonal planes (LBP-TOP). The overcomplete basis set is learnt from training data in which only normal items are observed. In the detection process, given a new observation, we compute the sparse coefficients using the Dantzig Selector algorithm, proposed in the compressed sensing literature. The reconstruction errors are then computed, and based on these we detect the abnormal items. Our approach can detect both local and global abnormal events. We evaluate our algorithm on the UCSD Abnormality Datasets for local anomaly detection, where it outperforms current state-of-the-art approaches, and we also obtain promising results for rapid escape detection on the PETS2009 dataset.
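The decision rule above, reconstructing a new observation sparsely over a basis of normal examples and thresholding the residual, can be sketched compactly. For a self-contained example, the snippet substitutes orthogonal matching pursuit for the Dantzig Selector (which requires a linear-programming solver) and a random dictionary for the learnt basis; only the general technique, not the paper's exact pipeline, is shown:

```python
import numpy as np

def omp(D, y, n_nonzero):
    """Greedy orthogonal matching pursuit: sparse code of y over columns of D
    (a stand-in here for the Dantzig Selector used in the paper)."""
    residual = y.astype(float).copy()
    support = []
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(D.T @ residual)))   # most correlated atom
        if j not in support:
            support.append(j)
        sol, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ sol
    coef = np.zeros(D.shape[1])
    coef[support] = sol
    return coef

def reconstruction_error(D, y, n_nonzero=3):
    """Anomaly score: residual norm after sparse reconstruction."""
    return float(np.linalg.norm(y - D @ omp(D, y, n_nonzero)))

# Toy "dictionary": random unit-norm atoms standing in for a basis
# learnt from normal LBP-TOP feature vectors.
rng = np.random.default_rng(0)
D = rng.normal(size=(20, 10))
D /= np.linalg.norm(D, axis=0)

normal = 0.7 * D[:, 1] + 0.3 * D[:, 4]   # lies in the dictionary's span
anomaly = rng.normal(size=20)            # does not
```

A vector explainable by a few normal atoms reconstructs almost exactly, while an unrelated vector leaves a large residual; declaring an event abnormal then reduces to comparing that residual against a threshold calibrated on normal data.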

Relevance:

20.00%

Publisher:

Abstract:

Modelling events in densely crowded environments remains challenging, due to the diversity of events and the noise in the scene. We propose a novel approach for anomalous event detection in crowded scenes using dynamic textures described by the Local Binary Patterns from Three Orthogonal Planes (LBP-TOP) descriptor. The scene is divided into spatio-temporal patches from which LBP-TOP based dynamic textures are extracted. We apply hierarchical Bayesian models to detect the patches containing unusual events. Our method is an unsupervised approach and does not rely on object tracking or background subtraction. We show that our approach outperforms existing state-of-the-art algorithms for anomalous event detection on the UCSD dataset.
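The per-plane building block of LBP-TOP is the ordinary 8-neighbour local binary pattern code; LBP-TOP computes such codes on the XY, XT and YT planes of a spatio-temporal volume and concatenates the resulting histograms. A minimal sketch of the single-plane code on one 3x3 patch:

```python
import numpy as np

def lbp_3x3(patch):
    """8-neighbour LBP code for the centre pixel of a 3x3 patch: each
    neighbour at least as bright as the centre contributes one bit."""
    c = patch[1, 1]
    # Neighbours taken clockwise starting from the top-left corner.
    neighbours = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
                  patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
    return sum(1 << i for i, n in enumerate(neighbours) if n >= c)

patch = np.array([[9, 9, 9],
                  [1, 5, 1],
                  [0, 0, 0]])
code = lbp_3x3(patch)  # only the top row exceeds the centre: bits 0-2 set, code 7
```

A histogram of such codes over a patch summarises its texture; repeating this on the two temporal planes is what lets the descriptor capture motion as well as appearance.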

Relevance:

20.00%

Publisher:

Abstract:

Damage detection in structures has become increasingly important in recent years. While a number of damage detection and localization methods have been proposed, few attempts have been made to explore structural damage using frequency response functions (FRFs). This paper illustrates the damage identification and condition assessment of a beam structure using a new FRF-based damage index and Artificial Neural Networks (ANNs). In practice, using all available FRF data as input to artificial neural networks makes training and convergence impossible. Therefore, a data reduction technique, Principal Component Analysis (PCA), is introduced into the algorithm. In the proposed procedure, a large set of FRFs is divided into sub-sets in order to find the damage indices for different frequency points of different damage scenarios. The basic idea of this method is to establish features of the damaged structure using FRFs from different measurement points of different sub-sets of the intact structure. Using these features, damage indices of different damage cases of the structure are identified after reconstructing the available FRF data using PCA. The obtained damage indices, corresponding to different damage locations and severities, are used as input variables to the developed artificial neural networks. Finally, the effectiveness of the proposed method is illustrated and validated using a finite element model of a beam structure. The results show that the PCA-based damage index is suitable and effective for structural damage detection and condition assessment of building structures.
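The PCA reduction step that makes the FRF data tractable for an ANN can be sketched with a plain SVD: project each measured FRF (one row vector) onto the leading principal components and keep the low-dimensional scores. This is an illustrative reduction on synthetic data, not the paper's exact pipeline:

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project each row of X (e.g. one measured FRF per row) onto the
    top principal components of the data set."""
    mean = X.mean(axis=0)
    # SVD of the centred data gives the principal directions in Vt.
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    components = Vt[:n_components]
    scores = (X - mean) @ components.T   # reduced representation
    return scores, components, mean

def pca_reconstruct(scores, components, mean):
    """Map reduced scores back to the original FRF space."""
    return scores @ components + mean

# Synthetic "FRF" data: 30 measurements over 50 frequency points, lying in
# a 2-dimensional subspace so that 2 components reconstruct them exactly.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 2)) @ rng.normal(size=(2, 50))
scores, components, mean = pca_reduce(X, n_components=2)
X_rec = pca_reconstruct(scores, components, mean)
```

Here 50-dimensional measurements are compressed to 2 scores per measurement with no loss; real FRF data would need more components, chosen by the fraction of variance retained, and the scores rather than the raw FRFs become the ANN inputs.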