429 results for Heterodyne detection
Abstract:
Early detection surveillance programs aim to find invasions of exotic plant pests and diseases before they are too widespread to eradicate. However, the value of these programs can be difficult to justify when no positive detections are made. To demonstrate the value of the pest-absence information provided by these programs, we use a hierarchical Bayesian framework to model estimates of incursion extent with and without surveillance. A model for the latent invasion process provides the baseline against which surveillance data are assessed. Ecological knowledge and pest management criteria are introduced into the model using informative priors for invasion parameters. Observation models assimilate information from spatio-temporal presence/absence data to accommodate imperfect detection and generate posterior estimates of pest extent. When applied to an early detection program operating in Queensland, Australia, the framework demonstrates that this typical surveillance regime provides a modest reduction in the estimated probability that a surveyed district is infested. More importantly, the model suggests that early detection surveillance programs can provide a dramatic reduction in the putative area of incursion and therefore offer a substantial benefit to incursion management. By mapping spatial estimates of the point probability of infestation, the model identifies where future surveillance resources can be most effectively deployed.
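To make the value of pest-absence data concrete, the sketch below applies Bayes' rule to repeated negative surveys with imperfect detection. It is a deliberately simplified, single-site version of the idea (the paper's hierarchical spatio-temporal model is far richer); the prior and sensitivity values are illustrative assumptions, not figures from the study.

```python
def posterior_infested(prior, sensitivity, n_negative):
    """Posterior probability that a district is infested after
    n_negative surveys with no detections, assuming each survey
    independently detects a present pest with probability
    `sensitivity` (imperfect detection)."""
    miss = (1.0 - sensitivity) ** n_negative
    return prior * miss / (prior * miss + (1.0 - prior))

# Illustrative numbers only: a 10% prior probability of infestation
# and surveys that each detect a present pest 30% of the time.
for n in (0, 1, 5, 10):
    print(n, round(posterior_infested(prior=0.10, sensitivity=0.30, n_negative=n), 4))
```

Even modest per-survey sensitivity drives the posterior down steadily as negative surveys accumulate, which is the sense in which absence data carry value.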
Abstract:
Background Techniques for detecting circulating tumor cells in the peripheral blood of patients with head and neck cancers may identify individuals likely to benefit from early systemic treatment. Methods Reconstruction experiments were used to optimize immunomagnetic enrichment and RT-PCR detection of circulating tumor cells using four markers (ELF3, CK19, EGFR and EphB4). This method was then tested in a pilot study using samples from 16 patients with advanced head and neck carcinomas. Results Seven patients were positive for circulating tumor cells both before and after surgery, 4 patients were positive before but not after surgery, 3 patients were positive after but not before surgery, and 2 patients were negative. Two patients tested positive for circulating tumor cells despite no other evidence of tumor spread. Because this patient cohort mostly had advanced disease, the detection of circulating tumor cells was, as expected, not associated with significant differences in overall or disease-free survival. Conclusion For the first time, we show that almost all patients with advanced head and neck cancers have circulating tumor cells at the time of surgery. The clinical application of techniques for the detection of spreading disease, such as the immunomagnetic enrichment RT-PCR analysis used in this study, should be explored further.
Abstract:
With the identification of common single-locus point mutations as risk factors for thrombophilia, many DNA testing methodologies have been described for detecting these variations. Traditionally, functional or immunological testing methods have been used to investigate quantitative anticoagulant deficiencies. However, with the emergence of the genetic variations factor V Leiden, prothrombin 20210 and, to a lesser extent, methylene tetrahydrofolate reductase (MTHFR677) and the factor V HR2 haplotype, traditional testing methodologies have proved less useful, and DNA technology is instead more commonly employed in diagnostics. This review considers many of the DNA techniques that have proved useful in the detection of common genetic variants that predispose to thrombophilia. Techniques involving gel analysis are used to detect the presence or absence of restriction sites, electrophoretic mobility shifts (as in single-strand conformation polymorphism or denaturing gradient gel electrophoresis), and product formation in allele-specific amplification. Such techniques may be sensitive, but they are unwieldy and often need to be validated objectively. To overcome some of the limitations of gel analysis, especially when dealing with larger sample numbers, many alternative detection formats, such as closed-tube systems, microplates and microarrays (minisequencing, real-time polymerase chain reaction, and oligonucleotide ligation assays), have been developed. In addition, many of the emerging technologies take advantage of colourimetric or fluorescence detection (including energy transfer), which allows qualitative and quantitative interpretation of results. With the large variety of DNA technologies available, the choice of methodology will depend on several factors, including cost and the need for speed, simplicity and robustness. © 2000 Lippincott Williams & Wilkins.
Abstract:
PCR-based cancer diagnosis requires detection of rare mutations in k-ras, p53 or other genes. The assumption has been that mutant and wild-type sequences amplify with near-equal efficiency, so that they are eventually present in proportions representative of the starting material. Work on factor IX suggests that this assumption is invalid for one case of near-sequence identity. To test the generality of this phenomenon and its relevance to cancer diagnosis, primers distant from point mutations in p53 and k-ras were used to amplify wild-type and mutant sequences from these genes. A substantial bias against PCR amplification of mutants was observed for two regions of the p53 gene and one region of k-ras. For k-ras and p53, bias was observed when the wild-type and mutant sequences were amplified separately or when mixed in equal proportions before PCR. Bias was present with both proofreading and non-proofreading polymerases. Mutant and wild-type segments of the factor V, cystic fibrosis transmembrane conductance regulator and prothrombin genes were amplified and did not exhibit PCR bias. Therefore, the assumption of equal PCR efficiency for point-mutant and wild-type sequences is invalid in several systems. Quantitative or diagnostic PCR will require validation for each locus, and enrichment strategies may be needed to optimize detection of mutants.
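The bias the authors describe can be illustrated with simple exponential-growth arithmetic: a small per-cycle efficiency deficit for the mutant compounds over 30 or so cycles. The efficiencies below are invented for illustration, not measured values from the study.

```python
def mutant_fraction(n_cycles, eff_wt=0.95, eff_mut=0.85, start_fraction=0.5):
    """Fraction of mutant product after PCR when the mutant template
    amplifies slightly less efficiently per cycle than wild type.
    Efficiencies are per-cycle gains (1.0 = perfect doubling);
    the values here are illustrative assumptions only."""
    wt = (1 - start_fraction) * (1 + eff_wt) ** n_cycles
    mut = start_fraction * (1 + eff_mut) ** n_cycles
    return mut / (mut + wt)

# An equal 50:50 starting mix drifts to roughly 17% mutant after 30 cycles.
print(round(mutant_fraction(30), 3))
```

This is why the abstract concludes that quantitative or diagnostic PCR needs per-locus validation: the final product ratio is not a faithful copy of the starting ratio.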
Abstract:
This paper describes an effective method for signal authentication and spoofing detection for civilian GNSS receivers using the GPS L1 C/A signal and the Galileo E1-B Safety of Life service. The paper discusses various spoofing attack profiles and how the proposed method detects these attacks. The method is relatively low-cost and suitable for numerous mass-market applications. This paper is the subject of a pending patent.
Abstract:
Given the recent emergence of the smart grid and related technologies, their security is a prime concern. Intrusion detection provides a second line of defense. However, conventional intrusion detection systems (IDSs) are unable to adequately address the unique requirements of the smart grid. This paper presents a gap analysis of contemporary IDSs from a smart grid perspective, highlighting the lack of adequate intrusion detection within the smart grid and discussing the limitations of current IDS approaches. The gap analysis identifies current IDSs as unsuited to smart grid application without significant changes to address smart-grid-specific requirements.
Abstract:
In this paper, a plasmonic “ac Wheatstone bridge” circuit is proposed and theoretically modeled for the first time. The bridge circuit consists of three metallic nanoparticles, shaped as rectangular prisms, with two nanoparticles acting as parallel arms of a resonant circuit and the third bridging nanoparticle acting as an optical antenna providing an output signal. Polarized light excites localized surface plasmon resonances in the two arms of the circuit, which generate an optical signal dependent on the phase-sensitive excitations of surface plasmons in the antenna. The circuit is analyzed using a plasmonic coupling theory and numerical simulations. The analyses show that the plasmonic circuit is sensitive to phase shifts between the arms of the bridge and has the potential to detect the presence of single molecules.
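For readers unfamiliar with the electrical analogy, the classical ac Wheatstone bridge gives zero output at balance and a first-order response to any imbalance between its arms; the equations below state that textbook condition. This is background, not the paper's plasmonic coupling derivation, which replaces lumped impedances with phase-sensitive plasmon excitations of the nanoparticle arms.

```latex
% Textbook ac Wheatstone bridge: the output between the two divider
% midpoints vanishes exactly at balance, so a tiny perturbation of
% one arm (e.g., a bound molecule shifting a resonance) produces a
% signal against a near-zero background.
\[
V_{\mathrm{out}} = V_{\mathrm{in}}
\left( \frac{Z_2}{Z_1 + Z_2} - \frac{Z_4}{Z_3 + Z_4} \right),
\qquad
V_{\mathrm{out}} = 0 \;\Longleftrightarrow\; \frac{Z_1}{Z_2} = \frac{Z_3}{Z_4}.
\]
```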
Abstract:
In previous research (Chung et al., 2009), the potential of the continuous risk profile (CRP) to proactively detect the systematic deterioration of freeway safety levels was presented. In this paper, this potential is investigated further, and an algorithm is proposed for proactively detecting sites where the collision rate is not high enough to be classified as a high collision concentration location but where a systematic deterioration of the safety level is observed. The proposed approach compares the weighted CRP across different years and uses the cumulative sum (CUSUM) algorithm to detect sites where changes in collision rate are observed. The CRPs of the detected sites are then compared for reproducibility. When high reproducibility is observed, a growth factor is used in a sequential hypothesis test to determine whether the collision profiles are increasing over time. Findings from applying the proposed method to empirical data are documented in the paper, together with a detailed description of the method.
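For concreteness, here is a minimal one-sided CUSUM detector of the kind the paper builds on. The CRP weighting, reproducibility comparison, and growth-factor sequential test are not reproduced, and the parameter values are illustrative assumptions.

```python
import numpy as np

def cusum_upper(x, mu0, k, h):
    """One-sided (upper) CUSUM: return indices where the cumulative
    statistic exceeds the decision threshold h.

    x   : observations (e.g., a site's collision rate by year)
    mu0 : in-control mean
    k   : allowance (slack), often half the shift to be detected
    h   : decision threshold
    """
    s, alarms = 0.0, []
    for t, xt in enumerate(x):
        s = max(0.0, s + (xt - mu0 - k))
        if s > h:
            alarms.append(t)
    return alarms

# Rates drift upward from index 5; the alarm fires from index 6 onward,
# well before the site would qualify as a high collision concentration location.
rates = np.array([1.0, 1.1, 0.9, 1.0, 1.05, 1.4, 1.5, 1.6, 1.7, 1.8])
print(cusum_upper(rates, mu0=1.0, k=0.1, h=0.5))
```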
Abstract:
Although many incidents of fake online consumer reviews have been reported, very few studies have been conducted to date to examine the trustworthiness of online consumer reviews. One reason is the lack of an effective computational method to separate untruthful reviews (i.e., spam) from legitimate ones (i.e., ham), given that prominent spam features are often missing from online reviews. The main contribution of our research work is the development of a novel review spam detection method which is underpinned by an unsupervised inferential language modeling framework. Another contribution of this work is the development of a high-order concept association mining method which provides the essential term association knowledge to bootstrap the performance of untruthful review detection. Our experimental results confirm that the proposed inferential language model, equipped with high-order concept association knowledge, is effective in untruthful review detection when compared with other baseline methods.
Abstract:
Data preprocessing is widely recognized as an important stage in anomaly detection. This paper reviews the data preprocessing techniques used by anomaly-based network intrusion detection systems (NIDS), concentrating on which aspects of the network traffic are analyzed and what feature construction and selection methods have been used. Motivation for the paper comes from the large impact that data preprocessing has on the accuracy and capability of anomaly-based NIDS. The review finds that many NIDS limit their view of network traffic to the TCP/IP packet headers. Time-based statistics can be derived from these headers to detect network scans, network worm behavior, and denial of service attacks. A number of other NIDS perform deeper inspection of request packets to detect attacks against network services and network applications. More recent approaches analyze full service responses to detect attacks targeting clients. The review covers a wide range of NIDS, highlighting which classes of attack are detectable by each of these approaches. Data preprocessing is found to rely predominantly on expert domain knowledge for identifying the most relevant parts of network traffic and for constructing the initial candidate set of traffic features. On the other hand, automated methods have been widely used for feature extraction to reduce data dimensionality, and for feature selection to find the most relevant subset of features from this candidate set. The review shows a trend toward deeper packet inspection to construct more relevant features through targeted content parsing. These context-sensitive features are required to detect current attacks.
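As a toy illustration of the header-derived, time-based statistics the review describes, the sketch below counts the distinct destination endpoints contacted by each source address within a time window, a crude port-scan indicator. The packet representation and the statistic itself are assumptions for illustration, not a construction taken from any surveyed NIDS.

```python
from collections import defaultdict

def scan_scores(packets, window=5.0):
    """Distinct (dst_ip, dst_port) pairs contacted by each source IP
    within a sliding time window; a high count crudely flags scanning.

    packets: iterable of (timestamp, src_ip, dst_ip, dst_port) tuples,
             assumed sorted by timestamp.
    """
    recent = []                     # packets inside the current window
    scores = defaultdict(int)
    for ts, src, dst, port in packets:
        recent.append((ts, src, dst, port))
        recent = [p for p in recent if ts - p[0] <= window]
        targets = {(p[2], p[3]) for p in recent if p[1] == src}
        scores[src] = max(scores[src], len(targets))
    return dict(scores)

# One source touching 29 ports in under a second looks like a scan.
pkts = [(0.01 * p, "10.0.0.5", "10.0.0.9", p) for p in range(1, 30)]
print(scan_scores(pkts))
```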
Abstract:
The construction of timelines of computer activity is a part of many digital investigations. These timelines of events are composed of traces of historical activity drawn from system logs and potentially from evidence of events found in the computer file system. A potential problem with the use of such information is that some of it may be inconsistent and contradictory, thus compromising its value. This work introduces a software tool (CAT Detect) for the detection of inconsistency within timelines of computer activity. We examine the impact of deliberate tampering through experiments conducted with our prototype software tool. Based on the results of these experiments, we discuss techniques that can be employed to deal with such temporal inconsistencies.
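The abstract does not spell out CAT Detect's rule set, so the sketch below shows only the general flavour of a timeline consistency check: flagging file records whose creation time postdates their last-modified time, a contradiction consistent with back-dating. The event representation is an assumption for illustration.

```python
from datetime import datetime

def find_inconsistencies(events):
    """Flag file records whose creation time is later than their
    last-modified time; such contradictions can indicate clock
    manipulation or deliberate back-dating of evidence.

    events: list of dicts with 'name', 'created', 'modified' datetimes.
    """
    issues = []
    for e in events:
        if e["created"] > e["modified"]:
            issues.append(f"{e['name']}: created {e['created']} "
                          f"but modified earlier, at {e['modified']}")
    return issues

timeline = [
    {"name": "report.doc",
     "created": datetime(2024, 3, 2, 10, 0),
     "modified": datetime(2024, 3, 1, 9, 0)},   # back-dated: inconsistent
    {"name": "notes.txt",
     "created": datetime(2024, 3, 1, 8, 0),
     "modified": datetime(2024, 3, 1, 9, 30)},  # consistent record
]
for issue in find_inconsistencies(timeline):
    print(issue)
```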
Abstract:
Unusual event detection in crowded scenes remains challenging because of the diversity of events and noise. In this paper, we present a novel approach for unusual event detection via sparse reconstruction of dynamic textures over an overcomplete basis set, with the dynamic texture described by local binary patterns from three orthogonal planes (LBP-TOP). The overcomplete basis set is learnt from training data in which only normal items are observed. In the detection process, given a new observation, we compute the sparse coefficients using the Dantzig Selector algorithm, which was proposed in the compressed sensing literature. The reconstruction errors are then computed, and based on these we detect the abnormal items. Our approach can be used to detect both local and global abnormal events. We evaluate our algorithm on the UCSD Abnormality Datasets for local anomaly detection, where it outperforms current state-of-the-art approaches, and we also obtain promising results for rapid escape detection on the PETS2009 dataset.
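The following sketch shows the reconstruction-error principle the abstract describes, with scikit-learn's orthogonal matching pursuit standing in for the Dantzig Selector (both produce sparse codes; the Dantzig Selector instead constrains the correlation of the residual with the dictionary). The random dictionary, feature dimension, and sparsity level are illustrative assumptions; the paper's LBP-TOP descriptors and learnt basis are not reproduced.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def anomaly_score(x, D, sparsity=15):
    """Encode x over the dictionary D (columns = atoms learnt from
    normal data only) with a sparse solver, then return the
    reconstruction error. Normal items reconstruct well; unusual
    items do not. OMP stands in here for the Dantzig Selector."""
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=sparsity,
                                    fit_intercept=False)
    omp.fit(D, x)
    return float(np.linalg.norm(x - D @ omp.coef_))

rng = np.random.default_rng(0)
D = rng.normal(size=(64, 256))            # overcomplete basis, 64-dim features
code = rng.normal(size=256) * (rng.random(256) < 0.04)
normal_item = D @ code                    # sparse in D, like training data
unusual_item = rng.normal(size=64) * 5.0  # not sparse in D
print(anomaly_score(normal_item, D))      # near zero
print(anomaly_score(unusual_item, D))     # markedly larger
```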