200 results for "microphone forensics"


Relevance: 10.00%

Abstract:

This paper is concerned with the universal (blind) image steganalysis problem and introduces a novel method aimed in particular at spatial-domain steganographic methods. The proposed steganalyzer models linear dependencies of image rows/columns in local neighborhoods using the singular value decomposition (SVD) and exploits the content independence provided by a Wiener filtering process. Experimental results show that the novel method outperforms its counterparts on spatial-domain steganography. Experiments also demonstrate a reasonable ability to detect discrete cosine transform (DCT)-based steganography as well as the perturbation quantization method.
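The core idea above, content suppression followed by SVD modeling of local neighborhoods, can be sketched roughly as follows. This is a minimal illustration assuming NumPy/SciPy; the block size, filter window, and feature layout are my own choices, not the authors' configuration.

```python
import numpy as np
from scipy.signal import wiener

def svd_features(image, block=8):
    """Content-suppressed SVD features (illustrative): Wiener-filter the
    image to estimate content, then collect singular values of local
    blocks of the residual, which carries the stego signal."""
    image = image.astype(float)
    residual = image - wiener(image, (5, 5))  # suppress image content
    h, w = residual.shape
    feats = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            # singular values model linear row/column dependencies locally
            s = np.linalg.svd(residual[i:i + block, j:j + block],
                              compute_uv=False)
            feats.append(s)
    return np.mean(feats, axis=0)  # average singular-value spectrum
```

The averaged spectrum would then feed a classifier trained on cover versus stego images.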

Relevance: 10.00%

Abstract:

This paper introduces a new technique for palmprint recognition based on Fisher Linear Discriminant Analysis (FLDA) and a Gabor filter bank. The method involves convolving a palmprint image with a bank of Gabor filters at different scales and rotations for robust palmprint feature extraction. Once these features are extracted, FLDA is applied for dimensionality reduction and class separability. Since the palmprint features are derived from the principal lines, wrinkles and texture along the palm area, one should carefully consider this fact when selecting the palm region for feature extraction in order to enhance recognition accuracy. To address this problem, an improved region of interest (ROI) extraction algorithm is introduced. This algorithm efficiently extracts the whole palm area while ignoring undesirable parts such as the fingers and background. Experiments have shown that the proposed method yields attractive performance, as evidenced by an Equal Error Rate (EER) of 0.03%.
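A bank of Gabor filters at several scales and orientations, as used for feature extraction here, might be built like this. The kernel size, wavelengths, and sigma-to-wavelength ratio below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def gabor_bank(scales=(4, 8, 16), orientations=4, size=31, sigma_ratio=0.56):
    """Build a bank of real Gabor kernels over several carrier wavelengths
    ("scales") and equally spaced orientations (parameters illustrative)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    bank = []
    for lam in scales:                    # wavelength of the sinusoidal carrier
        sigma = sigma_ratio * lam         # envelope width tied to wavelength
        for k in range(orientations):
            theta = k * np.pi / orientations
            xr = x * np.cos(theta) + y * np.sin(theta)
            yr = -x * np.sin(theta) + y * np.cos(theta)
            g = (np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
                 * np.cos(2 * np.pi * xr / lam))
            bank.append(g - g.mean())     # zero mean: robust to illumination
    return bank
```

Convolving the ROI with each kernel and pooling the responses yields the feature vector that FLDA then projects.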

Relevance: 10.00%

Abstract:

N-gram analysis is an approach that investigates the structure of a program using bytes, characters, or text strings. A key issue with N-gram analysis is feature selection amidst the explosion of features that occurs when N is increased. The experiments within this paper represent programs as operational code (opcode) density histograms obtained through dynamic analysis. A support vector machine is used to create a reference model, which is used to evaluate two methods of feature reduction: 'area of intersect' and 'subspace analysis using eigenvectors.' The findings show that the relationships between features are complex and simple statistical filtering does not provide a viable approach. However, eigenvector subspace analysis produces a suitable filter.
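The pipeline described, opcode density histograms followed by eigenvector subspace filtering, could look roughly like this. The abstract does not give the exact ranking rule, so the weighting used below (per-feature contribution to the dominant eigenvectors of the covariance) is an assumption, and the vocabulary is a toy one.

```python
import numpy as np

def opcode_histogram(trace, vocab):
    """Opcode density histogram: fraction of each opcode in a dynamic trace."""
    counts = np.array([trace.count(op) for op in vocab], dtype=float)
    return counts / max(len(trace), 1)

def eigen_subspace_filter(X, k=2, keep=5):
    """Illustrative subspace filter: rank features by their weight in the
    top-k eigenvectors of the feature covariance and keep the strongest."""
    C = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(C)          # eigenvalues in ascending order
    top = vecs[:, -k:]                      # dominant subspace
    weight = np.abs(top).sum(axis=1)        # per-feature contribution
    return np.argsort(weight)[::-1][:keep]  # indices of retained features
```

The retained feature indices would then be used to train the SVM on a reduced histogram.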

Relevance: 10.00%

Abstract:

In intelligent video surveillance systems, scalability (of the number of simultaneous video streams) is important. Two key factors which hinder scalability are the time spent in decompressing the input video streams, and the limited computational power of the processor. This paper demonstrates how a combination of algorithmic and hardware techniques can overcome these limitations, and significantly increase the number of simultaneous streams. The techniques used are processing in the compressed domain, and exploitation of the multicore and vector processing capability of modern processors. The paper presents a system which performs background modeling, using a Mixture of Gaussians approach. This is an important first step in the segmentation of moving targets. The paper explores the effects of reducing the number of coefficients in the compressed domain, in terms of throughput speed and quality of the background modeling. The speedups achieved by exploiting compressed domain processing, multicore and vector processing are explored individually. Experiments show that a combination of all these techniques can give a speedup of 170 times on a single CPU compared to a purely serial, spatial domain implementation, with a slight gain in quality.
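A per-pixel Mixture of Gaussians background model of the kind described can be sketched as follows. This is a simplified grayscale version with a shared learning rate and illustrative parameter values; the paper's compressed-domain processing and multicore/vector optimizations are omitted.

```python
import numpy as np

class MixtureBackground:
    """Per-pixel Mixture-of-Gaussians background model for 2-D grayscale
    frames (simplified sketch: fixed K, shared learning rate)."""

    def __init__(self, shape, k=3, alpha=0.05, match_sigma=2.5):
        self.mu = np.zeros(shape + (k,))          # component means per pixel
        self.var = np.full(shape + (k,), 225.0)   # component variances
        self.w = np.full(shape + (k,), 1.0 / k)   # mixture weights
        self.alpha = alpha
        self.match_sigma = match_sigma

    def update(self, frame):
        d = frame[..., None] - self.mu
        match = d ** 2 < (self.match_sigma ** 2) * self.var
        # foreground = no match to a sufficiently weighted component
        fg = ~(match & (self.w > 0.5 / self.w.shape[-1])).any(axis=-1)
        # update weights, means and variances of matched components
        self.w = (1 - self.alpha) * self.w + self.alpha * match
        rho = self.alpha * match
        self.mu += rho * d
        self.var += rho * (d ** 2 - self.var)
        # pixels where nothing matched: reinitialize the weakest component
        ii, jj = np.where(~match.any(axis=-1))
        kk = self.w.argmin(axis=-1)[ii, jj]
        self.mu[ii, jj, kk] = frame[ii, jj]
        self.var[ii, jj, kk] = 225.0
        self.w /= self.w.sum(axis=-1, keepdims=True)
        return fg
```

Feeding successive frames to `update` returns a foreground mask per frame, the first step toward segmenting moving targets.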

Relevance: 10.00%

Abstract:

Blind steganalysis of JPEG images is addressed by modeling the correlations among the DCT coefficients using K-variate (K = 2) probability density function (p.d.f.) estimates constructed by means of Markov random field (MRF) cliques. The reasoning behind using higher-variate p.d.f.s together with MRF cliques for image steganalysis is explained via a classical detection problem. Although our approach offers many improvements over the current state of the art, it suffers from the high dimensionality and the sparseness of the higher-variate p.d.f.s. Both problems are addressed heuristically by means of dimensionality reduction and feature selection algorithms. The detection accuracy of the proposed methods is evaluated over Memon's (30,000 images) and Goljan's (1,912 images) image sets. It is shown that practically applicable steganalysis systems are possible with a suitable dimensionality reduction technique, and that these systems can, in general, provide improved detection accuracy over the current state of the art. Experimental results also justify this assertion.
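Bivariate p.d.f. features over coefficient cliques might be estimated along these lines. The clipping threshold and the choice of horizontal/vertical neighbor cliques below are assumptions for illustration, not the authors' exact clique system.

```python
import numpy as np

def bivariate_clique_features(dct, T=3):
    """Estimate bivariate p.d.f.s of horizontally and vertically adjacent
    DCT coefficients, clipped to [-T, T] (illustrative clique choice)."""
    c = np.clip(dct, -T, T)
    bins = np.arange(-T, T + 2) - 0.5             # one bin per integer value
    feats = []
    for a, b in ((c[:, :-1], c[:, 1:]),           # horizontal cliques
                 (c[:-1, :], c[1:, :])):          # vertical cliques
        h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=(bins, bins))
        feats.append(h / h.sum())                 # normalize to a p.d.f.
    return np.concatenate([f.ravel() for f in feats])
```

Even this toy version yields 98 dimensions for T = 3, which is why the paper pairs such features with dimensionality reduction and feature selection.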

Relevance: 10.00%

Abstract:

In this paper, the impact of multiple active eavesdroppers on cooperative single carrier systems with multiple relays and multiple destinations is examined. To achieve the secrecy diversity gains in the form of opportunistic selection, a two-stage scheme is proposed for joint relay and destination selection, in which, after the selection of the relay with the minimum effective maximum signal-to-noise ratio (SNR) to a cluster of eavesdroppers, the destination that has the maximum SNR from the chosen relay is selected. In order to accurately assess the secrecy performance, the exact and asymptotic expressions are obtained in closed-form for several security metrics including the secrecy outage probability, the probability of non-zero secrecy rate, and the ergodic secrecy rate in frequency selective fading. Based on the asymptotic analysis, key design parameters such as secrecy diversity gain, secrecy array gain, secrecy multiplexing gain, and power cost are characterized, from which new insights are drawn. Moreover, it is concluded that secrecy performance limits occur when the average received power at the eavesdropper is proportional to the counterpart at the destination. Specifically, for the secrecy outage probability, it is confirmed that the secrecy diversity gain collapses to zero with an outage floor, whereas for the ergodic secrecy rate, it is confirmed that its slope collapses to zero with a capacity ceiling.
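The two-stage selection rule reads directly as code over SNR matrices (a sketch; the variable names and matrix layout are mine):

```python
import numpy as np

def two_stage_selection(snr_relay_eve, snr_relay_dest):
    """Two-stage joint selection (sketch).
    Stage 1: pick the relay whose strongest link to any eavesdropper is
             weakest (min over relays of max over eavesdroppers).
    Stage 2: pick the destination with the best SNR from that relay.
    snr_relay_eve:  (relays, eavesdroppers) SNR matrix
    snr_relay_dest: (relays, destinations) SNR matrix
    """
    relay = np.argmin(snr_relay_eve.max(axis=1))
    dest = np.argmax(snr_relay_dest[relay])
    return relay, dest
```

Stage 1 limits what the eavesdropper cluster can overhear; stage 2 then maximizes the legitimate link, which is what yields the opportunistic secrecy diversity analyzed in the paper.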

Relevance: 10.00%

Abstract:

Geomorphology plays a critical role in two areas of geoforensics: searching the land for surface or buried objects, and sampling or imaging rural crime scenes and control locations as evidence. Most of the associated geoscience disciplines have substantial bodies of work dedicated to their relevance in forensic investigations, yet geomorphology (specifically landforms, their mapping and evolution, soils, and their relationship to geology and biogeography) has had no such exposure. This is strange considering how fundamental to legal enquiries the location of a crime and its evolution are, as this article will demonstrate. This work aims to redress the balance by showing how geomorphology featured in one of the earliest works on forensic science methods and has continued to play a role in the sociology, archaeology, criminalistics and geoforensics of crime. The application of geomorphology in military/humanitarian geography and environmental/engineering forensics is briefly discussed, as these are also regularly reviewed in courts of law.

Relevance: 10.00%

Abstract:

When applying biometric algorithms to forensic verification, false acceptance and false rejection can mean a failure to identify a criminal, or worse, lead to the prosecution of individuals for crimes they did not commit. It is therefore critical that biometric evaluations be performed as accurately as possible to determine their legitimacy as a forensic tool. This paper argues that, for forensic verification scenarios, traditional performance measures are insufficiently accurate. This inaccuracy occurs because existing verification evaluations implicitly assume that an imposter claiming a false identity would claim a random identity rather than consciously selecting a target to impersonate. In addition to describing this new vulnerability, the paper describes a novel Targeted FAR metric that combines the traditional False Acceptance Rate (FAR) measure with a term indicating how performance degrades with the number of potential targets. The paper includes an evaluation of the effects of targeted impersonation on an existing academic face verification system. This evaluation reveals that even with a relatively small number of targets, false acceptance rates can increase significantly, making the analysed biometric systems unreliable.
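The abstract does not give the Targeted FAR formula, but the underlying intuition, that false acceptance grows with the number of candidate targets an impostor can choose from, can be illustrated with a simple independence model. This is my own illustration, not the paper's metric.

```python
def targeted_far(far, n_targets):
    """Illustrative model (not the paper's exact metric): probability that
    an impostor who tries n candidate targets is falsely accepted by at
    least one, assuming independent comparisons at per-comparison rate
    `far`."""
    return 1.0 - (1.0 - far) ** n_targets
```

Under this model a per-comparison FAR of 0.1% already grows to roughly a 9.5% acceptance chance against 100 candidate targets, which mirrors the paper's finding that even small target pools degrade reliability significantly.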

Relevance: 10.00%

Abstract:

Many graph datasets are labelled with discrete and numeric attributes. Most frequent substructure discovery algorithms ignore numeric attributes; in this paper we show how they can be used to improve search performance and discrimination. Our thesis is that the most descriptive substructures are those which are normative both in terms of their structure and in terms of their numeric values. We explore the relationship between graph structure and the distribution of attribute values and propose an outlier-detection step, which is used as a constraint during substructure discovery. By pruning anomalous vertices and edges, more weight is given to the most descriptive substructures. Our method is applicable to multi-dimensional numeric attributes; we outline how it can be extended for high-dimensional data. We support our findings with experiments on transaction graphs and single large graphs from the domains of physical building security and digital forensics, measuring the effect on runtime, memory requirements and coverage of discovered patterns, relative to the unconstrained approach.
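The outlier-detection constraint could be sketched as follows: flag vertices whose numeric attribute vectors lie far from the norm so that substructure discovery can prune them. The per-dimension z-score used here is an illustrative choice; the paper's actual detector for multi-dimensional attributes may differ.

```python
import numpy as np

def prune_outlier_vertices(attrs, z_thresh=3.0):
    """Return a boolean keep-mask over vertices: True where every numeric
    attribute dimension is within z_thresh standard deviations of its
    mean (illustrative constraint for substructure discovery)."""
    attrs = np.asarray(attrs, dtype=float)
    mu = attrs.mean(axis=0)
    sd = attrs.std(axis=0)
    sd[sd == 0] = 1.0                    # constant dimensions: no outliers
    z = np.abs((attrs - mu) / sd)
    return z.max(axis=1) <= z_thresh     # True = keep this vertex
```

Discovery then runs only over the kept vertices (and their incident edges), giving more weight to substructures that are normative in both structure and attribute values.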

Relevance: 10.00%

Abstract:

We compare a suite of Polycyclic Aromatic Hydrocarbons (parent PAHs) in soils and air across an urban area (Belfast, UK). Isomeric PAH ratios suggest that soil PAHs are mainly from a combustion source. Fugacity modelling across a range of soil temperatures predicts that four-ring and larger PAHs, from pyrene to indeno[1,2,3-cd]pyrene, all partition strongly (>98%) to the soil compartment. This modelling also implies that these PAHs do not experience losses through partitioning to other phases (air, water) due to soil temperature effects. Such modelling may help in understanding the overall contaminant distribution in soils. The air and soil data together with the modelling suggest that care must be taken when considering isomeric ratios of compounds with mass lighter than 178 (i.e. phenanthrene and anthracene) in the soil phase. Comparison of duplicate and replicate samples suggests that field sampling of duplicates dominates uncertainty, and that validated methodologies for selection of field duplicates and lab splitting are required. As the urban soil four-ring PAHs are at equilibrium in the soil phase and have characteristic ratios dominated by a combustion source, a single controlling factor over spatial distribution, methods that calculate background concentrations can be compared.
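One isomeric ratio of the kind used to infer a combustion source is Ant/(Ant + Phe). The sketch below uses the 0.10 threshold that is a common convention in the source-apportionment literature, not a value taken from this paper.

```python
def anthracene_ratio(anthracene, phenanthrene):
    """Diagnostic isomer ratio Ant/(Ant + Phe) for PAH source inference.
    Inputs are concentrations in the same units (e.g. ng/g)."""
    total = anthracene + phenanthrene
    if total == 0:
        raise ValueError("both concentrations are zero")
    return anthracene / total

def likely_combustion(anthracene, phenanthrene, threshold=0.10):
    """Values above ~0.10 are conventionally read as combustion
    (pyrogenic) rather than petroleum (petrogenic) sources; the
    threshold is a literature convention, not this paper's."""
    return anthracene_ratio(anthracene, phenanthrene) > threshold
```

The abstract's caveat applies here: for soil-phase compounds lighter than mass 178, such as this phenanthrene/anthracene pair, partitioning effects can distort the ratio.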

Relevance: 10.00%

Abstract:

This chapter presents a novel hand-held instrument capable of real-time in situ detection and identification of heavy metals, along with the potential use of novel taggants in environmental forensic investigations. The proposed system provides the facilities found in a traditional laboratory-based instrument but in a hand-held design, without the need for an associated computer. The electrochemical instrument uses anodic stripping voltammetry, which is a precise and sensitive analytical method with excellent limits of detection. The sensors comprise a small disposable plastic strip of screen-printed electrodes rather than the more common glassy carbon disc and gold electrodes. The system is designed for use by a surveyor on site, allowing them to locate hotspots, thus avoiding the expense and time delay of prior laboratory analysis. This is particularly important in environmental forensic analysis when a site may have been released back to the owner and samples could be compromised on return visits. The system can be used in a variety of situations in environmental assessments, and the data acquired provide a metals fingerprint suitable for input to a database. The proposed novel taggant tracers, based on narrow-band atomic fluorescence, are under development for potential deployment as forensic environmental tracers. The use of discrete fluorescent species in an environmentally stable host has been investigated to replace existing toxic, broadband molecular dye tracers. The narrow-band emission signals offer the potential for tracing a large number of signals in the same environment. This will give increased data accuracy and allow multi-source monitoring of environmental parameters.