64 results for Intrusion Detection, Computer Security, Misuse
Abstract:
Motivation: Microarray experiments generate a high data volume. However, often due to financial or experimental considerations, e.g. lack of sample, there is little or no replication of the experiments or hybridizations. These factors, combined with the intrinsic variability associated with the measurement of gene expression, can result in an unsatisfactory detection rate of differential gene expression (DGE). Our motivation was to provide an easy-to-use measure of the success rate of DGE detection that could find routine use in the design of microarray experiments or in post-experiment assessment.
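The abstract does not give the formula for its measure, but a minimal Monte Carlo sketch (in Python, with invented parameter values) can illustrate how replication, measurement variability, and effect size jointly determine the kind of DGE detection rate such a measure would report:

```python
# Hypothetical sketch: Monte Carlo estimate of the DGE detection rate for a
# two-group microarray comparison, given replicate count, measurement noise,
# and a target log2 fold change. All parameter values are illustrative only.
import numpy as np
from scipy import stats

def detection_rate(n_reps=3, log2_fc=1.0, sigma=0.7, alpha=0.01, n_sim=10_000):
    """Fraction of simulated differentially expressed genes that a two-sample
    t-test flags as significant at level alpha."""
    rng = np.random.default_rng(0)
    control = rng.normal(0.0, sigma, size=(n_sim, n_reps))
    treated = rng.normal(log2_fc, sigma, size=(n_sim, n_reps))
    _, p = stats.ttest_ind(treated, control, axis=1)
    return float(np.mean(p < alpha))

for reps in (2, 3, 5, 8):
    print(reps, "replicates ->", detection_rate(n_reps=reps))
```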
Abstract:
This paper considers a Q-ary orthogonal direct-sequence code-division multiple-access (DS-CDMA) system with high-rate space-time linear dispersion codes (LDCs) in time-varying Rayleigh fading multiple-input-multiple-output (MIMO) channels. We propose a joint multiuser detection, LDC decoding, Q-ary demodulation, and channel-decoding algorithm and apply the turbo processing principle to improve system performance in an iterative fashion. The proposed iterative scheme demonstrates faster convergence and superior performance compared with the V-BLAST-based DS-CDMA system and is shown to approach the single-user performance bound. We also show that the CDMA system is able to exploit the time diversity offered by the LDCs in rapid-fading channels.
Abstract:
Color segmentation of images usually requires a manual selection and classification of samples to train the system. This paper presents an automatic system that performs these tasks without the need for lengthy training, providing a useful tool to detect and identify figures. In real situations the training process must be repeated whenever the lighting conditions change, or when the colors of the figures and the background change within the same scenario, so a fast training method is useful. A direct application of this method is the detection and identification of football players.
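As an illustration of the idea of fast, automatic colour training (not the paper's exact pipeline), the sketch below clusters pixel colours in HSV space so that the colour classes can be re-learned quickly whenever lighting or kit colours change; OpenCV and scikit-learn are assumed available, and the class count is a placeholder:

```python
# Illustrative sketch: unsupervised colour "training" by clustering pixel
# colours, so no manual sample labelling is needed; re-running it after a
# lighting change re-learns the colour classes.
import numpy as np
import cv2                       # OpenCV, assumed available
from sklearn.cluster import KMeans

def learn_color_classes(bgr_image, n_classes=4):
    """Cluster pixels in HSV space; returns the fitted model (one class per
    dominant colour, e.g. pitch, ball, and the two teams' kits)."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    samples = hsv.reshape(-1, 3).astype(np.float32)
    return KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit(samples)

def segment(bgr_image, model):
    """Label every pixel with the index of its nearest colour class."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    labels = model.predict(hsv.reshape(-1, 3).astype(np.float32))
    return labels.reshape(bgr_image.shape[:2])
```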
Abstract:
In this paper we propose a statistical model for detection and tracking of the human silhouette and the corresponding 3D skeletal structure in gait sequences. We follow a point distribution model (PDM) approach using principal component analysis (PCA). The problem of non-linear PCA is partially resolved by applying a different PDM depending on the pose (frontal, lateral, or diagonal), which is estimated by Fisher's linear discriminant. Additionally, the fitting is carried out by selecting the closest allowable shape from the training set by means of a nearest-neighbor classifier. To improve the performance of the model we develop a human gait analysis that takes temporal dynamics into account when tracking the human body. The incorporation of temporal constraints on the model increases reliability and robustness.
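A minimal sketch of the PDM/PCA machinery described above, assuming aligned landmark shapes stored as flat coordinate vectors; the nearest-neighbour "closest allowable shape" step is included, while pose selection and temporal tracking are omitted:

```python
# Minimal sketch, assuming shapes is an (n_shapes, 2*n_landmarks) array of
# aligned landmark coordinates: a PCA point distribution model plus the
# nearest-neighbour constraint. Illustrative only, not the authors' code.
import numpy as np

class PointDistributionModel:
    def __init__(self, shapes, var_kept=0.95):
        self.mean = shapes.mean(axis=0)
        centred = shapes - self.mean
        # Eigen-decomposition of the landmark covariance matrix (PCA).
        cov = np.cov(centred, rowvar=False)
        vals, vecs = np.linalg.eigh(cov)
        order = np.argsort(vals)[::-1]
        vals, vecs = vals[order], vecs[:, order]
        k = np.searchsorted(np.cumsum(vals) / vals.sum(), var_kept) + 1
        self.modes = vecs[:, :k]          # retained deformation modes
        self.training = shapes            # kept for nearest-neighbour lookup

    def fit(self, shape):
        """Project a candidate shape into model space, reconstruct it, then
        snap to the closest allowable training shape (nearest neighbour)."""
        b = self.modes.T @ (shape - self.mean)
        reconstructed = self.mean + self.modes @ b
        d = np.linalg.norm(self.training - reconstructed, axis=1)
        return self.training[np.argmin(d)]
```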
Abstract:
The ability of millimetre wave and terahertz systems to penetrate clothing is well known. The fact that the transmission of clothing and the reflectivity of the body vary as a function of frequency is less so. Several instruments have now been developed to exploit this capability. The choice of operating frequency, however, has often been associated with the maturity and the cost of the enabling technology rather than a sound systems engineering approach. Top level user and systems requirements have been derived to inform the development of design concepts. Emerging micro- and nanotechnology concepts have been reviewed, and we have demonstrated how these can be evaluated against these requirements by simulation using OpenFx. OpenFx is an open source suite of 3D tools for modeling, animation and visualization which has been modified for use at millimeter waves. © 2012 SPIE.
Abstract:
The techniques and technologies currently being investigated to detect weapons and contraband concealed on persons under clothing are reviewed. The basic phenomenology of the atmosphere and materials that must be understood in order to realize such a system are discussed. The component issues and architectural designs needed to realize systems are outlined. Some conclusions with respect to further technology developments are presented.
Abstract:
Passive equipment operating in the 30-300 GHz (millimeter wave) band is compared to that operating in the 300 GHz-3 THz (submillimeter) band. Equipment operating in the submillimeter band can measure distance as well as spectral information and has been used to address new opportunities in security. Solid state spectral information is available in the submillimeter region, making it possible to identify materials, whereas in the millimeter region bulk optical properties determine the image contrast. The optical properties in the region from 30 GHz to 3 THz are discussed for some typical inorganic and organic solids. In the millimeter-wave region of the spectrum, obscurants such as poor weather, dust, and smoke can be penetrated and useful imagery generated for surveillance. In the 30 GHz-3 THz region dielectrics such as plastic and cloth are also transparent, and the detection of contraband hidden under clothing is possible. A passive millimeter-wave imaging concept based on a folded Schmidt camera has been developed and applied to poor weather navigation and security. The optical design uses a rotating mirror and is folded using polarization techniques. The design is very well corrected over a wide field of view, making it ideal for surveillance and security. This produces a relatively compact imager which minimizes the receiver count.
Abstract:
N-gram analysis is an approach that investigates the structure of a program using bytes, characters, or text strings. A key issue with N-gram analysis is feature selection amidst the explosion of features that occurs when N is increased. The experiments within this paper represent programs as operational code (opcode) density histograms gained through dynamic analysis. A support vector machine is used to create a reference model, which is then used to evaluate two feature-reduction methods: 'area of intersect' and 'subspace analysis using eigenvectors.' The findings show that the relationships between features are complex and that simple statistical filtering approaches do not provide a viable approach. However, eigenvector subspace analysis produces a suitable filter.
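A hedged sketch of the general pipeline (not the paper's code or data): opcode-density histograms reduced to an eigenvector subspace via PCA and classified with an SVM, using scikit-learn and randomly generated placeholder features:

```python
# Sketch of an eigenvector-subspace (PCA) filter on opcode-density histograms
# followed by an SVM classifier. The feature matrix X and labels y are dummy
# placeholders, not the paper's dataset.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((200, 150))         # 200 programs x 150 opcode densities (dummy)
X /= X.sum(axis=1, keepdims=True)  # normalise each histogram
y = rng.integers(0, 2, size=200)   # benign / malicious labels (dummy)

model = make_pipeline(PCA(n_components=20), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```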
Abstract:
Marine dinoflagellates of the genus Alexandrium are well known producers of the potent neurotoxic paralytic shellfish toxins that can enter the food web and ultimately present a serious risk to public health, in addition to causing huge economic losses. Direct coastal monitoring of Alexandrium spp. can provide early warning of potential shellfish contamination and risks to consumers, and so a rapid, sensitive, portable and easy-to-use assay has been developed for this purpose using an innovative planar waveguide device. The disposable planar waveguide consists of a transparent substrate onto which an array of toxin-protein conjugates is deposited, assembled in a cartridge allowing the introduction of sample and detection reagents. The competitive assay format uses a high affinity antibody to paralytic shellfish toxins, with a detection signal generated via a fluorescently labelled secondary antibody. The waveguide cartridge is analysed by a simple reader device and results are displayed on a laptop computer. Assay speed has been optimised to enable measurement within 15 min. A rapid, portable sample preparation technique was developed for Alexandrium spp. in seawater to ensure analysis was completed within a short period of time. The assay was validated and the LOD and CCβ were determined as 12 pg/mL and 20 pg/mL respectively, with an intra-assay CV of 11.3% at the CCβ and an average recovery of 106%. The highly innovative assay was proven to accurately detect toxin presence in algae sampled from US and European waters at an unprecedented cell density of 10 cells/L. © 2012 Elsevier B.V. All rights reserved.
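For readers unfamiliar with the validation figures quoted above, the following sketch shows how such statistics are commonly computed from replicate measurements; the numbers and the blank-plus-3-SD LOD convention are illustrative assumptions, not the paper's actual procedure or data:

```python
# Illustrative sketch only (all numbers invented): common calculations behind
# limit of detection, intra-assay CV, and recovery for a validated assay.
import numpy as np

blank = np.array([1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 1.1, 1.0, 0.9, 1.2])  # pg/mL
lod = blank.mean() + 3 * blank.std(ddof=1)            # blank mean + 3 SD

replicates = np.array([19.1, 21.4, 18.7, 22.0, 20.3])  # spiked at 20 pg/mL
cv_percent = 100 * replicates.std(ddof=1) / replicates.mean()
recovery_percent = 100 * replicates.mean() / 20.0

print(f"LOD ~ {lod:.1f} pg/mL, CV {cv_percent:.1f}%, recovery {recovery_percent:.0f}%")
```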
Abstract:
Previous research based on theoretical simulations has shown the potential of the wavelet transform to detect damage in a beam by analysing the time-deflection response due to a constant moving load. However, its application to identify damage from the response of a bridge to a vehicle raises a number of questions. Firstly, it may be difficult to record the difference in the deflection signal between a healthy and a slightly damaged structure to the required level of accuracy and high scanning frequencies in the field. Secondly, the bridge will have a road profile and it will be loaded by a sprung vehicle and time-varying forces rather than a constant load. Therefore, an algorithm based on a plot of wavelet coefficients versus time to detect damage (a singularity in the plot) appears to be very sensitive to noise. This paper addresses these questions by: (a) using the acceleration signal, instead of the deflection signal, (b) employing a vehicle-bridge finite element interaction model, and (c) developing a novel wavelet-based approach using wavelet energy content at each bridge section, which proves to be more sensitive to damage than a wavelet coefficient line plot at a given scale as employed by others.
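A simplified sketch of the wavelet-energy idea, under the assumption that each bridge section contributes one acceleration record; it uses the continuous wavelet transform from PyWavelets on synthetic signals, so it illustrates the concept rather than the authors' vehicle-bridge interaction model:

```python
# Sketch: compare total continuous-wavelet-transform energy of acceleration
# records; a localised rise in energy flags a possible damage site. Signals
# and the injected discontinuity below are synthetic placeholders.
import numpy as np
import pywt

def section_wavelet_energy(acceleration, scales=None, wavelet="mexh"):
    """Sum of squared CWT coefficients for one section's acceleration signal."""
    if scales is None:
        scales = np.arange(1, 64)
    coeffs, _ = pywt.cwt(acceleration, scales, wavelet)
    return float(np.sum(coeffs ** 2))

# Dummy signals: the 'damaged' record carries a small local singularity.
t = np.linspace(0, 1, 2000)
rng = np.random.default_rng(0)
healthy = np.sin(2 * np.pi * 4 * t) + 0.05 * rng.standard_normal(t.size)
damaged = healthy.copy()
damaged[1000:1010] += 0.3          # localised discontinuity

print("healthy energy:", section_wavelet_energy(healthy))
print("damaged energy:", section_wavelet_energy(damaged))
```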
Abstract:
The scheduling problem in distributed data-intensive computing environments has become an active research topic due to the tremendous growth in grid and cloud computing environments. As an innovative distributed intelligent paradigm, swarm intelligence provides a novel approach to solving these potentially intractable problems. In this paper, we formulate the scheduling problem for workflow applications with security constraints in distributed data-intensive computing environments and present a novel security constraint model. Several meta-heuristic adaptations to the particle swarm optimization algorithm are introduced to deal with the formulation of efficient schedules. A variable neighborhood particle swarm optimization algorithm is compared with a multi-start particle swarm optimization and a multi-start genetic algorithm. Experimental results illustrate that population-based meta-heuristic approaches usually provide a good balance between global exploration and local exploitation, and demonstrate their feasibility and effectiveness for scheduling workflow applications. © 2010 Elsevier Inc. All rights reserved.
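To make the particle swarm formulation concrete, here is a minimal PSO sketch for a toy version of the problem: tasks are assigned to nodes so as to minimise makespan, with a penalty whenever a task's security demand exceeds its node's security level. All problem data and PSO parameters are invented, and the variable-neighborhood and multi-start variants compared in the paper are not included:

```python
# Minimal PSO sketch for security-constrained task-to-node scheduling.
# Problem sizes, workloads, speeds, and security levels are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n_tasks, n_nodes = 20, 4
task_len = rng.uniform(1, 10, n_tasks)          # task workloads
node_speed = rng.uniform(1, 3, n_nodes)         # node processing speeds
task_sec = rng.uniform(0, 1, n_tasks)           # task security demands
node_sec = rng.uniform(0, 1, n_nodes)           # node security levels

def cost(position):
    """Continuous particle position -> task-to-node assignment -> makespan
    plus a penalty for violated security constraints."""
    assign = np.clip(position, 0, n_nodes - 1e-9).astype(int)
    finish = np.zeros(n_nodes)
    for t, n in enumerate(assign):
        finish[n] += task_len[t] / node_speed[n]
    penalty = np.sum(np.maximum(task_sec - node_sec[assign], 0))
    return finish.max() + 10.0 * penalty

n_particles, iters, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
pos = rng.uniform(0, n_nodes, (n_particles, n_tasks))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, n_nodes - 1e-9)
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("best makespan + penalty:", pbest_cost.min())
```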