466 results for IR DETECTION MODULES
Abstract:
News blog hot topics are important for information recommendation services and marketing. However, information overload and personalized management make information arrangement more difficult. Moreover, what influences the formation and development of blog hot topics has received little attention. In order to correctly detect news blog hot topics, this paper first analyzes the development of topics from a new perspective based on the W2T (Wisdom Web of Things) methodology. Namely, the characteristics of blog users, the context of topic propagation and information granularity are unified to analyze the related problems. Factors such as user behavior patterns, network opinion and opinion leaders are subsequently identified as important for the development of topics. Then a topic model based on the view of event reports is constructed. Finally, hot topics are identified by their duration, topic novelty, degree of topic growth and degree of user attention. The experimental results show that the proposed method is feasible and effective.
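The abstract identifies four indicators for hotness: duration, topic novelty, degree of topic growth, and degree of user attention. A minimal sketch of how such a combined score might be computed is given below; the weights, field names, and saturation of the duration term are illustrative assumptions, not the paper's actual model.

```python
from dataclasses import dataclass

@dataclass
class Topic:
    duration_days: float  # how long the topic has been discussed
    novelty: float        # 0..1, dissimilarity to earlier topics
    growth: float         # 0..1, relative increase in related posts
    attention: float      # 0..1, normalized user views/comments

def hotness(t: Topic, w=(0.2, 0.3, 0.25, 0.25)) -> float:
    """Weighted combination of the four indicators; the duration term
    saturates so very old topics do not dominate the score."""
    return (w[0] * t.duration_days / (1 + t.duration_days)
            + w[1] * t.novelty
            + w[2] * t.growth
            + w[3] * t.attention)

def hot_topics(topics, threshold=0.5):
    """Keep only topics whose combined score exceeds the threshold."""
    return [t for t in topics if hotness(t) >= threshold]
```

The threshold and weights would in practice be tuned against labeled hot/cold topics.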
Abstract:
Knowledge of the elements present in house dusts is important in understanding potential health effects on humans. In this study, dust samples collected from 10 houses in south-east Queensland have been analysed by scanning electron microscopy and X-ray microanalysis to measure the inorganic element compositions and to investigate the form of heavy metals in the dusts. The overall analytical results were then used to discriminate between different localities using chemometric techniques. The relative amounts of elements, particularly of Si, Ca, and Fe, varied between size fractions and between different locations for the same size fraction. By analysing individual small particles, many other constituents were identified including Ti, Cr, Mn, Ni, Cu, Zn, Ba, Ag, W, Au, Hg, Pb, Bi, La and Ce. The heavy metals were mostly concentrated in small particles in the smaller size fractions, which allowed detection by particle analysis, though their average concentrations were very low.
Abstract:
Static analysis is an approach for checking the source code or compiled code of applications before execution. Chess and McGraw state that static analysis promises to identify common coding problems automatically. While manual code review is also a form of static analysis, software tools are used in most cases to perform the checks. Chess and McGraw additionally claim that good static checkers can help to spot and eradicate common security bugs.
Abstract:
We propose CIMD (Collaborative Intrusion and Malware Detection), a scheme for the realization of collaborative intrusion detection approaches. We argue that teams, i.e. detection groups with a common purpose for intrusion detection and response, improve the measures against malware. CIMD provides a collaboration model, a decentralized group formation and an anonymous communication scheme. Participating agents can convey intrusion detection related objectives and associated interests for collaboration partners. These interests are based on an intrusion detection related ontology, incorporating network and hardware configurations and detection capabilities. Anonymous communication provided by CIMD allows communication beyond suspicion, i.e. the adversary cannot perform better than guessing an IDS to be the source of a message at random. The evaluation takes place with the help of NeSSi² (www.nessi2.de), the Network Security Simulator, a dedicated environment for analysis of attacks and countermeasures in mid-scale and large-scale networks. A CIMD prototype is being built based on the JIAC agent framework (www.jiac.de).
Abstract:
This paper presents a formal methodology for attack modeling and detection for networks. Our approach has three phases. First, we extend the basic attack tree approach [1] to capture (i) the temporal dependencies between components, and (ii) the expiration of an attack. Second, using the enhanced attack trees (EAT) we build a tree automaton that accepts a sequence of actions from the input stream if there is a traversal of an attack tree from the leaves to the root node. Finally, we show how to construct an enhanced parallel automaton (EPA) that has each tree automaton as a subroutine and can process the input stream by considering multiple trees simultaneously. As a case study, we show how to represent the attacks in IEEE 802.11 and construct an EPA for it.
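The two extensions named above, temporal ordering of components and attack expiration, can be illustrated with a simple stream matcher. This is a hedged stand-in for the paper's tree-automaton construction, not its actual EAT/EPA algorithm; the action names and the expiry parameter are our own assumptions.

```python
def matches_attack(stream, required, expiry):
    """Return True if the actions in `required` occur in order in `stream`
    (a list of (timestamp, action) pairs), with the whole sequence
    completing within `expiry` time units of the first matched action."""
    start = None
    idx = 0
    for t, action in stream:
        if start is not None and t - start > expiry:
            return False  # the partial attack has expired
        if action == required[idx]:
            if start is None:
                start = t  # first component of the attack observed
            idx += 1
            if idx == len(required):
                return True  # the whole ordered sequence was seen in time
    return False
```

A real EPA would track many such sequences in parallel over the same stream, one per attack tree.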
Abstract:
Our daily lives become more and more dependent upon smartphones due to their increased capabilities. Smartphones are used in various ways, e.g. for payment systems or assisting the lives of elderly or disabled people. Security threats for these devices become more and more dangerous since there is still a lack of proper security tools for protection. Android emerges as an open smartphone platform which allows modification even on operating system level and where third-party developers for the first time have the opportunity to develop kernel-based low-level security tools. Android quickly gained popularity among smartphone developers and even beyond since it is based on Java on top of "open" Linux, in comparison to former proprietary platforms which have very restrictive SDKs and corresponding APIs. Symbian OS, holding the greatest market share among all smartphone OSs, was even closing critical APIs to common developers and introduced application certification. This was done since this OS was the main target for smartphone malware in the past. In fact, more than 290 malware variants designed for Symbian OS appeared from July 2004 to July 2008. Android, in turn, promises to be completely open source. Together with the Linux-based smartphone OS OpenMoko, open smartphone platforms may attract malware writers to create malicious applications endangering critical smartphone applications and the owners' privacy. Since signature-based approaches mainly detect known malware, anomaly-based approaches can be a valuable addition to these systems. They are based on mathematical algorithms processing data that describe the state of a certain device. For gaining this data, a monitoring client is needed that has to extract usable information (features) from the monitored system. Our approach follows a dual system for analyzing these features. On the one hand, functionality for on-device light-weight detection is provided.
But since most algorithms are resource-exhaustive, remote feature analysis is provided on the other hand. Having this dual system enables event-based detection that can react to the current detection need. In our ongoing research we aim to investigate the feasibility of light-weight on-device detection for certain occasions. On other occasions, whenever significant changes are detected on the device, the system can trigger remote detection with heavy-weight algorithms for better detection results. In the absence of the server, or as a supplementary approach, we also consider a collaborative scenario. Here, mobile devices sharing a common objective are enabled by a collaboration module to share information, such as intrusion detection data and results. This is based on an ad-hoc network mode that can be provided by a WiFi or Bluetooth adapter nearly every smartphone possesses.
Abstract:
Anomaly detection compensates for shortcomings of signature-based detection, for example by protecting against zero-day exploits. However, anomaly detection can be resource-intensive and is plagued by a high false-positive rate. In this work, we address these problems by presenting a cooperative intrusion detection approach for the Artificial Immune System (AIS) as an example of an anomaly detection approach. In particular, we show how the cooperative approach reduces the false-positive rate of the detection and how the overall detection process can be organized to account for the resource constraints of the participating devices. Evaluations are carried out with the novel network simulation environment NeSSi as well as formally with an extension to the epidemic spread model SIR.
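The formal evaluation mentioned above builds on the classic SIR epidemic model. A minimal discrete-time (explicit-Euler) SIR sketch is shown below for orientation; the paper uses an extension of this model, and the parameter values here are illustrative only.

```python
def sir_step(s, i, r, beta, gamma, dt=1.0):
    """One explicit-Euler step of the classic SIR model.
    s, i, r are population fractions; beta is the infection rate,
    gamma the recovery rate."""
    new_inf = beta * s * i * dt   # susceptible -> infected
    new_rec = gamma * i * dt      # infected -> recovered
    return s - new_inf, i + new_inf - new_rec, r + new_rec

def simulate(s0, i0, beta, gamma, steps):
    """Return the trajectory [(s, i, r), ...] over `steps` steps."""
    s, i, r = s0, i0, 0.0
    hist = [(s, i, r)]
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma)
        hist.append((s, i, r))
    return hist
```

In a malware-spread setting, "infected" hosts would correspond to compromised devices and "recovered" hosts to devices where detection and response have succeeded.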
Abstract:
The power of testing for a population-wide association between a biallelic quantitative trait locus and a linked biallelic marker locus is predicted both empirically and deterministically for several tests. The tests were based on the analysis of variance (ANOVA) and on a number of transmission disequilibrium tests (TDT). Deterministic power predictions made use of family information, and were functions of population parameters including linkage disequilibrium, allele frequencies, and recombination rate. Deterministic power predictions were very close to the empirical power from simulations in all scenarios considered in this study. The different TDTs had very similar power, intermediate between one-way and nested ANOVAs. One-way ANOVA was the only test that was not robust against spurious disequilibrium. Our general framework for predicting power deterministically can be used to predict power in other association tests. Deterministic power calculations are a powerful tool for researchers to plan and evaluate experiments and obviate the need for elaborate simulation studies.
Abstract:
Vibration Based Damage Identification Techniques, which use modal data or their functions, have received significant research interest in recent years due to their ability to detect damage in structures and hence contribute towards the safety of the structures. In this context, Strain Energy Based Damage Indices (SEDIs), based on modal strain energy, have been successful in localising damage in structures made of homogeneous materials such as steel. However, their application to reinforced concrete (RC) structures needs further investigation due to the significant difference in the prominent damage type, the flexural crack. The work reported in this paper is an integral part of a comprehensive research program to develop and apply effective strain energy based damage indices to assess damage in reinforced concrete flexural members. This research program established (i) a suitable flexural crack simulation technique, (ii) four improved SEDIs and (iii) programmable sequential steps to minimise effects of noise. This paper evaluates and ranks the four newly developed SEDIs and seven existing SEDIs for their ability to detect and localise flexural cracks in RC beams. Based on the results of the evaluations, it recommends the SEDIs for use with single and multiple vibration modes.
Abstract:
This paper provides a new general approach for defining coherent generators in power systems based on their coherency in low-frequency inter-area modes. The disturbance is considered to be distributed in the network by applying random load changes (a random-walk representation of real loads) instead of a single fault, and coherent generators are obtained by spectral analysis of the generators' velocity variations. In order to find the coherent areas and their borders in interconnected networks, non-generating buses are assigned to each group of coherent generators using similar coherency detection techniques. The method is evaluated on two test systems, and coherent generators and areas are obtained for different operating points, providing a more accurate grouping approach that is valid across a range of realistic operating points of the system.
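The grouping step can be pictured with a much simpler stand-in: generators whose velocity signals are strongly correlated are placed in the same coherent group. This pairwise-correlation greedy clustering is an illustrative assumption, not the paper's spectrum-analysis method.

```python
def correlation(x, y):
    """Pearson correlation coefficient of two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def coherent_groups(signals, threshold=0.9):
    """Greedy grouping: a generator joins the first group whose seed
    signal it correlates with above `threshold`, else starts a new group.
    `signals` maps generator id -> velocity-variation samples."""
    groups = []
    for gid, sig in signals.items():
        for g in groups:
            if correlation(sig, signals[g[0]]) >= threshold:
                g.append(gid)
                break
        else:
            groups.append([gid])
    return groups
```

A spectral variant would first band-pass the signals around the low-frequency inter-area modes before correlating.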
Abstract:
Phylogenetic inference from sequences can be misled by both sampling (stochastic) error and systematic error (nonhistorical signals where reality differs from our simplified models). A recent study of eight yeast species using 106 concatenated genes from complete genomes showed that even small internal edges of a tree received 100% bootstrap support. This effective negation of stochastic error from large data sets is important, but longer sequences exacerbate the potential for biases (systematic error) to be positively misleading. Indeed, when we analyzed the same data set using minimum evolution optimality criteria, an alternative tree received 100% bootstrap support. We identified a compositional bias as responsible for this inconsistency and showed that it is reduced effectively by coding the nucleotides as purines and pyrimidines (RY-coding), reinforcing the original tree. Thus, a comprehensive exploration of potential systematic biases is still required, even though genome-scale data sets greatly reduce sampling error.
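The RY-coding step described above is simple: purines (A, G) are recoded as R and pyrimidines (C, T) as Y, discarding the compositional signal within each class. A minimal sketch (the function name is our own):

```python
# Purines (A, G) -> R; pyrimidines (C, T) -> Y.
RY = {"A": "R", "G": "R", "C": "Y", "T": "Y"}

def ry_code(seq: str) -> str:
    """Recode a nucleotide sequence as purines/pyrimidines,
    leaving gaps and ambiguity codes untouched."""
    return "".join(RY.get(base, base) for base in seq.upper())
```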
Abstract:
This paper presents a study whereby a series of tests was undertaken using a naturally aspirated 4-cylinder, 2.216 litre Perkins Diesel engine fitted with a piston having an undersized skirt. This experimental simulation resulted in engine running conditions that included abnormally high levels of piston slap occurring in one of the cylinders. The detectability of the resultant Diesel engine piston slap was investigated using acoustic emission signals. Data corresponding to both normal and piston slap engine running conditions was captured using acoustic emission transducers along with both in-cylinder pressure and top-dead-centre reference signals. Using these signals it was possible to demonstrate that the increased piston slap running conditions were distinguishable by monitoring the piston slap events occurring near the piston mid-stroke positions. However, when monitoring the piston slap events occurring near the TDC/BDC piston stroke positions, the normal and excessive piston slap engine running conditions were not clearly distinguishable.
Abstract:
Quantitative imaging methods to analyze cell migration assays are not standardized. Here we present a suite of two-dimensional barrier assays describing the collective spreading of an initially-confined population of 3T3 fibroblast cells. To quantify the motility rate we apply two different automatic image detection methods to locate the position of the leading edge of the spreading population after 24, 48 and 72 hours. These results are compared with a manual edge detection method where we systematically vary the detection threshold. Our results indicate that the observed spreading rates are very sensitive to the choice of image analysis tools and we show that a standard measure of cell migration can vary by as much as 25% for the same experimental images depending on the details of the image analysis tools. Our results imply that it is very difficult, if not impossible, to meaningfully compare previously published measures of cell migration since previous results have been obtained using different image analysis techniques and the details of these techniques are not always reported. Using a mathematical model, we provide a physical interpretation of our edge detection results. The physical interpretation is important since edge detection algorithms alone do not specify any physical measure, or physical definition, of the leading edge of the spreading population. Our modeling indicates that variations in the image threshold parameter correspond to a consistent variation in the local cell density. This means that varying the threshold parameter is equivalent to varying the location of the leading edge in the range of approximately 1–5% of the maximum cell density.
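The threshold sensitivity discussed above can be made concrete with a toy example: given a one-dimensional profile of cell density along the spreading direction, the detected leading edge is the outermost position where density still exceeds a chosen fraction of the maximum. The profile values and function name below are illustrative assumptions, not the paper's data or algorithm.

```python
def leading_edge(density, threshold_frac):
    """Return the index of the farthest position whose density is at
    least `threshold_frac` times the maximum density in the profile."""
    cutoff = threshold_frac * max(density)
    edge = 0
    for pos, d in enumerate(density):
        if d >= cutoff:
            edge = pos  # keep the outermost qualifying position
    return edge
```

Moving the threshold from a few percent of the maximum density up to half of it shifts the detected edge inward, which is exactly the effect the abstract quantifies as a 25% variation in measured spreading rate.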
Abstract:
Monitoring fetal wellbeing is a compelling problem in modern obstetrics. Clinicians have become increasingly aware of the link between fetal activity (movement), well-being, and later developmental outcome. We have recently developed an ambulatory accelerometer-based fetal activity monitor (AFAM) to record 24-hour fetal movement. Using this system, we aim at developing signal processing methods to automatically detect and quantitatively characterize fetal movements. The first step in this direction is to test the performance of the accelerometer in detecting fetal movement against real-time ultrasound imaging (taken as the gold standard). This paper reports first results of this performance analysis.