976 results for measuring instruments


Relevance: 20.00%

Abstract:

In this preliminary case study, we investigate how inconsistency in a network intrusion detection rule set can be measured. To achieve this, we first examine the structure of these rules, which incorporate regular expression (Regex) pattern matching. We then identify primitive elements in these rules so that the rules can be translated into their (equivalent) logical forms and connections established between them. Additional rules from background knowledge are also introduced to make the correlations among rules more explicit. Finally, we measure the degree of inconsistency of formulae in such a rule set (using the Scoring function, Shapley inconsistency values and the Blame measure for prioritized knowledge) and compare the informativeness of these measures. We conclude that such measures are useful for the network intrusion domain, provided that incorporating domain knowledge to correlate rules is feasible.
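As a rough, self-contained illustration of the kind of computation such measures involve (and not the implementation used in this study), the sketch below computes Shapley inconsistency values over a tiny hypothetical propositional base of three rule-derived formulas, using the drastic inconsistency measure and a brute-force consistency check; the atoms, the rules r1-r3 and the choice of measure are all assumptions made purely for the example.

    # Illustrative sketch (not the paper's code): Shapley inconsistency values
    # over a tiny hypothetical propositional base, using the drastic measure
    # I_d(C) = 1 if C is inconsistent, 0 otherwise.
    from itertools import combinations, product
    from math import factorial

    ATOMS = ["scan", "alert"]                            # hypothetical atoms

    # Each formula is a function from a truth assignment to True/False.
    BASE = {
        "r1": lambda v: (not v["scan"]) or v["alert"],   # scan -> alert
        "r2": lambda v: v["scan"],                       # scan
        "r3": lambda v: not v["alert"],                  # not alert
    }

    def consistent(names):
        """Brute-force satisfiability check over all truth assignments."""
        if not names:
            return True
        for bits in product([False, True], repeat=len(ATOMS)):
            v = dict(zip(ATOMS, bits))
            if all(BASE[n](v) for n in names):
                return True
        return False

    def drastic(names):
        return 0 if consistent(names) else 1

    def shapley(name, measure=drastic):
        """Shapley value of `name` in the coalition game C -> measure(C)."""
        others = [n for n in BASE if n != name]
        n = len(BASE)
        value = 0.0
        for r in range(len(others) + 1):
            for combo in combinations(others, r):
                weight = factorial(r) * factorial(n - r - 1) / factorial(n)
                value += weight * (measure(combo + (name,)) - measure(combo))
        return value

    for rule in BASE:
        print(rule, round(shapley(rule), 3))             # each gets 1/3 here

On this toy base only the full set {r1, r2, r3} is inconsistent, so each formula receives an equal share (1/3) of the inconsistency; measures for prioritized knowledge, such as the Blame measure, refine exactly this kind of attribution.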

Relevance: 20.00%

Abstract:

Measuring the degree of inconsistency of a belief base is an important issue in many real-world applications. It has been increasingly recognized that deriving syntax-sensitive inconsistency measures for a belief base from its minimal inconsistent subsets is a natural way forward. Most current proposals along this line do not take into account the size of each minimal inconsistent subset. However, as illustrated by the well-known Lottery Paradox, the degree of inconsistency of a minimal inconsistent subset decreases as its size increases. Another gap in current work concerns the role of free formulas of a belief base in measuring the degree of inconsistency; this role has not yet been well characterized. Adding free formulas to a belief base enlarges the set of consistent subsets of that base, and consistent subsets also affect syntax-sensitive normalized measures of the degree of inconsistency: each consistent subset can be considered a distinctive plausible perspective reflected by the belief base, whilst each minimal inconsistent subset projects a distinctive view of the inconsistency. To address these two issues, we propose a normalized framework for measuring the degree of inconsistency of a belief base which unifies the impact of both consistent subsets and minimal inconsistent subsets. We also show that this normalized framework satisfies all the properties deemed necessary by common consent to characterize an intuitively satisfactory measure of the degree of inconsistency for belief bases. Finally, we use a simple but explanatory example in requirements engineering to illustrate the application of the normalized framework.
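For orientation only (the normalized framework itself is defined in the paper), two widely cited measures built from the set MI(K) of minimal inconsistent subsets of a belief base K illustrate the size issue raised above:

    I_{\mathrm{MI}}(K) = |\mathrm{MI}(K)|,
    \qquad
    I_{\mathrm{MIV}_C}(K) = \sum_{M \in \mathrm{MI}(K)} \frac{1}{|M|}.

The first counts conflicts regardless of their size, whereas the second lets a two-formula conflict contribute 1/2 but a ten-formula, lottery-style conflict only 1/10, matching the intuition above; the framework proposed here additionally normalizes such values using the consistent subsets of K.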

Relevance: 20.00%

Abstract:

In this paper I examine the scope of publicly available information on the religious composition of employees in private-sector companies in Northern Ireland. I highlight the unavailability of certain types of monitoring data and the impact of data aggregation at company as opposed to site level. Both oversights lead to underestimates of the extent of workplace segregation in Northern Ireland. The ability to provide more-coherent data on workplace segregation, by religion, in Northern Ireland is crucial in terms of advancing equality and other social-justice agendas. I argue that a more-accurate monitoring of religious composition of workplaces is part of an overall need to develop a spatial approach in which the importance of ethnically territorialised spaces in the reproduction of ethnosectarian disputation is understood.

Relevance: 20.00%

Abstract:

In this preliminary study, we investigate how inconsistency in a network intrusion detection rule set can be measured. To achieve this, we first examine the structure of these rules, which are based on Snort and incorporate regular expression (Regex) pattern matching. We then identify primitive elements in these rules so that the rules can be translated into their (equivalent) logical forms and connections established between them. Additional rules from background knowledge are also introduced to make the correlations among rules more explicit. We measure the degree of inconsistency of formulae in such a rule set (using the Scoring function, Shapley inconsistency values and the Blame measure for prioritized knowledge) and compare the informativeness of these measures. Finally, we propose a new measure of inconsistency for prioritized knowledge which incorporates the normalized number of atoms of the language involved in the inconsistency, providing a deeper inspection of inconsistent formulae. We conclude that such measures are useful for the network intrusion domain, provided that introducing expert knowledge to correlate rules is feasible.
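To make the translation step concrete, the hypothetical sketch below maps two made-up Snort-style rules with overlapping Regex payload patterns onto propositional atoms; the rule contents, atom names and the background-knowledge correlation are invented for illustration and are not taken from the rule set studied here.

    # Illustrative sketch only: mapping two made-up Snort-style rules whose
    # Regex payload patterns overlap onto propositional atoms, the first step
    # of the translation into logical form described above.
    import re

    # Hypothetical detection rules (not real Snort signatures).
    RULES = {
        "r1": {"pattern": re.compile(r"cmd\.exe"),        "action": "alert"},
        "r2": {"pattern": re.compile(r"cmd\.(exe|com)"),  "action": "pass"},
    }

    def atoms_for(payload):
        """Evaluate each rule's Regex on a payload, yielding one atom per rule."""
        return {name: bool(rule["pattern"].search(payload))
                for name, rule in RULES.items()}

    # Background knowledge making the correlation explicit: every payload
    # matched by r1's pattern is also matched by r2's, so on such traffic both
    # rules fire and their actions ('alert' vs 'pass') conflict once the mutual
    # exclusion of the two actions is added as a formula.
    payload = "GET /cmd.exe HTTP/1.1"
    print(atoms_for(payload))   # {'r1': True, 'r2': True} -> conflicting actions

Once rules and background knowledge are in this logical form, inconsistency measures of the kind discussed above can be applied directly.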

Relevance: 20.00%

Abstract:

It is increasingly recognized that identifying the degree of blame or responsibility of each formula for the inconsistency of a knowledge base (i.e. a set of formulas) is useful for making rational decisions to resolve that inconsistency. Most current techniques for measuring the blame of each formula in an inconsistent knowledge base address classical knowledge bases only; proposals for measuring the blame of formulas in an inconsistent prioritized knowledge base have not yet been given much consideration, even though the notion of priority is important in inconsistency-tolerant reasoning. This article investigates this issue and presents a family of measurements of the degree of blame of each formula in an inconsistent prioritized knowledge base, derived from the minimal inconsistent subsets of that knowledge base. First, we present a set of intuitive postulates as general criteria characterizing rational measurements of the blame of formulas in an inconsistent prioritized knowledge base. Then we present a family of such measurements guided by the principle of proportionality, one of the intuitive postulates. We also demonstrate that each of these measurements possesses the properties it ought to have. Finally, we use a simple but explanatory example in requirements engineering to illustrate the application of these measurements. Compared with related work, the postulates presented in this article take into account the special characteristics of minimal inconsistent subsets as well as the priority levels of formulas. This makes them more appropriate for characterizing inconsistency measures defined from minimal inconsistent subsets for prioritized as well as classical knowledge bases. Correspondingly, the measures guided by these postulates intuitively capture the inconsistency of prioritized knowledge bases.
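As a point of reference (and not the article's own definitions), a well-known minimal-inconsistent-subset-based blame value for the unprioritized case, together with one possible proportionality-style priority weighting that is purely an assumption of this sketch, can be written as

    \mathrm{Blame}(K,\alpha) = \sum_{\substack{M \in \mathrm{MI}(K)\\ \alpha \in M}} \frac{1}{|M|},
    \qquad
    \mathrm{Blame}_w(K,\alpha) = \sum_{\substack{M \in \mathrm{MI}(K)\\ \alpha \in M}} \frac{w(\alpha)}{\sum_{\beta \in M} w(\beta)},

where w assigns larger weights to formulas of lower priority, so that within each minimal inconsistent subset the blame is shared in proportion to how readily each formula may be doubted. The article's measurements refine this idea and are characterized by the postulates described above.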

Relevance: 20.00%

Abstract:

A method of measuring the temperature of the fast electrons produced in ultraintense laser-plasma interactions by inducing photonuclear reactions, in particular (gamma,n) and (gamma,3n) reactions in tantalum, is described. Analysis of the gamma rays emitted by the daughter nuclei of these reactions using a germanium counter enables a relatively straightforward, near real-time temperature measurement. This is especially important for high-temperature plasmas, where alternative diagnostic techniques are usually difficult and time consuming. The technique can be used while other experiments are being conducted. (C) 2002 American Institute of Physics.
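The core of the inference can be summarized as follows, in the generic form usually quoted for such activation measurements rather than the paper's exact calibration: assuming the photon spectrum driving the reactions has an exponential tail proportional to e^{-E/kT}, set by the fast-electron temperature, the ratio of the two induced activities depends only on kT,

    \frac{N_{(\gamma,3n)}}{N_{(\gamma,n)}}
    = \frac{\int \sigma_{(\gamma,3n)}(E)\, e^{-E/kT}\, \mathrm{d}E}
           {\int \sigma_{(\gamma,n)}(E)\, e^{-E/kT}\, \mathrm{d}E},

and because the (gamma,3n) channel has a much higher threshold than (gamma,n), this ratio is a steep, monotonic function of kT. Measuring the two activities from the daughter-nucleus gamma lines and inverting the relation then yields the temperature.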