128 results for artificial intelligence
Abstract:
The grading of crushed aggregate is usually carried out by sieving. We describe a new image-based approach to the automatic grading of such materials. In the operational setting addressed, the camera is located directly over a conveyor belt. Our approach characterizes the information content of each image, taking into account relative variation in the pixel data and resolution scale. In feature space, we find very good class separation using a multidimensional linear classifier. The innovations in this work include (i) introducing an effective image-based approach into this application area, and (ii) supervised classification using wavelet entropy-based features.
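As a rough illustration of the kind of pipeline this abstract describes, here is a minimal Python sketch that computes a wavelet entropy feature at each resolution scale and trains a multidimensional linear classifier on the result. The wavelet family (db2), decomposition depth, and the choice of scikit-learn's LDA are illustrative assumptions, not the authors' exact configuration.

```python
# Hypothetical sketch: wavelet entropy features for aggregate images,
# classified with a linear discriminant. Wavelet family, decomposition
# depth, and classifier are illustrative choices, not the paper's setup.
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def wavelet_entropy_features(image, wavelet="db2", levels=3):
    """Entropy of the normalized subband-energy distribution per scale."""
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    features = []
    for detail in coeffs[1:]:  # (cH, cV, cD) detail bands at each level
        energies = np.array([np.sum(np.square(band)) for band in detail])
        p = energies / (energies.sum() + 1e-12)  # relative variation
        features.append(-np.sum(p * np.log2(p + 1e-12)))  # Shannon entropy
    return np.array(features)

def grade_classifier(images, labels):
    """images: grayscale conveyor-belt frames; labels: sieve grades."""
    X = np.array([wavelet_entropy_features(img) for img in images])
    return LinearDiscriminantAnalysis().fit(X, labels)
```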
Abstract:
This paper provides algorithms that use an information-theoretic analysis to learn Bayesian network structures from data. Based on our three-phase learning framework, we develop efficient algorithms that can effectively learn Bayesian networks, requiring only a polynomial number of conditional independence (CI) tests in typical cases. We provide precise conditions that specify when these algorithms are guaranteed to be correct, as well as empirical evidence (from real-world applications and simulation tests) demonstrating that these algorithms work efficiently and reliably in practice.
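The information-theoretic core of such algorithms is the conditional independence test. A minimal sketch follows, assuming discrete data and an illustrative decision threshold; neither the estimator nor the threshold is specified by the abstract.

```python
# Minimal sketch of an information-theoretic CI test: estimate the
# conditional mutual information I(X; Y | Z) from discrete samples and
# compare it against a small threshold. The threshold is an assumption.
import numpy as np
from collections import Counter

def conditional_mutual_information(x, y, z):
    """Estimate I(X; Y | Z) in bits from aligned discrete samples."""
    n = len(x)
    joint = Counter(zip(x, y, z))
    xz = Counter(zip(x, z))
    yz = Counter(zip(y, z))
    zc = Counter(z)
    cmi = 0.0
    for (xi, yi, zi), c in joint.items():
        # p(x,y,z) * p(z) / (p(x,z) * p(y,z)) reduces to c*zc/(xz*yz)
        cmi += (c / n) * np.log2(c * zc[zi] / (xz[(xi, zi)] * yz[(yi, zi)]))
    return cmi

def ci_test(x, y, z, threshold=0.01):
    """Declare X independent of Y given Z when the estimated CMI is small."""
    return conditional_mutual_information(x, y, z) < threshold
```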
Abstract:
We present a novel approach to goal recognition based on a two-stage paradigm of graph construction and analysis. First, a graph structure called a Goal Graph is constructed to represent the observed actions, the state of the world, and the achieved goals, as well as various connections between these nodes at consecutive time steps. Then, the Goal Graph is analysed at each time step to recognise those partially or fully achieved goals that are consistent with the actions observed so far. The Goal Graph analysis also reveals valid plans for the recognised goals or parts of these goals. Our approach to goal recognition does not need a plan library; it therefore avoids the problems of acquiring and hand-coding large plan libraries, as well as the cost of searching an exponentially large plan space. We describe two algorithms for Goal Graph construction and analysis in this paradigm. Both algorithms are provably sound, polynomial-time, and polynomial-space. The number of goals recognised by our algorithms is usually very small after a sequence of observed actions has been processed, so the observed actions are well explained by the recognised goals with little ambiguity. We have evaluated these algorithms in the UNIX domain, where they achieve excellent performance in terms of accuracy, efficiency, and scalability.
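A minimal sketch of a Goal Graph-style structure is given below: proposition and action nodes at consecutive time steps, linked by precondition and effect edges. The concrete classes and the simple read-off of achieved propositions are simplifying assumptions; the paper's analysis algorithms are considerably richer.

```python
# Illustrative Goal Graph-style structure: proposition and action nodes at
# consecutive time steps, with edges from preconditions into actions and
# from actions to their effects. A simplified sketch, not the paper's code.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Node:
    label: str   # proposition or action name
    time: int    # time step
    kind: str    # "prop" or "action"

@dataclass
class GoalGraph:
    edges: dict = field(default_factory=dict)  # node -> set of successors

    def add_observed_action(self, action, preconds, effects, t):
        a = Node(action, t, "action")
        for p in preconds:  # world-state propositions feeding the action
            self.edges.setdefault(Node(p, t, "prop"), set()).add(a)
        for e in effects:   # propositions achieved at the next time step
            self.edges.setdefault(a, set()).add(Node(e, t + 1, "prop"))

    def achieved_at(self, t):
        """Propositions holding at step t: candidate (sub)goals."""
        return {n.label for succ in self.edges.values() for n in succ
                if n.kind == "prop" and n.time == t}
```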
Abstract:
The study of alternative combination rules in DS theory when evidence is in conflict has recently re-emerged as an interesting topic, especially in data/information fusion applications. These studies have mainly focused on investigating which alternative would be appropriate for which conflicting situation, under the assumption that a conflict has already been identified; the issue of detecting (or identifying) conflict among evidence has been ignored. In this paper, we formally define when two basic belief assignments are in conflict. This definition deploys quantitative measures of both the mass of the combined belief assigned to the empty set before normalization and the distance between the betting commitments of the beliefs. We argue that only when both measures are high is it safe to say the evidence is in conflict. This definition can serve as a prerequisite for selecting appropriate combination rules.
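The two measures named in the abstract can be sketched directly: the combined mass assigned to the empty set before normalization, and a distance between pignistic betting commitments. In this sketch the distance is taken over singletons only and the 0.5 thresholds are illustrative assumptions; the paper's definition may range over all subsets and use different thresholds.

```python
# Sketch of the two conflict measures, for mass functions represented as
# dicts mapping frozenset -> mass. Thresholds are illustrative assumptions.
from itertools import product

def empty_set_mass(m1, m2):
    """Combined mass assigned to the empty set before normalization."""
    return sum(m1[a] * m2[b] for a, b in product(m1, m2) if not (a & b))

def betting_commitment(m, frame):
    """Pignistic probability BetP(x) = sum of m(A)/|A| over A containing x."""
    return {x: sum(v / len(a) for a, v in m.items() if x in a) for x in frame}

def betting_distance(m1, m2, frame):
    """Largest difference in betting commitments (singletons, for brevity)."""
    b1, b2 = betting_commitment(m1, frame), betting_commitment(m2, frame)
    return max(abs(b1[x] - b2[x]) for x in frame)

def in_conflict(m1, m2, frame, eps=0.5):
    """Evidence is declared in conflict only when both measures are high."""
    return empty_set_mass(m1, m2) > eps and betting_distance(m1, m2, frame) > eps
```

For example, m1 = {frozenset({"a"}): 0.9, frozenset({"a", "b"}): 0.1} can be tested against a second assignment over the same frame {"a", "b"}.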
Abstract:
In this paper, we propose an adaptive approach to merging possibilistic knowledge bases that deploys multiple operators instead of a single operator in the merging process. The merging approach consists of two steps: a splitting step and a combination step. The splitting step splits each knowledge base into two subbases; in the second step, the different classes of subbases are combined using different operators. Our approach applies to knowledge bases that are self-consistent, and the result of merging is also a consistent knowledge base. Two operators are proposed, based on two different splitting methods. Both operators result in a possibilistic knowledge base that contains more information than that obtained by t-conorm-based (such as maximum-based) merging methods. In the flat case, one of the operators provides a good alternative to syntax-based merging operators in classical logic.
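The two-step control flow (split each base, then combine the two classes of subbases with different operators) can be sketched as follows. The split criterion and both combination operators here are placeholders chosen for illustration; they are not the paper's proposed operators.

```python
# Sketch of the split-then-combine flow for possibilistic bases, where a
# base is a dict mapping formula -> necessity weight in (0, 1]. The split
# threshold and both operators are placeholders, not the paper's operators.
def split(base, threshold):
    """Split a weighted base into firmly-held and weakly-held subbases."""
    upper = {f: w for f, w in base.items() if w >= threshold}
    lower = {f: w for f, w in base.items() if w < threshold}
    return upper, lower

def merge(bases, threshold=0.5):
    merged = {}
    for base in bases:
        upper, lower = split(base, threshold)
        for f, w in upper.items():  # keep the strongest weight per formula
            merged[f] = max(merged.get(f, 0.0), w)
        for f, w in lower.items():  # retain weak formulas, capped cautiously
            merged[f] = max(merged.get(f, 0.0), min(w, threshold))
    return merged
```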
Abstract:
Relative Evidential Supports (RES) was developed and justified several years ago as a non-numeric apparatus for comparing evidential support for alternative conclusions when making a decision. We report an extension, Graded Relative Evidence (GRE), of the RES concept of pairwise balancing and trading-off of evidence. It keeps RES's basic features of simplicity and perspicacity but enriches its modelling fidelity by permitting modest and intuitive variations in degrees of outweighing (which the essentially binary RES does not). The formal justification rests simply on linkages to RES and to the Dempster-Shafer theory of evidence. The use of this simple extension is illustrated, and to a small degree further justified empirically, by application to a topical scientific debate referred to here as the Congo Crossover Conjecture. This decision-making instance was chosen because of the wealth of evidence accumulated on both sides of the debate and the range of evidence strengths manifested in it. The conjecture is that AIDS arose in the late 1950s in the Congo, when a polio vaccine was allegedly cultivated in the kidneys of chimpanzees, allowing the AIDS infection to cross over from primates to humans.
Abstract:
We present a new algorithm for training nonlinear optimal neuro-controllers (in the form of the model-free, action-dependent, adaptive critic paradigm). It overcomes problems with existing stochastic backpropagation training, namely the need for data storage, parameter shadowing, and poor convergence, offering significant benefits for online applications.
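A minimal sketch of an action-dependent adaptive critic update of this general family: an online temporal-difference update of a critic Q(s, a), requiring no stored trajectories. The linear critic, learning rate, discount factor, and cost-minimization convention are assumptions for illustration, not the paper's algorithm.

```python
# Illustrative model-free, action-dependent adaptive critic: the critic
# Q(s, a) is adjusted online toward the temporal-difference target, with
# no trajectory storage. A simplified sketch, not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

class Critic:
    def __init__(self, n_state, n_action, lr=0.01):
        self.w = rng.normal(scale=0.1, size=n_state + n_action)
        self.lr = lr

    def q(self, s, a):
        return self.w @ np.concatenate([s, a])  # linear critic for brevity

    def update(self, s, a, cost, s_next, a_next, gamma=0.95):
        """One online TD step: move Q(s, a) toward cost + gamma*Q(s', a')."""
        target = cost + gamma * self.q(s_next, a_next)
        err = target - self.q(s, a)
        self.w += self.lr * err * np.concatenate([s, a])
        return err
```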