898 results for 280201 Expert Systems


Relevance: 80.00%

Abstract:

Validation is an important issue in the development and application of Bayesian Belief Network (BBN) models, especially when the outcome of the model cannot be directly observed. Despite this, few frameworks for validating BBNs have been proposed, and fewer have been applied to substantive real-world problems. In this paper we adopt the approach of Pitchforth and Mengersen (2013), which includes nine validation tests that each focus on the structure, discretisation, parameterisation and behaviour of the BBNs included in the case study. We describe the process and results of implementing a validation framework on a model of a real airport terminal system, with particular reference to its effectiveness in producing a valid model that can be used and understood by operational decision makers. In applying the proposed validation framework we demonstrate the overall validity of the Inbound Passenger Facilitation Model as well as the effectiveness of the validity framework itself.
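
As a concrete illustration of one test in such a framework, the sketch below runs a qualitative behaviour check on a toy two-node network: perturbing an input should move the queried output in the direction domain experts expect. This uses pgmpy, which is our choice of tooling rather than the paper's, and the network and its probabilities are invented.

```python
# Minimal sketch of a "behaviour" validation test in the spirit of Pitchforth
# and Mengersen (2013): perturb an input node and check that the queried output
# moves in the expected direction. Toy network; not the paper's model.
# pgmpy exposes BayesianNetwork (called BayesianModel in older releases).
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("QueueLength", "ProcessingDelay")])
model.add_cpds(
    TabularCPD("QueueLength", 2, [[0.7], [0.3]]),  # P(short), P(long)
    TabularCPD("ProcessingDelay", 2,
               [[0.9, 0.4],   # P(delay=low | queue=short), P(delay=low | queue=long)
                [0.1, 0.6]],  # P(delay=high | queue=short), P(delay=high | queue=long)
               evidence=["QueueLength"], evidence_card=[2]),
)
infer = VariableElimination(model)

p_high_short = infer.query(["ProcessingDelay"], evidence={"QueueLength": 0}).values[1]
p_high_long = infer.query(["ProcessingDelay"], evidence={"QueueLength": 1}).values[1]
assert p_high_long > p_high_short, "behaviour test failed: delay should rise with queue length"
```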

Relevance: 80.00%

Abstract:

Commercial legal expert systems are invariably rule-based. Such systems are poor at dealing with open texture and the argumentation inherent in law. To overcome these problems we suggest supplementing rule-based legal expert systems with case-based reasoning or neural networks. Both case-based reasoners and neural networks use cases, but in very different ways. We discuss these differences at length. In particular, we examine the role of explanation in existing expert system methodologies. Because neural networks provide poor explanation facilities, we consider the use of Toulmin argument structures to support explanation (S. Toulmin, 1958). We illustrate our ideas with reference to a number of systems built by the authors.
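
To make the Toulmin structure concrete, here is a minimal sketch of it as a data type that could wrap a classifier's output in a human-readable explanation. The field names follow Toulmin's scheme; the legal content in the example is invented for illustration.

```python
# A Toulmin argument (Toulmin, 1958) as a simple explanation record:
# claim, data, warrant, backing, qualifier and rebuttals.
from dataclasses import dataclass, field

@dataclass
class ToulminArgument:
    claim: str                      # the conclusion being argued for
    data: list[str]                 # the facts the claim rests on
    warrant: str                    # why the data licenses the claim
    backing: str = ""               # support for the warrant itself
    qualifier: str = "presumably"   # strength of the claim
    rebuttals: list[str] = field(default_factory=list)  # defeating conditions

    def render(self) -> str:
        lines = [f"{self.qualifier.capitalize()}, {self.claim},",
                 "because " + "; ".join(self.data) + ",",
                 f"since {self.warrant}"]
        if self.rebuttals:
            lines.append("unless " + "; ".join(self.rebuttals))
        return "\n".join(lines)

arg = ToulminArgument(
    claim="the contract clause is unenforceable",
    data=["the clause was not brought to the buyer's attention"],
    warrant="onerous terms must be reasonably drawn to the other party's notice",
    rebuttals=["the buyer had prior dealings on these terms"],
)
print(arg.render())
```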

Relevance: 80.00%

Abstract:

In the legal domain, it is rare to find solutions to problems by simply applying algorithms or invoking deductive rules in some knowledge-based program. Instead, expert practitioners often supplement domain-specific knowledge with field experience. This type of expertise is often applied in the form of an analogy. This research proposes to combine reasoning with precedents and reasoning with statutes and regulations in a way that will enhance the statutory interpretation task. This is being attempted through the integration of database and expert system technologies. Case-based reasoning is used to model legal precedents, while rule-based reasoning modules are used to model the legislation and other types of causal knowledge. We hope to generalise these findings and to develop a formal methodology for integrating case-based databases with rule-based expert systems in the legal domain.
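
A minimal sketch of the integration idea follows: a rule module encodes statute-like knowledge, and a case-based module retrieves the nearest precedent when the rules are silent. The cases, features and rule below are illustrative inventions, not the research's actual knowledge base.

```python
# Hybrid rule-based + case-based reasoning, in outline: rules first,
# precedent retrieval as the fallback for open-textured questions.

CASE_BASE = [
    {"facts": {"written": 1, "signed": 1, "consideration": 1}, "outcome": "enforceable"},
    {"facts": {"written": 0, "signed": 0, "consideration": 1}, "outcome": "unenforceable"},
]

def rule_module(facts: dict) -> str | None:
    # Statute-style rule: a signed written agreement with consideration binds.
    if facts.get("written") and facts.get("signed") and facts.get("consideration"):
        return "enforceable"
    return None  # rules are silent: defer to precedent

def cbr_module(facts: dict) -> str:
    # Nearest precedent by simple feature-overlap similarity.
    def similarity(case):
        return sum(case["facts"][k] == v for k, v in facts.items() if k in case["facts"])
    return max(CASE_BASE, key=similarity)["outcome"]

def advise(facts: dict) -> str:
    return rule_module(facts) or cbr_module(facts)

print(advise({"written": 1, "signed": 0, "consideration": 1}))  # falls back to precedent
```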

Relevance: 80.00%

Abstract:

As critical infrastructure such as transportation hubs continues to grow in complexity, greater importance is placed on monitoring these facilities to ensure their secure and efficient operation. To achieve these goals, technology continues to evolve in response to the needs of various infrastructures. To date, however, surveillance technology has been primarily concerned with security, and little attention has been paid to assisting operations and monitoring performance in real time. Consequently, solutions have emerged to provide real-time measurements of queues and crowding in spaces, but they have been installed as system add-ons (rather than making better use of existing infrastructure), resulting in expensive infrastructure outlay for the owner/operator and an overload of surveillance systems, which in itself creates further complexity. Given that many critical infrastructure facilities already have camera networks installed, it is much more desirable to better utilise these networks to address operational monitoring as well as security needs. Recently, a growing number of approaches have been proposed to monitor operational aspects such as pedestrian throughput, crowd size and dwell times. In this paper, we explore how these techniques relate to and complement the more commonly seen security analytics, and we demonstrate the value that operational analytics can add by evaluating their performance on airport surveillance data. We explore how multiple analytics and systems can be combined to better leverage the large amount of data that is available, and we discuss the applicability and resulting benefits of the proposed framework for the ongoing operation of airports and airport networks.
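
The sketch below illustrates two of the operational measures mentioned, throughput and dwell time, computed from toy per-person track records; it is a stand-in for the kind of outputs such analytics produce, not the paper's system.

```python
# Operational analytics from per-zone person tracks.
# A "track" here is just (person_id, entry_time_s, exit_time_s) for one zone.
from statistics import mean

tracks = [
    ("p1", 0.0, 95.0),
    ("p2", 10.0, 70.0),
    ("p3", 30.0, 180.0),
]

def throughput_per_minute(tracks, t_start, t_end):
    """People exiting the zone per minute over [t_start, t_end)."""
    exits = sum(1 for _, _, t_exit in tracks if t_start <= t_exit < t_end)
    return exits / ((t_end - t_start) / 60.0)

def mean_dwell_seconds(tracks):
    """Average time spent in the zone."""
    return mean(t_exit - t_entry for _, t_entry, t_exit in tracks)

print(throughput_per_minute(tracks, 0.0, 120.0))  # 1.0 exit/min on this toy data
print(mean_dwell_seconds(tracks))                 # ~101.7 s average dwell
```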

Relevance: 80.00%

Abstract:

In the mining optimisation literature, most researchers have focused on two open-pit mine optimisation problems, at the strategic and tactical levels respectively: the ultimate pit limit (UPIT) problem and the constrained pit limit (CPIT) problem. However, many researchers note that the substantial number of variables and constraints in real-world instances (e.g., with 50-1000 thousand blocks) makes the CPIT's mixed integer programming (MIP) model intractable in practice. It is therefore a considerable challenge to solve large-scale CPIT instances without relying on an exact MIP optimiser or on complicated MIP relaxation/decomposition methods. To address this challenge, two new graph-based algorithms, built on network flow and conjunctive graph theory, are developed by exploiting properties of the problem. The performance of the proposed algorithms is validated on the recent large-scale benchmark UPIT and CPIT instances from the 2013 MineLib datasets. Compared with the best known results from MineLib, the proposed algorithms outperform the other CPIT solution approaches in the literature. The proposed graph-based algorithms lead to a more competent mine scheduling optimisation expert system, because a third-party MIP optimiser is no longer indispensable and random neighbourhood search is not necessary.
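
For context, the sketch below shows the classical network-flow view of UPIT: the optimal pit is a maximum-weight closure of the block-precedence graph, recoverable from a min cut (Picard, 1976). The four-block instance is invented, and the paper's algorithms are more elaborate than this textbook reduction.

```python
# UPIT as max-weight closure via min cut: source feeds positive-value blocks,
# negative-value blocks feed the sink, and infinite-capacity precedence edges
# force any selected block's overlying blocks into the pit as well.
import networkx as nx

block_value = {"a": 10.0, "b": -3.0, "c": -4.0, "d": 6.0}
precedence = [("a", "b"), ("a", "c"), ("d", "c")]  # mining x requires removing y first

G = nx.DiGraph()
for blk, val in block_value.items():
    if val > 0:
        G.add_edge("s", blk, capacity=val)
    else:
        G.add_edge(blk, "t", capacity=-val)
for below, above in precedence:
    G.add_edge(below, above, capacity=float("inf"))

cut_value, (source_side, _) = nx.minimum_cut(G, "s", "t")
pit = source_side - {"s"}
profit = sum(v for v in block_value.values() if v > 0) - cut_value
print(sorted(pit), profit)  # optimal pit {'a','b','c','d'} with value 9.0 here
```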

Relevance: 80.00%

Abstract:

Frog protection has become increasingly important due to the rapid decline in frog biodiversity, so it is valuable to develop new methods for studying it. In this paper, a novel feature extraction method based on perceptual wavelet packet decomposition is proposed for classifying frog calls in noisy environments. Pre-processing and syllable segmentation are first applied to the frog call. A spectral peak track is then extracted from each syllable where possible, and track duration, dominant frequency and oscillation rate are extracted directly from the track. Using the k-means clustering algorithm, the dominant frequencies calculated for all frog species are clustered into k groups, which define a frequency scale for wavelet packet decomposition. Based on this adaptive frequency scale, wavelet packet decomposition is applied to the frog calls, and from the resulting coefficients a new feature set, perceptual wavelet packet decomposition sub-band cepstral coefficients, is extracted. Finally, a k-nearest neighbour (k-NN) classifier is used for classification. The experimental results show that the proposed features achieve an average classification accuracy of 97.45%, outperforming syllable features (86.87%) and Mel-frequency cepstral coefficients (MFCCs) (90.80%).
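
The general shape of such sub-band cepstral features is sketched below using PyWavelets. Note the simplification: the paper's perceptual, k-means-derived frequency scale is replaced by a uniform full decomposition, and the signal is synthetic noise standing in for a segmented syllable.

```python
# Sub-band cepstral features from a wavelet packet decomposition:
# per-band log-energies decorrelated by a DCT, as in MFCC computation.
import numpy as np
import pywt
from scipy.fftpack import dct

def wpd_cepstral_features(signal, wavelet="db4", level=4, n_coeffs=12):
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    bands = wp.get_level(level, order="freq")  # terminal nodes, one sub-band each
    log_energy = np.log(np.array([np.sum(node.data ** 2) for node in bands]) + 1e-10)
    return dct(log_energy, norm="ortho")[:n_coeffs]

rng = np.random.default_rng(0)
syllable = rng.standard_normal(4096)  # stand-in for a segmented frog-call syllable
print(wpd_cepstral_features(syllable).shape)  # (12,)
```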

Relevance: 80.00%

Abstract:

We present a new, generic method for the multi-objective design optimization of laminated composite components, using a novel multi-objective optimization algorithm developed on the basis of the Quantum-behaved Particle Swarm Optimization (QPSO) paradigm. QPSO, a variant of the popular Particle Swarm Optimization (PSO) algorithm, has been developed and implemented successfully for the multi-objective design optimization of composites. The problem is formulated with the multiple objectives of minimizing the weight and the total cost of the composite component while achieving a specified strength. The primary optimization variables are the number of layers, the stacking sequence (the orientation of the layers) and the thickness of each layer. Classical lamination theory is used to determine the stresses in the component, and each design is evaluated against three failure criteria: the failure-mechanism-based criterion, the maximum stress criterion and the Tsai-Wu criterion. The optimization method is validated for a number of loading configurations: uniaxial, biaxial and bending loads. The design optimization has been carried out both for variable stacking sequences and for fixed standard stacking schemes, and a comparative study of the different design configurations evolved is presented. The performance of QPSO is also compared with that of conventional PSO.
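
The core QPSO position update, a stochastic attractor plus a log-distributed jump around the mean of the personal bests, is sketched below on a toy single-objective function. The paper's multi-objective laminate formulation adds Pareto handling and the lamination-theory evaluation on top of this skeleton.

```python
# Standard QPSO update: attractor p = phi*pbest + (1-phi)*gbest,
# new position x = p +/- beta * |mbest - x| * ln(1/u).
import numpy as np

def qpso_minimize(f, dim, n_particles=30, iters=200, beta=0.75, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    pbest = x.copy()
    pbest_val = np.apply_along_axis(f, 1, pbest)
    for _ in range(iters):
        gbest = pbest[np.argmin(pbest_val)]
        mbest = pbest.mean(axis=0)                      # mean of personal bests
        phi = rng.uniform(size=(n_particles, dim))
        p = phi * pbest + (1 - phi) * gbest             # stochastic attractor
        u = rng.uniform(size=(n_particles, dim))
        sign = np.where(rng.uniform(size=(n_particles, dim)) < 0.5, -1.0, 1.0)
        x = p + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
        val = np.apply_along_axis(f, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
    return pbest[np.argmin(pbest_val)], pbest_val.min()

best_x, best_val = qpso_minimize(lambda v: np.sum(v ** 2), dim=4)
print(best_x, best_val)  # converges toward the origin on this toy sphere function
```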

Relevance: 80.00%

Abstract:

Video surveillance infrastructure has been widely installed in public places for security purposes. However, live video feeds are typically monitored by human staff, making it difficult to detect important events as they occur. As such, an expert system that can automatically detect events of interest in surveillance footage is highly desirable. Although a number of approaches have been proposed, they have significant limitations: supervised approaches, which can detect a specific event, ideally require a large number of samples with the event spatially and temporally localised; while unsupervised approaches, which do not require this demanding annotation, can only detect whether an event is abnormal, not its specific type. To overcome these problems, we formulate a weakly supervised approach using Kullback-Leibler (KL) divergence to detect rare events. The proposed approach leverages the sparse nature of the target events to its advantage, and we show that this data imbalance guarantees the existence of a decision boundary separating samples that contain the target event from those that do not. This trait, combined with the coarse annotation used by weakly supervised learning (which only indicates approximately when an event occurs), greatly reduces the annotation burden while retaining the ability to detect specific events. Furthermore, the proposed classifier requires only a decision threshold, simplifying its use compared to other weakly supervised approaches. We show that the proposed approach outperforms state-of-the-art methods on a popular real-world traffic surveillance dataset, while preserving real-time performance.
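
The scoring idea can be sketched as follows: represent each segment as a normalised feature histogram and flag those whose KL divergence from an event-free reference exceeds the single decision threshold. The features, histograms and threshold below are placeholders, not the paper's pipeline.

```python
# KL-divergence scoring of video segments against a "normal" reference.
import numpy as np
from scipy.stats import entropy  # entropy(p, q) computes KL(p || q)

def kl_score(segment_hist, reference_hist, eps=1e-10):
    # eps keeps the divergence finite when a bin is empty in one histogram.
    return entropy(segment_hist + eps, reference_hist + eps)

reference = np.array([40.0, 30.0, 20.0, 10.0])  # pooled histogram of normal segments
normal = np.array([38.0, 31.0, 21.0, 10.0])
rare = np.array([5.0, 10.0, 25.0, 60.0])

threshold = 0.1  # the single decision threshold the classifier needs
for name, seg in [("normal", normal), ("rare", rare)]:
    print(name, kl_score(seg, reference) > threshold)  # normal False, rare True
```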

Relevance: 80.00%

Abstract:

This article discusses the scope of research on the application of information technology in construction (ITC). A model of the information and material activities that together constitute the construction process is presented, using the IDEF0 activity modelling methodology. Information technology is defined to include all kinds of technology used for the storage, transfer and manipulation of information, thus also including devices such as copying machines, faxes and mobile phones. Using the model, the domain of ITC research is defined as the use of information technology to facilitate and re-engineer the information process component of construction. Developments in IT use in construction over the last decades are discussed against the background of a simplified model of generic information processing tasks. The scope of ITC is compared with the scopes of research in related areas such as design methodology, construction management and facilities management. Health care is proposed as an interesting alternative application domain for comparison, in place of the often-used car manufacturing industry. Some of the key areas of ITC research in recent years (expert systems, company IT strategies and product modelling) are briefly discussed. The article finishes with a short discussion of the problems of applying standard scientific methodology in ITC research, in particular in product model research.

Relevance: 80.00%

Abstract:

In this paper, naturally inspired optimization techniques, namely vector evaluated particle swarm optimization (VEPSO) and genetic algorithms (GA), are used for the design optimization of laminated composites. The minimum-weight design of the laminated composite is evaluated under different failure criteria: the maximum stress (MS), Tsai-Wu (TW) and failure mechanism based (FMB) criteria. Minimum laminate weights are obtained for the different failure criteria using VEPSO and GA under different combinations of loading. The study shows that VEPSO and GA predict almost the same minimum laminate weight for a given loading. The minimum weights predicted by the different failure criteria differ for some loading combinations, and the comparison shows that the FMB failure criterion provides better results for all combinations of loading.
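
Of the three criteria, Tsai-Wu is easy to state compactly; the sketch below evaluates its failure index for a single lamina under plane stress. The strength values are illustrative carbon/epoxy-like numbers in MPa, not the paper's.

```python
# Tsai-Wu failure index for a lamina under plane stress (sigma1, sigma2, tau12):
# F1*s1 + F2*s2 + F11*s1^2 + F22*s2^2 + F66*t12^2 + 2*F12*s1*s2, failure at >= 1.
import math

def tsai_wu_index(s1, s2, t12, Xt=1500.0, Xc=1200.0, Yt=50.0, Yc=250.0, S=70.0):
    """Index < 1 means the lamina is predicted safe; >= 1 means failure."""
    F1, F2 = 1 / Xt - 1 / Xc, 1 / Yt - 1 / Yc
    F11, F22, F66 = 1 / (Xt * Xc), 1 / (Yt * Yc), 1 / S ** 2
    F12 = -0.5 * math.sqrt(F11 * F22)   # common estimate of the interaction term
    return (F1 * s1 + F2 * s2 + F11 * s1 ** 2 + F22 * s2 ** 2
            + F66 * t12 ** 2 + 2 * F12 * s1 * s2)

print(tsai_wu_index(s1=800.0, s2=20.0, t12=30.0))  # ~0.65 < 1: safe at this state
```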

Relevance: 80.00%

Abstract:

Production scheduling in a flexible manufacturing system (FMS) is a real-time combinatorial optimization problem that has been proved to be NP-complete. Solving it requires on-line monitoring of plan execution and real-time decision-making: selecting alternative routings, assigning required resources, and rescheduling when failures occur in the system. Expert systems provide a natural framework for solving this kind of NP-complete problem. In this paper, an expert system with a novel parallel heuristic approach is implemented for automatic short-term dynamic scheduling of an FMS. Its principal features include easy rescheduling, on-line plan execution, load balancing, an on-line garbage collection process, and the use of advanced knowledge representation schemes. Its effectiveness is demonstrated with two examples.
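
One ingredient of such a scheduler, re-dispatching jobs to alternative routings while balancing machine load after a failure, can be sketched as follows. The jobs, machines and routings are toy data, and this greedy rule is far simpler than the paper's parallel heuristic.

```python
# Greedy load-balancing dispatch with rerouting around failed machines.
def dispatch(jobs, machines, failed=frozenset()):
    """jobs: (job_id, processing_time, capable_machines). Returns (schedule, load)."""
    load = {m: 0.0 for m in machines if m not in failed}
    schedule = {}
    for job_id, proc_time, capable in jobs:
        options = [m for m in capable if m in load]
        if not options:
            raise ValueError(f"no routing available for {job_id}")
        m = min(options, key=load.get)       # least-loaded capable machine
        schedule[job_id] = m
        load[m] += proc_time
    return schedule, load

jobs = [("J1", 4.0, ["M1", "M2"]), ("J2", 3.0, ["M2"]), ("J3", 5.0, ["M1", "M3"])]
print(dispatch(jobs, ["M1", "M2", "M3"]))                 # normal operation
print(dispatch(jobs, ["M1", "M2", "M3"], failed={"M1"}))  # reschedule around M1 failure
```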

Relevance: 80.00%

Abstract:

The dial-a-ride problem (DARP) is an optimization problem that minimizes the cost of providing door-to-door service to customers based on their requests. The optimization model presented in earlier studies is considered here. Due to the non-linear nature of the objective function, traditional optimization methods are prone to converging to local minima. To overcome this pitfall we use metaheuristics, namely Simulated Annealing (SA), Particle Swarm Optimization (PSO), Genetic Algorithms (GA) and Artificial Immune Systems (AIS). From the results obtained, we conclude that the Artificial Immune System method effectively tackles this optimization problem, providing optimal solutions.
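
As an illustration of one of these metaheuristics, the sketch below applies simulated annealing to a stop-ordering permutation with a 2-opt-style neighbourhood. It captures only the metaheuristic skeleton: a real DARP solver must additionally enforce pickup-before-dropoff pairing, vehicle capacity and time windows, and the distance matrix here is invented.

```python
# Simulated annealing over a route permutation: accept downhill moves always,
# uphill moves with probability exp(-delta / temperature).
import math, random

def route_cost(route, dist):
    return sum(dist[route[i]][route[i + 1]] for i in range(len(route) - 1))

def anneal(route, dist, t0=10.0, cooling=0.995, iters=5000, seed=1):
    random.seed(seed)
    best = cur = route[:]
    t = t0
    for _ in range(iters):
        i, j = sorted(random.sample(range(len(cur)), 2))
        cand = cur[:i] + cur[i:j + 1][::-1] + cur[j + 1:]   # 2-opt style reversal
        delta = route_cost(cand, dist) - route_cost(cur, dist)
        if delta < 0 or random.random() < math.exp(-delta / t):
            cur = cand                                      # accept, sometimes uphill
            if route_cost(cur, dist) < route_cost(best, dist):
                best = cur
        t *= cooling
    return best, route_cost(best, dist)

dist = [[0, 4, 9, 7], [4, 0, 3, 8], [9, 3, 0, 2], [7, 8, 2, 0]]  # toy stop-to-stop costs
print(anneal([0, 2, 1, 3], dist))  # finds the cheaper ordering [0, 1, 2, 3]
```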

Relevance: 80.00%

Abstract:

Accurate estimation of mass transport parameters is necessary for the overall design and evaluation of waste disposal facilities. The mass transport parameters, such as the effective diffusion coefficient, retardation factor and diffusion-accessible porosity, are estimated from observed diffusion data by inverse analysis. Recently, the particle swarm optimization (PSO) algorithm has been used to develop inverse models for estimating these parameters, alleviating existing limitations of inverse analysis. However, a PSO solver yields different solutions in successive runs because of the stochastic nature of the algorithm and the presence of multiple optimum solutions, so the mean solution estimated from independent runs differs significantly from the best solution. In this paper, two variants of the PSO algorithm are proposed to improve the performance of the inverse analysis. The proposed algorithms use a perturbation equation for the gbest particle to gain information around the gbest region of the search space, and catfish particles in alternate iterations to improve exploration. A performance comparison of the developed solvers on synthetic test data for two different diffusion problems reveals that one of the proposed solvers, CPPSO, significantly improves overall performance, with better best, worst and mean fitness values. The developed solver is further used to estimate transport parameters from 12 sets of experimentally observed diffusion data from three diffusion problems, and the estimates are compared with published values from the literature. The proposed solver is quick, simple and robust across different diffusion problems.
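
The catfish mechanism can be sketched on a standard global-best PSO: every few iterations the worst particles are re-seeded at the search-space bounds to re-open exploration. The quadratic objective below stands in for the paper's misfit between observed and simulated diffusion data, and the perturbed-gbest variant is omitted.

```python
# Global-best PSO with periodic "catfish" injection at the bounds.
import numpy as np

def catfish_pso(f, lo, hi, n=30, iters=300, catfish_every=20, frac=0.1, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(lo)
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros((n, dim))
    pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
    for it in range(1, iters + 1):
        g = pbest[np.argmin(pval)]
        r1, r2 = rng.uniform(size=(2, n, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        if it % catfish_every == 0:                  # inject catfish particles
            worst = np.argsort(pval)[-int(frac * n):]
            x[worst] = np.where(rng.uniform(size=(len(worst), dim)) < 0.5, lo, hi)
            v[worst] = 0.0
        val = np.apply_along_axis(f, 1, x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
    return pbest[np.argmin(pval)], pval.min()

lo, hi = np.array([0.0, 0.0]), np.array([10.0, 10.0])
best, fit = catfish_pso(lambda p: np.sum((p - np.array([2.5, 7.0])) ** 2), lo, hi)
print(best, fit)  # recovers the toy "true" parameters (2.5, 7.0)
```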

Relevance: 80.00%

Abstract:

A review article looking at the types of information requirements commonly shared by scientists and their use of traditional information services. Areas covered include the primary requirements of IFE (Institute of Freshwater Ecology) staff, pure versus applied research, informal and personal sources of information, and traditional library and information services. It goes on to describe how research into information systems and technology may improve the wider accessibility and use of information by the scientific community. Technologies covered include online databases, telecommunications, gateways, expert systems, optical technology and applications of CD-ROM.