27 results for rule-based logic

in University of Queensland eSpace - Australia


Relevance: 100.00%

Publisher:

Abstract:

Pac-Man is a well-known, real-time computer game that provides an interesting platform for research. We describe an initial approach to developing an artificial agent that replaces the human player in a simplified version of Pac-Man. The agent is specified as a simple finite state machine and rule set, with parameters that control the probability of movement by the agent given the constraints of the maze at some instant of time. In contrast to previous approaches, the agent represents a dynamic strategy for playing Pac-Man, rather than a pre-programmed maze-solving method. The agent adaptively "learns" through the application of population-based incremental learning (PBIL) to adjust the agent's parameters. Experimental results are presented that give insight into some of the complexities of the game, as well as highlighting the limitations and difficulties of the representation of the agent.
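
As a hedged illustration of the learning mechanism named above (not the authors' actual implementation), the following Python sketch shows a standard binary PBIL loop adjusting a probability vector that could encode the agent's movement-rule parameters; the fitness function play_pacman is a hypothetical placeholder.

    import random

    def pbil(num_params, evaluate, generations=100, pop_size=20, lr=0.1):
        """Standard binary PBIL: evolve a probability vector over rule parameters."""
        probs = [0.5] * num_params                      # initial probability vector
        for _ in range(generations):
            # sample a population of candidate parameter vectors
            population = [[1 if random.random() < p else 0 for p in probs]
                          for _ in range(pop_size)]
            best = max(population, key=evaluate)        # best candidate this generation
            # shift the probability vector towards the best candidate
            probs = [(1 - lr) * p + lr * b for p, b in zip(probs, best)]
        return probs

    # Hypothetical fitness: score obtained by an agent whose rule set is
    # configured by the bit vector (placeholder only).
    def play_pacman(bits):
        return sum(bits)

    print(pbil(num_params=16, evaluate=play_pacman))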

Relevance: 100.00%

Publisher:

Abstract:

In the last decade, with the expansion of organizational scope and the tendency towards outsourcing, there has been an increasing need for Business Process Integration (BPI), understood as the sharing of data and applications among business processes. The research efforts and development paths in BPI pursued by many academic groups and system vendors, targeting heterogeneous system integration, continue to face several conceptual and technological challenges. This article begins with a brief review of major approaches and emerging standards that address BPI. We then introduce a rule-driven messaging approach to BPI, which is based on the harmonization of messages in order to compose a new, often cross-organizational, process, and present the design of a temporal first-order language (Harmonized Messaging Calculus) that provides the formal foundation for general rules governing business process execution. Definitions of the language's terms, formulae, safety, and expressiveness are introduced and considered in detail.
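
The Harmonized Messaging Calculus itself is not reproduced here; purely as an illustrative sketch of the kind of temporal rule such a language might express (all predicate and process names are hypothetical), a formula could require that every purchase-order message received by one process is eventually forwarded, in harmonized form, to its partner process:

    \forall m\, \Box\big( \mathit{received}(m, P_1) \wedge \mathit{type}(m, \mathrm{PO}) \rightarrow \Diamond\, \mathit{sent}(\mathit{harmonize}(m), P_2) \big)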

Relevance: 90.00%

Publisher:

Abstract:

This paper presents a framework for compositional verification of Object-Z specifications. Its key feature is a proof rule based on the decomposition of hierarchical Object-Z models. For each component in the hierarchy, local properties are proven in a single proof step. However, we do not consider components in isolation. Instead, components are envisaged in the context of the referencing super-component, and proof steps involve assumptions on properties of the sub-components. The framework is defined for Linear Temporal Logic (LTL).
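
For orientation only, a generic (non-circular) assume-guarantee decomposition rule over LTL properties has the following shape; this is a textbook-style sketch, not the paper's actual rule, which additionally handles sub-component assumptions within a referencing super-component:

    \frac{\;C_1 \models G_1 \qquad C_2 \models (G_1 \rightarrow G_2)\;}{\;C_1 \parallel C_2 \models G_1 \wedge G_2\;}

Read: if component C_1 guarantees G_1 on all its traces, and C_2 guarantees G_2 on every trace satisfying G_1, then their composition (interpreted here as trace intersection) satisfies both properties.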

Relevance: 80.00%

Publisher:

Abstract:

Traditional vegetation mapping methods use high-cost, labour-intensive aerial photography interpretation. This approach can be subjective and is limited by factors such as the extent of remnant vegetation, and the differing scale and quality of aerial photography over time. An alternative approach is proposed which integrates a data model, a statistical model and an ecological model using sophisticated Geographic Information Systems (GIS) techniques and rule-based systems to support fine-scale vegetation community modelling. This approach is based on a more realistic representation of vegetation patterns with transitional gradients from one vegetation community to another. Arbitrary, though often unrealistic, sharp boundaries can be imposed on the model by the application of statistical methods. This GIS-integrated multivariate approach is applied to the problem of vegetation mapping in the complex vegetation communities of the Innisfail Lowlands in the Wet Tropics bioregion of Northeastern Australia. The paper presents the full cycle of this vegetation modelling approach, including site sampling, variable selection, model selection, model implementation, internal model assessment, model prediction assessment, integration of discrete vegetation community models to generate a composite pre-clearing vegetation map, model validation against an independent data set, and assessment of the scale of model predictions. An accurate pre-clearing vegetation map of the Innisfail Lowlands was generated (r^2 = 0.83) through GIS integration of 28 separate statistical models. This modelling approach has good potential for wider application, including provision of vital information for conservation planning and management; a scientific basis for rehabilitation of disturbed and cleared areas; and a viable method for the production of adequate vegetation maps for conservation and forestry planning of poorly-studied areas. (c) 2006 Elsevier B.V. All rights reserved.
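
As a hedged sketch of how separate community-level predictions might be combined into a single composite map (not the paper's actual GIS workflow; array and community names are hypothetical), each grid cell can simply be assigned to the community whose model predicts the highest probability there:

    import numpy as np

    def composite_map(probability_rasters):
        """Combine per-community probability surfaces into a single class map.

        probability_rasters: dict mapping community name -> 2-D array of
        predicted probabilities on a common grid (hypothetical inputs).
        Returns a 2-D array of community labels chosen by highest probability.
        """
        names = list(probability_rasters)
        stack = np.stack([probability_rasters[n] for n in names])  # (k, rows, cols)
        winner = np.argmax(stack, axis=0)                          # index of best model per cell
        return np.array(names, dtype=object)[winner]

    # Toy example with two 2 x 2 "rasters"
    rainforest = np.array([[0.8, 0.2], [0.6, 0.1]])
    sclerophyll = np.array([[0.1, 0.7], [0.3, 0.9]])
    print(composite_map({"rainforest": rainforest, "sclerophyll": sclerophyll}))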

Relevance: 80.00%

Publisher:

Abstract:

Music plays an enormous role in today's computer games; it serves to elicit emotion, generate interest and convey important information. Traditional gaming music is fixed at the event level, where tracks loop until a state change is triggered. This behaviour, however, does not musically reflect the in-game state between these events. We propose a dynamic music environment, where music tracks adjust in real time to the emotion of the in-game state. We aim to improve the affective response to symbolic music by modifying its structural and performative characteristics through the application of rule-based techniques. In this paper we take a multidisciplinary approach and present a series of primary music-emotion structural rules for implementation. The validity of these rules was tested in a small study involving eleven participants, each listening to six permutations of two musical works. Preliminary results indicate that the environment was generally successful in influencing the emotion of the musical works for three of the four intended directions (happier, sadder and content/dreamier). Our secondary aim, of establishing that music-emotion rules sourced predominantly from Western classical music can be applied with comparable results to modern computer gaming music, was also largely achieved.
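
As a hedged illustration of the kind of structural rule involved (the parameter values and rule set below are invented for illustration and are not the study's actual rules), a target emotion can be mapped to adjustments of a few structural parameters of a symbolic score:

    # Illustrative music-emotion structural rules (hypothetical values): each
    # target emotion maps to adjustments of a few structural parameters.
    RULES = {
        "happier":  {"tempo_factor": 1.15, "mode": "major", "register_shift": +2},
        "sadder":   {"tempo_factor": 0.85, "mode": "minor", "register_shift": -2},
        "dreamier": {"tempo_factor": 0.90, "mode": "major", "register_shift": +5},
    }

    def apply_rules(score, target):
        """Return a modified copy of a simple score dict for the target emotion."""
        rule = RULES[target]
        return {
            "tempo_bpm": round(score["tempo_bpm"] * rule["tempo_factor"]),
            "mode": rule["mode"],
            # shift all pitches by a fixed number of semitones
            "pitches": [p + rule["register_shift"] for p in score["pitches"]],
        }

    print(apply_rules({"tempo_bpm": 120, "mode": "minor", "pitches": [60, 62, 64]}, "happier"))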

Relevance: 80.00%

Publisher:

Abstract:

Music is an immensely powerful affective medium that pervades our everyday life. With ever-advancing technology, the reproduction and application of music for emotive and information-transfer purposes has never been more prevalent. In this paper we introduce a rule-based engine for influencing the perceived emotions of music. Based on empirical music psychology, we attempt to formalise the relationship between musical elements and their perceived emotion. We examine the modification of structural aspects of music to allow for a graduated transition between perceived emotive states. The engine is intended to provide music reproduction systems with finer-grained control over this affective medium, so that perceived musical emotion can be influenced with intent. This intent comes from both an external application and the audience. Using a series of affective computing technologies, an audience's response metrics and attitudes can be incorporated to model this intent. A generative feedback loop is set up between the external application, the influencing process and the audience's response, which together shape the modification of musical structure. The effectiveness of our rule system for influencing perceived musical emotion was examined in earlier work, with a small test study providing generally encouraging results.
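
A hedged sketch of the generative feedback loop described above is given below; all function names, the valence scale and the update gain are hypothetical, and the structural modification is reduced to a single placeholder rule:

    # Hypothetical sketch of the feedback loop: target emotion in, rule-based
    # modification applied, audience response fed back to adjust how strongly
    # the rules are applied on the next pass.
    def feedback_loop(score, target_valence, measure_response, steps=5, gain=0.5):
        intensity = 0.5                                      # how strongly rules are applied
        for _ in range(steps):
            modified = modify_structure(score, target_valence, intensity)
            observed = measure_response(modified)            # e.g. sensor-derived valence in [-1, 1]
            intensity += gain * (target_valence - observed)  # push towards the target
            intensity = max(0.0, min(1.0, intensity))
            score = modified
        return score

    def modify_structure(score, valence, intensity):
        # Placeholder rule: scale tempo with desired valence and intensity.
        return {**score, "tempo_bpm": score["tempo_bpm"] * (1 + 0.2 * valence * intensity)}

    print(feedback_loop({"tempo_bpm": 100}, 0.8, lambda s: 0.3))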

Relevance: 80.00%

Publisher:

Abstract:

Model transformations are an integral part of model-driven development. Incremental updates are a key execution scenario for transformations in model-based systems, and are especially important for the evolution of such systems. This paper presents a strategy for the incremental maintenance of declarative, rule-based transformation executions. The strategy involves recording dependencies of the transformation execution on information from source models and from the transformation definition. Changes to the source models or the transformation itself can then be directly mapped to their effects on transformation execution, allowing changes to target models to be computed efficiently. This particular approach has many benefits. It supports changes to both source models and transformation definitions, it can be applied to incomplete transformation executions, and a priori knowledge of volatility can be used to further increase the efficiency of change propagation.
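
The following Python sketch illustrates the general idea of dependency recording and change propagation in a rule-based transformation (it is an assumption-laden toy, not the strategy's actual design): while each rule application runs, the source elements it reads are recorded, and a later source change re-executes only the dependent applications.

    from collections import defaultdict

    class IncrementalEngine:
        """Minimal dependency-recording sketch for rule-based transformation."""

        def __init__(self, rules):
            self.rules = rules                       # name -> function(source element) -> target
            self.deps = defaultdict(set)             # source element id -> {(rule name, element id)}
            self.targets = {}                        # (rule name, element id) -> target element

        def run(self, source):
            for elem_id, elem in source.items():
                for name, rule in self.rules.items():
                    self._apply(name, rule, elem_id, elem)

        def _apply(self, name, rule, elem_id, elem):
            self.targets[(name, elem_id)] = rule(elem)
            self.deps[elem_id].add((name, elem_id))  # record what this application read

        def propagate_change(self, source, changed_id):
            # Re-execute only the rule applications that depended on the change.
            for name, elem_id in self.deps.pop(changed_id, set()):
                self._apply(name, self.rules[name], elem_id, source[elem_id])

    engine = IncrementalEngine({"upper": lambda e: e.upper()})
    model = {"a": "class Foo", "b": "class Bar"}
    engine.run(model)
    model["a"] = "class Baz"
    engine.propagate_change(model, "a")
    print(engine.targets)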

Relevance: 40.00%

Publisher:

Abstract:

The introduction of standard on-chip buses has eased integration and boosted the production of IP functional cores. However, once an IP core is bus-specific, retargeting it to a different bus is time-consuming and tedious, which reduces the reusability of the bus-specific IP. As new bus standards are introduced and different interconnection methods are proposed, this problem grows. Many solutions have been proposed, but they either limit the IP block's performance or are restricted to a particular platform. A new concept is presented that can connect IP blocks to a wide variety of interface architectures with low overhead. This is achieved through the use of a special interface adaptor logic layer.

Relevance: 40.00%

Publisher:

Abstract:

As an alternative to traditional evolutionary algorithms (EAs), population-based incremental learning (PBIL) maintains a probabilistic model of the best individual(s). Originally, PBIL was applied in binary search spaces. Recently, some work has been done to extend it to continuous spaces. In this paper, we review two such extensions of PBIL. An improved version of PBIL based on a Gaussian model is proposed that combines two main features: a new updating rule that takes into account all the individuals and their fitness values, and a self-adaptive learning rate parameter. Furthermore, a new continuous PBIL employing a histogram probabilistic model is proposed. Experimental results are presented that highlight the features of the new algorithms.
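
A hedged sketch of a Gaussian-model continuous PBIL in the spirit described above is shown below; it is not the exact update rule of the reviewed or proposed algorithms (for instance, it uses only the best individual and a fixed learning rate in place of the self-adaptive one):

    import random

    def continuous_pbil(evaluate, dim, generations=100, pop_size=30, lr=0.1):
        """Gaussian-model PBIL sketch: maintain (mean, sigma) per dimension."""
        mean = [0.0] * dim
        sigma = [1.0] * dim
        for _ in range(generations):
            population = [[random.gauss(m, s) for m, s in zip(mean, sigma)]
                          for _ in range(pop_size)]
            best = min(population, key=evaluate)           # minimisation
            # move the model towards the best individual and contract its spread
            mean = [(1 - lr) * m + lr * b for m, b in zip(mean, best)]
            sigma = [max(1e-6, (1 - lr) * s + lr * abs(b - m))
                     for s, b, m in zip(sigma, best, mean)]
        return mean

    sphere = lambda x: sum(v * v for v in x)               # simple test function
    print(continuous_pbil(sphere, dim=5))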

Relevance: 30.00%

Publisher:

Abstract:

The effect of an organically surface-modified layered silicate on the viscosity of various epoxy resins of different structures and functionalities was investigated. Steady and dynamic shear viscosities of the epoxy resins containing 0-10 wt% of the organoclay were determined using parallel-plate rheology. Viscosity results were compared with those achieved through addition of a commonly used micron-sized CaCO3 filler. It was found that the changes in viscosity due to the two fillers were of the same order, since the layered silicate was only dispersed on a micron-sized scale in the monomer (prior to reaction), as indicated by X-ray diffraction measurements. Flow activation energies at a low frequency were determined and did not show any significant changes due to the addition of organoclay or CaCO3. Comparison between dynamic and steady shear experiments showed good agreement for layered silicate concentrations below 7.5 wt%, i.e. the Cox-Merz rule can be applied. Deviations from the Cox-Merz rule appeared at and above 10 wt%, although such deviations were only slightly above experimental error. Most resin-organoclay blends were well predicted by the Power Law model, with only concentrations of 10 wt% and above requiring the Herschel-Bulkley (yield stress) model to achieve better fits. Wide-angle X-ray measurements have shown that the epoxy resin swells the layered silicate, with an increase in the interlayer distance of approximately 15 Angstrom, and that the rheological behaviour is due to the lateral, micron-scale size of these swollen tactoids.
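
For reference, the flow models and empirical rule named above take their standard forms (the paper's fitted parameter values are not reproduced here):

    Power Law:          \tau = K\,\dot{\gamma}^{\,n}
    Herschel-Bulkley:   \tau = \tau_y + K\,\dot{\gamma}^{\,n}
    Cox-Merz rule:      \eta(\dot{\gamma}) \approx |\eta^{*}(\omega)| \quad \text{at } \dot{\gamma} = \omega

where \tau is the shear stress, \dot{\gamma} the shear rate, K the consistency index, n the flow behaviour index, \tau_y the yield stress, \eta the steady shear viscosity and \eta^{*} the complex viscosity.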

Relevance: 30.00%

Publisher:

Abstract:

This paper proposes a novel application of fuzzy logic to web data mining for two basic problems of a website: popularity and satisfaction. Popularity means that people will visit the website, while satisfaction refers to the usefulness of the site. We illustrate that the popularity of a website is a fuzzy logic problem; it is an important characteristic for a website seeking to survive in Internet commerce. The satisfaction of a website is also a fuzzy logic problem, representing the degree of success in the application of information technology to the business. We propose a fuzzy logic framework for the representation of these two problems, based on web data mining techniques that fuzzify the attributes of a website.
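
As a hedged sketch of fuzzifying web-mined attributes (the attribute names, breakpoints and the single rule below are invented for illustration and are not taken from the paper), membership functions can turn raw site metrics into a graded "popularity" value:

    def trapezoid(x, a, b, c, d):
        """Trapezoidal membership function on [a, d] with plateau [b, c]."""
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        return (x - a) / (b - a) if x < b else (d - x) / (d - c)

    def popularity(daily_visits, avg_pages_per_visit):
        # Hypothetical fuzzification of two web-mined attributes.
        many_visits = trapezoid(daily_visits, 100, 1000, 1e6, 1e7)
        engaged = trapezoid(avg_pages_per_visit, 1, 3, 20, 50)
        # Simple rule: popular IF many visits AND engaged (min as AND).
        return min(many_visits, engaged)

    print(popularity(daily_visits=5000, avg_pages_per_visit=4))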

Relevance: 30.00%

Publisher:

Abstract:

Owing to the high degree of vulnerability of liquid retaining structures to corrosion problems, there are stringent requirements in their design against cracking. In this paper, a prototype knowledge-based system is developed and implemented for the design of liquid retaining structures based on the blackboard architecture. A commercially available expert system shell, VISUAL RULE STUDIO, working as an ActiveX Designer under the VISUAL BASIC programming environment, is employed. A hybrid knowledge representation approach, with production rules and procedural methods under object-oriented programming, is used to represent the engineering heuristics and design knowledge of this domain. It is demonstrated that the blackboard architecture is capable of integrating different kinds of knowledge in an effective manner. The system is tailored to give advice to users regarding preliminary design, loading specification and optimized configuration selection for this type of structure. An example application is given to illustrate the capabilities of the prototype system in transferring knowledge on liquid retaining structures to novice engineers. (C) 2004 Elsevier Ltd. All rights reserved.
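
A minimal sketch of the blackboard idea only (not the VISUAL RULE STUDIO / Visual Basic system itself; all rules, values and thresholds are illustrative, not design-code provisions): independent knowledge sources read and write a shared store, and simple production rules react to what is already posted there.

    # Minimal blackboard sketch: knowledge sources read and write a shared dict.
    blackboard = {"wall_height_m": 6.0, "liquid": "water"}

    def loading_specification(bb):
        if "hydrostatic_pressure_kpa" not in bb and "wall_height_m" in bb:
            # IF wall height known THEN post hydrostatic pressure at the base
            bb["hydrostatic_pressure_kpa"] = 9.81 * bb["wall_height_m"]

    def preliminary_design(bb):
        if "hydrostatic_pressure_kpa" in bb and "wall_thickness_mm" not in bb:
            # Illustrative heuristic only, not a design-code rule.
            bb["wall_thickness_mm"] = max(250, 5 * bb["hydrostatic_pressure_kpa"])

    knowledge_sources = [loading_specification, preliminary_design]
    for ks in knowledge_sources:          # a real controller would schedule these
        ks(blackboard)
    print(blackboard)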

Relevance: 30.00%

Publisher:

Abstract:

Evolutionary algorithms perform optimization using a population of sample solution points. An interesting development has been to view population-based optimization as the process of evolving an explicit, probabilistic model of the search space. This paper investigates a formal basis for continuous, population-based optimization in terms of stochastic gradient descent on the Kullback-Leibler divergence between the model probability density and the objective function, represented as an unknown density of assumed form. This leads to an update rule that is related to, and compared with, previous theoretical work, a continuous version of the population-based incremental learning algorithm, and the generalized mean shift clustering framework. Experimental results are presented that demonstrate the dynamics of the new algorithm on a set of simple test problems.
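
In generic notation (not necessarily the paper's exact formulation; the ordering of the divergence arguments is assumed here to be model-to-objective), with a model density p_\theta(x) and the objective represented as a normalized density \tilde{f}(x), such an update can be read as stochastic gradient descent on

    D_{\mathrm{KL}}\big(p_\theta \,\|\, \tilde{f}\big) = \int p_\theta(x)\, \log \frac{p_\theta(x)}{\tilde{f}(x)}\, dx,
    \qquad
    \theta_{t+1} = \theta_t - \eta\, \widehat{\nabla}_\theta\, D_{\mathrm{KL}}\big(p_{\theta_t} \,\|\, \tilde{f}\big),

where the gradient estimate \widehat{\nabla}_\theta is formed from the current population of samples drawn from p_{\theta_t}.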

Relevance: 30.00%

Publisher:

Abstract:

This paper defines the 3D reconstruction problem as the process of reconstructing a 3D scene from numerous 2D visual images of that scene. It is well known that this problem is ill-posed, and numerous constraints and assumptions are used in 3D reconstruction algorithms in order to reduce the solution space. Unfortunately, most constraints only work in a certain range of situations, and often constraints are built into the most fundamental methods (e.g. Area Based Matching assumes that all the pixels in the window belong to the same object). This paper presents a novel formulation of the 3D reconstruction problem, using a voxel framework and first-order logic equations, which does not contain any additional constraints or assumptions. Solving this formulation for a set of input images gives all the possible solutions for that set, rather than picking a solution that is deemed most likely. Using this formulation, the paper studies the problem of uniqueness in 3D reconstruction and how the solution space changes for different configurations of input images. It is found that it is not possible to guarantee a unique solution, no matter how many images are taken of the scene, their orientation, or even how much color variation is in the scene itself. Results of using the formulation to reconstruct a few small voxel spaces are also presented. They show that the number of solutions is extremely large for even very small voxel spaces (a 5 x 5 voxel space gives 10 to 10^7 solutions). This shows the need for constraints to reduce the solution space to a reasonable size. Finally, it is noted that, because of the discrete nature of the formulation, the solution space size can be easily calculated, making the formulation a useful tool for numerically evaluating the usefulness of any constraints that are added.
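
To give a flavour of such a constraint-free logical formulation (this is an illustrative sketch, not the paper's actual equations, and the predicate names are hypothetical): for every pixel q of every input image, with ray(q) the ordered set of voxels its viewing ray passes through, one can assert that the first occupied voxel along the ray carries the observed colour,

    \bigvee_{v \in \mathrm{ray}(q)} \Big( \mathit{occ}(v) \wedge \mathit{col}(v) = \mathit{col}(q) \wedge \bigwedge_{u \prec v} \neg\,\mathit{occ}(u) \Big)

where u \prec v means voxel u lies in front of v along the ray. Every assignment of occupancy and colour satisfying all such formulae is a valid reconstruction, which is why the solution space can be enumerated and counted.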

Relevance: 30.00%

Publisher:

Abstract:

Automatic signature verification is a well-established and active area of research with numerous applications such as bank check verification, ATM access, etc. This paper proposes a novel approach to the problem of automatic off-line signature verification and forgery detection. The proposed approach is based on fuzzy modeling that employs the Takagi-Sugeno (TS) model. Signature verification and forgery detection are carried out using angle features extracted with a box approach. Each feature corresponds to a fuzzy set. The features are fuzzified by an exponential membership function involved in the TS model, which is modified to include structural parameters. The structural parameters are devised to take account of possible variations due to handwriting styles and to reflect moods. The membership functions constitute weights in the TS model. Optimizing the output of the TS model with respect to the structural parameters yields the solution for the parameters. We have also derived two TS models: one with a rule for each input feature (multiple rules) and one with a single rule for all input features. In this work, we have found that the TS model with multiple rules is better than the TS model with a single rule for detecting three types of forgeries, random, skilled and unskilled, from a large database of sample signatures, in addition to verifying genuine signatures. We have also devised three approaches, viz. an innovative approach and two intuitive approaches, using the TS model with multiple rules for improved performance. (C) 2004 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
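
In the standard Takagi-Sugeno formulation (generic notation shown here; the paper's modified exponential membership with structural parameters is not reproduced), each rule i weights its consequent by a membership value, for example a Gaussian-type exponential, and the model output is the weighted average:

    \mu_i(x_i) = \exp\!\big(-\,(x_i - c_i)^2 / (2\,s_i^2)\big),
    \qquad
    y = \frac{\sum_{i} \mu_i(x_i)\,\big(a_i x_i + b_i\big)}{\sum_{i} \mu_i(x_i)},

with one rule per angle feature in the multiple-rule model, and a single rule over all features in the single-rule variant.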