31 results for rule-based logic


Relevance: 30.00%

Abstract:

A hybrid approach integrating group Delphi, fuzzy logic and expert systems for developing marketing strategies is proposed in this paper. Within this approach, the group Delphi method is employed to help groups of managers undertake SWOT analysis. Fuzzy logic is applied to fuzzify the results of the SWOT analysis. Expert systems are utilised to formulate marketing strategies based upon the fuzzified strategic inputs. In addition, guidelines are provided to help users link the hybrid approach with managerial judgement and intuition. The effectiveness of the hybrid approach has been validated with MBA and MA marketing students. It is concluded that the hybrid approach is more effective in terms of decision confidence, group consensus, understanding of strategic factors, support for strategic thinking, and the coupling of analysis with judgement.
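The abstract does not give the membership functions or rule base used; as an illustration only of the fuzzification step it describes, the sketch below maps crisp SWOT factor scores to fuzzy grades via triangular membership functions and fires one Mamdani-style rule. The grade boundaries and the rule are assumptions, not the paper's.

```python
def trimf(x, a, b, c):
    """Triangular membership function: rises a->b, falls b->c;
    flat shoulder when a == b or b == c."""
    if x < a or x > c:
        return 0.0
    if x < b:
        return (x - a) / (b - a) if b > a else 1.0
    return (c - x) / (c - b) if c > b else 1.0

def fuzzify_swot(score):
    """Map a crisp SWOT factor score in [0, 10] to fuzzy grades.
    Grade boundaries are illustrative, not taken from the paper."""
    return {
        "weak":     trimf(score, 0.0, 0.0, 5.0),
        "moderate": trimf(score, 2.5, 5.0, 7.5),
        "strong":   trimf(score, 5.0, 10.0, 10.0),
    }

def rule_activation(strengths, opportunities):
    """Mamdani-style AND (min) for a hypothetical rule:
    IF strength is strong AND opportunity is strong
    THEN recommend an aggressive (SO) strategy."""
    return min(strengths["strong"], opportunities["strong"])
```

A score of 7 out of 10, for example, would belong partly to the "moderate" and partly to the "strong" grade, so downstream strategy rules fire to a degree rather than all-or-nothing.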

Relevance: 30.00%

Abstract:

Adapting to blurred images makes in-focus images look too sharp, and vice versa (Webster et al., 2002, Nature Neuroscience 5, 839-840). We asked how such blur adaptation is related to contrast adaptation. Georgeson (1985, Spatial Vision 1, 103-112) found that grating contrast adaptation followed a subtractive rule: the perceived (matched) contrast of a grating was fairly well predicted by subtracting some fraction k (~0.3) of the adapting contrast from the test contrast. Here we apply that rule to the responses of a set of spatial filters at different scales and orientations. Blur is encoded by the pattern of filter response magnitudes over scale. We tested two versions - the 'norm model' and the 'fatigue model' - against blur-matching data obtained after adaptation to sharpened, in-focus or blurred images. In the fatigue model, filter responses are simply reduced by exposure to the adapter. In the norm model, (a) the visual system is pre-adapted to a focused world, and (b) discrepancy between observed and expected responses to the experimental adapter leads to additional reduction (or enhancement) of filter responses during experimental adaptation. The two models are closely related, but only the norm model gave a satisfactory account of results across the four experiments analysed, with one free parameter k. This model implies that the visual system is pre-adapted to focused images, that adapting to in-focus or blank images produces no change in adaptation, and that adapting to sharpened or blurred images changes the state of adaptation, leading to changes in perceived blur or sharpness.
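The abstract leaves the filter bank unspecified; a minimal sketch of the two candidate models, applied to a vector of filter-response magnitudes, might look as follows. The vector representation and the clipping at zero are assumptions, not details from the paper.

```python
import numpy as np

K = 0.3  # fraction of the adapting response subtracted (after Georgeson, 1985)

def fatigue_model(test, adapter, k=K):
    """Fatigue model: filter responses are simply reduced by a fraction k
    of the adapter's response in the same filter."""
    return np.clip(test - k * adapter, 0.0, None)

def norm_model(test, adapter, focused_norm, k=K):
    """Norm model: only the discrepancy between the adapter's responses and
    the pre-adapted (focused-world) norm shifts the test responses, so a
    focused or blank adapter (adapter == norm) changes nothing."""
    return np.clip(test - k * (adapter - focused_norm), 0.0, None)
```

The key behavioural difference is visible directly: with `adapter` equal to `focused_norm`, the norm model returns the test responses unchanged, matching the paper's claim that adapting to in-focus images produces no change in adaptation, whereas the fatigue model still reduces them.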

Relevance: 30.00%

Abstract:

An implementation of a Lexical Functional Grammar (LFG) natural language front-end to a database is presented, and its capabilities are demonstrated by reference to a set of queries used in the Chat-80 system. The potential of LFG for such applications is explored. Other grammars previously used for this purpose are briefly reviewed and contrasted with LFG. The basic LFG formalism is fully described, both as to its syntax and its semantics, and the deficiencies of the latter for database access applications are shown. Other current LFG implementations are reviewed and contrasted with the LFG implementation developed here specifically for database access. The implementation described here allows a natural language interface to a specific Prolog database to be produced from a set of grammar rules and lexical specifications in an LFG-like notation. In addition, the interface system uses a simple database description to compile metadata about the database for later use in planning the execution of queries. Extensions to LFG's semantic component are shown to be necessary to produce a satisfactory functional analysis and semantic output for querying a database. A diverse set of natural language constructs is analysed using LFG, and the derivation of Prolog queries from the F-structure output of LFG is illustrated. The functional description produced by LFG is proposed as sufficient for resolving many problems of quantification and attachment.

Relevance: 30.00%

Abstract:

An investigation was carried out into the different approaches used by Expert Systems researchers to solve problems in the domain of Mechanical Design. The techniques used for conventional formal logic programming were compared with those used when applying Expert Systems concepts. A literature survey of design processes was also conducted with a view to adopting a suitable model of the design process. A model, comprising a variation on two established ones, was developed and applied to a problem within what are described as class 3 design tasks. The research explored the application of these concepts to Mechanical Engineering Design problems and their implementation on a microcomputer using an Expert System building tool. It was necessary to explore the use of Expert Systems in this manner so as to bridge the gap between their use as a control structure and their use for detailed analytical design. The former application is well researched, and this thesis discusses the latter. Some Expert System building tools available to the author at the beginning of his work were evaluated specifically for their suitability for Mechanical Engineering design problems. Microsynics was found to be the most suitable on which to implement a design problem, because of its simple but powerful semantic-net knowledge representation structure and its ability to use other types of representation scheme. Two major implementations were carried out: the first involved a design program for a helical compression spring, and the second a gear-pair system design. Two concepts were proposed in the thesis for the modelling and implementation of design systems involving many equations. The method proposed enables equation manipulation and analysis using a combination of frames, semantic nets and production rules.
The use of semantic nets for purposes other than psychology and natural language interpretation is quite new, and represents one of the major contributions to knowledge by the author. The development of a purpose-built shell program for this type of design problem was recommended as an extension of the research. Microsynics may usefully be used as a platform for this development.

Relevance: 30.00%

Abstract:

Service-based systems that are dynamically composed at run time to provide complex, adaptive functionality are currently one of the main development paradigms in software engineering. However, the Quality of Service (QoS) delivered by these systems remains an important concern, and needs to be managed in an equally adaptive and predictable way. To address this need, we introduce a novel, tool-supported framework for the development of adaptive service-based systems called QoSMOS (QoS Management and Optimisation of Service-based systems). QoSMOS can be used to develop service-based systems that achieve their QoS requirements through dynamically adapting to changes in the system state, environment and workload. QoSMOS service-based systems translate high-level QoS requirements specified by their administrators into probabilistic temporal logic formulae, which are then formally and automatically analysed to identify and enforce optimal system configurations. The QoSMOS self-adaptation mechanism can handle reliability- and performance-related QoS requirements, and can be integrated into newly developed solutions or legacy systems. The effectiveness and scalability of the approach are validated using simulations and a set of experiments based on an implementation of an adaptive service-based system for remote medical assistance.
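The abstract does not detail the optimisation step; as an illustration only, the sketch below selects among candidate service configurations whose reliabilities are assumed to have already been computed by formal analysis (e.g. probabilistic model checking of a formula such as P>=0.98 [ F "done" ]). All names and numbers here are hypothetical, not from QoSMOS.

```python
def select_configuration(configs, min_reliability):
    """Among configurations that satisfy the reliability requirement,
    pick the one with the lowest expected response time. Each config is a
    dict with hypothetical keys: name, reliability, response_time (ms)."""
    feasible = [c for c in configs if c["reliability"] >= min_reliability]
    if not feasible:
        return None  # no configuration can meet the QoS requirement
    return min(feasible, key=lambda c: c["response_time"])
```

Re-running such a selection whenever the monitored system state, environment or workload changes is one simple way to realise the adapt-at-run-time loop the abstract describes.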

Relevance: 30.00%

Abstract:

Nearest feature line-based subspace analysis is first proposed in this paper. Compared with conventional methods, the newly proposed one brings better generalization performance and supports incremental analysis. The projection point and feature line distance are expressed as a function of a subspace, which is obtained by minimizing the mean square feature line distance. Moreover, by adopting a stochastic approximation rule to minimize the objective function in a gradient manner, the new method can be performed in an incremental mode, which makes it work well on future data. Experimental results on the FERET face database and the UCI satellite image database demonstrate its effectiveness.
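The projection point and feature line distance mentioned above have a standard closed form; a minimal sketch of that computation (not the paper's incremental, subspace-learning version):

```python
import numpy as np

def feature_line_distance(x, y1, y2):
    """Distance from query point x to the feature line through the two
    prototype feature points y1 and y2. The projection point is
    y1 + t * (y2 - y1), with t given by least-squares projection."""
    d = y2 - y1
    t = np.dot(x - y1, d) / np.dot(d, d)  # position parameter of projection
    p = y1 + t * d                        # projection point on the line
    return np.linalg.norm(x - p), p
```

In nearest-feature-line classification, each pair of prototypes of a class defines one such line, and a query is assigned to the class containing its nearest feature line.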

Relevance: 30.00%

Abstract:

The study here highlights the potential that analytical methods based on Knowledge Discovery in Databases (KDD) methodologies have to aid both the resolution of unstructured marketing/business problems and the process of scholarly knowledge discovery. The authors present and discuss the application of KDD in these situations prior to the presentation of an analytical method based on fuzzy logic and evolutionary algorithms, developed to analyze marketing databases and uncover relationships among variables. A detailed implementation on a pre-existing data set illustrates the method. © 2012 Published by Elsevier Inc.

Relevance: 30.00%

Abstract:

Objective: Much recent research has applied nature-inspired algorithms to complex machine learning tasks. Ant colony optimization (ACO) is one such algorithm, based on swarm intelligence and derived from a model inspired by the collective foraging behavior of ants. Taking advantage of ACO traits such as self-organization and robustness, this paper investigates ant-based algorithms for gene expression data clustering and associative classification. Methods and material: An ant-based clustering algorithm (Ant-C) and an ant-based association rule mining algorithm (Ant-ARM) are proposed for gene expression data analysis. The proposed algorithms make use of natural ant behaviors such as cooperation and adaptation to allow a flexible, robust search for a good candidate solution. Results: Ant-C has been tested on three datasets selected from the Stanford Genomic Resource Database and achieved relatively high accuracy compared to other classical clustering methods. Ant-ARM has been tested on the acute lymphoblastic leukemia (ALL)/acute myeloid leukemia (AML) dataset and generated about 30 classification rules with high accuracy. Conclusions: Ant-C can generate an optimal number of clusters without incorporating any other algorithm such as K-means or agglomerative hierarchical clustering. For associative classification, while several well-known algorithms such as Apriori, FP-growth and Magnum Opus are unable to mine any association rules from the ALL/AML dataset within a reasonable period of time, Ant-ARM is able to extract associative classification rules.
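The abstract does not state how Ant-ARM scores candidate rules; support and confidence are the conventional quality measures for association rules, sketched here purely as an illustration (the transactions below are hypothetical, not from the ALL/AML dataset):

```python
def support(transactions, itemset):
    """Fraction of transactions that contain every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent:
    support of both sides together, divided by support of the antecedent."""
    return support(transactions, antecedent | consequent) / support(transactions, antecedent)
```

An associative classifier keeps rules whose consequent is a class label (e.g. ALL or AML) and whose support and confidence exceed chosen thresholds.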

Relevance: 30.00%

Abstract:

This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architecture: pull architectures and push architectures. There is exiguous data on the performance of CDC architectures in a real-time environment, and such performance data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition for capture latency, and how to measure it, does not exist in the field. We create this definition and extend the TPC-C benchmark to make the capture latency measurement.
The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places a minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
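Under the thesis's definition, capture latency is the time from a change being applied (committed) in the OLTP database to its capture by the CDC mechanism; a minimal sketch of that measurement follows, with field names that are assumptions rather than the thesis's:

```python
from dataclasses import dataclass

@dataclass
class ChangeEvent:
    key: str
    committed_at: float  # when the OLTP transaction committed (seconds)
    captured_at: float   # when the CDC mechanism observed the change

def capture_latency(event):
    """Capture latency: commit in the OLTP database to capture by CDC."""
    return event.captured_at - event.committed_at

def mean_capture_latency(events):
    """Average capture latency over a benchmark run."""
    return sum(capture_latency(e) for e in events) / len(events)
```

In an extended TPC-C run as the thesis describes, each change applied by the benchmark would be timestamped at commit and again when the CDC mechanism surfaces it, and the distribution of these latencies compared across mechanisms.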

Relevance: 30.00%

Abstract:

In the developed world we are surrounded by man-made objects, but most people give little thought to the complex processes needed for their design. The design of hand knitting is complex because much of the domain knowledge is tacit. The objective of this thesis is to devise a methodology to help designers work within design constraints whilst facilitating creativity. A hybrid solution including computer-aided design (CAD) and case-based reasoning (CBR) is proposed. The CAD system creates designs using domain-specific rules, and these designs are employed for initial seeding of the case base and the management of constraints. CBR reuses the designer's previous experience. The key aspects of the CBR system are measuring the similarity of cases and adapting past solutions to the current problem. Similarity is measured by asking the user to rank the importance of features; the ranks are then used to calculate weights for an algorithm which compares the specifications of designs. A novel adaptation operator called rule difference replay (RDR) is created. When the specification for a new design is presented, the CAD program uses it to construct a design constituting an approximate solution. The most similar design from the case base is then retrieved, and RDR replays the changes previously made to the retrieved design on the new solution. A measure of solution similarity that can validate subjective success scores is created. Specification similarity can be used as a guide to whether to invoke CBR in a hybrid CAD-CBR system: if the resulting design is sufficiently similar to a previous design, then CBR is invoked; otherwise CAD is used. The application of RDR to knitwear design has demonstrated the flexibility to overcome deficiencies in rules that try to automate creativity, and has the potential to be applied to other domains such as interior design.
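The abstract says user-assigned feature ranks are converted to weights for the specification-similarity algorithm, but not how; one common scheme, inverse-rank weighting over matching specification features, is sketched below as an assumption rather than the thesis's actual method:

```python
def rank_weights(ranks):
    """Convert user-assigned importance ranks (1 = most important) into
    normalised weights. Inverse-rank weighting is an assumed scheme."""
    inv = {feature: 1.0 / rank for feature, rank in ranks.items()}
    total = sum(inv.values())
    return {feature: v / total for feature, v in inv.items()}

def weighted_similarity(spec_a, spec_b, weights):
    """Weighted overlap between two design specifications, each a
    mapping of feature name -> value."""
    return sum(w for f, w in weights.items() if spec_a.get(f) == spec_b.get(f))
```

Retrieval then simply returns the stored case whose specification maximises this weighted similarity to the new specification.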

Relevance: 30.00%

Abstract:

In repetitive operations the productivity dilemma has been widely studied, but there is a lack of research in non-repetitive operations, such as in project-based firms. This paper investigates how project-based firms foster or hinder project flexibility through an embedded multi-case study of six projects within a large German project-based firm. The results suggest that although such firms have projects as their key source of revenue, their focus lies in longevity and survival, and this logic is, in some instances, at odds with the temporary nature of the project context.

Relevance: 30.00%

Abstract:

The creation of new ventures is a process characterized by the need to decide and take action in the face of uncertainty, and this is particularly so in the case of technology-based ventures. Effectuation theory (Sarasvathy, 2001) has advanced two possible approaches for making decisions while facing uncertainty in the entrepreneurial process. Causation logic is based on prediction and aims at lowering uncertainty, whereas effectuation logic is based on non-predictive action and aims at working with uncertainty. This study aims to generate more fine-grained insight into the dynamics of effectuation and causation over time. We address the following questions: (1) What patterns can be found in the effectual and causal behaviour of technology-based new ventures over time? And (2) How may patterns in the dynamics of effectuation and causation be explained?

Relevance: 30.00%

Abstract:

The present study explores strategies used to legitimize the transfer of organizational practices in a situation of institutional upheaval. We apply the logic of social action (Risse, 2000) to analyze the effectiveness of consequence-based action and communication-based action, in terms of higher coordination, lower conflict, and overall higher economic performance. Consequence-based legitimation is obtained by using a system of distributor incentives tied to performance of specific tasks, while communicative legitimation can be achieved by recommendations and warnings. Our setting is an export channel to European emerging economies. Our results indicate that in the absence of legitimacy, as manifested in discretionary legal enforcement, consequence-based legitimation is more effective than communicative legitimation in reducing conflict, increasing coordination, and ultimately in improving the performance of the export dyad. © 2014 Elsevier Inc.

Relevance: 30.00%

Abstract:

In the contemporary business environment, adherence to customer needs has caused the shift from mass production to mass customization. This necessitates that the supply chain (SC) be effectively flexible. The purpose of this paper is to seek flexibility through the adoption of family-based dispatching rules under the influence of the inventory system implemented at downstream echelons of an industrial supply chain network. We compared the family-based dispatching rules in the existing literature under the purview of the inventory system and information sharing within a supply chain network. The dispatching rules are compared on Average Flow Time performance, averaged over three product families. The performance is measured using an extensive discrete event simulation process. Given the various inventory-related operational factors at downstream echelons, the present paper highlights the importance of strategically adopting an appropriate family-based dispatching rule at the manufacturing end. In an environment of mass customization, it becomes imperative to adopt the family-based dispatching rule from a system-wide SC perspective. This warrants the application of intra- as well as inter-echelon information coordination. The holonic paradigm emerges in this research stream, amidst the holistic approach and the vital systemic approach. The novelty of the present research is threefold. Firstly, it gives managers leverage to strategically adopt a dispatching rule from the inventory system perspective. Secondly, the findings provide direction for attenuating the adverse impact of demand amplification (the bullwhip effect) on inventory levels by appropriately adopting a family-based dispatching rule. Thirdly, the information environment is conceptualized under the paradigm of Koestler's holonic theory.
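The abstract does not define its family-based dispatching rules precisely; the core idea, preferring queued jobs from the product family already set up on the machine so as to avoid setup changeovers, can be sketched as follows (the job representation is an assumption):

```python
def family_based_dispatch(queue, current_family):
    """Pick the next job from the queue: prefer jobs belonging to the
    family currently set up on the machine (no changeover needed),
    breaking ties by arrival time; otherwise fall back to plain FCFS.
    Each job is a (arrival_time, family) tuple."""
    same_family = [job for job in queue if job[1] == current_family]
    pool = same_family if same_family else queue
    return min(pool, key=lambda job: job[0])
```

Compared with pure first-come-first-served, this trades some queue-order fairness for fewer family changeovers, which is what drives the Average Flow Time differences such simulation studies measure.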

Relevance: 30.00%

Abstract:

In view of the increasing complexity of service logic and functional requirements, a new system architecture based on SOA was proposed for equipment remote monitoring and diagnosis systems. According to the design principles of SOA, the service logic and functional requirements of the remote monitoring and diagnosis system were divided into different levels and granularities, and a loosely coupled web services system was built. The design and implementation scheme of the core function modules for the proposed architecture was presented. A demo system was used to validate the feasibility of the proposed architecture.