845 results for Knowledge Sharing and Reuse
Abstract:
Knowledge elicitation is a common technique used to produce rules about the operation of a plant from the knowledge available from human expertise. Similarly, data mining is becoming a popular technique for extracting rules from the data generated by the operation of a plant. In the work reported here, knowledge was required to enable the supervisory control of an aluminium hot strip mill through the determination of mill set-points. A method was developed to fuse knowledge elicitation and data mining, incorporating the best aspects of each technique while avoiding their known problems. The knowledge was utilised through an expert system, which determined schedules of set-points and provided information to human operators. The results show that the proposed method was effective in producing rules for the on-line control of a complex industrial process.
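A minimal sketch of the data-mining half of such a fusion, assuming a decision-tree learner over synthetic mill data. The feature names (entry_temp, strip_width, target_gauge), the set-point target, and the tree algorithm are illustrative assumptions; the paper does not specify which mining technique was used.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
# Synthetic operating records: entry temperature, strip width, target gauge
X = rng.uniform([950, 800, 2.0], [1100, 1600, 12.0], size=(500, 3))
# Synthetic set-point with noise, standing in for historical mill data
y = 0.04 * X[:, 0] - 0.01 * X[:, 1] + 3.0 * X[:, 2] + rng.normal(0, 1, 500)

# A shallow tree keeps the mined rules few and human-readable
tree = DecisionTreeRegressor(max_depth=3).fit(X, y)
print(export_text(tree, feature_names=["entry_temp", "strip_width", "target_gauge"]))

# Elicited expert knowledge can then vet the mined rules (e.g. clamp
# set-points to safe limits) before they enter the expert system's rule base.
```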
Abstract:
This paper introduces a new neurofuzzy model construction and parameter estimation algorithm for observed finite data sets, based on a Takagi and Sugeno (T-S) inference mechanism and a new extended Gram-Schmidt orthogonal decomposition algorithm, for modeling a priori unknown dynamical systems as a set of fuzzy rules. The first contribution of the paper is the introduction of a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace using the T-S inference mechanism. This link enables the numerical properties associated with a rule-based matrix subspace, the relationships amongst these matrix subspaces, and the correlation between the output vector and a rule-base matrix subspace to be investigated and extracted as rule-based knowledge to enhance model transparency. The matrix subspace spanned by a fuzzy rule is initially derived as the input regression matrix multiplied by a weighting matrix consisting of the corresponding fuzzy membership functions over the training data set. Model transparency is explored by deriving an equivalence between an A-optimality experimental design criterion of the weighting matrix and the average model output sensitivity to the fuzzy rule, so that rules can be effectively measured by their identifiability via the A-optimality criterion. The A-optimality criterion of the weighting matrices of fuzzy rules is used to construct an initial model rule base. An extended Gram-Schmidt algorithm is then developed to estimate the parameter vector for each rule. This new algorithm decomposes the model rule bases via an orthogonal subspace decomposition approach, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. The new approach is computationally simpler than the conventional Gram-Schmidt algorithm for high-dimensional regression problems, where it is desirable to decompose a complex model into a few submodels rather than fit a single model with a large number of input variables, thereby mitigating the curse of dimensionality. Numerical examples are included to demonstrate the effectiveness of the proposed algorithm.
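The rule/subspace link lends itself to a short numerical illustration. The sketch below builds each rule's weighted regression matrix (the regression matrix scaled by that rule's normalised firing strengths) and scores rule identifiability with an A-optimality-style criterion, here taken as trace((P_i^T P_i)^{-1}); the Gaussian memberships, centres, width, and this exact score are illustrative assumptions, and the paper's precise criterion may differ.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))            # training inputs
Phi = np.hstack([X, np.ones((200, 1))])          # affine T-S regressors [x1, x2, 1]

centres = np.array([[-0.5, -0.5], [0.5, 0.5], [0.5, -0.5]])
width = 0.4
# Gaussian memberships, then normalised firing strengths over the training set
act = np.exp(-((X[:, None, :] - centres[None]) ** 2).sum(-1) / (2 * width**2))
w = act / act.sum(axis=1, keepdims=True)         # shape (N, n_rules)

for i in range(centres.shape[0]):
    P_i = w[:, i:i + 1] * Phi                    # rule-i weighted regression matrix
    score = np.trace(np.linalg.inv(P_i.T @ P_i)) # A-optimality-style identifiability
    print(f"rule {i}: A-score = {score:.3f}  (smaller = better identified)")
```

Rules whose weighting matrices yield a small trace are well excited by the data, which is what makes the criterion usable for constructing the initial rule base.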
Abstract:
A new robust neurofuzzy model construction algorithm has been introduced for the modeling of a priori unknown dynamical systems from observed finite data sets in the form of a set of fuzzy rules. Based on a Takagi-Sugeno (T-S) inference mechanism, a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace is established. This link enables rule-based knowledge to be extracted from the matrix subspaces to enhance model transparency. To maximize model robustness and sparsity, a new robust extended Gram-Schmidt (G-S) method has been introduced via two effective and complementary approaches: regularization and D-optimality experimental design. Model rule bases are decomposed into orthogonal subspaces, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. A locally regularized orthogonal least squares algorithm, combined with a D-optimality criterion for subspace-based rule selection, has been extended for fuzzy rule regularization and subspace-based information extraction. By using a weighting for the D-optimality cost function, the entire model construction procedure becomes automatic. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
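A hedged sketch of this kind of forward orthogonal selection follows, combining a regularised error-reduction ratio with a D-optimality term; the specific cost weighting, regularisation value, and data below are assumptions for illustration, not the paper's settings.

```python
import numpy as np

def ols_dopt_select(P, y, n_select, lam=1e-3, beta=1e-2):
    """Greedy Gram-Schmidt selection over candidate regressors (columns of P)."""
    N, M = P.shape
    W = P.astype(float).copy()          # working copy, orthogonalised in place
    selected = []
    for _ in range(n_select):
        scores = np.full(M, -np.inf)
        for k in range(M):
            if k in selected:
                continue
            wk = W[:, k]
            err = (wk @ y) ** 2 / ((wk @ wk + lam) * (y @ y))  # regularised ERR
            scores[k] = err + beta * np.log(wk @ wk + 1e-12)   # + D-optimality term
        best = int(np.argmax(scores))
        selected.append(best)
        wb = W[:, best]
        for k in range(M):              # orthogonalise remaining candidates
            if k not in selected:
                W[:, k] -= (wb @ W[:, k]) / (wb @ wb) * wb
    return selected

rng = np.random.default_rng(2)
P = rng.normal(size=(100, 8))           # candidate rule regressors
y = P[:, [1, 4]] @ np.array([2.0, -1.0]) + 0.05 * rng.normal(size=100)
print("selected rule indices:", ols_dopt_select(P, y, n_select=2))
```

Because the columns are orthogonalised as they are selected, det(W^T W) equals the product of the column energies, so the log-energy term greedily increases the D-optimality of the selected subset while the regularised error-reduction ratio keeps the model sparse.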
Abstract:
A survey of the knowledge, attitudes and practices (KAP) of 100 rice farmers and 50 coconut farmers was conducted in the coastal lowland agro-ecosystems of the Sierra Madre Biodiversity Corridor, Luzon, Philippines to identify current rodent management practices and to understand the extent of rat damage and the attitudes of farmers to community actions for rodent management. Pests were most commonly listed as one of the three most important rice and coconut production constraints. Other major crop production constraints were typhoons and insufficient water. Farmers consider rats to be the major pest of coconut and of rice during the wet season rice crop, with average yield losses of 3.0% and 13.2%, respectively. Rice and coconut farmers practised a wide range of rodent management techniques, including scrub clearance, hunting and trapping. Of the 42 rice farmers and 3 coconut farmers who applied rodenticides, all used the acute rodenticide zinc phosphide. However, only ten rice farmers (23.8%) applied rodenticides prior to the booting stage and only seven farmers (15.6%) conducted pre-baiting before applying zinc phosphide. The majority of farmers belonged to farmer organisations and believed that rat control can only be done by farmers working together. However, during the last cropping season, less than a third of rice farmers (31.2%) applied rodent management as a group. To reduce the impact of rodents on the farmers of the coastal lowlands of the Sierra Madre Biodiversity Corridor, integrated management strategies need to be developed that specifically target the pest rodents in a sustainable manner, and community actions for rodent management should be promoted.
Abstract:
A major infrastructure project is used to investigate the role of digital objects in the coordination of engineering design work. From a practice-based perspective, research emphasizes objects as important in enabling cooperative knowledge work and knowledge sharing. The term ‘boundary object’ has come to be used in analysing mutual and reciprocal knowledge sharing around physical and digital objects. The aim is to extend this work by analysing the introduction of an extranet into a public–private partnership project to construct a new motorway. Multiple categories of digital objects are mobilized in coordination across heterogeneous, cross-organizational groups. The main findings are that digital objects provide mechanisms for accountability and control, as well as for mutual and reciprocal knowledge sharing, and that different types of objects are nested, forming a digital infrastructure for project delivery. Reconceptualizing boundary objects as a digital infrastructure for delivery has practical implications for management practices on large projects and for the use of digital tools, such as building information models, in construction. It provides a starting point for future research into the changing nature of digitally enabled coordination in project-based work.