929 results for Knowledge acquisition system
Abstract:
This paper describes the user modeling component of EPIAIM, a consultation system for data analysis in epidemiology. The component represents knowledge of concepts in the domain, so that their explanations can be adapted to user needs. The first part of the paper describes two studies of user requirements. The first is a questionnaire study examining respondents' familiarity with concepts. The second is an analysis of concept descriptions in textbooks and from expert epidemiologists, examining how discourse strategies are tailored to the level of experience of the expected audience. The second part of the paper describes how the results of these studies were used to design the user modeling component of EPIAIM. This module works in two steps. In the first step, a few trigger questions activate a stereotype that includes a "body" and an "inference component". The body represents the body of knowledge that a class of users is expected to know, along with the probability that each item of knowledge is known. In the inference component, the process of learning concepts is represented as a belief network. In the second step, the belief network is used to refine the initial default information in the stereotype's body: the system asks a few questions about those concepts whose knowledge status is uncertain, and propagates this new evidence to revise the whole model. The system has been implemented on a workstation under UNIX. An example of the system in operation is presented, and advantages and limitations of the approach are discussed.
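The two-step approach described in this abstract can be sketched as follows. This is a minimal illustration with hypothetical concept names and a crude stand-in for belief-network propagation, not EPIAIM's actual implementation:

```python
# Step 1: a trigger question activates a stereotype whose "body" holds
# prior probabilities that a user class knows each concept.
# (Concept names and probabilities are invented for illustration.)
stereotype_body = {
    "odds_ratio": 0.9,
    "confounding": 0.6,
    "logistic_regression": 0.3,
}

def most_uncertain(body):
    """Pick the concept whose knowledge status is most uncertain
    (probability closest to 0.5) -- the one worth asking about."""
    return min(body, key=lambda c: abs(body[c] - 0.5))

def propagate(body, concept, known, links, strength=0.3):
    """Step 2: revise the body after the user answers. A crude stand-in
    for belief-network propagation: shift concepts linked to the queried
    one toward the observed evidence by a fixed strength."""
    revised = dict(body)
    revised[concept] = 1.0 if known else 0.0
    for other in links.get(concept, []):
        delta = strength if known else -strength
        revised[other] = min(1.0, max(0.0, revised[other] + delta))
    return revised

# knowing "confounding" makes knowing "logistic_regression" more likely
links = {"confounding": ["logistic_regression"]}
target = most_uncertain(stereotype_body)
refined = propagate(stereotype_body, target, known=True, links=links)
```

A real belief network would propagate evidence through conditional probability tables rather than a fixed shift, but the control flow (activate defaults, query the most uncertain concept, revise everything) is the same.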
Abstract:
This paper describes a case study of an electronic data management system developed in-house by the Facilities Management Directorate (FMD) of an educational institution in the UK. The FMD Maintenance and Business Services department is responsible for the maintenance of the built estate owned by the university. The department needs a clear definition of the type of work undertaken and of the administration that enables any maintenance work to be carried out, including the management of resources, budget, cash flow and the workflow of reactive, preventative and planned maintenance of the campus. To support its business process more efficiently, the FMD decided to move from a paper-based information system to an electronic system, WREN. The main advantages of WREN are that it is tailor-made to fit the purpose of its users, it is cost-effective when modifications to the system are needed, and its database can also be used as a knowledge management tool. There is a trade-off: as WREN is tailored to the specific requirements of the FMD, it may not be easy to implement in a different institution without extensive modification. However, WREN not only allows the FMD to carry out the tasks of maintaining and looking after the built estate of the university, but has also achieved its aim of minimising costs and maximising efficiency.
Abstract:
Construction materials and equipment are essential building blocks of every construction project and may account for 50-60 per cent of the total cost of construction. The rate of their utilization, on the other hand, is the element most directly related to a project's progress. A growing concern in the industry that inadequate efficiency hinders its success could thus be addressed by turning construction into a logistic process. Although mostly limited, recent attempts and studies show that Radio Frequency IDentification (RFID) applications have significant potential in construction. The aim of this research, however, is to show that the technology should be used not only for automation and tracking to overcome supply chain complexity, but also as a tool to generate, record and exchange process-related knowledge among the supply chain stakeholders. This would enable all involved parties to identify and understand the consequences of any forthcoming difficulties and react accordingly before they cause major disruptions in the construction process. To achieve this aim, the study proceeds in three steps. First, it develops a generic understanding of how RFID technology has been used in logistic processes in industrial supply chain management. Second, it investigates recent applications of RFID as an information and communication technology support facility in construction logistics for the management of the construction supply chain. Based on these, the study develops an improved concept of a construction logistics architecture that explicitly relies on integrating RFID with the Global Positioning System (GPS). The developed conceptual architecture shows that the categorisation provided through RFID, and the traceability resulting from RFID/GPS integration, could be used as a tool to identify, record and share potential problems and thus vastly improve knowledge management processes within the entire supply chain.
The findings thus clearly show a need for future research in this area.
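The core idea of this abstract, pairing each RFID read with a GPS fix so that the record becomes shared, annotatable process knowledge, can be sketched in a few lines. Tag identifiers, coordinates and notes below are invented for illustration; the paper's actual architecture is not described at this level of detail:

```python
from datetime import datetime, timezone

# Shared event log: one entry per RFID read, with position and a
# free-text note stakeholders can use to record observed problems.
event_log = []

def record_read(tag_id, lat, lon, note=""):
    """Store one RFID read together with its GPS position and an
    optional annotation (e.g. a difficulty observed at that point)."""
    event_log.append({
        "tag": tag_id, "lat": lat, "lon": lon,
        "time": datetime.now(timezone.utc).isoformat(), "note": note,
    })

def trace(tag_id):
    """Return the full movement/annotation history of one tagged item,
    in the order the reads occurred."""
    return [e for e in event_log if e["tag"] == tag_id]

record_read("PIPE-042", 51.5074, -0.1278, note="left supplier yard")
record_read("PIPE-042", 51.5200, -0.1300, note="delayed at gate")
history = trace("PIPE-042")
```

Traceability here is just a filter over the shared log; the knowledge-management benefit the abstract argues for comes from all supply-chain parties writing to and querying the same record.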
Abstract:
Complementarity in acquisition of nitrogen (N) from soil and N-2-fixation within pea and barley intercrops was studied in organic field experiments across Western Europe (Denmark, United Kingdom, France, Germany and Italy). Spring pea and barley were sown either as sole crops, at the recommended plant density (P100 and B100, respectively), or in replacement (P50B50) or additive (P100B50) intercropping designs, in each of three cropping seasons (2003-2005). Irrespective of site and intercrop design, Land Equivalent Ratios (LER) of 1.4 at flowering and 1.3 at maturity showed that total N recovery was greater in the pea-barley intercrops than in the sole crops, suggesting a high degree of complementarity over a wide range of growing conditions. Complementarity was partly attributed to greater soil mineral N acquisition by barley, forcing pea to rely more on N-2-fixation. At all sites the proportion of total aboveground pea N derived from N-2-fixation was greater when intercropped with barley than when grown as a sole crop. No consistent differences were found between the two intercropping designs. Simultaneously, the accumulation of phosphorus (P), potassium (K) and sulphur (S) in the Danish and German experiments was 20% higher in the intercrop (P50B50) than in the respective sole crops, possibly influencing general crop yields and thereby competitive ability for other resources. Comparing all sites and seasons, the benefits of organic pea-barley intercropping for N acquisition were highly resilient. It is concluded that pea-barley intercropping is a relevant cropping strategy to adopt when trying to optimize N-2-fixation inputs to the cropping system. (C) 2009 Elsevier B.V. All rights reserved.
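The Land Equivalent Ratio used in this abstract has a standard definition: the sum, over the component crops, of each crop's intercrop performance relative to its sole-crop performance, with values above 1 indicating an intercrop advantage. The numbers below are illustrative, not the study's data:

```python
def ler(pea_inter, pea_sole, barley_inter, barley_sole):
    """Land Equivalent Ratio:
    LER = (pea intercrop / pea sole) + (barley intercrop / barley sole).
    Inputs can be yields or, as in the abstract, N recovery."""
    return pea_inter / pea_sole + barley_inter / barley_sole

# e.g. each component recovering 65% of its sole-crop N on the shared
# area gives LER = 0.65 + 0.65 = 1.3, the study's value at maturity
value = ler(pea_inter=65, pea_sole=100, barley_inter=65, barley_sole=100)
```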
Abstract:
This article is a commentary on several research studies conducted on the prospects for aerobic rice production systems that aim at reducing the demand for irrigation water, which in certain major rice producing areas of the world is becoming increasingly scarce. The research studies considered, as reported in published articles mainly under the aegis of the International Rice Research Institute (IRRI), have a narrow scope in that they test only 3 or 4 rice varieties under different soil moisture treatments obtained with controlled irrigation, but with other agronomic factors of production held constant. Consequently, these studies do not permit an assessment of the interactions among agronomic factors that will be of critical significance to the performance of any production system. Varying the production factor of "water" will also seriously affect the levels of the other factors required to optimise the performance of a production system. The major weakness in the studies analysed in this article originates from not taking account of the interactions between experimental and non-experimental factors involved in the comparisons between different production systems. This applies to the experimental field design used for the research studies as well as to the subsequent statistical analyses of the results. The existence of such interactions is a serious complicating element that makes meaningful comparisons between different crop production systems difficult. Consequently, the data and conclusions drawn from such research readily become biased towards proposing standardised solutions for possible introduction to farmers through a linear technology transfer process. Yet, the variability and diversity encountered in the real-world farming environment demand more flexible solutions and approaches in the dissemination of knowledge-intensive production practices through "experiential learning" types of processes, such as those employed by farmer field schools.
This article illustrates, drawing on experience with the 'system of rice intensification' (SRI), that several cost-effective and environment-friendly agronomic solutions to reduce the demand for irrigation water, other than the asserted need for the introduction of new cultivars, are feasible. Further, these agronomic solutions can offer immediate benefits of reduced water requirements and increased net returns that would be readily accessible to a wide range of rice producers, particularly resource-poor smallholders. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
The EU Project AquaTerra generates knowledge about the river-soil-sediment-groundwater system and delivers scientific information of value for river basin management. In this article, the use and ignorance of scientific knowledge in decision making are explored through a theoretical review. We elaborate on the 'two-communities theory', which explains the problems of the policy-science interface by relating and comparing the different cultures, contexts, and languages of researchers and policy makers. Within AquaTerra, the EUPOL subproject examines the policy-science interface with the aim of achieving a good connection between the scientific output of the project and EU policies. We have found two major barriers, namely language and resources, as well as two types of relevant relationships: those between different research communities and those between researchers and policy makers. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
The development of normal and abnormal glandular structures in the prostate is controlled at the endocrine and paracrine levels by reciprocal interactions between epithelium and stroma. To study these processes it is useful to have an efficient method of tissue acquisition for reproducible isolation of cells from defined histologies. Here we assessed the utility of a standardized system for acquisition and growth of prostatic cells from different regions of the prostate with different pathologies, and we compared the abilities of stromal cells from normal peripheral zone (PZ-S), benign prostatic hyperplasia (BPH-S), and cancer (CA-S) to induce the growth of a human prostatic epithelial cell line (BPH-1) in vivo. Using the tissue recombination method, we showed that grafting stromal cells (from any histology) alone, or BPH-1 epithelial cells alone produced no visible grafts. Recombining PZ-S with BPH-1 cells also produced no visible grafts (n = 15). Recombining BPH-S with BPH-1 cells generated small, well-organized and sharply demarcated grafts approximately 3-4 mm in diameter (n = 9), demonstrating a moderate inductive ability of BPH-S. Recombining CA-S with BPH-1 cells generated highly disorganized grafts that completely surrounded the host kidney and invaded into adjacent renal tissue, demonstrating induction of an aggressive phenotype. We conclude that acquisition of tissue from toluidine blue dye stained specimens is an efficient method to generate high quality epithelial and/or stromal cultures. Stromal cells derived by this method from areas of BPH and cancer induce epithelial cell growth in vivo which mimics the natural history of these diseases.
Abstract:
This paper introduces a knowledge-based management prototype entitled E+ for environment-conscious construction, based on an integration of current environmental management tools in the construction area. The overall objective of developing the E+ prototype is to facilitate the selective reuse of retrievable knowledge in construction engineering and management, assembled from previous projects, for best practice in environment-conscious construction. The methodologies adopted in previous and ongoing research related to the development of E+ belong to the operations research and information technology areas, including literature review, questionnaire survey and interview, statistical analysis, system analysis and development, and experimental research and simulation. The content presented in this paper includes an advanced E+ prototype, a comprehensive review of the environmental management tools integrated into the E+ prototype, and an experimental case study of the implementation of the E+ prototype. It is expected that the adoption and implementation of the E+ prototype can effectively help contractors improve their environmental performance over the lifecycle of project-based construction and reduce the adverse environmental impacts of the various engineering and management processes deployed at each construction stage.
Abstract:
Automatic indexing and retrieval of digital data pose major challenges. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieving such data based on semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. For a number of years research has been ongoing in the field of ontological engineering with the aim of using ontologies to add such (meta) knowledge to information. In this paper, we describe the architecture of a system, Dynamic REtrieval Analysis and semantic metadata Management (DREAM), designed to automatically and intelligently index huge repositories of special effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval. The DREAM Demonstrator has been evaluated as deployed in the film post-production phase to support the storage, indexing and retrieval of large data sets of special effects video clips as an exemplar application domain. This paper provides performance and usability results and highlights the scope for future enhancements of the DREAM architecture, which has proven successful in its first and possibly most challenging proving ground, namely film production, where it is already in routine use within our test bed partners' creative processes. (C) 2009 Published by Elsevier B.V.
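The general idea of ontology-assisted retrieval that this abstract describes, matching a query concept against clips indexed by semantic tags after expanding the concept through the ontology's narrower-term links, can be sketched as follows. The ontology, clip identifiers and tags are invented; this is not the DREAM system's API:

```python
# Toy ontology: concept -> narrower concepts (hypothetical data).
ontology = {
    "explosion": ["fireball", "shockwave"],
    "weather": ["rain", "snow"],
}

# Toy semantic index: clip id -> set of concept tags (hypothetical data).
index = {
    "clip_001": {"fireball"},
    "clip_002": {"rain"},
    "clip_003": {"shockwave", "rain"},
}

def expand(concept):
    """Return the concept plus everything narrower, transitively."""
    seen, stack = set(), [concept]
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(ontology.get(c, []))
    return seen

def retrieve(concept):
    """Find clips tagged with the concept or any narrower concept,
    so a query for 'explosion' also matches 'fireball' footage."""
    wanted = expand(concept)
    return sorted(clip for clip, tags in index.items() if tags & wanted)

hits = retrieve("explosion")
```

This is what distinguishes semantic retrieval from keyword matching: the query term never appears in the matching clips' tags, yet they are found through the ontology.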
Abstract:
Knowledge-elicitation is a common technique used to produce rules about the operation of a plant from the knowledge that is available from human expertise. Similarly, data-mining is becoming a popular technique to extract rules from the data available from the operation of a plant. In the work reported here knowledge was required to enable the supervisory control of an aluminium hot strip mill by the determination of mill set-points. A method was developed to fuse knowledge-elicitation and data-mining to incorporate the best aspects of each technique, whilst avoiding known problems. Utilisation of the knowledge was through an expert system, which determined schedules of set-points and provided information to human operators. The results show that the method proposed in this paper was effective in producing rules for the on-line control of a complex industrial process. (C) 2005 Elsevier Ltd. All rights reserved.
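The fusion of knowledge-elicitation and data-mining described above can be illustrated with a toy example. The rules, thresholds and plant records below are invented, and the "mining" step is deliberately simplistic; the sketch only shows one plausible fusion policy (elicited knowledge wins on conflict, mining fills the gaps):

```python
# Hand-elicited expert rules: name -> (condition, set-point action).
elicited = {
    "temp_high": ("temperature > 950", "reduce_speed"),
}

def mine_rules(records):
    """Toy 'data-mining': propose a rule when an input range is
    consistently associated with one operator action in the records."""
    mined = {}
    hot = [r for r in records if r["temperature"] > 950]
    if hot and all(r["action"] == "reduce_speed" for r in hot):
        mined["temp_high"] = ("temperature > 950", "reduce_speed")
    cold = [r for r in records if r["temperature"] <= 950]
    if cold and all(r["action"] == "increase_gauge" for r in cold):
        mined["temp_low"] = ("temperature <= 950", "increase_gauge")
    return mined

def fuse(elicited, mined):
    """Fusion policy: keep every mined rule, but let an elicited rule
    override a mined rule with the same name on conflict."""
    fused = dict(mined)
    fused.update(elicited)
    return fused

records = [
    {"temperature": 980, "action": "reduce_speed"},
    {"temperature": 900, "action": "increase_gauge"},
]
rulebase = fuse(elicited, mine_rules(records))
```

The resulting rulebase is what an expert system would then consult to determine schedules of set-points.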
Abstract:
Most current education organizations use books and CDs as their main media, which makes knowledge updating between education resource providers and users slow. The rapid development of the Internet has brought with it the possibility of improving the resource purveying mechanisms. We therefore designed an agent-based system to purvey education resources from the resource centre to schools through the Internet. Agent technology helps to improve system performance and flexibility. This paper describes the design of our system, details the functions of its main parts, shows the communication methods between agents and finally evaluates the system through experiments.
Abstract:
The associative sequence learning model proposes that the development of the mirror system depends on the same mechanisms of associative learning that mediate Pavlovian and instrumental conditioning. To test this model, two experiments used the reduction of automatic imitation through incompatible sensorimotor training to assess whether mirror system plasticity is sensitive to contingency (i.e., the extent to which activation of one representation predicts activation of another). In Experiment 1, residual automatic imitation was measured following incompatible training in which the action stimulus was a perfect predictor of the response (contingent) or not at all predictive of the response (noncontingent). A contingency effect was observed: There was less automatic imitation indicative of more learning in the contingent group. Experiment 2 replicated this contingency effect and showed that, as predicted by associative learning theory, it can be abolished by signaling trials in which the response occurs in the absence of an action stimulus. These findings support the view that mirror system development depends on associative learning and indicate that this learning is not purely Hebbian. If this is correct, associative learning theory could be used to explain, predict, and intervene in mirror system development.
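The contingency manipulated in Experiment 1 is conventionally summarised as Delta-P: the probability of the response given the stimulus minus its probability in the stimulus's absence. The trial counts below are illustrative, not the experiment's data:

```python
def delta_p(resp_with_stim, stim_trials, resp_without_stim, no_stim_trials):
    """Delta-P contingency:
    P(response | stimulus) - P(response | no stimulus)."""
    return resp_with_stim / stim_trials - resp_without_stim / no_stim_trials

# contingent condition: the action stimulus perfectly predicts the response
contingent = delta_p(40, 40, 0, 40)

# noncontingent condition: the response is equally likely either way,
# so the stimulus carries no predictive information
noncontingent = delta_p(20, 40, 20, 40)
```

On the associative account tested in the paper, learning (and hence the reduction of automatic imitation) should track Delta-P rather than mere stimulus-response pairing frequency, which is what the contingency effect shows.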
Abstract:
Knowledge about the functional status of the frontal cortex in infancy is limited. This study investigated the effects of polymorphisms in four dopamine system genes on performance in a task developed to assess such functioning, the Freeze-Frame task, at 9 months of age. Polymorphisms in the catechol-O-methyltransferase (COMT) and the dopamine D4 receptor (DRD4) genes are likely to impact directly on the functioning of the frontal cortex, whereas polymorphisms in the dopamine D2 receptor (DRD2) and dopamine transporter (DAT1) genes might influence frontal cortex functioning indirectly via strong frontostriatal connections. A significant effect of the COMT valine158methionine (Val158Met) polymorphism was found. Infants with the Met/Met genotype were significantly less distractible than infants with the Val/Val genotype in Freeze-Frame trials presenting an engaging central stimulus. In addition, there was an interaction with the DAT1 3′ variable number of tandem repeats polymorphism; the COMT effect was present only in infants who did not have two copies of the DAT1 10-repeat allele. These findings indicate that dopaminergic polymorphisms affect selective aspects of attention as early as infancy and further validate the Freeze-Frame task as a frontal cortex task.
Abstract:
A novel partitioned least squares (PLS) algorithm is presented, in which estimates from several simple system models are combined by means of a Bayesian methodology of pooling partial knowledge. The method has the added advantage that, when the simple models are of a similar structure, it lends itself directly to parallel processing procedures, thereby speeding up the entire parameter estimation process by several factors.
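One standard way to realise the pooling step this abstract describes is precision (inverse-variance) weighting, which gives the Bayesian posterior mean when the sub-model estimates are independent and Gaussian. This is a sketch of that generic idea, not the paper's exact PLS algorithm; note that each sub-model's estimate could be computed in parallel before the cheap pooling step:

```python
def pool(estimates, variances):
    """Precision-weighted combination of independent estimates of the
    same parameter. Returns the pooled estimate and its pooled
    variance, which is smaller than any individual variance."""
    precisions = [1.0 / v for v in variances]
    total = sum(precisions)
    pooled = sum(p * e for p, e in zip(precisions, estimates)) / total
    return pooled, 1.0 / total

# three simple sub-models' estimates of one parameter (invented numbers)
est, var = pool([2.0, 2.4, 1.8], [0.5, 1.0, 0.5])
```

More precise sub-models pull the pooled value harder, and the pooled variance 1/sum(1/sigma_i^2) quantifies the gain from combining partial knowledge.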