55 results for Knowledge-based industry


Relevance:

90.00%

Abstract:

There is a growing awareness in the UK and mainland Europe of the importance of higher education to the development of a knowledge-based economy. European universities are increasingly required to produce highly mobile graduates able to respond to the ever-changing needs of the contemporary workplace. Following the Bologna Declaration ('The European Higher Education Area', Joint Declaration of the European Ministers of Education, Bologna, 19 June 1999), higher education across Europe has expanded rapidly. This has raised questions about the quality of the graduate labour market and the ability of graduates to meet the needs of employers. This paper analyses graduate and employer perspectives on graduate employability in four European countries (the UK, Austria, Slovenia and Romania). In doing so it adds to current debates in this area.

Relevance:

90.00%

Abstract:

With the buzzwords 'knowledge-based economy' and 'knowledge-driven economy', policy-makers, as well as journalists and management consultants, are pushing forward a vision of change that transforms the way advanced economies work. Yet little is understood about how the knowledge-based economy differs from the old, traditional economy. It is generally agreed that the phenomenon has grown out of the branch of economic thought known as new growth theory. Digesting up-to-date thinking in economics, management, innovation studies and economic geography, this significant volume provides an account of these developments and how they have transformed advanced economies.

Relevance:

90.00%

Abstract:

Quality, production and technological innovation management rank among the most important matters of concern to modern manufacturing organisations. They can provide companies with the decisive means of gaining a competitive advantage, especially within industries where there is an increasing similarity in product design and manufacturing processes. The papers in this special issue of International Journal of Technology Management have all been selected as examples of how aspects of quality, production and technological innovation can help to improve competitive performance. Most are based on presentations made at the UK Operations Management Association's Sixth International Conference held at Aston University at which the theme was 'Getting Ahead Through Technology and People'. At the conference itself over 80 papers were presented by authors from 15 countries around the world. Among the many topics addressed within the conference theme, technological innovation, quality and production management emerged as attracting the greatest concern and interest of delegates, particularly those from industry. For any new initiative to be implemented successfully, it should be led from the top of the organization. Achieving the desired level of commitment from top management can, however, be difficult. In the first paper of this issue, Mackness investigates this question by explaining how systems thinking can help. In the systems approach, properties such as 'emergence', 'hierarchy', 'communication' and 'control' are used to assist top managers in preparing for change. Mackness's paper is then complemented by Iijima and Hasegawa's contribution in which they investigate the development of Quality Information Management (QIM) in Japan. They present the idea of a Design Review and demonstrate how it can be used to trace and reduce quality-related losses. The next paper on the subject of quality is by Whittle and colleagues. It relates to total quality and the process of culture change within organisations. Using the findings of investigations carried out in a number of case study companies, they describe four generic models which have been identified as characterising methods of implementing total quality within existing organisation cultures. Boaden and Dale's paper also relates to the management of quality, but looks specifically at the construction industry, where it has been found that there is still some confusion over the role of Quality Assurance (QA) and Total Quality Management (TQM). They describe the results of a questionnaire survey of forty companies in the industry and compare them to similar work carried out in other industries. Szakonyi's contribution then completes this group of papers which all relate specifically to the question of quality. His concern is with the two ways in which R&D or engineering managers can work on improving quality. The first is by improving it in the laboratory, while the second is by working with other functions to improve quality in the company. The next group of papers in this issue all address aspects of production management. Umeda's paper proposes a new manufacturing-oriented simulation package for production management which provides important information for both design and operation of manufacturing systems. A simulation for production strategy in a Computer Integrated Manufacturing (CIM) environment is also discussed. 
This paper is then followed by a contribution by Tanaka and colleagues in which they consider loading schedules for manufacturing orders in a Material Requirements Planning (MRP) environment. They compare mathematical programming with a knowledge-based approach, and comment on their relative effectiveness for different practical situations. Engstrom and Medbo's paper then looks at a particular aspect of production system design, namely the question of devising group working arrangements for assembly with new product structures. Using the case of a Swedish vehicle assembly plant where long cycle assembly work has been adopted, they advocate the use of a generally applicable product structure which can be adapted to suit individual local conditions. In the last paper of this particular group, Tay considers how automation has affected production efficiency in Singapore. Using data from ten major industries he identifies several factors which are positively correlated with efficiency, with capital intensity being of greatest interest to policy makers. The two following papers examine the case of electronic data interchange (EDI) as a means of improving the efficiency and quality of trading relationships. Banerjee and Banerjee consider a particular approach to material provisioning for production systems using orderless inventory replenishment. Using the example of a single supplier and multiple buyers they develop an analytical model which is applicable for the exchange of information between trading partners using EDI. They conclude that EDI-based inventory control can be attractive from economic as well as other standpoints and that the approach is consistent with and can be instrumental in moving towards just-in-time (JIT) inventory management. Slacker's complementary viewpoint on EDI is from the perspective of the quality relationship between the customer and supplier. Based on the experience of Lucas, a supplier within the automotive industry, he concludes that both banks and trading companies must take responsibility for the development of payment mechanisms which satisfy the requirements of quality trading. The three final papers of this issue relate to technological innovation and are all country based. Berman and Khalil report on a survey of US technological effectiveness in the global economy. The importance of education is supported in their conclusions, although it remains unclear to what extent the US government can play a wider role in promoting technological innovation and new industries. The role of technology in national development is taken up by Martinsons and Valdemars who examine the case of the former Soviet Union. The failure to successfully infuse technology into Soviet enterprises is seen as a factor in that country's demise, and it is anticipated that the newly liberalised economies will be able to encourage greater technological creativity. This point is then taken up in Perminov's concluding paper, which looks in detail at Russia. Here a similar analysis is made of the Soviet Union's technological decline, but a development strategy is also presented within the context of the change from a centralised to a free market economy. The papers included in this special issue of the International Journal of Technology Management each represent a unique and particular contribution to their own specific area of concern. 
Together, however, they also demonstrate the general improvements in competitive performance that can be achieved through the application of modern principles and practice to the management of quality, production and technological innovation.

Relevance:

90.00%

Abstract:

Background - Modelling the interaction between potentially antigenic peptides and Major Histocompatibility Complex (MHC) molecules is a key step in identifying potential T-cell epitopes. For Class II MHC alleles, the binding groove is open at both ends, causing ambiguity in the positional alignment between the groove and peptide, as well as creating uncertainty as to what parts of the peptide interact with the MHC. Moreover, the antigenic peptides have variable lengths, making naive modelling methods difficult to apply. This paper introduces a kernel method that can handle variable length peptides effectively by quantifying similarities between peptide sequences and integrating these into the kernel. Results - The kernel approach presented here shows increased prediction accuracy with a significantly higher number of true positives and negatives on multiple MHC class II alleles, when testing data sets from MHCPEP [1], MHCBN [2], and MHCBench [3]. Evaluation by cross validation, when segregating binders and non-binders, produced an average of 0.824 AROC for the MHCBench data sets (up from 0.756), and an average of 0.96 AROC for multiple alleles of the MHCPEP database. Conclusion - The method improves performance over existing state-of-the-art methods of MHC class II peptide binding prediction by using a custom, knowledge-based representation of peptides. Similarity scores, in contrast to a fixed-length, pocket-specific representation of amino acids, provide a flexible and powerful way of modelling MHC binding, and can easily be applied to other dynamic sequence problems.
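The core idea of a similarity-based kernel over variable-length sequences can be illustrated compactly. The sketch below is not the authors' actual kernel: it scores two peptides of different lengths by averaging pairwise residue similarities (a real implementation would substitute scores from a substitution matrix such as BLOSUM62 for the toy function here) and then normalises so that k(p, p) = 1.

```python
def aa_similarity(a: str, b: str) -> float:
    """Toy residue similarity: 1.0 for an exact match, 0.1 otherwise.
    A real implementation would look up a substitution matrix here."""
    return 1.0 if a == b else 0.1

def peptide_kernel(p: str, q: str) -> float:
    """Average all pairwise residue similarities between two peptides;
    averaging makes peptides of different lengths directly comparable."""
    total = sum(aa_similarity(a, b) for a in p for b in q)
    return total / (len(p) * len(q))

def normalised_kernel(p: str, q: str) -> float:
    """Normalise so that k(p, p) == 1, as is conventional for kernels."""
    return peptide_kernel(p, q) / (
        peptide_kernel(p, p) * peptide_kernel(q, q)) ** 0.5

# Two peptides of different lengths can be compared directly.
print(normalised_kernel("PKYVKQNTLKLAT", "GELIGILNAAKVPAD"))
```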

Relevance:

90.00%

Abstract:

BACKGROUND: Tobacco industry interference has been identified as the greatest obstacle to the implementation of evidence-based measures to reduce tobacco use. Understanding and addressing industry interference in public health policy-making is therefore crucial. Existing conceptualisations of corporate political activity (CPA) are embedded in a business perspective and do not attend to CPA's social and public health costs; most have not drawn on the unique resource represented by internal tobacco industry documents. Building on this literature, including systematic reviews, we develop a critically informed conceptual model of tobacco industry political activity. METHODS AND FINDINGS: We thematically analysed published papers included in two systematic reviews examining tobacco industry influence on taxation and marketing of tobacco; we included 45 of 46 papers in the former category and 20 of 48 papers in the latter (n = 65). We used a grounded theory approach to build taxonomies of "discursive" (argument-based) and "instrumental" (action-based) industry strategies and from these devised the Policy Dystopia Model, which shows that the industry, working through different constituencies, constructs a metanarrative to argue that proposed policies will lead to a dysfunctional future of policy failure and widely dispersed adverse social and economic consequences. Simultaneously, it uses diverse, interlocking insider and outsider instrumental strategies to disseminate this narrative and enhance its persuasiveness in order to secure its preferred policy outcomes. Limitations are that many papers were historical (some dating back to the 1970s) and focused on high-income regions. CONCLUSIONS: The model provides an evidence-based, accessible way of understanding diverse corporate political strategies. It should enable public health actors and officials to preempt these strategies and develop realistic assessments of the industry's claims.

Relevance:

80.00%

Abstract:

Machine breakdowns are one of the main sources of disruption and throughput fluctuation in highly automated production facilities. One element in reducing this disruption is ensuring that the maintenance team responds correctly to machine failures. It is, however, difficult to determine the current practice employed by the maintenance team, let alone suggest improvements to it. 'Knowledge based improvement' is a methodology that aims to address this issue, by (a) eliciting knowledge on current practice, (b) evaluating that practice and (c) looking for improvements. The methodology, based on visual interactive simulation and artificial intelligence methods, and its application to a Ford engine assembly facility are described. Copyright © 2002 Society of Automotive Engineers, Inc.

Relevance:

80.00%

Abstract:

Modelling human interaction and decision-making within a simulation presents a particular challenge. This paper describes a methodology under development known as 'knowledge based improvement'. The purpose of this methodology is to elicit decision-making strategies via a simulation model and to represent them using artificial intelligence techniques. Further to this, having identified an individual's decision-making strategy, the methodology aims to look for improvements in decision-making. The methodology is being tested on unplanned maintenance operations at a Ford engine assembly plant.
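As a rough illustration of the "represent strategies using artificial intelligence techniques" step, the sketch below fits a decision tree to choices logged while an expert drove a simulated maintenance scenario. The feature names and data are invented for illustration, and the paper does not prescribe this particular learner.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Rows are situations logged from simulation runs: queue length at the
# failed machine, machine age (years), and a failure code; the label is
# the repair action the expert chose in that situation (invented data).
X = [[5, 2, 0], [1, 8, 1], [7, 1, 0], [2, 9, 1], [6, 3, 0], [1, 7, 1]]
y = ["quick_fix", "full_service", "quick_fix",
     "full_service", "quick_fix", "full_service"]

# The induced tree is a readable approximation of the expert's strategy.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(
    tree, feature_names=["queue_length", "machine_age", "failure_code"]))
```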

Relevance:

80.00%

Abstract:

Original paper: European Journal of Information Systems (2001) 10, 135–146; doi:10.1057/palgrave.ejis.3000394. 'Organisational learning—a critical systems thinking discipline', P. Panagiotidis (Deloitte and Touche, Athens, Greece) and J. S. Edwards (Aston Business School, Aston University, Birmingham, UK). Abstract: This paper deals with the application of critical systems thinking in the domain of organisational learning and knowledge management. Its viewpoint is that deep organisational learning only takes place when the business systems' stakeholders reflect on their actions and thus inquire about their purpose(s) in relation to the business system and the other stakeholders they perceive to exist. This is done by reflecting both on the sources of motivation and/or deception that are contained in their own purpose, and also on the sources of collective motivation and/or deception that are contained in the business system's purpose. The development of an organisational information system that captures, manages and institutionalises meaningful information—a knowledge management system—cannot be separated from organisational learning practices, since it should be the result of these very practices. Although Senge's five disciplines provide a useful starting-point in looking at organisational learning, we argue for a critical systems approach, instead of an uncritical System Dynamics one that concentrates only on the organisational learning practices. 
We proceed to outline a methodology called Business Systems Purpose Analysis (BSPA) that offers a participatory structure for team and organisational learning, upon which the stakeholders can take legitimate action that is based on the force of the better argument. In addition, the organisational learning process in BSPA leads to the development of an intrinsically motivated organisational information system that allows for the institutionalisation of the learning process itself in the form of an organisational knowledge management system. This could be a specific application, or something as wide-ranging as an Enterprise Resource Planning (ERP) implementation. Examples of the use of BSPA in two ERP implementations are presented.

Relevance:

80.00%

Abstract:

This second issue of Knowledge Management Research & Practice (KMRP) continues the international nature of the first issue, with papers from authors based on four different continents. There are five regular papers, plus the first of what is intended to be an occasional series of 'position papers' from respected figures in the knowledge management field, who have specific issues they wish to raise from a personal standpoint. The first two regular papers are both based on case studies. The first is 'Aggressively pursuing knowledge management over two years: a case study at a US government organization' by Jay Liebowitz. Liebowitz is well known to both academics and practitioners as an author on knowledge management and knowledge based systems. Government departments in many Western countries must soon face up to the problems that will occur as the 'baby boomer' generation reaches retirement age over the next decade. This paper describes how one particular US government organization has attempted to address this situation (and others) through the introduction of a knowledge management initiative. The second case study paper is 'Knowledge creation through the synthesizing capability of networked strategic communities: case study on new product development in Japan' by Mitsuru Kodama. This paper looks at the importance of strategic communities - communities that have strategic relevance and support - in knowledge management. Here, the case study organization is Nippon Telegraph and Telephone Corporation (NTT), a Japanese telecommunication firm. The third paper is 'Knowledge management and intellectual capital: an empirical examination of current practice in Australia' by Albert Zhou and Dieter Fink. This paper reports the results of a survey carried out in 2001, exploring the practices relating to knowledge management and intellectual capital in Australia and the relationship between them. The remaining two regular papers are conceptual in nature. The fourth is 'The enterprise knowledge dictionary' by Stuart Galup, Ronald Dattero and Richard Hicks. Galup, Dattero and Hicks propose the concept of an enterprise knowledge dictionary and its associated knowledge management system architecture as offering the appropriate form of information technology to support various different types of knowledge sources, while behaving as a single source from the user's viewpoint. The fifth and final regular paper is 'Community of practice and metacapabilities' by Geri Furlong and Leslie Johnson. This paper looks at the role of communities of practice in learning in organizations. Its emphasis is on metacapabilities - the properties required to learn, develop and apply skills. This discussion takes work on learning and core competences to a higher level. Finally, this issue includes a position paper 'Innovation as an objective of knowledge management. Part I: the landscape of management' by Dave Snowden. Snowden has been highly visible in the knowledge management community thanks to his role as the Director of IBM Global Services' Canolfan Cynefin Centre. He has helped many government and private sector organizations to consider their knowledge management problems and strategies. This, the first part of a two-part paper, is inspired by the notion of complexity. In it, Snowden calls for what he sees as a 20th century emphasis on designed systems for knowledge management to be consigned to history, and replaced by a 21st century emphasis on emergence. 
Letters to the editor on this, or any other topic related to knowledge management research and practice, are welcome. We trust that you will find the contributions stimulating, and again invite you to contribute your own paper(s) to future issues of KMRP.

Relevance:

80.00%

Abstract:

Recent discussion of the knowledge-based economy draws increasing attention to the role that the creation and management of knowledge play in economic development. Development of human capital, the principal mechanism for knowledge creation and management, becomes a central issue for policy-makers and practitioners at the regional, as well as national, level. Facing competition both within and across nations, regional policy-makers view human capital development as a key to strengthening the positions of their economies in the global market. Against this background, the aim of this study is to go some way towards answering the question of whether, and how, investment in education and vocational training at regional level provides these territorial units with comparative advantages. The study reviews literature in economics and economic geography on economic growth (Chapter 2). In the growth-model literature, human capital has gained increased recognition as a key production factor alongside physical capital and labour. Although they leave technical progress as an exogenous factor, neoclassical Solow-Swan models have improved their estimates through the inclusion of human capital. In contrast, endogenous growth models place investment in research at centre stage in accounting for technical progress. As a result, they often focus upon research workers, who embody high-order human capital, as a key variable in their framework. One issue under discussion is how human capital facilitates economic growth: is it the level of its stock or its accumulation that influences the rate of growth? In addition, these economic models are criticised in the economic geography literature for their failure to consider spatial aspects of economic development, and particularly for their lack of attention to tacit knowledge and the urban environments that facilitate the exchange of such knowledge. Our empirical analysis of European regions (Chapter 3) shows that investment by individuals in human capital formation has distinct patterns. Regions with a higher level of investment in tertiary education tend to have a larger concentration of information and communication technology (ICT) sectors (including provision of ICT services and manufacture of ICT devices and equipment) and research functions. Not surprisingly, regions with major metropolitan areas where higher education institutions are located show a high enrolment rate for tertiary education, suggesting a possible link to the demand from high-order corporate functions located there. Furthermore, the rate of human capital development (at the level of vocational upper secondary education) appears to be significantly associated with the level of entrepreneurship in emerging industries such as ICT-related services and ICT manufacturing, whereas no such association is found with traditional manufacturing industries. In general, a high level of investment by individuals in tertiary education is found in those regions that accommodate high-tech industries and high-order corporate functions such as research and development (R&D). These functions are supported by the urban infrastructure and public science base, facilitating the exchange of tacit knowledge, and such regions also enjoy a low unemployment rate. However, the existing stock of human and physical capital in regions with a high level of urban infrastructure does not lead to a high rate of economic growth. 
Our empirical analysis demonstrates that the rate of economic growth is determined by the accumulation of human and physical capital, not by the level of their existing stocks. We found no significant effects of scale that would favour those regions with a larger stock of human capital. The primary policy implication of our study is that, in order to facilitate economic growth, education and training need to supply human capital at a faster pace than simply replenishing it as it disappears from the labour market. Given the significant impact of high-order human capital (such as business R&D staff in our case study), as well as the increasingly fast pace of technological change that makes human capital obsolete, a concerted effort needs to be made to facilitate its continuous development.
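For reference, the human-capital-augmented Solow-Swan specification this kind of study reviews is usually written, following Mankiw, Romer and Weil, roughly as:

$$ Y(t) = K(t)^{\alpha}\, H(t)^{\beta}\, \bigl(A(t)L(t)\bigr)^{1-\alpha-\beta}, \qquad \alpha + \beta < 1 $$

where $K$ is physical capital, $H$ is human capital and $AL$ is effective labour. In this framework growth along the transition path is driven by the rates at which $K$ and $H$ are accumulated rather than by their existing levels alone, which is consistent with the finding reported above.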

Relevance:

80.00%

Abstract:

In present-day knowledge societies political decisions are often justified on the basis of scientific expertise. Traditionally, a linear relation between knowledge production and application was postulated, which would lead, with more and better science, to better policies. Empirical studies in Science and Technology Studies (STS) have essentially demolished this idea. However, it is still powerful, not least among practitioners working in fields where decision making is based on large doses of expert knowledge. Based on conceptual work in STS, I shall examine two cases of global environmental governance: ozone layer protection and global climate change. I will argue that hybridization and purification are important for two major forms of scientific expertise. One is delivered through scientific advocacy (by individual scientists or groups of scientists), the other through expert committees, i.e. institutionalized forms of collecting and communicating expertise to decision makers. Based on this analysis, lessons will be drawn, also with regard to the stalled efforts at establishing an international forestry regime.

Relevance:

80.00%

Abstract:

In practical terms, any result obtained using an ordered weighted averaging (OWA) operator depends heavily upon the method used to determine the weighting vector. Several approaches for obtaining the associated weights have been suggested in the literature, none of which takes into account the preferences of alternatives. This paper presents a method for determining the OWA weights when the preferences of alternatives across all the criteria are considered. An example is given to illustrate this method, and an application to an internet search engine shows the use of this new OWA operator.
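For readers unfamiliar with OWA, the aggregation step that the weighting vector feeds into is simple: sort the arguments in descending order, then take a weighted sum with the fixed weight vector. A minimal sketch with illustrative weights (not those produced by the proposed method):

```python
def owa(values, weights):
    """Ordered weighted averaging: sort the arguments in descending
    order, then take the weighted sum with the fixed weight vector."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Example: three criteria scores for one alternative.
scores = [0.6, 0.9, 0.3]
weights = [0.5, 0.3, 0.2]      # front-loaded weights emphasise high scores
print(owa(scores, weights))    # 0.5*0.9 + 0.3*0.6 + 0.2*0.3 = 0.69
```

Note that the weights attach to sorted positions, not to particular criteria; this positional attachment is exactly why the choice of weighting vector dominates the result.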

Relevance:

80.00%

Abstract:

Tonal, textural and contextual properties are used in manual photointerpretation of remotely sensed data. This study used these three attributes to produce a lithological map of semi-arid northwest Argentina by semi-automatic computer classification of remotely sensed data. Three types of satellite data were investigated: LANDSAT MSS, TM and SIR-A imagery. Supervised classification using tonal features alone produced poor results. LANDSAT MSS produced classification accuracies in the range of 40 to 60%, while accuracies of 50 to 70% were achieved using LANDSAT TM data. The addition of SIR-A data increased classification accuracy. The improved accuracy of TM over MSS is due to the better discrimination of geological materials afforded by the middle infrared bands of the TM sensor. The maximum likelihood classifier consistently produced classification accuracies 10 to 15% higher than either the minimum-distance-to-means or the decision tree classifier; this improved accuracy was obtained at the cost of greatly increased processing time. A new type of classifier, the spectral shape classifier, which is computationally as fast as a minimum-distance-to-means classifier, is described. However, the results for this classifier were disappointing, being lower in most cases than those of the minimum distance or decision tree procedures. Because the classification results using only tonal features were unacceptably poor, textural attributes were investigated. Texture is an important attribute used by photogeologists to discriminate lithology. For TM data, texture measures were found to increase classification accuracy by up to 15%, with second-order texture, especially the SGLDM-based measures, producing the highest accuracy. For LANDSAT MSS data, however, texture measures provided no significant increase in accuracy. Contextual post-processing was found to increase classification accuracy and improve the visual appearance of classified output by removing isolated misclassified pixels which tend to clutter classified images. Simple contextual features, such as mode filters, were found to outperform more complex features such as gravitational filters or minimal area replacement methods; generally, the larger the filter, the greater the increase in accuracy. Production rules were used to build a knowledge-based system which used tonal and textural features to identify sedimentary lithologies in each of the two test sites. The knowledge-based system was able to identify six out of ten lithologies correctly.
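The mode filter that the study found effective as contextual post-processing is straightforward to implement. A pure-Python sketch, with the window size as the free parameter (the study reports larger windows generally giving larger accuracy gains):

```python
from collections import Counter

def mode_filter(classified, size=3):
    """Apply a size x size majority filter to a 2-D grid of class labels,
    relabelling each pixel with the most common class in its window and
    thereby removing isolated misclassified pixels."""
    h, w, r = len(classified), len(classified[0]), size // 2
    out = [row[:] for row in classified]
    for i in range(h):
        for j in range(w):
            window = [classified[y][x]
                      for y in range(max(0, i - r), min(h, i + r + 1))
                      for x in range(max(0, j - r), min(w, j + r + 1))]
            out[i][j] = Counter(window).most_common(1)[0][0]
    return out

# A lone 'B' pixel inside an 'A' region is relabelled.
image = [["A", "A", "A"],
         ["A", "B", "A"],
         ["A", "A", "A"]]
print(mode_filter(image))
```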

Relevance:

80.00%

Abstract:

This study was concerned with the computer automation of land evaluation. This is a broad subject with many issues to be resolved, so the study concentrated on three key problems: knowledge-based programming; the integration of spatial information from remote sensing and other sources; and the inclusion of socio-economic information in the land evaluation analysis. Land evaluation and land use planning were considered in the context of overseas projects in the developing world. Knowledge-based systems were found to provide significant advantages over conventional programming techniques for some aspects of the land evaluation process. Declarative languages, in particular Prolog, were ideally suited to the integration of social information, which changes with every situation. Rule-based expert system shells were also found to be suitable for this role, including knowledge acquisition at the interview stage. All the expert system shells examined suffered from severe constraints on problem size, but new products now overcome this. Inductive expert system shells were useful as a guide to knowledge gaps and possible relationships, but the number of examples required was unrealistic for typical land use planning situations. The accuracy of classified satellite imagery was significantly enhanced by integrating spatial information on soil distribution for the Thailand data; estimates of the rice-producing area were substantially improved (a 30% change in area) by the addition of soil information. Image processing work on Mozambique showed that satellite remote sensing was a useful tool for stratifying vegetation cover at provincial level to identify key development areas, but its full utility could not be realised on typical planning projects without treatment as part of a complete spatial information system.
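To give a flavour of the declarative, rule-based style the thesis advocates (there expressed in Prolog; rendered here in Python for illustration, with invented rule thresholds):

```python
# Toy land-evaluation rule: each clause mirrors one declarative
# condition of the kind a Prolog rule base would state. The soil
# classes and thresholds below are illustrative assumptions only.

def suitable_for_rice(site):
    """A site is rule-suitable for rice if the soil holds water,
    rainfall is adequate and the terrain is close to level."""
    return (site["soil"] in {"clay", "clay_loam"}
            and site["annual_rainfall_mm"] >= 1000
            and site["slope_percent"] <= 5)

sites = [
    {"name": "A", "soil": "clay", "annual_rainfall_mm": 1400, "slope_percent": 2},
    {"name": "B", "soil": "sand", "annual_rainfall_mm": 900,  "slope_percent": 8},
]
for s in sites:
    print(s["name"], "suitable" if suitable_for_rice(s) else "not suitable")
```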

Relevance:

80.00%

Abstract:

The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in estimating these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using the existing ones. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while that plan is being produced. If it were the former, then SCE tools would be particularly useful, since very little other data is available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By re-defining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
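As an illustration of what re-defining a 'classic' size metric for a KBS language involves, the sketch below counts non-comment source lines and distinct predicate names in a Prolog source string. The counting rules here are simplified assumptions for illustration, not the thesis's exact definitions.

```python
import re

def prolog_size_metrics(source: str):
    """Toy Prolog size metrics: non-comment lines of code and the
    number of distinct predicates defined (clause heads)."""
    lines = [ln.strip() for ln in source.splitlines()]
    code = [ln for ln in lines if ln and not ln.startswith("%")]
    # Treat a leading lowercase atom followed by '(' or ':-' as a clause head.
    heads = {m.group(1) for ln in code
             if (m := re.match(r"([a-z]\w*)\s*(?:\(|:-)", ln))}
    return {"loc": len(code), "predicates": len(heads)}

example = """% toy program
parent(tom, bob).
parent(bob, ann).
grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
"""
print(prolog_size_metrics(example))  # {'loc': 3, 'predicates': 2}
```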