106 results for Knowledge-Based Manufacturing
Abstract:
This second issue of Knowledge Management Research & Practice (KMRP) continues the international nature of the first issue, with papers from authors based in four different continents. There are five regular papers, plus the first of what is intended to be an occasional series of 'position papers' from respected figures in the knowledge management field, who have specific issues they wish to raise from a personal standpoint. The first two regular papers are both based on case studies. The first is 'Aggressively pursuing knowledge management over two years: a case study at a US government organization' by Jay Liebowitz. Liebowitz is well known to both academics and practitioners as an author on knowledge management and knowledge-based systems. Government departments in many Western countries must soon face up to the problems that will occur as the 'baby boomer' generation reaches retirement age over the next decade. This paper describes how one particular US government organization has attempted to address this situation (and others) through the introduction of a knowledge management initiative. The second case study paper is 'Knowledge creation through the synthesizing capability of networked strategic communities: case study on new product development in Japan' by Mitsuru Kodama. This paper looks at the importance of strategic communities - communities that have strategic relevance and support - in knowledge management. Here, the case study organization is Nippon Telegraph and Telephone Corporation (NTT), a Japanese telecommunications firm. The third paper is 'Knowledge management and intellectual capital: an empirical examination of current practice in Australia' by Albert Zhou and Dieter Fink. This paper reports the results of a survey carried out in 2001, exploring the practices relating to knowledge management and intellectual capital in Australia and the relationship between them. The remaining two regular papers are conceptual in nature. The fourth is 'The enterprise knowledge dictionary' by Stuart Galup, Ronald Dattero and Richard Hicks. Galup, Dattero and Hicks propose the concept of an enterprise knowledge dictionary and its associated knowledge management system architecture as offering the appropriate form of information technology to support various different types of knowledge sources, while behaving as a single source from the user's viewpoint. The fifth and final regular paper is 'Community of practice and metacapabilities' by Geri Furlong and Leslie Johnson. This paper looks at the role of communities of practice in learning in organizations. Its emphasis is on metacapabilities - the properties required to learn, develop and apply skills. This discussion takes work on learning and core competences to a higher level. Finally, this issue includes a position paper, 'Innovation as an objective of knowledge management. Part I: the landscape of management' by Dave Snowden. Snowden has been highly visible in the knowledge management community thanks to his role as the Director of IBM Global Services' Canolfan Cynefin Centre. He has helped many government and private sector organizations to consider their knowledge management problems and strategies. This, the first of a two-part paper, is inspired by the notion of complexity. In it, Snowden calls for what he sees as a 20th century emphasis on designed systems for knowledge management to be consigned to history, and replaced by a 21st century emphasis on emergence.
Letters to the editor on this, or any other topic related to knowledge management research and practice, are welcome. We trust that you will find the contributions stimulating, and again invite you to contribute your own paper(s) to future issues of KMRP.
Abstract:
In present-day knowledge societies, political decisions are often justified on the basis of scientific expertise. Traditionally, a linear relation between knowledge production and application was postulated, which would lead, with more and better science, to better policies. Empirical studies in Science and Technology Studies (STS) have essentially demolished this idea. However, it is still powerful, not least among practitioners working in fields where decision making is based on large doses of expert knowledge. Based on conceptual work in STS, I shall examine two cases of global environmental governance: ozone layer protection and global climate change. I will argue that hybridization and purification are important for two major forms of scientific expertise. One is delivered through scientific advocacy (by individual scientists or groups of scientists), the other through expert committees, i.e. institutionalized forms of collecting and communicating expertise to decision makers. Based on this analysis, lessons will be drawn, also with regard to the stalled efforts at establishing an international forestry regime.
Abstract:
In practical terms, any result obtained using an ordered weighted averaging (OWA) operator depends heavily upon the method used to determine the weighting vector. Several approaches for obtaining the associated weights have been suggested in the literature, none of which takes into account the preferences of the alternatives. This paper presents a method for determining the OWA weights when the preferences of alternatives across all the criteria are considered. An example is given to illustrate this method, and an application to an internet search engine shows the use of this new OWA operator.
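To make concrete how strongly an OWA result depends on the weighting vector, here is a minimal sketch of plain OWA aggregation in Python; the scores and weighting vectors are hypothetical, and this is not the preference-based weighting method the paper proposes.

```python
# Minimal OWA aggregation sketch (illustrative only; the weighting vectors are
# hypothetical, not the preference-based weights proposed in the paper).

def owa(scores, weights):
    """Ordered weighted averaging: the weights are applied to the scores
    after sorting the scores in descending order."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    ordered = sorted(scores, reverse=True)
    return sum(w * b for w, b in zip(weights, ordered))

scores = [0.7, 0.4, 0.9]                 # criterion satisfaction values for one alternative
print(owa(scores, [1.0, 0.0, 0.0]))      # 0.9   -> behaves like max ("optimistic")
print(owa(scores, [0.0, 0.0, 1.0]))      # 0.4   -> behaves like min ("pessimistic")
print(owa(scores, [1/3, 1/3, 1/3]))      # ~0.67 -> plain arithmetic mean
```

The same scores yield three different aggregate values, which is the sensitivity to the weighting vector that the paper addresses by deriving the weights from the preferences of the alternatives.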
Abstract:
Xerox Customer Engagement activity is informed by the "Go To Market" strategy and the "Intelligent Coverage" sales philosophy. The realisation of this philosophy necessitates a sophisticated level of Market Understanding, and the effective integration of the direct channels of Customer Engagement. Sophisticated Market Understanding requires the mapping and coding of the entire UK market at the DMU (Decision Making Unit) level, which in turn enables the creation of tailored coverage prescriptions. Effective Channel Integration is made possible by the organisation of Customer Engagement work according to a single, process-defined structure: the Selling Process. Organising by process facilitates the discipline of Task Substitution, which leads logically to the creation of Hybrid Selling models. Productive Customer Engagement requires Selling Process specialisation by industry sector, customer segment and product group. The research shows that Xerox's Market Database (MDB) plays a central role in delivering the Go To Market strategic aims. It is a tool for knowledge-based selling, enables productive SFA (Sales Force Automation) and, in sum, is critical to the efficient and effective deployment of Customer Engagement resources. Intelligent Coverage is not possible without the MDB. Analysis of the case evidence has resulted in the definition of 60 idiographic statements. These statements are about how Xerox organises and manages three direct channels of Customer Engagement: Face to Face, Telebusiness and Ebusiness. Xerox is shown to employ a process-oriented, IT-enabled, holistic approach to Customer Engagement productivity. The significance of the research is that it represents a detailed (perhaps unequalled) level of rich description of the interplay between IT and a holistic, process-oriented management philosophy.
Abstract:
Tonal, textural and contextual properties are used in the manual photointerpretation of remotely sensed data. This study used these three attributes to produce a lithological map of semi-arid northwest Argentina by semi-automatic computer classification of remotely sensed data. Three different types of satellite data were investigated: LANDSAT MSS, TM and SIR-A imagery. Supervised classification procedures using tonal features only produced poor results. LANDSAT MSS produced classification accuracies in the range of 40 to 60%, while accuracies of 50 to 70% were achieved using LANDSAT TM data. The addition of SIR-A data produced increases in classification accuracy. The higher classification accuracy of TM over MSS is due to the better discrimination of geological materials afforded by the middle infrared bands of the TM sensor. The maximum likelihood classifier consistently produced classification accuracies 10 to 15% higher than either the minimum distance to means or decision tree classifiers; this improved accuracy was obtained at the cost of greatly increased processing time. A new type of classifier, the spectral shape classifier, which is computationally as fast as a minimum distance to means classifier, is described. However, the results for this classifier were disappointing, being lower in most cases than those of the minimum distance or decision tree procedures. The classification results using only tonal features were felt to be unacceptably poor, so textural attributes were investigated. Texture is an important attribute used by photogeologists to discriminate lithology. In the case of TM data, texture measures were found to increase classification accuracy by up to 15%. However, in the case of the LANDSAT MSS data, the use of texture measures did not provide any significant increase in classification accuracy. For TM data, it was found that second-order texture, especially the SGLDM-based measures, produced the highest classification accuracy. Contextual post-processing was found to increase classification accuracy and improve the visual appearance of classified output by removing isolated misclassified pixels, which tend to clutter classified images. Simple contextual features, such as mode filters, were found to outperform more complex features such as gravitational filters or minimal area replacement methods. Generally, the larger the filter, the greater the increase in accuracy. Production rules were used to build a knowledge-based system which used tonal and textural features to identify sedimentary lithologies in each of the two test sites. The knowledge-based system was able to identify six out of ten lithologies correctly.
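For readers unfamiliar with the classifiers being compared, the sketch below shows a minimal minimum-distance-to-means classifier in Python; the band values and lithology labels are hypothetical, and this is not the thesis's implementation (the maximum likelihood classifier additionally models per-class spread, which is why it was more accurate but slower).

```python
import numpy as np

# Minimal minimum-distance-to-means classifier sketch (illustrative only).
# Each pixel is a vector of band values (e.g. LANDSAT TM bands); each class
# is represented by the mean vector of its training pixels.

def class_means(training_pixels, labels):
    """Return {class_label: mean band vector} from labelled training pixels."""
    return {c: training_pixels[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(pixels, means):
    """Assign each pixel to the class whose mean is nearest (Euclidean distance)."""
    classes = list(means)
    centres = np.stack([means[c] for c in classes])            # (n_classes, n_bands)
    d = np.linalg.norm(pixels[:, None, :] - centres, axis=2)   # (n_pixels, n_classes)
    return np.array(classes)[d.argmin(axis=1)]

# Hypothetical 3-band training data for two lithologies.
train = np.array([[50, 60, 70], [52, 58, 71], [90, 95, 40], [88, 97, 42]], float)
labels = np.array(["shale", "shale", "limestone", "limestone"])
print(classify(np.array([[51, 59, 69], [89, 96, 41]], float), class_means(train, labels)))
```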
Abstract:
This study was concerned with the computer automation of land evaluation. This is a broad subject with many issues to be resolved, so the study concentrated on three key problems: knowledge-based programming; the integration of spatial information from remote sensing and other sources; and the inclusion of socio-economic information in the land evaluation analysis. Land evaluation and land use planning were considered in the context of overseas projects in the developing world. Knowledge-based systems were found to provide significant advantages over conventional programming techniques for some aspects of the land evaluation process. Declarative languages, in particular Prolog, were ideally suited to the integration of social information, which changes with every situation. Rule-based expert system shells were also found to be suitable for this role, including knowledge acquisition at the interview stage. All the expert system shells examined suffered from severe constraints on problem size, but new products now overcome this. Inductive expert system shells were useful as a guide to knowledge gaps and possible relationships, but the number of examples required was unrealistic for typical land use planning situations. The accuracy of classified satellite imagery was significantly enhanced by integrating spatial information on soil distribution for the Thailand data. Estimates of the rice-producing area were substantially improved (a 30% change in area) by the addition of soil information. Image processing work on Mozambique showed that satellite remote sensing was a useful tool in stratifying vegetation cover at provincial level to identify key development areas, but its full utility could not be realised on typical planning projects without treatment as part of a complete spatial information system.
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not using these tools already. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as being essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which 'an estimate' is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
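For readers unfamiliar with the 'classic' metrics being extended, here is a minimal sketch of two of them in Python: a physical lines-of-code count and a crude McCabe-style decision count. The decision-token list and comment conventions are hypothetical and generic; this is not the Prolog-specific redefinition developed in the thesis.

```python
import re

# Minimal sketch of two 'classic' size/structure metrics (illustrative only;
# not the Prolog-specific redefinitions developed in the thesis).

DECISION_TOKENS = r"\b(if|elif|else|for|while|case|and|or)\b"  # hypothetical token set

def lines_of_code(source: str) -> int:
    """Count non-blank, non-comment physical lines (comments assumed to start with '%' or '#')."""
    return sum(1 for line in source.splitlines()
               if line.strip() and not line.strip().startswith(("%", "#")))

def crude_cyclomatic(source: str) -> int:
    """McCabe-style estimate: one plus the number of decision points found."""
    return 1 + len(re.findall(DECISION_TOKENS, source))

example = """# toy program
if x > 0:
    y = 1
else:
    y = -1
"""
print(lines_of_code(example), crude_cyclomatic(example))  # 4 3
```

The thesis's point is that such counts must be redefined before they are meaningful for a declarative language like Prolog, where clauses rather than control-flow statements carry the program structure.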
Abstract:
Objectives: The creation of more high-growth firms continues to be a key component of enterprise policy throughout the countries of the OECD. In the UK the developing enterprise policy framework highlights the importance of supporting businesses with growth potential. The difficulty, of course, lies in the ability of those delivering business support policies to accurately identify those businesses, especially at start-up, which will benefit from interventions and experience enhanced growth performance. This paper has the core objective of presenting new data on the number of high-growth firms in the UK and providing an assessment of their economic significance. Approach: This paper uses a specially created longitudinal firm-level database based on the Inter-Departmental Business Register (IDBR) held by the Office for National Statistics (ONS) for all private sector businesses in the UK for the period 1997-2008 to investigate the share of high-growth firms (including a sub-set of start-ups more commonly referred to as gazelles) in successive cohorts of start-ups. We apply OECD definitions of high growth and gazelles to this database and are able to quantify for the first time their number (disaggregated by sector, region and size) and importance (employment and sales). Prior Work: What is lacking at the core of this policy focus is any comprehensive statistical analysis of the scale and nature of high-growth firms in cohorts of new and established businesses. The evidence base in response to the question "Why do high-growth firms matter?" is surprisingly weak. Important work in this area has been initiated by Bartelsman et al. (2003), Hoffman and Jünge (2006) and Henrekson and Johansson (2009), but to date work in the UK has been limited (BERR, 2008b). Results: We report that there are ~11,500 high-growth firms in the UK in both 2005 and 2008. The share of high-growth start-ups in the UK in 2005 (6.3%) was, contrary to the widely held perception in policy circles, higher than in the United States (5.2%). Of particular interest in the analysis are the growth trajectories (patterns of growth) of these firms, as well as the extent to which they are restricted to technology-based or knowledge-based sectors. Implications and Value: Using hitherto unused population data for the first time, we answer a fundamental research and policy question on the number and scale of high-growth firms in the UK. We draw the conclusion that this 'rare' event does not readily lend itself to policy intervention, on the grounds that the significant effort needed to identify such businesses ex ante would appear unjustified even if it were possible.
Abstract:
Purpose: The paper aims to explore the nature and purpose of higher education (HE) in the twenty-first century, focussing on how it can help fashion a green knowledge-based economy by developing approaches to learning and teaching that are social, networked and ecologically sensitive. Design/methodology/approach: The paper presents a discursive analysis of the skills and knowledge requirements of an emerging green knowledge-based economy, using a range of policy-focussed and academic research literature. Findings: The business opportunities that are emerging as a more sustainable world is developed require the knowledge and skills that can capture and move them forward, but in a complex and uncertain world learning needs to be non-linear, creative and emergent. Practical implications: Sustainable learning and the attributes graduates will need to exhibit are prefigured in the activities and learning characterising the work and play facilitated by new media technologies. Social implications: Greater emphasis is required on higher learning understood as the capability to learn, adapt and direct sustainable change; this requires interprofessional co-operation that must utilise the potential of new media technologies to enhance social learning and collective intelligence. Originality/value: The practical relationship between low-carbon economic development, social sustainability and HE learning is based on both normative criteria and actual and emerging projections in economic, technological and skills needs.
Abstract:
Two studies were conducted to test for the effects of attentional demand and cost responsibility on psychological strain. One was a field experiment involving operators of computer-based manufacturing equipment, and the other was a cross-sectional investigation of employees in a wide range of jobs. The results showed increased strain only for those in jobs high on both attentional demand and cost responsibility. Implications for job design for new manufacturing technologies are discussed.
Abstract:
Selecting the best alternative in group decision making is the subject of many recent studies. The most popular method proposed for ranking the alternatives is based on the distance of each alternative to the ideal alternative. The ideal alternative may never exist; hence the ranking results are biased towards the ideal point. The main aim of this study is to calculate a fuzzy ideal point that is more realistic than the crisp ideal point. On the other hand, Data Envelopment Analysis (DEA) has recently been used to find the optimum weights for ranking the alternatives. This paper proposes a four-stage approach, based on DEA in a fuzzy environment, to aggregate preference rankings. An application to a preferential voting system shows how the new model can be applied to rank a set of alternatives. Two other examples indicate the superiority of the proposed method compared to some other suggested methods.
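To illustrate the distance-to-ideal ranking that the paper takes as its starting point (and refines with a fuzzy ideal point and DEA-derived weights), here is a minimal crisp-ideal sketch in Python; the decision matrix and criterion weights are hypothetical.

```python
import numpy as np

# Minimal crisp ideal-point ranking sketch (illustrative only; the paper's
# contribution is a *fuzzy* ideal point combined with DEA-derived weights).

scores = np.array([[0.7, 0.5, 0.9],    # alternative A, three benefit criteria
                   [0.6, 0.8, 0.4],    # alternative B
                   [0.9, 0.4, 0.6]])   # alternative C
weights = np.array([0.5, 0.3, 0.2])    # hypothetical criterion weights

ideal = scores.max(axis=0)             # crisp ideal alternative: best value per criterion
distance = np.sqrt(((weights * (scores - ideal)) ** 2).sum(axis=1))
ranking = distance.argsort()           # smaller distance to the ideal = better rank
print(ideal, distance.round(3), ranking)
```

Because the crisp ideal is assembled from the best value of each criterion, it typically corresponds to no real alternative, which is the bias towards the ideal point that the fuzzy ideal point is intended to reduce.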
Abstract:
Purpose: The purpose of this paper is to review the literature which focuses on four major higher education decision problems. These are: resource allocation; performance measurement; budgeting; and scheduling. Design/methodology/approach: Related articles appearing in the international journals from 1996 to 2005 are gathered and analyzed so that the following three questions can be answered: "What kind of decision problems were paid most attention to?"; "Were the multiple criteria decision-making techniques prevalently adopted?"; and "What are the inadequacies of these approaches?" Findings: Based on the inadequacies, some improvements and possible future work are recommended, and a comprehensive resource allocation model is developed taking account of these factors. Finally, a new knowledge-based goal programming technique which integrates some operations of analytic hierarchy process is proposed to tackle the model intelligently. Originality/value: Higher education has faced the problem of budget cuts or constrained budgets for the past 30 years. Managing the process of the higher education system is, therefore, a crucial and urgent task for the decision makers of universities in order to improve their performance or competitiveness. © Emerald Group Publishing Limited.
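Since the proposed technique integrates operations of the analytic hierarchy process (AHP), the sketch below shows the standard AHP priority derivation (the principal eigenvector of a pairwise comparison matrix) in Python; the criteria and comparison values are hypothetical, and the knowledge-based goal programming part of the authors' model is not shown.

```python
import numpy as np

# Minimal AHP priority-derivation sketch (illustrative only; the paper combines
# this kind of weighting with a knowledge-based goal programming model).

# Hypothetical pairwise comparison matrix for three resource-allocation criteria:
# A[i, j] = judged importance of criterion i relative to criterion j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, eigvals.real.argmax()].real
weights = principal / principal.sum()                      # normalised priority vector
consistency_index = (eigvals.real.max() - len(A)) / (len(A) - 1)
print(weights.round(3), round(consistency_index, 4))
```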
Abstract:
This paper analyses market valuations of UK companies using a new data set of their R&D and IP activities (1989–2002). In contrast to previous studies, the analysis is conducted at the sectoral level, where the sectors are based on the technological classification originating from Pavitt [Pavitt, K., 1984. Sectoral patterns of technical change. Research Policy 13, 343–373]. The first main result is that the valuation of R&D varies substantially across these sectors. Another important result is that, on average, firms that receive only UK patents tend to have no significant market premium. In direct contrast, patenting through the European Patent Office does raise market value, as does the registration of trade marks in the UK for most sectors. To explore these variations the paper links competitive conditions with the market valuation of innovation. Using profit persistence as a measure of competitive pressure, we find that the sectors that are the most competitive have the lowest market valuation of R&D. Furthermore, within the most competitive sector (‘science based’ manufacturing), firms with larger market shares (an inverse indicator of competitive pressure) also have higher R&D valuations, as well as some positive return to UK patents. We conclude that this evidence supports Schumpeter by finding higher returns to innovation in less than fully competitive markets and contradicts Arrow [Arrow, K., 1962. Economic welfare and the allocation of resources for invention. In: Nelson, R. (Ed.), The Rate and Direction of Inventive Activity. Princeton University Press, Princeton], who argued that, with the existence of IP rights, competitive market structure provides higher incentives to innovate.
Developing a probabilistic graphical structure from a model of mental-health clinical risk expertise
Abstract:
This paper explores the process of developing a principled approach for translating a model of mental-health risk expertise into a probabilistic graphical structure. The Galatean Risk Screening Tool [1] is a psychological model for mental health risk assessment based on fuzzy sets. This paper details how the knowledge encapsulated in the psychological model was used to develop the structure of the probability graph by exploiting the semantics of the clinical expertise. These semantics are formalised by a detailed specification for an XML structure used to represent the expertise. The component parts were then mapped to equivalent probabilistic graphical structures such as Bayesian Belief Nets and Markov Random Fields to produce a composite chain graph that provides a probabilistic classification of risk expertise to complement the expert clinical judgements. © Springer-Verlag 2010.
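As a toy illustration of the kind of probabilistic structure that expert knowledge can be mapped onto, the following is a minimal hand-rolled Bayesian sketch in Python with two hypothetical cues feeding a binary risk node; it is not the GRiST model, its XML specification, or the composite chain graph described in the paper, and all numbers are invented.

```python
# Minimal Bayesian sketch (illustrative only): two observed cues influence a
# binary risk node, loosely analogous to mapping expert cues onto a directed
# probabilistic structure. All cue names and probabilities are hypothetical.

p_risk = 0.1                                  # prior P(risk = high)
p_cue_given_risk = {                          # P(cue present | risk state)
    "sleep_loss":   {"high": 0.8, "low": 0.3},
    "hopelessness": {"high": 0.7, "low": 0.1},
}

def posterior_high_risk(observed_cues):
    """P(risk = high | observed cues), assuming cues independent given risk."""
    like_high, like_low = p_risk, 1.0 - p_risk
    for cue in observed_cues:
        like_high *= p_cue_given_risk[cue]["high"]
        like_low *= p_cue_given_risk[cue]["low"]
    return like_high / (like_high + like_low)

print(round(posterior_high_risk(["sleep_loss"]), 3))                  # ~0.229
print(round(posterior_high_risk(["sleep_loss", "hopelessness"]), 3))  # ~0.675
```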
Abstract:
Radio Frequency Identification (RFID) has been identified as a crucial technology for the modern 21st century knowledge-based economy. Some businesses have realised benefits of RFID adoption through improvements in operational efficiency, additional cost savings, and opportunities for higher revenues. RFID research in warehousing operations has been less prominent than in other application domains. To investigate how RFID technology has had an impact in warehousing, a comprehensive analysis of research findings available from articles through leading scientific article databases has been conducted. Articles from years 1995 to 2010 have been reviewed and analysed with respect to warehouse operations, RFID application domains, benefits achieved and obstacles encountered. Four discussion topics are presented covering RFID in warehousing focusing on its applications, perceived benefits, obstacles to its adoption and future trends. This is aimed at elucidating the current state of RFID in the warehouse and providing insights for researchers to establish new research agendas and for practitioners to consider and assess the adoption of RFID in warehousing functions. © 2013 Elsevier B.V.