847 results for information, knowledge
Abstract:
We investigate knowledge exchange among commercial organizations, the rationale behind it, and its effects on the market. Knowledge exchange is known to be beneficial for industry, but in order to explain it, authors have used high-level concepts like network effects, reputation, and trust. We attempt to formalize a plausible and elegant explanation of how and why companies adopt information exchange and why it benefits the market as a whole when this happens. This explanation is based on a multiagent model that simulates a market of software providers. Even though the model does not include any high-level concepts, information exchange naturally emerges during simulations as a successful profitable behavior. The conclusions reached by this agent-based analysis are twofold: 1) a straightforward set of assumptions is enough to give rise to exchange in a software market, and 2) knowledge exchange is shown to increase the efficiency of the market.
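The claim that exchange emerges as profitable behaviour can be illustrated with a toy agent-based sketch. The dynamics, names, and parameter values below are our own illustrative assumptions, not the paper's model: knowledge is treated as non-rivalrous, so two paired agents who exchange both gain, and the market's total knowledge (a crude proxy for efficiency) rises relative to a market without exchange.

```python
import random

def simulate(n_agents=20, rounds=200, exchange=True, seed=0):
    # Each agent holds a scalar "knowledge" level; because knowledge
    # is non-rivalrous, an exchange lets both parties gain.
    rng = random.Random(seed)
    knowledge = [rng.uniform(0.1, 1.0) for _ in range(n_agents)]
    for _ in range(rounds):
        if exchange:
            i, j = rng.sample(range(n_agents), 2)
            gain = 0.1 * abs(knowledge[i] - knowledge[j])
            knowledge[i] += gain
            knowledge[j] += gain
    # "Market efficiency" proxied by total knowledge in the market.
    return sum(knowledge)

with_exchange = simulate(exchange=True)
without_exchange = simulate(exchange=False)
```

Under these assumed dynamics the exchanging market always ends up with strictly more total knowledge, mirroring the abstract's second conclusion in miniature.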
Abstract:
Multi-agent systems are complex systems composed of multiple intelligent agents that act either independently or in cooperation with one another. Agent-based modelling is a method for studying complex systems such as economies, societies and ecologies. Due to their complexity, mathematical analysis is often limited in its ability to analyse such systems; in such cases, agent-based modelling offers a practical, constructive method of analysis. The objective of this book is to shed light on some emergent properties of multi-agent systems. The authors focus their investigation on the effect of knowledge exchange on the convergence of complex multi-agent systems.
Abstract:
This article presents two novel approaches for incorporating sentiment prior knowledge into the topic model for weakly supervised sentiment analysis, where sentiment labels are considered as topics. One is by modifying the Dirichlet prior for the topic-word distribution (LDA-DP); the other is by augmenting the model objective function with terms that express preferences on the expected sentiment labels of lexicon words using generalized expectation criteria (LDA-GE). We conducted extensive experiments on English movie review data and a multi-domain sentiment dataset, as well as Chinese product reviews about mobile phones, digital cameras, MP3 players, and monitors. The results show that while both LDA-DP and LDA-GE perform comparably to existing weakly supervised sentiment classification algorithms, they are much simpler and more computationally efficient, rendering them more suitable for online and real-time sentiment classification on the Web. We observed that LDA-GE is more effective than LDA-DP, suggesting that it should be preferred when employing the topic model for sentiment analysis. Moreover, both models are able to extract highly domain-salient polarity words from text.
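The LDA-DP idea of encoding a sentiment lexicon into the Dirichlet prior can be sketched minimally: lexicon words receive extra prior mass in the topic standing for their polarity label, while non-lexicon words keep a symmetric prior. The function, vocabulary, and hyperparameter values below are hypothetical illustrations, not the authors' implementation.

```python
def sentiment_prior(vocab, lexicon, n_labels=2, base=0.01, boost=1.0):
    """Build an asymmetric Dirichlet prior over topic-word
    distributions: each lexicon word gets `boost` extra prior mass
    in the topic corresponding to its sentiment label."""
    beta = [[base] * len(vocab) for _ in range(n_labels)]
    for word, label in lexicon.items():
        if word in vocab:
            beta[label][vocab[word]] += boost
    return beta

vocab = {"good": 0, "bad": 1, "movie": 2}
lexicon = {"good": 0, "bad": 1}   # 0 = positive topic, 1 = negative
beta = sentiment_prior(vocab, lexicon)
```

The resulting matrix would then replace the symmetric topic-word prior in a standard LDA sampler, biasing (but not forcing) polarity words toward their labelled topic.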
Abstract:
To date, more than 16 million citations of published articles in the biomedical domain are available in the MEDLINE database. These articles describe the new discoveries that have accompanied the tremendous development of biomedicine during the last decade. It is crucial for biomedical researchers to retrieve and mine specific knowledge from this huge quantity of published articles with high efficiency. Researchers have been engaged in the development of text mining tools to find knowledge, such as protein-protein interactions, that is most relevant and useful for specific analysis tasks. This chapter provides a road map to the various information extraction methods in the biomedical domain, such as protein name recognition and discovery of protein-protein interactions. Disciplines involved in analyzing and processing unstructured text are summarized, current work in biomedical information extraction is categorized, and challenges in the field are presented along with possible solutions.
Abstract:
With the growth of multi-national corporations (MNCs) has come the need to understand how parent companies transfer knowledge to, and manage the operations of, their subsidiaries. This is of particular interest to manufacturing companies transferring their operations overseas. Japanese companies in particular have been pioneering in the development of techniques such as Kaizen, and elements of the Toyota Production System (TPS) such as Kanban, which can be useful tools for transferring the ethos of Japanese manufacturing and maintaining quality and control in overseas subsidiaries. Much has been written about the process of transferring Japanese manufacturing techniques, but much less is understood about how the subsidiaries themselves – which are required to make use of such techniques – actually acquire and incorporate them into their operations. This research therefore takes the perspective of the subsidiary in examining how knowledge of manufacturing techniques is transferred from the parent company to its subsidiary. There is clearly a need to take a practice-based view to understand how local managers and operatives incorporate this knowledge into their working practices. A particularly relevant theme is how subsidiaries both replicate and adapt knowledge from parents, and the circumstances in which replication or adaptation occurs. However, there is a lack of research which takes an in-depth look at these processes from the perspective of the participants themselves. This is particularly important, as much of the knowledge literature argues that knowledge is best viewed as enacted and learned in practice – and therefore transferred in person – rather than by the transfer of abstract and de-contextualised information. What is needed, therefore, is further research which makes an in-depth examination of what happens at the subsidiary level for this transfer process to occur.
In-depth qualitative research was therefore conducted in the subsidiary of a Japanese multinational, Gambatte Corporation, involving three main manufacturing initiatives (or philosophies), namely 'TPS', 'TPM' and 'TS'. The case data were derived from 52 in-depth interviews with project members, moderate-participant observation and documentation, and are presented and analysed in episode format. This study contributes to our understanding of knowledge transfer in relation to the approaches and circumstances of adaptation and replication of knowledge within the subsidiary, how the whole process develops, and how 'innovation' takes place. The study further found that the process of knowledge transfer can be explained as a process of Reciprocal Provider-Learner Exchange, which can be linked to Experiential Learning Theory.
Abstract:
Continuing advances in digital image capture and storage are resulting in a proliferation of imagery and associated problems of information overload in image domains. In this work we present a framework that supports image management using an interactive approach that captures and reuses task-based contextual information. Our framework models the relationship between images and domain tasks they support by monitoring the interactive manipulation and annotation of task-relevant imagery. During image analysis, interactions are captured and a task context is dynamically constructed so that human expertise, proficiency and knowledge can be leveraged to support other users in carrying out similar domain tasks using case-based reasoning techniques. In this article we present our framework for capturing task context and describe how we have implemented the framework as two image retrieval applications in the geo-spatial and medical domains. We present an evaluation that tests the efficiency of our algorithms for retrieving image context information and the effectiveness of the framework for carrying out goal-directed image tasks. © 2010 Springer Science+Business Media, LLC.
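The capture-and-reuse idea can be sketched as a small case base: task contexts captured from past interactions are stored with their images and retrieved by similarity when a new user undertakes a similar task. The class, the choice of Jaccard similarity over annotation terms, and the example data below are our own illustrative assumptions, not the authors' system.

```python
def jaccard(a, b):
    # Set-overlap similarity between two bags of context terms.
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

class CaseBase:
    """Stores captured task contexts (annotation/interaction terms)
    keyed by image, and retrieves the most similar past cases."""
    def __init__(self):
        self.cases = []  # list of (image_id, context_terms)

    def add(self, image_id, context_terms):
        self.cases.append((image_id, set(context_terms)))

    def retrieve(self, query_terms, k=1):
        ranked = sorted(self.cases,
                        key=lambda c: jaccard(c[1], query_terms),
                        reverse=True)
        return [img for img, _ in ranked[:k]]

cb = CaseBase()
cb.add("scan_01", {"lesion", "annotate", "zoom"})   # medical task
cb.add("map_07", {"road", "pan", "measure"})        # geo-spatial task
```

A new query context such as `{"lesion", "zoom"}` would then surface `scan_01`, reusing the earlier analyst's captured expertise.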
Abstract:
This paper proposes a conceptual model for a firm's capability to calibrate supply chain knowledge (CCK). Knowledge calibration is achieved when there is a match between managers' ex ante confidence in the accuracy of held knowledge and the ex post accuracy of that knowledge. Knowledge calibration is closely related to knowledge utility, or willingness to use the available ex ante knowledge: a manager uses the ex ante knowledge if he/she is confident in its accuracy, and does not use it, or uses it with reservation, when that confidence is low. Thus, knowledge calibration attained through the firm's CCK enables managers to deal with incomplete and uncertain information and enhances the quality of decisions. In the supply chain context, although demand- and supply-related knowledge is available, supply chain inefficiencies, such as the bullwhip effect, remain. These issues may be caused not by a lack of knowledge but by a firm's lack of capability to sense potential disagreement between knowledge accuracy and confidence. Therefore, this paper contributes to the understanding of supply chain knowledge utilization by defining CCK and identifying a set of antecedents and consequences of CCK in the supply chain context.
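The notion of calibration as a match between ex ante confidence and ex post accuracy can be made concrete with a toy score. The scoring rule below (1 minus the mean gap between stated confidence and actual correctness) is our own illustrative choice, not the paper's construct.

```python
def calibration_score(confidences, correct):
    """Knowledge calibration sketch: 1.0 means ex ante confidence
    perfectly matched ex post accuracy; 0.0 means maximal mismatch.
    `confidences` are in [0, 1]; `correct` are booleans."""
    errors = [abs(c - (1.0 if ok else 0.0))
              for c, ok in zip(confidences, correct)]
    return 1.0 - sum(errors) / len(errors)

# A manager whose confidence tracks outcomes scores high...
well = calibration_score([0.9, 0.8, 0.1], [True, True, False])
# ...while the same confidences paired with opposite outcomes score low.
poorly = calibration_score([0.9, 0.8, 0.1], [False, False, True])
```

Under this rule, a firm with high CCK would, over time, push its managers' scores toward 1.0, which is the condition under which ex ante knowledge is safely usable.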
Abstract:
We have attempted to bring together two areas which are challenging for both IS research and practice: forms of coordination and management of knowledge in the context of global, virtual software development projects. We developed a more comprehensive, knowledge-based model of how coordination can be achieved, and illustrated the heuristic and explanatory power of the model when applied to global software projects experiencing different degrees of success. We first reviewed the literature on coordination and determined what is known about coordination of knowledge in global software projects. From this we developed a new, distinctive knowledge-based model of coordination, which was then employed to analyze two case studies of global software projects, at SAP and Baan, to illustrate the utility of the model.
Abstract:
This paper explores the role of transactive memory in enabling knowledge transfer between globally distributed teams. While the information systems literature has recently acknowledged the role transactive memory plays in improving knowledge processes and performance in colocated teams, little is known about its contribution to distributed teams. To contribute to filling this gap, knowledge-transfer challenges and processes between onsite and offshore teams were studied at TATA Consultancy Services. In particular, the paper describes the transfer of knowledge between onsite and offshore teams through encoding, storing and retrieving processes. An in-depth case study of globally distributed software development projects was carried out, and a qualitative, interpretive approach was adopted. The analysis of the case suggests that in order to overcome differences derived from the local contexts of the onsite and offshore teams (e.g. different work routines, methodologies and skills), some specific mechanisms supporting the development of codified and personalized ‘directories’ were introduced. These include the standardization of templates and methodologies across the remote sites as well as frequent teleconferencing sessions and occasional short visits. These mechanisms contributed to the development of the notion of ‘who knows what’ across onsite and offshore teams despite the challenges associated with globally distributed teams, and supported the transfer of knowledge between onsite and offshore teams. The paper concludes by offering theoretical and practical implications.
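The codified 'who knows what' directory, with its encoding, storing, and retrieving processes, can be sketched as a small data structure. The class and example below are a hypothetical illustration of the transactive-memory idea, not TATA Consultancy Services' actual mechanism.

```python
class TransactiveDirectory:
    """Toy 'who knows what' directory for distributed teams:
    encode() stores who holds expertise on a topic and at which
    site; retrieve() finds whom to ask, optionally preferring a
    colocated expert before falling back to the remote site."""
    def __init__(self):
        self._experts = {}   # topic -> set of (person, site)

    def encode(self, topic, person, site):
        self._experts.setdefault(topic, set()).add((person, site))

    def retrieve(self, topic, prefer_site=None):
        holders = self._experts.get(topic, set())
        if prefer_site is not None:
            local = [p for p, s in holders if s == prefer_site]
            if local:
                return sorted(local)
        return sorted(p for p, _ in holders)

d = TransactiveDirectory()
d.encode("build pipeline", "Asha", "offshore")
d.encode("build pipeline", "Mark", "onsite")
```

The standardized templates and teleconferencing sessions described in the abstract are, in effect, the organizational means of keeping such a directory populated and consistent across sites.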
Abstract:
One of the key challenges that organizations face when trying to integrate knowledge across different functions is the need to overcome knowledge boundaries between team members. In cross-functional teams, these boundaries, associated with different knowledge backgrounds of people from various disciplines, create communication problems, necessitating team members to engage in complex cognitive processes when integrating knowledge toward a joint outcome. This research investigates the impact of syntactic, semantic, and pragmatic knowledge boundaries on a team’s ability to develop a transactive memory system (TMS)—a collective memory system for knowledge coordination in groups. Results from our survey show that syntactic and pragmatic knowledge boundaries negatively affect TMS development. These findings extend TMS theory beyond the information-processing view, which treats knowledge as an object that can be stored and retrieved, to the interpretive and practice-based views of knowledge, which recognize that knowledge (in particular specialized knowledge) is localized, situated, and embedded in practice.
Abstract:
Provision of information and behavioural instruction has been demonstrated to improve recovery after surgery. However, patients draw on a range of information sources and it is important to establish which sources patients use and how this influences perceptions and behaviour as they progress along the surgical pathway. In this qualitative, exploratory and longitudinal study, the use of information and instruction were explored from the perspective of people undergoing inguinal hernia repair surgery. Seven participants undergoing inguinal hernia repair surgery were interviewed using semi-structured interviews 2 weeks before surgery and 2 weeks and 4 months post-surgery. Nineteen interviews were conducted in total. Topic guides included sources of knowledge, reasons for help-seeking and opting for surgery and factors influencing return to activity. Data were analysed thematically according to Interpretative Phenomenological Analysis. Participants sought information from a range of sources, focusing on informal information sources before surgery and using information and instruction from health-care professionals post-surgery. This information influenced behaviours including deciding to undergo surgery, use of pain medication and returning to usual activity. Anxiety and help-seeking resulted when unexpected post-surgical events occurred such as extensive bruising. Findings were consistent with psychological and sociological theories. Overall, participants were positive about the information and instruction they received but expressed a desire for more timely information on post-operative adverse events.
Abstract:
Increasingly, users are seen as the weak link in the chain when it comes to the security of corporate information. Should the users of computer systems act in any inappropriate or insecure manner, they may put their employers in danger of financial losses, information degradation or litigation, and themselves in danger of dismissal or prosecution. This is a particularly important concern for knowledge-intensive organisations, such as universities, as the effective conduct of their core teaching and research activities is becoming ever more reliant on the availability, integrity and accuracy of computer-based information resources. One increasingly important mechanism for reducing the occurrence of inappropriate behaviours, and in so doing protecting corporate information, is the formulation and application of a formal ‘acceptable use policy’ (AUP). Whilst the AUP has attracted some academic interest, it has tended to be prescriptive and overly focussed on the role of the Internet, and there is relatively little empirical material that explicitly addresses the purpose, positioning or content of real acceptable use policies. The broad aim of the study reported in this paper is to fill this gap in the literature by critically examining the structure and composition of a sample of authentic policies – taken from the higher education sector – rather than simply making general prescriptions about what they ought to contain. There are two important conclusions to be drawn from this study: (1) the primary role of the AUP appears to be as a mechanism for dealing with unacceptable behaviour, rather than proactively promoting desirable and effective security behaviours, and (2) the wide variation found in the coverage and positioning of the reviewed policies is unlikely to be fostering a coherent approach to security management across the higher education sector.
Abstract:
Ensuring the security of corporate information, that is increasingly stored, processed and disseminated using information and communications technologies [ICTs], has become an extremely complex and challenging activity. This is a particularly important concern for knowledge-intensive organisations, such as universities, as the effective conduct of their core teaching and research activities is becoming ever more reliant on the availability, integrity and accuracy of computer-based information resources. One increasingly important mechanism for reducing the occurrence of security breaches, and in so doing, protecting corporate information, is through the formulation and application of a formal information security policy (InSPy). Whilst a great deal has now been written about the importance and role of the information security policy, and approaches to its formulation and dissemination, there is relatively little empirical material that explicitly addresses the structure or content of security policies. The broad aim of the study, reported in this paper, is to fill this gap in the literature by critically examining the structure and content of authentic information security policies, rather than simply making general prescriptions about what they ought to contain. Having established the structure and key features of the reviewed policies, the paper critically explores the underlying conceptualisation of information security embedded in the policies. There are two important conclusions to be drawn from this study: (1) the wide diversity of disparate policies and standards in use is unlikely to foster a coherent approach to security management; and (2) the range of specific issues explicitly covered in university policies is surprisingly low, and reflects a highly techno-centric view of information security management.
Abstract:
In this demonstration, we will present a semantic environment called the K-Box. The K-Box supports the lightweight integration of knowledge tools, with a focus on semantic tools, but with the flexibility to integrate natural language and conventional tools. We discuss the implementation of the framework, and two existing applications, including details of a new application for developers of semantic workflows. The demonstration will be of interest to developers and researchers of ontology-based knowledge management systems, and semantic desktops, and to analysts working with cross-media information. © 2011 ACM.
Abstract:
The paper proposes an ISE (Information goal, Search strategy, Evaluation threshold) user classification model based on Information Foraging Theory for understanding user interaction with content-based image retrieval (CBIR). The proposed model is verified by a multiple linear regression analysis based on 50 users' interaction features collected from a task-based user study of interactive CBIR systems. To the best of our knowledge, this is the first principled user classification model in CBIR verified by a formal and systematic quantitative analysis of extensive user interaction data. Copyright 2010 ACM.
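The verification step, a multiple linear regression over users' interaction features, can be sketched in plain Python. The feature names, outcome values, and data below are invented for illustration; only the method (ordinary least squares on per-user features) follows the abstract.

```python
def ols(X, y):
    """Ordinary least squares: solve the normal equations
    (X'X)b = X'y by Gaussian elimination with partial pivoting.
    An intercept column is prepended to X automatically."""
    rows = [[1.0] + [float(v) for v in r] for r in X]
    p = len(rows[0])
    # Augmented matrix [X'X | X'y].
    a = [[sum(r[i] * r[j] for r in rows) for j in range(p)]
         + [sum(r[i] * yi for r, yi in zip(rows, y))] for i in range(p)]
    for col in range(p):                      # forward elimination
        piv = max(range(col, p), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(col + 1, p):
            f = a[r][col] / a[col][col]
            for c in range(col, p + 1):
                a[r][c] -= f * a[col][c]
    beta = [0.0] * p                          # back substitution
    for r in range(p - 1, -1, -1):
        s = sum(a[r][c] * beta[c] for c in range(r + 1, p))
        beta[r] = (a[r][p] - s) / a[r][r]
    return beta

# Hypothetical per-user features (e.g. queries issued, images browsed,
# relevance clicks) against a hypothetical ISE-style outcome score.
X = [[5, 120, 3], [9, 300, 7], [2, 60, 1],
     [7, 210, 5], [4, 100, 2], [8, 260, 6]]
y = [0.4, 0.9, 0.2, 0.7, 0.3, 0.8]
beta = ols(X, y)
pred = [beta[0] + sum(b * v for b, v in zip(beta[1:], r)) for r in X]
ybar = sum(y) / len(y)
r2 = 1 - (sum((yi - pi) ** 2 for yi, pi in zip(y, pred))
          / sum((yi - ybar) ** 2 for yi in y))
```

In the paper's setting, the fitted coefficients would indicate which interaction features predict a user's ISE class, with R² summarizing how well the features explain the variation.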