825 results for declarative and procedural knowledge


Relevance: 100.00%

Abstract:

Despite the voluminous studies written about organisational innovation over the last 30-40 years, our understanding of this phenomenon continues to be inconsistent and inconclusive (Wolfe, 1994). An assessment of the theoretical and methodological issues limiting the explanatory utility of many studies has led scholars (e.g. Slappendel, 1996) to re-evaluate the assumptions used to ground them. Building on these criticisms, the current study contributes to the development of an interactive perspective on organisational innovation. This work contributes empirically and theoretically to an improved understanding of the innovation process and of the interaction between the realm of action and the mediating effects of pre-existing contingencies, i.e. social control, economic exchange and the communicability of knowledge (Scarbrough, 1996). Building on recent advances in institutional theory (see Barley, 1986; 1990; Barley and Tolbert, 1997) and critical theory (Morrow, 1994; Sayer, 1992), the study aims to demonstrate, via longitudinal intensive research, the process through which ideas are translated into reality. This is significant because, despite a growing recognition of the implicit link between the strategic conduct of actors and the institutional realm in organisational analysis, there are few examples that theorise and empirically test these connections. By assessing an under-researched example of technology transfer, the government's Teaching Company Scheme (TCS), this project provides a critique of the innovation process that contributes to theory and to our appreciation of change in the UK government's premier technology transfer scheme (QR, 1996). Critical moments during the translation of ideas illustrate how elements linked to social control, economic exchange and communicability mediate the innovation process. Using the analytical categories of contradiction, slippage and dysfunctionality, these are assessed in relation to the actions (coping strategies) of programme members over a two-year period. Drawing on Giddens' (1995) notion of the duality of structure, this study explores the nature of the relationship between the task environment and the institutional environment, demonstrating how and why knowledge is both an enabler of and a barrier to organisational innovation.

Abstract:

This research project focused upon the design strategies adopted by expert and novice designers. It was based upon a desire to compare the design problem solving strategies of novices, in this case Key Stage 3 pupils studying technology within the United Kingdom National Curriculum, with designers who could be considered to have developed expertise. The findings helped to provide insights into potential teaching strategies to suit novice designers. Verbal protocols were made as samples of expert and novice designers solved a design problem and talked aloud as they worked. The verbalisations were recorded on video tape. The protocols were transcribed and segmented, with each segment being assigned to a predetermined coding system which represented a model of design problem solving. The results of the encoding were analysed, and consideration was also given to the general design strategy and heuristics used by the expert and novice designers. The drawings and models produced during the generation of the protocols were also analysed and considered. A number of significant differences between the problem solving strategies adopted by the expert and novice designers were identified. First of all, differences were observed in the way expert and novice designers used the problem statement and solution validation during the process. Differences were also identified in the way holistic solutions were generated near the start of the process, and in the cycles of exploration and the processes of integration. The way design and technological knowledge was used provided further insights into the differences between experts and novices, as did the role of drawing and modelling during the process. In more general terms, differences were identified in the heuristics and overall design strategies adopted by the expert and novice designers. The above findings provided a basis for discussing teaching strategies appropriate for novice designers. Finally, opportunities for future research were discussed.

Abstract:

The primary objective of this research was to understand what kinds of knowledge and skills people use in 'extracting' relevant information from text, and to assess the extent to which expert systems techniques could be applied to automate the process of abstracting. The approach adopted in this thesis is based on research in cognitive science, information science, psycholinguistics and text linguistics. The study addressed the significance of domain knowledge and heuristic rules by developing an information extraction system called INFORMEX. This system, implemented partly in SPITBOL and partly in PROLOG, used a set of heuristic rules to analyse five scientific papers of an expository type, to interpret their content in relation to the key abstract elements, and to extract a set of sentences recognised as relevant for abstracting purposes. Analysis of these extracts revealed that an adequate abstract could be generated. Furthermore, INFORMEX showed that a rule-based system is a suitable computational model for representing experts' knowledge and strategies. This computational technique provided the basis for a new approach to the modelling of cognition, showing how experts tackle the task of abstracting by integrating formal knowledge with experiential learning. The thesis demonstrates that empirical and theoretical knowledge can be effectively combined in expert systems technology to provide a valuable starting point for automatic abstracting.
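INFORMEX itself was written in SPITBOL and PROLOG, and its actual rules are not given in the abstract. Purely as an illustrative sketch of the general technique described, heuristic cue-phrase rules that score sentences and select the highest-scoring ones for an extract, one might write the following; every rule pattern and weight here is invented for illustration:

```python
import re

# Hypothetical cue-phrase heuristics: sentences containing these markers
# are scored as likely extract-worthy (objective, result and method cues).
CUE_RULES = [
    (re.compile(r"\b(aims? to|objective|purpose)\b", re.I), 3),
    (re.compile(r"\b(we (show|found|demonstrate)|results?)\b", re.I), 2),
    (re.compile(r"\b(method|approach|technique)\b", re.I), 1),
]

def extract_sentences(text, top_n=2):
    """Split text into sentences, score each against the cue rules,
    and return the top_n highest-scoring sentences in document order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    scored = []
    for i, s in enumerate(sentences):
        score = sum(w for pat, w in CUE_RULES if pat.search(s))
        scored.append((score, i, s))
    best = sorted(scored, key=lambda t: (-t[0], t[1]))[:top_n]
    return [s for _, i, s in sorted(best, key=lambda t: t[1])]
```

A real system of this kind would combine many more rule types (location in the paper, rhetorical role, domain terms); the sketch only shows the score-and-select shape of the approach.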

Abstract:

Background: The role of applied theatre in engaging both lay and professional publics with debate on health policy and practice is an emergent field. This paper discusses the development, production, performance and discussion of 'Inside View'.
Objectives: The objectives were to produce applied theatre from the findings of a completed study on prenatal genetic screening, exploring the dilemmas it poses for women and health professionals, and to engage audiences in debate and reflection on those dilemmas.
Methods: 'Inside View' was developed from a multidisciplinary research study through identification of emergent themes from qualitative interviews, and development of these by the writer, theatre producer and media technologist with input from the researchers.
Findings: 'Inside View' was performed in London and the Midlands to varied audiences, with a panel discussion and evaluation after each performance. The audiences engaged in debate that was relevant to them professionally and personally. Knowledge translation through applied theatre is an effective tool for engaging the public, but its subsequent impact is unclear. There are ethical issues around unexpected disclosure during post-performance discussion, and the process of transforming research findings into applied theatre requires time and trust within the multidisciplinary team as well as adequate resourcing.

Abstract:

We investigate knowledge exchange among commercial organizations, the rationale behind it, and its effects on the market. Knowledge exchange is known to be beneficial for industry, but in order to explain it, authors have used high-level concepts like network effects, reputation, and trust. We attempt to formalize a plausible and elegant explanation of how and why companies adopt information exchange and why it benefits the market as a whole when this happens. This explanation is based on a multiagent model that simulates a market of software providers. Even though the model does not include any high-level concepts, information exchange naturally emerges during simulations as a successful profitable behavior. The conclusions reached by this agent-based analysis are twofold: 1) a straightforward set of assumptions is enough to give rise to exchange in a software market, and 2) knowledge exchange is shown to increase the efficiency of the market.
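The paper's actual model specification is not reproduced in the abstract. The toy simulation below is only a hedged illustration of the underlying intuition, that mutual knowledge exchange can out-earn hoarding in an agent-based market of software providers; unlike the paper, it fixes each agent's strategy in advance rather than letting exchange emerge, and every parameter (agent count, rounds, the 1.02 pooling gain) is invented for the sketch:

```python
import random

class Provider:
    """A software provider agent with a knowledge stock and a fixed strategy."""
    def __init__(self, shares):
        self.shares = shares      # True if this provider exchanges knowledge
        self.knowledge = 1.0
        self.profit = 0.0

def simulate(n_agents=40, rounds=2000, seed=0):
    """Pair random providers each round; knowledge exchange happens only when
    both parties are willing, and each round's profit scales with the agent's
    current knowledge. Returns (mean sharer profit, mean hoarder profit)."""
    rng = random.Random(seed)
    agents = [Provider(shares=(i % 2 == 0)) for i in range(n_agents)]
    for _ in range(rounds):
        a, b = rng.sample(agents, 2)
        if a.shares and b.shares:
            # Pooling lifts both parties slightly above the better-informed one.
            a.knowledge = b.knowledge = max(a.knowledge, b.knowledge) * 1.02
        for agent in (a, b):
            agent.profit += agent.knowledge
    mean = lambda xs: sum(xs) / len(xs)
    return (mean([a.profit for a in agents if a.shares]),
            mean([a.profit for a in agents if not a.shares]))
```

Because sharers' knowledge stock can only grow while hoarders' stays flat, sharers accumulate higher profits over enough rounds, which is the qualitative effect the paper's richer emergent model demonstrates.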

Abstract:

With the growth of multi-national corporations (MNCs) has come the need to understand how parent companies transfer knowledge to, and manage the operations of, their subsidiaries. This is of particular interest to manufacturing companies transferring their operations overseas. Japanese companies in particular have been pioneering in the development of techniques such as Kaizen, and of elements of the Toyota Production System (TPS) such as Kanban, which can be useful tools for transferring the ethos of Japanese manufacturing and maintaining quality and control in overseas subsidiaries. Much has been written about the process of transferring Japanese manufacturing techniques, but much less is understood about how the subsidiaries themselves, which are required to make use of such techniques, actually acquire and incorporate them into their operations. This research therefore takes the perspective of the subsidiary in examining how knowledge of manufacturing techniques is transferred from the parent company to the subsidiary. A particularly relevant theme is how subsidiaries both replicate and adapt knowledge from their parents, and the circumstances in which replication or adaptation occurs. However, it is shown that there is a lack of research which takes an in-depth look at these processes from the perspective of the participants themselves. This is particularly important because much of the knowledge literature argues that knowledge is best viewed as enacted and learned in practice, and therefore transferred in person, rather than through the transfer of abstract and de-contextualised information. What is needed, therefore, is further research which examines in depth what happens at the subsidiary level for this transfer process to occur. There is clearly a need to take a practice-based view to understand how local managers and operatives incorporate knowledge of manufacturing techniques into their working practices. In-depth qualitative research was therefore conducted in the subsidiary of a Japanese multinational, Gambatte Corporation, covering three main manufacturing initiatives (or philosophies), namely 'TPS', 'TPM' and 'TS'. The case data were derived from 52 in-depth interviews with project members, moderate-participant observation and documentary sources, and are presented and analysed in episode format. This study contributes to our understanding of knowledge transfer in relation to the approaches and circumstances of adaptation and replication of knowledge within the subsidiary, how the whole process develops, and how 'innovation' takes place. It further shows that the process of knowledge transfer can be explained as a process of Reciprocal Provider-Learner Exchange, which can be linked to Experiential Learning Theory.

Abstract:

This paper examines the determinants of technology transfer between parent firms and their international affiliates, and of knowledge spillovers from those affiliates to host-country firms. Using a unique data set of foreign multinational enterprise (MNE) affiliates based in Italy, we find that affiliate investment in R&D and investment in capital-embodied technology play a significant role in determining the nature of intra-firm technology flows. However, the basis for any spillovers arising from MNE affiliates does not originate from codified knowledge associated with R&D, but rather from the productivity of the affiliate.

Abstract:

This paper builds on a Strategic Activity Framework (Jarzabkowski, 2005) and activity-based theories of development (Vygotsky, 1978) to model how Enterprise Systems are used to support emerging strategy. It makes three contributions. Firstly, it links the fluidity and extensiveness of system use to patterns of strategising. Fluidity, the ability to change system use as needs change, is supported by interactive strategising, where top managers communicate directly with the organisation. Extensiveness requires procedural strategising, embedding system use in structures and routines. Secondly, it relates interactive and procedural strategising to the importance of the system: procedural strategising is more likely to occur if the system is strategically important. Thirdly, using a scaffolding metaphor, it identifies patterns in the activities of top managers and Enterprise System custodians, who identify process champions within the organisational community, orient them towards system goals, provide guided support, and encourage fluidity by pacing implementation with learning. © 2013 Published by Elsevier B.V. All rights reserved.

Abstract:

Clinical Decision Support Systems (CDSSs) need to disseminate expertise in formats that suit different end users and with functionality tuned to the context of assessment. This paper reports research into a method for designing and implementing knowledge structures that facilitate the required flexibility. A psychological model of expertise is represented using a series of formally specified and linked XML trees that capture increasing elements of the model, starting with hierarchical structuring, incorporating reasoning with uncertainty, and ending with delivering the final CDSS. The method was applied to the Galatean Risk and Safety Tool, GRiST, a web-based clinical decision support system (www.egrist.org) for assessing mental-health risks. Results of its clinical implementation demonstrate that the method can produce a system able to deliver expertise targeted and formatted for specific patient groups, different clinical disciplines and alternative assessment settings. The approach may be useful for developing other real-world systems using human expertise and is currently being applied to a logistics domain. © 2013 Polish Information Processing Society.
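Neither the GRiST XML schema nor its uncertainty calculus is reproduced in the abstract. The fragment below is therefore only a hedged illustration of the general idea it describes, a hierarchical XML knowledge tree whose leaf values are aggregated upwards through weighted nodes; every element name, weight and value is invented for the sketch:

```python
import xml.etree.ElementTree as ET

# Illustrative only: an invented fragment in the spirit of a hierarchical
# knowledge structure with per-node weights; not the real GRiST schema.
XML = """
<concept name="risk" weight="1.0">
  <concept name="history" weight="0.6">
    <datum name="past-episodes" weight="1.0" value="0.8"/>
  </concept>
  <concept name="current-state" weight="0.4">
    <datum name="mood" weight="0.5" value="0.2"/>
    <datum name="insight" weight="0.5" value="0.6"/>
  </concept>
</concept>
"""

def evaluate(node):
    """Propagate leaf values up the tree as a weighted average,
    a simple stand-in for reasoning with uncertainty."""
    if node.tag == "datum":
        return float(node.get("value"))
    children = list(node)
    total_w = sum(float(c.get("weight")) for c in children)
    return sum(float(c.get("weight")) * evaluate(c)
               for c in children) / total_w

root = ET.fromstring(XML)
risk = evaluate(root)  # 0.6*0.8 + 0.4*(0.5*0.2 + 0.5*0.6) = 0.64
```

Keeping the structure in XML, as the paper does, is what lets the same knowledge tree be re-rendered for different end users and assessment contexts without changing the underlying model.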

Abstract:

Purpose: Both phonological (speech) and auditory (non-speech) stimuli have been shown to predict early reading skills. However, previous studies have failed to control for the level of processing required by the tasks administered across the two types of stimuli. For example, phonological tasks typically tap explicit awareness (e.g. phoneme deletion), while auditory tasks usually measure implicit awareness (e.g. frequency discrimination). The stronger predictive power of speech tasks may therefore be due to their higher processing demands rather than to the nature of the stimuli.
Method: The present study uses novel tasks that control for level of processing (isolation, repetition and deletion) across speech (phonemes and nonwords) and non-speech (tones) stimuli. In the first part of a three-time-point longitudinal study, 800 beginning readers at the onset of literacy tuition (mean age 4 years and 7 months) were assessed on these tasks as well as on word reading and letter knowledge.
Results: Time 1 results reveal a significantly higher association between letter-sound knowledge and the speech tasks than the non-speech tasks. Performance was better for phoneme than tone stimuli, and worse for deletion than for isolation and repetition across all stimuli.
Conclusions: The results are consistent with phonological accounts of reading and suggest that the level of processing required by a task is less important than the stimulus type in predicting the earliest stage of reading.

Abstract:

While most of the research in Knowledge Management (KM) has focused on business communities, there is a breadth of potential applications of KM theory and practice to wider society. This paper explores the potential of KM for rural communities, specifically for those that want to preserve their social history and collective memories (what we call heritage) to enrich the lives of others. In KM terms, this is a task of accumulating and recording knowledge (using KM techniques such as story-telling and communities of practice) to enable its retention for future use (by interested people, perhaps through KM systems). We report a case study of Cardrona, a valley of approximately 120 people in New Zealand's South Island. Realising that time would erode knowledge of their community, a small, motivated group of residents initiated a KM programme to create a legacy for a wider community including younger generations, tourists and scholars. This paper applies KM principles to rural communities that want to harness their collective knowledge for wider societal gain, and develops a community-based framework to inform such initiatives. As a result, we call for a wider conceptualisation of KM that includes motives for managing knowledge beyond business performance, to accommodate community KM (cKM). © 2010 Operational Research Society.

Abstract:

Intranet technologies accessible through a web-based platform are used to share and build knowledge bases in many industries. Previous research suggests that intranets provide a useful means to share, collaborate on and transact information within an organisation. To compete and survive, business organisations are required to manage effectively the various risks affecting their businesses. In the construction industry, too, this is becoming an increasingly important element of business planning. The ability of businesses, especially SMEs, which represent a significant portion of most economies, to manage various risks is often hindered by knowledge being fragmented across a large number of businesses. As a solution, this paper argues that intranet technologies can be used as an effective means of building and sharing knowledge, and of building up effective knowledge bases for risk management in SMEs, considering specifically the risks of extreme weather events. The paper discusses and evaluates the relevant literature and identifies the potential for further research to explore this concept.

Abstract:

Purpose: This paper describes a "work in progress" research project being carried out with a public health care provider in the UK, a large NHS hospital Trust. Enhanced engagement with patients is one of the Trust's core principles, but it is recognised that much more needs to be done to achieve this, and that ICT systems may be able to provide some support. The project is intended to find ways to better capture and evaluate the "voice of the patient" in order to lead to improvements in health care quality, safety and effectiveness.
Design/methodology/approach: We propose to investigate the use of a patient-orientated knowledge management system (KMS) in managing knowledge about and from patients. The study is a mixed-methods (quantitative and qualitative) investigation based on traditional action research, intended to answer the following three research questions: (1) How can a KMS be used as a mechanism to capture and evaluate patient experiences to provoke change in patient services? (2) How can the KMS provide a mechanism for systematising patient engagement? (3) How can patient feedback be used to stimulate improvements in care, quality and safety?
Originality/value: This methodology aims to involve patients at all phases of the study from its initial design onwards, leading to an understanding of the issues associated with using a KMS to manage knowledge about and for patients that is driven by the patients themselves.
Practical implications: The outcomes of the project for the collaborating hospital will be, firstly, a system for capturing and evaluating knowledge about and from patients, and consequently improved outcomes for both the patients and the service provider. More generally, it will produce a set of guidelines, tested in one case example, for managing patient knowledge in an NHS hospital.

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT-phantoms. Transparent materials are generally inert to infra-red radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different parameters were tested using three fs laser systems, with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. OCT-phantoms were then successfully designed and fabricated in fused silica, and their use to characterise many properties of an OCT system (resolution, distortion, sensitivity decay, scan linearity) was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms, and also to improve the quality of the OCT images. The characterisation methods include measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive: standard central processing unit (CPU) based processing might take several minutes to a few hours for acquired data, making data processing a significant bottleneck. An alternative is expensive hardware-based processing such as field-programmable gate arrays (FPGAs); recently, however, graphics processing unit (GPU) based data processing methods have been developed to minimise this processing and rendering time. These techniques include standard-processing methods, a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system, which currently processes and renders data in real time; its throughput is currently limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and adjustment/fine-tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In the process of developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several pieces of research that are not only relevant to OCT but have broader importance: extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications, such as the making of phase masks, waveguides and microfluidic channels, and the acceleration of data processing with GPUs is also useful in other fields.
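The thesis's GPU implementation is not reproduced in this abstract, but the standard-processing step it refers to, turning a raw spectral interferogram into an A-scan, conventionally amounts to background subtraction, spectral windowing and an inverse FFT. A minimal NumPy sketch of that pipeline follows, with NumPy standing in for the GPU code and entirely synthetic data; it assumes the spectrum is already resampled to be linear in wavenumber:

```python
import numpy as np

def ascan_from_spectrum(spectrum):
    """Convert one spectral interferogram into an A-scan:
    subtract the DC background, apply a window to reduce side lobes,
    then take the magnitude of the inverse FFT (the depth profile)."""
    s = spectrum - spectrum.mean()          # remove the DC term
    s = s * np.hanning(len(s))              # spectral windowing
    depth = np.abs(np.fft.ifft(s))          # depth-resolved reflectivity
    return depth[: len(depth) // 2]         # keep one half (conjugate symmetry)

# Synthetic example: a single reflector produces a cosine fringe across the
# wavenumber samples, which maps to a single peak in depth.
k = np.arange(2048)
fringe = 1.0 + 0.5 * np.cos(2 * np.pi * 100 * k / 2048)
profile = ascan_from_spectrum(fringe)       # peak at depth bin 100
```

On a GPU the same chain is expressed as batched element-wise kernels plus a batched FFT over thousands of spectra at once, which is where the real-time throughput described in the thesis comes from.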

Abstract:

Background and Objective: Clozapine has been available since the early 1990s, and studies continue to demonstrate its superior efficacy in treatment-resistant schizophrenia. Despite this, numerous studies show under-utilisation, delayed access and reluctance by psychiatrists to prescribe clozapine. This retrospective cross-sectional study compared the prescribing of clozapine in two adult cohorts under the care of large public mental health services in Auckland (New Zealand) and Birmingham (United Kingdom) on 31 March 2007.
Method: Time from first presentation to clozapine initiation, prior antipsychotics trialled and antipsychotic co-prescribing were compared. Data included demographics, psychiatric diagnosis, co-morbid conditions, year of first presentation, admissions and pharmacological treatment (clozapine dose, start date, prior antipsychotics, co-prescribed antipsychotic).
Results: Overall, 664 people were prescribed clozapine (402 in Auckland; 262 in Birmingham), with mean daily doses of 384 mg (Auckland) and 429 mg (Birmingham). 53% presented after 1990, and the average time before starting clozapine was significantly longer in the Birmingham cohort (6.5 vs. 5.3 years), but this reduced in both cohorts to a mean of 1 year in those presenting within the last 3 years. The average number of antipsychotics trialled pre-clozapine for those presenting since 1990 was significantly higher in the Birmingham cohort (4.3 vs. 3.1), but in both cohorts this similarly reduced in those presenting within the last 3 years. Antipsychotic co-prescribing was significantly higher in the Birmingham cohort (22.9% vs. 10.7%).
Conclusions: There is evidence that access to clozapine has improved over time in both cohorts, with a reduction in the time between presentation and initiation of clozapine and in the number of different antipsychotics trialled pre-clozapine. These are very positive findings in terms of optimising outcomes with clozapine, and are possibly due to the impact of guideline recommendations, increasing clinician, consumer and carer knowledge of and experience with clozapine, and funding changes. © 2014 Springer International Publishing.