820 results for knowledge framework


Relevance: 30.00%

Abstract:

Despite considerable and growing interest in academic researchers and practising managers jointly generating knowledge (which we term ‘co-production’), our searches of the management literature revealed few articles based on primary data or multiple cases. Given the increasing commitment to co-production by academics, managers and those funding research, it seems important to strengthen the evidence base about practice and performance in co-production. This paper presents empirical data from four completed, large-scale co-production projects. Literature on collaborative research was reviewed to develop a framework that structures the analysis of these data and relates the findings to the limited body of prior research on collaborative research practice and performance. Despite major differences between the cases, we find that the key success factors and the indicators of performance are remarkably similar. We demonstrate many complex influences between factors, between outcomes, and between factors and outcomes, and discuss the features that are distinctive to co-production. Our empirical findings are broadly consonant with prior literature, but go further in trying to understand the consequences of success factors for performance. A second contribution of this paper is the development of a conceptually and methodologically rigorous process for investigating collaborative research, linking process and performance. The paper closes with a discussion of the study’s limitations and opportunities for further research.

Relevance: 30.00%

Abstract:

The International Accounting Education Standards Board (IAESB) places a strong emphasis on individual professionals taking responsibility for their Continuing Professional Development (CPD). At the same time, the roles performed by professional accountants have evolved out of practical necessity to 'best' suit the diverse needs of business in a global economy. This diversity has meant that professional accountants occupy highly specialised roles requiring diverse skill sets. To enhance the contribution of the accountant as a knowledge professional for business, it follows that CPD which leverages an individual's experience should be designed to meet the needs of professionals across the different specialised roles within the profession. In doing so, the project identifies how CPD should differ across roles and levels of organisational responsibility for accounting professionals. The study also makes a number of policy recommendations to the IAESB and IFAC. © 2013 Taylor & Francis.

Relevance: 30.00%

Abstract:

Today, with globalisation driving ever-larger data sets, knowledge discovery has become essential. Discovered knowledge typically takes the form of association rules, classification rules, clusterings, frequent episodes and deviation detection. Building fast and accurate classifiers for large databases is an important task in data mining. There is growing evidence that integrating classification and association rule mining can outperform classification approaches based on heuristic, greedy search, such as decision tree induction. Emerging associative classification algorithms have shown good promise in producing accurate classifiers. In this paper we focus on the performance of associative classification and present a parallel model for classifier building. Several parallel and distributed algorithms have been proposed for decision tree induction, but so far no such work has been reported for associative classification.
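The core idea of associative classification can be sketched with a small example. This is a minimal, illustrative sketch in the spirit of CBA-style classifiers, not the paper's parallel model: mine class association rules that meet support and confidence thresholds, rank them, and classify a record by the best matching rule. The toy weather data, thresholds and rule ranking below are assumptions for illustration only.

```python
# Sketch of associative classification: mine class association rules
# above support/confidence thresholds, then classify by the top rule.
from itertools import combinations

def mine_rules(records, min_support=0.3, min_confidence=0.7):
    """records: list of (feature_set, class_label) pairs."""
    n = len(records)
    items = set().union(*(feats for feats, _ in records))
    rules = []
    for size in (1, 2):  # antecedents of one or two items
        for antecedent in combinations(sorted(items), size):
            ant = set(antecedent)
            covered = [label for feats, label in records if ant <= feats]
            support = len(covered) / n
            if support < min_support:
                continue
            for label in set(covered):
                confidence = covered.count(label) / len(covered)
                if confidence >= min_confidence:
                    rules.append((ant, label, support, confidence))
    # Rank rules by confidence, then support (a common CBA-style ordering).
    rules.sort(key=lambda r: (-r[3], -r[2]))
    return rules

def classify(rules, feats, default="unknown"):
    for ant, label, _support, _confidence in rules:
        if ant <= feats:  # first (highest-ranked) matching rule wins
            return label
    return default

records = [
    ({"humid", "warm"}, "rain"), ({"humid", "cold"}, "rain"),
    ({"dry", "warm"}, "sun"), ({"dry", "cold"}, "sun"),
    ({"humid", "warm"}, "rain"),
]
rules = mine_rules(records)
print(classify(rules, {"humid", "warm"}))  # → rain
```

A parallel model would partition the rule-mining loop across workers; the sequential version above only shows the classifier-building step itself.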

Relevance: 30.00%

Abstract:

The programme of research examines knowledge workers, their relationships with organisations, and their perceptions of management practices, through the development of a theoretical model and knowledge worker archetypes. Knowledge worker and non-knowledge worker archetypes were first established through an analysis of the extant literature. After an exploratory study of knowledge workers in a small software development company (Study 1), the archetypes were refined to incorporate occupational classification data and the study's findings. The Knowledge Worker Characteristics Model (KWCM) was developed as a theoretical framework for analysing differences between the two archetypes within the IT sector. The KWCM comprises the variables of the job characteristics model, creativity, goal orientation, identification and commitment. In Study 2, a global web-based survey was conducted. There were insufficient non-knowledge worker responses, so a cluster analysis was conducted to interrogate the archetypes further. This demonstrated, unexpectedly, that there were marked differences within the knowledge worker archetype, suggesting the need to granulate it further. The theoretical framework and the archetypes were accordingly revised (as programmers and web developers) and the research was refocused to examine occupational differences within knowledge work. Findings from Study 2 identified significant differences between the archetypes in relation to the KWCM. Nineteen semi-structured interviews were then conducted in Study 3 to deepen the analysis with qualitative data and to examine perceptions of people management practices. The findings from both studies demonstrate that, although there were significant differences between the two groups, job challenge, problem solving, intrinsic reward and team identification were important to both groups of knowledge workers.
This thesis thus presents an examination of knowledge workers’ perceptions of work, organisations and people management practices in the granulation and differentiation of occupational archetypes.

Relevance: 30.00%

Abstract:

The paper presents experience in the teaching of knowledge and ontological engineering. The teaching framework targets the development of cognitive skills that facilitate knowledge elicitation, structuring and ontology development for scaffolding students’ research. The structuring procedure is the kernel of ontological engineering. A five-step ontology design process is described, with particular stress on “beautification” principles for ontology creation. The academic curriculum includes interactive, game-format training in lateral thinking, interpersonal cognitive intelligence and visual mind mapping techniques.

Relevance: 30.00%

Abstract:

The UK government aims to achieve an 80% reduction in CO2 emissions by 2050, which requires collective effort across all UK industry sectors. The housing sector in particular has large potential to contribute: housing alone accounts for 27% of total UK CO2 emissions, and 87% of the housing responsible for those emissions will still be standing in 2050. It is therefore essential to improve the energy efficiency of the existing housing stock, which was built to low energy efficiency standards. To this end, whole houses need to be refurbished in a sustainable way, considering the lifetime financial and environmental impacts of the refurbished house. However, the current refurbishment process struggles to generate financially and environmentally affordable refurbishment solutions, owing to the highly fragmented nature of refurbishment practice and a lack of knowledge and skills about whole-house refurbishment in the construction industry. To generate an affordable solution, diverse information on the costs and environmental impacts of refurbishment measures and materials must be collected and integrated, in the right sequence, throughout the refurbishment project life cycle and among key project stakeholders. Consequently, researchers are increasingly studying ways of utilising Building Information Modelling (BIM) to tackle current problems in the construction industry: BIM can support construction professionals in managing projects collaboratively by integrating diverse information, and in determining the best refurbishment solution among alternatives by calculating the life cycle costs and lifetime CO2 performance of each. Despite these capabilities, BIM adoption in the housing sector is low, at 25%, and the use of BIM for housing refurbishment projects has rarely been studied.
This research therefore aims to develop a BIM framework for formulating a financially and environmentally affordable whole-house refurbishment solution, based simultaneously on the Life Cycle Costing (LCC) and Life Cycle Assessment (LCA) methods. To achieve this aim, a BIM feasibility study was first conducted as a pilot to examine whether BIM is suitable for housing refurbishment, and a BIM framework was then developed using grounded theory, as there was no precedent research. The framework was examined through a hypothetical case study using BIM input data collected from a questionnaire survey of homeowners’ preferences for housing refurbishment. Finally, the framework was validated with academics and professionals, who were given the BIM framework and a refurbishment solution formulated through it on the basis of the LCC and LCA studies. As a result, BIM was identified as suitable for housing refurbishment as a management tool, and the development of the BIM framework is timely. The BIM framework, comprising seven project stages, was developed to formulate an affordable refurbishment solution. Through the case study, the Building Regulations standard was identified as the most affordable energy efficiency standard, giving the best LCC and LCA results when applied to a whole-house refurbishment solution. The Fabric Energy Efficiency Standard (FEES) is recommended when customers are willing to adopt a higher energy standard; up to 60% of CO2 emissions can be reduced through whole-house fabric refurbishment to the FEES. Furthermore, limitations and challenges to fully utilising the BIM framework for housing refurbishment were revealed, such as a lack of BIM objects with proper cost and environmental information, limited interoperability between different BIM software, and limited LCC and LCA datasets in BIM systems.
Finally, the BIM framework was validated as suitable for housing refurbishment projects, and reviewers commented that it could be more practical if a specific BIM library for housing refurbishment, with proper LCC and LCA datasets, were developed. This research is expected to provide a systematic way of formulating a refurbishment solution using BIM, and to become a basis for further research on BIM for the housing sector that resolves the current limitations and challenges. Future research should enhance the BIM framework by developing a more detailed process map and BIM objects with proper LCC and LCA information.
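The LCC comparison at the heart of such a framework can be illustrated with a toy net-present-cost calculation. This is a hedged sketch, not the thesis's BIM workflow: the capital costs, annual energy costs, 30-year study period and 3.5% discount rate below are invented for illustration.

```python
# Illustrative life cycle cost: capital outlay plus discounted annual
# energy costs over the study period (all figures are assumptions).
def life_cycle_cost(capital, annual_energy_cost, years, discount_rate=0.035):
    """Net present cost: capital + sum of discounted annual energy costs."""
    npv_energy = sum(annual_energy_cost / (1 + discount_rate) ** t
                     for t in range(1, years + 1))
    return capital + npv_energy

# Option A: light retrofit - cheap up front, higher running costs.
# Option B: whole-house fabric retrofit - dearer up front, lower running costs.
lcc_a = life_cycle_cost(capital=8_000, annual_energy_cost=1_400, years=30)
lcc_b = life_cycle_cost(capital=25_000, annual_energy_cost=500, years=30)
print(f"Option A LCC: {lcc_a:,.0f}")
print(f"Option B LCC: {lcc_b:,.0f}")
print("Cheaper over 30 years:", "A" if lcc_a < lcc_b else "B")
```

With these invented figures the light retrofit narrowly wins on cost alone, which is exactly why the thesis pairs LCC with LCA: lifetime CO2 performance can reverse such a ranking.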

Relevance: 30.00%

Abstract:

The paper gives an overview of the ongoing FP6-IST INFRAWEBS project and describes the main layers and software components embedded in an application-oriented realisation framework. An important part of INFRAWEBS is the Semantic Web Unit (SWU), a collaboration platform and interoperable middleware for the ontology-based handling and maintenance of Semantic Web Services (SWS). The framework provides knowledge about a specific domain and relies on ontologies to structure and exchange this knowledge with the semantic service development modules. The INFRAWEBS Designer and Composer are sub-modules of the SWU responsible for creating Semantic Web Services using a Case-Based Reasoning approach. The Service Access Middleware (SAM) is responsible for building up the communication channels between users and the various other modules, and serves as generic middleware for the deployment of Semantic Web Services. This software toolset provides a development framework for creating and maintaining Semantic Web Services over their full life cycle, with specific application support.

Relevance: 30.00%

Abstract:

Motivation: In molecular biology, molecular events describe observable alterations of biomolecules, such as the binding of proteins or the production of RNA. These events may be responsible for drug reactions or the development of certain diseases. Biomedical event extraction, the process of automatically detecting descriptions of molecular interactions in research articles, has therefore attracted substantial research interest recently. Event trigger identification, detecting the words that describe the event types, is a crucial prerequisite step in the biomedical event extraction pipeline. Taking the event types as classes, event trigger identification can be viewed as a classification task: for each word in a sentence, a trained classifier predicts whether the word corresponds to an event type and, if so, which one, based on context features. A well-designed feature set with a good level of discrimination and generalization is therefore crucial to the performance of event trigger identification. Results: In this article, we propose a novel framework for event trigger identification. In particular, we learn biomedical domain knowledge from a large text corpus built from Medline and embed it into word features using neural language modelling. The embedded features are then combined with syntactic and semantic context features using the multiple kernel learning method, and the combined feature set is used to train the event trigger classifier. Experimental results on the gold standard corpus show that the proposed framework achieves a >2.5% improvement in F-score over the state-of-the-art approach, demonstrating its effectiveness. © 2014 The Author. The source code for the proposed framework is freely available and can be downloaded at http://cse.seu.edu.cn/people/zhoudeyu/ETI_Sourcecode.zip.
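Trigger identification as per-token classification can be sketched in miniature. This toy stands in for the paper's approach only loosely: the hand-made 3-d "embeddings", the single context feature, the tiny training set and the nearest-centroid classifier (substituted here for multiple kernel learning) are all illustrative assumptions.

```python
# Toy per-token event trigger classifier: combine a word "embedding"
# with a context feature and assign the nearest class centroid.
import math

EMBED = {  # hypothetical 3-d embeddings standing in for learned ones
    "binds": [0.9, 0.1, 0.0], "binding": [0.85, 0.15, 0.0],
    "expression": [0.1, 0.9, 0.0], "expressed": [0.12, 0.88, 0.0],
    "the": [0.0, 0.0, 0.1],
}

def features(tokens, i):
    """Embedding of token i plus a context flag for a nearby protein mention."""
    emb = EMBED.get(tokens[i], [0.0, 0.0, 0.0])
    near_protein = 1.0 if "protein" in tokens[max(0, i - 1):i + 2] else 0.0
    return emb + [near_protein]

TRAIN = [  # (sentence tokens, token index, event type or "NONE")
    (["protein", "binds"], 1, "Binding"),
    (["binding", "protein"], 0, "Binding"),
    (["protein", "expression"], 1, "Expression"),
    (["the", "protein"], 0, "NONE"),
]

def train_centroids(examples):
    sums, counts = {}, {}
    for tokens, i, label in examples:
        f = features(tokens, i)
        acc = sums.setdefault(label, [0.0] * len(f))
        for j, v in enumerate(f):
            acc[j] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def classify(centroids, tokens, i):
    f = features(tokens, i)
    return min(centroids, key=lambda lbl: math.dist(f, centroids[lbl]))

centroids = train_centroids(TRAIN)
print(classify(centroids, ["expressed", "protein"], 0))  # → Expression
```

The unseen word "expressed" is still classified correctly because its embedding sits near "expression"; that generalisation from distributional features is the point the abstract makes.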

Relevance: 30.00%

Abstract:

The paper presents a short review of systems that perform program transformations on the basis of internal intermediate representations of the programs. Many such systems aim to support several source languages and must therefore translate source texts into the internal representation. This task remains a challenge, as it is effort-intensive. To reduce the effort, translator-construction systems and ready-made compilers with grammars from outside designers are used. Though this approach saves effort, it has drawbacks and constraints. The paper presents the general idea of using a mapping approach to solve the task within the framework of program transformations and to overcome the disadvantages of existing systems. It demonstrates a fragment of the ontology model for mapping high-level languages onto the single representation, and gives an example of how the description of (a fragment of) a particular mapping is represented in accordance with the ontology model.
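The mapping idea, language constructs translated onto a single internal representation, can be sketched concretely. The sketch below is an assumption-laden miniature, not the paper's ontology model: it takes Python as the source language (parsed with the standard `ast` module) and maps expression nodes onto a toy nested-tuple IR via a small mapping table.

```python
# Map Python expression AST nodes onto a toy intermediate representation.
import ast

# Mapping table: source-language operator node type -> IR operator name.
NODE_MAP = {ast.Add: "add", ast.Mult: "mul", ast.Sub: "sub"}

def to_ir(node):
    """Recursively map a Python expression AST onto nested IR tuples."""
    if isinstance(node, ast.Expression):
        return to_ir(node.body)
    if isinstance(node, ast.BinOp):
        return (NODE_MAP[type(node.op)], to_ir(node.left), to_ir(node.right))
    if isinstance(node, ast.Constant):
        return ("const", node.value)
    if isinstance(node, ast.Name):
        return ("var", node.id)
    raise NotImplementedError(type(node).__name__)

tree = ast.parse("a + 2 * b", mode="eval")
print(to_ir(tree))  # → ('add', ('var', 'a'), ('mul', ('const', 2), ('var', 'b')))
```

Supporting a second source language would mean adding another front end that targets the same IR tuples, which is the economy the mapping approach promises.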

Relevance: 30.00%

Abstract:

The inverse controller is traditionally assumed to be a deterministic function. This paper presents a pedagogical methodology for estimating the stochastic model of the inverse controller, based on Bayes' theorem. Using Bayes' rule to obtain the stochastic model of the inverse controller allows knowledge of uncertainty from both the inverse and the forward model to be used in estimating the optimal control signal. The paper presents the methodology for general nonlinear systems and demonstrates it on nonlinear single-input single-output (SISO) and multiple-input multiple-output (MIMO) examples. © 2006 IEEE.
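The fusion of inverse-model and forward-model uncertainty via Bayes' rule can be shown in the simplest possible setting. This is a scalar linear-Gaussian special case, not the paper's general nonlinear method: assume a plant y = k·u with Gaussian output noise, a Gaussian prior over the control u supplied by the inverse model, and all numerical values invented for illustration.

```python
# Linear-Gaussian sketch: p(u | y*) ∝ p(y* | u) p(u) has a closed form,
# a precision-weighted combination of inverse- and forward-model knowledge.
def posterior_control(y_target, k, sigma_y, u_prior, sigma_u):
    """Posterior mean/std of u given prior p(u)=N(u_prior, sigma_u^2)
    and likelihood p(y*|u)=N(k*u, sigma_y^2)."""
    prec_prior = 1.0 / sigma_u ** 2          # inverse-model precision
    prec_like = k ** 2 / sigma_y ** 2        # forward-model precision
    var_post = 1.0 / (prec_prior + prec_like)
    mean_post = var_post * (prec_prior * u_prior + (k / sigma_y ** 2) * y_target)
    return mean_post, var_post ** 0.5

# Inverse model suggests u ≈ 2.0 (std 0.5); forward model is y = 3u with
# output noise std 0.3; the target output is y* = 6.3.
mean, std = posterior_control(y_target=6.3, k=3.0, sigma_y=0.3, u_prior=2.0, sigma_u=0.5)
print(f"optimal control estimate: {mean:.3f} ± {std:.3f}")
```

The posterior mean lands between the naive inverse u = y*/k = 2.1 and the prior 2.0, weighted by how much each model is trusted, which is the benefit the abstract claims for the stochastic inverse controller.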

Relevance: 30.00%

Abstract:

The sharing of near real-time traceability knowledge in supply chains plays a central role in coordinating business operations and is a key driver of their success. However, before traceability datasets received from external partners can be integrated with datasets generated internally within an organisation, they need to be validated against the information recorded for the physical goods received, as well as against bespoke rules defined to ensure uniformity, consistency and completeness within the supply chain. In this paper, we present a knowledge-driven framework for the runtime validation of critical constraints on incoming traceability datasets encapsulated as EPCIS event-based linked pedigrees. Our constraints are defined using SPARQL queries and SPIN rules. We present a novel validation architecture based on the integration of the Apache Storm framework for real-time, distributed computation with popular Semantic Web/Linked Data libraries, and exemplify our methodology on an abstraction of the pharmaceutical supply chain.
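The validation step can be sketched without the RDF machinery. The pure-Python abstraction below is an assumption-laden stand-in for the paper's SPARQL/SPIN constraints over linked pedigrees: each incoming event is a plain dict, the field names loosely echo EPCIS vocabulary, and the three rules are invented examples of bespoke completeness and consistency checks.

```python
# Rule-based validation of incoming pedigree-like events: each rule is a
# named predicate; an event is accepted only if every rule passes.
from datetime import datetime

def _parses(ts):
    """True if ts is an ISO-8601 timestamp the stdlib can parse."""
    try:
        datetime.fromisoformat(ts)
        return True
    except ValueError:
        return False

RULES = [
    ("has_epc", lambda e: bool(e.get("epcList"))),
    ("known_bizstep", lambda e: e.get("bizStep") in {"shipping", "receiving", "commissioning"}),
    ("timestamp_parses", lambda e: _parses(e.get("eventTime", ""))),
]

def validate(event):
    """Return the names of failed constraints (empty list = event accepted)."""
    return [name for name, check in RULES if not check(event)]

event = {"epcList": ["urn:epc:id:sgtin:0614141.107346.2017"],
         "bizStep": "shipping", "eventTime": "2015-03-01T10:15:00"}
print(validate(event))  # → []
bad = {"epcList": [], "bizStep": "teleporting", "eventTime": "not-a-date"}
print(validate(bad))
```

In the paper's architecture the equivalent checks run as SPARQL/SPIN rules inside Storm bolts, so events stream through validation in a distributed fashion rather than one function call at a time.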

Relevance: 30.00%

Abstract:

Reliability modelling and verification is indispensable in modern manufacturing, especially for reducing product development risk. Following a discussion of the deficiencies of traditional reliability modelling methods for process reliability, a novel modelling method is presented that draws upon a knowledge network of process scenarios based on the analytic network process (ANP). An integration framework for manufacturing process reliability and product quality is presented, together with a product development and reliability verification process. According to their roles in manufacturing processes, key characteristics (KCs) are organised into four clusters, namely product KCs, material KCs, operation KCs and equipment KCs, which constitute the process knowledge network of manufacturing processes. A mathematical model and algorithm are developed for calculating the reliability requirements of KCs with respect to different manufacturing process scenarios. A case study on valve-sleeve component manufacturing is provided as an application example of the new reliability modelling and verification procedure; the methodology is applied to the valve-sleeve manufacturing processes to manage and deploy production resources.
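The ANP calculation underlying such priority derivations can be sketched briefly. This is a generic ANP limit-supermatrix computation, not the paper's model: the 3×3 column-stochastic matrix of mutual influences among three KC clusters is entirely made up, and real applications use larger matrices built from pairwise comparisons.

```python
# ANP sketch: raise a column-stochastic supermatrix of influences to a
# high power; each column of the limit matrix gives the global priorities.
def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def limit_priorities(w, iterations=50):
    """Repeatedly multiply by w; for a primitive stochastic matrix the
    power converges to a rank-one limit whose columns all agree."""
    result = w
    for _ in range(iterations):
        result = mat_mul(result, w)
    return [row[0] for row in result]  # any column of the limit matrix

# Invented influences among three KC clusters (columns sum to 1).
W = [[0.0, 0.5, 0.3],
     [0.6, 0.0, 0.7],
     [0.4, 0.5, 0.0]]
priorities = limit_priorities(W)
print([round(p, 3) for p in priorities])  # → [0.291, 0.395, 0.314]
```

The resulting vector ranks the clusters by overall influence; in the paper's setting those weights feed the reliability requirements assigned to each KC.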

Relevance: 30.00%

Abstract:

Within the framework of heritage preservation, 3D scanning and modelling for heritage documentation has increased significantly in recent years, mainly due to the evolution of laser and image-based techniques, modelling software, powerful computers and virtual reality. 3D laser acquisition constitutes a real development opportunity for 3D modelling previously based on theoretical data. Representing information about an object relies on knowledge of its historical and theoretical frame to reconstitute its previous states a posteriori. This project proposes an approach that combines data extraction based on architectural knowledge with laser survey measurements, the whole leading to 3D reconstruction. The Khmer objects studied are exhibited at the Guimet museum in Paris. The purpose of this digital modelling is to meet the need for exploitable models for simulation projects, prototyping, exhibitions and the promotion of cultural tourism, and particularly for archiving against any likely disaster and as a tool to aid the formulation of the virtual museum concept.

Relevance: 30.00%

Abstract:

Decision-making in product quality is an indispensable stage in product development, serving to reduce product development risk. Based on an identification of the deficiencies of quality function deployment (QFD) and failure modes and effects analysis (FMEA), a novel decision-making method is presented that draws upon a knowledge network of failure scenarios. An ontological expression of failure scenarios is presented, together with a framework for a failure knowledge network (FKN). According to their roles in failure processing, quality characteristics (QCs) are set into three categories, namely perceptible QCs, restrictive QCs and controllable QCs, which represent the monitoring targets, control targets and improvement targets, respectively, for quality management. A mathematical model and algorithms based on the analytic network process (ANP) are introduced for calculating the priority of QCs with respect to different development scenarios. A case study is provided following the proposed decision-making procedure based on the FKN; the methodology is applied in the propeller design process to solve the problem of prioritising QCs. This paper provides a practical approach to decision-making in product quality. Copyright © 2011 Inderscience Enterprises Ltd.

Relevance: 30.00%

Abstract:

Background - This review provides a worked example of ‘best fit’ framework synthesis using the Theoretical Domains Framework (TDF) of health psychology theories as an a priori framework in the synthesis of qualitative evidence. Framework synthesis works best with ‘policy urgent’ questions. Objective - The review question selected was: what are patients’ experiences of prevention programmes for cardiovascular disease (CVD) and diabetes? The significance of these conditions is clear: CVD claims more deaths worldwide than any other disease, and diabetes is a risk factor for CVD and a leading cause of death. Method - A systematic review and framework synthesis were conducted. This novel method for synthesizing qualitative evidence aims to make health psychology theory accessible to implementation science and to advance the application of qualitative research findings in evidence-based healthcare. Results - Findings from 14 original studies were coded deductively into the TDF, and an inductive thematic analysis was subsequently conducted. The synthesized findings produced six themes relating to knowledge, beliefs, cues to (in)action, social influences, role and identity, and context. A conceptual model was generated illustrating combinations of factors that produce cues to (in)action, demonstrating interrelationships between individual (beliefs and knowledge) and societal (social influences, role and identity, context) factors. Conclusion - Several intervention points were highlighted where factors could be manipulated to produce favourable cues to action. However, the lack of transparency about the behavioural components of published interventions needs to be corrected, and further evaluations of acceptability in relation to patient experience are required. Further work is needed to test the comprehensiveness of the TDF as an a priori framework for ‘policy urgent’ questions using ‘best fit’ framework synthesis.