909 results for Knowledge Process
Abstract:
Young children often harbor misconceptions about psychotherapy and the role of psychologists. These misconceptions are fueled by rumors and misinformation supplied to the child by a variety of sources, and can compromise both the effectiveness of therapy and the therapeutic dyad. In this paper we explore how recent trends in patient engagement in child psychotherapy, cultural dynamics between patients and practitioners, and children's lack of knowledge about mental health services can negatively impact therapy. Wednesday Afternoons with Dr. J. (WADJ) is a whimsical fictional therapeutic narrative created to inform children about aspects of the therapeutic process while providing adults with a tangible structure for talking to children about mental healthcare. The advantages of using this narrative to prime children for therapy are discussed, as are methods for promoting the narrative to the wider community.
Abstract:
The Endangered Species Act of 1973 (ESA) is an exceptionally powerful law that requires the involvement of many stakeholders, including government and non-government professionals. This project reviewed the requirements of the ESA and the expectations of the USFWS and compared them with the actions taken by the petitioner in preparing the petition for the black-tailed prairie dog. The study identified the knowledge a petitioner needs to submit an effective petition, and the importance of communicating this knowledge so that federal agencies can make sound decisions when deciding whether to protect a species and its habitat. This research can serve as a preliminary reference for beginning the process for future petitions.
Abstract:
Geographic knowledge discovery (GKD) is the process of extracting information and knowledge from massive georeferenced databases. The process is usually accomplished by two different systems: Geographic Information Systems (GIS) and data mining engines. However, developing such systems is a complex task because it does not follow a systematic, integrated, and standard methodology. To overcome these pitfalls, in this paper we propose a modeling framework that addresses the development of the different parts of a multilayer GKD process. The main advantages of our framework are that: (i) it reduces the design effort, (ii) it improves the quality of the systems obtained, (iii) it is platform independent, (iv) it facilitates the use of data mining techniques on georeferenced data, and (v) it improves communication between different users.
Abstract:
Decision support systems (DSS) support business or organizational decision-making activities, which require access to information stored internally in databases or data warehouses and externally on the Web, accessed through Information Retrieval (IR) or Question Answering (QA) systems. Graphical interfaces for querying these information sources make it easy to constrain query formulation dynamically based on user selections, but they lack flexibility, since their expressive power is limited by the user interface design. Natural language interfaces (NLI) are often regarded as the ideal solution; however, especially for non-expert users, truly natural communication is difficult to realize effectively. In this paper, we propose an NLI that improves the interaction between the user and the DSS by allowing references to previous questions or their answers (i.e. anaphora, such as the pronoun reference in “What traits are affected by them?”), or by eliding parts of the question (i.e. ellipsis, such as “And to glume colour?” after the question “Tell me the QTLs related to awn colour in wheat”). Moreover, to overcome one of the main problems of NLIs, the difficulty of adapting them to a new domain, our proposal is based on ontologies obtained semi-automatically from a framework that allows the integration of internal and external, structured and unstructured information. Therefore, our proposal can interface with databases, data warehouses, QA and IR systems. Because the natural language resolution process is highly ambiguous, our proposal is presented as an authoring tool that helps the user query efficiently in natural language. Finally, our proposal is tested on a DSS case scenario about biotechnology and agriculture, whose knowledge base is the CEREALAB database as internal structured data, and the Web (e.g. PubMed) as external unstructured information.
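As a toy illustration (not the authors' system), the two dialogue phenomena described above can be sketched as a context object that expands pronouns and elided fragments into self-contained questions. The string heuristics and the example answer entities (`QTL-1`, `QTL-7`) are assumptions for the sketch:

```python
# A toy sketch of dialogue-aware query resolution: anaphora ("them") and
# ellipsis ("And to X?") are expanded against the previous question/answer.
# The heuristics below are illustrative only.

class DialogueContext:
    def __init__(self):
        self.last_question = None
        self.last_entities = []  # entities returned in the previous answer

    def record(self, question, answer_entities):
        self.last_question = question
        self.last_entities = answer_entities

    def resolve(self, question):
        """Return a self-contained question, expanding pronouns or ellipsis."""
        q = question.strip()
        tokens = q.rstrip("?").split()
        # Anaphora: substitute the previous answer's entities for "them".
        if "them" in tokens and self.last_entities:
            return q.replace("them", ", ".join(self.last_entities))
        # Ellipsis: "And to X?" inherits the predicate of the previous question.
        if q.lower().startswith("and to ") and self.last_question:
            fragment = q[len("And to "):].rstrip("?")
            head = self.last_question.rsplit(" to ", 1)[0]
            return f"{head} to {fragment}?"
        return q

ctx = DialogueContext()
ctx.record("Tell me the QTLs related to awn colour in wheat",
           ["QTL-1", "QTL-7"])        # hypothetical answer entities
print(ctx.resolve("And to glume colour?"))
# -> Tell me the QTLs related to glume colour?
print(ctx.resolve("What traits are affected by them?"))
# -> What traits are affected by QTL-1, QTL-7?
```

A real system would resolve these references against ontology concepts rather than raw strings, but the same question/answer history drives both.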
Abstract:
Theoretical framework: The experience of the terminal phase and death by patients and caregivers is influenced by multiple factors. This study aimed to understand this experience from the primary caregiver’s perspective. Objectives: To describe the factors that, from the primary caregiver’s perspective, influenced the experience of terminal-phase disease and mourning for the death of a close person; and to understand, from that same perspective, the influence of nurses’ conduct on this experience. Methodology: Qualitative descriptive exploratory study. Results: The following factors were valued: assuming the caregiver’s role; allowing the end of life/terminal phase to take place at home/near the family; and the process of care. Regarding the nurses’ conduct, the knowledge, communication and relationship established by nurses were also valued. Conclusion: In addition to expanding the implementation of specific palliative care teams, it is essential that most health care professionals acquire and develop basic skills in this area.
Abstract:
pt. 1. Fundamental concepts -- pt. 2. Evolution and creation -- pt. 3. Physical facts -- pt. 4. Life and mind -- pt. 5. Moral and spiritual facts -- Appendix A. The new physics -- Appendix B. Matter and energy -- Appendix C. Conservation of mass -- Appendix D. Conservation of energy.
Abstract:
The international perspectives on these issues are especially valuable in an increasingly connected, but still institutionally and administratively diverse, world. The research addressed in several chapters in this volume includes issues around technical standards bodies like EpiDoc and the TEI, engaging with the ways these standards are implemented, documented, taught, used in the process of transcribing and annotating texts, and used to generate publications and as the basis for advanced textual or corpus research. Other chapters focus on various aspects of philological research and content creation, including collaborative or community-driven efforts, and the issues surrounding editorial oversight, curation, maintenance and sustainability of these resources. Research into ancient languages and linguistics, in particular Greek, and the language teaching that is a staple of our discipline, is also discussed in several chapters, in particular the ways in which advanced research methods can lead into language technologies and vice versa, and the ways in which teaching skills can be used for public engagement, and vice versa. A common thread through much of the volume is the importance of open access publication or open source development and distribution of texts, materials, tools and standards, both because of the public good provided by such models (circulating materials often already paid for out of the public purse), and the ability to reach non-standard audiences: those who cannot access rich university libraries or afford expensive print volumes. Linked Open Data is another technology that results in wide and free distribution of structured information both within and outside academic circles, and several chapters present academic work that includes ontologies and RDF, either as a direct research output or as an essential part of the communication and knowledge representation.
Several chapters focus not on the literary and philological side of classics, but on the study of cultural heritage, archaeology, and the material supports on which original textual and artistic material is engraved or otherwise inscribed, addressing the capture and analysis of artefacts in both 2D and 3D, the representation of data through archaeological standards, and the importance of sharing information and expertise between the several domains, both within and outside academia, that study, record and conserve ancient objects. Almost without exception, the authors reflect on the issues of interdisciplinarity and collaboration, the relationship between their research practice and teaching and/or communication with a wider public, and the importance of the role of the academic researcher in contemporary society and in the context of cutting-edge technologies. How research is communicated in a world of instant-access blogging and 140-character micromessaging, and how our expectations of the media affect not only how we publish but how we conduct our research, are questions all scholars need to be aware of and self-critical about.
Abstract:
Traditional methods of R&D management are no longer sufficient for embracing innovations and leveraging complex new technologies to fully integrated positions in established systems. This paper presents the view that the technology integration process is a result of fundamental interactions embedded in inter-organisational activities. Emerging industries, high technology companies and knowledge intensive organisations owe a large part of their viability to complex networks of inter-organisational interactions and relationships. R&D organisations are the gatekeepers in the technology integration process with their initial sanction and motivation to develop technologies providing the first point of entry. Networks rely on the activities of stakeholders to provide the foundations of collaborative R&D activities, business-to-business marketing and strategic alliances. Such complex inter-organisational interactions and relationships influence value creation and organisational goals as stakeholders seek to gain investment opportunities. A theoretical model is developed here that contributes to our understanding of technology integration (adoption) as a dynamic process, which is simultaneously structured and enacted through the activities of stakeholders and organisations in complex inter-organisational networks of sanction and integration.
Abstract:
Assessments of the conservation status of threatened species that are based purely on subjective judgement are problematic because they can be influenced by hidden assumptions, personal biases and perceptions of risk, making the assessment process difficult to repeat. This can result in inconsistent assessments and misclassifications, which in turn lead to a lack of confidence in species assessments. It is almost impossible to understand an expert's logic or visualise the underlying reasoning behind the many hidden assumptions used throughout the assessment process. In this paper, we formalise the decision-making process of experts by capturing their logical ordering of information, their assumptions and reasoning, and transferring them into a set of decision rules. We illustrate this through the process used to evaluate the conservation status of species under the NatureServe system (Master, 1991). NatureServe status assessments have been used for over two decades to set conservation priorities for threatened species throughout North America. We develop a conditional point-scoring method to reflect the current subjective process. In two test comparisons, 77% of species assessments using the explicit NatureServe method matched the qualitative assessments done subjectively by NatureServe staff. Of those that differed, no rank varied by more than one rank level between the two methods. In general, the explicit NatureServe method tended to be more precautionary than the subjective assessments. The rank differences that emerged from the comparisons may be due, at least in part, to the flexibility of the qualitative system, which allows different factors to be weighted on a species-by-species basis according to expert judgement. The method outlined in this study is the first documented attempt to explicitly define a transparent process for weighting and combining factors under the NatureServe system.
The process of eliciting expert knowledge identifies how information is combined and highlights any inconsistent logic that may not be obvious in subjective decisions. The method provides a repeatable, transparent, and explicit benchmark for feedback, further development, and improvement. (C) 2004 Elsevier SAS. All rights reserved.
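The idea of turning expert judgement into explicit, repeatable decision rules can be sketched as a conditional point-scoring function. All thresholds, factor weights, and the rank mapping below are hypothetical illustrations, not NatureServe's actual criteria:

```python
# A minimal sketch of conditional point-scoring for species ranking.
# Thresholds and the rank mapping are invented for illustration; the point
# is that the rules are written down and the same inputs always give the
# same rank, unlike a purely subjective assessment.

def score_species(population, range_km2, threat_level):
    """Combine factors into points; more points = more secure."""
    points = 0
    # Population size.
    if population >= 10_000:
        points += 2
    elif population >= 1_000:
        points += 1
    # Range size.
    if range_km2 >= 10_000:
        points += 2
    elif range_km2 >= 100:
        points += 1
    # Conditional rule: severe threats cap the score regardless of abundance,
    # mirroring how an expert might let one factor override the others.
    if threat_level == "severe":
        points = min(points, 1)
    return points

def rank(points):
    # G1 = critically imperilled ... G5 = secure (ordinal scale).
    return ["G1", "G2", "G3", "G4", "G5"][min(points, 4)]

print(rank(score_species(population=50_000, range_km2=20_000,
                         threat_level="low")))      # -> G5
print(rank(score_species(population=50_000, range_km2=20_000,
                         threat_level="severe")))   # -> G2
```

Because the cap and thresholds are explicit, any disagreement with an expert's subjective rank points directly at a rule that can be inspected and revised.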
Abstract:
Background and purpose: Patients' knowledge and beliefs about their illnesses are known to influence a range of health related variables, including treatment compliance. It may, therefore, be important to quantify these variables to assess their impact on compliance, particularly in chronic illnesses such as Obstructive Sleep Apnea (OSA) that rely on self-administered treatments. The aim of this study was to develop two new tools, the Apnea Knowledge Test (AKT) and the Apnea Beliefs Scale (ABS), to assess illness knowledge and beliefs in OSA patients. Patients and methods: The systematic test construction process followed to develop the AKT and the ABS included consultation with sleep experts and OSA patients. The psychometric properties of the AKT and ABS were then investigated in a clinical sample of 81 OSA patients and 33 healthy, non-sleep disordered adults. Results: Results suggest both measures are easily understood by OSA patients, have adequate internal consistency, and are readily accepted by patients. A preliminary investigation of the validity of these tools, conducted by comparing patient data to that of the 33 healthy adults, revealed that apnea patients knew more about OSA, had more positive attitudes towards continuous positive airway pressure (CPAP) treatment, and attributed more importance to treating sleep disturbances than non-clinical groups. Conclusions: Overall, the results of psychometric analyses of these tests suggest these measures will be useful clinical tools with numerous beneficial applications, particularly in CPAP compliance studies and apnea education program evaluations. (C) 2004 Elsevier B.V. All rights reserved.
Abstract:
We demonstrate a portable process for developing a triple bottom line model to measure the knowledge production performance of individual research centres. For the first time, this study also empirically illustrates how a fully units-invariant model of Data Envelopment Analysis (DEA) can be used to measure the relative efficiency of research centres by capturing the interaction amongst a common set of multiple inputs and outputs. This study is particularly timely given the increasing transparency required by governments and industries that fund research activities. The process highlights the links between organisational objectives, desired outcomes and outputs while the emerging performance model represents an executive managerial view. This study brings consistency to current measures that often rely on ratios and univariate analyses that are not otherwise conducive to relative performance analysis.
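The core DEA idea of scoring each unit against a best-practice frontier can be conveyed in the degenerate single-input/single-output case, which reduces to a ratio comparison. The paper's model is multi-input/multi-output and units-invariant, which requires solving a linear program per centre; the centre names and figures below are made up:

```python
# DEA measures each unit's efficiency relative to the best-practice frontier.
# This sketch covers only the degenerate single-input/single-output case;
# the full DEA model handles multiple inputs/outputs via linear programming.

def relative_efficiency(centres):
    """centres: {name: (input, output)} -> {name: efficiency in (0, 1]}."""
    ratios = {name: out / inp for name, (inp, out) in centres.items()}
    frontier = max(ratios.values())          # best output-per-input ratio
    return {name: r / frontier for name, r in ratios.items()}

centres = {
    "Centre A": (10.0, 40.0),   # e.g. funding ($M) in, publications out
    "Centre B": (5.0, 25.0),
    "Centre C": (8.0, 16.0),
}
print(relative_efficiency(centres))
# -> {'Centre A': 0.8, 'Centre B': 1.0, 'Centre C': 0.4}
```

Note how the score is relative: Centre A's 0.8 says nothing absolute, only that it converts input to output at 80% of the rate of the frontier centre.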
Abstract:
Objective: To report on strategies for, and outcomes of, evaluation of knowledge (publications), health and wealth (commercial) gains from medical research funded by the Australian Government through the National Health and Medical Research Council (NHMRC). Design and methods: End-of-grant reports submitted by researchers within 6 months of completion of NHMRC funded project grants which terminated in 2003 were used to capture self-reported publication number, health and wealth gains. Self-reported gains were also examined in retrospective surveys of grants completed in 1992 and 1997 and awards primarily supporting people (“people awards”) held between 1992 and 2002. Results: The response rate for the 1992 sample was too low for meaningful analysis. The mean number of publications per grant in the basic biomedical, clinical and health services research areas was very similar in 1997 and 2003. The publication output for population health was somewhat higher in the 2003 than in the 1997 analysis. For grants completed in 1997, 24% (31/131) affected clinical practice; 14% (18/131) public health practice; 9% (12/131) health policy; and 41% (54/131) had commercial potential with 20% (26/131) resulting in patents. Most respondents (89%) agreed that NHMRC people awards improved their career prospects. Interpretation is limited by the relatively low response rates (50% or less). Conclusions: A mechanism has been developed for ongoing assessment of NHMRC funded research. This process will improve accountability to the community and to government, and refine current funding mechanisms to most efficiently deliver health and economic returns for Australia.
Abstract:
This paper explores the theme of strategic planning in a State Tourism Organization (STO) from a knowledge management perspective. It highlights the value of knowledge in strategy making and the importance of an organisation's knowledge management agenda in facilitating a strategic planning process. In particular, it considers the capability of an STO to implement knowledge management as the key to a successful strategic planning exercise. In order to develop greater insight into the factors that impact on planning competence, the key aim of this paper is to develop a framework on which the capability of an STO to implement a knowledge-based agenda in strategic planning can be assessed. Research on knowledge management in the field of tourism is limited and there is little practical account of the application of knowledge management principles in tourism planning. Further, there is no apparent tool or instrument that allows for the assessment of an STO's capability to implement knowledge management in planning initiatives. Based on a literature review, a three-point framework of assessment is developed. The three elements of the framework are identified as: 1. Integration of knowledge management objectives with strategic imperatives; 2. A planning approach that balances top-down (outcome-focused) with bottom-up (process-focused) planning processes; and 3. Organisational capacity, including leadership, people and culture, process, technology, content and continuous improvement. The framework is tested through application to a practical case study: a planning initiative undertaken by a leading tourism STO in Australia. The results demonstrate that the framework is a useful means to evaluate organisational capability in knowledge-led strategic planning exercises and would be of practical value as a point of reference for future knowledge-based strategic planning projects. Copyright © by The Haworth Press, Inc. All rights reserved.
Abstract:
Knowledge transfer between units within MNCs, and between independent organisations involved in international business, is an important element of performance. Researchers have shown that cultural differences generally hinder both information and knowledge transfer across national borders, both within MNCs and across organisations. This study examines the knowledge transfer process between different cultural groups in a particular setting and concludes that, in this situation, cultural differences do not hinder knowledge transfer. Explanations are provided for this anomaly and implications for international business managers are drawn.
Abstract:
Automatic ontology building is a vital issue in the many fields where ontologies are currently built manually. This paper presents a user-centred methodology for ontology construction based on the use of Machine Learning and Natural Language Processing. In our approach, the user selects a corpus of texts and sketches a preliminary ontology (or selects an existing one) for a domain, with a preliminary vocabulary associated with the elements of the ontology (lexicalisations). Examples of sentences involving such lexicalisations (e.g. of the ISA relation) are automatically retrieved from the corpus by the system. Retrieved examples are validated by the user and used by an adaptive Information Extraction system to generate patterns that discover other lexicalisations of the same objects in the ontology, possibly identifying new concepts or relations. New instances are added to the existing ontology or used to tune it. This process is repeated until a satisfactory ontology is obtained. The methodology largely automates the ontology construction process, and the output is an ontology with an associated trained learner to be used for further ontology modifications.
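One iteration of this bootstrap loop (validated example in, induced pattern out, new instances discovered) can be sketched very simply. The tiny corpus and the naive regex generalisation are illustrative assumptions, not the paper's adaptive Information Extraction system:

```python
# A highly simplified sketch of one iteration of the bootstrap loop:
# a user-validated lexicalisation of the ISA relation is generalised into
# a pattern, which then discovers new ISA instances in the corpus.
import re

corpus = [
    "A sparrow is a bird.",
    "A trout is a fish.",
    "An eel is a fish.",
]

def induce_pattern(sentence, hyponym, hypernym):
    """Generalise a validated example sentence into a regex pattern."""
    template = re.escape(sentence)
    template = template.replace(re.escape(hyponym), r"(\w+)")
    template = template.replace(re.escape(hypernym), r"(\w+)")
    return re.compile(template)

# Seed: one lexicalisation validated by the user.
ontology = {("sparrow", "bird")}
pattern = induce_pattern("A sparrow is a bird.", "sparrow", "bird")

# Apply the induced pattern to find new ISA instances.
for sentence in corpus:
    match = pattern.fullmatch(sentence)
    if match:
        ontology.add((match.group(1), match.group(2)))

print(sorted(ontology))
# -> [('sparrow', 'bird'), ('trout', 'fish')]
# "An eel is a fish." is missed by this one pattern ("An" vs "A"), which is
# why the loop repeats: newly validated examples keep refining the patterns.
```

The miss on "An eel is a fish." illustrates why the methodology iterates with user validation until the pattern set covers the domain's lexicalisations satisfactorily.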