989 results for Theory of Knowledge


Relevance:

100.00%

Publisher:

Abstract:

It could be argued that advancing practice in critical care has been superseded by the advanced practice agenda. Some would suggest that advancing practice is focused on the core attributes of an individual's practice progressing to advanced practice status. However, advancing practice is more of a process than a set of identifiable skills, and as such it is often neglected when considering the development of practitioners to the advanced practice level. For example, practice development initiatives can be seen as advancing practice for the masses, ensuring that practitioners follow the same level of practice. The question here is: are they developing individually? The aim of this paper is to discuss the potential development of a conceptual model of knowledge integration pertinent to critical care nursing practice. In an attempt to explore the development of leading-edge critical care thinking and practice, a new model for advancing practice in critical care is proposed. This paper suggests that reflection may not be the best model for advancing practice unless the individual practitioner has a sound knowledge base, both theoretical and experiential. Drawing on the contemporary literature and recent doctoral research, the knowledge integration model presented here uses multiple learning strategies that are focused in practice to develop practice, for example work-based learning and clinical supervision. Ongoing knowledge acquisition and its relationship with previously held theory and experience will enable individual practitioners to advance their own practice as well as to serve as a resource for others.

Relevance:

100.00%

Publisher:

Abstract:

At present, the rate of small firm adoption of the Internet's ubiquitous World Wide Web (the web) far exceeds the actual exploitation of its commercial potential. An inability to strategically acquire, comprehend and use external knowledge is proposed as a major barrier to optimal exploitation of the Internet. This paper discusses the limitations of applying market orientation theory to explain and guide small firm exploitation of the web. Absorptive capacity is introduced as an alternative theory that, when viewed from an evolutionary perspective, provides a potentially more insightful discussion. An inability to detect emerging business model dominant designs is attributed to a mixture of the nature of the technology that supports the Internet and underdeveloped small firm knowledge-processing capabilities. We conclude with consideration of the practical and theoretical implications that arise from the paper.

Relevance:

100.00%

Publisher:

Abstract:

This poster describes a pilot case study whose aim is to examine how future chemistry teachers use knowledge dimensions and higher-order cognitive skills (HOCS) in their pre-laboratory concept maps to support chemistry laboratory work. The research data consisted of 168 pre-laboratory concept maps constructed by 29 students as part of their chemistry laboratory studies. The concept maps were analyzed using theory-based content analysis grounded in Anderson and Krathwohl's (2001) taxonomy of learning. This study indicates that novice concept-mapping students use all knowledge dimensions, together with the applying, analyzing and evaluating HOCS, to support pre-laboratory work.

Relevance:

100.00%

Publisher:

Abstract:

This qualitative, explorative study, which comprises four essays, focuses on knowledge management (KM). It seeks to answer the question: how can the knowledge creation theory of KM benefit from social learning theories? While studying the five development phases of the knowledge creation theory of KM over 1995-2008 and applying social learning theories in the essays, the concepts of knowing, learning and becoming emerged. Drawing on these three concepts, and on becoming ontology and extended epistemology as research philosophies, the study suggests the concept of 'becoming epistemology' and develops the 'becoming to know' framework. The framework proposes becoming as the phronesis of dialectic interactions between learning and knowing. It shows how becoming to know evolves as an interplay between concrete experience and logical thinking in the present and in a living context. The proposed framework can be considered a contribution to the current development phase of the knowledge creation theory of KM because it illustrates how the ontological and epistemological knowledge spirals come together, which is the essence of that theory.

Relevance:

100.00%

Publisher:

Abstract:

This report presents a new theory of internal marketing. The thesis was developed as a case study in retrospective action research, which began with the author's personal involvement in an action research project for customer service improvement at a large Australian retail bank. In other words, much of the theory-generating 'research' took place after the original project 'action' had wound down. The key theoretical proposition is that internal marketing is a relationship development strategy for the purpose of knowledge renewal. In the banking case, exchanges of value between employee participants emerged as the basis for relationship development, with synergistic benefits for customers, employees and the bank. Relationship development turned out to be the mediating variable between the learning activity of employee participants at the project level and success in knowledge renewal at the organisational level. It was also a pivotal factor in the motivation and customer consciousness of employees. The conclusion reached is that the strength of relationship-mediated internal marketing lies in combining a market-focused commitment with employee freedom in project work to achieve knowledge renewal. The forgotten truth is that organisational knowledge can be renewed through dialogue and learning, through being trustworthy, and by gaining the trust of employees in return.

Relevance:

100.00%

Publisher:

Abstract:

The density-wave theory of Ramakrishnan and Yussouff is extended to provide a scheme for describing dislocations and other topological defects in crystals. Quantitative calculations are presented for the order-parameter profiles, the atomic configuration, and the free energy of a screw dislocation with Burgers vector b = (a/2, a/2, a/2) in a bcc solid. These calculations are done using a simple parametrization of the direct correlation function and a gradient expansion. It is conventional to express the free energy of the dislocation in a crystal of size R as (λb²/4π) ln(αR/|b|), where λ is the shear elastic constant and α is a measure of the core energy. Our results for Na yield α ≃ 1.94a/|c₁″|^{1/2} (≃ 1.85) at the freezing temperature (371 K) and α ≃ 2.48a/|c₁″|^{1/2} at 271 K, where c₁″ is the curvature of the first peak of the direct correlation function c(q). Detailed results for the density distribution in the dislocation, particularly the core region, are also presented. These show that the dislocation core has a columnar character. To our knowledge, this study represents the first calculation of dislocation structure, including the core, within the framework of an order-parameter theory and incorporating thermal effects.
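For readability, the conventional free-energy expression quoted in this abstract can be written as a display equation (this is the standard linear-elasticity form the abstract itself cites; the symbols follow the abstract's own definitions):

```latex
% Free energy of a screw dislocation in a crystal of linear size R:
%   \lambda : shear elastic constant
%   b       : Burgers vector, of magnitude |b|
%   \alpha  : dimensionless measure of the core energy
\[
  F(R) \;=\; \frac{\lambda b^{2}}{4\pi}\,
  \ln\!\left(\frac{\alpha R}{\lvert \mathbf{b} \rvert}\right)
\]
```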

Relevance:

100.00%

Publisher:

Abstract:

The modern subject is what we can call a self-subjecting individual: someone in whose inner reality a more permanent governability has been implanted, a governability that works inside the agent. Michel Foucault's genealogy of the modern subject is the history of its constitution by power practices. By a flight of imagination, suppose that this history is not an evolving social structure or cultural phenomenon, but one of those insects (a moth) whose life cycle consists of three stages or moments: crawling larva, encapsulated pupa, and flying adult. Foucault's history of power practices presents the same kind of miracle of total metamorphosis. The main forces in the general field of power can be apprehended through a generalisation of three rationalities functioning side by side in the plurality of different practices of power: domination, normalisation and the law. Domination is a force functioning by the rationality of reason of state: the state's essence is power, power is firm domination over people, and people are the state's resource by which the state's strength is measured. Normalisation is a force that takes hold of people from the inside of society: it imposes society's own reality, its empirical verity, as a norm on people through silently working jurisdictional operations that exclude pathological individuals too far from the average of the population as a whole. The law is a counterforce to both domination and normalisation. Accounting for elements of legal practice as omnihistorical is not possible without a view of the general field of power. Without this view, and only in terms of the operations and tactical manoeuvres of the practice of law, nothing of the kind can be seen: the only thing that practice manifests is constant change itself. However, the backdrop of law's tacit dimension, that is, the power relations between law, domination and normalisation, allows one to see more. In the general field of power, the function of law is exactly to maintain the constant possibility of change. Whereas domination and normalisation would stabilise society, the law makes it move. The European individual has a reality as a problem. What is a problem? A problem is something that allows entry into the field of thought, said Foucault. To be a problem, 'it is necessary for a certain number of factors to have made it uncertain, to have made it lose its familiarity, or to have provoked a certain number of difficulties around it'. Entering the field of thought through problematisations of the European individual (human forms, power and knowledge), one is able to glimpse the historical backgrounds of our present being. These were produced, and then again buried, in intersections between practices of power and games of truth. In the problem of the European individual one has suitable circumstances that bring to light forces that have passed through the individual over centuries.

Relevance:

100.00%

Publisher:

Abstract:

This paper aims at establishing a statistical theory of rotational and vibrational excitation of polyatomic molecules by an intense IR laser. Starting from the Wigner function of quantum statistical mechanics, we treat the rotational motion in the classical approximation; the vibrational modes are classified into active modes, which are coupled directly with the laser, and background modes, which are not. The reduced Wigner function, i.e., the Wigner function integrated over all background coordinates, should satisfy an integro-differential equation. We introduce the idea of "viscous damping" to handle the interaction between the active modes and the background. The damping coefficient can be calculated with the aid of the well-known Schwartz–Slawsky–Herzfeld theory. The resulting equation is solved by the method of moment equations. There is only one adjustable parameter in our scheme, introduced owing to the lack of precise knowledge about the molecular potential. The theory developed in this paper satisfactorily explains the recent absorption experiments on SF₆ irradiated by a short-pulse CO₂ laser, which are in sharp contradiction with the prevailing quasi-continuum theory. We also refined the density of energy levels responsible for the multiphoton excitation of polyatomic molecules.

Relevance:

100.00%

Publisher:

Abstract:

In practical situations, the causes of image blurring are often unknown or difficult to determine. However, traditional methods usually assume that the blur is known prior to the restoration process, which makes them impracticable for blind image restoration. The new method proposed in this paper is aimed precisely at blind image restoration. The restoration process is transformed into a problem of point distribution analysis in high-dimensional space. Experiments show that restoration can be achieved with this method without prior knowledge of the image blur. In addition, the algorithm is guaranteed to converge and is computationally simple.

Relevance:

100.00%

Publisher:

Abstract:

This research is concerned with designing representations for analytical reasoning problems (of the sort found on the GRE and LSAT). These problems test the ability to draw logical conclusions. A computer program was developed that takes as input a straightforward predicate calculus translation of a problem, requests additional information if necessary, decides what to represent and how, designs representations capturing the constraints of the problem, and creates and executes a LISP program that uses those representations to produce a solution. Even though these problems are typically difficult for theorem provers to solve, the LISP program that uses the designed representations is very efficient.

Relevance:

100.00%

Publisher:

Abstract:

This report investigates the process of focussing as a description and explanation of the comprehension of certain anaphoric expressions in English discourse. The investigation centers on the interpretation of definite anaphora, that is, on personal pronouns and on noun phrases used with a definite determiner: the, this or that. Focussing is formalized as a process in which a speaker centers attention on a particular aspect of the discourse. An algorithmic description specifies what the speaker can focus on and how the speaker may change the focus of the discourse as the discourse unfolds. The algorithm allows for a simple focussing mechanism to be constructed: an element in focus, an ordered collection of alternate foci, and a stack of old foci. The data structure for the element in focus is a representation which encodes a limited set of associations between it and other elements from the discourse as well as from general knowledge.
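As a rough illustration of the mechanism just described (a minimal sketch only; the class and method names are ours, not the report's), the three-part focus structure might look like this:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class FocusState:
    """Sketch of the focussing mechanism described above: an element in
    focus, an ordered collection of alternate foci, and a stack of old foci."""
    focus: Optional[str] = None                     # element currently in focus
    alternates: list = field(default_factory=list)  # ordered alternate foci
    old_foci: list = field(default_factory=list)    # stack of previous foci

    def shift_to(self, element: str) -> None:
        """Move focus to a new element, pushing the old focus onto the stack."""
        if self.focus is not None:
            self.old_foci.append(self.focus)
        self.focus = element

    def return_to_previous(self) -> None:
        """Pop the stack to resume an earlier focus, e.g. when a digression ends."""
        if self.old_foci:
            self.focus = self.old_foci.pop()
```

On this sketch, resolving a definite anaphor would amount to testing it first against the element in focus, then against the alternate foci, and finally against the stack of old foci.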

Relevance:

100.00%

Publisher:

Abstract:

Background: Accommodating Interruptions is a theory that emerged in the context of young people who have asthma. A background to the prevalence and management of asthma in Ireland is given to situate the theory. Ireland has the fourth-highest incidence of asthma in the world, with almost one in five Irish young people having asthma. Although national and international asthma management guidelines exist, it is accepted that symptom control of asthma among young people is poor. Aim: The aim of this research is to investigate the lives of young people who have asthma, to allow for a deeper understanding of the issues affecting them. Methods: This research was undertaken using a Classic Grounded Theory approach, a systematic approach that allows concepts to emerge from the data in generating a theory that explains the behaviour through which participants resolve their main concern. The data were collected through in-depth interviews with young people aged 11-16 years who had had asthma for over one year. Data were also collected from participant diaries. Constant comparative analysis, theoretical coding and memo writing were used to develop the theory. Results: The theory explains how young people resolve their main concern of being restricted, by maximising their participation and inclusion in activities, events and relationships in spite of their asthma. They achieve this by accommodating interruptions in their lives, minimising the effects of asthma on their everyday lives. Conclusion: The theory of accommodating interruptions explains young people's asthma management behaviours in a new way. It allows us to understand how and why young people behave the way they do in order to minimise the effect of asthma on their lives. The theory adds to the body of knowledge on young people with asthma and challenges some viewpoints regarding their behaviours.

Relevance:

100.00%

Publisher:

Abstract:

An event memory is a mental construction of a scene recalled as a single occurrence. It therefore requires the hippocampus and ventral visual stream needed for all scene construction. The construction need not come with a sense of reliving or be made by a participant in the event, and it can be a summary of occurrences from more than one encoding. The mental construction, or physical rendering, of any scene must be done from a specific location and time; this introduces a "self" located in space and time, which is a necessary, but need not be a sufficient, condition for a sense of reliving. We base our theory on scene construction rather than reliving because this allows the integration of many literatures and because there is more accumulated knowledge about scene construction's phenomenology, behavior, and neural basis. Event memory differs from episodic memory in that it does not conflate the independent dimensions of whether or not a memory is relived, is about the self, is recalled voluntarily, or is based on a single encoding with whether it is recalled as a single occurrence of a scene. Thus, we argue that event memory provides a clearer contrast to semantic memory, which also can be about the self, be recalled voluntarily, and be from a unique encoding; allows for a more comprehensive dimensional account of the structure of explicit memory; and better accounts for laboratory and real-world behavioral and neural results, including those from neuropsychology and neuroimaging, than does episodic memory.

Relevance:

100.00%

Publisher:

Abstract:

Economic analysis treats technology as exogenously given, even though it is endogenously determined. This paper examines this conceptual conflict and outlines an alternative conceptual framework, which uses a 'General Vertical Division of Labour' into conceptual and executive parts to facilitate a coherent political-economic explanation of technological change. The paper suggests that we may acquire, rather than impose, an understanding of technological change. It also suggests that we may redefine and reassess the efficiency of technological change through the values inculcated into it.

Relevance:

100.00%

Publisher:

Abstract:

Belief revision characterizes the process of revising an agent's beliefs when receiving new evidence. In the field of artificial intelligence, revision strategies have been extensively studied in the context of logic-based formalisms and probability kinematics. However, so far there is not much literature on this topic in evidence theory. In contrast, the combination rules proposed so far in the theory of evidence, especially Dempster's rule, are symmetric. They rely on a basic assumption, namely that the pieces of evidence being combined are on a par, i.e. play the same role. When one source of evidence is less reliable than another, it is possible to discount it, and a symmetric combination operation is then still used. In the case of revision, the idea is to let the prior knowledge of an agent be altered by some input information. The change problem is thus intrinsically asymmetric. Assuming the input information is reliable, it should be retained, whilst the prior information should be changed minimally to that effect. To deal with this issue, this paper defines the notion of revision for the theory of evidence in such a way as to bring together probabilistic and logical views. Several previously proposed revision rules are reviewed, and we advocate one of them as better corresponding to the idea of revision. It is extended to cope with inconsistency between prior and input information. It reduces to Dempster's rule of combination, just as revision in the sense of Alchourrón, Gärdenfors, and Makinson (AGM) reduces to expansion, when the input is strongly consistent with the prior belief function. Properties of this revision rule are also investigated, and it is shown to generalize Jeffrey's rule of updating, Dempster's rule of conditioning and a form of AGM revision.
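For reference, the symmetric combination operation discussed above, Dempster's rule, has the following standard textbook form (this is the general statement of the rule, not this paper's own notation):

```latex
% Dempster's rule of combination for two mass functions m_1, m_2
% defined on the same frame of discernment. K measures the conflict
% between the two pieces of evidence; the rule is undefined when K = 1.
\[
  (m_1 \oplus m_2)(A) \;=\; \frac{1}{1-K}\sum_{B \cap C = A} m_1(B)\,m_2(C),
  \qquad A \neq \emptyset,
\]
\[
  K \;=\; \sum_{B \cap C = \emptyset} m_1(B)\,m_2(C).
\]
```

The symmetry the abstract contrasts with revision is visible here: m_1 and m_2 can be swapped without changing the result, whereas revision treats the prior and the input asymmetrically.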