822 results for theory of knowledge
Abstract:
The modern subject is what we can call a self-subjecting individual: someone in whose inner reality a more permanent governability has been implanted, a governability that works inside the agent. Michel Foucault's genealogy of the modern subject is the history of its constitution by power practices. By a flight of imagination, suppose that this history is not that of an evolving social structure or cultural phenomenon, but of one of those insects (a moth) whose life cycle consists of three stages or moments: crawling larva, encapsulated pupa, and flying adult. Foucault's history of power practices presents the same kind of miracle of total metamorphosis. The main forces in the general field of power can be apprehended through a generalisation of three rationalities functioning side by side in the plurality of different practices of power: domination, normalisation and the law. Domination is a force functioning by the rationality of reason of state: the state's essence is power, power is firm domination over people, and people are the state's resource, by which the state's strength is measured. Normalisation is a force that takes hold of people from the inside of society: it imposes society's own reality, its empirical verity, as a norm on people through silently working jurisdictional operations that exclude pathological individuals too far from the average of the population as a whole. The law is a counterforce to both domination and normalisation. Accounting for elements of legal practice as omnihistorical is not possible without a view of the general field of power. Without this view, and only in terms of the operations and tactical manoeuvres of the practice of law, nothing of the kind can be seen: the only thing that practice manifests is constant change itself. However, the backdrop of law's tacit dimension, that is, the power relations between law, domination and normalisation, allows one to see more. In the general field of power, the function of law is exactly to maintain the constant possibility of change. Whereas domination and normalisation would stabilise society, the law makes it move. The European individual has a reality as a problem. What is a problem? A problem is something that allows entry into the field of thought, said Foucault. To be a problem, it is necessary for a certain number of factors to have made it uncertain, to have made it lose familiarity, or to have provoked a certain number of difficulties around it. Entering the field of thought through problematisations of the European individual (human forms, power and knowledge), one is able to glimpse the historical backgrounds of our present being. These were produced, and then again buried, in intersections between practices of power and games of truth. In the problem of the European individual one has suitable circumstances that bring to light forces that have passed through the individual over the centuries.
Abstract:
This paper aims at establishing a statistical theory of rotational and vibrational excitation of polyatomic molecules by an intense IR laser. Starting from the Wigner function of quantum statistical mechanics, we treat the rotational motion in the classical approximation; the vibrational modes are classified into active modes, which are coupled directly with the laser, and background modes, which are not. The reduced Wigner function, i.e., the Wigner function integrated over all background coordinates, should satisfy an integro-differential equation. We introduce the idea of "viscous damping" to handle the interaction between the active modes and the background. The damping coefficient can be calculated with the aid of the well-known Schwartz–Slawsky–Herzfeld theory. The resulting equation is solved by the method of moment equations. There is only one adjustable parameter in our scheme; it is introduced owing to the lack of precise knowledge about the molecular potential. The theory developed in this paper satisfactorily explains recent absorption experiments on SF6 irradiated by a short-pulse CO2 laser, which are in sharp contradiction with the prevailing quasi-continuum theory. We also refine the density of energy levels, which is responsible for the multiphoton excitation of polyatomic molecules.
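The abstract does not reproduce the equation itself. Purely as a hedged illustration of what a "viscous damping" closure of this kind usually looks like (a Kramers-type term appended to the classical evolution of the reduced Wigner function f(q, p, t) of the active modes; the paper's actual form may differ):

\[
\frac{\partial f}{\partial t} = \{H_{\mathrm{act}}, f\} + \gamma \sum_i \frac{\partial}{\partial p_i}\!\left( p_i f + m_i k_B T \frac{\partial f}{\partial p_i} \right)
\]

Here \(\{H_{\mathrm{act}}, f\}\) is the classical Poisson-bracket streaming of the active modes and \(\gamma\) is the damping coefficient, which the paper obtains from Schwartz–Slawsky–Herzfeld rate theory.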
Abstract:
In practical situations, the causes of image blurring are often unknown or difficult to determine. Traditional methods, however, usually assume that the blur is known prior to the restoration process, which makes them impracticable for blind image restoration. The new method proposed in this paper aims exactly at blind image restoration. The restoration process is transformed into a problem of point-distribution analysis in high-dimensional space. Experiments show that restoration can be achieved with this method without prior knowledge of the image blur. In addition, the algorithm is guaranteed to converge and is computationally simple.
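For context, blind restoration is conventionally posed against the standard degradation model (the textbook formulation, not notation taken from this paper):

\[
g(x, y) = (h * f)(x, y) + n(x, y)
\]

where g is the observed image, f the latent sharp image, h the unknown blur kernel (point-spread function), n additive noise, and * denotes convolution. Non-blind methods assume h is given; blind methods, such as the one described here, must estimate f and h jointly from g alone.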
Abstract:
This research is concerned with designing representations for analytical reasoning problems (of the sort found on the GRE and LSAT). These problems test the ability to draw logical conclusions. A computer program was developed that takes as input a straightforward predicate calculus translation of a problem, requests additional information if necessary, decides what to represent and how, designs representations capturing the constraints of the problem, and creates and executes a LISP program that uses those representations to produce a solution. Even though these problems are typically difficult for theorem provers to solve, the LISP program that uses the designed representations is very efficient.
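The paper's LISP system is not reproduced in this abstract. As an illustration of the problem class only, here is a brute-force Python sketch of a typical analytical-reasoning (ordering) puzzle with invented constraints; the paper's point is that a well-designed representation avoids exactly this kind of exhaustive search:

```python
from itertools import permutations

# An invented puzzle of the GRE/LSAT kind: five speakers must be
# scheduled in five slots, subject to ordering constraints.
SPEAKERS = ["A", "B", "C", "D", "E"]

def satisfies(order):
    pos = {s: i for i, s in enumerate(order)}
    return (
        pos["A"] < pos["B"]                 # A speaks before B
        and abs(pos["C"] - pos["D"]) == 1   # C and D are adjacent
        and pos["E"] != 0                   # E does not speak first
    )

# Naive search: test all 5! = 120 orderings against the constraints.
solutions = [order for order in permutations(SPEAKERS) if satisfies(order)]
print(len(solutions), "orderings satisfy the constraints; e.g.", solutions[0])
```

A representation that encodes the constraints directly in its structure, as the program described above designs, can answer such questions without enumerating the whole permutation space.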
Abstract:
This report investigates the process of focussing as a description and explanation of the comprehension of certain anaphoric expressions in English discourse. The investigation centers on the interpretation of definite anaphora, that is, on the personal pronouns and noun phrases used with a definite article the, this or that. Focussing is formalized as a process in which a speaker centers attention on a particular aspect of the discourse. An algorithmic description specifies what the speaker can focus on and how the speaker may change the focus of the discourse as the discourse unfolds. The algorithm allows for a simple focussing mechanism to be constructed: an element in focus, an ordered collection of alternate foci, and a stack of old foci. The data structure for the element in focus is a representation which encodes a limited set of associations between it and other elements from the discourse as well as from general knowledge.
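A minimal Python sketch of the three-part focus mechanism the abstract describes (class and method names are illustrative, not taken from the report):

```python
class FocusState:
    """Tracks discourse focus: a focused element, alternates, and a stack of old foci."""

    def __init__(self, initial_focus, alternates=()):
        self.focus = initial_focus          # the element currently in focus
        self.alternates = list(alternates)  # ordered collection of alternate foci
        self.old_foci = []                  # stack of previously focused elements

    def shift_to(self, new_focus):
        """Move focus to a new discourse element, stacking the old one."""
        self.old_foci.append(self.focus)
        self.focus = new_focus

    def return_to_old(self):
        """Pop back to the most recently stacked focus, e.g. after a digression."""
        if self.old_foci:
            self.focus = self.old_foci.pop()
        return self.focus
```

The stack discipline mirrors how a discourse can digress to a subtopic and later resume the earlier focus, which is what licenses a pronoun to refer back across the digression.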
Abstract:
Background: Accommodating Interruptions is a theory that emerged in the context of young people who have asthma. A background to the prevalence and management of asthma in Ireland is given to situate the theory. Ireland has the fourth highest incidence of asthma in the world, with almost one in five Irish young people having asthma. Although national and international asthma management guidelines exist, it is accepted that symptom control of asthma among young people is poor. Aim: The aim of this research is to investigate the lives of young people who have asthma, to allow for a deeper understanding of the issues affecting them. Methods: This research was undertaken using a Classic Grounded Theory approach, a systematic approach that allows concepts to emerge from the data in generating a theory that explains the behaviour through which participants resolve their main concern. The data were collected through in-depth interviews with young people aged 11-16 years who had had asthma for over one year. Data were also collected from participant diaries. Constant comparative analysis, theoretical coding and memo writing were used to develop the theory. Results: The theory explains how young people resolve their main concern of being restricted, by maximising their participation and inclusion in activities, events and relationships in spite of their asthma. They achieve this by accommodating interruptions in their lives, minimising the effects of asthma on their everyday lives. Conclusion: The theory of accommodating interruptions explains young people's asthma management behaviours in a new way. It allows us to understand how and why young people behave the way they do in order to minimise the effect of asthma on their lives. The theory adds to the body of knowledge on young people with asthma and challenges some viewpoints regarding their behaviours.
Abstract:
An event memory is a mental construction of a scene recalled as a single occurrence. It therefore requires the hippocampus and ventral visual stream needed for all scene construction. The construction need not come with a sense of reliving or be made by a participant in the event, and it can be a summary of occurrences from more than one encoding. The mental construction, or physical rendering, of any scene must be done from a specific location and time; this introduces a "self" located in space and time, which is a necessary, but need not be a sufficient, condition for a sense of reliving. We base our theory on scene construction rather than reliving because this allows the integration of many literatures and because there is more accumulated knowledge about scene construction's phenomenology, behavior, and neural basis. Event memory differs from episodic memory in that it does not conflate the independent dimensions of whether or not a memory is relived, is about the self, is recalled voluntarily, or is based on a single encoding with whether it is recalled as a single occurrence of a scene. Thus, we argue that event memory provides a clearer contrast to semantic memory, which also can be about the self, be recalled voluntarily, and be from a unique encoding; allows for a more comprehensive dimensional account of the structure of explicit memory; and better accounts for laboratory and real-world behavioral and neural results, including those from neuropsychology and neuroimaging, than does episodic memory.
Abstract:
Economic analysis treats technology as exogenously given, even while recognising that it is endogenously determined. This paper examines this conceptual conflict and outlines an alternative conceptual framework, which uses a 'General Vertical Division of Labour' into conceptual and executive parts to facilitate a coherent political-economic explanation of technological change. The paper suggests that we may acquire, rather than impose, an understanding of technological change. It also suggests that we may redefine and reassess the efficiency of technological change through the values inculcated into it.
Abstract:
Belief revision characterizes the process of revising an agent's beliefs when receiving new evidence. In the field of artificial intelligence, revision strategies have been extensively studied in the context of logic-based formalisms and probability kinematics. However, so far there is not much literature on this topic in evidence theory. In contrast, combination rules proposed so far in the theory of evidence, especially Dempster's rule, are symmetric. They rely on a basic assumption, namely that the pieces of evidence being combined are on a par, i.e. play the same role. When one source of evidence is less reliable than another, it is possible to discount it, and then a symmetric combination operation is still used. In the case of revision, the idea is to let the prior knowledge of an agent be altered by some input information. The change problem is thus intrinsically asymmetric. Assuming the input information is reliable, it should be retained whilst the prior information should be changed minimally to that effect. To deal with this issue, this paper defines the notion of revision for the theory of evidence in such a way as to bring together probabilistic and logical views. Several previously proposed revision rules are reviewed, and we advocate one of them as better corresponding to the idea of revision. It is extended to cope with inconsistency between prior and input information. It reduces to Dempster's rule of combination, just as revision in the sense of Alchourrón, Gärdenfors and Makinson (AGM) reduces to expansion, when the input is strongly consistent with the prior belief function. Properties of this revision rule are also investigated, and it is shown to generalize Jeffrey's rule of updating, Dempster's rule of conditioning and a form of AGM revision.
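For readers unfamiliar with the symmetric baseline that revision is contrasted against here, a minimal Python sketch of Dempster's rule of combination, with mass functions represented as dicts from frozensets to masses (a textbook implementation, not code from the paper):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) by Dempster's rule."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # product mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence: combination undefined")
    # Normalise: redistribute the conflicting mass proportionally.
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Two sources over the frame {x, y, z}; note the operation is symmetric,
# which is exactly the property that makes it unsuited to asymmetric revision.
m1 = {frozenset("x"): 0.6, frozenset("xyz"): 0.4}
m2 = {frozenset("y"): 0.3, frozenset("xyz"): 0.7}
print(dempster_combine(m1, m2))
```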
Abstract:
Combination rules proposed so far in the Dempster-Shafer theory of evidence, especially Dempster's rule, rely on a basic assumption, namely that the pieces of evidence being combined are on a par, i.e. play the same role. When a source of evidence is less reliable than another, it is possible to discount it, and then a symmetric combination operation is still used. In the case of revision, the idea is to let the prior knowledge of an agent be altered by some input information. The change problem is thus intrinsically asymmetric. Assuming the input information is reliable, it should be retained whilst the prior information should be changed minimally to that effect. Although belief revision is already an important subfield of artificial intelligence, it has so far been little addressed in evidence theory. In this paper, we define the notion of revision for the theory of evidence and propose several different revision rules, called the inner and outer revisions, and a modified adaptive outer revision, which better corresponds to the idea of revision. Properties of these revision rules are also investigated.
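Both abstracts invoke discounting a less reliable source before symmetric combination. A minimal sketch of the classical (Shafer) discounting operation, under the standard definition rather than anything specific to these papers: a discount rate alpha scales every focal mass and moves the remainder to the whole frame.

```python
def discount(m, alpha, frame):
    """Shafer discounting: scale masses by (1 - alpha); move the rest to the frame."""
    frame = frozenset(frame)
    out = {s: (1.0 - alpha) * w for s, w in m.items() if s != frame}
    out[frame] = (1.0 - alpha) * m.get(frame, 0.0) + alpha
    return out

# Fully distrusted evidence (alpha = 1) becomes vacuous: all mass on the frame.
m = {frozenset("x"): 0.6, frozenset("xy"): 0.4}
print(discount(m, 0.2, "xyz"))
```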
Abstract:
This survey presents, within a single model, three theories of decentralization of decision-making within organizations based on private information and incentives. Renegotiation, collusion, and limits on communication are three sufficient conditions for decentralization to be optimal.
Abstract:
Whilst much is known of new technology adopters, little research has addressed the role of their attitudes in adoption decisions, particularly for technologies with evident economic potential that have not been taken up by farmers. This paper presents recent research that has used a new approach to examine the role that adopters' attitudes play in identifying the drivers of, and barriers to, adoption. The study was concerned with technologies for livestock farming systems in SW England, specifically oestrus detection, nitrogen supply management and the inclusion of white clover. Adoption behaviour is analysed using the social-psychology theory of reasoned action to identify factors that affect the adoption of technologies, which are confirmed using principal components analysis. The results presented here relate to the specific adoption behaviour regarding the Milk Development Council's recommended observation times for heat detection. The factors that affect the adoption of this technology are cost effectiveness and improved detection and conception rates as the main drivers, whilst the threat of demeaning the personal knowledge and skills of a farmer in 'knowing' their cows is a barrier. This research shows clearly that promotion of a technology and transfer of knowledge for a farming system need to take account of the beliefs and attitudes of potential adopters.
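For reference, the general form of the theory of reasoned action that such studies estimate (the standard Fishbein-Ajzen model, not this paper's exact specification) expresses behavioural intention BI as a weighted sum of attitude toward the behaviour and subjective norm:

\[
BI = w_1 A_B + w_2 SN, \qquad A_B = \sum_i b_i e_i, \qquad SN = \sum_j n_j m_j
\]

where the \(b_i\) are behavioural beliefs weighted by outcome evaluations \(e_i\), the \(n_j\) are normative beliefs weighted by motivations to comply \(m_j\), and \(w_1, w_2\) are empirically estimated weights.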
Abstract:
This paper examines the implications of policy fracture and arm's-length governance within the decision-making processes currently shaping curriculum design within the English education system. In particular, it argues that an unresolved 'ideological fracture' at government level has been passed down to school leaders, whose response to the dilemma is distorted by the target-driven agenda of arm's-length agencies. Drawing upon the findings of a large-scale online survey of history teaching in English secondary schools, this paper illustrates the problems that occur when policy making is divorced from curriculum theory, and in particular from any consideration of the nature of knowledge. Drawing on the social realist theory of knowledge elaborated by Young (2008), we argue that the rapid spread of alternative curricular arrangements, implemented in the absence of an understanding of curriculum theory, undermines the value of disciplined thinking to the detriment of many young people, particularly those in areas of social and economic deprivation.
Abstract:
The economic theory of the firm is central to the theory of the multinational enterprise. Recent literature on multinationals, however, makes only limited reference to the economic theory of the firm. Multinationals play an important role in coordinating the international division of labour through internal markets. The paper reviews the economic principles that underlie this view. Optimal internalisation equates marginal benefits and costs. The benefits of internalisation stem mainly from the difficulties of licensing proprietary knowledge, reflecting the view that MNEs possess an ‘ownership’ or ‘firm-specific’ advantage. The costs of internalisation, it is argued, reflect managerial capability, and in particular the capability to manage a large firm. The paper argues that management capability is a complement to ownership advantage. Ownership advantage determines the potential of the firm, and management capability governs the fulfilment of this potential through overcoming barriers to growth. The analysis is applied to a variety of issues, including out-sourcing, geographical dispersion of production, and regional specialisation in marketing.