882 results for Knowledge, Theory of (Hinduism)
Abstract:
Background: Accommodating Interruptions is a theory that emerged in the context of young people who have asthma. A background to the prevalence and management of asthma in Ireland is given to situate the theory. Ireland has the fourth highest incidence of asthma in the world, with almost one in five Irish young people having asthma. Although national and international asthma management guidelines exist, it is accepted that symptom control of asthma among young people is poor. Aim: The aim of this research is to investigate the lives of young people who have asthma, to allow a deeper understanding of the issues affecting them. Methods: This research was undertaken using a Classic Grounded Theory approach, a systematic approach that allows concepts to emerge from the data in generating a theory that explains the behaviour through which participants resolve their main concern. The data were collected through in-depth interviews with young people aged 11-16 years who had had asthma for over one year. Data were also collected from participant diaries. Constant comparative analysis, theoretical coding and memo writing were used to develop the theory. Results: The theory explains how young people resolve their main concern of being restricted by maximising their participation and inclusion in activities, events and relationships in spite of their asthma. They achieve this by accommodating interruptions in their lives, thereby minimising the effects of asthma on their everyday lives. Conclusion: The theory of accommodating interruptions explains young people's asthma management behaviours in a new way. It allows us to understand how and why young people behave as they do in order to minimise the effect of asthma on their lives. The theory adds to the body of knowledge on young people with asthma and challenges some viewpoints regarding their behaviours.
Abstract:
An event memory is a mental construction of a scene recalled as a single occurrence. It therefore requires the hippocampus and ventral visual stream needed for all scene construction. The construction need not come with a sense of reliving or be made by a participant in the event, and it can be a summary of occurrences from more than one encoding. The mental construction, or physical rendering, of any scene must be done from a specific location and time; this introduces a "self" located in space and time, which is a necessary, though not necessarily sufficient, condition for a sense of reliving. We base our theory on scene construction rather than reliving because this allows the integration of many literatures and because there is more accumulated knowledge about the phenomenology, behavior, and neural basis of scene construction. Event memory differs from episodic memory in that it does not conflate the independent dimensions of whether a memory is relived, is about the self, is recalled voluntarily, or is based on a single encoding with whether it is recalled as a single occurrence of a scene. Thus, we argue that event memory provides a clearer contrast to semantic memory, which can also be about the self, be recalled voluntarily, and be from a unique encoding; allows for a more comprehensive dimensional account of the structure of explicit memory; and better accounts for laboratory and real-world behavioral and neural results, including those from neuropsychology and neuroimaging, than does episodic memory.
Abstract:
Economic analysis treats technology as exogenously given, even though it is determined endogenously. This paper examines that conceptual conflict and outlines an alternative conceptual framework, which uses a 'General Vertical Division of Labour' into conceptual and executive parts to facilitate a coherent political-economic explanation of technological change. The paper suggests that we may acquire, rather than impose, an understanding of technological change. It also suggests that we may redefine and reassess the efficiency of technological change through the values inculcated into it.
Abstract:
Belief revision characterizes the process of revising an agent's beliefs when receiving new evidence. In the field of artificial intelligence, revision strategies have been extensively studied in the context of logic-based formalisms and probability kinematics. However, so far there is little literature on this topic in evidence theory. By contrast, the combination rules proposed so far in the theory of evidence, especially Dempster's rule, are symmetric. They rely on a basic assumption: pieces of evidence being combined are considered to be on a par, i.e. they play the same role. When one source of evidence is less reliable than another, it is possible to discount it, and then a symmetric combination operation is still used. In the case of revision, the idea is to let an agent's prior knowledge be altered by some input information. The change problem is thus intrinsically asymmetric. Assuming the input information is reliable, it should be retained, whilst the prior information should be changed minimally to that effect. To deal with this issue, this paper defines the notion of revision for the theory of evidence in such a way as to bring together the probabilistic and logical views. Several previously proposed revision rules are reviewed, and we advocate one of them as better corresponding to the idea of revision. It is extended to cope with inconsistency between prior and input information. It reduces to Dempster's rule of combination, just as revision in the sense of Alchourrón, Gärdenfors, and Makinson (AGM) reduces to expansion, when the input is strongly consistent with the prior belief function. Properties of this revision rule are also investigated, and it is shown to generalize Jeffrey's rule of updating, Dempster's rule of conditioning, and a form of AGM revision.
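As a concrete illustration of the symmetric combination that this revision rule reduces to, here is a minimal Python sketch of Dempster's rule of combination for mass functions over a small frame of discernment. The frame, the example masses, and the frozenset encoding are illustrative assumptions, not taken from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions by Dempster's rule.

    m1, m2: dicts mapping frozensets (focal elements) to masses summing to 1.
    Returns the normalised combined mass function; raises on total conflict.
    """
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass falling on the empty set
    if conflict >= 1.0 - 1e-12:
        raise ValueError("totally conflicting evidence: combination undefined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Illustrative frame {h1, h2, h3}.
m_prior = {frozenset({"h1", "h2"}): 0.6, frozenset({"h1", "h2", "h3"}): 0.4}
m_input = {frozenset({"h2"}): 0.7, frozenset({"h2", "h3"}): 0.3}
print(dempster_combine(m_prior, m_input))
```

Note that swapping m_prior and m_input gives the same result: this is exactly the "on a par" symmetry the abstract describes, and the asymmetry that a revision operation is designed to break.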
Abstract:
Combination rules proposed so far in the Dempster-Shafer theory of evidence, especially Dempster's rule, rely on a basic assumption: pieces of evidence being combined are considered to be on a par, i.e. they play the same role. When a source of evidence is less reliable than another, it is possible to discount it, and then a symmetric combination operation is still used. In the case of revision, the idea is to let an agent's prior knowledge be altered by some input information. The change problem is thus intrinsically asymmetric. Assuming the input information is reliable, it should be retained, whilst the prior information should be changed minimally to that effect. Although belief revision is already an important subfield of artificial intelligence, it has so far been little addressed in evidence theory. In this paper, we define the notion of revision for the theory of evidence and propose several different revision rules, called the inner and outer revisions, and a modified adaptive outer revision, which better corresponds to the idea of revision. Properties of these revision rules are also investigated.
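To make the discounting operation mentioned above concrete, the following is a small hedged Python sketch of Shafer's classical discounting, in which a source judged reliable with probability alpha keeps alpha of its mass and the remainder is transferred to the whole frame (total ignorance). The frame and example masses are invented for illustration.

```python
def discount(m, alpha, frame):
    """Shafer discounting of a mass function.

    m: dict mapping frozensets to masses; alpha in [0, 1] is the source's
    assumed reliability; frame: frozenset representing the whole frame.
    """
    discounted = {s: alpha * v for s, v in m.items() if s != frame}
    # The lost mass (1 - alpha), plus the discounted mass already on the
    # frame, becomes total ignorance.
    discounted[frame] = (1.0 - alpha) + alpha * m.get(frame, 0.0)
    return discounted

frame = frozenset({"h1", "h2", "h3"})
m_source = {frozenset({"h1"}): 0.8, frame: 0.2}
# A source believed reliable only 70% of the time:
print(discount(m_source, 0.7, frame))
# -> {frozenset({'h1'}): 0.56, frame: 0.44}
```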
Abstract:
This survey presents, within a single model, three theories of the decentralization of decision-making within organizations based on private information and incentives. Renegotiation, collusion, and limits on communication are three sufficient conditions for decentralization to be optimal.
Abstract:
Whilst much is known about adopters of new technology, little research has addressed the role of their attitudes in adoption decisions, particularly for technologies with evident economic potential that have not been taken up by farmers. This paper presents recent research using a new approach that examines the role adopters' attitudes play in identifying the drivers of, and barriers to, adoption. The study was concerned with technologies for livestock farming systems in SW England, specifically oestrus detection, nitrogen supply management, and inclusion of white clover. Adoption behaviour is analysed using the social-psychology theory of reasoned action to identify factors that affect the adoption of technologies, which are confirmed using principal components analysis. The results presented here relate to the specific adoption behaviour regarding the Milk Development Council's recommended observation times for heat detection. The factors that affect the adoption of this technology are cost effectiveness and improved detection and conception rates as the main drivers, whilst the threat of demeaning the personal knowledge and skills of a farmer in 'knowing' their cows is a barrier. This research shows clearly that promotion of a technology, and transfer of knowledge for a farming system, need to take account of the beliefs and attitudes of potential adopters.
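As a rough illustration of the expectancy-value logic behind the theory of reasoned action used in this study, the sketch below computes an attitude score from belief-strength and outcome-evaluation ratings and then inspects the item structure with a principal components analysis, a stand-in for the factor confirmation step the abstract describes. The item names, ratings, and scales are purely hypothetical.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical items for the oestrus-detection technology: each belief is
# rated for strength (b, 1..7) and for evaluation of its outcome (e, -3..+3).
items = ["cost_effective", "better_detection", "better_conception",
         "demeans_stockmanship"]
b = np.array([[6, 7, 6, 2],    # farmer 1
              [5, 6, 7, 6],    # farmer 2
              [7, 5, 6, 5]])   # farmer 3
e = np.array([[3, 3, 2, -3],
              [2, 3, 3, -2],
              [3, 2, 3, -3]])

# Theory of reasoned action: attitude toward adopting = sum_i b_i * e_i
attitude = (b * e).sum(axis=1)
print("attitude scores:", attitude)

# PCA on the belief-evaluation products to examine the item structure.
pca = PCA(n_components=2).fit(b * e)
print("explained variance ratios:", pca.explained_variance_ratio_)
```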
Abstract:
The economic theory of the firm is central to the theory of the multinational enterprise. Recent literature on multinationals, however, makes only limited reference to the economic theory of the firm. Multinationals play an important role in coordinating the international division of labour through internal markets. The paper reviews the economic principles that underlie this view. Optimal internalisation equates marginal benefits and costs. The benefits of internalisation stem mainly from the difficulties of licensing proprietary knowledge, reflecting the view that MNEs possess an ‘ownership’ or ‘firm-specific’ advantage. The costs of internalisation, it is argued, reflect managerial capability, and in particular the capability to manage a large firm. The paper argues that management capability is a complement to ownership advantage. Ownership advantage determines the potential of the firm, and management capability governs the fulfilment of this potential through overcoming barriers to growth. The analysis is applied to a variety of issues, including out-sourcing, geographical dispersion of production, and regional specialisation in marketing.
Abstract:
This paper addresses one of the issues in contemporary globalisation theory: the extent to which there is 'one best way' in which business can be done and organisations managed. It uses Czarniawska's 'Travels of Ideas' model as an organising framework to present and understand how the concept of 'Quality', so important in contemporary approaches to manufacturing and services and their management, travelled to, and impinged on, a newly opened vehicle assembly plant in Poland. The extent to which new meanings were mutually created in the process of translation is discussed, using ethnographic reporting and analysis techniques commonly used in diffusion research. In conclusion, the paper briefly discusses parallels between the process of translation, as an idea becomes embedded in a new cultural location, and the processes that contemporary research has identified as important to organisational learning.
Abstract:
Purpose: The purpose of this paper is to systematically describe the key practical contributions of the theory of constraints (TOC) to outbound (distribution) logistics. Design/methodology/approach: Based on theoretical research, this paper presents the main practical aspects of the approach suggested by TOC to outbound logistics and discusses the assumptions upon which it is based. Findings: This paper corroborates the thesis defended by TOC, according to which the current ways of managing outbound logistics, based mainly on sales forecasts, lead to difficulties in handling trade-offs between logistics (stock and transportation) costs and stock-out levels. Research limitations/implications: The reported research is of a theoretical nature. Practical implications: TOC offers a proposal that is complementary in many aspects, and very distinctive in others, to the way some key processes and elements of supply chain management (SCM) are managed, especially outbound logistics. Originality/value: Considering the dearth of papers dealing with the conceptual articulation and organization of this subject, the paper helps to systematize the knowledge currently available about the contributions of TOC to outbound logistics, highlighting the practical implications of applying TOC to outbound logistics.
Abstract:
Researchers suggest that personalization on the Semantic Web will eventually add up to a Web 3.0. In this Web, personalized agents, rather than humans, process and thus generate the biggest share of information. In the sense of emergent semantics, which supplements the traditional formal semantics of the Semantic Web, this is well conceivable. An emergent Semantic Web underpinned by fuzzy grassroots ontologies can be accomplished by inducing knowledge from users' common parlance in mutual Web 2.0 interactions [1]. These ontologies can also be matched against existing Semantic Web ontologies to create comprehensive top-level ontologies. On the Web, if augmented with information in the form of restrictions and associated reliability (Z-numbers) [2], this collection of fuzzy ontologies constitutes an important basis for an implementation of Zadeh's restriction-centered theory of reasoning and computation (RRC) [3]. By considering the real world's fuzziness, RRC differs from traditional approaches because it can handle restrictions described in natural language. A restriction is an answer to a question about the value of a variable, such as the duration of an appointment. In addition to mathematically well-defined answers, RRC can likewise deal with unprecisiated answers such as "about one hour". Inspired by mental functions, it constitutes an important basis for leveraging present-day Web efforts to a natural Web 3.0. Based on natural language information, RRC may be accomplished with Z-number calculation to achieve personalized Web reasoning and computation. Finally, through Web agents' understanding of natural language, they can react to humans more intuitively and thus generate and process information.
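To make the notion of a restriction concrete, the following is a minimal Python sketch of a Z-number in Zadeh's sense: a pair (A, B) in which A is a fuzzy restriction on a variable's value and B a fuzzy restriction on the reliability of A. The triangular membership functions and the parameters chosen for "about one hour" are illustrative assumptions, not prescribed by the abstract.

```python
from dataclasses import dataclass
from typing import Callable

def triangular(a: float, m: float, b: float) -> Callable[[float], float]:
    """Triangular fuzzy membership function with support (a, b) and peak m."""
    def mu(x: float) -> float:
        if x <= a or x >= b:
            return 0.0
        return (x - a) / (m - a) if x <= m else (b - x) / (b - m)
    return mu

@dataclass
class ZNumber:
    """Z = (A, B): A restricts the variable's value, B the reliability of A."""
    value_restriction: Callable[[float], float]        # A, e.g. "about one hour"
    reliability_restriction: Callable[[float], float]  # B, e.g. "usually"

# "The appointment lasts about one hour, usually."
about_one_hour = triangular(45.0, 60.0, 75.0)  # minutes
usually = triangular(0.6, 0.8, 1.0)            # degree of certainty
z = ZNumber(about_one_hour, usually)

print(z.value_restriction(55.0))        # membership of 55 min in "about one hour"
print(z.reliability_restriction(0.85))  # membership of 0.85 in "usually"
```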
Abstract:
Objective. The purpose of the study is to provide a holistic depiction of the behavioral and environmental factors contributing to risky sexual behaviors among predominantly high-school-educated, low-income African Americans residing in urban areas of Houston, TX, using the Theory of Gender and Power, Situational/Environmental Variables Theory, and Sexual Script Theory. Methods. A cross-sectional study was conducted via questionnaires among 215 Houston-area residents, of whom 149 were women and 66 were men. Measures used to assess behaviors of the population included a history of homelessness, use of crack/cocaine among several other illicit drugs, the type of sexual partner, age of participant, age of most recent sex partner, whether participants had sought health care in the last 12 months, knowledge of a partner's other sexual activities, symptoms of depression, and places where partners were met. To determine the risk of sexual encounters, a risk index employing the variables used to assess condom use was created, categorizing sexual encounters as unsafe or safe. Results. Variables meeting the significance level of p<.15 in the bivariate analysis for each theory were entered into a binary logistic regression analysis. The block for each theory was significant, suggesting that the grouping assignments of the variables by theory were significantly associated with unsafe sexual behaviors. Within the regression analysis, variables such as sex for drugs/money, low income, and crack use demonstrated an effect size of ≥ ± 1, indicating that these variables had a significant effect on unsafe sexual behavioral practices. Conclusions. Variables assessing behavior and environment demonstrated a significant effect when categorized by their relation to the designated theories.
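A hedged sketch of the two-stage analysis described in the Results: variables passing a bivariate p<.15 screen are entered together, as a theory block, into a binary logistic regression. The data frame, outcome name, and variable names are hypothetical, and statsmodels is assumed to be available; this is a sketch of the workflow, not the authors' code.

```python
import pandas as pd
import statsmodels.api as sm

def screen_then_fit(df: pd.DataFrame, outcome: str, theory_block: list,
                    alpha: float = 0.15):
    """Keep variables whose bivariate logistic p-value is below alpha,
    then fit one logistic regression on the surviving block."""
    survivors = []
    for var in theory_block:
        model = sm.Logit(df[outcome], sm.add_constant(df[[var]])).fit(disp=0)
        if model.pvalues[var] < alpha:
            survivors.append(var)
    if not survivors:
        return None
    return sm.Logit(df[outcome], sm.add_constant(df[survivors])).fit(disp=0)

# Hypothetical block for one of the three theories:
# result = screen_then_fit(df, "unsafe_sex",
#                          ["sex_for_drugs_money", "low_income", "crack_use"])
# print(result.summary())
```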
Abstract:
We define an applicative theory of truth TPT which proves totality exactly for the polynomial time computable functions. TPT has natural and simple axioms since nearly all its truth axioms are standard for truth theories over an applicative framework. The only exception is the axiom dealing with the word predicate. The truth predicate can only reflect elementhood in the words for terms that have smaller length than a given word. This makes it possible to achieve the very low proof-theoretic strength. Truth induction can be allowed without any constraints. For these reasons the system TPT has the high expressive power one expects from truth theories. It allows embeddings of feasible systems of explicit mathematics and bounded arithmetic. The proof that the theory TPT is feasible is not easy. It is not possible to apply a standard realisation approach. For this reason we develop a new realisation approach whose realisation functions work on directed acyclic graphs. In this way, we can express and manipulate realisation information more efficiently.
Development of meta-representations: Procedural metacognition and the relationship to Theory of Mind
Abstract:
Several studies have shown that metacognitive ability is crucial for children's success in school. Much less is known about the emergence of this ability and its relationship to other meta-representations such as Theory of Mind competencies. In recent years, a growing literature has suggested that metacognition and Theory of Mind could theoretically be assumed to belong to the same developmental concept. So far, however, only a few studies have provided empirical evidence that metacognition and Theory of Mind are related, and these studies focused on declarative metacognitive knowledge rather than on procedural metacognitive monitoring, as in the present study: N = 159 children were first tested shortly before making the transition to school (aged between 5 1/2 and 7 1/2 years) and again one year later at the end of their first grade. Analyses suggest that there is in fact a significant relation between early metacognitive monitoring skills (procedural metacognition) and later Theory of Mind competencies. Notably, language seems to play a crucial role in this relationship. Our results thus bring new insights to research on the development of meta-representation and support the view that metacognition and Theory of Mind are indeed interrelated, although the precise mechanisms remain unclear.