820 results for knowledge framework
Abstract:
This paper presents a framework for Historical Case-Based Reasoning (HCBR) which allows the expression of both relative and absolute temporal knowledge, representing case histories in the real world. The formalism is founded on a general temporal theory that accommodates both points and intervals as primitive time elements. A case history is formally defined as a collection of (time-independent) elemental cases, together with its corresponding temporal reference. Case history matching is two-fold, i.e., two similarity values need to be computed: the non-temporal similarity degree and the temporal similarity degree. On the one hand, based on elemental case matching, the non-temporal similarity degree between case histories is defined by means of computing the unions and intersections of the involved elemental cases. On the other hand, by means of the graphical representation of temporal references, the temporal similarity degree in case history matching is transformed into conventional graph similarity measurement.
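As an illustration of how a union/intersection-based non-temporal similarity degree might be computed, the sketch below treats each case history as a set of elemental cases and uses a Jaccard-style ratio; the function name and the ratio itself are assumptions, not the paper's exact definition.

```python
# Sketch only: case histories as sets of elemental cases; non-temporal similarity
# as a Jaccard-style ratio of intersection to union. Names are hypothetical.

def non_temporal_similarity(history_a: set, history_b: set) -> float:
    """Similarity in [0, 1] between two case histories, ignoring temporal references."""
    if not history_a and not history_b:
        return 1.0
    return len(history_a & history_b) / len(history_a | history_b)

# Two case histories sharing two of four distinct elemental cases.
print(non_temporal_similarity({"c1", "c2", "c3"}, {"c2", "c3", "c4"}))  # 0.5
```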
Abstract:
This paper suggests a possible framework for the encapsulation of the decision making process for the Waterime project. The final outcome may be a computerised model, but the process advocated is not prescriptive, and involves the production of a "paper model" as a mediating representation between the knowledge acquired and any computerised system. This paper model may suffice in terms of the project's goals.
Abstract:
The anticipated rewards of adaptive approaches will only be fully realised when autonomic algorithms can take configuration and deployment decisions that match and exceed those of human engineers. Such decisions are typically characterised as being based on a foundation of experience and knowledge. In humans, these underpinnings are themselves founded on the ashes of failure, the exuberance of courage and (sometimes) the outrageousness of fortune. In this paper we describe an application framework that will allow the incorporation of similarly risky, error prone and downright dangerous software artefacts into live systems – without undermining the certainty of correctness at application level. We achieve this by introducing the notion of application dreaming.
Abstract:
The increasing complexity of new manufacturing processes and the continuously growing range of fabrication options mean that critical decisions about the insertion of new technologies must be made as early as possible in the design process. Mitigating the technology risks under limited knowledge is a key factor and major requirement to secure a successful development of the new technologies. In order to address this challenge, a risk mitigation methodology that incorporates both qualitative and quantitative analysis is required. This paper outlines the methodology being developed under a major UK grand challenge project - 3D-Mintegration. The main focus is on identifying the risks through identification of the product key characteristics using a product breakdown approach. The assessment of the identified risks uses quantification and prioritisation techniques to evaluate and rank the risks. Traditional statistical process control based on process capability and six sigma concepts is applied to measure the process capability as a result of the risks that have been identified. This paper also details a numerical approach that can be used to undertake risk analysis. This methodology is based on a computational framework where modelling and statistical techniques are integrated. An example of a modelling and simulation technique is also given using focused ion beam, which is among the manufacturing processes investigated in the project.
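The process-capability part of the methodology can be illustrated with the standard Cp/Cpk indices used in six-sigma style statistical process control; the sketch below is a generic example, not the 3D-Mintegration project's specific risk model.

```python
# Generic Cp/Cpk computation, not the project's specific risk quantification.
import statistics

def process_capability(samples, lsl, usl):
    """Return (Cp, Cpk) for a sample of measurements against spec limits."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)          # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Example: milled feature widths (micrometres) against a 9-11 um specification.
samples = [10.1, 9.9, 10.0, 10.2, 9.8, 10.05, 9.95, 10.1]
print(process_capability(samples, lsl=9.0, usl=11.0))
```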
Abstract:
Time-series analysis and prediction play an important role in state-based systems that involve dealing with varying situations in terms of states of the world evolving with time. Generally speaking, the world in the discourse persists in a given state until something occurs that takes it into another state. This paper introduces a framework for prediction and analysis based on time-series of states. It takes a time theory that addresses both points and intervals as primitive time elements as the temporal basis. A state of the world under consideration is defined as a set of time-varying propositions with Boolean truth-values that are dependent on time, including properties, facts, actions, events and processes, etc. A time-series of states is then formalized as a list of states that are temporally ordered one after another. The framework supports explicit expression of both absolute and relative temporal knowledge. A formal schema is provided for expressing general time-series of states that may be incomplete in various ways, while the concept of a complete time-series of states is also formally defined. As applications of the formalism in time-series analysis and prediction, we present two illustrative examples.
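A minimal sketch of the two central notions, assuming a state is a set of propositions with Boolean truth-values and a time-series is a temporally ordered list of such states; class and field names are illustrative, not the paper's formal schema.

```python
# Illustrative data structure only; names are not the paper's formal notation.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class State:
    # proposition name -> Boolean truth-value holding while this state persists
    propositions: dict

@dataclass
class TimeSeries:
    states: list = field(default_factory=list)   # temporally ordered, earlier first

    def holds(self, proposition: str, index: int) -> Optional[bool]:
        """Truth-value of a proposition in the index-th state; None means the
        series is incomplete (no value recorded) for that proposition there."""
        return self.states[index].propositions.get(proposition)

series = TimeSeries([
    State({"valve_open": True, "alarm_on": False}),
    State({"valve_open": False, "alarm_on": True}),
])
print(series.holds("alarm_on", 1))   # True
```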
Abstract:
Unprecedented basin-scale ecological changes are occurring in our seas. As temperature and carbon dioxide concentrations increase, the extent of sea ice is decreasing, stratification and nutrient regimes are changing and pH is decreasing. These unparalleled changes present new challenges for managing our seas, as we are only just beginning to understand the ecological manifestations of these climate alterations. The Marine Strategy Framework Directive requires all European Member States to achieve good environmental status (GES) in their seas by 2020; this means management towards GES will take place against a background of climate-driven macroecological change. Each Member State must set environmental targets to achieve GES; however, in order to do so, an understanding of large-scale ecological change in the marine ecosystem is necessary. Much of our knowledge of macroecological change in the North Atlantic is a result of research using data gathered by the Continuous Plankton Recorder (CPR) survey, a near-surface plankton monitoring programme that has been sampling in the North Atlantic since 1931. CPR data indicate that North Atlantic and North Sea plankton dynamics are responding to both climate and human-induced changes, presenting challenges to the development of pelagic targets for achievement of GES in European Seas. Thus, the continuation of long-term ecological time series such as the CPR survey is crucial for informing and supporting the sustainable management of European seas through policy mechanisms.
Abstract:
Multi-agent systems have become increasingly mature, but their appearance does not make the traditional OO approach obsolete. On the contrary, OO methodologies can benefit from the principles and tools designed for agent systems. The Agent-Rule-Class (ARC) framework is proposed as an approach that builds agents upon traditional OO system components and makes use of business rules to dictate agent behaviour with the aid of OO components. By modelling agent knowledge in business rules, the proposed paradigm provides a straightforward means to develop agent-oriented systems based on the existing object-oriented systems and offers features that are otherwise difficult to achieve in the original OO systems. The main outcome of using ARC is the achievement of adaptivity. The framework is supported by a tool that ensures agents implement up-to-date requirements from business people, reflecting desired current behaviour, without the need for frequent system rebuilds. ARC is illustrated with a rail track example.
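A hypothetical sketch of the ARC idea, using the paper's rail track setting: an agent whose behaviour is dictated by externally replaceable business rules acting on an existing OO component. The rule representation and all names are assumptions, not the ARC tool's actual API.

```python
# Hypothetical illustration of agents built from business rules over OO components.

class Track:                      # existing OO component
    def __init__(self, speed_limit):
        self.speed_limit = speed_limit

class Rule:
    """A business rule: a condition over the component plus an action to take."""
    def __init__(self, condition, action):
        self.condition, self.action = condition, action

class TrackAgent:
    """Agent wrapping an OO object; its behaviour comes from up-to-date rules."""
    def __init__(self, track, rules):
        self.track, self.rules = track, rules

    def perceive_and_act(self, observation):
        for rule in self.rules:               # rules can be replaced at run time
            if rule.condition(self.track, observation):
                rule.action(self.track, observation)

# Business rule: if heavy rain is reported, lower the speed limit on the track.
rain_rule = Rule(
    condition=lambda track, obs: obs.get("weather") == "heavy_rain",
    action=lambda track, obs: setattr(track, "speed_limit", 50),
)
agent = TrackAgent(Track(speed_limit=90), [rain_rule])
agent.perceive_and_act({"weather": "heavy_rain"})
print(agent.track.speed_limit)  # 50
```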
Abstract:
The Assessment and Action framework for looked after children, designed to improve outcomes for all children in public care and those at home on care orders, is now well established in the UK. This paper offers a critical evaluation of the framework by examining the model of childhood upon which it is premised and by exploring its relationship to children's rights as conceptualized in the United Nations Convention on the Rights of the Child (1989). It will be argued that the particular child development model which underpins the framework addresses the rights of looked after children to protection and provision but does not allow for their participation rights to be sufficiently addressed. A critical review of the research concerning the education and health of looked after children is used to illustrate these points. It will be argued that what is missing are the detailed accounts of looked after children themselves. It is concluded that there is a need for the development of additional research approaches premised upon sociological models of childhood. These would allow for a greater engagement with the participation rights of this group of children and complement the pre-existing research agenda.
Abstract:
Recently, several belief negotiation models have been introduced to deal with the problem of belief merging. A negotiation model usually consists of two functions: a negotiation function and a weakening function. A negotiation function is defined to choose the weakest sources, and these sources then weaken their point of view using a weakening function. However, the currently available belief negotiation models are based on classical logic, which makes it difficult to define weakening functions. In this paper, we define a prioritized belief negotiation model in the framework of possibilistic logic. The priority between formulae provides us with important information to decide which beliefs should be discarded. The problem of merging uncertain information from different sources is then solved in two steps. First, beliefs in the original knowledge bases are weakened to resolve inconsistencies among them. This step is based on a prioritized belief negotiation model. Second, the knowledge bases obtained in the first step are combined using a conjunctive operator which may have a reinforcement effect in possibilistic logic.
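A much-simplified sketch of the two-step process under strong assumptions: formulae are plain literals, weights are necessity degrees, and the negotiation and weakening functions shown are only one possible choice, not the paper's definitions.

```python
# Simplified sketch: possibilistic bases as {literal: necessity weight} dicts,
# literals written "p" / "~p". Negotiation and weakening choices are assumptions.

def inconsistent(bases):
    literals = {lit for base in bases for lit in base}
    return any(("~" + lit) in literals for lit in literals if not lit.startswith("~"))

def weakest_source(bases):
    # negotiation function: among non-empty sources, the one whose strongest belief is weakest
    candidates = [i for i, b in enumerate(bases) if b]
    return min(candidates, key=lambda i: max(bases[i].values()))

def weaken(base):
    # weakening function: discard the formulae of lowest priority (lowest weight)
    if not base:
        return base
    low = min(base.values())
    return {f: w for f, w in base.items() if w > low}

def negotiate_and_merge(bases):
    bases = [dict(b) for b in bases]
    while inconsistent(bases):                # step 1: weaken until consistent
        i = weakest_source(bases)
        bases[i] = weaken(bases[i])
    merged = {}                               # step 2: min-based conjunctive
    for base in bases:                        # combination: union of bases,
        for f, w in base.items():             # keeping the max weight per formula
            merged[f] = max(merged.get(f, 0.0), w)
    return merged

k1 = {"p": 0.9, "q": 0.4}
k2 = {"~q": 0.6, "r": 0.8}
print(negotiate_and_merge([k1, k2]))  # ~q is weakened away: {'p': 0.9, 'q': 0.4, 'r': 0.8}
```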
Abstract:
Use of the Dempster-Shafer (D-S) theory of evidence to deal with uncertainty in knowledge-based systems has been widely addressed. Several AI implementations have been undertaken based on the D-S theory of evidence or the extended theory. But the representation of uncertain relationships between evidence and hypothesis groups (heuristic knowledge) is still a major problem. This paper presents an approach to representing such knowledge, in which Yen’s probabilistic multi-set mappings have been extended to evidential mappings, and Shafer’s partition technique is used to get the mass function in a complex evidence space. Then, a new graphical method for describing the knowledge is introduced, which is an extension of the graphical model of Lowrance et al. Finally, an extended framework for evidential reasoning systems is specified.
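For context, the sketch below shows the standard Dempster's rule of combination for two mass functions; it illustrates the D-S machinery the paper builds on, not the proposed evidential mappings themselves.

```python
# Dempster's rule of combination for two mass functions over frozensets of
# hypotheses; a standard D-S building block, not the paper's evidential mappings.
from itertools import product

def combine(m1: dict, m2: dict) -> dict:
    """Combine two mass functions (focal set -> mass) by Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb                 # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("Totally conflicting evidence cannot be combined")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Example: two pieces of evidence about a fault lying in component A or B.
m1 = {frozenset({"A"}): 0.6, frozenset({"A", "B"}): 0.4}
m2 = {frozenset({"B"}): 0.3, frozenset({"A", "B"}): 0.7}
print(combine(m1, m2))
```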
Abstract:
Implementing effective time analysis methods quickly and accurately in the era of digital manufacturing has become a significant challenge for aerospace manufacturers hoping to build and maintain a competitive advantage. This paper proposes a structure-oriented, knowledge-based approach for intelligent time analysis of aircraft assembly processes within a digital manufacturing framework. A knowledge system is developed so that the design knowledge can be intelligently retrieved for implementing assembly time analysis automatically. A time estimation method based on MOST is reviewed and employed. Knowledge capture, transfer and storage within the digital manufacturing environment are extensively discussed. Configured plantypes, GUIs and functional modules are designed and developed for the automated time analysis. An exemplar study using an aircraft panel assembly from a regional jet is also presented. Although the method currently focuses on aircraft assembly, it can also be well utilized in other industry sectors, such as transportation, automotive and shipbuilding. The main contribution of the work is to present a methodology that facilitates the integration of time analysis with design and manufacturing using a digital manufacturing platform solution.
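A simplified sketch of a MOST-style estimate (BasicMOST general move sequence A B G A B P A, where total time in TMU is ten times the sum of the index values and 1 TMU = 0.036 s); the index values in the example are illustrative and not taken from the paper's panel assembly study.

```python
# Simplified MOST-style estimate; index values below are illustrative only.

TMU_SECONDS = 0.036                 # 1 TMU = 0.036 seconds

def general_move_time(a1, b1, g1, a2, b2, p1, a3) -> float:
    """Estimated time in seconds for one BasicMOST general-move sequence."""
    tmu = 10 * (a1 + b1 + g1 + a2 + b2 + p1 + a3)
    return tmu * TMU_SECONDS

# Example: reach, pick up a bracket, move it to the panel and position it.
print(general_move_time(a1=1, b1=0, g1=1, a2=1, b2=0, p1=3, a3=0))  # 2.16 s
```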
Abstract:
Aims: Healthcare providers are confronted with the claim that the distribution of health and healthcare provision is inherently unfair. There is also a growing awareness that the tools and methodologies applied in tackling health inequalities require further development. Evaluations as well as interventions usually focus on population-based indicators, but do not always provide guidance for frontline service evaluation and delivery. That is why the evaluation framework presented here focuses on facilitating local service development, service provider and user involvement, and the adequate representation of different population groups. Methods: A participative evaluation framework was constructed by drawing on six common success characteristics extrapolated from the published literature and policies on health inequalities. This framework was then applied to an intervention addressing women’s psychosocial health needs in order to demonstrate its utility in practice. Results: The framework provides healthcare professionals with an evidence-based tool for evaluating projects or programmes targeting health inequalities in ways that are responsive to local contexts and stakeholders. Conclusion: This participative evaluation framework supports the identification of meaningful psychosocial and contextual indicators for assessing the diverse health and social needs of service users. It uses multi-dimensional indicators to assess health and social care needs, to inform local service development, and to facilitate the exchange of knowledge between researchers, service providers, and service users. The inherent responsiveness enables rigorous yet flexible action on local health inequalities.
Abstract:
Hunter and Konieczny explored the relationships between measures of inconsistency for a belief base and the minimal inconsistent subsets of that belief base in several of their papers. In particular, an inconsistency value termed MIVC, defined from minimal inconsistent subsets, can be considered as a Shapley Inconsistency Value. Moreover, it can be axiomatized completely in terms of five simple axioms. MinInc, one of the five axioms, states that each minimal inconsistent set has the same amount of conflict. However, it conflicts with the intuition illustrated by the lottery paradox, which states that as the size of a minimal inconsistent belief base increases, the degree of inconsistency of that belief base becomes smaller. To address this, we present two kinds of revised inconsistency measures for a belief base from its minimal inconsistent subsets. Each of these measures considers the size of each minimal inconsistent subset as well as the number of minimal inconsistent subsets of a belief base. More specifically, we first present a vectorial measure to capture the inconsistency for a belief base, which is more discriminative than MIVC. Then we present a family of weighted inconsistency measures based on the vectorial inconsistency measure, which allow us to capture the inconsistency for a belief base in terms of a single numerical value as usual. We also show that each of the two kinds of revised inconsistency measures can be considered as a particular Shapley Inconsistency Value, and can be axiomatically characterized by the corresponding revised axioms presented in this paper.
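A sketch of the baseline measure the paper revises: the Shapley-style value MIV_C assigns each formula the sum of 1/|M| over the minimal inconsistent subsets M containing it. Computing the MI subsets requires a consistency check, so in this sketch they are assumed to be given; the paper's revised vectorial and weighted measures are not reproduced here.

```python
# Baseline MIV_C from pre-computed minimal inconsistent (MI) subsets.

def miv_c(formulas, minimal_inconsistent_subsets):
    """Per-formula inconsistency values: sum of 1/|M| over MI subsets containing it."""
    values = {f: 0.0 for f in formulas}
    for mi in minimal_inconsistent_subsets:
        share = 1.0 / len(mi)          # MinInc: every MI subset carries the same total conflict
        for f in mi:
            values[f] += share
    return values

# K = {p, ~p, q, ~q v ~p} has two minimal inconsistent subsets.
formulas = ["p", "~p", "q", "~q v ~p"]
mi_sets = [frozenset({"p", "~p"}), frozenset({"p", "q", "~q v ~p"})]
print(miv_c(formulas, mi_sets))
# The revised measures additionally make larger MI subsets count for less,
# in line with the lottery-paradox intuition described above.
```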
Abstract:
Summary: This article outlines a framework for approaching ethical dilemmas arising from the development, evaluation and implementation of child welfare policies. As such, it is relevant to policy-makers, social researchers and social workers. The central tenets of the framework are developed by drawing on ideas from moral philosophy and critical social theory. These ideas are presented as axioms, theorems and corollaries, a format which has been employed in the social sciences to offer a rational justification for a set of claims. • Findings: This process of reasoning leads to four principal axioms that are seen to shape the ethical scrutiny of social policy: 1) problematizing knowledge; 2) utilizing structured forms of inquiry to enhance understanding; 3) engendering enabling communication with those affected by the ethical concern; and 4) enhancing self-awareness. • Applications: The four axioms are then applied, by way of example, to the current and contentious 'third way' policy of mandated prevention in child welfare, where the aim is to obviate deleterious outcomes in later life. It is argued that the framework can be applied beyond this specific concern to other pressing ethical challenges in child welfare.
Abstract:
Relevance theory (Sperber & Wilson, 1995) suggests that people expend cognitive effort when processing information in proportion to the cognitive effects to be gained from doing so. This theory has been used to explain how people apply their knowledge appropriately when evaluating category-based inductive arguments (Medin, Coley, Storms, & Hayes, 2003). In such arguments, people are told that a property is true of premise categories and are asked to evaluate the likelihood that it is also true of conclusion categories. According to the relevance framework, reasoners generate hypotheses about the relevant relation between the categories in the argument. We reasoned that premises inconsistent with early hypotheses about the relevant relation would have greater effects than consistent premises. We designed three-premise garden-path arguments where the same 3rd premise was either consistent or inconsistent with likely hypotheses about the relevant relation. In Experiments 1 and 2, we showed that effort expended processing consistent premises (measured via reading times) was significantly less than effort expended on inconsistent premises. In Experiments 2 and 3, we demonstrated a direct relation between cognitive effect and cognitive effort. For garden-path arguments, belief change given inconsistent 3rd premises was significantly correlated with Premise 3 (Experiment 3) and conclusion (Experiments 2 and 3) reading times. For consistent arguments, the correlation between belief change and reading times did not approach significance. These results support the relevance framework for induction but are difficult to accommodate under other approaches.