944 results for maximal ontological completeness
Abstract:
Automatic indexing and retrieval of digital data poses major challenges. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieving such data based on semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. For a number of years, research has been ongoing in the field of ontological engineering with the aim of using ontologies to add such (meta) knowledge to information. In this paper, we describe the architecture of a system, Dynamic REtrieval Analysis and semantic metadata Management (DREAM), designed to automatically and intelligently index huge repositories of special effects video clips based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval. The DREAM Demonstrator has been evaluated as deployed in the film post-production phase, an exemplar application domain, to support the storage, indexing and retrieval of large data sets of special effects video clips. This paper presents its performance and usability results and highlights the scope for future enhancements of the DREAM architecture, which has proven successful in its first and possibly most challenging proving ground, namely film production, where it is already in routine use within our test-bed partners' creative processes.
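The abstract does not give implementation details, but the idea of indexing by semantic content rather than keywords can be illustrated with a minimal sketch: a toy "is-a" concept network is used so that a broad query also retrieves clips annotated with narrower concepts. The concept names, clip identifiers and data structures below are assumptions for illustration, not the DREAM API.

# Minimal sketch of ontology-backed indexing and retrieval, assuming a toy
# concept hierarchy; names and structure are illustrative, not the DREAM API.
from collections import defaultdict

# Hypothetical fragment of a concept network: child -> parent ("is-a") links.
IS_A = {
    "pyrotechnics": "special_effect",
    "explosion": "pyrotechnics",
    "smoke": "pyrotechnics",
    "miniature": "special_effect",
}

def ancestors(concept):
    """Return the concept plus all of its ancestors in the hierarchy."""
    seen = [concept]
    while concept in IS_A:
        concept = IS_A[concept]
        seen.append(concept)
    return seen

# Index: every clip is reachable from each annotated concept and its ancestors,
# so a broad query such as "pyrotechnics" also retrieves "explosion" clips.
index = defaultdict(set)

def index_clip(clip_id, concepts):
    for c in concepts:
        for a in ancestors(c):
            index[a].add(clip_id)

def retrieve(query_concept):
    return sorted(index.get(query_concept, set()))

index_clip("clip_001.mov", ["explosion"])
index_clip("clip_002.mov", ["smoke"])
index_clip("clip_003.mov", ["miniature"])

print(retrieve("pyrotechnics"))    # ['clip_001.mov', 'clip_002.mov']
print(retrieve("special_effect"))  # all three clips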
Abstract:
Information provision that addresses changing requirements can best be supported by content management. Current information technology enables information to be stored in and provided from various distributed sources. Identifying and retrieving relevant information requires effective mechanisms for information discovery and assembly. This paper presents a method that enables the design of such mechanisms, with a set of techniques for articulating and profiling users' requirements, formulating information provision specifications, managing information content in repositories, and responding dynamically to users' requirements during the process of knowledge construction. These functions are represented in an ontology that integrates the capabilities of the mechanisms. The ontological modelling in this paper adopts semiotic principles with embedded norms to ensure a coherent course of action represented in these mechanisms.
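The paper's ontology embeds norms in the organisational-semiotics sense; a common way to phrase such a norm is "whenever <condition>, then <agent> is <obliged/permitted/prohibited> to <action>". The sketch below shows one possible encoding of that pattern; the field names and the example norm are illustrative assumptions, not the paper's actual model.

# Illustrative sketch only: one way to encode a behavioural norm of the form
# "whenever <condition> then <agent> is <deontic> to <action>"; the field names
# and the example norm are assumptions, not the paper's actual ontology.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Norm:
    condition: Callable[[dict], bool]  # predicate over the current context
    agent: str                         # responsible role
    deontic: str                       # "obliged" / "permitted" / "prohibited"
    action: str                        # action to trigger

norms = [
    Norm(lambda ctx: ctx.get("profile_updated", False),
         agent="content_manager",
         deontic="obliged",
         action="refresh_information_provision"),
]

def applicable_actions(context):
    """Return the actions whose norms fire in the given context."""
    return [(n.agent, n.deontic, n.action) for n in norms if n.condition(context)]

print(applicable_actions({"profile_updated": True}))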
Abstract:
The knowledge economy offers a broad and diverse community of information systems users the opportunity to efficiently gain information and know-how for improving qualifications and enhancing productivity in the workplace. Such demand will continue, and users will frequently require optimised and personalised information content. The advancement of information technology and the wide dissemination of information support individual users in constructing new knowledge from their experience in real-world contexts. However, designing personalised information provision is challenging because users' requirements and information provision specifications are complex to represent. Existing methods cannot effectively support this analysis process. This paper presents a mechanism which can holistically facilitate customisation of information provision based on individual users' goals, level of knowledge and cognitive style preferences. An ontology model with embedded norms represents the domain knowledge of information provision in a specific context where users' needs can be articulated and represented in a user profile. These formal requirements can then be transformed into information provision specifications, which are used to discover suitable information content from repositories and pedagogically organise the selected content to meet the users' needs. The method provides adaptability, enabling an appropriate response to changes in users' requirements during the process of acquiring knowledge and skills.
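As a rough illustration of matching content to a profile of goals, knowledge level and cognitive style preferences, the following sketch scores repository items against a user profile. The profile fields, repository entries and weighting are invented for illustration and are not the mechanism described in the paper.

# Minimal, assumption-laden sketch of matching content items to a user profile
# (goal, knowledge level, cognitive style); the scoring scheme is illustrative only.
profile = {"goal": "project_management", "level": "beginner", "style": "visual"}

repository = [
    {"id": "doc1", "topic": "project_management", "level": "beginner", "style": "visual"},
    {"id": "doc2", "topic": "project_management", "level": "advanced", "style": "verbal"},
    {"id": "doc3", "topic": "accounting",         "level": "beginner", "style": "visual"},
]

def score(item, profile):
    s = 0
    s += 2 if item["topic"] == profile["goal"] else 0   # goal match weighted highest
    s += 1 if item["level"] == profile["level"] else 0
    s += 1 if item["style"] == profile["style"] else 0
    return s

ranked = sorted(repository, key=lambda item: score(item, profile), reverse=True)
print([item["id"] for item in ranked])  # doc1 ranked first for this profile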
Abstract:
In evaluating an interconnection network, it is indispensable to estimate the size of the maximal connected components of the underlying graph when the network begins to lose processors. The hypercube is one of the most popular interconnection networks. This article addresses the maximal connected components of an n-dimensional cube with faulty processors. We first prove that an n-cube with a set F of at most 2n - 3 failing processors has a component of size at least 2^n - |F| - 1. We then prove that an n-cube with a set F of at most 3n - 6 missing processors has a component of size at least 2^n - |F| - 2.
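The first bound can be checked exhaustively for a small cube. The brute-force sketch below (not the paper's proof technique) enumerates every fault set of size 2n - 3 in Q_4 and confirms that the largest surviving component has at least 2^n - |F| - 1 vertices.

# Brute-force check of the stated bound for a small hypercube (not the paper's
# proof): remove every fault set F with |F| = 2n-3 from Q_n and verify that the
# largest surviving component has at least 2**n - |F| - 1 vertices.
from itertools import combinations

def largest_component(n, faults):
    alive = set(range(2 ** n)) - set(faults)
    best, seen = 0, set()
    for start in alive:
        if start in seen:
            continue
        stack, comp = [start], 0
        seen.add(start)
        while stack:
            v = stack.pop()
            comp += 1
            for i in range(n):           # neighbours differ in exactly one bit
                u = v ^ (1 << i)
                if u in alive and u not in seen:
                    seen.add(u)
                    stack.append(u)
        best = max(best, comp)
    return best

n = 4
k = 2 * n - 3                            # 5 faults, the largest size covered here
assert all(largest_component(n, F) >= 2 ** n - k - 1
           for F in combinations(range(2 ** n), k))
print("bound 2^n - |F| - 1 holds for every fault set of size", k, "in Q_4")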
Abstract:
In evaluating the fault tolerance of an interconnection network, it is essential to estimate the size of a maximal connected component of the network in the presence of faulty processors. The hypercube is one of the most popular interconnection networks. In this paper, we prove that for n ≥ 6, an n-dimensional cube with a set F of at most 4n - 10 failing processors has a component of size at least 2^n - |F| - 3. This result demonstrates the superiority of the hypercube in terms of fault tolerance.
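As a concrete instance of this bound: for n = 6 the theorem permits up to 4n - 10 = 14 faulty processors, and the surviving network still contains a connected component of size at least 2^6 - 14 - 3 = 47; that is, of the 50 fault-free processors, at most three can be separated from the main component.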
Abstract:
The hypercube is one of the most popular topologies for connecting processors in multicomputer systems. In this paper, we address the maximum order of a connected component in a faulty cube. The results established include several known conclusions as special cases. We conclude that the hypercube structure is resilient, as it includes a large connected component in the presence of a large number of faulty vertices.
Abstract:
Business process modelling can help an organisation better understand and improve its business processes. Most business process modelling methods adopt a task- or activity-based approach to identifying business processes. Within our work, we use activity theory to categorise elements within organisations as human beings, activities or artefacts. Due to the direct relationship between these three elements, an artefact-oriented approach to organisation analysis emerges. Organisational semiotics highlights the ontological dependency between affordances within an organisation. We analyse the ontological dependency between organisational elements and thereby produce the ontology chart for artefact-oriented business process modelling, in order to clarify the relationships between the elements of an organisation. Furthermore, we adopt the techniques of semantic analysis and norm analysis, from organisational semiotics, to develop the artefact-oriented method for business process modelling. The proposed method provides a novel perspective for identifying and analysing business processes, as well as agents and artefacts, as the artefact-oriented perspective demonstrates the fundamental flow of an organisation. The modelling results enable an organisation to understand and model its processes from an artefact perspective, viewing the organisation as a network of artefacts. The information and practice captured and stored in artefacts can also be shared and reused between organisations that produce similar artefacts.
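A small sketch may help make "ontological dependency" concrete: in an ontology chart, an affordance exists only while its antecedent exists, so removing an antecedent invalidates everything that depends on it. The chart fragment below (organisation, then agent/artefact, then activity) is an assumed example, not taken from the paper.

# Illustrative sketch of ontological dependency from an ontology chart: an
# affordance exists only while its antecedent exists. The chart fragment below
# is an assumed example, not taken from the paper.
DEPENDS_ON = {
    "organisation": None,
    "employee": "organisation",      # an agent exists within the organisation
    "report": "organisation",        # an artefact produced by the organisation
    "review_report": "report",       # the activity depends on the artefact
}

def exists(affordance, removed=frozenset()):
    """An affordance exists only if it and all its antecedents are present."""
    while affordance is not None:
        if affordance in removed:
            return False
        affordance = DEPENDS_ON[affordance]
    return True

print(exists("review_report"))                      # True
print(exists("review_report", removed={"report"}))  # False: antecedent gone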
Abstract:
Resetting of previously accumulated optically stimulated luminescence (OSL) signals during transport of sediment is a fundamental requirement for reliable optical dating. The completeness of optical resetting of 46 modern-age quartz samples from a variety of depositional environments was examined. All equivalent dose (De) estimates were …, with the majority of aeolian samples … and fluvial samples …. The OSL signal of quartz originates from several trap types with different rates of charge loss during illumination. As such, incomplete bleaching may be identifiable as an increase in De from easy-to-bleach through to hard-to-bleach components. For all modern fluvial samples with non-zero De values, SAR De(t) analysis and component-resolved linearly modulated OSL (LM OSL) De estimates showed this to be the case, implying incomplete resetting of previously accumulated charge. LM OSL measurements were also made to investigate the extent of bleaching of the slow components in the natural environment. In the aeolian sediments examined, the natural LM OSL was effectively zero (i.e. all components were fully reset). The slow components of modern fluvial samples displayed measurable residual signals of up to 15 Gy.
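The diagnostic described above (De rising from easy-to-bleach to hard-to-bleach components signals incomplete resetting) can be sketched as a simple monotonicity check; the component values below are invented for illustration, not measurements from the study.

# Sketch of the diagnostic described above, with invented numbers: if the
# equivalent dose rises monotonically from the easy-to-bleach (fast) component
# to the hard-to-bleach (slow) components, incomplete resetting is suspected.
def incompletely_bleached(de_by_component):
    """de_by_component: De estimates ordered from easiest- to hardest-to-bleach."""
    return all(a < b for a, b in zip(de_by_component, de_by_component[1:]))

fluvial_sample = [0.4, 1.8, 6.5]   # Gy; illustrative fast/medium/slow values
aeolian_sample = [0.0, 0.0, 0.0]   # fully reset: all components near zero

print(incompletely_bleached(fluvial_sample))  # True  -> suspect residual dose
print(incompletely_bleached(aeolian_sample))  # False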