9 results for Semantic case

in Aston University Research Archive


Relevance:

30.00%

Publisher:

Abstract:

We present a vision and a proposal for using Semantic Web technologies in the organic food industry. This is a very knowledge-intensive industry at every step, from the producer, to the caterer or restaurateur, through to the consumer. There is a crucial need for a concept of environmental audit which would allow the various stakeholders to know the full environmental impact of their economic choices. This is a different and parallel form of knowledge to that of price. Semantic Web technologies can be used effectively for the calculation and transfer of this type of knowledge (together with other forms of multimedia data), which could contribute considerably to the commercial and educational impact of the organic food industry. We outline how this could be achieved, as our essential objective is to show how advanced technologies could be used both to reduce ecological impact and to increase public awareness.
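
To make the notion of a machine-readable environmental audit concrete, here is a minimal sketch, assuming the Python rdflib library and a hypothetical ex: vocabulary (hasCarbonFootprintKg, transportDistanceKm, etc.) that is not prescribed by the abstract, showing how a produce batch and its impact figures could be published as linked data for any stakeholder to query.

    # Minimal sketch: publishing an environmental-audit record as RDF with rdflib.
    # The ex: vocabulary below is hypothetical and only illustrates the kind of
    # machine-readable audit data the proposal envisages.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF, XSD

    EX = Namespace("http://example.org/organic-audit#")

    g = Graph()
    g.bind("ex", EX)

    batch = URIRef("http://example.org/produce/carrot-batch-42")
    g.add((batch, RDF.type, EX.OrganicProduce))
    g.add((batch, EX.producedBy, EX.RiverFieldFarm))
    g.add((batch, EX.hasCarbonFootprintKg, Literal("0.35", datatype=XSD.decimal)))
    g.add((batch, EX.transportDistanceKm, Literal("120", datatype=XSD.decimal)))

    # Serialise so producers, caterers and consumers can all consume the same record.
    print(g.serialize(format="turtle"))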

Relevance:

30.00%

Publisher:

Abstract:

This paper presents novel data that challenge the traditional categorial understanding of the nominal phrase. The established use of an indefinite pronoun with a determiner in French (ce quelqu'un, du n'importe quoi, un je ne sais quoi) contravenes assumptions both about pronouns, which should not be embedded, and about nominal phrases, which should be headed by a noun. Analysed here for the first time, the embedding of a pronoun under a determiner is shown to find its justification in the semantic import of the construction. The anaphoric role guaranteeing referential continuity is promoted by a strong determiner; weak determiners typically contribute to constructing a designative use of the pronoun when a more precise characterisation cannot or will not be provided. An analysis of the construction within the Minimalist Programme is then presented to suggest that the phrase satisfies semantic requirements, which resolves the paradoxes of its traditional definition.

Relevance:

30.00%

Publisher:

Abstract:

What is the role of pragmatics in the evolution of grammatical paradigms? It is to maintain marked candidates that may come to be the default expression. This perspective is validated by the Jespersen cycle, where the standard expression of sentential negation is renewed as pragmatically marked negatives achieve default status. How such changes of status are effected, however, remains to be documented; this paper does so by examining the evolution of the preverbal negative non in Old and Middle French. The negative, which categorically marks pragmatic activation (Dryer 1996) with finite verbs in Old French, loses this value when used with non-finite verbs in Middle French. This process is accompanied by competing semantic reanalyses of the distribution of infinitives negated in this way, and by co-occurrence with a greater lexical variety of verbs. The absence of a pragmatic contribution should lead the marker to take on the role of default, a role already fulfilled by a well-established ne ... pas, pushing non into decline. Hard empirical evidence is thus provided for the assumed role of pragmatics in the Jespersen cycle, supporting the general view that pragmatics maintains alternative candidates which may or may not achieve default status in the evolution of a grammatical paradigm.

Relevance:

30.00%

Publisher:

Abstract:

The Semantic Web relies on carefully structured, well-defined data to allow machines to communicate and understand one another. In many domains (e.g. geospatial) the data being described contains some uncertainty, often due to incomplete knowledge; meaningful processing of this data requires these uncertainties to be carefully analysed and integrated into the process chain. Currently, within the Semantic Web there is no standard mechanism for the interoperable description and exchange of uncertain information, which renders the automated processing of such information implausible, particularly where error must be considered and captured as it propagates through a processing sequence. In particular, we adopt a Bayesian perspective and focus on the case where the inputs/outputs are naturally treated as random variables. This paper discusses a solution to the problem in the form of the Uncertainty Markup Language (UncertML). UncertML is a conceptual model, realised as an XML schema, that allows uncertainty to be quantified in a variety of ways, i.e. as realisations, statistics and probability distributions. UncertML is based upon a soft-typed XML schema design that provides a generic framework from which any statistic or distribution may be created. Making extensive use of Geography Markup Language (GML) dictionaries, UncertML provides a collection of definitions for common uncertainty types. Containing both written descriptions and mathematical functions, encoded as MathML, the definitions within these dictionaries provide a robust mechanism for defining any statistic or distribution and can be easily extended. Uniform Resource Identifiers (URIs) are used to introduce semantics to the soft-typed elements by linking to these dictionary definitions. The INTAMAP (INTeroperability and Automated MAPping) project provides a use case for UncertML. This paper demonstrates how observation errors can be quantified using UncertML and wrapped within an Observations & Measurements (O&M) Observation. The interpolation service uses the information within these observations to influence the prediction outcome. The output uncertainties may be encoded in a variety of UncertML types, e.g. a series of marginal Gaussian distributions, a set of statistics such as the first three marginal moments, or a set of realisations from a Monte Carlo treatment. Quantifying and propagating uncertainty in this way allows such interpolation results to be consumed by other services. This could form part of a risk management chain or a decision support system, and ultimately paves the way for complex data processing chains in the Semantic Web.
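
As a flavour of the encoding described, here is a minimal sketch in Python that assembles a Gaussian marginal as an UncertML-style XML fragment; the namespace URI and element names are placeholders rather than the actual UncertML schema terms, which should be taken from the published schema and its GML dictionaries.

    # Minimal sketch: a Gaussian marginal in an UncertML-style fragment.
    # The namespace URI and element names are illustrative placeholders,
    # not a verbatim copy of the UncertML schema.
    import xml.etree.ElementTree as ET

    UN = "http://example.org/uncertml-placeholder"
    ET.register_namespace("un", UN)

    gaussian = ET.Element(f"{{{UN}}}GaussianDistribution")
    ET.SubElement(gaussian, f"{{{UN}}}mean").text = "12.4"
    ET.SubElement(gaussian, f"{{{UN}}}variance").text = "0.81"

    # Such a fragment would be wrapped inside an O&M Observation result so that
    # an interpolation service like INTAMAP can propagate the uncertainty.
    print(ET.tostring(gaussian, encoding="unicode"))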

Relevance:

30.00%

Publisher:

Abstract:

This work investigates the process of selecting, extracting and reorganizing content from Semantic Web information sources to produce an ontology meeting the specifications of a particular domain and/or task. The process is combined with traditional text-based ontology learning methods to achieve tolerance to knowledge incompleteness. The paper describes the approach and presents experiments in which an ontology was built for a diet evaluation task. Although the example presented concerns the specific case of building a nutritional ontology, the methods employed are domain independent and transferable to other use cases. © 2011 ACM.
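
A minimal sketch of the selection-and-extraction step is given below, assuming Python with SPARQLWrapper and DBpedia as the Semantic Web source; it stands in for, rather than reproduces, the pipeline used in the paper, and the query and property choices are illustrative.

    # Minimal sketch of "select and extract": pulling candidate concepts for a
    # nutrition ontology from a public Semantic Web source (DBpedia). This is an
    # illustrative stand-in, not the authors' implementation.
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("https://dbpedia.org/sparql")
    sparql.setQuery("""
        PREFIX dbo:  <http://dbpedia.org/ontology/>
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT ?food ?label WHERE {
          ?food dbo:ingredient ?ingredient ;
                rdfs:label ?label .
          FILTER (lang(?label) = "en")
        } LIMIT 20
    """)
    sparql.setReturnFormat(JSON)

    results = sparql.query().convert()
    for row in results["results"]["bindings"]:
        # Each extracted concept would then be filtered and reorganized to fit
        # the diet evaluation task before being merged into the target ontology.
        print(row["food"]["value"], "-", row["label"]["value"])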

Relevance:

30.00%

Publisher:

Abstract:

Because poor-quality semantic metadata can destroy the effectiveness of Semantic Web technology by preventing applications from producing accurate results, it is important to have frameworks that support its evaluation. However, no such framework has been developed to date. In this context, we propose i) an evaluation reference model, SemRef, which sketches some fundamental principles for evaluating semantic metadata, and ii) an evaluation framework, SemEval, which provides a set of instruments to support the detection of quality problems and the collection of quality metrics for these problems. A preliminary case study of SemEval shows encouraging results.
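
To illustrate the kind of instrument such a framework provides, here is a minimal sketch of a single quality metric (labelling completeness over an RDF metadata file), assuming Python with rdflib and a hypothetical metadata.ttl input; it is an illustration only and not part of SemRef or SemEval.

    # Minimal sketch of one quality check: how many annotated resources lack an
    # rdfs:label. A single illustrative metric, not the SemRef/SemEval framework.
    from rdflib import Graph
    from rdflib.namespace import RDF, RDFS

    g = Graph()
    g.parse("metadata.ttl", format="turtle")  # hypothetical metadata file

    subjects = set(g.subjects(RDF.type, None))
    unlabelled = [s for s in subjects if (s, RDFS.label, None) not in g]

    completeness = 1 - len(unlabelled) / len(subjects) if subjects else 1.0
    print(f"{len(unlabelled)} of {len(subjects)} typed resources have no rdfs:label "
          f"(labelling completeness = {completeness:.2%})")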

Relevance:

30.00%

Publisher:

Abstract:

The field of Semantic Web Services (SWS) has been recognized as one of the most promising areas of emergent research within the Semantic Web (SW) initiative, exhibiting extensive commercial potential and attracting significant attention from both industry and the research community. Currently, there exist several different frameworks and languages for formally describing a Web Service: OWL-S (Web Ontology Language for Services), WSMO (Web Service Modeling Ontology) and SAWSDL (Semantic Annotations for the Web Services Description Language) are the most important approaches. To the inexperienced user, choosing the appropriate paradigm for a specific SWS application may prove challenging, given the lack of clear separation between the ideas promoted by the associated research communities. In this paper, we systematically compare OWL-S, WSMO and SAWSDL from various standpoints, namely those of the service requester and provider as well as the broker-based view. The comparison is meant to help users better understand the strengths and limitations of these different approaches to formalising SWS, and to choose the most suitable solution for a given use case. © 2013 IEEE.
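
To give a flavour of the differences compared, the sketch below shows the lightweight SAWSDL style, in which a semantic annotation is attached to an existing WSDL/XML Schema element through the sawsdl:modelReference attribute; the referenced ontology URI is hypothetical, and the fragment is illustrative rather than drawn from the paper.

    # Minimal sketch: a SAWSDL-style annotation on a WSDL schema element.
    # The referenced ontology URI is hypothetical.
    annotated_wsdl_fragment = """
    <xs:element name="OrderRequest"
                xmlns:xs="http://www.w3.org/2001/XMLSchema"
                xmlns:sawsdl="http://www.w3.org/ns/sawsdl"
                sawsdl:modelReference="http://example.org/purchasing#PurchaseOrder"/>
    """

    # OWL-S and WSMO, by contrast, describe the whole service (profile/capability,
    # process/interface, grounding) in dedicated ontologies rather than annotating
    # WSDL in place, which is the key trade-off the comparison draws out.
    print(annotated_wsdl_fragment.strip())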

Relevance:

30.00%

Publisher:

Abstract:

We propose a description logic extending SROIQ (the description logic underlying OWL 2 DL) and at the same time encompassing some of the most prominent monotonic and nonmonotonic rule languages, in particular Datalog extended with the answer set semantics. Our proposal could be considered a substantial contribution towards fulfilling the quest for a unifying logic for the Semantic Web. As a case in point, two nonmonotonic extensions of description logics considered to be of distinct expressiveness until now are covered in our proposal. In contrast to earlier such proposals, our language has the "look and feel" of a description logic and avoids hybrid or first-order syntaxes. © 2012 The Author(s).
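
The kind of interaction such a unifying logic must capture can be shown with the standard default-reasoning example below, written in conventional description-logic and answer-set Datalog notation; the notation is illustrative and not taken from the paper.

    % Standard example, not the paper's own syntax: birds fly by default,
    % penguins are birds that are abnormal with respect to flying.
    \begin{align*}
      \mathit{Penguin} &\sqsubseteq \mathit{Bird} \\
      \mathit{Penguin} &\sqsubseteq \mathit{Abnormal} \\
      \mathit{Flies}(x) &\leftarrow \mathit{Bird}(x),\ \mathit{not}\ \mathit{Abnormal}(x)
    \end{align*}

The first two statements are ordinary monotonic DL axioms; the third is a Datalog rule under the answer set semantics, so a known penguin is not concluded to fly while an otherwise unqualified bird is. A unifying language must assign one coherent semantics to a knowledge base containing both kinds of statement.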

Relevance:

30.00%

Publisher:

Abstract:

Wireless Sensor Network (WSN) systems have become increasingly popular in modern life. They have been widely used in many areas, such as smart homes/buildings, context-aware devices and military applications. Despite this increasing usage, there is a lack of formal description and automated verification for WSN system design. In this paper, we present an approach that supports the rigorous verification of WSN modeling using Semantic Web technology. We use the Web Ontology Language (OWL) and the Semantic Web Rule Language (SWRL) to define a meta-ontology for the modeling of WSN systems. Furthermore, we apply ontology reasoners to perform automated verification on customized WSN models and their instances. We demonstrate and evaluate our approach through a Light Control System (LCS) as a case study.
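
A minimal sketch of this style of verification is shown below, assuming Python with the owlready2 library; the classes, property, SWRL rule and individuals are invented for illustration and are not the meta-ontology used for the Light Control System case study.

    # Minimal sketch: a tiny WSN ontology plus one SWRL rule, checked with an
    # ontology reasoner via owlready2. All names and the rule are illustrative,
    # not the paper's meta-ontology.
    from owlready2 import Thing, ObjectProperty, Imp, get_ontology, sync_reasoner_pellet

    onto = get_ontology("http://example.org/wsn-demo.owl")

    with onto:
        class Room(Thing): pass
        class MonitoredRoom(Room): pass
        class Sensor(Thing): pass
        class locatedIn(ObjectProperty):
            domain = [Sensor]
            range = [Room]

        # SWRL rule: any room that hosts a sensor counts as a monitored room.
        rule = Imp()
        rule.set_as_rule("Sensor(?s), locatedIn(?s, ?r) -> MonitoredRoom(?r)")

        office = Room("office_101")
        light_sensor = Sensor("light_sensor_1", locatedIn=[office])

    # Run the reasoner (requires Java); modelling errors would surface here as
    # inconsistencies or missing inferred classifications.
    sync_reasoner_pellet(infer_property_values=True)
    print(office.__class__)  # expected to be MonitoredRoom after reasoning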