71 results for Mathematical ontology
Abstract:
A theoretical and numerical framework for modelling the foundations of marine offshore structures is presented. The theoretical model consists of a system of partial differential equations describing the coupling between the seabed solid skeleton and the pore fluids (water, air, oil, …), combined with a system of ordinary differential equations describing the specific constitutive relation of the seabed soil skeleton. Once the theoretical model is described, the finite element procedure used to obtain an approximate solution of the governing equations is outlined. To validate the proposed theoretical and numerical framework, the seaward tilt mechanism induced by the action of breaking waves on a vertical breakwater is reproduced numerically. The numerical results agree with the main conclusions drawn from the literature on this failure mechanism.
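For illustration only, one widely used form of such a coupled system is a Biot-type u–p formulation (an assumption here, not necessarily the exact equations of the paper), in which the skeleton displacement u and the pore pressure p are the primary unknowns and the constitutive response is integrated as ODEs at each material point:

\[ \nabla \cdot \left( \boldsymbol{\sigma}' - \alpha\, p\, \mathbf{I} \right) + \rho\, \mathbf{b} = \mathbf{0} \]
\[ \alpha\, \nabla \cdot \dot{\mathbf{u}} + \frac{\dot{p}}{Q} - \nabla \cdot \left[ \frac{k}{\gamma_w} \left( \nabla p - \rho_f\, \mathbf{b} \right) \right] = 0 \]
\[ \dot{\boldsymbol{\sigma}}' = \mathbf{D}\left( \boldsymbol{\sigma}', \boldsymbol{\varepsilon} \right) : \dot{\boldsymbol{\varepsilon}} \]

where σ′ is the effective stress, α and Q are the Biot coupling and storage coefficients, k the permeability, γ_w the fluid unit weight, b the body force, and D the (generally nonlinear) constitutive tangent. After spatial discretization by finite elements, the PDEs become an algebraic-differential system that is integrated in time together with these constitutive ODEs.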
Abstract:
The application of methodologies for building ontologies can improve ontology quality. However, such quality is not guaranteed because of the difficulties involved in ontology modelling. These difficulties are related to the inclusion of anomalies or bad practices during ontology development. In this context, our aim is to describe OOPS! (OntOlogy Pitfall Scanner!), a tool for detecting pitfalls in ontologies.
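As a minimal illustration of the idea of automated pitfall scanning (not OOPS!'s actual rule set or API; the chosen pitfall and all names are assumptions), the sketch below flags OWL classes that lack an rdfs:label, a typical missing-annotation pitfall:

from rdflib import Graph
from rdflib.namespace import OWL, RDF, RDFS

def classes_missing_labels(ontology_path: str) -> list:
    """Return the URIs of owl:Class instances that have no rdfs:label."""
    g = Graph()
    g.parse(ontology_path)  # format inferred from the file extension
    return [
        cls
        for cls in g.subjects(RDF.type, OWL.Class)
        if g.value(cls, RDFS.label) is None
    ]

# Example usage (hypothetical file name):
# for cls in classes_missing_labels("my_ontology.owl"):
#     print("Missing label:", cls)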
Abstract:
Autonomous systems are systems capable of operating in a real-world environment without any form of external control for extended periods of time. Autonomy is a desired goal for every system, as it improves its performance, safety, and profitability. Ontologies are a way to conceptualize the knowledge of a specific domain. In this paper, an ontology for the description of autonomous systems, as well as for their development (engineering), is presented and applied to a process. This ontology is intended to be applied and used to generate final applications following a model-driven methodology.
Abstract:
This paper presents an ontology-based multi-technology platform as part of an open energy management system, which also comprises a wireless transducer network for control and monitoring. The platform allows the integration of several building automation protocols, eases the development and implementation of different kinds of services, and allows a building's data to be shared. The system has been implemented and tested in the Energy Efficiency Research Facility at CeDInt-UPM.
Abstract:
In the context of the Semantic Web, resources on the net can be enriched with well-defined, machine-understandable metadata describing their associated conceptual meaning. These metadata, consisting of natural language descriptions of concepts, are the focus of the activity we describe in this chapter, namely ontology localization. In the framework of the NeOn Methodology, ontology localization is defined as the activity of adapting an ontology to a particular language and culture. This adaptation mainly involves the translation of the natural language descriptions of the ontology from a source natural language to a target natural language, with the final objective of obtaining a multilingual ontology, that is, an ontology documented in several natural languages. The purpose of this chapter is to provide detailed and prescriptive methodological guidelines to support the performance of this activity.
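As a toy illustration of the outcome of localization, i.e., an ontology documented in several natural languages (the namespace and class name below are invented for the example), language-tagged labels can be attached to the same concept:

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDFS

EX = Namespace("http://example.org/ontology#")  # hypothetical namespace

g = Graph()
# One concept, documented in three natural languages via language tags.
g.add((EX.River, RDFS.label, Literal("river", lang="en")))
g.add((EX.River, RDFS.label, Literal("río", lang="es")))
g.add((EX.River, RDFS.label, Literal("rio", lang="pt")))

print(g.serialize(format="turtle"))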
Abstract:
Interoperability on multiple levels, concerning both the ontologies themselves and their engineering activities, is a key requirement for ontology networks to be efficient, with minimal redundancy and high reuse. This requirement is strictly binding for software tools that can support some of these interoperability levels, yet such tools can be hindered by a lack of shared models and vocabularies describing the resources to be handled, as well as the ways of handling them. Here, three metalevel vocabularies are proposed, each covering at least one specific interoperability aspect: OMV for modeling the artifacts themselves, LIR for managing a multilingual layer on top of them, and C-ODO Light for modeling collaboration-supportive life cycle management tasks and processes. All of these models lend themselves to handling by dedicated software tools and are all being employed within NeOn products.
Abstract:
While ontology engineering is rapidly entering the mainstream, expert ontology engineers are a scarce resource. Hence, there is a need for practical methodologies and technologies, which can assist a variety of user types with ontology development tasks. To address this need, this book presents a scenario-based methodology, the NeOn Methodology, which provides guidance for all main activities in ontology engineering. The context in which we consider these activities is that of a networked world, where reuse of existing resources is commonplace, ontologies are developed collaboratively, and managing relationships between ontologies becomes an essential aspect of the ontological engineering process. The description of both the methodology and the ontology engineering activities is grounded in a comprehensive software environment, the NeOn Toolkit and its plugins, which provides integrated support for all the activities described in the book. Here we provide an introduction for the whole book, while the rest of the content is organized into 4 parts: (1) the NeOn Methodology Framework, (2) the set of ontology engineering activities, (3) the NeOn Toolkit and plugins, and (4) three use cases. Primary goals of this book are (a) to disseminate the results from the NeOn project in a structured and comprehensive form, (b) to make it easier for students and practitioners to adopt ontology engineering methods and tools, and (c) to provide a textbook for undergraduate and postgraduate courses on ontology engineering.
Abstract:
One of the major problems related to cancer treatment is its recurrence. Without knowing in advance how likely the cancer is to relapse, clinical practice usually recommends adjuvant treatments that have strong side effects. A way to optimize treatments is to predict the recurrence probability by analyzing a set of biomarkers. The NeoMark European project has identified a set of preliminary biomarkers for the case of oral cancer by collecting a large series of data from genomic, imaging, and clinical evidence. This heterogeneous set of data needs a proper representation in order to be stored, computed, and communicated efficiently. Ontologies are often considered the proper means of integrating biomedical data, owing to their high level of formality and to the need for interoperable, universally accepted models. This paper presents the NeoMark system and how an ontology has been designed to integrate all its heterogeneous data. The system has been validated in a pilot in which data will populate the ontology and will be made public for further research.
Abstract:
This chapter presents methodological guidelines that allow engineers to reuse generic ontologies. This kind of ontology represents notions that are generic across many fields (is part of, temporal interval, etc.). The guidelines help the developer (a) to identify the type of generic ontology to be reused, (b) to find the axioms and definitions that should be reused, and (c) to adapt and integrate the selected generic ontology into the domain ontology to be developed. For each task of the methodology, a set of heuristics with examples is presented. We hope that, after reading this chapter, you will have acquired some basic ideas on how to take advantage of the great deal of well-founded explicit knowledge that formalizes generic notions such as time concepts and the part-of relation.
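As a minimal sketch of the integration step, i.e., declaring that a domain ontology reuses a generic one (the domain ontology URI below is invented; the W3C Time ontology is used purely as an example of a generic ontology of temporal notions), the reuse can be stated with owl:imports:

from rdflib import Graph, URIRef
from rdflib.namespace import OWL, RDF

DOMAIN_ONT = URIRef("http://example.org/my-domain-ontology")  # hypothetical domain ontology
TIME_ONT = URIRef("http://www.w3.org/2006/time")              # W3C Time ontology (generic temporal notions)

g = Graph()
g.add((DOMAIN_ONT, RDF.type, OWL.Ontology))
g.add((DOMAIN_ONT, OWL.imports, TIME_ONT))  # reuse the generic ontology's axioms and definitions

print(g.serialize(format="turtle"))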
Abstract:
The goal of the ontology requirements specification activity is to state why the ontology is being built, what its intended uses are, who the end users are, and which requirements the ontology should fulfill. This chapter presents detailed methodological guidelines for specifying ontology requirements efficiently. These guidelines will help ontology engineers capture ontology requirements and produce the ontology requirements specification document (ORSD). The ORSD plays a key role during the ontology development process because it facilitates, among other activities, (1) the search for and reuse of existing knowledge resources with the aim of reengineering them into ontologies, (2) the search for and reuse of ontological resources (ontologies, ontology modules, ontology statements, as well as ontology design patterns), and (3) the verification of the ontology throughout its development.
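As a rough illustration of the kind of information an ORSD gathers (the fields below merely paraphrase the items listed in the abstract; the example domain and competency questions are invented, and this is not the official NeOn template), here is a minimal sketch:

# Minimal, illustrative ORSD-like structure.
orsd = {
    "purpose": "State why the ontology is being built",  # e.g., integrate building sensor metadata
    "intended_uses": [
        "Use case 1: annotation of sensor data",
        "Use case 2: search over annotated data",
    ],
    "intended_end_users": ["Domain experts", "Application developers"],
    "functional_requirements": [  # often expressed as competency questions
        "Which sensors are installed in building X?",
        "Which measurements did sensor Y produce on a given day?",
    ],
    "non_functional_requirements": [
        "The ontology must be written in OWL 2",
        "Labels must be provided in English and Spanish",
    ],
}

for cq in orsd["functional_requirements"]:
    print("Competency question:", cq)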
Abstract:
To manage ontology development projects properly in complex settings and to apply the NeOn Methodology correctly, it is crucial to have knowledge of the entire ontology development life cycle before starting the project. The ontology project plan and schedule help the ontology development team acquire this knowledge and monitor the project's execution. To facilitate the planning and scheduling of ontology development projects, a NeOn Toolkit plugin called gOntt has been developed. gOntt supports the scheduling of ontology network development projects and helps to execute them. In addition, prescriptive methodological guidelines for scheduling ontology development projects using gOntt are provided.
Abstract:
In contrast to other approaches that provide methodological guidance for ontology engineering, the NeOn Methodology does not prescribe a rigid workflow, but instead it suggests a variety of pathways for developing ontologies. The nine scenarios proposed in the methodology cover commonly occurring situations, for example, when available ontologies need to be re-engineered, aligned, modularized, localized to support different languages and cultures, and integrated with ontology design patterns and non-ontological resources, such as folksonomies or thesauri. In addition, the NeOn Methodology framework provides (a) a glossary of processes and activities involved in the development of ontologies, (b) two ontology life cycle models, and (c) a set of methodological guidelines for different processes and activities, which are described (a) functionally, in terms of goals, inputs, outputs, and relevant constraints; (b) procedurally, by means of workflow specifications; and (c) empirically, through a set of illustrative examples.
Abstract:
Provenance is key to describing the evolution of a resource, the entity responsible for its changes, and how these changes affect its final state. A proper description of a resource's provenance shows who it is attributed to and can help in deciding whether or not it can be trusted. This tutorial will provide an overview of the W3C PROV data model and its serialization as an OWL ontology. The tutorial will incrementally explain the features of the PROV data model, from the core starting terms to the most complex concepts. Finally, the tutorial will show the relation between PROV-O and the Dublin Core Metadata terms.
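As a small sketch of the core PROV terms mentioned above (the resources ex:report_v1, ex:report_v2, ex:revision, and ex:alice are invented for the example), a provenance graph can be built and serialized with rdflib; the last triple hints at the PROV–Dublin Core relation by pairing prov:wasAttributedTo with dct:creator:

from rdflib import Graph, Namespace
from rdflib.namespace import DCTERMS, RDF

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/")

g = Graph()
g.bind("prov", PROV)
g.bind("dct", DCTERMS)
g.bind("ex", EX)

# Core PROV classes: Entity, Activity, Agent.
g.add((EX.report_v2, RDF.type, PROV.Entity))
g.add((EX.revision, RDF.type, PROV.Activity))
g.add((EX.alice, RDF.type, PROV.Agent))

# Core PROV relations among them.
g.add((EX.report_v2, PROV.wasGeneratedBy, EX.revision))
g.add((EX.revision, PROV.used, EX.report_v1))
g.add((EX.report_v2, PROV.wasDerivedFrom, EX.report_v1))
g.add((EX.report_v2, PROV.wasAttributedTo, EX.alice))

# Rough Dublin Core counterpart of attribution.
g.add((EX.report_v2, DCTERMS.creator, EX.alice))

print(g.serialize(format="turtle"))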
Abstract:
As with all forms of transport, the safety of air travel is of paramount importance. With the projected increases in European air traffic in the next decade and beyond, it is clear that the risk of accidents needs to be assessed and carefully monitored on a continuing basis. The present thesis is aimed at the development of a comprehensive collision risk model as a method of assessing the European en-route risk, due to all causes and across all dimensions within the airspace. The major constraint in developing appropriate monitoring methodologies and tools to assess the level of safety in en-route airspaces, where controllers monitor air traffic by means of radar surveillance and provide aircraft with tactical instructions, lies in the estimation of the operational risk. The operational risk estimate normally relies on incident reports provided by the air navigation service providers (ANSPs). This thesis proposes a new and innovative approach to assessing aircraft safety levels based exclusively on the processing and analysis of radar tracks. The proposed methodology has been designed to complement the information collected in accident and incident databases by providing robust information on air traffic factors and safety metrics inferred from the in-depth assessment of proximate events.
The 3-D CRM methodology is implemented in a prototype tool in MATLAB in order to automatically analyze recorded aircraft tracks and flight plan data from the Radar Data Processing systems (RDP) and to identify and analyze all proximate events (conflicts, potential conflicts, and potential collisions) within a given time span and volume of airspace. Currently, the 3-D CRM prototype is being adapted and integrated into Aena's Performance Monitoring Tool (PERSEO) to complement the information provided by the ATM accident and incident databases and to enhance the monitoring of, and evidence for, safety levels.
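A minimal sketch of the kind of proximity screening such a prototype performs is shown below; the data fields, separation thresholds, and function names are illustrative assumptions, not the thesis's 3-D CRM algorithm:

import math
from dataclasses import dataclass

# Illustrative en-route radar separation minima (assumed values, not the thesis's parameters).
HORIZONTAL_NM = 5.0    # nautical miles
VERTICAL_FT = 1000.0   # feet

@dataclass
class RadarPlot:
    t: float       # time of the plot, seconds
    x_nm: float    # horizontal position, nautical miles
    y_nm: float
    alt_ft: float  # altitude, feet

def is_proximate(a: RadarPlot, b: RadarPlot) -> bool:
    """True when both horizontal and vertical separation are below the minima."""
    horizontal = math.hypot(a.x_nm - b.x_nm, a.y_nm - b.y_nm)
    vertical = abs(a.alt_ft - b.alt_ft)
    return horizontal < HORIZONTAL_NM and vertical < VERTICAL_FT

def proximate_instants(track_a, track_b):
    """Scan two time-aligned tracks and return the instants at which the pair was proximate."""
    return [a.t for a, b in zip(track_a, track_b) if is_proximate(a, b)]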
Abstract:
Satellites and space equipment are exposed to diffuse acoustic fields during launch. Using adequate techniques to model the response to these acoustic loads is a fundamental task during the design and verification phases, and the modal density of each element must be considered in order to identify the correct methodology. In this report, selection criteria are presented for choosing the correct modelling technique depending on the frequency range. The response of a model satellite to acoustic loads is presented, determining the modal densities of each component in different frequency ranges. The paper proposes how to select the mathematical method for each modal-density range and discusses the differences in the response estimates obtained with the different techniques. In addition, methodologies for analysing the intermediate frequency range of the system are discussed. The results are compared with data obtained in an experimental modal test.
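For illustration, the sketch below computes the (frequency-independent) modal density of a thin flat plate, n(f) = sqrt(3)·A/(c_L·h), counts the expected modes in a band, and applies a simple selection rule; the thresholds and the rule itself are assumptions for the example, not the criteria proposed in the paper:

import math

def plate_modal_density(area_m2: float, thickness_m: float,
                        E_pa: float, rho_kg_m3: float, nu: float) -> float:
    """Modal density n(f) [modes/Hz] of a thin flat plate:
    n(f) = sqrt(3) * A / (c_L * h), with c_L = sqrt(E / (rho * (1 - nu^2)))
    the longitudinal wave speed in the plate."""
    c_L = math.sqrt(E_pa / (rho_kg_m3 * (1.0 - nu ** 2)))
    return math.sqrt(3.0) * area_m2 / (c_L * thickness_m)

def modes_in_band(n_f: float, f_lo_hz: float, f_hi_hz: float) -> float:
    """Expected number of modes in the frequency band [f_lo_hz, f_hi_hz]."""
    return n_f * (f_hi_hz - f_lo_hz)

def suggest_method(n_modes: float, low: float = 5.0, high: float = 30.0) -> str:
    """Illustrative rule: deterministic methods at low modal density,
    statistical energy analysis (SEA) at high modal density, hybrid in between."""
    if n_modes < low:
        return "deterministic (FEM/BEM)"
    if n_modes > high:
        return "statistical energy analysis (SEA)"
    return "hybrid / mid-frequency approach"

# Example: a 1 m^2, 2 mm aluminium panel in the 500-630 Hz one-third octave band.
n_f = plate_modal_density(1.0, 0.002, 70e9, 2700.0, 0.33)
print(suggest_method(modes_in_band(n_f, 500.0, 630.0)))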