893 results for Layout Ontologies
Abstract:
The problems of collaborative engineering design and knowledge management at the conceptual stage in a network of dissimilar enterprises were investigated. This issue in engineering design arises because a supply-chain and virtual enterprise (VE) oriented industry demands faster time to market and accurate cost/manufacturing analysis from conception. The solution consisted of a de-centralised super-peer network architecture to establish and maintain communications between enterprises in a VE. In the solution outlined below, the enterprises are able to share knowledge in a common format and nomenclature via a building-block shareable super-ontology that can be tailored on a project-by-project basis whilst maintaining the common nomenclature of the 'super-ontology', thereby eliminating knowledge-interpretation issues. The two-tier architecture of the solution glues together the peer-level ontologies and the super-ontology to form a coherent system for both internal and virtual-enterprise knowledge management and product development.
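To make the two-tier idea concrete, here is a minimal Python sketch of how project ontologies could be tailored from a shared super-ontology without leaving its nomenclature. All class names and engineering terms below are illustrative assumptions, not the paper's actual vocabulary or implementation.

```python
# Minimal sketch of the two-tier idea: a shared "super-ontology" of common
# terms, from which each project in the virtual enterprise tailors a subset
# while keeping the shared nomenclature. All names here are hypothetical.

class SuperOntology:
    """Building-block vocabulary shared by every enterprise in the VE."""
    def __init__(self, terms):
        self.terms = set(terms)

    def tailor(self, wanted):
        """Return a project ontology restricted to terms that exist in the
        super-ontology, so every project keeps the common nomenclature."""
        unknown = set(wanted) - self.terms
        if unknown:
            raise ValueError(f"terms outside the shared nomenclature: {unknown}")
        return ProjectOntology(self, set(wanted))

class ProjectOntology:
    """Per-project subset; annotations stay interpretable by all peers."""
    def __init__(self, parent, terms):
        self.parent, self.terms = parent, terms

class SuperPeer:
    """Tier 1: maintains the super-ontology and routes between enterprises."""
    def __init__(self, ontology):
        self.ontology = ontology
        self.peers = {}  # enterprise name -> ProjectOntology

    def register(self, enterprise, project_terms):
        self.peers[enterprise] = self.ontology.tailor(project_terms)

# Usage: two enterprises share design knowledge under one nomenclature.
shared = SuperOntology({"Part", "Assembly", "Material", "Cost", "Tolerance"})
hub = SuperPeer(shared)
hub.register("EnterpriseA", {"Part", "Material", "Cost"})
hub.register("EnterpriseB", {"Part", "Assembly", "Tolerance"})
```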
Abstract:
In his introduction, Pinna (2010) quoted one of Wertheimer's observations: "I stand at the window and see a house, trees, sky. Theoretically I might say there were 327 brightnesses and nuances of color. Do I have '327'? No. I have sky, house, and trees." This seems quite remarkable, for Max Wertheimer, together with Kurt Koffka and Wolfgang Koehler, was a pioneer of Gestalt theory: perceptual organisation was tackled by considering grouping rules for line and edge elements in relation to figure-ground segregation, i.e., a meaningful object (the figure) as perceived against a complex background (the ground). At the lowest level – line and edge elements – Wertheimer (1923) himself formulated grouping principles on the basis of proximity, good continuation, convexity, symmetry and, often forgotten, past experience of the observer. Rubin (1921) formulated rules for figure-ground segregation using surroundedness, size and orientation, but also convexity and symmetry. Almost a century of Gestalt research later, Pinna and Reeves (2006) introduced the notion of figurality, meant to represent the integrated set of properties of visual objects, from the principles of grouping and figure-ground to the colour and volume of objects with shading. Pinna, in 2010, went one important step further and studied perceptual meaning, i.e., the interpretation of complex figures on the basis of the observer's past experience. Re-establishing a link to Wertheimer's rule about past experience, he formulated five propositions, three definitions and seven properties on the basis of observations made on graphically manipulated patterns. For example, he introduced the illusion of meaning by adding comic-like elements suggesting wind, thereby inducing a learned interpretation. His last figure shows a regular array of squares but with irregular positions on the right side. This pile of (ir)regular squares can be interpreted as the result of an earthquake which destroyed part of an apartment block. This is much more intuitive, direct and economical than describing the complexity of the array of squares.
Abstract:
This paper gives an account of the disappearance of Malaysia Airlines Flight MH370 into the southern Indian Ocean in March 2014 and analyses the rare glimpses into remote ocean space this incident opened up. It follows the tenuous clues as to where the aeroplane might have come to rest after it disappeared from radar screens – seven satellite pings, hundreds of pieces of floating debris and six underwater sonic recordings – as ways of entering into and thinking about ocean space. The paper pays attention to and analyses this space on three registers – first, as a fluid, more-than-human materiality with particular properties and agencies; second, as a synthetic situation, a composite of informational bits and pieces scopically articulated and augmented; and third, as geopolitics, delineated by the protocols of international search and rescue. On all three registers – as matter, as data and as law – the ocean is shown to be ontologically fluid, a world defined by movement, flow and flux, posing intractable difficulties for human interactions with it.
Abstract:
The emergence of new business models, namely the establishment of partnerships between organizations, and the opportunity companies now have to enrich their information with data already available on the web, especially the semantic web, have drawn attention to problems existing in databases, particularly those related to data quality. Poor data can result in a loss of competitiveness for the organizations holding them, and may even lead to their disappearance, since many of their decision-making processes are based on these data. For this reason, data cleaning is essential. Current approaches to these problems are closely tied to database schemas and specific domains. For data cleaning to be usable across different repositories, computer systems must be able to understand the data, i.e., an associated semantics is needed. The solution presented in this paper uses ontologies: (i) for the specification of data cleaning operations and (ii) as a way of solving the semantic heterogeneity problems of data stored in different sources. With data cleaning operations defined at a conceptual level, and with mappings between domain ontologies and an ontology derived from a database, the operations may be instantiated and proposed to the expert/specialist for execution over that database, thus enabling their interoperability.
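As a rough illustration of the approach described above, the following Python sketch shows a cleaning operation specified at the conceptual (ontology) level and then instantiated over a concrete table through a concept-to-column mapping. All names and the mapping format are assumptions made for the example, not the paper's implementation.

```python
# Hypothetical sketch: a cleaning operation defined on ontology concepts,
# instantiated over a database via concept-to-column mappings. The names,
# rule format, and mapping structure are all illustrative.

from dataclasses import dataclass

@dataclass
class ConceptualCleaningOp:
    """A cleaning operation defined on an ontology concept and property,
    independent of any database schema."""
    concept: str  # e.g. ontology class "Customer"
    prop: str     # e.g. ontology property "hasEmail"
    rule: str     # e.g. "NOT NULL"

def instantiate(op, mapping):
    """Translate the conceptual operation into a concrete table/column
    check, using the domain-ontology-to-database-ontology mapping."""
    table, column = mapping[(op.concept, op.prop)]
    return f"SELECT * FROM {table} WHERE {column} IS NULL  -- violates {op.rule}"

# Mapping assumed to result from the database's own derived ontology.
mapping = {("Customer", "hasEmail"): ("customers", "email")}
op = ConceptualCleaningOp("Customer", "hasEmail", "NOT NULL")
print(instantiate(op, mapping))  # proposed to the expert before execution
```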
Abstract:
The changes introduced into the European Higher Education Area (EHEA) by the Bologna Process, together with renewed pedagogical and methodological practices, have created a new teaching-learning paradigm: Student-Centred Learning. In addition, recent years have been characterized by the application of information technologies, especially the Semantic Web, not only to the teaching-learning process but also to administrative processes within learning institutions. The aim of this study was, on the one hand, to present a model for identifying and classifying Competencies and Learning Outcomes and, on the other, to develop the computer applications of the information-management model, namely a relational database and an ontology.
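A minimal sketch of what the relational side of such a model might look like, using Python's sqlite3; the tables, columns, and sample rows are hypothetical and serve only to show the competency-to-learning-outcome link the study formalizes.

```python
# Assumed illustration of a relational schema linking competencies to
# learning outcomes. Table and column names are invented for the example.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE competency (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    kind TEXT CHECK (kind IN ('generic', 'specific'))
);
CREATE TABLE learning_outcome (
    id            INTEGER PRIMARY KEY,
    description   TEXT NOT NULL,
    competency_id INTEGER REFERENCES competency(id)
);
""")
conn.execute("INSERT INTO competency VALUES (1, 'Critical thinking', 'generic')")
conn.execute("INSERT INTO learning_outcome VALUES (1, "
             "'Evaluate evidence for a claim', 1)")

# List each learning outcome under the competency it realizes.
for row in conn.execute("""
    SELECT c.name, lo.description
    FROM learning_outcome lo JOIN competency c ON c.id = lo.competency_id"""):
    print(row)
```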
Abstract:
The electricity market restructuring, along with the increasing need for an adequate integration of renewable energy sources, is resulting in rising complexity in power systems operation. Various power system simulators have been introduced in recent years with the purpose of helping operators, regulators, and involved players understand and deal with this complex environment. This paper focuses on the development of an upper ontology which integrates the essential concepts necessary to interpret all the available information. The restructuring of MASCEM (Multi-Agent System for Competitive Electricity Markets), and this system's integration with MASGriP (Multi-Agent Smart Grid Platform) and ALBidS (Adaptive Learning Strategic Bidding System), provide the means to exemplify the usefulness of this ontology. A practical example is presented, showing how common simulation scenarios for different simulators, directed at very distinct environments, can be created starting from the proposed ontology.
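The following Python sketch illustrates, under invented class and field names, the benefit the abstract describes: a scenario written once against shared upper-ontology concepts and then translated into simulator-specific configurations. The to_mascem and to_masgrip translations are hypothetical stand-ins, not the real interfaces of those systems.

```python
# Hedged sketch of what an upper ontology buys here: one common scenario
# built from shared concepts, mapped into each simulator's own form.
# All class names and configuration shapes below are illustrative.

class Player:
    def __init__(self, name, role):
        self.name, self.role = name, role  # e.g. "producer", "consumer"

class Scenario:
    """A simulator-independent scenario built from upper-ontology concepts."""
    def __init__(self, players, market_type):
        self.players, self.market_type = players, market_type

def to_mascem(scenario):
    """Hypothetical translation into a MASCEM-style agent configuration."""
    return [{"agent": p.name, "type": p.role, "market": scenario.market_type}
            for p in scenario.players]

def to_masgrip(scenario):
    """Hypothetical translation into a MASGriP-style smart-grid setup."""
    return {p.name: p.role for p in scenario.players}

common = Scenario([Player("Gen1", "producer"), Player("LoadA", "consumer")],
                  market_type="day-ahead")
print(to_mascem(common))
print(to_masgrip(common))
```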
Abstract:
The original Master Plan of 1964 called for the campus to stretch out 1 1/4 miles across the escarpment, with arts buildings west of the tower and science buildings to the east. This plan laid out the development of Brock for the next 10 or 11 years, by which time enrollment was expected to be near 8,000 students.
Abstract:
Site Plan for the Physical Education centre and surrounding landscapes.
Abstract:
Among many other knowledge representation formalisms, Ontologies and Formal Concept Analysis (FCA) aim at modeling 'concepts'. We discuss how these two formalisms may complement one another from an application point of view. In particular, we will see how FCA can be used to support Ontology Engineering, and how ontologies can be exploited in FCA applications. The interplay of FCA and ontologies is studied along the life cycle of an ontology: (i) FCA can support the building of the ontology as a learning technique; (ii) the established ontology can be analyzed and navigated using techniques of FCA; (iii) last but not least, the ontology may be used to improve an FCA application.
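Point (i) can be made concrete with the standard FCA derivation operators. The toy object-attribute context below is invented for illustration, but the closure computation is the textbook construction of all formal concepts of a small context.

```python
# Self-contained FCA illustration: enumerate all formal concepts of a toy
# context by closing each object subset with the derivation operators.
# The context itself (an ontology-learning toy example) is invented.

from itertools import combinations

objects = {"Cat", "Dog", "Eagle"}
attrs_of = {  # incidence relation I
    "Cat":   {"mammal", "pet"},
    "Dog":   {"mammal", "pet"},
    "Eagle": {"bird", "predator"},
}

def intent(A):
    """A' : attributes shared by all objects in A (all attributes if A is empty)."""
    sets = [attrs_of[g] for g in A]
    return set.intersection(*sets) if sets else set().union(*attrs_of.values())

def extent(B):
    """B' : objects having every attribute in B."""
    return {g for g in objects if B <= attrs_of[g]}

# Close every object subset A into the concept (A'', A') and deduplicate.
concepts = set()
for r in range(len(objects) + 1):
    for A in combinations(sorted(objects), r):
        i = intent(set(A))
        concepts.add((frozenset(extent(i)), frozenset(i)))

for ext, inte in sorted(concepts, key=lambda c: len(c[0])):
    print(set(ext) or "{}", "<->", set(inte) or "{}")
```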
Abstract:
A key argument for modeling knowledge in ontologies is the easy re-use and re-engineering of the knowledge. However, beside consistency checking, current ontology engineering tools provide only basic functionalities for analyzing ontologies. Since ontologies can be considered as (labeled, directed) graphs, graph analysis techniques are a suitable answer for this need. Graph analysis has been performed by sociologists for over 60 years, and resulted in the vivid research area of Social Network Analysis (SNA). While social network structures in general currently receive high attention in the Semantic Web community, there are only very few SNA applications up to now, and virtually none for analyzing the structure of ontologies. We illustrate in this paper the benefits of applying SNA to ontologies and the Semantic Web, and discuss which research topics arise on the edge between the two areas. In particular, we discuss how different notions of centrality describe the core content and structure of an ontology. From the rather simple notion of degree centrality over betweenness centrality to the more complex eigenvector centrality based on Hermitian matrices, we illustrate the insights these measures provide on two ontologies, which are different in purpose, scope, and size.
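A small sketch of the paper's basic move, treating an ontology as a labeled directed graph and applying SNA centrality measures, using networkx. The toy subclass hierarchy is invented, and taking the undirected view for eigenvector centrality is one pragmatic choice for this example rather than the authors' method.

```python
# Hedged illustration: an ontology as a directed graph, with SNA centrality
# measures pointing at its core concepts. The hierarchy below is invented.

import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("Dog", "Mammal"), ("Cat", "Mammal"),        # subclass-of edges
    ("Mammal", "Animal"), ("Bird", "Animal"),
    ("Animal", "LivingThing"), ("Plant", "LivingThing"),
])

print("degree:     ", nx.degree_centrality(G))
print("betweenness:", nx.betweenness_centrality(G))
# Eigenvector centrality is computed on the undirected view here, since the
# subclass DAG leaves many nodes with zero in-degree and the directed power
# iteration may not converge on it.
print("eigenvector:", nx.eigenvector_centrality(G.to_undirected()))
```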