892 results for Simplification of Ontologies
Abstract:
This thesis has shown how total energies and excitation energies of atoms and ions can be calculated by means of atomic many-body perturbation theory. First, the perturbation series had to be derived using computer-algebraic methods. With the Maple program package APEX developed for this purpose, this was carried out up to fourth order for closed-shell systems and for systems with one active electron or hole; owing to their large number, the corresponding terms could not be reproduced here. The next step was the analytic angular reduction, performed with the Maple program package RACAH, which was adapted and extended for this purpose. Only at this point was the spherical symmetry of the atomic reference state exploited, resulting in a considerable simplification of the perturbation terms. The second part of this thesis deals with the numerical evaluation of the perturbation series treated purely analytically up to that point. Building on the Fortran program package Ratip, a Dirac-Fock program for closed-shell systems was developed, based on the matrix Dirac-Fock method presented in Chapter 3. Within this environment it was then possible to evaluate the perturbation terms numerically. It quickly became apparent that this is feasible within a reasonable time frame only if the corresponding radial integrals are kept in the computer's main memory. Because of the very large number of these integrals, this placed high demands on the hardware used; this was the main reason why the third-order corrections could be computed only partially and the fourth-order corrections not at all. Finally, the correlation energies of He-like systems as well as of neon, argon, and mercury were calculated and compared with literature values. In addition, Li-like systems, sodium, potassium, and thallium were investigated, considering the lowest states of the valence electron. The ionization energies of the superheavy elements 113 and 119 conclude this work.
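For orientation, the leading term of such a correlation expansion is the standard second-order many-body correction (a generic textbook expression, quoted here for context rather than taken from the thesis):

    E^{(2)} = \frac{1}{4} \sum_{a,b}^{\mathrm{occ}} \sum_{r,s}^{\mathrm{virt}}
              \frac{|\langle ab \| rs \rangle|^{2}}
                   {\varepsilon_{a} + \varepsilon_{b} - \varepsilon_{r} - \varepsilon_{s}}

where a, b run over occupied orbitals of the reference state, r, s over virtual orbitals, \langle ab \| rs \rangle are antisymmetrised two-electron matrix elements, and \varepsilon are orbital energies; after angular reduction, such matrix elements decompose into the radial integrals that the thesis keeps in main memory.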
Abstract:
In recent years, the main orientation of Formal Concept Analysis (FCA) has turned from mathematics towards computer science. This article provides a review of this new orientation and analyzes why and how FCA and computer science attracted each other. It discusses FCA as a knowledge representation formalism using the five knowledge representation principles provided by Davis, Shrobe, and Szolovits [DSS93]. It then studies how and why mathematics-based researchers were attracted by computer science. We will argue for continuing this trend by integrating the two research areas FCA and Ontology Engineering. The second part of the article discusses three lines of research which reflect the new orientation of Formal Concept Analysis: FCA as a conceptual clustering technique and its application to supporting the merging of ontologies; the efficient computation of association rules and the structuring of the results; and the visualization and management of conceptual hierarchies and ontologies, including an application in an email management system.
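As a minimal illustration of the FCA machinery this line of work builds on (my own sketch, not code from the article): a formal context assigns attributes to objects, and the two derivation operators pair maximal object sets with maximal attribute sets to form concepts.

    # A toy formal context: objects mapped to their attributes.
    context = {
        "dog":    {"mammal", "pet"},
        "cat":    {"mammal", "pet"},
        "salmon": {"aquatic"},
    }

    def intent(objects):
        """Attributes shared by ALL given objects (derivation on object sets)."""
        if not objects:
            return set().union(*context.values())
        return set.intersection(*(context[o] for o in objects))

    def extent(attributes):
        """Objects possessing ALL given attributes (derivation on attribute sets)."""
        return {o for o, attrs in context.items() if attributes <= attrs}

    # A formal concept is a pair (A, B) with extent(B) == A and intent(A) == B.
    A = extent({"pet"})
    print(A, intent(A))   # e.g. {'dog', 'cat'} {'mammal', 'pet'}

Conceptual clustering in the FCA sense enumerates all such pairs; together they form the concept lattice that the merging work below relies on.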
Abstract:
Ontologies have been established for knowledge sharing and are widely used as a means of conceptually structuring domains of interest. With the growing use of ontologies, the problem of overlapping knowledge in a common domain becomes critical. In this short paper, we address two methods for merging ontologies based on Formal Concept Analysis: FCA-Merge and ONTEX. --- FCA-Merge is a method for merging ontologies following a bottom-up approach which offers a structural description of the merging process. The method is guided by application-specific instances of the given source ontologies. We apply techniques from natural language processing and formal concept analysis to derive a lattice of concepts as a structural result of FCA-Merge. The generated result is then explored and transformed into the merged ontology with human interaction. --- ONTEX is a method for systematically structuring the top level of ontologies. It is based on an interactive, top-down knowledge acquisition process, which ensures that the knowledge engineer considers all possible cases while avoiding redundant acquisition. The method is especially suited to creating or merging the top part(s) of ontologies, where high accuracy is required, and to supporting the merging of two (or more) ontologies on that level.
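A hypothetical sketch of the context-combination step at the heart of FCA-Merge (names and data are mine; the real method derives the contexts from documents via NLP): both source ontologies classify the same instances, and the merged context apposes the two attribute sets so that the concept lattice can reveal candidate correspondences.

    docs = ["d1", "d2", "d3"]

    # Each source ontology's classification: document -> concepts it instantiates.
    onto1 = {"d1": {"Vehicle"}, "d2": {"Vehicle", "Car"}, "d3": set()}
    onto2 = {"d1": {"Transport"}, "d2": {"Transport"}, "d3": {"Boat"}}

    # Prefix concept names by their source ontology and appose the contexts.
    merged = {d: {f"O1:{c}" for c in onto1[d]} | {f"O2:{c}" for c in onto2[d]}
              for d in docs}

    print(merged["d2"])   # e.g. {'O1:Vehicle', 'O1:Car', 'O2:Transport'}
    # Applying the derivation operators from the previous sketch to `merged`
    # would show that every O1:Vehicle instance is also an O2:Transport
    # instance -- a cue for the knowledge engineer to merge the two concepts.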
Abstract:
In the first two seminars we looked at the evolution of Ontologies from the current OWL level towards more powerful/expressive models and the corresponding hierarchy of Logics that underpins every stage of this evolution. We examined this in the more general context of the general evolution of the Web as a mathematical (directed and weighted) graph and the archetypal "living network". In the third seminar we will analyze further some of the startling properties that the Web has as a graph/network, which it shares with an array of "real-life" networks, as well as some key elements of the mathematics (probability, statistics, and graph theory) that underpin all this. No mathematical prerequisites are assumed or required. We will outline some directions that current (2005 to the present) research is taking and conclude with some illustrations/examples from ongoing research and applications that show great promise.
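One of the "startling properties" typically meant here is the heavy-tailed (power-law) in-degree distribution of the Web graph; a small sketch of how degrees are tabulated on a toy directed graph (my illustration, not seminar material):

    from collections import Counter

    # A toy directed web graph: page -> pages it links to.
    links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}

    out_degree = {page: len(targets) for page, targets in links.items()}
    in_degree = Counter(t for targets in links.values() for t in targets)

    # Web-scale crawls find P(in-degree = k) proportional to k^(-gamma) with
    # gamma near 2.1 -- the kind of "real-life network" regularity at issue.
    print(in_degree)   # Counter({'c': 3, 'b': 1, 'a': 1})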
Abstract:
The storage and processing capacity realised by computing has led to an explosion of data retention. We have now reached the point of information overload and must begin to use computers to process more complex information. In particular, the proposition of the Semantic Web has given structure to this problem, but it has yet to be realised in practice. The largest of its problems is that of ontology construction; without a suitable automatic method, most ontologies will have to be encoded by hand. In this paper we discuss the current methods for semi- and fully automatic construction and their current shortcomings. In particular, we pay attention to the application of ontologies to products and to the practical application of those ontologies.
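One concrete flavour of semi-automatic construction (a sketch of the classic lexico-syntactic "Hearst pattern" approach, given here as a representative method rather than one taken from this paper):

    import re

    # Mine hyponym candidates with the "X such as Y" pattern.
    text = "We stock products such as laptops, and peripherals such as keyboards."

    pattern = re.compile(r"(\w+) such as (\w+)")
    for hypernym, hyponym in pattern.findall(text):
        print(f"{hyponym} is-a {hypernym}")
    # laptops is-a products
    # keyboards is-a peripherals

The output of such extractors is noisy, which is why the semi-/fully automatic distinction matters: a human still has to vet the candidate subsumptions.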
Abstract:
DISOPE is a technique for solving optimal control problems where there are differences in structure and parameter values between reality and the model employed in the computations. The model-reality differences can also allow for deliberate simplification of model characteristics and performance indices in order to facilitate the solution of the optimal control problem. The technique was developed originally in continuous time and later extended to discrete time. The main property of the procedure is that by iterating on appropriately modified model-based problems the correct optimal solution is achieved in spite of the model-reality differences. Algorithms have been developed in both continuous and discrete time for a general nonlinear optimal control problem with terminal weighting, bounded controls and terminal constraints. The aim of this paper is to show how the DISOPE technique can aid receding-horizon optimal control computation in nonlinear model predictive control.
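A minimal, self-contained sketch of the "modified model-based problem" idea (shown on a static one-variable problem of my own devising, not the paper's full optimal control formulation):

    # Reality has cost (u-3)^2, but the model believes the cost is (u-1)^2.
    def real_gradient(u):
        return 2.0 * (u - 3.0)

    def model_gradient(u):
        return 2.0 * (u - 1.0)

    u = 0.0
    for _ in range(20):
        lam = real_gradient(u) - model_gradient(u)  # modifier: model-reality gap
        u_new = 1.0 - lam / 2.0   # closed-form optimum of (u-1)^2 + lam*u
        if abs(u_new - u) < 1e-9:
            break
        u = u_new

    print(u)   # 3.0 -- the REAL optimum, despite optimising the wrong model

Iterating on the modified model problem converges to the optimum of the real system, which is exactly the "main property" claimed for DISOPE above.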
Abstract:
Treating algebraic symbols as objects (e.g. "'a' means 'apple'") is a common way of introducing the elementary simplification of algebra, but it causes problems further on. This school-based research included an examination of texts still in use in the mathematics department, and interviews with mathematics teachers, year 7 pupils, and then year 10 pupils, asking them how they would explain "3a + 2a = 5a" to year 7 pupils. Results included the finding that the 'algebra as object' analogy can be found in textbooks in current use, including those recently published. Teachers knew that they were not 'supposed' to use the analogy but were not always clear why; nevertheless they described teaching methods consistent with an 'algebra as object' approach. Year 7 pupils did not explicitly refer to 'algebra as object', although some of their responses could be so interpreted. In the main, year 10 pupils used 'algebra as object' to explain the simplification of algebra, with some complicated attempts to get round its limitations. Further research would look to establish whether the appearance of 'algebra as object' in pupils' thinking between years 7 and 10 is consistent and, if so, where it arises. There are also implications for ongoing teacher training, with alternatives for introducing such simplification.
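The later problems are the standard ones (my illustration, not an example from the study): the object reading survives like-term collection but not multiplication or equation solving.

    3a + 2a = 5a                    % "3 apples plus 2 apples is 5 apples" -- works
    3a \times 2a = 6a^2             % breaks: there is no object reading of "apple squared"
    3a = 12 \;\Rightarrow\; a = 4   % breaks: here a must denote a number, not an object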
Abstract:
The use of discounted cash flow (DCF) methods in investment valuation and appraisal is argued by many academics to be rational and more rigorous than the traditional capitalisation model. However, those advocates of DCF should be cautious in their claims of rationality. The various DCF models all rely upon an all-encompassing equated yield (IRR) within the calculation. This paper will argue that this is a simplification of the risk perception which the investor places on the income profile from property. In determining the long-term capital value of a property, an 'average' DCF method will produce the 'correct' price; however, the individual short-term values of each cash flow may differ significantly. In the UK property market today, where prices are not expected to rise generally at the same rate or with such persistence as hitherto, investors and tenants are increasingly concerned with the downside implications of rental growth, and investors may indeed be interested in trading property over a shorter investment horizon than they had originally planned. The purpose of this paper is therefore to bring to the analysis a rigorous framework which can be used to analyse the constituent cash flows within the freehold valuation. We show that arbitrage analysis lends itself to segregating the capital value of the cash flows in a way which is more appropriate for financial investors.
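For reference, the "all-encompassing equated yield" discounts every period's cash flow at one rate; a generic DCF sketch (my own, not the paper's model):

    def present_value(cash_flows, equated_yield):
        """Discount every cash flow at the single equated yield (IRR)."""
        return sum(cf / (1.0 + equated_yield) ** t
                   for t, cf in enumerate(cash_flows, start=1))

    rent = [100.0] * 5 + [120.0] * 5   # toy profile: rent review to 120 in year 6
    print(round(present_value(rent, 0.08), 2))   # 725.36

The paper's arbitrage framework instead segregates the cash flows so that each can be valued at a rate reflecting its own risk, rather than at one blended IRR.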
Abstract:
Until recently, First-Order Temporal Logic (FOTL) has been only partially understood. While it is well known that the full logic has no finite axiomatisation, a more detailed analysis of fragments of the logic was not previously available. However, a breakthrough by Hodkinson et al., identifying a finitely axiomatisable fragment, termed the monodic fragment, has led to improved understanding of FOTL. Yet, in order to utilise these theoretical advances, it is important to have appropriate proof techniques for this monodic fragment. In this paper, we modify and extend the clausal temporal resolution technique, originally developed for propositional temporal logics, to enable its use in such monodic fragments. We develop a specific normal form for monodic formulae in FOTL, and provide a complete resolution calculus for formulae in this form. Not only is this clausal resolution technique useful as a practical proof technique for certain monodic classes, but the use of this approach provides us with increased understanding of the monodic fragment. In particular, we show here how several features of monodic FOTL can be established as corollaries of the completeness result for the clausal temporal resolution method. These include definitions of new decidable monodic classes, simplification of existing monodic classes by reductions, and completeness of clausal temporal resolution in the case of monodic logics with expanding domains, a case of much significance in both theory and practice.
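For intuition about monodicity (a standard example, not quoted from the paper): a FOTL formula is monodic if every subformula whose main operator is temporal has at most one free variable.

    \Box \exists y\, P(x, y)    % monodic: one free variable (x) under the temporal operator
    \Box P(x, y)                % not monodic: two free variables under \Box

The normal form and resolution calculus developed in the paper operate within this one-free-variable discipline.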
Abstract:
Semantic Analysis is a business analysis method designed to capture system requirements. While these requirements may be represented as text, the method also advocates the use of Ontology Charts to formally denote the system's required roles, relationships and forms of communication. Following model-driven engineering techniques, Ontology Charts can be transformed into temporal database schemas, class diagrams and component diagrams, which can then be used to produce software systems. A nice property of these transformations is that the resulting system design models lend themselves to complicated extensions without requiring changes to the design models themselves. For example, the resulting databases can be extended with new types of data without the need to modify the database schema of the legacy system. Semantic Analysis is not widely used in software engineering, so there is a lack of experts in the field and no design patterns are available. This makes it difficult for analysts to pass organizational knowledge to the engineers. This study describes an implementation that is readily usable by engineers, including an automated technique that can produce a prototype from an Ontology Chart. The use of such tools should enable developers to make use of Semantic Analysis with minimal expertise in ontologies and MDA.
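A sketch of why such generated temporal schemas absorb new kinds of data (my illustration of the general idea, not the study's actual schema): facts are stored as affordance records with start/finish times, so a new type of fact adds rows rather than columns.

    # (agent, affordance, start, finish) -- finish=None means "still holds".
    records = [
        ("alice", "employee",          "2019-01-01", None),
        ("alice", "department-member", "2020-06-01", "2022-03-31"),
    ]
    # Extending the system with a brand-new affordance needs no schema change:
    records.append(("bob", "contractor", "2024-01-01", None))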
Abstract:
The importance and contribution of small businesses to the country's development are sufficient arguments for this business segment to be considered and treated as a matter of public policy strategy. In Brazil, the difficulties and obstacles that keep small businesses from producing policy results are expressed above all in the definition of "small business", in the form of taxation, and in social-security charges. To overcome these limitations and survive, small entrepreneurs adopt management practices aimed at systematically evading the tax authorities, with unfavourable consequences for the firm. Practices such as keeping unrecorded cash ("caixa dois") and paying wages off the books are widely used in small firms, which prefer to expose themselves to the risk of inspection rather than comply with the requirements laid down in legislation. To reverse this situation, the government's strategic stance towards small businesses should include measures to encourage the creation and development of an entrepreneurial class, to improve management capability, to simplify legislation while reducing the tax burden, and to absorb surplus labour into the formal labour market.
Abstract:
Physics and Management concentrate their research on phenomena that are, in certain ways, similar, leading us to question the great integral of the universe to which we are subject. In an exploration by analogy, this paper brings the organizational world close to that of universal, unstable, non-integrable systems, in which the arrow of time determines their evolution. It is shown that in Management, as in Physics, everything seems to converge towards an inexhaustible repertoire of bifurcations and possibilities for the market destiny of products, services, and brands along a continuum. To mitigate the effects of these uncertainties, a simplification of these complex social systems is sought through a proposed model based on factors established in the business-management literature as drivers of consumer choice: a Gaussian process of "perceived value" that can serve as a tool for strategic and managerial decisions within firms.
Abstract:
This study discusses the process of implementing management contracts in Brazilian public administration, particularly the experience begun by the Government of the State of São Paulo in 1991. The main objective of the management contract is to concentrate government control on the entities' results, which allows a gradual simplification of normative structures and the introduction of a system of sanctions and rewards. This system can make new forms of performance evaluation feasible and help increase productivity in the public sector. The management contract is an instrument that encourages dialogue and partnership and makes the intentions and orientations of the contracting parties transparent. In addition, it can work as an instrument of administrative rationalization and internal communication for the institution itself. The existence of prior planning is one of the preconditions for implementing a management contract. Goals and objectives must be specified precisely, clearly, and unambiguously, and must reflect the entity's real conditions and capabilities. Some problems can be avoided if an initial stage is provided in which the staff responsible for implementing and monitoring the contracts are trained in the concepts and instruments indispensable to the process. It is also essential to allow for a period of negotiating the support of the main decision-makers and opinion-formers, and of raising awareness among and preparing the entities' workforce.