888 results for Knowledge representation (Information theory)


Relevance:

100.00%

Publisher:

Abstract:

Knowledge graphs and ontologies are closely related concepts in the field of knowledge representation. In recent years, knowledge graphs have gained increasing popularity and serve as essential components of many knowledge engineering projects that view them as crucial to their success. The conceptual foundation of a knowledge graph is provided by ontologies. Ontology modeling is an iterative engineering process consisting of steps such as the elicitation and formalization of requirements and the development, testing, refactoring, and release of the ontology. Testing the ontology is a crucial and occasionally overlooked step of the process, owing to the lack of integrated tools to support it. As a result of this gap in the state of the art, ontology testing is carried out manually, which requires a considerable amount of time and effort from ontology engineers. The lack of tool support is noticeable in the requirement elicitation process as well. In this respect, the growing adoption and accessibility of knowledge graphs allow the development and use of automated tools to assist with the elicitation of requirements from such a complementary source of data. This doctoral research therefore focuses on developing methods and tools that support the requirement elicitation and testing steps of an ontology engineering process. To support ontology testing, we have developed XDTesting, a web application integrated with the GitHub platform that serves as an ontology testing manager. To support the elicitation and documentation of competency questions, we have defined and implemented RevOnt, a method for extracting competency questions from knowledge graphs. Both methods are evaluated through their implementations, and the results are promising.
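The abstract does not show what an executable competency-question test looks like. As a minimal sketch of the general idea, the following Python snippet treats one hypothetical competency question as a test over a toy in-memory knowledge graph; the `Triple` class, the triples, and the question itself are invented for illustration and are not part of XDTesting or RevOnt.

```python
# Minimal sketch: a competency question treated as an executable test against a
# toy knowledge graph. Everything here is hypothetical illustration material.
from dataclasses import dataclass

@dataclass(frozen=True)
class Triple:
    subject: str
    predicate: str
    obj: str

# Toy knowledge graph: a handful of triples.
KG = {
    Triple("paper:1", "dc:creator", "person:alice"),
    Triple("paper:2", "dc:creator", "person:bob"),
    Triple("person:alice", "rdf:type", "foaf:Person"),
    Triple("person:bob", "rdf:type", "foaf:Person"),
}

def cq_every_paper_has_a_creator(kg):
    """Competency question 'Who created each paper?' phrased as a test:
    every resource appearing as a paper must have a dc:creator triple."""
    papers = {t.subject for t in kg if t.subject.startswith("paper:")}
    with_creator = {t.subject for t in kg if t.predicate == "dc:creator"}
    return papers <= with_creator

if __name__ == "__main__":
    assert cq_every_paper_has_a_creator(KG), "competency question not satisfied"
    print("Competency question satisfied by the toy knowledge graph.")
```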

Relevance:

100.00%

Publisher:

Abstract:

Considering only intrinsic characteristics of the system, both statistical and information-theoretic interpretations of the second law are used to provide more comprehensive meanings for the concepts of entropy, temperature, and the Helmholtz and Gibbs energies. The coherence of the Clausius inequality with these concepts is emphasized. The aim of this work is to re-discuss the second law of thermodynamics within homogeneous processes thermodynamics, a temporal science that is a very special oversimplification of continuum mechanics for spatially constant intensive properties.
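For reference, the standard quantities the abstract re-discusses are, in the usual notation, the Clausius inequality and the Helmholtz and Gibbs energies:

```latex
% Standard definitions behind the abstract's discussion (usual notation):
% Clausius inequality, Helmholtz energy, Gibbs energy.
\begin{align}
  \mathrm{d}S &\ge \frac{\delta Q}{T} ,\\
  A &= U - TS ,\\
  G &= H - TS = U + pV - TS .
\end{align}
```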

Relevance:

100.00%

Publisher:

Abstract:

Background: The inference of gene regulatory networks (GRNs) from large-scale expression profiles is one of the most challenging problems in Systems Biology today. Many techniques and models have been proposed for this task. However, it is generally not possible to recover the original topology with great accuracy, mainly because the time series are short in the face of the high complexity of the networks and the intrinsic noise of the expression measurements. In order to improve the accuracy of entropy-based (mutual information) GRN inference methods, a new criterion function is proposed here. Results: In this paper we introduce the use of the generalized entropy proposed by Tsallis for the inference of GRNs from time series expression profiles. The inference process is based on a feature selection approach, and the conditional entropy is applied as the criterion function. In order to assess the proposed methodology, the algorithm is applied to recover the network topology from temporal expressions generated by an artificial gene network (AGN) model as well as from the DREAM challenge. The adopted AGN is based on theoretical models of complex networks, and its gene transfer functions are drawn at random from the set of possible Boolean functions, thus defining its dynamics. The DREAM time series data, on the other hand, vary in network size and their topologies are based on real networks; the dynamics are generated by continuous differential equations with noise and perturbations. By adopting both data sources, it is possible to estimate the average quality of the inference with respect to different network topologies, transfer functions and network sizes. Conclusions: A remarkable improvement in accuracy was observed in the experimental results, with the non-Shannon entropy reducing the number of false connections in the inferred topology. The best value of the Tsallis free parameter was on average in the range 2.5 ≤ q ≤ 3.5 (hence, subextensive entropy), which opens new perspectives for GRN inference methods based on information theory and for investigation of the nonextensivity of such networks. The inference algorithm and criterion function proposed here were implemented and included in the DimReduction software, which is freely available at http://sourceforge.net/projects/dimreduction and http://code.google.com/p/dimreduction/.
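As a rough illustration of this kind of criterion function (not the DimReduction implementation), the Python sketch below estimates a Tsallis conditional entropy from discretized time-series data and uses it to rank candidate predictor sets; the toy data and function names are invented, and the simple weighted-average estimator is only one of several possible nonextensive definitions.

```python
# Sketch of a Tsallis-entropy criterion for feature selection in GRN inference.
# Illustrative only: toy data, hypothetical names, simplified estimator.
import itertools
import numpy as np

def tsallis_entropy(p, q=2.5):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1); tends to Shannon as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def conditional_tsallis_entropy(target, predictors, q=2.5):
    """Weighted-average estimate of H_q(target | predictors) from discretized
    time series (rows are time points, columns are candidate predictor genes)."""
    predictors = np.asarray(predictors)
    if predictors.ndim == 1:
        predictors = predictors[:, None]
    target = np.asarray(target)
    states, inverse = np.unique(predictors, axis=0, return_inverse=True)
    inverse = np.asarray(inverse).reshape(-1)
    h = 0.0
    for s in range(len(states)):
        mask = inverse == s
        _, counts = np.unique(target[mask], return_counts=True)
        h += (mask.sum() / len(target)) * tsallis_entropy(counts / counts.sum(), q)
    return h

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, size=(200, 3))   # three candidate predictor genes (binary)
    y = x[:, 0] ^ x[:, 1]                   # target gene driven by genes 0 and 1
    # Feature selection: the predictor pair with the lowest conditional entropy wins.
    best = min(itertools.combinations(range(3), 2),
               key=lambda pair: conditional_tsallis_entropy(y, x[:, list(pair)]))
    print("selected predictors:", best)     # expected: (0, 1)
```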

Relevance:

100.00%

Publisher:

Abstract:

This paper presents an Adaptive Maximum Entropy (AME) approach for modeling biological species. The Maximum Entropy algorithm (MaxEnt) is one of the most widely used methods for modeling the geographical distribution of biological species. The approach presented here is an alternative to the classical algorithm. Instead of using the same set of features throughout training, the AME approach tries to insert or to remove a single feature at each iteration. The aim is to reach convergence faster without affecting the performance of the generated models. The preliminary experiments performed well, showing improvements in both accuracy and execution time. Comparisons with other algorithms are beyond the scope of this paper. Several important lines of research are proposed as future work.
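The abstract gives only the general strategy, so the following Python sketch illustrates the idea of inserting or removing a single feature per iteration and keeping a change only when a model score improves; the `score` function is a toy stand-in for the MaxEnt training objective, and all names, thresholds and data are invented for illustration.

```python
# Sketch of an adaptive single-feature insertion/removal loop in the spirit of
# the AME idea described above. Not the authors' implementation.
import numpy as np

def score(active, X, y):
    """Toy model score: log-likelihood of a least-squares fit of presence/absence
    labels on the active features (a crude stand-in for the MaxEnt objective)."""
    if not active:
        p = np.full(len(y), np.clip(y.mean(), 1e-6, 1 - 1e-6))
    else:
        A = np.column_stack([X[:, sorted(active)], np.ones(len(y))])
        w, *_ = np.linalg.lstsq(A, y, rcond=None)
        p = np.clip(A @ w, 1e-6, 1 - 1e-6)
    return float(np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)))

def adaptive_feature_selection(X, y, penalty=2.0, max_iter=50):
    """Grow or shrink the active feature set by one feature per iteration,
    keeping a change only if the penalized score strictly improves.
    The per-feature penalty is a crude stand-in for MaxEnt regularization."""
    all_features = set(range(X.shape[1]))
    active = set()
    best = score(active, X, y)
    for _ in range(max_iter):
        candidates = [active | {f} for f in all_features - active]   # insert one feature
        candidates += [active - {f} for f in active]                 # remove one feature
        scored = [(score(c, X, y) - penalty * len(c), c) for c in candidates]
        new_best, new_active = max(scored, key=lambda t: t[0])
        if new_best <= best + 1e-9:        # no single-feature change helps: converged
            break
        best, active = new_best, new_active
    return sorted(active)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 5))
    y = (X[:, 0] + 0.5 * X[:, 2] + 0.1 * rng.normal(size=300) > 0).astype(float)
    print("selected features:", adaptive_feature_selection(X, y))  # signal is in 0 and 2
```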

Relevance:

100.00%

Publisher:

Abstract:

This essay is an attempt to give some mathematical ideas about the concept of biological complexity, exploring four attributes considered essential to characterize a complex system in a biological context: decomposition, heterogeneous assembly, self-organization, and adequacy. It is a theoretical and speculative approach that opens some possibilities for further numerical and experimental work, illustrated by references to several studies that have applied the concepts presented here. (C) 2008 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

The integral of the Wigner function over a subregion of the phase space of a quantum system may be less than zero or greater than one. It is shown that, for systems with one degree of freedom, the problem of determining the best possible upper and lower bounds on such an integral, over all possible states, reduces to the problem of finding the greatest and least eigenvalues of a Hermitian operator corresponding to the subregion. The problem is solved exactly in the case of an arbitrary elliptical region. These bounds provide checks on experimentally measured quasiprobability distributions.
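The reduction described above can be stated compactly; writing A_S for the Hermitian operator associated with the subregion S (the symbol is ours), the claim is:

```latex
% The Wigner-function integral over a subregion S equals the expectation of a
% Hermitian region operator A_S, so its extreme values over all states are the
% extreme eigenvalues of A_S.
\begin{align}
  I_S(\rho) &= \int_S W_\rho(q,p)\,\mathrm{d}q\,\mathrm{d}p
             = \operatorname{Tr}\!\bigl(\rho\,\hat{A}_S\bigr) ,\\
  \lambda_{\min}\bigl(\hat{A}_S\bigr) &\le I_S(\rho)
             \le \lambda_{\max}\bigl(\hat{A}_S\bigr)
             \quad \text{for all states } \rho .
\end{align}
```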

Relevance:

100.00%

Publisher:

Abstract:

We study the transformation of maximally entangled states under the action of Lorentz transformations in a fully relativistic setting. By explicit calculation of the Wigner rotation, we describe the relativistic analog of the Bell states as viewed from two inertial frames moving with constant velocity with respect to each other. Though the finite-dimensional matrices describing the Lorentz transformations are non-unitary, each single-particle state of the entangled pair undergoes an effective, momentum-dependent, local unitary rotation, thereby preserving the entanglement fidelity of the bipartite state. The details of how these unitary transformations are manifested are explicitly worked out for Bell states composed of massive spin-1/2 particles and massless photon polarizations. The relevance of this work to non-inertial frames is briefly discussed.
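Schematically, in standard notation and for sharp momenta p1 and p2, the transformation described above acts as a momentum-dependent local unitary on the pair:

```latex
% Action of a Lorentz boost \Lambda on a two-particle state with sharp momenta:
% a momentum-dependent Wigner-rotation unitary per particle, i.e. a local unitary.
\begin{equation}
  |\Psi\rangle \;\longmapsto\;
  \bigl(U(\Lambda,p_1)\otimes U(\Lambda,p_2)\bigr)\,|\Psi\rangle .
\end{equation}
```

Since entanglement measures are invariant under local unitary operations, the entanglement of the bipartite state is unchanged, as stated in the abstract.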

Relevance:

100.00%

Publisher:

Abstract:

Quantum mechanics has been formulated in phase space, with the Wigner function as the representative of the quantum density operator, and classical mechanics has been formulated in Hilbert space, with the Groenewold operator as the representative of the classical Liouville density function. Semiclassical approximations to the quantum evolution of the Wigner function have been defined, enabling the quantum evolution to be approached from a classical starting point. Now analogous semiquantum approximations to the classical evolution of the Groenewold operator are defined, enabling the classical evolution to be approached from a quantum starting point. Simple nonlinear systems with one degree of freedom are considered, whose Hamiltonians are polynomials in the Hamiltonian of the simple harmonic oscillator. The behavior of expectation values of simple observables and of eigenvalues of the Groenewold operator is calculated numerically and compared for the various semiclassical and semiquantum approximations.
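For reference, the Wigner representative of the density operator mentioned above is, in one common convention:

```latex
% Wigner function of a density operator \hat{\rho} (one common convention).
\begin{equation}
  W_\rho(q,p) \;=\; \frac{1}{2\pi\hbar}\int_{-\infty}^{\infty}
    \Bigl\langle q-\tfrac{y}{2}\Bigr|\,\hat{\rho}\,\Bigl|q+\tfrac{y}{2}\Bigr\rangle
    \, e^{\,i p y/\hbar}\,\mathrm{d}y .
\end{equation}
```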

Relevance:

100.00%

Publisher:

Abstract:

The main problem with current approaches to quantum computing is the difficulty of establishing and maintaining entanglement. A Topological Quantum Computer (TQC) aims to overcome this by using different physical processes that are topological in nature and less susceptible to disturbance by the environment. In a (2+1)-dimensional system, pseudoparticles called anyons have statistics that fall somewhere between bosons and fermions. The exchange of two anyons, an effect called braiding from knot theory, can occur in two different ways. The quantum states corresponding to the two elementary braids constitute a two-state system, allowing the definition of a computational basis. Quantum gates can be built up from patterns of braids, and for quantum computing it is essential that the operator describing the braiding, the R-matrix, be unitary. The physics of anyonic systems is governed by quantum groups, in particular the quasi-triangular Hopf algebras obtained from finite groups by the application of the Drinfeld quantum double construction. Their representation theory has been described in detail by Gould and Tsohantjis, and in this review article we relate the work of Gould to TQC schemes, particularly that of Kauffman.
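The requirements on the braiding operator mentioned above are, in the usual notation, unitarity together with the braid (Yang-Baxter) relation:

```latex
% Requirements on the braiding operator R acting on adjacent anyon pairs:
% unitarity, so that braids implement quantum gates, and the braid relation.
\begin{align}
  R\,R^{\dagger} &= I ,\\
  (R\otimes I)(I\otimes R)(R\otimes I)
    &= (I\otimes R)(R\otimes I)(I\otimes R) .
\end{align}
```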

Relevance:

100.00%

Publisher:

Abstract:

Quantum information theory, applied to optical interferometry, yields a 1/n scaling of phase uncertainty Delta phi independent of the applied phase shift phi, where n is the number of photons in the interferometer. This 1/n scaling is achieved provided that the output state is subjected to an optimal phase measurement. We establish this scaling law for both passive (linear) and active (nonlinear) interferometers and identify the coefficient of proportionality. Whereas a highly nonclassical state is required to achieve optimal scaling for passive interferometry, a classical input state yields a 1/n scaling of phase uncertainty for active interferometry.
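For context, the 1/n scaling established above is the Heisenberg-limited scaling, to be contrasted with the shot-noise limit attainable with classical light in a passive interferometer:

```latex
% Phase-uncertainty scalings contrasted in the abstract:
% shot-noise-limited versus Heisenberg-limited interferometry.
\begin{align}
  \Delta\phi_{\text{shot noise}} &\sim \frac{1}{\sqrt{n}} ,\\
  \Delta\phi_{\text{Heisenberg}} &\sim \frac{1}{n} .
\end{align}
```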

Relevance:

100.00%

Publisher:

Abstract:

Introduction. A fundamental aspect of planning future actions is the performance and control of motor tasks. This behaviour is achieved through sensory-motor integration. Aim. To explain the electrophysiological mechanisms in the cortex (modifications of the alpha band) that are involved in anticipatory actions when individuals have to catch a free-falling object. Subjects and methods. The sample was made up of 20 healthy subjects of both sexes (11 males and 9 females) aged between 25 and 40 years (32.5 +/- 7.5) who were free of mental or physical diseases (previous medical history); the subjects were right-handed (Edinburgh Inventory) and were not taking any psychoactive or psychotropic substances at the time of the study. The experimental task required subjects to catch freely falling objects. The experiment was made up of six blocks of 15 trials, each of which lasted 2 minutes and 30 seconds before and two seconds after each ball was dropped. Results. An interaction of the factors moment and position was observed only for the right parieto-occipital cortex, in the combination of electrodes P4-O2. Conclusion. These findings suggest that the right parieto-occipital cortex plays an important role in increasing expectation and swiftness in the process of preparing for a motor task.

Relevance:

100.00%

Publisher:

Abstract:

Design of liquid retaining structures involves many decisions to be made by the designer based on rules of thumb, heuristics, judgment, codes of practice and previous experience. The various design parameters to be chosen include configuration, material, loading, etc. A novice engineer may face many difficulties in the design process. Recent developments in artificial intelligence and the emerging field of knowledge-based systems (KBS) have found widespread application in different fields. However, no attempt has been made to apply this kind of intelligent system to the design of liquid retaining structures. The objective of this study is thus to develop a KBS that can assist engineers in the preliminary design of liquid retaining structures. Moreover, it can provide expert advice to the user on the selection of design criteria, design parameters and the optimum configuration based on minimum cost. The development of a prototype KBS for the design of liquid retaining structures (LIQUID), using a blackboard architecture with hybrid knowledge representation techniques including a production rule system and an object-oriented approach, is presented in this paper. An expert system shell, Visual Rule Studio, is employed to facilitate the development of this prototype system. (C) 2002 Elsevier Science Ltd. All rights reserved.
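The abstract does not reproduce the LIQUID knowledge base, but the hybrid representation it mentions (object-oriented facts combined with production rules) can be sketched in a few lines of Python; the Tank class, the rules, and the numeric thresholds below are invented purely for illustration and are not the LIQUID rule base.

```python
# Minimal sketch of a hybrid knowledge representation: object-oriented facts
# (a Tank object) plus production rules fired by a tiny forward-chaining pass.
# The rules and thresholds are illustrative placeholders only.
from dataclasses import dataclass, field

@dataclass
class Tank:
    height_m: float
    capacity_m3: float
    recommendations: list = field(default_factory=list)

# Production rules: (condition on the object, action that annotates the object).
RULES = [
    (lambda t: t.height_m > 6.0,
     lambda t: t.recommendations.append("consider a circular prestressed section")),
    (lambda t: t.capacity_m3 <= 500.0,
     lambda t: t.recommendations.append("a rectangular reinforced-concrete tank is often cheapest")),
    (lambda t: t.height_m <= 6.0 and t.capacity_m3 > 500.0,
     lambda t: t.recommendations.append("compare rectangular and circular options on cost")),
]

def run_rules(tank: Tank) -> Tank:
    """Single forward-chaining pass: fire every rule whose condition holds."""
    for condition, action in RULES:
        if condition(tank):
            action(tank)
    return tank

if __name__ == "__main__":
    design = run_rules(Tank(height_m=4.5, capacity_m3=300.0))
    for advice in design.recommendations:
        print("-", advice)
```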