33 results for Variational calculus
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
Brown's model for the relaxation of the magnetization of a single-domain ferromagnetic particle is considered. This model results in the Fokker-Planck equation of the process. The solution of this equation in the cases of most interest is non-trivial. The probability density of orientations of the magnetization in the Fokker-Planck equation can be expanded in terms of an infinite set of eigenfunctions and their corresponding eigenvalues, where these obey a Sturm-Liouville type equation. A variational principle is applied to the solution of this equation in the case of an axially symmetric potential. The first (non-zero) eigenvalue, corresponding to the largest time constant, is considered. From this we obtain two new results. Firstly, an approximate minimising trial function is obtained which allows calculation of a rigorous upper bound. Secondly, a new upper bound formula is derived based on the Euler-Lagrange condition. This leads to very accurate calculation of the eigenvalue; interestingly, use of the simplest trial function then yields a result equivalent to the correlation time of Coffey et al. and the integral relaxation time of Garanin. (C) 2004 Elsevier B.V. All rights reserved.
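For orientation, the bound in question is of the standard Rayleigh-quotient type for a Sturm-Liouville eigenvalue problem. The generic form below, with coefficient functions p, q, w and an admissible trial function psi, is a notational sketch assumed for illustration, not the specific expressions derived in the paper:

\[
  -\frac{d}{dx}\!\left(p(x)\,\frac{d\phi}{dx}\right) + q(x)\,\phi(x) \;=\; \lambda\, w(x)\,\phi(x),
  \qquad
  \lambda_1 \;\le\; \frac{\int \left[\,p(x)\,\psi'(x)^2 + q(x)\,\psi(x)^2\,\right] dx}{\int w(x)\,\psi(x)^2\, dx}.
\]

For the first non-zero eigenvalue the trial function must additionally be chosen orthogonal, with weight w, to the stationary eigenfunction, which is why the choice of trial function governs the tightness of the bound.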
Abstract:
Incidence calculus is a mechanism for probabilistic reasoning in which sets of possible worlds, called incidences, are associated with axioms, and probabilities are then associated with these sets. Inference rules are used to deduce bounds on the incidence of formulae which are not axioms, and bounds for the probability of such a formula can then be obtained. In practice an assignment of probabilities directly to axioms may be given, and it is then necessary to find an assignment of incidences which will reproduce these probabilities. We show that this task of assigning incidences can be viewed as a tree-searching problem, and two techniques for performing this search are discussed. One of these is a new proposal involving a depth-first search, while the other incorporates a random element. A Prolog implementation of these methods has been developed. The two approaches are compared for efficiency and the significance of their results is discussed. Finally we discuss a new proposal for applying techniques from linear programming to incidence calculus.
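As a rough illustration of the tree-search view of incidence assignment (not the authors' Prolog implementation), the following Python sketch runs a depth-first search that assigns a subset of equiprobable possible worlds to each axiom so that the induced probabilities match the given targets; the equiprobable-world model, the tolerance, and all identifiers are assumptions made for the example.

from itertools import combinations

def assign_incidences(axioms, targets, n_worlds, tol=1e-9):
    """Depth-first search for incidence sets (subsets of possible worlds)
    whose probabilities reproduce the target probability of each axiom.
    Assumes n_worlds equiprobable worlds, each of weight 1/n_worlds."""
    worlds = range(n_worlds)
    weight = 1.0 / n_worlds

    def dfs(i, assignment):
        if i == len(axioms):                  # every axiom has an incidence set
            return assignment
        target = targets[axioms[i]]
        k = round(target / weight)            # number of worlds needed
        if abs(k * weight - target) > tol:    # target not representable exactly
            return None
        for incidence in combinations(worlds, k):   # branch over candidate sets
            assignment[axioms[i]] = set(incidence)
            result = dfs(i + 1, assignment)
            if result is not None:
                return result
        return None

    return dfs(0, {})

# Example: axioms with probabilities 0.5 and 0.25 over four equiprobable worlds
print(assign_incidences(["a", "b"], {"a": 0.5, "b": 0.25}, 4))

A real incidence calculus implementation would also propagate the logical constraints between axioms while searching; the sketch only shows the branching skeleton.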
Abstract:
Dealing with uncertainty problems in intelligent systems has attracted a lot of attention in the AI community. Quite a few techniques have been proposed. Among them, the Dempster-Shafer theory of evidence (DS theory) has been widely appreciated. In DS theory, Dempster's combination rule plays a major role. However, it has been pointed out that the application domains of the rule are rather limited and the application of the theory sometimes gives unexpected results. We have previously explored the problem with Dempster's combination rule and proposed an alternative combination mechanism in generalized incidence calculus. In this paper we give a comprehensive comparison between generalized incidence calculus and the Dempster-Shafer theory of evidence. We first prove that these two theories have the same ability in representing evidence and combining DS-independent evidence. We then show that the new approach can deal with some dependent situations while Dempster's combination rule cannot. Various examples in the paper show the ways of using generalized incidence calculus in expert systems.
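Since the discussion hinges on Dempster's combination rule, a minimal Python sketch of the rule for two mass functions is given below; representing focal elements as frozensets is an implementation choice for the example, not something taken from the paper.

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset focal elements to
    masses) with Dempster's rule, renormalizing by the total conflict."""
    combined = {}
    conflict = 0.0
    for b, mass_b in m1.items():
        for c, mass_c in m2.items():
            a = b & c
            if a:                      # non-empty intersection supports a
                combined[a] = combined.get(a, 0.0) + mass_b * mass_c
            else:                      # conflicting pair of focal elements
                conflict += mass_b * mass_c
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {a: mass / (1.0 - conflict) for a, mass in combined.items()}

# Example over the frame {x, y}
m1 = {frozenset({"x"}): 0.6, frozenset({"x", "y"}): 0.4}
m2 = {frozenset({"y"}): 0.3, frozenset({"x", "y"}): 0.7}
print(dempster_combine(m1, m2))

The renormalization by 1 - conflict discards the conflicting mass, which is the step usually at issue when the rule gives unexpected results outside its intended domain.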
Abstract:
This paper discusses the relations between extended incidence calculus and assumption-based truth maintenance systems (ATMSs). We first prove that managing labels for statements (nodes) in an ATMS is equivalent to producing incidence sets of these statements in extended incidence calculus. We then demonstrate that the justification set for a node is functionally equivalent to the implication relation set for the same node in extended incidence calculus. As a consequence, extended incidence calculus can provide justifications for an ATMS, because implication relation sets are discovered by the system automatically. We also show that extended incidence calculus provides a theoretical basis for constructing a probabilistic ATMS by associating proper probability distributions with assumptions. In this way, we can not only produce labels for all nodes in the system, but also calculate the probability of any such node. The nogood environments can also be obtained automatically. Therefore, extended incidence calculus and the ATMS are equivalent in carrying out inferences at both the symbolic level and the numerical level. This extends a result due to Laskey and Lehner.
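As a concrete illustration of the probabilistic-ATMS idea, the sketch below computes the probability of a node from its label (a set of environments, each a set of assumptions) by inclusion-exclusion; treating the assumptions as mutually independent, and all identifiers used, are assumptions of the example rather than requirements of extended incidence calculus.

from itertools import combinations

def node_probability(label, p_assumption):
    """Probability that at least one environment in the node's label holds,
    with the assumptions treated as mutually independent.
    `label` is a list of environments, each a frozenset of assumption names."""
    total = 0.0
    for r in range(1, len(label) + 1):
        for subset in combinations(label, r):
            union = frozenset().union(*subset)   # assumptions needed jointly
            p = 1.0
            for a in union:
                p *= p_assumption[a]
            total += (-1) ** (r + 1) * p         # inclusion-exclusion sign
    return total

# Example: label {{A1}, {A2, A3}} with independent assumption probabilities
print(node_probability([frozenset({"A1"}), frozenset({"A2", "A3"})],
                       {"A1": 0.9, "A2": 0.5, "A3": 0.8}))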
Abstract:
We restate the notion of orthogonal calculus in terms of model categories. This provides a cleaner set of results and makes the role of O(n)-equivariance clearer. Thus we develop model structures for the category of n-polynomial and n-homogeneous functors, along with Quillen pairs relating them. We then classify n-homogeneous functors, via a zig-zag of Quillen equivalences, in terms of spectra with an O(n)-action. This improves upon the classification theorem of Weiss. As an application, we develop a variant of orthogonal calculus by replacing topological spaces with orthogonal spectra.
Abstract:
In this paper, we report a fully ab initio variational Monte Carlo study of the linear and periodic chain of hydrogen atoms, a prototype system providing the simplest example of strong electronic correlation in low dimensions. In particular, we prove that numerical accuracy comparable to that of benchmark density-matrix renormalization-group calculations can be achieved by using a highly correlated Jastrow-antisymmetrized geminal power variational wave function. Furthermore, by using the so-called "modern theory of polarization" and by studying the spin-spin and dimer-dimer correlation functions, we have characterized in detail the crossover between the weakly and strongly correlated regimes of this atomic chain. Our results show that variational Monte Carlo provides an accurate and flexible alternative to highly correlated methods of quantum chemistry which, at variance with these methods, can also be applied to a strongly correlated solid in low dimensions close to a crossover or a phase transition.
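The variational Monte Carlo machinery itself can be illustrated on a far simpler toy problem than the hydrogen chain. The Python sketch below samples |psi_alpha|^2 for the 1D harmonic oscillator with Metropolis moves and averages the local energy; the model, the Gaussian trial wave function, and all parameters are illustrative assumptions, not the Jastrow-antisymmetrized geminal power wave function used in the paper.

import numpy as np

def vmc_energy(alpha, n_steps=100_000, step=1.0, seed=0):
    """Toy variational Monte Carlo: Metropolis sampling of |psi_alpha|^2 for
    the 1D harmonic oscillator with trial wave function exp(-alpha * x**2),
    returning the Monte Carlo estimate of the variational energy."""
    rng = np.random.default_rng(seed)
    x = 0.0
    energies = []
    for _ in range(n_steps):
        x_new = x + step * rng.uniform(-1, 1)
        # Accept with probability min(1, |psi(x_new)|^2 / |psi(x)|^2)
        if rng.uniform() < np.exp(-2 * alpha * (x_new**2 - x**2)):
            x = x_new
        # Local energy of the trial state: E_L(x) = alpha + x^2 (1/2 - 2 alpha^2)
        energies.append(alpha + x**2 * (0.5 - 2 * alpha**2))
    return np.mean(energies)

# At the optimal alpha = 0.5 the exact ground-state energy 0.5 is recovered
print(vmc_energy(alpha=0.5))

Production calculations differ mainly in scale: the trial wave function carries many variational parameters (here the Jastrow and geminal terms) and the local energy involves the full many-electron Hamiltonian.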
Abstract:
Situation calculus has been applied widely in artificial intelligence to model and reason about actions and changes in dynamic systems. Since actions carried out by agents will cause constant changes of the agents' beliefs, how to manage these changes is a very important issue. Shapiro et al. [22] is one of the studies that considered this issue. However, in this framework, the problem of noisy sensing, which often arises in real-world applications, is not considered. As a consequence, noisy sensing actions in this framework will lead to an agent facing an inconsistent situation, and subsequently the agent cannot proceed further. In this paper, we investigate how noisy sensing actions can be handled in iterated belief change within the situation calculus formalism. We extend the framework proposed in [22] with the capability of managing noisy sensing. We demonstrate that an agent can still detect the actual situation when the ratio of noisy sensing actions to accurate sensing actions is limited. We prove that our framework subsumes the iterated belief change strategy in [22] when all sensing actions are accurate. Furthermore, we prove that our framework can adequately handle belief introspection, mistaken beliefs, belief revision and belief update even with noisy sensing, as done in [22] with accurate sensing actions only.
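The claim that the actual situation remains detectable when noisy sensing actions are sufficiently rare can be conveyed with a deliberately simple toy: repeated (possibly noisy) sensing of a Boolean fluent followed by a majority vote. This is only an intuition-level sketch under assumptions of the example; it is not the belief-change strategy developed in the paper.

from collections import Counter

def sensed_value(readings):
    """Majority vote over repeated sensing results for one Boolean fluent:
    the true value wins whenever noisy readings stay in the minority."""
    value, _ = Counter(readings).most_common(1)[0]
    return value

# Seven accurate readings against two noisy ones still identify the fluent
print(sensed_value([True] * 7 + [False] * 2))   # True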
Abstract:
In the theory of central simple algebras, we often deal with abelian groups which arise from the kernel or co-kernel of functors which respect transfer maps (for example K-functors). Since a central simple algebra splits and the functors above are "trivial" in the split case, one can prove a certain calculus on these functors. The common examples are the kernel or co-kernel of the maps Ki(F) → Ki(D), where the Ki are Quillen K-groups, D is a division algebra and F its center, or the homotopy fiber arising from the long exact sequence of the above map, or the reduced Whitehead group SK1. In this note we introduce an abstract functor over the category of Azumaya algebras which covers all the functors mentioned above and prove the usual calculus for it. This, for example, immediately shows that the K-theory of an Azumaya algebra over a local ring is "almost" the same as the K-theory of the base ring. The main result is to prove that the reduced K-theory of an Azumaya algebra over a Henselian ring coincides with the reduced K-theory of its residue central simple algebra. The note ends with some calculations attempting to determine the homotopy fibers mentioned above.
Abstract:
We report results for e(+/-)-Ps(1s) scattering in the energy range up to 80 eV calculated in 9-state and 30-state coupled pseudostate approximations. Cross-sections are presented for elastic scattering, ortho-para conversion, discrete excitation, ionization and total scattering. Resonances associated with the Ps(n = 2) threshold are also examined and their positions and widths determined. Very good agreement is obtained with the variational calculations of Ward et al. [J. Phys. B 20 (1987) 127] below 5.1 eV. (C) 2004 Elsevier B.V. All rights reserved.
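The abstract does not state how the resonance parameters were extracted; for orientation only, a parametrization commonly used for such fits is the Breit-Wigner form, in which a resonance of position E_r and width Gamma appears in an eigenphase or cross-section as

\[
  \delta(E) \;=\; \delta_{\mathrm{bg}}(E) + \arctan\!\frac{\Gamma/2}{E_r - E},
  \qquad
  \sigma(E) \;\propto\; \frac{(\Gamma/2)^2}{(E - E_r)^2 + (\Gamma/2)^2}.
\]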
Abstract:
A many-body theory approach is developed for the problem of positron-atom scattering and annihilation. Strong electron-positron correlations are included nonperturbatively through the calculation of the electron-positron vertex function. It corresponds to the sum of an infinite series of ladder diagrams, and describes the physical effect of virtual positronium formation. The vertex function is used to calculate the positron-atom correlation potential and nonlocal corrections to the electron-positron annihilation vertex. Numerically, we make use of B-spline basis sets, which ensures rapid convergence of the sums over intermediate states. We have also devised an extrapolation procedure that allows one to achieve convergence with respect to the number of intermediate-state orbital angular momenta included in the calculations. As a test, the present formalism is applied to positron scattering and annihilation on hydrogen, where it is exact. Our results agree with those of accurate variational calculations. We also examine in detail the properties of the large correlation corrections to the annihilation vertex.
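Extrapolation over orbital angular momenta is typically done by fitting the tail of the partial-wave expansion with an inverse-power ansatz. The Python sketch below fits S(L) = S_inf + B/(L + 1/2)^p by linear least squares and reads off the L -> infinity limit; the specific ansatz, the exponent, and the synthetic data are assumptions for illustration, not the procedure used in the paper.

import numpy as np

def extrapolate_partial_waves(L_values, partial_sums, power):
    """Fit partial sums S(L) = S_inf + B / (L + 1/2)**power by least squares
    in x = (L + 1/2)**(-power) and return the extrapolated limit S_inf."""
    x = (np.asarray(L_values, dtype=float) + 0.5) ** (-power)
    y = np.asarray(partial_sums, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)    # y ~ intercept + slope * x
    return intercept                          # value at x -> 0, i.e. L -> infinity

# Synthetic check: a limit of 1.0 approached as 1 - 0.3/(L + 1/2)**2
L = np.arange(4, 11)
sums = 1.0 - 0.3 / (L + 0.5) ** 2
print(extrapolate_partial_waves(L, sums, power=2))   # ~1.0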