951 results for Variational calculus


Relevance: 10.00%

Abstract:

BACKGROUND: Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. RESULTS: Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), rhinovirus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with comparisons as well to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. CONCLUSIONS: Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data.
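The IBP mentioned in the abstract can be simulated with the standard "customers and dishes" scheme, which shows concretely how the number of latent factors is generated by the prior rather than fixed in advance. The sketch below is illustrative only; the concentration parameter `alpha` and the toy dimensions are our own choices, not values from the study.

```python
import numpy as np

def sample_ibp(num_customers, alpha, rng):
    """Draw a binary matrix Z from the Indian Buffet Process.

    Rows are "customers" (e.g. samples), columns are "dishes" (latent
    factors); the number of columns is not fixed in advance but generated
    by the process itself, which is what lets such models infer the
    number of factors from data.
    """
    counts = []   # m_k: how many previous customers took dish k
    rows = []     # dishes chosen by each customer
    for i in range(1, num_customers + 1):
        # take each existing dish with probability m_k / i
        chosen = [k for k, m_k in enumerate(counts) if rng.random() < m_k / i]
        # then sample a Poisson(alpha / i) number of brand-new dishes
        for _ in range(rng.poisson(alpha / i)):
            counts.append(0)
            chosen.append(len(counts) - 1)
        for k in chosen:
            counts[k] += 1
        rows.append(chosen)
    Z = np.zeros((num_customers, len(counts)), dtype=int)
    for i, chosen in enumerate(rows):
        Z[i, chosen] = 1
    return Z

rng = np.random.default_rng(0)
Z = sample_ibp(10, 2.0, rng)   # number of columns varies run to run
```

In a sparse factor model such a `Z` masks the factor-loading matrix, so columns of all zeros simply correspond to unused factors.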

Relevance: 10.00%

Abstract:

In regression analysis of counts, a lack of simple and efficient algorithms for posterior computation has made Bayesian approaches appear unattractive and thus underdeveloped. We propose a lognormal and gamma mixed negative binomial (NB) regression model for counts, and present efficient closed-form Bayesian inference; unlike conventional Poisson models, the proposed approach has two free parameters to include two different kinds of random effects, and allows the incorporation of prior information, such as sparsity in the regression coefficients. By placing a gamma distribution prior on the NB dispersion parameter r, and connecting a log-normal distribution prior with the logit of the NB probability parameter p, efficient Gibbs sampling and variational Bayes inference are both developed. The closed-form updates are obtained by exploiting conditional conjugacy via both a compound Poisson representation and a Polya-Gamma distribution based data augmentation approach. The proposed Bayesian inference can be implemented routinely, while being easily generalizable to more complex settings involving multivariate dependence structures. The algorithms are illustrated using real examples. Copyright 2012 by the author(s)/owner(s).
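The (r, p) parameterization referred to above can be made concrete with the negative binomial pmf itself. The following is a minimal sketch with illustrative numerical values; it shows only the likelihood, not the Gibbs or variational updates developed in the paper.

```python
import math

def nb_logpmf(y, r, p):
    """Log pmf of the negative binomial in the (r, p) parameterization:

        P(y) = Gamma(y + r) / (Gamma(r) * y!) * (1 - p)^r * p^y,

    so the mean is r p / (1 - p) and the variance r p / (1 - p)^2,
    which exceeds the mean whenever p > 0 (overdispersion, with the
    dispersion controlled by r)."""
    return (math.lgamma(y + r) - math.lgamma(r) - math.lgamma(y + 1)
            + r * math.log(1.0 - p) + y * math.log(p))

# In the regression model described above, p varies per observation via a
# logit link, logit(p_i) = x_i'beta + eps_i, with a log-normal prior on the
# odds and a gamma prior on r; the Polya-Gamma and compound-Poisson
# augmentations then give the closed-form conditional updates.
example_ll = nb_logpmf(3, 2.5, 0.3)
```

Placing priors on both r and p is what gives the model its two free parameters for two kinds of random effects, in contrast to the single-parameter Poisson.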


Relevance: 10.00%

Abstract:

A novel multi-scale seamless model of brittle-crack propagation is proposed and applied to the simulation of fracture growth in a two-dimensional Ag plate with macroscopic dimensions. The model represents the crack propagation at the macroscopic scale as the drift-diffusion motion of the crack tip alone. The diffusive motion is associated with the crack-tip coordinates in the position space, and reflects the oscillations observed in the crack velocity once it exceeds its critical value. The model couples the crack dynamics at the macroscales and nanoscales via an intermediate mesoscale continuum. The finite-element method is employed to make the transition from the macroscale to the nanoscale by computing the continuum-based displacements of the atoms at the boundary of an atomic lattice embedded within the plate and surrounding the tip. Molecular dynamics (MD) simulation then drives the crack tip forward, producing the tip critical velocity and its diffusion constant. These are then used in the Ito stochastic calculus to make the reverse transition from the nanoscale back to the macroscale. The MD-level modelling is based on the use of a many-body potential. The model successfully reproduces the crack-velocity oscillations, roughening transitions of the crack surfaces, as well as the macroscopic crack trajectory. The implications for 3-D modelling are discussed.

Relevance: 10.00%

Abstract:

A novel multiscale model of brittle crack propagation in an Ag plate with macroscopic dimensions has been developed. The model represents crack propagation as stochastic drift-diffusion motion of the crack-tip atom through the material, and couples the dynamics across three different length scales. It integrates the nanomechanics of bond rupture at the crack tip with the displacement and stress field equations of continuum-based fracture theories. The finite element method is employed to obtain the continuum-based displacement and stress fields over the macroscopic plate, and these are then used to drive the crack tip forward at the atomic level using the molecular dynamics simulation method based on many-body interatomic potentials. The linkage from the nanoscopic scale back to the macroscopic scale is established via the Ito stochastic calculus, the stochastic differential equation of which advances the tip to a new position on the macroscopic scale using the crack velocity and diffusion constant obtained on the nanoscale. Well-known crack characteristics, such as the roughening transitions of the crack surfaces, crack velocity oscillations, as well as the macroscopic crack trajectories, are obtained.
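The macroscale step of such a scheme, advancing the tip via a stochastic differential equation, can be sketched with a plain Euler-Maruyama integrator. This is a one-dimensional toy: in the model above the drift (crack velocity) and diffusion constant come from the MD simulation, whereas the values below are placeholders.

```python
import numpy as np

def advance_tip(x0, v, D, dt, n_steps, rng):
    """Euler-Maruyama integration of a drift-diffusion SDE for the
    crack-tip coordinate,

        dX = v dt + sqrt(2 D) dW,

    where v plays the role of the crack velocity and D the diffusion
    constant (both supplied, in the multiscale model, by the nanoscale
    MD simulation; here they are placeholder numbers)."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        x[k + 1] = x[k] + v * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal()
    return x

rng = np.random.default_rng(1)
path = advance_tip(0.0, v=1.0, D=0.05, dt=0.01, n_steps=1000, rng=rng)
```

Setting D = 0 recovers purely ballistic tip motion, which is a convenient sanity check on the integrator.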

Relevance: 10.00%

Abstract:

Use of structuring mechanisms (such as modularisation) is widely believed to be one of the key ways to improve software quality. Structuring is considered to be at least as important for specification documents as for source code, since it is assumed to improve comprehensibility. Yet, as with most widely held assumptions in software engineering, there is little empirical evidence to support this hypothesis. Also, even if structuring can be shown to be a good thing, we do not know how much structuring is optimal. One of the more popular formal specification languages, Z, encourages structuring through its schema calculus. A controlled experiment is described in which two hypotheses about the effects of structure on the comprehensibility of Z specifications are tested. Evidence was found that structuring a specification into schemas of about 20 lines long significantly improved comprehensibility over a monolithic specification. However, there seems to be no perceived advantage in breaking down the schemas into much smaller components. The experiment can be fully replicated.

Relevance: 10.00%

Abstract:

There are three main approaches to the representation of temporal information in AI literature: the so-called method of temporal arguments that simply extends functions and predicates of first-order language to include time as the additional argument; modal temporal logics which are extensions of the propositional or predicate calculus with modal temporal operators; and reified temporal logics which reify standard propositions of some initial language (e.g., the classical first-order or modal logic) as objects denoting propositional terms. The objective of this paper is to provide an overview on the temporal reified approach by looking closely at some representative existing systems featuring reified propositions, including those of Allen, McDermott, Shoham, Reichgelt, Galton, and Ma and Knight. We shall demonstrate that, although reified logics might be more complicated in expressing assertions about some given objects with respect to different times, they accord a special status to time and therefore have several distinct advantages in talking about some important issues which would be difficult (if not impossible) to express in other approaches.
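The contrast between the three approaches can be made concrete on a single assertion, say "the valve is open at time t" (the predicate names here are illustrative, not drawn from any of the cited systems):

```latex
% Method of temporal arguments: time is an extra argument slot
\mathit{Open}(\mathit{valve}, t)

% Modal temporal logic: a temporal operator prefixes the proposition
\Diamond\, \mathit{Open}(\mathit{valve})
\quad \text{(``the valve will be open at some future time'')}

% Reified temporal logic: the proposition itself becomes a term,
% related to time by a truth predicate such as HOLDS
\mathit{HOLDS}(\mathit{open}(\mathit{valve}), t)
```

In the reified form, time and the proposition are both first-class objects, which is what allows the special status accorded to time that the paper discusses.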

Relevance: 10.00%

Abstract:

This paper presents a simple approach to the so-called frame problem based on some ordinary set operations, which does not require non-monotonic reasoning. Following the notion of the situation calculus, we shall represent a state of the world as a set of fluents, where a fluent is simply a Boolean-valued property whose truth-value is dependent on the time. High-level causal laws are characterised in terms of relationships between actions and the involved world states. An effect completion axiom is imposed on each causal law, which guarantees that all the fluents that can be affected by the performance of the corresponding action are always totally governed. It is shown that, compared with other techniques, such a set operation based approach provides a simpler and more effective treatment of the frame problem.
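The set-operation idea can be sketched in a few lines: a state is a set of fluents, and an action's effects are captured by add and delete sets. The fluent and action names below are invented for illustration.

```python
def apply_action(state, add, delete):
    """Successor state as plain set operations: fluents not named in the
    action's add/delete sets persist automatically, which is why no frame
    axioms are needed.  (The effect completion axiom in the text amounts to
    requiring that add/delete enumerate *all* fluents the action affects.)"""
    return (state - delete) | add

# Illustrative fluents: flipping a light switch leaves the door untouched.
s0 = {"door_open", "light_off"}
s1 = apply_action(s0, add={"light_on"}, delete={"light_off"})
```

Persistence of unaffected fluents (here, `door_open`) falls out of the set difference and union, with no non-monotonic machinery.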

Relevance: 10.00%

Abstract:

Abstract In the theory of central simple algebras, often we are dealing with abelian groups which arise from the kernel or co-kernel of functors which respect transfer maps (for example K-functors). Since a central simple algebra splits and the functors above are "trivial" in the split case, one can prove certain calculus on these functors. The common examples are kernel or co-kernel of the maps K_i(F) → K_i(D), where K_i are Quillen K-groups, D is a division algebra and F its center, or the homotopy fiber arising from the long exact sequence of the above map, or the reduced Whitehead group SK_1. In this note we introduce an abstract functor over the category of Azumaya algebras which covers all the functors mentioned above and prove the usual calculus for it. This, for example, immediately shows that K-theory of an Azumaya algebra over a local ring is "almost" the same as K-theory of the base ring. The main result is to prove that reduced K-theory of an Azumaya algebra over a Henselian ring coincides with reduced K-theory of its residue central simple algebra. The note ends with some calculation trying to determine the homotopy fibers mentioned above.

Relevance: 10.00%

Abstract:

We report results for e(+/-)-Ps(1s) scattering in the energy range up to 80 eV calculated in 9-state and 30-state coupled pseudostate approximations. Cross-sections are presented for elastic scattering, ortho-para conversion, discrete excitation, ionization and total scattering. Resonances associated with the Ps(n = 2) threshold are also examined and their positions and widths determined. Very good agreement is obtained with the variational calculations of Ward et al. [J. Phys. B 20 (1987) 127] below 5.1 eV. (C) 2004 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

A many-body theory approach is developed for the problem of positron-atom scattering and annihilation. Strong electron-positron correlations are included nonperturbatively through the calculation of the electron-positron vertex function. It corresponds to the sum of an infinite series of ladder diagrams, and describes the physical effect of virtual positronium formation. The vertex function is used to calculate the positron-atom correlation potential and nonlocal corrections to the electron-positron annihilation vertex. Numerically, we make use of B-spline basis sets, which ensures rapid convergence of the sums over intermediate states. We have also devised an extrapolation procedure that allows one to achieve convergence with respect to the number of intermediate-state orbital angular momenta included in the calculations. As a test, the present formalism is applied to positron scattering and annihilation on hydrogen, where it is exact. Our results agree with those of accurate variational calculations. We also examine in detail the properties of the large correlation corrections to the annihilation vertex.

Relevance: 10.00%

Abstract:

Near-threshold ionization of He has been studied by using a uniform semiclassical wavefunction for the two outgoing electrons in the final channel. The quantum mechanical transition amplitude for the direct and exchange scattering derived earlier by using the Kohn variational principle has been used to calculate the triple differential cross sections. Contributions from singlets and triplets are critically examined near the threshold for coplanar asymmetric geometry with equal energy sharing by the two outgoing electrons. It is found that in general the triplet contribution is much smaller compared to its singlet counterpart. However, at unequal scattering angles such as theta (1) = 60 degrees, theta (2) = 120 degrees the smaller peaks in the triplet contribution enhance both primary and secondary TDCS peaks. Significant improvements of the primary peak in the TDCS are obtained for the singlet results both in symmetric and asymmetric geometry indicating the need to treat the classical action variables without any approximation. Convergence of these cross sections is also achieved against the higher partial waves. Present results are compared with absolute and relative measurements of Rosel et al (1992 Phys. Rev. A 46 2539) and Selles et al (1987 J. Phys. B: At. Mol. Phys. 20 5195) respectively.

Relevance: 10.00%

Abstract:

An efficient method for calculating the electronic structure of systems that need a very fine sampling of the Brillouin zone is presented. The method is based on the variational optimization of a single (i.e., common to all points in the Brillouin zone) basis set for the expansion of the electronic orbitals. Considerations from k·p-approximation theory help to understand the efficiency of the method. The accuracy and the convergence properties of the method as a function of the optimal basis set size are analyzed for a test calculation on a 16-atom Na supercell.
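The variational logic behind optimizing a basis set can be illustrated with the standard Rayleigh-Ritz step: in any finite basis the Schrodinger problem becomes a matrix eigenproblem, any trial vector bounds the ground-state energy from above, and enlarging the basis can only lower that bound. The 3x3 Hamiltonian below is a toy with invented numbers, not the Na-supercell calculation.

```python
import numpy as np

# Toy Hamiltonian in an orthonormal 3-state basis (illustrative numbers).
H = np.array([[1.0, 0.2, 0.0],
              [0.2, 2.0, 0.3],
              [0.0, 0.3, 3.0]])

# Rayleigh-Ritz in the full basis: eigenvalues in ascending order.
E3, _ = np.linalg.eigh(H)

# The same problem restricted to a 2-state sub-basis: the variational
# ground-state estimate can only rise when the basis shrinks.
E2, _ = np.linalg.eigh(H[:2, :2])

# Any normalized trial vector gives an upper bound on the ground state.
c = np.array([1.0, 1.0, 0.0])
c /= np.linalg.norm(c)
rayleigh_quotient = c @ H @ c
```

Optimizing one common basis for all Brillouin-zone points, as in the paper, amounts to minimizing such bounds simultaneously over many related eigenproblems.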

Relevance: 10.00%

Abstract:

Ab initio nonlocal pseudopotential variational quantum Monte Carlo techniques are used to compute the correlation effects on the valence momentum density and Compton profile of silicon. Our results for this case are in excellent agreement with the Lam-Platzman correction computed within the local density approximation. Within the approximations used, we rule out valence electron correlations as the dominant source of discrepancies between calculated and measured Compton profiles of silicon.
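The variational quantum Monte Carlo idea can be shown on a toy scale: sample |psi|^2 by Metropolis and average the local energy. The one-dimensional harmonic oscillator below (hbar = m = omega = 1, Gaussian trial wavefunction) is a textbook stand-in, not the pseudopotential silicon calculation of the abstract.

```python
import numpy as np

def vmc_energy(alpha, n_steps, step, rng):
    """Toy variational Monte Carlo for the 1D harmonic oscillator with
    trial wavefunction psi_alpha(x) = exp(-alpha x^2).

    The local energy H psi / psi works out to
        E_L(x) = alpha + x^2 (1/2 - 2 alpha^2),
    and its average over |psi|^2 is the variational energy estimate."""
    x, e_sum = 0.0, 0.0
    for _ in range(n_steps):
        x_new = x + step * (rng.random() - 0.5)
        # Metropolis: accept with prob min(1, |psi(x_new)|^2 / |psi(x)|^2)
        if rng.random() < np.exp(-2.0 * alpha * (x_new**2 - x**2)):
            x = x_new
        e_sum += alpha + x**2 * (0.5 - 2.0 * alpha**2)
    return e_sum / n_steps

energy = vmc_energy(0.5, 2000, 1.0, np.random.default_rng(0))
```

At the exact alpha = 1/2 the local energy is constant, so the estimate has zero variance; away from it the variance grows, which is the usual diagnostic for trial-wavefunction quality.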

Relevance: 10.00%

Abstract:

The identification of nonlinear dynamic systems using radial basis function (RBF) neural models is studied in this paper. Given a model selection criterion, the main objective is to effectively and efficiently build a parsimonious compact neural model that generalizes well over unseen data. This is achieved by simultaneous model structure selection and optimization of the parameters over the continuous parameter space. It is a mixed-integer hard problem, and a unified analytic framework is proposed to enable an effective and efficient two-stage mixed discrete-continuous identification procedure. This novel framework combines the advantages of an iterative discrete two-stage subset selection technique for model structure determination and the calculus-based continuous optimization of the model parameters. Computational complexity analysis and simulation studies confirm the efficacy of the proposed algorithm.
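The two-stage flavour of such a procedure, discrete selection of model structure followed by continuous estimation of parameters, can be sketched on a toy scale. The greedy forward selection and Gaussian kernel below are simplifications standing in for the paper's subset-selection technique; the widths, centers, and data are invented for illustration.

```python
import numpy as np

def rbf_design(x, centers, width):
    """Gaussian RBF design matrix: Phi[i, j] = exp(-((x_i - c_j)/width)^2)."""
    return np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)

def greedy_rbf_fit(x, y, candidate_centers, width, n_terms):
    """Discrete stage: greedily pick centers that most reduce the residual.
    Continuous stage: least-squares estimation of the weights."""
    selected = []
    for _ in range(n_terms):
        best_c, best_err = None, np.inf
        for c in candidate_centers:
            if c in selected:
                continue
            Phi = rbf_design(x, np.array(selected + [c]), width)
            w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
            err = np.sum((y - Phi @ w) ** 2)
            if err < best_err:
                best_c, best_err = c, err
        selected.append(best_c)
    centers = np.array(selected)
    w, *_ = np.linalg.lstsq(rbf_design(x, centers, width), y, rcond=None)
    return centers, w

x = np.linspace(-3.0, 3.0, 50)
y = 2.0 * np.exp(-x**2)                       # one "true" RBF at the origin
centers, w = greedy_rbf_fit(x, y, np.array([-1.0, -0.5, 0.0, 0.5, 1.0]),
                            width=1.0, n_terms=1)
```

The exhaustive re-fit inside the greedy loop is what makes a naive version expensive; the paper's contribution is precisely a framework that makes this discrete-continuous search efficient.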