955 results for statistical potentials
Abstract:
We propose a likelihood ratio test (LRT) with Bartlett correction to identify Granger causality between sets of time-series gene expression data. The performance of the proposed test is compared with that of a previously published bootstrap-based approach. The LRT is shown to be significantly faster and to remain statistically powerful even under non-Normal distributions. An R package named gGranger, containing implementations of both Granger causality identification tests, is also provided.
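The abstract does not show gGranger's interface, so the following is only a rough Python sketch of the generic idea behind a vector-autoregressive Granger-causality LRT, not the package's actual code. The Bartlett-type factor below is a generic small-sample scaling, not the paper's derived correction, and all names and data are hypothetical.

```python
# Sketch of a Granger-causality LRT: does the past of series set Y improve
# prediction of series set X beyond X's own past? Illustrative only.
import numpy as np
from scipy import stats

def lag_matrix(Z, p):
    """Stack lags 1..p of the columns of Z (T x k) into a (T-p) x (k*p) design."""
    T = Z.shape[0]
    return np.hstack([Z[p - l:T - l] for l in range(1, p + 1)])

def granger_lrt(X, Y, p=1):
    """LRT with a generic Bartlett-type scaling; X is (T x kx), Y is (T x ky)."""
    T = X.shape[0]
    target = X[p:]
    full = np.hstack([lag_matrix(X, p), lag_matrix(Y, p), np.ones((T - p, 1))])
    restr = np.hstack([lag_matrix(X, p), np.ones((T - p, 1))])
    def gen_var(D):  # determinant of the OLS residual covariance
        beta, *_ = np.linalg.lstsq(D, target, rcond=None)
        resid = target - D @ beta
        return np.linalg.det(resid.T @ resid / (T - p))
    df = X.shape[1] * Y.shape[1] * p  # number of zero restrictions tested
    lr = (T - p) * np.log(gen_var(restr) / gen_var(full))
    # Generic small-sample correction factor, NOT the paper's formula:
    correction = 1.0 - (full.shape[1] + X.shape[1] + 1) / (2.0 * (T - p))
    return correction * lr, stats.chi2.sf(correction * lr, df)

# Toy usage: X depends on the lag-1 value of Y's first series.
rng = np.random.default_rng(0)
Y = rng.normal(size=(200, 2))
X = rng.normal(size=(200, 1))
X[1:] += 0.8 * Y[:-1, :1]
print(granger_lrt(X, Y, p=1))  # small p-value expected
```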
Abstract:
Carbon dioxide electroreduction on a copper electrode was studied by surface-enhanced Raman scattering (SERS) in K₂SO₄ aqueous solutions of different pH. CO₂ was bubbled into the solution at 0 V vs. Ag/AgCl, i.e., on an oxidized copper surface. In acidic solutions (pH around 2.5), at -0.2 V, bands indicative of the presence of ethylene on the electrode surface were detected. Although ethylene is known to be a product of CO₂ electroreduction on copper, it had not been experimentally identified on the electrode surface at such a low cathodic potential in prior work. In solutions with pH around 2.5, CO bands were not observed, suggesting that hydrocarbons could be formed by a pathway that does not proceed via adsorbed CO. In solutions of higher pH, a complex spectral pattern between 800 and 1700 cm⁻¹ was observed at approximately -0.4 V. The observed spectrum closely resembles those reported in the literature for the adsorption of short-chain monocarboxylic acids. The spectral features indicate the presence of a structure containing a C=C double bond, a carboxyl group, and C-H bonds on the electrode surface. SERS spectra obtained in CO-saturated solution are also presented; in this case, however, no SERS bands were observed between 800 and 1700 cm⁻¹ at low cathodic potentials.
Abstract:
We study the dynamics of Bose-Einstein condensates in symmetric double-well potentials following a sudden change of the potential from the Mott-insulator to the superfluid regime. We introduce a continuum approximation that maps this problem onto the wave-packet dynamics of a particle in an anharmonic effective potential. For repulsive two-body interactions, the visibility of the interference fringes that result from the superposition of the two condensates after a stage of ballistic expansion exhibits a collapse of coherent oscillations onto a background value whose magnitude depends on the amount of squeezing in the initial state. Strong attractive interactions are found to stabilize the relative-number dynamics. We visualize the dynamics of the system in phase space using a quasiprobability distribution that allows for an intuitive interpretation of the various types of dynamics.
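For context, double-well condensate dynamics of this kind are conventionally described by the two-mode Bose-Hubbard Hamiltonian below (standard textbook background, assumed rather than quoted from the paper); the Mott-to-superfluid quench corresponds to a sudden increase of the tunneling-to-interaction ratio J/U:

```latex
% Two-mode (Bose-Hubbard dimer) model of a double-well condensate:
% J is the tunneling amplitude, U the on-site two-body interaction.
\hat{H} = -J\left(\hat{a}^{\dagger}\hat{b} + \hat{b}^{\dagger}\hat{a}\right)
        + \frac{U}{2}\left[\hat{n}_a\left(\hat{n}_a-1\right) + \hat{n}_b\left(\hat{n}_b-1\right)\right]
```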
Abstract:
Detection of weak forces with an accuracy beyond the standard quantum limit holds promise both for fundamental research and for technological applications. Schemes involving ultracold atoms are now considered prime candidates for such increased sensitivity. In this paper we use a combination of analytical and numerical techniques to investigate the possible sub-shot-noise estimation of applied force fields through detection of the coherence dynamics of Bose-condensed atoms in asymmetric double-well traps. Following a semiclassical description of the system dynamics and fringe visibility, we present numerical simulations of the full quantum dynamics that demonstrate the dynamical production of phase squeezing beyond the standard quantum limit. Nonlinear interactions are found to limit the achievable squeezing to a finite value determined by the external weak force.
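As textbook background (not taken from the paper itself), the standard quantum limit for interferometric phase estimation with N uncorrelated atoms, and the spin-squeezing parameter whose value below one signals sub-shot-noise sensitivity, are:

```latex
% Standard quantum limit and Wineland spin-squeezing criterion:
\Delta\phi_{\mathrm{SQL}} = \frac{1}{\sqrt{N}}, \qquad
\xi_R = \frac{\sqrt{N}\,\Delta J_z}{\left|\langle \hat{\mathbf{J}} \rangle\right|}, \qquad
\Delta\phi = \frac{\xi_R}{\sqrt{N}} < \Delta\phi_{\mathrm{SQL}} \quad \text{for } \xi_R < 1 .
```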
Abstract:
Quadratic assignment problems (QAPs) are commonly solved by heuristic methods, in which the optimum is sought iteratively. Heuristics are known to provide good solutions, but the quality of those solutions, i.e., a confidence interval for the optimum, is unknown. This paper uses statistical optimum estimation techniques (SOETs) to assess the quality of genetic algorithm solutions for QAPs. We examine the behavior of different SOETs with respect to bias, coverage rate, and interval length, and then compare the SOET lower bound with deterministic ones. The commonly used deterministic bounds are confined to only a few algorithms. We show that the Jackknife estimators perform better than the Weibull estimators, and that when the number of heuristic solutions is as large as 100, higher-order Jackknife estimators perform better than lower-order ones. Compared with the deterministic bounds, the SOET lower bound performs significantly better than most deterministic lower bounds and is comparable with the best deterministic ones.
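As a rough illustration of the two estimator families compared here (standard forms from the SOET literature; the paper's exact variants may differ), a minimal Python sketch with purely illustrative names and toy data:

```python
# Point estimation of an unknown minimum from the best objective values of
# repeated independent heuristic runs. Standard-form estimators; illustrative.
import numpy as np
from math import comb
from scipy import stats

def jackknife_min(run_minima, order=2):
    """Order-M jackknife estimate from the smallest order statistics:
    M=1 gives 2*y(1) - y(2); M=2 gives 3*y(1) - 3*y(2) + y(3); and so on."""
    y = np.sort(np.asarray(run_minima))
    return sum((-1) ** j * comb(order + 1, j + 1) * y[j] for j in range(order + 1))

def weibull_min_location(run_minima):
    """Fit a three-parameter Weibull to the run minima; the fitted location
    parameter serves as the point estimate of the global optimum."""
    shape, loc, scale = stats.weibull_min.fit(run_minima)
    return loc

# Toy usage: pretend each value is the best objective from one GA run,
# with true optimum 1000.
rng = np.random.default_rng(0)
runs = 1000 + stats.weibull_min.rvs(1.5, scale=50, size=100, random_state=rng)
print(jackknife_min(runs, order=2), weibull_min_location(runs))
```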
Abstract:
Solutions to combinatorial optimization problems, such as problems of locating facilities, frequently rely on heuristics to minimize the objective function. The optimum is sought iteratively, and a criterion is needed to decide when the procedure has (almost) attained it. Pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small, almost dormant, branch of the literature suggests using statistical principles to estimate the minimum and its bounds, as a tool both for deciding when to stop and for evaluating the quality of the solution. In this paper we examine the functioning of statistical bounds obtained from four different estimators, using simulated annealing on p-median test problems taken from Beasley's OR-Library. We find the Weibull estimator and the second-order Jackknife estimator preferable, and the required sample size to be about 10, much less than the current recommendation. However, reliable statistical bounds are found to depend critically on a sample of heuristic solutions of high quality, and we give a simple statistic useful for checking that quality. We end the paper with an illustration of using statistical bounds in a problem of locating some 70 distribution centers of the Swedish Post in one Swedish region.
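A minimal sketch of a Weibull-based statistical bound in the Golden-Alt tradition (a standard form from this literature; the abstract does not spell out the paper's four estimators, so this is an assumption):

```python
# Statistical lower bound on the unknown minimum from n independent run
# minima: fit Weibull(shape c, location a, scale b); the interval
# [y(1) - b_hat, y(1)] covers the optimum with confidence ~ 1 - exp(-n).
import numpy as np
from scipy import stats

def weibull_statistical_bound(run_minima):
    y = np.sort(np.asarray(run_minima))
    c_hat, a_hat, b_hat = stats.weibull_min.fit(y)  # shape, location, scale
    lower = y[0] - b_hat                            # statistical lower bound
    confidence = 1.0 - np.exp(-len(y))              # nominal coverage
    return lower, y[0], confidence
```

With a sample size of about 10, as suggested above, the nominal confidence 1 - e⁻¹⁰ already exceeds 0.9999, which is consistent with the abstract's point that the binding concern is the quality of the heuristic solutions rather than their number.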
Abstract:
This thesis develops and evaluates statistical methods for different types of genetic analyses, including quantitative trait loci (QTL) analysis, genome-wide association studies (GWAS), and genomic evaluation. The main contribution of the thesis is to provide novel insights into modeling genetic variance, especially via random effects models. In variance-component QTL analysis, a full likelihood model accounting for uncertainty in the identity-by-descent (IBD) matrix was developed. It was found to correctly adjust the bias in genetic variance component estimation and to improve the precision, and thereby the power, of QTL mapping. Double hierarchical generalized linear models, and a non-iterative simplified version, were implemented and applied to fit data on an entire genome. These whole-genome models were shown to perform well in both QTL mapping and genomic prediction. A re-analysis of a publicly available GWAS data set identified significant loci in Arabidopsis that control phenotypic variance rather than the mean, which validated the idea of variance-controlling genes. The work in the thesis is accompanied by R packages available online, including a general statistical tool for fitting random effects models (hglm), an efficient generalized ridge regression for high-dimensional data (bigRR), a double-layer mixed model for genomic data analysis (iQTL), a stochastic IBD matrix calculator (MCIBD), a computational interface for QTL mapping (qtl.outbred), and a GWAS analysis tool for mapping variance-controlling loci (vGWAS).
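As a rough sketch of the whole-genome random-effects idea behind tools such as hglm and bigRR (not their actual code; the equivalence of a normal SNP random effect to ridge regression is standard), in Python with purely illustrative data:

```python
# SNP-BLUP / ridge regression: all marker effects treated as draws from a
# normal random effect, fitted in closed form. Toy data, illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n, m = 200, 1000                                    # individuals, SNP markers
Z = rng.integers(0, 3, size=(n, m)).astype(float)   # genotype codes 0/1/2
u_true = rng.normal(0, 0.05, size=m)                # small polygenic effects
y = Z @ u_true + rng.normal(0, 1, size=n)

lam = 10.0                                          # penalty = sigma_e^2 / sigma_u^2
# Closed-form ridge solution: u_hat = (Z'Z + lam*I)^(-1) Z'y
u_hat = np.linalg.solve(Z.T @ Z + lam * np.eye(m), Z.T @ y)
pred = Z @ u_hat                                    # genomic predictions
```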
Abstract:
Generalized linear mixed models are flexible tools for modeling non-normal data and are useful for accommodating overdispersion in Poisson regression models with random effects. Their main difficulty lies in parameter estimation, because the maximization of the marginal likelihood has no analytic solution. Many methods have been proposed for this purpose, and many of them are implemented in software packages. The purpose of this study is to compare, via simulation studies, the performance of three different statistical principles: marginal likelihood, extended likelihood, and Bayesian analysis. Real data on contact wrestling are used for illustration.
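To make the overdispersion point concrete (a toy illustration only, not the paper's wrestling data or estimation methods), a Poisson model with a normal random effect on the log mean yields a marginal variance well above the marginal mean:

```python
# Overdispersion induced by a random effect: conditionally Poisson counts
# whose log mean carries a normal "frailty" term. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
u = rng.normal(0, 0.7, size=n)   # cluster-level random effect
mu = np.exp(1.0 + u)             # conditional Poisson means
y = rng.poisson(mu)

# For a plain Poisson model, variance == mean; here variance >> mean.
print(y.mean(), y.var())
```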
Abstract:
A number of recent works have introduced statistical methods for detecting genetic loci that affect phenotypic variability, which we refer to as variability-controlling quantitative trait loci (vQTL). These are genetic variants whose allelic state predicts how much phenotype values will vary about their expected means. Such loci are of great potential interest in both human and non-human genetic studies, one reason being that a detected vQTL could represent a previously undetected interaction with other genes or environmental factors. The simultaneous publication of these new methods in different journals has in many cases precluded the opportunity for comparison. We survey some of these methods, the respective trade-offs they imply, and the connections between them. The methods fall into three main groups: classical non-parametric, fully parametric, and semi-parametric two-stage approximations. Choosing between alternatives involves balancing the need for robustness, flexibility, and speed. For each method, we identify important assumptions and limitations, including those of practical importance, such as their scope for including covariates and random effects. We show in simulations that both parametric methods and their semi-parametric approximations can give elevated false-positive rates when they ignore mean-variance relationships intrinsic to the data-generation process. We conclude that the choice of method depends on the trait distribution, the need to include non-genetic covariates, and the population size and structure, coupled with a critical evaluation of how these fit the assumptions of the statistical model.
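As a concrete instance of the classical non-parametric group (a standard choice in the vQTL literature; the data and names here are purely illustrative), a median-centered Levene (Brown-Forsythe) test for unequal phenotype variance across genotype classes at a single locus:

```python
# Non-parametric vQTL-style test: does phenotype variance differ across the
# three genotype groups at one locus? Simulated toy data, illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
genotype = rng.integers(0, 3, size=600)             # 0/1/2 allele counts
sd = np.array([1.0, 1.0, 1.6])[genotype]            # variance depends on genotype
phenotype = rng.normal(0, sd)

groups = [phenotype[genotype == g] for g in range(3)]
stat, p = stats.levene(*groups, center='median')    # Brown-Forsythe variant
print(stat, p)                                      # small p-value expected
```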