5 results for Subset Sum Problem

at Duke University


Relevance:

20.00%

Publisher:

Abstract:

Successfully predicting the frequency dispersion of electronic hyperpolarizabilities is an unresolved challenge in materials science and electronic structure theory. We show that the generalized Thomas-Kuhn sum rules, combined with linear absorption data and measured hyperpolarizability at one or two frequencies, may be used to predict the entire frequency-dependent electronic hyperpolarizability spectrum. This treatment includes two- and three-level contributions that arise from the lowest two or three excited electronic state manifolds, enabling us to describe the unusual observed frequency dispersion of the dynamic hyperpolarizability in high oscillator strength M-PZn chromophores, where (porphinato)zinc(II) (PZn) and metal(II)polypyridyl (M) units are connected via an ethyne unit that aligns the high oscillator strength transition dipoles of these components in a head-to-tail arrangement. We show that some of these structures can possess very similar linear absorption spectra yet manifest dramatically different frequency-dependent hyperpolarizabilities, because of three-level contributions that result from excited-state-to-excited-state transition dipoles among charge-polarized states. Importantly, this approach provides a quantitative scheme to use linear optical absorption spectra and very limited individual hyperpolarizability measurements to predict the entire frequency-dependent nonlinear optical response. Copyright © 2010 American Chemical Society.
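The two-level part of the dispersion discussed above can be illustrated with the textbook Oudar-Chemla two-level factor for the second-harmonic hyperpolarizability β(−2ω; ω, ω). This is a minimal sketch of that standard approximation, not the paper's full sum-rule treatment (which also includes three-level terms); the transition energy used below is an arbitrary placeholder.

```python
def two_level_dispersion(omega, omega0):
    """Oudar-Chemla two-level dispersion factor for beta(-2w; w, w).

    omega  : fundamental photon energy (same units as omega0)
    omega0 : transition energy of the single dominant excited state
    Returns the factor multiplying the static hyperpolarizability beta0;
    it diverges near the one- and two-photon resonances omega0 and omega0/2.
    """
    return omega0**4 / ((omega0**2 - omega**2) * (omega0**2 - 4 * omega**2))

# Static limit: the factor is exactly 1, so beta(0) = beta0.
print(two_level_dispersion(0.0, 3.0))
# Below resonance the response is enhanced relative to the static value.
print(two_level_dispersion(1.0, 3.0))
```

Note that a single such factor cannot distinguish chromophores with similar linear absorption, which is precisely why the abstract invokes three-level contributions.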

Relevance:

20.00%

Publisher:

Abstract:

This paper studies the multiplicity-correction effect of standard Bayesian variable-selection priors in linear regression. Our first goal is to clarify when, and how, multiplicity correction happens automatically in Bayesian analysis, and to distinguish this correction from the Bayesian Ockham's-razor effect. Our second goal is to contrast empirical-Bayes and fully Bayesian approaches to variable selection through examples, theoretical results and simulations. Considerable differences between the two approaches are found. In particular, we prove a theorem that characterizes a surprising asymptotic discrepancy between fully Bayes and empirical Bayes. This discrepancy arises from a different source than the failure to account for hyperparameter uncertainty in the empirical-Bayes estimate. Indeed, even at the extreme, when the empirical-Bayes estimate converges asymptotically to the true variable-inclusion probability, the potential for a serious difference remains. © Institute of Mathematical Statistics, 2010.
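One standard mechanism behind automatic multiplicity correction is placing a uniform Beta(1,1) prior on the common variable-inclusion probability and integrating it out, which makes the prior on any particular model shrink as the number of candidate variables grows. The sketch below assumes that construction (a common fully Bayes choice, which may differ in detail from the priors analyzed in the paper):

```python
from math import comb

def beta_binomial_model_prior(k, p):
    """Prior probability of one specific model that includes k of p candidate
    variables, after integrating out a uniform Beta(1,1) prior on the
    inclusion probability:  P(model) = 1 / ((p + 1) * C(p, k))."""
    return 1.0 / ((p + 1) * comb(p, k))

# The same one-variable model gets less prior mass when it competes
# against more candidate variables -- an automatic multiplicity penalty
# that a fixed inclusion probability of 1/2 would not provide.
print(beta_binomial_model_prior(1, 10))   # 1 of 10 candidates
print(beta_binomial_model_prior(1, 100))  # 1 of 100 candidates: much smaller
```

Summing over all C(p, k) models of each size k recovers total prior mass 1, so this is a proper prior over the model space.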

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Dropouts and missing data are nearly ubiquitous in obesity randomized controlled trials, threatening the validity and generalizability of conclusions. Herein, we meta-analytically evaluate the extent of missing data, the frequency with which various analytic methods are employed to accommodate dropouts, and the performance of multiple statistical methods. METHODOLOGY/PRINCIPAL FINDINGS: We searched the PubMed and Cochrane databases (2000-2006) for articles published in English and manually searched bibliographic references. Articles on pharmaceutical randomized controlled trials with weight loss or weight gain prevention as major endpoints were included. Two authors independently reviewed each publication for inclusion. 121 articles met the inclusion criteria. Two authors independently extracted treatment, sample size, dropout rates, study duration, and the statistical method used to handle missing data from all articles and resolved disagreements by consensus. In the meta-analysis, dropout rates were substantial, with the survival (non-dropout) rates approximated by an exponential decay curve, e^(−λt), where λ was estimated to be 0.0088 (95% bootstrap confidence interval: 0.0076 to 0.0100) and t represents time in weeks. The estimated dropout rate at 1 year was 37%. Most studies used last observation carried forward as the primary analytic method to handle missing data. We also obtained 12 raw obesity randomized controlled trial datasets for empirical analyses. Analyses of the raw randomized controlled trial data suggested that both mixed models and multiple imputation performed well, but that multiple imputation may be more robust when missing data are extensive. CONCLUSION/SIGNIFICANCE: Our analysis offers an equation for predicting dropout rates that is useful for future study planning. Our raw data analyses suggest that multiple imputation is better than other methods for handling missing data in obesity randomized controlled trials, followed closely by mixed models. We suggest these methods supplant last observation carried forward as the primary method of analysis.
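The abstract's planning equation can be applied directly. A small sketch using the reported estimate λ = 0.0088 per week, which reproduces the stated 1-year dropout rate of about 37%:

```python
import math

LAMBDA = 0.0088  # weekly dropout hazard estimated in the meta-analysis

def predicted_retention(weeks, lam=LAMBDA):
    """Expected survival (non-dropout) fraction at time t: S(t) = e^(-lambda*t)."""
    return math.exp(-lam * weeks)

# Predicted dropout after one year (52 weeks) of follow-up.
dropout_1yr = 1.0 - predicted_retention(52)
print(f"{dropout_1yr:.0%}")  # prints "37%", matching the abstract
```

A trialist could invert the same formula to choose an enrollment target that still yields the desired number of completers at a planned follow-up time.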

Relevance:

20.00%

Publisher:

Abstract:

© 2015, Institute of Mathematical Statistics. All rights reserved. In order to use persistence diagrams as a true statistical tool, it would be very useful to have a good notion of mean and variance for a set of diagrams. In [23], Mileyko and his collaborators made the first study of the properties of the Fréchet mean in (Dp, Wp), the space of persistence diagrams equipped with the p-th Wasserstein metric. In particular, they showed that the Fréchet mean of a finite set of diagrams always exists, but is not necessarily unique. The means of a continuously varying set of diagrams do not themselves (necessarily) vary continuously, which presents obvious problems when trying to extend the Fréchet mean definition to the realm of time-varying persistence diagrams, better known as vineyards. We fix this problem by altering the original definition of Fréchet mean so that it now becomes a probability measure on the set of persistence diagrams; in a nutshell, the mean of a set of diagrams will be a weighted sum of atomic measures, where each atom is itself a persistence diagram determined using a perturbation of the input diagrams. This definition gives, for each N, a map (Dp)^N → ℙ(Dp). We show that this map is Hölder continuous on finite diagrams and thus can be used to build a useful statistic on vineyards.
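For intuition about the metric space (Dp, Wp), here is a brute-force sketch of the p-Wasserstein distance between two tiny persistence diagrams, where unmatched points may be sent to the diagonal. The L∞ ground metric and the exhaustive matching are illustrative assumptions; practical implementations use optimal-assignment solvers, and this version is exponential in the diagram size.

```python
from itertools import permutations

def _ground(a, b):
    """L-infinity distance between points; None stands for the diagonal."""
    if a is None and b is None:
        return 0.0          # two diagonal slots cost nothing to pair
    if a is None:
        a, b = b, a
    if b is None:
        return (a[1] - a[0]) / 2.0  # distance from (birth, death) to diagonal
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def wasserstein_p(X, Y, p=2):
    """p-Wasserstein distance between two small diagrams, i.e. lists of
    (birth, death) pairs, by brute force over all augmented matchings."""
    A = list(X) + [None] * len(Y)   # augment each side with diagonal slots
    B = list(Y) + [None] * len(X)
    n = len(A)
    best = min(
        sum(_ground(A[i], B[perm[i]]) ** p for i in range(n))
        for perm in permutations(range(n))
    )
    return best ** (1.0 / p)

print(wasserstein_p([(0, 1)], [(0, 1)]))  # identical diagrams: distance 0.0
print(wasserstein_p([(0, 2)], []))        # lone point pays its diagonal cost
```

The instability of the Fréchet mean arises exactly because the optimal matching realizing this minimum can switch discontinuously as the input diagrams vary.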

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes that atherosclerosis is initiated by a signaling event that deposits calcium hydroxyapatite (Ca-HAP). This event is preceded by a loss of mechanical structure in the arterial wall. After Ca-HAP has been deposited, it is unlikely that it will be reabsorbed because the solubility product constant (Ksp) is very small, and the large stores of Ca²⁺ and PO₄³⁻ in the bones oppose any attempts to dissolve Ca-HAP by decreasing the common ions. The hydroxide ion (OH⁻) of Ca-HAP can be displaced in nature by fluoride (F⁻) and carbonate (CO₃²⁻) ions, and it is proposed that anions associated with cholesterol ester hydrolysis and, in very small quantities, the enolate of 7-ketocholesterol could also displace the OH⁻ of Ca-HAP, forming an ionic bond. The free energy of hydration of Ca-HAP at 310 K is most likely negative, and the ionic radii of the anions associated with the hydrolysis of cholesterol ester are compatible with the substitution. Furthermore, examination of the pathology of atherosclerotic lesions by Raman and NMR spectroscopy and confocal microscopy supports deposition of Ca-HAP associated with cholesterol. Investigating the affinity of intermediates of cholesterol hydrolysis for Ca-HAP compared to lipoproteins such as HDL, LDL, and VLDL using isothermal titration calorimetry could add proof of this concept and may lead to the development of a new class of medications targeted at the deposition of cholesterol within Ca-HAP. Treatment of acute ischemic events as a consequence of atherosclerosis with denitrogenation and oxygenation is discussed. © the author(s), publisher and licensee Libertas Academica Ltd.
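The role of the very small Ksp can be made concrete. For the ideal dissolution Ca₁₀(PO₄)₆(OH)₂ → 10 Ca²⁺ + 6 PO₄³⁻ + 2 OH⁻, mass action gives Ksp = (10s)¹⁰(6s)⁶(2s)², from which the molar solubility s follows. The Ksp value below is an assumed illustrative number (reported values for hydroxyapatite vary widely), and the calculation ignores pH, complexation, and the bone-driven common-ion effect the abstract describes:

```python
def hap_molar_solubility(ksp):
    """Ideal molar solubility s of Ca10(PO4)6(OH)2 from its solubility
    product: Ksp = (10s)**10 * (6s)**6 * (2s)**2, hence
    s = (Ksp / (10**10 * 6**6 * 2**2)) ** (1/18).
    Ignores pH, complexation, and common-ion suppression."""
    return (ksp / (10**10 * 6**6 * 2**2)) ** (1.0 / 18.0)

# Hypothetical Ksp purely for illustration: even it yields a tiny solubility,
# and any common-ion excess of Ca2+ or PO4 3- pushes s lower still.
s = hap_molar_solubility(1e-58)
print(f"s ~ {s:.1e} mol/L")
```

The 1/18 exponent (18 ions per formula unit) is why solubility responds only weakly to Ksp yet remains minuscule, consistent with the paper's argument that deposited Ca-HAP is effectively permanent.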