4 results for data movement problem
at the National Center for Biotechnology Information - NCBI
Abstract:
This paper gives three related results: (i) a new, simple, fast, monotonically converging algorithm for deriving the L1-median of a data cloud in ℝd, a problem that can be traced to Fermat and has fascinated applied mathematicians for over three centuries; (ii) a new general definition for depth functions, as functions of multivariate medians, so that different definitions of medians will, correspondingly, give rise to different depth functions; and (iii) a simple closed-form formula for the L1-depth function of a given data cloud in ℝd.
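The abstract does not reproduce the new algorithm or the closed-form formula themselves. As a hedged illustration of the problem being solved, the sketch below implements the classical Weiszfeld fixed-point iteration for the L1-median (geometric median) in ℝd, together with a depth value of the "one minus the norm of the averaged unit vectors" form that appears in the L1-depth literature; the function names, tolerances, and tie-handling shortcut are illustrative assumptions, not the paper's exact method.

    import numpy as np

    def l1_median(X, tol=1e-8, max_iter=1000):
        """Weiszfeld-style fixed-point iteration for the L1-median of a data cloud X (n x d)."""
        y = X.mean(axis=0)                        # start from the coordinate-wise mean
        for _ in range(max_iter):
            d = np.linalg.norm(X - y, axis=1)     # distances to the current estimate
            d = np.maximum(d, 1e-12)              # guard against division by zero at data points
            w = 1.0 / d
            y_new = (w[:, None] * X).sum(axis=0) / w.sum()
            if np.linalg.norm(y_new - y) < tol:
                return y_new
            y = y_new
        return y

    def l1_depth(x, X):
        """One common L1-type depth: 1 minus the norm of the average unit vector
        from x toward the data points (illustrative; coincidences with data
        points are handled crudely here)."""
        diff = X - x
        norms = np.maximum(np.linalg.norm(diff, axis=1), 1e-12)
        return 1.0 - np.linalg.norm((diff / norms[:, None]).mean(axis=0))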
Abstract:
Census data on endangered species are often sparse, error-ridden, and confined to only a segment of the population. Estimating trends and extinction risks using this type of data presents numerous difficulties. In particular, the estimate of the variation in year-to-year transitions in population size (the “process error” caused by stochasticity in survivorship and fecundities) is confounded by the addition of high sampling error variation. In addition, the year-to-year variability in the segment of the population that is sampled may be quite different from the population variability that one is trying to estimate. The combined effect of severe sampling error and age- or stage-specific counts leads to severe biases in estimates of population-level parameters. I present an estimation method that circumvents the problem of age- or stage-specific counts and is markedly robust to severe sampling error. This method allows the estimation of environmental variation and population trends for extinction-risk analyses using corrupted census counts—a common type of data for endangered species that has hitherto been relatively unusable for these analyses.
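The abstract describes the confounding problem rather than the estimator's equations. The sketch below only illustrates why a naive estimate of process variance from corrupted counts is biased high: it simulates stochastic exponential growth with process noise, corrupts the counts with sampling error, and computes the variance of year-to-year log growth rates. The parameter values and the naive estimator are hypothetical; the paper's robust method is not reproduced.

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate stochastic exponential growth with process noise, then corrupt
    # the census counts with lognormal sampling error (all values hypothetical).
    years, mu, sigma_proc, sigma_obs = 30, 0.02, 0.10, 0.30
    log_n = np.cumsum(rng.normal(mu, sigma_proc, years))        # true log abundance
    counts = np.exp(log_n + rng.normal(0.0, sigma_obs, years))  # observed, noisy counts

    # Naive process-variance estimate from year-to-year log growth rates:
    # it absorbs the sampling-error variance as well and is therefore biased high.
    log_growth = np.diff(np.log(counts))
    print("true process variance :", sigma_proc**2)
    print("naive estimate        :", log_growth.var(ddof=1))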
Abstract:
Experimental time series for a nonequilibrium reaction may in some cases contain sufficient data to determine a unique kinetic model for the reaction by a systematic mathematical analysis. As an example, a kinetic model for the self-assembly of microtubules is derived here from turbidity time series for solutions in which microtubules assemble. The model may be seen as a generalization of Oosawa's classical nucleation-polymerization model. It reproduces the experimental data with a four-stage nucleation process and a critical nucleus of 15 monomers.
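As a rough illustration of the nucleation-polymerization framework that the derived model generalizes (not the paper's four-stage scheme), the sketch below integrates a minimal Oosawa-type system in which filaments nucleate at a rate proportional to the free-monomer concentration raised to the nucleus size and then elongate by monomer addition. The rate constants, units, and the mapping of polymer mass to turbidity are assumptions; only the nucleus size of 15 monomers is taken from the abstract.

    import numpy as np
    from scipy.integrate import solve_ivp

    n_nuc   = 15      # critical nucleus size (from the abstract)
    k_nuc   = 1e-3    # nucleation rate constant (hypothetical)
    k_elong = 1.0     # elongation rate constant (hypothetical)
    m_total = 1.0     # total monomer concentration (arbitrary units)

    def rhs(t, y):
        P, M = y                           # filament number and polymerized mass
        m = max(m_total - M, 0.0)          # free monomer concentration
        dP = k_nuc * m**n_nuc              # new filaments from nucleation
        dM = n_nuc * dP + k_elong * m * P  # mass gained by nucleation + elongation
        return [dP, dM]

    sol = solve_ivp(rhs, (0.0, 200.0), [0.0, 0.0], dense_output=True)
    # In a turbidity experiment, the measured signal is roughly proportional to M(t).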
Abstract:
Correlations in low-frequency atomic displacements predicted by molecular dynamics simulations on the order of 1 ns are undersampled for the time scales currently accessible by the technique. This is shown with three different representations of the fluctuations in a macromolecule: the reciprocal space of crystallography using diffuse x-ray scattering data, real three-dimensional Cartesian space using covariance matrices of the atomic displacements, and the 3N-dimensional configuration space of the protein using dimensionally reduced projections to visualize the extent to which phase space is sampled.
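A minimal sketch of the covariance-matrix and reduced-projection representations mentioned above, assuming an aligned trajectory array of shape (frames, N, 3); the array layout, function names, and PCA-style reduction are illustrative assumptions rather than the authors' exact analysis pipeline.

    import numpy as np

    def displacement_covariance(traj):
        """Covariance matrix of atomic displacements in 3N-dimensional space.

        traj: array of shape (frames, N, 3), already aligned to a reference."""
        flat = traj.reshape(traj.shape[0], -1)      # (frames, 3N)
        disp = flat - flat.mean(axis=0)             # displacements about the mean
        return disp.T @ disp / (disp.shape[0] - 1)  # (3N, 3N) covariance

    def project_onto_modes(traj, n_modes=2):
        """Project the trajectory onto its largest-variance modes, a PCA-style
        dimensional reduction used to visualize how much of configuration
        space a simulation actually samples."""
        flat = traj.reshape(traj.shape[0], -1)
        disp = flat - flat.mean(axis=0)
        cov = disp.T @ disp / (disp.shape[0] - 1)
        vals, vecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
        modes = vecs[:, ::-1][:, :n_modes]          # largest-variance directions
        return disp @ modes                         # (frames, n_modes) projection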