930 results for STATIONARY SECTORIAL SAMPLER
Abstract:
"Supported in part by a National Science Foundation grant for theoretical physics related to I. G. Y."
Abstract:
Bibliography: p. 146.
Abstract:
Mode of access: Internet.
Abstract:
"NTS-13"--P. [4] of cover.
Abstract:
Mode of access: Internet.
Abstract:
"First published during the war as a classified report to Section D2, National Defense Research Committee."
Abstract:
Mode of access: Internet.
Abstract:
A recent development of the Markov chain Monte Carlo (MCMC) technique is the emergence of MCMC samplers that allow transitions between different models. Such samplers make possible a range of computational tasks involving models, including model selection, model evaluation, model averaging and hypothesis testing. An example of this type of sampler is the reversible jump MCMC sampler, which is a generalization of the Metropolis-Hastings algorithm. Here, we present a new MCMC sampler of this type. The new sampler is a generalization of the Gibbs sampler, but somewhat surprisingly, it also turns out to encompass as particular cases all of the well-known MCMC samplers, including those of Metropolis, Barker, and Hastings. Moreover, the new sampler generalizes the reversible jump MCMC. It therefore appears to be a very general framework for MCMC sampling. This paper describes the new sampler and illustrates its use in three applications in Computational Biology, specifically determination of consensus sequences, phylogenetic inference and delineation of isochores via multiple change-point analysis.
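As a concrete point of reference for the family of samplers discussed above, the Metropolis-Hastings algorithm can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's generalized sampler; the target density, step size, and seed are arbitrary choices.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings: the classical special case of
    the broader MCMC sampler family described in the abstract."""
    rng = np.random.default_rng(seed)
    x = x0
    lp = log_target(x)
    samples = []
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal()
        lp_prop = log_target(prop)
        # accept with probability min(1, pi(prop) / pi(x))
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return np.array(samples)

# illustrative target: standard normal, log-density up to a constant
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
```

Reversible jump MCMC extends this scheme with moves that change the dimension of the state, which is what enables transitions between models of different sizes.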
Abstract:
A system of two two-level atoms interacting with a squeezed vacuum field can exhibit stationary entanglement associated with nonclassical two-photon correlations characteristic of the squeezed vacuum field. The amount of entanglement present in the system is quantified by the well-known measure of entanglement called concurrence. We find analytical formulae describing the concurrence for two identical and nonidentical atoms and show that it is possible to obtain a large degree of steady-state entanglement in the system. Necessary conditions for the entanglement are nonclassical two-photon correlations and nonzero collective decay. It is shown that nonidentical atoms are a better source of stationary entanglement than identical atoms. We discuss the optimal physical conditions for creating entanglement in the system; in particular, it is shown that there is an optimal and rather small value of the mean photon number required for creating entanglement.
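The concurrence used above has a standard closed form for any two-qubit density matrix (Wootters' formula). A minimal numerical sketch follows; the two test states are illustrative and are not drawn from the paper's atomic system.

```python
import numpy as np

def concurrence(rho):
    """Wootters concurrence C = max(0, l1 - l2 - l3 - l4), where l_i are
    the decreasing square roots of the eigenvalues of rho @ rho_tilde."""
    sy = np.array([[0.0, -1.0j], [1.0j, 0.0]])
    syy = np.kron(sy, sy)
    rho_tilde = syy @ rho.conj() @ syy   # "spin-flipped" density matrix
    lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(rho @ rho_tilde).real)))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# maximally entangled Bell state: C = 1
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)
rho_bell = np.outer(psi, psi.conj())
# separable product state |00><00|: C = 0
rho_prod = np.diag([1.0, 0.0, 0.0, 0.0])
```

For the steady state of the two-atom master equation discussed in the abstract, one would evaluate this same function on the stationary density matrix.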
Abstract:
Listeria monocytogenes is a food-borne Gram-positive bacterium that is responsible for a variety of infections worldwide each year. The organism is able to survive a variety of environmental conditions and stresses; however, the mechanisms by which L. monocytogenes adapts to environmental change are yet to be fully elucidated. An understanding of the mechanism(s) by which L. monocytogenes survives unfavourable environmental conditions will aid in developing new food processing methods to control the organism in foodstuffs. We have utilized a proteomic approach to investigate the response of L. monocytogenes batch cultures to the transition from exponential to stationary growth phase. Proteomic analysis showed that batch cultures of L. monocytogenes perceived stress and began preparations for stationary phase much earlier (approximately A600 = 0.75, mid-exponential phase) than predicted by growth characteristics alone. Global analysis of the proteome revealed that the expression levels of more than 50% of all observed proteins changed significantly over a 7-9 h period during this transition phase. We highlight ten proteins in particular whose expression levels appear to be important in the early onset of stationary phase. The significance of these findings for functionality and the mechanistic picture is discussed.
Abstract:
All signals that appear to be periodic have some variability from period to period, regardless of how stable they appear in a data plot. A true sinusoidal time series is a deterministic function of time that never changes and thus has zero bandwidth around the sinusoid's frequency. Zero bandwidth is impossible in nature, since all signals have some intrinsic variability over time; deterministic sinusoids are used to model cycles only as a mathematical convenience. Hinich [IEEE J. Oceanic Eng. 25 (2) (2000) 256-261] introduced a parametric statistical model, called the randomly modulated periodicity (RMP), that allows one to capture the intrinsic variability of a cycle. As with a deterministic periodic signal, the RMP can have a number of harmonics. The likelihood ratio test for this model when the amplitudes and phases are known is given in [M.J. Hinich, Signal Processing 83 (2003) 1349-1352]. This paper addresses a method for detecting an RMP whose amplitudes and phases are unknown random processes, observed in additive stationary noise. The only assumption on the additive noise is that it has finite dependence and finite moments. Using simulations based on a simple RMP model, we show a case where the new method detects a signal that is not detectable in a standard waterfall spectrogram display. (c) 2005 Elsevier B.V. All rights reserved.
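A toy illustration of the RMP idea: a fixed-frequency carrier whose amplitude and phase fluctuate randomly, buried in stationary noise, concentrates its power in a narrow band around the carrier frequency rather than a zero-bandwidth line. All parameter values and the modulation scheme below are illustrative, not Hinich's model specification or detection statistic.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, f0, n = 1000.0, 50.0, 4096          # sample rate (Hz), carrier (Hz), length
t = np.arange(n) / fs

# crude random modulation: noisy amplitude and a small random phase drift
amp = 1.0 + 0.2 * rng.standard_normal(n)
phase = 0.3 * np.cumsum(rng.standard_normal(n)) / np.sqrt(n)
rmp = amp * np.cos(2.0 * np.pi * f0 * t + phase)

# additive stationary noise, as assumed in the abstract
x = rmp + 0.5 * rng.standard_normal(n)

# power spectrum: the modulated cycle shows up as a band centred near f0
spec = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
peak = freqs[np.argmax(spec)]
```

In the low-SNR regime targeted by the paper, a single spectrum like this would not reveal the signal, which is why a dedicated detection statistic is needed.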
Abstract:
Experiments on the flow of granular solids in a pilot-scale pyrolysis rotary kiln are presented. These experiments consisted, first, of measuring the volumetric filling ratio for several operating conditions (steady-state experiments) and, second, of recording the exit flow rates after a positive or negative step change in one of the operating parameters (dynamic experiments). A dynamical model computing the evolution of the flow rate of granular solids through the kiln has been developed, based on the Saeman model [Chem. Eng. Prog. 47 (1951) 508]. The simulations are compared with experimental results; the model gives good results not only for the rolling mode but also for the slipping mode. (C) 2004 Elsevier B.V. All rights reserved.
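The Saeman model referenced above describes the bed-depth profile along the kiln with an ordinary differential equation. A minimal sketch follows, assuming the commonly quoted form of Saeman's equation and purely illustrative operating parameters (not those of the pilot kiln in the abstract).

```python
import numpy as np

def bed_depth_profile(Q, n, R, theta, alpha, L, h0, dz=1e-3):
    """Euler integration of Saeman's bed-depth equation, integrated
    upstream from the discharge end (z = 0):
        dh/dz = 3*Q*tan(theta) / (4*pi*n*(h*(2*R - h))**1.5)
                - tan(alpha) / cos(theta)
    Q: volumetric feed rate (m^3/s), n: rotational speed (rev/s),
    R: kiln radius (m), theta: dynamic angle of repose (rad),
    alpha: kiln inclination (rad), h0: bed depth at the discharge (m)."""
    z = np.arange(0.0, L, dz)
    h = np.empty_like(z)
    h[0] = h0
    for i in range(1, len(z)):
        grad = (3.0 * Q * np.tan(theta)) \
               / (4.0 * np.pi * n * (h[i - 1] * (2.0 * R - h[i - 1])) ** 1.5) \
               - np.tan(alpha) / np.cos(theta)
        h[i] = h[i - 1] + grad * dz
    return z, h

# hypothetical pilot-scale values for illustration only
z, h = bed_depth_profile(Q=1e-4, n=0.5, R=0.1, theta=np.radians(35.0),
                         alpha=np.radians(2.0), L=2.0, h0=0.005)
```

Integrating the depth profile gives the local filling ratio, and stepping an operating parameter and re-integrating over time is the essence of the dynamic simulations described in the abstract.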
Abstract:
Eukaryotic genomes display segmental patterns of variation in various properties, including GC content and degree of evolutionary conservation. DNA segmentation algorithms are aimed at identifying statistically significant boundaries between such segments. Such algorithms may provide a means of discovering new classes of functional elements in eukaryotic genomes. This paper presents a model and an algorithm for Bayesian DNA segmentation and considers the feasibility of using it to segment whole eukaryotic genomes. The algorithm is tested on a range of simulated and real DNA sequences, and the following conclusions are drawn. Firstly, the algorithm correctly identifies non-segmented sequence, and can thus be used to reject the null hypothesis of uniformity in the property of interest. Secondly, estimates of the number and locations of change-points produced by the algorithm are robust to variations in algorithm parameters and initial starting conditions and correspond to real features in the data. Thirdly, the algorithm is successfully used to segment human chromosome 1 according to GC content, thus demonstrating the feasibility of Bayesian segmentation of eukaryotic genomes. The software described in this paper is available from the author's website (www.uq.edu.au/~uqjkeith/) or upon request to the author.
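To illustrate the kind of likelihood that underlies such segmentation, here is a deliberately simplified sketch that finds a single change point in a binary (AT/GC) sequence by maximizing a two-segment Bernoulli log-likelihood. The paper's Bayesian algorithm handles an unknown number of change points and samples their posterior; this sketch does neither, and its names and data are invented for illustration.

```python
import numpy as np

def best_changepoint(x):
    """Return the split index k maximizing the summed maximized
    Bernoulli log-likelihood of the two segments x[:k] and x[k:]."""
    n = len(x)
    def loglik(seg):
        m, p = len(seg), seg.mean()
        if p in (0.0, 1.0):
            return 0.0                      # log-likelihood of a pure segment
        return m * (p * np.log(p) + (1.0 - p) * np.log(1.0 - p))
    scores = [loglik(x[:k]) + loglik(x[k:]) for k in range(1, n)]
    return 1 + int(np.argmax(scores))

# simulated sequence: GC-poor segment followed by a GC-rich segment,
# with the true change point at position 300
rng = np.random.default_rng(0)
seq = np.concatenate([rng.random(300) < 0.35,
                      rng.random(300) < 0.65]).astype(float)
k = best_changepoint(seq)
```

Scaling this idea to many change points with unknown number and positions is exactly where the paper's Bayesian sampling machinery comes in.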
Abstract:
Introduction: Walking programmes are recommended as part of the initial treatment for intermittent claudication (IC). However, for many patients, factors such as frailty, the severe leg discomfort associated with walking and safety concerns about exercising in public areas reduce compliance with such prescription. Thus, there is a need to identify a mode of exercise that provides the same benefits as regular walking while also offering convenience and comfort for these patients. The present study aims to provide evidence for the first time of the efficacy of a supervised cycle training programme compared with a conventional walking programme for the treatment of IC. Methods: Thus far 33 patients have been randomized to: a treadmill-training group (n = 12); a cycle-training group (n = 11); or a control group (n = 10). Training groups participated in three sessions of supervised training per week for a period of 6 weeks. Control patients received no experimental intervention. Maximal incremental treadmill testing was performed at baseline and after the 6 weeks of training. Measures included pain-free (PFWT) and maximal walking time (MWT), continuous heart rate and gas-analysis recording, and ankle-brachial index assessment. Results: In the treadmill-trained group MWT increased significantly from 1016.7 ± 523.7 to 1255.2 ± 432.2 s (P < 0.05). MWT tended to increase with cycle training (848.72 ± 333.18 to 939.54 ± 350.35 s, P = 0.14), and remained unchanged in the control group (1555.1 ± 683.23 to 1534.7 ± 689.87 s). For PFWT, there was a non-significant increase in the treadmill-training group from 414.4 ± 262.3 to 592.9 ± 381.9 s, while both the cycle-training and control groups displayed no significant change in this time (226.7 ± 147.1 to 192.3 ± 56.8 s and 499.4 ± 503.7 to 466.0 ± 526.1 s, respectively). Conclusions: These preliminary results suggest that, unlike treadmill walking, cycling has no clear effect on walking performance in patients with IC. Thus the current recommendations promoting walking-based programmes appear appropriate. The present study was funded by the National Heart Foundation of Australia.