53 results for STATIONARY SECTORIAL SAMPLER
at University of Queensland eSpace - Australia
Abstract:
The generalized Gibbs sampler (GGS) is a recently developed Markov chain Monte Carlo (MCMC) technique that enables Gibbs-like sampling of state spaces that lack a convenient representation in terms of a fixed coordinate system. This paper describes a new sampler, called the tree sampler, which uses the GGS to sample from a state space consisting of phylogenetic trees. The tree sampler is useful for a wide range of phylogenetic applications, including Bayesian, maximum likelihood, and maximum parsimony methods. A fast new algorithm to search for a maximum parsimony phylogeny is presented, using the tree sampler in the context of simulated annealing. The mathematics underlying the algorithm is explained and its time complexity is analyzed. The method is tested on two large data sets consisting of 123 sequences and 500 sequences, respectively. The new algorithm is shown to compare very favorably in terms of speed and accuracy to the program DNAPARS from the PHYLIP package.
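The annealing component of such a search can be sketched generically. The code below is an illustrative stand-in, not the paper's tree sampler: in a real implementation `propose` would perform a random tree rearrangement and `score` would compute the parsimony length, whereas here a toy permutation problem takes their place.

```python
import math
import random

def simulated_annealing(initial, propose, score, t0=1.0, cooling=0.995,
                        steps=5000, seed=0):
    """Generic simulated-annealing minimiser: `propose` plays the role of
    a random tree rearrangement and `score` of the parsimony length."""
    rng = random.Random(seed)
    state, s = initial, score(initial)
    best, best_s = state, s
    t = t0
    for _ in range(steps):
        cand = propose(state, rng)
        cs = score(cand)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-(increase) / temperature).
        if cs <= s or rng.random() < math.exp((s - cs) / t):
            state, s = cand, cs
            if s < best_s:
                best, best_s = state, s
        t *= cooling  # geometric cooling schedule
    return best, best_s

# Toy stand-in problem (not a real tree space): sort a permutation by
# random swaps, scoring a state by its number of inversions.
def swap(p, rng):
    q = list(p)
    i, j = rng.randrange(len(q)), rng.randrange(len(q))
    q[i], q[j] = q[j], q[i]
    return q

def inversions(p):
    return sum(1 for i in range(len(p))
                 for j in range(i + 1, len(p)) if p[i] > p[j])

best, best_s = simulated_annealing(list(range(8))[::-1], swap, inversions)
```

The geometric cooling schedule and Boltzmann acceptance rule are the standard choices; the paper's algorithm differs in its proposal mechanism (the tree sampler), not in this outer loop.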
Abstract:
The stationary lineshape of a two-level atom driven by low-intensity narrow-bandwidth squeezed light is shown to exhibit significant differences in behaviour compared to the lineshape for broadband squeezed light. We find that for narrow-bandwidth squeezed light the lineshape is composed of two Lorentzians whose amplitudes depend on the squeezing correlations. Moreover, one of the Lorentzians has a negative weight which leads to narrowing of the line. These features are absent in the broadband case, where the stationary lineshape is the same as for a thermal field. (C) 1998 Elsevier Science B.V.
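Schematically (our notation, not taken from the paper), a lineshape built from two Lorentzians with squeezing-dependent weights can be written

```latex
F(\omega) \;\propto\; \frac{W_{+}\,\gamma_{+}}{\gamma_{+}^{2} + (\omega - \omega_{0})^{2}}
\;+\; \frac{W_{-}\,\gamma_{-}}{\gamma_{-}^{2} + (\omega - \omega_{0})^{2}},
```

where the weights W± depend on the squeezing correlations; one weight being negative produces the line narrowing described above.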
Abstract:
This note considers continuous-time Markov chains whose state space consists of an irreducible class, C, and an absorbing state which is accessible from C. The purpose is to provide results on mu-invariant and mu-subinvariant measures where absorption occurs with probability less than one. In particular, the well-known premise that the mu-invariant measure, m, for the transition rates be finite is replaced by the more natural premise that m be finite with respect to the absorption probabilities. The relationship between mu-invariant measures and quasi-stationary distributions is discussed. (C) 2000 Elsevier Science Ltd. All rights reserved.
Abstract:
We shall be concerned with the problem of determining quasi-stationary distributions for Markovian models directly from their transition rates Q. We shall present simple conditions for a mu-invariant measure m for Q to be mu-invariant for the transition function, so that if m is finite, it can be normalized to produce a quasi-stationary distribution. (C) 2000 Elsevier Science Ltd. All rights reserved.
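For concreteness, the notion of mu-invariance used in this and the preceding abstract can be stated as follows (a standard formulation; the notation is ours). For a q-matrix Q restricted to the irreducible class C, a measure m = (m_j, j in C) is mu-invariant for Q if

```latex
\sum_{i \in C} m_i\, q_{ij} = -\mu\, m_j, \qquad j \in C,
```

while mu-invariance for the transition function means m P(t) = e^{-mu t} m on C for all t >= 0. When the measure is summable, normalizing m_j by its total mass yields a quasi-stationary distribution, which is the construction both abstracts refer to.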
Abstract:
We present a descriptive analysis of a mechanism to coordinate and implement human immunodeficiency virus (HIV) prevention and care in the occupational setting. The mechanism we describe is a multidisciplinary committee composed of stakeholders in the occupational health environment, including unions, management, medical researchers, and medical personnel. The site chosen for the analysis was a South African sugar mill in rural KwaZulu-Natal. The factory is situated in an area of high HIV seroprevalence and has a workforce of 400 employees. The committee was initiated to coordinate a combined prevention-care initiative. The issues that were important in the formation of the committee included confidentiality, trust, and the traditional roles of the stakeholder relationships. When these points were addressed through the focus on a common goal, the committee was able to function in its role as a coordinating body. Central to this success were the inclusion of all stakeholders in the process, including those with traditionally opposing interests, and the legitimacy conferred by the stakeholders. This committee was functionally effective and demonstrated the benefit of a freestanding committee dedicated to addressing HIV/acquired immune deficiency syndrome (AIDS) issues. We describe the implementation and feasibility of a multisectoral committee in directing HIV/AIDS initiatives in the occupational setting in rural South Africa.
Abstract:
For Markov processes on the positive integers with the origin as an absorbing state, Ferrari, Kesten, Martinez and Picco studied the existence of quasi-stationary and limiting conditional distributions by characterizing quasi-stationary distributions as fixed points of a transformation Phi on the space of probability distributions on {1, 2, ...}. In the case of a birth-death process, the components of Phi(nu) can be written down explicitly for any given distribution nu. Using this explicit representation, we will show that Phi preserves likelihood ratio ordering between distributions. A conjecture of Kryscio and Lefevre concerning the quasi-stationary distribution of the SIS logistic epidemic follows as a corollary.
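Likelihood ratio ordering, the property Phi is shown to preserve, can be checked numerically for distributions with finite support: pi dominates nu in this order exactly when the ratio pi_k/nu_k is non-decreasing in k. A minimal sketch (the function name and tolerance are ours, assuming a common, strictly positive support):

```python
def lr_ordered(nu, pi):
    """True if pi dominates nu in the likelihood-ratio order, i.e. the
    ratio pi[k] / nu[k] is non-decreasing over the (assumed common,
    strictly positive) support."""
    ratios = [p / n for p, n in zip(pi, nu)]
    return all(a <= b + 1e-12 for a, b in zip(ratios, ratios[1:]))
```

For example, `lr_ordered([0.5, 0.3, 0.2], [0.2, 0.3, 0.5])` holds because the ratios 0.4, 1.0, 2.5 increase, while swapping the two arguments fails.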
Abstract:
The purpose of the present study was to examine the reproducibility of laboratory-based 40-km cycle time-trial performance on a stationary wind-trainer. Each week, for three consecutive weeks, and on different days, forty-three highly trained male cyclists (mean ± SD: age = 25 ± 6 y; mass = 75 ± 7 kg; peak oxygen uptake [VO2peak] = 64.8 ± 5.2 ml·kg⁻¹·min⁻¹) performed: 1) a VO2peak test, and 2) a 40-km time-trial on their own racing bicycle mounted on a stationary wind-trainer (Cateye Cyclosimulator). Data from all tests were compared using a one-way analysis of variance. Performance on the second and third 40-km time-trials was highly related (r = 0.96; p < 0.001), not significantly different (57:21 ± 2:57 vs. 57:12 ± 3:14 min:s), and displayed a low coefficient of variation (CV = 0.9 ± 0.7%). Although the first 40-km time-trial (58:43 ± 3:17 min:s) was not significantly different from the second and third tests (p = 0.06), inclusion of the first test in the assessment of reliability increased the within-subject CV to 3.0 ± 2.9%. 40-km time-trial speed (km·h⁻¹) was significantly (p < 0.001) related to peak power output (W; r = 0.75), VO2peak (l·min⁻¹; r = 0.53), and the second ventilatory turnpoint (l·min⁻¹; r = 0.68) measured during the progressive exercise tests. These data demonstrate that the assessment of 40-km cycle time-trial performance in well-trained endurance cyclists on a stationary wind-trainer is reproducible, provided the athletes perform a familiarization trial.
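The within-subject coefficient of variation used in the reliability analysis is simply the trial-to-trial standard deviation expressed as a percentage of the mean. The sketch below treats the group-mean times quoted in the abstract as if they were one hypothetical rider's three trials, purely to illustrate how dropping the familiarization trial reduces the CV:

```python
import statistics

def within_subject_cv(trials):
    """Coefficient of variation (%) across one subject's repeated trials."""
    return 100 * statistics.stdev(trials) / statistics.mean(trials)

# Hypothetical rider: the three 40-km times, in seconds, taken from the
# group means quoted above (58:43, 57:21, 57:12).
times = [58 * 60 + 43, 57 * 60 + 21, 57 * 60 + 12]
cv_all = within_subject_cv(times)      # all three trials
cv_23 = within_subject_cv(times[1:])   # familiarization trial dropped
```

With these numbers the CV over all three trials is roughly an order of magnitude larger than the CV over trials two and three alone, mirroring the 3.0% vs. 0.9% contrast reported in the abstract.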
Abstract:
A recent development of the Markov chain Monte Carlo (MCMC) technique is the emergence of MCMC samplers that allow transitions between different models. Such samplers make possible a range of computational tasks involving models, including model selection, model evaluation, model averaging and hypothesis testing. An example of this type of sampler is the reversible jump MCMC sampler, which is a generalization of the Metropolis-Hastings algorithm. Here, we present a new MCMC sampler of this type. The new sampler is a generalization of the Gibbs sampler, but somewhat surprisingly, it also turns out to encompass as particular cases all of the well-known MCMC samplers, including those of Metropolis, Barker, and Hastings. Moreover, the new sampler generalizes the reversible jump MCMC. It therefore appears to be a very general framework for MCMC sampling. This paper describes the new sampler and illustrates its use in three applications in Computational Biology, specifically determination of consensus sequences, phylogenetic inference and delineation of isochores via multiple change-point analysis.
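As a point of reference, a minimal random-walk Metropolis-Hastings sampler, one of the classical samplers the abstract says the new framework encompasses, looks like this (our sketch, targeting a standard normal for illustration):

```python
import math
import random

def metropolis_hastings(log_target, step, x0, n, seed=0):
    """Random-walk Metropolis-Hastings on the real line.  `log_target`
    is the log of the (unnormalized) target density, `step` the
    proposal standard deviation."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    out = []
    for _ in range(n):
        y = x + step * rng.gauss(0.0, 1.0)
        lq = log_target(y)
        # Symmetric proposal, so the Hastings ratio reduces to the
        # ratio of target densities.
        if math.log(rng.random()) < lq - lp:
            x, lp = y, lq
        out.append(x)
    return out

# Sample a standard normal, whose log-density is -x^2/2 up to a constant.
draws = metropolis_hastings(lambda z: -0.5 * z * z, 1.0, 0.0, 20000)
```

The generalized Gibbs sampler described above recovers this scheme (and those of Metropolis and Barker) as special cases of its acceptance rule; the sketch is included only to fix what is being generalized.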
Abstract:
A system of two two-level atoms interacting with a squeezed vacuum field can exhibit stationary entanglement associated with nonclassical two-photon correlations characteristic of the squeezed vacuum field. The amount of entanglement present in the system is quantified by the well-known measure of entanglement called concurrence. We find analytical formulae describing the concurrence for two identical and nonidentical atoms and show that it is possible to obtain a large degree of steady-state entanglement in the system. Necessary conditions for the entanglement are nonclassical two-photon correlations and nonzero collective decay. It is shown that nonidentical atoms are a better source of stationary entanglement than identical atoms. We discuss the optimal physical conditions for creating entanglement in the system; in particular, it is shown that there is an optimal and rather small value of the mean photon number required for creating entanglement.
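The concurrence referred to above has a standard closed form (Wootters' formula, quoted here for reference):

```latex
C(\rho) = \max\{0,\ \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4\},
```

where the lambda_i are the square roots, in decreasing order, of the eigenvalues of rho (sigma_y ⊗ sigma_y) rho* (sigma_y ⊗ sigma_y), with rho* the complex conjugate of the two-qubit density matrix in the standard basis; C = 0 for separable states and C = 1 for maximally entangled ones.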
Abstract:
Listeria monocytogenes is a food-borne Gram-positive bacterium that is responsible for a variety of infections worldwide each year. The organism is able to survive a variety of environmental conditions and stresses; however, the mechanisms by which L. monocytogenes adapts to environmental change are yet to be fully elucidated. An understanding of the mechanism(s) by which L. monocytogenes survives unfavourable environmental conditions will aid in developing new food processing methods to control the organism in foodstuffs. We have utilized a proteomic approach to investigate the response of L. monocytogenes batch cultures to the transition from exponential to stationary growth phase. Proteomic analysis showed that batch cultures of L. monocytogenes perceived stress and began preparations for stationary phase much earlier (approximately A600 = 0.75, mid-exponential) than predicted by growth characteristics alone. Global analysis of the proteome revealed that the expression levels of more than 50% of all proteins observed changed significantly over a 7-9 h period during this transition phase. We have highlighted ten proteins in particular whose expression levels appear to be important in the early onset of the stationary phase. The significance of these findings in terms of functionality and the mechanistic picture is discussed.
Abstract:
All signals that appear to be periodic have some sort of variability from period to period regardless of how stable they appear to be in a data plot. A true sinusoidal time series is a deterministic function of time that never changes and thus has zero bandwidth around the sinusoid's frequency. A zero bandwidth is impossible in nature since all signals have some intrinsic variability over time. Deterministic sinusoids are used to model cycles as a mathematical convenience. Hinich [IEEE J. Oceanic Eng. 25 (2) (2000) 256-261] introduced a parametric statistical model, called the randomly modulated periodicity (RMP), that allows one to capture the intrinsic variability of a cycle. As with a deterministic periodic signal, the RMP can have a number of harmonics. The likelihood ratio test for this model when the amplitudes and phases are known is given in [M.J. Hinich, Signal Processing 83 (2003) 1349-1352]. This paper addresses a method for detecting an RMP whose amplitudes and phases are unknown random processes, observed in additive stationary noise. The only assumption on the additive noise is that it has finite dependence and finite moments. Using simulations based on a simple RMP model, we show a case where the new method can detect the signal when the signal is not detectable in a standard waterfall spectrogram display. (c) 2005 Elsevier B.V. All rights reserved.
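Up to notation (this schematic form is ours, following Hinich's construction), a randomly modulated periodicity with fundamental frequency f_0 and K harmonics can be written

```latex
s(t) = \sum_{k=1}^{K} \Big[ \big(a_k + u_k(t)\big)\cos(2\pi k f_0 t)
     + \big(b_k + v_k(t)\big)\sin(2\pi k f_0 t) \Big],
```

where the a_k, b_k are fixed coefficients and the u_k(t), v_k(t) are zero-mean random modulation processes; setting the modulations to zero recovers a deterministic periodic signal with zero bandwidth, which is the limiting case the opening sentences argue never occurs in nature.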
Abstract:
Experiments on the flow of granular solids in a pyrolysis pilot-scale rotary kiln are presented. These experiments consisted, first, of measuring the volumetric filling ratio for several operating conditions (steady-state experiments) and, second, of recording the exit flow rates after a positive or negative step change in one of the operating parameters (dynamic experiments). A dynamical model computing the evolution of the flow rate of granular solids through the kiln has been developed, based on the Saeman model [Chem. Eng. Prog. 47 (1951) 508]. The simulations are compared with experimental results; the model gives good results not only for the rolling mode but also for the slipping mode. (C) 2004 Elsevier B.V. All rights reserved.
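The Saeman model referred to above is usually presented as an ordinary differential equation for the bed depth h(z) along the kiln axis. One common form, quoted here from the general rotary-kiln literature rather than from this paper (sign conventions vary with the orientation of z), is

```latex
\frac{dh}{dz} \;=\; \frac{3\,\tan\theta}{4\pi\, n}\,
\frac{Q}{\left(2Rh - h^{2}\right)^{3/2}} \;-\; \frac{\tan\alpha}{\cos\theta},
```

where theta is the dynamic angle of repose of the solid, alpha the kiln inclination, n the rotational speed, R the internal radius and Q the volumetric flow rate; integrating this profile gives the hold-up and hence the filling ratio measured in the steady-state experiments.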
Abstract:
Eukaryotic genomes display segmental patterns of variation in various properties, including GC content and degree of evolutionary conservation. DNA segmentation algorithms are aimed at identifying statistically significant boundaries between such segments. Such algorithms may provide a means of discovering new classes of functional elements in eukaryotic genomes. This paper presents a model and an algorithm for Bayesian DNA segmentation and considers the feasibility of using it to segment whole eukaryotic genomes. The algorithm is tested on a range of simulated and real DNA sequences, and the following conclusions are drawn. Firstly, the algorithm correctly identifies non-segmented sequence, and can thus be used to reject the null hypothesis of uniformity in the property of interest. Secondly, estimates of the number and locations of change-points produced by the algorithm are robust to variations in algorithm parameters and initial starting conditions and correspond to real features in the data. Thirdly, the algorithm is successfully used to segment human chromosome 1 according to GC content, thus demonstrating the feasibility of Bayesian segmentation of eukaryotic genomes. The software described in this paper is available from the author's website (www.uq.edu.au/~uqjkeith/) or upon request to the author.
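A single maximum-likelihood change-point for a 0/1 sequence (e.g. GC vs. AT at each position) gives the flavour of one segmentation step. This is a deliberately minimal stand-in of our own, not the paper's Bayesian multiple-change-point sampler:

```python
import math

def log_lik(xs):
    """Bernoulli log-likelihood of a 0/1 segment at its MLE rate."""
    n, k = len(xs), sum(xs)
    if k == 0 or k == n:
        return 0.0  # a pure segment has likelihood 1 at its MLE
    p = k / n
    return k * math.log(p) + (n - k) * math.log(1 - p)

def best_changepoint(seq):
    """Return the split index that maximises the two-segment
    log-likelihood, or None if no split beats the no-change model."""
    best_c, best = None, log_lik(seq)  # null model: a single segment
    for c in range(1, len(seq)):
        ll = log_lik(seq[:c]) + log_lik(seq[c:])
        if ll > best:
            best_c, best = c, ll
    return best_c

# A GC-rich half followed by an AT-rich half: the split is at index 20.
cp = best_changepoint([1] * 20 + [0] * 20)
```

The full algorithm differs in two essential ways: it places priors on the number and positions of change-points and samples from the resulting posterior rather than maximising, which is what makes the robustness claims in the abstract statistical rather than ad hoc.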