967 results for SPANNING PROBABILITY
Abstract:
The effects of convective and absolute instabilities on the formation of drops from cylindrical liquid jets of glycerol/water issuing into still air were investigated. Medium-duration reduced gravity tests were conducted aboard NASA's KC-135 and compared to similar tests performed under normal gravity conditions to aid in understanding the drop formation process. In reduced gravity, the Rayleigh-Chandrasekhar equation was found to accurately predict the transition between regions of absolute and convective instability as defined by a critical Weber number. Observations of the physics of the jet, its breakup, and subsequent drop dynamics under both gravity conditions, and the effects of the two instabilities on these processes, are presented. All the normal gravity liquid jets investigated, in regions of either convective or absolute instability, were subject to significant stretching effects, which affected the geometry and dynamics of the subsequent drops. These effects were not displayed in reduced gravity; there, the liquid jets formed drops that took longer to form (a reduction in drop frequency), were larger in size, and were more spherical (a surface tension effect). Most observed changes, in regions of either absolute or convective instability, were due to a reduction in the buoyancy force and an increased importance of the surface tension force acting on the liquid contained in the jet or formed drop. Reduced gravity environments allow better investigations to be performed into the physics of liquid jets, subsequently formed drops, and the effects of instabilities on these systems. In reduced gravity, drops form up to three times more slowly, and as a consequence are up to three times larger in volume, in the theoretical absolute instability region than in the theoretical convective instability region. This difference was not seen in the corresponding normal gravity tests due to the masking effects of gravity.
A drop is shown to be able to form and detach in a region of absolute instability, and spanning the critical Weber number (from a region of convective to absolute instability) resulted in a marked change in dynamics and geometry of the liquid jet and detaching drops. (C) 2002 American Institute of Physics.
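The critical Weber number mentioned above separates the two instability regimes. As a minimal sketch of the quantity involved, the snippet below computes a jet Weber number from illustrative glycerol/water properties; the critical value used here is a made-up threshold for demonstration, not the value determined in the study.

```python
# Sketch: the Weber number governing the convective/absolute instability
# transition of a liquid jet. Fluid properties are illustrative values for
# a glycerol/water mixture; the critical Weber number is hypothetical.

def weber_number(density, velocity, diameter, surface_tension):
    """We = rho * v^2 * d / sigma: ratio of inertial to surface tension forces."""
    return density * velocity**2 * diameter / surface_tension

rho = 1150.0    # kg/m^3, approximate density of a glycerol/water mixture
sigma = 0.065   # N/m, approximate surface tension
d = 1.0e-3      # m, jet diameter
v = 0.5         # m/s, jet exit velocity

we = weber_number(rho, v, d, sigma)
we_critical = 3.2  # hypothetical threshold, for illustration only

regime = "convective" if we > we_critical else "absolute"
print(f"We = {we:.2f} -> {regime} instability regime")
```

Spanning the critical value, as the abstract describes, corresponds to varying `v` (or `d`) until `we` crosses `we_critical`.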
Abstract:
The retinoid orphan-related receptor-alpha (RORalpha) is a member of the ROR subfamily of orphan receptors and acts as a constitutive activator of transcription in the absence of exogenous ligands. To understand the basis of this activity, we constructed a homology model of RORalpha using the closely related TRbeta as a template. Molecular modeling suggested that bulky hydrophobic side chains occupy the RORalpha ligand cavity, leaving a small but distinct cavity that may be involved in receptor stabilization. This model was subject to docking simulation with a receptor-interacting peptide from the steroid receptor coactivator GR-interacting protein-1, which delineated a coactivator binding surface consisting of the signature motif spanning helices 3-5 and helix 12 [activation function 2 (AF2)]. Probing this surface with scanning alanine mutagenesis showed structural and functional equivalence between homologous residues of RORalpha and TRbeta. This was surprising (given that RORalpha is a ligand-independent activator, whereas TRbeta has an absolute requirement for ligand) and prompted us to use molecular modeling to identify differences between RORalpha and TRbeta in the way that the AF2 helix interacts with the rest of the receptor. Modeling highlighted a nonconserved amino acid in helix 11 of RORalpha (Phe491) and a short length of 3(10) helix at the N terminus of AF2 which we suggest 1) ensures that AF2 is locked permanently in the holoconformation described for other liganded receptors and thus 2) enables ligand-independent recruitment of coactivators. Consistent with this, mutation of RORalpha Phe491 to either methionine or alanine (methionine is the homologous residue in TRbeta) reduced and ablated transcriptional activation and recruitment of coactivators, respectively.
Furthermore, we were able to reconstitute transcriptional activity for both a deletion mutant of RORalpha lacking AF2 and the Phe491Met mutant by overexpression of a GAL-AF2 fusion protein, demonstrating ligand-independent recruitment of AF2 and a role for Phe491 in recruiting AF2.
Abstract:
This article develops a weighted least squares version of Levene's test of homogeneity of variance for a general design, available for both univariate and multivariate situations. When the design is balanced, the univariate and two common multivariate test statistics turn out to be proportional to the corresponding ordinary least squares test statistics obtained from an analysis of variance of the absolute values of the standardized mean-based residuals from the original analysis of the data. The constant of proportionality is simply a design-dependent multiplier (which does not necessarily tend to unity). Explicit results are presented for randomized block and Latin square designs and are illustrated for factorial treatment designs and split-plot experiments. The distribution of the univariate test statistic is close to a standard F-distribution, although it can be slightly underdispersed. For a complex design, the test assesses homogeneity of variance across blocks, treatments, or treatment factors and offers an objective interpretation of the residual plot.
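The ordinary least squares form that the weighted version builds on can be sketched compactly: a one-way ANOVA of the absolute residuals from group means. The snippet below is a minimal sketch of that classical Levene statistic only; the weighting and the design-dependent multiplier described in the abstract are not reproduced here.

```python
# Minimal sketch of the OLS form of Levene's test: a one-way ANOVA F
# statistic computed on |x_ij - mean_i|, the absolute residuals from
# group means. Large F suggests heterogeneous group variances.
import numpy as np

def levene_ols(groups):
    abs_res = [np.abs(g - g.mean()) for g in groups]   # absolute residuals
    k = len(abs_res)
    n = sum(len(z) for z in abs_res)
    grand = np.concatenate(abs_res).mean()
    ss_between = sum(len(z) * (z.mean() - grand) ** 2 for z in abs_res)
    ss_within = sum(((z - z.mean()) ** 2).sum() for z in abs_res)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

rng = np.random.default_rng(0)
g1 = rng.normal(0.0, 1.0, 30)   # unit-variance group
g2 = rng.normal(0.0, 3.0, 30)   # inflated-variance group
f_stat = levene_ols([g1, g2])
print(f"Levene F = {f_stat:.2f}")
```

The resulting statistic is referred to an F-distribution with (k - 1, n - k) degrees of freedom, which the abstract notes is close to, but can slightly underdisperse relative to, the standard F.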
Abstract:
The purpose of this study was threefold: first, the study was designed to illustrate the use of data and information collected in food safety surveys in a quantitative risk assessment. In this case, the focus was on the food service industry; however, similar data from other parts of the food chain could be similarly incorporated. The second objective was to quantitatively describe and better understand the role that the food service industry plays in the safety of food. The third objective was to illustrate the additional decision-making information that is available when uncertainty and variability are incorporated into the modelling of systems. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
Merozoite surface protein 1 (MSP1) of malaria parasites undergoes proteolytic processing at least twice before invasion into a new RBC. The 42-kDa fragment, a product of primary processing, is cleaved by proteolytic enzymes giving rise to MSP1(33), which is shed from the merozoite surface, and MSP1(19), which is the only fragment carried into a new RBC. In this study, we have identified T cell epitopes on MSP1(33) of Plasmodium yoelii and have examined their function in immunity to blood stage malaria. Peptides 20 aa in length, spanning the length of MSP1(33) and overlapping each other by 10 aa, were analyzed for their ability to induce T cell proliferation in immunized BALB/c and C57BL/6 mice. Multiple epitopes were recognized by these two strains of mice. Effector functions of the dominant epitopes were then investigated. Peptides Cm15 and Cm21 were of particular interest as they were able to induce effector T cells capable of delaying growth of lethal P. yoelii YM following adoptive transfer into immuno-deficient mice without inducing detectable Ab responses. Homologs of these epitopes could be candidates for inclusion in a subunit vaccine.
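The peptide tiling scheme described above (20-aa windows overlapping by 10 aa) can be sketched in a few lines. The sequence below is a made-up stand-in, not the real MSP1(33) sequence.

```python
# Sketch of the overlapping-peptide tiling described in the abstract:
# 20-residue windows stepped by 10 residues (a 10-aa overlap).

def tile_peptides(sequence, length=20, overlap=10):
    """Return overlapping peptide windows spanning the sequence."""
    step = length - overlap
    return [sequence[start:start + length]
            for start in range(0, len(sequence) - length + 1, step)]

seq = "ACDEFGHIKLMNPQRSTVWY" * 3  # 60-residue dummy sequence
peps = tile_peptides(seq)
print(len(peps), "peptides")       # windows start at positions 0, 10, 20, 30, 40
assert all(len(p) == 20 for p in peps)
```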
Abstract:
Motivation: A major issue in cell biology today is how distinct intracellular regions of the cell, like the Golgi apparatus, maintain their unique composition of proteins and lipids. The cell differentially separates Golgi resident proteins from proteins that move through the organelle to other subcellular destinations. We set out to determine if we could distinguish these two types of transmembrane proteins using computational approaches. Results: A new method has been developed to predict Golgi membrane proteins based on their transmembrane domains. To establish the prediction procedure, we took the hydrophobicity values and frequencies of different residues within the transmembrane domains into consideration. A simple linear discriminant function was developed with a small number of parameters derived from a dataset of Type II transmembrane proteins of known localization. This can discriminate between proteins destined for the Golgi apparatus or other locations (post-Golgi) with a success rate of 89.3% or 85.2%, respectively, on our redundancy-reduced data sets.
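A linear discriminant over transmembrane-domain (TMD) features of the kind described above can be sketched as follows. The Kyte-Doolittle hydrophobicity scale is standard; the feature choice, weights, and threshold below are invented for illustration and are not the parameters fitted in the paper.

```python
# Hypothetical sketch of a linear discriminant over TMD features.
# KD is the standard Kyte-Doolittle hydrophobicity scale; the weights
# w and intercept b are made up, not the paper's fitted parameters.
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
      'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
      'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
      'Y': -1.3, 'V': 4.2}

def tmd_features(tmd):
    hydro = sum(KD[a] for a in tmd) / len(tmd)   # mean hydrophobicity
    return hydro, len(tmd)                        # length in residues

def predict_golgi(tmd, w=(-1.0, 0.4), b=-5.0):
    """Toy rule: shorter, less hydrophobic TMDs score toward Golgi residency."""
    hydro, length = tmd_features(tmd)
    return w[0] * hydro + w[1] * length + b > 0

print(predict_golgi("ALILSLVGFAGLVASA"))  # 16-residue example TMD
```

The design reflects the biological observation the paper exploits: Golgi-resident TMDs tend to be shorter than plasma-membrane TMDs, so length and residue composition carry localization signal.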
Abstract:
We focus on mixtures of factor analyzers from the perspective of a method for model-based density estimation from high-dimensional data, and hence for the clustering of such data. This approach enables a normal mixture model to be fitted to a sample of n data points of dimension p, where p is large relative to n. The number of free parameters is controlled through the dimension of the latent factor space. By working in this reduced space, it allows a model for each component-covariance matrix with complexity lying between that of the isotropic and full covariance structure models. We shall illustrate the use of mixtures of factor analyzers in a practical example that considers the clustering of cell lines on the basis of gene expressions from microarray experiments. (C) 2002 Elsevier Science B.V. All rights reserved.
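The parameter reduction behind mixtures of factor analyzers can be made concrete with a count: each component covariance is modelled as Lambda Lambda^T + Psi, with Lambda a p x q loading matrix and Psi diagonal, so a component needs roughly p*q + p covariance parameters instead of p*(p+1)/2 for a full covariance matrix (ignoring the rotational indeterminacy of the loadings, which removes a further q(q-1)/2).

```python
# Parameter counts per mixture component: full covariance vs the factor
# analyzer decomposition Sigma = Lambda @ Lambda.T + Psi (Psi diagonal).

def full_cov_params(p):
    return p * (p + 1) // 2

def factor_cov_params(p, q):
    return p * q + p  # loadings plus diagonal uniquenesses

p = 2000   # e.g. genes measured per cell line in a microarray experiment
q = 4      # latent factors per component
print(full_cov_params(p), "->", factor_cov_params(p, q))
```

This is why the approach remains fittable when p is large relative to n, as in the gene-expression clustering example: the latent dimension q, not p, controls the number of free covariance parameters.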
Abstract:
For Markov processes on the positive integers with the origin as an absorbing state, Ferrari, Kesten, Martinez and Picco studied the existence of quasi-stationary and limiting conditional distributions by characterizing quasi-stationary distributions as fixed points of a transformation Phi on the space of probability distributions on {1, 2,.. }. In the case of a birth-death process, the components of Phi(nu) can be written down explicitly for any given distribution nu. Using this explicit representation, we will show that Phi preserves likelihood ratio ordering between distributions. A conjecture of Kryscio and Lefevre concerning the quasi-stationary distribution of the SIS logistic epidemic follows as a corollary.
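A numerical sketch related to the setting above: for a birth-death chain on {1,...,N} with absorption at 0, the quasi-stationary distribution is the normalized left eigenvector, for the dominant eigenvalue, of the generator restricted to the transient states. The SIS-style rates below are illustrative, not the exact Kryscio-Lefevre parameterization, and this direct eigenvector computation stands in for the fixed-point characterization via Phi.

```python
# Quasi-stationary distribution of a truncated SIS-like birth-death chain,
# computed as the dominant left eigenvector of the sub-generator on {1..N}.
import numpy as np

N = 50
lam, mu = 1.5, 1.0                         # illustrative rate constants
states = np.arange(1, N + 1)
birth = lam * states * (N - states) / N    # logistic (SIS-style) birth rates
death = mu * states                        # death rates; state 1 can absorb at 0

Q = np.zeros((N, N))                       # generator restricted to {1..N}
for i in range(N):
    if i + 1 < N:
        Q[i, i + 1] = birth[i]
    if i > 0:
        Q[i, i - 1] = death[i]
    Q[i, i] = -(birth[i] + death[i])       # row 0's death rate leaks to absorption

vals, vecs = np.linalg.eig(Q.T)            # left eigenvectors of Q
k = np.argmax(vals.real)                   # dominant (least negative) eigenvalue
qsd = np.abs(vecs[:, k].real)
qsd /= qsd.sum()
print("QSD mode at n =", states[np.argmax(qsd)])
```

The mode sits near the deterministic endemic level where birth and death rates balance, consistent with the logistic-epidemic interpretation.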
Abstract:
A decision theory framework can be a powerful technique for deriving optimal management decisions for endangered species. We built a spatially realistic stochastic metapopulation model for the Mount Lofty Ranges Southern Emu-wren (Stipiturus malachurus intermedius), a critically endangered Australian bird. Using discrete-time Markov chains to describe the dynamics of a metapopulation and stochastic dynamic programming (SDP) to find optimal solutions, we evaluated the following management decisions: enlarging existing patches, linking patches via corridors, and creating a new patch. This is the first application of SDP to optimal landscape reconstruction and one of the few times that landscape reconstruction dynamics have been integrated with population dynamics. SDP is a powerful tool that has advantages over standard Monte Carlo simulation methods because it can give the exact optimal strategy for every landscape configuration (combination of patch areas and presence of corridors) and pattern of metapopulation occupancy, as well as a trajectory of strategies. It is useful when a sequence of management actions can be performed over a given time horizon, as is the case for many endangered species recovery programs, where only fixed amounts of resources are available in each time step. However, it is generally limited by computational constraints to rather small networks of patches. The model shows that optimal metapopulation management decisions depend greatly on the current state of the metapopulation, and there is no strategy that is universally the best. The extinction probability over 30 yr under the optimal state-dependent management actions is 50-80% better than no management, whereas the best fixed state-independent sets of strategies are only 30% better than no management. This highlights the advantages of using a decision theory tool to investigate conservation strategies for metapopulations.
It is clear from these results that the sequence of management actions is critical, and this can only be effectively derived from stochastic dynamic programming. The model illustrates the underlying difficulty in determining simple rules of thumb for the sequence of management actions for a metapopulation. This use of a decision theory framework extends the capacity of population viability analysis (PVA) to manage threatened species.
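The backward-induction machinery behind SDP can be sketched generically: for each time step and metapopulation state, pick the action whose transition probabilities minimize eventual extinction probability. The states, actions, and transition matrices below are toy values, not the Emu-wren model.

```python
# Toy finite-horizon stochastic dynamic program minimizing extinction
# probability. State 0 = extinct (absorbing); states 1 and 2 = one or two
# patches occupied. Transition matrices are invented for illustration.
import numpy as np

P = {
    "do_nothing":   np.array([[1.0, 0.0, 0.0],
                              [0.3, 0.5, 0.2],
                              [0.1, 0.3, 0.6]]),
    "add_corridor": np.array([[1.0, 0.0, 0.0],
                              [0.2, 0.5, 0.3],
                              [0.05, 0.25, 0.7]]),
}

horizon = 30
V = np.array([1.0, 0.0, 0.0])   # terminal cost: 1 if extinct at the horizon
policy = []
for t in range(horizon):
    action_values = {a: P[a] @ V for a in P}   # extinction prob under each action
    best = {s: min(action_values, key=lambda a: action_values[a][s])
            for s in (1, 2)}                    # optimal action per state
    V = np.minimum(action_values["do_nothing"], action_values["add_corridor"])
    policy.append(best)

print("extinction prob from state 2 under optimal policy:", round(V[2], 3))
```

The `policy` list is exactly the "trajectory of strategies" the abstract refers to: a state-dependent action for every remaining time step, which Monte Carlo simulation of fixed strategies cannot deliver.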
Abstract:
This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing traditional idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment (a relatively new idea that allows individual assessment of predictions). The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, where the use of non-parametric methods such as decision trees and generalized additive models is promoted to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity of 1,710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods we use are applied specifically to this case study, these methods can be applied across any field, irrespective of the type of response.
Abstract:
The extent to which density-dependent processes regulate natural populations is the subject of an ongoing debate. We contribute evidence to this debate showing that density-dependent processes influence the population dynamics of the ectoparasite Aponomma hydrosauri (Acari: Ixodidae), a tick species that infests reptiles in Australia. The first piece of evidence comes from an unusually long-term dataset on the distribution of ticks among individual hosts. If density-dependent processes are influencing either host mortality or vital rates of the parasite population, and those distributions can be approximated with negative binomial distributions, then general host-parasite models predict that the aggregation coefficient of the parasite distribution will increase with the average intensity of infections. We fit negative binomial distributions to the frequency distributions of ticks on hosts, and find that the estimated aggregation coefficient k increases with increasing average tick density. This pattern indirectly implies that one or more vital rates of the tick population must be changing with increasing tick density, because mortality rates of the tick's main host, the sleepy lizard, Tiliqua rugosa, are unaffected by changes in tick burdens. Our second piece of evidence is a re-analysis of experimental data on the attachment success of individual ticks to lizard hosts using generalized linear modelling. The probability of successful engorgement decreases with increasing numbers of ticks attached to a host. This is direct evidence of a density-dependent process that could lead to an increase in the aggregation coefficient of tick distributions described earlier. The population-scale increase in the aggregation coefficient is indirect evidence of a density-dependent process or processes sufficiently strong to produce a population-wide pattern, and thus also likely to influence population regulation. 
The direct observation of a density-dependent process is evidence of at least part of the responsible mechanism.
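The aggregation coefficient k tracked in the analysis above can be estimated from count data by the method of moments: k = mean^2 / (variance - mean), valid when the counts are overdispersed relative to Poisson. The tick counts below are invented for illustration; maximum likelihood, as typically used in such studies, would refine this estimate.

```python
# Method-of-moments estimate of the negative binomial aggregation
# coefficient k from parasite counts per host. Smaller k means stronger
# aggregation of parasites among hosts.
import numpy as np

def negbin_k(counts):
    m, v = counts.mean(), counts.var(ddof=1)
    if v <= m:
        raise ValueError("no overdispersion: Poisson or underdispersed counts")
    return m * m / (v - m)

ticks = np.array([0, 0, 0, 1, 0, 2, 0, 0, 5, 0, 1, 0, 12, 0, 3])  # invented
k = negbin_k(ticks)
print(f"k = {k:.3f}")
```

The population-scale pattern reported above corresponds to this k increasing as mean tick density increases across sampling occasions.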
Abstract:
Like many states and territories, South Australia has a legacy of marine reserves considered to be inadequate to meet current conservation objectives. In this paper we configured exploratory marine reserve systems, using the software MARXAN, to examine how efficiently South Australia's existing marine reserves contribute to quantitative biodiversity conservation targets. Our aim was to compare marine reserve systems that retain South Australia's existing marine reserves with reserve systems that are free to either ignore or incorporate them. We devised a new interpretation of irreplaceability to identify planning units selected more than could be expected from chance alone. This is measured by comparing the observed selection frequency for an individual planning unit with a predicted selection frequency distribution. Knowing which sites make a valuable contribution to efficient marine reserve system design allows us to determine how well South Australia's existing reserves contribute to reservation goals when representation targets are set at 5, 10, 15, 20, 30 and 50% of conservation features. Existing marine reserves that fail to contribute to efficient marine reserve systems constitute 'opportunity costs'. We found that despite spanning less than 4% of South Australian state waters, locking in the existing ad hoc marine reserves presented considerable opportunity costs. Even with representation targets set at 50%, more than half of South Australia's existing marine reserves were selected randomly or less in efficient marine reserve systems. Hence, ad hoc marine reserve systems are likely to be inefficient and may compromise effective conservation of marine biodiversity.
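The selection-frequency comparison described above can be sketched with a simple null model: flag a planning unit if its selection count across many reserve-system solutions exceeds what a binomial model of chance selection would produce. The run counts and baseline probability below are invented, and the paper's predicted distribution need not be exactly binomial.

```python
# Sketch: compare observed selection frequency of planning units across
# MARXAN runs with a binomial null of chance selection. All numbers are
# invented for illustration.
from math import comb

def binom_sf(x, n, p):
    """Upper tail P(X >= x) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(x, n + 1))

n_runs = 100       # number of reserve-system solutions generated
p_chance = 0.25    # hypothetical baseline selection probability

for unit, times_selected in {"unit_A": 30, "unit_B": 45}.items():
    pval = binom_sf(times_selected, n_runs, p_chance)
    verdict = "more than chance" if pval < 0.05 else "consistent with chance"
    print(unit, f"p={pval:.4f}", verdict)
```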
Abstract:
Cervical auscultation presents as a noninvasive screening assessment of swallowing. Until now, the focus of acoustic research in swallowing has been the characterization of swallowing sounds. However, it may be that the technique is also suitable for the detection of respiratory sounds post swallow. A healthy relationship between swallowing and respiration is widely accepted as pivotal to safe swallowing. Previous investigators have shown that the expiratory phase of respiration commonly occurs prior to and after swallowing. That the larynx is valved shut during swallowing is also accepted. Previous research indicates that the larynx releases valved air immediately post swallow in healthy individuals. The current investigation sought to explore acoustic evidence of a release of subglottic air post swallow in nondysphagic individuals using a noninvasive medium. Fifty-nine healthy individuals spanning the ages of 18 to 60+ years swallowed 5 and 10 milliliters (ml) of thin and thick liquid boluses. Objective acoustic analysis was used to verify presence of the sound and to characterize its morphological features. The sound, dubbed the glottal release sound, was found to consistently occur in close proximity following the swallowing sound. The results indicated that the sound has distinct morphological features and that these change depending on the volume and viscosity of the bolus swallowed. Further research will be required to translate this information to a clinical tool.
Abstract:
Many large-scale stochastic systems, such as telecommunications networks, can be modelled using a continuous-time Markov chain. However, it is frequently the case that a satisfactory analysis of their time-dependent, or even equilibrium, behaviour is impossible. In this paper, we propose a new method of analysing Markovian models, whereby the existing transition structure is replaced by a more amenable one. Using rates of transition given by the equilibrium expected rates of the corresponding transitions of the original chain, we are able to approximate its behaviour. We present two formulations of the idea of expected rates. The first provides a method for analysing time-dependent behaviour, while the second provides a highly accurate means of analysing equilibrium behaviour. We shall illustrate our approach with reference to a variety of models, giving particular attention to queueing and loss networks. (C) 2003 Elsevier Ltd. All rights reserved.
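The expected-rates idea starts from the equilibrium distribution of the original chain. A minimal sketch of that first step for a small continuous-time Markov chain: solve pi Q = 0 subject to sum(pi) = 1. The 3-state generator below is a toy example, not a queueing-network model from the paper.

```python
# Equilibrium distribution of a small CTMC: solve pi @ Q = 0 with the
# normalization sum(pi) = 1, by replacing one balance equation with the
# normalization constraint. Q here is a toy 3-state generator.
import numpy as np

Q = np.array([[-2.0,  2.0,  0.0],
              [ 1.0, -3.0,  2.0],
              [ 0.0,  3.0, -3.0]])

A = np.vstack([Q.T[:-1], np.ones(3)])   # two balance eqns + normalization
b = np.zeros(3); b[-1] = 1.0
pi = np.linalg.solve(A, b)
print("equilibrium:", pi.round(4))

# Equilibrium expected rate of the 0 -> 1 transition: the kind of quantity
# used to build the replacement transition structure.
expected_rate_01 = pi[0] * Q[0, 1]
```

In the method described above, rates such as `expected_rate_01` would replace the state-dependent rates of the original, less tractable chain.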
Abstract:
We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (C) 2003 John Wiley & Sons, Ltd.