201 results for Multiscale stochastic modelling
Abstract:
We focus on mixtures of factor analyzers as a method for model-based density estimation from high-dimensional data, and hence for the clustering of such data. This approach enables a normal mixture model to be fitted to a sample of n data points of dimension p, where p is large relative to n. The number of free parameters is controlled through the dimension of the latent factor space. Working in this reduced space allows a model for each component-covariance matrix with complexity lying between that of the isotropic and full covariance structure models. We illustrate the use of mixtures of factor analyzers in a practical example that considers the clustering of cell lines on the basis of gene expressions from microarray experiments. (C) 2002 Elsevier Science B.V. All rights reserved.
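The parameter-count trade-off described above can be made concrete with a short sketch. The q(q-1)/2 deduction for rotational indeterminacy of the factor loadings is a standard convention, not something stated in the abstract, and the example dimensions are illustrative:

```python
# Free covariance parameters per mixture component for p-dimensional data
# with a q-dimensional latent factor space.

def isotropic_params(p):
    # Sigma = sigma^2 * I: a single variance parameter.
    return 1

def factor_analyzer_params(p, q):
    # Sigma = Lambda Lambda' + Psi: p*q loadings plus p diagonal
    # uniquenesses, minus q(q-1)/2 for rotational indeterminacy.
    return p * q + p - q * (q - 1) // 2

def full_cov_params(p):
    # Unrestricted symmetric p x p covariance.
    return p * (p + 1) // 2

p, q = 1000, 5  # e.g. expression levels for 1000 genes, 5 latent factors
counts = (isotropic_params(p), factor_analyzer_params(p, q), full_cov_params(p))
print(counts)   # (1, 5990, 500500)
```

The middle count grows linearly rather than quadratically in p, which is what makes the approach feasible when p is large relative to n.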
Abstract:
We compare Bayesian methodology utilizing the freeware BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling as implemented by BUGS to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or moderate (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%) even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years, who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
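The kind of simulated twin data behind such a study can be sketched as follows. This is a hypothetical, standardized continuous ACE simulation (the abstract's data were dichotomous and ordinal, and all parameter values here are illustrative): MZ pairs share all additive genetic effects, DZ pairs share half, and both share the common environment.

```python
import random

def simulate_twin_pairs(n_pairs, h2, c2, zygosity="MZ", seed=1):
    """Standardized continuous twin phenotypes under an ACE model:
    additive genetic variance h2, common environment c2, unique e2 = 1-h2-c2.
    Expected pair correlation: h2+c2 for MZ twins, 0.5*h2+c2 for DZ."""
    rng = random.Random(seed)
    e2 = 1.0 - h2 - c2
    pairs = []
    for _ in range(n_pairs):
        c = rng.gauss(0, 1)                       # shared environment
        if zygosity == "MZ":
            a1 = a2 = rng.gauss(0, 1)             # identical genotypes
        else:
            shared = rng.gauss(0, 1)              # gives genetic corr 0.5
            a1 = (shared + rng.gauss(0, 1)) / 2 ** 0.5
            a2 = (shared + rng.gauss(0, 1)) / 2 ** 0.5
        y1 = h2 ** 0.5 * a1 + c2 ** 0.5 * c + e2 ** 0.5 * rng.gauss(0, 1)
        y2 = h2 ** 0.5 * a2 + c2 ** 0.5 * c + e2 ** 0.5 * rng.gauss(0, 1)
        pairs.append((y1, y2))
    return pairs

def corr(pairs):
    n = len(pairs)
    xs = [p[0] for p in pairs]
    ys = [p[1] for p in pairs]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

# h2 = 0.5, c2 = 0.2, as in one of the abstract's harder scenarios
mz = corr(simulate_twin_pairs(2000, 0.5, 0.2, "MZ"))  # approx 0.7
dz = corr(simulate_twin_pairs(2000, 0.5, 0.2, "DZ"))  # approx 0.45
```

Comparing the MZ and DZ correlations is what lets either method partition variance into A, C and E components.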
Abstract:
For Markov processes on the positive integers with the origin as an absorbing state, Ferrari, Kesten, Martinez and Picco studied the existence of quasi-stationary and limiting conditional distributions by characterizing quasi-stationary distributions as fixed points of a transformation Phi on the space of probability distributions on {1, 2, ...}. In the case of a birth-death process, the components of Phi(nu) can be written down explicitly for any given distribution nu. Using this explicit representation, we will show that Phi preserves likelihood ratio ordering between distributions. A conjecture of Kryscio and Lefevre concerning the quasi-stationary distribution of the SIS logistic epidemic follows as a corollary.
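For the SIS logistic epidemic mentioned in the closing corollary, the quasi-stationary distribution can be approximated numerically: iterate a small-step sub-stochastic kernel over the transient states and renormalise after each step (the generic power-method route, in the spirit of iterating Phi, though not the paper's explicit representation). All parameters below are hypothetical:

```python
def sis_qsd(N=20, beta=2.0, mu=1.0, h=0.01, iters=20000):
    """Quasi-stationary distribution of the SIS logistic epidemic on
    states {1..N} (0 absorbing): infection rate beta*i*(N-i)/N,
    recovery rate mu*i. Iterates P = I + h*Q restricted to {1..N},
    renormalising to condition on non-absorption."""
    birth = [beta * i * (N - i) / N for i in range(N + 1)]
    death = [mu * i for i in range(N + 1)]
    nu = [0.0] + [1.0 / N] * N          # uniform start on {1..N}
    for _ in range(iters):
        new = [0.0] * (N + 1)
        for i in range(1, N + 1):
            new[i] += nu[i] * (1.0 - h * (birth[i] + death[i]))
            if i < N:
                new[i + 1] += nu[i] * h * birth[i]
            if i > 1:                   # mass moving 1 -> 0 is absorbed
                new[i - 1] += nu[i] * h * death[i]
        total = sum(new)
        nu = [x / total for x in new]   # condition on non-absorption
    return nu[1:]

q = sis_qsd()
mode = 1 + max(range(len(q)), key=q.__getitem__)  # modal number infected
```

With beta/mu = 2 the deterministic endemic level is N(1 - mu/beta) = 10, and the quasi-stationary mode sits close to it.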
Abstract:
Conceptual modelling is an activity undertaken during information systems development work to build a representation of selected semantics about some real-world domain. Ontological theories have been developed to account for the structure and behavior of the real world in general. In this paper, I discuss why ontological theories can be used to inform conceptual modelling research, practice, and pedagogy. I provide examples from my research to illustrate how a particular ontological theory has enabled me to improve my understanding of certain conceptual modelling practices and grammars. I also describe how some colleagues and I have used this theory to generate several counter-intuitive, sometimes surprising predictions about widely advocated conceptual modelling practices - predictions that subsequently were supported in empirical research we undertook. Finally, I discuss several possibilities and pitfalls I perceive to be associated with using ontological theories to underpin research on conceptual modelling.
Abstract:
A decision theory framework can be a powerful technique to derive optimal management decisions for endangered species. We built a spatially realistic stochastic metapopulation model for the Mount Lofty Ranges Southern Emu-wren (Stipiturus malachurus intermedius), a critically endangered Australian bird. Using discrete-time Markov chains to describe the dynamics of a metapopulation and stochastic dynamic programming (SDP) to find optimal solutions, we evaluated the following different management decisions: enlarging existing patches, linking patches via corridors, and creating a new patch. This is the first application of SDP to optimal landscape reconstruction and one of the few times that landscape reconstruction dynamics have been integrated with population dynamics. SDP is a powerful tool that has advantages over standard Monte Carlo simulation methods because it can give the exact optimal strategy for every landscape configuration (combination of patch areas and presence of corridors) and pattern of metapopulation occupancy, as well as a trajectory of strategies. It is useful when a sequence of management actions can be performed over a given time horizon, as is the case for many endangered species recovery programs, where only fixed amounts of resources are available in each time step. However, it is generally limited by computational constraints to rather small networks of patches. The model shows that optimal metapopulation management decisions depend greatly on the current state of the metapopulation, and there is no strategy that is universally the best. The extinction probability over 30 yr for the optimal state-dependent management actions is 50-80% better than no management, whereas the best fixed state-independent sets of strategies are only 30% better than no management. This highlights the advantages of using a decision theory tool to investigate conservation strategies for metapopulations.
It is clear from these results that the sequence of management actions is critical, and this can only be effectively derived from stochastic dynamic programming. The model illustrates the underlying difficulty in determining simple rules of thumb for the sequence of management actions for a metapopulation. This use of a decision theory framework extends the capacity of population viability analysis (PVA) to manage threatened species.
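The backward-induction core of stochastic dynamic programming can be sketched generically. The three-state toy below (0 = extinct, 1 = small, 2 = large, with an "enlarge" action) is entirely invented for illustration and is far simpler than the metapopulation model described; the point it demonstrates is the abstract's: the optimal action is state-dependent.

```python
def sdp(states, actions, trans, terminal, horizon):
    """Finite-horizon stochastic dynamic programming by backward induction.
    trans(s, a) -> list of (next_state, prob); terminal(s) -> final payoff.
    Returns the time-0 value function and the optimal action per step."""
    V = {s: terminal(s) for s in states}
    policy = []
    for _ in range(horizon):
        act = {s: max(actions,
                      key=lambda a: sum(p * V[s2] for s2, p in trans(s, a)))
               for s in states}
        V = {s: sum(p * V[s2] for s2, p in trans(s, act[s])) for s in states}
        policy = [act] + policy
    return V, policy

# Hypothetical three-state toy: management raises the growth probability.
def trans(s, a):
    if s == 0:
        return [(0, 1.0)]                         # extinction is absorbing
    grow = 0.3 if a == "enlarge" else 0.1
    die = 0.3 if s == 1 else 0.1
    return [(min(s + 1, 2), grow), (s - 1, die), (s, 1.0 - grow - die)]

V, policy = sdp(states=[0, 1, 2], actions=["none", "enlarge"], trans=trans,
                terminal=lambda s: 1.0 if s > 0 else 0.0, horizon=30)
# V[s] is the 30-step persistence probability under the optimal
# state-dependent action sequence; policy[0][1] is the first action
# recommended when the population is small.
```

Even here the solver recommends acting when the population is small but not when it is large, a miniature version of the state-dependence the abstract reports.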
Abstract:
This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing traditional idea of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, in which non-parametric methods such as decision trees and generalized additive models are promoted to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem where interest focuses on developing a risk stratification system for morbidity of 1,710 cardiac patients given a suite of demographic, clinical and preoperative variables. Although the methods we use are applied specifically to this case study, these methods can be applied across any field, irrespective of the type of response.
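The two-step process can be caricatured in a few lines. This is a deliberately crude stand-in: a univariate correlation screen replaces the trees/GAMs of the first step, and a hand-rolled logistic fit plays the final predictive model; all data and dimensions are toy values, not the paper's.

```python
import math
import random

def screen(X, y, keep=2):
    """Step 1 stand-in: rank predictors by |correlation| with the
    response and keep the top few."""
    n, p = len(X), len(X[0])
    scores = []
    for j in range(p):
        col = [row[j] for row in X]
        mx, my = sum(col) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(col, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in col))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        scores.append(abs(cov / (sx * sy)))
    return sorted(range(p), key=lambda j: -scores[j])[:keep]

def fit_logistic(X, y, cols, steps=500, lr=0.1):
    """Step 2 stand-in: gradient-ascent logistic regression on the
    screened columns. w[0] is the intercept."""
    w = [0.0] * (len(cols) + 1)
    for _ in range(steps):
        g = [0.0] * len(w)
        for row, t in zip(X, y):
            z = w[0] + sum(wj * row[c] for wj, c in zip(w[1:], cols))
            p = 1 / (1 + math.exp(-z))
            g[0] += t - p
            for k, c in enumerate(cols):
                g[k + 1] += (t - p) * row[c]
        w = [wj + lr * gj / len(X) for wj, gj in zip(w, g)]
    return w

# Toy data: only the first two of five variables carry signal.
rng = random.Random(0)
X = [[rng.gauss(0, 1) for _ in range(5)] for _ in range(400)]
y = [1 if row[0] + row[1] + rng.gauss(0, 1) > 0 else 0 for row in X]
cols = screen(X, y)          # should recover columns 0 and 1
w = fit_logistic(X, y, cols)
```

The separation matters: variable selection is settled before the final model is fitted, which is the discipline the framework advocates.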
Abstract:
Colour pattern variation is a striking and widespread phenomenon. Differential predation risk between individuals is often invoked to explain colour variation, but empirical support for this hypothesis is equivocal. We investigated differential conspicuousness and predation risk in two species of Australian rock dragons, Ctenophorus decresii and C. vadnappa. To humans, the coloration of males of these species varies between 'bright' and 'dull'. Visual modelling based on objective colour measurements and the spectral sensitivities of avian visual pigments showed that dragon colour variants are differentially conspicuous to the visual system of avian predators when viewed against the natural background. We conducted field experiments to test for differential predation risk, using plaster models of 'bright' and 'dull' males. 'Bright' models were attacked significantly more often than 'dull' models, suggesting that differential conspicuousness translates to differential predation risk in the wild. We also examined the influence of natural geographical range on predation risk. Results from 22 localities suggest that predation rates vary according to whether predators are familiar with the prey species. This study is among the first to demonstrate both differential conspicuousness and differential predation risk in the wild using an experimental protocol. (C) 2003 Published by Elsevier Ltd on behalf of The Association for the Study of Animal Behaviour.
Abstract:
Crushing and grinding are the most energy intensive part of the mineral recovery process. A major part of rock size reduction occurs in tumbling mills. Empirical models for the power draw of tumbling mills do not consider the effect of lifters. Discrete element modelling was used to investigate the effect of lifter condition on the power draw of a tumbling mill. Results obtained with the PFC3D code show that lifter condition will have a significant influence on the power draw and on the mode of energy consumption in the mill. Relatively high lifters will consume less power than low lifters, under otherwise identical conditions. The fraction of the power that will be consumed as friction will increase as the height of the lifters decreases. This will result in less power being used for high intensity comminution caused by the impacts. The fraction of the power that will be used to overcome frictional resistance is determined by the material's coefficient of friction. Based on the modelled results, it appears that the effective coefficient of friction for an in situ mill is close to 0.1. (C) 2003 Elsevier Science Ltd. All rights reserved.
Abstract:
The PFC3D (particle flow code) that models the movement and interaction of particles by the DEM techniques was employed to simulate the particle movement and to calculate the velocity and energy distribution of collision in two types of impact crusher: the Canica vertical shaft crusher and the BJD horizontal shaft swing hammer mill. The distribution of collision energies was then converted into a product size distribution for a particular ore type using JKMRC impact breakage test data. Experimental data of the Canica VSI crusher treating quarry and the BJD hammer mill treating coal were used to verify the DEM simulation results. Upon the DEM procedures being validated, a detailed simulation study was conducted to investigate the effects of the machine design and operational conditions on velocity and energy distributions of collision inside the milling chamber and on the particle breakage behaviour. (C) 2003 Elsevier Ltd. All rights reserved.
Abstract:
A new model to predict the extent of crushing around a blasthole is presented. The model is based on the back-analysis of a comprehensive experimental program that included the direct measurement of the zone of crushing from 92 blasting tests on concrete blocks using two commercial explosives. The concrete blocks varied from low, medium to high strength and measured 1.5 m in length, 1.0 m in width and 1.1 m in height. A dimensionless parameter called the crushing zone index (CZI) is introduced. This index measures the crushing potential of a charged blasthole and is a function of the borehole pressure, the unconfined compressive strength of the rock material, dynamic Young's modulus and Poisson's ratio. It is shown that the radius of crushing is a function of the CZI and the blasthole radius. A good correlation between the new model and measured results was obtained. A number of previously proposed models could not approximate the conditions measured in the experimental work and there are noted discrepancies between the different approaches reviewed, particularly for smaller diameter holes and low strength rock conditions. The new model has been verified with full scale tests reported in the literature. Results from this validation and model evaluations show its applicability to production blasting. (C) 2003 Elsevier Science Ltd. All rights reserved.
Abstract:
Many large-scale stochastic systems, such as telecommunications networks, can be modelled using a continuous-time Markov chain. However, it is frequently the case that a satisfactory analysis of their time-dependent, or even equilibrium, behaviour is impossible. In this paper, we propose a new method of analysing Markovian models, whereby the existing transition structure is replaced by a more amenable one. Using rates of transition given by the equilibrium expected rates of the corresponding transitions of the original chain, we are able to approximate its behaviour. We present two formulations of the idea of expected rates. The first provides a method for analysing time-dependent behaviour, while the second provides a highly accurate means of analysing equilibrium behaviour. We shall illustrate our approach with reference to a variety of models, giving particular attention to queueing and loss networks. (C) 2003 Elsevier Ltd. All rights reserved.
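The notion of an equilibrium expected rate can be illustrated on a small chain: compute the stationary distribution of a generator, then average a state-dependent rate against it to obtain the constant rate that would replace it in a simplified transition structure. The M/M/2/4 queue below is a hypothetical example, not the paper's construction:

```python
def stationary(Q, iters=5000):
    """Stationary distribution of a CTMC with generator Q (list of lists),
    via power iteration on the uniformized chain P = I + Q/Lam."""
    n = len(Q)
    Lam = 1.1 * max(-Q[i][i] for i in range(n))   # uniformization rate
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * ((i == j) + Q[i][j] / Lam) for i in range(n))
              for j in range(n)]
    s = sum(pi)
    return [x / s for x in pi]

# Hypothetical M/M/2/4 queue: arrivals at rate lam, service at min(i,2)*mu.
lam, mu, K = 3.0, 2.0, 4
Q = [[0.0] * (K + 1) for _ in range(K + 1)]
for i in range(K + 1):
    if i < K:
        Q[i][i + 1] = lam                 # arrival
    if i > 0:
        Q[i][i - 1] = min(i, 2) * mu      # state-dependent service
    Q[i][i] = -sum(Q[i])                  # diagonal balances the row

pi = stationary(Q)
# Equilibrium expected service rate: the single constant that would stand
# in for the state-dependent rate min(i,2)*mu in the amended chain.
expected_rate = sum(pi[i] * min(i, 2) * mu for i in range(K + 1))
```

At stationarity this expected service rate equals the accepted arrival rate lam*(1 - pi[K]), a useful consistency check.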
Abstract:
Admission controls, such as trunk reservation, are often used in loss networks to optimise their performance. Since the numerical evaluation of performance measures is complex, much attention has been given to finding approximation methods. The Erlang Fixed-Point (EFP) approximation, which is based on an independent blocking assumption, has been used for networks both with and without controls. Several more elaborate approximation methods which account for dependencies in blocking behaviour have been developed for the uncontrolled setting. This paper is an exploratory investigation of extensions and synthesis of these methods to systems with controls, in particular, trunk reservation. In order to isolate the dependency factor, we restrict our attention to a highly linear network. We will compare the performance of the resulting approximations against the benchmark of the EFP approximation extended to the trunk reservation setting. By doing this, we seek to gain insight into the critical factors in constructing an effective approximation. (C) 2003 Elsevier Ltd. All rights reserved.
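The EFP approximation itself is compact enough to sketch: under the independent-blocking assumption, each link's blocking probability is the Erlang B value of its reduced offered load, and the system is solved as a fixed point. The two-link linear network and all loads below are hypothetical, and no trunk reservation control is modelled:

```python
from math import prod

def erlang_b(A, C):
    """Erlang B blocking probability for offered load A erlangs on C
    circuits, computed with the standard stable recursion."""
    B = 1.0
    for c in range(1, C + 1):
        B = A * B / (c + A * B)
    return B

def efp(loads, caps, routes, iters=200):
    """Erlang fixed-point approximation for an uncontrolled loss network.
    routes[r] lists the links used by stream r (offered load loads[r]);
    caps[j] is the number of circuits on link j. Repeatedly substitutes
    independent-blocking reduced loads into Erlang's B formula."""
    B = [0.0] * len(caps)
    for _ in range(iters):
        B = [erlang_b(sum(loads[r] * prod(1 - B[k] for k in routes[r] if k != j)
                          for r in range(len(routes)) if j in routes[r]),
                      caps[j])
             for j in range(len(caps))]
    return B

# Hypothetical two-link linear network: two single-link streams plus one
# stream traversing both links.
B = efp(loads=[5.0, 5.0, 2.0], caps=[10, 10], routes=[[0], [1], [0, 1]])
```

The two-link stream is thinned by blocking on the other link before being offered to each link, which is exactly the independence assumption whose failure motivates the more elaborate approximations the abstract investigates.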
Abstract:
Sensitivity of output of a linear operator to its input can be quantified in various ways. In Control Theory, the input is usually interpreted as disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with imprecisely known probability distribution. The prior set of probability measures can be chosen so as to quantify how far the disturbance deviates from the white-noise hypothesis of Linear Quadratic Gaussian control. Such deviation can be measured by the minimal Kullback-Leibler informational divergence from the Gaussian distributions with zero mean and scalar covariance matrices. The resulting anisotropy functional is defined for finite power random vectors. Originally, anisotropy was introduced for directionally generic random vectors as the relative entropy of the normalized vector with respect to the uniform distribution on the unit sphere. The associated a-anisotropic norm of a matrix is then its maximum root mean square or average energy gain with respect to finite power or directionally generic inputs whose anisotropy is bounded above by a≥0. We give a systematic comparison of the anisotropy functionals and the associated norms. These are considered for unboundedly growing fragments of homogeneous Gaussian random fields on multidimensional integer lattice to yield mean anisotropy. Correspondingly, the anisotropic norms of finite matrices are extended to bounded linear translation invariant operators over such fields.
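For Gaussian vectors the anisotropy functional has a closed form, recalled here from the anisotropy-based control literature and to be treated as an assumption: the minimal KL divergence from the family N(0, lambda*I) reduces to -0.5*ln det(m*Sigma/tr(Sigma)), which vanishes exactly for scalar covariance matrices. A small self-contained sketch:

```python
from math import log

def det(M):
    """Determinant via Gaussian elimination with partial pivoting
    (adequate for the small, well-conditioned covariances used here)."""
    A = [row[:] for row in M]
    n, d = len(A), 1.0
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        if p != k:
            A[k], A[p], d = A[p], A[k], -d    # row swap flips the sign
        d *= A[k][k]
        for i in range(k + 1, n):
            f = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= f * A[k][j]
    return d

def gaussian_anisotropy(Sigma):
    """Anisotropy of a zero-mean Gaussian vector with covariance Sigma:
    minimal KL divergence from N(0, lambda*I) over lambda > 0, i.e.
    -0.5 * ln det(m * Sigma / tr(Sigma)) for an m-dimensional vector."""
    m = len(Sigma)
    tr = sum(Sigma[i][i] for i in range(m))
    return -0.5 * log(det([[m * x / tr for x in row] for row in Sigma]))

iso = gaussian_anisotropy([[2.0, 0.0], [0.0, 2.0]])    # scalar covariance -> 0
aniso = gaussian_anisotropy([[3.0, 0.0], [0.0, 1.0]])  # anisotropic -> positive
```

The functional is invariant to scaling Sigma, consistent with anisotropy measuring directional non-uniformity rather than power.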