10 results for 230201 Probability Theory
at University of Queensland eSpace - Australia
Abstract:
Mineral processing plants use two main processes: comminution and separation. The objective of comminution is to break complex particles consisting of numerous minerals into smaller, simpler particles, each consisting primarily of only one mineral. The process in which the mineral composition distribution of particles changes due to breakage is called 'liberation'. The purpose of separation is to separate particles consisting of valuable mineral from those containing non-valuable mineral. The energy required to break particles to fine sizes is expensive, so the mineral processing engineer must design the circuit so that the breakage of liberated particles is reduced in favour of breaking composite particles. To optimize a circuit effectively through simulation, it is necessary to predict how the mineral composition distributions change due to comminution. Such a model is called a 'liberation model for comminution'. It was generally considered that such a model should incorporate information about the ore, such as its texture. However, the relationship between the feed and product particles can be estimated using a probability method, the probability being that a feed particle of a given composition and size will form a product particle of a given size and composition. The model is based on maximizing the entropy of this probability subject to mass and composition constraints. This methodology allows a liberation model to be developed not only for binary particles but also for particles consisting of many minerals. Results from applying the model to real plant ore are presented. A laboratory ball mill was used to break particles, and the results were used to estimate the kernel representing the relationship between parent and progeny particles. A second feed, consisting primarily of heavy particles subsampled from the main ore, was then ground through the same mill, and the results of the first experiment were used to predict the product of the second. The agreement between the predicted and actual results is very good. More extensive validation is nevertheless recommended to fully evaluate the method. (C) 2003 Elsevier Ltd. All rights reserved.
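As a rough illustration of the entropy-maximization step (the notation here is ours; the abstract does not give the authors' formulation), let p_{ij} be the probability that a feed particle in composition class i forms a progeny particle in composition class j. The entropy-maximizing kernel would solve

    \max_p \; H(p) = -\sum_{i,j} p_{ij} \ln p_{ij}
    \text{s.t.} \quad \sum_j p_{ij} = 1 \ \text{for each } i \qquad \text{(mass: every feed particle yields progeny)}
    \qquad\;\; \sum_j g_j p_{ij} = g_i \ \text{for each } i \qquad \text{(the mean mineral grade } g \text{ is conserved)}

whose Lagrangian solution takes the exponential form p_{ij} \propto \exp(-\mu_i g_j), the multipliers \mu_i being fixed by the composition constraints.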
Abstract:
Cox's theorem states that, under certain assumptions, any measure of belief is isomorphic to a probability measure. This theorem, although intended as a justification of the subjectivist interpretation of probability theory, is sometimes presented as an argument for more controversial theses. Of particular interest is the thesis that the only coherent means of representing uncertainty is via the probability calculus. In this paper I examine the logical assumptions of Cox's theorem and I show how these impinge on the philosophical conclusions thought to be supported by the theorem. I show that the more controversial thesis is not supported by Cox's theorem. (C) 2003 Elsevier Inc. All rights reserved.
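For orientation, the setup of Cox's theorem can be sketched in its textbook form (the paper examines the fine print of exactly these assumptions). Writing b(A \mid C) for the degree of belief in A given C, Cox assumes that belief in a conjunction and in a negation are determined by functional equations

    b(A \wedge B \mid C) = F\big(b(B \mid C),\, b(A \mid B \wedge C)\big), \qquad b(\neg A \mid C) = S\big(b(A \mid C)\big),

and shows that, under regularity conditions on F and S (the contested part), there is a monotone rescaling \phi such that \phi \circ b obeys the product and sum rules, i.e. is a probability measure.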
Abstract:
The argument from fine tuning is supposed to establish the existence of God from the fact that the evolution of carbon-based life requires the laws of physics and the boundary conditions of the universe to be more or less as they are. We demonstrate that this argument fails. In particular, we focus on problems associated with the role probabilities play in the argument. We show that, even granting the fine tuning of the universe, it does not follow that the universe is improbable, thus no explanation of the fine tuning, theistic or otherwise, is required.
Abstract:
Statistics is known to be an art as well as a science. The training of mathematical physicists predisposes them towards hypothesising plausible Bayesian priors. Tony Bracken and I were of that mind [1], but in our discussions we also recognised the Bayesian will-o'-the-wisp illustrated below.
Abstract:
We consider a buying-selling problem in which two stops of a sequence of independent random variables are required. An optimal stopping rule and the value of the game are obtained.
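The abstract does not spell out the reward structure, but a common version of the buying-selling problem is: buy at one stopping time, sell at a strictly later one, and maximize the expected selling price minus the buying price. A minimal backward-induction sketch under that assumption (the horizon and the uniform distribution are both illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, 200_000)  # Monte Carlo stand-in for the common law of the X_t
    n = 10                              # horizon: observations X_0, ..., X_{n-1}

    # v[t]: expected payoff of an optimal single (selling) stop over X_t, ..., X_{n-1}
    v = np.zeros(n)
    v[n - 1] = X.mean()                        # forced to sell at the last step
    for t in range(n - 2, -1, -1):
        v[t] = np.maximum(X, v[t + 1]).mean()  # sell iff X_t beats the continuation value

    # b[t]: value of the game if nothing has been bought by time t (room must be left to sell)
    b = np.zeros(n - 1)
    b[n - 2] = v[n - 1] - X.mean()             # forced to buy at the penultimate step
    for t in range(n - 3, -1, -1):
        b[t] = np.maximum(v[t + 1] - X, b[t + 1]).mean()  # buy iff it beats waiting

    print(f"approximate value of the game: {b[0]:.4f}")

The resulting rule is threshold-based: sell the first observation exceeding v[t + 1], and buy the first observation below v[t + 1] - b[t + 1].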
Abstract:
Stochastic models based on Markov birth processes are constructed to describe the invasion of a fly larva by entomopathogenic nematodes. Various forms for the birth (invasion) rates are proposed. These models are then fitted to data sets describing the observed numbers of nematodes that have invaded a fly larva after a fixed period of time. Non-linear birth rates are required to achieve good fits to these data, with their precise form leading to different patterns of invasion being identified for the three populations of nematodes considered. One of these (Nemasys) showed the greatest propensity for invasion. This form of modelling may be useful more generally for analysing data that show variation different from that expected under a binomial distribution.
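A Markov birth process of this kind is straightforward to simulate, since inter-invasion times are exponential with a state-dependent rate. The sketch below uses a hypothetical non-linear rate function; the abstract does not report the fitted forms.

    import numpy as np

    def simulate_invasion(t_max, n_max, rate, rng):
        """One sample path of a pure (Markov) birth process.

        rate(k) is the invasion rate when k nematodes have already invaded;
        returns the number invaded by time t_max."""
        t, k = 0.0, 0
        while k < n_max:
            lam = rate(k)
            if lam <= 0:
                break
            t += rng.exponential(1.0 / lam)  # waiting time to the next invasion
            if t > t_max:
                break
            k += 1
        return k

    rng = np.random.default_rng(1)
    # Illustrative non-linear rate: invasion is encouraged by early invaders, then saturates.
    rate = lambda k: 0.5 * (1 + k) / (1 + 0.1 * k ** 2)
    counts = [simulate_invasion(5.0, 50, rate, rng) for _ in range(2000)]
    print("mean invaded:", np.mean(counts), " variance:", np.var(counts))

Comparing the simulated variance of the counts with a binomial benchmark is one way to see the extra-binomial variation the abstract refers to.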
Abstract:
This note shows that, under appropriate conditions, preferences may be locally approximated by the linear utility or risk-neutral preference functional associated with a local probability transformation.
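One way to unpack the claim (our notation; the note's precise conditions are not stated in the abstract): a first-order expansion of a preference functional V around a distribution F_0, in the spirit of Machina's local utility analysis, gives

    V(F) \approx V(F_0) + \int u(x; F_0)\, d(F - F_0)(x),

and the result would then say that, under the appropriate conditions, the local functional can be taken to be the risk-neutral one under a probability transformation w:

    V(F) \approx \int x \, d(w \circ F)(x),

so that small changes in risk are ranked locally by a transformed expected value.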
Abstract:
In the absence of an external frame of reference, i.e. in background-independent theories such as general relativity, physical degrees of freedom must describe relations between systems. Using a simple model, we investigate how such a relational quantum theory naturally arises by promoting reference systems to the status of dynamical entities. Our goal is twofold. First, we demonstrate, using elementary quantum theory, how any quantum mechanical experiment admits a purely relational description at a fundamental level. Second, we describe how the original non-relational theory approximately emerges from the fully relational theory when reference systems become semi-classical. Our technique is motivated by a Bayesian approach to quantum mechanics, and relies on the noiseless subsystem method of quantum information science, used to protect quantum states against undesired noise. The relational theory naturally predicts a fundamental decoherence mechanism, so an arrow of time emerges from a time-symmetric theory. Moreover, our model circumvents the problem of the collapse of the wave packet, as the probability interpretation is only ever applied to diagonal density operators. Finally, the physical states of the relational theory can be described in terms of the spin networks introduced by Penrose as a combinatorial description of geometry and widely studied in the loop formulation of quantum gravity. Thus, our simple bottom-up approach (starting from the semi-classical limit to derive the fully relational quantum theory) may offer interesting insights into the low-energy limit of quantum gravity.
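The noiseless subsystem method mentioned in the abstract can be illustrated by its simplest textbook instance (not the paper's model): the two-qubit singlet state is invariant under any collective noise of the form U ⊗ U, so information stored in it survives the noise. A minimal numpy check:

    import numpy as np

    rng = np.random.default_rng(2)

    def random_unitary(rng, d=2):
        """Haar-random d x d unitary via QR decomposition."""
        z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
        q, r = np.linalg.qr(z)
        return q * (np.diag(r) / np.abs(np.diag(r)))  # fix the QR phase ambiguity

    # Two-qubit singlet (|01> - |10>)/sqrt(2): a one-dimensional noiseless
    # (decoherence-free) subspace for collective noise acting as U on both qubits.
    psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
    rho = np.outer(psi, psi.conj())

    U = random_unitary(rng)
    UU = np.kron(U, U)                 # the same unknown "noise" hits both qubits
    rho_out = UU @ rho @ UU.conj().T

    print("state unchanged:", np.allclose(rho, rho_out))  # True

In the relational reading, this invariance is what lets the singlet encode a relation between the two spins (perfect anti-alignment) without reference to any external axis.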