9 results for Computational Mathematics
at University of Queensland eSpace - Australia
Abstract:
Cox's theorem states that, under certain assumptions, any measure of belief is isomorphic to a probability measure. This theorem, although intended as a justification of the subjectivist interpretation of probability theory, is sometimes presented as an argument for more controversial theses. Of particular interest is the thesis that the only coherent means of representing uncertainty is via the probability calculus. In this paper I examine the logical assumptions of Cox's theorem and I show how these impinge on the philosophical conclusions thought to be supported by the theorem. I show that the more controversial thesis is not supported by Cox's theorem. (C) 2003 Elsevier Inc. All rights reserved.
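For orientation, the core assumption usually discussed in connection with Cox's theorem (a textbook rendering, which may differ in detail from the axiom set examined in the paper) is that belief is a real-valued function pl(· | ·) on propositions and that the plausibility of a conjunction is determined by the plausibilities of its parts:

    \mathrm{pl}(A \wedge B \mid C) = F\big(\mathrm{pl}(A \mid B \wedge C),\ \mathrm{pl}(B \mid C)\big)

for some fixed function F. Associativity of conjunction then forces F(x, F(y, z)) = F(F(x, y), z), and under further regularity assumptions (together with a companion assumption relating the plausibility of a proposition to that of its negation) there is a monotone rescaling p = g(pl) with p(A \wedge B \mid C) = p(A \mid B \wedge C)\, p(B \mid C), which is the sense in which the belief measure is isomorphic to a probability measure.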
Abstract:
The concept of a monotone family of functions, which need not be countable, and the solution of an equilibrium problem associated with the family are introduced. A fixed-point theorem is applied to prove the existence of solutions to the problem.
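For background, the standard scalar equilibrium problem (stated here for orientation only; the paper's formulation is indexed by a possibly uncountable monotone family rather than a single bifunction) asks, for a set K and a bifunction f : K \times K \to \mathbb{R},

    \text{find } \bar{x} \in K \text{ such that } f(\bar{x}, y) \ge 0 \quad \text{for all } y \in K.

Monotonicity of f in the usual sense, f(x, y) + f(y, x) \le 0, combined with a fixed-point or KKM-type argument, is the classical route to proving that a solution exists.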
Abstract:
Multiplication and comultiplication of beliefs represent a generalisation of multiplication and comultiplication of probabilities as well as of binary logic AND and OR. Our approach follows that of subjective logic, where belief functions are expressed as opinions that are interpreted as being equivalent to beta probability distributions. We compare different types of opinion product and coproduct, and show that they represent very good approximations of the analytical product and coproduct of beta probability distributions. We also define division and codivision of opinions, and compare our framework with other logic frameworks for combining uncertain propositions. (C) 2004 Elsevier Inc. All rights reserved.
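To illustrate the kind of approximation involved (a minimal sketch only; the mapping from opinions to Beta parameters and the specific product and coproduct operators defined in the paper are not reproduced here), one can moment-match the product of two independent Beta-distributed variables to a single Beta distribution and compare it against Monte Carlo sampling:

    import numpy as np

    rng = np.random.default_rng(0)

    def beta_moments(a, b):
        """Mean and variance of Beta(a, b)."""
        m = a / (a + b)
        v = a * b / ((a + b) ** 2 * (a + b + 1))
        return m, v

    def moment_matched_product(ax, bx, ay, by):
        """Fit Beta(a, b) to the product X*Y of independent Beta variables
        by matching the first two moments (an approximation, not exact)."""
        mx, vx = beta_moments(ax, bx)
        my, vy = beta_moments(ay, by)
        m = mx * my                               # E[XY] = E[X] E[Y]
        v = (vx + mx**2) * (vy + my**2) - m**2    # Var(XY) for independent X, Y
        k = m * (1 - m) / v - 1                   # invert the Beta moment formulas
        return m * k, (1 - m) * k

    ax, bx, ay, by = 8.0, 2.0, 5.0, 5.0           # hypothetical Beta parameters
    a_p, b_p = moment_matched_product(ax, bx, ay, by)
    samples = rng.beta(ax, bx, 100_000) * rng.beta(ay, by, 100_000)
    print("moment-matched:", beta_moments(a_p, b_p))
    print("Monte Carlo   :", samples.mean(), samples.var())

Comparing the moment-matched parameters with the Monte Carlo estimate shows how closely a single Beta distribution can track the analytical product, which is the sense in which opinion-based operators can serve as good approximations.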
Abstract:
First-year undergraduate engineering students' understanding of the units of factors and terms in first-order ordinary differential equations used in modelling contexts was investigated using diagnostic quiz questions. Few students appeared to realize that the units of each term in such equations must be the same, or if they did, nevertheless failed to apply that knowledge when needed. In addition, few students were able to determine the units of a proportionality factor in a simple equation. These results indicate that lecturers of modelling courses cannot take this foundational knowledge for granted and should explicitly include it in instruction.
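A simple worked example of the kind of reasoning probed here (an illustration chosen for this summary, not one of the actual quiz questions): in Newton's law of cooling,

    \frac{dT}{dt} = -k\,(T - T_{\mathrm{env}}),

the left-hand side has units of K s^{-1}, so the term k(T - T_{\mathrm{env}}) must carry the same units; since (T - T_{\mathrm{env}}) is measured in K, the proportionality factor k must have units of s^{-1}.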
Abstract:
Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture this fundamentally discrete and stochastic nature of cellular biology - most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, and of course the corresponding probabilities and numbers of each molecular species must all remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to very small time stepping, so faster approximations such as the Poisson and binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of Euler-Maruyama, Milstein and even higher-order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately.
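A minimal sketch of the two baseline simulators being compared (the exact SSA and the Poisson τ-leap) for a single decay reaction S -> 0; the reaction, rate constant and step size are illustrative choices, and the higher-order Itô-Taylor and Runge-Kutta style leap methods developed in the paper are not reproduced here:

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative system: decay reaction S -> 0 with rate constant c,
    # propensity a(x) = c * x.  (Not one of the paper's test problems.)
    c, x0, T = 0.1, 1000, 30.0

    def gillespie(x0, c, T, rng):
        """Exact SSA: exponential waiting time to the next single reaction."""
        t, x = 0.0, x0
        while x > 0:
            t += rng.exponential(1.0 / (c * x))
            if t > T:
                break
            x -= 1
        return x

    def poisson_tau_leap(x0, c, T, tau, rng):
        """Poisson tau-leap: fire Poisson(a(x) * tau) reactions per fixed step."""
        t, x = 0.0, x0
        while t < T:
            x = max(x - rng.poisson(c * x * tau), 0)  # keep the count non-negative
            t += tau
        return x

    exact = [gillespie(x0, c, T, rng) for _ in range(2000)]
    leap = [poisson_tau_leap(x0, c, T, 0.1, rng) for _ in range(2000)]
    print("exact SSA mean/var:", np.mean(exact), np.var(exact))
    print("tau-leap  mean/var:", np.mean(leap), np.var(leap))

With these parameters each leap path uses 300 fixed steps, whereas the exact SSA simulates every individual reaction event, which is the efficiency trade-off that motivates the leap methods and their higher-order refinements.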
Abstract:
We are developing a telemedicine application which offers automated diagnosis of facial (Bell's) palsy through a Web service. We used a test data set of 43 images of facial palsy patients and 44 normal people to develop the automatic recognition algorithm. Three different image pre-processing methods were used. A machine learning technique (a support vector machine, SVM) was used to examine the difference between the two halves of the face. If there was a sufficient difference, then the SVM recognized facial palsy; otherwise, if the halves were roughly symmetrical, the SVM classified the image as normal. It was found that the facial palsy images had a greater Hamming distance than the normal images, indicating greater asymmetry. The median distance in the normal group was 331 (interquartile range 277-435) and the median distance in the facial palsy group was 509 (interquartile range 334-703). This difference was significant (P
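A hypothetical sketch of the asymmetry-based classification idea described above (binarise a face image, mirror one half onto the other, count disagreeing pixels as a Hamming distance, and train an SVM on that feature); the image data, threshold and pre-processing below are placeholders and do not reproduce the authors' pipeline:

    import numpy as np
    from sklearn.svm import SVC

    def hamming_asymmetry(face, threshold=128):
        """Hamming distance between the binarised left half of a face image
        (2-D uint8 array) and the mirrored right half."""
        w = face.shape[1]
        left = face[:, : w // 2] > threshold
        right = np.fliplr(face[:, -(w // 2):]) > threshold
        return int(np.sum(left != right))

    # Placeholder arrays standing in for pre-processed face images.
    rng = np.random.default_rng(0)
    normal = rng.integers(0, 256, size=(44, 64, 64), dtype=np.uint8)
    palsy = rng.integers(0, 256, size=(43, 64, 64), dtype=np.uint8)

    X = np.array([[hamming_asymmetry(f)] for f in np.concatenate([normal, palsy])])
    y = np.array([0] * 44 + [1] * 43)      # 0 = normal, 1 = facial palsy

    clf = SVC(kernel="linear").fit(X, y)   # learn a threshold on the asymmetry feature
    print(clf.predict(X[:5]))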