941 results for Stochastic explorations
Abstract:
A deterministic mathematical model for steady-state unidirectional solidification is proposed to predict the columnar-to-equiaxed transition. In the model, which is an extension of the classic model proposed by Hunt [Hunt JD. Mater Sci Eng 1984;65:75], equiaxed grains nucleate according to either a normal or a log-normal distribution of nucleation undercoolings. Growth maps are constructed, indicating either columnar or equiaxed solidification as a function of the velocity of isotherms and the temperature gradient. The fields of columnar and equiaxed growth change significantly with the spread of the nucleation undercooling distribution. Increasing the spread favors columnar solidification if the dimensionless velocity of the isotherms is larger than 1. For a velocity less than 1, however, equiaxed solidification is initially favored, but columnar solidification is enhanced for a larger increase in the spread. This behavior was confirmed by a stochastic model, which showed that an increase in the distribution spread could change the grain structure from completely columnar to 50% columnar grains. (c) 2008 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Abstract:
In this paper, we deal with a generalized multi-period mean-variance portfolio selection problem with market parameters subject to Markov random regime switchings. Problems of this kind have recently been considered in the literature for control over bankruptcy, for cases in which there are no jumps in market parameters (see [Zhu, S. S., Li, D., & Wang, S. Y. (2004). Risk control over bankruptcy in dynamic portfolio selection: A generalized mean-variance formulation. IEEE Transactions on Automatic Control, 49, 447-457]). We present necessary and sufficient conditions for obtaining an optimal control policy for this Markovian generalized multi-period mean-variance problem, based on a set of interconnected Riccati difference equations and on a set of other recursive equations. Some closed formulas are also derived for two special cases, extending some previous results in the literature. We apply the results to a numerical example with real data for risk control over bankruptcy in a dynamic portfolio selection problem with Markov jumps. (C) 2008 Elsevier Ltd. All rights reserved.
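The interconnected structure of such Riccati difference equations can be sketched as a backward recursion. The following is a minimal, hypothetical scalar illustration (the gains, costs, horizon, and transition matrix are made-up values, and the paper's mean-variance equations carry additional terms not shown here): each regime's update uses the transition-weighted average of the next-period solutions over all regimes, which is what makes the equations interconnected.

```python
import numpy as np

# Illustrative coupled Riccati recursion for a scalar Markov jump
# linear-quadratic problem (all numbers are assumptions, not from the paper).
A = np.array([1.05, 0.95])     # system gain in each market regime
B = np.array([1.0, 1.0])       # control gain per regime
Q = np.array([1.0, 1.0])       # state cost per regime
R = np.array([0.5, 0.5])       # control cost per regime
P = np.array([[0.9, 0.1],      # regime transition matrix
              [0.2, 0.8]])
T = 30                          # horizon length

X = Q.copy()                    # terminal condition X_i(T) = Q_i
for _ in range(T):
    EX = P @ X                  # interconnection term: sum_j p_ij X_j(k+1)
    X = Q + A**2 * EX - (A * EX * B)**2 / (R + B**2 * EX)

print(X)                        # per-regime solutions after the backward pass
```

The key line is `EX = P @ X`: without the regime transitions this would collapse to two independent scalar Riccati recursions.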
Abstract:
We give reasons why demographic parameters such as survival and reproduction rates are often modelled well in stochastic population simulation using beta distributions. In practice, it is frequently expected that these parameters will be correlated, for example with survival rates for all age classes tending to be high or low in the same year. We therefore discuss a method for producing correlated beta random variables by transforming correlated normal random variables, and show how it can be applied in practice by means of a simple example. We also note how the same approach can be used to produce correlated uniform, triangular, and exponential random variables. (C) 2008 Elsevier B.V. All rights reserved.
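The transformation described above can be sketched in a few lines. In this hypothetical example (the beta parameters, correlation value, and variable names are illustrative assumptions, not taken from the paper), correlated standard normals are pushed through the normal CDF to obtain correlated uniforms, which are then mapped through beta inverse CDFs:

```python
import numpy as np
from scipy.stats import norm, beta

rng = np.random.default_rng(42)
target_corr = 0.8                      # correlation of the underlying normals
cov = np.array([[1.0, target_corr],
                [target_corr, 1.0]])

z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)
u = norm.cdf(z)                        # correlated uniforms on (0, 1)

# Map each uniform through a beta inverse CDF with its own (a, b) parameters,
# giving correlated survival rates for two hypothetical age classes.
survival_juvenile = beta.ppf(u[:, 0], a=8, b=2)    # mean about 0.8
survival_adult    = beta.ppf(u[:, 1], a=18, b=2)   # mean about 0.9

print(np.corrcoef(survival_juvenile, survival_adult)[0, 1])
```

The achieved beta-scale correlation is close to, but slightly below, the correlation of the underlying normals; matching a target correlation exactly requires inverting that relationship numerically.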
Abstract:
The detection of seizure in the newborn is a critical aspect of neurological research. Current automatic detection techniques are difficult to assess due to the problems associated with acquiring and labelling newborn electroencephalogram (EEG) data. A realistic model for newborn EEG would allow confident development, assessment and comparison of these detection techniques. This paper presents a model for newborn EEG that accounts for its self-similar and non-stationary nature. The model consists of background and seizure sub-models. The newborn EEG background model is based on the short-time power spectrum with a time-varying power law. The relationship between the fractal dimension and the power law of a power spectrum is utilized for accurate estimation of the short-time power law exponent. The newborn EEG seizure model is based on a well-known time-frequency signal model. This model addresses all significant time-frequency characteristics of newborn EEG seizure, which include: multiple components or harmonics, piecewise linear instantaneous frequency laws, and harmonic amplitude modulation. Estimates of the parameters of both models are shown to be random and are modelled using the data from a total of 500 background epochs and 204 seizure epochs. The newborn EEG background and seizure models are validated against real newborn EEG data using the correlation coefficient. The results show that the output of the proposed models has a higher correlation with real newborn EEG than currently accepted models (a 10% and 38% improvement for background and seizure models, respectively).
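The power-law spectrum idea behind the background sub-model can be illustrated by synthesising a signal with a prescribed spectral exponent. This is a simplified sketch, not the paper's model: the exponent here is fixed rather than time-varying, and the value gamma = 2 is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(0)
n, gamma = 4096, 2.0                  # number of samples, power-law exponent

# Shape white Gaussian noise in the frequency domain so that the power
# spectrum follows 1/f**gamma (amplitude scales as f**(-gamma/2)).
white = rng.standard_normal(n)
spectrum = np.fft.rfft(white)
freqs = np.fft.rfftfreq(n, d=1.0)     # unit sample rate
freqs[0] = freqs[1]                   # placeholder to avoid 0 ** negative
spectrum *= freqs ** (-gamma / 2.0)
spectrum[0] = 0.0                     # remove the DC component
signal = np.fft.irfft(spectrum, n)

# Check: the slope of the log-log spectrum should be roughly -gamma.
psd = np.abs(np.fft.rfft(signal)) ** 2
f = np.fft.rfftfreq(n, d=1.0)[1:]
slope = np.polyfit(np.log(f), np.log(psd[1:]), 1)[0]
print(round(slope, 2))                # roughly -2
```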
Abstract:
P-representation techniques, which have been very successful in quantum optics and in other fields, are also useful for general bosonic quantum-dynamical many-body calculations such as Bose-Einstein condensation. We introduce a representation called the gauge P representation, which greatly widens the range of tractable problems. Our treatment results in an infinite set of possible time evolution equations, depending on arbitrary gauge functions that can be optimized for a given quantum system. In some cases, previous methods can give erroneous results, due to the usual assumption of vanishing boundary conditions being invalid for those particular systems. Solutions are given to this boundary-term problem for all the cases where it is known to occur: two-photon absorption and the single-mode laser. We also provide some brief guidelines on how to apply the stochastic gauge method to other systems in general, quantify the freedom of choice in the resulting equations, and make a comparison to related recent developments.
Abstract:
Clifford Geertz was best known for his pioneering excursions into symbolic or interpretive anthropology, especially in relation to Indonesia. Less well recognised are his stimulating explorations of the modern economic history of Indonesia. His thinking on the interplay of economics and culture was most fully and vigorously expounded in Agricultural Involution. That book deployed a succinctly packaged past in order to solve a pressing contemporary puzzle, Java's enduring rural poverty and apparent social immobility. Initially greeted with acclaim, later and ironically the book stimulated the deep and multi-layered research that in fact led to the eventual rejection of Geertz's central contentions. But the veracity or otherwise of Geertz's inventive characterisation of Indonesian economic development now seems irrelevant; what is profoundly important is the extraordinary stimulus he gave to a generation of scholars to explore Indonesia's modern economic history with a depth and intensity previously unimaginable.
Abstract:
Nearest-neighbour balance is considered a desirable property for an experiment to possess in situations where experimental units are influenced by their neighbours. This paper introduces a measure of the degree of nearest-neighbour balance of a design. The measure is used in an algorithm which generates nearest-neighbour balanced designs and is readily modified to obtain designs with various types of nearest-neighbour balance. Nearest-neighbour balanced designs are produced for a wide class of parameter settings, and in particular for those settings for which such designs cannot be found by existing direct combinatorial methods. In addition, designs with unequal row and column sizes, and designs with border plots, are constructed using the approach presented here.
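The abstract does not give the paper's measure, but the flavour of a nearest-neighbour balance score can be sketched with a hypothetical one: count the ordered pairs of adjacent treatments in a one-dimensional layout and score the spread of those counts, with zero meaning every ordered pair of distinct treatments appears equally often.

```python
from collections import Counter
from itertools import permutations

def nn_imbalance(layout):
    """Hypothetical imbalance score (illustrative, not the paper's measure):
    variance of the ordered nearest-neighbour pair counts; 0 means every
    ordered pair of distinct treatments occurs equally often."""
    pairs = Counter((a, b) for a, b in zip(layout, layout[1:]) if a != b)
    treatments = sorted(set(layout))
    counts = [pairs.get(p, 0) for p in permutations(treatments, 2)]
    mean = sum(counts) / len(counts)
    return sum((c - mean) ** 2 for c in counts) / len(counts)

balanced = ['A', 'B', 'C', 'A', 'C', 'B', 'A']  # each ordered pair once
print(nn_imbalance(balanced))                    # → 0.0
```

An algorithm of the kind described could then search over layouts, accepting swaps that reduce this score.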
Abstract:
The generalized Gibbs sampler (GGS) is a recently developed Markov chain Monte Carlo (MCMC) technique that enables Gibbs-like sampling of state spaces that lack a convenient representation in terms of a fixed coordinate system. This paper describes a new sampler, called the tree sampler, which uses the GGS to sample from a state space consisting of phylogenetic trees. The tree sampler is useful for a wide range of phylogenetic applications, including Bayesian, maximum likelihood, and maximum parsimony methods. A fast new algorithm to search for a maximum parsimony phylogeny is presented, using the tree sampler in the context of simulated annealing. The mathematics underlying the algorithm is explained and its time complexity is analyzed. The method is tested on two large data sets consisting of 123 sequences and 500 sequences, respectively. The new algorithm is shown to compare very favorably in terms of speed and accuracy to the program DNAPARS from the PHYLIP package.
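The simulated-annealing search that the tree sampler is embedded in follows the standard accept/reject skeleton. The sketch below shows that skeleton on a toy one-dimensional state space rather than on phylogenetic trees; the cooling schedule, objective, and proposal are arbitrary illustrative choices, not the paper's.

```python
import math
import random

def simulated_annealing(cost, neighbour, state, t0=1.0, alpha=0.995,
                        steps=5000, seed=1):
    """Generic simulated-annealing skeleton: always accept improving moves,
    accept worsening moves with probability exp(-delta / T), cool T."""
    rng = random.Random(seed)
    best, best_cost = state, cost(state)
    cur, cur_cost, t = state, best_cost, t0
    for _ in range(steps):
        cand = neighbour(cur, rng)
        delta = cost(cand) - cur_cost
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            cur, cur_cost = cand, cur_cost + delta
            if cur_cost < best_cost:
                best, best_cost = cur, cur_cost
        t *= alpha                    # geometric cooling schedule
    return best, best_cost

# Toy problem: minimise a bumpy one-dimensional function.
f = lambda x: (x - 3) ** 2 + math.sin(5 * x)
step = lambda x, rng: x + rng.gauss(0, 0.5)
x, fx = simulated_annealing(f, step, state=0.0)
print(x, fx)                          # near the global minimum
```

In the paper's setting, `neighbour` would be a GGS move on tree space and `cost` the parsimony score.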
Abstract:
In this paper, we propose a fast adaptive importance sampling method for the efficient simulation of buffer overflow probabilities in queueing networks. The method comprises three stages. First, we estimate the minimum cross-entropy tilting parameter for a small buffer level; next, we use this as a starting value for the estimation of the optimal tilting parameter for the actual (large) buffer level. Finally, the tilting parameter just found is used to estimate the overflow probability of interest. We study various properties of the method in more detail for the M/M/1 queue and conjecture that similar properties also hold for quite general queueing networks. Numerical results support this conjecture and demonstrate the high efficiency of the proposed algorithm.
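For the M/M/1 case, the final stage of such a scheme can be sketched directly, because the asymptotically optimal exponential tilt is known in closed form: it swaps the arrival and service rates. The code below illustrates that fixed-tilt estimate only (the adaptive cross-entropy stages for finding the tilt are omitted, and all parameter values are made up); it checks the estimate against the exact gambler's-ruin probability.

```python
import math
import random

def overflow_prob_is(lam, mu, B, n_cycles=20_000, seed=7):
    """Importance-sampling estimate of P(embedded M/M/1 queue reaches level B
    before emptying, starting from level 1), using the swapped-rates tilt."""
    rng = random.Random(seed)
    p = lam / (lam + mu)          # up-step probability, original measure
    p_tilt = mu / (lam + mu)      # tilted measure: lam and mu swapped
    total = 0.0
    for _ in range(n_cycles):
        level, log_lr = 1, 0.0
        while 0 < level < B:
            if rng.random() < p_tilt:                   # arrival (tilted)
                level += 1
                log_lr += math.log(p / p_tilt)
            else:                                       # departure (tilted)
                level -= 1
                log_lr += math.log((1 - p) / (1 - p_tilt))
        if level == B:            # likelihood ratio corrects the measure
            total += math.exp(log_lr)
    return total / n_cycles

est = overflow_prob_is(lam=1.0, mu=2.0, B=20)
rho = 0.5
exact = (1 - 1 / rho) / (1 - (1 / rho) ** 20)   # gambler's-ruin formula
print(est, exact)                                # the two agree closely
```

Naive simulation would need on the order of a million cycles per observed overflow here; under the tilt roughly half the cycles reach the buffer level.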
Abstract:
We consider a branching model, which we call the collision branching process (CBP), that accounts for the effect of collisions, or interactions, between particles or individuals. We establish that there is a unique CBP, and derive necessary and sufficient conditions for it to be nonexplosive. We review results on extinction probabilities, and obtain explicit expressions for the probability of explosion and the expected hitting times. The upwardly skip-free case is studied in some detail.
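For intuition about extinction probabilities (the CBP itself has collision-dependent rates and is not computed here), the classical Galton-Watson calculation finds the extinction probability as the smallest fixed point of the offspring probability generating function; the offspring distribution below is an arbitrary example.

```python
p = [0.2, 0.3, 0.5]                  # P(0, 1, 2 offspring); mean 1.3 > 1

def pgf(s):
    """Offspring probability generating function f(s) = sum_k p_k s^k."""
    return sum(pk * s ** k for k, pk in enumerate(p))

s = 0.0                              # iterate s <- f(s) starting from 0
for _ in range(200):
    s = pgf(s)

print(round(s, 3))                   # → 0.4, the smallest root of f(s) = s
```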
Abstract:
This chapter is concerned with acquisition and analysis of test data for determining whether or not the flexural strength of granite cladding under extreme conditions is adequate to assure that reliability requirements are satisfied.
Abstract:
Many images consist of two or more 'phases', where a phase is a collection of homogeneous zones. For example, the phases may represent the presence of different sulphides in an ore sample. Frequently, these phases exhibit very little structure, though all connected components of a given phase may be similar in some sense. As a consequence, random set models are commonly used to model such images. The Boolean model and models derived from the Boolean model are often chosen. An alternative approach to modelling such images is to use the excursion sets of random fields to model each phase. In this paper, the properties of excursion sets will be firstly discussed in terms of modelling binary images. Ways of extending these models to multi-phase images will then be explored. A desirable feature of any model is to be able to fit it to data reasonably well. Different methods for fitting random set models based on excursion sets will be presented and some of the difficulties with these methods will be discussed.
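The excursion-set construction is easy to sketch: simulate a stationary Gaussian random field, then keep the thresholded set {field > u} as the binary phase image. The smoothing scale and threshold below are arbitrary illustrative choices, not fitted values from any data.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Simulate a stationary Gaussian random field by smoothing white noise.
rng = np.random.default_rng(3)
field = gaussian_filter(rng.standard_normal((256, 256)), sigma=6)
field /= field.std()                  # renormalise to unit variance

u = 1.0                               # threshold level
phase = field > u                     # excursion set: the binary phase image

# The expected area fraction of the excursion set is P(Z > u) for a
# standard normal Z, about 0.159 at u = 1.
print(phase.mean())
```

Model fitting would then compare observed summaries of a binary image (area fraction, connectivity) against those of the excursion set as functions of the smoothing scale and threshold.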
Abstract:
The second edition of An Introduction to Efficiency and Productivity Analysis is designed to be a general introduction for those who wish to study efficiency and productivity analysis. The book provides an accessible, well-written introduction to the four principal methods involved: econometric estimation of average response models; index numbers; data envelopment analysis (DEA); and stochastic frontier analysis (SFA). For each method, a detailed introduction to the basic concepts is presented, numerical examples are provided, and some of the more important extensions to the basic methods are discussed. Of special interest is the systematic use of detailed empirical applications using real-world data throughout the book. In recent years, there have been a number of excellent advanced-level books published on performance measurement. This book, however, is the first systematic survey of performance measurement with the express purpose of introducing the field to a wide audience of students, researchers, and practitioners. Indeed, the 2nd Edition maintains its uniqueness: (1) It is a well-written introduction to the field. (2) It outlines, discusses and compares the four principal methods for efficiency and productivity analysis in a well-motivated presentation. (3) It provides detailed advice on computer programs that can be used to implement these performance measurement methods. The book contains computer instructions and output listings for the SHAZAM, LIMDEP, TFPIP, DEAP and FRONTIER computer programs. More extensive listings of data and computer instruction files are available on the book's website: (www.uq.edu.au/economics/cepa/crob2005).
Abstract:
Almost all clinical magnetic resonance imaging systems are based on circular cross-section magnets. Recent advances in elliptical cross-section RF probe and gradient coil hardware raise the question of the possibility of using elliptical cross-section magnet systems. This paper presents a methodology for rapidly calculating the magnetic fields generated by a multi-turn coil of elliptical cross-section and incorporates this in a stochastic optimization method for magnet design. An open magnet system of elliptical cross-section is designed that both reduces claustrophobia for patients and allows ready access by attending physicians. The magnet system is optimized for paediatric use. The coil geometry produced by the optimization method has several novel features.