7 results for Stochastic simulators

in Helda - Digital Repository of the University of Helsinki


Relevance:

30.00%

Publisher:

Abstract:

A forest simulator is a computerized model for predicting forest growth and future development, as well as the effects of forest harvests and treatments. A forest planning system is a decision support tool, usually combining a forest simulator with an optimisation model, for finding optimal forest management actions. The information produced by forest simulators and forest planning systems is used for various analytical purposes and in support of decision making. However, the quality and reliability of this information can often be questioned. Natural variation in forest growth and estimation errors in forest inventory, among other things, cause uncertainty in predictions of forest growth and development. This uncertainty, stemming from different sources, has various undesirable effects: in many cases the outcomes of decisions based on uncertain information are other than what was desired. The objective of this thesis was to study various sources of uncertainty and their effects in forest simulators and forest planning systems. The study focused on three notable sources of uncertainty: errors in forest growth predictions, errors in forest inventory data, and stochastic fluctuation of timber assortment prices. The effects of uncertainty were studied using two types of forest growth models, individual tree-level models and stand-level models, and with various error simulation methods. A new method for simulating more realistic forest inventory errors was introduced and tested. In addition, the three sources of uncertainty were combined and their joint effects on stand-level net present value estimates were simulated. According to the results, the various sources of uncertainty can have distinct effects in different forest growth simulators. The new forest inventory error simulation method proved to produce more realistic errors, and the analysis of the joint effects of the various sources of uncertainty provided useful knowledge about uncertainty in forest simulators.
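The joint-effects analysis described above can be illustrated with a Monte Carlo sketch: draw each error source independently, propagate them to a net present value, and inspect the resulting distribution. All numeric parameters below (volumes, error magnitudes, price, discount rate) are invented illustration values, not the thesis's models or data.

```python
import random
import statistics

def simulate_npv(n_runs=1000, seed=42):
    """Monte Carlo sketch of the joint effect of three uncertainty
    sources (inventory error, growth-prediction error, timber-price
    fluctuation) on a stand-level net present value estimate.
    All parameters are hypothetical illustration values."""
    rng = random.Random(seed)
    true_volume = 250.0      # m3/ha, assumed true standing volume
    price_mean = 55.0        # EUR/m3, assumed mean timber price
    discount = 1.03 ** 10    # 3% rate over a 10-year horizon
    npvs = []
    for _ in range(n_runs):
        # Inventory error: measured volume deviates from the truth
        inv_volume = true_volume * (1 + rng.gauss(0, 0.15))
        # Growth-prediction error: simulated 10-year growth is uncertain
        growth = 60.0 * (1 + rng.gauss(0, 0.20))
        # Stochastic price fluctuation around the mean
        price = price_mean * (1 + rng.gauss(0, 0.10))
        npvs.append((inv_volume + growth) * price / discount)
    return npvs

npvs = simulate_npv()
print(statistics.mean(npvs), statistics.stdev(npvs))
```

The spread of the resulting NPV sample is what a decision maker would want to know before trusting a single point estimate.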

Relevance:

20.00%

Publisher:

Abstract:

Stochastic filtering is, in general, the estimation of indirectly observed states given observed data: one studies conditional expected values, the most accurate estimates given the observations, in the context of a probability space. In my thesis, I present the theory of filtering for two different kinds of observation processes: the first is a diffusion process, discussed in the first chapter, while the third chapter introduces the second, a counting process. Most of the fundamental results of stochastic filtering are stated in the form of equations, such as the unnormalized Zakai equation, which leads to the Kushner-Stratonovich equation. The latter, also known as the normalized Zakai equation or the Fujisaki-Kallianpur-Kunita (FKK) equation, shows the divergence between estimation with a diffusion observation process and with a counting process. I also present an example for the linear Gaussian case, which underlies the so-called Kalman-Bucy filter. As the unnormalized and normalized Zakai equations are stated in terms of the conditional distribution, a density for these distributions is developed from them and stated in Kushner's theorem. However, Kushner's theorem takes the form of a stochastic partial differential equation whose solution must be verified for existence and uniqueness, which is covered in the second chapter.
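For reference, the two named equations can be written down in their standard textbook form for the diffusion-observation case; the notation below is the usual one and not necessarily the thesis's own. Take a signal with generator $L$ and an observation $dY_t = h(X_t)\,dt + dB_t$.

```latex
% Unnormalized (Zakai) equation for the unnormalized filter \rho_t:
d\rho_t(\varphi) = \rho_t(L\varphi)\,dt + \rho_t(h\varphi)\,dY_t .

% Normalized (Kushner-Stratonovich / FKK) equation for
% \pi_t = \rho_t / \rho_t(1):
d\pi_t(\varphi) = \pi_t(L\varphi)\,dt
  + \bigl(\pi_t(h\varphi) - \pi_t(h)\,\pi_t(\varphi)\bigr)
    \bigl(dY_t - \pi_t(h)\,dt\bigr).
```

The Zakai equation is linear in $\rho_t$, which is what makes it analytically convenient, while normalization introduces the quadratic correction term visible in the second equation.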

Relevance:

20.00%

Publisher:

Abstract:

Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to apply the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion, but due to computational difficulties this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms rely on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational-number solutions.
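For the multinomial case, the computational bottleneck is the NML normalizing sum (the parametric complexity). One known linear-time approach, attributed to Kontkanen and Myllymäki, uses the recurrence C(n, K+2) = C(n, K+1) + (n/K)·C(n, K) with base cases C(n, 1) = 1 and a binomial sum for C(n, 2). The sketch below assumes that recurrence and uses floating point; whether it matches the thesis's exact algorithms is an assumption.

```python
from math import comb

def multinomial_nml_complexity(n, K):
    """Normalizing sum C(n, K) of the K-outcome multinomial NML
    distribution for sample size n, via the linear-time recurrence
    C(n, K+2) = C(n, K+1) + (n / K) * C(n, K).
    Floating point; exact rational arithmetic is also possible."""
    if K < 1 or n < 1:
        raise ValueError("n and K must be positive")
    # Base case K = 1: a single outcome, likelihood 1 for every sequence.
    c1 = 1.0
    if K == 1:
        return c1
    # Base case K = 2: sum of maximized binomial likelihoods
    # (Python evaluates 0.0 ** 0 as 1.0, which is the convention here).
    c2 = sum(comb(n, k) * (k / n) ** k * ((n - k) / n) ** (n - k)
             for k in range(n + 1))
    for j in range(1, K - 1):          # lift K by one per step
        c1, c2 = c2, c2 + n / j * c1   # C(n, j+2) = C(n, j+1) + (n/j) C(n, j)
    return c2
```

A quick sanity check: for n = 1 every one of the K single-symbol sequences has maximized likelihood 1, so C(1, K) = K.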

Relevance:

20.00%

Publisher:

Abstract:

The objective of this paper is to investigate pricing accuracy under stochastic volatility, where the volatility follows a square-root process. The theoretical prices are compared with market price data (the German DAX index options market) using two different parameter estimation techniques: the method of moments and implicit estimation by inversion. Standard Black & Scholes pricing is used as a benchmark. The results indicate that the stochastic volatility model with parameters estimated by inversion, using the available prices on the preceding day, is the most accurate pricing method of the three in this study and can be considered satisfactory. However, as the same model with parameters estimated using a rolling window (the method of moments) proved to be inferior to the benchmark, the importance of stable and correct parameter estimation is evident.
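The Black & Scholes benchmark used in such comparisons is the standard closed-form European call formula, sketched below. The input values in the final line are arbitrary illustration numbers, not DAX data; the square-root stochastic-volatility model itself has no comparably simple closed form and is not implemented here.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black & Scholes price of a European call with spot S, strike K,
    risk-free rate r, constant volatility sigma, maturity T in years."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

print(bs_call(100, 100, 0.05, 0.2, 1.0))  # illustrative inputs
```

Because the benchmark assumes constant volatility, any systematic pricing error it exhibits against market prices is exactly what the stochastic-volatility model is meant to reduce.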