951 results for stochastic analysis
Abstract:
In this thesis, various schemes using custom power devices for power quality improvement in low-voltage distribution networks are studied. Customer-operated distributed generators make a typical network non-radial and affect power quality. A scheme considering different DSTATCOM algorithms is proposed for power circulation and islanded operation of the system. To compensate for reactive power overflow and facilitate unity power factor, a UPQC is introduced. Stochastic analysis is carried out for different scenarios to build a comprehensive picture of a real-life distribution network. The combined operation of a static compensator and a voltage regulator is tested for optimum quality and stability of the system.
Abstract:
In this paper we present a new method for performing Bayesian parameter inference and model choice for low-count time series models with intractable likelihoods. The method incorporates an alive particle filter within a sequential Monte Carlo (SMC) algorithm to create a novel pseudo-marginal algorithm, which we refer to as alive SMC^2. The advantages of this approach over competing approaches are that it is naturally adaptive, it does not involve the between-model proposals required in reversible jump Markov chain Monte Carlo, and it does not rely on potentially rough approximations. The algorithm is demonstrated on Markov process and integer autoregressive moving average models applied to real biological datasets of hospital-acquired pathogen incidence, animal health time series, and the cumulative number of prion disease cases in mule deer.
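The alive particle filter at the heart of alive SMC^2 keeps simulating particles until enough of them exactly match the observed count, which yields an unbiased likelihood estimate for low-count data. Below is a minimal sketch of that estimator for a Markov chain observed without error; `simulate_step` is a hypothetical user-supplied transition sampler, and the "run until N+1 alive, estimate N/(trials-1)" bookkeeping follows the standard unbiasedness construction rather than the paper's exact implementation.

```python
import numpy as np

def alive_particle_filter(y, theta, simulate_step, N=100, max_trials=10**6, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    particles = np.full(N, y[0])      # condition on the first observation
    loglike = 0.0
    for t in range(1, len(y)):
        alive, trials = [], 0
        # resample and propagate until N + 1 particles hit the observed count
        while len(alive) < N + 1:
            if trials >= max_trials:
                return -np.inf        # theta cannot plausibly reproduce the data
            x = simulate_step(particles[rng.integers(N)], theta, rng)
            trials += 1
            if x == y[t]:
                alive.append(x)
        loglike += np.log(N / (trials - 1))   # unbiased likelihood increment
        particles = np.array(alive[:N])
    return loglike
```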
Abstract:
In estuaries and natural water channels, the estimation of velocity and dispersion coefficients is critical to the understanding of scalar transport and mixing. Such estimates are rarely available experimentally at sub-tidal time scales in shallow water channels, where high-frequency sampling is required to capture their spatio-temporal variation. This study estimates Lagrangian integral scales and autocorrelation curves, which are key parameters for obtaining velocity fluctuations and dispersion coefficients, and their spatio-temporal variability, from deployments of Lagrangian drifters sampled at 10 Hz over a 4-hour period. The power spectral densities of the velocities between 0.0001 and 0.8 Hz were well fitted by the −5/3 slope predicted by Kolmogorov's similarity hypothesis within the inertial subrange, and were similar to the Eulerian power spectra previously observed within the estuary. The results showed that large velocity fluctuations determine the magnitude of the integral time scale, TL. Overlapping of short segments improved the stability of the estimate of TL by taking advantage of the redundant data included in the autocorrelation function. The integral time scales were about 20 s and varied by up to a factor of 8. These results are essential inputs for spatial binning of velocities, Lagrangian stochastic modelling, and single-particle analysis of the tidal estuary.
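The integral time scale described above is typically obtained by integrating the velocity autocorrelation function up to its first zero crossing. A single-segment sketch of that calculation (the study averages overlapping segments for stability) might look like:

```python
import numpy as np

def integral_time_scale(u, dt):
    """Estimate T_L from a velocity record u sampled every dt seconds."""
    up = np.asarray(u, dtype=float) - np.mean(u)   # velocity fluctuations
    var = np.mean(up * up)
    n = len(up)
    # normalised autocorrelation at lags 0 .. n-1
    acf = np.array([np.mean(up[:n - k] * up[k:]) for k in range(n)]) / var
    crossing = np.argmax(acf <= 0)                 # first zero crossing
    cut = crossing if crossing > 0 else n          # integrate whole record if none
    return float(np.sum(acf[:cut]) * dt)           # rectangle-rule integral of R(tau)

# usage with 10 Hz drifter data:
# TL = integral_time_scale(velocity_series, dt=0.1)
```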
Abstract:
In this paper it is demonstrated how the Bayesian parametric bootstrap can be adapted to models with intractable likelihoods. The approach is most appealing when semi-automatic approximate Bayesian computation (ABC) summary statistics are selected. After a pilot run of ABC, the likelihood-free parametric bootstrap approach requires very few model simulations to produce an approximate posterior, which can be a useful approximation in its own right. An alternative is to use this approximation as a proposal distribution in ABC algorithms to make them more efficient. In this paper, the parametric bootstrap approximation is used to form the initial importance distribution for the sequential Monte Carlo, ABC importance sampling, and ABC rejection sampling algorithms. The new approach is illustrated through a simulation study of the univariate g-and-k quantile distribution, and is used to infer parameter values of a stochastic model describing expanding melanoma cell colonies.
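For context, the g-and-k distribution used in the simulation study has no closed-form density but is trivially simulated through its quantile function, which is what makes it a standard likelihood-free benchmark. A sketch, with the conventional c = 0.8:

```python
import numpy as np

def gandk_sample(n, a, b, g, k, c=0.8, rng=None):
    """Draw n samples from the g-and-k distribution via its quantile function."""
    rng = np.random.default_rng() if rng is None else rng
    z = rng.standard_normal(n)
    # Q(z) = a + b * (1 + c * (1 - e^{-gz}) / (1 + e^{-gz})) * (1 + z^2)^k * z
    return a + b * (1 + c * (1 - np.exp(-g * z)) / (1 + np.exp(-g * z))) \
             * (1 + z**2) ** k * z

# e.g. synthetic data for an ABC study:
# y = gandk_sample(1000, a=3.0, b=1.0, g=2.0, k=0.5)
```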
Abstract:
Provision of network infrastructure to meet rising network peak demand is increasing the cost of electricity. Addressing this demand is a major imperative for Australian electricity agencies. The network peak demand model reported in this paper provides a quantified decision support tool and a means of understanding the key influences and impacts on network peak demand. An investigation of the system factors impacting residential consumers' peak demand for electricity was undertaken in Queensland, Australia. Technical factors, such as the customers' location, housing construction and appliances, were combined with social factors, such as household demographics, culture, trust and knowledge, and Change Management Options (CMOs), such as tariffs, price, managed supply, etc., in a conceptual 'map' of the system. A Bayesian network was used to quantify the model and provide insights into the major influential factors and their interactions. The model was also used to examine the reduction in network peak demand under different market-based and government interventions in various customer locations of interest, and to investigate the relative importance of instituting programs that build trust and knowledge through well-designed customer-industry engagement activities. The Bayesian network was implemented via a spreadsheet with a tick-box interface. The model combined available data from industry-specific and public sources with relevant expert opinion. The results revealed that the most effective intervention strategies involve combining particular CMOs with associated education and engagement activities. The model demonstrated the importance of designing interventions that take into account the interactions of the various elements of the socio-technical system. The options that provided the greatest impact on peak demand were Off-Peak Tariffs, Managed Supply, and increases in the price of electricity. The impact on peak demand reduction differed for each of the locations and highlighted that household numbers and demographics, as well as the different climates, were significant factors. The model presented possible network peak demand reductions that would delay network upgrades, resulting in savings for Queensland utilities and ultimately for households. This systems approach, using Bayesian networks to assist the management of peak demand in different modelled locations in Queensland, provided insights about the most important elements in the system and the intervention strategies that could be tailored to the targeted customer segments.
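To make the Bayesian network machinery concrete: a discrete network of this kind marginalises CPT entries over the system factors, and a CMO intervention is evaluated by fixing the corresponding node. The toy sketch below uses two factors and entirely invented probabilities; it only illustrates the style of query such a model supports, not the paper's actual CPTs.

```python
import itertools

# placeholder priors over two parent factors (invented values)
p_tariff = {"flat": 0.7, "off_peak": 0.3}
p_climate = {"mild": 0.6, "hot": 0.4}
# placeholder P(peak demand = high | tariff, climate)
p_high = {("flat", "mild"): 0.30, ("flat", "hot"): 0.60,
          ("off_peak", "mild"): 0.15, ("off_peak", "hot"): 0.35}

def prob_high_peak(tariff=None):
    """Marginal P(peak = high), optionally intervening on the tariff node."""
    total = 0.0
    for t, c in itertools.product(p_tariff, p_climate):
        if tariff is not None and t != tariff:
            continue
        w = (1.0 if tariff is not None else p_tariff[t]) * p_climate[c]
        total += w * p_high[(t, c)]
    return total

print(prob_high_peak())             # baseline peak-demand risk
print(prob_high_peak("off_peak"))   # effect of an off-peak tariff CMO
```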
Abstract:
Pseudo-marginal methods such as the grouped independence Metropolis-Hastings (GIMH) and Markov chain within Metropolis (MCWM) algorithms have been introduced in the literature as an approach to performing Bayesian inference in latent variable models. These methods replace intractable likelihood calculations with unbiased estimates within Markov chain Monte Carlo algorithms. The GIMH method has the posterior of interest as its limiting distribution, but suffers from poor mixing if it is too computationally intensive to obtain high-precision likelihood estimates. The MCWM algorithm has better mixing properties, but less theoretical support. In this paper we propose to use Gaussian processes (GPs) to accelerate the GIMH method, whilst using a short pilot run of MCWM to train the GP. Our new method, GP-GIMH, is illustrated on simulated data from a stochastic volatility model and a gene network model.
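The surrogate idea can be sketched as follows: fit a GP to the noisy log-likelihood estimates collected during the pilot MCWM run, then run Metropolis-Hastings against the GP prediction instead of re-estimating the likelihood. This simplified version uses only the GP mean, whereas GP-GIMH itself must also handle surrogate uncertainty; `log_prior` and the pilot arrays (`thetas`, `loglikes`) are assumed inputs.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def gp_metropolis(thetas, loglikes, log_prior, n_iter=5000, step=0.1, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    # train the surrogate on pilot-run (parameter, noisy log-likelihood) pairs
    gp = GaussianProcessRegressor(RBF() + WhiteKernel()).fit(thetas, loglikes)
    ll = lambda th: gp.predict(th.reshape(1, -1))[0]   # GP mean as cheap log-likelihood
    theta = thetas[-1].copy()
    chain = []
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.shape)
        log_alpha = (ll(prop) + log_prior(prop)) - (ll(theta) + log_prior(theta))
        if np.log(rng.uniform()) < log_alpha:
            theta = prop
        chain.append(theta.copy())
    return np.array(chain)
```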
Abstract:
Having the ability to work with complex models can be highly beneficial, but the computational cost of doing so is often large. Complex models often have intractable likelihoods, so methods that directly use the likelihood function are infeasible. In these situations, the benefits of working with likelihood-free methods become apparent. Likelihood-free methods, such as parametric Bayesian indirect likelihood, which uses the likelihood of an alternative parametric auxiliary model, have been explored throughout the literature as a good alternative when the model of interest is complex. One of these methods is the synthetic likelihood (SL), which assumes a multivariate normal approximation to the likelihood of a summary statistic of interest. This paper explores the accuracy and computational efficiency of the Bayesian version of the synthetic likelihood (BSL) approach in comparison to a competitor known as approximate Bayesian computation (ABC), and the sensitivity of each method to its tuning parameters and assumptions. We relate BSL to pseudo-marginal methods and propose an alternative SL that uses an unbiased estimator of the exact working normal likelihood when the summary statistic has a multivariate normal distribution. Several applications of varying complexity are considered to illustrate the findings of this paper.
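A minimal sketch of evaluating the plug-in synthetic likelihood at a parameter value follows: simulate n datasets, summarise each, and score the observed summary under a fitted multivariate normal. `simulate` and `summarise` are hypothetical user-supplied functions; the unbiased variant discussed in the paper replaces this plug-in normal density with an unbiased estimate of it.

```python
import numpy as np
from scipy.stats import multivariate_normal

def synthetic_loglike(theta, s_obs, simulate, summarise, n=200, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    # summaries of n datasets simulated at theta
    S = np.array([summarise(simulate(theta, rng)) for _ in range(n)])
    mu = S.mean(axis=0)                 # fitted summary mean
    Sigma = np.cov(S, rowvar=False)     # fitted summary covariance
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=Sigma)
```

This log-likelihood can then be dropped into any standard MCMC scheme in place of the intractable exact likelihood.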
Abstract:
Predicting the temporal responses of ecosystems to disturbances associated with industrial activities is critical for their management and conservation. However, prediction of ecosystem responses is challenging due to the complexity and potential non-linearities stemming from interactions between system components and multiple environmental drivers. Prediction is particularly difficult for marine ecosystems due to their often highly variable and complex nature and the large uncertainties surrounding their dynamic responses. Consequently, current management of such systems often relies on expert judgement and/or complex quantitative models that consider only a subset of the relevant ecological processes. Hence there exists an urgent need for the development of whole-of-system predictive models to support decision and policy makers in managing complex marine systems in the context of industry-based disturbances. This paper presents Dynamic Bayesian Networks (DBNs) for predicting the temporal response of a marine ecosystem to anthropogenic disturbances. The DBN provides a visual representation of the problem domain in terms of factors (parts of the ecosystem) and their relationships. These relationships are quantified via Conditional Probability Tables (CPTs), which estimate the variability and uncertainty in the distribution of each factor. The combination of qualitative visual and quantitative elements in a DBN facilitates the integration of a wide array of data, published and expert knowledge, and other models. Such multiple sources are often essential, as a single source of information is rarely sufficient to cover the diverse range of factors relevant to a management task. Here, a DBN model is developed for tropical, annual Halophila and temperate, persistent Amphibolis seagrass meadows to inform dredging management and help meet environmental guidelines. Specifically, the impacts of capital (e.g. new port development) and maintenance (e.g. maintaining channel depths in established ports) dredging are evaluated with respect to the risk of permanent loss, defined as no recovery within 5 years (Environmental Protection Agency guidelines). The model is developed using expert knowledge, existing literature, statistical models of environmental light, and experimental data. The model is then demonstrated in a case study through the analysis of a variety of dredging, environmental and seagrass ecosystem recovery scenarios. In spatial zones significantly affected by dredging, such as the zone of moderate impact, shoot density has a very high probability of being driven to zero by capital dredging due to the duration of such dredging. Here, fast-growing Halophila species can recover; however, the probability of recovery depends on the presence of seed banks. On the other hand, slow-growing Amphibolis meadows have a high probability of suffering permanent loss. In the maintenance dredging scenario, however, due to the shorter duration of dredging, Amphibolis is better able to resist the impacts of dredging. For both types of seagrass meadows, the probability of loss was strongly dependent on the biological and ecological status of the meadow, as well as on environmental conditions post-dredging. The ability to predict the ecosystem response under cumulative, non-linear interactions across a complex ecosystem highlights the utility of DBNs for decision support and environmental management.
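The temporal reasoning in a DBN amounts to propagating a belief distribution over states through time-slice CPTs that depend on the current regime (dredging versus recovery). The sketch below uses a single shoot-density node and invented placeholder CPTs purely to show the rollout mechanics, not the paper's calibrated model.

```python
import numpy as np

states = ["zero", "low", "high"]
# transition CPTs: rows = current shoot density, columns = next (placeholder values)
T_dredging = np.array([[1.00, 0.00, 0.00],
                       [0.60, 0.35, 0.05],
                       [0.20, 0.50, 0.30]])
T_recovery = np.array([[0.90, 0.10, 0.00],   # escape from "zero", e.g. via seed banks
                       [0.10, 0.60, 0.30],
                       [0.00, 0.20, 0.80]])

belief = np.array([0.0, 0.2, 0.8])                 # initial shoot-density distribution
schedule = ["dredge"] * 12 + ["recover"] * 48      # monthly slices over 5 years
for phase in schedule:
    T = T_dredging if phase == "dredge" else T_recovery
    belief = belief @ T                            # one time-slice update

print(dict(zip(states, belief.round(3))))          # P(shoot density) after 5 years
```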
Abstract:
We address risk-minimizing option pricing in a regime-switching market where the floating interest rate depends on a finite-state Markov process. The growth rate and the volatility of the stock also depend on the Markov process. Using the minimal martingale measure, we show that the locally risk-minimizing prices for certain exotic options satisfy a system of Black-Scholes partial differential equations with appropriate boundary conditions. We find the corresponding hedging strategies and the residual risk. We develop suitable numerical methods to compute option prices.
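A standard numerical route for such weakly coupled Black-Scholes systems is finite differences, with the Markov chain generator tying the regime-wise value functions together. Below is a sketch for a two-regime European call under illustrative parameters; it is not the paper's scheme or data.

```python
import numpy as np

r = np.array([0.03, 0.06])          # regime-wise interest rates (illustrative)
sigma = np.array([0.15, 0.35])      # regime-wise volatilities
Q = np.array([[-0.5, 0.5],          # generator of the two-state Markov chain
              [1.0, -1.0]])
K, T = 100.0, 1.0                   # strike and maturity
S_max, M, N = 300.0, 300, 20000     # explicit scheme: dt small enough for stability
S = np.linspace(0.0, S_max, M + 1)
dS, dt = S[1] - S[0], T / N

V = np.tile(np.maximum(S - K, 0.0), (2, 1))   # terminal payoff in each regime
for n in range(1, N + 1):                     # march backwards from maturity
    tau = n * dt
    Vn = V.copy()
    for i in range(2):
        dV = (Vn[i, 2:] - Vn[i, :-2]) / (2 * dS)
        d2V = (Vn[i, 2:] - 2 * Vn[i, 1:-1] + Vn[i, :-2]) / dS**2
        couple = Q[i] @ Vn[:, 1:-1]           # weak coupling between regimes
        V[i, 1:-1] = Vn[i, 1:-1] + dt * (0.5 * sigma[i]**2 * S[1:-1]**2 * d2V
                                         + r[i] * S[1:-1] * dV
                                         - r[i] * Vn[i, 1:-1] + couple)
        V[i, 0] = 0.0                               # boundary at S = 0
        V[i, -1] = S_max - K * np.exp(-r[i] * tau)  # far-field boundary

print(np.interp(100.0, S, V[0]), np.interp(100.0, S, V[1]))  # at-the-money prices
```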
Abstract:
We address the problem of pricing defaultable bonds in a Markov modulated market. Using Merton's structural approach, we show that various types of defaultable bonds are combinations of European-type contingent claims. Thus pricing a defaultable bond is tantamount to pricing a contingent claim in a Markov modulated market. Since the market is incomplete, we use the method of quadratic hedging and the minimal martingale measure to derive locally risk-minimizing derivative prices, hedging strategies and the corresponding residual risks. The prices of defaultable bonds are obtained as solutions to a system of PDEs with weak coupling, subject to appropriate terminal and boundary conditions. We solve the system of PDEs numerically and carry out a numerical investigation of the defaultable bond prices. We compare their credit spreads with some of the existing models. We observe higher spreads in the Markov modulated market. We show how business cycles can be easily incorporated in the proposed framework. We demonstrate the impact on spreads of including rare states that attempt to capture a tight liquidity situation. These states are characterized by a low risk-free interest rate, a high payout rate and high volatility.
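In the single-regime limit, Merton's structural view gives the defaultable zero-coupon bond a closed form: a risk-free bond minus a European put on the firm value. A sketch under illustrative parameters (regime switching would let r and sigma vary with the chain state, leading to the coupled PDE system above):

```python
import numpy as np
from scipy.stats import norm

def merton_bond(V0, F, r, sigma, T):
    """Defaultable zero-coupon bond with face F: risk-free bond minus a put on firm value."""
    d1 = (np.log(V0 / F) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    put = F * np.exp(-r * T) * norm.cdf(-d2) - V0 * norm.cdf(-d1)
    return F * np.exp(-r * T) - put

price = merton_bond(V0=120.0, F=100.0, r=0.04, sigma=0.25, T=2.0)
spread = -np.log(price / (100.0 * np.exp(-0.04 * 2.0))) / 2.0   # credit spread over r
print(price, spread)
```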
Abstract:
We introduce and study a class of non-stationary semi-Markov decision processes on a finite horizon. By constructing an equivalent Markov decision process, we establish the existence of a piecewise open-loop relaxed control that is optimal for the finite horizon problem.
Abstract:
The work presented in this paper involves the stochastic finite element analysis of composite-epoxy adhesive lap joints using Monte Carlo simulation. A set of composite adhesive lap joints were prepared and loaded until failure to obtain their strength. The peel and shear strains in the bond line region at different levels of load were obtained using digital image correlation (DIC). The corresponding stresses were computed assuming a plane strain condition. The finite element model was verified by comparing the numerical and experimental stresses. The stresses exhibited similar behavior and a good correlation was obtained. Further, the finite element model was used to perform the stochastic analysis using Monte Carlo simulation. The parameters influencing stress distribution were provided as random input variables and the resulting probabilistic variation of the maximum peel and shear stresses was studied. It was found that the adhesive modulus and bond line thickness had a significant influence on the maximum stress variation. While the adherend thickness had a major influence, the effect of variation in the longitudinal and shear moduli on the stresses was found to be small.
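The Monte Carlo layer of such a study is straightforward to sketch: draw the uncertain inputs from assumed distributions and push each draw through the deterministic model. Here a placeholder response surface stands in for the finite element solve, and all distributions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
E_adh = rng.normal(3.0e9, 0.15e9, n)       # adhesive modulus [Pa], assumed spread
t_bond = rng.normal(0.2e-3, 0.02e-3, n)    # bond line thickness [m], assumed spread

def max_peel_stress(E, t):
    # hypothetical response surface standing in for the FE solve
    return 40e6 * (E / 3.0e9) ** 0.5 * (0.2e-3 / t) ** 0.3

stress = max_peel_stress(E_adh, t_bond)
print(stress.mean() / 1e6, stress.std() / 1e6)   # mean and spread in MPa
```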
Abstract:
In this article, we study a risk-sensitive control problem with controlled continuous-time Markov chain state dynamics. Using a multiplicative dynamic programming principle along with the atomic structure of the state dynamics, we prove the existence and a characterization of an optimal risk-sensitive control under geometric ergodicity of the state dynamics and a smallness condition on the running cost.
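A discrete-time, finite-state analogue of the multiplicative dynamic programming principle can be sketched as value iteration on the multiplicative Bellman operator, normalised each step so that the growth factor converges to the optimal risk-sensitive average cost. All data below are invented, and the paper's continuous-time setting is more delicate.

```python
import numpy as np

c = np.array([[1.0, 0.4],                 # running cost c(x, a): 2 states x 2 actions
              [0.2, 0.8]])
P = np.array([[[0.7, 0.3], [0.4, 0.6]],   # transition kernel P[x, a, y]
              [[0.5, 0.5], [0.1, 0.9]]])

v = np.ones(2)
for _ in range(500):
    # multiplicative Bellman operator: min over actions of e^{c(x,a)} * E[v(X')]
    Tv = np.min(np.exp(c) * (P @ v), axis=1)
    rho, v = Tv.max(), Tv / Tv.max()      # normalise, power-iteration style

print("optimal risk-sensitive average cost:", np.log(rho))
```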