927 results for Markov Model with Monte-Carlo microsimulations
Abstract:
We study an energy-constrained sandpile model with random neighbors. The critical behavior of the model is in the same universality class as the mean-field self-organized criticality sandpile. The critical energy E_c depends on the number of neighbors n for each site, but the various exponents are independent of n. A self-similar structure with n-1 major peaks develops in the energy distribution p(E) as the system approaches its stationary state. The avalanche dynamics contributes the major peaks, which appear at E_pk = 2k/(2n-1) with k = 1, 2, ..., n-1, while the fine self-similar structure is a natural result of the way the system is disturbed. [S1063-651X(99)10307-6]
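The drive-and-topple dynamics of a random-neighbor sandpile like the one above can be sketched as follows. The drive amplitude, threshold, and dissipation rate are illustrative assumptions; the published model's exact redistribution rule may differ.

```python
import random

def random_neighbor_sandpile(sites=1000, n=4, threshold=1.0,
                             dissipation=0.05, steps=20000, seed=0):
    """Minimal random-neighbor sandpile sketch: repeatedly drive a random
    site with a small energy grain; any site whose energy reaches the
    threshold topples, sharing (1 - dissipation) of its energy equally
    among n randomly chosen sites (annealed random neighbors)."""
    rng = random.Random(seed)
    E = [0.0] * sites
    avalanche_sizes = []
    for _ in range(steps):
        E[rng.randrange(sites)] += rng.uniform(0.0, 0.5)  # drive
        size = 0
        unstable = [i for i, e in enumerate(E) if e >= threshold]
        while unstable:
            i = unstable.pop()
            if E[i] < threshold:
                continue  # already relaxed earlier in this avalanche
            size += 1
            share = E[i] * (1.0 - dissipation) / n
            E[i] = 0.0
            for _ in range(n):
                j = rng.randrange(sites)
                E[j] += share
                if E[j] >= threshold:
                    unstable.append(j)
        if size:
            avalanche_sizes.append(size)
    return E, avalanche_sizes
```

Histogramming `E` after many steps would be the place to look for the peak structure in p(E) that the abstract describes.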
Abstract:
The range of potential applications for indoor and campus-based personnel localisation has led researchers to create a wide spectrum of different algorithmic approaches and systems. However, the majority of the proposed systems overlook the unique radio environment presented by the human body, leading to systematic errors and inaccuracies when deployed in this context. In this paper, RSSI-based Monte Carlo Localisation was implemented using commercial off-the-shelf 868 MHz hardware, and empirical data were gathered across a relatively large number of scenarios within a single indoor office environment. These data showed that the shadowing effect caused by the human body introduced path skew into location estimates. It was also shown that, by using two body-worn nodes in concert, the effect of body shadowing can be mitigated by averaging the estimated positions of the two nodes worn on either side of the body. © Springer Science+Business Media, LLC 2012.
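A minimal sketch of RSSI-based Monte Carlo Localisation in the spirit of the abstract above: particles are weighted by a Gaussian likelihood of the observed RSSI under a log-distance path-loss model, then resampled. The path-loss exponent, reference power, noise spread, and jitter are assumed values, not the paper's calibration.

```python
import math
import random

def rssi_expected(d, tx_power=-40.0, n_path=2.5):
    """Log-distance path-loss model: expected RSSI (dBm) at d metres.
    tx_power (RSSI at 1 m) and the exponent n_path are assumed values."""
    return tx_power - 10.0 * n_path * math.log10(max(d, 0.1))

def mcl_step(particles, anchors, rssi_obs, sigma=4.0, rng=random):
    """One Monte Carlo localisation update: weight each particle by the
    Gaussian likelihood of the observed RSSI vector, then resample
    systematically and jitter the survivors."""
    weights = []
    for x, y in particles:
        logw = 0.0
        for (ax, ay), r in zip(anchors, rssi_obs):
            d = math.hypot(x - ax, y - ay)
            logw -= (r - rssi_expected(d)) ** 2 / (2.0 * sigma ** 2)
        weights.append(math.exp(logw))
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    new, step = [], 1.0 / len(particles)
    u, c, i = rng.uniform(0.0, step), weights[0], 0
    for _ in range(len(particles)):
        while u > c and i < len(weights) - 1:
            i += 1
            c += weights[i]
        x, y = particles[i]
        new.append((x + rng.gauss(0, 0.2), y + rng.gauss(0, 0.2)))
        u += step
    return new

def estimate(particles):
    """Position estimate: the particle mean."""
    return (sum(p[0] for p in particles) / len(particles),
            sum(p[1] for p in particles) / len(particles))
```

The two-node mitigation described in the abstract then amounts to averaging `estimate(...)` for the node worn on each side of the body.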
Abstract:
This paper addresses the problem of optimally locating intermodal freight terminals in Serbia. To solve this problem and determine the effects of the resulting scenarios, two modeling approaches were combined. The first approach is based on multiple-assignment hub-network design, and the second is based on simulation. The multiple-assignment p-hub network location model was used to determine the optimal locations of intermodal terminals. Simulation was used as a tool to estimate intermodal transport flow volumes, given the unreliability and unavailability of specific statistical data, and as a method for quantitatively analyzing the economic, time, and environmental effects of different scenarios of intermodal terminal development. The results presented here summarize, and in part extend, the research carried out in the IMOD-X project (Intermodal Solutions for Competitive Transport in Serbia).
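The multiple-assignment p-hub location model mentioned above can be illustrated with a brute-force sketch: choose the p hubs that minimise total flow-weighted routing cost, with every origin-destination pair free to use its cheapest hub pair. The inter-hub discount alpha and all data are invented; real instances require a MILP solver or heuristics rather than enumeration.

```python
from itertools import combinations

def p_hub_median(nodes, dist, flow, p, alpha=0.6):
    """Brute-force multiple-assignment p-hub median. Each flow (i, j) is
    routed i -> k -> m -> j through the cheapest hub pair (k, m), with
    the inter-hub leg discounted by alpha (economies of scale)."""
    best_cost, best_hubs = float("inf"), None
    for hubs in combinations(nodes, p):
        total = 0.0
        for (i, j), f in flow.items():
            total += f * min(dist[i][k] + alpha * dist[k][m] + dist[m][j]
                             for k in hubs for m in hubs)
        if total < best_cost:
            best_cost, best_hubs = total, hubs
    return best_cost, best_hubs
```

Note that k = m is allowed, so a single hub can serve a pair; adding hubs therefore never increases the optimal cost.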
Abstract:
The contribution of preexisting hypercholesterolemia to diabetic nephropathy remains unclear. We assessed the impact of hypercholesterolemia on diabetic nephropathy using a double knockout (DKO) mouse, null for the low-density lipoprotein receptor (LDLR−/−) and the apoB mRNA editing catalytic polypeptide 1 (APOBEC1−/−).
Abstract:
Research into localization has produced a wealth of algorithms and techniques to estimate the location of wireless network nodes; however, the majority of these schemes do not explicitly account for non-line-of-sight conditions. Disregarding this common situation reduces their accuracy and their potential for exploitation in real-world applications. This is a particular problem for personnel tracking, where the user's body itself will inherently cause time-varying blocking according to their movements. Using empirical data, this paper demonstrates that, by accounting for non-line-of-sight conditions and using received-signal-strength-based Monte Carlo localization, meter-scale accuracy can be achieved for a wrist-worn personnel tracking tag in a 120 m indoor office environment. © 2012 IEEE.
Abstract:
We present an implementation of quantum annealing (QA) via lattice Green's function Monte Carlo (GFMC), focusing on its application to the Ising spin glass in transverse field. In particular, we study whether or not such a method is more effective than path-integral Monte Carlo (PIMC) based QA, as well as classical simulated annealing (CA), previously tested on the same optimization problem. We identify the issue of importance sampling, i.e., the necessity of possessing reasonably good (variational) trial wave functions, as the key point of the algorithm. We performed GFMC-QA runs using a Boltzmann-type trial wave function, finding results for the residual energies that are qualitatively similar to those of CA (but at a much larger computational cost), and definitely worse than PIMC-QA. We conclude that, at present, without a serious effort in constructing reliable importance-sampling variational wave functions for a quantum glass, GFMC-QA is not a true competitor of PIMC-QA.
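For reference, the CA baseline discussed above is a standard Metropolis simulated-annealing loop on an Ising energy function. This is a generic sketch, not the paper's protocol: the couplings, sweep count, and linear temperature schedule are illustrative assumptions.

```python
import math
import random

def simulated_annealing_ising(J, n_spins, sweeps=2000, t0=3.0, t_end=0.01, seed=0):
    """Classical simulated annealing on an Ising model with couplings
    J[(i, j)], energy E = -sum_ij J_ij s_i s_j, and a linear
    temperature schedule from t0 down to t_end."""
    rng = random.Random(seed)
    s = [rng.choice((-1, 1)) for _ in range(n_spins)]

    def local_field(i):
        h = 0.0
        for (a, b), Jab in J.items():
            if a == i:
                h += Jab * s[b]
            elif b == i:
                h += Jab * s[a]
        return h

    def energy():
        return -sum(Jab * s[a] * s[b] for (a, b), Jab in J.items())

    for step in range(sweeps):
        T = t0 + (t_end - t0) * step / (sweeps - 1)
        for i in range(n_spins):
            dE = 2.0 * s[i] * local_field(i)  # energy change if spin i flips
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                s[i] = -s[i]
    return s, energy()
```

For a true spin glass, `J` would hold random-sign couplings; the residual energy after annealing is the quantity the paper compares across CA, PIMC-QA, and GFMC-QA.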
Abstract:
Monte Carlo calculations of quantum yield in PtSi/p-Si infrared detectors are carried out taking into account the presence of a spatially distributed barrier potential. In the 1-4 μm wavelength range it is found that the spatial inhomogeneity of the barrier has no significant effect on the overall device photoresponse. However, above λ = 4.0 μm, and particularly as the cut-off wavelength (λ ≈ 5.5 μm) is approached, these calculations reveal a difference between the homogeneous and inhomogeneous barrier photoresponse which becomes increasingly significant and exceeds 50% at λ = 5.3 μm. It is, in fact, the inhomogeneous barrier which displays an increased photoyield, a feature that is confirmed by approximate analytical calculations assuming a symmetric Gaussian spatial distribution of the barrier. Furthermore, the importance of the silicide layer thickness in optimizing device efficiency is underlined as a trade-off between maximizing light absorption in the silicide layer and optimizing the internal yield. The results presented here address important features which determine the photoyield of PtSi/Si Schottky diodes at energies below the Si absorption edge and just above the Schottky barrier height in particular.
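The approximate analytical picture in the abstract (an internal yield averaged over a Gaussian barrier distribution) can be reproduced numerically with a simple Fowler-type yield law. The barrier mean, spread, and the Fowler form itself are illustrative assumptions, not the paper's device parameters.

```python
import math
import random

def fowler_yield(h_nu, phi):
    """Fowler internal photoemission yield, ~ (hv - phi)^2 / hv for
    photon energies above the barrier phi (both in eV), else zero."""
    return (h_nu - phi) ** 2 / h_nu if h_nu > phi else 0.0

def mean_yield_gaussian_barrier(lam_um, phi0=0.22, sigma=0.02,
                                samples=20000, seed=0):
    """Compare the yield of a homogeneous barrier at phi0 with the
    Monte Carlo average over a Gaussian barrier N(phi0, sigma)."""
    rng = random.Random(seed)
    h_nu = 1.2398 / lam_um  # photon energy (eV) for wavelength in um
    inhomog = sum(fowler_yield(h_nu, rng.gauss(phi0, sigma))
                  for _ in range(samples)) / samples
    homog = fowler_yield(h_nu, phi0)
    return homog, inhomog
```

Near the cut-off, low-barrier regions dominate the average, so the inhomogeneous yield exceeds the homogeneous one, while far above the barrier the two nearly coincide, which is qualitatively the behaviour the abstract reports.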
Abstract:
Hidden Markov models (HMMs) are widely used models for sequential data. As with other probabilistic graphical models, they require the specification of precise probability values, which can be too restrictive for some domains, especially when data are scarce or costly to acquire. We present a generalized version of HMMs, whose quantification can be done by sets of, instead of single, probability distributions. Our models have the ability to suspend judgment when there is not enough statistical evidence, and can serve as a sensitivity analysis tool for standard non-stationary HMMs. Efficient inference algorithms are developed to address standard HMM usage such as the computation of likelihoods and most probable explanations. Experiments with real data show that the use of imprecise probabilities leads to more reliable inferences without compromising efficiency.
Abstract:
There are many uncertainties in forecasting the charging and discharging capacity required by electric vehicles (EVs), largely as a consequence of stochastic usage and intermittent travel. With a view to large-scale EV integration in future power networks, this paper develops a capacity forecasting model which considers eight particular uncertainties in three categories. Using the model, a typical application of EVs to load levelling is presented and exemplified using a UK 2020 case study. The results presented in this paper demonstrate that the proposed model is accurate for charge and discharge prediction and provides a feasible basis for the steady-state analysis required for large-scale EV integration.
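A Monte Carlo treatment of such uncertainties can be sketched as follows: sample per-vehicle daily mileage, plug-in hour, and charger rating, then accumulate the hourly load until each battery's energy deficit is recharged. The distributions and constants below are invented placeholders, not the paper's eight calibrated uncertainties.

```python
import random

def ev_charging_demand(n_evs=1000, hours=24, seed=0):
    """Aggregate hourly EV charging demand from sampled driving and
    plug-in behaviour. Each 1-hour bin holds the energy (kWh) delivered
    in that hour, which equals the average power (kW) over the hour."""
    rng = random.Random(seed)
    load = [0.0] * hours
    for _ in range(n_evs):
        miles = max(0.0, rng.gauss(25, 10))        # daily distance driven
        need_kwh = miles * 0.3                     # ~0.3 kWh per mile
        plug_hour = int(rng.gauss(18, 2)) % hours  # typical evening arrival
        power = rng.choice((3.7, 7.0))             # home charger rating, kW
        h = plug_hour
        while need_kwh > 0:
            delivered = min(power, need_kwh)
            load[h % hours] += delivered
            need_kwh -= delivered
            h += 1
    return load
```

Shifting `plug_hour` (e.g. via a tariff signal) and re-running the simulation is the basic mechanism behind the load-levelling application the abstract describes.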
Abstract:
Hidden Markov models (HMMs) are widely used probabilistic models of sequential data. As with other probabilistic models, they require the specification of local conditional probability distributions, whose assessment can be too difficult and error-prone, especially when data are scarce or costly to acquire. The imprecise HMM (iHMM) generalizes HMMs by allowing the quantification to be done by sets of, instead of single, probability distributions. iHMMs have the ability to suspend judgment when there is not enough statistical evidence, and can serve as a sensitivity analysis tool for standard non-stationary HMMs. In this paper, we consider iHMMs under the strong independence interpretation, for which we develop efficient inference algorithms to address standard HMM usage such as the computation of likelihoods and most probable explanations, as well as performing filtering and predictive inference. Experiments with real data show that iHMMs produce more reliable inferences without compromising the computational efficiency.
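The precise-probability computation that the iHMM generalizes is the forward algorithm for the HMM likelihood; a minimal sketch is below. In the imprecise setting, each row of `pi`, `A`, and `B` becomes a *set* of distributions, and the recursion propagates lower and upper bounds instead of this single value.

```python
def hmm_likelihood(pi, A, B, obs):
    """Forward algorithm: P(obs) for a precise HMM with initial
    distribution pi, transition matrix A (rows: from-state), and
    emission matrix B (rows: state, columns: symbol)."""
    n = len(pi)
    # initialisation: alpha_1(s) = pi(s) * P(obs_1 | s)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    # recursion: alpha_t(t') = sum_s alpha_{t-1}(s) A[s][t'] * P(obs_t | t')
    for o in obs[1:]:
        alpha = [sum(alpha[s] * A[s][t] for s in range(n)) * B[t][o]
                 for t in range(n)]
    return sum(alpha)
```

The same dynamic-programming structure underlies the most-probable-explanation, filtering, and prediction queries mentioned in the abstract, with sums replaced by maximisations or conditioning as appropriate.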
Abstract:
In the past few years, a considerable research effort has been devoted to the development of transformer digital models in order to simulate their behaviour under transient and abnormal operating conditions. Although many three-phase transformer models have been presented in the literature, there is a surprising lack of studies regarding the incorporation of winding faults. This paper presents a coupled electromagnetic transformer model for the study of winding inter-turn short-circuits. Particular attention is given to the determination of the model parameters, for both healthy and faulty operating conditions. Experimental and simulation test results are presented in the paper, demonstrating the adequacy of the model as well as of the methodologies for parameter determination.