934 results for PROBABILISTIC TELEPORTATION


Relevance: 20.00%

Abstract:

The large-scale integration of solar photovoltaic (PV) generation in distribution networks has resulted in over-voltage problems. Several control techniques have been developed to address the over-voltage problem using Deterministic Load Flow (DLF). However, the intermittent characteristics of PV generation require Probabilistic Load Flow (PLF) to capture the variability that DLF ignores. Traditional PLF techniques are not well suited to distribution systems and suffer from several drawbacks: computational burden (Monte Carlo, conventional convolution), accuracy that degrades with system complexity (point estimation method), the need for linearization (multi-linear simulation), and convergence problems (Gram-Charlier expansion, Cornish-Fisher expansion). In this research, Latin Hypercube Sampling with Cholesky Decomposition (LHS-CD) is used to quantify over-voltage issues, with and without a voltage control algorithm, in a distribution network with active generation. The LHS technique is verified on a test network and on a real system from an Australian distribution network service provider. The accuracy and computational burden of the simulated results are also compared against Monte Carlo simulations.
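As a minimal sketch of the sampling step, the following Python code draws Latin Hypercube samples of standard normals and imposes a target correlation through a Cholesky factor. The two-variable correlation matrix and sample count are illustrative, not taken from the paper, and mixing through the Cholesky factor slightly perturbs the per-dimension stratification:

```python
import random, math
from statistics import NormalDist

def lhs_correlated_normals(n_samples, corr, seed=0):
    """Latin Hypercube Sampling of standard normals; a Cholesky factor of
    the target correlation matrix `corr` (nested lists) imposes correlation."""
    rng = random.Random(seed)
    k = len(corr)
    # Stratified uniforms: one draw per equal-probability bin, shuffled per dimension.
    z = []
    for _ in range(k):
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)
        z.append([NormalDist().inv_cdf(u) for u in strata])
    # Textbook Cholesky decomposition of the correlation matrix.
    L = [[0.0] * k for _ in range(k)]
    for i in range(k):
        for j in range(i + 1):
            s = sum(L[i][m] * L[j][m] for m in range(j))
            L[i][j] = math.sqrt(corr[i][i] - s) if i == j else (corr[i][j] - s) / L[j][j]
    # Mix the independent LHS columns through L to obtain correlated samples.
    return [[sum(L[i][m] * z[m][t] for m in range(k)) for t in range(n_samples)]
            for i in range(k)]
```

With e.g. `corr = [[1.0, 0.8], [0.8, 1.0]]`, the sample correlation of the two returned rows comes out close to 0.8 while each marginal stays very nearly standard normal.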

Relevance: 20.00%

Abstract:

The behavior of pile foundations in non-liquefiable soil under seismic loading is considerably influenced by variability in the soil and seismic design parameters. Hence, probabilistic models for the assessment of seismic pile design are necessary. Deformation of a pile foundation in non-liquefiable soil is dominated by the inertial force from the superstructure. The present study adopts a pseudo-static approach based on code-specified design response spectra. The response of the pile is determined by the equivalent cantilever approach. The soil medium is modeled as a one-dimensional random field along the depth. The variability associated with undrained shear strength, design response spectrum ordinate, and superstructure mass is taken into consideration. The Monte Carlo simulation technique is adopted to determine the probability of failure and reliability indices for the pile failure modes, namely exceedance of the lateral displacement limit and of the moment capacity. A reliability-based design approach for the free-head pile under seismic force is suggested that enables a rational choice of pile design parameters.
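The Monte Carlo step can be sketched as follows; the lognormal displacement demand and the 25 mm limit are hypothetical stand-ins for the paper's pile response model and failure criteria:

```python
import random
from statistics import NormalDist

def monte_carlo_pf(n_trials, demand_sampler, capacity, seed=1):
    """Estimate the probability of failure P_f = P(demand > capacity) and the
    corresponding reliability index beta = -Phi^{-1}(P_f)."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n_trials) if demand_sampler(rng) > capacity)
    pf = failures / n_trials
    beta = -NormalDist().inv_cdf(pf) if 0 < pf < 1 else float("inf")
    return pf, beta

# Hypothetical limit state: lognormal lateral-displacement demand vs. a 25 mm limit.
demand = lambda rng: rng.lognormvariate(2.5, 0.5)  # median ~12.2 mm
```

The same loop applies to the moment-capacity mode by swapping in a bending-demand sampler and a moment limit.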

Relevance: 20.00%

Abstract:

Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to use the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion. Due to computational difficulties, this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms rely on asymptotic analysis and with the first two model classes we also discuss how to compute exact rational number solutions.

Relevance: 20.00%

Abstract:

What can the statistical structure of natural images teach us about the human brain? Even though the visual cortex is one of the most studied parts of the brain, surprisingly little is known about how exactly images are processed to leave us with a coherent percept of the world around us, so that we can recognize a friend or drive on a crowded street without any effort. By constructing probabilistic models of natural images, the goal of this thesis is to understand the structure of the stimulus that is the raison d'être of the visual system. Following the hypothesis that the optimal processing has to be matched to the structure of that stimulus, we attempt to derive computational principles, features that the visual system should compute, and properties that cells in the visual system should have. Starting from machine learning techniques such as principal component analysis and independent component analysis, we construct a variety of statistical models to discover structure in natural images that can be linked to receptive field properties of neurons in primary visual cortex, such as simple and complex cells. We show that by representing images with phase-invariant, complex-cell-like units, a better statistical description of the visual environment is obtained than with linear simple-cell units, and that complex cell pooling can be learned by estimating both layers of a two-layer model of natural images. We investigate how a simplified model of the processing in the retina, where adaptation and contrast normalization take place, is connected to the natural stimulus statistics. Analyzing the effect that retinal gain control has on later cortical processing, we propose a novel method to perform gain control in a data-driven way. Finally, we show how models like those presented here can be extended to capture whole visual scenes rather than just small image patches. By using a Markov random field approach we can model images of arbitrary size, while still being able to estimate the model parameters from the data.
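As an illustration of the starting point, here is a stdlib-only sketch of the PCA step: power iteration recovering the leading principal component of centered patches. The patch data and dimensions are synthetic placeholders, not natural-image data:

```python
import random

def top_principal_component(patches, iters=200, seed=0):
    """Estimate the leading principal component of a list of patches
    (equal-length vectors) via power iteration on the covariance matrix."""
    d = len(patches[0])
    n = len(patches)
    mean = [sum(p[i] for p in patches) / n for i in range(d)]
    x = [[p[i] - mean[i] for i in range(d)] for p in patches]
    # Covariance matrix C[i][j] = E[(x_i)(x_j)] of the centered patches.
    cov = [[sum(r[i] * r[j] for r in x) / n for j in range(d)] for i in range(d)]
    rng = random.Random(seed)
    v = [rng.gauss(0, 1) for _ in range(d)]
    for _ in range(iters):
        # Multiply by C and renormalize; converges to the top eigenvector.
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v
```

On whitened natural-image patches the same pipeline, followed by an ICA rotation, yields the oriented, localized filters that resemble simple-cell receptive fields.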

Relevance: 20.00%

Abstract:

By applying the theory of the asymptotic distribution of extremes, together with a certain stability criterion, to the question of the domain of convergence (in the probability sense) of the renormalized perturbation expansion (RPE) for the site self-energy in a cellularly disordered system, an expression is obtained in closed form for the probability of nonconvergence of the RPE on the real-energy axis. Hence, the intrinsic mobility mu(E) as a function of the carrier energy E is deduced to be mu(E) = mu_0 exp[-exp((|E| - E_c)/Delta)], where E_c is a nominal 'mobility edge' and Delta is the width of the random site-energy distribution. Thus the mobility falls off sharply but continuously for |E| > E_c, in contradistinction to the notion of an abrupt 'mobility edge' proposed by Cohen et al. and Mott. Also, the calculated electrical conductivity shows a temperature dependence in qualitative agreement with experiments on disordered semiconductors.
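A quick numerical check of the deduced mobility expression, with illustrative values for mu_0, E_c, and Delta (not from the paper), shows the sharp yet continuous falloff past the mobility edge:

```python
import math

def mobility(E, mu0=1.0, Ec=1.0, delta=0.1):
    """mu(E) = mu0 * exp(-exp((|E| - Ec)/delta)): a double exponential,
    close to mu0 well inside the edge and vanishingly small just past it."""
    return mu0 * math.exp(-math.exp((abs(E) - Ec) / delta))
```

At E = Ec the mobility is mu0/e; a mere 0.5 energy unit beyond the edge (5 widths Delta) it has collapsed by dozens of orders of magnitude, yet the function is everywhere continuous, unlike an abrupt mobility edge.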

Relevance: 20.00%

Abstract:

The integration of stochastic wind power has accentuated the challenge of power system stability assessment. Since the power system is time-variant under wind generation fluctuations, pure time-domain simulations can hardly provide real-time stability assessment. As a result, the worst-case scenario is simulated to give a very conservative assessment of system transient stability. In this study, a probabilistic contingency analysis based on a stability measure is proposed to provide a less conservative contingency analysis that covers 5-min wind fluctuations and a successive fault. This probabilistic approach estimates the transfer limit of a critical line for a given fault with stochastic wind generation and active control devices in a multi-machine system. It achieves a lower computational cost and improved accuracy using a new stability measure and polynomial interpolation, and is feasible for online contingency analysis.
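The interpolation idea can be sketched generically: fit a polynomial through a few (power level, stability margin) points and locate where the margin crosses zero. The quadratic margin used below is a made-up stand-in for the paper's stability measure:

```python
def lagrange_eval(xs, ys, x):
    """Evaluate the polynomial interpolating the points (xs, ys) at x."""
    total = 0.0
    for i in range(len(xs)):
        term = ys[i]
        for j in range(len(xs)):
            if j != i:
                term *= (x - xs[j]) / (xs[i] - xs[j])
        total += term
    return total

def transfer_limit(margins_by_power, threshold=0.0):
    """Bisect on the interpolated stability margin (assumed decreasing in
    power) to find the power level where it crosses the threshold."""
    xs, ys = zip(*margins_by_power)
    lo, hi = min(xs), max(xs)
    for _ in range(60):
        mid = (lo + hi) / 2
        if lagrange_eval(xs, ys, mid) > threshold:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

The payoff is that only a handful of expensive time-domain margin evaluations are needed; the cheap interpolant is then queried as often as the online analysis requires.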

Relevance: 20.00%

Abstract:

We analyzed the development of 4th-grade students' understanding of the transition from experimental relative frequencies of outcomes to theoretical probabilities, focusing on the foundational statistical concepts of variation and expectation. We report students' initial and changing expectations of the outcomes of tossing one and two coins, how they related the relative frequency from their physical and computer-simulated trials to the theoretical probability, and how they created and interpreted theoretical probability models. Findings include students' progression from an initial apparent equiprobability bias in predicting outcomes of tossing two coins through to representing the outcomes of increasing the number of trials. After observing the decreasing variation from the theoretical probability as the sample size increased, students developed a deeper understanding of the relationship between relative frequency of outcomes and theoretical probability, as well as their respective associations with variation and expectation. Students' final models indicated increasing levels of probabilistic understanding.
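The effect the students observed, variation around the theoretical probability shrinking as the number of trials grows, is easy to reproduce in a sketch like the following (a fair coin, p = 0.5, is assumed):

```python
import random

def relative_frequency_of_heads(n_tosses, seed=0):
    """Simulate n fair-coin tosses and return the relative frequency of heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

def deviations(sample_sizes, seed=0):
    """Absolute deviation of the observed relative frequency from the
    theoretical probability 0.5, for increasing numbers of tosses."""
    return [abs(relative_frequency_of_heads(n, seed) - 0.5) for n in sample_sizes]
```

For the two-coin task, the theoretical model the students built assigns 1/4 to each of HH, HT, TH, TT, which is why "one head" (HT or TH) is twice as likely as "two heads", contrary to the equiprobability intuition.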

Relevance: 20.00%

Abstract:

We study how probabilistic reasoning and inductive querying can be combined within ProbLog, a recent probabilistic extension of Prolog. ProbLog can be regarded as a database system that supports both probabilistic and inductive reasoning through a variety of querying mechanisms. After a short introduction to ProbLog, we provide a survey of the different types of inductive queries that ProbLog supports, and show how it can be applied to the mining of large biological networks.
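ProbLog's distribution semantics can be illustrated by brute-force enumeration of possible worlds. The probabilistic graph below (0.8::edge(a,b), 0.6::edge(b,c), 0.3::edge(a,c)) is a hypothetical example, not from the paper, and real ProbLog avoids this exponential enumeration:

```python
from itertools import product

def query_probability(prob_facts, entails):
    """Distribution semantics (as in ProbLog): each probabilistic fact is
    independently true with its probability; the query probability is the
    total mass of the worlds in which the query is entailed."""
    facts = list(prob_facts)
    total = 0.0
    for world in product([True, False], repeat=len(facts)):
        p = 1.0
        for (fact, pr), truth in zip(facts, world):
            p *= pr if truth else 1 - pr
        if entails({f for (f, _), t in zip(facts, world) if t}):
            total += p
    return total

# Hypothetical probabilistic graph and a path(a,c) query.
facts = [("ab", 0.8), ("bc", 0.6), ("ac", 0.3)]
path_a_c = lambda true_facts: "ac" in true_facts or {"ab", "bc"} <= true_facts
```

Here P(path(a,c)) = 1 - (1 - 0.3)(1 - 0.8*0.6) = 0.636, since the direct edge and the two-step path are independent routes.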

Relevance: 20.00%

Abstract:

A performance-based liquefaction potential analysis was carried out in the present study to estimate the liquefaction return period for Bangalore, India, through a probabilistic approach. In this approach, the entire range of peak ground acceleration (PGA) and earthquake magnitudes is used in the evaluation of the liquefaction return period. A probabilistic seismic hazard analysis of the study area was performed to evaluate the peak horizontal acceleration at bedrock level. Based on the results of multichannel analysis of surface waves, the study area was found to belong to site class D. The PGA values for the study area were evaluated for site class D by considering local site effects. The soil resistance of the study area was characterized using standard penetration test (SPT) values obtained from 450 boreholes. These SPT data, along with the PGA values obtained from the probabilistic seismic hazard analysis, were used to evaluate the liquefaction return period for the study area. Contour plots showing the spatial variation of the factor of safety against liquefaction, and of the corrected SPT values required to prevent liquefaction for a return period of 475 years at depths of 3 and 6 m, are presented in this paper. The entire process of liquefaction potential evaluation, from the collection of earthquake data and identification of seismic sources to the evaluation of seismic hazard and the assessment of the liquefaction return period, was carried out on a probabilistic basis.
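Under the usual Poisson assumption, the 475-year return period corresponds to roughly a 10% probability of exceedance in 50 years; a sketch of that standard relation (not code from the study):

```python
import math

def exceedance_probability(t_years, return_period):
    """Poisson model: probability of at least one exceedance in t years
    of an event with the given mean return period."""
    return 1.0 - math.exp(-t_years / return_period)

def return_period(t_years, p_exceedance):
    """Inverse relation: mean return period from exposure time and
    target exceedance probability."""
    return -t_years / math.log(1.0 - p_exceedance)
```

This is why the 475-year hazard level is the conventional design benchmark: it is the 10%-in-50-years ground motion.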

Relevance: 20.00%

Abstract:

Probabilistic analysis of cracking moment from 22 simply supported reinforced concrete beams is performed. When the basic variables follow the distribution considered in this study, the cracking moment of a beam is found to follow a normal distribution. An expression is derived, for characteristic cracking moment, which will be useful in examining reinforced concrete beams for a limit state of cracking.
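Given the finding that the cracking moment is normally distributed, a characteristic value at a conventional 5% fractile (the exact fractile used in the paper is not stated here) can be computed as:

```python
from statistics import NormalDist

def characteristic_value(mean, std, fractile=0.05):
    """Characteristic value of a normally distributed resistance: the value
    with only `fractile` probability of being under-run. For the 5% fractile
    this is mean - 1.645 * std."""
    return mean + NormalDist().inv_cdf(fractile) * std
```

For example, a cracking moment with mean 100 kN·m and standard deviation 10 kN·m has a 5% characteristic value of about 83.6 kN·m (the numbers are illustrative, not from the 22 tested beams).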

Relevance: 20.00%

Abstract:

In this paper, we give a method for probabilistic assignment in the Realistic Abductive Reasoning Model. The knowledge is assumed to be represented in the form of causal chaining, namely a hyper-bipartite network. The hyper-bipartite network is the most generalized form of knowledge representation for which, so far, there has been no way of assigning probability to the explanations. First, the inference mechanism of the realistic abductive reasoning model is briefly described, and then a probability is assigned to each of the explanations so that they can be ranked in decreasing order of plausibility.
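The ranking step can be sketched abstractly: if each explanation is a set of hypotheses with assumed-independent probabilities, scoring by joint probability and sorting gives the decreasing order of plausibility. The independence assumption and the toy explanations are illustrative, not the paper's assignment scheme:

```python
def rank_explanations(explanations):
    """Each explanation is a dict {hypothesis: probability}. Score by the
    joint probability (independence assumed) and rank in decreasing order."""
    def joint(hyps):
        p = 1.0
        for prob in hyps.values():
            p *= prob
        return p
    return sorted(explanations, key=joint, reverse=True)
```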

Relevance: 20.00%

Abstract:

The conventional Cornell source-based approach to probabilistic seismic-hazard assessment (PSHA) has been employed all around the world, while many studies rely on computer packages such as FRISK (McGuire, FRISK: a computer program for seismic risk analysis. Open-File Report 78-1007, United States Geological Survey, Department of the Interior, Washington, 1978) and SEISRISK III (Bender and Perkins, SEISRISK III: a computer program for seismic hazard estimation. Bulletin 1772, United States Geological Survey, Department of the Interior, Washington, 1987). A ``black-box'' syndrome may result if the user of the software has no other simple and robust PSHA method with which to make comparisons. Such an alternative, the direct amplitude-based (DAB) approach, has been developed as a heuristic and efficient method enabling users to undertake their own sanity checks on outputs from computer packages. This paper experiments with the application of the DAB approach for three cities in China, Iran, and India, respectively, and compares the results with documented results computed by the source-based approach. Several insights regarding the procedure of conducting PSHA have also been obtained, which could be useful for future seismic-hazard studies.
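A Cornell-style sanity check of the kind DAB is meant to support can be sketched for a single source: discretize magnitude, assume a lognormal ground-motion model, and sum the exceedance contributions. Every number below (activity rate, magnitude distribution, toy attenuation) is hypothetical:

```python
import math
from statistics import NormalDist

def annual_exceedance_rate(a_target, nu, mags, mag_probs, ln_median, sigma_ln):
    """Single-source Cornell-style PSHA:
    lambda(PGA > a) = nu * sum_m P(M = m) * P(PGA > a | m),
    with the ground motion lognormal about a magnitude-dependent median."""
    rate = 0.0
    for m, pm in zip(mags, mag_probs):
        z = (math.log(a_target) - ln_median(m)) / sigma_ln
        rate += pm * (1.0 - NormalDist().cdf(z))
    return nu * rate

# Toy attenuation: median PGA (g) grows with magnitude at a fixed distance.
ln_med = lambda m: math.log(0.01) + 0.8 * (m - 4.0)
```

Summing such rates over all sources and plotting them against the target acceleration gives the hazard curve; the rate decreases monotonically with the target level, which is an easy consistency check on any package output.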