964 results for Probabilistic charts


Relevance: 20.00%

Abstract:

In this work an attempt has been made to evaluate the seismic hazard of South India (8.0°N–20°N; 72°E–88°E) using probabilistic seismic hazard analysis (PSHA). Earthquake data obtained from different sources were declustered to remove dependent events, leaving 598 earthquakes of moment magnitude 4 and above in the study area, which were considered for further hazard analysis. The seismotectonic map of the study area was prepared by considering the faults, lineaments and shear zones that are associated with earthquakes of magnitude 4 and above. For assessing the seismic hazard, the study area was divided into small grid cells of size 0.1° × 0.1°, and the hazard parameters were calculated at the centre of each cell by considering all seismic sources within a radius of 300 km. Rock-level peak horizontal acceleration (PHA) and spectral acceleration (SA) values at a period of 1 s, corresponding to 10% and 2% probability of exceedance in 50 years, were calculated for all the grid points, and contour maps showing the spatial variation of these values are presented here. Uniform hazard response spectra (UHRS) at rock level for 5% damping and for 10% and 2% probability of exceedance in 50 years were also developed for all the grid points. The peak ground acceleration (PGA) at surface level was calculated for the whole of South India for four different site classes. These values can be used to find the PGA at any site in South India based on the site class at that location; thus, the method can be viewed as a simplified way to evaluate PGA values at any site in the study area.
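The two hazard levels quoted above map to return periods of roughly 475 and 2475 years under the usual Poisson occurrence assumption; a minimal sketch of that conversion (not part of the original study):

```python
import math

def return_period(prob_exceedance, exposure_years=50.0):
    """Return period implied by a probability of exceedance over an exposure
    time, assuming Poisson (memoryless) occurrence of exceedances."""
    annual_rate = -math.log(1.0 - prob_exceedance) / exposure_years
    return 1.0 / annual_rate

for p in (0.10, 0.02):
    print(f"{p:.0%} in 50 years -> return period ~ {return_period(p):.0f} years")
# 10% -> ~475 years, 2% -> ~2475 years
```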

Relevance: 20.00%

Abstract:

The large-scale integration of solar photovoltaic (PV) generation in distribution networks has resulted in over-voltage problems. Several control techniques have been developed to address over-voltage using Deterministic Load Flow (DLF). However, the intermittent characteristics of PV generation call for Probabilistic Load Flow (PLF), which introduces into the analysis the variability that DLF ignores. Traditional PLF techniques are not well suited to distribution systems and suffer from several drawbacks: computational burden (Monte Carlo, conventional convolution), accuracy that is sensitive to system complexity (point estimation method), the need for linearization (multi-linear simulation), and convergence problems (Gram–Charlier and Cornish–Fisher expansions). In this research, Latin Hypercube Sampling with Cholesky Decomposition (LHS-CD) is used to quantify over-voltage issues, with and without a voltage control algorithm, in distribution networks with active generation. The LHS technique is verified on a test network and on a real system from an Australian distribution network service provider, and the accuracy and computational burden of the simulated results are compared with Monte Carlo simulation.
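A minimal sketch of the LHS-CD sampling step, assuming Gaussian marginals for the uncertain injections and a user-supplied correlation matrix; the simple normal-scale correlation induction shown here is illustrative and not necessarily the authors' implementation:

```python
import numpy as np
from scipy.stats import norm

def lhs_correlated_normals(n_samples, corr, seed=None):
    """Latin Hypercube Sampling of correlated standard normal variates.

    Each variable gets one value per equal-probability stratum (the LHS step);
    correlation is then induced with the Cholesky factor of the target
    correlation matrix (the CD step), here on the normal scale.
    """
    rng = np.random.default_rng(seed)
    n_vars = corr.shape[0]
    # Independent random permutation of the strata for each variable
    perm = np.argsort(rng.random((n_samples, n_vars)), axis=0)
    u = (perm + rng.random((n_samples, n_vars))) / n_samples   # stratified uniforms
    z = norm.ppf(u)                      # independent standard normals
    L = np.linalg.cholesky(corr)         # lower-triangular Cholesky factor
    return z @ L.T                       # shape (n_samples, n_vars)

# Example: 500 correlated samples for three PV sites; each row would be fed
# to a deterministic load-flow solver to build the PLF voltage distribution.
corr = np.array([[1.0, 0.8, 0.6],
                 [0.8, 1.0, 0.7],
                 [0.6, 0.7, 1.0]])
samples = lhs_correlated_normals(500, corr, seed=42)
```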

Relevance: 20.00%

Abstract:

The behavior of pile foundations in non-liquefiable soil under seismic loading is considerably influenced by variability in the soil and seismic design parameters, so probabilistic models for the assessment of seismic pile design are necessary. Deformation of a pile foundation in non-liquefiable soil is dominated by the inertial force from the superstructure. The present study adopts a pseudo-static approach based on code-specified design response spectra, with the response of the pile determined by the equivalent cantilever approach. The soil medium is modeled as a one-dimensional random field along the depth, and the variability associated with undrained shear strength, design response spectrum ordinate and superstructure mass is taken into consideration. Monte Carlo simulation is used to determine the probability of failure and the reliability indices for the pile failure modes, namely exceedance of the lateral displacement limit and of the moment capacity. A reliability-based design approach for the free-head pile under seismic force is suggested that enables a rational choice of pile design parameters.
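A schematic Monte Carlo sketch of the failure-probability calculation for the lateral displacement mode; the distributions, the stiffness model and the displacement limit are placeholders, not the study's values:

```python
import numpy as np
from scipy.stats import norm

def pile_failure_probability(n_sims=200_000, seed=0):
    """Monte Carlo sketch of the probability that a free-head pile exceeds a
    lateral displacement limit, with the corresponding reliability index."""
    rng = np.random.default_rng(seed)
    s_u  = rng.lognormal(mean=np.log(50e3), sigma=0.3, size=n_sims)  # undrained shear strength (Pa)
    sa   = rng.lognormal(mean=np.log(2.5),  sigma=0.4, size=n_sims)  # design spectrum ordinate (m/s^2)
    mass = rng.normal(60e3, 6e3, size=n_sims)                        # superstructure mass (kg)

    force = mass * sa                        # pseudo-static inertial force (N)
    stiffness = 200.0 * s_u                  # placeholder lateral head stiffness (N/m)
    displacement = force / stiffness         # equivalent-cantilever head deflection (m)

    p_f = np.mean(displacement > 0.040)      # exceedance of a 40 mm limit
    beta = -norm.ppf(p_f)                    # corresponding reliability index
    return p_f, beta

p_f, beta = pile_failure_probability()
print(f"P_f ~ {p_f:.3f}, beta ~ {beta:.2f}")
```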

Relevance: 20.00%

Abstract:

Abraham Frank, November 2003 (via LBI London)

Relevance: 20.00%

Abstract:

Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to use the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion; due to computational difficulties, this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms rely on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational-number solutions.
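For the two-category (Bernoulli) multinomial, the NML normalizer can be computed exactly from its textbook definition in a single pass over the sufficient statistic; a small sketch (not the thesis' efficient floating-point algorithms):

```python
import math

def bernoulli_nml_complexity(n):
    """Parametric complexity C(n) of the Bernoulli model class: the sum over
    all datasets of length n of their maximized likelihood, grouped by the
    number of ones k (exact, n + 1 terms)."""
    total = 0.0
    for k in range(n + 1):
        if k in (0, n):
            lik = 1.0                       # 0^0 taken as 1
        else:
            lik = (k / n) ** k * ((n - k) / n) ** (n - k)
        total += math.comb(n, k) * lik
    return total

def nml_code_length(k, n):
    """NML (stochastic complexity) of a binary sequence with k ones out of n,
    in nats: minus the maximized log-likelihood plus the log normalizer."""
    log_lik = (k * math.log(k / n) if k else 0.0) + \
              ((n - k) * math.log((n - k) / n) if n - k else 0.0)
    return -log_lik + math.log(bernoulli_nml_complexity(n))

print(nml_code_length(k=30, n=100))   # code length in nats
```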

Relevance: 20.00%

Abstract:

What can the statistical structure of natural images teach us about the human brain? Even though the visual cortex is one of the most studied parts of the brain, surprisingly little is known about how exactly images are processed to leave us with a coherent percept of the world around us, so that we can recognize a friend or drive on a crowded street without any effort. By constructing probabilistic models of natural images, the goal of this thesis is to understand the structure of the stimulus that is the raison d'être for the visual system. Following the hypothesis that the optimal processing has to be matched to the structure of that stimulus, we attempt to derive computational principles, features that the visual system should compute, and properties that cells in the visual system should have. Starting from machine learning techniques such as principal component analysis and independent component analysis, we construct a variety of statistical models to discover structure in natural images that can be linked to receptive field properties of neurons in primary visual cortex, such as simple and complex cells. We show that by representing images with phase-invariant, complex cell-like units, a better statistical description of the visual environment is obtained than with linear simple-cell units, and that complex cell pooling can be learned by estimating both layers of a two-layer model of natural images. We investigate how a simplified model of the processing in the retina, where adaptation and contrast normalization take place, is connected to the natural stimulus statistics. Analyzing the effect that retinal gain control has on later cortical processing, we propose a novel method to perform gain control in a data-driven way. Finally, we show how models like those presented here can be extended to capture whole visual scenes rather than just small image patches. By using a Markov random field approach we can model images of arbitrary size, while still being able to estimate the model parameters from the data.
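A compact sketch of the first modelling step described above, assuming the caller supplies an array of flattened grayscale patches; scikit-learn's PCA whitening and FastICA stand in for the estimation methods used in the thesis:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

def learn_linear_features(patches, n_features=64, random_state=0):
    """Estimate ICA basis functions from natural-image patches.

    `patches` is an (n_patches, patch_dim) array of flattened grayscale
    patches. PCA whitening followed by FastICA typically yields localized,
    oriented, band-pass filters resembling simple-cell receptive fields.
    """
    patches = patches - patches.mean(axis=1, keepdims=True)       # remove per-patch DC
    pca = PCA(n_components=n_features, whiten=True, random_state=random_state)
    whitened = pca.fit_transform(patches)
    ica = FastICA(n_components=n_features, whiten=False,
                  random_state=random_state, max_iter=1000)
    sources = ica.fit_transform(whitened)                          # component activations
    # Map the mixing matrix back from whitened PCA space to pixel space,
    # one basis function per row.
    basis = (ica.mixing_.T * np.sqrt(pca.explained_variance_)) @ pca.components_
    return sources, basis
```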

Relevance: 20.00%

Abstract:

By applying the theory of the asymptotic distribution of extremes and a certain stability criterion to the question of the domain of convergence, in the probability sense, of the renormalized perturbation expansion (RPE) for the site self-energy in a cellularly disordered system, an expression has been obtained in closed form for the probability of non-convergence of the RPE on the real-energy axis. Hence, the intrinsic mobility μ(E) as a function of the carrier energy E is deduced to be given by μ(E) = μ₀ exp[−exp((|E| − E_c)/Δ)], where E_c is a nominal 'mobility edge' and Δ is the width of the random site-energy distribution. Thus the mobility falls off sharply but continuously for |E| > E_c, in contradistinction with the notion of an abrupt 'mobility edge' proposed by Cohen et al. and Mott. Also, the calculated electrical conductivity shows a temperature dependence in qualitative agreement with experiments on disordered semiconductors.
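A quick numerical evaluation of the derived mobility law, using generic values of (|E| − E_c)/Δ to illustrate the sharp but continuous falloff:

```python
import math

# mu(E)/mu_0 = exp(-exp((|E| - E_c)/Delta)); generic values, not fitted data
for x in (0.0, 1.0, 2.0, 3.0):
    ratio = math.exp(-math.exp(x))
    print(f"(|E| - Ec)/Delta = {x:.0f}:  mu/mu0 = {ratio:.2e}")
# 0 -> 3.68e-01, 1 -> 6.60e-02, 2 -> 6.18e-04, 3 -> 1.89e-09
```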

Relevance: 20.00%

Abstract:

The integration of stochastic wind power has accentuated the challenge of power system stability assessment. Because the power system becomes time-variant under wind generation fluctuations, pure time-domain simulation can hardly provide real-time stability assessment; as a result, the worst-case scenario is usually simulated, giving a very conservative assessment of system transient stability. In this study, a probabilistic contingency analysis based on a stability measure is proposed to provide a less conservative contingency analysis that covers 5-min wind fluctuations and a successive fault. The approach estimates the transfer limit of a critical line for a given fault with stochastic wind generation and active control devices in a multi-machine system. It achieves a lower computation cost and improved accuracy by using a new stability measure and polynomial interpolation, and is feasible for online contingency analysis.
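A sketch of the interpolation step: fit a low-order polynomial to stability-margin samples evaluated at candidate transfer levels and solve for the zero crossing; the margin values below are placeholders standing in for the stability-measure computation:

```python
import numpy as np

def estimate_transfer_limit(levels, margins, degree=3):
    """Fit a polynomial to (transfer level, stability margin) samples and
    return the smallest level in the scanned range where the fitted margin
    crosses zero; margins would come from the stability-measure evaluation."""
    coeffs = np.polyfit(levels, margins, deg=degree)
    roots = np.roots(coeffs)
    real = roots[np.isreal(roots)].real
    in_range = real[(real >= min(levels)) & (real <= max(levels))]
    return float(in_range.min()) if in_range.size else None

# Placeholder samples: the margin shrinks as the critical line is loaded more
levels  = np.array([200.0, 300.0, 400.0, 500.0, 600.0])   # transfer level (MW)
margins = np.array([0.42, 0.31, 0.17, 0.05, -0.09])        # stability measure
print(estimate_transfer_limit(levels, margins))             # ~ transfer limit in MW
```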

Relevance: 20.00%

Abstract:

We analyzed the development of 4th-grade students' understanding of the transition from experimental relative frequencies of outcomes to theoretical probabilities, with a focus on the foundational statistical concepts of variation and expectation. We report students' initial and changing expectations of the outcomes of tossing one and two coins, how they related the relative frequency from their physical and computer-simulated trials to the theoretical probability, and how they created and interpreted theoretical probability models. Findings include students' progression from an initial apparent equiprobability bias in predicting outcomes of tossing two coins through to representing the outcomes of increasing the number of trials. After observing the decreasing variation from the theoretical probability as the sample size increased, students developed a deeper understanding of the relationship between relative frequency of outcomes and theoretical probability, as well as of their respective associations with variation and expectation. Students' final models indicated increasing levels of probabilistic understanding.
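A simple simulation mirroring the two-coin activity, showing relative frequencies settling toward the theoretical 1/4, 1/2, 1/4 as the number of trials grows:

```python
import numpy as np

rng = np.random.default_rng(1)
print("theoretical:", {0: 0.25, 1: 0.50, 2: 0.25})   # heads from two fair coins

for n_trials in (10, 100, 1000, 10_000):
    heads = rng.integers(0, 2, size=(n_trials, 2)).sum(axis=1)   # 0, 1 or 2 heads per toss
    freqs = {k: round(float(np.mean(heads == k)), 3) for k in (0, 1, 2)}
    print(n_trials, "trials:", freqs)
# The relative frequencies vary widely for small samples and settle toward
# 1/4, 1/2, 1/4 as the number of trials grows.
```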

Relevance: 20.00%

Abstract:

We study how probabilistic reasoning and inductive querying can be combined within ProbLog, a recent probabilistic extension of Prolog. ProbLog can be regarded as a database system that supports both probabilistic and inductive reasoning through a variety of querying mechanisms. After a short introduction to ProbLog, we provide a survey of the different types of inductive queries that ProbLog supports, and show how it can be applied to the mining of large biological networks.
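ProbLog's canonical network-mining query asks for the probability that a path exists between two nodes when each edge is present independently with a given probability. A brute-force Python sketch of that distribution semantics on a toy graph (exact only for tiny graphs; ProbLog itself relies on knowledge compilation):

```python
from itertools import product

# Toy probabilistic network: directed edge -> probability that it is present
edges = {('a', 'b'): 0.9, ('b', 'c'): 0.8, ('a', 'c'): 0.3, ('c', 'd'): 0.7}

def has_path(present, src, dst):
    """Depth-first search over the edges present in one possible world."""
    adj = {}
    for u, v in present:
        adj.setdefault(u, set()).add(v)
    stack, seen = [src], set()
    while stack:
        node = stack.pop()
        if node == dst:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(adj.get(node, ()))
    return False

def path_probability(src, dst):
    """Sum the probabilities of all possible worlds (edge subsets) with a path."""
    edge_list = list(edges)
    total = 0.0
    for included in product([False, True], repeat=len(edge_list)):
        prob, present = 1.0, []
        for edge, inc in zip(edge_list, included):
            prob *= edges[edge] if inc else 1.0 - edges[edge]
            if inc:
                present.append(edge)
        if has_path(present, src, dst):
            total += prob
    return total

print(path_probability('a', 'd'))   # probability of query path(a, d)
```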

Relevance: 20.00%

Abstract:

A performance-based liquefaction potential analysis was carried out in the present study to estimate the liquefaction return period for Bangalore, India, through a probabilistic approach. In this approach, the entire range of peak ground acceleration (PGA) and earthquake magnitudes is used in evaluating the liquefaction return period. The seismic hazard analysis for the study area was done probabilistically to evaluate the peak horizontal acceleration at bedrock level. Based on the results of multichannel analysis of surface waves, the study area was found to belong to site class D, and the PGA values were evaluated for site class D by considering local site effects. The soil resistance of the study area was characterized using standard penetration test (SPT) values obtained from 450 boreholes. These SPT data, along with the PGA values obtained from the probabilistic seismic hazard analysis, were used to evaluate the liquefaction return period. Contour plots showing the spatial variation of the factor of safety against liquefaction and of the corrected SPT values required to prevent liquefaction for a return period of 475 years at depths of 3 and 6 m are presented in this paper. The entire process of liquefaction potential evaluation, from the collection of earthquake data and identification of seismic sources to the evaluation of seismic hazard and the assessment of liquefaction return period, was carried out within a probabilistic framework.
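A schematic sketch of the performance-based calculation: combine the annual rate of each (PGA, magnitude) pair from the hazard analysis with a liquefaction-triggering probability model and invert the total rate to a return period; all input values below are placeholders, not the study's data:

```python
import numpy as np

def liquefaction_return_period(pga_bins, mag_bins, annual_rates, p_trigger):
    """Annual rate of liquefaction = sum over (PGA, magnitude) bins of the
    bin's annual rate times the probability that it triggers liquefaction;
    the return period is the reciprocal of that rate."""
    rate = 0.0
    for i, pga in enumerate(pga_bins):
        for j, m in enumerate(mag_bins):
            rate += annual_rates[i, j] * p_trigger(pga, m)
    return 1.0 / rate

# Placeholder hazard deaggregation and triggering model
pga_bins = np.array([0.05, 0.10, 0.15, 0.20])                 # PGA (g)
mag_bins = np.array([5.0, 6.0, 7.0])                          # moment magnitude
annual_rates = np.array([[4e-3, 1e-3, 2e-4],
                         [1e-3, 4e-4, 1e-4],
                         [3e-4, 2e-4, 6e-5],
                         [1e-4, 8e-5, 3e-5]])                  # events/year per bin
p_trigger = lambda pga, m: min(1.0, (pga / 0.25) ** 2 * (m / 7.5) ** 2)

tr = liquefaction_return_period(pga_bins, mag_bins, annual_rates, p_trigger)
print(f"Liquefaction return period ~ {tr:.0f} years")
```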

Relevance: 20.00%

Abstract:

A probabilistic analysis of the cracking moment of 22 simply supported reinforced concrete beams is performed. When the basic variables follow the distributions considered in this study, the cracking moment of a beam is found to follow a normal distribution. An expression is derived for the characteristic cracking moment, which will be useful in checking reinforced concrete beams against the limit state of cracking.
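Under the normality reported above, the characteristic cracking moment is the 5% fractile, i.e. the mean minus 1.645 standard deviations; a small Monte Carlo sketch for a rectangular section with placeholder statistics:

```python
import numpy as np

def characteristic_cracking_moment(n=100_000, seed=0):
    """Monte Carlo sketch of the 5% fractile of the cracking moment
    M_cr = f_r * I_g / y_t for a rectangular section; the section size and
    the distribution of the modulus of rupture f_r are placeholders."""
    rng = np.random.default_rng(seed)
    b, h = 0.30, 0.50                           # section width and depth (m)
    i_g = b * h ** 3 / 12.0                     # gross moment of inertia (m^4)
    y_t = h / 2.0                               # distance to extreme tension fibre (m)
    f_r = rng.normal(3.5e6, 0.5e6, size=n)      # modulus of rupture (Pa)
    m_cr = f_r * i_g / y_t                      # cracking moment samples (N*m)
    return m_cr.mean() - 1.645 * m_cr.std()     # 5% characteristic value, normal fit

print(f"Characteristic M_cr ~ {characteristic_cracking_moment() / 1e3:.1f} kN*m")
```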