29 results for Probability Distribution


Relevance: 70.00%

Abstract:

Incoherent Thomson scattering (ITS) provides a nonintrusive diagnostic for determining the one-dimensional (1D) electron velocity distribution in plasmas. When the ITS spectrum is Gaussian, its interpretation as a three-dimensional (3D) Maxwellian velocity distribution is straightforward. For more complex ITS line shapes, deriving the corresponding 3D velocity distribution and electron energy probability distribution function is more difficult. This article reviews current techniques and proposes an approach for transforming a 1D velocity distribution into the corresponding 3D energy distribution. Previous approaches have either transformed the ITS spectra directly from a 1D distribution to a 3D one or fitted two Gaussians assuming a Maxwellian or bi-Maxwellian distribution. Here, the measured ITS spectrum is transformed into a 1D velocity distribution, and the probability of finding a particle with speed between 0 and a given value v is calculated. The derivative of this probability function is shown to be the normalized electron velocity distribution function. (C) 2003 American Institute of Physics.
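The final step described above can be checked numerically: for a 3D Maxwellian, the probability of finding a particle with speed between 0 and v is the cumulative Maxwell speed distribution, and differentiating it recovers the normalized speed distribution. A minimal sketch (the thermal speed and grid are illustrative choices, not values from the article):

```python
import numpy as np

# Illustrative thermal speed and velocity grid
v_th = 1.0
v = np.linspace(0.0, 8.0 * v_th, 4000)

# Maxwell speed PDF: f(v) = 4/sqrt(pi) * (v^2 / v_th^3) * exp(-(v/v_th)^2)
f = 4.0 / np.sqrt(np.pi) * v**2 / v_th**3 * np.exp(-((v / v_th) ** 2))

# Cumulative probability P(0 <= speed <= v) by trapezoidal integration
P = np.concatenate(([0.0], np.cumsum(0.5 * (f[1:] + f[:-1]) * np.diff(v))))

# Differentiating the cumulative probability recovers the normalized PDF
f_rec = np.gradient(P, v)
print(P[-1])  # close to 1: the distribution is normalized
```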

Relevance: 70.00%

Abstract:

We report the experimental reconstruction of the nonequilibrium work probability distribution in a closed quantum system, and the study of the corresponding quantum fluctuation relations. The experiment uses a liquid-state nuclear magnetic resonance platform that offers full control on the preparation and dynamics of the system. Our endeavors enable the characterization of the out-of-equilibrium dynamics of a quantum spin from a finite-time thermodynamics viewpoint.

Relevance: 60.00%

Abstract:

Joint quantum measurements of noncommuting observables are possible if one accepts an increase in the measured variances. A necessary condition for a joint measurement to be possible is that a joint probability distribution exists for the measurement. This fact suggests that there may be a link with Bell inequalities, as these will be satisfied if and only if a joint probability distribution for all involved observables exists. We investigate the connections between Bell inequalities and conditions for joint quantum measurements to be possible. Mermin's inequality for the three-particle Greenberger-Horne-Zeilinger state turns out to be equivalent to the condition for a joint measurement on two out of the three quantum systems to exist. Gisin's Bell inequality for three coplanar measurement directions, meanwhile, is shown to be less strict than the condition for the corresponding joint measurement.
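The GHZ-state violation behind Mermin's inequality can be verified directly. A small numerical sketch (the combination XXX - XYY - YXY - YYX is the standard Mermin operator; nothing here is specific to the paper's derivation):

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def kron3(a, b, c):
    return np.kron(np.kron(a, b), c)

# Mermin operator: local hidden-variable models bound its mean by 2
M = kron3(X, X, X) - kron3(X, Y, Y) - kron3(Y, X, Y) - kron3(Y, Y, X)

# Three-particle GHZ state (|000> + |111>)/sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

expectation = np.real(ghz.conj() @ M @ ghz)
print(expectation)  # 4.0, the algebraic maximum, violating the bound of 2
```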

Relevance: 60.00%

Abstract:

In the Crawford-Sobel (uniform, quadratic utility) cheap-talk model, we consider a simple mediation scheme (a communication device) in which the informed agent reports one of N possible elements of a partition to the mediator and then the mediator suggests one of N actions to the uninformed decision-maker according to the probability distribution of the device. We show that such a simple mediated equilibrium cannot improve upon the unmediated N-partition Crawford-Sobel equilibrium when the preference divergence parameter (bias) is small.

Relevance: 60.00%

Abstract:

We investigate the effect of correlated additive and multiplicative Gaussian white noise on the Gompertzian growth of tumours. Our results are obtained by solving numerically the time-dependent Fokker-Planck equation (FPE) associated with the stochastic dynamics. In our numerical approach we have adopted B-spline functions as a truncated basis to expand the approximate eigenfunctions. The eigenfunctions and eigenvalues obtained using this method are used to derive approximate solutions of the dynamics under study. We perform simulations to analyse various aspects of the probability distribution of the tumour cell populations in the transient and steady-state regimes. More precisely, we are concerned mainly with the behaviour of the relaxation time (tau) to the steady-state distribution as a function of (i) the correlation strength (lambda) between the additive noise and the multiplicative noise and (ii) the multiplicative noise intensity (D) and additive noise intensity (alpha). It is observed that both the correlation strength and the intensities of the additive and multiplicative noise affect the relaxation time.
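A direct Langevin integration of such dynamics gives a feel for the model. The sketch below is a hedged stand-in, not the paper's method: Gompertzian growth with correlated additive and multiplicative noise, integrated by Euler-Maruyama, with all parameter values and the precise noise coupling chosen for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = 1.0, 0.5                    # Gompertz growth/decay rates (illustrative)
D, alpha, lam = 0.05, 0.02, 0.5    # noise intensities and correlation strength
dt, steps = 1e-3, 20000

x = 1.0  # initial tumour cell population (arbitrary units)
for _ in range(steps):
    g1 = rng.standard_normal()
    # Build a second Gaussian with corr(g1, g2) = lam
    g2 = lam * g1 + np.sqrt(1 - lam**2) * rng.standard_normal()
    x += (x * (a - b * np.log(x)) * dt
          + np.sqrt(2 * D * dt) * x * g1      # multiplicative noise
          + np.sqrt(2 * alpha * dt) * g2)     # additive noise
print(x)  # fluctuates around the deterministic fixed point exp(a/b)
```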

Relevance: 60.00%

Abstract:

Nonlinear principal component analysis (PCA) based on neural networks has drawn significant attention as a monitoring tool for complex nonlinear processes, but there remains a difficulty with determining the optimal network topology. This paper exploits the advantages of the Fast Recursive Algorithm, where the number of nodes, the location of centres, and the weights between the hidden layer and the output layer can be identified simultaneously for the radial basis function (RBF) networks. The topology problem for the nonlinear PCA based on neural networks can thus be solved. Another problem with nonlinear PCA is that the derived nonlinear scores may not be statistically independent or follow a simple parametric distribution. This hinders its applications in process monitoring since the simplicity of applying predetermined probability distribution functions is lost. This paper proposes the use of a support vector data description and shows that transforming the nonlinear principal components into a feature space allows a simple statistical inference. Results from both simulated and industrial data confirm the efficacy of the proposed method for solving nonlinear principal component problems, compared with linear PCA and kernel PCA.
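The monitoring idea at the end of the paragraph reduces, conceptually, to a data description around the nonlinear scores with a control limit. The numpy-only caricature below is a deliberate simplification: a real support vector data description solves a quadratic program for the minimum enclosing sphere, whereas here an empirical-percentile radius stands in for it, and the scores are synthetic:

```python
import numpy as np

rng = np.random.default_rng(7)
scores = rng.standard_normal((500, 2))   # stand-in for in-control nonlinear scores

centre = scores.mean(axis=0)
dist = np.linalg.norm(scores - centre, axis=1)
radius = np.quantile(dist, 0.99)         # 99% control limit on the distance

new_sample = np.array([4.0, 4.0])        # clearly out-of-control sample
alarm = np.linalg.norm(new_sample - centre) > radius
print(alarm)  # True: the sample falls outside the description
```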

Relevance: 60.00%

Abstract:

In this paper we present a new method for simultaneously determining the three-dimensional (3-D) shape and motion of a non-rigid object from uncalibrated two-dimensional (2-D) images without assuming the distribution characteristics. A non-rigid motion can be treated as a combination of a rigid rotation and a non-rigid deformation. To seek accurate recovery of deformable structures, we estimate the probability distribution function of the corresponding features through random sampling, incorporating an established probabilistic model. The fit between the observation and the projection of the estimated 3-D structure is evaluated using a Markov chain Monte Carlo based expectation maximisation algorithm. Applications of the proposed method to both synthetic and real image sequences are demonstrated with promising results.

Relevance: 60.00%

Abstract:

Metallographic characterisation is combined with statistical analysis to study the microstructure of a BT16 titanium alloy after different heat treatment processes. It was found that the length, width and aspect ratio of α plates in this alloy follow the three-parameter Weibull distribution. Increasing annealing temperature or time causes the probability distribution of the length and the width of α plates to tend toward a normal distribution. The phase transformation temperature of the BT16 titanium alloy was found to be 875±5°C.
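Fitting a three-parameter Weibull distribution (shape, location, scale) to plate-size data, as described above, can be done by maximum likelihood. A hedged sketch where a synthetic sample stands in for measured alpha-plate lengths and all parameter values are illustrative:

```python
import numpy as np
from scipy import stats

# Synthetic "plate length" data from a known three-parameter Weibull
rng = np.random.default_rng(0)
lengths = stats.weibull_min.rvs(c=2.0, loc=1.0, scale=3.0,
                                size=2000, random_state=rng)

# Maximum-likelihood fit; rough initial guesses help convergence
c_hat, loc_hat, scale_hat = stats.weibull_min.fit(lengths, 1.5,
                                                  loc=0.5, scale=2.0)
print(c_hat, loc_hat, scale_hat)

# Goodness of fit via the Kolmogorov-Smirnov statistic
ks = stats.kstest(lengths, "weibull_min",
                  args=(c_hat, loc_hat, scale_hat))
print(ks.statistic)
```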

Relevance: 60.00%

Abstract:

A nonparametric, small-sample-size test for the homogeneity of two psychometric functions against the left- and right-shift alternatives has been developed. The test is designed to determine whether it is safe to amalgamate psychometric functions obtained in different experimental sessions. The sum of the lower and upper p-values of the exact (conditional) Fisher test for several 2 × 2 contingency tables (one for each point of the psychometric function) is employed as the test statistic. The probability distribution of the statistic under the null (homogeneity) hypothesis is evaluated to obtain corresponding p-values. Power functions of the test have been computed by randomly generating samples from Weibull psychometric functions. The test is free of any assumptions about the shape of the psychometric function; it requires only that all observations are statistically independent. © 2011 Psychonomic Society, Inc.
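The test statistic described above can be sketched directly: for each point of the psychometric function, form a 2x2 table of (correct, incorrect) counts in the two sessions and sum the lower and upper p-values of the exact conditional Fisher test. The counts below are illustrative, not data from the paper:

```python
from scipy.stats import fisher_exact

# One 2x2 table per stimulus level: rows are sessions,
# columns are (correct, incorrect) counts
tables = [
    [[8, 2], [6, 4]],
    [[5, 5], [7, 3]],
    [[9, 1], [8, 2]],
]

# Sum of lower ("less") and upper ("greater") one-sided p-values
statistic = sum(
    fisher_exact(t, alternative="less")[1]
    + fisher_exact(t, alternative="greater")[1]
    for t in tables
)
print(statistic)
```

Note that for each table the lower and upper p-values sum to 1 plus the probability of the observed table, so each term lies in (1, 2].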

Relevance: 60.00%

Abstract:

When examining complex problems, such as the folding of proteins, coarse grained descriptions of the system drive our investigation and help us to rationalize the results. Oftentimes collective variables (CVs), derived through some chemical intuition about the process of interest, serve this purpose. Because finding these CVs is the most difficult part of any investigation, we recently developed a dimensionality reduction algorithm, sketch-map, that can be used to build a low-dimensional map of a phase space of high-dimensionality. In this paper we discuss how these machine-generated CVs can be used to accelerate the exploration of phase space and to reconstruct free-energy landscapes. To do so, we develop a formalism in which high-dimensional configurations are no longer represented by low-dimensional position vectors. Instead, for each configuration we calculate a probability distribution, which has a domain that encompasses the entirety of the low-dimensional space. To construct a biasing potential, we exploit an analogy with metadynamics and use the trajectory to adaptively construct a repulsive, history-dependent bias from the distributions that correspond to the previously visited configurations. This potential forces the system to explore more of phase space by making it desirable to adopt configurations whose distributions do not overlap with the bias. We apply this algorithm to a small model protein and succeed in reproducing the free-energy surface that we obtain from a parallel tempering calculation.
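The metadynamics analogy above can be caricatured in one dimension: accumulate a repulsive, history-dependent bias from the CV values the trajectory has already visited. This sketch replaces the per-configuration probability distributions of the paper with simple Gaussian kernels; the height w and width sigma are illustrative:

```python
import numpy as np

w, sigma = 0.1, 0.2   # illustrative Gaussian height and width
visited = []          # CV values of previously visited configurations

def bias(s):
    """Repulsive bias: sum of Gaussians centred on visited CV values."""
    return sum(w * np.exp(-((s - v) ** 2) / (2 * sigma**2)) for v in visited)

# A trajectory that lingers in a small region of CV space
for step_s in [0.0, 0.05, 0.1]:
    visited.append(step_s)

print(bias(0.05), bias(2.0))  # large where visited, near zero far away
```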

Relevance: 60.00%

Abstract:

AgentSpeak is a logic-based programming language, based on the Belief-Desire-Intention (BDI) paradigm, suitable for building complex agent-based systems. To limit the computational complexity, agents in AgentSpeak rely on a plan library to reduce the planning problem to the much simpler problem of plan selection. However, such a plan library is often inadequate when an agent is situated in an uncertain environment. In this paper, we propose the AgentSpeak+ framework, which extends AgentSpeak with a mechanism for probabilistic planning. The beliefs of an AgentSpeak+ agent are represented using epistemic states to allow an agent to reason about its uncertain observations and the uncertain effects of its actions. Each epistemic state consists of a POMDP, used to encode the agent’s knowledge of the environment, and its associated probability distribution (or belief state). In addition, the POMDP is used to select the optimal actions for achieving a given goal, even when facing uncertainty.
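The belief-state bookkeeping such an epistemic state performs is the standard POMDP update, b'(s') proportional to O(o | s', a) * sum_s T(s' | s, a) * b(s). A minimal sketch with an illustrative two-state transition and observation model (not taken from the AgentSpeak+ paper):

```python
import numpy as np

T = np.array([[0.9, 0.1],    # T[s, s'] for the chosen action
              [0.2, 0.8]])
O = np.array([0.7, 0.1])     # O[s'] = P(observation o | s', action)

b = np.array([0.5, 0.5])     # prior belief state

b_new = O * (T.T @ b)        # predict with T, then correct with O
b_new /= b_new.sum()         # renormalise to a probability distribution
print(b_new)                 # posterior belief state
```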


Relevance: 60.00%

Abstract:

This paper proposes a continuous time Markov chain (CTMC) based sequential analytical approach for composite generation and transmission systems reliability assessment. The basic idea is to construct a CTMC model for the composite system. Based on this model, sequential analyses are performed. Various kinds of reliability indices can be obtained, including expectation, variance, frequency, duration and probability distribution. In order to reduce the dimension of the state space, the traditional CTMC modeling approach is modified by merging all high-order contingencies into a single state, which can be calculated by Monte Carlo simulation (MCS). Then a state mergence technique is developed to integrate all normal states to further reduce the dimension of the CTMC model. Moreover, a time discretization method is presented for the CTMC model calculation. Case studies are performed on the RBTS and a modified IEEE 300-bus test system. The results indicate that sequential reliability assessment can be performed by the proposed approach. Compared with the traditional sequential Monte Carlo simulation method, the proposed method is more efficient, especially in small-scale or very reliable power systems.
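The CTMC machinery underlying such an assessment is easy to illustrate on the smallest case: a single repairable component with failure rate lam and repair rate mu, whose steady-state probabilities solve pi @ Q = 0 with sum(pi) = 1. The rates below are illustrative, and this two-state example is of course far simpler than the composite-system model of the paper:

```python
import numpy as np

lam, mu = 0.01, 1.0              # illustrative per-hour failure/repair rates
Q = np.array([[-lam, lam],       # generator matrix: rows sum to zero
              [mu, -mu]])

# Solve pi Q = 0 together with the normalisation sum(pi) = 1
A = np.vstack([Q.T, np.ones(2)])
rhs = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, rhs, rcond=None)
print(pi)  # [availability, unavailability] = [mu, lam] / (lam + mu)
```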

Relevance: 60.00%

Abstract:

This paper investigated the problem of confined flow under dams and water-retaining structures using stochastic modelling. The approach advocated in the study combined a finite element method, based on the equation governing the dynamics of incompressible fluid flow through a porous medium, with a random field generator that generates random hydraulic conductivity based on a lognormal probability distribution. The resulting model was then used to analyse confined flow under a hydraulic structure. Cases for a structure provided with a cutoff wall and without the wall were both tested. Various statistical parameters that reflected different degrees of heterogeneity were examined, and the changes in the mean seepage flow, the mean uplift force and the mean exit gradient observed under the structure were analysed. Results reveal that under heterogeneous conditions, the reduction made by the sheetpile in the uplift force and exit hydraulic gradient may be underestimated when deterministic solutions are used.
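The random-field ingredient described above amounts to sampling hydraulic conductivity so that ln(K) is normally distributed. A hedged sketch (the reference conductivity, variance and grid size are illustrative, and no spatial correlation structure is imposed here, unlike a full random field generator):

```python
import numpy as np

rng = np.random.default_rng(42)

# ln(K) ~ Normal(mean_lnK, sigma_lnK), with K a conductivity in m/s
mean_lnK, sigma_lnK = np.log(1e-4), 1.0

# One realisation of conductivity values on a 50 x 50 element grid
K = rng.lognormal(mean=mean_lnK, sigma=sigma_lnK, size=(50, 50))
print(K.mean(), np.log(K).std())  # sample moments of the field
```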

Relevance: 60.00%

Abstract:

We calculate the first two moments and full probability distribution of the work performed on a system of bosonic particles in a two-mode Bose-Hubbard Hamiltonian when the self-interaction term is varied instantaneously or with a finite-time ramp. In the instantaneous case, we show how the irreversible work scales differently depending on whether the system is driven to the Josephson or Fock regime of the bosonic Josephson junction. In the finite-time case, we use optimal control techniques to substantially decrease the irreversible work to negligible values. Our analysis can be implemented in present-day experiments with ultracold atoms and we show how to relate the work statistics to that of the population imbalance of the two modes.
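For the instantaneous quench, the work statistics follow from the two-point-measurement scheme: diagonalize the Hamiltonian before and after the quench and read off transition probabilities and work values. A small sketch for a two-mode Bose-Hubbard model (the particle number, tunnelling amplitude and interaction values are illustrative, not the paper's):

```python
import numpy as np

N, J = 4, 1.0   # illustrative particle number and tunnelling amplitude

def hamiltonian(U):
    """Two-mode Bose-Hubbard in the Fock basis |n, N-n>, n = 0..N."""
    n = np.arange(N + 1)
    # On-site interaction: (U/2) * [n1(n1-1) + n2(n2-1)]
    H = np.diag(0.5 * U * (n * (n - 1) + (N - n) * (N - n - 1)))
    # Tunnelling: <n+1, N-n-1| a1^dag a2 |n, N-n> = sqrt((n+1)(N-n))
    hop = -J * np.sqrt((n[:-1] + 1) * (N - n[:-1]))
    H += np.diag(hop, 1) + np.diag(hop, -1)
    return H

E0, V0 = np.linalg.eigh(hamiltonian(0.5))   # pre-quench spectrum
E1, V1 = np.linalg.eigh(hamiltonian(2.0))   # post-quench spectrum

psi = V0[:, 0]                    # start in the initial ground state
p = np.abs(V1.T.conj() @ psi) ** 2  # transition probabilities to |m'>
work = E1 - E0[0]                   # work values W = E'_m - E_0
mean_W = np.sum(p * work)           # first moment of P(W)
print(mean_W)
```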