951 results for Algorithmic Probability


Relevance: 20.00%

Abstract:

We consider the problem of detecting statistically significant sequential patterns in multineuronal spike trains. These patterns are characterized by ordered sequences of spikes from different neurons with specific delays between spikes. We have previously proposed a data-mining scheme for efficiently discovering such patterns that occur sufficiently often in the data. Here we propose a method to determine the statistical significance of such repeating patterns. The novelty of our approach is that we use a compound null hypothesis that includes not only models of independent neurons but also models in which neurons have weak dependencies. The strength of interaction among the neurons is represented in terms of certain pairwise conditional probabilities. We specify our null hypothesis by putting an upper bound on all such conditional probabilities. We construct a probabilistic model that captures the counting process and use it to derive a test of significance for rejecting such a compound null hypothesis. The structure of our null hypothesis also allows us to rank-order different significant patterns. We illustrate the effectiveness of our approach using spike trains generated with a simulator.
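
A rough illustration of how an upper bound on occurrence probabilities can be turned into a significance test is sketched below: the count of a candidate pattern over candidate windows is compared with a binomial tail. The window-based counting model and the names `pattern_count`, `n_windows` and `p_upper` are illustrative assumptions standing in for the paper's pairwise conditional-probability bounds, not the authors' construction.

```python
# Hedged sketch: significance of a repeating spike pattern via a binomial
# tail bound.  We assume the compound null caps the probability that the
# pattern occurs in any one candidate window at p_upper (a stand-in for the
# paper's bound on pairwise conditional probabilities), so the observed
# count is stochastically dominated by a Binomial(n_windows, p_upper) count.
from scipy.stats import binom

def pattern_p_value(pattern_count: int, n_windows: int, p_upper: float) -> float:
    """Upper bound on P(count >= pattern_count) under the compound null."""
    # Survival function at pattern_count - 1 gives P(X >= pattern_count).
    return binom.sf(pattern_count - 1, n_windows, p_upper)

if __name__ == "__main__":
    # Example: the pattern repeats 12 times in 10,000 candidate windows,
    # while the null allows at most a 0.0005 occurrence probability per window.
    p = pattern_p_value(12, 10_000, 5e-4)
    print(f"p-value bound: {p:.3g}")   # reject the compound null if small enough
```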

Relevance: 20.00%

Abstract:

The probability that a random process crosses an arbitrary level for the first time is expressed as a Gram-Charlier series, the leading term of which is the Poisson approximation. The coefficients of this series are related to the moments of the number of level crossings. The results are applicable to both stationary and non-stationary processes. Some numerical results are presented for the response process of a linear single-degree-of-freedom oscillator under Gaussian white noise excitation.
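
For orientation, the leading (Poisson) term of such a series can be written as below; the level a, the mean up-crossing rate obtained from Rice's formula and the joint density notation are assumed here for illustration rather than taken from the paper.

```latex
% Leading Poisson term for the first-passage probability of level a,
% with \nu_a(s) the mean up-crossing rate at time s (Rice's formula):
\[
  P\{T_a > t\} \;\approx\; \exp\!\left(-\int_0^t \nu_a(s)\,\mathrm{d}s\right),
  \qquad
  \nu_a(s) \;=\; \int_0^{\infty} \dot{x}\, p_{X\dot{X}}(a,\dot{x};s)\,\mathrm{d}\dot{x}.
\]
% The higher-order Gram-Charlier corrections involve the factorial moments
% of the number of level crossings in [0, t].
```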

Relevance: 20.00%

Abstract:

We report numerical and analytic results for the spatial survival probability for fluctuating one-dimensional interfaces with Edwards-Wilkinson or Kardar-Parisi-Zhang dynamics in the steady state. Our numerical results are obtained from analysis of steady-state profiles generated by integrating a spatially discretized form of the Edwards-Wilkinson equation to long times. We show that the survival probability exhibits scaling behavior in its dependence on the system size and the "sampling interval" used in the measurement for both "steady-state" and "finite" initial conditions. Analytic results for the scaling functions are obtained from a path-integral treatment of a formulation of the problem in terms of one-dimensional Brownian motion. A "deterministic approximation" is used to obtain closed-form expressions for survival probabilities from the formally exact analytic treatment. The resulting approximate analytic results provide a fairly good description of the numerical data.
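
A minimal sketch of the numerical side is given below, assuming an Euler-Maruyama integration of the spatially discretized Edwards-Wilkinson equation with periodic boundaries and a sign-based estimate of the spatial survival probability over windows of s consecutive sites; the lattice size, time step, noise strength and sampling schedule are illustrative choices, not the values used in the paper.

```python
# Hedged sketch: steady-state Edwards-Wilkinson profiles and the spatial
# survival probability (probability that h - <h> keeps one sign over a
# window of s consecutive sites).
import numpy as np

rng = np.random.default_rng(0)
L, dt, nu, D = 256, 0.05, 1.0, 1.0
h = np.zeros(L)

def ew_step(h):
    lap = np.roll(h, 1) - 2.0 * h + np.roll(h, -1)        # periodic Laplacian
    return h + dt * nu * lap + rng.normal(0.0, np.sqrt(2.0 * D * dt), L)

def spatial_survival(h, s):
    """Fraction of windows of s consecutive sites where h - <h> keeps one sign."""
    d = h - h.mean()
    w = np.lib.stride_tricks.sliding_window_view(d, s)
    return np.mean(np.all(w > 0, axis=1) | np.all(w < 0, axis=1))

for _ in range(100_000):                                  # relax to the steady state
    h = ew_step(h)

sizes = (4, 8, 16, 32)
acc = {s: [] for s in sizes}
for _ in range(200):                                      # sample well-separated profiles
    for _ in range(2_000):
        h = ew_step(h)
    for s in sizes:
        acc[s].append(spatial_survival(h, s))

for s in sizes:
    print(f"window {s:3d}: survival probability ~ {np.mean(acc[s]):.3f}")
```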

Relevance: 20.00%

Abstract:

The probability distribution of the eigenvalues of a second-order stochastic boundary value problem is considered. The solution is characterized in terms of the zeros of an associated initial value problem. It is further shown that the probability distribution is related to the solution of a first-order nonlinear stochastic differential equation. Solutions of this equation based on the theory of Markov processes and also on the closure approximation are presented. A string with stochastic mass distribution is considered as an example for numerical work. The theoretical probability distribution functions are compared with digital simulation results. The comparison is found to be reasonably good.
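
The sketch below illustrates only the digital-simulation side: a finite-difference generalized eigenvalue problem for a fixed-fixed string with a random (here lognormal) mass density, sampled by Monte Carlo to build the empirical distribution of the fundamental eigenvalue. The discretisation and the mass model are assumptions made for illustration; the paper's Markov-process and closure analyses are not reproduced here.

```python
# Hedged sketch: Monte Carlo estimate of the distribution of the fundamental
# eigenvalue of a fixed-fixed string of unit length with random mass density.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
n, T = 100, 1.0                          # interior grid points, string tension
dx = 1.0 / (n + 1)

# Finite-difference stiffness matrix for -T u'' with fixed ends.
K = (T / dx**2) * (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))

def fundamental_eigenvalue():
    rho = rng.lognormal(mean=0.0, sigma=0.2, size=n)   # random mass per unit length
    return eigh(K, np.diag(rho), eigvals_only=True, subset_by_index=[0, 0])[0]

samples = np.sort([fundamental_eigenvalue() for _ in range(2_000)])
cdf = np.arange(1, samples.size + 1) / samples.size    # empirical distribution function
print("median of lambda_1:", samples[samples.size // 2],
      "; uniform-string value would be pi^2 T =", np.pi**2 * T)
```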

Relevance: 20.00%

Abstract:

In this paper, a novel genetic algorithm is developed that generates artificial chromosomes with probability control to solve machine scheduling problems. Generating artificial chromosomes for a genetic algorithm (ACGA) is closely related to Evolutionary Algorithms Based on Probabilistic Models (EAPM). The artificial chromosomes are generated by a probability model that extracts gene information from the current population. ACGA is considered a hybrid algorithm because both conventional genetic operators and a probability model are integrated. The ACGA proposed in this paper further employs the "evaporation concept" used in Ant Colony Optimization (ACO) to solve the permutation flowshop problem. The evaporation concept reduces the effect of past experience and encourages exploration of new alternative solutions. We propose three different methods for the probability of evaporation, which is applied as soon as a job is assigned to a position in the permutation flowshop problem. Experimental results show that our ACGA with the evaporation concept performs better than several algorithms in the literature.
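
A compact sketch of the artificial-chromosome idea with evaporation is given below: a job-by-position frequency model is collected from the current population, old counts are discounted by an evaporation factor, and new permutations are sampled from the model. The parameter names, the single evaporation rule and the column-by-column sampling are illustrative assumptions, not the paper's three evaporation methods.

```python
# Hedged sketch of artificial-chromosome generation with evaporation.
import numpy as np

rng = np.random.default_rng(2)

def update_model(model, population, evaporation=0.1):
    """Evaporate old frequencies, then add counts from the current population."""
    model *= (1.0 - evaporation)
    for perm in population:
        for pos, job in enumerate(perm):
            model[job, pos] += 1.0
    return model

def artificial_chromosome(model):
    """Sample a permutation position by position from the frequency model."""
    n = model.shape[0]
    free = list(range(n))
    perm = []
    for pos in range(n):
        w = model[free, pos] + 1e-9            # avoid zero-probability stalls
        job = rng.choice(free, p=w / w.sum())
        perm.append(job)
        free.remove(job)
    return perm

n_jobs = 6
model = np.zeros((n_jobs, n_jobs))
population = [list(rng.permutation(n_jobs)) for _ in range(20)]
model = update_model(model, population)
print(artificial_chromosome(model))
```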

Relevance: 20.00%

Abstract:

The statistically steady humidity distribution resulting from an interaction of advection, modelled as an uncorrelated random walk of moist parcels on an isentropic surface, and a vapour sink, modelled as immediate condensation whenever the specific humidity exceeds a specified saturation humidity, is explored with theory and simulation. A source supplies moisture at the deep-tropical southern boundary of the domain, and the saturation humidity is specified as a monotonically decreasing function of distance from the boundary. The boundary source balances the interior condensation sink, so that a stationary, spatially inhomogeneous humidity distribution emerges. An exact solution of the Fokker-Planck equation delivers a simple expression for the resulting probability density function (PDF) of the water-vapour field and also of the relative humidity. This solution agrees completely with a numerical simulation of the process, and the humidity PDF exhibits several features of interest, such as bimodality close to the source and unimodality further from the source. The PDFs of specific and relative humidity are broad and non-Gaussian. The domain-averaged relative humidity PDF is bimodal with distinct moist and dry peaks, a feature which we show agrees with middleworld isentropic PDFs derived from the ERA-Interim dataset. Copyright (C) 2011 Royal Meteorological Society
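
A minimal Monte Carlo version of this advection-condensation process is sketched below; the exponential saturation profile, the reflecting walls, the resaturation rule near the moist boundary and the walk parameters are illustrative assumptions rather than the paper's setup or its exact Fokker-Planck solution.

```python
# Hedged sketch of the advection-condensation model: parcels random-walk in y,
# carry specific humidity q, condense instantly to the local saturation value
# q_s(y) whenever q exceeds it, and are resaturated near the moist boundary y = 0.
import numpy as np

rng = np.random.default_rng(3)
n, steps, dy = 20_000, 5_000, 0.02
qs = lambda y: np.exp(-3.0 * y)                    # decreasing saturation humidity

y = rng.random(n)                                  # parcel positions in [0, 1]
q = qs(y)                                          # start saturated

for _ in range(steps):
    y = y + dy * rng.choice([-1.0, 1.0], size=n)   # uncorrelated random walk
    y = np.abs(y)                                  # reflect at y = 0 ...
    y = np.where(y > 1.0, 2.0 - y, y)              # ... and at y = 1
    q[y < dy] = qs(0.0)                            # boundary source: resaturate
    q = np.minimum(q, qs(y))                       # condensation sink

rel_hum = q / qs(y)
pdf, edges = np.histogram(rel_hum, bins=50, range=(0.0, 1.0), density=True)
print("relative-humidity PDF, driest and moistest bins:", pdf[0], pdf[-1])
```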

Relevance: 20.00%

Abstract:

Evaluation of the probability of error in decision feedback equalizers is difficult because of the presence of a hard limiter in the feedback path. This paper derives upper and lower bounds on the probabilities of a single error and of multiple error patterns. The bounds are fairly tight and can also be used to select proper tap gains for the equalizer.

Relevance: 20.00%

Abstract:

Upper bounds on the probability of error due to co-channel interference are proposed in this correspondence. The bounds are easy to compute and can be fairly tight.

Relevance: 20.00%

Abstract:

A set of formulas is derived from the general circuit constants which facilitates formation of the impedance matrix of a power system by the bus-impedance method. The errors associated with the lumped-parameter representation of a transmission line are thereby eliminated. The formulas are also valid for short lines if the relevant general circuit constants are employed. The mutual impedance between the added line and the existing system is not considered, but the suggested approach can readily be extended to include it.
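
One standard route to such formulas is sketched below: the exact equivalent-pi of a line recovered from its general circuit constants, which is what removes the lumped (nominal-pi) error when the line is added to the bus-impedance matrix. The notation is assumed for illustration, and the paper's own formulas may be arranged differently.

```latex
% Assumed notation: A, B, C, D are the general circuit constants of the added
% line (A = D for a transmission line).  The exact equivalent-pi reproducing
% these constants has
\[
  Z_{\pi} \;=\; B,
  \qquad
  \frac{Y_{\pi}}{2} \;=\; \frac{A - 1}{B},
\]
% so line additions to the bus-impedance matrix can use these exact quantities
% instead of the series impedance and shunt admittance of a nominal-pi section.
```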

Relevance: 20.00%

Abstract:

The paper outlines a technique for sensitive measurement of conduction phenomena in liquid dielectrics. The special features of this technique are the simplicity of the electrical system, the inexpensive instrumentation and the high accuracy. Detection, separation and analysis of a random fluctuating component of current that is superimposed on the prebreakdown direct current form the basis of this investigation. Here, the prebreakdown direct current is the output of a test cell with large electrodes immersed in a liquid medium subjected to high direct voltages. Measurement of the probability-distribution function of the random fluctuating component of the current provides insight into the mechanism of conduction in a liquid medium subjected to high voltages and into the processes responsible for the existence of the fluctuating component.
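
A rough numerical analogue of the detection-separation-analysis step is sketched below, assuming the digitised cell current is available as samples; the synthetic drifting direct current, the moving-average separation and the empirical distribution estimate are illustrative stand-ins for the paper's instrumentation.

```python
# Hedged sketch: separating the random fluctuating component from the
# prebreakdown direct current and estimating its probability distribution.
# The signal below is synthetic; a real measurement would use the digitised
# test-cell current.
import numpy as np

rng = np.random.default_rng(4)
fs, T = 10_000.0, 2.0                                  # sample rate (Hz), record (s)
t = np.arange(0.0, T, 1.0 / fs)
i_dc = 5e-9 * (1.0 + 0.02 * t)                         # slowly drifting DC (A)
i_meas = i_dc + 2e-10 * rng.standard_normal(t.size)    # plus random fluctuation

# Separate the fluctuating component with a moving-average baseline.
win = 501
baseline = np.convolve(i_meas, np.ones(win) / win, mode="same")
i_fluct = i_meas - baseline

# Empirical probability distribution of the fluctuating component.
print("std of fluctuating component:", i_fluct.std())
print("5% / 95% points:", np.quantile(i_fluct, [0.05, 0.95]))
```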

Relevance: 20.00%

Abstract:

We propose an iterative algorithm to simulate the dynamics generated by any n-qubit Hamiltonian. The simulation entails decomposing the unitary time-evolution operator U into a product of different time-step unitaries. The algorithm product-decomposes U in a chosen operator basis by identifying a certain symmetry of U that is intimately related to the number of gates in the decomposition. We illustrate the algorithm by first obtaining a polynomial decomposition in the Pauli basis of the n-qubit quantum state transfer unitary of Di Franco et al. [Phys. Rev. Lett. 101, 230502 (2008)], which transports quantum information from one end of a spin chain to the other, and then implementing it in nuclear magnetic resonance to demonstrate that the decomposition is experimentally viable. We further experimentally test the resilience of the state transfer to static errors in the coupling parameters of the simulated Hamiltonian. This is done by decomposing and simulating the corresponding imperfect unitaries.
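
The sketch below shows only the bookkeeping of writing an operator in the Pauli basis via the Hilbert-Schmidt inner product, one ingredient such a decomposition needs; it is not the authors' iterative product-decomposition or its symmetry-based gate counting, and the CNOT example is chosen purely for illustration.

```python
# Hedged sketch: expand an n-qubit operator in the Pauli basis,
# c_k = Tr(P_k^dagger U) / 2^n, using the Hilbert-Schmidt inner product.
import itertools
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = {"I": I, "X": X, "Y": Y, "Z": Z}

def pauli_coefficients(U, n):
    """Return {label: Tr(P^dagger U) / 2^n} for all n-qubit Pauli strings."""
    coeffs = {}
    for labels in itertools.product("IXYZ", repeat=n):
        P = np.array([[1.0]], dtype=complex)
        for l in labels:
            P = np.kron(P, PAULIS[l])
        c = np.trace(P.conj().T @ U) / 2**n
        if abs(c) > 1e-12:
            coeffs["".join(labels)] = c
    return coeffs

# Example: a two-qubit CNOT written in the Pauli basis.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
for label, c in pauli_coefficients(CNOT, 2).items():
    print(label, np.round(c, 3))
```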

Relevance: 20.00%

Abstract:

We reconsider standard uniaxial fatigue test data obtained from handbooks. Many S-N curve fits to such data represent the median life and exclude load-dependent variance in life. Presently available approaches for incorporating probabilistic aspects explicitly within the S-N curves have some shortcomings, which we discuss. We propose a new linear S-N fit with a prespecified failure probability, load-dependent variance, and reasonable behavior at extreme loads. We fit our parameters using maximum likelihood, show the reasonableness of the fit using Q-Q plots, and obtain standard error estimates via Monte Carlo simulations. The proposed fitting method may be used for obtaining S-N curves from the same data as already available, with the same mathematical form, but in cases in which the failure probability is smaller, say, 10 % instead of 50 %, and in which the fitted line is not parallel to the 50 % (median) line.
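
One possible maximum-likelihood fit with load-dependent scatter and a prespecified failure probability is sketched below; the particular mean and standard-deviation parameterisations, the synthetic data and the Nelder-Mead optimiser are assumptions for illustration and need not match the paper's model.

```python
# Hedged sketch: fit log10(N) ~ Normal(a + b*log10(S), exp(c + d*log10(S)))
# by maximum likelihood, then report the line exceeded with probability
# 1 - p_fail (e.g. a 10% failure-probability S-N line).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fit_sn(S, N, p_fail=0.10):
    logS, logN = np.log10(S), np.log10(N)

    def neg_log_lik(theta):
        a, b, c, d = theta
        sigma = np.exp(c + d * logS)               # load-dependent scatter
        return -np.sum(norm.logpdf(logN, a + b * logS, sigma))

    slope, intercept = np.polyfit(logS, logN, 1)   # rough start for the mean line
    resid = logN - (intercept + slope * logS)
    x0 = [intercept, slope, np.log(resid.std()), 0.0]
    a, b, c, d = minimize(neg_log_lik, x0, method="Nelder-Mead").x

    def logN_at(S_query):
        """log10 life exceeded with probability 1 - p_fail at stress S_query."""
        ls = np.log10(S_query)
        return a + b * ls + norm.ppf(p_fail) * np.exp(c + d * ls)

    return (a, b, c, d), logN_at

# Synthetic illustration only: lives scattered about a straight median S-N line.
rng = np.random.default_rng(5)
S = np.repeat([200.0, 300.0, 400.0, 500.0], 10)
N = 10 ** (12.0 - 3.0 * np.log10(S) + rng.normal(0.0, 0.15, S.size))
params, q10 = fit_sn(S, N, p_fail=0.10)
print("10% failure-probability life at S = 350:", 10 ** q10(350.0))
```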

Relevance: 20.00%

Abstract:

For a fixed positive integer k, a k-tuple total dominating set of a graph G = (V, E) is a subset TD_k of V such that every vertex in V is adjacent to at least k vertices of TD_k. In the minimum k-tuple total dominating set problem (MIN k-TUPLE TOTAL DOM SET), a k-tuple total dominating set of minimum cardinality is sought, and DECIDE MIN k-TUPLE TOTAL DOM SET is its decision version. In this paper, we show that DECIDE MIN k-TUPLE TOTAL DOM SET is NP-complete for split graphs, doubly chordal graphs and bipartite graphs. For chordal bipartite graphs, we show that MIN k-TUPLE TOTAL DOM SET can be solved in polynomial time. We also present some hardness results and approximation algorithms for the MIN k-TUPLE TOTAL DOM SET problem. (c) 2012 Elsevier B.V. All rights reserved.
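
For concreteness, the sketch below gives a checker for the k-tuple total domination property together with a naive greedy heuristic (assuming minimum degree at least k, so that a feasible set exists); the greedy rule is purely illustrative and is not one of the approximation algorithms analysed in the paper.

```python
# Hedged sketch: verify and greedily construct a k-tuple total dominating set
# (every vertex must have at least k neighbours inside the set).
from collections import defaultdict

def is_k_tuple_total_dominating(adj, T, k):
    """adj: {v: set of neighbours}.  True iff every vertex has >= k neighbours in T."""
    return all(len(adj[v] & T) >= k for v in adj)

def greedy_k_tuple_total_dominating(adj, k):
    """Naive greedy heuristic; assumes every vertex has degree >= k."""
    need = {v: k for v in adj}                  # remaining demand of each vertex
    T = set()
    while any(d > 0 for d in need.values()):
        # Pick the vertex whose inclusion satisfies the most remaining demand.
        best = max((v for v in adj if v not in T),
                   key=lambda v: sum(1 for u in adj[v] if need[u] > 0))
        T.add(best)
        for u in adj[best]:
            if need[u] > 0:
                need[u] -= 1
    return T

# Small example: a 4-cycle with a chord, k = 1.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)
T = greedy_k_tuple_total_dominating(adj, 1)
print(T, is_k_tuple_total_dominating(adj, T, 1))
```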

Relevance: 20.00%

Abstract:

We study the tradeoff between the average error probability and the average queueing delay of messages that randomly arrive at the transmitter of a point-to-point discrete memoryless channel using variable-rate, fixed-codeword-length random coding. Bounds on the exponential decay rate of the average error probability with average queueing delay in the regime of large average delay are obtained. Upper and lower bounds on the optimal average delay for a given average error probability constraint are presented. We then formulate a constrained Markov decision problem for characterizing the rate of transmission as a function of queue size given an average error probability constraint. Using a Lagrange multiplier, the constrained Markov decision problem is converted into minimizing the average cost of an unconstrained Markov decision problem. A simple heuristic policy is proposed that approximately achieves the optimal average cost.
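
A toy version of the Lagrangian-relaxed average-cost problem is sketched below using relative value iteration; the Bernoulli arrivals, the tiny action set and the exp(-blocklength * (capacity - rate)) error model are illustrative stand-ins for the paper's random-coding formulation rather than its actual bounds.

```python
# Hedged sketch: relative value iteration for the Lagrangian-relaxed
# average-cost MDP.  State = queue length, action = messages packed into the
# current fixed-length codeword (a proxy for the rate), per-stage cost =
# queue length (delay proxy via Little's law) + LAMBDA * error probability.
import numpy as np

Q_MAX, P_ARR = 30, 0.4                          # queue cap, arrival probability/slot
CAPACITY, BLOCKLEN, LAMBDA = 1.0, 10.0, 100.0   # capacity, codeword length, multiplier
ACTIONS = [0, 1, 2]                             # messages per codeword
BITS_PER_MSG = 0.4                              # so rate = 0, 0.4 or 0.8 bits/use

def p_err(a):
    """Crude random-coding-style stand-in: error decays in the rate gap."""
    if a == 0:
        return 0.0
    return float(np.exp(-BLOCKLEN * max(CAPACITY - a * BITS_PER_MSG, 0.0)))

def next_states(q, a):
    """[(prob, next queue length)] with Bernoulli arrivals and a served messages."""
    base = max(q - min(a, q), 0)
    return [(P_ARR, min(Q_MAX, base + 1)), (1.0 - P_ARR, base)]

def q_value(q, a, h):
    return q + LAMBDA * p_err(a) + sum(p * h[nq] for p, nq in next_states(q, a))

h = np.zeros(Q_MAX + 1)
for _ in range(5_000):                          # relative value iteration
    Th = np.array([min(q_value(q, a, h) for a in ACTIONS) for q in range(Q_MAX + 1)])
    gain, h = Th[0], Th - Th[0]                 # gain approximates the average cost

policy = [min(ACTIONS, key=lambda a: q_value(q, a, h)) for q in range(Q_MAX + 1)]
print("estimated average cost:", gain)
print("messages per codeword vs queue length:", policy)
```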