951 results for Algorithmic Probability
Abstract:
A nonparametric probability estimation procedure using the fuzzy ARTMAP neural network is described here. Because the procedure makes no a priori assumptions about underlying probability distributions, it yields accurate estimates on a wide variety of prediction tasks. Fuzzy ARTMAP is used to perform probability estimation in two different modes. In a 'slow-learning' mode, input-output associations change slowly, with the strength of each association encoding a conditional probability estimate. In 'max-nodes' mode, a fixed number of categories are coded during an initial fast-learning interval, and the weights are then tuned by slow learning. Simulations illustrate system performance on tasks in which varying numbers of clusters in the set of input vectors are mapped to a given class.
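As a rough illustration of the slow-learning idea described above (a minimal sketch, not the fuzzy ARTMAP architecture itself), the following Python fragment maintains category-to-class association weights and nudges them toward observed outcomes with a small learning rate, so each weight drifts toward a conditional probability estimate P(class | category). The binning function, number of categories, and learning rate are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of slow-learning conditional-probability estimation.
# NOT fuzzy ARTMAP: categories here are fixed bins chosen up front,
# standing in for the categories ARTMAP would code during fast learning.

n_categories = 8      # assumed number of coded categories ("max-nodes")
n_classes = 2
lr = 0.01             # slow-learning rate (assumed)

weights = np.full((n_categories, n_classes), 1.0 / n_classes)

def category_of(x):
    """Illustrative stand-in for ARTMAP category choice: bin a scalar input in [0, 1)."""
    return min(int(x * n_categories), n_categories - 1)

def slow_learn(x, label):
    """Nudge the association strengths of the chosen category toward the observed label."""
    j = category_of(x)
    target = np.zeros(n_classes)
    target[label] = 1.0
    weights[j] += lr * (target - weights[j])   # weights[j] drifts toward P(class | category j)

def predict_proba(x):
    return weights[category_of(x)]

# Toy usage: class 1 becomes more likely as x grows.
rng = np.random.default_rng(0)
for _ in range(20000):
    x = rng.random()
    label = int(rng.random() < x)
    slow_learn(x, label)

print(predict_proba(0.9))   # roughly [0.1, 0.9]
```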
Abstract:
An incremental, nonparametric probability estimation procedure using the fuzzy ARTMAP neural network is introduced. In slow-learning mode, fuzzy ARTMAP searches for patterns of data on which to build ever more accurate estimates. In max-nodes mode, the network initially learns a fixed number of categories, and weights are then adjusted gradually.
Abstract:
We present a neural network that adapts and integrates several preexisting or new modules to categorize events in short-term memory (STM), encode temporal order in working memory, and evaluate timing and probability context in medium- and long-term memory. The model shows how processed contextual information modulates event recognition and categorization, focal attention, and incentive motivation. The model is based on a compendium of Event Related Potential (ERP) and behavioral results, either collected by the authors or compiled from the classical ERP literature. Its hallmark is, at the functional level, the interplay of memory registers endowed with widely different dynamical ranges, and, at the structural level, the attempt to relate the different modules to known anatomical structures.
Abstract:
The class of all Exponential-Polynomial-Trigonometric (EPT) functions is classical and equal to the Euler-d’Alembert class of solutions of linear differential equations with constant coefficients. The class of non-negative EPT functions defined on [0, ∞) was discussed in Hanzon and Holland (2010), of which EPT probability density functions are an important subclass. EPT functions can be represented as c e^{Ax} b, where A is a square matrix, b a column vector and c a row vector; the triple (A, b, c) is the minimal realization of the EPT function. The minimal triple is unique only up to a basis transformation. Here the class of 2-EPT probability density functions on R is defined and shown to be closed under a variety of operations. The class is also generalised to include mixtures with a point mass at zero. This class coincides with the class of probability density functions with rational characteristic functions. It is illustrated that the Variance Gamma density is a 2-EPT density under a parameter restriction. A discrete 2-EPT process is a process whose increments are stochastically independent 2-EPT random variables. It is shown that the distribution of the minimum and maximum of such a process is an EPT density mixed with a point mass at zero. The Laplace transforms of these distributions correspond to the discrete-time Wiener-Hopf factors of the discrete-time 2-EPT process. A distribution of daily log-returns, observed over the period 1931-2011 from a prominent US index, is approximated with a 2-EPT density function. Without the non-negativity condition, it is illustrated how this problem is transformed into a discrete-time rational approximation problem. The rational approximation software RARL2 is used to carry out this approximation. The non-negativity constraint is then imposed via a convex optimisation procedure after the unconstrained approximation. Necessary and sufficient conditions are derived to characterise infinitely divisible EPT and 2-EPT functions. Infinitely divisible 2-EPT density functions generate 2-EPT Lévy processes. An asset's log returns can be modelled as a 2-EPT Lévy process. Closed-form pricing formulae are then derived for European Options with specific times to maturity. Formulae for discretely monitored Lookback Options and 2-Period Bermudan Options are also provided. Certain Greeks, including Delta and Gamma, of these options are also computed analytically. MATLAB scripts are provided for calculations involving 2-EPT functions. Numerical option-pricing examples illustrate the effectiveness of the 2-EPT approach to financial modelling.
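A brief illustrative sketch of the representation c e^{Ax} b mentioned above: given a minimal triple (A, b, c), the function value at x is the row vector c times the matrix exponential of Ax times the column vector b. The particular triple below (a simple Erlang-type density) is an assumption chosen for illustration, not an example taken from the work itself, which uses MATLAB rather than Python.

```python
import numpy as np
from scipy.linalg import expm

def ept_value(A, b, c, x):
    """Evaluate the EPT function f(x) = c * exp(A x) * b at a point x >= 0."""
    return (c @ expm(A * x) @ b).item()

# Illustrative minimal realization (assumed example): the Erlang(2, 1) density
# f(x) = x * exp(-x), which is a non-negative EPT function on [0, inf).
A = np.array([[-1.0,  1.0],
              [ 0.0, -1.0]])
b = np.array([[0.0],
              [1.0]])
c = np.array([[1.0, 0.0]])

xs = np.linspace(0.0, 10.0, 1001)
vals = np.array([ept_value(A, b, c, x) for x in xs])

# Crude numerical check that this example integrates to roughly one,
# as a probability density should.
print(vals.sum() * (xs[1] - xs[0]))   # ~1.0
```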
Abstract:
© 2010 by the American Geophysical Union. The cross-scale probabilistic structure of rainfall intensity records collected over time scales ranging from hours to decades at sites dominated by both convective and frontal systems is investigated. Across these sites, intermittency build-up from slow to fast time scales is analyzed in terms of heavy-tailed and asymmetric signatures in the scale-wise evolution of rainfall probability density functions (pdfs). The analysis demonstrates that rainfall records dominated by convective storms develop heavier-tailed power-law pdfs toward finer scales when compared with their frontal-system counterparts. Also, a concomitant marked asymmetry build-up emerges at such finer time scales. A scale-dependent probabilistic description of such fat tails and asymmetry appearance is proposed based on a modified q-Gaussian model, able to describe the cross-scale rainfall pdfs in terms of the nonextensivity parameter q, a lacunarity (intermittency) correction and a tail asymmetry coefficient, linked to the rainfall generation mechanism.
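For reference, the standard symmetric q-Gaussian that the modified model above builds on can be sketched as below; the paper's lacunarity correction and tail asymmetry coefficient are not reproduced here, and the parameter values, grid, and numerical normalization are illustrative assumptions.

```python
import numpy as np

def q_gaussian_pdf(x, q=1.5, beta=1.0):
    """Symmetric q-Gaussian shape [1 - (1-q)*beta*x^2]_+^(1/(1-q)), normalized
    numerically. For 1 < q < 3 the tails decay as a power law (heavier for
    larger q); the q -> 1 limit recovers the ordinary Gaussian (not handled here)."""
    x = np.asarray(x, dtype=float)
    base = np.clip(1.0 - (1.0 - q) * beta * x**2, 0.0, None)
    shape = base ** (1.0 / (1.0 - q))
    # Numerical normalization on a wide grid (illustrative shortcut).
    grid = np.linspace(-50.0, 50.0, 200001)
    gbase = np.clip(1.0 - (1.0 - q) * beta * grid**2, 0.0, None)
    norm = np.sum(gbase ** (1.0 / (1.0 - q))) * (grid[1] - grid[0])
    return shape / norm

# Heavier tails for larger q (assumed values, not fitted to rainfall data).
x = np.array([0.0, 2.0, 5.0])
print(q_gaussian_pdf(x, q=1.2))
print(q_gaussian_pdf(x, q=1.7))   # noticeably fatter tail at |x| = 5
```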
Abstract:
'To Tremble the Zero: Art in the Age of Algorithmic Reproduction' is a philosophic, political and sensuous journey playing with (and against) Benjamin's 'Art in the Age of Mechanical Reproduction'. In an age inundated by the 'post-': postmodernity, posthuman, post art, postsexual, post-feminist, post-society, post-nation, etc., 'To Tremble the Zero' sets out to re/present the nature of what it means to do or make 'art', as well as what it means to be or have 'human/ity', when the ground is nothing other than the fractal, and algorithmically infinite, combinations of zero and one. The work will also address the unfortunate way in which modern forms of metaphysics continue to creep 'unsuspectingly' into our understanding of contemporary media/electronic arts, despite (or perhaps even because of) the attempts by Latour, Badiou, or Agamben, especially when addressing the zero/one as if it were a contradictory 'binary' rather than a kind of 'slice' or (to use Deleuze and Guattari) an immanent plane of immanence. This work argues that by retrieving Benjamin, Einstein, Gödel, and Haraway, a rather different story of art can be told.
Abstract:
The greatest relaxation time for an assembly of three-dimensional rigid rotators in an axially symmetric bistable potential is obtained exactly in terms of continued fractions as a sum of the zero-frequency decay functions (averages of the Legendre polynomials) of the system. This is accomplished by studying the entire time evolution of the Green function (transition probability), expanding the time-dependent distribution as a Fourier series and proceeding to the zero-frequency limit of the Laplace transform of that distribution. The procedure is entirely analogous to the calculation of the characteristic time of the probability evolution (the integral of the configuration-space probability density function with respect to the position coordinate) for a particle undergoing translational diffusion in a potential, a concept originally used by Malakhov and Pankratov (Physica A 229 (1996) 109). This procedure allowed them to obtain exact solutions of the Kramers one-dimensional translational escape rate problem for piecewise parabolic potentials. The solution was accomplished by posing the problem in terms of the appropriate Sturm-Liouville equation, which could be solved in terms of parabolic cylinder functions. The method (as applied to rotational problems and posed in terms of recurrence relations for the decay functions, i.e., the Brinkman approach, cf. Blomberg, Physica A 86 (1977) 49, as opposed to the Sturm-Liouville one) demonstrates clearly that the greatest relaxation time, unlike the integral relaxation time, which is governed by a single decay function (albeit coupled to all the others in non-linear fashion via the underlying recurrence relation), is governed by a sum of decay functions. The method is easily generalized to multidimensional state spaces by matrix continued fraction methods, allowing one to treat non-axially symmetric potentials, where the distribution function is governed by two state variables. © 2001 Elsevier Science B.V. All rights reserved.
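As general background for the "zero-frequency limit" construction referred to above (not a formula specific to the bistable-rotator problem), a normalized decay function C(t) with Laplace transform C~(s) yields a characteristic (integral) relaxation time in the standard way; the abstract's greatest relaxation time is instead built from a sum of such zero-frequency decay functions.

```latex
% Standard zero-frequency (integral) relaxation time of a decay function C(t):
% the zero-frequency limit of its Laplace transform, normalized by C(0).
\tau = \frac{1}{C(0)} \lim_{s \to 0} \tilde{C}(s)
     = \frac{1}{C(0)} \int_{0}^{\infty} C(t)\, dt ,
\qquad
\tilde{C}(s) = \int_{0}^{\infty} e^{-st}\, C(t)\, dt .
```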
Abstract:
Traditionally, the Internet provides only a “best-effort” service, treating all packets going to the same destination equally. However, providing differentiated services for different users based on their quality requirements is becoming an increasingly pressing issue. To do so, routers need the capability to distinguish and isolate traffic belonging to different flows; this ability to determine the flow each packet belongs to is called packet classification. Technology vendors are reluctant to support algorithmic solutions for classification due to their non-deterministic performance. Although CAMs are favoured by technology vendors for their deterministic, high lookup rates, they suffer from high power dissipation and high silicon cost. This paper provides a new algorithmic-architectural solution for packet classification that combines CAMs with algorithms based on multi-level cutting of the classification space into smaller spaces. The proposed solution exploits the geometrical distribution of rules in the classification space. It provides the deterministic performance of CAMs, support for dynamic updates, and added flexibility for system designers.
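A minimal sketch of the general multi-level cutting idea follows (the generic HiCuts-style scheme, not the paper's CAM-hybrid architecture): the rule space is cut recursively along alternating dimensions until each region holds few enough rules to fit in a small exact-match structure, the role the paper assigns to a CAM. The two-field rules, leaf size, and depth cap are illustrative assumptions.

```python
# Minimal sketch of multi-level cutting for packet classification.
# Rules are axis-aligned ranges over (src, dst); regions are cut in half
# until few rules remain, which would then fit in a small CAM (here: a linear scan).

LEAF_SIZE = 2  # assumed threshold: max rules kept in a leaf

def build(rules, region, depth=0):
    """rules: list of (rule_id, (src_lo, src_hi), (dst_lo, dst_hi)), in priority order."""
    if len(rules) <= LEAF_SIZE or depth > 16:
        return ("leaf", rules)
    dim = depth % 2                                  # alternate cut dimension per level
    lo, hi = region[dim]
    mid = (lo + hi) // 2
    left_region = list(region);  left_region[dim] = (lo, mid)
    right_region = list(region); right_region[dim] = (mid + 1, hi)
    left = [r for r in rules if r[1 + dim][0] <= mid]
    right = [r for r in rules if r[1 + dim][1] > mid]
    return ("node", dim, mid,
            build(left, left_region, depth + 1),
            build(right, right_region, depth + 1))

def classify(tree, pkt):
    """Walk the cuts, then linearly scan the small leaf (the CAM's role)."""
    while tree[0] == "node":
        _, dim, mid, left, right = tree
        tree = left if pkt[dim] <= mid else right
    for rule_id, (slo, shi), (dlo, dhi) in tree[1]:
        if slo <= pkt[0] <= shi and dlo <= pkt[1] <= dhi:
            return rule_id
    return None

rules = [
    (0, (0, 63),   (0, 255)),
    (1, (64, 127), (0, 127)),
    (2, (64, 127), (128, 255)),
    (3, (0, 255),  (0, 255)),    # default rule, lowest priority
]
tree = build(rules, [(0, 255), (0, 255)])
print(classify(tree, (70, 200)))   # -> 2
print(classify(tree, (200, 10)))   # -> 3
```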
Abstract:
We investigated familiarity and preference judgments of participants toward a novel musical system. We exposed participants to tone sequences generated from a novel pitch probability profile. Afterward, we asked participants either to identify the more familiar or to identify the preferred tone sequences in a two-alternative forced-choice task. The task paired a tone sequence generated from the pitch probability profile to which they had been exposed with a tone sequence generated from another pitch probability profile, at three levels of distinctiveness. We found that participants identified tone sequences as more familiar if they were generated from the same pitch probability profile to which they had been exposed. However, participants did not prefer these tone sequences. We interpret this relationship between familiarity and preference as consistent with an inverted U-shaped relationship between knowledge and affect. The fact that participants identified tone sequences as even more familiar if they were generated from the more distinctive (caricatured) version of the pitch probability profile to which they had been exposed suggests that statistical learning of the pitch probability profile is involved in the acquisition of musical knowledge.
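As an illustration of what generating a tone sequence from a pitch probability profile amounts to, the sketch below draws pitches independently according to a weight assigned to each pitch class; the profiles, sequence length, and the squaring used to "caricature" the profile are assumptions for illustration, not the stimuli used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative pitch probability profile over 12 chromatic pitch classes
# (assumed values, not the novel profile used in the study).
exposure_profile = np.array([8, 1, 4, 1, 6, 4, 1, 7, 1, 4, 1, 3], dtype=float)
exposure_profile /= exposure_profile.sum()

# A "caricatured" version exaggerates the profile's peaks and troughs
# (squaring is one simple, assumed way to sharpen the distribution).
caricature = exposure_profile ** 2
caricature /= caricature.sum()

def generate_sequence(profile, length=16):
    """Draw a tone sequence with pitch classes sampled i.i.d. from the profile."""
    return rng.choice(len(profile), size=length, p=profile)

print(generate_sequence(exposure_profile))
print(generate_sequence(caricature))
```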