957 results for probability distributions
Abstract:
Fast forward-error-correction codes are becoming an important component in bulk content delivery. They fit naturally into multicast scenarios as a way to deal with losses and are now seeing use in peer-to-peer networks as a basis for distributing load. In particular, new irregular sparse parity-check codes have been developed with provable average linear-time performance, a significant improvement over previous codes. In this paper, we present a new heuristic for generating codes with similar performance, based on observing a server with an oracle for client state. This heuristic is easy to implement and provides further intuition into the need for an irregular heavy-tailed distribution.
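The heuristic itself is not given in the abstract, but the irregular heavy-tailed degree distributions it points to are well illustrated by the ideal soliton distribution used in LT-style sparse codes; the following is a sketch of that standard distribution, not this paper's construction:

```python
# Ideal soliton degree distribution: a heavy-tailed choice of how many
# source symbols each encoded symbol combines. Most encoded symbols get
# a low degree, but the slowly decaying 1/(i*(i-1)) tail ensures that
# every source symbol is eventually covered.
def ideal_soliton(k):
    """Return p where p[i] = P(degree = i) for i = 1..k (p[0] unused)."""
    p = [0.0] * (k + 1)
    p[1] = 1.0 / k
    for i in range(2, k + 1):
        p[i] = 1.0 / (i * (i - 1))
    return p  # sums to 1: 1/k plus the telescoping sum (1 - 1/k)
```

Practical fountain codes use a robust variant of this distribution; the point here is only the shape the abstract argues for: a spike at low degree plus a heavy tail.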
Abstract:
A novel approach for real-time skin segmentation in video sequences is described. The approach enables reliable skin segmentation despite wide variation in illumination during tracking. An explicit second-order Markov model is used to predict the evolution of the skin color (HSV) histogram over time. Histograms are dynamically updated based on feedback from the current segmentation and on predictions of the Markov model. The evolution of the skin color distribution at each frame is parameterized by translation, scaling, and rotation in color space. Consequent changes in the geometric parameterization of the distribution are propagated by warping and re-sampling the histogram. The parameters of the discrete-time dynamic Markov model are estimated using maximum likelihood estimation and also evolve over time. Quantitative evaluation of the method was conducted on labeled ground-truth video sequences taken from popular movies.
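The histogram warping step can be illustrated in one dimension (a deliberate simplification: the paper's histograms are 3-D HSV histograms, and rotation only makes sense in the full color space; the function below is an invented illustration, not the paper's procedure):

```python
def warp_histogram(hist, bin_edges, shift, scale):
    """Re-sample a 1-D histogram after an affine warp x -> scale*x + shift
    of the underlying values. Mass warped outside the bin range is dropped."""
    n = len(hist)
    new_hist = [0.0] * n
    for i, mass in enumerate(hist):
        center = 0.5 * (bin_edges[i] + bin_edges[i + 1])
        warped = scale * center + shift
        # Deposit the bin's mass into the bin containing the warped center.
        for j in range(n):
            if bin_edges[j] <= warped < bin_edges[j + 1]:
                new_hist[j] += mass
                break
    return new_hist
```

The same idea, applied with a translation/scaling/rotation transform to 3-D color bins, propagates a parameterized change of the color distribution without re-estimating the histogram from scratch.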
Abstract:
The increasing practicality of large-scale flow capture makes it possible to conceive of traffic analysis methods that detect and identify a large and diverse set of anomalies. However, the challenge of effectively analyzing this massive data source for anomaly diagnosis is as yet unmet. We argue that the distributions of packet features (IP addresses and ports) observed in flow traces reveal both the presence and the structure of a wide range of anomalies. Using entropy as a summarization tool, we show that the analysis of feature distributions leads to significant advances on two fronts: (1) it enables highly sensitive detection of a wide range of anomalies, augmenting detections by volume-based methods, and (2) it enables automatic classification of anomalies via unsupervised learning. We show that, using feature distributions, anomalies naturally fall into distinct and meaningful clusters. These clusters can be used to automatically classify anomalies and to uncover new anomaly types. We validate our claims on data from two backbone networks (Abilene and Geant) and conclude that feature distributions show promise as a key element of a fairly general network anomaly diagnosis framework.
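The entropy summarization can be sketched in a few lines (the toy traffic below is invented for illustration; the paper's feature set and detection thresholds are not reproduced here):

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of the empirical distribution of values."""
    counts = Counter(values)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Toy example: a port scan spreads traffic over many destination ports,
# so destination-port entropy rises even if total volume barely changes.
normal_ports = [80] * 90 + [443] * 10   # concentrated -> low entropy
scan_ports = list(range(100))           # dispersed -> high entropy
```

Tracking such entropy time series per feature (source/destination address, source/destination port) is what lets distribution-based detection catch anomalies that leave traffic volume unchanged.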
Abstract:
An incremental, nonparametric probability estimation procedure using the fuzzy ARTMAP neural network is introduced. In slow-learning mode, fuzzy ARTMAP searches for patterns of data on which to build ever more accurate estimates. In max-nodes mode, the network initially learns a fixed number of categories, and weights are then adjusted gradually.
Abstract:
We present a neural network that adapts and integrates several preexisting or new modules to categorize events in short-term memory (STM), encode temporal order in working memory, and evaluate timing and probability context in medium- and long-term memory. The model shows how processed contextual information modulates event recognition and categorization, focal attention, and incentive motivation. The model is based on a compendium of event-related potentials (ERPs) and behavioral results either collected by the authors or compiled from the classical ERP literature. Its hallmark is, at the functional level, the interplay of memory registers endowed with widely different dynamical ranges, and at the structural level, the attempt to relate the different modules to known anatomical structures.
Abstract:
A popular way to account for unobserved heterogeneity is to assume that the data are drawn from a finite mixture distribution. A barrier to using finite mixture models is that parameters that could previously be estimated in stages must now be estimated jointly: using mixture distributions destroys any additive separability of the log-likelihood function. We show, however, that an extension of the EM algorithm reintroduces additive separability, thus allowing one to estimate parameters sequentially during each maximization step. In establishing this result, we develop a broad class of estimators for mixture models. Returning to the likelihood problem, we show that, relative to full information maximum likelihood, our sequential estimator can generate large computational savings with little loss of efficiency.
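A minimal EM loop for a two-component Gaussian mixture (unit variances; invented data in the usage below, and far simpler than the paper's general sequential estimator) makes the separability point concrete: once responsibilities are fixed, each parameter solves its own weighted subproblem.

```python
import math

def em_two_gaussians(xs, iters=200):
    """EM for the mixture w*N(m1, 1) + (1 - w)*N(m2, 1)."""
    m1, m2, w = min(xs), max(xs), 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point.
        r = []
        for x in xs:
            p1 = w * math.exp(-0.5 * (x - m1) ** 2)
            p2 = (1 - w) * math.exp(-0.5 * (x - m2) ** 2)
            r.append(p1 / (p1 + p2))
        # M-step: the weighted log-likelihood is additively separable,
        # so each parameter has its own closed-form update.
        s1 = sum(r)
        s2 = len(xs) - s1
        m1 = sum(ri * x for ri, x in zip(r, xs)) / s1
        m2 = sum((1 - ri) * x for ri, x in zip(r, xs)) / s2
        w = s1 / len(xs)
    return m1, m2, w
```

Run on data clustered near 0 and near 5, the loop recovers means close to the cluster averages and a mixing weight near one half.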
A mathematical theory of stochastic microlensing. II. Random images, shear, and the Kac-Rice formula
Abstract:
Continuing our development of a mathematical theory of stochastic microlensing, we study the random shear and expected number of random lensed images of different types. In particular, we characterize the first three leading terms in the asymptotic expression of the joint probability density function (pdf) of the random shear tensor due to point masses in the limit of an infinite number of stars. Up to this order, the pdf depends on the magnitude of the shear tensor, the optical depth, and the mean number of stars through a combination of radial position and the star's mass. As a consequence, the pdfs of the shear components are seen to converge, in the limit of an infinite number of stars, to shifted Cauchy distributions, which shows that the shear components have heavy tails in that limit. The asymptotic pdf of the shear magnitude in the limit of an infinite number of stars is also presented. All the results on the random microlensing shear are given for a general point in the lens plane. Extending to general random distributions (not necessarily uniform) of the lenses, we employ the Kac-Rice formula and Morse theory to deduce general formulas for the expected total number of images and the expected number of saddle images. We further generalize these results by considering random sources defined on a countable compact covering of the light source plane. This is done to introduce the notion of the global expected number of positive-parity images due to a general lensing map. Applying the result to microlensing, we calculate the asymptotic global expected number of minimum images in the limit of an infinite number of stars, where the stars are uniformly distributed. This global expectation is bounded, while the global expected number of images and the global expected number of saddle images diverge as the order of the number of stars. © 2009 American Institute of Physics.
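For orientation, a shifted Cauchy density (with generic center $\gamma_0$ and scale $b$; the paper's actual parameters depend on the optical depth and the stellar mass distribution) has the heavy-tailed form

```latex
p(\gamma) \;=\; \frac{1}{\pi}\,\frac{b}{(\gamma - \gamma_0)^2 + b^2},
\qquad
p(\gamma) \;\sim\; \frac{b}{\pi\,\gamma^{2}} \quad (|\gamma| \to \infty),
```

so even the mean of the shear components fails to exist in the infinite-star limit.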
Abstract:
The long-term soil carbon dynamics may be approximated by networks of linear compartments, permitting theoretical analysis of transit time (i.e., the total time spent by a molecule in the system) and age (the time elapsed since the molecule entered the system) distributions. We compute and compare these distributions for different network configurations, ranging from the simple individual compartment to series and parallel linear compartments, feedback systems, and models assuming a continuous distribution of decay constants. We also derive the transit time and age distributions of some complex, widely used soil carbon models (the compartmental models CENTURY and Rothamsted, and the continuous-quality Q-Model), and discuss them in the context of long-term carbon sequestration in soils. We show how complex models including feedback loops and slow compartments have distributions with heavier tails than simpler models. Power law tails emerge when using continuous-quality models, indicating long retention times for an important fraction of soil carbon. The responsiveness of the soil system to changes in decay constants due to altered climatic conditions or plant species composition is found to be stronger when all compartments respond equally to the environmental change, and when the slower compartments are more sensitive than the faster ones or lose more carbon through microbial respiration. Copyright 2009 by the American Geophysical Union.
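As a baseline for the network comparison (these two formulas are standard for linear compartment models and are not quoted from the paper), the transit-time density of a single compartment with decay constant $k$ is exponential, and that of $n$ identical compartments in series is a Gamma density:

```latex
p_{1}(T) \;=\; k\,e^{-kT},
\qquad
p_{n}(T) \;=\; \frac{k^{n}\,T^{\,n-1}}{(n-1)!}\,e^{-kT}.
```

Mixing compartments with a continuous distribution of decay constants, as in the continuous-quality models, is what produces the heavier, power-law tails.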
Abstract:
A novel approach is proposed to estimate the natural streamflow regime of a river and to assess the extent of the alterations induced by dam operation related to anthropogenic (e.g., agricultural, hydropower) water uses in engineered river basins. The method consists of comparing the seasonal probability density function (pdf) of observed streamflows with the purportedly natural streamflow pdf obtained from a recently proposed and validated probabilistic model. The model employs a minimum of landscape and climate parameters and unequivocally separates the effects of anthropogenic regulation from those produced by hydroclimatic fluctuations. The approach is applied to evaluate the extent of the alterations of intra-annual streamflow variability in a highly engineered alpine catchment of north-eastern Italy, the Piave river. Streamflows observed downstream of the regulation devices in the Piave catchment are found to exhibit smaller means/modes, larger coefficients of variation, and more pronounced peaks than the flows that would be observed in the absence of anthropogenic regulation, suggesting that the anthropogenic disturbance leads to remarkable reductions of river flows, with an increase of the streamflow variability and of the frequency of preferential states far from the mean. Some structural limitations of management approaches based on minimum streamflow requirements (widely used to guide water policies), as opposed to criteria based on whole distributions, are also discussed. Copyright © 2010 by the American Geophysical Union.
Abstract:
The cross-scale probabilistic structure of rainfall intensity records collected over time scales ranging from hours to decades at sites dominated by both convective and frontal systems is investigated. Across these sites, intermittency build-up from slow to fast time scales is analyzed in terms of heavy-tailed and asymmetric signatures in the scale-wise evolution of rainfall probability density functions (pdfs). The analysis demonstrates that rainfall records dominated by convective storms develop heavier-tailed power-law pdfs toward finer scales than their frontal-system counterparts. A concomitant marked asymmetry build-up also emerges at such finer time scales. A scale-dependent probabilistic description of the appearance of these fat tails and this asymmetry is proposed, based on a modified q-Gaussian model able to describe the cross-scale rainfall pdfs in terms of the nonextensivity parameter q, a lacunarity (intermittency) correction, and a tail asymmetry coefficient linked to the rainfall generation mechanism. © 2010 by the American Geophysical Union.
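For reference, the standard (unmodified) q-Gaussian that such models build on has, for $q > 1$, the form

```latex
p_q(x) \;\propto\; \left[\,1 + (q-1)\,\beta x^{2}\,\right]^{-\frac{1}{q-1}},
```

which recovers a Gaussian as $q \to 1$ and has power-law tails $p_q(x) \sim |x|^{-2/(q-1)}$ for $q > 1$, so the fitted q directly measures tail heaviness; the lacunarity and asymmetry corrections are the paper's modifications and are not shown here.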
Abstract:
We develop a model for stochastic processes with random marginal distributions. Our model relies on a stick-breaking construction for the marginal distribution of the process, and introduces dependence across locations by using a latent Gaussian copula model as the mechanism for selecting the atoms. The resulting latent stick-breaking process (LaSBP) induces a random partition of the index space, with points closer in space having a higher probability of being in the same cluster. We develop an efficient and straightforward Markov chain Monte Carlo (MCMC) algorithm for computation and discuss applications in financial econometrics and ecology. This article has supplementary material online.
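The stick-breaking construction at the core of the model can be sketched as follows (a plain Dirichlet-process-style sketch; the LaSBP's latent Gaussian copula for selecting atoms across locations is not reproduced here):

```python
import random

def stick_breaking(alpha, n_atoms, seed=0):
    """First n_atoms stick-breaking weights:
    w_k = v_k * prod_{j<k}(1 - v_j), with v_k ~ Beta(1, alpha).
    A larger alpha spreads mass over more atoms."""
    rng = random.Random(seed)
    weights, remaining = [], 1.0
    for _ in range(n_atoms):
        v = rng.betavariate(1.0, alpha)
        weights.append(v * remaining)   # break off a fraction of the stick
        remaining *= 1.0 - v            # what is left to break later
    return weights
```

Pairing each weight with a random atom yields a draw of a random discrete distribution; the LaSBP then correlates the atom choices across index-space locations through the copula, so nearby points tend to share clusters.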
Abstract:
Twelve months of aerosol size distributions from 3 to 560 nm, measured using scanning mobility particle sizers, are presented with an emphasis on average number, surface, and volume distributions, and on seasonal and diurnal variation. The measurements were made at the main sampling site of the Pittsburgh Air Quality Study from July 2001 to June 2002. These are supplemented with 5 months of size distribution data from 0.5 to 2.5 μm measured with a TSI aerosol particle sizer and 2 months of size distributions measured at an upwind rural sampling site. Measurements at the main site were made continuously under both low and ambient relative humidity. The average Pittsburgh number concentration (3–500 nm) is 22,000 cm⁻³, with an average mode size of 40 nm. Strong diurnal patterns in number concentrations are evident as a direct effect of the sources of particles (atmospheric nucleation, traffic, and other combustion sources). New particle formation from homogeneous nucleation is significant on 30–50% of study days and over a wide area (at least a hundred kilometers). Rural number concentrations are a factor of 2–3 lower (on average) than the urban values. Average measured distributions are different from model literature urban and rural size distributions. © 2004 Elsevier Ltd. All rights reserved.
Abstract:
Earth's surface is rapidly urbanizing, resulting in dramatic changes in the abundance, distribution, and character of surface water features in urban landscapes. However, the scope and consequences of surface water redistribution at broad spatial scales are not well understood. We hypothesized that urbanization would lead to convergent surface water abundance and distribution: in other words, cities will gain or lose water such that they become more similar to each other than are their surrounding natural landscapes. Using a database of more than 1 million water bodies and 1 million km of streams, we compared the surface water of 100 US cities with their surrounding undeveloped land. We evaluated differences in areal (A_WB) and numeric (N_WB) densities of water bodies (lakes, wetlands, and so on), the morphological characteristics of water bodies (size), and the density (D_C) of surface flow channels (that is, streams and rivers). The variance of urban A_WB, N_WB, and D_C across the 100 metropolitan statistical areas (MSAs) decreased by 89, 25, and 71%, respectively, compared to undeveloped land. These data show that many cities are surface water poor relative to undeveloped land; however, in drier landscapes urbanization increases the occurrence of surface water. This convergence pattern strengthened with development intensity, such that high-intensity urban development had an areal water body density 98% less than undeveloped lands. Urbanization appears to drive the convergence of hydrological features across the US, such that surface water distributions of cities are more similar to each other than to their surrounding landscapes. © 2014 The Author(s).
Abstract:
The neutron multidetector DéMoN has been used to investigate the symmetric splitting dynamics in the reactions 58,64Ni + 208Pb with excitation energies ranging from 65 to 186 MeV for the composite system. An analysis based on the new backtracing technique has been applied to the neutron data to determine the two-dimensional correlations between the parent composite system's initial thermal energy (E_th^CN) and the total neutron multiplicity (ν_tot), and between pre- and post-scission neutron multiplicities (ν_pre and ν_post, respectively). The shape of the ν_pre distribution indicates the possible coexistence of fast fission and fusion-fission for the system 58Ni + 208Pb (E_beam = 8.86 A MeV). The analysis of the neutron multiplicities in the framework of the combined dynamical statistical model (CDSM) gives a reduced friction coefficient β = 23 (+25/−12) × 10^21 s^−1, above the one-body dissipation limit. The corresponding fission time is τ_f = 40 (+46/−20) × 10^−21 s. © 1999 Elsevier Science B.V. All rights reserved.