23 results for Probability Distribution Function

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance: 90.00%

Abstract:

Marshall's (1970) lemma is an analytical result which implies √n-consistency of the distribution function corresponding to the Grenander (1956) estimator of a non-increasing probability density. The present paper derives analogous results for the setting of convex densities on [0, ∞).
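For context, a minimal sketch of the Grenander estimator itself (not code from the paper; it assumes strictly positive observations): the estimator is the left derivative of the least concave majorant of the empirical distribution function, computable with an upper convex-hull scan over the sorted data.

```python
import numpy as np

def grenander(sample):
    """Grenander estimator of a non-increasing density on [0, inf).

    Returns (breakpoints, heights): the estimator is piecewise constant,
    equal to heights[j] on (breakpoints[j], breakpoints[j + 1]].  It is the
    left derivative of the least concave majorant of the empirical CDF.
    """
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    # ECDF vertices (0, 0), (x_(1), 1/n), ..., (x_(n), 1).
    pts = np.column_stack([np.concatenate([[0.0], x]),
                           np.arange(n + 1) / n])
    # Upper-hull scan: pop the middle point whenever it lies on or
    # below the chord from hull[-2] to the incoming point.
    hull = []
    for p in pts:
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            if (x2 - x1) * (p[1] - y1) >= (p[0] - x1) * (y2 - y1):
                hull.pop()
            else:
                break
        hull.append((p[0], p[1]))
    bx = np.array([h[0] for h in hull])
    by = np.array([h[1] for h in hull])
    heights = np.diff(by) / np.diff(bx)   # slopes = density values
    return bx, heights
```

By concavity of the majorant the returned heights are non-increasing, and the total rise of the majorant from 0 to 1 means the piecewise-constant density integrates to one.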

Relevance: 90.00%

Abstract:

In this paper, we show statistical analyses of several types of traffic sources in a 3G network, namely voice, video and data sources. For each traffic source type, measurements were collected in order to, on the one hand, gain a better understanding of the statistical characteristics of the sources and, on the other hand, enable forecasting of traffic behaviour in the network. The latter can be used to estimate service times and quality of service parameters. The probability density function, mean, variance, mean square deviation, skewness and kurtosis of the interarrival times are estimated with the Wolfram Mathematica and Crystal Ball statistical tools. Based on the evaluation of packet interarrival times, we show how the gamma distribution can be used in network simulations and in the evaluation of available capacity in opportunistic systems. As a result of our analyses, shape and scale parameters of the gamma distribution are obtained. The data can also be applied in dynamic network configuration in order to avoid potential network congestion or overflows. Copyright © 2013 John Wiley & Sons, Ltd.
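As a rough illustration of how gamma shape and scale parameters can be obtained from measured interarrival times — a minimal method-of-moments sketch, not the Mathematica/Crystal Ball workflow used in the paper, and with invented sample data:

```python
import statistics

def gamma_moments_fit(interarrival_times):
    """Method-of-moments estimates of gamma shape (k) and scale (theta).

    For a gamma distribution, mean = k * theta and variance = k * theta**2,
    hence k = mean**2 / variance and theta = variance / mean.
    """
    mean = statistics.fmean(interarrival_times)
    var = statistics.variance(interarrival_times)
    return mean ** 2 / var, var / mean

# Hypothetical packet interarrival times in milliseconds.
samples = [1.2, 0.8, 2.5, 1.1, 0.6, 3.0, 1.4, 0.9, 2.1, 1.3]
k, theta = gamma_moments_fit(samples)
print(f"shape={k:.3f}, scale={theta:.3f}")
```

The fitted pair (k, theta) can then parameterize gamma-distributed arrivals in a network simulator.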

Relevance: 90.00%

Abstract:

We study the existence of random elements with partially specified distributions. The technique relies on the existence of a positive extension for linear functionals, accompanied by additional conditions that ensure the regularity of the extension needed for interpreting it as a probability measure. It is shown in which cases the extension can be chosen to possess some invariance properties. The results are applied to the existence of point processes with a given correlation measure and of random closed sets with a given two-point covering function or contact distribution function. It is shown that the regularity condition can be efficiently checked in many cases in order to ensure that the obtained point processes are indeed locally finite and the random sets have closed realisations.

Relevance: 90.00%

Abstract:

We analyse the variability of the probability distribution of daily wind speed in wintertime over Northern and Central Europe in a series of global and regional climate simulations covering the last centuries, and in reanalysis products covering approximately the last 60 years. The focus of the study lies on identifying the link of the variations in the wind speed distribution to the regional near-surface temperature, to the meridional temperature gradient and to the North Atlantic Oscillation. Our main result is that the link between the daily wind distribution and the regional climate drivers is strongly model dependent. The global models tend to behave similarly, although they show some discrepancies. The two regional models also tend to behave similarly to each other, but surprisingly the results derived from each regional model deviate strongly from the results derived from its driving global model. In addition, considering multi-centennial timescales, we find in two global simulations a long-term tendency for the probability distribution of daily wind speed to widen through the last centuries. The cause for this widening is likely the effect of the deforestation prescribed in these simulations. We conclude that no clear systematic relationship between the mean temperature, the temperature gradient and/or the North Atlantic Oscillation and the daily wind speed statistics can be inferred from these simulations. The understanding of past and future changes in the distribution of wind speeds, and thus of wind speed extremes, will require a detailed analysis of the representation of the interaction between large-scale and small-scale dynamics.

Relevance: 80.00%

Abstract:

The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represents, the gridding algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding algorithm presented in this work was developed in order to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve was used to approximate the integrated data set. The behaviour of the interpolation function, i.e. the amount of overshoot and undershoot, is controlled by a single user-adjustable parameter. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding algorithms using linear and cubic interpolation functions.
It is shown that such interpolation functions may overestimate or underestimate the source data by about 10–20%, while the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of X-ray CT images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
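The core idea — interpolate the integrated (cumulative) data with a shape-preserving Hermite curve, then difference it on the new grid — can be sketched as follows. This is a simplified, non-parametrized variant using SciPy's monotone PCHIP interpolant rather than the paper's tunable Hermitian curve, with invented example data:

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def rebin_conservative(old_edges, old_values, new_edges):
    """Re-bin histogrammed data onto new bin edges, conserving the integral.

    The cumulative integral of the input histogram is interpolated with a
    monotone (shape-preserving) Hermite curve; differencing that curve at
    the new edges yields the re-binned values.  Monotonicity guarantees
    non-negative output for non-negative input, avoiding the overshoots of
    high-order polynomial interpolation.
    """
    widths = np.diff(old_edges)
    cumulative = np.concatenate([[0.0], np.cumsum(old_values * widths)])
    curve = PchipInterpolator(old_edges, cumulative)
    clipped = np.clip(new_edges, old_edges[0], old_edges[-1])
    new_mass = np.diff(curve(clipped))
    return new_mass / np.diff(new_edges)

# Hypothetical example: refine a coarse 4-bin histogram to 8 bins.
old_edges = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
old_values = np.array([2.0, 5.0, 1.0, 3.0])
new_edges = np.linspace(0.0, 4.0, 9)
new_values = rebin_conservative(old_edges, old_values, new_edges)
```

Because the interpolant passes exactly through the cumulative data and the new grid spans the same range, the total integral is conserved to machine precision.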

Relevance: 80.00%

Abstract:

Fossil pollen data from stratigraphic cores are irregularly spaced in time due to non-linear age-depth relations. Moreover, their marginal distributions may vary over time. We address these features in a nonparametric regression model with errors that are monotone transformations of a latent continuous-time Gaussian process Z(T). Although Z(T) is unobserved, due to monotonicity and under suitable regularity conditions, it can be recovered, facilitating further computations such as estimation of the long-memory parameter and the Hermite coefficients. The estimation of Z(T) itself involves estimation of the marginal distribution function of the regression errors. These issues are considered in proposing a plug-in algorithm for optimal bandwidth selection and construction of confidence bands for the trend function. Some high-resolution time series of pollen records from Lago di Origlio in Switzerland, which go back ca. 20,000 years, are used to illustrate the methods.

Relevance: 80.00%

Abstract:

Let P be a probability distribution on q-dimensional space. The so-called Diaconis-Freedman effect means that, for a fixed dimension d, most d-dimensional projections of P look like scale mixtures of spherically symmetric Gaussian distributions. The present paper provides necessary and sufficient conditions for this phenomenon in a suitable asymptotic framework with increasing dimension q. It turns out that the conditions formulated by Diaconis and Freedman (1984) are not only sufficient but necessary as well. Moreover, letting P̂ be the empirical distribution of n independent random vectors with distribution P, we investigate the behavior of the empirical process √n(P̂ − P) under random projections, conditional on P̂.
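A quick numerical illustration of the effect (an invented toy check, not from the paper): points drawn uniformly from a sphere of radius √q in R^q have one-dimensional random projections that are approximately standard Gaussian for large q.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
q, n = 200, 2000

# Sample n points uniformly from the sphere of radius sqrt(q) in R^q
# by normalizing Gaussian vectors and rescaling.
x = rng.standard_normal((n, q))
x *= np.sqrt(q) / np.linalg.norm(x, axis=1, keepdims=True)

# Project onto a random unit direction.
u = rng.standard_normal(q)
u /= np.linalg.norm(u)
proj = x @ u

# The projected sample should be close to N(0, 1).
statistic, pvalue = stats.kstest(proj, "norm")
```

For this high-dimensional P, the Kolmogorov-Smirnov test typically finds no significant deviation from the standard normal, in line with the Diaconis-Freedman phenomenon.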

Relevance: 80.00%

Abstract:

We report the first observation of protons in the near-lunar (100-200 km from the surface) and deeper (near anti-subsolar point) plasma wake when the interplanetary magnetic field (IMF) and solar wind velocity (vsw) are parallel (aligned flow; angle between IMF and vsw ≤ 10°). More than 98% of the observations during aligned flow conditions showed the presence of protons in the wake. These observations were obtained by the Solar Wind Monitor sensor of the Sub-keV Atom Reflecting Analyser experiment on Chandrayaan-1. The observation cannot be explained by the conventional fluid models for aligned flow. Back tracing of the observed protons suggests that their source is the solar wind. The larger gyroradii of the wake protons compared to those of the solar wind suggest that they were part of the tail of the solar wind velocity distribution function. Such protons could enter the wake due to their large gyroradii even when the flow is aligned to the IMF. However, the wake boundary electric field may also play a role in the entry of the protons into the wake.

Relevance: 80.00%

Abstract:

Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two strategies, viz. a globalist one as suggested by Loewer, and a localist one as advocated by Frigg and Hoefer. Both strategies fail because the systems they are part of have rivals that are roughly equally good, while ontic probabilities should be part of a clearly winning system. I conclude with the diagnosis that well-defined micro-probabilities under-estimate the robust character of explanations in statistical physics.

Relevance: 80.00%

Abstract:

Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two Boltzmannian accounts of the Second Law, viz. a globalist and a localist one. In both cases, the probabilities fail to be chances because they have rivals that are roughly equally good. I conclude with the diagnosis that well-defined micro-probabilities under-estimate the robust character of explanations in statistical physics.

Relevance: 80.00%

Abstract:

We present observations of energetic neutral atoms (ENAs) produced at the lunar surface in the Earth's magnetotail. When the Moon was located in the terrestrial plasma sheet, the Chandrayaan-1 Energetic Neutrals Analyzer (CENA) detected hydrogen ENAs from the Moon. Analysis of the data from CENA together with the Solar Wind Monitor (SWIM) onboard Chandrayaan-1 reveals a characteristic energy of the observed ENA energy spectrum (the e-folding energy of the distribution function) of ∼100 eV and an ENA backscattering ratio (defined as the ratio of upward ENA flux to downward proton flux) of ≲0.1. These characteristics are similar to those of the backscattered ENAs in the solar wind, suggesting that CENA detected plasma sheet particles backscattered as ENAs from the lunar surface. The observed ENA backscattering ratio in the plasma sheet exhibits no significant difference in the Southern Hemisphere, where a large and strongly magnetized region exists, compared with that in the Northern Hemisphere. This is contrary to the CENA observations in the solar wind, where the backscattering ratio drops by ∼50% in the Southern Hemisphere. Our analysis and test particle simulations suggest that magnetic shielding of the lunar surface in the plasma sheet is less effective than in the solar wind due to the broad velocity distributions of the plasma sheet protons.

Relevance: 80.00%

Abstract:

The production of a W boson in association with a single charm quark is studied using 4.6 fb⁻¹ of pp collision data at √s = 7 TeV collected with the ATLAS detector at the Large Hadron Collider. In events in which a W boson decays to an electron or muon, the charm quark is tagged either by its semileptonic decay to a muon or by the presence of a charmed meson. The integrated and differential cross sections as a function of the pseudorapidity of the lepton from the W-boson decay are measured. Results are compared to the predictions of next-to-leading-order QCD calculations obtained from various parton distribution function parameterisations. The ratio of the strange-to-down sea-quark distributions is determined to be 0.96 +0.26/−0.30 at Q² = 1.9 GeV², which supports the hypothesis of an SU(3)-symmetric composition of the light-quark sea. Additionally, the cross-section ratio σ(W⁺+c)/σ(W⁻+c) is compared to the predictions obtained using parton distribution function parameterisations with different assumptions about the s-s̄ quark asymmetry.

Relevance: 80.00%

Abstract:

We review various inequalities for Mills' ratio (1 − Φ)/φ, where φ and Φ denote the standard Gaussian density and distribution function, respectively. Elementary considerations involving finite continued fractions lead to a general approximation scheme which implies and refines several known bounds.
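For concreteness, the classical elementary bounds x/(x²+1) ≤ (1 − Φ(x))/φ(x) ≤ 1/x for x > 0 — among the simplest instances refined by such continued-fraction schemes — can be checked numerically with a short standalone sketch using only the complementary error function:

```python
import math

def mills_ratio(x):
    """Mills' ratio (1 - Phi(x)) / phi(x) for the standard Gaussian."""
    phi = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
    upper_tail = 0.5 * math.erfc(x / math.sqrt(2.0))
    return upper_tail / phi

# Classical elementary bounds: x/(x^2 + 1) <= Mills(x) <= 1/x for x > 0.
for x in [0.5, 1.0, 2.0, 5.0, 10.0]:
    m = mills_ratio(x)
    assert x / (x * x + 1.0) <= m <= 1.0 / x
```

Both bounds arise from truncating the continued-fraction expansion of Mills' ratio at its first levels; deeper truncations tighten them further.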