952 results for implied probability distribution function


Relevance: 100.00%

Abstract:

We consider the statistical properties of the local density of states of a one-dimensional Dirac equation in the presence of various types of disorder with Gaussian white-noise distribution. It is shown how either the replica trick or supersymmetry can be used to calculate exactly all the moments of the local density of states. Careful attention is paid to how the results change if the local density of states is averaged over atomic length scales. For both the replica trick and supersymmetry, the problem is reduced to finding the ground state of a zero-dimensional Hamiltonian written solely in terms of a pair of coupled spins which are elements of u(1,1). This ground state is found explicitly for the particular case of the Dirac equation corresponding to an infinite metallic quantum wire with a single conduction channel. The calculated moments of the local density of states agree with those found previously by Al'tshuler and Prigodin [Sov. Phys. JETP 68 (1989) 198] using a technique based on recursion relations for Feynman diagrams.

Relevance: 100.00%

Abstract:

We establish a connection between the simple harmonic oscillator and a two-level atom interacting with resonant, quantized cavity and strong driving fields, which suggests an experiment to measure the harmonic oscillator's probability distribution function. To achieve this, we calculate the Autler-Townes spectrum by coupling the system to a third level. We find that there are two different regimes of the atomic dynamics, depending on the ratio of the Rabi frequency Omega_c of the cavity field to the Rabi frequency Omega of the driving field. For Omega_c

Relevance: 100.00%

Abstract:

In this study we used market settlement prices of European call options on stock index futures to extract the implied probability distribution function (PDF). The method produces a PDF of the returns of the underlying asset at the expiration date from the implied volatility smile. With this method, the assumption of lognormal returns (the Black-Scholes model) is tested. The market's view of the asset price dynamics can then be used for various purposes (hedging, speculation). We used the so-called smoothing approach for implied PDF extraction presented by Shimko (1993). In our analysis we obtained implied volatility smiles from index futures markets (the S&P 500 and DAX indices) and standardized them. The method introduced by Breeden and Litzenberger (1978) was then used for PDF extraction. The results show significant deviations from the assumption of lognormal returns for S&P 500 options, while DAX options mostly fit the lognormal distribution. A subjective view of the PDF that deviates from the market's can be used to form a trading strategy, as discussed in the last section.
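The Breeden-Litzenberger (1978) relation recovers the risk-neutral density as the discounted second derivative of the call price with respect to strike, f(K) = e^{rT} ∂²C/∂K². A minimal sketch of this idea (not the paper's exact smoothing procedure) on synthetic Black-Scholes prices, where the recovered density should be exactly lognormal:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    # Black-Scholes price of a European call.
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def implied_pdf(S, r, sigma, T, K, dK=0.01):
    # Breeden-Litzenberger: f(K) = e^{rT} * d^2C/dK^2,
    # approximated with a central finite difference in strike.
    c_minus = bs_call(S, K - dK, r, sigma, T)
    c_mid = bs_call(S, K, r, sigma, T)
    c_plus = bs_call(S, K + dK, r, sigma, T)
    return math.exp(r * T) * (c_plus - 2.0 * c_mid + c_minus) / dK**2

# With a flat (Black-Scholes) volatility smile the recovered
# density is lognormal; smile curvature would deform it.
S, r, sigma, T = 100.0, 0.02, 0.2, 0.5
density = implied_pdf(S, r, sigma, T, K=100.0)
```

On real option data the call prices would come from a smoothed volatility smile rather than a closed-form model, but the differentiation step is the same.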

Relevance: 100.00%

Abstract:

This paper derives both lower and upper bounds for the probability distribution function of stationary ACD(p, q) processes. For the purpose of illustration, I specialize the results to the main parent distributions in duration analysis. Simulations show that the lower bound is much tighter than the upper bound.

Relevance: 100.00%

Abstract:

We obtain the exact asymptotic result for the disorder-averaged probability distribution function for a random walk in the biased Sinai model and show that it is characterized by a creeping behavior of the displacement moments with time, ⟨x^n(t)⟩ ∼ t^{μn}, where μ < 1 is the dimensionless mean drift. We employ a method that originated in quantum diffusion, based on the exact mapping of the problem to an imaginary-time Schrödinger equation. For nonzero drift such an equation has an isolated lowest eigenvalue separated by a gap from the quasicontinuous excited states, and the eigenstate corresponding to the former governs the long-time asymptotic behavior.

Relevance: 100.00%

Abstract:

The oxidative and thermo-mechanical degradation of HDPE was studied during processing in an internal mixer under two conditions: a totally filled chamber and a partially filled chamber, which provide lower and higher concentrations of oxygen, respectively. Two types of HDPE, Phillips and Ziegler-Natta, having different levels of terminal vinyl unsaturation, were analyzed. Materials were processed at 160, 200, and 240 °C. Standard rheograms using a partially filled chamber showed that the torque is much more unstable than in a totally filled chamber, which provides an environment depleted of oxygen. Carbonyl and trans-vinylene group concentrations increased, whereas vinyl group concentration decreased, with temperature and oxygen availability. The average number of chain scissions and branches (n_s) was calculated from MWD curves, and plotting it against functional-group concentrations showed that chain scission or branching takes place depending upon oxygen content and vinyl-group consumption. Chain scission and branching distribution function (CSBDF) values showed that longer chains undergo chain scission more easily than shorter ones, owing to their higher probability of entanglement. This yields macroradicals that react with the terminal vinyl unsaturations of other chains, producing chain branching. Shorter chains are more mobile and do not suffer scission; instead they graft onto the macroradicals, increasing the molecular weight. Increases in oxygen concentration, temperature, and vinyl end-group content facilitate thermo-mechanical degradation, reducing the amount of both longer chains (via chain scission) and shorter chains (via chain branching) and narrowing the polydispersity. Phillips HDPE produces a higher level of chain branching than the Ziegler-Natta type under the same processing conditions.

Relevance: 100.00%

Abstract:

This paper proposes an adaptive algorithm for clustering cumulative probability distribution functions (c.p.d.f.) of a continuous random variable, observed in different populations, into the minimum number of homogeneous clusters, making no parametric assumptions about the c.p.d.f.'s. The distance function proposed for clustering c.p.d.f.'s is based on the Kolmogorov-Smirnov two-sample statistic. This test is able to detect differences in position, dispersion, or shape of the c.p.d.f.'s. In our context, this statistic allows us to cluster the recorded data with a homogeneity criterion based on the whole distribution of each data set, and to decide whether it is necessary to add more clusters. In this sense, the proposed algorithm is adaptive, as it automatically increases the number of clusters only as necessary; therefore, there is no need to fix the number of clusters in advance. The outputs of the algorithm are, for each cluster, the common c.p.d.f. of all observed data in the cluster (the centroid) and the Kolmogorov-Smirnov statistic between the centroid and the most distant c.p.d.f. The proposed algorithm has been applied to a large data set of solar global irradiation spectral distributions. The results make it possible to reduce all the information of more than 270,000 c.p.d.f.'s to only 6 clusters, corresponding to 6 different c.p.d.f.'s.
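The adaptive idea above can be sketched in a few lines: compute the two-sample Kolmogorov-Smirnov statistic between empirical CDFs, assign each sample to the nearest existing cluster, and open a new cluster whenever the nearest one is too far. This is a simplified stand-in (first member as centroid, fixed threshold) for the paper's algorithm, not its exact procedure:

```python
import bisect

def ks_distance(a, b):
    # Two-sample Kolmogorov-Smirnov statistic: the largest vertical
    # gap between the empirical CDFs of samples a and b.
    a, b = sorted(a), sorted(b)
    def ecdf(sample, x):
        # Fraction of the sorted sample that is <= x.
        return bisect.bisect_right(sample, x) / len(sample)
    grid = sorted(set(a) | set(b))   # the ECDFs only jump at data points
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in grid)

def adaptive_cluster(samples, threshold):
    # Greedy sketch: each cluster is represented by its first member
    # (a stand-in for the pooled centroid of the original algorithm).
    # A sample joins the nearest cluster if within `threshold`,
    # otherwise a new cluster is opened -- so the number of clusters
    # grows only as needed and is not fixed in advance.
    centroids, labels = [], []
    for s in samples:
        dists = [ks_distance(s, c) for c in centroids]
        if dists and min(dists) <= threshold:
            labels.append(dists.index(min(dists)))
        else:
            centroids.append(s)
            labels.append(len(centroids) - 1)
    return labels

# Two clearly separated groups of samples should yield two clusters.
low = [[0.1, 0.2, 0.3, 0.4], [0.15, 0.25, 0.35, 0.45]]
high = [[5.1, 5.2, 5.3, 5.4], [5.15, 5.25, 5.35, 5.45]]
labels = adaptive_cluster(low + high, threshold=0.5)
```

In the paper the threshold would come from the KS critical value at a chosen significance level rather than being fixed by hand.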

Relevance: 100.00%

Abstract:

This technical report describes the PDFs which have been implemented to model the behaviours of certain parameters of the Repeater-Based Hybrid Wired/Wireless PROFIBUS Network Simulator (RHW2PNetSim) and Bridge-Based Hybrid Wired/Wireless PROFIBUS Network Simulator (BHW2PNetSim).

Relevance: 100.00%

Abstract:

Power-law distributions, a well-known model in the theory of real random variables, characterize a wide variety of natural and man-made phenomena. The intensities of earthquakes, word frequencies, solar flares, and the sizes of power outages are all distributed according to power laws. Recently, given the widespread use of power laws in the scientific community, several articles have been published criticizing the statistical methods used to estimate power-law behaviour and establishing new estimation techniques with proven reliability. The main objective of the present study is to gain a deeper understanding of this kind of distribution and its analysis, and to introduce the half-lives of radioactive isotopes as a new candidate in nature following a power-law distribution, as well as a "canonical laboratory" in which to test statistical methods appropriate for long-tailed distributions.
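The estimation technique that the recent methodological literature recommends (maximum likelihood rather than a least-squares fit to the log-log histogram) has a closed form for the continuous exponent: alpha_hat = 1 + n / Σ ln(x_i / x_min). A minimal sketch on synthetic data, with x_min assumed known:

```python
import math
import random

def sample_power_law(alpha, x_min, n, seed=0):
    # Inverse-transform sampling from p(x) ~ x^(-alpha) for x >= x_min.
    rng = random.Random(seed)
    return [x_min * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
            for _ in range(n)]

def mle_alpha(xs, x_min):
    # Continuous maximum-likelihood estimator of the power-law
    # exponent (the estimator popularized by Clauset, Shalizi and
    # Newman): alpha_hat = 1 + n / sum(ln(x_i / x_min)).
    tail = [x for x in xs if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

xs = sample_power_law(alpha=2.5, x_min=1.0, n=20000)
alpha_hat = mle_alpha(xs, x_min=1.0)   # should be close to 2.5
```

In practice x_min is itself estimated (e.g. by minimizing the KS distance between the fitted model and the empirical tail), which is the harder part of the procedure.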

Relevance: 100.00%

Abstract:

The authors focus on one of the methods for connection acceptance control (CAC) in an ATM network: the convolution approach. With the aim of reducing the cost in terms of calculation and storage requirements, they propose the use of the multinomial distribution function. This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements. This in turn makes possible a simple deconvolution process. Moreover, under certain conditions, additional improvements may be achieved.
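The cost saving can be illustrated for a single traffic class, where the multinomial reduces to a binomial: the distribution of instantaneous aggregate demand is computed directly from a closed form instead of convolving the per-source distributions n times. This is an illustrative sketch, not the authors' exact scheme:

```python
from math import comb

def aggregate_bandwidth_pmf(n_sources, p_active, bw_per_source):
    # For one class of n on/off sources, each active independently
    # with probability p_active and demanding bw_per_source when
    # active, the instantaneous aggregate demand is binomial -- a
    # one-class special case of the multinomial approach that avoids
    # n repeated convolutions.
    return {k * bw_per_source:
            comb(n_sources, k) * p_active**k * (1 - p_active)**(n_sources - k)
            for k in range(n_sources + 1)}

def overflow_probability(pmf, link_capacity):
    # Probability that instantaneous demand exceeds the link capacity
    # (the quantity a CAC rule compares against its QoS target).
    return sum(prob for bw, prob in pmf.items() if bw > link_capacity)

pmf = aggregate_bandwidth_pmf(n_sources=10, p_active=0.3, bw_per_source=2.0)
p_over = overflow_probability(pmf, link_capacity=12.0)
```

With several heterogeneous classes the same direct computation uses the full multinomial over the class activity vector, and admitting or releasing a connection updates the distribution without recomputing every convolution (the "simple deconvolution" mentioned above).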

Relevance: 100.00%

Abstract:

Nowadays, one of the most important challenges in enhancing the efficiency of thin-film silicon solar cells is to increase the short-circuit current by means of optical confinement methods, such as textured back-reflector structures. In this work, two possible textured structures to be used as back reflectors for n-i-p solar cells have been optically analyzed and compared to a smooth one, using a system able to measure the angular distribution function (ADF) of the scattered light over a wide spectral range (350-1000 nm). The accurate analysis of the ADF data corresponding to the reflector structures, and to the μc-Si:H films deposited onto them, allows the optical losses due to reflector absorption, and the reflectors' effectiveness in increasing light absorption in the μc-Si:H layer (mainly at long wavelengths), to be quantified.

Relevance: 100.00%

Abstract:

Using the classical Parzen window estimate as the target function, kernel density estimation is formulated as a regression problem, and the orthogonal forward regression technique is adopted to construct sparse kernel density estimates. The proposed algorithm incrementally minimises a leave-one-out test error score to select a sparse kernel model, and a local regularisation method is incorporated into the density construction process to further enforce sparsity. The kernel weights are finally updated using the multiplicative nonnegative quadratic programming algorithm, which can reduce the model size further. Except for the kernel width, the proposed algorithm has no other parameters that need tuning, and the user is not required to specify any additional criterion to terminate the density construction procedure. Two examples demonstrate the ability of this regression-based approach to construct a sparse kernel density estimate with accuracy comparable to that of the full-sample optimised Parzen window density estimate.
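The target function here, the classical Parzen window estimate, is simply the full-sample kernel density: equal weight 1/N on a kernel centred at every training point. A minimal sketch of that target (the sparse construction then tries to match it with far fewer kernels):

```python
import math

def gaussian_kernel(x, center, width):
    # Normalized 1-D Gaussian kernel.
    return (math.exp(-0.5 * ((x - center) / width) ** 2)
            / (width * math.sqrt(2.0 * math.pi)))

def parzen_window(data, width):
    # Classical Parzen window estimate: equal weight 1/N on a kernel
    # centred at every training point. This full-sample density is
    # the regression target that a sparse model approximates with a
    # small subset of kernels and nonnegative weights summing to one.
    n = len(data)
    return lambda x: sum(gaussian_kernel(x, c, width) for c in data) / n

# Two well-separated groups of points give a bimodal estimate.
data = [0.0, 0.1, -0.2, 0.05, 2.0, 2.1, 1.9]
pdf = parzen_window(data, width=0.3)
```

Here `width` is the single tuning parameter the abstract refers to; the forward-regression and leave-one-out machinery of the paper decides which of the N candidate kernels survive in the sparse model.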

Relevance: 100.00%

Abstract:

A new sparse kernel probability density function (pdf) estimator based on a zero-norm constraint is constructed using the classical Parzen window (PW) estimate as the target function. The so-called zero-norm of the parameters is used in order to achieve enhanced model sparsity, and it is proposed to minimize an approximation of the zero-norm. It is shown that, under certain conditions, the kernel weights of the proposed pdf estimator based on the zero-norm approximation can be updated using the multiplicative nonnegative quadratic programming algorithm. Numerical examples are employed to demonstrate the efficacy of the proposed approach.
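The weight-update step can be illustrated with the classical multiplicative update for a nonnegative quadratic programme, min_w ½ wᵀBw − cᵀw with w ≥ 0. This sketch (not the authors' exact algorithm, and without the zero-norm term) fits nonnegative kernel weights to a Parzen-style target on a grid; because all entries of B and c are nonnegative here, the update w_i ← w_i · c_i/(Bw)_i keeps the weights nonnegative and decreases the objective, driving redundant weights towards zero:

```python
import math

def gauss(x, c, h):
    # Normalized 1-D Gaussian kernel.
    return math.exp(-0.5 * ((x - c) / h) ** 2) / (h * math.sqrt(2.0 * math.pi))

def fit_nonneg_weights(grid, target, centers, h, iters=500):
    # Multiplicative update for  min_w 0.5 w'Bw - c'w,  w >= 0,
    # with B = G'G and c = G't for the kernel design matrix G
    # evaluated on `grid` and target values t. With nonnegative B
    # and c, the update  w_i <- w_i * c_i / (Bw)_i  preserves
    # nonnegativity and shrinks the weights of redundant kernels.
    G = [[gauss(x, c, h) for c in centers] for x in grid]
    n = len(centers)
    B = [[sum(G[k][i] * G[k][j] for k in range(len(grid)))
          for j in range(n)] for i in range(n)]
    c = [sum(G[k][i] * target[k] for k in range(len(grid))) for i in range(n)]
    w = [1.0 / n] * n
    for _ in range(iters):
        Bw = [sum(B[i][j] * w[j] for j in range(n)) for i in range(n)]
        w = [w[i] * c[i] / Bw[i] for i in range(n)]
    s = sum(w)
    return [wi / s for wi in w]   # normalize so the weights sum to one

# Target: a single Gaussian bump at 0; candidate kernels at -2, 0, 2.
# The weight on the centre kernel should dominate; the others decay.
grid = [-4.0 + 0.1 * i for i in range(81)]
target = [gauss(x, 0.0, 1.0) for x in grid]
weights = fit_nonneg_weights(grid, target, centers=[-2.0, 0.0, 2.0], h=1.0)
```

The paper's estimator additionally penalizes an approximation of the zero-norm of w, which sharpens exactly this weight-decay effect into genuine sparsity.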