956 results for Weibull distribution function
Abstract:
We calculate the chemical potential μ0 and the effective mass m*/m3 of one 3He impurity in liquid 4He. First, a variational wave function including two- and three-particle dynamical correlations is adopted. Triplet correlations bring the computed values of μ0 very close to the experimental results. The variational estimate of m*/m3 also includes backflow correlations between the 3He atom and the particles in the medium. Different approximations for the three-particle distribution function give almost the same values for m*/m3. The variational approach underestimates m*/m3 by ~10% at all of the considered densities. Correlated-basis perturbation theory is then used to improve the wave function so as to include backflow around the particles of the medium. The perturbative series built up with one-phonon states only is summed to infinite order and gives results very close to the variational ones. All the perturbative diagrams with two independent phonons have then been summed to compute m*/m3. Their contribution depends to some extent on the form used for the three-particle distribution function. When the scaling approximation is adopted, reasonable agreement with the experimental results is achieved.
Abstract:
In this contribution we show that a suitably defined nonequilibrium entropy of an N-body isolated system is not, in general, a constant of the motion, and that its variation is bounded, the bounds being determined by the thermodynamic entropy, i.e., the equilibrium entropy. We define the nonequilibrium entropy as a convex functional of the set of n-particle reduced distribution functions (n ≤ N), generalizing the Gibbs fine-grained entropy formula. Additionally, as a consequence of our microscopic analysis, we find that this nonequilibrium entropy behaves as a free entropic oscillator. In the approach to the equilibrium regime, we find relaxation equations of the Fokker-Planck type, particularly for the one-particle distribution function.
Abstract:
Conversion-electron Mössbauer spectra of composition-modulated FeSi thin films have been analysed within the framework of a quasi-shape-independent model in which the distribution function for the hyperfine fields is assumed to be given by a binomial distribution. Both the hyperfine field and the hyperfine field distribution depend on the characteristic modulation length.
Abstract:
In this article we study buy-and-hold strategies for final-wealth optimisation problems in a multi-period setting. Since final wealth is a sum of dependent random variables, each corresponding to an amount of capital invested in a particular asset at a given date, we first consider approximations that reduce the multivariate randomness to the univariate case. These approximations are then used to determine the buy-and-hold strategies that optimise, for a given probability level, the VaR and the CLTE of the distribution function of final wealth. This article complements the work of Dhaene et al. (2005), where constant rebalancing strategies were considered.
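The quantile-based decision criterion can be illustrated with a minimal Monte Carlo sketch. All parameters below are hypothetical, and plain simulation stands in for the paper's univariate approximations; it only shows how a VaR of terminal wealth would be read off an empirical distribution function.

```python
import math
import random

def var_quantile(sample, p):
    """Empirical p-quantile of the final-wealth distribution (VaR at level p)."""
    s = sorted(sample)
    return s[int(p * len(s))]

rng = random.Random(7)

def final_wealth(years=10, mu=0.05, sigma=0.2):
    """Terminal wealth of a buy-and-hold plan: one unit invested at the start
    of each year; the summands are dependent through the shared return path."""
    rets = [rng.gauss(mu, sigma) for _ in range(years)]
    # The unit invested in year t grows by the log-returns of years t..years-1
    return sum(math.exp(sum(rets[t:])) for t in range(years))

wealth = [final_wealth() for _ in range(20_000)]
var5 = var_quantile(wealth, 0.05)    # lower 5% quantile of terminal wealth
var95 = var_quantile(wealth, 0.95)
```

For this right-skewed wealth distribution the 5% quantile lies well below the mean, which is what makes it informative as a downside-risk measure.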
Abstract:
In this paper we consider diffusion of a passive substance C in a temporally and spatially inhomogeneous two-dimensional medium. As a realization of the latter we choose a phase-separating medium consisting of two substances A and B, whose dynamics is determined by the Cahn-Hilliard equation. Assuming different diffusion coefficients of C in A and B, we find that the variance of the distribution function of the said substance grows less than linearly in time. We derive a simple identity for the variance using a probabilistic ansatz and are then able to identify the interface between A and B as the main cause of this nonlinear dependence. We argue that, for very large times, the time-dependent diffusion "constant" approaches a constant asymptotic value D∞ as t^(-1/3). The latter is calculated approximately by employing the effective-medium approximation and by fitting the simulation data to the said time dependence.
Abstract:
When dealing with multi-angular image sequences, problems of reflectance changes naturally arise, due either to illumination and acquisition geometry or to interactions with the atmosphere. These phenomena interplay with the scene and lead to a modification of the measured radiance: for example, depending on the angle of acquisition, tall objects may be seen from the top or from the side, and different light scatterings may affect the surfaces. This results in shifts in the acquired radiance, which make the problem of multi-angular classification harder and may lead to catastrophic results, since surfaces with the same reflectance return significantly different signals. In this paper, rather than performing atmospheric or bidirectional reflectance distribution function (BRDF) correction, a non-linear manifold learning approach is used to align data structures. This method maximizes the similarity between the different acquisitions by deforming their manifolds, thus enhancing the transferability of classification models among the images of the sequence.
Abstract:
The distribution of distances from atoms of a particular element E to a probe atom X (oxygen in most cases), covering both bonded and intermolecular non-bonded contacts, has been analyzed. In general, the distribution is characterized by a maximum at short E–X distances, corresponding to chemical bonds, followed by a range of unpopulated distances (the van der Waals gap) and a second maximum at longer distances (the van der Waals peak) superimposed on a random distribution function that roughly follows a d^3 dependence. The analysis of more than five million interatomic "non-bonded" distances has led to the proposal of a consistent set of van der Waals radii for most naturally occurring elements, and its applicability to other element pairs has been tested on a set of more than three million data points, all of them compared to over one million bond distances.
Abstract:
We provide an incremental quantile estimator for non-stationary streaming data, proposing a method for the simultaneous estimation of multiple quantiles corresponding to given probability levels. Owing to memory limitations, it is not feasible to compute the quantiles by storing the data, so estimating the quantiles as the data pass by is the only possibility; this can be effective in network measurement. To minimise the mean-squared error of the estimation, we use a parabolic approximation, and for comparison we simulate the results for different numbers of runs using both linear and parabolic approximations.
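The core idea of incremental quantile estimation can be sketched in a few lines. This is a generic stochastic-approximation update with a fixed step size, not the paper's estimator (the parabolic refinement and multi-quantile coupling are omitted): the running estimate moves up with weight p when an observation exceeds it and down with weight 1-p otherwise, so its steady state sits at the p-quantile.

```python
import random

def incremental_quantile(stream, p, step=0.1):
    """Estimate the p-quantile of a data stream without storing the data."""
    q = None
    for x in stream:
        if q is None:
            q = x  # initialise with the first observation
            continue
        if x > q:
            q += step * p          # observation above the estimate: move up
        else:
            q -= step * (1.0 - p)  # observation below: move down
    return q

random.seed(0)
data = (random.gauss(0.0, 1.0) for _ in range(200_000))
est = incremental_quantile(data, 0.5)  # true median of N(0, 1) is 0
```

The estimate fluctuates around the true quantile with an amplitude of the order of the step size, which is exactly the trade-off (tracking speed versus noise) that more refined update rules try to improve.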
Abstract:
Drawing on the data supplied by the Second Spanish National Forest Inventory, this work characterises the monospecific stands of laricio pine in Catalonia (NE Spain), formalising the characterisation as a typology with a silvogenetic approach. Based on the results of a factor analysis of stand variables, the categories of the typology were articulated around aspects of the diameter distribution and stand stocking, and nine types were identified. The uneven-aged character, present in five of them although with different traits, was examined through the truncated Weibull distribution. The typology, which is applied by computing only the regeneration level and the diameters of the measurable trees, covers the different developmental stages and makes it possible to diagnose situations that compromise the persistence of the stand, whether caused by deficient stocking or by lack of regeneration.
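The truncated Weibull machinery used to examine diameter distributions can be sketched as follows. The shape and scale parameters and the diameter range below are hypothetical, chosen only to illustrate how a Weibull law restricted to the inventory's measurable range [a, b] is evaluated and sampled by inverse-CDF transformation.

```python
import math
import random

def weibull_cdf(x, k, lam):
    """CDF of an (untruncated) Weibull with shape k and scale lam."""
    return 1.0 - math.exp(-((x / lam) ** k))

def trunc_weibull_cdf(x, k, lam, a, b):
    """CDF of the same Weibull truncated to the interval [a, b]."""
    Fa, Fb = weibull_cdf(a, k, lam), weibull_cdf(b, k, lam)
    return (weibull_cdf(x, k, lam) - Fa) / (Fb - Fa)

def trunc_weibull_sample(k, lam, a, b, rng):
    """Inverse-CDF sampling: draw u uniformly on [F(a), F(b)] and invert."""
    Fa, Fb = weibull_cdf(a, k, lam), weibull_cdf(b, k, lam)
    u = Fa + rng.random() * (Fb - Fa)
    return lam * (-math.log(1.0 - u)) ** (1.0 / k)

rng = random.Random(1)
# Hypothetical stand diameters (cm), truncated at the inventory threshold 7.5 cm
diams = [trunc_weibull_sample(2.0, 20.0, 7.5, 60.0, rng) for _ in range(50_000)]
```

Truncation at the minimum inventory diameter is what keeps the fitted law consistent with the fact that trees below the callipering threshold are never recorded.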
Abstract:
The effect of the heat flux on the rate of chemical reaction in dilute gases is shown to be important for reactions characterized by high activation energies and in the presence of very large temperature gradients. This effect, obtained from the second-order terms in the distribution function (similar to those obtained in the Burnett approximation to the solution of the Boltzmann equation), is derived on the basis of information theory. It is shown that the analytical results describing the effect are simpler if the kinetic definition of the nonequilibrium temperature is introduced than if the thermodynamic definition is used. The numerical results are nearly the same for both definitions.
Abstract:
The use of various IP-based services is growing steadily while users become ever more mobile. For this reason the IP protocol will inevitably enter mobile networks as well. This thesis studies the problems that mobility introduces to IP multicasting and simulates them using the Network Simulator. The main focus is on the problem caused by the multicast group joining delay. This problem is simulated in order to determine how the delay, the arrival rate of mobile users into the service, and the timer settings of the Scalable Reliable Multicast (SRM) protocol affect the number of repair request packets and, through that, the number of retransmissions performed. To study the influence of the different parameters, simulation results with varied parameters are presented using CDF curves. The results indicate that the most significant factors for the number of retransmission requests are the protocol timer values and the desired service level, while the influence of the delay remains minor. Finally, the suitability of the SRM protocol for mobile networks is examined and alternatives for improving its operation are discussed.
Abstract:
We propose a novel formulation to solve the problem of intra-voxel reconstruction of the fibre orientation distribution function (FOD) in each voxel of the white matter of the brain from diffusion MRI data. The majority of the state-of-the-art methods in the field perform the reconstruction on a voxel-by-voxel level, promoting sparsity of the orientation distribution. Recent methods have proposed a global denoising of the diffusion data using spatial information prior to reconstruction, while others promote spatial regularisation through an additional empirical prior on the diffusion image at each q-space point. Our approach reconciles voxelwise sparsity and spatial regularisation and defines a spatially structured FOD sparsity prior, where the structure originates from the spatial coherence of the fibre orientation between neighbour voxels. The method is shown, through both simulated and real data, to enable accurate FOD reconstruction from a much lower number of q-space samples than the state of the art, typically 15 samples, even for quite adverse noise conditions.
Abstract:
A statistical indentation method has been employed to study the hardness of fire-refined high-conductivity copper, using the nanoindentation technique. The Joslin and Oliver approach was used with the aim of separating the hardness (H) contribution of the copper matrix from that of inclusions and grain boundaries. This approach relies on a large array of imprints (around 400 indentations) performed at 150 nm of indentation depth. A statistical study using a cumulative distribution function fit and simulated Gaussian distributions shows that H for each phase can be extracted when the indentation depth is much smaller than the size of the secondary phases. It is found that the thermal treatment produces a hardness increase, due to the partial re-dissolution of the inclusions (mainly Pb and Sn) in the matrix.
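The statistical step, fitting a cumulative distribution function built from Gaussian phase contributions to a large array of imprints, can be sketched as below. The hardness values, phase means, and spreads are hypothetical stand-ins (this is not the Joslin-Oliver data); the sketch only shows how a phase fraction is recovered from the empirical CDF of the imprint array.

```python
import math
import random

def norm_cdf(x, mu, sigma):
    """Gaussian cumulative distribution function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def mixture_cdf(x, w, mu1, s1, mu2, s2):
    """Two-phase model: fraction w of phase 1, fraction 1-w of phase 2."""
    return w * norm_cdf(x, mu1, s1) + (1.0 - w) * norm_cdf(x, mu2, s2)

rng = random.Random(0)
# Hypothetical hardness values (GPa) for a 400-imprint array:
# ~70% soft matrix around 1.2 GPa, ~30% harder inclusions around 2.0 GPa
H = sorted(rng.gauss(1.2, 0.1) if rng.random() < 0.7 else rng.gauss(2.0, 0.15)
           for _ in range(400))

# Empirical CDF at each imprint, then a grid search for the phase fraction w
ecdf = [(i + 0.5) / len(H) for i in range(len(H))]
def sse(w):
    return sum((mixture_cdf(h, w, 1.2, 0.1, 2.0, 0.15) - e) ** 2
               for h, e in zip(H, ecdf))
w_best = min((sse(w / 100.0), w / 100.0) for w in range(101))[1]
```

The fitted fraction lands near the simulated 70/30 split, mirroring how the CDF fit separates matrix and inclusion contributions when the two hardness populations are well resolved.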
Abstract:
Reinsurance is one of the tools that an insurer can use to mitigate underwriting risk and thereby control its solvency. In this paper, we focus on proportional reinsurance arrangements and examine several optimization and decision problems of the insurer with respect to the reinsurance strategy. To this end, we use as decision tools not only the probability of ruin but also the random variable "deficit at ruin", given that ruin occurs. The discounted penalty function (Gerber & Shiu, 1998) is employed to calculate, as particular cases, the probability of ruin and the moments and distribution function of the deficit at ruin, given that ruin occurs.
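As a concrete instance of the ruin-probability decision tool: in the classical Cramér-Lundberg model with exponential claims, the probability of ruin has a closed form that a Monte Carlo simulation can be checked against. The parameter values below are hypothetical, and the simulation treats a surplus that climbs above a barrier as safe, an approximation, since ruin from a high surplus is negligible but not zero.

```python
import math
import random

def ruin_prob_exp(u, lam, mu, c):
    """Closed-form ruin probability, Exp(mean mu) claims, Poisson(lam) arrivals,
    premium rate c: psi(u) = (lam*mu/c) * exp(-(1/mu - lam/c) * u)."""
    return (lam * mu / c) * math.exp(-(1.0 / mu - lam / c) * u)

def simulate_ruin(u, lam, mu, c, n_paths, barrier, rng):
    """Monte Carlo ruin frequency; ruin can only occur at claim instants."""
    ruined = 0
    for _ in range(n_paths):
        s = u
        while s <= barrier:
            s += c * rng.expovariate(lam)   # premiums earned until next claim
            s -= rng.expovariate(1.0 / mu)  # claim amount, Exp(mean mu)
            if s < 0.0:
                ruined += 1
                break
    return ruined / n_paths

rng = random.Random(42)
psi_exact = ruin_prob_exp(5.0, 1.0, 1.0, 1.2)                # ~0.362
psi_mc = simulate_ruin(5.0, 1.0, 1.0, 1.2, 5000, 30.0, rng)
```

The same simulation loop also yields the deficit at ruin (the value of s when it first goes negative), which is the quantity the discounted penalty function handles analytically.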
Abstract:
In this study we used market settlement prices of European call options on stock index futures to extract the implied probability density function (PDF). The method used produces a PDF of the returns of an underlying asset at the expiration date from the implied volatility smile. With this method, the assumption of a lognormal distribution (Black-Scholes model) is tested. The market view of the asset price dynamics can then be used for various purposes (hedging, speculation). We used the so-called smoothing approach for implied PDF extraction presented by Shimko (1993). In our analysis we obtained implied volatility smiles from index futures markets (S&P 500 and DAX indices) and standardized them. The method introduced by Breeden and Litzenberger (1978) was then used for PDF extraction. The results show significant deviations from the assumption of lognormal returns for S&P 500 options, while DAX options mostly fit the lognormal distribution. A divergent subjective view of the PDF can be used to form a trading strategy, as discussed in the last section.
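The Breeden-Litzenberger extraction step rests on the identity pdf(K) = e^(rT) ∂²C/∂K². The sketch below applies it with a central finite difference; hypothetical Black-Scholes prices stand in for the smoothed market call prices (so the recovered density is exactly lognormal, whereas in the study the smile makes it deviate).

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price (synthetic 'market' prices here)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

S, T, r, sigma = 100.0, 0.5, 0.02, 0.25   # hypothetical market parameters
dK = 0.5

def implied_pdf(K):
    """Breeden-Litzenberger: pdf(K) = exp(rT) * d2C/dK2, central difference."""
    c = lambda k: bs_call(S, k, T, r, sigma)
    return math.exp(r * T) * (c(K + dK) - 2.0 * c(K) + c(K - dK)) / dK ** 2

# The recovered density should integrate to ~1 over a wide strike grid
total = sum(implied_pdf(float(K)) for K in range(40, 201))
```

In practice the finite difference is applied to the smoothed smile-implied prices rather than a closed-form model, which is precisely where the non-lognormal shape of the market PDF shows up.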