103 results for Probability Distribution Function
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
We investigate a conjecture on the cover times of planar graphs by means of large Monte Carlo simulations. The conjecture states that the cover time $\tau(G_N)$ of a planar graph $G_N$ of $N$ vertices and maximal degree $d$ is bounded from below by $\tau(G_N) \geq C_d N (\ln N)^2$ with $C_d = (d/4\pi)\tan(\pi/d)$, with equality holding for some geometries. We tested this conjecture on the regular honeycomb ($d = 3$), regular square ($d = 4$), regular elongated triangular ($d = 5$), and regular triangular ($d = 6$) lattices, as well as on the nonregular Union Jack lattice ($d_{\min} = 4$, $d_{\max} = 8$). Indeed, the Monte Carlo data suggest that the rigorous lower bound may hold as an equality for most of these lattices, with an interesting issue in the case of the Union Jack lattice. The data for the honeycomb lattice, however, violate the bound with the conjectured constant. The empirical probability distribution function of the cover time for the square lattice is also briefly presented, since very little is known about cover-time probability distribution functions in general.
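A minimal sketch of the kind of Monte Carlo cover-time experiment the abstract describes, assuming a simple random walk on an L x L square lattice ($d = 4$) with periodic boundaries; all names are illustrative. Averaging the returned step count over many runs and comparing with $C_4 N(\ln N)^2 = (1/\pi) N (\ln N)^2$ would test the conjectured constant.

```python
import random

def cover_time_square_lattice(L, rng=random):
    """Steps taken by a simple random walk to visit every vertex of an
    L x L square lattice with periodic boundary conditions."""
    n = L * L
    x = y = 0
    visited = {(0, 0)}
    steps = 0
    while len(visited) < n:
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = (x + dx) % L, (y + dy) % L
        visited.add((x, y))
        steps += 1
    return steps

# Average over many runs and compare with (1 / 3.14159...) * N * (log(N))**2,
# where N = L * L, to probe the conjectured equality for the square lattice.
```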
Abstract:
In this paper we propose a new two-parameter lifetime distribution with increasing failure rate. The new distribution arises from a latent complementary risk problem. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulae for its reliability and failure rate functions, quantiles, and moments, including the mean and variance. A simple EM-type algorithm for iteratively computing maximum likelihood estimates is presented. The Fisher information matrix is derived analytically in order to obtain the asymptotic covariance matrix. The methodology is illustrated on a real data set. © 2010 Elsevier B.V. All rights reserved.
Abstract:
In this paper, we propose a new two-parameter lifetime distribution with increasing failure rate, the complementary exponential geometric distribution, which is complementary to the exponential geometric model proposed by Adamidis and Loukas (1998). The new distribution arises in a latent complementary risks scenario, in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its reliability and failure rate functions, moments, including the mean and variance, coefficient of variation, and modal value. Parameter estimation is based on the usual maximum likelihood approach. We report the results of a misspecification simulation study performed to assess the extent of misspecification errors when testing the exponential geometric distribution against our complementary one under different sample sizes and censoring percentages. The methodology is illustrated on four real data sets, and the two modeling approaches are compared. © 2011 Elsevier B.V. All rights reserved.
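A minimal simulation sketch of the latent complementary-risks construction described above, assuming exponential risk lifetimes and a geometric number of risks; the function name, rate lam, and geometric parameter theta are illustrative, not the paper's notation. Each observed lifetime is the maximum over the latent risks.

```python
import numpy as np

def sample_complementary_lifetimes(n, lam, theta, rng=None):
    """Sample n lifetimes as the maximum of a geometric number of
    exponential latent risk lifetimes (the complementary scenario:
    only the maximum among the latent risks is observed)."""
    rng = np.random.default_rng(rng)
    m = rng.geometric(theta, size=n)  # number of latent risks, M >= 1
    return np.array([rng.exponential(1.0 / lam, size=k).max() for k in m])

# Comparing the empirical cdf of such samples against a candidate
# closed-form cdf is a quick sanity check of the construction.
```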
Abstract:
The modeling and analysis of lifetime data is an important aspect of statistical work in a wide variety of scientific and technological fields. Good (1953) introduced a probability distribution which is commonly used in the analysis of lifetime data. For the first time, based on this distribution, we propose the so-called exponentiated generalized inverse Gaussian distribution, which extends the exponentiated standard gamma distribution (Nadarajah and Kotz, 2006). Various structural properties of the new distribution are derived, including expansions for its moments, moment generating function, moments of the order statistics, and so forth. We discuss maximum likelihood estimation of the model parameters. The usefulness of the new model is illustrated by means of a real data set. © 2010 Elsevier B.V. All rights reserved.
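The "exponentiated" device typically raises a baseline cdf G to a positive power, F(x) = G(x)**beta. A sketch under that assumption (the paper's exact parameterization may differ), using SciPy's generalized inverse Gaussian as the baseline; geninvgauss uses SciPy's own shape parameters p and b, which need not match the paper's notation.

```python
import numpy as np
from scipy.stats import geninvgauss

def exponentiated_gig_cdf(x, beta, p, b):
    """Exponentiated-G cdf F(x) = G(x)**beta with a generalized
    inverse Gaussian baseline G (SciPy parameterization)."""
    return geninvgauss.cdf(x, p, b) ** beta

def exponentiated_gig_sample(n, beta, p, b, rng=None):
    """Inverse-transform sampling: X = G^{-1}(U**(1/beta))."""
    rng = np.random.default_rng(rng)
    u = rng.uniform(size=n)
    return geninvgauss.ppf(u ** (1.0 / beta), p, b)
```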
Abstract:
Birnbaum and Saunders (1969a) introduced a probability distribution which is commonly used in reliability studies. For the first time, based on this distribution, the so-called beta-Birnbaum-Saunders distribution is proposed for fatigue life modeling. Various properties of the new model, including expansions for the moments, moment generating function, mean deviations, and density function of the order statistics and their moments, are derived. We discuss maximum likelihood estimation of the model's parameters. The superiority of the new model is illustrated by means of three real failure data sets. © 2010 Elsevier B.V. All rights reserved.
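The beta-generated family typically defines the new cdf as the regularized incomplete beta function evaluated at the baseline cdf, F(x) = I_{G(x)}(a, b). A sketch under that assumption, with SciPy's fatiguelife (Birnbaum-Saunders) as the baseline G; the parameter names a, b, alpha are ours, not necessarily the paper's.

```python
from scipy.special import betainc
from scipy.stats import fatiguelife

def beta_bs_cdf(x, a, b, alpha, scale=1.0):
    """Beta-G cdf F(x) = I_{G(x)}(a, b) with a Birnbaum-Saunders
    baseline G of shape alpha (SciPy's fatiguelife)."""
    g = fatiguelife.cdf(x, alpha, scale=scale)  # baseline cdf G(x)
    return betainc(a, b, g)                     # regularized incomplete beta
```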
Abstract:
OBJECTIVE: To estimate reference values and a hierarchy function for faculty in Collective Health (Saúde Coletiva) in Brazil through analysis of the distribution of the h-index. METHODS: From the portal of the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior, 934 faculty members were identified in 2008, of whom 819 were analyzed. The h-index of each faculty member was obtained from the Web of Science using search algorithms that controlled for homonyms and alternative name spellings. For each region, and for Brazil as a whole, an exponential probability density function was fitted using the mean and the rate of decrease per region as parameters. Position measures were identified and, using the complement of the cumulative probability function, a hierarchy function among authors according to the h-index was derived per region. RESULTS: Of the faculty, 29.8% had no citation record at all (h = 0). The mean h for the country was 3.1, with the highest mean in the South region (4.7). The median h for the country was 2.1, again with the highest median in the South (3.2). For a standardized population of one hundred authors, the top-ranked authors for the country should have h = 16; in the stratification by region, the first position demands higher values in the Northeast, Southeast, and South, reaching h = 24 in the latter. CONCLUSIONS: Measured by Web of Science h-indices, most authors in Collective Health do not exceed h = 5. There are differences between regions, with the best performance in the South and similar values between the Southeast and Northeast.
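As a rough illustration of the hierarchy function, assume (as a continuous simplification of the fitted exponential density) that h is exponential with the reported national mean of 3.1; the h required to occupy the first position in a standardized population of one hundred authors then solves 100 * (1 - F(h)) = 1.

```python
import math

def top_rank_h(mean_h, population=100):
    """h needed to hold the first position among `population` authors,
    assuming a continuous exponential law with the given mean (a
    simplification of the density fitted in the abstract)."""
    # Solve population * (1 - F(h)) = 1 with F(h) = 1 - exp(-h / mean_h).
    return mean_h * math.log(population)

print(top_rank_h(3.1))  # ~14.3; the paper reports h = 16 for the country
```

The gap between roughly 14 and the reported 16 presumably reflects the paper's exact fitting procedure and regional stratification.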
Abstract:
Void fraction sensors are important instruments not only for monitoring two-phase flow, but also for furnishing an important parameter for obtaining the flow pattern map and the two-phase flow heat transfer coefficient. This work presents the experimental results obtained with the analysis of two axially spaced multiple-electrode impedance sensors tested in an upward air-water two-phase flow in a vertical tube for void fraction measurements. An electronic circuit was developed for signal generation and post-treatment of each sensor signal. By phase shifting the electrodes supplying the signal, it was possible to establish a rotating electric field sweeping across the test section. The fundamental principle of using a multiple-electrode configuration is to reduce signal sensitivity to the problem of non-uniform cross-section void fraction distribution. Static calibration curves were obtained for both sensors, and dynamic signal analyses for bubbly, slug, and turbulent churn flows were carried out. Flow parameters such as Taylor bubble velocity and length were obtained by using cross-correlation techniques. As an application of the tested void fraction sensors, vertical flow pattern identification could be established by using the probability density function technique for void fractions ranging from 0% to nearly 70%.
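A minimal sketch of the cross-correlation technique mentioned above: with two axially spaced signals, the lag that maximizes their cross-correlation estimates the transit time, and velocity follows as sensor spacing over lag. Function and parameter names are illustrative, not from the paper.

```python
import numpy as np

def transit_velocity(sig_up, sig_down, fs, spacing):
    """Estimate structure (e.g. Taylor bubble) velocity from two axially
    spaced void-fraction signals. `fs` is the sampling rate in Hz and
    `spacing` the axial sensor distance in metres."""
    a = sig_up - np.mean(sig_up)
    b = sig_down - np.mean(sig_down)
    xcorr = np.correlate(b, a, mode="full")
    lag = np.argmax(xcorr) - (len(a) - 1)  # delay of downstream signal, samples
    return spacing / (lag / fs)
```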
Abstract:
This work deals with the problem of minimizing the waste of space that occurs in the rotational placement of a set of irregular two-dimensional items inside a two-dimensional container. This problem is approached with a heuristic based on Simulated Annealing (SA) with an adaptive neighborhood. The objective function is evaluated in a constructive approach, where the items are placed sequentially. The placement is governed by three different types of parameters: the sequence of placement, the rotation angle, and the translation. The rotation and translation applied to each polygon are cyclic continuous parameters, while the sequence of placement defines a combinatorial problem. It is therefore necessary to control both cyclic continuous and discrete parameters. The approaches described in the literature deal with only one type of parameter (sequence of placement or translation). In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, increasing the number of accepted solutions. The sensitivity of each parameter is associated with its probability distribution in the definition of the next candidate, as sketched below.
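A generic SA skeleton consistent with this description, assuming (our choice, not the paper's rule) that each continuous parameter's step size is adapted toward a target acceptance rate; sequence-of-placement swap moves are omitted for brevity, and all names are illustrative.

```python
import math
import random

def simulated_annealing(cost, x0, sigmas, t0=1.0, cooling=0.95, iters=5000):
    """SA over cyclic continuous parameters in [0, 1) with per-parameter
    adaptive step sizes (a sketch of sensitivity-driven adaptation)."""
    x, fx, t = list(x0), cost(x0), t0
    attempts = [0] * len(x0)
    accepts = [0] * len(x0)
    for it in range(1, iters + 1):
        i = random.randrange(len(x))                      # perturb one parameter
        y = list(x)
        y[i] = (y[i] + random.gauss(0.0, sigmas[i])) % 1.0  # cyclic move
        fy = cost(y)
        attempts[i] += 1
        if fy < fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy                                 # Metropolis acceptance
            accepts[i] += 1
        if it % 200 == 0:
            for j in range(len(sigmas)):                  # widen steps accepted
                rate = accepts[j] / max(attempts[j], 1)   # too easily, shrink
                sigmas[j] *= 1.05 if rate > 0.4 else 0.95 # the others
                attempts[j] = accepts[j] = 0
            t *= cooling                                  # geometric cooling
    return x, fx
```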
Abstract:
In this paper, we formulate a flexible density function from the selection mechanism viewpoint (see, for example, Bayarri and DeGroot (1992) and Arellano-Valle et al. (2006)) which possesses nice biological and physical interpretations. The new density function contains as special cases many models that have been proposed recently in the literature. In constructing this model, we assume that the number of competing causes of the event of interest has a general discrete distribution characterized by its probability generating function. This function has an important role in the selection procedure as well as in computing the conditional personal cure rate. Finally, we illustrate how various models can be deduced as special cases of the proposed model. © 2011 Elsevier B.V. All rights reserved.
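One concrete instance of the role the probability generating function plays: when the observed time is the maximum of M i.i.d. latent times with cdf F, the population cdf is the pgf of M evaluated at F(t), since E[F(t)^M] = A_M(F(t)). A sketch under that assumption (the paper's selection-mechanism formulation is more general), with a geometric count on {1, 2, ...}; all names are ours.

```python
import math

def population_cdf(t, baseline_cdf, pgf):
    """F_pop(t) = A_M(F(t)) when the event time is the maximum of M
    i.i.d. latent times with cdf `baseline_cdf` and count pgf `pgf`."""
    return pgf(baseline_cdf(t))

# Geometric number of causes on {1, 2, ...} and exponential latent times.
theta, lam = 0.3, 0.5
geom_pgf = lambda s: theta * s / (1.0 - (1.0 - theta) * s)
exp_cdf = lambda t: 1.0 - math.exp(-lam * t)
print(population_cdf(3.0, exp_cdf, geom_pgf))
```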
Abstract:
Kumaraswamy [A generalized probability density function for double-bounded random processes, J. Hydrol. 46 (1980), pp. 79-88] introduced a distribution for double-bounded random processes with hydrological applications. For the first time, based on this distribution, we describe a new family of generalized distributions (denoted with the prefix "Kw") to extend the normal, Weibull, gamma, Gumbel, and inverse Gaussian distributions, among several other well-known distributions. Some special distributions in the new family, such as the Kw-normal, Kw-Weibull, Kw-gamma, Kw-Gumbel, and Kw-inverse Gaussian distributions, are discussed. We express the ordinary moments of any Kw generalized distribution as linear functions of probability weighted moments (PWMs) of the parent distribution. We also obtain the ordinary moments of order statistics as functions of PWMs of the baseline distribution. We use the method of maximum likelihood to fit the distributions in the new class and illustrate the potentiality of the new model with an application to real data.
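The Kw-G family composes the Kumaraswamy cdf with a baseline cdf G, giving F(x) = 1 - (1 - G(x)^a)^b, which also yields easy inverse-transform sampling. A sketch with a Weibull baseline; the parameter names a, b and the shape value 1.5 are illustrative.

```python
import numpy as np
from scipy.stats import weibull_min

def kw_cdf(x, a, b, baseline_cdf):
    """Kumaraswamy-generalized cdf F(x) = 1 - (1 - G(x)**a)**b
    for any baseline cdf G."""
    g = baseline_cdf(x)
    return 1.0 - (1.0 - g ** a) ** b

def kw_sample(n, a, b, baseline_ppf, rng=None):
    """Inverse-transform sampling: X = G^{-1}((1 - (1-U)^{1/b})^{1/a})."""
    rng = np.random.default_rng(rng)
    u = rng.uniform(size=n)
    return baseline_ppf((1.0 - (1.0 - u) ** (1.0 / b)) ** (1.0 / a))

# Kw-Weibull with baseline shape 1.5, via SciPy's Weibull quantile function.
x = kw_sample(1000, a=2.0, b=3.0,
              baseline_ppf=lambda q: weibull_min.ppf(q, 1.5))
```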
Abstract:
In this paper we study the accumulated claim in a fixed time period, dropping the classical assumption of mutual independence between the variables involved. Two basic models are considered: Model 1 assumes that any pair of claims is equally correlated, which means that the corresponding square-integrable sequence is an exchangeable one. Model 2 states that the correlations between adjacent claims are the same. Recurrence and explicit expressions for the joint probability generating function are derived, and the impact of the dependence parameter (correlation coefficient) in both models is examined. The Markov binomial distribution is obtained as a particular case under the assumptions of Model 2. © 2007 Elsevier B.V. All rights reserved.
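A quick simulation sketch of the Markov binomial special case of Model 2: claims form a two-state Markov chain whose stationary success probability is p and whose one-step correlation is rho. The transition probabilities below follow the standard construction P(1->1) = p + rho*(1-p) and P(0->1) = p*(1-rho); names and parameterization are ours, not the paper's.

```python
import numpy as np

def markov_binomial_sample(n_trials, p, rho, size=10000, rng=None):
    """Simulate the number of successes among n_trials Bernoulli(p)
    claims where adjacent claims have correlation rho."""
    rng = np.random.default_rng(rng)
    counts = np.empty(size, dtype=int)
    for k in range(size):
        s = rng.random() < p                      # stationary initial claim
        total = int(s)
        for _ in range(n_trials - 1):
            p_next = p + rho * (1 - p) if s else p * (1 - rho)
            s = rng.random() < p_next
            total += int(s)
        counts[k] = total
    return counts
```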
Abstract:
The main objective of this paper is to study a logarithmic extension of the bimodal skew normal model introduced by Elal-Olivero et al. [1]. The model can then be seen as an alternative to the log-normal model typically used for fitting positive data. We study some basic properties, such as the distribution function and moments, and discuss maximum likelihood parameter estimation. We report results of an application to a real data set on nickel concentration in soil samples. Model fitting comparisons with several alternative models indicate that the proposed model presents the best fit, and so it can be quite useful in real applications involving chemical data on substance concentrations. © 2011 John Wiley & Sons, Ltd.
Abstract:
We present a Bayesian approach for modeling heterogeneous data and estimating multimodal densities using mixtures of Skew Student-t-Normal distributions [Gomez, H.W., Venegas, O., Bolfarine, H., 2007. Skew-symmetric distributions generated by the distribution function of the normal distribution. Environmetrics 18, 395-407]. A stochastic representation that is useful for implementing an MCMC-type algorithm, as well as results on the existence of posterior moments, are obtained. Marginal likelihood approximations are obtained in order to compare mixture models with different numbers of component densities. Data sets concerning Gross Domestic Product per capita (Human Development Report) and body mass index (National Health and Nutrition Examination Survey), previously studied in the related literature, are analyzed. © 2008 Elsevier B.V. All rights reserved.
Abstract:
In recent years, there has been increasing interest in understanding the physical properties of collisionless plasmas, mostly because of the large number of astrophysical environments (e.g. the intracluster medium, ICM) containing magnetic fields that are strong enough to be coupled with the ionized gas and characterized by densities sufficiently low to prevent pressure isotropization with respect to the magnetic line direction. Under these conditions, a new class of kinetic instabilities arises, such as the firehose and mirror instabilities, which have been studied extensively in the literature. Their role in turbulence evolution and the cascade process in the presence of pressure anisotropy, however, is still unclear. In this work, we present the first statistical analysis of turbulence in collisionless plasmas using three-dimensional numerical simulations, solving the double-isothermal magnetohydrodynamic equations with the Chew-Goldberger-Low closure (CGL-MHD). We study models with different initial conditions to account for the firehose and mirror instabilities and to obtain different turbulent regimes. We find that CGL-MHD subsonic and supersonic turbulence shows only small differences from the MHD models in most cases. However, in the regimes of strong kinetic instabilities, the statistics, i.e. the probability distribution functions (PDFs) of density and velocity, are very different. In subsonic models, the instabilities cause an increase in the dispersion of density, while the dispersion of velocity is increased by a large factor in some cases. Moreover, the spectra of density and velocity show increased power at small scales, explained by the high growth rate of the instabilities. Finally, we calculated the structure functions of velocity and density fluctuations in the local reference frame defined by the direction of the magnetic lines. The results indicate that in some cases the instabilities significantly increase the anisotropy of fluctuations. These results, even though preliminary and restricted to very specific conditions, show that the physical properties of turbulence in collisionless plasmas, such as those found in the ICM, may be very different from what has been widely believed.
Abstract:
Consider N sites randomly and uniformly distributed in a d-dimensional hypercube. A walker explores this disordered medium by going to the nearest site that has not been visited in the last $\mu$ (memory) steps. The walker's trajectory is composed of a transient part and a periodic part (cycle). For one-dimensional systems, walkers may or may not explore all of the available space, giving rise to a crossover between localized and extended regimes at the critical memory $\mu_1 = \log_2 N$. The deterministic rule can be softened to consider more realistic situations with the inclusion of a stochastic parameter T (temperature). In this case, the walker's movement is driven by a probability density function parameterized by T and a cost function. The cost function increases with the distance between two sites and favors hops to closer sites. As the temperature increases, the walker can escape from cycles that are reminiscent of the deterministic nature and extend the exploration. Here, we report an analytical model and numerical studies of the influence of the temperature and the critical memory on the exploration of one-dimensional disordered systems.
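A minimal simulation sketch of the stochastic walk just described, assuming a Boltzmann-like hop weight exp(-distance / T) as the cost-based rule (one plausible choice; the authors' exact cost function is not given here). It also assumes mu < n_sites, so an allowed site always exists; all names are illustrative.

```python
import numpy as np

def stochastic_walk_1d(n_sites, mu, temperature, n_steps, rng=None):
    """Stochastic walk over n_sites uniform random points in [0, 1):
    each step hops to a site not visited in the last mu steps, chosen
    with probability proportional to exp(-distance / temperature)."""
    rng = np.random.default_rng(rng)
    sites = np.sort(rng.uniform(size=n_sites))
    pos, trajectory = 0, [0]
    for _ in range(n_steps):
        recent = set(trajectory[-mu:]) if mu > 0 else set()
        allowed = [i for i in range(n_sites) if i != pos and i not in recent]
        d = np.abs(sites[allowed] - sites[pos])       # hop distances
        w = np.exp(-d / temperature)                  # Boltzmann-like weights
        pos = allowed[rng.choice(len(allowed), p=w / w.sum())]
        trajectory.append(pos)
    return sites, trajectory
```

At low temperature this reduces to nearly deterministic nearest-allowed-site hops, while raising the temperature lets the walker escape the cycles, mirroring the crossover the abstract describes.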