986 results for generalized Pareto distribution


Relevance: 90.00%

Abstract:

In this paper we provide a connection between the geometrical properties of the attractor of a chaotic dynamical system and the distribution of extreme values. We show that the extremes of so-called physical observables are distributed according to the classical generalised Pareto distribution and derive explicit expressions for the scaling and the shape parameter. In particular, we derive that the shape parameter does not depend on the chosen observables, but only on the partial dimensions of the invariant measure on the stable, unstable, and neutral manifolds. The shape parameter is negative and is close to zero when high-dimensional systems are considered. This result agrees with what was derived recently using the generalized extreme value approach. Combining the results obtained using such physical observables and the properties of the extremes of distance observables, it is possible to derive estimates of the partial dimensions of the attractor along the stable and the unstable directions of the flow. Moreover, by writing the shape parameter in terms of moments of the extremes of the considered observable and by using linear response theory, we relate the sensitivity to perturbations of the shape parameter to the sensitivity of the moments, of the partial dimensions, and of the Kaplan–Yorke dimension of the attractor. Preliminary numerical investigations provide encouraging results on the applicability of the theory presented here. The results presented here do not apply to all combinations of Axiom A systems and observables, but the breakdown seems to be related to very special geometrical configurations.
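
As a rough illustration of the peaks-over-threshold setting behind this abstract, the sketch below fits a GPD to exceedances of a bounded synthetic observable with scipy; the series, the threshold quantile and the seed are arbitrary choices, not the authors' setup.

```python
# Minimal peaks-over-threshold sketch (not the authors' derivation): fit a GPD
# to exceedances of a scalar observable recorded along a trajectory.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
observable = rng.beta(2, 5, size=100_000)     # placeholder bounded series

threshold = np.quantile(observable, 0.99)     # high quantile as threshold
excesses = observable[observable > threshold] - threshold

# Fit the GPD to the excesses; floc=0 keeps the location at the threshold.
shape, loc, scale = genpareto.fit(excesses, floc=0)
print(f"shape (xi) = {shape:.3f}, scale = {scale:.3f}")
# A bounded observable like this one should give a negative shape estimate.
```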

Relevance: 90.00%

Abstract:

The aim of this paper is to analyze extremal events using Generalized Pareto Distributions (GPD), considering explicitly the uncertainty about the threshold. Current practice determines this quantity empirically and proceeds by estimating the GPD parameters based on data beyond it, discarding all the information available below the threshold. We introduce a mixture model that combines a parametric form for the center and a GPD for the tail of the distribution and uses all observations for inference about the unknown parameters from both distributions, the threshold included. Prior distributions for the parameters are indirectly obtained through the elicitation of expert quantiles. Posterior inference is available through Markov chain Monte Carlo (MCMC) methods. Simulations are carried out in order to analyze the performance of our proposed model under a wide range of scenarios. Those scenarios approximate realistic situations found in the literature. We also apply the proposed model to a real dataset, the Nasdaq 100, an index of the financial market that presents many extreme events. Important issues such as predictive analysis and model selection are considered, along with possible modeling extensions.
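
A minimal sketch of a body-plus-tail density of the kind described here, assuming a gamma body below the threshold and a GPD above it; the gamma choice and all parameter values are illustrative, not taken from the paper.

```python
# Sketch of a body-plus-tail density: gamma below the threshold u, GPD above it,
# weighted by the gamma tail mass so the density integrates to one.
import numpy as np
from scipy.stats import gamma, genpareto

def mixture_pdf(x, a, b, u, xi, sigma):
    x = np.asarray(x, dtype=float)
    body = gamma.pdf(x, a, scale=b)
    tail = (1 - gamma.cdf(u, a, scale=b)) * genpareto.pdf(x - u, xi, scale=sigma)
    return np.where(x <= u, body, tail)

x = np.linspace(0.01, 40.0, 2000)
dens = mixture_pdf(x, a=2.0, b=1.5, u=6.0, xi=0.2, sigma=2.0)
print((dens * (x[1] - x[0])).sum())   # ≈ 1: the two pieces glue into a proper density
```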

Relevance: 90.00%

Abstract:

In this paper, we propose a flexible cure rate survival model by assuming that the number of competing causes of the event of interest follows the Conway-Maxwell distribution and that the time to the event follows the generalized gamma distribution. This distribution can be used to model survival data when the hazard rate function is increasing, decreasing, bathtub-shaped or unimodal, and it includes some distributions commonly used in lifetime analysis as particular cases. Some appropriate matrices are derived in order to evaluate local influence on the parameter estimates under different perturbation schemes, and some global influence measures are also investigated. Finally, a data set from the medical area is analysed.
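
The claim about hazard shapes can be illustrated with the generalized gamma alone; the sketch below evaluates its hazard h(t) = f(t)/S(t) with scipy for a few arbitrary parameter pairs. The cure-rate and Conway-Maxwell parts of the model are not reproduced.

```python
# Sketch: hazard function h(t) = f(t) / S(t) of the generalized gamma distribution
# for a few illustrative parameter choices (values are arbitrary).
import numpy as np
from scipy.stats import gengamma

t = np.linspace(0.05, 5, 200)
for a, c in [(0.5, 1.0), (2.0, 1.0), (1.5, 0.5)]:
    hazard = gengamma.pdf(t, a, c) / gengamma.sf(t, a, c)
    print(f"a={a}, c={c}: h(0.05)={hazard[0]:.3f}, h(5)={hazard[-1]:.3f}")
```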

Relevance: 90.00%

Abstract:

We show that the wavefunctions ⟨pq;λ|n⟩ of the harmonic oscillator in the squeezed state representation have the generalized Hermite polynomials as their natural orthogonal polynomials. These wavefunctions lead to generalized Poisson distributions P_n(p,q;λ), which satisfy an interesting pseudo-diffusion equation, ∂P_n(p,q;λ)/∂λ = (1/4)[∂²/∂p² − (1/λ²) ∂²/∂q²] P_n(p,q;λ), in which the squeeze parameter λ plays the role of time. The entropies S_n(λ) have minima at the unsqueezed states (λ=1), which means that squeezing or stretching decreases the correlation between momentum p and position q. © 1992.

Relevance: 90.00%

Abstract:

For the first time, we introduce a generalized form of the exponentiated generalized gamma distribution [Cordeiro et al., The exponentiated generalized gamma distribution with application to lifetime data, J. Statist. Comput. Simul. 81 (2011), pp. 827-842] that is the baseline for the log-exponentiated generalized gamma regression model. The new distribution can accommodate increasing, decreasing, bathtub- and unimodal-shaped hazard functions. A second advantage is that it includes classical distributions reported in the lifetime literature as special cases. We obtain explicit expressions for the moments of the baseline distribution of the new regression model. The proposed model can be applied to censored data since it includes as sub-models several widely known regression models, and it can therefore be used more effectively in the analysis of survival data. We obtain maximum likelihood estimates for the model parameters by considering censored data. We show that our extended regression model is very useful by means of two applications to real data.
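
A hedged sketch of the exponentiation idea behind such baselines: raise a baseline cdf to a power alpha, here with scipy's generalized gamma as baseline. The parameterization is generic and not necessarily the one used by Cordeiro et al.

```python
# Sketch of a Lehmann-type exponentiated family: the cdf of a baseline distribution
# raised to a power alpha, with the generalized gamma as baseline.
import numpy as np
from scipy.stats import gengamma

def exp_gg_cdf(x, alpha, a, c):
    return gengamma.cdf(x, a, c) ** alpha

def exp_gg_pdf(x, alpha, a, c):
    # d/dx [G(x)^alpha] = alpha * G(x)^(alpha - 1) * g(x)
    return alpha * gengamma.cdf(x, a, c) ** (alpha - 1) * gengamma.pdf(x, a, c)

x = np.linspace(0.01, 5, 300)
print(exp_gg_cdf(5.0, alpha=2.0, a=1.5, c=1.0))   # close to 1 in the upper tail
```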

Relevance: 90.00%

Abstract:

We study a five-parameter lifetime distribution, called the McDonald extended exponential model, which generalizes the exponential, generalized exponential, Kumaraswamy exponential and beta exponential distributions, among others. We obtain explicit expressions for the moments and incomplete moments, quantile and generating functions, mean deviations, Bonferroni and Lorenz curves and the Gini concentration index. The method of maximum likelihood and a Bayesian procedure are adopted for estimating the model parameters. The applicability of the new model is illustrated by means of a real data set.
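
Among the quantities listed, the Lorenz curve and Gini index are easy to illustrate empirically; the sketch below computes both from a simulated exponential sample, with an arbitrary sample size and scale.

```python
# Sketch: empirical Lorenz ordinates and Gini index from a sample.
import numpy as np

def lorenz_gini(sample):
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    lorenz = np.cumsum(x) / x.sum()                              # ordinates at p = i/n
    gini = 2 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n
    return lorenz, gini

rng = np.random.default_rng(1)
_, g = lorenz_gini(rng.exponential(scale=2.0, size=50_000))
print(g)   # the exponential distribution has Gini = 0.5
```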

Relevance: 90.00%

Abstract:

In this paper, we propose a random intercept Poisson model in which the random effect is assumed to follow a generalized log-gamma (GLG) distribution. This random effect accommodates (or captures) the overdispersion in the counts and induces within-cluster correlation. We derive the first two moments for the marginal distribution as well as the intraclass correlation. Even though numerical integration methods are, in general, required for deriving the marginal models, we obtain the multivariate negative binomial model from a particular parameter setting of the hierarchical model. An iterative process is derived for obtaining the maximum likelihood estimates for the parameters in the multivariate negative binomial model. Residual analysis is proposed and two applications with real data are given for illustration. (C) 2011 Elsevier B.V. All rights reserved.
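
A small simulation sketch of the special case alluded to at the end: a mean-one gamma multiplicative random effect on a Poisson rate, whose marginal counts are negative binomial. The parameter values are arbitrary and no fitting is done.

```python
# Sketch: random-intercept Poisson counts with a gamma-distributed multiplicative
# effect (one special case inside the GLG family), checked against the negative
# binomial mean and variance.  Simulation only, no estimation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_clusters, cluster_size, mu, phi = 20_000, 4, 3.0, 2.0

frailty = rng.gamma(shape=phi, scale=1.0 / phi, size=n_clusters)   # mean-one effect
rates = mu * np.repeat(frailty, cluster_size)
counts = rng.poisson(rates)

# Marginally each count follows a negative binomial with mean mu and size phi.
print(counts.mean(), counts.var())                  # var ≈ mu + mu**2 / phi
print(stats.nbinom.mean(phi, phi / (phi + mu)),     # scipy's (n, p) parameterization
      stats.nbinom.var(phi, phi / (phi + mu)))
```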

Relevance: 90.00%

Abstract:

Using a modified deprivation (or poverty) function, in this paper we theoretically study the changes in poverty with respect to the 'global' mean and variance of the income distribution using Indian survey data. We show that when the income obeys a log-normal distribution, a rising mean income generally indicates a reduction in poverty, while an increase in the variance of the income distribution increases poverty. This altruistic view for a developing economy, however, is no longer tenable once the poverty index is found to follow a Pareto distribution. Here, although a rising mean income indicates a reduction in poverty, due to the presence of an inflexion point in the poverty function there is a critical value of the variance below which poverty decreases with increasing variance, while beyond this value poverty undergoes a steep increase followed by a decrease as the variance rises further. Identifying this inflexion point as the poverty line, we show that the Pareto poverty function satisfies all three standard axioms of a poverty index [N.C. Kakwani, Econometrica 43 (1980) 437; A.K. Sen, Econometrica 44 (1976) 219], whereas the log-normal distribution falls short of this requisite. Following these results, we make quantitative predictions to correlate a developing with a developed economy. © 2006 Elsevier B.V. All rights reserved.
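
As a simplified numerical illustration of the log-normal part of the argument, the sketch below computes a plain headcount ratio P(income < z) for a log-normal income with a given mean and variance. It is not the authors' modified deprivation function, and the poverty line z, mean m and variance v are arbitrary numbers.

```python
# Sketch: headcount poverty ratio P(income < z) under a log-normal income
# distribution with 'global' mean m and variance v.
import numpy as np
from scipy.stats import norm

def headcount_lognormal(z, m, v):
    sigma2 = np.log(1 + v / m**2)        # underlying normal parameters from (m, v)
    mu = np.log(m) - sigma2 / 2
    return norm.cdf((np.log(z) - mu) / np.sqrt(sigma2))

z = 1.0                                       # poverty line
print(headcount_lognormal(z, m=2.0, v=1.0))   # a higher mean lowers the headcount
print(headcount_lognormal(z, m=3.0, v=1.0))
print(headcount_lognormal(z, m=2.0, v=4.0))   # a higher variance raises it here
```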

Relevance: 90.00%

Abstract:

The Dirichlet distribution is a multivariate generalization of the Beta distribution. It is an important multivariate continuous distribution in probability and statistics. In this report, we review the Dirichlet distribution and study its properties, including statistical and information-theoretic quantities involving this distribution. Also, relationships between the Dirichlet distribution and other distributions are discussed. There are several ways to generate random variables with a Dirichlet distribution; the stick-breaking approach and the Pólya urn method are discussed. In Bayesian statistics, the Dirichlet distribution and the generalized Dirichlet distribution can both be a conjugate prior for the Multinomial distribution. The Dirichlet distribution has many applications in different fields. We focus on the unsupervised learning of a finite mixture model based on the Dirichlet distribution. The Initialization Algorithm and the Dirichlet Mixture Estimation Algorithm are both reviewed for estimating the parameters of a Dirichlet mixture. Three experimental results are shown: the estimation of artificial histograms, the summarization of image databases and human skin detection.
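
A short sketch of the stick-breaking construction mentioned above, compared against numpy's direct Dirichlet sampler; the concentration vector is arbitrary.

```python
# Sketch of stick-breaking for a finite Dirichlet: successively break off
# Beta-distributed fractions of the remaining stick to obtain one draw from
# Dirichlet(alpha_1, ..., alpha_K).
import numpy as np

def dirichlet_stick_breaking(alpha, rng):
    alpha = np.asarray(alpha, dtype=float)
    remaining, p = 1.0, np.empty_like(alpha)
    for k in range(len(alpha) - 1):
        frac = rng.beta(alpha[k], alpha[k + 1:].sum())   # fraction of what is left
        p[k] = remaining * frac
        remaining *= 1 - frac
    p[-1] = remaining
    return p

rng = np.random.default_rng(3)
alpha = [2.0, 3.0, 5.0]
draws = np.array([dirichlet_stick_breaking(alpha, rng) for _ in range(20_000)])
print(draws.mean(axis=0))                     # ≈ alpha / sum(alpha) = [0.2, 0.3, 0.5]
print(rng.dirichlet(alpha, 20_000).mean(axis=0))
```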

Relevance: 80.00%

Abstract:

This paper presents a game-theory-based methodology to allocate transmission costs, considering cooperation and competition between producers. As an original contribution, it determines the degree of participation in the additional costs according to demand behaviour. A comparative study was carried out between the results obtained using the Nucleolus and the Shapley value and those of other techniques, such as the Averages Allocation method and the Generalized Generation Distribution Factors (GGDF) method. As an example, a six-node network was used for the simulations. The results demonstrate the ability to find adequate solutions in an open-access network environment.
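
For concreteness, the sketch below computes the Shapley value of a tiny, entirely hypothetical three-producer cost game by averaging marginal costs over all orderings; the Nucleolus and GGDF computations of the paper are not reproduced.

```python
# Sketch: Shapley value of a small cost-allocation game.  The three generators and
# the coalition costs are hypothetical, used only to show the computation.
from itertools import permutations

players = ("G1", "G2", "G3")
cost = {                                   # hypothetical stand-alone / coalition costs
    frozenset(): 0.0,
    frozenset({"G1"}): 10.0, frozenset({"G2"}): 8.0, frozenset({"G3"}): 6.0,
    frozenset({"G1", "G2"}): 14.0, frozenset({"G1", "G3"}): 13.0,
    frozenset({"G2", "G3"}): 11.0,
    frozenset({"G1", "G2", "G3"}): 17.0,
}

shapley = dict.fromkeys(players, 0.0)
orders = list(permutations(players))
for order in orders:
    coalition = frozenset()
    for p in order:                        # marginal cost of joining the coalition
        shapley[p] += cost[coalition | {p}] - cost[coalition]
        coalition = coalition | {p}
shapley = {p: v / len(orders) for p, v in shapley.items()}
print(shapley)                             # allocations sum to the grand-coalition cost
```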

Relevance: 80.00%

Abstract:

Power Laws (PLs), also called Pareto laws or Zipf laws, are statistical distributions with countless practical applications in natural and artificial systems. Some examples are the variation of personal or corporate incomes, the occurrence of words in texts, the repetition of sounds or groups of sounds in musical compositions, the number of casualties in wars and other catastrophes, the magnitude of earthquakes, the number of book or CD sales on the internet, and the number of most-visited websites, among many others. Vilfredo Pareto (1897-1906) states, in the political economy textbook "Cours d'Economie Politique", that a large part of the world economy follows a particular distribution in which 20% of the population holds 80% of the country's total wealth, so that a small fraction of society controls the largest share of the money. This summarises the behaviour of a variable that follows a Pareto distribution (or power law). This work studies in detail the application of power laws to internet phenomena, namely the number of most-visited sites, the number of links on a given site, the distribution of nodes in an internet network, the number of books sold and the sales in online auctions. The results obtained allow us to conclude that all the data studied are well approximated, on a logarithmic scale, by a straight line with negative slope, thus following a Pareto distribution. The development and growth of the Web have led to an increase in the number of users, contents and sites. Many of the examples presented in this work will be the subject of new studies and new conclusions. Because the internet plays a leading role in modern societies, it is in constant evolution, and it is increasingly possible to identify internet phenomena associated with power laws.
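
A minimal sketch of the log-log straight-line check described here, using synthetic Pareto draws in place of real visit counts; the tail index and sample size are arbitrary.

```python
# Sketch: rank plot of heavy-tailed data on a log-log scale and the slope of the
# fitted straight line (negative slope suggests power-law-like behaviour).
import numpy as np

rng = np.random.default_rng(4)
visits = (rng.pareto(a=1.2, size=5_000) + 1) * 100    # synthetic "visit counts"

ranked = np.sort(visits)[::-1]                        # rank 1 = most visited
ranks = np.arange(1, ranked.size + 1)
slope, intercept = np.polyfit(np.log10(ranks), np.log10(ranked), deg=1)
print(f"log-log slope ≈ {slope:.2f}")
```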

Relevance: 80.00%

Abstract:

This dissertation addresses the application of Power Laws (PLs), also called Pareto laws or Zipf laws, to economic data. PLs are statistical distributions widely used to understand natural and artificial systems. PLs go back to Vilfredo Pareto who, in the nineteenth century, published the political economy textbook "Cours d'Economie Politique". In that textbook he notes that a large part of the world economy follows a PL in which 20% of the population holds 80% of the country's wealth. This property characterises a variable that follows a Pareto distribution (or PL). Since then, PLs have been applied to other phenomena, namely the occurrence of words in texts, people's surnames, the variation of personal or corporate incomes, the number of victims of floods or earthquakes, visits to websites, and so on. In this work, a set of data on the private or collective fortunes of people and organisations is studied. More specifically, we analyse data collected on the fortunes of the richest women in the world, the richest men in technology, the richest families, the 20 richest women in America, the 400 richest men in America, the richest men in the world, the richest establishments in the world, the richest companies in the world and the richest countries in the world, as well as the stock-market value of some companies. The results obtained show that part of these data are well approximated by a single PL and the remaining data by a double PL, revealing differences in how fortunes grow across the cases studied. As future work, these and other data will be analysed with other statistical distributions, such as the exponential or the lognormal, which behave similarly to a PL, so that the results can be compared. Another interesting aspect will be to find an analytical explanation for the advantages of approximating economic data by a single PL versus a double PL.
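
A rough sketch of the single versus double power-law comparison, fitting one slope and then two slopes joined at a hypothetical break rank on a log-log plot of synthetic fortunes.

```python
# Sketch: single vs double power law on a log-log rank plot.  The "double PL" is
# fitted here simply as two straight lines joined at a chosen break rank; the break
# position and the synthetic wealth data are illustrative only.
import numpy as np

rng = np.random.default_rng(5)
wealth = np.sort((rng.pareto(a=1.5, size=400) + 1) * 1e6)[::-1]   # synthetic fortunes

log_r = np.log10(np.arange(1, wealth.size + 1))
log_w = np.log10(wealth)

single = np.polyfit(log_r, log_w, deg=1)                   # one slope for all ranks
break_idx = 50                                             # hypothetical break rank
head = np.polyfit(log_r[:break_idx], log_w[:break_idx], deg=1)
tail = np.polyfit(log_r[break_idx:], log_w[break_idx:], deg=1)
print(f"single PL slope {single[0]:.2f}; double PL slopes {head[0]:.2f} / {tail[0]:.2f}")
```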

Relevance: 80.00%

Abstract:

Power-law (PL) distributions, such as the Pareto law and the Zipf law, are statistical distributions in which the size of an event is inversely proportional to its frequency. These power laws are characterised by their long tails. According to Vilfredo Pareto (1896), the Italian engineer, scientist, sociologist and economist behind the Pareto law, 80% of the consequences come from 20% of the causes. In his view, a large part of the world economy follows such a distribution, with 80% of world wealth held by 20% of the population, or 80% of world pollution produced by 20% of countries. These percentages may vary within the ranges [75-85] and [15-25]. The same split can be applied to time management, where only 20% of the time devoted to a given subject produces about 80% of the results obtained. The Pareto law, also known as the 80/20 rule, has applications in many sciences and in the physical world, notably in biodiversity. The number of occurrences of extreme phenomena, together with their impact on telecommunication networks in disaster situations, in the immediate support of populations and in a later reconstruction phase, has increasingly concerned civil-protection authorities and telecommunication operators. The goal is to prepare and adapt their infrastructures to provide an effective response to these episodes. This work studies the behaviour of several extreme phenomena (critical events) and approximates the data by a Pareto distribution (Pareto law) or power law. Finally, it speculates on the influence of critical events on the use of mobile networks. It is essential that mobile networks be prepared to deal with the repercussions of such phenomena.
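
The 80/20 rule itself can be checked numerically; the sketch below computes the share of the total held by the top 20% of a synthetic Pareto sample. The tail index of about 1.16 is chosen because it is known to give roughly an 80% share.

```python
# Sketch: what share of the total is held by the top 20% of a Pareto sample?
import numpy as np

rng = np.random.default_rng(6)
x = rng.pareto(a=1.16, size=200_000) + 1          # Pareto sample, tail index ≈ 1.16

x_sorted = np.sort(x)[::-1]
top20 = x_sorted[: int(0.2 * x_sorted.size)]
# Close to 80%, up to sampling noise in such a heavy tail.
print(f"share of total held by the top 20%: {top20.sum() / x_sorted.sum():.2%}")
```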

Relevance: 80.00%

Abstract:

Daily precipitation is recorded as the total amount of water collected by a rain-gauge in 24 h. Events are modelled as a Poisson process and the 24 h precipitation by a Generalised Pareto Distribution (GPD) of excesses. Hazard assessment is complete when estimates of the Poisson rate and the distribution parameters, together with a measure of their uncertainty, are obtained. The shape parameter of the GPD determines the support of the variable: the Weibull domain of attraction (DA) corresponds to variables with finite support, as should be the case for natural phenomena. However, the Fréchet DA has been reported for daily precipitation, which implies an infinite support and a heavy-tailed distribution. Bayesian techniques are used to estimate the parameters. The approach is illustrated with precipitation data from the Eastern coast of the Iberian Peninsula affected by severe convective precipitation. The estimated GPD is mainly in the Fréchet DA, something incompatible with the common-sense assumption that precipitation is a bounded phenomenon. The bounded character of precipitation is then taken as an a priori hypothesis. Consistency of this hypothesis with the data is checked in two cases: using the raw data (in mm) and using log-transformed data. As expected, Bayesian model checking clearly rejects the model in the raw-data case. However, the log-transformed data seem to be consistent with the model. This fact may be due to the adequacy of the log scale to represent positive measurements for which differences are better described in relative than in absolute terms.
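
As a rough sketch of the raw-versus-log comparison, the code below computes maximum-likelihood GPD shape estimates for exceedances of synthetic wet-day amounts and of their logarithms; it uses MLE rather than the paper's Bayesian estimation and invented data rather than the Iberian records.

```python
# Sketch: GPD shape parameter for exceedances of raw vs log-transformed data.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(7)
rain = rng.gamma(shape=0.6, scale=12.0, size=30_000)      # synthetic wet-day amounts, mm

def gpd_shape(series, q=0.95):
    u = np.quantile(series, q)                            # threshold at the q-quantile
    xi, _, _ = genpareto.fit(series[series > u] - u, floc=0)
    return xi

print("shape, raw data:", round(gpd_shape(rain), 3))
print("shape, log data:", round(gpd_shape(np.log(rain)), 3))
```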

Relevance: 80.00%

Abstract:

A regularization method based on the non-extensive maximum entropy principle is devised. Special emphasis is given to the q=1/2 case. We show that, when the residual principle is considered as a constraint, the q=1/2 generalized distribution of Tsallis yields a regularized solution for ill-conditioned problems. The regularized distribution devised in this way contains a component that corresponds to the well-known regularized solution of Tikhonov (1977).
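
The Tikhonov component referred to at the end can be sketched directly; the code below computes the classical ridge-regularized solution of an ill-conditioned least-squares problem with an arbitrary matrix, data and regularization weight. The Tsallis q=1/2 construction itself is not reproduced.

```python
# Sketch: Tikhonov (ridge) regularization of an ill-conditioned linear system Ax = b,
# compared with the unregularized least-squares solution.
import numpy as np

rng = np.random.default_rng(8)
A = np.vander(np.linspace(0, 1, 20), 8, increasing=True)   # ill-conditioned design
x_true = rng.normal(size=8)
b = A @ x_true + 1e-3 * rng.normal(size=20)                # noisy right-hand side

lam = 1e-3                                                 # regularization weight
x_tikhonov = np.linalg.solve(A.T @ A + lam * np.eye(8), A.T @ b)
x_naive = np.linalg.lstsq(A, b, rcond=None)[0]
print("norm of regularized solution:", np.linalg.norm(x_tikhonov))
print("norm of naive solution:      ", np.linalg.norm(x_naive))
```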