979 results for Multivariate generalized t-distribution
Abstract:
A four-parameter extension of the generalized gamma distribution capable of modelling a bathtub-shaped hazard rate function is defined and studied. The beauty and importance of this distribution lie in its ability to model monotone and non-monotone failure rate functions, which are quite common in lifetime data analysis and reliability. The new distribution has a number of well-known lifetime distributions as special sub-models, such as the exponentiated Weibull, exponentiated generalized half-normal, exponentiated gamma and generalized Rayleigh, among others. We derive two infinite-sum representations for its moments. We calculate the density of the order statistics and two expansions for their moments. The method of maximum likelihood is used for estimating the model parameters, and the observed information matrix is obtained. Finally, a real data set from the medical field is analysed.
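The four-parameter family described in this abstract is not available in standard libraries; as a rough, hedged illustration of the fitting step it extends, the sketch below fits the three-parameter generalized gamma in scipy.stats.gengamma by maximum likelihood and inspects the implied hazard rate h(t) = f(t)/S(t). The simulated lifetimes and parameter values are placeholders, not the paper's medical data set.

```python
import numpy as np
from scipy.stats import gengamma

rng = np.random.default_rng(42)

# Placeholder lifetimes; the paper's medical data set is not reproduced here.
lifetimes = gengamma.rvs(a=1.5, c=0.8, scale=2.0, size=500, random_state=rng)

# Maximum-likelihood fit with the location fixed at zero (lifetimes are positive).
a_hat, c_hat, loc_hat, scale_hat = gengamma.fit(lifetimes, floc=0)
print(f"MLE: a={a_hat:.3f}, c={c_hat:.3f}, scale={scale_hat:.3f}")

# Hazard rate h(t) = f(t) / S(t); its shape (monotone, bathtub, unimodal)
# is what the four-parameter extension is designed to capture more flexibly.
t = np.linspace(0.05, 10, 200)
hazard = gengamma.pdf(t, a_hat, c_hat, loc=0, scale=scale_hat) / \
         gengamma.sf(t, a_hat, c_hat, loc=0, scale=scale_hat)
print("hazard at t = 0.5, 2, 8:", np.round(np.interp([0.5, 2, 8], t, hazard), 3))
```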
Abstract:
There are two main types of data sources of income distributions in China: household survey data and grouped data. Household survey data are typically available for isolated years and individual provinces. In comparison, aggregate or grouped data are typically available more frequently and usually have national coverage. In principle, grouped data allow investigation of the change of inequality over longer, continuous periods of time, and the identification of patterns of inequality across broader regions. Nevertheless, a major limitation of grouped data is that only mean (average) income and income shares of quintile or decile groups of the population are reported. Directly using grouped data reported in this format is equivalent to assuming that all individuals in a quintile or decile group have the same income. This potentially distorts the estimate of inequality within each region. The aim of this paper is to apply an improved econometric method designed to use grouped data to study income inequality in China. A generalized beta distribution is employed to model income inequality in China at various levels and periods of time. The generalized beta distribution is more general and flexible than the lognormal distribution that has been used in past research, and also relaxes the assumption of a uniform distribution of income within quintile and decile groups of populations. The paper studies the nature and extent of inequality in rural and urban China over the period 1978 to 2002. Income inequality in the whole of China is then modeled using a mixture of province-specific distributions. The estimated results are used to study the trends in national inequality, and to discuss the empirical findings in the light of economic reforms, regional policies, and globalization of the Chinese economy.
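To make the within-group distortion concrete, here is a hedged sketch with made-up quintile shares (not the Chinese survey figures, and not the paper's generalized beta specification): it compares the Gini coefficient implied by treating everyone in a quintile as having the same income with the Gini from a lognormal fitted to the same grouped shares, using the closed form Gini = 2Φ(σ/√2) − 1.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

# Hypothetical quintile income shares (poorest to richest); not the paper's data.
shares = np.array([0.06, 0.11, 0.16, 0.23, 0.44])
p = np.arange(1, 6) / 5.0            # cumulative population shares
L = np.cumsum(shares)                # cumulative income shares (Lorenz ordinates)

# Gini under the "equal income within each quintile" assumption:
# exact for the piecewise-linear Lorenz curve through (0, 0) and (p_i, L_i).
p0, L0 = np.concatenate(([0.0], p)), np.concatenate(([0.0], L))
gini_grouped = 1.0 - np.sum((p0[1:] - p0[:-1]) * (L0[1:] + L0[:-1]))

# Lognormal alternative: its Lorenz curve is L(p) = Phi(Phi^-1(p) - sigma),
# so sigma can be chosen to match the interior Lorenz ordinates.
def loss(sigma):
    return np.sum((norm.cdf(norm.ppf(p[:-1]) - sigma) - L[:-1]) ** 2)

sigma_hat = minimize_scalar(loss, bounds=(0.01, 3.0), method="bounded").x
gini_lognormal = 2.0 * norm.cdf(sigma_hat / np.sqrt(2.0)) - 1.0

print(f"Gini, equal incomes within quintiles: {gini_grouped:.3f}")
print(f"Gini, fitted lognormal (sigma={sigma_hat:.3f}): {gini_lognormal:.3f}")
```

With these placeholder shares the lognormal Gini exceeds the grouped-data Gini, which is the sense in which assuming uniform incomes within quintiles understates inequality.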
Abstract:
Standard practice in wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally large enough to mask the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where recent years have been extremely disastrous, making it possible to compare the hazard assessment based on data prior to those years with the analysis that includes them. With our approach, no significant change is detected once statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. The occurrence of events in time is assumed to be Poisson distributed. The wave height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave heights are assumed to be independent from event to event and also independent of their occurrence times. A threshold for excesses is assessed empirically. The other three parameters (the Poisson rate and the shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean Sea and for experience with these phenomena. The posterior distribution of the parameters yields posterior distributions of derived quantities such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
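The Bayesian computation in BGPE v2.0 is not reproduced here; the sketch below is a hedged frequentist stand-in for the same model structure, fitting a GPD to threshold excesses by maximum likelihood, estimating the Poisson rate of exceedances, and converting both into an N-year return level. The data, threshold quantile and record length are placeholders.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(7)

# Placeholder storm wave heights (m) over a 30-year record; not the Spanish data.
years = 30
waves = 2.0 + rng.weibull(1.4, size=600) * 1.5

# Empirical threshold and excesses (peaks-over-threshold setup).
u = np.quantile(waves, 0.90)
excesses = waves[waves > u] - u
lam = len(excesses) / years            # Poisson rate of exceedances per year

# MLE of the GPD shape (xi) and scale (sigma) for the excesses.
xi, _, sigma = genpareto.fit(excesses, floc=0)

# N-year return level: the level exceeded on average once every N years.
def return_level(N):
    if abs(xi) < 1e-6:
        return u + sigma * np.log(lam * N)
    return u + (sigma / xi) * ((lam * N) ** xi - 1.0)

for N in (10, 50, 100):
    print(f"{N:>4d}-year return level: {return_level(N):.2f} m")
```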
Abstract:
There are several versions of the lognormal distribution in the statistical literature, one of which is based on the exponential transformation of the generalized normal (GN) distribution. This paper presents a Bayesian analysis of the generalized lognormal distribution (logGN), assuming independent non-informative Jeffreys priors for the parameters, and describes the procedure for implementing the Gibbs sampler to obtain the posterior distributions of the parameters. The results are used to analyze failure-time models with right-censored and uncensored data. The proposed method is illustrated using real failure-time data for computers.
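The logGN has an extra shape parameter and its full conditionals are not reproduced here; as a hedged illustration of the Gibbs machinery on the ordinary lognormal sub-case (uncensored data only, Jeffreys prior p(μ, σ²) ∝ 1/σ²), the conditionals are conjugate: μ given σ² is normal and σ² given μ is inverse gamma. The data are simulated placeholders, not the computer failure times.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder uncensored failure times; not the computer data from the paper.
y = rng.lognormal(mean=1.0, sigma=0.5, size=80)
z = np.log(y)
n, zbar = len(z), z.mean()

# Gibbs sampler under the Jeffreys prior p(mu, sigma^2) proportional to 1/sigma^2.
n_iter, burn = 5000, 1000
mu, sigma2 = zbar, z.var()
draws = np.empty((n_iter, 2))
for t in range(n_iter):
    # mu | sigma^2, z  ~  Normal(zbar, sigma^2 / n)
    mu = rng.normal(zbar, np.sqrt(sigma2 / n))
    # sigma^2 | mu, z  ~  Inverse-Gamma(n/2, sum((z - mu)^2) / 2)
    rate = 0.5 * np.sum((z - mu) ** 2)
    sigma2 = 1.0 / rng.gamma(shape=n / 2.0, scale=1.0 / rate)
    draws[t] = mu, sigma2

post = draws[burn:]
print("posterior mean of mu     :", post[:, 0].mean().round(3))
print("posterior mean of sigma^2:", post[:, 1].mean().round(3))
```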
Abstract:
The generalized exponential distribution, proposed by Gupta and Kundu (1999), is a good alternative to standard lifetime distributions such as the exponential, Weibull or gamma. Several authors have considered Bayesian estimation of the parameters of the generalized exponential distribution, assuming independent gamma priors and other informative priors. In this paper, we consider a Bayesian analysis of the generalized exponential distribution assuming conventional non-informative prior distributions, such as the Jeffreys and reference priors, to estimate the parameters. These priors are compared with independent gamma priors for both parameters. The comparison is carried out by examining the frequentist coverage probabilities of Bayesian credible intervals. We show that the maximal data information prior leads to an improper posterior distribution for the parameters of a generalized exponential distribution. It is also shown that the choice of the parameter of interest matters for the reference prior, since different choices lead to different reference priors in this case. Numerical inference for the parameters is illustrated using data sets of different sizes and MCMC (Markov chain Monte Carlo) methods.
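As a hedged sketch of the gamma-prior variant mentioned in the abstract (not the Jeffreys or reference-prior analysis, whose derivations are paper-specific), the code below runs a random-walk Metropolis sampler on the log-parameters of the generalized exponential density f(x; α, λ) = αλ e^{-λx}(1 − e^{-λx})^{α−1}, with independent Gamma priors whose hyperparameters and true parameter values are placeholders.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(1)

# Simulate generalized exponential data by inversion: F(x) = (1 - e^{-lam x})^alpha.
alpha_true, lam_true, n = 1.8, 0.9, 150
u = rng.uniform(size=n)
x = -np.log(1.0 - u ** (1.0 / alpha_true)) / lam_true

a0, b0 = 0.1, 0.1  # placeholder hyperparameters of the independent Gamma priors

def log_post(theta):
    """Log-posterior in theta = (log alpha, log lambda), Jacobian included."""
    alpha, lam = np.exp(theta)
    loglik = (n * np.log(alpha) + n * np.log(lam) - lam * x.sum()
              + (alpha - 1.0) * np.log1p(-np.exp(-lam * x)).sum())
    logprior = (gamma.logpdf(alpha, a0, scale=1.0 / b0)
                + gamma.logpdf(lam, a0, scale=1.0 / b0))
    return loglik + logprior + theta.sum()  # + log(alpha) + log(lambda) Jacobian

theta = np.log([1.0, 1.0])
draws, accepted = [], 0
for it in range(20000):
    prop = theta + rng.normal(scale=0.1, size=2)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta, accepted = prop, accepted + 1
    draws.append(np.exp(theta))

draws = np.array(draws[5000:])
print("acceptance rate:", round(accepted / 20000, 2))
print("posterior means (alpha, lambda):", draws.mean(axis=0).round(3))
```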
On the multivariate Huesler-Reiss distribution attracting the maxima of elliptical triangular arrays
Abstract:
2000 Mathematics Subject Classification: 62H10.
Abstract:
In this work, a methodology is implemented for including higher-order moments in portfolio selection, making use of the Generalized Hyperbolic Distribution, followed by a comparative analysis against the Markowitz model.
Abstract:
In this paper, an alternative skew Student-t family of distributions is studied. It is obtained as an extension of the generalized Student-t (GS-t) family introduced by McDonald and Newey [10]. The resulting extension can be seen as a reparametrization of the skewed GS-t distribution considered by Theodossiou [14]. A key element in its construction is that it can be stochastically represented as a mixture of an epsilon-skew-power-exponential distribution [1] and a generalized-gamma distribution. From this representation, we readily derive theoretical properties and easy-to-implement simulation schemes. Furthermore, we study some of its main properties, including the stochastic representation, moments, and asymmetry and kurtosis coefficients. We also derive the Fisher information matrix, which is shown to be nonsingular in some special cases, such as when the asymmetry parameter is null, that is, in the vicinity of symmetry, and discuss maximum-likelihood estimation. Simulation studies for some particular cases and a real data analysis are also reported, illustrating the usefulness of the proposed extension.
Abstract:
In this paper, we propose a random intercept Poisson model in which the random effect is assumed to follow a generalized log-gamma (GLG) distribution. This random effect accommodates (or captures) the overdispersion in the counts and induces within-cluster correlation. We derive the first two moments for the marginal distribution as well as the intraclass correlation. Even though numerical integration methods are, in general, required for deriving the marginal models, we obtain the multivariate negative binomial model from a particular parameter setting of the hierarchical model. An iterative process is derived for obtaining the maximum likelihood estimates for the parameters in the multivariate negative binomial model. Residual analysis is proposed and two applications with real data are given for illustration.
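The abstract notes that a particular parameter setting of the hierarchical model yields the multivariate negative binomial. The sketch below is a hedged univariate illustration of that mechanism: a Poisson count with a gamma-distributed multiplicative random intercept has a negative binomial marginal with variance μ + μ²/k. The generalized log-gamma effect of the paper is not implemented here, and the parameter values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)

mu, k, n = 4.0, 2.0, 200_000   # placeholder marginal mean and gamma shape

# Hierarchy: u_i ~ Gamma(k, scale=1/k) so E[u] = 1, then y_i | u_i ~ Poisson(mu * u_i).
u = rng.gamma(shape=k, scale=1.0 / k, size=n)
y = rng.poisson(mu * u)

print("empirical mean    :", y.mean().round(3), " theory:", mu)
print("empirical variance:", y.var().round(3),
      " theory (mu + mu^2/k):", mu + mu**2 / k)
# The marginal of y is negative binomial with size k and mean mu, which is
# exactly the overdispersion the random intercept is meant to capture.
```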
Abstract:
The Dirichlet distribution is a multivariate generalization of the Beta distribution. It is an important multivariate continuous distribution in probability and statistics. In this report, we review the Dirichlet distribution and study its properties, including statistical and information-theoretic quantities involving this distribution. Relationships between the Dirichlet distribution and other distributions are also discussed. There are several different ways to generate random variables with a Dirichlet distribution; the stick-breaking approach and the Pólya urn method are discussed. In Bayesian statistics, the Dirichlet distribution and the generalized Dirichlet distribution can both serve as a conjugate prior for the Multinomial distribution. The Dirichlet distribution has many applications in different fields. We focus on the unsupervised learning of a finite mixture model based on the Dirichlet distribution. The Initialization Algorithm and the Dirichlet Mixture Estimation Algorithm are both reviewed for estimating the parameters of a Dirichlet mixture. Three experimental results are shown for the estimation of artificial histograms, summarization of image databases and human skin detection.
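As a hedged sketch of the stick-breaking construction mentioned in the abstract, the code below generates finite-dimensional Dirichlet(α) draws by successively breaking the unit interval with Beta(αᵢ, Σ_{j>i} αⱼ) fractions, and checks the empirical mean against numpy's built-in sampler; the α values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)

def dirichlet_stick_breaking(alpha, rng):
    """Draw one sample from Dirichlet(alpha) via stick breaking."""
    alpha = np.asarray(alpha, dtype=float)
    K = alpha.size
    p = np.empty(K)
    remaining = 1.0
    for i in range(K - 1):
        # Break off a Beta(alpha_i, sum of the remaining alphas) fraction.
        v = rng.beta(alpha[i], alpha[i + 1:].sum())
        p[i] = v * remaining
        remaining *= 1.0 - v
    p[-1] = remaining
    return p

alpha = np.array([2.0, 1.0, 5.0, 0.5])      # placeholder concentration parameters
draws = np.array([dirichlet_stick_breaking(alpha, rng) for _ in range(20_000)])

print("stick-breaking mean :", draws.mean(axis=0).round(3))
print("numpy dirichlet mean:", rng.dirichlet(alpha, size=20_000).mean(axis=0).round(3))
print("theoretical mean    :", (alpha / alpha.sum()).round(3))
```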
Abstract:
1. Digital elevation models (DEMs) are often used in landscape ecology to retrieve elevation or first-derivative terrain attributes such as slope or aspect in the context of species distribution modelling. However, DEM-derived variables are scale-dependent and, given the increasing availability of very high-resolution (VHR) DEMs, their ecological relevance must be assessed for different spatial resolutions. 2. In a study area located in the Swiss Western Alps, we computed VHR DEM-derived variables related to morphometry, hydrology and solar radiation. Based on an original spatial resolution of 0.5 m, we generated DEM-derived variables at 1, 2 and 4 m spatial resolutions, applying a Gaussian pyramid. Their associations with local climatic factors, measured by sensors (direct and ambient air temperature, air humidity and soil moisture), as well as ecological indicators derived from species composition, were assessed with multivariate generalized linear models (GLM) and mixed models (GLMM). 3. Specific VHR DEM-derived variables showed significant associations with climatic factors. In addition to slope, aspect and curvature, the underused wetness and ruggedness indices modelled measured ambient humidity and soil moisture, respectively. Remarkably, the spatial resolution of VHR DEM-derived variables had a significant influence on the models' strength, with coefficients of determination decreasing with coarser resolutions or showing a local optimum at a 2 m resolution, depending on the variable considered. 4. These results support the relevance of using multi-scale DEM variables to provide surrogates for important climatic variables such as humidity, moisture and temperature, offering suitable alternatives to direct measurements for evolutionary ecology studies at a local scale.
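As a hedged sketch of the multi-resolution step described in the abstract (not the authors' GIS workflow), the code below smooths a synthetic 0.5 m DEM with a Gaussian filter and subsamples it to build coarser pyramid levels, then derives slope and a simple local-standard-deviation ruggedness proxy at each resolution; the terrain surface and filter settings are placeholders.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, generic_filter

rng = np.random.default_rng(11)

# Synthetic 0.5 m DEM (elevation in m); a stand-in for the Swiss Alps tiles.
cell = 0.5
xx, yy = np.meshgrid(np.linspace(0, 50, 256), np.linspace(0, 50, 256))
dem = 1800 + 40 * np.sin(xx / 8.0) + 25 * np.cos(yy / 5.0) + rng.normal(0, 0.3, xx.shape)

def pyramid_level(grid, cellsize):
    """One Gaussian-pyramid step: smooth, then drop every other row/column."""
    smoothed = gaussian_filter(grid, sigma=1.0)
    return smoothed[::2, ::2], cellsize * 2

def slope_deg(grid, cellsize):
    dz_dy, dz_dx = np.gradient(grid, cellsize)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

def ruggedness(grid):
    # Terrain ruggedness proxy: local standard deviation in a 3x3 window.
    return generic_filter(grid, np.std, size=3)

grid, cs = dem, cell
for level in range(4):                      # 0.5, 1, 2 and 4 m resolutions
    print(f"{cs:>4.1f} m: mean slope {slope_deg(grid, cs).mean():5.2f} deg, "
          f"mean ruggedness {ruggedness(grid).mean():.3f} m")
    grid, cs = pyramid_level(grid, cs)
```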
Abstract:
This paper measures the market risk of the TES (Colombian treasury bond) portfolio of a particular Colombian bank, forecasting value at risk (VaR) with different multivariate volatility models, namely EWMA, orthogonal GARCH and robust GARCH, as well as VaR models based on the normal and Student's t distributions. Their performance is evaluated with the backtesting methodologies proposed by Candelon et al. (2011), based on the generalized method of moments, together with the independence and conditional coverage tests proposed by Christoffersen and Pelletier (2004) and by Berkowitz, Christoffersen and Pelletier (2010). The results show that the best VaR specification for measuring the market risk of the TES portfolios of Colombian banks is the one built from EWMA volatilities and the normal distribution, since it satisfies the unconditional coverage, independence and conditional coverage hypotheses, as well as the requirements set out in Basel II and in current Colombian regulation.
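The GMM-based backtests of Candelon et al. and the duration-based tests are not reproduced here; as a hedged sketch of the basic pipeline, the code below computes a RiskMetrics-style EWMA volatility (λ = 0.94), a one-day 99% normal VaR, and the Kupiec unconditional-coverage likelihood-ratio test on the violation count. The returns are simulated placeholders, not TES portfolio data.

```python
import numpy as np
from scipy.stats import chi2, norm

rng = np.random.default_rng(21)

# Placeholder daily portfolio returns; not the Colombian TES portfolio.
r = rng.standard_t(df=6, size=1500) * 0.004

lam, p = 0.94, 0.01                       # EWMA decay and VaR tail probability
sigma2 = np.empty_like(r)
sigma2[0] = r[:50].var()
for t in range(1, len(r)):
    sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * r[t - 1] ** 2

var_99 = -norm.ppf(p) * np.sqrt(sigma2)   # one-day 99% VaR (reported as a positive number)
hits = r < -var_99                        # VaR violations
n, x = len(r), hits.sum()

# Kupiec unconditional-coverage LR test: H0 says the hit rate equals p.
pi_hat = x / n
lr_uc = -2 * ((n - x) * np.log(1 - p) + x * np.log(p)
              - (n - x) * np.log(1 - pi_hat) - x * np.log(pi_hat))
print(f"violations: {x}/{n} (expected about {p * n:.0f})")
print(f"LR_uc = {lr_uc:.2f}, p-value = {chi2.sf(lr_uc, df=1):.3f}")
```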
Abstract:
Internal risk management models of the kind popularized by J. P. Morgan are now used widely by the world’s most sophisticated financial institutions as a means of measuring risk. Using the returns on three of the most popular futures contracts on the London International Financial Futures Exchange, in this paper we investigate the possibility of using multivariate generalized autoregressive conditional heteroscedasticity (GARCH) models for the calculation of minimum capital risk requirements (MCRRs). We propose a method for the estimation of the value at risk of a portfolio based on a multivariate GARCH model. We find that the consideration of the correlation between the contracts can lead to more accurate, and therefore more appropriate, MCRRs compared with the values obtained from a univariate approach to the problem.
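The paper's multivariate GARCH specification and its MCRR calculation are not reproduced here; as a hedged sketch of why cross-contract correlation matters, the code below fits univariate GARCH(1,1) models by maximum likelihood to three simulated return series (placeholders for the LIFFE contracts), combines them through a constant conditional correlation estimated from the standardized residuals, and compares the resulting one-day 99% portfolio VaR with the value obtained when correlations are ignored.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(8)

# Placeholder daily returns for three futures contracts (not LIFFE data):
# a common factor makes the contracts positively correlated.
n = 1500
common = rng.normal(size=n)
rets = np.column_stack([0.01 * (0.7 * common + 0.7 * rng.normal(size=n))
                        for _ in range(3)])

def garch11_filter(params, r):
    """Conditional variance recursion of a GARCH(1,1) model."""
    omega, alpha, beta = params
    s2 = np.empty_like(r)
    s2[0] = r.var()
    for t in range(1, len(r)):
        s2[t] = omega + alpha * r[t - 1] ** 2 + beta * s2[t - 1]
    return s2

def garch11_negloglik(params, r):
    s2 = garch11_filter(params, r)
    return 0.5 * np.sum(np.log(2 * np.pi * s2) + r ** 2 / s2)

def fit_garch11(r):
    res = minimize(garch11_negloglik, x0=[0.1 * r.var(), 0.05, 0.90],
                   args=(r,), bounds=[(1e-12, None), (1e-6, 0.999), (1e-6, 0.999)])
    return np.sqrt(garch11_filter(res.x, r))

# Univariate GARCH(1,1) volatilities for each contract, then a constant
# conditional correlation (CCC) estimated from the standardized residuals.
vols = np.column_stack([fit_garch11(rets[:, j]) for j in range(3)])
R = np.corrcoef(rets / vols, rowvar=False)

w = np.full(3, 1 / 3)                      # equally weighted portfolio
D_last = np.diag(vols[-1])                 # latest conditional volatilities
port_sigma = np.sqrt(w @ D_last @ R @ D_last @ w)
port_sigma_no_corr = np.sqrt(np.sum((w * vols[-1]) ** 2))  # assumes zero correlation

print(f"1-day 99% VaR with correlations   : {-norm.ppf(0.01) * port_sigma:.4%}")
print(f"1-day 99% VaR ignoring correlations: {-norm.ppf(0.01) * port_sigma_no_corr:.4%}")
```

With positively correlated contracts, the zero-correlation figure understates the portfolio VaR, which is the intuition behind preferring a multivariate specification over a univariate one for setting MCRRs.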