45 results for equilibrium asset pricing models with latent variables

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance: 100.00%

Abstract:

The aim of this article is to discuss the estimation of the systematic risk in capital asset pricing models with heavy-tailed error distributions for the asset returns. Diagnostic methods for assessing departures from the model assumptions, as well as the influence of observations on the parameter estimates, are also presented. It may be shown that outlying observations are downweighted in the maximum likelihood equations of linear models with heavy-tailed error distributions, such as the Student-t, power exponential, logistic-II, and so on. This robustness also extends to influential observations. As an illustration, the systematic risk estimate for Microsoft is compared under normal and heavy-tailed errors.
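
A minimal sketch of the downweighting mechanism described above, using iteratively reweighted least squares for a market model with Student-t errors; the data, the degrees of freedom nu and the coefficients are simulated placeholders, not the Microsoft series analyzed in the paper.

```python
# Sketch: IRLS for linear regression with Student-t errors (fixed nu),
# illustrating how outlying observations are downweighted in the ML equations.
import numpy as np

rng = np.random.default_rng(0)
n, nu = 200, 4.0
x = rng.normal(size=n)                       # market excess returns (simulated)
y = 0.5 + 1.2 * x + rng.standard_t(nu, n)    # asset excess returns, heavy tails
X = np.column_stack([np.ones(n), x])

beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting values
sigma2 = np.var(y - X @ beta)
for _ in range(50):
    r = y - X @ beta
    w = (nu + 1.0) / (nu + r**2 / sigma2)    # weights shrink for large residuals
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    sigma2 = np.mean(w * r**2)

print("beta (intercept, systematic risk):", beta)
```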

Relevance: 100.00%

Abstract:

This paper presents a family of algorithms for approximate inference in credal networks (that is, models based on directed acyclic graphs and set-valued probabilities) that contain only binary variables. Such networks can represent incomplete or vague beliefs, lack of data, and disagreement among experts; they can also encode models based on belief functions and possibilistic measures. All the algorithms for approximate inference in this paper rely on exact inferences in credal networks based on polytrees with binary variables, as these inferences have polynomial complexity. The algorithms are inspired by approximate algorithms for Bayesian networks: the Loopy 2U algorithm resembles Loopy Belief Propagation, while the Iterated Partial Evaluation and Structured Variational 2U algorithms are based, respectively, on Localized Partial Evaluation and on variational techniques.
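
As a toy illustration of set-valued probabilities, the sketch below computes exact bounds on a marginal in a two-node binary credal network by enumerating interval endpoints; the interval values are hypothetical, and this brute-force enumeration is not the paper's 2U family of algorithms, only the idea they scale up.

```python
# Sketch: exact bounds on P(B=1) in a tiny binary credal network A -> B,
# by enumerating the vertices (interval endpoints) of each local credal set.
from itertools import product

pA = (0.3, 0.5)             # P(A=1) in [0.3, 0.5] (hypothetical)
pB_given = {0: (0.1, 0.2),  # P(B=1 | A=0) in [0.1, 0.2]
            1: (0.7, 0.9)}  # P(B=1 | A=1) in [0.7, 0.9]

values = []
for pa, pb0, pb1 in product(pA, pB_given[0], pB_given[1]):
    values.append((1 - pa) * pb0 + pa * pb1)  # law of total probability

print("P(B=1) in [%.3f, %.3f]" % (min(values), max(values)))
```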

Relevance: 100.00%

Abstract:

In this paper, we compare the performance of two statistical approaches for the analysis of data from the social research area. In the first approach, we use normal models with joint regression modelling for the mean and for the variance heterogeneity. In the second approach, we use hierarchical models. In the first case, individual and social variables are included as explanatory variables in the regression models for the mean and for the variance, while in the second case the variance at level 1 of the hierarchical model depends on the individuals (their ages), and at level 2 the variance is assumed to change according to socioeconomic stratum. Applying these methodologies, we analyze a Colombian height data set to find differences that can be explained by socioeconomic conditions. We also present some theoretical and empirical results concerning the two models. From this comparative study, we conclude that it is better to jointly model the mean and the variance heterogeneity in all cases. We also observe that convergence of the Gibbs sampling chain used in the Markov chain Monte Carlo method for jointly modelling the mean and variance heterogeneity is achieved quickly.
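
A minimal sketch of the first approach, joint regression modelling for the mean and for the variance heterogeneity, fitted here by maximum likelihood; the covariates age and stratum and all coefficients are simulated stand-ins, and the paper's actual analysis is Bayesian via Gibbs sampling.

```python
# Sketch: joint regression for the mean and the log-variance of a normal model.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 300
age = rng.uniform(5, 18, n)                      # level-1 covariate (hypothetical)
stratum = rng.integers(0, 2, n)                  # socioeconomic stratum (hypothetical)
y = 100 + 4 * age + rng.normal(0, np.exp(0.5 + 0.4 * stratum))

Xm = np.column_stack([np.ones(n), age])          # mean model
Xv = np.column_stack([np.ones(n), stratum])      # log-variance model

def negloglik(theta):
    beta, gamma = theta[:2], theta[2:]
    mu, logvar = Xm @ beta, Xv @ gamma
    return 0.5 * np.sum(logvar + (y - mu) ** 2 / np.exp(logvar))

fit = minimize(negloglik, np.zeros(4), method="BFGS")
print("mean coefficients:", fit.x[:2], "log-variance coefficients:", fit.x[2:])
```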

Relevance: 100.00%

Abstract:

In this paper, the generalized log-gamma regression model is modified to allow for the possibility that long-term survivors are present in the data. This modification leads to a generalized log-gamma regression model with a cure rate, encompassing, as special cases, the log-exponential, log-Weibull and log-normal regression models with a cure rate typically used to model such data. The models attempt to simultaneously estimate the effects of explanatory variables on the acceleration or deceleration of the timing of a given event and on the surviving fraction, that is, the proportion of the population for which the event never occurs. The normal curvatures of local influence are derived under some usual perturbation schemes, and two martingale-type residuals are proposed to assess departures from the generalized log-gamma error assumption as well as to detect outlying observations. Finally, a data set from the medical area is analyzed.
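
A minimal sketch of a mixture cure model likelihood of the kind described above, written here for a log-normal latency (one special case of the generalized log-gamma family); times, censoring and parameter values are simulated placeholders.

```python
# Sketch: mixture cure model, S_pop(t) = pi + (1 - pi) * S(t),
# with a log-normal latency distribution and right censoring.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 500
cured = rng.random(n) < 0.3                       # true surviving fraction 30%
t_event = np.exp(rng.normal(1.0, 0.5, n))         # log-normal event times
t_cens = rng.uniform(0, 20, n)
time = np.where(cured, t_cens, np.minimum(t_event, t_cens))
delta = (~cured) & (t_event <= t_cens)            # 1 = event observed

def negloglik(theta):
    pi = 1 / (1 + np.exp(-theta[0]))              # cure fraction, logit scale
    mu, sigma = theta[1], np.exp(theta[2])
    z = (np.log(time) - mu) / sigma
    f = norm.pdf(z) / (sigma * time)              # log-normal density
    S = norm.sf(z)                                # log-normal survival
    lik = np.where(delta, (1 - pi) * f, pi + (1 - pi) * S)
    return -np.sum(np.log(lik))

fit = minimize(negloglik, [0.0, 1.0, 0.0], method="Nelder-Mead")
print("estimated cure fraction:", 1 / (1 + np.exp(-fit.x[0])))
```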

Relevance: 100.00%

Abstract:

In general, the normal distribution is assumed for the surrogate of the true covariates in the classical error model. This paper considers a class of distributions, which includes the normal one, for the variables subject to error. An estimation approach yielding consistent estimators is developed, and simulation studies are reported.
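
A small illustration, under the usual normal assumptions and a known error variance, of why the naive estimator is inconsistent in the classical error model and how a correction restores consistency; this is the textbook attenuation correction, not necessarily the estimation approach developed in the paper.

```python
# Sketch: classical measurement error W = X + U attenuates the naive slope
# by the reliability ratio; dividing by it gives a consistent estimator.
import numpy as np

rng = np.random.default_rng(3)
n, beta, var_u = 5000, 2.0, 0.5
x = rng.normal(0, 1, n)                      # true covariate (unobserved)
w = x + rng.normal(0, np.sqrt(var_u), n)     # observed surrogate
y = beta * x + rng.normal(0, 1, n)

C = np.cov(w, y)
beta_naive = C[0, 1] / C[0, 0]
reliability = (C[0, 0] - var_u) / C[0, 0]    # Var(X) / Var(W)
print("naive:", beta_naive, "corrected:", beta_naive / reliability)
```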

Relevance: 100.00%

Abstract:

Creation of cold dark matter (CCDM) can be described macroscopically by a negative pressure, and the mechanism is therefore capable of accelerating the Universe without the need for an additional dark energy component. In this framework, we discuss the evolution of perturbations by considering a neo-Newtonian approach where, unlike in standard Newtonian cosmology, the fluid pressure is taken into account even in the homogeneous and isotropic background equations (Lima, Zanchin, and Brandenberger, MNRAS 291, L1, 1997). The evolution of the density contrast is calculated in the linear approximation and compared to the one predicted by the Lambda CDM model. The difference between the CCDM and Lambda CDM predictions at the perturbative level is quantified using three different statistical methods, namely: a simple chi-squared analysis in the relevant parameter space, a Bayesian statistical inference, and, finally, a Kolmogorov-Smirnov test. We find that under certain circumstances the CCDM scenario analyzed here predicts an overall dynamics (including the Hubble flow and the matter fluctuation field) which fully recovers that of the traditional cosmic concordance model. Our basic conclusion is that such a reduction of the dark sector provides a viable alternative description to the accelerating Lambda CDM cosmology.
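
A minimal sketch of the Lambda CDM side of the comparison: integrating the linear growth equation for the density contrast in a flat Lambda CDM background; the density parameters are illustrative fiducial values.

```python
# Sketch: linear growth of the density contrast in a Lambda CDM background,
# integrating delta'' + (3/a + E'/E) delta' = (3/2) Om delta / (a^5 E^2)
# in the scale factor a, with delta ~ a deep in matter domination.
import numpy as np
from scipy.integrate import solve_ivp

Om, OL = 0.3, 0.7                          # fiducial density parameters (assumed)

def E2(a):
    return Om / a**3 + OL                  # dimensionless Hubble rate squared

def rhs(a, y):
    delta, ddelta = y
    dlnE_da = -1.5 * Om / (a**4 * E2(a))   # d ln E / da
    acc = -(3.0 / a + dlnE_da) * ddelta + 1.5 * Om * delta / (a**5 * E2(a))
    return [ddelta, acc]

a0 = 1e-3
sol = solve_ivp(rhs, [a0, 1.0], [a0, 1.0])  # initial conditions: delta = a
print("density contrast at a = 1 (suppressed below 1):", sol.y[0, -1])
```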

Relevance: 100.00%

Abstract:

This paper proposes a regression model based on the modified Weibull distribution, which can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and jackknife estimators for the parameters of the model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and we also present some methods for global influence analysis. In addition, for different parameter settings, sample sizes and censoring percentages, various simulations are performed, and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set under log-modified Weibull regression models. A diagnostic analysis and a model check based on the modified deviance residual are performed to select appropriate models.
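
A short numerical check, with illustrative parameter values, that the modified Weibull hazard h(t) = a t^(b-1) (b + lambda t) exp(lambda t) is bathtub-shaped for 0 < b < 1 and lambda > 0.

```python
# Sketch: hazard function of the modified Weibull distribution,
# which decreases, bottoms out, then increases (a bathtub shape).
import numpy as np

def hazard(t, a=1.0, b=0.5, lam=0.2):
    return a * t**(b - 1) * (b + lam * t) * np.exp(lam * t)

t = np.linspace(0.05, 10, 200)
h = hazard(t)
print("hazard decreases then increases (bathtub):",
      h[0] > h.min() and h[-1] > h.min())
```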

Relevance: 100.00%

Abstract:

In this Letter we study the process of gluon fusion into a pair of Higgs bosons in a model with one universal extra dimension. We find that the contributions from the extra top quark Kaluza-Klein excitations lead to a Higgs pair production cross section at the LHC that can be significantly altered with respect to the Standard Model value for small values of the compactification scale.
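
A minimal sketch of the tree-level Kaluza-Klein mass tower of the top quark, m_n = sqrt(m_t^2 + (n/R)^2), whose modes run in the gluon-fusion loops; the compactification scale below is an arbitrary illustrative value, and the actual cross-section computation in the Letter is far beyond this snippet.

```python
# Sketch: tree-level KK masses of the top quark in one universal extra dimension.
import numpy as np

m_top = 173.0                      # top quark mass in GeV
inv_R = 500.0                      # compactification scale 1/R in GeV (assumed)
levels = np.arange(1, 6)
masses = np.sqrt(m_top**2 + (levels * inv_R) ** 2)
for n, m in zip(levels, masses):
    print(f"KK level {n}: m = {m:.1f} GeV")
```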

Relevance: 100.00%

Abstract:

In this paper, we consider some non-homogeneous Poisson models to estimate the probability that an air quality standard is exceeded a given number of times in a time interval of interest. We assume that the exceedances occur according to a non-homogeneous Poisson process (NHPP) with rate function lambda(t), t >= 0, which depends on some parameters that must be estimated. We consider two forms of rate function, the Weibull and the Goel-Okumoto, and models with and without change-points; when change-points are assumed, there may be one, two or three of them, depending on the data set. The parameters of the rate functions are estimated using a Gibbs sampling algorithm. The results are applied to ozone data provided by the Mexico City monitoring network. We first assume that no change-points are present; depending on the fit of the model, we then allow for one, two or three change-points.
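
A minimal sketch of simulating exceedance times from an NHPP with a Weibull rate function via Lewis-Shedler thinning; the parameter values are illustrative, not the estimates obtained from the Mexico City ozone data, and the paper's estimation itself is done by Gibbs sampling.

```python
# Sketch: NHPP simulation with Weibull rate
# lambda(t) = (beta/theta) * (t/theta)**(beta-1), by thinning.
import numpy as np

rng = np.random.default_rng(4)
beta_w, theta = 1.5, 50.0
T = 365.0

def lam(t):
    return (beta_w / theta) * (t / theta) ** (beta_w - 1)

lam_max = lam(T)                             # rate is increasing for beta_w > 1
t, events = 0.0, []
while True:
    t += rng.exponential(1.0 / lam_max)      # candidate from homogeneous process
    if t > T:
        break
    if rng.random() < lam(t) / lam_max:      # accept with prob lambda(t)/lam_max
        events.append(t)

print("number of exceedances in [0, T]:", len(events))
```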

Relevance: 100.00%

Abstract:

The purpose of this paper is to develop a Bayesian analysis for nonlinear regression models under scale mixtures of skew-normal distributions. This novel class of models provides a useful generalization of symmetrical nonlinear regression models, since the error distributions cover both skewed and heavy-tailed distributions such as the skew-t, skew-slash and skew-contaminated normal distributions. The main advantage of this class of distributions is that they have a convenient hierarchical representation that allows the implementation of Markov chain Monte Carlo (MCMC) methods to simulate samples from the joint posterior distribution. In order to examine the robustness of this flexible class against outlying and influential observations, we present Bayesian case-deletion influence diagnostics based on the Kullback-Leibler divergence. Further, some discussion of model selection criteria is given. The newly developed procedures are illustrated with two simulation studies and a real data set previously analyzed under normal and skew-normal nonlinear regression models.
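
A minimal sketch of the hierarchical representation mentioned above, shown for the skew-t member of the class: a skew-normal draw (via its half-normal convolution form) rescaled by a Gamma mixing variable; shape and degrees-of-freedom values are arbitrary.

```python
# Sketch: a skew-t draw as a scale mixture of skew-normals --
# the structure that makes Gibbs-type MCMC convenient for this class.
import numpy as np

rng = np.random.default_rng(5)
n, alpha, nu = 100_000, 3.0, 4.0
delta = alpha / np.sqrt(1 + alpha**2)

z0 = np.abs(rng.normal(size=n))                 # half-normal latent variable
z1 = rng.normal(size=n)
skew_normal = delta * z0 + np.sqrt(1 - delta**2) * z1
u = rng.gamma(nu / 2, 2 / nu, size=n)           # Gamma(nu/2, rate nu/2), mean 1
skew_t = skew_normal / np.sqrt(u)

print("sample skewness:",
      ((skew_t - skew_t.mean())**3).mean() / skew_t.std()**3)
```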

Relevance: 100.00%

Abstract:

Linear mixed models were developed to handle clustered data and have been a topic of increasing interest in statistics for the past 50 years. Normality (or symmetry) of the random effects is a common assumption in linear mixed models, but it may sometimes be unrealistic, obscuring important features of among-subjects variation. In this article, we utilize skew-normal/independent distributions as a tool for robust modeling of linear mixed models under a Bayesian paradigm. The skew-normal/independent family is an attractive class of asymmetric heavy-tailed distributions that includes the skew-normal, skew-t, skew-slash and skew-contaminated normal distributions as special cases, providing an appealing robust alternative to the routine use of symmetric distributions in this type of model. The methods developed are illustrated using a real data set from the Framingham cholesterol study.
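
A minimal sketch, with made-up values, of a linear mixed model whose random intercepts follow a skew-normal rather than a normal law, the kind of asymmetry the skew-normal/independent class is designed to accommodate.

```python
# Sketch: simulating a linear mixed model with skew-normal random intercepts.
import numpy as np

rng = np.random.default_rng(6)
n_subj, n_obs, alpha = 50, 5, 4.0
delta = alpha / np.sqrt(1 + alpha**2)

# skew-normal random intercepts via the half-normal convolution representation
b = (delta * np.abs(rng.normal(size=n_subj))
     + np.sqrt(1 - delta**2) * rng.normal(size=n_subj))

x = rng.uniform(0, 1, (n_subj, n_obs))
y = 1.0 + 2.0 * x + b[:, None] + rng.normal(0, 0.3, (n_subj, n_obs))
print("between-subject skewness:",
      float(((b - b.mean())**3).mean() / b.std()**3))
```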

Relevance: 100.00%

Abstract:

In survival analysis applications, the failure rate function frequently presents a unimodal shape, in which case the log-normal or log-logistic distributions are commonly used. In this paper we are concerned only with parametric forms, so a location-scale regression model based on the Burr XII distribution is proposed for modeling data with a unimodal failure rate function, as an alternative to the log-logistic regression model. Assuming censored data, we consider a classical analysis, a Bayesian analysis and a jackknife estimator for the parameters of the proposed model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed, and the performance is compared to that of the log-logistic and log-Burr XII regression models. In addition, we use sensitivity analysis to detect influential or outlying observations, and residual analysis is used to check the assumptions of the model. Finally, we analyze a real data set under log-Burr XII regression models.
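
A quick numerical check, using scipy's burr12 with illustrative shape values, that the Burr XII hazard pdf/sf is unimodal.

```python
# Sketch: the Burr XII hazard rises to a single peak and then declines,
# which is why it suits lifetimes with unimodal failure rates.
import numpy as np
from scipy.stats import burr12

c, d = 2.0, 1.5
t = np.linspace(0.01, 5, 500)
h = burr12.pdf(t, c, d) / burr12.sf(t, c, d)
t_mode = t[np.argmax(h)]
print(f"hazard peaks near t = {t_mode:.2f}:", h[0] < h.max() > h[-1])
```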

Relevance: 100.00%

Abstract:

Lattice-gas models with particle conservation and infinitely many absorbing states are studied on a one-dimensional lattice. As the particle density is increased, they exhibit a phase transition from an absorbing to an active phase. The models are solved exactly by means of the transfer matrix technique, from which the critical behavior is obtained. We find that the exponent related to the order parameter, the density of active sites, is 1 for all the models studied except one, for which it is 2.
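
A generic sketch of the transfer-matrix step: build the matrix of local Boltzmann-like weights and extract bulk quantities from its leading eigenvalue; the 2x2 toy below is not one of the paper's conserved-particle models.

```python
# Sketch: transfer-matrix solution of a generic 1D two-state lattice model;
# bulk quantities follow from the largest eigenvalue of T.
import numpy as np

def leading_eigenvalue(T):
    return np.max(np.real(np.linalg.eigvals(T)))

mu = 0.5                                  # chemical-potential-like weight (toy)
T = np.array([[1.0, 1.0],                 # weights for (empty, occupied) pairs
              [1.0, np.exp(mu)]])
lam = leading_eigenvalue(T)
print("free energy per site (toy):", -np.log(lam))
```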

Relevance: 100.00%

Abstract:

In this article, we compare three residuals based on the deviance component in generalised log-gamma regression models with censored observations. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed and the empirical distribution of each residual is displayed and compared with the standard normal distribution. For all cases studied, the empirical distributions of the proposed residuals are in general symmetric around zero, but only a martingale-type residual presented negligible kurtosis for the majority of the cases studied. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended for the martingale-type residual in generalised log-gamma regression models with censored data. A lifetime data set is analysed under log-gamma regression models and a model checking based on the martingale-type residual is performed.
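
A minimal sketch of martingale-type residuals, r_i = delta_i - Lambda_hat(t_i), computed here under an assumed Weibull model rather than the generalised log-gamma model of the paper; the data are simulated.

```python
# Sketch: martingale residuals for censored data, r = delta - Lambda(t),
# where Lambda(t) = -log S(t) is the fitted cumulative hazard.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
n, shape, scale = 200, 1.5, 10.0
t_event = weibull_min.rvs(shape, scale=scale, size=n, random_state=rng)
t_cens = rng.uniform(0, 25, n)
time = np.minimum(t_event, t_cens)
delta = (t_event <= t_cens).astype(float)    # 1 = event observed

Lambda = -weibull_min.logsf(time, shape, scale=scale)  # cumulative hazard
r_mart = delta - Lambda

print("martingale residuals: mean ~ 0 ->", r_mart.mean())
```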

Relevance: 100.00%

Abstract:

We introduce in this paper the class of linear models with first-order autoregressive elliptical errors. The score functions and the Fisher information matrices are derived for the parameters of interest, and an iterative process is proposed for parameter estimation. Some robustness aspects of the maximum likelihood estimates are discussed. The normal curvatures of local influence are also derived for some usual perturbation schemes, and diagnostic graphics for assessing the sensitivity of the maximum likelihood estimates are proposed. The methodology is applied to analyse the daily log excess returns on Microsoft, whose empirical distribution appears to exhibit AR(1) and heavy-tailed errors.
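
A minimal sketch of conditional maximum likelihood for a location model with AR(1) Student-t errors, a heavy-tailed elliptical special case close to the paper's example; the return series is simulated, not the Microsoft data.

```python
# Sketch: conditional ML for y_t = mu + e_t with e_t = phi * e_{t-1} + t_nu noise.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import t as t_dist

rng = np.random.default_rng(8)
n, phi_true, nu = 500, 0.4, 5.0
e = np.zeros(n)
for i in range(1, n):
    e[i] = phi_true * e[i - 1] + rng.standard_t(nu)
y = 0.1 + e                                      # log excess returns (simulated)

def negloglik(theta):
    mu, phi, log_s = theta
    innov = (y[1:] - mu) - phi * (y[:-1] - mu)   # AR(1) innovations
    return -np.sum(t_dist.logpdf(innov / np.exp(log_s), nu) - log_s)

fit = minimize(negloglik, [0.0, 0.0, 0.0], method="Nelder-Mead")
print("mu, phi, scale:", fit.x[0], fit.x[1], np.exp(fit.x[2]))
```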