14 results for "equilibrium asset pricing models with latent variables"
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
We extend the random permutation model to obtain the best linear unbiased estimator of a finite population mean accounting for auxiliary variables under simple random sampling without replacement (SRS) or stratified SRS. The proposed method provides a systematic design-based justification for well-known results involving common estimators derived under minimal assumptions that do not require specification of a functional relationship between the response and the auxiliary variables.
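The classical regression estimator that this design-based framework justifies can be illustrated numerically. The population, sample size, and linear relationship below are hypothetical, and the snippet sketches only the estimator itself under SRS, not the random permutation model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite population with an auxiliary variable x linearly
# related to the response y (illustrative values only)
N = 10_000
x = rng.gamma(shape=2.0, scale=5.0, size=N)
y = 3.0 + 1.5 * x + rng.normal(0.0, 2.0, size=N)

# Simple random sample without replacement (SRS)
n = 200
idx = rng.choice(N, size=n, replace=False)
xs, ys = x[idx], y[idx]

# Classical regression estimator of the population mean: adjust the
# sample mean of y using the known population mean of x
b = np.cov(xs, ys, ddof=1)[0, 1] / np.var(xs, ddof=1)
y_reg = ys.mean() + b * (x.mean() - xs.mean())
```

Because the auxiliary mean is known for the whole population, the adjustment typically reduces the variance of the plain sample mean.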
Abstract:
We investigate the influence of sub-Ohmic dissipation on randomly diluted quantum Ising and rotor models. The dissipation causes the quantum dynamics of sufficiently large percolation clusters to freeze completely. As a result, the zero-temperature quantum phase transition across the lattice percolation threshold separates an unusual super-paramagnetic cluster phase from an inhomogeneous ferromagnetic phase. We determine the low-temperature thermodynamic behavior in both phases, which is dominated by large frozen and slowly fluctuating percolation clusters. We relate our results to the smeared transition scenario for disordered quantum phase transitions, and we compare the cases of sub-Ohmic, Ohmic, and super-Ohmic dissipation.
Abstract:
In this work we extend the first-order formalism to cosmological models that present an interaction between a fermionic and a scalar field. Exact cosmological solutions describing universes filled with interacting dark energy and dark matter have been obtained. Viable cosmological solutions with an early period of decelerated expansion followed by late acceleration have been found, notably one in which a dark matter component dominates in the past and a dark energy component dominates in the future. In another, the dark energy alone is responsible for both periods, similar to a Chaplygin gas. Exclusively accelerating solutions have also been obtained.
Weibull and generalised exponential overdispersion models with an application to ozone air pollution
Abstract:
We consider the problem of estimating the mean and variance of the time between occurrences of an event of interest (inter-occurrence times), where some forms of dependence between two consecutive time intervals are allowed. Two basic density functions are taken into account: the Weibull and the generalised exponential density functions. In order to capture the dependence between two consecutive inter-occurrence times, we assume that the shape and/or scale parameters of the two density functions are given by auto-regressive models. The expressions for the mean and variance of the inter-occurrence times are presented. The models are applied to ozone data from two regions of Mexico City. The estimation of the parameters is performed from a Bayesian point of view via Markov chain Monte Carlo (MCMC) methods.
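One concrete way to read this class of models is as a Weibull density whose scale parameter follows an auto-regressive process across consecutive inter-occurrence times. The sketch below simulates such a series; the AR(1)-on-log-scale structure, parameter values, and variable names are illustrative assumptions, not the authors' fitted specification, and no Bayesian estimation is shown:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative AR(1) dynamics on the log of the Weibull scale parameter
# of consecutive inter-occurrence times (assumed, not the paper's model)
T = 500
shape_k = 1.3                              # fixed Weibull shape
phi, mu, sigma = 0.6, np.log(10.0), 0.2    # AR(1) coefficients

log_scale = np.empty(T)
times = np.empty(T)
log_scale[0] = mu
for t in range(T):
    if t > 0:
        log_scale[t] = mu + phi * (log_scale[t - 1] - mu) + rng.normal(0.0, sigma)
    # draw an inter-occurrence time from Weibull(shape_k, scale_t)
    times[t] = np.exp(log_scale[t]) * rng.weibull(shape_k)

# Empirical moments of the dependent inter-occurrence times
mean_t, var_t = times.mean(), times.var(ddof=1)
```

The autocorrelation in `log_scale` is what induces the dependence between consecutive intervals that an i.i.d. Weibull model would miss.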
Abstract:
Exact results on particle densities as well as correlators in two models of immobile particles, containing either a single species or two distinct species, are derived. The models evolve following a descent dynamics through pair annihilation in which each particle interacts at most once throughout its entire history. The resulting large number of stationary states leads to a non-vanishing configurational entropy. Our results are established for arbitrary initial conditions and are derived via a generating function method. The single-species model is dual to the 1D zero-temperature kinetic Ising model with Kimball-Deker-Haake dynamics. In this way, finite and semi-infinite chains, as well as the Bethe lattice, can be analysed. The relationship with the random sequential adsorption of dimers and with weakly tapped granular materials is discussed.
Abstract:
Transitions from positive to negative plasma current and quasi-steady-state alternating current (AC) operation have been achieved experimentally without loss of ionization. The large transition times suggest the use of MHD equilibrium to model the intermediate magnetic field configurations for the corresponding current density reversals. In the present work we show, by means of Maxwell's equations, that the most robust equilibrium for any axisymmetric configuration with reversed current density requires the existence of several non-nested families of magnetic surfaces inside the plasma. We also show that the currents inside the non-nested families satisfy additive rules restricting the geometry and sizes of the axisymmetric magnetic islands; this is done without restricting the equilibrium through arbitrary functions. Finally, we introduce a local successive-approximations method to describe the equilibrium about an arbitrary reversed current density minimum; consequently, the transition between different non-nested topologies is understood in terms of the eccentricity of the toroidal current density level sets.
Abstract:
Changepoint regression models were originally developed in connection with applications in quality control, where a change from the in-control to the out-of-control state has to be detected on the basis of the available random observations. Various changepoint models have since been suggested for different applications, such as reliability, econometrics, and medicine. In many practical situations the covariate cannot be measured precisely, and an alternative is the errors-in-variables regression model. In this paper we study the errors-in-variables regression model with a changepoint from a Bayesian approach. A simulation study shows that the proposed procedure produces suitable estimates for the changepoint and all other model parameters.
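A heavily simplified sketch of the changepoint idea: simulate regression data with an error-prone covariate and a changepoint, then compute a discrete posterior over the changepoint index under a uniform prior and a known noise variance, with one least-squares line per segment. The generating values are hypothetical, and the paper's full Bayesian errors-in-variables treatment is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated regression data with a changepoint in the slope; the observed
# covariate w is the true covariate x contaminated with measurement error.
# All generating values are hypothetical.
n, k_true = 120, 60
x = np.linspace(0.0, 10.0, n)
slope = np.where(np.arange(n) < k_true, 1.0, 3.0)
y = 2.0 + slope * x + rng.normal(0.0, 0.5, n)
w = x + rng.normal(0.0, 0.3, n)            # error-prone covariate

def seg_rss(xs, ys):
    """Residual sum of squares of an ordinary least-squares line fit."""
    X = np.column_stack([np.ones_like(xs), xs])
    beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
    r = ys - X @ beta
    return r @ r

# Discrete (profile-likelihood) posterior over the changepoint index k:
# uniform prior, Gaussian likelihood with known noise variance
ks = np.arange(10, n - 10)
loglik = np.array([-(seg_rss(w[:k], y[:k]) + seg_rss(w[k:], y[k:])) / (2 * 0.5**2)
                   for k in ks])
post = np.exp(loglik - loglik.max())
post /= post.sum()
k_hat = int(ks[post.argmax()])
```

A full treatment would place priors on the regression and measurement-error parameters and sample them jointly by MCMC rather than profiling them out.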
Abstract:
Managers know more about the performance of the organization than investors do, which makes the disclosure of information a possible strategy for competitive differentiation, minimizing adverse selection. This paper's main goal is to analyze whether or not an entity's level of disclosure may affect the risk perception of individuals and the process of evaluating its shares. The survey was carried out as an experimental study with 456 subjects. In a stock market simulation, we investigated the pricing of the stocks of two companies with different levels of information disclosure at four separate stages. The results showed that, when other variables are held constant, the level of disclosure of an entity can affect the expectations of individuals and the process of evaluating its shares. A higher level of disclosure by an entity affected both the value of its own shares and those of the other company.
Abstract:
We propose a new general Bayesian latent class model for evaluation of the performance of multiple diagnostic tests in situations in which no gold standard test exists, based on a computationally intensive approach. The modeling represents an interesting and suitable alternative to models with complex structures that involve the general case of several conditionally independent diagnostic tests, covariates, and strata with different disease prevalences. The technique of stratifying the population according to different disease prevalence rates does not add further marked complexity to the modeling, but it makes the model more flexible and interpretable. To illustrate the general model proposed, we evaluate the performance of six diagnostic screening tests for Chagas disease, considering some epidemiological variables. Serology at the time of donation (negative, positive, inconclusive) was considered as a stratification factor in the model. The general model with stratification of the population performed better than its counterparts without stratification. The group formed by the testing laboratory Biomanguinhos FIOCRUZ-kit (c-ELISA and rec-ELISA) is the best option in the confirmation process, presenting a false-negative rate of 0.0002% under the serial scheme. We are 100% sure that the donor is healthy when these two tests both have negative results and that the donor is chagasic when both have positive results.
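The serial confirmation scheme mentioned at the end can be sketched under conditional independence: a donor is confirmed positive only if both tests are positive, so a false negative occurs whenever either test misses. The sensitivities and specificities below are hypothetical placeholders, not the Biomanguinhos FIOCRUZ-kit estimates from the paper:

```python
# Hypothetical sensitivities and specificities for two conditionally
# independent confirmatory tests (placeholder values, not estimates
# from the Chagas disease study)
se1, sp1 = 0.99, 0.98
se2, sp2 = 0.995, 0.985

# Serial scheme: classified positive only if BOTH tests are positive,
# so under conditional independence:
se_serial = se1 * se2                      # both tests must detect
sp_serial = 1 - (1 - sp1) * (1 - sp2)      # both must false-alarm to err
fn_rate = 1 - se_serial                    # false-negative rate
```

Serial combination trades sensitivity for specificity; the parallel ("either positive") scheme does the opposite.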
Abstract:
The main goal of this article is to consider influence assessment in models with error-prone observations and with variances of the measurement errors changing across observations. The techniques enable the identification of potentially influential elements and the quantification of the effects of perturbations in these elements on some results of interest. The approach is illustrated with data from the WHO MONICA Project on cardiovascular disease.
Abstract:
Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is an interest in studying latent variables (or latent traits). Usually such latent traits are assumed to be random variables and a convenient distribution is assigned to them. A very common choice for such a distribution has been the standard normal. Recently, Azevedo et al. [Bayesian inference for a skew-normal IRT model under the centred parameterization, Comput. Stat. Data Anal. 55 (2011), pp. 353-365] proposed a skew-normal distribution under the centred parameterization (SNCP), as studied in [R.B. Arellano-Valle and A. Azzalini, The centred parametrization for the multivariate skew-normal distribution, J. Multivariate Anal. 99(7) (2008), pp. 1362-1382], to model the latent trait distribution. This approach makes it possible to represent any asymmetric behaviour of the latent trait distribution. They also developed a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm based on the density of the SNCP and showed that the algorithm recovers all parameters properly. Their results indicated that, in the presence of asymmetry, the proposed model and the estimation algorithm perform better than the usual model and estimation methods. Our main goal in this paper is to propose another type of MHWGS algorithm based on a stochastic representation (hierarchical structure) of the SNCP studied in [N. Henze, A probabilistic representation of the skew-normal distribution, Scand. J. Statist. 13 (1986), pp. 271-275]. Our algorithm has only one Metropolis-Hastings step, in contrast to the algorithm developed by Azevedo et al., which has two such steps. This not only makes the implementation easier but also reduces the number of proposal densities to be used, which can be a problem in the implementation of MHWGS algorithms, as can be seen in [R.J. Patz and B.W.
Junker, A straightforward approach to Markov Chain Monte Carlo methods for item response models, J. Educ. Behav. Stat. 24(2) (1999), pp. 146-178; R.J. Patz and B.W. Junker, The applications and extensions of MCMC in IRT: Multiple item types, missing data, and rated responses, J. Educ. Behav. Stat. 24(4) (1999), pp. 342-366; A. Gelman, G.O. Roberts, and W.R. Gilks, Efficient Metropolis jumping rules, Bayesian Stat. 5 (1996), pp. 599-607]. Moreover, we consider a modified beta prior (which generalizes the one considered in [3]) and a Jeffreys prior for the asymmetry parameter. Furthermore, we study the sensitivity of such priors as well as the use of different kernel densities for this parameter. Finally, we assess the impact of the number of examinees, the number of items, and the asymmetry level on parameter recovery. Results of the simulation study indicated that our approach performed as well as that in [3] in terms of parameter recovery, mainly using the Jeffreys prior. They also indicated that the asymmetry level has the highest impact on parameter recovery, even though it is relatively small. A real-data analysis is considered jointly with the development of model-fit assessment tools, and the results are compared with the ones obtained by Azevedo et al. The results indicate that the hierarchical approach allows us to implement MCMC algorithms more easily, facilitates convergence diagnostics, and can be very useful for fitting more complex skew IRT models.
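Henze's stochastic representation, on which the proposed one-step MHWGS algorithm rests, is simple to state and to check by simulation: if U0 and U1 are independent standard normals, then delta*|U0| + sqrt(1 - delta^2)*U1 follows a skew-normal law with shape alpha = delta/sqrt(1 - delta^2). The sketch below verifies its first two moments under the direct parameterization; the centred parameterization used in the paper is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(7)

# Henze (1986): if U0 and U1 are independent N(0, 1), then
#   Z = delta*|U0| + sqrt(1 - delta**2)*U1
# is skew-normal with shape alpha = delta / sqrt(1 - delta**2)
delta = 0.8
u0 = rng.normal(size=100_000)
u1 = rng.normal(size=100_000)
z = delta * np.abs(u0) + np.sqrt(1.0 - delta**2) * u1

# First two moments implied by the representation
mean_theory = delta * np.sqrt(2.0 / np.pi)
var_theory = 1.0 - (2.0 / np.pi) * delta**2
```

Treating |U0| as a latent variable is what yields the hierarchical structure exploited by the single Metropolis-Hastings step.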
Abstract:
We study a probabilistic model of interacting spins indexed by elements of a finite subset of the d-dimensional integer lattice, d ≥ 1. Conditions of time reversibility are examined. It is shown that the model equilibrium distribution converges to a limit distribution as the indexing set expands to the whole lattice. The occupied site percolation problem is solved for the limit distribution. Two models with similar dynamics are also discussed.
Abstract:
Background: An important issue concerning the worldwide fight against stigma is the evaluation of psychiatrists’ beliefs and attitudes toward schizophrenia and mental illness in general. However, there is as yet no consensus on this matter in the literature, and results vary according to the stigma dimension assessed and to the cultural background of the sample. The aim of this investigation was to search for profiles of stigmatizing beliefs related to schizophrenia in a national sample of psychiatrists in Brazil. Methods: A sample of 1414 psychiatrists were recruited from among those attending the 2009 Brazilian Congress of Psychiatry. A questionnaire was applied in face-to-face interviews. The questionnaire addressed four stigma dimensions, all in reference to individuals with schizophrenia: stereotypes, restrictions, perceived prejudice and social distance. Stigma item scores were included in latent profile analyses; the resulting profiles were entered into multinomial logistic regression models with sociodemographics, in order to identify significant correlates. Results: Three profiles were identified. The “no stigma” subjects (n = 337) characterized individuals with schizophrenia in a positive light, disagreed with restrictions, and displayed a low level of social distance. The “unobtrusive stigma” subjects (n = 471) were significantly younger and displayed the lowest level of social distance, although most of them agreed with involuntary admission and demonstrated a high level of perceived prejudice. The “great stigma” subjects (n = 606) negatively stereotyped individuals with schizophrenia, agreed with restrictions and scored the highest on the perceived prejudice and social distance dimensions. 
In comparison with the first two profiles, this last profile comprised a significantly larger number of individuals who were in frequent contact with a family member suffering from a psychiatric disorder, as well as comprising more individuals who had no such family member. Conclusions: Our study not only provides additional data related to an under-researched area but also reveals that psychiatrists are a heterogeneous group regarding stigma toward schizophrenia. The presence of different stigma profiles should be evaluated in further studies; this could enable anti-stigma initiatives to be specifically designed to effectively target the stigmatizing group.
Abstract:
This work addresses the treatment of lower-density regions of structures undergoing large deformations during the design process by the topology optimization method (TOM) based on the finite element method. During the design process the nonlinear elastic behavior of the structure is based on exact kinematics. The material model applied in the TOM is based on the solid isotropic microstructure with penalization (SIMP) approach. No void elements are deleted, and all internal forces of the nodes surrounding the void elements are considered during the nonlinear equilibrium solution. The distribution of design variables is solved through the method of moving asymptotes, in which the sensitivity of the objective function is obtained directly. In addition, a continuation function and a nonlinear projection function are invoked to obtain a checkerboard-free and mesh-independent design. 2D examples under both the plane strain and the plane stress hypotheses are presented and compared. The problem of instability is overcome by adopting a polyconvex constitutive model in conjunction with a suggested relaxation function to stabilize excessively distorted elements. The exact tangent stiffness matrix is used. The optimal topology results are compared to those obtained using the classical Saint Venant–Kirchhoff constitutive law, and strong differences are found.
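The solid isotropic microstructure with penalization interpolation at the core of the material model admits a one-line sketch: element stiffness scales with the density design variable raised to a penalty power, which drives intermediate densities toward 0/1 designs. The modulus bounds and penalty value below are conventional illustrative choices, not the paper's settings:

```python
import numpy as np

# SIMP interpolation: penalized Young's modulus as a function of the
# element density design variable. E0, E_min and the penalty p are
# conventional illustrative choices, not the paper's settings.
E0, E_min, p = 1.0, 1e-9, 3.0

def simp_modulus(rho, penal=p):
    """Penalized Young's modulus for element densities rho in [0, 1]."""
    return E_min + rho**penal * (E0 - E_min)

rho = np.array([0.0, 0.5, 1.0])
E = simp_modulus(rho)
```

Keeping E_min small but nonzero is what lets void elements remain in the mesh, consistent with the no-element-deletion strategy described above.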