956 results for Multinomial logit models with random coefficients (RCL)
Abstract:
In this work we presented two methods of estimation for accelerated failure time models with random effects for grouped survival data. The first method, implemented in the SAS software via the NLMIXED procedure, uses an adaptive Gauss-Hermite quadrature to compute the marginalized likelihood. The second method, implemented in the free software R, is based on penalized likelihood to estimate the parameters of the model. For the first we describe the main theoretical aspects and, for the second, we briefly present the adopted approach together with a simulation study investigating the performance of the method. We illustrated the models using real data on the operating time of oil wells from the Potiguar Basin (RN/CE).
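As a rough illustration of the first method, the sketch below marginalizes a Gaussian random intercept out of a log-normal AFT likelihood using Gauss-Hermite quadrature. It is a generic Python sketch under assumed names (`log_t`, `x`, `group`), not the authors' SAS or R implementation, and it fixes the error scale at one for brevity.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

def marginal_loglik(beta, sigma_b, log_t, x, group, n_nodes=15):
    """Marginal log-likelihood of a log-normal AFT model with a
    Gaussian random intercept per group, integrated out by
    Gauss-Hermite quadrature (illustrative sketch, unit error scale)."""
    nodes, weights = hermgauss(n_nodes)   # rule for integrals against e^{-z^2}
    b = np.sqrt(2.0) * sigma_b * nodes    # change of variables b = sqrt(2)*sigma*z
    ll = 0.0
    for g in np.unique(group):
        idx = group == g
        # residuals of log-time at each quadrature node: shape (n_i, n_nodes)
        resid = log_t[idx, None] - (x[idx] @ beta)[:, None] - b[None, :]
        # within-group log-likelihood at each node, Gaussian errors
        node_ll = -0.5 * (resid**2 + np.log(2 * np.pi)).sum(axis=0)
        ll += np.log((weights / np.sqrt(np.pi)) @ np.exp(node_ll))
    return ll
```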
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The aim of this work is to put forward a statistical mechanics theory of social interaction, generalizing econometric discrete choice models. After showing the formal equivalence linking econometric multinomial logit models to equilibrium statistical mechanics, a multi-population generalization of the Curie-Weiss model for ferromagnets is considered as a starting point in developing a model capable of describing sudden shifts in aggregate human behaviour. Existence of the thermodynamic limit for the model is shown by an asymptotic sub-additivity method, and factorization of correlation functions is proved almost everywhere. The exact solution of the model in the thermodynamic limit is provided by finding converging upper and lower bounds for the system's pressure, and the solution is used to prove an analytic result regarding the number of possible equilibrium states of a two-population system. The work stresses the importance of linking regimes predicted by the model to real phenomena, and to this end it proposes two possible procedures to estimate the model's parameters from micro-level data. These are applied to three case studies based on census-type data: though these studies are ultimately inconclusive on an empirical level, considerations are drawn that encourage further refinements of the chosen modelling approach, to be considered in future work.
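The formal equivalence alluded to above is, in essence, that the multinomial logit choice probability is a Boltzmann-Gibbs measure; in our notation (not the paper's):

```latex
% Utility U_i plays the role of minus the energy of state i, and the
% logit scale parameter \beta plays the role of an inverse temperature.
P(\text{choice} = i) \;=\; \frac{e^{\beta U_i}}{\sum_{j=1}^{K} e^{\beta U_j}}
```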
Abstract:
Popular belief holds that the lunar cycle affects human physiology, behaviour and health. We examined the influence of moon phase on sleep duration in a secondary analysis of a feasibility study of mobile telephone base stations and sleep quality. We studied 31 volunteers (18 women and 13 men, mean age 50 years) from a suburban area of Switzerland longitudinally over 6 weeks, including two full moons. Subjective sleep duration was calculated from sleep diary data. Data were analysed using multiple linear regression models with random effects. Mean sleep duration was 6 h 49 min. Subjective sleep duration varied with the lunar cycle, from 6 h 41 min at full moon to 7 h 00 min at new moon (P < 0.001). Average sleep duration was shortened by 68 min during the week compared with weekends (P < 0.001). Men slept 17 min longer than women (P < 0.001) and sleep duration decreased with age (P < 0.001). There was also evidence that rating of fatigue in the morning was associated with moon phase, with more tiredness (P = 0.027) at full moon. The study was designed for other purposes and the association between lunar cycle and sleep duration will need to be confirmed in further studies.
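For orientation, a model of the kind described ("multiple linear regression with random effects") could be fitted along the following lines; the data frame and variable names are assumptions, not the study's.

```python
# Linear mixed model of nightly sleep duration with a random
# intercept per volunteer (hypothetical variable names).
import statsmodels.formula.api as smf

model = smf.mixedlm(
    "sleep_minutes ~ moon_phase + weekday + sex + age",
    data=diary,               # one row per volunteer-night
    groups=diary["subject"],  # random intercept per volunteer
)
print(model.fit().summary())
```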
Abstract:
OBJECTIVE: We examined survival and prognostic factors of patients who developed HIV-associated non-Hodgkin lymphoma (NHL) in the era of combination antiretroviral therapy (cART). DESIGN AND SETTING: Multicohort collaboration of 33 European cohorts. METHODS: We included all cART-naive patients enrolled in cohorts participating in the Collaboration of Observational HIV Epidemiological Research Europe (COHERE) who were aged 16 years or older, started cART at some point after 1 January 1998 and developed NHL after 1 January 1998. Patients had to have a CD4 cell count after 1 January 1998 and one at diagnosis of the NHL. Survival and prognostic factors were estimated using Weibull models, with random effects accounting for heterogeneity between cohorts. RESULTS: Of 67 659 patients who were followed up during 304 940 person-years, 1176 patients were diagnosed with NHL. Eight hundred and forty-seven patients (72%) from 22 cohorts met inclusion criteria. Survival at 1 year was 66% [95% confidence interval (CI) 63-70%] for systemic NHL (n = 763) and 54% (95% CI: 43-65%) for primary brain lymphoma (n = 84). Risk factors for death included low nadir CD4 cell counts and a history of injection drug use. Patients developing NHL on cART had an increased risk of death compared with patients who were cART naive at diagnosis. CONCLUSION: In the era of cART two-thirds of patients diagnosed with HIV-related systemic NHL survive for longer than 1 year after diagnosis. Survival is poorer in patients diagnosed with primary brain lymphoma. More advanced immunodeficiency is the dominant prognostic factor for mortality in patients with HIV-related NHL.
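A common way to write the model class mentioned above, a Weibull survival model with a cohort-level random effect, is, in generic notation (ours, not necessarily the paper's parameterization):

```latex
% Hazard for patient j in cohort i; u_i is a positive cohort-level
% random effect (frailty) capturing between-cohort heterogeneity.
h_{ij}(t) \;=\; u_i\, \lambda \gamma t^{\gamma-1} \exp(x_{ij}^{\top}\beta)
```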
Abstract:
This research aims to carry out a pilot study of the demand for induction stoves in 2014-2015 and subsequently a comparative analysis. The study is applied to the El Valle parish, in the canton of Cuenca, excluding the urban area, taking the parish's heads of household as the object of study, through two binary logit models with identical specifications so that the two years can be compared. The main results for 2014 show that, relative to the sample size in the rural parish of El Valle, 33% of the population is willing to acquire an induction stove; the factors influencing the decision to acquire one are income, with a positive effect, and cost, with a negative effect. For the cost estimate, assuming that all surveyed households in the rural parish of El Valle acquire an induction stove, the average total cost per household is $573.34. The results for 2015 show that 20% of the surveyed population would acquire an induction stove; age and the electricity bill are the most influential variables in the model, both with an inverse relationship, and the estimated cost is approximately $526.70.
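A binary logit of the kind described could be specified as below; the data frame and variable names are illustrative assumptions.

```python
# Binary logit for the decision to acquire an induction stove,
# estimated separately per survey year (hypothetical names).
import statsmodels.formula.api as smf

logit_2014 = smf.logit(
    "adopts_stove ~ income + cost + age + electricity_bill",
    data=survey_2014,  # one row per head of household
).fit()
print(logit_2014.summary())  # the study reports income (+) and cost (-)
```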
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
We investigate whether relative contributions of genetic and shared environmental factors are associated with an increased risk of melanoma. Data from the Queensland Familial Melanoma Project comprising 15,907 subjects arising from 1912 families were analyzed to estimate the additive genetic, common and unique environmental contributions to variation in the age at onset of melanoma. Two complementary approaches for analyzing correlated time-to-onset family data were considered: the generalized estimating equations (GEE) method, in which one can estimate relationship-specific dependence simultaneously with regression coefficients that describe the average population response to changing covariates; and a subject-specific Bayesian mixed model, in which heterogeneity in regression parameters is explicitly modeled and the different components of variation may be estimated directly. The proportional hazards and Weibull models were utilized, as both provide natural frameworks for estimating relative risks while adjusting for simultaneous effects of other covariates. A simple Markov Chain Monte Carlo method for covariate imputation of missing data was used, and the actual implementation of the Bayesian model was based on Gibbs sampling using the freeware package BUGS. In addition, we also used a Bayesian model to investigate the relative contribution of genetic and environmental effects on the expression of naevi and freckles, which are known risk factors for melanoma.
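The decomposition being estimated is the standard one for such family data; in generic ACE notation (ours, not the paper's):

```latex
% Variation in age at onset split into additive genetic (A), common
% environmental (C) and unique environmental (E) components.
\sigma^2_{\text{total}} \;=\; \sigma^2_A + \sigma^2_C + \sigma^2_E
```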
Abstract:
PURPOSE: To evaluate the impact of atypical retardation patterns (ARP) on detection of progressive retinal nerve fiber layer (RNFL) loss using scanning laser polarimetry with variable corneal compensation (VCC). DESIGN: Observational cohort study. METHODS: The study included 377 eyes of 221 patients with a median follow-up of 4.0 years. Images were obtained annually with the GDx VCC (Carl Zeiss Meditec Inc, Dublin, California, USA), along with optic disc stereophotographs and standard automated perimetry (SAP) visual fields. Progression was determined by the Guided Progression Analysis software for SAP and by masked assessment of stereophotographs by expert graders. The typical scan score (TSS) was used to quantify the presence of ARPs on GDx VCC images. Random coefficients models were used to evaluate the relationship between ARP and RNFL thickness measurements over time. RESULTS: Thirty-eight eyes (10%) showed progression over time on visual fields, stereophotographs, or both. Changes in TSS scores from baseline were significantly associated with changes in RNFL thickness measurements in both progressing and nonprogressing eyes. Each 1-unit increase in TSS score was associated with a 0.19 μm decrease in RNFL thickness measurement (P < .001) over time. CONCLUSIONS: ARPs had a significant effect on detection of progressive RNFL loss with the GDx VCC. Eyes with large amounts of atypical patterns, great fluctuations in these patterns over time, or both may show changes in measurements that can appear falsely as glaucomatous progression or can mask true changes in the RNFL. (Am J Ophthalmol 2009;148:155-163. © 2009 by Elsevier Inc. All rights reserved.)
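A random coefficients model of the kind used here allows each eye its own baseline and rate of RNFL change; in generic notation (ours):

```latex
% RNFL thickness y_{ij} of eye i at visit time t_{ij}: fixed intercept
% and slope plus eye-specific random deviations (b_{0i}, b_{1i}).
y_{ij} \;=\; (\beta_0 + b_{0i}) + (\beta_1 + b_{1i})\, t_{ij} + \varepsilon_{ij},
\qquad (b_{0i}, b_{1i}) \sim N(0, \Sigma)
```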
Abstract:
Random coefficient regression models have been applied in different fields and they constitute a unifying setup for many statistical problems. The nonparametric study of this model started with Beran and Hall (1992) and it has become a fruitful framework. In this paper we propose and study statistics for testing a basic hypothesis concerning this model: the constancy of coefficients. The asymptotic behavior of the statistics is investigated and bootstrap approximations are used in order to determine the critical values of the test statistics. A simulation study illustrates the performance of the proposals.
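In the simplest version of the model, the hypothesis under test can be stated as follows (generic notation, ours):

```latex
% Random coefficient regression: both intercept and slope are random.
% Constancy of coefficients means the slope B_i is degenerate.
Y_i \;=\; A_i + B_i X_i, \qquad i = 1, \dots, n,
\qquad H_0 :\; \operatorname{Var}(B_i) = 0
```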
Abstract:
In this paper, we propose several finite-sample specification tests for multivariate linear regressions (MLR) with applications to asset pricing models. We focus on departures from the assumption of i.i.d. errors, at univariate and multivariate levels, with Gaussian and non-Gaussian (including Student t) errors. The univariate tests studied extend existing exact procedures by allowing for unspecified parameters in the error distributions (e.g., the degrees of freedom in the case of the Student t distribution). The multivariate tests are based on properly standardized multivariate residuals to ensure invariance to MLR coefficients and error covariances. We consider tests for serial correlation, tests for multivariate GARCH and sign-type tests against general dependencies and asymmetries. The procedures proposed provide exact versions of those applied in Shanken (1990), which consist in combining univariate specification tests. Specifically, we combine tests across equations using the MC test procedure to avoid Bonferroni-type bounds. Since non-Gaussian based tests are not pivotal, we apply the “maximized MC” (MMC) test method [Dufour (2002)], where the MC p-value for the tested hypothesis (which depends on nuisance parameters) is maximized (with respect to these nuisance parameters) to control the test’s significance level. The tests proposed are applied to an asset pricing model with observable risk-free rates, using monthly returns on New York Stock Exchange (NYSE) portfolios over five-year subperiods from 1926 to 1995. Our empirical results reveal the following. Whereas univariate exact tests indicate significant serial correlation, asymmetries and GARCH in some equations, such effects are much less prevalent once error cross-equation covariances are accounted for. In addition, significant departures from the i.i.d. hypothesis are less evident once we allow for non-Gaussian errors.
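The MC test idea at the core of these procedures can be sketched as follows; this is a generic illustration of the standard MC p-value, not the paper's code.

```python
# Monte Carlo test: simulate the statistic under the null and locate
# the observed value in the simulated distribution.
import numpy as np

def mc_pvalue(observed_stat, simulate_stat, n_rep=999, rng=None):
    """simulate_stat(rng) must draw one statistic under the null."""
    rng = rng or np.random.default_rng()
    sims = np.array([simulate_stat(rng) for _ in range(n_rep)])
    # exact for pivotal statistics; the MMC method maximizes this
    # p-value over nuisance parameters when the statistic is not pivotal
    return (1 + np.sum(sims >= observed_stat)) / (n_rep + 1)
```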
Abstract:
This thesis is divided into two parts: the first part presents and studies telegraph processes, Poisson processes with a telegraph compensator, and telegraph processes with jumps. The study in this first part includes the computation of the distribution of each process, their means and variances, as well as their moment generating functions, among other properties. Using these properties, the second part studies option pricing models based on telegraph processes with jumps. This part describes how to compute the risk-neutral measures, establishes the no-arbitrage condition for this class of models and, finally, computes the prices of European call and put options.
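For orientation, the basic (driftless) telegraph process can be written as follows; this is the standard textbook definition, in our notation:

```latex
% The particle moves at speed c and reverses direction at the jump
% epochs of a Poisson process N(t) of rate \lambda.
X(t) \;=\; \int_0^t c\,(-1)^{N(s)}\, ds
```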
Abstract:
The Monte Carlo Independent Column Approximation (McICA) is a flexible method for representing subgrid-scale cloud inhomogeneity in radiative transfer schemes. It does, however, introduce conditional random errors, but these have been shown to have little effect on climate simulations, where spatial and temporal scales of interest are large enough for effects of noise to be averaged out. This article considers the effect of McICA noise on a numerical weather prediction (NWP) model, where the time and spatial scales of interest are much closer to those at which the errors manifest themselves; this, as we show, means that the noise is more significant. We suggest methods for efficiently reducing the magnitude of McICA noise and test these methods in a global NWP version of the UK Met Office Unified Model (MetUM). The resultant errors are put into context by comparison with errors due to the widely used assumption of maximum-random-overlap of plane-parallel homogeneous cloud. For a simple implementation of the McICA scheme, forecasts of near-surface temperature are found to be worse than those obtained using the plane-parallel, maximum-random-overlap representation of clouds. However, by applying the methods suggested in this article, we can reduce noise enough to give forecasts of near-surface temperature that are an improvement on the plane-parallel, maximum-random-overlap forecasts. We conclude that the McICA scheme can be used to improve the representation of clouds in NWP models, with the provision that the associated noise is sufficiently small.
Abstract:
This paper presents a two-step pseudo-likelihood estimation technique for generalized linear mixed models in which the random effects are correlated between groups. The core idea is to handle the intractable integrals in the likelihood function with a multivariate Taylor approximation. The accuracy of the estimation technique is assessed in a Monte Carlo study. An application with a binary response variable is presented using a real data set on credit defaults from two Swedish banks. Thanks to the two-step estimation technique, the proposed algorithm outperforms conventional pseudo-likelihood algorithms in terms of computational time.
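The intractable integrals in question are the group-wise integrals over the random effects in the GLMM marginal likelihood; in generic notation (ours):

```latex
% Marginal likelihood of a GLMM: the random effects b_i must be
% integrated out of each group's conditional density.
L(\beta, \theta) \;=\; \prod_{i=1}^{m} \int f(y_i \mid b_i;\, \beta)\,
\phi(b_i;\, \theta)\, db_i
```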
Abstract:
Consumers often pay different prices for the same product bought in the same store at the same time. The demand estimation literature, however, has ignored this fact, relying instead on aggregate measures such as the “list” or average price. In this paper we show that this leads to biased price coefficients. Furthermore, we perform simple comparative-statics simulation exercises for the logit and random coefficient models. In the “list” price case we find that the bias is larger when discounts are higher, when the proportion of consumers facing discount prices is higher, and when consumers are so unwilling to buy the product that they almost only do so when facing a discount. In the average price case we find that the bias is larger when discounts are higher, when the proportion of consumers with access to discounts is similar to the proportion without access, and when consumers' willingness to buy depends strongly on idiosyncratic shocks. The bias is also less problematic in the average price case in markets with many bargain deals, where the average price is nearly as good as individual prices. We conclude by proposing ways in which the econometrician can reduce this bias using the different information he may have available.
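A toy version of the comparative-statics exercise, under purely illustrative assumptions (all parameters and names are ours): consumers pay either the list price or a discounted price, while the econometrician fits a logit on the list price alone, which biases the price coefficient.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, alpha, discount, share_disc = 50_000, -2.0, 0.5, 0.4

list_price = rng.uniform(1.0, 2.0, n)
paid = list_price - discount * (rng.random(n) < share_disc)
# binary purchase decision from a logit utility on the price paid
utility = 1.0 + alpha * paid + rng.gumbel(size=n) - rng.gumbel(size=n)
buys = (utility > 0).astype(int)

# estimation uses the observed *list* price, not the price paid
fit = sm.Logit(buys, sm.add_constant(list_price)).fit(disp=False)
print("true alpha:", alpha, "| estimated with list price:", fit.params[1])
```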