7 results for longitudinal Poisson data

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

100.00%

Publisher:

Abstract:

In this paper, we present different "frailty" models to analyze longitudinal data in the presence of covariates. These models incorporate the extra-Poisson variability and the possible correlation among the repeated counts for each individual. Using a CD4 count data set from HIV-infected patients, we develop a hierarchical Bayesian analysis considering the different proposed models and using Markov chain Monte Carlo methods. We also discuss some Bayesian discrimination criteria for the choice of the best model.
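As a rough illustration of the frailty idea in this abstract (not the paper's hierarchical model, and with purely hypothetical parameters), the sketch below simulates longitudinal counts with a gamma frailty shared by all visits of a subject, and checks that the shared frailty produces both extra-Poisson variability and within-subject correlation.

```python
# Rough illustration (hypothetical parameters, not the paper's hierarchical
# model): longitudinal counts with a gamma frailty shared by all visits of a
# subject.  The shared frailty produces both extra-Poisson variability and
# correlation among the repeated counts of the same individual.
import numpy as np

rng = np.random.default_rng(42)

n_subjects, n_visits = 200, 5
beta0, beta1 = 2.0, -0.1                 # hypothetical intercept and visit effect
frailty_shape = 2.0                      # gamma frailty with mean 1, variance 0.5

visits = np.arange(n_visits)
frailty = rng.gamma(frailty_shape, 1.0 / frailty_shape, size=(n_subjects, 1))
mu = frailty * np.exp(beta0 + beta1 * visits)        # subject-by-visit means
counts = rng.poisson(mu)

# Extra-Poisson variability: the marginal variance exceeds the marginal mean.
print("mean:", counts.mean(), "variance:", counts.var())

# Within-subject correlation induced by the shared frailty.
print("corr(visit 0, visit 1):", np.corrcoef(counts[:, 0], counts[:, 1])[0, 1])
```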

Relevance:

40.00%

Publisher:

Abstract:

In this paper, we develop a flexible cure rate survival model by assuming that the number of competing causes of the event of interest follows the Conway-Maxwell Poisson distribution. This model includes as special cases some of the well-known cure rate models discussed in the literature. Next, we discuss maximum likelihood estimation of the parameters of this cure rate survival model. Finally, we illustrate the usefulness of the model by applying it to a real cutaneous melanoma data set. (C) 2009 Elsevier B.V. All rights reserved.
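The cure-rate construction can be sketched with the standard latent-causes argument: if the number of competing causes M is COM-Poisson with normalizing constant Z(eta, phi) and each cause has survival S(t), the population survival is Z(eta*S(t), phi)/Z(eta, phi) and the cure fraction is 1/Z(eta, phi). The snippet below evaluates these quantities for an assumed exponential baseline and illustrative parameters; it is not the melanoma analysis itself.

```python
# Hedged sketch (exponential baseline and parameter values are assumptions,
# not the melanoma analysis): with M ~ COM-Poisson(eta, phi) competing causes
# and latent-cause survival S(t), the population survival is
#   S_pop(t) = E[S(t)**M] = Z(eta * S(t), phi) / Z(eta, phi),
# where Z(lam, nu) = sum_j lam**j / (j!)**nu, and the cure fraction is
# P(M = 0) = 1 / Z(eta, phi).
import math

def com_poisson_Z(lam, nu, terms=100):
    """Truncated COM-Poisson normalizing constant Z(lam, nu)."""
    return sum(lam ** j / math.factorial(j) ** nu for j in range(terms))

def population_survival(t, eta, phi, baseline_survival):
    s = baseline_survival(t)
    return com_poisson_Z(eta * s, phi) / com_poisson_Z(eta, phi)

baseline = lambda t: math.exp(-t)     # hypothetical latent-cause survival
eta, phi = 1.5, 0.8                   # hypothetical COM-Poisson parameters

print("cure fraction:", 1.0 / com_poisson_Z(eta, phi))
for t in (0.5, 1.0, 2.0, 5.0):
    print(f"S_pop({t}) = {population_survival(t, eta, phi, baseline):.4f}")
```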

Relevance:

40.00%

Publisher:

Abstract:

We analyze data obtained from a study designed to evaluate training effects on the performance of certain motor activities of Parkinson's disease patients. Maximum likelihood methods were used to fit beta-binomial/Poisson regression models tailored to evaluate the effects of training on the numbers of attempted and successful specified manual movements in 1-min periods, controlling for disease stage and use of the preferred hand. We extend models previously considered by other authors in univariate settings to account for the repeated-measures nature of the data. The results suggest that the expected numbers of attempts and successes increase with training, except for patients in advanced stages of the disease using the non-preferred hand. Copyright (c) 2008 John Wiley & Sons, Ltd.
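A minimal version of the beta-binomial/Poisson pair described above can be written down directly: attempts are Poisson, and successes given attempts are beta-binomial. The sketch below simulates such data with illustrative parameters and recovers them by maximum likelihood with scipy; the regression covariates (training, disease stage, preferred hand) and the repeated-measures extension developed in the paper are deliberately omitted.

```python
# Minimal sketch (illustrative parameters, not the Parkinson's data): attempts
# are Poisson and successes, given the attempts, are beta-binomial; the three
# parameters are recovered by maximum likelihood.  The regression structure
# and repeated-measures extension from the paper are omitted.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)

lam_true, a_true, b_true = 12.0, 6.0, 3.0
attempts = rng.poisson(lam_true, size=300)             # attempted movements
p = rng.beta(a_true, b_true, size=300)                 # latent success rates
successes = rng.binomial(attempts, p)                  # successful movements

def neg_log_lik(theta):
    lam, a, b = np.exp(theta)                          # optimize on the log scale
    ll = stats.poisson.logpmf(attempts, lam).sum()
    ll += stats.betabinom.logpmf(successes, attempts, a, b).sum()
    return -ll

fit = optimize.minimize(neg_log_lik, x0=np.log([10.0, 1.0, 1.0]), method="Nelder-Mead")
print("MLE of (lambda, a, b):", np.exp(fit.x))
```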

Relevance:

30.00%

Publisher:

Abstract:

Objective: To describe the composition of metabolic acidosis in patients with severe sepsis and septic shock at intensive care unit admission and throughout the first 5 days of intensive care unit stay. Design: Prospective, observational study. Setting: Twelve-bed intensive care unit. Patients: Sixty patients with either severe sepsis or septic shock. Interventions: None. Measurements and Main Results: Data were collected until 5 days after intensive care unit admission. We studied the contribution of inorganic ion difference, lactate, albumin, phosphate, and strong ion gap to metabolic acidosis. At admission, standard base excess was -6.69 +/- 4.19 mEq/L in survivors vs. -11.63 +/- 4.87 mEq/L in nonsurvivors (p < .05); inorganic ion difference (mainly resulting from hyperchloremia) was responsible for a decrease in standard base excess by 5.64 +/- 4.96 mEq/L in survivors vs. 8.94 +/- 7.06 mEq/L in nonsurvivors (p < .05); strong ion gap was responsible for a decrease in standard base excess by 4.07 +/- 3.57 mEq/L in survivors vs. 4.92 +/- 5.55 mEq/L in nonsurvivors with a nonsignificant probability value; and lactate was responsible for a decrease in standard base excess by 1.34 +/- 2.07 mEq/L in survivors vs. 1.61 +/- 2.25 mEq/L in nonsurvivors with a nonsignificant probability value. Albumin had an important alkalinizing effect in both groups; phosphate had a minimal acid-base effect. Acidosis in survivors was corrected during the study period as a result of a decrease in lactate and strong ion gap levels, whereas nonsurvivors did not correct their metabolic acidosis. In addition to Acute Physiology and Chronic Health Evaluation II score and serum creatinine level, the magnitude of inorganic ion difference acidosis at intensive care unit admission was independently associated with a worse outcome. Conclusions: Patients with severe sepsis and septic shock exhibit a complex metabolic acidosis at intensive care unit admission, caused predominantly by hyperchloremic acidosis, which was more pronounced in nonsurvivors. Acidosis resolution in survivors was attributable to a decrease in strong ion gap and lactate levels. (Crit Care Med 2009; 37:2733-2739)
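As a back-of-envelope check on the additive (Stewart-style) accounting this abstract relies on, the component effects reported for survivors at admission can be summed to recover the observed standard base excess. The albumin effect printed below is only the residual implied by that additivity assumption (phosphate, described as minimal, is ignored) and is not a figure reported in the paper.

```python
# Back-of-envelope check using the survivors' admission means quoted in the
# abstract.  Under an additive accounting of standard base excess, the
# acidifying components plus the alkalinizing albumin effect sum to the
# observed value; the albumin effect is taken as the implied residual
# (phosphate, reported as minimal, is ignored).
sbe_observed = -6.69      # mEq/L, standard base excess, survivors at admission
iid_effect = -5.64        # inorganic ion difference (mainly hyperchloremia)
sig_effect = -4.07        # strong ion gap (unmeasured anions)
lactate_effect = -1.34    # lactate

albumin_effect = sbe_observed - (iid_effect + sig_effect + lactate_effect)
print(f"implied albumin (alkalinizing) effect: {albumin_effect:+.2f} mEq/L")
```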

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider some non-homogeneous Poisson models to estimate the probability that an air quality standard is exceeded a given number of times in a time interval of interest. We assume that the exceedances occur according to a non-homogeneous Poisson process (NHPP). This Poisson process has a rate function lambda(t), t >= 0, which depends on some parameters that must be estimated. We take into account two forms of rate function: the Weibull and the Goel-Okumoto. We consider models with and without change-points; when change-points are assumed, there may be one, two, or three of them, depending on the data set. The parameters of the rate functions are estimated using a Gibbs sampling algorithm. Results are applied to ozone data provided by the Mexico City monitoring network. In the first instance, we assume that no change-points are present; depending on the fit of the model, we then assume the presence of one, two, or three change-points. Copyright (C) 2009 John Wiley & Sons, Ltd.
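Once the mean function m(t) of the non-homogeneous Poisson process is available, the probability that the standard is exceeded k times in (0, t] is the ordinary Poisson probability with mean m(t). The sketch below evaluates this for the two rate-function families named in the abstract, the Weibull form and the Goel-Okumoto form; the parameter values are purely illustrative, not the Gibbs-sampling estimates obtained from the ozone data.

```python
# Minimal sketch of the exceedance-probability calculation these models
# support: for an NHPP with mean function m(t),
#   P(N(t) = k) = m(t)**k * exp(-m(t)) / k!.
# The Weibull and Goel-Okumoto mean functions are the standard forms; the
# parameter values are illustrative only.
import math

def weibull_mean(t, sigma, beta):
    """m(t) for the Weibull-form rate lambda(t) = (beta/sigma)*(t/sigma)**(beta-1)."""
    return (t / sigma) ** beta

def goel_okumoto_mean(t, alpha, beta):
    """m(t) for the Goel-Okumoto rate lambda(t) = alpha*beta*exp(-beta*t)."""
    return alpha * (1.0 - math.exp(-beta * t))

def prob_k_exceedances(k, m_t):
    """P(N(t) = k) for an NHPP whose mean function equals m_t at time t."""
    return m_t ** k * math.exp(-m_t) / math.factorial(k)

t = 180.0                                          # days, illustrative horizon
m_w = weibull_mean(t, sigma=30.0, beta=0.8)        # hypothetical parameters
m_go = goel_okumoto_mean(t, alpha=8.0, beta=0.02)

for k in range(3):
    print(k, prob_k_exceedances(k, m_w), prob_k_exceedances(k, m_go))
print("P(at least one exceedance), Weibull form:", 1.0 - prob_k_exceedances(0, m_w))
```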

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider the problem of estimating the number of times an air quality standard is exceeded in a given period of time. A non-homogeneous Poisson model is proposed to analyse this issue. The rate at which the Poisson events occur is given by a rate function lambda(t), t >= 0, which also depends on some parameters that need to be estimated. Two forms of lambda(t), t >= 0, are considered: one of the Weibull form and the other of the exponentiated-Weibull form. Parameter estimation is carried out using a Bayesian formulation based on the Gibbs sampling algorithm. The prior distributions for the parameters are assigned in two stages: in the first stage, non-informative prior distributions are considered, and using the information provided by that stage, more informative prior distributions are used in the second. The theoretical development is applied to data provided by the monitoring network of Mexico City. The rate function that best fits the data varies according to the region of the city and/or the threshold considered: in some cases the best fit is the Weibull form and in other cases the best option is the exponentiated-Weibull form. Copyright (C) 2007 John Wiley & Sons, Ltd.
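A compact stand-in for the Bayesian machinery described above is a random-walk Metropolis sampler for the Weibull-form NHPP: with exceedance times t_1 < ... < t_n observed on (0, T], the log-likelihood is the sum of log lambda(t_i) minus m(T). The sketch below simulates data and samples the posterior on the log scale under vague normal priors; the paper's two-stage prior elicitation, Gibbs scheme, and exponentiated-Weibull alternative are not reproduced.

```python
# Hedged sketch (simulated data, vague normal priors, and a random-walk
# Metropolis sampler are assumptions standing in for the paper's Gibbs scheme):
# for a Weibull-form NHPP with rate lambda(t) = (beta/sigma)*(t/sigma)**(beta-1)
# and mean function m(t) = (t/sigma)**beta, the log-likelihood of exceedance
# times t_1 < ... < t_n on (0, T] is sum_i log(lambda(t_i)) - m(T).
import numpy as np

rng = np.random.default_rng(1)

# Simulate exceedance times: given N(T) = n, arrival times are distributed as
# order statistics of i.i.d. draws with CDF m(t)/m(T) = (t/T)**beta.
T, sigma_true, beta_true = 365.0, 40.0, 0.9
n = rng.poisson((T / sigma_true) ** beta_true)
times = np.sort(T * rng.uniform(size=n) ** (1.0 / beta_true))

def log_post(log_sigma, log_beta):
    sigma, beta = np.exp(log_sigma), np.exp(log_beta)
    loglik = np.sum(np.log(beta / sigma) + (beta - 1.0) * np.log(times / sigma))
    loglik -= (T / sigma) ** beta                                  # minus m(T)
    log_prior = -0.5 * (log_sigma ** 2 + log_beta ** 2) / 100.0    # vague normals
    return loglik + log_prior

draws = []
current = np.array([np.log(30.0), np.log(1.0)])        # starting values
current_lp = log_post(*current)
for _ in range(5000):
    proposal = current + rng.normal(scale=0.1, size=2)
    proposal_lp = log_post(*proposal)
    if np.log(rng.uniform()) < proposal_lp - current_lp:
        current, current_lp = proposal, proposal_lp
    draws.append(current)

sigma_draws, beta_draws = np.exp(np.array(draws[1000:])).T
print("posterior means (sigma, beta):", sigma_draws.mean(), beta_draws.mean())
```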

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we develop a flexible cure rate survival model by assuming that the number of competing causes of the event of interest follows a compound weighted Poisson distribution. This model is more flexible in terms of dispersion than the promotion time cure model. Moreover, it gives an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest, as it includes a destructive process of the initial risk factors in a competitive scenario. In other words, what is recorded is only the undamaged portion of the original number of risk factors.
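The destructive mechanism can be mimicked with a few lines of simulation: draw the initial number of risk factors, thin them with a damage probability, and declare the subject cured when no undamaged factor remains. The sketch below uses a plain Poisson initial count (the simplest member of the weighted Poisson family considered in the paper) and exponential latent activation times, both of which are illustrative assumptions rather than the paper's model.

```python
# Illustrative simulation of the destructive mechanism (plain Poisson initial
# count, damage probability, and exponential latent times are assumptions for
# illustration; the paper works with compound weighted Poisson counts).  Only
# the undamaged portion D of the initial M risk factors can trigger the event;
# D = 0 corresponds to a cured subject.
import numpy as np

rng = np.random.default_rng(7)

n = 10_000
eta = 2.5                  # mean of the initial number of risk factors
p_undamaged = 0.6          # probability a factor survives the destructive process

M = rng.poisson(eta, size=n)              # initial competing causes
D = rng.binomial(M, p_undamaged)          # undamaged (activatable) causes

cured = D == 0
event_time = np.full(n, np.inf)           # infinity marks cured subjects
for i in np.flatnonzero(~cured):
    event_time[i] = rng.exponential(scale=1.0, size=D[i]).min()   # first activation

print("empirical cure fraction:", cured.mean())
print("thinned-Poisson cure fraction exp(-eta*p):", np.exp(-eta * p_undamaged))
```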