976 results for Survival Model


Relevance: 70.00%

Abstract:

In this paper, we propose a cure rate survival model by assuming that the number of competing causes of the event of interest follows a geometric distribution and the time to event follows a Birnbaum–Saunders distribution. We consider a frequentist analysis for parameter estimation of the geometric Birnbaum–Saunders model with cure rate. Finally, the model is applied to a data set from the medical area. (C) 2011 Elsevier B.V. All rights reserved.
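As a minimal stdlib-only sketch of the construction this abstract describes, the population survival function is the geometric probability generating function evaluated at the Birnbaum–Saunders survival function. The parametrisation below (P(N = n) = θ(1 − θ)ⁿ, so the cured fraction is θ) and all numeric values are assumptions for illustration, not necessarily the paper's.

```python
import math

def bs_cdf(t, alpha, beta):
    """Birnbaum-Saunders CDF, expressed through the standard normal CDF."""
    z = (math.sqrt(t / beta) - math.sqrt(beta / t)) / alpha
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def pop_survival(t, theta, alpha, beta):
    """Population survival of a geometric cure rate model:
    S_pop(t) = G(S(t)), where G(s) = theta / (1 - (1 - theta) * s)
    is the pgf of the assumed geometric law P(N = n) = theta * (1 - theta)**n."""
    s = 1.0 - bs_cdf(t, alpha, beta)  # survival of the latent event times
    return theta / (1.0 - (1.0 - theta) * s)

# As t grows, S_pop(t) levels off at the cured fraction theta:
print(round(pop_survival(1e9, theta=0.3, alpha=0.5, beta=2.0), 4))  # -> 0.3
```

At t = 0 the latent survival is 1 and S_pop is 1; as t grows it decreases to the cured fraction θ rather than to zero, which is the defining feature of a cure rate model.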

Relevance: 70.00%

Abstract:

Long-term survival models have historically been used to analyze time-to-event data with a fraction of long-term survivors. However, situations in which a fraction (1 - p) of systems is subject to failure from independent competing causes, while the remaining proportion p is cured or has not presented the event of interest during the study period, have not been fully considered in the literature. To accommodate such situations, we present in this paper a new long-term survival model. The maximum likelihood estimation procedure is discussed, along with interval estimation and hypothesis tests. A real dataset illustrates the methodology.
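The structure described above can be sketched in a few lines of plain Python: a cured fraction p that never fails, and a susceptible fraction that fails at the minimum of independent latent failure times. The exponential competing-cause rates below are illustrative assumptions, not taken from the paper.

```python
import math, random

def pop_survival(t, p, rates):
    """Long-term survival with cured fraction p; the susceptible fraction
    (1 - p) fails at the minimum of independent exponential latent causes."""
    return p + (1.0 - p) * math.exp(-sum(rates) * t)

def simulate(n, p, rates, rng):
    """Draw event times; cured subjects never fail (infinite time)."""
    total = sum(rates)
    return [math.inf if rng.random() < p else rng.expovariate(total)
            for _ in range(n)]

rng = random.Random(42)
times = simulate(100_000, p=0.25, rates=[0.5, 0.3], rng=rng)
emp = sum(t > 2.0 for t in times) / len(times)
# Empirical P(T > 2) against the model survival, 0.25 + 0.75 * exp(-1.6):
print(round(pop_survival(2.0, 0.25, [0.5, 0.3]), 3), round(emp, 3))
```

The minimum of independent exponential times is again exponential with the summed rate, which is why the susceptible-part survival collapses to a single `exp` term here; other latent-time distributions would require a product of cause-specific survival functions instead.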

Relevance: 70.00%

Abstract:

In this article, we propose a new Bayesian flexible cure rate survival model, which generalises the stochastic model of Klebanov et al. [Klebanov LB, Rachev ST and Yakovlev AY. A stochastic-model of radiation carcinogenesis - latent time distributions and their properties. Math Biosci 1993; 113: 51-75], and has much in common with the destructive model formulated by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP. Brazil, 2009 (accepted in Lifetime Data Analysis)]. In our approach, the accumulated number of lesions or altered cells follows a compound weighted Poisson distribution. This model is more flexible than the promotion time cure model in terms of dispersion. Moreover, it possesses an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest as it includes a destructive process of tumour cells after an initial treatment or the capacity of an individual exposed to irradiation to repair altered cells that results in cancer induction. In other words, what is recorded is only the damaged portion of the original number of altered cells not eliminated by the treatment or repaired by the repair system of an individual. Markov Chain Monte Carlo (MCMC) methods are then used to develop Bayesian inference for the proposed model. Also, some discussions on the model selection and an illustration with a cutaneous melanoma data set analysed by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP. Brazil, 2009 (accepted in Lifetime Data Analysis)] are presented.

Relevance: 70.00%

Abstract:

Suppose that, having established a marginal total effect of a point exposure on a time-to-event outcome, an investigator wishes to decompose this effect into its direct and indirect pathways, also known as natural direct and indirect effects, mediated by a variable known to occur after the exposure and prior to the outcome. This paper proposes a theory of estimation of natural direct and indirect effects in two important semiparametric models for a failure time outcome. The underlying survival model for the marginal total effect, and thus for the direct and indirect effects, can be either a marginal structural Cox proportional hazards model or a marginal structural additive hazards model. The proposed theory delivers new estimators for mediation analysis in each of these models, with appealing robustness properties. Specifically, in order to guarantee ignorability with respect to the exposure and mediator variables, the approach, which is multiply robust, allows the investigator to use several flexible working models to adjust for confounding by a large number of pre-exposure variables. Multiple robustness is appealing because it only requires a subset of working models to be correct for consistency; furthermore, the analyst need not know which subset of working models is in fact correct to report valid inferences. Finally, a novel semiparametric sensitivity analysis technique is developed for each of these models to assess the impact on inference of a violation of the assumption of ignorability of the mediator.

Relevance: 70.00%

Abstract:

A number of authors have studied the mixture survival model to analyze survival data with nonnegligible cure fractions. A key assumption made by these authors is the independence between the survival time and the censoring time. To our knowledge, no one has studied the mixture cure model in the presence of dependent censoring. To account for such dependence, we propose a more general cure model that allows for dependent censoring. In particular, we derive the cure models from the perspective of competing risks and model the dependence between the censoring time and the survival time using a class of Archimedean copula models. Within this framework, we consider parameter estimation, cure detection, and the two-sample comparison of latency distributions in the presence of dependent censoring when a proportion of patients is deemed cured. Large sample results using martingale theory are obtained. We applied the proposed methodologies to the SEER prostate cancer data.
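One way to picture the dependence structure used here is to simulate survival and censoring times whose ranks are tied together by a Clayton copula, a member of the Archimedean family. Everything below — the exponential margins, the rates, and the copula parameter — is an illustrative assumption, not the paper's data model.

```python
import math, random

def clayton_pair(theta, rng):
    """One (u, v) draw from a Clayton copula by conditional inversion."""
    u = rng.random()
    w = rng.random()
    v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

def simulate_dependent_censoring(n, theta, lam_t, lam_c, seed=1):
    """Exponential survival time T and censoring time C, made dependent
    through a Clayton copula (Kendall's tau = theta / (theta + 2))."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        u, v = clayton_pair(theta, rng)
        t = -math.log(u) / lam_t          # inverse of S_T(t) = exp(-lam_t * t)
        c = -math.log(v) / lam_c
        data.append((min(t, c), t <= c))  # observed time, event indicator
    return data

data = simulate_dependent_censoring(50_000, theta=2.0, lam_t=1.0, lam_c=0.5)
print(sum(d for _, d in data) / len(data))  # observed event proportion
```

Under independence the event proportion would be λ_T / (λ_T + λ_C); the positive Clayton dependence shifts it, which is exactly the distortion the proposed cure model is designed to account for.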

Relevance: 70.00%

Abstract:

Mixture modeling is commonly used to model categorical latent variables that represent subpopulations in which membership is unknown but can be inferred from the data. In recent years, finite mixture models have been applied to time-to-event data. However, the commonly used survival mixture model assumes that the effects of the covariates on failure times differ across latent classes while the covariate distribution is homogeneous. The aim of this dissertation is to develop a method to examine time-to-event data in the presence of unobserved heterogeneity within a mixture modeling framework. A joint model is developed to incorporate the latent survival trajectory along with the observed information for the joint analysis of a time-to-event variable, its discrete and continuous covariates, and a latent class variable. It is assumed that both the effects of covariates on survival times and the distribution of covariates vary across latent classes. The unobservable survival trajectories are identified by estimating the probability that a subject belongs to a particular class based on the observed information. We applied this method to a Hodgkin lymphoma study with long-term follow-up and observed four distinct latent classes in terms of long-term survival and distributions of prognostic factors. Our results from simulation studies and from the Hodgkin lymphoma study demonstrated the superiority of our joint model compared with the conventional survival model. This flexible inference method provides more accurate estimation and accommodates unobservable heterogeneity among individuals while taking interactions between covariates into consideration.

Relevance: 60.00%

Abstract:

Lately, there has been increasing interest in the association between temperature and adverse birth outcomes, including preterm birth (PTB) and stillbirth. PTB is a major predictor of many diseases later in life, and stillbirth is a devastating event for parents and families. The aim of this study was to assess the seasonal pattern of adverse birth outcomes, and to examine possible associations of maternal exposure to temperature with PTB and stillbirth. We also aimed to identify if there were any periods of the pregnancy where exposure to temperature was particularly harmful. A retrospective cohort study design was used and we retrieved individual birth records from the Queensland Health Perinatal Data Collection Unit for all singleton births (excluding twins and triplets) delivered in Brisbane between 1 July 2005 and 30 June 2009. We obtained weather data (including hourly relative humidity, minimum and maximum temperature) and air-pollution data (including PM10, SO2 and O3) from the Queensland Department of Environment and Resource Management. We used survival analyses with the time-dependent variables of temperature, humidity and air pollution, and the competing risks of stillbirth and live birth. To assess the monthly pattern of the birth outcomes, we fitted month of pregnancy as a time-dependent variable. We examined the seasonal pattern of the birth outcomes and the relationship between exposure to high or low temperatures and birth outcomes over the four lag weeks before birth. We further stratified by categorisation of PTB: extreme PTB (< 28 weeks of gestation), PTB (28–36 weeks of gestation), and term birth (≥ 37 weeks of gestation). Lastly, we examined the effect of temperature variation in each week of the pregnancy on birth outcomes. There was a bimodal seasonal pattern in gestation length. After adjusting for temperature, the seasonal pattern changed from bimodal to a single peak in winter.
The risk of stillbirth was statistically significantly lower in March compared with January. After adjusting for temperature, the March trough was still statistically significant and there was a peak in risk (not statistically significant) in winter. There was an acute effect of temperature on gestational age and stillbirth, with a shortened gestation for increasing temperature from 15 °C to 25 °C over the last four weeks before birth. For stillbirth, we found an increasing risk with increasing temperatures from 12 °C to approximately 20 °C, and no change in risk at temperatures above 20 °C. Certain periods of the pregnancy were more vulnerable to temperature variation. The risk of PTB (28–36 weeks of gestation) increased as temperatures increased above 21 °C. For stillbirth, the fetus was most vulnerable at less than 28 weeks of gestation, but there were also effects at 28–36 weeks of gestation. For fetuses of more than 37 weeks of gestation, increasing temperatures did not increase the risk of stillbirth. We did not find any adverse effects of cold temperature on birth outcomes in this cohort. My findings contribute to knowledge of the relationship between temperature and birth outcomes. In the context of climate change, this is particularly important. The results may have implications for public health policy and planning, as they indicate that pregnant women would decrease their risk of adverse birth outcomes by avoiding exposure to high temperatures and seeking cool environments during hot days.

Relevance: 60.00%

Abstract:

Background Risk-stratification of diffuse large B-cell lymphoma (DLBCL) requires identification of patients with disease that is not cured despite initial R-CHOP. Although the prognostic importance of the tumour microenvironment (TME) is established, the optimal strategy to quantify it is unknown. Methods The relationship between immune-effector and inhibitory (checkpoint) genes was assessed by NanoString™ in 252 paraffin-embedded DLBCL tissues. A model to quantify net anti-tumoural immunity as an outcome predictor was tested in 158 R-CHOP treated patients, and validated in tissue/blood from two independent R-CHOP treated cohorts of 233 and 140 patients, respectively. Findings T and NK-cell immune-effector molecule expression correlated with tumour-associated macrophage and PD-1/PD-L1 axis markers, consistent with malignant B-cells triggering a dynamic checkpoint response to adapt to and evade immune surveillance. A tree-based survival model was used to test whether immune-effector to checkpoint ratios were prognostic. The CD4*CD8:(CD163/CD68)*PD-L1 ratio stratified overall survival better than any single immune marker or other combination of markers, distinguishing groups with disparate 4-year survivals (92% versus 47%). The immune ratio was independent of, and added to, the revised international prognostic index (R-IPI) and cell-of-origin (COO). Tissue findings were validated in 233 DLBCL R-CHOP treated patients. Furthermore, within the blood of 140 R-CHOP treated patients, immune-effector:checkpoint ratios differed between interim-PET/CT-positive and -negative patients.

Relevance: 60.00%

Abstract:

Missing data are a common problem in epidemiological studies and, depending on how they arise, estimates of the parameters of interest may be biased. The literature offers several techniques for dealing with the issue, and multiple imputation has received particular attention in recent years. This dissertation presents the results of applying multiple imputation in the context of the Pró-Saúde Study, a longitudinal study among technical-administrative employees of a university in Rio de Janeiro. In the first study, after simulating the occurrence of missing data, the participants' colour/race variable was imputed and a previously established survival analysis model was applied, with self-reported history of uterine fibroids as the outcome. The procedure was replicated 100 times to determine the distribution of the coefficients and standard errors of the estimates for the variable of interest. Despite the cross-sectional nature of the data used here (baseline information from the Pró-Saúde Study, collected in 1999 and 2001), the participants' follow-up history was reconstructed from their reports, creating a situation in which the Cox proportional hazards model could be used. In the scenarios evaluated, imputation showed satisfactory results, including in the performance assessment. The technique performed well when the missing-data mechanism was MAR (Missing At Random) and the non-response rate was 10%. When the data were imputed and the estimates obtained from the 10 generated data sets (m = 10) were combined, the bias of the estimates was 0.0011 for the black category and 0.0015 for the brown (parda) category, corroborating the efficiency of imputation in this scenario. Other configurations showed similar results.
In the second article, a tutorial is developed for applying multiple imputation in epidemiological studies, which should make the technique easier to adopt for Brazilian researchers not yet familiar with the procedure. The basic steps and decisions needed to impute a data set are presented, and one of the scenarios from the first study is given as a worked example. All analyses were conducted in the statistical program R, version 2.15, and the scripts used are presented at the end of the text.
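The step of combining estimates across the m imputed data sets follows Rubin's rules: average the point estimates, then add the within-imputation and (inflated) between-imputation variances. The sketch below uses hypothetical log-hazard-ratio estimates purely for illustration.

```python
import math

def rubin_pool(estimates, variances):
    """Combine m complete-data estimates by Rubin's rules, returning
    the pooled point estimate and its total variance."""
    m = len(estimates)
    qbar = sum(estimates) / m                               # pooled estimate
    w = sum(variances) / m                                  # within-imputation variance
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)   # between-imputation variance
    t = w + (1.0 + 1.0 / m) * b                             # total variance
    return qbar, t

# Hypothetical log-hazard-ratio estimates from m = 10 imputed data sets:
est = [0.52, 0.49, 0.55, 0.51, 0.48, 0.53, 0.50, 0.54, 0.47, 0.51]
var = [0.04] * 10
qbar, t = rubin_pool(est, var)
print(round(qbar, 3), round(t, 4))  # -> 0.51 0.0407
```

The (1 + 1/m) factor on the between-imputation variance accounts for using a finite number of imputations; the total variance is always at least the average complete-data variance.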

Relevance: 60.00%

Abstract:

The standard linear-quadratic (LQ) survival model for external beam radiotherapy is reviewed with particular emphasis on studying how different schedules of radiation treatment planning may be affected by different tumour repopulation kinetics. The LQ model is further examined in the context of tumour control probability (TCP) models. The application of the Zaider and Minerbo non-Poissonian TCP model incorporating the effect of cellular repopulation is reviewed. In particular the recent development of a cell cycle model within the original Zaider and Minerbo TCP formalism is highlighted. Application of this TCP cell-cycle model in clinical treatment plans is explored and analysed.
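A bare-bones numeric sketch of the LQ surviving fraction with exponential repopulation, plugged into the simple Poisson TCP. Note that the Zaider and Minerbo model reviewed above is explicitly non-Poissonian; the Poisson form is used here only as the simplest illustration, and every parameter value below is an assumption for illustration.

```python
import math

def lq_survival(dose_per_fraction, n_fractions, alpha, beta,
                treatment_days=0.0, kickoff_days=0.0, doubling_days=None):
    """Linear-quadratic surviving fraction for a fractionated schedule,
    S = exp(-n*(alpha*d + beta*d^2)), with optional exponential
    repopulation after the kick-off time."""
    d, n = dose_per_fraction, n_fractions
    log_s = -n * (alpha * d + beta * d * d)
    if doubling_days is not None and treatment_days > kickoff_days:
        log_s += math.log(2.0) / doubling_days * (treatment_days - kickoff_days)
    return math.exp(log_s)

def poisson_tcp(n_clonogens, surviving_fraction):
    """Poisson tumour control probability: TCP = exp(-N * S)."""
    return math.exp(-n_clonogens * surviving_fraction)

# 30 x 2 Gy, alpha = 0.3 / Gy, alpha/beta = 10 Gy (illustrative values only):
s = lq_survival(2.0, 30, alpha=0.3, beta=0.03)
print(poisson_tcp(1e7, s))
```

With repopulation switched on (`doubling_days` set), the surviving fraction rises with protracted treatment time, which is the mechanism through which different schedules trade off against tumour repopulation kinetics.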

Relevance: 60.00%

Abstract:

BACKGROUND: Multiple risk prediction models have been validated in all-age patients presenting with acute coronary syndrome (ACS) and treated with percutaneous coronary intervention (PCI); however, they have not been validated specifically in the elderly. METHODS: We calculated the GRACE (Global Registry of Acute Coronary Events) score, the logistic EuroSCORE, the AMIS (Acute Myocardial Infarction Swiss registry) score, and the SYNTAX (Synergy between Percutaneous Coronary Intervention with TAXUS and Cardiac Surgery) score in a consecutive series of 114 patients ≥75 years presenting with ACS and treated with PCI within 24 hours of hospital admission. Patients were stratified according to score tertiles and analysed retrospectively by comparing the lower/mid tertiles as an aggregate group with the higher tertile group. The primary endpoint was 30-day mortality. Secondary endpoints were the composite of death and major adverse cardiovascular events (MACE) at 30 days, and 1-year MACE-free survival. Model discrimination ability was assessed using the area under the receiver operating characteristic curve (AUC). RESULTS: Thirty-day mortality was higher in the upper tertile compared with the aggregate lower/mid tertiles according to the logistic EuroSCORE (42% vs 5%; odds ratio [OR] = 14, 95% confidence interval [CI] = 4-48; p <0.001; AUC = 0.79), the GRACE score (40% vs 4%; OR = 17, 95% CI = 4-64; p <0.001; AUC = 0.80), the AMIS score (40% vs 4%; OR = 16, 95% CI = 4-63; p <0.001; AUC = 0.80), and the SYNTAX score (37% vs 5%; OR = 11, 95% CI = 3-37; p <0.001; AUC = 0.77). CONCLUSIONS: In elderly patients presenting with ACS and referred to PCI within 24 hours of admission, the GRACE score, the EuroSCORE, the AMIS score, and the SYNTAX score predicted 30-day mortality. The predictive value of clinical scores was improved by using them in combination.
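The discrimination statistic reported above, the AUC, is equivalent to the Mann–Whitney probability that a patient who died received a higher risk score than one who survived. The sketch below computes it directly from that definition; the scores are hypothetical, not the study's data.

```python
def auc(scores_events, scores_survivors):
    """Area under the ROC curve as the probability that a randomly chosen
    patient with the event received a higher score (Mann-Whitney form);
    tied scores count one half."""
    pairs = concordant = ties = 0
    for a in scores_events:
        for b in scores_survivors:
            pairs += 1
            if a > b:
                concordant += 1
            elif a == b:
                ties += 1
    return (concordant + 0.5 * ties) / pairs

# Hypothetical risk scores for patients who died vs survived at 30 days:
died = [140, 165, 150, 180, 155]
alive = [120, 130, 145, 110, 150, 125, 135, 128]
print(round(auc(died, alive), 3))  # -> 0.938
```

An AUC of 0.5 means the score discriminates no better than chance; the reported values around 0.77–0.80 indicate good, though imperfect, separation of the two outcome groups.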

Relevance: 60.00%

Abstract:

In this paper, we develop a flexible cure rate survival model by assuming the number of competing causes of the event of interest to follow a compound weighted Poisson distribution. This model is more flexible in terms of dispersion than the promotion time cure model. Moreover, it gives an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest, as it includes a destructive process of the initial risk factors in a competitive scenario. In other words, what is recorded comes only from the undamaged portion of the original number of risk factors.
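The destructive mechanism can be illustrated with the simplest damage model: independent Bernoulli thinning of an initial Poisson count of risk factors (the compound weighted Poisson of the paper generalises this). All parameter values below are assumptions for illustration; under thinning, the undamaged count is again Poisson with mean η·q, so the cured fraction is exp(−η·q).

```python
import math, random

def simulate_undamaged(n_subjects, eta, q, seed=7):
    """Destructive-thinning sketch: each subject starts with a Poisson(eta)
    number of risk factors, each surviving the destructive process
    independently with probability q; returns the undamaged counts."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_subjects):
        # Poisson draw by multiplicative inversion (fine for small eta)
        l, k, p = math.exp(-eta), 0, 1.0
        while True:
            p *= rng.random()
            if p <= l:
                break
            k += 1
        counts.append(sum(rng.random() < q for _ in range(k)))
    return counts

counts = simulate_undamaged(100_000, eta=2.0, q=0.4)
cured = sum(c == 0 for c in counts) / len(counts)
print(round(cured, 3), round(math.exp(-2.0 * 0.4), 3))  # empirical vs exp(-eta*q)
```

Subjects left with zero undamaged risk factors never experience the event, which is how the destructive process generates the cure fraction in this family of models.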

Relevance: 60.00%

Abstract:

Background Analysis of recurrent event data is frequently needed in clinical and epidemiological studies. An important issue in such analysis is how to account for the dependence of the events in an individual and any unobserved heterogeneity of the event propensity across individuals. Methods We applied a number of conditional frailty and nonfrailty models in an analysis involving recurrent myocardial infarction events in the Long-Term Intervention with Pravastatin in Ischaemic Disease study. A multiple variable risk prediction model was developed for both males and females. Results A Weibull model with a gamma frailty term fitted the data better than other frailty models for each gender. Among nonfrailty models the stratified survival model fitted the data best for each gender. The relative risk estimated by the elapsed time model was close to that estimated by the gap time model. We found that a cholesterol-lowering drug, pravastatin (the intervention being tested in the trial), had a significant protective effect against the occurrence of myocardial infarction in men (HR = 0.71, 95% CI 0.60–0.83). However, the treatment effect was not significant in women, due to the smaller sample size (HR = 0.75, 95% CI 0.51–1.10). There were no significant interactions between the treatment effect and each recurrent MI event (p = 0.24 for men and p = 0.55 for women). The risk of developing an MI event for a male who had an MI event during follow-up was about 3.4 (95% CI 2.6–4.4) times the risk for those who did not have an MI event. The corresponding relative risk for a female was about 7.8 (95% CI 4.4–13.6).
Limitations The number of female patients was relatively small compared with their male counterparts, which may result in low statistical power to find real differences in the effect of treatment and other potential risk factors. Conclusions The conditional frailty model suggested that, after accounting for all the risk factors in the model, there was still unmeasured heterogeneity of the risk for myocardial infarction, indicating the effect of subject-specific risk factors. These risk prediction models can be used to classify cardiovascular disease patients into different risk categories and may be useful for the most effective targeting of preventive therapies for cardiovascular disease.
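The gamma-frailty idea above can be sketched by simulating gap-time recurrent events with a shared, subject-specific frailty that multiplies the hazard: high-frailty subjects accumulate events quickly, producing the within-subject dependence the models try to capture. The Weibull baseline and all parameter values are illustrative assumptions.

```python
import random

def simulate_recurrent(n_subjects, shape, scale, frailty_var, tau, seed=3):
    """Gap-time recurrent events with a shared gamma frailty per subject:
    conditional on frailty Z (mean 1, variance frailty_var), each gap time
    has survival exp(-Z * (t/scale)**shape); follow-up is censored at tau."""
    rng = random.Random(seed)
    event_counts = []
    for _ in range(n_subjects):
        z = rng.gammavariate(1.0 / frailty_var, frailty_var)  # E[Z]=1, Var[Z]=frailty_var
        t, events = 0.0, 0
        while True:
            e = rng.expovariate(1.0)
            gap = scale * (e / z) ** (1.0 / shape)  # inverse of the conditional survival
            t += gap
            if t > tau:
                break
            events += 1
        event_counts.append(events)
    return event_counts

counts = simulate_recurrent(20_000, shape=1.5, scale=2.0, frailty_var=0.5, tau=5.0)
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1)
print(round(mean, 2), round(var, 2))  # frailty inflates the count variance
```

Setting `frailty_var` close to zero recovers an ordinary renewal process; increasing it spreads the event counts out across subjects, mimicking the unmeasured heterogeneity the Conclusions paragraph describes.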

Relevance: 60.00%

Abstract:

NO plays diverse roles in physiological and pathological processes, occasionally resulting in opposing effects, particularly in cells subjected to oxidative stress. NO mostly protects eukaryotes against oxidative injury, but was demonstrated to kill prokaryotes synergistically with H2O2. This could be a promising therapeutic avenue. However, conflicting findings describing a dramatic protective activity of NO have recently been reported. Previous studies of NO effects on prokaryotes applied a transient oxidative stress while arbitrarily checking the residual bacterial viability after 30 or 60 min, ignoring the process kinetics. If NO-induced synergy and the oxidative stress are time-dependent, elucidation of the cell-killing kinetics is essential, particularly for survival curves exhibiting a "shoulder", which sometimes reflects sublethal damage, as in the linear-quadratic survival models. We studied the kinetics of NO synergic effects on H2O2-induced killing of microbial pathogens. A synergic pro-oxidative activity toward gram-negative and gram-positive cells is demonstrated even at sub-μM/min flux of NO. For certain strains, the synergic effect progressively increased with the duration of cell exposure, and the linear-quadratic survival model best fit the observed survival data. In contrast to the failure of SOD to affect the bactericidal process, nitroxide SOD mimics abrogated the pro-oxidative synergy of NO/H2O2. These cell-permeative antioxidants, which hardly react with diamagnetic species and react neither with NO nor with H2O2, can detoxify redox-active transition metals and catalytically remove intracellular superoxide and nitrogen-derived reactive species such as •NO2 or peroxynitrite. The possible mechanism underlying the bactericidal NO synergy under oxidative stress and the potential therapeutic gain are discussed.

Relevance: 60.00%

Abstract:

How should public policies be guided so as to promote the population's well-being? To answer this question, the academic community has focused on the need to better understand individual consumption choices. This trend is supported by the ever-growing number of microdata sets made available by government agencies and the private sector. This work analyses Brazilians' choices regarding financing decisions and labour supply. The study is divided into three distinct empirical essays. Whether borrowing in informal markets is motivated by a deficit in financial literacy is the focus of the first essay. Considering more than 2,000 observations on borrowing, a multinomial logit model is used to estimate the propensity to borrow informally as opposed to through bank credit. The results indicate that financial literacy may matter more for the choice of informal financing than credit constraints do. The second essay analyses credit card usage behaviour among 1,458 young adults living in Brazil, the USA or France. A structural equation model is used to incorporate relationships among the latent variables. The model validated by the study represents a situation in which financial well-being is affected by how the individual uses the credit card, which in turn is affected by feelings of social comparison and by financial self-confidence, the latter also being influenced by the financial education received from parents. In the between-group comparison, we find evidence that social comparison has a stronger effect on young Brazilians and that men are more dependent on parental financial education than women.
In the last essay, the Brazilian poor are analysed with respect to a supposed "laziness effect", which would be caused by a reduction in the labour supply of families receiving the government's financial benefit through the Bolsa Família programme. A survival model was used to compare employment duration between programme beneficiaries and a control group, using a data set with more than 3 million individuals. The hypothesis of a laziness effect is rejected. The risk of job separation for Bolsa Família beneficiaries is measured as 7% to 10% lower, which is enough to offset, for example, the higher risk of leaving employment associated with the presence of young children in the family. Since job turnover makes it harder to qualify for contribution-based retirement pensions, it can be concluded that the Brazilian income transfer programme will have a positive impact on workers' future financial well-being.