884 results for Survival analysis (Biometry) Mathematical models
Abstract:
This thesis developed and applied Bayesian models for the analysis of survival data. Gene expression was used as explanatory variables within the Bayesian survival model, which can be considered the new contribution in the analysis of such data. The censoring inherent in survival data was also addressed in terms of its impact on the fitting of a finite mixture of Weibull distributions, with and without covariates. To investigate this, simulation studies were carried out under several censoring percentages. Censoring percentages as high as 80% are acceptable here, as the work involved high-dimensional data. Lastly, a Bayesian model averaging approach was developed to incorporate model uncertainty in the prediction of survival.
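As a rough sketch of the kind of simulation study described in this abstract, the following Python fragment (plain NumPy, with made-up mixture weights, Weibull parameters and a uniform censoring mechanism that are assumptions rather than the thesis's actual design) generates right-censored data from a two-component Weibull mixture at an approximate target censoring percentage such as 80%.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_weibull_mixture(n, weights, shapes, scales, target_censoring):
    """Draw survival times from a finite Weibull mixture and apply
    uniform right-censoring, crudely tuned toward a target censoring rate."""
    comp = rng.choice(len(weights), size=n, p=weights)
    t = scales[comp] * rng.weibull(shapes[comp], size=n)
    c_max = t.max()  # start the censoring window at the sample maximum
    for _ in range(200):
        c = rng.uniform(0, c_max, size=n)
        censored = c < t
        rate = censored.mean()
        if abs(rate - target_censoring) < 0.02:
            break
        # shrink the window to censor more, widen it to censor less
        c_max *= 0.95 if rate < target_censoring else 1.05
    time = np.minimum(t, c)
    event = (~censored).astype(int)   # 1 = observed failure, 0 = censored
    return time, event, rate

# Example: roughly 80% censoring, the heaviest scenario mentioned above.
weights = np.array([0.6, 0.4])
shapes  = np.array([1.5, 0.8])
scales  = np.array([2.0, 6.0])
time, event, rate = simulate_weibull_mixture(1000, weights, shapes, scales, 0.80)
print(f"achieved censoring: {rate:.2%}, events observed: {event.sum()}")
```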
Abstract:
196 p.
Abstract:
The degradation of resorbable polymeric devices often takes months to years. Accelerated testing at elevated temperatures is an attractive but controversial technique. The purposes of this paper include: (a) to provide a summary of the mathematical models required to analyse accelerated degradation data and to indicate the pitfalls of using these models; (b) to improve the model previously developed by Han and Pan; (c) to provide a simple version of the model of Han and Pan with an analytical solution that is convenient to use; (d) to demonstrate the application of the improved model in two different poly(lactic acid) systems. It is shown that the simple analytical relations between molecular weight and degradation time widely used in the literature can lead to inadequate conclusions. In more general situations the rate equations are only part of a complete degradation model. Together with previous works in the literature, our study calls for care in using the accelerated testing technique.
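For context on what "accelerated testing at elevated temperatures" typically relies on, the sketch below applies an Arrhenius extrapolation of a degradation rate constant together with the simplest first-order molecular weight relation. The activation energy and rate values are invented for illustration, and this is exactly the kind of simple analytical relation the paper cautions can mislead; it is not the Han and Pan model itself.

```python
import math

R = 8.314  # J mol^-1 K^-1, universal gas constant

def extrapolate_rate(k_high, T_high_C, T_low_C, Ea):
    """Arrhenius extrapolation: k(T) = A * exp(-Ea / (R * T)), so
    k_low = k_high * exp(Ea / R * (1/T_high - 1/T_low))."""
    T_high = T_high_C + 273.15
    T_low = T_low_C + 273.15
    return k_high * math.exp(Ea / R * (1.0 / T_high - 1.0 / T_low))

# Hypothetical numbers: rate constant measured at 70 degC, activation
# energy of ~80 kJ/mol, extrapolated down to physiological 37 degC.
k_70 = 0.05        # per day, assumed
Ea = 80_000.0      # J/mol, assumed
k_37 = extrapolate_rate(k_70, 70.0, 37.0, Ea)

# Under the simplest first-order picture, Mn(t) = Mn0 * exp(-k * t),
# so the time to reach a given molecular weight scales as 1/k.
print(f"k(37 C) = {k_37:.2e} per day; "
      f"acceleration factor = {k_70 / k_37:.0f}x")
```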
Abstract:
Jewell and Kalbfleisch (1992) consider the use of marker processes for applications related to estimation of the survival distribution of time to failure. Marker processes were assumed to be stochastic processes that, at a given point in time, provide information about the current hazard and consequently about the remaining time to failure. Particular attention was paid to calculations based on a simple additive model for the relationship between the hazard function at time t and the history of the marker process up until time t. Specific applications to the analysis of AIDS data included the use of markers as surrogate responses for onset of AIDS with censored data and as predictors of the time elapsed since infection in prevalent individuals. Here we review recent work on the use of marker data to tackle these kinds of problems with AIDS data. The Poisson marker process with an additive model, introduced in Jewell and Kalbfleisch (1992), may be a useful "test" example for comparison of various procedures.
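As a toy illustration (not the exact Jewell and Kalbfleisch specification), the sketch below simulates a failure time when the hazard at time t is a baseline rate plus a coefficient times the current count of a Poisson marker process; all rates are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_failure_time(lam0, beta, mu, t_max=100.0):
    """Simulate one failure time when the hazard is additive in the current
    marker count: hazard(t) = lam0 + beta * N(t), where N(t) is a Poisson
    counting process with rate mu. Between marker jumps both rates are
    constant, so two exponential clocks are raced (Gillespie-style)."""
    t, n = 0.0, 0
    while t < t_max:
        hazard = lam0 + beta * n
        total = hazard + mu
        t += rng.exponential(1.0 / total)
        if rng.random() < hazard / total:
            return t, n          # failure occurred
        n += 1                   # otherwise the marker process jumped
    return t_max, n              # administratively censored at t_max

# Hypothetical rates: baseline hazard 0.01, each marker event adds 0.02,
# marker events arrive at rate 0.5 per time unit.
times = [simulate_failure_time(0.01, 0.02, 0.5)[0] for _ in range(5)]
print([round(x, 2) for x in times])
```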
Abstract:
Background: WHO's 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources in view of other competing priorities such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs; we quantified impact as disability-adjusted life years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers significant benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring without CD4 cell count every 6-12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility. Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, first at a CD4 cell count lower than 350 cells per μL, and then at a CD4 cell count lower than 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other factors reducing costs might make viral load monitoring more affordable in future. Funding: Bill & Melinda Gates Foundation, WHO.
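To make the comparison logic concrete, the fragment below shows an incremental cost-effectiveness ratio (ICER) calculation in its simplest form, with made-up per-person costs and DALYs averted that are not outputs of the three models used in the study.

```python
# Illustrative incremental cost-effectiveness calculation (made-up figures):
# cost in US$ per person and DALYs averted per person, each measured
# relative to a baseline of no monitoring, with strategies ordered by effect.
strategies = [
    ("clinical monitoring",   120.0, 0.50),
    ("CD4 monitoring",        180.0, 0.58),
    ("viral load monitoring", 320.0, 0.62),
]

prev_cost, prev_effect = 0.0, 0.0   # "no monitoring" baseline
for name, cost, dalys_averted in strategies:
    # ICER: extra cost per extra DALY averted versus the previous strategy.
    icer = (cost - prev_cost) / (dalys_averted - prev_effect)
    print(f"{name}: ICER = ${icer:,.0f} per DALY averted")
    prev_cost, prev_effect = cost, dalys_averted
```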
Abstract:
This thesis investigates which parameters most critically condition the results that the European vehicle fleet obtains in pedestrian protection tests, according to the 2003 European pedestrian protection regulation (EC Directive 2003/102) and the subsequent 2009 Regulation (EC Regulation 2009/78). First, the context of pedestrian protection in Europe was analysed, reviewing the history of the different proposed test procedures as well as the changes (and the reasons for them) that they underwent throughout the process of defining the European regulations. With the information available from more than 400 of these tests, stiffness corridors were developed for the front ends of the different segments of the European vehicle fleet, this being one of the most relevant results of this thesis. Subsequently, this thesis carried out a detailed accident study of pedestrian impact scenarios, identifying their most relevant characteristics, the population groups at greatest risk and the most important types of injuries that occur (in frequency and severity), which laid the groundwork for analysing with mathematical models the extent to which the proposed test methods actually take these factors into account. These analyses would not have been possible without the development of the new tools presented in this thesis, which make it possible to instantly build the mathematical model of any vehicle and any adult pedestrian in order to analyse their interaction. Thus, this thesis developed a rapid methodology for building mathematical vehicle models on demand, of any make and model and with the desired geometric and stiffness characteristics needed to represent the vehicle mathematically. Likewise, it investigated how the behaviour of the human body evolves with ageing and implemented an age-scaling functionality in the MADYMO multibody pedestrian model (already scalable in size) to allow any adult pedestrian (in gender and age) to be modelled ad hoc. Finally, this thesis also carried out, using finite element models of the human body, several studies on the biomechanics of the most frequent injuries in this type of accident (to the legs and head), with the aim of improving the test procedures so that they better predict the type of injuries to be prevented. Within the timeframe and boundary conditions of this thesis, efforts were focused on reinforcing some critical but specific aspects of how to improve the head test and, above all, on proposing viable solutions with real added value for the legform-to-bumper test, without changing its essence but proposing a new, improved impactor that incorporates an extra mass representing the upper body and is valid for the entire European vehicle fleet regardless of front-end geometry.
Abstract:
Cover title.
Abstract:
Mathematical models of mosquito-borne pathogen transmission originated in the early twentieth century to provide insights into how to most effectively combat malaria. The foundations of the Ross–Macdonald theory were established by 1970. Since then, there has been a growing interest in reducing the public health burden of mosquito-borne pathogens and an expanding use of models to guide their control. To assess how theory has changed to confront evolving public health challenges, we compiled a bibliography of 325 publications from 1970 through 2010 that included at least one mathematical model of mosquito-borne pathogen transmission and then used a 79-part questionnaire to classify each of 388 associated models according to its biological assumptions. As a composite measure to interpret the multidimensional results of our survey, we assigned a numerical value to each model that measured its similarity to 15 core assumptions of the Ross–Macdonald model. Although the analysis illustrated a growing acknowledgement of geographical, ecological and epidemiological complexities in modelling transmission, most models during the past 40 years closely resemble the Ross–Macdonald model. Modern theory would benefit from an expansion around the concepts of heterogeneous mosquito biting, poorly mixed mosquito-host encounters, spatial heterogeneity and temporal variation in the transmission process.
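For readers unfamiliar with the baseline the 388 models are compared against, the snippet below evaluates the classical Ross–Macdonald expression for the basic reproduction number under placeholder parameter values; the notation is generic textbook notation, not that of any particular publication in the bibliography.

```python
import math

def ross_macdonald_r0(m, a, b, c, g, n, r):
    """Classical Ross-Macdonald basic reproduction number:
    R0 = (m * a**2 * b * c * exp(-g * n)) / (r * g)
    m: mosquitoes per human, a: bites per mosquito per day,
    b, c: transmission probabilities per bite (mosquito->human, human->mosquito),
    g: mosquito death rate per day, n: extrinsic incubation period in days,
    r: human recovery rate per day."""
    return (m * a**2 * b * c * math.exp(-g * n)) / (r * g)

# Placeholder parameter values, for illustration only.
print(ross_macdonald_r0(m=10, a=0.3, b=0.5, c=0.5, g=0.1, n=10, r=0.01))
```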
Abstract:
This article provides a review of techniques for the analysis of survival data arising from respiratory health studies. Popular techniques such as the Kaplan–Meier survival plot and the Cox proportional hazards model are presented and illustrated using data from a lung cancer study. Advanced issues are also discussed, including parametric proportional hazards models, accelerated failure time models, time-varying explanatory variables, simultaneous analysis of multiple types of outcome events and the restricted mean survival time, a novel measure of the effect of treatment.
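A minimal analysis along these lines in Python, assuming the lifelines package and a toy data frame whose column names (time, event, age, treatment) are invented for illustration, might look like the following.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.utils import restricted_mean_survival_time

# Toy data frame standing in for a lung cancer study: follow-up time in
# months, event indicator (1 = death, 0 = censored), and two covariates.
df = pd.DataFrame({
    "time":      [3, 6, 6, 8, 12, 15, 20, 24, 30, 36],
    "event":     [1, 1, 0, 1, 1, 0, 1, 0, 1, 0],
    "age":       [65, 70, 58, 72, 60, 55, 68, 63, 75, 59],
    "treatment": [0, 0, 1, 0, 1, 1, 0, 1, 0, 1],
})

# Kaplan-Meier estimate of the survival function.
kmf = KaplanMeierFitter()
kmf.fit(durations=df["time"], event_observed=df["event"])
print(kmf.survival_function_.tail())

# Restricted mean survival time up to 24 months.
print(restricted_mean_survival_time(kmf, t=24))

# Cox proportional hazards model with age and treatment as covariates.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()
```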
Abstract:
Mathematical models for heated water outfalls were developed for three flow regions. Near the source, the subsurface discharge into a stratified ambient water issuing from a row of buoyant jets was solved with jet interference included in the analysis. The analysis of the flow zone close to and at intermediate distances from a surface buoyant jet was developed for the two-dimensional and axisymmetric cases. Far away from the source, a passive dispersion model was solved for a two-dimensional situation, taking into consideration the effects of shear current and vertical changes in diffusivity. A significant result from the surface buoyant jet analysis is the ability to predict the onset and location of an internal hydraulic jump. The prediction can be made simply from knowledge of the source Froude number and a dimensionless surface exchange coefficient. Parametric computer programs for the above models were also developed as part of this study. This report was submitted in fulfillment of Contract No. 14-12-570 under the sponsorship of the Federal Water Quality Administration.
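As background to the "source Froude number" criterion mentioned above, the snippet below computes a densimetric Froude number for a buoyant surface discharge using a generic definition and illustrative values; the report's exact nondimensionalisation and hydraulic jump criterion are not reproduced here.

```python
import math

G = 9.81  # m/s^2, gravitational acceleration

def densimetric_froude(u0, depth, rho_ambient, rho_discharge):
    """Densimetric (source) Froude number F = u0 / sqrt(g' * h), with
    reduced gravity g' = g * (rho_ambient - rho_discharge) / rho_ambient."""
    g_prime = G * (rho_ambient - rho_discharge) / rho_ambient
    return u0 / math.sqrt(g_prime * depth)

# Illustrative values: 0.5 m/s discharge over a 1 m deep outlet, with a
# roughly 3 kg/m^3 density deficit due to heating.
print(densimetric_froude(u0=0.5, depth=1.0,
                         rho_ambient=1000.0, rho_discharge=997.0))
```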
Abstract:
In many data sets from clinical studies there are patients insusceptible to the occurrence of the event of interest. Survival models which ignore this fact are generally inadequate. The main goal of this paper is to describe an application of the generalized additive models for location, scale, and shape (GAMLSS) framework to the fitting of long-term survival models. In this work the number of competing causes of the event of interest follows the negative binomial distribution. In this way, some well-known models found in the literature are characterized as particular cases of our proposal. The model is conveniently parameterized in terms of the cured fraction, which is then linked to covariates. We explore the use of the gamlss package in R as a powerful tool for inference in long-term survival models. The procedure is illustrated with a numerical example.
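To sketch the structure of such a long-term (cure fraction) model: with a negative binomial number of competing causes (mean theta, dispersion alpha) and a latency distribution F(t) for the time to the event given a cause, a commonly used form of the population survival is S_pop(t) = (1 + alpha * theta * F(t))^(-1/alpha), with cured fraction p0 = (1 + alpha * theta)^(-1/alpha). The fragment below evaluates this form for a Weibull latency distribution with illustrative parameter values that are assumptions, not the paper's fitted estimates.

```python
import numpy as np

def weibull_cdf(t, shape, scale):
    """Latency distribution F(t): time to event given at least one cause."""
    return 1.0 - np.exp(-(np.asarray(t, dtype=float) / scale) ** shape)

def population_survival(t, theta, alpha, shape, scale):
    """Negative binomial cure-rate survival:
    S_pop(t) = (1 + alpha * theta * F(t)) ** (-1 / alpha).
    As alpha -> 0 this tends to the Poisson (promotion-time) model."""
    return (1.0 + alpha * theta * weibull_cdf(t, shape, scale)) ** (-1.0 / alpha)

# Illustrative parameter values only.
theta, alpha, shape, scale = 1.2, 0.5, 1.5, 3.0
t = np.array([0.0, 1.0, 5.0, 20.0, 1e6])
print(population_survival(t, theta, alpha, shape, scale))
print("cured fraction p0 =", (1 + alpha * theta) ** (-1 / alpha))
```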
Abstract:
Background: Several models have been designed to predict survival of patients with heart failure. These, while available and widely used for both stratifying and deciding upon different treatment options at the individual level, have several limitations. Specifically, some clinical variables that may influence prognosis may have an influence that changes over time. Statistical models that include such characteristics may help in evaluating prognosis. The aim of the present study was to analyze and quantify the impact of modeling heart failure survival allowing for covariates with time-varying effects known to be independent predictors of overall mortality in this clinical setting. Methodology: Survival data from an inception cohort of five hundred patients diagnosed with heart failure functional class III and IV between 2002 and 2004 and followed up to 2006 were analyzed using the Cox proportional hazards model, variations of the Cox model, and Aalen's additive model. Principal Findings: One hundred and eighty-eight (188) patients died during follow-up. For the patients under study, age, serum sodium, hemoglobin, serum creatinine, and left ventricular ejection fraction were significantly associated with mortality. Evidence of a time-varying effect was suggested for the last three. Both high hemoglobin and high LV ejection fraction were associated with a reduced risk of dying, with a stronger initial effect. High creatinine, associated with an increased risk of dying, also presented a stronger initial effect. The impact of age and sodium was constant over time. Conclusions: The current study points to the importance of evaluating covariates with time-varying effects in heart failure models. The analysis performed suggests that variations of the Cox and Aalen models constitute a valuable tool for identifying these variables. The implementation of covariates with time-varying effects into heart failure prognostication models may reduce bias and increase the specificity of such models.
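A minimal sketch of the two modelling strategies compared above, assuming the Python lifelines package and a synthetic data frame whose column names are invented for illustration: a Cox proportional hazards fit whose proportionality assumption is checked, alongside Aalen's additive model, whose time-varying cumulative coefficients make effects that change over time directly visible.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, AalenAdditiveFitter

# Synthetic heart-failure-like data: follow-up in months, death indicator,
# and three covariates (all values are made up for illustration).
rng = np.random.default_rng(2)
n = 300
df = pd.DataFrame({
    "age":        rng.normal(65, 10, n),
    "creatinine": rng.normal(1.3, 0.4, n),
    "lvef":       rng.normal(40, 10, n),
})
risk = 0.02 * np.exp(0.03 * (df["age"] - 65) + 0.8 * (df["creatinine"] - 1.3))
t_event = rng.exponential(1.0 / risk)
t_cens = rng.uniform(0, 48, n)
df["time"] = np.minimum(t_event, t_cens)
df["event"] = (t_event <= t_cens).astype(int)

# Cox proportional hazards model; check_assumptions flags covariates whose
# effect appears to vary with time (a proportional hazards violation).
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.check_assumptions(df, p_value_threshold=0.05)

# Aalen's additive model: cumulative regression coefficients over time,
# so a changing slope indicates a time-varying effect.
aaf = AalenAdditiveFitter(coef_penalizer=0.1)
aaf.fit(df, duration_col="time", event_col="event")
print(aaf.cumulative_hazards_.head())
```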