3 results for hazard models

at Duke University


Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: Cardiac surgery requiring cardiopulmonary bypass is associated with platelet activation. Because platelets are increasingly recognized as important effectors of ischemia and end-organ inflammatory injury, the authors explored whether postoperative nadir platelet counts are associated with acute kidney injury (AKI) and mortality after coronary artery bypass grafting (CABG) surgery. METHODS: The authors evaluated 4,217 adult patients who underwent CABG surgery. Postoperative nadir platelet counts were defined as the lowest in-hospital values and were used as a continuous predictor of postoperative AKI and mortality. Nadir values in the lowest 10th percentile were also used as a categorical predictor. Multivariable logistic regression and Cox proportional hazards models examined the association between postoperative platelet counts, postoperative AKI, and mortality. RESULTS: The median postoperative nadir platelet count was 121 × 10⁹/l. The incidence of postoperative AKI was 54%, including 9.5% (215 patients) and 3.4% (76 patients) who experienced stages II and III AKI, respectively. For every 30 × 10⁹/l decrease in platelet counts, the risk for postoperative AKI increased by 14% (adjusted odds ratio, 1.14; 95% CI, 1.09 to 1.20; P < 0.0001). Patients with platelet counts in the lowest 10th percentile were three times more likely to progress to a higher severity of postoperative AKI (adjusted proportional odds ratio, 3.04; 95% CI, 2.26 to 4.07; P < 0.0001) and had an associated increased risk for mortality immediately after surgery (adjusted hazard ratio, 5.46; 95% CI, 3.79 to 7.89; P < 0.0001). CONCLUSION: The authors found a significant association between postoperative nadir platelet counts and AKI and short-term mortality after CABG surgery.
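
To make the modeling approach concrete, here is a minimal sketch (in Python, not the study's code) of how a nadir platelet count, rescaled so that one unit corresponds to a 30 × 10⁹/l decrease, could enter a multivariable logistic regression for AKI and a Cox proportional hazards model for mortality. The data frame, column names, covariate set, and follow-up variables are assumptions for illustration only.

    # Hedged sketch: nadir platelet count as a continuous predictor of AKI
    # (logistic regression) and mortality (Cox model). The data frame 'df',
    # its column names, and the covariates are assumed for illustration.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from lifelines import CoxPHFitter

    # One unit of this variable = a 30 x 10^9/l DECREASE in nadir platelet count,
    # so exp(coefficient) is the odds/hazard ratio per 30-unit decrease.
    df["plt_per30_decrease"] = -df["nadir_platelets"] / 30.0

    # Multivariable logistic regression for postoperative AKI (0/1 outcome).
    X = sm.add_constant(df[["plt_per30_decrease", "age", "baseline_creatinine"]])
    aki_fit = sm.Logit(df["aki"], X).fit(disp=False)
    print(np.exp(aki_fit.params))       # adjusted odds ratios

    # Cox proportional hazards model for postoperative mortality.
    cph = CoxPHFitter()
    cph.fit(df[["followup_days", "died", "plt_per30_decrease",
                "age", "baseline_creatinine"]],
            duration_col="followup_days", event_col="died")
    cph.print_summary()                 # hazard ratios with 95% CIs

With this rescaling, the exponentiated coefficients line up directly with per-30 × 10⁹/l odds and hazard ratios of the kind reported above.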

Relevance:

30.00%

Publisher:

Abstract:

Uncertainty quantification (UQ) is both an old and a new concept. The current novelty lies in the interaction and synthesis of mathematical models, computer experiments, statistics, field/real experiments, and probability theory, with particular emphasis on large-scale simulations by computer models. The challenges come not only from the complexity of the scientific questions but also from the sheer size of the information. The focus of this thesis is to provide statistical models that scale to the massive data produced in computer experiments and real experiments, through fast and robust statistical inference.

Chapter 2 provides a practical approach for simultaneously emulating/approximating a massive number of functions, with an application to hazard quantification for the Soufrière Hills volcano on the island of Montserrat. Chapter 3 addresses another problem with massive data, in which the number of observations of a function is large; an exact algorithm that is linear in time is developed for the interpolation of methylation levels. Chapters 4 and 5 both concern robust inference for these models. Chapter 4 introduces a new robustness criterion for parameter estimation and shows that several inference approaches satisfy it. Chapter 5 develops a new prior that satisfies additional criteria and is therefore proposed for use in practice.
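
As a loose illustration of how exact interpolation can sidestep the usual cubic-cost Gaussian process solve, the sketch below conditions each prediction only on its two flanking observations under a Markov (Ornstein–Uhlenbeck) prior. This is not the dissertation's algorithm; the kernel choice, length-scale, and noiseless-observation assumption are mine.

    # Hedged sketch: exact GP posterior mean under a Gauss-Markov
    # (Ornstein-Uhlenbeck) prior. The Markov "screening" property means each
    # prediction depends only on its two neighboring observations, so no
    # large linear system is ever solved.
    import numpy as np

    def ou_kernel(s, t, sigma2=1.0, ell=1.0):
        return sigma2 * np.exp(-np.abs(s - t) / ell)

    def interpolate_markov_gp(x_obs, y_obs, x_new, sigma2=1.0, ell=1.0):
        x_obs = np.asarray(x_obs, float); y_obs = np.asarray(y_obs, float)
        order = np.argsort(x_obs)
        x_obs, y_obs = x_obs[order], y_obs[order]
        idx = np.clip(np.searchsorted(x_obs, x_new), 1, len(x_obs) - 1)
        xl, xr = x_obs[idx - 1], x_obs[idx]      # flanking observations
        yl, yr = y_obs[idx - 1], y_obs[idx]
        # Closed-form 2x2 conditional for each prediction point.
        k_ll = ou_kernel(xl, xl, sigma2, ell); k_rr = ou_kernel(xr, xr, sigma2, ell)
        k_lr = ou_kernel(xl, xr, sigma2, ell)
        k_sl = ou_kernel(x_new, xl, sigma2, ell); k_sr = ou_kernel(x_new, xr, sigma2, ell)
        det = k_ll * k_rr - k_lr ** 2
        w_l = (k_sl * k_rr - k_sr * k_lr) / det
        w_r = (k_sr * k_ll - k_sl * k_lr) / det
        return w_l * yl + w_r * yr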

Relevance:

30.00%

Publisher:

Abstract:

The work presented in this dissertation focuses on applying engineering methods to develop and explore probabilistic survival models for the prediction of decompression sickness in U.S. Navy divers. Mathematical modeling, computational model development, and numerical optimization techniques were employed to formulate and evaluate the predictive quality of models fitted to empirical data. In Chapters 1 and 2 we present general background information relevant to the development of probabilistic models applied to predicting the incidence of decompression sickness. The remainder of the dissertation introduces techniques developed in an effort to improve the predictive quality of probabilistic decompression models and to reduce the difficulty of model parameter optimization.

The first project explored seventeen variations of the hazard function using a well-perfused parallel compartment model. Models were parametrically optimized using the maximum likelihood technique, and model performance was evaluated using both classical statistical methods and model selection techniques based on information theory. Optimized model parameters were overall similar to those of previously published models. Results favored a novel hazard function definition that included both ambient pressure scaling and individually fitted compartment exponent scaling terms.
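
The sketch below illustrates only the general shape of such a probabilistic decompression model, not any of the seventeen variants: a single well-perfused compartment driven by ambient pressure, an instantaneous risk proportional to relative supersaturation, P(DCS) = 1 − exp(−∫ r dt), and a binary-outcome likelihood maximized numerically. The dive-profile format, parameter names, and the specific hazard form are assumptions for illustration.

    # Hedged sketch of the general modeling approach (one well-perfused
    # compartment, one simple hazard definition), not the dissertation's models.
    import numpy as np
    from scipy.optimize import minimize

    def compartment_pressure(t, p_amb, tau):
        """Euler integration of dp/dt = (P_amb - p) / tau."""
        p = np.empty_like(p_amb)
        p[0] = p_amb[0]
        dt = np.diff(t)
        for i in range(1, len(t)):
            p[i] = p[i-1] + dt[i-1] * (p_amb[i-1] - p[i-1]) / tau
        return p

    def p_dcs(t, p_amb, tau, gain):
        """P(DCS) = 1 - exp(-integral of risk); risk = gain * relative supersaturation."""
        p = compartment_pressure(t, p_amb, tau)
        risk = gain * np.maximum(0.0, (p - p_amb) / p_amb)
        integral = np.sum(0.5 * (risk[1:] + risk[:-1]) * np.diff(t))  # trapezoid rule
        return 1.0 - np.exp(-integral)

    def neg_log_likelihood(theta, dives):
        tau, gain = np.exp(theta)                 # keep parameters positive
        nll = 0.0
        for t, p_amb, outcome in dives:           # outcome: 1 = DCS, 0 = no DCS
            prob = np.clip(p_dcs(t, p_amb, tau, gain), 1e-12, 1 - 1e-12)
            nll -= outcome * np.log(prob) + (1 - outcome) * np.log(1 - prob)
        return nll

    # fit = minimize(neg_log_likelihood, x0=np.log([60.0, 0.01]), args=(dives,),
    #                method="Nelder-Mead")
    # aic = 2 * len(fit.x) + 2 * fit.fun   # information-theoretic comparison of variants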

We developed ten pharmacokinetic compartmental models that included explicit delay mechanics to determine whether predictive quality could be improved by including material transfer lags. A fitted discrete delay parameter augmented the inflow to the compartment systems from the environment. Based on the observation that, for many of our models, symptoms are often reported after risk accumulation begins, we hypothesized that including delays might improve the correlation between model predictions and observed data. Model selection techniques identified two of the models as having the best overall performance, but comparison with the best-performing no-delay model, and model selection starting from our best no-delay pharmacokinetic model, both indicated that the delay mechanism was not statistically justified and did not substantially improve model predictions.
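
A minimal sketch of the delay idea only, extending the compartment from the previous sketch: the ambient-pressure input is lagged by a fitted discrete delay before driving uptake. The interpolation scheme and parameterization are assumptions for illustration.

    # Hedged sketch: inflow to the compartment is driven by P_amb(t - delay).
    import numpy as np

    def delayed_compartment_pressure(t, p_amb, tau, delay):
        """dp/dt = (P_amb(t - delay) - p) / tau, with P_amb held at its
        initial value for times before the recorded profile starts."""
        p_amb_lagged = np.interp(t - delay, t, p_amb, left=p_amb[0])
        p = np.empty_like(p_amb)
        p[0] = p_amb[0]
        dt = np.diff(t)
        for i in range(1, len(t)):
            p[i] = p[i-1] + dt[i-1] * (p_amb_lagged[i-1] - p[i-1]) / tau
        return p

In a fit, the delay would be estimated jointly with the time constant and hazard gain by maximum likelihood, and the delay and no-delay fits compared with the same information-theoretic criteria as above.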

Our final investigation explored parameter bounding techniques to identify parameter regions in which statistical model failure will not occur. Statistical model failure occurs when a model predicts zero probability of a diver experiencing decompression sickness for an exposure that is known to produce symptoms. Using a metric related to the instantaneous risk, we identify regions where model failure will not occur and locate the boundaries of those regions with a root bounding technique. Several models are used to demonstrate the techniques, which may be employed to reduce the difficulty of model optimization in future investigations.
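
A sketch of the bounding idea, assuming a hazard with a supersaturation threshold so that an overly large threshold drives the predicted probability to zero. The one-parameter slice, bracketing interval, and dive-profile variables are hypothetical stand-ins, not the dissertation's metric or models.

    # Hedged sketch: locate the boundary of the "safe" parameter region with a
    # root finder applied to a metric related to the instantaneous risk.
    import numpy as np
    from scipy.optimize import brentq

    def compartment_pressure(t, p_amb, tau):
        # Same well-perfused compartment as the earlier sketch.
        p = np.empty_like(p_amb); p[0] = p_amb[0]
        dt = np.diff(t)
        for i in range(1, len(t)):
            p[i] = p[i-1] + dt[i-1] * (p_amb[i-1] - p[i-1]) / tau
        return p

    def peak_risk(threshold, t, p_amb, tau=60.0):
        """Largest instantaneous risk over the exposure for a hazard of the form
        max(0, supersaturation - threshold); if this is <= 0 everywhere, the
        model predicts P(DCS) = 0 and statistical model failure occurs."""
        p = compartment_pressure(t, p_amb, tau)
        return np.max((p - p_amb) / p_amb) - threshold

    # For a dive profile (t_known, p_amb_known) known to produce symptoms, the
    # root of peak_risk along the threshold axis bounds the safe region:
    # thresholds below the root keep the predicted probability strictly positive.
    # boundary = brentq(peak_risk, 0.0, 2.0, args=(t_known, p_amb_known))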