181 results for Prediction algorithms


Relevance: 20.00%

Abstract:

BACKGROUND: After cardiac surgery with cardiopulmonary bypass (CPB), acquired coagulopathy often leads to post-CPB bleeding. Though multifactorial in origin, this coagulopathy is often aggravated by deficient fibrinogen levels. OBJECTIVE: To assess whether laboratory and thrombelastometric testing on CPB can predict plasma fibrinogen immediately after CPB weaning. PATIENTS/METHODS: This prospective study of 110 patients undergoing major cardiovascular surgery at risk of post-CPB bleeding compared fibrinogen level (Clauss method) and fibrinogen function (fibrin-specific thrombelastometry) to determine how well their course early after termination of CPB can be predicted. Linear regression analysis and receiver operating characteristics were used to determine correlations and predictive accuracy. RESULTS: Quantitative estimation of post-CPB Clauss fibrinogen from on-CPB fibrinogen was feasible with small bias (+0.19 g/l), but with poor precision and a percentage error >30%. A clinically useful alternative approach was developed by using on-CPB A10 to predict a Clauss fibrinogen range of interest instead of a discrete level. An on-CPB A10 ≤10 mm identified patients with a post-CPB Clauss fibrinogen of ≤1.5 g/l with a sensitivity of 0.99 and a positive predictive value of 0.60; it also identified patients without a post-CPB Clauss fibrinogen <2.0 g/l with a specificity of 0.83. CONCLUSIONS: When measured on CPB prior to weaning, a FIBTEM A10 ≤10 mm is an early alert for post-CPB fibrinogen levels below or within the substitution range (1.5-2.0 g/l) recommended in case of post-CPB coagulopathic bleeding. This helps to minimize the delay to data-based hemostatic management after weaning from CPB.
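A minimal sketch of how such threshold-rule metrics are computed, using invented paired measurements rather than the study's data:

```python
import numpy as np

# Hypothetical pairs of on-CPB FIBTEM A10 (mm) and post-CPB Clauss
# fibrinogen (g/l); invented for illustration only.
a10 = np.array([6, 8, 9, 10, 11, 12, 14, 15, 18, 21])
fibrinogen = np.array([1.1, 1.3, 1.6, 1.4, 1.5, 1.9, 2.1, 2.3, 2.6, 3.0])

alert = a10 <= 10             # test positive: A10 <= 10 mm
low_fib = fibrinogen <= 1.5   # condition: fibrinogen <= 1.5 g/l

tp = np.sum(alert & low_fib)     # true positives
fp = np.sum(alert & ~low_fib)    # false positives
fn = np.sum(~alert & low_fib)    # false negatives
tn = np.sum(~alert & ~low_fib)   # true negatives

print("sensitivity:", tp / (tp + fn))
print("PPV        :", tp / (tp + fp))
print("specificity:", tn / (tn + fp))
```

The same confusion-matrix arithmetic underlies the reported sensitivity of 0.99, positive predictive value of 0.60, and specificity of 0.83.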

Relevance: 20.00%

Abstract:

OBJECTIVES: Pancreatic surgery remains associated with important morbidity. Efforts are most commonly concentrated on decreasing postoperative morbidity, but early detection of patients at risk could be another valuable strategy. A simple score predicting complications after pancreaticoduodenectomy was recently published by Braga et al. This study aimed to validate this score and discuss possible clinical implications. METHODS: From 2000 to 2012, 245 patients underwent pancreaticoduodenectomy in our department. Complications were graded according to the Dindo-Clavien classification. The Braga score is based on four parameters: the American Society of Anesthesiologists (ASA) score, pancreatic texture, Wirsung duct (main pancreatic duct) diameter, and intraoperative blood loss. An overall risk score (from 0 to 15) can be calculated for each patient. Score discriminant power was calculated using a receiver operating characteristic (ROC) curve. RESULTS: Major complications occurred in 31% of patients, compared to 17% in Braga's data. Pancreatic texture and blood loss were independently statistically significant for increased morbidity. The areas under the curve were 0.95 and 0.99 for scores grouped into four risk categories (0-3, 4-7, 8-11, and 12-15) and for individual scores (0-15), respectively. CONCLUSIONS: The Braga score discriminates well between minor and major complications. Our validation suggests that it can be used as a prognostic tool for major complications after pancreaticoduodenectomy. The clinical implications, i.e., whether postoperative treatment strategies should be adapted according to the patient's individual risk, remain to be elucidated.
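A sketch of the validation step: given each patient's Braga score and observed outcome, discrimination can be quantified with an ROC AUC, both for the individual 0-15 scores and for the four risk categories. The data below are invented, not the study's.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical validation data: Braga score (0-15) and whether the
# patient had a major complication (1 = yes).
score = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 2, 5])
major = np.array([0, 0, 0, 0, 0, 0, 0, 1, 1, 1,  1,  1,  1,  1, 0, 1])

# Discrimination of the individual scores (0-15)
print("AUC, individual scores:", roc_auc_score(major, score))

# Same analysis after collapsing into the four risk categories
# used in the study: 0-3, 4-7, 8-11, 12-15
category = np.digitize(score, bins=[4, 8, 12])   # values 0..3
print("AUC, risk categories:", roc_auc_score(major, category))
```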

Relevance: 20.00%

Abstract:

BACKGROUND: HIV surveillance requires monitoring of new HIV diagnoses and differentiation of incident and older infections. In 2008, Switzerland implemented a system for monitoring incident HIV infections based on the results of a line immunoassay (Inno-Lia) mandatorily conducted for HIV confirmation and type differentiation (HIV-1, HIV-2) of all newly diagnosed patients. Based on this system, we assessed the proportion of incident HIV infection among newly diagnosed cases in Switzerland during 2008-2013. METHODS AND RESULTS: Inno-Lia antibody reaction patterns recorded in anonymous HIV notifications to the federal health authority were classified by 10 published algorithms into incident (up to 12 months) or older infections. Utilizing these data, annual incident infection estimates were obtained in two ways, (i) based on the diagnostic performance of the algorithms and utilizing the relationship 'incident = true incident + false incident', (ii) based on the window-periods of the algorithms and utilizing the relationship 'Prevalence = Incidence x Duration'. From 2008-2013, 3'851 HIV notifications were received. Adult HIV-1 infections amounted to 3'809 cases, and 3'636 of them (95.5%) contained Inno-Lia data. Incident infection totals calculated were similar for the performance- and window-based methods, amounting on average to 1'755 (95% confidence interval, 1588-1923) and 1'790 cases (95% CI, 1679-1900), respectively. More than half of these were among men who had sex with men. Both methods showed a continuous decline of annual incident infections 2008-2013, totaling -59.5% and -50.2%, respectively. The decline of incident infections continued even in 2012, when a 15% increase in HIV notifications had been observed. This increase was entirely due to older infections. Overall declines 2008-2013 were of similar extent among the major transmission groups. CONCLUSIONS: Inno-Lia based incident HIV-1 infection surveillance proved useful and reliable. It represents a free, additional public health benefit of the use of this relatively costly test for HIV confirmation and type differentiation.
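The two quoted relationships can be written directly as code. This is one way to operationalize them under assumed algorithm sensitivity, specificity, and window length; all numbers below are placeholders, not the study's values.

```python
# (i) performance-based: the observed incident-classified count mixes
#     true and false incidents: observed = true*sens + (N - true)*(1 - spec).
#     Solving for the true incident count:
def incident_from_performance(observed, n_total, sens, spec):
    return (observed - n_total * (1 - spec)) / (sens + spec - 1)

# (ii) window-based: Prevalence = Incidence x Duration, so the annual
#      incidence is the count within the window divided by the window
#      length expressed in years.
def incident_from_window(recent_count, window_days):
    return recent_count / (window_days / 365.0)

# Hypothetical example values
print(incident_from_performance(observed=320, n_total=600, sens=0.85, spec=0.95))
print(incident_from_window(recent_count=290, window_days=365))
```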

Relevance: 20.00%

Abstract:

BACKGROUND: Lung clearance index (LCI), a marker of ventilation inhomogeneity, is elevated early in children with cystic fibrosis (CF). However, in infants with CF, LCI values are found to be normal, although structural lung abnormalities are often detectable. We hypothesized that this discrepancy is due to inadequate algorithms in the available software package. AIM: Our aim was to challenge the validity of these software algorithms. METHODS: We compared multiple breath washout (MBW) results of the current software algorithms (automatic modus) to refined algorithms (manual modus) in 17 asymptomatic infants with CF and 24 matched healthy term-born infants. The main difference between the two analysis methods lies in the calculation of the molar mass differences that the system uses to define the completion of the measurement. RESULTS: In infants with CF, the refined manual modus revealed clearly elevated LCI, above 9, in 8 of 35 measurements (23%), all of which showed LCI values below 8.3 using the automatic modus (paired t-test comparing the means, P < 0.001). Healthy infants showed normal LCI values with both analysis methods (n = 47, paired t-test, P = 0.79). The most relevant reason for falsely normal LCI values in infants with CF using the automatic modus was that the end-of-test was incorrectly recognized too early during the washout. CONCLUSION: We recommend the use of the manual modus for the analysis of MBW outcomes in infants in order to obtain more accurate results. This will allow appropriate use of infant lung function results for clinical and scientific purposes. Pediatr Pulmonol. 2015; 50:970-977.
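A minimal sketch of an MBW end-of-test rule and LCI computation, assuming the conventional criterion (end-tidal tracer concentration below 1/40 of its starting value for three consecutive breaths); the breath series is invented, and real analyses work on device recordings.

```python
def lci(end_tidal_conc, expired_volumes, frc, start_conc):
    """Lung clearance index: cumulative expired volume at end-of-test
    divided by the functional residual capacity (FRC)."""
    threshold = start_conc / 40.0
    consecutive_below = 0
    cumulative_volume = 0.0
    for conc, vol in zip(end_tidal_conc, expired_volumes):
        cumulative_volume += vol
        consecutive_below = consecutive_below + 1 if conc < threshold else 0
        if consecutive_below == 3:   # end-of-test criterion met
            return cumulative_volume / frc
    raise ValueError("washout never reached the end-of-test criterion")

# Hypothetical washout: concentration decays over breaths of ~30 ml
concs = [4.0, 3.1, 2.4, 1.8, 1.2, 0.7, 0.3, 0.09, 0.08, 0.07]
vols = [0.03] * len(concs)
print(lci(concs, vols, frc=0.04, start_conc=4.0))
```

Recognizing end-of-test too early, as the abstract describes for the automatic modus, truncates the cumulative expired volume and biases the LCI toward falsely normal (low) values.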

Relevance: 20.00%

Abstract:

BACKGROUND: Endovascular treatment for acute ischemic stroke patients was recently shown to improve recanalization rates and clinical outcome in a well-defined study population. Intravenous thrombolysis (IVT) alone is insufficiently effective to achieve recanalization in certain patients and of little value in others. Accordingly, we aimed to identify predictors of recanalization in patients treated with or without IVT. METHODS: From the observational Acute Stroke Registry and Analysis of Lausanne (ASTRAL) registry, we selected those stroke patients (1) with an arterial occlusion on computed tomography angiography (CTA) imaging, (2) who had an arterial patency assessment at 24 hours (CTA/magnetic resonance angiography/transcranial Doppler), and (3) who were treated with IVT or had no revascularization treatment. Based on 2 separate logistic regression analyses, predictors of spontaneous and post-thrombolytic recanalization were generated. RESULTS: Partial or complete recanalization was achieved in 121 of 210 (58%) thrombolyzed patients. Recanalization was associated with atrial fibrillation (odds ratio, 1.6; 95% confidence interval, 1.2-3.0) and absence of early ischemic changes on CT (1.1, 1.1-1.2), and was inversely correlated with the presence of a significant extracranial (EC) stenosis or occlusion (.6, .3-.9). In nonthrombolyzed patients, partial or complete recanalization was significantly less frequent (37%, P < .01). Here, recanalization was independently associated with a history of hypercholesterolemia (2.6, 1.2-5.6) and a proximal site of the intracranial occlusion (2.5, 1.2-5.4), and inversely correlated with a decreased level of consciousness (.3, .1-.8), EC pathology (.3, .1-.6), and basilar artery pathology (.1, .0-.6). CONCLUSIONS: Various clinical findings, cardiovascular risk factors, and arterial pathology on acute CTA-based imaging are moderately associated with spontaneous and post-thrombolytic arterial recanalization at 24 hours. If confirmed in other studies, this information may influence patient selection toward the most appropriate revascularization strategy.
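A sketch of the kind of logistic-regression analysis described, producing odds ratios with 95% confidence intervals. The predictor table is simulated and the variable names are illustrative, not the registry's fields.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated cohort: binary predictors and a recanalization outcome.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "atrial_fibrillation": rng.integers(0, 2, n),
    "early_ischemic_changes": rng.integers(0, 2, n),
    "ec_stenosis_occlusion": rng.integers(0, 2, n),
})
# Outcome generated so atrial fibrillation helps and EC pathology hurts
logit = 0.5 * df.atrial_fibrillation - 0.4 * df.ec_stenosis_occlusion - 0.2
df["recanalized"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["atrial_fibrillation",
                        "early_ischemic_changes",
                        "ec_stenosis_occlusion"]])
model = sm.Logit(df["recanalized"], X).fit(disp=0)

odds_ratios = np.exp(model.params)       # coefficients -> odds ratios
conf_int = np.exp(model.conf_int())      # 95% CI on the OR scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```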

Relevance: 20.00%

Abstract:

PURPOSE: The purpose of our study was to assess whether a model combining clinical factors, MR imaging features, and genomics would better predict overall survival of patients with glioblastoma (GBM) than either individual data type. METHODS: The study was conducted leveraging The Cancer Genome Atlas (TCGA) effort supported by the National Institutes of Health. Six neuroradiologists reviewed MRI images from The Cancer Imaging Archive (http://cancerimagingarchive.net) of 102 GBM patients using the VASARI scoring system. The patients' clinical and genetic data were obtained from the TCGA website (http://www.cancergenome.nih.gov/). Patient outcome was measured in terms of overall survival time. The association between different categories of biomarkers and survival was evaluated using Cox analysis. RESULTS: The features that were significantly associated with survival were: (1) clinical factors: chemotherapy; (2) imaging: proportion of tumor contrast enhancement on MRI; and (3) genomics: HRAS copy number variation. The combination of these three biomarkers resulted in an incremental increase in the strength of prediction of survival, with the model that included clinical, imaging, and genetic variables having the highest predictive accuracy (area under the curve 0.679±0.068, Akaike's information criterion 566.7, P<0.001). CONCLUSION: A combination of clinical factors, imaging features, and HRAS copy number variation best predicts survival of patients with GBM.
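A hedged sketch of a Cox survival analysis combining one variable from each category named in the abstract (clinical, imaging, genomic), using the lifelines library. The cohort below is invented and far smaller than the TCGA sample.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical GBM cohort: survival, censoring, and three covariates.
df = pd.DataFrame({
    "survival_months":  [6, 14, 9, 22, 4, 18, 11, 27, 8, 16],
    "death_observed":   [1, 1, 1, 0, 1, 1, 1, 0, 1, 1],
    "chemotherapy":     [0, 1, 0, 1, 0, 1, 1, 1, 0, 1],
    "prop_enhancement": [0.8, 0.3, 0.7, 0.2, 0.9, 0.4, 0.5, 0.1, 0.6, 0.3],
    "hras_cnv":         [1, 0, 1, 0, 1, 0, 0, 0, 1, 0],
})

# Small penalizer stabilizes the fit on a tiny sample
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="survival_months", event_col="death_observed")
cph.print_summary()   # hazard ratios and CIs for the combined model
```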

Relevance: 20.00%

Abstract:

Our consumption of groundwater, in particular as drinking water and for irrigation, has considerably increased over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays, we are facing many problems, ranging from water prospection to sustainable management and remediation of polluted aquifers. Independently of the hydrogeological problem, the main challenge remains dealing with the incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of realizations. The main limitation of this approach is the computational cost associated with performing complex flow simulations in each realization. In the first part of the thesis, we explore this issue in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization. Due to computational constraints, state-of-the-art methods make use of approximate flow simulation to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run for this subset, based on which inference is made. Our objective is to increase the performance of this approach by using all of the available information, not solely the subset of exact responses. Two error models are proposed to correct the approximate responses following a machine learning approach. For the subset identified by a classical approach (here the distance kernel method), both the approximate and the exact responses are known. This information is used to construct an error model and correct the ensemble of approximate responses to predict the "expected" responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational costs and leads to an increase in accuracy and robustness of the uncertainty propagation. The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between proxy and exact curves.
In the second part of this thesis, the strategy is formalized in a rigorous mathematical framework by defining a regression model between functions. As this problem is ill-posed, it is necessary to reduce its dimensionality. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows a diagnostic of the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid. The error model allows a strong reduction of the computational cost while providing a good estimate of the uncertainty. The individual correction of the proxy response by the error model leads to an excellent prediction of the exact response, opening the door to many applications. The concept of a functional error model is useful not only in the context of uncertainty propagation, but also, and maybe even more so, for Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice to ensure that the generated realizations are sampled in accordance with the observations. However, this approach suffers from a low acceptance rate in high-dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, where the computational cost is decreased by avoiding unnecessary simulations of the exact flow model thanks to a preliminary evaluation of the proposal. In the third part of the thesis, a proxy is coupled to an error model to provide an approximate response for the two-stage MCMC set-up. We demonstrate an increase in the acceptance rate by a factor of 1.5 to 3 with respect to a classical one-stage MCMC implementation. An open question remains: how to choose the size of the learning set, and how to identify the realizations that optimize the construction of the error model. This requires devising an iterative strategy, such that, as new flow simulations are performed, the error model is iteratively improved by incorporating the new information. This is discussed in the fourth part of the thesis, in which we apply this methodology to a problem of saline intrusion in a coastal aquifer.
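A toy stand-in for the error-model idea: learn, on a small training subset where both models were run, a map from proxy curves to exact curves in a reduced principal-component basis, then correct the remaining proxy-only runs. Plain PCA on discretized curves is used here as a simple proxy for the FPCA machinery described; the synthetic curves stand in for flow-simulation time series.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Synthetic "exact" responses and biased, noisy "proxy" responses
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)
exact = np.array([np.exp(-t / s) for s in rng.uniform(0.2, 1.0, 200)])
proxy = exact ** 0.8 + 0.01 * rng.standard_normal(exact.shape)

train = np.arange(20)        # realizations where both models were run
pca_proxy = PCA(5).fit(proxy)           # proxy is available everywhere
pca_exact = PCA(5).fit(exact[train])    # exact only on the subset
reg = LinearRegression().fit(pca_proxy.transform(proxy[train]),
                             pca_exact.transform(exact[train]))

# Predict "expected" exact curves for every proxy run
corrected = pca_exact.inverse_transform(
    reg.predict(pca_proxy.transform(proxy)))
print("mean |proxy - exact|    :", np.abs(proxy - exact).mean())
print("mean |corrected - exact|:", np.abs(corrected - exact).mean())
```

The corrected curves track the exact model far more closely than the raw proxy, which is exactly the gain the error model is meant to deliver at negligible extra cost.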

Relevance: 20.00%

Abstract:

OBJECTIVE: To develop predictive models for early triage of burn patients based on hypersusceptibility to repeated infections. BACKGROUND: Infection remains a major cause of mortality and morbidity after severe trauma, demanding new strategies to combat infections. Models for infection prediction are lacking. METHODS: Secondary analysis of 459 burn patients (≥16 years old) with 20% or more total body surface area burns, recruited from 6 US burn centers. Applying a 180-hour cutoff on the injury-to-transcriptome interval, we compared the blood transcriptomes of 47 patients with at most 1 infection episode to those of 66 hypersusceptible patients with multiple (≥2) infection episodes (MIE). We used LASSO regression to select biomarkers and multivariate logistic regression to build models, whose accuracy was assessed by the area under the receiver operating characteristic curve (AUROC) and cross-validation. RESULTS: Three predictive models were developed using covariates of (1) clinical characteristics; (2) expression profiles of 14 genomic probes; (3) a combination of (1) and (2). The genomic and clinical models were highly predictive of MIE status [AUROC(genomic) = 0.946 (95% CI: 0.906-0.986); AUROC(clinical) = 0.864 (CI: 0.794-0.933); genomic vs clinical, P = 0.044]. The combined model had an increased AUROC of 0.967 (CI: 0.940-0.993) compared with the individual models (combined vs clinical, P = 0.0069). Hypersusceptible patients showed early alterations in immune-related signaling pathways, epigenetic modulation, and chromatin remodeling. CONCLUSIONS: Early triage of burn patients more susceptible to infections can be made using clinical characteristics and/or genomic signatures. The genomic signature suggests new insights into the pathophysiology of hypersusceptibility to infection and may lead to novel potential therapeutic or prophylactic targets.
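A sketch of the modelling strategy described: L1-penalized (LASSO) logistic regression to select predictive probes, scored by cross-validated AUROC. Synthetic data stand in for the transcriptome; the sample size of 113 mirrors the 47 + 66 patients, and the penalty strength C is an arbitrary choice for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for expression data: 113 patients, 300 probes,
# of which a handful are truly informative.
X, y = make_classification(n_samples=113, n_features=300,
                           n_informative=14, random_state=0)

model = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="l1", solver="liblinear", C=0.1),
)
auroc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUROC: {auroc.mean():.3f} +/- {auroc.std():.3f}")

# Inspect which probes the L1 penalty kept (nonzero coefficients)
model.fit(X, y)
kept = np.flatnonzero(model.named_steps["logisticregression"].coef_)
print("selected features:", kept)
```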

Relevance: 20.00%

Abstract:

Intracranial aneurysms are a common pathologic condition with a potentially severe complication: rupture. Effective treatment options exist (neurosurgical clipping and endovascular techniques), but guidelines for treatment are unclear and focus mainly on patient age, aneurysm size, and localization. New criteria to define the risk of rupture are needed to refine these guidelines. One potential candidate is aneurysm wall motion, known to be associated with rupture but difficult to detect and quantify. We review what is known about the association between aneurysm wall motion and rupture, which structural changes may explain wall motion patterns, and the available imaging techniques able to analyze wall motion.

Relevance: 20.00%

Abstract:

Snow cover is an important control in mountain environments, and a shift of the snow-free period triggered by climate warming can strongly impact ecosystem dynamics. Changing snow patterns can have severe effects on alpine plant distribution and diversity. It thus becomes urgent to provide spatially explicit assessments of snow cover changes that can be incorporated into correlative or empirical species distribution models (SDMs). Here, we provide for the first time a comparison of two physically based snow distribution models (PREVAH and SnowModel) used to produce snow cover maps (SCMs) at a fine spatial resolution in a mountain landscape in Austria. The SCMs were evaluated with SPOT-HRVIR images, and the two models' predictions of snow water equivalent were evaluated against ground measurements. Finally, the SCMs of the two models were compared under a climate warming scenario for the end of the century. The predictive performances of PREVAH and SnowModel were similar when validated with the SPOT images. However, the tendency to overestimate snow cover was slightly lower with SnowModel during the accumulation period, whereas it was lower with PREVAH during the melting period. The rate of true positives during the melting period was on average two times higher with SnowModel, with a lower overestimation of snow water equivalent. Our results allow us to recommend the use of SnowModel in SDMs because it better captures persisting snow patches at the end of the snow season, which is important when modelling the response of species to long-lasting snow cover and evaluating whether they might survive under climate change.
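A minimal sketch of the pixelwise validation described: compare a modelled binary snow cover map against a satellite-derived reference. The arrays are invented placeholders for real rasters, and the overestimation definition below is one plausible choice, not necessarily the study's.

```python
import numpy as np

def validation_rates(predicted, reference):
    """True-positive rate and overestimation rate for binary snow maps
    (1 = snow, 0 = snow-free)."""
    tp = np.sum((predicted == 1) & (reference == 1))
    fp = np.sum((predicted == 1) & (reference == 0))
    true_positive_rate = tp / np.sum(reference == 1)
    overestimation_rate = fp / predicted.size   # snow modelled where none observed
    return true_positive_rate, overestimation_rate

# Invented 100 x 100 rasters: the "model" agrees with the reference
# on ~90% of pixels and flips the rest.
rng = np.random.default_rng(2)
reference = rng.integers(0, 2, (100, 100))
predicted = np.where(rng.random((100, 100)) < 0.9, reference, 1 - reference)
print(validation_rates(predicted, reference))
```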


Relevance: 20.00%

Abstract:

BACKGROUND AND AIMS: Parental history (PH) and genetic risk scores (GRSs) are separately associated with coronary heart disease (CHD), but evidence regarding their combined effects is lacking. We aimed to evaluate the joint associations and predictive ability of PH and GRSs for incident CHD. METHODS: Data for 4283 Caucasians were obtained from the population-based CoLaus Study, over a median follow-up of 5.6 years. CHD was defined as incident myocardial infarction, angina, percutaneous coronary revascularization, or bypass grafting. Single nucleotide polymorphisms (SNPs) for CHD identified by genome-wide association studies were used to construct unweighted and weighted versions of three GRSs, comprising 38, 53, and 153 SNPs, respectively. RESULTS: PH was associated with higher values of all weighted GRSs. After adjustment for age, sex, smoking, diabetes, systolic blood pressure, and low- and high-density lipoprotein cholesterol, PH was significantly associated with CHD [HR 2.61, 95% CI (1.47-4.66)], and further adjustment for GRSs did not change this estimate. Similarly, a one standard deviation change of the weighted 153-SNP GRS was significantly associated with CHD [HR 1.50, 95% CI (1.26-1.80)] and remained so after further adjustment for PH. The weighted 153-SNP GRS, but not PH, modestly improved discrimination [(C-index improvement, 0.016), p = 0.048] and reclassification [(NRI improvement, 8.6%), p = 0.027] beyond cardiovascular risk factors. After including both the GRS and PH, model performance improved further [(C-index improvement, 0.022), p = 0.006]. CONCLUSION: After adjustment for cardiovascular risk factors, PH and a weighted polygenic GRS were jointly associated with CHD and provided additive information for coronary event prediction.
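A sketch of how unweighted and weighted GRSs are typically built: per-SNP risk-allele counts, optionally weighted by per-allele log odds ratios from GWAS, then standardized so effect estimates are per standard deviation as in the abstract. All numbers are invented.

```python
import numpy as np

# Risk-allele counts (0/1/2) at 4 hypothetical SNPs for 3 people
genotypes = np.array([
    [0, 1, 2, 1],   # person 1
    [2, 2, 1, 0],   # person 2
    [1, 0, 0, 1],   # person 3
])
# Per-allele odds ratios from GWAS, converted to log scale as weights
log_or = np.log(np.array([1.10, 1.05, 1.20, 1.08]))

unweighted_grs = genotypes.sum(axis=1)   # simple allele count
weighted_grs = genotypes @ log_or        # GWAS-weighted score

# Standardize so associations are expressed per standard deviation
weighted_grs_std = (weighted_grs - weighted_grs.mean()) / weighted_grs.std()
print(unweighted_grs, weighted_grs_std)
```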

Relevance: 20.00%

Abstract:

The updated Vienna Prediction Model for estimating recurrence risk after an unprovoked venous thromboembolism (VTE) has been developed to identify individuals at low risk for VTE recurrence in whom anticoagulation (AC) therapy may be stopped after 3 months. We externally validated the accuracy of the model to predict recurrent VTE in a prospective multicenter cohort of 156 patients aged ≥65 years with acute symptomatic unprovoked VTE who had received 3 to 12 months of AC. Patients with a predicted 12-month risk within the lowest quartile based on the updated Vienna Prediction Model were classified as low risk. The risk of recurrent VTE did not differ between low- vs higher-risk patients at 12 months (13% vs 10%; P = .77) and 24 months (15% vs 17%; P = 1.0). The area under the receiver operating characteristic curve for predicting VTE recurrence was 0.39 (95% confidence interval [CI], 0.25-0.52) at 12 months and 0.43 (95% CI, 0.31-0.54) at 24 months. In conclusion, in elderly patients with unprovoked VTE who have stopped AC, the updated Vienna Prediction Model does not discriminate between patients who develop recurrent VTE and those who do not. This study was registered at www.clinicaltrials.gov as #NCT00973596.
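A sketch of the validation logic described: classify patients whose predicted 12-month recurrence risk falls in the lowest quartile as low risk, compare observed recurrence between groups, and check discrimination with an AUROC. The predicted risks and outcomes below are invented; an AUROC at or below 0.5, as the validation reported (0.39 at 12 months), means no useful discrimination.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical predicted 12-month recurrence risks and observed outcomes
predicted_risk = np.array([0.03, 0.05, 0.08, 0.10, 0.12, 0.15, 0.18, 0.22])
recurrence = np.array([1, 0, 0, 1, 0, 0, 1, 0])

# Lowest quartile of predicted risk -> "low risk" group
low_risk = predicted_risk <= np.quantile(predicted_risk, 0.25)
print("recurrence, low risk    :", recurrence[low_risk].mean())
print("recurrence, higher risk :", recurrence[~low_risk].mean())

print("AUROC:", roc_auc_score(recurrence, predicted_risk))
```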