72 results for logistic regression predictors


Relevance: 90.00%

Abstract:

OBJECTIVE The development of peripheral artery disease is affected by the presence of cardiovascular risk factors. It is unclear whether particular risk factors lead to different clinical stages of peripheral artery disease. The aim of this retrospective cross-sectional study was to assess the association of cardiovascular risk factors with the presence of critical limb ischaemia. METHODS The study cohort was derived from a consecutive registry of patients undergoing endovascular therapy in a tertiary referral centre between January 2000 and April 2014. Patients undergoing first-time endovascular intervention for chronic peripheral artery disease of the lower extremities were included. Univariate and multivariate logistic regression models were used to assess the association of age, sex, diabetes mellitus, hypertension, dyslipidaemia, smoking, and renal insufficiency with critical limb ischaemia vs. intermittent claudication. RESULTS A total of 3406 patients were included in the study (mean age 71.7 ± 11.8 years, 2075 [61%] male). There was a significant association of age (OR 1.67, 95%-CI 1.53-1.82, p < 0.001), male gender (OR 1.23, 95%-CI 1.04-1.47, p = 0.016), diabetes (OR 1.99, 95%-CI 1.68-2.36, p < 0.001), and renal insufficiency (OR 1.62, 95%-CI 1.35-1.96, p < 0.001) with the likelihood of critical limb ischaemia. Smoking was associated with intermittent claudication rather than critical limb ischaemia (OR 0.78, 95%-CI 0.65-0.94, p = 0.010), while hypertension and dyslipidaemia showed no association with critical limb ischaemia. CONCLUSIONS In peripheral artery disease patients undergoing first-time endovascular treatment, age, male gender, diabetes, and renal insufficiency were the strongest predictors of the presence of critical limb ischaemia.
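The core tool in this abstract, a multivariable logistic regression reporting odds ratios with 95% confidence intervals, can be sketched as follows. This is a minimal illustration on synthetic data, not the study's dataset: the predictors `age` and `diabetes`, all coefficients, and the sample size are invented, and the model is fitted with a hand-rolled Newton-Raphson (IRLS) routine rather than any particular statistics package.

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Fit a logistic regression by Newton-Raphson (IRLS).
    X is an (n, p) predictor matrix; an intercept column is added here."""
    X = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))        # fitted probabilities
        H = X.T @ (X * (p * (1 - p))[:, None])     # observed information matrix
        beta += np.linalg.solve(H, X.T @ (y - p))  # Newton step
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    cov = np.linalg.inv(X.T @ (X * (p * (1 - p))[:, None]))
    return beta, np.sqrt(np.diag(cov))             # estimates and standard errors

# Synthetic cohort: 'age' (standardised) and 'diabetes' raise the odds of the outcome
rng = np.random.default_rng(0)
n = 2000
age = rng.normal(0.0, 1.0, n)
diabetes = rng.integers(0, 2, n).astype(float)
logit = -1.0 + 0.5 * age + 0.7 * diabetes
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

beta, se = fit_logistic(np.column_stack([age, diabetes]), y)
for name, b, s in zip(["age", "diabetes"], beta[1:], se[1:]):
    print(f"{name}: OR {np.exp(b):.2f} "
          f"(95% CI {np.exp(b - 1.96 * s):.2f}-{np.exp(b + 1.96 * s):.2f})")
```

Each odds ratio is the exponentiated coefficient, and the Wald 95% CI is obtained by exponentiating the coefficient ± 1.96 standard errors, which is the form in which the abstract reports its results.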

Relevance: 90.00%

Abstract:

BACKGROUND This study evaluated whether risk factors for sternal wound infections vary with the type of surgical procedure in cardiac operations. METHODS This was a university hospital surveillance study of 3,249 consecutive patients (28% women) from 2006 to 2010 (median age, 69 years [interquartile range, 60 to 76]; median additive European System for Cardiac Operative Risk Evaluation score, 5 [interquartile range, 3 to 8]) after (1) isolated coronary artery bypass grafting (CABG), (2) isolated valve repair or replacement, or (3) combined valve procedures and CABG. All other operations were excluded. Univariate and multivariate binary logistic regression analyses were conducted to identify independent predictors for development of sternal wound infections. RESULTS We detected 122 sternal wound infections (3.8%) in 3,249 patients: 74 of 1,857 patients (4.0%) after CABG, 19 of 799 (2.4%) after valve operations, and 29 of 593 (4.9%) after combined procedures. In CABG patients, bilateral internal thoracic artery harvest, procedural duration exceeding 300 minutes, diabetes, obesity, chronic obstructive pulmonary disease, and female sex (model 1) were independent predictors for sternal wound infection. A second model (model 2), using the European System for Cardiac Operative Risk Evaluation, revealed that bilateral internal thoracic artery harvest, diabetes, obesity, and the second and third quartiles of the European System for Cardiac Operative Risk Evaluation were independent predictors. In valve patients, model 1 showed only revision for bleeding as an independent predictor for sternal infection, and model 2 yielded both revision for bleeding and diabetes. For combined valve and CABG operations, both regression models demonstrated that revision for bleeding and duration of operation exceeding 300 minutes were independent predictors for sternal infection. CONCLUSIONS Risk factors for sternal wound infections after cardiac operations vary with the type of surgical procedure.
In patients undergoing valve operations or combined operations, procedure-related risk factors (revision for bleeding, duration of operation) independently predict infection. In patients undergoing CABG, not only procedure-related risk factors but also bilateral internal thoracic artery harvest and patient characteristics (diabetes, chronic obstructive pulmonary disease, obesity, female sex) are predictive of sternal wound infection. Preventive interventions may be justified according to the type of operation.

Relevance: 90.00%

Abstract:

BACKGROUND There is no agreement on the influence of patent ductus arteriosus (PDA) on outcomes in patients with necrotizing enterocolitis (NEC). In this study, we assessed the influence of PDA on NEC outcomes. METHODS A retrospective study of 131 infants with established NEC was performed. Outcomes (death, disease severity, need for surgery, hospitalization duration), as well as multiple clinical parameters, were compared between NEC patients with no congenital heart disease (n=102) and those with isolated PDA (n=29). Univariate, multivariate and stepwise logistic regression analyses were performed. RESULTS Birth weight and gestational age were significantly lower in patients with PDA [median (95% CI): 1120 g (1009-1562 g), 28.4 wk (27.8-30.5 wk)] than in those without PDA [median (95% CI): 1580 g (1593-1905 g), 32.4 wk (31.8-33.5 wk); P<0.05]. The risk of NEC-attributable fatality was higher in NEC patients with PDA (35%) than in NEC patients without PDA (14%) [univariate odds ratio (OR)=3.3, 95% CI: 1.8-8.6, P<0.05; multivariate OR=2.4, 95% CI: 0.82-2.39, P=0.111]. Significant independent predictors for nonsurvival within the entire cohort were advanced disease severity stage III (OR=27.9, 95% CI: 7.4-105, P<0.001) and birth weight below 1100 g (OR=5.7, 95% CI: 1.7-19.4, P<0.01). CONCLUSIONS In patients with NEC, the presence of PDA is associated with an increased risk of death. However, when important differences between the two study groups are controlled for, only birth weight and disease severity may independently predict mortality.
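The gap between the univariate OR (3.3) and the adjusted OR (2.4) above is the signature of confounding by birth weight and gestational age. The unadjusted odds ratio itself comes straight from a 2x2 table, with a Woolf (log-scale) confidence interval. A small sketch follows; the cell counts are approximate reconstructions from the reported rates (35% of 29 PDA patients vs. 14% of 102 without PDA), not the study's actual table.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table with a Woolf (log-scale) CI.
    a/b: cases/non-cases among exposed; c/d: cases/non-cases among unexposed."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Approximate counts: ~10 of 29 PDA infants died, ~14 of 102 without PDA
or_, lo, hi = odds_ratio_ci(10, 19, 14, 88)
print(f"OR {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

The reconstructed table reproduces the abstract's univariate OR of about 3.3; the adjusted OR of 2.4 can only come from a multivariable model that includes the confounders.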

Relevance: 90.00%

Abstract:

BACKGROUND Renal damage is more frequent with new-generation lithotripters. However, animal studies suggest that voltage ramping minimizes the risk of complications following extracorporeal shock wave lithotripsy (SWL). In the clinical setting, the optimal voltage strategy remains unclear. OBJECTIVE To evaluate whether stepwise voltage ramping can protect the kidney from damage during SWL. DESIGN, SETTING, AND PARTICIPANTS A total of 418 patients with solitary or multiple unilateral kidney stones were randomized to receive SWL using a Modulith SLX-F2 lithotripter with either stepwise voltage ramping (n=213) or a fixed maximal voltage (n=205). INTERVENTION SWL. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS The primary outcome was sonographic evidence of renal hematomas. Secondary outcomes included levels of urinary markers of renal damage, stone disintegration, stone-free rate, and rates of secondary interventions within 3 mo of SWL. Descriptive statistics were used to compare clinical outcomes between the two groups. A logistic regression model was generated to assess predictors of hematomas. RESULTS AND LIMITATIONS Significantly fewer hematomas occurred in the ramping group (12/213, 5.6%) than in the fixed group (27/205, 13%; p=0.008). There was some evidence that the fixed group had higher urinary β2-microglobulin levels after SWL compared to the ramping group (p=0.06). Urinary microalbumin levels, stone disintegration, stone-free rate, and rates of secondary interventions did not significantly differ between the groups. The logistic regression model showed a significantly higher risk of renal hematomas in older patients (odds ratio [OR] 1.03, 95% confidence interval [CI] 1.00-1.05; p=0.04). Stepwise voltage ramping was associated with a lower risk of hematomas (OR 0.39, 95% CI 0.19-0.80; p=0.01). The study was limited by the use of ultrasound to detect hematomas.
CONCLUSIONS In this prospective randomized study, stepwise voltage ramping during SWL was associated with a lower risk of renal damage compared to a fixed maximal voltage without compromising treatment effectiveness. PATIENT SUMMARY Lithotripsy is a noninvasive technique for urinary stone disintegration using ultrasonic energy. In this study, two voltage strategies are compared. The results show that a progressive increase in voltage during lithotripsy decreases the risk of renal hematomas while maintaining excellent outcomes. TRIAL REGISTRATION ISRCTN95762080.

Relevance: 90.00%

Abstract:

Background Lack of donor organs remains a major obstacle in organ transplantation. Our aim was to evaluate (1) the association between engaging in high-risk recreational activities and attitudes toward organ donation and (2) the degree of reciprocity between organ acceptance and donation willingness in young men. Methods A 17-item, close-ended survey was offered to male conscripts ages 18 to 26 years in all Swiss military conscription centers. Predictors of organ donation attitudes were assessed in bivariate analyses and multiple logistic regression. Reciprocity of the intentions to accept and to donate organs was assessed by means of donor card status. Results Among the 1559 responses analyzed, neither motorcycling nor practicing extreme sports reached a significant association with donor card holder status. Family communication about organ donation, student or academic profession, and living in a Latin linguistic region were predictors of positive organ donation attitudes, whereas residence in a German-speaking region and practicing any religion predicted reluctance. Significantly more respondents were willing to accept than to donate organs, especially among those without family communication concerning organ donation. Conclusions For the first time, it was shown that high-risk recreational activities do not influence organ donation attitudes. Second, a considerable discrepancy in organ donation reciprocity was identified. We propose that increasing this reciprocity could eventually increase organ donation rates.

Relevance: 90.00%

Abstract:

Catheter ablation of complex fractionated atrial electrograms (CFAE), also known as defragmentation ablation, may be considered for the treatment of persistent atrial fibrillation (AF) beyond pulmonary vein isolation (PVI). Concomitant antiarrhythmic drug (AAD) therapy is common, but the relevance of AAD administration and its optimal timing during ablation remain unclear. Therefore, we investigated the use and timing of AADs during defragmentation ablation and their possible implications for AF termination and ablation success in a large cohort of patients. Retrospectively, we included 200 consecutive patients (age: 61 ± 12 years, LA diameter: 47 ± 8 mm) with persistent AF (episode duration 47 ± 72 weeks) who underwent de novo ablation including CFAE ablation. In all patients, PVI was performed prior to CFAE ablation. The use and timing of AADs were registered. The follow-ups consisted of Holter ECGs and clinical visits. Termination of AF was achieved in 132 patients (66 %). Intraprocedural AADs were administered in 168/200 patients (84 %) 45 ± 27 min after completion of PVI. Amiodarone was used in the majority of the patients (160/168). The timing of AAD administration was predicted by the atrial fibrillation cycle length (AFCL). At follow-up, 88 patients (46 %) were free from atrial arrhythmia. Multivariate logistic regression analysis revealed that administration of AAD early after PVI, LA size, duration of AF history, sex and AFCL were predictors of AF termination. The administration of AAD and its timing were not predictive of outcome, and age was the sole independent predictor of AF recurrence. The administration of AAD during ablation was common in this large cohort of persistent AF patients. The choice to administer AAD therapy and the timing of the administration during ablation were influenced by AFCL, and these factors did not significantly influence the moderate single procedure success rate in this retrospective analysis.

Relevance: 90.00%

Abstract:

Field: Surgery Abstract: Background: Preservation of cardiac grafts for transplantation is not standardized and most centers use a single administration of crystalloid solution at the time of harvesting. We investigated possible benefits of an additional dose of cardioplegia dispensed immediately before implantation. Methods: Consecutive adult cardiac transplantations (2005-2012) were reviewed. Hearts were harvested following a standard protocol (Celsior 2 L, 4-8°C). In 2008, 100 ml of crystalloid cardioplegic solution was added and administered immediately before implantation. Univariate and logistic regression analyses were used to investigate risk factors for post-operative graft failure and mid-term outcome. Results: A total of 81 patients, 44 standard ("Cardio") vs. 37 with additional cardioplegia ("CardioC"), were analyzed. Recipients and donors were comparable in both groups. CardioC patients demonstrated a reduced need for defibrillation (24 vs. 48%, p = 0.03), post-operative ratio of CK-MB/CK (10.1 ± 3.9 vs. 13.3 ± 4.2%, p = 0.001), intubation time (2.0 ± 1.6 vs. 7.2 ± 11.5 days, p = 0.05), and ICU stay (3.9 ± 2.1 vs. 8.5 ± 7.8 days, p = 0.001). Actuarial survival was reduced when graft ischemic time was >180 min in Cardio but not in CardioC patients (p = 0.033). Organ ischemic time >180 min (OR: 5.48, CI: 1.08-27.75), donor female gender (OR: 5.84, CI: 1.13-33.01), and recipient/donor age >60 (OR: 6.33, CI: 0.86-46.75), but not the additional cardioplegia or the observation period, appeared to be independent predictors of post-operative acute graft failure. Conclusion: An additional dose of cardioplegia administered immediately before implantation may be a simple way to improve early and late outcome of cardiac transplantation, especially in situations of prolonged graft ischemia. A large, ideally multicentric, randomized study is desirable to verify this preliminary observation.

Relevance: 90.00%

Abstract:

OBJECTIVES To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. METHODS The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry with random-effects meta-analyses. RESULTS Of the 2323 included studies, 71% reported statistically significant results, with the proportion of significant results ranging from 47% to 86%. Multivariable modeling identified geographical area and involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to other study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). CONCLUSIONS The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry.
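The random-effects meta-analysis mentioned above pools per-study effect sizes while allowing the true effect to vary between studies. A minimal DerSimonian-Laird sketch follows; the example effects (log-odds of reporting a significant result) and their variances are invented for illustration, not the review's data.

```python
from math import sqrt

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study effect sizes
    (e.g. log-odds), returning the pooled estimate, its SE, and tau^2."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]  # re-weight with tau^2 added
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    return pooled, sqrt(1.0 / sum(w_star)), tau2

# Invented log-odds of a significant result in four hypothetical study groups
effects = [0.90, 0.55, 1.30, 0.40]
variances = [0.02, 0.05, 0.04, 0.03]
pooled, se, tau2 = random_effects_pool(effects, variances)
print(f"pooled log-odds {pooled:.2f} (SE {se:.2f}), tau^2 = {tau2:.3f}")
```

When heterogeneity is present (tau^2 > 0), the random-effects standard error is larger than the fixed-effect one, which is why pooled comparisons across dental specialties carry wide confidence intervals.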

Relevance: 90.00%

Abstract:

BACKGROUND Studies that systematically assess change in ulcerative colitis (UC) extent over time in adult patients are scarce. AIM To assess changes in disease extent over time and to evaluate clinical parameters associated with this change. METHODS Data from the Swiss IBD cohort study were analysed. We used logistic regression modelling to identify factors associated with a change in disease extent. RESULTS A total of 918 UC patients (45.3% females) were included. At diagnosis, UC patients presented with the following disease extent: proctitis [199 patients (21.7%)], left-sided colitis [338 patients (36.8%)] and extensive colitis/pancolitis [381 (41.5%)]. During a median disease duration of 9 [4-16] years, progression and regression were documented in 145 patients (15.8%) and 149 patients (16.2%), respectively. In addition, 624 patients (68.0%) had a stable disease extent. The following factors were found to be associated with disease progression: treatment with systemic glucocorticoids [odds ratio (OR) 1.704, P = 0.025] and calcineurin inhibitors (OR: 2.716, P = 0.005). No specific factors were found to be associated with disease regression. CONCLUSIONS Over a median disease duration of 9 [4-16] years, about two-thirds of UC patients maintained the initial disease extent; the remaining one-third experienced either progression or regression of the disease extent.

Relevance: 90.00%

Abstract:

BACKGROUND & AIMS The interaction of KIRs with their HLA ligands drives the activation and inhibition of natural killer (NK) cells. NK cells could be implicated in the development of liver fibrosis in chronic hepatitis C. METHODS We analysed 206 non-transplanted and 53 liver-transplanted patients, selected according to their Metavir fibrosis stage. Several variables, such as the number of activating KIRs or the HLA ligands, were considered in multinomial and logistic regression models. Possible confounding variables were also investigated. RESULTS The KIRs were not significant predictors of fibrosis stage. Conversely, a significant reduction of the HLA-C1C2 genotype was observed in the most advanced fibrosis stage group (F4) in both cohorts. Furthermore, the progression rate of fibrosis was almost 10 times faster in the subgroup of patients after liver transplantation, and HLA-C1C2 was significantly reduced in this cohort compared to non-transplanted patients. CONCLUSION This study suggests a possible role of KIRs and their ligands in the development of liver damage. The absence of C1/C2 ligand heterozygosity could lead to less inhibition of NK cells and a quicker progression to a high level of fibrosis in patients infected with HCV, especially following liver transplantation.

Relevance: 90.00%

Abstract:

BACKGROUND/AIMS Controversies still exist regarding the evaluation of growth hormone deficiency (GHD) in childhood at the end of growth. The aim of this study was to describe the natural history of GHD in a pediatric cohort. METHODS This is a retrospective study of a cohort of pediatric patients with GHD. Cases of acquired GHD were excluded. Univariate logistic regression was used to identify predictors of GHD persisting into adulthood. RESULTS Among 63 identified patients, 47 (75%) had partial GHD at diagnosis, while 16 (25%) had complete GHD, including 5 with multiple pituitary hormone deficiencies. At final height, 50 patients underwent repeat stimulation testing; 28 (56%) recovered and 22 (44%) remained growth hormone (GH) deficient. Predictors of persisting GHD were: complete GHD at diagnosis (OR 10.1, 95% CI 2.4-42.1), pituitary stalk defect or ectopic pituitary gland on magnetic resonance imaging (OR 6.5, 95% CI 1.1-37.1), greater height gain during GH treatment (OR 1.8, 95% CI 1.0-3.3), and IGF-1 level <-2 standard deviation scores (SDS) following treatment cessation (OR 19.3, 95% CI 3.6-103.1). In the multivariate analysis, only IGF-1 level <-2 SDS (OR 13.3, 95% CI 2.3-77.3) and complete GHD (OR 6.3, 95% CI 1.2-32.8) were associated with the outcome. CONCLUSION At final height, 56% of adolescents with GHD had recovered. Complete GHD at diagnosis, low IGF-1 levels following retesting, and pituitary malformation were strong predictors of persistence of GHD.

Relevance: 90.00%

Abstract:

OBJECTIVES Improvement of skin fibrosis is part of the natural course of diffuse cutaneous systemic sclerosis (dcSSc). Recognising those patients most likely to improve could help tailor clinical management and support cohort enrichment for clinical trials. In this study, we aimed to identify predictors for improvement of skin fibrosis in patients with dcSSc. METHODS We performed a longitudinal analysis of the European Scleroderma Trials And Research (EUSTAR) registry including patients with dcSSc, fulfilling American College of Rheumatology criteria, baseline modified Rodnan skin score (mRSS) ≥7 and follow-up mRSS at 12±2 months. The primary outcome was skin improvement (decrease in mRSS of >5 points and ≥25%) at 1 year follow-up. A respective increase in mRSS was considered progression. Candidate predictors for skin improvement were selected by expert opinion, and logistic regression with bootstrap validation was applied. RESULTS Of the 919 patients included, 218 (24%) improved and 95 (10%) progressed. Eleven candidate predictors for skin improvement were analysed. The final model identified high baseline mRSS and absence of tendon friction rubs as independent predictors of skin improvement. The baseline mRSS was the strongest predictor of skin improvement, independent of disease duration. An upper threshold between 18 and 25 performed best in enriching for progressors over regressors. CONCLUSIONS Patients with advanced skin fibrosis at baseline and absence of tendon friction rubs are more likely to regress in the next year than patients with milder skin fibrosis. These evidence-based data can be implemented in clinical trial design to minimise the inclusion of patients who would regress under standard of care.
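The bootstrap validation mentioned in the methods above estimates how much a model's apparent performance overstates its out-of-sample performance (its "optimism") by refitting the model on resampled data. The sketch below illustrates Harrell's optimism correction with a deliberately simple stand-in model (a single-marker accuracy-maximising cutoff rather than a logistic regression); all data are synthetic and the variable names are invented.

```python
import random

def fit_threshold(xs, ys):
    """Toy 'model': pick the cutoff on a single marker that maximises accuracy."""
    return max(set(xs), key=lambda t: sum((x >= t) == y for x, y in zip(xs, ys)))

def accuracy(t, xs, ys):
    return sum((x >= t) == y for x, y in zip(xs, ys)) / len(xs)

def bootstrap_optimism(xs, ys, n_boot=100, seed=1):
    """Harrell's optimism correction: apparent performance minus the average
    gap between bootstrap-sample and original-sample performance."""
    rng = random.Random(seed)
    apparent = accuracy(fit_threshold(xs, ys), xs, ys)
    n, optimism = len(xs), 0.0
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        bx, by = [xs[i] for i in idx], [ys[i] for i in idx]
        t = fit_threshold(bx, by)  # refit the model on the bootstrap sample
        optimism += accuracy(t, bx, by) - accuracy(t, xs, ys)
    return apparent - optimism / n_boot

# Synthetic data: class 1 has a marker shifted up by one standard deviation
rng = random.Random(0)
ys = [1] * 100 + [0] * 100
xs = [rng.gauss(1.0 if y else 0.0, 1.0) for y in ys]
apparent = accuracy(fit_threshold(xs, ys), xs, ys)
corrected = bootstrap_optimism(xs, ys)
print(f"apparent accuracy {apparent:.3f}, optimism-corrected {corrected:.3f}")
```

The same recipe applies to a logistic prediction model: refit the full model on each bootstrap sample, measure the drop when it is evaluated on the original data, and subtract the average drop from the apparent performance.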