71 results for Multiple-regression Analysis
Abstract:
Purpose: This retrospective study analyzed the pool of patients referred for treatment with dental implants over a 3-year period in a referral specialty clinic. Materials and Methods: All patients receiving dental implants between 2002 and 2004 in the Department of Oral Surgery and Stomatology, University of Bern, were included in this retrospective study. Patients were analyzed according to age, gender, indications for implant therapy, location of implants, and type and length of implants placed. A cumulative logistic regression analysis was performed to identify and analyze potential risk factors for complications or failures. Results: A total of 1,206 patients received 1,817 dental implants. The group comprised 573 men and 633 women with a mean age of 55.2 years. Almost 60% of patients were age 50 or older. The most frequent indication for implant therapy was single-tooth replacement in the maxilla (522 implants or 28.7%). A total of 726 implants (40%) were inserted in the esthetically demanding region of the anterior maxilla. For 939 implants (51.7%), additional bone-augmentation procedures were required. Of these, ridge augmentation with guided bone regeneration was performed more frequently than sinus grafting. Thirteen complications leading to early failures were recorded, resulting in an early failure rate of 0.7%. The regression analysis did not identify any of the assessed variables as a statistically significant risk factor for failure. Conclusions: From this study it can be concluded that patients referred to a specialty clinic for implant placement were more likely to be partially edentulous and over 50 years old. Single-tooth replacement was the most frequent indication (> 50%). Similarly, additional bone augmentation was indicated in more than 50% of cases. With strict patient selection criteria and a standardized surgical protocol, an early failure rate of 0.7% was observed in this study population.
Abstract:
OBJECTIVE: To compare the risk of shunt-dependent hydrocephalus after treatment of ruptured intracranial aneurysms by clipping versus coiling. METHODS: We analyzed 596 patients prospectively entered into our database from July 1999 to November 2005 with respect to the risk of shunt dependency after clipping versus coiling. Factors analyzed included age; sex; Hunt and Hess grade; Fisher grade; acute hydrocephalus; intraventricular hemorrhage; angiographic vasospasm; and number, size, and location of aneurysms. In addition, a meta-analysis of available data from the literature was performed, identifying four studies with quantitative data on the frequency of clipping, coiling, and shunt dependency. RESULTS: In the institutional series, univariate analysis identified Hunt and Hess grade, Fisher grade, acute hydrocephalus, intraventricular hemorrhage, and angiographic vasospasm as significant (P < 0.05) risk factors for shunt dependency. In a multivariate logistic regression analysis, we isolated intraventricular hemorrhage, acute hydrocephalus, and angiographic vasospasm as independent, significant risk factors for shunt dependency. The meta-analysis, including the current data, revealed a significantly higher risk for shunt dependency after coiling than after clipping (P = 0.01). CONCLUSION: Clipping of a ruptured aneurysm may be associated with a lower risk of developing shunt dependency, possibly because of clot removal. This might influence long-term outcome and surgical decision making.
Abstract:
Recent studies suggest that diabetes mellitus increases the risk of developing hepatocellular carcinoma (HCC). The aim of this study is to quantify the risk of HCC among patients with both diabetes mellitus and hepatitis C in a large cohort of patients with chronic hepatitis C and advanced fibrosis. We included 541 patients of whom 85 (16%) had diabetes mellitus. The median age at inclusion was 50 years. The prevalence of diabetes mellitus was 10.5% for patients with Ishak fibrosis score 4, 12.5% for Ishak score 5, and 19.1% for Ishak score 6. Multiple logistic regression analysis showed an increased risk of diabetes mellitus for patients with an elevated body mass index (BMI) (odds ratio [OR], 1.05; 95% confidence interval [CI], 1.00-1.11; P = 0.060) and a decreased risk of diabetes mellitus for patients with higher serum albumin levels (OR, 0.81; 95% CI, 0.63-1.04; P = 0.095). During a median follow-up of 4.0 years (interquartile range, 2.0-6.7), 11 patients (13%) with diabetes mellitus versus 27 patients (5.9%) without diabetes mellitus developed HCC, the 5-year occurrence of HCC being 11.4% (95% CI, 3.0-19.8) and 5.0% (95% CI, 2.2-7.8), respectively (P = 0.013). Multivariate Cox regression analysis of patients with Ishak 6 cirrhosis showed that diabetes mellitus was independently associated with the development of HCC (hazard ratio, 3.28; 95% CI, 1.35-7.97; P = 0.009). CONCLUSION: For patients with chronic hepatitis C and advanced cirrhosis, diabetes mellitus increases the risk of developing HCC.
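The hazard ratio quoted above comes from a multivariable Cox proportional-hazards model of time to HCC with diabetes and other covariates as predictors. Below is a minimal sketch of that type of model using the lifelines package on simulated, hypothetical data; the column names and effect sizes are illustrative only, not the study's.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 300
diabetes = rng.binomial(1, 0.16, n)                 # ~16% with diabetes, as above
age = rng.normal(50, 8, n)
bmi = rng.normal(27, 4, n)

# Toy follow-up times in which diabetes shortens time to HCC.
time = rng.exponential(8.0, n) * np.exp(-0.8 * diabetes)
event = (time < 5.0).astype(int)                    # HCC observed within follow-up
time = np.minimum(time, 5.0)                        # administrative censoring at 5 years

df = pd.DataFrame({"time": time, "event": event,
                   "diabetes": diabetes, "age": age, "bmi": bmi})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()    # hazard ratios (exp(coef)) with 95% CIs and p-values
```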
Abstract:
OBJECTIVES: This paper is concerned with checking the goodness-of-fit of binary logistic regression models. For practitioners of data analysis, the broad classes of procedures for checking goodness-of-fit available in the literature are described. The challenges of model checking in the context of binary logistic regression are reviewed. As a viable solution, a simple graphical procedure for checking goodness-of-fit is proposed. METHODS: The proposed graphical procedure relies on pieces of information available from any logistic analysis; the focus is on combining and presenting these in an informative way. RESULTS: The information gained using this approach is illustrated with three examples. In the discussion, the proposed method is put into context and compared with other graphical procedures for checking the goodness-of-fit of binary logistic models available in the literature. CONCLUSION: A simple graphical method can significantly improve the understanding of any logistic regression analysis and help to prevent faulty conclusions.
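The abstract does not spell out the proposed plot, but a common graphical check of this kind groups observations by deciles of fitted probability and plots observed event proportions against mean predicted probabilities (a visual analogue of the Hosmer-Lemeshow idea). The following is a rough sketch on simulated data, not the paper's specific procedure.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
n = 500
x1, x2 = rng.normal(size=n), rng.normal(size=n)
p_true = 1 / (1 + np.exp(-(-0.5 + 1.2 * x1 - 0.7 * x2)))
y = rng.binomial(1, p_true)

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.Logit(y, X).fit(disp=0)
p_hat = fit.predict(X)

groups = pd.qcut(p_hat, 10, labels=False)                  # deciles of predicted risk
summary = pd.DataFrame({"p_hat": p_hat, "y": y, "g": groups}).groupby("g").mean()

plt.plot(summary["p_hat"], summary["y"], "o")
plt.plot([0, 1], [0, 1], "--", color="grey")               # perfect calibration line
plt.xlabel("Mean predicted probability (decile)")
plt.ylabel("Observed event proportion")
plt.show()
```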
Abstract:
BACKGROUND: Renal resistance index, a predictor of kidney allograft function and patient survival, seems to depend on renal and peripheral vascular compliance and resistance. Asymmetric dimethylarginine (ADMA) is an endogenous inhibitor of nitric oxide synthase and therefore influences vascular resistance. STUDY DESIGN: We investigated the relationship between renal resistance index, ADMA, and risk factors for cardiovascular diseases and kidney function in a cross-sectional study. SETTING & PARTICIPANTS: 200 stable renal allograft recipients (133 men and 67 women with a mean age of 52.8 years). PREDICTORS: Serum ADMA concentration, pulse pressure, estimated glomerular filtration rate, and recipient age. OUTCOME: Renal resistance index. MEASUREMENTS: Renal resistance index measured by color-coded duplex ultrasound, serum ADMA concentration measured by liquid chromatography-tandem mass spectrometry, estimated glomerular filtration rate (Nankivell equation), arterial stiffness measured by digital volume pulse, Framingham and other cardiovascular risk factors, and evaluation of concomitant antihypertensive and immunosuppressive medication. RESULTS: Mean serum ADMA concentration was 0.72 ± 0.21 (±SD) µmol/L and mean renal resistance index was 0.71 ± 0.07. Multiple stepwise regression analysis showed that recipient age (P < 0.001), pulse pressure (P < 0.001), diabetes (P < 0.01), and ADMA concentration (P < 0.01) were independently associated with resistance index. ADMA concentrations were correlated with estimated glomerular filtration rate (P < 0.01). LIMITATIONS: The cross-sectional nature of this study precludes cause-effect conclusions. CONCLUSIONS: In addition to established cardiovascular risk factors, ADMA appears to be a relevant determinant of renal resistance index and allograft function and deserves consideration in prospective outcome trials in renal transplantation.
Abstract:
Due to highly erodible volcanic soils and a harsh climate, livestock grazing in Iceland has led to serious soil erosion on about 40% of the country's surface. Over the last 100 years, various revegetation and restoration measures were taken on large areas distributed all over Iceland in an attempt to counteract this problem. The present research aimed to develop models for estimating percent vegetation cover (VC) and aboveground biomass (AGB) based on satellite data, as this would make it possible to assess and monitor the effectiveness of restoration measures over large areas at a fairly low cost. Models were developed based on 203 vegetation cover samples and 114 aboveground biomass samples distributed over five SPOT satellite datasets. All satellite datasets were atmospherically corrected, and digital numbers were converted into ground reflectance. A selection of vegetation indices (VIs) was then calculated, followed by simple and multiple linear regression analysis of the relations between the field data and the calculated VIs. The best results were achieved using multiple linear regression models for both %VC and AGB. The model calibration and validation results showed that R² and RMSE values varied little across most VIs. For percent VC, R² values range between 0.789 and 0.822, leading to RMSEs ranging between 15.89% and 16.72%. For AGB, R² values for low-biomass areas (AGB < 800 g/m²) range between 0.607 and 0.650, leading to RMSEs ranging between 126.08 g/m² and 136.38 g/m². The AGB model developed for all areas, including those with high biomass coverage (AGB > 800 g/m²), achieved R² values between 0.487 and 0.510, resulting in RMSEs ranging from 234 g/m² to 259.20 g/m². The models predicting percent VC generally overestimate observed low percent VC and slightly underestimate observed high percent VC. The estimation models for AGB behave in a similar way, but over- and underestimation are much more pronounced. These results show that it is possible to estimate percent VC with high accuracy based on various VIs derived from SPOT satellite data. AGB of restoration areas with low-biomass values of up to 800 g/m² can likewise be estimated with high accuracy based on various VIs derived from SPOT satellite data, whereas in the case of high biomass coverage, estimation accuracy decreases with increasing biomass values. Accordingly, percent VC can be estimated with high accuracy anywhere in Iceland, whereas AGB is much more difficult to estimate, particularly for areas with high AGB variability.
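A minimal sketch of the modelling step described above: multiple linear regression of field-measured percent vegetation cover on a few vegetation indices, with R² and RMSE computed on a held-out validation split. The index values and field samples are simulated stand-ins; the actual SPOT-derived data are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(42)
n = 203
ndvi = rng.uniform(0.05, 0.85, n)
savi = ndvi * 0.9 + rng.normal(0, 0.03, n)
msavi = ndvi * 0.8 + rng.normal(0, 0.05, n)
vc = np.clip(10 + 95 * ndvi + rng.normal(0, 8, n), 0, 100)    # % vegetation cover

X = np.column_stack([ndvi, savi, msavi])
X_tr, X_te, y_tr, y_te = train_test_split(X, vc, test_size=0.3, random_state=0)

model = LinearRegression().fit(X_tr, y_tr)
pred = model.predict(X_te)
print("R2:  ", round(r2_score(y_te, pred), 3))
print("RMSE:", round(mean_squared_error(y_te, pred) ** 0.5, 2), "% cover")
```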
Abstract:
BACKGROUND: Various reasons exist for so-called bacillus Calmette-Guérin (BCG) failure in patients with non-muscle-invasive urothelial bladder carcinoma (NMIBC). OBJECTIVE: To explore whether urothelial carcinoma of the upper urinary tract (UUT) and/or prostatic urethra may be a cause for BCG failure. DESIGN, SETTING, AND PARTICIPANTS: Retrospective analysis of 110 patients with high-risk NMIBC repeatedly treated with intravesical BCG, diagnosed with disease recurrence, and followed for a median time of 9.1 yr. INTERVENTION: Two or more intravesical BCG induction courses without maintenance. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS: Primary outcome was pattern of disease recurrence (BCG failure) within the urinary tract categorised into UUT and/or urethral carcinoma (with or without intravesical recurrence), and intravesical recurrence alone. Secondary outcome was survival. Predictors of UUT and/or urethral carcinoma and the effect of pattern of disease recurrence on cancer-specific survival were assessed with multivariable Cox regression analysis adjusting for multiple clinical and tumour characteristics. RESULTS AND LIMITATIONS: Of the 110 patients, 57 (52%) had UUT and/or urethral carcinoma (with or without intravesical recurrence), and 53 (48%) had intravesical recurrence alone. In patients with UUT and/or urethral carcinoma, bladder carcinoma in situ (Tis) before the first and second BCG course was present in 42 of 57 (74%) and 47 of 57 (82%) patients, respectively. On multivariable analysis, bladder Tis before the first and/or second BCG course was the only independent predictor of UUT and/or urethral carcinoma. Of the 110 patients, 69 (63%) were alive at last follow-up visit, 18 (16%) had died due to metastatic urothelial carcinoma, and 23 (21%) had died of other causes. Pattern of disease recurrence within the urinary tract was not an independent predictor of cancer-specific survival. Main study limitations were retrospective design and limited power for survival analysis. CONCLUSIONS: In our patients with high-risk NMIBC failing after two or more courses of intravesical BCG, UUT and/or urethral carcinoma was detected in >50% of the cases during follow-up. The vast majority of these patients had bladder Tis before the first and/or second BCG course. In patients experiencing the so-called BCG failure, a diagnostic work-up of UUT and prostatic urethra should always be performed to exclude urothelial carcinoma before additional intravesical therapy or even a radical cystectomy is considered.
Abstract:
INTRODUCTION Data concerning outcome after management of acetabular fractures by anterior approaches, with a focus on age and on fractures associated with roof impaction, central dislocation and/or quadrilateral plate displacement, are rare. METHODS Between October 2005 and April 2009 a series of 59 patients (mean age 57 years, range 13-91) with fractures involving the anterior column was treated using the modified Stoppa approach alone or, for reduction of displaced iliac wing or low anterior column fractures, in combination with the 1st window of the ilioinguinal approach or the modified Smith-Petersen approach, respectively. Surgical data, accuracy of reduction, clinical and radiographic outcome at mid-term, and the need for endoprosthetic replacement in the postoperative course (defined as failure) were assessed; uni- and multivariate regression analyses were performed to identify independent predictive factors (e.g. age, nonanatomical reduction, acetabular roof impaction, central dislocation, quadrilateral plate displacement) for failure. Outcome was assessed for all patients in general and according to age in particular; patients were subdivided into two groups according to their age (group "<60 yrs", group "≥60 yrs"). RESULTS Forty-three of 59 patients (mean age 54 yrs, range 13-89) were available for evaluation. Of these, anatomic reduction was achieved in 72% of cases. Nonanatomical reduction was identified as the only multivariate predictor of subsequent total hip replacement (adjusted hazard ratio 23.5; p<0.01). A significantly higher rate of nonanatomical reduction was observed in the presence of acetabular roof impaction (p=0.01). In 16% of all patients, total hip replacement was performed, and in 69% of patients with preserved hips the clinical results were excellent or good at a mean follow-up of 35±10 months (range: 24-55). No statistically significant differences were observed between the two groups. CONCLUSION Nonanatomical reconstruction of the articular surface puts joint-preserving management of acetabular fractures through an isolated or combined modified Stoppa approach at risk of failure, resulting in total joint replacement at mid-term. In the elderly, joint-preserving surgery is worth considering, as promising clinical and radiographic results may be obtained at mid-term.
Abstract:
OBJECTIVES Evidence is increasing that cognitive failure may be used to screen for drivers at risk. Until now, most studies have relied on driving learners. This exploratory pilot study examines self-reported cognitive failure in driving beginners and errors during real driving as observed by driving instructors. METHODS Forty-two driving learners of 14 driving instructors filled out a work-related cognitive failure questionnaire. Driving instructors observed driving errors during the next driving lesson. In multiple linear regression analysis, driving errors were regressed on cognitive failure, with the number of driving lessons controlled for as a proxy for driving experience. RESULTS Higher cognitive failure predicted more driving errors (p < .01) when age, gender and driving experience were controlled for in the analysis. CONCLUSIONS Cognitive failure was significantly associated with observed driving errors. Systematic research on cognitive failure in driving beginners is recommended.
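A sketch of the regression described above: observed driving errors regressed on self-reported cognitive failure while controlling for age, gender, and number of driving lessons. Variable names and data below are hypothetical and simulated.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 42
df = pd.DataFrame({
    "cognitive_failure": rng.normal(2.0, 0.6, n),
    "age": rng.integers(18, 30, n),
    "gender": rng.choice(["f", "m"], n),
    "driving_lessons": rng.integers(5, 40, n),
})
# Toy outcome: more cognitive failure -> more errors; more lessons -> fewer errors.
df["driving_errors"] = (3 + 2.5 * df["cognitive_failure"]
                        - 0.05 * df["driving_lessons"]
                        + rng.normal(0, 1.5, n)).round()

model = smf.ols("driving_errors ~ cognitive_failure + age + C(gender) + driving_lessons",
                data=df).fit()
print(model.summary())   # the coefficient on cognitive_failure tests the main hypothesis
```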
Abstract:
Aim of the study In this study we examined the effects of Taiji on perceived stress and general self-efficacy (GSE), and investigated whether a Taiji-induced increase in GSE mediates the Taiji-related reduction of perceived stress. Materials and methods Seventy healthy participants were randomly allocated either to a Taiji intervention group or to a waiting-list control group. The intervention lasted 12 weeks and comprised two Taiji classes per week. Before, shortly after, and two months after the intervention, we assessed the degree of perceived stress and GSE in all participants using the Perceived Stress Scale (PSS) and the GSE Scale. Results Compared with controls, participants in the Taiji group showed a significantly stronger decrease in perceived stress and a greater increase in GSE from pre- to post-intervention assessment (PSS: p = 0.009; GSE: p = 0.006), as well as from pre-intervention to follow-up assessment (PSS: p = 0.018; GSE: p = 0.033). A mediator analysis based on a multiple regression approach revealed that the Taiji-related increase in GSE statistically mediated the reduction in perceived stress after Taiji compared with baseline. Post hoc testing showed that the mediating effect of GSE was significant (p = 0.043). Conclusions Our findings confirm previously reported stress-reducing and GSE-enhancing effects of Taiji, with the increase in GSE mediating the Taiji-related reduction in perceived stress.
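The mediator analysis mentioned above follows the classic regression-based logic: the total effect of the intervention on stress, the effect of the intervention on the mediator (GSE change), and the direct effect once the mediator is included, with the indirect effect given by the product of the a and b paths. Below is a hedged sketch on simulated data with hypothetical variable names, not the authors' analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 70
taiji = rng.integers(0, 2, n)                          # 1 = Taiji intervention group
gse_change = 0.8 * taiji + rng.normal(0, 1, n)         # change in GSE
pss_change = -0.6 * gse_change - 0.2 * taiji + rng.normal(0, 1, n)   # change in stress
df = pd.DataFrame({"taiji": taiji, "gse_change": gse_change, "pss_change": pss_change})

total    = smf.ols("pss_change ~ taiji", data=df).fit()               # path c
a_path   = smf.ols("gse_change ~ taiji", data=df).fit()               # path a
b_direct = smf.ols("pss_change ~ taiji + gse_change", data=df).fit()  # paths b and c'

indirect = a_path.params["taiji"] * b_direct.params["gse_change"]
print("total effect c:    ", round(total.params["taiji"], 3))
print("direct effect c':  ", round(b_direct.params["taiji"], 3))
print("indirect effect ab:", round(indirect, 3))
```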
Abstract:
There is growing evidence for the development of posttraumatic stress symptoms as a consequence of acute cardiac events. Acute coronary syndrome (ACS) patients experience a range of acute cardiac symptoms, and these may cluster together in specific patterns. The objectives of this study were to establish distinct symptom clusters in ACS patients, and to investigate whether the experience of different types of symptom clusters is associated with posttraumatic symptom intensity at six months. ACS patients were interviewed in hospital within 48 h of admission; 294 patients provided information on symptoms before hospitalisation, and cluster analysis was used to identify patterns. Posttraumatic stress symptoms were assessed in 156 patients at six months. Three symptom clusters were identified: pain symptoms, diffuse symptoms, and symptoms of dyspnea. In multiple regression analyses adjusting for sociodemographic, clinical and psychological factors, the pain symptom cluster (β = .153, P = .044) emerged as a significant predictor of posttraumatic symptom severity at six months. A marginally significant association was observed between symptoms of dyspnea and reduced intrusive symptoms at six months (β = -.156, P = .061). The findings suggest that acute ACS symptoms occur in distinct clusters, which may have distinct effects on the intensity of subsequent posttraumatic symptoms. Since posttraumatic stress is associated with adverse outcomes, identifying patients at risk based on their symptom experience during ACS may be useful in targeting interventions.
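One plausible way to implement the cluster-then-predict analysis described above: hierarchically cluster the binary symptom items on a correlation-based distance, score each resulting cluster per patient, and regress the follow-up posttraumatic stress score on the cluster scores plus adjustment variables. The symptom names and data below are simulated and hypothetical; the abstract does not state the authors' exact clustering method.

```python
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 294
symptoms = pd.DataFrame(
    rng.binomial(1, 0.4, size=(n, 6)),
    columns=["chest_pain", "arm_pain", "sweating", "nausea", "breathless", "fatigue"],
)

# Cluster the six symptom items on a (1 - correlation) distance.
dist = 1 - symptoms.corr().values
condensed = dist[np.triu_indices(6, k=1)]            # condensed pairwise distance vector
labels = fcluster(linkage(condensed, method="average"), t=3, criterion="maxclust")
print(dict(zip(symptoms.columns, labels)))            # which symptom falls in which cluster

# Score each cluster per patient and regress a follow-up outcome on the scores.
df = pd.DataFrame({f"cluster_{c}": symptoms.loc[:, labels == c].mean(axis=1)
                   for c in sorted(set(labels))})
df["age"] = rng.normal(60, 10, n)
df["pts_severity"] = 5 + 4 * df["cluster_1"] + rng.normal(0, 2, n)   # toy outcome

rhs = " + ".join(c for c in df.columns if c.startswith("cluster_")) + " + age"
fit = smf.ols("pts_severity ~ " + rhs, data=df).fit()
print(fit.params)
```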
Abstract:
Background: Few studies have examined the 20% of individuals who never experience an episode of low back pain (LBP). To date, no investigation has compared a group who claim never to have experienced LBP in their lifetime with two population-based case–control groups with and without momentary LBP. This study investigated whether LBP-resilient workers between 50 and 65 years of age had better general health, demonstrated more positive health behaviour, and were better able to accomplish routine activities than both case–control groups. Methods: Forty-two LBP-resilient participants completed the same pain assessment questionnaire as a population-based LBP sample from a nationwide, large-scale cross-sectional survey in Switzerland. The LBP-resilient participants were compared pairwise to the propensity score-matched case controls by exploring differences in demographic and work characteristics, and by calculating odds ratios (ORs) and effect sizes. A discriminant analysis explored group differences, while multiple logistic regression analysis identified the individual indicators that accounted for group differences. Results: LBP-resilient participants were healthier than the case controls with momentary LBP and accomplished routine activities more easily. Compared with controls without momentary LBP, LBP-resilient participants had higher vitality, a lower workload, a healthier attitude towards health, and behaved more healthily by drinking less alcohol. Conclusions: Given the demonstrated difference between LBP-resilient participants and controls without momentary LBP, the question arises as to what additional knowledge can be gained. Three underlying traits appear to be relevant to LBP resilience: personality, favourable work conditions, and subjective attitudes/attributions towards health. These factors should be considered with respect to LBP prevention.
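A rough sketch of the propensity-score matching step mentioned above: estimate each participant's propensity of belonging to the LBP-resilient group from covariates with logistic regression, then match each resilient participant to the nearest control on that score before comparing groups. The covariates and data are simulated and hypothetical, not the survey's actual matching variables.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(11)
n = 400
age = rng.uniform(50, 65, n)
workload = rng.normal(0, 1, n)
resilient = rng.binomial(1, 1 / (1 + np.exp(-(-6 + 0.1 * age - 0.5 * workload))))

# 1) Propensity of being in the LBP-resilient group given the covariates.
X = sm.add_constant(np.column_stack([age, workload]))
ps = sm.Logit(resilient, X).fit(disp=0).predict(X)

# 2) Match each resilient participant to the nearest control on the score (1:1).
treated, controls = np.where(resilient == 1)[0], np.where(resilient == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[controls].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched_controls = controls[idx.ravel()]

# 3) Compare groups on the matched sample (here just a covariate balance check).
print("mean age, resilient:", age[treated].mean().round(1),
      "| matched controls:", age[matched_controls].mean().round(1))
```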
Abstract:
BACKGROUND AND PURPOSE An inverse relationship between onset-to-door time (ODT) and door-to-needle time (DNT) in stroke thrombolysis has been reported from various registries. We analyzed this relationship and other determinants of DNT in dedicated stroke centers. METHODS Prospectively collected data of consecutive ischemic stroke patients from 10 centers who received IV thrombolysis within 4.5 hours of symptom onset were merged (n=7106). DNT was analyzed as a function of demographic and prehospital variables using regression analyses, and change over time was considered. RESULTS In 6348 eligible patients with known treatment delays, median DNT was 42 minutes and decreased steeply every year (P<0.001). A median DNT of 55 minutes was observed in patients with ODT ≤30 minutes, whereas it declined for patients presenting within the last 30 minutes of the 3-hour time window (median, 33 minutes) and of the 4.5-hour time window (20 minutes). For ODT within the first 30 minutes of the extended time window (181-210 minutes), DNT increased to 42 minutes. DNT was stable for ODT between 30 and 150 minutes (40-45 minutes). We found a weak inverse overall correlation between ODT and DNT (R²=-0.12; P<0.001), but it was strong in patients treated between 3 and 4.5 hours (R²=-0.75; P<0.001). ODT was independently inversely associated with DNT (P<0.001) in regression analysis. Octogenarians and women tended to have longer DNT. CONCLUSIONS DNT has decreased steeply over recent years in dedicated stroke centers; however, significant oscillations in in-hospital treatment delays occurred at both ends of the time window. This suggests that further improvements can be achieved, particularly in the elderly.
Abstract:
Aims: The aim of this study was to identify predictors of adverse events among patients with ST-elevation myocardial infarction (STEMI) undergoing contemporary primary percutaneous coronary intervention (PCI). Methods and results: Individual data of 2,655 patients from two primary PCI trials (EXAMINATION, N=1,504; COMFORTABLE AMI, N=1,161) with identical endpoint definitions and event adjudication were pooled. Predictors of all-cause death or any reinfarction, definite stent thrombosis (ST), and target lesion revascularisation (TLR) at one year were identified by multivariable Cox regression analysis. Killip class III or IV was the strongest predictor of all-cause death or any reinfarction (OR 5.11, 95% CI: 2.48-10.52), definite ST (OR 7.74, 95% CI: 2.87-20.93), and TLR (OR 2.88, 95% CI: 1.17-7.06). Impaired left ventricular ejection fraction (OR 4.77, 95% CI: 2.10-10.82), final TIMI flow 0-2 (OR 1.93, 95% CI: 1.05-3.54), arterial hypertension (OR 1.69, 95% CI: 1.11-2.59), age (OR 1.68, 95% CI: 1.41-2.01), and peak CK (OR 1.25, 95% CI: 1.02-1.54) were independent predictors of all-cause death or any reinfarction. Allocation to treatment with DES was an independent predictor of a lower risk of definite ST (OR 0.35, 95% CI: 0.16-0.74) and any TLR (OR 0.34, 95% CI: 0.21-0.54). Conclusions: Killip class remains the strongest predictor of all-cause death or any reinfarction among STEMI patients undergoing primary PCI. DES use independently predicts a lower risk of TLR and definite ST compared with BMS. The COMFORTABLE AMI trial is registered at: http://www.clinicaltrials.gov/ct2/show/NCT00962416. The EXAMINATION trial is registered at: http://www.clinicaltrials.gov/ct2/show/NCT00828087.
Abstract:
OBJECTIVE To investigate the long-term prognostic implications of coronary calcification in patients undergoing percutaneous coronary intervention for obstructive coronary artery disease. METHODS Patient-level data from 6296 patients enrolled in seven clinical drug-eluting stent trials were analysed; the presence of severe coronary calcification in angiographic images was identified by an independent academic research organisation (Cardialysis, Rotterdam, The Netherlands). Clinical outcomes at 3-year follow-up, including all-cause mortality, death-myocardial infarction (MI), and the composite endpoint of all-cause death-MI-any revascularisation, were compared between patients with and without severe calcification. RESULTS Severe calcification was detected in 20% of the studied population. Patients with severe lesion calcification were less likely to have undergone complete revascularisation (48% vs 55.6%, p<0.001) and had increased mortality compared with those without severely calcified arteries (10.8% vs 4.4%, p<0.001). The event rate was also higher in patients with severely calcified lesions for the combined endpoints of death-MI (22.9% vs 10.9%; p<0.001) and death-MI-any revascularisation (31.8% vs 22.4%; p<0.001). On multivariate Cox regression analysis including the Syntax score, the presence of severe coronary calcification was an independent predictor of poor prognosis (HR: 1.33, 95% CI 1.00 to 1.77, p=0.047 for death; 1.23, 95% CI 1.02 to 1.49, p=0.031 for death-MI; and 1.18, 95% CI 1.01 to 1.39, p=0.042 for death-MI-any revascularisation), but it was not associated with an increased risk of stent thrombosis. CONCLUSIONS Patients with severely calcified lesions have worse clinical outcomes than those without severe coronary calcification. Severe coronary calcification appears to be an independent predictor of worse prognosis and should be considered a marker of advanced atherosclerosis.