897 results for multivariable regression
Abstract:
Background: The goal of this study was to retrospectively analyze a cohort of 136 patients who underwent dental implant placement in the posterior maxilla at the University of Connecticut Health Center to assess and identify predictors for implant failure in the posterior maxilla. Methods: Data were retrieved from patient charts to identify subjects older than 21 years of age who received dental implant(s) in the posterior maxilla. Patients without a postoperative baseline radiograph were excluded. A recall radiograph was taken 3 to 6 months after implant placement. If there was no recall radiograph, the subject was contacted for a recall visit that included a clinical evaluation and radiographs to determine the implant status. Based on a univariate screening, variables considered potential implant failure predictors included gender, diabetes, smoking, implant length, implant diameter, membrane use, sinus-elevation technique, and surgical complications. These parameters were further assessed, and a multivariable logistic regression was performed with implant failure as a dependent variable. All tests of significance were evaluated at the 0.05 error level. Results: Two hundred seventy-three implants were placed in the posterior maxilla. Fourteen implants failed (early and late failures combined), resulting in a 94.9% overall survival rate. The survival rates for the sinus-elevation group and native bone group were 92.2% and 96.7%, respectively (P = 0.090). Based on the multivariable analysis, sinus floor-elevation procedures were not associated with increased risk for implant failure (P = 0.702). In contrast, smoking and surgical complications had a statistically significant effect on implant failure; the odds ratios for implant failure were 6.4 (P = 0.025) and 8.2 (P = 0.004), respectively. Conclusion: Sinus-elevation procedures with simultaneous or staged implant placement do not increase the risk for implant failure, whereas smoking and surgical complications markedly increase the risk for implant failure.
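As an illustration of the kind of multivariable logistic regression described in this abstract, the sketch below fits a logistic model with implant failure as the outcome and reports odds ratios with 95% confidence intervals. It uses Python with statsmodels on simulated data; all column names are hypothetical stand-ins, not the study's dataset.

```python
# Minimal sketch of a multivariable logistic regression with implant failure
# as the outcome; data and column names are hypothetical, not the study's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 273  # one row per implant
df = pd.DataFrame({
    "failure": rng.integers(0, 2, n),
    "smoking": rng.integers(0, 2, n),
    "complication": rng.integers(0, 2, n),
    "sinus_lift": rng.integers(0, 2, n),
    "implant_length": rng.choice([8, 10, 11.5, 13], n),
})

fit = smf.logit("failure ~ smoking + complication + sinus_lift + implant_length",
                data=df).fit(disp=False)

# Odds ratios with 95% confidence intervals, the quantities reported above.
odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int())
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
print(fit.pvalues)
```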
Abstract:
PURPOSE: To assess the effect of stent type on hypotension and bradycardia after carotid artery stent placement. MATERIALS AND METHODS: A retrospective analysis of a prospectively maintained database was conducted in 256 patients (126 men; mean age, 71.8 years +/- 8.6; 194 de novo lesions) undergoing carotid artery stent placement between January 1996 and January 2007 by using self-expanding stents. Braided Elgiloy stents (Wallstents) were used in 44 of the 256 patients (17.2%) and slotted-tube nitinol stents were deployed in 212 (82.8%). Bivariate and multivariable logistic regression models were used to determine the influence of stent design on procedural and 24-hour hypotension and bradycardia. RESULTS: Procedural hemodynamic depression (HD) was encountered in 73 of the 256 patients (28.5%) due to hypotension in 24 (9.4%), bradycardia in 12 (4.7%), or both in 37 (14.5%) patients. Rates of procedural hypotension were 11.3% with nitinol stents and 0% with braided Elgiloy stents (P = .0188). Persistent postprocedural HD occurred in 91 of the 256 patients (35.5%) due to hypotension in 40 patients (15.6%), bradycardia in 23 (9.0%), or both in 28 (10.9%). Within a multivariable analysis adjusted for clinically relevant factors affecting rates of HD, the use of braided Elgiloy stents was associated with a decreased rate of procedural hypotension (odds ratio: 0.165; 95% confidence interval: 0.038, 0.721; P = .017). Procedural hypotension and bradycardia were not correlated with the incidence of major adverse events but were associated with an increased duration of hospital stay (P = .0059 and P = .0335, respectively). CONCLUSIONS: Nitinol stents are associated with a higher risk of hypotension as compared to braided Elgiloy stents during carotid artery stent placement.
Abstract:
PURPOSE: This retrospective study was conducted to determine whether a low-volume contrast medium protocol provides sufficient enhancement for 64-detector computed tomography angiography (CTA) in patients with aortoiliac aneurysms. METHODS: Evaluated were 45 consecutive patients (6 women; mean age, 72 +/- 6 years) who were referred for aortoiliac computed tomography angiography between October 2005 and January 2007. Group A (22 patients; creatinine clearance, 64.2 +/- 8.1 mL/min) received 50 mL of the contrast agent. Group B (23 patients; creatinine clearance, 89.4 +/- 7.3 mL/min) received 100 mL of the contrast agent. The injection rate was 3.5 mL/s, followed by 30 mL of saline at 3.5 mL/s. Studies were performed on the same 64-detector computed tomography scanner using a real-time bolus-tracking technique. Quantitative analysis was performed by determination of mean vascular attenuation at 10 regions of interest from the suprarenal aorta to the common femoral artery by one reader blinded to type and amount of contrast agent and compared using the Student t test. Image quality according to a 4-point scale was assessed in consensus by two readers blinded to type and amount of contrast medium and compared using the Mann-Whitney test. Multivariable adjustments were performed using ordinal regression analysis. RESULTS: Mean total attenuation did not differ significantly between both groups (196.5 +/- 33.0 Hounsfield unit [HU] in group A and 203.1 +/- 44.2 HU in group B; P = .57 by univariate and P > .05 by multivariable analysis). Accordingly, attenuation at each region of interest was not significantly different (P > .35). Image quality was excellent or good in all patients. No significant differences in visual assessment were found comparing both contrast medium protocols (P > .05 by univariate and by multivariable analysis). CONCLUSIONS: Aortoiliac aneurysm imaging can be performed with substantially reduced amounts of contrast medium using 64-detector computed tomography angiography technology.
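The multivariable adjustment here uses ordinal regression on a 4-point image-quality scale. A minimal sketch of such a proportional-odds model in Python with statsmodels, on simulated data with hypothetical column names (not the study's data), could look like this:

```python
# Sketch of an ordinal (proportional-odds) regression of an image quality
# score on contrast protocol; data and column names are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 45
df = pd.DataFrame({
    "quality": rng.integers(2, 5, n),               # ordinal score, 1 = poor ... 4 = excellent
    "low_volume_protocol": rng.integers(0, 2, n),   # 1 = 50 mL group, 0 = 100 mL group
    "creatinine_clearance": rng.normal(75, 12, n),
})

model = OrderedModel(
    df["quality"],
    df[["low_volume_protocol", "creatinine_clearance"]],
    distr="logit",
)
fit = model.fit(method="bfgs", disp=False)
print(fit.summary())
```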
Abstract:
GOAL OF THE WORK: Anemia is a common side effect of chemotherapy. Limited information exists about its incidence and risk factors. The objective of this study was to evaluate the incidence of anemia and risk factors for anemia occurrence in patients with early breast cancer who received adjuvant chemotherapy. MATERIALS AND METHODS: We evaluated risk factors for anemia in pre- and post/perimenopausal patients with lymph node-positive early breast cancer treated with adjuvant chemotherapy in two randomized trials. All patients received four cycles of doxorubicin and cyclophosphamide (AC) followed by three cycles of cyclophosphamide, methotrexate, fluorouracil (CMF). Anemia incidence was related to baseline risk factors. Multivariable analysis used logistic and Cox regression. MAIN RESULTS: Among the 2,215 available patients, anemia was recorded in 11% during adjuvant chemotherapy. Grade 2 and 3 anemia occurred in 4 and 1% of patients, respectively. Pretreatment hemoglobin and white blood cells (WBC) were significant predictors of anemia. Adjusted odds ratios (logistic regression) comparing highest versus lowest quartiles were 0.18 (P < 0.0001) for hemoglobin and 0.52 (P = 0.0045) for WBC. Age, surgery type, platelets, body mass index, and length of time from surgery to chemotherapy were not significant predictors. Cox regression results looking at time to anemia were similar. CONCLUSIONS: Moderate or severe anemia is rare among patients treated with AC followed by CMF. Low baseline hemoglobin and WBC are associated with a higher risk of anemia.
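The time-to-anemia analysis above uses Cox regression. A minimal sketch with the lifelines package (an assumption about tooling, not something named in the trials) on simulated data with hypothetical column names:

```python
# Sketch of a Cox regression for time to first anemia episode; the lifelines
# package and all column names are assumptions, not the trials' actual data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 2215
df = pd.DataFrame({
    "weeks_to_anemia": rng.exponential(60, n),      # follow-up time
    "anemia": rng.integers(0, 2, n),                # 1 = anemia observed
    "baseline_hemoglobin": rng.normal(13.5, 1.0, n),
    "baseline_wbc": rng.normal(6.5, 1.5, n),
    "age": rng.normal(49, 8, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks_to_anemia", event_col="anemia")
# Exponentiated coefficients are hazard ratios, the time-to-event analogue
# of the adjusted odds ratios from the logistic analysis.
print(np.exp(cph.params_))
cph.print_summary()
```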
Abstract:
Respiratory infections cause considerable morbidity during infancy. The impact of innate immunity mechanisms, such as mannose-binding lectin (MBL), on respiratory symptoms remains unclear. The aims of this study were to investigate whether cord blood MBL levels are associated with respiratory symptoms during infancy and to determine the relative contribution of MBL when compared with known risk factors. This is a prospective birth cohort study including 185 healthy term infants. MBL was measured in cord blood and categorized into tertiles. Frequency and severity of respiratory symptoms were assessed weekly until age one. Association with MBL levels was analysed using multivariable random effects Poisson regression. We observed a trend towards an increased incidence rate of severe respiratory symptoms in infants in the low MBL tertile when compared with infants in the middle MBL tertile [incidence rate ratio (IRR) = 1.59; 95% confidence interval (CI): 0.95-2.66; p = 0.076]. Surprisingly, infants in the high MBL tertile suffered significantly more from severe and total respiratory symptoms than infants in the middle MBL tertile (IRR = 1.97; 95% CI: 1.20-3.25; p = 0.008). This association was pronounced in infants of parents with asthma (IRR = 3.64; 95% CI: 1.47-9.02; p = 0.005). The relative risk associated with high MBL was similar to the risk associated with well-known risk factors such as maternal smoking or childcare. In conclusion, the association between low MBL levels and increased susceptibility to common respiratory infections during infancy was weaker than that previously reported. Instead, high cord blood MBL levels may represent a so far unrecognized risk factor for respiratory morbidity in infants of asthmatic parents.
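The analysis above uses a multivariable random-effects Poisson regression for repeated weekly symptom counts. The sketch below swaps in a GEE Poisson model with an exchangeable working correlation as a simpler stand-in for the within-infant correlation, and reports exponentiated coefficients as incidence rate ratios relative to the middle MBL tertile. Data and names are simulated and hypothetical.

```python
# Sketch of a repeated-measures Poisson model for weekly symptom counts.
# A GEE with an exchangeable working correlation stands in for the
# random-effects model used in the study; all data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_infants, n_weeks = 185, 52
df = pd.DataFrame({
    "infant_id": np.repeat(np.arange(n_infants), n_weeks),
    "week": np.tile(np.arange(n_weeks), n_infants),
    "mbl_tertile": np.repeat(rng.choice(["low", "middle", "high"], n_infants), n_weeks),
    "symptoms": rng.poisson(0.3, n_infants * n_weeks),
})

model = smf.gee(
    "symptoms ~ C(mbl_tertile, Treatment(reference='middle'))",
    groups="infant_id",
    data=df,
    family=sm.families.Poisson(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
fit = model.fit()
# Exponentiated coefficients are incidence rate ratios (IRR) relative to
# the middle tertile, mirroring the IRRs quoted in the abstract.
print(np.exp(fit.params))
```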
Abstract:
Background mortality is an essential component of any forest growth and yield model. Forecasts of mortality contribute substantially to the variability and accuracy of model predictions at the tree, stand and forest level. In the present study, I implement and evaluate state-of-the-art techniques to increase the accuracy of individual tree mortality models, similar to those used in many of the current variants of the Forest Vegetation Simulator, using data from North Idaho and Montana. The first technique addresses methods to correct for bias induced by measurement error typically present in competition variables. The second implements survival regression and evaluates its performance against the traditional logistic regression approach. I selected the regression calibration (RC) algorithm as a good candidate for addressing the measurement error problem. Two logistic regression models for each species were fitted, one ignoring the measurement error, which is the “naïve” approach, and the other applying RC. The models fitted with RC outperformed the naïve models in terms of discrimination when the competition variable was found to be statistically significant. The effect of RC was more obvious where measurement error variance was large and for more shade-intolerant species. The process of model fitting and variable selection revealed that past emphasis on DBH as a predictor variable for mortality, while producing models with strong metrics of fit, may make models less generalizable. The evaluation of the error variance estimator developed by Stage and Wykoff (1998), which is core to the implementation of RC, across different spatial patterns and diameter distributions revealed that the Stage and Wykoff estimate notably overestimated the true variance in all simulated stands except those that were clustered. Results show a systematic bias even when all the assumptions made by the authors are met. I argue that this is the result of the Poisson-based estimate ignoring the overlapping area of potential plots around a tree. The effects of the variance estimate, especially in the application phase, justify future efforts to improve its accuracy. The second technique implemented and evaluated is a survival regression model that accounts for the time-dependent nature of variables, such as diameter and competition variables, and the interval-censored nature of data collected from remeasured plots. The performance of the model is compared with the traditional logistic regression model as a tool to predict individual tree mortality. Validation of both approaches shows that the survival regression approach discriminates better between dead and alive trees for all species. In conclusion, I showed that the proposed techniques do increase the accuracy of individual tree mortality models, and are a promising first step towards the next generation of background mortality models. I have also identified the next steps to undertake in order to advance mortality models further.
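To illustrate the interval-censored survival approach the thesis advocates (as opposed to logistic regression on a binary died/survived indicator), the sketch below fits a Weibull accelerated-failure-time model to simulated remeasurement data by direct maximum likelihood. It is a generic illustration under stated assumptions, not the thesis models; all variables and the remeasurement schedule are invented.

```python
# Sketch of an interval-censored Weibull survival regression for individual
# tree mortality, fitted by direct maximum likelihood; data are simulated.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n = 500
dbh = rng.normal(30, 8, n)                  # diameter at breast height (cm)
comp = rng.normal(0, 1, n)                  # competition index
X = np.column_stack([np.ones(n), dbh, comp])

# "True" model used only to simulate interval-censored remeasurement data.
true_beta, true_shape = np.array([3.0, 0.03, -0.4]), 1.3
t_death = rng.weibull(true_shape, n) * np.exp(X @ true_beta)

obs_times = np.array([10.0, 20.0])          # remeasurement schedule (years)
died = t_death <= obs_times[-1]
lower = np.where(t_death <= obs_times[0], 0.0, obs_times[0])           # last time seen alive
upper = np.where(t_death <= obs_times[0], obs_times[0], obs_times[1])  # first time seen dead

def weibull_surv(t, scale, shape):
    return np.exp(-(t / scale) ** shape)

def neg_log_lik(params):
    beta, shape = params[:-1], np.exp(params[-1])
    scale = np.exp(X @ beta)
    s_lower = weibull_surv(lower, scale, shape)
    s_upper = weibull_surv(upper, scale, shape)
    # Dead trees contribute S(lower) - S(upper); survivors are right-censored
    # at the last remeasurement and contribute S(20).
    contrib = np.where(died, s_lower - s_upper,
                       weibull_surv(obs_times[-1], scale, shape))
    return -np.sum(np.log(np.clip(contrib, 1e-300, None)))

start = np.array([3.0, 0.0, 0.0, 0.0])      # beta (intercept, dbh, comp), log shape
res = minimize(neg_log_lik, start, method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-8})
print("estimated beta:", res.x[:-1], "shape:", np.exp(res.x[-1]))
print("true beta:", true_beta, "shape:", true_shape)
```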
Abstract:
OBJECTIVES: The incidence distribution of triage advice in the medical call centre Medi24 and the pattern of service utilisation were analysed with respect to two groups of callers with different insurance schemes. Individuals having contracted insurance of the Medi24 model could use the telephone consultation service of the medical call centre Medi24 (mainly part of the mandatory basic health insurance) voluntarily and free of charge, whereas individuals holding an insurance policy of the Telmed model (special contract within the mandatory basic health insurance with a premium discount ranging from 8% to 12%) were obliged to have a telephone consultation before arranging an appointment with a medical doctor. METHODS: A cross-sectional study was carried out in the medical call centre Medi24 based on all triage datasets of the Medi24 and Telmed groups collected during the one-year period from July 1st 2005 to June 30th 2006. The distribution of the six different urgency levels within the two groups and their respective pattern of service utilisation were determined. In a multivariable logistic regression model the odds ratio for every enquiry originating from the Telmed group versus those originating from the Medi24 group was calculated. RESULTS: During the one-year period, 48 388 triage requests reached the medical call centre Medi24; 56% derived from the Telmed group and 44% from the Medi24 group. Within the Medi24 group, more than 25% of the individuals received self-care advice; within the Telmed group, on the other hand, only about 18% received such advice. In contrast, 27% of the Telmed triage requests but only 18% of the Medi24 triage requests resulted in the advice to make a routine appointment with a medical doctor. The probability that an individual of the Telmed group obtained the advice to go to the accident and emergency department was lower than for an individual of the Medi24 group (OR 0.77, 95% CI 0.60-0.99). Likewise, the probability of self-care advice was lower than in the Medi24 group (OR 0.80, 95% CI 0.75-0.85). However, regarding the advice to make a routine appointment with a medical doctor, the Telmed group was represented more frequently than the Medi24 group (OR 1.36, 95% CI 1.28-1.44). CONCLUSION: With respect to the triage advice, the Telmed group differed significantly from the Medi24 group within all urgency levels. The differences between the two groups with respect to the advice given were still less pronounced than expected against the background of their different contract conditions and the disparate temporal pattern of utilisation. We interpret this finding as indicating that appropriately appraising the urgency of health problems seems to be very difficult for the majority of people seeking advice.
Abstract:
OBJECTIVE: To obtain precise information on the optimal time window for surgical antimicrobial prophylaxis. SUMMARY BACKGROUND DATA: Although perioperative antimicrobial prophylaxis is a well-established strategy for reducing the risk of surgical site infections (SSI), the optimal timing for this procedure has yet to be precisely determined. Under today's recommendations, antibiotics may be administered within the final 2 hours before skin incision, ideally as close to incision time as possible. METHODS: In this prospective observational cohort study at Basel University Hospital, we analyzed the incidence of SSI by the timing of antimicrobial prophylaxis in a consecutive series of 3836 surgical procedures. Surgical wounds and resulting infections were assessed according to Centers for Disease Control and Prevention standards. Antimicrobial prophylaxis consisted of single-shot administration of 1.5 g of cefuroxime (plus 500 mg of metronidazole in colorectal surgery). RESULTS: The overall SSI rate was 4.7% (180 of 3836). In 49% of all procedures, antimicrobial prophylaxis was administered within the final half hour. Multivariable logistic regression analyses showed a significant increase in the odds of SSI when antimicrobial prophylaxis was administered less than 30 minutes (crude odds ratio = 2.01; adjusted odds ratio = 1.95; 95% confidence interval, 1.4-2.8; P < 0.001) or 120 to 60 minutes (crude odds ratio = 1.75; adjusted odds ratio = 1.74; 95% confidence interval, 1.0-2.9; P = 0.035) before incision, as compared with the reference interval of 59 to 30 minutes before incision. CONCLUSIONS: When cefuroxime is used as a prophylactic antibiotic, administration 59 to 30 minutes before incision is more effective than administration during the last half hour.
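The analysis compares timing categories against a reference interval (59 to 30 minutes before incision) and reports crude and adjusted odds ratios. A minimal sketch of that structure in Python with statsmodels, on simulated data with hypothetical timing categories and adjustment covariates:

```python
# Sketch of crude vs. adjusted odds ratios for SSI by timing category, with
# 30-59 minutes before incision as the reference interval. The data, the
# category labels and the adjustment covariates are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 3836
df = pd.DataFrame({
    "ssi": rng.binomial(1, 0.05, n),
    "timing": rng.choice(["<30 min", "30-59 min", "60-120 min"], n),
    "wound_class": rng.integers(1, 5, n),
    "duration_hours": rng.normal(2.0, 1.0, n),
})

formula_crude = "ssi ~ C(timing, Treatment(reference='30-59 min'))"
formula_adj = formula_crude + " + C(wound_class) + duration_hours"

crude = smf.logit(formula_crude, data=df).fit(disp=False)
adjusted = smf.logit(formula_adj, data=df).fit(disp=False)

# Crude and adjusted odds ratios relative to the 30-59 minute window.
print(np.exp(crude.params))
print(np.exp(adjusted.params))
```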
Abstract:
In this thesis, we consider Bayesian inference for the detection of variance change points in models with scale mixtures of normal (SMN) distributions. This class of distributions is symmetric and heavy-tailed and includes as special cases the Gaussian, Student-t, contaminated normal, and slash distributions. The proposed models provide greater flexibility for analyzing practical data, which often show heavy tails and may not satisfy the normality assumption. For the Bayesian analysis, we specify prior distributions for the unknown parameters in the variance change-point models with SMN distributions. Due to the complexity of the joint posterior distribution, we propose an efficient Gibbs-type sampling algorithm with Metropolis-Hastings steps for posterior Bayesian inference. Thereafter, following the idea of [1], we consider the problems of single and multiple change-point detection. The performance of the proposed procedures is illustrated and analyzed by simulation studies. A real application to closing price data from the U.S. stock market is analyzed for illustrative purposes.
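A simplified sketch of Gibbs sampling for a single variance change point, restricted to the Gaussian special case of the SMN family (the full models in the thesis add latent mixing variables and Metropolis-Hastings steps), on simulated data:

```python
# Gibbs sampler for a single variance change point in zero-mean Gaussian data:
# y_t ~ N(0, sig1) for t <= k and y_t ~ N(0, sig2) for t > k, with
# inverse-gamma priors on the variances and a uniform prior on k.
import numpy as np

rng = np.random.default_rng(6)
n, true_k = 200, 120
y = np.concatenate([rng.normal(0, 1.0, true_k), rng.normal(0, 3.0, n - true_k)])

a0, b0 = 2.0, 1.0                      # inverse-gamma prior hyperparameters
n_iter, burn_in = 5000, 1000
k = n // 2                             # initial change-point position
cands = np.arange(1, n)                # candidates keeping both segments non-empty
sq = np.cumsum(y ** 2)
k_draws = []

for it in range(n_iter):
    # 1. Segment variances from their inverse-gamma full conditionals.
    s1_k, s2_k = sq[k - 1], sq[-1] - sq[k - 1]
    sig1 = 1.0 / rng.gamma(a0 + k / 2, 1.0 / (b0 + 0.5 * s1_k))
    sig2 = 1.0 / rng.gamma(a0 + (n - k) / 2, 1.0 / (b0 + 0.5 * s2_k))
    # 2. Change point from its discrete full conditional.
    s1 = sq[:-1]                       # sum of y_t^2 for t <= candidate
    s2 = sq[-1] - s1
    log_post = (-0.5 * cands * np.log(2 * np.pi * sig1) - 0.5 * s1 / sig1
                - 0.5 * (n - cands) * np.log(2 * np.pi * sig2) - 0.5 * s2 / sig2)
    probs = np.exp(log_post - log_post.max())
    k = rng.choice(cands, p=probs / probs.sum())
    if it >= burn_in:
        k_draws.append(k)

print("posterior mean change point:", np.mean(k_draws), "(true:", true_k, ")")
```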
Abstract:
This morning Dr. Battle will introduce descriptive statistics and linear regression and how to apply these concepts in mathematical modeling. You will also learn how to use a spreadsheet to help with statistical analysis and to create graphs.
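The session itself uses a spreadsheet; the sketch below shows the equivalent steps (descriptive statistics, a least-squares line, and a graph) in Python on a small simulated dataset, purely as an illustration of the concepts.

```python
# Descriptive statistics and simple linear regression, the Python equivalent
# of the spreadsheet exercise; the dataset is simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
df = pd.DataFrame({"x": np.linspace(0, 10, 50)})
df["y"] = 2.0 + 1.5 * df["x"] + rng.normal(0, 1.0, len(df))

print(df.describe())                   # mean, std, quartiles, min/max

fit = smf.ols("y ~ x", data=df).fit()  # least-squares line y = a + b*x
print(fit.params, fit.rsquared)

plt.scatter(df["x"], df["y"], label="data")
plt.plot(df["x"], fit.fittedvalues, color="red", label="fitted line")
plt.legend()
plt.show()
```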
Abstract:
BACKGROUND: Tenofovir (TDF) use has been associated with proximal renal tubulopathy, reduced calculated glomerular filtration rates (cGFR) and losses in bone mineral density. Bone resorption could result in a compensatory osteoblast activation indicated by an increase in serum alkaline phosphatase (sAP). A few small studies have reported a positive correlation between renal phosphate losses, increased bone turnover and sAP. METHODS: We analysed sAP dynamics in patients initiating (n = 657), reinitiating (n = 361) and discontinuing (n = 73) combined antiretroviral therapy with and without TDF and assessed correlations with clinical and epidemiological parameters. RESULTS: TDF use was associated with a significant increase in sAP from a median of 74 U/l (interquartile range 60-98) to a plateau of 99 U/l (82-123) after 6 months (P < 0.0001), with a prompt return to baseline upon TDF discontinuation. No change occurred in TDF-sparing regimens. Univariable and multivariable linear regression analyses revealed a positive correlation between sAP and TDF use (P < or = 0.003), but no correlation with baseline cGFR, TDF-related cGFR reduction, changes in serum alanine aminotransferase (sALT) or active hepatitis C. CONCLUSIONS: We document a highly significant association between TDF use and increased sAP in a large observational cohort. The lack of correlation between TDF use and sALT suggests that the increase in sAP is due to the bone isoenzyme and indicates stimulated bone turnover. Together with published data on TDF-related renal phosphate losses, this finding raises concerns that TDF use could result in osteomalacia with a loss in bone mineral density, at least in a subset of patients. This potentially severe long-term toxicity should be addressed in future studies.
Abstract:
AIMS: No-reflow after a primary percutaneous coronary intervention (PCI) is associated with a high incidence of left ventricular (LV) failure and a poor prognosis. Endothelin-1 (ET-1) is a potent endothelium-derived vasoconstrictor peptide and an important modulator of neutrophil function. Elevated systemic ET-1 levels have recently been reported to predict a poor prognosis in patients with acute myocardial infarction (AMI) treated by primary PCI. We aimed to investigate the relationship between systemic ET-1 plasma levels and no-reflow in a group of AMI patients treated by primary PCI. METHODS AND RESULTS: A group of 51 patients (age 59+/-9.9 years, 44 males) with a first AMI, undergoing successful primary or rescue PCI, were included in the study. Angiographic no-reflow was defined as coronary TIMI flow grade < or =2 or TIMI flow 3 with a final myocardial blush grade < or =2. Blood samples were obtained from all patients on admission for measurement of ET-1 levels. No-reflow was observed in 31 patients (61%). Variables associated with no-reflow at univariate analysis included culprit lesion of the left anterior descending coronary artery (LAD) (67 vs. 29%, P=0.006) and ET-1 plasma levels (3.95+/-0.7 vs. 3.3+/-0.8 pg/mL, P=0.004). At multivariable logistic regression analysis, ET-1 (P=0.03) and LAD as the culprit vessel (P=0.04) were the only significant predictors of no-reflow. CONCLUSION: ET-1 plasma levels predict angiographic no-reflow after successful primary or rescue PCI. These findings suggest that ET-1 antagonists might be beneficial in the management of no-reflow.
Abstract:
OBJECTIVES: The STAndards for Reporting studies of Diagnostic accuracy (STARD) for investigators and editors and the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) for reviewers and readers offer guidelines for the quality and reporting of test accuracy studies. These guidelines address and propose some solutions to two major threats to validity: spectrum bias and test review bias. STUDY DESIGN AND SETTING: Using a clinical example, we demonstrate that these solutions fail and propose an alternative solution that concomitantly addresses both sources of bias. We also derive formulas that prove the generality of our arguments. RESULTS: A logical extension of our ideas is to extend STARD item 23 by adding a requirement for multivariable statistical adjustment using information collected in QUADAS items 1, 2, and 12 and STARD items 3-5, 11, 15, and 18. CONCLUSION: We recommend reporting not only variation of diagnostic accuracy across subgroups (STARD item 23) but also the effects of the multivariable adjustments on test performance. We also suggest that the QUADAS be supplemented by an item addressing the appropriateness of statistical methods, in particular whether multivariable adjustments have been included in the analysis.
Abstract:
OBJECTIVES: This paper is concerned with checking goodness-of-fit of binary logistic regression models. For the practitioners of data analysis, the broad classes of procedures for checking goodness-of-fit available in the literature are described. The challenges of model checking in the context of binary logistic regression are reviewed. As a viable solution, a simple graphical procedure for checking goodness-of-fit is proposed. METHODS: The graphical procedure proposed relies on pieces of information available from any logistic analysis; the focus is on combining and presenting these in an informative way. RESULTS: The information gained using this approach is presented with three examples. In the discussion, the proposed method is put into context and compared with other graphical procedures for checking goodness-of-fit of binary logistic models available in the literature. CONCLUSION: A simple graphical method can significantly improve the understanding of any logistic regression analysis and help to prevent faulty conclusions.
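One common graphical goodness-of-fit check for a binary logistic model, shown here only as a generic example and not necessarily the procedure proposed in the paper, is a binned calibration plot comparing observed event fractions with mean predicted probabilities:

```python
# Generic graphical goodness-of-fit check for a binary logistic model:
# bin observations by predicted probability and compare observed event
# fractions with mean predicted probabilities. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
import matplotlib.pyplot as plt

rng = np.random.default_rng(8)
n = 1000
x1, x2 = rng.normal(size=n), rng.normal(size=n)
p_true = 1 / (1 + np.exp(-(-1.0 + 0.8 * x1 + 0.5 * x2)))
df = pd.DataFrame({"y": rng.binomial(1, p_true), "x1": x1, "x2": x2})

fit = smf.logit("y ~ x1 + x2", data=df).fit(disp=False)
df["p_hat"] = fit.predict(df)

# Bin by deciles of predicted risk and compare observed vs. predicted.
df["bin"] = pd.qcut(df["p_hat"], 10)
calib = df.groupby("bin", observed=True).agg(observed=("y", "mean"),
                                             predicted=("p_hat", "mean"))

plt.plot(calib["predicted"], calib["observed"], "o-", label="model")
plt.plot([0, 1], [0, 1], "--", color="grey", label="perfect calibration")
plt.xlabel("mean predicted probability")
plt.ylabel("observed event fraction")
plt.legend()
plt.show()
```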
Abstract:
AIMS: It is unclear whether transcatheter aortic valve implantation (TAVI) addresses an unmet clinical need for those currently rejected for surgical aortic valve replacement (SAVR) and whether there is a subgroup of high-risk patients benefiting more from TAVI compared to SAVR. In this two-centre, prospective cohort study, we compared baseline characteristics and 30-day mortality between TAVI and SAVR in consecutive patients undergoing invasive treatment for aortic stenosis. METHODS AND RESULTS: We pre-specified different adjustment methods to examine the effect of TAVI as compared with SAVR on overall 30-day mortality: crude univariable logistic regression analysis, multivariable analysis adjusted for baseline characteristics, analysis adjusted for propensity scores, propensity score matched analysis, and weighted analysis using the inverse probability of treatment (IPT) as weights. A total of 1,122 patients were included in the study: 114 undergoing TAVI and 1,008 patients undergoing SAVR. The crude mortality rate was greater in the TAVI group (9.6% vs. 2.3%), yielding an odds ratio [OR] of 4.57 (95% CI 2.17-9.65). Compared to patients undergoing SAVR, patients with TAVI were older, more likely to be in NYHA class III and IV, and had a considerably higher logistic EuroSCORE and more comorbid conditions. Adjusted OR depended on the method used to control for confounding and ranged from 0.60 (0.11-3.36) to 7.57 (0.91-63.0). We examined the distribution of propensity scores and found scores to overlap sufficiently only in a narrow range. In patients with sufficient overlap of propensity scores, adjusted OR ranged from 0.35 (0.04-2.72) to 3.17 (0.31-31.9). In patients with insufficient overlap, we consistently found increased odds of death associated with TAVI compared with SAVR irrespective of the method used to control confounding, with adjusted OR ranging from 5.88 (0.67-51.8) to 25.7 (0.88-750). Approximately one third of patients undergoing TAVI were found to be potentially eligible for a randomised comparison of TAVI versus SAVR. CONCLUSIONS: Both measured and unmeasured confounding limit the conclusions that can be drawn from observational comparisons of TAVI versus SAVR. Our study indicates that TAVI could be associated with either substantial benefits or harms. Randomised comparisons of TAVI versus SAVR are warranted.
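The propensity-score and inverse-probability-of-treatment (IPT) weighting steps mentioned in the abstract can be sketched as follows on simulated data; the covariate set and variable names are hypothetical, and the weighted model yields a point estimate only (robust standard errors would be needed for valid inference).

```python
# Sketch of propensity-score estimation and IPT weighting for a TAVI vs.
# SAVR comparison; all data, variable names and covariates are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n = 1122
df = pd.DataFrame({
    "age": rng.normal(80, 7, n),
    "euroscore": rng.normal(15, 8, n),
    "nyha34": rng.integers(0, 2, n),
})
# Simulated treatment assignment (TAVI vs. SAVR) and 30-day mortality.
ps_true = 1 / (1 + np.exp(-(-8 + 0.06 * df["age"] + 0.05 * df["euroscore"])))
df["tavi"] = rng.binomial(1, ps_true)
df["death30"] = rng.binomial(1, 0.03 + 0.02 * df["tavi"])

# 1. Propensity score: probability of receiving TAVI given baseline covariates.
ps_model = smf.logit("tavi ~ age + euroscore + nyha34", data=df).fit(disp=False)
df["ps"] = ps_model.predict(df)

# 2. IPT weights: 1/ps for treated, 1/(1 - ps) for controls.
df["iptw"] = np.where(df["tavi"] == 1, 1 / df["ps"], 1 / (1 - df["ps"]))

# 3. Weighted outcome model for 30-day mortality (point estimate only;
#    robust standard errors would be needed for valid inference).
weighted = smf.glm("death30 ~ tavi", data=df,
                   family=sm.families.Binomial(),
                   freq_weights=df["iptw"]).fit()
print("weighted odds ratio for TAVI:", np.exp(weighted.params["tavi"]))
```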