56 results for Regression model
Abstract:
Trabecular bone score (TBS) is a grey-level textural index of bone microarchitecture derived from lumbar spine dual-energy X-ray absorptiometry (DXA) images. TBS is a BMD-independent predictor of fracture risk. The objective of this meta-analysis was to determine whether TBS predicted fracture risk independently of FRAX probability and to examine their combined performance by adjusting the FRAX probability for TBS. We used individual-level data from 17,809 men and women in 14 prospective population-based cohorts. Baseline evaluation included TBS and the FRAX risk variables, and outcomes during follow-up (mean 6.7 years) comprised major osteoporotic fractures. The association between TBS, FRAX probabilities and the risk of fracture was examined using an extension of the Poisson regression model in each cohort and for each sex, and expressed as the gradient of risk (GR; hazard ratio per 1 SD change in the risk variable in the direction of increased risk). FRAX probabilities were adjusted for TBS using an adjustment factor derived from an independent cohort (the Manitoba Bone Density Cohort). Overall, the GR of TBS for major osteoporotic fracture was 1.44 (95% CI: 1.35-1.53) when adjusted for age and time since baseline, and was similar in men and women (p > 0.10). When additionally adjusted for the FRAX 10-year probability of major osteoporotic fracture, TBS remained a significant, independent predictor of fracture (GR 1.32, 95% CI: 1.24-1.41). The adjustment of FRAX probability for TBS resulted in a small increase in the GR (1.76, 95% CI: 1.65-1.87 vs. 1.70, 95% CI: 1.60-1.81). A smaller change in GR was observed for hip fracture (FRAX hip fracture probability GR 2.25 vs. 2.22). TBS is a significant predictor of fracture risk independently of FRAX. The findings support the use of TBS as a potential adjustment for FRAX probability, though the impact of the adjustment remains to be determined in the context of clinical assessment guidelines.
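To make the "gradient of risk" concrete, here is a minimal Python sketch of the core calculation, assuming a hypothetical DataFrame with columns `fracture`, `followup_years`, `tbs` and `age`; the paper's actual model is a more elaborate extension of Poisson regression fitted per cohort and per sex.

```python
# Minimal sketch, not the authors' code: gradient of risk (GR) as the
# hazard ratio per 1 SD change in TBS, via Poisson regression with a
# log person-time offset. `df` and its columns are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def gradient_of_risk(df: pd.DataFrame) -> float:
    # Standardize TBS and flip its sign so that one unit means
    # "1 SD in the direction of increased risk" (lower TBS = higher risk).
    z = (df["tbs"].mean() - df["tbs"]) / df["tbs"].std()
    X = sm.add_constant(pd.DataFrame({"tbs_z": z, "age": df["age"]}))
    model = sm.GLM(
        df["fracture"],                       # 0/1: major osteoporotic fracture
        X,
        family=sm.families.Poisson(),
        offset=np.log(df["followup_years"]),  # person-time at risk
    ).fit()
    return float(np.exp(model.params["tbs_z"]))  # GR per 1 SD
```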
Abstract:
BACKGROUND Erosive tooth wear is the irreversible loss of dental hard tissue as a result of chemical processes. When the surface of a tooth is attacked by acids, the resulting loss of structural integrity leaves a softened layer on the tooth's surface, which renders it vulnerable to abrasive forces. The authors' objective was to estimate the prevalence of erosive tooth wear and to identify associated factors in a sample of 14- to 19-year-old adolescents in Mexico. METHODS The authors performed a cross-sectional study on a convenience sample (N = 417) of adolescents in a school in Mexico City, Mexico. The authors used a questionnaire and an oral examination performed according to the Lussi index. RESULTS The prevalence of erosive tooth wear was 31.7% (10.8% with exposed dentin). The final logistic regression model included age (P < .01; odds ratio [OR], 1.64; 95% confidence interval [CI], 1.26-2.13), high intake of sweet carbonated drinks (P = .03; OR, 1.81; 95% CI, 1.06-3.07), and xerostomia (P = .04; OR, 2.31; 95% CI, 1.05-5.09). CONCLUSIONS Erosive tooth wear, mainly on the mandibular first molars, was associated with age, high intake of sweet carbonated drinks, and xerostomia. PRACTICAL IMPLICATIONS Knowledge regarding erosive tooth wear in adolescents with relatively few years of exposure to causal factors will increase the focus on effective preventive measures, the identification of people at high risk, and early treatment.
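The reported odds ratios and confidence intervals come from a standard logistic regression; a hedged sketch of that computation on synthetic stand-in data (all column names are illustrative, not the authors') could look like this:

```python
# Hedged sketch on synthetic data; column names are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 417  # sample size matching the study
df = pd.DataFrame({
    "age": rng.integers(14, 20, n),      # 14- to 19-year-olds
    "soda_high": rng.integers(0, 2, n),  # high intake of sweet carbonated drinks
    "xerostomia": rng.integers(0, 2, n),
})
# Simulated outcome with arbitrary effect sizes, for demonstration only.
logit_p = -8 + 0.5 * df["age"] + 0.6 * df["soda_high"] + 0.8 * df["xerostomia"]
df["erosion"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("erosion ~ age + soda_high + xerostomia", data=df).fit()
print(np.exp(model.params))      # odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals
```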
Abstract:
BACKGROUND Calcium disorders are common both in intensive care units and in patients with chronic kidney disease, and are associated with increased morbidity and mortality. It is unknown whether calcium abnormalities in unselected emergency department admissions have an impact on in-hospital mortality. METHODS This cross-sectional analysis included all admissions to the Emergency Department at the Inselspital Bern, Switzerland, from 2010 to 2011. For hyper- and hypocalcaemic patients, differences between subgroups divided by age, length of hospital stay, creatinine, sodium, chloride, phosphate, potassium and magnesium were compared with the Mann-Whitney U-test. Associations between calcium disorders and 28-day in-hospital mortality were assessed using the Cox proportional hazards regression model. RESULTS 8,270 patients with calcium measurements were included in our study. Overall, 264 (3.2%) patients died. 150 patients (6.13%) with hypocalcaemia and 7 patients (6.19%) with hypercalcaemia died, in contrast to 104 normocalcaemic patients (1.82%). In univariate analysis, calcium serum levels were associated with sex, mortality and pre-existing diuretic therapy (all p<0.05). In multivariate Cox regression analysis, hypocalcaemia and hypercalcaemia were independent risk factors for mortality (HR 2.00 and HR 1.88, respectively; both p<0.01). CONCLUSION Both hypocalcaemia and hypercalcaemia are associated with increased 28-day in-hospital mortality in unselected emergency department admissions.
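A Cox proportional hazards model of this kind can be sketched with the lifelines library; the DataFrame and its columns are hypothetical stand-ins for the study data:

```python
# Minimal sketch with the lifelines library; DataFrame columns are
# hypothetical: `days` = time to death or censoring (capped at 28),
# `died` = event indicator, plus 0/1 indicators for the calcium groups
# (normocalcaemia as the reference) and age as a covariate.
import pandas as pd
from lifelines import CoxPHFitter

def fit_mortality_model(df: pd.DataFrame) -> CoxPHFitter:
    cph = CoxPHFitter()
    cph.fit(
        df[["days", "died", "hypocalcaemia", "hypercalcaemia", "age"]],
        duration_col="days",
        event_col="died",
    )
    cph.print_summary()  # hazard ratios (exp(coef)) with 95% CIs
    return cph
```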
Abstract:
BACKGROUND Patients with electrolyte imbalances or disorders have a high risk of mortality. It is unknown whether this finding, established for sodium and potassium disorders, extends to alterations of magnesium levels. METHODS AND PATIENTS In this cross-sectional analysis, all emergency room patients between 2010 and 2011 at the Inselspital Bern, Switzerland, were included. A multivariable logistic regression model was used to assess the association between magnesium levels and in-hospital mortality up to 28 days. RESULTS A total of 22,239 subjects were screened for the study. A total of 5,339 patients had plasma magnesium concentrations measured at hospital admission and were included in the analysis. A total of 6.3% of the 352 patients with hypomagnesemia and 36.9% of the 151 patients with hypermagnesemia died. In a multivariate Cox regression model, hypermagnesemia (HR 11.6, p<0.001) was a strong independent risk factor for mortality. In these patients, diuretic therapy proved to be protective (HR 0.5, p=0.007). Hypomagnesemia was not associated with mortality (p>0.05). Age was an independent risk factor for mortality (p<0.001). CONCLUSION The study demonstrates a possible association between hypermagnesemia measured on admission to the emergency department and early in-hospital mortality.
Abstract:
BACKGROUND Phosphate imbalances or disorders carry a high risk of morbidity and mortality in patients with chronic kidney disease. It is unknown whether this finding extends to mortality in patients presenting at an emergency room with or without normal kidney function. METHODS AND PATIENTS This cross-sectional analysis included all emergency room patients between 2010 and 2011 at the Inselspital Bern, Switzerland. A multivariable Cox regression model was applied to assess the association between phosphate levels and in-hospital mortality up to 28 days. RESULTS 22,239 subjects were screened for the study. Plasma phosphate concentrations were measured in 2,390 patients on hospital admission, and these patients were included in the analysis. 3.5% of the 480 patients with hypophosphatemia and 10.7% of the 215 patients with hyperphosphatemia died. In univariate analysis, phosphate levels were associated with mortality, age, diuretic therapy and kidney function (all p<0.001). In a multivariate Cox regression model, hyperphosphatemia (OR 3.29, p<0.001) was a strong independent risk factor for mortality. Hypophosphatemia was not associated with mortality (p>0.05). CONCLUSION Hyperphosphatemia is associated with 28-day in-hospital mortality in an unselected cohort of patients presenting in an emergency room.
Abstract:
PURPOSE The purpose of this study was to analyze the removal of implant-supported crowns retained by three different cements using an air-accelerated crown remover and to evaluate the patients' response to the procedure. MATERIALS AND METHODS This controlled clinical trial was conducted with 21 patients (10 women, 11 men; mean age: 51 ± 10.2 years) who had received a total of 74 implants (all placed in the posterior zone of the mandible). Four months after implant surgery, the crowns were cemented on standard titanium abutments of different heights. Three different cements (two temporary: Harvard TEMP and Improv; one definitive: Durelon) were used and randomly assigned to the patients. Eight months later, one blinded investigator removed all crowns. The number of activations of the instrument (CORONAflex, KaVo) required for crown removal was recorded. The patients retrospectively completed a questionnaire to determine the impact of the procedure and to gauge their subjective perception. A linear regression model and descriptive statistics were used for data analysis. RESULTS All crowns could be retrieved without any technical complications or damage. Both abutment height (P = .019) and cement type (P = .004) had a significant effect on the number of activations, but the type of cement was more important. An increased total number of activations had no or only a weak correlation with the patients' perception of concussion, noise, pain, and unwillingness to use the device. CONCLUSIONS Cemented implant crowns can be removed, and the application of an air-accelerated device is a practicable method. A cement with appropriate retention force has to be selected, and the impact on the patients' subjective perception should be taken into account.
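The linear regression described, with abutment height and a three-level cement factor as predictors of the number of activations, might be sketched as follows (all column names are assumptions, not the authors'):

```python
# Sketch of the linear model; `activations`, `abutment_height` and the
# three-level `cement` factor are assumed column names.
import statsmodels.formula.api as smf

def fit_activation_model(df):
    # C(cement) dummy-codes the cements against a reference level, so the
    # fit yields one coefficient per non-reference cement plus the
    # abutment-height slope.
    model = smf.ols("activations ~ abutment_height + C(cement)", data=df).fit()
    print(model.summary())
    return model
```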
Abstract:
BACKGROUND Multiple scores have been proposed to stratify bleeding risk, but their value to guide dual antiplatelet therapy duration has never been appraised. We compared the performance of the CRUSADE (Can Rapid Risk Stratification of Unstable Angina Patients Suppress Adverse Outcomes With Early Implementation of the ACC/AHA Guidelines), ACUITY (Acute Catheterization and Urgent Intervention Triage Strategy), and HAS-BLED (Hypertension, Abnormal Renal/Liver Function, Stroke, Bleeding History or Predisposition, Labile INR, Elderly, Drugs/Alcohol Concomitantly) scores in 1946 patients recruited in the Prolonging Dual Antiplatelet Treatment After Grading Stent-Induced Intimal Hyperplasia Study (PRODIGY) and assessed hemorrhagic and ischemic events in the 24- and 6-month dual antiplatelet therapy groups. METHODS AND RESULTS Bleeding score performance was assessed with a Cox regression model and C statistics. Discriminative and reclassification power was assessed with net reclassification improvement and integrated discrimination improvement. The C statistic was similar between the CRUSADE score (area under the curve 0.71) and ACUITY (area under the curve 0.68), and higher than that of HAS-BLED (area under the curve 0.63). CRUSADE, but not ACUITY, improved reclassification (net reclassification index 0.39, P=0.005) and discrimination (integrated discrimination improvement index 0.0083, P=0.021) of major bleeding compared with HAS-BLED. Major bleeding and transfusions were higher in the 24- versus 6-month dual antiplatelet therapy groups in patients with a CRUSADE score >40 (hazard ratio for bleeding 2.69, P=0.035; hazard ratio for transfusions 4.65, P=0.009) but not in those with a CRUSADE score ≤40 (hazard ratio for bleeding 1.50, P=0.25; hazard ratio for transfusions 1.37, P=0.44), with positive interaction (P for interaction=0.05 and 0.01, respectively). The numbers of patients with high CRUSADE scores needed to treat for harm for major bleeding and transfusion were 17 and 15, respectively, with 24-month rather than 6-month dual antiplatelet therapy; the corresponding figures in the overall population were 67 and 71. CONCLUSIONS Our analysis suggests that the CRUSADE score predicts major bleeding similarly to ACUITY and better than HAS-BLED in an all-comer population with percutaneous coronary intervention, and potentially identifies patients at higher risk of hemorrhagic complications when treated with a long-term dual antiplatelet therapy regimen. CLINICAL TRIAL REGISTRATION URL: http://clinicaltrials.gov. Unique identifier: NCT00611286.
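The discrimination and reclassification metrics used here (C statistic, continuous net reclassification improvement, integrated discrimination improvement) reduce to short formulas over two sets of predicted probabilities; the sketch below shows one way to compute them, with `y`, `p_old` and `p_new` as hypothetical arrays rather than the PRODIGY data:

```python
# Sketch of the C statistic, continuous NRI and IDI from two scores'
# predicted bleeding probabilities; `y`, `p_old`, `p_new` are
# hypothetical NumPy arrays.
import numpy as np
from sklearn.metrics import roc_auc_score

def compare_scores(y: np.ndarray, p_old: np.ndarray, p_new: np.ndarray):
    events, nonevents = y == 1, y == 0
    # IDI: gain in the mean predicted-probability gap between events
    # and non-events when moving from the old score to the new one.
    idi = ((p_new[events].mean() - p_new[nonevents].mean())
           - (p_old[events].mean() - p_old[nonevents].mean()))
    # Continuous NRI: net proportion of events moved up plus net
    # proportion of non-events moved down.
    up, down = p_new > p_old, p_new < p_old
    nri = ((up[events].mean() - down[events].mean())
           + (down[nonevents].mean() - up[nonevents].mean()))
    # C statistic = area under the ROC curve for each score.
    return roc_auc_score(y, p_old), roc_auc_score(y, p_new), idi, nri
```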
Abstract:
BACKGROUND/AIMS The use of antihypertensive medicines has been shown to reduce proteinuria, morbidity, and mortality in patients with chronic kidney disease (CKD). A specific recommendation for a class of antihypertensive drugs is not available for this population, despite the pharmacodynamic differences. We therefore analysed the association between antihypertensive medicines and survival of patients with chronic kidney disease. METHODS Out of 2,687 consecutive patients undergoing kidney biopsy, a cohort of 606 subjects with retrievable medical therapy was included in the analysis. Kidney function was assessed by glomerular filtration rate (GFR) estimation at the time of kidney biopsy. The main outcome variable was death. RESULTS Overall, 114 (18.7%) patients died. In univariate regression analysis, the use of alpha-blockers and calcium channel antagonists, progression of disease, diabetes mellitus (DM) types 1 and 2, arterial hypertension, coronary heart disease, peripheral vascular disease, male sex and age were associated with mortality (all p<0.05). In a multivariate Cox regression model, the use of calcium channel blockers (HR 1.89), age (HR 1.04), DM type 1 (HR 8.43), DM type 2 (HR 2.17) and chronic obstructive pulmonary disease (HR 1.66) were associated with mortality (all p<0.05). CONCLUSION The use of calcium channel blockers, but not of other antihypertensive medicines, is associated with mortality in patients with CKD, primarily those with glomerulonephritis (GN).
Abstract:
We assessed handrub consumption as a surrogate marker for hand hygiene compliance from 2007 to 2014. Handrub consumption varied substantially between departments but, in a mixed effects regression model, correlated with the number of patient-days and with the observed hand hygiene compliance. Handrub consumption may supplement traditional hand hygiene observations.
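A mixed effects regression of this kind, with a random intercept per department, could be sketched with statsmodels; the column names are assumptions, not the authors':

```python
# Sketch of a mixed effects model with a random intercept per department;
# column names (`handrub_l`, `patient_days`, `compliance`, `department`)
# are assumptions standing in for the surveillance data.
import statsmodels.formula.api as smf

def fit_handrub_model(df):
    model = smf.mixedlm(
        "handrub_l ~ patient_days + compliance",
        data=df,
        groups="department",  # departments as the grouping factor
    ).fit()
    print(model.summary())
    return model
```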
Abstract:
BACKGROUND Catechol-O-methyltransferase (COMT) initiates dopamine degradation. Its activity is mainly determined by a single nucleotide polymorphism in the COMT gene (Val158Met, rs4680) separating high (Val/Val, COMT(HH)), intermediate (Val/Met, COMT(HL)) and low metabolizers (Met/Met, COMT(LL)). We investigated dopaminergic denervation in the striatum in patients with Parkinson's disease (PD) according to COMT rs4680 genotype. METHODS Patients with idiopathic PD were assessed for motor severity (UPDRS-III rating scale in the OFF-state) and for dopaminergic denervation using [123I]-FP-CIT SPECT imaging, and were genotyped for the COMT rs4680 polymorphism. [123I]-FP-CIT binding potential (BP) for each voxel was defined as the ratio of tracer binding in the region of interest (striatum, caudate nucleus and putamen) to that in a region of non-specific activity. Genotyping was performed using a TaqMan SNP genotyping assay. We used a regression model to evaluate the effect of COMT genotype on the BP in the striatum and its sub-regions. RESULTS The genotype distribution was: 11 (27.5%) COMT(HH), 26 (65%) COMT(HL) and 3 (7.5%) COMT(LL). There were no significant differences in disease severity, treatments, or motor scores between genotypes. When adjusted for clinical severity, gender and age, low and intermediate metabolizers showed significantly higher rates of striatal denervation (COMT(HL+LL) BP = 1.32 ± 0.04) than high metabolizers (COMT(HH) BP = 1.6 ± 0.08; F(1,34) = 9.0, p = 0.005). Striatal sub-regions showed similar results. BP and UPDRS-III motor scores were significantly correlated (r = 0.44, p = 0.04). There was a gender effect, but no gender-genotype interaction. CONCLUSIONS Striatal denervation differs according to the COMT Val158Met polymorphism. COMT activity may play a role as a compensatory mechanism in PD motor symptoms.
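The adjusted genotype comparison (an F-test on a high- vs. low/intermediate-metabolizer contrast with covariates) can be sketched as an ANCOVA-style linear model; all variable names below are hypothetical, and the authors' exact model specification may differ:

```python
# ANCOVA-style sketch of the adjusted genotype effect on binding
# potential; `comt_hh` (1 = Val/Val, 0 = Val/Met or Met/Met), `bp`,
# `updrs3`, `gender` and `age` are hypothetical variable names.
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def test_genotype_effect(df):
    model = smf.ols("bp ~ comt_hh + updrs3 + gender + age", data=df).fit()
    print(anova_lm(model, typ=2))  # F-test per term, incl. the genotype contrast
    return model
```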
Abstract:
BACKGROUND Risk factors promoting rhinovirus (RV) infections are inadequately described in healthy populations, especially infants. OBJECTIVES To determine the frequency of symptomatic and asymptomatic RV infections and to identify possible risk factors from host and environment among otherwise healthy infants. METHODS In a prospective birth cohort, respiratory health was assessed in 41 term-born infants by weekly telephone interviews during the first year of life, and weekly nasal swabs were collected to determine RV prevalence. In a multilevel logistic regression model, associations of RV prevalence and of respiratory symptoms during RV infections with host and environmental factors were determined. RESULTS 27% of nasal swabs in the 41 infants tested positive for RVs. Risk factors for RV prevalence were autumn months (OR=1.71, p=0.01, 95% CI 1.13-2.61), outdoor temperatures between 5 and 10 °C (OR=2.33, p=0.001, 95% CI 1.41-3.86), older siblings (OR=2.60, p=0.001, 95% CI 1.50-4.51) and childcare attendance (OR=1.53, p=0.07, 95% CI 0.96-2.44). 51% of RV-positive samples were asymptomatic. Respiratory symptoms during RV infections were less likely during the first three months of life (OR=0.34, p=0.003, 95% CI 0.17-0.69) and in infants with atopic mothers (OR=0.44, p=0.008, 95% CI 0.24-0.80). Increased tidal volume (OR=1.67, p=0.03, 95% CI 1.04-2.68) and outdoor temperatures between 2 and 5 °C (OR=2.79, p=0.02, 95% CI 1.17-6.61) were associated with more symptoms. CONCLUSIONS RVs are highly prevalent during the first year of life, and most infections are asymptomatic. The frequency of RV infections is associated with environmental factors, while respiratory symptoms during RV infections are linked to host determinants such as infant age, maternal atopy, and premorbid lung function.
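A multilevel logistic model here must account for weekly swabs nested within infants; as one possible stand-in, the sketch below uses generalized estimating equations (GEE) with an exchangeable working correlation, which is a population-averaged analogue rather than necessarily the authors' estimator. All column names are assumptions:

```python
# Stand-in sketch: GEE with an exchangeable working correlation to handle
# weekly swabs clustered within infants (the paper's multilevel logistic
# model is a random-effects analogue). All column names are assumptions.
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_rv_model(df):
    model = smf.gee(
        "rv_positive ~ autumn + temp_5_10 + older_siblings + childcare",
        groups="infant_id",
        data=df,
        family=sm.families.Binomial(),
        cov_struct=sm.cov_struct.Exchangeable(),
    ).fit()
    print(model.summary())  # exponentiate coefficients to get odds ratios
    return model
```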