967 results for predictive regression model


Relevance: 80.00%

Abstract:

Cisplatin, a major antineoplastic drug used in the treatment of solid tumors, is a known nephrotoxin. This retrospective cohort study evaluated the prevalence and severity of cisplatin nephrotoxicity in 54 children and its impact on height and weight. We recorded the weight, height, serum creatinine, and electrolytes in each cisplatin cycle and after 12 months of treatment. Nephrotoxicity was graded as follows: normal renal function (Grade 0); asymptomatic electrolyte disorders, including an increase in serum creatinine up to 1.5 times the baseline value (Grade 1); need for electrolyte supplementation for <3 months and/or an increase in serum creatinine of 1.5 to 1.9 times baseline (Grade 2); increase in serum creatinine of 2 to 2.9 times baseline or need for electrolyte supplementation for more than 3 months after treatment completion (Grade 3); and increase in serum creatinine ≥3 times baseline or renal replacement therapy (Grade 4). Nephrotoxicity was observed in 41 subjects (75.9%): Grade 1 in 18 patients (33.3%), Grade 2 in 5 patients (9.2%), and Grade 3 in 18 patients (33.3%). None had Grade 4 nephrotoxicity. Patients with nephrotoxicity were younger and received higher cisplatin doses; they also showed impaired longitudinal growth, manifested as a statistically significant worsening of the height Z score at 12 months after treatment. We fitted a multiple logistic regression model with the delta of the height Z score (baseline minus 12 months) as the dependent variable to adjust for the main confounding variables: germ cell tumor, cisplatin total dose, serum magnesium levels at 12 months, gender, and nephrotoxicity grade. Patients with Grade 1 nephrotoxicity were at higher risk of not growing (OR 5.1, 95% CI 1.07-24.3, P=0.04). The cisplatin total dose had a significant negative relationship with magnesium levels at 12 months (Spearman r=-0.527, P<0.001).
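
A model of the kind described above can be sketched with statsmodels in Python. This is a minimal illustration only: the file name, column names, and the dichotomization of the outcome are assumptions, not the authors' actual code.

```python
# Sketch of a multiple logistic regression like the one described above.
# All column names and the outcome definition are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cisplatin_cohort.csv")  # hypothetical file

# Outcome: growth impairment, defined here (as an assumption) as a
# negative change in height Z score between baseline and 12 months.
df["no_growth"] = (df["height_z_baseline"] - df["height_z_12m"] > 0).astype(int)

model = smf.logit(
    "no_growth ~ germ_cell_tumor + cisplatin_total_dose"
    " + magnesium_12m + C(gender) + C(nephro_grade)",
    data=df,
).fit()

# Odds ratios with 95% CIs, analogous to the OR 5.1 reported for Grade 1
or_table = np.exp(model.conf_int())
or_table["OR"] = np.exp(model.params)
print(or_table)
```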

Relevance: 80.00%

Abstract:

OBJECTIVE In patients with a long life expectancy and high-risk (HR) prostate cancer (PCa), the chance of dying from PCa is not negligible and may change significantly with the time elapsed from surgery. The aim of this study was to evaluate long-term survival patterns in young patients treated with radical prostatectomy (RP) for HR PCa. MATERIALS AND METHODS Within a multi-institutional cohort, 600 young patients (≤59 years) treated with RP between 1987 and 2012 for HR PCa (defined as at least one of the following adverse characteristics: prostate-specific antigen >20, cT3 or higher, or biopsy Gleason sum 8-10) were identified. Smoothed cumulative incidence plots were used to assess cancer-specific mortality (CSM) and other-cause mortality (OCM) rates at 10, 15, and 20 years after RP. The same analyses were performed to assess the 5-year probability of CSM and OCM in patients who survived 5, 10, and 15 years after RP. A multivariable competing-risks regression model was fitted to identify predictors of CSM and OCM. RESULTS The 10-, 15-, and 20-year CSM and OCM rates were 11.6% and 5.5%, 15.5% and 13.5%, and 18.4% and 19.3%, respectively. The 5-year probabilities of CSM and OCM among patients who survived 5, 10, and 15 years after RP were 6.4% and 2.7%, 4.6% and 9.6%, and 4.2% and 8.2%, respectively. Year of surgery, pathological stage and Gleason score, surgical margin status, and lymph node invasion were the major determinants of CSM (all P≤0.03). Conversely, none of the covariates was significantly associated with OCM (all P≥0.09). CONCLUSIONS Very long-term cancer control in young high-risk patients after RP is highly satisfactory. PCa is the leading cause of death in young patients during the first 10 years of survivorship after RP; thereafter, mortality not related to PCa becomes the main cause of death. Consequently, surgery should be considered among young patients with high-risk disease, and strict PCa follow-up should be enforced during the first 10 years of survivorship after RP.
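
The competing-risks setting described here (death from PCa competing with death from other causes) can be illustrated with lifelines' Aalen-Johansen estimator, which yields cumulative incidence functions analogous to the smoothed plots above. The file name, column names, and event coding are assumptions; the published competing-risks regression itself was presumably fitted with dedicated software such as R's cmprsk.

```python
# Cumulative incidence of cancer-specific mortality (CSM) in the presence
# of the competing risk of other-cause mortality (OCM), sketched with
# lifelines. Event coding (0=censored, 1=CSM, 2=OCM) is an assumption.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.read_csv("rp_cohort.csv")  # hypothetical file

ajf_csm = AalenJohansenFitter()
ajf_csm.fit(df["years_from_rp"], df["event"], event_of_interest=1)

ajf_ocm = AalenJohansenFitter()
ajf_ocm.fit(df["years_from_rp"], df["event"], event_of_interest=2)

# Read off the 10-, 15- and 20-year values, analogous to the rates
# reported above (11.6%/5.5%, 15.5%/13.5%, 18.4%/19.3%).
print(ajf_csm.cumulative_density_)
print(ajf_ocm.cumulative_density_)
```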

Relevance: 80.00%

Abstract:

BACKGROUND Renal damage is more frequent with new-generation lithotripters. However, animal studies suggest that voltage ramping minimizes the risk of complications following extracorporeal shock wave lithotripsy (SWL). In the clinical setting, the optimal voltage strategy remains unclear. OBJECTIVE To evaluate whether stepwise voltage ramping can protect the kidney from damage during SWL. DESIGN, SETTING, AND PARTICIPANTS A total of 418 patients with solitary or multiple unilateral kidney stones were randomized to receive SWL using a Modulith SLX-F2 lithotripter with either stepwise voltage ramping (n=213) or a fixed maximal voltage (n=205). INTERVENTION SWL. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS The primary outcome was sonographic evidence of renal hematomas. Secondary outcomes included levels of urinary markers of renal damage, stone disintegration, stone-free rate, and rates of secondary interventions within 3 mo of SWL. Descriptive statistics were used to compare clinical outcomes between the two groups. A logistic regression model was generated to assess predictors of hematomas. RESULTS AND LIMITATIONS Significantly fewer hematomas occurred in the ramping group (12/213, 5.6%) than in the fixed group (27/205, 13%; p=0.008). There was some evidence that the fixed group had higher urinary β2-microglobulin levels after SWL compared to the ramping group (p=0.06). Urinary microalbumin levels, stone disintegration, stone-free rate, and rates of secondary interventions did not differ significantly between the groups. The logistic regression model showed a significantly higher risk of renal hematomas in older patients (odds ratio [OR] 1.03, 95% confidence interval [CI] 1.00-1.05; p=0.04). Stepwise voltage ramping was associated with a lower risk of hematomas (OR 0.39, 95% CI 0.19-0.80; p=0.01). The study was limited by the use of ultrasound to detect hematomas. CONCLUSIONS In this prospective randomized study, stepwise voltage ramping during SWL was associated with a lower risk of renal damage than a fixed maximal voltage, without compromising treatment effectiveness. PATIENT SUMMARY Lithotripsy is a noninvasive technique for urinary stone disintegration using ultrasonic energy. In this study, two voltage strategies were compared. The results show that a progressive increase in voltage during lithotripsy decreases the risk of renal hematomas while maintaining excellent outcomes. TRIAL REGISTRATION ISRCTN95762080.
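
The headline comparison of hematoma rates can be reproduced with a two-proportion z-test, sketched below with statsmodels; the authors' exact test is not stated, so this is an illustration rather than their analysis.

```python
# Comparing hematoma rates (12/213 ramping vs. 27/205 fixed) with a
# two-proportion z-test; the original paper may have used a different
# test (e.g. chi-square), so treat this as a sketch.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

counts = np.array([12, 27])   # hematomas in ramping vs. fixed group
nobs = np.array([213, 205])   # patients per group

stat, p_value = proportions_ztest(counts, nobs)
print(f"ramping: {counts[0]/nobs[0]:.1%}, fixed: {counts[1]/nobs[1]:.1%}, "
      f"p={p_value:.3f}")  # approximately the p=0.008 reported above
```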

Relevance: 80.00%

Abstract:

Cancer is one of the leading causes of death in companion animals. Information on the epidemiology of cancer is instrumental for veterinary practitioners in patient management; moreover, spontaneously arising tumours in companion animals resemble those in man and can provide useful data for combating cancer. Veterinary cancer registries for cats are few in number and have often been short-lived. This paper presents a retrospective study of tumours in cats in Switzerland from 1965 to 2008. Tumour diagnoses were coded according to the topographical and morphological keys of the International Classification of Diseases for Oncology (ICD-O-3). Associations between tumour occurrence and breed, sex and age were then examined using a multiple logistic regression model. A total of 18,375 tumours were diagnosed in 51,322 cats. Of these, 14,759 (80.3%) tumours were malignant. Several breeds had significantly lower odds ratios for developing a tumour compared with European Shorthair cats. The odds of a cat developing a tumour increased with age, up to the age of 16 years, and female cats had a higher risk of developing a tumour than male cats. The skin (4,970; 27.05%) was the most frequent tumour location, followed by connective tissue (3,498; 19.04%), unknown locations (2,532; 13.78%) and the female sexual organs (1,564; 8.51%). The most common tumour types were epithelial tumours (7,913; 43.06%), mesenchymal tumours (5,142; 27.98%) and lymphoid tumours (3,911; 21.28%).
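
A breed analysis of this kind hinges on treatment (dummy) coding with European Shorthair as the reference category. A minimal sketch with statsmodels, assuming a one-row-per-cat data layout and hypothetical column names:

```python
# Sketch of the breed/sex/age logistic regression described above, with
# European Shorthair as the reference breed via treatment coding.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("swiss_cat_registry.csv")  # hypothetical file; one row per cat

model = smf.logit(
    "tumour ~ C(breed, Treatment(reference='European Shorthair'))"
    " + C(sex) + age",
    data=df,
).fit()

# Odds ratios per breed relative to European Shorthair
print(np.exp(model.params))
```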

Relevance: 80.00%

Abstract:

Trabecular bone score (TBS) is a grey-level textural index of bone microarchitecture derived from lumbar spine dual-energy X-ray absorptiometry (DXA) images. TBS is a BMD-independent predictor of fracture risk. The objective of this meta-analysis was to determine whether TBS predicts fracture risk independently of FRAX probability and to examine their combined performance by adjusting the FRAX probability for TBS. We utilized individual-level data from 17,809 men and women in 14 prospective population-based cohorts. Baseline evaluation included TBS and the FRAX risk variables, and outcomes during follow-up (mean 6.7 years) comprised major osteoporotic fractures. The association between TBS, FRAX probabilities and the risk of fracture was examined using an extension of the Poisson regression model in each cohort and for each sex, and was expressed as the gradient of risk (GR; hazard ratio per 1 SD change in the risk variable in the direction of increased risk). FRAX probabilities were adjusted for TBS using an adjustment factor derived from an independent cohort (the Manitoba Bone Density Cohort). Overall, the GR of TBS for major osteoporotic fracture was 1.44 (95% CI: 1.35-1.53) when adjusted for age and time since baseline, and was similar in men and women (p > 0.10). When additionally adjusted for the FRAX 10-year probability of major osteoporotic fracture, TBS remained a significant, independent predictor of fracture (GR 1.32, 95% CI: 1.24-1.41). The adjustment of FRAX probability for TBS resulted in a small increase in the GR (1.76, 95% CI: 1.65-1.87 vs. 1.70, 95% CI: 1.60-1.81). A smaller change in GR was observed for hip fracture (FRAX hip fracture probability GR 2.25 vs. 2.22). TBS is a significant predictor of fracture risk independently of FRAX. The findings support the use of TBS as a potential adjustment for FRAX probability, though the impact of the adjustment remains to be determined in the context of clinical assessment guidelines.
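
The gradient of risk is the hazard ratio per 1 SD change in the risk variable, which can be approximated with a Poisson model using log person-time as an offset. A sketch under assumed column names (the published analysis used an extension of this model, fitted separately by cohort and sex):

```python
# Gradient of risk (GR): hazard ratio per 1 SD change in TBS, from a
# Poisson model with log follow-up time as offset. A sketch only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cohort.csv")  # hypothetical file

# Standardize TBS so one unit = 1 SD in the direction of increased risk
# (lower TBS means higher risk, hence the sign flip).
df["tbs_sd"] = -(df["tbs"] - df["tbs"].mean()) / df["tbs"].std()

model = smf.poisson(
    "fracture ~ tbs_sd + age",  # fracture: 0/1 event indicator
    data=df,
    offset=np.log(df["followup_years"]),
).fit()

print(np.exp(model.params["tbs_sd"]))  # GR, cf. ~1.44 in the pooled analysis
```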

Relevance: 80.00%

Abstract:

BACKGROUND Erosive tooth wear is the irreversible loss of dental hard tissue as a result of chemical processes. When the surface of a tooth is attacked by acids, the resulting loss of structural integrity leaves a softened layer on the tooth's surface, which renders it vulnerable to abrasive forces. The authors' objective was to estimate the prevalence of erosive tooth wear and to identify associated factors in a sample of 14- to 19-year-old adolescents in Mexico. METHODS The authors performed a cross-sectional study on a convenience sample (N = 417) of adolescents in a school in Mexico City, Mexico. The authors used a questionnaire and an oral examination performed according to the Lussi index. RESULTS The prevalence of erosive tooth wear was 31.7% (10.8% with exposed dentin). The final logistic regression model included age (P < .01; odds ratio [OR], 1.64; 95% confidence interval [CI], 1.26-2.13), high intake of sweet carbonated drinks (P = .03; OR, 1.81; 95% CI, 1.06-3.07), and xerostomia (P = .04; OR, 2.31; 95% CI, 1.05-5.09). CONCLUSIONS Erosive tooth wear, mainly on the mandibular first molars, was associated with age, high intake of sweet carbonated drinks, and xerostomia. PRACTICAL IMPLICATIONS Knowledge regarding erosive tooth wear in adolescents with relatively few years of exposure to causal factors will increase the focus on effective preventive measures, the identification of people at high risk, and early treatment.

Relevance: 80.00%

Abstract:

BACKGROUND Calcium disorders are common both in intensive care units and in patients with chronic kidney disease, and are associated with increased morbidity and mortality. It is unknown whether calcium abnormalities in unselected emergency department admissions have an impact on in-hospital mortality. METHODS This cross-sectional analysis included all admissions to the emergency department at the Inselspital Bern, Switzerland, from 2010 to 2011. For hyper- and hypocalcaemic patients, differences between subgroups defined by age, length of hospital stay, creatinine, sodium, chloride, phosphate, potassium and magnesium were compared using the Mann-Whitney U-test. Associations between calcium disorders and 28-day in-hospital mortality were assessed using a Cox proportional hazards regression model. RESULTS 8,270 patients with calcium measurements were included in our study. Overall, 264 (3.2%) patients died: 150 patients (6.13%) with hypocalcaemia and 7 patients (6.19%) with hypercalcaemia died, in contrast to 104 normocalcaemic patients (1.82%). In univariate analysis, calcium serum levels were associated with sex, mortality and pre-existing diuretic therapy (all p<0.05). In multivariate Cox regression analysis, hypocalcaemia and hypercalcaemia were independent risk factors for mortality (HR 2.00 and HR 1.88, respectively; both p<0.01). CONCLUSION Both hypocalcaemia and hypercalcaemia are associated with increased 28-day in-hospital mortality in unselected emergency department admissions.
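
A 28-day mortality model of this kind can be sketched with lifelines' Cox proportional hazards fitter. Column names, calcium cut-offs, and the covariate set below are assumptions for illustration:

```python
# Sketch of a 28-day in-hospital mortality Cox model with dummy-coded
# calcium status; normocalcaemia is the implicit reference category.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ed_admissions.csv")  # hypothetical file

# Assumed cut-offs in mmol/l; the study's reference range is not given here
df["hypocalcaemia"] = (df["calcium"] < 2.10).astype(int)
df["hypercalcaemia"] = (df["calcium"] > 2.60).astype(int)

cph = CoxPHFitter()
cph.fit(
    df[["days_to_event", "died", "hypocalcaemia", "hypercalcaemia", "age"]],
    duration_col="days_to_event",  # follow-up capped at 28 days
    event_col="died",
)
cph.print_summary()  # cf. the HRs of ~2.00 and ~1.88 reported above
```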

Relevance: 80.00%

Abstract:

BACKGROUND Patients with electrolyte imbalances or disorders have a high risk of mortality. It is unknown whether this finding for sodium and potassium disorders extends to alterations in magnesium levels. METHODS AND PATIENTS In this cross-sectional analysis, all emergency room patients seen between 2010 and 2011 at the Inselspital Bern, Switzerland, were included. A multivariable logistic regression model was fitted to assess the association between magnesium levels and in-hospital mortality up to 28 days. RESULTS A total of 22,239 subjects were screened for the study. A total of 5,339 patients had plasma magnesium concentrations measured at hospital admission and were included in the analysis. Of these, 6.3% of the 352 patients with hypomagnesemia and 36.9% of the 151 patients with hypermagnesemia died. In a multivariate Cox regression model, hypermagnesemia (HR 11.6, p<0.001) was a strong independent risk factor for mortality; in these patients, diuretic therapy proved to be protective (HR 0.5, p=0.007). Hypomagnesemia was not associated with mortality (p>0.05). Age was an independent risk factor for mortality (p<0.001). CONCLUSION The study demonstrates a possible association between hypermagnesemia measured upon admission in the emergency department and early in-hospital mortality.

Relevance: 80.00%

Abstract:

BACKGROUND Phosphate imbalances or disorders carry a high risk of morbidity and mortality in patients with chronic kidney disease. It is unknown whether this finding extends to mortality in patients presenting at an emergency room with or without normal kidney function. METHODS AND PATIENTS This cross-sectional analysis included all emergency room patients seen between 2010 and 2011 at the Inselspital Bern, Switzerland. A multivariable Cox regression model was applied to assess the association between phosphate levels and in-hospital mortality up to 28 days. RESULTS 22,239 subjects were screened for the study. Plasma phosphate concentrations were measured on hospital admission in 2,390 patients, who were included in the analysis. 3.5% of the 480 patients with hypophosphatemia and 10.7% of the 215 patients with hyperphosphatemia died. In univariate analysis, phosphate levels were associated with mortality, age, diuretic therapy and kidney function (all p<0.001). In a multivariate Cox regression model, hyperphosphatemia (OR 3.29, p<0.001) was a strong independent risk factor for mortality. Hypophosphatemia was not associated with mortality (p>0.05). CONCLUSION Hyperphosphatemia is associated with 28-day in-hospital mortality in an unselected cohort of patients presenting at an emergency room.

Relevance: 80.00%

Abstract:

PURPOSE The purpose of this study was to analyze the removal of implant-supported crowns retained by three different cements using an air-accelerated crown remover and to evaluate the patients' response to the procedure. MATERIALS AND METHODS This controlled clinical trial was conducted with 21 patients (10 women, 11 men; mean age: 51 ± 10.2 years) who had received a total of 74 implants (all placed in the posterior zone of the mandible). Four months after implant surgery, the crowns were cemented on standard titanium abutments of different heights. Three different cements (two temporary: Harvard TEMP and Improv; one definitive: Durelon) were used and randomly assigned to the patients. Eight months later, one blinded investigator removed all crowns. The number of activations of the instrument (CORONAflex, KaVo) required for crown removal was recorded. The patients retrospectively completed a questionnaire to determine the impact of the procedure and to gauge their subjective perception. A linear regression model and descriptive statistics were used for data analysis. RESULTS All crowns could be retrieved without any technical complications or damage. Both abutment height (P = .019) and cement type (P = .004) had a significant effect on the number of activations, with the type of cement being the more important factor. An increased total number of activations had no or only a weak correlation with the patients' perception of concussion, noise, and pain, and their unwillingness to use the device. CONCLUSIONS Cemented implant crowns can be removed, and the application of an air-accelerated device is a practicable method. A cement with an appropriate retention force has to be selected. The impact on the patients' subjective perception should be taken into account.
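
The linear regression described here can be sketched with statsmodels, treating cement type as a categorical predictor; file and column names are assumptions:

```python
# Sketch of a linear regression of the number of activations on abutment
# height and cement type. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("crown_removal.csv")  # hypothetical; one row per crown

model = smf.ols(
    "activations ~ abutment_height + C(cement)",  # Harvard TEMP / Improv / Durelon
    data=df,
).fit()

print(model.summary())  # cf. P = .019 for height and P = .004 for cement
```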

Relevance: 80.00%

Abstract:

BACKGROUND Multiple scores have been proposed to stratify bleeding risk, but their value in guiding the duration of dual antiplatelet therapy has never been appraised. We compared the performance of the CRUSADE (Can Rapid Risk Stratification of Unstable Angina Patients Suppress Adverse Outcomes With Early Implementation of the ACC/AHA Guidelines), ACUITY (Acute Catheterization and Urgent Intervention Triage Strategy), and HAS-BLED (Hypertension, Abnormal Renal/Liver Function, Stroke, Bleeding History or Predisposition, Labile INR, Elderly, Drugs/Alcohol Concomitantly) scores in 1946 patients recruited in the Prolonging Dual Antiplatelet Treatment After Grading Stent-Induced Intimal Hyperplasia Study (PRODIGY) and assessed hemorrhagic and ischemic events in the 24- and 6-month dual antiplatelet therapy groups. METHODS AND RESULTS Bleeding score performance was assessed with a Cox regression model and C statistics. Discriminative and reclassification power was assessed with net reclassification improvement and integrated discrimination improvement. The C statistic was similar for the CRUSADE score (area under the curve 0.71) and ACUITY (area under the curve 0.68), and higher than for HAS-BLED (area under the curve 0.63). CRUSADE, but not ACUITY, improved reclassification (net reclassification index 0.39, P=0.005) and discrimination (integrated discrimination improvement index 0.0083, P=0.021) of major bleeding compared with HAS-BLED. Major bleeding and transfusions were higher in the 24-month versus the 6-month dual antiplatelet therapy group in patients with a CRUSADE score >40 (hazard ratio for bleeding 2.69, P=0.035; hazard ratio for transfusions 4.65, P=0.009) but not in those with a CRUSADE score ≤40 (hazard ratio for bleeding 1.50, P=0.25; hazard ratio for transfusions 1.37, P=0.44), with positive interaction (P for interaction = 0.05 and 0.01, respectively). The numbers of patients with high CRUSADE scores needed to treat for harm for major bleeding and transfusion were 17 and 15, respectively, with 24-month rather than 6-month dual antiplatelet therapy; the corresponding figures in the overall population were 67 and 71. CONCLUSIONS Our analysis suggests that the CRUSADE score predicts major bleeding similarly to ACUITY and better than HAS-BLED in an all-comer population undergoing percutaneous coronary intervention, and potentially identifies patients at higher risk of hemorrhagic complications when treated with a long-term dual antiplatelet therapy regimen. CLINICAL TRIAL REGISTRATION URL: http://clinicaltrials.gov. Unique identifier: NCT00611286.
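
For a binary outcome without censoring, the C statistic coincides with the ROC AUC, so score discrimination can be sketched with scikit-learn as below. Column names are assumptions, and the published analysis used Cox-based C statistics, so this is an approximation of the idea rather than the authors' method.

```python
# Comparing the discrimination (C statistic) of three bleeding risk
# scores for major bleeding via ROC AUC. Column names are hypothetical.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("prodigy.csv")  # hypothetical file

for score in ["crusade", "acuity", "has_bled"]:
    auc = roc_auc_score(df["major_bleeding"], df[score])
    print(f"{score}: AUC = {auc:.2f}")  # cf. ~0.71, ~0.68, ~0.63 above
```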

Relevance: 80.00%

Abstract:

BACKGROUND/AIMS The use of antihypertensive medicines has been shown to reduce proteinuria, morbidity, and mortality in patients with chronic kidney disease (CKD). No specific recommendation for a class of antihypertensive drugs is available in this population, despite their pharmacodynamic differences. We therefore analysed the association between antihypertensive medicines and the survival of patients with chronic kidney disease. METHODS Out of 2687 consecutive patients undergoing kidney biopsy, a cohort of 606 subjects with retrievable medical therapy was included in the analysis. Kidney function was assessed by estimation of the glomerular filtration rate (GFR) at the time of kidney biopsy. The main outcome variable was death. RESULTS Overall, 114 (18.7%) patients died. In univariate regression analysis, the use of alpha-blockers and calcium channel antagonists, progression of disease, diabetes mellitus (DM) types 1 and 2, arterial hypertension, coronary heart disease, peripheral vascular disease, male sex and age were associated with mortality (all p<0.05). In a multivariate Cox regression model, the use of calcium channel blockers (HR 1.89), age (HR 1.04), DM type 1 (HR 8.43), DM type 2 (HR 2.17) and chronic obstructive pulmonary disease (HR 1.66) were associated with mortality (all p < 0.05). CONCLUSION The use of calcium channel blockers, but not of other antihypertensive medicines, is associated with mortality in CKD patients, primarily those with glomerulonephritis (GN).

Relevance: 80.00%

Abstract:

We assessed handrub consumption as a surrogate marker for hand hygiene compliance from 2007 to 2014. Handrub consumption varied substantially between departments but correlated, in a mixed-effects regression model, with the number of patient-days and the observed hand hygiene compliance. Handrub consumption may supplement traditional hand hygiene observations.
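
A mixed-effects model of this kind, with a random intercept per department, can be sketched with statsmodels; the data layout and variable names are assumptions:

```python
# Sketch of a mixed-effects model relating handrub consumption to
# patient-days and observed compliance, with a random intercept per
# department. Variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("handrub.csv")  # hypothetical: one row per department-year

model = smf.mixedlm(
    "handrub_litres ~ patient_days + observed_compliance",
    data=df,
    groups=df["department"],  # random intercept for each department
).fit()

print(model.summary())
```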

Relevance: 80.00%

Abstract:

BACKGROUND Catechol-O-methyltransferase (COMT) initiates dopamine degradation. Its activity is mainly determined by a single nucleotide polymorphism in the COMT gene (Val158Met, rs4680) separating high (Val/Val, COMT(HH)), intermediate (Val/Met, COMT(HL)) and low metabolizers (Met/Met, COMT(LL)). We investigated dopaminergic denervation in the striatum in patients with Parkinson's disease (PD) according to COMT rs4680 genotype. METHODS Patients with idiopathic PD were assessed for motor severity (UPDRS-III rating scale in the OFF state) and for dopaminergic denervation using [123I]-FP-CIT SPECT imaging, and were genotyped for the COMT rs4680 polymorphism. The [123I]-FP-CIT binding potential (BP) for each voxel was defined as the ratio of tracer binding in the region of interest (striatum, caudate nucleus and putamen) to that in a region of non-specific activity. Genotyping was performed using a TaqMan(®) SNP genotyping assay. We used a regression model to evaluate the effect of COMT genotype on the BP in the striatum and its sub-regions. RESULTS The genotype distribution was: 11 (27.5%) COMT(HH), 26 (65%) COMT(HL) and 3 (7.5%) COMT(LL). There were no significant differences in disease severity, treatments, or motor scores between genotypes. When adjusted for clinical severity, gender and age, low and intermediate metabolizers showed significantly greater striatal denervation (COMT(HL+LL) BP = 1.32 ± 0.04) than high metabolizers (COMT(HH) BP = 1.6 ± 0.08; F(1,34) = 9.0, p = 0.005). Striatal sub-regions showed similar results. BP and UPDRS-III motor scores were significantly correlated (r = 0.44, p = 0.04). There was a gender effect, but no gender-genotype interaction. CONCLUSIONS Striatal denervation differs according to the COMT Val158Met polymorphism. COMT activity may play a role as a compensatory mechanism in PD motor symptoms.
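
The adjusted genotype comparison amounts to an ANCOVA-style linear model with an F-test for the genotype term. A sketch with statsmodels, assuming hypothetical column names and the HL+LL vs. HH grouping used above:

```python
# Sketch of the genotype comparison: striatal binding potential (BP)
# modelled on COMT genotype group, adjusted for clinical severity,
# gender and age. Column names and coding are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("comt_pd.csv")  # hypothetical file

# Pool low and intermediate metabolizers, as in the abstract (HL+LL vs. HH)
df["low_met"] = df["genotype"].isin(["HL", "LL"]).astype(int)

model = smf.ols(
    "striatal_bp ~ low_met + updrs3_off + C(gender) + age", data=df
).fit()
print(anova_lm(model, typ=2))  # F-test for genotype, cf. F(1,34)=9.0 above
```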

Relevance: 80.00%

Abstract:

BACKGROUND Risk factors promoting rhinovirus (RV) infections are inadequately described in healthy populations, especially infants. OBJECTIVES To determine the frequency of symptomatic and asymptomatic RV infections and to identify possible host and environmental risk factors among otherwise healthy infants. METHODS In a prospective birth cohort, respiratory health was assessed in 41 term-born infants by weekly telephone interviews during the first year of life, and weekly nasal swabs were collected to determine RV prevalence. In a multilevel logistic regression model, associations between host/environmental factors and both RV prevalence and respiratory symptoms during RV infections were determined. RESULTS 27% of nasal swabs in the 41 infants tested positive for RVs. Risk factors for RV prevalence were the autumn months (OR=1.71, p=0.01, 95% CI 1.13-2.61), outdoor temperatures between 5 and 10 °C (OR=2.33, p=0.001, 95% CI 1.41-3.86), older siblings (OR=2.60, p=0.001, 95% CI 1.50-4.51) and childcare attendance (OR=1.53, p=0.07, 95% CI 0.96-2.44). 51% of RV-positive samples were asymptomatic. Respiratory symptoms during RV infections were less likely during the first three months of life (OR=0.34, p=0.003, 95% CI 0.17-0.69) and in infants with atopic mothers (OR=0.44, p=0.008, 95% CI 0.24-0.80). Increased tidal volume (OR=1.67, p=0.03, 95% CI 1.04-2.68) and outdoor temperatures between 2 and 5 °C (OR=2.79, p=0.02, 95% CI 1.17-6.61) were associated with more symptoms. CONCLUSIONS RVs are highly prevalent during the first year of life, and most infections are asymptomatic. The frequency of RV infections is associated with environmental factors, while respiratory symptoms during RV infections are linked to host determinants such as infant age, maternal atopy, and premorbid lung function.
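
Weekly swabs are repeated measures clustered within infants; one way to sketch such a model is GEE with an exchangeable working correlation, which is closely related to, though not identical with, the multilevel logistic regression used in the study. Column names are assumptions:

```python
# Sketch of a repeated-measures logistic model for weekly swabs clustered
# within infants, via GEE with an exchangeable working correlation.
# Column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("weekly_swabs.csv")  # hypothetical: one row per infant-week

model = smf.gee(
    "rv_positive ~ C(season) + outdoor_temp_band + older_siblings + childcare",
    groups="infant_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()

print(model.summary())  # coefficients exponentiate to ORs like those above
```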