915 results for "Logistic regression model"
Abstract:
We investigated the association between diet and head and neck cancer (HNC) risk using data from the International Head and Neck Cancer Epidemiology (INHANCE) consortium. The INHANCE pooled data included 22 case-control studies with 14,520 cases and 22,737 controls. Center-specific quartiles among the controls were used for food groups, and frequencies per week were used for single food items. A dietary pattern score combining high fruit and vegetable intake and low red meat intake was created. Odds ratios (OR) and 95% confidence intervals (CI) for the dietary items on the risk of HNC were estimated with a two-stage random-effects logistic regression model. An inverse association was observed for higher-frequency intake of fruit (4th vs. 1st quartile OR = 0.52, 95% CI = 0.43-0.62, p (trend) < 0.01) and vegetables (OR = 0.66, 95% CI = 0.49-0.90, p (trend) = 0.01). Intake of red meat (OR = 1.40, 95% CI = 1.13-1.74, p (trend) < 0.01) was positively associated with HNC risk. Higher dietary pattern scores, reflecting high fruit/vegetable and low red meat intake, were associated with reduced HNC risk (per score increment OR = 0.90, 95% CI = 0.84-0.97).
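A minimal sketch of the two-stage random-effects approach named above: stage one fits a study-specific adjusted logistic regression, stage two pools the study-specific log-ORs with a DerSimonian-Laird random-effects model. This is not the consortium's code; the column names (study, case, fruit_q4, age, sex) are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def two_stage_pooled_or(df: pd.DataFrame) -> tuple[float, tuple[float, float]]:
    betas, variances = [], []
    for _, study_df in df.groupby("study"):
        # Stage 1: study-specific adjusted logistic regression.
        fit = smf.logit("case ~ fruit_q4 + age + C(sex)", data=study_df).fit(disp=0)
        betas.append(fit.params["fruit_q4"])
        variances.append(fit.bse["fruit_q4"] ** 2)

    b, v = np.asarray(betas), np.asarray(variances)
    # Stage 2: DerSimonian-Laird estimate of between-study variance tau^2.
    w_fixed = 1.0 / v
    b_fixed = np.sum(w_fixed * b) / np.sum(w_fixed)
    q = np.sum(w_fixed * (b - b_fixed) ** 2)
    c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - (len(b) - 1)) / c)

    w = 1.0 / (v + tau2)  # random-effects weights
    b_re = np.sum(w * b) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    ci = (np.exp(b_re - 1.96 * se), np.exp(b_re + 1.96 * se))
    return float(np.exp(b_re)), ci
```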
Abstract:
BACKGROUND: Some physicians are still concerned about the safety of treating patients with acute deep venous thrombosis (DVT) at home. METHODS: We used data from the RIETE (Registro Informatizado de la Enfermedad TromboEmbólica) registry to compare the outcomes in consecutive outpatients with acute lower limb DVT according to initial treatment at home or in the hospital. A propensity score-matching analysis was carried out with a logistic regression model. RESULTS: As of December 2012, 13,493 patients had been enrolled. Of these, 4456 (31%) were treated at home. Patients treated at home were more likely to be male and younger and to weigh more; they were less likely than those treated in the hospital to have chronic heart failure, lung disease, renal insufficiency, anemia, recent bleeding, immobilization, or cancer. During the first week of anticoagulation, 27 patients (0.20%) suffered pulmonary embolism (PE), 12 (0.09%) recurrent DVT, and 51 (0.38%) major bleeding; 80 (0.59%) died. When only patients treated at home were considered, 12 (0.27%) had PE, 4 (0.09%) had recurrent DVT, 6 (0.13%) bled, and 4 (0.09%) died (no fatal PE, 3 fatal bleeds). After propensity analysis, patients treated at home had a similar rate of venous thromboembolism recurrences and lower rates of major bleeding (odds ratio, 0.4; 95% confidence interval, 0.1-1.0) and death (odds ratio, 0.2; 95% confidence interval, 0.1-0.7) within the first week compared with those treated in the hospital. CONCLUSIONS: In outpatients with DVT, home treatment was associated with a better outcome than treatment in the hospital. These data may help clinicians safely treat more patients with DVT at home.
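A minimal propensity score-matching sketch in the spirit of the analysis above: a logistic regression model estimates each patient's probability of home treatment, then home-treated patients are matched 1:1 without replacement to the hospital-treated patient with the closest score, within a caliper. Column names (home, age, male, weight, chf, cancer, bled) and the caliper value are illustrative assumptions, not the registry's actual specification.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_on_propensity(df: pd.DataFrame, caliper: float = 0.02) -> pd.DataFrame:
    covars = ["age", "male", "weight", "chf", "cancer"]
    # Propensity model: probability of home treatment given covariates.
    ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["home"])
    df = df.assign(ps=ps_model.predict_proba(df[covars])[:, 1])

    treated = df[df["home"] == 1]
    controls = df[df["home"] == 0].copy()
    pairs = []
    for idx, row in treated.iterrows():
        if controls.empty:
            break
        dist = (controls["ps"] - row["ps"]).abs()
        best = dist.idxmin()
        if dist[best] <= caliper:          # enforce the caliper
            pairs.append((idx, best))
            controls = controls.drop(best)  # match without replacement
    matched_idx = [i for pair in pairs for i in pair]
    return df.loc[matched_idx]

# Outcomes (e.g. major bleeding) are then compared within the matched cohort:
# match_on_propensity(df).groupby("home")["bled"].mean()
```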
Abstract:
Background: Data from different studies suggest a favourable association between pretreatment with statins or hypercholesterolemia and outcome after ischaemic stroke. We examined whether there were differences in in-hospital mortality according to the presence or absence of statin therapy in a large population of first-ever ischaemic stroke patients, and assessed the influence of statins upon early death and spontaneous neurological recovery. Methods: In 2,082 consecutive patients with first-ever ischaemic stroke collected from a prospective hospital-based stroke registry over a period of 19 years (1986-2004), statin use or hypercholesterolemia before stroke was documented in 381 patients. Favourable outcome, defined as grades 0-2 on the modified Rankin scale, was recorded in 382 patients. Results: Early outcome was better in the presence of statin therapy or hypercholesterolemia (cholesterol levels were not measured), with significant differences between the groups with and without statin pretreatment in in-hospital mortality (6% vs 13.3%, P = 0.001), symptom-free status at hospital discharge (22% vs 17.5%, P = 0.025) and severe functional limitation at discharge (6.6% vs 11.5%, P = 0.002), as well as lower rates of infectious respiratory complications during hospitalization. In the logistic regression model, statin therapy was the only variable inversely associated with in-hospital death (odds ratio 0.57) and directly associated with favourable outcome (odds ratio 1.32).
Abstract:
Ventilator-associated pneumonia (VAP) affects mortality, morbidity and the cost of critical care. Reliable risk estimation might improve end-of-life decisions, resource allocation and outcome. Several scoring systems for survival prediction have been established and optimised over the last decades. Recently, new biomarkers have gained interest in the prognostic field. We assessed whether midregional pro-atrial natriuretic peptide (MR-proANP) and procalcitonin (PCT) improve the predictive value of the Simplified Acute Physiology Score (SAPS) II and the Sequential Organ Failure Assessment (SOFA) in VAP. Specified end-points of a prospective multinational trial including 101 patients with VAP were analysed. Death <28 days after VAP onset was the primary end-point. MR-proANP and PCT were elevated at the onset of VAP in nonsurvivors compared with survivors (p = 0.003 and p = 0.017, respectively) and their slopes of decline differed significantly (p = 0.018 and p = 0.039, respectively). Patients in the highest MR-proANP quartile at VAP onset were at increased risk of death (log rank p = 0.013). In a logistic regression model, MR-proANP was identified as the best predictor of survival. Adding MR-proANP and PCT to SAPS II and SOFA improved their predictive properties (area under the curve 0.895 and 0.880). We conclude that the combination of two biomarkers, MR-proANP and PCT, improves survival prediction of clinical severity scores in VAP.
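A minimal sketch of the model comparison described above: 28-day death is predicted from the severity scores alone versus with the two biomarkers added, and the AUCs are compared. Column names (death28, saps2, sofa, mr_proanp, pct) and the log transformation are illustrative assumptions; in-sample AUCs are shown only for brevity.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

def auc_with_and_without_biomarkers(df: pd.DataFrame) -> tuple[float, float]:
    # Base model: clinical severity scores only.
    base = smf.logit("death28 ~ saps2 + sofa", data=df).fit(disp=0)
    # Biomarkers are commonly log-transformed before entering the model.
    df = df.assign(log_anp=np.log(df["mr_proanp"]), log_pct=np.log(df["pct"]))
    full = smf.logit("death28 ~ saps2 + sofa + log_anp + log_pct", data=df).fit(disp=0)
    return (roc_auc_score(df["death28"], base.predict(df)),
            roc_auc_score(df["death28"], full.predict(df)))
```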
Abstract:
BACKGROUND: Artemisinin-based combination therapy (ACT) has been promoted as a means to reduce malaria transmission due to its ability to kill both the asexual blood stages of malaria parasites, which sustain infections over long periods, and the immature sexual stages derived from them, which are responsible for infecting mosquitoes and onward transmission. Early studies reported a temporal association between ACT introduction and reduced malaria transmission in a number of ecological settings. However, these reports have come from areas with low to moderate malaria transmission, have been confounded by the presence of other interventions or environmental changes that may have reduced malaria transmission, and have not included a comparison group without ACT. This report presents results from the first large-scale observational study to assess the impact of case management with ACT on population-level measures of malaria endemicity in an area with intense transmission, where the benefits of effective infection clearance might be compromised by frequent and repeated re-infection. METHODS: A pre-post observational study with a non-randomized comparison group was conducted at two sites in Tanzania. Both sites used sulphadoxine-pyrimethamine (SP) monotherapy as a first-line anti-malarial from mid-2001 through 2002. In 2003, the ACT artesunate (AS) co-administered with SP (AS + SP) was introduced in all fixed health facilities in the intervention site, including both public and registered non-governmental facilities. Population-level prevalence of Plasmodium falciparum asexual parasitaemia and gametocytaemia was assessed using light microscopy from samples collected during representative household surveys in 2001, 2002, 2004, 2005 and 2006. FINDINGS: Among 37,309 observations included in the analysis, annual asexual parasitaemia prevalence in persons of all ages ranged from 11% to 28% and gametocytaemia prevalence ranged from <1% to 2% between the two sites and across the five survey years. A multivariable logistic regression model was fitted to adjust for age, socioeconomic status, bed net use and rainfall. In the presence of consistently high coverage and efficacy of SP monotherapy and AS + SP in the comparison and intervention areas, the introduction of ACT in the intervention site was associated with a modest reduction in the adjusted asexual parasitaemia prevalence of 5 percentage points, or 23% (p < 0.0001), relative to the comparison site. Gametocytaemia prevalence did not differ significantly (p = 0.30). INTERPRETATION: The introduction of ACT at fixed health facilities only modestly reduced asexual parasitaemia prevalence. ACT is effective for treatment of uncomplicated malaria and should have a substantial public health impact on morbidity and mortality, but is unlikely to reduce malaria transmission substantially in much of sub-Saharan Africa, where individuals are rapidly re-infected.
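A minimal sketch of one way the adjusted pre-post comparison above could be set up: a multivariable logistic regression model for asexual parasitaemia with a site-by-period interaction, adjusting for age, socioeconomic status, bed net use and rainfall. The exact model specification in the study is not given, so the interaction form and all column names are assumptions for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_parasitaemia_model(df: pd.DataFrame):
    # 'intervention' = 1 for the ACT site, 'post' = 1 for surveys after the
    # 2003 introduction; the intervention:post interaction term captures the
    # ACT-associated change relative to the comparison site.
    model = smf.logit(
        "parasitaemia ~ intervention * post + C(age_group) + ses + bednet + rainfall",
        data=df,
    )
    return model.fit(disp=0)
```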
Abstract:
BACKGROUND: Present combination antiretroviral therapy (cART) alone does not cure HIV infection and requires lifelong drug treatment. The potential role of HIV therapeutic vaccines as part of an HIV cure is under consideration. Our aim was to assess the efficacy, safety, and immunogenicity of Vacc-4x, a peptide-based HIV-1 therapeutic vaccine targeting conserved domains on p24(Gag), in adults infected with HIV-1. METHODS: Between July, 2008, and June, 2010, we did a multinational double-blind, randomised, phase 2 study comparing Vacc-4x with placebo. Participants were adults infected with HIV-1 who were aged 18-55 years and virologically suppressed on cART (viral load <50 copies per mL) with CD4 cell counts of 400 × 10(6) cells per L or greater. The trial was done at 18 sites in Germany, Italy, Spain, the UK, and the USA. Participants were randomly assigned (2:1) to Vacc-4x or placebo. Group allocation was masked from participants and investigators. Four primary immunisations, weekly for 4 weeks, containing Vacc-4x (or placebo) were given intradermally after administration of adjuvant. Booster immunisations were given at weeks 16 and 18. At week 28, cART was interrupted for up to 24 weeks. The coprimary endpoints were cART resumption and changes in CD4 counts during treatment interruption. Analyses were by modified intention to treat: all participants who received one intervention. Furthermore, safety, viral load, and immunogenicity (as measured by ELISPOT and proliferation assays) were assessed. The 52 week follow-up period was completed in June, 2011. For the coprimary endpoints, the proportion of participants who met the criteria for cART resumption was analysed with a logistic regression model, with the treatment effect assessed in a model including country as a covariate. This study is registered with ClinicalTrials.gov, number NCT00659789. FINDINGS: 174 individuals were screened; because of slow recruitment, enrolment stopped with 136 of a planned 345 participants, of whom 93 were randomly assigned to receive Vacc-4x and 43 to receive placebo. There were no differences between the two groups for the primary efficacy endpoints in those participants who stopped cART at week 28. cART was resumed by 30 participants (34%) in the Vacc-4x group and 11 (29%) in the placebo group, and the difference in percentage change in CD4 counts was not significant (mean treatment difference -5·71, 95% CI -13·01 to 1·59). However, a significant difference in viral load was noted for the Vacc-4x group both at week 48 (median 23 100 copies per mL Vacc-4x vs 71 800 copies per mL placebo; p=0·025) and week 52 (median 19 550 copies per mL vs 51 000 copies per mL; p=0·041). One serious adverse event, exacerbation of multiple sclerosis, was reported as possibly related to study treatment. Vacc-4x was immunogenic, inducing proliferative responses in both CD4 and CD8 T-cell populations. INTERPRETATION: The proportion of participants resuming cART before the end of the study and the change in CD4 counts during treatment interruption showed no benefit of vaccination. Vacc-4x was safe, well tolerated, and immunogenic, seemed to contribute to a viral-load setpoint reduction after cART interruption, and might be worth consideration in future HIV-cure investigative strategies. FUNDING: Norwegian Research Council GLOBVAC Program and Bionor Pharma ASA.
Abstract:
Background: Modelling epidemiological knowledge in validated clinical scores is a practical means of integrating EBM into usual care. Existing scores for cardiovascular disease have largely been developed in emergency settings, but few in primary care. Such a tool is needed by general practitioners (GPs) to evaluate the probability of ischemic heart disease (IHD) in patients with non-traumatic chest pain. Objective: To develop a predictive model to use as a clinical score for detecting IHD in patients with non-traumatic chest pain in primary care. Methods: A post-hoc secondary analysis on data from an observational study including 672 patients with chest pain, of whom 85 had IHD diagnosed by their GP during the year following their inclusion. The best subset method was used to select 8 predictive variables from univariate analysis, which were fitted in a multivariate logistic regression model to define the score. Reliability of the model was assessed using the split-group method. Results: Significant predictors were: age (0-3 points), gender (1 point), having at least one cardiovascular risk factor (hypertension, dyslipidemia, diabetes, smoking, family history of CVD; 3 points), personal history of cardiovascular disease (1 point), duration of chest pain from 1 to 60 minutes (2 points), substernal chest pain (1 point), pain increasing with exertion (1 point) and absence of tenderness at palpation (1 point). The area under the ROC curve for the score was 0.95 (95% CI 0.93-0.97). Patients were categorised into three groups: low risk of IHD (score under 6; n = 360), moderate risk of IHD (score from 6 to 8; n = 187) and high risk of IHD (score from 9 to 13; n = 125). The prevalence of IHD in these groups was 0%, 6.7% and 58.5%, respectively. Reliability of the model seems satisfactory, as the model developed from the derivation set accurately predicted (p = 0.948) the number of patients in each group in the validation set. Conclusion: This clinical score, based only on history and physical examination, can be an important tool for general practitioners in predicting ischemic heart disease in patients complaining of chest pain. A score below 6 points (found in more than half of our population) may spare selected patients complementary exams (ECG, laboratory tests) because of the very low risk of IHD; a score of 6 points or above warrants investigation to detect or rule out IHD. Further external validation is required in ambulatory settings.
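A minimal sketch of how the fitted logistic regression model translates into the integer clinical score and risk groups described above. The point values and group cut-offs reproduce those listed in the abstract; the common convention of scaling and rounding regression coefficients to integer points is assumed, and the input field names are illustrative.

```python
def ihd_score(patient: dict) -> int:
    # 'age_points' is already 0-3 by age band; all other fields are 0/1 flags.
    binary_points = {
        "male": 1, "any_cv_risk_factor": 3, "history_cvd": 1,
        "pain_1_60_min": 2, "substernal_pain": 1,
        "pain_on_exertion": 1, "no_tenderness": 1,
    }
    return patient["age_points"] + sum(
        pts * patient.get(name, 0) for name, pts in binary_points.items()
    )

def ihd_risk_group(score: int) -> str:
    if score < 6:
        return "low"       # 0% IHD prevalence in the derivation data
    if score <= 8:
        return "moderate"  # 6.7%
    return "high"          # 58.5%

# Example: a 60-year-old man with one risk factor and exertional pain.
example = {"age_points": 2, "male": 1, "any_cv_risk_factor": 1,
           "pain_on_exertion": 1}
print(ihd_score(example), ihd_risk_group(ihd_score(example)))
```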
Abstract:
BACKGROUND: The clinical course of HIV-1 infection is highly variable among individuals, at least in part as a result of genetic polymorphisms in the host. Toll-like receptors (TLRs) have a key role in innate immunity, and mutations in the genes encoding these receptors have been associated with increased or decreased susceptibility to infections. OBJECTIVES: To determine whether single-nucleotide polymorphisms (SNPs) in TLR2-4 and TLR7-9 influenced the natural course of HIV-1 infection. METHODS: Twenty-eight SNPs in TLRs were analysed in HAART-naive HIV-positive patients from the Swiss HIV Cohort Study. The SNPs were detected using Sequenom technology. Haplotypes were inferred using an expectation-maximization algorithm. The CD4 T cell decline was calculated using a least-squares regression. Patients with a rapid CD4 cell decline, less than the 15th percentile, were defined as rapid progressors. The risk of rapid progression associated with SNPs was estimated using a logistic regression model. Other candidate risk factors included age, sex and risk group (heterosexual, homosexual and intravenous drug use). RESULTS: Two SNPs in TLR9 (1635A/G and +1174G/A) in linkage disequilibrium were associated with the rapid progressor phenotype: for 1635A/G, odds ratio (OR) 3.9 [95% confidence interval (CI), 1.7-9.2] for GA versus AA and OR 4.7 (95% CI, 1.9-12.0) for GG versus AA (P = 0.0008). CONCLUSION: Rapid progression of HIV-1 infection was associated with TLR9 polymorphisms. Because of its potential implications for intervention strategies and vaccine development, additional epidemiological and experimental studies are needed to confirm this association.
Abstract:
OBJECTIVES: For certain major operations, inpatient mortality risk is lower in high-volume hospitals than in low-volume hospitals. Extending the analysis to a broader range of interventions and outcomes is necessary before adopting policies based on minimum volume thresholds. METHODS: Using the United States 2004 Nationwide Inpatient Sample, we assessed the effect of intervention-specific and overall hospital volume on surgical complications, potentially avoidable reoperations, and deaths across 1.4 million interventions in 353 hospitals. Outcome variations across hospitals were analyzed through a 3-level hierarchical logistic regression model (patients, surgical interventions, and hospitals), which took into account interventions on multiple organs, 144 intervention categories, and structural hospital characteristics. Discriminative performance and calibration were good. RESULTS: Hospitals with more experience in a given intervention had similar reoperation rates but lower mortality and complication rates: odds ratios per volume decile 0.93 and 0.97, respectively. However, the benefit was limited to heart surgery and a small number of other operations. Risks were higher in hospitals that performed more interventions overall: the odds ratio per 1000 interventions was approximately 1.02 for each event. Even after adjustment for specific volume, mortality varied substantially across both high- and low-volume hospitals. CONCLUSION: Although the link between specific volume and certain inpatient outcomes suggests that specialization might help improve surgical safety, the variable magnitude of this link and the heterogeneity of the hospital effect do not support the systematic use of volume-based referrals. It may be more efficient to monitor risk-adjusted postoperative outcomes and to investigate facilities with worse than expected outcomes.
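A minimal sketch of a hierarchical (mixed-effects) logistic regression approximating the multilevel structure described above, with random intercepts for hospital and intervention category around fixed volume effects. This is only one of several ways to fit such a model; the column names (death, volume_decile, teaching, hospital, category) are illustrative assumptions.

```python
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

def fit_hierarchical_model(df: pd.DataFrame):
    model = BinomialBayesMixedGLM.from_formula(
        "death ~ volume_decile + teaching",      # fixed effects
        {"hospital": "0 + C(hospital)",          # hospital-level random intercepts
         "category": "0 + C(category)"},         # intervention-category intercepts
        data=df,
    )
    return model.fit_vb()                        # variational Bayes fit
```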
Abstract:
We present the most comprehensive comparison to date of the predictive benefit of genetics in addition to currently used clinical variables, using genotype data for 33 single-nucleotide polymorphisms (SNPs) in 1,547 Caucasian men from the placebo arm of the REduction by DUtasteride of prostate Cancer Events (REDUCE®) trial. Moreover, we conducted a detailed comparison of three techniques for incorporating genetics into clinical risk prediction. The first method was a standard logistic regression model, which included separate terms for the clinical covariates and for each of the genetic markers. This approach ignores a substantial amount of external information concerning effect sizes for these Genome Wide Association Study (GWAS)-replicated SNPs. The second and third methods investigated two possible approaches to incorporating meta-analysed external SNP effect estimates - one via a weighted PCa 'risk' score based solely on the meta-analysis estimates, and the other incorporating both the current and prior data via informative priors in a Bayesian logistic regression model. All methods demonstrated a slight improvement in predictive performance upon incorporation of genetics. The two methods that incorporated external information showed the greatest increase in receiver operating characteristic AUC, from 0.61 to 0.64. The value of our methods comparison is likely to lie in observations of performance similarities, rather than differences, among three approaches with very different resource requirements. The two methods that included external information performed best, but only marginally, despite substantial differences in complexity.
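A minimal sketch of the second approach named above: a weighted genetic risk score built from external meta-analysis log-OR estimates, entered alongside the clinical covariates in a logistic regression model (the third, fully Bayesian approach would instead centre Gaussian priors on those estimates). The clinical covariate names and the weights are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

def fit_with_genetic_score(df: pd.DataFrame, snp_cols: list[str],
                           meta_log_or: np.ndarray) -> tuple[float, float]:
    # Weighted risk score: risk-allele counts (0/1/2) times the external
    # meta-analysis log-ORs, summed over SNPs.
    df = df.assign(grs=df[snp_cols].to_numpy() @ meta_log_or)
    clin = smf.logit("pca ~ age + psa + famhist", data=df).fit(disp=0)
    full = smf.logit("pca ~ age + psa + famhist + grs", data=df).fit(disp=0)
    # Returns (clinical-only AUC, clinical + genetic-score AUC).
    return (roc_auc_score(df["pca"], clin.predict(df)),
            roc_auc_score(df["pca"], full.predict(df)))
```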
Abstract:
This paper studies the determinants of corporate debt in order to empirically test the Pecking Order hypothesis. Corporate debt is measured together with its maturity and across different firm sizes, given the importance of distinguishing their potentially opposing or offsetting effects. The models used for hypothesis testing were estimated on a sample of 1,320 Spanish manufacturing firms drawn from the Survey on Business Strategies (Encuesta sobre Estrategias Empresariales, ESEE) for the period 1993-2001. The empirical analysis applies a multivariate logistic regression model, which supports the conclusion that the Pecking Order theory fits best, and also shows that smaller firms face greater difficulties in accessing long-term debt financing.
Abstract:
BACKGROUND: The aim of the current study was to assess whether widely used nutritional parameters are correlated with the nutritional risk score (NRS-2002) in identifying postoperative morbidity, and to evaluate the role of nutritionists in nutritional assessment. METHODS: A randomized trial on preoperative nutritional interventions (NCT00512213) provided the study cohort of 152 patients at nutritional risk (NRS-2002 ≥3) with comprehensive phenotyping, including diverse nutritional parameters (n=17) elaborated by nutritional specialists, and potential demographic and surgical (n=5) confounders. Risk factors for overall, severe (Dindo-Clavien 3-5) and infectious complications were identified by univariate analysis; parameters with P<0.20 were then entered in a multiple logistic regression model. RESULTS: The final analysis included 140 patients with complete datasets. Of these, 61 patients (43.6%) were overweight, and 72 patients (51.4%) experienced at least one complication of any degree of severity. Univariate analysis identified a correlation between few (≤3) active co-morbidities (OR=4.94; 95% CI: 1.47-16.56, p=0.01) and overall complications. Patients screened as malnourished by nutritional specialists presented fewer overall complications compared with those not malnourished (OR=0.47; 95% CI: 0.22-0.97, p=0.043). Severe postoperative complications occurred more often in patients with low lean body mass (OR=1.06; 95% CI: 1-1.12, p=0.028). Few (≤3) active co-morbidities (OR=8.8; 95% CI: 1.12-68.99, p=0.008) were related to postoperative infections. Patients screened as malnourished by nutritional specialists presented fewer infectious complications (OR=0.28; 95% CI: 0.1-0.78, p=0.014) compared with those not malnourished. Multivariate analysis identified few co-morbidities (OR=6.33; 95% CI: 1.75-22.84, p=0.005), low weight loss (OR=1.08; 95% CI: 1.02-1.14, p=0.006) and low hemoglobin concentration (OR=2.84; 95% CI: 1.22-6.59, p=0.021) as independent risk factors for overall postoperative complications. Compliance with nutritional supplements (OR=0.37; 95% CI: 0.14-0.97, p=0.041) and supplementation of malnourished patients as assessed by nutritional specialists (OR=0.24; 95% CI: 0.08-0.69, p=0.009) were independently associated with decreased infectious complications. CONCLUSIONS: Nutritional support based upon NRS-2002 screening might result in overnutrition, with potentially deleterious clinical consequences. We emphasize the importance of a detailed assessment of nutritional status by a dedicated specialist before deciding on early nutritional intervention for patients with an initial NRS-2002 score of ≥3.
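A minimal sketch of the two-step variable selection described above: candidate parameters are screened univariately against the outcome, and those with P < 0.20 are carried into a multiple logistic regression model. Column names are illustrative assumptions, and the sketch assumes numeric or binary candidates (categorical predictors would need different p-value handling).

```python
import pandas as pd
import statsmodels.formula.api as smf

def screen_then_fit(df: pd.DataFrame, outcome: str, candidates: list[str]):
    # Step 1: univariate screening at the liberal P < 0.20 threshold.
    kept = []
    for var in candidates:
        fit = smf.logit(f"{outcome} ~ {var}", data=df).fit(disp=0)
        if fit.pvalues[var] < 0.20:
            kept.append(var)
    if not kept:
        raise ValueError("no candidate passed the screening step")
    # Step 2: multiple logistic regression on the retained parameters.
    return smf.logit(f"{outcome} ~ {' + '.join(kept)}", data=df).fit(disp=0)
```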
Abstract:
BACKGROUND AND PURPOSE: To compare the safety and efficacy of a bridging approach with intravenous (IV) thrombolysis in patients with acute anterior-circulation strokes and proximal occlusions. PATIENTS AND METHODS: Consecutive patients with ischemic anterior-circulation strokes admitted within a 4 h 30 min window in two different centers were included. The first center performed IV therapy (alteplase 0.6 mg/kg) over 30 min and, in the absence of clinical improvement, mechanical thrombectomy with flow restoration using a Solitaire stent (StS); the second carried out IV thrombolysis (alteplase 0.9 mg/kg) alone. Only T, M1 or M2 occlusions present on CT angiography were considered. Endpoints were clinical outcome and mortality at 3 months. RESULTS: There were 63 patients in the bridging group and 163 in the IV group. No significant differences regarding baseline characteristics were observed. At 3 months, 46% (n = 29) of the patients treated in the combined group and 23% (n = 38) of those treated in the IV group had a modified Rankin scale (mRS) score of 0-1 (P < 0.001). A statistically significant difference was observed for all sites of occlusion. In a logistic regression model, National Institutes of Health Stroke Scale (NIHSS) score and bridging therapy were independent predictors of good outcome (P = 0.001 and P = 0.0018, respectively). Symptomatic hemorrhage was documented in 6.3% vs 3.7% in the bridging and IV groups, respectively (P = 0.32). There was no difference in mortality. CONCLUSIONS: Our results suggest that patients treated with a bridging approach were more likely to have minimal or no deficit at 3 months compared with the IV-treated group.
Abstract:
BACKGROUND AND PURPOSE: Statins display anti-inflammatory and anti-epileptogenic properties in animal models, and may reduce the epilepsy risk in elderly humans; however, a possible modulating role on outcome in patients with status epilepticus (SE) has not been assessed. METHODS: This cohort study was based on a prospective registry including all consecutive adults with incident SE treated in our center between April 2006 and September 2012. SE outcome was categorized at hospital discharge into 'return to baseline', 'new disability' and 'mortality'. The role of potential predictors, including statin treatment on admission, was evaluated using a multinomial logistic regression model. RESULTS: Amongst 427 patients identified, information on statins was available in 413 (97%). Mean age was 60.9 (±17.8) years; 201 (49%) were women; 211 (51%) had a potentially fatal SE etiology; and 191 (46%) experienced generalized-convulsive or non-convulsive SE in coma. Statins (simvastatin, atorvastatin or pravastatin) were prescribed prior to admission in 76 (18%) subjects, mostly elderly. Whilst 208 (50.4%) patients returned to baseline, 58 (14%) died. After adjustment for established SE outcome predictors (age, etiology, SE severity score), statins correlated significantly with lower mortality (relative risk ratio 0.38, P = 0.046). CONCLUSION: This study suggests for the first time that exposure to statins before an SE episode is related to its outcome, consistent with a possible anti-epileptogenic role. Further studies are needed to confirm this intriguing finding.
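A minimal sketch of the multinomial logistic regression model described above: the three-category discharge outcome is modelled against established predictors plus statin exposure, with 'return to baseline' as the reference category. Column names, including 'stess' for the SE severity score, are illustrative assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_se_outcome_model(df: pd.DataFrame):
    # Code the 3-category outcome as 0/1/2, 'return to baseline' as reference.
    levels = ["return_to_baseline", "new_disability", "mortality"]
    df = df.assign(
        outcome_code=pd.Categorical(df["outcome"], categories=levels).codes
    )
    model = smf.mnlogit("outcome_code ~ age + C(etiology) + stess + statins", data=df)
    fit = model.fit(disp=0)
    # exp(coef) gives relative risk ratios versus the reference category.
    return fit
```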
Abstract:
BACKGROUND: Due to their underlying diseases and the need for immunosuppression, patients after lung transplantation are particularly at risk for gastrointestinal (GI) complications that may negatively influence long-term outcome. The present study assessed the incidence and impact of GI complications after lung transplantation and aimed to identify risk factors. METHODS: A retrospective analysis was performed of all 227 consecutive single- and double-lung transplantations carried out at the University Hospitals of Lausanne and Geneva between January 1993 and December 2010. Logistic regressions were used to test the effect of potentially influencing variables on the binary outcomes of overall, severe, and surgery-requiring complications, followed by a multiple logistic regression model. RESULTS: Twenty-two patients were excluded due to re-transplantation, multiorgan transplantation, or incomplete datasets, leaving 205 patients for the final analysis. GI complications were observed in 127 patients (62%). Gastro-esophageal reflux disease was the most commonly observed complication (22.9%), followed by inflammatory or infectious colitis (20.5%) and gastroparesis (10.7%). Major GI complications (Dindo/Clavien III-V) were observed in 83 patients (40.5%) and were fatal in 4 patients (2.0%). Multivariate analysis identified double-lung transplantation (p = 0.012) and the early (1993-1998) transplantation period (p = 0.008) as independent risk factors for developing major GI complications. Forty-three patients (21%) required surgery, most commonly colectomy, cholecystectomy, and fundoplication (in 6.8%, 6.3%, and 3.9% of patients, respectively). Multivariate analysis identified a Charlson comorbidity index of ≥3 as an independent risk factor for developing GI complications requiring surgery (p = 0.015). CONCLUSION: GI complications after lung transplantation are common. Nonetheless, outcome in the setting of our transplant center was rather encouraging.