979 results for Multivariable logistic regression
Abstract:
Objectives: To assess the risk of perinatal infection associated with induced labor in the delivery rooms of Hospital de D. Estefânia from January to June 2010, reviewing the indications for inducing labor as well as the techniques used. Material and Methods: We performed a historical prospective study, reviewing the clinical records and the mothers' and newborns' computer database from January to June 2010. An exposed and an unexposed group were created: the exposed group comprised pregnant women and their newborns whose labor was induced; the unexposed group consisted of pregnant women and newborns whose labor was spontaneous. Labor induction was performed using intra-vaginal prostaglandins in women whose labor did not start spontaneously; perinatal infection was defined either clinically or using blood tests. The gestational age was ≥ 37 weeks for both groups. Nineteen variables were studied for both groups. Results: A total of 190 mother-newborn pairs were included: 55 in the exposed group and 135 in the unexposed group. Three cases of perinatal infection were reported, two in the exposed group and one in the unexposed group. These preliminary data correspond to a perinatal infection rate of 3.6% in the exposed group and 0.7% in the unexposed group, and suggest that the risk of perinatal infection may increase up to 5-fold when labor is induced. Conclusions: A larger series of patients and a multivariable analysis using logistic regression are both necessary for a more thorough assessment of labor induction's role in perinatal infection risk. One must also try to distinguish factors related to labor induction from those related to clinical practices.
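As a quick arithmetic check of the roughly 5-fold estimate quoted above, the sketch below recomputes the infection risks and their ratio using only the counts reported in the abstract (2/55 exposed vs. 1/135 unexposed); it is not the study's analysis, just the published numbers.

```python
# Risk-ratio check using only the counts reported in the abstract:
# 2 perinatal infections among 55 induced deliveries (exposed) and
# 1 infection among 135 spontaneous deliveries (unexposed).
exposed_cases, exposed_total = 2, 55
unexposed_cases, unexposed_total = 1, 135

risk_exposed = exposed_cases / exposed_total        # ~0.036 (3.6%)
risk_unexposed = unexposed_cases / unexposed_total  # ~0.007 (0.7%)
risk_ratio = risk_exposed / risk_unexposed          # ~4.9, i.e. roughly 5-fold

print(f"Risk (exposed):   {risk_exposed:.1%}")
print(f"Risk (unexposed): {risk_unexposed:.1%}")
print(f"Risk ratio:       {risk_ratio:.1f}")
```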
Abstract:
Although anaemia is associated with adverse outcomes in other cardiopulmonary diseases, limited evidence exists on its prognostic value in patients with acute pulmonary embolism (PE). We sought to examine the associations between anaemia and mortality and length of hospital stay in patients with PE. We evaluated 14,276 patients with a primary diagnosis of PE from 186 hospitals in Pennsylvania, USA. We used random-intercept logistic regression to assess the association between anaemia at the time of presentation and 30-day mortality, and discrete-time logistic hazard models to assess the association between anaemia and time to hospital discharge, adjusting for patient (age, gender, race, insurance type, clinical and laboratory variables) and hospital (region, size, teaching status) factors. Anaemia was present in 38.7% of patients at admission. Patients with anaemia had a higher 30-day mortality (13.7% vs. 6.3%; p <0.001) and a longer length of stay (geometric mean, 6.9 vs. 6.6 days; p <0.001) compared with patients without anaemia. In multivariable analyses, anaemia remained associated with increased odds of death (OR 1.82, 95% CI: 1.60-2.06) and decreased odds of discharge (OR 0.85, 95% CI: 0.82-0.89). Anaemia is very common in patients presenting with PE and is independently associated with increased short-term mortality and a longer length of stay.
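The discrete-time logistic hazard model mentioned above is usually fitted by expanding each admission into one row per hospital day and modelling the daily odds of discharge. The sketch below illustrates that general technique on a small, purely illustrative dataset; the column names and values are stand-ins and are not the study's data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy stand-in for the admission-level data (illustrative values only):
# one row per patient with length of stay, discharge indicator, anaemia flag.
df = pd.DataFrame({
    "los_days":   [3, 5, 4, 6, 2, 7, 3, 5],
    "discharged": [1, 1, 1, 1, 1, 1, 1, 1],
    "anaemia":    [0, 1, 0, 1, 0, 1, 0, 1],
})

# Expand to person-period format: one row per patient-day, with event = 1
# only on the day of discharge. This is the standard setup for a
# discrete-time logistic hazard model.
rows = []
for pid, r in df.iterrows():
    for day in range(1, int(r["los_days"]) + 1):
        rows.append({"patient": pid, "day": day, "anaemia": r["anaemia"],
                     "event": int(day == r["los_days"] and r["discharged"] == 1)})
pp = pd.DataFrame(rows)

# Daily log-odds of discharge as a function of time in hospital and anaemia;
# an odds ratio below 1 for anaemia means lower daily odds of discharge,
# i.e. a longer stay, which is the direction reported in the abstract.
fit = smf.logit("event ~ day + anaemia", data=pp).fit(disp=False)
print(np.exp(fit.params).round(2))
```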
Abstract:
BACKGROUND: In order to facilitate and improve the use of antiretroviral therapy (ART), international recommendations are released and updated regularly. We aimed to study whether adherence to the recommendations is associated with better treatment outcomes in the Swiss HIV Cohort Study (SHCS). METHODS: Initial ART regimens prescribed to participants between 1998 and 2007 were classified according to IAS-USA recommendations. Baseline characteristics of patients who received regimens in violation of these recommendations (violation ART) were compared with those of other patients. Multivariable logistic and linear regression analyses were performed to identify associations between violation ART and (i) virological suppression and (ii) CD4 cell count increase, after one year. RESULTS: Between 1998 and 2007, 4189 SHCS participants started 241 different ART regimens. A violation ART was started in 5% of patients. Female patients (adjusted odds ratio aOR 1.83, 95%CI 1.28-2.62), those with a high education level (aOR 1.49, 95%CI 1.07-2.06) or a high CD4 count (aOR 1.53, 95%CI 1.02-2.30) were more likely to receive violation ART. The proportion of patients with an undetectable viral load (<400 copies/mL) after one year was significantly lower with violation ART than with recommended regimens (aOR 0.54, 95% CI 0.37-0.80), whereas CD4 count increase after one year of treatment was similar in both groups. CONCLUSIONS: Although more than 240 different initial regimens were prescribed, violations of the IAS-USA recommendations were uncommon. Patients receiving these regimens were less likely to have an undetectable viral load after one year, which strengthens the validity of these recommendations.
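The adjusted odds ratios reported above are typically obtained from a multivariable logistic model by exponentiating the coefficients and their confidence limits. The sketch below shows that general workflow on simulated stand-in data; the variable names and values are illustrative assumptions, not the SHCS analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative simulated data standing in for the cohort variables.
rng = np.random.default_rng(0)
n = 4189
df = pd.DataFrame({
    "violation_art": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
    "baseline_cd4": rng.normal(350, 150, n),
})
# Simulated outcome: undetectable viral load after one year.
df["suppressed"] = rng.binomial(1, 1 / (1 + np.exp(-(1.0 - 0.6 * df.violation_art))))

fit = smf.logit("suppressed ~ violation_art + female + baseline_cd4",
                data=df).fit(disp=False)

# Adjusted odds ratios with 95% confidence intervals (exponentiated coefficients).
ci = np.exp(fit.conf_int())
print(pd.DataFrame({"aOR": np.exp(fit.params),
                    "2.5%": ci[0], "97.5%": ci[1]}).round(2))
```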
Abstract:
INTRODUCTION. Reduced cerebral perfusion pressure (CPP) may worsen secondary damage and outcome after severe traumatic brain injury (TBI); however, the optimal management of CPP is still debated. STUDY HYPOTHESIS: We hypothesized that the impact of CPP on outcome is related to the brain tissue oxygen tension (PbtO2) level and that reduced CPP may worsen TBI prognosis when it is associated with brain hypoxia. DESIGN. Retrospective analysis of a prospective database. METHODS. We analyzed 103 patients with severe TBI who underwent continuous PbtO2 and CPP monitoring for an average of 5 days. For each patient, the duration of reduced CPP (<60 mm Hg) and brain hypoxia (PbtO2 <15 mm Hg for >30 min [1]) was calculated with a linear interpolation method, and the relationship between CPP and PbtO2 was analyzed with Pearson's linear correlation coefficient. Outcome at 30 days was assessed with the Glasgow Outcome Score (GOS), dichotomized as good (GOS 4-5) versus poor (GOS 1-3). Multivariable associations with outcome were analyzed with stepwise forward logistic regression. RESULTS. Reduced CPP (n=790 episodes; mean duration 10.2 ± 12.3 h) was observed in 75 (74%) patients and was frequently associated with brain hypoxia (46/75; 61%). Time during which reduced CPP was associated with normal brain oxygen did not differ significantly between patients with poor versus those with good outcome (8.2 ± 8.3 vs. 6.5 ± 9.7 h; P=0.35). In contrast, time during which reduced CPP occurred simultaneously with brain hypoxia was longer in patients with poor than in those with good outcome (3.3 ± 7.4 vs. 0.8 ± 2.3 h; P=0.02). Outcome was significantly worse in patients who had both reduced CPP and brain hypoxia (61% had GOS 1-3 vs. 17% in those with reduced CPP but no brain hypoxia; P<0.01). Patients in whom a positive CPP-PbtO2 correlation (r>0.3) was found were also more likely to have a poor outcome (69 vs. 31% in patients with no CPP-PbtO2 correlation; P<0.01). Brain hypoxia was an independent risk factor for poor prognosis (odds ratio for favorable outcome of 0.89 [95% CI 0.79-1.00] per hour spent with a PbtO2 <15 mm Hg; P=0.05, adjusted for CPP, age, GCS, Marshall CT and APACHE II). CONCLUSIONS. Low CPP may significantly worsen outcome after severe TBI when it is associated with brain tissue hypoxia. PbtO2-targeted management of CPP may optimize TBI therapy and improve the outcome of head-injured patients.
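Stepwise forward logistic regression, named above, adds candidate predictors one at a time according to an entry criterion. The sketch below is a generic illustration of that procedure with a p-value entry threshold; the threshold, candidate names, and simulated values are assumptions for illustration, not the study's protocol or code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def forward_stepwise_logit(df, outcome, candidates, p_enter=0.05):
    """Add the candidate with the smallest p-value until none pass p_enter."""
    selected, remaining = [], list(candidates)
    while remaining:
        pvals = {}
        for var in remaining:
            X = sm.add_constant(df[selected + [var]])
            pvals[var] = sm.Logit(df[outcome], X).fit(disp=False).pvalues[var]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= p_enter:
            break
        selected.append(best)
        remaining.remove(best)
    return sm.Logit(df[outcome], sm.add_constant(df[selected])).fit(disp=False)

# Illustrative simulated data standing in for the monitoring variables.
rng = np.random.default_rng(1)
n = 103
df = pd.DataFrame({
    "hours_hypoxia": rng.exponential(2, n),
    "age": rng.normal(45, 18, n),
    "gcs": rng.integers(3, 9, n),
})
df["poor_outcome"] = rng.binomial(1, 1 / (1 + np.exp(-(0.4 * df.hours_hypoxia - 1))))

print(forward_stepwise_logit(df, "poor_outcome",
                             ["hours_hypoxia", "age", "gcs"]).summary())
```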
Abstract:
OBJECTIVES: We aimed to (i) evaluate psychological distress in adolescent survivors of childhood cancer and compare them to siblings and a norm population; (ii) compare the severity of distress of distressed survivors and siblings with that of psychotherapy patients; and (iii) determine risk factors for psychological distress in survivors. METHODS: We sent a questionnaire to all childhood cancer survivors aged <16 years when diagnosed, who had survived ≥ 5 years and were aged 16-19 years at the time of study. Our control groups were same-aged siblings, a norm population, and psychotherapy patients. Psychological distress was measured with the Brief Symptom Inventory-18 (BSI-18) assessing somatization, depression, anxiety, and a global severity index (GSI). Participants with a T-score ≥ 57 were defined as distressed. We used logistic regression to determine risk factors. RESULTS: We evaluated the BSI-18 in 407 survivors and 102 siblings. Fifty-two survivors (13%) and 11 siblings (11%) had scores above the distress threshold (T ≥ 57). Distressed survivors scored significantly higher in somatization (p=0.027) and GSI (p=0.016) than distressed siblings, and also scored higher in somatization (p ≤ 0.001) and anxiety (p=0.002) than psychotherapy patients. In the multivariable regression, psychological distress was associated with female sex, self-reported late effects, and low perceived parental support. CONCLUSIONS: The majority of survivors did not report psychological distress. However, the severity of distress of distressed survivors exceeded that of distressed siblings and psychotherapy patients. Systematic psychological follow-up can help to identify survivors at risk and support them during the challenging period of adolescence.
Abstract:
The association between mental disorders (MDs) and iatrogenic complications after hip fracture surgery has been poorly studied. Among iatrogenic complications, nosocomial infections (NIs) are a major factor in hip fracture surgery. The aim of this paper was to determine whether patients with an MD and a hip fracture develop more NIs after hip surgery than patients with no MD. We studied 912 patients who underwent surgery for a hip fracture (223 patients with an MD and 689 control patients without an MD) and followed them after surgery. Univariable and multivariable analyses were performed using simple and multiple logistic regression analysis (confidence intervals, crude and adjusted odds ratios, and P values). We found that MDs, gender, and comorbidities were not associated with a higher risk of developing an NI after surgery for a hip fracture. Only age increased the risk of an NI.
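The univariable/multivariable contrast described above (crude versus adjusted odds ratios) can be sketched generically as two model fits, as below; the column names and simulated values are illustrative only and do not reproduce the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative simulated data: exposure, confounders, and outcome.
rng = np.random.default_rng(2)
n = 912
df = pd.DataFrame({
    "mental_disorder": rng.integers(0, 2, n),
    "age": rng.normal(80, 8, n),
    "female": rng.integers(0, 2, n),
})
df["infection"] = rng.binomial(1, 1 / (1 + np.exp(-(0.05 * (df.age - 80) - 3))))

def odds_ratios(fit):
    ci = np.exp(fit.conf_int())
    return pd.DataFrame({"OR": np.exp(fit.params),
                         "2.5%": ci[0], "97.5%": ci[1]}).round(2)

# Crude OR: exposure only; adjusted OR: exposure plus confounders.
crude = smf.logit("infection ~ mental_disorder", data=df).fit(disp=False)
adjusted = smf.logit("infection ~ mental_disorder + age + female",
                     data=df).fit(disp=False)

print("Crude:\n", odds_ratios(crude))
print("Adjusted:\n", odds_ratios(adjusted))
```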
Abstract:
BACKGROUND Obesity is positively associated with colorectal cancer. Recently, body size subtypes categorised by the prevalence of hyperinsulinaemia have been defined, and metabolically healthy overweight/obese individuals (without hyperinsulinaemia) have been suggested to be at lower risk of cardiovascular disease than their metabolically unhealthy (hyperinsulinaemic) overweight/obese counterparts. Whether similarly variable relationships exist for metabolically defined body size phenotypes and colorectal cancer risk is unknown. METHODS AND FINDINGS The association of metabolically defined body size phenotypes with colorectal cancer was investigated in a case-control study nested within the European Prospective Investigation into Cancer and Nutrition (EPIC) study. Metabolic health/body size phenotypes were defined according to hyperinsulinaemia status using serum concentrations of C-peptide, a marker of insulin secretion. A total of 737 incident colorectal cancer cases and 737 matched controls were divided into tertiles based on the distribution of C-peptide concentration amongst the control population, and participants were classified as metabolically healthy if below the first tertile of C-peptide and metabolically unhealthy if above the first tertile. These metabolic health definitions were then combined with body mass index (BMI) measurements to create four metabolic health/body size phenotype categories: (1) metabolically healthy/normal weight (BMI < 25 kg/m2), (2) metabolically healthy/overweight (BMI ≥ 25 kg/m2), (3) metabolically unhealthy/normal weight (BMI < 25 kg/m2), and (4) metabolically unhealthy/overweight (BMI ≥ 25 kg/m2). Additionally, in separate models, waist circumference measurements (using the International Diabetes Federation cut-points [≥80 cm for women and ≥94 cm for men]) were used (instead of BMI) to create the four metabolic health/body size phenotype categories. Statistical tests used in the analysis were all two-sided, and a p-value of <0.05 was considered statistically significant. In multivariable-adjusted conditional logistic regression models with BMI used to define adiposity, compared with metabolically healthy/normal weight individuals, we observed a higher colorectal cancer risk among metabolically unhealthy/normal weight (odds ratio [OR] = 1.59, 95% CI 1.10-2.28) and metabolically unhealthy/overweight (OR = 1.40, 95% CI 1.01-1.94) participants, but not among metabolically healthy/overweight individuals (OR = 0.96, 95% CI 0.65-1.42). Among the overweight individuals, lower colorectal cancer risk was observed for metabolically healthy/overweight individuals compared with metabolically unhealthy/overweight individuals (OR = 0.69, 95% CI 0.49-0.96). These associations were generally consistent when waist circumference was used as the measure of adiposity. To our knowledge, there is no universally accepted clinical definition for using C-peptide level as an indication of hyperinsulinaemia. Therefore, a possible limitation of our analysis was that the classification of individuals as being hyperinsulinaemic, based on their C-peptide level, was arbitrary. However, when we used quartiles or the median of C-peptide, instead of tertiles, as the cut-point of hyperinsulinaemia, a similar pattern of associations was observed. CONCLUSIONS These results support the idea that individuals with the metabolically healthy/overweight phenotype (with normal insulin levels) are at lower colorectal cancer risk than those with hyperinsulinaemia. The combination of anthropometric measures with metabolic parameters, such as C-peptide, may be useful for defining strata of the population at greater risk of colorectal cancer.
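The nested case-control design above is analysed with conditional logistic regression, which conditions on the matched case-control sets. The sketch below illustrates that model family on simulated matched pairs; it assumes a recent statsmodels release that provides ConditionalLogit, and all variable names and values are illustrative, not the EPIC data.

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

# Simulated matched sets: one case and one matched control per set.
rng = np.random.default_rng(3)
n_sets = 300
phenotype = rng.integers(0, 4, 2 * n_sets)   # 4 metabolic health/body size groups
case = np.tile([1, 0], n_sets)               # first row of each set is the case
matched_set = np.repeat(np.arange(n_sets), 2)

# Dummy-code the phenotype with category 0 as the reference.
X = pd.get_dummies(pd.Series(phenotype), prefix="pheno",
                   drop_first=True).astype(float)

# Conditional logistic regression stratified on the matched sets.
fit = ConditionalLogit(case, X, groups=matched_set).fit()

# Odds ratios relative to the reference phenotype (category 0).
print(np.exp(fit.params).round(2))
```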
Abstract:
BACKGROUND: Non-adherence is one of the strongest predictors of therapeutic failure in HIV-positive patients. Virologic failure with subsequent emergence of resistance reduces future treatment options and long-term clinical success. METHODS: Prospective observational cohort study including patients starting a new class of antiretroviral therapy (ART) between 2003 and 2010. Participants were naïve to the ART class and completed ≥1 adherence questionnaire prior to resistance testing. Outcomes were development of any IAS-USA, class-specific, or M184V mutations. Associations between adherence and resistance were estimated using logistic regression models stratified by ART class. RESULTS: Of 314 included individuals, 162 started an NNRTI and 152 a PI/r regimen. Adherence was similar between groups, with 85% reporting adherence ≥95%. The number of new mutations increased with increasing non-adherence. In the NNRTI group, multivariable models indicated a significant linear association in the odds of developing IAS-USA (odds ratio (OR) 1.66, 95% confidence interval (CI): 1.04-2.67) or class-specific (OR 1.65, 95% CI: 1.00-2.70) mutations. Levels of drug resistance were considerably lower in the PI/r group, and adherence was only significantly associated with M184V mutations (OR 8.38, 95% CI: 1.26-55.70). Adherence was significantly associated with HIV RNA in PI/r but not in NNRTI regimens. CONCLUSION: Therapies containing PI/r appear more forgiving of incomplete adherence than NNRTI regimens, which allow higher levels of resistance even with adherence above 95%. However, in failing PI/r regimens good adherence may prevent accumulation of further resistance mutations and therefore help to preserve future drug options. In contrast, adherence levels have little impact on NNRTI treatments once the first mutations have emerged.
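The class-stratified logistic models described above amount to fitting a separate logistic regression within each treatment class. The sketch below illustrates that generically; the adherence coding, column names, and simulated values are assumptions for illustration, not the cohort's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative simulated data: treatment class, non-adherence level, outcome.
rng = np.random.default_rng(4)
n = 314
df = pd.DataFrame({
    "art_class": rng.choice(["NNRTI", "PI/r"], n),
    "missed_doses": rng.integers(0, 4, n),   # ordinal non-adherence level
})
df["any_mutation"] = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * df.missed_doses - 2))))

# One logistic model per treatment class (stratified analysis).
for art_class, stratum in df.groupby("art_class"):
    fit = smf.logit("any_mutation ~ missed_doses", data=stratum).fit(disp=False)
    or_, lo, hi = np.exp([fit.params["missed_doses"],
                          *fit.conf_int().loc["missed_doses"]])
    print(f"{art_class}: OR per adherence level {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```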
Abstract:
BACKGROUND: Smokeless tobacco is of increasing interest to public health researchers and policy makers. This study aims to measure the prevalence of smokeless tobacco use (nasal dry snuff, snus and chewing tobacco) among young Swiss men, and to describe its correlates. METHODS: We invited 13 245 young men to participate in this survey on socio-economic and substance use data. The response rate was 45.2%. We included 5720 participants. Descriptive statistics and multivariable-adjusted logistic regression were performed. RESULTS: The mean age of participants was 19.5 years. Self-reported use once a month or more often was 8% for nasal dry snuff, 3% for snus and negligible for chewing tobacco. In multivariable-adjusted logistic regression, the odds for nasal dry snuff use increased in non-daily smokers [odds ratio (OR) 2.41, 95% confidence interval (CI) 1.90-3.05], compared with non-smokers, in participants reporting a risky weekly drinking volume (OR 3.93, 95% CI 1.86-8.32), compared with abstainers, and in those binge drinking once a month or more often (OR 7.41, 95% CI 4.11-13.38), compared with never binge drinking. Nasal dry snuff use was positively associated with higher BMI, average or above-average family income and German language, compared with French, and negatively associated with academic higher education, compared with no higher education, and occasional cannabis use, compared with no cannabis use. Correlates of snus use were similar to those of nasal dry snuff. CONCLUSION: One in 12 young Swiss men uses nasal dry snuff and 3% use snus. Consumption of smokeless tobacco is associated with a cluster of other risky behaviours, especially binge drinking.
Abstract:
Background: Evidence for a better performance of different highly atherogenic versus traditional lipid parameters for coronary heart disease (CHD) risk prediction is conflicting. We investigated the association of the ratios of small dense low-density lipoprotein (LDL)/apolipoprotein A-I, apolipoprotein B/apolipoprotein A-I and total cholesterol/HDL-cholesterol with CHD events in patients on combination antiretroviral therapy (cART). Methods: Case-control study nested within the Swiss HIV Cohort Study: for each cART-treated patient with a first coronary event between April 1, 2000 and July 31, 2008 (case) we selected four control patients who (1) were without coronary events until the date of the event of the index case, (2) had a plasma sample within ±30 days of the sample date of the respective case, (3) received cART and (4) were then matched for age, gender and smoking status. Lipoproteins were measured by ultracentrifugation. Conditional logistic regression models were used to estimate the independent effects of different lipid ratios on the occurrence of coronary events. Results: In total, 98 cases (19 fatal myocardial infarctions [MI] and 79 non-fatal coronary events [53 definite MIs, 15 possible MIs and 11 coronary angioplasties or bypasses]) were matched with 392 controls. Cases were more often injecting drug users, less likely to be virologically suppressed and more often on abacavir-containing regimens. In separate multivariable models adjusted for total cholesterol, triglycerides, HDL-cholesterol, systolic blood pressure, abdominal obesity, diabetes and family history of CHD, small dense-LDL and apolipoprotein B were each statistically significantly associated with CHD events (for 1 mg/dl increase: odds ratio [OR] 1.05, 95% CI 1.00-1.11 and 1.15, 95% CI 1.01-1.31, respectively), but the ratios of small dense-LDL/apolipoprotein A-I (OR 1.26, 95% CI 0.95-1.67), apolipoprotein B/apolipoprotein A-I (OR 1.02, 95% CI 0.97-1.07) and HDL-cholesterol/total cholesterol (OR 0.99, 95% CI 0.98-1.00) were not. Following adjustment for HIV-related and cART variables these associations were weakened in each model: apolipoprotein B (OR 1.27, 95% CI 1.00-1.30), small dense-LDL (OR 1.04, 95% CI 0.99-1.20), small dense-LDL/apolipoprotein A-I (OR 1.17, 95% CI 0.87-1.58), apolipoprotein B/apolipoprotein A-I (OR 1.02, 95% CI 0.97-1.07) and total cholesterol/HDL-cholesterol (OR 0.99, 95% CI 0.99-1.00). Conclusions: In patients receiving cART, small dense-LDL and apolipoprotein B showed the strongest associations with CHD events in models controlling for traditional CHD risk factors including total cholesterol and triglycerides. Adding the small dense-LDL/apolipoprotein A-I, apolipoprotein B/apolipoprotein A-I and total cholesterol/HDL-cholesterol ratios did not further improve models of lipid parameters and associations of increased risk for CHD events.
Abstract:
OBJECTIVES: Therapeutic hypothermia and pharmacological sedation may influence outcome prediction after cardiac arrest. The use of a multimodal approach, including clinical examination, electroencephalography, somatosensory-evoked potentials, and serum neuron-specific enolase, is recommended; however, no study has examined the comparative performance of these predictors or addressed their optimal combination. DESIGN: Prospective cohort study. SETTING: Adult ICU of an academic hospital. PATIENTS: One hundred thirty-four consecutive adults treated with therapeutic hypothermia after cardiac arrest. MEASUREMENTS AND MAIN RESULTS: Variables related to the cardiac arrest (cardiac rhythm, time to return of spontaneous circulation), clinical examination (brainstem reflexes and myoclonus), electroencephalography reactivity during therapeutic hypothermia, somatosensory-evoked potentials, and serum neuron-specific enolase were analyzed. Models to predict clinical outcome at 3 months (assessed using the Cerebral Performance Categories: 5 = death; 3-5 = poor recovery) were evaluated using ordinal logistic regressions and receiver operating characteristic curves. Seventy-two patients (54%) had a poor outcome (of whom 62 died), and 62 had a good outcome. Multivariable ordinal logistic regression identified absence of electroencephalography reactivity (p < 0.001), incomplete recovery of brainstem reflexes in normothermia (p = 0.013), and neuron-specific enolase higher than 33 μg/L (p = 0.029), but not somatosensory-evoked potentials, as independent predictors of poor outcome. The combination of clinical examination, electroencephalography reactivity, and neuron-specific enolase yielded the best predictive performance (receiver operating characteristic areas: 0.89 for mortality and 0.88 for poor outcome), with 100% positive predictive value. Addition of somatosensory-evoked potentials to this model did not improve prognostic accuracy. CONCLUSIONS: The combination of clinical examination, electroencephalography reactivity, and serum neuron-specific enolase offers the best outcome predictive performance for prognostication of early postanoxic coma, whereas somatosensory-evoked potentials do not add any complementary information. Although prognostication of poor outcome seems excellent, future studies are needed to further improve prediction of good prognosis, which still remains inaccurate.
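Ordinal logistic regression and receiver operating characteristic analysis, both named above, can be sketched together as below. The predictor names, the three-level outcome, and the simulated values are illustrative assumptions, and the ordinal fit assumes a statsmodels version that provides OrderedModel; this is not the study's model or data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.miscmodels.ordinal_model import OrderedModel
from sklearn.metrics import roc_auc_score

# Illustrative simulated predictors and a 3-level ordered outcome
# (0 = good recovery, 1 = poor recovery, 2 = death).
rng = np.random.default_rng(5)
n = 134
df = pd.DataFrame({
    "eeg_nonreactive": rng.integers(0, 2, n),
    "nse": rng.normal(30, 15, n),            # neuron-specific enolase, ug/L
})
latent = 1.5 * df.eeg_nonreactive + 0.05 * df.nse + rng.logistic(size=n)
df["cpc_group"] = pd.cut(latent, [-np.inf, 2.0, 3.5, np.inf], labels=False)

# Ordinal (proportional-odds) logistic regression; the first two parameters
# are the predictor coefficients, the rest are threshold parameters.
ordinal = OrderedModel(df["cpc_group"], df[["eeg_nonreactive", "nse"]],
                       distr="logit").fit(method="bfgs", disp=False)
print(np.exp(ordinal.params[:2]).round(2))   # odds ratios per predictor

# ROC AUC for the dichotomized outcome (poor vs. good) from a plain
# binary logistic model on the same predictors.
df["poor"] = (df["cpc_group"] >= 1).astype(int)
binary = smf.logit("poor ~ eeg_nonreactive + nse", data=df).fit(disp=False)
print("AUC:", round(roc_auc_score(df["poor"], binary.predict(df)), 2))
```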
Abstract:
The predictive potential of six selected factors was assessed in 72 patients with primary myelodysplastic syndrome using univariate and multivariate logistic regression analysis of survival at 18 months. Factors were age (above the median of 69 years), dysplastic features in the three myeloid bone marrow cell lineages, presence of chromosome defects, all metaphases abnormal, double or complex chromosome defects (C23), and a Bournemouth score of 2, 3, or 4 (B234). In the multivariate approach, B234 and C23 proved to be significantly associated with a reduction in the survival probability. The similarity of the regression coefficients associated with these two factors means that they have about the same weight. Consequently, the model was simplified by counting the number of factors (0, 1, or 2) present in each patient, thus generating a scoring system called the Lausanne-Bournemouth score (LB score). The LB score combines the well-recognized and easy-to-use Bournemouth score (B score) with the chromosome defect complexity, C23 constituting an additional indicator of patient outcome. The predicted risk of death within 18 months calculated from the model is as follows: 7.1% (confidence interval: 1.7-24.8) for patients with an LB score of 0, 60.1% (44.7-73.8) for an LB score of 1, and 96.8% (84.5-99.4) for an LB score of 2. The scoring system presented here has several interesting features. The LB score may improve the predictive value of the B score, as it is able to recognize two prognostic groups in the intermediate risk category of patients with B scores of 2 or 3. It also has the ability to identify two distinct prognostic subclasses among RAEB and possibly CMML patients. In addition to its above-described usefulness in prognostic evaluation, the LB score may bring new insights into the understanding of evolution patterns in MDS. We used the combination of the B score and chromosome complexity to define four classes, which may be considered four possible states of myelodysplasia and which describe two distinct evolutionary pathways.
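As a worked check of the statement above that the two factors carry about the same weight, the predicted risks quoted in the abstract (7.1%, 60.1%, 96.8% for LB scores 0, 1, 2) should sit at roughly equally spaced points on the logit scale. The sketch below verifies this using only those published percentages.

```python
# Back-of-the-envelope check of the LB score's reported predicted risks:
# if B234 and C23 carry roughly equal logistic weights, the logits of the
# predicted 18-month death risks should be roughly equally spaced.
import math

predicted_risk = {0: 0.071, 1: 0.601, 2: 0.968}   # LB score -> risk of death

logits = {score: math.log(p / (1 - p)) for score, p in predicted_risk.items()}
for score, logit in logits.items():
    print(f"LB score {score}: risk {predicted_risk[score]:.1%}, logit {logit:+.2f}")

# The two successive differences approximate the common per-factor coefficient
# (both come out close to 3 on the log-odds scale).
print("Logit increments:",
      round(logits[1] - logits[0], 2), round(logits[2] - logits[1], 2))
```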
Abstract:
OBJECTIVE: Overanticoagulated medical inpatients may be particularly prone to bleeding complications. Among medical inpatients with excessive oral anticoagulation (AC), we sought to identify patient and treatment factors associated with bleeding. METHODS: We prospectively identified consecutive patients receiving oral AC admitted to the medical ward of a university hospital (February-July 2006) who had at least one international normalized ratio (INR) value >3.0 during the hospital stay. We recorded patient characteristics, AC-related factors, and concomitant treatments (e.g., platelet inhibitors) that increase the bleeding risk. The outcome was overall bleeding, defined as the occurrence of major or minor bleeding during the hospital stay. We used logistic regression to explore patient and treatment factors associated with bleeding. RESULTS: Overall, 145 inpatients with excessive oral AC comprised our study sample. Atrial fibrillation (59%) and venous thromboembolism (28%) were the most common indications for AC. Twelve patients (8.3%) experienced a bleeding event. Of these, 8 had major bleeding. Women had a somewhat higher risk of major bleeding than men (12.5% vs 4.1%, p = 0.08). Multivariable analysis demonstrated that female gender was independently associated with bleeding (odds ratio [OR] 4.3, 95% confidence interval [95% CI] 1.1-17.8). Age, history of major bleeding, value of the index INR, and concomitant treatment with platelet inhibitors were not independent predictors of bleeding. CONCLUSIONS: We found that hospitalized women experiencing an episode of excessive oral AC have a 4-fold increased risk of bleeding compared with men. Whether overanticoagulated women require more aggressive measures of AC reversal must be examined in further studies.
Abstract:
STUDY OBJECTIVES: There is limited information regarding sleep duration and its determinants in Switzerland. We aimed to assess the trends and determinants of time in bed as a proxy for sleep duration in the Swiss canton of Geneva. METHODS: Data from repeated, independent cross-sectional representative samples of adults (≥ 18 years) of the Geneva population were collected between 2005 and 2011. Self-reported time in bed, education, monthly income, and nationality were assessed by questionnaire. RESULTS: Data from 3,853 participants (50% women, 51.7 ± 10.9 years) were analyzed. No significant trend was observed between 2005 and 2011 regarding time in bed or the prevalence of short (≤ 6 h/day) and long (> 9 h/day) time in bed. Older participants reported a longer time in bed (year-adjusted mean ± standard error: 7.67 ± 0.02, 7.82 ± 0.03, and 8.41 ± 0.04 h/day for 35-50, 50-65, and 65+ years, respectively, p < 0.001), while shorter time in bed was reported by non-Swiss participants (7.77 ± 0.03 vs. 7.92 ± 0.03 h/day for Swiss nationals, p < 0.001), participants with higher education (7.92 ± 0.02 for non-university vs. 7.74 ± 0.03 h/day for university, p < 0.001) or higher income (8.10 ± 0.04, 7.84 ± 0.03, and 7.70 ± 0.03 h/day for < 5,000 SFr, 5,000-9,500 SFr, and > 9,500 SFr, respectively, p < 0.001). Multivariable-adjusted polytomous logistic regression showed short and long time in bed to be positively associated with obesity and negatively associated with income. CONCLUSION: In a Swiss adult population, sleep duration as assessed by time in bed did not change significantly between 2005 and 2011. Both clinical and socioeconomic factors influence time in bed.
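Polytomous (multinomial) logistic regression, named above, models a categorical outcome with more than two levels against a reference category. The sketch below illustrates that model family for a three-level time-in-bed category on simulated data; the column names and values are illustrative assumptions, not the Geneva survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative simulated covariates.
rng = np.random.default_rng(6)
n = 3853
df = pd.DataFrame({
    "obese": rng.integers(0, 2, n),
    "income_cat": rng.integers(0, 3, n),   # 0 = low, 1 = middle, 2 = high
    "age": rng.normal(52, 11, n),
})

# Time-in-bed category: 0 = normal (reference), 1 = short (<=6 h), 2 = long (>9 h).
weights = np.column_stack([np.ones(n), 0.2 + 0.2 * df.obese, 0.15 + 0.1 * df.obese])
probs = weights / weights.sum(axis=1, keepdims=True)
df["tib_cat"] = [rng.choice(3, p=p) for p in probs]

# Multinomial (polytomous) logistic regression; one set of odds ratios per
# non-reference outcome (short vs. normal, long vs. normal).
fit = smf.mnlogit("tib_cat ~ obese + C(income_cat) + age", data=df).fit(disp=False)
print(np.exp(fit.params).round(2))
```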
Abstract:
OBJECTIVES: To assess the consequences of physical violence at work and identify their predictors. METHODS: Among the patients seen in a medicolegal consultation from 2007 to 2010, the subsample of workplace violence victims (n = 185) was identified and contacted again, on average 30 months after the assault. Eighty-six victims (47%) participated. Ordinal logistic regression analyses assessed the effect of nine potential risk factors on physical, psychological and work consequences summarized in a severity score (0-9). RESULTS: The severity score distribution was as follows: 4+: 14%; 1-3: 42%; and 0: 44%. Initial psychological distress resulting from the violence was a strong predictor (p < 0.001) of the severity score, both for work-related and long-term psychological consequences. Gender and age did not reach significant levels in multivariable analyses, even though female victims had overall more severe consequences. Unexpectedly, first-time violence was associated with long-term psychological and physical consequences (p = 0.004) only among workers whose jobs implied high awareness of the risk of violence. Among the factors assessed at follow-up, perceived lack of employer support or absence of an employer was associated with higher values on the severity score. The seven other assessed factors (initial physical injuries; previous experience of violence; preexisting health problems; working alone; internal violence; lack of support from colleagues; and lack of support from family or friends) were not significantly associated with the severity score. CONCLUSIONS: Being a victim of workplace violence can result in long-term consequences for health and employment, and their severity increases with the seriousness of the initial psychological distress. Support from the employer can help prevent negative outcomes.