913 results for "Risk ratio"
Abstract:
BACKGROUND: Polypharmacy, defined as the concomitant use of multiple medications, is very common in the elderly and may trigger drug-drug interactions and increase the risk of falls in patients receiving vitamin K antagonists. OBJECTIVE: To examine whether polypharmacy increases the risk of bleeding in elderly patients who receive vitamin K antagonists for acute venous thromboembolism (VTE). DESIGN: We used a prospective cohort study. PARTICIPANTS: In a multicenter Swiss cohort, we studied 830 patients aged ≥ 65 years with VTE. MAIN MEASURES: We defined polypharmacy as the prescription of more than four different drugs. We assessed the association between polypharmacy and the time to a first major and clinically relevant non-major bleeding, accounting for the competing risk of death. We adjusted for known bleeding risk factors (age, gender, pulmonary embolism, active cancer, arterial hypertension, cardiac disease, cerebrovascular disease, chronic liver and renal disease, diabetes mellitus, history of major bleeding, recent surgery, anemia, thrombocytopenia) and periods of vitamin K antagonist treatment as a time-varying covariate. KEY RESULTS: Overall, 413 (49.8 %) patients had polypharmacy. The mean follow-up duration was 17.8 months. Patients with polypharmacy had a significantly higher incidence of major (9.0 vs. 4.1 events/100 patient-years; incidence rate ratio [IRR] 2.18, 95 % confidence interval [CI] 1.32-3.68) and clinically relevant non-major bleeding (14.8 vs. 8.0 events/100 patient-years; IRR 1.85, 95 % CI 1.27-2.71) than patients without polypharmacy. After adjustment, polypharmacy was significantly associated with major (sub-hazard ratio [SHR] 1.83, 95 % CI 1.03-3.25) and clinically relevant non-major bleeding (SHR 1.60, 95 % CI 1.06-2.42). CONCLUSIONS: Polypharmacy is associated with an increased risk of both major and clinically relevant non-major bleeding in elderly patients receiving vitamin K antagonists for VTE.
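The incidence rate ratio (IRR) reported above is the ratio of the two bleeding rates. A minimal sketch of that calculation follows; the event counts and person-year denominators here are hypothetical, chosen only to reproduce the published rounded rates (9.0 and 4.1 events/100 patient-years), so the result is ~2.2 rather than the 2.18 obtained from the study's unrounded counts.

```python
def incidence_rate(events: int, patient_years: float) -> float:
    """Events per 100 patient-years of follow-up."""
    return 100.0 * events / patient_years

def incidence_rate_ratio(rate_exposed: float, rate_unexposed: float) -> float:
    """Ratio of the exposed (polypharmacy) rate to the unexposed rate."""
    return rate_exposed / rate_unexposed

# Hypothetical counts chosen only to match the published rounded rates.
rate_poly = incidence_rate(events=45, patient_years=500.0)      # 9.0
rate_no_poly = incidence_rate(events=41, patient_years=1000.0)  # 4.1
print(round(incidence_rate_ratio(rate_poly, rate_no_poly), 2))  # 2.2
```

Note that the sub-hazard ratios in the abstract come from competing-risk regression, which this simple rate ratio does not reproduce.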
Abstract:
OBJECTIVES: To determine whether nalmefene combined with psychosocial support is cost-effective compared with psychosocial support alone for reducing alcohol consumption in alcohol-dependent patients with high/very high drinking risk levels (DRLs) as defined by the WHO, and to evaluate the public health benefit of reducing harmful alcohol-attributable diseases, injuries and deaths. DESIGN: Decision modelling using Markov chains compared costs and effects over 5 years. SETTING: The analysis was from the perspective of the National Health Service (NHS) in England and Wales. PARTICIPANTS: The model considered the licensed population for nalmefene, specifically adults with both alcohol dependence and high/very high DRLs, who do not require immediate detoxification and who continue to have high/very high DRLs after initial assessment. DATA SOURCES: We modelled treatment effect using data from three clinical trials for nalmefene (ESENSE 1 (NCT00811720), ESENSE 2 (NCT00812461) and SENSE (NCT00811941)). Baseline characteristics of the model population, treatment resource utilisation and utilities were from these trials. We estimated the number of alcohol-attributable events occurring at different levels of alcohol consumption based on published epidemiological risk-relation studies. Health-related costs were from UK sources. MAIN OUTCOME MEASURES: We measured incremental cost per quality-adjusted life year (QALY) gained and number of alcohol-attributable harmful events avoided. RESULTS: Nalmefene in combination with psychosocial support had an incremental cost-effectiveness ratio (ICER) of £5204 per QALY gained, and was therefore cost-effective at the £20,000 per QALY gained decision threshold. Sensitivity analyses showed that the conclusion was robust. Nalmefene plus psychosocial support led to the avoidance of 7179 alcohol-attributable diseases/injuries and 309 deaths per 100,000 patients compared to psychosocial support alone over the course of 5 years. 
CONCLUSIONS: Nalmefene can be seen as a cost-effective treatment for alcohol dependence, with substantial public health benefits. TRIAL REGISTRATION NUMBERS: This cost-effectiveness analysis was developed based on data from three randomised clinical trials: ESENSE 1 (NCT00811720), ESENSE 2 (NCT00812461) and SENSE (NCT00811941).
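The cost-effectiveness conclusion above rests on the incremental cost-effectiveness ratio (ICER): incremental cost divided by incremental QALYs, compared against a willingness-to-pay threshold. A sketch of that decision rule follows; the incremental cost and QALY values are hypothetical, chosen only to reproduce the reported £5204 per QALY figure.

```python
def icer(delta_cost: float, delta_qaly: float) -> float:
    """Incremental cost per QALY gained for one option over another."""
    return delta_cost / delta_qaly

WILLINGNESS_TO_PAY = 20_000  # £/QALY, the decision threshold cited above

# Hypothetical increments chosen only to match the reported ICER.
ratio = icer(delta_cost=1301.0, delta_qaly=0.25)
print(ratio)                        # 5204.0
print(ratio <= WILLINGNESS_TO_PAY)  # True -> cost-effective at threshold
```

An intervention is deemed cost-effective when its ICER falls below the threshold, as nalmefene's £5204/QALY does against £20,000/QALY here.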
Abstract:
To evaluate how young physicians in training perceive their patients' cardiovascular risk based on the medical charts and their clinical judgment. Cross-sectional observational study. University outpatient clinic, Lausanne, Switzerland. Two hundred hypertensive patients and 50 non-hypertensive patients with at least one cardiovascular risk factor. Comparison of the absolute 10-year cardiovascular risk calculated by a computer program based on the Framingham score and adapted for physicians by the WHO/ISH with the perceived risk as assessed clinically by the physicians. Physicians underestimated the 10-year cardiovascular risk of their patients compared to that calculated with the Framingham score. Concordance between methods was 39% for hypertensive patients and 30% for non-hypertensive patients. Underestimation of cardiovascular risk for hypertensive patients was related to the fact that they had a stabilized systolic blood pressure under 140 mm Hg (OR = 2.1 [1.1; 4.1]). These data show that young physicians in training often have an incorrect perception of the cardiovascular risk of their patients, with a tendency to underestimate the risk. However, the calculated risk could also be slightly overestimated when applying the Framingham Heart Study model to a Swiss population. To implement a systematic evaluation of risk factors in primary care, a greater emphasis should be placed on the teaching of cardiovascular risk evaluation and on the implementation of quality improvement programs.
Abstract:
The use of areal bone mineral density (aBMD) for fracture prediction may be enhanced by considering bone microarchitectural deterioration. Trabecular bone score (TBS) helped in redefining a significant subset of non-osteoporotic women as a higher risk group. INTRODUCTION: TBS is an index of bone microarchitecture. Our goal was to assess the ability of TBS to predict incident fracture. METHODS: TBS was assessed in 560 postmenopausal women from the Os des Femmes de Lyon cohort, who had a lumbar spine (LS) DXA scan (QDR 4500A, Hologic) between years 2000 and 2001. During a mean follow-up of 7.8 ± 1.3 years, 94 women sustained 112 fragility fractures. RESULTS: At the time of baseline DXA scan, women with incident fracture were significantly older (70 ± 9 vs. 65 ± 8 years) and had a lower LS_aBMD and LS_TBS (both -0.4SD, p < 0.001) than women without fracture. The magnitude of fracture prediction was similar for LS_aBMD and LS_TBS (odds ratio [95 % confidence interval] = 1.4 [1.2;1.7] and 1.6 [1.2;2.0]). After adjustment for age and prevalent fracture, LS_TBS remained predictive of an increased risk of fracture. Yet, its addition to age, prevalent fracture, and LS_aBMD did not reach the level of significance to improve the fracture prediction. When using the WHO classification, 39 % of fractures occurred in osteoporotic women, 46 % in osteopenic women, and 15 % in women with T-score > -1. Thirty-seven percent of fractures occurred in the lowest quartile of LS_TBS, regardless of BMD. Moreover, 35 % of fractures that occurred in osteopenic women were classified below this LS_TBS threshold. CONCLUSION: In conclusion, LS_aBMD and LS_TBS predicted fractures equally well. In our cohort, the addition of LS_TBS to age and LS_aBMD added only limited information on fracture risk prediction. However, using the lowest quartile of LS_TBS helped in redefining a significant subset of non-osteoporotic women as a higher risk group which is important for patient management.
Abstract:
BACKGROUND: Three different burnout types have been described: the "frenetic" type describes involved and ambitious subjects who sacrifice their health and personal lives for their jobs; the "underchallenged" type describes indifferent and bored workers who fail to find personal development in their jobs; and the "worn-out" type describes neglectful subjects who feel they have little control over results and whose efforts go unacknowledged. The study aimed to describe the possible associations between burnout types and general sociodemographic and occupational characteristics. METHODS: A cross-sectional study was carried out on a multi-occupational sample of randomly selected university employees (n = 409). The presence of burnout types was assessed by means of the "Burnout Clinical Subtype Questionnaire (BCSQ-36)", and the degree of association between variables was assessed using an adjusted odds ratio (OR) obtained from multivariate logistic regression models. RESULTS: Individuals working more than 40 hours per week presented with the greatest risk for "frenetic" burnout compared to those working fewer than 35 hours (adjusted OR = 5.69; 95% CI = 2.52-12.82; p < 0.001). Administration and service personnel presented the greatest risk of "underchallenged" burnout compared to teaching and research staff (adjusted OR = 2.85; 95% CI = 1.16-7.01; p = 0.023). Employees with more than sixteen years of service in the organisation presented the greatest risk of "worn-out" burnout compared to those with less than four years of service (adjusted OR = 4.56; 95% CI = 1.47-14.16; p = 0.009). CONCLUSIONS: This study is the first to our knowledge that suggests the existence of associations between the different burnout subtypes (classified according to the degree of dedication to work) and the different sociodemographic and occupational characteristics that are congruent with the definition of each of the subtypes.
These results are consistent with the clinical profile definitions of burnout syndrome. In addition, they assist the recognition of distinct profiles and reinforce the idea of differential characterisation of the syndrome for more effective treatment.
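Odds ratios like those reported above come from logistic regression. For intuition, a sketch of the simpler unadjusted case follows: an odds ratio and its Wald 95% confidence interval computed from a 2x2 table. The counts are hypothetical, and the study's adjusted ORs (which control for covariates in a multivariate model) are not reproducible from a single table like this.

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int):
    """Unadjusted OR and Wald 95% CI from a 2x2 table.

    a/b: exposed with/without the outcome; c/d: unexposed with/without.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration only.
or_, lo, hi = odds_ratio_ci(a=20, b=30, c=10, d=85)
print(round(or_, 2))  # 5.67
```

An OR whose confidence interval excludes 1.0 is conventionally read as a statistically significant association, which is the pattern behind each of the results quoted above.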
Abstract:
Objective: To examine the association between pre-diagnostic circulating vitamin D concentration, dietary intake of vitamin D and calcium, and the risk of colorectal cancer in European populations. Design: Nested case-control study. Setting: The study was conducted within the EPIC study, a cohort of more than 520 000 participants from 10 western European countries. Participants: 1248 cases of incident colorectal cancer, which developed after enrolment into the cohort, were matched to 1248 controls. Main outcome measures: Circulating vitamin D concentration (25-hydroxy-vitamin-D, 25-(OH)D) was measured by enzyme immunoassay. Dietary and lifestyle data were obtained from questionnaires. Incidence rate ratios and 95% confidence intervals for the risk of colorectal cancer by 25-(OH)D concentration and levels of dietary calcium and vitamin D intake were estimated from multivariate conditional logistic regression models, with adjustment for potential dietary and other confounders. Results: 25-(OH)D concentration showed a strong inverse linear dose-response association with risk of colorectal cancer (P for trend <0.001). Compared with a pre-defined mid-level concentration of 25-(OH)D (50.0-75.0 nmol/l), lower levels were associated with higher colorectal cancer risk (<25.0 nmol/l: incidence rate ratio 1.32 (95% confidence interval 0.87 to 2.01); 25.0-49.9 nmol/l: 1.28 (1.05 to 1.56)), and higher concentrations were associated with lower risk (75.0-99.9 nmol/l: 0.88 (0.68 to 1.13); ≥100.0 nmol/l: 0.77 (0.56 to 1.06)). In analyses by quintile of 25-(OH)D concentration, patients in the highest quintile had a 40% lower risk of colorectal cancer than did those in the lowest quintile (P<0.001). Subgroup analyses showed a strong association for colon but not rectal cancer (P for heterogeneity=0.048). Greater dietary intake of calcium was associated with a lower colorectal cancer risk. Dietary vitamin D was not associated with disease risk.
Findings did not vary by sex and were not altered by corrections for season or month of blood donation. Conclusions: The results of this large observational study indicate a strong inverse association between levels of pre-diagnostic 25-(OH)D concentration and risk of colorectal cancer in western European populations. Further randomised trials are needed to assess whether increases in circulating 25-(OH)D concentration can effectively decrease the risk of colorectal cancer.
Abstract:
Background: The association between alcohol consumption and coronary heart disease (CHD) has been widely studied. Most of these studies have concluded that moderate alcohol intake reduces the risk of CHD. There are numerous discussions regarding whether this association is causal or biased. The objective of this paper is to analyse the association between alcohol intake and CHD risk in the Spanish cohort of the European Prospective Investigation into Cancer (EPIC). Methods: Participants from the EPIC Spanish cohort were included (15 630 men and 25 808 women). The median follow-up period was 10 years. Ethanol intake was calculated using a validated dietary history questionnaire. Participants with a definite CHD event were considered cases. A Cox regression model adjusted for relevant co-variables and stratified by age was produced. Separate models were carried out for men and women. Results: The crude CHD incidence rate was 300.6/100 000 person-years for men and 47.9/100 000 person-years for women. Moderate, high and very high consumption was associated with a reduced risk of CHD in men: hazard ratio 0.90 (95% CI 0.56 to 1.44) for former drinkers, 0.65 (95% CI 0.41 to 1.04) for low, 0.49 (95% CI 0.32 to 0.76) for moderate, 0.46 (95% CI 0.30 to 0.71) for high and 0.50 (95% CI 0.29 to 0.85) for very high consumers. A negative association was found in women, with p values above 0.05 in all categories. Conclusions: Alcohol intake in men aged 29–69 years was associated with a more than 30% lower CHD incidence. This study is based on a large prospective cohort study and is free of the abstainer error.
Abstract:
Several recent studies suggest that obesity may be a risk factor for fracture. The aim of this study was to investigate the association between body mass index (BMI) and future fracture risk at different skeletal sites. In prospective cohorts from more than 25 countries, baseline data on BMI were available in 398,610 women with an average age of 63 (range, 20-105) years and follow up of 2.2 million person-years during which 30,280 osteoporotic fractures (6457 hip fractures) occurred. Femoral neck BMD was measured in 108,267 of these women. Obesity (BMI ≥ 30 kg/m²) was present in 22%. A majority of osteoporotic fractures (81%) and hip fractures (87%) arose in non-obese women. Compared to a BMI of 25 kg/m², the hazard ratio (HR) for osteoporotic fracture at a BMI of 35 kg/m² was 0.87 (95% confidence interval [CI], 0.85-0.90). When adjusted for bone mineral density (BMD), however, the same comparison showed that the HR for osteoporotic fracture was increased (HR, 1.16; 95% CI, 1.09-1.23). Low BMI is a risk factor for hip and all osteoporotic fracture, but is a protective factor for lower leg fracture, whereas high BMI is a risk factor for upper arm (humerus and elbow) fracture. When adjusted for BMD, low BMI remained a risk factor for hip fracture but was protective for osteoporotic fracture, tibia and fibula fracture, distal forearm fracture, and upper arm fracture. When adjusted for BMD, high BMI remained a risk factor for upper arm fracture but was also a risk factor for all osteoporotic fractures. The association between BMI and fracture risk is complex, differs across skeletal sites, and is modified by the interaction between BMI and BMD. At a population level, high BMI remains a protective factor for most sites of fragility fracture. The contribution of increasing population rates of obesity to apparent decreases in fracture rates should be explored. © 2014 American Society for Bone and Mineral Research.
Abstract:
OBJECTIVE: Whether or not a high risk of falls increases the risk of bleeding in patients receiving anticoagulants remains a matter of debate. METHODS: We conducted a prospective cohort study involving 991 patients ≥65 years of age who received anticoagulants for acute venous thromboembolism (VTE) at nine Swiss hospitals between September 2009 and September 2012. The study outcomes were as follows: the time to a first major episode of bleeding; and clinically relevant nonmajor bleeding. We determined the associations between the risk of falls and the time to a first episode of bleeding using competing risk regression, accounting for death as a competing event. We adjusted for known bleeding risk factors and anticoagulation as a time-varying covariate. RESULTS: Four hundred fifty-eight of 991 patients (46%) were at high risk of falls. The mean duration of follow-up was 16.7 months. Patients at high risk of falls had a higher incidence of major bleeding (9.6 vs. 6.6 events/100 patient-years; P = 0.05) and a significantly higher incidence of clinically relevant nonmajor bleeding (16.7 vs. 8.3 events/100 patient-years; P < 0.001) than patients at low risk of falls. After adjustment, a high risk of falls was associated with clinically relevant nonmajor bleeding [subhazard ratio (SHR) = 1.74, 95% confidence interval (CI) = 1.23-2.46], but not with major bleeding (SHR = 1.24, 95% CI = 0.83-1.86). CONCLUSION: In elderly patients who receive anticoagulants because of VTE, a high risk of falls is significantly associated with clinically relevant nonmajor bleeding, but not with major bleeding. Whether or not a high risk of falls is a reason against providing anticoagulation beyond 3 months should be based on patient preferences and the risk of VTE recurrence.
Abstract:
BACKGROUND: Waist circumference (WC) is a simple and reliable measure of fat distribution that may add to the prediction of type 2 diabetes (T2D), but previous studies have been too small to reliably quantify the relative and absolute risk of future diabetes by WC at different levels of body mass index (BMI). METHODS AND FINDINGS: The prospective InterAct case-cohort study was conducted in 26 centres in eight European countries and consists of 12,403 incident T2D cases and a stratified subcohort of 16,154 individuals from a total cohort of 340,234 participants with 3.99 million person-years of follow-up. We used Prentice-weighted Cox regression and random effects meta-analysis methods to estimate hazard ratios for T2D. Kaplan-Meier estimates of the cumulative incidence of T2D were calculated. BMI and WC were each independently associated with T2D, with WC being a stronger risk factor in women than in men. Risk increased across groups defined by BMI and WC; compared to low normal weight individuals (BMI 18.5-22.4 kg/m²) with a low WC (<94/80 cm in men/women), the hazard ratio of T2D was 22.0 (95% confidence interval 14.3; 33.8) in men and 31.8 (25.2; 40.2) in women with grade 2 obesity (BMI ≥ 35 kg/m²) and a high WC (>102/88 cm). Among the large group of overweight individuals, WC measurement was highly informative and facilitated the identification of a subgroup of overweight people with high WC whose 10-y T2D cumulative incidence (men, 70 per 1,000 person-years; women, 44 per 1,000 person-years) was comparable to that of the obese group (50-103 per 1,000 person-years in men and 28-74 per 1,000 person-years in women). CONCLUSIONS: WC is independently and strongly associated with T2D, particularly in women, and should be more widely measured for risk stratification.
If targeted measurement is necessary for reasons of resource scarcity, measuring WC in overweight individuals may be an effective strategy, since it identifies a high-risk subgroup of individuals who could benefit from individualised preventive action.
Abstract:
OBJECTIVE To assess the association between consumption of fried foods and risk of coronary heart disease. DESIGN Prospective cohort study. SETTING Spanish cohort of the European Prospective Investigation into Cancer and Nutrition. PARTICIPANTS 40 757 adults aged 29-69 and free of coronary heart disease at baseline (1992-6), followed up until 2004. MAIN OUTCOME MEASURES Coronary heart disease events and vital status identified by record linkage with hospital discharge registers, population based registers of myocardial infarction, and mortality registers. RESULTS During a median follow-up of 11 years, 606 coronary heart disease events and 1135 deaths from all causes occurred. Compared with being in the first (lowest) quarter of fried food consumption, the multivariate hazard ratio of coronary heart disease in the second quarter was 1.15 (95% confidence interval 0.91 to 1.45), in the third quarter was 1.07 (0.83 to 1.38), and in the fourth quarter was 1.08 (0.82 to 1.43; P for trend 0.74). The results did not vary between those who used olive oil for frying and those who used sunflower oil. Likewise, no association was observed between fried food consumption and all cause mortality: multivariate hazard ratio for the highest versus the lowest quarter of fried food consumption was 0.93 (95% confidence interval 0.77 to 1.14; P for trend 0.98). CONCLUSION In Spain, a Mediterranean country where olive or sunflower oil is used for frying, the consumption of fried foods was not associated with coronary heart disease or with all cause mortality.
Abstract:
Vancomycin-resistant enterococci (VRE) are important hospital pathogens and have become increasingly common in patients admitted to the intensive care unit (ICU). To determine the incidence and the risk factors associated with VRE colonisation among ICU patients, active surveillance cultures for VRE faecal carriage were performed in patients admitted to the ICU of the University Hospital of Uberlândia, Minas Gerais, Brazil. Risk factors were assessed using a case-control study. Seventy-seven patients (23.1%) were found to be colonised with vanC VRE and only one patient (0.3%) was colonised with vanA VRE. Independent risk factors for VRE colonisation included nephropathy [odds ratio (OR) = 13.6, p < 0.001], prior antibiotic use (OR = 5.5, p < 0.03) and carbapenem use (OR = 17.3, p < 0.001). Our results showed a higher frequency (23.1%) of Enterococcus gallinarum and Enterococcus casseliflavus, species that are intrinsically resistant to low levels of vancomycin (vanC), without an associated infection, associated with prior antibiotic use, carbapenem use and nephropathy as a comorbidity. This study is the first to demonstrate the risk factors associated with vanC VRE colonisation in ICU hospitalised patients. Although vanA and vanB enterococci are of great importance, the epidemiology of vanC VRE needs to be better understood. Even though the clinical relevance of vanC VRE is uncertain, these species are opportunistic pathogens and vanC VRE-colonised patients are a potential epidemiologic reservoir of resistance genes.
Abstract:
Severe forms of dengue, such as dengue haemorrhagic fever (DHF) and dengue shock syndrome, are examples of a complex pathogenic mechanism in which the virus, environment and host immune response interact. The influence of the host's genetic predisposition to susceptibility or resistance to infectious diseases has been evidenced in several studies. The association of the human leukocyte antigen gene (HLA) class I alleles with DHF susceptibility or resistance has been reported in ethnically and geographically distinct populations. Due to these ethnic and viral strain differences, associations occur in each population, independently with a specific allele, which most likely explains the associations of several alleles with DHF. As the potential role of HLA alleles in the progression of DHF in Brazilian patients remains unknown, we then identified HLA-A alleles in 67 patients with dengue fever and 42 with DHF from Rio de Janeiro, Brazil, selected from 2002-2008 by the sequence-based typing technique. Statistical analysis revealed an association between the HLA-A*01 allele and DHF [odds ratio (OR) = 2.7, p = 0.01], while analysis of the HLA-A*31 allele (OR = 0.5, p = 0.11) suggested a potential protective role in DHF that should be further investigated. This study provides evidence that HLA class I alleles might be important risk factors for DHF in Brazilian patients.
Abstract:
Setting: Paediatric outpatient clinic in Lausanne, Switzerland, a country with a substantial proportion of tuberculosis cases occurring among the migrant population. Aim: To determine the risk factors associated with a positive tuberculin skin test (Mantoux test), in particular the influence of BCG (Bacille Calmette-Guérin) vaccination and of contact with a person with active tuberculosis. The patients concerned were children examined during a routine health check or as part of a contact investigation around a notified case of tuberculosis. Methods: Descriptive study including children who underwent a tuberculin skin test (2 units RT23) between November 2002 and April 2004. Age, sex, history of contact with a person with active tuberculosis, BCG vaccination, country of origin and place of birth (in Switzerland or abroad) were recorded. Results: Of the 234 children studied, 176 (75%) had a tuberculin reaction of zero and 31 (13%) had a positive reaction (>10 mm). In the linear regression model, the size of the tuberculin reaction varied significantly with history of contact with a person with active tuberculosis, age, tuberculosis incidence in the country of origin and BCG vaccination. Sex and place of birth did not influence the size of the reaction. In the logistic regression model including all recorded variables, the parameters significantly associated with a positive Mantoux test were age (odds ratio = 1.21, 95% CI 1.08; 1.35), history of contact with a person with active tuberculosis (OR = 7.31, 95% CI 2.23; 24) and tuberculosis incidence in the country of origin (OR = 1.01, 95% CI 1.00; 1.02). Sex (OR = 1.18, 95% CI 0.50; 2.78) and BCG vaccination (OR = 2.97, 95% CI 0.91; 9.72) were not associated with a positive tuberculin reaction.
Conclusions: Tuberculosis incidence in the country of origin, BCG vaccination and age influence the Mantoux test (size of reaction or proportion of reactions >10 mm). Nevertheless, the most important risk factor for a positive tuberculin reaction is a history of contact with a person with active tuberculosis.
Abstract:
Background: Obesity is a major risk factor for type 2 diabetes mellitus (T2DM). A proper anthropometric characterisation of T2DM risk is essential for disease prevention and clinical risk assessment. Methods: Longitudinal study in 37 733 participants (63% women) of the Spanish EPIC (European Prospective Investigation into Cancer and Nutrition) cohort without prevalent diabetes. Detailed questionnaire information was collected at baseline and anthropometric data gathered following standard procedures. A total of 2513 verified incident T2DM cases occurred after 12.1 years of mean follow-up. Multivariable Cox regression was used to calculate hazard ratios of T2DM by levels of anthropometric variables. Results: Overall and central obesity were independently associated with T2DM risk. BMI showed the strongest association with T2DM in men whereas waist-related indices were stronger independent predictors in women. Waist-to-height ratio revealed the largest area under the ROC curve in men and women, with optimal cut-offs at 0.60 and 0.58, respectively. The most discriminative waist circumference (WC) cut-off values were 99.4 cm in men and 90.4 cm in women. Absolute risk of T2DM was higher in men than women for any combination of age, BMI and WC categories, and remained low in normal-waist women. The population risk of T2DM attributable to obesity was 17% in men and 31% in women. Conclusions: Diabetes risk was associated with higher overall and central obesity indices even at normal BMI and WC values. The measurement of waist circumference in the clinical setting is strongly recommended for the evaluation of future T2DM risk in women.
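"Optimal" cut-offs derived from an ROC curve, like the waist-to-height ratios above, are commonly chosen by maximising the Youden index (sensitivity + specificity - 1). The abstract does not state which criterion was used, so the sketch below shows the Youden approach on synthetic data; the values and labels are invented for illustration.

```python
def youden_cutoff(values, labels, candidates):
    """Return the candidate cut-off maximising sensitivity + specificity - 1."""
    best_cut, best_j = None, -1.0
    for cut in candidates:
        tp = sum(1 for v, y in zip(values, labels) if y == 1 and v >= cut)
        fn = sum(1 for v, y in zip(values, labels) if y == 1 and v < cut)
        tn = sum(1 for v, y in zip(values, labels) if y == 0 and v < cut)
        fp = sum(1 for v, y in zip(values, labels) if y == 0 and v >= cut)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut

# Synthetic waist-to-height ratios: incident cases (label 1) tend to sit higher.
values = [0.48, 0.52, 0.55, 0.58, 0.61, 0.63, 0.66, 0.70]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
print(youden_cutoff(values, labels, candidates=sorted(set(values))))  # 0.61
```

In practice such thresholds are computed on the full ROC curve from the cohort data; this toy version only illustrates the selection rule.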