931 results for 850-1.07[Alighieri]


Relevance: 100.00%

Abstract:

Background Single nucleotide polymorphisms (SNPs) rs429358 (ε4) and rs7412 (ε2), both of which alter the amino-acid sequence of apolipoprotein E (APOE), have previously been tested for association with multiple sclerosis (MS) risk. However, none of these studies was sufficiently powered to detect modest effect sizes at acceptable type-I error rates. As both SNPs are only imperfectly captured on commonly used microarray genotyping platforms, their evaluation in the context of genome-wide association studies has been hindered until recently. Methods We genotyped 12 740 subjects hitherto not studied for their APOE status, imputed raw genotype data from 8739 subjects from five independent genome-wide association study datasets using the most recent high-resolution reference panels, and extracted genotype data for 8265 subjects from previous candidate gene assessments. Results Despite sufficient power to detect associations at genome-wide significance thresholds across a range of ORs, our analyses did not support a role for rs429358 or rs7412 in MS susceptibility. This included meta-analyses of the combined data across 13 913 MS cases and 15 831 controls (OR=0.95, p=0.259, and OR=1.07, p=0.0569, for rs429358 and rs7412, respectively). Conclusion Given the large sample size of our analyses, it is unlikely that the two APOE missense SNPs studied here exert any relevant effects on MS susceptibility.
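The pooled estimates above come from meta-analysis across several genotyping datasets. As a rough illustration of the underlying arithmetic only (not the authors' code, and with placeholder per-dataset values), an inverse-variance fixed-effect pooling of log odds ratios in Python might look like this:

```python
import numpy as np
from scipy import stats

# Hypothetical per-dataset odds ratios and standard errors of log(OR);
# placeholders for illustration only, not the study's data.
or_k = np.array([0.92, 0.98, 0.96, 0.93, 1.01])
se_log_or_k = np.array([0.08, 0.06, 0.10, 0.07, 0.09])

log_or = np.log(or_k)
w = 1.0 / se_log_or_k**2                      # inverse-variance weights

pooled_log_or = np.sum(w * log_or) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

z = pooled_log_or / pooled_se
p = 2 * stats.norm.sf(abs(z))                 # two-sided p-value
ci = np.exp(pooled_log_or + np.array([-1.96, 1.96]) * pooled_se)

print(f"pooled OR = {np.exp(pooled_log_or):.3f}, "
      f"95% CI {ci[0]:.3f}-{ci[1]:.3f}, p = {p:.3f}")
```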

Relevance: 100.00%

Abstract:

Background To explore the impact of geographical remoteness and area-level socioeconomic disadvantage on colorectal cancer (CRC) survival. Methods Multilevel logistic regression and Markov chain Monte Carlo simulations were used to analyze geographical variations in five-year all-cause and CRC-specific survival across 478 regions in Queensland, Australia, for 22,727 CRC cases aged 20–84 years diagnosed between 1997 and 2007. Results Area-level disadvantage and geographic remoteness were independently associated with CRC survival. After full multivariate adjustment (both levels), patients from remote areas (odds ratio [OR]: 1.24, 95% CrI: 1.07-1.42) and from more disadvantaged quintiles (OR = 1.12, 1.15, 1.20, 1.23 for Quintiles 4, 3, 2 and 1, respectively) had lower CRC-specific survival than those from major cities and the least disadvantaged areas. Similar associations were found for all-cause survival. Area disadvantage accounted for a substantial amount of the between-area variation in all-cause survival. Conclusions We have demonstrated that area-level inequalities in the survival of colorectal cancer patients cannot be explained by the measured individual-level characteristics of the patients or their cancer, and remain after adjusting for cancer stage. Further research is urgently needed to clarify the factors that underlie these survival differences, including the importance of geographical differences in the clinical management of CRC.
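The abstract describes multilevel logistic regression fitted with Markov chain Monte Carlo. A minimal single-level sketch in Python is shown below, with simulated placeholder data and hypothetical variable names; a faithful analysis would add area-level random effects (e.g. via a Bayesian or mixed model), which this simplification omits:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per CRC case, with a 5-year death
# indicator, area-level remoteness/disadvantage and individual covariates.
# Simulated placeholder data, not the study dataset.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "died_5yr":     rng.binomial(1, 0.4, n),
    "remote":       rng.binomial(1, 0.2, n),
    "disadv_quint": rng.integers(1, 6, n),
    "age":          rng.integers(20, 85, n),
    "stage":        rng.integers(1, 5, n),
})

# Single-level approximation: ignores clustering of patients within areas.
fit = smf.logit("died_5yr ~ remote + C(disadv_quint) + age + C(stage)",
                data=df).fit()
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% CIs
```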

Relevance: 100.00%

Abstract:

Objectives Heatwaves can have significant health consequences, resulting in increased mortality and morbidity. However, their impact on people living in tropical/subtropical regions remains largely unknown. This study assessed the impact of heatwaves on mortality and emergency hospital admissions (EHAs) from non-external causes (NEC) in Brisbane, a subtropical city in Australia. Methods We acquired daily data on weather, air pollution and EHAs for patients aged 15 years and over in Brisbane between January 1996 and December 2005, and on mortality between January 1996 and November 2004. A locally derived definition of heatwave (daily maximum temperature ≥37°C for 2 or more consecutive days) was adopted. Case-crossover analyses were used to assess the impact of heatwaves on cause-specific mortality and EHAs. Results During heatwaves, there was a statistically significant increase in NEC mortality (OR 1.46; 95% CI 1.21 to 1.77), cardiovascular mortality (OR 1.89; 95% CI 1.44 to 2.48), diabetes mortality in those aged 75+ (OR 9.96; 95% CI 1.02 to 96.85), NEC EHAs (OR 1.15; 95% CI 1.07 to 1.23) and EHAs from renal diseases (OR 1.41; 95% CI 1.09 to 1.83). The elderly were found to be particularly vulnerable to heatwaves (e.g., for NEC EHAs, OR 1.24 for 65–74-year-olds and 1.39 for those aged 75+). Conclusions Significant increases in NEC mortality and EHAs were observed during heatwaves in Brisbane, where people are well accustomed to hot summer weather. The most vulnerable were the elderly and people with cardiovascular, renal or diabetic disease.
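Case-crossover designs are typically analysed with conditional logistic regression on strata that pair each case day with matched referent days. A minimal sketch with simulated placeholder data, assuming statsmodels' ConditionalLogit class is available in the installed version:

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

# Hypothetical time-stratified case-crossover data: each stratum (id) pairs
# one case day (event=1) with referent days (event=0) matched on calendar
# month and day of week; 'heatwave' flags exposure on that day.
rng = np.random.default_rng(1)
n_strata, n_ref = 300, 3
rows = []
for i in range(n_strata):
    for j in range(n_ref + 1):
        rows.append({"id": i,
                     "event": 1 if j == 0 else 0,
                     "heatwave": rng.binomial(1, 0.15 + 0.10 * (j == 0))})
df = pd.DataFrame(rows)

fit = ConditionalLogit(df["event"], df[["heatwave"]], groups=df["id"]).fit()
print(np.exp(fit.params))      # odds ratio for heatwave exposure
print(np.exp(fit.conf_int()))
```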

Relevance: 100.00%

Abstract:

BACKGROUND Malaria remains a public health problem in the remote and poor areas of Yunnan Province, China. Yunnan faces an increasing risk of imported malaria infections from neighbouring countries in the Mekong river region. This study aimed to identify high-risk areas of malaria transmission in Yunnan Province and to estimate the effects of climatic variability on the transmission of Plasmodium vivax and Plasmodium falciparum in the identified area. METHODS We identified spatial clusters of malaria cases using spatial cluster analysis at the county level in Yunnan Province for 2005-2010, and estimated the weekly effects of climatic factors on P. vivax and P. falciparum based on a dataset of daily malaria cases and climatic variables. A distributed lag nonlinear model was used to estimate the impact of temperature, relative humidity and rainfall at lags of up to 10 weeks on both types of malaria parasite, after adjusting for seasonal and long-term effects. RESULTS The primary cluster area was identified along the China-Myanmar border in western Yunnan. A 1°C increase in minimum temperature was associated with increased relative risk (RR) at lags of 4 to 9 weeks, with the highest effect at a lag of 7 weeks for P. vivax (RR = 1.03; 95% CI, 1.01, 1.05) and 6 weeks for P. falciparum (RR = 1.07; 95% CI, 1.04, 1.11); a 10-mm increase in rainfall was associated with increased RRs at lags of 2-4 weeks and 9-10 weeks, with the highest effect at 3 weeks for both P. vivax (RR = 1.03; 95% CI, 1.01, 1.04) and P. falciparum (RR = 1.04; 95% CI, 1.01, 1.06); and the RRs for a 10% rise in relative humidity were significant from lags of 3 to 8 weeks, with the highest RR of 1.24 (95% CI, 1.10, 1.41) for P. vivax at a 5-week lag. CONCLUSIONS Our findings suggest that the China-Myanmar border is a high-risk area for malaria transmission. Climatic factors appeared to be among the major determinants of malaria transmission in this area. The estimated lag effects for the association between temperature and malaria are consistent with the life cycles of both the mosquito vector and the malaria parasite. These findings will be useful for malaria surveillance-response systems in the Mekong river region.
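The study used a distributed lag nonlinear model (commonly fitted with R's dlnm package). As a simplified illustration of the distributed-lag idea only, with linear lag terms, simulated weekly data and hypothetical variable names, a Poisson GLM with lagged temperature terms could be set up as follows:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical weekly series of malaria counts and minimum temperature;
# placeholders, not the study data. Linear lag terms only.
rng = np.random.default_rng(2)
weeks = 312
df = pd.DataFrame({
    "cases": rng.poisson(20, weeks),
    "tmin":  15 + 8 * np.sin(np.arange(weeks) * 2 * np.pi / 52)
             + rng.normal(0, 1, weeks),
})

max_lag = 10
for lag in range(max_lag + 1):                 # lags 0..10 weeks of temperature
    df[f"tmin_lag{lag}"] = df["tmin"].shift(lag)
df["trend"] = np.arange(weeks)                 # crude long-term trend control
df = df.dropna()

lag_cols = [f"tmin_lag{lag}" for lag in range(max_lag + 1)]
X = sm.add_constant(df[lag_cols + ["trend"]])
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()

# Relative risk per 1 degree C increase at each lag
print(np.exp(fit.params[lag_cols]))
```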

Relevance: 100.00%

Abstract:

BACKGROUND: Physical activity, particularly walking, is greatly beneficial to health; yet a sizeable proportion of older adults are insufficiently active. The importance of built environment attributes for walking is known, but few studies of older adults have examined neighbourhood destinations, and none have investigated the association between access to specific, objectively measured commercial destinations and walking. METHODS: We undertook a secondary analysis of data from the Western Australian state government's health surveillance survey for those aged 65–84 years and living in the Perth metropolitan region from 2003 to 2009 (n = 2,918). Individual-level road network service areas were generated at 400 m and 800 m distances, and the presence or absence of six commercial destination types within the neighbourhood service areas was identified (food retail, general retail, medical care services, financial services, general services, and social infrastructure). Adjusted logistic regression models examined access to, and mix of, commercial destination types within neighbourhoods for associations with self-reported walking behaviour. RESULTS: On average, the sample was aged 72.9 years (SD = 5.4), and was predominantly female (55.9%) and married (62.0%). Overall, 66.2% reported some weekly walking and 30.8% reported sufficient walking (≥150 min/week). Older adults with access to general services within 400 m (OR = 1.33, 95% CI = 1.07-1.66) and 800 m (OR = 1.20, 95% CI = 1.02-1.42), and social infrastructure within 800 m (OR = 1.19, 95% CI = 1.01-1.40), were more likely to engage in some weekly walking. Access to medical care services within 400 m (OR = 0.77, 95% CI = 0.63-0.93) and 800 m (OR = 0.83, 95% CI = 0.70-0.99) reduced the odds of sufficient walking. Access to food retail, general retail and financial services, and the mix of commercial destination types within the neighbourhood, were all unrelated to walking. CONCLUSIONS: The types of neighbourhood commercial destinations that encourage older adults to walk appear to differ slightly from those reported for adult samples. Destinations that facilitate more social interaction (for example, eating at a restaurant or church involvement) or provide opportunities for incidental social contact (for example, visiting the pharmacy or hairdresser) were the strongest predictors of walking among seniors in this study. This underscores the importance of planning neighbourhoods with proximate access to social infrastructure, and highlights the need to create residential environments that support activity across the life course.
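The adjusted logistic regression described above relates binary destination-access indicators to walking outcomes. A minimal sketch with simulated placeholder data and hypothetical variable names, reporting adjusted odds ratios with 95% confidence intervals:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical rows: one per respondent, with binary access indicators for
# destination types within a 400 m service area and a walking outcome.
rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({
    "any_walking":      rng.binomial(1, 0.66, n),
    "general_services": rng.binomial(1, 0.5, n),
    "medical_care":     rng.binomial(1, 0.4, n),
    "age":              rng.integers(65, 85, n),
    "female":           rng.binomial(1, 0.56, n),
})

fit = smf.logit(
    "any_walking ~ general_services + medical_care + age + female", data=df
).fit()
odds_ratios = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
odds_ratios.columns = ["OR", "2.5%", "97.5%"]
print(odds_ratios)
```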

Relevance: 100.00%

Abstract:

Introduction Clinical guidelines for the treatment of chronic low back pain suggest the use of supervised exercise. Motor control (MC) based exercise is widely used within clinical practice, but its efficacy is equivalent to that of general exercise therapy. MC exercise targets the trunk musculature; considering the mechanical links between the hip, pelvis and lumbar spine, surprisingly little focus has been placed on investigating the contribution of the hip musculature to lumbopelvic support. The purpose of this study is to compare the efficacy of two exercise programs for the treatment of non-specific low back pain (NSLBP). Methods Eighty individuals aged 18-65 years were randomized into two groups to participate in this trial. The primary outcome measures were self-reported pain intensity (0-100 mm VAS) and percent disability (Oswestry Disability Index V2). Bilateral measures of hip strength (N/kg) and two-dimensional frontal plane mechanics (°) were the secondary outcomes. Outcomes were measured at baseline and following a six-week home-based exercise program that included weekly sessions of real-time ultrasound imaging. Results Within-group comparisons revealed clinically meaningful reductions in pain for both groups: MC exercise only (N=40, mean change = -20.9 mm, 95% CI -25.7, -16.1) and combined MC and hip exercise (N=40, mean change = -24.9 mm, 95% CI -30.8, -19.0). There was no statistical difference between groups in the change in pain (mean difference = -4.0 mm, t = -1.07, p = 0.29, 95% CI -11.5, 3.5) or disability (mean difference = -0.3%, t = -0.19, p = 0.85, 95% CI -3.5, 2.8). Conclusion Both exercise programs had similar, positive effects on NSLBP, which supports the use of home-based exercise programs with weekly supervised visits. However, the addition of specific hip strengthening exercises to a MC-based exercise program did not result in significantly greater reductions in pain or disability. Trial Registration NCT01567566. Funding: Workers' Compensation Board Alberta Research Grant.

Relevance: 100.00%

Abstract:

The aim of this study was to elucidate the thermophysiological effects of wearing lightweight, non-military overt and covert personal body armour (PBA) in a hot and humid environment. Eight healthy males walked on a treadmill for 120 min at 22% of their heart rate reserve in a climate chamber simulating 31 °C (60% RH), wearing either no armour (control), overt PBA or covert PBA in addition to a security guard uniform, in a randomised controlled crossover design. No significant difference between conditions at the end of each trial was observed in core temperature, heart rate or skin temperature (P > 0.05). Covert PBA produced a significantly greater change in body mass (−1.81 ± 0.44%) compared with the control (−1.07 ± 0.38%, P = 0.009) and overt conditions (−1.27 ± 0.44%, P = 0.025). Although a greater change in body mass was observed after the covert PBA trial, based on the physiological outcome measures recorded, the heat strain encountered while wearing lightweight, non-military overt or covert PBA was negligible compared to no PBA. Practitioner summary: The wearing of bulletproof vests or body armour is a requirement for personnel engaged in a wide range of occupations, including police, security, customs and even journalists in theatres of war. This randomised controlled crossover study is the first to examine the thermophysiological effects of wearing lightweight, non-military overt and covert PBA in a hot and humid environment. We conclude that the heat strain encountered while wearing both overt and covert lightweight, non-military PBA was negligible compared to no PBA.
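Percentage body-mass change, used here as a marker of fluid loss, is simply the pre-to-post difference expressed relative to the pre-trial mass. A tiny worked example with hypothetical masses:

```python
# Percentage body-mass change from pre- to post-trial mass (hypothetical values).
pre_kg, post_kg = 81.0, 79.5
delta_pct = (post_kg - pre_kg) / pre_kg * 100
print(f"body mass change: {delta_pct:.2f}%")   # about -1.85% for these values;
                                               # the covert PBA condition above
                                               # averaged roughly -1.81%
```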

Relevance: 100.00%

Abstract:

Objectives To compare the efficacy of two exercise programs in reducing pain and disability for individuals with non-specific low back pain (NSLBP) and to examine the underlying mechanical factors related to pain and disability in this population. Design A single-blind, randomized controlled trial. Methods Eighty participants were recruited from eleven community-based general medical practices and randomized into two groups completing either a lumbopelvic motor control program or a combined lumbopelvic motor control and progressive hip strengthening exercise therapy program. All participants received an education session, 6 rehabilitation sessions including real-time ultrasound training, and a home-based exercise program manual and log book. The primary outcomes were pain (0-100 mm visual analogue scale) and disability (Oswestry Disability Index V2). The secondary outcomes were hip strength (N/kg) and two-dimensional frontal plane biomechanics (°) measured during the static Trendelenburg test and while walking. All outcomes were measured at baseline and at 6-week follow-up. Results There was no statistical difference between groups in the change in pain (mean difference = -4.0 mm, t = -1.07, p = 0.29, 95% CI -11.5, 3.5) or disability (mean difference = -0.3%, t = -0.19, p = 0.85, 95% CI -3.5, 2.8). Within-group comparisons revealed clinically meaningful reductions in pain for both Group One (mean change = -20.9 mm, 95% CI -25.7, -16.1) and Group Two (mean change = -24.9 mm, 95% CI -30.8, -19.0). Conclusion Both exercise programs had similar efficacy in reducing pain. The addition of hip strengthening exercises to a motor control exercise program does not appear to result in improved clinical outcomes for pain in individuals with non-specific low back pain.
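The between-group comparison of change scores reported above can be illustrated with a two-sample test on simulated placeholder data (not the trial data); the sketch below uses Welch's t-test and an approximate confidence interval for the difference in means:

```python
import numpy as np
from scipy import stats

# Hypothetical 6-week change-in-pain scores (mm VAS, negative = improvement)
# for the two exercise groups; placeholders, not the trial data.
rng = np.random.default_rng(4)
group1 = rng.normal(-20.9, 15, 40)   # motor control only
group2 = rng.normal(-24.9, 18, 40)   # motor control + hip strengthening

t, p = stats.ttest_ind(group1, group2, equal_var=False)   # Welch's t-test

diff = group1.mean() - group2.mean()
se = np.sqrt(group1.var(ddof=1) / len(group1) + group2.var(ddof=1) / len(group2))
dof = min(len(group1), len(group2)) - 1        # conservative df approximation
ci = diff + np.array([-1, 1]) * stats.t.ppf(0.975, dof) * se

print(f"between-group difference = {diff:.1f} mm, "
      f"95% CI {ci[0]:.1f} to {ci[1]:.1f}, t = {t:.2f}, p = {p:.2f}")
```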

Relevance: 100.00%

Abstract:

Purpose We designed a visual field test focused on the field used while driving to examine associations between field impairment and motor vehicle collision involvement in 2,000 drivers ≥70 years old. Methods The "driving visual field test" involved measuring light sensitivity for 20 targets in each eye, extending 15° superiorly, 30° inferiorly, 60° temporally and 30° nasally. The target locations were selected on the basis that they fell within the field region used when viewing through the windshield of a vehicle or viewing the dashboard while driving. Monocular fields were combined into a binocular field based on the more sensitive point from each eye. Severe impairment in the overall field or a region was defined as average sensitivity in the lowest quartile. At-fault collision involvement for the five years prior to enrollment was obtained from state records. Poisson regression was used to calculate crude and adjusted rate ratios examining the association between field impairment and at-fault collision involvement. Results Drivers with severe binocular field impairment in the overall driving visual field had a 40% higher rate of at-fault collision involvement (RR 1.40, 95% CI 1.07-1.83). Impairment in the lower and left fields was associated with elevated collision rates (RR 1.40, 95% CI 1.07-1.82 and RR 1.49, 95% CI 1.15-1.92, respectively), whereas impairment in the upper and right field regions was not. Conclusions Results suggest that older drivers with severe impairment in the lower or left region of the driving visual field are more likely to have a history of at-fault collision involvement.
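Rate ratios from Poisson regression are usually estimated with a log-exposure offset so that coefficients are interpretable as rates. A minimal sketch with simulated placeholder data and hypothetical variable names (the exposure measure here, miles driven, is an assumption for illustration, not the study's definition):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical driver-level data: at-fault collision counts over 5 years,
# a binary indicator of severe driving-visual-field impairment, and covariates.
rng = np.random.default_rng(5)
n = 2000
df = pd.DataFrame({
    "collisions":     rng.poisson(0.2, n),
    "field_impaired": rng.binomial(1, 0.25, n),
    "age":            rng.integers(70, 90, n),
    "miles_per_year": rng.normal(8000, 3000, n).clip(500),
})
df["log_exposure"] = np.log(5 * df["miles_per_year"])   # 5 years of driving

fit = smf.glm(
    "collisions ~ field_impaired + age",
    data=df,
    family=sm.families.Poisson(),
    offset=df["log_exposure"],
).fit()
print(np.exp(fit.params["field_impaired"]))         # adjusted rate ratio
print(np.exp(fit.conf_int().loc["field_impaired"]))  # 95% CI
```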

Relevance: 100.00%

Abstract:

Background: At present there are no large-scale nationally representative studies from Sri Lanka on the prevalence and associations of diabetic retinopathy (DR). The present study aims to evaluate the prevalence and risk factors for DR in a community-based, nationally representative sample of adults with self-reported diabetes mellitus from Sri Lanka. Methods: A cross-sectional community-based national study among 5,000 adults (≥18 years) was conducted in Sri Lanka, using a multi-stage stratified cluster sampling technique. An interviewer-administered questionnaire was used to collect data. Ophthalmological evaluation of patients with ‘known’ diabetes (previously diagnosed at a government hospital or by a registered medical practitioner) was done using indirect ophthalmoscopy. A binary logistic regression analysis was performed with ‘presence of DR’ as the dichotomous dependent variable and other independent covariates. Results: Crude prevalence of diabetes was 12.0% (n=536), of which 344 were patients with ‘known’ diabetes. Mean age was 56.4 ± 10.9 years and 37.3% were males. Prevalence of any degree of DR was 27.4% (males 30.5%, females 25.6%; p = 0.41). Among patients with DR, the majority had NPDR (93.4%), while 5.3% had maculopathy. Patients with DR had a significantly longer duration of diabetes than those without. In the binary logistic regression analysis in all adults, duration of diabetes (OR: 1.07), current smoking (OR: 1.67) and peripheral neuropathy (OR: 1.72) were all significantly associated with DR. Conclusions: Nearly one third of Sri Lankan adults with self-reported diabetes have retinopathy. DR was associated with diabetes duration, cigarette smoking and peripheral neuropathy. However, further prospective follow-up studies are required to establish causality for the identified risk factors.

Relevance: 100.00%

Abstract:

Background and purpose: The purpose of this study was to examine the feasibility of developing plasma biomarkers predictive of cerebral ischemic stroke before imaging evidence is acquired. Methods: Blood samples were obtained from 198 patients who attended our neurology department as emergencies with symptoms of vertigo, numbness, limb weakness, etc., within 4.5 h of symptom onset and before imaging evidence was obtained or medical treatment was given. After the final diagnosis was made by MRI/DWI/MRA or CTA in the following 24-72 h, the cases were divided into a stroke group and a non-stroke group according to the imaging results. The levels of baseline plasma antithrombin III (AT-III), thrombin-antithrombin III (TAT), fibrinogen, D-dimer and high-sensitivity C-reactive protein (hsCRP) in the two groups were assayed. Results: The level of baseline AT-III in the stroke group was 118.07 ± 26.22%, which was lower than that of the non-stroke group (283.83 ± 38.39%). The levels of TAT, fibrinogen and hsCRP were 7.24 ± 2.28 μg/L, 5.49 ± 0.98 g/L and 2.17 ± 1.07 mg/L, respectively, which were higher than those of the non-stroke group (2.53 ± 1.23 μg/L, 3.35 ± 0.50 g/L, 1.82 ± 0.67 mg/L). All these P-values were less than 0.001. The D-dimer level was 322.57 ± 60.34 μg/L, which was slightly higher than that of the non-stroke group (305.76 ± 49.52 μg/L), but the P-value was 0.667. The sensitivities of AT-III, TAT, fibrinogen, D-dimer and hsCRP for predicting ischemic stroke tendency were 97.37%, 96.05%, 3.29% and 7.89%, while the specificities were 93.62%, 82.61%, 100% and 100%, respectively; all these P-values were less than 0.001. High levels of D-dimer and hsCRP were mainly seen in the few cases with severe large-vessel infarction. Conclusions: Clinical manifestations of acute focal neurological deficits were associated with plasma AT-III and fibrinogen. These tests might help the risk assessment of acute cerebral ischemic stroke and/or TIA with infarction tendency in the superacute stage, before positive imaging evidence is obtained.
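Sensitivity and specificity, as reported for each biomarker, are computed from the cross-tabulation of the biomarker call against the final imaging-based diagnosis. A minimal worked sketch with a tiny made-up table:

```python
import pandas as pd

# Hypothetical predictions: biomarker flagged 'positive' (e.g. a marker beyond
# a chosen cut-off) versus the final imaging-based stroke diagnosis.
df = pd.DataFrame({
    "stroke":   [1, 1, 1, 1, 0, 0, 0, 0, 1, 0],
    "positive": [1, 1, 1, 0, 0, 0, 1, 0, 1, 0],
})

tp = ((df.stroke == 1) & (df.positive == 1)).sum()
fn = ((df.stroke == 1) & (df.positive == 0)).sum()
tn = ((df.stroke == 0) & (df.positive == 0)).sum()
fp = ((df.stroke == 0) & (df.positive == 1)).sum()

sensitivity = tp / (tp + fn)   # proportion of strokes flagged positive
specificity = tn / (tn + fp)   # proportion of non-strokes flagged negative
print(f"sensitivity = {sensitivity:.2%}, specificity = {specificity:.2%}")
```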

Relevance: 100.00%

Abstract:

Introduction There is limited understanding of how young adults’ driving behaviour varies according to long-term substance involvement. It is possible that regular users of amphetamine-type stimulants (ATS; i.e. ecstasy (MDMA) and methamphetamine) have a greater predisposition to engage in drink/drug driving compared with non-users. We compare offence rates, and self-reported drink/drug driving rates, for stimulant users and non-users in Queensland, and examine contributing factors. Methods The Natural History Study of Drug Use is a prospective longitudinal study using population screening to recruit a probabilistic sample of ATS users and non-users aged 19-23 years. At the 4½-year follow-up, consent was obtained to extract data from participants’ Queensland driver records (ATS users: n=217, non-users: n=135). Prediction models of offence rates among stimulant users were developed, controlling for factors such as aggression and delinquency. Results Stimulant users were more likely than non-users to have had a drink-driving offence (8.7% vs. 0.8%, p < 0.001). Further, about 26% of ATS users and 14% of non-users self-reported driving under the influence of alcohol during the last 12 months. Among stimulant users, drink-driving was independently associated with last-month high-volume alcohol consumption (incident rate ratio (IRR): 5.70, 95% CI: 2.24-14.52), depression (IRR: 1.28, 95% CI: 1.07-1.52), low income (IRR: 3.57, 95% CI: 1.12-11.38), and male gender (IRR: 5.40, 95% CI: 2.05-14.21). Conclusions Amphetamine-type stimulant use is associated with an increased long-term risk of drink-driving, due to a number of behavioural and social factors. Inter-sectoral approaches which target long-term behaviours may reduce offending rates.
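Incident rate ratios such as those above are typically obtained from count-data regression. A minimal sketch using a negative binomial GLM on simulated placeholder data with hypothetical variable names (the study's actual model specification may differ):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical participant-level data: drink-driving offence counts over the
# follow-up period and candidate predictors (placeholders, not study data).
rng = np.random.default_rng(6)
n = 350
df = pd.DataFrame({
    "offences":    rng.poisson(0.3, n),
    "high_volume": rng.binomial(1, 0.3, n),
    "depression":  rng.normal(10, 4, n),
    "low_income":  rng.binomial(1, 0.4, n),
    "male":        rng.binomial(1, 0.5, n),
})

# Negative binomial GLM; exponentiated coefficients are incident rate ratios.
fit = smf.glm(
    "offences ~ high_volume + depression + low_income + male",
    data=df,
    family=sm.families.NegativeBinomial(),
).fit()
print(np.exp(fit.params))
print(np.exp(fit.conf_int()))
```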

Relevance: 100.00%

Abstract:

Objective: To examine the association between preoperative quality of life (QoL) and postoperative adverse events in women treated for endometrial cancer. Methods: 760 women with apparent Stage I endometrial cancer were randomised into a clinical trial evaluating laparoscopic versus open surgery. This analysis includes women with preoperative QoL measurements from the Functional Assessment of Cancer Therapy-General (FACT-G) questionnaire who were followed up for at least 6 weeks after surgery (n=684). The outcomes for this study were defined as (1) the occurrence of moderate to severe adverse events (AEs) within 6 months (Common Toxicology Criteria (CTC) grade ≥3); and (2) any serious adverse event (SAE). The association between preoperative QoL and the occurrence of AEs was examined after controlling for baseline comorbidity and other factors. Results: After adjusting for other factors, the odds of an AE of CTC grade ≥3 increased significantly with each unit decrease in baseline FACT-G score (OR=1.02, 95% CI 1.00-1.03, p=0.030), which was driven by the physical well-being (PWB) (OR=1.09, 95% CI 1.04-1.13, p=0.0002) and functional well-being (FWB) subscales (OR=1.04, 95% CI 1.00-1.07, p=0.035). Similarly, the odds of an SAE increased significantly with each unit decrease in baseline FACT-G score (OR=1.02, 95% CI 1.01-1.04, p=0.011), baseline PWB (OR=1.11, 95% CI 1.06-1.16, p<0.0001) or baseline FWB subscale (OR=1.05, 95% CI 1.01-1.10, p=0.0077). Conclusion: Women with early endometrial cancer presenting with lower QoL prior to surgery are at higher risk of developing a serious adverse event following surgery. Funding: Cancer Council Queensland; Cancer Council New South Wales; Cancer Council Victoria; Cancer Council Western Australia; NHMRC project grant 456110; Cancer Australia project grant 631523; The Women and Infants Research Foundation, Western Australia; Royal Brisbane and Women’s Hospital Foundation; Wesley Research Institute; Gallipoli Research Foundation; Gynetech; TYCO Healthcare, Australia; Johnson and Johnson Medical, Australia; Hunter New England Centre for Gynaecological Cancer; Genesis Oncology Trust; and Smart Health Research Grant QLD Health.
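The reported odds ratios are per one-unit decrease in the baseline score. Because logistic regression is linear on the log-odds scale, the odds ratio for a k-unit decrease is the per-unit odds ratio raised to the power k; a small worked example using only the point estimates quoted above:

```python
import numpy as np

# Per-unit odds ratios reported for a 1-point decrease in baseline score.
or_factg_per_point = 1.02
or_pwb_per_point = 1.09

# Odds ratio for a 10-point decrease: raise the per-unit OR to the 10th power,
# equivalently exponentiate 10 times the log-OR.
print(or_factg_per_point ** 10)               # ~1.22
print(np.exp(10 * np.log(or_pwb_per_point)))  # ~2.37, same rule via logs
```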

Relevance: 100.00%

Abstract:

Objective To quantify and compare the treatment effect and risk of bias of trials reporting biomarkers or intermediate outcomes (surrogate outcomes) versus trials using final patient-relevant primary outcomes. Design Meta-epidemiological study. Data sources All randomised clinical trials published in 2005 and 2006 in six high-impact medical journals: Annals of Internal Medicine, BMJ, Journal of the American Medical Association, Lancet, New England Journal of Medicine, and PLoS Medicine. Study selection Two independent reviewers selected trials. Data extraction Trial characteristics, risk of bias, and outcomes were recorded according to a predefined form. Two reviewers independently checked data extraction. The ratio of odds ratios was used to quantify the degree of difference in treatment effects between the trials using surrogate outcomes and those using patient-relevant outcomes, also adjusted for trial characteristics. A ratio of odds ratios >1.0 implies that trials with surrogate outcomes report larger intervention effects than trials with patient-relevant outcomes. Results 84 trials using surrogate outcomes and 101 using patient-relevant outcomes were considered for the analyses. Study characteristics of trials using surrogate outcomes and those using patient-relevant outcomes were well balanced, except for median sample size (371 v 741) and single-centre status (23% v 9%). Their risk of bias did not differ. The primary analysis showed trials reporting surrogate endpoints to have larger treatment effects (odds ratio 0.51, 95% confidence interval 0.42 to 0.60) than trials reporting patient-relevant outcomes (0.76, 0.70 to 0.82), with an unadjusted ratio of odds ratios of 1.47 (1.07 to 2.01) and an adjusted ratio of odds ratios of 1.46 (1.05 to 2.04). This result was consistent across sensitivity and secondary analyses. Conclusions Trials reporting surrogate primary outcomes are more likely to report larger treatment effects than trials reporting final patient-relevant primary outcomes. This finding was not explained by differences in the risk of bias or the characteristics of the two groups of trials.
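A ratio of odds ratios can be formed on the log scale, with standard errors recovered from the reported confidence intervals and combined in quadrature. The back-of-envelope sketch below uses the two pooled ORs quoted above; it treats them as independent and ignores the meta-analytic adjustments, so it will not exactly reproduce the published interval:

```python
import numpy as np

def log_or_and_se(or_point, ci_low, ci_high):
    """Recover log(OR) and its standard error from a 95% CI."""
    log_or = np.log(or_point)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    return log_or, se

# Reported summary ORs (surrogate-outcome vs patient-relevant-outcome trials).
log_s, se_s = log_or_and_se(0.51, 0.42, 0.60)
log_p, se_p = log_or_and_se(0.76, 0.70, 0.82)

# Ratio of odds ratios: difference of log-ORs; SEs combine in quadrature.
log_ror = log_p - log_s
se_ror = np.sqrt(se_s**2 + se_p**2)
ci = np.exp(log_ror + np.array([-1.96, 1.96]) * se_ror)
print(f"ROR = {np.exp(log_ror):.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```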

Relevance: 100.00%

Abstract:

Background Studies of mid-aged adults provide evidence of a relationship between sitting-time and all-cause mortality, but evidence in older adults is limited. The aim was to examine the relationship between total sitting-time and all-cause mortality in older women. Methods The prospective cohort design involved 6656 participants in the Australian Longitudinal Study on Women's Health who were followed for up to 9 years (from 2002, age 76–81, to 2011, age 85–90). Self-reported total sitting-time was linked to all-cause mortality data from the National Death Index from 2002 to 2011. Cox proportional hazards models were used to examine the relationship between sitting-time and all-cause mortality, with adjustment for potential sociodemographic, behavioural and health confounders. Results There were 2003 (30.1%) deaths during a median follow-up of 6 years. Compared with participants who sat <4 h/day, those who sat 8–11 h/day had a 1.45 times higher risk of death and those who sat ≥11 h/day had a 1.65 times higher risk of death. These risks remained after adding sociodemographic and behavioural covariates, but were attenuated after adjustment for health covariates. A significant interaction (p=0.02) was found between sitting-time and physical activity (PA), with increased mortality risk for prolonged sitting only among participants not meeting PA guidelines (HR for sitting ≥8 h/day: 1.31, 95% CI 1.07 to 1.61; HR for sitting ≥11 h/day: 1.47, 95% CI 1.15 to 1.93). Conclusions Prolonged sitting-time was positively associated with all-cause mortality. Women who reported sitting for more than 8 h/day and did not meet PA guidelines had an increased risk of dying within the next 9 years.
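Cox proportional hazards models of this kind can be sketched with the lifelines library; the example below uses simulated placeholder data and hypothetical variable names, not the cohort data:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort: follow-up time (years), death indicator, sitting-time
# category indicators and a couple of confounders; placeholders only.
rng = np.random.default_rng(7)
n = 1000
df = pd.DataFrame({
    "years":        rng.uniform(0.5, 9.0, n),
    "died":         rng.binomial(1, 0.30, n),
    "sit_8_11h":    rng.binomial(1, 0.25, n),
    "sit_11h_plus": rng.binomial(1, 0.10, n),
    "age":          rng.integers(76, 82, n),
    "meets_pa":     rng.binomial(1, 0.4, n),
})

# All columns other than duration/event are used as covariates.
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
cph.print_summary()            # hazard ratios are exp(coef)
```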