Abstract:
OBJECTIVES: Bone attrition probably constitutes remodeling of the bone, resulting in flattening or depression of the articular surfaces. Defining bone attrition is challenging because it is an accentuation of the normal curvature of the tibial plateaus. We aimed to define bone attrition on magnetic resonance imaging (MRI) of the knee using information from both radiographs and MRIs, and to assess whether bone attrition is common prior to end-stage osteoarthritis (OA) in the tibio-femoral joint. METHODS: All knees of participants in the community-based sample of the Framingham OA Study were evaluated for bone attrition in radiographs and MRIs. Radiographs were scored based on templates designed to outline the normal contours of the tibio-femoral joint. MRIs were analyzed using the semi-quantitative Whole-Organ Magnetic Resonance Imaging Scoring (WORMS) method. The prevalence of bone attrition was calculated using two different thresholds for MRI scores. RESULTS: Inter-observer agreement for identification of bone attrition was substantial for the radiographs (kappa=0.71, 95% CI 0.67-0.81) and moderate for MRI (kappa=0.56, 95% CI 0.40-0.72). Of 964 knees, 5.7% of the radiographs showed bone attrition. Of these, 91% of MRIs were also read as showing bone attrition. We selected a conservative threshold for bone attrition on MRI scoring (≥2 on a 0-3 scale) based on agreement with attrition on the radiograph or on co-occurrence of bone attrition on MRI with cartilage loss. Using this threshold, bone attrition was common in knees with OA. For example, in knees with mild OA but no joint space narrowing, 13 of 88 MRIs (14.8%) showed bone attrition. CONCLUSIONS: Using MRI we found that many knees with mild OA without joint space narrowing on radiographs had bone attrition, even using conservative definitions. The validity of our definition of bone attrition should be evaluated in further studies.
Bone attrition may occur in milder OA and at earlier stages of disease than previously thought.
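The inter-observer agreement figures above are Cohen's kappa statistics for two readers rating attrition as present or absent. As a minimal sketch (the 2x2 counts below are illustrative, not the study's data):

```python
# Hedged sketch: Cohen's kappa for two readers scoring a binary finding
# (e.g. bone attrition present/absent). Counts are made up for illustration.

def cohens_kappa(a_yes_b_yes, a_yes_b_no, a_no_b_yes, a_no_b_no):
    """Kappa = (p_o - p_e) / (1 - p_e) for two binary raters."""
    n = a_yes_b_yes + a_yes_b_no + a_no_b_yes + a_no_b_no
    p_o = (a_yes_b_yes + a_no_b_no) / n              # observed agreement
    a_yes = (a_yes_b_yes + a_yes_b_no) / n           # rater A "yes" marginal
    b_yes = (a_yes_b_yes + a_no_b_yes) / n           # rater B "yes" marginal
    p_e = a_yes * b_yes + (1 - a_yes) * (1 - b_yes)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

print(round(cohens_kappa(40, 5, 10, 45), 3))  # → 0.7, "substantial" agreement
```

Values of roughly 0.6-0.8 are conventionally labeled "substantial" and 0.4-0.6 "moderate", matching the wording used for the radiograph and MRI readings above.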
Abstract:
Background: In contrast with established evidence linking high doses of ionizing radiation with childhood cancer, research on low-dose ionizing radiation and childhood cancer has produced inconsistent results. Objective: We investigated the association between domestic radon exposure and childhood cancers, particularly leukemia and central nervous system (CNS) tumors. Methods: We conducted a nationwide census-based cohort study including all children < 16 years of age living in Switzerland on 5 December 2000, the date of the 2000 census. Follow-up lasted until the date of diagnosis, death, emigration, a child’s 16th birthday, or 31 December 2008. Domestic radon levels were estimated for each individual home address using a model developed and validated based on approximately 45,000 measurements taken throughout Switzerland. Data were analyzed with Cox proportional hazard models adjusted for child age, child sex, birth order, parents’ socioeconomic status, environmental gamma radiation, and period effects. Results: In total, 997 childhood cancer cases were included in the study. Compared with children exposed to a radon concentration below the median (< 77.7 Bq/m3), adjusted hazard ratios for children with exposure ≥ the 90th percentile (≥ 139.9 Bq/m3) were 0.93 (95% CI: 0.74, 1.16) for all cancers, 0.95 (95% CI: 0.63, 1.43) for all leukemias, 0.90 (95% CI: 0.56, 1.43) for acute lymphoblastic leukemia, and 1.05 (95% CI: 0.68, 1.61) for CNS tumors. Conclusions: We did not find evidence that domestic radon exposure is associated with childhood cancer, despite relatively high radon levels in Switzerland.
Abstract:
OBJECTIVES Dentine hypersensitivity (DH) manifests as a transient but arresting oral pain. The incidence is thought to be rising, particularly in young adults, due to increased consumption of healthy, yet erosive, diets. This study aimed to assess the prevalence of DH and the relative importance of risk factors in 18-35 year old Europeans. METHODS In 2011, 3187 adults were enrolled from general dental practices in France, Spain, Italy, the United Kingdom, Finland, Latvia and Estonia. DH was clinically evaluated by cold air tooth stimulation with patient pain rating (yes/no), accompanied by investigator pain rating (Schiff 0-3). Erosive toothwear (BEWE index 0-3) and gingival recession (mm) were recorded. Patients completed a questionnaire regarding the nature of their DH, erosive dietary intake and toothbrushing habits. RESULTS 41.9% of patients reported pain on tooth stimulation and 56.8% scored ≥1 on the Schiff scale for at least one tooth. Clinically elicited sensitivity was closely related to Schiff score and, to a lesser degree, to questionnaire-reported sensitivity (26.8%), possibly reflecting the transient nature of the pain alongside good coping mechanisms. Significant associations were found between clinically elicited DH and erosive toothwear and gingival recession. The questionnaire showed marked associations between DH and risk factors including heartburn/acid reflux, vomiting, sleeping medications, energy drinks, smoking and acid dietary intake. CONCLUSION Overall, the prevalence of DH was high compared with many published findings, with a strong, progressive relationship between DH and erosive toothwear, which is important to recognise for patient preventive therapies and clinical management of DH pain.
Abstract:
Several studies have shown associations of posttraumatic stress disorder (PTSD) with the development of cardiometabolic diseases. The underlying psychopathological mechanisms, including potential links to inflammatory processes, have been discussed but remain elusive. Therefore, the aim of the present study was to evaluate the association of PTSD symptoms with the inflammatory biomarkers C-reactive protein (CRP) and interleukin-18 (IL-18). The study population consisted of 3012 participants aged 32-81 years drawn from the population-based KORA F4 study conducted in 2006-08 in the Augsburg region (Southern Germany). PTSD symptoms were measured by the Impact of Event Scale, the Posttraumatic Diagnostic Scale and interview data and classified as no, partial or full PTSD. The associations of PTSD with CRP and IL-18 concentrations were estimated by multiple regression analyses with adjustments for age, sex and cardiometabolic risk factors. Linear regression analyses showed no significant association between PTSD and CRP or IL-18 concentration: adjusted for age and sex, the geometric mean concentration in participants with full PTSD was 9% lower for CRP and 1% higher for IL-18 than in participants with no PTSD (p values 0.53 and 0.89, respectively). However, further analyses indicated that individuals with partial PTSD had an increased chance of belonging to the highest quartile of IL-18 concentration. No significant association was observed for any of the three subscales intrusion, avoidance or hyperarousal with CRP or IL-18 concentration. This large, population-based study could not find an association of full PTSD with CRP and IL-18 concentrations. Further research is needed to analyse these relationships.
Abstract:
PURPOSE Patients with Alzheimer's disease (AD) have an increased risk of developing seizures or epilepsy. Little is known about the role of risk factors and about the risk of developing seizures/epilepsy in patients with vascular dementia (VD). The aim of this study was to assess incidence rates (IRs) of seizures/epilepsy in patients with AD, VD, or without dementia, and to identify potential risk factors for seizures or epilepsy. METHODS We conducted a follow-up study with a nested case-control analysis using the United Kingdom-based General Practice Research Database (GPRD). We identified patients aged ≥65 years with an incident diagnosis of AD or VD between 1998 and 2008 and a matched comparison group of dementia-free patients. Conditional logistic regression was used to estimate the odds ratio (OR) with a 95% confidence interval (CI) of developing seizures/epilepsy in patients with AD or VD, stratified by age at onset and duration of dementia as well as by use of antidementia drugs. KEY FINDINGS Among 7,086 cases with AD, 4,438 with VD, and 11,524 matched dementia-free patients, we identified 180 cases with an incident diagnosis of seizures/epilepsy. The IRs of epilepsy/seizures for patients with AD or VD were 5.6/1,000 person-years (py) (95% CI 4.6-6.9) and 7.5/1,000 py (95% CI 5.7-9.7), respectively, and 0.8/1,000 py (95% CI 0.6-1.1) in the dementia-free group. In the nested case-control analysis, patients with longer-standing (≥3 years) AD had a slightly higher risk of developing seizures or epilepsy than those with a shorter disease duration, whereas in patients with VD the contrary was observed. SIGNIFICANCE Seizures or epilepsy were substantially more common in patients with AD and VD than in dementia-free patients. The role of disease duration as a risk factor for seizures/epilepsy seems to differ between AD and VD.
Abstract:
We investigated the association between exposure to radio-frequency electromagnetic fields (RF-EMFs) from broadcast transmitters and childhood cancer. First, we conducted a time-to-event analysis including children under age 16 years living in Switzerland on December 5, 2000. Follow-up lasted until December 31, 2008. Second, all children living in Switzerland for some time between 1985 and 2008 were included in an incidence density cohort. RF-EMF exposure from broadcast transmitters was modeled. Based on 997 cancer cases, adjusted hazard ratios in the time-to-event analysis for the highest exposure category (>0.2 V/m) as compared with the reference category (<0.05 V/m) were 1.03 (95% confidence interval (CI): 0.74, 1.43) for all cancers, 0.55 (95% CI: 0.26, 1.19) for childhood leukemia, and 1.68 (95% CI: 0.98, 2.91) for childhood central nervous system (CNS) tumors. Results of the incidence density analysis, based on 4,246 cancer cases, were similar for all types of cancer and leukemia but did not indicate a CNS tumor risk (incidence rate ratio = 1.03, 95% CI: 0.73, 1.46). This large census-based cohort study did not suggest an association between predicted RF-EMF exposure from broadcasting and childhood leukemia. Results for CNS tumors were less consistent, but the most comprehensive analysis did not suggest an association.
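Results reported as a point estimate with a 95% interval, such as HR 1.03 (95% CI: 0.74, 1.43) above, can be sanity-checked by recovering the Wald standard error from the interval. A minimal sketch, assuming a symmetric normal interval on the log scale (a simplification of the actual Cox model output):

```python
# Hedged sketch: recover the standard error and two-sided Wald p-value of a
# log hazard ratio from a reported estimate and 95% CI. Illustrative only;
# the published models adjust for covariates and may use other test statistics.
import math

def wald_from_ci(hr, lo, hi, z_crit=1.959964):
    se = (math.log(hi) - math.log(lo)) / (2 * z_crit)  # CI width on log scale
    z = math.log(hr) / se                              # Wald statistic
    # two-sided p-value from the standard normal distribution
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return se, z, p

se, z, p = wald_from_ci(1.03, 0.74, 1.43)  # the all-cancers HR above
```

For that estimate the recovered p-value is far above 0.05, consistent with the authors' conclusion of no association for all cancers combined.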
Abstract:
Background: Few studies have examined the 20% of individuals who never experience an episode of low back pain (LBP). To date, no investigation has been undertaken that examines a group who claim to have never experienced LBP in their lifetime in comparison to two population-based case–control groups with and without momentary LBP. This study investigates whether LBP-resilient workers between 50 and 65 years had better general health, demonstrated more positive health behaviour and were better able to achieve routine activities compared with both case–control groups. Methods: Forty-two LBP-resilient participants completed the same pain assessment questionnaire as a population-based LBP sample from a nationwide, large-scale cross-sectional survey in Switzerland. The LBP-resilient participants were pairwise compared to the propensity score-matched case controls by exploring differences in demographic and work characteristics, and by calculating odds ratios (ORs) and effect sizes. A discriminant analysis explored group differences, while the multiple logistic regression analysis specified single indicators which accounted for group differences. Results: LBP-resilient participants were healthier than the case controls with momentary LBP and achieved routine activities more easily. Compared to controls without momentary LBP, LBP-resilient participants had a higher vitality, a lower workload, a healthier attitude towards health and behaved more healthily by drinking less alcohol. Conclusions: By demonstrating a difference between LBP-resilient participants and controls without momentary LBP, the question that arises is what additional knowledge can be attained. Three underlying traits seem to be relevant about LBP-resilient participants: personality, favourable work conditions and subjective attitudes/attributions towards health. These rationales have to be considered with respect to LBP prevention.
Abstract:
Advances in radiotherapy have generated increased interest in comparative studies of treatment techniques and their effectiveness. In this respect, pediatric patients are of specific interest because of their sensitivity to radiation induced second cancers. However, due to the rarity of childhood cancers and the long latency of second cancers, large sample sizes are unavailable for the epidemiological study of contemporary radiotherapy treatments. Additionally, when specific treatments are considered, such as proton therapy, sample sizes are further reduced due to the rareness of such treatments. We propose a method to improve statistical power in micro clinical trials. Specifically, we use a more biologically relevant quantity, cancer equivalent dose (DCE), to estimate risk instead of mean absorbed dose (DMA). Our objective was to demonstrate that when DCE is used fewer subjects are needed for clinical trials. Thus, we compared the impact of DCE vs. DMA on sample size in a virtual clinical trial that estimated risk for second cancer (SC) in the thyroid following craniospinal irradiation (CSI) of pediatric patients using protons vs. photons. Dose reconstruction, risk models, and statistical analysis were used to evaluate SC risk from therapeutic and stray radiation from CSI for 18 patients. Absorbed dose was calculated in two ways: with (1) traditional DMA and (2) with DCE. DCE and DMA values were used to estimate relative risk of SC incidence (RRCE and RRMA, respectively) after proton vs. photon CSI. Ratios of RR for proton vs. photon CSI (RRRCE and RRRMA) were then used in comparative estimations of sample size to determine the minimal number of patients needed to maintain 80% statistical power when using DCE vs. DMA. For all patients, we found that protons substantially reduced the risk of developing a second thyroid cancer when compared to photon therapy. Mean RRR values were 0.052±0.014 and 0.087±0.021 for RRRMA and RRRCE, respectively. 
However, we did not find that use of DCE reduced the number of patients needed for acceptable statistical power (i.e., 80%). In fact, when considerations were made for RRR values that met equipoise requirements and the need for descriptive statistics, the minimum number of patients needed for a micro clinical trial increased from 17 using DMA to 37 using DCE. Subsequent analyses revealed that for our sample, the most influential factor in determining variations in sample size was the experimental standard deviation of estimates of RRR across the patient sample. Additionally, because the relative uncertainty in dose from proton CSI was so much larger (on the order of 2000 times larger) than the other uncertainty terms, it dominated the uncertainty in RRR. Thus, we found that use of corrections for cell sterilization, in the form of DCE, may be an important and underappreciated consideration in the design of clinical trials and radio-epidemiological studies. In addition, the accurate application of cell sterilization to thyroid dose was sensitive to variations in absorbed dose, especially for proton CSI, which may stem from errors in patient positioning, range calculation, and other aspects of treatment planning and delivery.
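The dependence of sample size on the standard deviation of the RRR estimates, noted above, is visible in the conventional two-sample sample-size formula for 80% power. A hedged sketch using the normal-approximation formula (not necessarily the paper's exact calculation):

```python
# Hedged sketch: subjects per group needed to detect a mean difference
# `delta` given a common standard deviation `sigma`, at two-sided alpha
# with the requested power. Normal approximation; illustrative only.
import math

def n_per_group(delta, sigma, z_alpha=1.959964, z_beta=0.841621):
    # z_alpha: two-sided 5% critical value; z_beta: one-sided value for 80% power
    return math.ceil(2 * ((z_alpha + z_beta) ** 2) * sigma ** 2 / delta ** 2)

# Required n grows with the square of sigma/delta, so a noisier endpoint
# (larger experimental SD relative to the detectable difference) quickly
# inflates the trial. E.g. a standardized effect of 0.5 SD:
print(n_per_group(delta=0.5, sigma=1.0))  # → 63 per group
```

The quadratic sigma/delta dependence is why the larger spread of the DCE-based RRR estimates can more than offset any gain from a more biologically relevant dose quantity.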
Abstract:
Purpose Recently, multiple clinical trials have demonstrated improved outcomes in patients with metastatic colorectal cancer. This study investigated whether the improved survival is race dependent. Patients and Methods Overall and cancer-specific survival of 77,490 White and Black patients with metastatic colorectal cancer from the 1988–2008 Surveillance Epidemiology and End Results registry were compared using unadjusted and multivariable adjusted Cox proportional hazard regression as well as competing risk analyses. Results Median age was 69 years, 47.4% were female and 86.0% White. Median survival was 11 months overall, with an increase from 8 to 14 months between 1988 and 2008. Overall survival increased from 8 to 14 months for White patients, and from 6 to 13 months for Black patients. After multivariable adjustment, the following parameters were associated with better survival: White, female, younger, better educated and married patients, patients with higher income and living in urban areas, patients with rectosigmoid junction and rectal cancer, undergoing cancer-directed surgery, and having well/moderately differentiated and N0 tumors (p<0.05 for all covariates). Discrepancies in overall survival based on race did not change significantly over time; however, there was a significant decrease in cancer-specific survival discrepancies over time between White and Black patients, with a hazard ratio of 0.995 (95% confidence interval 0.991–1.000) per year (p=0.03). Conclusion A clinically relevant overall survival increase was found from 1988 to 2008 in this population-based analysis for both White and Black patients with metastatic colorectal cancer. Although both White and Black patients benefitted from this improvement, a slight discrepancy between the two groups remained.
Abstract:
BACKGROUND It is often assumed that blood pressure increases acutely after major stroke, resulting in so-called post-stroke hypertension. In view of evidence that the risks and benefits of blood pressure-lowering treatment in acute stroke might differ between patients with major ischaemic stroke and those with primary intracerebral haemorrhage, we compared acute-phase and premorbid blood pressure levels in these two disorders. METHODS In a population-based study in Oxfordshire, UK, we recruited all patients presenting with stroke between April 1, 2002, and March 31, 2012. We compared all acute-phase post-event blood pressure readings with premorbid readings from 10-year primary care records in all patients with acute major ischaemic stroke (National Institutes of Health Stroke Scale >3) versus those with acute intracerebral haemorrhage. FINDINGS Of 653 consecutive eligible patients, premorbid and acute-phase blood pressure readings were available for 636 (97%) individuals. Premorbid blood pressure (total readings 13,244) had been measured on a median of 17 separate occasions per patient (IQR 8-31). In patients with ischaemic stroke, the first acute-phase systolic blood pressure was much lower than after intracerebral haemorrhage (158·5 mm Hg [SD 30·1] vs 189·8 mm Hg [38·5], p<0·0001; for patients not on antihypertensive treatment 159·2 mm Hg [27·8] vs 193·4 mm Hg [37·4], p<0·0001), was little higher than premorbid levels (increase of 10·6 mm Hg vs 10-year mean premorbid level), and decreased only slightly during the first 24 h (mean decrease from <90 min to 24 h 13·6 mm Hg). By contrast with findings in ischaemic stroke, the mean first systolic blood pressure after intracerebral haemorrhage was substantially higher than premorbid levels (mean increase of 40·7 mm Hg, p<0·0001) and fell substantially in the first 24 h (mean decrease of 41·1 mm Hg; p=0·0007 for difference from decrease in ischaemic stroke). 
Mean systolic blood pressure also increased steeply in the days and weeks before intracerebral haemorrhage (regression p<0·0001) but not before ischaemic stroke. Consequently, the first acute-phase blood pressure reading after primary intracerebral haemorrhage was more likely than after ischaemic stroke to be the highest ever recorded (OR 3·4, 95% CI 2·3-5·2, p<0·0001). In patients with intracerebral haemorrhage seen within 90 min, the highest systolic blood pressure within 3 h of onset was 50 mm Hg higher, on average, than the maximum premorbid level whereas that after ischaemic stroke was 5·2 mm Hg lower (p<0·0001). INTERPRETATION Our findings suggest that systolic blood pressure is substantially raised compared with usual premorbid levels after intracerebral haemorrhage, whereas acute-phase systolic blood pressure after major ischaemic stroke is much closer to the accustomed long-term premorbid level, providing a potential explanation for why the risks and benefits of lowering blood pressure acutely after stroke might be expected to differ. FUNDING Wellcome Trust, Wolfson Foundation, UK Medical Research Council, Stroke Association, British Heart Foundation, National Institute for Health Research.
Abstract:
BACKGROUND In Switzerland, the heptavalent (PCV7) and 13-valent (PCV13) pneumococcal conjugate vaccines were recommended for all infants aged <2 years in 2007 and 2011, respectively. Due to herd effects, a protective impact on invasive pneumococcal disease (IPD) rates in adults had been expected. METHODS In this study, data from the nationwide mandatory surveillance were analyzed for all adult patients ≥16 years with IPD of known serotype/serogroup during 2003-2012. Trend analyses (for IPD cases from 2003 to 2012) and logistic regression analyses (2007-2010) were performed to identify changes in serotype distribution and to identify the association of serotypes with age, clinical manifestations, comorbidities and case fatality, respectively. FINDINGS The proportion of PCV7 serotypes among all IPD cases (n=7678) declined significantly in adults from 44.7% (2003) before to 16.7% (2012) after the recommendation of PCV7 (P<0.001). In contrast, the proportion of non-PCV7 serogroups/serotypes increased over the same period, both for non-PCV13 serotypes and for PCV13 serotypes not included in PCV7. Serotype distribution varied significantly across ages, clinical manifestations and comorbidities. Serotype was furthermore associated with case fatality (P=0.001). In a multivariable logistic regression model analyzing single serotypes, case fatality was increased for serotypes 3 (P=0.008), 19A (P=0.03) and 19F (P=0.005), compared to serotypes 1 and 7F. CONCLUSION There was a significant decline in PCV7 serotypes among adults with IPD in Switzerland after introduction of childhood vaccination with PCV7. Pneumococcal serotypes were associated with case fatality, age, clinical manifestation and comorbidities of IPD in adults. These results may prove useful for future vaccine recommendations for adults in Switzerland.
Abstract:
OBJECTIVE Use of diuretics has been associated with an increased risk of gout. Data on different types of diuretics are scarce. We undertook this study to investigate the association between use of loop diuretics, thiazide or thiazide-like diuretics, and potassium-sparing agents and the risk of developing incident gout. METHODS We conducted a retrospective population-based case-control analysis using the General Practice Research Database established in the UK. We identified case patients who were diagnosed as having incident gout between 1990 and 2010. One control patient was matched to each case patient for age, sex, general practice, calendar time, and years of active history in the database. We used conditional logistic regression to calculate odds ratios (ORs) and 95% confidence intervals (95% CIs), and we adjusted for potential confounders. RESULTS We identified 91,530 incident cases of gout and the same number of matched controls. Compared to past use of diuretics from each respective drug class, adjusted ORs for current use of loop diuretics, thiazide diuretics, thiazide-like diuretics, and potassium-sparing diuretics were 2.64 (95% CI 2.47-2.83), 1.70 (95% CI 1.62-1.79), 2.30 (95% CI 1.95-2.70), and 1.06 (95% CI 0.91-1.23), respectively. Combined use of loop diuretics and thiazide diuretics was associated with the highest relative risk estimates of gout (adjusted OR 4.65 [95% CI 3.51-6.16]). Current use of calcium channel blockers or losartan slightly attenuated the risk of gout in patients who took diuretics. CONCLUSION Use of loop diuretics, thiazide diuretics, and thiazide-like diuretics was associated with an increased risk of incident gout, although use of potassium-sparing agents was not.
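For 1:1 matched case-control data with a single binary exposure, the conditional logistic regression estimate reduces to the ratio of pairs discordant on exposure, with a Wald interval on the log scale. A minimal sketch with illustrative counts (not the study's data, and without the covariate adjustment the authors applied):

```python
# Hedged sketch: matched-pairs odds ratio for a 1:1 matched case-control
# design. `case_exposed_only` = pairs where only the case was exposed;
# `control_exposed_only` = pairs where only the control was exposed.
# Concordant pairs contribute no information and are omitted.
import math

def matched_pairs_or(case_exposed_only, control_exposed_only, z_crit=1.96):
    b, c = case_exposed_only, control_exposed_only
    or_hat = b / c                      # conditional ML estimate
    se = math.sqrt(1 / b + 1 / c)       # SE of log(OR) for matched pairs
    lo = math.exp(math.log(or_hat) - z_crit * se)
    hi = math.exp(math.log(or_hat) + z_crit * se)
    return or_hat, lo, hi

or_hat, lo, hi = matched_pairs_or(120, 45)  # illustrative discordant counts
```

With multiple exposure levels and confounders, as in the study above, the full conditional likelihood is maximized numerically instead, but the discordant-pair intuition carries over: matched sets in which case and control share the same exposure contribute nothing to the estimate.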