818 results for clinical (human) or epidemiologic studies: risk factor assessment
Abstract:
OBJECTIVE: Occupational low back pain (LBP) is considered the most expensive form of work disability, with the socioeconomic costs of persistent LBP far exceeding those of acute and subacute LBP. This makes the early identification of patients at risk of developing persistent LBP essential, especially in working populations. The aim of the study was to evaluate both risk factors (for the development of persistent LBP) and protective factors (preventing the development of persistent LBP) in the same cohort. PARTICIPANTS: An inception cohort of 315 patients with acute, subacute or recurrent LBP was recruited from 14 health practitioners (12 general practitioners and 2 physiotherapists) across New Zealand. METHODS: Patients with persistent LBP at six-month follow-up were compared with patients with non-persistent LBP on occupational, psychological, biomedical and demographic/lifestyle predictors at baseline using multiple logistic regression analyses. All significant variables from the different domains were combined into a single predictor model. RESULTS: A final two-predictor model with an overall predictive value of 78% included social support at work (OR 0.67; 95% CI 0.45 to 0.99) and somatization (OR 1.08; 95% CI 1.01 to 1.15). CONCLUSIONS: Social support at work should be considered a resource preventing the development of persistent LBP, whereas somatization should be considered a risk factor for its development. Further studies are needed to determine whether addressing these factors in workplace interventions for patients with acute, subacute or recurrent LBP prevents the subsequent development of persistent LBP.
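As a rough illustration of the kind of analysis this abstract describes, the sketch below fits a two-predictor logistic model and derives odds ratios and an overall predictive value. The data frame, variable scales and synthetic data are hypothetical; only the model structure follows the abstract.

```python
# Minimal sketch of a two-predictor logistic model (synthetic, illustrative data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 315  # cohort size from the abstract; the data below are invented
df = pd.DataFrame({
    "social_support": rng.normal(10, 2, n),  # hypothetical scale
    "somatization": rng.normal(20, 6, n),    # hypothetical scale
})
# Synthetic outcome loosely mimicking the reported directions of effect.
logit_p = -0.4 * (df["social_support"] - 10) + 0.08 * (df["somatization"] - 20) - 1.0
df["persistent"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("persistent ~ social_support + somatization", data=df).fit(disp=0)
or_table = np.exp(model.conf_int())          # 95% CI bounds on the OR scale
or_table["OR"] = np.exp(model.params)
print(or_table)  # analogues of the quoted OR 0.67 and OR 1.08

# "Overall predictive value": proportion correctly classified at a 0.5 cut-off.
print(((model.predict() >= 0.5) == df["persistent"]).mean())
```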
Abstract:
Female inmates make up the fastest growing segment of the criminal justice system today. This rapidly increasing trend calls for enhanced efforts to strategically plan correctional facilities that address the needs of this growing population, and to work with communities to prevent crime among women. Incarcerated women in the U.S. have an estimated 145,000 minor children, who are predisposed to unique psychosocial problems as a result of parental incarceration. This study examined the patterns of care and outcomes for pregnant inmates and their infants in Texas state prisons between 1994 and 1996. The study population consisted of 202 pregnant inmates who delivered within a 2-year period and a randomly sampled comparison cohort of 804 women from the general Texas population, matched on race and educational level. Both quantitative and qualitative data were used to elucidate the inmates' risk-factor profile, delivery/birth outcomes and patterns of care during pregnancy. Continuity-of-care issues for this population were also explored. Epidemiologic data were derived from multiple record systems to establish the comparison between the two cohorts. A significantly greater proportion of the inmates had prior lifestyle risk factors (smoking, alcohol and illicit drug abuse), poorer health status and a worse medical history. However, most of these existing risk factors showed little manifestation in the current pregnancy. On the basis of maternal labor/delivery outcomes and a number of neonatal indicators, the study found some evidence of better pregnancy outcomes in the inmate cohort than in the comparison group; possible explanations of this paradox are discussed. Seventeen percent of inmates gave birth to infants with suspected congenital syphilis. The placement patterns for the infants' care immediately after birth were elucidated. In addition to the quantitative data, an ethnographic approach was used to collect qualitative data from a subset of the inmate cohort (n = 20) and 12 care providers. The qualitative data were analyzed for content and themes, giving rise to a detailed description of the inmates' pregnancy experience. Eleven themes emerged from the thematic analysis, providing the context for interpreting the epidemiologic data. Meaningful findings are presented in a three-dimensional matrix to shed light on the apparent relationships between outcome indicators and their potential determinants. The suspected "linkages" between outcomes and their determinants can be used to generate hypotheses for future studies.
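The abstract does not detail the matching procedure, but a comparison cohort matched on race and educational level could be drawn along the following lines. This is a minimal sketch with hypothetical data frames and column names; the roughly 4:1 ratio mirrors the 202 versus 804 cohort sizes.

```python
# Sketch: drawing an approximately 4:1 comparison cohort matched on race and
# education. All data are synthetic stand-ins; controls sampled independently
# per case may repeat across strata in this simple version.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def match_cohort(cases, pool, k=4, keys=("race", "education")):
    """For each case, sample k comparison subjects sharing the same key values.
    Assumes each stratum of the pool is large enough to sample from."""
    matched = []
    for _, case in cases.iterrows():
        candidates = pool
        for key in keys:
            candidates = candidates[candidates[key] == case[key]]
        matched.append(candidates.sample(n=k, random_state=0))
    return pd.concat(matched, ignore_index=True)

races, edus = ["A", "B", "C"], ["<HS", "HS", ">HS"]
inmates = pd.DataFrame({"race": rng.choice(races, 202),
                        "education": rng.choice(edus, 202)})
pool = pd.DataFrame({"race": rng.choice(races, 50_000),
                     "education": rng.choice(edus, 50_000)})
comparison = match_cohort(inmates, pool)  # 202 x 4 = 808 controls (~804 in the study)
```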
Abstract:
OBJECTIVES This study sought to determine whether high intestinal cholesterol absorption represents a cardiovascular risk factor and to link ABCG8 and ABO variants to cardiovascular disease (CVD). BACKGROUND Plant sterol-enriched functional foods are widely used for cholesterol lowering. Their regular intake yields a 2-fold increase in circulating plant sterol levels, which equally represent markers of cholesterol absorption. Variants in ABCG8 and ABO have been associated with circulating plant sterol levels and CVD, suggesting atherogenic effects of plant sterols or of cholesterol uptake. METHODS The cholestanol-to-cholesterol ratio (CR) was used as an estimate of cholesterol absorption because it is independent of plant sterols. First, we investigated the associations of 6 single nucleotide polymorphisms in ABCG8 and ABO with CR in the LURIC (LUdwigshafen RIsk and Cardiovascular health study) and the YFS (Young Finns Study) cohorts. Second, we conducted a systematic review and meta-analysis to investigate whether CR might be related to CVD. RESULTS In LURIC, the minor alleles of rs4245791 and rs4299376 and the major alleles of rs41360247, rs6576629, and rs4953023 of the ABCG8 gene and the minor allele of rs657152 of the ABO gene were significantly associated with higher CR. Consistent results were obtained for rs4245791, rs4299376, rs6576629, and rs4953023 in YFS. The meta-analysis, including 6 studies and 4,362 individuals, found that CR was significantly increased in individuals with CVD. CONCLUSIONS High cholesterol absorption is associated with risk alleles in ABCG8 and ABO and with CVD. Harm caused by elevated cholesterol absorption rather than by plant sterols may therefore mediate the relationships of ABCG8 and ABO variants with CVD.
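The meta-analytic step can be illustrated with standard inverse-variance fixed-effect pooling of per-study differences in CR between CVD cases and controls. A minimal sketch; the six mean differences and standard errors below are placeholders, not data from the included studies.

```python
# Sketch: fixed-effect inverse-variance pooling of per-study CR differences
# (cases minus controls). Numbers are placeholders.
import numpy as np

mean_diff = np.array([0.12, 0.08, 0.15, 0.05, 0.10, 0.09])  # per-study difference
se = np.array([0.05, 0.04, 0.06, 0.03, 0.05, 0.04])         # per-study standard error

weights = 1.0 / se**2                                # inverse-variance weights
pooled = np.sum(weights * mean_diff) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
z = pooled / pooled_se                               # test of pooled difference = 0
print(pooled, ci, z)
```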
Abstract:
BACKGROUND/OBJECTIVES High intake of added sweeteners is considered to play a causal role in the pathogenesis of cardiometabolic disorders. High-fructose intake in particular is regarded as potentially harmful to cardiometabolic health. It may cause not only weight gain but also low-grade inflammation, which represents an independent risk factor for developing type 2 diabetes and cardiovascular disease. Specifically, fructose has been suggested to induce plasminogen activator inhibitor-1 (PAI-1) expression in the liver and to increase circulating inflammatory cytokines. We therefore aimed to investigate whether a high-fructose diet affects PAI-1, monocyte chemoattractant protein-1 (MCP-1), E-selectin and C-reactive protein (CRP) concentrations in healthy humans. SUBJECTS/METHODS We studied 20 participants (12 males and 8 females) of the TUebingen FRuctose Or Glucose study, an exploratory, parallel, prospective, randomized, single-blinded, outpatient, hypercaloric intervention study. The participants had a mean age of 30.9 ± 2.1 years and a mean body mass index of 26.0 ± 0.5 kg/m(2), and they received 150 g of either fructose or glucose per day for 4 weeks. RESULTS There were no significant changes in PAI-1, MCP-1, E-selectin or CRP after the fructose (n=10) or glucose (n=10) intervention, and no treatment effects (all P>0.2). Moreover, we did not observe longitudinal associations of the inflammatory parameters with triglycerides, liver fat, visceral fat or body weight in the fructose group. CONCLUSIONS In this secondary analysis of a small feeding trial, temporary high-fructose intake did not appear to cause inflammation in apparently healthy people.
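A hedged sketch of the kind of comparison reported above: within-group change from baseline and a between-group treatment effect for one marker. The data are synthetic and the test choice (Welch's t-test) is an assumption; the abstract does not specify the analysis model.

```python
# Sketch: testing for a treatment effect on one inflammatory marker (e.g. PAI-1)
# in a two-arm parallel design, n = 10 per arm as in the abstract. Synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pai1_change_fructose = rng.normal(0.0, 5.0, 10)  # week-4 minus baseline
pai1_change_glucose = rng.normal(0.0, 5.0, 10)   # week-4 minus baseline

# Within-group change versus zero, then the between-group treatment effect.
print(stats.ttest_1samp(pai1_change_fructose, 0.0))
print(stats.ttest_ind(pai1_change_fructose, pai1_change_glucose, equal_var=False))
```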
Abstract:
AIM To investigate risk factors for the loss of multi-rooted teeth (MRT) in subjects treated for periodontitis and enrolled in supportive periodontal therapy (SPT). MATERIAL AND METHODS A total of 172 subjects were examined before (T0) and after active periodontal therapy (APT) (T1) and following a mean of 11.5 ± 5.2 (SD) years of SPT (T2). The association of risk factors with loss of MRT was analysed with multilevel logistic regression, with the tooth as the unit of analysis. RESULTS Furcation involvement (FI) = 1 before APT was not a risk factor for tooth loss compared with FI = 0 (p = 0.37). Between T0 and T2, MRT with FI = 2 (OR: 2.92, 95% CI: 1.68, 5.06, p = 0.0001) and FI = 3 (OR: 6.85, 95% CI: 3.40, 13.83, p < 0.0001) were at significantly higher risk of being lost compared with those with FI = 0. During SPT, smokers lost significantly more MRT than non-smokers (OR: 2.37, 95% CI: 1.05, 5.35, p = 0.04). Non-smoking, compliant subjects with FI = 0/1 at T1 lost significantly fewer MRT during SPT than non-compliant smokers with FI = 2 (OR: 10.11, 95% CI: 2.91, 35.11, p < 0.0001) and FI = 3 (OR: 17.18, 95% CI: 4.98, 59.28, p < 0.0001), respectively. CONCLUSIONS FI = 1 was not a risk factor for tooth loss compared with FI = 0. FI = 2/3, smoking and lack of compliance with regular SPT represented risk factors for the loss of MRT in subjects treated for periodontitis.
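Because the tooth is the unit of analysis while risk factors such as smoking operate at the subject level, the regression must account for clustering of teeth within subjects. The sketch below uses GEE with an exchangeable working correlation as a simple stand-in for the paper's multilevel logistic regression; all data and coefficients are synthetic.

```python
# Sketch: clustered logistic regression for tooth loss, teeth nested within
# subjects. GEE stands in for the multilevel model; data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_subjects, teeth_per_subject = 172, 8
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subjects), teeth_per_subject),
    "fi": rng.integers(0, 4, n_subjects * teeth_per_subject),   # furcation 0-3
    "smoker": np.repeat(rng.integers(0, 2, n_subjects), teeth_per_subject),
})
p = 1 / (1 + np.exp(-(-3.0 + 0.6 * df["fi"] + 0.8 * df["smoker"])))
df["lost"] = rng.binomial(1, p)

model = smf.gee("lost ~ C(fi) + smoker", groups="subject", data=df,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable()).fit()
print(np.exp(model.params))  # odds ratios per FI category and for smoking
```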
Abstract:
Objective: Attention deficit/hyperactivity disorder (ADHD) shows an increased prevalence among arrested offenders compared to the normal population, and ADHD and delinquency appear to share some neurophysiological abnormalities. In recent studies, both a subgroup of subjects with ADHD and delinquents displayed excessive EEG activity in the beta band compared to controls, a pattern that has been associated with antisocial behavior and aggression in children with ADHD. The goal of the present study was to investigate whether delinquent behavior in ADHD is related to excessive beta activity. Methods: We compared the resting-state EEGs (eyes closed and eyes open) of 13 non-delinquent and 13 delinquent subjects with ADHD and 13 controls with respect to power spectra and topography of EEG activity. Results: Offenders with ADHD showed more beta power, mainly over frontal, central and parietal brain regions, than non-delinquent subjects with ADHD. Conclusions: Excessive beta power may represent a risk factor for delinquent behavior in adults with ADHD. Significance: Awareness of such risk factors may be helpful in assessing the risk of delinquent behavior in a psychiatric context and may provide a neurobiological background for therapeutic interventions.
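As an illustration of how beta-band power might be quantified from a resting-state recording, the sketch below estimates relative beta power (13-30 Hz) with Welch's method. The sampling rate, signal and band definitions are assumptions; the study's actual spectral pipeline is not described in the abstract.

```python
# Sketch: relative beta-band power from one resting-state EEG channel.
# Signal and sampling rate are synthetic stand-ins.
import numpy as np
from scipy.signal import welch

fs = 250                          # sampling rate in Hz (assumed)
rng = np.random.default_rng(3)
eeg = rng.normal(0, 1, 60 * fs)   # one minute of fake single-channel EEG

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
beta = (freqs >= 13) & (freqs < 30)
total = (freqs >= 1) & (freqs < 40)
# With a uniform frequency grid, band sums are proportional to band integrals.
relative_beta_power = psd[beta].sum() / psd[total].sum()
print(relative_beta_power)
```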
Primary prophylaxis for venous thromboembolism in ambulatory cancer patients receiving chemotherapy.
Abstract:
BACKGROUND Venous thromboembolism (VTE) often complicates the clinical course of cancer. The risk is further increased by chemotherapy, but the safety and efficacy of primary thromboprophylaxis in cancer patients treated with chemotherapy are uncertain. This is an update of a review first published in February 2012. OBJECTIVES To assess the efficacy and safety of primary thromboprophylaxis for VTE in ambulatory cancer patients receiving chemotherapy compared with placebo or no thromboprophylaxis. SEARCH METHODS For this update, the Cochrane Peripheral Vascular Diseases Group Trials Search Co-ordinator searched the Specialised Register (last searched May 2013), CENTRAL (2013, Issue 5), and clinical trials registries (up to June 2013). SELECTION CRITERIA Randomised controlled trials (RCTs) comparing any oral or parenteral anticoagulant or mechanical intervention with no intervention or placebo, or comparing two different anticoagulants. DATA COLLECTION AND ANALYSIS Data were extracted on methodological quality, patients, interventions and outcomes, including symptomatic VTE and major bleeding as the primary effectiveness and safety outcomes, respectively. MAIN RESULTS We identified 12 additional RCTs (6323 patients) in the updated search, so that this update considered 21 trials with a total of 9861 patients, all evaluating pharmacological interventions and performed mainly in patients with advanced cancer. Overall, the risk of bias varied from low to high. One large trial of 3212 patients found a 64% (risk ratio (RR) 0.36, 95% confidence interval (CI) 0.22 to 0.60) reduction of symptomatic VTE with the ultra-low molecular weight heparin (uLMWH) semuloparin relative to placebo, with no apparent difference in major bleeding (RR 1.05, 95% CI 0.55 to 2.00). LMWH, when compared with inactive control, significantly reduced the incidence of symptomatic VTE (RR 0.53, 95% CI 0.38 to 0.75; no heterogeneity, Tau² = 0%) with similar rates of major bleeding events (RR 1.30, 95% CI 0.75 to 2.23). In patients with multiple myeloma, LMWH was associated with a significant reduction in symptomatic VTE when compared with the vitamin K antagonist warfarin (RR 0.33, 95% CI 0.14 to 0.83), while the difference between LMWH and aspirin was not statistically significant (RR 0.51, 95% CI 0.22 to 1.17). No major bleeding was observed in patients treated with LMWH or warfarin, and major bleeding occurred in less than 1% of those treated with aspirin. Only one study evaluated unfractionated heparin against inactive control; it found an incidence of major bleeding of 1% in both study groups but did not report on VTE. When compared with placebo, warfarin was associated with a reduction of symptomatic VTE that was not statistically significant (RR 0.15, 95% CI 0.02 to 1.20). Antithrombin, evaluated in one study involving paediatric patients, had no significant effect on either VTE or major bleeding when compared with inactive control. The new oral factor Xa inhibitor apixaban was evaluated in a phase II dose-finding study that suggested a promisingly low rate of major bleeding (2.1% versus 3.3%) and symptomatic VTE (1.1% versus 10%) in comparison with placebo. AUTHORS' CONCLUSIONS In this update, we confirmed that primary thromboprophylaxis with LMWH significantly reduced the incidence of symptomatic VTE in ambulatory cancer patients treated with chemotherapy. In addition, the uLMWH semuloparin significantly reduced the incidence of symptomatic VTE. However, the broad confidence intervals around the estimates for major bleeding suggest caution in the use of anticoagulation and mandate additional studies to determine the risk-to-benefit ratio of anticoagulants in this setting. Despite the encouraging results of this review, routine prophylaxis in ambulatory cancer patients cannot be recommended before safety issues are adequately addressed.
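The risk ratios quoted above are each derived from two-arm event counts; the following sketch shows the standard computation of an RR with a 95% CI on the log scale. The counts are placeholders, not data from any included trial.

```python
# Sketch: risk ratio with 95% CI from two-arm event counts (placeholder numbers).
import numpy as np

events_tx, n_tx = 20, 1600    # symptomatic VTE on prophylaxis (hypothetical)
events_ctl, n_ctl = 55, 1600  # symptomatic VTE on placebo (hypothetical)

rr = (events_tx / n_tx) / (events_ctl / n_ctl)
# Standard error of log(RR) for independent binomial arms.
se_log_rr = np.sqrt(1/events_tx - 1/n_tx + 1/events_ctl - 1/n_ctl)
ci = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log_rr)
print(rr, ci)
```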
Abstract:
White matter connects different brain areas and electrically insulates neurons' axons with myelin sheaths to enable rapid signal transmission. Because of its modulatory role in signal conduction, white matter plays an essential part in learning, cognition and psychiatric disorders (Fields, 2008a). The non-invasive investigation of white matter anatomy and function in vivo therefore provides a unique opportunity to explore the most complex organ of our body. The present thesis applied a multimodal neuroimaging approach to investigate different white matter properties in psychiatric and healthy populations: on the one hand, white matter microstructural properties were investigated in a psychiatric population; on the other hand, white matter metabolic properties were assessed in healthy adults, providing basic information about the brain's wiring. Three research papers are presented. The first paper assessed the microstructural properties of white matter in relation to a frequent epidemiologic finding in schizophrenia: reduced white matter integrity was observed in patients born in summer and autumn compared with patients born in winter and spring. Despite the large genetic basis of schizophrenia, accumulating evidence indicates that environmental exposures may be implicated in its development (A. S. Brown, 2011). Notably, epidemiologic studies have shown a 5-8% excess of births during winter and spring for patients with schizophrenia in the Northern Hemisphere at higher latitudes (Torrey, Miller, Rawlings, & Yolken, 1997). Although the underlying mechanisms are unclear, this seasonal birth effect may indicate fluctuating environmental risk factors for schizophrenia. Exposure to harmful factors during foetal development may result in the activation of pathologic neural circuits during adolescence or young adulthood, increasing the risk of schizophrenia (Fatemi & Folsom, 2009). White matter development starts during the foetal period and continues until adulthood, but its major development is accomplished by the age of two years (Brody, Kinney, Kloman, & Gilles, 1987; Huang et al., 2009). This indicates a vulnerability period of white matter that may coincide with the fluctuating environmental risk factors for schizophrenia. Since microstructural alterations of white matter are frequently observed in schizophrenia, this study provided evidence for the neurodevelopmental hypothesis of schizophrenia. The second research paper showed a positive correlation between white matter microstructure and white matter perfusion with blood across healthy adults. This finding was in line with clinical studies indicating a tight coupling between cerebral perfusion and white matter health across subjects (Amann et al., 2012; Chen, Rosas, & Salat, 2013; Kitagawa et al., 2009). Although relatively little is known about the metabolic properties of white matter, different microstructural properties, such as axon diameter and myelination, might be coupled with its metabolic demand. Furthermore, the ability to detect a perfusion signal in white matter was in accordance with a recent study showing that technical improvements, such as pseudo-continuous arterial spin labeling, enable the reliable detection of white matter perfusion signal (van Osch et al., 2009).
The third paper involved a collaboration within the same department to assess the interrelation between functional connectivity networks and their underlying structural connectivity.
Abstract:
A prerequisite for preventive measures is to diagnose erosive tooth wear and to evaluate the different etiological factors in order to identify persons at risk. No diagnostic device is available for the assessment of erosive defects; they can only be detected clinically. Consequently, erosion that is not diagnosed at an early stage may render timely preventive measures difficult. In order to assess the risk factors, patients should record their dietary intake over a defined period of time, after which a dentist can determine the erosive potential of the diet. A table of common beverages and foodstuffs is presented for judging erosive potential. In particular, patients with more than 4 dietary acid intakes have a higher risk of erosion when other risk factors are present. Regurgitation of gastric acids is a further important risk factor for the development of erosion that has to be taken into account. Based on these analyses, an individually tailored preventive program may be suggested to the patient. It may comprise dietary advice, use of calcium-enriched beverages, optimization of prophylactic regimes, stimulation of salivary flow rate, use of buffering medicaments and particular motivation for non-destructive toothbrushing habits with an erosion-protective toothpaste as well as rinsing solutions. Since erosion and abrasion often occur simultaneously, all causative components must be taken into consideration when planning preventive strategies, but only those important and feasible for the individual should be communicated to the patient.
Abstract:
OBJECTIVE To determine the prognostic accuracy of cardiac biomarkers, alone and in combination with clinical scores, in elderly patients with non-high-risk pulmonary embolism (PE). DESIGN Ancillary analysis of a Swiss multicentre prospective cohort study. SUBJECTS A total of 230 patients aged ≥65 years with non-high-risk PE. MAIN OUTCOME MEASURES The study end-point was a composite of PE-related complications, defined as PE-related death, recurrent venous thromboembolism or major bleeding during a follow-up of 30 days. The prognostic accuracy of the Pulmonary Embolism Severity Index (PESI), the Geneva Prognostic Score (GPS), N-terminal pro-brain natriuretic peptide (NT-proBNP) and high-sensitivity cardiac troponin T (hs-cTnT) was determined using sensitivity, specificity, predictive values, receiver operating characteristic (ROC) curve analysis, logistic regression and reclassification statistics. RESULTS The overall complication rate during follow-up was 8.7%. hs-cTnT achieved the highest prognostic accuracy (area under the ROC curve: 0.75, 95% confidence interval (CI): 0.63-0.86, P < 0.001). At the predefined cut-off values, the negative predictive values of the biomarkers were above 95%. For levels above the cut-off, the risk of complications increased fivefold for hs-cTnT (odds ratio (OR): 5.22, 95% CI: 1.49-18.25) and 14-fold for NT-proBNP (OR: 14.21, 95% CI: 1.73-116.93) after adjustment for both clinical scores and renal function. Reclassification statistics indicated that adding hs-cTnT to the GPS or the PESI significantly improved the prognostic accuracy of both clinical scores. CONCLUSION In elderly patients with non-high-risk PE, NT-proBNP or hs-cTnT could be an adequate alternative to clinical scores for identifying low-risk individuals suitable for outpatient management.
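The two headline metrics, area under the ROC curve and negative predictive value at a predefined cut-off, can be illustrated as follows. This is a minimal sketch on synthetic data; the cut-off value and the distributional assumptions are hypothetical.

```python
# Sketch: ROC AUC and negative predictive value for a biomarker predicting
# 30-day complications. Data and the cut-off are synthetic stand-ins.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 230
complication = rng.binomial(1, 0.087, n)                 # ~8.7% event rate
hs_ctnt = rng.lognormal(2.0 + 0.8 * complication, 0.5)   # fake troponin values

print(roc_auc_score(complication, hs_ctnt))              # area under the ROC curve

cutoff = 14.0                                            # assumed cut-off
negative = hs_ctnt < cutoff
npv = (complication[negative] == 0).mean()               # true negatives / all negatives
print(npv)
```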
Abstract:
BACKGROUND Anemia has been shown to be a risk factor for coronary artery disease (CAD) and mortality, but the involvement of body iron stores in the development of CAD remains controversial. To date, studies examining hemoglobin and parameters of iron metabolism simultaneously have been lacking. METHODS AND RESULTS Hemoglobin and iron status were determined in 1480 patients with stable angiographic CAD and in 682 individuals in whom CAD had been ruled out by angiography. The multivariate adjusted odds ratios (OR) for CAD in the lowest quartiles of hemoglobin and iron were 1.62 (95% CI: 1.22-2.16) and 2.05 (95% CI: 1.51-2.78), respectively, compared with the highest gender-specific quartiles. The fully adjusted ORs for CAD in the lowest quartiles of transferrin saturation, ferritin (F) and soluble transferrin receptor (sTfR)/log10F index were 1.69 (95% CI: 1.25-2.27), 1.98 (95% CI: 1.48-2.65) and 1.64 (95% CI: 1.23-2.18), respectively, compared with the highest gender-specific quartiles. When additionally adjusting for iron and ferritin, the OR for CAD in the lowest quartile of hemoglobin was still 1.40 (95% CI: 1.04-1.90) compared with the highest gender-specific quartile. Thus, the associations of iron status and of low hemoglobin with CAD appeared to be independent of each other. sTfR was only marginally associated with angiographic CAD. CONCLUSIONS Both low hemoglobin and iron depletion are independently associated with angiographic CAD.
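The quartile-based odds ratios reported above follow a common recipe: form gender-specific quartiles of the marker, then fit an adjusted logistic model with the highest quartile as reference. A minimal sketch on synthetic data with a deliberately reduced adjustment set.

```python
# Sketch: OR for CAD in the lowest versus highest gender-specific hemoglobin
# quartile. Data are synthetic, so the fitted ORs will sit near 1; the Q1 term
# plays the role of the quoted OR 1.62.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 2162  # 1480 cases + 682 controls, as in the abstract
df = pd.DataFrame({
    "cad": rng.binomial(1, 1480 / 2162, n),
    "hb": rng.normal(14.0, 1.5, n),
    "female": rng.integers(0, 2, n),
    "age": rng.normal(63, 10, n),
})
# Gender-specific quartiles; Q4 (highest) serves as the reference category.
df["hb_q"] = df.groupby("female")["hb"].transform(
    lambda s: pd.qcut(s, 4, labels=["Q1", "Q2", "Q3", "Q4"]).astype(str))

model = smf.logit("cad ~ C(hb_q, Treatment('Q4')) + age + female", data=df).fit(disp=0)
print(np.exp(model.params))  # adjusted ORs per quartile versus Q4
```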
Abstract:
OBJECTIVES To summarize the current status of clinicopathological and molecular markers for the prediction of recurrence and/or progression in non-muscle-invasive urothelial bladder cancer and of survival in muscle-invasive disease, to address the reproducibility of pathology and molecular markers, and to provide directions toward the implementation of molecular markers in future clinical decision making. METHODS AND MATERIALS Immunohistochemistry, gene signatures and FGFR3-based molecular grading were used as molecular examples, focusing on prognostics and issues related to the robustness of pathological and molecular assays. RESULTS The role of molecular markers in predicting recurrence is limited, as clinical variables are currently more important. The prediction of progression and survival using molecular markers holds considerable promise. Despite a plethora of prognostic (clinical and molecular) marker studies, the reproducibility of pathology and molecular assays has been understudied, and lack of reproducibility is probably the main reason that individual prediction of disease outcome is currently not reliable. CONCLUSIONS Molecular markers are promising for predicting progression and survival, but not recurrence. However, none of these is used in daily clinical routine because of reproducibility issues. Future studies should focus on the reproducibility of marker assessment and the consistency of study results by incorporating scoring systems to reduce heterogeneity of reporting. This may ultimately lead to the incorporation of molecular markers into clinical practice.
Abstract:
Post-traumatic sleep-wake disturbances are common after acute traumatic brain injury. Increased sleep need per 24 h and excessive daytime sleepiness are among the most prevalent post-traumatic sleep disorders and impair the quality of life of trauma patients. Nevertheless, neither the relation between traumatic brain injury and sleep outcome nor the link between post-traumatic sleep problems and clinical measures in the acute phase after injury has so far been addressed in a controlled, prospective approach. We therefore performed a prospective controlled clinical study (i) to examine sleep-wake outcome after traumatic brain injury, and (ii) to screen for clinical and laboratory predictors of poor sleep-wake outcome after acute traumatic brain injury. Forty-two of 60 included patients with first-ever traumatic brain injury were available for follow-up examinations. Six months after trauma, the average sleep need per 24 h as assessed by actigraphy was markedly increased in patients compared with controls (8.3 ± 1.1 h versus 7.1 ± 0.8 h, P < 0.0001). Objective daytime sleepiness was found in 57% of trauma patients and 19% of healthy subjects, and the average sleep latency in patients was reduced to 8.7 ± 4.6 min (12.1 ± 4.7 min in controls, P = 0.0009). Patients, but not controls, markedly underestimated both their excessive sleep need and their excessive daytime sleepiness when assessed by subjective means alone, emphasizing the unreliability of self-assessment of increased sleep propensity in traumatic brain injury patients. On polysomnography, slow-wave sleep after traumatic brain injury was more consolidated. The most important risk factor for developing increased sleep need after traumatic brain injury was the presence of an intracranial haemorrhage. In conclusion, we provide controlled and objective evidence for a direct relation between sleep-wake disturbances and traumatic brain injury, and for clinically significant underestimation of post-traumatic sleep-wake disturbances by trauma patients.
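The headline group comparison can be checked from the reported summary statistics alone. A small sketch, assuming a Welch t-test and a control group of the same size as the patient group (the abstract does not state the control n).

```python
# Sketch: two-sample comparison of sleep need per 24 h from summary statistics.
# The control group size of 42 is an assumption, not stated in the abstract.
from scipy.stats import ttest_ind_from_stats

result = ttest_ind_from_stats(mean1=8.3, std1=1.1, nobs1=42,   # patients
                              mean2=7.1, std2=0.8, nobs2=42,   # controls (assumed n)
                              equal_var=False)
print(result)  # consistent with the reported P < 0.0001
```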
Abstract:
AIMS HIV infection may be associated with an increased recurrence rate of myocardial infarction. Our aim was to determine whether HIV infection is a risk factor for worse outcomes in patients with coronary artery disease. METHODS We compared data aggregated from two ongoing cohorts: (i) the Acute Myocardial Infarction in Switzerland (AMIS) registry, which includes patients with acute myocardial infarction (AMI), and (ii) the Swiss HIV Cohort Study (SHCS), a prospective registry of HIV-positive (HIV+) patients. We included all patients who survived an incident AMI occurring on or after 1st January 2005. Our primary outcome measure was all-cause mortality at one year; secondary outcomes included AMI recurrence and cardiovascular-related hospitalisations. Comparisons used Cox and logistic regression analyses, respectively. RESULTS There were 133 HIV+ (SHCS) and 5,328 HIV-negative (HIV-; AMIS) individuals with incident AMI. Patients in the SHCS and AMIS registries were predominantly male (72% and 85%, respectively), with median ages of 51 years (interquartile range [IQR] 46-57) and 64 years (IQR 55-74), respectively. Nearly all (90%) HIV+ individuals were on successful antiretroviral therapy. During the first year of follow-up, 5 (3.6%) HIV+ and 135 (2.5%) HIV- individuals died. At one year, after adjustment for age, sex, calendar year of AMI, smoking status, hypertension and diabetes, HIV+ status was associated with a higher risk of death (HR 4.42, 95% CI 1.73-11.27). There were no significant differences in recurrent AMI (4 [3.0%] HIV+ versus 146 [3.0%] HIV- individuals; OR 1.16, 95% CI 0.41-3.27) or in hospitalisation rates (OR 0.68, 95% CI 0.42-1.11). CONCLUSIONS HIV infection was associated with a significantly increased risk of all-cause mortality one year after incident AMI.
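The adjusted hazard ratio for death is the output of a Cox proportional hazards model like the one sketched below, here fitted with lifelines on synthetic data and a reduced covariate set; all column names are illustrative.

```python
# Sketch: one-year all-cause mortality after AMI by HIV status, Cox-adjusted.
# Synthetic data; covariate set is smaller than the study's.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 5461  # 133 HIV+ and 5,328 HIV- individuals, as in the abstract
df = pd.DataFrame({
    "hiv": np.r_[np.ones(133), np.zeros(5328)].astype(int),
    "age": rng.normal(63, 10, n),
    "female": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
})
# Synthetic time-to-death (days), administratively censored at one year.
hazard = 0.0001 * np.exp(1.2 * df["hiv"] + 0.03 * (df["age"] - 63))
t = rng.exponential(1 / hazard)
df["time"] = np.minimum(t, 365.0)
df["death"] = (t <= 365.0).astype(int)

cph = CoxPHFitter().fit(df, duration_col="time", event_col="death")
cph.print_summary()  # exp(coef) for hiv is the analogue of the quoted HR 4.42
```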
Abstract:
BACKGROUND Prostate cancer (PCa) is a very heterogeneous disease with respect to clinical outcome. This study explored differential DNA methylation in a priori selected genes to diagnose PCa and predict clinical failure (CF) in high-risk patients. METHODS A quantitative multiplex, methylation-specific PCR assay was developed to assess promoter methylation of the APC, CCND2, GSTP1, PTGS2 and RARB genes in formalin-fixed, paraffin-embedded tissue samples from 42 patients with benign prostatic hyperplasia and in radical prostatectomy specimens from patients with high-risk PCa, encompassing training and validation cohorts of 147 and 71 patients, respectively. Log-rank tests and univariate and multivariate Cox models were used to investigate the prognostic value of DNA methylation. RESULTS Hypermethylation of APC, CCND2, GSTP1, PTGS2 and RARB was highly cancer-specific. However, only GSTP1 methylation was significantly associated with CF in both independent high-risk PCa cohorts. Importantly, trichotomization into low, moderate and high GSTP1 methylation level subgroups was highly predictive of CF. In multivariate analysis, patients with either a low or a high GSTP1 methylation level, as compared with the moderate methylation group, were at higher risk of CF in both the training (hazard ratio [HR], 3.65; 95% CI, 1.65 to 8.07) and validation sets (HR, 4.27; 95% CI, 1.03 to 17.72), as well as in the combined cohort (HR, 2.74; 95% CI, 1.42 to 5.27). CONCLUSIONS Classification of primary high-risk tumors into three subtypes based on DNA methylation can be combined with clinico-pathological parameters for more informative risk stratification of these PCa patients.
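A hedged sketch of the trichotomization-plus-survival-analysis workflow: split GSTP1 methylation into tertiles, compare the subgroups with a log-rank test, and estimate hazard ratios with moderate methylation as the reference. Tertile cut-offs, follow-up times and event rates are synthetic placeholders.

```python
# Sketch: trichotomized GSTP1 methylation versus clinical failure (CF),
# log-rank test plus Cox model with "moderate" as reference. Synthetic data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import multivariate_logrank_test

rng = np.random.default_rng(7)
n = 147  # training cohort size from the abstract
df = pd.DataFrame({"gstp1": rng.uniform(0, 100, n)})
df["meth_group"] = pd.qcut(df["gstp1"], 3, labels=["low", "moderate", "high"])

# Synthetic follow-up: moderate methylation carries the lowest hazard,
# mimicking the U-shaped risk pattern described in the abstract.
base = np.where(df["meth_group"] == "moderate", 0.005, 0.015)
t = rng.exponential(1 / base)
df["time"] = np.minimum(t, 120.0)        # months, administratively censored
df["cf"] = (t <= 120.0).astype(int)

print(multivariate_logrank_test(df["time"], df["meth_group"], df["cf"]).p_value)

cox_df = pd.get_dummies(df[["time", "cf", "meth_group"]],
                        columns=["meth_group"], dtype=float)
cox_df = cox_df.drop(columns="meth_group_moderate")  # moderate as reference
CoxPHFitter().fit(cox_df, duration_col="time", event_col="cf").print_summary()
```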