Abstract:
BACKGROUND: Persistence is a key factor for long-term blood pressure control, which is of high prognostic importance for patients at increased cardiovascular risk. Here we present the results of a post-marketing survey including 4769 hypertensive patients treated with irbesartan in 886 general practices in Switzerland. The goal of this survey was to evaluate the tolerance and the blood pressure lowering effect of irbesartan, as well as the factors affecting persistence, in a large unselected population. METHODS: Prospective observational survey conducted in general practices in all regions of Switzerland. Previously untreated and uncontrolled pre-treated patients were started on a daily dose of 150 mg irbesartan and followed for up to 6 months. RESULTS: After an observation time slightly exceeding 4 months, the average reduction in systolic and diastolic blood pressure was 20 mmHg (95% confidence interval (CI) -19.6 to -20.7 mmHg) and 12 mmHg (95% CI -11.4 to -12.1 mmHg), respectively. At this time, 26% of patients had a blood pressure < 140/90 mmHg and 60% had a diastolic blood pressure < 90 mmHg. The drug was well tolerated, with an incidence of adverse events (dizziness, headaches, etc.) of 8.0%. In this survey, more than 80% of patients were still on irbesartan at 4 months. The most important factors predictive of persistence were the tolerability profile and the ability to achieve a blood pressure target ≤ 140/90 mmHg before visit 2. Patients who switched from a fixed combination treatment tended to discontinue irbesartan more often, whereas those who abandoned the previous treatment because of cough (a class side effect of ACE inhibitors) were more persistent with irbesartan. CONCLUSION: The results of this survey confirm that irbesartan is effective, well tolerated and well accepted by patients, as indicated by the good persistence. This post-marketing survey also emphasizes the importance of the tolerability profile and of achieving early blood pressure control as positive predictors of persistence.
Abstract:
BACKGROUND: Cigarette smoking is often initiated at a young age, as are other risky behaviors such as alcohol drinking and the use of cannabis and other illicit drugs. Some studies suggest that cigarette smoking may influence other risky behaviors, but little is known about the chronology of occurrence of these different habits. The aim of this study was to assess, among young men, which other risky behaviors were associated with cigarette smoking, as well as the joint prevalence and chronology of occurrence of these risky behaviors. METHODS: Cross-sectional analyses of a population-based census of 3526 young men attending recruitment for the Swiss army, aged 17 to 25 years (mean age: 19 years), who completed a self-reported questionnaire about their alcohol, cigarette, cannabis and other illicit drug habits. Current smoking was defined as either regular smoking (≥1 cigarette/day, every day) or occasional smoking; binge drinking as six or more drinks at least twice a month; at-risk drinking as 21 or more drinks per week; recent cannabis use as cannabis consumption at least once during the last month; and use of illicit drugs as consumption at least once of illicit drugs other than cannabis. Age at onset was defined as age at first use of cannabis or first cigarette smoked. RESULTS: In this population of young men, the prevalence of current smoking was 51.2% (36.5% regular smoking, 14.6% occasional smoking). Overall, 60.1% of participants reported having ever used cannabis, and 25.2% reported recent cannabis use. 53.8% of participants had a risky alcohol consumption, defined as either binge or at-risk drinking. Cigarette smoking was significantly associated with recent cannabis use (odds ratio (OR): 3.85, 95% confidence interval (CI): 3.10-4.77), binge drinking (OR: 3.48, 95% CI: 3.03-4.00), at-risk alcohol drinking (OR: 4.04, 95% CI: 3.12-5.24), and ever use of illicit drugs (OR: 4.34, 95% CI: 3.54-5.31). In a multivariate logistic regression, odds ratios for smoking were increased for cannabis users (OR: 3.10, 95% CI: 2.48-3.88), binge drinkers (OR: 1.77, 95% CI: 1.44-2.17), at-risk alcohol drinkers (OR: 2.26, 95% CI: 1.52-3.36) and ever users of illicit drugs (OR: 1.56, 95% CI: 1.20-2.03). The majority of young men (57.3%) initiated smoking before cannabis, with a mean age at onset of 13.4 years, whereas only 11.1% began to use cannabis before smoking cigarettes, with a slightly older mean age at onset (14.4 years). 31.6% started both cannabis and tobacco at the same age (15 years). About a third of participants (30.5%) had a cluster of risky behaviors (smoking, at-risk drinking, cannabis use), and 11.0% combined smoking, drinking, cannabis use and ever use of illicit drugs. More than half of the smokers (59.6%) combined cannabis use and at-risk alcohol drinking, whereas only 18.5% of non-smokers did. CONCLUSIONS: The majority of young smokers initiated their risky behaviors with smoking first, followed by other psychoactive drugs. Smokers have an increased risk of presenting other risky behaviors, such as cannabis use, at-risk alcohol consumption and illicit drug use, compared with non-smokers. Prevention among young male adults should focus on smoking and also integrate interventions on other risky behaviors.
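As an illustration of how crude odds ratios with Wald-type 95% confidence intervals, like those reported above, are typically obtained from a cross-tabulation, here is a minimal Python sketch; the counts are hypothetical and not taken from the study.

import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts: smokers/non-smokers cross-tabulated by recent cannabis use
print(odds_ratio_ci(a=620, b=1185, c=270, d=1451))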
Abstract:
There are suggestions of an inverse association between folate intake and serum folate levels and the risk of oral cavity and pharyngeal cancers (OPCs), but most studies are limited in sample size, with only a few reporting information on the source of dietary folate. Our study aims to investigate the association between folate intake and the risk of OPC within the International Head and Neck Cancer Epidemiology (INHANCE) Consortium. We analyzed pooled individual-level data from ten case-control studies participating in the INHANCE consortium, including 5,127 cases and 13,249 controls. Odds ratios (ORs) and the corresponding 95% confidence intervals (CIs) were estimated for the associations between total folate intake (natural, fortification and supplementation) and natural folate only, and OPC risk. We found an inverse association between total folate intake and overall OPC risk (the adjusted OR for the highest vs. the lowest quintile was 0.65, 95% CI: 0.43-0.99), with a stronger association for the oral cavity (OR = 0.57, 95% CI: 0.43-0.75). A similar, though somewhat weaker, inverse association was observed for folate intake from natural sources only in oral cavity cancer (OR = 0.64, 95% CI: 0.45-0.91). The highest OPC risk was observed in heavy alcohol drinkers with low folate intake compared with never/light drinkers with high folate intake (OR = 4.05, 95% CI: 3.43-4.79); the attributable proportion (AP) owing to interaction was 11.1% (95% CI: 1.4-20.8%). Lastly, we reported an OR of 2.73 (95% CI: 2.34-3.19) for ever tobacco users with low folate intake compared with never tobacco users with high folate intake (AP owing to interaction = 10.6%, 95% CI: 0.41-20.8%). This pooled analysis of a large set of case-control studies supports a protective effect of total folate intake on OPC risk.
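For readers unfamiliar with the attributable proportion (AP) owing to interaction cited above, a common way to compute the point estimate is from the joint and marginal odds ratios via the relative excess risk due to interaction (RERI). The sketch below is a generic illustration with hypothetical inputs; it is not the exact estimation or confidence-interval procedure used by the INHANCE analysis.

def ap_interaction(or_both, or_a_only, or_b_only):
    """Attributable proportion due to additive interaction:
    AP = RERI / OR_both, with RERI = OR_both - OR_a - OR_b + 1."""
    reri = or_both - or_a_only - or_b_only + 1.0
    return reri / or_both

# Hypothetical ORs for joint exposure (heavy drinking + low folate)
# and each exposure alone, relative to the doubly unexposed group.
print(ap_interaction(or_both=4.05, or_a_only=2.8, or_b_only=1.3))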
Abstract:
BACKGROUND: Pharmacists may improve the clinical management of major risk factors for cardiovascular disease (CVD) prevention. A systematic review was conducted to determine the impact of pharmacist care on the management of CVD risk factors among outpatients. METHODS: The MEDLINE, EMBASE, CINAHL, and Cochrane Central Register of Controlled Trials databases were searched for randomized controlled trials that involved pharmacist care interventions among outpatients with CVD risk factors. Two reviewers independently abstracted data and classified the pharmacists' interventions. Mean changes in blood pressure, total cholesterol, low-density lipoprotein cholesterol, and the proportion of smokers were estimated using random effects models. RESULTS: Thirty randomized controlled trials (11 765 patients) were identified. Pharmacist interventions, conducted either by a pharmacist alone or in collaboration with physicians or nurses, included patient educational interventions, patient-reminder systems, measurement of CVD risk factors, medication management and feedback to the physician, or educational interventions for health care professionals. Pharmacist care was associated with significant reductions in systolic/diastolic blood pressure (19 studies [10 479 patients]; -8.1 mm Hg [95% confidence interval {CI}, -10.2 to -5.9]/-3.8 mm Hg [95% CI, -5.3 to -2.3]), total cholesterol (9 studies [1121 patients]; -17.4 mg/dL [95% CI, -25.5 to -9.2]), and low-density lipoprotein cholesterol (7 studies [924 patients]; -13.4 mg/dL [95% CI, -23.0 to -3.8]), and with a reduction in the risk of smoking (2 studies [196 patients]; relative risk, 0.77 [95% CI, 0.67 to 0.89]). While most studies tended to favor pharmacist care over usual care, substantial heterogeneity was observed. CONCLUSION: Pharmacist care, delivered alone or in collaboration with physicians or nurses, improves the management of major CVD risk factors in outpatients.
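The pooled mean changes reported above come from random effects models. A minimal Python sketch of one common approach, DerSimonian-Laird pooling of per-study mean differences, is shown below, using invented study estimates and standard errors purely for illustration; the review's actual software and weighting choices are not specified in the abstract.

import math

def dersimonian_laird(effects, ses):
    """Random-effects pooled estimate with 95% CI from per-study
    mean differences and their standard errors (DerSimonian-Laird)."""
    w = [1 / se**2 for se in ses]                      # fixed-effect weights
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - fixed)**2 for wi, ei in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    w_re = [1 / (se**2 + tau2) for se in ses]
    pooled = sum(wi * ei for wi, ei in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Hypothetical systolic blood pressure changes (mm Hg) from three trials
print(dersimonian_laird([-9.0, -6.5, -8.8], [1.5, 2.0, 1.2]))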
Abstract:
BACKGROUND: The Foot and Ankle Ability Measure (FAAM) is a self-reported questionnaire for patients with foot and ankle disorders, available in English, German, and Persian. The aim of this study was to translate the FAAM from English into French (FAAM-F) and to assess the validity and reliability of this new version. METHODS: The FAAM-F Activities of Daily Living (ADL) and Sports subscales were completed by 105 French-speaking patients (average age 50.5 years) presenting with various chronic foot and ankle disorders. Convergent and divergent validity was assessed by Pearson's correlation coefficients between the FAAM-F subscales and the SF-36 scales: Physical Functioning (PF), Physical Component Summary (PCS), Mental Health (MH) and Mental Component Summary (MCS). Internal consistency was calculated by Cronbach's alpha (CA). To assess test-retest reliability, 22 patients filled out the questionnaire a second time, allowing estimation of minimal detectable changes (MDC) and intraclass correlation coefficients (ICC). RESULTS: Correlations for the FAAM-F ADL subscale were 0.85 with PF, 0.81 with PCS, 0.26 with MH, and 0.37 with MCS. Correlations for the FAAM-F Sports subscale were 0.72 with PF, 0.72 with PCS, 0.21 with MH, and 0.29 with MCS. CA estimates were 0.97 for both subscales. For the ADL and Sports subscales, respectively, ICCs were 0.97 and 0.94, errors for a single measure were 8 and 10 points at 95% confidence, and the MDC values at 95% confidence were 7 and 18 points. CONCLUSION: The FAAM-F is valid and reliable for the self-assessment of physical function in French-speaking patients with a wide range of chronic foot and ankle disorders.
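A conventional way to derive a 95% minimal detectable change from test-retest data is via the standard error of measurement and the ICC. The sketch below illustrates that standard formula with made-up numbers; it is not necessarily the exact computation behind the FAAM-F figures above.

import math

def mdc95(sd_baseline, icc):
    """MDC95 = SEM * 1.96 * sqrt(2), with SEM = SD * sqrt(1 - ICC)."""
    sem = sd_baseline * math.sqrt(1 - icc)
    return 1.96 * math.sqrt(2) * sem

# Hypothetical inputs: baseline score SD of 20 points, ICC of 0.94
print(round(mdc95(sd_baseline=20.0, icc=0.94), 1))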
Abstract:
BACKGROUND: HIV treatment recommendations are updated as clinical trials are published. Whether recommendations drive clinicians to change antiretroviral therapy in well-controlled patients is unexplored. METHODS: We selected patients with undetectable viral loads (VLs) on nonrecommended regimens containing double-boosted protease inhibitors (DBPIs), triple-nucleoside reverse transcriptase inhibitors (NRTIs), or didanosine (ddI) plus stavudine (d4T) at publication of the 2006 International AIDS Society recommendations. We compared their demographic and clinical characteristics with those of control patients with undetectable VL not on these regimens and examined clinical outcomes and reasons for treatment modification. RESULTS: At inclusion, 104 patients were in the DBPI group, 436 in the triple-NRTI group, and 19 in the ddI/d4T group. By 2010, 28 (29%), 204 (52%), and 1 (5%) patients were still on DBPIs, triple-NRTIs, and ddI plus d4T, respectively. 'Physician decision,' excluding toxicity/virological failure, drove 30% of treatment changes. Predictors of recommendation nonobservance included female sex [adjusted odds ratio (aOR) 2.69, 95% confidence interval (CI) 1 to 7.26; P = 0.01] for DBPIs, and undetectable VL (aOR 3.53, 95% CI 1.6 to 7.8; P = 0.002) and lack of cardiovascular events (aOR 2.93, 95% CI 1.23 to 6.97; P = 0.02) for triple-NRTIs. All patients on DBPIs with documented diabetes or a cardiovascular event changed treatment. Recommendation observance resulted in lower cholesterol values in the DBPI group (P = 0.06) and in more patients having an undetectable VL (P = 0.02) in the triple-NRTI group. CONCLUSION: The physician's decision is the main factor driving change from nonrecommended to recommended regimens, whereas virological suppression is associated with not switching. The positive clinical outcomes observed after switching underline the importance of observing recommendations, even in well-controlled patients.
Abstract:
Background: The objective of the present study was to compare three different sampling and questionnaire administration methods used in the international KIDSCREEN study in terms of participation, response rates, and external validity. Methods: Children and adolescents aged 8–18 years were surveyed in 13 European countries using either telephone sampling with mail administration, random sampling of school listings followed by classroom or mail administration, or multistage random sampling of communities and households with self-administration of the survey materials at home. Cooperation, completion, and response rates were compared across countries and survey methods. Data on non-respondents were collected in 8 countries. The population fraction (PF: respondents in each sex-age or educational-level category, divided by the population in the same category from Eurostat census data) and the population fraction ratio (PFR: the ratio of PFs), with their corresponding 95% confidence intervals, were used to analyze differences by country between the KIDSCREEN samples and a reference Eurostat population. Results: Response rates by country ranged from 18.9% to 91.2%. Response rates were highest in the school-based surveys (69.0%–91.2%). Sample proportions by age and gender were similar to the reference Eurostat population in most countries, although boys and adolescents were slightly underrepresented (PFR <1). Parents in lower educational categories were less likely to participate (PFR <1 in 5 countries). Parents in higher educational categories were overrepresented when the school and household sampling strategies were used (PFR = 1.78–2.97). Conclusion: School-based sampling achieved the highest overall response rates but also produced slightly more biased samples than the other methods. The results suggest that the samples were sufficiently representative to provide reference population values for the KIDSCREEN instrument.
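The population fraction ratio used above is a ratio of two sampling fractions; one common way to attach an approximate 95% confidence interval is on the log scale, as in the Python sketch below. The counts are hypothetical, the reference stratum is chosen arbitrarily, and the KIDSCREEN authors' exact variance formula is not stated in the abstract.

import math

def pfr_ci(resp_num, pop_num, resp_den, pop_den, z=1.96):
    """Population fraction ratio (PF_num / PF_den) with an approximate
    log-scale 95% CI; resp_* are respondent counts in each stratum and
    pop_* the corresponding census population sizes."""
    pf_num = resp_num / pop_num
    pf_den = resp_den / pop_den
    pfr = pf_num / pf_den
    se_log = math.sqrt(1 / resp_num + 1 / resp_den)   # ignores census-side error
    return pfr, pfr * math.exp(-z * se_log), pfr * math.exp(z * se_log)

# Hypothetical: boys sampled vs. census, relative to girls sampled vs. census
print(pfr_ci(resp_num=480, pop_num=600000, resp_den=520, pop_den=580000))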
Abstract:
Shrews of the genus Sorex are characterized by a Holarctic distribution, and relationships among extant taxa have never been fully resolved. Phylogenies have been proposed based on morphological, karyological, and biochemical comparisons, but these analyses often produced controversial and contradictory results. Phylogenetic analyses of partial mitochondrial cytochrome b gene sequences (1011 bp) were used to examine the relationships among 27 Sorex species. The molecular data suggest that Sorex comprises two major monophyletic lineages, one restricted mostly to the New World and one with a primarily Palearctic distribution. Furthermore, several sister-species relationships are revealed by the analysis. Based on the split between the Soricinae and Crocidurinae subfamilies, we used a 95% confidence interval for both the calibration of a molecular clock and the subsequent calculation of major diversification events within the genus Sorex. Our analysis does not support an unambiguous acceleration of the molecular clock in shrews, the estimated rate being similar to other estimates of mammalian mitochondrial clocks. In addition, the data presented here indicate that estimates from the fossil record greatly underestimate divergence dates among Sorex taxa.
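The clock calibration described above reduces to simple arithmetic: a fossil-dated split provides a substitution rate, which then converts pairwise sequence divergence at other nodes into ages. The Python sketch below shows that generic calculation with invented numbers; it does not reproduce the study's actual divergence values or its confidence-interval procedure.

def divergence_time(d_node, d_calib, t_calib):
    """Estimate a node age under a strict molecular clock.
    d_*: pairwise sequence divergence (substitutions/site) between lineages;
    t_calib: fossil-based age of the calibration split (million years)."""
    rate = d_calib / (2 * t_calib)       # per-lineage substitution rate
    return d_node / (2 * rate)           # node age in million years

# Hypothetical: 12% divergence at a 20 Myr calibration, node with 6% divergence
print(divergence_time(d_node=0.06, d_calib=0.12, t_calib=20.0))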
Abstract:
PURPOSE: To determine and compare the diagnostic performance of magnetic resonance imaging (MRI) and computed tomography (CT) for the diagnosis of tumor extent in advanced retinoblastoma, using histopathologic analysis as the reference standard. DESIGN: Systematic review and meta-analysis. PARTICIPANTS: Patients with advanced retinoblastoma who underwent MRI, CT, or both for the detection of tumor extent from published diagnostic accuracy studies. METHODS: Medline and Embase were searched for literature published through April 2013 assessing the diagnostic performance of MRI, CT, or both in detecting intraorbital and extraorbital tumor extension of retinoblastoma. Diagnostic accuracy data were extracted from included studies. Summary estimates were based on a random effects model. Intrastudy and interstudy heterogeneity were analyzed. MAIN OUTCOME MEASURES: Sensitivity and specificity of MRI and CT in detecting tumor extent. RESULTS: Data of the following tumor-extent parameters were extracted: anterior eye segment involvement and ciliary body, optic nerve, choroidal, and (extra)scleral invasion. Articles on MRI reported results of 591 eyes from 14 studies, and articles on CT yielded 257 eyes from 4 studies. The summary estimates with their 95% confidence intervals (CIs) of the diagnostic accuracy of conventional MRI at detecting postlaminar optic nerve, choroidal, and scleral invasion showed sensitivities of 59% (95% CI, 37%-78%), 74% (95% CI, 52%-88%), and 88% (95% CI, 20%-100%), respectively, and specificities of 94% (95% CI, 84%-98%), 72% (95% CI, 31%-94%), and 99% (95% CI, 86%-100%), respectively. Magnetic resonance imaging with a high (versus a low) image quality showed higher diagnostic accuracies for detection of prelaminar optic nerve and choroidal invasion, but these differences were not statistically significant. Studies reporting the diagnostic accuracy of CT did not provide enough data to perform any meta-analyses. CONCLUSIONS: Magnetic resonance imaging is an important diagnostic tool for the detection of local tumor extent in advanced retinoblastoma, although its diagnostic accuracy shows room for improvement, especially with regard to sensitivity. With only a few (mostly old) studies, there is very little evidence on the diagnostic accuracy of CT, and generally these studies show low diagnostic accuracy. Future studies assessing the role of MRI in clinical decision making in terms of prognostic value for advanced retinoblastoma are needed.
Abstract:
The objective of this study was to determine the effect of once-yearly zoledronic acid on the number of days of back pain and the number of days of disability (ie, limited activity and bed rest) owing to back pain or fracture in postmenopausal women with osteoporosis. This was a multicenter, randomized, double-blind, placebo-controlled trial in 240 clinical centers in 27 countries. Participants included 7736 postmenopausal women with osteoporosis. Patients were randomized to receive either a single 15-minute intravenous infusion of zoledronic acid (5 mg) or placebo at baseline, 12 months, and 24 months. The main outcome measures were the self-reported number of days with back pain and the number of days of limited activity and bed rest owing to back pain or a fracture, assessed every 3 months over a 3-year period. Our results show that although the incidence of back pain was high in both randomized groups, women randomized to zoledronic acid experienced, on average, 18 fewer days of back pain compared with placebo over the course of the trial (p = .0092). Women randomized to zoledronic acid also had 11 fewer days of limited activity owing to back pain compared with placebo (p = .0017). In Cox proportional-hazards models, women randomized to zoledronic acid were about 6% less likely to experience 7 or more days of back pain [relative risk (RR) = 0.94, 95% confidence interval (CI) 0.90-0.99] or of limited activity owing to back pain (RR = 0.94, 95% CI 0.87-1.00). Women randomized to zoledronic acid were significantly less likely to experience 7 or more bed-rest days owing to a fracture (RR = 0.58, 95% CI 0.47-0.72) and 7 or more limited-activity days owing to a fracture (RR = 0.67, 95% CI 0.58-0.78). Reductions in back pain with zoledronic acid were independent of incident fracture. Our conclusion is that in women with postmenopausal osteoporosis, a once-yearly infusion of zoledronic acid over a 3-year period significantly reduced the number of days that patients reported back pain, limited activity owing to back pain, and limited activity and bed rest owing to a fracture.
Abstract:
OBJECTIVES: The objectives were to identify the social and medical factors associated with frequent emergency department (ED) use and to determine whether frequent users were more likely to have a combination of these factors in a universal health insurance system. METHODS: This was a retrospective chart review case-control study comparing randomized samples of frequent users and nonfrequent users at the Lausanne University Hospital, Switzerland. The authors defined frequent users as patients with four or more ED visits within the previous 12 months. Adult patients who visited the ED between April 2008 and March 2009 (study period) were included, and patients leaving the ED without medical discharge were excluded. For each patient, the first ED electronic record within the study period was considered for data extraction. Along with basic demographics, variables of interest included social (employment or housing status) and medical (ED primary diagnosis) characteristics. Significant social and medical factors were used to construct a logistic regression model to determine factors associated with frequent ED use. In addition, combinations of social and medical factors were examined. RESULTS: A total of 359 of 1,591 frequent and 360 of 34,263 nonfrequent users were selected. Frequent users accounted for less than a twentieth of all ED patients (4.4%) but for 12.1% of all visits (5,813 of 48,117), with a maximum of 73 ED visits. There was no difference in terms of age or sex, but more frequent users had a nationality other than Swiss or European (n = 117 [32.6%] vs. n = 83 [23.1%], p = 0.003). Adjusted multivariate analysis showed that social and specific medical vulnerability factors most increased the risk of frequent ED use: being under guardianship (adjusted odds ratio [OR] = 15.8; 95% confidence interval [CI] = 1.7 to 147.3), living closer to the ED (adjusted OR = 4.6; 95% CI = 2.8 to 7.6), being uninsured (adjusted OR = 2.5; 95% CI = 1.1 to 5.8), being unemployed or dependent on government welfare (adjusted OR = 2.1; 95% CI = 1.3 to 3.4), the number of psychiatric hospitalizations (adjusted OR = 4.6; 95% CI = 1.5 to 14.1), and the use of five or more clinical departments over 12 months (adjusted OR = 4.5; 95% CI = 2.5 to 8.1). Having two of four social factors increased the odds of frequent ED use (adjusted OR = 5.4; 95% CI = 2.9 to 9.9), and similar results were found for medical factors (adjusted OR = 7.9; 95% CI = 4.6 to 13.4). A combination of social and medical factors was markedly associated with frequent ED use, as frequent users were 10 times more likely to have three of them (out of a total of eight factors; 95% CI = 5.1 to 19.6). CONCLUSIONS: Frequent users accounted for a moderate proportion of visits at the Lausanne ED. Social and medical vulnerability factors were associated with frequent ED use. In addition, frequent users were more likely than other patients to have both social and medical vulnerabilities. Case management strategies might address the vulnerability factors of frequent users to prevent inequities in health care and related costs.
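Adjusted odds ratios like those above are typically obtained by exponentiating logistic regression coefficients and their Wald confidence limits. The Python sketch below, built on a small fabricated dataset, illustrates that step; it is not the study's model or data.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Fabricated data: outcome = frequent ED user, two binary exposures
n = 500
guardianship = rng.binomial(1, 0.1, n)
uninsured = rng.binomial(1, 0.2, n)
logit = -2.0 + 1.5 * guardianship + 0.9 * uninsured
frequent_user = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([guardianship, uninsured]))
fit = sm.Logit(frequent_user, X).fit(disp=False)

odds_ratios = np.exp(fit.params)          # adjusted ORs
conf_int = np.exp(fit.conf_int())         # 95% CIs on the OR scale
print(odds_ratios)
print(conf_int)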
Abstract:
OBJECTIVE: To verify the influence of age on the prognosis of cervix carcinoma. STUDY DESIGN: Five hundred and sixty-eight patients treated for FIGO stage IB-IVA cervix carcinoma with radical irradiation at the Centre Hospitalier Universitaire Vaudois of Lausanne were subdivided according to the following age categories: ≤45, 46-60, 61-69 and >70 years. Taking the 46-60 years age group as the reference, the hazard ratios (HR) of death and corresponding 95% confidence intervals (95% CI) were estimated by means of a Cox multivariate analysis. RESULTS: The 5-year survival rates were, respectively, 57%, 67%, 60% and 45%. For the youngest women the risk of death was significantly increased (HR = 2.00, 95% CI [1.32-3.00]) and was even more accentuated in advanced stages. CONCLUSION: Age under 45 years is a poor prognostic factor in carcinoma of the cervix.
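Hazard ratios and their 95% confidence intervals from a Cox model are obtained by exponentiating the fitted log-hazard coefficient and its Wald limits. The short Python sketch below shows that transformation with an illustrative coefficient and standard error, not values taken from this study.

import math

def hazard_ratio_ci(beta, se, z=1.96):
    """HR = exp(beta); 95% CI = exp(beta +/- 1.96 * SE(beta))."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical Cox coefficient for the youngest vs. the reference age group
print(hazard_ratio_ci(beta=0.69, se=0.21))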
Abstract:
BACKGROUND: The study aimed to compare the cost-effectiveness of concomitant and adjuvant temozolomide (TMZ) for the treatment of newly diagnosed glioblastoma multiforme versus initial radiotherapy alone from a public health care perspective. METHODS: The economic evaluation was performed alongside a randomized, multicenter, phase 3 trial. The primary endpoint of the trial was overall survival. Costs included all direct medical costs. Economic data were collected prospectively for a subgroup of 219 patients (38%). Unit costs for drugs, procedures, laboratory tests and imaging, radiotherapy, and hospital costs per day were taken from the official national reimbursement lists for 2004. For the cost-effectiveness analysis, survival was expressed as the 2.5-year restricted mean survival estimate. The incremental cost-effectiveness ratio (ICER) was calculated, and confidence intervals for the ICER were obtained using the Fieller method and bootstrapping. RESULTS: The difference in 2.5-year restricted mean survival between the treatment arms was 0.25 life-years, and the ICER was €37,361 per life-year gained, with a 95% confidence interval (CI) ranging from €19,544 to €123,616. The area between the survival curves of the treatment arms suggests an increase in the overall survival gain with longer follow-up. An extrapolation of overall survival per treatment arm and imputation of costs for the extrapolated survival showed a substantial reduction in the ICER. CONCLUSIONS: The ICER of €37,361 per life-year gained is a conservative estimate. We concluded that, despite the high TMZ acquisition costs, the costs per life-year gained are comparable to those of accepted first-line chemotherapy treatments in patients with cancer.
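As a rough illustration of the bootstrap approach to an ICER confidence interval mentioned above, the Python sketch below resamples patient-level cost and effect pairs within each arm and takes percentiles of the resulting ICER distribution. The data are fabricated, and the trial's Fieller-method calculation is not reproduced here.

import numpy as np

rng = np.random.default_rng(42)

def bootstrap_icer_ci(cost_new, eff_new, cost_old, eff_old, reps=2000):
    """Percentile bootstrap 95% CI for an ICER =
    (mean cost difference) / (mean effect difference)."""
    icers = []
    for _ in range(reps):
        i_new = rng.integers(0, len(cost_new), len(cost_new))   # resample patients
        i_old = rng.integers(0, len(cost_old), len(cost_old))   # jointly per arm
        d_cost = cost_new[i_new].mean() - cost_old[i_old].mean()
        d_eff = eff_new[i_new].mean() - eff_old[i_old].mean()
        icers.append(d_cost / d_eff)
    return np.percentile(icers, [2.5, 97.5])

# Fabricated per-patient costs (euros) and survival (life-years)
cost_tmz = rng.normal(45000, 8000, 110)
eff_tmz = rng.normal(1.35, 0.6, 110)
cost_rt = rng.normal(33000, 7000, 109)
eff_rt = rng.normal(1.10, 0.6, 109)
print(bootstrap_icer_ci(cost_tmz, eff_tmz, cost_rt, eff_rt))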
Abstract:
Whether maximal surgical resection of glioblastoma improves patient survival has been controversial, as it is difficult to perform an unbiased assessment of extent of resection (EOR) independent of other patient-specific prognostic factors. Recently, glioblastoma has been sub-classified into 4 distinct molecular risk groups (RGs), which have been validated as prognostic biomarkers in the randomized clinical trial of temozolomide dosing in glioblastoma: the Radiation Therapy Oncology Group 0525 (RTOG-0525) trial. We sought to perform exploratory analyses examining gross total resection (GTR) versus sub-total resection (STR) within these RGs in RTOG-0525 patients. Across all randomized patients, n = 354 had STR and n = 450 had GTR as determined by neurosurgeon operative report. GTR was not significantly associated with survival across the overall study group. A total of 725 patients had sufficient tissue for determination of molecular RG. There were no significant differences in percentage of GTR between each of the 4 RGs (P = 0.64). In exploratory subgroup analyses, GTR was associated with improved survival only for patients with tumors from RG4. Hazard ratios (95% confidence intervals) were 0.52 (0.08-2.07) for RG1 (n = 28, 68% GTR), 1.74 (0.75-4.05) for RG2 (n = 39, 56% GTR), 1.09 (0.84-1.42) for RG3 (n = 284, 56% GTR), and 1.26 (1.01-1.56) for RG4 (n = 374, 55% GTR). In univariate analysis within RG4, GTR was associated with a median survival of 14.6 months vs 12.7 months for STR (P = 0.0352). In a Cox model adjusting for age, KPS, and neurologic function (NF), surgery remained an independent factor within RG4: GTR (P = 0.0331), age (P = 0.0014), KPS (P = 0.3289), and NF (P = 0.3804). There are important cautions in the interpretation of these data, including lack of MRI confirmation of EOR, and inclusion of a range of STR (from biopsy to near-total resection). However, these exploratory results raise the possibility that upfront characterization of tumor molecular profile may allow for personalized therapeutic strategies to improve outcomes for patients with glioblastoma.
Abstract:
BACKGROUND: The race- and sex-specific epidemiology of incident heart failure (HF) in a contemporary elderly cohort is not well described. METHODS: We studied 2934 participants without HF enrolled in the Health, Aging, and Body Composition Study (mean [SD] age, 73.6 [2.9] years; 47.9% men; 58.6% white; and 41.4% black) and assessed the incidence of HF, the population-attributable risk (PAR) of independent risk factors for HF, and the outcomes of incident HF. RESULTS: During a median follow-up of 7.1 years, 258 participants (8.8%) developed HF (13.6 cases per 1000 person-years; 95% confidence interval, 12.1-15.4). Men and black participants were more likely to develop HF. No significant sex-based differences were observed in risk factors. Coronary heart disease (PAR, 23.9% for white participants and 29.5% for black participants) and uncontrolled blood pressure (PAR, 21.3% for white participants and 30.1% for black participants) carried the highest PAR in both races. Among black participants, 6 of the 8 risk factors assessed (smoking, increased heart rate, coronary heart disease, left ventricular hypertrophy, uncontrolled blood pressure, and reduced glomerular filtration rate) had a more than 5% higher PAR compared with white participants, leading to a higher overall proportion of HF attributable to modifiable risk factors in black participants than in white participants (67.8% vs 48.9%). Participants who developed HF had higher annual mortality (18.0% vs 2.7%). No racial difference in survival after HF was noted; however, rehospitalization rates were higher among black participants (62.1 vs 30.3 hospitalizations per 100 person-years, P < .001). CONCLUSIONS: Incident HF is common in older persons, and a large proportion of HF risk is attributable to modifiable risk factors. Racial differences in risk factors for HF and in hospitalization rates after HF need to be considered in prevention and treatment efforts.
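Population-attributable risk figures like those above are commonly derived from the prevalence of a risk factor and its relative risk (Levin's formula). The sketch below illustrates that generic calculation with made-up inputs; it is not necessarily the exact (possibly adjusted) method used in the Health, Aging, and Body Composition analysis.

def population_attributable_risk(prevalence, rr):
    """Levin's formula: PAR = p(RR - 1) / (1 + p(RR - 1))."""
    excess = prevalence * (rr - 1.0)
    return excess / (1.0 + excess)

# Hypothetical: risk factor present in 35% of the cohort with RR = 1.8
print(round(100 * population_attributable_risk(0.35, 1.8), 1), "%")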