936 results for Pneumonia : Confidence
Abstract:
Objective The objective of the study was to investigate whether depression is a predictor of postdischarge smoking relapse among patients hospitalized for myocardial infarction (MI) or unstable angina (UA) in a smoke-free hospital. Methods Current smokers with MI or UA were interviewed while hospitalized; patients classified with major depression (MD) or no mood disorder were reinterviewed 6 months after discharge to ascertain smoking status. Potential predictors of relapse (depression; stress; anxiety; heart disease risk perception; coffee and alcohol consumption; sociodemographic, clinical, and smoking habit characteristics) were compared between those with MD (n = 268) and no mood disorder (n = 135). Results Relapsers (40.4%) were more frequently and more severely depressed, had higher anxiety and lower self-efficacy scale scores and a diagnosis of UA, had shorter hospitalizations, started smoking younger, had made fewer attempts to quit, less often had a partner, and were more frequently at the 'precontemplation' stage of change. Multivariate analysis showed the positive predictors of relapse to be MD [odds ratio (OR): 2.549; 95% confidence interval (CI): 1.519-4.275] (P<0.001); 'precontemplation' stage of change (OR: 7.798; 95% CI: 2.442-24.898) (P<0.001); previous coronary artery bypass graft surgery (OR: 4.062; 95% CI: 1.356-12.169) (P=0.012); and previous anxiolytic use (OR: 2.365; 95% CI: 1.095-5.107) (P=0.028). Negative predictors were diagnosis of MI (OR: 0.575; 95% CI: 0.361-0.916) (P=0.019); duration of hospitalization (OR: 0.935; 95% CI: 0.898-0.973) (P=0.001); smoking onset age (OR: 0.952; 95% CI: 0.910-0.994) (P=0.028); number of attempts to quit smoking (OR: 0.808; 95% CI: 0.678-0.964) (P=0.018); and 'action' stage of change (OR: 0.065; 95% CI: 0.008-0.532) (P=0.010).
Conclusion Depression, lack of motivation, shorter hospitalization, and severity of illness contributed to postdischarge resumption of smoking by patients with acute coronary syndrome who had undergone hospital-initiated smoking cessation.
Abstract:
Background-This study compared the 10-year follow-up of percutaneous coronary intervention (PCI), coronary artery bypass graft surgery (CABG), and medical treatment (MT) in patients with multivessel coronary artery disease, stable angina, and preserved ventricular function. Methods and Results-The primary end points were overall mortality, Q-wave myocardial infarction, or refractory angina that required revascularization. All data were analyzed according to the intention-to-treat principle. At a single institution, 611 patients were randomly assigned to CABG (n = 203), PCI (n = 205), or MT (n = 203). The 10-year survival rates were 74.9% with CABG, 75.1% with PCI, and 69% with MT (P = 0.089). The 10-year rates of myocardial infarction were 10.3% with CABG, 13.3% with PCI, and 20.7% with MT (P < 0.010). The 10-year rates of additional revascularizations were 7.4% with CABG, 41.9% with PCI, and 39.4% with MT (P < 0.001). Relative to the composite end point, Cox regression analysis showed a higher incidence of primary events in MT than in CABG (hazard ratio 2.35, 95% confidence interval 1.78 to 3.11) and in PCI than in CABG (hazard ratio 1.85, 95% confidence interval 1.39 to 2.47). Furthermore, 10-year rates of freedom from angina were 64% with CABG, 59% with PCI, and 43% with MT (P < 0.001). Conclusions-Compared with CABG, MT was associated with a significantly higher incidence of subsequent myocardial infarction, a higher rate of additional revascularization, a higher incidence of cardiac death, and consequently a 2.29-fold increased risk of combined events. PCI was associated with an increased need for further revascularization, a higher incidence of myocardial infarction, and a 1.46-fold increased risk of combined events compared with CABG. Additionally, CABG was better than MT at eliminating anginal symptoms.
Abstract:
Background-Novel therapies have recently become available for pulmonary arterial hypertension. We conducted a study to characterize mortality in a multicenter prospective cohort of patients diagnosed with idiopathic, familial, or anorexigen-associated pulmonary arterial hypertension in the modern management era. Methods and Results-Between October 2002 and October 2003, 354 consecutive adult patients with idiopathic, familial, or anorexigen-associated pulmonary arterial hypertension (56 incident and 298 prevalent cases) were prospectively enrolled. Patients were followed up for 3 years, and survival rates were analyzed. For incident cases, estimated survival at 1, 2, and 3 years was 85.7% (95% confidence interval [CI], 76.5 to 94.9), 69.6% (95% CI, 57.6 to 81.6), and 54.9% (95% CI, 41.8 to 68.0), respectively. In a combined analysis population (incident patients and prevalent patients diagnosed within 3 years before study entry; n = 190), 1-, 2-, and 3-year survival estimates were 82.9% (95% CI, 72.4 to 95.0), 67.1% (95% CI, 57.1 to 78.8), and 58.2% (95% CI, 49.0 to 69.3), respectively. Univariate survival analysis identified the following as significantly and positively associated with survival: female gender, New York Heart Association functional class I/II, greater 6-minute walk distance, lower right atrial pressure, and higher cardiac output. Multivariable analysis showed that female gender, greater 6-minute walk distance, and higher cardiac output were jointly significantly associated with improved survival. Conclusions-In the modern management era, idiopathic, familial, and anorexigen-associated pulmonary arterial hypertension remains a progressive, fatal disease. Mortality is most closely associated with male gender, right ventricular hemodynamic function, and exercise limitation. (Circulation. 2010; 122: 156-163.)
Abstract:
Background We validated a strategy for diagnosis of coronary artery disease (CAD) and prediction of cardiac events in high-risk renal transplant candidates (at least one of the following: age >= 50 years, diabetes, cardiovascular disease). Methods A diagnosis and risk assessment strategy was used in 228 renal transplant candidates to validate an algorithm. Patients underwent dipyridamole myocardial stress testing and coronary angiography and were followed up until death, renal transplantation, or cardiac events. Results The prevalence of CAD was 47%. Stress testing did not detect significant CAD in one-third of patients. The sensitivity, specificity, and positive and negative predictive values of the stress test for detecting CAD were 70, 74, 69, and 71%, respectively. CAD, defined by angiography, was associated with an increased probability of cardiac events [log-rank P=0.001; hazard ratio: 1.90, 95% confidence interval (CI): 1.29-2.92]. Diabetes (P=0.03; hazard ratio: 1.58, 95% CI: 1.06-2.45) and angiographically defined CAD (P=0.03; hazard ratio: 1.69, 95% CI: 1.08-2.78) were the independent predictors of events. Conclusion The results validate our observations in a smaller number of high-risk transplant candidates and indicate that stress testing is not appropriate for the diagnosis of CAD or prediction of cardiac events in this group of patients. Coronary angiography was correlated with events but, because less than 50% of patients had significant disease, it seems premature to recommend the test to all high-risk renal transplant candidates. The results suggest that angiography is necessary in many high-risk renal transplant candidates and that better noninvasive methods are still lacking to identify with precision the patients who will benefit from invasive procedures. Coron Artery Dis 21: 164-167 (C) 2010 Wolters Kluwer Health | Lippincott Williams & Wilkins.
Abstract:
Background. We assessed the results of a noninvasive therapeutic strategy on the long-term occurrence of cardiac events and death in a registry of patients with chronic kidney disease (CKD) and coronary artery disease (CAD). Methods. We analyzed 519 patients with CKD (56+/-9 years, 67% men, 67% whites) on maintenance hemodialysis with clinical or scintigraphic evidence of CAD by using coronary angiography. Results. In 230 (44%) patients, coronary angiography revealed significant CAD (lumen reduction >= 70%). Subjects with significant CAD were kept on medical treatment (MT; n=184) or referred for myocardial revascularization (percutaneous transluminal coronary angioplasty/coronary artery bypass graft intervention; n=30) according to American College of Cardiology/American Heart Association guidelines. In addition, 16 subjects refused intervention and were also followed up. Event-free survival for patients on MT at 12, 36, and 60 months was 86%, 71%, and 57%, whereas overall survival was 89%, 71%, and 50% over the same period, respectively. Patients who refused intervention had a significantly worse prognosis compared with those who actually underwent intervention (events: hazard ratio=4.50; 95% confidence interval=1.48-15.10; death: hazard ratio=3.39; 95% confidence interval=1.41-8.45). Conclusions. In patients with CKD and significant CAD, MT promotes adequate long-term event-free survival. However, failure to perform a coronary intervention when necessary results in a markedly increased risk of events and death.
Abstract:
Background-Prasugrel is a novel thienopyridine that reduces new or recurrent myocardial infarctions (MIs) compared with clopidogrel in patients with acute coronary syndrome undergoing percutaneous coronary intervention. This effect must be balanced against an increased bleeding risk. We aimed to characterize the effect of prasugrel with respect to the type, size, and timing of MI using the universal classification of MI. Methods and Results-We studied 13 608 patients with acute coronary syndrome undergoing percutaneous coronary intervention randomized to prasugrel or clopidogrel and treated for 6 to 15 months in the Trial to Assess Improvement in Therapeutic Outcomes by Optimizing Platelet Inhibition With Prasugrel-Thrombolysis in Myocardial Infarction (TRITON-TIMI 38). Each MI underwent supplemental classification as spontaneous, secondary, or sudden cardiac death (types 1, 2, and 3) or procedure related (types 4 and 5), and events occurring early and after 30 days were examined. Prasugrel significantly reduced the overall risk of MI (7.4% versus 9.7%; hazard ratio [HR], 0.76; 95% confidence interval [CI], 0.67 to 0.85; P < 0.0001). This benefit was present for procedure-related MIs (4.9% versus 6.4%; HR, 0.76; 95% CI, 0.66 to 0.88; P = 0.0002) and nonprocedural (type 1, 2, or 3) MIs (2.8% versus 3.7%; HR, 0.72; 95% CI, 0.59 to 0.88; P = 0.0013) and was consistent across MI size, including MIs with a biomarker peak >= 5 times the reference limit (HR, 0.74; 95% CI, 0.64 to 0.86; P = 0.0001). In landmark analyses starting at 30 days, patients treated with prasugrel had a lower risk of any MI (2.9% versus 3.7%; HR, 0.77; P = 0.014), including nonprocedural MI (2.3% versus 3.1%; HR, 0.74; 95% CI, 0.60 to 0.92; P = 0.0069).
Conclusion-Treatment with prasugrel compared with clopidogrel for up to 15 months in patients with acute coronary syndrome undergoing percutaneous coronary intervention significantly reduces the risk of MI, whether procedure related or spontaneous and whether small or large, including new MIs occurring during maintenance therapy. (Circulation. 2009; 119: 2758-2764.)
Abstract:
BACKGROUND The assessment of myocardial viability has been used to identify patients with coronary artery disease and left ventricular dysfunction in whom coronary-artery bypass grafting (CABG) will provide a survival benefit. However, the efficacy of this approach is uncertain. METHODS In a substudy of patients with coronary artery disease and left ventricular dysfunction who were enrolled in a randomized trial of medical therapy with or without CABG, we used single-photon-emission computed tomography (SPECT), dobutamine echocardiography, or both to assess myocardial viability on the basis of pre-specified thresholds. RESULTS Among the 1212 patients enrolled in the randomized trial, 601 underwent assessment of myocardial viability. Of these patients, we randomly assigned 298 to receive medical therapy plus CABG and 303 to receive medical therapy alone. A total of 178 of 487 patients with viable myocardium (37%) and 58 of 114 patients without viable myocardium (51%) died (hazard ratio for death among patients with viable myocardium, 0.64; 95% confidence interval [CI], 0.48 to 0.86; P = 0.003). However, after adjustment for other baseline variables, this association with mortality was not significant (P = 0.21). There was no significant interaction between viability status and treatment assignment with respect to mortality (P = 0.53). CONCLUSIONS The presence of viable myocardium was associated with a greater likelihood of survival in patients with coronary artery disease and left ventricular dysfunction, but this relationship was not significant after adjustment for other baseline variables. The assessment of myocardial viability did not identify patients with a differential survival benefit from CABG, as compared with medical therapy alone.
Abstract:
1. Chrysophtharta bimaculata is a native chrysomelid species that can cause chronic defoliation of plantation and regrowth Eucalyptus forests in Tasmania, Australia. Knowledge of the dispersion pattern of C. bimaculata was needed in order to assess the efficiency of an integrated pest management (IPM) programme currently used for its control. 2. Using data from yellow flight traps, local populations of C. bimaculata adults were monitored over a season at spatial scales relevant to commercial forestry: within a 50-ha operational management unit (a forestry 'coupe') and between coupes. In addition, oviposition was monitored over a season at a subset of the between-coupe sites. 3. Dispersion indices (Taylor's Power Law and Iwao's Mean Crowding regression method) demonstrated that C. bimaculata adults were spatially aggregated within and between coupes, although the number of egg-batches laid at the between-coupe scale was uniform. Spatial autocorrelation analysis showed that trap-catches at the within-coupe level were similar (positively autocorrelated) to a radius distance of approximately 110 m, and then dissimilar (negatively autocorrelated) at approximately 250 m. At the between-coupe scale, no repeatable spatial autocorrelation patterns were observed. 4. For any individual site, rapid changes in beetle density were observed to be associated with loosely aggregated flights of beetles into and out of that site. Peak adult catches (> the weekly mean plus standard deviation trap-catch) for a site occurred for a period of 2.0 +/- 0.22 weeks at a time (n = 37), with normally only one or two peaks per site per season. Peak oviposition events for a site occurred on average 1.4 +/- 0.11 times per season and lasted 1.5 +/- 0.12 weeks. 5. Analysis of an extensive data set (n = 417) demonstrated that adult abundance at a site was positively correlated with egg density, but negatively correlated with tree damage (caused by conspecifics) and the presence of conspecific larvae. 
There was no relationship between adult abundance and a visual estimate of the amount of young foliage on trees. 6. Adults of C. bimaculata are shown to occur in relatively small, mobile aggregations. This means that pest surveys must be both regular (less than 2 weeks apart) and intensive (with sampling points no more than 150 m apart) if beetle populations are to be monitored with confidence. Further refinement of the current IPM strategy must recognize the problems posed by this temporal and spatial patchiness, particularly with regard to the use of biological insecticides, such as Bacillus thuringiensis, for which only a very short operational window exists.
Abstract:
Background: Chagas' disease is the illness caused by the protozoan Trypanosoma cruzi and is still endemic in Latin America. Heart transplantation is a therapeutic option for patients with end-stage Chagas' cardiomyopathy. Nevertheless, reactivation may occur after transplantation, leading to higher morbidity and graft dysfunction. This study aimed to identify risk factors for Chagas' disease reactivation episodes. Methods: This investigation is a retrospective cohort study of all Chagas' disease heart transplant recipients from September 1985 through September 2004. Clinical, microbiologic and histopathologic data were reviewed. Statistical analysis was performed with SPSS (version 13) software. Results: Sixty-four (21.9%) patients with chronic Chagas' disease underwent heart transplantation during the study period. Seventeen patients (26.5%) had at least one episode of Chagas' disease reactivation, and univariate analysis identified number of rejection episodes (p = 0.013) and development of neoplasms (p = 0.040) as factors associated with Chagas' disease reactivation episodes. Multivariate analysis showed that number of rejection episodes (hazard ratio = 1.31; 95% confidence interval [CI]: 1.06 to 1.62; p = 0.011), neoplasms (hazard ratio = 5.07; 95% CI: 1.49 to 17.20; p = 0.009) and use of mycophenolate mofetil (hazard ratio = 3.14; 95% CI: 1.00 to 9.84; p = 0.049) are independent determinants of reactivation after transplantation. Age (p = 0.88), male gender (p = 0.15), presence of rejection (p = 0.17), cytomegalovirus infection (p = 0.79) and mortality after hospital discharge (p = 0.15) showed no statistically significant difference. Conclusions: Our data suggest that events resulting in greater immunosuppression contribute to Chagas' disease reactivation episodes after heart transplantation and should alert physicians to make an early diagnosis and perform pre-emptive therapy.
Although reactivation led to a high rate of morbidity, a low mortality risk was observed.
Abstract:
Background-The effect of prearrest left ventricular ejection fraction (LVEF) on outcome after cardiac arrest is unknown. Methods and Results-During a 26-month period, Utstein-style data were prospectively collected on 800 consecutive inpatient adult index cardiac arrests in an observational, single-center study at a tertiary cardiac care hospital. Prearrest echocardiograms were performed on 613 patients (77%) at 11 +/- 14 days before the cardiac arrest. Outcomes among patients with normal or nearly normal prearrest LVEF (>= 45%) were compared with those of patients with moderate or severe dysfunction (LVEF < 45%) by chi-square and logistic regression analyses. Survival to discharge was 19% in patients with normal or nearly normal LVEF compared with 8% in those with moderate or severe dysfunction (adjusted odds ratio, 4.8; 95% confidence interval, 2.3 to 9.9; P < 0.001) but did not differ with regard to sustained return of spontaneous circulation (59% versus 56%; P = 0.468) or 24-hour survival (39% versus 36%; P = 0.550). Postarrest echocardiograms were performed on 84 patients within 72 hours after the index cardiac arrest; the LVEF decreased 25% in those with normal or nearly normal prearrest LVEF (60 +/- 9% to 45 +/- 14%; P < 0.001) and decreased 26% in those with moderate or severe dysfunction (31 +/- 7% to 23 +/- 6%; P < 0.001). For all patients, prearrest beta-blocker treatment was associated with higher survival to discharge (33% versus 8%; adjusted odds ratio, 3.9; 95% confidence interval, 1.8 to 8.2; P < 0.001). Conclusions-Moderate and severe prearrest left ventricular systolic dysfunction was associated with substantially lower rates of survival to hospital discharge compared with normal or nearly normal function.
Abstract:
Purpose: Erlotinib, an oral tyrosine kinase inhibitor, is active against head-and-neck squamous cell carcinoma (HNSCC) and possibly has a synergistic interaction with chemotherapy and radiotherapy. We investigated the safety and efficacy of erlotinib added to cisplatin and radiotherapy in locally advanced HNSCC. Methods and Materials: In this Phase I/II trial, 100 mg/m(2) of cisplatin was administered on Days 8, 29, and 50, and radiotherapy at 70 Gy was started on Day 8. During Phase I, the erlotinib dose was escalated (50 mg, 100 mg, and 150 mg) in consecutive cohorts of 3 patients, starting on Day 1 and continuing during radiotherapy. Dose-limiting toxicity was defined as any Grade 4 event requiring radiotherapy interruptions. Phase II was initiated 8 weeks after the last Phase I enrollment. Results: The study accrued 9 patients in Phase I and 28 in Phase II; all were evaluable for efficacy and safety. No dose-limiting toxicity occurred in Phase I, and the recommended Phase II dose was 150 mg. The most frequent nonhematologic toxicities were nausea/vomiting, dysphagia, stomatitis, xerostomia, in-field dermatitis, acneiform rash, and diarrhea. Of the 31 patients receiving a 150-mg daily dose of erlotinib, 23 (74%; 95% confidence interval, 56.8%-86.3%) had a complete response, 3 were disease free after salvage surgery, 4 had inoperable residual disease, and 1 died of sepsis during treatment. With a median 37 months' follow-up, the 3-year progression-free and overall survival rates were 61% and 72%, respectively. Conclusions: This combination appears safe, has encouraging activity, and deserves further study in locally advanced HNSCC. (C) 2010 Elsevier Inc.
Abstract:
XPC participates in the initial recognition of DNA damage during the DNA nucleotide excision repair process in global genomic repair. Polymorphisms in the XPC gene have been analyzed in case-control studies to assess the cancer risk attributed to these variants, but results are conflicting. To clarify the impact of XPC polymorphisms on cancer risk, we performed a meta-analysis that included 33 published case-control studies. The polymorphisms analyzed were Lys939Gln and Ala499Val. The overall summary odds ratio (OR) for the association of the 939Gln/Gln genotype with risk of cancer was 1.01 (95% confidence interval (95% CI): 0.94-1.09), but there was a statistically significant association for lung cancer under the recessive genetic model (Lys/Lys + Lys/Gln vs Gln/Gln) (OR 1.30; 95% CI: 1.11-1.53), whereas for breast cancer a reduced but nonsignificant risk was observed for the same model (OR 0.87; 95% CI: 0.74-1.01). The results for Ala499Val showed a significant overall increase in cancer risk (OR 1.15; 95% CI: 1.02-1.31), and for bladder cancer in both the simple genetic model (Ala/Ala vs Val/Val) (OR 1.30; 95% CI: 1.04-1.61) and the recessive genetic model (Ala/Ala + Ala/Val vs Val/Val) (OR 1.32; 95% CI: 1.06-1.63). Our meta-analysis supports the view that polymorphisms in XPC may represent low-penetrance susceptibility gene variants for breast, bladder, head and neck, and lung cancer. XPC is a good candidate for large-scale epidemiological case-control studies that may lead to improvements in the management of highly prevalent cancers.
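As an aside on the arithmetic behind the odds ratios quoted in these abstracts: an OR and its 95% CI are typically computed from a 2x2 case-control table on the log scale (Woolf's method). The sketch below uses hypothetical counts, not data from any of the studies above:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    # Woolf (log) method: SE of ln(OR) is the square root
    # of the summed reciprocals of the four cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts for a recessive model (Gln/Gln vs others)
or_, lo, hi = odds_ratio_ci(120, 380, 90, 410)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A CI whose lower bound stays above 1.0, as in the lung-cancer result above, is what the abstract means by a statistically significant association.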
Abstract:
Fogo selvagem (FS) is mediated by pathogenic, predominantly IgG4, anti-desmoglein 1 (Dsg1) autoantibodies and is endemic in Limao Verde, Brazil. IgG and IgG subclass autoantibodies were tested in a sample of 214 FS patients and 261 healthy controls by Dsg1 ELISA. For model selection, the sample was randomly divided into training (50%), validation (25%), and test (25%) sets. Using the training and validation sets, IgG4 was chosen as the best predictor of FS, with index values above 6.43 classified as FS. On the test set, IgG4 had a sensitivity of 92% (95% confidence interval [CI]: 82-95%), a specificity of 97% (95% CI: 89-100%), and an area under the curve of 0.97 (95% CI: 0.94-1.00). The IgG4 positive predictive value (PPV) in Limao Verde (3% FS prevalence) was 49%. The sensitivity, specificity, and PPV of IgG anti-Dsg1 were 87, 91, and 23%, respectively. The IgG4-based classifier was validated by testing 11 FS patients before and after clinical disease and 60 Japanese pemphigus foliaceus patients. It classified 21 of 96 normal individuals from a Limao Verde cohort as having FS serology. On the basis of its PPV, half of the 21 individuals may currently have preclinical FS and could develop clinical disease in the future. Identifying individuals during preclinical FS will enhance our ability to identify the etiological agent(s) triggering FS.
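The 49% PPV quoted above follows directly from Bayes' rule applied to the reported sensitivity, specificity, and 3% prevalence; a minimal sketch of that calculation:

```python
def ppv(sensitivity, specificity, prevalence):
    # P(disease | positive test) by Bayes' rule:
    # true positives over all positives
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# IgG4 classifier figures from the abstract: 92% sensitivity,
# 97% specificity, 3% FS prevalence in Limao Verde
print(round(ppv(0.92, 0.97, 0.03), 2))  # prints 0.49, matching the reported PPV
```

This also illustrates why a highly specific test can still yield a modest PPV: at 3% prevalence, the 3% of healthy individuals who test falsely positive are roughly as numerous as the true positives.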
Abstract:
Objectives: To determine the effect of maternal smoking during pregnancy on transient evoked otoacoustic emissions levels in neonates. Methods: This was a cross-sectional study investigating neonates in the maternity ward of a university hospital in the city of Sao Paulo, Brazil. A total of 418 term neonates without prenatal or perinatal complications were evaluated. The neonates were divided into two groups: a study group, which comprised 98 neonates born to mothers who had smoked during pregnancy; and a control group, which comprised 320 neonates born to mothers who had not. In order to compare the two ears and the two groups in terms of the mean overall response and the mean transient evoked otoacoustic emissions in response to acoustic stimuli delivered at different frequencies, we used analysis of variance with repeated measures. Results: The mean overall response and the mean frequency-specific response levels were lower in the neonates in the study group (p < 0.001). The mean difference between the groups was 2.47 dB sound pressure level (95% confidence interval: 1.47-3.48). Conclusions: Maternal smoking during pregnancy had a negative effect on cochlear function, as determined by otoacoustic emissions testing. Therefore, pregnant women should be warned of this additional hazard of smoking. It is important that smoking control be viewed as a public health priority and that strategies for treating tobacco dependence be devised. (C) 2011 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Objectives: To estimate the prevalence of fibromyalgia (FM) and chronic widespread pain (CWP) in community-dwelling elderly individuals living in Sao Paulo, to assess the spectrum of problems related to these conditions using the Fibromyalgia Impact Questionnaire (FIQ), and to correlate the FIQ with the number of tender points and with pain threshold. Methods: Our sample consisted of 361 individuals (64% women, 36% men, mean age of 73.3 +/- 5.7 years). Individuals were classified into four groups: FM (according to American College of Rheumatology criteria), CWP, regional pain (RP) and no pain (NP). Pain characteristics, dolorimetry for 18 tender points, and the FIQ were assessed. Results: The prevalence of FM was 5.5% [95% confidence interval (CI) = 5.4-5.7], and the prevalence of CWP was 14.1% (95% CI: 10.5-17.7%). The frequency of RP was 52.6% and the prevalence of NP was 27.7%. FIQ scores were higher in people with FM (44.5), followed by CWP (31.4), RP (18.1) and NP (5.5) (p < 0.001). There was a positive correlation between the domains of the FIQ and the number of tender points (p < 0.05), and a negative correlation between FIQ score and pain threshold (p < 0.05). Conclusion: In our elderly subjects, the prevalence of FM was slightly higher than in previously reported studies, and the prevalence of CWP was around 14%. The spectrum of problems related to chronic pain was more severe in FM, followed by CWP, strongly suggesting that these conditions should be diagnosed and adequately treated in older individuals. (C) 2010 Elsevier Ireland Ltd. All rights reserved.