983 results for fixed-width confidence interval


Relevance: 100.00%

Abstract:

BACKGROUND: Denosumab is a fully human monoclonal antibody to the receptor activator of nuclear factor-kappaB ligand (RANKL) that blocks its binding to RANK, inhibiting the development and activity of osteoclasts, decreasing bone resorption, and increasing bone density. Given its unique actions, denosumab may be useful in the treatment of osteoporosis. METHODS: We enrolled 7868 women between the ages of 60 and 90 years who had a bone mineral density T score of less than -2.5 but not less than -4.0 at the lumbar spine or total hip. Subjects were randomly assigned to receive either 60 mg of denosumab or placebo subcutaneously every 6 months for 36 months. The primary end point was new vertebral fracture. Secondary end points included nonvertebral and hip fractures. RESULTS: As compared with placebo, denosumab reduced the risk of new radiographic vertebral fracture, with a cumulative incidence of 2.3% in the denosumab group, versus 7.2% in the placebo group (risk ratio, 0.32; 95% confidence interval [CI], 0.26 to 0.41; P<0.001)--a relative decrease of 68%. Denosumab reduced the risk of hip fracture, with a cumulative incidence of 0.7% in the denosumab group, versus 1.2% in the placebo group (hazard ratio, 0.60; 95% CI, 0.37 to 0.97; P=0.04)--a relative decrease of 40%. Denosumab also reduced the risk of nonvertebral fracture, with a cumulative incidence of 6.5% in the denosumab group, versus 8.0% in the placebo group (hazard ratio, 0.80; 95% CI, 0.67 to 0.95; P=0.01)--a relative decrease of 20%. There was no increase in the risk of cancer, infection, cardiovascular disease, delayed fracture healing, or hypocalcemia, and there were no cases of osteonecrosis of the jaw and no adverse reactions to the injection of denosumab. CONCLUSIONS: Denosumab given subcutaneously twice yearly for 36 months was associated with a reduction in the risk of vertebral, nonvertebral, and hip fractures in women with osteoporosis. (ClinicalTrials.gov number, NCT00089791.)
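
The headline arithmetic here can be reproduced directly: the relative decrease is 1 minus the risk ratio, and the CI is conventionally formed on the log scale. Below is a minimal Python sketch; the per-arm event counts are back-calculated from the reported incidences assuming roughly 3,934 women per arm, so they are approximations rather than the trial's actual counts.

```python
from math import log, exp, sqrt

def risk_ratio_ci(events_t, n_t, events_c, n_c, z=1.96):
    """Risk ratio with a Wald confidence interval computed on the log scale."""
    rr = (events_t / n_t) / (events_c / n_c)
    se_log_rr = sqrt(1/events_t - 1/n_t + 1/events_c - 1/n_c)
    return rr, exp(log(rr) - z * se_log_rr), exp(log(rr) + z * se_log_rr)

# Approximate counts: 2.3% and 7.2% of ~3,934 women per arm (an assumption)
rr, lo, hi = risk_ratio_ci(events_t=90, n_t=3934, events_c=283, n_c=3934)
print(f"RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f}); relative decrease {1-rr:.0%}")
# -> RR 0.32 (0.25 to 0.40); relative decrease 68%, close to the reported
#    0.32 (95% CI 0.26 to 0.41)
```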

Relevance: 100.00%

Abstract:

The way supervisors acknowledge the specific contributions and efforts of their employees has an impact on occupational health and wellbeing. Acknowledgment is a protective factor when it is sufficiently provided. We carried out a study of occupational health in police officers, with special emphasis on acknowledgment and reward. A questionnaire was sent to 1000 police officers and inspectors working for a cantonal administration in Switzerland. In total, 695 participants answered the questionnaire. We used the TST questionnaire (the French version of Langner's questionnaire on psychiatric symptoms) to identify cases characterized by potential mental health problems. Multiple-choice items (5 modalities ranging from "not at all" to "tremendously") were used to measure acknowledgment. The score for psychiatric symptoms was high (TST score ≥ 9) for 86 police officers and inspectors, whose health might be at risk. Compared with police officers having low or medium scores for psychiatric symptoms (TST score < 9), police officers with high TST scores were more likely to report a lack of support and attention from their supervisors (odds ratio [OR] 3.2, 95% confidence interval [CI] 2.0 to 5.1) and a lack of acknowledgment by the hierarchy (OR 3.0, 95% CI 1.9 to 4.8). They were also more likely to mention that judicial authorities have little consideration for police officers (OR 2.7, 95% CI 1.7 to 4.3) and that the public in general has a low appreciation of police officers (OR 1.8, 95% CI 1.2 to 2.9). Preserving mental health in occupations characterized by high emotional demand is challenging. Our results show that acknowledgment and mental health are associated. Further research should address a potential causal effect of acknowledgment on mental health in police officers and inspectors.
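
For readers unfamiliar with how such odds ratios arise: each OR compares exposure odds between the 86 high-TST officers and the 609 others, with a Wald CI on the log scale. A hedged sketch with invented cell counts (the paper reports only the summary estimates):

```python
from math import log, exp, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR for a 2x2 table: a/b = exposed/unexposed cases,
    c/d = exposed/unexposed controls, with a Wald CI on the log scale."""
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical split of who reported lacking supervisor support among the
# 86 high-TST officers (a+b) and the 609 others (c+d)
print(odds_ratio_ci(a=43, b=43, c=146, d=463))
# -> roughly OR 3.2 (95% CI 2.0 to 5.0) with these made-up counts
```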

Relevance: 100.00%

Abstract:

BACKGROUND AND PURPOSE: Carotid artery stenting (CAS) is associated with a higher risk of both hemodynamic depression and new ischemic brain lesions on diffusion-weighted imaging than carotid endarterectomy (CEA). We assessed whether the occurrence of hemodynamic depression is associated with these lesions in patients with symptomatic carotid stenosis treated by CAS or CEA in the randomized International Carotid Stenting Study (ICSS)-MRI substudy. METHODS: The number and total volume of new ischemic lesions on diffusion-weighted imaging 1 to 3 days after CAS or CEA were measured in the ICSS-MRI substudy. Hemodynamic depression was defined as periprocedural bradycardia, asystole, or hypotension requiring treatment. The number of new ischemic lesions was the primary outcome measure. We calculated risk ratios and 95% confidence intervals per treatment with Poisson regression comparing the number of lesions in patients with or without hemodynamic depression. RESULTS: A total of 229 patients were included (122 allocated CAS; 107 CEA). After CAS, patients with hemodynamic depression had a mean of 13 new diffusion-weighted imaging lesions, compared with a mean of 4 in those without hemodynamic depression (risk ratio, 3.36; 95% confidence interval, 1.73-6.50). The number of lesions after CEA was too small for reliable analysis. Lesion volumes did not differ between patients with or without hemodynamic depression. CONCLUSIONS: In patients treated by CAS, periprocedural hemodynamic depression is associated with an excess of new ischemic lesions on diffusion-weighted imaging. The findings support the hypothesis that hypoperfusion increases the susceptibility of the brain to embolism. CLINICAL TRIAL REGISTRATION URL: http://www.controlled-trials.com. Unique identifier: ISRCTN25337470.
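
A hedged sketch of the analysis described above: Poisson regression of lesion counts on a hemodynamic-depression indicator, where exp(coef) estimates the ratio of mean lesion counts (13 vs 4 here, i.e. about 3.3). The data are simulated stand-ins, not the substudy's records.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
hd = rng.integers(0, 2, size=122)                 # 1 = hemodynamic depression
lesions = rng.poisson(lam=np.where(hd == 1, 13, 4))  # toy lesion counts

X = sm.add_constant(hd.astype(float))
fit = sm.GLM(lesions, X, family=sm.families.Poisson()).fit()
rr = np.exp(fit.params[1])                        # ratio of mean counts
lo, hi = np.exp(fit.conf_int()[1])                # 95% CI on the rate ratio
print(f"rate ratio {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Lesion counts are typically overdispersed, so in practice robust sandwich errors, e.g. `fit(cov_type="HC0")` in statsmodels, would be a prudent refinement.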

Relevance: 100.00%

Abstract:

BACKGROUND: Allogeneic stem cell transplantation is usually considered the only curative treatment option for patients with advanced or transformed myelodysplastic syndromes in complete remission, but post-remission chemotherapy and autologous stem cell transplantation are potential alternatives, especially in patients over 45 years old. DESIGN AND METHODS: We evaluated, after intensive anti-leukemic remission-induction chemotherapy, the impact of the availability of an HLA-identical sibling donor on an intention-to-treat basis. Additionally, all patients without a sibling donor who were in complete remission after the first consolidation course were randomized to either autologous peripheral blood stem cell transplantation or a second consolidation course consisting of high-dose cytarabine. RESULTS: The 4-year survival of the 341 evaluable patients was 28%. After achieving complete remission, the 4-year survival rates of patients under 55 years old with or without a donor were 54% and 41%, respectively, with an adjusted hazard ratio of 0.81 (95% confidence interval [95% CI], 0.49-1.35) for survival and of 0.67 (95% CI, 0.42-1.06) for disease-free survival. In patients with intermediate/high-risk cytogenetic abnormalities, the hazard ratio in multivariate analysis was 0.58 (99% CI, 0.22-1.50) (P=0.14) for survival and 0.46 (99% CI, 0.22-1.50) for disease-free survival (P=0.03). In contrast, in patients with low-risk cytogenetic characteristics the hazard ratio for survival was 1.17 (99% CI, 0.40-3.42) and that for disease-free survival was 1.02 (99% CI, 0.40-2.56). The 4-year survival of the 65 patients randomized to autologous peripheral blood stem cell transplantation or a second consolidation course of high-dose cytarabine was 37% and 27%, respectively. The hazard ratio in multivariate analysis was 1.22 (95% CI, 0.65-2.27) for survival and 1.02 (95% CI, 0.56-1.85) for disease-free survival. CONCLUSIONS: Patients with a donor who are candidates for allogeneic stem cell transplantation in first complete remission may have better disease-free survival than those without a donor in myelodysplastic syndromes with intermediate/high-risk cytogenetics. Autologous peripheral blood stem cell transplantation does not provide longer survival than intensive chemotherapy.
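
The donor-versus-no-donor hazard ratios come from Cox proportional hazards models. A minimal sketch with lifelines on simulated stand-in data (the toy hazard ratio of about 0.8 merely echoes the reported 0.81; none of these values are the study's data):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200
donor = rng.integers(0, 2, n)                       # HLA-identical sibling donor
# Exponential survival times with ~20% lower hazard when a donor is available
t = rng.exponential(scale=np.where(donor == 1, 5.0, 4.0))
df = pd.DataFrame({"years": np.minimum(t, 4.0),     # administrative censoring
                   "died": (t < 4.0).astype(int),
                   "donor": donor})

cph = CoxPHFitter().fit(df, duration_col="years", event_col="died")
cph.print_summary()   # exp(coef) for `donor` is the hazard ratio with 95% CI
```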

Relevance: 100.00%

Abstract:

BACKGROUND: VeriStrat® is a serum proteomic test used to determine whether patients with advanced non-small cell lung cancer (NSCLC) who have already received chemotherapy are likely to have good or poor outcomes from treatment with gefitinib or erlotinib. The main objective of our retrospective study was to evaluate the role of VeriStrat as a marker of overall survival (OS) in patients treated with first-line erlotinib and bevacizumab. PATIENTS AND METHODS: Patients were pooled from two phase II trials (SAKK19/05 and NTR528). For survival analyses, a log-rank test was used to determine whether there was a statistically significant difference between groups. The hazard ratio (HR) of any separation was assessed using Cox proportional hazards models. RESULTS: 117 patients were analyzed. VeriStrat classified patients into two groups with a statistically significant difference in OS (p=0.0027, HR=0.480, 95% confidence interval: 0.294-0.784). CONCLUSION: VeriStrat has a prognostic role in patients with advanced, nonsquamous NSCLC treated with first-line erlotinib and bevacizumab. Further work is needed to study the predictive role of VeriStrat for erlotinib and bevacizumab in chemotherapy-untreated patients.
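
A sketch of the survival comparison described: a log-rank test between the two VeriStrat classes. Times, event flags, and group sizes are simulated placeholders for the pooled trial data.

```python
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
t_good = rng.exponential(scale=14, size=80)   # "good" classification (toy)
t_poor = rng.exponential(scale=7, size=37)    # "poor" classification (toy)

res = logrank_test(t_good, t_poor,
                   event_observed_A=np.ones_like(t_good),
                   event_observed_B=np.ones_like(t_poor))
print(res.p_value)   # the paper reports p=0.0027 alongside HR 0.480
```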

Relevance: 100.00%

Abstract:

BACKGROUND: In Western countries, leptospirosis is uncommon and mainly occurs in farmers and individuals engaging in water-related activities. In tropical countries, leptospirosis can be up to 1000 times more frequent, and risk factors for this often severe disease may differ. METHODS: We conducted a one-year population-based matched case-control study to investigate the frequency and associated factors of leptospirosis in the entire population of Seychelles. RESULTS: A total of 75 patients had definite acute leptospirosis based on the microagglutination test (MAT) and polymerase chain reaction (PCR) assay (incidence: 101 per 100,000 per year; 95% confidence interval [CI]: 79-126). Among the controls, MAT was positive in 37% (past infection) and PCR assay in 9% (subclinical infection) of men aged 25-64 with manual occupations. Comparing cases and controls with negative MAT and PCR, leptospirosis was positively associated with walking barefoot around the home, washing in streams, gardening, activities in forests, alcohol consumption, rainfall, wet soil around the home, refuse around the home, rats visible around the home during the daytime, cats in the home, and skin wounds, and inversely associated with indoor occupation. The considered factors accounted for as much as 57% of the variance in predicting the disease. CONCLUSION: These data indicate a high incidence of leptospirosis in Seychelles. This suggests that leptospires are likely to be ubiquitous and that effective leptospirosis control in tropical countries needs a multifactorial approach, including major behaviour change by large segments of the general public.
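
The reported incidence interval is consistent with an exact (Garwood) Poisson CI on 75 observed cases. A minimal sketch; the person-years denominator is back-calculated from cases/rate and is an assumption, not a figure from the paper.

```python
from scipy.stats import chi2

cases = 75
person_years = 74_250        # ~ Seychelles population over one year (assumed)

# Exact (Garwood) 95% CI for a Poisson count
lo = chi2.ppf(0.025, 2 * cases) / 2
hi = chi2.ppf(0.975, 2 * (cases + 1)) / 2

rate = 1e5 * cases / person_years
print(f"{rate:.0f} per 100,000 (95% CI {1e5*lo/person_years:.0f}"
      f"-{1e5*hi/person_years:.0f})")
# -> 101 per 100,000 (95% CI 79-127), essentially the reported interval
```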

Relevance: 100.00%

Abstract:

BACKGROUND: Recombinant human insulin-like growth factor I (rhIGF-I) is a possible disease-modifying therapy for amyotrophic lateral sclerosis (ALS, which is also known as motor neuron disease (MND)). OBJECTIVES: To examine the efficacy of rhIGF-I in affecting disease progression, impact on measures of functional health status, prolonging survival and delaying the use of surrogates (tracheostomy and mechanical ventilation) to sustain survival in ALS. Occurrence of adverse events was also reviewed. SEARCH METHODS: We searched the Cochrane Neuromuscular Disease Group Specialized Register (21 November 2011), CENTRAL (2011, Issue 4), MEDLINE (January 1966 to November 2011) and EMBASE (January 1980 to November 2011) and sought information from the authors of randomised clinical trials and manufacturers of rhIGF-I. SELECTION CRITERIA: We considered all randomised controlled clinical trials involving rhIGF-I treatment of adults with definite or probable ALS according to the El Escorial Criteria. The primary outcome measure was change in Appel Amyotrophic Lateral Sclerosis Rating Scale (AALSRS) total score after nine months of treatment, and secondary outcome measures were change in AALSRS at 1, 2, 3, 4, 5, 6, 7, 8 and 9 months, change in quality of life (Sickness Impact Profile scale), survival and adverse events. DATA COLLECTION AND ANALYSIS: Each author independently graded the risk of bias in the included studies. The lead author extracted data and the other authors checked them. We generated some missing data by making ruler measurements of data in published graphs. We collected data about adverse events from the included trials. MAIN RESULTS: We identified three randomised controlled trials (RCTs) of rhIGF-I, involving 779 participants, for inclusion in the analysis. In a European trial (183 participants) the mean difference (MD) in change in AALSRS total score after nine months was -3.30 (95% confidence interval (CI) -8.68 to 2.08). In a North American trial (266 participants), the MD after nine months was -6.00 (95% CI -10.99 to -1.01). The combined analysis from both RCTs showed an MD after nine months of -4.75 (95% CI -8.41 to -1.09), a significant difference in favour of the treated group. The secondary outcome measures showed non-significant trends favouring rhIGF-I. There was an increased risk of injection site reactions with rhIGF-I (risk ratio 1.26, 95% CI 1.04 to 1.54). A second North American trial (330 participants) used a novel primary end point involving manual muscle strength testing. No differences were demonstrated between the treated and placebo groups in this study. All three trials were at high risk of bias. AUTHORS' CONCLUSIONS: Meta-analysis revealed a significant difference in favour of rhIGF-I treatment; however, the quality of the evidence from the two included trials was low. A third study showed no difference between treatment and placebo. There is no evidence of an increase in survival with rhIGF-I. All three included trials were at high risk of bias.
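
The pooled estimate is standard fixed-effect inverse-variance meta-analysis, and it can be reproduced exactly from the two trial results by recovering each SE from its reported 95% CI:

```python
from math import sqrt

def se_from_ci(lo, hi, z=1.96):
    """Recover a standard error from a reported 95% confidence interval."""
    return (hi - lo) / (2 * z)

# (mean difference, SE) for the European and North American trials
trials = [(-3.30, se_from_ci(-8.68, 2.08)),
          (-6.00, se_from_ci(-10.99, -1.01))]

w = [1 / se**2 for _, se in trials]               # inverse-variance weights
md = sum(wi * est for wi, (est, _) in zip(w, trials)) / sum(w)
se = 1 / sqrt(sum(w))
print(f"MD {md:.2f} (95% CI {md - 1.96*se:.2f} to {md + 1.96*se:.2f})")
# -> MD -4.75 (95% CI -8.41 to -1.09), matching the review's pooled estimate
```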

Relevance: 100.00%

Abstract:

OBJECTIVES: This study was designed to identify macrophage-rich atherosclerotic plaque noninvasively by imaging the tissue uptake of long-circulating superparamagnetic nanoparticles with a positive-contrast off-resonance imaging sequence (inversion recovery with ON-resonant water suppression [IRON]). BACKGROUND: The sudden rupture of macrophage-rich atherosclerotic plaques can trigger the formation of an occlusive thrombus in coronary vessels, resulting in acute myocardial infarction. Therefore, a noninvasive technique that can identify macrophage-rich plaques and thereby assist with risk stratification of patients with atherosclerosis would be of great potential clinical utility. METHODS: Experiments were conducted on a clinical 3-T magnetic resonance imaging (MRI) scanner in 7 heritable hyperlipidemic and 4 control rabbits. Monocrystalline iron-oxide nanoparticles (MION-47) were administered intravenously (2 doses of 250 μmol Fe/kg), and animals underwent IRON-MRI before injection of the nanoparticles and serially 1, 3, and 6 days afterward. RESULTS: After administration of MION-47, a striking signal enhancement was found in areas of plaque only in hyperlipidemic rabbits. The magnitude of enhancement on magnetic resonance images correlated highly with the number of macrophages determined by histology (p < 0.001) and allowed for the detection of macrophage-rich plaque with high accuracy (area under the curve: 0.92, SE: 0.04, 95% confidence interval: 0.84 to 0.96, p < 0.001). No significant signal enhancement was measured in remote areas without plaque on histology or in control rabbits without atherosclerosis. CONCLUSIONS: Using IRON-MRI in conjunction with superparamagnetic nanoparticles is a promising approach for the noninvasive evaluation of macrophage-rich, vulnerable plaques.
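
A CI for an AUC of this kind is often obtained by DeLong's method or the bootstrap; the abstract does not say which was used here, so the sketch below is a generic percentile-bootstrap illustration on simulated stand-ins (`y` for histology status, `score` for signal enhancement).

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
y = rng.integers(0, 2, size=200)               # toy histology labels
score = y * 1.8 + rng.normal(size=200)         # higher signal when positive

aucs = []
for _ in range(2000):
    idx = rng.integers(0, len(y), len(y))      # resample with replacement
    if len(np.unique(y[idx])) < 2:             # need both classes present
        continue
    aucs.append(roc_auc_score(y[idx], score[idx]))
print(np.percentile(aucs, [2.5, 97.5]))        # percentile-bootstrap 95% CI
```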

Relevance: 100.00%

Abstract:

BACKGROUND: The optimal length of stay (LOS) for patients with pulmonary embolism (PE) is unknown. Although reducing LOS is likely to save costs, the effects on patient safety are unclear. We sought to identify patient and hospital factors associated with LOS and assess whether LOS was associated with postdischarge mortality. METHODS: We evaluated patients discharged with a primary diagnosis of PE from 186 acute care hospitals in Pennsylvania (January 2000 through November 2002). We used discrete survival models to examine the association between (1) patient and hospital factors and the time to discharge and (2) LOS and postdischarge mortality within 30 days of presentation, adjusting for patient and hospital factors. RESULTS: Among 15 531 patient discharges with PE, the median LOS was 6 days, and the postdischarge mortality rate was 3.3%. In multivariate analysis, patients from Philadelphia were less likely to be discharged on a given day (odds ratio [OR], 0.82; 95% confidence interval [CI], 0.73-0.93), as were black patients (OR, 0.88; 95% CI, 0.82-0.94). The odds of discharge decreased notably with greater patient severity of illness and in patients without private health insurance. Adjusted postdischarge mortality was significantly higher for patients with an LOS of 4 days or less (OR, 1.55; 95% CI, 1.21-2.00) relative to those with an LOS of 5 to 6 days. CONCLUSIONS: Several hospital and patient factors were independently associated with LOS. Patients with a very short LOS had greater postdischarge mortality relative to patients with a typical LOS, suggesting that physicians may inappropriately select patients with PE for early discharge who are at increased risk of complications.
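
Discrete-time survival models of the kind referred to here are typically fit by expanding each stay into person-day records, with the event indicator set to 1 only on the final day, and running logistic regression on the expanded data. A sketch on simulated stand-in data; the covariate and its effect size are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 500
philly = rng.integers(0, 2, n)                       # toy hospital-region flag
# Geometric LOS: daily discharge probability 0.12 vs 0.16 (longer stays in one group)
los = rng.geometric(p=np.where(philly == 1, 0.12, 0.16))

# Person-day expansion; days beyond 8 share one baseline-hazard category
rows = [{"day": min(d, 8), "discharged": int(d == t), "philly": p}
        for t, p in zip(los, philly) for d in range(1, t + 1)]
pp = pd.DataFrame(rows)

fit = smf.logit("discharged ~ C(day) + philly", data=pp).fit(disp=0)
print(np.exp(fit.params["philly"]))   # per-day odds of discharge (OR < 1 here)
```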

Relevance: 100.00%

Abstract:

A mathematical model is proposed to analyze the effects of acquired immunity on the transmission of schistosomiasis in the human host. From this model, a prevalence curve depending on four parameters can be obtained. These parameters were estimated by fitting the data with the maximum likelihood method. The model showed a good capacity to reproduce real data from two endemic areas of schistosomiasis: Touros, Brazil (Schistosoma mansoni) and Misungwi, Tanzania (S. haematobium). The average worm burden per person and the dispersion of parasites per person in the community can also be obtained from the model. In this paper, the stabilizing effects of the acquired-immunity assumption in the model are assessed in terms of the epidemiological variables as follows. For the prevalence curve, we calculate the confidence interval; for the average worm burden and the worm dispersion in the community, a sensitivity analysis (the range of variation) of both variables with respect to their parameters is performed.
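
A generic sketch of the fitting step described: a four-parameter prevalence curve fit to grouped binomial data by maximum likelihood. Both the curve form (a simple peaked function standing in for the paper's immunity-derived model) and the age-group data are placeholders.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

age = np.array([5, 10, 15, 20, 30, 45])        # hypothetical age groups
n   = np.array([60, 80, 75, 70, 65, 50])       # examined per group (toy)
pos = np.array([10, 35, 45, 40, 30, 18])       # infected per group (toy)

def prevalence(a, p):
    """Placeholder curve: prevalence rises to a peak, then declines with age."""
    peak, width, height, floor = p
    return floor + height * np.exp(-((a - peak) / width) ** 2)

def nll(p):
    """Negative binomial log-likelihood of the grouped prevalence data."""
    pr = np.clip(prevalence(age, p), 1e-6, 1 - 1e-6)
    return -binom.logpmf(pos, n, pr).sum()

fit = minimize(nll, x0=[15, 10, 0.5, 0.05], method="Nelder-Mead")
print(fit.x)   # maximum likelihood estimates of the four parameters
```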

Relevance: 100.00%

Abstract:

BACKGROUND: Atrial arrhythmias increase disease burden in the general adult population. Adults with congenital heart lesions constitute a rapidly growing group of patients with cardiovascular disease. We hypothesized that atrial arrhythmias increase with age and impair health outcomes in this population. METHODS AND RESULTS: We conducted a population-based analysis of prevalence, lifetime risk, mortality, and morbidity associated with atrial arrhythmias in adults with congenital heart disease from 1983 to 2005. In 38 428 adults with congenital heart disease in 2005, 5812 had atrial arrhythmias. Overall, the 20-year risk of developing atrial arrhythmia was 7% in a 20-year-old subject and 38% in a 50-year-old subject. More than 50% of patients with severe congenital heart disease reaching age 18 years developed atrial arrhythmias by age 65 years. In patients with congenital heart disease, the hazard ratio of any adverse event in those with atrial arrhythmias compared with those without was 2.50 (95% confidence interval, 2.38 to 2.62; P<0.0001), with a near 50% increase in mortality (hazard ratio, 1.47; 95% confidence interval, 1.37 to 1.58; P<0.001), more than double the risk of morbidity (stroke or heart failure) (hazard ratio, 2.21; 95% confidence interval, 2.07 to 2.36; P<0.001), and 3 times the risk of cardiac interventions (hazard ratio, 3.00; 95% confidence interval, 2.81 to 3.20; P<0.001). CONCLUSIONS: Atrial arrhythmias occurred in 15% of adults with congenital heart disease. The lifetime incidence increased steadily with age and was associated with a doubling of the risk of adverse events. An increase in resource allocation should be anticipated to deal with this increasing burden.

Relevance: 100.00%

Abstract:

BACKGROUND: Socioeconomic adversity in early life has been hypothesized to "program" a vulnerable phenotype with exaggerated inflammatory responses, so increasing the risk of developing type 2 diabetes in adulthood. The aim of this study is to test this hypothesis by assessing the extent to which the association between lifecourse socioeconomic status and type 2 diabetes incidence is explained by chronic inflammation. METHODS AND FINDINGS: We use data from the British Whitehall II study, a prospective occupational cohort of adults established in 1985. The inflammatory markers C-reactive protein and interleukin-6 were measured repeatedly, and type 2 diabetes incidence (new cases) was monitored over an 18-year follow-up (from 1991-1993 until 2007-2009). Our analytical sample consisted of 6,387 non-diabetic participants (1,818 women), of whom 731 (207 women) developed type 2 diabetes over the follow-up. Cumulative exposure to low socioeconomic status from childhood to middle age was associated with an increased risk of developing type 2 diabetes in adulthood (hazard ratio [HR] = 1.96, 95% confidence interval: 1.48-2.58 for a low cumulative lifecourse socioeconomic score and HR = 1.55, 95% confidence interval: 1.26-1.91 for a low-low socioeconomic trajectory). Chronically elevated inflammation accounted for 25% of the excess risk associated with cumulative socioeconomic adversity across the lifecourse and for 32% of the excess risk associated with a low-low socioeconomic trajectory (95% confidence intervals 16%-58%). CONCLUSIONS: In the present study, chronic inflammation explained a substantial part of the association between lifecourse socioeconomic disadvantage and type 2 diabetes. Further studies should be performed to confirm these findings in population-based samples, as the Whitehall II cohort is not representative of the general population, and to examine the extent to which social inequalities attributable to chronic inflammation are reversible.
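
The "percentage of excess risk explained" is the usual attenuation measure in this literature: 100 * (HR_crude - HR_adjusted) / (HR_crude - 1). The adjusted HR below is back-calculated from the reported 25% and is an assumption, not a figure from the paper.

```python
def pct_excess_explained(hr_crude, hr_adjusted):
    """Share of the excess hazard removed by adjusting for a mediator."""
    return 100 * (hr_crude - hr_adjusted) / (hr_crude - 1)

# HR 1.96 before adjustment; 1.72 is the implied inflammation-adjusted HR
print(pct_excess_explained(hr_crude=1.96, hr_adjusted=1.72))  # -> 25.0
```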

Relevance: 100.00%

Abstract:

OBJECTIVES: Data on the frequency of extraintestinal manifestations (EIMs) in Crohn's disease (CD) and ulcerative colitis (UC) and analyses of their risk factors are scarce. We evaluated their prevalence and risk factors in a large nationwide cohort of inflammatory bowel disease (IBD) patients. METHODS: IBD patients from an adult clinical cohort in Switzerland (Swiss IBD cohort study) were prospectively included. Data from validated physician enrolment questionnaires were analyzed. RESULTS: A total of 950 patients were included, 580 (61%) with CD (mean age 41 years) and 370 (39%) with UC (mean age 42 years). Of these, 249 (43%) of the CD and 113 (31%) of the UC patients had one to five EIMs. The following EIMs were found: arthritis (CD 33%, UC 21%), aphthous stomatitis (CD 10%, UC 4%), uveitis (CD 6%, UC 4%), erythema nodosum (CD 6%, UC 3%), ankylosing spondylitis (CD 6%, UC 2%), psoriasis (CD 2%, UC 1%), pyoderma gangrenosum (CD and UC each 2%), and primary sclerosing cholangitis (CD 1%, UC 4%). Multiple logistic regression identified the following risk factors for ongoing EIM in CD: active disease (odds ratio (OR)=1.95, 95% confidence interval (CI)=1.17-3.23, P=0.01) and positive IBD family history (OR=1.77, 95% CI=1.07-2.92, P=0.025). No risk factors were identified in UC patients. CONCLUSIONS: EIMs are a frequent problem in CD and UC patients. Active disease and a positive IBD family history are associated with ongoing EIM in CD patients. Identifying the prevalence of EIMs and their associated risk factors may increase awareness of this problem and thereby facilitate their diagnosis and therapeutic management.

Relevance: 100.00%

Abstract:

BACKGROUND: A growing number of case reports have described tenofovir (TDF)-related proximal renal tubulopathy and impaired calculated glomerular filtration rates (cGFR). We assessed TDF-associated changes in cGFR in a large observational HIV cohort. METHODS: We compared treatment-naive patients, or patients with treatment interruptions of ≥12 months, starting either a TDF-based combination antiretroviral therapy (cART) (n = 363) or a TDF-sparing regimen (n = 715). The predefined primary endpoint was the time to a 10 ml/min reduction in cGFR, based on the Cockcroft-Gault equation, confirmed by a follow-up measurement at least 1 month later. In sensitivity analyses, secondary endpoints including calculations based on the Modification of Diet in Renal Disease (MDRD) formula were considered. Endpoints were modelled using pre-specified covariates in a multiple Cox proportional hazards model. RESULTS: Two-year event-free probabilities were 0.65 (95% confidence interval [CI] 0.58-0.72) and 0.80 (95% CI 0.76-0.83) for patients starting TDF-containing or TDF-sparing cART, respectively. In the multiple Cox model, diabetes mellitus (hazard ratio [HR] = 2.34 [95% CI 1.24-4.42]), higher baseline cGFR (HR = 1.03 [95% CI 1.02-1.04] per 10 ml/min), TDF use (HR = 1.84 [95% CI 1.35-2.51]) and boosted protease inhibitor use (HR = 1.71 [95% CI 1.30-2.24]) significantly increased the risk of reaching the primary endpoint. Sensitivity analyses showed high consistency. CONCLUSION: There is consistent evidence for a significant reduction in cGFR associated with TDF use in HIV-infected patients. Our findings call for strict monitoring of renal function in long-term TDF users with tests that distinguish between glomerular dysfunction and proximal renal tubulopathy, a known adverse effect of TDF.
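
The quoted event-free probabilities are Kaplan-Meier estimates. A minimal sketch with lifelines on simulated stand-in times; the constant hazard is chosen only so the two-year value lands near the TDF arm's 0.65, and is not the cohort's actual hazard.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(5)
t = rng.exponential(scale=4.6, size=363)    # toy years to confirmed cGFR decline
event = t < 5                               # administrative censoring at 5 years

kmf = KaplanMeierFitter().fit(np.minimum(t, 5), event_observed=event)
print(kmf.predict(2.0))   # event-free (survival) probability at 2 years, ~0.65
```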

Relevance: 100.00%

Abstract:

The authors examined the associations of social support with socioeconomic status (SES) and with mortality, as well as how SES differences in social support might account for SES differences in mortality. Analyses were based on 9,333 participants from the British Whitehall II cohort, a longitudinal study established in 1985 among London-based civil servants who were 35-55 years of age at baseline. SES was assessed using participants' employment grades at baseline. Social support was assessed 3 times over the 24.4-year period during which participants were monitored for death. In men, marital status and, to a lesser extent, network score (but not low perceived support or high negative aspects of close relationships) predicted both all-cause and cardiovascular mortality. Measures of social support were not associated with cancer mortality. Men in the lowest SES category had an increased risk of death compared with those in the highest category (for all-cause mortality, hazard ratio = 1.59, 95% confidence interval: 1.21, 2.08; for cardiovascular mortality, hazard ratio = 2.48, 95% confidence interval: 1.55, 3.92). Network score and marital status combined explained 27% (95% confidence interval: 14, 43) and 29% (95% confidence interval: 17, 52) of the associations between SES and all-cause and cardiovascular mortality, respectively. In women, there was no consistent association between social support indicators and mortality. The present study suggests that in men, social isolation is not only an important risk factor for mortality but is also likely to contribute to SES differences in mortality.