819 results for "Risk levels"
Abstract:
Objective. To investigate the relationship between coping and atherothrombotic biomarkers of an increased cardiovascular disease (CVD) risk in the elderly. Methods. We studied 136 elderly caregiving and noncaregiving men and women who completed the Ways of Coping Checklist to assess problem-focused coping, seeking social support (SSS), blamed self, wishful thinking, and avoidance coping. They had circulating levels of 12 biomarkers measured. We also probed for potential mediator and moderator variables (chronic stress, affect, health behavior, autonomic activity) for the relation between coping and biomarkers. Results. After controlling for demographic and CVD risk factors, greater use of SSS was associated with elevated levels of serum amyloid A (P = 0.001), C-reactive protein (CRP) (P = 0.002), vascular cellular adhesion molecule (VCAM)-1 (P = 0.021), and D-dimer (P = 0.032). There were several moderator effects. For instance, greater use of SSS was associated with elevated VCAM-1 (P < 0.001) and CRP (P = 0.001) levels in subjects with low levels of perceived social support and positive affect, respectively. The other coping styles were not significantly associated with any biomarker. Conclusions. Greater use of SSS might compromise cardiovascular health through atherothrombotic mechanisms, including elevated inflammation (i.e., serum amyloid A, CRP, VCAM-1) and coagulation (i.e., D-dimer) activity. Moderating variables need to be considered in this relationship.
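The moderator effects described above (e.g., SSS relating to VCAM-1 mainly when perceived social support is low) can be illustrated with a minimal sketch of effect modification. All cell means below are invented for illustration; they are not the study's data.

```python
# Hypothetical sketch of moderation ("effect modification"): the
# coping-biomarker association is examined separately at low and high
# levels of a moderator. All numbers are invented example values.

# Mean biomarker level (arbitrary units) in four hypothetical cells:
#   seeking-social-support coping (low/high) x perceived support (low/high)
mean_y = {
    ("low_SSS", "low_support"): 2.0,
    ("high_SSS", "low_support"): 3.4,   # SSS raises the biomarker here...
    ("low_SSS", "high_support"): 2.1,
    ("high_SSS", "high_support"): 2.2,  # ...but barely here.
}

# Simple slope of SSS within each moderator stratum:
effect_low = mean_y[("high_SSS", "low_support")] - mean_y[("low_SSS", "low_support")]
effect_high = mean_y[("high_SSS", "high_support")] - mean_y[("low_SSS", "high_support")]

# The interaction (moderation) is the difference between the two slopes.
interaction = effect_high - effect_low
print(round(effect_low, 2), round(effect_high, 2), round(interaction, 2))
```

A regression with an SSS x support product term generalizes this two-by-two contrast to continuous scores.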
Abstract:
Biomarkers are currently best used as mechanistic "signposts" rather than as "traffic lights" in the environmental risk assessment of endocrine-disrupting chemicals (EDCs). In field studies, biomarkers of exposure [e.g., vitellogenin (VTG) induction in male fish] are powerful tools for tracking single substances and mixtures of concern. Biomarkers also provide linkage between field and laboratory data, thereby playing an important role in directing the need for and design of fish chronic tests for EDCs. It is the adverse effect end points (e.g., altered development, growth, and/or reproduction) from such tests that are most valuable for calculating adverse-NOEC (no observed effect concentration) or adverse-EC10 (effective concentration for a 10% response) and subsequently deriving predicted no effect concentrations (PNECs). With current uncertainties, biomarker-NOEC or biomarker-EC10 data should not be used in isolation to derive PNECs. In the future, however, there may be scope to increasingly use biomarker data in environmental decision making, if plausible linkages can be made across levels of organization such that adverse outcomes might be envisaged relative to biomarker responses. For biomarkers to fulfil their potential, they should be mechanistically relevant and reproducible (as measured by interlaboratory comparisons of the same protocol). VTG is a good example of such a biomarker in that it provides insight into the mode of action (estrogenicity) that is vital to fish reproductive health. Interlaboratory reproducibility data for VTG are also encouraging; recent comparisons (using the same immunoassay protocol) have provided coefficients of variation (CVs) of 38-55% (comparable to published CVs of 19-58% for fish survival and growth end points used in regulatory test guidelines).
While concern over environmental xenoestrogens has led to the evaluation of reproductive biomarkers in fish, it must be remembered that many substances act via diverse mechanisms of action such that the environmental risk assessment for EDCs is a broad and complex issue. Also, biomarkers such as secondary sexual characteristics, gonadosomatic indices, plasma steroids, and gonadal histology have significant potential for guiding interspecies assessments of EDCs and designing fish chronic tests. To strengthen the utility of EDC biomarkers in fish, we need to establish a historical control database (also considering natural variability) to help differentiate between statistically detectable versus biologically significant responses. In conclusion, as research continues to develop a range of useful EDC biomarkers, environmental decision-making needs to move forward, and it is proposed that the "biomarkers as signposts" approach is a pragmatic way forward in the current risk assessment of EDCs.
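The interlaboratory coefficients of variation quoted above follow the standard definition CV = SD / mean. A minimal sketch, using invented VTG measurements from five hypothetical laboratories:

```python
import statistics

# Coefficient of variation (CV) as used for interlaboratory comparisons:
# CV = standard deviation / mean, reported as a percentage.
# The VTG concentrations below are invented example values, not study data.
vtg_ng_per_ml = [120.0, 85.0, 150.0, 60.0, 110.0]

mean = statistics.mean(vtg_ng_per_ml)
sd = statistics.stdev(vtg_ng_per_ml)   # sample SD (n - 1 denominator)
cv_percent = 100.0 * sd / mean

print(f"mean={mean:.1f} ng/mL, SD={sd:.1f}, CV={cv_percent:.0f}%")
```

A CV in the 30-50% range for a biological immunoassay end point is consistent with the figures the abstract cites for VTG and for regulatory survival/growth end points.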
Abstract:
Microalbuminuria is an established risk factor for renal disease, especially in the diabetic population. Recent studies have shown that microalbuminuria also has a highly relevant predictive value for cardiovascular morbidity and mortality. From normal levels to overt proteinuria, increasing albuminuria shows a continuous, marked increase in cardiovascular risk. This association is independent of other "classical" cardiovascular risk factors such as hypertension, hyperlipidemia or smoking. Furthermore, it has a predictive value not only for patients with diabetic or renal disease, but also for hypertensive individuals or the general population. Angiotensin-converting enzyme inhibitors and angiotensin receptor blockers have been shown to display not only reno- but also cardioprotective effects. Their unique ability to lower albuminuria by 40% is related to a significant risk reduction in cardiovascular mortality. New clinical trials are needed to define "normal" albuminuria levels and how low we should go.
Abstract:
BACKGROUND/AIM: Parallel investigation, in a matched case-control study, of the association of different first-trimester markers with the risk of subsequent pre-eclampsia (PE). METHOD: The levels of different first-trimester serum markers and fetal nuchal translucency thickness were compared between 52 cases of PE and 104 control women by non-parametric two-group comparisons and by calculating matched odds ratios. RESULTS: In univariable analysis, increased concentrations of inhibin A and activin A were associated with subsequent PE (p < 0.02). Multivariable conditional logistic regression models revealed associations between increased risk of PE and increased inhibin A and nuchal translucency thickness, and reduced pregnancy-associated plasma protein A (PAPP-A) and placental lactogen, respectively. However, these associations varied with the gestational age at sample collection. For blood samples taken in pregnancy weeks 12 and 13 only, increased levels of activin A, inhibin A and nuchal translucency thickness, and lower levels of placental growth factor and PAPP-A were associated with an increased risk of PE. CONCLUSIONS: Members of the inhibin family and, to some extent, PAPP-A and placental growth factor are superior to other serum markers, and the predictive value of these depends on the gestational age at blood sampling. The availability of a single, early pregnancy 'miracle' serum marker for PE risk assessment seems unlikely in the near future.
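The matched odds ratios mentioned above come from conditional (matched) analyses. In the simplest special case, 1:1 matched pairs with a binary marker, the matched OR reduces to a ratio of discordant pairs; the study itself used 1:2 matching and conditional logistic regression, which generalize this idea. The counts below are invented for illustration:

```python
# Simplified, hypothetical illustration of a matched odds ratio.
# With 1:1 matched case-control pairs and a binary marker (e.g., inhibin A
# above some cut-off), only discordant pairs carry information, and the
# conditional (matched) OR is their ratio. Counts are invented.

pairs_case_exposed_only = 18     # case above cut-off, matched control below
pairs_control_exposed_only = 8   # control above cut-off, matched case below

matched_or = pairs_case_exposed_only / pairs_control_exposed_only
print(f"matched OR = {matched_or:.2f}")  # 2.25 for these invented counts
```

Concordant pairs (both above or both below the cut-off) drop out of the estimate entirely, which is why matched designs are analysed conditionally rather than with an ordinary 2 x 2 table.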
Abstract:
To compare the prediction of hip fracture risk of several bone ultrasounds (QUS), 7062 Swiss women ≥70 years of age were measured with three QUSs (two of the heel, one of the phalanges). Heel QUSs were both predictive of hip fracture risk, whereas the phalanges QUS was not. INTRODUCTION: As the number of hip fractures is expected to increase during the next decades, it is important to develop strategies to detect subjects at risk. Quantitative bone ultrasound (QUS), an ionizing radiation-free method, which is transportable, could be interesting for this purpose. MATERIALS AND METHODS: The Swiss Evaluation of the Methods of Measurement of Osteoporotic Fracture Risk (SEMOF) study is a multicenter cohort study, which compared three QUSs for the assessment of hip fracture risk in a sample of 7609 elderly ambulatory women ≥70 years of age. Two QUSs measured the heel (Achilles+; GE-Lunar and Sahara; Hologic), and one measured the phalanges (DBM Sonic 1200; IGEA). Cox proportional hazards regression was used to estimate the hazard of the first hip fracture, adjusted for age, BMI, and center, and areas under the ROC curves were calculated to compare the devices and their parameters. RESULTS: From the 7609 women who were included in the study, 7062 women 75.2 +/- 3.1 (SD) years of age were prospectively followed for 2.9 +/- 0.8 years. Eighty women reported a hip fracture. A decrease by 1 SD of the QUS variables corresponded to an increase of the hip fracture risk from 2.3 (95% CI, 1.7, 3.1) to 2.6 (95% CI, 1.9, 3.4) for the three variables of Achilles+ and from 2.2 (95% CI, 1.7, 3.0) to 2.4 (95% CI, 1.8, 3.2) for the three variables of Sahara. Risk gradients did not differ significantly among the variables of the two heel QUS devices.
On the other hand, the phalanges QUS (DBM Sonic 1200) was not predictive of hip fracture risk, with an adjusted hazard ratio of 1.2 (95% CI, 0.9, 1.5), even after reanalysis of the digitized data and using different cut-off levels (1700 or 1570 m/s). CONCLUSIONS: In this population of elderly women, heel QUS devices were both predictive of hip fracture risk, whereas the phalanges QUS device was not.
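Per-SD hazard ratios like the 2.3 reported for Achilles+ scale multiplicatively under the log-linear Cox model the study used. A short sketch of that scaling (the 2 SD extrapolation is ours, not the study's):

```python
import math

# Under a Cox model, HR per 1 SD decrease = exp(beta), so the HR for a
# k-SD decrease is exp(k * beta) = (HR per SD) ** k.
hr_per_sd = 2.3                 # per-1-SD hazard ratio reported above
beta = math.log(hr_per_sd)      # underlying Cox regression coefficient

hr_2sd = math.exp(2 * beta)     # equivalently hr_per_sd ** 2
print(f"HR for a 2 SD decrease: {hr_2sd:.1f}")  # 5.3
```

This is why seemingly moderate per-SD gradients translate into large risk differences between women at the extremes of the QUS distribution.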
Abstract:
OBJECTIVES: To estimate changes in coronary risk factors and their implications for coronary heart disease (CHD) rates in men starting highly active antiretroviral therapy (HAART). METHODS: Men participating in the Swiss HIV Cohort Study with measurements of coronary risk factors both before and up to 3 years after starting HAART were identified. Fractional polynomial regression was used to graph associations between risk factors and time on HAART. Mean risk factor changes associated with starting HAART were estimated using multilevel models. A prognostic model was used to predict corresponding CHD rate ratios. RESULTS: Of 556 eligible men, 259 (47%) started a nonnucleoside reverse transcriptase inhibitor (NNRTI) and 297 a protease inhibitor (PI) based regimen. Levels of most risk factors increased sharply during the first 3 months on HAART, then more slowly. Increases were greater with PI- than NNRTI-based HAART for total cholesterol (1.18 vs. 0.98 mmol L(-1)), systolic blood pressure (3.6 vs. 0 mmHg) and BMI (1.04 vs. 0.55 kg m(-2)) but not HDL cholesterol (0.24 vs. 0.32 mmol L(-1)) or glucose (1.02 vs. 1.03 mmol L(-1)). Predicted CHD rate ratios were 1.40 (95% CI 1.13-1.75) and 1.17 (0.95-1.47) for PI- and NNRTI-based HAART respectively. CONCLUSIONS: Coronary heart disease rates will increase in a majority of patients starting HAART; however, the increases corresponding to typical changes in risk factors are relatively modest and could be offset by lifestyle changes.
Abstract:
BACKGROUND: Health risk appraisal is a promising method for health promotion and prevention in older persons. The Health Risk Appraisal for the Elderly (HRA-E) developed in the U.S. has unique features but has not been tested outside the United States. METHODS: Based on the original HRA-E, we developed a scientifically updated and regionally adapted multilingual Health Risk Appraisal for Older Persons (HRA-O) instrument consisting of a self-administered questionnaire and software-generated feedback reports. We evaluated the practicability and performance of the questionnaire in non-disabled community-dwelling older persons in London (U.K.) (N = 1090), Hamburg (Germany) (N = 804), and Solothurn (Switzerland) (N = 748) in a sub-sample of an international randomised controlled study. RESULTS: Over eighty percent of invited older persons returned the self-administered HRA-O questionnaire. Fair or poor self-perceived health status and older age were correlated with higher rates of non-return of the questionnaire. Older participants and those with lower educational levels reported more difficulty in completing the HRA-O questionnaire than younger and more highly educated persons. However, even among older participants and those with low educational levels, more than 80% rated the questionnaire as easy to complete. Prevalence rates of risks for functional decline or problems were between 2% and 91% for the 19 HRA-O domains. Participants' intention to change health behaviour suggested that for some risk factors participants were in a pre-contemplation phase, having no short- or medium-term plans for change. Many participants perceived their health behaviour or preventative care uptake as optimal, despite indications of deficits according to the HRA-O based evaluation. CONCLUSION: The HRA-O questionnaire was highly accepted by a broad range of community-dwelling non-disabled persons.
It identified a high number of risks and problems, and provided information on participants' intention to change health behaviour.
Abstract:
BACKGROUND: In the UK, population screening for unmet need has failed to improve the health of older people. Attention is turning to interventions targeted at 'at-risk' groups. Living alone in later life is seen as a potential health risk, and older people living alone are thought to be an at-risk group worthy of further intervention. AIM: To explore the clinical significance of living alone and the epidemiology of lone status as an at-risk category, by investigating associations between lone status and health behaviours, health status, and service use, in non-disabled older people. DESIGN OF STUDY: Secondary analysis of baseline data from a randomised controlled trial of health risk appraisal in older people. SETTING: Four group practices in suburban London. METHOD: Sixty per cent of 2641 community-dwelling non-disabled people aged 65 years and over registered at a practice agreed to participate in the study; 84% of these returned completed questionnaires. A third of this group (n = 860, 33.1%) lived alone and two-thirds (n = 1741, 66.9%) lived with someone else. RESULTS: Those living alone were more likely to report fair or poor health, poor vision, difficulties in instrumental and basic activities of daily living, worse memory and mood, lower physical activity, poorer diet, worsening function, risk of social isolation, hazardous alcohol use, having no emergency carer, and multiple falls in the previous 12 months. After adjustment for age, sex, income, and educational attainment, living alone remained associated with multiple falls, functional impairment, poor diet, smoking status, risk of social isolation, and three self-reported chronic conditions: arthritis and/or rheumatism, glaucoma, and cataracts.
CONCLUSION: Clinicians working with independently-living older people living alone should anticipate higher levels of disease and disability in these patients, and higher health and social risks, much of which will be due to older age, lower educational status, and female sex. Living alone itself appears to be associated with higher risks of falling, and constellations of pathologies, including visual loss and joint disorders. Targeted population screening using lone status may be useful in identifying older individuals at high risk of falling.
Abstract:
BACKGROUND: We studied the association of baseline fasting plasma glucose (FPG) levels with survival and coronary artery disease (CAD) progression among postmenopausal women without unstable angina. METHODS: Women were recruited from seven centers in the Women's Angiographic Vitamin and Estrogen Trial (WAVE) (n = 423). Event follow-up was available for 400 women (65.1 +/- 8.5 years, 66% white, 92% hypertensive, 19% smokers, 67% hypercholesterolemic). Thirty-eight percent of the women had diabetes or FPG > 125 mg/dL, and 21% had a fasting glucose 100-125 mg/dL. Follow-up angiography was performed in 304 women. Cox regression was used to model survival from a composite outcome of death or myocardial infarction (D/MI, 26 events; median follow-up 2.4 years). Angiographic progression was analyzed quantitatively using linear regression accounting for baseline minimum lumen diameter (MLD), follow-up time, and intrasubject correlations using generalized estimating equations. Regression analyses were adjusted for follow-up time, baseline age, treatment assignment, and Framingham risk (excluding diabetes). RESULTS: Women with impaired fasting glucose/diabetes mellitus (IFG/DM) had a relative risk (RR) of D/MI of 4.2 (p = 0.009). In all women, each 10 mg/dL increase in FPG was associated with an 11% increase (p < 0.001) in the hazard of D/MI. Each 10 mg/dL increase in FPG was associated with a 6.8 μm decrease in MLD over the follow-up period (p = 0.005). CONCLUSIONS: Higher FPG is associated with increased risk of D/MI and greater narrowing of the coronary lumen in women with CAD. Aggressive monitoring of glucose levels may be beneficial for secondary CAD prevention.
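The 11% hazard increase per 10 mg/dL of FPG compounds multiplicatively under the Cox model the study used; a larger glucose difference is not a simple multiple of 11%. A short sketch of that arithmetic:

```python
# Under a Cox model, a per-unit hazard ratio compounds multiplicatively:
# the HR for a 30 mg/dL higher FPG is 1.11 ** 3, not 1 + 3 * 0.11.
hr_per_10mg = 1.11   # 11% hazard increase per 10 mg/dL FPG, as reported

hr_30mg = hr_per_10mg ** 3
print(f"HR for +30 mg/dL FPG: {hr_30mg:.2f}")  # 1.37, vs. 1.33 if additive
```

The additive and multiplicative figures diverge further as the glucose difference grows, which matters when extrapolating across the diabetic range.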
Abstract:
BACKGROUND: Single-nucleotide polymorphisms in genes involved in lipoprotein and adipocyte metabolism may explain why dyslipidemia and lipoatrophy occur in some but not all antiretroviral therapy (ART)-treated individuals. METHODS: We evaluated the contribution of APOC3 -482C-->T, -455T-->C, and 3238C-->G; epsilon 2 and epsilon 4 alleles of APOE; and TNF -238G-->A to dyslipidemia and lipoatrophy by longitudinally modeling >2600 lipid determinations and 2328 lipoatrophy assessments in 329 ART-treated patients during a median follow-up period of 3.4 years. RESULTS: In human immunodeficiency virus (HIV)-infected individuals, the effects of variant alleles of APOE on plasma cholesterol and triglyceride levels and of APOC3 on plasma triglyceride levels were comparable to those reported in the general population. However, when treated with ritonavir, individuals with unfavorable genotypes of APOC3 and APOE were at risk of extreme hypertriglyceridemia. They had median plasma triglyceride levels of 7.33 mmol/L, compared with 3.08 mmol/L in the absence of ART. The net effect of the APOE*APOC3*ritonavir interaction was an increase in plasma triglyceride levels of 2.23 mmol/L. No association between TNF -238G-->A and lipoatrophy was observed. CONCLUSIONS: Variant alleles of APOE and APOC3 contribute to an unfavorable lipid profile in patients with HIV. Interactions between genotypes and ART can lead to severe hyperlipidemia. Genetic analysis may identify patients at high risk for severe ritonavir-associated hypertriglyceridemia.
Abstract:
BACKGROUND: Though guidelines emphasize low-density lipoprotein cholesterol (LDL-C) lowering as an essential strategy for cardiovascular risk reduction, achieving target levels may be difficult. PATIENTS AND METHODS: The authors conducted a prospective, controlled, open-label trial examining the effectiveness and safety of high-dose fluvastatin or a standard dosage of simvastatin plus ezetimibe, both with an intensive guideline-oriented cardiac rehabilitation program, in achieving the new ATP III LDL-C targets in patients with proven coronary artery disease. 305 consecutive patients were enrolled in the study. Patients were divided into two groups: the simvastatin (40 mg/d) plus ezetimibe (10 mg/d) and the fluvastatin-only group (80 mg/d). Patients in both study groups received the treatment for 21 days in addition to nonpharmacological measures, including advanced physical, dietary, psychosocial, and educational activities. RESULTS: After 21 days of treatment, a significant reduction in LDL-C was found in both study groups as compared to the initial values; however, the reduction in LDL-C was significantly stronger in the simvastatin plus ezetimibe group: simvastatin plus ezetimibe treatment decreased LDL-C to a mean level of 57.7 +/- 1.7 mg/dl, while fluvastatin achieved a reduction to 84.1 +/- 2.4 mg/dl (p < 0.001). In the simvastatin plus ezetimibe group, 95% of the patients reached the target level of LDL-C < 100 mg/dl. This percentage was significantly higher than in patients treated with fluvastatin alone (75%; p < 0.001). The greater effectiveness of simvastatin plus ezetimibe was more impressive when considering the optional goal of LDL-C < 70 mg/dl (75% vs. 32%, respectively; p < 0.001). There was no difference in occurrence of adverse events between both groups.
CONCLUSION: Simvastatin 40 mg/d plus ezetimibe 10 mg/d, on the background of a guideline-oriented standardized intensive cardiac rehabilitation program, can reach 95% effectiveness in achieving challenging goals (LDL < 100 mg/dl) using lipid-lowering medication in patients at high cardiovascular risk.
Abstract:
Respiratory infections cause considerable morbidity during infancy. The impact of innate immunity mechanisms, such as mannose-binding lectin (MBL), on respiratory symptoms remains unclear. The aims of this study were to investigate whether cord blood MBL levels are associated with respiratory symptoms during infancy and to determine the relative contribution of MBL when compared with known risk factors. This is a prospective birth cohort study including 185 healthy term infants. MBL was measured in cord blood and categorized into tertiles. Frequency and severity of respiratory symptoms were assessed weekly until one year of age. Association with MBL levels was analysed using multivariable random effects Poisson regression. We observed a trend towards an increased incidence rate of severe respiratory symptoms in infants in the low MBL tertile when compared with infants in the middle MBL tertile [incidence rate ratio (IRR) = 1.59; 95% confidence interval (CI): 0.95-2.66; p = 0.076]. Surprisingly, infants in the high MBL tertile suffered significantly more from severe and total respiratory symptoms than infants in the middle MBL tertile (IRR = 1.97; 95% CI: 1.20-3.25; p = 0.008). This association was pronounced in infants of parents with asthma (IRR = 3.64; 95% CI: 1.47-9.02; p = 0.005). The relative risk associated with high MBL was similar to the risk associated with well-known risk factors such as maternal smoking or childcare. In conclusion, the association between low MBL levels and increased susceptibility to common respiratory infections during infancy was weaker than that previously reported. Instead, high cord blood MBL levels may represent a so far unrecognized risk factor for respiratory morbidity in infants of asthmatic parents.
Abstract:
Necrotising enterocolitis (NEC) causes significant morbidity and mortality in premature infants. The role of innate immunity in the pathogenesis of NEC remains unclear. Mannose-binding lectin (MBL) recognizes microorganisms and activates the complement system via MBL-associated serine protease-2 (MASP-2). The aim of this study was to investigate whether MBL and MASP-2 are associated with NEC. This observational case-control study included 32 infants with radiologically confirmed NEC and 64 controls. MBL and MASP-2 were measured in cord blood using ELISA. Multivariate logistic regression was performed. Of the 32 NEC cases (median gestational age, 30.5 wk), 13 (41%) were operated on and 5 (16%) died. MASP-2 cord blood concentration ranged from undetectable (<10 ng/mL) to 277 ng/mL. Eighteen of 32 (56%) NEC cases had higher MASP-2 levels (≥30 ng/mL) compared with 22 of 64 (34%) controls (univariate OR 2.46; 95% CI 1.03-5.85; p = 0.043). Higher cord blood MASP-2 levels were significantly associated with an increased risk of NEC in multivariate analysis (OR 3.00; 95% CI 1.17-7.93; p = 0.027). MBL levels were not associated with NEC (p = 0.64). In conclusion, infants later developing NEC had significantly higher MASP-2 cord blood levels compared with controls. Higher MASP-2 may favor complement-mediated inflammation and could thereby predispose to NEC.
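The univariate odds ratio above can be reproduced, up to rounding, directly from the reported 2 x 2 counts (18 of 32 cases vs. 22 of 64 controls with MASP-2 of at least 30 ng/mL):

```python
# Odds ratio from the reported 2 x 2 table: (cases high / cases low)
# divided by (controls high / controls low).
cases_high, cases_low = 18, 32 - 18        # 18 vs. 14 NEC cases
controls_high, controls_low = 22, 64 - 22  # 22 vs. 42 controls

odds_ratio = (cases_high / cases_low) / (controls_high / controls_low)
print(f"OR = {odds_ratio:.2f}")  # ~2.45, matching the reported 2.46 to rounding
```

The multivariate OR of 3.00 cannot be recovered this way, since it comes from a logistic model adjusting for covariates not tabulated in the abstract.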
Abstract:
BACKGROUND: There is inadequate evidence to support currently formulated NHS strategies to achieve health promotion and preventative care in older people through broad-based screening and assessment in primary care. The most extensively evaluated delivery instrument for this purpose is Health Risk Appraisal (HRA). This article describes a trial using HRA to evaluate the effect on health behaviour and preventative-care uptake in older people in NHS primary care. METHODS: A randomised controlled trial was undertaken in three London primary care group practices. Functionally independent community-dwelling patients older than 65 years (n = 2,503) received a self-administered Health Risk Appraisal for Older Persons (HRA-O) questionnaire leading to computer-generated individualised written feedback to participants and general practitioners (GPs), integrated into practice information-technology (IT) systems. All primary care staff received training in preventative health in older people. The main outcome measures were self-reported health behaviour and preventative care uptake at 1-year follow-up. RESULTS: Of 2,503 individuals randomised, 2,006 respondents (80.1%) (intervention, n = 940, control n = 1,066) were available for analysis. Intervention group respondents reported slightly higher pneumococcal vaccination uptake and equivocal improvement in physical activity levels compared with controls. No significant differences were observed for any other categories of health behaviour or preventative care measures at 1-year follow-up. CONCLUSIONS: HRA-O implemented in this way resulted in minimal improvement of health behaviour or uptake of preventative care measures in older people. Supplementary reinforcement involving contact by health professionals with patients over and above routine clinical encounters may be a prerequisite to the effectiveness of IT-based delivery systems for health promotion in older people.
Abstract:
Nearly 22 million Americans operate as shift workers, and shift work has been linked to the development of cardiovascular disease (CVD). This study is aimed at identifying pivotal risk factors of CVD by assessing 24 hour ambulatory blood pressure, state anxiety levels and sleep patterns in 12 hour fixed shift workers. We hypothesized that night shift work would negatively affect blood pressure regulation, anxiety levels and sleep patterns. A total of 28 subjects (ages 22-60) were divided into two groups: 12 hour fixed night shift workers (n=15) and 12 hour fixed day shift workers (n=13). 24 hour ambulatory blood pressure measurements (Space Labs 90207) were taken twice: once during a regular work day and once on a non-work day. State anxiety levels were assessed on both test days using Spielberger's State-Trait Anxiety Inventory. Total sleep time (TST) was determined using a self-recorded sleep diary. Night shift workers demonstrated increases in 24 hour systolic (122 ± 2 to 126 ± 2 mmHg, P=0.012); diastolic (75 ± 1 to 79 ± 2 mmHg, P=0.001); and mean arterial pressures (90 ± 2 to 94 ± 2 mmHg, P<0.001) during work days compared to off days. In contrast, 24 hour blood pressures were similar during work and off days in day shift workers. Night shift workers reported less TST on work days versus off days (345 ± 16 vs. 552 ± 30 min; P<0.001), whereas day shift workers reported similar TST during work and off days (475 ± 16 vs. 437 ± 20 min; P=0.231). State anxiety scores did not differ between the groups or testing days (time*group interaction P=0.248), suggesting increased 24 hour blood pressure during night shift work is related to decreased TST, not short term anxiety. Our findings suggest that fixed night shift work causes disruption of the normal sleep-wake cycle negatively affecting acute blood pressure regulation, which may increase the long-term risk for CVD.
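The mean arterial pressures reported above are close to the standard estimate MAP ≈ DBP + (SBP − DBP) / 3, shown here as a consistency check (ambulatory monitors average many individual readings, so agreement to within about 1 mmHg is what one would expect):

```python
# Standard estimate of mean arterial pressure from systolic/diastolic values.
def mean_arterial_pressure(sbp: float, dbp: float) -> float:
    """Estimate MAP (mmHg): diastolic plus one third of pulse pressure."""
    return dbp + (sbp - dbp) / 3.0

# Group means for night shift workers, from the abstract:
off_day = mean_arterial_pressure(122, 75)   # reported MAP: 90 +/- 2 mmHg
work_day = mean_arterial_pressure(126, 79)  # reported MAP: 94 +/- 2 mmHg
print(round(off_day, 1), round(work_day, 1))
```

The formula weights diastole more heavily because the heart spends roughly two thirds of each cycle in diastole at resting heart rates.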