775 results for "low risk population"
Abstract:
A longitudinal capture-mark-recapture study was conducted to determine the temporal dynamics of rabbit haemorrhagic disease (RHD) in a European rabbit (Oryctolagus cuniculus) population of low to moderate density on sand-hill country in the lower North Island of New Zealand. A combination of sampling (trapping and radio-tracking) and diagnostic (cELISA, PCR and isotype ELISA) methods was employed to obtain data weekly from May 1998 until June 2001. Although rabbit haemorrhagic disease virus (RHDV) infection was detected in the study population in all 3 years, disease epidemics were evident only in the late summer or autumn months of 1999 and 2001. Overall, 20% of 385 samples obtained from adult animals older than 11 weeks were seropositive. An RHD outbreak in 1999 contributed to an estimated population decline of 26%. A second RHD epidemic in February 2001 was associated with a population decline of 52% over the subsequent month. Following the outbreaks, the seroprevalence in adult survivors was between 40% and 50%. During 2000, no deaths from RHDV were confirmed and mortalities were predominantly attributed to predation. Influx of seronegative immigrants was greatest in the 1999 and 2001 breeding seasons, and preceded the RHD epidemics in those years. Our data suggest that RHD epidemics require the population immunity level to fall below the threshold at which propagation of infection can be maintained through the population.
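A minimal formalisation of the threshold argument in the final sentence, assuming the standard SIR herd-immunity relation (the abstract does not report R0 for RHDV; the numerical value below is purely illustrative):

\[ R_{\mathrm{eff}} = R_0\,(1 - p) > 1 \;\Longleftrightarrow\; p < p^{*} = 1 - \frac{1}{R_0} \]

where p is the immune (seropositive) fraction of the population. If, for example, R0 were 2, an epidemic would become possible only once immunity fell below p* = 0.5, which is at least consistent with the observed post-outbreak adult seroprevalence of 40-50%.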
Abstract:
Cadmium has been widely used in various industries for the past fifty years, with current world production standing at around 16,755 tonnes per year. Very little cadmium is ever recycled, and the ultimate fate of all cadmium is the environment. In view of reports that cadmium in the environment is increasing, this thesis aims to identify population groups 'at risk' of receiving dietary intakes of cadmium up to or above the current Food and Agriculture Organisation/World Health Organisation maximum tolerable intake of 70 µg/day. The study involves the investigation of one hundred households (260 individuals) who grow a large proportion of their vegetable diet in garden soils in the Borough of Walsall, part of an urban/industrial area in the United Kingdom. Measurements were made of the cadmium levels in atmospheric deposition, soil, house dust, diet and urine from the participants. Atmospheric deposition of cadmium was found to be comparable with other urban/industrial areas in the European Community, with deposition rates as high as 209 g ha⁻¹ yr⁻¹. The garden soils of the study households were found to contain up to 33 mg kg⁻¹ total cadmium, eleven times the highest level usually found in agricultural soils. Dietary intakes of cadmium by the residents from food were calculated to be as high as 68 µg/day. It is suggested that, with intakes from other sources such as air, adventitious ingestion, smoking and occupational exposure, total intakes of cadmium may reach or exceed the FAO/WHO limit. Urinary excretion of cadmium amongst a non-smoking, non-occupationally exposed sub-group of the study population was found to be significantly higher than that of a similar urban population who did not rely on home-produced vegetables. The results from this research indicate that present levels of cadmium in urban/industrial areas can increase dietary intakes and body burdens of cadmium. As cadmium serves no useful biological function and has been found to be highly toxic, it is recommended that policy measures to reduce human exposure on the European scale be considered.
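The exposure argument is simple additive arithmetic across intake routes; a sketch follows, in which the non-dietary contributions are hypothetical placeholders, not values reported in the thesis:

# Illustrative only: aggregate daily cadmium intake (ug/day) across exposure
# routes and compare with the FAO/WHO maximum tolerable intake. The 68 ug/day
# dietary figure is the study's reported maximum; the other numbers are invented.

FAO_WHO_LIMIT_UG_PER_DAY = 70.0

def total_cadmium_intake(diet: float, air: float, adventitious: float,
                         smoking: float) -> float:
    """Sum daily cadmium intake (ug/day) over exposure routes."""
    return diet + air + adventitious + smoking

intake = total_cadmium_intake(diet=68.0, air=0.5, adventitious=2.0, smoking=3.0)
print(f"total = {intake:.1f} ug/day; exceeds limit: {intake > FAO_WHO_LIMIT_UG_PER_DAY}")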
Abstract:
The role of antiplatelet therapy as primary prophylaxis of thrombosis in low-risk essential thrombocythemia has not been studied in randomized clinical trials. We assessed the benefit/risk profile of low-dose aspirin in 433 low-risk essential thrombocythemia patients (CALR-mutated n=271, JAK2V617F-mutated n=162) who were on antiplatelet therapy or observation only. Over 2215 person-years of follow-up free from cytoreduction, 25 thrombotic and 17 bleeding episodes were recorded. In CALR-mutated patients, antiplatelet therapy did not affect the risk of thrombosis but was associated with a higher incidence of bleeding (12.9 vs. 1.8 per 1000 patient-years, p=0.03). In JAK2V617F-mutated patients, low-dose aspirin was associated with a reduced incidence of venous thrombosis with no effect on the risk of bleeding. Coexistence of the JAK2V617F mutation and cardiovascular risk factors increased the risk of thrombosis, even after adjusting for treatment with low-dose aspirin (incidence rate ratio: 9.8; 95% confidence interval: 2.3-42.3; p=0.02). Time free from cytoreduction, usually introduced to control extreme thrombocytosis, was significantly shorter in CALR-mutated than in JAK2V617F-mutated essential thrombocythemia (median 5 years and 9.8 years, respectively; p=0.0002). In conclusion, in patients with low-risk, CALR-mutated essential thrombocythemia, low-dose aspirin does not reduce the risk of thrombosis and may increase the risk of bleeding.
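The reported incidences are event counts over person-time; a sketch of the arithmetic, with hypothetical counts chosen only to approximate the published bleeding rates (the abstract reports rates, not raw counts):

# Incidence per 1000 patient-years and the implied rate ratio.

def incidence_per_1000(events: int, person_years: float) -> float:
    """Incidence rate expressed per 1000 person-years of follow-up."""
    return 1000.0 * events / person_years

rate_aspirin = incidence_per_1000(events=13, person_years=1010)      # ~12.9
rate_observation = incidence_per_1000(events=2, person_years=1100)   # ~1.8
print(f"rate ratio ≈ {rate_aspirin / rate_observation:.1f}")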
Abstract:
Early discharge protocols have been proposed for low-risk ST-segment elevation myocardial infarction (STEMI) patients despite the existence of few but significant cardiovascular events during mid-term follow-up. We aimed to identify a subgroup of patients among those considered low-risk in which prognosis would be particularly good. We analyzed 30-day outcomes and long-term follow-up among 1,111 STEMI patients treated with reperfusion therapy. Multivariate analysis identified seven variables as predictors of 30-day outcomes: femoral approach; age > 65; systolic dysfunction; postprocedural TIMI flow < 3; elevated creatinine level (> 1.5 mg/dL); stenosis of the left main coronary artery; and Killip class two or higher (FASTEST). A total of 228 patients (20.5%), defined as very low-risk (VLR), had none of these variables on admission. VLR patients had lower in-hospital (0% vs. 5.9%; p < 0.001) and 30-day mortality (0% vs. 6.25%; p < 0.001) than non-VLR patients. They also presented fewer in-hospital complications (6.6% vs. 39.7%; p < 0.001) and 30-day major adverse events (0.9% vs. 4.5%; p = 0.01). Significant mortality differences during a mean follow-up of 23.8 ± 19.4 months were also observed (2.2% vs. 15.2%; p < 0.001). The first VLR subject died 11 months after hospital discharge, and no cardiovascular deaths were identified in this subgroup during follow-up. About a fifth of STEMI patients are VLR and can be easily identified. They have an excellent prognosis, suggesting that a 24–48 h in-hospital stay could be a feasible alternative in these patients.
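As described, the FASTEST rule reduces to checking that none of the seven admission variables is present; a sketch follows, with field names that are our own shorthand rather than the paper's:

# Classify a STEMI patient as very low risk (VLR) when none of the seven
# FASTEST variables is present on admission.

FASTEST_CRITERIA = (
    "femoral_approach",
    "age_over_65",
    "systolic_dysfunction",
    "timi_flow_below_3",           # postprocedural TIMI flow < 3
    "creatinine_above_1_5_mg_dl",  # elevated serum creatinine > 1.5 mg/dL
    "left_main_stenosis",
    "killip_class_2_or_higher",
)

def is_very_low_risk(patient: dict) -> bool:
    """True when the patient has none of the FASTEST variables."""
    return not any(patient.get(c, False) for c in FASTEST_CRITERIA)

# Example: radial access, age 58, TIMI 3 flow, Killip class I.
print(is_very_low_risk({"femoral_approach": False, "age_over_65": False}))  # True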
Abstract:
Prediction of the low-risk concentration of diflubenzuron for aquatic organisms, and evaluation of clay and gravel for reducing its toxicity. Diflubenzuron is an insecticide that, besides its use in agriculture, has been widely employed in fish farming, even though its use is prohibited in that activity. The compound is not included in the Brazilian legislation that establishes maximum permissible limits in water bodies for the protection of aquatic communities. In the present work, toxicity data for diflubenzuron in non-target organisms were used to calculate the hazardous concentration for only 5% of species (HC5). This parameter was estimated at approximately 7 × 10⁻⁶ mg L⁻¹. The low value is due to the extremely high toxicity of diflubenzuron to daphnids and to the large variation in sensitivity among the species tested. Two relatively low-cost, inert materials proved efficient at removing diflubenzuron toxicity from solutions containing the compound; among them, expanded clay reduced the toxicity of a diflubenzuron solution by approximately 50%. The results may contribute to Brazilian public policies establishing maximum permissible limits for xenobiotics in the aquatic compartment, and to the search for low-cost, inert materials with potential to remove xenobiotics from aquaculture or agricultural effluents.
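HC5 estimation conventionally fits a species-sensitivity distribution (SSD) and takes its 5th percentile; a sketch under a log-normal SSD follows, with toxicity endpoints that are hypothetical rather than the thesis data:

# Fit a log-normal SSD to (hypothetical) single-species toxicity endpoints
# and report the 5th percentile as HC5.
import numpy as np
from scipy.stats import norm

toxicity_mg_per_l = np.array([1e-6, 5e-6, 2e-5, 1e-3, 0.5, 2.0, 10.0])

log10_vals = np.log10(toxicity_mg_per_l)
mu, sigma = log10_vals.mean(), log10_vals.std(ddof=1)

hc5 = 10 ** norm.ppf(0.05, loc=mu, scale=sigma)  # 5th percentile of the SSD
print(f"HC5 ≈ {hc5:.1e} mg/L")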
Abstract:
Trichinella surveillance in wildlife relies on muscle digestion of large samples, which are logistically difficult to store and transport in remote and tropical regions as well as labour-intensive to process. Serological methods such as enzyme-linked immunosorbent assays (ELISAs) offer rapid, cost-effective alternatives for surveillance but should be paired with additional tests because of the high false-positive rates encountered in wildlife. We investigated the utility of ELISAs coupled with Western blot (WB) in providing evidence of Trichinella exposure or infection in wild boar. Serum samples were collected from 673 wild boar from a high- and a low-risk region for Trichinella introduction within mainland Australia, which is considered Trichinella-free. Sera were examined using both an 'in-house' and a commercially available indirect ELISA, each using excretory-secretory (E/S) antigens. Cut-off values for positive results were determined using sera from the low-risk population. All 352 wild boar from the high-risk region and 139/321 (43.3%) of the wild boar from the low-risk region were tested by artificial digestion. Testing by Western blot using E/S antigens, and by a Trichinella-specific real-time PCR, was also carried out on all ELISA-positive samples. The two ELISAs correctly classified all positive controls as well as one naturally infected wild boar from Gabba Island in the Torres Strait. In both the high- and low-risk populations, the ELISA results showed substantial agreement (κ = 0.66) that increased to very good (κ = 0.82) when only WB-positive samples were compared. Testing of sera collected from the Australian mainland showed a Trichinella seroprevalence of 3.5% (95% C.I. 0.0-8.0) and 2.3% (95% C.I. 0.0-5.6) using the in-house and commercial ELISA coupled with WB, respectively. These estimates were significantly higher (P < 0.05) than the artificial digestion estimate of 0.0% (95% C.I. 0.0-1.1). Real-time PCR testing of muscle from seropositive animals did not detect Trichinella DNA in any mainland animals, but did reveal the presence of a second larvae-positive wild boar on Gabba Island, supporting the utility of real-time PCR as an alternative, highly sensitive method of muscle examination. The serology results suggest Australian wildlife may have been exposed to Trichinella parasites. However, because of the possibility of non-specific reactions with other parasitic infections, more work using well-defined cohorts of positive and negative samples is required. Even if the specificity of the ELISAs proves to be low, their ability to correctly classify the small number of true positive sera in this study indicates utility in screening wild boar populations for reactive sera, which can be followed up with additional testing.
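The agreement figures are Cohen's kappa; a sketch of the computation from a 2x2 cross-classification follows, with counts that are hypothetical rather than the paper's:

# Cohen's kappa for agreement between two binary tests (e.g. the two ELISAs).

def cohens_kappa(pp: int, pn: int, np_: int, nn: int) -> float:
    """Kappa from a 2x2 table: pp = both positive, pn = test A positive only,
    np_ = test B positive only, nn = both negative."""
    n = pp + pn + np_ + nn
    observed = (pp + nn) / n
    p_a = (pp + pn) / n
    p_b = (pp + np_) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

print(f"kappa = {cohens_kappa(20, 5, 4, 644):.2f}")  # ~0.81 with these counts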
Abstract:
Introduction: Amphotericin B is the drug of choice for the treatment of severe fungal diseases, but it is associated with a high incidence of nephrotoxicity, and the use of modified amphotericins entails high cost. In low-risk groups, saline volume loading may be sufficient to prevent severe loss of renal function. Methods: We prospectively studied patients admitted to a university hospital, older than 12 years, within the first 24 hours of amphotericin B use. Patients in intensive care units and those receiving vasoactive drugs were excluded. Saline 0.9% (500 ml) was infused before and after amphotericin B. Laboratory tests were collected at enrolment and at the end of treatment, and serum creatinine measurement was repeated 30 days after the end of treatment. Results: Forty-eight patients were studied. The mean rise in serum creatinine was 0.3 (0.18-0.41) mg/dl, representing a mean decrease of 25 (12.8-36.9) ml/min in endogenous creatinine clearance (ECC). Acute renal failure (ARF), defined as an increase of more than 50% over baseline creatinine, occurred in 15 patients (31.3%). Patients using antibiotics and those in post-chemotherapy status or who had undergone bone marrow transplantation were at the highest risk of developing ARF. Creatinine and ECC 30 days after the end of treatment did not differ from baseline values. Conclusion: In low-risk patients, amphotericin B with prophylactic administration of saline was associated with a small and reversible change in renal function. Given their high cost, more expensive approaches do not currently appear justified in these patients. Randomized clinical trials are needed in this population.
Abstract:
BACKGROUND: Recommendations for statin use for primary prevention of coronary heart disease (CHD) are based on estimation of the 10-year CHD risk. We compared the 10-year CHD risk assessments and eligibility percentages for statin therapy using three scoring algorithms currently used in Europe. METHODS: We studied 5683 women and men, aged 35-75, without overt cardiovascular disease (CVD), in a population-based study in Switzerland. We compared the 10-year CHD risk using three scoring schemes: the Framingham risk score (FRS) from the U.S. National Cholesterol Education Program's Adult Treatment Panel III (ATP III), the PROCAM scoring scheme from the International Atherosclerosis Society (IAS), and the European risk SCORE for low-risk countries, without and with extrapolation to age 60 as recommended by the European Society of Cardiology guidelines (ESC). With FRS and PROCAM, high risk was defined as a 10-year risk of fatal or non-fatal CHD > 20%; with SCORE, as a 10-year risk of fatal CVD ≥ 5%. We compared the proportions of high-risk participants and eligibility for statin use according to these three schemes. For each guideline, we estimated the impact of increasing statin use from current partial compliance to full compliance on potential CHD deaths averted over 10 years, using a success proportion of 27% for statins. RESULTS: The proportion of participants (both genders) classified as high-risk was 5.8% with FRS and 3.0% with PROCAM, whereas the European risk SCORE classified 12.5% as high-risk (15.4% with extrapolation to age 60). For the primary prevention of CHD, 18.5% of participants were eligible for statin therapy using ATP III, 16.6% using IAS, and 10.3% using ESC (13.0% with extrapolation), because ESC guidelines recommend statin therapy only in high-risk subjects. In comparison with IAS, agreement in identifying adults eligible for statins was good with ATP III but moderate with ESC. From a population perspective, full compliance with ATP III guidelines would avert up to 17.9% of the 24,310 CHD deaths expected over 10 years in Switzerland, 17.3% with IAS and 10.8% with ESC (11.5% with extrapolation). CONCLUSIONS: Full compliance with guidelines for statin therapy would result in substantial health benefits, but the proportions of high-risk adults and of adults eligible for statin use varied substantially depending on the scoring systems and corresponding guidelines used for estimating CHD risk in Europe.
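Converting the reported reduction percentages into absolute deaths averted is a single multiplication; a sketch (the percentages already embed the assumed 27% statin success proportion, so only that final step is shown):

# Potential CHD deaths averted over 10 years under full guideline compliance.

EXPECTED_CHD_DEATHS_10Y = 24_310  # expected CHD deaths in Switzerland

for guideline, reduction in (("ATP III", 0.179), ("IAS", 0.173), ("ESC", 0.108)):
    print(f"{guideline}: ~{reduction * EXPECTED_CHD_DEATHS_10Y:,.0f} deaths averted")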
Abstract:
Insurance provision against uncertainties is present in several dimensions of people's lives, such as provisions related to, inter alia, unemployment, disease, accidents, robbery and death. Microinsurance improves the ability of low-income individuals to cope with these risks. Brazil has a fairly developed financial system, but one still not geared towards the poor, especially as concerns the insurance industry. Evaluating the effects of microinsurance on well-being, and the demand for different types of microinsurance, requires an analysis of the dynamics of the individual income process and an assessment of the substitute and complementary institutions that condition individuals' financial behavior. The Brazilian government provides a social security system that is relatively developed compared with other countries of similar income level, which crowds out the demand for insurance and savings; on the other hand, this same public infrastructure may help to foster the supply of microfinance products. The objective of this paper is to analyze the demand for different types of private insurance by the low-income population using microdata from a National Expenditure Survey (POF/IBGE). The final objective is to help understand the trade-offs involved in the development of an emerging microinsurance industry in Brazil.
Abstract:
Italy is registering a rapid increase in its low-income population. Academics and policy makers consider income inequality a key determinant of low or inadequate healthy food consumption. The objective is therefore to understand how to overcome the agrofood chain barriers to healthy food production, commercialisation and consumption for the population at risk of poverty (ROP) in Italy. The study adopts a market-oriented food chain approach, focusing the research on ROP consumers, processing industries and retailers. The empirical investigation adopts a qualitative methodology with an explorative approach: consumers were investigated through 4 focus groups, and industry and retailer representatives through 27 face-to-face semi-structured interviews. The results capture the perceptions of each actor, integrated into an overall chain approach. The analysis shows that all agrofood actors lack adequate knowledge of what constitutes healthy food. Food industries and retailers also show poor awareness of the ROP consumer segment, and they perceive the high costs of producing healthy food as conflicting with the low economic returns expected from that segment. These aspects result in scant interest in investing in commercialisation strategies for healthy food aimed at ROP consumers. ROP consumers themselves face further notable barriers to adopting healthy diets, caused among other things by strongly negative personal attitudes and lack of motivation; these personal barriers are in turn aggravated by several external socio-economic factors. Solutions should rely on improving internal relations along the agrofood chain to identify successful strategies for increasing interest in low-cost healthy food, in particular through closer collaboration on innovation adoption and on marketing strategies that account for ROP consumers' preferences and needs. External political intervention is instead necessary to fill the knowledge and regulatory gaps on healthy food issues.
Abstract:
Background: There is a sound rationale for the population-based approach to falls injury prevention, but there is currently insufficient evidence to advise governments and communities on how they can use population-based strategies to achieve desired reductions in the burden of falls-related injury. Aim: To quantify the effectiveness of a streamlined (and thus potentially sustainable and cost-effective), population-based, multi-factorial falls injury prevention program for people over 60 years of age. Methods: Population-based falls-prevention interventions were conducted at two geographically defined and separate Australian sites: Wide Bay, Queensland, and Northern Rivers, NSW. Changes in the prevalence of key risk factors and in rates of injury outcomes within each community were compared before and after program implementation, and changes in rates of injury outcomes in each community were also compared with the rates in their respective States. Results: The interventions did not substantially decrease the rate of falls-related injury among people aged 60 years or older in either community, although there was some evidence of reductions in the occurrence of multiple falls reported by women. In addition, there was some indication of improvements in fall-related risk factors, but the magnitudes were generally modest. Conclusion: The evidence suggests that low-intensity population-based falls prevention programs may not be as effective as those that are intensively implemented.
Abstract:
Introduction: Smoking status in outpatients with chronic obstructive pulmonary disease (COPD) has been associated with a low body mass index (BMI) and reduced mid-arm muscle circumference (Cochrane & Afolabi, 2004). Individuals with COPD identified as malnourished have also been found to be twice as likely to die within 1 year compared to non-malnourished patients (Collins et al., 2010). Although malnutrition is both preventable and treatable, it is not clear what influence current smoking status, another modifiable risk factor, has on malnutrition risk. The current study aimed to establish the influence of smoking status on malnutrition risk and 1-year mortality in outpatients with COPD. Methods: A prospective nutritional screening survey was carried out between July 2008 and May 2009 at a large teaching hospital (Southampton General Hospital) and a smaller community hospital within Hampshire (Lymington New Forest Hospital). In total, 424 outpatients with a diagnosis of COPD were routinely screened using the 'Malnutrition Universal Screening Tool', 'MUST' (Elia, 2003): 222 males, 202 females; mean (SD) age 73 (9.9) years; mean (SD) BMI 25.9 (6.4) kg m⁻². Smoking status on the date of screening was obtained for 401 of the outpatients. Severity of COPD was assessed using the GOLD criteria, and social deprivation was determined using the Index of Multiple Deprivation (Noble et al., 2008). Results: The overall prevalence of malnutrition (medium + high risk) was 22%, with 32% of current smokers at risk (smokers accounted for 19% of the total COPD population). In comparison, 19% of non-smokers and ex-smokers were likely to be malnourished [odds ratio, 1.965; 95% confidence interval (CI), 1.133–3.394; P = 0.015]. Smoking status remained an independent risk factor for malnutrition even after adjustment for age, social deprivation and disease severity (odds ratio, 2.048; 95% CI, 1.085–3.866; P = 0.027) using binary logistic regression. After adjusting for age, disease severity, social deprivation and smoking status, malnutrition remained a significant predictor of 1-year mortality [odds ratio (medium + high risk versus low risk), 2.161; 95% CI, 1.021–4.573; P = 0.044], whereas smoking status did not (odds ratio for smokers versus ex-smokers + non-smokers, 1.968; 95% CI, 0.788–4.913; P = 0.147). Discussion: This study highlights the potential importance of combined nutritional support and smoking cessation in order to treat malnutrition. The close association between smoking status and malnutrition risk in COPD suggests that smoking is an important consideration in the nutritional management of malnourished COPD outpatients. Conclusions: Smoking status in COPD outpatients is a significant independent risk factor for malnutrition and a weaker (non-significant) predictor of 1-year mortality. Malnutrition significantly predicted 1-year mortality. References: Cochrane, W.J. & Afolabi, O.A. (2004) Investigation into the nutritional status, dietary intake and smoking habits of patients with chronic obstructive pulmonary disease. J. Hum. Nutr. Diet. 17, 3–11. Collins, P.F., Stratton, R.J., Kurukulaaratchy, R., Warwick, H., Cawood, A.L. & Elia, M. (2010) 'MUST' predicts 1-year survival in outpatients with chronic obstructive pulmonary disease. Clin. Nutr. 5, 17. Elia, M. (Ed) (2003) The 'MUST' Report. BAPEN. http://www.bapen.org.uk (accessed on 30 March 2011). Noble, M., McLennan, D., Wilkinson, K., Whitworth, A. & Barnes, H. (2008) The English Indices of Deprivation 2007. http://www.communities.gov.uk (accessed on 30 March 2011).
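The unadjusted smoking association above is an odds ratio from a 2x2 table; a sketch with a Wald confidence interval follows, using hypothetical counts reconstructed to be consistent with the reported 32% vs. 19% prevalences rather than the survey's actual figures:

# Odds ratio with 95% CI computed on the log-odds scale.
import math

def odds_ratio_ci(exp_cases, exp_noncases, unexp_cases, unexp_noncases, z=1.96):
    """OR and Wald 95% CI from a 2x2 table of exposure by outcome."""
    or_ = (exp_cases * unexp_noncases) / (exp_noncases * unexp_cases)
    se = math.sqrt(1/exp_cases + 1/exp_noncases + 1/unexp_cases + 1/unexp_noncases)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: 24/76 smokers vs. 62/325 non/ex-smokers at malnutrition risk.
or_, lo, hi = odds_ratio_ci(24, 52, 62, 263)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")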
Abstract:
MC1R gene variants have previously been associated with red hair and fair skin color; moreover, skin ultraviolet sensitivity and a strong association with melanoma have been demonstrated for three variant alleles that are active in influencing pigmentation: Arg151Cys, Arg160Trp, and Asp294His. This study confirmed these pigmentary associations with MC1R genotype in a collection of 220 individuals drawn from the Nambour community in Queensland, Australia, 111 of whom were at high risk and 109 at low risk of basal cell carcinoma and squamous cell carcinoma. Comparative allele frequencies for nine MC1R variants reported in the Caucasian population were determined for these two groups, and an association between the prevalence of basal cell carcinoma, squamous cell carcinoma and solar keratosis and the same three active MC1R variant alleles was demonstrated [odds ratio = 3.15; 95% CI (1.7, 5.82)]. Three other commonly occurring variant alleles, Val60Leu, Val92Met, and Arg163Gln, were identified as having a minimal impact on pigmentation phenotype as well as on basal cell carcinoma and squamous cell carcinoma risk. A significant heterozygote effect was demonstrated, whereby individuals carrying a single MC1R variant allele were more likely to have fair and sun-sensitive skin, as well as carriage of a solar lesion, compared with individuals with a consensus MC1R genotype. After adjusting for the effects of pigmentation on the association between MC1R variant alleles and basal cell carcinoma and squamous cell carcinoma risk, the association persisted, confirming that presence of at least one variant allele remains informative in predicting the risk of developing a solar-induced skin lesion beyond the information gained through observation of pigmentation phenotype.
Abstract:
Prevention of cardiovascular diseases is known to postpone death, but in an aging society it is important to ensure that those who live longer are neither disabled nor suffering an inferior quality of life. It is essential, both from the point of view of the aging individual and from that of society, that any individual should enjoy a good physical, mental and social quality of life during these additional years. The studies presented in this thesis investigated the impact of modifiable risk factors, all of which affect cardiovascular health in the long term, on mortality and health-related quality of life (HRQoL). The data are based on the all-male cohort of the Helsinki Businessmen Study. This cohort, originally of 3,490 men born between 1919 and 1934, has been followed since the 1960s. The socioeconomic status of the participants is similar, since all the men were working in leading positions. Extensive baseline examinations were conducted among 2,375 of the men in 1974, when their mean age was 48; at this time the health, medication and cardiovascular risk factors of the participants were recorded. In 2000, at the mean age of 73, the HRQoL of the survivors of the original cohort was examined using the RAND-36 mailed questionnaire (n=1,864). RAND-36, along with the equivalent SF-36, is the world's most widely used means of assessing generic health. The response rate was generally over 90%. Mortality was retrieved from national registers in 2000 and 2002. For the six substudies of this thesis, the impact of four different modifiable cardiovascular risk factors (weight gain, cholesterol, alcohol and smoking) on HRQoL in old age was studied both independently and in combination. The follow-up time for these studies varies from 26 up to 39 years. Mortality is reported separately or included in the RAND-36 scores for HRQoL. Elevated levels of all the risk factors examined among the participants in midlife led to a diminished life expectancy. Among survivors, lower weight gain in midlife was associated with better HRQoL, both physically and mentally. Higher levels of serum cholesterol in middle age indicated both earlier mortality and a decline in the physical component of HRQoL in a dose-response manner during the 39-year follow-up. Mortality was significantly higher in the highest baseline category of reported mean alcohol consumption (≥ 5 drinks/day), but fairly comparable between abstainers and moderate drinkers during the 29-year follow-up. When HRQoL in old age was adjusted for mortality, the men with the highest alcohol consumption in midlife clearly had poorer physical and mental health in old age, whereas the HRQoL of abstainers and those who drank alcohol in moderation was comparatively similar. The amount of cigarette smoking in midlife was shown to have had a dose-response effect on both mortality and HRQoL in old age during the 26-year follow-up. The men smoking over 20 cigarettes daily in middle age lost about 10 years of their life expectancy, and the physical functioning of surviving heavy smokers in old age was similar to that of men 10 years older in the general population. The impact of clustered cardiovascular risk factors was examined by comparing two subcohorts of men who were healthy in 1974 but had different baseline risk factor status. The men with low risk had a 50% lower mortality during the 29-year follow-up. Their RAND-36 scores for the physical quality of life in old age were significantly better, and the 2002 questionnaire examining psychological well-being also indicated significantly better mental health in the low-risk group. The results indicate that different risk factor levels in midlife have a meaningful impact on life expectancy and on the quality of these extra years. Leading a healthy lifestyle improves both survival and the quality of life.