954 results for Elderly nutrition
Abstract:
Undernutrition is a widespread problem in the intensive care unit and is associated with worse clinical outcomes. Enteral nutrition is the recommended nutritional support in ICU patients. However, enteral nutrition is frequently insufficient to cover protein-energy needs. Initiating supplemental parenteral nutrition when enteral nutrition is insufficient could optimize nutritional therapy. Such a combination could reduce morbidity, shorten length of stay and recovery time, improve quality of life, and lower health care costs. Prospective studies are currently underway to test this hypothesis.
Abstract:
Few data on normal urinary oxalate (Ox) and calcium (Ca) excretion related to both gestational age and nutritional factors have been reported in preterm or term infants. We therefore determined the molar Ox and Ca to creatinine (Cr) ratios in spot urines from 64 preterm and 37 term infants aged 1-60 days, fed either formula or human milk (HM). Only vitamin D was supplemented; renal and metabolic diseases were excluded. The urinary Ox/Cr ratio was higher in preterm than in term infants, both when formula fed (1st month 253 vs. 180 mmol/mol and 2nd month 306 vs. 212 mmol/mol; P<0.05) and when HM fed (206 vs. 169 mmol/mol and 283* vs. 232 mmol/mol; *P<0.05). Ox/Cr was also higher in formula-fed than in HM-fed preterm infants. The ratio increased during the first 2 months of life irrespective of nutrition. The urinary Ca/Cr ratio was comparable in all groups during the 1st month of life, except for a lower (P < 0.05) value in term infants fed HM (0.10 mol/mol). It increased in all groups during the 2nd month of life, being highest in HM-fed preterm infants (1.86 mol/mol). In conclusion, urinary Ox and Ca excretion is influenced by both gestational age and nutrient intake in preterm and term infants.
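The ratios above are molar oxalate-to-creatinine ratios expressed in mmol/mol. As a small aside, the unit conversion is direct when spot-urine oxalate is measured in µmol/L and creatinine in mmol/L; the concentrations in the sketch below are assumed values, not measurements from the study.

```python
def ox_cr_molar_ratio(oxalate_umol_per_l, creatinine_mmol_per_l):
    """Molar oxalate-to-creatinine ratio in mmol/mol: µmol/L divided by mmol/L
    already yields mmol of oxalate per mol of creatinine."""
    return oxalate_umol_per_l / creatinine_mmol_per_l

# Assumed spot-urine values: 380 µmol/L oxalate and 1.5 mmol/L creatinine
# give ~253 mmol/mol, the order of magnitude reported in the abstract above.
print(ox_cr_molar_ratio(380, 1.5))
```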
Abstract:
More knowledge of the reasons for refusal of the influenza vaccine in elderly patients is essential to target groups for additional information and hence improve the coverage rate. The objective of the present study was to describe precisely the true motives for refusal. All patients aged over 64 who attended the Medical Outpatient Clinic, University of Lausanne, or their private practitioner's office during the 1999 and 2000 vaccination periods were included. Each patient was informed about influenza and its complications, as well as about the need for vaccination, its efficacy and its adverse events. Vaccination was then proposed. In case of refusal, the reasons were investigated with an open question. Out of 1398 patients, 148 (12%) refused the vaccination. The main reasons for refusal were the perception of being in good health (16%), of not being susceptible to influenza (15%), of not having had the influenza vaccine in the past (15%), of having had a bad experience, either personally or in a relative (15%), and the perceived uselessness of the vaccine (10%). Seventeen percent gave miscellaneous reasons and 12% gave no reason at all for refusal. Limited epidemiological knowledge and resistance to change appear to be the major obstacles to wide acceptance of the vaccine by the elderly.
Abstract:
The benefit of postoperative radiotherapy (RT) has been demonstrated in elderly patients aged 65 years or older with glioblastoma multiforme. Hypofractionated RT schedules can reduce the time and morbidity of treatment while maintaining survival outcomes comparable to lengthy conventional RT. Current international randomized clinical trials are studying optimized hypofractionated RT regimens, hypofractionated RT compared with temozolomide chemotherapy, and hypofractionated RT compared with the same RT plus temozolomide. Given the guarded prognosis of elderly and frail patients, quality of life and side effects of treatment should be closely examined. As more than half of the world's cancers occur in developing countries, hypofractionated RT could be better utilized as a cost-effective treatment for this group of patients.
Abstract:
BACKGROUND: Multiple risk prediction models have been validated in all-age patients presenting with acute coronary syndrome (ACS) and treated with percutaneous coronary intervention (PCI); however, they have not been validated specifically in the elderly. METHODS: We calculated the GRACE (Global Registry of Acute Coronary Events) score, the logistic EuroSCORE, the AMIS (Acute Myocardial Infarction Swiss registry) score, and the SYNTAX (Synergy between Percutaneous Coronary Intervention with TAXUS and Cardiac Surgery) score in a consecutive series of 114 patients ≥75 years presenting with ACS and treated with PCI within 24 hours of hospital admission. Patients were stratified according to score tertiles and analysed retrospectively by comparing the lower/mid tertiles as an aggregate group with the higher tertile group. The primary endpoint was 30-day mortality. Secondary endpoints were the composite of death and major adverse cardiovascular events (MACE) at 30 days, and 1-year MACE-free survival. Model discrimination ability was assessed using the area under the receiver operating characteristic curve (AUC). RESULTS: Thirty-day mortality was higher in the upper tertile compared with the aggregate lower/mid tertiles according to the logistic EuroSCORE (42% vs 5%; odds ratio [OR] = 14, 95% confidence interval [CI] = 4-48; p <0.001; AUC = 0.79), the GRACE score (40% vs 4%; OR = 17, 95% CI = 4-64; p <0.001; AUC = 0.80), the AMIS score (40% vs 4%; OR = 16, 95% CI = 4-63; p <0.001; AUC = 0.80), and the SYNTAX score (37% vs 5%; OR = 11, 95% CI = 3-37; p <0.001; AUC = 0.77). CONCLUSIONS: In elderly patients presenting with ACS and referred for PCI within 24 hours of admission, the GRACE score, the EuroSCORE, the AMIS score, and the SYNTAX score all predicted 30-day mortality. The predictive value of the clinical scores was improved by using them in combination.
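The abstract above compares 30-day mortality in the upper score tertile with the aggregated lower/mid tertiles and reports odds ratios with 95% confidence intervals. Below is a minimal sketch of that 2x2 comparison using a Wald interval on the log odds ratio; the patient counts are illustrative assumptions, not figures taken from the study.

```python
import math

def odds_ratio_ci(events_hi, no_events_hi, events_lo, no_events_lo, z=1.96):
    """Odds ratio for the upper tertile vs the aggregated lower/mid tertiles,
    with a Wald 95% confidence interval on the log odds ratio."""
    or_ = (events_hi * no_events_lo) / (no_events_hi * events_lo)
    se = math.sqrt(1/events_hi + 1/no_events_hi + 1/events_lo + 1/no_events_lo)
    return or_, (math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se))

# Illustrative counts only (114 patients split roughly 38 / 76, mortality ~40% vs ~5%):
print(odds_ratio_ci(events_hi=15, no_events_hi=23, events_lo=4, no_events_lo=72))
```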
Abstract:
Background: Drug dosing errors are common in renally impaired patients. Appropriate dosing adjustment and drug selection are important to ensure patients' safety and to avoid adverse drug effects and poor outcomes. There are few studies on this issue in community pharmacies. The aims of this study were, firstly, to determine the prevalence of dosing inadequacy as a consequence of renal impairment in patients over 65 taking 3 or more drug products who were attended in community pharmacies and, secondly, to evaluate the effectiveness of the community pharmacist's intervention in improving dosing inadequacy in these patients compared with usual care. Methods: The study was carried out in 40 Spanish community pharmacies and had two phases: the first, with an observational, multicentre, cross-sectional design, served to determine dosing inadequacy and the drug-related problems per patient, and to obtain the control group; the second, a controlled study with a historical control group, was the intervention phase. When dosing adjustments were needed, the pharmacists made recommendations to the physicians. The control and intervention groups were compared regarding the prevalence of drug dosing inadequacy and the mean number of drug-related problems per patient. Results: The mean prevalence of drug dosing inadequacy was 17.5% [95% CI 14.6-21.5] in phase 1 and 15.5% [95% CI 14.5-16.6] in phase 2. The mean number of drug-related problems per patient was 0.7 [95% CI 0.5-0.8] in phase 1 and 0.50 [95% CI 0.4-0.6] in phase 2. The difference in the prevalence of dosing inadequacy between the control and intervention groups was 0.73% [95% CI −6.0 to 7.5] before the pharmacists' intervention and 13.5% [95% CI 8.0 to 19.5] (p < 0.001) after it, while the difference in the mean number of drug-related problems per patient was 0.05 [95% CI −0.2 to 0.3] before the intervention and 0.5 [95% CI 0.3 to 0.7] (p < 0.001) following it. Conclusion: A drug dosing adjustment service for elderly patients with renal impairment in community pharmacies can increase the proportion of adequate drug dosing and reduce the number of drug-related problems per patient. Collaborative practice with physicians can improve these results.
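The prevalence figures with 95% confidence intervals quoted above (e.g. 17.5% [95% CI 14.6-21.5]) are binomial proportion estimates. The sketch below reproduces the form of that calculation with a normal-approximation (Wald) interval; the case and sample counts are hypothetical, since the abstract does not report the raw numbers.

```python
import math

def prevalence_ci(cases, n, z=1.96):
    """Point prevalence with a normal-approximation (Wald) 95% confidence interval."""
    p = cases / n
    se = math.sqrt(p * (1 - p) / n)
    return p, (max(0.0, p - z * se), min(1.0, p + z * se))

# Hypothetical counts: 70 of 400 patients with at least one inadequately dosed drug.
p, (lo, hi) = prevalence_ci(cases=70, n=400)
print(f"prevalence {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```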
Abstract:
Previous studies have shown that arbuscular mycorrhizal fungi (AMF) can influence plant diversity and ecosystem productivity. However, little is known about the effects of AMF and different AMF taxa on other important community properties such as nutrient acquisition, plant survival and soil structure. We established experimental grassland microcosms and tested the impact of AMF and of different AMF taxa on a number of grassland characteristics. We also tested whether plant species benefited from the same or different AMF taxa in subsequent growing seasons. AMF enhanced phosphorus acquisition, soil aggregation and survival of several plant species, but AMF did not increase total plant productivity. Moreover, AMF increased nitrogen acquisition by some plant species, but AMF had no effect on total N uptake by the plant community. Plant growth responses to AMF were temporally variable and some plant species obtained the highest biomass with different AMF in different years. Hence the results indicate that it may be beneficial for a plant to be colonized by different AMF taxa in different seasons. This study shows that AMF play a key role in grassland by improving plant nutrition and soil structure, and by regulating the make-up of the plant community.
Abstract:
Introduction: Delirium is frequent in hospitalized older people, with incidence rates of up to 40% in acute care. Delirium is associated with several adverse consequences, including increased mortality and institutionalization. This study aims to investigate the prevalence, incidence, and consequences of delirium in patients hospitalized in an acute care unit for the elderly (ACE unit). Methods: Over a 3-month period, every patient (N = 93, mean age 84.1 ± 7.8 years, 66/93 (71%) women) admitted to a 28-bed ACE unit was systematically assessed for delirium. Trained nurses used the Confusion Assessment Method (CAM) instrument to determine the presence of delirium at admission and on each subsequent day of the patient's stay. The delirium prevalence rate was defined as the proportion of patients with a positive CAM within 24 hours of admission to the ACE unit. The delirium incidence rate was defined as the proportion of patients with a negative CAM at admission whose CAM became positive at least once during their stay. This evaluation was part of a functional assessment, including Basic Activities of Daily Life (Katz BADL, from 0 to 6, a higher score indicating better function). Delirium prevention interventions and specific treatment were provided if needed. Results: Overall, 25/93 (27%) patients had delirium during their stay. The prevalence of delirium at admission was 10/93 (11%), with an incidence of 15/83 (18%). Compared with non-delirious patients, those with delirium were more frequently men (10/25 (40%) vs 17/68 (25%), p <.001) and had reduced functional status at admission (BADL 2.0 ± 1.9 vs 3.6 ± 2.1, p = .004). They tended to be older (86.0 ± 6.7 vs 83.3 ± 8.1 years, p = .110). At discharge, delirium was associated with reduced functional status (BADL 2.0 ± 2.1 vs 4.3 ± 1.9, p <.001), a lower rate of home discharge (6/20 (30%) vs 28/65 (43%), p = .009) and increased mortality (5/25 (20%) vs 3/68 (5%), p <.001). On average, patients with delirium stayed 5.7 days longer (17.0 ± 9.8 vs 11.31 ± 6.3, p = .011). Conclusion: Delirium occurred in almost a third of these older patients, even though its incidence was relatively low in this frail population. Despite specific management, delirium remained associated with a higher risk of adverse outcomes at discharge. These results suggest that early preventive interventions, implemented as soon as possible after hospital admission, might be needed in similar populations to achieve better outcomes. The effectiveness of such interventions will be evaluated in future studies.
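The abstract above defines prevalence as a positive CAM within 24 hours of admission and incidence as a first positive CAM later in the stay. A minimal sketch of those two definitions applied to daily CAM results follows; the patient data are invented purely to illustrate the calculation.

```python
def delirium_rates(daily_cam):
    """Prevalence: positive CAM at the admission assessment.
    Incidence: negative CAM at admission, positive on a later day.
    `daily_cam` maps patient id -> list of daily CAM results (True = positive)."""
    prevalent = {p for p, cams in daily_cam.items() if cams and cams[0]}
    at_risk = {p for p, cams in daily_cam.items() if cams and not cams[0]}
    incident = {p for p in at_risk if any(daily_cam[p][1:])}
    prevalence = len(prevalent) / len(daily_cam)
    incidence = len(incident) / len(at_risk) if at_risk else 0.0
    return prevalence, incidence

# Invented example: one prevalent case (pt1) and one incident case (pt2).
print(delirium_rates({
    "pt1": [True, True, False],
    "pt2": [False, False, True],
    "pt3": [False, False, False],
}))
```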
Abstract:
There is much evidence for a causal relationship between salt intake and blood pressure (BP). The current salt intake in many countries is between 9 and 12 g/day. A reduction in salt intake to the recommended level of 5-6 g/day lowers BP in both hypertensive and normotensive individuals. A further reduction to 3-4 g/day has a much greater effect. Prospective studies and outcome trials have demonstrated that a lower salt intake is associated with a decreased risk of cardiovascular disease. Increasing evidence also suggests that a high salt intake is directly related to left ventricular hypertrophy (LVH) independent of BP. Both raised BP and LVH are important risk factors for heart failure. It is therefore possible that a lower salt intake could prevent the development of heart failure. In patients who already have heart failure, a high salt intake aggravates the retention of salt and water, thereby exacerbating heart failure symptoms and progression of the disease. A lower salt intake plays an important role in the management of heart failure. Despite this, currently there is no clear evidence on how far salt intake should be reduced in heart failure. Our personal view is that these patients should reduce their salt intake to <5 g/day, i.e. the maximum intake recommended by the World Health Organisation for all adults. If salt intake is successfully reduced, there may well be a need for a reduction in diuretic dosage.