603 results for weight exercise


Relevance: 20.00%

Abstract:

Neutrophils serve as an intriguing model for the study of innate immune cellular activity induced by physiological stress. We measured changes in the transcriptome of circulating neutrophils following an experimental exercise trial (EXTRI) consisting of 1 h of intense cycling immediately followed by 1 h of intense running. Blood samples were taken at baseline, 3 h, 48 h, and 96 h post-EXTRI from eight healthy, endurance-trained, male subjects. RNA was extracted from isolated neutrophils. Differential gene expression was evaluated using Illumina microarrays and validated with quantitative PCR. Gene set enrichment analysis identified enriched molecular signatures chosen from the Molecular Signatures Database. Blood concentrations of muscle damage indices, neutrophils, interleukin (IL)-6 and IL-10 were increased (P < 0.05) 3 h post-EXTRI. Upregulated groups of functionally related genes 3 h post-EXTRI included gene sets associated with the recognition of tissue damage, the IL-1 receptor, and Toll-like receptor (TLR) pathways (familywise error rate, P value < 0.05). The core enrichment for these pathways included TLRs, low-affinity immunoglobulin receptors, S100 calcium binding protein A12, and negative regulators of innate immunity, e.g., IL-1 receptor antagonist and IL-1 receptor-associated kinase-3. Plasma myoglobin changes correlated with neutrophil TLR4 gene expression (r = 0.74; P < 0.05). Neutrophils had returned to their nonactivated state 48 h post-EXTRI, indicating that their initial proinflammatory response was transient and rapidly counter-regulated. This study provides novel insight into the signaling mechanisms underlying the neutrophil responses to endurance exercise, suggesting that their transcriptional activity was particularly induced by damage-associated molecular patterns, hypothetically originating from the leakage of muscle components into the circulation.

Relevance: 20.00%

Abstract:

Objective: To use a randomised controlled trial (RCT) to evaluate outcomes of a universal intervention to promote protective feeding practices, which commenced in infancy and aimed to prevent childhood obesity. Subjects and Methods: The NOURISH RCT enrolled 698 first-time mothers (mean age 30.1 years, SD = 5.3) with healthy term infants (51% female) aged 4.3 (SD = 1.0) months at baseline. Mothers were randomly allocated to self-directed access to usual care or to attend two 6-session interactive group education modules that provided anticipatory guidance on early feeding practices. Outcomes were assessed six months after completion of the second information module, 20 months from baseline, when the children were two years old. Maternal feeding practices were self-reported using validated questionnaires and study-developed items. Study-measured child height and weight were used to calculate BMI Z-score. Results: Retention at follow-up was 78%. Mothers in the intervention group reported using responsive feeding more frequently on 6/9 subscales and 8/8 items (Ps ≤ .03) and overall less ‘controlling feeding practices’ (P < .001). They also more frequently used feeding practices (3/4 items; Ps < .01) likely to enhance food acceptance. No statistically significant differences were noted in anthropometric outcomes (BMI Z-score: P = .11), nor in the prevalence of overweight/obesity (control 17.9% vs. intervention 13.8%, P = .23). Conclusions: Evaluation of NOURISH at child age two years found that anticipatory guidance on complementary feeding, tailored to developmental stage, increased use by first-time mothers of ‘protective’ feeding practices that potentially support the development of healthy eating and growth patterns in young children.

Relevance: 20.00%

Abstract:

STUDY DESIGN: Reliability and case-control injury study. OBJECTIVES: 1) To determine if a novel device, designed to measure eccentric knee flexor strength via the Nordic hamstring exercise (NHE), displays acceptable test-retest reliability; 2) to determine normative values for eccentric knee flexor strength derived from the device in individuals without a history of hamstring strain injury (HSI); and 3) to determine if the device could detect weakness in elite athletes with a previous history of unilateral HSI. BACKGROUND: HSIs and reinjuries are the most common cause of lost playing time in a number of sports. Eccentric knee flexor weakness is a major modifiable risk factor for future HSIs; however, there is a lack of easily accessible equipment to assess this strength quality. METHODS: Thirty recreationally active males without a history of HSI completed NHEs on the device on 2 separate occasions. Intraclass correlation coefficients (ICCs), typical error (TE), typical error as a coefficient of variation (%TE), and minimum detectable change at a 95% confidence interval (MDC95) were calculated. Normative strength data were determined using the most reliable measurement. An additional 20 elite athletes with a unilateral history of HSI within the previous 12 months performed NHEs on the device to determine if residual eccentric muscle weakness existed in the previously injured limb. RESULTS: The device displayed moderate to high reliability (ICC = 0.83 to 0.90; TE = 21.7 N to 27.5 N; %TE = 5.8 to 8.5; MDC95 = 76.2 to 60.1 N). Mean ± SD normative eccentric knee flexor strength, based on the uninjured group, was 344.7 ± 61.1 N for the left and 361.2 ± 65.1 N for the right side. The previously injured limbs were 15% weaker than the contralateral uninjured limbs (mean difference = 50.3 N; 95% CI = 25.7 to 74.9 N; P < .01), 15% weaker than the normative left-limb data (mean difference = 50.0 N; 95% CI = 1.4 to 98.5 N; P = .04) and 18% weaker than the normative right-limb data (mean difference = 66.5 N; 95% CI = 18.0 to 115.1 N; P < .01). CONCLUSIONS: The experimental device offers a reliable method to determine eccentric knee flexor strength and strength asymmetry and revealed residual weakness in previously injured elite athletes.
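
As a minimal sketch of how reliability indices of the kind reported here are commonly defined (typical error as the SD of between-session differences divided by √2, %TE relative to the grand mean, and MDC95 = 1.96 × √2 × TE), the snippet below computes them from hypothetical paired test-retest data; it is not the authors' analysis code and the strength values are made up.

```python
import numpy as np

def reliability_indices(session1, session2):
    # Typical error, %TE and MDC95 under the usual test-retest definitions.
    s1, s2 = np.asarray(session1, float), np.asarray(session2, float)
    diff = s2 - s1
    te = diff.std(ddof=1) / np.sqrt(2)                    # typical error (N)
    pct_te = 100 * te / np.concatenate([s1, s2]).mean()   # %TE
    mdc95 = 1.96 * np.sqrt(2) * te                        # minimum detectable change (N)
    return te, pct_te, mdc95

# Hypothetical eccentric strength scores (N) from two testing sessions
day1 = [330, 365, 410, 355, 298, 372]
day2 = [341, 358, 402, 367, 305, 380]
print(reliability_indices(day1, day2))   # prints TE, %TE and MDC95 for the toy data
```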

Relevance: 20.00%

Abstract:

Human immunodeficiency virus (HIV), which leads to acquired immune deficiency syndrome (AIDS), reduces immune function, resulting in opportunistic infections and, ultimately, death. Use of antiretroviral therapy (ART) increases the chances of survival; however, there are concerns regarding fat redistribution (lipodystrophy), which may encompass subcutaneous fat loss (lipoatrophy) and/or fat accumulation (lipohypertrophy) in the same individual. This problem has been linked to antiretroviral drugs (ARVs), mainly those in the class of protease inhibitors (PIs), in addition to older age and being female. An additional concern is that lipodystrophy can coexist with the metabolic syndrome, yet nutritional status/body composition, lipodystrophy and the metabolic syndrome remain poorly characterised in Uganda, where the use of ARVs is on the increase. In line with the literature, the overall aim of the study was to assess physical characteristics of HIV-infected patients using a comprehensive anthropometric protocol and to predict body composition based on these measurements and other standardised techniques. The other aim was to establish the existence of lipodystrophy, the metabolic syndrome, and associated risk factors. Thus, three studies were conducted on 211 HIV-infected women aged 15-49 years (88 ART-naïve), using a cross-sectional approach, together with a qualitative study of secondary information on patient HIV and medication status. In addition, face-to-face interviews were used to extract information concerning morphological experiences and lifestyle. The study revealed that participants were on average 34.1±7.65 years old, had lived 4.63±4.78 years with HIV infection and had spent 2.8±1.9 years receiving ARVs. Only 8.1% of participants were receiving PIs, and 26% of those receiving ART had ever changed drug regimen, 15.5% of whom changed drugs due to lipodystrophy. Study 1 hypothesised that the mean nutritional status and predicted percent body fat of study participants were within acceptable ranges; that these values differed between participants receiving ARVs and HIV-infected ART-naïve participants; and that percent body fat estimated by anthropometric measures (BMI and skinfold thickness) and by the BIA technique did not differ from that predicted by the deuterium oxide dilution technique. Using the Body Mass Index (BMI), 7.1% of patients were underweight (<18.5 kg/m2) and 46.4% were overweight/obese (≥25.0 kg/m2). Based on waist circumference (WC), approximately 40% of the cohort was characterised as centrally obese. Moreover, the deuterium dilution technique showed no between-group difference in total body water (TBW), fat mass (FM) or fat-free mass (FFM). However, it was the only technique to detect a between-group difference in percent body fat (p = .045), albeit with a very small effect size (0.021). Older age (β = 0.430, se = 0.089, p = .000), time spent receiving ARVs (β = 0.972, se = 0.089, p = .006), time with the infection (β = 0.551, se = 0.089, p = .000) and receiving ARVs (β = 2.940, se = 1.441, p = .043) were independently associated with percent body fat. Older age was the greatest single predictor of body fat. Furthermore, BMI gave better information than weight alone, in that mean percent body fat per unit BMI (N = 192) was significantly higher in patients receiving treatment (1.11±0.31) than in the exposed group (0.99±0.38, p = .025).
For the assessment of obesity, percent fat measures did not greatly alter the accuracy of BMI as a measure for classifying individuals into the broad categories of underweight, normal and overweight. Briefly, Study 1 revealed that there were more overweight/obese participants than in the general Ugandan population, that the problem was associated with ART status, and that the broad BMI classification categories were maintained when compared with the gold-standard technique. Study 2 hypothesised that the presence of lipodystrophy in participants receiving ARVs was not different from that in HIV-infected ART-naïve participants. Results showed that 112 (53.1%) patients had experienced at least one morphological alteration, including lipohypertrophy (7.6%), lipoatrophy (10.9%) and mixed alterations (34.6%). The majority of these subjects (90%) were receiving ARVs; in fact, all patients receiving PIs reported lipodystrophy. Period spent receiving ARVs (t(209) = 6.739, p = .000), being on ART (χ² = 94.482, p = .000), receiving PIs (Fisher's exact χ² = 113.591, p = .000), recent T4 (CD4) count (t(207) = 3.694, p = .000), time with HIV (t(125) = 1.915, p = .045), as well as older age (t(209) = 2.013, p = .045) were independently associated with lipodystrophy. Receiving ARVs was the greatest predictor of lipodystrophy (p = .000). In further analyses, apart from the subscapular skinfold (p = .004), there were no differences in the remaining skinfold sites or in the circumferences between participants with lipodystrophy and those without. Similarly, there was no difference in waist-to-hip ratio (WHR) (p = .186) or waist-to-height ratio (WHtR) (p = .257) between participants with lipodystrophy and those without. Further examination showed that none of the 4.1% of patients receiving stavudine (d4T) experienced lipoatrophy. However, 17.9% of patients receiving efavirenz (EFV), a non-nucleoside reverse transcriptase inhibitor (NNRTI), had lipoatrophy. Study 2 findings showed that the presence of lipodystrophy in participants receiving ARVs was in fact far higher than in HIV-infected ART-naïve participants. A final hypothesis was that the prevalence of the metabolic syndrome in participants receiving ARVs was not different from that in HIV-infected ART-naïve participants. The data showed that many patients (69.2%) lived with at least one feature of the metabolic syndrome based on the International Diabetes Federation (IDF, 2006) definition. However, there was no single anthropometric predictor of all components of the syndrome; rather, the best anthropometric predictor varied with the component. The metabolic syndrome was diagnosed in 15.2% of the subjects, lower than commonly reported in this population, and its prevalence was similar between the medicated and the exposed groups (χ²(1) = 0.018, p = .893). Moreover, the syndrome was associated with older age (p = .031) and percent body fat (p = .012). In addition, participants with the syndrome were heavier according to BMI (p = .000), larger at the waist (p = .000) and abdomen (p = .000), and were at central obesity risk even when hip circumference (p = .000) and height (p = .000) were accounted for. In spite of those associations, time with the disease (p = .13), CD4 counts (p = .836), receiving ART (p = .442) and receiving PIs (p = .678) were not associated with the metabolic syndrome. While the prevalence of the syndrome was highest among the older, larger and fatter participants, WC was the best predictor of the metabolic syndrome (p = .001).
Another novel finding was that participants with the metabolic syndrome had greater arm muscle circumference (AMC) (p = .000) and arm muscle area (AMA) (p = .000), with the former being the more influential. Accordingly, WC was the easiest and cheapest indicator for assessing risk in this study sample, should routine laboratory services not be feasible. In addition, the final study illustrated that the prevalence of the metabolic syndrome in participants receiving ARVs was not different from that in HIV-infected ART-naïve participants.
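
As an illustrative sketch of how an IDF (2006)-style metabolic syndrome check might be applied to a female participant's measurements, the snippet below uses the commonly cited IDF thresholds for women (waist ≥ 80 cm for most ethnic groups, plus any two further components); the field names and example values are hypothetical and are not taken from the study's dataset.

```python
# Hedged sketch only: IDF (2006)-style metabolic syndrome classification for women.
def idf_metabolic_syndrome(waist_cm, trig_mmol, hdl_mmol, sbp, dbp, glucose_mmol):
    central_obesity = waist_cm >= 80          # prerequisite component (women, most ethnic groups)
    other = [
        trig_mmol >= 1.7,                     # raised triglycerides
        hdl_mmol < 1.29,                      # reduced HDL cholesterol (women)
        sbp >= 130 or dbp >= 85,              # raised blood pressure
        glucose_mmol >= 5.6,                  # raised fasting glucose
    ]
    return central_obesity and sum(other) >= 2

print(idf_metabolic_syndrome(88, 1.9, 1.1, 128, 82, 5.4))   # -> True for this hypothetical case
```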

Relevance: 20.00%

Abstract:

Exercise-based cardiac rehabilitation (CR) is efficacious in reducing mortality and hospital admissions; however, it remains inaccessible to a large proportion of the patient population. To remove the attendance barriers of hospital- or centre-based CR, home-based CR has been promoted. Delivery of safe and appropriately prescribed exercise in the home was first documented 25 years ago, with the utilisation of fixed land-line telecommunications to monitor ECG. The advent of miniature ECG sensors, in conjunction with smartphones, now enables CR to be delivered with greater flexibility with regard to location, time and format, while retaining the capacity for real-time patient monitoring. A range of new systems allow other signals, including speed, location, pulse oximetry and respiration, to be monitored, and these may have application in CR. There is compelling evidence that telemonitoring-based CR is an effective alternative to traditional CR practice. The long-standing barrier of access to centre-based CR, combined with new delivery platforms, raises the question of when telemonitoring-based CR could replace conventional approaches as standard practice.

Relevance: 20.00%

Abstract:

Poor health and injury represent major obstacles to the future economic security of Australia. The national economic cost of work-related injury is estimated at $57.5 billion per annum. Since exposure to high physical demands is a major risk factor for musculoskeletal injury, monitoring and managing such physical activity levels in workers is a potentially important injury prevention strategy. Current injury monitoring practices are inadequate for the provision of clinically valuable information about the tissue-specific responses to physical exertion. Injury of various soft-tissue structures can manifest over time through the accumulation of micro-trauma. Such micro-trauma has a propensity to increase the risk of acute injuries to soft-tissue structures such as muscle or tendon. As such, the capacity to monitor biomarkers that result from the disruption of these tissues offers a means of assisting the pre-emptive management of subclinical injury prior to acute failure, or of evaluating recovery processes. Here we have adopted an in vivo exercise-induced muscle damage model, allowing the application of laboratory-controlled conditions to assist in uncovering biochemical indicators associated with soft-tissue trauma and recovery. Importantly, urine was utilised as the diagnostic medium since it is non-invasive to collect, more acceptable to workers and less costly to employers. Moreover, our hypothesis is that exercise-induced tissue degradation products enter the circulation, are subsequently filtered by the kidney and pass through to the urine. To test this hypothesis, a range of metabolomic and proteomic discovery-phase techniques were used, along with targeted approaches. Several small molecules relating to tissue damage were identified, along with a series of skeletal muscle-specific protein fragments resulting from exercise-induced soft-tissue damage. Each of the potential biomolecular markers appeared to be temporally present within urine. Moreover, the regulation of their abundance appeared to be associated with functional recovery following the injury. This discovery may have important clinical applications for the monitoring of a variety of inflammatory myopathies, as well as novel applications in monitoring the musculoskeletal health status of workers, professional athletes and/or military personnel to reduce the onset of potentially debilitating musculoskeletal injuries within these professions.

Relevance: 20.00%

Abstract:

The increasing prevalence of obesity in society has been associated with a number of atherogenic risk factors such as insulin resistance. Aerobic training is often recommended as a strategy to induce weight loss, with a greater impact of high-intensity levels on cardiovascular function and insulin sensitivity, and a greater impact of moderate-intensity levels on fat oxidation. Anaerobic high-intensity (supramaximal) interval training has been advocated to improve cardiovascular function, insulin sensitivity and fat oxidation. However, obese individuals tend to have a lower tolerance of high-intensity exercise due to discomfort. Furthermore, some obese individuals may compensate for the increased energy expenditure by eating more and/or becoming less active. Recently, both moderate- and high-intensity aerobic interval training have been advocated as alternative approaches. However, it is still uncertain which approach is more effective in terms of increasing fat oxidation, given issues with levels of fitness and motivation, and compensatory behaviours. Accordingly, the objectives of this thesis were to compare the influence of moderate- and high-intensity interval training on fat oxidation and eating behaviour in overweight/obese men. Two exercise interventions were undertaken by 10-12 overweight/obese men to compare their responses to study variables, including fat oxidation and eating behaviour, during moderate- and high-intensity interval training (MIIT and HIIT). The acute training intervention was a methodological study designed to examine the validity of using exercise intensity from the graded exercise test (GXT), which measured the intensity that elicits maximal fat oxidation (FATmax), to prescribe interval training during 30-min MIIT. The 30-min MIIT session involved 5-min repetitions of workloads 20% below and 20% above the FATmax. The acute intervention was extended to involve HIIT in a cross-over design to compare the influence of MIIT and HIIT on eating behaviour, using subjective appetite sensations and food preference assessed through the liking and wanting test. The HIIT consisted of 15-sec interval training at 85 %VO2peak interspersed with 15-sec unloaded recovery, with total mechanical work equal to MIIT. The medium-term training intervention was a cross-over 4-week (12-session) MIIT and HIIT training programme with a 6-week detraining washout period. The MIIT sessions consisted of 5-min cycling stages at ±20% of the mechanical work at 45 %VO2peak, and the HIIT sessions consisted of repeated 30-sec work bouts at 90 %VO2peak with 30-sec interval rests, during identical exercise sessions of between 30 and 45 min. Assessments included a constant-load test (45 %VO2peak for 45 min) followed by 60-min recovery, at baseline and at the end of the 4-week training, to determine fat oxidation rate. Participants' responses to exercise were measured using blood lactate (BLa), heart rate (HR) and rating of perceived exertion (RPE) during the constant-load test and in the first intervention training session of every week during training. Eating behaviour responses were assessed by measuring subjective appetite sensations, liking and wanting, and ad libitum energy intake. Results of the acute intervention showed that FATmax is a valid method to estimate VO2 and BLa, but is not valid to estimate HR and RPE in the MIIT session.
While the average rate of fat oxidation during 30-min MIIT was comparable with the rate of fat oxidation at FATmax (0.16 ± 0.09 and 0.14 ± 0.08 g/min, respectively), fat oxidation was significantly higher at minute 25 of MIIT (P≤0.01). In addition, there was no significant difference between MIIT and HIIT in appetite sensation ratings after exercise, although there was a tendency towards lower hunger after HIIT. Different intensities of interval exercise also did not affect explicit liking or implicit wanting. Results of the medium-term intervention indicated that the interval training did not affect body composition, fasting insulin or fasting glucose. Maximal aerobic capacity significantly increased (P≤0.01) during the GXT (2.8 and 7.0% after MIIT and HIIT, respectively), and fat oxidation significantly increased (P≤0.01) during the acute constant-load exercise test (96 and 43% after MIIT and HIIT, respectively). RPE decreased significantly more after HIIT than after MIIT (P≤0.05), and the decrease in BLa during the constant-load test was greater after HIIT than after MIIT, although this difference did not reach statistical significance (P=0.09). In addition, following constant-load exercise, exercise-induced hunger and desire to eat decreased more after HIIT than after MIIT, but the differences were not significant (p value for desire to eat was 0.07). Exercise-induced liking of high-fat sweet (HFSW) and high-fat non-sweet (HFNS) foods increased after MIIT and decreased after HIIT (p value for HFNS was 0.09). The intervention explained 12.4% of the change in fat intake (p = 0.07). This research is significant in that it confirmed two points in the acute study: while the rate of fat oxidation increased during MIIT, the average rate of fat oxidation during 30-min MIIT was comparable with the rate at FATmax; and manipulating the intensity of acute interval exercise did not affect appetite sensations or liking and wanting. In the medium-term intervention, constant-load exercise-induced fat oxidation significantly increased after interval training, independent of exercise intensity. In addition, desire to eat, explicit liking for HFNS foods and fat intake collectively confirmed that MIIT is accompanied by greater compensation in eating behaviour than HIIT. Findings from this research will assist in developing exercise strategies to provide obese men with various training options. In addition, the finding that overweight/obese men reported lower RPE and decreased BLa after HIIT compared with MIIT is contrary to the view that obese individuals may not tolerate high-intensity interval training. Therefore, high-intensity interval training can be advocated for the obese adult male population. Future studies may extend this work by using a longer-term intervention.
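
As a hedged illustration of the two quantitative ideas in this abstract, the sketch below estimates fat and carbohydrate oxidation from VO2/VCO2 using Frayn's commonly cited stoichiometric equations (protein oxidation neglected), and lays out a 30-min MIIT session as 5-min stages alternating 20% below and 20% above the FATmax workload. It is not the thesis' analysis code, and the FATmax workload and gas-exchange values are hypothetical.

```python
def substrate_oxidation(vo2, vco2):
    # Frayn-type equations; vo2 and vco2 in L/min, results in g/min.
    fat_g_min = 1.67 * vo2 - 1.67 * vco2
    cho_g_min = 4.55 * vco2 - 3.21 * vo2
    return fat_g_min, cho_g_min

def miit_stages(fatmax_watts, total_min=30, stage_min=5):
    # 5-min stages alternating 20% below and 20% above the FATmax workload.
    low, high = 0.8 * fatmax_watts, 1.2 * fatmax_watts
    n_stages = total_min // stage_min
    return [low if i % 2 == 0 else high for i in range(n_stages)]

print(substrate_oxidation(vo2=2.10, vco2=1.95))   # ~0.25 g/min fat for this toy example
print(miit_stages(fatmax_watts=120))              # alternating 96 W and 144 W stages
```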

Relevance: 20.00%

Abstract:

We estimated the heritability of, and correlations between, body and carcass weight traits in a cultured stock of giant freshwater prawn (GFP) (Macrobrachium rosenbergii) selected for harvest body weight in Vietnam. The data set consisted of 18,387 body and 1,730 carcass records, as well as full pedigree information collected over four generations. Variance and covariance components were estimated by restricted maximum likelihood, fitting a multi-trait animal model. Across generations, estimates of heritability for body and carcass weight traits were moderate, ranging from 0.14 to 0.19 and 0.17 to 0.21, respectively. Body trait heritabilities estimated for females were significantly higher than for males, whereas carcass weight trait heritabilities estimated for females and males were not significantly different (P > 0.05). Maternal effects for body traits accounted for 4 to 5% of the total variance and were greater in females than in males. Genetic correlations among body traits were generally high in the mixed sexes. Genetic correlations between body and carcass weight traits were also high. Although some issues remain regarding the best statistical model to be fitted to GFP data, our results suggest that selection for high harvest body weight, based on breeding values estimated by fitting an animal model to the data, can significantly improve mean body and carcass weight in GFP.
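
As a minimal sketch of the heritability calculation implied here (additive genetic variance as a proportion of total phenotypic variance, with the maternal/common-environment component included in the denominator), the snippet below uses made-up variance components; in the study these would come from REML estimates under a multi-trait animal model, which this sketch does not reproduce.

```python
def heritability(sigma2_additive, sigma2_maternal, sigma2_residual):
    # h2 = additive variance / phenotypic variance; c2 = maternal proportion.
    sigma2_phenotypic = sigma2_additive + sigma2_maternal + sigma2_residual
    h2 = sigma2_additive / sigma2_phenotypic
    c2 = sigma2_maternal / sigma2_phenotypic
    return h2, c2

# Hypothetical components for harvest body weight
print(heritability(sigma2_additive=25.0, sigma2_maternal=6.0, sigma2_residual=110.0))
# -> roughly h2 ~= 0.18 and c2 ~= 0.04, in line with the ranges reported above
```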

Relevance: 20.00%

Abstract:

We estimated genetic changes in body and carcass weight traits in a giant freshwater prawn (GFP) (Macrobrachium rosenbergii) population selected for increased body weight at harvest in Vietnam. The data set consisted of 18,387 individual body and 1,730 carcass weight records, as well as full pedigree information collected over four generations. The average selection response (per generation) in body weight at harvest (transformed to square root), estimated as the difference between the Selection line and the Control group, was 7.4% calculated from least squares means (LSMs), 7.0% from estimated breeding values (EBVs) and 4.4% calculated from EBVs between two consecutive generations. Favorable correlated selection responses (estimated from LSMs) were found for other body traits, including total length, cephalothorax length, abdominal length, cephalothorax width and abdominal width (12.1%, 14.5%, 10.4%, 15.5% and 13.3% over three selection generations, respectively). Data in the second generation of selection showed positive correlated responses for carcass weight traits, including abdominal weight, exoskeleton-off weight and telson-off weight, of 8.8%, 8.6% and 8.8%, respectively. We conclude that body weight at harvest responded well to the application of combined (between- and within-family) selection and that correlated responses in carcass weight traits were favorable.
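
As an illustrative sketch of the percentage selection response described here (the difference between the Selection line and the Control group expressed relative to the Control), the snippet below uses hypothetical least-squares means; in the study the trait was analysed on the square-root scale and responses were also derived from estimated breeding values, which this sketch does not attempt to reproduce.

```python
def response_percent(lsm_selection, lsm_control):
    # Per-generation response as the Selection-minus-Control difference,
    # expressed as a percentage of the Control mean.
    return 100 * (lsm_selection - lsm_control) / lsm_control

# Hypothetical LSMs of harvest body weight (g)
print(round(response_percent(57.8, 53.8), 1))   # -> ~7.4% response for these toy values
```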

Relevance: 20.00%

Abstract:

Background: While compensatory eating following acute aerobic exercise is highly variable, little is known about the underlying mechanisms that contribute to alterations in exercise-induced eating behaviour. Methods: Overweight and obese women (BMI = 29.6 ± 4.0 kg/m2) performed a bout of cycling individually tailored to expend 400 kcal (EX), or a time-matched no-exercise control condition, in a randomised, counter-balanced order. Sixty minutes after the cessation of exercise, an ad libitum test meal was provided. Substrate oxidation and subjective appetite ratings were measured during exercise/time-matched rest, and during the period between the cessation of exercise and food consumption. Results: While ad libitum energy intake (EI) did not differ between EX and the control condition (666.0 ± 203.9 kcal vs. 664.6 ± 174.4 kcal, respectively; ns), there was marked individual variability in compensatory EI. The difference in EI between EX and the control condition ranged from -234.3 to +278.5 kcal. Carbohydrate oxidation during exercise was positively associated with post-exercise EI, accounting for 37% of the variance in EI (r = 0.57; p = 0.02). Conclusions: These data indicate that the capacity of acute exercise to create a short-term energy deficit in overweight and obese women is highly variable. Furthermore, exercise-induced carbohydrate (CHO) oxidation can explain part of the variability in acute exercise-induced compensatory eating. Post-exercise compensatory eating could serve as an adaptive response to facilitate the restoration of carbohydrate balance.
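
As a simple worked illustration of the quantities compared in this design, the sketch below computes one participant's compensatory energy intake as the ad libitum intake after the exercise (EX) condition minus intake after the time-matched control, and converts exercise CHO oxidation to kilocalories using the standard approximation of about 4 kcal per gram of carbohydrate. The numerical values are made up and are not the study's data.

```python
def compensatory_ei(ei_exercise_kcal, ei_control_kcal):
    # Positive values indicate compensation (more eaten after exercise).
    return ei_exercise_kcal - ei_control_kcal

def cho_oxidation_kcal(cho_grams):
    # ~4 kcal per gram of carbohydrate oxidised.
    return 4.0 * cho_grams

print(compensatory_ei(712, 655))   # -> +57 kcal eaten after exercise vs. control (hypothetical)
print(cho_oxidation_kcal(55))      # -> 220 kcal of CHO oxidised during the 400 kcal bout (hypothetical)
```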

Relevance: 20.00%

Abstract:

The International Classification of Diseases, Version 10, Australian Modification (ICD-10-AM) is commonly used to classify diseases in hospital patients. ICD-10-AM defines malnutrition as “BMI < 18.5 kg/m2 or unintentional weight loss of ≥ 5% with evidence of suboptimal intake resulting in subcutaneous fat loss and/or muscle wasting”. The Australasian Nutrition Care Day Survey (ANCDS) is the most comprehensive survey to evaluate malnutrition prevalence in acute care patients from Australian and New Zealand hospitals. This study determined whether malnourished participants were assigned malnutrition-related codes as per ICD-10-AM. The ANCDS recruited acute care patients from 56 hospitals. Hospital-based dietitians evaluated participants' nutritional status using BMI and Subjective Global Assessment (SGA). In keeping with the ICD-10-AM definition, malnutrition was defined as BMI < 18.5 kg/m2, SGA-B (moderately malnourished) or SGA-C (severely malnourished). After three months, in this prospective cohort study, the hospitals' health information/medical records departments provided coding results for the malnourished participants. Although malnutrition was prevalent in 32% (n = 993) of the cohort (N = 3122), significantly fewer were coded for malnutrition (n = 162, 16%, p < 0.001). In 21 hospitals, none of the malnourished participants were coded. This is the largest study to provide a snapshot of malnutrition coding in Australian and New Zealand hospitals. The findings highlight gaps in malnutrition documentation and/or subsequent coding, which could potentially result in significant loss of casemix-related revenue for hospitals. Dietitians must lead the way in developing structured processes for malnutrition identification, documentation and coding.
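
The study's operational definition of malnutrition (BMI < 18.5 kg/m2, or SGA rating B or C) translates directly into a simple rule, sketched below with hypothetical patient records; this illustrates the screening logic only and is not hospital coding software or the ANCDS data pipeline.

```python
def is_malnourished(bmi_kg_m2, sga_rating):
    # Malnourished if BMI < 18.5 kg/m2, or SGA-B (moderate) / SGA-C (severe).
    return bmi_kg_m2 < 18.5 or sga_rating.upper() in ("B", "C")

patients = [
    {"id": 1, "bmi": 17.9, "sga": "A"},   # underweight by BMI
    {"id": 2, "bmi": 23.4, "sga": "B"},   # moderately malnourished by SGA
    {"id": 3, "bmi": 27.0, "sga": "A"},   # well nourished
]
flagged = [p["id"] for p in patients if is_malnourished(p["bmi"], p["sga"])]
print(flagged)   # -> [1, 2]: these patients would warrant a malnutrition-related code
```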

Relevance: 20.00%

Abstract:

Background: Nutrition screening is usually administered by nurses. However, most studies on nutrition screening tools have not used nurses to validate the tools. The 3-Minute Nutrition Screening (3-MinNS) assesses weight loss, dietary intake and muscle wastage, with a composite of these scores used to determine risk of malnutrition. The aim of this study was to determine the validity and reliability of 3-MinNS administered by nurses, who are the intended assessors. Methods: In this cross-sectional study, three ward-based nurses screened 121 patients aged 21 years and over using 3-MinNS in three wards within 24 hours of admission. A dietitian then assessed the patients' nutritional status using Subjective Global Assessment (SGA) within 48 hours of admission, whilst blinded to the results of the screening. To assess the reliability of 3-MinNS, 37 patients screened by the first nurse were re-screened by a second nurse within 24 hours, who was blinded to the results of the first nurse. The sensitivity, specificity and best cutoff score for 3-MinNS were determined using the receiver operating characteristic (ROC) curve. Results: The best cutoff score to identify all patients at risk of malnutrition using 3-MinNS was three, with a sensitivity of 89% and specificity of 88%. This cutoff point also identified all (100%) severely malnourished patients. There was a strong correlation between 3-MinNS and SGA (r = 0.78, p < 0.001). The agreement between the two nurses conducting the 3-MinNS tool was 78.3%. Conclusion: 3-Minute Nutrition Screening is a valid and reliable tool for nurses to identify patients at risk of malnutrition.
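
As a hedged sketch of how sensitivity and specificity at the reported cutoff (a 3-MinNS score of 3 or more flags risk) could be computed against the dietitian's SGA rating as the reference standard, the snippet below uses a small set of hypothetical scores and ratings, not the study's data.

```python
def sens_spec(scores, sga_ratings, cutoff=3):
    # Sensitivity and specificity of "score >= cutoff" against SGA B/C as reference.
    flagged = [s >= cutoff for s in scores]
    malnourished = [r.upper() in ("B", "C") for r in sga_ratings]
    tp = sum(f and m for f, m in zip(flagged, malnourished))
    fn = sum((not f) and m for f, m in zip(flagged, malnourished))
    tn = sum((not f) and (not m) for f, m in zip(flagged, malnourished))
    fp = sum(f and (not m) for f, m in zip(flagged, malnourished))
    return tp / (tp + fn), tn / (tn + fp)

scores = [0, 4, 2, 5, 3, 1, 6, 2]                     # hypothetical 3-MinNS scores
sga    = ["A", "B", "A", "C", "B", "A", "C", "B"]     # hypothetical SGA ratings
print(sens_spec(scores, sga))   # -> (0.8, 1.0) for this toy sample
```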

Relevance: 20.00%

Abstract:

The objective of exercise training is to initiate desirable physiological adaptations that ultimately enhance physical work capacity. Optimal training prescription requires an individualized approach, with an appropriate balance of training stimulus and recovery and optimal periodization. Recovery from exercise involves integrated physiological responses. The cardiovascular system plays a fundamental role in facilitating many of these responses, including thermoregulation and delivery/removal of nutrients and waste products. As a marker of cardiovascular recovery, cardiac parasympathetic reactivation following a training session is highly individualized. It appears to parallel the acute/intermediate recovery of the thermoregulatory and vascular systems, as described by the supercompensation theory. The physiological mechanisms underlying cardiac parasympathetic reactivation are not completely understood. However, changes in cardiac autonomic activity may provide a proxy measure of the changes in autonomic input into organs and (by default) the blood flow requirements to restore homeostasis. Metaboreflex stimulation (e.g. muscle and blood acidosis) is likely a key determinant of parasympathetic reactivation in the short term (0–90 min post-exercise), whereas baroreflex stimulation (e.g. exercise-induced changes in plasma volume) probably mediates parasympathetic reactivation in the intermediate term (1–48 h post-exercise). Cardiac parasympathetic reactivation does not appear to coincide with the recovery of all physiological systems (e.g. energy stores or the neuromuscular system). However, this may reflect the limited data currently available on parasympathetic reactivation following strength/resistance-based exercise of variable intensity. In this review, we quantitatively analyse post-exercise cardiac parasympathetic reactivation in athletes and healthy individuals following aerobic exercise, with respect to exercise intensity and duration, and fitness/training status. Our results demonstrate that the time required for complete cardiac autonomic recovery after a single aerobic-based training session is up to 24 h following low-intensity exercise, 24–48 h following threshold-intensity exercise and at least 48 h following high-intensity exercise. Based on limited data, exercise duration is unlikely to be the greatest determinant of cardiac parasympathetic reactivation. Cardiac autonomic recovery occurs more rapidly in individuals with greater aerobic fitness. Our data lend support to the concept that in conjunction with daily training logs, data on cardiac parasympathetic activity are useful for individualizing training programmes. In the final sections of this review, we provide recommendations for structuring training microcycles with reference to cardiac parasympathetic recovery kinetics. Ultimately, coaches should structure training programmes tailored to the unique recovery kinetics of each individual.
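
As a hedged illustration only, the review's recovery-time guidance (up to 24 h after low-intensity aerobic sessions, 24-48 h after threshold-intensity sessions, at least 48 h after high-intensity sessions) can be expressed as a simple spacing rule for planning a microcycle; the function and its categories are an interpretation of the figures quoted above, not a tool provided by the authors.

```python
def hours_before_next_hard_session(intensity):
    # Recovery windows quoted in the review; the conservative upper bound is
    # used for threshold-intensity work.
    recovery = {
        "low": 24,          # up to 24 h after low-intensity aerobic exercise
        "threshold": 48,    # 24-48 h after threshold-intensity exercise
        "high": 48,         # at least 48 h after high-intensity exercise
    }
    return recovery[intensity]

print(hours_before_next_hard_session("threshold"))   # -> 48
```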