Abstract:
Purpose: To investigate the effects of an acute multinutrient supplement on game-based running performance, peak power output, anaerobic by-products, hormonal profiles, markers of muscle damage, and perceived muscular soreness before, immediately after, and 24 h following competitive rugby union games. Methods: Twelve male rugby union players ingested either a comprehensive multinutrient supplement (SUPP), [RE-ACTIVATE:01], or a placebo (PL) for 5 d. Participants then performed a competitive rugby union game (with global positioning system tracking), with associated blood draws and vertical jump assessments pre, immediately post, and 24 h following competition. Results: SUPP ingestion resulted in moderate to large effects for augmented 1st half very high intensity running (VHIR) mean speed (5.9 ± 0.4 vs 4.8 ± 2.3 m·min−1; d = 0.93). Further, moderate increases in 2nd half VHIR distance (137 ± 119 vs 83 ± 89 m; d = 0.73) and VHIR mean speed (5.9 ± 0.6 vs 5.3 ± 1.7 m·min−1; d = 0.56) in the SUPP condition were also apparent. Postgame aspartate aminotransferase (AST; 44.1 ± 11.8 vs 37.0 ± 3.2 U·L−1; d = 1.16) and creatine kinase (CK; 882 ± 472 vs 645 ± 123 U·L−1; d = 0.97) measures demonstrated increased values in the SUPP condition, while AST and CK values correlated with 2nd half VHIR distance (r = −0.71 and r = −0.76, respectively). Elevated C-reactive protein (CRP) was observed postgame in both conditions; however, it was significantly blunted with SUPP (P = .05). Conclusions: These findings suggest SUPP may assist in the maintenance of VHIR during rugby union games, possibly via the buffering qualities of SUPP ingredients. However, correlations between increased work completed at very high intensities and muscular degradation in the SUPP condition may mask any anticatabolic properties of the supplement.
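[Editor's note: the d values in this and the following abstracts are Cohen's d effect sizes. The abstracts do not state which variant was computed; a minimal sketch, assuming the conventional pooled-standard-deviation form, is:

d = (x̄₁ − x̄₂) / sₚ, where in LaTeX notation
s_p = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}

Under Cohen's customary benchmarks (0.2 small, 0.5 moderate, 0.8 large), d = 0.56 reads as a moderate effect and d = 0.93 as a large one, consistent with the interpretation above.]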
Abstract:
This investigation examined physiological and performance effects of cooling on recovery of medium-fast bowlers in the heat. Eight medium-fast bowlers completed two randomised trials involving two sessions completed on consecutive days (Session 1: 10 overs; Session 2: 4 overs) in 31 ± 3°C and 55 ± 17% relative humidity. Recovery interventions were administered for 20 min (mixed-method cooling vs. control) after Session 1. Measures included bowling performance (ball speed, accuracy, run-up speeds), physical demands (global positioning system, counter-movement jump), physiological (heart rate, core temperature, skin temperature, sweat loss), biochemical (creatine kinase, C-reactive protein) and perceptual variables (perceived exertion, thermal sensation, muscle soreness). Mean ball speed was higher after cooling in Session 2 (118.9 ± 8.1 vs. 115.5 ± 8.6 km · h−1; P = 0.001; d = 0.67), reducing declines in ball speed between sessions (0.24 vs. −3.18 km · h−1; P = 0.03; d = 1.80). Large effects indicated higher accuracy in Session 2 after cooling (46.0 ± 11.2 vs. 39.4 ± 8.6 arbitrary units [AU]; P = 0.13; d = 0.93) without affecting total run-up speed (19.0 ± 3.1 vs. 19.0 ± 2.5 km · h−1; P = 0.97; d = 0.01). Cooling reduced core temperature, skin temperature and thermal sensation throughout the intervention (P = 0.001–0.05; d = 1.31–5.78) and attenuated creatine kinase (P = 0.04; d = 0.56) and muscle soreness at 24 h (P = 0.03; d = 2.05). Accordingly, mixed-method cooling can reduce thermal strain after a 10-over spell and improve markers of muscular damage and discomfort alongside maintained medium-fast bowling performance on consecutive days in hot conditions.
Abstract:
Aim: To determine the effects of an acute multi-nutrient supplement on physiological, performance and recovery responses to intermittent-sprint running and muscular damage during rugby union matches. Methods: Using a randomised, double-blind, cross-over design, twelve male rugby union players ingested either 75 g of a comprehensive multi-nutrient supplement (SUPP), [Musashi], or 1 g of a taste- and carbohydrate-matched placebo (PL) for 5 days pre-competition. Competitive rugby union game running performance was then measured using 1 Hz GPS data (SPI10, SPI elite, GPSports), in addition to associated blood draws, vertical jump assessments and ratings of perceived muscular soreness (MS) pre, immediately post and 24 h post-competition. Baseline (BL) GPS data were collected during six competition rounds preceding data collection. Results: No significant differences were observed between supplement conditions for any game running, vertical jump, or perceived muscular soreness measures. However, effect size (ES) analysis indicated SUPP ingestion increased 1st half very high intensity running (VHIR) mean speed (d = 0.93) and 2nd half relative distance (m/min) (d = 0.97). Further, moderate increases in 2nd half VHIR distance (d = 0.73), VHIR m/min (d = 0.70) and VHIR mean speed (d = 0.56) in the SUPP condition were also apparent. Moreover, SUPP demonstrated significant increases in 2nd half distance (m/min), total game distance (m/min) and total game HIR (m/min) compared with BL data (P < 0.05). Further, large ES increases in VHIR time (d = 0.88) and moderate increases in 2nd half HIR m/min (d = 0.65) and 2nd half VHIR m/min (d = 0.74) were observed between SUPP and BL. Post-game aspartate aminotransferase (AST; d = 1.16) and creatine kinase (CK; d = 0.97) measures demonstrated increased ES values with SUPP, while AST and CK values correlated with 2nd half VHIR distance (r = −0.71 and r = −0.76, respectively). Elevated C-reactive protein (CRP) was observed post-game in both conditions; however, it was significantly blunted with SUPP (P = 0.05). Additionally, pre-game (d = 0.98) and post-game (d = 0.96) increases in cortisol (CORT) were apparent with SUPP. No differences were apparent between conditions for pH, lactate, glucose, HCO3, vertical jump assessments or MS (P > 0.05). Conclusion: These findings suggest SUPP may assist in the maintenance of VHIR speeds and distances covered during rugby union games, possibly via the buffering qualities of SUPP ingredients (i.e. caffeine, creatine, bicarbonate). While the mechanisms for these findings are unclear, the similar pH between conditions despite additional VHIR with SUPP may support this conclusion. Finally, correlations between increased work completed at very high intensities and muscular degradation in the SUPP condition may mask any anti-catabolic properties of supplementation.
Abstract:
While the negative influence of passengers on driving is usually studied, young passengers may protect against young drivers’ crash involvement by speaking out and trying to stop unsafe driving behavior. This study sought to examine psychosocial constructs of young passengers who are likely to intervene in their friends’ risky driving. Method: University students aged 17 to 25 years who were single (n = 123) or in a romantic relationship (n = 130) completed an online survey measuring protective factors. Results: The combination of individual, friend and (for participants in a relationship) romantic partner protective factors predicted self-reported passenger intervening intentions. Impact on Industry: Since peer passengers often increase young drivers’ crash risk, research on passenger intervening has significant implications for road safety strategies. The findings provide support for the operationalization of protective factors in strategies that target passenger intervening behavior.
Abstract:
Objective: Substance use is common in first-episode psychosis and complicates the accurate diagnosis and treatment of the disorder. The differentiation of substance-induced psychotic disorders (SIPD) from primary psychotic disorders (PPD) is particularly challenging. This cross-sectional study compares the clinical, substance use and functional characteristics of substance-using first-episode psychosis patients diagnosed with a SIPD and a PPD. Method: Participants were 61 young people (15–24 years) admitted to a psychiatric inpatient service with first-episode psychosis, reporting substance use in the past month. Diagnosis was determined using the Psychiatric Research Interview for DSM-IV Substance and Mental Disorders (PRISM-IV). Measures of clinical (severity of psychotic symptoms, level of insight, history of trauma), substance use (frequency/quantity, severity) and social and occupational functioning were also administered. Results: The PRISM-IV differentially diagnosed 56% of first-episode patients with a SIPD and 44% with a PPD. Those with a SIPD had higher rates of substance use and substance use disorders, higher levels of insight, were more likely to have a forensic and trauma history, and had more severe hostility and anxious symptoms than those with a PPD. Logistic regression analysis indicated that a family history of psychosis, trauma history and current cannabis dependence were the strongest predictors of a SIPD. Almost 80% of diagnostic predictions of a SIPD were accurate using this model. Conclusions: This clinical profile of SIPD could help facilitate the accurate diagnosis and treatment of SIPD versus PPD in young people with first-episode psychosis admitted to an inpatient psychiatric service.
Abstract:
Background. Governments face a significant challenge in ensuring that community environments meet the mobility needs of an ageing population. It is therefore critical to investigate the effect of suburban environments on the choice of transportation and its relation to participation and active ageing. Objective. This research explores whether and how suburban environments impact older people's mobility and their use of different modes of transport. Methods. Data derived from GPS tracking, travel diaries, brief questionnaires, and semi-structured interviews were gathered from thirteen people aged 56 to 87 years living in low-density suburban environments in Brisbane, Australia. Results. The suburban environment influenced the choice of transportation and out-of-home mobility. Both walkability and public transportation (access and usability) impact older people's transportation choices. The impracticality of active and public transportation within suburban environments creates car dependency in older age. Conclusion. Suburban environments often create barriers to mobility, which impede older people's engagement in their wider community and their ability to actively age in place. Further research is needed to develop approaches towards age-friendly suburban environments that will encourage older people to remain active and engaged in older age.
Abstract:
Objective: A literature review to examine the incorporation of respiratory assessment into everyday surgical nursing practice; possible barriers to this; and the relationship to patient outcomes. Primary argument: Escalating demands on intensive care beds have led to highly dependent patients being cared for in general surgical ward areas. This change in patient demographics has meant that the knowledge and skills required of registered nurses in these areas have expanded exponentially. The literature supported the notion that postoperative monitoring of vital signs should include the fundamental assessment of respiratory rate; depth and rhythm; work of breathing; use of accessory muscles and symmetrical chest movement; as well as auscultation of lung fields using a stethoscope. Early intervention in response to changes in a patient's respiratory health status impacts positively on patient health outcomes. Substantial support exists for the contention that technologically adept nurses who also possess competent respiratory assessment skills make a difference to respiratory care. Conclusions: Sub-clinical respiratory problems have been demonstrated to contribute to adverse events. There is a paucity of research knowledge as to whether respiratory education programs and associated in-service training make a difference to nursing clinical practice. Similarly, the implications for associated respiratory educational needs are not well documented, nor has a research base been sufficiently developed to guide nursing practice. Further research has the potential to influence the future role and function of the registered nurse by determining the importance of respiratory education programs to post-operative patient outcomes.
Abstract:
Introduction: Lower-limb amputations are a serious adverse consequence of lifestyle-related chronic conditions and a serious concern among the ageing population in Australia. Lower limb amputations have severe personal, social and economic impacts on the individual, healthcare system and broader community. This study aimed to address a critical gap in the research literature by investigating the physical functioning and social characteristics of lower limb amputees at discharge from tertiary hospital inpatient rehabilitation. Method: A cohort study was implemented among patients with lower limb amputations admitted for rehabilitation to a Geriatric Assessment and Rehabilitation Unit at a tertiary hospital. Conventional descriptive statistics were used to examine patient demographic, physical functioning and social living outcomes recorded for patients admitted between 2005 and 2011. Results: A total of 423 admissions occurred during the study period; 313 (74%) were male. This sample included admissions for left (n = 189, 45%), right (n = 220, 52%) and bilateral (n = 14, 3%) lower limb amputations, with 15 (3%) patients dying whilst an inpatient. The mean (standard deviation) age was 65 (13.9) years. Amputations attributed to vascular causes accounted for 333 (78%) admissions; 65 (15%) of these had previously had an amputation. The mean (SD) length of stay in the rehabilitation unit was 56 (42) days. Prior to this admission, 123 (29%) patients were living alone, 289 (68%) were living with another and 3 (0.7%) were living in residential care. Following this amputation-related admission, 89 (21%) patients did not return to their prior living situation. Of those admitted, 187 (44%) patients were discharged with a lower limb prosthesis. Conclusion: The clinical group is predominantly older adults. The ratio of males to females was approximately 3:1. Over half did not return to walking and many were not able to return to their prior accommodation. However, few patients died during their admission.
Abstract:
Objective: To compare access to and utilisation of EDs in Queensland public hospitals between people who speak only English at home and those who speak another language at home. Methods: A retrospective analysis of a Queensland statewide hospital ED dataset (ED Information System) from 1 January 2008 to 31 December 2010 was conducted. Access to ED care was measured by the proportion of the state's population attending EDs. Logistic regression analyses were performed to determine the relationships between ambulance use and language, and between hospital admission and language, both after adjusting for age, sex and triage category. Results: The ED utilisation rate was highest in English-only speakers (290 per 1000 population), followed by Arabic speakers (105), and lowest among German speakers (30). Compared with English speakers, there were lower rates of ambulance use in Chinese (odds ratio 0.50, 95% confidence interval 0.47–0.54), Vietnamese (0.87, 0.79–0.95), Arabic (0.87, 0.78–0.97), Spanish (0.56, 0.50–0.62), Italian (0.88, 0.80–0.96), Hindi (0.61, 0.53–0.70) and German (0.87, 0.79–0.90) speakers. Compared with English speakers, German speakers had higher admission rates (odds ratio 1.17, 95% confidence interval 1.02–1.34), whereas there were lower admission rates in Chinese (0.90, 0.86–0.99), Arabic (0.76, 0.67–0.85) and Spanish (0.83, 0.75–0.93) speakers. Conclusion: This study showed a significant association between lower utilisation of emergency care and speaking a language other than English at home. Further research using in-depth methods is needed to investigate whether there are language barriers in accessing emergency care in Queensland.
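[Editor's note: for context on the adjusted odds ratios above, each ratio is the exponentiated coefficient of the language-group indicator in the logistic model. The exact model specification is an assumption; the abstract only names logistic regression adjusted for age, sex and triage category, which in LaTeX notation corresponds to a sketch such as

\log\frac{p}{1-p} = \beta_0 + \beta_1\,\text{language} + \beta_2\,\text{age} + \beta_3\,\text{sex} + \beta_4\,\text{triage}, \qquad \mathrm{OR} = e^{\beta_1}

so, for example, OR = 0.50 for Chinese speakers means their adjusted odds of arriving by ambulance were half those of English-only speakers.]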
Abstract:
Background Undernutrition, weight loss and dehydration are major clinical issues for people with dementia in residential care, with excessive weight loss contributing to increased risk of frailty, immobility, illness and premature morbidity. This paper discusses a nutritional knowledge and attitudes survey conducted as part of a larger project focused on improving nutritional intake of people with dementia within a residential care facility in Brisbane, Australia. Aims The specific aims of the survey were to identify (i) knowledge of the nutritional needs of aged care facility residents; (ii) mealtime practices; and (iii) attitudes towards mealtime practices and organisation. Methods A survey based on those used in other healthcare settings was completed by 76 staff members. The survey included questions about nutritional knowledge, opinions of the food service, frequency of feeding assistance provided and feeding assessment practices. Results Nutritional knowledge scores ranged from 1 to 9 of a possible 10, with a mean score of 4.67. While 76% of respondents correctly identified risk factors associated with malnutrition in nursing home residents, only 38% of participants correctly identified the need for increased protein and energy in residents with pressure ulcers, and just 15% exhibited correct knowledge of fluid requirements. Further, while nutritional assessment was considered an important part of practice by 83% of respondents, just 53% indicated that they actually carried out such assessments. Identified barriers to promoting optimal nutrition included insufficient time to observe residents (56%); being unaware of residents' feeding issues (46%); poor knowledge of nutritional assessments (44%); and unappetising appearance of food served (57%). Conclusion An important step towards improving health and quality of life for residents of aged care facilities would be to enhance staff nutritional awareness and assessment skills. This should be carried out through increased attention to both preservice curricula and on-the-job training. Implications for practice The residential facility staff surveyed demonstrated low levels of nutrition knowledge, which reflects findings from the international literature. This has implications for the provision of responsive care to residents of these facilities and should be explored further.
Abstract:
Background & aim: This paper describes nutrition care practices in acute care hospitals across Australia and New Zealand. Methods: A survey on nutrition care practices in Australian and New Zealand hospitals was completed by Directors of dietetics departments of 56 hospitals that participated in the Australasian Nutrition Care Day Survey 2010. Results: Overall 370 wards representing various specialities participated in the study. Nutrition risk screening was conducted in 64% (n=234) of the wards. Seventy nine percent(n=185) of these wards reported using the Malnutrition Screening Tool, 16% using the Malnutrition Universal Screening Tool (n=37), and 5% using local tools (n=12). Nutrition risk rescreening was conducted in 14% (n=53) of the wards. More than half the wards referred patients at nutrition risk to dietitians and commenced a nutrition intervention protocol. Feeding assistance was provided in 89% of the wards. “Protected” meal times were implemented in 5% of the wards. Conclusion: A large number of acute care hospital wards in Australia and New Zealand do not comply with evidence-based practice guidelines for nutritional management of malnourished patients. This study also provides recommendations for practice.
Abstract:
Background & aims: One aim of the Australasian Nutrition Care Day Survey was to determine the nutritional status and dietary intake of acute care hospital patients. Methods: Dietitians from 56 hospitals in Australia and New Zealand completed a 24-h survey of nutritional status and dietary intake of adult hospitalised patients. Nutritional risk was evaluated using the Malnutrition Screening Tool. Participants ‘at risk’ underwent nutritional assessment using Subjective Global Assessment. Based on the International Classification of Diseases (Australian modification), participants were also deemed malnourished if their body mass index was <18.5 kg/m2. Dietitians recorded participants’ dietary intake at each main meal and snacks as 0%, 25%, 50%, 75%, or 100% of that offered. Results: 3122 patients (mean age: 64.6 ± 18 years) participated in the study. Forty-one percent of the participants were “at risk” of malnutrition. Overall malnutrition prevalence was 32%. Fifty-five percent of malnourished participants and 35% of well-nourished participants consumed ≤50% of the food during the 24-h audit. “Not hungry” was the most common reason for not consuming everything offered during the audit. Conclusion: Malnutrition and sub-optimal food intake is prevalent in acute care patients across hospitals in Australia and New Zealand and warrants appropriate interventions.
Abstract:
Background & aims The Australasian Nutrition Care Day Survey (ANCDS) ascertained if malnutrition and poor food intake are independent risk factors for health-related outcomes in Australian and New Zealand hospital patients. Methods Phase 1 recorded nutritional status (Subjective Global Assessment) and 24-h food intake (0, 25, 50, 75, 100% intake). Outcomes data (Phase 2) were collected 90-days post-Phase 1 and included length of hospital stay (LOS), readmissions and in-hospital mortality. Results Of 3122 participants (47% females, 65 ± 18 years) from 56 hospitals, 32% were malnourished and 23% consumed ≤ 25% of the offered food. Malnourished patients had greater median LOS (15 days vs. 10 days, p < 0.0001) and readmissions rates (36% vs. 30%, p = 0.001). Median LOS for patients consuming ≤ 25% of the food was higher than those consuming ≤ 50% (13 vs. 11 days, p < 0.0001). The odds of 90-day in-hospital mortality were twice greater for malnourished patients (CI: 1.09–3.34, p = 0.023) and those consuming ≤ 25% of the offered food (CI: 1.13–3.51, p = 0.017), respectively. Conclusion The ANCDS establishes that malnutrition and poor food intake are independently associated with in-hospital mortality in the Australian and New Zealand acute care setting.
Abstract:
One aim of the Australasian Nutrition Care Day Survey was to explore nutrition care practices in acute care hospital wards across Australia and New Zealand. Managers of dietetic departments completed a questionnaire regarding ward nutrition care practices. Overall, 370 wards from 56 hospitals participated. The median ward size was 28 beds (range: 8–60 beds). Although there was wide variation in the full-time equivalent availability of dietitians (median: 0.3; range: 0–1.4), their involvement in providing nutrition care across ward specialities was significantly higher than that of other staff members (χ2, p < 0.01). Feeding assistance, available in 89% of the wards, was provided mainly by nursing staff and family members (χ2, p < 0.01). Protected meal times were implemented in 5% (n = 18) of the wards. Fifty-three percent of the wards (n = 192) weighed patients on request and 40% (n = 148) on admission. Routine malnutrition screening was conducted in 63% (n = 232) of the wards; 79% (n = 184) of these wards used the Malnutrition Screening Tool, 16% (n = 37) the Malnutrition Universal Screening Tool, and 5% (n = 11) other tools. Nutrition rescreening was routinely conducted in 20% of the wards. Among wards that implemented nutrition screening, 41% (n = 100) routinely referred patients “at risk” of malnutrition to dietitians as part of their standard protocol for malnutrition management. Results of this study provide new knowledge regarding current nutrition care practice, highlight gaps in existing practice, and can be used to inform improved nutrition care in acute care wards across Australia and New Zealand.
Abstract:
One aim of the Australasian Nutrition Care Day Survey (ANCDS) was to explore the dietary intake and nutritional status of acute care hospital patients. Dietitians from 56 hospitals in Australia and New Zealand completed a 24-hour nutritional status and dietary intake audit of 3000 adult patients. Participants were evaluated for nutritional risk using the Malnutrition Screening Tool (MST). Those ‘at risk’ underwent nutritional assessment using Subjective Global Assessment (SGA). Dietitians observed participants’ dietary intake at each main meal and recorded mid-meal intake via participant interviews. Intakes were recorded as 0%, 25%, 50%, 75%, or 100% of that offered for each meal during the 24-hour audit. Preliminary results are reported for 1550 participants (853 males, 697 females; age = 64 ± 17 years; BMI = 27 ± 7 kg/m2). Fifty-five percent (n = 853) of the participants had a BMI > 25 kg/m2. The MST identified 41% (n = 636) as ‘at risk’ of malnutrition. Of those ‘at risk’, 70% were assessed as malnourished, resulting in an overall malnutrition prevalence of 30% (25% moderately malnourished, 5% severely malnourished). One-quarter of malnourished participants (n = 118) were on standard hospital diets without additional nutritional support. Fifty percent of malnourished patients (n = 235) and 40% of all patients (n = 620) consumed ≤50% of the food offered during the 24-hour audit. The ANCDS found that “skeletons in the hospital closet” continue to exist and that acute care patients continue to have suboptimal dietary intake. The ANCDS provides valuable insight into gaps in existing nutrition care practices.