945 results for TOTAL PARENTERAL-NUTRITION
Abstract:
Background: Total hip arthroplasty (THA) is a commonly performed procedure, and case numbers are increasing with ageing populations. One of the most serious complications of THA is surgical site infection (SSI), caused by pathogens entering the wound during the procedure. SSIs are associated with a substantial burden for health services, increased mortality and reduced functional outcomes in patients. Numerous approaches to preventing these infections exist, but there is no gold standard in practice and the cost-effectiveness of alternative strategies is largely unknown.

Objectives: The aim of this project was to evaluate the cost-effectiveness of strategies claiming to reduce deep surgical site infections following total hip arthroplasty in Australia. The objectives were: 1. identification of competing strategies, or combinations of strategies, that are clinically relevant to the control of SSI related to hip arthroplasty; 2. evidence synthesis and pooling of results to assess the volume and quality of evidence for strategies claiming to reduce the risk of SSI following total hip arthroplasty; 3. construction of an economic decision model incorporating cost and health outcomes for each of the identified strategies; 4. quantification of the effect of uncertainty in the model; 5. assessment of the value of perfect information among model parameters to inform future data collection.

Methods: The literature relating to SSI in THA was reviewed, in particular to establish definitions of these concepts; to understand mechanisms of aetiology and microbiology, risk factors, diagnosis and consequences; and to give an overview of existing infection prevention measures. Published economic evaluations on this topic were also reviewed and their limitations for Australian decision-makers identified. A Markov state-transition model was developed for the Australian context and subsequently validated by clinicians. The model was designed to capture key events related to deep SSI occurring within the first 12 months following primary THA. Relevant infection prevention measures were selected by reviewing clinical guideline recommendations combined with expert elicitation. The strategies selected for evaluation were the routine use of pre-operative antibiotic prophylaxis (AP) versus no antibiotic prophylaxis (No AP), or AP in combination with antibiotic-impregnated cement (AP & ABC) or laminar air operating rooms (AP & LOR). The best available evidence for clinical effect size and utility parameters was harvested from the medical literature using reproducible methods. Queensland hospital data were extracted to inform patients' transitions between model health states and the related costs captured in assigned treatment codes. Costs related to infection prevention were derived from reliable hospital records and expert opinion. Uncertainty in model input parameters was explored in probabilistic sensitivity analyses and scenario analyses, and the value of perfect information was estimated.

Results: The cost-effectiveness analysis was performed from a health services perspective using a hypothetical cohort of 30,000 THA patients aged 65 years. The baseline rate of deep SSI was 0.96% within one year of a primary THA. The routine use of antibiotic prophylaxis (AP) was highly cost-effective and resulted in cost savings of over $1.6m whilst generating an extra 163 QALYs (without consideration of uncertainty). Deterministic and probabilistic analyses (considering uncertainty) identified antibiotic prophylaxis combined with antibiotic-impregnated cement (AP & ABC) as the most cost-effective strategy. Using AP & ABC generated the highest net monetary benefit (NMB), an incremental $3.1m NMB compared with using antibiotic prophylaxis alone; the error probability that this strategy does not have the largest NMB was very low (<5%). Not using antibiotic prophylaxis (No AP) or combining antibiotic prophylaxis with laminar air operating rooms (AP & LOR) resulted in worse health outcomes and higher costs. Sensitivity analyses showed that the model was sensitive to the initial cohort starting age and the additional costs of ABC, but the best strategy did not change, even for extreme values. Cost-effectiveness improved with a higher proportion of cemented primary THAs and higher baseline rates of deep SSI. The value of perfect information indicated that no additional research is required to support the model conclusions.

Conclusions: Preventing deep SSI with antibiotic prophylaxis and antibiotic-impregnated cement has been shown to improve health outcomes among hospitalised patients, save lives and enhance resource allocation. By implementing a more beneficial infection control strategy, scarce health care resources can be used more efficiently to the benefit of all members of society. The results of this project provide Australian policy makers with key information about how to efficiently manage the risk of infection in THA.
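The decision rule above turns on net monetary benefit. As a minimal sketch of how NMB is typically computed for competing strategies, the snippet below assumes a willingness-to-pay threshold of AUD 50,000 per QALY (not stated in the abstract) and uses illustrative placeholder costs and QALY totals chosen only to be roughly consistent with the reported incremental figures; they are not the study's model outputs.

```python
# Minimal sketch of a net monetary benefit (NMB) comparison:
# NMB = (QALYs * willingness-to-pay) - cost; the strategy with the
# highest NMB is preferred. All numbers are illustrative placeholders.

WTP = 50_000  # assumed willingness-to-pay per QALY (AUD)

strategies = {
    "No AP":    {"cost": 310_000_000, "qalys": 24_000},
    "AP":       {"cost": 308_400_000, "qalys": 24_163},
    "AP & ABC": {"cost": 307_150_000, "qalys": 24_200},
    "AP & LOR": {"cost": 315_000_000, "qalys": 24_150},
}

nmb = {name: s["qalys"] * WTP - s["cost"] for name, s in strategies.items()}
best = max(nmb, key=nmb.get)

for name, value in sorted(nmb.items(), key=lambda kv: -kv[1]):
    print(f"{name:>9}: NMB = ${value:,.0f}")
print(f"Preferred strategy: {best}")
```

With these placeholders, AP & ABC beats AP alone by $3.1m in NMB, mirroring the incremental result quoted in the abstract.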
Abstract:
This article is a brief introduction to the total solar eclipse of Wednesday 14 November 2012, which will be visible from north Queensland in a narrow strip of land just 140 km wide in the vicinity of Cairns.
Abstract:
Background: Periurban agriculture refers to agricultural practice occurring in areas with mixed rural and urban features. It is responsible for 25% of the total gross value of economic production in Australia, despite comprising only 3% of the land used for agriculture. As populations grow and cities expand, they absorb surrounding fringe areas, creating a new fringe further from the city and causing the periurban region to shift constantly outwards. Periurban regions are fundamental to the provision of fresh food to city populations, and residential (and industrial) expansion taking over agricultural land has been noted as a major worldwide concern. Another major concern around increasing urbanisation and the resultant decrease in periurban agriculture is its potential effect on food security. Food security is the availability of, or access to, nutritionally adequate, culturally relevant and safe foods in culturally appropriate ways; food insecurity therefore occurs when access to or availability of these foods is compromised. There is an important level of connectedness between food security and food production, and a decrease in periurban agriculture may have adverse effects on food security. A decrease in local, seasonal produce may reduce the availability of products and increase their cost, as food must travel greater distances, incurring extra costs at the consumer level. Currently, few Australian studies exist examining the change in periurban agriculture over time. Such information may prove useful for future health policy and interventions as well as infrastructure planning. The aim of this study is to investigate changes in periurban agriculture among the capital cities of Australia. Methods: We compared data pertaining to selected commodities from the Australian Bureau of Statistics 2000-01 and 2005-06 Agricultural Censuses. This survey is distributed online or via mail on a five-yearly basis to approximately 175,000 agricultural businesses to ascertain information on a range of factors, such as types of crops, livestock and land preparation practices. For the purpose of this study we compared the land being used for total crops, and for cereal, oilseed, legume, fruit and vegetable crops separately. Data were analysed using repeated-measures ANOVA in SPSS. Results: Overall, the total area available for crops in urbanised areas of Australia increased slightly, by 1.8%. However, Sydney, Melbourne, Adelaide and Perth experienced decreases in the area available for fruit crops by 11%, 5% and 4% respectively. Furthermore, Brisbane and Perth experienced decreases in land available for vegetable crops by 28% and 14% respectively. Finally, Sydney, Adelaide and Perth experienced decreases in land available for cereal crops by 10-79%. Conclusions: These findings suggest that population increases and consequent urban sprawl may be resulting in a decrease in periurban agriculture, specifically for several core food groups including fruit, breads and grain-based foods. As a result, access to or availability of these foods may be limited, and their cost is likely to increase, which may compromise food security for certain sub-groups of the population.
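The core comparison above is a percent change in cropping area between two census years. A minimal sketch of that computation is shown below; the hectare figures are invented placeholders (chosen only to reproduce two of the reported percentages), not ABS data, and the inferential step (repeated-measures ANOVA) is omitted.

```python
# Sketch of the percent-change comparison between the 2000-01 and
# 2005-06 Agricultural Censuses. Hectare figures are placeholders,
# not ABS data; the ANOVA step used in the study is omitted.

area = {  # city -> crop -> (area 2000-01, area 2005-06), in hectares
    "Sydney":   {"fruit": (12_000, 10_680), "vegetables": (5_000, 5_100)},
    "Brisbane": {"fruit": (8_000, 8_200),   "vegetables": (6_000, 4_320)},
}

for city, crops in area.items():
    for crop, (before, after) in crops.items():
        change = (after - before) / before * 100
        print(f"{city:>8} {crop:<10} {change:+.1f}%")
```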
Performance of elite seated discus throwers in F30s classes: part II: does feet positioning matter?
Abstract:
Background: Studies on the relationship between performance and the design of the throwing frame have been limited. Part I provided only a description of whole body positioning. Objectives: The specific objectives were (a) to benchmark feet positioning characteristics (i.e. position, spacing and orientation) and (b) to investigate the relationship between performance and these characteristics for male seated discus throwers in F30s classes. Study Design: Descriptive analysis. Methods: A total of 48 attempts performed by 12 stationary discus throwers in the F33 and F34 classes during the seated discus throwing event of the 2002 International Paralympic Committee Athletics World Championships were analysed in this study. Feet positioning was characterised by three-dimensional data on front and back foot position, as well as by spacing and orientation, corresponding respectively to the distance between and the angle made by the two feet. Results: Only 4 of 30 feet positioning characteristics presented a correlation coefficient greater than 0.5, namely the feet spacing on the mediolateral and anteroposterior axes in the F34 class, and the back foot position and feet spacing on the mediolateral axis in the F33 class. Conclusions: This study provided key information for a better understanding of the interaction between the throwing technique of elite seated throwers and their throwing frame.
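The screening step described above (flagging characteristics whose correlation with performance exceeds 0.5) can be sketched as follows; the positioning metrics and throw distances are simulated placeholders with one planted relationship, not the championship data.

```python
# Sketch of the correlation screening described above: compute Pearson's
# r between each feet-positioning characteristic and throw distance, and
# flag characteristics with |r| > 0.5. Data are simulated placeholders
# (characteristic 0 is given a planted relationship for illustration).

import numpy as np

rng = np.random.default_rng(0)
n_attempts, n_characteristics = 48, 30
X = rng.normal(size=(n_attempts, n_characteristics))  # positioning metrics
distance = 25 + 2.0 * X[:, 0] + rng.normal(scale=1.0, size=n_attempts)  # metres

r = np.array([np.corrcoef(X[:, j], distance)[0, 1] for j in range(n_characteristics)])
flagged = np.flatnonzero(np.abs(r) > 0.5)
print(f"Characteristics with |r| > 0.5: {flagged.tolist()}")
```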
Abstract:
Purpose: To assess the effects of pre-cooling volume on neuromuscular function and performance in free-paced intermittent-sprint exercise in the heat. Methods: Ten male team-sport athletes completed four randomized trials involving an 85-min free-paced intermittent-sprint exercise protocol in 33°C and 33% relative humidity. Pre-cooling sessions included whole body (WB), head + hand (HH), head (H) and no cooling (CONT), applied for 20 min pre-exercise and 5 min mid-exercise. Maximal voluntary contractions (MVC) were assessed pre- and post-intervention and mid- and post-exercise. Exercise performance was assessed with sprint times, % decline and distances covered during free-paced bouts. Measures of core (Tc) and skin (Tsk) temperatures, heart rate, perceptual exertion and thermal stress were monitored throughout. Venous and capillary blood was analyzed for metabolite, muscle damage and inflammatory markers. Results: WB pre-cooling facilitated the maintenance of sprint times during the exercise protocol with a reduced % decline (P=0.04). Mean and total hard running distances increased with pre-cooling by 12% compared to CONT (P<0.05); specifically, WB was 6-7% greater than HH (P=0.02) and H (P=0.001), respectively. No change was evident in mean voluntary or evoked force pre- to post-exercise with WB and HH cooling (P>0.05). WB and HH cooling reduced Tc by 0.1-0.3°C compared to the other conditions (P<0.05). WB Tsk was suppressed for the entire session (P=0.001). Heart rate responses following WB cooling were reduced compared to CONT during exercise (P=0.05; d=1.07). Conclusion: A relationship between pre-cooling volume and exercise performance seems apparent, as larger surface-area coverage augmented subsequent free-paced exercise capacity in conjunction with greater suppression of physiological load. Maintenance of MVC with pre-cooling, despite increased work output, suggests a role for centrally mediated mechanisms in the regulation of exercise pacing and subsequent performance.
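The "% decline" measure above is a sprint decrement score. One common formulation (the study's exact formula is not given in the abstract) compares total sprint time against the best sprint repeated across all efforts, as sketched below with placeholder times.

```python
# Sketch of a sprint decrement ("% decline") score using one common
# formulation: how much slower the total sprint time is than the best
# sprint repeated for every effort. Times are placeholders.

sprint_times = [3.05, 3.10, 3.18, 3.22, 3.30, 3.35]  # seconds per sprint

best = min(sprint_times)
ideal_total = best * len(sprint_times)
decrement_pct = (sum(sprint_times) / ideal_total - 1) * 100
print(f"Sprint decrement: {decrement_pct:.1f}%")
```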
Abstract:
Objectives: The current study investigated the change in neuromuscular contractile properties following competitive rugby league matches and their relationship with physical match demands. Design: Eleven trained male rugby league players participated in 2-3 amateur competitive matches (n = 30). Methods: Prior to, immediately (within 15 min) and 2 h post-match, players performed repeated counter-movement jumps (CMJ) followed by isometric tests on the right knee extensors for maximal voluntary contraction (MVC), voluntary activation (VA) and evoked twitch contractile properties of peak twitch force (Pt), rate of torque development (RTD), contraction duration (CD) and relaxation rate (RR). During each match, players wore 1 Hz Global Positioning Satellite devices to record match distances and speeds. Further, matches were filmed and underwent notational analysis for the number of total body collisions. Results: Total, high-intensity and very-high-intensity distances covered and mean speed were 5585 ± 1078 m, 661 ± 265 m, 216 ± 121 m and 75 ± 14 m min−1, respectively. MVC was significantly reduced immediately and 2 h post-match by 8 ± 11 and 12 ± 13% from pre-match (p < 0.05). Moreover, twitch contractile properties indicated a suppression of Pt, RTD and RR immediately post-match (p < 0.05). However, VA was not significantly altered from pre-match (90 ± 9%) to immediately post (89 ± 9%) or 2 h post (89 ± 8%) (p > 0.05). Correlation analyses indicated that total playing time (r = −0.50) and mean speed (r = −0.40) were moderately associated with the change in post-match MVC, while mean speed (r = 0.35) was moderately associated with VA. Conclusions: The present study highlights that the physical demands of competitive amateur rugby league result in disruption of peripheral contractile function, and that post-match voluntary torque suppression may be associated with match playing time and mean speeds.
Abstract:
This study examined the physiological and performance effects of pre-cooling on medium-fast bowling in the heat. Ten medium-fast bowlers completed two randomised trials involving either cooling (mixed-methods) or control (no cooling) interventions before a 6-over bowling spell in 31.9±2.1°C and 63.5±9.3% relative humidity. Measures included bowling performance (ball speed, accuracy and run-up speeds), physical characteristics (global positioning system monitoring and counter-movement jump height), physiological (heart rate, core temperature, skin temperature and sweat loss), biochemical (serum markers of muscle damage, stress and inflammation) and perceptual variables (perceived exertion and thermal sensation). Mean ball speed (114.5±7.1 vs. 114.1±7.2 km · h−1; P = 0.63; d = 0.09), accuracy (43.1±10.6 vs. 44.2±12.5 AU; P = 0.76; d = 0.14) and total run-up speed (19.1±4.1 vs. 19.3±3.8 km · h−1; P = 0.66; d = 0.06) did not differ between pre-cooling and control; however, 20-m sprint speed between overs was 5.9±7.3% greater at Over 4 after pre-cooling (P = 0.03; d = 0.75). Pre-cooling reduced skin temperature after the intervention period (P = 0.006; d = 2.28), core temperature and pre-over heart rates throughout (P = 0.01−0.04; d = 0.96−1.74), and sweat loss by 0.4±0.3 kg (P = 0.01; d = 0.34). Mean rating of perceived exertion and thermal sensation were lower during pre-cooling trials (P = 0.004−0.03; d = 0.77−3.13). Despite no observed improvement in bowling performance, pre-cooling maintained between-over sprint speeds and blunted physiological and perceptual responses, easing the thermoregulatory demands of medium-fast bowling in hot conditions.
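The d values quoted above are Cohen's effect sizes. A minimal sketch of the conventional pooled-standard-deviation form is shown below; the study's paired design may have used a variant, and the ball-speed samples are placeholders, not the study's raw data.

```python
# Sketch of a Cohen's d effect size using the conventional
# pooled-standard-deviation form. Ball-speed values (km/h) are
# placeholders, not the study's raw data.

import math

def cohens_d(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    pooled_sd = math.sqrt(((len(a) - 1) * va + (len(b) - 1) * vb)
                          / (len(a) + len(b) - 2))
    return (ma - mb) / pooled_sd

cooling = [115.2, 113.8, 116.0, 114.9, 113.5]
control = [114.6, 113.1, 115.4, 114.2, 113.0]
print(f"d = {cohens_d(cooling, control):.2f}")
```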
Abstract:
This study investigated the effects of alcohol ingestion on lower body strength and power, and on physiological and cognitive recovery, following competitive Rugby League matches. Nine male Rugby League players participated in two matches, each followed by one of two randomized interventions: a control or an alcohol ingestion session. Four hours post-match, participants consumed either beverages containing a total of 1 g of ethanol per kg bodyweight (vodka and orange juice; ALC) or a calorie- and taste-matched non-alcoholic beverage (orange juice; CONT). Pre-, post-, 2 h post- and 16 h post-match measures of countermovement jump (CMJ), maximal voluntary contraction (MVC), voluntary activation (VA), cognitive function (modified Stroop test), and damage and stress markers of creatine kinase (CK), C-reactive protein (CRP), cortisol and testosterone, analysed from venous blood, were determined. Alcohol resulted in large effects for decreased CMJ height (-2.35 ± 8.14 and -10.53 ± 8.36% decrement for CONT and ALC respectively; P=0.15, d=1.40), without changes in MVC (P=0.52, d=0.70) or VA (P=0.15, d=0.69). Furthermore, alcohol resulted in a significant slowing of total time in a cognitive test (P=0.04, d=1.59), whilst exhibiting large effects for detriments in congruent reaction time (P=0.19, d=1.73). Despite large effects for increased cortisol following alcohol ingestion during recovery (P=0.28, d=1.44), post-match alcohol consumption did not unduly affect testosterone (P=0.96, d=0.10), CK (P=0.66, d=0.70) or CRP (P=0.75, d=0.60). It appears that alcohol consumption on the evening after competitive rugby matches may have some detrimental effects on peak power and cognitive recovery the following morning. Accordingly, practitioners should be aware of the potential detrimental effects of alcohol consumption on recovery and provide alcohol awareness education to athletes at post-match functions.
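As a worked example of the dose above (1 g of ethanol per kg of body mass), the snippet below converts the dose into Australian standard drinks (10 g of ethanol each); the body mass used is a placeholder, not a participant value.

```python
# Worked example of the alcohol dose described above (1 g ethanol per kg
# of body mass), expressed in Australian standard drinks (10 g ethanol
# each). The body mass is a placeholder.

DOSE_G_PER_KG = 1.0
STANDARD_DRINK_G = 10.0  # ethanol per Australian standard drink

body_mass_kg = 85.0  # placeholder athlete
ethanol_g = DOSE_G_PER_KG * body_mass_kg
print(f"{ethanol_g:.0f} g ethanol = {ethanol_g / STANDARD_DRINK_G:.1f} standard drinks")
```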
Abstract:
This investigation examined the physiological and performance effects of cooling on the recovery of medium-fast bowlers in the heat. Eight medium-fast bowlers completed two randomised trials, each involving two sessions completed on consecutive days (Session 1: 10 overs; Session 2: 4 overs) in 31 ± 3°C and 55 ± 17% relative humidity. Recovery interventions were administered for 20 min (mixed-method cooling vs. control) after Session 1. Measures included bowling performance (ball speed, accuracy, run-up speeds), physical demands (global positioning system, counter-movement jump), physiological (heart rate, core temperature, skin temperature, sweat loss), biochemical (creatine kinase, C-reactive protein) and perceptual variables (perceived exertion, thermal sensation, muscle soreness). Mean ball speed was higher after cooling in Session 2 (118.9 ± 8.1 vs. 115.5 ± 8.6 km · h−1; P = 0.001; d = 0.67), reducing the decline in ball speed between sessions (0.24 vs. −3.18 km · h−1; P = 0.03; d = 1.80). Large effects indicated higher accuracy in Session 2 after cooling (46.0 ± 11.2 vs. 39.4 ± 8.6 arbitrary units [AU]; P = 0.13; d = 0.93) without affecting total run-up speed (19.0 ± 3.1 vs. 19.0 ± 2.5 km · h−1; P = 0.97; d = 0.01). Cooling reduced core temperature, skin temperature and thermal sensation throughout the intervention (P = 0.001–0.05; d = 1.31–5.78) and attenuated creatine kinase (P = 0.04; d = 0.56) and muscle soreness at 24 h (P = 0.03; d = 2.05). Accordingly, mixed-method cooling can reduce thermal strain after a 10-over spell and improve markers of muscular damage and discomfort while maintaining medium-fast bowling performance on consecutive days in hot conditions.
Abstract:
The aim of this study was to investigate the effect of court surface (clay vs. hard court) on technical, physiological and perceptual responses to on-court tennis training. Four high-performance junior male players performed identical training sessions on hard and clay courts. Sessions included both physical conditioning and technical elements, as led by the coach. Each session was filmed for later notational analysis of stroke count and error rates. Further, players wore a global positioning satellite device to measure the distance covered during each session, while heart rate, countermovement jump distance and capillary blood measures of metabolites were taken before, during and following each session. Additionally, coach and athlete ratings of perceived exertion (RPE) were collected following each session. Total duration and distance covered during each session were comparable (P>0.05; d<0.20). While forehand and backhand stroke volumes did not differ between sessions (P>0.05; d<0.30), large effects for increased unforced and forced errors were present on the hard court (P>0.05; d>0.90). Furthermore, large effects for increased heart rate, blood lactate and RPE values were evident on clay compared to hard courts (P>0.05; d>0.90). Additionally, while player and coach RPE on hard courts were similar, there were large effects for coaches to underrate the RPE of players on clay courts (P>0.05; d>0.90). In conclusion, training on clay courts showed trends for increased heart rate, lactate and RPE values, suggesting sessions on clay tend towards higher physiological and perceptual loads than those on hard courts. Further, coaches appear effective at rating player RPE on hard courts, but may underrate the perceived exertion of sessions on clay courts.
Abstract:
In the elderly, the risks for protein-energy malnutrition from older age, dementia, depression and living alone have been well documented. Other risk factors, including anorexia, gastrointestinal dysfunction, loss of olfactory and taste senses and early satiety, have also been suggested to contribute to poor nutritional status. In Parkinson's disease (PD), it has been suggested that the disease symptoms may predispose people with PD to malnutrition. However, the risks for malnutrition in this population are not well understood. The current study's aim was to determine malnutrition risk factors in community-dwelling adults with PD. Nutritional status was assessed using the Patient-Generated Subjective Global Assessment (PG-SGA). Data about age, time since diagnosis, medications and living situation were collected. Levodopa equivalent doses (LDED) and LDED per kg body weight (mg/kg) were calculated. Depression and anxiety were measured using the Beck Depression Inventory (BDI) and the Spielberger Trait Anxiety questionnaire, respectively. Cognitive function was assessed using the Addenbrooke's Cognitive Examination (ACE-R). Non-motor symptoms were assessed using the Scales for Outcomes in Parkinson's disease-Autonomic (SCOPA-AUT) and the Modified Constipation Assessment Scale (MCAS). A total of 125 community-dwelling people with PD were included, with an average age of 70.2±9.3 (range 35-92) years and an average time since diagnosis of 7.3±5.9 (range 0-31) years. Average body mass index (BMI) was 26.0±5.5 kg/m². Of these, 15% (n=19) were malnourished (SGA-B). Multivariate logistic regression analysis revealed that older age (OR=1.16, CI=1.02-1.31), more depressive symptoms (OR=1.26, CI=1.07-1.48), lower levels of anxiety (OR=0.90, CI=0.82-0.99) and higher LDED per kg body weight (OR=1.57, CI=1.14-2.15) significantly increased malnutrition risk. Cognitive function, living situation, number of prescription medications, LDED, years since diagnosis and the severity of non-motor symptoms did not significantly influence malnutrition risk. Malnutrition results in poorer health outcomes, and proactively addressing the risk factors can help prevent declines in nutritional status. In the current study, older people with PD who had depression and took greater amounts of levodopa per kg of body weight were at increased malnutrition risk.
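A minimal sketch of the multivariate logistic regression described above is shown below, with odds ratios recovered as exp(coefficient). The cohort is simulated placeholder data (not the study's participants), and the use of statsmodels is an assumption; the abstract does not name the software used.

```python
# Sketch of a multivariate logistic regression of malnutrition status on
# the risk factors named above; odds ratios are exp(coefficients).
# Data are simulated placeholders, not the study cohort.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 125
age = rng.normal(70, 9, n)          # years
depression = rng.normal(10, 5, n)   # BDI score
anxiety = rng.normal(40, 10, n)     # Spielberger trait score
ldd_per_kg = rng.normal(8, 3, n)    # LDED, mg per kg body weight

# Simulate outcomes from an assumed true model, then fit.
logit = -13.5 + 0.15 * age + 0.2 * depression - 0.1 * anxiety + 0.4 * ldd_per_kg
malnourished = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([age, depression, anxiety, ldd_per_kg]))
fit = sm.Logit(malnourished, X).fit(disp=0)
for name, coef in zip(["age", "depression", "anxiety", "LDED/kg"], fit.params[1:]):
    print(f"OR({name}) = {np.exp(coef):.2f}")
```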
Abstract:
The aim of this study was to identify what outcome measures or quality indicators are being used to evaluate advanced and new roles in nine allied health professions, and whether the measures evaluate outcomes of interest to the patient, the clinician or the healthcare provider. A systematic search strategy was used: medical and allied health databases were searched, relevant articles extracted, and studies with at least one outcome measure evaluated. A total of 106 articles were identified that described advanced roles; however, only 23 of these described an outcome measure in sufficient detail to be included for review. The majority of the reported measures fit into the economic and process categories. The most commonly reported patient-related outcome measure was the satisfaction survey; measures of patient health outcomes were infrequently reported. It is unclear from the studies evaluated whether new models of allied healthcare can be shown to be as safe and effective as traditional care for a given procedure. The outcome measures chosen to evaluate these services often reflect organizational needs rather than patient outcomes. Organizations need to ensure that high-quality performance measures are chosen to evaluate the success of new health service innovations. There needs to be a move away from in-house surveys that add little or no valid evidence as to the effect of a new innovation, and more importance needs to be placed on patient outcomes as a measure of the quality of allied health interventions.
Abstract:
Background: In the Australian higher education sector, the productivity effects of allied health clinical education placements are currently a contested issue. This paper reports the results of a study that investigated output changes associated with occupational therapy and nutrition/dietetics clinical education placements in Queensland, Australia. Supervisors' and students' time use during placements, and how this changes for supervisors compared to when students are not present in the workplace, is also presented. Methodology/Principal Findings: A cohort design was used with students from four Queensland universities and their supervisors employed by Queensland Health. There was an increasing trend in the number of occasions of service delivered when students were present, and a statistically significant increase in the daily mean length of occasions of service delivered during the placement compared to pre-placement levels. For project-based placements not directly involved in patient care, supervisors' project activity time decreased during placements, with students spending considerably more time on project activities. Conclusions/Significance: A novel method for estimating productivity and time use changes during clinical education programs for allied health disciplines has been applied. During clinical education placements there was a net increase in outputs, suggesting supervisors engage in longer consultations with patients for the purpose of training students while maintaining patient numbers, and reduce other activities. This is the first time these data have been presented, and they form a good basis for future assessments of the economic impact of student placements in allied health disciplines.
Abstract:
Photographic records of dietary intake (PhDRs) are an innovative method of dietary assessment and may alleviate the burden of recording intake compared to traditional methods. While the performance of PhDRs has been evaluated, no investigation into the application of this method within dietetic practice had occurred. This study examined the attitudes of dietitians towards the use of PhDRs in the provision of nutrition care. A web-based survey on practices and beliefs regarding technology use among Dietitians Association of Australia members was conducted in August 2011. Of the 87 dietitians who responded, 86% assessed the intakes of clients as part of individualised medical nutrition therapy, with the diet history the most common method used. The majority (91%) of dietitians surveyed believed that a PhDR would be of use in their current practice to estimate intake. Information contained in the PhDR would primarily be used to obtain a qualitative evaluation of the diet (84%) or to supplement an existing assessment method (69%), as opposed to deriving an absolute measure of nutrient intake (31%). Most (87%) indicated that a PhDR would also be beneficial both in the delivery of the intervention and in evaluating and monitoring goals and outcomes, while only 46% felt that a PhDR would assist in determining the nutrition diagnosis. This survey highlights the potential for the use of PhDRs within practice. Future endeavours lie in establishing resources that support the inclusion of PhDRs within the nutrition care process.
Abstract:
Aim: To determine the effects of an acute multi-nutrient supplement on physiological, performance and recovery responses to intermittent-sprint running and muscular damage during rugby union matches. Methods: Using a randomised, double-blind, cross-over design, twelve male rugby union players ingested either 75 g of a comprehensive multi-nutrient supplement (SUPP; Musashi) or 1 g of a taste- and carbohydrate-matched placebo (PL) for 5 days pre-competition. Competitive rugby union game running performance was then measured using 1 Hz GPS data (SPI10, SPI Elite, GPSports), in addition to associated blood draws, vertical jump assessments and ratings of perceived muscular soreness (MS) pre-, immediately post- and 24 h post-competition. Baseline (BL) GPS data were collected during six competition rounds preceding data collection. Results: No significant differences were observed between supplement conditions for any game running, vertical jump or perceived muscular soreness measures. However, effect size analysis indicated SUPP ingestion increased 1st-half very-high-intensity running (VHIR) mean speed (d = 0.93) and 2nd-half relative distance (m/min) (d = 0.97). Further, moderate increases in 2nd-half VHIR distance (d = 0.73), VHIR m/min (d = 0.70) and VHIR mean speed (d = 0.56) were also apparent in the SUPP condition. Moreover, SUPP demonstrated significant increases in 2nd-half distance per minute, total game distance per minute and total game HIR m/min compared with BL data (P < 0.05). Further, large effect size increases in VHIR time (d = 0.88) and moderate increases in 2nd-half HIR m/min (d = 0.65) and 2nd-half VHIR m/min (d = 0.74) were observed between SUPP and BL. Post-game aspartate aminotransferase (AST) (d = 1.16) and creatine kinase (CK) (d = 0.97) measures demonstrated increased effect size values with SUPP, while AST and CK values correlated with 2nd-half VHIR distance (r = −0.71 and r = −0.76, respectively). Elevated C-reactive protein (CRP) was observed post-game in both conditions, but was significantly blunted with SUPP (P = 0.05). Additionally, pre-game (d = 0.98) and post-game (d = 0.96) increases in cortisol (CORT) were apparent with SUPP. No differences were apparent between conditions for pH, lactate, glucose, HCO3, vertical jump assessments or MS (P > 0.05). Conclusion: These findings suggest SUPP may assist in the maintenance of VHIR speeds and distances covered during rugby union games, possibly via the buffering qualities of SUPP ingredients (i.e. caffeine, creatine, bicarbonate). While the mechanisms for these findings are unclear, the similar pH between conditions despite additional VHIR with SUPP may support this conclusion. Finally, correlations between increased work completed at very high intensities and muscular degradation in the SUPP condition may mask any anti-catabolic properties of supplementation.